Pilot Testing the Debriefing for Meaningful Learning Evaluation Scale

Bibliographic Details
Main Authors: Sherraden Bradley, Cynthia; Dreifuerst, Kristina
Format: Text
Language: English
Published: e-Publications@Marquette 2016
Subjects:
DML
Online Access:https://epublications.marquette.edu/nursing_fac/652
https://epublications.marquette.edu/context/nursing_fac/article/1653/viewcontent/dreifuerst_13804.pdf
https://epublications.marquette.edu/context/nursing_fac/article/1653/filename/0/type/additional/viewcontent/dreifuerst_13804acc.docx
Description
Summary: Background: Debriefing for Meaningful Learning (DML), an evidence-based debriefing method, promotes thinking like a nurse through reflective learning. Despite widespread adoption of DML, little is known about how well it is implemented. To assess the effectiveness of DML implementation, an evaluative rubric was developed and tested. Sample: Three debriefers, each trained to use DML at least one year previously, submitted five recorded debriefings for evaluation. Methods: Three raters who were experts in DML scored each of the 15 recorded debriefing sessions using the DML Evaluation Scale (DMLES). Observable behaviors were scored with binary options. These raters also assessed the items in the DMLES for content validity. Results: Cronbach's alpha, intraclass correlation coefficients, and Content Validity Index scores were calculated to determine reliability and validity. Conclusion: Use of the DMLES could support quality improvement, teacher preparation, and faculty development. Future testing is warranted to investigate the relationship between DML implementation and clinical reasoning.