Testing the Impact of an Asynchronous Online Training Program With Repeated Feedback

Bibliographic Details
Published in: Nurse Educator
Main Authors: Woda, Aimee, Bradley, Cynthia Sherraden, Johnson, Brandon Kyle, Hansen, Jamie, Loomis, Ann, Pena, Sylvia, Singh, Maharaj, Dreifuerst, Kristina Thomas
Format: Article in Journal/Newspaper
Language: English
Published: Ovid Technologies (Wolters Kluwer Health) 2023
Subjects: DML
Online Access: http://dx.doi.org/10.1097/nne.0000000000001405
https://journals.lww.com/10.1097/NNE.0000000000001405
Description
Summary: Background: Learning to debrief effectively with student learners can be challenging. Currently, there is little evidence to support the best way to train and evaluate a debriefer's competence with a particular debriefing method. Purpose: The purpose of this study was to develop and test an asynchronous online distributed modular training program with repeated doses of formative feedback to teach debriefers how to implement Debriefing for Meaningful Learning (DML). Methods: After completing the asynchronous distributed modular training program, debriefers self-evaluated their debriefing and submitted a recorded debriefing for expert evaluation and feedback using the DML Evaluation Scale (DMLES). Results: Most debriefers were competent in DML debriefing after completing the modular training at time A, and DMLES scores increased with each subsequent debriefing submission. Conclusion: The results of this study support the use of an asynchronous distributed modular training program to teach debriefers how to implement DML.