Abstract
Objectives
Debriefing is an integral component of simulation education, and training in debriefing is required to maintain effective simulation programs. However, many educators report financial and logistical barriers to accessing formal debriefing training. Due to limited educator development opportunities, simulation program leaders are often compelled to utilize educators with insufficient debriefing training, which can limit the impact of simulation‐based education. To address these concerns, the SAEM Simulation Academy Debriefing Workgroup authored the Workshop in Simulation Debriefing for Educators in Medicine (WiSDEM), a freely available, concise, and ready‐to‐deploy debriefing curriculum with a target audience of novice educators without formal debriefing training. In this study, we describe the development, initial implementation, and evaluation of the WiSDEM curriculum.
Methods
The Debriefing Workgroup iteratively developed the WiSDEM curriculum by expert consensus. The targeted level of content expertise was introductory. The curriculum's educational impact was assessed by surveying participants on their impressions of the curriculum and their confidence and self‐efficacy in mastery of the material. Additionally, facilitators of the WiSDEM curriculum were surveyed on its content, usefulness, and future applicability.
Results
The WiSDEM curriculum was deployed during the SAEM 2022 Annual Meeting as a didactic presentation. Thirty‐nine of 44 participants completed the participant survey, and four of four facilitators completed the facilitator survey. Participant and facilitator feedback on the curriculum content was positive. Additionally, participants agreed that the WiSDEM curriculum improved their confidence and self‐efficacy in future debriefing. All surveyed facilitators agreed that they would recommend the curriculum to others.
Conclusions
The WiSDEM curriculum was effective at introducing basic debriefing principles to novice educators without formal debriefing training. Facilitators felt that the educational materials would be useful for providing debriefing training at other institutions. Consensus‐driven, ready‐to‐deploy debriefing training materials such as the WiSDEM curriculum can address common barriers to developing basic debriefing proficiency in educators.
INTRODUCTION
Simulation‐based education (SBE) is essential for training effective clinical providers capable of practicing in the high‐stakes, time‐sensitive environment of the emergency department. 1 , 2 , 3 , 4 Debriefing is an integral component of SBE, 5 , 6 , 7 , 8 , 9 , 10 allowing learners to self‐reflect and re‐examine the simulation experience with the purpose of moving toward assimilation and accommodation of learning. The overarching goal of debriefing is the development of clinical judgment and critical thinking. 6 , 7 , 9
Debriefing training is critical to ensure consistency and value in SBE. The manner by which a debriefing is delivered can affect its interpretation and credibility, influencing its perception by the learner as unfair criticism or positive motivation for change. 6 Poorly structured debriefing may have negative and unintended consequences for learners. 5 , 11 , 12 , 13
Despite the well‐recognized value of debriefing in effective SBE, improving educator proficiency in debriefing remains a challenge at many institutions. Although no study has evaluated the percentage of simulation educators who have received formal debriefing training, one study estimates that only 19% to 47% of simulation educators have received formal simulation training. 14 Barriers include lack of educator self‐efficacy, 15 lack of educator time due to competing clinical or academic priorities, 15 , 16 , 17 , 18 , 19 lack of training opportunities, 14 , 17 , 20 , 21 and lack of financial support. 17 , 19 , 21 While many institutions have developed nationally recognized certificates, fellowships, and degree programs for SBE, 22 these formal training avenues typically consist of intensive, multiday workshops that gradually build expertise. As a result, they require significant investments in money, time, and travel, limiting accessibility to educators, particularly trainees and junior faculty. Alternatively, ad hoc courses at individual institutions also exist, but they may be inaccessible to unaffiliated educators or lack a standardized approach, leading to variations in consistency and quality. 23 , 24
As a result of these logistical and financial constraints, simulation program leaders often navigate a challenging trade‐off: debriefing training improves the quality and impact of SBE, thereby improving educational outcomes 5 , 8 ; however, mandatory debriefing training shrinks the available pool of educators and compromises deployment of SBE. In acknowledgment of this real dilemma, some experts have suggested that an ideal debriefing training program may have a “tiered” approach, in which educators are matched to curricula meeting their educational needs at their individual level of debriefing proficiency. 18 , 23 These trained educators can then be assigned to facilitate SBE compatible with their expertise. Given the general paucity of formal debriefing training, 14 , 17 the greatest need is likely establishing the most basic foundations of debriefing knowledge.
In response to these concerns, the Society for Academic Emergency Medicine (SAEM) Simulation Academy formed the Debriefing Workgroup, which collaboratively authored the Workshop in Simulation Debriefing for Educators in Medicine, “WiSDEM.” The WiSDEM curriculum was designed for participants of any level of clinical training with no prior formal debriefing education. Educational materials are free of cost and publicly available, allowing easy adoption by simulation program leaders for local training. By mitigating the financial and logistical barriers to debriefing training, the WiSDEM curriculum enables simulation program leaders to feasibly establish a basic minimum of debriefing proficiency among their simulation educators.
In this study, we describe the consensus‐driven creation of the WiSDEM curriculum by the SAEM Simulation Academy Debriefing Workgroup. We hypothesized that novice participants with no prior formal debriefing training would find the curriculum useful and acquire the intended knowledge, skills, and attitudes after participation. Furthermore, we hypothesized that simulation program leaders facilitating the curriculum would find the WiSDEM materials effective and time‐saving and would be willing to deploy the curriculum at their own institutions.
METHODS
Creation of the Debriefing Workgroup
The SAEM Simulation Academy's Education Subcommittee is composed of a broad membership of simulation faculty, directors, and researchers, collectively representing approximately 70 members from over 50 academic centers across the United States. In August 2020, the Debriefing Workgroup was formed based on Simulation Academy agreement that a publicly available, consensus‐generated introductory curriculum on debriefing, WiSDEM, would be of broad value to simulation program leaders across its membership base. Workgroup members consisted of experienced simulation faculty with senior leadership roles, which included deploying large‐scale SBE programs and constructing debriefing training for residents, simulation fellows, and faculty. Three authors and core content experts (THC, SNS, SKB) led the Debriefing Workgroup's process.
Development of WiSDEM educational materials
Curriculum development
The Debriefing Workgroup met monthly by virtual video conference and reported directly to the Simulation Academy Executive Committee. The WiSDEM curriculum was developed iteratively, beginning with identification of key concepts and expanding through feedback incorporated over multiple rounds of curriculum review by the Debriefing Workgroup as a whole. During each round of review, curriculum elements were selected by a core group of experts (THC, SNS, SKB), feedback from the entire workgroup was solicited, and amendments were incorporated into the final product.
The overall curriculum structure employed a combination of lecture slides, video examples of ideal and nonideal debriefing behaviors, and roleplayed debriefing, with opportunities for peer‐to‐peer discussion and facilitator‐guided discussion interspersed throughout. These curricular elements were selected to incorporate a mix of visual, aural, read/write, and kinesthetic sensory modalities based on the VARK model of learning styles. 25 , 26 This multimodal strategy enabled engagement with learners of various sensory preferences and maximized potential for active learning (Table 1).
TABLE 1.
Description of WiSDEM curriculum elements and incorporated VARK sensory preference.
| Curriculum element | VARK preference | Duration |
|---|---|---|
| Lecture slides | Aural, read/write | 15 min |
| Video examples of debriefing | | 20 min |
| – Video observation | Visual | |
| – Peer‐to‐peer and facilitator‐guided discussion | Aural | |
| Roleplayed debriefing | | 25 min |
| – Roleplay scenario | Kinesthetic | |
| – Facilitator‐guided discussion, feedback, and reflection on debriefing roleplay | Aural | |
Abbreviations: VARK, visual, aural, read/write, kinesthetic; WiSDEM, Workshop in Simulation Debriefing for Educators in Medicine.
The WiSDEM curricular structure emphasized peer‐to‐peer discussion, facilitator‐guided feedback and discussion, and opportunities for practicing and reflecting upon debriefing, to immerse participants in realistic educational contexts, create a community of practice, and maximize potential for learning. These educational strategies were selected based on principles of andragogy, specifically Kolb's experiential learning theory 27 , 28 and Bandura's social learning theory. 29 A visual representation of the multimodal WiSDEM curricular structure, its underlying theoretical frameworks for adult learning, and the assessment strategies employed is depicted in Figure 1.
FIGURE 1.

Multimodal sensory elements, underlying educational theoretical frameworks, and assessment strategies for learning incorporated in the WiSDEM curriculum. WiSDEM, Workshop in Simulation Debriefing for Educators in Medicine.
Curriculum scope and depth
As the WiSDEM curriculum took shape, key questions emerged on the desired scope and depth of the curriculum content, which required extensive discussion to achieve consensus. These key questions included (1) the target learner population, (2) the target level of content expertise for learners, and (3) the ideal length of the curriculum.
Through a literature search and consensus discussion, multiple barriers to formal debriefing training were identified, including lack of alignment with participant career goals (e.g., an educator facilitating simulation on an intermittent basis may not wish to pursue a 1‐week debriefing course), lack of participant time (e.g., due to competing clinical or professional priorities), financial disincentives (e.g., due to travel or tuition costs), and inaccessibility (e.g., due to local debriefing programs requiring institutional affiliation). Simulation program leaders frequently described a real, on‐the‐ground need to rapidly train novice educators for simulation program deployments and a lack of curricula addressing this specific need. An ideal debriefing curriculum would help novice educators avoid the most common debriefing errors, while also remaining freely available, ready to deploy, and concise.
Based on these defined educational objectives, the Debriefing Workgroup members focused on developing an abbreviated curriculum targeting the novice simulation educator, with “novice” defined as lacking prior formal debriefing training. The targeted level of content expertise remained introductory, with an emphasis on essential and basic principles of debriefing. The abbreviated, 1‐h length of the curriculum was chosen deliberately, as time constraints, scheduling, and competing academic and clinical priorities were commonly reported limitations to pursuing debriefing training, 15 , 16 , 17 , 18 , 19 and the targeted level of debriefing proficiency was basic.
Data collection
Key outcome measures
We defined the following key outcome measures in this study: (1) the educational impact of the WiSDEM curriculum and (2) simulation program leaders' impressions of the WiSDEM curriculum's usefulness for novice educators. To measure the educational impact of the WiSDEM curriculum, we utilized the New World Kirkpatrick Model for curriculum assessment. 30 As this study was an initial implementation of the WiSDEM curriculum, we focused on reactions and learning (Level 1 and Level 2 outcomes) to determine whether the educational materials and their delivery were appropriately targeted to the intended learners and facilitators.
Data on Level 1 and Level 2 Kirkpatrick outcomes were collected by surveying both participants and facilitators deploying the WiSDEM curriculum. Participant survey questions were organized into themes of participant satisfaction and engagement with the curriculum (Level 1/reactions) and participant confidence and self‐efficacy on adopting optimal debriefing practices, avoiding debriefing pitfalls, and leading future debriefings (Level 2/learning). Facilitator survey questions were organized around themes of curriculum effectiveness in imparting intended knowledge (Level 2/learning).
Data on simulation program leaders' impressions of the WiSDEM curriculum were collected through surveying facilitators deploying the curriculum. Facilitator survey questions were organized around themes of feedback on curriculum content and duration, overall usefulness of the curriculum, and likelihood of using or recommending the curriculum for novice debriefing training in the future.
Survey design
The final participant and facilitator survey versions were modified through two rounds of refinement with input from the Debriefing Workgroup and tested through a limited pilot distribution, with minor changes added for clarity and readability. To streamline the survey and minimize survey fatigue, our participant demographic questions focused on roles and experiences in simulation and omitted typical demographic questions about sex, age, and geographic location. Both surveys were anonymous. The study was deemed exempt by the institutional review board of the Icahn School of Medicine at Mount Sinai.
Study setting and population
The WiSDEM curriculum was initially launched as a didactic during the SAEM 2022 Annual Meeting in New Orleans, LA. Study participants consisted of general attendees of SAEM 2022, with varying interest and expertise in SBE. This study site and participant pool were chosen because the work was part of a SAEM task force. Additionally, participants were likely to either adopt the WiSDEM curriculum for use at their own institutions or be a target learner for the curriculum.
Facilitators of the WiSDEM curriculum consisted of simulation program leaders from a range of institutions, with direct and extensive experience in leading debriefing training. The SAEM 2022 didactic presentation required multiple facilitators to maintain ideal participant‐to‐facilitator ratios. However, only facilitators without a direct authorship role in the WiSDEM curriculum participated in the facilitator survey.
Data analysis
Survey data were entered into SurveyMonkey (Momentive) and then exported into Excel (Microsoft) for cleaning and analysis. We calculated descriptive statistics for participant demographic data, curricular impact measures, and facilitator impressions. On some survey items, the total percentage may exceed 100% because participants could select more than one answer.
We categorized participant data into "novice" and "experienced" categories to evaluate the educational impact of the WiSDEM curriculum on its intended target audience of novice debriefers. We defined novice participants as those with no prior formal debriefing training, including those whose only exposure was being supervised while debriefing, independently studying debriefing, or studying debriefing by other informal methods. We defined experienced participants as those who had trained in a simulation fellowship or had previously received formal debriefing training, such as a formal course.
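As an illustration of this analysis, the sketch below shows one way the categorization and select-all percentages could be computed. It is a hypothetical example only: the respondent records, field names, and option labels are invented for demonstration and are not drawn from the study dataset. It classifies a respondent as "experienced" if any formal training option (fellowship or formal course) was selected, and "novice" otherwise, and shows how select-all items yield column percentages that can exceed 100%.

```python
from collections import Counter

# Hypothetical survey records: each respondent's select-all answers to a
# prior-training question (invented data, not the study's actual responses).
respondents = [
    {"id": 1, "training": ["simulation fellowship"]},
    {"id": 2, "training": ["supervised debriefing", "self-study"]},
    {"id": 3, "training": ["debriefing course", "self-study"]},
    {"id": 4, "training": []},  # no prior training of any kind
]

# Options that count as formal training under the study's definition.
FORMAL = {"simulation fellowship", "debriefing course"}

def categorize(record):
    """'experienced' if any formal training was selected; otherwise 'novice'."""
    return "experienced" if FORMAL & set(record["training"]) else "novice"

groups = Counter(categorize(r) for r in respondents)

# For select-all items, each option's percentage is computed against the
# total n, so the percentages across options can sum to more than 100%.
n = len(respondents)
option_counts = Counter(opt for r in respondents for opt in r["training"])
percentages = {opt: round(100 * c / n) for opt, c in option_counts.items()}
```

Here respondents 1 and 3 are classified as experienced and respondents 2 and 4 as novice; the self-study option alone accounts for 50% of respondents, and the option percentages together exceed 100%.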
RESULTS
Survey response rate
We received 39 completed surveys from a total of 44 session participants, for a participant response rate of 89%. All four facilitators completed the postevent survey, for a response rate of 100%.
Participant demographic data
Table 2 summarizes demographic information from surveyed participants. Novice participants tended to be residents (55%) or faculty with >5 years of experience (35%). Most were occasional or recurring simulation educators (48% and 17%, respectively). More than half (59%) chose to experience the curriculum to improve their debriefing skills. They had a wide variety of debriefing experiences, although all had previously co‐led or led at least one debriefing.
TABLE 2.
Demographic information from surveyed participants.
| Debriefing experience | |||
|---|---|---|---|
| All (n = 39) | Novice (n = 29) | Experienced (n = 10) | |
| What is your training level or faculty rank? (select all) | |||
| Resident (PGY‐1 to ‐4) | 16 (41) | 16 (55) | 0 (0) |
| Simulation fellow | 6 (15) | 0 (0) | 6 (60) |
| Nonsimulation fellow | 0 (0) | 0 (0) | 0 (0) |
| Faculty for <5 years | 6 (15) | 3 (10) | 3 (30) |
| Faculty for 5–10 years | 8 (21) | 4 (14) | 4 (40) |
| Faculty for >10 years | 6 (15) | 6 (21) | 0 (0) |
| What is your educational role? (select all) | |||
| Residency program leadership | 1 (3) | 1 (3) | 0 (0) |
| Residency core faculty | 9 (23) | 6 (21) | 3 (30) |
| Simulation leadership | 9 (23) | 4 (14) | 5 (50) |
| Recurring simulation educator | 11 (28) | 5 (17) | 6 (60) |
| Occasional simulation educator | 14 (36) | 14 (48) | 0 (0) |
| Other (chief resident, assistant program director) | 3 (8) | 3 (10) | 0 (0) |
| Why did you take this course? (select all) | |||
| Suggestion by colleague | 0 (0) | 0 (0) | 0 (0) |
| General interest in education | 16 (41) | 12 (41) | 4 (40) |
| General interest in debriefing | 16 (41) | 12 (41) | 4 (40) |
| Interest in leading debriefing | 14 (36) | 13 (45) | 1 (10) |
| Desire to improve debriefing | 22 (56) | 17 (59) | 5 (50) |
| Other (interest in teaching debriefing, interest in speaker) | 2 (5) | 1 (3) | 1 (10) |
| What is your experience with debriefing? | |||
| None | 0 (0) | 0 (0) | 0 (0) |
| Co‐led <10 debriefings | 6 (15) | 6 (21) | 0 (0) |
| Co‐led >10 debriefings | 8 (21) | 8 (28) | 0 (0) |
| Led <10 debriefings | 6 (15) | 5 (17) | 1 (10) |
| Led >10 debriefings | 16 (41) | 10 (34) | 6 (60) |
| Other (N/A) | 0 (0) | 0 (0) | 3 (30) |
| Have you ever been trained in debriefing before this curriculum? (select all) | |||
| Yes, via simulation fellowship | 7 (18) | 0 (0) | 7 (70) |
| Yes, via a debriefing course | 3 (8) | 0 (0) | 3 (30) |
| Yes, via supervised debriefing | 6 (15) | 5 (17) | 1 (10) |
| Yes, via self‐study | 7 (18) | 7 (24) | 0 (0) |
| No | 17 (44) | 17 (59) | 0 (0) |
| Other (educational feedback) | 1 (2) | 1 (3) | 0 (0) |
Note: Data are reported as n (%).
Experienced participants were generally simulation fellows (60%) or faculty with <10 years of experience (70%). Many held simulation leadership roles or were recurring simulation educators (50% and 60%, respectively). Their reasons for experiencing the curriculum varied, but 50% wanted to improve their debriefing. Sixty percent had led >10 debriefings.
Curriculum impact
Level 1 Kirkpatrick outcomes
All participants agreed or strongly agreed that the course improved their debriefing knowledge. Over 90% of both novice and experienced participants agreed that the didactic presentation was an effective introduction to basic principles of debriefing, although 7% of novices were neutral on the question. All participants agreed or strongly agreed that the videos provided effective illustrations of optimal and suboptimal examples of debriefing. Similarly, all participants agreed or strongly agreed that the roleplay scenarios provided an effective avenue for practicing debriefing skills (Figures 2 and 3).
FIGURE 2.

Experienced participant survey findings.
FIGURE 3.

Novice participant survey findings.
Level 2 Kirkpatrick outcomes
All participants agreed or strongly agreed that the course improved their ability to debrief; however, 72% of novice participants strongly agreed with this statement, compared to 50% of experienced participants. Fifty‐nine percent of novice participants planned to use a debriefing structure to scaffold a post‐simulation debriefing in the future, compared to 80% of experienced participants. More than 80% of novice participants felt more capable of describing optimal debriefing practices or avoiding debriefing pitfalls, with 14% reporting neutrality, compared to 100% of experienced participants. Ninety percent of novice participants felt more capable of facilitating a simulation debriefing after the workshop. Eighty‐seven percent of novice participants felt motivated to seek out more debriefing education in the future, 3% were neutral, and 10% disagreed (Figures 2 and 3).
All surveyed facilitators felt that the overall curriculum was effective in communicating the most basic and essential principles of debriefing to novice debriefers. They all agreed or strongly agreed that the educational objectives of each curriculum element were clear and that the educational materials were effective in helping them achieve those objectives (Figure 4).
FIGURE 4.

Facilitator survey findings on curriculum effectiveness.
Facilitator impressions
All facilitators strongly agreed that the WiSDEM curriculum would improve their efficiency in training novice debriefers, which was a key objective of the Debriefing Workgroup during curriculum development. Additionally, all facilitators would either reuse the WiSDEM curriculum themselves in the future or recommend it to others (Figure 4).
Facilitators largely felt that the duration of each curriculum element was appropriate. All facilitators felt that the duration of the lecture slides was appropriate. For the videos, roleplayed debriefing, and curriculum as a whole, one facilitator felt that too little time was allocated; but the remaining three facilitators found the durations appropriate (Figure 5).
FIGURE 5.

Facilitator survey findings on duration of curriculum elements.
DISCUSSION
In this study, we describe the process of generating the WiSDEM curriculum, a consensus‐driven debriefing curriculum targeting novice simulation educators at any level of clinical training created by the SAEM Simulation Academy Debriefing Workgroup. Initial deployment at the SAEM 2022 Annual Meeting suggests that the WiSDEM curriculum provided an effective, brief introduction to the essential elements of good debriefing practices with ample practice opportunities. Participant and facilitator feedback on the duration, educational objectives, and content of each component of the curriculum was positive. Additionally, participants agreed that the WiSDEM curriculum improved their confidence and self‐efficacy.
Of note, there were some differences in how novice and experienced participants responded to the curriculum. More novice participants than experienced participants felt that the WiSDEM curriculum would improve their ability to debrief (70% vs. 56%). This aligns with our goal of creating an introductory‐level course for novice simulation educators without prior formal training in debriefing. However, more experienced participants than novice participants felt that the WiSDEM curriculum encourages use of a debriefing structure (89% vs. 57%) and felt more capable of describing optimal debriefing strategies (100% vs. 87%). This may reflect their prior comfort with the material and their ability to identify its key elements. Novice participants may have appreciated a greater impact from the curriculum yet felt less confident in wielding the educational tools described, owing to relative unfamiliarity. Importantly, level of debriefing experience did not directly correlate with faculty rank or training level, suggesting that any debriefing course needs to target a diverse body of learners.
Facilitator impressions of the WiSDEM curriculum were overwhelmingly positive regarding its usefulness, which aligns with our original goals of creating a concise and ready‐to‐deploy curriculum. The positive reception of the WiSDEM curriculum among facilitators underscores a significant body of research indicating that simulation program leaders require supportive infrastructure to ensure delivery of quality SBE. Barriers to ensuring consistent and impactful debriefing practices among simulation educators include lack of clear guidance in how to best train educators in debriefing, 14 , 18 , 20 , 24 , 31 lack of institutional incentives for faculty development in SBE, 15 , 20 and lack of funding for SBE training. Though there is little research on how institutions choose to fund simulation faculty development, one survey of nursing simulation programs indicated that the amount of money spent on maintenance and SBE training was a very small percentage relative to the initial upfront costs of establishing the simulation center, ranging from <1% to 7%. 19
Although simulation program directors encounter many challenges in creating debriefing proficiency among novice educators, our successful creation and deployment of the WiSDEM curriculum suggests a number of interesting future directions. First, the development of the WiSDEM curriculum demonstrates the feasibility of creating expert consensus‐generated educational materials in response to commonly encountered educational challenges. Academic communities such as the Debriefing Workgroup are invaluable in providing a mechanism for consensus discussion and collaborative discourse, leading to intentionally designed educational objectives that overcome well‐described barriers.
Second, given the many logistic and financial challenges that educators face in obtaining debriefing training, one potentially effective solution is the creation of debriefing educational materials that are ready to deploy, concise, and free of cost. In our survey of WiSDEM facilitators, all facilitators felt that the availability of the WiSDEM curriculum would make them more efficient in training novice debriefers, underscoring the value of educational materials that are accessible for quick application at any institution. Free and open‐access educational materials are increasingly recognized as academic scholarship 32 , 33 , 34 and may be a good objective of educational task forces such as the Debriefing Workgroup.
After the successful initial deployment of the WiSDEM curriculum in one academic context, the Debriefing Workgroup is planning for deployment at other institutions, with the goal of collecting multi‐institutional feedback from participants and facilitators. Feedback will enable continued iterative revisions to curriculum elements. Additionally, with broader deployment, there is potential to measure higher Kirkpatrick level participant outcomes and gauge curricular impact more comprehensively.
Currently, the WiSDEM curriculum is publicly available through the SAEM Simulation Academy; however, the Debriefing Workgroup ultimately wishes to make the workshop more readily accessible through online publication of the educational materials. This would ensure the durability of the WiSDEM curriculum for simulation program leaders in need of a ready‐to‐deploy, free of cost, consensus‐driven debriefing curriculum targeting novices in SBE.
Finally, as simulation program leaders continue to face barriers to building debriefing proficiency among novice educators, the Debriefing Workgroup hopes to leverage its pool of expertise to create more consensus‐based and freely available education centered around the novice simulation educator.
LIMITATIONS
The WiSDEM curriculum is intended to create a basic understanding of debriefing strategies, and as such it cannot replace longer or more detailed debriefing training. Participants were limited to those who opted to attend the affiliated didactic workshop during the SAEM 2022 Annual Meeting. This creates potential for selection bias in the sample population, as participants who elected to attend this course may be more highly motivated to improve their debriefing skills. Our data are also subject to self‐reporting bias. As this study represented an initial deployment of the WiSDEM curriculum, only Kirkpatrick Level 1 and 2 outcome data were collected. Moreover, although this curriculum was intended for novice debriefers of any clinical discipline or training level, the participants in this session were emergency medicine faculty and trainees, potentially limiting the generalizability to outside the specialty.
CONCLUSIONS
Effective debriefing is critical to optimizing the educational value of simulation‐based education. Educators interested in debriefing training often encounter significant logistical and financial constraints to acquiring the skills necessary for conducting successful debriefing sessions. The SAEM Simulation Academy Debriefing Workgroup authored the Workshop in Simulation Debriefing for Educators in Medicine (WiSDEM) curriculum, a consensus‐driven debriefing curriculum intended for novice educators that is publicly available, ready to deploy, and concise. This curriculum was well received when implemented at the SAEM 2022 Annual Meeting as a didactic presentation. Participants felt the WiSDEM curriculum was effective at introducing basic debriefing principles, providing illustrations of ideal and nonideal debriefing behaviors, and providing an opportunity to practice core debriefing skills. Facilitators felt that the WiSDEM curriculum had future applicability to debriefing training at their own institutions and would recommend the training to others. Future directions include multi‐institutional deployment and evaluating curriculum impact with higher‐level Kirkpatrick outcomes.
AUTHOR CONTRIBUTIONS
Study concept and design: Tina H. Chen, Suzanne K. Bentley, Nur‐Ain Nadir, Lars K. Beattie, Brendan W. Munzer, Tiffany Moadel, Glenn Paetow, Amanda Young, Stephanie N. Stapleton. Acquisition of the data: Tina H. Chen, Suzanne K. Bentley, Charles Lei, Sara M. Hock, Tiffany Moadel, Stephanie N. Stapleton. Analysis and interpretation of the data: Tina H. Chen, Suzanne K. Bentley, Nur‐Ain Nadir, Stephanie N. Stapleton. Drafting of the manuscript: Tina H. Chen, Suzanne K. Bentley, Nur‐Ain Nadir, Lars K. Beattie, Charles Lei, Sara M. Hock, Brendan W. Munzer, Tiffany Moadel, Glenn Paetow, Amanda Young, Stephanie N. Stapleton. Critical revision of the manuscript for important intellectual content: Tina H. Chen, Suzanne K. Bentley, Nur‐Ain Nadir, Lars K. Beattie, Charles Lei, Sara M. Hock, Brendan W. Munzer, Tiffany Moadel, Glenn Paetow, Amanda Young, Stephanie N. Stapleton. Statistical expertise: Suzanne K. Bentley, Stephanie N. Stapleton.
CONFLICT OF INTEREST STATEMENT
The authors declare no conflicts of interest.
Chen TH, Bentley SK, Nadir N‐A, et al. Workshop in Simulation Debriefing for Educators in Medicine: Creation, implementation, and evaluation of a debriefing curriculum for novice simulation educators. AEM Educ Train. 2023;7(Suppl. 1):S58–S67. doi: 10.1002/aet2.10869
Supervising Editor: Dr. John Burkhardt.
