Abstract
This study sought to determine whether learner self-performance assessment (SPA) and team-performance assessment (TPA) differed when simulation-based education (SBE) was supported by self-debriefing (S-DB) rather than traditional facilitator-led debriefing (F-DB). “One-Night-On-Call,” an internship preparation curriculum, was selected to provide SBE. Participants worked as team members in 4 sequential bedside acute care problem-solving scenarios. Fifty-seven learners were randomized to 9 F-DB and 10 S-DB teams. Participants completed SPA and TPA checklist questionnaires immediately following the first and fourth (final) scenarios. Learner SPA and TPA scores improved overall from the first to the fourth scenario (P <.05). The F-DB and S-DB cohorts did not differ in overall SPA scores. The F-DB average TPA score was 12.8 (SD±2.1) compared with an S-DB score of 14.1 (SD±2.1) (P =.001). F-DB participants' increase in TPA was due to increases in the Patient Assessment and Treatment sub-domains that exceeded corresponding improvements in the S-DB cohort. Self-debriefing strategies are equivalent to facilitator-led debriefing in some situations. Self-debriefing offers opportunities to enable simulation-based education by decreasing the number of required faculty debriefers, and may be uniquely well matched to simulation-based teamwork training.
Keywords/MESH terms: self-debriefing, patient simulation, teaching methods, problem based learning, teamwork, assessment, evaluation
Introduction
Simulation-based education (SBE) is an experiential learning format increasingly utilized in professional healthcare education.1 SBE encompasses a variety of approaches and technologies, including scenario-based simulation with computer-controlled, high-technology mannequins programmed to represent the physiology and anatomy of clinical problems. Other SBE techniques include technology-augmented training using partial body trainers to facilitate specific skill training (eg, airway intubation, lumbar puncture), computer-based simulation, and virtual reality methods. Rigorous application of instructional design processes is required for optimal SBE educational outcomes. Instructional design elements include curriculum development, scenario design, scenario facilitation methods, assessment strategies, feedback, and debriefing. Debriefing is an interactive feedback process in which learners review simulation experiences in a structured format immediately following participation in scenario-based simulation; it serves to close gaps between faculty and participant perceptions of performance and to enable learning through reflection.2 Post-simulation debriefing, with review of learner actions and performance during simulation, is a crucial component of experiential learning processes such as SBE.1–5
Debriefing techniques include facilitator-led debriefing, with or without video review; group and individualized techniques; written exercises; in-simulation debriefing; after-simulation debriefing; and others.2,6,7 The debriefing process is most commonly guided by a facilitator, who provides immediate post-simulation formative feedback, enabling self-reflection focused on established learning objectives. Studies indicate that groups of learners who receive such feedback achieve higher post-test performance scores.8
Interactive debriefing techniques are supported by theories of adult learning. Debriefing facilitated by a content expert is considered critical to the process of experiential learning.1,4,9 The debriefer guides a structured, deliberate debriefing process to reach established debriefing objectives.1,2,10 A crucial skill for effective debriefing is structured, critical observation of learner behaviors and actions during simulation while simultaneously assessing knowledge, skills, and attitudes. Optimal individualized or small group debriefing is conducted with low facilitator-to-learner ratios. Availability of trained debriefers is a common barrier to the use of SBE, due to time and cost constraints.11,12
Self-debriefing concepts have been considered both as a way to decrease barriers to the use of SBE and as a novel method of learning with SBE systems.2,13,14 However, it remains uncertain whether participants debriefed by themselves or by a peer accomplish learning outcomes equivalent to those of facilitated formative debriefing.15 One systematic review indicates that physicians have a limited ability to accurately self-assess, and meta-analyses reveal that self-assessment correlates poorly with expert assessment.16–18 Evaluation of self-debriefing is a Society for Simulation in Healthcare SBE research priority.19
Formative debriefing emphasizes identification of gaps between actual performance and required performance standards, for the purpose of learning.20,21 “Self” or “instructorless” checklist-guided formative debriefing can improve the efficiency of simulation-based instruction and has been shown to be effective for training nontechnical skills, such as teamwork and crisis resource management.20 Little is known about learner self-assessment and/or self-efficacy following formative self-debriefing versus instructor-led debriefing.
Simulation-based learning commonly employs scoring with checklists and rating scales to inform and guide feedback. Commonly used instruments include global rating scales (GRS), time-to-event assessments, and critical event checklists. Feedback with debriefing is typically guided by experienced expert facilitators who incorporate scoring instruments. Checklist-guided self-debriefing enables participants to engage in formative self-debriefing guided by objective performance benchmarks.
We sought to determine whether self-performance assessment (SPA) and team-performance assessment (TPA) by participants differed when simulation-based education was supported by self-debriefing compared with traditional facilitator-led debriefing. Participant self-reported SPA and TPA were compared following simulation-based training for learners randomized to either “checklist-enabled self-debriefing” or “traditional instructor-led debriefing” groups. The instructional content was “One-Night-On-Call” (NOC), a published Post Graduate Year 1 (PGY1)/pre-internship curriculum that had been completed by over 250 students in the four years preceding this report.22 NOC is designed to challenge new PGY1 physician interns with simulated common clinical problem-solving experiences that they are likely to encounter during the first month of internship. NOC educational objectives include components of both inter-professional teamwork and clinical problem solving.
Methods
This prospective, controlled, educational cohort intervention study was conducted in June and July 2011. The study was approved by the University of Hawai‘i Committee on Human Subjects (IRB/CHS). The setting was the SimTiki Simulation Center, University of Hawai‘i, John A. Burns School of Medicine. NOC training was conducted for PGY1 interns during required internship orientation. Participants completed NOC once during the interval beginning 6 weeks prior to internship and ending no later than the end of their first PGY1 month. Individuals were assigned to a single day of training based on residency program scheduling and availability. On each training day, 3–4 groups completed scenarios according to a rotating, non-overlapping schedule. The NOC curriculum is conducted over four hours and comprises four sequential simulated case-management scenarios with debriefing. Each case-management exercise was limited to 5 minutes, and each scenario was immediately followed by 15–20 minutes of debriefing designed to facilitate reflection and reinforce pre-determined learning objectives. During each scenario, learners worked as a team of 2–4 members to actively assess and treat a simulated patient after being summoned to assist in clinical care by a bedside nurse, whose role was played by the primary scenario facilitator. The four simulated clinical scenarios were anaphylactic shock, stable atrial fibrillation, chronic obstructive pulmonary disease (COPD) exacerbation, and acute coronary syndrome (ACS); each team experienced the scenarios in the same sequence. Scenarios were conducted using the Laerdal SimMan® computer-controlled human patient simulator (Laerdal USA, Wappingers Falls, NY, USA) with standardized programmed clinical case details. Standardized orientation to the simulator, environment, equipment, and simulation rules of engagement was conducted for learners prior to participation in scenarios.
Each scenario had specified learning objectives and was designed to last no more than 5 minutes. A trained faculty facilitator played the role of the primary bedside nurse. Facilitators were physicians with simulation based teaching experience who specialized in critical care medicine, anesthesiology, emergency medicine, and internal medicine. Facilitators received scenario orientation and facilitator training including mock facilitation sessions.
Participants were randomized to one of several teams. Roles for each scenario were assigned by team consensus; teams were instructed to distribute roles such that, by the end of four scenarios, each student had the opportunity to play each role (nurse, primary intern, or assisting intern) on at least one occasion. Participants were aware that they were participating in an IRB-approved research study but were unaware of its objectives and content. Teams were assigned on alternating instructional days to either post-simulation facilitator-led debriefing (F-DB) or self-debriefing (S-DB) for all scenarios. F-DB or S-DB was conducted immediately following each of the four scenarios. F-DB teams included surgery, orthopedic surgery, internal medicine, transitional, and family medicine PGY1 physicians. S-DB teams included transitional, family medicine, pediatrics, OB/GYN, and psychiatry PGY1 physicians. F-DB groups participated in a traditional facilitator-led bedside debriefing process immediately following each scenario; facilitators conducted detailed bedside debriefing using a scenario-specific checklist as a debriefing guide. S-DB groups participated in a debriefing process in which participants were instructed to individually and independently complete a scenario-specific checklist, followed by participant group discussion using the checklist as a discussion guide. Facilitators did not participate in the S-DB group discussion. Identical scenario-specific checklists with three categories, varying only in the content of the “treatment” category for each case (Figure 1), were used by faculty for F-DB and by participants for S-DB. Time for F-DB and S-DB did not exceed 15 minutes.
Figure 1.
Scenario Checklist
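The checklist design can be summarized as three categories that are identical across scenarios except for the “treatment” items. The following is a rough sketch of that structure; the category names are assumed from the GRS domains described below, and every item shown is a hypothetical placeholder rather than the actual content of Figure 1.

```python
# Sketch of the scenario-specific checklist structure (assumed category names;
# placeholder items, not the actual Figure 1 content).

# Categories that are identical across all four scenarios.
SHARED_CATEGORIES = {
    "patient_assessment": ["assessment item 1", "assessment item 2"],
    "teamwork": ["teamwork item 1", "teamwork item 2"],
}

# Only the "treatment" category differs between cases.
SCENARIO_TREATMENT_ITEMS = {
    "anaphylactic_shock": ["treatment item 1", "treatment item 2"],
    "stable_atrial_fibrillation": ["treatment item 1", "treatment item 2"],
    "copd_exacerbation": ["treatment item 1", "treatment item 2"],
    "acute_coronary_syndrome": ["treatment item 1", "treatment item 2"],
}

def build_checklist(scenario: str) -> dict:
    """Combine the shared categories with the scenario-specific treatment items."""
    return {**SHARED_CATEGORIES, "treatment": SCENARIO_TREATMENT_ITEMS[scenario]}

print(build_checklist("anaphylactic_shock"))
```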
Following completion of the four sequential simulation exercises and debriefings, participants from all teams on each day of training attended an instructor-guided, interactive course wrap-up session intended to review the main teaching objectives and encourage group reflection and discussion.
Primary outcome data were gathered from participants who completed both a Team Performance Assessment (TPA) GRS and a Self-Performance Assessment (SPA) GRS on two occasions during the NOC program. Baseline assessments were completed immediately following the first scenario, before debriefing for that scenario, and again following scenario 4, before debriefing for that scenario. Both TPA and SPA were scored from 3–18 total points across three content domains rated from low (“Unsatisfactory”) to high (“Superior”). Domains were weighted by uneven allocation of the maximum total point score: Patient Assessment (1–8 points), Teamwork (1–6 points), and Patient Treatment (1–4 points). Pre-debriefing GRS scores from the first case comprised baseline data for all participants. Post-debriefing GRS data from the last case were collected for between-group comparison of F-DB and S-DB differences from baseline.
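As a minimal sketch of this weighted scoring arithmetic (domain point ranges follow the description above; the function and dictionary names are illustrative assumptions):

```python
# Sketch of the weighted GRS total described above.
# Assumed domain ranges: Patient Assessment 1-8, Teamwork 1-6,
# Patient Treatment 1-4, giving a total of 3-18 points.

DOMAIN_RANGES = {
    "patient_assessment": (1, 8),
    "teamwork": (1, 6),
    "patient_treatment": (1, 4),
}

def total_grs(scores: dict) -> int:
    """Validate each domain score against its range and return the 3-18 total."""
    total = 0
    for domain, (lo, hi) in DOMAIN_RANGES.items():
        value = scores[domain]
        if not lo <= value <= hi:
            raise ValueError(f"{domain} score {value} outside {lo}-{hi}")
        total += value
    return total

# Example: a mid-range rating on all three domains yields a total of 12.
print(total_grs({"patient_assessment": 5, "teamwork": 4, "patient_treatment": 3}))
```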
Statistical Methods
The primary analysis was designed to detect differences in the student-rated TPA and SPA GRS scores between the two debriefing cohorts, F-DB and S-DB. Baseline and post-debriefing TPA and SPA GRS scores were analyzed with a two-way, mixed-design analysis of variance (ANOVA) using SPSS, version 19.0 (Armonk, NY: IBM Corp.). The average specific agreement for all combinations of paired facilitator total GRS scores was measured using the F-test.
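The same type of analysis can be reproduced outside SPSS. The following is a minimal sketch of a two-way mixed-design ANOVA using the Python pingouin package; the data frame layout, column names, and values are illustrative assumptions, not the study dataset.

```python
# Sketch of a two-way mixed-design ANOVA:
#   between-subjects factor: debriefing group (F-DB vs S-DB)
#   within-subjects factor:  time point (pre vs post)
# Data are illustrative placeholders, not the study data.
import pandas as pd
import pingouin as pg

# Long format: one row per participant per time point.
df = pd.DataFrame({
    "subject": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
    "group":   ["F-DB"] * 6 + ["S-DB"] * 6,
    "time":    ["pre", "post"] * 6,
    "tpa":     [12, 13, 11, 14, 13, 13, 14, 15, 13, 15, 12, 14],
})

# Reports F and p for the group effect, the time effect,
# and the group x time interaction.
aov = pg.mixed_anova(data=df, dv="tpa", within="time", subject="subject", between="group")
print(aov[["Source", "F", "p-unc"]])
```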
Results
Sixty PGY1 residents were randomly assigned to 20 clinical teams, each with 2–4 participants. Each team completed four scenarios in the same sequence. SPA and TPA GRS scores were analyzed for 57 students (19 teams). One S-DB team was excluded from analysis because of missing data. The F-DB control group comprised 27 individuals (9 teams) and the S-DB intervention group comprised 30 individuals (10 teams). There was no difference detected between facilitator total GRS scores.
TPA and SPA scores completed by learners in the facilitator-led debriefing and self-debriefing groups are reported in Table 1. Overall post-course SPA and TPA scores improved compared with pre-course scores (P =.014 for SPA and P =.013 for TPA). TPA scores were higher for the S-DB group than for the F-DB group, whereas no significant difference was found in SPA scores between the S-DB and F-DB groups. There was no interaction between debriefing method (F-DB vs S-DB) and pre- versus post-course assessment for the SPA (P =.50).
Table 1.
Pre-Post Course Results. Self- (SPA) /Team- (TPA) Performance Assessment. Mean Score (SD)
| Group | Measure | Pre-course | Post-course | P-value (2-way mixed ANOVA) |
|---|---|---|---|---|
| Overall (n=57) | SPA score | 11.6 (2.3) | 12.6 (1.2) | .014, F(1,110) = 6.28 |
| Overall (n=57) | TPA score | 13.0 (2.3) | 14.0 (1.9) | .013, F(1,110) = 6.33 |
| F-DB (n=27) | SPA score | 11.2 (2.6) | 12.1 (1.9) | .050†, F(1,110) = 3.93 |
| S-DB (n=30) | SPA score | 12.5 (2.1) | 13.0 (2.0) | |
| F-DB (n=27) | TPA score | 12.3 (2.3) | 13.3 (1.8) | .001††, F(1,110) = 11.50 |
| S-DB (n=30) | TPA score | 13.6 (2.3) | 14.5 (1.8) | |

† No difference between or within F-DB/S-DB groups for either pre- or post-course scores.
†† S-DB TPA score significantly higher than F-DB.
F-DB, facilitator-led debriefing; SD, standard deviation; S-DB, self-debriefing; SPA, self-performance assessment; TPA, team-performance assessment.
Table 2 shows TPA results stratified by the three GRS domains. Significant differences were observed between pre- and post-course TPA scores (Table 2). F-DB groups had higher baseline and post-course scores than S-DB groups in the Patient Assessment (P <.05) and Patient Treatment (P =.001) domains. There were no differences between F-DB and S-DB groups in the Teamwork domain (P =.059).
Table 2.
Team Performance sub-domain category scores. (Mean ± SD)
| Group | GRS category | Pre-course | Post-course | P-value (ANOVA) |
|---|---|---|---|---|
| S-DB | Patient Assessment (0–8 scale) | 5.1 ± 1.2 | 5.7 ± 1.0 | <.05, F(1,110) = 13.58 |
| F-DB | Patient Assessment (0–8 scale) | 5.8 ± 1.3 | 6.4 ± 0.9 | |
| S-DB | Patient Treatment (0–4 scale) | 2.7 ± 0.5 | 2.9 ± 0.5 | .001, F(1,110) = 12.4 |
| F-DB | Patient Treatment (0–4 scale) | 3.1 ± 0.6 | 3.3 ± 0.5 | |
| S-DB | Teamwork (0–6 scale) | 4.4 ± 1.0 | 4.7 ± 0.8 | .059, F(1,110) = 3.66 |
| F-DB | Teamwork (0–6 scale) | 4.8 ± 0.7 | 4.9 ± 0.7 | |

F-DB, facilitator-led debriefing; GRS, global rating scale; SD, standard deviation; S-DB, self-debriefing.
Discussion
Debriefing is a critical component of experiential learning through guided reflection.1,2,23 Our results show that formative, checklist-guided self-debriefing is associated with higher overall self-reported team-performance assessment compared with traditional facilitator-led formative debriefing. Checklist-guided formative self-debriefing yields self-performance assessments equivalent to those of learners who underwent traditional facilitator-led debriefing.
These results illuminate several learner outcomes that may be causally related to a specific debriefing method. In our study, all participants were at an equivalent post-graduate training level and were assigned to control and intervention groups that varied only in the style of debriefing, making it likely that the observed differences were due to the debriefing method. This suggests implications for matching debriefing models to specific educational objectives or training curricula, and educators should be aware of differences in self-assessment that may result from specific debriefing methods. Self-assessment is an integral component of self-efficacy, one's belief in one's ability to succeed in specific situations, which is a core aspect of professionalism. Our results reinforce earlier comparisons showing equivalent educational outcomes for self-debriefing versus facilitator-led debriefing, specifically in the area of “non-technical skills” as described by Boet, et al.20 This study documents that post-simulation learner self-assessment is not inferior when self-debriefing is substituted for facilitator debriefing in a mature scenario-based simulation curriculum that includes objectives other than non-technical skills.
Self-debriefing may augment reflection by establishing an inherently safe learning environment. Evidence is mounting that self-debriefing techniques are valid for routine use in scenario-based SBE. Increasing interest in self-assessment as a tool for formative assessment suggests that the lack of correlation between summative self-assessment and actual performance in experienced healthcare providers does not necessarily imply a lack of correlation between formative self-assessment and educational outcomes during learning.24 Eva and Regehr argue that measures of “self-monitoring” correlated well with the performance of healthcare professionals during learning experiences, in contrast with self-evaluation, and further articulate that “…self-assessment is a complicated, multifaceted, multipurpose phenomenon that involves a number of interacting cognitive processes. It functions as a monitor, a mentor, and a motivator through processes such as evaluation, inference, and prediction.”25 Through self-guided debriefing, learners may experience enhanced reflection, absent perceived external facilitator “judgment.” Thus, self-guided debriefing may effectively represent facilitated or guided reflection in the cycle of experiential learning, especially if a framework for self-reflection, such as the checklist used in our study, is incorporated in the debriefing design.
This study documents a significant difference between F-DB and S-DB only in team-performance assessment by students, not in self-performance assessment. S-DB participants reflected on their performance with group members using the checklist as a guide, rather than reflecting during a bedside debriefing led by the facilitator. This cooperative cognitive task enabled participants to understand and share the ideas that guided their actions, and may have improved team situation awareness and problem-solving skills, since the process was inherently structured to enhance metacognition by providing explicit performance feedback with a checklist. Three major components of simulation fidelity (realism) for teamwork training are equipment fidelity, environment fidelity, and psychological fidelity; of the three, psychological fidelity is the most important for teamwork skill training.26 Team performance is a skill that cannot be attained individually. Because facilitator-guided traditional debriefing has the potential to engage learners solely as individuals rather than as integrated team members, it does not assure a team-focused debriefing that closes gaps between participants' learning and assessment as individuals versus as team members. Guided self-debriefing that engages all participants models teamwork, and may therefore represent a preferred method for enhanced teamwork debriefing; however, this concept requires further support through rigorous correlation of self-efficacy results with gold-standard teamwork performance assessments.
Categorical analysis showed significant differences for the S-DB group in the patient assessment and patient treatment domains, but not in the teamwork domain. The improved global perception of team performance was thus recognized mostly through perceived team improvement in the patient assessment and treatment domains. We posit that this most likely reflects the fact that healthcare providers are deeply vested in patient-related factors, which heavily influence constructs of self-efficacy, especially when the links between teamwork and patient outcomes are not well understood, as is likely in this cohort of novice physicians.
As a secondary outcome, the magnitude of improvement in the overall self- and team-assessment scores from pre-test to post-test was equivalent in both debriefing groups. This result implies that the self-debriefing method compares favorably with the traditional facilitator-led debriefing method.
Limitations
First, this study did not include a blinded gold-standard performance assessment by experts. Correlation with expert performance assessment is required to fully validate these results, since self-assessment is frequently unreliable when compared with gold-standard performance assessment. Second, the results reflect incomplete randomization. Participants were randomized into mixed groups based on the scheduling requirements of the educational program, using a lottery for each session, and most groups were over-weighted with same-specialty participants. Moreover, participants' prior experience with simulation-based learning, gender, and educational background were not controlled. Future research on the effectiveness of self-debriefing should include video performance ratings by independent, blinded expert reviewers. Our results do not definitively prove efficacy of S-DB for improved performance outcomes compared with the F-DB method.
Conclusions
Debriefing for simulation-based education is a complex endeavor comprising multiple factors that influence the debriefing experience, including when (immediate or delayed), where (bedside or elsewhere), what (objectives), why (formative or summative), and how (structure or video). This research explored the “who” factor, which plays a central role in the construct and outcomes of debriefing during experiential learning.6 To explore effective learning processes, this research compared two debriefing practices: facilitator-led debriefing (F-DB) and self-debriefing (S-DB). For both teamwork and personal performance, the F-DB and S-DB groups rated significant increases in scores from pre- to post-test, with no difference between groups in the magnitude of improvement. This finding suggests that self-debriefing may be equivalent to facilitator-led debriefing in some educational settings. Our findings support further research to elaborate the potential of self-debriefing to enhance the efficiency of simulation-based education by decreasing requirements for faculty debriefers and, equally critically, to investigate specific educational situations in which educational outcomes might be improved through the use of self-debriefing.
Substituting checklist-guided, instructorless group self-debriefing for traditional facilitator-led debriefing has theoretical benefits, and our empirical results suggest that self-debriefing is worthy of further prospective investigation.
| Abbreviations | Meaning |
|---|---|
| SPA | Self-performance assessment |
| TPA | Team performance assessment |
| S-DB | Checklist-enabled self-debriefing |
| F-DB | Facilitator-led debriefing |
| SBE | Simulation-based education |
| GRS | Global rating scales |
| NOC | One Night on Call |
| COPD | Chronic Obstructive Pulmonary Disease |
| PGY | Post-graduate year |
| ANOVA | Analysis of variance |
| SPSS | Statistical Package for the Social Sciences |
| F-test | Statistical test in which the test statistic has an F-distribution under the null hypothesis. |
Acknowledgments
We would like to thank the staff of SimTiki Simulation Center for their support of this project: Kris M. Hara RRT and Eileen K Maeda MBA.
Conflict of Interest
None of the authors identify a conflict of interest.
References
1. Motola I, Devine LA, Chung HS, Sullivan JE, Issenberg SB. Simulation in healthcare education: a best evidence practical guide. AMEE Guide No. 82. Med Teach. 2013 Oct;35(10):e1511–e1530. doi: 10.3109/0142159X.2013.818632.
2. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc. 2007 Summer;2(2):115–125. doi: 10.1097/SIH.0b013e3180315539.
3. Hattie J, Timperley H. The power of feedback. Rev Educ Res. 2007;77(1):81.
4. Kolb DA. Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice-Hall; 1984.
5. Schön D. Educating the Reflective Practitioner: Toward a New Design for Teaching and Learning in the Professions. San Francisco, CA: Jossey-Bass; 1987.
6. Van Heukelom JN, Begaz T, Treat R. Comparison of post-simulation debriefing versus in-simulation debriefing in medical simulation. Simul Healthc. 2010 Apr;5(2):91–97. doi: 10.1097/SIH.0b013e3181be0d17.
7. Raemer D, Anderson M. Research regarding debriefing as part of the learning process. Simul Healthc. 2011 Aug;6(Suppl):S52–S57. doi: 10.1097/SIH.0b013e31822724d0.
8. Lederman LC. Debriefing: toward a systematic assessment of theory and practice. Simulat Gaming. 1992;23(2):145–160.
9. Savoldelli GL, Naik VN, Park J, Joo HS, Chow R, Hamstra SJ. Value of debriefing during simulated crisis management: oral versus video-assisted oral feedback. Anesthesiology. 2006 Aug;105(2):279–285. doi: 10.1097/00000542-200608000-00010.
10. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27(1):10–28. doi: 10.1080/01421590500046924.
11. Hayden J. Use of simulation in nursing education: national survey results. J Nurs Reg. 2010;1(3):52–57.
12. Rudolph JW, Simon R, Dufresne RL, Raemer DB. There's no such thing as “nonjudgmental” debriefing: a theory and method for debriefing with good judgment. Simul Healthc. 2006 Spring;1(1):49–55. doi: 10.1097/01266021-200600110-00006.
13. Butler RE. Loft: Full-motion simulation as crew resource management training. In: Wiener E, Kanki B, Helmreich R, editors. Cockpit Resource Management. San Diego: Academic Press; 1993. pp. 231–259.
14. Foraida MI, DeVita MA, Schaefer JJ. Evaluation of an electronic system to enhance crisis resource management training. Simul Healthc. 2006;1(2):85–91.
15. Langendyk V. Not knowing that they do not know: self-assessment accuracy of third year medical students. Med Educ. 2006;40(2):173–179. doi: 10.1111/j.1365-2929.2005.02372.x.
16. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296(9):1094–1102. doi: 10.1001/jama.296.9.1094.
17. Gordon MJ. A review of the validity and accuracy of self-assessments in health professions training. Acad Med. 1991;66(12):762–769. doi: 10.1097/00001888-199112000-00012.
18. Ward M, Gruppen L, Regehr G. Measuring self-assessment: current state of the art. Adv Health Sci Educ Theory Pract. 2002;7(1):63–80. doi: 10.1023/a:1014585522084.
19. Issenberg SB, Ringsted C, Ostergaard D, Dieckman P. Setting a research agenda for simulation-based healthcare education: a synthesis of the outcome from an Utstein style meeting. Simul Healthc. 2011;6(3):155–167. doi: 10.1097/SIH.0b013e3182207c24.
20. Boet S, Bould MD, Bruppacher HR, Desjardins F, Chandra DB, Naik VN. Looking in the mirror: self-debriefing versus instructor debriefing for simulated crises. Crit Care Med. 2011;39(6):1377–1381. doi: 10.1097/CCM.0b013e31820eb8be.
21. Rudolph JW, Simon R, Raemer DB, Eppich WJ. Debriefing as formative assessment: closing performance gaps in medical education. Acad Emerg Med. 2008 Nov;15(11):1010–1016. doi: 10.1111/j.1553-2712.2008.00248.x.
22. Vincent D, Berg B. One Night On-Call: A Simulation Exercise for New Interns. MedEdPORTAL. 2009. Available at: http://services.aamc.org/30/mededportal/servlet/s/segment/mededportal/?subid=1760.
23. Lederman LC. Intercultural communication, simulation and the cognitive assimilation of experience: an exploration of the post-experience analytic process. Presented at the 3rd Annual Conference of the Speech Communication Association; December 1–3, 1983; San Juan, Puerto Rico.
24. Eva KW, Regehr G. Self-assessment in the health professions: a reformulation and research agenda. Acad Med. 2005;80(Suppl 10):S46–S54. doi: 10.1097/00001888-200510001-00015.
25. Eva KW, Regehr G. Exploring the divergence between self-assessment and self-monitoring. Adv Health Sci Educ Theory Pract. 2011;16(3):311–329. doi: 10.1007/s10459-010-9263-2.
26. Beaubien JM, Baker DP. The use of simulation for training teamwork skills in health care: how low can you go? Qual Saf Health Care. 2004;13:i51–i56. doi: 10.1136/qshc.2004.009845.

