BMJ Simulation & Technology Enhanced Learning
2021 Jan 27;7(5):360–365. doi: 10.1136/bmjstel-2020-000685

Brick in the wall? Linking quality of debriefing to participant learning in team training of interprofessional students

John T Paige 1, Deborah D Garbee 2, Qingzhao Yu 3, John Zamjahn 4, Raquel Baroni de Carvalho 5, Lin Zhu 6, Vadym Rusnak 7, Vladimir J Kiselov 8
PMCID: PMC8936698  PMID: 35515739

Abstract

Background

The evidence for the conventional wisdom that debriefing quality determines the effectiveness of learning in simulation-based training is lacking. We investigated whether the quality of debriefing in simulation-based team training correlated with the degree of participant learning.

Methods

Forty-two teams of medical and undergraduate nursing students participated in simulation-based training sessions using a two-scenario format with after-action debriefing. Observers rated team performance with the 11-item Teamwork Assessment Scales (TAS) instrument (three subscales: team-based behaviours (5 items), shared mental model (3 items) and adaptive communication and response (3 items)). Two independent, blinded raters evaluated video-recorded facilitator team prebriefs and debriefs using the 8-item Objective Structured Assessment of Debriefing (OSAD) tool. Descriptive statistics were calculated, t-test comparisons made, and multiple linear regression and univariate analyses used to compare OSAD item scores with changes in TAS scores.

Results

Statistically significant improvements in all three TAS subscales occurred from scenario 1 to scenario 2. Seven faculty teams taught the learners; all prebrief item scores were ≥3.0 (except two) and all debrief item scores were ≥3.5 (except one) (OSAD rating: 1=done very poorly to 5=done very well). Multiple linear regression analysis revealed a single statistically significant correlation, between debrief engagement and the adaptive communication and response score, which was not significant on univariate analysis.

Conclusions

Quality of debriefing does not seem to increase the degree of learning in interprofessional education using simulation-based training of prelicensure student teams. Such a finding may be due to the relatively high quality of the prebrief and debrief of the faculty teams involved in the training.

Keywords: debriefing, health professions education, high fidelity simulation, interprofessional education, simulation-based medical education

Introduction

Effective teamwork remains a critical component for providing safe, quality patient care in today’s complex, dynamic and challenging healthcare environment. Its importance becomes even more apparent given the tribal nature of the healthcare work culture1 2 and the siloed manner in which the professions receive education and training.3 This situation not only has negative consequences related to communication,4–6 role clarity7 and safety attitudes8 in the clinical setting, but it is self-perpetuating due to the hidden curriculum resulting in health professional students modelling negative behaviours.9

Fortunately, team development interventions exist to improve team dynamics in healthcare.10 In fact, a clear link exists between the use of effective team processes and high team performance.11 In healthcare, simulation-based training (SBT) of interprofessional teams is a common modality used to improve team processes and performance,12 especially in acute care settings.13–16 It is also commonly used to train prelicensure student teams as part of curricula employing interprofessional education.17 Health professional students gain insight from such SBT incorporation into interprofessional education.18 In addition, they have more of a desire18 and a higher intrinsic motivation19 to participate in additional interprofessional education SBT. This popularity is likely due, in part, to the fact that interprofessional team SBT provides an opportunity for students to practice team-based competencies in a safe environment in which they can fail without harming patients.20

Conventional wisdom holds that SBT’s effectiveness as a team training modality is due to the quality of the debrief rendered during a training session.21 22 In fact, some authors contend that conducting an effective debrief is the most important component of SBT.23 Some literature supports such a view. For example, Coppens et al 24 demonstrated that incorporating a debrief after SBT resulted in increases in participant self-efficacy and team efficacy. Other literature, however, has had different findings. For example, Garden et al’s25 systematic review of debriefs after SBT to improve team-based competencies found that, in certain studies, debriefs did not lead to a performance improvement, suggesting that these debriefs may not have influenced teaching due to inferior quality.

The authors’ prior work26 demonstrated that faculty teams facilitating debriefings immediately after SBT of interprofessional student teams did indeed vary in the quality of key components of an effective debrief, especially in the prebriefs before the start of a session. It also found that the quality of these prebrief components improved over time. For this study, we investigated whether the quality of prebriefs and debriefs in interprofessional education using SBT had an effect on the degree of learning of participants, as measured by improvement in their team-based performance during SBT sessions. We hypothesised that student team-based performance during SBT, and thus student learning, improved more with higher quality debriefing.

Methods

Study design

The design for this study was a retrospective comparison of matched team performance scores of interprofessional students undergoing SBT with ratings of the prebriefs and debriefs that faculty teams conducted during those sessions. Data for this study drew on the authors' prior work on high-fidelity simulation-based team training of interprofessional students27 and on the analysis of the quality of the immediate after-action, faculty team-guided structured debriefs of these SBT scenarios.26 SBT sessions occurred at the Louisiana State University (LSU) Health New Orleans School of Medicine Learning Centre. Interprofessional student team data were collected prospectively. Junior students undergoing their surgery rotation at the LSU Health New Orleans School of Medicine and senior undergraduate students enrolled in their intensive care course at the LSU Health New Orleans School of Nursing participated in the SBT. The interprofessional student team training was integrated into the mandatory SBT sessions in which the junior medical students participated during their surgery rotation. Evaluation of the quality of the prebriefs and debriefs was undertaken retrospectively through video review of faculty performances.

Team training of interprofessional students format

The team training of interprofessional students programme was a 1-year, high-fidelity SBT team development intervention in which interprofessional teams of junior medical students and senior undergraduate nursing students participated in SBT sessions involving trauma resuscitations. Overall, 213 students comprising 42 interprofessional teams of 3–8 members participated. The training format consisted of a dual-scenario design in which teams of students underwent an initial prebrief, participated in an SBT scenario, did an immediate after-action debrief, then participated in a second SBT scenario that was followed by another immediate after-action debrief. Faculty teams facilitated each prebrief and immediate after-action structured debrief. After each SBT scenario, participants and facilitators completed a multisource team performance evaluation instrument, the Teamwork Assessment Scales (TAS). The TAS is an 11-item tool using a 6-point Likert-type scale (1=definitely no to 6=definitely yes). It has three subscales: (1) the 5-item team-based behaviours (TBB) subscale, measuring individual performance within the team; (2) the 3-item shared mental model (SMM) subscale; and (3) the 3-item adaptive communication and response (ACR) subscale, the latter two each measuring components of overall team performance. This instrument has evidence of both generalisability28 and convergent validity29 as a team assessment tool for interprofessional student teams undergoing SBT. It provided a means of evaluating the degree of student learning of team-based competencies during the SBT session by determining the change in team performance from the first to the second scenario.
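As a concrete illustration (a minimal sketch, not the authors' analysis code), the snippet below shows one way TAS subscale means and scenario-to-scenario change scores could be computed. The item-to-subscale groupings follow the description above; the data layout and column names (team, scenario, tas1–tas11) are hypothetical.

```python
# Minimal sketch (not the study's actual code) of TAS subscale scoring.
# Assumes a hypothetical long-format table: one row per rater per team per
# scenario, with items tas1..tas11 on the 6-point scale (1=definitely no,
# 6=definitely yes).
import pandas as pd

SUBSCALES = {
    "TBB": ["tas1", "tas2", "tas3", "tas4", "tas5"],  # team-based behaviours
    "SMM": ["tas6", "tas7", "tas8"],                  # shared mental model
    "ACR": ["tas9", "tas10", "tas11"],                # adaptive communication and response
}

def subscale_means(ratings: pd.DataFrame) -> pd.DataFrame:
    """Mean subscale score per team and scenario, averaged over items and raters."""
    out = ratings[["team", "scenario"]].copy()
    for name, items in SUBSCALES.items():
        out[name] = ratings[items].mean(axis=1)  # mean over the subscale's items
    return out.groupby(["team", "scenario"], as_index=False).mean()

def change_scores(means: pd.DataFrame) -> pd.DataFrame:
    """Scenario 2 minus scenario 1 per subscale: the proxy for learning."""
    wide = means.pivot(index="team", columns="scenario", values=list(SUBSCALES))
    return pd.DataFrame({s: wide[s][2] - wide[s][1] for s in SUBSCALES})
```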

Character and rating of faculty team-led prebriefs and debriefs

Four faculty members (designated A, B, C and D) formed seven faculty team combinations (designated teams 1 through 7), with one to three instructors leading a debrief at any one time. Two faculty members were physicians (one internist and one surgeon); the remaining two faculty members were nursing professionals with advanced degrees. Members A and C had worked together as a facilitator team before the team training of interprofessional students programme began. None of the instructors had undergone a formalised course in debriefing training. As mentioned previously, faculty teams conducted a prebrief at the beginning of each session. During the prebrief, instructors introduced themselves and oriented the students to the computer-operated, human patient simulator (CAE, Montreal, Canada) used for the SBT scenarios as well as the equipment available for each scenario. They discussed the learning objectives of the session related to using Advanced Trauma Life Support guidelines for trauma resuscitation and teaching team-based competencies to improve teamwork. They then provided three ground rules for optimising learning: (1) treat it real (ie, consider the computerised manikin as a real-life patient and provide care as one would in the clinical environment); (2) treat the faculty as ghosts (ie, do not direct any questions to the faculty during the SBT scenario) and (3) treat it like ‘Vegas’ (ie, maintain confidentiality related to how people performed during the SBT scenarios and what was said during debriefs). Students were encouraged to practice the team-based competencies taught when they returned to the clinical environment. Finally, the faculty team introduced the student teams to the first SBT scenario by providing background information. Each debrief would begin with an investigation of student participants’ emotional response to the scenario (ie, How did that feel?). It would then analyse key actions to identify performance gaps and address learning objectives to help fill those gaps. Finally, it would finish with a summary of teaching points with an elicitation of student commitment to work on one item taught.

All SBT sessions underwent video recording of the prebriefs, both scenarios and the debriefs. Two independent observers, blinded both to the teams’ scenario performances and to the date of each performance, retrospectively rated the quality of the prebriefs and of the debriefs following the first scenario by reviewing these videos. For this rating, they used an electronic (eAssessment) version of the Objective Structured Assessment of Debriefing (OSAD) instrument.30 The OSAD is an 8-item scale assessing evidence-based components of an effective debrief using a 5-point Likert scale (1=done very poorly to 5=done very well). It provided a means of evaluating how closely faculty teams adhered to evidence-based components of an effective debrief and thus served as a marker of debrief quality. The raters evaluated the prebriefs using only the first three items of the OSAD (ie, approach, establishment of learning environment and engagement of learners), since the five remaining items focus on after-action components of an effective debrief (ie, reaction, reflection, analysis, diagnosis and application). They rated the debriefs using all eight items.
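A minimal sketch, assuming a simple in-memory representation rather than the eAssessment tool itself, of how OSAD ratings could be validated and averaged across the two independent raters; prebriefs carry only the first three items, as described above.

```python
# Hedged sketch of OSAD rating bookkeeping (hypothetical structures, not the
# eAssessment tool). Scores run 1=done very poorly to 5=done very well.
from dataclasses import dataclass
from statistics import mean

OSAD_ITEMS = ["approach", "environment", "engagement",  # rated for prebriefs and debriefs
              "reaction", "reflection", "analysis",
              "diagnosis", "application"]                # after-action (debrief-only) items

@dataclass
class OSADRating:
    """One rater's scores for a single prebrief or debrief."""
    event_type: str          # "prebrief" or "debrief"
    scores: dict[str, int]

    def validate(self) -> None:
        expected = OSAD_ITEMS[:3] if self.event_type == "prebrief" else OSAD_ITEMS
        assert set(self.scores) == set(expected), "wrong item set for event type"
        assert all(1 <= s <= 5 for s in self.scores.values()), "scores must be 1-5"

def item_means(ratings: list[OSADRating]) -> dict[str, float]:
    """Mean score per OSAD item across independent raters of the same event."""
    return {item: mean(r.scores[item] for r in ratings)
            for item in ratings[0].scores}
```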

Statistical analysis

For student team performance, investigators calculated mean TAS subscale scores for the first and second scenarios. Mean scores were compared from the first to the second scenario using t-test analysis. For prebrief and debrief faculty performance, investigators calculated mean OSAD item scores for the prebrief and first debrief for every faculty team combination. Performance on each OSAD item was compared between teams using analysis of variance with post hoc Tukey’s Studentised Range Procedure. Lastly, a trend analysis over time was conducted using Kendall’s tau coefficient and linear regression.
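For concreteness, a hedged sketch of these comparisons using scipy and statsmodels; the data-frame columns and the choice of an unpaired t-test are assumptions, since the paper does not specify the software or the exact test variants.

```python
# Assumed workflow, not the investigators' code: t-test, one-way ANOVA with
# Tukey's post hoc test, and Kendall's tau trend analysis.
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

def compare_scenarios(scores_s1, scores_s2):
    """t-test of mean TAS subscale scores, scenario 1 vs scenario 2."""
    return stats.ttest_ind(scores_s1, scores_s2)

def compare_faculty_teams(osad: pd.DataFrame, item: str):
    """ANOVA across faculty teams on one OSAD item, then Tukey's Studentised Range."""
    groups = [g[item].to_numpy() for _, g in osad.groupby("faculty_team")]
    anova = stats.f_oneway(*groups)
    tukey = pairwise_tukeyhsd(osad[item], osad["faculty_team"])
    return anova, tukey

def trend_over_time(osad: pd.DataFrame, item: str):
    """Kendall's tau between session order and the item score (trend over time)."""
    return stats.kendalltau(osad["session_order"], osad[item])
```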

After matching the SBT performances of interprofessional student teams with the prebrief and first debrief conducted by the faculty team for a particular session, investigators conducted a multiple linear regression analysis comparing each TAS subscale mean score with the 11 faculty team OSAD mean item scores (three for the prebrief and eight for the debrief). Investigators then performed a univariate analysis of each TAS mean subscale score with each of the 11 faculty performance mean OSAD item scores as a single predictor. Investigators set statistical significance for comparisons and linear regressions at p<0.05. We performed this comparison because we wanted to examine the relationship between each component of debriefing and the overall learning of students in the SBT sessions, and we considered performance as rated by the TAS subscales to be such an outcome.
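Again as a sketch under stated assumptions (hypothetical column names; statsmodels OLS standing in for whatever package the investigators used), the multivariable regression of a TAS subscale change score on all 11 OSAD item means, followed by the 11 single-predictor fits, might look like this, with significance set at p<0.05:

```python
# Hedged sketch of the regression step; not the authors' code.
import pandas as pd
import statsmodels.api as sm

ALPHA = 0.05  # significance threshold stated in the methods
OSAD_COLS = ["pb_approach", "pb_environment", "pb_engagement",  # 3 prebrief items
             "db_approach", "db_environment", "db_engagement",
             "db_reaction", "db_reflection", "db_analysis",
             "db_diagnosis", "db_application"]                  # 8 debrief items

def fit_models(df: pd.DataFrame, outcome: str):
    """Multivariable OLS on all 11 OSAD predictors plus 11 univariate OLS fits."""
    multi = sm.OLS(df[outcome], sm.add_constant(df[OSAD_COLS])).fit()
    uni = {c: sm.OLS(df[outcome], sm.add_constant(df[c])).fit()
           for c in OSAD_COLS}
    return multi, uni

def multivariable_hits(multi, uni):
    """Items significant in the multivariable model, and whether each survives univariately."""
    return {c: uni[c].pvalues[c] < ALPHA
            for c in OSAD_COLS if multi.pvalues[c] < ALPHA}
```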

Results

Results related to data analysis of the team training of interprofessional students programme27 as well as the analysis of prebrief and debrief quality26 have been previously published. For the team training of interprofessional students programme, 187 first-time attendee students participated in the SBT sessions. Seven separate faculty teams had their prebrief and debrief performances evaluated using the OSAD (table 1). Prebrief OSAD mean item scores were ≥3.0 except for two scores. All debrief OSAD mean item scores were ≥3.5, except for one score. Linear regression analysis demonstrated improvement in faculty prebrief and debrief scores on certain items over time.

Table 1.

Summary of the team training of interprofessional students programme’s OSAD scores of prebriefs and debriefs for the seven faculty team combinations conducting them

Faculty team designation Type of brief (N) OSAD item score (mean±SE)
Approach Environment Engagement Reaction Reflection Analysis Diagnosis Application
1 Prebrief (2) 3.5±0.7 4.3±0.4 4.0±0.0 n/a n/a n/a n/a n/a
Debrief (11) 4.7±0.4 4.1±0.8 4.7±0.4 4.4±0.7 4.7±0.4 4.7±0.4 4.7±0.4 4.6±0.5
2 Prebrief (9) 4.0±0.8 4.8±0.4 4.7±0.4 n/a n/a n/a n/a n/a
Debrief (5) 4.1±0.4 3.7±1.1 4.3±0.5 4.0±0.6 4.6±0.4 4.8±0.8 4.6±0.6 4.9±0.2
3 Prebrief (7) 4.3±0.7 4.5±0.5 3.9±0.5 n/a n/a n/a n/a n/a
Debrief (3) 4.7±0.3 3.8±0.8 4.7±0.6 4.5±0.5 4.8±0.3 4.8±0.3 4.5±0.5 4.7±0.3
4 Prebrief (4) 2.8±0.5 2.6±1.1 3.0±1.1 n/a n/a n/a n/a n/a
Debrief (3) 3.8±0.8 3.7±0.6 4.2±1.0 3.7±1.0 4.3±0.8 4.2±0.3 4.2±1.0 4.1±1.0
5 Prebrief (14) 3.2±1.0 3.0±1.0 3.4±1.2 n/a n/a n/a n/a n/a
Debrief (11) 3.8±0.8 4.0±0.6 4.1±0.7 4.1±0.8 4.4±0.6 4.4±0.7 4.5±0.7 4.3±1.0
6 Prebrief (0) n/a n/a n/a n/a n/a n/a n/a n/a
Debrief (4) 4.8±0.5 4.6±0.5 4.8±0.5 4.8±0.5 4.9±0.3 5.0±0.0 5.0±0.0 4.9±0.3
7 Prebrief (2) 3.0±0.0 4.3±0.4 4.0±0.7 n/a n/a n/a n/a n/a
Debrief (1) 3.0 3.5 4.0 4.0 4.0 4.0 4.0 4.0

n/a, not available; OSAD, Objective Structured Assessment of Debriefing.

Table 2 summarises the changes in mean TAS subscale scores of student team performance before and after the first debriefing held during the SBT sessions. In brief, statistically significant increases of over one full unit occurred in TAS subscale scores after the first debriefing. Table 3 summarises the multiple linear regression analysis results comparing mean TAS subscale change scores with each of the 11 rated mean OSAD items. In brief, multiple linear regression analysis demonstrated a statistically significant relationship between the ACR mean subscale score and the debrief engagement of learners score (parameter estimate=1.46, p=0.013). No significance remained on univariate analysis.

Table 2.

Summary of mean Teamwork Assessment Scales (TAS) subscale scores for interprofessional student team performance before and after the first debriefing of simulation-based training sessions

Team performance timing TAS subscale
TBB scores (n=168) SMM scores (n=151) ACR scores (n=151)
Mean±SD Mean±SD Mean±SD
Predebriefing 2.81±0.68 2.47±0.82 2.16±0.73
Postdebriefing 3.96±0.74 3.73±0.76 3.31±0.82
P value* <0.001 <0.001 <0.001

*One-way ANOVA.

ACR, adaptive communication and response; ANOVA, analysis of variance; SMM, shared mental model; TBB, team-based behaviours.

Table 3.

Multivariable model prediction between the Teamwork Assessment Scales (TAS) Subscales and the Objective Structured Assessment of Debriefing (OSAD) items with subsequent univariate analysis

TAS subscale TBB scores SMM scores ACR scores
OSAD items Multivariable model‡ P value Univariate analysis P value Multivariable model‡ P value Univariate analysis P value Multivariable model‡ P value Univariate analysis P value
Prebrief*
 Approach −0.08570 0.7216 0.21922 0.0714 −0.09995 0.7631 0.23391 0.1407 −0.32721 0.2560 0.16339 0.2661
 Environment 0.06062 0.7932 0.11709 0.2577 0.24958 0.4364 0.16771 0.2098 0.21718 0.4300 0.15226 0.2155
 Engagement 0.12304 0.6149 0.16471 0.1714 0.08976 0.7897 0.19771 0.2060 0.16727 0.5634 0.17361 0.2274
Debrief†
 Approach −0.09134 0.7628 0.22074 0.1863 −0.32172 0.4431 0.11518 0.5986 −0.45763 0.2083 0.12341 0.5390
 Environment 0.00374 0.9890 0.07585 0.6381 −0.14264 0.7041 0.00502 0.9809 0.09889 0.7588 0.10581 0.5813
 Engagement 0.84096 0.0794 0.36114 0.0552 1.00136 0.1267 0.31094 0.2097 1.45665 0.0130 0.37508 0.0967
 Reaction −0.20660 0.5784 0.21860 0.1851 −0.29717 0.5622 0.13788 0.5230 −0.38255 0.3865 0.16726 0.3981
 Reflection −0.48221 0.3191 0.00330 0.9886 −0.66129 0.3217 −0.03132 0.9164 −0.78257 0.1755 −0.06944 0.8000
 Analysis −0.31957 0.5076 −0.11259 0.6410 −0.05431 0.9346 −0.09752 0.7554 −0.23248 0.6827 −0.12190 0.6716
 Diagnosis 0.78778 0.1021 0.11326 0.5910 0.83410 0.2042 0.06251 0.8192 0.79181 0.1616 −0.00492 0.9844
 Application −0.35443 0.3859 −0.08307 0.6354 −0.36848 0.5118 −0.09371 0.6798 −0.32147 0.5047 −0.10831 0.6035

*N=38 events among six faculty team combinations.

†N=38 events among seven faculty team combinations.

‡Multiple linear regression analysis.

ACR, adaptive communication and response; SMM, shared mental model; TBB, team-based behaviours.

Discussion

In this study, the degree of change in interprofessional student team performance from one SBT scenario to a second did not correlate with the quality of either the prebrief before the first scenario or the debrief between the scenarios. Thus, we fail to reject the null hypothesis: the data do not support our hypothesis that debrief quality influenced the degree of student learning as measured by improvement in interprofessional student team performance from one scenario to the second. Only one statistically significant correlation between the 11 OSAD variables and the 3 subscales of the TAS existed. This correlation, between the debrief engagement of learners OSAD item and the ACR subscale score on multiple linear regression analysis, was lost on univariate analysis.

Substantial research in the literature seems to support the importance of debriefs in SBT. Several investigators have demonstrated the superiority of conducting a debrief in conjunction with SBT compared with performing SBT alone.24 31–34 Thus, many simulation and medical education organisations support the use of debriefs in SBT35 36 and interprofessional education.37 38 Garden et al’s systematic review25 on conducting debriefs after SBT for non-technical skills, however, found that, in several instances, debriefs were no more effective than other posttraining educational interventions, including having no debrief at all. Nonetheless, debriefs can have a powerful influence on team function and performance. For example, a different systematic review39 looking at the effectiveness of debriefs in health professional education concluded that including debriefs as part of SBT increases the effectiveness of teaching both technical and non-technical skills. Outside of SBT, debriefs can improve team performance by 25%,40 and their use in checklist form in an operating room has led to decreased costs and mortality.41

Our results are in line with the findings of Garden et al.25 We found that improvements in the team-based performance of the student teams did not rely on any particular aspect of an effective debrief as measured by the OSAD. This lack of linkage between quality debriefs and learning, as measured by improvements in team-based performance during SBT sessions, emphasises the complex nature of the debriefing and learning processes and their interplay. Debriefs have multiple structural elements,42 43 must pass through several phases44 and require the facilitator to maximise learner receptivity while minimising cognitive overload.45 46 All these efforts become even more challenging when the prebriefs and debriefs occur in the setting of interprofessional education SBT, since the presence of other team members changes the dynamics of the process and can impede47 or prevent a learner from speaking up. Thus, our findings could result from a variety of issues related to debriefing, learning and their interplay.

One explanation for the findings of equivalency may be the fact that, given the relatively high mean OSAD item scores encountered, the variability between faculty teams in prebrief and debrief quality was insufficient to tease out which components of the debriefing process were most important. In other words, the calibre of the debriefing performance of the faculty teams was so homogeneous that finding a relationship between OSAD mean item scores and interprofessional student team performance was too difficult. Faculty team OSAD mean item scores were almost all ≥3.0 for prebriefs, except for two scores, and nearly universally ≥3.5 for debriefs, except for one score. Since trained observers rated each faculty team performance, we believe that these scores more accurately reflect actual debriefing quality than would student-based or facilitator-based OSAD ratings, as Hull et al 48 have shown that these latter ratings tend to overestimate scores compared with observers.

The good performance of the faculty teams likely rested on the fact that the team training of interprofessional students format followed many suggested and evidence-based guidelines for conducting an effective debriefing process. The authors intentionally planned to have postevent debriefs immediately following each SBT scenario to optimise recall and learning, especially in the context of teaching team-based competencies.36 47 They ensured that the after-action debriefs were facilitator guided, given the fact that students benefit more from this format compared with self-debriefing.49 Furthermore, they conducted a prebrief to prepare learners before commencing any SBT.

In addition to following this three-part prebrief, scenario and debrief format, the authors also attempted to ensure that both prebriefs and debriefs followed best-practice structures and techniques. Facilitators adhered to three debriefing duties: (1) making it safe (ie, creating a learning environment in which learners felt secure and supported), (2) making it stick (ie, encouraging learners to self-reflect, identify gaps in performance and develop solutions to them) and (3) making it last (ie, eliciting a commitment to change).43 45 Furthermore, facilitators used a ‘fan’ approach when interacting with the learners in order to make sure that each one was able to express thoughts and reflections.50 Finally, the prebriefs and debriefs followed several best practices for debriefing medical teams.51

Finally, student teams consisted of members who were both novices to teamwork and team-based competencies and unfamiliar with debriefing processes and techniques. As a result, facilitators often had to lean more toward using instructor-centred debriefing techniques52 in lieu of learner-centred teaching in order to move debriefs along and ensure that students understood learning objectives. Such an instructor-centred approach may have resulted in a smaller increase in the performance scores of the student teams. This change in performance score may not have been enough to produce a statistically significant correlation between the OSAD mean item scores and the mean TAS subscale scores.

Future directions of research include educating student teams in debriefing processes in order to determine its impact on team behaviour. Student-led debriefings could then be rated using the OSAD in order to delineate which components are best learnt. In addition, the relationship between improvements in team performance and the quality of prebriefs and debriefs could then be examined. Finally, facilitator-led and student-led prebriefs and debriefs could be compared to determine which components of an effective debrief are more emphasised or of better quality.

This study has several limitations. First, the fact that facilitators had to use more instructor-centred teaching techniques during the prebriefs and debriefs could have blunted the degree of learning. Second, this study involved only faculty and students from a single institution over the course of 1 year, limiting generalisability and the opportunity to examine changes over time. Third, we used student team-based performance improvement during SBT as our marker for student learning. Improving team performance involves applying learnt knowledge, skills and abilities together to work better as a unit. Thus, if prebriefs and debriefs succeeded in improving one aspect of these knowledge, skills and abilities and not others, performance improvement may not have been enough to demonstrate a relationship. We made this choice because we wanted to look at overall learning, and we felt that performance was such an outcome. Finally, the high OSAD mean item scores may have produced a ceiling effect, resulting in a lack of variance that led to an inability to detect a correlation between debriefing quality and student learning. In other words, the relatively high quality of the prebriefs and debriefs overall may be an anomaly, limiting the generalisability of our findings. This ceiling effect could potentially be overcome by comparing the effect on learning between faculty teams that are novices in debriefing techniques and teams containing expert facilitators.

In conclusion, this study did not demonstrate a relationship between the degree of learning among interprofessional student teams undergoing SBT, as measured by improvements in team performance during an SBT session, and the quality of debriefing, as measured by OSAD items related to effective prebriefs and debriefs. This lack of a relationship is likely due to the complex nature of the debriefing and learning processes and their interplay, which may have blunted the degree of learning experienced by the student learners. In addition, the generally good quality of the prebriefs and debriefs may have prevented detection of a relationship on statistical analysis. Nonetheless, the team training of interprofessional students programme format and the relatively high quality of the faculty team prebriefs and debriefs led to students learning key team-based competencies essential to highly reliable team performance in the clinical environment.

What is already known on this subject?

  • Given the extensive literature emphasising the critical role of prebriefs and debriefs in simulation-based training in health professional education, conventional wisdom assumes that the quality of a debriefing correlates with the amount of learning that occurs. To date, however, the literature lacks empirical studies quantitatively linking debriefing quality with degree of learning.

What does this study add?

  • This study is one of the first in the literature to determine whether interprofessional student team learning in simulation-based training improves with the quality of core components of effective debriefing. Our findings indicate that, in the setting of average or above-average debriefing quality, the degree of learning of student teams during simulation-based training sessions does not seem to depend on any particular core component of effective debriefing.

Acknowledgments

The authors would like to acknowledge all the students who participated in the simulation-based training sessions from the LSU Health New Orleans Schools of Medicine and Nursing.

Footnotes

Presented at: Aspects of this work were published as an online abstract on the Association for Surgical Education’s (ASE’s) website after being accepted as a poster presentation for the 2020 Annual Meeting in Seattle, WA that was cancelled due to the COVID-19 pandemic.

Contributors: Study concept and design: JTP, DDG, JZ and RBdC. Acquisition of data: JTP, DDG, VR, VJK, JZ and RBdC. Analysis and interpretation of data: JTP, DDG, JZ, RBdC, QY and LZ. Drafting of the manuscript: JTP. Critical revision of the manuscript for important intellectual content: all authors. Statistical analysis: QY and LZ. Obtained funding: JTP and DDG. Administrative, technical and material support: VR, VJK, JTP, DDG, JZ and RBdC.

Funding: This work was in part supported through a 2011–2012 Educational Enhancement Grant from the LSU Health New Orleans Teaching Academy.

Competing interests: JTP receives royalties from Oxford University Press and Springer Nature for three books relating to simulation or surgical education. He is also a consultant to Boston Scientific as a faculty instructor. Finally, he receives grant support from the Southern Group on Educational Affairs (SGEA) and the International Association of Medical Science Educators (IAMSE) as PI for teamwork research, as well as from Acell as a site investigator for wound healing research. DDG and QY are coinvestigators on the SGEA and IAMSE grants. The remaining authors do not have any disclosures.

Data availability statement

Data available on request from corresponding author.

Ethics statements

Patient consent for publication

Not required.

Ethics approval

The Institutional Review Board at LSU Health New Orleans Health Sciences approved this work as part of an existing exempt protocol.

References

  • 1. Mannix R, Nagler J. Tribalism in Medicine-Us vs them. JAMA Pediatr 2017;171:831. 10.1001/jamapediatrics.2017.1280 [DOI] [PubMed] [Google Scholar]
  • 2. Braithwaite J, Clay-Williams R, Vecellio E, et al. The basis of clinical tribalism, hierarchy and stereotyping: a laboratory-controlled teamwork experiment. BMJ Open 2016;6:e012467. 10.1136/bmjopen-2016-012467 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 3. Weller J, Boyd M, Cumin D. Teams, tribes and patient safety: overcoming barriers to effective teamwork in healthcare. Postgrad Med J 2014;90:149–54. 10.1136/postgradmedj-2012-131168 [DOI] [PubMed] [Google Scholar]
  • 4. Davis WA, Jones S, Crowell-Kuhnberg AM, et al. Operative team communication during simulated emergencies: too busy to respond? Surgery 2017;161:1348–56. 10.1016/j.surg.2016.09.027 [DOI] [PubMed] [Google Scholar]
  • 5. El-Shafy IA, Delgado J, Akerman M, et al. Closed-Loop communication improves task completion in pediatric trauma resuscitation. J Surg Educ 2018;75:58–64. 10.1016/j.jsurg.2017.06.025 [DOI] [PubMed] [Google Scholar]
  • 6. Jung HS, Warner-Hillard C, Thompson R, et al. Why saying what you mean matters: an analysis of trauma team communication. Am J Surg 2018;215:250–4. 10.1016/j.amjsurg.2017.11.008 [DOI] [PubMed] [Google Scholar]
  • 7. Bochatay N, Muller-Juge V, Scherer F, et al. Are role perceptions of residents and nurses translated into action? BMC Med Educ 2017;17:138. 10.1186/s12909-017-0976-2 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8. Alzahrani N, Jones R, Rizwan A, et al. Safety attitudes in hospital emergency departments: a systematic review. Int J Health Care Qual Assur 2019;32:1042–54. 10.1108/IJHCQA-07-2018-0164 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9. Doja A, Bould MD, Clarkin C, et al. The hidden and informal curriculum across the continuum of training: a cross-sectional qualitative study. Med Teach 2016;38:410–8. 10.3109/0142159X.2015.1073241 [DOI] [PubMed] [Google Scholar]
  • 10. Lacerenza CN, Marlow SL, Tannenbaum SI, et al. Team development interventions: evidence-based approaches for improving teamwork. Am Psychol 2018;73:517–31. 10.1037/amp0000295 [DOI] [PubMed] [Google Scholar]
  • 11. Schmutz JB, Meier LL, Manser T. How effective is teamwork really? The relationship between teamwork and performance in healthcare teams: a systematic review and meta-analysis. BMJ Open 2019;9:e028280. 10.1136/bmjopen-2018-028280 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12. Marlow SL, Hughes AM, Sonesh SC, et al. A systematic review of team training in health care: ten questions. Jt Comm J Qual Patient Saf 2017;43:197–204. 10.1016/j.jcjq.2016.12.004 [DOI] [PubMed] [Google Scholar]
  • 13. McLaughlin C, Barry W, Barin E, et al. Multidisciplinary simulation-based team training for trauma resuscitation: a scoping review. J Surg Educ 2019;76:1669–80. 10.1016/j.jsurg.2019.05.002 [DOI] [PubMed] [Google Scholar]
  • 14. Wu M, Tang J, Etherington N, et al. Interventions for improving teamwork in intrapartum care: a systematic review of randomised controlled trials. BMJ Qual Saf 2020;29:77–85. 10.1136/bmjqs-2019-009689 [DOI] [PubMed] [Google Scholar]
  • 15. Wong AH-W, Gang M, Szyld D, et al. Making an "attitude adjustment": using a simulation-enhanced interprofessional education strategy to improve attitudes toward teamwork and communication. Simul Healthc 2016;11:117–25. 10.1097/SIH.0000000000000133 [DOI] [PubMed] [Google Scholar]
  • 16. Sacks GD, Shannon EM, Dawes AJ, et al. Teamwork, communication and safety climate: a systematic review of interventions to improve surgical culture. BMJ Qual Saf 2015;24:458–67. 10.1136/bmjqs-2014-003764 [DOI] [PubMed] [Google Scholar]
  • 17. Baker VO, Cuzzola R, Knox C. Teamwork education improves trauma team performance in undergraduate health professional students. J Educ Eval Health Prof 2015;12:36. 10.3352/jeehp.2015.12.36 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18. Kyrkjebø JM, Brattebø G, Smith-Strøm H. Improving patient safety by using interprofessional simulation training in health professional education. J Interprof Care 2006;20:507–16. 10.1080/13561820600918200 [DOI] [PubMed] [Google Scholar]
  • 19. Escher C, Creutzfeldt J, Meurling L, et al. Medical students' situational motivation to participate in simulation based team training is predicted by attitudes to patient safety. BMC Med Educ 2017;17:37. 10.1186/s12909-017-0876-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20. Paige JT, Garbee DD, Brown KM, et al. Using simulation in interprofessional education. Surg Clin North Am 2015;95:751–66. 10.1016/j.suc.2015.04.004 [DOI] [PubMed] [Google Scholar]
  • 21. Issenberg SB, McGaghie WC, Petrusa ER, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005;27:10–28. 10.1080/01421590500046924 [DOI] [PubMed] [Google Scholar]
  • 22. McGaghie WC, Issenberg SB, Petrusa ER, et al. A critical review of simulation-based medical education research: 2003-2009. Med Educ 2010;44:50–63. 10.1111/j.1365-2923.2009.03547.x [DOI] [PubMed] [Google Scholar]
  • 23. Chauvin SW. Educational principles in simulation. In: Bok L, Robertson H, Paige JT, eds. Simulation in radiology. New York: Oxford University Press, 2012. [Google Scholar]
  • 24. Coppens I, Verhaeghe S, Van Hecke A, et al. The effectiveness of crisis resource management and team debriefing in resuscitation education of nursing students: a randomised controlled trial. J Clin Nurs 2018;27:77–85. 10.1111/jocn.13846 [DOI] [PubMed] [Google Scholar]
  • 25. Garden AL, Le Fevre DM, Waddington HL, et al. Debriefing after simulation-based non-technical skill training in healthcare: a systematic review of effective practice. Anaesth Intensive Care 2015;43:300–8. 10.1177/0310057X1504300303 [DOI] [PubMed] [Google Scholar]
  • 26. Paige JT, Zamjahn J, Baroni de Carvalho R. Evaluating the quality of video-taped debriefings after simulation team training. Surgery 2019;165:1069–74. [DOI] [PubMed] [Google Scholar]
  • 27. Paige JT, Garbee DD, Yu Q, et al. Team training of inter-professional students (TTIPS) for improving teamwork. BMJ Stel 2017;3:127–34. 10.1136/bmjstel-2017-000194 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28. Paige JT, Garbee DD, Kozmenko V, et al. Getting a head start: high-fidelity, simulation-based operating room team training of interprofessional students. J Am Coll Surg 2014;218:140–9. 10.1016/j.jamcollsurg.2013.09.006 [DOI] [PubMed] [Google Scholar]
  • 29. Garbee DD, Paige JT, Barrier K, et al. Interprofessional teamwork and communication collaboration among students in simulated codes: a quasiexperimental study. Nurs Educ Perspect 2013;34:339–44. [DOI] [PubMed] [Google Scholar]
  • 30. Zamjahn JB, Baroni de Carvalho R, Bronson MH, et al. eAssessment: development of an electronic version of the objective structured assessment of debriefing tool to streamline evaluation of video recorded debriefings. J Am Med Inform Assoc 2018;25:1284–91. 10.1093/jamia/ocy113 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 31. Ryoo EN, Ha E-H. The importance of debriefing in simulation-based learning: comparison between debriefing and no debriefing. Comput Inform Nurs 2015;33:538–45. 10.1097/CIN.0000000000000194 [DOI] [PubMed] [Google Scholar]
  • 32. Shinnick MA, Woo M, Horwich TB, et al. Debriefing: the most important component in simulation? Clin Simul Nurs 2011;7:e105–11. 10.1016/j.ecns.2010.11.005 [DOI] [Google Scholar]
  • 33. Morgan PJ, Tarshis J, LeBlanc V, et al. Efficacy of high-fidelity simulation debriefing on the performance of practicing anaesthetists in simulated scenarios. Br J Anaesth 2009;103:531–7. 10.1093/bja/aep222 [DOI] [PubMed] [Google Scholar]
  • 34. Savoldelli GL, Naik VN, Park J, et al. Value of debriefing during simulated crisis management: oral versus video-assisted oral feedback. Anesthesiology 2006;105:279–85. 10.1097/00000542-200608000-00010 [DOI] [PubMed] [Google Scholar]
  • 35. Decker S, Fey M, Sideras S, et al. Standards of best practice: simulation standard VI: the debriefing process. Clin Simul Nurs 2013;9:S26–9. 10.1016/j.ecns.2013.04.008 [DOI] [Google Scholar]
  • 36. Motola I, Devine LA, Chung HS, et al. Simulation in healthcare education: a best evidence practical guide. AMEE guide No. 82. Med Teach 2013;35:e1511–30. 10.3109/0142159X.2013.818632 [DOI] [PubMed] [Google Scholar]
  • 37. Reeves S, Fletcher S, Barr H, et al. A BEME systematic review of the effects of interprofessional education: BEME guide No. 39. Med Teach 2016;38:656–68. 10.3109/0142159X.2016.1173663 [DOI] [PubMed] [Google Scholar]
  • 38. Hammick M, Freeth D, Koppel I, et al. A best evidence systematic review of interprofessional education: BEME guide No. 9. Med Teach 2007;29:735–51. 10.1080/01421590701682576 [DOI] [PubMed] [Google Scholar]
  • 39. Levett-Jones T, Lapkin S. A systematic review of the effectiveness of simulation debriefing in health professional education. Nurse Educ Today 2014;34:e58–63. 10.1016/j.nedt.2013.09.020 [DOI] [PubMed] [Google Scholar]
  • 40. Tannenbaum SI, Cerasoli CP. Do team and individual debriefs enhance performance? A meta-analysis. Hum Factors 2013;55:231–45. 10.1177/0018720812448394 [DOI] [PubMed] [Google Scholar]
  • 41. Rose MR, Rose KM. Use of a surgical debriefing checklist to achieve higher value health care. Am J Med Qual 2018;33:514–22. 10.1177/1062860618763534 [DOI] [PubMed] [Google Scholar]
  • 42. Lederman LC. Debriefing: toward a systematic assessment of theory and practice. Simul Gaming 1992;23:145–60. 10.1177/1046878192232003 [DOI] [Google Scholar]
  • 43. Paige JT. Making it stick: keys to effective feedback and debriefing. In: Stefanidis D, Kordorffer JR, Sweet R, eds. Surgical education in simulation for surgery and surgical subspecialties. New York: Springer, 2019. [Google Scholar]
  • 44. Rudolph JW, Simon R, Dufresne RL, et al. There's no such thing as "nonjudgmental" debriefing: a theory and method for debriefing with good judgment. Simul Healthc 2006;1:49–55. 10.1097/01266021-200600110-00006 [DOI] [PubMed] [Google Scholar]
  • 45. Paige JT. Principles of simulation. In: Bok L, Robertson H, Paige JT, eds. Simulation in radiology. New York: Oxford University Press, 2012. [Google Scholar]
  • 46. Fraser KL, Meguerdichian MJ, Haws JT, et al. Cognitive load theory for debriefing simulations: implications for faculty development. Adv Simul 2018;3:28. 10.1186/s41077-018-0086-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 47. Kolbe M, Grande B, Spahn DR. Briefing and debriefing during simulation-based training and beyond: content, structure, attitude and setting. Best Pract Res Clin Anaesthesiol 2015;29:87–96. 10.1016/j.bpa.2015.01.002 [DOI] [PubMed] [Google Scholar]
  • 48. Hull L, Russ S, Ahmed M, et al. Quality of interdisciplinary postsimulation debriefing: 360° evaluation. BMJ Stel 2017;3:9–16. 10.1136/bmjstel-2016-000125 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 49. Kim Y-J, Yoo J-H. The utilization of debriefing for simulation in healthcare: a literature review. Nurse Educ Pract 2020;43:102968. 10.1016/j.nepr.2020.102698 [DOI] [PubMed] [Google Scholar]
  • 50. Dieckmann P, Molin Friis S, Lippert A, et al. The art and science of debriefing in simulation: ideal and practice. Med Teach 2009;31:e287–94. 10.1080/01421590902866218 [DOI] [PubMed] [Google Scholar]
  • 51. Salas E, Klein C, King H, et al. Debriefing medical teams: 12 evidence-based best practices and tips. Jt Comm J Qual Patient Saf 2008;34:518–27. 10.1016/S1553-7250(08)34066-5 [DOI] [PubMed] [Google Scholar]
  • 52. Cheng A, Morse KJ, Rudolph J, et al. Learner-centered debriefing for health care simulation education: lessons for faculty development. Simul Healthc 2016;11:32–40. 10.1097/SIH.0000000000000136 [DOI] [PubMed] [Google Scholar]


