Abstract
Simulation-based learning activities in the emergency department (ED) improve communication and teamwork and familiarise personnel with existing protocols. The authors’ objective was to develop standardised in-situ simulations and to assess their effects on team performance during simulated patient care. The study was a prospective, single-centre pre-in-situ and post-in-situ simulation-based intervention in the ED of an academic hospital between March 2017 and February 2018. Teams of three to five participants (n=46) took part in two simulation interventions 2 weeks apart; each simulation was followed by debriefing with good judgement. The adapted Simulation Team Assessment Tool (STAT) score was the primary measure of team performance. Skills are scored on a scale of 0–2 based on the complete and timely performance of tasks, for a total (adapted) score of 171. Overall STAT scores improved significantly between simulations I (60.5 (28.3)) and II (81.1 (24.6)), p=0.029, notably in the basic skills and teamwork domains, p=0.022 and p=0.023, respectively. A sub-analysis showed that participants performed significantly better when treating adult versus paediatric simulated patients (87.9 (20.1)), p=0.003, particularly in teamwork, p=0.01. The study yielded statistically significant improvement in clinical management, teamwork and resource management skills among ED personnel.
Keywords: simulation-based medical education, in-situ simulation, simulation for teamwork training, simZones
Introduction
As a high-stress, high-acuity environment with rapid patient turnover, the emergency department (ED) is a popular site for in-situ simulation.1 Working under extreme stress can cause medical professionals to deviate from clinical guidelines and policies even when they are aware of them.2 Simulation-based learning (SBL) activities are often used in the ED to improve communication and teamwork and to familiarise ED personnel with existing protocols. Prior to designing and initiating this project, Batley et al conducted a needs assessment for simulation within the ED,3 which showed a significant perceived knowledge gap among personnel in (1) adherence to ED guidelines, (2) inter-team communication and (3) dealing with difficult families. Respondents agreed unanimously on shifting from traditional didactic teaching to integrated SBL, particularly in situ.
Roussin and Weinstock propose an innovative organisation of SBL, which they call SimZones, comprising zones 0–4 targeted to different learner groups and goals.4 SimZone 3 is described as team-based simulation for double-loop learning, aimed at team and system development. A crucial element of this exercise is the expert facilitator-led debriefing, which is meant to uncover the assumptions and values guiding the team’s behaviour.5–7
Based on the needs and gaps identified by ED personnel, the authors’ objective for this project was to develop standardised in-situ simulation exercises within the context of SimZone 3 and to assess their effects on team performance during simulated patient care.
Methods
The project was a prospective, single-centre pre-in-situ and post-in-situ simulation-based intervention in the ED of an academic hospital between March 2017 and February 2018. Teams of three to five participants (n=46 ED personnel) took part in two simulation interventions 2 weeks apart; each simulation was followed by an expert-facilitated debriefing with good judgement.8 Participants acted as their own controls between the presimulation and postsimulation, to limit confounding and bias. The simulations and debriefings were video recorded for analysis and review.
Scenarios were developed by the authors based on existing ED guidelines and protocols (table 1); they were piloted and refined by the team in the simulation laboratory prior to the in-situ intervention.
Table 1.
Simulated scenario goals and objectives map, based on the Simulation Team Assessment Tool (STAT) categorisation
| | Asthma (adult and paediatric) | Sepsis (adult and paediatric) | Anaphylaxis (adult and paediatric) | MVA (adult and paediatric) | Burn (paediatric) | STEMI (adult) |
| --- | --- | --- | --- | --- | --- | --- |
| Core clinical objectives | Recognise and treat status asthmaticus | Recognise and treat septic shock | Recognise and treat anaphylaxis | Recognise trauma and Ddx of internal trauma and diagnostics (eFAST); haemodynamic stabilisation | Secure airway and pain management | Recognise and manage STEMI |
| Basic skills | SAMPLE history | SAMPLE history | Primary survey and weight estimation | Primary and secondary survey | Primary and secondary survey | SAMPLE history |
| Airway and breathing | Assess, evaluate and anticipate respiratory failure | Oxygenation | RSI skills, preparation, intubation and confirmation | Supporting airway and breathing | Assessment, evaluation and anticipation of difficult intubation | Supporting airway and breathing |
| Circulation | Timely IV access and IVF resuscitation | Timely IV access, recognition of shock and IVF resuscitation | Timely IO access and IVF resuscitation | Monitor pulses and recognise abnormal rhythm | | |
| Teamwork | Directed closed-loop communication, role clarity and situational awareness (all scenarios) | | | | | |
Ddx, differential diagnosis; IO, intraosseous; IV, intravenous; IVF, intravenous fluids; MVA, motor vehicle accident; RSI, Rapid Sequence Intubation; SAMPLE, sign/symptoms, allergies, medications, past illness, last meal and events preceding; STEMI, ST-elevation myocardial infarction.
The Simulation Team Assessment Tool (STAT) score9 was the primary measure of team performance, covering basic, airway, circulation and teamwork skills. The authors adapted the original STAT by omitting 17 items under the subheadings of cardiopulmonary resuscitation and arrhythmias from the circulation category, as these skills were not integrated into the scenarios and were not tasks the participants were required to perform. Each STAT item is scored on a scale of 0–2 based on the complete and timely performance of the task; the total score for the adapted STAT was 171 (see supplementary file STAT adapted). The raters assessed each simulation in real time and finalised their scores immediately after the session (preliminary Cronbach α 0.74); discrepancies between raters were resolved by a review of the simulation videos and a second evaluation (Cronbach α 0.90).
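To make the scoring and reliability computations concrete, the following is a minimal sketch (not the authors’ analysis code) of how an adapted STAT total and a Cronbach’s α for inter-rater agreement could be computed; the function, item labels and scores shown are hypothetical illustrations.

```python
# Minimal sketch of STAT-style scoring and Cronbach's alpha (hypothetical data).
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (cases x items) score matrix.

    Here each column could hold one rater's overall score, with one
    row per simulated session, to gauge inter-rater consistency.
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                      # number of "items" (raters)
    item_vars = scores.var(axis=0, ddof=1)   # variance of each column
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical overall scores from two raters across six simulated sessions
rater_scores = np.array([
    [58, 61],
    [72, 70],
    [45, 49],
    [90, 88],
    [63, 60],
    [81, 84],
])
print(f"Cronbach's alpha: {cronbach_alpha(rater_scores):.2f}")

# Each STAT item is scored 0-2 for complete and timely performance, so a
# team's total is the sum over the adapted item list (hypothetical items here).
item_scores = {"obtains SAMPLE history": 2, "applies monitors": 1, "timely IV access": 0}
print("Adapted STAT total (3 items):", sum(item_scores.values()))
```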
The secondary measure was a postintervention survey, which included 13 closed-ended questions regarding the participants’ experience with the simulation, their perceived benefits and their willingness to participate in similar activities in the future.
Data were analysed quantitatively using the Statistical Package for the Social Sciences (SPSS, IBM Corp). Reliability analyses were performed for overall and domain scores. Descriptive analyses included number and percentage for categorical variables, and mean and SD for continuous ones. Paired t-tests were used to analyse differences in team performance between simulations I and II, and independent t-tests were used to analyse differences based on scenario focus (eg, paediatric vs adult).
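For illustration, the sketch below reproduces the same comparisons in Python with scipy rather than SPSS; the team-level scores used are hypothetical placeholders, not study data.

```python
# Minimal sketch of the paired and independent t-tests described above,
# using scipy instead of SPSS; all scores below are hypothetical.
import numpy as np
from scipy import stats

# Overall STAT scores for the same teams in simulation I and simulation II
sim1 = np.array([42, 55, 61, 70, 48, 66, 73, 80, 39, 71], dtype=float)
sim2 = np.array([60, 74, 79, 88, 70, 85, 95, 99, 58, 92], dtype=float)

# Paired t-test: each team acts as its own control across the two sessions
t_paired, p_paired = stats.ttest_rel(sim2, sim1)
diff = sim2 - sim1
ci_low, ci_high = stats.t.interval(0.95, len(diff) - 1,
                                   loc=diff.mean(), scale=stats.sem(diff))
print(f"Paired t-test: t={t_paired:.2f}, p={p_paired:.3f}, "
      f"mean change={diff.mean():.1f} (95% CI {ci_low:.1f} to {ci_high:.1f})")

# Independent t-test: teams treating adult vs paediatric simulated patients
adult = np.array([88, 92, 81, 90, 86], dtype=float)
paediatric = np.array([64, 70, 59, 73, 66], dtype=float)
t_ind, p_ind = stats.ttest_ind(adult, paediatric)
print(f"Independent t-test (adult vs paediatric): t={t_ind:.2f}, p={p_ind:.3f}")
```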
Results
Forty-six ED personnel participated in the simulations (28 residents, 12 nurses, 6 attendings). Overall STAT scores improved significantly between simulations I (60.5 (28.3)) and II (81.1 (24.6)), p=0.029 (table 2), notably in basic skills (11.8 (5.15) vs 16.5 (3.9), p=0.022) and teamwork (20.3 (11.9) vs 30.9 (13.7), p=0.023). Scores in the airway and circulation domains also improved between simulations I and II, though not significantly (airway: from 18.0 (10.7) to 21.0 (12.7), p=0.41; circulation: from 10.4 (3.1) to 12.6 (2.8), p=0.10).
Table 2.
Simulation Team Assessment Tool (STAT) overall and domain scores for simulation I and simulation II (*p <0.05)
| | Simulation I Mean (SD) (95% CI) | Simulation II Mean (SD) (95% CI) | Mean change (SD) (95% CI) | P value |
| --- | --- | --- | --- | --- |
| STAT score (/177) | 60.5 (28.3) (40.2 to 80.7) | 81.1 (24.6) (63.1 to 98.3) | 20.4 (11.9) (4.62 to 45.42) | 0.02* |
| Basic skills | 11.8 (5.15) (8.1 to 15.4) | 16.5 (3.9) (13.6 to 19.3) | 4.7 (5.3) (0.854 to 8.54) | 0.02* |
| Airway | 18.0 (10.7) (10.3 to 25.6) | 21.0 (12.7) (11.9 to 30.1) | 3 (11.3) (−5.06 to 11.16) | 0.41 |
| Circulation | 10.4 (3.1) (8.1 to 12.6) | 12.6 (2.8) (10.6 to 14.6) | 2.2 (3.9) (−0.608 to 5.1) | 0.10 |
| Teamwork | 20.3 (11.9) (11.7 to 28.8) | 30.9 (13.7) (21.1 to 40.7) | 10.6 (12.3) (2.73 to 1.84) | 0.02* |
A sub-analysis showed that participants performed significantly better when treating adult versus paediatric simulated patients (87.9 (20.1)), p=0.003, particularly in teamwork, p=0.01.
The postintervention survey was sent to all participants; the response rate was 75%. Most participants were certified in advanced cardiac life support and/or paediatric advanced life support, and only four were inexperienced in high-fidelity simulation; 82% and 77% had experienced mock and real codes, respectively, in the previous year. Yet only 41% were comfortable leading a team during a code, and 50% were more comfortable as team members. There was unanimous agreement among participants on the relevance and realism of the intervention. Fifty-five per cent believed they learnt both medical and resource management skills, while 41% learnt teamwork and communication skills. Ninety-six per cent responded affirmatively to future recurrent in-situ simulations in the ED.
Discussion
This intervention proved effective in increasing personnel performance scores in basic skills and, most importantly, teamwork; airway and circulation scores also improved, though not significantly. Participants reported that the intervention was relevant and realistic, and asked for more in-situ simulation in the ED. In an academic hospital, education is an integral component of training in the delivery of care. This project was designed to translate participant knowledge into practice, and in-situ simulation provided the most opportune means of achieving this goal. Future scheduled scenarios and deliberate and just-in-time practice10 will help mitigate knowledge and skills decay. Additionally, the discrepancy in care between adult and paediatric simulated cases, and participants’ limited knowledge of established ED protocols, highlight areas that urgently require targeted interventions.
Balancing high-quality learning with patient care in the ED setting presents a unique set of challenges. Implementing innovative teaching and learning methodologies should be valued, yet piloting and assessing the need for such innovation is often met with scepticism. Proposing change and innovation in the ED raises the ethical dilemma of diverting much-needed human and space resources to simulation-based education rather than to treating actual patients. Nonetheless, participant feedback and evaluations suggest a willingness among the teams to improve by incorporating simulation into the chaos of the ED and into their schedules.
The authors acknowledge the inherent bias of a pre-post design. However, simulation I served as the control event, and participants served as their own controls when compared with simulation II; the rationale was to limit confounding and bias.
Conclusion
The project yielded significant improvement between simulations I and II in clinical management, teamwork and resource management skills among ED personnel; participants were willing and eager to partake in similar in-situ simulation exercises. A recurrent programme would help in reviewing and reaffirming ED protocols and guidelines, as well as in improving participants’ teamwork and communication skills.
Acknowledgments
We would like to thank the administration of the emergency department for their constant support, especially Rima Jabbour and Samer Al-Halabi. We also wish to thank Aurelie Mailhac, MS from the AUBMC Clinical Research Institute who provided valuable statistical input. We also want to thank Maha Moteirek for acting as confederate during some scenarios.
Footnotes
Contributors: RS-C and NB contributed to the conception and design of the work, the acquisition of data and critical revision of the submitted report. ZL contributed to the design of the work, the acquisition, analysis and interpretation of the data, as well as drafting, revising and submitting the report. HT revised the analysis and interpretation of the data, as well as the submitted report. RF contributed to the design of the work and the revision of this report. All authors agree to the final version of the submitted report and agree to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.
Funding: The study was supported by institutional funding from AUB. The funding body played no role in the design, implementation, conduct or analysis of the study, or in the preparation of the manuscript.
Competing interests: None declared.
Ethics approval: The study was reviewed by the Institutional Review Board of the American University of Beirut and approved as a quality improvement educational intervention.
Provenance and peer review: Not commissioned; internally peer reviewed.
References
1. Gardner R. Introduction to debriefing. Semin Perinatol 2013;37:166–74. doi:10.1053/j.semperi.2013.02.008
2. Morrison JB, Rudolph JW. Learning from accident and error: avoiding the hazards of workload, stress, and routine interruptions in the emergency department. Acad Emerg Med 2011;18:1246–54. doi:10.1111/j.1553-2712.2011.01231.x
3. Batley N, Tayegh T, Lakissian Z, et al. Needs assessment for in situ simulation in the emergency department. Arch Emerg Med Crit Care 2016;1:1–4.
4. Roussin CJ, Weinstock P. SimZones: an organizational innovation for simulation programs and centers. Acad Med 2017;92:1114–20. doi:10.1097/ACM.0000000000001746
5. Argyris C. Teaching smart people how to learn. Reflections: The SoL Journal 2002;4:4–15. doi:10.1162/152417302762251291
6. Glassman PA, Kravitz RL, Petersen LP, et al. Differences in clinical decision making between internists and cardiologists. Arch Intern Med 1997;157:506–12. doi:10.1001/archinte.1997.00440260044008
7. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc 2007;2:115–25. doi:10.1097/SIH.0b013e3180315539
8. Rudolph JW, Simon R, Rivard P, et al. Debriefing with good judgment: combining rigorous feedback with genuine inquiry. Anesthesiol Clin 2007;25:361–76. doi:10.1016/j.anclin.2007.03.007
9. Reid J, Stone K, Brown J, et al. The Simulation Team Assessment Tool (STAT): development, reliability and validation. Resuscitation 2012;83:879–86. doi:10.1016/j.resuscitation.2011.12.012
10. McGaghie WC, Issenberg SB, Cohen ER, et al. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med 2011;86:706–11. doi:10.1097/ACM.0b013e318217e119
Supplementary Materials
bmjstel-2019-000473supp001.pdf (146.6KB, pdf)
