Abstract
Background
Infectious disease crises require incident-specific training and present unique circumstances for teaching and learning to protect patients and healthcare workers. This toolkit was developed by a team of nurse educators and infection prevention researchers to a) offer training to nurses and nursing students, who were frontline healthcare providers during the COVID-19 pandemic, on the use of personal protective equipment in the context of the hierarchy of controls and b) evaluate the just-in-time training using simulation methods.
Methods
Interactive content for online delivery was developed, including briefing, simulation content, and debriefing. Participants were recruited via professional, practice, and academic networks. Instructors and learners were asked to complete an evaluation on satisfaction with learning, self-confidence, and design. Data were collected between May 2020 and September 2023; descriptive quantitative and qualitative data were analyzed.
Results
Demographic data from 1,239 participants across 18 countries show that the majority were female (89%), 20–30 years old (81%), nurses (97%), and practicing in academic (48%) settings. Evaluation data from instructors (N = 39) and learners (N = 1,159) were positive overall (M = 4.1 and M = 4.5, respectively).
Conclusion
Simulation for just-in-time training with online video observation during a pandemic offers nurse educators in academia and practice settings a rapid, feasible, and safe method of learning crisis-specific infection prevention and occupational safety practice guidance.
Keywords: Infection prevention, nursing, occupational health, remote learning, simulation, healthcare education, emerging disease
Introduction
The public health crisis of COVID-19 created challenges and opportunities for academic and healthcare settings. Developing opportunities for learning for students and healthcare providers when they were needed at the bedside was one of the many challenges for systems. Moving beyond the traditional perspective of simulation as an educational and training strategy to focus on patient and team member safety could potentially improve outcomes. Often, academic and clinical educators are charged with developing content to meet the needs of their learners. Simulation training can be utilized to meet the immediate and ongoing need for knowledge and skill reinforcement.
This project was created as a part of a larger research trial, Simulation to Improve Infection Prevention and Patient Safety (SIPPS) Trial (AHRQ R18HS26418). The parent study is a 5-year group-randomized, group-interventional simulation trial taking place in two hospitals in two states, New York and New Jersey. The research team included infection prevention and epidemiological researchers and simulation educators in academia and practice. During the initial weeks of the SARS-CoV-2 outbreak in the United States, both sites were at the epicenter. The rapid global spread of the infection directly amplified the significance of the SIPPS trial, which focused on standard precaution adherence, personal protective equipment (PPE), occupational exposures, and outcomes of healthcare-associated infections (HAIs).
The impetus for this virtual simulation training toolkit was to address the infection prevention needs of nurses and nursing students. This toolkit was developed during the early months of the COVID-19 pandemic when nurses and nursing students, as frontline healthcare personnel, were identified as high-risk groups whose exposure to the novel SARS-CoV-2 virus was imminent. The purpose of this first-of-its-kind toolkit was to provide healthcare administrators and nurse educators in academic settings with the educational tools needed to provide just-in-time training for nurses and nursing students to augment their ability to correctly and safely don and doff PPE in the context of the hierarchy of controls and standard precautions. The National Institute for Occupational Safety and Health (NIOSH, 2024) describes the hierarchy of controls as five safeguards against infectious disease hazards, arranged from most to least effective: elimination of a hazard is the most effective, followed by substitution, engineering controls, and administrative controls; the least effective is the use of PPE.
Just-in-time training, as described by Aggarwal (2017), addresses a knowledge or skill deficit related to a specific high-risk task; it is based on best practices and requires feedback. The toolkit helped address a need in our health system to quickly train healthcare workers on COVID-19 PPE standards through a remote learning environment, given the rapidly evolving guidance and ongoing need for training. In direct response to the COVID-19 pandemic, a toolkit entitled “Personal Protective Equipment (PPE) for COVID-19: A Virtual Simulation Training” was developed, rapidly disseminated, and formally evaluated. The broad objective of the simulation just-in-time training toolkit was to review the knowledge and skills needed to protect oneself and others from COVID-19 using PPE.
Theoretical Framework
The Consolidated Framework for Implementation Research (CFIR) is the organizing framework and implementation model for the parent study. The CFIR identifies contextual factors that influence the implementation of an innovation and related outcomes (Damschroder et al., 2009, 2022; Rojas Smith et al., 2014). The CFIR framework helps describe and determine the effectiveness of the educational intervention at a time when online just-in-time training was uncommon and a public health crisis was unfolding. Six CFIR domains frame the approach to this study, where the just-in-time training toolkit is the intervention. The Outer Setting refers to factors at the macro level that influence the intervention, such as policies and procedures. This project recognized the hierarchy of controls and allowed for personalization to the institution and responsiveness to policy updates during the pandemic.
Intervention Characteristics are factors that influence the success of the innovation and the toolkit implementation. The toolkit characteristics include that it is freely accessible and designed to solicit end-user feedback, and that content, delivery, and flow could be modified as needed. Individual/Team Characteristics refer to those who will teach, implement, or receive the innovation. These characteristics are identified through the collection of demographic and practice information. The Inner Setting domain is addressed by providing toolkit materials to clinicians and educators. Process of Implementation refers to the strategies used to implement the innovation. This includes online facilitation methods, the use of the Simulation Design Scale, and evaluation questions (Adamson et al., 2013). Finally, Measures of Implementation are gathered post-intervention through the Student Satisfaction and Self-Confidence in Learning scale and open-ended questions about practice change and recommendations for simulation improvement (Adamson et al., 2013).
Bandura’s Social Learning Theory supports the hypothesis that critical learning occurs while participating in a simulation in the active observer role (Bethards, 2014; Crain, 2016; Damschroder et al., 2009). The theory emphasizes active observation, modeling, and imitating others as the basis for learning (Crain, 2016). Bethards (2014) reported that “the observational learning construct of social learning theory can be used as the foundation for designing learning experiences for students participating in the observer role” (p. e66). The findings of an experimental study by Johnson (2019) indicate that observers’ knowledge increases similarly to that of active participants. Learners participating in the just-in-time training would observe the PPE video demonstration.
Materials and Methods
The toolkit was designed to be delivered synchronously online by educators with small groups of learners over a sixty-minute session. Learner engagement on video conferencing software was encouraged through audio, video, and chat features. The toolkit’s content included educator directions, resources to prepare for teaching, and a PowerPoint presentation organized by key simulation components of briefing, video demonstration, and debriefing. Preparation slides addressed facts about COVID-19, differentiation of PPE supplies, and the multiple hierarchy of control strategies necessary for optimal infection protection, and could be completed asynchronously. The briefing focused on refreshing skills for isolation precautions and sequencing of PPE donning and doffing based on the Centers for Disease Control handout (Centers for Disease Control and Prevention, n.d.). The simulation activity was a video demonstration of donning/doffing created by the University of Nebraska Medical Center, based on CDC guidelines, for the care of patients with confirmed or suspected COVID-19. To encourage engagement with observation, prompts were given before starting the demonstration, and the video was paused after donning for discussion. At the end, learners were asked to decide which doffing method they preferred. The debrief structure was based on Phrampus & O’Donnell’s (2013) gather, analyze, and summarize method. Debrief questions were provided in the toolkit and included adjusting PPE during conventional, contingency, and crisis status. Each part of the toolkit was designed so that instructors could modify it for inpatient, outpatient, and academic settings, with options to adapt to local institutions’ policies and procedures. At the end of the toolkit, there was an option to complete an anonymous electronic evaluation.
The evaluation began with five demographic questions completed by every participant and then divided into two pathways, one for learners and one for instructors. The instructor pathway asked nine questions: six items designed to rate usability, feasibility, and acceptance of the toolkit and three open-ended questions about resulting practice changes and suggestions for improvement. The learner pathway asked two demographic items, 14 items from the Simulation Design Scale and eight items from the Student Satisfaction and Self-Confidence in Learning Scale (each scale employing a 5-point Likert scale from 1 = “Strongly Disagree” to 5 = “Strongly Agree”), and two open-ended questions soliciting the most enjoyable features of the experience and suggestions for improvement (National League for Nursing [NLN], 2005a, 2005b). The Simulation Design Scale was validated by the NLN (2021), with Cronbach’s alphas of 0.92 for the presence of features and 0.96 for the importance of features. Similarly, Cronbach’s alphas for the Student Satisfaction and Self-Confidence in Learning Scale were found by the NLN (2021) to be 0.94 for the satisfaction items and 0.87 for the self-confidence items. Following IRB approval in May 2020, recruitment and dissemination of the toolkit via email and listservs began through local and global professional organizations in nursing, healthcare delivery (such as the New York Academy of Medicine and the New Jersey Hospital Association), healthcare simulation, and infection prevention. A waiver of written documentation of consent was approved by the IRB, and participants were provided an information sheet about the research as a component of the recruitment email. The research team worked to disseminate the toolkit to as many educators as possible in academic and clinical settings to augment training on safe clinical care during a novel pandemic.
The toolkit materials were shared through a dedicated webpage for participants and educators to access freely.
Results
Demographics
Results include data collected between May 2020 and September 2023. Following dissemination, the toolkit was accessed by 2,919 users in 73 countries across six continents (Antarctica not represented); most users accessed the toolkit from within the US (85.7%). Participants who accessed the toolkit evaluation survey (n=1,369) represented 18 countries, with most respondents from the US (n=1,197, 96%). Table 1 presents demographic data from these respondents. Within the US, respondents were primarily from New Jersey (n=787, 65%), New York (n=183, 16%), and Indiana (n=135, 12%). Participants were primarily female (88%) and in the age range of 20–30 years old (76%). Most respondents were nurses or nursing students (96%) working in academic (48%) or hospital (42%) settings.
Table 1.
Survey Respondent Demographics
| Characteristic | Learner (N = 1159) n (%) | Instructor (N = 74) n (%) | Total (N = 1239*) n (%) |
|---|---|---|---|
| Gender | |||
| Female | 1027 (88.6) | 63 (85.1) | 1095 (88.4) |
| Male | 128 (11.0) | 9 (12.2) | 138 (11.1) |
| Missing | 4 (0.4) | 2 (2.7) | 6 (0.5) |
| Age | |||
| <20 years old | 35 (3.0) | 0 (0) | 35 (2.8) |
| 20–30 years old | 934 (80.6) | 6 (8.1) | 945 (76.3) |
| 31–40 years old | 103 (8.9) | 11 (14.9) | 115 (9.3) |
| 41–50 years old | 43 (3.7) | 18 (24.3) | 61 (4.9) |
| >50 years old | 2 (0.2) | 10 (13.5) | 12 (1.0) |
| Other** | 39 (3.4) | 26 (35.1) | 65 (5.3) |
| Missing | 3 (0.3) | 3 (4.1) | 6 (0.5) |
| Country | |||
| USA | 1135 (97.9) | 56 (75.7) | 1197 (96.6) |
| Outside of USA*** | 10 (0.9) | 11 (14.9) | 21 (1.7) |
| Missing | 14 (1.2) | 7 (9.5) | 21 (1.7) |
| Profession | |||
| Medical | 12 (1.0) | 2 (2.7) | 14 (1.2) |
| Nursing | 1120 (96.6) | 59 (79.7) | 1185 (95.6) |
| Other | 21 (1.8) | 10 (13.5) | 31 (2.5) |
| Missing | 6 (0.5) | 3 (4.1) | 9 (0.7) |
| Practice Setting | |||
| Academic | 559 (48.2) | 38 (51.4) | 599 (48.4) |
| Hospital | 476 (41.2) | 24 (32.4) | 506 (40.8) |
| LTC/Nursing home | 22 (1.9) | 3 (4.1) | 25 (2.0) |
| Other | 84 (7.3) | 4 (5.4) | 88 (7.0) |
| Outpatient | 4 (0.4) | 3 (4.1) | 7 (0.6) |
| Missing | 12 (1.0) | 2 (2.7) | 14 (1.1) |
Note.
*Includes 6 participants with unspecified role.
**No age specified.
***Includes learners and instructors from: Antigua and Barbuda, Aruba, Bahamas, Bolivia, Brazil, Canada, Chile, Dominica, Ethiopia, Gambia, Grenada, Indonesia, Jamaica, Kuwait, Peru, UAE, and Zimbabwe (each country <5).
Simulation Design and Student Satisfaction and Self-Confidence in Learning Scales
Among users who accessed the survey, 39 instructors and 1,062 learners completed the evaluation. Internal reliability was established by Cronbach’s alpha testing for the instructor and learner scales (α = 0.97 and 0.98, respectively). Instructors reported they were able to apply the information in their practice/service setting (M = 4.3), easily facilitate the session with the slides provided (M = 4.2), and that the toolkit was easy to use with their existing technology, including browser, operating system, and learning management system (M = 4.1). Instructors also found that the online session provided a sense of presence that allowed learners to actively engage (M = 4.1), that learners met the objectives (M = 4.1), and that the toolkit was easily personalized to their needs (M = 3.9).
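For readers who wish to reproduce this kind of reliability check on their own evaluation data, the following is a minimal Python sketch of Cronbach’s alpha using the standard formula α = k/(k−1) · (1 − Σσ²ᵢ/σ²ₜ). The item responses shown are hypothetical 5-point Likert ratings for illustration only, not the study’s data.

```python
# Minimal sketch of Cronbach's alpha for a set of Likert-scale items.
# The responses below are hypothetical, not the study's survey data.

def cronbach_alpha(items):
    """items: list of per-item response lists, all of equal length (one score per respondent)."""
    k = len(items)        # number of items on the scale
    n = len(items[0])     # number of respondents

    def var(xs):
        # Sample variance (denominator n - 1).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(var(it) for it in items)
    # Total score for each respondent across all items.
    totals = [sum(it[r] for it in items) for r in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Hypothetical 5-point Likert responses: 3 items rated by 4 respondents.
ratings = [
    [5, 4, 5, 3],
    [5, 4, 4, 3],
    [4, 4, 5, 2],
]
print(round(cronbach_alpha(ratings), 2))  # → 0.91
```

With real survey exports, the same computation is commonly done with a statistics package; the hand-rolled version above simply makes the formula explicit.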
Most learners were in a baccalaureate program (75%) and in the first year of their program (32%). Learners rated the simulation highly across each category (from 1 = “Strongly Disagree” to 5 = “Strongly Agree”). Clearly communicated objectives and information was the highest-rated category (M = 4.6), followed by feeling supported by the instructor and technology, being able and encouraged to problem-solve during the simulation, the simulation’s fidelity and realism, self-confidence, and satisfaction with learning (each M = 4.5). The lowest-rated category was the feedback and opportunities for guided reflection provided by the simulation (M = 4.4). Overall, learners rated the simulation-based training more highly than instructors (M = 4.5 vs. M = 4.1).
Open-ended Feedback
Instructors
Instructors were asked to provide one example of how their practice would change as a result of using this toolkit. Instructors often reported that they would make a change in their teaching practices or in the way they taught PPE donning and doffing. An example quote follows: “I am appreciative of the overarching hierarchy which provides a frame of reference as we make a myriad of decisions related to reentry.” Other instructors found it helpful to “use multi-media to continue the teaching-learning process during pandemic disease like COVID-19.” Several also reported plans to incorporate the toolkit into their teaching or provide it as a resource to their students. For example, one instructor shared, “I can implement this teaching in combination with our introduction to PPE and correct handwashing during Fundamental Skills Laboratory.”
In response to the prompt: “What will you do differently in your practice/service setting as a result of this toolkit?” instructors reported they would change their practice and refer to CDC guidelines in contingency situations, “especially in crisis scenarios when a shortage of supplies exists.” Suggestions for additional training content were solicited, and responses included information on environmental cleaning practices, infection control practices, and airway management during COVID-19. Finally, tips for improving the simulation included incorporating case studies, regional information, and increasing COVID-19 pathophysiology detail.
Learners
Learner responses to the survey’s prompt, “Any comments or suggestions on how we can improve this session?” revealed opportunities for improvement. Learners suggested increasing interaction, providing more details, and including application scenarios. They reported a need to increase interactivity through different modalities, such as more videos, instructor demonstrations, or the opportunity to practice donning and doffing skills. Select excerpts follow: “Watching the instructor do it herself would have been helpful rather than a video, as the video went a bit fast” and “More scenarios of what can go wrong with PPE and how to handle those situations.”
The survey also asked learners, “What did you enjoy most about this session?” Participants most frequently reported enjoying the information, the pictures and video used during the simulation, and how the content was organized and presented. Learners highlighted the debrief and interactive elements as parts of the simulation they enjoyed. They reported that having the debrief or discussion allowed them to have questions answered and apply what they learned. Finally, they liked the instructors who led the simulation and how they provided different examples to reinforce understanding. Select excerpts follow: “The donning and doffing video gave very clear instructions, and I feel like I understand what to do even though I didn’t learn it in person” and “I liked how easy this simulation was to understand. It was broken down nicely but also gave plenty of useful information.”
Discussion
The evaluation data collected as a part of the toolkit distribution allowed for gathering critical information on the intervention design, implementation, and procedures. It also revealed the critical changes occurring within the healthcare setting, such as implementing universal pandemic precautions, which allowed us to account for novel factors and take steps to ensure robust, unbiased results of our parent study.
There were positive outcomes and lessons learned for nurse educators and learners in both hospital and academic settings. The overall feedback from educators was that the just-in-time simulation using active observation effectively teaches refresher skills on PPE during an infectious disease outbreak. Mosher et al. (2022) similarly found that confidence significantly improved among observers of a COVID-19 PPE donning and doffing simulation-based experience and that participants believed it to be an effective educational tool. Because the toolkit was provided free of cost to educators, it saved valuable resources such as time and funds while also limiting the need for travel and face-to-face exposure. Similarly, other researchers have found observational simulation techniques to be both time- and cost-effective for mobilizing training rapidly (Reece et al., 2021; Watts et al., 2021). Qualitative feedback suggests that the toolkit can be combined with other teaching modalities to enhance and individualize learning experiences, such as adding hands-on practice (e.g., handwashing) or expanding content on infectious disease pathophysiology.
Educators also reported feeling that the facilitation of the PPE donning and doffing video influenced and added value to the learning process. These findings are consistent with a systematic review that found learning and satisfaction in observer roles are related to learner engagement and contribution during the debrief (O’Regan et al., 2016). Feedback suggested that facilitation methods and questioning techniques of video watching varied throughout the study.
Self-report data from the learners demonstrate that the toolkit allowed them to reinforce the proper sequencing of PPE and gain additional details about the types of PPE appropriate for the care of COVID-19 patients. Learners also reported that the simulation content could fill knowledge gaps related to the hierarchy of controls, surge capacity, and rapidly changing policies and procedures related to the unique and overwhelming clinical situation of a pandemic. Another important finding was the value of debriefing. The use of the active observer role in simulation coupled with debriefing is supported in the literature, helping to strengthen learning outcomes and yielding educational benefits similar to direct participation in simulations (Bullard et al., 2019; Fung et al., 2021).
Future Research
This study highlighted aspects of infection prevention education and simulation teaching during a public health crisis that need further research. As we could not measure clinical outcomes after using this just-in-time simulation, more research around skill performance and adherence to PPE practices within the hierarchy of controls provided in this toolkit would further cement its utility and efficacy.
Limitations
This just-in-time training and research were initiated by nurse educators and infection preventionists working together in an unprecedented public health crisis. During the study, there were rapidly evolving external factors, including epidemiologic knowledge development, healthcare delivery practice related to PPE surge capacity, social isolation guidelines preventing face-to-face education, and uncertainty against the backdrop of high morbidity and mortality due to COVID-19. These factors may have increased access and utilization, and respondents may have under- or overrated responses based on these stressors. The open-ended responses provide clarity and insight into the quantitative findings. We presented only a descriptive analysis of our evaluation results and not a more in-depth analysis of differences by factors such as learner or instructor practice setting; this could overlook differences between these groups that may exist.
Conclusions
The toolkit reached a global audience during the initial months of the novel COVID-19 pandemic, and uptake continued throughout 2023. Learners and instructors reported that this just-in-time simulation was effective and well received. The toolkit filled knowledge gaps that arose during a time of crisis and uncertainty through observation of a skills-based video, information on the hierarchy of controls, and time to debrief and discuss the content in the context of local settings and conditions, such as shelter-in-place restrictions.
Highlights
Refreshers and just-in-time training on PPE donning and doffing within the context of the hierarchy of controls were critical during the COVID-19 pandemic.
Remote learning in the active observer role is a cost-effective and highly effective method for learning the sequencing of existing skill sets and building confidence during uncertain times of a public health crisis.
Customizable components of the toolkit were available for educators in diverse settings.
Key Points
Collaboration between simulation educators and infection prevention clinical researchers was crucial for rapidly developing and disseminating evolving information related to pandemic crisis management.
Online simulation-based experience quickly reached a global audience in multiple healthcare settings.
Acknowledgments
We gratefully acknowledge the educators and frontline providers who faced unprecedented professional and personal challenges amid uncertainty in the state of the science of a novel pandemic. We are grateful the toolkit was utilized by so many across the globe to protect our front-line workers and the patients they care for. We acknowledge the support from Hackensack Meridian Health and Columbia University, School of Nursing, to develop and test the toolkit and the many diverse professional organizations that facilitated the rapid dissemination of the toolkit to their representative stakeholders.
Funding:
This project was supported by grant number R18HS026418 from the Agency for Healthcare Research and Quality (AHRQ). The content is solely the responsibility of the authors and does not necessarily represent the official views of the AHRQ.
References
- Adamson KA, Kardong-Edgren S, & Willhaus J (2013). An updated review of published simulation evaluation instruments. Clinical Simulation in Nursing, 9(9), e393–e400.
- Aggarwal R (2017). Just-in-time simulation-based training. BMJ Qual Saf, 26(11), 866–868. doi: 10.1136/bmjqs-2017-007122
- Bethards ML (2014). Applying social learning theory to the observer role in simulation. Clinical Simulation in Nursing, 10(2), e65–e69. doi: 10.1016/j.ecns.2013.08.002
- Bullard MJ, Weekes AJ, Cordle RJ, Fox SM, Wares CM, Heffner AC, Howley LD, & Navedo D (2019). A mixed-methods comparison of participant and observer learner roles in simulation education. AEM Education and Training, 3(1), 20–32.
- Centers for Disease Control and Prevention. (n.d.). Sequence for putting on personal protective equipment (PPE). Retrieved from https://www.cdc.gov/infection-control/media/pdfs/Toolkits-PPE-Sequence-P.pdf
- Crain W (2016). Bandura’s Social Learning Theory. Abingdon: Routledge.
- Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, & Lowery JC (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4(1), 50.
- Damschroder LJ, Reardon CM, Widerquist MAO, et al. (2022). The updated Consolidated Framework for Implementation Research based on user feedback. Implementation Science, 17, 75. doi: 10.1186/s13012-022-01245-0
- Fung JTC, Zhang W, Yeung MN, Pang MTH, Lam VSF, Chan BKY, & Wong JYH (2021). Evaluation of students’ perceived clinical competence and learning needs following an online virtual simulation education programme with debriefing during the COVID-19 pandemic. Nursing Open, 8(6), 3045–3054.
- Johnson BK (2019). Simulation observers learn the same as participants: The evidence. Clinical Simulation in Nursing, 33, 26–34.
- Mosher C, Mukhtar F, Alnaami N, Akkielah YA, Alsharif J, Khan, Taskiran HC, & Zafar M (2022). Donning and doffing of personal protective equipment: Perceived effectiveness of virtual simulation training to decrease COVID-19 transmission and contraction. Cureus, 14(3).
- National Institute for Occupational Safety and Health. (2024, April 10). About Hierarchy of Controls. Retrieved from
- National League for Nursing. (2021). Tools and instruments. Retrieved from https://www.nln.org/education/teaching-resources/tools-and-instruments
- National League for Nursing. (2005a). Student Satisfaction and Self-Confidence in Learning©. Retrieved from https://www.nln.org/education/teaching-resources/tools-and-instruments
- National League for Nursing. (2005b). Simulation Design Scale© (Student Version). Retrieved from https://www.nln.org/education/teaching-resources/tools-and-instruments
- O’Regan S, Molloy E, Watterson L, & Nestel D (2016). Observer roles that optimise learning in healthcare simulation education: A systematic review. Advances in Simulation, 1, 1–10. doi: 10.1186/s41077-015-0004-8
- Phrampus PE, & O’Donnell JM (2013). Debriefing using a structured and supported approach. In The Comprehensive Textbook of Healthcare Simulation (pp. 73–84).
- Reece S, Johnson M, Simard K, Mundell A, Terpstra N, Cronin T, … & Grant V (2021). Use of virtually facilitated simulation to improve COVID-19 preparedness in rural and remote Canada. Clinical Simulation in Nursing, 57, 3–13.
- Rojas Smith L, Ashok M, Morss Dy S, et al. (2014). Contextual Frameworks for Research on the Implementation of Complex System Interventions. Rockville (MD): Agency for Healthcare Research and Quality (US). Retrieved March 1, 2024, from https://www.ncbi.nlm.nih.gov/sites/books/NBK196191/
- Watts PI, Rossler K, Bowler F, Miller C, Charnetski M, Decker S, … Hallmark B (2021). Onward and upward: Introducing the Healthcare Simulation Standards of Best Practice. Clinical Simulation in Nursing, 58, 1–4. doi: 10.1016/j.ecns.2021.08.006
