Clinical Simulation in Nursing. 2021 Jun 17;57:41–47. doi: 10.1016/j.ecns.2021.04.017

Embracing Disruption: Measuring Effectiveness of Virtual Simulations in Advanced Practice Nurse Curriculum

Michele L. Kuszajewski, Jacqueline Vaughn, Margaret T. Bowers, Benjamin Smallheer, Rémi M. Hueckel, Margory A. Molloy
PMCID: PMC9329721  PMID: 35915814

Abstract

Changes in academia have occurred quickly in response to the COVID-19 pandemic. In-person simulation-based education has been adapted into a virtual format to meet course learning objectives. This article describes the methods and procedures used to onboard faculty, staff, and graduate nurse practitioner students to virtual simulation-based education while adhering to simulation best practice standards, and presents evaluation data obtained with the Simulation Effectiveness Tool-Modified (SET-M).

Keywords: distance-based simulation, graduate nursing student education, virtual simulation, virtual standardized patients, evaluation


Key Points.

  • Kolb's Experiential Learning Theory (ELT) encourages learner application of previous experience and provides an ideal structure for developing simulation experiences to support nurse practitioner clinical education.

  • The INACSL Standards of Best Practice: SimulationSM (INACSL Standards Committee, 2016) were used to guide the development and implementation of the virtual simulations.

  • Team-based virtual simulations allowed collaboration, real-time feedback from an experienced clinician, and an opportunity to create a shared mental model with other participants throughout the scenario.

Introduction

The COVID-19 pandemic has challenged academic simulationists with rapidly transitioning most nursing program instruction from traditional face-to-face presentations to solely virtual formats. For the past five years, virtual/telehealth simulations have been developed and conducted as part of the existing distance-based curriculum in nurse practitioner (NP) courses (Family, Pediatric Acute Care, and Adult-Gerontology Acute Care) at a private university in the southeastern United States. Due to the pandemic, in-person simulation-based education has also needed to be adapted into a new format to ensure that course learning objectives are met. This exemplar includes a description of the methods and procedures leveraged to (a) design virtual simulations, and (b) onboard faculty, staff, and students to virtual simulation-based education while ensuring simulation best practice standards. Challenges identified and lessons learned regarding technology, resources, and logistics are shared. Because all simulations were completed in a virtual setting, evaluation data were collected using the Simulation Effectiveness Tool-Modified (SET-M).

The INACSL Standards of Best Practice: SimulationSM (INACSL Standards Committee, 2016; Table 1) were used to guide the development and implementation of the virtual simulations. The shift to Zoom© technology as an online teaching platform provided a safe, interactive learning environment that maintained fidelity while allowing participants to be seen and heard. The virtual simulation scenarios (a) provided learners with an opportunity to experience socialization with the types of patients they may encounter in the future, (b) allowed for peer-to-peer interaction, and (c) were facilitated by nurse faculty with specific expertise in the content areas. For example, simulations included psychiatric mental health, cardiac, and emergency scenarios. Compared to case-based or avatar-based simulations that students work through at their own pace, this interactive team-based approach allowed collaboration, real-time feedback from an experienced clinician, and an opportunity to create a shared mental model with other participants throughout the scenario. Learning was reinforced through deliberate practice as students assumed different roles in subsequent iterations of the scenario.

Table 1.

Virtual Simulation Best Practices (Adapted From INACSL Standards of Best Practice: Simulation℠)

INACSL Standard Virtual Simulation Best Practices
Design 1. Needs assessment performed
2. Modality identified
3. Fidelity maintained through virtual offering
4. Prebriefing included
5. Debrief session included
6. Pilot test completed
Outcomes and Objectives 1. Measurable objectives identified
2. Expected outcomes identified
Facilitation 1. Sessions led by experienced facilitators
2. Case matched with level of the learner
3. Cues built into case to assist the learner
Debriefing 1. Virtual debrief based on best practices
2. Debrief led by facilitators who participated in the simulation-based experience (SBE)
3. Debrief congruent with objectives and outcomes
Participant Evaluation 1. Evaluation plan identified ahead of time
2. SBE selected for formative evaluation
Professional Integrity 1. Facilitators role-modeled professional integrity throughout the SBE
2. Safe learning environment maintained
Operations 1. Transitioned from in-person SBE to virtual simulation
2. Integrated existing simulation technology with interactive video conferencing platform
3. Technology supported by operations specialist

Theoretical framework

Nurse educators continually seek ways to promote learners’ active engagement in the learning process. Kolb's (1984) experiential learning theory (ELT) is based on the concept that past experience is the foundation for new knowledge, and it provides a framework for educators to promote critical thinking. Kolb defined ELT as “the process whereby knowledge is created through the transformation of experience. Knowledge results from the combination of grasping and transforming an experience” (Kolb, 1984, p. 41). Simulation, whether hands-on or virtual, provides learners the opportunity to participate in concrete (real) experiences and enables them to experience what they are learning (Silberman, 2007).

Experiential learning theory guided our NP students’ virtual simulations. Kolb's ELT includes four major stages: concrete experiences, reflective observation, abstract conceptualization, and active experimentation. Concrete experiences are built from reality in which learners are willingly involved in new experiences as active participants. Reflective observation involves critically reflecting on concrete experiences. Abstract conceptualization occurs when a learner uses analytical skills to combine ideas and concepts to frame an experience. In the final stage of active experimentation, the simulation experience provides learners with a hands-on opportunity to apply the concepts and skills learned and reflect on actual experiences (Murray, 2018). Kolb surmised that the four stages are cyclical in nature and that all must be present for learning to occur.

ELT is particularly suited as a framework to guide the use of simulation. It teaches NP students to (a) critically reflect on their prior experience as nurses, (b) actively experiment, and (c) apply new knowledge as nurse practitioners. ELT is learner-centered: participants control the direction of the simulation scenario, which is determined by their actions and the consequences of those actions. Learners can build on experiences in a scenario, moving from a passive to a more active role (Alinier, 2011). Reflective observation and abstract conceptualization occur during the simulation (as learners describe and support their actions) as well as during the simulation debrief. The stages of ELT encourage learner application of previous experiences and provide an ideal structure for developing simulation experiences to support NP clinical education.

Material and Methods

In accordance with the INACSL Standards of Best Practice: Simulation℠, a systematic approach was taken to design and implement the virtual simulations.

Simulation outcomes/objectives

The NP faculty were challenged with transitioning instruction to a completely virtual format. By using Kolb's Framework (1984) as a foundation, and by integrating past clinical experiences to determine simulation outcomes, they were able to match simulation activities with course and curricular objectives and to meet NP core competencies for education (National Organization of Nurse Practitioner Faculties, 2017).

Simulation design

Certified simulation health care educators (CHSEs) led the collaborative effort with course faculty to convert graduate nursing in-person simulations into a virtual format using simulation best practices. As part of standard practice, all simulation activities conducted at the school of nursing are formatted using an internally approved simulation scenario template. The template facilitated the transition of existing simulations into a structured and standard virtual format. Learning objectives were reviewed to determine whether they could be met virtually. We adapted our previous in-person scenarios, which ranged from one to four scenarios per course. Approximately 60 students participated in NP roles in these virtual simulation activities. Objectives specific to kinesthetic learning outcomes were modified because hands-on assessment and skills would be unavailable in the virtual setting. After learning objectives had been finalized, consideration was given to equipment, simulation modalities for delivery, and logistics.

Simulation operations

Course faculty and CHSEs met with the program's Certified Healthcare Simulation Operations Specialist (CHSOS) to identify appropriate technology and platforms for simulation delivery. This step helped to ensure that outlined learning objectives could be achieved with the university's available resources. They discussed simulation software to display vital signs, the virtual delivery system, and digital forms of communication between technicians and faculty as well as between faculty and learners.

Laerdal Medical LLEAP was selected as the software for the display of vital signs and diagnostic testing results. This software is often used in the simulation center and was readily available. Additionally, many of the simulationists and simulation technicians were accustomed to using Laerdal LLEAP; their familiarity with the software made adapting it and programming the scenarios easier.

Zoom© was chosen as the video communication system because it was the university's approved platform and was familiar to the students. This platform provided simultaneous video, audio, and text capabilities for the virtual simulation. As this virtual platform was approved and already in use, it was available to faculty and students at no additional cost, and technical support was readily available from the school's Office of Information Technology. To display the Laerdal LLEAP software in these simulation activities, a second laptop was utilized to run LLEAP. Using this laptop, the simulation technicians logged into Zoom© using the LLEAP monitor as a participant so the learners could see the monitor in Zoom©. To ensure the best internet connection, the simulation technicians were on-site using the university's secure network.

Internal communication was achieved by use of an instant messaging program (Cisco Jabber™). This was an efficient way for faculty to communicate with simulation technicians or with one another during case progression. Faculty and simulation technicians could plan when to display diagnostic test results (e.g., laboratory results, physical assessment findings, radiology images) to the student within the simulation. This platform was also chosen because it was a university-approved method of communication.

Students were encouraged to communicate via the chat box within the Zoom© platform. During the simulation, student discussions often addressed patient presentation, interview questions, critical thinking and diagnostic reasoning, and plans of care. Utilizing the chat box minimized incidences of students interrupting one another and allowed faculty to evaluate the students’ critical thinking in real time.

The virtual simulations contained multiple, interwoven components that required a high level of coordination and time. This included preprogramming of the simulation scenarios in the LLEAP software by the simulation technicians, as well as internal testing of the technology and developed processes with the simulation technicians and the simulation center's CHSEs. One unique addition to the implementation of these complex simulations was an internal simulation operations training session led by the school's CHSOS approximately one week prior to the session. During this one-hour training session, the faculty, facilitators, and simulation technicians reviewed details of simulation delivery, anticipated challenges, and developed real-time solutions that could be deployed to maintain the integrity of learning. During this “dry run,” it was identified that a strong internet signal was essential to run the various software simultaneously. Faculty and facilitators were advised to come on-site for the simulations if they did not have a strong, reliable home internet connection.

Simulation facilitation and debriefing

Specific training and faculty resources (Microsoft PowerPoint Template and practice sessions) were developed to ensure adherence to best practice for virtual simulations.

A PowerPoint template was designed that included (a) checklists for a structured prebrief to ensure psychological safety and suspension of disbelief, (b) a guide to successful simulation describing student engagement strategies and logistics necessary for virtual delivery (e.g., the phone number for the university's technology help desk), and (c) debriefing prompts and suggestions for identifying key learning points. Faculty guided students through the debriefing using the plus-delta technique. Students were encouraged to state what went well (plus) and what could have been done differently (delta) during the virtual scenario.

The second component of resources and training was the practice session. A “dry run” practice session was scheduled to familiarize faculty with technology and logistics. During the practice session, the virtual simulation activity was assessed for procedural/design errors and adherence to simulation standards.

Simulation evaluation

Students provided evaluations of their simulation experiences, and faculty evaluated students’ learning.

With Institutional Review Board approval (exemption), NP students (N = 50) participated in the virtual simulation exercise. A total of 7 unique simulations were completed virtually by students across multiple NP majors and specialty certificates (e.g., Cardiology, Psychiatric Mental Health). At the end of the simulation exercise, students (N = 50) were asked to complete an evaluation. The Simulation Effectiveness Tool-Modified (SET-M), a valid and reliable (α > 0.80) 19-item survey, was used to evaluate the nursing students’ perceptions of whether and how well virtual simulations met their learning needs (Leighton, Ravert, Mudra, & Macintosh, 2015). The SET-M was developed using concepts and terminology based on the INACSL Standards of Best Practice: SimulationSM (Sittner et al., 2015) and Quality and Safety Education for Nurses (QSEN) competencies (Cronenwett et al., 2007); it focuses on four concepts: prebriefing, learning, confidence, and debriefing (Leighton et al., 2015).

Student learning was evaluated by faculty based on (a) a clinical documentation assignment and (b) a group discussion designed to allow for reflective observation and abstract conceptualization of simulation and course objectives. In addition, students provided positive feedback anecdotally to faculty regarding the development of their critical thinking skills in the clinical environment.

Results

Student perception (confidence/learning) of the new simulation platform was evaluated using the SET-M. Results were used to inform additional changes to the virtual simulations to enhance student learning and ensure that course objectives were met. A total of 50 NP students participated in the virtual simulation experience and voluntarily completed the SET-M survey. A majority reported that the prebriefing session had increased their confidence and was beneficial to their learning (Table 2). Descriptive statistics were used to analyze the data using Qualtrics® XM.

Table 2.

Nursing Students’ (N = 50) Responses Related to the Prebriefing Session

Question Strongly Agree Somewhat Agree Do Not Agree
1. Prebriefing increased my confidence. 72% 20% 8%
2. Prebriefing was beneficial to my learning. 80% 16% 4%

Twelve of the SET-M questions were designed to elicit nursing students’ perceptions of overall learning and confidence. Results are reported in Table 3.

Table 3.

Nursing Students’ (N = 50) Responses Related to Learning and Confidence in the Virtual Simulation Exercise

Question Strongly Agree Somewhat Agree Do Not Agree
1. I am better prepared to respond to changes in my patient's condition. 86% 14% 0%
2. I developed a better understanding of the pathophysiology. 73% 27% 0%
3. I am more confident of my assessment skills. 76% 24% 0%
4. I felt empowered to make clinical decisions. 73% 27% 0%
5. I developed a better understanding of medications. (Leave blank if no medications in scenario) 73% 23% 5%
6. I had the opportunity to practice my clinical decision-making skills. 90% 10% 0%
7. I am more confident in my ability to prioritize care and interventions. 86% 14% 0%
8. I am more confident in communicating with my patient. 72% 26% 2%
9. I am more confident in my ability to teach patients about their illness and interventions. 69% 27% 4%
10. I am more confident in my ability to report information to health care team. 71% 29% 0%
11. I am more confident in providing interventions that foster patient safety. 78% 22% 0%
12. I am more confident in using evidence-based practice to provide care. 82% 18% 0%

Over 90% of the nursing students strongly agreed that the debrief was a valuable element that contributed to their learning and confidence (Table 4).

Table 4.

Nursing Students’ (N = 50) Responses Related to the Debrief Session

Question Strongly Agree Somewhat Agree Do Not Agree
1. Debriefing contributed to my learning. 90% 10% 0%
2. Debriefing allowed me to communicate my feelings before focusing on the scenario.* 90% 10% 0%
3. Debriefing was valuable in helping me improve my clinical judgment. 92% 8% 0%
4. Debriefing provided opportunities to self-reflect on my performance during simulation. 98% 2% 0%
5. Debriefing was a constructive evaluation of the simulation. 96% 4% 0%

* Authors revised 4/3/20 for use in virtual debriefing.

In addition to the survey results, students were given the opportunity to respond to an open-ended question asking for their feedback about the virtual simulation experience. Of the 50 survey respondents, 29 volunteered narrative responses. Responses were grouped into four major themes: (1) educational impact, (2) additional time for simulation experiences, (3) facilitation of role transition, and (4) feedback and debriefing. The first theme addressed the educational impact of the virtual simulation: several students (n = 14) emphasized that the virtual simulation was educational and had enhanced their learning. The second theme revealed that some students (n = 8) wanted more of this type of learning platform, referring both to the number of offerings throughout the semester and to the amount of time spent in each simulation. The third theme, addressed by students (n = 5), was the virtual simulation experience's ability to facilitate transition to the advanced practice provider role. The final theme, feedback and debriefing, was reflected in students’ (n = 7) comments on the value of feedback and debriefing.

Discussion

The literature is heavily populated with reports of student learning and confidence in the face-to-face environment; the data from this study, however, demonstrate the effectiveness of conducting simulation in the virtual environment. There has been less investigation of the effects of prebriefing, learning, confidence, and debriefing in the virtual space on the learning outcomes of graduate NP students. Utilizing a validated tool (SET-M) with this convenience sample to collect data on virtual simulations was a first step in establishing a database for future studies. Simulation leaders acknowledge that the expansion of simulation use in graduate education is an effective tool to support and prepare “practice-ready professionals” (Bryant, Aebersold, Jeffries, & Kardong-Edgren, 2020).

Table 2 demonstrates that a majority of respondents strongly agreed that prebriefing in the virtual environment increased their confidence and was beneficial to their learning. The use of prebriefing is strongly encouraged in face-to-face simulations because it establishes participants' expectations for the simulation [Standards of Best Practice (SBP) - Design]. As the mode of simulation used by faculty changed to a virtual platform, the need for a well-structured prebriefing continued and was supported by student evaluations.

An effective simulation design enables learners to meet the intended outcomes and objectives of the simulation (SBP - Design). Table 3 demonstrates that students consistently reported (a) improvements in their ability to recognize and understand clinical changes; and (b) an increase in confidence of clinical knowledge, skills, and attitudes. The use of a well-designed and validated tool increased the facilitator's ability to conduct a high-quality evaluation of students’ learning.

The use of debriefing is intended to improve the future performance of the learner (SBP - Debriefing). Table 4 reflects students’ responses to the debriefing process (plus-delta) conducted on the virtual platform. A majority of students consistently stated that the debriefing was constructive, contributed to their learning, and allowed them to reflect on their critical thinking and decision making. The simulation team intentionally reviewed and considered the five criteria necessary to meet the simulation standard of debriefing. Students articulated the opportunity to transfer knowledge, skills, and attitudes from virtual simulations into future patient care experiences.

Students provided positive feedback and comments about virtual simulation, including requests for more simulation opportunities. Anecdotally, students shared that these simulations prepared them for clinical rotations and helped them to demonstrate a translation of knowledge into practice, thus reflecting Kolb's active experimentation stage. There are opportunities to start collecting data in the clinical setting to evaluate sustained retention of knowledge.

The design of the simulations was adapted to the virtual platform while maintaining INACSL standards. A standard slide set for prebrief was developed to set student expectations at the outset and to ensure that each simulation followed best practices by articulating objectives, expected outcomes, and specific nurse practitioner competencies. This standardization may have contributed to the high level of engagement during the simulation and in the debriefing.

Completion of a practice session (“dry run”) prior to the simulations provided an opportunity for facilitators and operators to address technical and communication issues. Addressing these issues ahead of time contributed to a more seamless virtual simulation experience. Faculty reported that the “dry run” prior to the event and the use of the PowerPoint slides helped with the organization and smooth implementation of the virtual simulations.

Using the SET-M as a validated tool helped to capture real-time data for evaluation of the virtual simulations. Data analysis using a standard tool provided information for process improvements as additional simulations were deployed. Professional integrity was introduced during the prebrief using the standardized slide set. Clear expectations, including Zoom© etiquette, were set for professional integrity to ensure psychological safety, confidentiality, and ethics. During the simulation debrief, facilitators addressed clinical decision making, therapeutic interventions, and professional comportment.

The limitations of this study included the sample size and evaluation tools. A small convenience sample of NP students was used to evaluate this change in simulation modality. The study was also conducted over a limited amount of time (one semester); therefore, the findings may not be generalizable to other educational offerings.

With previous in-person simulations, homegrown evaluation tools were used instead of reliable and validated tools. Moving forward, the faculty will consider using the SET-M to evaluate in-person simulations to provide comparison data for quality improvement.

Conclusions

Simulation activities routinely performed during on-campus intensive sessions prior to the COVID-19 pandemic were adapted to the virtual platform while maintaining INACSL standards. Faculty development and simulation integrity were enhanced by using a standardized template and structure for the design and delivery of the simulation. This standardization offered learners clarity about objectives, expected outcomes, and nurse practitioner competencies met through the simulation activity; it may have contributed to the high level of learner engagement during the simulation and in the debriefing. This was a valuable process for these activities and will inform our future online offerings.

Conflicts of Interest

The authors have no financial relationships relevant to this article to disclose and no competing interests to declare.

Acknowledgments

A special thanks to our simulation technician team, Raymond Brisson, III and Tiffiany Parker, whose contributions were vital to this project.

This work was supported by the National Institutes of Health, National Institute of Nursing Research Grant No. 1F31NR018100 (to JV through July 2020) and a T32 training grant (to JV at present).

The research reported in this publication is supported in part by the National Institute of Nursing Research of the National Institutes of Health under Award Number F31NR018100. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

References

  1. Alinier G. Developing high-fidelity health care simulation scenarios: A guide for educators and professionals. Simulation & Gaming. 2011;42(1):9–29. doi: 10.1177/1046878109355683.
  2. Bryant K., Aebersold M.L., Jeffries P.R., Kardong-Edgren S. Innovations in simulation: Nursing leaders’ exchange of best practices. Clinical Simulation in Nursing. 2020;41(C):33–40. doi: 10.1016/j.ecns.2019.09.002.
  3. Cronenwett L., Sherwood G., Barnsteiner J., Disch J., Johnson J., Mitchell P.…Warren J. Quality and safety education for nurses. Nursing Outlook. 2007;55(3):122–131. doi: 10.1016/j.outlook.2007.02.006.
  4. INACSL Standards Committee. INACSL standards of best practice: SimulationSM Simulation design. Clinical Simulation in Nursing. 2016;12(S):S5–S50. doi: 10.1016/j.ecns.2016.09.009.
  5. Kolb D.A. Experiential learning: Experience as the source of learning and development. Prentice-Hall; 1984.
  6. Leighton K., Ravert P., Mudra V., Macintosh C. Updating the simulation effectiveness tool: Item modifications and reevaluation of psychometric properties. Nursing Education Perspectives. 2015;36(5):317–323. doi: 10.5480/15-1671.
  7. Murray R. An overview of experiential learning in nursing education. Advances in Social Science Research Journal. 2018;5(1):1–6. doi: 10.14738/assrj.51.4102.
  8. National Organization of Nurse Practitioner Faculties. Nurse practitioner core competencies content. 2017. Retrieved from http://www.nonpf.org
  9. Silberman M. The handbook of experiential learning. Wiley; 2007.
  10. Sittner B.J., Aebersold M.L., Paige J.B., Graham L.L., Schram A.P., Decker S.I., Lioce L. INACSL Standards of Best Practice for Simulation: Past, present, and future. Nursing Education Perspectives. 2015;36(5):294–298. doi: 10.5480/15-1670.

Articles from Clinical Simulation in Nursing are provided here courtesy of Elsevier
