Nurse Education in Practice. 2021 Jan 7;50:102967. doi: 10.1016/j.nepr.2021.102967

Curricular uptake of virtual gaming simulation in nursing education

Margaret Verkuyl a, Jennifer L. Lapum b, Oona St-Amant b, Michelle Hughes a, Daria Romaniuk b
PMCID: PMC9759869  PMID: 33465565

Abstract

In nursing education, virtual simulations are used to augment in-person simulation and to prepare students for and supplement clinical placements. More recently, as a result of the COVID-19 pandemic, virtual simulations are being used to replace clinical hours. Many virtual simulations require the user to make decisions that affect the outcome of the simulated experience. In this article, we provide a historical account of the virtual gaming simulations (VGS) that members of our team developed and the processes that led to successful uptake into curriculum. In addition, we share lessons learned from our experiences in maximizing curricular uptake. We found that engagement of the teaching team is essential when using VGS in a course. It is also important, when using VGS, to follow the process of prebrief, enactment, debrief and evaluation. Educators can build on and grow from our lessons learned so that the path to embedding virtual gaming simulation in curriculum becomes clear.

Keywords: Virtual simulation, Nursing, Education

1. Introduction

Simulation is an integral part of nursing education. With the exponential growth of technology-enabled platforms, simulations have shifted to the virtual world. Virtual simulation refers to simulations that students interact with in an online format or on a computerized device (Cant et al., 2019). More recently, the move by regulatory bodies in nursing to count virtual experiences as clinical hours has led to a significant uptake in the use of virtual experiences. Although virtual simulation is relatively new in nursing education, it has a long history in the fields of aviation and the military (Aebersold, 2016). Similar to nursing, these fields privilege safety as a core concept in professional development. In nursing, it is important to incorporate pedagogy that promotes patient safety. The nature of virtual simulation promotes psychological safety because it can be played individually at the learner's pace (Lapum et al., 2018b) and because the decisions made in the game do not have a real-life impact (Verkuyl et al., 2017; Nelson, 2016).

Virtual simulation is used in healthcare education and at the different levels of nursing education (Duff et al., 2016). Systematic reviews involving healthcare students found virtual simulation to be as effective as, or superior to, in-person simulation and other teaching methods with respect to engagement, safety, convenience, clinical reasoning, procedural skills and team skills (Duff et al., 2016; Foronda et al., 2020; Kononowicz et al., 2019). Foronda et al.'s (2020) systematic review concluded that virtual simulation is an effective pedagogy in nursing education.

In this article, we define virtual gaming simulation (VGS) using the terms fidelity, immersion, and patient (Cant et al., 2019). Our virtual simulation is a high-fidelity, 2D immersive simulation using videos of simulated patients (played by actors) in which the user can make clinical decisions for learning in healthcare. VGS integrate gaming theory with computerized simulations re-enacting a clinical scenario (Verkuyl et al., 2017). Embedded gaming elements challenge the user to make decisions that affect the outcome of the simulated experience without actual consequences for the player or a real-life patient. Unlike in-person simulations, which are limited by the need for ongoing physical and human resources, virtual simulations allow the user to repeatedly trial their decision-making in a safe learning environment while receiving feedback. Similar to other forms of simulation, researchers have found that VGS engage students while providing opportunities for knowledge application and the promotion of self-efficacy and reflection (Verkuyl et al., 2017; Verkuyl and Mastrilli, 2017).

There is literature related to developing virtual simulations (Rim and Shin, 2021; Verkuyl et al., 2019a); however, there is limited information on how to implement VGS in curriculum. This paper addresses how our team became involved in VGS development and the processes that facilitated successful curricular uptake. Lessons learned from our experiences can be used as a guide for educators to create sustainable programs that foster active learning. In addition, future VGS development, research studies and curricular uptake are discussed.

2. Development of a VGS

The design of VGS is informed by a branching scenario approach, which involves an unfolding storyline based on the learner's clinical decision-making, and by simulation design principles (INACSL Standards Committee, 2016; Verkuyl et al., 2019b). The user is presented with realistic film clips of nurse-client scenarios followed by options on how to proceed (Lapum et al., 2018a; Verkuyl et al., 2017). This branching scenario approach allows students to choose an option while experiencing the hypothetical consequences of their clinical judgment (Lapum et al., 2018b). When the user chooses an incorrect answer, they view the consequence of their action before receiving feedback related to clinical competencies and standards of care; after this, they are redirected to choose another answer. When the user chooses the correct answer, the simulation continues. Users receive a summary report at the end outlining each of their decisions. Students can replay the VGS and thus repeat the simulation as many times as they choose.
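
The branching logic described above can be viewed as a small decision tree. The sketch below is a minimal, hypothetical illustration of that structure (the data fields, node names and helper functions are our own assumptions for illustration and are not drawn from the authors' games): an incorrect choice shows its consequence and feedback and returns the learner to the same decision point, a correct choice advances the storyline, and every choice is logged so that a summary report can be produced at the end.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Optional

@dataclass
class Option:
    text: str                        # the decision the learner can choose
    correct: bool                    # whether this choice advances the scenario
    feedback: str                    # consequence and rationale shown after choosing
    next_node: Optional[str] = None  # node to continue to when the choice is correct

@dataclass
class DecisionNode:
    node_id: str
    clip: str                        # film clip shown before the decision point
    options: List[Option] = field(default_factory=list)

def play(nodes: Dict[str, DecisionNode], start: str,
         choose: Callable[[DecisionNode], int]) -> List[dict]:
    """Run one play-through and return the decision log used for the summary report."""
    log: List[dict] = []
    current: Optional[str] = start
    while current is not None:
        node = nodes[current]
        option = node.options[choose(node)]
        log.append({"node": node.node_id, "choice": option.text,
                    "correct": option.correct})
        print(option.feedback)            # learner views the consequence and feedback
        if option.correct:
            current = option.next_node    # storyline continues (None ends the game)
        # incorrect: stay on the same node so the learner can choose another answer
    return log
```

A replay simply calls `play` again with the same nodes, which mirrors how students can repeat the VGS as many times as they choose.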

Members of our team began creating VGS in 2013 in order to provide a safe environment for nursing students to practice clinical decision-making and to augment clinical practice experiences. The games provide experiences in clinical areas such as pediatrics, mental health, maternal health, and emergency, and are freely accessible at: https://de.ryerson.ca/games/nursing/hospital/. The VGS learning outcomes are related to clinical practice and general in nature so that they are applicable to nursing students, practicing nurses and other healthcare professionals. Each VGS takes up to an hour to play. These VGS have been integrated into the curriculum at our educational institutions, internationally, and in practice for continuing competence. Over the last few years, we have worked to expand expertise and capacity to create these VGS by expanding our design team and sharing knowledge in peer-reviewed journals, book chapters, workshops, and conferences (Lapum et al., 2018b; Verkuyl et al., 2017; Verkuyl et al., 2019b).

The VGS provide students with experiences in specialty clinical areas such as pediatrics, mental health, and maternal health, where the availability of sufficient placements has proven limited in Canada (Canadian Association of Schools of Nursing, 2010). Placement availability has been limited even further by the COVID-19 pandemic. Faced with this challenge, educators have turned to virtual simulation to supplement clinical placements. While high-quality VGS are expensive to produce, the number of times they can be used by students across years and programs is unlimited, thus mitigating some costs (Lapum et al., 2018b). In a cost-utility analysis, virtual simulation was found to be one third of the cost of manikin-based simulation, with no significant difference in learning (Haerling, 2018). Additionally, our VGS are open resources that educators and students can access for free. The games have been played over 600,000 times across 25 countries, and these numbers continue to grow.
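
To make the cost-utility reasoning above explicit (a sketch only; the symbols are generic and the one-third figure is the cited result, not our own data): a cost-utility ratio divides the cost of delivering a simulation by the learning outcome (utility) it produces, so when learning outcomes are comparable and the virtual format costs roughly a third as much, its cost-utility ratio is correspondingly about a third of the manikin-based one.

```latex
\mathrm{CUR} = \frac{\text{cost per learner}}{\text{learning outcome (utility)}}, \qquad
\frac{\mathrm{CUR}_{\text{virtual}}}{\mathrm{CUR}_{\text{manikin}}}
  \approx \frac{C_{\text{virtual}}/U}{C_{\text{manikin}}/U}
  = \frac{C_{\text{virtual}}}{C_{\text{manikin}}} \approx \frac{1}{3}
```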

3. Curricular uptake of VGS

Educators play a vital role in the curricular uptake of VGS, both by advancing changes in policy and through faculty development (Dhilla, 2017). In an integrative review, Dhilla found that educators felt vulnerable when considering online learning environments because they were required to amend their pedagogical approach without a clear path to ensure successful uptake. However, many educators recognize that the learning environment is changing with today's digitally savvy learners. At this juncture, it is important to support educators in the development and implementation of VGS by sharing experiences related to successful uptake in curriculum.

VGS are designed to facilitate students' application of learned concepts in a realistic scenario. In healthcare education, VGS are often used to practice clinical decision-making (Duff et al., 2016). With this in mind, VGS should reinforce specific learning outcomes of a course. Incorporating course-specific VGS can improve student learning outcomes by helping students to develop a deep interest in the educational content (Wronowski, 2019). Game developers can refer to the learning outcomes of the specific course in which the VGS will be integrated. However, to enhance curricular uptake beyond one's own course, it is important to consider learning outcomes that are common across nursing programs. Additionally, game developers can tailor learning outcomes to standards of practice and entry-to-practice competencies. When appropriate VGS learning outcomes are identified, educators then need to consider the content as it relates to the level of learner, and their personal experience using VGS within their course (Verkuyl et al., 2020a).

There are a number of teaching strategies for using VGS in curriculum. One common strategy is to assign the VGS as an individual activity that students complete on their own, at their convenience, within a specified time frame. With this approach, students can make mistakes without encountering embarrassment among their peers and can assess their personal decision-making abilities. Another strategy is to work through the VGS in a small student group, either in person or through web conferencing. At each decision point, the group discussion can enhance learning, develop team decision-making skills, and support conflict management. A third strategy is to play the VGS as a large group activity. The group can make decisions by audience polling or group discussion. Differing views can promote student discussion of the various perspectives on a decision, creating an engaging and stimulating learning environment (Tosterud et al., 2014; Verkuyl et al., 2020a).

The prebrief is a dedicated time set aside before the VGS occurs. The educator introduces the VGS while setting the tone, articulating expectations, sharing goals, providing an orientation to the VGS environment, and clarifying evaluations (Bryant et al., 2019). The prebrief also reassures the user of the opportunity to discuss the scenario with faculty and their peers. Tyerman et al. (2018) found that an effective prebrief, tailored to the simulation and learner, resulted in positive learner satisfaction and learning outcomes. The prebrief should be tailored to include an explanation of the virtual format, game expectations and decision-making, required technology, and technological support. It is crucial to ensure technical glitches do not overshadow the VGS experience, as students have indicated they learned less when they encountered technological challenges (Anderson et al., 2013; Verkuyl and Mastrilli, 2017). These challenges can be averted when faculty are familiar with the VGS and technological support is available. It is particularly important to communicate if and how the VGS will be evaluated. One recommendation is to assign a participation mark or a graded reflective activity rather than grading based on the student's performance in the VGS (Verkuyl et al., 2020a). This recommendation supports an experiential pedagogy that preserves the integrity of the VGS as a ‘game' and not a test, which allows students to make mistakes as well as learn from the VGS experience.

Debriefing is widely known to optimize learning and support students to reflect on their simulation experiences (Fey et al., 2014). There are a number of formats for educators to consider when deciding how to debrief. The facilitated in-person group debrief is most commonly used with in-person simulation. However, a similar group debrief can be achieved through web conferencing platforms, which can be particularly useful with virtual simulations (Verkuyl et al., 2020b; Gordon, 2017). Other formats being explored include the self-debrief, asynchronous online discussions, and a combination of self plus group debrief (Verkuyl et al., 2020b). Lapum et al. (2018a) define a self-debrief as “an individual, written activity in which a series of questions (designed based on a theoretical debriefing framework) facilitate learners' reflection on a simulation” (p.1). When deciding which format to use, educators must consider how the VGS will be played, the level of the learner, the content, and the learning outcomes.

In the virtual environment, data collected and analyzed through learning analytics can provide insight into the user's interactions with the game so that the learning experience can be improved (Alonso-Fernández et al., 2019). The data and trends offered through analytics allow educators to assess user uptake and provide insight into common errors and learning gaps. The results can be used to provide specific education to the learners allowing for targeted teaching. It is important for educators to understand what learning analytics are available with the VGS, as well as which analytics are important for their learners and how best to integrate the data to augment learning. In our VGS, the students receive an individualized summary report sheet at the end of the game which identifies each decision made, right or wrong.
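
As a hedged sketch of how such analytics might be put to work (the log format and field names below are assumptions for illustration, not the actual schema of our games): given per-play decision logs like the summary report described above, an educator could tally how often each decision point is answered incorrectly to surface common errors and learning gaps for targeted teaching.

```python
from collections import Counter
from typing import Iterable, List, Tuple

def common_errors(decision_logs: Iterable[List[dict]], top_n: int = 5) -> List[Tuple[str, float]]:
    """Rank decision points by error rate across many play-throughs.

    Each log is a list of decisions such as
    {"node": "pain_assessment", "choice": "...", "correct": False},
    mirroring the per-decision summary report described above.
    """
    errors: Counter = Counter()
    attempts: Counter = Counter()
    for log in decision_logs:
        for decision in log:
            attempts[decision["node"]] += 1
            if not decision["correct"]:
                errors[decision["node"]] += 1
    # Decision points with the highest error rates point to common learning gaps.
    rates = {node: errors[node] / attempts[node] for node in attempts}
    return sorted(rates.items(), key=lambda kv: kv[1], reverse=True)[:top_n]
```

The highest-error decision points could then be revisited in the group debrief or used to plan targeted teaching.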

Researching and evaluating the user's experience and the learning outcomes of VGS is an important assessment mechanism. The results can be used to support the use of VGS for learning, identify learning needs, and offer a rationale for embedding VGS experiences in curriculum. The evaluation can be shared with peers to inspire usage and with administrators to validate the use of VGS. Completing a cost analysis helps to determine whether the simulation is financially effective for meeting learning outcomes and would provide educators with evidence-based rationales for the most effective ways to employ simulation pedagogy (Haerling, 2018).

4. Embedding VGS into curriculum

The choice of which VGS to embed in a course is determined by a number of factors, such as learning outcomes, finances, faculty expertise, available resources, and the current healthcare environment. Once a VGS is chosen, educators must determine the most effective process for including it in their curriculum. This can be challenging since there are no clear guidelines. In addition, faculty development opportunities addressing these challenges are limited, but slowly expanding. To address these challenges, we suggest that educators share their experiences and that program and curricular leaders support VGS champions and the curricular uptake of VGS.

As an example, a VGS created by members of our team was taken up in a year-one didactic health assessment course with over 550 nursing students enrolled across three institutional sites. The syllabus, weekly outline and evaluations are consistent across all three sites. We approached the health assessment teaching team, consisting of eleven instructors, to review the newly developed VGS and consider embedding it in the course. Some of us were part of this teaching team, which enhanced interest in the process. The VGS learning objectives aligned with the course content, but the team had to reach a consensus about revising the course learning activities so that the VGS could be included. Following review and discussion, there was support to trial the game for one year. In addition, a small faculty team drawn from the health assessment course conducted a study exploring students' perception of their experience of the game.

Two of the authors of this article were involved in the creation of the first two VGS. All of the authors of this article became VGS champions by embedding VGS into curriculum, researching the outcomes, and creating new VGS to support additional courses. In addition to obtaining early buy-in from the course teaching team, we considered how best to support the educators with the uptake of this VGS in the health assessment course. Using new technology in courses can sometimes cause apprehension because of its novelty and educators’ lack of familiarity. We mitigated this concern by having a support person available to address any technological challenges throughout the process. The support person was on the development team and had the ability to connect with the web designer to address challenges promptly. At the faculty team meetings, we discussed the learning outcomes of the VGS, rationales for use, how to use it, and relevant technology issues. In our discussions, we highlighted that when faculty are excited about a specific learning experience, students are positively influenced. Before providing the link to students, we encouraged instructors to work through all branches of the scenario including correct and incorrect options, and review all of the rationales. It is important for educators to be intimately familiar with the possible learning experiences so that they can support students.

The VGS link was made available to students on their course learning management system, and the VGS was referred to in the weekly outline and weekly PowerPoint. The VGS was to be completed after the content had been taught in class. During the prebrief, instructors provided an overview of the instructions to students, introduced them to the platform, discussed the learning outcomes, informed them of grading, announced the due date, and provided information on technological support. This process was important because students played the VGS individually, in place of a class lecture. We assigned the VGS as an individual assignment so that students made their own decisions without being influenced by others, could reflect on their choices, and could identify personal knowledge gaps. Once students had finished the VGS, we recommended they immediately complete their self-debrief (a series of questions based on a theoretical debriefing framework) while referring to their individualized summary report of their decisions. We required that students complete the game by a specific date and send their individualized VGS summary report and completed self-debrief to their instructor; failure to do so resulted in a 1% deduction from their final grade. Then, within one to two weeks, students participated in a facilitated, large-class (30 students) debrief session conducted by faculty, which research supports as an effective form of debriefing (Verkuyl et al., 2019a, 2020b).

Along each step of our journey, we conducted research to explore the experience. We completed a usability study with nursing faculty and students to assess the ease of use and perceived usefulness of the VGS before integrating it into curriculum (Verkuyl et al., 2018). The results were used to refine this VGS, inform future VGS, and mitigate technological issues for players. Ease of use and perceived usefulness are components that enhance curricular uptake, which was particularly important given that over 550 students in our nursing program played the VGS over a one-week period. Afterwards, we conducted studies on the user experience and on how best to embed the VGS in curriculum to optimize learning.

Research results (Verkuyl et al., 2017), students' positive feedback and their desire for other VGS experiences increased the teaching team's resolve to embed more VGS in the curriculum. It also increased the momentum for some of the team members to be involved in creating more VGS. Sharing a trailer of the VGS (see https://www.youtube.com/watch?v=oMk7Fyqqm3o) and discussing successful outcomes with colleagues at a professional development day heightened the enthusiasm for using VGS in other courses.

4.1. Lessons learned

For many educators, the use of technology-enabled learning activities like VGS may be out of their comfort zone, resulting in reluctance to incorporate them into curriculum. However, as instructors we are called to regularly evaluate our teaching practices and consider more effective ways for students to learn. We have learned that with education and support, nurse educators can successfully engage in the use of technology-informed pedagogy.

To support curricular uptake of VGS, it is important to engage teaching teams early in the process. Engagement of the teaching team facilitates their connection to the VGS and makes them champions of its uptake. Seeking feedback from teaching teams and other educators on a newly created VGS, whether through a usability study or anecdotally, facilitates improvements that enhance the faculty and student experience.

To date, we have given participation marks and have not graded students' VGS performance or their self-debrief. It is our belief that the experiential nature of VGS provides opportunities for students to learn as much from choosing an incorrect answer as from choosing the correct one. Grading the VGS may motivate students to get the highest score but may limit their willingness to try different options and learn from their mistakes. As such, evaluating students' success based on their VGS score may undermine the general philosophy and theory underpinning the use of games.

The first year we introduced the VGS, we did not require the students to do a debrief because we felt they had sufficient feedback throughout the game and in their summary report. However, our first study of students' VGS experiences suggested otherwise. We observed that students who participated in the focus group used it as a form of debrief; they not only talked about their VGS experience but also indicated that they wanted to reflect on their experience once it was over (Verkuyl et al., 2017). Our challenge was to determine how to offer an effective debrief when students play the VGS at their convenience. After a series of studies, we determined our solution: a two-step debriefing process (Verkuyl et al., 2020b). After playing the VGS, students now complete an immediate written self-debrief and then, within two weeks, participate in a large group debrief. During the group debrief, they are instructed to refer to their summary report and completed self-debrief. The questions asked in the group debrief are the same as the questions in the self-debrief. This combination of debriefing gives students the opportunity to reflect on their experience individually, ponder the experience over a short period of time, and then participate in a larger group discussion where they are exposed to other perspectives.

Another lesson we learned was the importance of evaluation during each step of our journey. The results of one of our focus group studies indicated that students appreciated the VGS experience, were engaged in the experience, learned from playing the game, and wanted more VGS experiences (Verkuyl et al., 2017). The exploratory and outcomes-based research into students' experience provided support for curricular uptake of VGS (Verkuyl et al., 2017). Anecdotally, students shared with their instructors how much they appreciated and learned from the experience, spurring us to continue using the VGS.

4.2. Future directions

We continue to obtain funding to create more VGS so that additional clinical areas can be represented. Recently, one of our institutions received a large donation to fund simulation, and some of us have become members of the steering committee to oversee the creation of seven more VGS. As part of this initiative, one of our goals is to expand the cadre of educators who have expertise in VGS. We were encouraged by the wide range of educators who voiced interest in being involved; what started as three educators in our collaborative program involved in the creation of VGS has grown to over 30. The formation of this steering committee has increased the organization of our development process so that documents, terms of reference, advisory groups, and processes are clearly defined and reproducible. It has also allowed for strategic direction in the creation of VGS. Recently, we have increased the involvement of students in VGS development. In addition to being involved in designing the scenarios, students review the games and act in them. We believe having both educators and students involved in VGS development will increase the VGS uptake and applicability.

An existing gap is that educators are left to experiment and adapt in-person simulation best practices to virtual experiences. However, research related to using virtual experiences in nursing has increased exponentially and will pave the way for best practice guidelines. At this juncture, there is a need to develop workshops and faculty guides and to study outcomes in order to provide approaches to designing and implementing virtual experiences.

There are two reasons to collect analytics: to understand student learning and to inform the game design process (Fernández, 2016). To make good use of analytics, we need to consider why we are collecting the data in the first place. We know our games are played internationally by tens of thousands of users. A game can be played by an individual or by small or large groups, so the number of plays does not precisely reflect the number of users. We believe the games are played because of their high quality, but also because of their open-access availability, which requires no log-in. As a result, we do not know the players' disciplines, year of nursing study, demographics, or how the VGS was used. This information could increase our understanding of the VGS end users. At this time, we are considering what data to collect and how best to understand and use the data so that we can revise a VGS or inform future VGS.

We have found evaluation of the experience to be key in promoting the VGS among our team and to educators internationally. However, there are still unknowns regarding how to evaluate VGS effectiveness with respect to knowledge retention, clinical decision-making, and clinical practice. We call on the simulation community to engage in research to advance our understanding, because there are currently no studies indicating whether students' clinical practice improves as a result of using VGS or how VGS can potentially replace or augment clinical hours.

5. Conclusion

There is an unprecedented surge in the curricular uptake of VGS in nursing education. Because of its relative unfamiliarity, it is important that educators share their road map for embedding VGS in curriculum, including both positive and challenging experiences. Our aim is for other educators to use and build on our lessons learned so that together we can navigate the uncharted waters of using VGS.

References

1. Aebersold M. The history of simulation and its impact on the future. Adv. Critical Care. 2016;27(1):56–61. doi: 10.4037/aacnacc2016436.
2. Alonso-Fernández C., Calvo-Morata A., Freire M., Martinez-Ortiz I., Fernandez-Manjon B. Applications of data science to game learning analytics data: a systematic literature review. Comput. Educ. 2019;141:1–13. doi: 10.1016/j.compedu.2019.103612.
3. Anderson J.K., Page A.M., Wendorf D.M. Avatar-assisted case studies. Nurse Educat. 2013;38(3):106–109. doi: 10.1097/NNE.0b013e31828dc260.
4. Bryant K., Aebersold M.L., Jeffries P.R., Kardong-Edgren S. Innovations in simulation: nursing leaders' exchange of best practices. Clin. Simul. Nursing. 2019:1–8. doi: 10.1016/j.ecns.2019.09.002.
5. Canadian Association of Schools of Nursing. The Case for Healthier Canadians: Nursing Workforce Education for the 21st Century. 2010. https://casn.ca/wp-content/uploads/2014/12/CASN2010draftJune1.pdf
6. Cant R., Cooper S., Sussex R., Bogossian F. What's in a name? Clarifying the nomenclature of virtual simulation. Clin. Simul. Nursing. 2019;27:26–30. doi: 10.1016/j.ecns.2018.11.003.
7. Dhilla S.J. The role of online faculty in supporting successful online learning enterprises: a literature review. Higher Education Politics & Economics. 2017;3(1), Article 3. Available at: https://digitalcommons.odu.edu/aphe/vol3/iss1/3
8. Duff E., Miller L., Bruce J. Online virtual simulation and diagnostic reasoning: a scoping review. Clin. Simul. Nursing. 2016;12:377–384. doi: 10.1016/j.ecns.2016.04.001.
9. Fernández C.A. Gaming Learning Analytics for Serious Games. 2016. https://pubman.e-ucm.es/drafts/e-UCM_draft_296.pdf
10. Fey M.K., Scrandis D., Daniels A., Haut C. Learning through debriefing: students' perspectives. Clin. Simul. Nursing. 2014;10(5):e249–e256. doi: 10.1016/j.ecns.2013.12.009.
11. Foronda C.L., Fernandez-Burgos M., Nadeau C., Kelley C.N., Henry M.N. Virtual simulation in nursing education: a systematic review spanning 1996 to 2018. Soc. Simul. Healthcare. 2020;15(1):46–54. doi: 10.1097/SIH.0000000000000411.
12. Gordon R.M. Debriefing virtual simulation using an online conferencing platform: lessons learned. Clin. Simul. Nursing. 2017;13(12):668–674. doi: 10.1016/j.ecns.2017.08.003.
13. Haerling K. Cost-utility analysis of virtual and mannequin-based simulation. Simulat. Healthc. J. Soc. Med. Simulat. 2018;13(1):33–40. doi: 10.1097/SIH.0000000000000280.
14. INACSL Standards Committee. INACSL standards of best practice: Simulation℠ simulation design. Clin. Simul. Nursing. 2016, December;12(S):S5–S12. doi: 10.1016/j.ecns.2016.09.005.
15. Kononowicz A.A., Woodham L.A., Edelbring S., Stathakarou N., Davies D., Saxena N., Tudor Car L., Carlstedt-Duke J., Car J., Zary N. Virtual patient simulations in health professions education: systematic review and meta-analysis by the digital health education collaboration. J. Med. Internet Res. 2019;21(7). doi: 10.2196/14676.
16. Lapum J., Verkuyl M., Hughes M., Romaniuk D., McCulloch T., Mastrilli P. Self-debriefing in virtual simulation. Nurse Educat. 2018;44(6):E6–E8. doi: 10.1097/NNE.0000000000000639.
17. Lapum J., Verkuyl M., Hughes M., St-Amant O., Romaniuk D., Betts L., Mastrilli P. Design and creation of virtual gaming simulations in nursing education. In: Gordon R., McGonigle D., editors. Virtual Simulation in Nursing Education. Springer Publishing; New York: 2018. pp. 127–141.
18. Nelson R. Replicating real life: simulation in nursing education and practice. Am. J. Nurs. 2016;116(5):20–21. doi: 10.1097/01.NAJ.0000482956.85929.d8.
19. Rim D., Shin H. Effective instructional design template for virtual simulations in nursing education. Nurse Educ. Today. 2021;96:104624. doi: 10.1016/j.nedt.2020.104624.
20. Tosterud R., Hall-Lord M.L., Petzäll K., Hedelin B. Debriefing in simulation conducted in small and large groups - nursing students' experiences. J. Nurs. Educ. Pract. 2014;4(9):173–182. doi: 10.5430/jnep.v4n9p173.
21. Tyerman J., Luctkar-Flude M., Graham L., Coffey S., Olsen-Lynch E. A systematic review of health care presimulation preparation and briefing effectiveness. Clin. Simul. Nursing. 2018;27(C):12–25. doi: 10.1016/j.ecns.2018.11.002.
22. Verkuyl M., Atack L., Kamstra-Cooper K., Mastrilli P. Virtual gaming simulation: an interview study of nurse educators. Simulat. Gaming. 2020;51(4):537–549. doi: 10.1177/1046878120904399.
23. Verkuyl M., Hughes M., Tsui J., Betts L., St-Amant O., Lapum J. Virtual gaming simulation in nursing education: a focus group study. J. Nurs. Educ. 2017;56(5):274–280. doi: 10.3928/01484834-20170421-04.
24. Verkuyl M., Hughes M., Atack L., McCulloch T., Lapum J.L., Romaniuk D., St-Amant O. Comparison of self-debriefing alone or in combination with group debrief. Clin. Simul. Nursing. 2019;37(C):32–39. doi: 10.1016/j.ecns.2019.08.005.
25. Verkuyl M., Lapum J., St-Amant O., Hughes M., Romaniuk R., Mastrilli P. Designing virtual gaming simulations. Clin. Simul. Nursing. 2019;32(C):8–12. doi: 10.1016/j.ecns.2019.03.008.
26. Verkuyl M., Lapum J.L., St-Amant O., Hughes M., Romaniuk D., McCulloch T. Exploring debriefing combinations after a virtual simulation. Clin. Simul. Nursing. 2020;40(C):36–42. doi: 10.1016/j.ecns.2019.12.002.
27. Verkuyl M., Mastrilli P. Virtual simulations in nursing education: a scoping review. J. Nursing Health Sci. 2017;3(2):39–47. Retrieved from: https://pdfs.semanticscholar.org/3c79/0065159264dea06b94e8bb947ac331268aff.pdf
28. Verkuyl M., Romaniuk D., Mastrilli P. Virtual gaming simulation of a mental health assessment: a usability study. Nurse Educ. Pract. 2018;18(31):83–87. doi: 10.1016/j.nepr.2018.05.007.
