Journal of Education and Health Promotion. 2023 Feb 28;12:55. doi: 10.4103/jehp.jehp_1011_22

Health sciences educator's simulation debriefing practice needs: A mixed methods study

Champion N Nyoni 1, Anke van der Merwe 1, Benjamin S Botha 1, Cecille Fourie 1, Yvonne Botma 1, Mathys J Labuschagne 1, Riaan van Wyk 2
PMCID: PMC10127471  PMID: 37113413

Abstract

BACKGROUND:

Simulation debriefing influences learning from healthcare simulation activities. Health sciences educators must be competent in conducting simulation debriefing for healthcare students. A structured faculty development intervention for health sciences educators must be informed by educator needs to enhance its utility. This paper describes the needs of health sciences educators regarding simulation debriefing at a faculty of health sciences.

MATERIALS AND METHODS:

A parallel convergent mixed methods study design was applied to a selected population of 30 health sciences educators at the University (x) who integrate immersive simulation for first- to final-year students in their undergraduate programs. The Objective Structured Assessment of Debriefing tool underpinned the observations that informed the quantitative strand of the study, while semi-structured interviews were conducted as part of the qualitative strand. Descriptive statistics and thematic analysis were used to analyze the data.

RESULTS:

Health sciences educators struggled to establish the learning environment for simulation (median 1), facilitate learning (median 3), and evaluate their debriefing activities. However, they were able to apply an appropriate approach toward simulation (median 4). They identified the need to be educated on the fundamentals of simulation-based education.

CONCLUSION:

A continuing professional development program must be developed aimed at transforming approaches toward facilitating learning, explaining the fundamentals of simulation-based education, modeling best practices related to debriefing, and applying appropriate strategies for evaluating debriefing activities.

Keywords: Needs assessment, professional competence, simulation training

Introduction

Simulation debriefing is integral in enhancing learning from simulation activities. Health sciences educators who design and implement simulation activities for health sciences students are expected to be competent in simulation debriefing. However, poor simulation debriefing practices often negatively influence students' simulation experiences, leading to poor simulation outcomes.[1]

Simulation debriefing is a guided reflection on a simulation activity to improve students' performance.[2] The literature describes various approaches to enacting simulation debriefing in health sciences education.[2,3,4] Common to these approaches is the authenticity of the debriefer, who should have observed the specific simulation activity before any debriefing.[4] In addition, simulation debriefing is usually a stepwise process that enables students to reflect on their emotional experiences, simulation experiences, learning outcomes, and a possible plan of action.[3] The experiential learning model by Kolb (1984) prominently underpins simulation debriefing activities, although more recent educational models have also been applied.[5] Verkuyl et al. (2017)[6] state that simulation debriefing approaches should respond to various simulation modalities. Health sciences educators should be conscious of the need for appropriate simulation debriefing approaches in light of advancements in simulation in health sciences, as exemplified by the adoption of virtual reality simulation.[7]

The success of any simulation debriefing hinges on the competence of the debriefer. Kolbe et al. (2015)[8] explored the limitations of briefing and debriefing programs through a literature review and highlighted an urgent need for educators to be exposed to faculty development programs that focus on enhancing their simulation briefing and debriefing competence. Approaches to debriefing by facilitators are often described as haphazard, with limited efficacy. For example, Atwater (2018) described some educators as lacking competence in debriefing practices in an undergraduate nursing program in Australia.[1] Surprisingly, little literature describes health sciences educators' debriefing competence despite the escalating adoption of simulation-based education in the African context.[9] It is therefore essential that health sciences educators are empowered and continuously supported in executing their roles, such as simulation debriefing.[10] Such support strategies should be delivered through contextually based and theory-driven educational programs.

The design, development, implementation, and evaluation of an educational program to enhance simulation debriefing competence for health sciences educators should be underpinned by sound theoretical models to improve quality and transferability.[11] The lack of educational theory has been described as a grave weakness and limitation of many educational programs, further influencing their utility.[12] The educational approach underpins the reasoning behind specific educational interventions and allows educationalists to appropriately evaluate their outcomes.[13] The Analysis, Design, Development, Implementation and Evaluation (ADDIE) instructional system design model[14] is a popular model used to develop instructional design activities such as a faculty development program to enhance debriefing competencies. The model posits that in the design of any instructional intervention, the developer must clearly understand the gaps, the desired outcomes and behaviors, and the audience's existing knowledge and skills. Understanding these gaps may be through observing the audience's performance and engaging the audience on their perceived needs.[14]

Educator competence is integral in simulation debriefing for positive educational outcomes. The literature on the simulation debriefing practices of health sciences educators in low- and middle-income countries is limited.[15] In this article, we describe the simulation debriefing practices of educators across a faculty of health sciences in Africa. We argue that insight into the simulation debriefing practices of health sciences educators provides a valuable understanding of their needs, which is an essential input toward developing contextually appropriate faculty development interventions, specifically in a low- and middle-income context.

The institution included in this study has one of the most extensive simulation facilities for health sciences in Africa, established in 2009.[16] Various departments in the Faculty of Health Sciences (FHS) integrate simulation as a teaching tool in their training programs, driven by specific learning outcomes. Annually, over 1000 students from up to seven undergraduate health sciences programs and more than ten postgraduate and continuing professional development programs rotate through these simulation facilities for various activities. The equipment in the facilities includes state-of-the-art manikins, task trainers, and virtual reality simulation.[17] Considering the high volume of students participating in simulation as part of their training at the University of the Free State, the limited reporting on simulation debriefing practices of health sciences educators in low- and middle-income countries is worrisome; the current study therefore aimed to explore the debriefing practices of health sciences educators.[15]

Materials and Methods

Study design and setting

A parallel convergent mixed methods design was used in this study, integrating quantitative and qualitative approaches.[18,19] Observations were used to collect quantitative data and semi-structured interviews to collect qualitative data.

Study participants and sampling

This study's population comprised all FHS health sciences educators who use immersive simulation in the seven undergraduate programs (N = 30). These health sciences educators were drawn from all departments with undergraduate programs. The participants were identified through their respective heads of department and were included through census sampling. The inclusion criteria were identification by the relevant head of department and being a staff member who facilitates immersive simulation.

Data collection tool and technique

The authors collected quantitative data by observing debriefing practices and qualitative data through semi-structured interviews. The Objective Structured Assessment of Debriefing (OSAD) tool, a validated and reliable tool to assess debriefing quality,[20] was used to collect quantitative data through peer-review of video-recorded debriefing activities. The OSAD tool comprises the eight categories of high-quality debriefing and enables the assessment of both debriefing quality and areas for improving debriefing practice.[20] These categories are approach, the establishment of learning environment, engagement of learners, descriptive reflection, reaction, analysis, diagnosis, and application. An author-developed semi-structured interview guide based on the OSAD tool categories was utilized to explore the experience and needs of the participants regarding simulation debriefing.
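Purely as an illustration of how such ratings can be recorded, the following minimal Python sketch (the class, identifiers, and field names are assumptions for illustration only and are not part of the published OSAD instrument) captures one reviewer's scores for a single observed debriefing against the eight categories on the tool's one-to-five scale:

```python
from dataclasses import dataclass, field
from typing import Dict

# The eight OSAD categories, each rated on a 1 (lowest) to 5 (highest) scale.
OSAD_CATEGORIES = (
    "approach",
    "establishes learning environment",
    "engagement of the learners",
    "descriptive reflection",
    "reaction",
    "analysis",
    "diagnosis",
    "application",
)

@dataclass
class OsadObservation:
    """One reviewer's ratings of a single video-recorded debriefing (illustrative only)."""
    participant_id: str                                      # anonymized educator number
    reviewer_id: str                                         # anonymized reviewer number
    ratings: Dict[str, int] = field(default_factory=dict)    # category -> score (1-5)

    def rate(self, category: str, score: int) -> None:
        # Guard against typos in category names and out-of-range scores.
        if category not in OSAD_CATEGORIES:
            raise ValueError(f"Unknown OSAD category: {category}")
        if not 1 <= score <= 5:
            raise ValueError("OSAD scores range from 1 (lowest) to 5 (highest)")
        self.ratings[category] = score

# Example: one reviewer scoring one recording (values are hypothetical).
obs = OsadObservation(participant_id="P01", reviewer_id="R1")
obs.rate("approach", 4)
obs.rate("establishes learning environment", 1)
```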

Participants were invited to the study through an information leaflet that explained the purpose of the research and the data collection methods. Consenting participants were required to sign the informed consent form before the commencement of data collection. The data were collected in two parallel phases, firstly the observations and secondly interviews [See Figure 1].

Figure 1. Diagram depicting study phases

Observation of debriefing practices (Quantitative phase)

Dates for data collection were agreed upon between the participants and the data collector. Video recording equipment was mounted at the simulation venue to obtain an unobstructed recording of the debriefing activities. Students participating in the debriefing activity were informed that the debriefer would be recorded and would not be identifiable. After each session, the recordings were appropriately labeled and stored in a password-protected electronic folder.

Perceived needs related to simulation debriefing (Qualitative phase)

The data were collected by a social worker with extensive qualitative data collection experience. The data collector and the participants agreed upon dates for individual semi-structured interviews. All participants from the observation phase were included, and the interviews were conducted online with recording capabilities. The participants' OSAD scores were not provided to them nor discussed with them. Data collection was guided by the author-generated tool, which was aligned with the OSAD categories. Furthermore, the participants were asked about their perceived needs for simulation debriefing competence based on their teaching practice. Various approaches and probes were used to enhance the discussion with the participants, resulting in interviews that lasted approximately 40 minutes each. The interview audio recordings were appropriately labeled and stored in a password-protected electronic folder.

Ethical consideration

This article reports a section of an overarching study aiming to develop, implement, and evaluate a faculty development program for simulation debriefing in health sciences. The Department of Health (South Africa) ethics framework, which is grounded in universal research ethics principles, underpinned this study. Participant autonomy was respected by allowing potential participants sufficient time to consider study participation prior to providing informed consent. In addition, the participants had the option to withdraw from the study at any time without any associated penalties. The authors ensured that all data were collected and managed with the utmost confidentiality through using unique participant numbers in interview transcripts and OSAD reporting. Ethics clearance and gatekeeper approval were sought and granted from the University of the Free State (UFS-HSD2020/1649/2411). All participants in this study provided written consent.

Data analysis

Data analysis was executed in two parts to integrate outcomes. The initial part focused on analyzing data from each phase of the study, and the second part focused on the convergence of the outcomes of the two phases to generate the study results.

Quantitative data analysis

Three of the authors with experience in simulation debriefing analyzed the data from the observations. The recordings and the OSAD tool were distributed to each author. Each author reviewed the recordings and scored the participant's performance. The scores for each recording were centrally collated, and a median score was calculated per OSAD category, integrating the scores from the three independent reviews.
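To illustrate this aggregation step, the sketch below (Python; the rater scores are entirely hypothetical, as per-rater data are not reported in this article) collates three independent reviews of one recording and reports the median and range per OSAD category:

```python
from statistics import median

# The eight OSAD categories assessed on a 1 (lowest) to 5 (highest) scale.
OSAD_CATEGORIES = [
    "approach", "establishes learning environment", "engagement of the learners",
    "descriptive reflection", "reaction", "analysis", "diagnosis", "application",
]

# Hypothetical scores from three independent reviewers for one recorded debriefing;
# each reviewer rates every category. These values are illustrative only.
reviewer_scores = {
    "reviewer_1": dict(zip(OSAD_CATEGORIES, [4, 1, 3, 3, 2, 3, 3, 3])),
    "reviewer_2": dict(zip(OSAD_CATEGORIES, [4, 2, 3, 2, 2, 3, 4, 3])),
    "reviewer_3": dict(zip(OSAD_CATEGORIES, [5, 1, 4, 3, 1, 2, 3, 3])),
}

# Centrally collate the three independent ratings and report the median
# (and range) per category, mirroring the aggregation described above.
for category in OSAD_CATEGORIES:
    ratings = [scores[category] for scores in reviewer_scores.values()]
    print(f"{category}: median={median(ratings)}, range={min(ratings)}-{max(ratings)}")
```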

Qualitative data analysis

Otter.ai software facilitated the data transcription process; the transcripts were checked and corrected against the recorded interviews for accuracy. Three of the authors with experience in qualitative data analysis inductively analyzed the data by applying a generic analysis framework.[21] Initially, the transcripts were read and re-read for familiarization. One author coded the transcripts, and the codes were presented for discussion. Subsequently, through pattern coding, themes were identified and defined. The themes were the outcomes of the qualitative strand.

Converging outcomes of quantitative and qualitative data

Part two of the data analysis focused on converging the outcomes of quantitative and qualitative data aligned with convergent mixed methods research.

Rigor of study

Two pilot studies were performed to refine the data collection approaches. The three data collectors oriented themselves to the OSAD tool and collectively reviewed a sample simulation debriefing experience to check for consistency in scoring. Discussions among the data collectors on the application of the OSAD enhanced the transparency of scoring from the tool before analyzing study data. An exploratory interview was conducted to assess the feasibility of the qualitative data collection process. A reflection on the exploratory interview with the researcher helped enhance the quality of the subsequent interviews. Following a consensus meeting between the data collectors, no changes to the data collection procedures or semi-structured interview guide were required, and the pilot data were included in the main study.

Results

The study results integrate the quantitative and qualitative findings, with the quantitative results presented first.

Quantitative results

Nine health sciences educators consented to participate in the quantitative study phase, and each was observed and scored independently by three authors, yielding 27 ratings. The results of the OSAD tool, consisting of eight categories scored from one (lowest) to five (highest), are listed in Table 1.

Table 1. Objective Structured Assessment of Debriefing (OSAD) ratings (n=27)

OSAD category | Median | Range
Approach (the manner in which the facilitator conducts the debriefing session, their level of enthusiasm and positivity when appropriate, showing interest in the learners by establishing and maintaining rapport and finishing the session on an upbeat note) | 4 | 1-5
Establishes learning environment (introduction of the simulation/learning session to the learner(s) by clarifying what is expected of them during the debriefing, emphasizing ground rules of confidentiality and respect for others, and encouraging the learners to identify their own learning objectives) | 1 | 1-4
Engagement of the learners (active involvement of all learners in the debriefing discussions, by asking open questions to explore their thinking and using silence to encourage their input, without the facilitator talking for most of the debriefing, to ensure deep rather than surface learning) | 3 | 1-5
Descriptive reflection (self-reflection on events that occurred in the simulation/learning session in a step-by-step factual manner, clarifying any technical or clinical issues at the start, to allow ongoing reflection from all learners throughout the analysis and application phases, linking to previous experiences) | 3 | 1-5
Reaction (establishing how the simulation/learning session impacted emotionally on the learners) | 2 | 1-5
Analysis (eliciting the thought processes that drove a learner's actions, using specific examples of observable behaviors, to allow the learner to make sense of the simulation/learning session events) | 3 | 1-5
Diagnosis (enabling the learner to identify their performance gaps and strategies for improvement, targeting only behaviors that can be changed, and thus providing structured and objective feedback on the simulation/learning session) | 3 | 1-5
Application (summary of the learning points and strategies for improvement that have been identified by the learner(s) during the debrief and how these could be applied to change their future clinical practice) | 3 | 1-5

Participants scored highest (four) in the "approach" category, and all other categories were scored average or lower (two or three), with "establishes learning environment" scored the lowest (one). None of the categories achieved a median of five (highest).

Qualitative findings

The nine participants were invited to take part in semi-structured interviews, and five agreed to participate. Four themes, with associated sub-themes, were identified through the analysis of the semi-structured interviews. The themes are simulation planning, evaluation, debriefing approach, and the need to improve debriefing competency [See Table 2].

Table 2. Themes identified by semi-structured interviews

Theme | Sub-themes
Simulation planning | Team preparation; Student preparation; Logistical arrangements
Debriefing approach | Space created; Student guidance; Challenges to approach; Strategies to enhance student engagement
Evaluation | Simulation case; Student feedback; Formal evaluation of simulation and debriefing activities
Improving debriefing competency | Theoretical training; Practical skills training

Simulation planning

The participants indicated that a large part of the simulation planning activity is ensuring that the correct role-players are briefed and that everybody knows their involvement regarding the simulation. The students were reportedly informed on technical aspects regarding the functions and limitations of the manikin and simulation equipment. Additionally, the participants mentioned that students were briefed on the clinical aspects of the case. Logistical arrangements were made to establish a simulation-ready environment. One participant summed it up as follows:

“So, I have the students, and I have my team who will be part of the simulations we will be running. So they [my team] know exactly what their role will be. We usually have three or four that help us. We also go with a technician and go through exactly what must be done with the manikin.” [ Participant 3 ]

The theme aligns with the approach and establishing learning environment categories of the OSAD tool [See Table 3]. The participants in this study scored high on the approach category but very low on the establishing learning environment category. A high score on the approach category indicates that the participants were able to establish and maintain rapport throughout the debriefing, using a non-threatening but honest approach and creating a psychologically safe environment. The very low score on establishing the learning environment category implies that the participants had unclear expectations of the learner, with no rules defining learner engagement during their debriefing session.

Table 3. Convergence of identified themes and OSAD ratings

OSAD category | Identified themes | Median rating
Approach | Simulation planning and Debriefing approach | 4
Establishes learning environment | Simulation planning and Debriefing approach | 1
Engagement of the learners | Debriefing approach | 3
Descriptive reflection | Debriefing approach | 3
Reaction | Debriefing approach and Evaluation | 2
Analysis | Debriefing approach | 3
Diagnosis | Evaluation | 3
Application | Evaluation | 3
Overall improvement of facilitator's debriefing competency using the OSAD tool | Improving debriefing competency |

“I think I'm a little bit hard on them, because I, I assume they prepared. So I will go one by one and you must give me something it doesn't matter. Just what did you notice? Even if it's just it was a mild patient, something that you notice?” [ Participant 2 ]

Debriefing approach

Participants expressed that they established a safe space for students by creating a relaxed atmosphere at a venue separate from where the simulation activity took place. Student emotions were reported to have been explored during the initial debriefing phase as mentioned:

“Usually, I would start with just the feelings. So, what they thought..” – [ Participant 5 ]

Furthermore, guided questioning was used as a technique to enable students to discuss the case in a relaxed manner. One participant expressed:

“Okay, so after the simulation is done, we usually go into a debriefing room, which is next door to the simulation room, and we sit around a table. It's a very relaxed environment and situation, and everyone is given a chance to explain, for example, what they think went well, what they could have done better, what went bad and things they can improve on.” [Participant 1]

A participant indicated that it is difficult to move from “lecturer-mode” to “facilitator-mode” during a debriefing and expressed some uncertainty about their abilities as a facilitator. The participants provided examples from the clinical learning environment to facilitate the debriefing discussion. In addition, one participant used a technique in which they make statements and the students must explain whether these are true or false. In contrast, another participant said they expect the students to be prepared for the simulation and force participation from each student during the debriefing.

“So then I don't give them a choice to answer” [ Participant 2 ]

The theme aligned with six categories of the OSAD tool: approach, establishing learning environment, engagement of learners, reaction, descriptive reflection, and analysis [See Table 3]. Regarding the engagement of the learners category, the participants scored satisfactorily, indicating that they allowed the learners to participate in the discussion, but primarily through closed questions, and they did not actively invite contributions from the more passive learners. Within the reaction category, the participants scored low, illustrating that they did not acknowledge the learners' reactions or the emotional impact of the experience.

In the category of descriptive reflection, the participants scored satisfactorily, as they allowed for some description of events, largely provided by the facilitator, with little self-reflection by the learners. The participants scored satisfactorily in the category of analysis, as there was some exploration of the reasons for and consequences of the actions by the facilitator, without allowing the learners to relate these to previous experiences. In the application category, the participants scored satisfactorily, meaning they had some discussion of learning points and improvement strategies but limited discussion on applying this knowledge to future clinical practice.

Evaluation

From the semi-structured interviews, the theme “evaluation” was identified. The participants expressed that one way of evaluating the success of the debriefing is through the attainment of learning outcomes by students during the simulation activity. Others indicated that they ask the students whether they feel confident and competent about the specific case after the simulation activity. However, most participants do not conduct formal evaluations of simulation activities and debriefing with the students. Participant 4 commented:

“…I don't think we've ever evaluated the debriefing.” [Participant 4 ]

This theme aligned with the reaction, diagnosis, and application categories of the OSAD tool [See Table 3]. The participants scored low in the reaction category and satisfactorily in the diagnosis and application categories. Accordingly, the participants did not acknowledge the learners' reactions or the emotional impact of their simulation experience; instead, they provided feedback only on the clinical skills, focusing on errors rather than on behaviors that can be modified. In addition, the participants could discuss learning points and strategies for improvement but did not extend this to applying the knowledge to future clinical practice.

“We don't have a formal way of evaluating it.” [Participant 3]

The need to improve debriefing competency

All participants indicated that they wanted to improve their debriefing competency. They mentioned that the improvement should focus on achieving a structured debriefing approach. Participants expressed the need for training on enhancing student engagement during the debriefing. One participant also indicated needing more training on planning a successful simulation activity and debriefing.

“What are the gaps? [sic] and how as a lecturer I can be better in assisting, better in planning my simulation stations.” [ Participant 4 ]

Other competencies for which the participants felt they required training were leading students through a debriefing discussion and developing the competency of being a facilitator. One participant explained that they were not exposed to simulation during their undergraduate training and found it a challenge to use simulation as a teaching strategy:

“So to me, it's something that first I had to get my mind around the fact that this is the way you learn, and then I have to learn how to do it.” [ Participant 3 ]

Discussion

This study sought to describe the debriefing practices among educators engaged in the immersive simulation activities at a faculty of health sciences. Through observation and semi-structured interviews, the authors highlighted the practices and needs of educators regarding simulation debriefing, which are beneficial for designing and developing a faculty development initiative for simulation debriefing in health sciences in African settings.

The participants' approach to simulation debriefing included identifying the relevant role players and collaboratively planning the simulation. The plans focused on the technical and clinical case-based scenarios and the pre-briefing of the students. Participants established and maintained rapport throughout the debriefing process, using a non-threatening but honest approach to creating a psychologically safe environment for the students. The approach aligns with Muthathi et al. (2017),[22] who state that facilitators are integral to simulation activities in an academic program and must be involved in the planning.

The very low score on the establishing learning environment category implies that the participants had unclear expectations of the students, with no rules defining student engagement during their debriefing session.[20] However, a possible explanation for this deficiency could be the participants' assumption that their expectations of the students during the debriefing had been explained in the pre-briefing phase. Pre-briefing was beyond the scope of this study. Regardless, debriefing facilitators must re-emphasize the expected learning outcomes during the debriefing to align the feedback with the students' simulation experience, completing the feedback loop.[23]

Considering the diverse student population entering Africa's higher education institutions, students might feel uncomfortable in a debriefing environment or unaware of what is expected of them.[24] With most simulation-based learning being situated within the debriefing activity,[25] minimizing student uncertainty and fear by clarifying roles in the debrief is essential to foster learning. Contrary to the quantitative results, participants perceived themselves as exploring student reactions and emotions during the debrief. The development of emotional intelligence is essential for healthcare professionals, especially considering the complex working environments they are exposed to and the high rates of student stress, burnout, and suicidality.[26,27] Allowing students time to explore the emotional impact of the simulation activity and to identify their direction with consideration of future practice are essential steps in developing emotionally intelligent and reflective practitioners. Students require guidance to develop these important skills,[26] which may be achieved through guided questioning during a debriefing activity and a facilitator who refrains from adopting a didactic persona. An inability to leave the didactic lecturing persona behind, coupled with forced participation, hints at uncertainty in both the students and the facilitator and challenges the safety of the learning environment, negatively influencing learning. This study's results indicate a disconnect between the perceived abilities of participants regarding their debriefing approach and what transpires during the debriefing activity, highlighting the need to train facilitators to purposefully establish a learning environment where students are expertly guided to self-reflect.

Kolb's experiential learning theory[5] supports simulation-based learning through student engagement—by means of reflecting on and learning from their simulation experience—during the debriefing activity. Considering the experiential learning cycle proposed by Kolb,[5] students need to be afforded the opportunity through open-ended statements to safely explore their reactions, emotions and how to improve future practice based on their experiences for future active experimentation. Through collaborative discussions between peers during the debriefing activity, students may be assisted in developing abilities such as self-regulation, self-reflection, and mindfulness, which are essential attributes for healthcare professionals.[5,28]

Evaluation should be embedded as part of a simulation debriefing exercise. Educators must use evaluation approaches that allow for the collection of appropriate data to inform them on the quality of the simulation activities and of the simulation debriefing. In this study, educators did not engage in formal evaluation practices, as illustrated in Theme 3 of the qualitative results. As a means of evaluating debriefing, educators relied on their perception of the attainment of learning outcomes and on students' perceived confidence with the clinical procedure taught during the simulation activity. According to Kirkpatrick, this reported approach aligns with the lowest level of evaluating educational interventions.[29] The lowest level of evaluation focuses on students' reactions to the educational intervention, which may be of limited benefit to judging its overall value.[29] The findings of this study align with the work of Johnston et al. (2018),[30] whose literature review indicated a paucity of higher-level evaluation in simulation debriefing practices.[30] Strategies that empower educators to evaluate simulation debriefing should be integrated into a faculty development program for simulation debriefing.

Although self-assessment outcomes tend to overestimate performance,[31] a positive outcome of this study was that the educators were able to identify the need, and the desire, to improve their debriefing competency. In this case, the educators mentioned that they need education regarding simulation debriefing, conducting a simulation, and embracing a facilitator's role. A continuing professional development program for educators is therefore essential. This program must be tailor-made to the needs of the educators in practice, embedding the latest theories and thinking in education and simulation debriefing.[32] Fundamentally, we suggest that the program must present simulation-based education as a gestalt, allowing the educators to understand the fundamentals of simulation-based education. Specifics regarding simulation debriefing must be enacted and role-modeled for complete immersion of educators, demonstrating best practices. A transformative approach toward engaging educators should be integrated into the training program, re-focusing their function from being teachers to facilitators within a learning process.

This study was conducted at a single site, a large university in South Africa, namely the University (x). Although the sample size was small, the integration of multiple methods and perspectives supports the value and quality of the outcomes of this research. The OSAD tool, a reliable instrument that has been used extensively in evaluating simulation debriefing, was used in this study. The use of semi-structured interviews enriched the data by exploring facilitators' views and experiences, contributing to the development of a training program tailored to their specific needs.

Conclusion

Literature supports that simulation debriefing is an essential aspect of simulation-based education due to its vital contribution to the learning process for students. Poor debriefing practices by educators can potentially negatively influence learning from the simulation activity. Educators central to designing, developing, and implementing immersive simulations must be trained through continuing professional education programs designed based on their needs. In this study, the authors explored the simulation debriefing needs of educators at a faculty of health sciences at an African university through mixed methods research.

The simulation debriefing practices of the educators were sub-optimal and not aligned with standard educational practices. The educators acknowledged the limitations of their methods and the need for training related to simulation debriefing. A continuing professional development program must be developed for educators and should include:

  • Transformative approaches to facilitating learning,

  • Fundamentals of simulation-based education,

  • Modeled best-practice debriefing activities,

  • Appropriate strategies for evaluating debriefing activities.

Educators must be competent in simulation debriefing for simulation to be used as an effective teaching and learning tool that promotes student learning.

Ethics approval

The study was reviewed and approved by the Health Sciences Research Ethics Committee at the University of the Free State (UFS-HSD2020/1649/2411).

Consent to participate

All participants provided written informed consent prior to study participation.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.

Acknowledgements

The authors acknowledge Mrs. Elzarie Devenish for conducting the interviews and Prof. Ruth Albertyn for critically reading the work.

References

1. Atwater A. Redesigning Simulation Debriefing Practices of a Pre-licensure Baccalaureate Nursing Program. East Carolina University; 2018.
2. Phillips C, Bassell K, Fillmore L. Transforming nursing education through clinical faculty development. Teach Learn Nurs. 2019;14:47-53.
3. Dos Sants Noguieira De Goes F, Jackman D. Development of an instructor guide tool: "Three stages of holistic debriefing". Rev Latino-Am Enferm. 2020;28:e3229. doi: 10.1590/1518-8345.3089.3229.
4. Motola I, Devine LA, Chung HS, Sullivan JE, Issenberg SB, Barry S, et al. Simulation in healthcare education: A best evidence practical guide. AMEE Guide No. 82. Med Teach. 2013;35:e1511-30. doi: 10.3109/0142159X.2013.818632.
5. Debasish L, Vasudevan K, Dhasaram P, Mathiyalagen P. Sensitizing the medical undergraduates to qualitative research: In the context of experiential learning debriefings. J Educ Health Promot. 2022;11:130. doi: 10.4103/jehp.jehp_807_21.
6. Verkuyl M, Lapum JL, St-Amant O, Betts L, Hughes M. An exploration of debriefing in virtual simulation. Clin Simul Nurs. 2017;13:519-94.
7. Botha BS, Hugo-van Dyk L, Nyoni CN. The reality of virtual reality at a South African university during the COVID-19 pandemic. Afr J Health Prof Educ. 2021;13:199-200.
8. Kolbe M, Grande B, Spahn DR. Briefing and debriefing during simulation-based training and beyond: Content, structure, attitude and setting. Best Pract Res Clin Anaesthesiol. 2015;29:87-96. doi: 10.1016/j.bpa.2015.01.002.
9. Omaswa F, Kiguli-Malwadde E, Donkor P, Hakim J, Derbew M, Baird S, et al. The Medical Education Partnership Initiative (MEPI): Innovations and lessons for health professions training and research in Africa. Ann Glob Health. 2018;84:160-9. doi: 10.29024/aogh.8.
10. Walsh A, Koppula S, Antao V, Bethune C, Cameron S, Cavett T, et al. Preparing teachers for competency-based medical education: Fundamental teaching activities. Med Teach. 2018;40:80-5. doi: 10.1080/0142159X.2017.1394998.
11. McLoughlin C, Patel KD, O'Callaghan T, Reeves S. The use of virtual communities of practice to improve interprofessional collaboration and education: Findings from an integrated review. J Interprof Care. 2018;32:136-42. doi: 10.1080/13561820.2017.1377692.
12. Mackintosh-Franklin C. Pedagogical principles underpinning undergraduate Nurse Education in the UK: A review. Nurse Educ Today. 2016;40:118-22. doi: 10.1016/j.nedt.2016.02.015.
13. Hatala R, Cook DA, Zendejas B, Hamstra SJ, Brydges R. Feedback for simulation-based procedural skills training: A meta-analysis and critical narrative synthesis. Adv Health Sci Educ. 2014;19:251-72. doi: 10.1007/s10459-013-9462-8.
14. Vandewaetere M, Manhaeve D, Aertgeerts B, Clarebout G, Van Merriënboer JJ, Roex A. 4C/ID in medical education: How to design an educational program based on whole-task learning: AMEE Guide No. 93. Med Teach. 2015;37:4-20. doi: 10.3109/0142159X.2014.928407.
15. Raney JH, Medvedev MM, Cohen SR, Spindler H, Ghosh R, Christmas A, et al. Training and evaluating simulation debriefers in low-resource settings: Lessons learned from Bihar, India. BMC Med Educ. 2020;20:9. doi: 10.1186/s12909-019-1906-2.
16. Botma Y. Nursing student's perceptions on how immersive simulation promotes theory–practice integration. Int J Africa Nurs Sci. 2014;1:1-5.
17. van Wyk R. The Role of a Sustainable Integrated Systems Approach to High-Technology Clinical Simulation in South Africa (PhD thesis). University of the Free State; 2021. Available from: http://hdl.handle.net/11660/11682.
18. Bryman A. Social Research Methods. 4th ed. Oxford: Oxford University Press; 2012.
19. Creswell JW. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches. SAGE Publications; 2013. p. 305. Available from: https://books.google.co.ls/books?id=EbogAQAAQBAJ.
20. Arora S, Ahmed M, Paige J, Nestel D, Runnacles J, Hull L, et al. Objective structured assessment of debriefing: Bringing science to the art of debriefing in surgery. Ann Surg. 2012;256:982-8. doi: 10.1097/SLA.0b013e3182610c91.
21. Saldana J. The Coding Manual for Qualitative Researchers. 3rd ed. Thousand Oaks, CA: SAGE; 2016.
22. Muthathi IS, Thurling CH, Armstrong SJ. Through the eyes of the student: Best practices in clinical facilitation. Curationis. 2017;40:e1-8. doi: 10.4102/curationis.v40i1.1787.
23. Cheng A, Eppich W, Kolbe M, Meguerdichian M, Bajaj K, Grant V. A conceptual framework for the development of debriefing skills: A journey of discovery, growth, and maturity. Simul Healthc. 2020;15:55-60. doi: 10.1097/SIH.0000000000000398.
24. van der Merwe LJ, van Zyl GJ, St Clair Gibson A, Viljoen M, Iputo JE, Mammen M, et al. South African medical schools: Current state of selection criteria and medical students' demographic profile. S Afr Med J. 2015;106:76-81. doi: 10.7196/SAMJ.2016.v106i1.9913.
25. Nestel D, Gough S. Designing simulation-based learning activities: A systematic approach. Healthc Simul Educ. 2017:135-42.
26. Smith S, Kassam A, Griggs L, Rizzuti F, Horton J, Brown A. Teaching mindfulness-based stress management techniques to medical learners through simulation. Can Med Educ J. 2021;12:e95-7. doi: 10.36834/cmej.69821.
27. Foster K, Fethney J, Kozlowski D, Fois R, Reza F, McCloughen A. Emotional intelligence and perceived stress of Australian pre-registration healthcare students: A multi-disciplinary cross-sectional study. Nurse Educ Today. 2018;66:51-6. doi: 10.1016/j.nedt.2018.04.001.
28. Duers LE. The learner as co-creator: A new peer review and self-assessment feedback form created by student nurses. Nurse Educ Today. 2017;58:47-52. doi: 10.1016/j.nedt.2017.08.002.
29. Cahapay M. Kirkpatrick model: Its limitations as used in higher education evaluation. Int J Assess Tools Educ. 2021;8:135-44.
30. Johnston S, Coyer FM, Nash R. Kirkpatrick's evaluation of simulation and debriefing in health care education: A systematic review. J Nurs Educ. 2018;57:393-8. doi: 10.3928/01484834-20180618-03.
31. Mazor M, Fleming SM. The Dunning-Kruger effect revised. Nat Hum Behav. 2021;5:677-8. doi: 10.1038/s41562-021-01101-z.
32. Farzi S, Shahriari M, Farzi S. Exploring the challenges of clinical education in nursing and strategies to improve it: A qualitative study. J Educ Health Promot. 2018;7:115. doi: 10.4103/jehp.jehp_169_17.
