Abstract
Introduction
Debriefing plays a vital role in effective simulation-based learning by providing feedback that fosters students’ development of critical thinking.
Objectives
We evaluated the effects of rubric-based debriefing on students’ critical thinking and their level of confidence in performing tasks and communication.
Method
This is a quasi-experimental study. Final year nursing undergraduates (n=204) of a local university participated. In the intervention group, students individually performed two 20 min simulation sessions with simulated patients, with a 15 min individual rubric-based debriefing between the two sessions and a 5 min debriefing at the end. In the control group, students performed the same simulation sessions followed by a single 20 min individual rubric-based debriefing at the end. The primary outcome was students’ critical thinking, assessed with the Northeastern Illinois University Critical Thinking Rubric and compared between the two groups. Qualitative data were collected with an open-ended question.
Results
Based on generalised estimating equation models, the intervention effect over time was statistically significant (β=2.06, 95% CI 1.04 to 3.08) in enhancing students’ critical thinking. No statistically significant differences were found in self-perceived confidence levels between the intervention group and control group. Qualitative data reflected positive feedback from students on the simulation activities.
Conclusions
This is the first study to provide evidence that a rubric-based debriefing enhances students’ critical thinking in simulation learning.
Keywords: simulation education, nursing student, rubric, debriefing
Key messages
What is already known on this subject
Debriefing plays a vital role in ensuring effective simulation-based learning.
Debriefing has been demonstrated to foster nursing undergraduates’ development of critical thinking in simulation education.
Structured debriefing by using a competency checklist is common in simulation education.
What this study adds
We adopted an innovative structured debriefing using a grading rubric during simulation activities.
This study provides the first evidence that rubric-based debriefing enhances nursing students’ critical thinking in simulation learning.
Introduction
Background
In the last decade, simulation-based learning has been incorporated into nursing education.1 It not only provides a platform for students to amplify real experiences in a systematic and interactive manner, but also develops their knowledge and skills while protecting patients from unnecessary risk.2 With simulations that replicate real clinical scenarios, students can be trained in technical, problem-solving, decision-making, interpersonal and communication skills through learning by doing. Compared with traditional lecture-based learning, simulation-based learning is not restricted to the cognitive dimension but extends to the emotional and psychomotor dimensions, enhancing clinical competency.3
Post simulation debriefing plays a vital role in maximising simulation-based learning in a systematic manner.4 It is an intentional discussion that provides feedback to students and allows them to obtain a clear understanding of their actions via a reflective learning process.5 The reflective learning process comprises three stages: awareness, critical analysis and new perspectives.6 Although students may be aware of their own actions in a simulation, a systematic facilitator-led discussion during debriefing provides concrete feedback to analyse and reconstruct what happened in the scenario and to develop new strategies in a collaborative manner.7
A meta-analysis found that post simulation debriefing had positive effect sizes for all outcomes compared with no debriefing (pooled effect size range 0.28–2.16); however, debriefing designs and approaches varied, and different features (eg, video-assisted vs non-video-assisted debriefing) showed mixed or non-significant results.8 In general, post simulation debriefing can be classified as following a semistructured or a structured approach. In the semistructured approach, the facilitator-led discussion is based on several guiding principles. For example, the debriefing model developed by Dreifuerst9 was based on three basic questions: (1) what went wrong; (2) what went right; and (3) what could you consider doing differently next time? Another example is the 3D Model of debriefing,10 which comprises defusing (allowing students to vent any emotions related to the learning activities), discovering (allowing students to reflect on their performance and identify gaps in their knowledge) and deepening (allowing students to consider how they can apply what they learnt to future situations). The benefits of the semistructured approach include openness and flexibility, which allow reflection and then consolidation of learning. One challenge is finding a skilful facilitator to lead the discussion during debriefing. It may also take time to build rapport and create an environment of trust between student and facilitator before the debriefing session.3 Notably, poorly facilitated debriefings may create an adverse learning experience for students and degrade their self-reflection.11
The structured approach of debriefing provides a framework for a facilitator to deliver the debriefing. Several frameworks are commonly used in debriefing, for example, Tanner’s Debriefing With Good Judgement,12 Dreifuerst’s Debriefing for Meaningful Learning9 and Kuiper’s Outcome Present-State Test model.13 Apart from a structured framework, it is also common to use real-time video to assist structured debriefing by providing feedback based on observation of the videotaped skills performance.14 However, to deliver an effective video-facilitated debriefing, the facilitator should have the technical expertise to handle the equipment, be able to set up the debriefing to play the video and allow sufficient time for the debriefing session.15 In addition, a systematic review of 10 randomised controlled trials found no practical difference in outcomes between video-facilitated and traditional debriefing.8 16 In another example of the structured approach, an objective structured clinical examination was used to assess clinical skill competency among students.17 The facilitator used a standard checklist, typically agreed by a group of experts, to evaluate and score the student’s performance. Although the checklist provided a structured guide for debriefing, its pitfalls included difficulty in differentiating students’ performance across different ranges or levels of competency.18
To perform structured debriefing with an indication of the level of competency, a graded rubric may be an appropriate tool. A rubric, in education, is a document that articulates the expectations for an assignment by listing the marking criteria, and a graded rubric is a rubric whose marking criteria are scaled from poor to excellent.19 A graded rubric retains the openness of non-structured debriefing, allowing self-assessment and self-reflection in the learning process. It also provides novice facilitators with an evaluation standard and enhances their judgement of students’ performance.20 However, there is a dearth of studies evaluating rubric-based debriefing in simulation education.
Theoretical framework
Although there is a variety of debriefing frameworks, they share reflective learning steps that complete the experiential learning cycle, in particular, moving the learner from observation to the development of new concepts.21 Kolb22 described experiential learning as a cyclical process: doing a task to collect concrete experience (feeling), which serves as the basis for observation (reflection); reflecting on what has been done and assimilating and distilling it into abstract concepts (thinking); and applying those abstract concepts to form new ideas that can be tested again (doing). By using rubrics in debriefing, learners can understand the performance indicators and criteria and receive constructive feedback from the facilitator during the debriefing session.23
Objectives
In this study, we adopted an innovative structured debriefing during simulation activities. Instead of a checklist, we used a grading rubric to identify the development of students’ critical thinking and to communicate the criteria for successfully handling the simulated scenario. With rubrics, the facilitators had clear observational measures of students’ performance and consistent metrics for conducting the debriefing. There has been no clear evidence on debriefing designs in simulation-based learning, nor any evaluation of rubric-based debriefing.8 Therefore, in this study, we evaluated the effects of rubric-based debriefing on students’ critical thinking and level of confidence in performing tasks and communication. We believe that this study will provide scientific evidence on efficacious debriefing.
Method
Design
The study employed a quasi-experimental design,24 a design commonly used in educational evaluation, to examine the effects of rubric-based debriefing on students’ (1) critical thinking and (2) level of confidence in performing tasks and communication.
Participants and setting
The whole class of final year nursing undergraduates (n=204) was invited to join the study through convenience sampling.25 They were selected because they had acquired clinical knowledge in different nursing specialties through classroom teaching and clinical practica over the past 4 years and had completed all basic and specialty nursing skills in the curriculum. The students were evaluated by facilitators in this study. All facilitators were nursing educators with at least 3 years of experience in nursing education, including facilitation experience in classroom and clinical settings.
Instruments
Northeastern Illinois University (NEIU) Critical Thinking Rubric: this rubric was developed by the NEIU Center for Teaching and Learning.26 It assessed six dimensions with the following criteria:
- Issues: to be able to identify and concisely explain the patient’s current problems.
- Context: to be able to recognise the influence of context, including the patient’s cultural/social, educational and family background.
- Perspectives: to be able to formulate one’s own perspective and consider other salient perspectives relevant to the patient’s problems.
- Assumptions: to be able to evaluate the key assumptions behind the claims and recommendations made.
- Evidence: to be able to provide supporting data relevant to the actions taken.
- Implications: to be able to identify and evaluate conclusions and implications.
Four performance levels, from ‘limited or no proficiency’ (score=1) to ‘high proficiency’ (score=4), were provided for facilitators to assess a student’s performance and to guide constructive feedback during debriefing. Clear guidelines were provided for assessing the performance level in each dimension. For example, for the first dimension, students scored four points if they assessed the vital signs, blood glucose and consciousness level of the patient, reported the findings and handed over the situation to the facilitators accurately with interpretation. They scored one point if they performed only some of the assessment criteria without reporting or taking further action. The total rubric score for a simulated scenario ranged from 6 to 24.
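As an illustration only (not part of the study materials), the scoring scheme can be sketched as a small data structure in Python: six dimensions, each rated from 1 to 4, summing to a total of 6 to 24. All function and variable names below are hypothetical.

```python
# Minimal sketch of the NEIU Critical Thinking Rubric scoring scheme.
# Dimension names follow the list above; everything else is illustrative.

NEIU_DIMENSIONS = [
    "Issues", "Context", "Perspectives",
    "Assumptions", "Evidence", "Implications",
]
MIN_LEVEL, MAX_LEVEL = 1, 4  # 'limited or no proficiency' .. 'high proficiency'

def total_rubric_score(ratings: dict[str, int]) -> int:
    """Sum the six dimension ratings; valid totals range from 6 to 24."""
    assert set(ratings) == set(NEIU_DIMENSIONS), "all six dimensions must be rated"
    assert all(MIN_LEVEL <= r <= MAX_LEVEL for r in ratings.values())
    return sum(ratings.values())

# Example: high proficiency (4) on Issues and Evidence, level 2 elsewhere.
print(total_rubric_score({
    "Issues": 4, "Context": 2, "Perspectives": 2,
    "Assumptions": 2, "Evidence": 4, "Implications": 2,
}))  # -> 16
```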
Demographics: this included items such as gender, prior working experience as part-time nursing staff and previous training in basic life support and advanced cardiovascular life support.
Self-perceived confidence level: the student’s self-perceived confidence level was assessed by individual self-rated items on a 7-point Likert scale. The questions were:
- ‘What is your confidence level in assessing patient needs?’
- ‘What is your confidence level in performing accurate assessment?’
- ‘What is your confidence level in identifying patient problems?’
- ‘What is your confidence level in prioritising patient needs?’
- ‘What is your confidence level in implementing nursing procedures?’
- ‘What is your confidence level in evaluating the effects of nursing procedures?’
- ‘What is your confidence level in communication?’
Higher scores indicated a higher self-perceived confidence level.
Qualitative evaluation: after the simulation sessions, students were asked to respond to an open-ended question, ‘do you have any suggestions for improving the simulation exercise?’
Data collection
A briefing session was arranged to inform the students about the project aim and objectives. The class was divided into the intervention group and control group according to student preference for the date and time of the simulation activities. The students did not know their group assignment. On the simulation activity day, after completing written consent forms, they completed a pretest questionnaire on demographics and self-perceived confidence level. A briefing on the logistics of the simulation sessions and the clinical environment followed. Expectations of the nursing role in caring for two simulated patients, including assessment, planning, nursing diagnosis, management, evaluation and supporting the patients through illness, were emphasised. The primary outcome was assessed with the NEIU Critical Thinking Rubric at the end of each simulation session. After the two simulation sessions, the students completed a post-test questionnaire on their self-perceived confidence level and the qualitative evaluation.
Simulation sessions
Each student performed two simulation sessions individually with simulated patients. In the intervention group, students worked on two 20 min simulation sessions with a 15 min individual rubric-based debriefing between the sessions and another 5 min individual rubric-based debriefing at the end of the second session. In the control group, students worked on the two 20 min simulation sessions without interim debriefing; owing to ethical considerations, a 20 min individual rubric-based debriefing was given at the end of the second session. During debriefing, focused and constructive feedback was provided by the facilitator according to the NEIU Critical Thinking Rubric. Figure 1 illustrates the simulation sessions in both groups and the timeline of data collection.
Figure 1. The simulation sessions in both the intervention group and the control group. NEIU CTR, Northeastern Illinois University Critical Thinking Rubric.
The simulated scenarios focused on admitting patients and were specially designed so that the signs and symptoms of the simulated patients resembled each other. They included patients with provisional diagnoses of (1) hypoglycaemia, (2) anaemia, (3) hepatic encephalopathy and (4) lumbar spinal cord compression. All simulated patients presented with sudden onset of left-side weakness or dizziness after hitting the left side of the forehead on a wash basin when going to the toilet. None had loss of consciousness or external physical wounds at hospital admission. The simulated patients were recruited from community centres and selected through interviews to ensure their suitability for training. A 3 hour training session was given to each simulated patient to ensure that their emotional states, behaviours and responses to physical examination fitted the designed scenarios. On the simulation activity day, they were given light makeup and dressed in standardised patient sleepwear.
Data analysis
Descriptive statistics were used to describe the participants’ characteristics. Generalised estimating equation (GEE) models were fitted to examine the intervention effects over time while accommodating the extra covariance among the repeated measures.27 Specifically, for each outcome, we estimated the intervention effect at each time point after adjusting for the baseline value and using an unstructured working correlation matrix. The intention-to-treat principle was applied to the analysis. All estimates were accompanied by 95% CIs, and the nominal level of significance was 5%. IBM SPSS for Windows V.23.0 was used for the analysis. For the qualitative data, content analysis was used to gain a holistic view of both context and meaning by reviewing the raw data, condensing it into meaning units and then transferring these into categories.28 This method goes beyond counting relevant words to examining their meaning in order to form categories. Table 1 shows an example of the analysis process; an illustrative sketch of the GEE specification follows the table.
Table 1.
An example showing the process of content analysis
| Categories | Code | Meaning unit | Raw data |
| Benefits of simulated activities | Learning facilitation | Learning without stress | The scenarios are vivid and we can learn without stress |
| | | Understanding own ability | The debriefing session enabled us to know more about our own ability when facing a real situation |
| | | Improving knowledge | Simulation also helped to improve my knowledge |
| | | Providing chance of skill learning | More stimulation activities would better improve skills |
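To make the GEE specification above concrete, the following is a minimal sketch of how such a model could be fitted with Python’s statsmodels. This is not the authors’ code (the study used SPSS), and the data layout and column names (student_id, score, time, group) are assumptions. With only two time points, an exchangeable working correlation estimates the same single off-diagonal parameter as the unstructured matrix described above.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Long-format data, one row per student per time point (hypothetical file and columns):
# student_id, score (NEIU rubric total), time (0 = scenario 1, 1 = scenario 2),
# group (0 = control, 1 = intervention).
df = pd.read_csv("rubric_scores_long.csv")

model = smf.gee(
    "score ~ time * group",                   # time x group term = intervention effect over time
    groups="student_id",                      # repeated measures clustered within student
    data=df,
    family=sm.families.Gaussian(),
    cov_struct=sm.cov_struct.Exchangeable(),  # one correlation parameter; equivalent to an
)                                             # unstructured matrix when there are two time points
result = model.fit()
print(result.summary())                       # coefficients with robust SEs and 95% CIs, as in table 3
```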
Results
Written consent forms, demonstrating understanding of and willingness to participate in the evaluation, were obtained from students before they completed the questionnaires. The majority of the 204 nursing students were women (n=153, 75.0%) and had working experience as part-time nursing staff (n=173, 84.8%). Only around 25% had received outside training in basic life support, and only three had received advanced cardiovascular life support training from the American Heart Association in addition to their nursing education. Table 2 shows the participants’ characteristics in the intervention and control groups.
Table 2.
Participants’ characteristics (n=204)
| Variables (missing data), n (%) | Overall (n=204) | Intervention group (n=104) | Control group (n=100) | Χ² p value |
| Female | 153 (75.0) | 70 (67.3) | 83 (83.0) | 0.01* |
| Working experience as part-time nursing staff | 173 (85.6) | 85 (82.5) | 88 (88.9) | 0.20 |
| Received basic life support training | 50 (24.9) | 25 (24.5) | 25 (25.3) | 0.90 |
| Received advanced cardiovascular life support training | 3 (1.5) | 1 (1.0) | 2 (2.0) | 0.54 |
*P<0.05.
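As a side note, the group comparisons in table 2 can be reproduced with a standard chi-squared test; the sketch below (illustrative, not the authors’ code) uses the gender row. The continuity correction is disabled because the uncorrected test matches the reported p value of about 0.01; the paper does not state which variant was used.

```python
from scipy.stats import chi2_contingency

# 2x2 contingency table for gender, taken from table 2:
# rows = intervention, control; columns = female, male.
observed = [[70, 104 - 70],   # intervention: 70 of 104 female
            [83, 100 - 83]]   # control: 83 of 100 female

chi2, p, dof, expected = chi2_contingency(observed, correction=False)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")  # approx. chi2 = 6.70, df = 1, p = 0.010
```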
The GEE results showed that the time of assessment (time 2 vs time 1) and the grouping (intervention vs control) each exerted main effects on critical thinking in performing nursing care tasks. A time × group interaction effect was also observed (table 3).
Table 3.
Generalised estimating equation models for primary and secondary outcomes (n=204)
| Outcomes | Intervention | Control | Time (time 2 − time 1) | | Group (intervention − control) | | Time × group | |
| | Mean (SE) | Mean (SE) | Estimated effect | 95% CI | Estimated effect | 95% CI | Estimated effect | 95% CI |
| NEIU Critical Thinking Rubric Score | ||||||||
| Time 1 (case scenario 1) | 10.31 (1.18) | 10.44 (1.18) | ||||||
| Time 2 (case scenario 2) | 12.87 (1.27) | 10.95 (1.21) | 2.57* | 1.77 to 3.36 | 1.92* | 1.00 to 2.85 | 2.06* | 1.04 to 3.08 |
| Confidence in… | ||||||||
| a) Assessing patient needs | ||||||||
| Time 1 (pretest) | 4.02 (0.16) | 4.02 (0.16) | ||||||
| Time 2 (post-test) | 4.19 (0.20) | 3.84 (0.20) | 0.18 | 0.05 to 0.40 | 0.36* | 0.04 to 0.68 | 0.36* | 0.01 to 0.71 |
| b) Performing accurate assessment | ||||||||
| Time 1 (pretest) | 3.99 (0.17) | 3.95 (0.18) | ||||||
| Time 2 (post-test) | 4.13 (0.22) | 3.95 (0.20) | 0.15 | −0.11 to 0.40 | 0.18 | −0.16 to 0.52 | 0.15 | −0.22 to 0.52 |
| c) Identifying patient problems | ||||||||
| Time 1 (pretest) | 3.87 (0.16) | 3.85 (0.16) | ||||||
| Time 2 (post-test) | 3.98 (0.21) | 3.81 (0.20) | 0.11 | −0.14 to 0.36 | 0.17 | −0.15 to 0.49 | 0.15 | −0.20 to 0.50 |
| d) Prioritising patient needs | ||||||||
| Time 1 (pretest) | 3.87 (0.17) | 3.86 (0.17) | ||||||
| Time 2 (post-test) | 4.03 (0.21) | 3.77 (0.19) | 0.17 | −0.10 to 0.43 | 0.25 | −0.09 to 0.60 | 0.25 | −0.13 to 0.62 |
| e) Implementing nursing procedures | ||||||||
| Time 1 (pretest) | 3.95 (0.18) | 3.94 (0.19) | ||||||
| Time 2 (post-test) | 4.07 (0.23) | 3.81 (0.21) | 0.12 | −0.14 to 0.37 | 0.26 | −0.08 to 0.59 | 0.24 | −0.12 to 0.60 |
| f) Evaluating effects of nursing procedures | ||||||||
| Time 1 (pretest) | 3.94 (0.18) | 3.94 (0.18) | ||||||
| Time 2 (post-test) | 4.00 (0.22) | 3.84 (0.22) | 0.06 | −0.20 to 0.32 | 0.16 | −0.18 to 0.49 | 0.15 | −0.21 to 0.51 |
| g) Communication | ||||||||
| Time 1 (pretest) | 4.09 (0.17) | 4.08 (0.17) | ||||||
| Time 2 (post-test) | 4.15 (0.20) | 3.95 (0.19) | 0.06 | −0.18 to 0.30 | 0.20 | −0.10 to 0.51 | 0.19 | −0.14 to 0.53 |
*P<0.05.
NEIU, Northeastern Illinois University.
The mean total NEIU Critical Thinking Rubric score at time 2 was significantly higher than at time 1 in both the intervention and control groups (time effect β=2.57, 95% CI 1.77 to 3.36), indicating that both groups enhanced their critical thinking by the second simulation activity. In addition, the improvement was significantly greater in the intervention group (time × group β=2.06, 95% CI 1.04 to 3.08), indicating that the intervention group performed better in critical thinking than the control group. There was a statistically significant but small increase in confidence in assessing patient needs in the intervention group (β=0.36, 95% CI 0.01 to 0.71) that was not observed in the control group. However, no significant differences were found in the other confidence domains: performing accurate assessment, identifying patient problems, prioritising patient needs, implementing nursing procedures, evaluating the effects of nursing procedures and communication.
Although only one question (‘Do you have any suggestions for improving the simulation exercise?’) was asked as the qualitative evaluation, 101 students engaged with it and provided feedback on their experience of the simulation activities. The qualitative findings were classified into the following three categories.
Benefits of simulated activities
The students in general commented positively on the simulation activities. The majority (88 of 101, 87%) mentioned that the simulated activities improved their knowledge and were very helpful to their learning. They also commented that the scenarios were realistic.
“the scenarios are vivid and we can learn without stress”
“the debriefing session enabled us to know more about our own ability when facing a real situation”
Time arrangement for simulation activities
Half of the class (49 of 101, 49%) provided feedback on the logistical arrangement of the simulation sessions. They suggested allowing more time for each simulation task, for debriefing and discussion, and for orientation to the environment, the equipment and the patient assessment forms. The most encouraging suggestion from students was
“allow more time for students to assess and evaluate themselves on missed points during debriefing”
Future improvement of simulation activities
Regarding future improvement, many students (66 of 101, 65%) suggested organising more simulation activities. Some students suggested more challenging case scenarios and interprofessional simulation.
“suggest to include medical students or simulated physicians”
“add more problematic and difficult acts from simulated patients”
“the scenario could be more difficult”
Discussion
This is the first study to provide evidence that rubric-based debriefing enhances students’ critical thinking in simulation learning. Although the findings showed that students in the control group were able to enhance their critical thinking through self-reflection, the effect was larger among the intervention group students, who received a rubric-based debriefing. A few studies have evaluated rubric-based debriefing in simulation education, for example, to enhance teamwork29 and clinical judgement.30 Critical thinking is difficult to teach and train in didactic lecture-based teaching.31 The simulation activities in this study, however, allowed students to assess, recognise, analyse, interpret and show evidence for their problem solving with simulated patients. Through communication with the simulated patients, students performed the full nursing process, from assessment to evaluation. The facilitators used the rubrics to focus their observation and assessment on specific items and delivered a structured debriefing according to each student’s achievement.20 The rubrics not only defined the key dimensions of critical thinking, but also illustrated the performance levels in each dimension, which made the debriefing session specific. Furthermore, the rubrics gave the debriefing a structure that accommodated reflection, skill performance and decision-making. A recent study using the Lasater Clinical Judgment Rubric (LCJR)32 also demonstrated the effectiveness of rubric-based debriefing in improving students’ clinical judgement.33 The LCJR contains four dimensions: (1) noticing the situation; (2) interpreting the available data; (3) responding with appropriate actions; and (4) reflecting on one’s practice, behaviours and clinical judgement. A structured rubric-based debriefing based on the LCJR was found to improve students’ clinical judgement (F=2.21, p=0.022). Our study further supports the use of rubrics as both an assessment tool and a teaching tool for debriefing in simulation education.34
There was no statistically significant difference between the intervention and control groups in confidence levels for performing accurate assessment, identifying patient problems, prioritising patient needs, implementing nursing procedures, evaluating the effects of nursing procedures and communication. As we conducted only two simulated activities, the effects on confidence levels remain unclear; longitudinal studies are needed to investigate them.
The qualitative findings supported that simulation and structured debriefing were beneficial to the students and satisfied their learning needs; they even requested more simulation activities and extended debriefing time in the future. A previous study also reported that structured debriefing helped students with learning, scenario analysis and mapping of concepts.30 In addition, the qualitative findings indicated the need to incorporate peer assessment in the simulation activities. The students also suggested an interprofessional approach to simulation activities. Qualitative studies have already demonstrated that interprofessional simulation activities are effective in improving students’ knowledge and communication,35 situation awareness36 and teamwork.37 Future research should evaluate critical thinking in interprofessional simulation activities involving medical and nursing students.
Limitations
There were some limitations in this study. First, we only evaluated the effects of rubric-based debriefing on critical thinking but did not evaluate the rubric-based debriefing process. Second, a quasi-experimental design was used in this study. Replication of this study and evaluation with a randomised controlled trial would yield more robust results on the effects of rubric-based debriefing in enhancing students’ critical thinking in simulation education.
Implications for nursing education
This study provides evidence that rubric-based debriefing after a simulation activity enhances students’ critical thinking. In the classroom, interprofessional simulation education should be further explored and expanded; interprofessional rubric-based debriefing may offer additional benefits for nursing students in handling patients in real-life situations. In the clinical setting, a short but immediate debriefing after nursing procedures would also help students’ learning and reflection. It is suggested that clinical mentors provide a short debriefing for their mentees during a clinical practicum day, rather than a long debriefing at its end, for effective learning.
Conclusions
This study evaluated the effects of rubric-based debriefing on students’ critical thinking after simulated activities. The difference in overall critical thinking scores between the intervention and control groups was statistically significant, indicating a positive effect. The qualitative findings also demonstrated the feasibility and benefits of conducting rubric-based debriefing in simulation education. The findings provide scientific evidence on efficacious debriefing and suggest that a short rubric-based debriefing after the first scenario is better than a single longer rubric-based debriefing after the second scenario. Further research with more rigorous designs and additional evaluation of the debriefing process is needed to provide empirical evidence of the effects on students’ critical thinking.
Footnotes
Contributors: JYHW is the project leader, developed the research design and drafted the manuscript. MMKC, VWYT, MTHP and CKYC developed the case scenarios, implemented the simulation activities and drafted the manuscript. PHC developed the research design, analysed and interpreted the data, and reviewed the manuscript. AT developed the overall research design and provided expert advice on simulation education.
Funding: This study was supported by a Teaching Development Grant (Project no. 16/611) awarded by the University Grants Council in Hong Kong.
Competing interests: None declared.
Ethics approval: Ethical approval was gained from the Institutional Review Board of the University of Hong Kong/Hospital Authority Hong Kong West Cluster (HKU/HA HKW IRB Ref. UW 16-556).
Provenance and peer review: Not commissioned; externally peer reviewed.
Data availability statement: No data are available.
References
- 1. Cant RP, Cooper SJ. Use of simulation-based learning in undergraduate nurse education: an umbrella systematic review. Nurse Educ Today 2017;49:63–71. doi:10.1016/j.nedt.2016.11.015
- 2. Lateef F. Simulation-based learning: just like the real thing. J Emerg Trauma Shock 2010;3:348–52. doi:10.4103/0974-2700.70743
- 3. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc 2007;2:115–25. doi:10.1097/SIH.0b013e3180315539
- 4. Cheng A, Grant V, Dieckmann P, et al. Faculty development for simulation programs: five issues for the future of debriefing training. Simul Healthc 2015;10:217–22. doi:10.1097/SIH.0000000000000090
- 5. Abulebda K, Auerbach M, Limaiem F. Debriefing techniques utilized in medical simulation. Treasure Island, FL: StatPearls Publishing, 2019.
- 6. Brockbank A, McGill I, Beech N. Reflective learning in practice. Routledge, 2017: 18–28.
- 7. Mariani B, Cantrell MA, Meakim C. Nurse educators’ perceptions about structured debriefing in clinical simulation. Nurs Educ Perspect 2014;35:330–1. doi:10.5480/13-1190.1
- 8. Cheng A, Eppich W, Grant V, et al. Debriefing for technology-enhanced simulation: a systematic review and meta-analysis. Med Educ 2014;48:657–66. doi:10.1111/medu.12432
- 9. Dreifuerst KT. Using debriefing for meaningful learning to foster development of clinical reasoning in simulation. J Nurs Educ 2012;51:326–33. doi:10.3928/01484834-20120409-02
- 10. Zigmont JJ, Kappus LJ, Sudikoff SN. The 3D model of debriefing: defusing, discovering, and deepening. Semin Perinatol 2011;35:52–8. doi:10.1053/j.semperi.2011.01.003
- 11. Palaganas JC, Fey M, Simon R. Structured debriefing in simulation-based education. AACN Adv Crit Care 2016;27:78–85. doi:10.4037/aacnacc2016328
- 12. Tanner CA. Thinking like a nurse: a research-based model of clinical judgment in nursing. J Nurs Educ 2006;45:204–11. doi:10.3928/01484834-20060601-04
- 13. Kuiper R, Heinrich C, Matthias A, et al. Debriefing with the OPT model of clinical reasoning during high fidelity patient simulation. Int J Nurs Educ Scholarsh 2008;5:1–14. doi:10.2202/1548-923X.1466
- 14. Grant JS, Moss J, Epps C, et al. Using video-facilitated feedback to improve student performance following high-fidelity simulation. Clin Simul Nurs 2010;6:e177–84. doi:10.1016/j.ecns.2009.09.001
- 15. Kessler DO, Cheng A, Mullan PC. Debriefing in the emergency department after clinical events: a practical guide. Ann Emerg Med 2015;65:690–8. doi:10.1016/j.annemergmed.2014.10.019
- 16. Levett-Jones T, Lapkin S. A systematic review of the effectiveness of simulation debriefing in health professional education. Nurse Educ Today 2014;34:e58–63. doi:10.1016/j.nedt.2013.09.020
- 17. Casey PM, Goepfert AR, Espey EL, et al. To the point: reviews in medical education – the objective structured clinical examination. Am J Obstet Gynecol 2009;200:25–34. doi:10.1016/j.ajog.2008.09.878
- 18. Chumley HS. What does an OSCE checklist measure? Fam Med 2008;40:589–91.
- 19. Reddy YM, Andrade H. A review of rubric use in higher education. Assess Eval High Educ 2010;35:435–48. doi:10.1080/02602930902862859
- 20. Brookhart SM, Chen F. The quality and effectiveness of descriptive rubrics. Educ Rev 2015;67:343–68. doi:10.1080/00131911.2014.929565
- 21. Cant RP, Cooper SJ. The benefits of debriefing as formative feedback in nurse education. Aust J Adv Nurs 2011;29:37.
- 22. Kolb D. Learning styles inventory: technical manual. Boston, MA: McBer & Co, 1976.
- 23. Cox G, Brathwaite B, Morrison J. The rubric: an assessment tool to guide students and markers. Adv Higher Educ 2016;149.
- 24. Polit DF, Beck CT. Nursing research: generating and assessing evidence for nursing practice. 9th edn. Lippincott Williams & Wilkins, 2011.
- 25. Patton MQ. Qualitative research and evaluation methods. 3rd edn. Thousand Oaks, CA: SAGE, 2002.
- 26. Northeastern Illinois University. Critical thinking rubric, 2005. Available: http://web.uri.edu/assessment/files/CriticalThinkingRubric_NEIU.pdf [Accessed 1 Mar 2017].
- 27. Ghisletta P, Spini D. An introduction to generalized estimating equations and an application to assess selectivity effects in a longitudinal study on very old individuals. J Educ Behav Stat 2004;29:421–37. doi:10.3102/10769986029004421
- 28. Graneheim UH, Lundman B. Qualitative content analysis in nursing research: concepts, procedures and measures to achieve trustworthiness. Nurse Educ Today 2004;24:105–12. doi:10.1016/j.nedt.2003.10.001
- 29. Abe Y, Kawahara C, Yamashina A, et al. Repeated scenario simulation to improve competency in critical care: a new approach for nursing education. Am J Crit Care 2013;22:33–40. doi:10.4037/ajcc2013229
- 30. Mariani B, Cantrell MA, Meakim C, et al. Structured debriefing and students’ clinical judgment abilities in simulation. Clin Simul Nurs 2013;9:e147–55. doi:10.1016/j.ecns.2011.11.009
- 31. Alaagib NA, Musa OA, Saeed AM. Comparison of the effectiveness of lectures based on problems and traditional lectures in physiology teaching in Sudan. BMC Med Educ 2019;19:365. doi:10.1186/s12909-019-1799-0
- 32. Lasater K. Clinical judgment development: using simulation to create an assessment rubric. J Nurs Educ 2007;46:496–503. doi:10.3928/01484834-20071101-04
- 33. O HK. Effects on nursing students’ clinical judgment, communication, and skill performance following debriefing using a clinical judgment rubric. Int J Bio-Sci Bio-Tech 2016;8:303–12.
- 34. Hansen EJ. Idea-based learning: a course design process to promote conceptual understanding. Stylus Publishing, LLC, 2012.
- 35. Tofil NM, Morris JL, Peterson DT, et al. Interprofessional simulation training improves knowledge and teamwork in nursing and medical students during internal medicine clerkship. J Hosp Med 2014;9:189–92. doi:10.1002/jhm.2126
- 36. King AEA, Conrad M, Ahmed RA. Improving collaboration among medical, nursing and respiratory therapy students through interprofessional simulation. J Interprof Care 2013;27:269–71. doi:10.3109/13561820.2012.730076
- 37. Oxelmark L, Nordahl Amorøe T, Carlzon L, et al. Students’ understanding of teamwork and professional roles after interprofessional simulation: a qualitative analysis. Adv Simul 2017;2:8. doi:10.1186/s41077-017-0041-6
