Canadian Journal of Respiratory Therapy (Revue Canadienne de la Thérapie Respiratoire). 2016 Nov 1;52(4):114–117.

Advisory workgroup recommendations on the use of clinical simulation in respiratory therapy education

Irina Charania 1, Karl Weiss 2, Andrew J West 3, Seana Martin 4, Manon Ouellet 5, Roger Cook 6
PMCID: PMC6422227  PMID: 30996620

Abstract

Clinical simulation has become established as a commonly used educational approach in respiratory therapy, though questions remain regarding the evidence base for its use in some contexts. In conjunction with the development of a new iteration of the National Competency Framework (NCF), the National Alliance of Respiratory Therapy Regulatory Bodies (NARTRB) reaffirmed its desire to continue to recognize the use of simulation as an educational tool. Given the expressed uncertainty as to best practices in the use of clinical simulation in entry-to-practice respiratory therapy education programs, the NARTRB requested the creation of an expert workgroup to develop a list of recommendations from which an implementation plan could be developed for the next iteration of the NCF. The resulting advisory workgroup recommendations are intended to inform the application of simulation in education programs relative to the attainment of entry-to-practice competencies as outlined in the current National Competency Profile. The recommendations presented focus on the use of clinical simulation for formative and summative assessment of respiratory therapy competencies. The recommendations indicate that the use of formative assessment in clinical simulations, along with deliberate practice, has been clearly shown to improve the learning outcomes for which the simulations are designed. However, it is advised that clinical simulation be used cautiously for the summative assessment of competency (e.g., to assess readiness for practice) in the context of respiratory therapy education. A number of requisite instructional design factors that should be considered before implementing summative simulation-based assessments are identified, including the validation of summative assessment tools.

Keywords: clinical simulation, assessment, competency, respiratory therapy education


Clinical simulation is not a new phenomenon in the context of respiratory therapy education. In past years its role has been gradually established across health professions education, although some suggest that there has historically been limited research of sufficient quality to provide robust evidence of its educational utility [1, 2]. In response, an emerging body of research-informed literature is beginning to explicate the complexities of clinical simulation. For instance, a recent systematic review by Cook et al. [2] demonstrated that in comparison with no intervention, simulation-based health professions education can be associated with positive effects on the knowledge, skills, and behaviours of learners. In respiratory therapy in Canada, the knowledge, skills, and behaviours required of learners for entry-to-practice into the profession are identified in the current Respiratory Therapy National Competency Profile (NCP). The identification of effective approaches to developing these entry-to-practice competencies in learners has become a matter of shared interest to many stakeholders within the profession.

The Respiratory Therapy NCP was first created in 2003 by the National Alliance of Respiratory Therapy Regulatory Bodies (NARTRB) and updated after a nationwide professional validation survey in 2011 [3, 4]. In this profile the use of clinical simulation was initially introduced with distinct definitions for high- and low-fidelity simulation. These definitions were used when simulation was identified as an acceptable method of evaluation for 20 of the 315 total competencies listed in the profile [4]. This national policy decision was intended to provide some limited flexibility to accommodate competency assessment in respiratory therapy programs that encountered difficulty in ensuring sufficient opportunities for students to gain clinical exposure to those competencies.

The direction provided by the 2011 NCP with regard to clinical-simulation-based student evaluation generated some controversy within the Canadian respiratory therapy community (e.g., operationalization of the policy, educational validation of the assessment, etc.). In developing a new iteration of the NCP, the National Competency Framework (NCF), the NARTRB reaffirmed its desire to continue to recognize the use of simulation as an educational and assessment tool. Given the expressed uncertainty as to best practices in the use of clinical simulation in entry-to-practice contexts, and in response to action items agreed to in collaboration with CoARTE (Council on Accreditation for Respiratory Therapy Education), the NARTRB requested the creation of an expert workgroup to develop a list of recommendations. The workgroup was asked to examine and define the conditions under which simulation could be used to assess the entry-to-practice competency of graduating respiratory therapy students. The primary mandate of the workgroup was to determine how clinical simulation can best be employed in respiratory therapy education, with the interest of public safety and protection at the fore.

In response, the Advisory Workgroup on the Use of Clinical Simulation was struck to provide recommendations to the NARTRB on the appropriate use of clinical simulation for the learning and assessment of entry-to-practice competencies for the implementation of the 2011 NCP. The roles and responsibilities of the assembled committee were to: review literature related to the use of clinical simulation in the attainment and demonstration of competencies and related best practices; make recommendations with respect to the use of clinical simulation as a learning method to supplement and/or replace clinical practice to attain and demonstrate the competencies identified as entry-to-practice contained within the NCP; and make recommendations on establishing a plan for the management, administration, and implementation of clinical simulation.

The resulting recommendations are therefore intended to inform the application of clinical simulation in education programs relative to the attainment of entry-to-practice competencies as outlined in the current NCP [3]. It is noteworthy that interpretation of the literature may be different when clinical simulation is used in different contexts, such as in post-educational program licensure examinations, continuing professional development, etc.

DEVELOPMENT OF RECOMMENDATIONS

The Advisory Workgroup collaborated over the course of several months during 2015 and 2016. During that time the Advisory Workgroup engaged in considerable critical discussion regarding the current literature relative to the application of simulation in education programs for the attainment of entry-to-practice competencies, as outlined in the 2011 NCP [4]. Consensus was achieved that, wherever possible, the recommendations would be based on evidence from the currently available literature. As a foundational tenet of these recommendations, the Advisory Workgroup agreed that clinical simulation is an adjunct or a technique for learning, and is not a replacement for clinical assessment except in specific limited situations where it approximates clinical conditions.

The relevant literature was critically appraised in relation to the range of key conceptual areas discussed by the workgroup, including the use of simulation for enhancing traditional learning models, for replacing learning in clinical contexts, and for assessing practice readiness. In brief, the Advisory Workgroup agreed that:

  • Extensive literature exists supporting the effectiveness of simulation for enhancing learning in both entry-to-practice education and professional development contexts, in particular when quality debriefing is included as part of the educational design [2, 5–9].

  • Limited literature exists describing the effectiveness of replacing traditional clinical education exposure with simulated clinical experiences within entry-to-practice health professional education programs [10].

  • A paucity of literature exists exploring the use of simulation to assess clinical competencies (for entry-to-practice readiness) in respiratory therapy. Literature available from other health professions describes validated simulation-based assessments that relate to specific professional competencies, typically in the context of Objective Structured Clinical Examination (OSCE)-type licensure and certification assessment (i.e., post-completion of an entry-to-practice education program) [11–17].

As a result of this work, the Advisory Workgroup has developed recommendations for the use of simulation in both the formative and summative assessment settings. It was the opinion of the Advisory Workgroup that the assessment strategies should be designed to suit the educational purpose [18]. As such, the following recommendations address the use of clinical simulation for each assessment process separately (i.e., summative and formative assessment).

GENERAL RECOMMENDATIONS

Formative assessment of competencies in the 2011 NCP

With regard to educational practices, the use of formative assessment is encouraged to identify learning gaps and modify learning plans toward developing competencies [19]. The use of formative assessment in clinical simulations, along with deliberate practice, has been shown to improve the learning outcomes for which the simulations are designed [2, 6–9, 14]. The degree of realism required of a clinical simulation is dependent on the level of the learner and the objectives of the simulation [20, 21]. This Advisory Workgroup recommends using formative assessment in clinical simulation activities, followed by debriefing and deliberate practice, as a mechanism to assist learners with competency development.

Summative assessment of competencies in the 2011 NCP

Summative assessment refers to evaluating whether learners have achieved defined learning outcomes, using traditional methods such as grading tests or providing marks [18]. For the purpose of this report, summative assessment is the assessment of students to confirm entry-to-practice competency. With regard to respiratory therapy practice, there is no direct evidence supporting the notion that summative assessment of competencies for entry to practice can be achieved in simulated clinical settings. Based on the current literature, the Advisory Workgroup therefore cannot recommend the use of simulation for routine summative assessment of entry-to-practice competency.

It is recognized that the degree of realism of a clinical simulation depends on its design and on the attention given to approximating reality [20, 21]. It is the recommendation of this Advisory Workgroup that if, under exceptional circumstances, the use of simulation is deemed necessary for the summative assessment of any competencies, such assessments must be conducted in the most realistic setting and under the most realistic conditions available. It is important that the level of realism be sufficient to allow valid assessment of the targeted construct [22, 23]. Summative assessments of competencies should not be performed in settings that do not approximate realistic conditions or that lack the design and attention to realism needed to ensure validity.

SPECIFIC RECOMMENDATIONS

1. Recommendations on the use of clinical simulation for formative assessment for entry-to-practice

Recommendation 1.1

The use of clinical simulation for formative assessment is strongly encouraged in the curriculum of respiratory therapy education programs to foster the development of competencies and skills [2, 8]. Formative simulation may be sequentially incorporated as a component of the broader curriculum design (“at the right place, at the right time, for the right learning”).

Recommendation 1.2

Feedback and debriefing are essential elements of effective clinical simulation for formative assessment [5, 21, 24].

Recommendation 1.3

Clinical simulation for formative assessment is an effective approach to optimally prepare students for clinical exposure [2, 8, 10].

Recommendation 1.4

The following limitations should be considered when using clinical simulation for formative assessment:

  1. The assessment requires development by individuals who have knowledge, skills, attitudes, and abilities consistent with generally accepted principles of simulation-based education [9, 25].

  2. There are appropriate physical and human resources available [20].

  3. The evidence does not support the definitions of high- and low-fidelity simulation as presented in the 2011 NCP. More effective distinctions regarding the degree of fidelity can be made by examining three key elements (physical, semantical, and phenomenal); see Appendix A: Definition of Fidelity [6].

  4. Clinical simulation is an approximation of reality and may not be a sufficient replacement for clinical exposure [23].

Recommendation 1.5

The following are evidence-informed strategies for using clinical simulation in formative assessment:

  1. Use fidelity appropriate to the learning objectives of the clinical simulation, to the learning and skill level of the student, and to the student's level of experience with simulation [16, 20].

  2. Apply thoughtful instructional design principles in clinical simulation for formative assessment [20, 21, 27, 28].

  3. Select approaches to debriefing and feedback from amongst those that have been proven effective [24, 25].

  4. Establish a learning environment characterized by trust and safe learning practices amongst the participants [16, 29].

Recommendation 1.6

The assessment of learning is an ongoing process throughout the education cycle [30].

Recommendation 1.7

These recommendations should be periodically reviewed and updated to meet the requirements of future iterations of the NCF and to ensure consistency with emerging literature.

2. Recommendations on the use of clinical simulation for summative assessment of competencies for entry-to-practice

Recommendation 2.1

Summative assessment within the clinical environment is the gold standard [23].

Recommendation 2.2

Clinical simulation may be an acceptable alternative to assess competencies in some exceptional circumstances, such as:

  1. to address limitations in achieving and/or accessing clinical exposure;

  2. to assess internationally educated health professionals who are seeking to work in Canada as Registered Respiratory Therapists;

  3. to assess an individual returning to active respiratory therapy practice after an extended absence from practice;

  4. if clinical simulation is deemed to offer a higher quality of assessment than is available in a clinical setting, as long as it is supported by rigorous evidence.

Recommendation 2.3

For summative assessment using clinical simulation for entry-to-practice, the following limitations should be considered:

  1. The assessment using clinical simulation is performed by individuals who have knowledge, skills, attitudes, and abilities consistent with generally accepted principles of simulation-based assessment [16, 22, 25].

  2. The appropriate physical and human resources are available [20].

  3. The evidence does not support the definitions of high- and low-fidelity simulation as presented in the 2011 NCP. More effective distinctions regarding the degree of fidelity can be made by examining three key elements of realism (physical, semantical, and phenomenal); see Appendix A: Definition of Fidelity [6].

  4. Clinical simulation is an approximation of reality and may not be a sufficient replacement for clinical exposure [23].

  5. A single point assessment may be insufficient to determine competency.

Recommendation 2.4

The following are evidence-informed strategies for summative assessment using clinical simulation:

  1. Establish a clinical situation and environment that approximate a real-world situation, addressing all three elements of high-fidelity simulation [6].

  2. Apply thoughtful instructional design principles in clinical simulation for summative assessment [20, 21, 27, 28].

Recommendation 2.5

The assessment of learning is an ongoing process throughout the education cycle [30].

Recommendation 2.6

It is recognized that assessment spans a continuum from formative to summative assessment and these recommendations are not meant to undermine the value of debriefing in the formative assessment context [8, 31].

Recommendation 2.7

These recommendations should be reviewed periodically and updated to meet the requirements of future iterations of the NCF and to ensure consistency with emerging literature.

3. Recommendations about the use of clinical simulation in the evaluation of the 2011 NCP competencies

Recommendation 3.1

Establish a standing advisory committee to provide ongoing expert advice to the NARTRB on education-related issues.

Recommendation 3.2

As simulation design represents only one component of a program’s curriculum design, it is the opinion of the Advisory Workgroup that national educational program requirements for formative and summative assessments should not impose conditions or limitations on, or encourage a particular threshold for, the use of clinical simulation.

4. Recommendation of the advisory workgroup on the use of clinical simulation

Recommendation 4.1

Several environmental factors have been identified in the literature as essential to creating an effective debriefing environment in clinical simulation, including fostering a supportive learning environment, ensuring participants feel comfortable, and establishing trust within the circle of participants [25, 29]. In light of the importance of fostering a debriefing environment that supports learning, educators and regulatory bodies should carefully consider how employing simulation-based summative assessment (e.g., high-stakes examinations) of respiratory therapy learners may affect these essential environmental factors.

The profession should consider how it may ensure that any move towards employing simulation-based high-stakes examinations in respiratory therapy education does not threaten to undermine the effectiveness of the clinical simulation learning environment.

FUTURE DIRECTIONS

In relation to the primary mandate assigned to the Advisory Workgroup by the NARTRB, it is the opinion of the Advisory Workgroup that the responsible use of clinical simulation is in the interest of public safety and protection. The Advisory Workgroup identified the opportunity to engage the collective respiratory therapy education community (e.g., CACERT (Canadian Advisory Council for Education in Respiratory Therapy), CoARTE, CBRC (Canadian Board for Respiratory Care)), as well as other professions, on topics of mutual interest, such as the integration of simulation into future curriculum design and the development of recommendations for formative education, in addition to the recommendations for summative evaluation requested by the NARTRB. The Advisory Workgroup also indicated a willingness to continue to advise the NARTRB on the use of clinical simulation in future iterations of the NCF.

Looking forward, a need exists for original, respiratory therapy-specific research and scholarly work in the area of clinical simulation. In particular, research should be conducted that can inform the profession’s understanding of the use of clinical simulation for the assessment of entry-to-practice competencies in educational programs.

Beyond these recommendations, it is the opinion of the Advisory Workgroup that it may be in the best interest of respiratory therapy educators to develop their own strategies for implementing these recommendations, or other emerging best practices on clinical simulation-based formative and summative assessment. It is felt that this has the potential to eventually lead to the establishment of national standards in simulation-based respiratory therapy education.

APPENDIX A

Definition of fidelity

Fidelity describes the degree of realism in simulated environments and includes physical, semantical, and phenomenal aspects [6]. “Skillful blending of the three…will allow our trainees to ‘suspend disbelief’ that this is a situation with real relevance to them” [26]. Participant engagement does not rest on any single element of realism, but rather on ensuring that no single element “violates their expectations in a way that disrupts their engagement” [26].

Physical reality concerns characteristics that are measurable (e.g., the weight of an infant mannequin). In this way, physical fidelity might be described as the reality of the simulator equipment, of measurable elements of the environment, or of the physical aspects of movement.

Semantical reality concerns those parts of the simulation experience that are “facts only by human agreement” [6]. Semantical fidelity describes “concepts and their relationships…presented as text, pictures, sounds, or events” [6]. Semantical fidelity is therefore assured only when the information presented is interpretable as realistic (e.g., when a simulated patient’s heart stops beating, the patient is also made to stop breathing, as would occur naturally).

Phenomenal reality concerns participants’ understanding of how the simulation event relates to another real situation, such as clinical practice (e.g., team interaction in a simulated trauma scenario feels lifelike despite obvious physical differences from real life). Phenomenal fidelity depends on the “emotions, beliefs, and self-aware cognitive states of rational thought” [6] experienced by participants in simulation.

When considering the appropriate combination of realism elements for achieving optimal fidelity, the incorporation of appropriate interprofessional team members must also be carefully considered. Having the appropriate interprofessional team members present can greatly enhance the phenomenal reality of the simulation. Additionally, especially when team interactions are integral to the objectives of the simulation scenario, their participation could also enhance the physical and semantical reality. For example it may not be common to have five respiratory therapists caring for the same patient; however, it would not be uncommon to have five healthcare professionals caring for a critically ill patient, each bringing different yet overlapping roles and skill sets to the situation. When the professional composition of the care team (simulation participants) is not carefully considered during the development of simulation scenarios, the fidelity of the simulation as experienced by learners is likely to be negatively impacted. The development of an optimal interprofessional simulation experience may require collaboration with other educational institutions. This would allow students to gain exposure and experience working with professions they regularly interact with when providing care to patients, as opposed to limiting interprofessional interactions to only students studying in healthcare programs at the same institution.

REFERENCES

1. Wayne DB, Didwania A, Feinglass J, Fudala MJ, Barsuk JH, McGaghie WC. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: A case-control study. Chest J 2008;133(1):56–61. doi: 10.1378/chest.07-0131.
2. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: A systematic review and meta-analysis. J Am Med Assoc 2011;306(9):978–88. doi: 10.1001/jama.2011.1234.
3. National Alliance of Respiratory Therapy Regulatory Bodies Publications. NCP; 2011. <http://www.nartrb.ca/eng/publications.php> (Accessed August 29, 2016).
4. National Alliance of Respiratory Therapy Regulatory Bodies Publications. NCP; 2003. <http://www.nartrb.ca/eng/publications.php> (Accessed August 29, 2016).
5. Cheng A, Eppich W, Grant V, Sherbino J, Zendejas B, Cook DA. Debriefing for technology-enhanced simulation: A systematic review and meta-analysis. Med Educ 2014;48(7):657–66. doi: 10.1111/medu.12432.
6. Dieckmann PD, Gaba D, Rall M. Deepening the theoretical foundations of patient simulation as social practice. Simul Healthc 2007;2(3):183–93. doi: 10.1097/SIH.0b013e3180f637f5.
7. Gaba DM. The future vision of simulation in health care. Qual Saf Health Care 2004;13(Suppl 1):i2–10. doi: 10.1136/qshc.2004.009878.
8. Levett-Jones T, Lapkin S. A systematic review of the effectiveness of simulation debriefing in health professional education. Nurse Educ Today 2014;34(6):e58–63. doi: 10.1016/j.nedt.2013.09.020.
9. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med 2011;86(6):706–11. doi: 10.1097/ACM.0b013e318217e119.
10. Hayden JK, Smiley RA, Alexander M, Kardong-Edgren S, Jeffries PR. The NCSBN National Simulation Study: A longitudinal, randomized, controlled study replacing clinical hours with simulation in prelicensure nursing education. J Nurs Regul 2014;5(2):S3–40. doi: 10.1016/S2155-8256(15)30062-4.
11. Alston GL, Love BL. Development of a reliable, valid annual skills mastery assessment examination. Am J Pharmaceutical Educ 2010;74(5):Article 80. doi: 10.5688/aj740580.
12. Baid H. The objective structured clinical examination within intensive care nursing education. Nurs Crit Care 2011;16(2):99–105. doi: 10.1111/j.1478-5153.2010.00396.x.
13. Berkenstadt H, Ziv A, Gafni N, Sidi A. Incorporating a simulation-based objective structured clinical examination into the Israeli national board examination in anaesthesiology. Anesth Analg 2006;102:853–8. doi: 10.1213/01.ane.0000194934.34552.ab.
14. Gallagher AG, Cates CV. Approval of virtual reality training for carotid stenting: What this means for procedural-based medicine. JAMA 2004;292(24):3024–6. doi: 10.1001/jama.292.24.3024.
15. Höfer SH, Schuebel F, Sader R, Landes C. Development and implementation of an objective structured clinical examination (OSCE) in CMF-surgery for dental students. J Cranio-Maxillofac Surg 2012;41(5):412–16. doi: 10.1016/j.jcms.2012.11.007.
16. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ 2010;44(1):50–63. doi: 10.1111/j.1365-2923.2009.03547.x.
17. Michelson JD, Manning L. Competency assessment in simulation-based procedural education. Am J Surg 2008;196(4):609–15. doi: 10.1016/j.amjsurg.2007.09.050.
18. Earl L. Assessment – A powerful lever for learning. Brock Educ 2006;16(1):1–15.
19. Assessment Reform Group. Assessment for learning: Beyond the black box. University of Cambridge School of Education; 1999. <http://www.nuffieldfoundation.org/sites/default/files/files/beyond_blackbox.pdf> (Accessed August 29, 2016).
20. Chiniara G, Cole G, Brisbin K, et al. Simulation in healthcare: A taxonomy and a conceptual framework for instructional design and media selection. Med Teach 2013;35(8):e1380–95. doi: 10.3109/0142159X.2012.733451.
21. Cook DA, Hamstra SJ, Brydges R, et al. Comparative effectiveness of instructional design features in simulation-based education: Systematic review and meta-analysis. Med Teach 2013;35(1):e867–98. doi: 10.3109/0142159X.2012.714886.
22. McWilliam P, Botwinski C. Identifying strengths and weaknesses in the utilization of Objective Structured Clinical Examination (OSCE) in a nursing program. Nurs Educ Perspect 2012;33(1):35–9. doi: 10.5480/1536-5026-33.1.35.
23. O’Leary F. Simulation as a high stakes assessment tool in emergency medicine. Emerg Med Aust 2015;27(2):173–5. doi: 10.1111/1742-6723.12370.
24. Dreifuerst K. Using debriefing for meaningful learning to foster development of clinical reasoning in simulation. J Nurs Educ 2012;51(6):326–33. doi: 10.3928/01484834-20120409-02.
25. Eppich W, Cheng A. Promoting excellence and reflective learning in simulation (PEARLS): Development and rationale for a blended approach to health care simulation debriefing. Simul Healthc 2015;10(2):106–15. doi: 10.1097/SIH.0000000000000072.
26. Rudolph JW, Simon R, Raemer DB. Which reality matters? Questions on the path to high engagement in healthcare simulation. Simul Healthc 2007;2(3):161–3. doi: 10.1097/SIH.0b013e31813d1035.
27. Jeffries PR. A framework for designing, implementing, and evaluating simulations used as teaching strategies in nursing. Nurs Educ Perspect 2005;26(2):96–103.
28. Khamis NN, Satava RM, Alnassar SA, Kern DE. A stepwise model for simulation-based curriculum development for clinical skills, a modification of the six-step approach. Surg Endoscopy 2016;30(1):279–87. doi: 10.1007/s00464-015-4206-x.
29. Wickers MP. Establishing the climate for a successful debriefing. Clin Simul Nurs 2010;6(3):e83–6. doi: 10.1016/j.ecns.2009.06.003.
30. Motola I, Devine LA, Chung HS, Sullivan JE, Issenberg SB. Simulation in health care education: A best evidence practical guide. AMEE Guide No. 82. Med Teach 2013;35(10):e1511–30.
31. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc 2007;2(2):115–25. doi: 10.1097/SIH.0b013e3180315539.
