Abstract
Background
Simulation-based medical education (SBME) is a cornerstone of procedural skill training in residency education. Multiple studies have concluded that SBME is highly effective, superior to traditional clinical education, and translates to improved patient outcomes. Additionally, it is widely accepted that mastery learning, which incorporates deliberate practice, is essential for expert-level performance of routine skills; however, because highly structured practice is more time- and resource-intensive, it is important to assess its value for the acquisition of rarely performed technical skills. The bougie-assisted cricothyroidotomy (BAC), a rarely performed, lifesaving procedure, is an ideal skill for evaluating the utility of highly structured practice: it is relevant across many acute care specialties yet rare enough that learners are unlikely to have had significant previous training or clinical experience. The purpose of this study is to compare a modified mastery learning approach incorporating deliberate practice with self-guided practice on technical skill performance using a bougie-assisted cricothyroidotomy model.
Methods
A multi-centre, randomized study will be conducted at five residency programs (four Canadian, one American), with 160 residents assigned to either mastery learning and deliberate practice (ML + DP) or self-guided practice for BAC. Skill performance will be assessed using a global rating scale before practice, immediately after practice, and 6 to 12 months later. The two groups will be compared to determine whether the type of practice affects performance and skill retention.
Discussion
Mastery learning coupled with deliberate practice provides systematic and focused feedback during skill acquisition. However, it is resource-intensive and its efficacy is not fully defined. This multi-centre study will provide generalizable data about the utility of highly structured practice for the acquisition of a rare, lifesaving technical skill within postgraduate medical education. Study findings will guide educators in selecting an optimal training strategy that addresses both short- and long-term performance.
Keywords: Medical education, Deliberate practice, Mastery learning, Simulation-based medical education, Self-guided practice, Cricothyroidotomy, Procedural skill
Background
As both technology and patient care become increasingly complex, simulation-based medical education (SBME) allows trainees to practice without harming patients [3], and there is substantial evidence that it is a superior training technique compared to traditional didactic methods for technical skill acquisition [1, 2]. However, SBME is a resource-intensive approach to medical training, requiring careful alignment of education and practice design principles with the intended outcomes. For example, there is growing evidence that deliberate practice and mastery learning approaches can support expert-level performance of procedural skills, particularly for routine procedures [4–7]. With this structured approach, learners transition along the continuum of five stages outlined in the Dreyfus model: novice, advanced beginner, competent, proficient and expert [4]. Highly structured practice, however, requires a substantial commitment of time and instructional resources [8]. Identifying the most effective, evidence-based SBME methods for rare procedures is therefore a critical task for educators in acute care medicine [9, 10].
Deliberate practice and mastery learning (DP + ML)
Deliberate practice (DP) is an instructional method widely regarded as the mainstay of expert skill acquisition [11–13]. First described by Ericsson and colleagues, this method refers to engagement in structured activities designed to improve performance in a domain through an iterative cycle of practice, feedback and successive refinement [14]. It is based on four key components [11]:
Motivated learners
Well-defined goals for improvement
Ample opportunity to practice through repetition
Focused feedback during practice, with skill adjustments made accordingly
Deliberate practice is often coupled with the mastery learning (ML) model, in which tasks are broken into a series of smaller and progressively more complex microskills [15]. Learners advance through each of the tasks by applying skills they have acquired during preceding steps [16] (Fig. 1). Deliberate practice and mastery learning (DP + ML) improve performance across a variety of disciplines, including sports and music, and there is growing evidence of their effectiveness within medical education [2, 12, 17].
Some authors suggest that simulation training using DP + ML results in superior outcomes compared to other methods of simulation-based practice [2]. An alternative to DP + ML is self-guided practice (also referred to as self-directed or self-regulated learning), whereby “individuals take the initiative, with or without the help of others in diagnosing their learning needs, formulating learning goals, identifying human and material resources for learning, choosing and implementing appropriate learning strategies, and evaluating learning outcomes” [18]. The few studies that compare DP with self-guided practice have yielded mixed results and are limited by small sample sizes and the restricted generalizability of single-institution settings [8, 19].
An important confounding factor that may explain these discrepant results is the variability in how DP is applied during SBME. Deliberate practice is often described in the context of sustained investment in practice over long periods of time in order to attain expert status [9]. In medical training, this can be difficult given the wide breadth of skills and knowledge that are required, especially for rarely performed procedures. While most clinical educators adhere to the key components of Ericsson’s criteria, in real practice, training is condensed into a shorter time frame. The goal of the current study is therefore to explore the application of DP in a realistic context in order to understand how it is best optimized in curricular design, particularly for the acquisition of a rarely performed technical skill in the emergency medicine setting. Our study design incorporates several metrics of performance, including a global rating scale and chronometry, measured both immediately after and 6–12 months following skill instruction.
Chronometry
Chronometry, the measurement of time, is an important metric for time-sensitive procedures, yet it remains uncommonly applied within technical skill training curricula [20]. Introducing procedural time-based measurements can further improve deliberate practice by facilitating overlearning (going beyond the minimal level of competence required), increasing the challenge level, and providing additional feedback that can enhance a learner’s self-assessment and increase motivation [20].
Retention
A key aspect of procedural skill acquisition is skill retention. Medical trainees are expected to demonstrate competence in performing a wide array of procedures, despite having limited exposure to practice opportunities [21]. Acute care physicians face the additional challenge of performing these procedures with little to no warning in stressful situations with unstable patients.
Several studies have examined SBME as a means to improve skill retention, with variable results. Reported skill decay ranges from 2 weeks to 14 months for skills such as advanced cardiac life support (ACLS), shoulder dystocia management and cricothyroidotomy, with the most common retention interval being 6–12 months [22–25]. These differences are perhaps unsurprising, given the number of factors that influence retention, including the degree of overlearning, performance conditions during retention evaluation, task characteristics, and unmeasured individual differences [26–30]. While 6–12 months appears to be a reasonable timeframe over which to expect skill retention, there is uncertainty regarding the training approach best suited to optimize skill performance over time.
Bougie-assisted cricothyroidotomy
The cricothyroidotomy is a life-saving procedure performed as a final option in all emergency airway algorithms [31, 32]. It represents an ideal prototype for evaluating the utility of DP + ML in skill acquisition and retention, as it is rare, time-sensitive, and relevant across most acute care, hospital-based specialties [31, 33–36]. Furthermore, the recent introduction of a novel technique, the bougie-assisted cricothyroidotomy (BAC), offers an additional opportunity, as residents are unlikely to have previous training or experience with it.
Justification/summary
Large-scale, multi-centre comparative effectiveness studies of SBME methods are needed to better define optimal simulation teaching strategies. This study will compare DP + ML with self-guided practice on the acquisition and retention of a rarely performed skill, the bougie-assisted cricothyroidotomy. To our knowledge, this is the first multi-centre, randomized study to evaluate the role of DP + ML in skill acquisition and retention.
Methods
Aim/objective
The objectives of this study are to:
Compare the effect of two instructional techniques on bougie-assisted cricothyroidotomy skill performance immediately after training and 6–12 months later
Evaluate participant attitudes and preferences following each of the two instructional techniques
Study design and setting
This is a multi-centre, blinded, randomized study (Fig. 2). The study will take place at five emergency medicine (EM) residency programs across North America (four Canadian and one US-based centre). All EM residents (PGY1–5) are eligible and will be invited to participate. Written consent will be obtained from all study participants. There are no exclusion criteria. We received research ethics board approval from all participating institutions.
Instructional methods
Residents at each study site will be assigned to receive either DP + ML or self-guided practice using a computer-generated randomization process. Participants will be blinded to the study group assignments. For practical purposes, we will cluster 4–6 participants with one instructor, who will facilitate the training using one of the two instructional methods. This group size aligns with the optimal trainee-to-instructor ratio for procedural skill training reported in the surgical education literature [37]. All instructors are board-certified emergency physicians with expertise in procedural skill training. Prior to the session, each instructor will receive an introduction to the task trainer, a briefing on how to conduct the training and a standardized approach to BAC performance. The instructors assigned to the DP + ML sessions will also receive an introduction to the BAC checklist and explicit instructions to facilitate mastery learning followed by deliberate practice [38]. The instructors assigned to the self-guided practice sessions will be told explicitly that they can only provide feedback following a participant request. They will be instructed by the study investigators on the necessary steps for BAC performance.
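The protocol does not specify the software used for the computer-generated assignment; as a hedged illustration only, a site-stratified 1:1 allocation could be produced with a short script such as the sketch below, in which the function name, seed, group labels and participant IDs are all hypothetical.

```python
# Hypothetical sketch of computer-generated, site-stratified 1:1 randomization
# to DP + ML or self-guided practice; the protocol does not name the software
# actually used, so this is for illustration only.
import random

def randomize_site(participant_ids, seed=None):
    """Shuffle one site's participants and split them evenly between arms."""
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {pid: ("DP+ML" if i < half else "Self-guided")
            for i, pid in enumerate(ids)}

# Example: 18 residents at one site (IDs are placeholders)
allocation = randomize_site([f"R{i:02d}" for i in range(1, 19)], seed=2024)
```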
Both groups will receive an introductory lecture on surgical airway performance and a demonstration of the BAC on the task trainer by the session instructor. Participants will then complete a pre-training survey assessing self-confidence and prior cricothyroidotomy experience, followed by a video-recorded performance of a BAC using an upper body task trainer (head and neck from a Laerdal SimMan) to assess baseline performance before training begins.
Deliberate practice and mastery learning
In the DP + ML group, the instructor will demonstrate the procedure following the previously published list of essential steps [38], which were selected using a modified Delphi methodology. Participants will perform each step under direct instructor observation, in keeping with mastery learning techniques [8]; subsequent practice will follow the principles of deliberate practice, with the goal of skill improvement through repetitive performance coupled with personalized feedback from the instructor. Chronometry will be introduced in later stages of practice through timed performance [20]. We chose to incorporate chronometry as an additional pedagogic intervention in the later stages of learning to further incentivize practice with an objective feedback metric. The training session will be considered complete once the participant and instructor independently agree that competent skill performance has been achieved.
Self-guided learning
In the self-guided learning group, participants will observe the procedure demonstrated by an instructor, followed by self-guided practice using the same equipment and working environment as the DP + ML group. During practice, feedback will only be provided following an explicit participant request. Participants will be informed that they can complete their training at their own discretion, once they report feeling comfortable with the procedure.
Finally, participants from both groups will perform a video-recorded attempt immediately following completion of their respective practice sessions and complete a post-training survey.
Skill retention
Skill retention will be evaluated 6 to 12 months following the initial training session. Due to logistical challenges related to resident schedules (e.g. away on elective, off-service rotations), we require this 6-month window to maximize participation. Each participant will perform a video-recorded BAC on the same task trainer used during their training session, followed by a retention survey.
Primary outcome
The primary outcome is the difference in BAC skill performance between the two groups at post-training, assessed using a previously validated 7-point global rating scale (GRS) [22, 39]. Video recordings will be scored by three independent airway experts (two Royal College-certified emergency physicians and one staff anesthesiologist) trained in BAC. All reviewers will receive training from the study investigators, followed by practice video review to ensure scoring standardization. All reviewers will be blinded to the subjects’ identities and group allocation.
Secondary outcomes
Secondary outcomes include participant feedback and chronometry after the initial training and retention sessions. Performance time begins when the participant first palpates the neck of the task trainer and ends with successful ventilation. Time scores will be measured by video review. Participant feedback will be collected with pre- and post-training surveys. Skill performance (measured using the GRS) at 6–12 months will also be compared.
Sample size determination
A justification of sample size requires knowledge of results from similar studies. As no studies of this exact design have been conducted previously, we have used a sample size formula to evaluate our proposed sample size, basing the calculation on assumptions inherent in a between-groups comparison [40]. Given the multi-site nature of this study, there will be approximately 16–20 participants per site, for a total of 80 in each group. With 80 participants in each group, there is sufficient power (80%) to detect a difference in GRS score of 1.3 between groups, assuming α = 0.05 and β = 0.2. Similar approaches have been used by Friedman et al. [39] and Naik et al. [41].
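As a rough check of these numbers, the minimum detectable difference for a two-group comparison can be approximated with the standard normal-approximation formula. The sketch below assumes a pooled GRS standard deviation of about 2.9 points, a value that is not reported in the protocol and is used here purely for illustration.

```python
# Minimal sketch: normal-approximation estimate of the minimum detectable
# between-group GRS difference with 80 participants per group.
# The pooled SD (sd = 2.9) is an assumption for illustration only;
# it is not reported in the protocol.
from scipy.stats import norm

alpha, power = 0.05, 0.80
n_per_group = 80
sd = 2.9  # hypothetical pooled SD of the GRS score

z_alpha = norm.ppf(1 - alpha / 2)  # two-sided test
z_beta = norm.ppf(power)

delta = (z_alpha + z_beta) * sd * (2 / n_per_group) ** 0.5
print(f"Minimum detectable GRS difference: {delta:.2f}")  # ~1.3 with these inputs
```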
Statistical analysis
The survey data will be analyzed using descriptive statistics. The primary outcome (GRS score) will be evaluated using a repeated measures ANOVA with one between-subject factor of instruction (DP + ML or self-guided group) and one within-subject factor of assessment time, repeated three times (pre-training, post-training and retention). SPSS software (version 18.0, SPSS Inc., Chicago, IL) will be used for data analysis. A two-sided P < 0.05 will be considered statistically significant for the primary outcome. Interrater reliability (IRR) of the reviewers’ scores will be analyzed using an intraclass correlation coefficient after all reviewers have scored the same 20 videos. If a target IRR of ≥0.7 is achieved, the remaining videos will be scored by a single reviewer. Confidence ratings will be compared with GRS scores at pre-training, post-training and retention using Spearman’s correlations.
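Although the protocol specifies SPSS, an equivalent analysis could be sketched with open-source tools. The example below uses the pingouin and scipy packages as a possible alternative, and every file name and column name (id, group, time, GRS, confidence, video, rater) is a hypothetical placeholder rather than part of the study’s actual data dictionary.

```python
# Hedged sketch of the planned analyses using open-source tools instead of
# SPSS (which the protocol specifies); data files and column names are
# hypothetical placeholders.
import pandas as pd
import pingouin as pg
from scipy.stats import spearmanr

# Long-format scores: one row per participant per assessment
# columns: id, group (DP+ML / self-guided), time (pre / post / retention),
# GRS, confidence
scores = pd.read_csv("grs_scores.csv")

# Mixed ANOVA: between-subject factor = instruction group,
# within-subject factor = assessment time
anova = pg.mixed_anova(data=scores, dv="GRS", within="time",
                       between="group", subject="id")
print(anova)

# Interrater reliability: intraclass correlation on the 20 videos scored
# by all three reviewers (columns: video, rater, GRS)
ratings = pd.read_csv("rater_scores.csv")
icc = pg.intraclass_corr(data=ratings, targets="video",
                         raters="rater", ratings="GRS")
print(icc)

# Spearman correlation between self-reported confidence and GRS
# at each assessment point
for t in ["pre", "post", "retention"]:
    sub = scores[scores["time"] == t]
    rho, p = spearmanr(sub["confidence"], sub["GRS"])
    print(f"{t}: rho={rho:.2f}, p={p:.3f}")
```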
Discussion
The unique challenges of postgraduate training in medicine demand that research seeks to enhance our understanding of optimal simulation-based instructional methods rather than further highlighting the benefits of hands-on practice compared with non-simulation-based teaching [42]. While the benefits of ML + DP in simulation are promising, we should acknowledge that it is a complex intervention with a variety of elements that differ as a result of local circumstances and resource availability [2]. The resources required to implement DP + ML cannot be overstated: this method often requires more equipment, more trainers and more time [15]. In a similar study comparing simulation-based deliberate practice with self-guided practice for ultrasound-guided regional anesthesia skills, subjects in the self-guided practice group spent an average of 6.8 min practicing compared to 48.2 min in the deliberate practice group.
Currently, there is a paucity of data to inform preferred methods of simulation-based procedural training. McGaghie et al. concluded in their 2011 meta-analysis that “only a small number of studies were identified that address head-to-head comparative effectiveness of SBME with deliberate practice and traditional clinical education or a preintervention baseline” [2]. A more recent systematic review and meta-analysis concluded that “limited evidence suggests” the superiority of mastery learning SBME over other techniques [15]. The authors noted that the few studies comparing ML with other simulation-based instructional methods, such as self-guided practice, have produced mixed results. In one study, anesthesia residents learning ultrasound-guided regional anesthesia were randomized to either DP + ML or self-guided practice. The authors reported that while performance improved in both groups, there were no differences in participant skill performance or retention between groups [8]. In contrast, a study of medical students learning ultrasound-guided regional anesthesia techniques concluded that deliberate practice resulted in fewer errors compared to self-guided practice [19]. Both of these studies are subject to important limitations, including small sample sizes and single-institution settings, and it remains uncertain whether their findings can be extrapolated to a larger study population across multiple sites. Recently, a large, comparative, single-site study of undergraduate medical trainees concluded that mastery learning with simulation is superior to simulation without mastery learning for peripheral venous catheter insertion [43]. This certainly supports the use of ML in the undergraduate setting; however, its utility in postgraduate training requires further study.
Our study addresses this gap by comparing two different simulation-based instructional methods for EM residents using a large sample size at multiple centres. This design seeks to overcome the limited generalizability common among single-site SBME studies. In addition, the rarity and relative novelty of the BAC procedure itself minimizes baseline variation in prior experience and skill.
This study has several limitations. Given that both teaching and testing occur with a low-fidelity simulator, our study is designed only to assess procedural skills rather than the decision-making and non-technical skills of an interprofessional resuscitation. Due to logistical constraints, retention data collection will take place between 6 and 12 months rather than at a single point in time. Although all study participants will enrol voluntarily, we cannot guarantee their motivation to learn and improve, which is a key aspect of deliberate practice.
In conclusion, this study will have important implications for residency training as it will provide insight for medical educators involved in the design of simulation-based technical skills training. High quality simulation-based skills training is a promising method to improve clinical outcomes and patient safety [44]. At a system-based level, this study will aid administrators and medical education leaders in their decision making for resource allocation towards skills training and inform future curricular designs.
Acknowledgements
The authors wish to thank the staff at the Allan Waters Family Simulation Centre (Nazanin Khodadoust, Susan Zelko, Hentley Small and Ashley Rosen) for their contributions to the simulation design and logistical planning support. The authors also wish to thank the simulation and research teams at each participating centre: Queen’s University (Dr. Andrew Hall, Jessica Montagner), McMaster University (Dr. Julian Owen), Yale University (Dr. Thomas Beardsley, Dr. James Bonz) and the University of Ottawa (Dr. George Mastoras) for their support of this study.
Funding
This work is supported by a medical education grant from Physicians Services Incorporated (Ontario, Canada). The funding sources had no role in the design of this study and they will not have any role in the study’s implementation, data analysis or dissemination of study results.
Availability of data and materials
The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.
Abbreviations
- BAC
Bougie-assisted cricothyroidotomy
- DP
Deliberate practice
- GRS
Global rating scale
- IRR
Interrater reliability
- ML + DP
Mastery learning and deliberate practice
- ML
Mastery learning
- SBME
Simulation-based medical education
Authors’ contributions
AP, SG, CH and JS designed the study. ML and AP conducted the literature review. AP, MM and SM developed the study methods and data analysis process. ML, AP and MM drafted the manuscript. SG, CH, JS and SM critically reviewed and edited the manuscript. All authors have read and approved the final manuscript.
Ethics approval and consent to participate
Ethics approval was received at all participating centres, including Queen’s University (Kingston, Canada), McMaster University (Hamilton, Canada), Yale University (New Haven, Connecticut), the University of Ottawa (Ottawa, Canada) and St. Michael’s Hospital, Toronto, Canada. Written informed consent will be obtained from all participants.
Consent for publication
Not applicable.
Competing interests
The authors declare that they have no competing interests.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Contributor Information
Andrew Petrosoniak, Phone: 416-864-5095, Email: petro82@gmail.com.
Marissa Lu, Email: marissa.lu@mail.utoronto.ca.
Sara Gray, Email: grays@smh.ca.
Christopher Hicks, Email: chrismikehicks@gmail.com.
Jonathan Sherbino, Email: sherbino@mcmaster.ca.
Melissa McGowan, Email: mcgowanm@smh.ca.
Sandra Monteiro, Email: monteisd@mcmaster.ca.
References
1. McGaghie WC, Issenberg SB, Barsuk JH, Wayne DB. A critical review of simulation-based mastery learning with translational outcomes. Med Educ. 2014;48:375–385. doi: 10.1111/medu.12391.
2. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86:706–711. doi: 10.1097/ACM.0b013e318217e119.
3. Gaba DM. The future vision of simulation in health care. Qual Saf Health Care. 2004;13(Suppl 1):i2–10. doi: 10.1136/qshc.2004.009878.
4. Dreyfus SE. The five-stage model of adult skill acquisition. Bull Sci Technol Soc. 2004;24:177–181. doi: 10.1177/0270467604264992.
5. Kulasegaram KM, Grierson LE, Norman GR. The roles of deliberate practice and innate ability in developing expertise: evidence and implications. Med Educ. 2013;47:979–989. doi: 10.1111/medu.12260.
6. Pusic M, Pecaric M, Boutis K. How much practice is enough? Using learning curves to assess the deliberate practice of radiograph interpretation. Acad Med. 2011;86:731–736. doi: 10.1097/ACM.0b013e3182178c3c.
7. Walsh CM, Rose DN, Dubrowski A, et al. Learning in the simulated setting: a comparison of expert-, peer-, and computer-assisted learning. Acad Med. 2011;86:S12–S16. doi: 10.1097/ACM.0b013e31822a72c7.
8. Udani AD, Harrison TK, Mariano ER, et al. Comparative-effectiveness of simulation-based deliberate practice versus self-guided practice on resident anesthesiologists' acquisition of ultrasound-guided regional anesthesia skills. Reg Anesth Pain Med. 2016;41:151–157. doi: 10.1097/AAP.0000000000000361.
9. Issenberg SB, McGaghie WC, Hart IR, et al. Simulation technology for health care professional skills training and assessment. JAMA. 1999;282:861–866. doi: 10.1001/jama.282.9.861.
10. Pusic MV, Kessler D, Szyld D, Kalet A, Pecaric M, Boutis K. Experience curves as an organizing framework for deliberate practice in emergency medicine learning. Acad Emerg Med. 2012;19:1476–1480. doi: 10.1111/acem.12043.
11. Ericsson KA. Deliberate practice and acquisition of expert performance: a general overview. Acad Emerg Med. 2008;15:988–994. doi: 10.1111/j.1553-2712.2008.00227.x.
12. Macnamara BN, Hambrick DZ, Oswald FL. Deliberate practice and performance in music, games, sports, education, and professions: a meta-analysis. Psychol Sci. 2014;25:1608–1618. doi: 10.1177/0956797614535810.
13. Sawyer T, White M, Zaveri P, et al. Learn, see, practice, prove, do, maintain: an evidence-based pedagogical framework for procedural skill training in medicine. Acad Med. 2015;90:1025–1033. doi: 10.1097/ACM.0000000000000734.
14. Ericsson KA, Lehmann AC. Expert and exceptional performance: evidence of maximal adaptation to task constraints. Annu Rev Psychol. 1996;47:273–305. doi: 10.1146/annurev.psych.47.1.273.
15. Cook DA, Brydges R, Zendejas B, Hamstra SJ, Hatala R. Mastery learning for health professionals using technology-enhanced simulation: a systematic review and meta-analysis. Acad Med. 2013;88:1178–1186. doi: 10.1097/ACM.0b013e31829a365d.
16. McGaghie WC. When I say ... mastery learning. Med Educ. 2015;49:558–559. doi: 10.1111/medu.12679.
17. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA. 2011;306:978–988. doi: 10.1001/jama.2011.1234.
18. Knowles M. Self-directed learning: a guide for learners and teachers. New York: Cambridge University Press; 1975.
19. Ahmed OMA, Azher I, Gallagher AG, Breslin DS, O'Donnell BD, Shorten GD. Deliberate practice using validated metrics improves skill acquisition in performance of ultrasound-guided peripheral nerve block in a simulated setting. J Clin Anesth. 2018;48:22–27. doi: 10.1016/j.jclinane.2018.04.015.
20. Pusic MV, Brydges R, Kessler D, Szyld D, Nachbar M, Kalet A. What's your best time? Chronometry in the learning of medical procedures. Med Educ. 2014;48:479–488. doi: 10.1111/medu.12395.
21. Petrosoniak A, Herold J, Woolfrey K. Emergency medicine procedural skills: what are residents missing? CJEM. 2013;15:241–248. doi: 10.2310/8000.2013.130897.
22. Boet S, Borges BC, Naik VN, et al. Complex procedural skills are retained for a minimum of 1 yr after a single high-fidelity simulation training session. Br J Anaesth. 2011;107(4):533–539.
23. Kuduvalli PM, Jervis A, Tighe SQ, Robin NM. Unanticipated difficult airway management in anaesthetised patients: a prospective study of the effect of mannequin training on management strategies and skill retention. Anaesthesia. 2008;63:364–369. doi: 10.1111/j.1365-2044.2007.05353.x.
24. Moser DK, Coleman S. Recommendations for improving cardiopulmonary resuscitation skills retention. Heart Lung. 1992;21:372–380.
25. Wayne DB, Siddall VJ, Butter J, et al. A longitudinal study of internal medicine residents' retention of advanced cardiac life support skills. Acad Med. 2006;81:S9–S12. doi: 10.1097/00001888-200610001-00004.
26. Ahya SN, Barsuk JH, Cohen ER, Tuazon J, McGaghie WC, Wayne DB. Clinical performance and skill retention after simulation-based education for nephrology fellows. Semin Dial. 2012;25:470–473. doi: 10.1111/j.1525-139X.2011.01018.x.
27. Arthur W Jr, Bennett W Jr, Stanush PL, McNelly TL. Factors that influence skill decay and retention: a quantitative review and analysis. Hum Perform. 2009;11:57–101. doi: 10.1207/s15327043hup1101_3.
28. Barsuk JH, Cohen ER, McGaghie WC, Wayne DB. Long-term retention of central venous catheter insertion skills after simulation-based mastery learning. Acad Med. 2010;85:S9–S12. doi: 10.1097/ACM.0b013e3181ed436c.
29. Rohrer D, Taylor K, Pashler H, Wixted JT, Cepeda N. The effect of overlearning on long-term retention. 2005. p. 361–374.
30. Stefanidis D, Korndorffer JR Jr, Sierra R, Touchard C, Dunne JB, Scott DJ. Skill retention following proficiency-based laparoscopic simulator training. Surgery. 2005;138:165–170. doi: 10.1016/j.surg.2005.06.002.
31. Frerk C, Mitchell VS, McNarry AF, et al. Difficult Airway Society 2015 guidelines for management of unanticipated difficult intubation in adults. Br J Anaesth. 2015;115:827–848. doi: 10.1093/bja/aev371.
32. Kovacs G, Sowers N. Airway management in trauma. Emerg Med Clin North Am. 2018;36:61–84. doi: 10.1016/j.emc.2017.08.006.
33. Braude D, Webb H, Stafford J, et al. The bougie-aided cricothyrotomy. Air Med J. 2009;28:191–194. doi: 10.1016/j.amj.2009.02.001.
34. Hill C, Reardon R, Joing S, Falvey D, Miner J. Cricothyrotomy technique using gum elastic bougie is faster than standard technique: a study of emergency medicine residents and medical students in an animal lab. Acad Emerg Med. 2010;17:666–669. doi: 10.1111/j.1553-2712.2010.00753.x.
35. Nakstad AR, Bredmose PP, Sandberg M. Comparison of a percutaneous device and the bougie-assisted surgical technique for emergency cricothyrotomy: an experimental study on a porcine model performed by air ambulance anaesthesiologists. Scand J Trauma Resusc Emerg Med. 2013;21:59. doi: 10.1186/1757-7241-21-59.
36. Reardon R, Joing S, Hill C. Bougie-guided cricothyrotomy technique. Acad Emerg Med. 2010;17:225. doi: 10.1111/j.1553-2712.2009.00638.x.
37. Dubrowski A, MacRae H. Randomised, controlled study investigating the optimal instructor:student ratios for teaching suturing skills. Med Educ. 2006;40:59–63. doi: 10.1111/j.1365-2929.2005.02347.x.
38. Dharamsi A, Gray S, Hicks C, Sherbino J, McGowan M, Petrosoniak A. Bougie-assisted cricothyroidotomy: Delphi-derived essential steps for the novice learner. CJEM. 2018;21(2):283–290.
39. Friedman Z, You-Ten KE, Bould MD, Naik V. Teaching lifesaving procedures: the impact of model fidelity on acquisition and transfer of cricothyrotomy skills to performance on cadavers. Anesth Analg. 2008;107:1663–1669. doi: 10.1213/ane.0b013e3181841efe.
40. Norman G, Monteiro S, Salama S. Sample size calculations: should the emperor's clothes be off the peg or made to measure? BMJ. 2012;345:e5278. doi: 10.1136/bmj.e5278.
41. Naik VN, Matsumoto ED, Houston PL, et al. Fiberoptic orotracheal intubation on anesthetized patients: do manipulation skills learned on a simple model transfer into the operating room? Anesthesiology. 2001;95:343–348. doi: 10.1097/00000542-200108000-00014.
42. Huang GC, McSparron JI, Balk EM, et al. Procedural instruction in invasive bedside procedures: a systematic review and meta-analysis of effective teaching approaches. BMJ Qual Saf. 2016;25:281–294. doi: 10.1136/bmjqs-2014-003518.
43. Friederichs H, Marschall B, Weissenstein A. Simulation-based mastery learning in medical students: skill retention at 1-year follow up. Med Teach. 2018:1–8.
44. McGaghie WC, Draycott TJ, Dunn WF, Lopez CM, Stefanidis D. Evaluating the impact of simulation on translational patient outcomes. Simul Healthc. 2011;6(Suppl):S42–S47. doi: 10.1097/SIH.0b013e318222fde9.