Journal of Advances in Medical Education & Professionalism. 2018 Jan;6(1):1–5.

Clinical skills temporal degradation assessment in undergraduate medical education

JOSEPH FISHER 1*, REBECCA VISCUSI 2, ADAM RATESIC 1, CAMERON JOHNSTONE 1, ROSS KELLEY 1, ANGELA M TEGETHOFF 1, JESSICA BATES 3, ELAINE H SITU-LACASSE 3, WILLIAM J ADAMAS-RAPPAPORT 2, RICHARD AMINI 3
PMCID: PMC5757151  PMID: 29344523

Abstract

Introduction:

Medical students’ ability to learn clinical procedures and competently apply these skills is an essential component of medical education. Complex skills with limited opportunity for practice have been shown to degrade without continued refresher training. To our knowledge, no study has objectively evaluated temporal degradation of clinical skills in undergraduate medical education. The purpose of this study was to evaluate temporal retention of clinical skills among third-year medical students.

Methods:

This was a cross-sectional study conducted at four separate time intervals in the cadaver laboratory at a public medical school. Forty-five novice third-year medical students were evaluated for retention of skills in three procedures: pigtail thoracostomy, femoral line placement, and endotracheal intubation. Prior to the start of third-year clerkships, medical students participated in a two-hour didactic session designed to teach clinically relevant material, including these procedures. Prior to the start of their respective surgery clerkships, students were asked to perform the same three procedures and were evaluated for retention by trained emergency medicine and surgery faculty using three validated checklists. Students were then reassessed at six-week intervals in four separate groups based on the start date of their respective surgical clerkships. We compared the evaluation results of students tested one week after training with those of students tested at three later dates, using a one-tailed Wilcoxon Mann-Whitney U-test for non-parametric rank-sum analysis to identify statistically significant differences in score distribution.

Results:

Retention rates showed a statistically significant decline between six and 12 weeks for all three procedural skills.

Conclusion:

In the instruction of medical students, skill degradation should be considered when teaching complex technical skills. Based on the statistically significant decline in procedural skills noted in our investigation, instructors should consider administering a refresher course between six and twelve weeks from the initial training.

Keywords: Clinical skills, Assessment, Medical education

Introduction

Medical students interested in procedure-based medical specialties such as emergency medicine (EM) often choose to improve their procedural skills in high-fidelity settings such as simulation laboratories. The clinical years of medical school are often marked by a transition from theory to practice, where opportunities for kinesthetic skill development are first presented. It is established that the degree of technical proficiency is highly dependent on sustained deliberate practice (1,2), which often leads to clinical competency (3). The traditional model of “see one, do one, teach one” has been deemed dangerous for invasive or high-risk procedures, prompting a shift to simulation-based training (4-6) and limiting opportunities for practice. In addition, without the opportunity for practice, operator confidence in procedural skills has been shown to degrade, leading to skill atrophy (7,8).

Theoretical models have produced forgetting curves that indicate a decaying rate of retention over time similar to an inverse power function, with the rate of decay slowing over time (7,9). Formalized training and interval refreshers have been shown to positively impact skill retention (10,11). However, current research is deficient in examining the long-term durability of procedural skills acquired through simulation-based training, and there is no accepted time frame for interval refreshers (12). To our knowledge, there is no current measure of when temporal decay is most rapid for a technical skill, or of how this differs among the variety of procedural skills learned in medical school. The objective of this study was to evaluate temporal retention of endotracheal intubation, pigtail thoracostomy, and ultrasound-guided femoral line placement among third-year medical students, with the hypothesis that the selected skills would deteriorate at similar rates without refresher training.
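As an illustration only (the notation below is ours and is not drawn from references 7 or 9), such a power-law forgetting curve might be written as:

```latex
% Illustrative power-law retention model (hypothetical notation, not from the cited sources):
%   R(t) = retention at time t after training, R_0 = retention immediately after training,
%   b > 0 = decay parameter
\[
  R(t) = R_0 (1 + t)^{-b},
  \qquad
  \frac{dR}{dt} = -\, b R_0 (1 + t)^{-(b+1)},
\]
% so losses are steepest immediately after training and the rate of decay slows as t grows.
```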

Methods

Study design and setting

This project was a single-center cross-sectional study conducted at four separate time intervals in the cadaver laboratory at a public medical school. This study was reviewed and approved by the institutional review board. The study participants were third-year medical students without any formal training in endotracheal intubation, pigtail thoracostomy, or ultrasound-guided femoral line placement. Selection was based on the start date of the students’ surgical clerkships; however, participation was voluntary and performance was not used to determine clinical grades. The study was conducted between June 2015 and November 2015.

Study protocol

Prior to the beginning of third-year clerkship rotations, students at our institution take part in a week of non-traditional classroom education, where skills necessary for success in clinical rotations are taught. A portion of this training takes place in the cadaver laboratory, where students were afforded opportunities to practice the following clinically relevant procedures: femoral line placement, endotracheal intubation, and pigtail thoracostomy. Initial training of the clinical skills involved two hours of formalized hands-on instruction directed by surgical and emergency medicine faculty and senior residents. During the formalized instruction, the participants were separated into groups of three to four students with one instructor. The student groups then rotated through the clinical skills lab. All participants were given the opportunity to practice the clinical skills under direct supervision, allowing immediate feedback for skill development. The participants were also provided additional time for self-controlled practice. Prior to leaving the lab, all students demonstrated successful completion of each procedure to one of the instructors by achieving a score of 100% on previously validated checklists created by surgical and emergency medicine faculty (4,13).
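As a small illustration only, a checklist-based pass criterion like the one described above (a score of 100% on a validated checklist before leaving the lab) might be modeled as follows; the procedure name and item count are hypothetical, and the actual validated checklist items are those cited in references 4 and 13.

```python
# Hypothetical sketch of checklist-based scoring; this is not the actual
# instrument from references 4 and 13, which define the real validated items.
from dataclasses import dataclass, field


@dataclass
class ChecklistResult:
    procedure: str
    items_completed: list = field(default_factory=list)  # one bool per checklist item

    @property
    def score(self) -> int:
        """Number of checklist items performed correctly (Table 1 reports scores out of 10)."""
        return sum(self.items_completed)

    def passed_initial_training(self) -> bool:
        """Initial training required 100% of checklist items before leaving the lab."""
        return self.score == len(self.items_completed)


# Example: completing 9 of 10 hypothetical items does not meet the 100% criterion.
result = ChecklistResult("endotracheal intubation", [True] * 9 + [False])
print(result.score, result.passed_initial_training())  # -> 9 False
```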

Assessment

One week later, and every six weeks subsequently (at the end of each surgical clerkship orientation day), the students were offered an opportunity to demonstrate retention of skills in the above three procedures. The participants’ procedural knowledge was assessed using the same previously validated checklists (4,13). A total of 8 students were tested for retention of clinical skills to determine baseline recall at one week post training. Subsequently, three additional groups were tested at six, 12, and 18 weeks.

Data analysis

A one-tailed Wilcoxon Mann-Whitney U-test for non-parametric samples was used to assess whether the score distributions of the follow-up groups differed significantly from those of the immediate post-training group, based on comparison of the rank-sum test statistic. We report median scores and interquartile ranges for the baseline and follow-up evaluation groups, along with p-values for the Mann-Whitney U rank-sum comparisons (Table 1). We selected a one-tailed test because we were interested only in temporal degradation of student knowledge and skills, as reflected by declines in standardized evaluation scores, and did not expect any increase in skills over the time intervals selected. Our null hypothesis was that scores did not decline at later evaluation dates compared with the initial group, who were assessed within one week of initial learning. All analyses were performed using SAS statistical software (version 9.4; SAS Institute Inc., Cary, NC, USA).
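As a minimal sketch of this comparison (the original analysis was performed in SAS 9.4, and the score vectors below are hypothetical placeholders rather than the study data), the one-tailed Mann-Whitney comparison and the median/IQR summaries reported in Table 1 could be computed as follows:

```python
# Minimal sketch (Python/SciPy) of the one-tailed Wilcoxon Mann-Whitney comparison
# described above. Score vectors are hypothetical, not the study data.
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical checklist scores (out of 10) for two assessment groups.
baseline_scores = np.array([9, 8, 9, 10, 8, 7, 9, 8])    # within one week of training
followup_scores = np.array([7, 6, 8, 5, 7, 6, 7, 8, 6])  # e.g., a 12-week group


def median_iqr(scores):
    """Median and interquartile range, the summaries reported in Table 1."""
    q1, q3 = np.percentile(scores, [25, 75])
    return float(np.median(scores)), float(q3 - q1)


# One-sided alternative: baseline scores are stochastically greater than the
# follow-up scores (i.e., skills have degraded). The exact method mirrors the
# exact-test p-values of Table 1, though it applies no correction for ties.
u_stat, p_value = mannwhitneyu(baseline_scores, followup_scores,
                               alternative="greater", method="exact")

print("baseline median (IQR):", median_iqr(baseline_scores))
print("follow-up median (IQR):", median_iqr(followup_scores))
print(f"U = {u_stat:.1f}, one-sided p = {p_value:.4f}")
```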

Table 1.

Median assessment scores and interquartile ranges for baseline and follow-up evaluation groups

Endotracheal intubation
Evaluation date Median score (IQR)1 P-value vs. within one week after course group2
Within one week after course (n=8) 8.5 (2)
Six weeks after course (n=13) 9 (2.5) 0.2887
12 weeks after course (n=15) 7 (3) 0.0002
18 weeks after course (n=9) 5 (4) 0.0058
Pigtail thoracostomy
Evaluation date Median score (IQR)1 P-value vs. within one week after course group2
Within one week after course (n=5) 8 (1)
Six weeks after course (n=13) 7 (2.5) 0.2465
12 weeks after course (n=15) 3 (3) 0.0097
18 weeks after course (n=8) 2 (4.5) 0.0109
Central line insertion
Evaluation date Median score (IQR)1 P-value vs. within one week after course group2
Within one week after course (n=7) 7 (2)
Six weeks after course (n=13) 8 (1) 0.0175
12 weeks after course (n=15) 5 (3) 0.0140
18 weeks after course (n=9) 3 (3.5) 0.0066
1 Median score out of 10; interquartile range for test scores displayed in parentheses.
2 One-sided Mann-Whitney U exact test P value.

Results

Forty-five third-year medical students participated in the skills retention study, and all participants were included in the results. Participants had an average age of 26±4.1 years; 52% were men and 48% were women. Based on the Wilcoxon Mann-Whitney U-test, there was a statistically significant decline in clinical skills between six and 12 weeks for all three clinical skills (Table 1). There was no difference in score distribution between the baseline group and the six-week group for pigtail thoracostomy and endotracheal intubation (Table 1). A graphical representation of the decline in skills is shown in Figure 1, where a sharp decline is noted for all three procedures after week six.

Figure 1. Mean assessment score curve. Y axis: mean assessment score (out of 100%); X axis: weeks post initial skill training.

Discussion

Simulation-based hands-on training has become a standard of instruction for introducing new procedures to medical students and postgraduate residents (14). This pragmatic shift in training necessitates that we place greater emphasis on the longevity and durability of a learned skill, rather than on the more commonly performed analysis of immediate recall or immediate improvement (15,16). It is pivotal that education and assessment sessions for complex skills incorporate objective, validated, evidence-based techniques. In our study, the participants were ideal candidates because pre-clinical medical students are procedurally naïve. Furthermore, the procedures chosen (endotracheal intubation, femoral line placement, and pigtail thoracostomy) are high-yield and necessary skills for various specialties.

Previous investigations by Issenberg et al. identified three factors which lead to maximized learning from simulation sessions: repetitive and active educational experiences, feedback, and embedding the education into the curriculum (17). In our study, we developed three hands-on learning sessions, designed to capitalize on Issenberg’s concepts. Students were divided into small groups with experienced instructors at a 3:1 or 4:1 ratio in order to maximize repetition, active learning, and individualized feedback. Furthermore, the simulation lab developed in this study was embedded into the third year curriculum. Improving upon previous research, our study approximated the reality of a clinical setting by utilizing fresh cadavers.

Previous research has attempted to evaluate decay of procedural knowledge for various EM-related skills; however, there are limited data supporting any specific interval for assessment of such decay or an accepted recommendation for interval refreshers (7,8,19). To our knowledge, our study is the first to evaluate students for retention at intervals as short as six weeks. Reed et al. evaluated retention at time intervals of one to nine months post educational intervention, while Kovacs et al. evaluated retention at 16, 25, and 40 weeks (10,19). Interestingly, Reed et al. found retention of six basic EM skills to be present as long as nine months post educational intervention. While our study demonstrates retention of skills, it also shows a statistically significant decline between six and 12 weeks. It is possible that the prolonged retention found in the Reed study was influenced by the simplicity of the skills chosen. Alternatively, it may result from the frequency with which these skills (such as cardiopulmonary resuscitation) are encountered during an EM rotation, which serves as a “built-in” refresher.

While our data appear to contradict the study performed by Kovacs et al., it is likely that our more frequent assessment intervals account for this difference. Kovacs et al. concluded that endotracheal intubation skills decline at 16 weeks; however, 16 weeks was their first interval of assessment (10,20). Our data demonstrated a decline in endotracheal intubation skill between six and 12 weeks.

Limitations

Our model creates a lifelike training simulation for the three procedures chosen; however, not every medical center will have a well-supported Willed Body Program that provides donated cadavers for medical education. There was a statistically significant improvement between the baseline population and the six-week population in performance of femoral line placement, which may be a result of the procedure’s simplicity or of confounding educational experiences during the surgical clerkship. Additionally, our educational model relies on instructor time and dedication for successful implementation.

Conclusion

In the instruction of medical students, skill degradation should be considered when teaching complex technical skills. Based on the statistically significant decline in procedural skills noted in our investigation, instructors should consider administering a refresher course between six and twelve weeks from the initial training.

Footnotes

Conflict of interests: None declared.

References

1. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 Suppl):S70–81. doi: 10.1097/00001888-200410001-00022.
2. Wang EE. The role of simulation in procedural skill acquisition. Acad Emerg Med. 2008;15(11):1046–57. doi: 10.1111/j.1553-2712.2008.00218.x.
3. Promes SB, Chudgar SM, Grochowski CO, Shayne P, Isenhour J, Glickman SW, et al. Gaps in procedural experience and competency in medical school graduates. Acad Emerg Med. 2009;16(s2):S58–S62. doi: 10.1111/j.1553-2712.2009.00600.x.
4. Miller R, Ho H, Ng V, Tran M, Rappaport D, Rappaport WJ, et al. Introducing a fresh cadaver model for ultrasound-guided central venous access training in undergraduate medical education. Western Journal of Emergency Medicine. 2016;17(3):362–6. doi: 10.5811/westjem.2016.3.30069.
5. Vozenilek J, Huff JS, Reznek M, Gordon JA. See one, do one, teach one: advanced technology in medical education. Acad Emerg Med. 2004;11(11):1149–54. doi: 10.1197/j.aem.2004.08.003.
6. Kneebone R. Evaluating clinical simulations for learning procedural skills: a theory-based approach. Acad Med. 2005;80(6):549–53. doi: 10.1097/00001888-200506000-00006.
7. Amini R, Stolz LA, Hernandez NC, Gaskin K, Baker N, Sanders AB, et al. Sonography and hypotension: a change to critical problem solving in undergraduate medical education. Adv Med Educ Pract. 2016;7:7–13. doi: 10.2147/AMEP.S97491.
8. Pusic MV, Kessler D, Szyld D, Kalet A, Pecaric M, Boutis K. Experience curves as an organizing framework for deliberate practice in emergency medicine learning. Acad Emerg Med. 2012;19(12):1476–80. doi: 10.1111/acem.12043.
9. Jaber YM, Bonney M. Production breaks and the learning curve: the forgetting phenomenon. Applied Mathematical Modelling. 1997;21:523–31.
10. Kovacs G, Bullock G, Ackroyd-Stolarz S, Cain E, Petrie D. A randomized controlled trial on the effect of educational interventions in promoting airway management skill maintenance. Ann Emerg Med. 2000;36(4):301–9. doi: 10.1067/mem.2000.109339.
11. Bosse HM, Mohr J, Buss B, Krautter M, Weyrich P, Herzog W, et al. The benefit of repetitive skills training and frequency of expert feedback in the early acquisition of procedural skills. BMC Medical Education. 2015;15(1):1. doi: 10.1186/s12909-015-0286-5.
12. Lynagh M, Burton R, Sanson-Fisher R. A systematic review of medical skills laboratory training: where to from here? Med Educ. 2007;41(9):879–87. doi: 10.1111/j.1365-2923.2007.02821.x.
13. Saman N, Kaplan S, Christopher M, Ho H, Alvarado J, Viscusi R, et al. Introduction of a fresh cadaver laboratory during the surgery clerkship improves emergency technical skills. Am J Surg. 2015;210(2):401–3. doi: 10.1016/j.amjsurg.2015.01.018.
14. McGaghie WC, Draycott TJ, Dunn WF, Lopez CM, Stefanidis D. Evaluating the impact of simulation on translational patient outcomes. Simul Healthc. 2011;6(Suppl):S42–S47. doi: 10.1097/SIH.0b013e318222fde9.
15. Amini R, Stolz LA, Gross A, O'Brien K, Panchal AR, Reilly K, et al. Theme-based teaching of point-of-care ultrasound in undergraduate medical education. Intern Emerg Med. 2015;10(5):613–8. doi: 10.1007/s11739-015-1222-8.
16. Hoyer R, Means R, Robertson J, Rappaport D, Schmier C, Jones T, et al. Ultrasound-guided procedures in medical education: a fresh look at cadavers. Intern Emerg Med. 2016;11(3):431–6. doi: 10.1007/s11739-015-1292-7.
17. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27(1):10–28. doi: 10.1080/01421590500046924.
18. Ericsson KA. Deliberate practice and acquisition of expert performance: a general overview. Acad Emerg Med. 2008;15(11):988–94. doi: 10.1111/j.1553-2712.2008.00227.x.
19. Reed T, Pirotte M, McHugh M, Oh L, Lovett S, Hoyt AE, et al. Simulation-based mastery learning improves medical student performance and retention of core clinical skills. Simul Healthc. 2016;11(3):173–80. doi: 10.1097/SIH.0000000000000154.
20. Lateef FL. Simulation-based learning: just like the real thing. J Emerg Trauma Shock. 2010;3(4):348–52. doi: 10.4103/0974-2700.70743.
