Abstract
INTRODUCTION
Direct observation of medical students' clinical skills is important, but occurs infrequently. The mini-clinical evaluation exercise (mCEX) is a tool developed for use with internal medicine (IM) residents that can be used to promote direct observation of medical students' clinical skills. It is unknown how many IM core clerkships in the United States use the mCEX or how it has been implemented.
METHODS
Questions about use of the mCEX were incorporated into an online annual survey distributed to the 114 IM clerkships belonging to Clerkship Directors in Internal Medicine, a national organization of individuals responsible for teaching IM to medical students.
RESULTS
The survey response rate was 83%. Twenty-eight percent (N = 27) of respondents use the mCEX in their clerkship. The mean number of required mCEX encounters is 2.3 (SD 1.6). The mCEX is used for formative assessment (68%) more than summative assessment (11%). Ward attendings are the most common mCEX evaluators (72%).
DISCUSSION
The mCEX is being used to promote direct observation of medical students' clinical skills in a significant minority of IM core clerkships. The mCEX is 1 tool for facilitating feedback from both faculty and residents on trainees' developing skills.
Keywords: clinical skills, undergraduate medical education, medical students, feedback, clinical competence
Medical educators are increasingly emphasizing the assessment of medical students' clinical skills.1 Such evaluation necessitates direct observation of students' skills in history taking, physical examination, and communication so that timely, specific feedback can be used to promote growth and correct deficiencies.2 Nonetheless, the unfortunate reality remains that students are infrequently observed at the bedside with patients.3–5 During any given core clerkship, 17% to 39% of medical students report that no faculty member observed them performing a physical examination.6
Medical educators have long struggled to find tools to augment and structure observation of students' skills. Although many clerkships use standardized patients and objective structured clinical examinations for assessment,6 it has been suggested that standardized patient experiences should supplement, but not replace, observed encounters with actual patients.2
Prior research has demonstrated that the American Board of Internal Medicine's mini-clinical evaluation exercise (mCEX) is a reliable, valid means of assessing residents' clinical skills during actual patient encounters.7–9 One study has shown that the mCEX is also a reliable and valid means of assessing students' clinical skills.11 When the mCEX is used with students, an evaluator, typically a faculty member or resident, observes a student conducting a focused history and physical examination, rates the student's performance, and provides feedback. Despite the availability of the mCEX, documented experience with internal medicine (IM) clerkship students is limited to a few medical schools.10,11 It is unknown how many IM core clerkships use the mCEX for student assessment and how it has been implemented. The purpose of this study was to determine how the mCEX is being used in IM clerkships nationally.
METHODS
Clerkship Directors in Internal Medicine (CDIM), a national organization of individuals responsible for teaching IM to medical students, conducts an annual survey of its membership. In May 2004, the 114 IM clerkships belonging to CDIM (not all IM clerkships belong to CDIM) received the online annual CDIM survey that included questions about the mCEX (questionnaire available online). Nonresponders were contacted once. Clerkship representatives, typically the clerkship director (CD), were asked whether they currently use the mCEX in their clerkship, and if so, the number of required encounters, whether they modified the form to make it more appropriate for students, which individuals serve as mCEX evaluators, and how evaluation results are used. Additionally, CDs were asked whether there are any other methods of required bedside observation of students' clinical skills with actual patients (not standardized patients) by faculty or residents in the core medicine rotation. The Institutional Review Boards at the University of Pennsylvania and the University of California San Francisco approved the study.
RESULTS
Ninety-five (83%) surveys were returned. Twenty-eight percent of respondents (N = 27) reported using the mCEX in their clerkship. The mean number of mCEX encounters required per student during the IM clerkship was 2.3 (SD 1.6). Respondents reported that ward attendings (72%), outpatient attendings (46%), and residents (46%) were the most common evaluators. However, CDs or clerkship site directors (25%) and interns (25%) were also reported to be mCEX evaluators, as were teaching attendings (25%), defined in the survey as “faculty without primary clinical responsibilities who meet with clerkship students for teaching.” Sixty-eight percent of those who use the mCEX indicated that the mCEX was used only for formative assessment, whereas 11% used the mCEX only for summative assessment. The remainder (21%) used the mCEX for both formative and summative assessment. For those clerkships using the mCEX for summative assessment, the percent contribution of the mCEX to the final clerkship grade was small (mean 1.4%, SD 3, range 0 to 10). Thirty percent (N = 8) of mCEX users indicated that they modified the standard mCEX form. Examples of modifications included adding a section on oral case presentations (N = 1), decreasing the number of competencies (N = 2), changing rating scale descriptors or adding behavioral anchors (N = 2), and simplifying language on the card (N = 1). Of the 87 respondents who answered the question regarding other methods of required observation of students' skills, only 14 reported using other methods of required bedside observation of students' skills with actual patients. Of these, 10 were clerkships that were also using the mCEX.
DISCUSSION
This is the first study describing mCEX use with third-year medical students in multiple IM clerkships. We found that a significant minority of IM clerkships use the mCEX. In contrast to the literature describing structured observations in the ambulatory setting,12,13 our finding that inpatient and outpatient attendings as well as house officers are serving as mCEX evaluators indicates that the mCEX is being applied in the varied clinical settings typical of IM clerkships.
Our results demonstrate that IM clerkships use the mCEX primarily for feedback. Use of the mCEX as a tool for structured feedback about trainees' clinical skills is important for trainees' development as physicians because the specific feedback that occurs with direct observation is a critical component of deliberate practice, a learning strategy believed to maximize skill improvement through repetition and coaching.14,15 Additionally, direct observation improves the reliability and validity of clinical preceptors' summative assessments of students,16 although the small number of mCEX encounters per student reported by our study population suggests that reliability would be low.
This study has limitations. First, the fact that not all IM clerkships belong to CDIM, coupled with the 17% survey nonresponse rate, introduces the possibility of bias. Second, differences between mCEX users and nonusers were not investigated, limiting our understanding of the feasibility of mCEX use in various institutional settings; we do not have data on why the mCEX has been adopted at certain institutions but not at others. Third, this study describes mCEX use only in IM clerkships. Methods for structured observation of clinical encounters in other disciplines, particularly pediatrics and ambulatory settings, have been reported.12,13 How these different tools for structured observation compare with the mCEX is unknown, and more research is needed in this area. Such information will be important, given that our data suggest that the majority of clerkships not using the mCEX have no other method of required bedside observation of students' skills with actual patients. Finally, this study does not address whether mCEX use ultimately helps students improve their skills.
Practical, efficient systems that can help busy faculty structure their observations of students are imperative in clinical education.17 Our findings demonstrate that the mCEX is being used to facilitate observation and feedback in inpatient and ambulatory components of some IM clerkships. As some clerkship directors have modified the mCEX for use with students, it will be important to evaluate any effect that these changes may have had on the tool's reliability and validity. Encouraging use of a standardized mCEX form for students in IM clerkships could facilitate rigorous multi-institution research on mCEX use in undergraduate medical education.
Acknowledgments
We gratefully acknowledge the contributions of Paul Hemmer, MD, Kathleen Kerr, and Judy Shea, PhD.
Supplementary Material
The following supplementary material is available for this article online at www.blackwell-synergy.com
REFERENCES
1. Liaison Committee on Medical Education. LCME accreditation standards. Available at: http://www.lcme.org/fucntionslist.htn. Accessed January 2005.
2. Holmboe ES. Faculty and observation of trainees' clinical skills: problems and opportunities. Acad Med. 2004;79:16–22. doi: 10.1097/00001888-200401000-00006.
3. Kassebaum DG, Eaglen RH. Shortcomings in the evaluation of students' clinical skills and behaviors in medical school. Acad Med. 1999;74:842–9. doi: 10.1097/00001888-199907000-00020.
4. Kernan WN, Holmboe E, O'Connor PG. Assessing the teaching behaviors of ambulatory care preceptors. Acad Med. 2004;79:1089–94. doi: 10.1097/00001888-200411000-00017.
5. Howley LD, Wilson WG. Direct observation of students during clerkship rotations: a multiyear descriptive study. Acad Med. 2004;79:276–80. doi: 10.1097/00001888-200403000-00017.
6. Association of American Medical Colleges. Medical School Graduation Questionnaire: all schools report. 2004. Available at: http://www.aamc.org/data/gq/allschoolsreports/2004.pdf. Accessed January 2005.
7. Norcini JJ, Blank LL, Arnold GK, Kimball HR. The mini-CEX (clinical evaluation exercise): a preliminary investigation. Ann Intern Med. 1995;123:795–9. doi: 10.7326/0003-4819-123-10-199511150-00008.
8. Norcini JJ, Blank LL, Duffy FD, Fortna GS. The mini-CEX: a method for assessing clinical skills. Ann Intern Med. 2003;138:476–81. doi: 10.7326/0003-4819-138-6-200303180-00012.
9. Durning SJ, Cation LJ, Markert RJ, Pangaro LN. Assessing the reliability and validity of the mini-clinical evaluation exercise for internal medicine training. Acad Med. 2002;79:900–4. doi: 10.1097/00001888-200209000-00020.
10. Hauer KE. Enhancing feedback to students using the mini-CEX (clinical evaluation exercise). Acad Med. 2000;75:524. doi: 10.1097/00001888-200005000-00046.
11. Kogan JR, Bellini LM, Shea JA. Feasibility, reliability, and validity of the mini-clinical evaluation exercise (mCEX) in a medicine core clerkship. Acad Med. 2003;78:S33–5. doi: 10.1097/00001888-200310001-00011.
12. Lane JL, Gottlieb RP. Structured clinical observations: a method to teach clinical skills with limited time and financial resources. Pediatrics. 2000;105(Pt 2):93–7.
13. Greenberg LW. Medical students' perceptions of feedback in a busy ambulatory setting: a descriptive study using a clinical encounter card. South Med J. 2004;97:1174–8. doi: 10.1097/01.SMJ.0000136228.20193.01.
14. Rolfe IE, Sanson-Fisher RW. Translating learning principles into practice: a new strategy for learning clinical skills. Med Educ. 2002;36:345–52. doi: 10.1046/j.1365-2923.2002.01170.x.
15. Issenberg SB, McGaghie WC. Clinical skills training-practice makes perfect. Med Educ. 2002;36:210–11. doi: 10.1046/j.1365-2923.2002.01157.x.
16. Hasnain M, Connell KJ, Downing SM. Toward meaningful evaluation of clinical competence: the role of direct observation in clerkship ratings. Acad Med. 2004;79:S21–4. doi: 10.1097/00001888-200410001-00007.
17. Watson RT. Rediscovering the medical school. Acad Med. 2003;78:659–65. doi: 10.1097/00001888-200307000-00002.