Medscape General Medicine. 2005 May 24;7(2):76.

A Critique of the USMLE Clinical Skills Examination

Nupur P. Mehta, Daniel B. Kramer

At our institution, all medical students are now required to take and pass the United States Medical Licensing Examination Step 2 Clinical Skills (CS) exam prior to graduation, and many of us have now received our “Pass” CS score report. But unlike passing Step 1 and Step 2 Clinical Knowledge (CK), receiving a passing score report on the CS only heightened our frustration with the entire experience. The CS has become a licensure requirement for all doctors graduating from medical school in 2005 or later. A similar exam has been required since 1998 for foreign medical graduates to ensure a minimum proficiency in patient interaction and English communication, as part of Educational Commission for Foreign Medical Graduates (ECFMG) certification.[1] In its current form, the CS exam claims to use “standardized patients to test medical students and graduates on their ability to gather information from patients, perform physical examinations, and communicate their findings to patients and colleagues.”[2] As its inaugural subjects, we suffered with our colleagues around the country through all the trains and planes, rumors and rituals, pompous rhetoric, and laughable acting. All this, now combined with the stunningly inadequate feedback, confirms our prior suspicions about an entirely dubious enterprise.

Consider the CS score report, which includes no information about areas of weakness or strength, even in crudely defined categories such as thoroughness of history taking, physical examination skills, or formulation of a differential diagnosis. The report simply implies that we have met a bare minimum requirement without providing any further information. This paucity of feedback underscores one of the critical inadequacies of this examination, only briefly explored in previously published discussions.[3,4] The 3 score categories reported – Integrated Clinical Encounter, Communication and Interpersonal Skills, and Spoken English Proficiency – are hopelessly broad, particularly when compared with the structured categories of analysis offered by the Step 1 and Step 2 CK exams.[5] In those 2 exams, the score report, pass or fail, provides detailed performance information for the specific subjects and disciplines covered on the test (internal medicine, psychiatry, biochemistry, etc). In comparison, the CS score report for the nearly 97% of test takers expected to pass the exam[6] provides no helpful evaluation or feedback of any kind. Thus, it remains entirely unclear what students, medical schools, residency programs, or the general public ought to conclude about this exam or the students who pass it.

The shortcomings of the CS feedback are even more evident when we compare it with the Objective Structured Clinical Examinations (OSCEs) currently held, voluntarily and at great expense and effort, by some three quarters of US medical schools, with nearly half of these requiring a satisfactory performance for graduation.[7] (Of the remainder, many schools are in the process of creating OSCEs, and others are making them requirements for graduation.) On these exams, the standardized patient and an experienced physician observe every history question, physical exam technique, and treatment formulation. Detailed evaluation and feedback come from the patient and physician observers on each stage and on the overall exercise. This allows students to quickly and accurately address weak areas while also building confidence in those skills in which they are already proficient. The system also allows educators to monitor students' progress and focus on the needs of specific students or entire curricula. Students may find the OSCE experience anxiety-provoking, but evidence suggests that the exercises overall are very well received.[8-12]

In addition to its superior feedback mechanism, the OSCE approaches the stated goals of the CS exam much more rigorously, particularly in the area of the physical exam. An orthopaedist observes your low-back-pain exam; a cardiologist ensures that you can hear the murmur and describe it correctly; a neurologist increases the odds that your motor exam has any chance of eliciting abnormal findings. In our experience, these specialists were quite forthcoming, almost eager, with their constructive criticism. The CS exam elicits the motions of a competent exam, but without an experienced clinician-observer in the room, nothing prevented us on test day from, say, auscultating the scapula. Similarly, the written component of the CS exam compares poorly with the OSCE's on-the-spot presentation of our history, findings, differential diagnosis, and decision making. Laying exam findings bare before the faculty supports the immediate integration of communication and clinical thinking. Indeed, the supervision and evaluation of the OSCE attenuate a common criticism of standardized encounters generally: the artificial feel of “fake” patients. Physician-observers provide a real-time quality-control mechanism in which imitated physical findings or patient questions can be properly qualified and contrasted with genuine experience. Thus, even though both the CS and the OSCE require often unrealistic portrayals of sick patients, the presence of a supervising, evaluating physician in the room makes for a far more worthwhile experience.

We submit that the OSCE precisely captures the supposed virtues of the CS exam, which itself adds little to our education or training at over $1000 per student. To some, this cost may seem minor compared with the overall costs of medical education – recently estimated at $140,000 for public and $225,000 for private schools.[13] We strenuously disagree with this reasoning, however, which substitutes obvious math for serious debate on the attitudes and principles driving modern medical instruction and healthcare generally. Claiming that the CS exam is “just another $1000” fails to address the trends toward higher costs and subsequent barriers to accessing quality graduate medical education. This flippant attitude also ignores the way in which multiple smaller expenses – textbooks, supplies, and student health insurance – add up to increase the average student-debt burden. As with any new intervention offered in healthcare today, the CS exam must justify its expense, whatever the magnitude. In our opinion, it does not.

We therefore applaud those schools already investing in the OSCE and encourage other schools to consider their funding priorities and develop OSCEs of their own. If the public indeed demands this manner of examination, as has been claimed, then perhaps all medical schools should be required by the National Board of Medical Examiners (NBME) to hold 1 or more OSCE exercises, with satisfactory performance necessary for graduation. Holding these exams at each school, rather than at a few centers nationwide, would reduce the inconvenience and expense for students while allowing individual schools to adapt their curricula rapidly on the basis of areas of strength and weakness. Ideally, this could be achieved without passing additional expenses on to students. This approach would keep the burden of creating skilled clinicians within the province of medical schools, where it belongs.

Indeed, what else does the CS exam do but call into question the ability of American medical schools to teach a physician's most fundamental skills? We accept the tedium of written exams in order to guarantee a consistent fund of knowledge across the country; such knowledge is relatively easy to test, and written exams provide helpful feedback. But the challenge of the clinical encounter – earning trust and constructing a story, looking and listening, testing hypotheses and making decisions, explaining and reassuring – cannot possibly be met with this elaborate educational sham. Let us instead earn the public's trust by supporting rigor within medical schools, demanding of ourselves and our teachers a greater commitment to mastering the skills that matter most to our patients.

Contributor Information

Nupur P. Mehta, Harvard Medical School, Boston, Massachusetts.

Daniel B. Kramer, Harvard Medical School, Boston, Massachusetts.

References

1. Educational Commission for Foreign Medical Graduates. Available at: http://www.ecfmg.org. Accessed April 28, 2005.
2. National Board of Medical Examiners. 2005 USMLE Step 2 CS Content Description and General Information Booklet. Philadelphia, Pa: National Board of Medical Examiners; 2005.
3. Papadakis MA. The Step 2 clinical-skills examination. N Engl J Med. 2004;350:1703-1705.
4. Diaz D, Bogdonoff MD, Musco S, et al. The clinical-skills examination. N Engl J Med. 2004;351:507-509.
5. National Board of Medical Examiners. United States Medical Licensing Examination Score Report: USMLE Step 2 CK Performance Profiles. Philadelphia, Pa: National Board of Medical Examiners; 2005.
6. Whelan G. High-stakes medical performance testing: the Clinical Skills Assessment program. JAMA. 2000;283:1748.
7. Barzansky B, Etzel SI. Educational programs in US medical schools, 2003-2004. JAMA. 2004;292:1025-1031.
8. Pierre RB, Wierenga A, Barton M, Branday JM, Christie CD. Student evaluation of an OSCE in paediatrics at the University of the West Indies, Jamaica. BMC Med Educ. 2004;4:22.
9. Yudkowsky R, Alseidi A, Cintron J. Beyond fulfilling the core competencies: an objective structured clinical examination to assess communication and interpersonal skills in a surgical residency. Curr Surg. 2004;61:499-503.
10. Brazeau C, Boyd L, Crosson J. Changing an existing OSCE to a teaching tool: the making of a teaching OSCE. Acad Med. 2002;77:932.
11. Tervo RC, Dimitrievich E, Trujillo AL, Whittle K, Redinius P, Wellman L. The Objective Structured Clinical Examination (OSCE) in the clinical clerkship: an overview. S D J Med. 1997;50:153-156.
12. Walters K, Osborn D, Raven P. The development, validity and reliability of a multimodality objective structured clinical examination in psychiatry. Med Educ. 2005;39:292-298.
13. Morrison G. Mortgaging our future – the cost of medical education. N Engl J Med. 2005;352:117-119.
