Abstract
Assessment drives learning. However, when it comes to high-stakes examinations (e.g., for licensure or certification), these assessments of learning may be seen as unnecessary hurdles by some. Licensing clinical skills assessments in particular have come under fire over the years. Recently, assessments such as the Medical Council of Canada Qualifying Examination Part II, a clinical skills objective structured clinical examination, have been permanently cancelled. The authors explore potential consequences of this cancellation, including those that are unintended and undesirable, and discuss next steps for clinical skills assessment.
Introduction
Assessment drives learning. This assumption has been the topic of many scholarly works but also the subject of controversy.1-4 When it comes to high-stakes examinations (e.g., for licensure or certification), these assessments of learning may be seen as unnecessary hurdles where the aim is to pass rather than to enhance learning.5 Licensing clinical skills assessments have come under fire in recent years, with some stating that these examinations are historical artifacts, that certification requirements suffice for licensure, and that the benefits do not justify the expense, especially when pass rates are high.6-9 In Canada specifically, the timing of the Medical Council of Canada Qualifying Examination Part II (MCCQE Part II) after one year of residency (rather than at the end of medical school) has been criticized for not situating the scenarios within the scope of practice of the candidates.6
On the flip side, there are some strong supporters of these high-stakes examinations. Examinations are seen by many as necessary to ensure the protection of the public through national standards. Examinations are also very powerful drivers of learning.10 In the US, following a petition to eliminate the United States Medical Licensing Examination (USMLE) Step 2 Clinical Skills (CS) examination, clinical skills directors argued that the elimination of this exam would lead to the de-valuing of clinical skills in medical education, the loss of a national standard across candidates, individual schools' inability to maintain robust psychometrics, and a failure to protect the public.11
The COVID-19 pandemic has caused major disruption in high-stakes assessments, especially for clinical skills licensure examinations. The USMLE Step 2 CS examination and the MCCQE Part II, both objective structured clinical examinations (OSCEs), have been permanently cancelled, whereas the National Board of Osteopathic Medical Examiners has suspended its equivalent. Although this has caused some degree of jubilation on the part of candidates no longer having to prepare or pay fees for these examinations, should the greater medical community (including the public) be concerned? Should a demonstration of having attained core national standards for clinical skills be an expectation of physicians?
Core clinical skills assessment in Canada
The Medical Council of Canada (MCC) was established in 1912 through federal legislation to provide a national qualification in medicine that would be acceptable in all provinces. In 1992, a clinical skills examination was added to the traditional written examination (the MCCQE Part I) at the request of the medical regulators.12 The purpose of the clinical skills examination was to assess a candidate's core abilities to apply medical knowledge, demonstrate clinical skills, and demonstrate professional behaviours and attitudes at the level expected of a physician in independent practice in Canada. From 1992 until the end of 2019, all physicians (Canadian and internationally trained) were required to pass both the MCCQE Part I and Part II to obtain the Licentiate of the Medical Council of Canada (LMCC), a standard requirement for licensure in Canada. This requirement is in addition to successful completion of medical training and certification from one of the certifying colleges in Canada.
The MCCQE Part II was based on the MCC Objectives (https://www.mcc.ca/objectives) and assessed skills beyond basic history-taking and physical examination. In 2018, the MCCQE Part II introduced a new blueprint with stations that were more reflective of core skills required of all physicians and with scenarios deemed more authentic by clinicians. It assessed physician activities including assessment and diagnosis of common conditions, management skills and decision making, communication skills (e.g., breaking bad news), and professional behaviours (e.g., dealing with ethical dilemmas) across four dimensions of care (acute, chronic, health promotion/illness prevention, and psychosocial aspects of care).13
The World Health Organization declared COVID-19 a pandemic in March 2020, and the MCC subsequently cancelled the May delivery of the MCCQE Part II due to public health concerns. The MCC then planned to deliver a modified version of the MCCQE Part II in October 2020, incorporating the use of personal protective equipment, physical distancing, enhanced cleaning protocols, and a touchless physical examination. However, this examination was cancelled on short notice when a number of examination sites were unable to fulfill their commitments to deliver the examination amid concerns about rising numbers of COVID-19 cases. There was then an attempt to pivot to a virtual examination in May 2021, but this effort had to be abandoned shortly after its launch because the delivery platform could not be scaled to the large candidate cohort without compromising the validity of the examination. The cancellation of three consecutive examination sessions left thousands of physicians in limbo, unable to apply for licensure without the LMCC credential. The MCC recognized that, with the ongoing pandemic, it was unable to ensure that the backlog of candidates could be tested in a timely fashion. After discussion with medical regulators, the governing council of the MCC made the difficult decision to cancel the MCCQE Part II permanently in June 2021.
Certification examinations for specialty designation, such as those of the College of Family Physicians of Canada (CFPC) and the Royal College of Physicians and Surgeons of Canada, provided regulators some reassurance that clinical skills had been assessed. However, the MCCQE Part II assessed core knowledge, skills, and behaviours expected of all physicians regardless of specialty, and so its cancellation raises some concerns.
In this opinion paper, we explore possible unintended consequences of the cancellation of the MCCQE Part II through its impacts on public safety, learning, and curriculum, and discuss possible next steps. We both bring perspectives shaped by extensive leadership experience in the field of assessment through our academic appointments, research, international collaborations, and affiliation with the MCC.
Impacts on patient safety and the safeguarding of the public
National examinations are an important step in protecting the public. Clinical skills are vital to assess, yet we know that trainees' clinical skills have historically been infrequently observed in practice.14 As to whether these examinations truly protect the public, outcome studies have demonstrated validity evidence for the MCCQE Part II. Lower scores on the MCCQE Part II have been shown to be related to physician behaviours such as college complaints, prescribing practices, and appropriate use of screening.15-17 Most recently, De Champlain et al. showed that physicians who failed the MCCQE Part II on their first attempt were more likely to mis-prescribe opiates and benzodiazepines.18 These studies suggest that scores on national performance-based examinations may predict patterns of behaviour worth monitoring from a licensure perspective in order to ensure proper care of patients.
Although the assessment of clinical skills may improve with the introduction of competency-based medical education (CBME), it is unlikely to be a complete solution. Critics of CBME bemoan the lack of evidence for this pedagogical approach and warn that the tedious documentation associated with CBME may lead to assessments that favour reductionism over holism.19 In addition, a 'failure to fail' culture has been identified as an ongoing problem in medical education, and there is significant evidence that schools struggle to identify and/or remediate trainees with inadequate clinical skills.20,21 Reasons for this range from bureaucratic hurdles to inadequate resourcing and faculty development. As such, it may not be realistic to rely on schools to report or act on unsatisfactory performance even with the introduction of CBME.
Opponents of the MCCQE Part II have argued that high pass rates indicate that these are superfluous assessments and thus the expense is not justified.7,8 In Canada, between 2005 and 2019, the pass rate for the MCCQE Part II ranged from 90% to 97% for Canadian-trained first-time takers and from 55% to 75% for those trained outside of Canada (MCC Annual Reports; www.mcc.ca). The high pass rate for Canadian-trained test takers is most likely due to a combination of clinical skills training during medical school (including frequent OSCEs), ongoing practice of those skills in the workplace during the first year of residency, and examination preparation. However, it is important to note that these high pass rates for graduates of Canadian medical schools still translate into several hundred candidate failures per year. Those who fail then have an incentive to focus on improving their clinical skills. The loss of a national examination will eliminate any signal that something is amiss and may also leave those who would have passed with less motivation to cultivate these skills.
Some might question the need for a clinical skills licensing examination for Canadian physicians when they are also assessed at a national level by certification examinations. However, these latter examinations are not designed to assess the same core competencies as the MCCQE Part II. Because the certification examinations are specialty-specific, there is much heterogeneity and so not all physicians are assessed on the same competencies. Some include a performance-based component, while others do not. Some include a physical examination component, while others do not. Although there may be some overlap in terms of skills assessed, particularly between the CFPC certification examination and the MCCQE Part II, each has its unique blueprint and each serves a unique role.
Impacts on learning
We know that OSCEs promote learning before, during and after testing: knowledge of an impending assessment provides a powerful incentive to study; the very act of taking a test leads directly to learning; and the feedback provided after a test helps examinees focus future learning efforts.22 In other words, assessment helps learners to consolidate their knowledge and clinical skills.
Strong clinical skills are imperative when making accurate diagnoses and caring for patients. The competent clinician gathers pertinent information through history-taking and a physical examination and communicates effectively with patients while upholding the tenets of the profession. Since the 1970s, studies have shown that a well-conducted history and physical examination can lead to an accurate diagnosis more than 70% of the time.14 There is also a relationship between good communication skills, the provision of patient-centered care, and the minimization of diagnostic and other errors.23-25
If we need physicians to master clinical skills, then those skills should be assessed against a national standard prior to licensure. If basic clinical skills are not assessed in high-stakes examinations, they run the risk of being de-valued by learners. The motivation for learning these skills will erode as time spent developing them comes to be viewed as a poor return on investment. With so much to learn in medicine, trainees may focus less on learning clinical skills, and these basic skills risk being lost.
Impacts on curriculum
Assessment not only drives learning, but it can also help to drive curriculum. When Internal Medicine clerkship directors were surveyed after the introduction of the USMLE Step 2 CS, 40% of respondents indicated that their schools had begun placing increased emphasis or curricular time on clinical skills education.11 Because of the external motivation of high-stakes examinations, schools have invested considerable effort in developing excellent clinical skills programs to ensure their students are well prepared. Since then, most schools have implemented their own local OSCEs. In Canada, every medical school includes up to four OSCEs assessing clinical skills in its curriculum. However, Yudkowsky and Szauter question whether schools may infer that the cessation of testing these skills means that they are no longer considered important for licensure.26 Eliminating a national clinical skills examination could lead to a de-emphasis of these skills in medical school curricula and a return to the status quo (i.e., reliance on assessments that focus on basic knowledge).11
Summative clinical skills assessment is a resource-intensive and expensive proposition. Many Canadian medical schools' standardized patient (SP) programs were developed in part because of the MCCQE Part II. The MCC has been actively involved in providing training and support to SP programs for over 20 years, and revenue from the MCCQE Part II and other high-stakes national examinations allowed schools to develop and maintain high-quality SP programs. Without the incentive of the national examination, and with ongoing fiscal constraints, this loss may further compromise medical schools' ability to maintain these valuable programs as teaching and assessment resources.
Although schools currently assess clinical skills through their own OSCEs (and they may continue to do so in the future), the inherent heterogeneity in medical training necessitates the use of a gatekeeper if we wish to maintain a national standard. If we rely on schools to assess clinical skills by developing their own examinations, they may not have the resources to develop adequate content banks, ensure content security, and maintain rigorous psychometric standards.
As a national, high-stakes examination, the MCCQE Part II provided standards of core clinical skills expected of all physicians (Canadian and internationally trained) for the purpose of licensure. It also provided a national benchmark for new programs, such as new medical schools (e.g., the Northern Ontario School of Medicine) and the many regional medical campuses created over the last 20 years. Without this examination, the responsibility for assessing these standards would fall to medical schools and residency programs. Accreditation agencies may play a role in mandating high-quality local clinical skills assessments; however, school-to-school variability in the quality of assessment may not allow for truly standardized assessments at the individual trainee level.
So what’s next?
With the cancellation of the MCCQE Part II, we must actively identify and remedy unintended consequences. It is imperative that we maintain national standards for core clinical skills expected of all physicians, especially as they exit medical school and enter residency training. Improvements to workplace-based assessments using frameworks such as the Association of Faculties of Medicine of Canada Entrustable Professional Activities (EPAs) for the transition from medical school to residency offer some promise for establishing such standards.27 Studies of these core EPAs are emerging and showing some correlation with local OSCEs, but this is far from demonstrating validity evidence at a national level.28 Developing rigorous programs of assessment with frequent observations and use of clinical data may eventually trump any need for point-in-time examinations,29 although the authors suspect that regulators and the public will continue to demand that physicians undergo national high-stakes examinations as a requirement for licensure.
If national standards for clinical skills are valued, then they deserve to be assessed. Emphasis on core diagnostic and management skills, patient-centered communication, professional behaviours, cultural humility and safety, and virtual and collaborative care should all be considered.25,30-32 The design of any future assessment strategy should be aligned with educational and societal needs, must include the ability to provide meaningful feedback so that candidates can learn from the experience, and, finally, should be mindful of cost.9,33,34
Past criticism and recent experience with the COVID-19 pandemic have clearly shown that there is a need to reconsider the way these assessments are done. Reinstating a national clinical skills standard should be considered by medical regulators and educators. The timing of such an examination should also be reconsidered, with demonstration of core skills at the end of medical school rather than during residency training. How such an assessment is administered should depend on what is being assessed and how best to assess it.
Finally, we have identified some potential consequences of cancelling the national clinical skills examination, but what about those consequences that we have not yet considered? How can we effectively study these effects? Can we pivot, as is happening in the US, creating novel opportunities with regional and national clinical skills initiatives?35,36
In summary
Before becoming jubilant about one less hurdle for Canadian physicians to clear in obtaining their licence to practice, we should determine what skills all physicians must attain. Demonstration of these skills to a national standard should be an expectation. Certification colleges play an important role in the pathway to licensure but do not consistently assess core clinical skills across all specialties. CBME and workplace-based assessments offer the promise of rigorous assessment of core clinical skills, but implementation is still ongoing and outcome studies will be years in the making. Failure to fail will likely remain an issue. Losing the MCCQE Part II will have many unintended consequences. As a community, we need to address these, consider the future assessment of these important skills, and put safeguards in place to assure the public that we are doing what is best for our patients as a self-regulated profession.
Conflicts of Interest
D. Pugh is a paid employee of the Medical Council of Canada. C. Touchie is a paid consultant advisor for the Medical Council of Canada. The views expressed in this manuscript are those of the authors and do not necessarily reflect the views of the Medical Council of Canada.
Funding
None
References
1. Schuwirth L, van der Vleuten C. Merging views on assessment. Med Educ. 2004;38(12):1208-1210. 10.1111/j.1365-2929.2004.02055.x
2. Wood T. Assessment not only drives learning, it may also help learning. Med Educ. 2009;43(1):5-6. 10.1111/j.1365-2923.2008.03237
3. Wormald BW, Schoeman S, Somasunderam A, Penn M. Assessment drives learning: an unavoidable truth? Anat Sci Educ. 2009;2(5):199-204. 10.1002/ase.102
4. Scott IM. Beyond 'driving': the relationship between assessment, performance and learning. Med Educ. 2020;54(1):54-59. 10.1111/medu.13935
5. Pugh D, Regehr G. Taking the sting out of assessment: is there a role for progress testing? Med Educ. 2016;50(7):721-729. 10.1111/medu.12985
6. Lougheed T. Is it time to rethink the MCCQE Part II? Can Med Educ J. 2016;7(1):e87-e88.
7. Jayakumar KL. The limited value of USMLE Step 2 CS (letter to the editor). Acad Med. 2018;93(3):345.
8. Lehman EP, Guercio JR. The Step 2 Clinical Skills exam–a poor value proposition. N Engl J Med. 2013;368(10):889-891. 10.1056/NEJMp1213760
9. Tsichlis JT, Del Re AM, Carmody JB. The past, present, and future of the United States Medical Licensing Examination Step 2 Clinical Skills examination. Cureus. 2021;13(8):e17157. 10.7759/cureus.17157
10. Larsen DP, Butler AC, Roediger HL III. Test-enhanced learning in medical education. Med Educ. 2008;42(10):959-966. 10.1111/j.1365-2923.2008.03124.x
11. Ecker DJ, Milan FB, Cassese T, et al. Step up–not on–the Step 2 Clinical Skills exam: Directors of Clinical Skills courses (DOCS) oppose ending Step 2 CS. Acad Med. 2018;93(5):693-698. 10.1097/ACM.0000000000001874
12. Vodden C. Licentiate to heal: a history of the Medical Council of Canada. Medical Council of Canada; 2008. p. 66-69.
13. Touchie C, Streefkerk C, for the Blueprint Project Team. Blueprint and content specifications. Ottawa, Ontario; September 2014. Available at https://mcc.ca/media/Blueprint-Report-1.pdf
14. Holmboe ES. Faculty and the observation of trainees' clinical skills: problems and opportunities. Acad Med. 2004;79(1):16-22. 10.1097/00001888-200401000-00006
15. Tamblyn R, Abrahamowicz M, Dauphinee D, et al. Physician scores on a national clinical skills examination as predictors of complaints to medical regulatory authorities. JAMA. 2007;298(9):993-1001. 10.1001/jama.298.9.993
16. Cadieux G, Abrahamowicz M, Dauphinee D, Tamblyn R. Are physicians with better clinical skills on licensing examinations less likely to prescribe antibiotics for viral respiratory infections in ambulatory care settings? Med Care. 2011;49(2):156-165. 10.1097/MLR.0b013e3182028c1a
17. Meguerditchian AN, Dauphinee D, Girard N, Eguale T, Riedel K, et al. Do physician communication skills influence screening mammography utilization? BMC Health Serv Res. 2012;12:219-226. 10.1186/1472-6963-12-219
18. De Champlain A, Ashworth N, Kain N, Qin S, Wiebe D, Tian F. Does pass/fail on medical licensing exams predict future physician performance in practice? A longitudinal cohort study of Albertan physicians. J Med Reg. 2020;106(4):17-26.
19. Holmboe ES, Sherbino J, Englander R, Snell L, Frank JR. A call to action: the controversy of and rationale for competency-based medical education. Med Teach. 2017;39(6):574-581. 10.1080/0142159X.2017.1315067
20. Mak-van der Vossen M. 'Failure to fail': the teacher's dilemma revisited. Med Educ. 2019;53(1):106-114. 10.1111/medu.13772
21. Yepes-Rios M, Dudek N, Duboyce R, Curtis J, Allard RJ, Varpio L. The failure to fail underperforming trainees in health professions education: a BEME systematic review: BEME Guide No. 42. Med Teach. 2016;38(11):1092-1099. 10.1080/0142159X.2016.1215414
22. Pugh D, Desjardins I, Eva K. How do formative objective structured clinical examinations drive learning? Analysis of residents' perceptions. Med Teach. 2018;40(1):45-52. 10.1080/0142159X.2017.1388502
23. Institute of Medicine. Crossing the quality chasm: a new health system for the 21st century. Washington, DC: The National Academies Press; 2001.
24. National Academies of Sciences, Engineering, and Medicine. Improving diagnosis in health care. Washington, DC: The National Academies Press; 2015.
25. Levinson W, Lesser CS, Epstein RM. Developing physician communication skills for patient-centered care. Health Aff (Millwood). 2010;29(7):1310-1318. 10.1377/hlthaff.2009.0450
26. Yudkowsky R, Szauter K. Farewell to the Step 2 Clinical Skills exam: new opportunities, obligations and next steps. Acad Med. 2021;96(9):1250-1253. 10.1097/ACM.0000000000004209
27. Association of Faculties of Medicine of Canada. Entrustable professional activities for the transition from medical school to residency. 2016 (updated 2019). Available at https://www.afmc.ca/sites/default/files/pdf/AFMC_Entrustable%20Professional%20Activities_EN_Final.pdf
28. Soukoulis V, Martindale J, Bray MJ, Bradley E, Gusic ME. The use of EPA assessments in decision-making: do supervision ratings correlate with other measures of clinical performance? Med Teach. 2021 Jul 9:1-7 (online ahead of print). 10.1080/0142159X.2021.1947480
29. Thoma B, Monteiro S, Pardhan A, Waters H, Chan T. Replacing high-stakes summative examinations with graduated medical licensure in Canada. CMAJ. 2022;194:E169-E170. 10.1503/cmaj.211816
30. Association of Faculties of Medicine of Canada. Joint commitment to action on Indigenous health. May 2019. Available at http://www.afmc.ca/sites/default/files/pdf/AFMC_Position_Paper_JCAIH_EN.pdf
31. Canadian Medical Association, College of Family Physicians of Canada, and Royal College of Physicians and Surgeons of Canada. Virtual care: recommendations for scaling up virtual medical services. Report of the Virtual Care Task Force. February 2020. Available at https://policybase.cma.ca/documents/PolicyPDF/PD20-07.pdf
32. Canadian Interprofessional Health Collaborative. A national interprofessional competency framework. February 2010. Available at https://drive.google.com/file/d/1Des_mznc7Rr8stsEhHxl8XMjgiYWzRIn/view
33. Eva KW, Bordage G, Campbell C, Galbraith R, Ginsburg S, et al. Towards a program of assessment for health professionals: from training into practice. Adv Health Sci Educ Theory Pract. 2016;21(4):897-913. 10.1007/s10459-015-9653-6
34. Burdick WP, Boulet JR, LeBlanc KE. Can we increase the value and decrease the cost of clinical skills assessment? Acad Med. 2018;93(5):690-692. 10.1097/ACM.0000000000001867
35. Nevins AB, Boscardin C, Kahn D, May W, Murdock-Vlautin T, et al. A call to action from the California Consortium for the Assessment of Clinical Competence: making the case for regional collaboration. Acad Med. 2022 (online ahead of print). 10.1097/ACM.0000000000004663
36. Thomas John J, Gowda D, Schlair S, Hojsak J, Milan F, Auerbach L. After the discontinuation of Step 2 CS: a collaborative statement from the Directors of Clinical Skills Education (DOCS). Teach Learn Med. 2022 (online ahead of print). 10.1080/10401334.2022.2039154
