Perspectives on Medical Education. 2024 Mar 18;13(1):201–223. doi: 10.5334/pme.1096

Competence By Design: a transformational national model of time-variable competency-based postgraduate medical education

Jason R Frank 1, Jolanta Karpinski 2,3,4, Jonathan Sherbino 5, Linda S Snell 4,6, Adelle Atkinson 4,7, Anna Oswald 4,8,9, Andrew K Hall 4,10, Lara Cooke 11, Susan Dojeiji 12, Denyse Richardson 4,13, Warren J Cheung 4,14, Rodrigo B Cavalcanti 14,15, Timothy R Dalseg 4,16, Brent Thoma 4,17, Leslie Flynn 4,18, Wade Gofton 19, Nancy Dudek 20, Farhan Bhanji 4,21, Brian M-F Wong 22, Saleem Razack 23, Robert Anderson 4,24, Daniel Dubois 25, Andrée Boucher 26, Marcio M Gomes 27, Sarah Taber 28, Lisa J Gorman 4, Jane Fulford 29, Viren Naik 25,30, Kenneth A Harris 31,32, Rhonda St Croix 33, Elaine van Melle 4,34
PMCID: PMC10959143  PMID: 38525203

Abstract

Postgraduate medical education is an essential societal enterprise that prepares highly skilled physicians for the health workforce. In recent years, PGME systems have been criticized worldwide for problems with variable graduate abilities, concerns about patient safety, and issues with teaching and assessment methods. In response, competency based medical education approaches, with an emphasis on graduate outcomes, have been proposed as the direction for 21st century health profession education. However, there are few published models of large-scale implementation of these approaches. We describe the rationale and design for a national, time-variable competency-based multi-specialty system for postgraduate medical education called Competence by Design. Fourteen innovations were bundled to create this new system, using the van Melle Core Components of competency based medical education as the basis for the transformation. The successful execution of this transformational training system shows that competency based medical education can be implemented at scale. The lessons learned in the early implementation of Competence by Design can inform competency based medical education innovation efforts across professions worldwide.

Introduction

Postgraduate medical education (PGME) has been described as an essential societal enterprise that prepares physicians to achieve the level of competence needed to practise and serve society [1]. Without an effective PGME system, a population may lack a sufficient health workforce, or have a cadre of physicians who are not adequately prepared for practice. The 20th century model of medical education, heavily influenced by Osler, Halsted, and Flexner, evolved out of an apprenticeship model that progressively incorporated more educational structure over decades [2, 3]. However, this model has been criticized as inadequate for the 21st century [4,5,6,7,8,9,10] and in need of greater attention to social accountability [11, 12]. In response, new outcomes-oriented and competency-based approaches have been endorsed [13,14,15,16,17,18]. Worldwide, competency based medical education (CBME) has become a major transformational movement in the health professions [19,20,21,22,23].

CBME has been defined as “an outcomes-based approach to the design, implementation, assessment and evaluation of an education program using an organizing framework of competencies” [18]. This approach to health professions education (HPE) traces back to a major report by the World Health Organization [24] and was later developed further by many authors and organizations. The International CBME Collaborators have proposed five elements (the van Melle Core Components) of a modern CBME model:

  1. Training outcomes organized as a competency framework for graduates

  2. Defined progression of training from novice to expert

  3. Tailored learning experiences to meet the needs of learners

  4. Teaching focused on competency achievement

  5. Programmatic assessment [25]

PGME systems in many countries have moved to adopt CBME [26,27,28,29,30,31]. Driving this movement are a number of concerns about contemporary training and opportunities to enhance PGME design. Patient safety risks [32,33,34], variability in graduate competence [35,36,37,38], issues with transitions to, within, and from PGME [39,40,41,42,43], inadequate supervision and insufficient direct observation of trainee work [44,45,46,47,48,49,50,51,52], concerns with workplace-based assessments and promotion decisions [53,54,55,56,57,58,59], lack of equity in clinical assessments [60, 61], and little or poor feedback [62,63,64,65,66,67] are all examples of important recurring challenges with PGME that education leaders have sought to address. At the same time, innovations and developments such as programmatic assessment [68, 69], entrustable professional activities (EPAs) [70, 71], new coaching feedback models [72, 73], deliberate practice and mastery learning [74, 75], Competence Committees [76, 77], assessment software [78,79,80], learning analytics [81], and novel approaches to accreditation [82] all present significant opportunities for better PGME through the implementation of the best evidence in medical education (see Table 1).

Table 1.

Drivers of the Competence by Design project.


Each row pairs issues of concern in the PGME system with corresponding opportunities for PGME system enhancement.

Issues:
  • Public expectations for greater social accountability of health professions and their education systems [11, 12]
  • Calls for greater focus on outcomes of training [14,15,16,17,18]
  • Patient safety concerns with care provided during and after postgraduate training [32,33,34]
  • Evidence of unacceptable variability in the competence of medical graduates [35,36,37,38]
Opportunities:
  • Recommendations to shift to outcomes-oriented, competency-based systems have been made by major medical organizations (e.g., World Health Organization) [90,91,92]
  • There are successful CBME implementations to build upon (e.g., Toronto Orthopedics [93], CFPC [94], ACGME [91])

Issues:
  • Little direct observation of trainees at work
  • Incidents of inadequate supervision of trainees [44,45,46,47,48,49,50,51,52]
Opportunities:
  • Use of entrustable professional activities allows more faculty to provide better input on trainee progress [70, 71]

Issues:
  • Failure to address identified weaknesses in trainees (“failure to fail”)
  • Certification examination failures
  • Promotion despite evidence of gaps or unreadiness for practice
  • Concerns about promotion decision-making
  • Concerns about inadequate workplace-based assessments
  • Few supervisors involved in workplace assessment [53,54,55,56,57,58,59]
Opportunities:
  • Use of programmatic assessment can enhance assessment decisions [68, 69, 95]
  • Application of learning analytics to medical education allows for new insights into trainee progression toward competence [81, 96]

Issues:
  • Reports of workplace assessments perceived as burdensome [53]
Opportunities:
  • Development of electronic portfolio software allows digitization and documentation of assessments [78,79,80]

Issues:
  • Reports of inadequate quality and frequency of the feedback given to trainees [62,63,64,65,66,67]
Opportunities:
  • New coaching models have been developed for medical education [72, 73]
  • A CanMEDS framework update incorporates developmental milestones that can be used as scaffolding for supervisor feedback [88]

Issues:
  • Reports of trainee anxiety with workplace assessment [97, 98]
Opportunities:
  • Growth mindset may enhance learning [99,100,101]

Issues:
  • Issues of transitions to postgraduate training, transition to senior trainee responsibilities, and transitions to practice [39,40,41,42,43, 103]
Opportunities:
  • Mastery learning methods enhance learning [74, 75]
  • Stages of training may allow for explicitly addressing transitions into and out of PGME [25, 103]

Issues:
  • Reports of trainee disengagement [104]
Opportunities:
  • Greater trainee engagement with training enhances learning outcomes [105]

Issues:
  • Reports of assessment inequity [60, 61]
  • Concerns that assessment-of-learning approaches overemphasize seeking trainees with problems instead of trainee development [77]
Opportunities:
  • A developmental view of training allows for tailoring training and assessment to ensure every trainee progresses to competence (assessment for learning) [68]

Issues:
  • Program reviews inordinately focused on process measures that may not enhance training [106]
Opportunities:
  • New accreditation systems place greater emphasis on program outcomes and continuous improvement [106, 107]
  • Application of learning analytics to medical education allows for new insights into program performance [108]


ACGME Accreditation Council for Graduate Medical Education; CBME competency based medical education; CFPC College of Family Physicians of Canada; PGME postgraduate medical education.

Using the Core Components, CBME designs can address these issues and opportunities. CBME shifts the emphasis from time spent in training to competencies achieved by graduates. A clear statement of the levels and types of competencies required of a graduate directs the attention of learners and teachers to a shared mental model of competence [83]. A developmental approach to the attainment of competence is reflected in deliberately sequenced training experiences and coaching feedback. More frequent, better-quality feedback enhances learning and trainee satisfaction. Programmatic assessment, with many data points contributed by a variety of assessors and tools, allows for better informed and more equitable decision-making about learner progress. Combined, these CBME design elements have the potential to ensure trainees are truly prepared for each stage of training and able to provide safe and effective care [25].

These changes to longstanding HPE designs have led to criticisms that CBME is a set of assertions with no evidence base, that the underlying assumptions are invalid, and that there is a lack of proof of concept of CBME at a national scale [84, 85]. While there are large-scale CBME implementation projects underway around the world, few have been described in the literature. Without an evidence base describing CBME implementation in a variety of settings, these outcome-focused approaches may be regarded as aspirational, theoretical, or unfounded.

We describe the transformational change of a national PGME environment to a multi-specialty, time-variable competency-based system. The van Melle Core Components of CBME were used as the basis of specialty PGME. This major system reform project was called Competence by Design (CBD) [86] to distinguish it from the previous system, which was based on achieving competence by time-based training. This paper provides an overview of the rationale, drivers, and the bundle of educational interventions that formed the CBD national innovation. Accompanying papers in this special collection explore specific aspects of Competence by Design, while this one focuses on the aims and innovations involved in putting CBME theory into practice.

Context

In Canada, the PGME system is an interwoven network of university medical schools, academic hospitals, community clinical teaching centres, government funders, and regulatory bodies [87]. The Royal College of Physicians and Surgeons of Canada (Royal College) is a specialty standards body created by an act of federal Parliament to oversee specialty medicine standards, accreditation, certification, and maintenance of competence outside of family medicine. In the contemporary landscape, the Royal College partners with all stakeholders and institutions in the PGME system to carry out its functions. In this paper, we present the early stages of a transformational medical education change from the perspective of the design, policy, and standards teams of the Royal College who were involved at the time. Thousands of other individuals contributed to and would also have perspectives on this transformation.

Twentieth-century Canadian PGME had a typical North American design. Following medical school graduation, trainees entered a system of time-based training in clinical settings. Canadian training is overseen by three collaborating medical Colleges: the Royal College (67 specialties and subspecialties), the College of Family Physicians of Canada (family medicine), and the Collège des médecins du Québec (all disciplines recognized in the region of Quebec). Directly from medical school, trainees outside of family medicine entered into Royal College programs leading to certification in primary specialties. All Canadian specialties and subspecialties were structured around the CanMEDS competency framework [88, 89]. Training consisted of immersion in specific clinical services typically from four to 52 weeks, as well as structured regular instruction in classrooms, skills workshops, simulation sessions, or laboratories. Experiences were selected to provide opportunities for trainees to acquire the defined competencies relevant to the specialty of training, prepare for certification examinations, provide needed clinical services, and meet all the criteria for credentials. Assessment most commonly entailed a CanMEDS-based retrospective form completed by a single supervisor at the end of every four-week block of training. Some training sites incorporated other assessment methods (e.g., objective structured clinical examinations [OSCEs]) on an ad hoc basis. Typically, a Royal College trainee would rotate through 13 blocks each year for four to six years before writing a final, high-stakes Royal College specialty examination. Successful trainees would then be certified in their specialty, enabling them to move to practice, begin a subspecialty training program, or undertake less-structured further fellowship training.

Drivers for system change

At the time of its development, CBD was driven by the Royal College’s commitment to continuous improvement in the Canadian PGME system, as a fiduciary duty to those served by the medical profession. The Royal College, along with other stakeholders, scanned the environment for areas of concern and opportunities to enhance the training of future physicians. These are summarized in Table 1.

Development process

Canadian PGME had a history of major reforms, including the incorporation of the CanMEDS competency framework going back to 1990 [109,110,111]. Not all of these proposed reforms were successful [112], so the desire to improve the PGME system through CBME was organized into a formal system-wide project to support its success. The Royal College launched a major institutional project group to develop CBD, including teams responsible for education strategy, specialty standards, CanMEDS, faculty development, accreditation, policy, assessment, finance, IT, communications, and governance. The CBD project was organized into four phases (see Figure 1 and Supplement A). To do its work, the project group adopted six principles applied to the PGME system:

Figure 1. Phases of Competence by Design development: Plan the Plan, Preparation, Implementation, and Evaluation & Adaptation.

Organizational alignment and support

To execute this project effectively, the involved organizations themselves would need to be transformed to align with the initiative. Therefore, the Royal College and its medical school and organizational partners created formal project teams, working groups, shared governance bodies, change strategies, and new policies that facilitated the creation of the new competency-based PGME system [113].

Stakeholder engagement and co-production

As this was recognized as a transformational change in a long-established system, the project group prioritized early and extensive stakeholder engagement [114, 115]. Deans, government representatives, medical students, postgraduate trainees, senior education leaders, front-line teachers, education administrators, medical regulators, and many others were invited to co-produce CBD as a community. Engagement and support varied, but the majority of stakeholders supported the change effort.

Iterative community development and rollout

Early on, it was decided that this large transformational change required an iterative approach. Educational and policy designs were brainstormed in a series of summits beginning in 2010, which were then widely circulated among stakeholders for comment and improvement before implementation. Similarly, specialties and subspecialties (e.g., Anesthesiology, Medical Oncology, Otolaryngology-Head and Neck Surgery) were invited to be early adopters and volunteer to join the first cohorts of disciplines to implement the new model. Lessons learned from each step of the journey informed changes for the next cohort and the whole system [116].

Resource sharing

As this was a large, systemic change, it was recognized early that additional education resources would be needed. New faculty development resources (conferences, webinars, videos, templates) were developed [86]. Grants were established to support the work to be done in training programs as well as program evaluation and the dissemination of findings. Funding was provided to help establish new change leadership roles (called CBME Leads) at each medical school [117].

Creation of formal expert networks

While CBME had been discussed since the World Health Organization’s 1978 call to action [13], it was recognized when CBD started that many aspects of CBME implementation were still in development. Pooling ideas and sharing best practices and pitfalls would be a key ingredient in the project’s success. Therefore, the Royal College team founded and facilitated several national and international networks for knowledge creation and dissemination. These included the International CBME Collaborators, the Learning Analytics Medical Education Network, the Canadian Competence Committee Chairs Collaborative, a Residents Roundtable, a series of Program Evaluation Summits, collaborations with the College of Family Physicians of Canada, and the medical school-based CBME Leads Roundtable.

A priori program evaluation

As many aspects of CBD were new and transformative, it was an early priority to build a robust program evaluation strategy and network to ensure continuous improvement of all aspects of CBD. It was imperative that any negative unintended consequences of the new PGME system be recognized and ameliorated in a timely manner. Similarly, positive unintended consequences needed to be recognized, celebrated, and amplified. The CBD program evaluation strategy is elaborated in the paper by Hall et al. in this collection [118].

The Competence by Design model: a bundle of 14 innovations to support a CBME system

Competence by Design involved transformational changes to all aspects of the Canadian specialist PGME system. Innovations were derived from a program logic model connecting the PGME issues and opportunities to the van Melle Core Components of CBME [25] (see Table 2). All aspects of PGME, from core competencies to the role of time in training, to policies and standards for assessment, accreditation, credentialing, and certification, were reimagined from first principles. CBD “bundled” 14 major innovations to enable the new PGME system, which are described below.

Table 2.

Competence By Design Logic Model.


Each row links a core component of CBME* to the issues and opportunities it addressed, the CBD design elements chosen in response, and their intended outputs and impact.

Core component: Outcomes as a competency framework for graduates
  Issues & opportunities: PGME must ensure all graduates meet the needed level of competence (focus on graduate outcomes for safe patient care); program reviews focused on process, not outcomes
  CBD design elements: CanMEDS 2015 Framework; new specialty-specific competencies; new outcomes-oriented accreditation
  Outputs: clear new competencies for every specialty; new accreditation standards focused on outcomes of PGME
  Impact: competent graduates, ready for practice; enhanced training programs

Core component: Defined progression of training from novice to expert
  Issues & opportunities: issues with transitions; time-based training produces variable graduates; patient safety concerns; incidents of inadequate supervision
  CBD design elements: planned transitions; four stages of PGME; CanMEDS milestones
  Outputs: better transitions to residency and practice; clear pathways to competence; better assessments for learning
  Impact: residents prepared for each stage of training; competent graduates, ready for practice; safer care

Core component: Tailored learning experiences
  Issues & opportunities: generic training produces variable graduates; resident engagement with training enhances learning
  CBD design elements: time-variable training; flexible training requirements; promotions based on achievements; individualized rotation plans; coaching over time
  Outputs: residents with individualized pathways to certification
  Impact: residents prepared for each stage of training; competent graduates, ready for practice; greater resident satisfaction with training

Core component: Competency-based teaching
  Issues & opportunities: little direct observation of trainees; inadequate feedback in the workplace; growth mindset may enhance mastery of expertise; EPAs provide an opportunity for more faculty to give better input; a developmental view ensures no trainee is left behind
  CBD design elements: direct observation; EPAs for workplace-based assessment; coaching in the moment; developmental view of training; growth mindset
  Outputs: more direct observation; more and better feedback; trainee portfolios provide a rich picture of progress
  Impact: residents prepared for each stage of training; competent graduates, ready for practice; greater resident satisfaction with training

Core component: Programmatic assessment
  Issues & opportunities: exam failures; promotions despite dyscompetence; few assessments; concerns about workplace-based assessment; concerns about promotion decisions; opportunity to use learning analytics; opportunity to digitize assessment
  CBD design elements: Competence Committee review of every trainee’s progress; high number of EPA observations; learning analytics & eportfolios; developmental view of training; growth mindset; coaching over time; new role for the certification exam
  Outputs: better promotion decisions; trainee portfolios provide a rich picture of progress; more faculty involved in workplace-based assessment; clear pathways to competence; residents with individualized pathways to certification; more and better feedback
  Impact: residents prepared for each stage of training; competent graduates, ready for practice; greater resident satisfaction with training; fewer appeals of assessments needed; same or higher exam pass rates

*After van Melle E, et al. International Competency-based Medical Education Collaborators. A core components framework for evaluating implementation of competency-based medical education programs. Acad Med. 2019; 94: 1002–9.

New competence framework with developmental milestones

The Royal College PGME system has used and regularly updated the CanMEDS competency framework as the basis of curriculum since 1996 [110, 111]. For CBD, a new version, CanMEDS 2015, was created that included developmental milestones for each domain of competence (e.g., communication skills) in the form of short statements that reflect a progression from the end of medical school to specialist level [88]. The milestones were deployed as a scaffold for workplace-based coaching conversations [119].

Introduction of developmental entrustable professional activities

As described by Karpinski and Frank [120], the Royal College chose entrustable professional activities (EPAs) both as an approach to organize learning and as a framework for assessment. The CBD form of EPAs (RCEPAs) represented a series of professional tasks tailored to the specialty and the stage of training. They were explicitly developmental, in that RCEPAs grew in complexity and scope as training progressed. RCEPAs at the beginning of training were simpler (e.g., “Admitting patients to the Urology service”) and at the end of training reflected abilities approaching those of a practising clinician (e.g., “Coordinating, organizing, and executing the day’s list of surgical procedures”). An RCEPA included a description of the task, eight to 12 milestones from two or more CanMEDS Roles that were fundamental to completing the task, an ordinal supervision score (i.e., the O-Score [121]), and a field for a mandatory narrative comment. Such EPAs were to be directly observed in the workplace on a frequent basis, serving as a framework for monitoring progress (assessment of learning) and for coaching in the moment (assessment for learning) in the clinical setting [122]. RCEPAs were completed, logged, and aggregated in a digital platform. EPAs therefore served to define the progression of training, tailor the learning of individual trainees, facilitate workplace-based teaching around key tasks, and generate data for programmatic assessment.
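To make the structure of an EPA observation record concrete, the following sketch models the elements named above (task, milestones, ordinal supervision score, mandatory narrative) as a simple data structure. This is an illustrative sketch only: the field names, the paraphrased O-Score anchors, and the validation rules are assumptions, not the schema of the Royal College platform.

from dataclasses import dataclass
from datetime import date

# Paraphrased 5-point O-Score entrustment anchors (assumed wording).
O_SCORE_ANCHORS = {
    1: "I had to do",
    2: "I had to talk them through",
    3: "I had to prompt them from time to time",
    4: "I needed to be in the room just in case",
    5: "I did not need to be there",
}

@dataclass
class EpaObservation:
    epa_id: str              # hypothetical identifier for a stage-specific EPA
    trainee_id: str
    observer_id: str
    observed_on: date
    milestone_ratings: dict  # 8-12 CanMEDS milestone statements -> rating
    o_score: int             # ordinal supervision score, 1-5
    narrative: str           # narrative comment (mandatory in CBD)

    def __post_init__(self):
        if self.o_score not in O_SCORE_ANCHORS:
            raise ValueError("O-Score must be an integer from 1 to 5")
        if not self.narrative.strip():
            raise ValueError("A narrative comment is mandatory")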

New stages of training

To enable a focus on program outcomes that ensures every graduate has acquired all of the competencies to practise safely, CBD moved from an organizing framework of time spent in training to competencies achieved sequentially [25, 103]. Postgraduate years (PGYs) were formally replaced in the educational system in favour of four defined stages of training: Transition to Discipline, Foundations, Core, and Transition to Practice. Each stage was designed to build upon previous experiences and achievements. Stages incorporated predefined competencies to be achieved, learning experiences (e.g., rotations, types of patient encounters, simulation sessions), EPAs, other assessments, and criteria for promotion. For the first time, specific attention was drawn to preparing trainees for transitions into PGME and into practice. Progression through the stages required a formal recommendation by the Competence Committee. The new standards required programs to prepare trainees for transitions between stages, ensuring they had acquired all relevant competencies, to increase their effectiveness on future rotations and promote safe patient care. The stages are illustrated in the CBD Competence Continuum (see Figure 2).

Figure 2. The Competence by Design Competence Continuum, describing the stages of a physician’s career from PGME through certification and continuing professional development to the transition out of professional practice. Copyright 2012, The Royal College of Physicians and Surgeons of Canada. Reproduced with permission.
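To make the stage-gated logic concrete, the sketch below models the four stages as an ordered sequence in which advancement depends on a Competence Committee recommendation rather than elapsed time. The function and its rule are illustrative assumptions, not Royal College policy.

from enum import IntEnum

class Stage(IntEnum):
    TRANSITION_TO_DISCIPLINE = 1
    FOUNDATIONS = 2
    CORE = 3
    TRANSITION_TO_PRACTICE = 4

def promote(current: Stage, committee_recommends: bool) -> Stage:
    """Advance one stage only on a Competence Committee recommendation."""
    if not committee_recommends:
        return current  # remain in stage, e.g., with a modified learning plan
    if current is Stage.TRANSITION_TO_PRACTICE:
        # Completing the final stage leads toward certification, not a new stage.
        raise ValueError("No stage follows Transition to Practice")
    return Stage(current + 1)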

New specialty-specific standards

The CanMEDS 2015 framework was used as the template for a new, updated set of competencies tailored to every Royal College specialty and subspecialty. Every discipline created a new national blueprint for curriculum and for assessment using these competencies, stages, and RCEPAs [123]. The disciplines then disseminated their new custom-built design for PGME based on the four stages of training; each stage included the requirements for training experiences, instruction, competencies to be achieved, and EPAs to be observed and recorded. These stages were not based on time or quotas for EPAs; time was a resource for learning, and workplace assessments were an opportunity to give and receive feedback and to document evidence of progress. CBD therefore provided a generational opportunity to revamp each specialty’s training design.

New program blueprints for teaching and assessment

The CBD national specialty standards were translated into new blueprints for teaching and for assessment at all of the nearly 1000 Royal College training programs in Canada. These were required as an accreditation standard. Local program committees were asked to use this opportunity to reimagine training in their context, allocating time to essential rotations and instruction and allocating EPAs and other assessments to planned experiences. This provided an occasion to reassess the best experiences to aid achievement of competence and reconsider experiences that were no longer needed. In this way CBD facilitated renewal of every training program in the country.

Workplace-based assessment system with direct observation

Competence by Design introduced a new workplace-based assessment system that placed emphasis on both assessment for learning and assessment of learning [68, 69]. Instead of a single retrospective workplace-based assessment completed at the end of every four weeks of training by a single supervisor, programs were asked to ensure that every trainee was frequently observed using EPAs in the workplace and received coaching in the moment around that EPA, and that the supervisor’s impressions were recorded in an electronic portfolio. EPAs were employed to ensure multiple micro-assessments of performance, captured from multiple assessors to inform a richer and more reliable determination of learner progress. In a study of the first year of implementation in a single specialty, observations recorded per trainee rose from fewer than 20 traditional assessments to 90–230 EPAs across sites [124].

Competency-based coaching model

To address concerns around a lack of useful feedback given to trainees in traditional training [125], EPAs in CBD were not solely used as an assessment framework. EPAs were also the foundation of frequent direct observations [126] and clinical coaching in the moment [127, 128]. To support this, the “RX-OCR” coaching model was rolled out, as described by Richardson et al. in this collection [122].

Training in a growth mindset

The introduction of increased direct observation, workplace-based assessment, and competency-oriented coaching feedback created the risk that the new CBD system would be overwhelmingly focused on assessment. Instead, the goal of the new system was to enhance teaching and learning [101]. Therefore, the CBD organizers explicitly included orientation for trainees and teachers to Carol Dweck’s growth mindset [129, 130]. This approach to competency-based education advocates a developmental view of learning: every learner is on a journey to competence, and any given competency is either achieved or not yet achieved. Under CBD, teachers were encouraged to record not just when a trainee achieved an EPA or area of competence but also their progress en route. Many observations meant that any given data point in a learner’s portfolio was a “pixel in a picture of competence,” and no single one was high stakes. The goal was for trainees and teachers to use discussions of trainee abilities as a “progress note” on development, not a commentary on a learner’s character. This was a distinct shift from the fixed mindset that is prevalent in medical education [100, 122, 128].

Introduction of competence committees and programmatic assessment

Programmatic assessment was incorporated to synthesize a spectrum of assessments (mainly RCEPAs, along with other assessments relevant to the stage of training) from a diversity of supervisors over time into a global assessment of a trainee at a specific stage of training. In the CBD model, a Competence Committee was a formally appointed group of dedicated educators who met regularly to review the EPA and other aggregated performance data in a trainee’s portfolio and assign a formal status to their progress (e.g., “Progressing as Expected”) using a prescribed consensus process. Feedback and an educational prescription were to be provided to the trainee. Modification of future training was also possible, including early remediation or accelerated progression through training. The Competence Committee reported to the overall Residency Program Committee (RPC), and the RPC was accountable for and aware of all promotion recommendations and decisions to ensure overall alignment across the local program [131,132,133,134].
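As a rough illustration of how many low-stakes observations might be synthesized into a progress status, the sketch below groups EPA observations (records like the earlier sketch) by EPA and drafts a status for committee discussion. The entrustment cut-off, the threshold, and the decision rule are invented for illustration; actual Competence Committees applied a prescribed consensus process to far richer portfolio data.

from collections import defaultdict

def summarize_portfolio(observations):
    """Group EPA observations by EPA and tally evidence of entrustment."""
    by_epa = defaultdict(list)
    for obs in observations:
        by_epa[obs.epa_id].append(obs.o_score)
    return {
        epa: {
            "n_observations": len(scores),
            # Treat scores of 4-5 as entrusted performance (assumed cut-off).
            "n_entrusted": sum(1 for s in scores if s >= 4),
            "recent_scores": scores[-3:],  # last three, in the order provided
        }
        for epa, scores in by_epa.items()
    }

def draft_status(summary, required_epas, min_entrusted_per_epa=3):
    """Draft a recommendation for committee discussion, not a final decision."""
    gaps = [
        epa for epa in required_epas
        if summary.get(epa, {}).get("n_entrusted", 0) < min_entrusted_per_epa
    ]
    if not gaps:
        return "Progressing as Expected", []
    return "Targeted learning plan suggested", gaps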

Introduction of electronic portfolios

CBD made digital trainee portfolios an essential ingredient in residency education [135]. The Royal College wanted to move away from the use of paper forms (still present in some programs) and incentivize the use of more sophisticated electronic portfolios to manage the increase in trainee progress data (e.g., EPA observations) [78,79,80]. Digital completion of RCEPAs ensured efficient and secure data capture. An electronic portfolio allowed for dashboard views that trainees, teachers, and Competence Committees could use to monitor trainees’ progress in meeting the program requirements [136, 137]. To support all of this, the Royal College invested in and provided a free eportfolio for every accredited program. Universities and hospitals also had the option of implementing another electronic portfolio of their choice.

Learning analytics

CBD enabled learning analytics nationally for the PGME system, with many more assessment data points available and digital data that were easy to aggregate. While learning analytics are prevalent in higher education [138], they saw minimal use in Canadian PGME before the CBD rollout. Learning analytics are a powerful set of tools to display trainee progress against a standard. These analytics may also provide views on teacher behaviour, rotation effectiveness, and the program overall [139]. A powerful graphical learning analytics dashboard has become an important tool for Competence Committees under CBD [81, 108].
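For example, a dashboard of the kind described here might chart each trainee’s entrusted EPA observations against stage requirements. A minimal sketch using pandas, in which the column names, the entrustment cut-off, and the progress metric are all illustrative assumptions:

import pandas as pd

def progress_table(obs_df: pd.DataFrame, required: dict) -> pd.DataFrame:
    """obs_df columns: trainee_id, epa_id, o_score (one row per observation).
    required maps each epa_id to the entrusted observations needed for the stage."""
    entrusted = obs_df[obs_df["o_score"] >= 4]  # assumed entrustment cut-off
    counts = (
        entrusted.groupby(["trainee_id", "epa_id"])
        .size()
        .unstack(fill_value=0)
        .reindex(columns=list(required), fill_value=0)
    )
    # Cap credit at each requirement so an over-sampled EPA cannot mask
    # gaps elsewhere, then express overall completion as a fraction.
    capped = counts.clip(upper=pd.Series(required), axis=1)
    fraction = capped.sum(axis=1) / sum(required.values())
    return fraction.rename("fraction_of_stage_requirements_met").to_frame()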

Changed role for certification examination

For many decades, the Royal College certification examination was the final act of PGME, occurring at the end of training as a single, high-stakes external gateway to independent licensure and certification. Under CBD, this examination was moved to the end of the Core stage of training [140]. This action moved examination preparation, a powerful driver of learning, into an earlier stage of training and created an examination-free period of time for a true Transition to Practice stage. In general, examination candidates performed just as well when examinations were moved from the end of training to the end of the Core stage. The examination became another major exhibit in a trainee’s portfolio of progression to competence, and successful completion of the examination was still a requirement for certification. The role of examinations in CBD is further elaborated in Bhanji et al. in this collection [141].

New accreditation standards emphasizing continuous quality improvement and quality assurance

To support the rollout of the CBD system, the Royal College accreditation system was also renewed to focus more on outcomes. As part of a consortium of accreditation stakeholders (CanRAC), a fresh set of standards was produced that included requirements around the elements of CBME [142]. In keeping with the philosophies that inform CBD, the new accreditation system shifted its emphasis from quality assurance (meeting a standard) to continuous quality improvement (rewarding programs showing a strategy of enhancing the program) [106]. The new accreditation system for CBD is further described in the accompanying paper by Dalseg et al in this collection [143].

Time-variable credentialing

Finally, CBD marked a move away from credentialing for certification based on time spent in training on prescribed clinical services [144]. Instead, the Royal College adopted a policy of accepting the promotion decision of a local program Competence Committee based on all of the data available on a trainee’s readiness for practice [132]. As a safeguard, Competence Committee functions were examined as part of accreditation visits. While CBD represents a hybrid time-variable approach, trainees in CBD programs could graduate earlier than the standard training duration if there was evidence that they had achieved all required competencies and training experiences.

Discussion

While many countries have begun work on CBME designs, little has been published to date describing a national-scale transformation of a PGME system to competency-based education. In this paper, we have described CBD as a unique innovation in health professions education, and elaborated the drivers, development, and design of a novel CBME system that was the biggest change in Canadian training since the founding of PGME in the country. While it may be that all educational programs are continuously evolving, large-scale transformations in education systems are not common [145]. The CBD project represents both a transformational change to an existing PGME system and an application of time-variable CBME. There are several lessons learned from the early implementation of CBD and implications for those leading change in health professions education.

Lessons learned about large-scale educational change

Large-scale change provides a rare opportunity to reimagine how a system works

The CBD initiative became an opportunity to fundamentally reimagine every aspect of a PGME system, from policies to philosophies to procedures, from accreditation to assessment. This is a rare phenomenon in professional education. This allowed many long-standing concerns and ideas to be addressed as part of this implementation (e.g., digitizing accreditation reviews). Those involved were committed to continuous improvement, ultimately for future graduates and their patients. Nevertheless, the occasion to re-examine fundamentals must be balanced with the high degree of effort needed to pursue such an opportunity.

Organizational transformation is needed to sustain “big change”

In the course of implementing CBD, it was realized that to be successful, the participating organizations themselves needed to change. Organizations changed policies, procedures, and personnel. The latter change was necessary to put in place individuals dedicated to new educational processes. Fundamentally, CBD changed the mental models of all those who adopted this new way of preparing physicians, including the leadership of the institutions involved [146].

Co-production with stakeholders is essential

The implementation of CBD required the engagement of numerous stakeholders to make progress in change. Stakeholders such as trainee organizations and faculties of medicine were on the front line of impacts of any PGME project, so they had critical input into shaping CBD. Co-production of the elements of CBD with partners, while slower, was essential to get the best possible design from many perspectives [115].

An adaptive program evaluation strategy is essential

From the outset, CBD developed a robust program evaluation strategy with three main pillars of activity: readiness to implement, fidelity of implementation, and outcomes. Ongoing evaluation studies from across the PGME system rapidly informed education leaders of issues, concerns, strengths, and regional variations. This was absolutely critical to the success of such a large and complex change project [118].

Large-scale change in medical education can lead to scholarship and career changes

Anecdotally, CBD triggered participants to change roles and produce scholarship, as an unintended impact of the transformation. Trainees and faculty became interested in an education career track, becoming chief residents (chief postgraduate trainees), program directors, or scholars.

Lessons learned about CBME implementation

CBD is a CBME proof of concept at scale

CBD was explicitly created to use the van Melle Core Components of CBME [25]. Among the criticisms of the CBME movement, there has been a concern that this approach is theoretical, without an evidence base or track record [84]. CBD contributes to the discourse of HPE by demonstrating that the Core Components of CBME can be used as the basis for a 21st century national PGME system.

There are benefits to implementing CBME as a “bundle” of changes

In previous work by the International CBME Collaborators, many pioneering CBME designs focused primarily on competency frameworks or programmatic assessment, two of the van Melle Core Components. By contrast, the CBD initiative used all of the Core Components, leading to 14 implemented innovations “bundled” into one transformative system change [147, 148]. Early program evaluation findings suggested that greater fidelity to the changes at the training program level produced better alignment with the desired outcomes of the CBD Logic Model [118].

Time is a resource for learning, not the criterion for completion of training

PGME systems use time in training in a variety of ways. Some have fixed-time designs that require a specified number of weeks in prescribed learning experiences. Fixed-time systems have been criticized for the risk that graduates may exit without having achieved all of the competencies required for safe practice. By contrast, open-time systems have been criticized for inefficiency and prolonged training. CBD aimed to create a hybrid system, with guidelines for learning experiences that enable achievement of required competencies. In this system, time is a resource for training, not the organizing framework: specific rotations were recommended rather than required, and the achievement of competence was not tied to time served. Quality controls were built into accreditation visits to ensure programs were tailoring training to individual trainees’ needs, while ensuring there was evidence that every graduate had achieved all essential competencies [144].

Time variability enabled individualized learning plans

The CBD design allowed individualized trainee learning plans. Competence Committees were encouraged to consider future training experiences on the basis of what the trainees’ portfolios indicated they needed to progress to the next stage. For the vast majority of trainees, this did not mean early or late completion of training. What it did mean was that Competence Committees were able to recommend, as needed, changes to a trainee’s rotations or other activities to enable them to achieve the program outcomes. In doing so, programs balanced the needs of clinical services and the trainee’s educational needs. The extent to which this was implemented varied [134].

“Developmental” EPAs facilitate progress decisions

Many health professions education programs around the world that use EPAs have designed them to be tasks that a graduating trainee works toward. In CBD, the Royal College explicitly wanted to sequence training from novice to expert and ensure learners truly were prepared for each stage of their development. This aligned with the theory of the Core Components, addressed concerns about patient safety by ensuring trainees were prepared for their tasks, and allowed for direct observation and coaching around specific tasks for the level of training. By pinpointing tasks that a learner was expected to be able to perform at the end of each stage, Competence Committees had a set of criteria to guide promotion decisions [120].

Better feedback is possible

One of the drivers for CBD was perennial complaints about the lack of useful and actionable feedback to trainees [62,63,64,65,66,67]. The strategies chosen to address this included the deployment of EPAs as a focus for learning and observation, a new workplace coaching model, explicit discussion of the growth mindset, and requirements for regular direct observation and coaching feedback. Early evaluation studies showed that trainees reported more frequent and more actionable feedback as part of workplace-based assessments and EPA conversations [118].

Programmatic assessment offers key benefits

A fundamental pillar of CBD was the deployment of programmatic assessment. Its use was intended to address long-standing concerns about PGME assessment being subjective, lacking a comprehensive view of development of competence, and being based on too few supervisors and/or too few observations. Programmatic assessment was a major change for most programs in the PGME system, with variable rates of adoption. In the programs where this approach to assessment was adopted with fidelity, local education leaders reported high satisfaction with stronger assessment decision-making, richer data on individual trainees, fewer appeals of assessment decisions, and better quality feedback to learners. When programmatic assessment was conducted, summative assessment decisions were shifted from the workplace supervisors to Competence Committees [133, 134, 149, 150].

Real-time, low-stakes workplace-based assessment is possible

Worldwide, a major challenge to supporting a robust CBME design lies in obtaining an adequate number of useful direct observations in the workplace of trainee progress toward competence [45, 126]. In implementing CBD, the Royal College asked supervisors at all 1000 training sites to sample every trainee’s work on a regular basis. It was found that supervisors can do direct observation in small, brief episodes and record rich and useful notes on trainees. This was not easy for all settings, but some programs did realize a shift towards greater direct observation [118, 150].

Competence Committees work

In CBD, program leaders reported high satisfaction with implementing Competence Committees [118, 133]. They reported that individual trainees were discussed in greater depth and richness, that assessment decisions were more robust, and that the processes to create a functional formal Competence Committee were doable across multiple settings. This experience provides further support for the use of Competence Committees in PGME.

CBME may support equity in assessment

Recently, multiple studies have identified concerns with equity in the assessment of trainees in a variety of settings [60, 61]. By requiring that all trainees, not just those favoured or flagged for concern, be directly observed by multiple supervisors and discussed at a Competence Committee on a frequent basis, the CBD assessment system took a small step toward equity.

Pitfalls in PGME transitions can be ameliorated

Multiple previous reports flagged that transitions in training are stressful for learners, could put patients at risk, and are not ideal education designs [39,40,41,42,43, 102]. These transitions — from medical student to PGME trainee, from junior to senior trainee, and from senior trainee to practice — are perennial challenges. CBD explicitly planned to address these challenges by using stages as a deliberate sequence of training. In particular, the Transition to Discipline stage explicitly oriented the learner to the discipline and promoted the learner’s professional identity development as a junior member in that discipline. The Transition to Practice stage provided a capstone opportunity for the trainee to safely act in the role of the most responsible physician or surgeon while preparing for the realities of independent practice.

Learning analytics is a powerful suite of tools with benefits

While learning analytics has existed in education for a long time, adopting programmatic assessment under CBD allowed the first whole-scale use of this suite of tools across a PGME system. Learning analytics allowed trainees to visualize their progress, Competence Committees to make data-driven decisions, faculty to improve their feedback, programs to gain insight on learning environments, and institutions to flag outlier programs [80, 81, 136,137,138,139, 151,152,153,154].

Certification examinations still have a role

Under CBD, the Royal College certification examination was maintained and moved to become a formal assessment after the Core stage of training. Pass rates were, on the whole, unchanged. From an educational design perspective, the earlier certification examination was considered by many to be a powerful driver for learning, to be another key data point for Competence Committees, and to enable a focus on transition to practice after the examination was completed [141].

Lessons learned: pitfalls in large-scale CBME implementation

The implementation of CBD is ongoing. At every step of the change, challenges were encountered that have potentially important implications for others contemplating CBME and other transformational education changes.

Large-scale change stresses a system

CBD brought 14 innovations to a national PGME system. Inevitably, some training sites found the changes easier to adopt than others. On the basis of accreditation achievements across training sites, the CBD design team had assumed the new design was achievable by all 1000 training programs. Once change was underway, however, some training sites reported difficulties with various aspects of the new training scheme. Some of these reported difficulties concerned existing requirements of the previous training paradigm; the transformation to CBD shed new light on long-standing challenges within the PGME system. Multiple institutions reported greater costs to implement CBD than expected. Variability across the country was the dominant pattern, and training features presumed to be in place were not always present when CBD arrived.

Specialty variability required flexible approaches

Medical disciplines (specialties, subspecialties, etc.) have their own distinct subcultures. As CBD rolled out, disciplines displayed differing levels of responsiveness to change, ability to undertake educational reform, and cohesiveness. The CBD team worked with disciplines individually to support the rollout of the new training approach. Clinical realities (e.g., the COVID pandemic, resource stressors) seemed to impact education adoption [116].

Requirements for workplace-based assessments were a wellness issue

One unexpected development early in CBD was trainee stress with the new programmatic assessment requirements. Guidelines related to EPA observations to populate each learner’s portfolio were perceived as quotas, and residents were often given the responsibility to initiate faculty engagement in EPA form completion. These implementation issues led to some training sites reporting wellness issues with trainees that were not anticipated. While we hoped teaching Dweck’s growth mindset and a learner-centered approach would help trainees see the new workplace-based assessments as beneficial, this was clearly not universal in the early years of the new scheme [53, 98, 150, 155].

Implementing large-scale change during a pandemic was unanticipated

The implementation of CBD was planned as a multiyear project, and the COVID-19 pandemic began when CBD implementation was underway. As with HPE worldwide, CBD designs were drastically impacted [156]. Not only were certain learning experiences shut down for periods, but trainees and teachers were redeployed to treat large numbers of patients with COVID-19. Fortunately, the flexibility built into CBD allowed trainees to continue to progress in their training, employing evidence of achievement of competencies from alternative activities.

Electronic portfolio technology was problematic

At the time CBD was conceived, it was assumed that a country with a small population like Canada would share a national eportfolio developed by the Royal College and deployed for free. However, it was soon found that no software package satisfied all the needs of training centres, met preferred workflows, or was deployable in every software environment. In addition, trainees and institutions raised learner privacy concerns, so there were unexpected barriers with data sharing [157]. Instead, numerous local electronic portfolios were used across the country over time and the landscape continues to evolve rapidly.

Growth mindset is difficult to implement across PGME

As discussed above, one of the innovations of CBD was to encourage adoption of Carol Dweck’s growth mindset approach to teaching, learning, and assessment across the PGME system. Early in CBD, participants were intrigued, but widespread adoption was not readily seen. Instead, embracing a new mindset was an innovation that appeared to be on a long, slow adoption curve [101].

Competencies can be subsumed when using EPAs

CBD promoted the use of both CanMEDS competencies and EPAs as dual frameworks. However, in promoting EPAs as part of supporting implementation, educators on advisory committees reported a concern about over-emphasis on EPAs. As an unintended consequence, there was a perception of less emphasis on CanMEDS in PGME than before CBD.

Comparisons to Other CBME Implementations in PGME

In a 2021 study by the International CBME Collaborators, the majority of CBME programs surveyed had worked on implementing two of the van Melle Core Components: a competence framework and programmatic assessment [158]. However, four major PGME initiatives were comparable in scope and scale to Competence by Design: the Triple-C project of the College of Family Physicians of Canada [27, 159, 160], the ACGME Outcomes Project in the US [22], the Australian Orthopaedic Association’s AOA-21 curriculum [161], and the Dutch Association of Medical Specialists’ Individualizing Postgraduate Medical Training project [162]. Each of these competency-based PGME initiatives implemented the van Melle Core Components in its own way, as did CBD. A simple comparison of the design features of these initiatives is displayed in Table 3.

Table 3.

Comparing CBD to Other CBME Implementations.


Initiatives compared: Competence by Design (Royal College of Physicians and Surgeons of Canada); Triple-C (College of Family Physicians of Canada); Outcomes Project (Accreditation Council for Graduate Medical Education, USA); AOA 21 (Australian Orthopaedic Association); Individualizing Postgraduate Medical Training (Dutch Association of Medical Specialists, Netherlands).

Training outcomes organized as a competency framework for graduates
  CBD: CanMEDS framework. Triple-C: CanMEDS-FM framework. Outcomes Project: ACGME 6 Competencies. AOA 21: AOA 21 Curriculum Framework. Dutch project: CanMEDS framework.

Defined progression of training from novice to expert
  CBD: stages of training. Triple-C: progression through the training program. Outcomes Project: ACGME Milestones. AOA 21: stages of training. Dutch project: postgraduate years and EPAs.

Tailored learning experiences to meet the needs of learners
  CBD: time-variable, flexible training. Triple-C: tailoring within the program. Outcomes Project: tailoring within the program. AOA 21: time-variable, flexible training. Dutch project: time-variable, flexible training.

Teaching focused on competency achievement
  CBD: EPA-driven direct observation and coaching in the workplace; growth mindset. Triple-C: teaching guided by the Assessment Objectives for Certification in Family Medicine. Outcomes Project: teaching guided by ACGME milestones. AOA 21: teaching focused on a stage-specific curriculum. Dutch project: teaching focused on EPAs.

Programmatic assessment
  CBD: CBD program of assessment including Competence Committee review; multiple eportfolios. Triple-C: Triple-C program of assessment including Continuous Reflective Assessment for Training (CRAFT) reviewed by the residency program committee; multiple eportfolios. Outcomes Project: milestones-based program of assessment including Clinical Competency Committee review; multiple eportfolios. AOA 21: AOA-21 program of assessment including Regional Training Committee review; national eportfolio. Dutch project: EPA-based program of assessment including Clinical Competency Committee review; multiple eportfolios.

All of these initiatives reported similar challenges in implementing large-scale change in a PGME system. Each required major change management efforts and resources. Every one of these transformative curriculum changes required immediate, major investments in faculty development for implementation [163] (e.g., the Dutch curriculum alone reached ~7000 clinical supervisors [164]). All of them reported stakeholders’ concerns with the new workplace-based assessments [52, 53, 150, 155, 165, 166], though only the Dutch system and CBD required the use of EPAs. They also shared initial challenges with digital assessment portfolio software that improved over time. All groups revised their assessment requirements based on feedback from concerned stakeholders.

There were some benefits in common as well. All of these initiatives reported enhanced feedback opportunities for trainees. All successfully deployed competence committee-style programmatic assessment of trainees, which increased the rigour of judgments about competence [77, 83, 133, 134, 149]. Time-variable, trainee-tailored training was achieved to varying degrees in the Netherlands, in AOA-21, and in CBD, while finding ways to ensure service provision was not overly impacted. (Time-variability was not a design element of Triple-C or the ACGME Outcomes Project.) Overall, all of these major competency-based PGME curricula for the 21st century were successfully deployed, sustained, and evolved over time [93, 159,160,161,162, 167,168,169].

Limitations

As discussed above, this paper is written from the perspective of the Royal College design and implementation team in place at the time. It describes the data available to this team. With any such large-scale transformation, there are inevitably differing perspectives from a variety of stakeholders. These often vary over time, vary with the issues in question, and vary with the degree of intensity of emotion involved. CBD was no exception. As CBD evolved, other PGME stakeholders and commentators had differing perspectives. Each of these perspectives has lessons for change leaders in HPE.

Conclusions

Competence by Design is a major transformational change to a national postgraduate medical education system. A bundle of 14 innovations, CBD provides an example of the implementation of competency-based, time-variable, outcomes-oriented medical education at scale. CBD addresses recurring concerns about 20th century training designs that can affect the patient care provided by graduates. Others interested in implementing CBME can draw lessons from the CBD design and experience.

Additional File

The additional file for this article can be found as follows:

Supplement A.

Phases and activities of the Competence by Design project.

pme-13-1-1096-s1.pdf (536.8KB, pdf)
DOI: 10.5334/pme.1096.s1

Acknowledgements

The authors would like to thank the thousands of trainees, teachers, supervisors, CBME Leads, program administrators, PG Managers, PG Deans, Deans, Department Heads, scholars, accreditors, colleagues, and many others who contributed significantly to the development and implementation of Competence by Design.

This paper is dedicated to the memory of Robert F. Maudsley MD FRCSC, whose ideas formed the basis of this competency-based design, more than 20 years before they became reality.

Funding Statement

The CBD Project was funded by the Royal College of Physicians and Surgeons of Canada.

Disclaimer

The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the Royal College of Physicians and Surgeons of Canada (“Royal College”). Information in this article about Competence by Design (“CBD”), its implementation and related policies and procedures do not necessarily reflect the current standards, policies and practices of the Royal College. Please refer to the Royal College website for current information.

Competing Interests

  • JRF, JK, LSS, FB, VN, ST, JF, KH, RST were employees of the Royal College.

  • JS, AA, AO, AKH, LC, SD, DR, WJC, RBC, TD, BT, LF, WG, ND, BW, RS, RA, DD, AB, MG, and EVM performed contract work for the Royal College.

References

  • 1.Snell L, Frank JR, Pihlak R, Sa J. Postgraduate medical education: a ‘pipeline’ to competence. In: Dent JA, Harden RM, Hunt D, (eds.), A Practical Guide for Medical Teachers. 5th ed. New York: Elsevier; 2017. [Google Scholar]
  • 2.Cooke M, Irby DM, Sullivan W, Ludmerer K. American medical education 100 years after the Flexner report. NEJM. 2006; 355: 1339–44. DOI: 10.1056/NEJMra055445 [DOI] [PubMed] [Google Scholar]
  • 3.Ludmerer KM. Time to Heal: American Medical Education from the Turn of the Century to the Era of Managed Care. New York: Oxford University Press; 1999. [Google Scholar]
  • 4.Neufeld VR, Maudsley RF, Pickering RJ, Walters BC, Turnbull JM, Spasoff RA, et al. Demand-side medical education: educating future physicians for Ontario. CMAJ. 1993; 148: 1471–7. [PMC free article] [PubMed] [Google Scholar]
  • 5.Ludmerer KM, Johns MME. Reforming graduate medical education. JAMA. 2005; 294: 1083–7. DOI: 10.1001/jama.294.9.1083 [DOI] [PubMed] [Google Scholar]
  • 6.Ludmerer KM. Let Me Heal: The Opportunity to Preserve Excellence in American Medicine. New York: Oxford University Press; 2015. [Google Scholar]
  • 7.Sales CS, Schlaff AL. Reforming medical education: a review and synthesis of five critiques of medical practice. Soc Sci Med. 2010; 70: 1665–8. DOI: 10.1016/j.socscimed.2010.02.018 [DOI] [PubMed] [Google Scholar]
  • 8.Iobst WF, Sherbino J, Ten Cate O, Richardson DL, Dath D, Swing SR, et al. Competency-based medical education in postgraduate medical education. Med Teach. 2010; 32: 651–6. DOI: 10.3109/0142159X.2010.500709 [DOI] [PubMed] [Google Scholar]
  • 9.Irby D. Educating physicians for the future: Carnegie’s calls for reform. Med Teach. 2011; 33: 547–50. DOI: 10.3109/0142159X.2011.578173 [DOI] [PubMed] [Google Scholar]
  • 10.Frenk J, Chen L, Bhutta ZA, Cohen J, Crisp N, Evans T, et al. Health professionals for a new century: transforming education to strengthen health systems in an interdependent world. Lancet. 2010; 376: 1923–58. DOI: 10.1016/S0140-6736(10)61854-5 [DOI] [PubMed] [Google Scholar]
  • 11.Boelen C. Adapting health care institutions and medical schools to societies’ needs. Acad Med. 1999; 74(8 Suppl.): S11–S20. DOI: 10.1097/00001888-199908000-00024 [DOI] [PubMed] [Google Scholar]
  • 12.Barber C, van der Vleuten C, Leppink J, Chahine S. Social accountability frameworks and their implications for medical education and program evaluation: a narrative review. Acad Med. 2020; 95: 1945–54. DOI: 10.1097/ACM.0000000000003731 [DOI] [PubMed] [Google Scholar]
  • 13.McGaghie WC, Sajid W, Miller GE, Telder TV, Lipson L, et al. Competency-based Curriculum Development in Medical Education. Geneva: World Health Organization; 1978. [PubMed] [Google Scholar]
  • 14.Harden RM, Crosby JR, Davis MH, Friedman M. AMEE Guide No. 14: Outcome-based education. Part 1. Med Teach. 1999; 21: 546–52. DOI: 10.1080/01421599978951 [DOI] [PubMed] [Google Scholar]
  • 15.Carraccio C, Wolfsthal SD, Englander R, Ferentz K, Martin C. Shifting paradigms: from Flexner to competencies. Acad Med. 2002; 77: 361–7. DOI: 10.1097/00001888-200205000-00003 [DOI] [PubMed] [Google Scholar]
  • 16.ten Cate O, Scheele F. Competency-based postgraduate training: Can we bridge the gap between theory and clinical practice? Acad Med. 2007; 82: 542–7. DOI: 10.1097/ACM.0b013e31805559c7 [DOI] [PubMed] [Google Scholar]
  • 17.Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach. 2007; 29: 648–54. DOI: 10.1080/01421590701392903 [DOI] [PubMed] [Google Scholar]
  • 18.Frank JR, Snell LS, Ten Cate O, Holmboe ES, Carraccio C, Swing SR, et al. Competency-based medical education: theory to practice. Med Teach. 2010; 32: 638–45. DOI: 10.3109/0142159X.2010.501190 [DOI] [PubMed] [Google Scholar]
  • 19.Vasquez JA, Marcotte K, Gruppen LD. The parallel evolution of competency-based education in medical and higher education. J Competency-based Educ. 2021; 6: e1234. DOI: 10.1002/cbe2.1234 [DOI] [Google Scholar]
  • 20.Accreditation Council for Graduate Medical Education (ACGME). Outcome Project. Chicago: ACGME. www.acgme.org/outcome. [Google Scholar]
  • 21.Powell DE, Carraccio C. Toward competency-based medical education. NEJM. 2018; 378: 3–5. DOI: 10.1056/NEJMp1712900 [DOI] [PubMed] [Google Scholar]
  • 22.Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system-rationale and benefits. NEJM. 2012; 366: 1051–6. DOI: 10.1056/NEJMsr1200117 [DOI] [PubMed] [Google Scholar]
  • 23.Holmboe ES, Sherbino J, Englander R, Snell L, Frank JR, ICBME Collaborators. A call to action: the controversy of and rationale for competency-based medical education. Med Teach. 2017; 39: 574–81. DOI: 10.1080/0142159X.2017.1315067 [DOI] [PubMed] [Google Scholar]
  • 24.McGaghie WC, Sajid W, Miller GE, Telder TV, Lipson L, et al. Competency-based Curriculum Development in Medical Education. Geneva: World Health Organization; 1978. [PubMed] [Google Scholar]
  • 25.Van Melle E, Frank JR, Holmboe ES, Dagnone D, Stockley D, Sherbino J, International Competency-based Medical Education Collaborators. A core components framework for evaluating implementation of competency-based medical education programs. Acad Med. 2019; 94: 1002–9. DOI: 10.1097/ACM.0000000000002743 [DOI] [PubMed] [Google Scholar]
  • 26.Chou FC, Hsiao CT, Yang CW, Frank JR. Glocalization in medical education: a framework underlying implementing CBME in a local context. J Formos Med Assoc. 2022; 121: 1523–31. DOI: 10.1016/j.jfma.2021.10.024 [DOI] [PubMed] [Google Scholar]
  • 27.Schultz K, Griffiths J. Implementing competency-based medical education in a postgraduate family medicine residency training program. Acad Med. 2016; 91: 685–9. DOI: 10.1097/ACM.0000000000001066 [DOI] [PubMed] [Google Scholar]
  • 28.Shrivastava SR, Shrivastava PS. How to successfully implement competency-based education in India. Educ Health Prof. 2018; 1: 61–3. DOI: 10.4103/EHP.EHP_20_18 [DOI] [Google Scholar]
  • 29.Yoon CH, Myung SJ, Park WB. Implementing CBME in internal medicine residency training program. J Korean Med Sci. 2019; 34: e201. DOI: 10.3346/jkms.2019.34.e201 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.McKenzie-White J, Mubuuke AG, Westergaard S, Munabi IG, Bollinger RC, et al. Evaluation of a competency based medical curriculum in a Sub-Saharan African medical school. BMC Med Educ. 2022; 22: 724. DOI: 10.1186/s12909-022-03781-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 31.Weller JM, Naik VN, San Diego RJ. Systematic review and narrative synthesis of CBME in anaesthesia. Br J Anaesth. 2020; 124: 748–60. DOI: 10.1016/j.bja.2019.10.025 [DOI] [PubMed] [Google Scholar]
  • 32.Greiner AC, Knebel E, eds. Health Professions Education: A Bridge to Quality. Washington: National Academies Press; 2003. DOI: 10.1111/j.1945-1474.2004.tb00473.x [DOI] [PubMed] [Google Scholar]
  • 33.Shojania KG, Fletcher KE, Saint S. Graduate medical education and patient safety: a busy — and occasionally hazardous — intersection. Ann Intern Med. 2006; 145: 592–8. DOI: 10.7326/0003-4819-145-8-200610170-00008 [DOI] [PubMed] [Google Scholar]
  • 34.Crosson FJ, Leu J, Roemer BM, Ross MN. Gaps in residency training should be addressed to better prepare doctors for a twenty-first century delivery system. Health Aff (Millwood). 2011; 30: 2412–8. DOI: 10.1377/hlthaff.2011.0184 [DOI] [PubMed] [Google Scholar]
  • 35.Asch DA, Nicholson S, Srinivas SK, Herrin J, Epstein AJ. How do you deliver a good obstetrician? Outcome-based evaluation of medical education. Academic Medicine. 2014; 89: 24–6. DOI: 10.1097/ACM.0000000000000067 [DOI] [PubMed] [Google Scholar]
  • 36.Wennberg JE. Unwarranted variations in healthcare delivery: implications for academic medical centres. BMJ. 2002; 325: 961–4. DOI: 10.1136/bmj.325.7370.961 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37.Kennedy PJ, Leathley CM, Hughes CF. Clinical practice variation. Med J Aust. 2010; 193: S97–S99. DOI: 10.5694/j.1326-5377.2010.tb04021.x [DOI] [PubMed] [Google Scholar]
  • 38.Tamblyn R, Abrahamowicz M, Dauphinee WD, Hanley JA, Norcini J, Girard N, et al. Association between licensure examination scores and practice in primary care. JAMA. 2002; 288: 3019–26. DOI: 10.1001/jama.288.23.3019 [DOI] [PubMed] [Google Scholar]
  • 39.Teunissen PW, Westerman M. Opportunity or threat: the ambiguity of the consequences of transitions in medical education. Med Educ. 2011; 45: 51–9. DOI: 10.1111/j.1365-2923.2010.03755.x [DOI] [PubMed] [Google Scholar]
  • 40.Colbert-Getz JM, Baumann S, Shaffer K, Lamb S, Lindsley JE, et al. What’s in a transition? An integrative perspective on transitions in medical education. Teach Learn Med. 2016; 28: 347–52. DOI: 10.1080/10401334.2016.1217226 [DOI] [PubMed] [Google Scholar]
  • 41.Matheson C, Matheson D. How well prepared are medical students for their first year as doctors? Postgrad Med J. 2009; 85: 582–9. DOI: 10.1136/pgmj.2008.071639 [DOI] [PubMed] [Google Scholar]
  • 42.Kilminster S, Zukas M, Quinton N, Roberts T. Preparedness is not enough: understanding transitions as critically intensive learning periods. Med Educ. 2011; 45: 1006–15. DOI: 10.1111/j.1365-2923.2011.04048.x [DOI] [PubMed] [Google Scholar]
  • 43.Goldacre MJ, Davidson JM, Lambert TW. The first house officer year: views of graduate and non-graduate entrants to medical school. Med Educ. 2008; 42: 286–93. DOI: 10.1111/j.1365-2923.2007.02992.x [DOI] [PubMed] [Google Scholar]
  • 44.Holmboe ES. Faculty and the observation of trainees’ clinical skills: problems and opportunities. Acad Med. 2004; 79: 16–22. DOI: 10.1097/00001888-200401000-00006 [DOI] [PubMed] [Google Scholar]
  • 45.Holmboe ES. Realizing the promise of competency-based medical education. Acad Med. 2015; 90: 411–3. DOI: 10.1097/ACM.0000000000000515 [DOI] [PubMed] [Google Scholar]
  • 46.Halpern SD, Detsky AS. Graded autonomy in medical education – managing things that go bump in the night. NEJM. 2014; 370: 1086–9. DOI: 10.1056/NEJMp1315408 [DOI] [PubMed] [Google Scholar]
  • 47.Cottrell D, Kilminster S, Jolly B, Grant J. What is effective supervision and how does it happen? A critical incident study. Med Educ. 2002; 36: 1042–9. DOI: 10.1046/j.1365-2923.2002.01327.x [DOI] [PubMed] [Google Scholar]
  • 48.Kilminster SM, Jolly BC. Effective supervision in clinical practice settings: a literature review. Med Educ. 2000; 34: 827–40. DOI: 10.1046/j.1365-2923.2000.00758.x [DOI] [PubMed] [Google Scholar]
  • 49.Baldwin DC, Daugherty SR, Ryan PM. How residents view their clinical supervision: a reanalysis of classic national survey data. J Grad Med Educ. 2010; 2(1): 37–45. DOI: 10.4300/JGME-D-09-00081.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 50.Rietmeijer CBT, Huisman D, Blankenstein AH, de Vries H, Scheele F, Kramer AWM, et al. Patterns of direct observation and their impact during residency: general practice supervisors’ views. Med Educ. 2018; 52: 981–91. DOI: 10.1111/medu.13631 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 51.Kennedy TJT, Lingard L, Baker GR, Kitchen L, Regehr G. Clinical oversight: conceptualizing the relationship between supervision and safety. J Gen Intern Med. 2007; 22: 1080–5. DOI: 10.1007/s11606-007-0179-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 52.Watling C, LaDonna KA, Lingard L, Voyer S, Hatala R. “Sometimes the work just needs to be done”: socio-cultural influences on direct observation in medical training. Med Educ. 2016; 50: 1054–64. DOI: 10.1111/medu.13062 [DOI] [PubMed] [Google Scholar]
  • 53.Massie J, Ali JM. Workplace-based assessment: a review of user perceptions and strategies to address the identified shortcomings. Adv Health Sci Educ. 2016; 21: 455–73. DOI: 10.1007/s10459-015-9614-0 [DOI] [PubMed] [Google Scholar]
  • 54.Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR. The role of assessment in competency-based medical education. Med Teach. 2010; 32: 676–82. DOI: 10.3109/0142159X.2010.500704 [DOI] [PubMed] [Google Scholar]
  • 55.Govaerts MJ, van der Vleuten CPM, Schuwirth LWT, Muijtjens AM. Broadening perspectives on clinical performance assessment: rethinking the nature of in-training assessment. Adv Health Sci Educ. 2007; 12: 239–60. DOI: 10.1007/s10459-006-9043-1 [DOI] [PubMed] [Google Scholar]
  • 56.Herbers JE, Noel GL, Cooper GS, Harvey J, Pangaro LN, Weaver MJ. How accurate are faculty evaluations of clinical competence? J Gen Intern Med. 1989; 4: 202–8. DOI: 10.1007/BF02599524 [DOI] [PubMed] [Google Scholar]
  • 57.Lockyer J, Carraccio C, Chan MK, Hart D, Smee S, Touchie C, et al. ICBME Collaborators. Core principles of assessment in competency-based medical education. Med Teach. 2017; 39: 609–16. DOI: 10.1080/0142159X.2017.1315082 [DOI] [PubMed] [Google Scholar]
  • 58.Gingerich A, Regehr G, Eva KW. Rater-based assessments as social judgments: rethinking the etiology of rater errors. Acad Med. 2011; 86: S1–S7. DOI: 10.1097/ACM.0b013e31822a6cf8 [DOI] [PubMed] [Google Scholar]
  • 59.Sebok-Syer SS, Klinger DA, Sherbino J, Chan TM. Mixed messages or miscommunication? Investigating the relationship between assessors’ workplace-based assessment scores and written comments. Acad Med. 2017; 92: 1774–9. DOI: 10.1097/ACM.0000000000001743 [DOI] [PubMed] [Google Scholar]
  • 60.Klein R, Julian KA, Snyder ED, Koch J, Ufere NN, Volerman A, et al. Gender bias in resident assessment in graduate medical education: review of the literature. J Gen Intern Med. 2019; 34: 712–9. DOI: 10.1007/s11606-019-04884-0 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 61.Ross DA, Boatright D, Nunez-Smith M, Jordan A, Chekroud A, Moore EZ. Differences in words used to describe racial and gender groups in medical student performance evaluations. PLOS One. 2017; 12: e0181659. DOI: 10.1371/journal.pone.0181659 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 62.Archer JC. State of the science in health professional education: effective feedback. Med Educ. 2010; 44: 101–8. DOI: 10.1111/j.1365-2923.2009.03546.x [DOI] [PubMed] [Google Scholar]
  • 63.Jensen AR, Wright AS, Kim S, Horvath KD, Calhoun KE. Educational feedback in the operating room: a gap between resident and faculty perceptions. Am J Surg. 2012; 204: 248–55. DOI: 10.1016/j.amjsurg.2011.08.019 [DOI] [PubMed] [Google Scholar]
  • 64.Watling CJ, Kenyon CF, Zibrowski EM, Schulz V, Goldszmidt MA, Singh I, et al. Rules of engagement: residents’ perceptions of the in-training evaluation process. Acad Med. 2008; 83: S97–S100. DOI: 10.1097/ACM.0b013e318183e78c [DOI] [PubMed] [Google Scholar]
  • 65.Watling C, Driessen E, van der Vleuten CPM, Lingard L. Learning culture and feedback: an international study of medical athletes and musicians. Med Educ. 2014; 48: 713–23. DOI: 10.1111/medu.12407 [DOI] [PubMed] [Google Scholar]
  • 66.Kogan JR, Conforti LN, Bernabeo EC, Durning SJ, Hauer KE, Holmboe ES, et al. Faculty staff perceptions of feedback to residents after direct observation of clinical skills. Med Educ. 2012; 46: 201–15. DOI: 10.1111/j.1365-2923.2011.04137.x [DOI] [PubMed] [Google Scholar]
  • 67.Onuoha O, Heins SJ, Clapp JT, Muralidharan M, Baranov DY, Fleisher LE, et al. Improving formative feedback in the operating room setting: developing and implementing an initiative to improve feedback quality and culture. Acad Med. 2022; 97: 222–7. DOI: 10.1097/ACM.0000000000004229 [DOI] [PubMed] [Google Scholar]
  • 68.Schuwirth LWT, Van Der Vleuten CPM. Programmatic assessment: from assessment of learning to assessment for learning. Med Teach. 2011; 33: 478–85. DOI: 10.3109/0142159X.2011.565828 [DOI] [PubMed] [Google Scholar]
  • 69.Schuwirth LWT, Van Der Vleuten CPM. Current assessment in medical education: programmatic assessment. J Appl Test Technol. 2019; 20(S2): 2–10. [Google Scholar]
  • 70.Ten Cate O, Chen HC, Hoff RG, Peters H, Bok H, van der Schaaf M. Curriculum development for the workplace using entrustable professional activities (EPAs): AMEE Guide No. 99. Med Teach. 2015; 37: 983–1002. DOI: 10.3109/0142159X.2015.1060308 [DOI] [PubMed] [Google Scholar]
  • 71.Ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ. 2013; 5(1): 157–8. DOI: 10.4300/JGME-D-12-00380.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 72.Graddy R, Reynolds SS, Wright WM. Coaching residents in the ambulatory setting: faculty direct observation and resident reflection. J Grad Med Educ. 2018; 10: 449–54. DOI: 10.4300/JGME-17-00788.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 73.Lockyer J, Armson H, Könings KD, Lee-Krueger RCW, des Ordons AM, Ramani S, et al. In-the-moment feedback and coaching: improving R2C2 for a new context. J Grad Med Educ. 2020; 12: 27–35. DOI: 10.4300/JGME-D-19-00508.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 74.McGaghie WC. Mastery learning: it is time for medical education to join the 21st century. Acad Med. 2015; 90: 1438–41. DOI: 10.1097/ACM.0000000000000911 [DOI] [PubMed] [Google Scholar]
  • 75.McGaghie WC, Barsuk JH, Wayne DB. Mastery learning with deliberate practice in medical education. Acad Med. 2015; 90: 1575. DOI: 10.1097/ACM.0000000000000876 [DOI] [PubMed] [Google Scholar]
  • 76.Colbert CY, Dannefer EF, French JC. Clinical competency committees and assessment: changing the conversation in graduate medical education. J Grad Med Educ. 2015; 7: 162–5. DOI: 10.4300/JGME-D-14-00448.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 77.Hauer KE, Chesluk B, Iobst W, Holmboe E, Baron RD, Boscardin CK, et al. Reviewing residents’ competence: a qualitative study of the role of clinical competency committees in performance assessment. Acad Med. 2015; 90: 1084–92. DOI: 10.1097/ACM.0000000000000736 [DOI] [PubMed] [Google Scholar]
  • 78.Marty AP, Linsenmeyer M, George B, Young JQ, Breckwoldt J, Ten Cate O. Mobile technologies to support workplace-based assessment for entrustment decisions: guidelines for programs and educators: AMEE Guide No. 154. Med Teach. 2023. Online ahead of print. DOI: 10.1080/0142159X.2023.2168527 [DOI] [PubMed] [Google Scholar]
  • 79.Van der Schaaf M, Donkers J, Slof B, Moonon-van Loon J, van Tartwijk J, Driessen E, et al. Improving workplace-based assessment and feedback by an eportfolio enhanced with learning analytics. Educ Technol Res Dev. 2017; 65: 359–80. DOI: 10.1007/s11423-016-9496-8 [DOI] [Google Scholar]
  • 80.Thoma B, Bandi V, Carey R, Mondal D, Woods R, Martin L, Chan T. Developing a dashboard to meet Competence Committee needs: a design-based research project. Canadian Medical Education Journal. 2020; 11(1): e16. DOI: 10.36834/cmej.68903 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 81.Thoma B, Ellaway RH, Chan TM. From Utopia through Dystopia: charting a course for learning analytics in competency-based medical education. Academic Medicine. 2021; 96(7S): S89–S95. DOI: 10.1097/ACM.0000000000004092 [DOI] [PubMed] [Google Scholar]
  • 82.Taber S, Akdemir N, Gorman L, van Zanten M, Frank JR. A fit for purpose framework for medical education accreditation system design. BMC Med Educ. 2020; 20(Suppl. 1): 306. DOI: 10.1186/s12909-020-02122-4 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 83.Edgar L, Jones MD Jr, Harsy B, Passiment M, Hauer KE. Better decision-making: shared mental models and the clinical competency committee. J Grad Med Educ. 2021; 13(2S): 51–8. DOI: 10.4300/JGME-D-20-00850.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 84.Boyd VA, Whitehead CR, Thille P, Ginsburg S, Brydges R, Kuper A. Competency-based medical education: the discourse of infallibility. Med Educ. 2017; 52: 45–57. DOI: 10.1111/medu.13467 [DOI] [PubMed] [Google Scholar]
  • 85.Brydges R, Boyd VA, Tavares W, Ginsburg S, Kuper A, Anderson M, et al. Assumptions about competency-based medical education and the state of the underlying evidence: a critical narrative review. Acad Med. 2021; 96: 296–306. DOI: 10.1097/ACM.0000000000003781 [DOI] [PubMed] [Google Scholar]
  • 86.Royal College of Physicians and Surgeons of Canada. Competence by Design. Retrieved 5 February 2023. https://www.royalcollege.ca/rcsite/cbd/competence-by-design-cbd-e.
  • 87.Tugwell P. Postgraduate training in Canada. Postgrad Med J. 1986; 63: 707–9. DOI: 10.1136/pgmj.63.742.707 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 88.Frank JR, Snell L, Sherbino J, editors. CanMEDS 2015 Physician Competency Framework. Ottawa: Royal College of Physicians and Surgeons of Canada; 2015. [Google Scholar]
  • 89.Frank JR, Danoff D. The CanMEDS initiative: implementing an outcomes-based framework of physician competencies. Med Teach. 2007; 29: 642–7. DOI: 10.1080/01421590701746983 [DOI] [PubMed] [Google Scholar]
  • 90.McGaghie WC, Sajid W, Miller GE, Telder TV, Lipson L, et al. Competency-based Curriculum Development in Medical Education. Geneva: World Health Organization; 1978. [PubMed] [Google Scholar]
  • 91.Swing S. The ACGME Outcomes Project: retrospective and prospective. Med Teach. 2007; 29: 648–54. DOI: 10.1080/01421590701392903 [DOI] [PubMed] [Google Scholar]
  • 92.Frenk J, Chen L, Bhutta ZA, Cohen J, Crisp N, Evans T, et al. Health professionals for a new century. Lancet commission. Lancet. 2010; 376: 1923–58. DOI: 10.1016/S0140-6736(10)61854-5 [DOI] [PubMed] [Google Scholar]
  • 93.Nousiainen MT, Mironova P, Hynes M, Glover Takahashi S, Reznick R, Kraemer W, et al. Eight-year outcomes of a competency-based residency training program in orthopedic surgery. Med Teach. 2018; 40: 1042–54. DOI: 10.1080/0142159X.2017.1421751 [DOI] [PubMed] [Google Scholar]
  • 94.Ellaway RH, Mackay MP, Lee S, Hofmeister M, Malin G, Archibald D, et al. The impact of a national competency-based medical education initiative in family medicine. Acad Med. 2018; 93: 1850–7. DOI: 10.1097/ACM.0000000000002387 [DOI] [PubMed] [Google Scholar]
  • 95.Acai A, Li SA, Sherbino J, Chan TM. Attending emergency physicians’ perceptions of a programmatic workplace-based assessment system: the McMaster Modular Assessment Program (McMAP). Teach Learn Med. 2019; 31: 434–44. DOI: 10.1080/10401334.2019.1574581 [DOI] [PubMed] [Google Scholar]
  • 96.Li SA, Sherbino J, Chan TM. McMaster Modular Assessment Program (McMAP) through the years: residents’ experience with an evolving feedback culture over a 3-year period. AEM Educ Train. 2017; 1: 5–14. DOI: 10.1002/aet2.10009 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 97.Bindal T, Wall D, Goodyear HM. Trainee doctors’ views on workplace-based assessments: Are they just a tick box exercise? Med Teach. 2011; 33: 919–27. DOI: 10.3109/0142159X.2011.558140 [DOI] [PubMed] [Google Scholar]
  • 98.Ott MC, Pack R, Cristancho S, Chin M, Van Koughnett JA, Ott M. “The most crushing thing”: understanding resident assessment burden in a competency-based curriculum. J Grad Med Educ. 2022; 14(5): 583–92. DOI: 10.4300/JGME-D-22-00050.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 99.Yeager DS, Hanselman P, Walton GM, Murray JS, Crosnoe R, Muller C, et al. A national experiment reveals where a growth mindset improves achievement. Nature. 2019; 573: 364–9. DOI: 10.1038/s41586-019-1466-y [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 100.Ramani S, Könings KD, Ginsburg S, van der Vleuten CPM. Twelve tips to promote a feedback culture with a growth mind-set. Med Teach. 2019; 41: 625–31. [DOI] [PubMed] [Google Scholar]
  • 101.Richardson D, Kinnear B, Hauer KE, Turner TL, Warm EJ, Hall AK, et al. Growth mindset in competency-based medical education. Med Teach. 2021; 43: 751–7. DOI: 10.1080/0142159X.2021.1928036 [DOI] [PubMed] [Google Scholar]
  • 102.Teunissen PW, Westerman M. Opportunity or threat: the ambiguity of the consequences of transitions in medical education. Med Educ. 2011; 45: 51–9. DOI: 10.1111/j.1365-2923.2010.03755.x [DOI] [PubMed] [Google Scholar]
  • 103.Holmboe ES, Ginsburg S, Bernabeo E. The rotational approach to medical education: time to confront our assumptions? Med Educ. 2011; 45: 69–80. DOI: 10.1111/j.1365-2923.2010.03847.x [DOI] [PubMed] [Google Scholar]
  • 104.Prentice S, Benson J, Kirkpatrick E, Schuwirth L. Workplace-based assessment in postgraduate medical education: a hermeneutic review. Med Educ. 2020; 54: 981–92. DOI: 10.1111/medu.14221 [DOI] [PubMed] [Google Scholar]
  • 105.Young JQ, Sugarman R, Schwartz J, O’Sullivan PS. Faculty and resident engagement with a workplace-based assessment tool. Acad Med. 2020; 95: 1937–44. DOI: 10.1097/ACM.0000000000003543 [DOI] [PubMed] [Google Scholar]
  • 106.Frank JR, Taber S, van Zanten M, Scheele F, Blouin D, International Health Professions Accreditation Outcomes Consortium. The role of accreditation in 21st century health professions education: report of an International Consensus Group. BMC Med Educ. 2020; 20(Suppl. 1): 305. DOI: 10.1186/s12909-020-02121-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 107.Fishbain D, Danon YL, Nissanholz-Gannot R. Accreditation systems for postgraduate medical education: a comparison of five countries. Adv Health Sci Educ. 2019; 24: 503–24. DOI: 10.1007/s10459-019-09880-x [DOI] [PubMed] [Google Scholar]
  • 108.Yilmaz Y, Carey R, Chan TM, Bandi V, Wang S, Woods RA, et al. Developing a dashboard for program evaluation in competency-based training programs: A design-based research project. Can Med Educ J. 2022; 13: 14–27. DOI: 10.36834/cmej.73554 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 109.Frank JR, Jabbour M, Tugwell P; The Societal Needs Working Group. Skills for the new millennium: Report of the Societal Needs Working Group. CanMEDS 2000 Project. Ann R Coll Phys Surg Can. 1996; 29: 206–16. [Google Scholar]
  • 110.Frank JR, ed. The CanMEDS 2005 Physician Competency Framework. Ottawa: Royal College of Physicians and Surgeons of Canada; 2005. [Google Scholar]
  • 111.Frank JR. The CanMEDS project. In: Dinsdale HB, Hurteau G, (eds.), The Evolution of Specialty Medicine. Ottawa: Royal College of Physicians and Surgeons of Canada; 2004. [Google Scholar]
  • 112.Maudsley RF, et al. Report of the Task Force to Review Fundamental Issues in Specialty Education. Ottawa: Royal College of Physicians and Surgeons of Canada; 1996. [Google Scholar]
  • 113.Hornstein HA. The integration of project management and organizational change management is now a necessity. Int J Proj Manage. 2015; 33: 291–8. DOI: 10.1016/j.ijproman.2014.08.005 [DOI] [Google Scholar]
  • 114.Holmboe ES. Work-based assessment and co-production in postgraduate medical training. GMS J Med Educ. 2017; 34: 58. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 115.Buttemer S, Hall J, Berger L, Weersink K, Dagnone JD. Ten ways to get a grip on resident co-production within medical education change. Can Med Educ J. 2020; 11: e124–e129. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 116.Karpinski J, Stewart J, Oswald A, Dalseg TR, Atkinson A, Frank JR. Competency based medical education at scale: a road map for transforming national systems of postgraduate medical education. Perspect Med Educ. 2024; 13(1): 24–32. DOI: 10.5334/pme.957 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 117.Royal College of Physicians and Surgeons of Canada. CBME Leads Directory. Retrieved 5 February 2023. https://www.royalcollege.ca/rcsite/cbd/implementation/getting-started-cbd-local-support-e/cbme-leads-e.
  • 118.Hall AK, Oswald A, Frank JR, Dalseg T, Cheung WJ, Cooke L, et al. Evaluating Competence by Design as a large system change initiative: readiness, fidelity, and outcomes. Perspect Med Educ. 2024; 13(1): 95–107. DOI: 10.5334/pme.962 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 119.Royal College of Physicians and Surgeons of Canada. Milestones. Retrieved 5 February 2023. https://canmeds.royalcollege.ca/en/milestones.
  • 120.Karpinski J, Frank JR. The role of EPAs in creating a national system of time-variable competency-based medical education. Acad Med. 2021; 96(7S): S36–S41. DOI: 10.1097/ACM.0000000000004087 [DOI] [PubMed] [Google Scholar]
  • 121.Gofton WT, Dudek NL, Wood TJ, Balaa F, Hamstra SJ. The Ottawa Surgical Competency Operating Room Evaluation Score (O-Score). Acad Med. 2012; 87: 1401–7. DOI: 10.1097/ACM.0b013e3182677805 [DOI] [PubMed] [Google Scholar]
  • 122.Richardson D, Landreville JM, Trier J, Cheung WJ, Bhanji F, Hall AK, et al. Coaching in Competence by Design: coaching in the moment and coaching over time. Perspect Med Educ. 2024; 13(1): 33–43. DOI: 10.5334/pme.959 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 123.Royal College of Physicians and Surgeons of Canada. Information by Discipline. Retrieved 5 February 2023. https://www.royalcollege.ca/rcsite/ibd-search-e.
  • 124.Thoma B, Hall AK, Clark, K, Meshkat N, Cheung WJ, Desaulniers P, et al. Evaluation of a national competency-based assessment system in emergency medicine: a CanDREAM study. J Grad Med Educ. 2020; 12: 425–34. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 125.Sender Liberman A, Liberman M, Steinert Y, McLeod P, Meterissian S. Surgery residents and attending surgeons have different perceptions of feedback. Med Teach. 2005; 27: 470–2. DOI: 10.1080/01421590500129183 [DOI] [PubMed] [Google Scholar]
  • 126.Holmboe ES. Direct observation. In: Holmboe ES, Durning SJ, Hawkins RS, (eds.), Practical Guide to the Evaluation of Clinical Competence. 2nd ed. Philadelphia: Mosby-Elsevier; 2017. [Google Scholar]
  • 127.Westberg J, Hilliard J. Collaborative Clinical Education. New York: Springer; 1993. [Google Scholar]
  • 128.ten Cate O, Snell L, Mann K, Vermunt J. Orienting teaching toward the learning process. Acad Med. 2004; 79: 219–28. DOI: 10.1097/00001888-200403000-00005 [DOI] [PubMed] [Google Scholar]
  • 129.Dweck CS. Mindset: The New Psychology of Success. New York: Penguin Random House; 2006. [Google Scholar]
  • 130.Wolcott MD, McLaughlin JE, Hanne A, Miklavec A, Dallaghan GLB, Rhoney DH, et al. A review to characterise and map the growth mindset theory in health professions education. Med Educ. 2021; 55: 430–40. DOI: 10.1111/medu.14381 [DOI] [PubMed] [Google Scholar]
  • 131.Goldhamer MEJ, Martinez-Lage M, Black-Schaffer WS, Huang JT, Co JPT, Weinstein DF, et al. Reimagining the clinical competency committee to enhance education and prepare for competency-based time-variable advancement. J Gen Intern Med. 2022; 37: 2280–90. DOI: 10.1007/s11606-022-07515-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 132.Royal College of Physicians and Surgeons of Canada. Competence committees. Retrieved 5 February 2023. https://www.royalcollege.ca/rcsite/cbd/assessment/competence-committees-e.
  • 133.Oswald A, Dubois D, Snell L, Anderson R, Karpinski J, Hall AK, et al. Implementing competence committees on a national scale: design and lessons learned. Perspect Med Educ. 2024; 13(1): 56–67. DOI: 10.5334/pme.961 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 134.Cheung WJ, Bhanji F, Gofton W, Hall AK, Karpinski J, Richardson D, et al. Programmatic assessment in Canadian specialty residency education: implementation and lessons learned. Perspect Med Educ. 2024; 13(1): 44–55. DOI: 10.5334/pme.956 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 135.Royal College of Physicians and Surgeons of Canada. 2016. CBD Working Group Communique: Assessment. p.4. https://www.royalcollege.ca/content/dam/documents/accreditation/competence-by-design/directory/cbd-policy-communique-assessment-e.pdf. Accessed March 10, 2023.
  • 136.Thoma B, Bandi V, Carey R, Mondal D, Woods R, Martin L, Chan T. Developing a dashboard to meet Competence Committee needs: a design-based research project. Canadian Medical Education Journal. 2020; 11(1): e16. DOI: 10.36834/cmej.68903 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 137.Carey R, Wilson G, Bandi V, Mondal D, Martin LJ, Woods R, Thoma B. Developing a dashboard to meet the needs of residents in a competency-based training program: a design-based research project. Canadian Medical Education Journal. 2020; 11(6): e31. DOI: 10.36834/cmej.69682 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 138.Viberg O, Hatakka M, Balter O, Mavroudi A. The current landscape of learning analytics in higher education. Comput Hum Behav. 2018; 89: 98–110. DOI: 10.1016/j.chb.2018.07.027 [DOI] [Google Scholar]
  • 139.Yilmaz Y, Carey R, Chan TM, Bandi V, Wang S, Woods RA, … Thoma B. Developing a dashboard for faculty development in competency-based training programs: a design-based research project. Canadian Medical Education Journal. 2021; 12(4): 48–64. DOI: 10.36834/cmej.72067 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 140.Royal College of Physicians and Surgeons of Canada. Exams and Credentialing. Retrieved 5 February 2023. https://www.royalcollege.ca/rcsite/cbd/assessment/cbd-exams-e.
  • 141.Bhanji F, Naik V, Skoll A, Pittini R, Daniels VJ, Bacchus CM. Competence by Design: the role of high-stakes examinations in a competency based medical education system. Perspect Med Educ. 2024; 13(1): 68–74. DOI: 10.5334/pme.965 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 142.CanRAC consortium. CanERA. Retrieved 5 February 2023. https://www.canera.ca/canrac/home-e.
  • 143.Dalseg TR, Thoma B, Wycliffe-Jones K, Frank JR, Taber S. Enabling implementation of competency-based medical education through an outcomes-focused accreditation system. Perspect Med Educ. 2024; 13(1): 75–84. DOI: 10.5334/pme.963 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 144.Ten Cate O, Gruppen LD, Kogan JR, Lingard LA, Teunissen PW. Time-variable training in medicine: theoretical considerations. Acad Med. 2018; 93(3S): S6–S11. DOI: 10.1097/ACM.0000000000002065 [DOI] [PubMed] [Google Scholar]
  • 145.Tyack DB, Cuban L. Tinkering toward Utopia. Cambridge: Harvard University Press; 1995. [Google Scholar]
  • 146.Thoma B, Caretta-Weyer H, Schumacher DJ, Warm E, Hall AK, Hamstra SJ, ICBME Collaborators. Becoming a deliberately developmental organization: using competency based assessment data for organizational development. Medical Teacher. 2021; 43(7): 801–9. DOI: 10.1080/0142159X.2021.1925100 [DOI] [PubMed] [Google Scholar]
  • 147.Fulbrook P, Mooney S. Care bundles in critical care: a practical approach to evidence-based practice. Nurs Crit Care. 2003; 8: 249–55. DOI: 10.1111/j.1362-1017.2003.00039.x [DOI] [PubMed] [Google Scholar]
  • 148.Wilson P, Kislov R. Implementation Science. Cambridge: Cambridge University Press. 2022. DOI: 10.1017/9781009237055 [DOI] [Google Scholar]
  • 149.Cheung WJ, Wagner N, Frank JR, Oswald A, Van Melle E, Skutovich A, et al. Implementation of competence committees during the transition to CBME in Canada: national fidelity-focused evaluation. Med Teach. 2022; 44: 781–9. DOI: 10.1080/0142159X.2022.2041191 [DOI] [PubMed] [Google Scholar]
  • 150.Ahn E, LaDonna KA, Landreville JM, Mcheimech R, Cheung WJ. Only as strong as the weakest link: resident perspectives on entrustable professional activities and their impact on learning. J Grad Med Educ. 2023; 15(6): 676–84. DOI: 10.4300/JGME-D-23-00204.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 151.Thoma B, Bandi V, Carey R, Mondal D, Woods R, Martin L, Chan T. Developing a dashboard to meet Competence Committee needs: a design-based research project. Canadian Medical Education Journal. 2020; 11(1): e16. DOI: 10.36834/cmej.68903 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 152.Carey R, Wilson G, Bandi V, Mondal D, Martin LJ, Woods R, Thoma B. Developing a dashboard to meet the needs of residents in a competency-based training program: a design-based research project. Canadian Medical Education Journal. 2020; 11(6): e31. DOI: 10.36834/cmej.69682 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 153.Yilmaz Y, Carey R, Chan TM, Bandi V, Wang S, Woods RA, Thoma B. Developing a dashboard for faculty development in competency-based training programs: a design-based research project. Canadian Medical Education Journal. 2021; 12(4): 48–64. DOI: 10.36834/cmej.72067 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 154.Yilmaz Y, Carey R, Chan TM, Bandi V, Wang S, Woods RA, Thoma B. Developing a dashboard for program evaluation in competency-based training programs: a design-based research project. Canadian Medical Education Journal. 2022; 13(5): 14–27. DOI: 10.36834/cmej.73554 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 155.Resident Doctors of Canada and the Royal College of Physicians and Surgeons of Canada. Competence by Design: Resident Physician Pulse Check Report from the 2021 Collaborative Study. September, 2022. Ottawa: Royal College. [Google Scholar]
  • 156.Hall AK, Nousiainen MT, Campisi P, Dagnone JD, Frank JR, Kroeker KI, et al. Training disrupted: practical tips for supporting competency-based medical education during the COVID-19 pandemic. Med Teach. 2020; 42: 756–61. DOI: 10.1080/0142159X.2020.1766669 [DOI] [PubMed] [Google Scholar]
  • 157.Prinsloo P, Slade S, Khalil M. The answer is (not only) technological: Considering student data privacy in learning analytics. Br J Educ Technol. 2022; 53: 876–93. DOI: 10.1111/bjet.13216 [DOI] [Google Scholar]
  • 158.The International CBME Collaborators. Implementation of CBME: A Typology of International Programs. 2021; Unpublished manuscript. Ottawa: ICBMEC. [Google Scholar]
  • 159.Oandasan I. Advancing Canada’s family medicine curriculum: Triple C. Can Fam Physician. 2011; 57(6): 739–40. [PMC free article] [PubMed] [Google Scholar]
  • 160.College of Family Physicians of Canada. The Triple C Competency-based Curriculum. https://www.cfpc.ca/en/education-professional-development/educational-frameworks-and-reference-guides/triple-c-competency-based-curriculum. Accessed November 25, 2023.
  • 161.Incoll I, Atkin J, Owen J, Kean A, Khorshid O, Cosenza A, Frank JR. Australian orthopaedic surgery training: Australian Orthopaedic Association’s strategic education review. ANZ J Surg. 2020; 90(6): 997–1003. DOI: 10.1111/ans.15609 [DOI] [PubMed] [Google Scholar]
  • 162.de Graaf J, Bolk M, Dijkstra A, van der Horst M, Hoff RG, ten Cate O. The implementation of entrustable professional activities in postgraduate medical education in the Netherlands: rationale, process, and current status. Acad Med. 2021; 96(7S): S29–S35. DOI: 10.1097/ACM.0000000000004110 [DOI] [PubMed] [Google Scholar]
  • 163.Heard JK, Allen RM, Clardy J. Assessing the needs of residency program directors to meet the ACGME general competencies. Acad Med. 2002; 77(7): 750. DOI: 10.1097/00001888-200207000-00040 [DOI] [PubMed] [Google Scholar]
  • 164.de Graaf J, et al. [ref 162], p. S32. [Google Scholar]
  • 165.Green ML, Holmboe E. Perspective: the ACGME toolbox: half empty or half full? Acad Med. 2010; 85(5): 787–90. DOI: 10.1097/ACM.0b013e3181d737a6 [DOI] [PubMed] [Google Scholar]
  • 166.Holmboe ES. Realizing the promise of competency-based medical education. Acad Med. 2015; 90(4): 411–3. DOI: 10.1097/ACM.0000000000000515 [DOI] [PubMed] [Google Scholar]
  • 167.Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach. 2007; 29: 648–54. DOI: 10.1080/01421590701392903 [DOI] [PubMed] [Google Scholar]
  • 168.Holmboe ES, Yamazaki K, Nasca TJ, Hamstra SJ. Using longitudinal milestones data and learning analytics to facilitate the professional development of residents: early lessons from three specialties. Acad Med. 2020; 95(1): 97–103. DOI: 10.1097/ACM.0000000000002899 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 169.Australian Orthopaedic Association. About AOA-21. https://aoa.org.au/orthopaedic-training/content-page/about-aoa-21. Accessed November 25, 2023.
