Perspectives on Medical Education. 2024 Feb 6;13(1):75–84. doi: 10.5334/pme.963

Enabling Implementation of Competency Based Medical Education through an Outcomes-Focused Accreditation System

Timothy R Dalseg 1,2,3, Brent Thoma 2,4, Keith Wycliffe-Jones 5, Jason R Frank 6, Sarah Taber 2
PMCID: PMC10854411  PMID: 38343559

Abstract

Competency based medical education is being adopted around the world. Accreditation plays a vital role as an enabler in the adoption and implementation of competency based medical education, but little has been published about how the design of an accreditation system facilitates this transformation. The Canadian postgraduate medical education environment has recently transitioned to an outcomes-based accreditation system in parallel with the adoption of competency based medical education. Using the Canadian example, we characterize four features of an accreditation system that can facilitate the implementation of competency based medical education: theoretical underpinning, quality focus, accreditation standards, and accreditation processes. Alignment of the underlying educational theories within the accreditation system and educational paradigm drives change in a consistent and desired direction. An accreditation system that prioritizes quality improvement over quality assurance promotes educational system development and progressive change. Accreditation standards that achieve the difficult balance of being sufficiently detailed yet flexible foster a high fidelity of implementation without stifling innovation. Finally, accreditation processes that recognize the change process, encourage program development, and are not overly punitive all enable the implementation of competency based medical education. We also discuss the ways in which accreditation can simultaneously hinder the implementation of this approach. As education bodies adopt competency based medical education, particular attention should be paid to the role that accreditation plays in successful implementation.

Introduction

The implementation of competency based medical education (CBME) throughout Canadian postgraduate medical education (PGME) has resulted in an unprecedented level of reform. With the College of Family Physicians of Canada (CFPC) implementing its Triple C Competency-based Curriculum and the Royal College of Physicians and Surgeons of Canada (Royal College) transitioning to Competence by Design (CBD), the way in which trainees are educated and assessed in PGME has fundamentally changed [1,2,3]. Implementation of CBME is a large-scale, complex change initiative [4]. Without successful implementation, learners will fail to benefit from the purported advantages of this educational approach.

While others have characterized the CBME implementation experience and have identified factors perceived to have influenced the outcome, the literature fails to discuss the important role that accreditation plays in enabling this transition [5,6,7]. The substantive changes that have taken place in the Canadian PGME system have occurred in parallel with a transition to an outcomes-based accreditation process within PGME. The educational paradigm and the accreditation system must be aligned, or the implementation of transformational change will be hampered.

We describe how a unique, collaboratively constructed accreditation system enabled the transition to, and implementation of, CBME. The lessons learned are shared to facilitate the continuous improvement process locally and to inform the world discourse on accreditation as a tool to maximize fidelity and integrity of implementation in these evolving systems.

Definitions

Competency-based medical education is “oriented to graduate outcome abilities, organized around competencies derived from an analysis of societal and patient needs” (p. 636) [8]. Individual learner competencies are observable abilities that integrate knowledge, skills, and attitudes [9]. CBD is the Royal College’s version of CBME that began implementation in 2017; it combines an outcomes-based approach to learning with the use of time as a resource [3]. CBME is more than a curriculum. It involves the design, implementation, assessment, and evaluation of the associated educational program as well as a series of policy decisions and implications [9,10]. The adoption of CBME, and in this case CBD, has been characterized as a transformational change, requiring a significant shift in behaviours [11].

Accreditation is “the process of formal evaluation of an educational program, institution, or system against defined standards by an external body for the purposes of quality assurance and enhancement” (p. 4) [12]. The fit-for-purpose framework for accreditation systems states that the best design prioritizes local needs and contexts, resulting in a tailored approach that is adaptable to a jurisdiction’s changes over time [13]. By prioritizing contexts, the fit-for-purpose framework establishes an important connection between local factors and an accreditation system’s ability to effectively conduct a situationally meaningful accreditation in its pursuit of quality. One such local context that must be considered in the design of any accreditation system is the educational approach: in this case, CBME [13,14].

CanERA

In Canada, three colleges share a mandate to accredit PGME: the CFPC, the Collège des Médecins du Québec (CMQ), and the Royal College. The colleges’ system of accreditation went without transformational change for decades (from approximately 1990 to 2010), with incremental improvements focused on enhancing standards and improving alignment between the colleges. The impetus for the current change arose in the late 2000s (circa 2008–2009), with key postgraduate medical education stakeholders signalling the need for modern redesign [15]. This call for change coincided with early developments of CBME models and the drive toward CBME in Canada [16,17].

In response to this call for change, the three colleges formed the Canadian Residency Accreditation Consortium (CanRAC) in 2013. Collaboratively, CanRAC hosted a series of summits from 2013 to 2018 that included stakeholders from across the Canadian PGME and accreditation landscape. Throughout this process, CanRAC sought to identify opportunities for alignment and coordination of accreditation processes across the continuum of medical education, incorporate and innovate on best practices in accreditation, and adapt to changes taking place within the education paradigm [18]. This work culminated in the official launch of a new, unified PGME accreditation system, Canadian Excellence in Residency Accreditation (CanERA), on July 1, 2019. CanERA features a bundle of transformational accreditation changes (Figure 1) [18].

Figure 1.


Transformational changes in the Canadian Excellence in Residency Accreditation (CanERA) accreditation system. Copyright 2017. The Canadian Residency Accreditation Consortium (CanRAC). https://www.canrac.ca/canrac/about-e. Reproduced with permission.

A set of goals was established for CanERA with the design and launch of this new system. These included an intent to: modernize the approach to accreditation through new standards; provide a new framework to guide the evaluation of standards; increase the emphasis on continuous quality improvement; reduce workload for all stakeholders through a longer, eight-year accreditation cycle and a digital accreditation management platform; and support CBME implementation through standards, policies, and processes that align with the principles of CBME.

What features of an accreditation system enable the implementation of CBME?

Alignment of educational theory

Previous work has proposed that accreditation systems and their design are influenced by the environments in which they are situated [13]. However, the opposite is true as well: the educational environment can be profoundly influenced by the accreditation system and its design. It must be recognized that accreditation systems wield considerable influence through their regulatory mandates. Potential accreditation outcomes drive the educational behaviour and decisions of those being accredited. When implementing a new educational paradigm, it is therefore important to have an accreditation system that promotes and prioritizes shared characteristics and constructs. Alignment of the underlying features, including the theoretical basis of the new paradigm, educational priorities, and societal responsibilities, is critical to ensure that implementation is driven in the direction of intended educational change.

Traditional accreditation systems were focused on structures and processes, which was well suited to an educational system that relied on time to ensure that learners had received the required training to be a competent physician. As an outcome-based educational paradigm, CBME utilizes a curriculum that is organized around the assessment, documentation, and interpretation of outcomes in the form of competencies across the educational experience [9,19,20]. CanERA’s new accreditation system design reflects a departure from the more easily evaluated constructs of structure and process and instead places emphasis on outcomes within the accreditation standards [21]. For example, instead of simply requiring that a policy exists, those undergoing accreditation would be expected to demonstrate how a policy is used, and its outcome, to establish compliance with the standard. Within an outcomes-based accreditation system, postgraduate training programs are more freely able to adopt and prioritize educational innovations that are consistent with the principles of CBME. These may include educational interventions, learning experiences, and assessments that are focused on the outcomes of the trainee. For example, instead of requiring that all learners rotate through a specific educational experience, programs are free to utilize non-traditional, or learner-driven experiences to satisfy educational competencies. The ability to incorporate these types of critical components is important to advancing the fidelity of implementation or, in other terms, the extent to which the critical components of an innovation are present in an enacted system [22].

Furthermore, the principles of an outcomes-based accreditation system align with the educational outcome priorities of CBME on a more fundamental level: specifically, an emphasis on graduate and program outcomes that will create competent physicians who will contribute to, and be responsible for, improving the health of individuals and populations [9,23,24]. This alignment serves to strengthen the validity of the accreditation system and process, while simultaneously addressing expectations for social accountability [9,23,24]. Alignment at this level increases the likelihood that programs and institutions will integrate and apply these essential principles in their own contexts, thereby furthering the integrity of implementation [25].

Quality assurance versus quality improvement focus

Systems of accreditation have a purpose that often includes both quality assurance and quality improvement [12]. A quality assurance focus assures the public, regulators, and others that programs meet minimum standards for educational quality, thereby preventing harm, particularly during a period of major curricular reform [26]. Accreditation organizations thus have an important fiduciary duty to institutions and programs, trainees, and, ultimately, patients. Conversely, a quality improvement focus emphasizes helping programs, and ultimately the system of medical education, to improve quality through self-evaluation and to achieve aspirational standards over time, all while posing a lower risk to their accreditation status [27,28]. Arguably, this notion of pursuing improvement may be particularly important during the early stages of CBME adoption. A quality improvement philosophy can also help to promote the identification and sharing of “next” and best practices, and thus promote the diffusion of innovation through a period of CBME implementation [27]. While some accreditation systems place greater emphasis on one quality objective over the other, most systems find themselves situated somewhere between these two ends of a spectrum [13].

To meet both priorities, the CanERA system reframed its mandate and placed increased emphasis and expectations on continuous quality improvement while retaining design features that focused on quality assurance. At the program level, this emphasis was manifested by the creation of a new quality improvement domain within the accreditation standards [29]. This domain, with its associated standards and indicators, was created to ensure that a culture of quality improvement was present in postgraduate training programs and to clearly define the expectations to which all programs would be held. In this new system design, a balance was struck that advanced implementation while maintaining safety within the system [29].

Accreditation standards

Accreditation standards can be defined as “measures or generally accepted benchmarks used in making decisions about the quality of a program, institution, or system” (p. 6) [12]. Thus, standards outline the expectations for programs’ and institutions’ achievement of quality and drive associated behaviour to meet the criteria [30,31]. Accreditation standards are perhaps one of the most fundamental components of an accreditation system [32]. In the case of CBME, ensuring that the content of accreditation standards is aligned with and ideally based upon the core components of a competency-based design [11] will help to drive the implementation of CBME.

Accreditation standards differ in terms of their level of detail, the flexibility afforded in their requirements, and their focus on structures, processes, and/or outcomes. Accreditation standards that are highly detailed or prescriptive, or that primarily emphasize structural and process-based criteria, may be most effective at driving standardization across programs [33]. In CBME, this can provide clarity to programs in terms of expectations and drive fidelity in relation to the desired model of educational design. However, it may be argued that this approach could also stifle important innovation that occurs through the natural processes of change diffusion [34]. Conversely, standards that are written to allow for greater flexibility or that place a greater emphasis on educational outcomes may facilitate programs’ abilities to innovate in their local implementation of CBME. This approach, combined with a mechanism that allows the accreditation standards to be continuously informed by innovations as implementation occurs, could be quite powerful in driving ongoing iteration and improvement, thus supporting the implementation of CBME. However, overly flexible or vague standards may not provide programs with the guidance they need, particularly at early stages of implementation, and may ultimately result in a loss of fidelity of implementation of the desired model of educational design.

Overall, the CanERA standards were written with a balance of structure, process, and outcome measures in mind, reflecting a spectrum from the more detailed to the more flexible, respectively. Specifically, the standards were structured in a hierarchy, beginning with an overarching standard, under which are nested elements, requirements, and indicators; the evaluation framework requires each measurable requirement and indicator to be evaluated [29], providing precise feedback to programs about what has been implemented according to the standards’ expectations and about where areas for improvement have been identified. For example, the standard regarding the educational program (standard 3) states an overarching outcome that “Residents are prepared for independent practice” (p. 9) [29]; its nested elements, requirements, and indicators then outline expectations based on the core components of CBME. The specific indicators in this standard that are evaluated through the accreditation process provide specific guidance to programs in terms of what is required as part of CBME implementation (e.g., indicator 3.1.1.2: “The competencies and/or objectives address each of the Roles in the CanMEDS/CanMEDS-FM Framework specific to the discipline” (p. 9) [29]) and where more flexibility in how to achieve the outcome is afforded (e.g., indicator 3.2.3.1: “Individual residents’ educational experiences are tailored to accommodate their learning needs and future career aspirations, while meeting the national standards and societal needs for their discipline” (p. 9) [29]). The content of the CanERA standards, combined with their intentional evaluation structure and wording, acts as a powerful enabler in the transition to CBME.
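The hierarchy described above (an overarching standard, with nested elements, requirements, and indicators, each measurable indicator evaluated individually) can be sketched as a simple data model. This is a hypothetical illustration only, not CanRAC’s actual schema: the quoted standard and indicator texts are abridged from the published standards [29], but the field names and evaluation logic are our assumptions.

```python
# Hypothetical sketch of a hierarchical standards model: standard ->
# requirements -> indicators, where each indicator is evaluated
# individually. Field names and methods are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Indicator:
    code: str
    text: str
    met: Optional[bool] = None  # None = not yet evaluated

@dataclass
class Requirement:
    code: str
    indicators: List[Indicator] = field(default_factory=list)

@dataclass
class Standard:
    number: int
    outcome: str
    requirements: List[Requirement] = field(default_factory=list)

    def unmet_indicators(self) -> List[Indicator]:
        # Surface precisely which indicators were not met, mirroring how
        # an evaluation framework could give programs targeted feedback.
        return [i for r in self.requirements for i in r.indicators if i.met is False]

standard3 = Standard(
    number=3,
    outcome="Residents are prepared for independent practice",
    requirements=[
        Requirement("3.1.1", [
            Indicator("3.1.1.2",
                      "Competencies/objectives address each CanMEDS Role",
                      met=True),
        ]),
        Requirement("3.2.3", [
            Indicator("3.2.3.1",
                      "Educational experiences are tailored to learning needs",
                      met=False),
        ]),
    ],
)

print([i.code for i in standard3.unmet_indicators()])  # ['3.2.3.1']
```

The point of the sketch is the granularity: because evaluation happens at the indicator level rather than only at the level of the overarching standard, feedback to programs can name the exact expectation that was not met.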

Accreditation processes

“Accreditation processes” is a broad term used to encompass several core activities common to most accreditation systems, including how the standards are evaluated (i.e., models of self-assessment and external assessment, information used to evaluate standards), how decisions are rendered (i.e., decision categories and processes), by whom (i.e., site review models), and how often (i.e., the accreditation cycle) [13]. There is significant variation in these processes across accreditation systems. What is important — or perhaps essential — is that they are aligned with and reward (and therefore promote) the desired behaviours of the program or institution being accredited [13].

The CanERA accreditation system introduced several features intended to help foster the implementation of CBME. First, programs were given time to work on CBME implementation before facing accreditation consequences. Specifically, the accreditation standards require that programs implement CBD according to specialty-specific educational requirements, such as the use of specialty-specific learner competencies in defining program objectives (indicator 3.1.1.1, p. 9 [29]) and specific educational experiences such as rotations (indicator 3.2.1.2, p. 9 [29]); the accreditation policies ensured that programs would not be expected to comply with these requirements during the first year of a specialty’s CBD launch [35].

Second, CanERA introduced new approaches to accreditation decision-making intended to support CBD implementation. CanERA accreditation categories and associated decision-making principles were revised to encourage and reward continuous improvement rather than being punitive; specifically, the decision principles were designed such that programs and institutions that actively share with accreditors what they are working on are not penalized for those areas at the time of the accreditation decision, provided they achieve minimum standards and are making improvements [35]. This contrasts with previous accreditation models, in which any gap in the standards, including gaps known to the program and under active improvement, was included in the program’s formal list of weaknesses and factored into the accreditation decision. Additionally, the newly introduced accreditation decision at the institution level, along with additional standards requirements for how institutions oversee their postgraduate training programs and assist them with quality improvement, is intended to bolster the role that PGME offices and their leadership play in leading an institution-wide change such as CBME.

Finally, the new CanERA accreditation process aims to foster innovation and experimentation, particularly with novel approaches to successfully implementing CBD. In CanERA, a leading practice and innovation (LPI) is defined as “a practice (method, procedure, etc.) that is noteworthy for the discipline, or residency education writ large; and/or is unique and innovative in nature” [35]. LPIs introduced the ability for surveyors to recognize programs and institutions for a novel or interesting practice that could be of interest to other programs or institutions across the country. The aim is to share these practices, including novel and successful approaches to implementing CBD, in a future database that other programs could access when looking for ways to address specific areas for improvement or to improve their programs generally.

How accreditation can simultaneously be a barrier to implementation of CBME

While accreditation can act as an enabler of the implementation of CBME, features of the accreditation system, its standards, and its processes can also introduce limitations and unintended consequences.

Despite agreement that the aligned focus on outcomes within both accreditation and CBME can enable implementation [36,37], there are challenges that come with an over-reliance on outcomes in accreditation [23]. In any curricular enterprise, there are elements, including educational processes and features of the learning environment, that cannot be evaluated in the form of outcomes [24] and yet are inherently important to the quality of the program and therefore to accrediting bodies. Rigid adherence to the outcomes-based construct would prevent these elements from being effectively evaluated, compromising the fiduciary responsibility that accreditation bodies hold. Furthermore, the absence of these constructs within the accreditation system design could impair early implementation through deficient characterization of these required elements (see Table 1).

Table 1.

How features of an accreditation system can enable or hinder implementation of competency based medical education.


Alignment of educational theory

  Enables implementation of CBME:
    • Alignment of educational theory within the educational paradigm and accreditation system drives change in a consistent and intended direction.
      • Example: Outcomes-based alignment: a decreased emphasis on structure and process in accreditation facilitates the adoption of CBME principles within programs.

  Hinders implementation of CBME:
    • Rigid adherence to a single educational theory ignores the importance of competing traditional constructs, preventing their evaluation and risking impaired early implementation.
      • Example: Even in an outcomes-based system, process elements such as attributes of the learning environment are not best evaluated by outcomes, yet remain critical to effective curricular implementation.

Quality focus

  Enables implementation of CBME:
    • A quality improvement focus drives self-evaluation, with an objective of system development that can facilitate adoption of CBME, driving the fidelity and integrity of implementation.
    • This focus also promotes sharing of best practices and diffusion of innovations.

  Hinders implementation of CBME:
    • A quality assurance focus ensures that minimum standards are met by programs, minimizing the risk of harm to stakeholders; in doing so, however, programs may prioritize the achievement of standards instead of the implementation of CBME.
    • Adoption of innovations is not prioritized.

Accreditation standards

  Enables implementation of CBME:
    • Highly detailed and prescriptive standards can provide clear expectations, improving the fidelity of implementation.
    • Outcome-based standards that allow greater flexibility promote innovation and site-specific adaptations that encourage implementation.

  Hinders implementation of CBME:
    • Highly detailed and prescriptive standards can stifle innovation and the change process, thereby impairing implementation.
    • Outcome-based standards may be too flexible, resulting in inadequate guidance and a low fidelity of implementation.
    • Standards that fail to evolve, or are slow to do so, may slow the pace of CBME adoption.

Accreditation processes

  Enables implementation of CBME:
    • Leniency on accreditation standards during the first year of implementation may support programs’ transition to CBME.
    • Accreditation decision categories and decision-making principles that promote continuous quality improvement within programs can facilitate implementation without punitive consequences.
    • Recognition of novel leading practices and innovations can foster implementation by other programs.

  Hinders implementation of CBME:
    • The perception of a lower-stakes accreditation process through leniency and tailored decision-making principles may, for some, reduce the motivation for change and CBME implementation.

Abbreviation: CBME: competency based medical education.

Accreditation is, by nature, a cyclical process spanning many years; programs often spend several years preparing to meet accreditation standards, which can make it challenging for accreditation systems to be agile and ensure standards remain relevant [34]. Yet, to promote the implementation of CBME, certain accreditation standards may be aspirational or, at a minimum, be flexible enough to support the desired changes. Anachronistic standards that conflict with the desired model of CBME, such as prescriptive requirements regarding time spent in training, risk unduly punishing programs that adopt CBME and slowing the overall pace of CBME adoption and implementation across the system. Accreditation bodies thus have a duty to ensure their standards keep pace with major changes in medical education — in this case, CBME.

Finally, the resource implications of accreditation reform alongside CBME change can present a barrier. As existing systems of accreditation evolve to better match the adoption of the CBME paradigm, many internal changes are required to maintain the quality and effectiveness of the accreditation process [36]. The exact resource requirements for large-scale changes such as these have not been described but, as an example, Greenfield et al. [38] noted that the development and revision of accreditation standards for general practice in Australia was alone a significant and challenging undertaking requiring considerable resources. These implications could be even greater when the accreditation process itself is novel or significantly modified because of the adoption of CBME. Changes to an accreditation system can, in turn, have resource implications that cycle back to the program. Engagement and participation in an accreditation process have resource impacts that can stretch over many months to years, with potential negative effects not only on the educational mission that the accreditation process exists to examine but also on the implementation of CBME.

Considerations for future system design

CBME represents a fundamental shift in medical education design to emphasize outcomes, primarily the competencies acquired by graduates of a given program. In the future, methods and technologies developed to provide intelligence based on program activities and graduate outcomes may transform the accreditation enterprise.

  1. Program evaluation analytics. CBME programs generate large amounts of assessment data that, in addition to informing trainee progression [39,40], could provide insights into other aspects of the program that are the focus of accreditation standards, such as the teaching and assessment practices of faculty [41] and rotations [42]. This information could provide particularly valuable insight into the effectiveness of quality improvement efforts driven by the accreditation process [43]: faculty development efforts would be expected to lead to improved supervisor feedback to trainees on assessments, and changes to rotation schedules would be expected to improve the alignment between the desired and realized completion of assessments. The use of this type of data could therefore advance the specificity and objectivity of evaluation processes in accreditation.

  2. Program outcomes standards. Beyond the inclusion of simple assessment metrics, the ability to track more distal educational and clinical outcomes could provide an opportunity to evaluate additional dimensions of quality from an accreditation lens that would not be possible without the adoption of CBME [4,20,44]. For example, what better evidence of a quality educational program could there be than the demonstration that graduates of a program have appropriate clinical outcomes once they begin practice? The robust integration of educational and clinical data can, at the program level, provide insight into how well a residency program is preparing its graduates for practice [45], while at the system level it could provide insight into the features of residency training that result in graduates with better clinical outcomes. These insights could, in turn, inform the development of future accreditation standards.

  3. Validity evidence for transformative designs. Finally, given the high-stakes nature of the accreditation process and the value that is attributed to it, it would be interesting to consider how the validity argument for accreditation [46] might be strengthened as new accreditation designs roll out in tandem with CBME [47]. The search for validity evidence to support medical accreditation and the decisions that are a part of it is not new [48]. Current published validity evidence is limited and typically restricted to the validity of unique survey tools and the face validity of isolated standards [48,49]. The transformative design decisions within CanERA described above were made not only to enable the implementation of CBME, but to advance the quality of the decision making that is part of accreditation. While anecdotal evidence would suggest that this has been successful, assessment data, program analytics, and new program outcome measures could all be used in the future to explore this objectively and contribute to the validity of accreditation decisions.
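As a toy illustration of the program evaluation analytics described in point 1 above, per-rotation assessment completion rates could be aggregated to flag rotations where assessment practices lag. All data, field names, and the 50% threshold here are hypothetical assumptions, not any college’s actual format or policy.

```python
# Toy sketch: aggregate CBME assessment records by rotation and flag
# rotations with low completion rates -- the kind of signal an
# accreditation-driven quality improvement process might review.
# All records, field names, and thresholds are hypothetical.
from collections import defaultdict

assessments = [
    {"rotation": "ICU",       "completed": True},
    {"rotation": "ICU",       "completed": True},
    {"rotation": "ICU",       "completed": False},
    {"rotation": "Emergency", "completed": True},
    {"rotation": "Emergency", "completed": False},
    {"rotation": "Emergency", "completed": False},
]

totals = defaultdict(lambda: {"done": 0, "expected": 0})
for record in assessments:
    bucket = totals[record["rotation"]]
    bucket["expected"] += 1
    bucket["done"] += record["completed"]  # True counts as 1, False as 0

completion = {rot: t["done"] / t["expected"] for rot, t in totals.items()}

# Flag rotations below an arbitrary 50% completion threshold.
flagged = sorted(rot for rot, rate in completion.items() if rate < 0.5)
print(flagged)  # ['Emergency']
```

In practice such analytics would draw on far richer data (assessor identity, entrustment scores, timing), but even this minimal aggregation shows how routinely collected CBME assessment data could feed more specific and objective accreditation evaluation.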

Conclusion

Accreditation systems have a powerful role to play in enabling the transition to CBME. Their features, including their alignment of underlying educational theory, quality focus (quality assurance versus quality improvement), accreditation standards, and accreditation processes, all represent design opportunities that can be used to facilitate this transition. It must be acknowledged, however, that within each of these domains there are limitations and that any design opportunity can come with consequences. We have presented examples from the CanERA accreditation system to illustrate the opportunities and limitations that have been observed to date, as well as the opportunities for the future. Educational institutions that are considering or are undertaking a transition to CBME should consider the importance of accreditation in enabling the successful transition to this new educational paradigm.

Disclaimer

The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the Royal College of Physicians and Surgeons of Canada (“Royal College”). Information in this article about Competence by Design (“CBD”), its implementation and related policies and procedures do not necessarily reflect the current standards, policies and practices of the Royal College. Please refer to the Royal College website for current information.

Funding Statement

The Competence By Design project was funded by the Royal College of Physicians and Surgeons of Canada and some individual authors received funding from the Royal College either as staff (JRF, ST) or consultants (TRD, BT).


Competing Interests

Some individual authors received funding from the Royal College either as staff (ST, JRF) or as consultants (TRD, BT).
