Med Educ. 2025;59(4):428–438. Published online 2024 Aug 6. doi: 10.1111/medu.15475

Fixing disconnects: Exploring the emergence of principled adaptations in a competency‐based curriculum

Mary C Ott 1,2, Lori Dengler 3, Kathryn Hibbert 4, Michael Ott 5
PMCID: PMC11906271  PMID: 39105665

Abstract

Purpose

Competency‐based medical education (CBME) promises to improve medical education through curricular reforms to support learner development. This intention may be at risk in the case of a Canadian approach to CBME called Competence by Design (CBD), since there have been negative impacts on residents. According to Joseph Schwab, teachers, learners and milieu must be included in the process of curriculum‐making to prevent misalignments between intended values and practice. This study considered what can be learned from the process of designing, enacting and adapting CBD to better support learners.

Methods

This qualitative study explored the making of CBD through the perspectives of implementation leads (N = 18) at national, institutional and programme levels. A sociomaterial orientation to agency in curriculum‐making guided the inductive approach to interviewing and analysis in phase one. A deductive analysis in phase two applied Schwab's theory to further understand sources of misalignments and the purpose of adaptive responses.

Results

Misalignments occurred when the needs of teachers, learners and milieu were initially underestimated in the process of curriculum‐making, disconnecting assessment practices from experiences of teaching, learning and entrustment. While technical and structural issues posed significant constraints on agency, some implementation leads were able to make changes to the curriculum or context to fix the disconnects. We identified six purposes for principled adaptations to align with CBME values of responsive teaching, individualised learning and meaningful entrustment.

Conclusion

Collectively, the adaptations we characterise demonstrate constructive alignment, a foundational principle of CBME in which assessment and teaching work together to support learning. This study proposes a model for making context‐shaped, values‐based adaptations to CBME to achieve its promise.

Short abstract

How do adaptations to CBME emerge, and which ones hold promise for supporting learners? This study on fixing #disconnects in #CompetencebyDesign offers insights.

1. INTRODUCTION

Competency‐based medical education (CBME) promised to be a learner‐centred approach to improving medical education, 1 , 2 , 3 , 4 but implementation has presented challenges to the best of intentions. 5 , 6 , 7 , 8 In Canada, unintended consequences of the national rollout of Competence by Design (CBD) include widely documented negative impacts on residents' learning and well‐being. 9 , 10 , 11 , 12 , 13 , 14 Is this a problem of putting the intended curriculum into practice?

Curriculum reforms are often evaluated in terms of discrepancies between the ‘intended’ and ‘enacted’ curriculum. 4 However, this division of the implementation process sets up an inherent hierarchy and false dichotomy between curriculum design as an enlightened stage of intent and curriculum enactment as the subject of evaluation. Such binary thinking can lead to two dead ends for the improvement of any curriculum project. It silences questions of how well a curriculum reflects the needs of its context, and it avoids consideration of how flaws in design can lead to failures in enactment.

In 2015, the Royal College of Physicians and Surgeons of Canada began developing a version of CBME for the residency programmes it accredits, intended to standardise curricula around entrustable professional activities (EPAs) in each specialty. 7 CBME leads were recruited to participate in the planning and organisational deployment required at national, institutional and programme levels. In 2023, the College responded to the challenges and unintended consequences that emerged with the implementation of CBD by introducing ‘CBD 2.0’, which provides more flexibility in the design of competency‐based learning for individual training programmes while maintaining alignment with core values. 15 These values are described in a core components framework 3 containing the defining features of CBME to facilitate the evaluation of ‘fidelity of implementation’ across different contexts. The framework proposes a constructive alignment of competency‐focused instruction, programmatic assessment of and for learning and tailored learning experiences to produce intended outcomes. The core components framework is intended to support ‘principled adaptations’. 3 In Canada, the College uses the core components framework in its evaluations of CBD. 7 , 14

Although programme evaluation literature has been generative in developing the theory of CBME, 3 , 16 , 17 , 18 , 19 it has critical limitations for curriculum improvement. First, the focus on fidelity obscures the fact that curriculum design is inseparable from implementation: adaptations are often responses to problems posed by an original design. Second, recommendations from CBME evaluations tend to focus on behavioural change, treating implementation challenges as indications of a need for ‘culture change’, 7 a posture of ‘growth mindset’ 7 , 20 and calls for further faculty and resident development. 16 , 17 This framing can ignore problems that emerge at national and institutional levels of planning and overlook the needs and experiences of training programmes and faculty members struggling to manage these concerns. To disrupt these potential dead ends in CBME evaluation, we need a complex view of curriculum‐making as a multilinear system in which design informs enactment and enactment informs design. 21 , 22 , 23 , 24 We also require a stronger understanding of agency in this adaptive process. 25 , 26 , 27

CBD 2.0 15 presents a crucial opportunity for educators tasked with leading implementation efforts to understand their agency in curriculum‐making. How might they design and enact locally developed competency‐based curricula to support learner development? This study explores sociomaterial agency in how CBME implementation leads at various levels of decision‐making designed, enacted and adapted Competence by Design. It examines how misalignments can disconnect a CBME curriculum from its intended purpose and contributes examples of principled adaptations that reconnect the curriculum with its core values.

2. METHODOLOGY

This qualitative study explored the curriculum‐making process of CBD through the perspectives of CBME implementation leads at national, institutional and programme levels, representing nine specialty training programmes in different academic centres and regions of Canada. The study was approved by the institutional research ethics board of Western University. Our two‐phase methodology borrowed 28 the iterative approach to collecting data and writing analytic memos from constructivist grounded theory 29 and applied curriculum theories to analyse the dataset. We conducted in‐depth, semi‐structured interviews (N = 18) to inquire into participants' experiences of the curriculum‐making process, reflecting work done between 2015 and 2021. Schwab's curriculum commonplaces framework 21 and sociomaterial theories of curriculum‐making 25 , 26 , 27 informed our approach to analysis.

2.1. Theoretical framework

The work of designing, enacting and adapting a curriculum is a contested, contextual and ongoing process. Joseph Schwab, a leading 20th‐century philosopher of curriculum theory, understood this. In seminal work, 21 , 22 , 23 , 24 Schwab described curriculum‐making as a multilinear process of developing the content and values of a curriculum that should involve four “commonplaces”: teachers, students, subject matter experts and the milieu (representing all things cultural and contextual that should inform the curriculum). In his ideal version, the needs and perspectives of each commonplace should be given equal consideration in deliberations about design and enactment throughout the process, reflecting a non‐hierarchical approach. Schwab's commonplaces framework predicts that misalignments may occur when the interests of one commonplace dominate in relation to others. 21

Schwab's work was instrumental in flattening the intuitive hierarchy that appears if we separately view curriculum design as a higher stage of decision‐making and enactment as a lower level of implementation. However, in the case of large‐scale curriculum reform, multiple levels of decision‐making are involved, and it is useful to study agency in curriculum‐making through this lens as well. Recent research understands agency as the capacity to act in macro (national), meso (institutional, programme) and micro (teacher) levels of curriculum‐making and finds that meso‐level actors have the most leverage by acting as knowledge conduits between macro and micro levels. 25 Another innovation in curriculum theory is expanded awareness of what counts as an actor. Sociomaterial perspectives have considered the agency of materials, space and time in structuring possibilities for curriculum design and enactment. 26 , 27

2.2. Reflexive process

The study design embedded reflexivity in the processes of problem framing and analysis. At the time of the study, the authors were affiliated with Western University's medical school in Ontario, Canada. The senior author [MO] conceptualised the study as research into the unintended consequences of CBD based on his role as a CBME lead with responsibilities for both planning EPAs at the national specialty committee level and implementation in his residency training programme. Another investigator [LD] joined due to her experience as an education coordinator supporting implementation in a different training programme that likewise experienced difficulties and changes in how CBD was designed and enacted. These shared experiences expanded our understanding of the different actors involved in the design and enactment of the curriculum in different contexts and of the changes that occurred across various phases and levels of implementation. Two authors [MO, MCO] were also members of the medical school's postgraduate training committee and thus privy to discussions on challenges related to implementation across the training programmes at this institution. Members of the research team with expertise in curriculum studies and sociomaterial theories conducted the data collection and preliminary analyses [MCO, LD, KH]. These practice‐based and theoretical insights allowed us to consider a range of experiences and perspectives throughout the study.

2.3. Participants

Our recruitment strategy was both broad (to capture the widest possible range of perspectives from CBME leads) and focused (to sample insights emerging in the process of data collection). We extended an open invitation to CBME leads who had participated in CBD development in the period between 2015 and 2021, when the process of curriculum‐making underwent a phased rollout. At the time, the role of a CBME lead could include curriculum design responsibilities in national specialty committees, institutional responsibilities for resource allocation and faculty development, programme support for implementation (education coordinators) and programme leadership for implementation (programme directors and competence committee chairs). Some CBME leads held more than one role as the process unfolded. When quoting participants in the results, we highlight their role at the time of the interviews as follows: programme leads (P), education coordinators (E) and institution leads (I).

CBME leads took part in nationally organised webinars by the Royal College to collectively learn through the implementation process, 7 and as part of this were encouraged to share practices and participate in research. As members of this network, we sent a study letter of information to CBME leads who had consented to being contacted for research purposes. We also invited all CBME leads at our academic centre to participate by disseminating the study information through the postgraduate medical education committee. One issue was the recruitment of education support staff, who did not always identify as ‘leads.’ Using the technique of network sampling, 30 an education coordinator at our institution shared the study information with coordinators in other institutions. Interested participants contacted the study team to complete the recruitment and consent processes. The study sample is summarised in Table 1.

TABLE 1.

Study sample.

Program Leads (P)
Number: 9
Roles: Program Director: 8; Program CBME lead: 1; Competence Committee chair: 4
Regions (medical schools): Ontario (1), Saskatchewan (1)
Training contexts: Internal Medicine, Emergency Medicine, Physical Medicine, Neurology, Geriatric Psychiatry, Anesthesiology, Paediatric Urology, OtoHnS, Orthopaedic Surgery, General Surgery

Institutional Leads (I)
Number: 5
Roles: Associate Dean: 3; PGME Chair: 2
Regions (medical schools): Quebec (1), Ontario (3), Saskatchewan (1)

Education Coordinators/Program Assistants (E)
Number: 4
Roles: Program Coordinator: 3; Program Assistant: 1
Regions (medical schools): Ontario (3)

Note: CBME = competency‐based medical education; PGME = postgraduate medical education committee; OtoHnS = Otolaryngology‐Head & Neck Surgery.

2.4. Interview process

Semi‐structured interviews averaged 40 minutes and probed participants' experiences with the design, enactment and adaptation of CBD, focusing on planning processes at the national, institutional and programme levels, the kinds of work and resources required to enact the curriculum within institutions and programmes, and their perceived agency as CBME leads in making adaptations. The interview guide is available as a supplementary file. The questions regarding work, resources and agency allowed the participants to reflect on their own role(s) as well as to describe the contributions of material actors in the curriculum‐making process, such as policies set by the Royal College, institutional budget constraints and structural features of training programmes.

2.5. Analysis

The first phase of the study, guided by constructivist grounded theory (CGT) principles of inductive and iterative approaches to data collection and analysis, focused on identifying the agency of human and material actors at the national, institutional and programme levels of curriculum‐making. The first author conducted the interviews and wrote analytic memos on technical and structural agency, noting that actors such as budgets, technologies and differences in training contexts recurred throughout the data. Refining interview probes in this phase allowed us to inquire further into these aspects of implementation and sharpened our analysis of the milieu of CBD across different contexts and levels of implementation. Data collection ended when criteria for conceptual depth 31 were met so that the agency of these actors could be elaborated with multiple examples.

Once the dataset was complete, the second phase of analysis began. The first and second authors read all interview transcripts and coded the data independently using Schwab's framework to identify activities reflecting the commonplaces of subject matter expert, teacher, student and milieu. Meeting with the third author as a team of curriculum scholars, we discussed our initial insights, returned to the data and met again over multiple rounds of analysis. Our goal was to identify how the needs and perspectives of each commonplace were represented (or not) in the process of curriculum‐making, and the effects this had on the curriculum as experienced by faculty members and residents. Analysing examples of the commonplaces of decision‐making through the lens of sociomaterial agency in curriculum‐making, we began to identify patterns of misalignment in the design and enactment of CBD that required adaptations to align with intended values. The practice‐based researchers on the team also contributed insights based on the resonance 32 of the findings with their experiences as CBME leads. The analysis for this phase of the study was completed when we could categorise and elaborate 31 examples of principled adaptations, defined broadly as adaptations intended to fulfil the purpose of CBME. 3
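Purely as an illustration of the deductive tallying logic in phase two, and not the authors' actual method (their coding was interpretive work on full transcripts), the sketch below tags hypothetical excerpts with commonplace codes and counts how often each commonplace surfaces at each level of curriculum‐making. All participant labels and data points here are invented.

```python
from collections import Counter

# Hypothetical coded excerpts as (participant, level, commonplace) triples.
# A commonplace-by-level tally like this makes under-representation visible,
# which is Schwab's predictor of misalignment.
coded_excerpts = [
    ("P8", "national", "subject_matter"),
    ("I2", "national", "milieu"),
    ("I4", "institutional", "milieu"),
    ("P4", "programme", "teacher"),
    ("E4", "programme", "learner"),
]

by_level: dict[str, Counter] = {}
for _participant, level, commonplace in coded_excerpts:
    by_level.setdefault(level, Counter())[commonplace] += 1

for level, counts in sorted(by_level.items()):
    print(f"{level}: {dict(counts)}")  # e.g. national: {'subject_matter': 1, 'milieu': 1}
```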

3. RESULTS

The commonplace of subject matter experts was well represented by specialty committee members and assessment experts at the Royal College, as work was done at the national level to define curricula based on EPAs for each specialty and set assessment criteria. Perspectives on how the curriculum would impact teaching and learning were rarely identified at this level. While teaching and learning were concerns at the institutional and programme levels of planning, CBME leads expressed difficulties early on in making changes due to upstream consequences for the national level of implementation. In addition, we found three often underestimated structural actors in the milieu of CBD that had unanticipated effects on how the curriculum translated into practice: policies, technologies and training contexts. As Schwab's framework predicts, when curriculum‐making was dominated by a specific commonplace, unintended consequences resulted from misalignments between the intended and enacted curriculum.

Misalignments emerged through the process of curriculum‐making when the interests of the three commonplaces of teachers, learners and milieu were underestimated. New assessment practices for EPAs were disconnected from approaches to teaching, learning and meaningful entrustment. However, we also identified six purposes for adaptive responses some CBME leads employed to resolve one or more of the disconnects. These purposes included shared responsibility for teaching, gradual release of responsibility for assessment, tailoring the context for learning, tailoring individual learning plans, flexible use of technologies and longitudinal coaching.

3.1. Emergence of misalignments

Teachers' and learners' perspectives were rarely sought during the initial stages of planning EPAs. An example from our data shows what can occur when teachers and students are positioned as curriculum makers: “We would send feedback from our national committee meetings back to our [faculty and resident] membership to say, this is the stage of where we're at, these are the EPAs that are currently a work in progress, so that they could give feedback to us” (P8).

More often, teachers and learners were positioned as curriculum takers at the national planning level. For example, an education coordinator who attempted to raise a concern about curriculum alignment from a teaching and learning perspective observed: “There was an outspoken clinician on the call who right away, very quickly, announced that they didn't think there was a misalignment problem. And it basically shut down the conversation” (E1).

As this experience demonstrates, competing perspectives and priorities contributed to conditions for dominant voices to emerge in the design phase for EPA curricula. One participant described difficulties in arriving at a consensus: “We were going to have to define what it means to be a [specialty] surgeon […] that was actually a very heated discussion” (P6). Another mentioned the challenges of establishing a generic scope of practice for EPAs across different training contexts for a specialty: “A lot of disciplines are operationalized a little bit differently in different places […] So, it's trying to figure out what's within that scope for everyone” (I2).

However, once the curriculum outcomes were defined, participants who had been specialty committee members described learning from implementation that some EPAs did not make sense in context: “Things that seemed like such a good idea, educationally were sound, but then you go to implement it, and it's like, well, if they don't happen to see a patient that needs [a specific procedure], what do they do?” (I2).

Now working in the commonplaces of teaching and learning, participants in the subject matter expert position suddenly encountered constraints on their agency to make changes to EPA assessment plans: “The College was pretty clear that we can't make any changes. I intermittently sit on the specialty committee for these [discussions]” (P4). The reasons for this nationally set policy on not making iterative changes to EPAs included a desire to ensure consistency in specialty training: “How do we know it is happening? How do we know they have the skills?” (I2). However, institutional budgets were a hidden actor in the CBD milieu that also exerted significant influence on how much change the College was willing to accept in adapting EPA requirements. The budget for CBME needed to include “all sorts of things. It includes human resources for administration. It includes manpower for technology” (I1). This did not mean that all institutions had equal budgets:

What we've found at the national, let's say the specialty committee level, is [that] each different program has different resources, so their university has different resources […] There are programs that say don't even change a letter because I just don't have the capacity to do that. (I2)

Reflections by institutional leads demonstrated that constraints on changes to EPAs were largely due to the costs of technology development. These costs had local effects on enactment as well as an upstream impact on other contexts nationally: “The experience with anesthesia really illustrated that all the different changes and versions, how that messed up the platform, so I think that has made on the national level everyone more scared to change some of these things” (I2). Even institution leads who were successful in acquiring resources to begin the process of developing e‐portfolios felt challenged by the work involved in maintaining them:

The technology is also expensive and requires people power to continuously update […] it didn't have that much functionality, so you see every few months there's new bits and pieces being added to try to improve the function, and the interface, the useability of it […] it's a lot of time, a lot of money, and a lot of people. (I4)

Another lead further characterised the costs of changes to e‐portfolios in terms of extra workload on programme leads who were already stretched thin:

Who do I tell after? What do I track? So, uncertainty also with that, and that requires work … If we're getting that granular, you have to go through each milestone, each EPA, the assessment plans, and the contextual variables. That's a lot of work on the program director and the CBME lead. (I3)

Competing institutional priorities for technology deployment also disadvantaged teachers and learners in unanticipated ways. For example, an education coordinator stated that their academic centre was “doing a big revamp right now to make it more flexible for family medicine, and while that work is ongoing, they're not going to do any more work to the CBD dashboard […] the resident upgrades that we're looking for are far off in the future” (E2). In another context, a university's choice of technology did not mesh with the hospital networks: “The [EPA assessment] system, you can't access from some of the hospital computers […] and it wasn't easily accessible on tablets or phones, which is what most people use and carry with them” (P1). Problems such as these had a direct impact on residents' learning. An institutional lead described the barriers to timely EPA assessment completion created by poorly developed and integrated technology for e‐portfolios: “There's lots and lots of workarounds, it's just I think for many of us working in the hospital, three extra clicks and one of those computers that is infinitely slow is enough to basically put you on hold” (I4).

3.2. Assessment disconnects

As these examples highlight, CBD translated into practice with many challenges and negative impacts when the interests of teachers, learners and the milieu were underestimated in the initial rounds of curriculum‐making. We found these unintended consequences could be grouped into three types of disconnects: assessment disconnected from teaching, assessment disconnected from learning and assessment disconnected from the meaningful use of data for entrustment. Table 2 provides illustrative examples of each disconnect.

TABLE 2.

Curriculum disconnects.

Assessment disconnected from teaching:

“The specific procedure that I do, they do not have an EPA for it … do not define it based on a procedure. Define it based on what is needed in that procedure … a stoma can be done for a number of different reasons.” (P4)

“I feel like there's more training experiences that do not fit into the EPAs.” (P3)

“We had a resident who we did feel needed more exposure on a certain rotation. We thought what can we do? If we put them on there and change the rotation schedule it's like a ripple. It changes everything else around them and that's not practical to do except under extreme circumstances.” (P6)

Assessment disconnected from learning:

“The Resident Leads Council just did a big study on EPA observation, and it was interesting that their language kept going back to getting EPAs. It's like, but the whole point is not getting an EPA, the point is learning.” (I2)

“If you are offering to give me feedback on something that I have a lot of assessments on, is it useless? It's useless to promote me but it should not be useless in the learning sense … [but] for residents, it's a lot of frustration with the electronic platform … trying to track all of their contextual variables in all the EPAs they need to do is a lot, and you can tell, it definitely really stresses them out.” (E4)

Assessment disconnected from meaningful entrustment:

“The unfortunate thing is, because of the difficulty in using the [technology] system and there is no dashboard, most of the Competence Committee members do not go to the details part of the report, they just go to the numbers … There's a lot of discussions of, well, this resident has not done enough EPAs or tried enough EPAs, not about, this resident is not able to do this independently.” (P1)

“Where is the evidence that three is the number that you need to successfully achieve to be declared competent? Shouldn't it be five, maybe it's one, maybe it's 56? My point is that there is no data to back that up whatsoever … I think even this far in, how do you know how many [EPAs] you have to do before you can do a competent independent [EPA]? There is no hard and set number for that.” (P6)

Note: EPA = entrustable professional activity.

Assessment was disconnected from teaching when EPA design was not aligned with teaching practices (P4) or the range of training experiences (P3), or when the structural demands of training contexts (such as rotation schedules) did not align with the instructional goal of providing tailored learning experiences (P6).

EPA assessments were disconnected from the goal of assessment for learning, as residents felt driven toward “getting EPAs” without meaning and guidance (I2 and E4), or when programmes failed to close the feedback loop by using EPA data to plan responsive teaching. In both cases, participants described the disconnect as more than simply a problem for faculty development in terms of understanding the purposes of formative assessment. Technology was frequently cited as a barrier rather than a facilitator in supporting resident learning and was the cause of much stress (E4). However, other structural actors also played an unseen role in limiting opportunities for tailored instruction. One programme director contemplated how CBME intentions would look in an ideal world:

In the ideal world, you would know that Resident A is strong in EPAs 1, 2, and 3, Resident B [in] 4, 5, and 6, so you could focus Resident A's teaching on four, five, and six and Resident B's on one, two, three. I think that's the intent of the process, but in practice, I must say that I don't think that's actually what happens. (P6)

The reasons why assessment‐informed teaching rarely functioned in the real world of CBD were structural, as the following reflection from a competence committee chair revealed:

[If you were] part of the CC membership, you would know, okay, for Resident X, they have done so many of these, and they haven't achieved it based on this milestone. Therefore, when they are in my operating room for the next time, I will try and focus on that. Do you see the number of steps that needs to fall in line for you to use that knowledge? (P4)

As these examples highlight, barriers to using assessment to facilitate individualised learning included underdeveloped communication channels to convey insider competence committee knowledge to other faculty and constraints on the learning opportunities possible in training rotations.

The third type of disconnect occurred when EPA data were not used in meaningful ways to support entrustment decisions. Again, poorly developed technological solutions had negative impacts on CBD – in this case, by diverting competence committees from the “details part of the report” toward decisions based on “the numbers” (P1). As an institutional lead observed, “You can't just throw somebody 89 pieces of paper and say, well, put these together and make a picture” (I5). However, overreliance on easy metrics was not simply a problem of faculty interpretation. Recalling that the specialty planning committees set numeric criteria for EPA observations, the critique that emerged from the commonplace of teaching – “Where is the evidence that three is the number?” (P6) – was a critical point that was overlooked in early phases of the curriculum‐making process.

3.3. Principled adaptations

Together, these cases illustrate how misalignments emerged as CBD was translated into practice, disconnecting the core component of programmatic assessment from the core components of competency‐focused instruction and tailored learning experiences. Due to the challenges that manifested at the programme level, some CBME leads developed adaptations as they learned from their own and others' experiences. These adaptations helped to realign the curriculum with core values. Table 3 provides examples of six purposes for adaptation grouped under the categories of responsive teaching, individualised learning and meaningful entrustment.

TABLE 3.

Principled adaptations.

Responsive teaching

Shared Faculty Responsibility for Teaching:

“Knowing my colleagues, if I come with 47 EPAs … they are just going to be like, this is not my problem … I wanted them to try to own it with a smaller number of EPAs to take responsibility for.” (P9)

“We would do our normal teaching with them and then highlight points in time when a certain skill that we are teaching relates to a certain EPA. And then, when we are evaluating someone for the EPA, we'll say, oh, remember when we had that talk about X, this is where that feeds in.” (P8)

Gradual Release of Responsibility for Assessment to Residents:

“We recognised that it's overwhelming to come into a new program … one thing we encouraged for our roll‐out was for staff to take responsibility for triggering the transition to discipline EPAs.” (P5)

Individualised learning

Tailoring the Context:

“We meet quarterly with a competence committee who evaluate the residents … And then we, as a group with the residents, evaluate rotations and whether or not they think they are useful or what they are getting out of them … and we alter the curriculum. And so, for example, we have moved one of those anaesthesia blocks to second year … This way they could get some skills in first year but then in second year they could probably acquire some more advanced skills.” (P2)

Tailoring the Learning Plan:

“The thing is that the numbers are the least important parts … the flexibility is within the competency committee. We can actually pass an EPA irrespective of what numbers they achieve … If something in the narrative captures our eyes, we bring that up, as well, because those are the definitions that we use to see whether or not they are achieved.” (P4)

Meaningful entrustment

Assessment Toolboxes:

“One thing we struggled with is trying to figure out how to review resident files for the CC Meeting … recently, we have been able to create a bit of a spreadsheet [for all types of assessment data] which will follow them from PGY‐1 all the way through. I do not think even if Elentra gets their dashboard up and running that we'll do away with that.” (P5)

“My program administrator, she'll pull some PDFs of their ITERs, summary out of Elentra for their EPAs, their training evaluation scores, their peer‐to‐peer reviews, and she'll put them all into a drop box [for the] person who is supposed to evaluate.” (P9)

Coaching:

“We actually created longitudinal mentor teams because we were finding it was so checkbox focused … And the goal is that every rotation that they do in the Emergency Department, they work at least 3 shifts with those mentors. And that way they can have someone who is seeing them over the whole year and making sure they are actually progressing … We know we are getting the numbers but how are they doing professionally? How are they doing collegially?” (P2)

“Each of us has two residents that we are coaching. We work together on that, making sure that we look more holistically at what's needed and any areas of gaps.” (P5)

3.4. Responsive teaching: shared responsibilities for teaching and assessment

The programme leads described attempts to share the workload of EPA assessment to make it more manageable for faculty and residents, as in the following example: “What we tried to do is say, when you're on this service, these are the things you need to focus on” (P1). Another programme recognised that new trainees were in a phase of uncertainty and anxiety and made faculty responsible for initiating EPA assessments during the Transition to Discipline phase (P5).

However, in most cases, CBME programme leads did not feel that faculty took responsibility for teaching EPAs: “Nobody is probably teaching… They're not pulling out the EPA to say, this is what this EPA is talking about” (P9). By contrast, the example provided by Participant 8 of how “normal teaching” can highlight “points in time when a certain skill relates to an EPA” illustrated that EPA assessment can be skillfully integrated within teachable moments. It also exemplified how EPA assessment can serve the objective of responsive teaching: “Remember when we had a talk about X? This is where that feeds in.”

3.5. Individualised learning: tailoring the context and the learning plan

We found several promising examples of programmes overcoming structural barriers to accommodate the learning environment for individual learning needs. One programme director identified hospital differences in caseload volume versus complexity in their context, which created opportunities for residents to cross‐train at different sites when rare cases came up in a training block (P3).

The example provided by Participant 2 in Table 3 is also noteworthy because the student commonplace was included in deliberations with residents about revising the curriculum map. In this case, splitting one training block into two training phases in a recursive approach allowed for deeper learning of an EPA across the Transition to Discipline and Foundations phases (P2). Although there is nothing new about the idea of a spiral curriculum for supporting longitudinal growth, it is an adaptation to CBD. Another CBME lead with previous experience of a spiral curriculum described how programme directors at their institution felt constrained by the staged approach to learning progression in CBD (E4).

Flexibility in the competence committee to make judgements about how many EPA observations to include for entrustment and what types of EPA complexity “count” toward progression is not a deviation from the intentions of CBME (P4). However, it did constitute a national‐level adaptation for CBD, given the initial approach to standardising the numbers of required EPAs. As the Royal College learned from early adopters, processes of design and enactment changed: specialty committees were recommended to be “lumpers not splitters” when it came to planning the total number of EPAs for a specialty (P1), and competence committees were advised to rely on narrative feedback and professional judgement over the number of assessments completed in making individualised entrustment decisions (P4).

The participants involved in specialty committee planning during later phases provided many examples of national‐level adaptations to EPA design: “Basically [early adopters] were subclassifying every single thing. I think we did a better job of maybe backing off and trying to use EPAs as little biopsies of skills as opposed to a comprehensive assessment of all skills” (P9). Specialty committees that began planning after 2015 sought to simplify the assessment process using one or more of the following tactics: reducing the overall number of EPAs in a specialty, reducing the number of contextual variables and milestones needed to complete an EPA and reducing the number of successful observations required per EPA for entrustment.
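To see how these simplification tactics compound, consider a worked example with hypothetical numbers (the study does not report totals for any specialty): reducing both the count of EPAs and the observations required per EPA shrinks the assessment burden multiplicatively.

```python
# Hypothetical illustration only; none of these numbers come from the study.
epas, obs_per_epa = 30, 5        # an imagined original assessment plan
lumped_epas, lower_obs = 20, 3   # after "lumping" EPAs and lowering thresholds

print(epas * obs_per_epa)        # 150 successful observations required
print(lumped_epas * lower_obs)   # 60 successful observations required (60% fewer)
```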

These adaptations to reduce the complexity of EPA plans may seem to be overly generalised approaches not aligned with the core component of competency‐focused instruction, but subject matter experts rationalised the changes as a means of tailoring the curriculum to the kinds of learning valued in their specialties. For example, some specialty committees designed EPAs to be “approach‐driven as opposed to content‐driven” (P8) or identified EPAs in terms of transferable skills: “Rather than focus on Problems 1, 2, and 3, we focused on, do you have the skill set of X, Y, and Z, which would allow you to solve all three problems” (P1). Such adaptations revealed practice‐driven values for more generalised skill sets, such as professional judgement and flexible problem‐solving.

3.6. Meaningful use of data: assessment toolboxes and coaching

Programme leads and competence committee chairs agreed that EPA assessments did not provide enough information to holistically assess resident progression. They also agreed that e‐portfolios were inadequate for helping them make sense of the big picture. Some described principled workarounds to incorporate a variety of old and new assessments into resident portfolios and to augment complicated technology with familiar and adaptable tools, such as spreadsheets and shared online folders (P2, P9). This made reviewing resident performance more comprehensive and easier, although it remained time‐consuming.
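As a minimal sketch of the ‘assessment toolbox’ idea, the code below consolidates a mixed assessment export into one longitudinal view per resident. Participants described doing this with spreadsheets and shared folders rather than scripts; the file name and column names (resident, date, source, narrative) are assumptions for illustration only.

```python
import csv
from collections import defaultdict

# Consolidate a hypothetical mixed export (EPA observations, ITERs, peer
# reviews) into one longitudinal view per resident, keeping narratives
# visible so review is not reduced to "the numbers" alone.
with open("assessment_export.csv", newline="") as f:
    events_by_resident = defaultdict(list)
    for row in csv.DictReader(f):
        events_by_resident[row["resident"]].append(row)

for resident, events in events_by_resident.items():
    events.sort(key=lambda e: e["date"])
    counts = defaultdict(int)
    for e in events:
        counts[e["source"]] += 1  # e.g. "EPA", "ITER", "peer review"
    print(resident, dict(counts))
    for e in events:
        if e["narrative"].strip():
            print(f"  {e['date']} ({e['source']}): {e['narrative']}")
```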

Finally, the examples of coaching in Table 3 highlight the context‐driven ways in which this adaptation can support both learner and programme development. In the case of a specialty based on shift work, ensuring that residents were scheduled with mentor teams enabled a holistic perspective on skills and professionalism (P2). In another programme, mentors played important roles in helping residents consolidate multisource feedback to plan next steps and in identifying gaps in learning opportunities offered by the programme (P5).

4. DISCUSSION

The purpose of CBME is to improve medical education through developmentally supportive and tailored approaches to learning. 1 , 2 , 3 , 4 This value is reflected in the core components framework that provides guidance on making principled adaptations to CBME. 3 The underlying principle is that when teaching and assessment work together in support of learning, they are constructively aligned to achieve learning outcomes. 3 Since medical educators are responsible for teaching and assessment, it may seem evident that they are the problem if the learner experience is less than supportive. This reasoning explains the prevailing calls in CBME evaluation literature for mindset shifts and culture change. 2 , 4 , 7 , 14 , 17 , 20 , 33 However, as this study of curriculum‐making in Competence by Design demonstrates, the problems can be more complicated and the solutions more complex. We showed how human and material aspects of the design and enactment process at national and institutional levels disconnected assessment of EPAs from teaching, learning and meaningful entrustment at the programme level, requiring adaptations throughout the system aimed at reconnecting the curriculum to its purpose. By focusing on agency in addressing unintended consequences and making change, we also shed light on the constraints and possibilities for CBME implementation leads to do this work. Our study contributes to the literature on making principled adaptations to CBME in three critical ways.

4.1. Identifying purpose and agency

This research provides insight into why and how adaptations were made to CBD to better fit with the contexts of specialty training and the needs of teachers and learners. As Schwab's work reminds us, a curriculum achieves its values when the process of curriculum‐making is iterative and inclusive of the needs and perspectives of teachers, learners, subject matter and milieu. 21 His theory anticipates misalignments between intentions and enactment when the process is top‐down and exclusive. 21 , 22 , 23 , 24 We know from the Royal College's evaluation studies of CBD 7 , 14 that CBME leads in later stages of implementation learned from the experiences of teachers and learners in early adopter programmes and sought ways to make improvements. If unintended consequences emerge in a CBME programme, as they may in any new curriculum, 21 this study offers diagnostic tools and principled responses. To begin, a sociomaterial understanding of agency in the milieu and the capacity of meso‐level actors to facilitate curriculum‐making is vital. 25 , 26 , 27

CBME leads in any system must understand the technical and structural issues that can disconnect assessment from learning and entrustment. 27 In the case of CBD, no amount of bemoaning faculty reluctance to fix their culture 7 will resolve the problems posed by technologies, budgets, policies and other institutional and programmatic structures that also create training contexts and cultures. The fact that the programmes we highlight in this paper managed to adapt at all is admirable in light of these challenges and the well‐documented impacts of CBD on resident and faculty workload and well‐being. 6 , 7 , 9 , 10 , 11 , 12 , 13 , 14 This study reveals constraints on the agency but also the capacity of CBME leads as meso‐level curriculum makers 25 to influence change at all levels of curriculum‐making.

For example, at the programme and institutional level, implementation leads can advocate for adapting technology from existing tools to more easily document and share assessment data that meaningfully informs teaching, learning and entrustment. Technology for CBME does not have to be costly, but the cost of choosing poorly is too high to pay in terms of poor fit for purpose 27 , 34 , 35 and negative impacts on well‐being. 13 We know that cumbersome technology in medical practice plays a role in chronic work overload leading to burnout. 36 , 37 Institutions must allocate appropriate levels of funding for acquiring technologies for CBME designed to meet teacher and learner needs. 27 , 34 , 35

4.2. Re‐designing curricula

Our study offers examples of principled changes to curriculum design in CBME. While the need to adapt aspects of CBME to fit context is a growing concern in the literature, 3 , 5 , 6 now recognised in the shift to CBD 2.0, 15 this framing elides the adaptive work that occurred to re‐design CBD at both national and programme levels. Changes to the nature, number and sequencing of EPAs described in our study reveal values‐based shifts in how competency‐based curricula are conceptualised in CBD. A challenge for the Royal College and other systems implementing CBME is to acknowledge the need for changes to curriculum design because this marks a significant departure from the focus on fidelity of implementation. This study has the potential to advance the field theoretically by documenting possibilities for re‐design of curriculum outcomes from a focus on performing tasks to solving problems. This would better align CBME with calls to recognise the complex nature of learning within its own literature, 1 , 3 , 8 , 38 , 39 , 40 , 41 the broader scholarship on adaptive expertise in medical education, 42 and health care as a complex adaptive system. 43

4.3. Making context‐shaped, values‐based adaptations

This study augments the core components framework 3 by proposing a constructively aligned model for making principled adaptations to CBME through context‐shaped, values‐based changes. Some may wonder whether the adaptations detailed in Table 3 are simply representations of the core components of competency‐focused instruction, tailored learning experiences and programmatic assessment. Individually, some are examples of approaches to making the core components work in context. But some of them are also evidence of values‐based changes to CBD. For example, the notion of gradually releasing responsibility for requesting EPA assessments from teachers to learners is not common practice in CBD, but better aligns with the goal to support learning. And while coaching is now recognised as a core component of CBD, 44 it was introduced in 2019 as an innovation. 3

Collectively, the adaptations in Table 3 reflect principled approaches to CBME that demonstrate constructive alignment. In the table headings, we group the adaptations not by core component but by the values they represent: responsive teaching, individualised learning and meaningful entrustment. We offer these labels as a principled language that better represents the purpose of the core components.

In its genesis, CBME was cast as a curricular reform based on a programmatic approach to assessment that included frequent formative feedback to support learner development. 1 Conceptualising CBME as a programmatic approach to teaching establishes a stronger ground for this intention. A basic disconnect emerges if curriculum reformers do not fully grasp the paradigmatic shift in teaching implied by ‘assessment for learning’. 45 Framing this practice solely as feedback for learners misses the point that the feedback should also change teaching. 46 To engage in assessment for learning, teachers must answer an essential question: Do I understand enough about how my students are learning to be able to help them? 45 This is the principle of constructive alignment in action. Using assessment to adapt instruction 3 , 44 , 45 and to coach learners 46 centres the importance of responsive teaching in CBME.

However, coaching may place responsibility for resident development in the hands of a limited number of educators. Our study demonstrated challenges but also possibilities in using assessment data at the programme level to provide more individualised learning experiences and meaningful entrustment. The ability to think programmatically about teaching by closing the feedback loop between the competence committee and all faculty members may be the next significant innovation to support learner development. As this research shows, that work will require more than a shift in individual teacher mindsets.

4.4. Limitations

The insights from our findings are unique to the case of Competence by Design but, given the Royal College's influence in the development of CBME theory, some findings may resonate with CBME leaders in other contexts. As an interview‐based study, it is subject to participant bias towards strongly positive or negative viewpoints. Further, while our recruitment strategy sought a wide range of perspectives, the study does not include representation from every specialty and every medical school in Canada. Given the cross‐specialty nature of our findings, more granular studies of specialty‐ and programme‐specific adaptations are needed. Finally, to arrive at the most informative understanding of the challenges and opportunities involved in transitioning to competency‐based medical education, these findings should be compared with other approaches to CBME and other accounts of faculty and resident experiences.

5. CONCLUSION

This study provides a rich picture of the purpose and potential of adaptations in competency‐based medical education. We explored how the Competence by Design curriculum was designed, enacted and adapted from the perspective of implementation leads at national, institutional and residency training programme levels. We found disconnects between assessment, teaching and learning emerged when the needs and perspectives of teachers, learners and milieu were underestimated, which required adaptive responses to reconnect the curriculum to its core values. A sociomaterial perspective on agency in curriculum‐making allowed us to account for constraints and possibilities in making change. We identified six purposes for adaptations to ensure responsive teaching, individualised learning and meaningful entrustment. Collectively these principled adaptations offer a constructively aligned model for making context‐shaped, values‐based changes to CBME to achieve its intended purpose.

AUTHOR CONTRIBUTIONS

Mary C. Ott: Conceptualization; methodology; investigation; funding acquisition; writing—original draft; writing—review and editing; project administration; formal analysis; data curation; supervision. Lori Dengler: Formal analysis; writing—review and editing; methodology; resources. Kathryn Hibbert: Writing—review and editing; formal analysis; supervision. Michael Ott: Conceptualization; supervision; writing—review and editing; resources.

DISCLOSURES

  • The authors have no conflicts of interest to disclose.

ACKNOWLEDGMENTS

This research was supported by a Faculty Research in Education grant from the Schulich School of Medicine and Dentistry, Western University, Canada.

Ott MC, Dengler L, Hibbert K, Ott M. Fixing disconnects: Exploring the emergence of principled adaptations in a competency‐based curriculum. Med Educ. 2025;59(4):428‐438. doi: 10.1111/medu.15475

DATA AVAILABILITY STATEMENT

Research data are not shared.

REFERENCES

  • 1. Frank JR, Snell LS, Cate OT, et al. Competency‐based medical education: theory to practice. Med Teach. 2010;32(8):638‐645. doi: 10.3109/0142159X.2010.501190
  • 2. Carraccio C, Englander R, Van Melle E, et al.; ICBME collaborators. Advancing competency‐based medical education: a charter for clinician‐educators. Acad Med. 2016;91(5):645‐649. doi: 10.1097/ACM.0000000000001048
  • 3. Van Melle E, Frank JR, Holmboe ES, Dagnone D, Stockley D, Sherbino J. A core components framework for evaluating implementation of competency‐based medical education programs. Acad Med. 2019;94(7):1002‐1009. doi: 10.1097/ACM.0000000000002743
  • 4. Dagnone JD, Bandiera G, Harris K. Re‐examining the value proposition for competency‐based medical education. Can Med Educ J. 2021;12(3):155‐158. doi: 10.36834/cmej.68245
  • 5. Dagnone JD, Chan MK, Meschino D, et al. Living in a world of change: bridging the gap from competency‐based medical education theory to practice in Canada. Acad Med. 2020;95(11):1643‐1646. doi: 10.1097/ACM.0000000000003216
  • 6. Szulewski A, Braund H, Dagnone DJ, et al. The assessment burden in competency‐based medical education: how programs are adapting. Acad Med. 2023;98(11):1261‐1267. doi: 10.1097/ACM.0000000000005305
  • 7. Hall AK, Oswald A, Frank JR, et al. Evaluating competence by design as a large system change initiative: readiness, fidelity, and outcomes. Perspect Med Educ. 2024;13(1):95‐107. doi: 10.5334/pme.962
  • 8. de Graaf J, Bolk M, Dijkstra A, van der Horst M, Hoff RG, Ten Cate O. The implementation of entrustable professional activities in postgraduate medical education in the Netherlands: rationale, process, and current status. Acad Med. 2021;96(7S):S29‐S35. doi: 10.1097/ACM.0000000000004110
  • 9. Mann S, Truelove AH, Beesley T, Howden S, Egan R. Resident perceptions of competency‐based medical education. Can Med Educ J. 2020;11(5):e31‐e43. doi: 10.36834/cmej.679589
  • 10. Fédération des médecins résidents du Québec. Year 3 of implementation of competence by design: negative impact still outweighs theoretical benefits. Fédération des médecins résidents du Québec; 2020.
  • 11. Branfield Day L, Miles A, Ginsburg S, Melvin L. Resident perceptions of assessment and feedback in competency‐based medical education: a focus group study of one internal medicine residency program. Acad Med. 2020;95(11):1712‐1717. doi: 10.1097/ACM.0000000000003315
  • 12. Mueller V, Morais M, Lee M, Sherbino J. Implementation of entrustable professional activities assessments in a Canadian obstetrics and gynecology residency program: a mixed methods study. Can Med Educ J. 2019;13(5):77‐81. doi: 10.36834/cmej.72567
  • 13. Ott MC, Pack R, Cristancho S, Chin M, Van Koughnett JA, Ott M. “The most crushing thing”: understanding resident assessment burden in a competency‐based curriculum. J Grad Med Educ. 2022;14(5):583‐592. doi: 10.4300/JGME-D-22-00050.1
  • 14. Resident Doctors of Canada and Royal College of Physicians and Surgeons. Competence by design: resident pulse check report executive summary. 2022. https://www.royalcollege.ca/rcsite/cbd/cbd-program-evaluation-e
  • 15. Royal College of Physicians and Surgeons. Statement on enhanced flexibility for CBD program implementation. 2023. https://newsroom.royalcollege.ca/commitment-to-action-statement-on-enhanced-flexibility-for-cbd-program-implementation/. Accessed June 25, 2023.
  • 16. Hall AK, Rich J, Dagnone JD, et al. It's a marathon, not a sprint: rapid evaluation of competency‐based medical education program implementation. Acad Med. 2020;95(5):786‐793. doi: 10.1097/ACM.0000000000003040
  • 17. Hall J, Oswald A, Hauer KE, et al. Twelve tips for learners to succeed in a CBME program. Med Teach. 2021;43(7):745‐750. doi: 10.1080/0142159X.2021.1925233
  • 18. Carraccio C, Martini A, Van Melle E, Schumacher DJ. Identifying core components of EPA implementation: a path to knowing if a complex intervention is being implemented as intended. Acad Med. 2021;96(9):1332‐1336. doi: 10.1097/ACM.0000000000004075
  • 19. Hall AK, Schumacher DJ, Thoma B, et al. Outcomes of competency‐based medical education: a taxonomy for shared language. Med Teach. 2021;43(7):788‐793. doi: 10.1080/0142159X.2021.1925643
  • 20. Richardson D, Kinnear B, Hauer KE, et al. Growth mindset in competency‐based medical education. Med Teach. 2021;43(7):751‐757. doi: 10.1080/0142159X.2021.192803
  • 21. Schwab JJ. The practical 3: translation into curriculum. School Rev. 1973;81(4):501‐522. doi: 10.1086/443100
  • 22. Eisner E. No easy answers: Joseph Schwab's contributions to curriculum. Curriculum Inquiry. 1984;14(2):201‐210. doi: 10.1080/03626784.1984.11075921
  • 23. Schwab JJ. A reply to Charles Wegener. Curriculum Inquiry. 1987;17(2):229‐233. doi: 10.1080/03626784.1987.11075287
  • 24. Ben‐Peretz M, Craig CJ. Intergenerational impact of a curriculum enigma: the scholarly legacy of Joseph J. Schwab. Educ Stud. 2018;44(4):421‐448. doi: 10.1080/03055698.2017.1387099
  • 25. Priestley M, Philippou S, Alvunger D, Soini T. Curriculum making: a conceptual framing. In: Priestley M, Alvunger D, Philippou S, Soini T, eds. Curriculum Making in Europe: Policy and Practice Within and Across Diverse Contexts. Emerald Publishing; 2021:1‐28. doi: 10.1108/978-1-83867-735-020211002
  • 26. Hopwood N, Dahlgren MA, Siwe K. Developing professional responsibility in medicine: a sociomaterial curriculum. In: Reconceptualising Professional Learning. Routledge; 2014:171‐183.
  • 27. Ott MC, Apramian T, Cristancho S, Roth K. Unintended consequences of technology in competency‐based education: a qualitative study of lessons learned in an OtoHNS program. J Otolaryngol Head Neck Surg. 2023;52(1):55. doi: 10.1186/s40463-023-00649-2
  • 28. Varpio L, Martimianakis T, Mylopoulos M. Qualitative research methodologies: embracing methodological borrowing, shifting and importing. In: Cleland J, Durning SJ, eds. Researching Medical Education. John Wiley & Sons; 2015. doi: 10.1002/9781118838983.ch21
  • 29. Charmaz K. Constructing Grounded Theory. 2nd ed. SAGE Publications; 2014.
  • 30. Noy C. Sampling knowledge: the hermeneutics of snowball sampling in qualitative research. Int J Soc Res Methodol. 2008;11(4):327‐344. doi: 10.1080/13645570701401305
  • 31. Nelson J. Using conceptual depth criteria: addressing the challenge of reaching saturation in qualitative research. Qual Res. 2017;17(5):554‐570. doi: 10.1177/1468794116679873
  • 32. Kuper A, Lingard L, Levinson W. Critically appraising qualitative research. BMJ. 2008;337:a1035. doi: 10.1136/bmj.a1035
  • 33. Ferguson PC, Caverzagie KJ, Nousiainen MT, Snell L. Changing the culture of medical training: an important step toward the implementation of competency‐based medical education. Med Teach. 2017;39(6):599‐602. doi: 10.1080/0142159X.2017.1315079
  • 34. Siddiqui ZS, Fisher MB, Slade C, et al. Twelve tips for introducing E‐portfolios in health professions education. Med Teach. 2023;45(2):139‐144. doi: 10.1080/0142159X.2022.2053085
  • 35. Carey R, Wilson G, Bandi V, et al. Developing a dashboard to meet the needs of residents in a competency‐based training program: a design‐based research project. Can Med Educ J. 2020;11(6):e31‐e45. doi: 10.36834/cmej.69682
  • 36. Lemire F. Combating physician burnout. Can Fam Physician. 2018;64(6):480. https://www.cma.ca/physician-wellness-hub/resources/burnout/combating-physician-burnout. Accessed April 19, 2023.
  • 37. Yates SW. Physician stress and burnout. Am J Med. 2020;133(2):160‐164. doi: 10.1016/j.amjmed.2019.08.034
  • 38. CBD Program Evaluation Operations Team. Competence by design (CBD) implementation pulse check. 2020. https://www.royalcollege.ca/rcsite/documents/cbd/cbd-pulse-check-annual-report-2020-e.pdf
  • 39. ten Cate O, Schwartz A, Chen HC. Assessing trainees and making entrustment decisions: on the nature and use of entrustment‐supervision scales. Acad Med. 2020;95(11):1662‐1669. doi: 10.1097/ACM.0000000000003427
  • 40. ten Cate O, Taylor DR. The recommended description of an entrustable professional activity: AMEE guide no. 140. Med Teach. 2020;43(10):1106‐1114. doi: 10.1080/0142159X.2020.1838465
  • 41. Hennus MP, van Dam M, Gauthier S, Taylor DR, ten Cate O. The logic behind EPA‐frameworks: a scoping review of the literature. Med Educ. 2022;56(9):881‐891. doi: 10.1111/medu.14806
  • 42. Pusic MV, Hall E, Billings H, et al. Educating for adaptive expertise: case examples along the medical education continuum. Adv Health Sci Educ Theory Pract. 2022;27(5):1383‐1400. doi: 10.1007/s10459-022-10165-z
  • 43. Plsek PE, Greenhalgh T. Complexity science: the challenge of complexity in health care. BMJ. 2001;323(7313):625‐628. doi: 10.1136/bmj.323.7313.625
  • 44. Richardson D, Landreville JM, Trier J, et al. Coaching in competence by design: a new model of coaching in the moment and coaching over time to support large scale implementation. Perspect Med Educ. 2024;13(1):33‐43. doi: 10.5334/pme.959
  • 45. Black P. Formative assessment – an optimistic but incomplete vision. Assess Educ: Principles, Policy & Practice. 2015;22(1):161‐177. doi: 10.1080/0969594X.2014.999643
  • 46. Black P, Wiliam D. Assessment and classroom learning. Assess Educ: Principles, Policy & Practice. 1998;5(1):7‐74. doi: 10.1080/0969595980050102
