Abstract
Assessing residents and clinical fellows is a high-stakes activity. Effective assessment is important throughout training so that identified areas of strength and weakness can guide educational planning to optimize outcomes. Assessment has historically been underemphasized, although medical education oversight organizations have strengthened requirements in recent years. Growing acceptance of competency-based medical education and its logical extension to competency-based time-variable (CB-TV) graduate medical education (GME) further highlights the importance of implementing effective evidence-based approaches to assessment. The Clinical Competency Committee (CCC) has emerged as a key programmatic structure in graduate medical education. In the context of launching a multi-specialty pilot of CB-TV GME in our health system, we have examined several programs’ CCC processes and reviewed the relevant literature to propose enhancements to CCCs. We recommend that all CCCs fulfill three core goals, regularly applied to every GME trainee: (1) discern and describe the resident’s developmental status to individualize education, (2) determine readiness for unsupervised practice, and (3) foster self-assessment ability. We integrate the literature and observations from GME program CCCs in our institutions to evaluate how current CCC processes support or undermine these goals. Obstacles and key enablers are identified. Finally, we recommend ways to achieve the stated goals, including the following: (1) assess and promote the development of competency in all trainees, not just outliers, through a shared model of assessment and competency-based advancement; (2) strengthen CCC assessment processes to determine trainee readiness for independent practice; and (3) promote trainee reflection and informed self-assessment. The importance of coaching for competency, robust workplace-based assessments, feedback, and co-production of individualized learning plans are emphasized. Individual programs and their CCCs must strengthen assessment tools and frameworks to realize the potential of competency-oriented education.
KEY WORDS: competency-based medical education, time-variable graduate medical education, competency-based advancement, clinical competency committee, Milestones, individualized learning plan, COVID-19
Vignette:
Leila is in her second year of internal medicine (IM) residency. Before immigrating to the United States (U.S.), she had completed IM training in her native country and practiced independently for 2 years.
There was little discussion of Leila at the Clinical Competency Committee’s (CCC’s) regular 6-month meeting: her evaluations consistently reflected “exceeding expectations.” When Leila met with her residency program director, no specific summative information was available from the CCC. The available assessment data was generic and interpreted by the program director as “doing fine.” Leila left the meeting wondering about the CCC’s role, and how it helps optimize her educational trajectory. Leila also questions why she needs to finish 3 years of residency, since she was a practicing doctor prior to immigrating to the U.S., and all evaluators note her advanced skills. Leila’s program is participating in a competency-based time-variable GME pilot, where advancement is based on demonstrated competency rather than time in training. How can the CCC utilize available assessments to determine Leila’s readiness for unsupervised practice?
INTRODUCTION
Assessing physicians-in-training is a high-stakes activity. Effective assessment is important throughout training so that identified areas of strength and weakness can guide educational planning to optimize outcomes. Then, as residents and fellows complete their training, assessment provides the basis to confirm competence for unsupervised practice. Periodic assessment during graduate medical education (GME) should also help physicians-in-training hone their ability to self-assess and regulate their learning1—critical skills and a career-long responsibility, essential for high-quality patient care, that can be cultivated through informed self-assessment.2
Recognizing the importance (and historic under-emphasis) of assessment, medical education oversight organizations such as the Accreditation Council for Graduate Medical Education (ACGME) have strengthened related requirements in recent years.3–5 Growing acceptance of competency-based medical education (CBME), and its logical extension to competency-based, time-variable (CB-TV) GME, highlights the importance of implementing effective, evidence-based approaches to assessment.6–8 The implementation of clinical competency committees (CCCs) in the USA, and their equivalents in Canada, Switzerland, the Netherlands, and globally through ACGME-International accreditation, is an outgrowth of widespread educational reform promoting a reorientation of trainee assessment.4,6,9–16 In addition, COVID-19’s disruption to routine residency and fellowship training amplifies the importance and urgency of having sound and trustworthy assessment processes to determine readiness for advancement.17–23
CCCs are the lynchpin of assessment in GME—the locus for interpreting evaluative information and determining further actions. When the ACGME initiated its requirement to implement CCCs as part of the “Next Accreditation System,” the committees’ key responsibilities were outlined, with the details of implementation left to each program’s judgment.4,5 Varying approaches have since been described in the literature, and the third edition of a CCC guidebook for GME programs was issued by the ACGME in 2020; however, a clear best approach has yet to be identified.24
Studies have sought to evaluate CCC structure, process, composition, and outcomes25,26; correlation of faculty ratings with trainee self-assessment27–29; the role of competency coaches30; and trainee ability to develop meaningful individualized learning plans (ILPs).31 Other studies have sought to elucidate how trainees in internal medicine, pediatrics, emergency medicine, visual diagnostic, surgical, and procedural specialties7,14,27,32–39 are assessed on the specialty-specific Milestones and Entrustable Professional Activities (EPAs)—which is essential for competency-based advancement decisions.22,39 Additional studies have evaluated the impact of CCC competency decisions on subsequent levels of supervision and independence during residency training.7,27,35,36
At Mass General Brigham, the participation of several residency programs in a CB-TV GME pilot40 (i.e., one in which advancement and graduation are based on demonstrated competency rather than solely on time spent in a program) has stimulated closer examination of CCC processes in order to enhance their effectiveness and ensure trustworthy, data-informed decisions about individualized advancement from residency to unsupervised practice.40 Our engagement with CCCs in several residency programs considering participation in the pilot, along with our review of the CCC literature, has led us to reconceptualize the goals of residency program CCCs and make recommendations for achieving them.
BACKGROUND
The ACGME’s “Next Accreditation System” and Milestones project call for residency programs to assess the developmental progression of each trainee in terms of measurable competencies, reflecting widespread consensus favoring a competency-based framework for medical education.4 CCCs are the principal vehicle for synthesizing available data to assess trainee performance and, importantly, developmental progression over time.4,24,41,42
CCC Goals
The ACGME’s “Common Program Requirements” outline the following core responsibilities of the CCC: (1) review all resident evaluations at least semi-annually; (2) determine each resident’s progress on achievement of the specialty-specific Milestones; and (3) meet prior to the residents’ semi-annual evaluations and advise the program director regarding each resident’s progress.5 The ACGME’s “Clinical Competency Committees: A Guidebook for Programs” delineates 41 granular items as “purposes” of the CCC (its Table 1, pp 5–7), organized by stakeholder group (“the program itself, program directors, faculty members, program coordinators, residents and fellows, the institution, and the ACGME”), but notes that “the ultimate purpose is to demonstrate accountability as medical educators to the public: that graduates will provide high quality, safe care to patients while in training, and be well prepared to do so once in practice.”24
Table 1.

| CCC goal | Current landscape | Specific limitations | Proposed improvements | Examples for vignette |
|---|---|---|---|---|
| Discern and describe the developmental status of each resident to optimize education | Lack of a shared mental model of how to conduct trainee developmental assessment50 | Straight-line scoring on the Milestones | Provide faculty development activities aimed at a shared model of assessment and competency-based advancement. CCCs synthesize evaluative feedback for all trainees, whether struggling, average, or exceptional (like Leila), to inform individualized learning plans co-produced by trainees with program leadership | Data are available that take into account Leila’s unique journey, allowing individualization |
| | Lack of a shared mental model of how to conduct trainee developmental assessment50 | Focus on outlier identification | Discuss EVERY trainee at the CCC meeting with a view to providing forward-oriented recommendations based on the competency model | A developmental perspective allows Leila to plan and adjust her training experiences; educational value becomes a criterion for activity scheduling |
| | Failure to address coach-evaluator tension | CCC members often fill both coach and evaluator roles | Diversify CCC membership to include a wide range of stakeholders, including those who do not necessarily have an education role | Clear separation of coach and evaluator increases the opportunity for Leila to confide stressors and to adopt a growth mindset |
| | CCC may lack diversity in terms of race, gender, ethnicity, and LGBTQ+ representation | Prone to implicit bias and to counter-productive group dynamics | Ensure diversity of CCC membership, with explicit consideration of group processes | Leila was pleased to see a foreign medical graduate represented on the CCC |
| Determine each resident’s readiness for unsupervised practice | Lack of explicit competency-based criteria to determine readiness for graduation and unsupervised practice | Advancement is based on the absence of problems or sanctions rather than on demonstration of specific, observable positive behaviors | Utilize explicit criteria for competency-based advancement, including achievement of the ACGME Milestones | Leila understands what competencies she needs to demonstrate in order to graduate, and where this has or has not been accomplished |
| Foster each resident’s ability to self-assess | Resident self-evaluation and reflection often done only informally | Informed self-assessment, self-monitoring, and reflective practice are underemphasized by faculty and undervalued by trainees | Ensure that residents practice the skills of informed self-assessment. Incorporate trainee Milestone self-assessment into CCC meeting discussion. Utilize CCC determinations for co-produced individualized learning plans | As Leila learns to self-assess, she understands in which areas she is less strong than others and what additional growth is needed to graduate |
| | Few data visualizations available, and even fewer informed by a competency model | When examinations are the key data point, that sends a message as to what is valued | Adopt a quality improvement mindset for self-improvement, in which data visualizations play a key role | Leila works with her program director to make evidence-based decisions about which elective rotations or other experiences will enable her to achieve competency |
Programs note that ongoing assessment and CCC consideration of every resident require considerable time and resources.24 (pp 18–22) However, the negative impact of sub-optimal assessment, such as delayed recognition of competency gaps, can cost considerably more. Moreover, if an opaque, under-resourced assessment system fails to maximize individual potential, and perhaps even allows less-than-competent trainees to graduate, the downstream costs to society are far greater. For these reasons, it is essential that GME programs strengthen the developmental assessment of all trainees, both to improve education today and to prepare for time-variable graduation based on demonstrated competency as a future model.
We propose that CCCs have three core goals. First, the CCC must regularly and iteratively discern and describe the developmental status of each resident for the purpose of optimizing their education. This requires aggregating and interpreting a sufficient variety and volume of evaluative material—with an emphasis on multi-source (“360-degree”) evaluations drawn from a range of settings and informed by direct observation.43,44 It also requires that CCC findings be translated into an individualized educational plan, in which summative assessments inform an action plan co-produced with each trainee.24
The CCCs’ second goal relates to GME programs’ fundamental responsibility to protect the public by graduating competent physicians. Thus, CCCs must affirmatively determine each resident’s readiness for unsupervised practice to support graduation decisions. This requires having explicit promotion criteria that can be applied consistently.
We assert that a third key goal of CCCs is to foster each resident’s ability to take responsibility for their ongoing learning, the collection of skills variably known as self-assessment, self-monitoring, and self-regulation of learning.1,2 Understanding one’s own level of skill, knowledge, and judgment is central to providing good care. An important tenet of CBME is the shift of learning control from the faculty to the resident.6,45,46 Physicians must discern when to seek help in delivering care; when to pursue additional education, training, or practice (e.g., simulation); or when to limit their scope of practice—rather than relying on external, usually post hoc, oversight of their independent practice. This ability cannot be assumed to develop spontaneously; in fact, studies have demonstrated that highly competent physicians tend to underrate themselves while the less competent overrate themselves.47 Thus, informed self-assessment is a skill to cultivate and verify during training, linked to the CCC process.2,24 The importance of self-assessment and reflective practice is underscored by the recent implementation of the harmonized ACGME Milestone 2.0 sub-competency, “Practice-based Learning and Improvement-2”—“Reflective Practice and Commitment to Personal Growth.”1,48
How Do CCCs Fare in Fulfilling These Goals?
Formative and Summative Workplace-Based Assessments Inform CCC Decisions
While the ACGME Common Program Requirements and CCC Guidebook provide a framework for CCCs, some evidence indicates that CCCs fall short of meeting these requirements in adequately evaluating the developmental trajectory of trainees.5,24,25,49–53 The inception of the ACGME Outcomes Project in 2001 established the six core competencies and stimulated the competency-based medical education movement in the USA, defining the roadmap for GME training outcomes.3 Since that time, the ACGME has recommended both formative and summative assessment methods to evaluate trainees. Examples of formative assessment methods include competency-based multi-source evaluation (e.g., evaluation of trainees by faculty, peers, patients, other healthcare professionals, and self-assessment), direct observation with feedback, objective structured clinical examinations, and chart review.5,24,43 Summative trainee assessment was then strengthened by the implementation of semi-annual evaluation on the specialty-specific Milestones as part of the “Next Accreditation System” in 2013.4 Pediatrics has used individualized learning plans (ILPs) for more than a decade, and co-production of ILPs with program leadership is a recent requirement for trainees in all specialties.5,54,55 The requirement for both formative and summative assessment has led to innovation and collaboration among academic centers to understand how trainees can be assessed across the continuum of learning and how competency-based assessment supports competency-based medical education.7,56 ACGME assessment requirements have stimulated CCCs to codify a process and timetable for evaluations, to collect a sufficient number of evaluations (though what number suffices remains subjective), and to incorporate multiple perspectives, including from members with first-hand experience working with residents.56–58 With the movement to competency-based medical education and consideration of competency-based advancement, Kinnear and others have described a validity argument for how workplace-based assessment and the CCC process can support competency-based advancement.8,59
At the same time, however, CCCs are in several ways failing to support—and sometimes distinctly undermining—the three stated goals.51,53,60 Table 1 outlines current obstacles and key enablers to achieving the three CCC goals. We will explore these obstacles and highlight three recommended “focus areas” for CCCs as they aim to meet the proposed goals and enhance competency-based assessment decisions.
Key Obstacles and Recommended Areas of Focus to Achieve CCC Goals
Focus Area #1: Assess and promote the development of competency in all trainees, not just outliers, through a shared model of assessment and competency-based advancement50,56
The CCC should review and synthesize all assessments that inform each trainee’s developmental trajectory towards achievement of competency and provide this information to trainees. Trainees can then use determinations and feedback from the CCC to co-produce an individualized learning plan with program leadership during semi-annual meetings, potentially with the participation of a coach.24 (pp 44–45),50
Many CCCs, especially those with large numbers of residents, focus primarily on outliers, those few residents who are struggling. Hauer and colleagues evaluated the structure and function of CCCs in 34 residency programs at 5 public institutions in California.60 Using semi-structured interviews with program directors, they found that the majority of CCCs took an outlier approach, focusing primarily on struggling trainees rather than using a developmental approach to address the individual needs of all trainees.60 Schumacher and colleagues developed a structure for identification of the struggling pediatric trainee but noted the need to also develop a process to identify outliers at the other extreme—the exceptional trainee.36 While this approach would bring more trainees under the CCC’s consideration, it still falls short of a thorough assessment of each individual that provides granular, thematic feedback about areas of relative strength or weakness to inform ongoing training or refine their self-assessment capabilities.
The failure to individualize all trainee assessments has in some cases led to “straight-line scoring,” in which a trainee is assigned the same score across all milestone sub-competencies rather than ratings that reflect demonstrated competency, undermining the milestone evaluation process.52,61 This is compounded when CCCs lack a shared model of CCC process and function: norms of outlier identification and straight-line scoring become established and are then hard to break.50 In order to discern and describe the developmental status of each resident for the purpose of optimizing their education, the CCC must first establish a shared model and a commitment to reviewing each individual resident and providing summative feedback that trainees can use to co-produce an ILP with program leadership.5,24,50,55,56,58,60,62 Faculty development for CCC members is essential to mitigate biases that could influence CCC ratings, including bias regarding gender, race, and ethnicity, and other forms of cognitive bias.53,63,64 CCCs are encouraged to think deliberately about the diversity of their membership and to incorporate the science of effective group processes to ensure fair, unbiased committee discussions and decisions.25,26
Focus Area #2: Strengthen CCC assessment and coaching processes for the determination (and promotion) of trainee readiness for independent practice
The CCC should be structured to explicitly incorporate the useful tension between formative and summative assessment, with workplace-based formative assessment gathered through direct observation, multi-source evaluation and feedback, and competency coaching, alongside summative assessment on the specialty-specific Milestones.39,65,66 Coaching is the provision of support and instruction by someone acting as a learner advocate.67,68 Coaching provides the opportunity to directly observe trainees and provide specific feedback in one or more areas of competency, moving trainees along the Milestones trajectory towards competence and readiness for independence.
The majority of coaching programs in both undergraduate and graduate medical education focus on student and trainee career development and wellness, while few offer coaching aimed at enhancing clinical skills and achieving clinical competence.30,67–70 Further, we postulate that insufficient attention is paid to the potential complementarity of formative coaching and summative assessment.69,71 The R2C2 (build relationships, explore reactions, explore content, and coach for change) model has been validated across specialties and offers specific strategies for both longitudinal and “in-the-moment” coaching focused on patient care, clinical skills, and competency achievement.67,69,71,72 Coaching models such as R2C2 strive to manage the tension between coaching on the one hand and the need for evaluation on the other by emphasizing a personal relationship and positive interactions between coach and resident.24,30,67,68,73,74 When coaches serve the dual roles of evaluator and coach on the CCC, trust is undermined, and with it their subsequent ability to serve as a coach.65,75 Frequently, the same CCC member provides both a coach and an evaluator perspective, not by design but through coincidental intersection with individual trainees in the clinical environment; we advocate that these roles be served by different people who can provide distinct and individualized perspectives.30,65,75 The “Bow Tie Framework” delineates the roles and responsibilities of the resident, competency coach, and evaluator in the CCC process (Fig. 1).
Focus Area #3: Promote informed self-assessment by each trainee to identify learning needs
Residents’ informed self-assessment should be a celebrated component of the CCC process.
Despite the growing appreciation for the importance of self-reflection, CCC structures often have under-developed mechanisms for celebrating and encouraging a dialectic between the resident’s developing skill of self-assessment and the recognized standards set forth by each specialty.28,29,76,77 Self-regulated learning and professional accountability both depend on recognizing when one needs additional knowledge, enhanced skill, or direct assistance in order to deliver excellent care. Thus, a key prerequisite for independent practice is not only a collection of experience and demonstrated skills but also the ability to recognize gaps and opportunities, especially in regard to continually evolving professional standards.27,31,78–81 There is increasing recognition that self-assessment and reflective practice are practiced skills that can be encouraged and incorporated into a program’s culture.1,2,28,29,31,33,78,81,82 For example, calls for an increased emphasis on meta-cognition and adaptive expertise explicitly point to the importance of informed self-assessment as well as self-monitoring.1,83–87 Discernment, the ability to judge one’s limits, is a key component of entrustability, another increasing emphasis in modern health professions assessment frameworks.32,88,89
We suggest that CCCs adopt a standard process of incorporating resident Milestone self-evaluation into CCC deliberations, instead of having trainees compare their self-determined Milestone ratings to those of the CCC after the meeting.77 This incorporates the trainee perspective into the CCC and ensures the trainee is aware of the trajectory of competence progression in their chosen specialty.76,77 CCCs will need a mechanism to address marked discrepancies, which can and should be discussed during the semi-annual program director-trainee meeting and during co-production of the trainee’s ILP.24,62
Further, individualized learning plans offer trainees and program faculty a process to define both short- and long-term goals through a forward-looking lens or roadmap towards competence.24,54,62 A study by Li and colleagues found that pediatric residents’ ability to write actionable goals significantly improved over the course of residency training.31 Additional studies have focused on coaching and the use of learning change plans, an ILP equivalent.90 Under an outlier identification model, CCC data are used to identify and customize the learning plans of only a small number of outliers.60 The implicit question is whether problems with a resident have been identified; if not, the resident carries on in a standardized program, and an individualized approach to assessment and educational planning is taken only when problems surface. Under a forward-looking ILP perspective, data are used not only to identify problems but also to map when and how each competency or milestone can be achieved by each resident, helping to chart the best path forward to optimize each learner’s development, including those “ahead of the curve.”36 Co-production of an ILP by every resident, based on the input of the CCC, then actualizes this objective.24,54,62,90
The ILP process leads to a finer-grained examination of the existing data in light of the resident’s remaining scheduled activities, with an emphasis on longitudinal learning trajectories. For programs utilizing competency-based advancement or preparing to pilot CB-TV GME graduation, determining each resident’s appropriate graduation date involves risk and opportunity for both the resident and the program.7,16,21–23,91–93 This dynamic can be a positive force for ensuring that data collection and interpretation are transparent and fully codified. Each resident’s ILP should include relevant data-driven predictions, creating both short- and long-term actionable goals. We assert that this data-driven ILP process benefits all programs, regardless of whether they are piloting a time-variable graduation date.
Connecting the Goals: Data Management as an Enabling Skill of All Stakeholders
To accomplish its goals, the CCC must utilize effective mechanisms to collect a wide range of data, analyze both its quality and sufficiency, and develop robust reporting mechanisms. The ACGME CCC Guidebook includes recommendations for managing administrative tasks and defines the roles and responsibilities of each CCC member.24 (pp 14–16, 18–22) While all GME programs must utilize robust assessment, time-variable training provides a more urgent stimulus to strengthen assessment, given the necessity of making evidence-based graduation decisions based on demonstrated competency.21,22,56,93
The following are recommendations to strengthen the CCC process:
Hold meetings frequently enough to avoid data overload. More frequent meetings also help ensure that rotation-based assessments are completed without months of delay, that concerns are addressed in a timely fashion, and that developmental needs are addressed on a timescale consistent with the learning.
Parse the workload by assigning CCC members a manageable subset of residents whose data they review and report on—or, alternatively, a subset of competencies for which they review all resident data. These two perspectives are complementary.
Utilize multi-source data that combine formative and summative assessments, incorporating clinical outcomes data when available.
Use data visualizations to highlight individual or programmatic trends.94,95 The degree to which a CCC can carry out its work without the inside knowledge of the residency program director is a measure of its ability to serve as a complementary check on the day-to-day functioning of the program. An ideal information system to support CCC operation includes a data portfolio that can run the gamut from individual observations, through summations of individual resident achievement, to integrative displays at the program level.
Consider the heat map shown in Figure 2, which can provide a perspective on each of the CCC goals we have described. Each column represents a single resident, so the visualization can show all residents in the program. Each row represents a single Milestone sub-competency (or EPA), so that the rows taken together represent the entirety of the competency model for the specialty. Each cell represents how that individual resident is doing on that individual competency, with the temperature of the color suggesting a five-point scale of longitudinal achievement. As such, the representation provides a summary of the current state of the program, with between-resident variability manifest at a glance, especially if the residents are ordered by stage of training. The variability between competency elements is also on display in their differing rates of achievement. Clearly, some competencies are easier to develop than others. Clearly, some residents are further along in their development than others. The visualization is consistent with the breadth of the CCC’s mission, across all residents and across the entire competency model. A further embellishment would be to represent resident self-assessment data on the same grid.
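To make the structure of such a display concrete, the sketch below shows one way a program could render this grid from a long-format export of Milestone ratings. It is illustrative only: the field names, the sample data, and the 1–5 color scale are our assumptions for demonstration, not a published data format or the pilot’s actual tooling.

```python
# Illustrative sketch: render a CCC heat map with sub-competencies as rows
# and residents as columns, from hypothetical Milestones ratings.
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical long-format export: one row per (resident, sub-competency) rating.
ratings = pd.DataFrame({
    "resident": ["PGY1-A", "PGY1-A", "PGY2-B", "PGY2-B", "PGY3-C", "PGY3-C"],
    "subcompetency": ["PC-1", "PBLI-2", "PC-1", "PBLI-2", "PC-1", "PBLI-2"],
    "milestone_level": [2.0, 1.5, 3.0, 2.5, 4.5, 4.0],
})

# Pivot to the grid described in the text: rows = sub-competencies,
# columns = residents (ordered by stage of training).
grid = ratings.pivot(index="subcompetency", columns="resident",
                     values="milestone_level")

fig, ax = plt.subplots()
im = ax.imshow(grid.to_numpy(), cmap="RdYlGn", vmin=1, vmax=5, aspect="auto")
ax.set_xticks(range(len(grid.columns)))
ax.set_xticklabels(grid.columns)
ax.set_yticks(range(len(grid.index)))
ax.set_yticklabels(grid.index)
fig.colorbar(im, ax=ax, label="Milestone level (1-5)")
ax.set_title("Program-level Milestones heat map (hypothetical data)")
plt.show()
```

Ordering the resident columns by stage of training, as described above, makes the developmental gradient visible at a glance; resident self-assessment data could be overlaid on the same grid as a second marker layer.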
Our example is a static visualization. Ideally, CCCs are supported by dynamic dashboards that allow members to consider multiple views of the data, drilling down, when necessary, to the granular data that determine the current estimate of milestone progression.94–96 An important point here is that the CCC can assess the sufficiency of the evaluation data available to it. What data are missing? Why are they missing? Are there program-level quality improvement (QI) implications? Or specific implications for this resident? As the locus of control for assessment tilts towards a self-regulated resident learner, the degree to which the learner meets program expectations for collecting the necessary evidence of achievement may be its own data point. CCC data visualizations should be engineered to allow dynamic access within the CCC meeting, providing both an overall program-level map and the ability to drill down to the individual data point.
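In the same illustrative spirit, the sufficiency audit described above can be sketched as a simple pre-meeting check. The threshold and record fields here are hypothetical, since, as noted earlier, what number of evaluations suffices remains a program-level judgment.

```python
# Illustrative sketch: flag residents whose completed-evaluation count falls
# below a program-chosen threshold before the CCC meets. The threshold and
# record layout are hypothetical assumptions, not an ACGME standard.
from collections import Counter

MIN_EVALS_PER_RESIDENT = 8  # hypothetical; "what number suffices remains subjective"

def find_data_gaps(evaluations):
    """Return residents with fewer completed evaluations than the threshold."""
    completed = Counter(e["resident"] for e in evaluations if e.get("completed"))
    everyone = {e["resident"] for e in evaluations}
    return {r: completed.get(r, 0) for r in everyone
            if completed.get(r, 0) < MIN_EVALS_PER_RESIDENT}

# Example: two assigned evaluations, only one completed.
evals = [
    {"resident": "PGY2-B", "rotation": "wards", "completed": True},
    {"resident": "PGY2-B", "rotation": "ICU", "completed": False},
]
print(find_data_gaps(evals))  # {'PGY2-B': 1}
```

Each flagged gap prompts the QI questions posed above: why are the data missing, and is the issue program-level or specific to this resident?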
Conclusion
In this article, we have proposed three core CCC goals that must be regularly applied to every resident: (1) discern and describe developmental status to optimize education, (2) determine readiness for unsupervised practice, and (3) foster self-assessment ability. We have recommended areas of focus to enhance the CCC process and actualize these goals, including the following: assess and promote the development of competency in all trainees, not just outliers, through a shared model of assessment and competency-based advancement; strengthen CCC assessment processes to determine trainee readiness for independent practice; and promote informed self-assessment of each trainee’s learning needs. We have emphasized the importance of providing formative feedback through coaching and robust workplace-based multi-source assessments to inform the CCC’s determination of each trainee’s developmental trajectory, coupled with co-production of an individualized learning plan. Further, we emphasize the importance of data visualizations that provide a comprehensive overview of each trainee’s competency trajectory, noting areas of both strength and growth.
Institutions and programs must recognize that trainee assessment is a critical and resource-intensive process and must prioritize and fund it accordingly. Participating faculty should be appropriately trained and compensated for their effort.64 In addition, engagement in assessment may (and should) contribute to the academic advancement of faculty, providing another important incentive. Successful strategies to support effective assessment should be disseminated. Competency-based medical education promotes individualized pathways and requires flexible educational systems regardless of whether programs plan for time-variable advancement.6,97
Overall, we are promoting a forward-looking mindset in service of competency-based advancement, one where the question is not “how have you done until now?” but rather “given what we know about you, how can we help optimize your forward trajectory?” The ACGME has provided the structure and framework for CCCs to actualize these goals, yet individual programs must conceptualize and strengthen the tools and personalize the framework to realize the potential of the CCC in fulfilling its role in competency-based medical education and advancement.
Acknowledgements
Contributors
The authors thank Emilio Madrigal, DO, and Long Phi Le, MD, PhD, for their contributions to the development of the Massachusetts General Hospital (MGH) Pathology Milestones Passport Management Platform and John A. Branda, MD, for his contributions to the Promotion in Place pilot in the MGH Pathology residency program.
Funding
American Medical Association, Reimagining Residency grant, “Promotion in Place: Enhancing Trainee Well Being and Patient Care Through Time Variable Graduate Medical Education” to Mass General Brigham, Office of Graduate Medical Education
Declarations
Conflict of Interest
This work is funded by an American Medical Association (AMA) Reimagining Residency grant. Dr. Co is the principal investigator, Drs. Goldhamer and Pusic are co-investigators, Dr. Weinstein is a collaborator and the former principal investigator, and Drs. Black-Schaffer and Martinez-Lage are on the project team. The content reflects the views of the authors and does not purport to reflect the views of the AMA or any member of the Accelerating Change in Medical Education Consortium.
Dr. Martinez-Lage is associate program director for anatomic pathology, Massachusetts General Hospital, and member, Test Development and Advisory Committee for Neuropathology, American Board of Pathology.
Dr. Black-Schaffer is associate chief of pathology for training and education, Massachusetts General Hospital.
Dr. Huang is program director, Harvard-wide dermatology residency program, and section chief of dermatology, Boston Children’s Hospital.
Dr. Co is the director of graduate medical education, Mass General Brigham, and member of the Board of Directors, Accreditation Council for Graduate Medical Education (ACGME).
Dr. Weinstein is executive vice dean for academic affairs, University of Michigan Medical School, and chief academic officer, Michigan Medicine.
Footnotes
Drs. Weinstein and Pusic are co-last authors.
Prior Presentations: “Contemporary Board Certification Issues Related to Residency”/60-min moderated panel presentation. Speakers: Jo Buyske, MD, DABS; Mary Ellen J. Goldhamer, MD, MPH; Martin V. Pusic, MD, PhD; Moderators: Eric Holmboe, MD, senior vice president, Milestone Development and Evaluation, Accreditation Council for Graduate Medical Education (ACGME) and Greg Ogrinc, MD, MS, senior vice president, Certification Standards and Programs. American Board of Medical Specialties Transforming Certification Conference, September 24, 2020.
“The Clinical Competency Committee’s role in time-variable competency-based GME assessment: Moving from a deficit to a growth model”/25-min presentation. Mary Ellen J. Goldhamer, MD, MPH; Debra F. Weinstein, MD; John Patrick T. Co, MD, MPH; Martin V. Pusic, MD, PhD. American Medical Association GME Innovations Summit, October 6, 2020.
“Moving the Clinical Competency Committee (CCC) from a deficit to a growth model through competency-based time-variable (CB-TV) graduate medical education (GME) advancement”/60-min workshop. Mary Ellen J. Goldhamer, MD, MPH; Debra F. Weinstein, MD; John Patrick T. Co, MD, MPH; Martin V. Pusic, MD, PhD. Accreditation Council of Graduate Medical Education (ACGME) Annual Educational Conference, February 26, 2021.
“The MGH Pathology Milestones Passport: A Management Platform for the Accreditation Council for Graduate Medical Education Milestones Program.” Emilio Madrigal, DO; Long Phi Le, MD, PhD; W. Stephen Black-Schaffer, MD. Association of Pathology Chairs, Annual Meeting, poster presentation. July 21, 2020.
“Milestone Assessments at Massachusetts General Hospital,” invited presentation. W. Stephen Black-Schaffer, MD, Association of Pathology Chairs, Annual Meeting: July 21, 2020.
“Certifications Based on Readiness to Care for Patients: Implications for GME Programs and Specialty Boards Certification Programs”/60-min presentation. Benjamin Kinnear, MD, MEd; Mary Ellen J. Goldhamer, MD, MPH; John Patrick T. Co, MD, MPH; Michelle Chen, MD, MHS; and Holly Caretta-Weyer, MD, MHPE. American Board of Medical Specialties, Transforming Certification for Better Care Conference, September 28, 2021.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
1. Eva KW, Regehr G. Self-assessment in the health professions: a reformulation and research agenda. Acad Med. 2005;80(Suppl):S46–S54. doi:10.1097/00001888-200510001-00015
2. Sargeant J, Armson H, Chesluk B, et al. The processes and dimensions of informed self-assessment: a conceptual model. Acad Med. 2010;85(7):1212–1220. doi:10.1097/ACM.0b013e3181d85a4e
3. Batalden P, Leach D, Swing S, Dreyfus H, Dreyfus S. General competencies and accreditation in graduate medical education. Health Aff. 2002;21(5). doi:10.1377/hlthaff.21.5.103
4. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system — rationale and benefits. N Engl J Med. 2012;366(11). doi:10.1056/NEJMsr1200117
5. Accreditation Council for Graduate Medical Education. Common Program Requirements. https://www.acgme.org/globalassets/PFAssets/ProgramRequirements/CPRResidency2021.pdf. Accessed November 21, 2021.
6. Frank JR, Snell LS, ten Cate O, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638–645. doi:10.3109/0142159X.2010.501190
7. Schwartz A, Borman-Shoap E, Carraccio C, et al. Learner levels of supervision across the continuum of pediatrics training. Acad Med. 2021;96(7S). doi:10.1097/ACM.0000000000004095
8. Kinnear B, Kelleher M, May B, et al. Constructing a validity map for a workplace-based assessment system: cross-walking Messick and Kane. Acad Med. 2021;96(7S). doi:10.1097/ACM.0000000000004112
9. Holmboe ES, Sherbino J, Englander R, Snell L, Frank JR. A call to action: the controversy of and rationale for competency-based medical education. Med Teach. 2017;39(6):574–581. doi:10.1080/0142159X.2017.1315067
10. Stockley D, Egan R, van Wylick R, et al. A systems approach for institutional CBME adoption at Queen’s University. Med Teach. Published online 2020:1–6. doi:10.1080/0142159X.2020.1767768
11. Duitsman ME, Fluit CRMG, van Alfen-van der Velden JAEM, et al. Design and evaluation of a clinical competency committee. Perspect Med Educ. 2019;8(1):1–8. doi:10.1007/s40037-018-0490-1
12. van Melle E, Gruppen L, Holmboe ES, Flynn L, Oandasan I, Frank JR. Using contribution analysis to evaluate competency-based medical education programs: it’s all about rigor in thinking. Acad Med. 2017;92(6):752–758. doi:10.1097/ACM.0000000000001479
13. Day SH, Nasca TJ. ACGME International: the first 10 years. J Grad Med Educ. 2019;11(4):5–9. doi:10.4300/JGME-D-19-00432
14. Harris KA, Nousiainen MT, Reznick R. Competency-based resident education—the Canadian perspective. Surgery. 2020;167(4):681–684. doi:10.1016/j.surg.2019.06.033
15. Swing SR, Beeson MS, Carraccio C, et al. Educational milestone development in the first 7 specialties to enter the next accreditation system. J Grad Med Educ. Published online 2013. doi:10.4300/jgme-05-01-33
16. Hall AK, Rich J, Dagnone JD, et al. It’s a marathon, not a sprint: rapid evaluation of competency-based medical education program implementation. Acad Med. 2020;95(5):786–793. doi:10.1097/ACM.0000000000003040
17. Accreditation Council for Graduate Medical Education, American Board of Medical Specialties. Joint principles: physician training during the COVID-2019 pandemic. https://www.abms.org/news-events/abms-and-acgme-joint-principles-physician-training-during-the-covid-2019-pandemic/. Accessed November 9, 2020.
18. Bhashyam AR, Dyer GSM. “Virtual” boot camp: orthopaedic intern education in the time of COVID-19 and beyond. J Am Acad Orthop Surg. 2020;28(17):e735–e743. doi:10.5435/JAAOS-D-20-00559
19. Shi J, Miskin N, Dabiri BE, et al. Quantifying impact of disruption to radiology education during the COVID-19 pandemic and implications for future training. Curr Probl Diagn Radiol. Published online August 2020. doi:10.1067/j.cpradiol.2020.07.008
20. Ko LN, Chen ST, Huang JT, McGee JS, Liu KJ. Rethinking dermatology resident education in the age of COVID-19. Int J Dermatol. Published online October 7, 2020. doi:10.1111/ijd.15235
21. Goldhamer MEJ, Pusic MV, Co JPT, Weinstein DF. Can Covid catalyze an educational transformation? Competency-based advancement in a crisis. N Engl J Med. 2020;383(11):1003–1005. doi:10.1056/nejmp2018570
22. Misra S, Iobst WF, Hauer KE, Holmboe ES. The importance of competency-based programmatic assessment in graduate medical education. J Grad Med Educ. 2021;13(2s):113–119. doi:10.4300/JGME-D-20-00856.1
23. ten Cate O, Schultz K, Frank JR, et al. Questioning medical competence: should the Covid-19 crisis affect the goals of medical education? Med Teach. Published online May 27, 2021. doi:10.1080/0142159X.2021.1928619
24. Andolsek K, Padmore J, Hauer KE, Holmboe E. Clinical Competency Committees: A Guidebook for Programs. Accreditation Council for Graduate Medical Education; 2015:1–53. http://www.acgme.org/acgmeweb/Portals/0/ACGMEClinicalCompetencyCommitteeGuidebook.pdf. Accessed November 21, 2021.
25. Hauer KE, Edgar L, Hogan SO, Kinnear B, Warm E. The science of effective group process: lessons for clinical competency committees. J Grad Med Educ. 2021;13(2s):59–64. doi:10.4300/JGME-D-20-00827.1
26. Kinnear B, Kelleher M, Sall D, Schumacher DJ, Warm EJ. When I say … the wisdom of crowds. Med Educ. 2020;54(6):502–503. doi:10.1111/medu.14158
27. Watson RS, Borgert AJ, O’Heron CT, et al. A multicenter prospective comparison of the Accreditation Council for Graduate Medical Education Milestones: clinical competency committee vs. resident self-assessment. J Surg Educ. 2017;74(6):e8–e14. doi:10.1016/j.jsurg.2017.06.009
28. Stahl CC, Jung SA, Rosser AA, et al. Entrustable professional activities in general surgery: trends in resident self-assessment. J Surg Educ. 2020;77(6). doi:10.1016/j.jsurg.2020.05.005
29. Cooney CM, Aravind P, Lifchez SD, et al. Differences in operative self-assessment between male and female plastic surgery residents: a survey of 8,149 cases. Am J Surg. 2021;221(4). doi:10.1016/j.amjsurg.2020.04.009
30. Ketteler ER, Auyang ED, Beard KE, et al. Competency champions in the clinical competency committee: a successful strategy to implement milestone evaluations and competency coaching. J Surg Educ. 2014;71(1):36–38. doi:10.1016/j.jsurg.2013.09.012
31. Li STT, Paterniti DA, Tancredi DJ, et al. Resident self-assessment and learning goal development: evaluation of resident-reported competence and future goals. Acad Pediatr. 2015;15(4):367–373. doi:10.1016/j.acap.2015.01.001
32. Warm EJ, Held JD, Hellmann M, et al. Entrusting observable practice activities and milestones over the 36 months of an internal medicine residency. Acad Med. 2016;91(10). doi:10.1097/ACM.0000000000001292
33. Chow I, Nguyen VT, Losee JE, et al. Milestones in plastic surgery. Plast Reconstr Surg. 2019;143(2). doi:10.1097/PRS.0000000000005214
34. Santen SA, Rademacher N, Heron SL, Khandelwal S, Hauff S, Hopson L. How competent are emergency medicine interns for level 1 milestones: who is responsible? Acad Emerg Med. 2013;20(7):736–739. doi:10.1111/acem.12162
35. Li STT, Tancredi DJ, Schwartz A, et al. Competent for unsupervised practice: use of pediatric residency training milestones to assess readiness. Acad Med. 2017;92(3):385–393. doi:10.1097/ACM.0000000000001322
36. Schumacher DJ, Michelson C, Poynter S, et al. Thresholds and interpretations: how clinical competency committees identify pediatric residents with performance concerns. Med Teach. 2018;40(1):70–79. doi:10.1080/0142159X.2017.1394576
37. Kinnear B, Bensman R, Held J, O’Toole J, Schauer D, Warm E. Critical deficiency ratings in milestone assessment: a review and case study. Acad Med. 2017;92(6):820–826. doi:10.1097/ACM.0000000000001383
38. Hamstra SJ, Yamazaki K, Barton MA, Santen SA, Beeson MS, Holmboe ES. A national study of longitudinal consistency in ACGME milestone ratings by clinical competency committees: exploring an aspect of validity in the assessment of residents’ competence. Acad Med. 2019;94:1522–1531. doi:10.1097/ACM.0000000000002820
39. Ekpenyong A, Baker E, Harris I, et al. How do clinical competency committees use different sources of data to assess residents’ performance on the internal medicine milestones? A mixed methods pilot study. Med Teach. 2017;39(10). doi:10.1080/0142159X.2017.1353070
40. American Medical Association. AMA announces awardees of $15M Reimagining Residency initiative. https://www.ama-assn.org/press-center/press-releases/ama-announces-awardees-15m-reimagining-residency-initiative. Accessed November 21, 2021.
41. Holmboe ES, Yamazaki K, Edgar L, et al. Reflections on the first 2 years of milestone implementation. J Grad Med Educ. 2015;7(3):506–511. doi:10.4300/JGME-07-03-43
42. Potts JR. Assessment of competence: the Accreditation Council for Graduate Medical Education/Residency Review Committee perspective. Surg Clin N Am. 2016;96(1):15–24. doi:10.1016/j.suc.2015.08.008
43. Kogan JR, Hatala R, Hauer KE, Holmboe E. Guidelines: the do’s, don’ts and don’t knows of direct observation of clinical skills in medical education. Perspect Med Educ. 2017;6(5):286–305. doi:10.1007/s40037-017-0376-7
44. Hauer KE, Holmboe ES, Kogan JR. Twelve tips for implementing tools for direct observation of medical trainees’ clinical skills during patient encounters. Med Teach. 2011;33(1):27–33. doi:10.3109/0142159X.2010.507710
45. Schumacher DJ, Englander R, Carraccio C. Developing the master learner: applying learning theory to the learner, the teacher, and the learning environment. Acad Med. 2013;88(11):1635–1645. doi:10.1097/ACM.0b013e3182a6e8f8
46. Cutrer WB, Miller B, Pusic MV, et al. Fostering the development of master adaptive learners: a conceptual model to guide skill acquisition in medical education. Acad Med. 2017;92(1):70–75. doi:10.1097/ACM.0000000000001323
47. Davis DA, Mazmanian PE, Fordis M, van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence. JAMA. 2006;296(9). doi:10.1001/jama.296.9.1094
48. Edgar L, Roberts S, Holmboe E. Milestones 2.0: a step forward. J Grad Med Educ. 2018;10(3). doi:10.4300/JGME-D-18-00372.1
49. Andolsek KM, Jones MD, Ibrahim H, Edgar L. Introduction to the Milestones 2.0: assessment, implementation, and clinical competency committees supplement. J Grad Med Educ. 2021;13(2s):1–4. doi:10.4300/JGME-D-21-00298.1
50. Edgar L, Jones MD, Harsy B, Passiment M, Hauer KE. Better decision-making: shared mental models and the clinical competency committee. J Grad Med Educ. 2021;13(2s):51–58. doi:10.4300/JGME-D-20-00850.1
51. Hauer KE, ten Cate O, Boscardin CK, et al. Ensuring resident competence: a narrative review of the literature on group decision making to inform the work of clinical competency committees. J Grad Med Educ. 2016;8(2):156–164. doi:10.4300/JGME-D-15-00144.1
52. Beeson MS, Hamstra SJ, Barton MA, et al. Straight line scoring by clinical competency committees using emergency medicine milestones. J Grad Med Educ. 2017;9(6):716–720. doi:10.4300/JGME-D-17-00304.1
53. Dickey CC, Thomas C, Feroze U, Nakshabandi F, Cannon B. Cognitive demands and bias: challenges facing clinical competency committees. J Grad Med Educ. 2017;9(2). doi:10.4300/JGME-D-16-00411.1
54. Burke A. Individualized learning plans: faculty as facilitators. MedEdPORTAL. 2009;5(1). doi:10.15766/mep_2374-8265.1684
55. Li STT, Burke AE. Individualized learning plans: basics and beyond. Acad Pediatr. 2010;10(5). doi:10.1016/j.acap.2010.08.002
56. Schwartz A, Balmer DF, Borman-Shoap E, et al. Shared mental models among clinical competency committees in the context of time-variable, competency-based advancement to residency. Acad Med. 2020;95(11S). doi:10.1097/ACM.0000000000003638
57. Kinnear B, Warm EJ, Caretta-Weyer H, et al. Entrustment unpacked: aligning purposes, stakes, and processes to enhance learner assessment. Acad Med. 2021;96(7):S56–S63. doi:10.1097/ACM.0000000000004108
58. Pack R, Lingard L, Watling C, Cristancho S. Beyond summative decision making: illuminating the broader roles of competence committees. Med Educ. 2020;54(6). doi:10.1111/medu.14072
59. Hamstra SJ, Yamazaki K. A validity framework for effective analysis and interpretation of milestones data. J Grad Med Educ. 2021;13(2s):75–80. doi:10.4300/JGME-D-20-01039.1
60. Hauer KE, Chesluk B, Iobst W, et al. Reviewing residents’ competence: a qualitative study of the role of clinical competency committees in performance assessment. Acad Med. 2015;90(8):1084–1092. doi:10.1097/ACM.0000000000000736
61. Hamstra SJ, Yamazaki K. A validity framework for effective analysis and interpretation of milestones data. J Grad Med Educ. 2021;13(2s):75–80. doi:10.4300/JGME-D-20-01039.1
62. Holmboe ES. Work-based assessment and co-production in postgraduate medical training. GMS J Med Educ. 2017;34(5):Doc58. doi:10.3205/zma001135
63. McClintock AH, Fainstad T, Jauregui J, Yarris LM. Countering bias in assessment. J Grad Med Educ. 2021;13(5). doi:10.4300/JGME-D-21-00722.1
64. Heath JK, Davis JE, Dine CJ, Padmore JS. Faculty development for milestones and clinical competency committees. J Grad Med Educ. 2021;13(2s):127–131. doi:10.4300/JGME-D-20-00851.1
65. Ekpenyong A, Zetkulic M, Edgar L, Holmboe ES. Reimagining feedback for the milestones era. J Grad Med Educ. 2021;13(2s):109–112. doi:10.4300/JGME-D-20-00840.1
66. Hauer KE, Holmboe ES, Kogan JR. Twelve tips for implementing tools for direct observation of medical trainees’ clinical skills during patient encounters. Med Teach. 2011;33(1):27–33. doi:10.3109/0142159X.2010.507710
67. Sargeant J, Lockyer JM, Mann K, et al. The R2C2 model in residency education: how does it foster coaching and promote feedback use? Acad Med. 2018;93(7):1055–1063. doi:10.1097/ACM.0000000000002131
68. Deiorio NM, Hammoud MM. Coaching in Medical Education: Faculty Handbook. American Medical Association. https://www.ama-assn.org/system/files/2019-09/coaching-medical-education-faculty-handbook.pdf. Accessed November 21, 2021.
69. Sargeant J, Lockyer J, Mann K, et al. Facilitated reflective performance feedback: developing an evidence- and theory-based model that builds relationship, explores reactions and content, and coaches for performance change (R2C2). Acad Med. 2015;90(12):1698–1706. doi:10.1097/ACM.0000000000000809
70. Wolff M, Hammoud M, Santen S, Deiorio N, Fix M. Coaching in undergraduate medical education: a national survey. Med Educ Online. 2020;25(1). doi:10.1080/10872981.2019.1699765
71. Armson H, Lockyer JM, Zetkulic M, Könings KD, Sargeant J. Identifying coaching skills to improve feedback use in postgraduate medical education. Med Educ. 2019;53(5). doi:10.1111/medu.13818
72. Lockyer J, Armson H, Könings KD, et al. In-the-moment feedback and coaching: improving R2C2 for a new context. J Grad Med Educ. 2020;12(1):27–35. doi:10.4300/JGME-D-19-00508.1
73. Sargeant J, Armson H, Driessen E, et al. Evidence-informed facilitated feedback: the R2C2 feedback model. MedEdPORTAL. 2016;12(1). doi:10.15766/mep_2374-8265.10387
74. Reynolds AK. Academic coaching for learners in medical education: twelve tips for the learning specialist. Med Teach. 2020;42(6):616–621. doi:10.1080/0142159X.2019.1607271
75. Cavalcanti RB, Detsky AS. The education and training of future physicians: why coaches can’t be judges. JAMA. 2011;306(9). doi:10.1001/jama.2011.1232
76. Lim J, Westerman ME, Stewart NH, Correa R, Eno C. Trainee perspectives on the writing and implementation of milestones 2.0. J Grad Med Educ. 2021;13(2s):8–10. doi:10.4300/JGME-D-20-00859.1
77. Eno C, Correa R, Stewart NH, et al. Milestones Guidebook for Residents and Fellows. Accreditation Council for Graduate Medical Education. https://www.acgme.org/globalassets/PDFs/Milestones/MilestonesGuidebookforResidentsFellows.pdf. Accessed November 21, 2021.
78. Colbert CY, Dannefer EF, French JC. Clinical competency committees and assessment: changing the conversation in graduate medical education. J Grad Med Educ. 2015;7(2):162–165. doi:10.4300/JGME-D-14-00448.1
79. Kinnear B, Warm EJ, Hauer KE. Twelve tips to maximize the value of a clinical competency committee in postgraduate medical education. Med Teach. 2018;40(11):1110–1115. doi:10.1080/0142159X.2018.1474191
80. Meier AH, Gruessner A, Cooney RN. Using the ACGME milestones for resident self-evaluation and faculty engagement. J Surg Educ. 2016;73(6):e150–e157. doi:10.1016/j.jsurg.2016.09.001
81. Schumacher DJ, Englander R, Carraccio C. Developing the master learner: applying learning theory to the learner, the teacher, and the learning environment. Acad Med. 2013;88(11):1635–1645. doi:10.1097/ACM.0b013e3182a6e8f8
82. Li STT, Tancredi DJ, Schwartz A, et al. Identifying gaps in the performance of pediatric trainees who receive marginal/unsatisfactory ratings. Acad Med. 2018;93(1):119–129. doi:10.1097/ACM.0000000000001775
83. Sargeant J, Armson H, Chesluk B, et al. The processes and dimensions of informed self-assessment: a conceptual model. Acad Med. 2010;85(7):1212–1220. doi:10.1097/ACM.0b013e3181d85a4e
84. Mylopoulos M, Kulasegaram K, Woods NN. Developing the experts we need: fostering adaptive expertise through education. J Eval Clin Pract. 2018;24(3). doi:10.1111/jep.12905
85. Cutrer WB, Miller B, Pusic MV, et al. Fostering the development of master adaptive learners. Acad Med. 2017;92(1). doi:10.1097/ACM.0000000000001323
86. Mann K, van der Vleuten C, Eva K, et al. Tensions in informed self-assessment: how the desire for feedback and reticence to collect and use it can conflict. Acad Med. 2011;86(9). doi:10.1097/ACM.0b013e318226abdd
87. Sargeant J, Lockyer J, Mann K, et al. Facilitated reflective performance feedback. Acad Med. 2015;90(12). doi:10.1097/ACM.0000000000000809
88. ten Cate O. Entrustment as assessment: recognizing the ability, the right, and the duty to act. J Grad Med Educ. 2016;8(2):261–262. doi:10.4300/JGME-D-16-00097.1
89. Brasel KJ, Klingensmith ME, Englander R, et al. Entrustable professional activities in general surgery: development and implementation. J Surg Educ. 2019;76(5). doi:10.1016/j.jsurg.2019.04.003
90. Lockyer JM, Armson HA, Könings KD, Zetkulic M, Sargeant J. Impact of personalized feedback: the case of coaching and learning change plans. In: The Impact of Feedback in Higher Education. Springer International Publishing; 2019. doi:10.1007/978-3-030-25112-3_11
91. Nguyen VT, Losee JE. Time- versus competency-based residency training. Plast Reconstr Surg. 2016;138(2). doi:10.1097/PRS.0000000000002407
92. Kinnear B, Srinivas N, Jerardi K. Striking while the iron is hot: using the updated PHM competencies in time-variable training. J Hosp Med. Published online March 2021. doi:10.12788/jhm.3611
93. Andrews JS, Bale JF, Soep JB, et al. Education in Pediatrics Across the Continuum (EPAC): first steps toward realizing the dream of competency-based education. Acad Med. 2018;93(3):414–420. doi:10.1097/ACM.0000000000002020
94. Heath JK, Edgar L, Guralnick S. Assessment of learning, for learning: operationalizing milestones data for program-level improvement. J Grad Med Educ. 2021;13(2s):120–123. doi:10.4300/JGME-D-20-00849.1
95. Thoma B, Bandi V, Carey R, et al. Developing a dashboard to meet competence committee needs: a design-based research project. Can Med Educ J. Published online January 7, 2020. doi:10.36834/cmej.68903
96. Friedman KA, Raimo J, Spielmann K, Chaudhry S. Resident dashboards: helping your clinical competency committee visualize trainees’ key performance indicators. Med Educ Online. 2016;21(1). doi:10.3402/meo.v21.29838
97. Lomis KD, Mejicano GC, Caverzagie KJ, Monrad SU, Pusic M, Hauer KE. The critical role of infrastructure and organizational culture in implementing competency-based education and individualized pathways in undergraduate medical education. Med Teach. 2021;43(sup2):S7–S16. doi:10.1080/0142159X.2021.1924364