Journal of Medical Education and Curricular Development. 2025 May 25;12:23821205251336279. doi: 10.1177/23821205251336279

Cost and Challenges of Clinical Competency Committees and Milestones Assessment

Marcos C Borges 1,2, McKenzie M Kennedy 2, Charles W Kropf 3, Matthew D Caldwell 4, Robert D Huang 3, Grace J Kim 5, David T Hughes 5, Larry D Gruppen 2, Deborah M Rooney 2
PMCID: PMC12106988  PMID: 40432835

Abstract

Background

Several challenges regarding Clinical Competency Committee (CCC) processes have been previously reported. Few studies have addressed the cost of assessment in healthcare professional education. This study aimed to assess the time spent on and the cost of CCC processes, and faculty perception of the Milestones assessment in three residency programs.

Methods

We surveyed CCC faculty members from three residency programs at the University of Michigan (Anesthesiology, Emergency Medicine, and Surgery) to capture the time they devoted to, and their satisfaction with, CCC processes. During the preparatory periods before CCC meetings, administrative staff used daily logs to record the time spent developing meeting documents and resident reports. CCC faculty members supplied estimates of the time spent preparing residents' assessments through a survey administered the day following the meeting. Additionally, the duration of each CCC meeting was recorded, and salaries were confirmed to estimate total cost. Total faculty and staff time was summed and reviewed by each departmental CCC program director.

Results

CCC members found that the unstandardized semi-annual reports compiled by the programs were poorly organized, not easy to review, and did not provide high-quality information for setting Milestones. The majority of CCC members reported that the current CCC process does not allow enough time for review of resident data, does not provide valuable feedback to inform resident progression, and does not provide adequate rigor to defend CCC decisions. Annually, administrative staff spent a mean of 162.9 ± 151.7 h per program preparing the reports. Faculty spent 147.0 ± 78.4 h preparing resident assessments and 97.3 ± 24.1 h in CCC meetings. Based on salaries, the cost of CCC processes for Milestones assessment totaled USD83,437: USD22,776, USD31,764, and USD28,897 for Anesthesiology, Emergency Medicine, and Surgery, respectively. At an average of USD395.44 per resident, the total annual CCC cost for the University of Michigan was extrapolated to be USD404,531.

Conclusions

Though Milestones were implemented more than ten years ago, CCC processes are still unsatisfactory to faculty and pose a significant institutional cost. Alternative approaches are still needed to improve resident competency assessment processes.

Keywords: medical residency, milestones, cost, clinical competence

Introduction

In recent years, graduate medical education has embraced the principles of competency-based education (CBE), ie, focus on outcomes, emphasis on abilities and skills, de-emphasis on time-based training, and promotion of learner-centeredness.1-4 CBE has been adopted globally across various professions and by numerous institutions and countries. Several medical regulatory bodies, including the General Medical Council (GMC) in the United Kingdom, the Royal Australasian College of Physicians in Australia, the Royal College of Physicians and Surgeons of Canada (RCPSC), and the Accreditation Council for Graduate Medical Education (ACGME) in the United States have incorporated CBE principles through different frameworks for the training and evaluation of learners.1-3

The ACGME implemented the Next Accreditation System in 2013 based on CBE.5,6 In response, all ACGME-accredited United States (US) residency programs were required to establish a Clinical Competency Committee (CCC) to assess residents according to the Milestones framework and submit a semi-annual report to the ACGME.7-9 The Milestones framework assesses residents based on six core competencies (patient care and procedural skills, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice) and their associated sub-competencies.

To adequately measure the full breadth of residents' clinical competency, the Milestones assessment process requires data to be captured by several methods (eg, direct observations, global evaluation, cognitive assessment, and audits) and sources (eg, faculty, peers, nurses, other healthcare professionals, patients, family, and the resident). These data must be aggregated, synthesized, and judged by the CCC to reach a consensus about residents' longitudinal performance.7,10

Milestones assessment practices vary among programs and institutions.11,12 Yaghmour et al analyzed the stages of Milestones implementation in 16 programs across 4 specialties and classified them into three stages: early, transition, and final. Additionally, several challenges related to CCC processes have been reported, such as gender bias in assessments, discrepancies in feedback, and problems with data quality and interpretation.13-16 Faculty and program directors have also reported negative perceptions, poor-quality data, and that Milestones are not always helpful, accurate, "real-world" applicable, or aligned with performance.10,17-19 The discrepancies in Milestones implementation are associated with faculty dissatisfaction, which suggests that despite some progress, there is significant variability and a best practice has not been established.

The calculation of the cost of medical education is complex and varies across countries and institutions. While there are estimates of global health professional education (HPE) expenditure, the costs associated with its components—such as undergraduate and postgraduate education, different teaching strategies, assessment, and evaluation—have not been adequately addressed. Many gaps still exist in identifying and, consequently, estimating these costs.20-22

Few studies have objectively evaluated the actual costs associated with mandatory assessments in the Milestones framework. 23 A theoretical study suggested an estimated cost of USD218,143 for a 5-year residency program. 24 Given the number of residency programs that utilize the Milestones framework, understanding its real cost, as well as faculty perception, is of great importance. Therefore, this study assessed the cost of CCC processes and Milestones assessment and identified associated challenges and sources of faculty dissatisfaction in three residency programs at one US institution.

Materials and Methods

Study Setting

This study was conducted in 2023 at the University of Michigan (UofM), which had 31 residency programs with 1023 residents and 83 fellowship programs. We studied the current administrative staff and faculty responsible for the CCCs of three UofM residency programs: Anesthesiology (109 residents), Emergency Medicine (62 residents), and Surgery (40 residents). Programs were selected based on their size and to represent different specialties. This project was exempted by the University of Michigan Institutional Review Board (HUM00123511). This study report conforms to the SQUIRE-EDU (Standards for QUality Improvement Reporting Excellence in Education) publication guidelines for educational improvement. 25

CCC Process

Anesthesiology and Surgery CCC members met twice a year to review residents' assessment data according to the Milestones framework. The Emergency Medicine CCC met monthly, but only two meetings per year served the purpose of submitting the report to the ACGME. Before each meeting, staff combined assessment data from several sources (eg, MedHub, Qualtrics, and Excel spreadsheets) into an unstandardized semi-annual report, which was designed prior to this research. These reports were then discussed and judged during the CCC meeting, and twice a year Milestones were assigned for each of the competencies and sub-competencies. The reports were unstructured, as were the reviewing materials within each CCC. Once compiled, each department sent the semi-annual report to the ACGME, per accreditation requirements.

CCC Process Evaluation

A novel 16-item web-based survey (Qualtrics XM, Seattle, WA; supplemental material) captured faculty perceptions of CCC processes and Milestones assessment. It comprised 9 multiple-choice questions and 7 open-text items. The survey was initially drafted by a medical education researcher (DMR), reviewed for relevance, scope, and clarity by departmental chairs and CCC program chairs, and approved by all authors. The three CCC program chairs administered the survey to CCC members who attended the most recent meeting, with a reminder email sent the following week. There were no added incentives to participate.

Cost Calculation

All CCC faculty members from Anesthesiology, Emergency Medicine, and Surgery were invited to participate, and all were core program faculty. CCC faculty members supplied estimates of the time spent preparing residents' assessments through the aforementioned survey, administered the day following the meeting, with all responses collected within five days. CCC administrative staff used a separate web-based 3-item survey to log, daily, the time spent preparing CCC meeting documents and semi-annual reports during the preparatory periods before CCC meetings. Additionally, the duration of each CCC meeting was recorded. Total faculty and staff time committed to CCC and Milestones assessment was summed and reviewed by each departmental CCC program director, and salaries were confirmed to estimate total cost. For cost calculation, we considered only the two annual meetings required by the ACGME. The annual cost of each program (Anesthesiology, Emergency Medicine, and Surgery) was calculated first; an average cost per resident was then obtained by dividing the combined cost of the three programs by their combined number of residents. Finally, the annual cost for the University of Michigan was estimated by extrapolating the cost per resident to all 1023 current residents.
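As an illustration only, the sketch below (in Python) mirrors the calculation described above: program cost from staff and faculty hours valued at hourly salary, a combined cost per resident, and the institution-wide extrapolation. The hour and salary figures in the sketch are hypothetical placeholders, not the study's confirmed values.

# Illustrative sketch of the cost calculation described above.
# All hour and salary figures are hypothetical placeholders.

def program_cost(staff_hours, staff_rate, faculty_hours, faculty_rate):
    # Annual CCC cost for one program: staff time plus faculty time,
    # each valued at the confirmed mean hourly salary.
    return staff_hours * staff_rate + faculty_hours * faculty_rate

programs = {
    # program: (staff prep hours, staff USD/h, faculty prep + in-meeting hours, faculty USD/h)
    "Anesthesiology": (100.0, 25.0, 200.0, 200.0),
    "Emergency Medicine": (300.0, 35.0, 220.0, 190.0),
    "Surgery": (50.0, 36.0, 310.0, 115.0),
}
residents = {"Anesthesiology": 109, "Emergency Medicine": 62, "Surgery": 40}

total_cost = sum(program_cost(*values) for values in programs.values())
cost_per_resident = total_cost / sum(residents.values())  # average across the three programs
institutional_estimate = cost_per_resident * 1023         # extrapolated to all current UofM residents

print(f"Combined annual cost: USD{total_cost:,.0f}")
print(f"Average cost per resident: USD{cost_per_resident:,.2f}")
print(f"Extrapolated institutional cost: USD{institutional_estimate:,.0f}")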

Data Analysis

Descriptive statistics summarized the characteristics of CCC members and processes, and data were reported as percentages, ranges, or means ± standard deviations. One-way ANOVA or Fisher's exact tests were used as appropriate. Pearson's correlation coefficient was used to estimate the correlation between the number of years of assessment experience and the number of assessment methods used. GraphPad Prism version 10.0 (GraphPad Software, San Diego, CA) was used for statistical analyses. All tests were two-tailed, and statistical significance was set at p ≤ 0.05. The main challenges and potential solutions reported by CCC members were classified by one author (MCB) into three categories: Assessment Quality, Faculty Development/Feedback to Residents, and CCC Process/Cost.
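For readers who prefer to see the analyses spelled out, a minimal sketch in Python (using scipy rather than GraphPad Prism, and entirely made-up data) is shown below; the values are illustrative only.

# Minimal sketch of the statistical tests described above, with made-up data.
import numpy as np
from scipy import stats

# One-way ANOVA comparing a continuous measure (eg, minutes spent per resident) across the three CCCs
anesthesiology = np.array([10, 15, 15, 12, 18, 15])
emergency_medicine = np.array([5, 20, 25, 30, 15, 22, 18, 25, 28, 35])
surgery = np.array([10, 20, 30, 25, 40, 35, 28, 30])
f_stat, p_anova = stats.f_oneway(anesthesiology, emergency_medicine, surgery)

# Fisher's exact test on a 2x2 table of 'yes'/'no' responses (scipy handles 2x2 tables;
# comparing all three programs at once would need an extension such as the Freeman-Halton test)
yes_no_table = [[3, 3],
                [4, 6]]
odds_ratio, p_fisher = stats.fisher_exact(yes_no_table)

# Pearson correlation between years of assessment experience and number of methods used
years_experience = np.array([2, 4, 5, 7, 9, 11, 12, 15])
methods_used = np.array([3, 4, 4, 6, 5, 8, 7, 9])
r, p_corr = stats.pearsonr(years_experience, methods_used)

print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.3f}")
print(f"Fisher's exact test: p = {p_fisher:.3f}")
print(f"Pearson correlation: r = {r:.2f}, p = {p_corr:.3f}  (two-tailed; significant if p <= 0.05)")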

Results

Residency Programs and CCCs Characteristics

Anesthesiology, Emergency Medicine, and Surgery are 4-, 4-, and 5-year (plus a 2-year fellowship) residency programs, respectively. The number of residents in each program year (PGY) was as follows: Anesthesiology (109 residents): 27 PGY1, 27 PGY2, 27 PGY3, and 28 PGY4; Emergency Medicine (62 residents): 16 each for PGY1 and PGY2, and 15 each for PGY3 and PGY4; Surgery (40 residents): 7 for each of the PGY1 through PGY4 training years, 4 for the PGY5 training year, plus 8 preliminary (noncategorical) trainees. The CCCs for Anesthesiology, Emergency Medicine, and Surgery consisted of 13, 25, and 9 members, respectively (Table 1).

Table 1.

Characterization and Cost Calculation of Anesthesiology, Emergency Medicine, and Surgery Clinical Competency Committees (CCCs) at the University of Michigan.

Values are listed in the order: Anesthesiology | Emergency Medicine | Surgery.

Administrative staff
Number of members: 2 | 2 | 1
Accumulated annual number of hours for preparation: 105.00 | 335.00 | 48.67
Mean salary per hour: USD26.44 | USD36.06 | USD36.54
Faculty
Number of members: 13 | 25 | 9
Years of experience in resident assessment: 5.83 ± 3.49 | 7.10 ± 5.92 | 9.89 ± 5.26
Number of CCC meetings attended: 12.67 ± 18.49 | 40.20 ± 44.30 | 13.89 ± 7.22
Have been assistant program director: 67.0% | 60.0% | 67.0%
Number of methods currently available in the CCC for Milestones assessment: 5 | 15 | 19
Time to review each resident (min): 14.17 ± 4.92 | 22.25 ± 18.22 | 27.22 ± 19.38
Accumulated annual number of hours for preparation: 100.00 | 103.60 | 237.50
Accumulated annual number of hours for CCC in-meeting: 100.00 | 120.00 | 72.00
Mean salary per hour: USD200.00 | USD190.00 | USD114.18
Cost
Annual cost for preparation and in-meeting: USD22,776 | USD31,764 | USD28,897

CCC Member Experience

Of the Anesthesiology, Emergency Medicine, and Surgery CCC members, 6, 10, and 8 responded to the survey, for response rates of 46.2% (6/13), 40.0% (10/25), and 88.9% (8/9), respectively. The majority of members self-reported having experience as an assistant program director and more than 5 years of experience in resident assessment.

Mean member participation was 12.67 ± 18.49, 40.20 ± 44.30, and 13.89 ± 7.22 meetings for Anesthesiology, Emergency Medicine, and Surgery CCCs, respectively. The mean reported time for CCC members to review each resident was 14.17 ± 4.92 (5 to 15 min/resident), 22.25 ± 18.22 (5 to 160 min/resident), and 27.22 ± 19.38 (10 to 90 min/resident) minutes for Anesthesiology, Emergency Medicine, and Surgery, respectively (Table 1).

Annual Time Commitment and Cost

Administrative staff for the Anesthesiology, Emergency Medicine, and Surgery programs collectively dedicated 105.00, 335.00, and 48.67 h per year (mean of 162.9 ± 151.7), respectively, to preparing reports. Faculty members in these programs collectively spent 100.00, 103.60, and 237.50 h per year (mean of 147.0 ± 78.4) on report preparation, and 100.00, 120.00, and 72.00 h per year (mean of 97.3 ± 24.1) in CCC meetings. The mean staff and faculty hourly salary rates for Anesthesiology, Emergency Medicine, and Surgery were USD26.44 and USD200.00, USD36.06 and USD190.00, and USD36.54 and USD114.18, respectively. Based on salaries, Milestones assessment cost totaled USD83,437: USD22,776, USD31,764, and USD28,897 for Anesthesiology, Emergency Medicine, and Surgery, respectively. With 211 residents across these three programs, the calculated annual cost was USD395.44 per resident. Considering that the 31 University of Michigan medical residency programs have 1023 residents, 26 the estimated annual cost was USD404,531.
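As a check on the arithmetic, the per-resident and institution-wide figures follow directly from the three-program total, with rounding applied only at the final step:

\[
\frac{83{,}437}{211} \approx 395.44\ \text{USD per resident}, \qquad
\frac{83{,}437}{211} \times 1023 \approx 404{,}531\ \text{USD per year}.
\]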

CCC Members' Perceptions

A low percentage of Anesthesiology, Emergency Medicine, and Surgery CCC members reported that the unstandardized reports compiled by the programs were easy to review (50.0%, 40.0%, and 44.4%, respectively; p > 0.99) or provided high-quality information for setting Milestones (50.0%, 11.1%, and 33.3%; p = 0.29). Fewer than half of Emergency Medicine and Surgery respondents found the reports well-organized (40.0% and 44.4%; p = 0.68), although 66.7% of Anesthesiology members did. No more than 50.0% (p = 0.26) believed the process provided adequate rigor to support defensible CCC decisions (Table 2).

Table 2.

Evaluation of Clinical Competency Committee (CCC) Process Reported by the Members of Anesthesiology, Emergency Medicine, and Surgery CCC at the University of Michigan.

Values are listed in the order: Anesthesiology | Emergency Medicine | Surgery, followed by the p value.

Semi-annual evaluation report#
Easy to review##: 50.0% (3/6) | 40.0% (4/10) | 44.4% (4/9); p > 0.99###
Well-organized##: 66.7% (4/6) | 40.0% (4/10) | 44.4% (4/9); p = 0.68###
Provides high-quality information for setting Milestones##: 50.0% (3/6) | 11.1% (1/9) | 33.3% (3/9); p = 0.29###
Current CCC process
Allows enough time for high-quality review of resident data##: 66.7% (4/6) | 50.0% (5/10) | 44.4% (4/9); p = 0.78###
Provides valuable feedback for appropriate resident progression and advancement##: 50.0% (3/6) | 40.0% (4/10) | 11.1% (1/9); p = 0.29###
Provides adequate information for determination of resident need for remediation/probation/termination##: 50.0% (3/6) | 20.0% (2/10) | 11.1% (1/9); p = 0.30###
Provides adequate rigor to support defensible CCC decisions##: 50.0% (3/6) | 30.0% (3/9) | 22.2% (2/9); p = 0.26###

# The semi-annual evaluation report consists of the resident evaluations based on the Milestones framework, which must be submitted to the ACGME twice a year.

## Data are presented as the percentage of 'yes' responses and the number of 'yes' responses relative to the total number of answers.

### Fisher's exact test.

Data sources used to define resident Milestones varied among members of the same CCC and across different CCCs, with only a few members using data from all available assessment sources (Figures 1 and 2). The most frequently used data were not always those considered to be of high value by CCC members (Figure 2). There was a moderate positive correlation between the number of methods used and years of experience (r = 0.41, p = 0.045).

Figure 1. The Frequency and Value of the Methods Currently Used for Milestones Assessment, as Reported by Members of the Anesthesiology, Emergency Medicine, and Surgery Clinical Competency Committees (CCCs) at the University of Michigan.

Figure 2. The Number of Methods Currently Used for Milestones Assessment Reported by Members of the Anesthesiology, Emergency Medicine, and Surgery Clinical Competency Committees (CCCs) at the University of Michigan.

CCC members were generally not satisfied with the CCC processes: only 44.4% to 66.7% (p = 0.78) reported being provided enough time for high-quality resident review, between 11.1% and 50.0% (p = 0.29) agreed the process provided valuable feedback for appropriate resident progression, and between 11.1% and 50.0% (p = 0.30) agreed the process provided adequate information to identify residents needing remediation, probation, or termination (Table 2). The main challenges were categorized as assessment quality, faculty development/feedback to residents, and CCC process/cost (time) (Table 3).

Table 3.

Main Challenges of Clinical Competency Committee (CCC) Process Reported by CCC Members of Anesthesiology, Emergency Medicine, and Surgery at the University of Michigan.

Reported challenges spanned three categories: Assessment Quality, Faculty Development/Feedback to Residents, and CCC Process/Cost (Time). Representative comments, by program:

Emergency Medicine
- Unclear the exact definitions of each resident metric
- At times can feel stagnant and repetitive
- It's a long day but unavoidable
- Lack of written comments to support a lower milestone
- Dates can be off in the spreadsheet/MedHub* itself for the times when I do need to cross-check
- It is a long, long process and often difficult to carve out that degree of time
- Spreadsheets have had errors in equations

Surgery
- Lack of information with which to rate specific Milestones, leading to simply choosing based on their PGY level; inadequate SIMPL data to get a complete picture of operative performance; I worry that the reliability of the data is not quantified - are the data I'm looking at representative? How much weight should I put in them?
- Framing feedback so as not to offend trainees; finding examples to illustrate a point without allowing the resident to connect comments back to me or other faculty
- Time consuming
- Narrative summaries different than first-hand experiences
- Limited data for interns
- Not enough end-of-rotation data; inadequate data to make decisions
- More time to discuss
- Lack of meaningful data in the end-of-rotation evaluations; lack of comparative data between a resident's performance and expectations; lack of evaluations from nursing, patients, and those outside of surgery
- Greater participation from faculty would be helpful, perhaps rotating 2-year commitments
- Lack of descriptive comments
- Lack of data for a lot of things
- Not enough data

Anesthesiology
- Confidential comments not reflecting the remainder of the daily evaluation
- For residents for whom there is a substantial safety concern, floor runners often adjust their clinical assignments, creating a scenario where a CA3 might be in CA1-level rooms in order to provide safe patient care. However, this distorts the daily scores in terms of "meets expectations" because they aren't actually doing the complexity of case required to compare to their peers
- Limited useful commentary feedback from faculty for residents
- The timing where the governing bodies require the CCC designation prior to the quarter actually being completed, which can lead to a paucity of actual data to make determinations
- Sometimes it feels like there is too much raw data and it is hard to synthesize
- The daily OR scores are not very helpful; comments are far more useful but unfortunately limited
- Feel never a large enough sample; sample bias (based on who actually submits an evaluation and/or comment); not sure the milestone data reflects the performance of the individual across time or the individual relative to peers (ie, is there bias in how I eval a traditionally lower-performing individual if I see some growth compared to peers?)

*MedHub, Minneapolis, MN.

Solutions suggested by members can be summarized as the need for a more aggregated, synthesized, and user-friendly report (Table 4).

Table 4.

Potential Solutions Suggested by the Members of the Anesthesiology, Emergency Medicine, and Surgery Clinical Competency Committees (CCCs) at the University of Michigan that Could be Used to Support the Review of Resident Performance Data and Help Improve CCC Decision-Making.

Suggested solutions spanned the same three categories: Assessment Quality, Faculty Development/Feedback to Residents, and CCC Process/Cost (Time). Representative comments, by program:

Emergency Medicine
- I think that (lack of) critical/actionable feedback from faculty is the #1 issue - if I don't have the information, then CCC doesn't function very well; I think faculty are afraid to be negative and give it, and partly b/c residents don't receive it well. So that goes back to faculty development on how to give/write feedback and resident development on how to receive feedback
- I do like how we've had non-APDs assigned to residents being discussed, which allows for an alternative viewpoint, b/c APDs tend to get emotionally attached to their mentees and it's harder to be objective during CCC
- The workarounds developed to accommodate a bad system are time consuming
- Encourage faculty to give constructive feedback both in written and verbal form
- Easy integration of various MedHub* evaluations
- The guidelines are helpful to think about the resident's performance. However, it would be helpful to have a tool to "ping" importantly good and importantly bad aspects to remember when filing evaluations and that other people could refer to when judging the evaluations. Sometimes, the automatic "add-on" of a shift is not as reflective as the issues themselves
- I think the idea of non-voting members on the CCC is an interesting one. Our CCC is too big and becomes a gossip mill with members who pop in and out and need to be updated on the latest happenings, which takes up a significant portion of our meetings without constructive motion
- I'd like aggregated reports of all comments pulled from each evaluation of each of my residents from the previous 2 months sent to me in a single report 3 days prior to the meeting
- I think getting members to try to participate more - some do a great job, others not as much
- There has to be some dashboard or other way to synthesize the information from MedHub in a more user-friendly way that would streamline review

Surgery
- I would love to have more and more informative data, ideally scored based on the quality of validity and reliability evidence supporting how those data should be used; for quantitative data, that would include comparison against benchmarks (ie, expected scores) after adjustment for confounding factors (especially rater bias). For narrative data, NLP might be a useful approach to synthesis, although that is still very new and would require a lot more investment
- Require sections/divisions to write a group set of descriptive comments with four sentences: what the resident does well, what the resident needs to work on, the resident's fund of knowledge and technical skills, and the resident's ability to work in teams
- The new EPAs have promise in that they focus evaluation on what we actually do as surgeons (ie, take care of a patient with cholecystitis) and that the faculty on the CCC will be better able to benchmark the performance of residents. However, I do worry that the evaluation burden for EPAs is going to fall onto a small number of faculty and that they will become burned out due to the sheer volume of evaluation requests
- More workshops about assessing
- I would also like more synthesis, if possible
- Better incorporation of SIMPL

Anesthesiology
- Concern that primarily clinical faculty versus primarily administrative faculty on the CCC have very different perspectives on resident performance/progression in the program, in particular clinical faculty expressing safety concerns intraoperatively or significant lapses in professionalism by residents

*MedHub, Minneapolis, MN.

Discussion

As this work highlights, the calculation of the cost of medical education and trainee assessment is complex, and methods vary among institutions and programs. Although estimates of global HPE expenditure exist, the costs of its components, such as undergraduate and postgraduate education, different teaching strategies, and assessment, have not been fully examined.20-22 Additionally, we showed that even in a process regulated by an external accreditation board (ACGME), internal processes varied considerably among the different residency programs and CCC members, highlighting significant variability without a clear best practice, which was associated with faculty dissatisfaction.13-16

Although previous studies have highlighted faculty dissatisfaction with the Milestone process,10,17-19 few studies have evaluated Milestone assessment costs. To our knowledge, no study has objectively estimated these costs. A possible explanation for the lack of cost studies is that, since ACGME requirements are mandatory for accredited programs, many may simply comply without calculating the associated costs. Goyal et al estimated that in a program with 42 residents, 23.5 h of faculty time were necessary for the report but did not calculate its cost. 27 Based on a theoretical program with 5 residents per year for 5 years, the annual cost of the ACGME requirements (eg, program director and coordinator time, rounds and clinical discussion, journal club, CCC, etc) was estimated as USD218,143, with USD159,600 being the cost of program director and faculty time and USD3,600 for the CCC. 24 Our calculated CCC cost, including faculty and staff time, was substantially higher (mean USD27,812 per program), perhaps because we actively captured staff- and faculty-reported time spent before and during the CCC meetings, and this assessment was conducted close to the meetings.

We estimated faculty cost based on their salaries and the hours dedicated to each meeting (preparation and in-meeting). However, a significant additional cost of having clinicians conduct the CCC is lost clinical productivity. If that lost productivity were valued using the average hourly Relative Value Units (RVUs), a measure of physician work used for reimbursement in the United States, generated in these three specialties, the cost of the CCC processes would be substantially higher. Additionally, we assessed only the cost related to the CCC process itself, ie, after all assessments have been performed and residents have been graded. A more complete and accurate estimate of the cost of Milestones should therefore encompass all components, such as assessment preparation, delivery, and analysis; training; and simulation mannequins and equipment.
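As a rough sketch of how such an opportunity cost could be layered on top of the salary-based estimate, the Python fragment below multiplies diverted faculty hours by an assumed hourly work-RVU rate and an assumed conversion factor. The study did not perform this calculation, and every number in the sketch is a hypothetical placeholder.

# Hypothetical opportunity-cost sketch (not computed in this study).
# All RVU rates, conversion factors, and hours are illustrative placeholders.

conversion_factor_usd_per_rvu = 33.0  # assumed payment per work RVU; varies by year and payer

work_rvu_per_clinical_hour = {        # assumed mean work RVUs generated per clinical hour
    "Anesthesiology": 5.0,
    "Emergency Medicine": 4.5,
    "Surgery": 6.0,
}
ccc_faculty_hours = {                 # faculty hours diverted to CCC work (preparation + in-meeting)
    "Anesthesiology": 200.0,
    "Emergency Medicine": 220.0,
    "Surgery": 310.0,
}

foregone_revenue = {
    program: ccc_faculty_hours[program] * work_rvu_per_clinical_hour[program] * conversion_factor_usd_per_rvu
    for program in ccc_faculty_hours
}
print(foregone_revenue)  # foregone clinical revenue, to be added on top of the salary-based cost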

The assessment methods used to review residents' progress in each CCC, and the number of different data sources used by each CCC member, varied substantially among the three committees, highlighting the lack of homogeneity among CCCs and their members. Moreover, CCC members did not consistently use the data they considered most valuable. Despite numerous publications about best practices for CCCs, Yaghmour et al demonstrated significant variation in Milestones process implementation across residency programs in four specialties. 11

Limitations

This study has limitations to consider. First, it reflects a single institution, with data from three (9.7% of 31) of its residency programs. Although these three programs encompass different specialties (Anesthesiology, Emergency Medicine, and Surgery) and have varying numbers of residents, the results may not be generalizable to other institutions or to programs of different sizes. Second, survey respondents may have self-selected to participate, so responses may reflect only those who were dissatisfied with the process. Third, cost estimates were calculated from participants' and staff members' self-reported time, which may not be entirely accurate and could be either underestimated or overestimated. A more accurate cost could have been calculated if we had included other explicit (eg, materials, equipment, and rent) and implicit costs (eg, foregone opportunities, wasted time, and duplicate work). 21 Fourth, since the survey was anonymous, we were unable to calculate the full-time equivalent (FTE) effort dedicated by core faculty. Auditing faculty and program director FTE would be particularly important to ensure alignment with ACGME requirements. Fifth, we calculated only the cost of residency programs; including the 83 fellowship programs would yield a substantially higher cost, owing to the additional number of learners as well as their different levels. Finally, we could not estimate the cost per milestone, competency, or resident characteristic. This would require a different approach, such as a dashboard that tracks the exact amount of time CCC members spend on each competency, sub-competency, or resident characteristic.

Despite these limitations, this work highlights the high cost of CCC processes and CCC members' dissatisfaction. Some complaints related to poor assessment quality, significant time requirements, and lack of data. Conversely, our findings suggest that more experienced members used more data sources when setting Milestones. This might suggest that having access to more data could improve the resident assessment process, especially when different technologies can be used, as encouraged by some researchers.28,29 This further highlights the potential benefit of automated solutions, such as learning analytics and dashboards.

Automating data collection could reduce staff workload and minimize errors associated with manual data entry. During the CCC meetings, these solutions could help faculty visualize and interpret residents' data more effectively, allowing them to compare performance across peers and over time. These tools could also be made available to residents, enabling them to track their progress and identify areas that require attention. Streamlining the process could optimize CCC members' time while enhancing the rigor of the evaluation. Lastly, incorporating more advanced psychometric analyses, such as the Rasch model, could provide deeper insights into resident development and assessment outcomes.
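As a purely illustrative sketch of what such automation might look like, the Python/pandas fragment below aggregates a hypothetical evaluation export into per-resident summaries; the column names and records are invented and do not reflect an existing MedHub interface or this study's data.

# Hypothetical sketch: aggregating evaluation data into a per-resident summary before a CCC meeting.
# Column names and records are invented for illustration.
import pandas as pd

evaluations = pd.DataFrame({
    "resident": ["A", "A", "B", "B", "B"],
    "sub_competency": ["PC1", "MK1", "PC1", "MK1", "PC1"],
    "score": [3.0, 3.5, 2.5, 3.0, 2.0],
    "comment": ["Good plan", "Solid knowledge", "Needs supervision", "", "Slow to escalate"],
})

# Mean score and number of observations per resident and sub-competency
score_summary = (
    evaluations.groupby(["resident", "sub_competency"])["score"]
    .agg(["mean", "count"])
    .reset_index()
)

# Narrative comments pooled per resident for quick review during the meeting
pooled_comments = (
    evaluations[evaluations["comment"] != ""]
    .groupby("resident")["comment"]
    .agg("; ".join)
)

print(score_summary)
print(pooled_comments)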

Conclusion

Our work estimated the time administrators and faculty dedicate to Milestones assessment, with an annual cost of USD395.44 per resident and USD404,531 for all residency programs. Though Milestones were implemented more than ten years ago, some CCC processes remain unsatisfactory to faculty and pose a significant institutional cost. Alternative approaches to improve the value of resident competency assessment processes should be evaluated.

Supplemental Material

sj-docx-1-mde-10.1177_23821205251336279 - Supplemental material for Cost and Challenges of Clinical Competency Committees and Milestones Assessment

Supplemental material, sj-docx-1-mde-10.1177_23821205251336279 for Cost and Challenges of Clinical Competency Committees and Milestones Assessment by Marcos C Borges, McKenzie M Kennedy, Charles W Kropf, Matthew D Caldwell, Robert D Huang, Grace J Kim, David T Hughes, Larry D Gruppen and Deborah M Rooney in Journal of Medical Education and Curricular Development

Acknowledgements

Not Applicable.

Statements and Declarations

Ethical considerations: This project was exempted by the University of Michigan Institutional Review Board (HUM00123511).

Consent to Participate: This project was exempted by the University of Michigan Institutional Review Board.

Consent for Publication: Not applicable

Author Contributions/CRediT: MCB: contributed to methodology, formal analysis, and writing-original draft preparation. MMK: contributed to methodology, formal analysis, writing-review and editing. CWK, MDC, RDH, GJK, and DTH: contributed to methodology, formal analysis, writing-review and editing. LDG: contributed to conceptualization, methodology, formal analysis, writing-review and editing. DMR: contributed to conceptualization, project administration, methodology, supervision, funding acquisition, writing-review and editing. All authors read and approved the final manuscript.

Funding: The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was funded by the University of Michigan Graduate Medical Education Innovations grant and grant # 2023/01393-8, São Paulo Research Foundation (FAPESP).

Declaration of conflicting interests: The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Data availability: The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Supplemental material: Supplemental material for this article is available online.

References

1. Van Melle E, Hall AK, Schumacher DJ, et al. Capturing outcomes of competency-based medical education: the call and the challenge. Med Teach. 2021;43(7):794-800. doi:10.1080/0142159X.2021.1925640
2. Frank JR, Snell LS, Cate OT, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638-645. doi:10.3109/0142159X.2010.501190
3. ten Cate O, Scheele F. Viewpoint: competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med. 2007;82(6):542-547. doi:10.1097/ACM.0b013e31805559c7
4. Gruppen LD, Burkhardt JC, Fitzgerald JT, et al. Competency-based education: programme design and challenges to implementation. Med Educ. 2016;50(5):532-539. doi:10.1111/medu.12977
5. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system - rationale and benefits. N Engl J Med. 2012;366(11):1051-1056. doi:10.1056/NEJMsr1200117
6. Hall AK, Schumacher DJ, Thoma B, et al. Outcomes of competency-based medical education: a taxonomy for shared language. Med Teach. 2021;43(7):788-793. doi:10.1080/0142159X.2021.1925643
7. Ekpenyong A, Padmore JS, Hauer KE. The purpose, structure, and process of clinical competency committees: guidance for members and program directors. J Grad Med Educ. 2021;13(2 Suppl):45-50. doi:10.4300/JGME-D-20-00841.1
8. Edgar L, McLean S, Hogan SO, Hamstra S, Holmboe ES. The Milestones Guidebook. Accreditation Council for Graduate Medical Education (ACGME); 2020. https://www.acgme.org/globalassets/milestonesguidebook.pdf
9. Andolsek K, Padmore J, Hauer KE, Ekpenyong A, Edgar L, Holmboe E. Clinical Competency Committees: A Guidebook for Programs. Accreditation Council for Graduate Medical Education (ACGME); 2020. https://www.acgme.org/globalassets/acgmeclinicalcompetencycommitteeguidebook.pdf
10. Holmboe ES, Yamazaki K, Edgar L, et al. Reflections on the first 2 years of milestone implementation. J Grad Med Educ. 2015;7(3):506-511. doi:10.4300/JGME-07-03-43
11. Yaghmour NA, Poulin LJ, Bernabeo EC, et al. Stages of milestones implementation: a template analysis of 16 programs across 4 specialties. J Grad Med Educ. 2021;13(2 Suppl):14-44. doi:10.4300/JGME-D-20-00900.1
12. Doty CI, Roppolo LP, Asher S, et al. How do emergency medicine residency programs structure their clinical competency committees? A survey. Acad Emerg Med. 2015;22(11):1351-1354. doi:10.1111/acem.12804
13. Klein R, Ufere NN, Rao SR, et al. Association of gender with learner assessment in graduate medical education. JAMA Netw Open. 2020;3(7):e2010888. doi:10.1001/jamanetworkopen.2020.10888
14. Mueller AS, Jenkins TM, Osborne M, Dayal A, O'Connor DM, Arora VM. Gender differences in attending physicians' feedback to residents: a qualitative analysis. J Grad Med Educ. 2017;9(5):577-585. doi:10.4300/JGME-D-17-00126.1
15. Pack R, Lingard L, Watling CJ, Chahine S, Cristancho SM. Some assembly required: tracing the interpretative work of clinical competency committees. Med Educ. 2019;53(7):723-734. doi:10.1111/medu.13884
16. Carney PA, Sebok-Syer SS, Pusic MV, Gillespie CC, Westervelt M, Goldhamer MEJ. Using learning analytics in clinical competency committees: increasing the impact of competency-based medical education. Med Educ Online. 2023;28(1):2178913. doi:10.1080/10872981.2023.2178913
17. Sebesta EM, Cooper KL, Badalato GM. Program director perceptions of usefulness of the Accreditation Council for Graduate Medical Education milestones system for urology resident evaluation. Urology. 2019;124:28-32. doi:10.1016/j.urology.2018.10.042
18. Raaum SE, Lappe K, Colbert-Getz JM, Milne CK. Milestone implementation's impact on narrative comments and perception of feedback for internal medicine residents: a mixed methods study. J Gen Intern Med. 2019;34(6):929-935. doi:10.1007/s11606-019-04946-3
19. Maranich AM, Hemmer PA, Uijtdehaage S, Battista A. ACGME milestones in the real world: a qualitative study exploring response process evidence. J Grad Med Educ. 2022;14(2):201-209. doi:10.4300/JGME-D-21-00546.1
20. Frenk J, Chen L, Bhutta ZA, et al. Health professionals for a new century: transforming education to strengthen health systems in an interdependent world. Lancet. 2010;376(9756):1923-1958. doi:10.1016/S0140-6736(10)61854-5
21. Yaros J, de Mortier C, Oude Egbrink MGA, Evers S, Paulus A. Identifying costs in health professions education: a scoping review protocol. BMJ Open. 2023;13(10):e074410. doi:10.1136/bmjopen-2023-074410
22. Walsh K, Jaye P. Cost and value in medical education. Educ Prim Care. 2013;24(6):391-393. doi:10.1080/14739879.2013.11494206
23. Walsh K. Costs and assessment in medical education: a strategic view. Perspect Med Educ. 2016;5(5):265-267. doi:10.1007/s40037-016-0299-8
24. Kempenich JW, Willis RE, Campi HD, Schenarts PJ. The cost of compliance: the financial burden of fulfilling Accreditation Council for Graduate Medical Education and American Board of Surgery requirements. J Surg Educ. 2018;75(6):e47-e53. doi:10.1016/j.jsurg.2018.07.006
25. Ogrinc G, Armstrong GE, Dolansky MA, Singh MK, Davies L. SQUIRE-EDU (Standards for QUality Improvement Reporting Excellence in Education): publication guidelines for educational improvement. Acad Med. 2019;94(10):1461-1470. doi:10.1097/ACM.0000000000002750
26. University of Michigan. Accredited residency & fellowship programs. Accessed February 26, 2024. https://medschool.umich.edu/programs-admissions/residency-fellowship/accredited-residency-fellowship-programs
27. Goyal N, Folt J, Jaskulka B, et al. Assessment methods and resource requirements for milestone reporting by an emergency medicine clinical competency committee. Med Educ Online. 2018;23(1):1538925. doi:10.1080/10872981.2018.1538925
28. Goldhamer MEJ, Martinez-Lage M, Black-Schaffer WS, et al. Reimagining the clinical competency committee to enhance education and prepare for competency-based time-variable advancement. J Gen Intern Med. 2022;37(9):2280-2290. doi:10.1007/s11606-022-07515-3
29. Friedman KA, Raimo J, Spielmann K, Chaudhry S. Resident dashboards: helping your clinical competency committee visualize trainees' key performance indicators. Med Educ Online. 2016;21:29838. doi:10.3402/meo.v21.29838
