Abstract:
This chapter focuses on resident recruitment and recent US National Resident Matching Program changes, and their impact on the evaluation and ranking of applicants within the specialty of anesthesiology. Recruitment challenges are examined, as well as program strategies and potential future directions. Also discussed are diversity, equity, and inclusion (DEI) initiatives within the recruitment process.
The COVID-19 pandemic disrupted medical education and resident recruitment in a variety of ways. Between the onset of the pandemic and subsequent changes to the US National Resident Matching Program (NRMP), no decade has seen as much significant change in the residency application and selection processes. After the World Health Organization declared the COVID-19 pandemic in March 2020, the Coalition for Physician Accountability (CoPA), an organization of stakeholders focused on the oversight, education, and assessment of trainees and physicians, recommended virtual interviews instead of in-person recruitment at the same time that away rotations for medical students were suspended. The 2021 NRMP residency program director survey reported that 96% of interviews were conducted virtually in 2021 to assess candidates for ranking.1 That same year, the Federation of State Medical Boards (FSMB) and the National Board of Medical Examiners (NBME) no longer required US Medical Licensing Examination (USMLE) Step 2 Clinical Skills (CS), and the exam was discontinued. On January 26, 2022, the NBME also ended the release of 3-digit scores for the USMLE Step 1 examination, instead reporting results as pass/fail only. National medical education organizations have continued to endorse virtual resident interviews, a recommendation that aims to enhance safety, equity, and transparency in the residency selection process.2–4 In addition, the application process for prospective residents in various specialties, including anesthesiology, has gone through several iterations. Collectively, these significant changes are necessitating and promoting a more holistic review of applications, moving away from the traditional reliance on conventional objective metrics and helping residency programs select, evaluate, and rank applicants more effectively. This shift comes at a time when the number of applications to programs is steadily increasing.
These changes have also exposed a flawed match system with misaligned stakeholders that is in need of reform. This chapter examines resident recruitment in its current state given these recent changes, as well as recruitment challenges and potential future directions. We also discuss the recruitment process in light of recent US Supreme Court decisions and concerns related to DEI initiatives.
THE RESIDENT MATCHES BY THE NUMBERS
The NRMP was established in 1952 at the request of medical students to provide a fair, systematic mechanism for matching students to residency training programs. Its mission is to match health care professionals to Graduate Medical Education (GME) training programs and to advance training programs through a process that is “fair, efficient, transparent, and reliable.”5 The 2023 Main Residency Match was the largest in the NRMP’s 70-year history, with a record 48,156 registered applicants and 42,952 applicants certifying rank lists.6 In fact, the average US medical graduate currently applies to 68 programs, and the average international medical graduate (IMG) applies to 139.7 In anesthesiology, there were 172 programs offering 1609 anesthesiology PGY-1 positions, and 99.8% of these positions were filled. Based on Association of American Medical Colleges data from the 2024 NRMP match cycle, this year’s applicant pool is expected to be even larger, with 56,053 total registered applicants.7 Of those applying to anesthesiology positions this year, US senior medical students averaged 56.9 application submissions, compared with averages of 60.3 for international medical graduates and 68.0 for DO applicants. However, even with a steady increase over time in applications to residency programs via the Electronic Residency Application Service (ERAS), the match rate for US senior medical students has not changed dramatically, holding at 92% to 95% for decades.8 Despite the nuances of this crude measure of match success,9 it is important to point out that these US medical students make up less than half the applicants in the NRMP; other groups have had less success, with modest match rate gains driven primarily by an increase in the number of available positions.
For instance, the match rates for US citizen IMGs have improved by more than 15% over the last decade, reaching 67.6% in last year’s match.8 Of the 1609 PGY-1 positions offered by 172 anesthesiology programs last year, 1199 (74.5%) were filled by MD senior students. In addition, 52 programs offered 301 PGY-2 positions, with 66.8% of them filled by MD senior students.
In 1992, the AAMC proposed and later adopted the use of a computerized residency application system. At the time, applicants submitted a median of 12 applications to programs.10 Coincidentally, this was the last year in which more residency positions were available than there were active applicants in the NRMP. In 1996, the system known as ERAS was first offered to applicants, making it easier for them to apply more broadly for residency positions. By this time, the ratio of positions to applicants had decreased from >1 to 0.83,11 likely giving applicants additional motivation to apply to more programs. In subsequent years, this ratio has not improved for applicants; that is, the number of total active applicants in the match continues to exceed the number of total first-year GME positions offered. Despite consecutive annual increases in the number of PGY-1 positions offered in the match over the last 2 decades, the numbers of US MD senior and US DO senior applicants were at all-time highs based on the latest biennial NRMP match data report.12 In fact, 30 new MD-granting schools opened between 2002 and 2020.13
For the last 2 decades, applications to residency programs have steadily increased except for this last year, likely related to the integration of program signaling into the system (Fig. 1). The average rank order list (ROL) length for residency programs across all fields increased by 48.6%, reaching an average of 12.9 ranks per position for filled programs in 2023. The average ROL length for matched applicants has also steadily grown over time, with an average of 12.4 ranks per matched applicant that same year.14 US medical school students attempting to match into anesthesiology applied this year to an average of 56.9 programs, with 1316 applications per program.15 On the basis of the latest NRMP data report, the number of all applicants per position in anesthesiology was 1.30, with an 89.5% match rate for US MD senior students.12 Within this cohort, the mean number of contiguous ranks for matched applicants was 15.7 (6.3 if unmatched), with a mean USMLE step 2 CK score of 248. Nearly 20% had another graduate degree, over 12% were AOA members, and over 30% were from one of the 40 US medical schools with the highest NIH funding.
FIGURE 1.
Trends in ERAS application numbers and applications per applicant within anesthesiology. Data available through https://www.aamc.org/data-reports/students-residents/report/eras-statistics. Accessed February 10, 2024.
THE PROGRAM PERSPECTIVE
Despite stability in applicant match rates and program fill rates (indeed, the match rate for US senior applicants has been relatively stable over 4 decades), the residency application process has gone down what some have termed a “behavioral economics rabbit hole”16 in which candidates perceive a relative advantage over similarly qualified candidates by applying to additional programs. However, when modeled with a hypothetical match model using the game theory concept known as the prisoner’s dilemma, it has been illustrated that applicants within the group may actually lessen their match probabilities when each applicant acts in their own self-interest.17 That is, the perceived benefit of over-applying to programs is not actualized by the applicant even though it becomes a dominant strategy in the process. Interestingly, this phenomenon, as seen with ERAS and residency applications, has not been identified in reviews of application rates to medical schools, graduate schools, or law schools.18–20
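The group-level intuition can be sketched with a toy simulation (a minimal illustration, not the cited authors’ model; all parameters, the scoring scheme, and the function name are assumptions for demonstration): when every applicant raises their application count together, the cohort’s overall match rate stays capped by the ratio of positions to applicants, so the collective benefit of over-applying is limited even though each individual feels compelled to do it.

```python
import random

def simulate_match(n_applicants=1000, n_positions=800, apps_per_person=10, seed=0):
    """Toy one-round match: each applicant applies to a random subset of
    single-slot programs; each program accepts its highest-'scoring'
    available applicant. Returns the fraction of applicants who match.
    (A simplification of the deferred-acceptance algorithm the NRMP uses.)"""
    rng = random.Random(seed)
    scores = [rng.random() for _ in range(n_applicants)]  # proxy for applicant rank
    applications = {p: [] for p in range(n_positions)}
    for a in range(n_applicants):
        for p in rng.sample(range(n_positions), min(apps_per_person, n_positions)):
            applications[p].append(a)
    matched = set()
    for p in range(n_positions):
        candidates = [a for a in applications[p] if a not in matched]
        if candidates:
            matched.add(max(candidates, key=lambda a: scores[a]))
    return len(matched) / n_applicants

# With 0.8 positions per applicant, the group match rate can never exceed 0.8,
# no matter how many applications each person files.
low = simulate_match(apps_per_person=5)
high = simulate_match(apps_per_person=40)
```

In this sketch, raising everyone’s application count from 5 to 40 only pushes the group toward the structural ceiling of 0.8; it cannot lift the ceiling itself, which mirrors the prisoner’s dilemma framing in the text.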
With inflation in the number of ERAS-filed applications making holistic review by programs more difficult in the limited time available, residency programs have become increasingly reliant on filters such as USMLE step scores for screening, and there is some evidence that such exam performance predicts some metrics of residency success. USMLE scores have been one of the few universal, standardized metrics in the residency application process, and the Accreditation Council for Graduate Medical Education (ACGME) requires programs’ aggregate pass rates for first-time test takers to be at least 80% in most specialties.21 According to the AAMC, most medical schools have moved to pass/fail grading in preclinical courses,22 clerkship grading distributions vary between institutions,23 and many medical student performance evaluations (MSPEs) lack standardized objective measures.24 In one systematic review, USMLE Step 1 scores were strongly associated with performance on in-training and specialty board certification exams and were also positively associated with subjective faculty evaluations of resident clinical performance.25 The same authors reported in their analysis that standardized examination performance and medical school grades showed the strongest associations with measures of physician performance, and that such objective selection strategies are “potentially the most useful to residency selection committees based on current evaluative methods.” In one single-center study of 69 anesthesiology residents, both USMLE step 1 and step 2 CK examination scores correlated significantly with all intraresidency standardized tests.26 In a larger questionnaire study of anesthesiology residents in over 30 ACGME-accredited residency programs in the United States, only prior standardized (USMLE) exam performance, level of training in anesthesiology, and test-taking anxiety scores correlated with in-training examination (ITE) performance.27 Another national sample study reported a moderately high, positive relationship between scores
obtained on each of the USMLE Step examinations and performance on the ABA Part 1 Certification Examination. Further, the level of performance on the USMLE Step examinations showed a clear relationship to success on ABA Part 1.28 In other words, poor USMLE step score performance may help program directors identify applicants who may later struggle to pass important board certification examinations. Moreover, with high application numbers, program directors otherwise have few objective assessment data with which to compare applicants. Other studies have found only a modest correlation between step 1 scores and in-training examination scores and a weak correlation between step 1 scores and clinical performance,29 though USMLE step 1 and 2 scores both correlated positively with ITE scores across the 3 clinical anesthesia resident training years. Nonetheless, the most recent NRMP national survey reported that 46% of anesthesiology program directors required a target USMLE step 1 score when selecting US MD and IMG applicants to interview; an additional 31% of program directors preferred a specific target score (when applicant numerical scores were still reported).30 In one survey of invited program directors before the proposed NBME score reporting change for USMLE step 1 results, only 15.3% of program directors agreed with changing step 1 to a binary (pass/fail) result, and over 80% of those surveyed reported they would place more emphasis on step 2 CK scores for application selection.31 Finally, only 9% of respondents stated that step 2 CK should also be reported as pass/fail, further emphasizing the desire of program directors to assess objective measures of achievement to distinguish applicants.
Given such emphasis on USMLE performance in the selection process, students have likely viewed these exams as an opportunity to prove relative merit to residency programs as they compete for spots, with a single-minded focus on the exam and its importance. A low score might also present a barrier to applicants seeking to match in the most competitive fields. For instance, in one cohort of students, over one-fourth reported that their step 1 score influenced a change in specialty choice.32 Of interest, since the USMLE was introduced in 1992, the minimum passing step 1 score has been adjusted upward 6 times, though pass rates have been relatively stable. At the test’s inception, the minimum passing score for step 1 was 176 (national mean, 200); in 2022, the minimum passing score was 194 (national mean, 232). A similar increasing score trend exists for USMLE step 2 CK test performance as well (Fig. 2). Two decades ago, the step 2 CK national mean score was 200; today, this value would not represent a passing score. Over this time, the distribution of scores has also shifted rightward and compressed, with less variance across scores, making it more difficult to distinguish exam results among examinees. Furthermore, small differences in numeric score performance may be overinterpreted despite being reported as within the margins of error.33
FIGURE 2.
USMLE step 2 CK Score frequency among first-time test-takers from US and Canadian medical schools between the years 2010 to 2013 and 2020 to 2023. Used with permission from J. Bryan Carmody, MD. Data available through www.usmle.org/scores-transcripts. Accessed February 10, 2024.
The reasons why overall USMLE exam performance improves within testing cohorts each year are not entirely apparent, and this trend began well before the impact of the pandemic. It may be that students are taking additional time and studying harder for these high-stakes examinations, given their perceived impact in an ever more competitive match process. More simply, the score distribution shift and compression over time may represent improvements in the quality of applicants learning and surpassing the minimum standard, given trends in institutional curricula. Nonetheless, as pointed out by USMLE score interpretation guidelines, “it is important to note that all USMLE examinations evolve over time in terms of test content, and that examinations taken at two substantially different points in time may vary somewhat in terms of inclusion or emphasis of certain content areas. USMLE stakeholders should avoid comparing scores that were obtained at dramatically different points in time.”30 In addition, the fundamental intent of the exam is not to stratify applicants for competitive residency programs but rather to support initial licensure decisions made by the nation’s state and territorial medical boards. Numerical score data from the exam additionally inform secondary uses, including pedagogical and psychometric research, content-specific feedback to examinees, and curricular assessment for medical school courses.
A consequence of the Federation of State Medical Boards and National Board of Medical Examiners decision to change USMLE step 1 score reporting may be the benefit of limiting the exam’s role in dictating students’ acceptance into their chosen specialty. Despite the controversy of the exam change and tensions around its use as an objective knowledge assessment in the match process, this transition may be a critical step in diversifying various specialties, including anesthesiology, as under-represented minority groups have struggled to close achievement gaps with other groups in medicine. As with the Medical College Admission Test (MCAT), scores on USMLE step 1 are lower among black, Hispanic, and Asian-American students than among white counterparts.34 Importantly, based on the latest NRMP Program Director survey, of the 1293 mean applications received by anesthesiology programs, 469 were rejected based on standardized assessments, including step scores, but over 40% of applications received a more holistic review compared with the previous year.27 Emphasis on other applicant accomplishments and attributes, rather than initial screening by a step score, may help increase the breadth of interviewed and ranked under-represented candidates. Unintended effects of this exam reporting change may include increased scrutiny of step 2 CK performance, clinical clerkship grades, and student performance assessment during clinical rotations.
However, the benefits of diversifying the medical workforce are easy to recognize; for example, increasing physician diversity has been estimated to decrease the cardiovascular mortality disparity between white and black men by 19% and the gap in life expectancy by 8%,35 and medical educators increasingly recognize the benefits of cognitive diversity in solving complex tasks.36 Data also suggest that patients from marginalized racial or ethnic groups fare better when treated by a clinician of their own racial or ethnic group,37 and under-represented providers are more likely than nonminority providers to serve in minority and medically underserved communities.38
VIRTUAL RESIDENCY RECRUITMENT
Since the onset of the pandemic, virtual interview platforms have been widely adopted by residency programs to aid in recruitment and interviews. Programs’ aims for these platforms include showcasing program strengths, faculty, and residents, as well as more holistically evaluating (and subsequently ranking) candidates given the recent USMLE step 1 reporting change and the fewer objective student assessments available in applications from their respective medical schools or prior training institutions. Based on recommendations from the Coalition for Physician Accountability and other national recommendations, the Association of Anesthesiology Core Program Directors (AACPD) has recommended that anesthesiology programs continue to conduct virtual interviews during resident recruitment.39 Since the initial virtual 2020-2021 recruitment cycle, residency programs have also hosted social events and virtual “meet and greet” sessions to get to know applicants better in an informal virtual setting. The effectiveness of virtual interviews remains under investigation, with the new format offering both advantages and drawbacks.
The replacement of in-person interactions with video sessions understandably limits certain interview aspects and the “in-person vibe” of the process. However, virtual platforms offer several advantages for both applicants and programs. First, they are time-effective and cost-effective.40 The format enables candidates to apply and interview with programs they may not have initially considered because of location, travel expense, or required leave from clinical duties, research, or other coursework. The addition of preference signaling to ERAS in the 2022-2023 application cycle may have benefited less competitive applicants, though one single-center study found no difference in step 1 and step 2 CK scores or type of medical school between in-person and virtual interview groups for anesthesiology residency.41 Another single-institution study reported that virtual (web-based) interviewing, compared with in-person interviewing, did not affect the rate of applicant admission to the residency program or how programs ranked applicants.42 Though data are limited, interviewees surveyed across specialties and programs have reported perceived advantages of virtual interviewing, including cost savings, time efficiency, scheduling ease, and reduced environmental impact (with decreased travel). The most common barriers reported with virtual interviewing include problems adjusting to time zone differences, candidate access to an appropriate interview setting, and reliable internet access and connectivity.43 Other notable disadvantages include limited interactions with current trainees (making it more difficult to evaluate program culture), lack of interactions among applicants, and inability to see the physical spaces in which applicants would train and live.
An important impact of programs shifting to virtual programming is the associated rapid rise in program social media presence. During the 2020-2021 recruitment year, over 60% of all anesthesiology programs had some form of social media presence.44 This method of reaching prospective applicants may have been most significant during the first unprecedented application cycle amid the pandemic. Before that time, based on an AAMC survey of 2019 medical school graduates, 42.8% of prospective anesthesiology residents completed an away rotation,45 and any invited applicant was expected to visit for an interview. Initial pandemic-induced travel restrictions and institutional suspensions of visiting away rotations shifted recruitment strategies and created a need for programs to expand online and virtual opportunities to better share information with applicants. Figure 3 shows the expansion of social media accounts before and at the time of program transition to virtual interviewing with the onset of the pandemic. Of the 76 residency-affiliated accounts across all platforms, 40 were created after March 1, 2020, coinciding with COVID-19 onset in the United States. During this first virtual recruitment cycle, approximately half of all anesthesiology programs offered virtual open houses, with only one-fifth holding multiple events. It is probable that many more programs have offered such open houses in more recent recruitment cycles. While these events do not enable applicants to visit the medical center, experience the area firsthand, or connect in person with residents and faculty, they provide an alternative to traditional recruitment activities. It remains uncertain whether such virtual tools enhance recruitment efforts, attract new target groups of applicants, or transform applicant outreach with each new interview cycle.
FIGURE 3.
Anesthesiology social media accounts by creation year, 2009 to 2020. Adapted from Lee et al.44
Early studies of the virtual interview format for resident recruitment (from medical and surgical specialties) reported limitations of the approach in the 2020-2021 cycle, with half of program director respondents in one study reporting difficulty assessing applicant fit and communication skills in the virtual format.46 In fact, only 15% of program directors in this same study reported that virtual interviews were better than in-person ones. Program directors also reported that a virtual format made it more challenging for them to assess candidates’ emotional intelligence, commitment to the specialty, and ability to function as a resident physician. Another similar survey highlighted that residency program directors reported less confidence in assessing an applicant’s interpersonal skills and professionalism using any virtual platform. Over 50% found it more difficult to rank interviewees after a virtual interview and described relying more heavily on submitted application components, as well as placing a greater number of applicants on their rank order lists compared with previous years. Only 41.3% agreed that they could accurately represent their own program in a virtual format. Importantly, in a post-match NRMP student survey that same year, 42% of students reported applying to more programs, and over a third stated they accepted and interviewed at more programs as a result of the virtual format.1
More specific to anesthesiology, the most recent NRMP program director (PD) survey reported that nearly all anesthesiology programs conducted resident interviews virtually throughout the pandemic, with 72% of respondents anticipating that part or all of the interview process will remain virtual (28% of those surveyed were uncertain).27 Program social media presence and virtual open houses were used as first-time engagement strategies by 22% and 14% of respondents, respectively. In addition, over half of PD respondents reported these 2 approaches as moderately to significantly beneficial. As applicants and programs have adjusted to virtual recruitment, there has also been continued interest in strengthening and expanding virtual interactions within recruitment strategies. In addition to the virtual “open house” events and social media presence referenced earlier, newer program strategies may include technologies that increase interactions beyond the interview and may provide better data to programs and applicants. Digital platforms built on a 2-dimensional plane (a space in which participants can be located with 2 coordinates and move in 2 independent directions) may better facilitate natural, unintrusive mingling for candidates networking virtually. For instance, the platforms Remo (Remo, San Francisco, CA) and Gather (Gather Presence Inc., San Bruno, CA) enable participants, or potential candidates, to move an avatar around a 2-dimensional construct to create different and spontaneous groupings with video conferencing ability. Cisco Systems Inc. is developing a Webex version that permits participants to see holographic versions of each other and interact with virtual 3D objects simultaneously. Other more advanced immersive platforms such as Meta Quest (Meta, Menlo Park, CA) and VRChat (VR Chat Inc., San Francisco, CA) enable a 3-dimensional metaverse for potential engagement of candidates.
In other specialties, some residency programs have already engaged applicants virtually in live clinical rounds.47 With such newer technologies, programs may be able to create a virtual world or space that enables candidates to view and interact with unique locations such as the operating room or simulation center. However, the costs of such modern technology may be a significant barrier to program adoption.
Interestingly, given programs’ reliance on interviews to help assess applicant candidacy, studies examining longitudinal resident performance have reported mixed findings, with only a minority suggesting that traditional interview strategies can be used to predict resident success.48 With some evidence that behavioral questions may enhance interviewers’ ability to assess leadership potential and interpersonal skills,49,50 behavioral interviews with standardized questions have been trialed and used to assess ACGME competencies for anesthesiology residency selection.51 Developed by industrial psychologists, behavioral interviewing rests on the premise that future performance is best predicted by past performance, and it emphasizes questions about specific behaviors that often reflect decision-making and critical thinking skills as well as interpersonal communication styles. However, at least one study reported that behavioral questions demonstrated little ability to discriminate applicant behaviors: candidates’ written applications predisposed interviewers’ evaluations and prevented the interviews from providing discrete assessments of candidates’ interpersonal qualities, even when behavioral questions were included.52 Further studies are needed to understand optimal interviewing styles and their role in the resident selection process.
RECRUITMENT PROBLEMS AND REFORMS
The shift to a virtual interview approach in the residency selection process has changed recruitment in various ways; it has also further exposed a flawed system in need of reform and continues to present opportunities to innovate GME recruitment amid other system-related change. When evaluating candidates for selection to interview and rank, programs will continue to seek approaches that optimize the balance between effort and results; even with a greater emphasis on holistic review, they will likely aim to identify student assessment metrics that permit the clearest candidate evaluation and comparison, and that may best predict resident success. Without broad systems-level exploration and understanding of the undergraduate medical education (UME) to graduate medical education (GME) transition, independent or fractionated system changes may lead to unintended consequences. For instance, with high numbers of applications to programs and the change in USMLE step 1 score reporting, programs may use step 2 CK for screening applicants in the absence of additional changes.28 In a “fixes that fail” archetype that some authors have described, moving to pass/fail reporting (including in clerkship grading) may not address underlying issues and may actually worsen overall problems, such as application inflation, absent other system changes.53
Numerous reforms have been proposed to improve the integrity, equity, and efficiency of the match process.54 With virtual interview platforms and the ease with which applicants can apply to (and interview at) programs, many proposals have been developed in response to increasing application volume, as well as variability in student assessment methods and limited comparative metrics among candidates. As one root cause analysis reports, from a broad systems perspective with varying stakeholders, problems include culture, costs and limited resources, bias, system constraints, lack of standards, and lack of alignment. More to the point, the UME to GME transition “suffers from a lack of shared actionable definitions of important concepts, such as what constitutes a successful match, learner readiness for residency, competence, and professionalism.” Suggested (or enacted) reform measures, sometimes fragmented across specialties, have included caps on applications, programs, and interviews; standardized student grading and metrics; the addition of supplemental applications; standardized MSPE letters or other letters of recommendation; program (and applicant) transparency; specialty-specific metrics; and holistic application review, to name a few. Each approach in this nonexhaustive list has merits and disadvantages.
ERAS currently uses a tiered fee structure and is implementing a new fee rate and structure beginning in the next recruitment cycle, charging a premium for additional applications past a base threshold ($11 for the first 30 applications; $30 per additional application). The new fee model aims to limit application inflation and promote holistic review, though previous tiered fees have not diminished overall application numbers. Holistic review has also not been practical for programs facing progressively more applications per applicant, necessitating screening mechanisms such as step score cut-offs. Before the pandemic, individual specialties implemented piecemeal application cycle reforms with variable success. For example, emergency medicine initiated the Standardized Letter of Evaluation (SLOE) in 1997 for applicant comparison, and this process remains a mainstay of the resident selection process in that specialty.55 Other specialties have led reforms prepandemic and postpandemic as well. Given concerns that a small cohort of highly competitive applicants may accept and attend a disproportionate number of interviews, and that the virtual format might exacerbate this maldistribution by easing cost and travel constraints, several specialties have adopted strategies to limit interviews. For example, ophthalmology has designed a centralized scheduling platform that limits applicants to 20 virtual interviews;56 other specialties have created standardized interview offer dates with acceptance windows to create a more predictable timeline for applicants.57,58
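As a quick arithmetic sketch, the tiered fee schedule quoted above can be expressed as a small function. This assumes a per-application reading of the rates ($11 each for the first 30 applications, $30 for each beyond); the function name and that reading are illustrative assumptions, and the AAMC fee schedule remains authoritative.

```python
def eras_fee(n_applications: int) -> int:
    """Estimate a total ERAS fee under the tiered model described in the
    text, read as per-application rates: $11 each for the first 30
    applications, $30 for each application beyond 30.
    (Illustrative assumption only; consult the AAMC fee schedule.)"""
    if n_applications < 0:
        raise ValueError("application count must be non-negative")
    BASE_RATE, PREMIUM_RATE, THRESHOLD = 11, 30, 30
    base = min(n_applications, THRESHOLD) * BASE_RATE
    premium = max(n_applications - THRESHOLD, 0) * PREMIUM_RATE
    return base + premium
```

Under this reading, 30 applications would cost $330, while the roughly 57 applications typical of anesthesiology applicants would cost $330 + 27 × $30 = $1140, illustrating how the premium tier is meant to discourage marginal applications.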
During the 2020-2021 recruitment cycle, otolaryngology residency programs introduced preference signaling in the application process, permitting each applicant 5 signals to designate strong preference for a limited number of programs.59 The subsequent year, anesthesiology programs trialed this strategy, with 97% of programs participating in signaling (5 signals of equal weight provided to each applicant) and 95% of applicants using their signals for selected programs. Further, 89% of applicants designated program signals and geographic preferences that were aligned, and anesthesiology programs received, on average, 104.5 signals.60 In a program director survey during this period, 88% of respondents reported that signals were important or very important as a screening tool, though they were not intended to be used in this way; 71% included them in a composite filter to conduct holistic review. In addition, 84% agreed that program signals helped identify applicants whom the program would otherwise have overlooked. That same year, anesthesiology joined the ERAS supplemental application in the second year of its pilot. The supplement introduced 3 additional elements to the ERAS application: program signaling, geographic preference signaling, and identification of meaningful and impactful experiences. Applicants could indicate that they had no geographic preference or identify up to 3 (of 9) geographic divisions defined by the US Census Bureau in which they would prefer to train. The experiences section enabled applicants to identify and expand on up to 5 experiences and explain how these were important to their development and goals. Sixty-one percent of program directors responded that they used the experiences section during application review; half thought it provided valuable information beyond that in the standard ERAS application.
In the 2023-2024 application cycle, the AACPD Council strongly recommended applicant signaling, and tiered signaling was employed with 15 signals (5 “gold” and 10 “silver”), representing 23% of the average number of programs applied to per anesthesiology applicant. At present, multiple specialties provide signaling options for applicants (Fig. 4). Students who do not signal a program may still be interested, but the lack of a signal may become as informative to program directors as the presence of one. Coinciding with this addition of signaling, the past application cycle represents the first year that our specialty has observed a decrease in applications to programs. Further, the supplemental application was discontinued; however, the ERAS application included an “experiences” section for applicants to provide more detailed descriptions if desired. Applicants may share information about their background, hardships they have overcome, or life experiences that have shaped their development. Finally, the Society of Academic Associations of Anesthesiology and Perioperative Medicine (SAAAPM) also provided additional guidance for programs and applicants in an informational section on its website regarding these updates.61 The application changes were designed to help applicants more effectively highlight their attributes and preferences as well as to encourage holistic review of applications by programs.
FIGURE 4.
Preference signals offered by specialty in the NRMP match. Data available through https://students-residents.aamc.org/applying-residencies-eras/myeras-application-and-program-signaling-2023-24. Accessed February 10, 2024.
One additional and more recent complexity in the application review process is the potential use of artificial intelligence applications based on natural language processing algorithms. For example, ChatGPT (OpenAI, San Francisco, CA) is one such large language model designed to mimic human language processing abilities and has gained worldwide attention for its impressive performance in language processing tasks such as text generation, machine translation, and question answering. One study has shown that ChatGPT can generate resident applicant personal statements that most anesthesiology program directors found acceptable for applicants to their programs.62 The value of statements generated by such artificial intelligence platforms requires further investigation, especially as these technologies can produce high-quality statements and invented narratives that may be difficult to detect. A varied application format with directed questions may instead provide increased benefit for program review. Such technology’s impact on the screening and selection process by programs remains to be determined, but it has the potential to increase the efficiency of any holistic review process in a large candidate pool.
THE LEGAL LANDSCAPE AND IMPLICATIONS TO DIVERSITY-PROMOTING RECRUITMENT STRATEGIES
It is important to highlight the June 2023 US Supreme Court landmark rulings Students for Fair Admissions Inc. v. President and Fellows of Harvard College (2023) and Students for Fair Admissions Inc. v. University of North Carolina (2023), which challenge diversity efforts in medical school admissions and have potential implications for residency training programs. Overturning more than 40 years of precedent, the Court determined that affirmative action policies that consider race and ethnicity as key criteria for admission within undergraduate, graduate, and professional education are unconstitutional because they breach the equal protection clause of the Constitution of the United States.63 In other words, the consideration of race alone, outside the context of one’s life experiences, is unlawful, and all institutions receiving federal funding must abide by Title VI of the Civil Rights Act of 1964. Central to the ruling was a determination that the diversity goals (relating to the educational benefits of diversity) driving these 2 institutions’ race-conscious admissions practices were immeasurable and that there is no certainty as to the degree of diversity necessary to achieve desired outcomes. This decision extends to medical school admissions, though Chief Justice Roberts (writing for the Court’s majority) provided guidance for moving forward, noting that “nothing prohibits universities from considering an applicant’s discussion of how race affected the applicant’s life, so long as that discussion is concretely tied to a quality of character or unique ability that the particular applicant can contribute to the university.”
Before this ruling, an amicus brief submitted in July 2022 by the Association of American Medical Colleges and supported by 45 other health care organizations stated that diversity in the education of physicians and other health care professionals in the United States is a medical imperative, with the goal of producing a class of health care providers best equipped to serve all of society.64 At present, the racial and ethnic makeup of the US physician workforce does not match that of the country. According to a recent AAMC survey, 5.7% of active US physicians are African American, less than half of the population proportion of 18% based on recent census data. Over 19% of the US population identifies as Hispanic, yet Hispanic physicians make up only 6.9% of the physician workforce; American Indians and Alaskan Natives make up about 1.1% of Americans but only 0.3% of physicians based on recent data.65 In 2022, under-represented groups comprised 11.4% of MD/PhD program matriculants and just 5% of physician-scientist investigators funded by the National Institutes of Health.66 There is growing evidence that state-level bans on the use of affirmative action in public university admissions in recent decades have led to declines in enrollment of students from historically marginalized populations in public medical schools.67 The AAMC-submitted amicus brief highlights that states with such bans have reported a 37% decrease in the number of minority medical students. This lack of, and potentially diminishing, diversity in the physician workforce has significant implications for health equity and population health.
Physicians from concordant racial and ethnic groups may improve trust and engagement with preventative care and thus reduce premature mortality in populations that have faced structural barriers to health care access and treatment.68 For instance, a recent study demonstrated that for every 10% increase in the number of black primary care physicians, the life expectancy of black individuals increased by one month.69 Diversity in the health care professions also facilitates access to health care for underserved minorities,70 with minority health professionals more likely to serve in areas with high rates of uninsured patients and large populations of under-represented racial and ethnic groups.71
Subsequent to the Supreme Court decision, the ACGME re-emphasized one of its requirements for sponsoring institutions: a focus on mission-driven, systematic recruitment and retention of a diverse and inclusive workforce of residents, fellows, and faculty members.72 A follow-up letter also highlighted that ACGME standards do not require race-based affirmative action to achieve diversity, and institutions and programs are not required to change their resident selection practices.73 While the Court decision applies to college and medical school admissions, it does not specifically place such restrictions on program match lists and GME employment contracts. Moreover, the ruling has not changed affirmative action programs at this time; however, it raises questions about how employment practices related to DEI initiatives might be treated in the future. To date, racial and ethnic academic disparities have been detailed in residency selection factors, especially as they relate to traditional academic metrics. These include demographic differences in standardized USMLE examination scores,34 clerkship grading and MSPE summary words,74 and honor society membership,75 all valued metrics in resident selection.
Specific groups (American Indian or Alaska Native, Black, Hawaiian Native or Pacific Islander, and Hispanic students) have demonstrated the highest rates of unsuccessful GME placement, particularly under-represented in medicine (URM) men and low-income URM students.76 In one study, self-identification as a non-white race or ethnicity was independently associated with a lower likelihood of interview selection for general surgery residency programs.77 Furthermore, under-represented students with low income and black men experience attrition at multiple points during training.78,79 Unmatched black, Hispanic, and non-US citizen graduates have increased over time, and racial/ethnic minority graduates are consistently less likely than white students to begin GME the year they graduate, though these differences diminish with time.80 Importantly, under-representation of minority applicants in anesthesiology (based on the 2022 main residency match) is compounded by their under-representation among applicants overall.81 It is not clear whether URM students are matching at lower rates because they are interviewed less frequently or because they are placed lower on program rank lists. At present, the NRMP does not provide demographic reports for applicants who did not match or for where URM applicants fall on program rank lists, in total or stratified by specialty.
FUTURE DIRECTIONS
Residency program directors and recruitment committees continue to deliberate on data from multiple sources within applications to interview and rank applicants in an increasingly competitive applicant field. The paucity of quantitative metrics available for candidate assessment will continue to compel programs to advance holistic review, though it will likely lead both applicants and programs to place more emphasis on the USMLE Step 2 CK examination as programs seek standardized, stratifying metrics to compare candidates. While Step 1 was not designed to predict residency academic performance or success, low scores on this exam have correlated with failure on specialty in-training and certification examinations.82 The Step 1 reporting change may unfavorably impact US applicants from lesser-known medical schools, as well as DO and IMG applicants. The recent use of program preference signaling by applicants to express genuine interest in a program at the time of application has tempered the trend of recent application increases, based on the most recent ERAS data. For example, between 2023 and 2024, an approximate 10% decrease in average applications per applicant was observed within anesthesiology.83 Programs are also likely to assess the cohort of applicants who signaled them when making interview decisions. It is possible that some programs that receive a high percentage of applicants who signal them may not need to look beyond this cohort to fill their interview spots.
Among ongoing reforms in the residency selection process, a fundamental assessment must occur at the UME-to-GME transition. While the recent Step 1 reporting change de-emphasizes a single metric in residency recruitment, more objective measures should be available for assessing candidates. At present, such metrics are declining or absent for program review in the residency interview selection process, and reliable comparative data on applicant performance across various domains are lacking. In addition, NBME Subject Examination (“shelf”) scores at the end of student clerkships are less frequently reported, and medical school evaluations are increasingly difficult to interpret.84 MSPEs have been reported to inconsistently provide graphic comparative data,85 rarely provide comparative professionalism data,86 and routinely suppress negative academic information on applicants.87 Many medical schools have abandoned class ranking in favor of competence determination across various domains (pass/fail designations). Given the heterogeneity of the selection process as well as of the definitions of residency success within the literature, it has been difficult to determine the predictive value of individual application factors and their relative importance, which likely vary by specialty.
Efforts to improve the UME-to-GME transition are being studied, though the transition remains discontinuous and dysfunctional in key ways. The system is organized into multiple independent processes, each managed by leadership with separate goals, accountabilities, and financial processes that lack coordination across the educational continuum. As one author notes, the transition “suffers from a lack of shared actionable definitions of important concepts, such as what constitutes a successful match, learner readiness for residency, competence, and professionalism.”45 Student assessment methods and performance reporting practices vary across medical schools. At a fundamental level, there is a dichotomy between the objective of medical schools (to educate students and to define, measure, and ensure that competency goals are met for all students, thereby providing the public with physicians) and that of residency programs, which are motivated to select top performers or the “best” applicants using available metrics and to avoid selecting those who may struggle. Programs also search for meaningful assessment data to distinguish the academic records of candidates and to sort through high application volumes in selecting and ranking candidates. Students in this system learn which components serve as key currency as they work to find competitive advantages in an individualized pursuit that may not represent the broader, core system goals to which the entire medical education system is accountable. Standardized, summative, high-reliability multiple-choice examinations like the USMLE Step 1 examination shortchange the adoption of other student assessments that may be more valuable for learning and thus create a meritocracy around single metrics. This has led to over-reliance on performance on a single exam and has limited diversity in the selection process.34
It may be better for educators to work to understand both the strengths and limitations of current assessments and what they can (or cannot) represent about candidates. Programs need to articulate a clear vision and the aggregate characteristics of whom they are trying to match and to develop better tools for finding candidates who best fit these goals and values. Program directors should detail explicit selection criteria to applicants. Students need unbiased information and multiple relevant evaluation metrics to demonstrate their knowledge and clinical skills relevant to residency preparation. Medical schools need to examine education-driven assessments that provide selection value to residency programs in the match process. Options might include student NBME clinical science examination scores as well as composite or aggregate clinical performance scores on core rotations as ways for students to demonstrate core medical knowledge and clinical aptitude and proficiency. Residents in our specialty must pass staged examinations administered by the American Board of Anesthesiology (ABA) as early as the completion of their PGY2 (CA1) year. Though no longer available to programs, USMLE Step 1 numeric scores predicted success on the ABA Part 1 (BASIC) staged examination in one study at a single academic anesthesiology program88 and served to identify at-risk residents, allowing the development of individualized learning plans for trainees upon starting residency, before the mandated in-training examination (ITE).
Further, ITE scores after 1 year of clinical anesthesia training have been shown to be significant and moderately strong predictors of high performance on the ABA BASIC examination and significant predictors of success in completing both the Part 1 and Part 2 ABA examinations within the calendar year after graduation from residency.89 Studies in other medical specialties have found a strong correlation between ITE performance and certification examination performance.90–93 Such studies provide evidence that high-stakes examinations represent desired performance metrics in training, required for mastering basic knowledge. They also suggest we should not expect less of students in their preparation for residency by reducing psychometrically sound, standardized, validated assessments.
The residency application experience remains inherently competitive. Despite this match cycle’s decrease in applications to anesthesiology programs, increases in application numbers across all specialties in recent years have been driven primarily by increases in non-US citizen international medical graduates and US osteopathic (DO) seniors. Though research demonstrates some differences in USMLE scores attributable to race and ethnicity,31 the majority of these observed differences abate when controlling for Medical College Admission Test scores and undergraduate grade point average.94 Potential improvements to the residency application process must consider varied stakeholders and perspectives given the interconnectedness of UME and GME and their complex, sometimes conflicting, objectives. These gaps exist because program directors, educators, learners, and the public value related yet varied goals. Systems-thinking archetypes may help residency programs better understand and direct analysis of problems with the transition from UME to GME and the challenges of residency selection. For instance, the recent changes to medical school pass/fail grading and USMLE Step 1 score reporting may exacerbate other system problems, such as overapplication and increased reliance on the remaining quantitative data in applications, such as USMLE Step 2 CK results. Potential reforms such as more formalized MSPEs, standardized virtual video interviews, or the development of one or more additional academic metrics for student assessment are a few changes that could benefit program directors and recruitment committees as they select residency candidates, though any changes will likely have both intended and unintended consequences. The NRMP may take the lead in providing deidentified data on student and GME rank lists and corresponding Match results. New standardized, stratifying metrics may need to be developed to assess applicants.
Programs will also need to improve the data available to applicants about recruitment benchmarks and the attributes of successfully matched candidates. Longitudinal effects of changes will also need to be measured, including a focus on resident performance metrics. Tensions around the purposes of assessment (to support learning versus program selection and ranking) need to be recognized and discussed, as do the needs and goals of both educators and learners in a collective process designed to educate medical students and train them to become tomorrow’s physicians.
Footnotes
The authors have no sources of support or conflicts of interest to disclose.
Contributor Information
Stephen Collins, Email: src2f@uvahealth.org.
E. Brooke Baker, Email: EBBaker@salud.unm.edu.
REFERENCES
- 1. 2021 Applicant and Program Director Survey Findings: Impact of the virtual experience on the transition to residency Accessed February 10, 2024. https://www.nrmp.org/wp-content/uploads/2021/08/Research-Brief-Virtual-Experience-2021-FINAL.pdf
- 2. Recommendations on 2021-22 Residency Season Interviewing for Medical Education Institutions Considering Applicants from LCME-Accredited, U.S. Osteopathic, and Non-U.S. Medical Schools Accessed February 10, 2024. https://physicianaccountability.org/wpcontent/uploads/2021/08/Virtual-Rec_COVID-Only_Final.pdf
- 3. AAMC Interview Guidance for the 2022-2023 Residency Cycle. Association of American Medical Colleges. Accessed February 10, 2024. https://www.aamc.org/what-we-do/mission-areas/medical-education/aamcinterview-guidance-2022-2023-residency-cycle
- 4. AAMC: Interviews in GME: Where Do We Go From Here?. Accessed February 10, 2024. https://www.aamc.org/about-us/mission-areas/medical-education/interviews-gme-where-do-we-go-here
- 5. National Resident Matching Program . What is the match? About the NRMP. Accessed February 10, 2024. http://www.nrmp.org/about
- 6. National Resident Matching Program (NRMP) . Results and Data: 2023 Main Residency Match National Resident Matching Program. Washington, DC; 2023. Accessed February 10, 2024. https://www.nrmp.org/match-data-analytics/residency-data-reports/ [Google Scholar]
- 7. ERAS® Statistics. Preliminary Data. Accessed February 10, 2024. https://www.aamc.org/data-reports/data/eras-statistics-data
- 8. National Resident Matching Program (NRMP) . Accessed February 10, 2024. https://www.nrmp.org/about/news/2023/05/nrmp-releases-the-2023-main-residency-match-results-and-data-report-the-most-trusted-data-resource-for-the-main-residency-match/
- 9. Mott NM, Carmody B, Marzano DA, et al. What’s in a number? Breaking down the residency match rate. N Engl J Med. 2022;386:1583–1586. [DOI] [PubMed] [Google Scholar]
- 10. Taylor CA, Mayhew HE, Weinstein L. Residency directors’ responses to the concept of a proposed electronic residency application service. Acad Med. 1994;69:138–142. [DOI] [PubMed] [Google Scholar]
- 11. National Resident Matching Program (NRMP) . Accessed February 10, 2024. https://www.nrmp.org/match-data-analytics/archives/
- 12. National Resident Matching Program (NRMP) . Accessed February 10, 2024. https://www.nrmp.org/match-data-analytics/residency-data-reports/
- 13. Accessed February 10, 2024. https://www.aamc.org/media/47726/download
- 14. National Resident Matching Program (NRMP) . Accessed February 10, 2024. https://www.nrmp.org/wp-content/uploads/2023/03/ROL-Length-Data_2023-Final.pdf
- 15.https://www.nrmp.org/match-data-analytics/archives/ Accessed February 10, 2024.
- 16. Antono B, Willis J, Phillips RL, et al. The price of fear: an ethical dilemma underscored in a virtual residency interview season. J Grad Med Educ. 2021;13:316–320. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 17. Weissbart SJ, Kim SJ, Feinn RS, et al. Relationship between the number of residency applications and the yearly match rate: time to start thinking about an application limit? J Grad Med Educ. 2015;7:81–85. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 18. Association of American Medical Colleges . FACTS: Applicants and Matriculants Data, Table A-1, 2009-2019. https://www.aamc.org/system/files/2019-11/2019_FACTS_Table_A-1.pdf
- 19. Council of Graduate Schools . Graduate Enrollment and Degrees: 2008 to 2018 Accessed February 10, 2024. https://cgsnet.org/graduateenrollment-and-degrees
- 20. Law School Admission Council . Archive: 2000-2015 ABA End-of-Year Summaries – Applicants, Admitted Applicants & Applications Accessed February 10, 2024. https://www.lsac.org/dataresearch/data/admission-trends-aba-applicantsadmitted-applicants-applications
- 21. Accreditation Council of Graduate Medical Education . ACGME program requirements for graduate medical education (residency) Accessed February 10, 2024. https://www.acgme.org/globalassets/pfassets/programrequirements/cprresidency_2023.pdf
- 22. Association of American Medical Colleges . Grading systems use by U.S. medical schools (2020) Accessed February 10, 2024. https://www.aamc.org/data-reports/curriculum-reports/interactive-data/grading-systems-use-us-medical-schools
- 23. Westerman ME, Boe C, Bole R, et al. Evaluation of medical school grading variability in the United States: are all honors the same? Acad Med. 2019;94:1939–1945. [DOI] [PubMed] [Google Scholar]
- 24. Hom J, Richam I, Hall P, et al. The state of medical student performance evaluations: improved transparency or continued obfuscation. Acad Med. 2016;91:1534–1539. [DOI] [PubMed] [Google Scholar]
- 25. Kenny S, McInnes M, Singh V. Associations between residency selection strategies and doctor performance: a meta-analysis. Med Educ. 2013;47:790–800. [DOI] [PubMed] [Google Scholar]
- 26. Guffey RC, Rusin K, Chidiac EJ, et al. The utility of pre-residency standardized tests for anesthesiology resident selection: the place of United States Medical Licensing Examination scores. Anesth Analg. 2011;112:201–206. [DOI] [PubMed] [Google Scholar]
- 27. Manuel SP, Grewal GK, Lee JS. Millennial resident study habits and factors that influence American Board of Anesthesiology in-training examination performance: a multi-institutional study. J Educ Periop Med. 2018;20:E623. [PMC free article] [PubMed] [Google Scholar]
- 28. Dillon GF, Swanson DB, McClintock JC, et al. The relationship between the American Board of Anesthesiology Part 1 certification examination and the United States medical licensing examination. J Grad Med Educ. 2013;5:276–283. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 29. Chen F, Arora H, Martinelli SM, et al. The predictive value of pre-recruitment achievement on resident performance in anesthesiology. J Clin Anesth. 2017;39:139–144. [DOI] [PubMed] [Google Scholar]
- 30. National Resident Matching Program (NRMP) . Results of the 2022 NRMP Program Director Survey. Accessed February 10, 2024. https://www.nrmp.org/wp-content/uploads/2022/09/PD-Survey-Report-2022_FINALrev.pdf
- 31. Makhoul AT, Pontell ME, Kumar NG, et al. Objective measures needed—program directors’ perspectives on a pass/fail USMLE Step 1. N Engl J Med. 2020;382:2389–2392. [DOI] [PubMed] [Google Scholar]
- 32. Khan M, Gil N, Lin W, et al. The impact of Step 1 scores on medical students’ residency specialty choice. Med Sci Educ. 2018;28:699–705. [Google Scholar]
- 33. USMLE Score Interpretation Guidelines. Accessed February 10, 2024. https://www.usmle.org/sites/default/files/2021-08/USMLE_Step_Examination_Score_Interpretation_Guidelines.pdf
- 34. Rubright JD, Jodoin M, Barone MA. Examining demographics, prior academic performance, and United States Medical Licensing Examination scores. Acad Med. 2019;94:364–370. [DOI] [PubMed] [Google Scholar]
- 35. Talamantes E, Henderson MC, Fancher TL, et al. Closing the gap—making medical school admissions more equitable. N Engl J Med. 2019;380:803–805. [DOI] [PubMed] [Google Scholar]
- 36. Page S. The diversity bonus: How great teams pay off in the knowledge economy. Princeton University Press; 2017. [Google Scholar]
- 37. Alsan M, Garrick O, Graziani GC. Does diversity matter for health? Experimental evidence from Oakland. National Bureau of Economic Research; 2024. [Google Scholar]
- 38. Xierali IM, Nivet MA. The racial and ethnic composition and distribution of primary care physicians. J Health Care Poor Underserved. 2018;29:556–570. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 39. Association of Anesthesiology Core Program Directors (AACPD) . AACPD Statement on Residency Interview Format. Accessed February 10, 2024. https://www.saaapm.org/assets/docs/aacpd/AACPD_InterviewGuidance_2023-24.pdf
- 40. Huppert LA, Hsiao EC, Cho KC, et al. Virtual interviews at graduate medical education training programs: determining evidence-based best practices. Acad Med. 2021;96:1137–1145. [DOI] [PubMed] [Google Scholar]
- 41. Arthur ME, Aggarwal N, Lewis S, et al. Rank and match outcomes of in-person and virtual anesthesiology residency interviews. J Educ Periop Med. 2021;23:E664. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 42. Vadi MG, Malkin MR, Lenart J, et al. Comparison of web-based and face-to-face interviews for application to an anesthesiology training program: a pilot study. Int J Med Educ. 2016;7:102–108. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 43. Domingo A, Rdesinski RE, Stenson A, et al. Virtual residency interviews: applicant perceptions regarding virtual interview effectiveness, advantages, and barriers. J Grad Med Educ. 2022;14:224–228. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 44. Lee DC, Kofskey AM, Singh NP, et al. Adaptations in anesthesiology residency programs amid the COVID-19 pandemic: virtual approaches to applicant recruitment. BMC Medical Education. 2021;21:1–5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 45. Association of American Medical Colleges . Away rotations of U.S. medical school graduates by intended specialty. In: 2019 AAMC medical school graduation questionnaire. 2019. Accessed February 10, 2024. https://aamc-orange.global.ssl.fastly.net/production/media/filer_public/c6/a7/c6a79bc8-3279-4e0a-9bbf-7b9359172db1/away_rotations_by_specialty_gq_2019_public.pdf
- 46. Asaad M, Elmorsi R, Ferry A, et al. Interviewing amidst a pandemic: perspectives of U.S. residency program directors on the virtual format. J Eur CME. 2022;11:1–5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 47. Breitkopf D, Albold C, Barlow J, et al. Virtual residency recruitment: future directions in the new era. Annals of Medicine. 2022;54:3341–3347. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 48. Stephenson-Famy A, Houmard BS, Oberoi S, et al. Use of the interview in resident candidate selection: a review of the literature. J Grad Med Educ. 2015;7:539–548. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 49. Campion MA, Palmer DK, Campion JE. A review of structure in the selection interview. Pers Psychol. 1997;50:655–702. [Google Scholar]
- 50. Janz T. Initial comparisons of patterned behavior description interviews versus unstructured interviews. J Appl Psychol. 1982;67:577–580. [Google Scholar]
- 51. Easdown LJ, Castro PL, Shinkle EP, et al. The behavioral interview, A method to evaluate ACGME competencies in resident selection: A pilot project. J Educ Perioper Med. 2005;7:E032. [PMC free article] [PubMed] [Google Scholar]
- 52. Gordon EKB, Clapp JT, Heins SJ, et al. The role of the interview in residency selection: a mixed-methods study. Med Educ. 2020;54:1029–1039. [DOI] [PubMed] [Google Scholar]
- 53. Swails JL, Angus S, Barone MA, et al. The undergraduate to graduate medical education transition as a systems problem: a root cause analysis. Acad Med. 2023;98:180–187. [DOI] [PubMed] [Google Scholar]
- 54. Zastrow RK, Burk-Rafel J, London D. Systems-level reforms to the U.S. resident selection process: a scoping review. J Grad Med Educ. 2021;13:355–370. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 55. Jackson JS, Bond M, Love JN, et al. Emergency medicine standardized letter of evaluation (SLOE): findings from the new electronic SLOE format. J Grad Med Educ. 2019;11:182–186. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 56. Quillen DA, Siatkowski RM, Feldon S. COVID-19 and the ophthalmology match. Ophthalmology. 2020;128:181–184. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 57. American Medical Association. Murphy B. The match process is packed with stress. OB-GYNs aim to fix it. Accessed February 10, 2024. https://www.ama-assn.org/education/improve-gme/match-process-packed-stress-ob-gyns-aim-fix-it
- 58. American Orthopaedic Association. Universal interview offer day. Accessed February 10, 2024. https://www.aoassn.org/programs
- 59. AMA. Murphy B. Pilot offers residency applicants a chance to say, “Look at me.” 2020. Accessed February 17, 2024. https://www.ama-assn.org/print/pdf/node/56346
- 60. AAMC. Supplemental ERAS Application Data and Reports. Accessed February 10, 2024. https://www.aamc.org/data-reports/students-residents/report/supplemental-eras-application-data-and-reports
- 61. SAAAPM. Changes to the 2023-2024 ERAS Application. Accessed February 10, 2024. https://www.saaapm.org/2024-program-signaling
- 62. Johnstone RE, Neely G, Sizemore DC. Artificial intelligence software can generate residency application personal statements that program directors find acceptable and difficult to distinguish from applicant compositions. J Clin Anesth. 2023;89:111185.
- 63. Students for Fair Admissions, Inc. v. President and Fellows of Harvard College and Students for Fair Admissions, Inc. v. University of North Carolina, 20-1199 and 21-707 U.S. (2023) (Opinion of the Court).
- 64. Brief for Amici Curiae, Association of American Medical Colleges et al. In Support of Respondents. Nos. 20-1199 & 21-707. Accessed February 10, 2024. https://www.aamc.org/media/61976/download?attachment
- 65. AAMC. Physician Specialty Data Report. Accessed February 10, 2024. https://www.aamc.org/data-reports/data/2022-physician-specialty-data-report-executive-summary
- 66. Behera A, Tan J, Erickson H. Diversity and the next-generation physician-scientist. J Clin Transl Sci. 2019;3:47–49.
- 67. Ly DP, Essien UR, Olenski AR, et al. Affirmative action bans and enrollment of students from underrepresented racial and ethnic groups in U.S. public medical schools. Ann Intern Med. 2022;175:873–878.
- 68. Frakes MD, Gruber J. Racial concordance and the quality of medical care: evidence from the military. National Bureau of Economic Research; 2022.
- 69. Snyder JE, Upton RD, Hassett TC, et al. Black representation in the primary care physician workforce and its association with population life expectancy and mortality rates in the US. JAMA Netw Open. 2023;6:e236687.
- 70. Jackson CS, Gracia JN. Addressing health and health-care disparities: the role of a diverse workforce and the social determinants of health. Public Health Rep. 2014;129(suppl 2):61.
- 71. Pittman P, Chen C, Erikson C, et al. Health workforce for health equity. Medical Care. 2021;59:S405–S408.
- 72. ACGME. Letter to the GME Community from Thomas J. Nasca, MD, MACP, President and Chief Executive Officer of the ACGME: Why workforce diversity matters to health care and graduate medical education. 2023. Accessed February 10, 2024. https://www.acgme.org/newsroom/2023/6/letter-to-the-gme-community-why-workforce-diversity-matters-to-GME/
- 73. ACGME. Follow-up to Dr. Nasca’s June 13 letter to the community after Supreme Court decision regarding college admissions and race. 2023. Accessed February 10, 2024. https://www.acgme.org/newsroom/2023/7/letter-to-the-community-july-12/
- 74. Low D, Pollack SW, Liao ZC, et al. Racial/ethnic disparities in clinical grading in medical school. Teach Learn Med. 2019;31:487–496.
- 75. Boatright D, Ross D, O’Connor P, et al. Racial disparities in medical student membership in the Alpha Omega Alpha honor society. JAMA Intern Med. 2017;177:659–665.
- 76. Nguyen M, Chaudhry SI, Desai MM, et al. Rates of medical student placement into graduate medical education by sex, race and ethnicity, and socioeconomic status, 2018-2021. JAMA Netw Open. 2022;5:e2229243.
- 77. Jarman BT, Kallies KJ, Joshi ART, et al. Underrepresented minorities are underrepresented among general surgery applicants selected to interview. J Surg Educ. 2019;76:e15–e23.
- 78. Nguyen M, Chaudhry SI, Desai MM, et al. Association of sociodemographic characteristics with U.S. medical student attrition. JAMA Intern Med. 2022;182:917.
- 79. Nguyen M, Cross J, Chaudhry SI, et al. Association of sex and ethnoracial identities with attrition from medical school. J Gen Intern Med. 2022;37:3762–3765.
- 80. Sondheimer HM, Xierali IM, Young GH. Placement of U.S. medical school graduates into graduate medical education, 2005 through 2015. JAMA. 2015;314:2409–2410.
- 81. National Resident Matching Program (NRMP). Research Brief: Applicant demographics and the transition to residency: it’s time to leverage data on preferred specialty and match outcomes to inform the national conversation about diversity and equity in medical education. Accessed February 10, 2024. https://www.nrmp.org/wp-content/uploads/2023/02/Demographic-data-perspectives-paper_FINAL.pdf
- 82. Hamdy H, Prasad K, Anderson MB, et al. BEME systematic review: predictive values of measurements obtained in medical schools and future performance in medical practice. Med Teach. 2006;28:103–116.
- 83. AAMC. Preliminary program signaling data and their impact on residency selection. Accessed February 10, 2024. https://www.aamc.org/services/eras-institutions/program-signaling-data
- 84. Hartman ND, Lefebvre CW, Manthey DE. A narrative review of the evidence supporting factors used by residency program directors to select applicants for interviews. J Grad Med Educ. 2019;11:268–273.
- 85. Boysen-Osborn M, Yanuck J, et al. Ranking practice variability in the medical student performance evaluation: so bad, it’s “good.” Acad Med. 2016;91:1540–1545.
- 86. Boysen-Osborn M, Yanuck J, Mattson J, et al. Who to interview? Low adherence by US medical schools to medical student performance evaluation format makes resident selection difficult. West J Emerg Med. 2017;18:50–55.
- 87. Edmond M, Roberson M, Hasan N. The dishonest dean’s letter: an analysis of 532 dean’s letters from 99 US medical schools. Acad Med. 1999;74:1033–1035.
- 88. Markham TH, de Haan JB, Guzman-Reyes S, et al. Anesthesiology resident performance on the US medical licensing examination predicts success on the American Board of Anesthesiology BASIC staged examination: an observational study. J Educ Perioper Med. 2020;22:E646.
- 89. McClintock JC, Gravlee GP. Predicting success on the certification examinations of the American Board of Anesthesiology. Anesthesiology. 2010;112:212–219.
- 90. Garvin P, Kaminsky D. Significance of the in-training examination in a surgical residency program. Surgery. 1984;96:109–113.
- 91. Klein G, Austin M, Randolph S, et al. Can USMLE and orthopaedic in-training examination scores predict passage of the ABOS part-I examination? J Bone Joint Surg Am. 2004;86:1092–1095.
- 92. Rollins L, Martindale J, Edmond M, et al. Predicting pass rates on the American Board of Internal Medicine certifying examination. J Gen Intern Med. 1998;13:414–416.
- 93. Rayamajhi S, Dhakal P, Wang L, et al. Do USMLE steps, and ITE score predict the American Board of Internal Medicine certifying exam results? BMC Med Educ. 2020;20:79.
- 94. Katsufrakis PJ, Chaudhry HJ. Improving residency selection requires close study and better understanding of stakeholder needs. Acad Med. 2019;94:305–308.