Abstract
Background:
Orthopaedic surgery is among the most competitive specialties in the US residency match. Preference signaling was introduced in the 2022 to 2023 application cycle with the goal of reducing application volume and improving efficiency. While overall applications have declined, little is known about how signals are distributed across programs.
Methods:
Data from the Association of American Medical Colleges Residency Explorer Tool were analyzed. Application volumes from 2021 to 2025, program characteristics, percentage of applicants signaling, and interview invitation rates were collected for each program. Program tier quintiles were determined using Doximity rankings. Application and signal densities (applications or signals per available position) were calculated for each program, and averages were compared across program tier, setting, size, and geography.
Results:
Average application density decreased from 139.18 applications per position in 2021 to 95.84 in 2025, despite an increase in applicants. The average signal density was 61.22 signals per position. Signal and application densities were highest in community-based, lower-tier, smaller nonmetropolitan programs and lowest in university-based, higher-tier, larger, metropolitan programs. Interview rates averaged 23% for signaling applicants and 2% for nonsignaling applicants.
Conclusions:
The introduction of preference signaling has coincided with reduced application volume to orthopaedic residency programs. Program-level data suggest a trend of disproportionate signaling to community-based, lower Doximity rank, smaller, and nonmetropolitan programs, providing insight for applicants and programs navigating the evolving orthopaedic residency match.
Introduction
Orthopaedic surgery remains among the most competitive specialties in the US National Resident Matching Program1. In response to rising application numbers and administrative burden on applicants and programs, the Association of American Medical Colleges (AAMC) introduced preference signaling for the 2022 to 2023 application cycle. Under this system, applicants may send up to 30 signals to demonstrate program interest, with the goal of improving efficiency and reducing cost for both parties. Early reports suggest that application volumes have decreased, applicants are submitting fewer applications, and match rates remain stable2. Preference signaling has generally been well-received by both program directors and applicants; however, uncertainty remains regarding its optimal use, due in part to a lack of granular, program-level data3-5.
To date, most published studies have been survey-based or rely on limited data sets collected from small groups of applicants and programs6,7. Crucially, there is little publicly available information about how signals are allocated across individual orthopaedic residency programs. Without this transparency, applicants are unable to make informed decisions about where to send signals. The purpose of this study was to leverage data from a publicly available AAMC database to quantify signals received by orthopaedic residency programs in an effort to improve transparency and provide guidance for applicants navigating this process.
Methods
The AAMC Residency Explorer Tool was queried for data on all orthopaedic surgery programs8. Variables included program characteristics, application volumes (2021 and 2025), number of available positions (2025), interview rates by signal status (2025), and proportion of applicants signaling. Years correspond to match result years (i.e., 2025 data refer to the 2024-2025 application cycle). The number of signals to a program was calculated by multiplying the number of applications to that program by the percentage of applicants who signaled that program in the 2025 cycle. Signal density was calculated by dividing the number of signals to a program by the number of categorical positions offered. Application density was calculated by dividing the number of applications in a given year by the number of categorical positions offered. The 2025 counts of categorical positions offered were used for all years because earlier counts were not readily available. Signal and application densities were calculated as unweighted averages across programs within each subgroup to enable statistical comparison across groups. These averages are reported as signal and application densities overall and for each subgroup unless otherwise specified.
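The per-program calculations above can be expressed compactly. The following is an illustrative sketch only, not the authors' analysis code; function and variable names are hypothetical.

```python
# Sketch of the density metrics described in the Methods (hypothetical names).

def program_metrics(applications_2025, pct_signaled, positions):
    """Return (signals, signal density, application density) for one program.

    applications_2025: number of applications received in the 2025 cycle
    pct_signaled: fraction of those applicants who signaled (e.g., 0.65)
    positions: number of categorical positions offered (2025 count)
    """
    signals = round(applications_2025 * pct_signaled)    # signals = apps x % signaling
    signal_density = signals / positions                 # signals per position
    application_density = applications_2025 / positions  # applications per position
    return signals, signal_density, application_density


def subgroup_density(per_program_densities):
    """Unweighted average of per-program densities within a subgroup."""
    return sum(per_program_densities) / len(per_program_densities)
```

Note the unweighted averaging: the subgroup density is the mean of each program's own density, not total applications divided by total positions, so small programs count equally with large ones.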
Program rankings were collected from the 2025 Doximity Residency Navigator, sorted by reputation9. Doximity compiles its ranking list from physician surveys conducted from 2023 to 2025. Doximity data were chosen because they represent the only comprehensive ranking system available, although they remain a narrow and potentially biased way to compare programs. Programs were grouped into tier quintiles based on the Doximity ranking, with Tier 1 consisting of the top 20% of programs and Tier 5 of the bottom 20%. Three programs were not listed in the Doximity rankings and were included in Tier 5. Data on the 20 most populous metropolitan areas were collected from the 2021 US Census10. Proximity to these metropolitan areas was determined using zip code coordinates; programs within 20 miles were classified as metropolitan. Kruskal-Wallis and Wilcoxon rank sum tests were used to compare signal and application densities across independent subgroups. Statistical analyses were performed in Stata (version 17.0, StataCorp). Figures were generated using Microsoft Excel and MapChart.net.
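The quintile tiering and metropolitan classification can be sketched as follows. This is a hypothetical illustration under stated assumptions (1-based ranks, great-circle distance from zip code centroids); it is not the authors' code, and all names are invented.

```python
# Hypothetical sketch of the grouping steps described above.
import math


def assign_tier(doximity_rank, n_ranked):
    """Quintile tier from a 1-based Doximity reputation rank.

    Unranked programs (rank None) are placed in Tier 5, as in the study.
    """
    if doximity_rank is None:
        return 5
    return min(5, math.ceil(doximity_rank / (n_ranked / 5)))


def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in miles."""
    r = 3958.8  # Earth mean radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def is_metropolitan(program_coord, metro_coords, cutoff_miles=20):
    """True if the program's zip code centroid lies within 20 miles of any
    of the 20 most populous metropolitan areas."""
    return any(
        haversine_miles(*program_coord, *c) <= cutoff_miles for c in metro_coords
    )
```

For example, with 100 ranked programs, ranks 1-20 fall in Tier 1 and ranks 41-60 in Tier 3.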
Results
A total of 210 programs were included in the AAMC data set. Of these, 111 programs (52.9%) had 2025 signaling data and application numbers for both the 2021 and 2025 cycles available, accounted for 533 offered residency positions, and were included in the study (Table I). These 111 programs received a total of 46,049 applications and 30,160 signals in 2025, compared with 68,991 applications in 2021. The average signal density was 61.22 signals/position (Table II). Average application density decreased from 139.18 applications/position in 2021 to 95.84 in 2025. Overall interview rates were 23% for applicants who signaled and 2% for those who did not. Of 111 programs, 53 offered no interviews to applicants who did not signal.
TABLE I.
Number of Positions Offered, Applications, and Signals by Program Characteristics
| | No. of Programs | No. of Included Programs (% of Total) | No. of Positions Offered* (% of Included Programs) | No. of Signals* (% of Included Programs) | 2021 No. of Applications* | 2025 No. of Applications* | Change in Application Number* (%) |
|---|---|---|---|---|---|---|---|
| Overall | 210 | 111 (52.9) | 533 (100.0) | 30,160 (100.0) | 68,991 | 46,049 | −33 |
| Program region | |||||||
| East North Central | 43 | 24 (55.8) | 103 (19.3) | 6,257 (20.7) | 14,190 | 9,666 | −32 |
| East South Central | 10 | 5 (50.0) | 28 (5.3) | 1,373 (4.6) | 3,356 | 2,105 | −37 |
| Middle Atlantic | 45 | 28 (62.2) | 133 (25.0) | 6,973 (23.1) | 15,050 | 11,164 | −26 |
| Mountain | 8 | 5 (62.5) | 25 (4.7) | 1,521 (5.0) | 4,070 | 2,322 | −43 |
| New England | 9 | 7 (77.8) | 39 (7.3) | 1,944 (6.4) | 5,373 | 2,893 | −46 |
| Pacific | 23 | 12 (52.2) | 59 (11.1) | 2,971 (9.8) | 7,395 | 4,238 | −43 |
| South Atlantic | 38 | 14 (36.8) | 70 (13.1) | 5,387 (17.9) | 9,965 | 7,723 | −22 |
| West North Central | 12 | 7 (58.3) | 35 (6.6) | 1,560 (5.2) | 3,917 | 2,482 | −37 |
| West South Central | 22 | 9 (40.9) | 41 (7.7) | 2,176 (7.2) | 5,675 | 3,456 | −39 |
| Program location | |||||||
| Metropolitan | 53 | 32 (60.4) | 176 (33.0) | 9,149 (30.3) | 21,226 | 13,477 | −37 |
| Nonmetropolitan | 157 | 79 (50.3) | 357 (67.0) | 21,011 (69.7) | 47,765 | 32,572 | −32 |
| Program tier | |||||||
| Tier 1 | 42 | 27 (64.3) | 195 (36.6) | 9,167 (30.4) | 22,011 | 12,978 | −41 |
| Tier 2 | 42 | 24 (57.1) | 119 (22.3) | 7,600 (25.2) | 18,035 | 11,136 | −38 |
| Tier 3 | 42 | 20 (47.6) | 85 (15.9) | 4,726 (15.7) | 10,579 | 7,658 | −28 |
| Tier 4 | 42 | 22 (52.4) | 80 (15.0) | 5,134 (17.0) | 11,583 | 8,459 | −27 |
| Tier 5 | 42 | 18 (42.9) | 54 (10.1) | 3,532 (11.7) | 6,783 | 5,818 | −14 |
| Program setting | |||||||
| Community-based | 27 | 12 (44.4) | 37 (6.9) | 2,984 (9.9) | 4,271 | 4,633 | +8 |
| Community-based, university-affiliated | 67 | 37 (55.2) | 145 (27.2) | 8,419 (27.9) | 19,052 | 13,450 | −29 |
| University-based | 98 | 60 (61.2) | 336 (63.0) | 18,273 (60.6) | 44,459 | 27,264 | −39 |
| Not available | 16 | 2 (12.5) | 15 (2.8) | 485 (1.6) | 1,209 | 702 | −42 |
| Other | 1 | 0 (0.0) | 0 (0.0) | 0 (0.0) | 0 | 0 | n/a |
| Military-based | 1 | 0 (0.0) | 0 (0.0) | 0 (0.0) | 0 | 0 | n/a |
| Program size | |||||||
| Small (1-3 positions) | 55 | 29 (52.7) | 78 (14.6) | 5,656 (18.8) | 12,756 | 9,564 | −25 |
| Medium (4-6 positions) | 118 | 63 (53.4) | 299 (56.1) | 17,659 (58.6) | 40,848 | 26,858 | −34 |
| Large (7+ positions) | 27 | 19 (70.4) | 156 (29.3) | 6,845 (22.7) | 15,387 | 9,627 | −37 |
| Not reported | 10 | 0 (0.0) | 0 (0.0) | 0 (0.0) | 0 | 0 | n/a |
| Supplemental information | |||||||
| Required | 49 | 26 (53.1) | 122 (25.0) | 7,162 (23.7) | 16,093 | 10,712 | −33 |
| Not required | 135 | 74 (54.8) | 366 (75.0) | 20,624 (68.4) | 47,956 | 31,489 | −34 |
| Not reported | 26 | 11 (42.3) | 45 (9.2) | 2,374 (7.9) | 4,942 | 3,848 | −22 |
n/a = not applicable.
*These data are reported only for programs that were included in the study (programs with 2025 signaling data and application numbers for the 2021 and 2025 cycles available).
TABLE II.
Average Application Density, Signal Density, and Interview Rates by Program Characteristics for Included Programs
| | 2021 Application Density | p | 2025 Application Density | p | 2025 Signal Density | p | Interview Rate (Signal) (%) | p | Interview Rate (No Signal) (%) | p |
|---|---|---|---|---|---|---|---|---|---|---|
| Overall | 139.18 | 95.84 | 61.22 | 23 | 2 | |||||
| Program region | ||||||||||
| East North Central | 142.48 | 101.07 | 63.64 | 23 | 1 | |||||
| East South Central | 127.76 | 79.49 | 51.05 | 21 | 4 | |||||
| Middle Atlantic | 129.05 | 100.14 | 61.54 | 25 | 3 | |||||
| Mountain | 178.30 | 97.87 | 62.52 | 18 | 2 |
| New England | 158.84 | 85.42 | 54.85 | 26 | 4 | |||||
| Pacific | 137.72 | 79.31 | 53.94 | 22 | 1 | |||||
| South Atlantic | 147.38 | 119.02 | 81.38 | 17 | 2 | |||||
| West North Central | 115.37 | 75.10 | 45.71 | 29 | 1 |
| West South Central | 138.97 | 0.242 | 86.63 | 0.015 | 54.15 | 0.003 | 26 | 0.008 | 2 | 0.124 |
| Program location | ||||||||||
| Metropolitan | 136.76 | 89.25 | 58.58 | 21 | 2 | |||||
| Nonmetropolitan | 141.17 | 0.602 | 101.23 | 0.028 | 63.40 | 0.213 | 24 | 0.419 | 2 | 0.903 |
| Program tier | ||||||||||
| Tier 1 | 120.24 | 70.50 | 49.64 | 21 | 3 |
| Tier 2 | 158.68 | 97.13 | 65.77 | 23 | 1 | |||||
| Tier 3 | 128.19 | 96.24 | 59.31 | 26 | 3 | |||||
| Tier 4 | 150.28 | 109.86 | 66.16 | 25 | 2 | |||||
| Tier 5 | 140.26 | 0.041 | 114.53 | <0.001 | 68.66 | 0.006 | 21 | 0.256 | 1 | 0.145 |
| Program setting | ||||||||||
| Community-based | 123.99 | 133.06 | 84.17 | 17 | 2 | |||||
| Community-based, university-affiliated | 141.91 | 100.55 | 61.78 | 23 | 2 | |||||
| University-based | 142.02 | 86.77 | 57.20 | 24 | 2 |
| Not available | 94.88 | 57.38 | 34.25 | 24 | 7 | |||||
| Other | n/a | n/a | n/a | Not reported | Not reported | |||||
| Military-based | n/a | 0.386 | n/a | <0.001 | n/a | 0.001 | Not reported | 0.102 | Not reported | 0.716 |
| Program size | ||||||||||
| Small (1-3 positions) | 167.41 | 125.84 | 74.36 | 22 | 3 | |||||
| Medium (4-6 positions) | 137.30 | 91.55 | 59.89 | 23 | 2 |
| Large (7+ positions) | 102.34 | 64.24 | 45.62 | 24 | 2 | |||||
| Not reported | n/a | <0.001 | n/a | <0.001 | n/a | <0.001 | Not reported | 0.721 | Not reported | 0.723 |
| Supplemental information | ||||||||||
| Required | 139.86 | 93.77 | 60.99 | 22 | 2 | |||||
| Not required | 138.98 | 96.47 | 61.3 | 24 | 2 | |||||
| Not reported | 127.36 | 0.873 | 106.93 | 0.674 | 63.25 | 0.926 | 22 | 0.592 | 4 | 0.872 |
n/a = not applicable.
Signal density varied significantly by tier (p = 0.006), with Tier 5 recording the highest value (68.66) and Tier 1 the lowest (49.64). Metropolitan programs had a signal density of 58.58, compared with 63.40 for nonmetropolitan programs (p = 0.213). Signal density varied significantly by setting (p = 0.001), being highest at community-based programs (84.17) compared with community-based university-affiliated programs (61.78) and university-based programs (57.20). Signal density also varied significantly by size (p < 0.001), with the highest values at small (1-3 positions) programs (74.36), followed by medium (4-6 positions) programs (59.89) and large (≥7 positions) programs (45.62). Signal density was similar between programs requiring the submission of supplemental information (60.99) and programs with no such requirement (61.30; p = 0.926).
Interview rates for applicants who signaled varied significantly across regions (p = 0.008), ranging from 17% in the South Atlantic to 29% in the West North Central region, but did not vary significantly by program tier (21%-26%), program setting (17%-24%), program size (22%-24%), or metropolitan location (21%-24%). Interview rates for nonsignaling applicants did not vary significantly and ranged from 1% to 7% across all program characteristics.
Discussion
Since the introduction of preference signaling, orthopaedic residency programs included in this study saw a marked reduction in applications, despite a concurrent increase in the number of applicants from 1,282 in 2021 to 1,590 in 202511,12. This trend aligns with previous survey-based studies and Electronic Residency Application Service national statistics4,13. The decline likely reflects program statements and applicant perception that programs seldom extend interviews without a signal, reducing incentive to apply beyond signaled programs. Collectively, these findings suggest that preference signaling has effectively decreased administrative burden and costs for programs and applicants.
However, the decrease in applications has not been uniform. Community-based, lower-ranked, smaller, nonmetropolitan programs demonstrated the most modest reductions, sometimes even increases, while university-based, higher-ranked, larger, metropolitan programs showed the greatest declines.
A previous national survey-based study found a decrease in applications as well as a wider distribution of signals in 2024 than in 2023: 50% of signals were concentrated among 32% of programs in the 2024 cycle vs. just 27% of programs the previous year. However, that study did not stratify results by the characteristics of the programs that received the most signals14.
The 2025 data are the first to include the ratio of signals to applications for individual programs, enabling a more granular understanding of applicant signaling behavior. This study calculates the signal density, a measure of the number of signals a program receives per position offered. Conceptually, applicants may have a greater chance of receiving interview invitations at programs with a lower signal density, although this is one of many factors that must be considered in the application process. The following comparisons across program characteristics may give applicants greater insight into signal allocation strategies.
Differences Between Program Tiers
Program rank appears to have an inverse relationship with signal density. Tier 1 programs demonstrated both the lowest signal and application densities, while Tier 5 programs had the highest values of each (Fig. 1). This may reflect the highly competitive landscape of the match: less competitive applicants may preferentially signal Tier 4 and Tier 5 programs rather than expend signals on programs perceived as out of reach, while more competitive applicants may distribute signals across multiple tiers, leading to a concentration of applications and signals at lower-ranked programs. Furthermore, a greater proportion of applications to Tier 4 and Tier 5 programs are submitted without signals (Fig. 1), which may reflect applicants’ perception that lower-tier programs serve as “safety” options where interviews could be obtained without signaling, thereby increasing application density along with signal density at these programs. These data should be interpreted with caution, however, given that a lower proportion of Tier 5 programs was included (42.9%) compared with Tier 1 programs (64.3%), raising the possibility of reporting bias.
Fig. 1.

Comparison of application density in 2021 and 2025 as well as signal density in 2025 by program tier.
Differences Between Program Settings
Similarly, community-based programs demonstrated the highest signal and application densities, followed by community-based university-affiliated programs and university-based programs. There is likely overlap with program rank, as the Doximity system relies on physician surveys which may favor university-based programs with greater name recognition and higher research output. Thus, the clustering of applications as described above may help explain this phenomenon. Community-based programs represented only 7% of available positions and had the lowest rate of available signal data, which may exaggerate these trends. Nevertheless, community-based programs have attracted more applications in the signaling era and consequently face a disproportionately higher administrative burden than their counterparts.
Differences Between Program Sizes
An inverse relationship was observed between signal density and program size. Small programs had the highest signal and application densities, whereas large programs had the lowest (Fig. 2). Larger programs may be ranked higher due to having more graduates and greater likelihood of being university-based. However, the contrast in signal density between small and large programs was greater than that observed across program rank or setting.
Fig. 2.

Comparison of application density in 2021 and 2025 as well as signal density in 2025 by program size (number of positions offered).
This pattern suggests that applicants may not prioritize a higher number of available positions when selecting programs. Concentrating signals on smaller programs may limit the total number of positions an applicant is competing for, potentially placing them at a disadvantage. It should be noted that data availability differed between groups, with 70.4% of large programs included in the analysis compared with 52.7% of small and 53.4% of medium programs.
Differences Between Program Locations
There was considerable variation in signal and application density across geographical regions. In the 2025 cycle, the West North Central region had the lowest application and signal densities, and the South Atlantic region had the highest for both (Fig. 3). Metropolitan programs demonstrated lower application densities compared with nonmetropolitan programs (p = 0.028).
Fig. 3.

Signal density by ERAS region. ERAS = Electronic Residency Application Service.
A recent study found that programs in the southern United States were more likely to exhibit “pipelining”, or recruiting a high proportion of residents from the same medical schools14. Regional variation may therefore reflect this pipelining phenomenon: students from program-favored medical schools might preferentially apply to and signal these programs, while other applicants also target them, concentrating applications and signals.
Another factor may be the geographic distribution of applicants, as 50% to 60% of orthopaedic residents train at programs within the same region as their medical school15,16. A 2023 study of the Fellowship and Residency Electronic Interactive Database (FREIDA) examined the medical school regions of a representative sample of orthopaedic residents17. It found that while 19.8% of residents graduated from Southeast medical schools, only 16.8% trained in Southeast residency programs. By contrast, 8.8% of residents came from Pacific West medical schools, yet Pacific West residency programs accounted for 12.6% of residents. This imbalance suggests that in the Southeast, there are more medical school graduates than available residency positions, which may result in a greater proportion of Southeast applicants matching outside of their home region despite heavier use of local application pipelines, although it is important to note that the FREIDA database defines regions differently than the AAMC. Similarly, applicants’ proximity to large metropolitan areas may influence their likelihood of applying to and signaling metropolitan programs, but this information is not publicly available.
These findings should be interpreted with caution. Only 36.8% of programs in the South Atlantic region had available data, which may not be representative of the region as a whole. Furthermore, geographic patterns likely overlap with other characteristics such as program size, tier, and university affiliation. From the available data, applicants may benefit from considering geographic variation, the location of their own medical school, and program-specific practices such as pipelining when allocating signals.
Interview Rates by Program Characteristics
Overall, nearly 1 in 4 applicants who signaled a program received an interview, compared with 1 in 50 who did not, which is consistent with previously reported data18. This demonstrates that programs strongly favor signals and that applications without them rarely yield interviews. These findings align with surveys of program director and applicant perceptions and indicate that signals are being used to limit administrative burden3,4. This pattern held across program rank, setting, size, and proximity to major metropolitan areas.
Taken together, these results indicate that applicants should not expect meaningful interview opportunities from programs they do not signal. Given that successful applicants average 11 to 12 interview invitations, choosing 30 signals carefully is essential19.
Effect of Supplemental Information Requirements
There were no significant differences in signal or application density between programs that did and did not require supplemental information such as essays. This suggests that supplemental requirements do not deter applicants. Previously seen as a way to reduce application volume, supplemental applications may no longer serve this function20.
Strengths and Limitations
This study leverages a large, publicly available AAMC data set to provide the first program-level analysis of preference signal distribution in orthopaedic surgery. By evaluating programs across rank, size, setting, and geography, it offers applicants and programs a more transparent view of how signals are distributed in the residency match. Importantly, the analysis highlights disparities in the residency landscape, with university-based, higher-tier, larger, metropolitan programs tending to have lower signal and application densities than community-based, lower-tier, smaller, nonmetropolitan programs. These findings extend beyond prior survey-based work by incorporating multiple years of data from over half of the programs and introducing signal density to guide applicant strategy.
This study has several limitations. First, just over half of the programs had available signaling data, disproportionately underrepresenting community-based and lower-tier programs. Nonetheless, as compared with previously available national and survey-based data, this is the first time program-specific information has been available for the majority of programs. Second, applicant-level factors, including geographic distribution, competitiveness, and signaling rationale, were not available, but likely play a major role in the observed trends. In addition, signaling data were only available beginning with the 2025 cycle, preventing assessment of year-to-year variability in these trends. Program size was also calculated using 2025 data, including when calculating the 2021 application density, as program-specific 2021 data were not readily available. Although the Doximity ranking system offers an incomplete and limited measure of program comparison, it was used as the only publicly accessible ranking currently available for residency programs. Finally, the observational nature of the analysis precludes any conclusions about causation. More comprehensive data are needed to enable deeper analyses and to address inequities among programs, and applicants should not rely solely on this data set when making signaling decisions.
Conclusion
Since the introduction of preference signaling, the orthopaedic surgery residency match has seen a reduced application volume, with notable variation across program characteristics. University-based, higher-tier, larger, and metropolitan programs had lower application and signal densities, while community-based, lower-tier, smaller, and nonmetropolitan programs reported higher ones. These findings underscore differences in program-level signaling patterns and may help guide both applicants and programs in navigating the evolving residency match.
Footnotes
Investigation performed at the Warren Alpert Medical School of Brown University, Providence, RI
Disclosure: The Disclosure of Potential Conflicts of Interest forms are provided with the online version of the article (http://links.lww.com/JBJSOA/B119).
Contributor Information
Kathryn Segal, Email: kathryn_segal@brown.edu.
Brad Blankenhorn, Email: blankenhorn@gmail.com.
Raymond Hsu, Email: Raymond_hsu@brown.edu.
References
- 1.Weintraub MJ, Patel RV, Singh R, Ahn DB, Mendiratta D, Kaushal NK, Vosbikian MM. Orthopedic surgery residency match trends in 2024: step scores and research on the rise. J Surg Educ. 2025;82(10):103647. [DOI] [PubMed] [Google Scholar]
- 2.Folkman MJ, Mihalic A, Sanford CG. The evolution and impact of preference signaling on orthopaedic residency after two years. JB JS Open Access. 2025;10(3):e25.00036. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 3.Suresh KV, Covarrubias O, Mun F, LaPorte DM, Aiyer AA. Preference signaling survey of program directors-after the match. J Am Acad Orthop Surg. 2024;32(5):220-7. [DOI] [PubMed] [Google Scholar]
- 4.Minhas A, Berkay F, Hudson T, Barry K, Froehle AW, Krishnamurthy A. Perceptions of preference signaling in orthopaedic surgery: a survey of applicants and program directors. J Am Acad Orthop Surg. 2024;32(2):e95-105. [DOI] [PubMed] [Google Scholar]
- 5.Howard C, Martinez VH, Hughes G, Zaheer A, Allen C, Hanson C, Norris B, Checketts JX. Preference signaling in orthopaedic surgery: applicant perspectives and opinions. J Osteopathic Med. 2024;124(12):529-36. [DOI] [PubMed] [Google Scholar]
- 6.Kotlier JL, Mihalic AP, Petrigliano FA, Liu JN. Understanding the match: the effect of signaling, demographics, and applicant characteristics on match success in the orthopaedic residency application process. J Am Acad Orthop Surg. 2024;32(5):e231-39. [DOI] [PubMed] [Google Scholar]
- 7.Sorenson JC, Ryan PM, Dennison JG, Ward RA, Fornfeist DS. The power of preference signaling: a monumental shift in the orthopaedic surgery application process. J Am Acad Orthop Surg. 2025;33(2):51-5. [DOI] [PubMed] [Google Scholar]
- 8.Residency Explorer™ Tool: Home. Available at: https://www.residencyexplorer.org/. Accessed October 29, 2025. [Google Scholar]
- 9.Doximity Residency Navigator. Doximity. Available at: https://www.doximity.com/residency/programs?specialtyKey=bd234238-6960-4260-9475-1fa18f58f092-orthopaedic-surgery&sortByKey=reputation&trainingEnvironmentKey=&intendedFellowshipKey=. Accessed October 30, 2025. [Google Scholar]
- 10.Metropolitan and Micropolitan Statistical Areas Totals: 2020-2021. 2022. Available at: https://www.census.gov/data/tables/time-series/demo/popest/2020s-total-metro-and-micro-statistical-areas.html#v2022. Accessed October 24, 2025. [Google Scholar]
- 11.Gerhold C, Kravchuk P, Shah R, Cannada LK. The perfect orthopedic surgery applicant?: exploring trends in the National Resident Matching Program “Charting Outcomes in the Match” reports. J Surg Educ. 2025;82(9):103616. [DOI] [PubMed] [Google Scholar]
- 12.Match Data. NRMP. 2025. Available at: https://www.nrmp.org/match-data/. Accessed October 27, 2025. [Google Scholar]
- 13.ERAS® Statistics. AAMC. Available at: https://www.aamc.org/data-reports/publication/eras-statistics. Accessed October 24, 2025. [Google Scholar]
- 14.Sparks CA, Contrada EV, Kraeutler MJ, Scillia AJ. Prevalence of pipelining in the United States orthopedic surgery residency match. Cureus. 2025;17(3):e80303. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 15.Lee JJ, Jacome FP, Weintraub M, Kim IE, Cho S, Chopra A, Akwuole F, Lema O, Bakaes Y, Hsu W. Characteristics of US medical schools that produce the highest percentage of orthopaedic surgery residents: analysis of the 2019 to 2023 orthopaedic surgery residency cohort. JB JS Open Access. 2025;10(2):e24.00185. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 16.Cinquegrani E, Giese D, Driscoll J, Basmayor A, Perry M, Thiessen A. Have degree, will travel? Geography and orthopaedic surgery residency match. J Surg Educ. 2025;82(2):103352. [DOI] [PubMed] [Google Scholar]
- 17.Ranson R, Mao H, Saker C, Lehane K, Gianakos A, Stamm M, Mulcahey MK. The demographic make-up of orthopaedic surgery residents in the United States post ACGME merger. J Orthop Exp Innov. 2023;4(1). [Google Scholar]
- 18.Alkhouri S, Michelberger M, Parikh J, Fang CJ, Harris C, Nelson K, Maitra S, Wentz B. The impact of preference signaling on interview invitations and match outcomes in the 2023 to 2024 orthopaedic residency cycle. JB JS Open Access. 2025;10(3):e25.00083. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 19.Chen AF, Secrist ES, Scannell BP, Patt JC. Matching in orthopaedic surgery. J Am Acad Orthop Surg. 2020;28(4):135-44. [DOI] [PubMed] [Google Scholar]
- 20.Lynch CP, Cha EDK, Patt JC, Kogan M. Supplemental essays for residency applications: the modern day typewriter. J Surg Educ. 2022;79(4):896-903. [DOI] [PubMed] [Google Scholar]
