Author manuscript; available in PMC: 2026 Jan 6.
Published in final edited form as: Am J Med. 2024 Apr 30;137(8):784–789. doi: 10.1016/j.amjmed.2024.04.027

Internal Medicine Applicants’ Experiences With Program Signaling and Second Looks: A Multi-Institutional Survey

Katherine R Schafer a, Marc Heincelman b, William Adams c, Reeni A Abraham d, Brian Kwan e, Meghan Sebasky e, Jennifer G Foster f, Matthew Fitz g
PMCID: PMC12767167  NIHMSID: NIHMS2127452  PMID: 38697591

INTRODUCTION

The Association of American Medical Colleges (AAMC) piloted a supplemental application to the standard residency application in the Electronic Residency Application Service (ERAS) during the 2021–2022 residency cycle. Specifically, the supplemental application provided applicants the opportunity to indicate geographic and program preferences.1 During the most recent residency application cycle, internal medicine applicants could signal interest in 7 programs. While the AAMC provides broad findings on the supplemental application from previous application cycles, limited data exist regarding interviews granted from program signaling and applicant perception and use of signaling.1,2 Otolaryngology has shared data on preference signaling, and more recent data for both categorical and preliminary internal medicine applicants suggest that preference and geographic signaling are predictors of receiving an interview.3–6 This study adds to the existing literature on internal medicine applicant experiences and perceptions of program signaling by measuring the association between signals, interview invitations, and applicant rank lists.

In addition to the supplemental application, an all-virtual residency interview process has been another revision to the application cycle since the COVID-19 pandemic. National organization guidelines continue to recommend virtual-only recruitment, citing equity and access for all applicants.7,8 Concerns remain about the limitations of a strictly virtual interview season, including the inability to adequately understand program culture and a lack of geographic familiarity. Applicant opportunities and experiences with second looks since the COVID-19 pandemic are relatively unknown. This study provides a glimpse into the current practice of second looks in the context of AAMC recommendations for virtual recruitment.

METHODS

In response to the COVID-19 pandemic’s impact on the residency application process, 6 US allopathic medical schools collaborated to survey the experience of fourth-year medical students. The schools include both public and private institutions, representing various geographic regions.

Survey Development

Following a literature review focusing on residency advising, the authors added 3 new constructs to their annual survey for the 2022–2023 application cycle: 1) student approaches to and outcomes from program signaling for residency applications; 2) student perceptions of the signaling process; and 3) student use of second look opportunities.9,10 Overall, the survey contained questions that solicited both quantitative and qualitative responses. For the purposes of this report, the authors focused on quantitative responses that were unique to the 2022–2023 match cycle. Qualitative responses addressing separate constructs were not analyzed for this article. Full text of the relevant survey items is available on request.

The authors designed items to address the questions of “reach” programs and rank list prediction, as these topics are frequently discussed in student advising. Regarding the question “Of the reach programs who invited you to interview, how many did you signal through the ERAS Supplemental Application Process?” (Q15), the definition of a reach school was left to the discretion of the lead advisors at their respective institutions but generally suggested the applicant did not meet the average metrics based on published data. For example, 1 school defined a reach program as one in which an applicant’s standardized test scores, grades, and publications do not match the program’s averages. For question 17 (“Of the signaled programs who invited you to interview, how many did you rank in the Top 4 of your submitted rank order list?”), the authors chose the number 4 from the applicant rank order list because previous data showed more than 95% of internal medicine applicants matched to 1 of the top 4 programs on their rank order list.11

Survey Administration

This study was reviewed and approved by all institutions’ Institutional Review Boards (IRB) before beginning any study activities. Students were invited via email to complete the web-based survey during a 3-week period after the applicant rank list submission deadline and prior to the beginning of Match Week 2023. The survey was offered electronically to all categorical internal medicine applicants from the 6 participating medical schools with weekly electronic reminders, and responses were aggregated and deidentified.

Statistical Analysis

Summary statistics are reported as median (with interquartile and full range) for the questions “How many programs did you signal?” (Q13); “Of the programs you signaled, how many programs offered an interview?” (Q14); “Of the reach programs who invited you to interview, how many did you signal through the ERAS Supplemental Application Process?” (Q15); and “Of the signaled programs who invited you to interview, how many did you rank in the Top 4 of your submitted rank order list?” (Q17). Summary frequencies (with proportion) are reported for these questions as well as for the questions “Did you take a 2nd look opportunity?” (Q28); “What was the structure of your 2nd look?” (Q29); and “What was your preference in the structure of the 2nd look?” (Q30). Logistic regression models (with logit links) were used to estimate the overall interview to signal rate (ie, the ratio of Q14/Q13), the reach interview to signal rate (ie, the ratio of Q15/Q13), and the rank to interview rate (ie, the ratio of Q17/Q14). Ordinal logistic regression models (with cumulative logit links) were used to estimate the odds of a higher perception of program signaling as a function of the overall interview to signal rate, reach interview to signal rate, and rank to interview rate. For these models, the assumption of proportional odds was assessed using a score statistic as described by Agresti.12 The analysis was completed using SAS version 9.4 (SAS Institute, Cary, NC).
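For an intercept-only logistic regression on pooled binomial counts, the estimated rate is the pooled proportion, and a Wald interval on the logit scale back-transformed to the probability scale gives the confidence limits. The sketch below illustrates this calculation only; the counts are hypothetical (the published analysis was run in SAS, and the study's underlying per-student counts are not reported here):

```python
import math

def logit_rate_ci(successes, trials, z=1.96):
    """Rate with a logit-scale Wald 95% CI, equivalent to an
    intercept-only logistic regression on pooled binomial data."""
    p = successes / trials
    logit = math.log(p / (1 - p))
    # SE of the intercept on the logit scale for binomial data
    se = math.sqrt(1 / successes + 1 / (trials - successes))
    inv = lambda t: 1 / (1 + math.exp(-t))  # inverse logit
    return p, inv(logit - z * se), inv(logit + z * se)

# Hypothetical pooled counts: 490 interviews offered across 758 signals
p, lo, hi = logit_rate_ci(490, 758)
print(f"rate = {p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```

With these illustrative counts the point estimate lands near the overall interview to signal ratio reported in Table 1, but the interval width depends entirely on the assumed totals.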

RESULTS

Of the 179 categorical internal medicine applicants invited to complete the survey, 120 students responded to at least 1 survey item, for an overall response rate of 67%. The median number of programs to which applicants applied was 40 (range 9–116), the median number of reach programs to which applicants applied was 7 (range 0–60), and the median number of interviews was 16 (range 2–41) (Table 1). Most students (n = 109, 90.8%) answered both questions “How many programs did you signal?” (Q13) and “Of the programs you signaled, how many programs offered an interview?” (Q14). Among these students, the median number of programs signaled was 7 (IQR: 7–7; range: 3–7), and the median number of interviews offered was 5 (IQR: 3–6; range: 0–7). The overall interview to signal ratio (Q14/Q13) was 64.7% (95% CI: 61.1% to 68.0%), though a quarter of students (n = 28/109, 25.6%) received interviews from fewer than half of the programs they signaled. Additionally, among the 98 students (90.0%) who signaled exactly 7 programs, only 11.2% received interviews from all 7. As the interview rate increased, the probability of reporting a more favorable perception of program signaling increased (Figure 1) (χ² [df = 1] = 17.0; P < .001).

Table 1.

Summary Statistics for Overall Application and Program Signaling Items

Item | Valid N | Median (IQR) (Full Range) or % (95% CI)

Total number of applications | 119 | 40 (30–55) (8–116)
Total number of interviews | 118 | 16 (11–21) (2–41)
Total number of reach applications | 105 | 7 (4–13) (0–60)
How many programs did you signal? (Q13) | 110 | 7 (7–7) (3–7)
Of the programs you signaled, how many programs offered an interview? (Q14) | 110 | 5 (3–6) (0–7)
Of the reach programs who invited you to interview, how many did you signal through the ERAS Supplemental Application Process? (Q15) | 111 | 1 (0–2) (0–7)
Of the signaled programs who invited you to interview, how many did you rank in the Top 4 of your submitted rank order list? (Q17) | 101 | 2 (1–3) (0–4)
Overall interview to signal ratio: Q14/Q13 (95% CI) | 109 | 64.7% (61.1%–68.0%)
Reach interview to signal ratio: Q15/Q13 (95% CI) | 109 | 20.7% (17.9%–23.7%)
Rank to interview ratio: Q17/Q14 (95% CI) | 97 | 49.7% (45.1%–54.3%)

Note: Valid N = Number of students used to compute the estimate. Median with interquartile (IQR) and full range are reported for individual survey questions. Proportion with 95% confidence limits are reported for all ratios.

Figure 1. Probability of overall program perception by overall interview to signal rate.
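Curves like those in Figure 1 come from a cumulative-logit (proportional odds) model, in which a single slope shifts all category thresholds together. The sketch below shows the mechanics with hypothetical cut-points and slope, not the fitted values from this study:

```python
import math

def cumulative_probs(x, alphas, beta):
    """Proportional odds model: P(Y <= j | x) = 1/(1 + exp(-(alpha_j + beta*x))).
    One shared slope `beta` across categories; `alphas` are ascending cut-points."""
    cum = [1 / (1 + math.exp(-(a + beta * x))) for a in alphas]
    # Difference successive cumulative probabilities to get per-category ones
    probs, prev = [], 0.0
    for c in cum + [1.0]:
        probs.append(c - prev)
        prev = c
    return probs

# Hypothetical parameters for a 3-category perception outcome
alphas = [-1.0, 1.0]   # cut-points for P(Y<=1) and P(Y<=2)
beta = -2.5            # negative: higher interview rate -> higher category
for rate in (0.2, 0.5, 0.8):
    print(rate, [round(p, 2) for p in cumulative_probs(rate, alphas, beta)])
```

As the interview to signal rate rises, the probability mass shifts toward the most favorable perception category, which is the qualitative pattern the figure reports.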

To evaluate the impact of program signaling on applications to reported reach programs, the authors examined the 90.8% of respondents who answered both “How many programs did you signal?” (Q13) and “Of the reach programs who invited you to interview, how many did you signal through the ERAS Supplemental Application Process?” (Q15). Among these students, the median number of programs signaled was 7 (IQR: 7–7; range: 3–7), and the median number of interviews received from a reach program was 1 (IQR: 0–2; range: 0–7). The reach interview to signal ratio (Q15/Q13) was 20.7% (95% CI: 17.9%–23.7%). Figure 2 shows that as this rate increases, the probability of reporting a more favorable perception of program signaling increases (χ² [df = 1] = 3.9; P = .048).

Figure 2. Probability of overall program perception by reach interview to signal rate.

Table 1 also reports that 97 (80.8%) students answered both questions “Of the programs you signaled, how many programs offered an interview?” (Q14) and “Of the signaled programs who invited you to interview, how many did you rank in the Top 4 of your submitted rank order list?” (Q17). Among these students, the median number of interviews offered was 5 (IQR: 4–6; range: 1–7), and the median number of signaled programs ranked in the students’ top 4 was 2 (IQR: 2–3; range: 0–4). The rank to interview ratio (Q17/Q14) was 49.7% (95% CI: 45.1%–54.3%), and there was no association between this ratio and the odds of reporting a higher perception of program signaling (P = .78).

Finally, Table 2 reports summary frequencies for the questions “Did you take a second look opportunity?” (Q28), “What was the structure of your second look?” (Q29), and “What was your preference in the structure of the second look?” (Q30). Most respondents did not take a second look (n = 79/120, 65.8%). Among the 39 (32.5%) students who took a second look, the most common structure was a visit coordinated by the residency program (n = 17, 43.6%), and the most common preference was a combination of a program-coordinated second look with a visit to the region (n = 14, 35.9%).

Table 2.

Summary Frequencies for Second Look Items

N = 120

Did you take a second look opportunity?
 No 79 (65.8%)
 Yes 39 (32.5%)
 Unknown 2 (1.7%)
What was the structure of your second look? (*N = 39)
 A coordinated visit by the residency program 17 (43.6%)
 Limited to visiting the city/region of the program 10 (25.6%)
 Both 12 (30.8%)
What was your preference in the structure of the second look? (*N = 39)
 I had no preference 4 (10.3%)
 I preferred second looks limited to visiting the city/region of the program 2 (5.1%)
 I preferred second looks coordinated by the residency program 9 (23.1%)
 I preferred a combination of the above 14 (35.9%)
 Unknown 10 (25.6%)

Note: *N = 120 unless otherwise stated.

DISCUSSION

The residency application process continues to evolve, particularly with the transition to virtual interviewing. Notwithstanding certain benefits, including decreased costs to the applicant, drawbacks include the potential exacerbation of application inflation and concerns about applicants’ ability to adequately identify the best programs for their needs.13 Program signaling was intended to address these challenges for both programs and applicants by facilitating prioritized interview invitations as applicants more specifically communicate their initial preferences.14 This study is a step toward discerning how students perceive the benefit of this innovation and suggests that students’ positive impression of signaling most closely correlates with obtaining interviews. This positive impression, however, does not correlate with student rank lists. Additionally, this cross-sectional analysis serves as an initial glimpse into how applicants and programs are approaching second-look opportunities, which can inform organizations on how to augment the benefits of virtual interviewing.

Prior to signaling, internal medicine-matched US MD graduates had a median interview to application ratio of 52% in 2021.15 This ratio dropped to 44% in 2022 and 2023 with the introduction of signaling.11,16 This cohort of matched US MD graduates revealed a comparable median interview to application ratio of 40% (Table 1). Interestingly, the interview to signal ratio (64%) seen in this study, which is similar to data reported in Texas STAR,6,17 suggests a dramatic and positive impact of program signaling. With the median number of applications unchanged from 2022 to 2023, program signaling may contribute value through the signal to interview ratio as well as by limiting application inflation. Nevertheless, a quarter of our respondents received interviews from fewer than half of the programs they signaled. These findings highlight a need for more data-driven advising practices and offer 2 areas of opportunity that could facilitate higher yield advising: 1) defining a desired signal to interview ratio target for applicants, and 2) identifying best practices to support applicants in reaching this target. More transparent information published on residency websites, increased access to school- and national-level match data to support all applicant types, standard utilization of the AAMC Residency Explorer, and participation in novel pilots such as the AMA Alignment Check Index should guide the advising process and lead to more appropriately targeted signaling.

To our knowledge, no published data detail how program signals relate to applicants’ final rank lists. Applicant positive perception of program signaling directly correlated with increasing rates of overall and reach interviews granted, but notably, there was no correlation between perception of program signaling and the number of signaled programs in the top 4 of the rank order list. One hypothesis to explain this pattern is that students found program signaling more helpful when the signals helped secure interviews at programs where they might not have been as competitive. In these data, applicants included approximately half of signaled programs in the top 4 of their rank lists; however, there was a wide range of responses across applicants. This percentage likely underestimates alignment between applicants’ preferred programs and their rank lists, since national guidance discouraged applicants from signaling home institutions and away rotation programs during the 2022–2023 application cycle. This guidance has changed for subsequent application cycles, and it will be important to reassess the rank to interview ratio. The variability in the number of signaled programs included in the applicant top 4, together with the lack of positive correlation between this number and signaling perception, may reflect both the limited program information available to applicants for making ranked preferences before the interview season and changes in preferences during the application cycle. These data suggest programs should exercise caution if using program signals when making final match lists, as initial signal preferences may not correspond to applicants’ final choices.

Importantly, approximately one-third of survey respondents participated in second looks, most often coordinated by residency programs, despite the fact that second looks have been discouraged since the pivot to virtual recruitment.18 Of the respondents participating in second looks, most preferred to combine a program-coordinated second look with a visit to the city/region of the program. This finding suggests that applicants desire more information before making a rank list based solely on virtual interviewing. Some applicants may feel that visiting the city or campus will give them a better sense of program culture. Currently, there are no systems by which to regulate second look visits, either program-coordinated or applicant-driven, which will continue to be a challenge in enforcing recommendations, but early data suggest the benefit (and equity) of in-person visits after programs finalize their rank lists.19

This study has some limitations. First, while students from all participating schools were offered an opportunity to meet with a residency advisor in preparation for their application, not all schools required a meeting. Additionally, even with a designated advisor at a particular school, students often receive advice from multiple sources within and across institutions, which could influence a number of the variables we examined. Second, although this study had a good response rate from a diverse group of US allopathic schools, generalizability is limited by the sample size; the perspectives may not be representative of international medical and osteopathic graduates. Given the data presented by the AAMC showing that signaling a program increases the probability of an interview invitation, we did not include this question in the survey.20 However, it may be helpful for advisors, students, and programs to determine any correlation between program signaling, rank list, and match, particularly when considering an applicant’s approach to reach programs. Last, it is difficult to build a comprehensive understanding of the interaction between signaling and “reach” schools since the definition of a “reach” program is inherently subjective. Our method for determining a “reach” program was an attempt to use objective data, but this approach does not account for student perception.

For future application cycles, further research is needed on the value of program signaling to applicants and programs, especially as it applies to international and osteopathic medical graduates, preliminary internal medicine applicants, and ultimate match outcomes. To improve resource utilization, data-driven innovations to ascertain the best applicant-program pairings early in the application cycle should be widely implemented and simultaneously assessed. Further guidance around equitable practices for second looks is necessary since many students and programs are pursuing these opportunities despite current recommendations. Additional research and data sharing will facilitate greater generalizability for all participants in the internal medicine residency application process.

PERSPECTIVES VIEWPOINTS.

  • Students received varying numbers of interviews from program signaling suggesting opportunities for improvement for both students and advisors.

  • Students’ perceptions of program signaling more closely correlated to number of interviews granted rather than final place on their rank list.

  • Despite concern from national educational organizations in internal medicine, approximately one-third of students participated in second look opportunities during the 2023 application cycle.

ACKNOWLEDGMENTS

The authors would like to thank the team from Texas STAR database for allowing us to compare data.

The project described was partially supported by the National Institutes of Health, Grant UL1TR001442. The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH.

Funding:

There was no internal or external funding for this study.

Footnotes

Conflict of Interest: The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References
