Abstract
Background
Internal medicine (IM) residency programs receive information about applicants via academic transcripts, but studies demonstrate wide variability in satisfaction with, and the usefulness of, this information. In addition, many studies compare application materials to only 1 or 2 assessment metrics, usually standardized test scores and work-based observational faculty assessments.
Objective
We sought to determine which application materials best predict performance across a broad array of residency assessment outcomes generated by standardized testing and a yearlong IM residency ambulatory long block.
Methods
In 2019, we analyzed available Electronic Residency Application Service data for 167 categorical IM residents, including advanced degree status, research experience, failures during medical school, undergraduate medical education award status, and United States Medical Licensing Examination (USMLE) scores. We compared these with post-match residency multimodal performance, including standardized test scores and faculty member, peer, allied health professional, and patient-level assessment measures.
Results
In multivariate analyses, USMLE Step 2 Clinical Knowledge (CK) scores were most predictive of performance across all residency domains measured. Having an advanced degree was associated with higher patient-level assessments (eg, physician listens, physician explains). USMLE Step 1 scores were associated with in-training examination scores only. None of the other measured application materials predicted performance.
Conclusions
USMLE Step 2 CK scores were the strongest predictors of residency performance across a broad array of performance measurements generated by standardized testing and an IM residency ambulatory long block.
What was known and gap
Use of USMLE Step 1 scores as an applicant selection tool for residency programs has increased, but studies across many specialties demonstrate wide variability in the usefulness of application information for selecting residents during the application process.
What is new
An analysis compared Electronic Residency Application Service (ERAS) data for categorical internal medicine (IM) residents with a robust program of assessment of resident performance during a yearlong ambulatory long block.
Limitations
The study was completed at a single institution, and all clinical performance was measured within a unique IM residency ambulatory long block structure, limiting generalizability.
Bottom line
The USMLE Step 2 CK was the best predictor of performance on standardized testing during and after residency, as well as of clinical performance from multiple perspectives during a yearlong ambulatory long block continuity experience.
Introduction
Internal medicine (IM) residency programs receive large amounts of information about applicants, including academic transcripts, the Medical Student Performance Evaluation (MSPE), letters of recommendation (LORs), and United States Medical Licensing Examination (USMLE) scores. However, studies across many specialties demonstrate wide variability in satisfaction with, and the usefulness of, this information in selecting residents during the application process.1–5 The MSPE, despite recent efforts at improvement, often lacks transparency and standardization, making it difficult to interpret during the selection process.6,7 Evidence on the predictive value of LORs is mixed. One small study showed that successful residents had more LOR comments about excellence in the Accreditation Council for Graduate Medical Education core competency areas of patient care, medical knowledge, and interpersonal and communication skills,8 but other studies found little value in LORs for residency selection or for predicting resident performance.1,9,10
Because of these issues, use of USMLE Step 1 scores as a prominent applicant selection tool has intensified in recent years.11 Most studies show that USMLE Step 1 largely predicts future test scores, such as in-training examinations (ITEs) and specialty board examinations, but not competency domains such as communication, teamwork, and professionalism.12–18 Studies that have shown a connection between USMLE Step 1 and global performance generally report weak associations,19,20 rely on a limited scope of comparisons (ie, faculty assessment only),21 or come from fields other than IM.19–21 Despite the heavy reliance on USMLE Step 1 scores, recent studies suggest that USMLE Step 2 Clinical Knowledge (CK) may actually be a better predictor of ITE scores and overall resident performance.22–25
Many of these studies compare application materials to only 1 or 2 other assessment metrics, usually standardized test scores and work-based observational faculty assessments. We believe these limited forms of assessment, while valuable, are not enough to fully capture a physician's competence.26 At the University of Cincinnati, we created a robust program of assessment,26 consisting of multimodal performance data including faculty member, peer, allied health professional, and patient-level assessment, as well as standardized test scores.27,28 In this retrospective study, we examine which application materials best predict performance across this broader array of residency assessment outcomes.
Methods
The University of Cincinnati IM Residency Program is based in an urban academic medical center. Categorical IM classes consist of approximately 25 residents who are accepted through the National Resident Matching Program (NRMP). The program director (PD) and 2 faculty members interview each resident during the recruitment season. The entirety of each application is reviewed by the interviewers. Information gathered from this process is submitted to the residency selection committee to develop a rank list for submission to the NRMP.
Inclusion criteria for this study consisted of categorical residents who matriculated to our program from 2007 to 2014 (167 total). Final analysis of the data was conducted in 2019. Applicants were excluded if they were preliminary residents, clinical scientist track program residents, part of combined programs (eg, IM–pediatrics) or had transferred from another program after their first year. We analyzed selected Electronic Residency Application Service (ERAS) data, including the presence of an advanced degree, the number of research experiences (defined here as publications and posters), the presence of failures during medical school (reported examinations, clerkships, basic science courses, or USMLE), Alpha Omega Alpha Honor Medical Society awards, Humanism in Medicine awards, or other undergraduate medical education (UME) award status, and USMLE Step 1 and 2 CK scores. We excluded class rank and clerkship grades because of the extreme variability in the way these are determined among medical schools (including some schools that use pass-fail for these measures), making direct comparison difficult.29 We also chose not to include medical school strength as we did not have a standardized way of determining this.
We measured residency performance in several ways. First, we included a multisource assessment that residents receive at the end of a yearlong ambulatory long block27,28 that spans parts of their second and third years of residency. This assessment contains quantitative and narrative feedback from attending physicians, peers, nurses, and allied health professionals in the ambulatory practice. In the long block 360-degree ratings, each resident received approximately 50 global ratings per half-year in the domains of patient care, teamwork, professionalism, and efficiency, and these scores were averaged to produce a composite measure of overall performance and class ranking. All raters used the same anonymous reporting system, with each category ranging from 1 (poor) to 5 (superior). We made no accounting for the relative contribution of assessment volume each rater delivered for a given resident. Second, we included a minimum of 25 direct patient assessments of resident performance per resident during the ambulatory long block experience using the physician communication score subset of the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) ambulatory survey.30 Patients assessed residents on 7 physician attributes (physician explains, physician listens, physician gives instructions, physician knows history, physician respects patient, physician is on time, and physician calls with results), using a 6-point scale (1, never, to 6, always) for each behavior. We also included the American College of Physicians ITE scores and the American Board of Internal Medicine (ABIM) certification status on first attempt.
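As a rough illustration of how a composite could be derived from such multisource ratings, the sketch below (not the program's actual tooling; the long-format data layout and column names are assumed) averages each resident's ratings by domain, averages the domains into an overall score, and ranks the class:

```python
# Minimal sketch, assuming a hypothetical long-format table of 360-degree ratings:
# one row per rating event, scored 1 (poor) to 5 (superior).
import pandas as pd

ratings = pd.DataFrame({
    "resident": ["A", "A", "B", "B"],
    "domain": ["patient_care", "teamwork", "patient_care", "teamwork"],
    "score": [4.5, 4.0, 3.8, 4.2],
})

# Average ratings per resident and domain, then roll domains up into an overall score.
domain_means = ratings.groupby(["resident", "domain"])["score"].mean().unstack()
domain_means["overall"] = domain_means.mean(axis=1)

# Class rank: 1 = highest overall composite score.
domain_means["class_rank"] = domain_means["overall"].rank(ascending=False).astype(int)
print(domain_means)
```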
We used descriptive statistics, including means and medians, to summarize the data. For continuous outcomes, univariate linear regression models were used to determine the relationship between the outcome and other potential covariates. All covariates were considered for inclusion in the multivariable linear regression models and were removed by backward elimination using the stepwise method. Only the covariates that were significant at a P value of < .10 were included in the final models. For dichotomous outcomes, logistic regression models were developed using the same methods. All analyses were performed using SAS 9.4 (SAS Institute Inc, Cary, NC).
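The analyses themselves were performed in SAS 9.4. For readers who want a concrete picture of the univariate-then-backward-elimination approach described above, the following is a minimal, illustrative sketch in Python with statsmodels; the data file and variable names are hypothetical, and sm.Logit would replace sm.OLS for dichotomous outcomes such as first-attempt ABIM passage.

```python
# Illustrative sketch only (the study used SAS 9.4): fit a multivariable linear
# model and prune covariates by backward elimination at a retention threshold of P < .10.
import pandas as pd
import statsmodels.api as sm

P_STAY = 0.10  # covariates must meet this P value to remain in the final model

def backward_eliminate(df, outcome, covariates):
    """Drop the least significant covariate until all remaining terms have P < P_STAY."""
    kept = list(covariates)
    while kept:
        model = sm.OLS(df[outcome], sm.add_constant(df[kept])).fit()
        pvals = model.pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] < P_STAY:
            return model          # every remaining covariate meets the threshold
        kept.remove(worst)        # remove the weakest covariate and refit
    return None                   # no covariate met the threshold

# Hypothetical numeric columns for illustration only
covariates = ["usmle_step1", "usmle_step2", "advanced_degree",
              "ume_award", "any_research", "any_failure"]
# df = pd.read_csv("residents.csv")
# final_model = backward_eliminate(df, "long_block_overall", covariates)
# print(final_model.summary())
```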
The University of Cincinnati Institutional Review Board approved this study.
Results
Among 167 residents, 20 (12%) had an advanced degree, 9 (5%) had a UME award, and 23 (14%) had a failure in medical school. The mean USMLE score was 218.5 for Step 1 and 230.8 for Step 2 CK.
Table 1 shows the relationship between ERAS material (USMLE scores, advanced degrees, awards, research, presence of failures in medical school) and the long block 360-degree ratings. Although the univariate analysis demonstrated several associations, in the multivariate analysis, only USMLE Step 2 CK scores were significantly associated with all modes of the long block faculty/peer/staff multisource assessment ratings (higher scores were associated with higher ratings).
Table 1. Relationship Between ERAS Application Materials and Long Block 360-Degree Ratings

Note: the patient care, teamwork, professionalism, and efficiency domains are composite ratings from faculty members, peers, and staff.

Univariate Analysis

| Variable | Patient Care: Coefficient | P Value | Teamwork: Coefficient | P Value | Professionalism: Coefficient | P Value | Efficiency: Coefficient | P Value | Overall Long Block Score: Coefficient | P Value | Long Block Class Rank: Coefficient | P Value |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| USMLE Step 1 score | -0.004 | .036 | 0.002 | .09 | 0.003 | .023 | 0.002 | .10 | 0.003 | .035 | -0.068 | .012 |
| USMLE Step 2 score | 0.004 | .001 | 0.003 | .04 | 0.003 | .009 | 0.004 | .002 | 0.003 | .004 | -0.076 | .008 |
| Presence of advanced degree (eg, PhD, MPH) | 0.057 | .47 | 0.134 | .11 | 0.057 | .54 | -0.008 | .93 | -0.019 | .82 | -0.672 | .70 |
| Undergraduate award (eg, AOA, Humanism in Medicine) | 0.076 | .49 | -0.054 | .65 | -0.055 | .67 | 0.047 | .71 | 0.044 | .68 | -1.026 | .41 |
| Listing any research experience | 0.153 | .043 | 0.047 | .56 | 0.154 | .08 | 0.285 | .001 | 0.186 | .017 | -1.925 | .25 |
| Listing greater than 5 research experiences | -0.135 | .08 | -0.112 | .17 | -0.129 | .16 | -0.050 | .58 | -0.067 | .39 | 0.830 | .62 |
| Presence of any failure in medical school | -0.105 | .17 | -0.156 | .05 | -0.056 | .53 | -0.079 | .36 | -0.049 | .54 | 2.633 | .11 |

Multivariable Analysis

| Variable | Patient Care: Coefficient | P Value | Teamwork: Coefficient | P Value | Professionalism: Coefficient | P Value | Efficiency: Coefficient | P Value | Overall Long Block Score: Coefficient | P Value | Long Block Class Rank: Coefficient | P Value |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| USMLE Step 2 score | 0.00323 | .009 | 0.00278 | .044 | 0.00309 | .012 | 0.00418 | .002 | 0.00345 | .004 | -0.07579 | .008 |
| Undergraduate award (eg, AOA, Humanism in Medicine) | 0.07306 | .043 | | | | | | | | | | |
| Listing any research experience | 0.15036 | .09 | | | | | | | | | | |
| Listing greater than 5 research experiences | 0.06962 | .07 | -0.14445 | .044 | | | | | | | | |
| R square | 0.1151 | | 0.0312 | | 0.0815 | | 0.0942 | | 0.0724 | | 0.0621 | |
| Mean | 4.24 | | 4.26 | | 4.31 | | 4.14 | | 4.19 | | 11.84 | |
| Range | 2.83–4.76 | | 3.13–4.85 | | 1.96–4.83 | | 2.54–4.76 | | 3.19–4.69 | | 1–24 | |
Abbreviations: USMLE, United States Medical Licensing Examination; AOA, Alpha Omega Alpha.
Table 2 shows the relationship between ERAS application materials and patient ratings. In the multivariate analysis, higher USMLE Step 2 CK scores and having an advanced degree were associated with higher scores on all patient-derived ratings.
Table 2. Relationship Between ERAS Application Materials and Patient (HCAHPS) Ratings of Physician Communication

Univariate Analysis

| Variable | Physician Explains: Coefficient | P Value | Physician Listens: Coefficient | P Value | Physician Gives Instructions: Coefficient | P Value | Physician Knows History: Coefficient | P Value | Physician Respects Patient: Coefficient | P Value | Physician Is on Time: Coefficient | P Value | Physician Calls With Results: Coefficient | P Value |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| USMLE Step 1 score | 0.004 | .26 | 0.004 | .19 | 0.003 | .29 | 0.004 | .23 | 0.004 | .26 | 0.004 | .28 | 0.004 | .26 |
| USMLE Step 2 score | 0.007 | .031 | 0.008 | .024 | 0.007 | .029 | 0.008 | .020 | 0.007 | .035 | 0.007 | .05 | 0.008 | .024 |
| Presence of advanced degree (eg, PhD, MPH) | 0.408 | .06 | 0.434 | .049 | 0.441 | .044 | 0.424 | .056 | 0.426 | .054 | 0.459 | .042 | 0.478 | .037 |
| Undergraduate award (eg, AOA, Humanism in Medicine) | 0.185 | .50 | 0.136 | .64 | 0.156 | .58 | 0.165 | .57 | 0.176 | .54 | 0.173 | .56 | 0.065 | .83 |
| Listing any research experience | 0.101 | .61 | 0.048 | .82 | 0.019 | .93 | 0.067 | .75 | 0.012 | .95 | 0.079 | .70 | 0.052 | .81 |
| Listing greater than 5 research experiences | 0.257 | .19 | 0.205 | .32 | 0.187 | .36 | 0.251 | .22 | 0.182 | .37 | 0.231 | .27 | 0.251 | .24 |
| Presence of any failure in medical school | -0.424 | .026 | -0.471 | .017 | -0.442 | .024 | -0.381 | .06 | -0.446 | .024 | -0.451 | .025 | -0.196 | .34 |

Multivariable Analysis

| Variable | Physician Explains: Coefficient | P Value | Physician Listens: Coefficient | P Value | Physician Gives Instructions: Coefficient | P Value | Physician Knows History: Coefficient | P Value | Physician Respects Patient: Coefficient | P Value | Physician Is on Time: Coefficient | P Value | Physician Calls With Results: Coefficient | P Value |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| USMLE Step 2 score | 0.008 | .018 | 0.008 | .013 | 0.008 | .016 | 0.009 | .011 | 0.008 | .020 | 0.007 | .029 | 0.009 | .013 |
| Presence of advanced degree (eg, PhD, MPH) | 0.441 | .037 | 0.468 | .033 | 0.470 | .029 | 0.457 | .039 | 0.451 | .041 | 0.472 | .033 | 0.534 | .022 |
| R square | 0.0771 | | 0.0828 | | 0.0817 | | 0.0829 | | 0.0742 | | 0.0721 | | 0.0879 | |
| Mean | 5.32 | | 5.37 | | 5.37 | | 5.24 | | 5.42 | | 5.33 | | 4.94 | |
| Range | 3.52–6.00 | | 3.52–6.00 | | 3.50–6.00 | | 3.32–6.00 | | 3.54–6.00 | | 3.31–6.00 | | 2.71–6.00 | |
Abbreviations: USMLE, United States Medical Licensing Examination; AOA, Alpha Omega Alpha.
In Table 3, the multivariate analysis shows that higher USMLE Step 1 scores were associated with higher ITE scores but not with ABIM passage, whereas higher USMLE Step 2 CK scores were associated with all testing measures. For every point increase in USMLE Step 2 CK score, the odds of passing the ABIM increased by 6.9%.
Table 3. Relationship Between ERAS Application Materials and Standardized Test Performance (ITE Scores and First-Attempt ABIM Certification)

Univariate Analysis

| Variable | ITE 1: Coefficient | P Value | ITE 2: Coefficient | P Value | ITE 3: Coefficient | P Value | Pass ABIM: Odds Ratio | P Value |
|---|---|---|---|---|---|---|---|---|
| USMLE Step 1 score | 0.759 | < .0001 | 0.809 | < .0001 | 0.765 | < .0001 | 1.033 | .021 |
| USMLE Step 2 score | 0.941 | < .0001 | 0.977 | < .0001 | 0.828 | < .0001 | 1.069 | .001 |
| Presence of advanced degree (eg, PhD, MPH) | -2.950 | .66 | -10.838 | .12 | -15.722 | .05 | 0.744 | .72 |
| Undergraduate award (eg, AOA, Humanism in Medicine) | 20.063 | .033 | 20.939 | .031 | 24.008 | .021 | 0.812 | .85 |
| Listing any research experience | 5.091 | .44 | 2.304 | .72 | 6.883 | .36 | 1.250 | .78 |
| Listing greater than 5 research experiences | 4.421 | .51 | -3.200 | .65 | 1.276 | .88 | 1.857 | .56 |
| Presence of any failure in medical school | -13.221 | .036 | -17.224 | .009 | -12.500 | .10 | 0.333 | .09 |

Multivariable Analysis

| Variable | ITE 1: Coefficient | P Value | ITE 2: Coefficient | P Value | ITE 3: Coefficient | P Value | Pass ABIM: Odds Ratio | P Value |
|---|---|---|---|---|---|---|---|---|
| USMLE Step 1 score | 0.240 | .020 | 0.271 | .008 | 0.332 | .023 | | |
| USMLE Step 2 score | 0.712 | < .0001 | 0.722 | < .0001 | 0.524 | .001 | 1.069 | .001 |
| R square | 0.5474 | | 0.5716 | | 0.389 | | | |
| C statistic | | | | | | | 0.821 | |
Abbreviations: ITE, in-training examination; ABIM, American Board of Internal Medicine; USMLE, United States Medical Licensing Examination; AOA, Alpha Omega Alpha.
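For interpretation, the odds ratio of 1.069 reported above corresponds to the following arithmetic; the 10-point extrapolation is illustrative only and assumes the standard multiplicative reading of a logistic regression odds ratio.

```latex
% Per-point interpretation of the logistic regression odds ratio for USMLE Step 2 CK
\mathrm{OR}_{\text{per point}} = e^{\hat{\beta}} = 1.069
\quad\Longrightarrow\quad
\text{the odds of passing the ABIM rise by } 6.9\% \text{ per additional Step 2 CK point.}

% A k-point difference multiplies the odds by OR^k; for example, k = 10:
1.069^{10} \approx 1.95
\quad\Longrightarrow\quad
\text{roughly a doubling of the odds across a 10-point score difference.}
```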
Discussion
Our study shows that USMLE Step 2 CK performance correlates not only with test scores throughout residency and beyond, but also with assessments of clinical competence from multiple perspectives during a yearlong ambulatory long block. USMLE Step 1 correlated only with ITE scores. Having an advanced degree was associated with higher patient communication scores, but none of the other measures, including UME awards or the presence of research experience, were significant predictors of any outcome in the multivariate analysis.
Much of the current residency performance prediction literature compares information in application materials to clinical performance measured with faculty rating scales and/or standardized tests. We expanded on this by adding nonfaculty ratings and patient evaluations derived from a unique yearlong ambulatory experience. Our data add to the growing body of literature suggesting that USMLE Step 2 CK may be a better predictor of resident performance than USMLE Step 1.22,24,31–34 These findings may reflect USMLE Step 2 CK being more clinically relevant, or being taken closer in time to residency graduation and board examinations. The association between having an advanced degree and higher patient communication scores may reflect greater life experience, greater maturity, and/or prior completion of a rigorous training program.
Despite the evidence for USMLE Step 2 CK, USMLE Step 1 scores continue to be one of the most frequently cited factors used by residency programs in selecting applicants for interviews, although the available evidence suggests residency programs may do better by giving more weight to USMLE Step 2 CK in the application process.22,24,31–34
A major limitation of our study is that it was completed at a single institution, and all clinical performance was measured only within a unique IM residency ambulatory long block structure. In addition, the staff, peer, and allied health assessment tools used in our program did not have significant supporting validity evidence. We did not weight individual medical school application items, preferring a present/absent accounting (eg, a failure in medical school could have been something as small as a shelf examination or as large as an entire year). Because of the difficulty of direct comparison, we did not include medical school strength or commonly reported ERAS materials such as class rank and clerkship grades in the analysis. The multisource evaluation was anonymous, so we could not determine the relative contribution of each type of rater for any given resident. Residents were ranked on application data prior to matching, so there is selection bias in the sample. Finally, no patient-level outcomes data were included, and we did not analyze the rich content of the narratives that accompany these assessments.
Future research should seek to understand why USMLE Step 2 CK may be a better predictor of residency success, identify the best strategies for applicants and programs to use USMLE Step 2 CK in residency selection, and determine whether the presence of advanced degrees is associated with higher patient-derived communication scores in other settings and specialties.
Conclusion
We found that USMLE Step 2 CK was the best predictor of IM residency performance on standardized testing during and after residency, as well as of clinical performance from multiple perspectives during a yearlong ambulatory long block continuity experience.
References
1. Fortune JB. The content and value of letters of recommendation in the resident candidate evaluative process. Curr Surg. 2002;59(1):79–83. doi:10.1016/s0149-7944(01)00538-4
2. Janis JE, Hatef DA. Resident selection protocols in plastic surgery: a national survey of plastic surgery program directors. Plast Reconstr Surg. 2008;122(6):1929–1939. doi:10.1097/PRS.0b013e31818d20ae
3. Nallasamy S, Uhler T, Nallasamy N, Tapino PJ, Volpe NJ. Ophthalmology resident selection: current trends in selection criteria and improving the process. Ophthalmology. 2010;117(5):1041–1047. doi:10.1016/j.ophtha.2009.07.034
4. Harfmann KL, Zirwas MJ. Can performance in medical school predict performance in residency? A compilation and review of correlative studies. J Am Acad Dermatol. 2011;65(5):1010–1022.e1012. doi:10.1016/j.jaad.2010.07.034
5. Chole RA, Ogden MA. Predictors of future success in otolaryngology residency applicants. Arch Otolaryngol Head Neck Surg. 2012;138(8):707–712. doi:10.1001/archoto.2012.1374
6. Andolsek KM. Improving the medical student performance evaluation to facilitate resident selection. Acad Med. 2016;91(11):1475–1479. doi:10.1097/ACM.0000000000001386
7. Boysen Osborn M, Mattson J, Yanuck J, Anderson C, Tekian A, Fox JC, et al. Ranking practice variability in the medical student performance evaluation: so bad, it's "good". Acad Med. 2016;91(11):1540–1545. doi:10.1097/ACM.0000000000001180
8. Stohl HE, Hueppchen NA, Bienstock JL. The utility of letters of recommendation in predicting resident success: can the ACGME competencies help? J Grad Med Educ. 2011;3(3):387–390. doi:10.4300/JGME-D-11-00010.1
9. Boyse TD, Patterson SK, Cohan RH, Korobkin M, Fitzgerald JT, Oh MS, et al. Does medical school performance predict radiology resident performance? Acad Radiol. 2002;9(4):437–445. doi:10.1016/s1076-6332(03)80189-7
10. DeZee KJ, Thomas MR, Mintz M, Durning SJ. Letters of recommendation: rating, writing, and reading by clerkship directors of internal medicine. Teach Learn Med. 2009;21(2):153–158. doi:10.1080/10401330902791347
11. Moynahan KF. The current use of United States Medical Licensing Examination Step 1 scores: holistic admissions and student well-being are in the balance. Acad Med. 2018;93(7):963–965. doi:10.1097/ACM.0000000000002101
12. Shellito JL, Osland JS, Helmer SD, Chang FC. American Board of Surgery examinations: can we identify surgery residency applicants and residents who will pass the examinations on the first attempt? Am J Surg. 2010;199(2):216–222. doi:10.1016/j.amjsurg.2009.03.006
13. Fryer JP, Corcoran N, George B, Wang E, Darosa D. Does resident ranking during recruitment accurately predict subsequent performance as a surgical resident? J Surg Educ. 2012;69(6):724–730. doi:10.1016/j.jsurg.2012.06.010
14. Kenny S, McInnes M, Singh V. Associations between residency selection strategies and doctor performance: a meta-analysis. Med Educ. 2013;47(8):790–800. doi:10.1111/medu.12234
15. Kay C, Jackson JL, Frank M. The relationship between internal medicine residency graduate performance on the ABIM certifying examination, yearly in-service training examinations, and the USMLE Step 1 examination. Acad Med. 2015;90(1):100–104. doi:10.1097/ACM.0000000000000500
16. Neely D, Feinglass J, Wallace WH. Developing a predictive model to assess applicants to an internal medicine residency. J Grad Med Educ. 2010;2(1):129–132. doi:10.4300/JGME-D-09-00044.1
17. Stohl HE, Hueppchen NA, Bienstock JL. Can medical school performance predict residency performance? Resident selection and predictors of successful performance in obstetrics and gynecology. J Grad Med Educ. 2010;2(3):322–326. doi:10.4300/JGME-D-09-00101.1
18. Wagner JG, Schneberk T, Zobrist M, Hern HG, Jordan J, Boysen-Osborn M, et al. What predicts performance? A multicenter study examining the association between resident performance, rank list position, and United States Medical Licensing Examination Step 1 scores. J Emerg Med. 2017;52(3):332–340. doi:10.1016/j.jemermed.2016.11.008
19. Bhat R, Takenaka K, Levine B, Goyal N, Garg M, Visconti A, et al. Predictors of a top performer during emergency medicine residency. J Emerg Med. 2015;49(4):505–512. doi:10.1016/j.jemermed.2015.05.035
20. Sutton E, Richardson JD, Ziegler C, Bond J, Burke-Poole M, McMasters KM. Is USMLE Step 1 score a valid predictor of success in surgical residency? Am J Surg. 2014;208(6):1029–1034. doi:10.1016/j.amjsurg.2014.06.032
21. Yousem IJ, Liu L, Aygun N, Yousem DM. United States Medical Licensing Examination Step 1 and 2 scores predict neuroradiology fellowship success. J Am Coll Radiol. 2016;13(4):438–444.e432. doi:10.1016/j.jacr.2015.10.024
22. Spurlock DR Jr, Holden C, Hartranft T. Using United States Medical Licensing Examination (USMLE) examination results to predict later in-training examination performance among general surgery residents. J Surg Educ. 2010;67(6):452–456. doi:10.1016/j.jsurg.2010.06.010
23. Sharp C, Plank A, Dove J, Woll N, Hunsinger M, Morgan A, et al. The predictive value of application variables on the global rating of applicants to a general surgery residency program. J Surg Educ. 2015;72(1):148–155. doi:10.1016/j.jsurg.2014.06.003
24. Raman T, Alrabaa RG, Sood A, Maloof P, Benevenia J, Berberian W. Does residency selection criteria predict performance in orthopaedic surgery residency? Clin Orthop Relat Res. 2016;474(4):908–914. doi:10.1007/s11999-015-4317-7
25. Cuddy MM, Young A, Gelman A, Swanson DB, Johnson DA, Dillon GF, et al. Exploring the relationships between USMLE performance and disciplinary action in practice: a validity study of score inferences from a licensure examination. Acad Med. 2017;92(12):1780–1785. doi:10.1097/ACM.0000000000001747
26. van der Vleuten CP, Schuwirth LW, Driessen EW, Govaerts MJ, Heeneman S. 12 tips for programmatic assessment. Med Teach. 2014:1–6.
27. Warm EJ, Schauer DP, Diers T, Mathis BR, Neirouz Y, Boex JR, et al. The ambulatory long-block: an Accreditation Council for Graduate Medical Education (ACGME) Educational Innovations Project (EIP). J Gen Intern Med. 2008;23(7):921–926. doi:10.1007/s11606-008-0588-y
28. Warm EJ, Schauer D, Revis B, Boex JR. Multisource feedback in the ambulatory setting. J Grad Med Educ. 2010;2(2):269–277. doi:10.4300/JGME-D-09-00102.1
29. Naidich JB, Grimaldi GM, Lombardi P, Davis LP, Naidich JJ. A program director's guide to the Medical Student Performance Evaluation (former dean's letter) with a database. J Am Coll Radiol. 2014;11(6):611–615. doi:10.1016/j.jacr.2013.11.012
30. HCAHPS: Hospital Consumer Assessment of Healthcare Providers and Systems. https://www.hcahpsonline.org/. Accessed June 14, 2019.
31. Welch TR, Olson BG, Nelsen E, Beck Dallaghan GL, Kennedy GA, Botash A. United States Medical Licensing Examination and American Board of Pediatrics Certification Examination results: does the residency program contribute to trainee achievement? J Pediatr. 2017;188:270–274.e273. doi:10.1016/j.jpeds.2017.05.057
32. Maker VK, Zahedi MM, Villines D, Maker AV. Can we predict which residents are going to pass/fail the oral boards? J Surg Educ. 2012;69(6):705–713. doi:10.1016/j.jsurg.2012.08.009
33. Thundiyil JG, Modica RF, Silvestri S, Papa L. Do United States Medical Licensing Examination (USMLE) scores predict in-training test performance for emergency medicine residents? J Emerg Med. 2010;38(1):65–69. doi:10.1016/j.jemermed.2008.04.010
34. Perez JA Jr, Greer S. Correlation of United States Medical Licensing Examination and Internal Medicine In-Training Examination performance. Adv Health Sci Educ Theory Pract. 2009;14(5):753–758. doi:10.1007/s10459-009-9158-2