AEM Education and Training. 2018 Jan 31;2(2):73–76. doi: 10.1002/aet2.10079

Relationship Between Institutional Standardized Letter of Evaluation Global Assessment Ranking Practices, Interviewing Practices, and Medical Student Outcomes

Alexis Pelletier‐Bui 1, Michael Van Meter 2, Michael Pasirstein 3, Christopher Jones 1, Diane Rimple 4
Editor: Teresa Man‐Yee Chan
PMCID: PMC6001728  PMID: 30051071

Abstract

Background

Emergency medicine (EM) program directors rely largely on the standardized letter of evaluation (SLOE) to help determine which applicants to interview in the face of an increasing number of applications. To further characterize the SLOE's role in the EM application process, particularly the global assessment (GA) ranking and its effect on interviewing practices and medical student outcomes, the leaders of EM programs were surveyed regarding their experiences in both generating and utilizing the SLOE.

Methods

Individuals on the Council of Emergency Medicine Residency Directors (CORD) and Clerkship Directors in Emergency Medicine (CDEM) Academy listservs were anonymously surveyed from March 21–30, 2015, with 18 questions in multiple‐choice and fill‐in‐the‐blank formats.

Results

There were 99 respondents. Only 39 respondents (39%) reported adhering strictly to SLOE guidelines by evenly placing their students into thirds (top, middle, lower) on the SLOE GA. Most respondents interviewed individuals ranked in the lower third. Programs adhering strictly to ranking guidelines were more likely to interview students in the lower third than those adhering loosely or not at all. There was no relationship between a program's self‐reported adherence to the SLOE ranking guidelines and the number of unmatched students in EM during the 2014 and 2015 academic years.

Conclusion

Many SLOE writers do not strictly adhere to CORD's SLOE writing guidelines when using the GA ranking, due to the fear of adversely impacting an applicant's ability to successfully match into EM. This calls into question the validity of the SLOE as it is currently used. However, this study suggests that adhering to recommended SLOE ranking guidelines is unlikely to substantially increase the risk that students will fail to match. If more evaluators were to adhere to the guidelines, the SLOE could become the valid evaluation instrument that graduate medical education has long been pursuing.


The average number of emergency medicine (EM) residency applications submitted by matched U.S. seniors and the average number of applications received by EM residency programs continue to rise.1, 2, 3, 4 Consequently, residency leadership has increasingly relied on metrics, such as United States Medical Licensing Examination (USMLE) Step scores, to determine which applicants to interview. EM is unique among medical specialties in that it uses an additional standardized tool to assist with determining whom to interview: the standardized letter of evaluation (SLOE). According to the National Resident Matching Program (NRMP) 2016 Program Directors Survey, program directors (PDs) place more importance on this piece of the application than on any other part.4

The SLOE was developed as an evaluative tool to help differentiate an often otherwise indistinguishable mass of applicants. Authors are requested to rank applicants in the top 10%, top third, middle third, or lower third, in comparison to their peers, in a variety of noncognitive areas (e.g., work ethic, communication) and globally. There is no explicit guidance regarding how to utilize or interpret the global assessment (GA), although the instructions for SLOE writing on the Council of Emergency Medicine Residency Directors (CORD) website remind evaluators that EM generally attracts a very competitive group of applicants relative to residency applicants in other specialties, and as such, “middle 1/3 and lower 1/3 rank estimates should be viewed as competitive applicants who will likely match.”5 However, results from a survey show that 60% of evaluators reported some degree of score inflation to avoid decreasing an applicant's chance of matching.6 Our survey sought to further characterize the SLOE's role in the EM application process, particularly the GA ranking and its relationship to interviewing patterns and medical student match outcomes.

Methods

Institutional review board approval was obtained. The CORD Medical Student Advising Task Force surveyed all individuals listed on the CORD and Clerkship Directors in Emergency Medicine Academy listservs using Google Forms online software. While exact listserv rosters are not available, in 2015 there were 167 accredited EM residency programs7 and there are 163 EM clerkships listed in the Society for Academic Emergency Medicine (SAEM) clerkship directory.8 Clerkship directors (CDs) and PDs were sampled together given the strong overlap in responsibilities for both writing and utilizing the SLOE. Participants granted consent for publication of the anonymous, pooled survey results by simply completing the form. No individual program‐identifying data were collected, to encourage honest reporting of practices and impressions. There were 18 survey questions in multiple‐choice and comment format. Questions assessed institutional adherence to CORD's GA ranking guidelines, interview patterns regarding students ranked in the lower and middle thirds, the number of unmatched students at respondents' institutions, and hypotheses regarding why these students did not match. The survey was administered from March 21–30, 2015.

Data Analysis

We conducted simple descriptive analyses (proportions, mean, median, 95% confidence interval [CI], and range). Comparisons between programs with respect to the number of unmatched students (interval variables) based on SLOE adherence (nominal variable with three levels) were performed using the Kruskal‐Wallis test for nonnormal data. Data were analyzed using PASW Version 18.0 (IBM Corp.). Missing data were excluded in a pairwise fashion.
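To make the comparison described above concrete, the sketch below shows one way such an analysis could be run in Python with pandas and SciPy rather than PASW. The per‐program counts of unmatched students and the variable names are hypothetical placeholders introduced only for illustration; they are not the study data.

```python
# Minimal sketch of the analysis described above, using hypothetical data.
# pandas/SciPy stand in for PASW; the counts below are NOT the study dataset.
import pandas as pd
from scipy import stats

# Hypothetical number of unmatched students per responding program, grouped by
# self-reported adherence to the SLOE global assessment ranking guidelines.
df = pd.DataFrame({
    "adherence": ["strict"] * 6 + ["loose"] * 6 + ["none"] * 3,
    "unmatched": [0, 1, 2, 1, 3, 4,  0, 1, 0, 2, 1, 1,  1, 0, 4],
})

# Simple descriptive statistics per adherence group.
print(df.groupby("adherence")["unmatched"].agg(["count", "mean", "median", "min", "max"]))

# Kruskal-Wallis test: nonparametric comparison of the unmatched-student
# distributions across the three adherence groups (suited to nonnormal count data).
groups = [g.values for _, g in df.groupby("adherence")["unmatched"]]
h_stat, p_value = stats.kruskal(*groups)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.3f}")
```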

Results

There were 99 respondents with regional breakdown defined by the regional meetings of the SAEM: Great Plains (5), Mid Atlantic (11), Midwest (26), New England (11), Southeast (21), West (23), and unsure (2). Respondents were affiliated with academic training programs (70), county programs (27), and community programs (17; respondents could place themselves into more than one category).

Seven respondents (7%) reported not adhering to SLOE guidelines for placing students into top, middle, and lower thirds; 53 (54%) reported loose adherence; and 39 (39%) reported strict adherence (Table 1). The most common reason cited for not adhering was the concern that a lower third ranking is detrimental to the candidate's residency application. However, the majority (90%) of respondents have interviewed individuals ranked in the lower third, with those adhering most strictly being more likely to interview a candidate ranked in the lower third (Table 1). Additionally, we observed no difference in the number of students who went unmatched in the combined 2013–14 and 2014–15 academic years between institutions grouped according to their self‐reported SLOE guideline adherence, with means of 1.86 unmatched students (95% CI = 1.13–2.59) among programs reporting strict adherence, 1.15 (0.74–1.55) among programs reporting loose adherence, and 2.67 (0–6.63) among programs reporting no adherence (p = 0.31; Table 1). The most common reasons cited for students going unmatched were poor grades (17 respondents), personality issues (9), and a failed USMLE examination (8).

Table 1.

Number of Applicants Interviewed with a Lower Third Ranking (Outside of the Institution's Home Students) and Number of Unmatched Applicants when Compared to Adherence to SLOE Ranking Guidelines

Number of Unmatched Applicants, 2013–14 and 2014–15 Academic Years

                     N    Mean   95% CI      Median   Range
All respondents      89   1.53   1.12–1.94   1        0–10
Strict adherence     35   1.86   1.13–2.59   1        0–8
Loose adherence      48   1.15   0.74–1.55   1        0–5
No adherence          6   2.67   0–6.63      1.5      0–10

Applicants Interviewed With a Lower Third Ranking, n (%)

                     N    No Interviews   <5% of Interviews   5%–15% of Interviews   15%–30% of Interviews   Skipped Question
All respondents      99   5 (5)           36 (36)             42 (42)                11 (11)                 5 (5)
Strict adherence     39   2 (5)           9 (23)              20 (51)                5 (13)                  3 (8)
Loose adherence      53   3 (6)           23 (43)             20 (38)                5 (9)                   2 (4)
No adherence          7   0 (0)           5 (71)              1 (14)                 1 (14)                  0 (0)

SLOE = standardized letter of evaluation.
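As context for how summary figures of the kind shown in Table 1 are typically derived, the short sketch below computes a group mean with a t-based 95% confidence interval from a hypothetical vector of per‐program unmatched‐student counts. The numbers are placeholders, and the exact interval method used in the original PASW analysis is not stated in the paper, so the t-based interval is an assumption.

```python
# Illustrative only: hypothetical counts; a t-based CI is assumed because the
# paper does not state which interval method was applied.
import numpy as np
from scipy import stats

unmatched = np.array([0, 1, 2, 1, 3, 0, 1, 0])  # hypothetical per-program counts

mean = unmatched.mean()
sem = stats.sem(unmatched)  # standard error of the mean
ci_low, ci_high = stats.t.interval(0.95, df=len(unmatched) - 1, loc=mean, scale=sem)
# Counts cannot be negative, so a lower bound below zero is truncated at 0,
# matching the style of the "0–6.63" interval reported in Table 1.
print(f"mean = {mean:.2f}, 95% CI = {max(ci_low, 0):.2f} to {ci_high:.2f}")
```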

Sixty‐eight respondents (69%) reported writing a SLOE for a student rated in the lower third on the GA whom they thought was well suited to become a respectable EM physician. Twenty‐six respondents (26%) matched an applicant who was ranked in the lower third on the GA of their SLOE, 41 respondents (41%) did not, and 32 respondents (32%) did not know or skipped the question.

Discussion

The USMLE Step 1 score has long been the only simple, clear comparison residencies can make regarding applicants. However, demand exists for more honest and transparent methods of appraising residency candidates, including the assessment of bedside clinical skills. The SLOE was developed in an attempt to address this need. More recently, the Association of American Medical Colleges (AAMC) has begun to tackle this more universal problem for the larger house of medicine by standardizing content within the Medical Student Performance Evaluation (MSPE) and by developing the standardized video interview, which scores students' communication and professionalism, skills that may be more relevant to their performance in residency than a USMLE score.

There are lessons to be learned and perhaps fears to be dispelled from EM's experience with the SLOE. While it remains to be seen whether the AAMC's newer methods of comparing students will be used as originally intended, studies in the EM literature suggest that the EM SLOE is not, raising concerns regarding the validity of the tool. Many writers do not use the full spectrum of the GA portion of the SLOE.9 In this survey, we also found that self‐reported grade inflation was common because of fear of adversely impacting an applicant's ability to match successfully into EM. However, our survey demonstrates that this fear may not be warranted. Only 5% of respondents report not interviewing applicants ranked in the lower third. Most importantly, there was no evidence that strict adherence to the GA grading guidelines was associated with higher numbers of medical students failing to match in EM.

Interestingly, programs adhering strictly to ranking guidelines appeared to be more likely to interview students in the lower third than those adhering loosely or not at all (Table 1). These findings suggest that those who adhere to the guidelines also interpret the SLOE rankings as CORD intends when evaluating applicants and therefore do not view a student ranked in the lower third as “undesirable.”

Our data show that adherence to the intended SLOE rankings is poor, limiting the validity and usefulness of the included information. However, it is plausible that if more SLOE authors adhered strictly to CORD's guidelines, the stigma of a lower third ranking could be dispelled and the SLOE could develop into a more useful measurement tool for EM applicants. We recommend continued use of the SLOE, but encourage better adherence to CORD's ranking guidelines and accurate reporting of each institution's grade distribution. By abandoning the character of a standardized, transparent document and succumbing to grade inflation, we end up back where we started from the graduate medical education perspective.

Limitations

A major limitation of our study is response process validity: did our survey respondents interpret the questions in the way we intended?10 For example, two respondents might both answer “yes” to the question about “strict adherence” to the SLOE guidelines even though one placed 10% of their applicants in the lower third of the GA and the other placed 33%.

Additionally, the number of respondents (99) was small relative to the total number of ACGME‐accredited EM residency programs (167). Those educators who chose to respond may differ from those who did not with respect to their ranking, advising, and interviewing practices. It is possible that there is a difference in opinion between PDs and CDs that could have skewed the data, depending on the response ratio between the two groups. It is also possible that the survey captured more than one respondent from individual institutions; we did not limit responses to one per institution, both to maintain blinding of the results and because multiple people often contribute to grading and advising within a single institution.

Conclusion

Many standardized letter of evaluation writers do not strictly adhere to Council of Emergency Medicine Residency Directors’ standardized letter of evaluation writing guidelines when using the global assessment ranking. This study suggests that strict adherence to recommended evaluation guidelines is unlikely to substantially increase applicant chances of failing to match. If more evaluators were to adhere to the standardized letter of evaluation guidelines, it could become the valid measurement tool that graduate medical education has long been seeking.

AEM Education and Training 2018;2:73–76

An earlier version of this manuscript was presented as a poster at the Council of Emergency Medicine Residency Directors (CORD) Academic Assembly, Nashville, TN, March 2016.

The authors have no relevant financial information or potential conflicts to disclose.

References

1. National Resident Matching Program, Data Release and Research Committee. Results of the 2009 NRMP Applicant Survey by Preferred Specialty and Applicant Type. Washington, DC: National Resident Matching Program, 2010.
2. National Resident Matching Program, Data Release and Research Committee. Results of the 2015 NRMP Applicant Survey by Preferred Specialty and Applicant Type. Washington, DC: National Resident Matching Program, 2015.
3. National Resident Matching Program, Data Release and Research Committee. Results of the 2012 NRMP Program Director Survey. Washington, DC: National Resident Matching Program, 2012.
4. National Resident Matching Program, Data Release and Research Committee. Results of the 2016 NRMP Program Director Survey. Washington, DC: National Resident Matching Program, 2016.
5. SLOE Standard Letter of Evaluation. Council of Emergency Medicine Residency Directors (CORD) website. Available at: http://www.cordem.org/i4a/pages/index.cfm?pageid=3743. Accessed Oct 24, 2017.
6. Hegarty CB, Lane DR, Love JN, et al. Council of Emergency Medicine Residency Directors' standardized letter of recommendation writers' questionnaire. J Grad Med Educ 2014;6:301–6.
7. ACGME Accreditation Data System. Accreditation Council for Graduate Medical Education (ACGME) website. Available at: https://apps.acgme.org/ads/Public/Reports/Report/3. Accessed Nov 24, 2017.
8. Clerkship Directory. Society for Academic Emergency Medicine (SAEM) website. Available at: http://www.saem.org/resources/directories/clerkship-directory. Accessed Nov 24, 2017.
9. Love J, Deiorio NM, Ronan‐Bentle S, et al. Characterization of the Council of Emergency Medicine Residency Directors' standardized letter of recommendation in 2011–2012. Acad Emerg Med 2013;20:926–32.
10. Messick S. Validity of psychological assessment: validation of inferences from persons' responses and performances as scientific inquiry into score meaning. Am Psychol 1995;50:741–9.
