HCA Healthcare Journal of Medicine
2024 Mar 29;5(1):5–9. doi: 10.36518/2689-0216.1598

Variation in Transcript Reports Among Residency Applicants: An Anesthesia Program’s Perspective

Alex M Hendon 1, Imani Thornton 1
PMCID: PMC10939090  PMID: 38560392

Abstract

Background

With the recent change of USMLE Step 1 and COMLEX Level 1 score reporting to pass/fail, residency programs need other metrics to evaluate candidates. One metric conserved across all residency applications is the medical school transcript. This study aims to highlight the highly variable transcript reporting in a new era of holistic applicant review.

Methods

Medical school transcripts were extracted from the Electronic Residency Application Service applications to our anesthesiology residency program for the 2021–2022 application cycle. All personally identifiable information was removed. Results were categorized and tallied by 2 independent reviewers. Overall, we assessed transcript information from 156 allopathic and osteopathic medical schools. Transcript data were separated into 9 different categories.

Results

The most common grading system among allopathic medical schools was pass/fail. The most common grading system among osteopathic medical schools was a combination of pass/fail and letter grades. Several medical schools used unique grading systems, and many of those did not provide a grading key for interpretation. Fewer than half of the allopathic and osteopathic schools offered honors or high pass in their grading systems, often with little information provided as to how these grades were earned.

Conclusion

The information provided on medical school transcripts is extremely variable. Although many schools reported grades as pass/fail, there was no majority or consistent presentation among the transcripts. Much of the information provided on transcripts required interpretation by reviewers, making holistic applicant review more difficult.

Keywords: medical education, graduate medical education, internship and residency, medical school transcripts, licensing exam, grading systems

Introduction

The transition from medical student to resident physician is a long and arduous process for both medical students and their respective residency programs. Applying for residency through the match coordinated by the National Resident Matching Program (NRMP) involves a complicated exchange of information between applicants and programs. Depending on the desired specialty, medical students may apply to a large number of programs to increase their chances of match success. Competitive specialties may receive thousands of applications for a small number of available positions. Consequently, residency program administrators spend countless hours reviewing and narrowing the pool of applicants with the hope of creating a rank list of applicants who embody the mission of their respective programs. Historically, programs have relied on various objective measures to help screen applications. Recently, however, the medical school community has taken steps to encourage a more holistic view of residency applicants in an effort to address the disparities that exist in graduate medical education. Typical application packages include medical school transcripts, United States Medical Licensing Examination (USMLE)/Comprehensive Osteopathic Medical Licensing Examination of the United States (COMLEX) scores, a Medical Student Performance Evaluation (MSPE, Dean's letter), letters of recommendation, and personal statements. With the recent change in reporting of USMLE Step 1 and COMLEX Level 1 scores to pass/fail, transcript data may become more important in applicant review and consideration. A recent study of the MSPE or "Dean's letter" reported considerable variability in the content of the MSPEs submitted by United States (US) allopathic medical schools,1 further highlighting the difficulty programs face in differentiating candidates based on application data.
Extreme variability persists in medical school reporting for residency applications. In this study, we describe the variability in medical school transcripts from various allopathic and osteopathic medical schools.

Methods

For purposes of this review, transcript data from US allopathic and osteopathic medical schools were separated from residency applications for examination. Only transcripts from the 2021–2022 applicant pool to our anesthesiology residency were included in this study. All identifiable personal information was removed. The data provided in those transcripts were categorized and tallied. The categories established were based on the grading format provided by each medical school. The grading systems were categorized based on independent reviews by 2 separate reviewers. Grading categories included the following: pass/fail, letter grades (A–F), a combination of pass/fail and letter grades, satisfactory/unsatisfactory, unique grading system with a key, unique grading system without a key, grades withheld, number grades with pass/fail, and letter grades with satisfactory/unsatisfactory. All transcripts were reviewed; however, medical schools with multiple applicants to our program were counted once, assuming that each medical school provided only 1 type of transcript for each of its medical students. We also observed that some schools used different grading systems for the preclinical and clinical training years. The following data are based on transcripts received from each medical school. In total, we received applications from 117 unique allopathic schools and 36 unique osteopathic schools.

Results

In analyzing the data presented in Tables 1 and 2, allopathic medical schools most commonly provided transcript data in pass/fail form for both preclinical and clinical years, representing 41% of the transcripts evaluated. In contrast, only 16% of the osteopathic schools reported data in a purely pass/fail format. Additionally, some transcripts included an honors/high pass category; however, these data also varied widely, with some transcripts offering both honors and high pass while others offered only one or the other. In total, we received 43 allopathic transcripts and 6 osteopathic transcripts that listed an honors/high pass category, constituting 36.7% and 16.7% of these schools, respectively. Unique grading systems were also frequently observed (41 total schools). For example, one transcript reported grading from 'Excellent' to 'Unsatisfactory,' while another reported only an undefined 'SA' or 'SH.'

Table 1.

Evaluation of Allopathic Medical School Transcripts, Separated by Grading Criteria

Transcript* categories Category tally
Pass/fail 48
Letter grades 4
Pass/fail & letter grades 13
Satisfactory/unsatisfactory 12
Unique with key 20
Unique without key 14
Grades withheld 1
Number grades with pass/fail 1
Letter grades with satisfactory/unsatisfactory 4
Total schools offering honors 43/117
* All data were collected from transcripts of individuals who applied to HCA Florida Westside Hospital Anesthesiology Residency.

Table 2.

Evaluation of Osteopathic Medical School Transcripts, Separated by Grading Criteria

Transcript* categories Category tally
Pass/fail 6
Letter grades 2
Pass/fail & letter grades 15
Satisfactory/unsatisfactory 12
Unique with key 0
Unique without key 5
Grades withheld 2
Number grades with pass/fail 6
Letter grades with satisfactory/unsatisfactory 0
Total schools offering honors 6/36
* All data were collected from transcripts of individuals who applied to HCA Florida Westside Hospital Anesthesiology Residency.
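As a quick arithmetic check, the percentages cited in the Results can be recomputed from the category tallies above (a minimal sketch; the school totals of 117 allopathic and 36 osteopathic are taken from the Methods section):

```python
# Recompute the Results percentages from the category tallies in Tables 1 and 2.
ALLOPATHIC_SCHOOLS = 117   # unique allopathic schools (Methods)
OSTEOPATHIC_SCHOOLS = 36   # unique osteopathic schools (Methods)

def pct(count: int, total: int) -> float:
    """Share of schools as a percentage, rounded to one decimal place."""
    return round(100 * count / total, 1)

print(pct(48, ALLOPATHIC_SCHOOLS))    # allopathic pass/fail share: 41.0
print(pct(6, OSTEOPATHIC_SCHOOLS))    # osteopathic pass/fail share: 16.7
print(pct(43, ALLOPATHIC_SCHOOLS))    # allopathic honors share: 36.8 (36.7% as reported)
print(pct(6, OSTEOPATHIC_SCHOOLS))    # osteopathic honors share: 16.7
```

Note that 43/117 rounds to 36.8%; the 36.7% figure reported in the text appears to be truncated rather than rounded.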

Discussion

With the progression to pass/fail board exams (both USMLE Step 1 and COMLEX Level 1) and the increasing number of medical students applying to residency, selecting prospective residents has become more challenging for anesthesiology programs. Current trends show that the number of graduate medical education applications has outpaced the number of residency slots available. Per the American Academy of Family Physicians (AAFP), the number of medical school graduates has increased by 37.5% over the past 2 decades.2 The 2019 NRMP match data are even more striking, showing that a record 38,376 applicants competed for 32,194 positions.3 These increases have created a substantial challenge for residency programs, as many receive far more applications than they have positions available. According to data from the Electronic Residency Application Service (ERAS), anesthesiology programs received an average of 1299 applications in 2022 compared with 880 in 2017.4 Consequently, these increases have amplified the administrative burden and increased the time commitment and labor costs associated with residency application review. Residency faculty (program directors, associate program directors, and core faculty) are integrally involved in the applicant review process, often in addition to busy clinical schedules. Despite acknowledging the current state of the residency application process, the USMLE Scoring Committee decided to make the licensing exam pass/fail.4 This move toward pass/fail board exams followed pass/fail preclinical and clinical medical school coursework after the Association of American Medical Colleges (AAMC) published a multicenter study suggesting that students had enhanced well-being when evaluated with pass/fail grading systems.5 Many medical schools moved to pass/fail curricula in the early 2000s, which may have amplified the impact of USMLE Step 1 and COMLEX Level 1 scores in applicant evaluation.
In fact, based on a survey of residency program directors conducted by the NRMP, USMLE Step 1/COMLEX Level 1 scores have traditionally been the most important factor leading to an interview invitation for residency applicants.6 While the emphasis on medical student well-being should not be ignored, these data suggest that objective measures are important when comparing and evaluating applicants.

After carefully examining transcripts from allopathic and osteopathic medical students, we found wide variability in the transcript data provided to residency programs. In addition to pass/fail, many unique grading systems were used, often with no key provided. Thus, interpreting the transcript data in the context of medical student success was frequently impossible. For example, if a transcript used only pass/fail criteria and the student did not earn any high pass/honors grades, it was impossible to determine whether such grades were offered at that institution. Many schools offered honors or high pass on transcripts but did not describe the criteria a student had to meet to earn an honors or high pass grade. There is no shared understanding of what constitutes honors or high pass among the medical education community, which significantly limits the value of these grades when comparing applicants. Another important observation is the lack of a conserved curriculum. Medical schools offer and require many different courses, making some curricula appear far more challenging than others. This alone may challenge the holistic approach, as students from institutions with a perceived higher pedigree may be preferred over students from smaller, lesser-known institutions.

When considering residency applicants, one of the most important criteria is the potential for the resident to progress successfully through a particular program. Virtual interviewing has changed the way we interact with residency applicants during recruitment season and has introduced new challenges in evaluating prospective candidates for program fit. The virtual interview, initially a function of the COVID-19 pandemic, appears likely to remain a mainstay of the interview process, at least for now. Physical observation, including body language and interaction with residents and other applicants, can often provide important information to the interviewer. Virtual interviews significantly limit these observations, and removing these in-person elements has further dampened the ability of programs to compare and assess future residents. While many variables could affect a resident's potential to perform well, one constant that remains is the ability to perform well on standardized exams. Every year, residents in their respective specialties sit for in-service training exams. These exams test knowledge acquired and resident progression through residency training. The exams are scored, and percentiles are assigned based on exam performance.7 In-training exams typically provide data on a resident's ability to sit for and pass a board examination for certification. Transcript data containing pass/fail grades give a residency program little information about a student's ability to pass important exams during and after residency. These exams serve as an important metric that the Accreditation Council for Graduate Medical Education (ACGME) continues to include in accreditation decisions.
According to the ACGME manual, the ACGME may withdraw accreditation from programs in which residents and fellows are unable to satisfy the requirements of the program.8 This may pose a problem for anesthesiology programs, which require their residents to sit for and pass the American Board of Anesthesiology BASIC examination after the first year of clinical anesthesia training. As comparative data such as numeric scores and transcript letter grades become obsolete, evaluating students' ability to pass board exams that still use scoring systems becomes difficult.

After carefully evaluating transcripts from many different medical schools, it is clear that they contain no benchmarks or conserved metrics. Thus, we must consider alternatives when evaluating potential residents. USMLE Step 2/COMLEX Level 2 exams continue to provide numeric scores and are likely to grow in importance when comparing applicants. Reliance on these scores, however, detracts from the broader push to reduce medical student stress. As standardized objective measures are removed from applications, holistic measures, which may include applicant experiences and research interests, are increasing in importance. The MSPE often provides additional information, although MSPEs also vary significantly in content.9 Expanding the information provided on transcripts to include the criteria for achieving reported grades may help programs use transcript data more effectively.

This study has some limitations. First, we evaluated medical school transcripts from applications to a single institution. However, the total number of applications to our program was comparable with the numbers reported by ERAS10 and should therefore provide a representative sample of different medical schools. Also, the available transcript data were based on the grades earned by a particular applicant and may not reflect all of the grades offered by that school.

Conclusion

We found that the information reported on medical school transcripts varies widely and is difficult to compare among institutions. Although there is no consensus, many descriptions of "holistic review" of residency applications include medical school transcript review as a part of that process.11,12 Given the number of different variables reported in the transcripts, and the difference in the meaning of those variables depending on the institution, residency programs have a limited ability to use transcript data to evaluate applicants. Much of the data in medical school transcripts require interpretation by reviewers, which may actually compromise the holistic review process.

Funding Statement

This research was supported (in whole or in part) by HCA Healthcare and/or an HCA Healthcare-affiliated entity.

Footnotes

Conflicts of Interest: The authors declare they have no conflicts of interest.

The authors are employees of HCA Florida Westside Hospital, a hospital affiliated with the journal's publisher.

The views expressed in this publication represent those of the author(s) and do not necessarily represent the official views of HCA Healthcare or any of its affiliated entities.

References

