Abstract
Background:
Research experience is believed to be an important component of the neurosurgery residency application process. One measure of research productivity is publication volume. Neither the pre-residency publication volume of US neurosurgery interns nor the potential association between applicant publication volume and matching into top-ranked residency programs has been well characterized.
Objective:
In this study, we sought to characterize the pre-residency publication volume of U.S. neurosurgery residents in the 2018–2019 intern class using the Scopus database.
Methods:
For each intern, we recorded the total number of publications, total number of first or last author publications, total number of neuroscience-related publications, mean number of citations per publication, and mean impact factor of the journal per publication. Pre-residency publication volumes of interns at the top 25 programs (based on a composite ranking score according to four different ranking metrics) were compared to those at all other programs.
Results:
We found that 82% of neurosurgery interns included in the analysis (190 interns from 95 programs) had at least one publication. The average number of publications per intern across all programs was 6.0 ± 0.6 (mean ± SEM). We also found that interns at top-25 neurosurgery residency programs had a higher number of total publications (8.3 ± 1.2 vs 4.8 ± 0.7, P = 0.0137), a higher number of neuroscience-related publications (6.8 ± 1.1 vs 4.1 ± 0.7, P = 0.0419), and a higher mean number of citations per publication (9.8 ± 1.7 vs 5.7 ± 0.8, P = 0.0267) than interns at all other programs.
Conclusion:
Our results provide a general estimate of the pre-residency publication volume of U.S. neurosurgery interns and suggest a potential association between publication volume and matching into top-25 neurosurgery residency programs.
Keywords: Publications, Neurosurgery, Residency, Medical Education, Surgery
Introduction
Research is an important component of the neurosurgery residency match process. In a 2018 survey, research involvement and interest ranked in the top 5 of the 33 factors used to evaluate neurosurgery residency applicants1. In 2018, neurosurgery match applicants reported an average of 18.3 publications, abstracts, and conference publications and 5.2 research experiences2. However, self-reported research productivity statistics have previously been shown to be misrepresented or inaccurate3–7. Further, average research productivity masks program-level differences. For example, research-focused neurosurgery departments that receive significant NIH funding may attract and/or seek applicants with strong research backgrounds. Indeed, a study of U.S. orthopedic surgery interns found that higher publication volume is associated with matching into programs affiliated with departments in the top 25 for NIH funding8. To date, no such analysis has been performed for the neurosurgery match process. Here, we sought to (1) determine neurosurgery residency applicant research productivity as measured by publication volume and impact, and (2) determine whether there is an association between applicant publication metrics and matching into top-25 neurosurgery residency programs.
Methods
Since the study analyzed publicly available data, IRB/ethics committee approval and patient consent were not sought.
Residency and resident selection
Neurosurgery residency programs as of October 2018 were identified using the Accreditation Council for Graduate Medical Education’s (ACGME) database. For each accredited residency program, all first-year residents (PGY-1) listed on the program’s website were included in the study. Residency programs with no publicly listed residents were excluded from the study. Each resident’s name and medical school attended were obtained from the residency program website and recorded.
Residency rankings
Residency program rankings were determined using the following four methods: (1) US News and World Report Neurology and Neurosurgery ranking (2018–2019), (2) NIH funding to neurosurgery departments (2018), (3) Doximity Residency Navigator Reputation (2018–2019), and (4) Doximity Residency Navigator Research (2018–2019). For the US News rankings, the ranking of the academic hospital affiliated with each training program was recorded. For NIH funding rankings, we obtained data from the Blue Ridge Institute for Medical Research database (www.brimr.org). For Doximity rankings, the Doximity Residency Navigator was searched for Neurological Surgery specialty programs and sorted by either Reputation or Research Output. We then calculated a composite rank for each program by averaging its ranks across the four ranking lists (Table 1).
Table 1:
Top 25 US neurosurgery residency programs. Composite ranks were calculated by averaging each program's rank across four ranking lists: NIH funding to neurosurgery departments, US News Neurology & Neurosurgery, Doximity Reputation, and Doximity Research Output.
| Institution | Composite Rank |
|---|---|
| University of California (San Francisco) Program | 1 |
| Johns Hopkins University Program | 2 |
| Massachusetts General Hospital Program | 3 |
| Mayo Clinic College of Medicine and Science (Rochester) Program | 4 |
| New York Presbyterian Hospital (Columbia Campus) Program | 5 |
| Washington University/B-JH/SLCH Consortium Program | 6 |
| University of Virginia Medical Center Program | 7 |
| St Joseph’s Hospital and Medical Center-Barrow Neurological Institute Program | 8 |
| UPMC Medical Education Program | =9 |
| McGaw Medical Center of Northwestern University Program | =9 |
| Stanford Health Care-Sponsored Stanford University Program | 11 |
| University of Pennsylvania Health System Program | =12 |
| Cleveland Clinic Foundation Program | =12 |
| UCLA David Geffen School of Medicine/UCLA Medical Center Program | 14 |
| University of Michigan Health System Program | 15 |
| New York Presbyterian Hospital (Cornell Campus) Program | 16 |
| New York University School of Medicine Program | 17 |
| Brigham and Women’s Hospital/Children’s Hospital Program | 18 |
| University of Utah Program | 19 |
| University of Florida Program | 20 |
| Sidney Kimmel Medical College at Thomas Jefferson University/TJUH Program | =21 |
| Baylor College of Medicine Program | =21 |
| Emory University School of Medicine Program | 23 |
| University of Southern California/LAC+USC Medical Center Program | 24 |
| Oregon Health & Science University Program | 25 |
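As an illustration of the composite ranking step described above, the following is a minimal sketch that averages each program's rank across the four lists and sorts by that average; the program names and per-list ranks are illustrative placeholders rather than the study data.

```python
# Minimal sketch of the composite-rank calculation.
# Program names and per-list ranks are illustrative placeholders.
from statistics import mean

# Each program maps to its rank on the four lists:
# (NIH funding, US News, Doximity Reputation, Doximity Research Output)
ranks = {
    "Program A": (1, 3, 2, 1),
    "Program B": (4, 1, 3, 2),
    "Program C": (2, 5, 6, 4),
}

# Average the four ranks, then sort ascending to obtain the composite ranking.
composite = sorted(ranks.items(), key=lambda kv: mean(kv[1]))

for position, (program, list_ranks) in enumerate(composite, start=1):
    print(f"{position}. {program}: mean rank = {mean(list_ranks):.2f}")
```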
Resident publication data collection
The name of each resident included in the study was entered into the Scopus database (www.scopus.com) to identify all publications associated with the resident, the number of citations for each publication, and the resident's position in each author list. For each paper, we also determined the 2018 impact factor of the journal in which the manuscript was published using the Master Journal List Beta database (https://apps.clarivate.com/mjl-beta/home). All publications published through December 2018 were included in the study. Each potential publication in Scopus was required to meet both of the following criteria: (1) the matched author listed an affiliation at an institution that the resident previously attended or worked at, and (2) the corresponding or senior author was at an institution that the resident previously attended or worked at. In the case of two residents with the same or similar names, the two study lead authors (DP, MP) performed an additional search of the resident's LinkedIn and residency website profiles to confirm the institutions the resident had previously attended. Only publications affiliated with current or past institutions that the resident attended were included in the analysis.
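As an illustration, the following is a minimal sketch of how the two affiliation-based inclusion criteria can be applied programmatically; the record fields and institution names are hypothetical simplifications, not the actual Scopus export schema.

```python
# Minimal sketch of the affiliation-based inclusion criteria described above.
# The record structure (dict fields) is a hypothetical simplification of an
# exported Scopus record, not the actual Scopus schema.

def meets_inclusion_criteria(record, resident_institutions):
    """Keep a candidate publication only if (1) the matched author lists an
    affiliation at an institution the resident attended or worked at, and
    (2) the corresponding/senior author is at one of those institutions."""
    author_affiliation_ok = record["resident_affiliation"] in resident_institutions
    senior_affiliation_ok = record["senior_author_affiliation"] in resident_institutions
    return author_affiliation_ok and senior_affiliation_ok

# Example usage with placeholder data
resident_institutions = {"Medical School X", "Research Institute Y"}
candidate_records = [
    {"title": "Paper 1", "resident_affiliation": "Medical School X",
     "senior_author_affiliation": "Medical School X"},
    {"title": "Paper 2", "resident_affiliation": "Unrelated University",
     "senior_author_affiliation": "Unrelated University"},
]

included = [r for r in candidate_records
            if meets_inclusion_criteria(r, resident_institutions)]
print([r["title"] for r in included])  # -> ['Paper 1']
```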
For each publication, the total number of citations and the resident's position in the author list were recorded from Scopus. Two independent study authors classified each publication as a review, case report, original research, or other. All clinical trials, retrospective chart reviews, basic science studies, and meta-analyses were classified as original research. Case reports and case series were both classified as case reports. All publications lacking novel data or analyses and focused on reviewing the literature were classified as reviews. All other publications, including editorials, reflections, and narratives, were classified as other. The two independent study authors also classified each publication as neuroscience-related or not. Any publication in a neurosurgery-related journal (e.g., Journal of Neurosurgery) or neuroscience-related journal (e.g., Neuron or Nature Neuroscience) was classified as neuroscience-related. For publications in other journals, the abstracts were reviewed, and publications were classified as neuroscience-related if they involved pathologies or processes of the nervous system. Any discrepancy between the two independent study authors was resolved by group discussion.
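As an illustration, the following is a minimal sketch of the journal-based step of the neuroscience classification; the journal list is intentionally abbreviated and illustrative, and in the study, publications in other journals were classified manually from the abstract, with disagreements resolved by group discussion.

```python
# Minimal sketch of the journal-based step of the neuroscience classification.
# The journal list is illustrative and incomplete.

NEURO_JOURNALS = {
    "journal of neurosurgery",
    "neurosurgery",
    "neuron",
    "nature neuroscience",
}

def is_neuroscience_related(journal_name, abstract_flagged_by_reviewers=False):
    """Classify a publication as neuroscience-related if it appears in a
    neurosurgery/neuroscience journal, or if the reviewers flagged the
    abstract as involving nervous-system pathologies or processes."""
    if journal_name.strip().lower() in NEURO_JOURNALS:
        return True
    return abstract_flagged_by_reviewers

print(is_neuroscience_related("Journal of Neurosurgery"))                    # True
print(is_neuroscience_related("JAMA", abstract_flagged_by_reviewers=True))   # True
```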
Statistical analysis
For each intern included in the study, we calculated the total number of publications, the number of first- or last-author publications, the number of neuroscience-related publications, the mean number of citations per publication, the mean journal impact factor per publication, and the total numbers of review, case report, original research, and other articles. These metrics were compared between interns at top-25 programs and those at all other programs using the Welch unpaired t test. P < 0.05 was considered statistically significant. All statistical analyses were conducted using GraphPad Prism (GraphPad Software).
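As an illustration of this group comparison, the following is a minimal sketch using SciPy's Welch (unequal-variance) t test; SciPy stands in here for GraphPad Prism, and the per-intern counts are illustrative placeholders rather than the study data.

```python
# Minimal sketch of the Welch unpaired t test used for the between-group
# comparisons. The two arrays are placeholder per-intern publication counts.
import numpy as np
from scipy import stats

top25_counts = np.array([12, 5, 9, 3, 14, 7, 10, 6])     # illustrative
other_counts = np.array([4, 2, 6, 1, 3, 8, 2, 5, 0, 7])  # illustrative

# equal_var=False selects the Welch (unequal-variance) variant of the t test.
t_stat, p_value = stats.ttest_ind(top25_counts, other_counts, equal_var=False)
print(f"Welch t = {t_stat:.2f}, P = {p_value:.4f}")
```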
Results
A total of 95 accredited programs and 190 interns were identified and included in the present study. A total of 1174 publications by the 190 interns were identified from Scopus and analyzed. Of the 190 interns, 156 (82%) had at least one publication and 118 (62%) had at least one first/last-author publication. To assess publication volume for each intern, we examined the total number of publications, the total number of first/last-author publications, the total number of neuroscience-related publications, and the total numbers of case reports, review articles, original research articles, and other articles. To assess publication impact for each intern, we examined the mean number of citations per publication and the mean impact factor of the journals in which the publications appeared. On average, each neurosurgery intern published a total of 6.0 ± 0.6 papers, 2.2 ± 0.3 of which were as first or last author (Table 2). Furthermore, the average journal impact factor per publication for each resident was 4.5 ± 0.6 (Table 2).
Table 2:
Summary of publication volume and impact among 190 interns from 95 ACGME-accredited neurosurgery residency programs included in this study.
| Publication productivity per intern | Mean ± SEM | 25th percentile | 50th percentile | 75th percentile | 90th percentile |
|---|---|---|---|---|---|
| Total # of publications | 6.0 ± 0.6 | 1 | 4 | 8 | 12 |
| Total # of first/last author publications | 2.2 ± 0.3 | 0 | 1 | 3 | 5 |
| Total # of neuroscience-related publications | 5.0 ± 0.6 | 0 | 2 | 7 | 11 |
| Mean # of citations per publication | 7.2 ± 0.7 | 1 | 3 | 9 | 19 |
| Mean impact factor per publication | 4.5 ± 0.6 | 2 | 3 | 4 | 7 |
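For readers reproducing these summaries, the following is a minimal NumPy sketch of how the mean ± SEM and percentile columns of Table 2 can be computed from per-intern values; the counts shown are illustrative placeholders, not the study data.

```python
# Minimal sketch of the per-intern summary statistics reported in Table 2.
# The publication counts below are illustrative placeholders.
import numpy as np

pubs_per_intern = np.array([0, 1, 4, 8, 12, 3, 6, 2, 9, 15])  # illustrative

mean = pubs_per_intern.mean()
sem = pubs_per_intern.std(ddof=1) / np.sqrt(len(pubs_per_intern))  # standard error of the mean
p25, p50, p75, p90 = np.percentile(pubs_per_intern, [25, 50, 75, 90])

print(f"{mean:.1f} ± {sem:.1f} (mean ± SEM); "
      f"25th={p25}, 50th={p50}, 75th={p75}, 90th={p90}")
```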
To characterize the association between publication productivity and matching into top-25 residency programs, we examined the total number of publications, the total number of first/last-author publications, the total number of neuroscience-related publications, and the total numbers of case reports, review articles, original research articles, and other articles as surrogates for publication volume. We found that interns at top-25 neurosurgery residency programs had significantly higher numbers of total publications (8.3 ± 1.2 vs 4.8 ± 0.7, P = 0.0137) and neuroscience-related publications (6.8 ± 1.1 vs 4.1 ± 0.7, P = 0.0419) (Figure 1A, B). No statistically significant difference in the total number of first/last-author publications was found (2.8 ± 0.4 vs 1.9 ± 0.4, P = 0.0844) (Figure 1C). No statistically significant differences between top-25 and non-top-25 programs were found in the numbers of case reports, review articles, original research articles, or other articles (Supplementary Figure 1). To investigate differences in publication impact between interns at top-25 programs and those at all other programs, we examined the mean number of citations per publication as well as the mean journal impact factor. We found that while interns at top-25 programs had a higher mean number of citations per publication (9.8 ± 1.7 vs 5.7 ± 0.8, P = 0.0267) (Figure 2A), there was no statistically significant difference in mean journal impact factor (6.0 ± 1.4 vs 3.6 ± 0.3, P = 0.0893) (Figure 2B).
Figure 1:
Publication volume of interns in top-25 neurosurgery residency programs and those in all other programs. A) Total number of publications per intern. B) Total number of first- or last-author publications per intern. C) Total number of neuroscience-related publications per intern. *P<0.05, ns denotes P>0.05. Statistical significance was determined by a Welch two-sample t test. Mean ± SEM are shown on the graphs.
Figure 2:
Publication impact of interns in top-25 neurosurgery residency programs and those in all other programs. A) Mean number of citations per publication per intern. B) Mean impact factor per publication per intern. *P<0.05, ns denotes P>0.05. Statistical significance was determined by a Welch two-sample t test. Mean ± SEM are shown on the graphs.
Among the 190 interns, 36 matched into a residency program at the same institution as their medical school (home match). Of these 36 home-matching interns, 21 were at top-25 programs and 15 were not. Home-matching interns made up a significantly greater proportion of interns at top-25 programs (21 of 70 interns) than at non-top-25 programs (15 of 105 interns) (χ² (1, N = 190) = 8.8, P < 0.01).
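For reference, this home-match comparison is a chi-square test of independence on a 2 × 2 contingency table; the following is a minimal SciPy sketch with illustrative placeholder counts rather than the exact study data.

```python
# Minimal sketch of a chi-square test of independence on a 2x2 home-match
# contingency table. Counts are illustrative placeholders.
import numpy as np
from scipy import stats

#                     home match   no home match
observed = np.array([[20,          50],     # top-25 programs (illustrative)
                     [15,         100]])    # all other programs (illustrative)

# chi2_contingency applies the Yates continuity correction by default for
# 2x2 tables; pass correction=False for the uncorrected Pearson statistic.
chi2, p_value, dof, expected = stats.chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, dof = {dof}, P = {p_value:.4f}")
```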
Discussion
This study sought to determine the relationship between neurosurgery applicants' research publication productivity and match outcome by leveraging publicly available publication data. The major finding is that the average number of publications in the 2018–2019 neurosurgery intern class was 6.0 ± 0.6. Further, interns at top-25 programs had a significantly greater total number of publications, number of neuroscience-related publications, and mean number of citations per publication. Intriguingly, no statistically significant differences were found in the number of first/last-author publications, original research articles, review articles, case reports, or other articles, or in the mean journal impact factor, although our study may not have had sufficient power to detect small but meaningful differences in these parameters. Altogether, our data suggest that a higher number of well-cited publications is associated with matching into a top-25 neurosurgery residency program in the US.
Our results highlight several novel and informative statistics regarding the neurosurgery match process. First, the average number of publications in the 2018–2019 neurosurgery intern class (6.0 ± 0.6) is substantially lower than the 2018 self-reported average of 18.3 publications, abstracts, and conference publications. This suggests an inaccuracy in self-reported research productivity statistics, corroborating reports of inflated self-reported research metrics in the orthopedic4, emergency medicine3, and radiation oncology5 residency match processes. Second, while our results cannot establish causation, they point to a paradigm in which a greater number of well-cited publications, regardless of author position or publication type, is associated with matching into a top-25 program.
Certain neurosurgery residencies may exhibit a preference for applicants from their own institutions. To examine this phenomenon, we computed the number of applicants who matched into a residency program at the same institution as their medical school (home match). Home-matching interns made up a significantly greater proportion of interns at top-25 programs (21 of 70 interns) than at non-top-25 programs (15 of 105 interns) (χ² (1, N = 190) = 8.8, P < 0.01). Several explanations may account for this result. First, residency program directors at top-25 programs may place greater value on the medical education and clinical training received at their own institution relative to other institutions. Alternatively, residency directors at top-25 programs may place greater emphasis on establishing prior relationships with match candidates, a task that is inherently easier with home-matching applicants. Third, medical students at top-25 institutions may be more willing to stay at their institution for residency. Lastly, this may be a spurious result arising from our analysis of only a single year of residency match data. More work is needed to elucidate the factors underlying this pattern.
Several limitations exist in this study. First, we did not account for applicants with PhD degrees or those who took gap years before or during medical school, both of which allow more dedicated research time and, consequently, higher publication output. Second, while publication volume is a convenient metric for quickly assessing research productivity, the total number of publications does not necessarily reflect the quality or depth of the research experience. Indeed, meaningful, high-quality research investigations, such as wet-lab basic science studies, often require a substantial amount of time to complete. Third, our study did not examine USMLE scores, clerkship grades, sub-internship performance, or aspects of personal character (dedication, professionalism, ability to work in a team) that also play important roles in the selection of neurosurgery candidates. We chose to focus on publication volume because of the availability of data in public databases amenable to quantitative study. While research productivity is only one component of the assessment of neurosurgery applicants, and other personal and academic traits are crucial to the development of a successful neurosurgeon, research productivity is generally recognized as an important aspect of a successful application. The present results confirm the importance of research productivity, as measured by publication metrics, for matching into a top-25 neurosurgery residency program. Lastly, data in the present study were drawn from a single class of interns. In spite of these limitations, this study provides a baseline measure of research productivity among neurosurgery interns and demonstrates that increased publication volume is associated with matching into a top-25 neurosurgery residency program. Future studies should consider several years of data and include characteristics beyond publication volume to increase generalizability and minimize residual confounding. Such data-driven analyses will ultimately provide a better understanding of the factors that influence neurosurgery residency matching.
Conclusion
A higher volume of well-cited publications is associated with matching into top-ranked US neurosurgery residency programs, underscoring the importance of research productivity in the neurosurgery residency application process. Future studies should continue to examine other factors that may influence neurosurgery residency match outcomes, ultimately providing an evidence-based approach to better understand the qualities that make a compelling neurosurgery residency application.
Supplementary Material
Acknowledgements:
We would like to thank Dr. Eyiyemisi Damisah and Dr. Michael Apuzzo for their helpful comments on the manuscript.
Funding: This work was supported by the NIH Medical Scientist Training Program Training Grant T32GM007205 to PQD. The authors report no personal or institutional financial interests.
Abbreviations list:
- NIH
National Institutes of Health
- ACGME
Accreditation Council for Graduate Medical Education
Footnotes
Disclosure: The authors do not report any personal or institutional financial interest.
References
- 1.National Resident Matching Program. Results of the 2018 NRMP Program Director Survey. https://www.nrmp.org/wp-content/uploads/2018/07/NRMP-2018-Program-Director-Survey-for-WWW.pdf; 2018. Accessed 24 February 2019.
- 2. Dossani RH, Adeeb N, Tumialán LM. Commentary: Trends in the National Resident Matching Program (NRMP) Data for Graduating US Medical Students Matching in Neurosurgery. Neurosurgery. 2018;83(2):E65–70. doi:10.1093/neuros/nyy181.
- 3. Katz ED, Shockley L, Kass L, et al. Identifying inaccuracies on emergency medicine residency applications. BMC Med Educ. 2005;5:30. doi:10.1186/1472-6920-5-30.
- 4. Konstantakos EK, Laughlin RT, Markert RJ, Crosby LA. Follow-up on misrepresentation of research activity by orthopaedic residency applicants: has anything changed? J Bone Joint Surg Am. 2007;89(9):2084–2088. doi:10.2106/JBJS.G.00567.
- 5. Yang GY, Schoenwetter MF, Wagner TD, Donohue KA, Kuettel MR. Misrepresentation of publications among radiation oncology residency applicants. J Am Coll Radiol. 2006;3(4):259–264. doi:10.1016/j.jacr.2005.12.001.
- 6. Hsi RS, Hotaling JM, Moore TN, Joyner BD. Publication misrepresentation among urology residency applicants. World J Urol. 2013;31(3):697–702. doi:10.1007/s00345-012-0895-0.
- 7. Dale JA, Schmitt CM, Crosby LA. Misrepresentation of research criteria by orthopaedic residency applicants. J Bone Joint Surg Am. 1999;81(12):1679–1681. doi:10.2106/00004623-200010000-00028.
- 8. Campbell ST, Gupta R, Avedian RS. The effect of applicant publication volume on the orthopaedic residency match. J Surg Educ. 2016;73(3):490–495. doi:10.1016/j.jsurg.2015.11.011.