SAGE Open Medicine
. 2023 May 29;11:20503121231177765. doi: 10.1177/20503121231177765

Assessing academic productivity of U.S. otolaryngology departments using the h(5) index

Deesha Desai 1, Philip J Grosse 2, Carl H Snyderman 3
PMCID: PMC10240857  PMID: 37284570

Abstract

Objectives:

We aim to examine the h(5) index of U.S. otolaryngology programs to help assess current academic productivity.

Methods:

A total of 116 otolaryngology departments with residency programs were included. Our primary outcome was the h(5) index, calculated cumulatively for faculty MDs, DOs, and PhDs within the department. Audiologists and clinical adjunct faculty were excluded. This was calculated over a 5-year period (2015–2019) using Elsevier’s database SCOPUS. Faculty affiliation within SCOPUS was confirmed by cross-referencing department websites. The h(5) indices were calculated and then correlated with other publication metrics, including total publications by department and publications in major otolaryngology journals.

Results:

The h(5) index was highly correlated positively with other metrics of academic productivity, including total publications and publications in top 10 otolaryngology journals. Greater variability in data was noted as the h(5) index increased. Similar trends were observed when the h(5) was compared to the number of residents accepted per year. Rankings of departments by Doximity and US News and World Report were positively correlated with h(5) though they remained weaker when compared to other correlations.

Conclusions:

h(5) indices are a valuable tool to objectively assess academic productivity for otolaryngology residency departments. They are a better indicator of academic productivity than national rankings.

Keywords: h(5) index, academic productivity, otolaryngology programs, social media, residency training

Introduction

Academic productivity is considered a benchmark measure for those considering a future career in research or academia.1–4 The academic productivity of a department provides a broader assessment of the overall scholarly activity of a program than any individual measure. Currently, however, there are relatively few objective measures to evaluate academic productivity within the field of otolaryngology.5–7

Departments tend to determine academic productivity through the use of numerous bibliometric measures, such as total number of publications or citation count. 8 As stand-alone measures, these parameters fail to differentiate the varying impact of articles and do not speak to the true productivity of an author, as they can easily be self-inflated.9–12 The h-index, on the other hand, measures both the productivity and the citation impact of a scholar’s publications. As described by Hirsch, 13 who introduced the metric in 2005, the h-index is the number of an author’s articles, h, that have at least h citations each. This concept can be applied collectively to publications originating from a department. 14 It remains one of the most objective measures for evaluating both the scientific impact and the contribution of an author and/or an entire department.5,10,15
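Hirsch’s definition above translates directly into a short computation over a list of per-article citation counts. The following is a minimal illustrative sketch in Python; it is not the authors’ SCOPUS workflow, only a demonstration of the definition:

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least h articles
    have at least h citations each (Hirsch, 2005)."""
    # Sort citation counts in descending order, then keep advancing while
    # the count at 1-based rank r is still >= r.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical author with five papers cited 10, 8, 5, 4, and 3 times:
# four papers have at least 4 citations, but not five with at least 5.
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

Applied cumulatively to every article published by a department’s faculty within a five-year window, the same computation yields the h(5) used in this study.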

The aim of this study is to measure the h-index calculated over a 5-year period, termed h(5), of U.S. otolaryngology programs to objectively assess recent academic productivity.

Methods

This study consisted primarily of data analysis of academic otolaryngology departments with residency programs. U.S. otolaryngology departments, including those in Puerto Rico, were analyzed. The list of residency programs for the United States and the number of first-year residency positions were obtained from the Fellowship and Residency Electronic Interactive Database (FREIDA).

Our primary bibliometric outcome was the h-index calculated over a 5-year period (2015–2019), hence termed h(5). This was obtained using Elsevier’s database SCOPUS. The h(5) was calculated cumulatively for faculty within each institution’s department. Otolaryngology faculty included those with MD, MBBS, DO, and PhD degrees listed on individual department websites as of 2021. Audiologists, nurse practitioners, speech pathologists, physician associates, and clinical adjunct faculty were excluded from the analysis. If a faculty member was not listed on the individual department website, they were generally excluded from the analysis.

If authors with similar names appeared in the SCOPUS search, department affiliations (e.g., institution or city) on SCOPUS were evaluated and cross-referenced with department websites and the journals in which the faculty member published. If neither criterion aligned, the faculty member was excluded in an effort to ensure that the correct otolaryngology faculty member had been included.

The SCOPUS database was used to obtain other publication metrics, including total publications by department and total publications in the top 10 journals (determined by Journal Citation Reports, Otorhinolaryngology), all within the same time frame (2015–2019). Publications were sorted and filtered within SCOPUS to avoid duplicate inclusions. Doximity’s Research Output ranking for otolaryngology was used to assess each residency program’s research output score. US News and World Report (USNWR) directly provided its 2015–2019 data on “Best Hospitals for Ear, Nose, and Throat”; the average over this span was calculated and used in the analysis. Rankings for Doximity and USNWR were reversed so that higher academic productivity would correspond positively with the highest-rated programs.

This study was reviewed by the University of Pittsburgh Institutional Review Board and received exemption as it did not involve human subjects.

Statistical analysis

Spearman rank correlation (rho) was used to assess the correlation between variables. 16 One outlier, Mass Eye and Ear/Harvard, was excluded from the production of all graphs to better indicate trends, as it has multiple associated academic programs, including Brigham and Children’s Hospitals. However, it remained included in the overall correlation calculations.
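Spearman’s rho is simply the Pearson correlation computed on the ranks of the two variables, which is why it suits ordinal data such as program rankings. A self-contained sketch with made-up h(5) and publication counts (illustrative only; the study’s actual computation was performed with statistical software):

```python
from statistics import mean

def ranks(values):
    """Assign 1-based ranks to values, averaging ranks across ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        # Find the run of tied values starting at position i.
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    mx, my = mean(rx), mean(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical data: publication counts rise monotonically with h(5),
# so the rank correlation is exactly 1.0 regardless of the raw values.
h5 = [5, 12, 18, 27, 44]
pubs = [30, 90, 150, 260, 800]
print(spearman_rho(h5, pubs))  # -> 1.0
```

Because rho depends only on ranks, the reversal of the Doximity and USNWR rankings described in the Methods (so that rank 1 becomes the largest value) flips only the sign of the correlation, allowing the best-rated programs to correlate positively with h(5).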

Results

A total of 123 programs were initially gathered from FREIDA. Of these, seven institutions lacked faculty listings and were excluded from the analysis. A total of 116 otolaryngology departments with residency programs were included. The h(5) indices of the programs were calculated and are listed in Table 1. Correlations across all institutions were calculated. The h(5) had strong, positive correlations with other metrics of academic productivity and with the number of residents. When compared against the Doximity Research Output and USNWR rankings, positive but weaker correlations were observed (Table 2).

Table 1.

h(5) indices.

ENT programs/residencies
h(5) (2015–2019) index range: 41–80 (listed alphabetically)
 Emory University School of Medicine Program
 Icahn School of Medicine at Mount Sinai/New York Eye and Ear Infirmary
 Johns Hopkins University Program
 Mass Eye and Ear Infirmary/Harvard
 New York Presbyterian Hospital (Columbia and Cornell Campus) Program
 University of California (San Francisco) Program
 University of Michigan Health System Program
 University of Pennsylvania Health System Program
 UPMC Medical Education (Pittsburgh) Program
 Stanford Health Care
h(5) (2015–2019) index range: 31–40 (listed alphabetically)
 Indiana University School of Medicine Program
 Medical University of South Carolina Program
 NYU Grossman School of Medicine Program
 Ohio State University Hospital Program
 University of California (San Diego) Medical Center Program
 University of Chicago Program
 University of Cincinnati Medical Center/College of Medicine Program
 University of Colorado Program
 University of Texas Southwestern Medical Center Program
 University of Utah Health Program
 University of Washington Program
 Vanderbilt University Medical Center Program
 Washington University/B-JH/SCLH Consortium Program
h(5) (2015–2019) index range: 21–30 (listed alphabetically)
 Baylor College of Medicine Program
 Case Western Reserve/University Hospitals Cleveland Medical Center Program
 Cedars-Sinai Medical Center Program
 Cleveland Clinic Foundation Program
 Dartmouth Hitchcock/Mary Hitchcock Memorial Hospital Program
 Duke University Hospital Program
 Mayo Clinic Minnesota
 McGaw Medical Center of Northwestern University Program
 Medical College of Wisconsin Affiliated Hospitals Program
 Oregon Health and Science University Program
 Rutgers New Jersey Medical School Program
 Sidney Kimmel Medical College at Thomas Jefferson University/TJUH Program
 SUNY Downstate
 UCLA David Geffen School of Medicine/UCLA Medical Center Program
 University of Alabama Medical Center Program
 University of Arizona College of Medicine—Tucson Program
 University of California (Irvine) Program
 University of California Davis Health Program
 University of Iowa Hospitals and Clinics Program
 University of Kansas School of Medicine Program
 University of Miami/Jackson Health System Program
 University of Minnesota Program
 University of North Carolina Hospitals Program
 University of Southern California Program
 University of Tennessee Program
 University of Wisconsin Hospitals and Clinics Program
 Wayne State University School of Medicine Program
 Yale New Haven Medical Center Program
h(5) (2015–2019) index range: 11–20 (listed alphabetically)
 Ascension Macomb-Oakland Hospital Program
 Boston University Medical Center Program
 Eastern Virginia Medical School Program
 Geisinger Health System Program
 George Washington University Program
 Henry Ford Hospital
 Kaiser Permanente Northern California Program
 Loma Linda University Health Education Consortium Program
 Louisiana State University (Shreveport) Program
 Louisiana State University Program
 Mayo Clinic College of Medicine and Science (Arizona) Program
 Medical College of Georgia Program
 MedStar Health/Georgetown University Hospital Program
 Montefiore Medical Center/Albert Einstein College of Medicine Program
 Oklahoma State University Center for Health Sciences Program
 Penn State Milton S Hershey Medical Center Program
 Rush University Medical Center Program
 Southern Illinois University Program
 St. Louis University School of Medicine Program
 Temple University Hospital Program
 Texas Tech University Health Sciences Center at Lubbock Program
 Tripler Army Medical Center Program
 Tulane University Program
 University at Buffalo Program
 University of Arkansas for Medical Sciences Program
 University of Connecticut Program
 University of Florida Program
 University of Kentucky College of Medicine Program
 University of Louisville School of Medicine Program
 University of Maryland Program
 University of Mississippi Medical Center Program
 University of Missouri Program
 University of Nebraska Medical Center College of Medicine Program
 University of Oklahoma Health Sciences Center Program
 University of Rochester Program
 University of South Florida Morsani Program
 University of Texas Health Science Center at Houston Program
 University of Texas Medical Branch Hospitals Program
 University of Virginia Medical Center Program
 Virginia Commonwealth University Health System Program
 Wake Forest University School of Medicine Program
 West Virginia University Program
 Western Reserve Hospital Program
 Zucker School of Medicine at Hofstra/Northwell at Lenox Hill Hospital Program
h(5) (2015–2019) index range: 0–10 (listed alphabetically)
 Albany Medical Center Program
 Baylor Scott and White Medical Center (Temple) Program
 Beaumont Health (Farmington Hills) Program
 Cooper Hospital—University Medical Center Program
 Detroit Medical Center Corporation Program
 Henry Ford Macomb Hospital/Lakeshore ENT
 Kettering Health Network Program
 Loyola University Medical Center Program
 McLaren Health Care/Oakland/MSU Program
 Ohio Health/Doctors Hospital Program
 Philadelphia College of Osteopathic Medicine Program
 St. Elizabeth Boardman Hospital Program
 Stony Brook Medicine Program
 SUNY Upstate Medical University Program
 Tufts Medical Center Program
 University of Illinois College of Medicine at Chicago Program
 University of Nevada Las Vegas School of Medicine Program
 University of Puerto Rico Program
 University of Texas Health Science Center San Antonio Program
 University of Vermont Medical Center Program
 UPMC Medical Education (Erie) Program

Table 2.

Overall correlations.

Covariate: correlation coefficient with h(5) index (p-value)
Total publications: 0.947** (p < 0.001)
Publications in top 10 journals: 0.925** (p < 0.001)
Number of residents: 0.730** (p < 0.001)
Doximity, research output: 0.767** (p = 0.000)
US News and World Report: 0.495** (p < 0.001)

**p ≤ 0.001.

The h(5) was significantly correlated with the total number of publications and total publications in top 10 journals (Figures 1 and 2). Greater variability in the data was noted at larger h(5) indices. Additionally, a larger h(5) had a positive correlation with the number of residents accepted per year, with similar variability seen (Figure 3). Furthermore, the positive but weaker correlations of h(5) with the Doximity Research Output and USNWR rankings were highlighted. Here, the variability remained consistent as h(5) increased (Figures 4 and 5).

Figure 1. Total publications versus h(5).

Figure 2. Publications in top 10 journals versus h(5).

Figure 3. Number of residents per year versus h(5).

Figure 4. Doximity, research output versus h(5).

Figure 5. USNWR’s rank versus h(5).

Discussion

Currently, individuals and departments assess academic productivity using a variety of bibliometric measures and sources, including citation count, total number of publications, Doximity, American Medical Association resources, and USNWR. However, according to a recent study, 56% of respondents believed that Doximity may not be accurate, indicating that these resources come with limitations. 17

Doximity is one of the most widely used and accessible sites for healthcare professionals, as it allows individuals to search programs based on reputation, research output, size, and percent who subspecialize. 18 The research output score for each program is based on its recent alumni base; it is calculated from the collective h-index of publications authored by alumni graduating within the past 15 years. 19 A significant shortcoming of this approach, however, is that it does not provide perspective on the current academic productivity of departments, as alumni do not necessarily remain at their training institutions, and 15 years is too long to accurately capture present-day program productivity. Furthermore, alumni may practice in nonacademic settings; thus, this metric does not accurately reflect current academic productivity.

Additionally, USNWR rates hospitals based on their clinical care performance. It identifies medical centers in various specialties that are best suited to patients whose illnesses pose unusual challenges due to underlying conditions, procedure difficulty, advanced age, or other medical concerns that increase risk. 20 Although these clinical factors are important, they do not reflect research productivity. This is exemplified by the weaker correlation between h(5) and USNWR rankings.

In comparison, the h-index carries significant advantages. Not only does it provide insight into an individual author, but it can also be used to assess entire journals or departments. 15 Furthermore, unlike other measures of productivity, the h-index is a reproducible measure that remains robust to outliers and is not skewed by a single popular article.21,22 In this study, h-index was correlated with both the total number of departmental publications and publications in top 10 journals. This suggests that the h(5) may be more advantageous than single-number metrics by having the ability to combine both output and impact and help provide an objective, robust alternative to currently existing metrics that only provide insight into output or impact individually. 23

However, this metric also presents limitations. For instance, the h-index does not differentiate between various types of publications, such as a review versus an original research article. 24 Additionally, it does not consider differences in authorship contribution: first or senior authors are not differentiated from other authors on the publication, as all contributions are weighted equally. Although this assumption may not be true in all cases, no widely used objective measure of academic productivity currently considers an author’s individual contributions to a publication. 25 Additionally, departments with a higher percentage of research faculty (PhD degrees) may elevate the h(5) due to publications in journals with greater impact factors. Similarly, h(5) does not account for department size. As seen with our outlier, programs with larger numbers of faculty members may elevate the h(5). Finally, it is important to stress that the h(5) calculations in this article were dependent on accurate representation of faculty and residents on individual department websites.

The h-index and academic productivity are simply a few of the multiple indicators of academic standing of a department. Other variables that could be considered include non-peer-reviewed publications such as textbooks, extramural research funding (NIH, PCORI, SBIR, etc.), leadership roles in professional societies, courses offered at the institution, presentations at national meetings, and patents. However, this information is not readily available. The academic community would benefit from establishing key metrics that could be self-reported by departments on an annual basis to provide greater transparency into scholarly activity.

Overall, the h-index uniquely offers an objective metric to evaluate academic productivity, and its advantages outweigh any potential disadvantages. By allowing evaluation of a department’s academic productivity as a whole, h(5) provides insight into institutions’ current productivity. Although it correlates well with other measures of productivity, it does not suffer from the subjectivity and bias of reputation rankings or influence of clinical factors. It can be updated annually and may thus provide a more current representation than factors with longer time courses.

This study is not without limitations. Due to the disparate research activities of many individuals and possible submissions to broader journals rather than otolaryngology-specific journals, the “publications in top 10 journals” metric may not fully capture this cumulative work, although otolaryngology journals frequently include basic science work. Furthermore, faculty members may have shifted between departments during the 2015–2019 period. With the methods described, it would be challenging to verify faculty timelines within their departments. However, in the event of overlap, the work of faculty members would contribute to the overall academic productivity of both the new and the previous department. Lastly, a power analysis for sample size calculation was not performed for this study.

Conclusion

The h(5) offers an objective measure of academic productivity. This metric can be used to provide a current perspective of scholarly activity at academic otolaryngology departments and is easily updated using available data.

Acknowledgments

We would like to acknowledge CTSI at the University of Pittsburgh for their assistance with statistical work.

Footnotes

Previous presentation: This article was presented at the American College of Surgeons’ Virtual Clinical Congress, October 23–27, 2021.

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding: The author(s) received no financial support for the research, authorship, and/or publication of this article.

Ethics approval: This study was reviewed by the University of Pittsburgh Institutional Review Board and granted an exemption, as it did not involve human subjects (IRB #2106003).

ORCID iD: Carl H Snyderman https://orcid.org/0000-0002-8920-4522

References

  • 1.Hill RG, Boeckermann LM, Huwyler C, et al. Academic and gender differences among U.S. Otolaryngology Board Members. Laryngoscope 2021; 131(4): 731–736. [DOI] [PubMed] [Google Scholar]
  • 2.Kohlert S, Zuccaro L, McLean L, et al. Does medical school research productivity predict a resident’s research productivity during residency? J Otolaryngol Head Neck Surg 2017; 46(1): 34. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 3.Oleck NC, Gala Z, Weisberger JS, et al. Relevance of academic productivity in the assessment of integrated plastic surgery applicants. J Surg Educ 2020; 77(6): 1429–1439. [DOI] [PubMed] [Google Scholar]
  • 4.Campbell PG, Awe OO, Maltenfort MG, et al. Medical school and residency influence on choice of an academic career and academic productivity among neurosurgery faculty in the United States. J Neurosurg 2011; 115(2): 380–386. [DOI] [PubMed] [Google Scholar]
  • 5.Hirsch JE.Does the H index have predictive power? Proc Natl Acad Sci U S A 2007; 104(49): 19193–19198. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Thangamathesvaran L, Patel NM, Siddiqui SH, et al. The otolaryngology match: a bibliometric analysis of 222 first-year residents. Laryngoscope 2019; 129(7): 1561–1566. [DOI] [PubMed] [Google Scholar]
  • 7.Aoun SG, Bendok BR, Rahme RJ, et al. Standardizing the evaluation of scientific and academic performance in neurosurgery–critical review of the “h” index and its variants. World Neurosurg 2013; 80(5): e85–e90. [DOI] [PubMed] [Google Scholar]
  • 8.Carpenter CR, Cone DC, Sarli CC.Using publication metrics to highlight academic productivity and research impact. Acad Emerg Med 2014; 21(10): 1160–1172. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Sarkiss CA, Riley KJ, Hernandez CM, et al. Academic productivity of US neurosurgery residents as measured by h-index: program ranking with correlation to faculty productivity. Neurosurgery 2017; 80(6): 975–984. [DOI] [PubMed] [Google Scholar]
  • 10.Svider PF, Choudhry ZA, Choudhry OJ, et al. The use of the h-index in academic otolaryngology. Laryngoscope 2013; 123(1): 103–106. [DOI] [PubMed] [Google Scholar]
  • 11.Choudhri AF, Siddiqui A, Khan NR, et al. Understanding bibliometric parameters and analysis. Radiographics 2015; 35(3): 736–746. [DOI] [PubMed] [Google Scholar]
  • 12.Lee J, Kraus KL, Couldwell WT.Use of the h index in neurosurgery. J Neurosurg 2009; 111(2): 387–392. [DOI] [PubMed] [Google Scholar]
  • 13.Hirsch JE.An index to quantify an individual’s scientific research output. Proc Natl Acad Sci U S A 2005; 102(46): 16569–16572. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Ponce FA, Lozano AM.Academic impact and rankings of American and Canadian neurosurgical departments as assessed using the h index. J Neurosurg 2010; 113(3): 447–457. [DOI] [PubMed] [Google Scholar]
  • 15.Jones T, Huggett S, Kamalski J.Finding a way through the scientific literature: indexes and measures. World Neurosurg 2011; 76(1–2): 36–38. [DOI] [PubMed] [Google Scholar]
  • 16.Dodge Y.Spearman rank correlation coefficient. In: Dodge Y. (ed.) The concise encyclopedia of statistics. New York, NY: Springer, 2008, pp. 502–505. [Google Scholar]
  • 17.Smith BB, Long TR, Tooley AA, et al. Impact of doximity residency navigator on graduate medical education recruitment. Mayo Clin Proc Innov Qual Outcomes 2018; 2(2): 113–118. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Peterson WJ, Hopson LR, Khandelwal S, et al. Impact of doximity residency rankings on emergency medicine applicant rank lists. West J Emerg Med 2016; 17(3): 350–354. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19.Doximity, Inc. Doximity residency navigator. 3MD Communications, Inc. San Francisco, CA. [Google Scholar]
  • 20.Olmsted MG, Powell R, Murphy J, et al. Methodology: U.S. News & World Report Best Hospitals: specialty Rankings. U.S. News & World Report, LP, Washington, DC. [Google Scholar]
  • 21.Lando T, Bertoli-Barsotti L.A new bibliometric index based on the shape of the citation distribution. PLoS One 2014; 9(12): e115962. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22.Ruscio J.Taking advantage of citation measures of scholarly impact: hip hip h index! Perspect Psychol Sci 2016; 11(6): 905–908. [DOI] [PubMed] [Google Scholar]
  • 23.Agarwal A, Durairajanayagam D, Tatagari S, et al. Bibliometrics: tracking research impact by selecting the appropriate metrics. Asian J Androl 2016; 18(2): 296–309. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24.Dinis-Oliveira RJ.The H-index in life and health sciences: advantages, drawbacks and challenging opportunities. Curr Drug Res Rev 2019; 11(2): 82–84. [DOI] [PubMed] [Google Scholar]
  • 25.Tscharntke T, Hochberg ME, Rand TA, et al. Author sequence and credit for contributions in multiauthored publications. PLoS Biol 2007; 5(1): e18. [DOI] [PMC free article] [PubMed] [Google Scholar]

