One of the best parts of being a journal editor is talking with authors, reviewers, and readers. Face-to-face discussions with early-career scientists are especially interesting and enlightening.
In June we gave “Getting Published” presentations to two groups at the Genetics Society of America’s (GSA) International Caenorhabditis elegans Meeting at UCLA. The attendees were mainly graduate students, peppered with some postdocs and a few principal investigators. A segment of our talk focused on the many ways of understanding the short- and long-term influence, impact, and value of their research. We also encouraged them to write as scientific communicators with a story to tell.
After each session, we sat down with several attendees who wanted to chat. One graduate student working in Asia explained that a requirement for earning her Ph.D. is to publish a first-author paper in a journal with a Journal Impact Factor (JIF) greater than five. We asked whether she would consider other journals that might be below that arbitrary bar. She shrugged. With funding so competitive, her advisor feels he has no choice but to think of the JIF. Another student, also working in Asia, said she could publish in journals with JIFs below five, but earning her Ph.D. would require at least two papers in journals whose JIFs add up to at least five.
Disconcerting, but hardly a surprise, particularly for Asia. Publishing in high-impact journals is used by the Center for World-Class Universities (CWCU) in China as part of its ranking of world universities.1 Some institutions even directly tie the JIF to monetary rewards — which can exceed $30,000 — for (typically first) authors.2,3,4
The simple fact is that statistics and impact factors can make or break scientists vying for grants, career promotions, and doctoral degrees, no matter where they are located.5
“What’s your impact factor?” is often the first question we’re asked at conferences. “Why are you asking?” is our sincere response. The subsequent conversations about the pressure young scientists experience are both candid and disconcerting. Some scientists include the JIF for each publication listed in their CVs. Many are looking for a number above a certain threshold. Others seem sheepish: “I have to make sure...” followed by words like “because career” or “I'm still a grad student.”
For those unfamiliar with a journal or trying to place a journal in the pecking order, the JIF is sometimes mistakenly used as a proxy for journal quality, journal prestige, article quality, and author prestige.
But as a measure of a journal’s quality, the JIF is limited. As a measure of a particular paper or author, it is meaningless. When it is used as a shortcut to determine whether or not an author will earn a Ph.D., be awarded a grant, or earn tenure, it's just plain ridiculous.
We realize that some authors submit papers to a journal regardless of its JIF, but most students – especially international ones – say they can’t ignore JIFs or the more euphemistic notions of journal “tier” or “prestige.” Unfortunately, this means that countless journals (including many well-regarded society-sponsored ones) are off the table.
Are stories like these just outliers, merely anecdotes? How important is the JIF to the genetics community?
A recent GSA survey of the community revealed that the Journal Impact Factor is the #2 reason authors choose a journal. “Fit” — the ability to reach the right audience — is #1. But when an author is up for promotion or tenure, the JIF becomes the #1 factor in the decision of where to submit. The Nature Publishing Group/Palgrave Macmillan Author Insights 2015 Survey6 reported the JIF as “very important or quite important” to 90% of respondents when deciding where to submit a paper. The cohort of respondents from China rated journal reputation and impact factor higher than did respondents from the rest of the world.
In fact, the reputation of the journal was the most important consideration to 97% of respondents. The top-reported component of a journal’s reputation? Its impact factor!
Our angst is no secret. We lament the misuse of this journal metric as a proxy for article quality, importance, and influence. We worry because the JIF is frequently used as an indicator of the impact of individual scientists and their work. Scientists, editors, and publishers take to blogs,7 journals,8 social media, and interview forums9 to discuss the deficiencies, abuse, and dire implications of the impact factor. Many have explained in well-articulated, thoughtful detail the ways the metric has warped science, from driving the topics of research to picking journal club articles.10 We rail that things must change, that we should ignore the impact factor, that all metrics are reductive, that it's become survival of the fittest.11
We wonder why the JIF is calculated to three decimal places, but then realize that besides making journals easier to rank, a number calculated to 1/1000 feels somehow more legitimate, more precise12 – more scientific – than a mere whole number. We are just plain sick of impact factors.13
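For readers who have never seen the calculation spelled out, all of that precision is attached to a simple two-year, journal-level ratio (as Thomson Reuters describes it14); the 2014 figure, for example, is

\[
\mathrm{JIF}_{2014} = \frac{\text{citations received in 2014 by items the journal published in 2012 and 2013}}{\text{number of citable items (articles and reviews) the journal published in 2012 and 2013}}
\]

It is, in other words, the mean of a highly skewed citation distribution: a journal-level average that says nothing about any individual paper, let alone its authors.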
Even Thomson Reuters, the company that calculates JIFs and publishes Journal Citation Reports®, has written cautionary notes explaining the JIF and its intended uses.14 Eugene Garfield, who in 1955 proposed the idea that led to the official impact factor a decade later, seems to realize his invention has gone rogue.15
OK, so we can’t blame the overuse, misuse, misinterpretation, and worship of the JIF on a lack of data, opinions, and analyses of the impact factor and its discontents. Then what’s keeping the JIF alive?
Still, we have reasons to be optimistic. Many of us came together to sign the San Francisco Declaration on Research Assessment (DORA), which has significantly raised awareness of this issue. Alternative metrics (“altmetrics”) and tools such as Mendeley, Altmetric, F1000, Impact Story, Plum Analytics, new types of publishing platforms, and others that we have surely missed here seek to provide new, immediate data on how research is accessed, discussed, and used.
Diverse groups are working toward widespread refinement, acceptance, and use of complementary assessment methods and metrics. NISO is developing new standards for using alternative metrics to assess research outcomes.* The Leiden Manifesto for Research Metrics sets out ten principles to guide research evaluation. In October, Bruce I. Hutchins and NIH colleagues posted on bioRxiv a white paper unveiling the Relative Citation Ratio (RCR), a promising new metric that examines article-level influence (sketched below).16 A new format of the NIH biosketch17 directs the focus of reviewers toward researcher accomplishments. Editors propose practical reforms for curing Impact Factor Mania.18 Progress!
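In rough outline (the preprint16 gives the actual derivation, which involves a regression over a benchmark set of NIH-funded papers), the RCR compares an article's own citation rate with the rate expected for its field, where the field is defined by the article's co-citation network rather than by the journal it appears in:

\[
\mathrm{RCR} = \frac{\mathrm{ACR}}{\mathrm{ECR}}
\]

Here ACR is the article's citations per year and ECR is the citations per year expected of papers in a comparable co-citation neighborhood, scaled so that NIH-funded papers average an RCR of about 1.0. The detail that matters for this discussion is that the denominator belongs to the article, not to the venue that published it.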
But the JIF continues to play an important — perhaps still the most important — role in authors’ choice of where to submit papers.
The 2014 JIFs were released in June. Our instincts tell us to ignore them. We want to ignore them. But many scientists worldwide don’t have the luxury of ignoring them. Not yet, anyway.
This year, we could tell those graduate students we talked with at the C. elegans meeting that yes, they could now submit their best work to GENETICS, because our JIF now exceeds their arbitrary threshold of five and our journal is therefore in play for them. They were glad. We were glad. Then, we talked science.
Is GENETICS a “better” journal because its JIF crossed an arbitrary line in the sand? We don’t think so. But some people in a position to judge those students, and to judge others competing for grants and promotion — people who have great influence over the course of their careers — seem to think so.
GENETICS will stay true to its mission of offering a high-quality, selective, innovative platform for publishing our colleagues’ stories. We will continue to strive to be fair and prompt in our review process. We will keep helping authors increase the accessibility and intellectual impact of their papers in both the short and long term. We see ourselves as author advocates, and as it turns out, so do many of our authors.
None of us can change the ecosystem ourselves, nor can it be changed quickly. None of us in science can, at this time, force hiring, promotion, and grant review committees, here or anywhere in the international community, to jettison the JIF. But we can encourage and embolden change from within.
If nothing else we wrote resonated with you, please remember this: behind the JIF data and analyses, behind the promotion, tenure, and grant review committees, behind the numbers and the relentless pressure to publish in prestige journals — stand real scientists with real stories and struggles. They are trying in earnest to do good science and to progress in their careers. Why must they also cross an arbitrary numerical line in the sand?
Footnotes
*Disclosure: TAD is a member of NISO Altmetrics Working Group A.
Literature Cited
- 1. Academic Ranking of World Universities 2015: press release. Center for World-Class Universities at Shanghai Jiao Tong University; August 15, 2015. Available at: http://www.shanghairanking.com/Academic-Ranking-of-World-Universities-2015-Press-Release.html.
- 2. Davis P., 2011. Paying for impact: does the Chinese model make sense? Available at: scholarlykitchen.sspnet.org/2011/04/07/paying-for-impact-does-the-chinese-model-make-sense/.
- 3. Shao J., Shen H., 2011. The outflow of academic papers from China: why is it happening and can it be stemmed? Learned Publishing 24: 95–97.
- 4. Tatlow D. K., 2015. A scientific ethical divide between China and West. Available at: http://www.nytimes.com/2015/06/30/science/a-scientific-ethical-divide-between-china-and-west.html?_r=0.
- 5. The Economist, 2013. Looks good on paper: a flawed system for judging research is leading to academic fraud. Available at: http://www.economist.com/news/china/21586845-flawed-system-judging-research-leading-academic-fraud-looks-good-paper.
- 6. Nature Publishing Group, 2015. Author Insights 2015 survey. figshare. Available at: http://figshare.com/articles/Author_Insights_2015_survey/1425362.
- 7. Graham B., 2015. Impact factors and academic careers: insights from a postdoc perspective. Available at: http://blogs.biomedcentral.com/bmcblog/2015/06/17/impact-factors-academic-careers/.
- 8. Pulverer B., 2013. Impact fact-or-fiction? The EMBO Journal 32: 1651–1652.
- 9. Dawson S., 2014. Article vs journal impact – perspective from PLoS One Editorial Director Damian Pattinson. Available at: http://blog.scienceopen.com/2014/07/article-vs-journal-impact/.
- 10. Drugmonkey, 2011. Putting impact factor restrictions on journal club articles is stupid and unscholarly. Available at: http://drugmonkey.scientopia.org/2011/05/04/putting-impact-factor-restrictions-on-journal-club-articles-is-stupid-and-unscholarly/.
- 11. Verma I. M., 2015. Impact, not impact factor. PNAS 112: 7875–7876.
- 12. Bertuzzi S., 2015. A false sense of precision—what happens to journal impact factor (JIF) rankings when you drop a decimal place? Available at: http://www.ascb.org/a-false-sense-of-precision-what-happens-to-journal-impact-factor-jif-rankings-when-you-drop-a-decimal-place/.
- 13. Curry S., 2012. Sick of impact factors. Reciprocal Space. Available at: http://occamstypewriter.org/scurry/2012/08/13/sick-of-impact-factors/.
- 14. Thomson Reuters Web of Science, 1994. The Thomson Reuters impact factor. Available at: http://wokinfo.com/essays/impact-factor/.
- 15. Garfield E., 2005. The agony and the ecstasy—the history and meaning of the journal impact factor. International Congress on Peer Review and Biomedical Publication, Chicago. Available at: http://www.psych.utoronto.ca/users/psy3001/files/JCR.pdf.
- 16. Hutchins B. I., Yuan X., Anderson J. M., Santangelo G. M., 2015. Relative citation ratio (RCR): a new metric that uses citation rates to measure influence at the article level. bioRxiv. DOI: 10.1101/029629. Available at: http://biorxiv.org/content/early/2015/10/22/029629.
- 17. Rockey S., 2014. Changes to the NIH Biosketch. National Institutes of Health Office of Extramural Research, Extramural Nexus (Rock Talk). Available at: https://nexus.od.nih.gov/all/2014/05/22/changes-to-the-biosketch/.
- 18. Casadevall A., Fang F. C., 2014. Causes for the persistence of impact factor mania. mBio 5(2): e00064-14.