In July 2018, the New England Journal of Medicine identified problems with randomization of study participants in the retraction of a foundational study associating the Mediterranean diet1 with prevention of cardiovascular disease.2 Although news of this retraction was widespread, the article was just 1 of the 500-600 articles retracted each year from peer-reviewed journals.3 Sustaining high-quality research in fields such as public health is important for protecting and improving human health and well-being, but scientific progress and public trust have weakened with the growing rate of article retractions4 and reports of research containing errors, omissions, or fraud.5-7
Researchers in psychology and biomedicine have been the most active in examining research quality. Studies of psychology researchers found that most admit to using at least 1 questionable research practice: a choice made during data management, analysis, or reporting that appears incorrect or dishonest and produces more favorable research findings, such as achieving statistical significance.8,9 Use of questionable research practices may have contributed to the successful replication of only 40%-60% of 100 psychology articles published during 2008.10,11 Likewise, a 2011 study found that just 14 of 67 (21%) drug studies were successfully replicated.7 Although the quality of public health research has received little direct examination, a study published in 2018 by the lead author (J.K.H.) attempted to reproduce results from 6 published articles and found the use of questionable research practices and numerous errors in reporting.5 Low-quality research has serious consequences for human health; 1 project found that 400 000 human participants were enrolled, and 70 501 were treated, in medical studies that were later retracted.12 Adopting reproducible research practices can help improve research quality.
Replication—or conducting a study again to verify its results—provides the strongest means to ensure research quality and to validate findings. Reproducing a study—or analyzing an existing data source to produce the same results—is an alternative when replication is not feasible. Reproducing research does not ensure that the original analyses were correct, but the process can improve research quality by verifying findings and identifying weaknesses, questionable research practices, and errors in data management, analyses, and reporting.
Research is reproducible when data are shared and either (1) the statistical code is shared or (2) detailed, clear, and complete instructions for analysis are accessible.13 Using reproducible research practices can increase the pace of scientific discovery, promote greater exchange of ideas and development of new collaborations among scientists, and improve the visibility of scientific contributions.14,15 Research articles with shared data or shared code are cited more often than articles without shared data or code, and articles with shared data have fewer errors than articles without shared data.15 Data sources that are shared are used in more publications than data sources that are not shared.16
Publications with the necessary elements for reproducibility appear rare in public health.5,14 Widespread use of reproducible research practices in public health will require a sustained and deliberate effort from stakeholders across the field. Numerous researchers and organizations, many from psychology, have suggested guidelines for improving research quality.17-20 For researchers, trying to adopt all these recommended reproducible research practices may feel like trying to eat an elephant. That is, the task seems impossibly large because of barriers such as data privacy policies, journal word limits, and time constraints. Add to this the pressure to publish or secure external funding for promotion and tenure, and the task may seem even more overwhelming. We suggest researchers take small bites and start with 3 changes to their research practices21: (1) use existing coding guidelines (eg, the Google R Style Guide,22 Best Practice Programming Techniques for SAS Users23) to write clear, well-documented code; (2) publicly share the statistical code—or include adequate details about methodology in the main text or supplemental materials—for each published article; and (3) publicly share original, de-identified, or simulated data whenever possible.
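As a minimal sketch of the third suggestion, sharing de-identified data, the following Python function illustrates one common approach: dropping direct identifiers and replacing participant IDs with salted one-way hashes, so that records in shared files remain linkable to one another without exposing identity. The field names and example rows are hypothetical, not drawn from any study cited here, and a real project would also need to assess indirect identifiers and applicable privacy rules before release.

```python
import hashlib

def deidentify(records, id_field, direct_identifiers, salt):
    """Return copies of the records with direct identifiers removed and
    the participant ID replaced by a salted one-way hash, so rows can
    still be linked across shared files without revealing identity."""
    cleaned = []
    for rec in records:
        # Drop fields that directly identify a participant.
        out = {k: v for k, v in rec.items() if k not in direct_identifiers}
        # Replace the raw ID with a truncated salted SHA-256 digest.
        digest = hashlib.sha256((salt + str(rec[id_field])).encode()).hexdigest()
        out[id_field] = digest[:12]
        cleaned.append(out)
    return cleaned

# Hypothetical example rows; field names are illustrative only.
raw = [
    {"participant_id": 101, "name": "A. Smith", "zip": "63130", "bmi": 24.1},
    {"participant_id": 102, "name": "B. Jones", "zip": "63112", "bmi": 27.8},
]
shared = deidentify(raw, "participant_id", ["name", "zip"], salt="study-salt")
```

Because the hash is deterministic for a given salt, the same participant receives the same pseudonymous ID in every shared file, while the salt (kept private by the research team) prevents outsiders from reversing the mapping by hashing guessed IDs.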
Although our focus here is on quantitative research, researchers are increasingly making parallel calls for enhancing transparency and reproducibility in qualitative and mixed-methods research.24 The 3 suggestions apply to qualitative work with small modifications. Specifically, sharing de-identified data and detailed documentation could increase reproducibility for qualitative research.25 However, given the more subjective and iterative nature of qualitative data collection, analysis, and interpretation, tasks such as data de-identification and development of sufficiently detailed documentation are not as straightforward as for quantitative research and can require excessive time and effort.20,26 Given the role of qualitative and mixed-methods research in improving our understanding of public health problems and their solutions,26 additional work is needed to develop feasible strategies for reproducible qualitative research.
Along with researchers, journals and funders can contribute to the culture shift by rewarding27 or requiring the use of reproducible research practices, such as shared data and code, as a condition for publishing or funding. For journals publishing qualitative and mixed-methods research, modifications of journal data-sharing policies should be considered to make them more appropriate for facilitating sharing for all data types.25 Employers might consider acknowledging, incentivizing, or rewarding researchers who adopt reproducible research practices. Public health training programs and program accreditors can build reproducible research practices into the curriculum so that the next generation formats and shares its data and code in reproducible ways by default.
Improving the quality of reported research through the use of reproducible research practices could hasten scientific progress, increase public trust in public health evidence, and increase the positive impact of this evidence on human health. Three small steps for researchers could contribute to 1 giant leap for public health.
Footnotes
Declaration of Conflicting Interests: The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding: The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: The authors received funding from the Robert Wood Johnson Foundation Increasing Openness and Transparency in Research program (PI: Harris; 74 421).
ORCID iD: Jenine K. Harris, PhD
https://orcid.org/0000-0002-3576-5906
References
- 1. Estruch R, Ros E, Salas-Salvadó J, et al. Primary prevention of cardiovascular disease with a Mediterranean diet. N Engl J Med. 2013;368(14):1279–1290. doi:10.1056/NEJMoa1200303
- 2. Estruch R, Ros E, Salas-Salvadó J, et al. Retraction and republication: primary prevention of cardiovascular disease with a Mediterranean diet. N Engl J Med. 2013;368:1279–90. N Engl J Med. 2018;378(25):2441–2442. doi:10.1056/NEJMc1806491
- 3. Retraction Watch. Help us: here’s some of what we’re working on. https://retractionwatch.com/help-us-heres-some-of-what-were-working-on/. Accessed November 22, 2018.
- 4. Steen RG. Retractions in the scientific literature: is the incidence of research fraud increasing? J Med Ethics. 2011;37(4):249–253. doi:10.1136/jme.2010.040923
- 5. Harris JK, Wondmeneh S, Zhao Y, Leider JP. Examining the reproducibility of 6 published studies in public health services and systems research. J Public Health Manag Pract. 2018. doi:10.1097/PHH.0000000000000694
- 6. Fang FC, Steen RG, Casadevall A. Misconduct accounts for the majority of retracted scientific publications [published erratum appears in Proc Natl Acad Sci U S A. 2013;110(3):1137]. Proc Natl Acad Sci U S A. 2012;109(42):17028–17033. doi:10.1073/pnas.1212247109
- 7. Prinz F, Schlange T, Asadullah K. Believe it or not: how much can we rely on published data on potential drug targets? Nat Rev Drug Discov. 2011;10(9):712. doi:10.1038/nrd3439-c1
- 8. Agnoli F, Wicherts JM, Veldkamp CL, Albiero P, Cubelli R. Questionable research practices among Italian research psychologists. PLoS One. 2017;12(3):e0172792. doi:10.1371/journal.pone.0172792
- 9. John LK, Loewenstein G, Prelec D. Measuring the prevalence of questionable research practices with incentives for truth telling. Psychol Sci. 2012;23(5):524–532. doi:10.1177/0956797611430953
- 10. Anderson CJ, Bahnik S, Barnett-Cowan M, et al. Response to comment on “Estimating the reproducibility of psychological science.” Science. 2016;351(6277):1037.
- 11. Open Science Collaboration. Estimating the reproducibility of psychological science. Science. 2015;349(6251):aac4716. doi:10.1126/science.aac4716
- 12. Steen RG. Retractions in the medical literature: how many patients are put at risk by flawed research? J Med Ethics. 2011;37(11):688–692. doi:10.1136/jme.2011.043133
- 13. Peng R. The reproducibility crisis in science: a statistical counterattack. Significance. 2015;12(3):30–32.
- 14. Peng RD, Dominici F, Zeger SL. Reproducible epidemiologic research. Am J Epidemiol. 2006;163(9):783–789. doi:10.1093/aje/kwj093
- 15. McKiernan EC, Bourne PE, Brown CT, et al. How open science helps researchers succeed. Elife. 2016;5:e16800. doi:10.7554/eLife.16800
- 16. Piwowar HA, Day RS, Fridsma DB. Sharing detailed research data is associated with increased citation rate. PLoS One. 2007;2(3):e308. doi:10.1371/journal.pone.0000308
- 17. Asendorpf JB, Conner M, De Fruyt F, et al. Recommendations for increasing replicability in psychology. Eur J Pers. 2013;27(2):108–119.
- 18. Nosek BA, Alter G, Banks GC, et al. Promoting an open research culture. Science. 2015;348(6242):1422–1425. doi:10.1126/science.aab2374
- 19. Miguel E, Camerer C, Casey K, et al. Promoting transparency in social science research. Science. 2014;343(6166):30–31. doi:10.1126/science.1245317
- 20. Tsai AC, Kohrt BA, Matthews LT, et al. Promises and pitfalls of data sharing in qualitative research. Soc Sci Med. 2016;169:191–198. doi:10.1016/j.socscimed.2016.08.004
- 21. coding2share. Reproducible research toolkit. https://coding2share.github.io/ReproducibilityToolkit. Accessed June 8, 2018.
- 22. Google. Google’s R style guide. https://google.github.io/styleguide/Rguide.xml. Accessed November 22, 2018.
- 23. Lafler KP, Rosenbloom M. Best practice programming techniques for SAS users. SAS Glob Forum Conf Proc. 2017;175–2017. https://support.sas.com/resources/papers/proceedings17/0175-2017.pdf. Accessed November 22, 2018.
- 24. Smith B, McGannon KR. Developing rigor in qualitative research: problems and opportunities within sport and exercise psychology. Int Rev Sport Exerc Psychol. 2018;11(1):101–121.
- 25. Levitt HM, Bamberg M, Creswell JW, Frost DM, Josselson R, Suárez-Orozco C. Journal article reporting standards for qualitative primary, qualitative meta-analytic, and mixed methods research in psychology: the APA Publications and Communications Board Task Force report. Am Psychol. 2018;73(1):26–46. doi:10.1037/amp0000151
- 26. Creswell JW, Klassen AC, Clark VLP, Smith KC. Best Practices for Mixed Methods Research in the Health Sciences. Bethesda, MD: National Institutes of Health; 2011.
- 27. Kidwell MC, Lazarević LB, Baranski E, et al. Badges to acknowledge open practices: a simple, low-cost, effective method for increasing transparency. PLoS Biol. 2016;14(5):e1002456. doi:10.1371/journal.pbio.1002456
