Published in final edited form as: Science. 2015 Jun 26;348(6242):1422–1425. doi: 10.1126/science.aab2374

Promoting an open research culture

Author guidelines for journals could help to promote transparency, openness, and reproducibility

B A Nosek *, G Alter, G C Banks, D Borsboom, S D Bowman, S J Breckler, S Buck, C D Chambers, G Chin, G Christensen, M Contestabile, A Dafoe, E Eich, J Freese, R Glennerster, D Goroff, D P Green, B Hesse, M Humphreys, J Ishiyama, D Karlan, A Kraut, A Lupia, P Mabry, T Madon, N Malhotra, E Mayo-Wilson, M McNutt, E Miguel, E Levy Paluck, U Simonsohn, C Soderberg, B A Spellman, J Turitto, G VandenBos, S Vazire, E J Wagenmakers, R Wilson, T Yarkoni
PMCID: PMC4550299  NIHMSID: NIHMS714651  PMID: 26113702

Transparency, openness, and reproducibility are readily recognized as vital features of science (1, 2). When asked, most scientists embrace these features as disciplinary norms and values (3). Therefore, one might expect that these valued features would be routine in daily practice. Yet, a growing body of evidence suggests that this is not the case (4–6).

POLICY

A likely culprit for this disconnect is an academic reward system that does not sufficiently incentivize open practices (7). In the present reward system, emphasis on innovation may undermine practices that support verification. Too often, publication requirements (whether actual or perceived) fail to encourage transparent, open, and reproducible science (2, 4, 8, 9). For example, in a transparent science, both null results and statistically significant results are made available and help others more accurately assess the evidence base for a phenomenon. In the present culture, however, null results are published less frequently than statistically significant results (10) and are, therefore, more likely inaccessible and lost in the “file drawer” (11).

The situation is a classic collective action problem. Many individual researchers lack strong incentives to be more transparent, even though the credibility of science would benefit if everyone were more transparent. Unfortunately, there is no centralized means of aligning individual and communal incentives via universal scientific policies and procedures. Universities, granting agencies, and publishers each create different incentives for researchers. With all of this complexity, nudging scientific practices toward greater openness requires complementary and coordinated efforts from all stakeholders.

THE TRANSPARENCY AND OPENNESS PROMOTION GUIDELINES

The Transparency and Openness Promotion (TOP) Committee met at the Center for Open Science in Charlottesville, Virginia, in November 2014 to address one important element of the incentive systems: journals’ procedures and policies for publication. The committee consisted of disciplinary leaders, journal editors, funding agency representatives, and disciplinary experts largely from the social and behavioral sciences. By developing shared standards for open practices across journals, we hope to translate scientific norms and values into concrete actions and change the current incentive structures to drive researchers’ behavior toward more openness. Although there are some idiosyncratic issues by discipline, we sought to produce guidelines that focus on the commonalities across disciplines.

Standards

There are eight standards in the TOP guidelines; each moves scientific communication toward greater openness. These standards are modular, facilitating adoption in whole or in part. However, they also complement each other, in that commitment to one standard may facilitate adoption of others. Moreover, the guidelines are sensitive to barriers to openness by articulating, for example, a process for exceptions to sharing because of ethical issues, intellectual property concerns, or availability of necessary resources. The complete guidelines are available in the TOP information commons at http://cos.io/top, along with a list of signatories that numbered 86 journals and 26 organizations as of 15 June 2015. The table provides a summary of the guidelines.

First, two standards reward researchers for the time and effort they have spent engaging in open practices. (i) Citation standards extend current article citation norms to data, code, and research materials. Regular and rigorous citation of these materials credits them as original intellectual contributions. (ii) Replication standards recognize the value of replication for independent verification of research results and identify the conditions under which replication studies will be published in the journal. To progress, science needs both innovation and self-correction; replication offers opportunities for self-correction to more efficiently identify promising research directions.


Second, four standards describe what openness means across the scientific process so that research can be reproduced and evaluated. Reproducibility increases confidence in results and also allows scholars to learn more about what results do and do not mean. (i) Design standards increase transparency about the research process and reduce vague or incomplete reporting of the methodology. (ii) Research materials standards encourage the provision of all elements of that methodology. (iii) Data sharing standards incentivize authors to make data available in trusted repositories such as Dataverse, Dryad, the Interuniversity Consortium for Political and Social Research (ICPSR), the Open Science Framework, or the Qualitative Data Repository. (iv) Analytic methods standards do the same for the code comprising the statistical models or simulations conducted for the research. Many discipline-specific standards for disclosure exist, particularly for clinical trials and health research more generally (e.g., www.equator-network.org). Many more are emerging for other disciplines, such as those developed by Psychological Science (12).

Finally, two standards address the values resulting from preregistration. (i) Standards for preregistration of studies facilitate the discovery of research, even unpublished research, by ensuring that the existence of the study is recorded in a public registry. (ii) Preregistration of analysis plans certifies the distinction between confirmatory and exploratory research, or what is also called hypothesis-testing versus hypothesis-generating research. Making transparent the distinction between confirmatory and exploratory methods can enhance reproducibility (3, 13, 14).

Levels

The TOP Committee recognized that not all of the standards are applicable to all journals or all disciplines. Therefore, rather than advocating for a single set of guidelines, the TOP Committee defined three levels for each standard. Level 1 is designed to have little to no barrier to adoption while also offering an incentive for openness. For example, under the analytic methods (code) sharing standard, authors must state in the text whether and where code is available. Level 2 has stronger expectations for authors but usually avoids adding resource costs to editors or publishers that adopt the standard. In Level 2, journals would require code to be deposited in a trusted repository and check that the link appears in the article and resolves to the correct location. Level 3 is the strongest standard but also may present some barriers to implementation for some journals. For example, the journals Political Analysis and Quarterly Journal of Political Science require authors to provide their code for review, and editors reproduce the reported analyses before publication. In the table, we provide a "Level 0" for comparison, describing common journal policies that do not meet the transparency standards.
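To make the graded structure of a standard concrete, the sketch below encodes the levels of the analytic methods (code) sharing standard as a small Python data structure and checks a hypothetical submission against a journal's adopted level. The level descriptions come from the guidelines summarized here; the CodeSharingLevel, Submission, and meets_level names are illustrative assumptions for this sketch, not part of the TOP specification or any journal's workflow.

```python
# Illustrative sketch only: encodes the code-sharing (analytic methods) levels
# from the TOP guidelines as data; names and checks are hypothetical.
from dataclasses import dataclass
from enum import IntEnum
from typing import Optional


class CodeSharingLevel(IntEnum):
    LEVEL_0 = 0  # journal encourages code sharing, or says nothing
    LEVEL_1 = 1  # article states whether and where code is available
    LEVEL_2 = 2  # code deposited in a trusted repository; exceptions declared at submission
    LEVEL_3 = 3  # deposited code, and reported analyses reproduced before publication


@dataclass
class Submission:
    """Hypothetical record of what an author has provided."""
    has_availability_statement: bool  # the text states whether/where code is available
    repository_url: Optional[str]     # link to a trusted repository, if any
    analyses_reproduced: bool         # reported analyses were reproduced before publication


def meets_level(sub: Submission, adopted: CodeSharingLevel) -> bool:
    """Check a submission against a journal's adopted level (illustrative only)."""
    if adopted >= CodeSharingLevel.LEVEL_1 and not sub.has_availability_statement:
        return False
    if adopted >= CodeSharingLevel.LEVEL_2 and sub.repository_url is None:
        return False
    if adopted >= CodeSharingLevel.LEVEL_3 and not sub.analyses_reproduced:
        return False
    return True


# Example: a Level 2 journal accepts a statement plus a deposited link; Level 3 would not,
# because the analyses have not yet been independently reproduced.
example = Submission(True, "https://osf.io/placeholder", False)  # placeholder URL
print(meets_level(example, CodeSharingLevel.LEVEL_2))  # True
print(meets_level(example, CodeSharingLevel.LEVEL_3))  # False
```

Because the levels are cumulative, each higher level simply adds a requirement on top of the ones below it, which is why a single ordered enumeration suffices in this sketch.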

Adoption

Defining multiple levels and distinct standards facilitates informed decision-making by journals. It also acknowledges the variation in evolving norms about research transparency. Depending on the discipline or publishing format, some of the standards may not be relevant for a journal. Journal and publisher decisions can be based on many factors—including their readiness to adopt modest to stronger transparency standards for authors, internal journal operations, and disciplinary norms and expectations. For example, in economics, many highly visible journals such as American Economic Review have already adopted strong policies requiring data sharing, whereas few psychology journals have comparable requirements.

In this way, the levels are designed to facilitate the gradual adoption of best practices. Journals may begin with a standard that rewards adherence, perhaps as a step toward requiring the practice. For example, Psychological Science awards badges for “open data,” “open materials,” and “preregistration” (12), and approximately 25% of accepted articles earned at least one badge in the first year of operation.

The Level 1 guidelines are designed to have minimal effect on journal efficiency and workflow while also having a measurable impact on transparency. Moreover, although higher levels may require greater implementation effort up front, such efforts may benefit publishers, editors, and the quality of publications by, for example, reducing time spent on communication with authors and reviewers, improving standards of reporting, increasing detectability of errors before publication, and ensuring long-term access to publication-related data.

Evaluation and revision

An information commons and support team at the Center for Open Science is available (top@cos.io) to assist journals in selection and adoption of standards and will track adoption across journals. Moreover, adopting journals may suggest revisions that improve the guidelines or make them more flexible or adaptable for the needs of particular subdisciplines.

The present version of the guidelines is not the last word on standards for openness in science. As with any research enterprise, the available empirical evidence will expand with application and use of these guidelines. To reflect this evolutionary process, the guidelines are accompanied by a version number and will be improved as experience with them accumulates.

Conclusion

The journal article is central to the research communication process. Guidelines for authors define what aspects of the research process should be made available to the community to evaluate, critique, reuse, and extend. Scientists recognize the value of transparency, openness, and reproducibility. Improvement of journal policies can help those values become more evident in daily practice and ultimately improve the public trust in science, and science itself.

Supplementary Material

Proposed Standards and references

Summary of the eight standards and three levels of the TOP guidelines

Citation standards
Level 0: Journal encourages citation of data, code, and materials—or says nothing.
Level 1: Journal describes citation of data in guidelines to authors with clear rules and examples.
Level 2: Article provides appropriate citation for data and materials used, consistent with journal's author guidelines.
Level 3: Article is not published until appropriate citation for data and materials is provided that follows journal's author guidelines.

Data transparency
Level 0: Journal encourages data sharing—or says nothing.
Level 1: Article states whether data are available and, if so, where to access them.
Level 2: Data must be posted to a trusted repository. Exceptions must be identified at article submission.
Level 3: Data must be posted to a trusted repository, and reported analyses will be reproduced independently before publication.

Analytic methods (code) transparency
Level 0: Journal encourages code sharing—or says nothing.
Level 1: Article states whether code is available and, if so, where to access it.
Level 2: Code must be posted to a trusted repository. Exceptions must be identified at article submission.
Level 3: Code must be posted to a trusted repository, and reported analyses will be reproduced independently before publication.

Research materials transparency
Level 0: Journal encourages materials sharing—or says nothing.
Level 1: Article states whether materials are available and, if so, where to access them.
Level 2: Materials must be posted to a trusted repository. Exceptions must be identified at article submission.
Level 3: Materials must be posted to a trusted repository, and reported analyses will be reproduced independently before publication.

Design and analysis transparency
Level 0: Journal encourages design and analysis transparency—or says nothing.
Level 1: Journal articulates design transparency standards.
Level 2: Journal requires adherence to design transparency standards for review and publication.
Level 3: Journal requires and enforces adherence to design transparency standards for review and publication.

Preregistration of studies
Level 0: Journal says nothing.
Level 1: Journal encourages preregistration of studies and provides link in article to preregistration if it exists.
Level 2: Journal encourages preregistration of studies and provides link in article and certification of meeting preregistration badge requirements.
Level 3: Journal requires preregistration of studies and provides link and badge in article to meeting requirements.

Preregistration of analysis plans
Level 0: Journal says nothing.
Level 1: Journal encourages preanalysis plans and provides link in article to registered analysis plan if it exists.
Level 2: Journal encourages preanalysis plans and provides link in article and certification of meeting registered analysis plan badge requirements.
Level 3: Journal requires preregistration of studies with analysis plans and provides link and badge in article to meeting requirements.

Replication
Level 0: Journal discourages submission of replication studies—or says nothing.
Level 1: Journal encourages submission of replication studies.
Level 2: Journal encourages submission of replication studies and conducts blind review of results.
Level 3: Journal uses Registered Reports as a submission option for replication studies with peer review before observing the study outcomes.

Levels 1 to 3 are increasingly stringent for each standard. Level 0 offers a comparison that does not meet the standard.

ACKNOWLEDGMENTS

This work was supported by the Laura and John Arnold Foundation.

Footnotes

Affiliations for the authors, all of whom are members of the TOP Guidelines Committee, are given in the supplementary materials.
