J Clin Epidemiol. 2021 May 30;138:219–226. doi: 10.1016/j.jclinepi.2021.05.018

Table 3.

Examples of initiatives to improve the methodology and reproducibility of research

Topic / initiative: Description
Establishment of reproducibility networks and research centers: Reproducibility networks and centers aim to improve the robustness of scientific research by investigating how research can be improved and by sharing best practices through training and workshops. Importantly, these networks aim to collaborate with stakeholders (funders, publishers, academic organizations) to broadly improve research practices. See www.ukrn.org and www.swissrn.org for the reproducibility networks in the UK and Switzerland. Examples of reproducibility centers are QUEST at the Berlin Institute of Health (https://www.bihealth.org/en/research/quest-center/) and the Center for Reproducible Science at the University of Zurich (http://www.crs.uzh.ch/en.html).
Lancet series on research waste in 2014: This series provided 17 recommendations for researchers, academic institutions, scientific journals, funding agencies, and science regulators [46]. A 2016 follow-up noted that the series had had an impact, but that uptake was hesitant [46]. For example, with respect to being fully transparent during every stage of research, researchers mentioned issues such as lack of time, lack of benefit, and fear of being scooped.
Hong Kong principles for research assessment: The Hong Kong principles focus on responsible research practices, transparent reporting, open science, valuing a diversity of research, and recognizing all contributions to research and scholarly activity [24]. Examples of specific initiatives that are consistent with each principle are provided. These principles build on earlier efforts such as DORA (www.sfdora.org), which has been signed by about 2,000 organizations and more than 15,000 individuals, indicating widespread support among academics.
EQUATOR network (Enhancing the QUAlity and Transparency Of health Research): The EQUATOR Network (www.equator-network.org) hosts a library of reporting guidelines for a wide range of study designs and clinical research objectives, as well as for preparing study protocols [47]. These guidelines are continuously updated and amended where necessary. There is no excuse for not following the most relevant guideline(s) when preparing a manuscript.
STRATOS (STRengthening Analytical Thinking for Observational Studies): The STRATOS initiative unites methodological experts to prepare guidance documents on the design and analysis of observational studies (www.stratos-initiative.org). Guidance documents are prepared at different levels, in order to reach non-statisticians as well as practicing statisticians.
Center for Open Science (COS): COS is a center whose mission is to 'increase openness, integrity, and reproducibility' of research (cos.io) [48]. COS aims to achieve this through meta-research (studying and tracking the state of science), infrastructure (e.g., the Open Science Framework, osf.io), training, incentives, and collaboration/connectivity. They have referred to their vision as scientific utopia.
Study registries: Study registries make study information publicly available at the start of a study, to improve transparency and completeness and to allow comparison with the resulting publications (e.g., clinicaltrials.gov, crd.york.ac.uk/prospero). Registration is well established for interventional studies and is slowly gaining attention for observational studies. Recently, registries for animal studies have been launched (https://preclinicaltrials.eu/, http://animalresearchregistry.org/).
Registered reports: COS has introduced the registered reports system (https://www.cos.io/our-services/registered-reports): papers undergo peer review before data collection, based on the research questions and the proposed methodology [49]. If the study is considered to be of high methodological quality, it is provisionally accepted for publication, provided the authors adhere to the methodology as registered. Currently, 244 journals, including medical journals, accept this publishing format.
Transparency and Openness Promotion (TOP) Committee: TOP, also under the umbrella of COS, provides guidelines that support journals' policies for the publication of papers (https://www.cos.io/our-services/top-guidelines) [48].
Findability, Accessibility, Interoperability, and Reusability (FAIR) principles: FAIR provides guiding principles for data sharing, which is important for the transparency and utility of research projects [50]. To date, journals and researchers still show considerable reluctance towards data sharing [51]. As long as academia emphasizes quantity rather than quality, there will be concern that others will take advantage of the effort invested in collecting (high-quality) data [46]. Privacy and intellectual property issues are important additional bottlenecks.
Methodological/statistical reviewing: Several medical journals recognize the importance of methodological review (e.g., by statisticians or information specialists/librarians), although implementation varies widely. Some journals decide on an ad hoc basis when statistical input is required, although this decision may itself require statistical input. Some journals include statisticians on the editorial board, whilst others hire a team of statisticians and methodologists.
Reviewer recognition (e.g., Publons): Initiatives such as Publons (www.publons.com) aim to increase recognition for doing peer review. Such initiatives are a good start, although the question remains what peer reviewers really get out of it.
Replication grants: The Dutch Research Council (www.nwo.nl) offers grants for replication studies of 'cornerstone research' (https://www.nwo.nl/onderzoeksprogrammas/replicatiestudies).

All URLs mentioned were accessed on May 23, 2021.