Highlights
• Open Science principles are vital for ensuring reproducibility, trust, and legacy.
• Evidence synthesis is a vital means of summarizing research for decision-making.
• Open Synthesis is the application of Open Science principles to evidence synthesis.
• Open approaches to planning, conducting, and reporting synthesis have many benefits.
• We call on the evidence synthesis community to embrace Open Synthesis.
Keywords: Evidence ecosystem, Collaborative research, Repeatability, Research waste, Transparency, Traceability
The coronavirus disease 2019 (COVID-19) pandemic of 2020 has caused high levels of mortality and continues to threaten the lives of the global population [1]. The pandemic has amounted to a “once in a lifetime” event for humanity, affecting virtually every sector of life: health, education, the economy, the environment, and more. It continues to threaten job prospects for millions of people and has resulted in widespread economic turmoil [2]. It has also led to the cancellation of numerous conferences (e.g., [3]) and research fieldwork, and has closed offices across the globe.
As the scientific community grapples with responding to this massive and rapidly evolving crisis, the volume of research literature published in relation to the outbreak has expanded rapidly (Figure 1). Simultaneously, efforts to synthesize this growing evidence base have begun, both through traditional independent systematic reviews (e.g., [4,5]) and through rapid and living systematic reviews (e.g., https://covidrapidreviews.cochrane.org/search/site). Rapid systematic reviews provide, in a timely way, the evidence needed to inform policy making under urgent circumstances; living systematic reviews ensure that a synthesis remains up to date with the latest evidence (e.g., by the L.OVE team at Epistemonikos).
As the volume of evidence increases and decision makers and scientists struggle to keep pace with the rapidly expanding evidence base, many research groups are volunteering to support these efforts, using online collaborative tools and virtual workspaces both to enable continued working during challenging times and to help identify, map, and synthesize research as it emerges.
This work faces a suite of challenges because of the often closed nature of science: duplication of effort (leading to research waste), inefficiency in conducting research, and missed opportunities to address important questions. Open Science principles present an opportunity to address these challenges in the context of the COVID-19 pandemic, and would also make research in the field more collaborative, transparent, and rigorous. This article argues for, and illustrates how to apply, the principles of Open Science to the field of evidence synthesis, a concept we refer to as Open Synthesis [6]. We use the COVID-19 pandemic as a case in point to highlight the potentially significant benefits of Openness to the research, policy, and practice communities.
1. Evidence synthesis
Evidence synthesis is the name for research methodologies that involve identifying, collating, appraising, and summarizing a body of research evidence using tried and tested systematic and robust literature review methods: i.e., systematic reviews and systematic maps [7]. Systematic reviews are now widely used in the field of health care as a “gold standard” for summarizing evidence to provide support for decision-making in policy and practice, through a variety of knowledge translation products and practice guidelines [8].
However, systematic reviewers face challenges as a result of an often closed academic system; research can be difficult to find and download without access to expensive bibliographic databases [9]; primary research articles and the systematic reviews that synthesize them are hidden behind paywalls [10,11]; reporting of methods used in trials and syntheses is often deficient to some degree, hampering verification and learning about methodology [12]; research data are often not made public, particularly when produced by organizations with commercial interests, such as pharmaceutical companies [13]; analytical code is rarely shared and statistical methods can be hard to verify [14], and educational materials to train the next generation of evidence synthesists are often not made public [15].
2. Open Science
Open Science has central premises relating to accessibility and the collaborative nature both of knowledge creation and of the knowledge itself [16]. These principles (see Table 1) include concepts such as Open Access (unrestricted availability of research publications; [11]) and Open Data (freely accessible research data used in analyses; [17]) that together support efficient, transparent, and rigorous research.
Table 1.
Concept | Definition |
---|---|
Open data | Freely available research data |
Open source | Use and production of freely accessible software and hardware |
Open methodology | Documentation of methods for a research process as far as possible |
Open peer review | Transparent and traceable quality assurance through publicly available peer review reports |
Open access | Publish research articles in an accessible manner, making them useable and accessible for all |
Open educational resources | Free and accessible materials for education and university teaching |
There are various definitions of Open Science, ranging from relatively simple classifications of “data, analysis, publications, and comments” [18] to somewhat more elaborate frameworks (see Table 1), all the way to complex hierarchical conceptual models [19]. Although these classifications differ in their complexity, they each attempt to cover all aspects of research processes from initiation to communication.
3. Open Synthesis
Some of the problems with traditional approaches to evidence synthesis described above (access to data, methods, publications, etc.) can be, and indeed are being, mitigated by applying these Open Science principles to evidence synthesis; the result has been termed Open Synthesis [6]. Open Synthesis was first proposed to apply Open Access, Open Data, Open Source, and Open Methodology to evidence synthesis, with the possible addition of Open Education. We propose a finer resolution based on more complex taxonomies (e.g., [19]).
We suggest that such Open Synthesis would support the transfer of knowledge from primary research to decision support tools and evidence portals (e.g., the Teaching and Learning Toolkit), particularly during humanitarian crises; for example, Evidence Aid hosts a freely accessible evidence repository that holds summaries of COVID-19 relevant evidence (https://www.evidenceaid.org/coronavirus-covid-19-evidence-collection/) [20]. Many Open Synthesis resources have been developed and assembled in an effort to facilitate access to the novel evidence base emerging in relation to the COVID-19 pandemic. These examples are (understandably) almost exclusively related to the field of health, but the evidence base will become increasingly multidisciplinary and cross-sectoral as research focus spreads to include the societal and environmental impacts of the outbreak and subsequent social policies, such as widescale lockdowns. The key components of Open Synthesis are described in Figure 2, and examples are given below.
3.1. Open collaboration
The COVID-19 evidence map of emerging literature produced by the Meta-Evidence blog was open to interested collaborators (before the project was discontinued because of considerable overlap with several other projects) and involved substantial efforts to translate and extract information from literature written in Chinese. The synthesizing group under the COVID Evidence Network to support Decision makers (COVID-END; https://www.mcmasterforum.org/networks/covid-end/working-groups/synthesizing) supports efforts to synthesize the evidence that already exists in ways that are more coordinated and efficient and that balance quality and timeliness. Cochrane's COVID Rapid Reviews repository provides space for Open Collaboration by connecting authors interested in addressing the same rapid review questions submitted by the public.
3.2. Open discovery
To enable free (i.e., not paywalled) searching for relevant evidence, various efforts are seeking to build “living” bibliographies and databases of research on COVID-19: for example, the CORD19 database (MIT), the COVID-19 living systematic map (EPPI Centre), Cochrane's COVID-19 Study Register, and the Norwegian Institute of Public Health's live map of COVID-19 evidence. Similarly, the McMaster GRADE Center is collaborating with the Norwegian Institute of Public Health and others to map recommendations relevant to COVID-19 and make them publicly available (including the strength and certainty of supporting evidence) [21].
3.3. Open methods
Efforts exist to ensure that evidence syntheses use transparent and well reported methods to improve repeatability and usability. For example, the systematic review registry PROSPERO has provided a link to already registered reviews of human and animal studies relevant to COVID-19.
3.4. Open data
Freely accessible data (including those extracted and generated within the process of conducting a systematic review) are being made available for reuse and analysis. From evidence syntheses, the Epistemonikos COVID-19 collection archives data extracted from within reviews in a publicly accessible database (https://www.epistemonikos.cl/all-about-covid-19/).
3.5. Open source
Freely useable and adaptable tools for analysis and visualization have been made available online to support the conduct and communication of COVID-19 relevant research, for example, corona-cli (code for analyzing and visualizing data on the outbreak); the EviAtlas tool for mapping the geographical spread of evidence on COVID-19 [22].
3.6. Open code
Many researchers routinely publish analytic code to accompany their research (e.g., R scripts for statistical analyses), although to date this practice is not common in the syntheses we have examined, perhaps because code does not readily exist where reviewers have not used code-driven software (e.g., for reviews conducted using RevMan). However, examples of Open Code in primary research include code to web-scrape COVID-19 data from Worldometers and epidemiological modeling code for COVID-19.
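As a hypothetical illustration of what shareable analytic code for a synthesis might look like, the sketch below implements a minimal inverse-variance fixed-effect meta-analysis. The function name and the effect sizes are invented for illustration and are not drawn from any cited review; the point is that a few annotated lines like these, published alongside a review, would let others verify and reuse the analysis.

```python
import math

def fixed_effect_meta(effects, variances):
    """Pool study effect sizes by inverse-variance weighting (fixed-effect model)."""
    weights = [1.0 / v for v in variances]                      # weight = 1 / variance
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))                          # standard error of pooled effect
    return pooled, se

# Hypothetical log risk ratios and their variances from three studies
effects = [-0.30, -0.10, -0.25]
variances = [0.04, 0.02, 0.05]

pooled, se = fixed_effect_meta(effects, variances)
ci = (pooled - 1.96 * se, pooled + 1.96 * se)                   # 95% confidence interval
print(f"pooled = {pooled:.3f}, SE = {se:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
# pooled ≈ -0.184 (log risk ratio), SE ≈ 0.103
```

Annotating assumptions in comments, as here, is exactly the kind of transparency that makes shared analytic code verifiable by readers and reusable in updates.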
3.7. Open access
Several publishers and journals have made COVID-19 relevant research articles and evidence syntheses freely accessible, among them the Cochrane COVID-19 evidence collection and several Elsevier journals, including the Journal of Clinical Epidemiology and The Lancet (https://www.elsevier.com/connect/coronavirus-information-center). Systematic reviewers can facilitate Open Access by ensuring their reviews are freely accessible (e.g., by publishing in open access journals or depositing preprints or postprints in publicly accessible repositories) but also by facilitating access to the primary research synthesized in their reviews (e.g., by providing DOIs for the full texts of their included studies).
3.8. Open peer review
Although most journals do not currently publish peer review reports and revisions of systematic reviews, some resources exist to support this, including the Outbreak Science Rapid PREReview for prepublication peer review.
3.9. Open education
Various freely accessible training resources (e.g., courses, webinars, and handbooks) exist for evidence synthesis methodology, including #ESTraining provided by the Collaboration for Environmental Evidence and Stockholm Environment Institute and webinars provided by the Global Evidence Synthesis Initiative.
3.10. Open interests
Systematic reviews have been shown to suffer from poor reporting of funding, the role of funders, and conflicts of interest in general [23]. Open Interests calls for individuals to transparently declare possible financial and nonfinancial interests. Ideally, declarations would come from all parties involved in the conduct and publication of systematic reviews (including educators, engaged stakeholders, review authors, advisory group members, peer reviewers, editors, and publishers) and would be updated regularly. In practical terms, this could take the form of a declaration at the point of publication (e.g., in review publications, educational materials, or peer review comments) or a freely accessible central database of interests. At present, no Open Interests initiative exists.
3.11. Challenges of implementing Open Synthesis and their relation to Open Science criticisms
Although no criticisms have been fielded against Open Synthesis yet, some researchers have raised concerns about Open Science. We have described some of these in Table 2. These concerns relate either to openness itself as a practice or to the application and enforcement of Open Science within current institutions and incentive structures.
Table 2.
Concern relating to Open Science | Description of the concern | Applicability to Open Synthesis | Potential mitigations for Open Synthesis |
---|---|---|---|
Exacerbation of power imbalance and inequality or exclusion of minorities [24] | Open Science practices applied within the current incentive structures and institutions can exacerbate power imbalance and inequality, particularly adversely affecting minorities and the vulnerable or oppressed | Highly applicable to evidence syntheses, just as with primary research. | Open Synthesis principles can be endorsed rather than enforced to avoid penalizing vulnerable researchers who may struggle to be Open. Structures can be put in place to support minorities and vulnerable researchers (e.g., publication fee waivers for researchers from low- and middle-income countries [25], mentoring in Open practices). |
Risk of misuse [26] | Open Data and Code may be reused or reanalyzed incorrectly, potentially for nefarious reasons | Although some data in syntheses are in the public domain, some data from unpublished studies or unpublished outcomes obtained from authors are not available in the public domain. Furthermore, the calculation of effect sizes may use assumptions that affect the estimates calculated. | Ensure full methodological transparency to avoid misunderstandings, including annotation of analytic or statistical code and any assumptions. Adequate reference and easy linkage to the original data source should be provided for clarity. |
Risk of public misunderstanding (e.g., [27]) | Detailed language and nuance of data may be misunderstood by lay people, nonspecialists, or those who did not collect the data | Systematic reviews are typically not intended as a means of communication with the public (plain language summaries serve that purpose instead). The risk is not higher for Open Synthesis relative to standard synthesis. | Synthesis methods must be detailed enough and follow standard language to allow full understanding. |
Potential to be overwhelmed by information [28] | Publication of large volumes of data or information may make it difficult to find important details within/across studies | Information is typically more structured across evidence syntheses than primary research because they use a common methodological framework. | Standardized reporting templates could be built to support or facilitate metadata formatting so that information is readily found and understood. Reviewers could provide different versions with different levels of detail for different audiences (e.g., Plain language summary for the lay public). |
Fear of repercussions if mistakes are unearthed after publication [29] | Authors may fear that they could face repercussions if mistakes are identified in their methods after publication and so may prefer to keep data and analyses private | There is potential for error in the identification, selection, appraisal, and analysis of studies included in systematic reviews | Reviewers should be incentivized to admit errors and supported when these occur. Institutional punitive measures for publishing corrections or retractions should first examine the reasons behind the action, avoid blanket punishments, and acknowledge authors who act ethically and responsibly, while promoting and rewarding Open behaviors. Open Synthesis should be reframed as an opportunity to validate findings as opposed to detecting mistakes. |
Publication of data leads to “research parasitism” [30] | Some researchers feel that reuse of data or methods by others is an unfair practice and that authors alone should retain exclusive rights | Cochrane, the Campbell Collaboration, and the Collaboration for Environmental Evidence allow review teams the right to lead updates to their reviews for a fixed period. Data collected and used in an evidence synthesis are typically already in the public domain. | Raise awareness of the benefits in legacy and impact of research resulting from reuse of data. Ensure those reusing data provide appropriate and full acknowledgment of data sources. Reconsider rules for academic credit, reward, and promotion. |
Belief that low-quality science will proliferate [31] | [Specifically referring to Open Peer Review and preprints] some argue that a lack of traditional peer review for preprints removes the gatekeeping that ensures research validity, and low-quality research will become common | Preprints are, in part, a response to a lack of immediate Open Access and closed peer review. They are not an integral part of Open Science but rather an extension of it. Current institutions and incentive structures may not be sufficient to prevent low-quality evidence syntheses from being published, but this is also the case for those that are traditionally peer reviewed. | Make use of opportunities for Open Peer Review that complement and strengthen preprints (i.e., postpublication peer review; [31]). Raise awareness and establish standard communication practices for understanding preprints within the communications community (i.e., journalists and institutional communications officers). Ensure preprints follow standards for conducting and reporting evidence synthesis (e.g., PRISMA and ROSES). |
Increased resources needed to attain Openness [26,32] | Ensuring that data and information are made fully Open may require resources (time and funding) that are not readily available to all | The large amounts of data potentially produced within a systematic review project could require considerable resources to clean and annotate if not planned from the outset, particularly for analytic code. Open Collaboration could require considerable time to manage if roles and tasks are not carefully predefined. | Openness can be achieved for the most part by using cost-free alternatives (e.g., self-archiving to avoid publication fees and the use of free data repositories) and by incentivizing and institutionalizing Open and transparent practices from an early career stage (e.g., good code annotation practices). However, this point is not trivial and highlights the need for careful planning across all aspects of Open Synthesis; planning can significantly reduce resource requirements. Standardizing the methods, processes, and tools used to abstract and store data could assist in this process [33]. |
Risk of “platform capitalism” (i.e., commercialization of public data) [34] | The free availability of data permits the development of subscription-based/pay-to-use services (e.g., Academia.edu) that aim to provide additional services using public data (e.g., analytics) and platforms that may exploit or disadvantage certain groups of people (e.g., by charging for a service that is otherwise already free elsewhere) | Grass roots and no-cost alternatives to these services are often available but awareness of free-to-use services is vital to avoid entrapment by commercial enterprises (e.g., paying a publisher to access an article that is already Open Access). | Noncommercial use Creative Commons licenses may help restrict/prevent commercial use of Open Data (e.g., CC BY-NC 3.0), but they are not without criticism, for example, that Creative Commons licenses are based on copyright law that is overly restrictive to academic collaborations [35]. |
Need to maintain confidentiality [36,37] | Research subjects are typically provided anonymity that may mean publication of raw data is not feasible or safe | Evidence syntheses often make use of summary data not disaggregated at the level of individual participants, and for these reviews this may not be an issue. Individual participant data (IPD) meta-analyses, however, may not be able to publish data openly. | For IPD meta-analyses, the requirements for Open Data may need to be relaxed or adapted in some contexts to ensure anonymity can be maintained. For example, data on request repositories for individual patient data exist [38]. Standardized ethical practices could be established where needed for IPD meta-analysis. |
Institutional barriers including career incentives that reward closed practices [39] | Career incentives in academia typically and historically center around publication in high-impact journals that are prohibitively expensive to publish Open Access. Recruitment and promotion in academia typically also do not reward or acknowledge Open practices. Institutions may not understand/accept the desire to be Open | Systematic reviewers often work within institutions established around primary research practices, so the same incentives apply. Organizations primarily focusing on evidence synthesis may already have Open practices. | Incentive structures are likely to change over time as Open Science practices become more common, but authorities must take a stand to support researchers who are likely to be disadvantaged by being more Open (e.g., early career researchers). |
In addition, there are risks associated with some of the practices that may be facilitated by Open Synthesis, for example: 1) living systematic reviews may involve repeated incremental rerunning of meta-analyses, leading to an increased chance of false-positive findings that needs to be accounted for (e.g., [40]); 2) updates may need to account for changes in best practice in risk of bias assessments as novel methods become available, potentially involving reassessment of studies identified in the original review.
These are not problems with Open Synthesis but rather important issues that should be addressed when planning incentives and infrastructure in support of Open Syntheses. However, a pathway to Open systematic reviews and systematic maps will involve many steps and a diverse array of different actions; these changes should not be expected overnight, and there is a need for detailed discussion about implications and pitfalls. That said, it is generally accepted that the advantages of Open Science outweigh the disadvantages [41].
3.12. Open Synthesis and current systematic review traditions
At present, some of these Open Synthesis practices are enforced or encouraged by review coordinating bodies. Cochrane reviews can be made immediately Open Access at the point of publication for a fee (payable by authors) or made free after a 12-month period (before which a subscription is required to access them; so-called green Open Access). Cochrane does not yet require systematic review–extracted data to be made public [42]. While methods in Cochrane reviews are typically well reported thanks to the Methodological Expectations for Cochrane Intervention Reviews reporting standards [43], the “raw” data extracted from primary studies within a review are not typically included. All Campbell Collaboration reviews are published in their Open Access journal. Transparent and Open Methods are required by the Methodological Expectations for Campbell Collaboration Intervention Reviews. Open Data and Code are in the vision for the future of the journal [44]. For both organizations, review protocols are published online and time-stamped before work commences, as should be done for all systematic reviews and maps (e.g., in PROSPERO, the Cochrane Database of Systematic Reviews, or a suitable journal).
3.13. Ways forward
Adopting truly Open evidence synthesis approaches has the potential to globalize research, break down barriers to data sharing and collaboration, and mitigate inequality in knowledge availability (e.g., a large body of Chinese coronavirus trials was recently translated and mapped by researchers from Lanzhou University). Open Synthesis also supports either living systematic reviews or intermittent updates; it is agnostic toward the framework chosen to update reviews. Importantly, it emphasizes the need to facilitate updates however they occur.
Moreover, Open Synthesis of evidence will provide guideline developers with faster and better access to synthesis methods, findings, conflict of interest information, and other elements necessary for guideline development, thereby improving the quality and efficiency of guideline development.
Achieving the optimal impact of Open Synthesis requires the consideration of other principles. Of utmost importance is responding to the knowledge needs of decision makers by adopting valid priority-setting approaches. Similarly, Open Synthesis has to feed into knowledge translation tools that are appropriate to the target decision makers. In addition, it should build on emerging concepts, such as Evidence Synthesis 2.0 [33], to ensure the efficiency of the process and the appropriateness of the output.
We encourage adoption of these principles across all disciplines to meet the social, legal, ethical, and economic challenges of the global COVID-19 pandemic, such as supporting home-based education for children out of school; mitigating the social impacts of isolation; responding to the increased risk and severity of domestic violence and global food insecurity; and understanding the implications of social lockdowns for environmental recovery from long-term anthropogenic disturbance and climate change.
We call for increasing application of Open Science and Open Synthesis principles across disciplines both within and beyond the COVID-19 epidemic to support evidence production, synthesis, and evidence-informed policy. By embracing Open Synthesis, evidence synthesis communities from all disciplines can maximize the efficiency, impact, and legacy of systematic reviews and better support decision-making, particularly in global crises such as the current COVID-19 pandemic, establishing a more resilient and collaborative future in the event of similar global challenges.
CRediT authorship contribution statement
Neal R. Haddaway: Conceptualization, Data curation. Elie A. Akl: Conceptualization, Writing - original draft, Writing - review & editing. Matthew J. Page: Writing - original draft, Writing - review & editing. Vivian A. Welch: Writing - original draft, Writing - review & editing. Ciara Keenan: Writing - original draft, Writing - review & editing. Tamara Lotfi: Conceptualization, Writing - original draft, Writing - review & editing.
Footnotes
Declarations of interest: NRH and TL are the coordinators of the Open Synthesis Working Group, a voluntary collaboration of stakeholders interested in the application of Open Science principles in evidence synthesis conduct and publication.
Funding: This work was produced in part as a result of funding from FORTE, the Swedish Research Council for Health, Working Life, and Welfare (2018-01619).
Author’s contributions: NRH and TL developed the concept for the manuscript. NRH drafted the manuscript. All authors have read and approved the manuscript prior to submission.
Supplementary data to this article can be found online at https://doi.org/10.1016/j.jclinepi.2020.06.032.
References
- 1.World Health Organization . World Health Organisation; Geneva: 2020. Coronavirus disease 2019 (COVID-19): situation report, 85. [Google Scholar]
- 2.McKibbin W.J., Fernando R. The global macroeconomic impacts of COVID-19: seven scenarios. SSRN J. 2020. https://www.ssrn.com/abstract=3547729 [cited 2020 Apr 15]; Available at.
- 3.Robbins R. STAT’s guide to health care conferences disrupted by the coronavirus crisis. STAT News. 2020. https://www.statnews.com/2020/03/07/stats-guide-health-care-conferences-disrupted-covid-19/ [cited 2020 Apr 7]. Available at.
- 4.Sahin A.R., Erdogan A., Mutlu Agaoglu P., Dineri Y., Cakirci A.Y., Senel M.E. 2019 novel coronavirus (COVID-19) outbreak: a review of the current literature. Eurasian J Med Oncol. 2020;4(1):1–7. [Google Scholar]
- 5.Salehi S., Abedi A., Balakrishnan S., Gholamrezanezhad A. Coronavirus disease 2019 (COVID-19): a systematic review of imaging findings in 919 patients. Am J Roentgenol. 2020:1–7. doi: 10.2214/AJR.20.23034. [DOI] [PubMed] [Google Scholar]
- 6.Haddaway N.R. Open Synthesis: on the need for evidence synthesis to embrace Open Science. Environ Evid. 2018;7(1):26. [Google Scholar]
- 7.Gough D., Oliver S., Thomas J. 2nd ed. SAGE Publication; London: 2017. An introduction to systematic reviews; p. 304. [Google Scholar]
- 8.Alonso-Coello P., Schünemann H.J., Moberg J., Brignardello-Petersen R., Akl E.A., Davoli M. GRADE Evidence to Decision (EtD) frameworks: a systematic and transparent approach to making well informed healthcare choices. 1: Introduction. BMJ. 2016;353:i2016. doi: 10.1136/bmj.i2016. [DOI] [PubMed] [Google Scholar]
- 9.Livoreil B., Glanville J., Haddaway N.R., Bayliss H., Bethel A., Lachapelle F.F. Systematic searching for environmental evidence using multiple tools and sources. Environ Evid. 2017;6(1):23. [Google Scholar]
- 10.Chawla A., Twycross-Lewis R., Maffulli N. Microfracture produces inferior outcomes to other cartilage repair techniques in chondral injuries in the paediatric knee. Br Med Bull. 2015;116(1):93–103. doi: 10.1093/bmb/ldv040. [DOI] [PubMed] [Google Scholar]
- 11.Piwowar H., Priem J., Larivière V., Alperin J.P., Matthias L., Norlander B. The state of OA: a large-scale analysis of the prevalence and impact of Open Access articles. PeerJ. 2018;6:e4375. doi: 10.7717/peerj.4375. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 12.Glasziou P., Altman D.G., Bossuyt P., Boutron I., Clarke M., Julious S. Reducing waste from incomplete or unusable reports of biomedical research. Lancet. 2014;383:267–276. doi: 10.1016/S0140-6736(13)62228-X. [DOI] [PubMed] [Google Scholar]
- 13.Moynihan R., Bero L., Hill S., Johansson M., Lexchin J., Macdonald H. Pathways to independence: towards producing and using trustworthy evidence. BMJ. 2019;367:l6576. doi: 10.1136/bmj.l6576. [DOI] [PubMed] [Google Scholar]
- 14.Chiang I.C.A., Jhangiani R.S., Price P.C. From the “replicability crisis” to open science practices. Research Methods in Psychology. BCcampus. 2015. https://opentextbc.ca/researchmethods/chapter/from-the-replicability-crisis-to-open-science-practices/ Available at:
- 15.Farrow R. Open education and critical pedagogy. Learn Media Technology. 2017;42(2):130–146. [Google Scholar]
- 16.Fecher B., Friesike S. Open science: one term, five schools of thought. In: Bartling S., Friesike S., editors. Opening Science: The Evolving Guide on How the Internet is Changing Research, Collaboration and Scholarly Publishing [Internet] Springer International Publishing; Cham: 2014. pp. 17–47. [Google Scholar]
- 17.Gewin V. Data sharing: an open mind on open data. Nature. 2016;529:117–119. doi: 10.1038/nj7584-117a. [DOI] [PubMed] [Google Scholar]
- 18.Foster E.D., Deardorff A. Open science framework (OSF) J Med Libr Assoc. 2017;105:203–206. [Google Scholar]
- 19.Knoth P., Pontika N. figshare; 2015. Open Science Taxonomy. [DOI] [Google Scholar]
- 20. Clarke M. Evidence Aid – from the Asian tsunami to the Wenchuan earthquake. J Evid Based Med. 2008;1(1):9–11. doi: 10.1111/j.1756-5391.2008.00007.x.
- 21. Schünemann H.J., Santesso N., Vist G.E., Cuello C., Lotfi T., Flottorp S. Using GRADE in situations of emergencies and urgencies: certainty in evidence and recommendations matters during the COVID-19 pandemic, now more than ever and no matter what. J Clin Epidemiol. 2020. doi: 10.1016/j.jclinepi.2020.05.030.
- 22. Haddaway N.R., Feierman A., Grainger M.J., Gray C.T., Tanriver-Ayder E., Dhaubanjar S. EviAtlas: a tool for visualising evidence synthesis databases. Environ Evid. 2019;8(1):22.
- 23. Bou-Karroum L., Hakoum M.B., Hammoud M.Z., Khamis A.M., Al-Gibbawi M., Badour S. Reporting of financial and non-financial conflicts of interest in systematic reviews on health policy and systems research: a cross-sectional survey. Int J Health Policy Manag. 2018;7(8):711–717. doi: 10.15171/ijhpm.2017.146.
- 24. Bahlai C., Bartlett L.J., Burgio K.R., Fournier A.M., Keiser C.N., Poisot T. Open science isn’t always open to all scientists. Am Sci. 2019;107(2):78–82.
- 25. Lawson S. Fee waivers for open access journals. Publications. 2015;3(3):155–167.
- 26. Grand A., Wilkinson C., Bultitude K., Winfield A.F.T. Open science: a new “trust technology”? Sci Commun. 2012;34(5):679–689.
- 27. Nielsen M. Reinventing Discovery: The New Era of Networked Science. Princeton University Press; New Jersey: 2020. p. 204.
- 28. Grand A., Wilkinson C., Bultitude K., Winfield A.F.T. Mapping the hinterland: data issues in open science. Public Underst Sci. 2016;25(1):88–103. doi: 10.1177/0963662514530374.
- 29. Allen C., Mehler D.M.A. Open science challenges, benefits and tips in early career and beyond. PLoS Biol. 2019;17(5):e3000246. doi: 10.1371/journal.pbio.3000246.
- 30. Longo D.L., Drazen J.M. Data sharing. N Engl J Med. 2016;374(3):276–277. doi: 10.1056/NEJMe1516564.
- 31. Lancaster A. Open science and its discontents. Ronin Institute. Available at: http://ronininstitute.org/open-science-and-its-discontents/1383/
- 32. Beagrie N., Lavoie B., Woolard M. Keeping Research Data Safe (Phase 2). Jisc; 2010. Available at: https://www.webarchive.org.uk/wayback/archive/20140613220103mp_/http://www.jisc.ac.uk/publications/reports/2010/keepingresearchdatasafe2.aspx
- 33. Akl E.A., Haddaway N.R., Rada G., Lotfi T. Evidence synthesis 2.0: when systematic, scoping, rapid, living, and overviews of reviews come together. J Clin Epidemiol. 2020. doi: 10.1016/j.jclinepi.2020.01.025.
- 34. Pievatolo M.C. Open science: human emancipation or bureaucratic serfdom? SCIRES-IT. Available at: https://archiviomarini.sp.unipi.it/858/
- 35. Corbett S. Creative Commons licences, the copyright regime and the online community: is there a fatal disconnect? Mod Law Rev. 2011;74(4):503–531.
- 36. Cummings J.A., Zagrodney J.M., Day T.E. Impact of open data policies on consent to participate in human subjects research: discrepancies between participant action and reported concerns. PLoS One. 2015;10:e0125208. doi: 10.1371/journal.pone.0125208.
- 37. Walsh C.G., Xia W., Li M., Denny J.C., Harris P.A., Malin B.A. Enabling open-science initiatives in clinical psychology and psychiatry without sacrificing patients’ privacy: current practices and future challenges. Adv Methods Pract Psychol Sci. 2018;1(1):104–114.
- 38. van Middelkoop M., Lohmander S., Bierma-Zeinstra S.M.A. Sharing data – taming the beast: barriers to meta-analyses of individual patient data (IPD) and solutions. Br J Sports Med. 2020. Available at: https://bjsm.bmj.com/content/early/2020/01/29/bjsports-2019-101892
- 39. Gagliardi D., Cox D., Li Y. Institutional inertia and barriers to the adoption of open science. In: The Transformation of University Institutional and Organizational Boundaries. Brill Sense; Leiden, The Netherlands: 2015. pp. 107–133.
- 40. Mavergames C., Elliott J.H. Living systematic reviews: towards real-time evidence for health-care decision-making. BMJ Best Practice. 2020. Available at: https://bestpractice.bmj.com/info/toolkit/discuss-ebm/living-systematic-reviews-towards-real-time-evidence-for-health-care-decision-making/
- 41. LeBel E.P., Campbell L., Loving T.J. Benefits of open and high-powered research outweigh costs. J Pers Soc Psychol. 2017;113(2):230. doi: 10.1037/pspi0000049.
- 42. Shokraneh F., Adams C.E., Clarke M., Amato L., Bastian H., Beller E. Why Cochrane should prioritise sharing data. BMJ. 2018;362:k3229. doi: 10.1136/bmj.k3229.
- 43. Higgins J., Churchill R., Lasserson T., Chandler J., Tovey D. Update from the Methodological Expectations of Cochrane Intervention Reviews (MECIR) project. Cochrane Methods. Cochrane; 2012. Accessed April 22, 2020.
- 44. Welch V.A. Campbell systematic reviews takes next step to meeting FAIR principles. Campbell Syst Rev. 2019;15(1–2):e1032. doi: 10.1002/cl2.1032.