BMJ Open. 2020 Feb 13;10(2):e034463. doi: 10.1136/bmjopen-2019-034463

Reproducible research practices, openness and transparency in health economic evaluations: study protocol for a cross-sectional comparative analysis

Ferrán Catalá-López 1,2,3, Lisa Caulley 3,4,5,6, Manuel Ridao 7, Brian Hutton 3,8, Don Husereau 9,10, Michael F Drummond 11, Adolfo Alonso-Arroyo 12,13, Manuel Pardo-Fernández 14, Enrique Bernal-Delgado 7, Ricard Meneu 15, Rafael Tabarés-Seisdedos 2, José Ramón Repullo 1, David Moher 3,8
PMCID: PMC7045222  PMID: 32060160

Abstract

Introduction

There has been a growing awareness of the need for rigorous and transparent reporting of health research, to ensure that studies can be reproduced by future researchers. Health economic evaluations, the comparative analysis of alternative interventions in terms of their costs and consequences, have been promoted as an important tool to inform decision-making. The objective of this study will be to investigate the extent to which articles of economic evaluations of healthcare interventions indexed in MEDLINE incorporate research practices that promote transparency, openness and reproducibility.

Methods and analysis

This is the study protocol for a cross-sectional comparative analysis. We registered the study protocol within the Open Science Framework (osf.io/gzaxr). We will evaluate a random sample of 600 cost-effectiveness analysis publications, a specific form of health economic evaluation, indexed in MEDLINE during 2012 (n=200), 2019 (n=200) and 2022 (n=200). We will include published papers written in English reporting an incremental cost-effectiveness ratio in terms of costs per life years gained, quality-adjusted life years and/or disability-adjusted life years. Screening and selection of articles will be conducted by at least two researchers. Data on reproducible research practices, openness and transparency in each article will be extracted using a standardised data extraction form by multiple researchers, with a 33% random sample (n=200) extracted in duplicate. Information on general, methodological and reproducibility items will be reported, stratified by year, citation of the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement and journal. Risk ratios with 95% CIs will be calculated to represent changes in reporting between 2012–2019 and 2019–2022.

Ethics and dissemination

Due to the nature of the proposed study, no ethical approval will be required. All data will be deposited in a cross-disciplinary public repository. It is anticipated the study findings could be relevant to a variety of audiences. Study findings will be disseminated at scientific conferences and published in peer-reviewed journals.

Keywords: cost-effectiveness analysis, data sharing, methodology, quality, reporting, reproducibility


Strengths and limitations of this study

  • To our knowledge, this will be the first attempt to examine the extent to which health economic evaluations indexed in MEDLINE incorporate transparency, openness and reproducibility research practices.

  • We will be able to collect data on a broad cross-section of health economic evaluations and will not restrict inclusion based on the medical specialty, disease condition or healthcare intervention.

  • Study findings could be used to strengthen Open Science strategies and recommendations to increase the value of health economic evaluations.

  • The study may be limited by the inclusion of articles only catalogued in one database and written in English.

Introduction

In recent years, there has been a growing awareness of the need for rigorous and transparent reporting of health research to ensure that studies can be reproduced.1–7 The value of health research can be improved by increasing transparency and openness of the processes of research design, conduct, analysis and reporting.8 9 Sharing data and materials from health research studies has multiple positive effects within the research community: it is part of good publication practice in keeping with the principles of Open Science; it allows for the conduct of additional analyses to further explore data and generate new hypotheses; it allows access to unpublished data and it encourages reproducibility in research.10 Recognising the potential impact of an open research culture, journals are increasingly supporting the use of reporting guidelines, as well as policies and technologies that help to improve transparency.11–13 Scientists are increasingly encouraged to use reproducible research practices, which allow others to perform direct replication of studies using the same data and analytic methods.14 15 Furthermore, research funders are changing their grant requirements to include open data sharing.16 17

Health economic evaluations, which compare alternative interventions or programmes in terms of their costs and consequences,18 can help inform resource allocation decisions. A cost-effectiveness analysis, a specific form of economic evaluation that compares alternative options in terms of their costs and their health outcomes, is a valuable tool in health technology assessment processes. Cost-effectiveness analyses have been promoted as an important research methodology for assessing the value for money of healthcare interventions and an important source of information for making clinical and policy decisions.19 Decisions about the use of new interventions in healthcare are often based on health economic evaluations. Efforts to increase the transparent conduct and reporting of health economic evaluations have existed for many years.20–30 For example, the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement,30 first published in March 2013, provides recommendations for authors, peer reviewers and journal editors regarding how to prepare reports of health economic evaluations. The aim of CHEERS is to facilitate complete and transparent reporting of health economic evaluations and to support more formal critical appraisal and interpretation. As a potential measure of impact,31 CHEERS has been cited over 1000 times in the Web of Science. However, little attention has been given to reproducibility practices such as sharing of study protocols, data and analytic methods (which allow others to recreate the study findings) as part of health economic evaluation studies.22–25 29

Previous research has evaluated the impact of economic evaluation guidelines and the reporting quality of published articles. For example, Jefferson et al 32 previously investigated whether publication (in August 1996) of the BMJ guidelines on peer review of economics submissions made any difference to editorial and peer review processes, the quality of submitted manuscripts and the quality of published manuscripts in two high-impact factor medical journals (BMJ and The Lancet). In a sample of 105 articles on economics submissions, 27 (24.3%) were full health economic evaluations. Although Jefferson et al 32 were not studying reproducibility, openness and transparency directly, they did undertake an assessment of the impact of a reporting guideline for health economic evaluations. A 'before and after' assessment of implementation of the guideline was performed to assess how closely the reporting guidelines were followed. The authors found that publication of the guidelines helped the editors improve the efficiency of the editorial process but had no impact on the reporting quality of health economic evaluations submitted or published.

The primary objective of this study will be to examine the extent to which articles of health economic evaluations of healthcare interventions indexed in MEDLINE incorporate transparency, openness and reproducibility research practices. Secondary objectives will be to explore (1) how the reporting and reproducibility characteristics of health economic evaluations change between 2012 and 2022 and (2) whether transparency and reproducibility practices have improved after the publication of the CHEERS statement in 2013.

Methods and analysis

This is the study protocol for a cross-sectional, comparative analysis. The present protocol has been registered within the Open Science Framework (registration identifier: osf.io/gzaxr). It is anticipated the study will be conducted during January 2020–December 2023.

Eligibility criteria

We will evaluate a random sample of 600 cost-effectiveness and cost-utility analyses of healthcare interventions, indexed in MEDLINE during 2012 (n=200), 2019 (n=200) and 2022 (n=200), which focus on a healthcare intervention in humans and report an incremental cost-effectiveness ratio in terms of costs per life years gained, quality-adjusted life years or disability-adjusted life years. In particular, this analysis will focus on full health economic evaluations that measure health effects in terms of prolongation of life and/or health-related quality of life. We will select this specific form of health economic evaluation because many decision-makers and researchers have recommended this framework as the standard reference for cost-effectiveness in health and medicine.19 Publications of health economic evaluations will be limited to journal articles written in English with an abstract available.

We will exclude editorials, letters, narrative reviews, systematic reviews, meta-analyses, methodological articles, retracted publications and health economic evaluations that do not quantify health impacts in terms of life years gained, quality-adjusted life years or disability-adjusted life years.

Searching

To provide a reliable summary of the literature, we will search MEDLINE through PubMed (National Library of Medicine, Bethesda, Maryland, USA) for candidate studies across three cross-sectional, comparative time periods. First, we will search MEDLINE-indexed articles from 2019 (the 'reference year'), as it is the year closest to when the protocol for this study was drafted. Second, we will search for articles indexed in 2012 and 2022, respectively, to further assess whether transparency and reproducibility practices improved between 2012 (1 year before the publication of the CHEERS statement in 201330) and 2022 (10 years after). The literature searches will be conducted by an experienced information specialist. Our main literature search will be peer reviewed by a senior health information specialist using the Peer Review of Electronic Search Strategies checklist.33 The draft literature search strategy is based on a MEDLINE search filter for economic evaluations34 and can be found in the online supplementary appendix 1.
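As an illustration of how the year-stratified searches could be automated, the sketch below queries PubMed through the NCBI E-utilities (via Biopython's Entrez module). The query string is a simplified, hypothetical stand-in for the actual search strategy in online supplementary appendix 1, and the email address is a placeholder.

```python
# Minimal sketch: retrieve PubMed IDs for candidate articles indexed in a
# given year. The query is a simplified stand-in for the full strategy in
# online supplementary appendix 1, not the protocol's actual filter.
from Bio import Entrez

Entrez.email = "researcher@example.org"  # placeholder; required by NCBI

def search_year(year: int) -> list[str]:
    query = (
        '("cost-effectiveness"[Title/Abstract] OR '
        '"cost-utility"[Title/Abstract]) '
        'AND english[Language] AND hasabstract[text]'
    )
    handle = Entrez.esearch(
        db="pubmed", term=query, datetype="edat",
        mindate=str(year), maxdate=str(year), retmax=100000,
    )
    record = Entrez.read(handle)
    handle.close()
    return record["IdList"]

pmids_2019 = search_year(2019)  # the protocol's 'reference year'
```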

Supplementary data

bmjopen-2019-034463supp001.pdf (38.7KB, pdf)

Screening

All titles and abstracts will be screened using liberal acceleration (where two reviewers need to independently exclude a record while only one reviewer needs to include a record). We will retrieve the full text of any citations meeting our eligibility criteria or for which eligibility remains unclear. A form for screening full-text articles will be pilot tested on 50 articles. Subsequently, at least two reviewers will independently screen all full-text articles. Any discrepancies in screening full-text articles will be resolved via discussion or adjudication by a third reviewer if necessary.
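The liberal-acceleration rule can be summarised as a simple decision function: a single 'include' vote advances a record to full-text review, whereas exclusion requires two independent 'exclude' votes. A minimal sketch follows; the function and vote encoding are illustrative, not part of the protocol.

```python
# Minimal sketch of liberal-accelerated title/abstract screening:
# one "include" vote retains a record, while exclusion requires
# two independent "exclude" votes.
def retained_at_title_abstract(votes: list[str]) -> bool:
    """votes: independent reviewer decisions, 'include' or 'exclude'."""
    return "include" in votes or votes.count("exclude") < 2

assert retained_at_title_abstract(["include"])                 # advances
assert retained_at_title_abstract(["exclude"])                 # needs a 2nd vote
assert not retained_at_title_abstract(["exclude", "exclude"])  # excluded
```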

Data extraction

If more than 600 health economic evaluations are identified in the search, we will perform data extraction on a random sample of articles stratified by publication year (200 each in 2012, 2019 and 2022). If fewer than 200 articles are identified in a given year (eg, 2012), we will randomly select a sufficient number of studies from the end of the preceding year (eg, October–December 2011) to reach the target sample size. We will not perform any sample size calculations, since our study will evaluate multiple indicators that are all considered equally important and that may vary substantially in the proportion of included articles satisfying them. However, 200 articles per year was assumed to be sufficient to capture potential differences.
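A minimal sketch of this sampling step appears below; the fixed seed, data structure and top-up rule are illustrative assumptions rather than prescribed by the protocol.

```python
import random

# Illustrative sketch: draw 200 records per publication year, topping up a
# short stratum from the preceding year as the protocol describes.
def stratified_sample(records: dict[int, list[str]],
                      per_year: int = 200,
                      seed: int = 2020) -> dict[int, list[str]]:
    rng = random.Random(seed)  # fixed seed makes the draw reproducible
    sample = {}
    for year in (2012, 2019, 2022):
        pool = records.get(year, [])
        if len(pool) >= per_year:
            sample[year] = rng.sample(pool, per_year)
        else:
            # eg, if 2012 yields <200 articles, draw the shortfall from
            # late-2011 records (October-December), per the protocol
            shortfall = per_year - len(pool)
            sample[year] = pool + rng.sample(records.get(year - 1, []), shortfall)
    return sample
```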

Data in each article will be extracted using a standardised data extraction form by multiple researchers, with a 33% random sample (n=200) extracted in duplicate. All data extractors will independently pilot test the form on 30 included studies to ensure consistency in interpretation of data items. Subsequently, data from each study will be independently extracted by one of several reviewers. Any discrepancies in the data extracted will be resolved via discussion or adjudication by a third researcher if necessary. Full articles and supplementary materials with data and analyses will be examined for general and methodological characteristics, statements of publicly available full protocols and data sets, conflicts of interest and funding disclosures. In particular, we will review the final versions of the articles available online.

The selection and wording of general, methodological and reproducibility indicators will be influenced by recommendations from relevant articles on research transparency and reproducibility.4 5 7 8 29 35–41 The standardised data extraction form will include the following:

General characteristics

  • Name of journal.

  • Journal impact factor (according to the latest Journal Citation Report at the time of data extraction).

  • Journal type (fully open access journal or subscription-based journal including those that may have open access content, eg, hybrid).

  • Year of publication.

  • Name, gender and country of corresponding author.

  • Type of condition addressed by the economic evaluation (International Statistical Classification of Diseases and Related Health Problems, 10th Revision category).

  • Type of interventions addressed (pharmacological, non-pharmacological, both) and the intervention to which it was compared (the ‘comparator’, eg, active alternative, usual care or placebo/do nothing) with adequate descriptions.40 41

  • Type of economic evaluation (single-study-based economic evaluation or model-based economic evaluation).

  • Study perspective (eg, society, healthcare system/provider) and how it relates to the costs being evaluated.

  • Time horizon over which costs and outcomes are being evaluated.

  • Discount rate used for costs and outcomes with rationale (when applicable).

  • Health outcomes used as the measure of benefit (eg, life years gained, quality-adjusted life years or disability-adjusted life years) and their relevance for the type of analysis performed.

  • Measurement of effectiveness (eg, for single-study-based estimates: a description of the design features of the single effectiveness study and why the single study was a sufficient source of clinical effectiveness; and for synthesis-based estimates: a description of the methods used for identification of included studies and synthesis of clinical effectiveness data).

  • Estimate of resources and costs (including a description of approaches used to estimate resource use associated with the alternative interventions and describe methods for valuing each resource item in terms of its unit costs).

  • Discussion of all analytical methods supporting the evaluation (eg, methods for dealing with skewed, missing or censored data; extrapolation methods; methods for pooling data; methods for handling population heterogeneity and uncertainty such as subgroup analysis); choice of model and model calibration and validation (when applicable).

  • Results, including the number of incremental cost-effectiveness ratios (ICERs); sensitivity analyses; subgroup or heterogeneity analyses (eg, variations between subgroups of patients with different baseline characteristics or other variability in effects); the base case ICER expressed qualitatively (eg, 'more costs, more outcomes', 'less costs, more outcomes', 'less costs, comparable outcomes') and quantitatively (the cost-effectiveness ratio value); and the incremental costs (the ratio's numerator) and health effects (life years gained, quality-adjusted life years or both: the ratio's denominator) for the base case analysis (a worked sketch follows this list).

  • Conclusions, classified as favourable if the intervention clearly claims to be the preferred choice (eg, cited as 'cost-effective', 'reduced costs', 'produced cost savings', 'an affordable option', 'value for money'), unfavourable if the final comments are negative (eg, the intervention is 'unlikely to be cost-effective', 'produced higher costs', 'is economically unattractive' or 'exceeded conventional thresholds of willingness to pay') and neutral or uncertain when the intervention of interest does not surpass the comparator and/or when some uncertainty is expressed in the conclusions.

  • Funding (eg, no statement, no funding, public, private, other, combination of public/private/other).

  • Conflicts of interests (eg, no statement, statement no conflicts exist, statement conflicts exist).
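To make the base case extraction concrete, the sketch below computes an ICER from incremental costs and effects and assigns the qualitative label described above; all values and category names are illustrative.

```python
# Illustrative only: compute a base case ICER and its qualitative label.
def icer(cost_new: float, cost_old: float,
         eff_new: float, eff_old: float):
    d_cost = cost_new - cost_old   # incremental costs (ratio's numerator)
    d_eff = eff_new - eff_old      # incremental effects, eg QALYs (denominator)
    if d_cost > 0 and d_eff > 0:
        label = "more costs, more outcomes"
    elif d_cost < 0 and d_eff > 0:
        label = "less costs, more outcomes"
    elif d_cost < 0 and d_eff == 0:
        label = "less costs, comparable outcomes"
    else:
        label = "other"
    ratio = d_cost / d_eff if d_eff != 0 else None  # undefined without effect gain
    return ratio, label

# Hypothetical base case: 12000 extra cost for 0.4 extra QALYs
print(icer(52000, 40000, 6.4, 6.0))  # (30000.0, 'more costs, more outcomes')
```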

Enablers for reproducibility, transparency and openness

  • Citation and/or mention of CHEERS statement (eg, no citation/mention, citation/mention without reporting checklist, citation/mention with reporting checklist).

  • Use of CHEERS appropriately (eg, when CHEERS was used as a reporting guideline to ensure a clear report of the study’s design, conduct and findings), inappropriately (eg, when CHEERS was used as a methodological tool to design or conduct health economic evaluations or as an assessment tool of methodological quality of publications reporting cost-effectiveness research) or in an unclear or neutral manner (eg, when use was neither appropriate nor inappropriate).31 42

  • Open access or free availability in PubMed Central based on assignment of a specific identifier (yes, no).

  • Protocol/registration mentioned (eg, no protocol, full protocol publicly available, full protocol publicly available and preregistered).

  • Health economics analysis plan mentioned (eg, no analysis plan, indicated that analysis plan was available on request, full access to analysis plan along with research protocol).39

  • Mention of raw data availability (eg, no data sharing, indicated that raw data were available on request, full access to raw data for reanalysis).

  • Mention of access to analytic methods and algorithms (eg, ‘code’, ‘script’, ‘model’) used to perform analyses (eg, no access, indicated that analytic methods were available on request, full access to analytic methods for reanalysis).

  • Type of data repository used, if applicable, including an open globally scoped repository (eg, Open Science Framework, Dryad, Mendeley, Zenodo), a journal repository (eg, additional file such as a web appendix or data paper) or another repository (eg, a repository from a specific institution, project or nation).

  • Data made available to recreate the index ICERs (base case).

  • Data made available to recreate all core ICERs (base case and heterogeneity analysis).

  • Data made available to recreate all ICERs (base case, heterogeneity analysis and uncertainty analysis) according to reporting standards.30 38

  • Results have undergone rigorous independent replication and reproducibility checks (eg, whether the study claimed to be a replication effort in the abstract and introduction)4 5: statement of novel findings (the cost-effectiveness analysis claims to present novel findings), statement of replication (the cost-effectiveness analysis clearly claims to be, or can be inferred to be, a replication effort trying to validate previous knowledge), statement of novel findings and replication (the cost-effectiveness analysis claims both to be novel and to replicate previous findings), or no statement on novelty or replication (no statement or an unclear statement about whether the cost-effectiveness analysis presents a novel finding or a replication).
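One way to keep the enablers listed above machine-readable during extraction is a small structured record per article; the field names and category codes below are an illustrative sketch, not the protocol's actual form.

```python
from dataclasses import dataclass

# Illustrative extraction record for the reproducibility enablers; the
# fields and category codes are assumptions for this sketch only.
@dataclass
class ReproducibilityIndicators:
    pmid: str
    cheers_citation: str    # "none" | "no_checklist" | "with_checklist"
    cheers_use: str         # "appropriate" | "inappropriate" | "unclear"
    open_access_pmc: bool
    protocol_status: str    # "none" | "public" | "public_preregistered"
    analysis_plan: str      # "none" | "on_request" | "full_access"
    raw_data: str           # "none" | "on_request" | "full_access"
    analytic_code: str      # "none" | "on_request" | "full_access"
    repository: str | None  # eg "OSF", "journal", "institutional"
    recreates_index_icer: bool
    recreates_all_icers: bool

record = ReproducibilityIndicators(
    pmid="00000000", cheers_citation="with_checklist", cheers_use="appropriate",
    open_access_pmc=True, protocol_status="public", analysis_plan="none",
    raw_data="on_request", analytic_code="none", repository=None,
    recreates_index_icer=False, recreates_all_icers=False,
)
```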

Data analysis

The analysis will be descriptive, with data summarised as frequencies for categorical items or medians and IQRs for continuous items. We will characterise the indicators for the period 2012–2022. The proportion of general, methodological and reproducibility indicators will be reported, stratified by year, citation of the CHEERS statement and journal (eg, according to whether or not it is an original CHEERS-endorsed journal). The draft list of original CHEERS-endorsed journals can be found in the online supplementary appendix 2. Fisher's exact tests and risk ratios with 95% CIs, established a priori, will be calculated to represent changes in reporting between 2012–2019 and 2019–2022. We will explore whether reproducible research practices are associated with citation of the CHEERS statement. We will apply a p value threshold of <0.005 for statistical significance, with p values between 0.05 and 0.005 considered suggestive.5 43 44
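As a worked illustration of the planned comparison, the sketch below computes a risk ratio with a 95% CI (log method) and a Fisher's exact test for one indicator between two periods; the counts are entirely hypothetical.

```python
import math
from scipy.stats import fisher_exact

# Hypothetical counts: articles satisfying one indicator in each sample.
def risk_ratio_ci(a: int, n1: int, b: int, n2: int):
    """Risk ratio for a/n1 (later period) vs b/n2 (earlier period)."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)  # SE of log(RR)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

a, n1 = 60, 200  # eg, 60/200 articles cite CHEERS in 2019 (made up)
b, n2 = 20, 200  # eg, 20/200 in 2012 (made up)
print(risk_ratio_ci(a, n1, b, n2))  # RR = 3.0 with its 95% CI
_, p = fisher_exact([[a, n1 - a], [b, n2 - b]])
print(p < 0.005)  # the protocol's threshold for statistical significance
```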

Supplementary data

bmjopen-2019-034463supp002.pdf (42.2KB, pdf)

All analyses will be performed using Stata V.16 or higher (StataCorp LP).

Updates and additional analyses

We plan to conduct a continual surveillance of the health economic literature, keeping evidence as up-to-date as possible. Iterations of the searches and review process will be repeated at regular intervals (eg, 3-year intervals after 2022) to continue to present timely and accurate findings. Reanalysis of the proposed reproducibility and transparency metrics and indicators may offer insight into progressive improvements in design, conduct and analysis of health economic evaluations over time.

Any (new) additional analysis examining potential associations between general characteristics from extracted studies (eg, results including index ICER or funding source) and enablers of reproducibility, transparency and openness (eg, mention of CHEERS statement, open access, protocol registration or mention of raw data) will be prospectively reported in a new specific (substudy) protocol, following standard methods described in this paper.

Patient and public involvement

No patients and/or members of the public were involved in setting the research question, nor were they involved in developing plans for the design (or implementation) of this study protocol.

Ethics and dissemination

To the best of our knowledge, this cross-sectional analysis will be the first attempt to investigate the extent to which published cost-effectiveness analyses of healthcare interventions incorporate transparent, open and reproducible research practices. Without complete and transparent reporting of how a health economic evaluation was designed and conducted, it is difficult for readers and potential knowledge users to assess its conduct and validity. Strengthening the reproducibility, openness and reporting of methods and results can maximise the impact of health economic evaluations by allowing more accurate interpretation and use of their findings. We anticipate the study could be relevant to a variety of audiences including journal editors, peer reviewers, research authors, health technology assessment agencies, guideline developers, research funders, educators and other key stakeholders. Moreover, the study findings could be used in discussions to strengthen Open Science, to increase value and reduce waste from incomplete or unusable reports of health economic evaluations.

Any amendments made to this protocol when conducting the analyses will be outlined and reported in the final manuscript. Once completed, findings from this study will be published in peer-reviewed journals. All data underlying the findings reported in the final manuscript will be deposited in a cross-disciplinary public repository, such as the Open Science Framework (https://osf.io/). In addition, as new data become available, we will update the analysis and present the updated findings in a public repository (and may also seek publication in a peer-reviewed journal).


Footnotes

Twitter: @donhusereau, @chromosome8, @repunomada, @dmoher

Contributors: All authors contributed to conceptualising and designing the study. FC-L drafted the manuscript. LC, MR, BH, DH, MFD, AA-A, MP-F, EB-D, RM, RT-S, JRR and DM commented on the manuscript for important intellectual content and made revisions. All authors read and approved the final version of the manuscript. FC-L accepts full responsibility for the finished manuscript and controlled the decision to publish.

Funding: FC-L and RT-S are supported by the Institute of Health Carlos III/CIBERSAM. BH is supported by a New Investigator Award from the Canadian Institutes of Health Research and the Drug Safety and Effectiveness Network. MR and EB-D are supported by the Institute of Health Carlos III/Spanish Health Services Research on Chronic Patients Network (REDISSEC). DM is supported by a University Research Chair, University of Ottawa. The funders were not involved in the design of the protocol or decision to submit the protocol for publication nor will they be involved in any aspect of the study conduct.

Disclaimer: The views expressed in this manuscript are those of the authors and may not be understood or quoted as being made on behalf of, or reflecting the position of, the funder(s) or any institution.

Competing interests: None declared.

Patient consent for publication: Not required.

Provenance and peer review: Not commissioned; externally peer reviewed.

References

  • 1. Nosek BA, Alter G, Banks GC, et al. Scientific standards. Promoting an open research culture. Science 2015;348:1422–5. doi:10.1126/science.aab2374
  • 2. Begley CG, Buchan AM, Dirnagl U. Robust research: institutions must do their part for reproducibility. Nature 2015;525:25–7. doi:10.1038/525025a
  • 3. Goodman SN, Fanelli D, Ioannidis JPA. What does research reproducibility mean? Sci Transl Med 2016;8:341ps12. doi:10.1126/scitranslmed.aaf5027
  • 4. Iqbal SA, Wallach JD, Khoury MJ, et al. Reproducible research practices and transparency across the biomedical literature. PLoS Biol 2016;14:e1002333. doi:10.1371/journal.pbio.1002333
  • 5. Wallach JD, Boyack KW, Ioannidis JPA. Reproducible research practices, transparency, and open access data in the biomedical literature, 2015–2017. PLoS Biol 2018;16:e2006930. doi:10.1371/journal.pbio.2006930
  • 6. Naudet F, Sakarovitch C, Janiaud P, et al. Data sharing and reanalysis of randomized controlled trials in leading biomedical journals with a full data sharing policy: survey of studies published in The BMJ and PLOS Medicine. BMJ 2018;360:k400. doi:10.1136/bmj.k400
  • 7. Page MJ, Altman DG, Shamseer L, et al. Reproducible research practices are underused in systematic reviews of biomedical interventions. J Clin Epidemiol 2018;94:8–18. doi:10.1016/j.jclinepi.2017.10.017
  • 8. Ioannidis JPA, Greenland S, Hlatky MA, et al. Increasing value and reducing waste in research design, conduct, and analysis. Lancet 2014;383:166–75. doi:10.1016/S0140-6736(13)62227-8
  • 9. Chan A-W, Song F, Vickers A, et al. Increasing value and reducing waste: addressing inaccessible research. Lancet 2014;383:257–66. doi:10.1016/S0140-6736(13)62296-5
  • 10. Committee on Strategies for Responsible Sharing of Clinical Trial Data; Board on Health Sciences Policy; Institute of Medicine. Sharing clinical trial data: maximizing benefits, minimizing risk. Washington, DC: The National Academies Press, 2015.
  • 11. Moher D. Reporting guidelines: doing better for readers. BMC Med 2018;16:233. doi:10.1186/s12916-018-1226-0
  • 12. Loder E, Groves T. The BMJ requires data sharing on request for all trials. BMJ 2015;350:h2373. doi:10.1136/bmj.h2373
  • 13. Taichman DB, Backus J, Baethge C, et al. Sharing clinical trial data: a proposal from the International Committee of Medical Journal Editors. PLoS Med 2016;13:e1001950. doi:10.1371/journal.pmed.1001950
  • 14. Krumholz HM, Waldstreicher J. The Yale Open Data Access (YODA) Project: a mechanism for data sharing. N Engl J Med 2016;375:403–5. doi:10.1056/NEJMp1607342
  • 15. Bertagnolli MM, Sartor O, Chabner BA, et al. Advantages of a truly open-access data-sharing model. N Engl J Med 2017;376:1178–81. doi:10.1056/NEJMsb1702054
  • 16. Collins FS, Tabak LA. Policy: NIH plans to enhance reproducibility. Nature 2014;505:612–3. doi:10.1038/505612a
  • 17. Schiltz M. Science without publication paywalls: cOAlition S for the realisation of full and immediate open access. PLoS Med 2018;15:e1002663. doi:10.1371/journal.pmed.1002663
  • 18. Drummond MF, Sculpher MJ, Torrance G, et al. Methods for the economic evaluation of health care programmes. 3rd ed. Oxford: Oxford University Press, 2005.
  • 19. Gold MR, Siegel JE, Russell LB, et al. Cost-effectiveness in health and medicine. Oxford: Oxford University Press, 1996.
  • 20. Hillman AL, Eisenberg JM, Pauly MV, et al. Avoiding bias in the conduct and reporting of cost-effectiveness research sponsored by pharmaceutical companies. N Engl J Med 1991;324:1362–5. doi:10.1056/NEJM199105093241911
  • 21. Bell CM, Urbach DR, Ray JG, et al. Bias in published cost effectiveness studies: systematic review. BMJ 2006;332:699–703. doi:10.1136/bmj.38737.607558.80
  • 22. Rennie D, Luft HS. Pharmacoeconomic analyses: making them transparent, making them credible. JAMA 2000;283:2158–60. doi:10.1001/jama.283.16.2158
  • 23. Poole C, Agrawal S, Currie CJ. Let cost effectiveness models be open to scrutiny. BMJ 2007;335:735. doi:10.1136/bmj.39360.379664.BE
  • 24. Cohen JT, Neumann PJ, Wong JB. A call for open-source cost-effectiveness analysis. Ann Intern Med 2017;167:432–3. doi:10.7326/M17-1153
  • 25. Dunlop WCN, Mason N, Kenworthy J, et al. Benefits, challenges and potential strategies of open source health economic models. Pharmacoeconomics 2017;35:125–8. doi:10.1007/s40273-016-0479-8
  • 26. Neumann PJ, Sanders GD. Cost-effectiveness analysis 2.0. N Engl J Med 2017;376:203–5. doi:10.1056/NEJMp1612619
  • 27. Drummond MF, Jefferson TO. Guidelines for authors and peer reviewers of economic submissions to the BMJ. The BMJ Economic Evaluation Working Party. BMJ 1996;313:275–83. doi:10.1136/bmj.313.7052.275
  • 28. Sanders GD, Neumann PJ, Basu A, et al. Recommendations for conduct, methodological practices, and reporting of cost-effectiveness analyses: second panel on cost-effectiveness in health and medicine. JAMA 2016;316:1093–103. doi:10.1001/jama.2016.12195
  • 29. Neumann PJ, Kim DD, Trikalinos TA, et al. Future directions for cost-effectiveness analyses in health and medicine. Med Decis Making 2018;38:767–77. doi:10.1177/0272989X18798833
  • 30. Husereau D, Drummond M, Petrou S, et al. Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement. BMJ 2013;346:f1049. doi:10.1136/bmj.f1049
  • 31. Caulley L, Khoury M, Whelan J, et al. Citation analysis of reporting guidelines, 2019. Available: https://osf.io/v46s2/
  • 32. Jefferson T, Smith R, Yee Y, et al. Evaluating the BMJ guidelines for economic submissions: prospective audit of economic submissions to BMJ and the Lancet. JAMA 1998;280:275–7. doi:10.1001/jama.280.3.275
  • 33. McGowan J, Sampson M, Salzwedel DM, et al. PRESS Peer Review of Electronic Search Strategies: 2015 guideline statement. J Clin Epidemiol 2016;75:40–6. doi:10.1016/j.jclinepi.2016.01.021
  • 34. Glanville J, Kaunelis D, Mensinkai S. How well do search filters perform in identifying economic evaluations in MEDLINE and EMBASE? Int J Technol Assess Health Care 2009;25:522–9. doi:10.1017/S0266462309990523
  • 35. Nosek BA, Ebersole CR, DeHaven AC, et al. The preregistration revolution. Proc Natl Acad Sci U S A 2018;115:2600–6. doi:10.1073/pnas.1708274114
  • 36. Wilkinson MD, Dumontier M, Aalbersberg IJJ, et al. The FAIR Guiding Principles for scientific data management and stewardship. Sci Data 2016;3:160018. doi:10.1038/sdata.2016.18
  • 37. Aczel B, Szaszi B, Sarafoglou A, et al. A consensus-based transparency checklist. Nat Hum Behav 2020;4:4–6. doi:10.1038/s41562-019-0772-6
  • 38. Chiou C-F, Hay JW, Wallace JF, et al. Development and validation of a grading system for the quality of cost-effectiveness studies. Med Care 2003;41:32–44. doi:10.1097/00005650-200301000-00007
  • 39. Dritsaki M, Gray A, Petrou S, et al. Current UK practices on health economics analysis plans (HEAPs): are we using heaps of them? Pharmacoeconomics 2018;36:253–7. doi:10.1007/s40273-017-0598-x
  • 40. Hoffmann TC, Glasziou PP, Boutron I, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ 2014;348:g1687. doi:10.1136/bmj.g1687
  • 41. Hoffmann TC, Oxman AD, Ioannidis JP, et al. Enhancing the usability of systematic reviews by improving the consideration and description of interventions. BMJ 2017;358:j2998. doi:10.1136/bmj.j2998
  • 42. da Costa BR, Cevallos M, Altman DG, et al. Uses and misuses of the STROBE statement: bibliographic study. BMJ Open 2011;1:e000048. doi:10.1136/bmjopen-2010-000048
  • 43. Ioannidis JPA. The proposal to lower P value thresholds to .005. JAMA 2018;319:1429–30. doi:10.1001/jama.2018.1536
  • 44. Ioannidis JPA. Lowering the P value threshold: reply. JAMA 2018;320:937–8. doi:10.1001/jama.2018.8743
