PLoS One. 2021 Feb 8;16(2):e0246080. doi: 10.1371/journal.pone.0246080

An overview of the characteristics and quality assessment criteria in systematic review of pharmacoeconomics

Chen Min 1,2, Mi Xue 1,2, Fei Haotian 1,2, Li Jialian 1,2, Zhang Lingli 1,2,*
Editor: Kevin Lu3
PMCID: PMC7870091  PMID: 33556056

Abstract

Background

The systematic review of economic evaluations plays a critical role in making well-informed decisions about competing healthcare interventions. The quality of these systematic reviews varies due to the lack of internationally recognized methodological evaluation standards.

Methods

Nine English and Chinese databases were searched: the Cochrane Library, PubMed, EMbase (Ovid), the NHS Economic Evaluation Database (NHSEED) (Ovid), the Health Technology Assessment (HTA) database, the Chinese National Knowledge Infrastructure (CNKI), WanFang Data, VIP Chinese Science & Technology Periodicals (VIP) and the Chinese Biomedical Literature Database (CBM). Two reviewers independently screened studies and extracted data. The methodological quality of the literature was measured with a modified AMSTAR. Data were synthesized narratively.

Results

A total of 165 systematic reviews were included. The overall methodological quality of the literature was moderate according to the AMSTAR scale. Across these articles, thirteen published quality assessment tools and 32 author-customized criteria were used. The three most widely used tools were the Drummond checklist (19.4%), the BMJ checklist (15.8%) and the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement (12.7%). Others included the Quality of Health Economic Studies (QHES) instrument, the Consensus on Health Economic Criteria (CHEC) list, the Center for Reviews and Dissemination (CRD) checklist, the Philips checklist, the World Health Organization (WHO) checklist, the Critical Appraisal Skills Program (CASP) checklist, the Pediatric Quality Appraisal Questionnaire (PQAQ), the Joanna Briggs Institute (JBI) checklist, and the Spanish and Chinese guidelines. The only quantitative (scored) scales among these were the QHES and the PQAQ.

Conclusions

Evidence showed that the methodology of pharmacoeconomic systematic reviews still needs improvement, while the quality assessment criteria used are gradually becoming unified. Multiple scales can be combined to evaluate the quality of economic studies of different settings and types.

Introduction

Achieving a more efficient allocation of limited medical and health resources has become a great challenge for health policy-makers [1, 2]. As a branch of health economics, pharmacoeconomic evaluations have been widely applied to healthcare decision-making and are essential for health technology assessment (HTA) [3, 4]. However, with the increasing number of economic evaluations, healthcare professionals, consumers and policy-makers can often be overwhelmed by the presence of differing results [5, 6]. The systematic review (SR) of these studies is considered a useful approach to making well-informed decisions on competing healthcare interventions [7].

While some question the value of pooling such heterogeneous studies, advocates such as the Cochrane Collaboration, the National Institute for Health and Care Excellence (NICE), the Pharmaceutical Benefits Advisory Committee (PBAC) and the Canadian Agency for Drugs and Technologies in Health (CADTH) believe that the results of a pharmacoeconomic SR can support public health assessment and policy-making [5, 8–11]. They maintain that the aim is not to produce an authoritative estimate of relative cost-effectiveness but rather to help decision-makers understand the structure and potential impact of resource allocation issues. Such a review can show decision-makers the range and quality of available studies and the gaps in the evidence base [12]. To maximize its usefulness, a pharmacoeconomic SR should follow a systematic process. However, the quality of these reviews varies because of the lack of internationally recognized methodological evaluation standards and the paucity of detailed reporting [13, 14]. In 2002, the Cochrane Economic Collaboration Group assessed the quality of 39 SRs of economic evaluations published between 1990 and 2001 using a 6-item checklist of its own design. The results showed that overall quality was satisfactory, but that more attention needed to be paid to search strategies and to the instruments used to assess the quality of included studies [15]: approximately half of the reviews had not searched databases thoroughly, and about one-third did not describe the quality evaluation of the included studies. Another review, conducted by Luhnen et al. in 2018, concluded that the methods applied in 83 SRs of health economic evaluations produced by HTA agencies, and their reporting quality, were very heterogeneous [16]. Process steps such as data extraction, methodological quality assessment and applicability assessment were frequently not performed at all, not reported, or reported nontransparently.

Appraising the quality of the included individual studies is an important part of performing a pharmacoeconomic SR. Using a well-developed checklist makes pharmacoeconomic SRs more transparent, informative and comparable [6]. A number of international organizations and researchers, such as the British Medical Journal (BMJ) and the International Society for Pharmacoeconomics and Outcomes Research (ISPOR), have developed generic and specific guidelines, checklists and recommendations for conducting and reporting acceptable economic analyses [17, 18]. Therefore, this study aimed to overview, summarize and analyze the characteristics of pharmacoeconomic SRs and the quality assessment tools they used for the included individual studies. This should be useful for future researchers wanting to conduct an SR of evidence from economic evaluations.

Materials and methods

Search strategy

The computer-based retrieval was conducted in the Cochrane Library, PubMed, EMbase (Ovid), the NHS Economic Evaluation Database (NHSEED) (Ovid), the HTA database, the Chinese National Knowledge Infrastructure (CNKI), WanFang Data, VIP Chinese Science & Technology Periodicals (VIP) and the Chinese Biomedical Literature Database (CBM) on October 25, 2019. Reference lists of relevant published studies and grey literature were also searched. A combination of MeSH terms and free-text words was used for retrieval, with adjustments made for each specific database. English search terms were "cost-benefit analysis", "cost-utility analysis", "cost-effectiveness analysis", "cost analysis", "economic evaluation", "review literature" and "systematic review". Chinese search terms were "economics", "cost-effectiveness", "cost-benefit", "cost-utility" and "systematic review". The specific search strategies are provided in S1 Text.
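As an illustration of how such a strategy combines a subject concept with a study-design concept, the following is a minimal, hypothetical sketch of a PubMed-style Boolean query built from the English terms listed above; the authoritative, per-database strategies are those given in S1 Text.

```python
# Hypothetical sketch: assembling a PubMed-style Boolean query from MeSH terms
# and free-text words. The real strategies used in this overview are in S1 Text.

def or_block(terms, field):
    """Join search terms into an OR group tagged with a PubMed field qualifier."""
    return "(" + " OR ".join(f'"{t}"[{field}]' for t in terms) + ")"

economic_terms = ["cost-benefit analysis", "cost-utility analysis",
                  "cost-effectiveness analysis", "economic evaluation"]
review_terms = ["systematic review", "review literature"]

# Economic-evaluation concept (MeSH heading plus free-text words) AND review concept.
economics_concept = ("(" + or_block(["Cost-Benefit Analysis"], "MeSH Terms")
                     + " OR " + or_block(economic_terms, "Title/Abstract") + ")")
review_concept = or_block(review_terms, "Title/Abstract")
print(economics_concept + " AND " + review_concept)
```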

Selection criteria

We used the PICOS framework to define explicit inclusion and exclusion criteria.

Inclusion criteria were: (1) Patients: individuals with an identified disease category; (2) Interventions/Comparators: drug interventions for disease prevention, treatment, diagnosis, etc.; and (3) Outcomes/Setting: SRs of economic evaluations, or HTA reports that included SRs of economic evaluations.

Exclusion criteria were: (1) literature published in languages other than Chinese and English; (2) duplicate publications; (3) SRs of economic evaluations concerned with disease burden; (4) literature without access to the full text, such as conference abstracts; (5) quality evaluations or general reviews that were not SRs; and (6) SRs comparing drug with non-drug interventions.

Study selection and data extraction

Two researchers independently screened the literature and extracted data according to the pre-determined inclusion and exclusion criteria, and the results were cross-checked. Disagreements were resolved through discussion or by consulting a third reviewer. Data extraction covered general information (author, year of publication, country of publication, journal of publication, quality appraisal tool and result synthesis method) and the essential characteristics of the quality appraisal criteria (name, number of items, categories, etc.).
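The extraction fields listed above can be thought of as a simple structured record per included review; the sketch below is purely illustrative (the class and example values are hypothetical, not the authors' actual extraction form).

```python
# Illustrative sketch of a data-extraction record holding the fields named above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ExtractionRecord:
    first_author: str
    year: int
    country: str
    journal: str
    quality_appraisal_tools: List[str] = field(default_factory=list)
    synthesis_method: str = "narrative"

# Two reviewers would each complete such a record independently and then cross-check.
record = ExtractionRecord("Loveman", 2012, "United Kingdom",
                          "Health Technology Assessment", ["Drummond (modified)"])
print(record)
```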

Study quality assessment

We used A Measurement Tool to Assess Systematic Reviews (AMSTAR), an instrument for assessing the quality of SRs of health care interventions, to evaluate the quality of the included SRs [19]. The modified AMSTAR scale has the following ten items: (1) Was an "a priori" design provided? (2) Was there duplicate study selection and data extraction? (3) Was a comprehensive literature search performed? (4) Was the status of publication (i.e., grey literature) used as an inclusion criterion? (5) Was a list of studies (included and excluded) provided? (6) Were the characteristics of the included studies provided? (7) Was the scientific quality of the included studies assessed and documented? (8) Was the scientific quality of the included studies used appropriately in formulating conclusions? (9) Were the methods used to combine the findings of studies appropriate? (10) Was the conflict of interest included? Each question was answered with "yes" (score 1), "no" (score 0) or "impossible to judge" (score 0.5). The total score was classified as low (<5), medium (5–8) or high (>8). In this overview, item 7 of the AMSTAR scale refers to the use of checklists or guidelines to assess the quality of the included economic studies.
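A minimal sketch of the scoring rule described above ("yes" = 1, "impossible to judge" = 0.5, "no" = 0, summed over the ten items and banded into low, medium and high quality); the function name and example answers are illustrative only.

```python
# Scoring a review on the modified AMSTAR scale as described in the text.
SCORES = {"yes": 1.0, "no": 0.0, "impossible to judge": 0.5}

def amstar_score(answers):
    """Return (total score, quality band) for a list of ten item answers."""
    if len(answers) != 10:
        raise ValueError("The modified AMSTAR scale has ten items.")
    total = sum(SCORES[a.lower()] for a in answers)
    if total < 5:
        band = "low"
    elif total <= 8:
        band = "medium"
    else:
        band = "high"
    return total, band

# Example: a review answering "yes" to eight items and "no" to items 1 and 9.
print(amstar_score(["no", "yes", "yes", "yes", "yes",
                    "yes", "yes", "yes", "no", "yes"]))  # (8.0, 'medium')
```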

Data analysis

Microsoft Excel 2013 was used to enter the data and to analyze the essential characteristics of the included literature, including the type of journal, the disease category, and the quality evaluation method used in the included research. A qualitative description was made of the quality assessment tools used for the included individual studies in each pharmacoeconomic SR. We report numbers and percentages for nominal data, and medians and ranges for count data.
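The descriptive summaries described above (counts and percentages for nominal data, median and range for count data) could equally be produced with standard library tools; the sketch below is illustrative, the tool list is a hypothetical sample, and the item counts are those later reported in Table 2.

```python
# Illustrative descriptive summaries: counts/percentages and median/range.
from collections import Counter
from statistics import median

tools_used = ["Drummond", "BMJ", "CHEERS", "Drummond", "None"]  # hypothetical sample
counts = Counter(tools_used)
for tool, n in counts.most_common():
    print(f"{tool}: {n} ({100 * n / len(tools_used):.1f}%)")

items_per_checklist = [10, 35, 24, 16, 19, 36, 60, 43, 57, 12, 11]  # item counts from Table 2
print("median items:", median(items_per_checklist),
      "range:", min(items_per_checklist), "-", max(items_per_checklist))
```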

Results

Study search and selection

A total of 11,172 articles in English and 1,004 articles in Chinese were retrieved from the databases, and 17 additional articles were obtained through Google Scholar. After excluding duplicate publications, 9,938 articles remained. Of the titles and abstracts screened, 2,524 were ordered as full papers and assessed in detail. After independent screening by two researchers, 148 articles in English and 17 articles in Chinese were finally included in the overview. The process and results of the literature screening are shown in Fig 1. The references of the 165 included articles are listed in S2 Text.

Fig 1. PRISMA flow chart of literature search.


This flow diagram illustrates the search results, the process of screening and selecting studies for inclusion, and the reasons for exclusion.

Characteristics of the included studies

The number of pharmacoeconomic SR publications increased gradually over the past 20 years (Fig 2). More than two-thirds of the 165 studies were conducted in the United Kingdom, China, Canada, the Netherlands and the United States. Approximately half of the articles were published in specialist journals such as Health Technology Assessment and Pharmacoeconomics, and 3.0% were published as dissertations. Neoplasms were the most frequently studied disease category (20.6%), followed by certain infectious and parasitic diseases (18.8%); in total, 19 disease categories (classified by the International Classification of Diseases, 10th revision) were involved. The most commonly searched databases included general databases such as MEDLINE, Embase and the Cochrane Library, as well as specialized databases of economic evaluations such as NHSEED and the HTA database. 144 studies (87.3%) explicitly stated that the quality of the included economic studies was assessed with a checklist or guideline recommendations, 14 of which combined different tools. Data synthesis in all articles was qualitative and descriptive. The characteristics of the 165 articles are shown in Table 1.

Fig 2. The number of published pharmacoeconomic systematic reviews, 2000–2018.


This plot shows the number of pharmacoeconomic SRs published in each year; the number above each bar is the number of studies.

Table 1. Characteristics of the included 165 systematic reviews.

Category Characteristic Article (n = 165) Percentage (%)
Country United Kingdom 56 33.9
China 21 12.7
Canada 15 9.1
Netherlands 12 7.3
United States 11 6.7
Others 50 30.3
Journal Health Technology Assessment 52 31.5
Pharmacoeconomics 24 14.5
Expert Review of Pharmacoeconomics & Outcomes Research 5 3.0
Vaccine 5 3.0
Human Vaccines & Immunotherapeutics 4 2.4
PLOS ONE 4 2.4
China Pharmacy 3 1.8
Others 68 41.2
Disease Neoplasms 34 20.6
Certain infectious and parasitic diseases 31 18.8
Disease of the circulatory system 15 9.1
Disease of the respiratory system 13 7.9
Diseases of the musculoskeletal system and connective tissue 12 7.3
Endocrine, nutritional and metabolic diseases 12 7.3
Others 48 29.1
Intervention Anti-tumor medication 31 18.8
Vaccines 21 12.7
Anti-infective medication 19 11.5
Endocrine system medication 17 10.3
Immunosuppressive medication 17 10.3
Blood and hematopoietic system medication 12 7.3
Central nervous system medication 12 7.3
Others 36 21.8
Search Database MEDLINE 120 72.7
EMbase 103 62.4
The Cochrane Library 92 55.8
NHS economic evaluation database 79 47.9
Health Technology Assessment database 62 37.6
Quality assessment tool Customized 32 19.4
Drummond 32 19.4
BMJ 26 15.8
CHEERS 21 12.7
None 21 12.7
QHES 10 6.1
CHEC 7 4.2
CRD 5 3.0
Philips 3 1.8
WHO 3 1.8
CASP 1 0.6
PQAQ 1 0.6
JBI 1 0.6
Spanish guideline 1 0.6
Chinese guideline 1 0.6
Data synthesis Narrative synthesis 165 100

BMJ, British Medical Journal; CHEERS, Consolidated Health Economic Evaluation Reporting Standards; QHES, Quality of Health Economic Studies; CRD, Center for Reviews and Dissemination; CHEC, Consensus on Health Economic Criteria; CASP, Critical Appraisal Skills Program; WHO, World Health Organization; PQAQ, Pediatric Quality Appraisal Questionnaire; JBI, Joanna Briggs Institute.

Study quality assessment

According to the modified AMSTAR scale, the overall methodological quality was moderate: scores ranged from 1 to 10, with an average of 6.3. Of the 165 included reviews, 36 were of low quality, 89 of medium quality and 40 of high quality. Three of the 10 quality items (comprehensive literature search, characteristics of the included studies, quality assessment) were fulfilled in at least 85% of reviews, with 88.5% (item 3), 94.5% (item 6) and 90.9% (item 7), respectively. In contrast, two items (a priori design and appropriate synthesis of results) were fulfilled in only about 30% of reviews or fewer, with 30.9% (item 1) and 20.6% (item 9), respectively. The detailed methodological quality assessment of the pharmacoeconomic SRs is shown in Fig 3 and S1 Table.

Fig 3. Methodological quality of the included pharmacoeconomic systematic reviews as evaluated with the modified AMSTAR scale.


This plot shows, for each item, the proportion of reviews with each judgment ("yes" (score 1), "no" (score 0) and "impossible to judge" (score 0.5)). The overall methodological quality was moderate. Three of the 10 quality items (items 3, 6 and 7) were fulfilled in at least 85% of reviews, while two items (items 1 and 9) were fulfilled in only about 30% of reviews or fewer.

Quality assessment tool used for the included studies

Table 2 details the characteristics of the checklists and guidelines. Eleven published checklists and two quality appraisal tools recommended by guidelines were used in the 165 included pharmacoeconomic SRs. The tools used most frequently were the Drummond checklist (19.4%), the BMJ checklist (15.8%) and the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement (12.7%). In addition, 32 studies modified already published quality evaluation tools; the Drummond checklist, the BMJ checklist, the Consensus on Health Economic Criteria (CHEC) list, the CHEERS statement and the Philips checklist were the ones most frequently adapted. 21 articles did not evaluate the quality of the included studies at all. Among the checklists, the number of items ranged from 7 (for model-based economic evaluations in the Drummond checklist) to 57 (for trial- and model-based economic evaluations in the Pediatric Quality Appraisal Questionnaire, PQAQ). In contrast to the generic scales, the Quality of Health Economic Studies (QHES) instrument and the Philips and Joanna Briggs Institute (JBI) checklists were created to appraise model-based economic evaluations, and the CHEC list to appraise trial-based studies. Three articles used a disease-specific checklist developed by the World Health Organization (WHO) to appraise economic evaluations of immunization programmes such as vaccines. While most instruments contain subjective and open-ended items, the QHES and PQAQ provide a numerical score that allows simple comparison among studies.

Table 2. Summary of the existing checklists used in the pharmacoeconomic systematic reviews.

Author, Year Tool Country Journal Setter Items Economic Evaluation Type Response Scored Percentage (%)
Drummond 2005 [20] Drummond checklist UK International Journal of Technology Assessment in Health Care - 10 (trial), 7 (model) Trial- and model-based Yes, no, can't tell No 19.4
Drummond 1996 [17] BMJ checklist UK BMJ BMJ Economic Evaluation Working Party 35 Trial- and model-based Yes, no, not clear, NA No 15.8
Husereau 2013 [18] CHEERS Canada BMJ ISPOR 24 Trial- and model-based Yes, no No 12.7
Chiou 2003 [21] QHES USA Medical Care - 16 Model-based Yes, no Yes 6.1
Evers 2005 [6] CHEC Netherlands International Journal of Technology Assessment in Health Care - 19 Trial-based Yes, no No 4.2
CRD 2009 [9] CRD UK Online CRD, University of York 36 Trial- and model-based Yes, no No 3.0
Philips 2004 [22] Philips UK Pharmacoeconomics - 60 Model-based Yes, no, unclear, NA No 1.8
Walker 2010 [23] WHO checklist Switzerland Vaccine Department of Immunization, Vaccines and Biologicals 43 Vaccine-specific Yes, no, partially, not clear, NA No 1.8
Ungar 2003 [24] PQAQ Canada Value in Health - 57 Trial- and model-based, pediatric-specific Yes, no, NA Yes 0.6
López 2010 [25] Spanish guideline Spain European Journal of Health Economics - - Trial- and model-based Yes, no, in part, NA No 0.6
Liu 2011 [26] Chinese guideline China China Journal of Pharmaceutical Economics Pharmacoeconomic Evaluations Group - Trial- and model-based Yes, no No 0.6
CASP 2013 [27] CASP UK Online Oxford Centre for Triple Value Healthcare 12 Trial- and model-based Yes, no, can't tell No 0.6
Gomersall 2015 [12] JBI Australia International Journal of Evidence-Based Healthcare Joanna Briggs Institute 11 Model-based Yes, no, unclear, NA No 0.6

BMJ, British Medical Journal; CHEERS, Consolidated Health Economic Evaluation Reporting Standards; QHES, Quality of Health Economic Studies; CRD, Center for Reviews and Dissemination; CHEC, Consensus on Health Economic Criteria; CASP, Critical Appraisal Skills Program; JBI, Joanna Briggs Institute; WHO, World Health Organization; PQAQ, Pediatric Quality Appraisal Questionnaire; ISPOR, International Society for Pharmacoeconomics and Outcomes Research. "-", not available; "NA", not applicable.

Discussion

Our findings indicate that the number of pharmacoeconomic SRs published grew steadily from 2000 to 2018. Unlike Jefferson et al. [15] and Luhnen et al. [16], we evaluated the quality of the most recent 165 SRs and searched a more comprehensive set of databases. Our results suggest that the two methodological flaws described by Jefferson (unsatisfactory search strategies and quality assessment of included studies) have improved but still need further refinement. In our research, the majority of the 165 articles (78.2%) had medium or high methodological quality according to the modified AMSTAR scale. Most of the SRs were reported following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) recommendations, but two aspects are worth noting.

First, most reviews lacked an a priori study design, with only 51 (30.9%) providing registration numbers. It is recommended that a protocol be written in advance for an SR of economic evaluations. Second, it is difficult to judge whether the method of pooling the research results was appropriate. Identified economic evaluations are usually too heterogeneous to allow a meta-analysis of their results [12]. This is in line with our findings that all reviews synthesized their results narratively and that 94.5% of the SRs provided the characteristics of the included studies. It may be useful to include summary tables that present key information relating to costs and consequences, including but not limited to population, country, perspective, interventions compared, the measure of effectiveness and the incremental cost-effectiveness ratio (ICER) [9]. The focus of a pharmacoeconomic SR should therefore not be to generate aggregate estimates of cost-effectiveness ratios but rather to explain why results differ from setting to setting. This is also a way to improve the generalizability and transferability of the data.
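Such summary tables typically report the ICER as the difference in costs divided by the difference in effects between two alternatives; a minimal sketch with hypothetical numbers is shown below.

```python
# Minimal sketch of the incremental cost-effectiveness ratio (ICER); all values
# below are hypothetical and for illustration only.

def icer(cost_new, effect_new, cost_old, effect_old):
    """ICER = (C_new - C_old) / (E_new - E_old), e.g. cost per QALY gained."""
    return (cost_new - cost_old) / (effect_new - effect_old)

print(icer(cost_new=12_000, effect_new=5.2, cost_old=9_000, effect_old=5.0))
# 15000.0 -> an extra 15,000 per additional QALY gained (hypothetical figures).
```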

Several checklists and guidelines exist to guide researchers in conducting and reporting acceptable economic analyses. Walker et al. [28] reported ten original checklists for assessing an economic evaluation published from 1992 to 2011, and Zhang et al. [29] reported twelve original checklists with good reliability and validity from 1987 to 2013. In this overview, we identified 13 critical appraisal tools developed by institutions, as well as several customized scales based on these pharmacoeconomic evaluation tools. The Drummond checklist, the BMJ checklist and the CHEERS statement were the three most widely used quality assessment tools in pharmacoeconomic SRs. The Drummond checklist comes from the monograph "Methods for the Economic Evaluation of Health Care Programmes", first proposed by Professor Drummond in 1987. Through four successive updates, it has comprehensively and systematically introduced the principles, concepts, methods and applications of economic evaluation; it has wide global influence and is an essential guideline for economic evaluation in the healthcare field [30]. The BMJ Economic Evaluation Working Party published the BMJ checklist in 1996 to improve the quality of submitted and published economic articles, and it has played an important role in economic research [17]. In 2013, through a two-round Delphi process, ISPOR developed the 24-item CHEERS checklist [18]. Customized evaluation scales were used in 32 reviews. For example, a pharmacoeconomic SR published in The Lancet in 2002, which evaluated the cost-effectiveness of various interventions for HIV patients, customized its literature quality appraisal method with eight evaluation items [31]. In our previous study evaluating therapeutic drugs for immune thrombocytopenia, the evaluation method was likewise developed from the Drummond checklist and the CHEC list, with 13 customized evaluation items [32]. Another paper, a pharmacoeconomic SR of chronic myelogenous leukemia published in Health Technology Assessment, referred to the Drummond checklist and defined 18 items to evaluate the quality of the original economic studies [33].

Although the use of checklists/guidelines does not guarantee that the results of an economic SR are valid, it can ensure that the review has the appropriate components [28]. In our view, the Drummond checklist, the BMJ checklist and the CHEERS statement can be adopted to assess the methodological quality of full and partial economic evaluations, with the Philips checklist used especially for modeling studies. It should be noted that these quality assessment tools are designed for qualitative evaluation and contain subjective and open-ended items. Where quantitative assessment is needed, the QHES and PQAQ scales are recommended. In addition to generic tools, there are population-specific scales (such as the PQAQ) and disease-specific scales (such as the WHO vaccine checklist). Finally, different scales address different types of economic assessment, such as trial-based and model-based evaluations. Therefore, a single review may need to combine multiple scales.

There were several limitations to our approach. Firstly, our overview did not include literature published in languages other than Chinese and English, which may introduce publication bias. Secondly, the literature search was last updated on October 25, 2019; the number of pharmacoeconomic SRs has certainly increased since then, which could change the results, although we believe this will not have a significant impact on our findings. Thirdly, we did not conduct a statistical analysis comparing the quality of English versus Chinese reviews, or of reviews with and without checklists/guidelines, because AMSTAR is not designed to generate an overall "score": a high total score may disguise critical weaknesses in specific domains, such as an inadequate literature search or a failure to assess the risk of bias of the individual studies included in a systematic review. We also did not statistically compare the quality of different checklists/guidelines, owing to the lack of standardization in the quality assessment of economic evaluations; each of these tools has its own characteristics and applicable conditions.

Conclusion

Our results identified a number of well-developed quality assessment criteria available to researchers to ensure informative and transparent SRs of economic evaluations. The Drummond checklist, the BMJ checklist and the CHEERS criteria were the three most widely used tools. Evidence showed that the methodology of pharmacoeconomic SRs still needs improvement according to the AMSTAR scale. Considering the different types of included studies and populations, multiple scales may need to be combined in future pharmacoeconomic SRs.

Supporting information

S1 Text. Search strategies for English and Chinese databases.

(DOCX)

S2 Text. The references of the 165 included studies.

(DOCX)

S1 Table. AMSTAR assessment of the 165 included studies.

(DOCX)

S1 Checklist. PRISMA checklist.

(DOC)

Data Availability

All relevant data are within the manuscript and its Supporting Information files.

Funding Statement

This review was funded by the Health Commission of Sichuan Province to CM and FH (No. 20PJ069). The funder had no role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript.

References

1. Kanchanachitra C, Lindelow M, Johnston T, Hanvoravongchai P, Lorenzo FM, Huong NL, et al. Human resources for health in southeast Asia: shortages, distributional challenges, and international trade in health services. Lancet. 2011; 377(9767):769–781. doi: 10.1016/S0140-6736(10)62035-1
2. Sun J, Luo H. Evaluation on equality and efficiency of health resources allocation and health services utilization in China. Int J Equity Health. 2017; 16(1):1–8. doi: 10.1186/s12939-016-0499-1
3. Eddama O, Coast J. A systematic review of the use of economic evaluation in local decision-making. Health Policy. 2008; 86(2–3):129–141. doi: 10.1016/j.healthpol.2007.11.010
4. Yagudina RI, Kulikov AU, Serpik VG, Ugrekhelidze DT. Concept of Combining Cost-Effectiveness Analysis and Budget Impact Analysis in Health Care Decision-Making. Value Health Reg Issues. 2017; 13:61–66. doi: 10.1016/j.vhri.2017.07.006
5. Shemilt I, Mugford M, Byford S, Drummond M, Eisenstein E, Knapp M, et al. Chapter 15: Incorporating economics evidence. In: Cochrane handbook for systematic reviews of interventions. 2008:449.
6. Evers S, Goossens M, De Vet H, Van Tulder M, Ament A. Criteria list for assessment of methodological quality of economic evaluations: Consensus on Health Economic Criteria. Int J Technol Assess Health Care. 2005; 21(2):240–245. doi: 10.1017/S0266462305050324
7. Pignone M, Saha S, Hoerger T, Lohr KN, Teutsch S, Mandelblatt J. Challenges in systematic reviews of economic analyses. Ann Intern Med. 2005; 142(12):1073–1079. doi: 10.7326/0003-4819-142-12_part_2-200506211-00007
8. Anderson R. Systematic reviews of economic evaluations: utility or futility? Health Econ. 2010; 19(3):350–364. doi: 10.1002/hec.1486
9. Centre for Reviews and Dissemination. CRD's guidance for undertaking reviews in healthcare: systematic reviews. 3rd edition. York: University of York; 2009.
10. Mazumder D, Kapoor A, Gwatkin N, Medeiros C. Benchmarking Health Technology Assessment (HTA) Agencies for Setting Standards on Pharmacoeconomic, Pricing, Evidence, and General Submission Requirements: Development of a Multidimensional Rating Scale. Value Health. 2015; 18(7):A854. doi: 10.1016/j.jval.2015.09.453
11. Phelps CE, Lakdawalla DN, Basu A, Drummond MF, Towse A, Danzon PM. Approaches to Aggregation and Decision Making—A Health Economics Approach: An ISPOR Special Task Force Report [5]. Value Health. 2018; 21(2):146–154. doi: 10.1016/j.jval.2017.12.010
12. Gomersall JS, Jadotte YT, Xue Y, Lockwood S, Riddle D, Preda A. Conducting systematic reviews of economic evaluations. Int J Evid Based Healthc. 2015; 13(3):170–178. doi: 10.1097/XEB.0000000000000063
13. Schuller Y, Hollak CE, Biegstraaten M. The quality of economic evaluations of ultra-orphan drugs in Europe—a systematic review. Orphanet J Rare Dis. 2015; 10:1–12. doi: 10.1186/s13023-014-0216-3
14. Drummond MF. Economic evaluation of pharmaceuticals: science or marketing? Pharmacoeconomics. 1992; 1(1):8–13. doi: 10.2165/00019053-199201010-00004
15. Jefferson T, Demicheli V, Vale L. Quality of systematic reviews of economic evaluations in health care. JAMA. 2002; 287(21):2809–2812. doi: 10.1001/jama.287.21.2809
16. Luhnen M, Prediger B, Neugebauer EAM, Mathes T. Systematic Reviews of Economic Evaluations in Health Technology Assessment: A Review of Characteristics and Applied Methods. Int J Technol Assess Health Care. 2018; 34(6):1–10. doi: 10.1017/S026646231800363X
17. Drummond MF, Jefferson T. Guidelines for authors and peer reviewers of economic submissions to the BMJ. BMJ. 1996; 313(7052):275–283. doi: 10.1136/bmj.313.7052.275
18. Husereau D, Drummond M, Petrou S, Carswell C, Moher D, Greenberg D, et al. Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement. Int J Technol Assess Health Care. 2013; 29(2):117–122. doi: 10.1017/S0266462313000160
19. Shea BJ, Grimshaw JM, Wells GA, Boers M, Andersson N, Hamel C, et al. Development of AMSTAR: a measurement tool to assess the methodological quality of systematic reviews. BMC Med Res Methodol. 2007; 7:1–7. doi: 10.1186/1471-2288-7-1
20. Drummond MF, Sculpher MJ, Claxton K, Stoddart GL, Torrance GW. Methods for the economic evaluation of health care programmes. Oxford: Oxford University Press; 2015.
21. Chiou C-F, Hay JW, Wallace JF, Bloom BS, Neumann PJ, Sullivan SD, et al. Development and validation of a grading system for the quality of cost-effectiveness studies. Med Care. 2003:32–44. doi: 10.1097/00005650-200301000-00007
22. Philips Z, Ginnelly L, Sculpher M, Claxton K, Golder S, Riemsma R, et al. Review of guidelines for good practice in decision-analytic modelling in health technology assessment. Health Technol Assess. 2004:1–188.
23. Walker DG, Hutubessy R, Beutels P. WHO Guide for standardisation of economic evaluations of immunization programmes. Vaccine. 2010; 28(11):2356–2359. doi: 10.1016/j.vaccine.2009.06.035
24. Ungar WJ, Santos MT. The Pediatric Quality Appraisal Questionnaire: an instrument for evaluation of the pediatric health economics literature. Value Health. 2003; 6(5):584–594. doi: 10.1046/j.1524-4733.2003.65253.x
25. López-Bastida J, Oliva J, Antonanzas F, García-Altés A, Gisbert R, Mar J, et al. Spanish recommendations on economic evaluation of health technologies. Eur J Health Econ. 2010; 11(5):513–520. doi: 10.1007/s10198-010-0244-4
26. Liu GE, Hu S, Wu JH. China Guidelines for Pharmacoeconomic Evaluations. China Journal of Pharmaceutical Economics. 2011; 3:6–48.
27. CASP UK. Critical Appraisal Skills Programme (CASP) Checklists. Available from: http://publichealthwell.ie/node/901903 [Accessed: March 2, 2020].
28. Walker DG, Wilson RF, Sharma R, Bridges J, Niessen L, Bass EB, et al. Best practices for conducting economic evaluations in health care: a systematic review of quality assessment tools. Methods Research Report (prepared by the Johns Hopkins University Evidence-based Practice Center under contract No. 290-2007-10061-I). AHRQ Publication No. 12(13)-EHC132-EF. Rockville, MD: Agency for Healthcare Research and Quality; October 2012.
29. Zhang SY, Ma A, Li HC, Guan X. Instruments designed for quality assessment of pharmaceutical economic evaluations: an overview. Chinese Journal of Evidence-based Medicine. 2019; 19(7):844–850.
30. Doran CM. Critique of an economic evaluation using the Drummond checklist. Appl Health Econ Health Policy. 2010; 8:357–359. doi: 10.2165/11584400-000000000-00000
31. Creese A, Floyd K, Alban A, Guinness L. Cost-effectiveness of HIV/AIDS interventions in Africa: a systematic review of the evidence. Lancet. 2002; 359(9318):1635–1643. doi: 10.1016/S0140-6736(02)08595-1
32. Chen M, Zhang LL, Hu M, Gao J, Tong RS. Cost-effectiveness of treatment for acute childhood idiopathic thrombocytopenic purpura (ITP)—a systematic review. J Int Med Res. 2008; 36(3):572–578. doi: 10.1177/147323000803600324
33. Loveman E, Cooper K, Bryant J, Colquitt JL, Frampton GK, Clegg A. Dasatinib, high-dose imatinib and nilotinib for the treatment of imatinib-resistant chronic myeloid leukaemia: a systematic review and economic evaluation. Health Technol Assess. 2012; 16(23):1–137. doi: 10.3310/hta16230

Decision Letter 0

Kevin Lu

13 Oct 2020

PONE-D-20-23142

An Overview of the Characteristics and Quality Assessment Criteria in Systematic Review of Pharmacoeconomics

PLOS ONE

Dear Dr. Chen,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Nov 27 2020 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,

Kevin Lu, PhD

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: No

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Comment #1: Please provide the complete search terms and search results for each database in appendix.

Comment #2: Please list specifically how many records were identified out in each database in the flowchart in Figure 1.

Comment #3: The review searched literatures in both English and Chinese, but there is no description of how many of the literatures included were in Chinese and how many were in English. Please add relevant information.

Comment #4, page 8: Please add references for the following statement.

1. “However, with an increasing amount of economic evaluations, healthcare professionals, consumers and policy-makers can often be overwhelmed by the presence of different results.”

2. “Opponents held that they were worthless because of differences between the populations, settings and background of the included studies.”

Comment #5, page 9: The literature search was up to October 25, 2019, almost a year before. Could the authors consider updating the literature search?

Comment #6, page 19: The PubMed search strategy provided in the manuscript seems a bit weird. Why was the term “cost-benefit analysis” searched in full text alone in #1 and again in the title and abstract in #2?

Comment #7, page 19: Please add information on the interventions of included studies in Table 1.

Reviewer #2: Major comments:

1 Introduction. Need to literature gap.

2 Methods. Need to distinguish between 2 types of tools: AMSTAR for systematic review and checklists/guidelines for included studies

3 Recommend to conduct statistical analysis comparing the quality of 1) English vs. Chinese, 2) with and without checklists/guidelines, 3) different checklists/guidelines, and other as you see fit.

4 Limitation. Need to develop.

5 How do you link AMSTAR for systematic review and checklists/guidelines for included studies? Do you recommend using checklists/guidelines to improve the quality of systematic review?

Minor comments:

1 Table 2. Add the classification of each tool. E.g. score range for high, medium, and low quality

2 Table 2. What does the column “Answer” mean?

3 Figure 1: What are other sources?

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2021 Feb 8;16(2):e0246080. doi: 10.1371/journal.pone.0246080.r002

Author response to Decision Letter 0


26 Nov 2020

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

Response: Many thanks for these useful and valid suggestions!

We have revised the format.

2. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information.

Response: Many thanks for these useful and valid suggestions!

We have added the supporting information.

Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Comment #1: Please provide the complete search terms and search results for each database in appendix.

Response: Many thanks for these useful and valid suggestions!

We have listed the retrieval strategies for all English and Chinese databases in the supporting information. We have modified the sentences in the text “ The specific search strategies were shown in the S1 Text”.

Comment #2: Please list specifically how many records were identified out in each database in the flowchart in Figure 1.

Response: Many thanks for these useful and valid suggestions!

We have listed the search records for each database in the flowchart in Figure 1.

Comment #3: The review searched literatures in both English and Chinese, but there is no description of how many of the literatures included were in Chinese and how many were in English. Please add relevant information.

Response: Many thanks for these useful and valid suggestions!

We have revised the sentences in the “ Results-Study search and selection” part as: “ A total of 11, 172 articles in English, 1, 004 articles in Chinese from databases and 17 articles obtained through google scholar were obtained in the initial literature retrieval.” and “ 148 articles in English and 17 articles in Chinese were finally included for the overview”.

Comment #4, page 8: Please add references for the following statement.

1. “ However, with an increasing amount of economic evaluations, healthcare professionals, consumers and policy-makers can often be overwhelmed by the presence of different results.”

2. “ Opponents held that they were worthless because of differences between the populations, settings and background of the included studies.”

Response: Many thanks for these useful and valid suggestions!

We have added references:

1. Shemilt I, Mugford M, Byford S, Drummond M, Eisenstein E, Knapp M, et al. 15 Incorporating economics evidence. Cochrane handbook for systematic reviews of interventions. 2008:449.

2. Evers S, Goossens M, De Vet H, Van Tulder M, Ament A. Criteria list for assessment of methodological quality of economic evaluations: Consensus on Health Economic Criteria. International journal of technology assessment in health care. 2005; 21(2):240-245.

3. Anderson R. Systematic reviews of economic evaluations: utility or futility? Health Econ. 2010 Mar;19(3):350-64.

Comment #5, page 9: The literature search was up to October 25, 2019, almost a year before. Could the authors consider updating the literature search?

Response: Many thanks for these useful and valid suggestions!

We also considered retrieving the databases and updating to the latest date. After a rough search we found that the number of new system reviews did increase as of 2019 and 2020, but the evaluation tools used were similar. So we didn't update them. We have considered this problem a limitation in the “ discussion” part: “ Second, the literature search was updated on October 25, 2019. The number of pharmacoeconomic systematic review has indeed increased, possibly leading to changes in results. We believe these problems will not have a significant impact on our findings.”.

Comment #6, page 19: The PubMed search strategy provided in the manuscript seems a bit weird. Why was the term “ cost-benefit analysis” searched in full text alone in #1 and again in the title and abstract in #2?

Response: Many thanks for these useful and valid suggestions!

It should be the mesh term “ cost-benefit analysis” in #1 and again in the title and abstract in #2. We have revised the search strategy. The research strategies for all English and Chinese databases were listed in the supporting information.

Comment #7, page 19: Please add information on the interventions of included studies in Table 1.

Response: Many thanks for these useful and valid suggestions!

We have added information on the interventions of included studies in Table 1.

Reviewer #2: Major comments:

1 Introduction. Need to literature gap.

Response: Many thanks for these useful and valid suggestions!

We have revised the sentences and added some literature such as two existed systematic reviews (Jefferson 2002 and Luhnen 2018). For more modifications, please see the Introduction section.

2 Methods. Need to distinguish between 2 types of tools: AMSTAR for systematic review and checklists/guidelines for included studies.

Response: Many thanks for these useful and valid suggestions!

AMSTAR is used for assessing the quality of systematic reviews and checklists/guidelines for included individual studies. We have added the ten items in the “ Methods- Study Quality Assessment” section as “ There are the following ten items: (1) Was an 'a priori' design provided? (2) Was there a duplicate study selection and data extraction? (3)Was a comprehensive literature search performed? (4) Was the status of publication (i.e. grey literature) used as an inclusion criterion? (5) Was a list of studies (included and excluded) provided? (6) Were the characteristics of the included studies provided?(7) Was the scientific quality of the included studies assessed and documented?(8) Was the scientific quality of the included studies used appropriately in formulating conclusions? (9) Were the methods used to combine the findings of studies appropriate? (10) Was the conflict of interest included?”

We have added “ In this overview, item 7 of the AMSTAR scale refers to using checklists or guidelines to assess the quality of included economic studies.”

3 Recommend to conduct statistical analysis comparing the quality of 1) English vs. Chinese, 2) with and without checklists/guidelines, 3) different checklists/guidelines, and other as you see fit.

Response: Many thanks for these useful and valid suggestions!

It’s challenging to conduct a statistical analysis comparing these issues. We have added these issues in the “ discussion-limitation” section: “ Thirdly, we did not conduct a statistical analysis comparing the quality of English versus Chinese, with or without checklists/guidelines. Because AMSTAR is not designed to generate an overall “ score”. A high score on total items may disguise critical weaknesses in specific domains, such as an inadequate literature search or failure to assess the risk of bias with individual studies included in a systematic review. We also did not conduct a statistical analysis comparing the quality of different checklists/guidelines, owing to the lack of standardization for the quality assessment of economic evaluation. Each of these quality assessment tools has its characteristics and applicable conditions”.

4 Limitation. Need to develop.

Response: Many thanks for these useful and valid suggestions!

We have revised the limitations: “ There are several limitations present in our approach. Firstly, our overview did not include literature published other than Chinese and English, which may lead to publication bias. Secondly, the literature search was updated on October 25, 2019. The number of pharmacoeconomic systematic review has indeed increased, possibly leading to changes in results. We believe these problems will not have a significant impact on our findings. Thirdly, we did not conduct a statistical analysis comparing the quality of English versus Chinese, with or without checklists/guidelines. Because AMSTAR is not designed to generate an overall “ score”. A high score on total items may disguise critical weaknesses in specific domains, such as an inadequate literature search or failure to assess the risk of bias with individual studies included in a systematic review. We also did not conduct a statistical analysis comparing the quality of different checklists/guidelines, owing to the lack of standardization for the quality assessment of economic evaluation. Each of these quality assessment tools has its characteristics and applicable conditions.”

5 How do you link AMSTAR for systematic review and checklists/guidelines for included studies? Do you recommend using checklists/guidelines to improve the quality of systematic review?

Response: Many thanks for these useful and valid suggestions!

AMSTAR is used for assessing the quality of systematic reviews and checklists/guidelines for included individual studies. We have added “ In this overview, item 7 of the AMSTAR scale refers to using checklists or guidelines to assess the quality of included economic studies.” in the Method section.

As one of the essential items in the AMSTAR scale, we recommend using checklists/guidelines to improve the quality of systematic review. We have added the following sentences in the Discussion section: “ In our view, the Drummond checklist, the BMJ checklist, and the CHEERS scale could be adopted as a quality assessment tool to assess the methodological quality of full and partial economic evaluations, and Phillips checklist especially for modeling evaluations. It should be noted that the above quality assessment tools are qualitative evaluation, which contain personal and open-ended items. In the case of quantitative assessment, QHES and PQAQ scales are recommended. In addition to generic ones, there are population-specific scales (such as PQAQ) and disease-specific scales (such as the WHO vaccine checklist). Finally, different scales may address different types of economic assessments, such as trial-based and model-based. Therefore, different studies may combine multiple different scales.”.

Minor comments:

1 Table 2. Add the classification of each tool. E.g. score range for high, medium, and low quality

Response: Many thanks for these useful and valid suggestions!

We detailed the characteristics of the checklists or guidelines in Table 2. It is hard to classify a score range of high, medium and low quality, because many checklists/guidelines do not provide a quantitative scoring criteria and rules. Usually those checklists/guidelines only provide some items to check whether a study contains specific contents. In the evaluation process, we suggest to customize the criteria of quality classification.

2 Table 2. What does the column “ Answer” mean?

Response: Many thanks for these useful and valid suggestions!

It means some of the options for checklists/guidelines evaluation. We have revised the “ Answer” to “ Response”.

3 Figure 1: What are other sources?

Response: Many thanks for these useful and valid suggestions!

“ Other sources” refer to the search results from “ Google scholar” in our overview. We have revised the sentences in the “ Study search and selection” section: “ A total of 11, 172 articles in English, 1, 004 articles in Chinese from databases and 17 articles obtained through google scholar were obtained in the initial literature retrieval.”.

Attachment

Submitted filename: Response to Reviewers.docx

Decision Letter 1

Kevin Lu

29 Dec 2020

PONE-D-20-23142R1

An overview of the characteristics and quality assessment criteria in systematic review of pharmacoeconomics

PLOS ONE

Dear Dr. Chen,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Feb 12 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,

Kevin Lu, PhD

Academic Editor

PLOS ONE


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Thanks to the authors for addressing the previous review comments well. However, there are some minor revisions that need to be made in the manuscript.

Comment 1: There are several grammatical errors or typos in the article, and I have listed some of them below. It would be great if the authors could proofread the article carefully and correct the errors.

1) Abstract background section: “A systematic review of economic evaluations plays a critical…..” to “The systematic reviews of economic evaluations plays a critical…..”

2) Abstract results section: “Others include the Quality of Health Economic Studies (QHES)” to “Others included the Quality of Health Economic Studies (QHES)”

3) Discussion: “…… these pharmcoeconomic evaluation tools.” to “…... these pharmacoeconomic evaluation tools.”

Comment 2: Please spell out the name of the database described in the methods section of the abstract.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2021 Feb 8;16(2):e0246080. doi: 10.1371/journal.pone.0246080.r004

Author response to Decision Letter 1


9 Jan 2021

Reviewer #1: Thanks to the authors for addressing the previous review comments well. However, there are some minor revisions that need to be made in the manuscript.

Comment 1: There are several grammatical errors or typos in the article, and I have listed some of them below. It would be great if the authors could proofread the article carefully and correct the errors.

1) Abstract background section: “A systematic review of economic evaluations plays a critical…..” to “The systematic reviews of economic evaluations plays a critical…..”

2) Abstract results section: “Others include the Quality of Health Economic Studies (QHES)” to “Others included the Quality of Health Economic Studies (QHES)”

3) Discussion: “…… these pharmcoeconomic evaluation tools.” to “…... these pharmacoeconomic evaluation tools.”

Response: Many thanks for these useful and valid suggestions!

We have corrected the errors in the article. Please see the revision of the manuscript.

Comment 2: Please spell out the name of the database described in the methods section of the abstract.

Response: Many thanks for these useful and valid suggestions!

We have detailed the name of the database described in the methods section of the abstract as following:

“Nine English and Chinese databases including the Cochrane Library, PubMed, EMbase (Ovid), NHS economic evaluation database (NHSEED) (Ovid), Health Technology Assessment (HTA) database, Chinese National Knowledge Infrastructure (CNKI), WangFang, VIP Chinese Science & Technology Periodicals (VIP) and Chinese Biomedical Literature Database (CBM) were searched.”

Attachment

Submitted filename: Response to Reviewers.docx

Decision Letter 2

Kevin Lu

13 Jan 2021

An overview of the characteristics and quality assessment criteria in systematic review of pharmacoeconomics

PONE-D-20-23142R2

Dear Dr. Chen,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Kevin Lu, PhD

Academic Editor

PLOS ONE

Acceptance letter

Kevin Lu

28 Jan 2021

PONE-D-20-23142R2

An overview of the characteristics and quality assessment criteria in systematic review of pharmacoeconomics

Dear Dr. Min:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Professor Kevin Lu

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Text. Search strategies for English and Chinese databases.

    (DOCX)

    S2 Text. The references of included 165 studies.

    (DOCX)

    S1 Table. AMSTAR assessment scale for included 165 studies.

    (DOCX)

    S1 Checklist. PRISMA checklist.

    (DOC)

    Attachment

    Submitted filename: Response to Reviewers.docx

    Attachment

    Submitted filename: Response to Reviewers.docx

    Data Availability Statement

    All relevant data are within the manuscript and its Supporting Information files.

