Rigorous reviews that adhere to methodological standards can advance biomedical and health informatics knowledge by synthesizing research and assessing its quality, identifying knowledge gaps, and making recommendations for research, practice, or policy. Thus, reviews are an important manuscript type for Journal of the American Medical Informatics Association (JAMIA) and complement other types of research papers. Reviews can be characterized based on 4 methodological aspects: search strategy (formal or informal), appraisal of quality (present or absent), synthesis (narrative or quantitative), and analysis (eg, quantity, quality, themes, knowledge gaps, limitations, recommendations).1 In its 25 years, JAMIA has published more than 150 reviews; the frequency of reviews has increased in recent years with growing awareness of the role of high-quality reviews as a foundation for future research on a topic. This has included scoping reviews that provide a preliminary assessment of the potential size and scope of the available research literature, but do not include a formal appraisal of study quality.2 Most recent JAMIA reviews are identified as systematic reviews that include a formal search strategy, appraisal of study quality, and a narrative3 or quantitative (ie, meta-analysis)4 synthesis of findings. JAMIA has also published critical reviews that synthesize the literature conceptually and offer key recommendations for the field.5
This issue of JAMIA has 5 systematic reviews, including 1 from an American Medical Informatics Association Working Group.6 In this editorial, I highlight how these systematic reviews demonstrate their relevance to the JAMIA audience and meet best-practice standards for systematic reviews as delineated in the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) statement7 and other sources.
RELEVANCE TO JAMIA READERS
As a starting place, the purpose of the review must be relevant to JAMIA readers. Reviews that do not address a central topic in biomedical and health informatics are not a fit for JAMIA. The purpose may be relatively narrow or broad in scope. For example, Koleck et al8 focused on the use of natural language processing to process or analyze information related to a selected set of symptoms in electronic health record free-text narratives, while other systematic reviews examined broader topics such as clinical pathways,9 speech recognition technology for clinician documentation,10 and evaluation approaches for visual analytic technologies in health.6
FORMAL SEARCH STRATEGY
Systematic reviews must be based on a formal search strategy. The search strategy should be included in the manuscript or as an online supplement. Use of an informationist who is an expert in search strategies increases the likelihood that the search strategy is appropriate. An inadequate search strategy is a common reason for rejecting a review submitted to JAMIA. Moreover, it is important that multiple databases are searched and that the databases are a match for the review topic. For example, Neame et al9 searched 3 databases for their review on the effects of implementing health information technology (HIT)–supported clinical pathways; Martin et al's11 systematic review on the impact of mobile technology on teamwork and communication in hospitals was based on a search of 7 databases; and Blackley et al10 reported a search of 10 scientific and medical literature databases to find articles about clinician use of speech recognition for clinical documentation. The range of years searched should be based on a sound rationale for the review topic.
FLOW OF INFORMATION THROUGH THE PHASES OF THE SYSTEMATIC REVIEW
The flow of information through the different phases of a systematic review should be depicted with a PRISMA flow diagram that maps out the number of records identified, included, and excluded, and the reasons for exclusions.7 In addition to the flow diagram, the narrative should describe who (a minimum of 2 reviewers) was involved in the flow process, the level of agreement between them, and how discrepancies were resolved. The manuscript should also provide details about how the process was managed. The methods varied in our 5 examples, ranging from author-developed spreadsheets6 to specialized software such as Covidence from the Cochrane Collaboration (https://community.cochrane.org/help/tools-and-software/covidence).8
QUALITY ASSESSMENT
By definition, systematic reviews include a method of quality assessment. However, those methods vary depending on the focus of the review and whether there are published formal standards for the types of studies in the review. For their systematic review on the impact of mobile technology on teamwork and communication in hospitals, Martin et al11 applied multiple formal methods of quality assessment: (1) the National Institutes of Health Study Quality Assessment tool for each type of study12 and (2) a set of criteria specific to mobile health studies.13 In contrast, Blackley et al10 characterized the quality of studies by their reporting of metrics related to speech recognition for clinical documents, including number of speakers, number of documents, and a variety of accuracy measures. Similarly, Koleck et al8 reported on the number of documents and a set of metrics suitable for evaluating a natural language processing algorithm or pipeline, including sensitivity, specificity, precision, recall, F measure, kappa statistic, area under the receiver-operating characteristic curve, and C-statistic. In 1 instance,6 the actual purposes of the review were to discover what evaluation measures were used in the literature and to make recommendations related to quality assessments. I encourage authors to visit the EQUATOR (Enhancing the QUAlity and Transparency Of health Research) network website (https://www.equator-network.org/reporting-guidelines/) to retrieve reporting guidelines for a variety of study designs to assist with selection of quality assessment criteria.
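For readers less familiar with the evaluation metrics mentioned above, the following is a minimal illustrative sketch of how precision, recall, and the F measure are derived from a binary confusion matrix; the counts are hypothetical and are not drawn from any of the cited reviews.

```python
# Illustrative computation of common NLP evaluation metrics from
# binary confusion-matrix counts (hypothetical numbers, not from
# any study discussed in this editorial).

def precision_recall_f1(tp, fp, fn):
    """Return (precision, recall, F1) given true positive,
    false positive, and false negative counts."""
    precision = tp / (tp + fp)   # fraction of flagged items that are correct
    recall = tp / (tp + fn)      # fraction of true items found; equals sensitivity
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return precision, recall, f1

# Example: an algorithm flags 80 true symptom mentions, 20 spurious
# ones, and misses 20 real mentions.
p, r, f = precision_recall_f1(tp=80, fp=20, fn=20)
print(round(p, 2), round(r, 2), round(f, 2))  # 0.8 0.8 0.8
```

The other metrics in the list (kappa, area under the receiver-operating characteristic curve) additionally account for chance agreement and ranking quality, respectively, which is why reviews often report several measures side by side.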
SYNTHESIS
No systematic reviews in this issue reported a meta-analysis; 1 review9 had planned to do so, but the heterogeneity and quality of the studies in the review precluded such an analysis. Synthesis approaches were primarily tabular along with narrative text. However, 3 reviews contained graphical components, including a histogram of studies over time,10 a chord diagram showing the associations between symptoms and studies,8 and a heatmap displaying the relationships among PICOS (participants, interventions, comparisons, outcomes, and study design)7 components and setting and measurement classifications.6 Now that JAMIA is an online journal, there are no charges for color, so I encourage authors to include graphs when appropriate.
ANALYSIS
Analyses for systematic reviews typically focus on 4 areas: what is known, what remains unknown, uncertainty around findings, and recommendations for future research and practice.1 To address the known vs unknown, the systematic reviews in this issue used PICOS7 or a variation on it to analyze studies and identify knowledge gaps. Given the informatics focus of the reviews, interventions were typically classified by a set of characteristics suitable for the types of technologies studied. Examples include clinical pathways (eg, clinical decision support, clinical documentation, dashboards)9 and visualizations (eg, type of intervention [dashboard, mobile app, etc.], type of visualization [planar, volumetric, network, etc.], unit visualized, interactivity).6 Although all reviews included some type of quality assessment and a limitations section, 2 reviews9,11 explicitly addressed the uncertainty in a more formal manner based on the rigor of the study designs. All reviews identified areas for future research. In addition, Wu et al6 provided 4 specific recommendations for the conduct of evaluation studies for visual analytic technologies.
REGISTRATION OF REVIEW PROTOCOL
Only 2 systematic reviews9,11 in this issue, both from the United Kingdom, were registered in PROSPERO, an international database of prospectively registered systematic reviews in health and social care, welfare, public health, education, crime, justice, and international development, for which there is a health-related outcome.14 Similar to other prospective study registration sites, such as www.ClinicalTrials.gov, PROSPERO aims to help avoid duplication and reduce reporting bias by enabling comparison of the completed review with what was planned in the protocol. While not required for JAMIA, registration is encouraged for these reasons.
CONCLUSION
Bottom line, JAMIA is a great publication venue for systematic reviews focused on biomedical and health informatics topics. I hope that these examples offer guidance for JAMIA authors, motivate American Medical Informatics Association Working Groups to follow the lead of the Visualization Working Group to conduct and submit a systematic review, and assure JAMIA readers that the journal is a source of high-quality systematic reviews to inform future informatics research, practice, and policy.
REFERENCES
- 1. Grant MJ, Booth A. A typology of reviews: an analysis of 14 review types and associated methodologies. Health Info Libr J 2009; 26 (2): 91–108.
- 2. Davis S, Roudsari A, Raworth R, Courtney KL, MacKay L. Shared decision-making using personal health record technology: a scoping review at the crossroads. J Am Med Inform Assoc 2017; 24 (4): 857–66.
- 3. Tolley CL, Forde NE, Coffey KL, et al. Factors contributing to medication errors made when using computerized order entry in pediatrics: a systematic review. J Am Med Inform Assoc 2018; 25 (5): 575–84.
- 4. Heitkemper EM, Mamykina L, Travers J, Smaldone A. Do health information technology self-management interventions improve glycemic control in medically underserved adults with diabetes? A systematic review and meta-analysis. J Am Med Inform Assoc 2017; 24 (5): 1024–35.
- 5. Miller K, Mosby D, Capan M, et al. Interface, information, interaction: a narrative review of design and functional requirements for clinical decision support. J Am Med Inform Assoc 2018; 25 (5): 585–92.
- 6. Wu DTY, Chen AT, Manning JD, et al. Evaluating visual analytics for health informatics applications: a systematic review from the American Medical Informatics Association Visual Analytics Working Group Task Force on Evaluation. J Am Med Inform Assoc 2019; 26 (4): 314–23.
- 7. Moher D, Liberati A, Tetzlaff J, Altman DG; PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. J Clin Epidemiol 2009; 62 (10): 1006–12.
- 8. Koleck TA, Dreisbach C, Bourne PE, Bakken S. Natural language processing of symptoms documented in free-text narratives of electronic health records: a systematic review. J Am Med Inform Assoc 2019; 26 (4): 364–79.
- 9. Neame M, Chacko J, Surace AE, Sinha IP, Hawcutt DB. A systematic review of the effects of implementing clinical pathways supported by health information technologies. J Am Med Inform Assoc 2019; 26 (4): 356–63.
- 10. Blackley S, Huynh J, Wang L, Korach Z, Zhou L. Speech recognition for clinical documentation from 1990 to 2018: a systematic review. J Am Med Inform Assoc 2019; 26 (4): 324–38.
- 11. Martin G, Khajuria A, Arora S, King D, Ashrafian H, Darzi A. The impact of mobile technology on teamwork and communication in hospitals: a systematic review. J Am Med Inform Assoc 2019; 26 (4): 339–55.
- 12. National Institutes of Health. Study Quality Assessment Tools. https://www.nhlbi.nih.gov/health-topics/study-quality-assessment-tools. Accessed January 21, 2019.
- 13. Agarwal S, LeFevre AE, Lee J, et al. Guidelines for reporting of health interventions using mobile phones: mobile health (mHealth) evidence reporting and assessment (mERA) checklist. BMJ 2016; 352: i1174.
- 14. Centre for Reviews and Dissemination. PROSPERO—International Prospective Register of Systematic Reviews. https://www.crd.york.ac.uk/prospero/. Accessed January 21, 2019.
