Abstract
The number of scientific publications is growing at an unprecedented rate. Failure to properly evaluate existing literature at the start of a project may result in a researcher wasting time and resources. As pharmacy researchers and scholars look to conceptualize new studies, it is imperative to begin with a high-quality literature review that reveals what is known and unknown about a given topic. The purpose of this commentary is to provide useful guidance on conducting rigorous searches of the literature that inform the design and execution of research. Guidance for less formal literature reviews can be adapted from best practices utilized within the formalized field of evidence synthesis. Additionally, researchers can draw on guidance from PRESS (Peer Review of Electronic Search Strategies) to engage in self-evaluation of their search strategies. Finally, developing an awareness of common pitfalls when designing literature searches can provide researchers with confidence that their research is designed to fill clearly articulated gaps in knowledge.
Keywords: biomedical research, peer review, research, research design, pharmacy research
Background
The ability to search online is a necessary skill for everyone, generally either to learn more about a particular topic or to gather information to help make decisions. Whether looking ahead to the week’s weather, comparing reviews of local restaurants, or learning about how to perform a home repair, we search to increase our understanding. The platforms that we use to find information, and the strategies we use to search, impact the results we gather and the decisions we make based on them.
For researchers, searching scientific literature serves a similar purpose and is an essential step in the conceptualization and design of new research. Data from the National Science Foundation (NSF) estimated a global output of over 2.9 million publications in science and engineering in 2020, with health sciences representing the largest proportion (25%) of publications.1 In 2010, it was estimated that 75 trials and 11 systematic reviews were published daily,2 while in 2021, this estimate had grown to 80 systematic reviews per day.3 Effective use of this literature before engaging in new research can reduce waste and improve value.4 For example, one study found that only 1% of protocols used data from meta-analyses to plan sample sizes, which can lead to optimism bias (the difference between a person’s expectation and the outcome that follows5) and underestimation of the sample sizes needed for an adequately powered study.6,7 Many research studies fail to cite relevant existing literature, including randomized trials8 and systematic reviews,9,10 often because the authors were unaware such literature existed.11
With a rapidly growing knowledge base globally, researchers must develop skills that enable them to efficiently and effectively assess available knowledge on their topic of interest. Impactful research is based on an assessment of what is already known, using a well-conducted literature review to carve out a clear gap in what remains unknown. Compelling research manuscripts operationalize this within the background section, framing it within a problem/gap/hook heuristic.12 Pharmacy researchers and scholars may recognize the importance of a good literature review but still lack a consistent and confident approach to executing one. However, investing in this foundational skill can prevent a number of downstream challenges when designing and conducting a new research study. As such, the purpose of this commentary is to provide useful guidance on executing rigorous literature searches in preparation for research. This follows our previous publication with advice on how to compose good research questions.13
Deciding on the literature review approach
Every researcher should begin their literature search by defining the goal that they hope to accomplish. There are many different types of literature reviews used within health science evidence synthesis, each unique in its methodological approach to searching, appraisal, synthesis, and analysis (SALSA).14 Each of these is accompanied by different strengths and weaknesses in what it can accomplish. Some literature reviews are broad, with the studies chosen entirely at the discretion of the researchers themselves, and with a focus on informational summary over synthesis. Other reviews employ a tightly defined question, a systematic and comprehensive approach to searching, and a specific methodological approach to synthesizing the included evidence. Indeed, these types of evidence syntheses (e.g., systematic reviews, scoping reviews) are classified as methods of research in their own right due to their highly technical approaches focused on reproducibility and rigor. Most often, a pharmacy researcher will find themselves at the start of a new research project, looking to perform a literature review to confirm the novelty and scope of their research question (in line with the FINER criteria15). Although there is an argument that all research should be informed by formalized systematic reviews,4 many researchers choose a less formal, non-systematic approach due to time, resources, or skill sets.
That being said, there is an excellent opportunity to adapt best practices from the field of evidence synthesis when searching the literature in preparation for research beyond systematic reviews. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) provides guidance on how to report the results of systematic reviews and meta-analyses (SR/MA), with extensions available that provide specific guidance for other review types.16 Although not composed as a guideline on the conduct of reviews, the items identified within PRISMA highlight important components of reviews that should be transparently described, providing researchers with a helpful framework. Actual guidance and training on how to design and execute SR/MAs is available from a variety of sources including the Cochrane Collaboration,17 the Campbell Collaboration,18 and the Joanna Briggs Institute.19
Choosing the literature source(s)
When beginning a literature search, it is important to consider where you are going to source information. Although it is possible to search for scientific literature in the same manner you would search for everyday information (e.g., using search engines like Google), this often is not efficient, effective, or reproducible. In executing a literature search in preparation for other research, a scholar generally wants to identify what is already known in order to build upon it. This requires a more focused approach, using abstracting and indexing (A&I) databases: online platforms that catalogue abstracts of publications, coupled with indexing, descriptors, and access guides. There are many available A&I databases, each with a different focus, availability, and set of tools to assist in your search (Table 1). When designing a literature search, it is important to choose A&I databases that will have the information you are seeking. Each A&I database covers different portions of the biomedical literature depending on the journals it indexes. Additionally, some A&I databases are freely available, whereas others require subscriptions for use, meaning that researchers need to consult resources within their institution prior to selection to ensure access. Finally, for most literature searches, it is worth conducting searches within multiple A&I databases. This lessens the chance of missing relevant publications, although some overlap in results will occur. For each of these reasons, it is highly recommended to collaborate during this process with a librarian, who can advise on database coverage and selection.
Table 1.
Basic information for selected A&I databases
A&I database | Coverage | Indexing | Details |
---|---|---|---|
PubMed | Indexes biomedical journals from MEDLINE starting 1966, and data from printed indexes (including Index Medicus and Current List of Medical Literature) starting in 1946; also includes indexing for open access manuscripts deposited in PMC | 5,200+ current journals indexed in MEDLINE plus 3,500+ journals currently in PMC | Produced by the US NLM and searches three databases: MEDLINE, PMC, and PubMed Bookshelf; automatically maps to MeSH to help formulate searches; to search pre-1946 medical literature reliably, use printed versions of Index Medicus and other indexes |
Embase | Indexes biomedical journals starting 1947; indexes biomedical conferences starting 2005 | 8,400+ current journals, including 3,200+ unique to Embase; 15,000+ conferences | Produced by Elsevier Science and includes strong international coverage as well as articles from MEDLINE; suggests relevant Emtree terms to help formulate searches; detailed searching vocabulary for drugs and medical devices; available using multiple platforms including Ovid & Embase.com* |
Scopus | Indexes journals and books from 200+ disciplines including engineering, biomedical, natural sciences, social sciences, and humanities starting 1970; provides extensive citation searching | 28,000+ current journals; 327,000+ books; limited number of conferences | Produced by Elsevier Science and includes articles from MEDLINE, Embase, and more; includes CiteScore™ journal metrics, which are based on multiple-year average citation rates |
APA PsycInfo | Indexes psychology and related fields; journal literature starting 1887; limited book indexing starting 1597; cited reference searching starting 2001 | 2,300+ current journals; psychology journals indexed “cover to cover,” related journals selectively indexed for relevant content | Produced by the APA; available using multiple platforms including APA PsycNet, EBSCOhost, Ovid & ProQuest* |
CINAHL | Indexes nursing and several allied health professions; coverage starting 1976; provides limited citation searching | 3,800+ current journals covering nursing & health professions; indexes selected relevant books and dissertations | Produced by EBSCO; numerous versions are available with varying coverage and indexing |
Web of Science | Indexes scientific (including biomedical) and social sciences literature starting in 1900, arts/humanities literature starting in 1975, and book literature starting 2005; includes indexing from Index Chemicus starting in 1993, Current Chemical Reactions starting 1986, and INPI Archives from 1840–1985; provides extensive citation searching | 12,000+ current journals; 30,000+ books; 2,000+ conferences | Currently produced by Clarivate Analytics, originally created by the Institute for Scientific Information; highly selective coverage of journals and books; includes Journal Impact Factor™ journal metrics, which are based on multiple-year average citation rates; provides tools for chemical structure searching |
SciFindern | Indexes science and chemistry as well as patents from CAplus starting 1907, MEDLINE starting 1950, and Chemisches Zentralblatt from 1830–1869; includes indexing and data regarding chemical substances, reactions, regulated chemicals, and suppliers | 50,000+ journals; 64 national/international patent authorities; 204+ million substances; 150+ million chemical reactions | Produced by the CAS of the ACS and provides comprehensive coverage of scientific, engineering, and biomedical literature, including articles from MEDLINE; provides extensive tools for chemical structure searching; authoritative source for chemical names/structures, including CAS Registry Numbers used to identify chemical substances |
IPA | Indexes literature related to pharmaceuticals and the pharmaceutical industry starting 1970; provides abstracts for presentations at major pharmacy meetings | 800+ current journals | Published by ASHP and produced by Clarivate Analytics; available using multiple platforms including EBSCOhost, Ovid & ProQuest* |
*Because search functionality varies by platform, it is essential to report which platform was used when reporting methods for systematic reviews and other formal evidence syntheses.
A&I: abstract and indexing; ACS: American Chemical Society; APA: American Psychology Association; ASHP: American Society of Health-System Pharmacists; CAS: Chemical Abstract Service; CINAHL: Cumulative Index of Nursing and Allied Health Literature; IPA: International Pharmaceutical Abstracts; MeSH: medical subject headings; NLM: National Library of Medicine; PMC: PubMed Central
Additionally, researchers should consider searching grey literature, which is not routinely indexed in A&I databases. This form of literature can include governmental documents (e.g., Food and Drug Administration [FDA], National Institutes of Health [NIH]), reports from non-governmental organizations (e.g., World Health Organization [WHO]), technical reports from private companies, academic theses/dissertations, patents, and conference abstracts. Although there are specific databases that can be searched for grey literature (e.g., GreyNet, OpenGrey), this is an example of an area where a search engine like Google may have a role. Researchers should be mindful that grey literature may not have undergone peer review, meaning that critical appraisal of the source is essential. Depending on your area of research, however, grey literature can be crucial for your result set, given the existence of publication bias. One analysis of antidepressant trials submitted to the FDA for approval found that 31% were not published, and those that were published were more likely to have positive results.20 As a result, a failure to investigate literature outside of the usual journal publication pipeline can lead to conclusions that fail to represent the entire body of literature.
Finally, researchers should consider incorporating ancestry searching as a final source; this is when the reference lists of identified literature are examined for additional citations of interest. For instance, a researcher may identify a set of useful journal articles from a search in PubMed. The bibliographies of each of these articles may then lead to additional useful articles not captured by the original search. Although this process requires some time investment, it is an excellent strategy to capture relevant results.
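For researchers comfortable with a small amount of scripting, some reference links can also be retrieved programmatically. The sketch below is a minimal Python example using the NCBI E-utilities elink endpoint; the starting PMID is a hypothetical placeholder, and NLM provides reference links for only a subset of records, so manual review of bibliographies remains important.

```python
# Sketch: ancestry searching via the NCBI E-utilities "elink" endpoint.
# Assumptions: the starting PMID is a placeholder, and reference links
# ("pubmed_pubmed_refs") exist only for a subset of PubMed records, so
# results should be checked against the article's actual bibliography.
import requests

ELINK = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/elink.fcgi"

def referenced_pmids(pmid: str) -> list[str]:
    """Return PMIDs from the reference list of an article, where NLM supplies them."""
    params = {
        "dbfrom": "pubmed",
        "db": "pubmed",
        "id": pmid,
        "linkname": "pubmed_pubmed_refs",
        "retmode": "json",
    }
    data = requests.get(ELINK, params=params, timeout=30).json()
    links = []
    for linkset in data.get("linksets", []):
        for linksetdb in linkset.get("linksetdbs", []):
            links.extend(linksetdb.get("links", []))
    return links

# Example usage with a hypothetical exemplar PMID.
print(referenced_pmids("31345342")[:10])
```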
Basic building blocks of a search
Once sources are chosen for a literature search, the next key component is the choice of keywords, or the terms that researchers use to describe concepts in the literature. It can be tempting to enter a few random words into a search engine to see where they take you, but if your goal is a result set that comprehensively represents the existing literature, a more purposeful approach is required. Keywords can be generated from the research question itself, considering the different domains addressed within the PICO framework.21 From there, a researcher can take those concepts and snowball additional words/phrases, including synonyms (e.g., misuse, abuse, diversion, addiction), abbreviations (e.g., Drug Enforcement Administration or DEA), alternative English spellings (e.g., license vs licence, counselor vs counsellor), and variations in word endings (e.g., rehab vs rehabilitation, pharmacy vs pharmacist). For pharmacy research in particular, misspelling of drug names can affect search results and should be taken into consideration.22 Beyond the research question, researchers should also incorporate keywords from relevant papers already published, which can lend a sense of how existing research has been described and categorized. Mentors and colleagues can also be extremely valuable for keyword generation, particularly as researchers at different career stages can reflect on different terminology and how it has changed over time. The use of multiple A&I databases can also be helpful in keyword generation, as it can help the researcher brainstorm alternative ways of describing their concepts.
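As a simple illustration of how keyword variants can be organized, the Python sketch below groups synonyms by concept and joins them into OR groups. The terms themselves are illustrative placeholders, not a validated vocabulary, and the output is only a starting draft.

```python
# Sketch: organizing snowballed keyword variants by concept and joining them
# into OR groups. Terms below are illustrative examples only.
concepts = {
    "pharmacy": ["pharmacy", "pharmacist", "pharmacies", "apothecary", "chemist shop"],
    "syringes": ["syringe", "syringes", "needle exchange", "needle-exchange"],
}

def or_group(terms):
    """Quote multi-word phrases and combine the terms with OR."""
    quoted = [f'"{t}"' if " " in t else t for t in terms]
    return "(" + " OR ".join(quoted) + ")"

# Concepts are combined with AND so results must mention each concept.
draft_query = " AND ".join(or_group(terms) for terms in concepts.values())
print(draft_query)
# e.g. (pharmacy OR pharmacist OR ... OR "chemist shop") AND (syringe OR ...)
```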
A related concept is index terms, or keywords that are systematically incorporated into the controlled vocabulary of A&I databases. Databases have their own sets of index terms, often assigned by database staff after reviewing the article and consulting author-supplied keywords. Common examples include MeSH (Medical Subject Headings) and Emtree within PubMed and Embase, respectively. A&I databases may allow you to search their index term listings, which can supplement your keywords and help identify relevant articles. For instance, a search of the National Library of Medicine (NLM) MeSH Database identifies the index terms drug misuse; prescription drug misuse; prescription drug diversion; substance abuse, intravenous; substance abuse, oral; and substance-related disorders, each relevant to the previous example and each a slight variation of the researcher-generated keywords.
Once you have your keywords/index terms, there are various tools available in most A&I databases that allow you to either expand or limit your search. For example, Boolean operators can be used to either expand (OR) or limit (AND/NOT) your search. A search for heroin AND fentanyl will return a narrower result set than a search for heroin OR fentanyl, based on the chosen logic. Other helpful tools include truncation, phrase searching, and field tags. Truncation involves searching all terms that begin with a certain word stem; for instance, a search of pharm* within PubMed will return results for pharmacy, pharmacies, pharmaceutical, pharmacotherapy, and so on. Phrase searching involves searching a set of words in order, often denoted using single quotes (' '), double quotes (" "), braces ({ }), or parentheses (( )), a function which some A&I databases perform automatically. Field tags involve searching for the keyword only in certain fields; for instance, a search of harm reduction[tiab] in PubMed will only return results with this phrase in the title, abstract, and author-provided keyword fields. Other field tags in PubMed may be useful depending on the scope and type of literature of interest.23 Finally, most databases have built-in functions that allow you to filter your results, which can be useful if you want to limit to specific years of publication, for instance.
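These building blocks can be tested directly in a database interface or, for those who prefer scripting, against the NCBI E-utilities esearch endpoint. The Python sketch below is a minimal example; the query string is an illustrative draft combining Boolean operators, truncation, field tags, and a publication-date filter, not a recommended search.

```python
# Sketch: running a drafted PubMed query through the NCBI E-utilities
# "esearch" endpoint to see how many records it returns.
# The query string is an illustrative draft, not a recommended search.
import requests

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

query = (
    '(pharmac*[tiab] OR pharmacists[MeSH]) '
    'AND ("harm reduction"[tiab] OR syringe*[tiab]) '
    'AND (2015:2024[pdat])'
)

resp = requests.get(
    ESEARCH,
    params={"db": "pubmed", "term": query, "retmax": 20, "retmode": "json"},
    timeout=30,
)
result = resp.json()["esearchresult"]
print("Hits:", result["count"])          # total number of matching records
print("First PMIDs:", result["idlist"])  # up to retmax PMIDs for inspection
```

Checking the hit count and a sample of titles after each revision gives a quick sense of whether a change to the syntax broadened or narrowed the result set as intended.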
Table 2 provides a few examples of how these recommendations can come together to form a search, beginning with a well-formed research question. From there, keywords and index terms (specific to a database) can be combined with relevant tools to generate an initial search syntax, which can serve as a basis for further iteration.
Table 2.
Examples of draft search composition in PubMed using keywords, index terms, and other tools
Research question | Potential keywords and index terms | Potential search strategy | Notes |
---|---|---|---|
What are the attitudes, beliefs, knowledge, and practices of community pharmacy staff related to the sale of over-the-counter syringes? | Keywords: chemist; apothecary; needle. Index terms (MeSH): pharmacists; students, pharmacy; pharmacy technicians; pharmacy; pharmacies; syringes; needle-exchange programs | (pharmacists[MeSH] OR “students, pharmacy”[MeSH] OR “pharmacy technicians”[MeSH] OR pharmacy[MeSH] OR pharmacies[MeSH] OR pharmac*[tiab] OR apothecary[tiab] OR “chemist shop”[tiab] OR “chemist’s shop”[tiab] OR “pharmaceutical service*”[tiab]) AND (syringes[MeSH] OR syringe*[tiab] OR “needle-exchange programs”[MeSH] OR “needle exchange”[tiab] OR “needle-exchange”[tiab]) | Truncation has been used to search for variant word endings; Boolean operators have been utilized to ensure that the results mention both pharmacy-related terms AND syringe-related terms; the specific domains of interest (attitudes, beliefs, knowledge, and practices) will be parsed out during screening as opposed to the search |
What is the effectiveness of cephalosporins for the treatment of acute pyelonephritis? | Keywords: cefalexin; keflex; biocef; cystopyelitis. Index terms (MeSH): cephalosporins; cephalexin; cefazolin; pyelonephritis | (cephalosporins[MeSH] OR cephalosporin*[tiab] OR cefalosporin*[tiab] OR cephalosporine*[tiab] OR “delta3 cephalosporin”[tiab] OR cephalexin[MeSH] OR cephalexin[tiab] OR keflex[tiab] OR biocef[tiab]) AND (pyelonephritis[MeSH] OR pyelonephritis[all] OR cystopyelitis[tiab] OR “kidney pyelonephritis”[tiab] OR “nephritis, pyelo”[tiab] OR pyelitis[tiab] OR pyelocystitis[tiab] OR pyelonephritis[tiab]) | Spelling and brand/generic variations have been incorporated for drug names and classes; variations in terminology for pyelonephritis as a condition have been included |
What is the impact of menopausal hormone therapy on sleep quality? | Keywords: menopause; estrogen; sleep. Index terms (MeSH): menopause; estrogen replacement therapy; sleep | ((menopau*[tiab] OR menopause[MeSH] OR perimenopau*[tiab] OR postmenopau*[tiab]) AND (“estrogen replacement therapy”[MeSH] OR estradiol*[tiab] OR estrogen*[tiab] OR progestin[tiab]) AND (sleep[MeSH] OR sleep*[tiab] OR insomnia*[tiab])) AND (random*[tiab]) AND (2002:2024[pdat]) | Truncation has been used to search for variant word endings; Boolean operators have been utilized to ensure that the results mention both menopause-related terms AND sleep-related terms; a limit has been built into the search using a field tag related to publication date |
Blue text represents the use of index terms within PubMed; Green text represents the use of keywords; Red text represents the use of Boolean operators; Orange text represents the use of wildcards/truncations; Purple text represents the use of field tags ([all] = all fields; [pdat] = publication date; [tiab] = title and abstract)23
Search strategies provided here are not necessarily the final or “correct” versions, but initial drafts on which a researcher can iterate by reviewing results and working toward optimizing the needs of their project
Self-evaluating a literature search strategy
Developing a search and searching the literature should be an iterative process. Upon executing a draft search, a researcher should evaluate not only the quantity of results returned but also their quality. Did the search yield many results, or did it find too few? Were the results largely relevant or irrelevant to the goal of the search? Although there are no specific right/wrong answers to these questions, a researcher should be introspective and trial different search strategies to see how they impact the result set. One way to do this is through the use of exemplars, or articles already known to the researcher as highly relevant. Ideally, a good search strategy should return results that include the exemplars. If they are missing from the results, it is worth re-examining the approach to understand why.
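One way to operationalize the exemplar check is to compare a set of known-relevant records against the identifiers returned by a draft search. The Python sketch below does this against PubMed via the E-utilities esearch endpoint; the PMIDs and the query are hypothetical placeholders to be replaced with the researcher's own exemplars and strategy.

```python
# Sketch: checking whether known exemplar articles are captured by a draft
# PubMed search. The PMIDs and query below are hypothetical placeholders.
import requests

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pmids_for(query: str, retmax: int = 10000) -> set[str]:
    """Return the set of PMIDs matching a query (up to esearch's retmax limit)."""
    params = {"db": "pubmed", "term": query, "retmax": retmax, "retmode": "json"}
    data = requests.get(ESEARCH, params=params, timeout=30).json()
    return set(data["esearchresult"]["idlist"])

exemplars = {"12345678", "23456789"}  # placeholder PMIDs of known-relevant articles
results = pmids_for('(menopau*[tiab]) AND (sleep*[tiab])')

missing = exemplars - results
if missing:
    print("Exemplars missed by the draft search:", missing)
else:
    print("All exemplars captured; next, review the relevance of the wider result set.")
```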
To help identify ways to improve a search, especially when having difficulty with its scope or sensitivity, researchers can utilize the Peer Review of Electronic Search Strategies (PRESS) guidance.24 Originally developed to facilitate peer review of search strategies from evidence syntheses, it can also be a useful tool for self-evaluation. Guiding questions adapted from PRESS are presented in Table 3; these can be utilized as a sort of checklist for researchers to critically evaluate their approach and make any necessary adjustments.25
Table 3.
Self-evaluation of a search strategy adapted from PRESS25
Domain | Questions |
---|---|
Translation of the research question | Does the strategy match the research question? Are all elements of the question included? Are the search concepts clear? Are they too broad/narrow? |
Boolean/proximity operators | Are Boolean/proximity operators used correctly? If NOT is used, is this likely to result in any unintended exclusions? Could phrase searching/proximity operators be used instead of AND? If proximity operators are used, is the length appropriate? |
Index terms | Are index terms used appropriate for each database? Are any relevant index terms missing? Are the index terms appropriately scoped (not too broad/narrow)? |
Keywords | Are there any spelling errors in the search? If present, are the errors important spelling variants? Does the search include all synonyms? Are acronyms or abbreviations appropriately used? Are the keywords appropriate (e.g., not too broad/narrow)? |
Syntax | Were the appropriate fields searched? Is the search syntax appropriate for the database? |
Limits and filters | Are all limits and filters appropriate? |
PRESS: Peer Review of Electronic Search Strategies
©CADTH 2024. Modified and reprinted with permission from CADTH January 12, 2024. This table may not be further modified or reused without permission from CADTH.
Common pitfalls in developing literature search strategies
As you develop and enhance your approach to searching the literature, there are several common pitfalls to avoid. First, a frequent oversight in searching the literature is failing to engage the assistance of a librarian. Librarians are trained professionals with expertise in information retrieval and in navigating complex databases to find relevant literature. Even for experienced researchers, consultation with a librarian can help to build and refine a search strategy, identify potential errors with your approach, or introduce you to new and emerging data sources. In fact, engaging an information specialist such as a librarian as a key team member has been shown to improve the quality of searches.26
Additionally, documentation of literature searches is key for transparency. Formalized evidence syntheses require documentation to allow peers to assess and reproduce searches for reliability, enhancing and building knowledge in a particular field. However, even when composing informal searches, proper documentation is essential for the researcher internally. It allows for detection of errors in the literature searching process (perhaps with keywords or index terms) as well as the ability to iterate a search over time. It is critical to document a search not only once it is finalized, but also as it is being developed, since it can otherwise be nearly impossible for a researcher to recall all the conditions that have been trialed during the iteration process. Most research projects occur over timeframes where repeating the literature search would be beneficial to capture new work published since initiation of the research; having a clearly documented strategy will save time and effort as you return to update the literature.
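Documentation does not require specialized software; a simple spreadsheet or log file is often sufficient. The Python sketch below illustrates one possible approach, appending each search iteration to a CSV file; the file name, fields, and example values are illustrative choices rather than a required format.

```python
# Sketch: appending each search iteration to a simple CSV log so the date,
# database, query, hit count, and notes are preserved for later updates.
# File name, fields, and example values are illustrative, not prescriptive.
import csv
from datetime import date
from pathlib import Path

LOG_FILE = Path("search_log.csv")

def log_search(database: str, query: str, hits: int, notes: str = "") -> None:
    """Append one row per executed search; write a header if the file is new."""
    new_file = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["date", "database", "query", "hits", "notes"])
        writer.writerow([date.today().isoformat(), database, query, hits, notes])

# Example entry; the hit count would be copied from the database interface.
log_search(
    database="PubMed",
    query='(menopau*[tiab]) AND (sleep*[tiab]) AND (random*[tiab])',
    hits=412,
    notes="Draft 2: added truncation for menopause-related terms",
)
```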
Another common pitfall is failing to iterate a search strategy. Most researchers recognize that ideas and strategies improve as we discuss and add to them over time, and search strategies are no different. Through iteration, a researcher can refine their search in alignment with their research questions, identifying potential flaws and limitations. Many researchers are pressed for time and may rush through the literature search process in an effort to get their research started sooner, simply running with their first draft. As a result, they may miss important keywords/index terms, producing a literature search that misses important results. Some researchers have made it all the way to the stage of submitting their research for publication, only for a peer reviewer to point out a citation that negates the work or would have changed the approach in a fundamental way. Although it may be tempting to speed through the literature review phase, consider it as fundamental to the research process as data collection and analysis.
Finally, confirmation bias is encountered quite frequently in the literature searching process. Often in science, there is a tendency not to want to deviate from prior knowledge/practices/beliefs, resulting in a lack of pursuit of new ideas. Failure to take an objective approach to literature searching can result in “cherry-picking” of results that align with a researcher’s hypothesis. Therefore, it is important to compose searches using multiple databases, to seek input on your work from peers/colleagues, and even to actively seek out literature that presents opposing views or challenges previous assumptions. However, with this also comes the need to ensure that sources are reputable and trusted, being mindful of potential misinformation.
Conclusion
Much like research questions, literature searches are fundamental to high-quality and rigorous research. Investing time in a well-composed literature search will set up a project for success. All researchers engage in the literature searching process to varying degrees, from less formal searches to more formalized evidence syntheses. Understanding the landscape of A&I databases, the basic building blocks of searches, and processes you can use for self-evaluation can improve your approach.
ACKNOWLEDGEMENTS:
The authors acknowledge the helpful input over time of individuals associated with: (1) the Creating an Educational Nexus for Training in Experimental Rigor (CENTER) team at the University of Pennsylvania, (2) the three other first-year Materials to Enhance Training in Experimental Rigor (METER) teams at Harvard Medical School, Johns Hopkins University, and Smith College, and (3) the Duquesne University METER Advisory Committee.
FUNDING:
This work was supported by the National Institute of Neurological Disorders and Stroke (NINDS) of the National Institutes of Health (NIH) under the Award Number 1UE5NS128228. The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH.
ABBREVIATIONS:
- A&I
abstracting and indexing
- CENTER
Creating an Educational Nexus for Training in Experimental Rigor
- DEA
Drug Enforcement Administration
- FDA
Food and Drug Administration
- FINER
Feasible; Interesting; Novel; Ethical; and Relevant
- MeSH
Medical subject headings
- METER
Materials to Enhance Training in Experimental Rigor
- NIH
National Institutes of Health
- NINDS
National Institute of Neurological Disorders and Stroke
- NLM
National Library of Medicine
- NSF
National Science Foundation
- PICO
Population, Intervention, Comparison, Outcome
- PRESS
Peer Review of Electronic Search Strategies
- PRISMA
Preferred Reporting Items for Systematic Reviews and Meta-Analyses
- SALSA
searching, appraisal, synthesis, and analysis
- SR/MA
systematic reviews and meta-analyses
- WHO
World Health Organization
REFERENCES:
- 1. National Science Foundation. Publications output: US trends and international comparisons. Publication output by country, region, or economy and scientific field. October 2021. Accessed November 7, 2023. https://ncses.nsf.gov/pubs/nsb20214/publication-output-by-country-region-or-economy-and-scientific-field
- 2. Bastian H, Glasziou P, Chalmers I. Seventy-five trials and eleven systematic reviews a day: how will we ever keep up? PLoS Med. 2010;7(9):e1000326. doi:10.1371/journal.pmed.1000326
- 3. Hoffmann F, Allers K, Rombey T, et al. Nearly 80 systematic reviews were published each day: observational study on trends in epidemiology and reporting over the years 2000–2019. J Clin Epidemiol. 2021;138:1–11. doi:10.1016/j.jclinepi.2021.05.022
- 4. Chalmers I, Bracken MB, Djulbegovic B, et al. How to increase value and reduce waste when research priorities are set. Lancet. 2014;383(9912):156–65. doi:10.1016/S0140-6736(13)62229-1
- 5. Sharot T. The optimism bias. Curr Biol. 2011;21(23):R941–5. doi:10.1016/j.cub.2011.10.030
- 6. Clark T, Berger U, Mansmann U. Sample size determinations in original research protocols for randomised clinical trials submitted to UK research ethics committees: review. BMJ. 2013;346:f1135. doi:10.1136/bmj.f1135
- 7. Djulbegovic B, Kumar A, Magazin A, et al. Optimism bias leads to inconclusive results-an empirical study. J Clin Epidemiol. 2011;64(6):583–93. doi:10.1016/j.jclinepi.2010.09.007
- 8. Robinson KA, Goodman SN. A systematic examination of the citation of prior research in reports of randomized, controlled trials. Ann Intern Med. 2011;154(1):50–5. doi:10.7326/0003-4819-154-1-201101040-00007
- 9. Andreasen J, Norgaard B, Draborg E, et al. Justification of research using systematic reviews continues to be inconsistent in clinical health science-A systematic review and meta-analysis of meta-research studies. PLoS One. 2022;17(10):e0276955. doi:10.1371/journal.pone.0276955
- 10. Clarke M, Hopewell S. Many reports of randomised trials still don’t begin or end with a systematic review of the relevant evidence. J Bahrain Med Soc. 2013;24(3):145–148.
- 11. Cooper NJ, Jones DR, Sutton AJ. The use of systematic reviews when designing studies. Clin Trials. 2005;2(3):260–4. doi:10.1191/1740774505cn090oa
- 12. Lingard L. Joining a conversation: the problem/gap/hook heuristic. Perspect Med Educ. 2015;4(5):252–253. doi:10.1007/s40037-015-0211-y
- 13. Covvey JR, McClendon C, Gionfriddo MR. Back to the basics: guidance for formulating good research questions. Res Social Adm Pharm. 2023. doi:10.1016/j.sapharm.2023.09.009
- 14. Grant MJ, Booth A. A typology of reviews: an analysis of 14 review types and associated methodologies. Health Info Libr J. 2009;26(2):91–108. doi:10.1111/j.1471-1842.2009.00848.x
- 15. Farrugia P, Petrisor BA, Farrokhyar F, Bhandari M. Practical tips for surgical research: research questions, hypotheses and objectives. Can J Surg. 2010;53(4):278–81.
- 16. Page MJ, McKenzie JE, Bossuyt PM, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372:n71. doi:10.1136/bmj.n71
- 17. Cochrane Collaboration. Guides and handbooks. Accessed November 13, 2023. https://training.cochrane.org/handbooks
- 18. Campbell Collaboration. Training. Accessed November 13, 2023. https://www.campbellcollaboration.org/research-resources/training-courses.html
- 19. JBI. JBI manual for evidence synthesis. Accessed November 13, 2023. https://jbi-global-wiki.refined.site/space/MANUAL
- 20. Turner EH, Matthews AM, Linardatos E, Tell RA, Rosenthal R. Selective publication of antidepressant trials and its influence on apparent efficacy. N Engl J Med. 2008;358(3):252–60. doi:10.1056/NEJMsa065779
- 21. Schardt C, Adams MB, Owens T, Keitz S, Fontelo P. Utilization of the PICO framework to improve searching PubMed for clinical questions. BMC Med Inform Decis Mak. 2007;7:16. doi:10.1186/1472-6947-7-16
- 22. Ferner RE, Aronson JK. Nominal ISOMERs (Incorrect Spellings Of Medicines Eluding Researchers)-variants in the spellings of drug names in PubMed: a database review. BMJ. 2016;355:i4854. doi:10.1136/bmj.i4854
- 23. National Library of Medicine. PubMed user guide: using search field tags. Updated December 20, 2023. Accessed January 10, 2024. https://pubmed.ncbi.nlm.nih.gov/help/#using-search-field-tags
- 24. McGowan J, Sampson M, Salzwedel DM, Cogo E, Foerster V, Lefebvre C. PRESS Peer Review of Electronic Search Strategies: 2015 guideline statement. J Clin Epidemiol. 2016;75:40–6. doi:10.1016/j.jclinepi.2016.01.021
- 25. Table 9: PRESS 2015 Evidence-Based Checklist. Peer Review of Electronic Search Strategies: 2015 Guideline Explanation and Elaboration (PRESS E&E). Ottawa: CADTH; January 2016. https://www.cadth.ca/sites/default/files/attachments/2021-07/CP0015_PRESS_Update_Report_2016_0.pdf
- 26. Rethlefsen ML, Farrell AM, Osterhaus Trzasko LC, Brigham TJ. Librarian co-authors correlated with higher quality reported search strategies in general internal medicine systematic reviews. J Clin Epidemiol. 2015;68(6):617–26. doi:10.1016/j.jclinepi.2014.11.025