5. InterTASC ‐ Lee 2012.
A. Information | |
A.1 State the author’s objective | "The objective is to describe the development and validation of the health‐evidence.ca systematic review search filter and to compare its performance with other available systematic review filters." |
A.2 State the focus of the research. | Balance of sensitivity and precision |
A.3 Database(s) and search interface(s). | MEDLINE, EMBASE, and CINAHL (Ovid) |
A.4 Describe the methodological focus of the filter (e.g. RCTs). | Systematic reviews |
A.5 Describe any other topic that forms an additional focus of the filter (e.g. clinical topics such as breast cancer, geographic location such as Asia or population grouping such as paediatrics). | Public health. Included reviews must focus on public health, provide outcome data on the effectiveness of interventions, and include a documented search strategy. |
A.6 Other observations. | No |
B. Identification of a gold standard (GS) of known relevant records | |
B.1 Did the authors identify one or more gold standards (GSs)? | 1 GS. “We considered this set (the electronic database searches plus additional search strategies), the ‘gold standard’ for health‐evidence.ca.” |
B.2 How did the authors identify the records in each GS? | “Our PH search filter typically yielded a very high volume of results with very low precision. For example, between January 2006 and December 2007, of the 136,427 titles screened, 409 were relevant for the health‐evidence.ca registry, or in other words, precision was 0.3%. In addition to using the PH search filter, more than 40 public health‐relevant journals were hand searched annually, as well as the reference lists of all relevant reviews. Given this systematic search of the published review literature, we were reasonably confident that our retrieval methods were capturing a near complete set of relevant articles.” |
B.3 Report the dates of the records in each GS. | January 2006 to December 2007 |
B.4 What are the inclusion criteria for each GS? | Systematic reviews that focus on public health, provide outcome data on the effectiveness of interventions, and include a documented search strategy. |
B.5 Describe the size of each GS and the authors’ justification, if provided (for example the size of the gold standard may have been determined by a power calculation) | Not reported |
B.6 Are there limitations to the gold standard(s)? | No |
B.7 How was each gold standard used? | To test internal validity |
B.8 Other observations. | No |
C. How did the researchers identify the search terms in their filter(s) (select all that apply)? | |
C.1 Adapted a published search strategy. | Yes. The health‐evidence.ca Systematic Review (SR) search filter developed in 2008 was adapted from a previously validated filter. |
C.2 Asked experts for suggestions of relevant terms. | No |
C.3 Used a database thesaurus. | Yes, in MEDLINE (Ovid) only; the strategy was then adapted to Embase and CINAHL. |
C.4 Statistical analysis of terms in a gold standard set of records (see B above). | No |
C.5 Extracted terms from the gold standard set of records (see B above). | No |
C.6 Extracted terms from some relevant records (but not a gold standard). | No |
C.7 Tick all types of search terms tested. | ‐ Text words (e.g. in title, abstract) ‐ Publication types |
C.8 Include the citation of any adapted strategies. | Default search strategies used for BMJ Clinical Evidence |
C.9 How were the (final) combination(s) of search terms selected? | Not reported |
C.10 Were the search terms combined (using Boolean logic) in a way that is likely to retrieve the studies of interest? | Yes. The health‐evidence.ca Systematic Review (SR) search filter developed in 2008 was adapted from a previously validated filter, which included the terms MEDLINE.tw, systematic review.tw, and meta‐analysis.pt, combined with the Boolean OR operator. |
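As an illustrative sketch only (the line-by-line layout follows standard Ovid convention and is not reproduced from the paper), the three terms above combine in the Ovid interface along these lines:

```
1  medline.tw.
2  systematic review.tw.
3  meta-analysis.pt.
4  1 or 2 or 3
```

Here `.tw.` restricts a term to the title and abstract (text word) fields and `.pt.` to the publication type field, with line 4 applying the Boolean OR described above.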
C.11 Other observations. | No |
D. Internal validity testing (This type of testing is possible when the search filter terms were developed from a known gold standard set of records). | |
D.1 How many filters were tested for internal validity? | 1 filter |
For each filter report the following information | |
D.2 Was the performance of the search filter tested on the gold standard from which it was derived? | No |
D.3 Report sensitivity data (a single value, a range, ‘Unclear’* or ‘not reported’, as appropriate). *Please describe. | ‐ MEDLINE: 86.8% (75.2 to 93.5) ‐ Embase: 72.7% (55.8 to 84.9) ‐ CINAHL: 86.1% (71.4 to 93.9) |
D.4 Report precision data (a single value, a range, ‘Unclear’* or ‘not reported’ as appropriate). *Please describe. | ‐ MEDLINE: 1.1% (0.9 to 1.2) ‐ Embase: 0.6% (0.4 to 0.7) ‐ CINAHL: 1.6% (1.4 to 1.8) |
D.5 Report specificity data (a single value, a range, ‘Unclear’* or ‘not reported’ as appropriate). *Please describe. | ‐ MEDLINE: 99.2% (99.2 to 99.2) ‐ Embase: 99.1% (99.1 to 99.1) ‐ CINAHL: 98.2% (98.2 to 98.2) |
D.6 Other performance measures reported. | Number needed to read ‐ MEDLINE: 91.6 (85.0 to 105.9) ‐ Embase: 171.6 (146.7 to 224.6) ‐ CINAHL: 61.3 (56.1 to 74.3) |
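The measures reported in D.3 to D.6 follow the standard definitions for search filter performance, with number needed to read (NNR) being the reciprocal of precision. A minimal sketch (the counts below are hypothetical, not taken from the paper):

```python
# Standard filter-performance measures as used in D.3-D.6.
# tp = relevant records retrieved, fp = irrelevant records retrieved,
# fn = relevant records missed, tn = irrelevant records excluded.

def performance(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)   # share of relevant records retrieved
    specificity = tn / (tn + fp)   # share of irrelevant records excluded
    precision = tp / (tp + fp)     # share of retrieved records that are relevant
    nnr = 1 / precision            # records read per relevant record found
    return sensitivity, specificity, precision, nnr

# Hypothetical example: 90 of 100 relevant records retrieved,
# alongside 8100 irrelevant records, from a 100,000-record database.
sens, spec, prec, nnr = performance(tp=90, fp=8100, fn=10, tn=91800)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} "
      f"precision={prec:.1%} NNR={nnr:.1f}")
```

Because NNR is the reciprocal of precision, the MEDLINE precision of roughly 1.1% reported in D.4 implies an NNR near 91, consistent with the 91.6 reported in D.6.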
D.7 Other observations. | No |
E. External validity testing (This section relates to testing the search filter on records that are different from the records used to identify the search terms) | |
E.1 How many filters were tested for external validity on records different from those used to identify the search terms? | 1 filter |
E.2 Describe the validation set(s) of records, including the interface. | Not reported |
E.3 On which validation set(s) was the filter tested? | Not reported |
E.4 Report sensitivity data for each validation set (a single value, a range or ‘Unclear’ or ‘not reported’, as appropriate). | ‐ MEDLINE: 89.9 (85.0, 93.3) ‐ Embase: 87.9 (80.3, 92.8) ‐ CINAHL: 89.9 (93.5, 94.0) |
E.5 Report precision data for each validation set (report a single value, a range or ‘Unclear’ or ‘not reported’, as appropriate). | ‐ MEDLINE: 1.4 (1.3, 1.5) ‐ Embase: 0.5 (0.5, 0.6) ‐ CINAHL: 1.8 (1.6, 1.8) |
E.6 Report specificity data for each validation set (a single value, a range or ‘Unclear’ or ‘not reported’, as appropriate). | ‐ MEDLINE: 98.9 (98.9, 98.9) ‐ Embase: 98.2 (98.2, 98.2) ‐ CINAHL: 97.6 (97.6, 97.6) |
E.7 Other performance measures reported. | No |
F. Limitations and comparisons. | |
F.1 Did the authors discuss any limitations to their research? | Yes: ‐ "Searching was conducted in OVID’s search interface for all three databases; other search interfaces for these databases (e.g. PubMed) may handle the searches somewhat differently" ‐ "Precision and NNR scores were calculated specifically for public health content and cannot be generalized to topic areas outside of public health" |
F.2 Are there other potential limitations to this research that you have noticed? | No |
F.3 Report any comparisons of the performance of the filter against other relevant published filters (sensitivity, precision, specificity or other measures). | Yes. Performance of the health‐evidence.ca SR search filter compared with the PH search filter in retrieving systematic reviews in MEDLINE. 1. health‐evidence.ca SR search filter: development ‐ sensitivity 86.8 (75.2, 93.5), specificity 99.2 (99.2, 99.2), precision 1.1 (0.9, 1.2), number needed to read 91.6 (85.0, 105.9); validation ‐ sensitivity 89.9 (85.0, 93.3), specificity 98.9 (98.9, 98.9), precision 1.4 (1.3, 1.5). 2. PH search filter: development ‐ sensitivity 86.8 (75.2, 93.5), specificity 86.8 (75.2, 93.5), precision 1.1 (0.9, 1.2), number needed to read 91.6 (85.0, 105.9); validation ‐ sensitivity 89.9 (85.0, 93.3), specificity 98.9 (98.9, 98.9), precision 1.4 (1.3, 1.5) |
F.4 Include the citations of any compared filters. | Yes: Montori 2005; Shojania 2001; Hunt 1997; Boynton 1998; BMJ Clinical Evidence; SIGN; Wilczynski 2007; Wong 2006; Wilczynski 1995 |
F.5 Other observations and / or comments. | No |
G. Other comments. This section can be used to provide any other comments. Selected prompts for issues to bear in mind are given below. | |
G.1 Have you noticed any errors in the document that might impact on the usability of the filter? | No |
G.2 Are there any published errata or comments (for example in the MEDLINE record)? | No |
G.3 Is there public access to pre‐publication history and / or correspondence? | No |
G.4 Are further data available on a linked site or from the authors? | Yes, additional files 1, 2 and 3 with the details on the search strategies and their performance. |
G.5 Include references to related papers and/or other relevant material. | No |
G.6 Other comments. | No |