Table 7. Comparison of evaluation tools previously described in the literature and QUEST
Name of tool | Focus | Criteria | Format |
---|---|---|---|
QUality Evaluation Scoring Tool (QUEST) | Quality of online health information | Authorship, attribution, conflict of interest, complementarity, currency, tone | 6 questions rated on a scale of 0–2 or 0–1 and differentially weighted, yielding an overall quality score between 0 and 28 (see the scoring sketch following this table) |
DISCERN | Quality of written information about treatment choices | Reliability, balance, dates, source, quality of information on treatment choices, overall rating | 15 questions rated on a scale of 1–5 |
EQIP: Ensuring Quality Information for Patients | Quality of written patient information applicable to all information types | Clarity, patient-oriented design, currency, attribution, conflict of interest, completeness | 20 questions rated Yes/Partly/No with an equation to generate a % score |
Jones’ Self-Assessment Method | Self-assessment tool for patients to evaluate the quality and relevance of health care oriented websites | Content, design, communication, and credibility | 9 broad questions based on the 4 criteria, rated Yes/No/NA |
Health on the Net Foundation’s HONcode Patient Evaluation Tool | Patient evaluation tool for health-related websites | Authorship, attribution, currency, reliability, balance, mission/target audience, privacy, interactivity, overall reliability | 16-item interactive questionnaire returning a % score |
Silberg standards | Standards of quality for online medical information for consumers and professionals | Authorship, attribution, disclosure, currency | Set of core standards; no score is generated |
Sandvik’s General Quality Criteria | General quality measure for online health information | Ownership, authorship, source, currency, interactivity, navigability, balance | 7 questions rated on a scale of 0–2 |
Health Information Technology Institute (HITI) Information Quality Tool *No longer available | Quality measure for health-related websites | Credibility, content, disclosure, links, design, interactivity | Not available |
5 C’s website evaluation tool | Structured guide to systematically evaluating websites; specifically developed for nurses to use in patient care and education | Credibility, currency, content, construction, clarity | Series of 36 open-ended and yes/no questions grouped under the “5 C’s”; no score is generated |
Health Literacy INDEX | Tool to evaluate the health literacy demands of health information materials | Plain language, clear purpose, supporting graphics, user involvement, skill-based learning, audience appropriateness, instructions, development details, evaluation methods, strength of evidence | 63 indicators/criteria rated yes/no, yielding criterion-specific scores and an overall % score |
Bath and Bouchier’s evaluation tool | Tool to evaluate websites providing information on Alzheimer’s disease | General details, information for carers, currency, ease of use, general conclusions | 47 questions scored from 0 to 2, generating an overall % score |
Seidman quality evaluation tool | Quality of diabetes consumer-information websites | Explanation of methods, validity of methods, currency, comprehensiveness, accuracy | 7 structural measures and 34 performance measures, generating composite scores by section and an overall score |
Appraisal of Guidelines, REsearch and Evaluation (AGREE) Collaboration instrument | Quality of clinical practice guidelines | Scope and purpose, stakeholder involvement, rigour of development, clarity and presentation, applicability, editorial independence | 23 items grouped into six quality domains, each rated on a 4-point Likert scale |
Communication AssessmenT Checklist in Health (CATCH) tool | Quality of printed educational materials for clinicians | Appearance, layout and typography, clarity of content, language and readability, graphics, risk communication, scientific value, emotional appeal, relevance, social value/source credibility, social value/usefulness for the clinician, social value/usefulness for the health care system (hospital or government) | 55 items nested in 12 concepts, each rated yes/no, generating concept-specific and overall scores |
LIDA Minervation tool | Evaluates the design and content of healthcare websites | Accessibility, usability (clarity, consistency, functionality, engagability), reliability (currency, conflict of interest, content production) | 41 questions scored on a scale of 0–3, yielding a total % score |
Mitretek Information Quality Tool (IQT) *No longer available | Evaluates the information quality of online health information | Authorship, sponsorship, currency, accuracy, confidentiality, navigability | 21 questions rated yes/no and weighted according to importance, generating a total score between 0 and 4 |
“Date, Author, References, Type, and Sponsor” (DARTS) | Assists patients in appraising the quality of online medicines information | Currency, authorship, credibility, purpose, conflict of interest | A series of six guiding questions; no score is generated |
Quality Index for health-related Media Reports (QIMR) | Monitors the quality of health research reports in the lay media | Background, sources, results, context, validity | 17 items rated on a 0–6 Likert scale, plus an 18th global rating |
Index of Scientific Quality (ISQ) | Index of scientific quality for health reports in the lay press | Applicability, opinions vs. facts, validity, magnitude, precision, consistency, consequences | 7 items rated on a 1–5 Likert scale, plus an 8th global rating |
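Several of the tools above reduce item-level ratings to a single number in one of two ways: a differentially weighted total (as in QUEST or the Mitretek IQT) or a percentage of the maximum attainable score (as in EQIP, LIDA, and the Health Literacy INDEX). The sketch below illustrates only this arithmetic; the item names, ratings, and weights are placeholders and are not the published QUEST weighting scheme or EQIP equation.

```python
# Minimal sketch of the two aggregation patterns used by the tools above:
# a differentially weighted total and a percentage-of-maximum score.
# All item names, ratings, and weights are illustrative placeholders.

def weighted_total(ratings: dict[str, int], weights: dict[str, int]) -> int:
    """Sum of rating * weight over all items (a QUEST-style weighted total)."""
    return sum(ratings[item] * weights[item] for item in ratings)

def percent_of_maximum(ratings: dict[str, int], max_per_item: int) -> float:
    """Raw total expressed as a % of the highest possible total
    (the general form of an EQIP- or LIDA-style percentage score)."""
    return 100.0 * sum(ratings.values()) / (max_per_item * len(ratings))

# Hypothetical ratings for six QUEST-like criteria (0-2 or 0-1 scales)
ratings = {"authorship": 2, "attribution": 1, "conflict_of_interest": 1,
           "complementarity": 1, "currency": 2, "tone": 2}
# Placeholder weights (not the published QUEST weighting scheme)
weights = {"authorship": 1, "attribution": 3, "conflict_of_interest": 2,
           "complementarity": 1, "currency": 1, "tone": 2}

print(weighted_total(ratings, weights))                       # 14 with these placeholders
print(round(percent_of_maximum(ratings, max_per_item=2), 1))  # 75.0 with these placeholders
```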