Table 1.
Summary of included articles containing a performance-based measure of eHealth literacy (organized alphabetically by category).
| Authors, year | Terminology for measured construct | Participants and context | Instrument design | Health topics | Evidence of validity | Instrument availability |
| --- | --- | --- | --- | --- | --- | --- |
| **Answering health-related questions using the internet** | | | | | | |
| Agree et al [33], 2015 | Web-based health literacy | 323 participants aged 35 to 90 years | Six health-related questions were answered by performing web-based searches on a computer, limited to 15 minutes per task; answers were coded by 2 researchers for response accuracy (0 or 1) and specificity (0, 1, or 2) to give a score ranging from 0 to 18 (see the scoring sketch after the table) | Diet and nutrition guidelines, skin cancer, alternative medicine, vaccines, assistive health technology, and over-the-counter genetic testing | Construct validity was demonstrated in that having a college degree and daily internet use were positively associated with more successful health information searches, and the oldest age group had lower success scores compared with younger participants; criterion validity was demonstrated in that higher health literacy was positively associated with success on some search tasks | Partially |
| Blakemore et al [34], 2020 | eHealth literacy | Participants in a massive open web-based course that was run 8 times | Participants responded to 1 health-related question and were asked to list the resources they used to inform that answer; answers were coded according to the extent to which quality resources were used | Epigenetics and cancer | Ecological validity was demonstrated in that participants accessed web-based resources to inform answers to a health-related question | Yes |
| Chang et al [35], 2021 | Searching performance | 11 older adult participants | Participants were asked to search for specific health information using a web browser on a computer; search completion time and problem correctness were measured by researchers during live observation | Vaccination for older adults, stroke, and angina | Ecological validity was demonstrated via participants accessing real-world web-based health information to answer health-related questions | No |
| Freund et al [36], 2017 | eHealth literacy | 79 older adult participants | Participants were asked to answer 6 questions (3 each for 2 health scenarios) while being given the option of using links to web-based medical databases with relevant information | Hypertension, high blood pressure, osteoporosis, breast cancer, and prostate cancer | Construct validity was demonstrated in that test scores for the intervention group improved; ecological validity was demonstrated in that participants responded to questions by referencing a web-based resource | Yes |
| Kordovski et al [37], 2020 | eHealth search skills | 56 undergraduate students enrolled in psychology courses | Participants were instructed to find the answer to 5 short questions and 1 vignette-based question using an internet browser of their choice; participants’ accuracy, time to complete each task, and total number of search queries were recorded | Headaches, migraines, and Lyme disease | Criterion validity was demonstrated in that long-answer accuracy was associated with better performance on a learning and memory composite test; construct validity was demonstrated in that lower performance on short questions was associated with lower maternal education and lower socioeconomic status | Yes |
| Loda et al [38], 2020 | Information-seeking behavior | 140 medical students | Students were randomly assigned to a search condition (Google, Medisuch, or free choice of search engine) and had 10 minutes to fill in a worksheet outlining a diagnostic recommendation; to pass, students needed to give at least 1 of 3 recommendations matching those of a clinical expert | Histamine intolerance | None | No |
| Quinn et al [25], 2017 | eHealth literacy | 54 adults | Participants were presented with 6 health questions and used a browser to search for information to answer them; each answer was scored as correct or incorrect, with a final sum score out of 6 | Various topics | Criterion validity was demonstrated because the scores correlated with health literacy | Yes |
| Sharit et al [39], 2008 | Internet search task performance | 40 older adults | Participants were assigned 6 search problems involving health-related information, for which they had to provide an answer using information they found on the internet; participants had 15 minutes to solve each problem, and the problems were progressively more complex; the problem solutions were scored as incorrect, partially correct, or correct by the researcher to create a task performance score; scores were weighted by difficulty, and participants’ completion times for each problem were also measured and factored into the score such that faster times indicated better performance | Various | Criterion validity was demonstrated in that higher performance correlated with higher knowledge of the internet, as well as with measures of reasoning, working memory, and perceptual speed | Yes |
| Sharit et al [40], 2015 | Search accuracy | 60 adults | Participants were given a health scenario, followed by a series of questions related to it, and they could use the internet to find answers; to assess accuracy, researchers assigned a score for each question; questions were weighted based on their difficulty (differed in complexity and number of subtasks) | Multiple sclerosis | Criterion validity was demonstrated as search accuracy significantly correlated with reasoning, verbal ability, visuospatial ability, processing speed, and executive function | Yes |
| van Deursen and van Dijk [41], 2011 | Internet skills performance | 88 adults | Participants completed 9 health-related assignments using a computer with high-speed internet; an assignment was deemed successfully completed if a correct answer was provided and deemed unsuccessful if no correct answer was provided in the given time frame | Various | Ecological validity was demonstrated via participants using unrestricted web-based searching to answer health-related questions | Yes |
| van Deursen [42], 2012 | Internet skills performance | 88 adults | Participants completed 9 health-related assignments using a computer with high-speed internet; an assignment was deemed successfully completed if a correct answer was provided and deemed unsuccessful if no correct answer was provided in the given time frame | Various | Ecological validity was demonstrated via participants using unrestricted web-based searching to answer health-related questions; construct validity was demonstrated as education was predictive of making incorrect decisions based on the information found | Yes |
| **Simulated internet tasks** | | | | | | |
| Camiling [43], 2019 | Actual eHealth literacy (distinct from perceived eHealth literacy) | 40 grade-10 students from public and private schools | Participants completed 10 simulation tasks; 2 researchers used a rubric to rate eHealth literacy based on task performance | Not specified | Ecological validity was demonstrated via use of simulated internet research tasks resembling a realistic environment | No |
| Chan and Kaufman [44], 2011 | eHealth literacy | 20 adult participants aged between 18 and 65 years | Participants completed eHealth tasks while verbalizing their thoughts (think-aloud protocol); researchers observed their performance, rated accuracy, and noted barriers based on video capture, audio recording, and notes taken during observation | Comparing hospital ratings | Ecological validity was demonstrated via participants actively completing health-related internet tasks in a realistic environment | Partially |
| Maitz et al [22], 2020 | Health literacy (the authors note in their study that their understanding of health literacy includes “internet-based information literacy and reading literacy”) | 14 secondary school students aged 12 to 14 years | Participants were asked to provide health-related advice in response to a short narrative text; they were asked to take screenshots of all searches and web pages opened; the web pages were later classified by researchers as good, fair, poor, or bad | Rhinoplasty and skin cancer | None | Yes |
| Neter and Brainin [24], 2017 | eHealth literacy | 88 older adults | Participants completed 15 computerized simulation tasks assessing digital and health literacy skills; tasks were rated as completed or not completed by the researcher upon reviewing the recorded performance; time needed to perform the task was also recorded; 2 researchers provided overall observational judgment on participants’ performance, ranging from 1 (poor) to 5 (good); a third researcher evaluated whether disagreements were present | Various topics | Construct validity was demonstrated because lower performers had significantly fewer years of experience using the internet | Yes |
| van der Vaart et al [26], 2011 | eHealth literacy | 88 adults | Participants completed 9 health-related assignments using a computer with high-speed internet; an assignment was deemed successfully completed if a correct answer was provided and deemed unsuccessful if no correct answer was provided in the given time frame | Various | Ecological validity was demonstrated via participants using unrestricted web-based searching to answer health-related questions | Yes |
| van der Vaart et al [45], 2013 | eHealth literacy | 31 adult patients | In study 1, participants could use the internet freely to complete 6 health-related assignments; in study 2, participants used specific websites to complete 5 health-related assignments; researchers coded whether the assignment was completed and whether help was needed; in addition, the time needed to perform each assignment was recorded; the performance was ultimately scored as good, reasonable, or poor according to the skills participants used to execute the assignment | Various | Construct validity was demonstrated through correlations of higher performance with higher education | Yes |
| Witry et al [46], 2018 | eHealth task performance | 100 adult patients with COPD<sup>a</sup> | Participants completed a series of timed eHealth simulation exercises using a laptop computer and 2 different tablet devices; the time taken to complete each task was measured and used to indicate task performance, with faster times indicating better performance | COPD | Construct validity was demonstrated because those who reported using video chat took less time than nonusers to complete most of the tasks | No |
| **Website evaluation tasks** | | | | | | |
| Kalichman et al [47], 2006 | Health information evaluation skills | 448 adults who used the internet <3 times in the month before screening | Participants rated 2 preselected web pages—1 from a medical association and 1 with scientifically unsupported claims—on 5 dimensions of website quality; a larger difference in scores indicated higher health information evaluation skills | HIV and AIDS treatment | Construct validity was demonstrated in that those receiving internet skills training had better discrimination | Yes |
| Mitsuhashi [48], 2018 | eHealth literacy evaluation skills | 300 adult participants | Participants were shown a search engine results page with 5 websites and asked which should be viewed first; the list included 2 commercial websites, 2 personal health care websites, and 1 government website; participants choosing the government website were assigned 1 point, whereas others were assigned 0 points | Not specified | Construct validity was demonstrated in that evaluation skills improved significantly in an e-learning intervention group compared with the control group | No |
| Schulz et al [49], 2021 | Health literacy | 362 adults | Participants rated 2 health information websites (one was of high quality, whereas the other was of low quality) using 3 seven-step semantic differential scales; in addition, participants were asked to choose beneficial depression treatments from a list of relevant and nonrelevant treatments | Depression treatment | Criterion validity was demonstrated in that those with high health literacy and accurate recognition of the low-quality website demonstrated good judgment for depression treatment | Partially |
| Trettin et al [50], 2008 | Website evaluation | 142 high school students | Two measures were used: first, a brief 2-item pretest of knowledge about how to evaluate a website; second, participants ranked 2 different websites (assigned to them from a list of 12 websites) using 6 credibility factors, with scores ranging from 1 (very bad) to 5 (very good) | Not specified | Ecological validity was demonstrated in that participants ranked authentic health-related websites according to their credibility | Yes |
| Xie [51], 2011 | eHealth literacy | 124 older adults | Participants were asked to evaluate the quality of 20 health websites: 10 selected from the Medical Library Association’s recommended sites and 10 from a commercial search engine; each correct assessment received 1 point, whereas incorrect or uncertain assessments received 0 points | Not specified | Construct validity was demonstrated because scores improved after an educational intervention | No |
| **Knowledge of the web-based health information–seeking process** | | | | | | |
| Hanik and Stellefson [52], 2011 | eHealth literacy | 77 undergraduate health education majors | Researchers used the RRSA-h<sup>b</sup>, which is a questionnaire that tests participants’ declarative knowledge of concepts, skills, and thinking strategies related to using the internet to find health information | Various | Ecological validity was demonstrated in the study by Ivanitskaya et al [53] | No |
| Hanna et al [54], 2017 | eHealth literacy | 165 adult dental patients | Participants were asked to circle the web-based health information quality seals they recognized and report the purpose of 1 circled figure | Third molar knowledge | Criterion validity was demonstrated in that the eHEALS<sup>c</sup> scores correlated with the dental procedural web-based information–seeking measure; construct validity was demonstrated in that web-based dental procedural information seeking was significantly associated with educational attainment and dental decisional control preference | Yes |
| Ivanitskaya et al [53], 2006 | Health information competency | 400 college-aged students | Researchers used the RRSA-h, which is a web-based quiz that assesses declarative and procedural knowledge related to web-based health information seeking | Various | Ecological validity was demonstrated in that some questions had participants access real health-related websites to assess their credibility | No |
| Ivanitskaya et al [55], 2010 | eHealth literacy skills | 1914 undergraduate and graduate students enrolled in health-related courses | Researchers used the RRSA-h, which is a web-based quiz that assesses declarative and procedural knowledge related to web-based health information seeking; a proxy measure of critical judgment skills related to pharmacies was also included | Pharmaceuticals and various others | Construct validity was demonstrated in that evaluation skills positively correlated with the number of earned college credits and being enrolled in a health-related major | Partially |
| St. Jean et al [56], 2017 | Digital health literacy | 19 adolescents | Participants were given 13 questions related to searching for health information; researchers analyzed responses using thematic analysis; no evident scoring system was used | Type 1 diabetes | None | Yes |
| van der Vaart and Drossaert [57], 2017 | Digital health literacy, eHealth literacy | 200 adults | Participants completed a 28-item questionnaire comprising 21 self-report items and 7 performance-based items, each of which has a correct answer | Various | None for performance-based items | Yes |
| **Health-related knowledge** | | | | | | |
| Holt et al [58], 2019 | eHealth literacy | 246 adult patients with diabetes, other endocrine conditions, and gastrointestinal diseases | Researchers used eHLA<sup>d</sup> performance tests (tools 1 and 4); tool 1 is a performance-based health literacy test based on an information leaflet, and tool 4 is a performance test for knowledge of health and health care | Various | Construct validity was demonstrated in that educational level was positively correlated with tool 4 | Partially |
| Holt et al [59], 2020 | eHealth literacy | 366 nursing students | Researchers used eHLA performance tests (tools 1 and 4); tool 1 is a performance-based health literacy test based on an information leaflet, and tool 4 is a performance test for knowledge of health and health care | Various | Construct validity was demonstrated in that graduate-level students scored higher than entry-level students, and performance on tools 1 and 4 was correlated with having at least 1 parent with experience in the social or health care system | Partially |
| Karnoe et al [31], 2018 | eHealth literacy | 475 adults used as a validation sample | Researchers used the eHLA, which consists of 7 tools, 2 of which are objective measures: tool 1 is a performance-based health literacy test based on an information leaflet, and tool 4 is a performance test for knowledge of health and health care | Various | None | Partially |
| Liu et al [60], 2020 | Digital health literacy | 1588 adult participants | Participants were provided 5 randomly selected items from a large web-based health information bank and asked whether the information was right or wrong; 2 items were designed to be relatively easy to judge accurately, 2 were moderately easy, and 1 was difficult; participants scored 1 for each accurate judgment and 0 for being incorrect or unsure | Various | Ecological validity was demonstrated in that the web-based health information bank was generated from real web-based sources; construct validity was demonstrated because participants at high risk for misjudging health information had a lower education level and poorer health and used the internet less | Partially |
<sup>a</sup>COPD: chronic obstructive pulmonary disease.
<sup>b</sup>RRSA-h: Research Readiness Self-Assessment-health.
<sup>c</sup>eHEALS: eHealth Literacy Scale.
<sup>d</sup>eHLA: eHealth literacy assessment toolkit.
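
Several of the instruments summarized above reduce coded task responses to a simple composite or sum score. As a worked illustration of the scoring arithmetic described for Agree et al [33] (each of 6 search tasks coded for response accuracy, 0 or 1, and specificity, 0 to 2, yielding a total from 0 to 18), the following Python sketch shows one way such a composite could be computed; the function and variable names are hypothetical and are not drawn from the original study.

```python
# Minimal sketch of a composite search-task score in the style described for
# Agree et al [33]: each task is coded for accuracy (0 or 1) and specificity
# (0, 1, or 2); with 6 tasks the total ranges from 0 to 18.
# Names are illustrative only, not taken from the study's materials.

from typing import Sequence


def composite_search_score(accuracy: Sequence[int], specificity: Sequence[int]) -> int:
    """Sum per-task accuracy (0-1) and specificity (0-2) codes."""
    if len(accuracy) != len(specificity):
        raise ValueError("accuracy and specificity must cover the same tasks")
    for a, s in zip(accuracy, specificity):
        if a not in (0, 1) or s not in (0, 1, 2):
            raise ValueError("accuracy codes must be 0-1 and specificity codes 0-2")
    return sum(accuracy) + sum(specificity)


# Example: 5 of 6 tasks answered correctly, with mixed specificity codes.
print(composite_search_score(accuracy=[1, 1, 0, 1, 1, 1],
                             specificity=[2, 1, 0, 2, 2, 1]))  # prints 13
```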