Abstract
eHealth literacy is the ability to access, assess, and use digital health information. This study compared the effects of a multimedia tutorial versus a paper-based control in improving older adults’ eHealth literacy from pre- to posttest. A total of 99 community-dwelling older adults (63–90 years old; mean = 73.09) participated from July 2019 to February 2020. Overall, knowledge about computer/Internet terms, eHealth literacy efficacy, knowledge about the quality of health information websites, and procedural skills in computer/Internet use improved significantly from pre- to posttest. No interaction effect was found between time and group. Participants in both groups had an overwhelmingly positive attitude toward training. Their attitudes toward training approached a statistically significant difference between the two conditions: F(1, 89) = 3.75, p = .056, partial η2 = .040, with the multimedia condition showing more positive attitudes. These findings have implications for designing effective eHealth literacy interventions for older adults.
Keywords: multimedia tutorial, eHealth literacy, older adults, attitudes
Health literacy is “the degree to which individuals have the capacity to obtain, process, and understand basic health information and services needed to make appropriate health decisions” (Healthy People 2010, 2000). The connections between poor health literacy and health outcomes and costs are well-documented (Berkman et al., 2011; Institute of Medicine, 2004). Yet we know little about effective health literacy interventions (Mika et al., 2005), and even less about how such interventions might affect individuals with varying characteristics (e.g., age) differently. Older adults are in great need of health literacy interventions, given that their needs for health information and services are typically high yet their health literacy levels are low (Kutner et al., 2006). Because of age-related changes in cognitive and physiological abilities and social environments (Birren & Warner, 1990), interventions for younger people are unlikely to reach older adults or affect them similarly.
Health literacy is “a multidimensional, dynamic construct” (Squiers et al., 2012, p. 47); its definitions and requirements present a moving target (Berkman et al., 2010; Paasche-Orlow et al., 2010). Earlier definitions of health literacy did not explicitly include the ability to find information on the Internet, but as information and communication technologies are increasingly used in health care (Institute of Medicine, 2009; Oh et al., 2005), the ability to use such technology is becoming integrated into health literacy’s conceptualization (Bann et al., 2012; Berkman et al., 2010; Chan & Kaufman, 2011; McCormack et al., 2010; Norman, 2009; Paasche-Orlow et al., 2010; Squiers et al., 2012). The Health Literacy Skills Framework, for example, has explicitly added online information-seeking and eHealth skills as a dimension of health literacy that “can be developed, enhanced, refined, and even lost” (Squiers et al., 2012, p. 47). eHealth refers to the application of electronic technology in health care, and eHealth literacy is defined as the “ability to seek out, find, evaluate and appraise, integrate and apply what is gained in electronic environments towards solving a health problem” (Norman & Skinner, 2006a, p. e27). eHealth literacy comprises six component literacies: traditional, information, scientific, media, computer, and health literacy (Norman & Skinner, 2006b).
Older adults are in double jeopardy in the eHealth era because they tend to have not only low health literacy but also low computer literacy (Czaja et al., 2006; Zickuhr & Smith, 2012). Previous studies have indicated that, although a wide range of health-related information products and services (e.g., patient portals, health monitoring technologies) are available, older adults are less likely to use those services than any other age group (Levine et al., 2016; Sakaguchi-Tang et al., 2017). Without effective interventions, older adults are unlikely to avail themselves of health care technology’s full potential (Fox & Duggan, 2013; Xie, 2008). Our prior research over the past decade provides ample evidence that older adults can learn to use new technology, particularly when they are provided with age-appropriate training and technology that is properly designed (Jaeger & Xie, 2009; Piper et al., 2009; Xie, 2008, 2011a; Xie, Yeh, et al., 2012). Still, exactly what constitutes such training and design remains understudied, as does how such training and design can be developed and implemented.
Multimedia Learning
Our search for an effective training intervention for older adults has led us to multimedia learning. The term multimedia refers to “presenting both words (such as spoken text or printed text) and pictures (such as illustrations, photos, animation, or video)” (Mayer, 2005, p. 15). The cognitive theory of multimedia learning posits that a well-designed multimedia curriculum is more advantageous than a single-medium curriculum in reducing extraneous cognitive load (created by the way in which the instructional material is presented), thus freeing cognitive resources for deeper learning (Mayer, 2005). Derived from this theory, the split-attention principle (Ayres & Sweller, 2005) warns that instructions in which multiple sources of information are not integrated increase extraneous cognitive load because they require the use of working memory resources for mental integration. In contrast, instructions that feature the integration of multiple sources of information do not increase extraneous cognitive load and thus are superior in promoting learning.
The split-attention principle is particularly applicable to situations involving (a) learning of materials with high element interactivity (i.e., materials featuring a high number of elements that must be simultaneously processed in working memory in order to learn) and (b) low-knowledge learners (Ayres & Sweller, 2005). These situations are typical among older adults trying to learn new computer skills. Nevertheless, split-attention instructions are still commonly used for older learners (Echt et al., 1998; Hawthorn, 2007). Such instructions require the user to alternate between multiple interfaces: for example, an instruction interface that presents the instructions (e.g., a manual) and an application interface where the instruction is applied (e.g., a computer screen). This adds cognitive burden for the user, and it is particularly challenging for older adults, whose cognitive abilities tend to be more limited than those of younger adults (Birren & Warner, 1990). Integrating multiple interfaces into a single interface (i.e., adding instructions onto an application interface) provides a novel technical solution to such challenges. Preliminary evidence suggests that this integrated approach benefits general computer users (Bergman et al., 2005; Kang et al., 2003), school children (Kelleher & Pausch, 2005), and, as shown in our own qualitative pilot research, older adults (Xie, Yeh, et al., 2012). More systematic examination is needed to understand if and how the integrated approach may work for older adults, who, owing to age-related cognitive and physiological changes (Birren & Warner, 1990), may require age-specific instructional features and designs for integrated tutorials to enhance their eHealth literacy.
The Present Study
The present study aimed to: (a) develop a multimedia eHealth tutorial for older adults; and (b) assess the efficacy of this tutorial in improving older adults’ eHealth literacy. We therefore recruited nine older adults to join our team as design partners and worked with them over 11 participatory design sessions, 2 hours per session, to gather design ideas and use those ideas to guide the design of our eHealth tutorial. We report details about this participatory design process elsewhere (Davis et al., 2021). Next, as we describe in this paper, we used a randomized controlled design with pre- and posttests to compare the efficacy of our multimedia tutorial with that of a paper-based tutorial (control).
A key feature of our multimedia tutorial, Online Tutorial Overlay Presenter (OnTOP), is that it recognizes elements on a website (e.g., a button) and displays visual annotations (e.g., “click this button to continue”) in an overlay on top of those elements. Using OnTOP, learners no longer need to split their attention between a website and a separate tutorial. Instead, they can stay focused on the website, which greatly reduces cognitive load (Mayer, 2005). Our OnTOP tutorial was developed using Nickelled (https://www.nickelled.com), a cross-platform tool for hosting multimedia tutorials on any web browser. To facilitate navigation between multiple learning goals, we added a panel to the left of the interface to display an outline of the learning goals. This outline view was made possible by a custom Google Chrome extension developed by our own team. With technical assistance from the research site’s staff, we installed this extension on the research site’s computers. To the best of our knowledge, OnTOP is the first tutorial to integrate overlay instructions onto real, live websites.
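To illustrate the overlay idea in general terms, the sketch below shows how a browser script might highlight a live page element and attach a large-print instruction bubble to it. This is only a minimal illustration of the technique, not the authors’ actual implementation (which used Nickelled and a custom Chrome extension); the names OverlayStep and showStep and the selector "#health-topics-link" are hypothetical.

```typescript
// Minimal sketch of one overlay annotation step on a live web page.
// Hypothetical names; not the OnTOP/Nickelled production code.

interface OverlayStep {
  selector: string; // CSS selector of the element to annotate, e.g., "#health-topics-link"
  text: string;     // Instruction shown to the learner
}

function showStep(step: OverlayStep): void {
  const target = document.querySelector<HTMLElement>(step.selector);
  if (!target) return; // Element not found on this page; skip the step.

  const rect = target.getBoundingClientRect();

  // Highlight box drawn directly on top of the live element.
  const highlight = document.createElement("div");
  Object.assign(highlight.style, {
    position: "fixed",
    left: `${rect.left - 4}px`,
    top: `${rect.top - 4}px`,
    width: `${rect.width + 8}px`,
    height: `${rect.height + 8}px`,
    border: "3px solid #d97706",
    borderRadius: "6px",
    pointerEvents: "none", // Let clicks pass through to the real element.
    zIndex: "9999",
  });

  // Instruction bubble in large print, placed just below the element.
  const bubble = document.createElement("div");
  bubble.textContent = step.text;
  Object.assign(bubble.style, {
    position: "fixed",
    left: `${rect.left}px`,
    top: `${rect.bottom + 10}px`,
    maxWidth: "320px",
    padding: "10px 14px",
    background: "#fffbe6",
    border: "1px solid #d97706",
    borderRadius: "6px",
    font: "18px/1.4 sans-serif",
    zIndex: "9999",
  });

  document.body.append(highlight, bubble);

  // Remove the overlay once the learner clicks the annotated element.
  target.addEventListener("click", () => { highlight.remove(); bubble.remove(); }, { once: true });
}

// Example step on a health information website:
showStep({ selector: "#health-topics-link", text: "Click ‘Health Topics’ to see conditions A to Z." });
```

Because the annotations sit on top of the real website rather than in a separate manual, the learner’s attention never has to leave the application interface, which is the core of the split-attention argument described above.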
Our primary research question (RQ) was: How effective is our multimedia tutorial, OnTOP, in improving older adults’ eHealth literacy from pre- to posttest, compared with the paper-based tutorial? (RQ1) Our secondary RQ was: What are older adults’ attitudes towards the multimedia versus paper-based tutorial? (RQ2)
Method
Sample and Data
A total of 99 older adults aged 65–90 years were recruited from senior centers, public libraries, and senior-living facilities in Central Texas from July 2019 to February 2020. Of these participants, 91 completed all sessions and the pre- and posttests (multimedia condition: 45; paper-based condition: 46), resulting in a retention rate of 92%. Data collection took place in the computer labs of our partnering senior centers and public libraries, which were easily accessible via public transportation and had ample parking space. Informed consent was obtained prior to any data collection. The study was approved by the Institutional Review Board of the authors’ university.
Measures
We used both subjective and objective measures. Objective measures included computer/Internet knowledge; skills in evaluating the quality of online information; and procedural skills in computer/Internet use. Subjective measures included eHealth literacy efficacy and attitudes (Table 1). Background characteristics (demographics, prior experience, and primary language) served as control variables. All pre- and posttests were administered in a small group setting, completed independently, and recorded using paper and pen.
Table 1.
Variables, instruments, and time of measurement
Variable | Instrument/What is measured | Pre | Post |
---|---|---|---|
Computer/Internet knowledge | Measured knowledge about computer and Internet terms; 20 items; scoring range 0–20; the higher the score, the higher the knowledge | X | X |
Skills in evaluating the quality of online health information websites | A list of 8 health information websites, including 4 good websites and 4 bad ones, was provided for participants to evaluate the quality of the health information on each website; 8 items; scoring range 0–8; the higher the score, the better the skills in evaluating the quality of health websites | X | X |
Procedural skills in computer/Internet use | A set of tasks performed independently by the participants to determine their procedural skills in using computers and the internet; 12 items; scoring range 0–12; the higher the score, the higher the procedural skills | X | X |
eHealth literacy efficacy | The eHealth literacy scale (Norman & Skinner, 2006a); measured self-perceived skills at and comfort with using computers and the Internet for health information and decision making; excellent internal consistency reliability (alpha = .89–.97) with good test–retest reliability (Norman, 2009); 8 items, each on a 1–5 Likert scale; scoring range 8–40; the higher the score, the higher the eHealth literacy efficacy | X | X |
Attitudes | Satisfaction with the intervention (e.g., format, duration); 6 items each on a 1–5 Likert scale; scoring range 6–30; the higher the score, the more positive the attitude | | X |
Background | Age, gender, education, health, race/ethnicity, income, prior experience with computers and the Internet, language; 14 items; developed in-house | X |
Procedure
Participants were randomly assigned to either the intervention condition, featuring learning with OnTOP, or the control condition, featuring learning with a paper-based tutorial developed by the National Institute on Aging (NIA) (AmeriCorps, 2010). The paper-based NIA tutorial was tested in our prior studies and was found effective in improving older adults’ eHealth literacy (Xie, 2011a, 2011b, 2011c, 2012; Xie & Bugg, 2009). Our OnTOP tutorial is distinct from the paper-based NIA tutorial in that it (a) presents instructions on top of real, live websites; (b) uses multimedia, is interactive, and enables real-time feedback; and (c) derives its specific content and features from participatory design, a user-centered approach. In each condition, a trained facilitator (a graduate research assistant) directed participants to use the respective tutorial and provided guidance as needed. Apart from the tutorial used (OnTOP vs. the paper-based NIA tutorial), the two conditions were identical in all respects (site, procedure, etc.).
The experiment involved a total of four sessions that occurred twice a week, 2 hours per session, over 2 weeks at a research site. Each session included no more than eight participants to ensure a small group context. During each session, participants each used a networked computer to learn online information-seeking/eHealth skills, outlined in Table 2.
Table 2.
Outline of eHealth skills covered in the four experimental sessions.
Session | Skills |
---|---|
1 | Basic computer/Internet terms and skills |
2 | Introduction to the MedlinePlus.gov website; use of the Health Topics section on MedlinePlus.gov |
3 | Use of the Drugs and Supplements and the Medical Encyclopedia sections on MedlinePlus.gov |
4 | How to evaluate the quality of health information websites |
Data Analysis
Data were entered into IBM SPSS v.27 by graduate research assistants. Prior to conducting inferential analyses, the data were evaluated for accuracy, missing data, out-of-range values, and violations of statistical assumptions. Background variables (demographics, prior experience, and language) were examined to detect potential differences between the intervention and control groups. Descriptive statistics were used to provide a statistical profile of the sample, with frequencies and percentages reported for categorical data and means and standard deviations for continuous data. We examined the effects of the tutorials on each outcome measure using mixed-model analyses of variance (ANOVAs), with one between-subjects factor (multimedia vs. paper-based tutorial) and one within-subjects factor (pre- vs. posttest). We assessed differences between the intervention and the control with univariate ANOVA and repeated-measures ANOVA. Significance tests were two-tailed with a significance level of .05, and Bonferroni corrections were applied.
We screened the data to ensure that the assumptions of univariate and repeated-measures ANOVA were fulfilled. Although the normality assumption was violated (as indicated by skewness, kurtosis, and the Shapiro-Wilk test), univariate and repeated-measures ANOVA are robust to non-normally distributed data. We did not identify any outliers, and we found no violation of the assumption of homogeneity of covariance across groups; Wilks’ λ was therefore used to examine the differences in means between pre- and posttest. However, because the sphericity assumption could not be confirmed, we applied the Greenhouse-Geisser correction when examining the main effect of time.
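For clarity, the analysis of each outcome can be written as a 2 (group) × 2 (time) mixed ANOVA. The notation below is a standard, illustrative way of stating that model; it is not drawn from the SPSS output.

```latex
% 2 (group: multimedia vs. paper) x 2 (time: pre vs. post) mixed ANOVA, illustrative notation
\[
Y_{ijk} = \mu + \alpha_i + \beta_j + (\alpha\beta)_{ij} + \pi_{k(i)} + \varepsilon_{ijk}
\]
% \alpha_i: between-subjects effect of tutorial group
% \beta_j: within-subjects effect of time
% (\alpha\beta)_{ij}: time-by-group interaction (the term addressing RQ1)
% \pi_{k(i)}: random effect of participant k nested within group i
% \varepsilon_{ijk}: residual error
```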
Results
Participants’ demographics and computer/Internet use are summarized in Table 3. No statistically significant difference in any of these variables was found between participants in the multimedia and paper-based tutorial groups.
Table 3.
Participant Characteristics
Characteristics | Participants Overall | Paper Group | Multimedia Group | χ2 or t | P |
---|---|---|---|---|---|
| |||||
Age in years, mean (SD) | 73.09 (7.02) | 74.20 (7.34) | 71.96 (6.55) | 1.60 | .11 |
| |||||
Gender, n (%) | | | | .08 | .78 |
Female | 64 (64.6) | 33 (66.0) | 31 (63.3) | ||
Male | 35 (35.4) | 17 (34.0) | 18 (36.7) | ||
| |||||
Hispanic, n (%) | | | | .01 | .91 |
Yes | 25 (26.0) | 13 (26.5) | 12 (25.5) | ||
No | 71 (74.0) | 36 (73.5) | 35 (74.5) | ||
| |||||
Race, n (%) | | | | 3.32 | .51 |
American Indian/Alaska Native | 2 (2.1) | 1 (2.1) | 1 (2.1) | ||
African American | 22 (22.7) | 11 (22.9) | 11 (22.4) | ||
Asian | 4 (4.1) | 1 (2.1) | 3 (6.1) | ||
White Caucasian | 54 (55.7) | 30 (62.5) | 24 (49.0) | ||
Other | 15 (15.5) | 5 (10.4) | 10 (20.4) | ||
| |||||
Education, n (%) | | | | 6.03 | .54 |
Less than high school graduate | 5 (6.1) | 4 (8.0) | 2 (4.0) | ||
High school graduate/GED/Vocational training | 37 (37.4) | 15 (30.0) | 22 (44.9) | ||
Some college/Associate degree | 27 (27.3) | 14 (28.0) | 13 (26.5) | ||
Bachelor's degree | 10 (10.1) | 6 (12.0) | 4 (8.2) | ||
Graduate degree (Master’s & doctoral) | 19 (19.2) | 11 (22.0) | 8 (16.4) | ||
| |||||
Yearly household income, n (%) | | | | 8.40 | .49 |
Less than $20,000 | 42 (42.4) | 19 (38.0) | 23 (46.9) | ||
$20,000 - $29,999 | 13 (13.1) | 5 (10.0) | 8 (16.3) | ||
$30,000 – $39,999 | 6 (6.1) | 4 (8.0) | 2 (4.1) | ||
$40,000 – $49,999 | 2 (2.0) | 0 (0.0) | 2 (4.1) | ||
$50,000 – $59,999 | 3 (3.0) | 2 (4.0) | 1 (2.0) | ||
$60,000 – $69,999 | 4 (4.0) | 2 (4.0) | 2 (4.1) | ||
$70,000 – $99,999 | 8 (8.1) | 6 (12.0) | 2 (4.1) | ||
$100,000 or more | 3 (3.0) | 2 (4.0) | 1 (2.0) | ||
Do not know for certain | 6 (6.1) | 2 (4.0) | 4 (8.2) | ||
Do not wish to answer | 12 (12.1) | 8 (16.0) | 4 (8.2) | ||
| |||||
Health status, n (%) | | | | 2.28 | .52 |
Fair | 27 (27.3) | 16 (32.0) | 11 (22.4) | ||
Good | 50 (50.5) | 23 (46.0) | 27 (55.1) | ||
Very good | 21 (21.2) | 10 (20.0) | 11 (22.4) | ||
Excellent | 1 (1.0) | 1 (2.0) | 0 (0.0) | ||
| |||||
English as primary language, n (%) | | | | 2.29 | .32 |
Yes | 91 (91.9) | 45 (90.0) | 46 (93.9) | ||
No | 7 (7.1) | 5 (10.0) | 2 (4.1) | ||
| |||||
Internet use length, n (%) | | | | 4.11 | .66 |
Never | 19 (19.4) | 9 (18.0) | 10 (20.8) | ||
Less than 1 year | 4 (4.1) | 1 (2.0) | 3 (6.3) | ||
More than 1 year, less than 3 years | 14 (14.3) | 7 (14.0) | 7 (14.6) | ||
More than 3 years, less than 5 years | 8 (8.2) | 6 (12.0) | 2 (4.2) | ||
More than 5 years, less than 10 years | 12 (12.2) | 6 (12.0) | 6 (12.5) | ||
More than 10 years | 41 (41.8) | 21 (42.0) | 20 (41.7) | ||
| |||||
Internet use frequency, n (%) | | | | 3.58 | .61 |
Never | 16 (16.3) | 7 (14.0) | 9 (18.8) | ||
Less than once a month | 8 (8.2) | 5 (10.0) | 3 (6.3) | ||
More than once a month | 9 (9.2) | 6 (12.0) | 3 (6.3) | ||
Once a week | 5 (5.1) | 1 (2.0) | 4 (8.3) | ||
Every 2–3 days | 21 (21.4) | 11 (22.0) | 10 (20.8) | ||
Everyday | 39 (39.8) | 20 (40.0) | 19 (39.6) | ||
| |||||
Computer use length, n (%) | | | | 1.82 | .87 |
Never | 18 (18.4) | 8 (16.0) | 10 (20.8) | ||
Less than 1 year | 7 (7.1) | 4 (8.0) | 3 (6.3) | ||
More than 1 year, less than 3 years | 13 (13.3) | 8 (16.0) | 5 (10.4) | ||
More than 3 years, less than 5 years | 6 (6.1) | 2 (4.0) | 4 (8.3) | ||
More than 5 years, less than 10 years | 9 (9.2) | 5 (10.0) | 4 (8.3) | ||
More than 10 years | 45 (45.9) | 23 (46.0) | 22 (45.8) | ||
| |||||
Computer use frequency, n (%) | | | | .82 | .98 |
Never | 19 (19.4) | 9 (18.0) | 10 (20.8) | ||
Less than once a month | 8 (8.2) | 5 (10.0) | 3 (6.3) | ||
More than once a month | 6 (6.1) | 3 (6.0) | 3 (6.3) | ||
Once a week | 8 (8.2) | 4 (8.0) | 4 (8.3) | ||
Every 2–3 days | 14 (14.3) | 8 (16.0) | 6 (12.5) | ||
Everyday | 43 (43.9) | 21 (42.0) | 22 (45.8) |
Procedural skills in computer/Internet use.
Time had a significant main effect on procedural skills in computer/Internet use: F(1, 89) = 95.742, p < .001, η2 = .518. Bonferroni follow-up tests indicated that the posttest procedural skills score was significantly higher than the pretest score: Mdifference = 1.99, SD = .20, p < .001. Time and group had a significant interaction: F(1, 89) = 5.126, p = .026, η2 = .054; however, Bonferroni follow-up tests showed no significant mean difference in procedural skills between the intervention group and the control group: Mdifference = 0.484, SD = .58, p = .407. When controlling for participants’ characteristics in the model, we found that prior Internet use (F(1, 77) = 15.88, p < .001, η2 = .171) contributed significantly to the variation in procedural skills in computer/Internet use.
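The reported effect sizes are consistent with the F ratios and their degrees of freedom: for a one-degree-of-freedom effect, partial η2 can be recovered directly from F. The relationship is shown below only as a check on the reporting, using the main effect of time above.

```latex
% Partial eta-squared recovered from the reported F ratio (standard relationship)
\[
\eta_p^2 = \frac{F \cdot df_1}{F \cdot df_1 + df_2}
         = \frac{95.742 \times 1}{95.742 \times 1 + 89} \approx .518
\]
```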
eHealth literacy efficacy.
Time had a significant main effect on eHealth literacy efficacy: F(1, 89) = 164.34, p < .001, η2 = .649. Bonferroni follow-up tests indicated that posttest eHealth literacy efficacy was significantly higher than pretest eHealth literacy efficacy (Mdifference = 9.62, SD = .75, p < .001). Time and group had a significant interaction effect on this outcome measure, F(1, 89) = 4.67, p = .033; however, Bonferroni follow-up tests showed no significant mean difference in eHealth literacy efficacy between the intervention group and the control group: Mdifference = 0.683, SD = .95, p = .477. After controlling for demographic and technology variables, health status (F(1, 77) = 4.16, p = .045, η2 = .051) and race (F(1, 77) = 3.99, p = .049, η2 = .049) were found to be significant contributing factors in the variation of eHealth literacy efficacy among older adults.
Knowledge about computer/Internet terms.
Time had a significant main effect on computer/Internet knowledge: F(1, 89) = 50.31, p < .001, η2 = .361, indicating significant differences in means between pre- and posttest, with a large effect size. Bonferroni follow-up tests indicated that computer/Internet knowledge was significantly higher at posttest than at pretest: Mdifference = 2.50, SD = .35, p < .001. No statistically significant time-by-group interaction was found. However, after controlling for participants’ characteristics, age (F(1, 77) = 5.28, p = .024, η2 = .064), education level (F(1, 77) = 8.01, p = .006, η2 = .094), health status (F(1, 77) = 4.58, p = .036, η2 = .056), race (F(1, 77) = 4.56, p = .036, η2 = .056), and prior Internet use (F(1, 77) = 11.68, p < .001, η2 = .132) were found to contribute significantly to the variation in knowledge about computer/Internet terms.
Knowledge about the quality criteria for evaluating online health information websites.
The main effect of time was significant: F(1,89) = 26.53, p < .001, η2 = .230, with a large effect size. Bonferroni follow-up tests indicated that the posttest evaluation skills score was significantly higher than the pretest evaluation skills score: Mdifference = .81, SD = .16, p < .001. No statistically significant interaction of time and group was found. Controlling for demographic and technology knowledge variables, we found that race (F(1,77) = 5.18, p = .026, η2 = .063) contributed significantly to the variations in evaluation skills.
Attitudes.
Univariate ANOVA showed that older adults’ attitudes toward training at posttest approached a statistically significant difference between the multimedia and paper-based conditions: F(1, 89) = 3.75, p = .056, partial η2 = .040, with the OnTOP condition showing more positive attitudes. Overall, participants in both groups had an overwhelmingly positive attitude toward training.
These results are illustrated in Tables 4 and 5.
Table 4.
Means of the Variables and Effect Sizes
Variable | Group | Pretest, Mean (SD) | Posttest, Mean (SD) | 95% CI LL | 95% CI UL | Cohen’s d |
---|---|---|---|---|---|---|
Computer/Internet knowledge | Multimedia | 14.07 (3.86) | 16.87 (3.02) | .601 | 1.309 | .959 |
 | Paper | 14.04 (4.21) | 16.26 (3.60) | .273 | .900 | .589 |
eHealth literacy efficacy | Multimedia | 25.28 (6.90) | 33.28 (3.28) | .958 | 1.775 | 1.371 |
 | Paper | 22.98 (7.77) | 34.22 (3.97) | .937 | 1.737 | 1.341 |
Evaluation skills | Multimedia | 5.02 (1.98) | 5.78 (1.98) | .213 | .837 | .527 |
 | Paper | 5.28 (1.95) | 6.15 (1.65) | .240 | .861 | .553 |
Procedural skills | Multimedia | 7.98 (3.47) | 8.76 (3.79) | .004 | .593 | .296 |
 | Paper | 8.00 (3.24) | 10.46 (2.14) | .665 | 1.381 | 1.027 |
Table 5.
Participants’ Attitudes toward the training
Variable | Overall N (%) | Paper Group n (%) | Multimedia Group n (%) |
---|---|---|---|
| |||
Entire experience | |||
Neither | 2 (2.2) | 1 (2.2) | 1 (2.2) |
Satisfied | 38 (41.8) | 23 (50.0) | 15 (33.3) |
Extremely Satisfied | 51 (56.0) | 22 (47.8) | 29 (64.4) |
| |||
Usefulness of the training | |||
Completely Useless | 1 (1.1) | 1 (2.2) | 0 (0.0) |
Somewhat Useful | 5 (5.5) | 4 (8.7) | 1 (2.2) |
Useful/ Very Useful | 85 (93.4) | 41 (89.1) | 44 (97.7) |
| |||
Quality of the tutorial | |||
Fair | 4 (4.5) | 3 (6.8) | 1 (2.2) |
Good | 48 (53.9) | 24 (54.5) | 24 (53.3) |
Excellent | 37 (41.6) | 17 (38.6) | 20 (44.4) |
| |||
Instructors’ teaching |||
Fair/Good | 15 (16.5) | 9 (19.5) | 6 (13.3) |
Excellent | 76 (83.5) | 37 (80.4) | 39 (86.7) |
| |||
Would recommend computer class to peers | |||
Probably Not | 1 (1.1) | 1 (2.2) | 0 (0.0) |
Not sure | 1 (1.1) | 0 (0.0) | 1 (2.3) |
Yes/Definitely Yes | 88 (97.7) | 45 (97.8) | 43 (97.7) |
| |||
Would re-attend the same class if they could start over | |||
Definitely Not/ Probably Not | 3 (3.3) | 3 (6.5) | 0 (0.0) |
Not Sure | 5 (5.5) | 1 (2.2) | 4 (8.9) |
Yes/Definitely Yes | 83 (91.2) | 42 (91.4) | 41 (91.1) |
Overall, procedural skills, eHealth literacy efficacy, computer/Internet knowledge, and knowledge about quality criteria all improved significantly from pre- to posttest regardless of group assignment; improvements from pre- to posttest in the intervention group did not differ from those in the control group. Participants in both groups had an overwhelmingly positive attitude toward training; their attitude toward the OnTOP tutorial was more positive than that toward the paper-based tutorial, although this difference only approached statistical significance.
Discussion
Older adults tend to have multiple chronic conditions and are major users of the health care system. eHealth offers access to health information about those conditions, treatment options, and management, empowering older adults to take an active role in their health care (Rockmann & Gewald, 2015). Previous studies have indicated that older adults tend to have limited eHealth literacy and low perceived eHealth self-efficacy (Choi & DiNitto, 2013; Xie, 2012; Watkins & Xie, 2014). With effective educational strategies, older adults with limited health literacy can learn how to use eHealth services as well as information and communication technologies, which in turn may help to reduce their health disparities and improve their access to health care information (Xie, 2012; Watkins & Xie, 2014). Interventions that support health self-management, including the use of eHealth services, and eHealth literacy interventions are critical for optimizing older adults’ overall health and well-being (Arcury et al., 2020).
Existing health literacy interventions focus predominantly on simplifying medical instructions and materials (Andrus & Roth, 2002; Berkman et al., 2011; Schaefer, 2008). Although such an approach can foster proper use of health care resources, it can do so only to a certain degree, given the complexity of medical knowledge. Education and training that raise individuals’ actual health literacy level are essential to addressing the health illiteracy crisis. Evidence exists that multimedia tutorials are more advantageous to older adults than unimodal tutorials, because they reduce extraneous cognitive load (Mayer, 2005). Based on the cognitive theory of multimedia learning, multimedia tutorials can provide a stronger foundation for integrating knowledge and ensuring deeper learning, which can be important for older learners who may be prone to cognitive overload in new learning situations (Czaja et al., 2013; Mayer, 2005). Multimedia-based interventions have been found effective in improving health outcomes in older adults and in enhancing knowledge and self-efficacy, particularly in individuals with low health literacy (Czaja et al., 2013; Moussa et al., 2013; Watkins & Xie, 2014).
In this study, we have focused on online information-seeking/eHealth skills. By focusing on this relatively new dimension of health literacy skills, our intervention can contribute to ongoing discussions about the evolving conceptualization of health literacy (e.g., helping to build an understanding of how online information-seeking/eHealth skills can be developed, enhanced, and refined). Specifically, we compared the efficacy of our multimedia eHealth tutorial, OnTOP, with that of a conventional paper-based eHealth tutorial in improving eHealth literacy among community-dwelling older adults. Our findings show that, overall, older adults’ computer/Internet knowledge, eHealth literacy efficacy, evaluation skills, and procedural skills increased significantly from pre- to posttest in both the intervention and the paper-based tutorial conditions, with effect sizes (η2) varying from .23 (f2 = .30) to .65 (f2 = 1.86). According to Cohen (1988), f2 values of .02, .15, and .35 represent small, medium, and large effects, respectively. Thus, our intervention, regardless of the tutorial format, had a large effect on each of the four eHealth literacy measures (procedural skills, eHealth literacy efficacy, computer/Internet knowledge, and knowledge about quality criteria) from pre- to posttest. This finding is important, given the short-term nature of the study (8 hours of training over 2 weeks). Our findings are in line with studies suggesting that age-appropriate training for older adults leads to significant benefits in eHealth literacy (Czaja et al., 2013; Xie, 2011a). Indeed, older adults differ from younger adults in their learning needs and requirements when using technology, owing to age-related physical and physiological characteristics (Chen & Chan, 2014). In addition, older adults’ attitudes toward both the multimedia tutorial and the paper-based tutorial were very positive, and the difference in attitudes between the two tutorials approached statistical significance, with more positive attitudes toward the multimedia tutorial (p = .056).
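The f2 values noted in parentheses above follow from the standard conversion between η2 and Cohen’s f2; the arithmetic is shown here only to make the link explicit.

```latex
% Conversion between (partial) eta-squared and Cohen's f^2 (Cohen, 1988)
\[
f^2 = \frac{\eta^2}{1 - \eta^2}, \qquad
\frac{.23}{1 - .23} \approx .30, \qquad
\frac{.65}{1 - .65} \approx 1.86
\]
```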
No statistically significant interaction of time and group was found on any of the outcome measures. Several factors might have contributed to these findings. There might have been a ceiling effect related to the paper-based tutorial, because both tutorials were effective. The participants’ level of education could also have played a role in the non-significant differences between the tutorials, as more than 50% of our participants were highly educated, and a high education level has been found to be associated with greater use of traditional printed information (Millar et al., 2020). In addition, in both conditions a trained facilitator was present at all sessions, directed participants during the sessions, and provided them with guidance as needed. Research shows that health communication that engages individuals interactively and personally is an effective method of patient education (Moussa et al., 2013). Thus, the facilitator might have reduced potential disparities between the two learning conditions. Further, differences in the facilitators’ teaching styles, experience, personalities, and gender, or a lack of consistency in the training protocol, might have affected the outcomes. Although we built sufficient break time into the sessions, participants in the OnTOP group might have experienced computer-related fatigue relative to their counterparts using the paper-based tutorial, minimizing the differences between the learning conditions. Additionally, many of our participants had been using computers and the Internet daily and had more than 10 years of computer and Internet experience. Because greater exposure to and prior experience with computers and the Internet are significant predictors of performance (Czaja et al., 2013), our participants’ prior experience might have helped them at least partially overcome the cognitive burden associated with paper-based tutorials.
Limitations and Future Research Directions
This study has some limitations. First, we used a convenience sample of older adults recruited from senior centers, public libraries, and senior-living facilities in a geographically limited area, which may limit the generalizability of the findings. Second, although participants were randomly assigned to either the intervention or the control condition, the older adults who chose to participate shared a common interest and motivation, which might have introduced sampling bias because they may not be representative of the general older adult population. Third, there might have been confounding factors, such as a ceiling effect related to the paper-based tutorial and the fact that the majority of participants were highly educated and used the computer/Internet every day, many with more than 10 years of experience. Fourth, the validity and reliability of some of the outcome measures had not been established. In future research, it would be important to use a representative sample of older adults, including those with limited computer and Internet experience, and to evaluate older adults’ eHealth literacy skills with more sensitive measures that would minimize potential ceiling effects. Future studies could also consider shorter sessions (less than 2 hours) to reduce potential computer-related fatigue, and a psychometric evaluation of some of the outcome measures may be warranted to ensure their validity, reliability, and suitability for older adults.
Despite these limitations, the present study is the first to use a tutorial with overlay instructions on real, live websites and to show its effectiveness for improving eHealth literacy in older adults. Our OnTOP tutorial was efficacious in improving older adults’ eHealth literacy from pre- to posttest. The OnTOP tutorial was delivered on a conventional computer-and-keyboard system. Future designs could incorporate more user-friendly technology, such as touch screens instead of keyboards, and explore the transferability of OnTOP to iPad/tablet interfaces, as older adults increasingly use these devices to find health information to aid their health care decision-making. In addition, the multimedia tutorial used MedlinePlus.gov as the main source of health information in this study (to ensure the credibility and quality of the resources provided). Future research could test OnTOP on other websites related to the health of older adults. Future research would also benefit from adapting OnTOP for different older adult populations that might require culturally specific content and procedures, which we are currently developing (for example, for Native American elders).
What the paper adds:
Both multimedia and paper-based eHealth tutorials were effective in improving older adults’ eHealth literacy;
Older adults displayed more positive attitudes toward the multimedia eHealth tutorial than toward the paper-based eHealth tutorial.
Applications of study findings:
Older adults can learn to use new digital technology to access, assess, and use health information;
Policy and community resources are needed to design and implement effective interventions to promote digital inclusion and equity in a rapidly aging society.
Acknowledgements
Research reported in this publication was supported by the National Institute on Aging of the National Institutes of Health under Award Number R21AG052761 [Principal Investigator: Bo Xie]. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
Footnotes
Declaration of Conflicting Interests: The Authors declare that there is no conflict of interest.
Ethical Board: The Office of Research Support and Compliance (RSC) at The University of Texas at Austin
IRB protocol/human subjects approval number: 2017-09-0058
Date of Approval: 11/20/2018
References
- AmeriCorps. (2010). Helping older adults search for health information online: A toolkit for trainers. NIHSeniorHealth. https://www.nationalservice.gov/resources/performance-measurement/helping-older-adults-search-health-information-online-toolkit
- Andrus MR, & Roth MT (2002). Health literacy: A review. Pharmacotherapy, 22(3), 282–302. 10.1592/phco.22.5.282.33191
- Arcury TA, Sandberg JC, Melius KP, Quandt SA, Leng X, Latulipe C, Miller DP Jr., Smith A, & Bertoni AG (2020). Older adult internet use and eHealth literacy. Journal of Applied Gerontology, 39(2), 141–150. 10.1177/0733464818807468
- Ayres P, & Sweller J (2005). The split-attention principle in multimedia learning. In Mayer RE (Ed.), The Cambridge handbook of multimedia learning (pp. 135–146). Cambridge University Press.
- Bann CM, McCormack LA, Berkman ND, & Squiers LB (2012). The Health Literacy Skills Instrument: A 10-item short form. Journal of Health Communication, 17(Suppl. 3), 191–202. 10.1080/10810730.2012.718042
- Bergman L, Castelli V, Lau T, & Oblinger D (2005). DocWizards: A system for authoring follow-me documentation wizards. In Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology (pp. 191–200). ACM Press. 10.1145/1095034.1095067
- Berkman ND, Davis TC, & McCormack L (2010). Health literacy: What is it? Journal of Health Communication, 15(Suppl. 2), 9–19. 10.1080/10810730.2010.499985
- Berkman ND, Sheridan SL, Donahue KE, Halpern DJ, Viera A, Crotty K, Holland A, Brasure M, Lohr KN, Harden E, Tant E, Wallace I, & Viswanathan M (2011). Health literacy interventions and outcomes: An updated systematic review (AHRQ Report No. 11-E006). Agency for Healthcare Research and Quality. https://www.ncbi.nlm.nih.gov/books/NBK82434
- Birren J, & Warner SK (Eds.). (1990). Handbook of the psychology of aging (3rd ed.). Academic Press.
- Chan CV, & Kaufman DR (2011). A framework for characterizing eHealth literacy demands and barriers. Journal of Medical Internet Research, 13(4), Article e94. 10.2196/jmir.1750
- Chen K, & Chan AH (2014). Gerontechnology acceptance by elderly Hong Kong Chinese: A senior technology acceptance model (STAM). Ergonomics, 57, 635–652. 10.1080/00140139.2014.895855
- Choi N, & DiNitto D (2013). The digital divide among low-income homebound older adults: Internet use patterns, eHealth literacy, and attitudes toward computer/Internet use. Journal of Medical Internet Research, 15(5), Article e93. 10.2196/jmir.2645
- Cohen J (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Lawrence Erlbaum Associates. 10.4324/9780203771587
- Czaja SJ, Charness N, Fisk AD, Hertzog C, Nair SN, Rogers WA, & Sharit J (2006). Factors predicting the use of technology: Findings from the Center for Research and Education on Aging and Technology Enhancement (CREATE). Psychology and Aging, 21(2), 333–352. 10.1037/0882-7974.21.2.333
- Czaja SJ, Sharit J, Lee CC, Nair SN, Hernández MA, Arana N, & Fu SH (2013). Factors influencing use of an e-health website in a community sample of older adults. Journal of the American Medical Informatics Association, 20(2), 277–284. 10.1136/amiajnl-2012-000876
- Davis N, Shiroma K, Xie B, Yeh T, Han X, & De Main A (2021). Designing eHealth tutorials with and for older adults. Proceedings of the Association for Information Science and Technology, 58(1), 92–103. 10.1002/pra2.439
- Echt KV, Morrell RW, & Park DC (1998). Effects of age and training formats on basic computer skill acquisition in older adults. Educational Gerontology, 24(1), 3–25. 10.1080/0360127980240101
- Fox S, & Duggan M (2013, January 15). Health online 2013. Pew Research Center Internet & Technology. https://www.pewresearch.org/internet/2013/01/15/health-online-2013/
- Hawthorn D (2007). Interface design and engagement with older people. Behaviour & Information Technology, 26(4), 333–341. 10.1080/01449290601176930
- Healthy People 2010: Understanding and improving health. (2000). Department of Health and Human Services. https://eric.ed.gov/?id=ED443794
- Institute of Medicine. (2004). Health literacy: A prescription to end confusion. The National Academies Press. 10.17226/10883
- Institute of Medicine. (2009). Health literacy, eHealth, and communication: Putting the consumer first. Workshop summary. The National Academies Press. 10.17226/12474
- Jaeger PT, & Xie B (2009). Developing online community accessibility guidelines for persons with disabilities and older adults. Journal of Disability Policy Studies, 20(1), 55–63. 10.1177/1044207308325997
- Kang H, Plaisant C, & Shneiderman B (2003). New approaches to help users get started with visual interfaces: Multi-layered interfaces and integrated initial guidance. In Proceedings of the 2003 Annual National Conference on Digital Government Research. https://ils.unc.edu/govstat/papers/dg-kang-final.pdf
- Kelleher C, & Pausch R (2005). Stencils-based tutorials: Design and evaluation. In CHI ’05: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 541–550).
- Kutner M, Greenberg E, Jin Y, & Paulsen C (2006). The health literacy of America’s adults: Results from the 2003 National Assessment of Adult Literacy (NCES 2006–483). National Center for Education Statistics. http://nces.ed.gov/pubs2006/2006483.pdf
- Levine DM, Lipsitz SR, & Linder JA (2016). Trends in seniors’ use of digital health technology in the United States, 2011–2014. JAMA: The Journal of the American Medical Association, 316(5), 538–540. 10.1001/jama.2016.9124
- Mayer RE (Ed.). (2005). The Cambridge handbook of multimedia learning. Cambridge University Press.
- McCormack L, Bann C, Squiers L, Berkman ND, Squire C, Schillinger D, Ohene-Frempong J, & Hibbard J (2010). Measuring health literacy: A pilot study of a new skills-based instrument. Journal of Health Communication, 15(Suppl. 2), 51–71. 10.1080/10810730.2010.499987
- Mika VS, Kelly PJ, Price MA, Franquiz M, & Villarreal R (2005). The ABCs of health literacy. Family & Community Health, 28(4), 351–357. 10.1097/00003727-200510000-00007
- Millar RJ, Sahoo S, Yamashita T, & Cummins P (2020). Problem solving in technology-rich environments and self-rated health among adults in the U.S.: An analysis of the Program for the International Assessment of Adult Competencies. Journal of Applied Gerontology, 39(8), 889–897. 10.1177/0733464819829663
- Moussa M, Sherrod D, & Choi J (2013). An e-health intervention for increasing diabetes knowledge in African Americans. International Journal of Nursing Practice, 19(Suppl. 3), 36–43. 10.1111/ijn.12167
- Norman CD (2009). Skills essential for eHealth. In Hernandez L (Ed.), Health literacy, eHealth, and communication: Putting the consumer first. Workshop summary (pp. 10–15). The National Academies Press. http://www.nap.edu/catalog.php?record_id=12474
- Norman CD, & Skinner HA (2006a). eHEALS: The eHealth literacy scale. Journal of Medical Internet Research, 8(4), e27. 10.2196/jmir.8.4.e27
- Norman CD, & Skinner HA (2006b). eHealth literacy: Essential skills for consumer health in a networked world. Journal of Medical Internet Research, 8(2), e9. 10.2196/jmir.8.2.e9
- Oh H, Rizo C, Enkin M, & Jadad A (2005). What is eHealth?: A systematic review of published definitions. World Hospitals and Health Services, 41, 32–40.
- Paasche-Orlow MK, Wilson EAH, & McCormack L (2010). The evolving field of health literacy research. Journal of Health Communication, 15(Suppl. 2), 5–8. 10.1080/10810730.2010.499995
- Piper D, Palmer S, & Xie B (2009). Services to older adults: Preliminary findings from three Maryland public libraries. Journal of Education for Library and Information Science, 50(2), 107–118. https://www.jstor.org/stable/40732568
- Rockmann R, & Gewald H (2015). Elderly people in eHealth: Who are they? Procedia Computer Science, 63, 505–510. 10.1016/j.procs.2015.08.376
- Sakaguchi-Tang DK, Bosold AL, Choi YK, & Turner AM (2017). Patient portal use and experience among older adults: Systematic review. JMIR Medical Informatics, 5(4), e38. 10.2196/medinform.8092
- Schaefer CT (2008). Integrated review of health literacy interventions. Orthopaedic Nursing, 27(5), 302–317. 10.1097/01.NOR.0000337283.55670.75
- Squiers L, Peinado S, Berkman N, Boudewyns V, & McCormack L (2012). The Health Literacy Skills Framework. Journal of Health Communication, 17(Suppl. 3), 30–54. 10.1080/10810730.2012.713442
- Watkins I, & Xie B (2014). eHealth literacy interventions for older adults: A systematic review of the literature. Journal of Medical Internet Research, 16(11), e225. 10.2196/jmir.3318
- Xie B (2008). Older adults, health information, and the Internet. ACM Interactions, 15(4), 44–46. https://interactions.acm.org/archive/view/july-august-2008/lifelong-interactionsolder-adults-health-information-and-the-internet1
- Xie B (2011a). Effects of an eHealth literacy intervention for older adults. Journal of Medical Internet Research, 13(4), Article e90. 10.2196/jmir.1880
- Xie B (2011b). Experimenting on the impact of learning methods and information presentation channels on older adults’ e-health literacy. Journal of the American Society for Information Science and Technology, 62(9), 1797–1807. 10.1002/asi.21575
- Xie B (2011c). Older adults, e-health literacy, and collaborative learning: An experimental study. Journal of the American Society for Information Science and Technology, 62(5), 933–946. 10.1002/asi.21507
- Xie B (2012). Improving older adults’ e-health literacy through computer training using NIH online resources. Library & Information Science Research, 34(1), 63–71. 10.1016/j.lisr.2011.07.006
- Xie B, & Bugg JM (2009). Public library computer training for older adults to access high-quality Internet health information. Library & Information Science Research, 31(3), 155–162. 10.1016/j.lisr.2009.03.004
- Xie B, Watkins I, Golbeck J, & Huang M (2012). Understanding and changing older adults’ perceptions and learning of social media. Educational Gerontology, 38(4), 282–296. 10.1080/03601277.2010.544580
- Xie B, Yeh T, Walsh G, Watkins I, & Huang M (2012). Co-designing an e-health tutorial for older adults. In iConference ’12: Proceedings of the 2012 iConference (pp. 240–247). 10.1145/2132176.2132207
- Zickuhr K, & Smith A (2012, April 13). Digital differences. Pew Research Center Internet & Technology. https://www.pewresearch.org/internet/2012/04/13/digital-differences/