Innovation in Aging
. 2023 Mar 23;7(4):igad029. doi: 10.1093/geroni/igad029

Predicting Older Adults’ Continued Computer Use After Initial Adoption

Shenghao Zhang, Walter R Boot
Editor: Jennifer Tehan Stanley
PMCID: PMC10184684  PMID: 37197443

Abstract

Background and Objectives

Sustained computer and internet use has the potential to help older adults in various aspects of their lives, making the prediction of sustained use a critical goal. However, some factors related to adoption and use (e.g., computer attitudes) change over time and with experience. To understand these dynamics, the current study modeled changes in constructs related to computer use after initial computer adoption and examined whether these changes predict continued use.

Research Design and Methods

We used data from the computer arm (N = 150, MAge = 76.15) of a 12-month field trial examining the potential benefits of computer use in older adults. Individual differences identified in the technology acceptance literature (perceived usefulness, ease of use, computer interest, computer self-efficacy, computer anxiety, quality of life, social isolation, and social support) were measured before (baseline), during (Month 6), and after the intervention (post-test). Univariate and bivariate latent change score models examined changes in each predictor and their potential causal relationship with use.

Results

Results demonstrated large interindividual differences in the change patterns of individual difference factors examined. Changes in perceived usefulness, perceived ease of use, computer interest, computer self-efficacy, and computer anxiety were correlated with but not predictive of change in use.

Discussion and Implications

Our findings demonstrate the limitation of popular constructs in technology acceptance literature in predicting continued use and point out important gaps in knowledge to be targeted in future investigations.

Keywords: Adherence, Digital divide, Information system continuance, Unified theory of acceptance and use of technology (UTAUT)


Translational Significance: Technologies can help older adults in various aspects of their lives, but they need to keep using them to get the full benefits. Older adults’ usage of a technology is associated with various factors including usefulness, ease of use, and self-efficacy, but the relationships are correlational rather than causal. Stakeholders using technology acceptance models to guide practice need to be aware of the limitations of those models and understand that there might be other less understood contextual barriers lying between the decision to adopt technology and actual usage in daily life.

Sustained computer and internet use has the potential to help older adults in various aspects of their lives (Charness & Boot, 2022). However, acceptance and usage rates are still lower among older adults compared with younger people. Only around 75% of older adults use the internet and 61% own a smartphone, whereas usage and ownership of these technologies in the younger and middle-aged groups are nearly universal (Pew Research Center, 2021a, 2021b). Numerous factors, including users’ perceptions and attitudes, accessibility and affordability, and product design and support, have been identified as potential determinants of and barriers to usage in older adults (e.g., Charness & Boot, 2022; Francis et al., 2019; Lee & Coughlin, 2015). Given the complex patterns of factors related to technology use, it is not surprising that providing technology access and creating conditions for its initial usage alone might not lead to sustained usage (e.g., Bhattacherjee, 2001; Hsieh et al., 2008; Sharit et al., 2019). The goal of the current study is to understand, from a longitudinal perspective, the value of factors identified in the technology acceptance literature in predicting continued use for older adults with limited technology experience.

Technology Acceptance in Older Adults

Technology acceptance is often defined as the behavioral intention to use or adopt a technology. Several models have been developed to understand factors influencing technology acceptance in organizational and consumer contexts. One of the most referenced models in the literature is the Unified Theory of Acceptance and Use of Technology (UTAUT; Venkatesh et al., 2003, 2012). UTAUT integrated prominent models in technology acceptance literature and identified performance expectancy (also known as perceived usefulness), effort expectancy (also known as perceived ease of use), social influences, facilitating conditions, technology interest, user habit, and price value as determinants of technology acceptance. Although attitudes such as technology anxiety and technology self-efficacy were not theorized to be important in UTAUT, a recent meta-analysis of studies testing the model suggests that these attitudes are central to acceptance and partially mediate the effects of performance expectancy, effort expectancy, social influences, and facilitating conditions on acceptance (Dwivedi et al., 2019).

Additional factors were introduced in the aging and technology acceptance literature to account for older adults’ characteristics and needs when interacting with new technologies (e.g., Chen & Chan, 2014; Czaja et al., 2006). For instance, learning to use and using new technologies is cognitively challenging and could pose barriers for some older adults (e.g., Czaja et al., 2006; Zhang et al., 2017). Vision, hearing, and fine motor skill losses could influence the quality of interaction with the graphical-, sound-, and touch-based interfaces that are dominant in technological devices (e.g., Chen & Chan, 2014; Czaja et al., 2020). Psychosocial needs were also important in older adults’ acceptance of new technologies. Specific benefits on quality of life (Berkowsky et al., 2017), emotional or psychological aspects of life (e.g., potential to promote social connection and receive social support, Lee & Coughlin, 2015) were strong facilitators of acceptance, whereas devices or services that activate stereotypes of older adults being dependent, frail, or in need of special care could interfere with acceptance (Lee & Coughlin, 2015).

Unfortunately, behavioral intention does not always lead to actual use, let alone continued use. Behavioral intention, facilitating conditions, and habits explained around 50% of the variance in self-reported usage in the original studies validating UTAUT (Venkatesh et al., 2003, 2012), and a meta-analysis of 162 technology acceptance studies showed that behavioral intention and facilitating conditions together explained only 21% of the variance in usage behavior (Dwivedi et al., 2019). This intention–behavior gap is also prominent and well recognized in research on health behavior change (Sheeran, 2002). Meta-analytic results from intervention studies showed that a medium-to-large change in intention leads to only a small-to-medium change in behavior (Webb & Sheeran, 2006), and among those with positive activity intentions, 48% failed to enact those intentions (Rhodes & de Bruijn, 2013). Therefore, it is not surprising that predicting long-term use with individual difference factors identified in the technology acceptance literature has yielded poor results (e.g., Mitzner et al., 2019). Mitzner et al. (2019) examined predictors of mid-term (Weeks 21–23) and long-term (Weeks 41–43) computer use among older recent computer adopters in a year-long technology intervention study. They found that earlier use (Weeks 1–3), executive functioning, and computer self-efficacy were the best predictors of mid-term and long-term use, whereas effort expectancy was only predictive of mid-term use and performance expectancy was not predictive of use.

Longitudinal Perspective on Technology Acceptance

One potential explanation for the poor performance in predicting continued use is that some individual difference predictors proposed in previous literature, such as effort expectancy and technology attitudes, can change substantially as people interact with technology. Initial levels of those predictors might therefore no longer reflect current views, and thus not be predictive of continued use or use over the long run. The idea that predictors of technology acceptance can change with experience has been alluded to in previous literature (e.g., Bhattacherjee, 2001). For example, experience using a technology was proposed to moderate the relationship between several predictors in UTAUT and acceptance of that technology. Specifically, the influence of effort expectancy and social influence on acceptance diminishes, and the influence of habit and facilitating conditions increases, as experience grows (Venkatesh & Bala, 2008; Venkatesh et al., 2012). Similarly, the DeLone and McLean information systems success model proposed that performance expectancy after using a system directly and indirectly influences the intention of future use through user satisfaction (DeLone & McLean, 2003). Numerous other studies have also compared the associations between individual differences and technology acceptance or user experience (e.g., satisfaction) at first use and at a later point (e.g., Bhattacherjee & Premkumar, 2004; Cheng & Yuen, 2018; Hu et al., 2003; McLean, 2018), and results differ depending on study population and context. For instance, Hu et al. (2003) examined the acceptance of classroom technology among public school teachers before and after technology training. They found that effort expectancy and performance expectancy became increasingly important to acceptance after the training, while associations between other factors and acceptance decreased. This pattern was confirmed in a recent study on consumer technologies (a retailer’s mobile-commerce application; McLean, 2018), but was contradicted by a study on e-learning technology with middle school students, in which Cheng and Yuen (2018) found that the effects of performance expectancy on intention to continue using the system and on satisfaction decreased significantly over time.

There is likely a dynamic relationship between attitudes and usage over time and with the development of experience. Although meta-analytic results suggest that baseline performance expectancy, effort expectancy, and attitudes are predictive of subsequent self-reported use through concurrent technology acceptance (e.g., Dwivedi et al., 2019), very few studies have examined whether subsequent performance expectancy, effort expectancy, and attitudes help explain changes in use (Cheng & Yuen, 2018; Venkatesh & Bala, 2008). Both studies measured performance expectancy, effort expectancy, technology acceptance, and use in multiple waves. Both found that initial performance expectancy and effort expectancy predicted subsequent use at Time 2 through initial technology acceptance, and that performance expectancy and effort expectancy at Time 2 predicted subsequent use at Time 3 through technology acceptance at Time 2. These findings provide some support for a dynamic relationship between individual difference factors identified in the technology acceptance literature and use. However, these results need to be interpreted with caution in that the stability of constructs was not controlled (e.g., Time 2 use was not controlled when predicting Time 3 use), and different degrees of stability in constructs can lead to spurious causal conclusions (see Rogosa, 1980, for a detailed discussion).

The dynamic nature of attitudinal predictors is highlighted by some technology acceptance models for older adults. One notable example is the Senior Technology Exploration, Learning, and Acceptance (STELA; Tsai et al., 2019) model. By conceptualizing technology acceptance as a multistep process unfolding over time, the model emphasizes the importance of initial explorations of new technologies in fostering positive attitudes that promote subsequent use. However, very few empirical studies with older adults have investigated change in those predictors. Sharit et al. (2019) modeled linear change in computer proficiency and in computer interest, anxiety, and self-efficacy, and found significant associations between changes in proficiency and changes in computer interest, anxiety, and self-efficacy. These results further underscore the merits of examining changes in individual difference predictors.

The Current Study

The current study focused on predictors within the technology acceptance literature that are sensitive to change and explored their value in predicting continued use shortly after adoption in older adults with limited technology experience. We used data from the Personal Reminder Information and Social Management trial (PRISM; Czaja et al., 2015). PRISM is a computer system designed to be useful and easy to use for older adults, supporting social connectivity, prospective memory, and knowledge about topics and community resources. The trial provided the PRISM system with internet access to older adults and followed their usage for 12 months after computer training. This setup offers an ideal testbed for examining how attitudes and other individual difference factors influence adoption and continued use when sufficient technology support is available and access and price barriers are absent. Previous research showed that the PRISM system was effective in decreasing loneliness and had a positive influence on computer proficiency (Czaja et al., 2018).

The current study has two specific aims. The first aim is to describe the intraindividual changes in performance expectancy, effort expectancy, computer anxiety, computer self-efficacy, computer interest, quality of life, social support, and social isolation over 12 months. To our knowledge, this is the first study to describe the change in individual difference factors related to technology acceptance in older adults. Given that (a) previous research suggests that interindividual differences and changes in some attitudes and beliefs about technologies are associated with interindividual differences and changes in computer proficiency (Sharit et al., 2019; Zhang et al., 2017), and (b) computer proficiency reflects the various skills needed to successfully operate a computer (Boot et al., 2015), we expect change trajectories of these variables to roughly obey the power law of skill acquisition (Newell & Rosenbloom, 1981). Specifically, we hypothesize that effort expectancy, computer anxiety, and social isolation would decrease with more experience interacting with the system and reach a plateau (i.e., negatively decelerating change over time). Similarly, performance expectancy, computer self-efficacy, computer interest, perceived quality of life, and perceived social support would increase with system use and reach a plateau (i.e., positively decelerating change over time). The second aim is to examine the value of those predictors in predicting continued use of the system in 12 months. We expect the interindividual difference relationships between predictors and acceptance in previous literature will translate into intraindividual dynamic relationships between predictors and continued use. In other words, predictors from the technology acceptance literature will be predictive of a subsequent change in use.
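The hypothesized decelerating trajectories follow the general shape of the power law of skill acquisition; as a sketch (notation ours, not the authors’, with N the amount of practice and a, b, c > 0):

```latex
% Decreasing constructs (e.g., effort expectancy, computer anxiety):
% high at first, dropping quickly, then leveling off at asymptote a.
T(N) = a + b\,N^{-c}

% Increasing constructs (e.g., self-efficacy, performance expectancy):
% rising quickly at first, then plateauing at asymptote a'.
A(N) = a' - b'\,N^{-c'}
```

In both forms, the rate of change is steep early in practice and shrinks toward zero as experience accumulates, which is the negatively or positively decelerating pattern hypothesized above.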

Facilitating conditions and price value were not examined given that equipment, technology training, and technology support were provided through the study without cost to participants. Changes in health and cognition were not included because they are conceptualized to be relatively stable for normally aging older adults over a year and not significantly influenced by technology use according to previous studies (e.g., Czaja et al., 2018; Zhang et al., 2022). We acknowledge that declines in health and cognition could potentially have large influences on the use and the decision to disengage with technology over longer periods.

To summarize, the current study will use latent change score models to describe how constructs related to computer use change over a period of 12 months (Aim 1), and we will use bivariate latent change score models to examine how constructs related to computer use influence continued use (Aim 2).

Method

Design, Participants, and Procedures

The PRISM system trial was a multisite randomized controlled trial conducted in three diverse locations: Atlanta, GA; Miami, FL; and Tallahassee, FL. The trial was 12 months in duration and collected measures at baseline, 6 months, and 12 months. For full trial details, see Czaja et al. (2015, 2018).

Three hundred community-dwelling older adults were randomized into either the intervention condition, in which they were provided with computer training and the PRISM system (N = 150, MAge = 76.97, standard deviation [SD] = 7.3), or the control condition, in which they interacted with parallel, noncomputer-based content (N = 150, MAge = 75.34, SD = 7.4). Participants were cognitively healthy, had little computer experience, and were at risk for social isolation (lived alone, worked or volunteered minimally, and made minimal use of senior centers or formal organizations). The current study used data from the intervention arm of the trial only. The sample from the intervention arm was 79.3% female, diverse (46.7% non-White), and many were of low socioeconomic status (43.3% had attained a high school diploma or less, and 84.7% had an annual household income of <$30,000). All participants were compensated $25 per assessment (baseline, 6 months, and 12 months), and participants were allowed to keep the computer after the trial.

Measures

A battery of assessments was administered at baseline, 6 months, and 12 months. Effort expectancy and performance expectancy were measured by the Technology Acceptance Scale. Computer self-efficacy, computer anxiety, and computer interest were measured by the Computer Attitudes Scale. Perceived quality of life was measured by the Quality of Life Scale. Social isolation was measured by the Friendship Scale, and perceived social support was measured by the Interpersonal Support Evaluation List. The full battery is reported elsewhere (Czaja et al., 2015), and details about psychometric properties are shown in Supplementary Table 1.

Continued use was operationalized as early-term, mid-term, and long-term use, following a previous study using the same dataset (Mitzner et al., 2019). Use of the system on each day was recorded as a binary variable. Early-term use was defined as the average number of days that any feature of PRISM was used during Weeks 1–3. Long-term use was defined over the period toward the latter end of the trial (Weeks 41–43); the final 5 weeks were excluded because of concerns about end-of-study effects. Mid-term use was defined as use during Weeks 21–23, a period that sits approximately midway between the early-term (Weeks 1–3) and long-term (Weeks 41–43) windows.
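This operationalization can be sketched in code. The sketch assumes “average number of days” means the average count of use-days per week within each 3-week window (our reading, not stated explicitly in the text), and the daily record is made up for illustration:

```python
# Hypothetical daily-use log: 1 = participant used PRISM that day, 0 = not.
# 364 days ~ 52 weeks; this particular record is illustrative only.
daily_use = [1, 0, 1, 1, 0, 0, 1] * 52

def mean_days_per_week(record, start_week, end_week):
    """Average number of use-days per week over Weeks start_week..end_week
    (1-indexed, inclusive), computed from a binary daily record."""
    days = record[(start_week - 1) * 7 : end_week * 7]
    weeks = [days[i:i + 7] for i in range(0, len(days), 7)]
    return sum(sum(w) for w in weeks) / len(weeks)

early_use = mean_days_per_week(daily_use, 1, 3)    # Weeks 1-3
mid_use   = mean_days_per_week(daily_use, 21, 23)  # Weeks 21-23
long_use  = mean_days_per_week(daily_use, 41, 43)  # Weeks 41-43
```

Because the illustrative record repeats the same 4-use-day week, all three windows come out to 4.0 days per week here; real records would differ across windows.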

Demographic information was gathered at baseline and controlled as time-invariant covariates in all models. Descriptive statistics of all the measures involved are shown in Supplementary Table 2.

Statistical Analysis

Latent change score models were used to model the longitudinal data. This approach combines the strength of cross-lagged regression models in supporting causal inference with the strength of latent growth curve models in explicitly modeling the means and variances of change trajectories. These characteristics make latent change score modeling an ideal approach for modeling dynamic relations between constructs as they change over time (for a review, see McArdle, 2009).

Separate univariate latent change score models were fit to each predictor variable and to the continued use variables to describe change patterns over 12 months (Supplementary Figure 1A). The latent factor at each occasion (x[t]) was perfectly regressed on the previous-occasion latent factor of the same construct (x[t−1]). Change between the previous and current time points was modeled as a higher-order latent change score (Δx[t]). Latent change scores of the same construct then served as indicators for the latent slope factor (sx). Loadings on the latent slope factor were fixed to one to model a linear constant change trajectory. A prediction of the latent change score by the latent factor at the previous time point (βx) was included to represent proportional change, which was fixed to be invariant over time.
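The univariate specification can be summarized in equations (a sketch using the text’s symbols; X[t] and u[t] are our notation for the observed score and its residual):

```latex
% Univariate dual change score model, per construct:
x[t]        = x[t-1] + \Delta x[t]        % current level = prior level + change
\Delta x[t] = s_x + \beta_x\,x[t-1]       % constant slope + proportional change
X[t]        = x[t] + u[t]                 % measurement equation with residual
```

The slope term s_x drives constant change each interval, while β_x scales change by the previous level, which is what lets the trajectory decelerate as the level approaches a plateau.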

Bivariate latent change score models were fit to model each predictor’s contribution to change in use. This was achieved by estimating the latent change score model of a predictor variable and the latent change score model of the continued use variable simultaneously (Supplementary Figure 1B). Two nested multivariate latent change score models (a no-coupling model and a univariate coupling model) were estimated for bivariate relationships between each predictor and use. In the no-coupling model, the intercept and slope of the predictor and the intercept and slope of use were correlated, but there was no relationship between the predictor and change in use (γyx = 0) or use and change in the predictor variable. In the univariate coupling model, the intercept and slope of the predictor and the intercept and slope of use were correlated, and the latent change score for use was regressed on the previous occasion latent factor of the predictor variable to specify the dynamic coupling relationship. Nested model comparisons were used to test whether previous occasion predictors predict change in use. Model fits of the no-coupling model and the univariate coupling model were compared with Chi-square difference testing. Evidence supporting a predictor contributing to use would be indicated by a significantly worse fit for the no-coupling model compared with the univariate coupling model.
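In the univariate coupling model, the change equation for use gains one coupling term (a sketch; following the text’s notation, y denotes use and x the predictor):

```latex
% Coupled change equation for use:
\Delta y[t] = s_y + \beta_y\,y[t-1] + \gamma_{yx}\,x[t-1]
% No-coupling model: the same equation with \gamma_{yx} fixed to 0.
```

The nested comparison therefore tests a single parameter, γ_yx, so the chi-square difference test has 1 degree of freedom.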

Latent change scores are not interpretable without meaningful scaling of observed scores over time. Observed scores were converted to z scores using the means and SDs from the first time point. After this scaling, units in the latent change score models can be interpreted as standardized change relative to the variability observed at the first time point. Age, gender, education, race, and income were controlled in all models. These covariates were centered or recoded to increase the interpretability of the models. Age was treated continuously and centered on the sample mean. Education was recoded into “high school and below” and “above high school,” with “high school and below” as the default group. Gender was coded with female as the default group, given that a majority of the sample (79.3%) was female. Race was recoded into “White” and “non-White,” with “White” as the default group; non-White participants were not further differentiated given the small number in each category. Income was recoded into “below $15,000” and “above $15,000,” with “below $15,000” as the default group; $15,000 was chosen in accordance with the age- and family-size-adjusted poverty line (US Census Bureau).
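The baseline-anchored scaling can be sketched as follows; the scores are made up for illustration:

```python
from statistics import mean, stdev

# Hypothetical scores for one construct at three waves per participant
# (columns: baseline, Month 6, post-test); values are illustrative only.
scores = [
    [3.0, 3.5, 4.0],
    [2.0, 2.5, 2.0],
    [4.0, 3.0, 3.5],
]

# Scale EVERY wave by the baseline mean and SD, so latent change scores are
# in standardized units relative to variability at the first time point.
baseline = [row[0] for row in scores]
m0, sd0 = mean(baseline), stdev(baseline)
z = [[(v - m0) / sd0 for v in row] for row in scores]

# By construction, the baseline column of z has mean 0 and SD 1; later waves
# keep their raw-score relationship to baseline rather than being re-centered.
```

Note that standardizing each wave by its own mean and SD would instead destroy the change information, which is why only baseline moments are used.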

Model fit was assessed with the Tucker–Lewis Index (TLI), Comparative Fit Index (CFI), and Root-Mean-Squared Error of Approximation (RMSEA). Model fit was considered adequate when TLI and CFI were above 0.95 and RMSEA was below 0.06 (Hu & Bentler, 1999).
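The nested-model comparison described in the preceding section (no-coupling vs. univariate coupling, differing by one parameter) reduces to a 1-df chi-square difference test, which can be sketched as follows; the fit statistics below are hypothetical, not values from the paper:

```python
import math

def chisq1_sf(x):
    """Survival function of the chi-square distribution with 1 df:
    P(X > x) = erfc(sqrt(x / 2))."""
    return math.erfc(math.sqrt(x / 2.0))

def chisq_diff_test(chisq_restricted, chisq_full):
    """Chi-square difference (likelihood-ratio) test for two nested models
    differing by 1 df; the restricted model fixes the coupling path to 0."""
    delta = chisq_restricted - chisq_full
    return delta, chisq1_sf(delta)

# Hypothetical model chi-squares for a no-coupling vs. coupling pair:
delta, p = chisq_diff_test(chisq_restricted=52.1, chisq_full=49.0)
# delta = 3.1 < 3.84 (the .05 cutoff for 1 df), so p > .05 and the simpler
# no-coupling model would be retained
```

A significant difference (delta above 3.84) would instead indicate that dropping the coupling path worsens fit, i.e., evidence that the predictor contributes to change in use.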

Power Analysis

A priori power analyses were conducted for bivariate latent change score models with positively decelerating change for one variable (the predictor of interest) and negatively decelerating change for the other (the use variable). Fixed effects (e.g., means) and random effects (e.g., variances) were represented by effect sizes from the difference family (Cohen’s d), and covariances were represented by effect sizes from the correlation family (i.e., correlation coefficients r). Given the lack of similar previous research from which to draw effect sizes, we used values corresponding to small and medium effects as suggested by Cohen (1988). Fixed (sx, sy, βx, βy) and random (σ2sx, σ2sy) effects for change parameters were set as small (d = 0.2) to reflect relatively conservative estimates of change. Random effects were specified at different magnitudes to reflect multiple possible scenarios. Results showed that a sample size of 150 has enough power (>0.80) to detect a medium effect of the predictor variable on change in use (γyx, d = 0.5) in all circumstances, and enough power (>0.80) to detect a small effect (γyx, d = 0.2) when slope variances are smaller than intercept variances (σ2sx < σ2ix and σ2sy < σ2iy). Detailed results and model specifications for the power analysis are shown in Supplementary Table 3.
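The study’s power analysis simulated full latent change score models; as an illustration of the Monte Carlo logic only, here is a deliberately simplified two-wave stand-in (no latent variables, no covariates; the function name and all settings are ours):

```python
import random
import math

def power_coupling(n=150, gamma=0.5, n_sims=1000, crit_t=1.98, seed=1):
    """Monte Carlo power to detect a coupling path: regress simulated change
    in use on the prior level of a predictor, count significant slopes.
    A simplified stand-in for the bivariate latent change score simulation."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        x1 = [rng.gauss(0, 1) for _ in range(n)]        # baseline predictor (z units)
        dy = [gamma * x + rng.gauss(0, 1) for x in x1]  # simulated change in use
        mx, my = sum(x1) / n, sum(dy) / n
        sxx = sum((x - mx) ** 2 for x in x1)
        sxy = sum((x - mx) * (y - my) for x, y in zip(x1, dy))
        b = sxy / sxx                                    # OLS slope
        sse = sum((y - my - b * (x - mx)) ** 2 for x, y in zip(x1, dy))
        se = math.sqrt(sse / (n - 2) / sxx)              # standard error of slope
        if abs(b / se) > crit_t:                         # approx .05 cutoff, df = 148
            hits += 1
    return hits / n_sims
```

Under these assumed settings, a medium coupling effect is detected essentially every time, while setting gamma to 0 recovers a rejection rate near the nominal .05 Type I level.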

Results

Univariate Latent Change Score Models

All models for predictor variables demonstrated adequate fit, whereas model fit for the use variable was slightly worse (Supplementary Table 4). Parameter estimates are shown in Table 1.

Table 1.

Univariate Latent Change Score Model Unstandardized Coefficients (and Standard Errors) for Predictor Variables and Use Variable

Estimate | PE | EE | CA | CSE | CI | QoL | SS | SI | Use
Fixed effects
Initial level mean, μix | –0.048 (0.186) | –0.003 (0.193) | 0.008 (0.186) | –0.017 (0.186) | –0.074 (0.188) | –0.330 (0.189) | –0.104 (0.188) | –0.303 (0.186) | 0.002 (0.193)
Slope mean, μsx | –0.333 (0.167)* | –0.371 (0.247) | –0.429 (0.159)** | 0.174 (0.159) | 0.136 (0.118) | –0.394 (0.222) | –0.185 (0.157) | –0.036 (0.188) | –0.171 (0.142)
Proportional change, βx | –0.756 (0.122)*** | –0.975 (0.072)*** | –0.771 (0.137)*** | –0.894 (0.137)*** | –0.630 (0.286)* | –1.214 (0.141)*** | –0.751 (0.255)** | –1.108 (0.177)*** | –0.524 (0.198)**
Random effects
Initial level variance, σ2ix | 0.573 (0.122)*** | 0.573 (0.133)*** | 0.629 (0.119)*** | 0.617 (0.121)*** | 0.643 (0.123)*** | 0.694 (0.122)*** | 0.690 (0.120)*** | 0.649 (0.119)*** | 0.693 (0.127)***
Slope variance, σ2sx | 0.571 (0.147)*** | 1.399 (0.239)*** | 0.521 (0.159)** | 0.520 (0.149)*** | 0.247 (0.178) | 1.084 (0.286)*** | 0.508 (0.298) | 0.758 (0.257)** | 0.406 (0.194)*
Covariance, ρix,sx | 0.302 (0.095)** | 0.353 (0.120)** | 0.386 (0.107)*** | 0.341 (0.101)*** | 0.256 (0.169) | 0.781 (0.153)*** | 0.499 (0.187)** | 0.605 (0.144)*** | 0.349 (0.146)*
Residual variance, σ2ux | 0.355 (0.045)*** | 0.414 (0.052)*** | 0.292 (0.037)*** | 0.306 (0.039)*** | 0.309 (0.039)*** | 0.260 (0.032)*** | 0.247 (0.031)*** | 0.270 (0.034)*** | 0.312 (0.039)***

Notes: PE = performance expectancy; EE = effort expectancy; CA = computer anxiety; CSE = computer self-efficacy; CI = computer interest; QoL = quality of life; SS = social support; SI = social isolation. Age, education, gender, race, and income were controlled in all models.

*p < .05. **p < .01. ***p < .001.

The average initial levels of all predictors were not significantly different from zero, as expected given the conversion from raw scores to z scores. There were significant variances in the initial levels of all predictors (Table 1, σ2ix), indicating substantial interindividual differences in performance expectancy, effort expectancy, computer anxiety, computer self-efficacy, computer interest, quality of life, social support, and social isolation among older nonusers before they interacted with the computer system. On average, computer anxiety decreased by 0.429 SD and performance expectancy by 0.333 SD every 6 months, whereas effort expectancy, computer self-efficacy, computer interest, perceived quality of life, social support, and social isolation did not change significantly (Table 1, μsx). There were also significant variances around the mean change trajectories of all predictors except computer interest and social support (Table 1, σ2sx), suggesting interindividual differences in change over the 12-month period of interaction with the computer system. The proportional change parameter was significant in all models (Table 1, βx), suggesting an overall slowing of change over time proportionate to the previous level for all constructs.

Within the model, usage on average did not show significant change over time. Although this may seem surprising, individual participants demonstrated a great deal of variability in their patterns of change over time (Table 1, σ2sx). Some participants maintained high usage over the study period and some participants maintained low usage. Some participants increased their usage over time, and others decreased their usage over the course of the 12-month intervention.

Bivariate Latent Change Score Models

All models demonstrated adequate fit. Model comparison results showed that constraining the relationship between the predictor and change in use (γyx = 0) did not lead to a significantly worse-fitting model in any case (Δχ2(1) ranged from less than 0.001 to 3.260, Supplementary Table 5). Covariances between the slopes of performance expectancy, effort expectancy, computer anxiety, computer self-efficacy, and computer interest and the slope of use were significant (ρsx,sy ranged from −0.157 to 0.305, Table 2). Specifically, older adults who experienced larger decreases in effort expectancy and performance expectancy, smaller decreases in computer anxiety, and smaller increases in computer self-efficacy and computer interest also decreased more in use. Taken together, these results suggest that performance expectancy, effort expectancy, computer anxiety, computer self-efficacy, and computer interest change with usage, but critically, none of the predictors identified in the previous technology adoption literature was predictive of continued use.

Table 2.

Bivariate Latent Change Score Model Unstandardized Coefficient (and Standard Errors) for Predictor Variables and Use Variable

Estimates Performance expectancy Effort expectancy Computer anxiety Computer self-efficacy Computer interest Quality of life Social support Social isolation
Fixed effects
x
Initial level mean, μix –0.053
(–0.186)
–0.001
(–0.193)
0.009
(–0.186)
–0.013
| Parameter | M1 | M2 | M3 | M4 | M5 | M6 | M7 | M8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| **Fixed effects: x** |  |  |  |  |  |  |  |  |
| Initial level mean, μix | – | – | – | – (0.186) | –0.075 (0.188) | –0.330 (0.189) | –0.106 (0.188) | –0.299 (0.187) |
| Slope mean, μsx | –0.340 (0.169)* | –0.370 (0.247) | –0.435 (0.163)** | 0.172 (0.157) | 0.133 (0.116) | –0.398 (0.223) | –0.188 (0.159) | –0.033 (0.183) |
| Proportional change, βx | –0.778 (0.116)*** | –0.973 (0.072)*** | –0.804 (0.130)*** | –0.873 (0.137)*** | –0.608 (0.275)* | –1.222 (0.135)*** | –0.769 (0.245)** | –1.078 (0.177)*** |
| **Fixed effects: y** |  |  |  |  |  |  |  |  |
| Initial level mean, μiy | 0.016 (0.193) | 0.010 (0.193) | 0.000 (0.193) | –0.001 (0.193) | 0.004 (0.193) | 0.006 (0.193) | 0.002 (0.193) | 0.005 (0.193) |
| Slope mean, μsy | –0.158 (0.142) | –0.162 (0.141) | –0.161 (0.140) | –0.168 (0.143) | –0.166 (0.144) | –0.176 (0.140) | –0.172 (0.141) | –0.171 (0.144) |
| Proportional change, βy | –0.534 (0.194)** | –0.523 (0.198)** | –0.515 (0.200)* | –0.535 (0.190)** | –0.549 (0.192)** | –0.511 (0.199)* | –0.517 (0.199)** | –0.545 (0.191)** |
| **Random effects: x** |  |  |  |  |  |  |  |  |
| Initial level variance, σ²ix | 0.575 (0.123)*** | 0.575 (0.133)*** | 0.632 (0.120)*** | 0.614 (0.120)*** | 0.641 (0.122)*** | 0.697 (0.122)*** | 0.690 (0.120)*** | 0.655 (0.120)*** |
| Slope variance, σ²sx | 0.592 (0.147)*** | 1.400 (0.238)*** | 0.560 (0.161)*** | 0.505 (0.146)*** | 0.236 (0.166) | 1.096 (0.279)*** | 0.527 (0.294) | 0.721 (0.249)** |
| Covariance, ρix,sx | 0.312 (0.095)*** | 0.356 (0.120)** | 0.408 (0.107)*** | 0.335 (0.101)*** | 0.246 (0.163) | 0.787 (0.151)*** | 0.510 (0.182)** | 0.589 (0.142)*** |
| Residual variance, σ²ux | 0.355 (0.045)*** | 0.413 (0.052)*** | 0.291 (0.037)*** | 0.307 (0.039)*** | 0.309 (0.039)*** | 0.260 (0.032)*** | 0.248 (0.031)*** | 0.268 (0.034)*** |
| **Random effects: y** |  |  |  |  |  |  |  |  |
| Initial level variance, σ²iy | 0.693 (0.128)*** | 0.694 (0.127)*** | 0.693 (0.127)*** | 0.692 (0.127)*** | 0.691 (0.128)*** | 0.696 (0.127)*** | 0.693 (0.127)*** | 0.694 (0.128)*** |
| Slope variance, σ²sy | 0.416 (0.195)* | 0.405 (0.194)* | 0.398 (0.193)* | 0.417 (0.190)* | 0.430 (0.197)* | 0.394 (0.191)* | 0.399 (0.192)* | 0.427 (0.195)* |
| Covariance, ρiy,sy | 0.357 (0.145)* | 0.349 (0.146)* | 0.343 (0.147)* | 0.356 (0.142)* | 0.366 (0.143)* | 0.339 (0.147)* | 0.344 (0.147)* | 0.362 (0.143)* |
| Residual variance, σ²uy | 0.313 (0.039)*** | 0.312 (0.039)*** | 0.312 (0.039)*** | 0.312 (0.039)*** | 0.313 (0.039)*** | 0.311 (0.039)*** | 0.312 (0.039)*** | 0.312 (0.039)*** |
| **Random effects: x and y** |  |  |  |  |  |  |  |  |
| Covariance, ρix,iy | –0.048 (0.088) | 0.040 (0.092) | –0.069 (0.087) | –0.048 (0.088) | 0.036 (0.088) | 0.063 (0.088) | –0.082 (0.088) | –0.014 (0.087) |
| Covariance, ρsx,sy | 0.210 (0.073)** | 0.305 (0.124)* | –0.157 (0.077)* | 0.156 (0.070)* | 0.148 (0.064)* | –0.056 (0.073) | 0.011 (0.051) | 0.021 (0.062) |
| Covariance, ρix,sy | 0.017 (0.062) | 0.028 (0.064) | –0.017 (0.062) | 0.155 (0.064)* | 0.079 (0.065) | 0.031 (0.063) | 0.003 (0.062) | 0.067 (0.063) |
| Covariance, ρiy,sx | 0.143 (0.077) | 0.398 (0.117)*** | –0.251 (0.079)** | 0.173 (0.072)* | 0.135 (0.060)* | –0.057 (0.098) | 0.017 (0.070) | –0.051 (0.082) |

Notes: Age, education, gender, race, and income were controlled in all models. Standard errors are in parentheses. Each column (M1–M8) reports one of the eight bivariate latent change score models, one per individual difference predictor. Dashes (–) indicate values not available.

*p < .05. **p < .01. ***p <. 001.

Discussion

We considered changes in predictors identified in the technology adoption literature and examined their value in predicting continued use of a computer system by older nonusers. Data from the intervention arm of the PRISM trial were analyzed with univariate latent change score models to examine intraindividual change in individual difference factors that previous literature has found predictive of technology adoption. These data were also analyzed with bivariate latent change score models to explore the value of those factors in predicting continued use. Results showed that performance expectancy, effort expectancy, computer anxiety, computer self-efficacy, computer interest, quality of life, social support, and social isolation all changed over the study period, to different extents for different individuals. Although performance expectancy, effort expectancy, computer anxiety, computer self-efficacy, and computer interest changed alongside use, none of these factors was predictive of change in use. Power analysis suggested that we had enough power to detect a medium effect for such relationships. Taken together, these results suggest a correlated but noncausal relationship between those predictors and use.
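
For readers unfamiliar with the model family, a univariate latent change score model of the kind summarized above is commonly written as follows (after McArdle, 2009). This is the standard textbook formulation, not a reproduction of the authors' exact specification:

```latex
% Standard univariate latent change score model (dual change form)
x[t] = x[t-1] + \Delta x[t], \qquad
\Delta x[t] = s_x + \beta_x \, x[t-1], \qquad
s_x \sim \mathcal{N}(\mu_{sx},\, \sigma^2_{sx})
```

Here s_x is the latent constant-change slope (the slope mean and variance reported in the table) and β_x is the proportional change parameter; a negative β_x, as estimated in every model above, means higher prior levels are followed by larger subsequent declines, yielding decelerating trajectories.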

Changes in Proposed Predictors of Continued Use

Several previous studies have considered the potential of intraindividual changes in individual difference factors that predict technology adoption in various populations and contexts (e.g., Hu et al., 2003; McLean, 2018; Venkatesh & Bala, 2008; Venkatesh et al., 2012), but none described those changes in detail. Drawing on the power law of practice in the skill acquisition literature, we expected that performance expectancy, computer self-efficacy, computer interest, and perceived everyday-life benefits of computer use (perceived quality of life and perceived social support) would show positive decelerating change as computer proficiency grew with experience with the system. Conversely, effort expectancy and computer anxiety would show negative decelerating change. Our findings are partially in line with those expectations. Specifically, the current findings showed a decelerating trend for changes in all of these individual difference factors. These results suggest that initial interactions with a new technology shortly after adoption play a more important role in fostering attitudes and perceptions of that technology than later interactions. This is consistent with propositions in theoretical models of technology acceptance and training, such as the STELA model (Tsai et al., 2019) reviewed in earlier sections. Contrary to our expectation, average performance expectancy declined over time. This construct assesses whether participants believe use of the system helps them accomplish various tasks.
Note that compared with a typical computer system, PRISM was restricted in the features it provided, and it is possible that, over time, participants wished the system had more features to assist them with other activities, consistent with some qualitative data collected at the completion of the trial (e.g., the desire for videoconferencing features).
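
The expected change shapes above can be made concrete. The following sketch (ours, not the authors' model; parameter values are illustrative) generates positive and negative decelerating trajectories using a power-law-of-practice form, T(n) = a·n^(−b) (Newell & Rosenbloom, 1981):

```python
# Illustrative sketch: decelerating change following a power-law form.
# The construct names and parameter values are hypothetical examples.
def power_law_trajectory(start, asymptote, rate, occasions):
    """Value at each occasion approaches `asymptote` with decelerating steps."""
    return [asymptote + (start - asymptote) * (n + 1) ** (-rate)
            for n in range(occasions)]

# Positive decelerating change (e.g., computer self-efficacy rising with use)
efficacy = power_law_trajectory(start=2.0, asymptote=4.0, rate=0.8, occasions=12)
# Negative decelerating change (e.g., computer anxiety falling with use)
anxiety = power_law_trajectory(start=4.0, asymptote=1.5, rate=0.8, occasions=12)

# Early steps are larger than later ones: change decelerates over occasions.
early_gain = efficacy[1] - efficacy[0]
late_gain = efficacy[11] - efficacy[10]
```

The key property for the argument in the text is that most of the change happens shortly after adoption, so early interactions dominate attitude formation.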

Our findings also suggested that change patterns can differ substantially across individuals. In other words, although we expected performance expectancy, computer self-efficacy, computer interest, perceived quality of life, and perceived social support to increase with computer use, and effort expectancy, computer anxiety, and perceived social isolation to decrease with computer use, for all users, we only observed those patterns in some users. It is unclear what contributes to the large interindividual differences in intraindividual change in these factors. Given that attitudes and perceptions of a technology are, conceptually, formed in part through interacting with that technology, the large interindividual differences in change could reflect interindividual differences in user experiences and satisfaction with the computer system. Future studies could explore the antecedents of those attitudes to better understand our findings.

Predicting Continued Use

Results from bivariate latent change score models suggested that changes in performance expectancy, effort expectancy, computer anxiety, computer self-efficacy, and computer interest correlated with changes in use. These findings are consistent with the significant correlations between performance expectancy, effort expectancy, attitudes, and technology acceptance found in previous studies across various populations and contexts (see Dwivedi et al., 2019, for a review). They are also consistent with studies showing coupling relationships between performance expectancy, effort expectancy, and technology acceptance over different phases of use (e.g., Bhattacherjee & Premkumar, 2004; Hu et al., 2003; McLean, 2018).

Of more interest to the current study is the predictive value of these individual differences for subsequent use. Bivariate latent change score models showed no evidence that the examined individual difference predictors were predictive of subsequent use. This is partially consistent with the findings of a cross-sectional survey on gerontechnology use in older adults (Chen & Chan, 2014), in which effort expectancy and performance expectancy were not predictive of concurrent self-reported use of gerontechnology. Our study extended those findings by incorporating an objective measure of use and a longitudinal design that supports stronger causal inferences.

The lack of causal findings is inconsistent with previous work suggesting that performance expectancy and effort expectancy predict subsequent self-reported use through concurrent technology acceptance (e.g., Dwivedi et al., 2019). The null finding is especially noteworthy given that all the examined individual difference predictors were well grounded in technology acceptance theories and demonstrated strong effects in previous literature. The differences in findings might be due to differences in study population and design. First, the focus and assumptions behind the current analysis differ from those of previous studies. The current study focused on the continued use of older users who had already accepted the computer system. Therefore, all users were assumed to be using the system to different extents after receiving it (μix, σ2ix; Supplementary Figure 1B), and individual difference factors were used to predict changes in actual use from previous to subsequent occasions (Δx[t]; Supplementary Figure 1B). In contrast, the previous technology acceptance literature generally focuses on the intention to use a new technology after being introduced to it, and sometimes on subsequent use after that. In those studies, all participants were assumed to start from not using the technology (μix and σ2ix both fixed to 0), so individual difference factors were used to predict a change from not using the technology to using it at a certain frequency, through intention to use.
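
The logic of the causal test described here can be sketched as follows. In this toy simulation (parameter names and values are ours, not the authors'), change in use depends on a constant slope, proportional change from the previous level, and a coupling parameter gamma carrying the predictor's prior level; a zero gamma is the "no causal path" pattern the study observed:

```python
import random

# Illustrative sketch of the bivariate latent change score logic tested here
# (hypothetical parameters): does the predictor's level at occasion t-1
# (coupling gamma) forecast the change in use from t-1 to t?
def simulate_person(occasions, beta=-0.5, gamma=0.3, slope=0.1, noise=0.05,
                    x0=1.0, y0=0.5, seed=0):
    rng = random.Random(seed)
    x, y = [x0], [y0]          # x: use; y: individual difference predictor
    for _ in range(1, occasions):
        # Change = constant slope + proportional change + cross-variable coupling
        dx = slope + beta * x[-1] + gamma * y[-1] + rng.gauss(0, noise)
        dy = slope + beta * y[-1] + gamma * x[-1] + rng.gauss(0, noise)
        x.append(x[-1] + dx)
        y.append(y[-1] + dy)
    return x, y

use, predictor = simulate_person(occasions=6)
# With gamma = 0, the predictor drops out of the equation for change in use.
```

Fitting such a model amounts to estimating gamma from data; the study's null result corresponds to gamma estimates indistinguishable from zero.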

Another major difference lies in how system use was measured. Early-, mid-, and long-term use were defined by the number of days the system was used within specific time frames (e.g., long-term use was defined as the frequency of use from Weeks 41 to 43) based on objective system records, whereas previous studies relied on general self-reported frequency of use without any time frame (e.g., “On average, how much time do you spend on the system each day?”; Venkatesh & Bala, 2008). Together, our assumption of a more dynamic starting point of use and our more precise and granular use data could contribute to the differences in findings.
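
A windowed day count of this kind is straightforward to derive from timestamped logs. The sketch below is illustrative (the start date and log entries are hypothetical; only the Weeks 41–43 window comes from the text):

```python
from datetime import date, timedelta

# Illustrative sketch: derive "number of days used within a window"
# from timestamped system logs, the kind of objective measure described above.
def days_used_in_window(log_dates, study_start, start_week, end_week):
    """Count distinct days with any use between start_week and end_week
    (1-indexed, inclusive) relative to the study start date."""
    window_start = study_start + timedelta(weeks=start_week - 1)
    window_end = study_start + timedelta(weeks=end_week)  # exclusive bound
    return len({d for d in log_dates if window_start <= d < window_end})

start = date(2020, 1, 6)  # hypothetical study start
logs = [start + timedelta(days=d) for d in (281, 282, 282, 290)]
# Long-term use analogue: distinct days used during Weeks 41-43
long_term_use = days_used_in_window(logs, start, 41, 43)  # -> 3
```

Counting distinct days (a set) rather than raw log events matters: repeated sessions on one day count once, matching a days-used definition.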

The correlated but noncausal relationships between performance expectancy, effort expectancy, attitudes, and continued use also suggest that third variables might be at play, predicting both those individual differences and use. One potential third variable is user expectation, as suggested by expectation confirmation theory (Bhattacherjee, 2001; Oliver, 1980) and the Information Systems success model (DeLone & McLean, 2003) in marketing research. Future studies of the digital divide could take a more interdisciplinary perspective by looking beyond the technology acceptance literature to identify potential moderators and third variables linking established individual differences and continued use.

Given the lack of findings from the current study, it remains unclear what factors predict continued use of new technologies in older adults. It is possible that contextual factors, such as busyness, awareness of aging, affect, or routines and habits, moderate the relationship between performance expectancy, effort expectancy, attitudes, and usage within a particular day or week. Within-person microlongitudinal designs have been fruitful in understanding various psychosocial and cognitive processes in aging research (e.g., Brose et al., 2012; Sliwinski et al., 2006; Zhang & Neupert, 2021; Zhang et al., 2020). Future studies could adopt this approach and collect more contextual information to understand use from a more fine-grained perspective. Future studies could also adopt qualitative and mixed-methods designs to gain insights directly from users about why they engage or disengage with a technology. Finally, machine learning is another promising approach that has shown potential for understanding determinants and early signs of adherence failure and disengagement in technology-based activities (e.g., He et al., 2022; Singh et al., 2022). Future research could use machine learning models to supplement current understanding and existing theories.
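
To make the machine learning suggestion concrete, here is a minimal sketch of the general idea, predicting later disengagement from early-use features. This is not the modeling used in He et al. (2022) or Singh et al. (2022); the features, data, and tiny hand-rolled logistic regression are all hypothetical:

```python
import math

# Minimal sketch (hypothetical data and features): predict later disengagement
# from early-use features with a tiny logistic regression trained by SGD.
def train_logreg(X, y, lr=0.5, epochs=4000):
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = predict(w, b, xi)
            err = p - yi                      # gradient of the log loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    return 1 / (1 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))

# Hypothetical features: [proportion of days used in month 1, mean session hours]
X = [[0.67, 1.0], [0.60, 0.8], [0.10, 0.2], [0.17, 0.1], [0.73, 1.2], [0.07, 0.3]]
y = [0, 0, 1, 1, 0, 1]  # 1 = later disengaged
w, b = train_logreg(X, y)
at_risk = predict(w, b, [0.13, 0.2]) > 0.5  # flag a sparse early user
```

In practice one would use richer features, regularization, and held-out evaluation; the point is only that early objective-use signals can feed an adherence-risk model.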

Strengths, Limitations, and Future Directions

The current study has many strengths. First, it is one of the first to acknowledge and systematically demonstrate how computer use changes alongside the attitudes, beliefs, and perceived benefits related to computer use identified in previous literature. It also extended previous research that used only baseline individual differences to predict continued use (e.g., Mitzner et al., 2019) by including subsequent waves of attitudes, beliefs, and perceived benefits after initial adoption. These approaches acknowledge that users change and evolve alongside their interactions with technologies and devices. Second, the current study used advanced longitudinal structural equation modeling techniques. These models controlled for the stability of constructs over time to avoid spurious causal inferences, and therefore provided a more accurate estimation of the dynamic relationships than previous studies.

The current study should also be considered alongside some limitations. First, the sample size was relatively small. Power analysis suggested that our sample had enough power to detect medium, but not small, causal effects between the individual differences and use when individual differences in change trajectories were larger than individual differences at baseline. It is therefore possible that performance expectancy, effort expectancy, and attitudes are predictive of use, but with effects much smaller than proposed in previous literature. Another limitation is that participants were largely female and of lower socioeconomic status, which may limit the generalizability of the results. Studies with broader samples and larger sample sizes will determine whether the null findings hold and generalize to other subpopulations of older adults.

Conclusion

Limitations notwithstanding, the current study showed evidence of large variability in the change trajectories of attitudes, perceptions, and perceived benefits of computers. It further suggested correlated but noncausal relationships between performance expectancy, effort expectancy, attitudes, and continued use of computers and the internet in older adults who had limited previous experience with computers. These findings indicate that perceptions and attitudes about technologies change in very different ways for different older users. Although those changes are associated with change in use, the reasons underlying changes in perceptions, attitudes, and use remain unclear. Understanding the factors that influence use and continued use of computers and the internet is critical given the variety of benefits these technologies can have on older adults' lives. Our findings demonstrate the limitations of popular constructs in the technology acceptance literature for predicting use and continued use, and point to important gaps for future studies as well as a need for theories of long-term technology use and continued use in older users.

Supplementary Material

igad029_suppl_Supplementary_Material

Acknowledgments

Sara J. Czaja, Walter R. Boot, Neil Charness, Wendy A. Rogers, and Joseph Sharit conceptualized and designed the PRISM system and study.

Contributor Information

Shenghao Zhang, Department of Psychology, Florida State University, Tallahassee, Florida, USA.

Walter R Boot, Department of Psychology, Florida State University, Tallahassee, Florida, USA.

Funding

This work was supported by two grants from the National Institute on Aging (R01AG064529, The Adherence Promotion with Person-centered Technology Project: Promoting Adherence to Enhance the Early Detection and Treatment of Cognitive Decline; and 4P01AG17211, under the auspices of the Center for Research and Education on Aging and Technology Enhancement).

Conflict of Interest

None declared.

References

  1. Berkowsky, R. W., Sharit, J., & Czaja, S. J. (2017). Factors predicting decisions about technology adoption among older adults. Innovation in Aging, 1(3), igy002. doi: 10.1093/geroni/igy002
  2. Bhattacherjee, A. (2001). Understanding information systems continuance: An expectation-confirmation model. MIS Quarterly, 25(3), 351–370. doi: 10.2307/3250921
  3. Bhattacherjee, A., & Premkumar, G. (2004). Understanding changes in belief and attitude toward information technology usage: A theoretical model and longitudinal test. MIS Quarterly, 28(2), 229–254. doi: 10.2307/25148634
  4. Boot, W. R., Charness, N., Czaja, S. J., Sharit, J., Rogers, W. A., Fisk, A. D., Mitzner, T., Lee, C. C., & Nair, S. (2015). Computer proficiency questionnaire: Assessing low and high computer proficient seniors. Gerontologist, 55(3), 404–411. doi: 10.1093/geront/gnt117
  5. Brose, A., Schmiedek, F., Lövdén, M., & Lindenberger, U. (2012). Daily variability in working memory is coupled with negative affect: The role of attention and motivation. Emotion, 12(3), 605–617. doi: 10.1037/a0024436
  6. Charness, N., & Boot, W. R. (2022). A grand challenge for psychology: Reducing the age-related digital divide. Current Directions in Psychological Science, 31(2), 187–193. doi: 10.1177/09637214211068144
  7. Chen, K., & Chan, A. H. S. (2014). Gerontechnology acceptance by elderly Hong Kong Chinese: A senior technology acceptance model (STAM). Ergonomics, 57(5), 635–652. doi: 10.1080/00140139.2014.895855
  8. Cheng, M., & Yuen, A. H. K. (2018). Student continuance of learning management system use: A longitudinal exploration. Computers & Education, 120, 241–253. doi: 10.1016/j.compedu.2018.02.004
  9. Cohen, J. (1988). Statistical power analysis for the behavioral sciences. Routledge.
  10. Czaja, S. J., Boot, W., Charness, N., & Rogers, W. A. (2020). Designing for older adults: Principles and creative human factors approach. CRC Press.
  11. Czaja, S. J., Boot, W. R., Charness, N., Rogers, W. A., & Sharit, J. (2018). Improving social support for older adults through technology: Findings from the PRISM randomized controlled trial. Gerontologist, 58(3), 467–477. doi: 10.1093/geront/gnw249
  12. Czaja, S. J., Boot, W. R., Charness, N., Rogers, W. A., Sharit, J., Fisk, A. D., Mitzner, T., Lee, C. C., & Nair, S. N. (2015). The Personalized Reminder Information and Social Management System (PRISM) trial: Rationale, methods and baseline characteristics. Contemporary Clinical Trials, 40, 35–46. doi: 10.1016/j.cct.2014.11.004
  13. Czaja, S. J., Charness, N., Fisk, A. D., Hertzog, C., Nair, S. N., Rogers, W. A., & Sharit, J. (2006). Factors predicting the use of technology: Findings from the Center for Research and Education on Aging and Technology Enhancement (CREATE). Psychology and Aging, 21(2), 333–352. doi: 10.1037/0882-7974.21.2.333
  14. DeLone, W. H., & McLean, E. R. (2003). The DeLone and McLean model of information systems success: A ten-year update. Journal of Management Information Systems, 19(4), 9–30. doi: 10.1080/07421222.2003.11045748
  15. Dwivedi, Y. K., Rana, N. P., Jeyaraj, A., Clement, M., & Williams, M. D. (2019). Re-examining the Unified Theory of Acceptance and Use of Technology (UTAUT): Towards a revised theoretical model. Information Systems Frontiers, 21(3), 719–734. doi: 10.1007/s10796-017-9774-y
  16. Francis, J., Ball, C., Kadylak, T., & Cotten, S. R. (2019). Aging in the digital age: Conceptualizing technology adoption and digital inequalities. In Neves B. B. & Vetere F. (Eds.), Ageing and digital technology (pp. 35–49). Springer.
  17. He, Z., Tian, S., Singh, A., Chakraborty, S., Zhang, S., Lustria, M. L. A., Charness, N., Roque, N. A., Harrell, E. R., & Boot, W. R. (2022). A machine-learning based approach for predicting older adults’ adherence to technology-based cognitive training. Information Processing & Management, 59(5), 103034. doi: 10.1016/j.ipm.2022.103034
  18. Hsieh, J. P. A., Rai, A., & Keil, M. (2008). Understanding digital inequality: Comparing continued use behavioral models of the socio-economically advantaged and disadvantaged. MIS Quarterly, 32(1), 97–126. doi: 10.2307/25148830
  19. Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1–55. doi: 10.1080/10705519909540118
  20. Hu, P. J. H., Clark, T. H., & Ma, W. W. (2003). Examining technology acceptance by school teachers: A longitudinal study. Information & Management, 41(2), 227–241. doi: 10.1016/S0378-7206(03)00050-8
  21. Lee, C., & Coughlin, J. F. (2015). PERSPECTIVE: Older adults’ adoption of technology: An integrated approach to identifying determinants and barriers. Journal of Product Innovation Management, 32(5), 747–759. doi: 10.1111/jpim.12176
  22. McArdle, J. J. (2009). Latent variable modeling of differences and changes with longitudinal data. Annual Review of Psychology, 60, 577–605. doi: 10.1146/annurev.psych.60.110707.163612
  23. McLean, G. (2018). Examining the determinants and outcomes of mobile app engagement—A longitudinal perspective. Computers in Human Behavior, 84, 392–403. doi: 10.1016/j.chb.2018.03.015
  24. Mitzner, T. L., Savla, J., Boot, W. R., Sharit, J., Charness, N., Czaja, S. J., & Rogers, W. A. (2019). Technology adoption by older adults: Findings from the PRISM trial. Gerontologist, 59(1), 34–44. doi: 10.1093/geront/gny113
  25. Newell, A., & Rosenbloom, P. (1981). Mechanisms of skill acquisition and the power law of practice. In Anderson J. R. (Ed.), Cognitive skills and their acquisition (pp. 1–55). Lawrence Erlbaum Associates.
  26. Oliver, R. L. (1980). A cognitive model of the antecedents and consequences of satisfaction decisions. Journal of Marketing Research, 17(4), 460–469. doi: 10.2307/3150499
  27. Pew Research Center. (2021a). Internet/broadband fact sheet. https://www.pewresearch.org/internet/fact-sheet/internet-broadband/
  28. Pew Research Center. (2021b). Mobile fact sheet. https://www.pewresearch.org/internet/fact-sheet/mobile/
  29. Rhodes, R. E., & de Bruijn, G. J. (2013). How big is the physical activity intention–behaviour gap? A meta-analysis using the action control framework. British Journal of Health Psychology, 18(2), 296–309. doi: 10.1111/bjhp.12032
  30. Rogosa, D. (1980). A critique of cross-lagged correlation. Psychological Bulletin, 88(2), 245–258. doi: 10.1037/0033-2909.88.2.245
  31. Sharit, J., Moxley, J. H., Boot, W. R., Charness, N., Rogers, W. A., & Czaja, S. J. (2019). Effects of extended use of an age-friendly computer system on assessments of computer proficiency, attitudes, and usability by older non-computer users. ACM Transactions on Accessible Computing (TACCESS), 12(2), 1–28. doi: 10.1145/3325290
  32. Sheeran, P. (2002). Intention–behavior relations: A conceptual and empirical review. European Review of Social Psychology, 12(1), 1–36. doi: 10.1080/14792772143000003
  33. Singh, A., Chakraborty, S., He, Z., Tian, S., Zhang, S., Lustria, M. L. A., Charness, N., Roque, N. A., Harrell, E. R., & Boot, W. R. (2022). Deep learning-based predictions of older adults’ adherence to cognitive training to support training efficacy. Frontiers in Psychology, 13, 980778. doi: 10.3389/fpsyg.2022.980778
  34. Sliwinski, M. J., Smyth, J. M., Hofer, S. M., & Stawski, R. S. (2006). Intraindividual coupling of daily stress and cognition. Psychology and Aging, 21(3), 545–557. doi: 10.1037/0882-7974.21.3.545
  35. Tsai, H. Y. S., Rikard, R. V., Cotten, S. R., & Shillair, R. (2019). Senior Technology Exploration, Learning, and Acceptance (STELA) model: From exploration to use—A longitudinal randomized controlled trial. Educational Gerontology, 45(12), 728–743. doi: 10.1080/03601277.2019.1690802
  36. Venkatesh, V., & Bala, H. (2008). Technology acceptance model 3 and a research agenda on interventions. Decision Sciences, 39(2), 273–315. doi: 10.1111/j.1540-5915.2008.00192.x
  37. Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425–478. doi: 10.2307/30036540
  38. Venkatesh, V., Thong, J. Y., & Xu, X. (2012). Consumer acceptance and use of information technology: Extending the unified theory of acceptance and use of technology. MIS Quarterly, 36(1), 157–178. doi: 10.2307/41410412
  39. Webb, T. L., & Sheeran, P. (2006). Does changing behavioral intentions engender behavior change? A meta-analysis of the experimental evidence. Psychological Bulletin, 132(2), 249–268. doi: 10.1037/0033-2909.132.2.249
  40. Zhang, S., Boot, W. R., & Charness, N. (2022). Does computer use improve older adults’ cognitive functioning? Evidence from the Personal Reminder Information & Social Management (PRISM) trial. Gerontologist, 62(7), 1063–1070. doi: 10.1093/geront/gnab188
  41. Zhang, S., Gamaldo, A. A., Neupert, S. D., & Allaire, J. C. (2020). Predicting control beliefs in older adults: A micro-longitudinal study. The Journals of Gerontology, Series B: Psychological Sciences and Social Sciences, 75(5), e1–e12. doi: 10.1093/geronb/gbz001
  42. Zhang, S., Grenhart, W. C., McLaughlin, A. C., & Allaire, J. C. (2017). Predicting computer proficiency in older adults. Computers in Human Behavior, 67, 106–112. doi: 10.1016/j.chb.2016.11.006
  43. Zhang, S., & Neupert, S. D. (2021). Within- and between-person relationships among health, awareness of aging, and control beliefs: A micro-longitudinal study. The Journals of Gerontology, Series B: Psychological Sciences and Social Sciences, 76(5), 858–870. doi: 10.1093/geronb/gbaa180


Articles from Innovation in Aging are provided here courtesy of Oxford University Press
