Abstract
Introduction
Participants from a longitudinal cohort study were surveyed to evaluate the practical feasibility of remote cognitive assessment.
Methods
All active participants/informants at the University of California San Diego Alzheimer's Disease Research Center were invited to complete a nine‐question survey assessing technology access/use and willingness to do cognitive testing remotely.
Results
Three hundred sixty‐nine of 450 potential participants/informants (82%) completed the survey. Overall, internet access (88%), device ownership (77%), and willingness to do cognitive testing remotely (72%) were high. Device access was higher among those with normal cognition (85%) or cognitive impairment (85%) than those with dementia (52%), as was willingness to do remote cognitive testing (84%, 74%, 39%, respectively). Latinos were less likely than non‐Latinos to have internet or device access but were comparable in willingness to do remote testing.
Discussion
Remote cognitive assessment using interactive video technology is a practicable option for nondemented participants in longitudinal studies; however, additional resources will be required to ensure representative participation of Latinos.
Keywords: aging, Alzheimer's disease, cognition, mild cognitive impairment, neuropsychology, remote assessment
1. INTRODUCTION
The sudden and unexpected emergence of the COVID‐19 pandemic was extremely disruptive to all aspects of clinical research with human subjects. The disruption was particularly detrimental for observational studies and clinical trials that included elderly individuals with prodromal or symptomatic Alzheimer's disease (AD) given the extreme vulnerability of these individuals to the adverse effects of the virus due to age and potential underlying health issues. These health concerns and the implementation of stay‐at‐home orders commencing in the United States in early March of 2020 prevented the face‐to‐face clinical assessment and cognitive testing required by these studies from going forward. These circumstances have highlighted the need for methods of clinical assessment and cognitive testing that can be easily and reliably administered remotely, with the research participant remaining safely in their own home or other place of residence. Efforts to develop these methods can build upon currently available technologies for real‐time, interactive clinical and cognitive assessment using personal computers, tablet computers, and smartphones.
Potential barriers to widespread implementation of remote online or live interactive testing in studies of AD include lack of accessibility to reliable high‐speed internet service; unavailability of personal computer, tablet, or smartphone technology; and insufficient technical skill to establish an appropriate connection. Each of these potential barriers may vary by age, education, socioeconomic status, and race/ethnicity. 1 Furthermore, little is known about the willingness of older adults to use digital devices for remote online or live interactive cognitive testing, even though ownership and use of internet‐capable technologies among individuals over age 65 has increased in recent years. 2 , 3
To evaluate the feasibility of remote cognitive assessment in our National Institutes of Health/National Institute on Aging (NIH/NIA)‐funded Alzheimer's Disease Research Center (ADRC), we surveyed participants about their use of smartphone or interactive video technology and willingness to complete cognitive assessments remotely. The ADRC longitudinal study requires annual clinical and cognitive assessments that typically are conducted in person. The cohort includes older adults across a spectrum of cognitive functioning from normal to dementia and is approximately 20% Latino. While this cohort is not representative of the general population of older adults, the demographic profile is typical of current volunteers in clinical trials and observational studies of aging and dementia who may need to transition to remote clinical assessment and cognitive testing to continue participation. The goal of the survey was to provide information about the feasibility of making the transition to remote assessment, factors that may impact such a transition, and the resources that may be needed to do so.
RESEARCH IN CONTEXT
Systematic review: The authors reviewed population‐based surveys of technology usage among older adults and prior research on remote cognitive assessment in aging and dementia cohorts. The current investigation builds upon this prior literature by providing information about the practical feasibility of transitioning current volunteers in observational studies of aging and dementia to remote assessment.
Interpretation: Access to requisite technology for remote assessment and willingness to do cognitive testing via video or smartphone were high among non‐demented participants in our Alzheimer's Disease Research Center, but low among dementia patients. Latino participants had less access to technology but comparable willingness to do testing remotely. Remote cognitive assessment using videoconferencing or smartphone technologies is a practicable option for nondemented participants; however, additional resources will be required to ensure representative participation of Latinos.
Future directions: Additional research is needed to determine whether the results described herein generalize to other cohorts, particularly those with different participant characteristics.
2. METHODS
2.1. Participants
All active participants in the University of California San Diego (UCSD) Shiley‐Marcos ADRC longitudinal study were invited to complete the survey. As part of their ongoing ADRC participation, almost all had received a standardized in‐person dementia evaluation with detailed clinical and neuropsychological assessments within the past 18 months, and had received a diagnosis of dementia, cognitive impairment (CI: includes mild cognitive impairment [MCI] and cognitive impairment–not MCI), or normal cognition (NC). Dementia diagnosis was based on Diagnostic and Statistical Manual of Mental Disorders, 5th Edition 4 and NIA‐Alzheimer's Association (AA) 5 diagnostic criteria and CI was based on NIA‐AA criteria for MCI. 6 Some participants known to have severe dementia from prior assessments instead received minimal follow‐up via a telephone interview with a knowledgeable informant/study partner that included an assessment of interval functional decline, review of new cognitive symptoms or medical problems, and any change in living/care arrangements.
There were 465 participants classified as active in the UCSD ADRC longitudinal study as of March 19, 2020, the day the California governor issued a stay‐at‐home order due to the COVID‐19 pandemic. Initial contact showed that 11 of these individuals were deceased and 4 had dropped from the study since last contact with the ADRC. Thus, a total of 450 participants/informants were eligible to receive the survey. This target cohort averaged 76.9 (standard deviation [SD] = 8.1) years of age, had 16.2 (SD = 3.0) years of education, was 58% female, 95% White, and 17% Latino; 26% were classified with dementia, 15% with CI, and 59% as NC.
2.2. Remote assessment technology survey
A nine‐question survey was designed to assess capability and willingness to participate in remote cognitive assessment (Figure 1). The survey queried access to high‐speed internet, access to equipment for video chat/conferencing, experience with video chat/conferencing, type of device and software used for video chat/conferencing, access to technical help, ownership of a smartphone, and willingness to complete cognitive testing remotely via video chat or smartphone. Two versions of the survey were constructed: one with first person wording to be self‐completed by NC participants and one with identical questions but third‐person wording to be completed by the informant/study partner about the participant if the participant had CI or dementia. Questions were presented one at a time using Qualtrics online survey software. Both versions of the survey were available in English and Spanish.
FIGURE 1. Remote assessment technology survey (self‐administered version)
2.3. Procedure
The Remote Assessment Technology Survey was conducted between April 28, 2020, and June 8, 2020. An invitation to complete the survey was sent by e‐mail to all NC participants or to the study partner of participants with CI or dementia. The e‐mail included a brief message explaining the purpose of the study, a link to the survey, instructions for completing the survey, and a deadline date 2 weeks from the day the survey link was sent. A reminder e‐mail that included the survey link was sent after 1 week to those who had not yet responded. If there was still no response after the 2‐week deadline had passed, the participant or study partner was called and asked to complete the online survey if possible. If the participant or study partner could not access the online survey (e.g., due to no internet access), the survey was administered over the telephone and the responses were input by ADRC personnel. Similarly, if no e‐mail address was on file for the participant or study partner, the survey was administered over the telephone and the responses were input by ADRC personnel. This ensured that the entire cohort had the opportunity to participate in the survey. If a participant or study partner did not complete the online survey and could not be reached by telephone after three attempts spaced several days apart (with a message to please return the call), it was assumed that they passively declined to complete the survey.
2.4. Data analysis
Survey responses were exported from the Qualtrics software into a secure, password‐protected Microsoft Excel worksheet. Individual participants' survey data were then linked to their ADRC identification number for co‐registration with demographic and clinical data collected by the ADRC. Because we were interested in determining the feasibility of remote cognitive assessment among our ADRC participants, demographic and clinical data included in analyses are those of the participant, not the study partner/informant who completed the survey on behalf of participants with CI or dementia. Participant characteristics of those who completed or declined the survey were compared using t tests and χ2 analyses. Among survey completers, logistic regression was used to evaluate the association of participant characteristics (diagnostic group, age, education, sex, and ethnicity) with survey response for each yes/no question. All predictor variables were entered into each logistic regression model simultaneously in a single block, permitting evaluation of each participant characteristic while holding all others constant. χ2 analyses were used to determine whether the frequency of use of specific devices (e.g., computer/laptop, tablet, smartphone) and software programs (e.g., Zoom, Skype, FaceTime) differed by ethnicity. All data analyses were completed using SPSS Statistical Software (version 27).
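The analyses described above were run in SPSS. Purely as an illustrative sketch of the modeling approach (simultaneous entry of all predictors, adjusted odds ratios with 95% confidence intervals, and χ2 tests of independence), the following Python code shows how one such model might be specified; the file name and column names (survey_responses.csv, has_internet, diagnosis, age, education, sex, ethnicity, uses_zoom) are hypothetical placeholders, not the study's actual variables.

```python
# Illustrative sketch only (the study used SPSS v27); all names below are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import chi2_contingency

# One row per survey completer; 'has_internet' is a 0/1 outcome for one yes/no question.
df = pd.read_csv("survey_responses.csv")

# Binary logistic regression with all predictors entered simultaneously.
# Reference levels mirror the note to Table 3: normal cognition, female, non-Latino.
model = smf.logit(
    "has_internet ~ C(diagnosis, Treatment('NC')) + age + education"
    " + C(sex, Treatment('female')) + C(ethnicity, Treatment('non-Latino'))",
    data=df,
).fit()

# Exponentiate coefficients to obtain adjusted odds ratios and 95% confidence intervals.
ci = model.conf_int()
odds_ratios = pd.DataFrame({
    "adjusted_OR": np.exp(model.params),
    "CI_lower": np.exp(ci[0]),
    "CI_upper": np.exp(ci[1]),
})
print(odds_ratios.round(2))

# Chi-square test of independence, e.g., software use (Zoom) by ethnicity.
chi2, p, dof, _ = chi2_contingency(pd.crosstab(df["ethnicity"], df["uses_zoom"]))
print(f"chi2({dof}) = {chi2:.2f}, P = {p:.3f}")
```

In practice, one such model would be fit separately for each of the yes/no survey questions, with the same block of predictors each time.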
3. RESULTS
3.1. Participant characteristics
The Remote Assessment Technology Survey was completed by 369 of the 450 eligible ADRC participants/informants (82%), actively declined/refused by 22 (5%), and passively declined by 59 (13%; Figure 2). Demographic and clinical characteristics of participants who completed the survey (or had it completed for them) are compared to participants who actively or passively declined the survey in Table 1. Those who completed the survey had slightly but significantly more years of education than non‐completers (t[448] = 2.55; P < .05) and were less likely to be Latino (χ2[1] = 7.83; P < .01). Survey completers and non‐completers did not differ by age or sex. The NC participants were more likely to complete the survey (89%) than were informants/study partners of CI (78%; χ2[1] = 5.58; P < .05) or dementia (69%; χ2[1] = 21.92; P < .001) participants (omnibus χ2[2] = 22.12; P < .001), while the latter participant groups did not differ.
FIGURE 2. Flow diagram of survey response rates
TABLE 1.
Characteristics a of the overall eligible cohort and comparisons of survey completers and non‐completers
| | All (N = 450) | Completed survey (N = 369) | Actively (N = 22) or passively (N = 59) declined survey (N = 81) | Group comparison (completers vs. non‐completers) |
|---|---|---|---|---|
| Age: Mean (SD) | 76.9 (8.1) | 77.2 (7.9) | 76.1 (8.77) | t(448) = 1.05, n.s. |
| Education: Mean (SD) | 16.1 (3.0) | 16.2 (2.9) | 15.3 (3.3) | t(448) = 2.55, P < .05 |
| Sex: % female | 58% | 58% | 61% | χ2(1) = 0.17, n.s. |
| Ethnicity: % Latino | 17% | 14% | 27% | χ2(1) = 7.83, P < .01 |
| Diagnostic classification: N (% completing survey) | | | | χ2(2) = 22.12, P < .001 |
| Normal cognition (N = 261) | | 232 (89%) | 29 (11%) | |
| Cognitively impaired (N = 68) | | 53 (78%) | 15 (22%) | |
| Dementia (N = 121) | | 84 (69%) | 37 (31%) | |
a Demographic characteristics are those of the Alzheimer's Disease Research Center participant, not the study partner who completed the survey for participants classified with cognitive impairment or dementia.
Demographic and clinical characteristics of ADRC participants who completed the survey (or had it completed for them) are presented by diagnostic category in Table 2. The three diagnostic groups did not differ by age or ethnicity, but the dementia group had fewer years of education than the NC or CI groups, while the NC and CI groups did not differ. The CI group had significantly fewer women than the NC or dementia groups. As expected, the groups differed on scores from mental status exams and functional rating scales with dementia worse than NC and CI, and CI worse than NC. Clinical Dementia Rating (CDR) scores indicated mild impairment in the CI group and moderate to severe impairment in the dementia group. The CI and dementia groups scored higher than the NC group on the Geriatric Depression Scale (GDS) but did not differ from one another and did not have scores indicative of clinically significant depression.
TABLE 2.
Participant demographic characteristics, mental status scores, activities of daily living ratings, and depression scores for survey completers grouped by diagnostic classification
| | Normal (N = 232) | Cognitively impaired (N = 53) | Dementia (N = 84) | Group comparison |
|---|---|---|---|---|
| Age: Mean (SD) | 76.8 (6.8) | 78.6 (6.0) | 77.8 (11.3) | F(2,366) = 1.59, n.s. |
| Education: Mean (SD) | 16.8 (2.2) | 16.2 (3.0) | 14.7 (3.7) b,c | F(2,366) = 16.35, P < .0001 |
| Sex: % female | 60 | 38 b,d | 67 | χ2(2) = 11.73, P < .05 |
| Ethnicity: % Latino | 12 | 19 | 18 | χ2(2) = 2.7, n.s. |
| MMSE: Mean (SD) [N] a | 29.3 (1.2) [220] | 28.2 (1.8) b,d [51] | 19.6 (6.6) b,c [53] | F(2,321) = 243.51, P < .0001 |
| MoCA: Mean (SD) [N] a | 26.5 (2.3) [220] | 23.0 (3.2) b,d [51] | 12.9 (6.0) b,c [54] | F(2,322) = 351.95, P < .0001 |
| CDR: Mean (SD) [N] a | 0.10 (0.2) [226] | 0.40 (0.3) b,d [51] | 1.8 (0.9) b,c [81] | F(2,355) = 422.84, P < .0001 |
| FAQ: Mean (SD) [N] a | 0.5 (1.7) [226] | 2.4 (3.2) b,d [51] | 21.5 (8.4) b,c [81] | F(2,355) = 696.68, P < .0001 |
| GDS: Mean (SD) [N] a | 0.9 (1.4) [220] | 1.7 (1.8) b [51] | 1.9 (2.41) b [49] | F(2,317) = 8.48, P < .0001 |
Abbreviations: CDR, Clinical Dementia Rating; DRS, Dementia Rating Scale; FAQ, Functional Assessment Questionnaire; GDS, Geriatric Depression Scale; MMSE, Mini‐Mental State Examination; MoCA, Montreal Cognitive Assessment.
a Any mental status or rating scale score not obtained within 24 months of the survey was considered missing. For some of the more cognitively impaired patients, the most recent assessment was conducted by telephone interview and MMSE, MoCA, DRS, and GDS scores were not obtained; however, the CDR and FAQ typically were completed, and the last clinical diagnosis was carried forward.
b Significantly different from normal cognition.
c Significantly different from cognitive impairment.
d Significantly different from dementia.
3.2. Survey results
Overall, 88% of ADRC participants for whom survey data were available (N = 369) have reliable high‐speed internet; 77% have a computer, tablet, or smartphone with video chat capabilities they could use for remote testing; 66% regularly use a smartphone; 60% currently use video chat technology to interact with friends and colleagues; 65% have someone to help set up a video assessment interaction, if help is needed; 72% are willing to do cognitive testing (approx. 1 hour) via video; and 59% are willing to do multiple episodes of very brief cognitive testing (i.e., “burst” testing) on their smartphone.
Results of logistic regression analyses are presented in Table 3. Participants with dementia have significantly less access to video chat–capable technology and are less willing to participate in remote cognitive assessment than participants with either CI or NC, who do not differ from one another. Increasing age also is associated with less access to the requisite technology and less willingness to participate in remote cognitive assessment. Lower educational attainment is associated with less access to a computer with webcam or smartphone but not with willingness to do testing via video. Similarly, Latino participants are less likely than non‐Latinos to have access to reliable high‐speed internet or a device they could use for video chat; however, they are comparable in regular smartphone usage as well as willingness to participate in remote cognitive testing via video or smartphone. Survey results are presented by diagnostic category and by ethnicity in Figure 3.
TABLE 3.
Results of binary logistic regression analyses for each yes/no survey question
| Do you have reliable high‐speed internet? | Adjusted odds ratio | 95% C.I. |
|---|---|---|
| Diagnostic group | ||
| Dementia | 0.21‡ | 0.10–0.44 |
| Cognitive impairment | 1.45 | 0.40–5.30 |
| Age (y) | 0.94† | 0.91–0.98 |
| Education (y) | 1.10 | 0.98–1.25 |
| Sex (male) | 1.14 | 0.53–2.41 |
| Ethnicity (Latino) | 0.72 | 0.28–1.87 |
| Do you have a computer with a webcam and microphone, a tablet, or a smartphone that could be used for video chats? | Adjusted odds ratio | 95% C.I. |
|---|---|---|
| Diagnostic group | ||
| Dementia | 0.22‡ | 0.12–0.41 |
| Cognitive impairment | 1.20 | 0.48–3.00 |
| Age (y) | 0.94† | 0.91–0.98 |
| Education (y) | 1.18† | 1.06–1.32 |
| Sex (male) | 1.28 | 0.70–2.36 |
| Ethnicity (Latino) | 0.39* | 0.18–0.82 |
| Do you use a computer, tablet, or smartphone to have video chats with family, friends, or colleagues? | Adjusted odds ratio | 95% C.I. |
|---|---|---|
| Diagnostic group | ||
| Dementia | 0.25‡ | 0.14–0.44 |
| Cognitive impairment | 0.72 | 0.38–1.38 |
| Age (y) | 0.95† | 0.93–0.98 |
| Education (y) | 1.15† | 1.05–1.26 |
| Sex (male) | 1.03 | 0.64–1.66 |
| Ethnicity (Latino) | 0.70 | 0.35–1.37 |
| Is there someone who could help with getting you set up for a video visit, if needed? | Adjusted odds ratio | 95% C.I. |
|---|---|---|
| Diagnostic group | ||
| Dementia | 1.03 | 0.59–1.81 |
| Cognitive impairment | 1.68 | 0.84–3.36 |
| Age (y) | 0.96* | 0.93–0.98 |
| Education (y) | 1.05 | 0.96–1.14 |
| Sex (male) | 1.28 | 0.80–2.05 |
| Ethnicity (Latino) | 0.73 | 0.38–1.40 |
| Would you be willing to do some cognitive testing via video chat? | Adjusted odds ratio | 95% C.I. |
|---|---|---|
| Diagnostic group | ||
| Dementia | 0.14‡ | 0.08–0.26 |
| Cognitive impairment | 0.59 | 0.28–1.23 |
| Age (y) | 0.95† | 0.92–0.98 |
| Education (y) | 1.10 | 0.99–1.21 |
| Sex (male) | 1.22 | 0.70–2.13 |
| Ethnicity (Latino) | 0.80 | 0.37–1.70 |
| Do you regularly use a smartphone (e.g., iPhone or Android mobile phone)? | Adjusted odds ratio | 95% C.I. |
|---|---|---|
| Diagnostic group | ||
| Dementia | 0.08‡ | 0.04–0.16 |
| Cognitive impairment | 1.11 | 0.53–2.34 |
| Age (y) | 0.92‡ | 0.89–0.95 |
| Education (y) | 1.21‡ | 1.09–1.34 |
| Sex (male) | 0.50* | 0.29–0.88 |
| Ethnicity (Latino) | 1.86 | 0.79–4.36 |
| Would you be willing to do brief cognitive assessments on a smartphone? | Adjusted odds ratio | 95% C.I. |
|---|---|---|
| Diagnostic group | ||
| Dementia | 0.35‡ | 0.21–0.61 |
| Cognitive impairment | 0.78 | 0.42–1.47 |
| Age (y) | 0.97* | 0.95–1.00 |
| Education (y) | 1.04 | 0.96–1.14 |
| Sex (male) | 0.85 | 0.54–1.35 |
| Ethnicity (Latino) | 1.29 | 0.67–2.52 |
Note: All predictor variables were entered into each model simultaneously in a single block, permitting evaluation of each characteristic while holding all others constant. Diagnostic group is referenced to normal cognition; sex is referenced to female; and ethnicity is referenced to non‐Latino.
Abbreviation: C.I., confidence interval.
* Significant at P < .05; † Significant at P < .01; ‡ Significant at P < .001.
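As an interpretive aid (ours, not part of the original report), each adjusted odds ratio in Table 3 is the exponentiated logistic regression coefficient and acts multiplicatively on the odds of a "yes" response. For example, the per‐year age effect for reliable high‐speed internet works out as:

```latex
\mathrm{OR}_{\mathrm{age}} = e^{\hat{\beta}_{\mathrm{age}}} = 0.94
\quad\Longrightarrow\quad
0.94^{10} \approx 0.54 \ \text{over a 10-year age difference}
```

That is, holding diagnosis, education, sex, and ethnicity constant, a participant 10 years older has roughly half the odds of reporting reliable high‐speed internet.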
FIGURE 3. Percent of participants responding "yes" by diagnostic classification (A) and ethnicity (B)
Among the 72% of respondents who indicated that they would be willing to do some cognitive testing via video chat (N = 267), desktop/laptops (68%) and smartphones (52%) are used more frequently for video chats than tablets (27%). Device use differs by ethnicity such that use of desktop/laptop computers for video chat is less frequent among Latino respondents (46% Latino, 71% non‐Latino; χ2[1] = 8.60; P < .01), while tablet or smartphone use for video chat does not differ by ethnicity. Zoom (71%) and FaceTime (45%) are the most frequently used software platforms for video chat, followed by Skype (18%) and WhatsApp (9%). Zoom and Skype are less commonly used by Latino respondents (Zoom: 49% Latino, 74% non‐Latino; χ2[1] = 9.44, P < .01; Skype: 3% Latino, 20% non‐Latino; χ2[1] = 5.71; P < .01), while use of FaceTime and WhatsApp does not differ by ethnicity.
4. DISCUSSION
Survey results from ADRC participants indicate that willingness to do cognitive testing via video or smartphone is high among non‐demented participants but low among patients with dementia. Access to requisite technology also is high overall, with 88% of respondents having reliable high‐speed internet and 77% having a computer, tablet, or smartphone that they could use for remote testing; however, internet and device access is significantly lower among Latino respondents. Results suggest that remote cognitive assessment using videoconferencing or smartphone technologies may be a practicable option for nondemented older adult participants in longitudinal research; however, additional resources will be required to ensure representative participation by Latinos.
Because of cohort aging and recruitment priorities in our ADRC, many participants with dementia are moderately to severely impaired, and this may have impacted our survey results. The average CDR Global Rating among the dementia group was 1.84 (SD = 0.94), and 15% of the group resided in a skilled nursing facility (SNF). Exploratory post hoc analyses revealed that dementia patients residing in a SNF were approximately half as likely as those residing elsewhere to have access to interactive video technology. Access to interactive video technology and willingness to participate in remote cognitive assessments may be higher among less impaired dementia patients.
Use of smartphones and interactive video technology among older adults has been increasing steadily in recent years. 2 , 3 The onset of the COVID‐19 pandemic and associated regional stay‐at‐home orders likely have further incentivized older individuals who previously saw no need for these technologies to adopt them for purposes they had not anticipated, such as physician visits via telemedicine or keeping in touch with family. This survey was conducted between April 28, 2020, and June 8, 2020, approximately 6 weeks after the statewide stay‐at‐home order was issued in California. Even at this relatively early point in the pandemic, a sizable percentage of nondemented participants in our cohort were familiar with interactive video technology and reported having used it for video chats with family, friends, and colleagues. It seems likely that as the duration of the pandemic extended beyond what many initially assumed would be a relatively brief and circumscribed lockdown, adoption of interactive video technology increased even further among older adults as they explored alternative means to stay in touch with loved ones. Anecdotally, in the weeks and months after closing the survey, we received unsolicited correspondence from several respondents indicating that although they had initially responded that they were not interested in participating in remote cognitive assessments, they were now interested in doing so.
The Latino community in San Diego County has been disproportionately affected by the COVID‐19 pandemic, and this may have contributed to the lower survey completion rate among Latino ADRC participants. Among those completing the survey, Latino participants are comparable to non‐Latino participants in terms of their willingness to do cognitive testing remotely; however, they are significantly less likely to have access to the requisite technologies. A 2019 survey by the Pew Research Center similarly found that Black and Hispanic adults are less likely to report owning a traditional computer or having high‐speed internet at home than White respondents. 2 Ownership of a smartphone, however, is comparable among Latino and non‐Latino respondents in both our survey and the Pew survey, suggesting that smartphones may play a role in improving access to interactive video technology for these participants. 7 The relatively limited number of Latino respondents in our survey sample (N = 53) precluded subgroup analyses to determine whether there was an interaction between ethnicity and level of cognitive impairment; however, this will be an important direction in future research. Similarly, the number of non‐White, non‐Latino participants in our cohort is not sufficient to analyze their survey responses separately.
While these results suggest that use of interactive video technology may be a practicable option for fostering and/or maintaining participation of older adults in cognitive aging research, they do not speak to the validity of remote cognitive assessments. Although there is mounting evidence that assessments conducted remotely can yield results that are comparable to face‐to‐face assessments in older adults 8 and in patients with MCI and mild AD, 9 the majority of studies have been conducted with participants being assessed in a highly controlled and structured in‐clinic setting, with the examiner located remotely. For example, a recent critical review, published in response to the COVID‐19 pandemic, examined published validity studies that used counterbalanced cross‐over designs of face‐to‐face and teleneuropsychology assessments in older adults (aged 65+). 8 Results showed strong support for the validity of remote administration of the Mini‐Mental State Examination (MMSE) and Montreal Cognitive Assessment (MoCA), the Hopkins Verbal Learning Test, and Letter Fluency; moderate‐to‐strong support for Digit Span (Forward, Backward, and Total); good support for the Boston Naming Test; and moderate support for Category Fluency (for animals). There was limited support for remote administration of measures of executive functioning and processing speed, primarily because they are rarely included in remote assessment. Similarly, a small pilot study found no differences in the MMSE and Alzheimer's Disease Assessment Scale‐Cognitive (ADAS‐Cog) scores of patients with AD when the tests were administered face‐to‐face or by (in‐clinic) videoconference, except for severely impaired patients (MMSE < 17) in whom the assessment via videoconference overestimated cognitive impairment. 14
Despite these encouraging findings from validity studies of in‐clinic cognitive assessments performed remotely using interactive video, to date there have been few investigations of remote assessments conducted with participants located in their own homes and using their own device(s). Validity data on home‐based assessments are likely to be forthcoming as research centers and clinical programs worldwide make accommodations to continue data collection in the wake of the COVID‐19 pandemic. For example, the NIH/NIA Alzheimer's Disease Centers program has adapted portions of the Uniform Data Set Neuropsychological Test Battery 10 for remote administration and plans to examine the comparability of remote and in‐person administrations. Efforts by researchers worldwide to maintain continuity and integrity in research programs that include cognitive assessment may provide a natural laboratory, producing results that will improve knowledge of methods, measures, and best practices for home‐based remote assessment.
The ability to conduct cognitive assessments remotely affords a number of advantages to longitudinal research programs: It provides access and opportunity to individuals who might otherwise be excluded due to geographical distance, transportation difficulties, and physical frailty; it permits ongoing participation and follow‐up of participants who relocate or travel from the area in which they are enrolled; it decreases caregiver and participant burden; it facilitates more frequent cognitive assessment, including brief “burst” assessments; 11 and it decreases exposure to infectious disease, which is clearly highly relevant for older adult participants in the setting of the current COVID‐19 pandemic. Remote cognitive assessments, if well validated, will also have similar benefits for clinical assessments, potentially improving access and diminishing health disparities. Previous work has supported the feasibility and utility of home‐based cognitive assessments, 12 and recently published guidance details considerations and best practices for conducting remote cognitive and behavioral assessment. 13 Considerations for remote assessment include potential impacts of vision or hearing impairments, size and resolution of display screens, and reliability/stability of internet connectivity.
A strength of this study is that our entire ADRC cohort was surveyed to minimize potential selection bias. In an effort to include all participants, those who did not have an e‐mail address on file or who did not respond to the initial e‐mail solicitation were contacted by study personnel via telephone to complete the survey. To further optimize response rate, informants/caregivers were surveyed as proxies for participants with known CI or dementia. A consideration in interpreting these results, however, is that informant/caregiver responses may not always accurately reflect the responses that participants themselves would have provided, particularly with regard to willingness to participate in remote cognitive assessments.
There also are several limitations to our survey. First, because our primary aim was to determine feasibility of remote assessment, and we wanted to keep the survey as brief as possible, we did not ask follow‐up questions that may have clarified the reasoning behind participants’ responses. As a result, we cannot determine, for example, if those who do not use video chat do not do so because of access, knowledge, or choice. Second, although the demographic characteristics of our cohort are fairly typical of current volunteers for clinical trials and observational studies of AD, they do not represent the general population of older adults, which is more racially diverse and has, on average, less formal education than our cohort. Further, participants at our ADRC are highly motivated and dedicated to the program's aims—for example, most of our participants have consented to lumbar puncture and brain donation—and this also may limit the generalizability of our survey results. Nevertheless, to the extent that our ADRC cohort is representative of current volunteers in AD clinical trials and longitudinal cohort studies of cognition in aging that may need to transition to remote, on‐line interactive assessments, these results suggest that access to the requisite technology and willingness to participate in remote assessments are generally high among non‐demented participants. Remote cognitive assessment using interactive video technology may be a practicable option for nondemented participants in longitudinal studies; however, additional resources will be required to ensure representative participation by Latinos and other groups that are underrepresented in clinical research.
CONFLICTS OF INTEREST
DPS is a paid consultant for Aptinx, Inc. and Biogen, Inc. The other authors have no conflicts of interest to report.
ACKNOWLEDGMENTS
Research reported in this publication was supported by the National Institute on Aging of the National Institutes of Health under award numbers P30AG062429 and R01AG064002. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. The authors are grateful to the participants, staff, and volunteers at the UCSD Shiley‐Marcos ADRC for their ongoing commitment to our research program.
Jacobs DM, Peavy GM, Banks SJ, et al. A survey of smartphone and interactive video technology use by participants in Alzheimer's disease research: Implications for remote cognitive assessment. Alzheimer's Dement. 2021;13:e12188. 10.1002/dad2.12188
REFERENCES
- 1. Pew Research Center. Internet/Broadband Fact Sheet. Pew Research Center; 2019. https://www.pewresearch.org/internet/fact-sheet/internet-broadband/
- 2. Anderson M. Mobile Technology and Home Broadband 2019. Pew Research Center; 2019. https://www.pewresearch.org/internet/2019/06/13/mobile-technology-and-home-broadband-2019/
- 3. Anderson M, Perrin A. Tech Adoption Climbs Among Older Adults. Pew Research Center; 2017. https://www.pewresearch.org/internet/2017/05/17/tech-adoption-climbs-among-older-adults/
- 4. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders. 5th ed. Washington, DC: American Psychiatric Publishing; 2013.
- 5. McKhann GM, Knopman DS, Chertkow H, et al. The diagnosis of dementia due to Alzheimer's disease: recommendations from the National Institute on Aging-Alzheimer's Association workgroups on diagnostic guidelines for Alzheimer's disease. Alzheimers Dement. 2011;7:263-269.
- 6. Albert MS, DeKosky ST, Dickson D, et al. The diagnosis of mild cognitive impairment due to Alzheimer's disease: recommendations from the National Institute on Aging-Alzheimer's Association workgroups on diagnostic guidelines for Alzheimer's disease. Alzheimers Dement. 2011;7:270-279.
- 7. Perrin A, Turner E. Smartphones help blacks, Hispanics bridge some – but not all – digital gaps with whites. Pew Research Center; 2019. https://www.pewresearch.org/fact-tank/2019/08/20/smartphones-help-blacks-hispanics-bridge-some-but-not-all-digital-gaps-with-whites/
- 8. Marra DE, Hamlet KM, Bauer RM, Bowers D. Validity of teleneuropsychology for older adults in response to COVID-19: a systematic and critical review. Clin Neuropsychol. 2020;34:1411-1452.
- 9. Costanzo MC, Arcidiacono C, Rodolico A, Panebianco M, Aguglia E, Signorelli MS. Diagnostic and interventional implications of telemedicine in Alzheimer's disease and mild cognitive impairment: a literature review. Int J Geriatr Psychiatry. 2019;35:12-28.
- 10. Weintraub S, Besser L, Dodge HH, et al. Version 3 of the Alzheimer Disease Centers' Neuropsychological Test Battery in the Uniform Data Set (UDS). Alzheimer Dis Assoc Disord. 2018;32:10-17.
- 11. Sliwinski MJ, Mogle JA, Hyun J, Munoz E, Smyth JM, Lipton RB. Reliability and validity of ambulatory cognitive assessments. Assessment. 2018;25:14-30.
- 12. Sano M, Zhu CW, Kaye J, et al. A randomized clinical trial to evaluate home-based assessment of people over 75 years old. Alzheimers Dement. 2019;15(5):615-624.
- 13. Geddes MR, O'Connell ME, Fisk JD, et al. Remote cognitive and behavioral assessment: report of the Alzheimer Society of Canada Task Force on dementia care best practices for COVID-19. Alzheimers Dement (Amst). 2020;12:e12111.
- 14. Carotenuto A, Rea R, Traini E, Ricci G, Fasanaro AM, Amenta F. Cognitive assessment of patients with Alzheimer's disease by telemedicine: pilot study. JMIR Ment Health. 2018;5(2):e31. https://doi.org/10.2196/mental.8097
