Author manuscript; available in PMC 2022 Aug 11. Published in final edited form as: DIS (Des Interact Syst Conf). 2022 Jun 13;2022:1831–1848. doi: 10.1145/3532106.3533471

Control Matters in Elder Care Technology:

Evidence and Direction for Designing It In

Clara Berridge 1, Yuanjin Zhou 2, Amanda Lazar 3, Anupreet Porwal 4, Nora Mattek 5, Sarah Gothard 6, Jeffrey Kaye 7
PMCID: PMC9367632  NIHMSID: NIHMS1827071  PMID: 35969716

Abstract

Studies find that older adults want control over how technologies are used in their care, but how that control can be operationalized through design remains to be clarified. We present findings from a large survey (n=825) of a well-characterized U.S. online cohort that provides actionable evidence of the importance of designing for control over monitoring technologies. This uniquely large, age-diverse sample allows us to compare needs across age and other characteristics, with insights about future users and current older adults (n=496 over age 64), including those concerned about their own memory loss (n=201). All five control options, which are not currently enabled, were rated very or extremely important by most people across age groups. Findings indicate that comfort with a range of care technologies is contingent on having privacy- and other control-enabling options. We discuss opportunities for design to meet these user needs, which demand course correction through attentive, creative work.

Keywords: Health technology, Remote monitoring, Assistive technology, Privacy, Control, Aging, Older adult, Dementia, Memory loss

1. INTRODUCTION

Older adults and people living with dementia are among the most targeted groups for technological intervention in the name of risk assessment and management [7, 59, 136]. The desire to predict and prevent health events through increased remote monitoring and analysis of behavioral biometrics is driven by health and long-term care cost containment and the widespread preference to manage chronic conditions at home. The need to be attentive to older adults’ preferences for control and privacy enabled in technology design is pressing because of the strong momentum in many countries to rely on technologies in care provision, without fully considering how they could restrict rather than enhance autonomy, privacy, human connection, and other important needs and values [14, 17, 20, 51, 60, 62]. In addition to potentially restricting important needs and values, the ways that these emerging technologies instantiate control and privacy may impede adoption. Research has found that negative attitudes that older adults hold towards technology stem in part from concerns about technology interrupting home life, as well as privacy threats [15, 101].

Technology acceptance research has dominated the field of gerontechnology with little attention to the assumptions participants make about control options they might have or that they might require. The field of human-computer interaction (HCI) and design practice are ideally positioned to address this gap. Human-computer interaction researchers are establishing the importance of control and privacy for older adults, primarily through qualitative research [51, 62, 95, 144]. This project takes the approach of survey research, gathering feedback from older adults about the elements of control that are important to them in care technology. With the growing recognition that the category ‘older adult’ represents an extraordinarily diverse group of people, surveys are useful for breaking down how control needs may differ between people along certain characteristics. For example, people with memory concerns, mild cognitive impairment (MCI), and dementias are often targeted for remote monitoring – yet it is unknown how their preferences compare to others’ [24].

We conducted a large (n=825) survey with an online U.S. cohort to better understand potential users’ perspectives on various options that enable control over elder care technologies. With an eye to designing for the future, the sample is age diverse while skewing older than age 64 (n=496), so that we can compare across age as well as deepen our understanding of the preferences of current older adults. We engaged in this work to identify opportunities to “design in” control and to identify potential disconnects where more attentive design visioning is needed. Following research on the control desired by older adults in the context of monitoring technologies, we conceptualize control as a form of ongoing agency in which an individual can decide if and what information is collected about them and with whom it is shared [111, 149]. Survey participants were first presented with three distinct types of data collection and use: location tracking outside of the home, audio capture for detection of changes to brain health, and visual and audio capture via a telepresence robot (video chat on wheels). After asking them to assess their comfort level with each, we presented five control-enabling options, which gerontechnology domain experts have identified as ways to mitigate prominent risks posed by these and other technologies used in elder care [18]. Survey participants rated the importance of each option, each of which could be applied to these particular technologies and beyond.

Our findings show that across the many health and sociodemographic factors that we accounted for, people place high importance on being able to exercise basic control over technologies that could be used in their care. Comfort was relatively high with a wide scope of types of data collected and transmitted for use by one’s primary support person,1 but this comfort is contingent on basic control options that are not standard in the design of elder care technologies. This paper contributes actionable evidence, including the value of operationalizing these forms of control, and provides specific design recommendations to better match the needs of older adults. In doing so, our work advances the growing body of research that aims to counter ageist stereotypes in design [146], rebalance unfavorable power dynamics in care, and mitigate other risks of data-intensive technology use.

2. BACKGROUND AND RELATED WORK

Technology to support older adults and their caregivers at home is an area of strong growth for design. Policy makers, providers, and family caregivers are looking to technological solutions and investing in artificial intelligence (AI) and other technologies that monitor activity and safety of older adults, with a particular focus on Alzheimer’s disease and related dementias (e.g., the AAL Joint Platform of the European Union and AGE-WELL Initiative in Canada) [25, 69, 91, 127, 129, 138]. In the U.S., health care payment systems including Medicaid are now beginning to cover a range of technologies that have remote monitoring functions [17, 28], foretelling significant growth, as lack of third party reimbursement had impeded startups in the aging space [19, 129].

The nature of data continuously collected about older adults is also rapidly changing, especially as big tech has recently established itself in the space of monitoring and risk assessment of older adults [81, 151]. There is particular excitement about the potential capability of detecting cognitive change using predictive linguistic markers [18, 112], which involves audio capture in the home. Location tracking is becoming ubiquitous [25, 48], and the problem of social isolation during COVID is further spurring the evolution of telepresence [64, 87]. These three categories of technologies represent distinct types of data collection that may be carried out passively with little room for direct control by the older adult.

Sustained focus on the control needs and preferences of older adults in relation to these and other monitoring technologies has been delayed, in part due to what Peine and Neven have termed the ‘interventionist logic’ that has dominated the field, problematizing old age and prioritizing the needs of caregivers [114, 115, 146]. This dominant logic, together with the marginalization of issues of elder care in a range of fields that are concerned with power, privacy and other values in sociotechnical practices, such as HCI, science and technology studies (STS), and surveillance studies [51], has left us with a limited and vague understanding of the specific controls desired by older adults in technologies used for care purposes. This gap in knowledge and inattention to older adults’ needs for control over how they are monitored may set up technology-mediated care practices to fail or disempower.

2.1. What’s at stake for older adults in how technologies are used in their care?

Invasion of privacy is by far the most often cited threat posed by elder care technologies that older adults are concerned about [14, 20, 47, 51, 89, 113, 132, 155, 157]. Older adults in general tend to have greater privacy concerns about Internet use and fraud but lower privacy self-efficacy than do younger adults [8, 156]. Research also indicates that privacy needs and concerns will vary with the technology and data collected [20, 128], meaning that personalized control options would be worthwhile design elements. Privacy law scholar Cohen’s conceptualization of privacy interests as “an interest in breathing room to engage in socially situated processes of boundary management” [35:149] is useful to this context of elder care because it foregrounds the fact that privacy is necessary for individuation as it relates to subjectivity, and that one’s ability to manage these boundaries is protective of “the capacity for self-determination” [36:1905]. To have a degree of control over how one is monitored is to be able to engage in boundary management and protect personal privacy needs [14].

Privacy is inextricably linked to other values. It enables values that older adults tend to care about, like freedom and autonomy [14, 20]. HCI researchers McNeill and colleagues draw similar conclusions about the important role of privacy [96]. They provide a privacy framework that includes functions of privacy identified through interviews with older adults. These functions are Self-protection, Autonomy, Emotional release, Confiding (“control over the extent of information disclosed and to whom it is disclosed”), Social identity, Self-concept, and Protecting others. The authors recommend autonomy-enabling options, such as control over when a device is collecting data, and caution that “if AAL technologies are a success at the expense of the individual’s privacy then to what extent is the AAL really empowering or improving the well-being of the elderly?” [96:101].

Risks have been associated with monitoring technologies used in dementia care in the home, including location tracking outside of the home, audio recording, and roaming telepresence. These risks include privacy invasion for the individual and for visitors, which can cause self-limiting activity or diminished autonomy, independence, and agency, and can compromise dignity through information capture (including inferred information) about sexual or bodily functions [18]. Risks also include caregiver overreach and unnecessary harassment based on data, as well as feeling infantilized or “baby sat,” uncomfortable at home, without a place to hide, bugged, and uneasy if a caregiver could visually or audibly enter at any time. This could lead to distrust, suspicion, anxiety or paranoia [18]. Also of primary concern is the risk that the use of these data could enable fewer social calls or substitute for personal visits. Caregiver stress is also cited as a risk should they receive ambiguous data, not understand what warrants an intervention, or experience information or alert overload [18].

2.2. What do we know about older adults’ preferences for control?

There is little in the gerontechnology literature on specific desirable options to enable control. Much of it focuses on data flows, such as willingness for in-home monitoring technologies to share personal and health information [13, 24, 68, 73]. Acceptance tends to be high with regard to sharing health information with medical providers or family members, but older adults may express distrust in digital technologies with diffuse data implications and take a moral position against the loss of privacy as a cultural value [74]. Values dissonance can contribute to resistance to use of new technologies [12, 15] and uncertainty or not understanding a technology or its data flows begets distrust [74].

Qualitative studies have found that control tends to be very important to older adults [20, 44, 52, 111]. Small studies have shown that they want control over decision making about what data are accessed by whom under what conditions [46, 52, 89, 149]. Meanwhile, systems that enable control by older adults are not the norm, and many that have a monitoring function are fixed or hard to customize to accommodate personal preferences [60, 105, 145].

The work in HCI on the needs of older adults has been expanding [12]. In a study of 12 people in care facilities about the sharing of health and well-being data with caregivers, Nurgalieva et al. found that control mattered to residents, who wanted to maintain it as long as possible, even in these spaces where privacy is more limited than it is in home and community environments [111]. The authors conclude that “designers should ensure that information exchange occurs with informed consent and is aligned with seniors’ preferences for privacy and control” [111]. A qualitative study of home robots with 30 older adults found that the need for control was one of four user needs that robots potentially threaten [44]. Another small-scale study of older adults’ preferences for companion robots found that control was significant for some, shaping their preference for an animal form because having a pet-like robot conveys a sense of control over it [34]. Control was found to be “a key condition for privacy preservation” by Schomakers et al. when they surveyed 97 adults, including many over the age of 50, about fall detection and vital parameter monitoring [128]. The authors explain that privacy concerns “do not only concern information privacy and data security, but all dimensions of privacy are touched as intimate and private aspects of life are digitized and the physical, psychological, and social self is made more available to others than desired” [128]. Their survey respondents reported that they desired control over what, how, when, where, and to whom data flows [128]. Frik and colleagues [51] similarly conclude, based on an interview study with 46 older adults, that designers must improve transparency and control as part and parcel of addressing misconceptions among older adults about data flows.

Extant research suggests that empowering older adults with control over data-intensive health and elder care technologies will only become more important as use climbs and populations age [73]. This will require a finer-grained understanding of how to enable control and its prerequisites, such as awareness, and of which options might be important to the different subgroups of older adults whom products are designed to support.

2.3. On power and why who’s given control matters

Due to their surveilling nature, elder care monitoring technologies used in the home are ripe for familial conflict and stress [20, 70]. Studies have found differences between older adults and family members with regard to perception of need and comfort with the collection of various levels of data granularity [16, 20, 58, 89]. Older adults and adult children are likely to have different preferences because the risks and benefits of monitoring fall differently to each [20, 54, 110]. A qualitative dyadic study of Meals on Wheels clients and their adult children found that adult children rated passive remote monitoring technologies more favorably than did their older parents and expressed views that conflicted with those of their parents about how and when they should be used [20]. Adult children overwhelmingly underestimated their parents’ demonstrated ability to comprehend the basic functions of these technologies and thought they would engage them minimally in decisions about adoption [20].

Given the potential risks and ‘intimate threats’ [84], it is especially important to take a close look at power dynamics that are enabled, amplified, or disrupted through those technologies. Critical data studies scholars have emphasized that ethics discourse and research must incorporate explicit attention to power [27, 61]. This is of particular relevance to the design of technologies for elder care [94].

Burmeister and Kreps write that, “Rather than allowing power to be exercised in an intuitive or unconscious manner” [27], design should not only be attentive to power but design intentionally with it top of mind. This is particularly important because older adults are likely to assess their own risk differently from how others assess it, such that proxy decision making may not reflect the decisions older adults would otherwise make for themselves [125]. The practices that develop around remote monitoring can negatively impact older adults where greater exercise of power over aspects of their lives is authorized [16, 54]. For instance, an ethnography of dementia care dyads revealed troubling ways in which using sensors and fall detectors led caregivers to restrict movement and remove privileges, as well as reduce their personal visits [79]. Ethicists and gerontologists have cautioned that older adults may be vulnerable to infringements on self-determination through remote monitoring [15, 60, 106], as well as to the ways in which decision making based on AI monitoring “may inadvertently intensify power and control” [60:3].

2.4. The need for analysis of potential difference

Previous research has found that age, formal education, gender and sexual identity, race and ethnicity, and health conditions are factors that could affect comfort and willingness to share personal health or other data use in remote monitoring [2, 13, 26, 50, 56, 65, 72, 75, 118, 141, 153]. A common assumption made about future generations of older adults when considering study findings about current older adults’ privacy concerns is that those concerns will essentially age out because younger older adults will hold very different expectations for privacy. That is, they’ll be inured to more extensive data collection about them. Because different future user needs would have significant implications for design practices, it is vital that these assumptions be tested. Further, despite the heterogeneity of people over age 64, research on technology preferences in HCI often lacks insight into potential differences among older adults by socio-demographic and health characteristics [118]. Research shows us that older adults are heterogeneous in terms of tech use [39] and that views vary greatly by individual for perceived benefits and privacy risk [128], but the heterogeneity and fine-grained nature of older adults’ privacy preferences are understudied [50, 146]. Designing for a non-homogenous group thus requires a closer look at socio-demographic characteristics, acknowledging that different preferences and needs may be present based on lived experience in different body-minds with accumulated diverse life experiences. For example, Poulsen et al. found that LGBTIQ+ older adults had specific concerns about their vulnerability to robot data security, indicating a greater perceived vulnerability to personal data collection [118]. Learning more about such potential differences would therefore provide valuable insights into how socio-demographic factors may interact with preferences for control over elder care technologies.

It may be particularly important to understand the needs of people who are concerned about their memory or other possible signs of dementia–not only because so much technology is designed for dementia care use–but also because gerontology research has demonstrated various ways in which the preferences of people living with dementia can go unrepresented [23, 57, 98, 100, 152]. Those who provide care for this large, marginalized group of people are increasingly looking to technologies to support caregiving. Studies show that people living with dementia often prefer to be more involved in decision-making than they are [100], and that losses in autonomy with a dementia diagnosis are stressful to them [134]. These stressors may be compounded if they do not have a sense of control over how data-intensive technologies are used in their care.

2.5. Risk mitigation by design

What can design do to enable feelings of control, privacy preservation, and to otherwise mitigate the risks that accompany elder care technologies that can be experienced as intrusive? And how can design create the most relevant control-enabling features possible for a person living with dementia? Enabling control for diverse needs of older adults will take extra design work. First, beyond knowledge that older adults would generally like to maintain as much control as possible, a clear understanding of what specific forms of control are desired is needed. Understanding more concretely what options would be valued is key to mitigating risk through design.

Researchers within and adjacent to HCI have offered responsive, creative suggestions for accommodating the needs of older adult users [51, 55, 96, 145, 149], including some focused on those living with dementia [95, 123]. Often, authors infer design implications and suggestions from needs identified in small-scale studies. Close, attentive qualitative design studies such as these, particularly those engaging people living with dementia, produce rich insights and ways of understanding [80, 104]. Our relatively large-scale survey study takes a next step in the process of understanding how to design more empowering elder care technology.

3. METHODS

This study involved a survey completed by 825 individuals. Rather than ending with design implications inferred from project engagement with older adults, we have fed specific user options back to a large number of older adults so we can learn directly from them whether these options would indeed be worthwhile. Further, due to the large sample size, we are able to offer unique insights into ways in which people might differently weigh the importance of control options by their age, gender, formal education, having a parent with dementia, and self-reported memory status, among other characteristics.

The survey was based on a review of the literature and our findings from a prior survey process, which was a Delphi study of interdisciplinary domain experts [18]. The control options derived from domain experts in this Delphi study are tightly linked to risk mitigation but, as in most studies, they were not generated directly by older adults, nor had older adults been given the opportunity to vet them. The survey study reported in this paper completes this feedback loop by assessing the importance of these control options in order to better guide design and future user studies.

3.1. Survey design

We created a survey consisting of 19 items organized into three sections: technology scenarios, options, and artificial companionship. In this paper, we focus on the responses to the scenarios and options sections (see the survey in Appendix A.1). Content for the current survey was drawn from a review of the literature on risks and benefits associated with remote monitoring technologies, as well as from the findings of our published multi-wave Delphi expert survey on dementia care technologies used in the home [18]. That multi-wave survey was conducted with domain experts in aging and technology research, design, and implementation in the U.S. and Canada. It included disciplinarily diverse participants, such as social scientists in gerontechnology. The Delphi study was conducted to identify technologies that will be prevalent in home dementia care in five years, along with their benefits, risks, and risk mitigation strategies, and was informed by a review of the state of knowledge on near-future home-based technologies for dementia care [18]. The study used the Delphi approach to obtain experts’ opinions and build consensus regarding their responses. We selected this expert study as the foundation of our survey to respond to gerontechnology researchers’ call for design and development to be more informed by gerontological and social sciences knowledge [39, 40, 42, 49, 66, 88, 129]. In this way, this study puts gerontology into more direct conversation with HCI.

The survey presented in this paper draws on the technology scenarios and options developed in our prior work [18]. For the technology scenarios, the expert study had participants identify the technologies they predicted would be the most prevalent in dementia care in the home in the next 5 years (see [18]). We selected three of the twelve technologies identified by experts for the current study: location tracking outside of the home for safety, in-home speech/audio analysis for early detection of brain changes, and video and audio capture through telepresence on wheels. We chose these three technologies because each 1) is among those most commonly endorsed by domain experts as likely to cause tension and require a conversation between the older adult and caregiver before use, 2) represents a distinct type of data collected by a range of elder care technologies (location, visual, and audio), and 3) can be communicated with confident clarity in a self-administered online survey format. For the purpose of this survey, we created scenarios around the three technologies and added a time variant for the telepresence robot (video chat on wheels) for pandemic vs. normal times (see Table 2).

Table 2:

Comfort with four technology scenarios

Survey item | Very Uncomfortable n (%) | Somewhat Uncomfortable n (%) | Somewhat Comfortable n (%) | Very Comfortable n (%)
Your primary support person is concerned about your well-being. They want to track where you are when you are driving to make sure you are safe. How comfortable are you with this? (n=789a) 91 (11.5%) 137 (17.4%) 234 (29.7%) 327 (41.4%)
New technology that tracks speech changes over time could help people learn about changes to their brain health early. This would allow a person to get help from a medical provider if they have early signs of dementia or memory loss. Your primary support person wants to record audio in your home to learn if and when you might be experiencing changes in your brain health. How comfortable are you with this? (n=825) 154 (18.7%) 193 (23.4%) 245 (29.7%) 233 (28.2%)
Some forms of technology allow a loved one to be a remote presence through video chat (such as FaceTime or Zoom). Unlike those options on your phone or computer, robotic devices are able to be driven remotely in your home. Video chat or “check-in on wheels” can take place anywhere in your home.
Please think about unusual times when someone cannot come to your home such as during the coronavirus pandemic. In these times, how comfortable would you be with this video chat or “check-in on wheels” driven by your primary support person in your home? (n=825) 88 (10.7%) 132 (16.0%) 293 (35.5%) 312 (37.8%)
Now please imagine that we are again living under normal circumstances so that you are able to spend time in person with other people. In normal times, how comfortable would you be with this video chat or “check-in on wheels” driven by your primary support person in your home? (n=825) 149 (18.1%) 164 (19.9%) 276 (33.5%) 236 (28.6%)
a Missing observations for this question=36 (not applicable because participants selected that they do not drive)

Our survey also presents five control options derived from the Delphi study, where experts produced ideas for how to mitigate risks that they had identified. The five control options we chose for the current study are the most commonly endorsed risk mitigation strategies from this past work [18]. These risk mitigation strategies are:

  • Devices should actively remind users of how they operate, who is controlling them, and how data are being collected and used

  • Establish informed consent as a process

  • Allow person living with dementia to stop or not use

  • Allow repetitive opportunities to try it out, decline to use, and try again

  • Avoid continuous streaming—require permissions for video or audio

  • Enable ability for person living with dementia to pause system/data collection

We translated these expert-identified risk mitigation strategies into five specific, actionable user options for our survey (see Table 4) to make them concrete and understandable for survey participants (i.e., non-domain experts). These five options align well with extant work in HCI. For example, both [145] and [51] have illustrated the need for feedback about what data are being collected about older adults in order to clear up potential confusion about data flows and enable informed use. While the five options do not represent an exhaustive list of control options that may matter to older adult users, they do act as both a drilling down and synthesis of many of the suggestions put forward by both ethics and HCI researchers [95, 96, 123, 145].

Table 4:

Frequencies for options responses

Option | Not at all Important | Very Unimportant | Somewhat Unimportant | Somewhat Important | Very Important | Extremely Important (values are n (%))
Q1: To have your primary support person check in with you now and then about whether you’ve changed your mind about using the technology 22 (2.7%) 17 (2.1%) 63 (7.8%) 164 (20.2%) 332 (41.0%) 212 (26.2%)
Q2: To be reminded every now and then about what information a technology collects about you 9 (1.1%) 15 (1.9%) 39 (4.8%) 131 (16.2%) 287 (35.4%) 329 (40.6%)
Q3: To try out a technology that is used in your care before deciding to keep it 6 (0.7%) 12 (1.5%) 17 (2.1%) 95 (11.7%) 326 (40.2%) 354 (43.7%)
Q4: To be able to control when a “video chat on wheels” is turned on, if you had one in your home 9 (1.1%) 7 (0.9%) 15 (1.9%) 67 (8.3%) 207 (25.6%) 505 (62.3%)
Q5: To have the ability to pause a technology in your home when you want privacy 5 (0.6%) 8 (1.0%) 7 (0.9%) 28 (3.5%) 163 (20.1%) 599 (74.0%)

3.2. Study participants

The survey was administered using Qualtrics and disseminated by email in June of 2020 to a pre-existing online survey cohort of the Research via Internet Technology and Experience (RITE) program of the Oregon Health & Science University (OHSU).2 The only inclusion criterion for this cohort is being over the age of 18. The volunteer RITE Online Cohort was recruited primarily through OHSU’s Oregon Clinical and Translational Research Institute’s Cohort Discovery, with social media campaigns and flyers as secondary recruitment strategies. Health and socio-demographic data are collected as part of intake and are updated through an annual survey.

The current study’s survey was distributed online to all 2,434 members of the RITE cohort, yielding a response rate of 45% (1,082). Because the focus is on technologies used in the home setting, two respondents living in assisted living were excluded from analysis. Respondents missing data for the key variables of interest, age (missing=4), gender (missing=72), education (missing=150), or memory problem history (missing=179), were also excluded, leaving a final sample of 825. The rate of missing values for each covariate of secondary interest ranges from 0.1% to 4.2%, and is 9.1% for the variable history of dementia in parents.
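As a minimal sketch of the analytic-sample construction described above (not the study’s actual code), the exclusions could be applied as below. The data frame `rite` and its column names are assumptions for illustration only.

```r
# Illustrative sketch only: build the analytic sample from the full cohort
# responses. `rite` is a hypothetical data frame; column names are assumptions.
analytic <- subset(
  rite,
  residence != "Assisted living" &                  # exclude the 2 assisted-living respondents
    !is.na(age) & !is.na(gender) &
    !is.na(education) & !is.na(memory_problem_history)
)
nrow(analytic)  # 825 in the study's data
```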

Table 1 presents the health and socio-demographic characteristics of our sample. Compared to the general national population, the study sample is older, whiter, and more formally educated. The respondents’ ages range from 21–92 years with a mean of 64 (SD=13.13). Sixty percent of respondents are 65 or older, while only 16.5% of the general U.S. population is 65 and older [142]. Sixty-six percent of this sample are women. The majority of respondents (74.2%) have a college degree or more education, far higher than the 32.1% of the U.S. general population. Ninety-six percent of respondents are white, compared with 76.3% of the population [142]. Because our sample skews older than the general population, nearly one quarter (24.4%) of our sample report current memory problems and/or having been seen by a physician for memory problems, a far greater percentage than in the general population. The percentage of those within our sample over 65 who report memory problems is consistent with most population studies [37, 38, 147].

Table 1:

Description of the sample according to all independent variables (n=825)

Category | Subcategories | Mean (SD) / Frequency | Percentage
Age (n=825) Range: 25–88 Mean=63.93
SD=13.17
Gender (n=825) Female 534 64.7%
Male 291 35.3%
Marital status (n=820) Married/living as if married 577 70.4%
Not married 243 29.6%
Living status (n=824) Living alone 162 19.7%
Living with others 662 80.3%
Education (n=825) No college degree 202 24.5%
College degree 276 33.5%
Master degree and above 347 42.1%
Memory problem history (n=825) Memory problem reported 201 24.4%
No memory problem reported 624 75.6%
Number of chronic conditions (n=790) 3+ 540 68.4%
0–2 250 31.6%
Confidence using computer (n=792) Highly confident 668 84.3%
Low-moderately confident 124 15.7%
History of dementia in parents (n=750) History of dementia in either of parents 226 30.1%
No history of dementia in either of parents 524 69.9%
Social activity level score (n=800) Range: 0–17 (out of 20) Mean=8.47
SD=2.82

This sample is also far more technologically experienced and resourced and may thus represent a group that would be expected to be early adopters of new technologies. The vast majority of our sample (84.3%) rated their confidence using the computer as very high. Ninety-five percent of our respondents report using the computer daily while 81% of the general population reports going online daily [117]. Our sample also differs from the general population in their greater access to wireless internet (95% vs. 77%) [116]. Only 75% of the general 65+ population uses the internet [116] and 42% do not have wireless broadband at home [63]. While our sample skews older, among our participants 65+, 93.3% have wireless internet and 100% use the internet. These sample characteristics are discussed further in the limitations section.

3.3. Dependent variables

Comfort was assessed for three distinct types of data collection and use: location tracking while driving for safety, in-home speech/audio analysis for early detection of brain changes, and video and audio capture through telepresence robots. The four response options are Very Uncomfortable; Somewhat Uncomfortable; Somewhat Comfortable; Very Comfortable. For the telepresence technology, described as video chat on wheels, participants were asked two questions about their comfort level: in scenarios of “during normal circumstances” and “in unusual times when someone cannot come to your home such as during the coronavirus pandemic.”

The importance participants place on technology control options was assessed through five questions (presented in Table 4). The six response options are Not at all Important; Very Unimportant; Somewhat Unimportant; Somewhat Important; Very Important; Extremely Important. As described in section 3.1, these options were derived from the risk mitigation strategies recommended by gerontechnology domain experts for data-intensive technologies used in dementia care in the home (see [18]).

3.4. Independent variables

Sociodemographic characteristics and personal health conditions were pre-collected through the RITE cohort surveys. Based on prior literature, characteristics shown to be associated with comfort with technologies were included in bivariate and multivariate analyses: age [140], gender [29], education [78], marital status [1], living alone or with others [78], confidence using computers [68], social support [9], number of chronic conditions [78], and memory problem history [30, 39]. The Brief Assessment of Social Engagement (BASE) scale, ranging from 0 to 20 [103], was used to assess engagement in social activities outside of the home as one indicator of social engagement and support, including travel out of town, attending religious events, attending clubs or group events, visiting friends/family, and eating out. Each item of the BASE scale was rated by frequency (0 rarely or never, 1 yearly, 2 monthly, 3 weekly, 4 daily). Respondents were asked whether they currently had memory problems and whether they had been seen by a physician for memory problems; a dichotomous variable, memory problem history, was positive when a respondent replied yes to either question. Having a parent with a history of dementia was also included because experience with dementia could affect assessment of the technology scenarios or options and might be relevant to respondents’ perceived risk of acquiring dementia [71]. The limited response options for this cohort’s pre-collected gender question are a limitation of this study that resulted in the exclusion of six respondents, which we discuss in the limitations section.
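A minimal sketch of the two derived variables described above follows; it is not the study’s code, and the data frame `dat`, item names, and codings are assumptions for illustration.

```r
# Memory problem history: positive if the respondent reports current memory
# problems OR has been seen by a physician for memory problems.
dat$memory_problem_history <- ifelse(
  dat$current_memory_problems == "Yes" | dat$seen_physician_for_memory == "Yes",
  "Memory problem reported",
  "No memory problem reported"
)

# BASE social activity score: five frequency items, each coded
# 0 = rarely/never, 1 = yearly, 2 = monthly, 3 = weekly, 4 = daily (range 0-20)
base_items <- c("travel_out_of_town", "religious_events", "clubs_or_groups",
                "visit_friends_family", "eat_out")
dat$social_activity_score <- rowSums(dat[, base_items])
```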

3.5. Analysis

Descriptive analysis was conducted using R software [120]. We conducted bivariate and multivariate ordered logistic regression [22] using the R packages “MASS” [121] and “ordinal” [31] to examine the associations between independent variables and respondents’ comfort with the three types of technologies. The Wilcoxon signed-rank test [154] was used to detect differences in responses to the question about comfort level with video chat on wheels in normal vs. pandemic times.
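The sketch below illustrates these steps using MASS::polr (ordinal::clm would be an equivalent alternative); it is not the authors’ code, and the data frame `analytic`, outcome, and covariate names are assumptions.

```r
library(MASS)

# Comfort responses treated as ordered factors
analytic$comfort_location <- factor(
  analytic$comfort_location,
  levels = c("Very Uncomfortable", "Somewhat Uncomfortable",
             "Somewhat Comfortable", "Very Comfortable"),
  ordered = TRUE
)

# Multivariate ordered (proportional-odds) logistic regression
fit <- polr(comfort_location ~ age + gender + education + marital_status +
              living_alone + computer_confidence + social_activity_score +
              chronic_conditions + memory_problem_history + parent_dementia,
            data = analytic, Hess = TRUE)

# Odds ratios and 95% confidence intervals
exp(cbind(OR = coef(fit), confint(fit)))

# Wilcoxon signed-rank test: comfort with video chat on wheels,
# pandemic vs. normal times (paired within respondents)
wilcox.test(as.numeric(analytic$comfort_robot_pandemic),
            as.numeric(analytic$comfort_robot_normal),
            paired = TRUE)
```

Bivariate models follow the same pattern with a single predictor on the right-hand side of the formula.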

We used latent class analysis (LCA) to identify classes of respondents reporting similar patterns of perceived importance of technology control options. This “person-centered” approach allows us to identify complex patterns or typologies of participants in multivariate categorical data [130]. The five ordinal technology control options questions were used to classify respondents into subgroups based on the differences and similarities of their responses. Existing R packages such as poLCA [86] perform LCA for polytomous nominal categorical variables; because treating ordinal variables as nominal discards potentially useful information, we instead modeled the ordinal nature of the responses using an adjacent category logit model [4, 5]. We estimated the model parameters using the Expectation-Maximization (EM) algorithm [43]. The optimal number of classes was determined using the Bayesian information criterion (BIC) to balance model fit and parsimony [130]. We then generated a jittered spaghetti plot of respondents stratified by their estimated latent class. Detailed descriptions of the LCA analysis process are provided in Appendix A.2.
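A simplified sketch of the class-enumeration step is shown below. It uses poLCA, which treats the five items as nominal, whereas the study fit an adjacent category logit LCA estimated with EM (see Appendix A.2); it illustrates only the BIC-based choice of the number of classes. The data frame `opts` and item names are assumptions, with items coded as integers 1–6.

```r
library(poLCA)

# Measurement model: five control-option items, no covariates
f <- cbind(q1_checkin, q2_reminders, q3_tryout,
           q4_robot_control, q5_pause) ~ 1

# Fit 1- to 6-class models with multiple random starts and compare BIC;
# the study's BIC-minimizing solution had four classes.
bics <- sapply(1:6, function(k) {
  poLCA(f, data = opts, nclass = k, maxiter = 2000,
        nrep = 10, verbose = FALSE)$bic
})
which.min(bics)
```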

After determining the number of classes, we examined the associations between independent variables and class membership using the Welch one-way test [102] for continuous independent variables (age and social activity level score) and Fisher’s exact test [3] for categorical independent variables.
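A sketch of these class-comparison tests, assuming each respondent’s assigned class is stored in a factor `analytic$lca_class` (variable names are assumptions):

```r
# Welch one-way test (unequal variances) for continuous covariates
oneway.test(age ~ lca_class, data = analytic, var.equal = FALSE)
oneway.test(social_activity_score ~ lca_class, data = analytic, var.equal = FALSE)

# Fisher's exact test for categorical covariates, e.g., gender by class
fisher.test(table(analytic$gender, analytic$lca_class))
```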

4. FINDINGS

4.1. Comfort with technology scenarios by health and demographic characteristics

Response frequencies for each of the four technology scenario questions are presented in Table 2. 71.1% (561) felt somewhat or very comfortable if their primary support person tracked where they are when they are driving to make sure they are safe. 57.9% (478) felt somewhat or very comfortable with their primary support person recording audio in their home to learn if and when they might be experiencing changes in their brain health. 73.3% (605) felt somewhat or very comfortable with video chat on wheels driven by their primary support person in their home during pandemic times, and 62.1% (512) felt this way under normal circumstances. Respondents were significantly more comfortable with video chat on wheels during pandemic versus normal times (p<0.001); however, the effect size (0.19) indicates that this statistically significant difference is small.

Table 3 presents those characteristics with statistically significant relationships to reported comfort with each of the three technologies, including bivariate results. In bivariate analysis, higher age by one year (odds ratio [OR] = 1.01; 95% confidence interval [CI] = [1.00, 1.02], p=0.027), being married (OR=1.87; [1.40, 2.50], p<0.001), and greater level of social activities (OR=1.05; [1.00, 1.10], p=0.034) were associated with greater comfort with having one’s location tracked while driving by a primary support person, while living alone was associated with lower comfort (OR=0.54; [0.39, 0.75], p<0.001). Women were significantly less likely than men to report comfort with having their audio recorded (OR=0.56; [0.43, 0.73], p<0.001). As with location tracking, those who were married were more comfortable (OR=1.55; [1.18, 2.04], p=0.001) and those who live alone were less comfortable with audio recording (OR=0.61; [0.45, 0.83], p=0.002). In bivariate analysis, no characteristics differentiated respondents by comfort with video chat on wheels during pandemic times, but during normal circumstances, women (OR=0.72; [0.55, 0.93], p=0.011), people with a college degree (OR=0.70; [0.50, 0.97], p=0.033), and those living alone (OR=0.73; [0.54, 1.00], p=0.049) were significantly less comfortable with video chat on wheels. People reporting high confidence using the computer (OR=1.52; [1.07, 2.14], p=0.019) and those with a parent with a history of dementia (OR=1.33; [1.00, 1.76], p=0.050) reported greater comfort levels.

Table 3:

Statistically significant variables for bivariate and multivariate ordinal logistic regression

Values are odds ratios (95% confidence intervals).

Predictors based on bivariate ordinal logistic regression:
Location tracking for driving: Age 1.01 (1.00–1.02)*; Married/living as if married 1.87 (1.40–2.50)***; Living alone 0.54 (0.39–0.75)***; Social activity 1.05 (1.00–1.10)*
Audio recording for brain health: Female 0.56 (0.43–0.73)***; Married/living as if married 1.55 (1.18–2.04)**; Living alone 0.61 (0.45–0.83)**
Check-in on wheels during pandemic: none significant
Check-in on wheels during normal times: Female 0.72 (0.55–0.93)*; College degree a 0.70 (0.50–0.97)*; Living alone 0.73 (0.54–1.00)*; High confidence using computer b 1.52 (1.07–2.14)*; History of dementia in parents 1.33 (1.00–1.76)*

Predictors based on multivariate ordinal logistic regression:
Location tracking for driving: Age 1.02 (1.01–1.03)**; Married/living as if married 1.66 (1.01–2.70)*
Audio recording for brain health: Female 0.58 (0.43–0.78)***
Check-in on wheels during pandemic: none significant
Check-in on wheels during normal times: Female 0.72 (0.54–0.97)*; College degree 0.65 (0.45–0.94)*; Master degree or above 0.62 (0.43–0.89)**; High confidence using computer 1.60 (1.09–2.35)*; History of dementia in parents 1.46 (1.08–1.98)*
a Reference group: No college degree
b Reference group: Low-moderate confidence using computers
* p<0.05; ** p<0.01; *** p<0.001

In adjusted, multivariate analysis, higher age (OR=1.02; [1.01, 1.03], p=0.006) and being married (OR=1.66; [1.01, 2.70], p=0.043) remain significant for comfort with having one’s location tracked while driving by a primary support person. Women remain significantly less likely than men to report comfort with having their audio recorded (OR=0.58; [0.43, 0.78], p<0.001). As with bivariate results, no characteristics differentiated respondents by comfort with video chat on wheels during pandemic times. Lower comfort during normal circumstances remains significant for women (OR=0.72; [0.54, 0.97], p=0.030), and those with greater education. Both people with a college degree (OR=0.65; [0.45, 0.94], p=0.023) and those with a master’s degree or higher (OR=0.62; [0.43, 0.89], p=0.009) reported lower comfort with video chat on wheels during normal times. As with bivariate analysis, people reporting high confidence using the computer (OR=1.60; [1.09, 2.35], p=0.017) and those with a parent with a history of dementia (OR=1.46; [1.08, 1.98], p=0.015) reported greater comfort levels with video chat on wheels in an adjusted model.

4.2. The importance of five control options

Each of the five options that enable different aspects of control over how technology is used in one’s care was rated by the vast majority of respondents as very or extremely important (Table 4). Nearly all (94%) rated the ability to pause a technology in their home when they want privacy as very or extremely important, with only 2.5% rating it as somewhat unimportant or less. Eighty-eight percent think it is very or extremely important that they be able to control when a video chat on wheels is turned on in their home. Eighty-four percent rated trying out a technology before deciding to keep it as very or extremely important. Seventy-six percent thought it would be very or extremely important to be reminded every now and then about what information a technology collects about them, and 67 percent rated it very or extremely important that their support person check in with them about whether they had changed their mind about using a given technology, with 20 percent rating this somewhat important. A very small minority rated any of these options very unimportant or not at all important (range of 1.6% to 4.8%).

4.3. Latent classes for control options

Latent class analysis (LCA) was used to understand characteristics predicting how people rated the aggregated control options. The BIC was minimized at four classes. Figure 1 shows the four latent classes relative to the indicator variables describing respondents’ perceived importance of technology control options. Class 1 (n=18, 2.2%) is labeled “Unimportant,” representing a small number of respondents who selected Not at all important to Somewhat unimportant for most questions. Class 2 (n=333, 41.1%) is “Very important with variation”: respondents reported varied perceived importance, clustering in the very and extremely important range, with far lower importance placed on the option of checking in about whether they had changed their mind about using the technology. Class 3 (n=232, 28.6%) is labeled “Important,” comprising those who responded to most questions from Somewhat to Very important. Class 4 (n=227, 28.0%) is labeled “Highly important,” comprising those who selected very or extremely important for most options.

Figure 1: Latent class analysis model: responses stratified by estimated latent class.

Age, gender, and history of dementia in parents were significantly associated with the four LCA classes (p<0.05) (Table 5). Age tended to be higher in Class 1 (Unimportant) than in the other classes, and Class 4 (Highly important) had younger participants than Class 3 (Important). Women were overrepresented in Class 4 (Highly important). Participants who had a parent with dementia were less likely to be in Class 4 (Highly important) and more likely to be in the small Class 1 (Unimportant). The higher prevalence of memory problem history among respondents in Class 1 compared to the other classes did not reach statistical significance (p=0.120). No significant associations were identified between LCA classes and the other independent variables we examined (marital status, living alone, education, chronic conditions, confidence using computers, and social activity level score).

Table 5:

LCA and independent variables

Variable | Class 1 - Unimportant (n=18) | Class 2 - Varied (n=333) | Class 3 - Important (n=232) | Class 4 - Highly important (n=227) | P-value
Age 69.1 (11.7) 62.5 (13.1) 66.1 (12.3) 63.3 (13.8) 0.003
Female 66.7% 64.6% 57.3% 72.7% 0.007
Married/living as if married 66.7% 71.5% 70.7% 66.1% 0.503
Living alone 16.7% 18.9% 17.7% 23.8% 0.376
College degree 27.8% 33.6% 33.2% 34.4% 0.514
Master’s degree 38.9% 44.4% 42.7% 37.0% -
Memory problem history 38.9% 25.2% 19.4% 26.0% 0.120
3+ chronic conditions 66.7% 64.9% 68.1% 63.9% 0.802
High confidence of using computer 61.1% 83.5% 80.6% 79.7% 0.1681
History of dementia in parents 38.9% 25.2% 26.0% 19.4% 0.022
Social activity level score 7.7(2.1) 8.4 (2.9) 8.4 (2.8) 8.7 (2.8) 0.276

5. DISCUSSION

Qualitative HCI research has yielded critical insights about barriers to informed use of care technologies, such as lack of feedback and the resulting confusion about data flows among older adults. As part of this, researchers have called for privacy options throughout the body of work on remote monitoring technologies for older adults. Robillard has developed stepwise guidelines, based on five pillars for ethical adoption, specifically for dementia care technologies, which begin with participatory design to align needs and outcomes [122]. However, the levers to apply each additional ideal condition for ethical use are often missing. In this context, it is all the more critical that design be attentive to user needs.

Our study reveals that nearly three in four survey participants were at least somewhat comfortable with video chat on wheels controlled by a primary support person in their home, and slightly fewer were at least somewhat comfortable with that person tracking their driving location. The least comfort was reported with audio recording to monitor brain health, but a majority still reported some comfort with this tracking by a primary support person. A greater majority rated all five control options as very or extremely important, with the ability to pause a device rated most important. An implication of this set of findings on both comfort with monitoring technologies and desire for control options is that technology acceptance studies should assess control (and other) needs in tandem. If we do not understand what needs people expect to be accommodated when they report acceptance, we risk designing based on faulty assumptions and an incomplete understanding of their desires and concerns. Below we discuss ways in which design can be responsive to the need for control, along with implications of our comparative findings across socio-demographics.

5.1. Implications for Design: design’s role in enabling control and privacy options

This study has a number of implications for the design of elder care technologies. It is important to put these in a real-world context in which implementation of data-intensive technology across care settings can be complicated and vary dramatically depending on specific care dynamics, as well as available resources, whether at the state (i.e., Medicaid) or household level. Policy lags behind use [17, 18, 124], and well-intentioned guidelines to support ethical and more empowering relationships to these technologies lack regulatory power and do not reach into private, familial practices. Codes of conduct and guidelines have been in place for over a decade and yet are still being called for and pointed to as a viable solution to potentially disempowering care practices [6, 118]. Risks are sometimes acknowledged, yet the feedback loop to both design and implementation is not complete [18]. This is likely an effect of consumer focus: products are designed and marketed to caregivers rather than to the older adults with or on whom they are to be used [145]. As others have pointed out, this is unsurprising given the rampant, hidden nature of ageism that manifests in disregard for the need to examine the needs of older adults in technological designs. While the digital divide has been foregrounded in gerontechnology for decades, the concept of “digital ageism” has more recently entered the lexicon, motivated by the discourse on AI harms, particularly regarding race and gender bias [32, 92]. Digital ageism is a result of the youth-centricity of design and technology industries and the often unquestioned stereotypical representation of older adults and old age that permeates many aspects of U.S. cultures [32, 92, 126, 139]. The hidden nature of ageism, in a context of limited resources, ethics oversight, and regulation, makes it all the more important for design to foreground issues of power and control.

HCI researchers have investigated issues of power and control of technologies for older adults from the perspectives of different stakeholders. Family members have been found to be willing to support older adults in keeping technology usage secure and private [97]. However, researchers have pointed out that stewardship can sometimes veer into paternalism, where privacy and security decisions are made on behalf of older adults without their input or even knowledge [107]. Each of the design implications below centers the older adult as the primary decision maker and agent in utilizing these options. In doing so, we move the locus of agency towards the older adult, while also recognizing the potential of alternative approaches that harness the collective efficacy of communities [76] rather than focusing on individual older adults and designer choices.

5.1.1. To have the ability to pause a technology for privacy.

This research demonstrates strongly that control and privacy options in elder care technologies matter to potential users. Over 90 percent of people felt that the option to pause a technology is very or extremely important. Others have written about the need for this kind of option [96, 145], but none has had a sample large enough to provide such strong evidence of the desire to have this privacy-enhancing option available. This evidence is needed because the pause option is not typically offered to older adults in technologies to support aging in place or care. Here there is a disconnect between user needs and desires and the products being designed.

Fortunately for design, this is a relatively straightforward solution. Frik et al. have recommended using the most private settings as privacy control defaults to make this option user friendly for older adults [51]. But companies in this space, and gerontechnology generally, have often resisted enabling pauses, expressing concern that older adults, and people living with dementia especially, could forget to turn a device back on. Simple design solutions such as an automated timer and an indication that the device is re-activating could address this concern. It is worth noting that early emergency alert devices, such as those to detect falls, did not have a cancel emergency response option, despite high false alert rates and older adults’ preferences to have some control over how a fall is responded to [15]. This trend is changing, and more flexible pause and use customization options could follow the same successful route. More work is needed to highlight where the disconnect lies in devices on the market as well as in policy and implementation.

5.1.2. To be reminded every now and then about what information a technology collects.

Previous work has found that older adults may be more prone than are younger adults to misperceptions and confusion about data flows. Frik et al. explain:

Data flows in emerging technologies are especially opaque for older adults because they may be less familiar with the state-of-the-art sensors and algorithms, or with advances in artificial intelligence, than the younger population [131]. They may base their assumptions about how devices work—and therefore their privacy mitigations—on analogies with more familiar technology [51:29].

Given this problem of misperception, the current study’s insight that older adults generally want to be reminded about what data are collected about them underscores the need to address awareness. To support boundary management, privacy, and autonomy for a general user population, Leong and Selinger suggest that bots push out reminders that they are not real [83]. A similar approach to enable transparency could be used to signal to older adults what a device is tracking and who is receiving that information. To ensure transparent feedback, devices could visually show the person what information the caregiver is receiving, as suggested by Vines et al. [145], who noted years ago the problem of lack of feedback and the resulting misperceptions among older adult study participants. This work would require attentive participatory design to make sure the technique used for such communication is appropriately targeted and successful.

5.1.3. To try out a technology that is used in your care before deciding to keep it, and to check in in case preferences change.

Eighty-four percent rated the option of trying out a technology first as very or extremely important. Sixty-seven percent rated check-ins to learn if they had changed their mind about use as very or extremely important. Others have noted from a neuroethics perspective that consent is often implicit for elder care technology, leading to recommendations for ongoing, dynamic consent processes [123]. Guiding ethical practices that are attentive to this need within private and even publicly funded care provision is a challenge. What role could design play to facilitate this? How can designers and HCI effectively communicate and design for trial periods and routine opportunities to adjust device use? Creative solutions are needed, perhaps including options that embed the norm of trial periods, such as designing a suite of titrated levels of monitoring whereby a user selects the desired option after testing them in the short term. There is great potential here for designers to influence norms of use, testing, and even ongoing adjustment for personalization. This could be a space for important innovation based on the needs of older adults and the evolving needs of those living with dementia that may be adapted for other user groups.
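To make the idea of titrated monitoring levels with built-in trial periods and check-ins concrete, here is a minimal, hypothetical sketch; the tier names, trial length, and check-in cadence are invented for illustration and would need to be established with users.

```python
from enum import Enum
from datetime import date, timedelta

class MonitoringTier(Enum):
    """Hypothetical titrated levels a user could try before committing."""
    OFF = 0
    ACTIVITY_SUMMARY = 1      # daily summary only
    ROOM_LEVEL_SENSING = 2    # adds room-level motion data
    CONTINUOUS_SENSING = 3    # adds continuous sensor streams

class TrialPlan:
    def __init__(self, tier, trial_days=14, checkin_days=90):
        self.tier = tier
        self.trial_ends = date.today() + timedelta(days=trial_days)
        self.next_checkin = date.today() + timedelta(days=checkin_days)

    def due_prompts(self, today=None):
        """Return prompts asking whether the person wants to keep or change the setting."""
        today = today or date.today()
        prompts = []
        if today >= self.trial_ends:
            prompts.append(f"Your trial of '{self.tier.name}' has ended. Keep it, change it, or stop?")
        if today >= self.next_checkin:
            prompts.append("Check-in: are you still comfortable with the current monitoring level?")
        return prompts

# Example: twenty days in, the trial-end prompt is due but the check-in is not.
plan = TrialPlan(MonitoringTier.ACTIVITY_SUMMARY)
print(plan.due_prompts(today=date.today() + timedelta(days=20)))
```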

Standard practices might also be adjusted to accommodate these preferences, in light of the fact that it is hard to fully predict how a technology will be experienced in practice [15]. For example, collaborations could be explored with assistive technology programs, such as U.S. programs funded through the Assistive Technology Act that promote technology access for disabled people, for greater outreach to older adults and caregivers to enable low-stakes trial periods. Business models could offer trial periods or rent-to-own options that spread payment risk out over time for the consumer or third-party payer, so that financial commitment is graded and there is time to try the technology out. This would differ from current contract practices. Purchasers might require this to avoid overcommitment to one product solution. In the U.S., this is likely to become a bigger issue with new reimbursement options for remote patient monitoring and other devices through cost-conscious third-party payers like Medicaid and Medicare [17, 28]. Moreover, feeling locked into a device can strain relations between beneficiaries, their families, and the care managers for such programs, placing frontline workers in difficult positions. Whatever model is used, it should be based on the expectation that we don’t know how people will interact with a given technology. High initial fees lock people in and should be avoided. These are not design recommendations, per se, but as HCI researchers who have illustrated the complexity of dementia care have pointed out, non-technological features, like supports and business models, are also key elements of successfully implemented technology products [148].

5.1.4. To be able to control when a “video chat on wheels” is turned on, if you had one in your home.

Care robots have received heightened attention during the pandemic due to amplified problems of social isolation among older adults [45, 53, 67, 135]. Robots used in elder care range dramatically in data use, integration of AI, and form [21, 119, 123]. The technology we posed to study participants falls at one end of this spectrum: it does not employ AI as a conversational agent but is, rather, telepresence on wheels through which family and friends can enter a home visually and audibly and remotely maneuver the location of the robot.

Roaming telepresence robots like the Beam and Giraff offer the potential benefit of easy use for an older adult with memory or mobility limitations because, unlike Zoom, FaceTime, Skype, or other applications, they can be remotely steered [123]. A caregiver could help the person square up to the screen or use the roaming capacity to find the person in their home. The idea that someone could enter another person’s home without the call needing to be accepted is appealing in emergency situations. But as others have noted, the opportunity to accept or decline an initiated telepresence session should be normalized in design and practice [18, 123]. Roaming telepresence without the capability of accepting or declining a call has been cited by gerontechnology domain experts as an extreme violation of personal privacy [18]. For years, the dominant assumption has been that “unobtrusive” technologies with little to no required action on the part of the older adult are ideal, but there are two problems with this assumption. One is that research on actual use has shown how older adults choose to participate as active users and disrupt the “passive age scripts” [108] of devices even when a system is designed for passivity (e.g., passive sensing) [15]. The other is that “unobtrusive” can mean that people are uninformed or unaware and thus lack the privacy they believe they have, and as a consequence cannot adjust their behavior accordingly. Unobtrusiveness implies ease and convenience, which full remote control of telepresence targets; however, older adults are likely to experience visual and audio entrance into the home as invasive if they lack an option to decline it.

This is not dissimilar from the capacity to pause a monitoring technology when one wants privacy. One can imagine all kinds of scenarios in which it is not a good time for the older adult to have an impromptu visual and audio screen visit from a family member who can control the robot’s location in the older adult’s home. The design challenge is how to balance the potential need for ease of use and the benefits of mobile remote presence with the need for the recipient to have control over initiated calls. This includes accommodating physical difficulty squaring up with or accessing the telepresence robot.
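As one illustration of where that control point could sit, the sketch below assumes hypothetical call states and a pre-consented emergency override; it is not a description of the Beam, Giraff, or any other product.

```python
from enum import Enum, auto

class CallDecision(Enum):
    ACCEPTED = auto()
    DECLINED = auto()
    NO_ANSWER = auto()

def handle_incoming_session(prompt_resident, emergency=False, allow_emergency_override=True):
    """Decide whether a remotely initiated telepresence session may start.

    The resident is always asked first; only a pre-consented emergency
    override lets a session start without an explicit acceptance.
    """
    decision = prompt_resident()  # returns a CallDecision
    if decision is CallDecision.ACCEPTED:
        return "start session"
    if emergency and allow_emergency_override and decision is CallDecision.NO_ANSWER:
        return "start session (emergency override, announced aloud)"
    return "do not start session"

# Example: an unanswered routine call does not open audio or video.
print(handle_incoming_session(prompt_resident=lambda: CallDecision.NO_ANSWER))
```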

5.2. Implications for findings of differences across participants

People ages 65 and older represent multiple generations and age cohorts. They are extraordinarily heterogeneous [90], including in their preferences regarding technologies [39] and privacy concerns [14, 51, 128]. The current state of knowledge about the ways these factors may affect user preferences and needs for control options over elder care technology practices is very limited. We included multiple socio-demographic variables in our models and found that age, gender, and experience with dementia in a parent were significantly associated with the latent classes for our five control options.

5.2.1. Age.

In adjusted models, age was not associated with comfort with video chat on wheels or audio recording, but greater age was associated with greater comfort with having one’s location tracked by a primary support person while driving. Like younger participants, most participants over age 65 rated the control options as important; however, older participants were more likely than younger participants to rate the control options as unimportant or just important and less likely to rate them as very important.

This raises the question: do these control options become less important with age, or is this a cohort effect? That is, will the future 65-year-olds of 2050 retain their relative views given cohort-distinct exposures to technology (e.g., through employment vs. retirement) and understandings of risk? Or, longitudinally, do all cohorts become slightly inured to lower levels of control over technology as they age? It is possible that greater understanding of a given technology and its data flows makes one more sensitive to control needs. While we found no significant differences with regard to education among latent classes for the ratings of control, both people with a college degree and those with a master’s degree or more reported lower comfort with video chat on wheels during normal times than did those without a college degree. A recent study of algorithmic awareness in Norway similarly found that negative assessment of algorithmic uses was positively correlated with education, which also correlates with higher awareness of algorithms [55]. Parallel findings about education are described in a survey of Korean older adults in which higher education was associated with negative attitudes about sharing a range of personal data with entities including family and hospitals [73].

While our sample was large enough that we could control for covariates, greater understanding often comes from greater exposure, such as through higher formal education or specialized knowledge gained in the workplace, and these factors are also associated with age. That is, median educational attainment increases with each new generation, as does tech industry employment; as such, even though we control for education in our models, age may still confound attempts to distinguish cohort effects from aging effects. As others have noted [128], this question of age versus cohort effect is important for the design community to understand as it seeks to anticipate the technological needs and relevant values held by future older adults. But it is a difficult question to answer given how dramatically technology shifts over time. Understanding change over time requires longitudinal studies, which are unfortunately rare in this space.

A common question in user research in gerontechnology and HCI in general is, “But shouldn’t we expect younger cohorts of older adults to be more accepting and unconcerned with privacy?” Not necessarily. Our finding about age conflicts with a common assumption that future older adults will be less concerned about privacy and potentially less invested in privacy-enabling control options than are today’s cohorts over age 65. Control options, including pausing for privacy, were very important to our study sample as a whole, but they were assigned greater weight by younger participants. Our participants younger than 65 asserted the importance of these five control options even more strongly than did the older participants. What we can glean from this age-comparative study is that the popular assumption that today’s younger adults will grow into older adults who are inured to privacy concerns and willing to give up control over data about them may not be true. Our study indicates that unless their preferences change over time, they may be even more likely to desire control over technologies used in their care. This will be important to explore through further study, as it has implications for how designers should orient to the needs of younger older adults. Regardless, the main takeaway is that these control options were very or extremely important to most people across age.

5.2.2. Gender.

In contrast to a recent survey on smart homes that did not find an association between privacy preferences and demographic traits such as gender and age [10], we found in adjusted models that women were significantly less likely than men to report comfort with having their audio recorded and were less comfortable with video chat on wheels during normal times. Our finding that women were less comfortable than men with two of the technology scenarios is consistent with research on other areas of data capture and privacy concerns and is particularly meaningful to gerontechnology given the feminization of aging. Women are also overrepresented in the “Highly important” class for the control options. These gender differences might reflect greater vulnerability experienced by women when interacting with monitoring and information technology and the associated power dynamics, and these concerns may vary with the technology and its purpose. Numerous studies of online privacy concerns suggest that women may experience more risks and concerns than men [85]. One explanation is differential experiences with and perceptions of sexual data leakage [77], exploitation, and intimate partner and other abuses (e.g., sexual harassment, doxing, stalking) [93, 99, 133]. Objections, including refusals, that may be partially shaped by experiences of gender (as well as sexual orientation, which this study did not examine) will be important to take seriously when engaging older adults of diverse genders in technology that collects and shares private data [33, 118]. Further, the ratio of men to women declines sharply at older ages, leaving far more women in the 85+ age category, suggesting that designers of care technologies would be wise to attend more to their concerns and needs.

5.2.3. Memory problems and history of a parent with dementia.

Compared with those without a parent with a history of dementia, those with this parental history were significantly more likely to report higher comfort levels with video chat on wheels in an adjusted model. They were also overrepresented among the minority who generally rated the control options as “Unimportant” or just “Important” rather than at higher levels of importance. We might infer from these findings that their exposure to dementia, and likely caregiving experience, also exposed them to the realities of limited tools and resources for providing such care. Given the resource-restricted environment of dementia care in the U.S., it is quite likely that these participants have also experienced under-met need for caregiving supports, memory care, or long-term care for a parent living with dementia. It is probable that witnessing, and perhaps providing, care to someone through a neurodegenerative disease has helped them to appreciate the challenges of caregiving. That is, their experience may allow them to see greater value in certain tools and to place less value on control over these tools in older age. This possibility aligns with previous gerontechnology research establishing that perceived usefulness is a key component of technology acceptance [41, 82, 143].

There were no statistically significant differences detected between those with self-reported memory problems and those without. Similarly, a longitudinal study about home-based monitoring data and privacy concerns that included people with and without mild cognitive impairment (MCI) found no differences at baseline [24]. It also found few differences between the two groups at one-year follow-up. While those without cognitive impairment became significantly less comfortable with video monitoring over the one-year study period, people living with MCI did not report a decrease in willingness to be videotaped over time. At one year, participants with MCI were also significantly less likely to report concerns that information could be used to harm them. The authors posit that people with MCI may be more accepting of monitoring due to awareness of threats to their ability to live independently, or that they may be less attentive to media content on data and internet risks [24]. Our finding of no difference between people with and without self-reported memory problems suggests that the increased perception of health and safety vulnerability that comes with dementia or dementia risk may be counterbalanced by other salient vulnerabilities, such as loss of privacy and autonomy, as well as the desire not to become a “burden.” New data collection and monitoring may be perceived by different people as alleviating care worry or burden or as intensifying it. One might expect that technologies designed to support people at risk for or living with dementia would be more appealing to people at risk for dementia than to those without memory problems. Yet our finding, and that of Boise et al., of no greater initial comfort among those experiencing memory problems indicates that other considerations are at play [24].

We found that control over technologies used in care may be no less important to people experiencing memory problems [24]. This aligns with a qualitative study of online safety surveillance for couples with memory concerns in which participants thought that control over settings that would enable personalization would be important [95]. It suggests that an important DIS and HCI path will be to acknowledge and attend to the control needs of people living with dementia over technologies used in their care and to learn how to adjust options that enable control to align with changing capacities. This is a potentially exciting challenge for design that has implications for enabling optimal control for other users of monitoring technologies who have degenerative conditions. Enabling degrees of control with dementia is an important area for further inquiry and more longitudinal research is needed to understand how needs may change over time and with experience gained through actual use.

5.3. Limitations and future work

A critical line of inquiry that, to our knowledge, has not been pursued is comparing the needs of older adults across race and ethnicity, which we were unable to do due to the racial homogeneity of our sample. The vast majority of our respondents to this survey were white (95%) and have more formal education and technological experience than the general population; they are thus not representative of it. It is possible that the concerns and considerations driving their preferences differ from those of other populations. We cannot speculate at this point given the dearth of research on the potential relationship between race and control needs in care technologies among older adults. There are a couple of ways to think about how the education level and high technology access of this online cohort may have shaped their responses. On the one hand, our study participants’ relatively greater access to and comfort with digital technologies likely skewed their responses toward greater comfort with and acceptance of the technologies of interest. Because of this, they may represent views closer to the perspective of early adopters. On the other hand, their relative comfort and real-world experience with technology may also give them a stronger understanding of the pitfalls of the technology use scenarios they were presented with (e.g., security issues, privacy threats). Formal education has been associated, in other studies and in ours, with lower comfort with some forms of data collection, so our sample, which has more formal education than the general population, may be reporting less comfort and greater control needs than would a representative U.S. sample. It is also important to note that stated opinions about technologies that have not been used do not necessarily predict use, acceptance, or comfort.

Additional factors may have affected the results. Survey respondents were not administered cognitive tests or physical examinations, so self-reported memory problems may not accurately reflect actual cognitive performance. The gender variable recorded as part of the initial intake for the online cohort was a limited binary response option of male and female with a third write-in option. For this analysis, we coded binary transgender individuals with their reported gender: those who wrote in trans female (n=1) were coded as women, and those who wrote in trans male (n=3) were coded as men. The six additional people who wrote in responses that broadly fall under the umbrella of gender diverse and questioning were excluded from analysis and are included in the count of missing gender variables. The exclusion of these six participants is a limitation of this study. Future surveys should provide a broader range of response options for gender and use sampling methods that recruit adequate numbers of participants who identify as non-binary to enable their inclusion in multivariate analysis.

Finally, as Nissenbaum has pointed out with regard to reducing privacy protections to control, it is important to note that control is not the end game in the ethical, power-aware implementation of technologies that collect, transmit, and analyze new data about older adults [11, 109]. It is possible to achieve informed consent and to enable myriad control options yet to do so in a way that results in individual or collective harms [137], such as when choices to use a given technology are still constrained by resource restrictions that create unviable or undesirable alternatives to use. That being said, control is important to many older adults and should be extended to contribute to more ethical and equitable practices.

6. CONCLUSION

This study builds on previous HCI work by providing direct feedback from potential users on options that could mitigate the primary risks monitoring technologies pose to older adults and care relationships, including those resulting from uninformed use. We presented findings from a survey of 825 people about three technology uses predicted by domain experts to soon be prevalent in dementia home care, followed by five control options. This is the first survey to assess potential users’ interest in such a range of specific options that are within the realm that design can enable. We found that participants report relatively high comfort with sharing data with their primary support person, but that this data collection and sharing is contingent on having control options enabled. All five control options were very or extremely important to most participants, including a full 94 percent for the ability to pause a device. We found no significant difference in adjusted models between those with and without self-reported memory problems for comfort with the technologies or desire for the control options. The control options that matter to the vast majority are not standard options in the design of many products for elder care. We discuss various implications for design to respond to this strong demonstration of user need and argue that design has a significant role to play in enabling more empowering elder care practices. It is important that this role be played given the current limitations to implementing and reinforcing ethics and other guidelines for integrating technologies into elder care practices in ways that protect against risks.

Supplementary Material

Appendices

CCS CONCEPTS.

• Human-centered computing → Human computer interaction (HCI); Empirical studies in HCI; • Social and professional topics → User characteristics; Age; Seniors; People with disabilities;

ACKNOWLEDGMENTS

The authors thank Melissa Clark for her survey methodology consultation and George Demiris, Karen Fredriksen-Goldsen, and Julie Robillard for reviewing survey drafts. This research was supported by the National Institute on Aging (PI: Berridge, NIA K01AG062681), the Oregon Roybal Center for Care Support Translational Research Advantaged by Integrating Technology (ORCASTRAIT; PI Kaye; supported by NIH P30AG024978), the NIA-Oregon Layton Aging & Alzheimer’s Disease Research Center (PI: Kaye, NIA P30AG066518), and the National Science Foundation (PI: Lazar, NSF IIS - 2045679).

Footnotes

1. As shown in Appendix A.1, the survey introduction defines primary support person as follows: Your “primary support person” is someone who would be most likely to step in if you needed care or help. We know you may not have a primary support person now but please think about it in terms of your family member or friend who would care for and look out for you.

2. A description of the original RITE Online Cohort, launched in 2015, can be found in Kabacińska et al., 2020.

Contributor Information

Clara Berridge, University of Washington.

Yuanjin Zhou, University of Texas at Austin.

Amanda Lazar, University of Maryland.

Anupreet Porwal, University of Washington.

Nora Mattek, Oregon Health & Science University.

Sarah Gothard, Oregon Health & Science University.

Jeffrey Kaye, Oregon Health & Science University.

REFERENCES

  • [1].Abd-alrazaq Alaa A., Bewick Bridgette M., Farragher Tracey, and Gardner Peter. 2019. Factors that affect the use of electronic personal health records among patients: A systematic review. International Journal of Medical Informatics 126, 164–175. 10.1016/j.ijmedinf.2019.03.014 [DOI] [PubMed] [Google Scholar]
  • [2].Abdelhamid Mohamed, Gaia Joana, and Sanders G. Lawrence. 2017. Putting the focus back on the patient: How privacy concerns affect personal health information sharing intentions. Journal of Medical Internet Research 19, 9, e169. 10.2196/jmir.6877 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [3].Agresti Alan. 1992. A Survey of exact inference for contingency tables. Statistical Science 7, 1, 131–153. 10.1214/ss/1177011454 [DOI] [Google Scholar]
  • [4].Agresti Alan. 2010. Analysis of Ordinal Categorical Data. John Wiley & Sons. [Google Scholar]
  • [5].Agresti Alan and Lang Joseph B.. 1993. Quasi-symmetric latent class models, with application to rater agreement. Biometrics 49, 1, 131–139. 10.2307/2532608 [DOI] [PubMed] [Google Scholar]
  • [6].Alzheimer Europe. 2010. The ethical issues linked to the use of assistive technology in dementia care. Retrieved May 3, 2018 from http://www.alzheimer-europe.org/EN/Ethics/Ethical-issues-inpractice/2010-The-ethical-issues-linked-to-the-use-of-assistivetechnology-in-dementia-care
  • [7].Astell Arlene J., Bouranis Nicole, Hoey Jesse, Lindauer Allison, Mihailidis Alex, Nugent Chris, Robillard Julie M., and Technology and Dementia Professional Interest Area. 2019. Technology and dementia: The future is now. Dementia and Geriatric Cognitive Disorders 47, 3, 131–139. 10.1159/000497800 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [8].Baig Ed. 2021. Older Adults Wary About Their Privacy Online: Companies increase transparency about data collection to ease those concerns. AARP. Retrieved February 11, 2022 from https://www.aarp.org/home-family/personal-technology/info-2021/companies-address-online-privacy-concerns.html [Google Scholar]
  • [9].Baisch Stefanie, Kolling Thorsten, Schall Arthur, Rühl Saskia, Selic Stefanie, Kim Ziyon, Rossberg Holger, Klein Barbara, Pantel Johannes, Oswald Frank, and Knopf Monika. 2017. Acceptance of social robots by elder people: does psychosocial functioning matter? International Journal of Social Robotics 9, 2: 293–307. 10.1007/s12369-016-0392-5 [DOI] [Google Scholar]
  • [10].Barbosa Natã M., Zhang Zhuohao, and Wang Yang. 2020. Do privacy and security matter to everyone? quantifying and clustering {user-centric} considerations about smart home device adoption. 417–435. Retrieved February 11, 2022 from https://www.usenix.org/conference/soups2020/presentation/barbosa
  • [11].Barocas Solon and Nissenbaum Helen. 2014. Big data’s end run around procedural privacy protections. Communications of the ACM 57, 11, 31–33. 10.1145/2668897 [DOI] [Google Scholar]
  • [12].Pena Belén Barros, Clarke Rachel E, Holmquist Lars Erik, and Vines John. 2021. Circumspect users: older adults as critical adopters and resistors of technology. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI ‘21), 1–14. 10.1145/3411764.3445128 [DOI] [Google Scholar]
  • [13].Beach Scott, Schulz Richard, Downs Julie, Matthews Judith, Barron Bruce, and Seelman Katherine. 2009. Disability, age, and informational privacy attitudes in quality of life technology applications: Results from a national web survey. ACM Transactions on Accessible Computing 2, 1, 1–21. 10.1145/1525840.1525846 [DOI] [Google Scholar]
  • [14].Berridge Clara. 2016. Breathing room in monitored space: The impact of passive monitoring technology on privacy in independent living. The Gerontologist 56, 5: 807–816. 10.1093/geront/gnv034 [DOI] [PubMed] [Google Scholar]
  • [15].Berridge Clara. 2017. Active subjects of passive monitoring: responses to a passive monitoring system in low-income independent living. Ageing and society 37, 3: 537–560. 10.1017/S0144686X15001269 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [16].Berridge Clara. 2017. Selling passive monitoring to manage risk in independent living: Frontline workers in a bind. In Under Observation: The Interplay Between eHealth and Surveillance, Adams Samantha, Purtova Nadezhda and Leenes Ronald (eds.). Springer International Publishing, Cham, 73–90. 10.1007/978-3-319-48342-9_5 [DOI] [Google Scholar]
  • [17].Berridge Clara. 2018. Medicaid becomes the first third-party payer to cover passive remote monitoring for home care: Policy analysis. Journal of Medical Internet Research 20, 2: e66. 10.2196/jmir.9650 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [18].Berridge Clara, Demiris George, and Kaye Jeffrey. 2021. Domain experts on dementia-care technologies: Mitigating risk in design and implementation. Science and Engineering Ethics 27, 1: 14. 10.1007/s11948-021-00286-w [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [19].Berridge Clara, Furseth Peder Inge, Cuthbertson Richard, and Demello Steven. 2014. Technology-based innovation for independent living: Policy and innovation in the United Kingdom, Scandinavia, and the United States. Journal of Aging & Social Policy 26, 3, 213–228. 10.1080/08959420.2014.899177 [DOI] [PubMed] [Google Scholar]
  • [20].Berridge Clara and Terrie Fox Wetle. 2020. Why older adults and their children disagree about in-home surveillance technology, sensors, and tracking. The Gerontologist 60, 5, 926–934. 10.1093/geront/gnz068 [DOI] [PubMed] [Google Scholar]
  • [21].Bevilacqua Roberta, Felici Elisa, Cavallo Filippo, Amabili Giulio, and Maranesi Elvira. 2021. Designing Acceptable Robots for Assisting Older Adults: A Pilot Study on the Willingness to Interact. International Journal of Environmental Research and Public Health 18, 20, 10686. 10.3390/ijerph182010686 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [22].Bilder Christopher R. and Loughin Thomas M.. 2014. Analysis of Categorical Data with R. Chapman and Hall/CRC, Boca Raton. [Google Scholar]
  • [23].de Boer Marike E., Hertogh Cees M. P. M., Dröes Rose-Marie, Riphagen Ingrid I., Jonker Cees, and Eefsting Jan A.. 2007. Suffering from dementia – the patient’s perspective: a review of the literature. International Psychogeriatrics 19, 6, 1021–1039. 10.1017/S1041610207005765 [DOI] [PubMed] [Google Scholar]
  • [24].Boise Linda, Wild Katherine, Mattek Nora, Ruhl Mary, Dodge Hiroko H., and Kaye Jeffrey. 2013. Willingness of older adults to share data and privacy concerns after exposure to unobtrusive in-home monitoring. Gerontechnology: International Journal on the Fundamental Aspects of Technology to Serve the Ageing Society 11, 3, 428–435. 10.4017/gt.2013.11.3.001.00 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [25].Breasail Mícheál Ó, Biswas Bijetri, Smith Matthew D., Mazhar Md Khadimul A., Tenison Emma, Cullen Anisha, Lithander Fiona E., Roudaut Anne, Henderson Emily J.. 2021. Wearable GPS and accelerometer technologies for monitoring mobility and physical activity in neurodegenerative disorders: A systematic review. Sensors 21, 24, 8261. 10.3390/s21248261 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [26].Buckley Brian S., Murphy Andrew W., and MacFarlane Anne E.. 2011. Public attitudes to the use in research of personal health information from general practitioners’ records: a survey of the Irish general public. Journal of Medical Ethics 37, 1, 50–55. 10.1136/jme.2010.037903 [DOI] [PubMed] [Google Scholar]
  • [27].Burmeister Oliver K. and Kreps David. 2021. Power influences upon technology design for age-related cognitive decline using the VSD framework. Ethics and Information Technology 23, 1, 95–98. 10.1007/s10676-018-9460-x [DOI] [Google Scholar]
  • [28].Center for Connected Health Policy (CCHP). 2022. State Telehealth Policies Remote Patient Monitoring. CCHP. Retrieved February 11, 2022 from https://www.cchpca.org/topic/remote-patient-monitoring/ [Google Scholar]
  • [29].Chappell Neena L. and Zimmer Zachary,. 1999. Receptivity to new technology among older adults. Disability and Rehabilitation 21, 5–6, 222–230. 10.1080/096382899297648 [DOI] [PubMed] [Google Scholar]
  • [30].Charness Neil and Boot Walter R.. 2009. Aging and information technology use: Potential and barriers. Current Directions in Psychological Science 18, 5, 253–258. 10.1111/j.1467-8721.2009.01647.x [DOI] [Google Scholar]
  • [31].Christensen Rune Haubo Bojesen and Christensen Maintainer Rune Haubo Bojesen. 2015. Package ‘ordinal.’ Stand 19: 2016.
  • [32].Chu Charlene H, Nyrup Rune, Leslie Kathleen, Shi Jiamin, Bianchi Andria, Lyn Alexandra, McNicholl Molly, Khan Shehroz, Rahimi Samira, and Grenier Amanda. 2022. Digital ageism: Challenges and opportunities in artificial intelligence for older adults. The Gerontologist, gnab167. 10.1093/geront/gnab167 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [33].Cifor M, Garcia P, Rault J, Sutherland T, Chan A, Rode J, Hoffmann AL, Salehi N, and Nakamura L. 2019. Feminist Data Manifest-No. Feminist Data Manifest-No. Retrieved June 7, 2021 from https://www.manifestno.com
  • [34].Coghlan Simon, Waycott Jenny, Lazar Amanda, and Neves Barbara Barbosa. 2021. Dignity, autonomy, and style of company: Dimensions older adults consider for robot companions. Proceedings of the ACM on Human-Computer Interaction 5, CSCW1,104:1–104:25. 10.1145/3449178 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [35].Cohen Julie E.. 2012. Configuring the networked self: law, code, and the play of everyday practice. Yale University Press, New Haven [Conn.]. [Google Scholar]
  • [36].Cohen Julie E.. 2013. What privacy is for. Harvard Law Review 126, 1904–1933. [Google Scholar]
  • [37].Commissaris C. J. A. M., Ponds R. W. H. M., and Jolles J. 1998. Subjective forgetfulness in a normal Dutch population: possibilities for health education and other interventions. Patient Education and Counseling 34, 1, 25–32. 10.1016/S0738-3991(98)00040-8 [DOI] [PubMed] [Google Scholar]
  • [38].Cooper Claudia, Bebbington Paul, Lindesay James, Meltzer Howard, McManus Sally, Jenkins Rachel, and Livingston Gill. 2011. The meaning of reporting forgetfulness: a cross-sectional study of adults in the English 2007 Adult Psychiatric Morbidity Survey. Age and Ageing 40, 6, 711–717. 10.1093/ageing/afr121 [DOI] [PubMed] [Google Scholar]
  • [39].Czaja Sara J., Charness Neil, Fisk Arthur D., Hertzog Christopher, Nair Sankaran N., Rogers Wendy A., and Sharit Joseph. 2006. Factors predicting the use of technology: Findings from the Center for Research and Education on Aging and Technology Enhancement (CREATE). Psychology and aging 21, 2, 333–352. 10.1037/0882-7974.21.2.333 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [40].Czaja Sara J. and Lee Chin Chin. 2006. The impact of aging on access to technology. Universal Access in the Information Society 5, 4, 341. 10.1007/s10209-006-0060-x [DOI] [Google Scholar]
  • [41].Dahlke Deborah Vollmer, Lee Shinduk, Smith Matthew Lee, Shubert Tifany, Popovich Stephen, and Ory Marcia G.. 2021. Attitudes toward technology and use of fall alert wearables in caregiving: Survey study. JMIR Aging 4, 1, e23381. 10.2196/23381 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [42].Demiris George and Hensel Brian K.. 2008. Technologies for an aging society: A systematic review of “smart home” applications. Yearbook of Medical Informatics 17, 1, 33–40. 10.1055/s-0038-1638580 [DOI] [PubMed] [Google Scholar]
  • [43].Dempster AP, Laird NM, and Rubin DB. 1977. Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society: Series B (Methodological) 39, 1, 1–22. 10.1111/j.2517-6161.1977.tb01600.x [DOI] [Google Scholar]
  • [44].Deutsch Inbal, Erel Hadas, Paz Michal, Hoffman Guy, and Zuckerman Oren. 2019. Home robotic devices for older adults: Opportunities and concerns. Computers in Human Behavior 98, 122–133. 10.1016/j.chb.2019.04.002 [DOI] [Google Scholar]
  • [45].Engelhart Katie. 2021. What robots can—and can’t—do for the old and lonely. The New Yorker. Retrieved June 7, 2021 from https://www.newyorker.com/magazine/2021/05/31/what-robots-can-and-cant-do-for-the-old-and-lonely [Google Scholar]
  • [46].Essén Anna. 2008. The two facets of electronic care surveillance: An exploration of the views of older people who live with monitoring devices. Social Science & Medicine 67, 1, 128–136. 10.1016/j.socscimed.2008.03.005 [DOI] [PubMed] [Google Scholar]
  • [47].Ethical Frameworks for Telecare Technologies for older people at home (EFORTT). 2011. Ethical Frameworks for Telecare Technologies for older people at home (EFORTT) -Department of Sociology, Lancaster University, UK. European Commission. Retrieved February 11, 2022 from https://www.lancaster.ac.uk/efortt/ [Google Scholar]
  • [48].Fillekes Michelle Pasquale, Kim Eun-Kyeong, Trumpf Rieke, Zijlstra Wiebren, Giannouli Eleftheria, and Weibel Robert. 2019. Assessing older adults’ daily mobility: A comparison of GPS-derived and self-reported mobility indicators. Sensors 19, 20, 4551. 10.3390/s19204551 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [49].Fisk Arthur D., Czaja Sara J, Rogers Wendy A., Charness Neil, and Sharit Joseph. 2009. Designing for Older Adults. Taylor and Francis. [Google Scholar]
  • [50].Frik Alisa, Bernd Julia, Alomar Noura, and Egelman Serge. 2020. A qualitative model of older adults’ contextual decision-making about information sharing. In Workshop on the Economics of Information Security (WEIS 2020). [Google Scholar]
  • [51].Frik Alisa, Nurgalieva Leysan, Bernd Julia, Lee Joyce, Schaub Florian, and Egelman Serge. 2019. Privacy and security threat models and mitigation strategies of older adults. In Fifteenth Symposium on Usable Privacy and Security (SOUPS 2019), 21–40. [Google Scholar]
  • [52].Galambos Colleen, Rantz Marilyn, Craver Andy, Bongiorno Marie, Pelts Michael, Holik Austin John, and Jun Jung Sim. 2019. Living with intelligent sensors: older adult and family member perceptions. CIN: Computers, Informatics, Nursing 37, 12, 615–627. 10.1097/CIN.0000000000000555 [DOI] [PubMed] [Google Scholar]
  • [53].Ghafurian Moojan, Ellard Colin, and Dautenhahn Kerstin. 2020. Social companion robots to reduce isolation: A perception change due to COVID-19. arXiv: 2008.05382 [cs]. Retrieved June 7, 2021 from http://arxiv.org/abs/2008.05382 [Google Scholar]
  • [54].Godwin Beatrice. 2012. The ethical evaluation of assistive technology for practitioners: a checklist arising from a participatory study with people with dementia, family and professionals. Journal of Assistive Technologies 6, 2, 123–135 10.1108/17549451211234975 [DOI] [Google Scholar]
  • [55].Gran Anne-Britt, Booth Peter, and Bucher Taina. 2021. To be or not to be algorithm aware: a question of a new digital divide? Information, Communication & Society 24, 12, 1779–1796. [Google Scholar]
  • [56].Grande David, Asch David A., Wan Fei, Bradbury Angela R., Jagsi Reshma, and Mitra Nandita. 2015. Are patients with cancer less willing to share their health information? Privacy, sensitivity, and social purpose. Journal of Oncology Practice 11, 5, 378–383. 10.1200/JOP.2015.004820 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [57].Harman Guy and Clare Linda. 2006. Illness representations and lived experience in early-stage dementia. Qualitative Health Research 16, 4, 484–502. 10.1177/1049732306286851 [DOI] [PubMed] [Google Scholar]
  • [58].Hensel Brian K., Demiris George, and Courtney Karen L.. 2006. Defining obtrusiveness in home telehealth technologies: A conceptual framework. Journal of the American Medical Informatics Association 13, 4, 428–431. 10.1197/jamia.M2026 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [59].Higgs Paul and Gilleard Chris. 2021. Techno-fixes for an ageing society. Aging & Mental Health 0, 0, 1–3. 10.1080/13607863.2021.2008308 [DOI] [PubMed] [Google Scholar]
  • [60].Ho Anita. 2020. Are we ready for artificial intelligence health monitoring in elder care? BMC Geriatrics 20, 1, 358. 10.1186/s12877-020-01764-9 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [61].Hofmann Anna Lauren and Mills Charles W.. 2020. Data Ethics for Non-Ideal Times—Some Notes on the Course. Retrieved June 7, 2021 from https://www.annaeveryday.com/s/Hofmann-Data-Ethics-for-Non-Ideal-Times-Lecture-Notes.pdf.
  • [62].Hornung Dominik, Claudia Müller, Shklovski Irina, Jakobi Timo, and Wulf Volker. 2017. Navigating relationships and boundaries: Concerns around ICT-uptake for elderly people. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ‘17), 7057–7069. 10.1145/3025453.3025859 [DOI] [Google Scholar]
  • [63].Humana Foundation and OATS. 2021. Report: Aging Connected: Exposing the Hidden Connectivity Crisis for Older Adults. Retrieved February 11, 2022 from https://agingconnected.org/report/
  • [64].Isabet Baptiste, Pino Maribel, Lewis Manon, Benveniste Samuel, and Rigaud Anne-Sophie. 2021. Social telepresence robots: A narrative review of experiments involving older adults before and during the covid-19 pandemic. International Journal of Environmental Research and Public Health 18, 7, 3597. 10.3390/ijerph18073597 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [65].Ivanov Anton, Sharman Raj, and Rao H. Raghav. 2015. Exploring factors impacting sharing health-tracking records. Health Policy and Technology 4, 3,263–276. 10.1016/j.hlpt.2015.04.008 [DOI] [Google Scholar]
  • [66].Jæger Birgit. 2004. Trapped in the digital divide? Old people in the information society. Science & Technology Studies 17, 2, 5–22. 10.23987/sts.55163 [DOI] [Google Scholar]
  • [67].Jecker Nancy S.. 2020. You’ve got a friend in me: sociable robots for older adults in an age of global pandemics. Ethics and Information Technology. 10.1007/s10676-020-09546-y [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [68].Kavandi Hamidreza and Jaana Mirou. 2020. Factors that affect health information technology adoption by seniors: A systematic review. Health & Social Care in the Community 28, 6, 1827–1842. 10.1111/hsc.13011 [DOI] [PubMed] [Google Scholar]
  • [69].Kaye Jeffrey. 2017. Making pervasive computing technology pervasive for health & wellness in aging. Public Policy & Aging Report 27, 2, 53–61. 10.1093/ppar/prx005 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [70].Kenner Alison Marie. 2008. Securing the elderly body: Dementia, surveillance, and the politics of “Aging in Place.” Surveillance & Society 5, 3. 10.24908/ss.v5i3.3423 [DOI] [Google Scholar]
  • [71].Kessler Eva-Marie, Bowen Catherine E., Baer Marion, Froelich Lutz, and Wahl Hans-Werner. 2012. Dementia worry: a psychological examination of an unexplored phenomenon. European Journal of Ageing 9, 4, 275–284. 10.1007/s10433-012-0242-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [72].Kim Katherine K., Sankar Pamela, Wilson Machelle D., and Haynes Sarah C.. 2017. Factors affecting willingness to share electronic health data among California consumers. BMC Medical Ethics 18, 1, 1–10. 10.1186/s12910-017-0185-x [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [73].Kim Tae Kyung and Choi Moon. 2019. Older adults’ willingness to share their personal and health information when adopting healthcare technology and services. International Journal of Medical Informatics 126, 86–94. 10.1016/j.ijmedinf.2019.03.010 [DOI] [PubMed] [Google Scholar]
  • [74].Knowles Bran and Hanson Vicki L.. 2018. Older adults’ deployment of ‘distrust.’ ACM Transactions on Computer-Human Interaction 25, 4, 1–25. 10.1145/3196490 [DOI] [Google Scholar]
  • [75].Krahe Michelle, Milligan Eleanor, and Reilly Sheena. 2019. Personal health information in research: Perceived risk, trustworthiness and opinions from patients attending a tertiary healthcare facility. Journal of Biomedical Informatics 95, 103222. 10.1016/j.jbi.2019.103222 [DOI] [PubMed] [Google Scholar]
  • [76].Kropczynski Jess, Aljallad Zaina, Elrod Nathan Jeffrey, Lipford Heather, and Wisniewski Pamela J.. 2021. Towards building community collective efficacy for managing digital privacy and security within older adult communities. Proceedings of the ACM on Human-Computer Interaction 4, CSCW3, 1–27. 10.1145/3432954 [DOI] [Google Scholar]
  • [77].Lageson Sarah Esther, McElrath Suzy, and Palmer Krissinda Ellen. 2019. Gendered public support for criminalizing “revenge porn.” Feminist Criminology 14, 5, 560–583. 10.1177/1557085118773398 [DOI] [Google Scholar]
  • [78].Lai Claudia KY, Chung Jenny CC, Leung Natalie KL, Wong Jimmy CT, and Mak Diana PS. 2010. A survey of older Hong Kong people’s perceptions of telecommunication technologies and telecare devices. Journal of Telemedicine and Telecare 16, 8, 441–446. 10.1258/jtt.2010.090905 [DOI] [PubMed] [Google Scholar]
  • [79].Lariviere Matthew, Poland Fiona, Woolham John, Newman Stanton, and Fox Chris. 2021. Placing assistive technology and telecare in everyday practices of people with dementia and their caregivers: findings from an embedded ethnography of a national dementia trial. BMC Geriatrics 21, 1, 121. 10.1186/s12877-020-01896-y [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [80].Lazar Amanda, Edasis Caroline, and Piper Anne Marie. 2017. A critical lens on dementia and design in HCI. In Chi, 2175–2188. [Google Scholar]
  • [81].Lecher Colin. 2021. How Big Tech Is Pitching Digital Elder Care to Families. The Markup. Retrieved February 11, 2022 from https://themarkup.org/privacy/2021/10/28/how-big-tech-is-pitching-digital-elder-care-to-families [Google Scholar]
  • [82].Lee Chaiwoo and Coughlin Joseph F.. 2015. PERSPECTIVE: Older adults’ adoption of technology: An integrated approach to identifying determinants and barriers. Journal of Product Innovation Management 32, 5, 747–759. 10.1111/jpim.12176 [DOI] [Google Scholar]
  • [83].Leong Brenda and Selinger Evan. 2019. Robot eyes wide shut: Understanding dishonest anthropomorphism. Social Science Research Network, Rochester, NY. 10.2139/ssrn.3762223 [DOI] [Google Scholar]
  • [84].Levy Karen and Schneier Bruce. 2020. Privacy threats in intimate relationships. Journal of Cybersecurity 6, 1, tyaa006. [Google Scholar]
  • [85].Li Yuan. 2011. Empirical studies on online information privacy concerns: Literature review and an integrative framework. Communications of the Association for Information Systems 28, 1. 10.17705/1CAIS.02828 [DOI] [Google Scholar]
  • [86].Linzer Drew A. and Lewis Jeffrey B.. 2011. poLCA: An R package for polytomous variable latent class analysis. Journal of statistical software 42, 10, 1–29. [Google Scholar]
  • [87].Lociciro Adrien, Guillon Antoine, and Bodet-Contentin Laetitia. 2021. A telepresence robot in the room of a COVID-19 patient can provide virtual family presence. Canadian Journal of Anesthesia/Journal canadien d’anesthésie 68, 11, 1705–1706. 10.1007/s12630-021-02039-6 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [88].Loe Meika. 2010. Doing it my way: old women, technology and wellbeing. Sociology of Health & Illness 32, 2, 319–334. 10.1111/j.1467-9566.2009.01220.x [DOI] [PubMed] [Google Scholar]
  • [89].Lorenzen-Huber Lesa, Boutain Mary, Camp L. Jean, Shankar Kalpana, and Connelly Kay H.. 2011. Privacy, technology, and aging: A proposed framework. Ageing International 36, 2, 232–252. 10.1007/s12126-010-9083-y [DOI] [Google Scholar]
  • [90].Lowsky David J., Olshansky S. Jay, Bhattacharya Jay, and Goldman Dana P.. 2014. Heterogeneity in healthy aging. The Journals of Gerontology: Series A 69, 6, 640–649. 10.1093/gerona/glt162 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [91].Lukkien Dirk R M, Nap Henk Herman, Buimer Hendrik P, Peine Alexander, Boon Wouter P C, Ket Johannes C F, Minkman Mirella M N, and Moors Ellen H M. 2021. Toward responsible artificial intelligence in long-term care: A scoping review on practical approaches. The Gerontologist, gnab180. 10.1093/geront/gnab180 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [92].Manor Shlomit and Herscovici Arie. 2021. Digital ageism: A new kind of discrimination. Human Behavior and Emerging Technologies 3, 5, 1084–1093. 10.1002/hbe2.299 [DOI] [Google Scholar]
  • [93].Matthews Tara, O’Leary Kathleen, Turner Anna, Sleeper Manya, Woelfer Jill Palzkill, Shelton Martin, Manthorne Cori, Churchill Elizabeth F., and Consolvo Sunny. 2017. Stories from survivors: Privacy & security practices when coping with intimate partner abuse. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ‘17), 2189–2201. 10.1145/3025453.3025875 [DOI] [Google Scholar]
  • [94].McDonald Nora and Forte Andrea. 2020. The politics of privacy theories: Moving from norms to vulnerabilities. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, New York, NY, USA, 1–14. Retrieved February 11, 2022 from 10.1145/3313831.3376167 [DOI] [Google Scholar]
  • [95].McDonald Nora and Mentis Helena M.. 2021. Building for ‘we’: Safety settings for couples with memory concerns. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI ‘21), 1–11. 10.1145/3411764.3445071 [DOI] [Google Scholar]
  • [96].McNeill Andrew, Briggs Pam, Pywell Jake, and Coventry Lynne. 2017. Functional privacy concerns of older adults about pervasive health-monitoring systems. In Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA ‘17), 96–102. 10.1145/3056540.3056559 [DOI] [Google Scholar]
  • [97].Mendel Tamir and Toch Eran. 2019. My mom was getting this popup: Understanding motivations and processes in helping older relatives with mobile security and privacy. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 3, 4, 1–20. 10.1145/336982134164595 [DOI] [Google Scholar]
  • [98].Menne Heather L and Whitlatch Carol J. 2007. Decision-making involvement of individuals with dementia. The Gerontologist 47, 6, 810–9. [DOI] [PubMed] [Google Scholar]
  • [99].Messing Jill, Bagwell-Gray Meredith, Brown Megan Lindsay, Kappas Andrea, and Durfee Alesha. 2020. Intersections of stalking and technology-based abuse: Emerging definitions, conceptualization, and measurement. Journal of Family Violence 35, 7, 693–704. 10.1007/s10896-019-00114-7 [DOI] [Google Scholar]
  • [100].Miller Lyndsey M, Whitlatch Carol J, and Lyons Karen S. 2016. Shared decision-making in dementia: A review of patient and family carer involvement. Dementia 15, 5, 1141–1157. 10.1177/1471301214555542 [DOI] [PubMed] [Google Scholar]
  • [101].Mitzner Tracy L., Boron Julie B., Fausset Cara Bailey, Adams Anne E., Charness Neil, Czaja Sara J., Dijkstra Katinka, Fisk Arthur D., Rogers Wendy A., and Sharit Joseph. 2010. Older adults talk technology: Technology usage and attitudes. Computers in Human Behavior 26, 6, 1710–1721. 10.1016/j.chb.2010.06.020 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [102].Moder Karl. 2010. Alternatives to F-test in one way ANOVA in case of heterogeneity of variances (a simulation study). Psychological Test and Assessment Modeling 52, 4, 343–353. [Google Scholar]
  • [103].Morgan K, Dallosso HM, and Ebrahim SBJ. 1985. A brief self-report scale for assessing personal engagement in the elderly: reliability and validity. Ageing: recent advances and creative responses. Beckenham, Croom Helm, 298–304. [Google Scholar]
  • [104].Morrissey Kellie, McCarthy John, and Pantidi Nadia. 2017. The value of experience-centred design approaches in dementia research contexts. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ‘17), 1326–1338. 10.1145/3025453.3025527 [DOI] [Google Scholar]
  • [105].Mort Maggie, Roberts Celia, and Callén Blanca. 2013. Ageing with telecare: care or coercion in austerity? Sociology of Health & Illness 35, 6, 799–812. 10.1111/j.1467-9566.2012.01530.x [DOI] [PubMed] [Google Scholar]
  • [106].Ben Mortenson W, Sixsmith Andrew, and Woolrych Ryan. 2015. The power(s) of observation: theoretical perspectives on surveillance technologies and older people. Ageing & Society 35, 3, 512–530. 10.1017/S0144686X13000846 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [107].Murthy Savanthi, Bhat Karthik S., Das Sauvik, and Kumar Neha. 2021. Individually vulnerable, collectively safe: The security and privacy practices of households with older adults. Proceedings of the ACM on Human-Computer Interaction 5, CSCW1, 1–24. 10.1145/3449212 [DOI] [Google Scholar]
  • [108].Neven Louis. 2015. By any means? Questioning the link between gerontechnological innovation and older people’s wish to live at home. Technological Forecasting and Social Change 93, 32–43. 10.1016/j.techfore.2014.04.016 [DOI] [Google Scholar]
  • [109].Nissenbaum Helen. 2009. Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford University Press, USA. [Google Scholar]
  • [110].Novitzky Peter, Smeaton Alan F., Chen Cynthia, Irving Kate, Jacquemard Tim, O’Brolcháin Fiachra, O’Mathúna Dónal, and Gordijn Bert. 2015. A review of contemporary work on the ethics of ambient assisted living technologies for people with dementia. Science and engineering ethics 21, 3, 707–765. 10.1007/s11948-014-9552-x [DOI] [PubMed] [Google Scholar]
  • [111].Nurgalieva Leysan, Frik Alisa, Ceschel Francesco, Egelman Serge, and Marchese Maurizio. 2019. Information design in an aged care context: Views of older adults on information sharing in a care triad. In Proceedings of the 13th EAI International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth’19), 101–110. 10.1145/3329189.3329211 [DOI] [Google Scholar]
  • [112].Parsa Mah, Alam Muhammad Raisul, and Mihailidis Alex. 2021. Towards AI-powered language assessment tools. In Press. 10.21203/rs.3.rs-246079/v1 [DOI] [Google Scholar]
  • [113].Peek Sebastiaan T. M., Aarts Sil, and Wouters Eveline J. M.. 2015. Can smart home technology deliver on the promise of independent living? In Handbook of Smart Homes, Health Care and Well-Being, van Hoof Joost, Demiris George and Wouters Eveline J.M. (eds.). Springer International Publishing, Cham, 1–10. 10.1007/978-3-319-01904-8_41-2 [DOI] [Google Scholar]
  • [114].Peine Alexander and Neven Louis. 2019. From intervention to co-constitution: New directions in theorizing about aging and technology. The Gerontologist 59, 1, 15–21. 10.1093/geront/gny050 [DOI] [PubMed] [Google Scholar]
  • [115].Peine Alexander and Neven Louis. 2020. The co-constitution of ageing and technology – a model and agenda. Ageing and Society, 1–22. 10.1017/S0144686X20000641 [DOI] [Google Scholar]
  • [116].Pew Research Center. 2021. Demographics of Internet and Home Broadband Usage in the United States. Pew Research Center: Internet, Science & Tech. Retrieved February 11, 2022 from https://www.pewresearch.org/internet/fact-sheet/internet-broadband/ [Google Scholar]
  • [117].Pew Research Center. 2021. About three-in-ten U.S. adults say they are ‘almost constantly’ online. Retrieved February 11, 2022 from https://www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/ [Google Scholar]
  • [118].Poulsen Adam, Fosch-Villaronga Eduard, and Burmeister Oliver K.. 2020. Cybersecurity, value sensing robots for LGBTIQ+ elderly, and the need for revised codes of conduct. Australasian Journal of Information Systems 24. 10.3127/ajis.v24i0.2789 [DOI] [Google Scholar]
  • [119].Pu Lihui, Moyle Wendy, Jones Cindy, and Todorovic Michael. 2019. The effectiveness of social robots for older adults: A systematic review and meta-analysis of randomized controlled studies. The Gerontologist 59, 1, e37–e51. 10.1093/geront/gny046 [DOI] [PubMed] [Google Scholar]
  • [120].R Core Team. 2013. R: A language and environment for statistical computing.
  • [121].Ripley Brian, Venables Bill, Bates Douglas M., Hornik Kurt, Gebhardt Albrecht, Firth David, and Ripley Maintainer Brian. 2013. Package ‘mass.’ Cran r 538, 113–120. [Google Scholar]
  • [122].Robillard Julie M., Cleland Ian, Hoey Jesse, and Nugent Chris. 2018. Ethical adoption: A new imperative in the development of technology for dementia. Alzheimer’s & Dementia 14, 9, 1104–1113. 10.1016/j.jalz.2018.04.012 [DOI] [PubMed] [Google Scholar]
  • [123].Robillard Julie M., Goldman Ian P., Prescott Tony J., and Michaud François. 2020. Addressing the ethics of telepresence applications through end-user engagement. Journal of Alzheimer’s Disease 76, 2, 457–460. 10.3233/JAD-200154 [DOI] [PubMed] [Google Scholar]
  • [124].Robillard Julie M., Wu Julia M., Feng Tanya L., and Tam Mallorie T.. 2019. Prioritizing benefits: A content analysis of the ethics in dementia technology policies. Journal of Alzheimer’s Disease 69, 4, 897–904. 10.3233/JAD-180938 [DOI] [PubMed] [Google Scholar]
  • [125].Rolison Jonathan J., Hanoch Yaniv, and Freund Alexandra M.. 2019. Perception of risk for older adults: Differences in evaluations for self versus others and across risk domains. Gerontology 65, 5, 547–559. 10.1159/000494352 [DOI] [PubMed] [Google Scholar]
  • [126].Rosales Andrea and Fernández-Ardèvol Mireia. 2020. Ageism in the era of digital platforms. Convergence 26, 5–6, 1074–1087. 10.1177/1354856520930905 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [127].Rubeis Giovanni. 2020. The disruptive power of Artificial Intelligence. Ethical aspects of gerontechnology in elderly care. Archives of Gerontology and Geriatrics 91, 104186. 10.1016/j.archger.2020.104186 [DOI] [PubMed] [Google Scholar]
  • [128].Schomakers Eva-Maria and Ziefle Martina. 2019. Privacy perceptions in ambient assisted living. In ICT4AWE, 205–212.
  • [129].Schulz Richard, Wahl Hans-Werner, Matthews Judith T., De Vito Dabbs Annette, Beach Scott R., and Czaja Sara J.. 2015. Advancing the Aging and Technology Agenda in Gerontology. The Gerontologist 55, 5, 724–734. 10.1093/geront/gnu071 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [130].Scotto Rosato Nancy and Baer Judith C.. 2012. Latent class analysis: A method for capturing heterogeneity. Social Work Research 36, 1, 61–69. 10.1093/swr/svs006 [DOI] [Google Scholar]
  • [131].Shankar Kalpana, Camp L. Jean, Connelly Kay, and Huber Lesa. 2012. Aging, privacy, and home-based computing: Developing a design framework. IEEE Pervasive Computing 11, 4, 46–54. 10.1109/MPRV.2011.19 [DOI] [Google Scholar]
  • [132].Shankar Kalpana. 2010. Pervasive computing and an aging populace: Methodological challenges for understanding privacy implications. Journal of Information, Communication and Ethics in Society 8, 3, 236–248. 10.1108/14779961011071051 [DOI] [Google Scholar]
  • [133].Sheehan Kim Bartel. 1999. An investigation of gender differences in online privacy concerns and resultant behaviors. Journal of Interactive Marketing 13, 4, 24–38. [DOI] [Google Scholar]
  • [134].Shelton Evan G, Orsulic-Jeras Silvia, Whitlatch Carol J, and Szabo Sarah M. 2018. Does it matter if we disagree? The impact of incongruent care preferences on persons with dementia and their care partners. The Gerontologist 58, 3, 556–566. 10.1093/geront/gnw202 [DOI] [PubMed] [Google Scholar]
  • [135].Shen Yang, Guo Dejun, Long Fei, Mateos Luis A., Ding Houzhu, Xiu Zhen, Hellman Randall B., King Adam, Chen Shixun, Zhang Chengkun, and Tan Huan. 2021. Robots under COVID-19 pandemic: A comprehensive survey. IEEE Access 9, 1590–1615. 10.1109/ACCESS.2020.3045792 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [136].Shu Sara and Woo Benjamin KP. 2021. Use of technology and social media in dementia care: Current and future directions. World Journal of Psychiatry 11, 4, 109–123. 10.5498/wjp.v11.i4.109 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [137].Smuha Nathalie A.. 2021. Beyond the Individual: Governing AI’s Societal Harm. Social Science Research Network, Rochester, NY. Retrieved February 11, 2022 from https://papers.ssrn.com/abstract=3941956 [Google Scholar]
  • [138].Steventon Adam, Bardsley Martin, Billings John, Dixon Jennifer, Doll Helen, Beynon Michelle, Hirani Shashi, Cartwright Martin, Rixon Lorna, Knapp Martin, Henderson Catherine, Rogers Anne, Hendy Jane, Fitzpatrick Ray, and Newman Stanton. 2013. Effect of telecare on use of health and social care services: findings from the Whole Systems Demonstrator cluster randomised trial. Age and Ageing 42, 4, 501–508. 10.1093/ageing/aft008 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [139].Stypińska Justyna. 2021. Ageism in AI: New forms of age discrimination in the era of algorithms and artificial intelligence. Retrieved February 11, 2022 from https://eudl.eu/doi/10.4108/eai.20-11-2021.2314200 [Google Scholar]
  • [140].Thordardottir Björg, Fänge Agneta Malmgren, Lethin Connie, Gatta Danae Rodriguez, and Chiatti Carlos. 2019. Acceptance and use of innovative assistive technologies among people with cognitive impairment and their caregivers: A systematic review. BioMed Research International 2019, e9196729. 10.1155/2019/9196729 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [141].Trinidad M. Grace, Platt Jodyn, and Kardia Sharon L. R.. 2020. The public’s comfort with sharing health data with third-party commercial companies. Humanities and Social Sciences Communications 7, 1, 1–10. 10.1057/s41599-020-00641-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [142].U.S. Bureau of the Census. 2019. Retrieved February 11, 2022 from https://www.census.gov/programs-surveys/acs/news/updates/2019.html
  • [143].Venkatesh Viswanath and Bala Hillol. 2008. Technology acceptance model 3 and a research agenda on interventions. Decision Sciences 39, 2, 273–315. 10.1111/j.1540-5915.2008.00192.x [DOI] [Google Scholar]
  • [144].Vines John, Blythe Mark, Dunphy Paul, and Monk Andrew. 2011. Eighty something: banking for the older old. In Proceedings of HCI 2011 The 25th BCS Conference on Human Computer Interaction 25, 64–73. [Google Scholar]
  • [145].Vines John, Lindsay Stephen, Pritchard Gary W., Lie Mabel, Greathead David, Olivier Patrick, and Brittain Katie. 2013. Making family care work: dependence, privacy and remote home monitoring telecare systems. In Proceedings of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp '13), 607–616. 10.1145/2493432.2493469 [DOI] [Google Scholar]
  • [146].Vines John, Pritchard Gary, Wright Peter, Olivier Patrick, and Brittain Katie. 2015. An age-old problem: Examining the discourses of ageing in HCI and strategies for future research. ACM Transactions on Computer-Human Interaction 22, 1, 1–27. 10.1145/2696867 [DOI] [Google Scholar]
  • [147].Vlachos George S., Cosentino Stephanie, Kosmidis Mary H., Anastasiou Costas A., Yannakoulia Mary, Dardiotis Efthimios, Hadjigeorgiou Georgios, Sakka Paraskevi, Ntanasi Eva, and Scarmeas Nikolaos. 2019. Prevalence and determinants of subjective cognitive decline in a representative Greek elderly population. International Journal of Geriatric Psychiatry 34, 6, 846–854. 10.1002/gps.5073 [DOI] [PubMed] [Google Scholar]
  • [148].Wan Lin, Müller Claudia, Wulf Volker, and Randall David William. 2014. Addressing the subtleties in dementia care: pre-study & evaluation of a GPS monitoring system. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '14), 3987–3996. 10.1145/2556288.2557307 [DOI] [Google Scholar]
  • [149].Wang Shengzhi, Bolling Khalisa, Mao Wenlin, Reichstadt Jennifer, Jeste Dilip, Kim Ho-Cheol, and Nebeker Camille. 2019. Technology to support aging in place: Older adults’ perspectives. Healthcare 7, 2, 60. 10.3390/healthcare7020060 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [150].Waycott Jenny, Vetere Frank, Pedell Sonja, Morgans Amee, Ozanne Elizabeth, and Kulik Lars. 2016. Not for me: Older adults choosing not to participate in a social isolation intervention. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI '16), 745–757. 10.1145/2858036.2858458 [DOI] [Google Scholar]
  • [151].Wetsman Nicole. 2021. Apple’s new health features bring new focus to elder care technology. The Verge. Retrieved February 11, 2022 from https://www.theverge.com/2021/6/10/22527707/apple-health-data-eldery-falls-walking-privacy [Google Scholar]
  • [152].Whitlatch Carol J., Feinberg Lynn Friss, and Tucke Shandra S.. 2005. Measuring the values and preferences for everyday care of persons with cognitive impairment and their family caregivers. The Gerontologist 45, 3, 370–380. 10.1093/geront/45.3.370 [DOI] [PubMed] [Google Scholar]
  • [153].Wild Katherine V., Mattek Nora C., Maxwell Shoshana A., Dodge Hiroko H., Jimison Holly B., and Kaye Jeffrey A.. 2012. Computer-related self-efficacy and anxiety in older adults with and without mild cognitive impairment. Alzheimer’s & Dementia 8, 6, 544–552. 10.1016/j.jalz.2011.12.008 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [154].Woolson RF. 2008. Wilcoxon signed-rank test. In Wiley Encyclopedia of Clinical Trials. American Cancer Society, 1–3. 10.1002/9780471462422.eoct979 [DOI] [Google Scholar]
  • [155].Yusif Salifu, Soar Jeffrey, and Hafeez-Baig Abdul. 2016. Older people, assistive technologies, and the barriers to adoption: A systematic review. International Journal of Medical Informatics 94, 112–116. 10.1016/j.ijmedinf.2016.07.004 [DOI] [PubMed] [Google Scholar]
  • [156].Zeissig Eva-Maria, Lidynia Chantal, Vervier Luisa, Gadeib Andera, and Ziefle Martina. 2017. Online privacy perceptions of older adults. In Human Aspects of IT for the Aged Population. Applications, Services and Contexts (Lecture Notes in Computer Science), 181–200. 10.1007/978-3-319-58536-9_16 [DOI] [Google Scholar]
  • [157].Zwijsen Sandra A., Niemeijer Alistair R., and Hertogh Cees M.P.M.. 2011. Ethics of using assistive technology in the care for community-dwelling elderly people: An overview of the literature. Aging & Mental Health 15, 4, 419–427. 10.1080/13607863.2010.543662 [DOI] [PubMed] [Google Scholar]

Supplementary Materials

Appendices
