Abstract
Introduction
Socially assistive robots are devices designed to aid users through social interaction and companionship. Social robots promise to support cognitive health and aging in place for older adults with and without dementia, as well as their care partners. However, while new and more advanced social robots are entering the commercial market, major barriers to their adoption remain, including a lack of emotional alignment between users and their robots. Affect Control Theory (ACT) is a framework that allows for the computational modeling of emotional alignment between two partners.
Methods
We conducted a Canadian online survey capturing attitudes, emotions, and perspectives surrounding pet-like robots among older adults (n = 171), care partners (n = 28), and persons living with dementia (n = 7).
Results
We demonstrate the potential of ACT to model the emotional relationship between older adult users and three exemplar robots. We also capture a rich description of participants’ robot attitudes through the lens of the Technology Acceptance Model, as well as the most important ethical concerns around social robot use.
Conclusions
Findings from this work will support the development of emotionally aligned, user-centered robots for older adults, care partners, and people living with dementia.
Keywords: Social robotics, pet-like robots, dementia, care partners, older adults, affect control theory, emotional alignment
Background
Life expectancy in industrialized countries is projected to increase over the next 10 years, resulting in growth in the population of adults over the age of 65.1 Identifying and addressing the needs of this growing population is imperative to improve quality of life and promote healthy aging. In addition to the maintenance of physical health, two integral components of healthy aging are feelings of autonomy and independence,2,3 and social and emotional support.4–6 This is equally true for the 55 million people living with dementia worldwide, a figure projected to rise to 139 million by 2050.7
Socially assistive robots are robots designed to aid users through social interaction and have shown potential as a tool to support the needs of older adults.8,9 Social robots can assist with instrumental activities of daily living, for example by providing reminders to take medication,8,10,11 and can provide social companionship.2,12,13 A large area of innovation in the field of social robotics has been the development of robots with pet-like appearances and/or qualities. This push may be driven by established research showing that pet-like social robots can provide significant emotional support to both healthy older adults and older adults living with dementia.14 Studies have found that people treat pet-like social robots much as they would real pet animals,15 reporting social companionship with and attachment to social robots similar to what they experience with animal pets. In light of the potential benefits of these socially assistive technologies, developers have created robots mimicking dogs, cats, furry seals or entirely imagined animals.2,16–19
However, even as newer and more advanced social robots enter the commercial market20 and feelings towards social robots among potential users are generally positive,21 major barriers to the adoption of these technologies remain.11,22–24 A critical barrier is the lack of inclusion of older adult users in the design of social robots, and the resulting disconnect between the format and functionality of social robot devices and the specific needs and values of older adults.25
Theoretical framework
To better understand older adults’ perspectives as potential end users of social robots, we adopted the sociotechnical perspective as our overarching framework (Figure 1). This approach, which emerged from the computer sciences, holds that a piece of technology is best understood not only as a physical object but also as something that is socially constructed: the relationships between the user, the object, their values, and their society all come together to shape the impact of a device.26 It proposes that the qualities of the robot partner are not static but instead depend on the abilities, beliefs, and goals of the user.
Figure 1.
Relationship between frameworks informing the study. The Sociotechnical perspective considers the interaction between user and technology to be taking place in a societal context (green). The Almere model is a subset of the Technology Acceptance Model – Almere constructs characterize the user. Finally, Affect Control Theory models the relationship between user and technology in a broader societal context.
To identify potential barriers for older adults as users of social robots, we rely on the Technology Acceptance Model (TAM).27 The TAM was initially developed as a tool to evaluate employees’ attitudes towards the email system in the workplace and predict their adoption of the technology. It has since been applied to other fields and has been deemed an effective model of technology acceptance in the healthcare setting.28 This model proposes two main factors that shape an individual’s attitude towards a novel technology: perceived usefulness (PU) and perceived ease of use (PEU). Heerink and colleagues have proposed a TAM model specific to assistive social robots for older adults – the Almere model – with twelve constructs, which inform the construction of our measurement instruments.29,30
Affect control theory
A focus of the present work is to improve the emotional alignment between older adults and robotic devices, the lack of which has been proposed as a major adoption barrier.11,12,31–33 Emotional alignment is a state where all partners in an interaction have congruent emotional interpretations of the situation. Such congruence is the foundation of social and emotional connectedness.34 Emotional capabilities are considered desirable features in social robotics and there have been many efforts to equip robots with the ability to understand and display emotion.31,35–37 However, no robust computational model of emotional alignment has been deployed in this field. Here, we explore a cross-disciplinary computational model of emotional alignment called “Affect Control Theory” as a potential solution to enhance emotional alignment between older adult end-users and social robots.
Affect Control Theory (ACT) is a socio-psychological theory of human interaction in which affect plays a central role in decision-making and behaviour.38 It predicts and prescribes behaviour that minimizes unlikeliness and incoherence for more emotionally aligned interaction. This is based on congruence between a set of culturally shared sentiments and transient situational impressions of social situations, defined in terms of socio-cultural identities (e.g. a doctor, patient), behaviours (e.g. medicate, obey), and contexts (e.g. hospital, home). Emotions are assessed on three bipolar dimensions: Evaluation (ranging from bad to good), Potency (ranging from weak to strong) and Activity (ranging from inactive to active), together called EPA. For example, a doctor identity is generally considered more powerful and active than a patient identity, although both are considered good. There can be variations in the sentiments associated with different identities across individuals and cultures.
Affect control theorists have compiled datasets of a few thousand words along with average EPA ratings obtained from survey participants who are knowledgeable about their culture.39 For example, most English speakers agree that doctors and patients are both good (E), but that doctors are more powerful (P) and more active (A). The corresponding EPA profiles are {1.9, 0.69, 0.05} for doctor and {0.9, −0.69, −1.05} for patient.40 The values range by convention from −4.3 to +4.3.39 In general, within-culture agreement about EPA meanings of social concepts is high even across subgroups of society, and cultural-average EPA ratings from as few as a few dozen survey participants have been shown to be extremely stable over extended periods of time.39
In ACT, a simple interaction event is defined in the terms Actor-Behaviour-Object (A-B-O), where Actor and Object are the interactants (settings are left out here for simplicity). According to ACT, when an individual interacts with another agent with a particular identity, a set of “transient impressions” about that interaction is formed. Impressions are estimated with a non-linear model that combines fundamental sentiments according to known socio-psychological constructs (e.g. a good person doing something bad to a good person will leave a bad impression of the acting person). These impressions may be congruent or incongruent with the individual’s fundamental (out-of-context) sentiments about the other’s identity and behaviour prior to the interaction. This incongruency is termed “deflection” or “incoherence” – a measure of how strongly the interaction diverges from expectations. Incoherence is defined as the Euclidean distance between culturally shared out-of-context fundamental sentiments and the in-context impressions of the current interaction. An interaction with larger incoherence is more unlikely from a socio-cultural perspective. For example, the statement “the doctor yells at the patient” would have a larger incoherence than the statement “the doctor diagnoses the patient.” ACT predicts that people will act in ways that minimize deflection and maintain congruency between fundamental sentiments and transient impressions. In sum, ACT is a theory of emotion that predicts and prescribes behaviour that minimizes unlikeliness and incoherence for better social interaction.
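To make the deflection idea concrete, the short Python sketch below computes a toy deflection as a weighted sum of squared differences between a fundamental A-B-O sentiment vector and a transient impression vector. The doctor and patient EPA values are taken from the example above; the behaviour EPA values, the transient impressions, and the weights are hypothetical placeholders (the real transient impressions come from ACT’s impression-formation equations, which are not reproduced here).

```python
# Illustrative sketch only: toy EPA vectors, not the published impression-formation model.
import numpy as np

# Fundamental (out-of-context) sentiments for an Actor-Behaviour-Object event,
# each as an Evaluation-Potency-Activity triple.
fundamental = np.array([
    1.9, 0.69, 0.05,    # actor: "doctor" (EPA from the example above)
    1.5, 1.0, 0.5,      # behaviour: e.g. "diagnose" (hypothetical EPA)
    0.9, -0.69, -1.05,  # object: "patient" (EPA from the example above)
])

# Transient (in-context) impressions after the event, as would be produced by
# ACT's impression-formation equations; hypothetical numbers for illustration.
transient = np.array([
    1.4, 0.9, 0.2,
    1.2, 0.8, 0.4,
    1.0, -0.5, -0.9,
])

# Deflection: weighted sum of squared differences between the two vectors.
weights = np.ones_like(fundamental)  # equal weights in this sketch
deflection = float(np.sum(weights * (fundamental - transient) ** 2))
print(f"Deflection = {deflection:.2f}")  # larger values indicate a more surprising event
```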
The computational nature of ACT allows for the emotional states of the user and the socially assistive robot to be interpreted using a single, shared framework.41 ACT allows a human-robot interaction to be modeled in this multidimensional “emotion space,” and this information can then be used to dictate the robot’s future behaviour in such a way as to maximize emotional alignment with the user.
Study objectives
Our goal in the present work was to develop an understanding of attitudes and emotions of three groups of participants – healthy older adults, older adults living with dementia, and care partners of older adults living with dementia – in response to three commercially available pet-like social robots, with a particular focus on emotional alignment between user and robot.
Robots included
Three commercially available robots were presented in this study to highlight the diversity in the appearance and functionality of pet-like social robots: Sony’s AIBO,18 Hasbro’s Joy for All Cat, also called JustoCat,16 and MiRo-E.19 AIBO is a dog-like social robot that follows the developmental arc of a typical dog. The visible AIBO shell is entirely made of plastic, and the robot can learn tricks and respond to commands by making movements and sounds. JustoCat is a cat-like social robot with realistic-looking fur and facial expressions. It can respond to petting by meowing and changing its facial expressions. MiRo-E does not resemble any one animal but has features from many different animals. MiRo-E has a control system modelled on orienting systems in the mammalian brain and can respond to novel situations through movement and sound. In the online survey, participants were presented with still images of these three devices as well as a few short bullet points about each one.
Methods
This work was conducted in accordance with the Declaration of Helsinki and was approved by the University of British Columbia Behavioural Research Ethics Board (approval number H19-03308). We developed an online survey that introduced respondents to three existing social robots (Figure 2) and that captured their perspective on and emotional alignment with these robots, as well as the idea of social robots more generally, through a combination of standardized scale items and open-ended questions. Survey development followed five stages: (1) consultation with our older adult advisory to identify priority areas for the survey (ethical concerns, emotional alignment); (2) review of existing scales to measure attitudes towards robots and selection of scales and constructs (TAM, PANAS); (3) development of an early survey draft; (4) survey pilot with the research team and initial refinement; (5) survey pilot with our older adult advisory and final refinement.
Figure 2.
Robots used in the study. (a) AIBO; (b) Joy for All Cat; (c) MiRo-E.
The survey was distributed via print posters, e-newsletters, paid ads, and social media posts (Facebook and Twitter) in partnership with multiple venues engaging older adults living in Canada. These included seniors’ centres, community centres, a brain research centre, caregiver support organizations, and continuing education non-profits across the province. Two of the authors included invitations to participate when giving public talks about aging technology for older adult lay audiences. We also invited participants from the research recruitment platform REACH BC, an initiative by health authorities and universities in the province of British Columbia, Canada, and information was also circulated by the Alzheimer’s Society of BC and by AGE-WELL NCE, a Canadian technology and aging network. Participants provided written informed consent before accessing the online survey.
Data analysis
Four types of data were collected from the online survey: demographic data, ratings of experiences with and barriers to social robots (some derived from the TAM and others open-ended), a measure of the emotional impact of reading about the robots, and ratings of the self and the robots on ACT constructs for the emotional alignment modeling.
Experiences and barriers
After participants read about the three robots (MiRo, JustoCat, and AIBO), they were asked to rate the robots on a subset of scale items from the Almere model – perceived enjoyment, perceived usefulness, social presence, intention to use, attitude, and perceived sociability – to capture participants’ general attitudes towards the three pet-like social robots. Additional questions about robot movement and ethical concerns surrounding robot use were also posed. Open-ended questions were also used to assess opinions of older adults, persons with dementia and caregivers. These responses were analyzed using content analysis. Working together, two coders first identified themes among responses and generated a preliminary coding guide using an emergent coding strategy. Code definitions were established. Then, each coder independently coded 20% of the sample with this initial coding guide. Results were then compared between coders, and the coding guide and its definitions were revised through discussion to remove any ambiguities that were identified. This process was repeated until an inter-rater reliability of 80% or higher was achieved between coders, at which point a single coder coded the entire corpus.
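As an illustration of the reliability check described above, the sketch below computes simple percent agreement between two coders on a handful of hypothetical coded responses; the code labels are placeholders, not the study’s actual coding data.

```python
# Sketch of a percent-agreement check between two coders, assuming each response
# has been assigned one code by each coder. Labels are hypothetical.
coder_a = ["companionship", "tasks", "safety", "companionship", "interaction"]
coder_b = ["companionship", "tasks", "companionship", "companionship", "interaction"]

matches = sum(a == b for a, b in zip(coder_a, coder_b))
percent_agreement = 100 * matches / len(coder_a)
print(f"Inter-rater agreement: {percent_agreement:.0f}%")  # refine the coding guide until >= 80%
```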
Ethical concerns
Although scholars are generally enthusiastic about the use of socially assistive robots in the care setting, ethical concerns are still raised and discussed.3,24,42,43 Some common concerns identified in the literature centre around: (i) stigma of use, (ii) safety and privacy concerns, (iii) perceptions of the robot, and (iv) changes in care-providing patterns due to robot use. We expanded these ethical concerns into eight specific statements. Respondents were asked to rank these from most to least concerning.
Emotional impact
Participants were asked to complete the Positive and Negative Affect Schedule (PANAS), a 20-item instrument in which respondents rate the extent to which they feel each of 20 emotion words (e.g. “Interested”); here, the prompt read “After reading about the three robots, to what extent did you feel each of the words listed below.” The PANAS is a validated measure of affect44 that is extensively used in clinical and cognitive psychology. Numerous studies have established that pet-like social robots can alter the affective state of the end-user,2,3 which can be captured by the PANAS.
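As an illustration of how PANAS responses are typically scored, the sketch below sums hypothetical item ratings into positive and negative subscale scores; the abbreviated item lists and ratings are placeholders, and the full instrument uses 10 items per subscale rated from 1 to 5.

```python
# Sketch of PANAS subscale scoring, assuming each item is rated 1-5.
# The item-to-subscale mapping shown here is abbreviated for illustration.
positive_items = ["interested", "excited", "enthusiastic"]   # the full subscale has 10 items
negative_items = ["distressed", "upset", "nervous"]          # the full subscale has 10 items

ratings = {"interested": 4, "excited": 3, "enthusiastic": 4,
           "distressed": 1, "upset": 2, "nervous": 1}  # hypothetical responses

positive_score = sum(ratings[item] for item in positive_items)
negative_score = sum(ratings[item] for item in negative_items)
print(positive_score, negative_score)  # full-scale subscores range from 10 to 50
```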
Affect control theory modelling
Participants were also asked to rate the three robots, a hypothetical ideal pet-like robot, and themselves on a scale from −10 to +10 on the three ACT dimensions: Evaluation (bad or awful to good or nice), Potency (powerless or weak to powerful or strong), and Activity (inactive or passive to active or lively). These ratings were scaled to be consistent with EPA standards (−4.3 to 4.3).39 We then analysed two values computed from these ratings: the EPA distance between the self and the robot, and the deflection between fundamental sentiments and transient impressions from the rater’s point of view for the statement “the robot assists the rater [me].”
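Before turning to those two values, a minimal sketch of the rescaling step is shown below, assuming a simple linear mapping from the −10 to +10 survey scale onto the conventional −4.3 to +4.3 EPA range (the exact transformation is not spelled out above, so this is an assumption).

```python
# Minimal sketch of a linear rescaling from the survey's -10..+10 sliders to
# the conventional EPA range of -4.3..+4.3 (assumed linear mapping).
def rescale_to_epa(raw_rating: float) -> float:
    """Map a rating on [-10, 10] onto the EPA range [-4.3, 4.3]."""
    return raw_rating * 4.3 / 10.0

print(rescale_to_epa(10))   # 4.3
print(rescale_to_epa(-5))   # -2.15
```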
First, to calculate the EPA distance for each rater from the identity associated with each robot to their rating of their own identity, we used the formula

$$d(A, O) = \sqrt{(A_e - O_e)^2 + (A_p - O_p)^2 + (A_a - O_a)^2}$$
In this formula, O is the rater’s ratings of themself, A is the rater’s ratings of the robot, and e, p, and a are the three ACT dimensions.45 A larger EPA distance indicates a greater discrepancy between the rater’s identity and their perception of the social robot’s identity.
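For illustration, the sketch below applies this distance formula to one hypothetical pair of self and robot ratings; the EPA values are invented for the example.

```python
# Sketch of the self-robot EPA distance described above: Euclidean distance
# between the rater's self-ratings (O) and their robot ratings (A) on the
# Evaluation, Potency, and Activity dimensions. Values are hypothetical.
import math

self_epa = {"e": 2.0, "p": 1.1, "a": 0.5}    # rater's ratings of themself (O)
robot_epa = {"e": 1.4, "p": -0.3, "a": 1.8}  # rater's ratings of a robot (A)

epa_distance = math.sqrt(sum((robot_epa[d] - self_epa[d]) ** 2 for d in ("e", "p", "a")))
print(f"Self-robot EPA distance: {epa_distance:.2f}")
```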
Second, for the deflection calculation, according to ACT grammar, the fundamental sentiment (represented by an over-bar) for an Actor-Behaviour-Object event is the vector of out-of-context EPA values

$$\bar{f} = \left(\bar{f}_{Ae}, \bar{f}_{Ap}, \bar{f}_{Aa}, \bar{f}_{Be}, \bar{f}_{Bp}, \bar{f}_{Ba}, \bar{f}_{Oe}, \bar{f}_{Op}, \bar{f}_{Oa}\right)$$

and the transient impression (represented by a caret) evoked by an event is the corresponding vector of in-context impressions

$$\hat{\tau} = \left(\hat{\tau}_{Ae}, \hat{\tau}_{Ap}, \hat{\tau}_{Aa}, \hat{\tau}_{Be}, \hat{\tau}_{Bp}, \hat{\tau}_{Ba}, \hat{\tau}_{Oe}, \hat{\tau}_{Op}, \hat{\tau}_{Oa}\right)$$

The weighted sum of squared Euclidean distances between fundamental sentiments and transient impressions is then represented as the total deflection D:

$$D = \sum_{i} w_i \left(\bar{f}_i - \hat{\tau}_i\right)^2$$
We used the ‘Indiana 2002-4’ dataset40 and ACT’s impression formation equations46,47 to calculate the deflection associated with each rater and each robot that would be predicted if the robot (actor) were to engage in the behaviour of ‘assisting’ a person (object), such that the interaction can be worded as “the robot assists the person”.
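A structural sketch of this deflection calculation follows. The real analysis relies on the Indiana 2002-4 dictionary and the empirically estimated impression-formation equations; the placeholder impression_formation() function and all EPA values below are illustrative stand-ins, not those published coefficients or dictionary entries.

```python
# Structural sketch of the deflection computation for "the robot assists the person".
# The impression_formation() function below is a placeholder, not ACT's published
# impression-formation equations (see refs 46, 47).
import numpy as np

def impression_formation(fundamentals: np.ndarray) -> np.ndarray:
    """Placeholder: return in-context impressions for a 9-element A-B-O EPA vector.
    In practice this is a regression model estimated from survey data."""
    damping = np.array([0.9, 0.8, 0.85] * 3)  # hypothetical coefficients
    return damping * fundamentals

robot_epa  = np.array([1.2, 0.4, 1.5])    # actor: participant's rating of the robot (hypothetical)
assist_epa = np.array([1.8, 1.4, 0.8])    # behaviour: "assist" (illustrative, not dictionary values)
person_epa = np.array([2.0, 1.0, 0.3])    # object: participant's rating of themself (hypothetical)

fundamental = np.concatenate([robot_epa, assist_epa, person_epa])
transient = impression_formation(fundamental)
deflection = float(np.sum((fundamental - transient) ** 2))
print(f"Predicted deflection for 'the robot assists me': {deflection:.2f}")
```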
Results
Sample
Research data are available upon request. Prior to data analysis, blank or largely incomplete survey responses were removed following a pre-set cutoff criterion (15% of submissions). Responses from 171 healthy older adults, 28 care partners, and 7 persons living with dementia were included in the final sample (N = 206). The detailed breakdown of the demographic information we collected from participants can be found in Supplementary Table 1.
Our sample covered a wide range of ages. Care partner ages ranged from twenties to seventies, with the largest group in their sixties (32%). Healthy older adults ranged from fifties to nineties, with a fairly equal spread across the fifties to seventies (31, 33, and 29% across those decades). Persons living with dementia were mostly in their seventies (71%), with one individual in their eighties and one in their nineties. Care partners predominantly self-identified as female (82% female, 18% male), as did healthy older adults (73% female, 25% male, 2% preferred not to disclose). Six of seven persons with dementia were male, and one preferred not to disclose this information. The majority of all three groups identified as White or Caucasian (80% of healthy older adults, 82% of care partners, five of seven persons living with dementia). The next-largest demographic group in all cases was Asian/Pacific Islanders (9%, 14%, 14% respectively). Black/African American, Hispanic/Latino, and Indigenous Canadian/Native American respondents each represented less than 2% of the samples.
The majority of respondents across all three groups were located in British Columbia: 87% of healthy older adults, 64% of care partners, and 57% of persons living with dementia. Two care partners and one healthy older adult were located outside of Canada; the remaining respondents lived in Canada but outside of B.C. The largest share of each group held college or university degrees (42% of healthy older adults, 46% of care partners, and 43% of those living with dementia). Also represented were individuals with post-graduate degrees (28%, 28%, and 6% respectively), trade/technical/vocational school experience (9%, 7%, 29%), a high school education or equivalent (12%, 4%, 14%), and less than high school education (2%, 4%, 14%).
Among healthy older adults, the most common living situation was to live alone (43%), followed by living with a spouse/partner (37%), living with a spouse/partner and one or more children (9%), and living with one or more children (5%). Among care partners, 57% reported that they lived with the person for whom they provided care. Most commonly, care partners were a family member (46%) or a spouse/partner (39%). Among persons living with dementia, five of seven lived with a spouse or partner, one lived alone, and one with a sibling.
The majority of care partners (82%) reported that the person for whom they care has received a formal diagnosis of dementia. Care partners reported assisting their care receiver with activities of daily living daily (57%), 4–6 times per week (15%), 2–3 times per week (11%), once a month (4%), or less than monthly (14%). When asked what types of professional help they received, 60% of healthy older adults and 43% of persons living with dementia indicated that they received medical treatment from a General Practitioner (GP). Other types of help received included medical specialists (26%, 29%), house cleaning or cooking (9%, 14%), and other types of help (6%, 14%). Some reported receiving no professional help (35%, 29%), and healthy older adults also reported receiving help in the form of counselling or therapy (11%) and occupational or physical therapy (8%).
Around half of the sample had been the primary caretaker for a pet in their lifetime (48% of healthy older adults, 50% of care partners, 57% of persons living with dementia). Other respondents had shared caretaking responsibilities for a pet (36%, 39%, 29%). A minority had no pet experience (16%, 11%, 14%). Of those who had cared for pet(s), those pet(s) were dogs (70%), cats (61%), fish (33%), birds (22%), rodents (19%), and other animals (9%). Pet experiences were almost always (96%) rated as somewhat to extremely positive.
Experiences and barriers
Respondents were asked to rate their agreement with statements about each of the three robots on a 5-point scale (Figure 3). Statements were selected from the Almere scale and the Negative Attitudes towards Robots Scale.29,48 For ease of visualization, the “somewhat agree” and “strongly agree” responses were collapsed as “agree” and responses across the three robots were averaged. Notably, care partners were more likely than the other two groups to report that the robots would be useful, that they would use the robots in the next few days, and that they found the robots fascinating. All three groups agreed more strongly that they would enjoy the robots as opposed to finding them useful. They were also more likely to indicate that the robots would be useful for other people rather than themselves. Less than a quarter of respondents felt that robots would make them nervous or uneasy.
Figure 3.
Ratings of the three socially assistive robots.
Agreement with the nine statements was consistent (within a 10% spread) across the three robots for five of the nine questions. For the four questions that elicited different responses across the robots, respondents were more likely to agree that JustoCat would be useful for others than they were for the other two robots (82% agreement versus 62% for AIBO and MiRo) and that they themselves would find JustoCat enjoyable (59% versus 46% for AIBO and 41% for MiRo). Responses were also widely spread for the items “[robot] reminded me of a live animal” (44%, 32%, and 14% for JustoCat, AIBO, and MiRo respectively) and “I could imagine [robot] to be a living creature” (32%, 24%, 21%).
Respondents were also asked to rate their agreement with statements about the three robots as a group (Figure 4). Again, care partners as a group were the most positive about the robots’ potential, with the majority indicating that it was a good idea to use the robots and that they liked the idea, that they would make life more interesting, and that the robots would be pleasant to interact with. Across all three groups, the level of agreement with statements attributing internal states to the robots (understanding others, having emotions) was low.
Figure 4.
Ratings of the socially assistive robots as a group.
Respondents were also asked about their perceptions of others’ opinions on the robots: family, friends, caregivers/care receivers, and the media. Care partners were very likely (79%) to agree with the statement “my care receiver’s opinions on the robots are important to me,” but were less likely (43%) to feel that their care receiver would actually support use of the robot. Among persons living with dementia, this relationship was reversed, with relatively few (20%) agreeing that their caregivers’ opinions on the robots were important to them but half (50%) reporting that their caregivers would support their use of the robots. Care partners were most likely, and persons with dementia least likely, to indicate that their friends and family would support the use of the robots (care partners: friends 77%, family 80%; healthy older adults: friends 52%, family 51%; persons living with dementia: friends 25%, family 20%). No group strongly agreed that their friends’ or family’s opinions on the robots were important to them (agreement 25–53%), and agreement with the statement “the media’s opinions on the robots are important to me” was weaker still (10–23%).
On the topic of robot movement, persons living with dementia and healthy older adults were roughly twice as likely to prefer mobile to static robots (57% and 56% mobile for persons living with dementia and healthy older adults respectively, versus 29% and 20% static), with the remainder indicating no preference. By contrast, care partners chose the two options with roughly equal frequency (41% mobile, 45% static). Experiences and barriers associated with social robots were also investigated with a series of open-ended questions, including what the robots should and should not do (Box 1).
Box 1. Open-ended responses around experiences and barriers around social robotics. Theme followed by illustrative quote
“What do you think the robots should do?” (>10% of responses each)
Act as a companion or reduce loneliness: “They offer company and a reason to direct love.”
Be responsive and interactive: “An element of interaction that would add color to your day.”
Complete tasks and chores: “Anything that’s helpful for humans.” “Fetch and carry”
Foster safety – monitor the environment, call for help, or warn of danger: “Would be great if they could provide a connection to emergency services if asked to do so.”
“What shouldn’t the robots do?” (>10% of responses each)
Pose a tripping hazard: “Should not be left on the floor where someone can trip over them.”
Make unpleasant sounds: “Loud barks, noises could startle or scare some older adults”
Diminish or replace connection to people or pets: “For some older adults they might replace social relationships and encourage further social isolation.” “They should not be… a substitute for human care and concern.”
When asked why they would or would not use the example robots if they possessed them, top reasons to use included curiosity (e.g. “I am interested to try it out”; 17% of comments) and fun and entertainment (e.g. “JustoCat looks like he is fun to cuddle. I would like that.”; 7%). Common reasons not to use robots were a lack of need or connection with the robots (e.g. “Wouldn’t because it’s somewhat of a toy and I don’t need one.”; 11%) and the artificial appearance of the robots (e.g. “The attempt at reality fails. May as well have a stuffed toy to cuddle.”; “He looks like a robot and that scares me a little.”; 7%).
We also asked who respondents thought could benefit from using the robots. A majority of respondents indicated that the robots could be useful to people who were lonely or would like companionship (e.g. “People living alone, especially seniors without caregivers or family close by.”; 52%). Other answers included individuals living with Alzheimer’s, dementia, or cognitive impairment (e.g. “Patients with a certain type of dementia, possible to go with companions for home visits.”; 25%), people who couldn’t have, would like, or used to have pets (e.g. “Anyone who had pets would benefit. They may be a calming influence, especially to people who owned pets in the past.”; 19%), and older adults (21%), children (11%), and people living in long-term care, care homes, and other supportive housing (10%).
Participants were asked to choose a favorite robot and to explain what they liked about it (Box 2). Reasons for liking included a familiar or realistic look (e.g. “It presents as something close to a real pet.” “I like JustoCat as it looks closer to nature.”; 41%), a cute or expressive appearance (e.g. “Cutest of the lot.”; 19%), the presence of fur (e.g. “Much more tactile inherently with the ‘fur’, which I think would be a necessary feature.”; 16%) and cuddliness and softness (e.g. “I also like that its soft, so that she could interact with it by touch.”; 12%). Finally, they were also asked what they disliked about the example robots, and replies were themed around an artificial look (e.g. “look very mechanical, artificial”; 34%), a creepy or disturbing impact of the robot (e.g. “Cat is a bit creepy with its faux lifelike features.”; 13%), the robots being toy-like or for children (e.g. “Looks like a cheap knock-off of a stuff toy.”; 12%), or having limited capacity (e.g. “I think all of these things are really useless…”; 10%).
Box 2. Open-ended responses around likes and dislikes of the robots in the study. Theme followed by illustrative quote
“What did you like about this robot?” (>10% of responses each)
Realistic, familiar, or animal-like: “Most similar to a living animal.” “Lifelike.”
Cute or expressive: “The eyes are arresting and interested.” “Looks cute just like a real doggo.”
Fur: “…the fur it has makes me want to pet it.” “The feel of the fur is comforting.”
Cuddly, soft, or fluffy: “Looks to be more soft and nice to hold or pat.”
“What are some of the things you dislike about the robots shown?” (>10% of responses each)
Mechanical or artificial: “They look artificial and futuristic.” “Plastic, alien looking, ‘lifeless’.”
Creepy or disturbing impact: “[The robot] is kind of creepy and doesn’t resemble anything real.”
Toy-like or for children: “Just feel they are a glorified toy.”
Limited capacity – lacks functionality or usefulness: “Mostly I dislike their limited capacity for interaction or usefulness.” “Not as many features as I’d like.”
Ethical concerns
We asked participants to rank a set of eight ethical concerns associated with social robotics based on importance. Broadly, the three groups tended to agree in their rankings of the eight concerns. Relative rankings, collapsed across groups, are shown in Figure 5.
Figure 5.
Relative priority of ethical considerations around social robots. Leftmost scores indicate smaller mean rankings, i.e. higher-priority concerns.
Five concerns were largely ranked as most concerning to all three groups: risks to user privacy, a potential reduction in human-human interaction with the introduction of robots, unrealistic expectations of robots, controlling access to data generated by social robots, and the potential for deception. The three considerations that tended to be ranked as less concerning were the burden of caring for the robot, the potential for stigma associated with robot use, and the possibility of becoming “too attached” to the robot.
Emotional impact
The affective impact of reading about the three robots was measured using the PANAS (Figure 6). This scale is subdivided into two subscales measuring the intensity of positive and negative emotions independently. We conducted a one-way between-subjects ANOVA with Group (healthy older adult, care partner, person living with dementia) as the independent variable and positive PANAS score as the dependent variable, and found no effect of group membership (F(2, 170) = 1.1, p = 0.32, η2 = 0.01). However, the same analysis, when conducted with negative PANAS score as the dependent variable, found a main effect of Group (F(2, 177) = 3.6, p = 0.03, η2 = 0.04). Post-hoc tests revealed that persons living with dementia had significantly higher (more negative) scores than the other two groups, albeit based on a small number of data points (pbonf = 0.03-0.04). For context, group means for positive PANAS scores ranged from 21.8 to 25.3, corresponding to roughly the 13th to 21st percentile in a non-clinical adult reference population.49 Group means for negative PANAS scores ranged from 12.0 to 17.9, corresponding to the 28th to 74th percentiles.
Figure 6.
Violin plot of PANAS scores. Data points for individual subjects are shown in black, with density curves for each group in colour.
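To illustrate the type of between-group comparison reported above, a minimal sketch of a one-way between-subjects ANOVA on positive PANAS scores follows; the data frame, column names, and scores are hypothetical, not study data.

```python
# Sketch of a between-groups comparison of PANAS scores, assuming a long-format
# table with hypothetical column names 'group' and 'panas_positive'.
import pandas as pd
from scipy import stats

df = pd.DataFrame({
    "group": ["older_adult", "older_adult", "care_partner", "care_partner", "dementia", "dementia"],
    "panas_positive": [24, 26, 22, 25, 20, 23],  # hypothetical scores
})

samples = [g["panas_positive"].values for _, g in df.groupby("group")]
f_stat, p_value = stats.f_oneway(*samples)  # one-way between-subjects ANOVA
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```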
Affect control theory modeling
To model the emotional relationship between person and robot, we used ACT. First, we created sentiment profiles for each of the three socially assistive pet-like robots (Figure 7). We conducted a repeated-measures ANOVA with Robot (AIBO, JustoCat, MiRo) and Sentiment (Evaluation, Potency, Activity) as within-subjects factors, Group (Care Partner, Healthy Older Adult, Person Living with Dementia) as a between-subjects factor, and sentiment scores as the dependent variable. We found a significant main effect of Sentiment (F(2, 234) = 3.76, p = 0.025, η2 = 0.031), with Evaluation scores (i.e. “goodness”) being the highest-rated among the three sentiments. We also found a Robot x Sentiment interaction (F(4, 468) = 13.65, p < .001, η2 = 0.102); the three robots had significantly different sentiment profiles from one another. There were no other main effects or interactions (all F < 1.8, p > .09, η2 < 0.03); notably, participants did not rate the robots in a way that differed significantly across groups.
Figure 7.
Sentiments associated with the three socially assistive robots. Means plus standard errors are shown.
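As an illustration of how per-robot sentiment profiles such as those in Figure 7 can be assembled, the sketch below averages Evaluation, Potency, and Activity ratings by robot from a long-format table; column names and values are hypothetical.

```python
# Sketch of building per-robot sentiment profiles (mean Evaluation, Potency,
# Activity ratings) from a long-format table with hypothetical columns.
import pandas as pd

ratings = pd.DataFrame({
    "participant": [1, 1, 1, 2, 2, 2],
    "robot": ["AIBO", "JustoCat", "MiRo", "AIBO", "JustoCat", "MiRo"],
    "evaluation": [1.5, 2.0, 1.0, 0.8, 1.7, 0.9],
    "potency":    [0.4, 0.2, 0.6, 0.1, 0.5, 0.3],
    "activity":   [1.8, 0.5, 1.2, 1.5, 0.4, 1.1],
})

profiles = ratings.groupby("robot")[["evaluation", "potency", "activity"]].mean()
print(profiles)  # one EPA profile per robot
```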
Next, we looked at the EPA distances between each rater’s identity (their ratings of themselves on the three EPA dimensions) and their ratings of each robot (Figure 8). We asked: does EPA distance (i.e. the congruency between one’s own identity and one’s perception of the robot) correspond to differences in one’s expectations and intentions around robot use? We performed a series of linear regressions to evaluate whether self-robot EPA distance for a particular robot was a significant predictor of agreement with positive statements about that robot. For example, does a greater congruency between the rater’s own identity and their perception of JustoCat’s identity predict higher agreement with the statement “I think I would find JustoCat enjoyable”? We found that EPA distance significantly predicted agreement with all three statements for all three robots (all p < 0.001). EPA distance between self and robot explained between 4.9% and 27.4% of the variance in agreement with robot statements (all R2 > 0.048, R2 < 0.275). In other words, participants whose identities were similar to those of the robots were more likely to indicate that they would find the robots enjoyable, use them during the next few days, and find them useful than those whose identities were dissimilar. The highest proportions of variance explained were for the three statements “I think I would find AIBO/JustoCat/MiRo enjoyable” (all R2 > 0.164).
Figure 8.
Relationship between responses to three statements about the social robots (I would enjoy…, I would use…, I would find useful…) and respondents’ self-robot EPA distance. Means and standard errors are shown.
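A minimal sketch of one such regression follows, predicting agreement with a positive statement from self-robot EPA distance; the values and variable names are hypothetical, and the published analyses were run separately per robot and per statement.

```python
# Sketch of a simple linear regression of agreement ratings on self-robot EPA distance.
# Data are hypothetical placeholders.
from scipy import stats

epa_distance = [0.5, 1.2, 2.4, 3.1, 4.0, 1.8]   # predictor: self-robot EPA distance
agreement    = [5, 4, 3, 2, 1, 4]               # outcome: 5-point agreement rating

result = stats.linregress(epa_distance, agreement)
print(f"slope = {result.slope:.2f}, R^2 = {result.rvalue**2:.3f}, p = {result.pvalue:.4f}")
```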
Finally, we used ACT to model a hypothetical scenario in which a robot assists a person (Figure 9). “Assists” is an action existing in the ACT dictionary with particular values associated with it, making it possible to calculate the “deflection,” or incongruency between expectations and reality, that a person is predicted to experience when that action takes place between themselves and another agent (provided EPA values are known for both the self and the partner). A small deflection value indicates that a particular action is aligned with what someone would expect from that partner, and a large deflection value indicates that the action is misaligned (i.e. surprising). Using a series of linear regressions, we found that a smaller deflection associated with the action “assists” predicted agreement with the statement “I think that [robot] would be useful for me,” explaining 7.8–15.7% of the variance on this measure (all R2 > 0.077, all p < 0.001).
Figure 9.
Relationship between agreeing that a robot would be useful and ACT deflection for the concept of being assisted by the robot. Means and standard errors shown.
Discussion
In this work, we investigated the perspectives of a sample of older adult users around emotionally aligned, socially assistive robotics. While older adults are often interested in the idea of social robots, barriers remain to their widespread and evidence-based use. At present, studies are typically small and heterogeneous in their methods, often focusing on a single robot. Persons living with dementia in particular are rarely consulted directly, and we theorise that this lack of co-creation may lead to barriers and mismatches between real users’ needs and the products that are available to them. In addition, for social robots to truly be designed to assist users socially, a robust model of emotional alignment between user and robot is needed.
To engage with these issues, we collected data from a large group of potential users of social robotics, including persons living with dementia and their care partners. Using a mixed-methods approach, we captured data on experiences and barriers, ethical concerns, and emotional impacts of pet-like social robots. We also elicited sentiment ratings that allowed us to model user-robot relationships and identities across a number of hypothetical interaction scenarios using Affect Control Theory.
Our respondents were largely neutral-to-positive about the pet-like social robot exemplars they were shown. Few felt that the robots would make them nervous or uneasy, and care partners as a group responded most positively to the idea of using the robots. Broadly, ratings of the three robots overlapped with one another, though participants found the robotic cat to be the most lifelike and most likely to be enjoyed. Care partners prioritized their care receiver’s opinions of robots. High priority uses for robots focused on companionship, interaction, and safety, and tripping over robots was a practical concern of note. Participants shared that robots should not make unpleasant sounds or replace connections between people. Reasoning around robot use was quite pragmatic; curiosity and entertainment were motivators, while a perceived lack of need and the mechanical appearance of robots were reasons not to use them. Respondents identified people experiencing loneliness, persons living with dementia, and people in supportive housing as potential user groups. On the whole, robots were liked for being realistic, cute, cuddly, and furry. They were disliked for looking artificial, creepy, and toy-like and for lacking clear utility. In terms of ethics around robot use, the three groups of respondents rated ethical concerns similarly. The top ethical concerns were user privacy, a potential reduction in human-human interaction with the introduction of robots, unrealistic expectations of robots, a potential inability to control access to data generated by social robots, and the possibility of deception.
Turning to the emotion modeling data, ratings of the three robots produced three sentiment profiles that were quite different from one another, suggesting that participants viewed them as having unique identities. The calculated EPA distance between a rater’s judgments of themself and their judgments of a robot was very strongly linked to an anticipation that the robot would be enjoyable and useful. This is promising evidence that the ACT measures used in this work were able to capture dimensions of participants’ identities that predict their real-world behaviour and experiences with robots. Similarly, when our model indicated that the concept “robot assists person” was highly congruent for a particular respondent and robot, that respondent was more likely to agree that the robot would be useful – another piece of evidence validating ACT as a promising model of human-robot emotional alignment.
Taken together, this work examined perceptions of social robots among a large sample of older adults across a range of lived experiences. Furthermore, we demonstrated that computational modeling of emotional alignment between humans and robots was possible for this type of sample.
We acknowledge the limitations of our approach. First, to achieve a large reach, this survey asked respondents about hypothetical robot interactions. As such, our findings should be interpreted with caution, and real experiences and interactions with the robots in our study may yield different results with regards to attitudes, emotions, and ethical considerations. Looking to the future, we plan to expand on our findings using more fine-grained research methodologies (e.g. workshops, qualitative interviewing) with either video or real-world robot interactions as part of the study design. Second, participants in the survey were likely to have had an interest in social robots prior to participating, leading to selection bias; thus their views may differ from those of potential users we did not sample. Similarly, we heard from a primarily white and well-educated group of respondents, but diversity across racial and socioeconomic lines is critical for future robotics work to be equitable and genuinely co-created with the entire base of potential users. Third, the nature of our recruitment strategy led to a heterogeneous sample, with a majority of respondents identifying as healthy older adults and only seven participants indicating a lived experience of dementia. Although we analyzed the data by group where relevant, this heterogeneous sample constitution remains a limitation on the generalizability of our findings. Finally, it would be valuable to study social robot experiences within families containing both care partners and persons living with dementia in order to understand where those individuals’ perspectives do and do not align.
Despite these limitations, the present work has implications for two dimensions in the advancement of social robotics to support aging. First, it contributes new knowledge about the preferences of older adults for social robots, thereby informing the development of solutions that are more likely to become adopted and, in turn, to benefit end-users. Second, it provides an early validation for the use of Affect Control Theory as a model to develop social robots that are emotionally aligned with end-users. Emotional alignment between technology and end-users is increasingly recognized as a key factor for the adoption and sustained use of devices, but clear models to implement the right type and amount of emotion have been lacking. Here we demonstrate that a computational theory that incorporates identity, values, and cultural context has the potential to successfully model interactions between older adults and social robots.
In conclusion, we found that a convenience sample of older adults holds views around social robotics that are both practical and nuanced. Respondents reported that robots may be useful as sources of companionship and interaction, especially for people who are lonely or experiencing dementia, and were generally curious about the robots and open to their use. However, older adults who answered our survey were concerned about the role of robots expanding in a way that limits connections with other people, and about the potential for robots to introduce physical dangers like tripping. Participants strongly preferred soft, cuddly, furry robots to those that were more artificial-looking. Affect Control Theory proved to be a very promising tool for understanding person-robot emotional relationships computationally. By working directly with older adults to understand their perspectives and experiences with social robots, we can better design these devices to be effective social objects.
Supplemental Material
Supplemental Material for User perspectives on emotionally aligned social robots for older adults and persons living with dementia by Jill A Dosso, Ela Bandari, Aarti Malhotra, Gabriella K Guerra, Jesse Hoey, François Michaud, Tony J Prescott and Julie M Robillard in Journal of Rehabilitation and Assistive Technologies Engineering
Acknowledgements
We are grateful to the League (Lived Experience Expert Group), our older adult end-user advisory, for their insights and advice at all stages of this project. We would like to thank Gabrielle Sunderland for drafting an early version of the survey and Paul Killeen for robot photography used in the survey instrument and for Figure 1.
Declaration of conflicting interests: The author(s) declared the following potential conflicts of interest with respect to the research, authorship, and/or publication of this article: Tony Prescott is a director and shareholder in two UK robotics companies, Consequential Robotics Ltd (CQR) and Cyberselves Ltd. CQR is the developer of the MiRo-E robot. All other authors declare that there are no conflicts of interest.
Funding: This work was supported by funding to JMR from AGE-WELL NCE [AWCAT-2019-135], the Government of Canada’s New Frontiers in Research Fund (NFRF), and the Canadian Consortium on Neurodegeneration in Aging, Ethical, Legal and Social Implications (ELSI) Cross-cutting Program.
Guarantor: JMR
Contributorship: JAD constructed the final survey, led participant recruitment, performed data coding, analysis, and visualization, and wrote the first draft of the manuscript.
EB and JMR researched the literature, conceived the study, and designed and piloted early study materials.
JMR conceptualized the project and supervised the work.
AM and JH performed data analyses related to Affect Control Theory.
FM and TJP provided study materials.
GG performed data coding and checking.
All authors reviewed and edited the manuscript and approved the final version of the manuscript.
Supplemental Material: Supplemental material for this article is available online.
ORCID iD
Jill A Dosso https://orcid.org/0000-0003-1570-9496
References
- 1. Kontis V, Bennett JE, Mathers CD, et al. Future life expectancy in 35 industrialised countries: projections with a Bayesian model ensemble. The Lancet 2017; 389(10076): 1323–1335.
- 2. McGlynn SA, Kemple S, Mitzner TL, et al. Understanding the potential of PARO for healthy older adults. Int J Hum-Comput Stud 2017; 100: 33–47.
- 3. Vandemeulebroucke T, Dierckx de Casterlé B, Gastmans C. The use of care robots in aged care: a systematic review of argument-based ethics literature. Arch Gerontol Geriatr 2018; 74: 15–25.
- 4. Cornwell EY, Waite LJ. Social disconnectedness, perceived isolation, and health among older adults. J Health Soc Behav 2009; 50(1): 31–48.
- 5. Holt-Lunstad J. Why social relationships are important for physical health: a systems approach to understanding and modifying risk and protection. Annu Rev Psychol 2018; 69: 437–458.
- 6. Seeman TE, Crimmins E. Social Environment Effects on Health and Aging: Integrating Epidemiologic and Demographic Approaches and Perspectives. Washington, DC, US: Georgetown University, New York Academy of Sciences, 2001.
- 7. World Health Organization. Dementia Fact Sheet [Internet]. 2021 September. Available from: https://www.who.int/news-room/fact-sheets/detail/dementia (accessed 22 September 2021).
- 8. Broadbent E. Interactions with robots: the truths we reveal about ourselves. Annu Rev Psychol 2017; 68: 627–652.
- 9. Feil-Seifer D, Mataric MJ. Defining socially assistive robotics. In: 9th International Conference on Rehabilitation Robotics, ICORR, Chicago, USA, June 28–July 1 2005, pp. 465–468.
- 10. Broadbent E, Peri K, Kerse N, et al. Robots in older people’s homes to improve medication adherence and quality of life: a randomised cross-over trial. In: International Conference on Social Robotics. Springer, Sydney, Australia, 27–29 October 2014, pp. 64–73.
- 11. Pu L, Moyle W, Jones C, et al. The effectiveness of social robots for older adults: a systematic review and meta-analysis of randomized controlled studies. The Gerontologist 2019; 59(1): e37–51.
- 12. Prescott TJ, Robillard JM. Are friends electric? The benefits and risks of human-robot relationships. iScience 2021; 24(1): 101993.
- 13. Robinson H, MacDonald B, Broadbent E. The role of healthcare robots for older people at home: a review. Int J Soc Robot 2014; 6(4): 575–591.
- 14. Banks MR, Willoughby LM, Banks WA. Animal-assisted therapy and loneliness in nursing homes: use of robotic versus living dogs. J Am Med Dir Assoc 2008; 9(3): 173–177.
- 15. Barber O, Somogyi E, McBride AE, et al. Children’s evaluations of a therapy dog and biomimetic robot: influences of animistic beliefs and social interaction. Int J Soc Robot 2021; 13(6): 1411–1425.
- 16. Brecher DB. Use of a robotic cat to treat terminal restlessness: a case study. J Palliat Med 2020; 23(3): 432–434.
- 17. Moyle W, Jones C, Sung B, et al. What effect does an animal robot called CuDDler have on the engagement and emotional response of older people with dementia? A pilot feasibility study. Int J Soc Robot 2016; 8(1): 145–156.
- 18. Pransky J. AIBO–the No. 1 selling service robot. Ind Robot Int J 2001; 28(1): 24–26.
- 19. Prescott TJ, Mitchinson B, Conran S. Miro: An animal-like companion robot with a biomimetic brain-based control system. In: Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, Vienna, Austria, 6–9 March 2017, pp. 50–51.
- 20. Lipson H. Robots on the run. Nature 2019; 568(7751): 174–175.
- 21. Naneva S, Sarda Gou M, Webb TL, et al. A systematic review of attitudes, anxiety, acceptance, and trust towards social robots. Int J Soc Robot 2020; 12: 1179–1201.
- 22. Aguiar Noury G, Walmsley A, Jones RB, et al. The barriers of the assistive robotics market—What inhibits health innovation? Sensors 2021; 21(9): 3111.
- 23. Hung L, Liu C, Woldum E, et al. The benefits of and barriers to using a social robot PARO in care settings: a scoping review. BMC Geriatr 2019; 19(1): 232.
- 24. Pino M, Boulay M, Jouen F, et al. “Are we ready for robots that care for us?” Attitudes and opinions of older adults toward socially assistive robots. Front Aging Neurosci 2015; 7: 141.
- 25. Ghafurian M, Hoey J, Dautenhahn K. Social robots for the care of persons with dementia: a systematic review. ACM Trans Hum-Robot Interact 2021; 10(4): 4131–4141.
- 26. Bostrom RP, Heinen JS. MIS problems and failures: A socio-technical perspective. Part I: The causes. MIS Q 1977: 17–32.
- 27. Davis FD. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q 1989; 13: 319–340.
- 28. Holden RJ, Karsh BT. The technology acceptance model: its past and its future in health care. J Biomed Inform 2010; 43(1): 159–172.
- 29. Heerink M, Kröse B, Evers V, et al. Assessing acceptance of assistive social agent technology by older adults: the Almere model. Int J Soc Robot 2010; 2(4): 361–375.
- 30. Heerink M, Kröse B, Evers V, et al. Measuring acceptance of an assistive social robot: a suggested toolkit. In: 18th IEEE International Symposium on Robot and Human Interactive Communication, Toyama, Japan, September 27–October 2 2009, pp. 528–533.
- 31. Damm O, Malchus K, Hegel F, et al. A computational model of emotional alignment. In: 5th Workshop on Emotion and Computing, Berlin, Germany, 4 September 2011.
- 32. Robillard JM, Cleland I, Hoey J, et al. Ethical adoption: a new imperative in the development of technology for dementia. Alzheimers Dement 2018; 14(9): 1104–1113.
- 33. Robillard JM, Kabacińska K. Realizing the potential of robotics for aged care through co-creation. J Alzheimers Dis 2020; 76(2): 461–466.
- 34. Townsend KC, McWhirter BT. Connectedness: a review of the literature with implications for counseling, assessment, and research. J Couns Dev 2005; 83(2): 191–201.
- 35. Collins EC, Prescott TJ, Mitchinson B. Saying it with light: a pilot study of affective communication using the MIRO robot. In: Wilson SP, Verschure PFMJ, Mura A (eds) Biomimetic and Biohybrid Systems. Lecture Notes in Computer Science, 9222. Cham: Springer International Publishing, 2015, pp. 243–255. Available from: http://link.springer.com/10.1007/978-3-319-22979-9_25 (accessed 16 September 2020).
- 36. Jung MF. Affective grounding in human-robot interaction. In: 12th ACM/IEEE International Conference on Human-Robot Interaction (HRI). IEEE, Vienna, Austria, 6–9 March 2017, pp. 263–273.
- 37. Liu Z, Wu M, Cao W, et al. A facial expression emotion recognition based human-robot interaction system. IEEE/CAA J Autom Sin 2017; 4(4): 668–676.
- 38. Heise DR. Affect control theory: concepts and model. J Math Sociol 1987; 13(1–2): 1–33.
- 39. Heise DR. Surveying Cultures: Discovering Shared Conceptions and Sentiments. USA: John Wiley & Sons, 2010.
- 40. Francis C, Heise DR. Mean Affective Ratings of 1,500 Concepts by Indiana University Undergraduates in 2002-3 [Computer File]. Distributed at Affect Control Theory Website, Program Interact, 2006. Available from: http://www.indiana.edu/∼socpsy/ACT/interact/JavaInteract.html
- 41. Robillard JM, Hoey J. Emotion and motivation in cognitive assistive technologies for dementia. Computer 2018; 51(3): 24–34.
- 42. Al-Saif S. Animal healthcare robots: the case for privacy regulation. Wash JL Tech Arts 2018; 14: 77.
- 43. Leong TW, Johnston B. Co-design and robots: a case study of a robot dog for aging people. In: International Conference on Social Robotics. Springer, 2016, pp. 702–711.
- 44. Watson D, Clark LA, Tellegen A. Development and validation of brief measures of positive and negative affect: the PANAS scales. J Pers Soc Psychol 1988; 54(6): 1063–1070.
- 45. Ghafurian M, Hoey J, Tchorni D, et al. Emotional alignment between older adults and online personalities: implications for assistive technologies. In: Proceedings of the 14th EAI International Conference on Pervasive Computing Technologies for Healthcare, Online, 6–8 October 2020, pp. 296–304.
- 46. Heise DR. Expressive Order: Confirming Sentiments in Social Actions. Germany: Springer Science & Business Media, 2007.
- 47. Malhotra A, Stewart TC, Hoey J. A biologically-inspired neural implementation of affect control theory. In: 18th International Conference on Cognitive Modeling, Online, 20–31 July 2020.
- 48. Nomura T, Suzuki T, Kanda T, et al. Measurement of anxiety toward robots. In: 15th IEEE International Symposium on Robot and Human Interactive Communication, Hatfield, UK, 6–8 September 2006, pp. 372–377.
- 49. Crawford JR, Henry JD. The positive and negative affect schedule (PANAS): construct validity, measurement properties and normative data in a large non-clinical sample. Br J Clin Psychol 2004; 43(3): 245–265.