Abstract
Research into companion robots for older adults, including those who are socially isolated and lonely, continues to grow. Although some insight into older adults’ preferences for various robotic types and functionality is emerging, we lack research examining how these robots fulfil or challenge a range of values and aspirations individuals have in later life. This study examines the attitudes and perspectives of 16 older adults (aged 65+) living independently but alone in their own homes, who were interviewed and shown videos depicting three distinctive companion robots: a talking assistant; a roving toylike vehicle; and a robotic dog. This approach illuminated values, preferences, and needs amongst older people that are vital for understanding the potential of companion robots. In comparing the robots, participants expressed concerns about the impact of different companion robots on their abilities and skills, their sense of autonomy and control over their lives, and the maintenance of several kinds of dignity. These results inform user-centered design and use of companion robots for older people living alone and independently.
Keywords: Companion robots, social robots, older adults, dignity, autonomy, social isolation, ethics
1. INTRODUCTION
Research into companion robots for older adults continues to grow [72], often driven by concerns about ageing populations, caregiver shortages, and social isolation and loneliness [4,25,26,77,92]. Human-robot interaction research suggests that companion robots may benefit older people by improving health [36] and providing assistance and entertainment [48].
Although there are many different kinds of companion robots available, many HCI studies target a single companion robot [56,93,98] or focus on robots that are more-or-less alike, such as robot animals [48]. Studies on older adults’ responses toward diverse kinds of companion robots are rarer [cf. 30,94]. In this video-based interview study, we explore older adults’ views regarding three distinctive robots: a conversational home assistant, a talking mobile toylike vehicle, and a small fluffy robot dog. Examining responses to these different robots exposed values and preferences relating to assistance and companionship.
We presented short videos of each of these robots to 16 independently living older adults to explore their responses. We refer to these robots as ‘companion robots’ since all three are designed to create affective social relations with users [28]. Companion robots are a kind of ‘socially assistive robot’ [16]. Robot home assistants, like the conversational home assistant included in this study, can simultaneously be ‘service’ robots and companion robots. Our broad aim in this study was to understand the attitudes and responses of relatively self-sufficient, independent-living older adults to different types of companion robots. We first wanted to understand participants’ social preferences in order to better contextualize their responses to companion robots. Against this backdrop, we investigated older adults’ views of the relative advantages and disadvantages of these distinctive robots, and whether – and why or why not – they might want to incorporate such robots into their lives. Participants discerned a range of pros and cons for each robot exemplar. We identified varied attitudes towards features such as toy-likeness, animal-likeness, mobility, assistance, and speech – and individual preferences linked to ‘values’ like control and dignity.
Our study contributes insights into the challenge of designing social and technical systems to support companionship for older people (65+) who live alone. Many older adults live alone and wish to be independent for as long as possible [91], yet most studies focus on companion robots for people in residential aged care [42] and/or with dementia [61]. We show that a one-size-fits-all solution will be difficult to achieve even for the specific group of independent-living older adults because of diversity in people’s preferred forms of interaction. Furthermore, ethical concerns have been raised, frequently in philosophical reflections, about robotic companions. These concerns are often related to replacing human interaction and causing deception [21,84] and effects on older adults’ dignity [17]. In this study, we turn to older adults themselves to understand potential concerns about dignity. We find that individuals view the design of some robots as patronizing in certain ways that can be seen as distinctive threats to dignity. Overall, our study extends and deepens understanding of the possibilities and challenges of human-robot relations for an age group known to exhibit particularly diverse preferences and needs [65].
2. BACKGROUND
2.1. Human and animal company for older people
A growing number of older adults live alone and relatively independently in their own homes [73]. Some countries, such as Australia (where this study was conducted), have policies for ‘ageing in place’, which provide support for people who choose to live in their own homes as they get older [5]. Living alone, however, can sometimes lead to social isolation and loneliness. While loneliness can affect anyone [52], older adults often experience shrinking social support networks and reduced contact with family [64]. Loneliness can harm wellbeing both physically and mentally [36,83]. Recently, social isolation measures during the COVID-19 pandemic have exacerbated loneliness for many older adults [3]. The potential of technology to help address companionship and social connection has been a core element of interventions to tackle loneliness and social isolation in later life. Thus, there is growing interest in understanding the role technology might play in supporting companionship for isolated older adults, including the role of robots and other virtual assistants.
Of course, companionship can be enriching even for those not prone to isolation or loneliness. Indeed, having opportunities for varied companionship, and, moreover, autonomous control over who we associate with, are vital for human wellbeing [75]. This latter aspect – being able to choose who or what to associate with – has received little attention in past work on designing robots for older people and is an important value that emerges in our study. Some people, of course, enjoy the company of animals, and many robotic companions are designed to mimic animal companionship. Pets may reduce older people’s loneliness and improve human health [69], yet they can also pose problems, including health risks and emotional trauma [33]. Companion robots have been pursued as one alternative to animal companions for older people, though robots are limited in their capacity to replace human and animal companionship and can have drawbacks such as cost, limited acceptability to users, and a possible need for ongoing technology updates or support.
2.2. Diversity of companion robot types
Some socially assistive robots are designed for healthcare or for service only, such as lifting, vacuuming, feeding, and monitoring; others are designed to provide companionship [16]. Although ‘companionship’ is complex, it has characteristic features like enjoyment, play, attachment, and psychological satisfaction [51]. Creating pleasure, comfort, and a desire for interaction are also important aims of companion robots. In this study, our working definition of ‘companion robot’, as a kind of socially assistive robot, is that of an autonomous social machine with moving parts that is designed to elicit such companionable and affective responses, even if it also has other functions, like assistance. There are many kinds of companion robots. Humanoid robots have human-like appearance, speech, movement, and/or behavior [12]. Robot assistants can give reminders and may or may not have human-like features [87]. Toylike robots can resemble teddy bears, dolls, automobiles, etc. [15]. And popular zoomorphic robots can look and act like pets, wildlife, and dinosaurs [24]. Understanding which style of robot (if any) older people prefer, and which kinds of robots align with older adults’ companionship needs and preferences, requires extensive investigation.
2.3. Previous work on companion robots for older people
We now review some previous work which relates to our interest in older adults’ values and views on companion robots and the opportunities and challenges they might provide. Some studies show that older people can have positive responses to companion robots [27,35] and that robots might facilitate other valuable social encounters [43]. Robot acceptance may increase after periods of acclimatization [29,31], and robots may also prove more attractive companions than screen-based interactive characters like virtual pets [18,19]. Further, some studies suggest that robot pets may be as effective as living pets at reducing loneliness amongst older people [6]. Dautenhahn et al. [28] found that most participants preferred the idea of a robot as an assistant rather than companion, but that if robots were to be companions, they would prefer them to have human-like communication.
However, concerns about the uncanny valley effect [59], in which humanoid robots may be rejected as creepy, plus the popularity of living pets, have made animal robots a prime focus of study [44]. The baby robot seal PARO, for example, has been extensively studied for people with or without dementia [41,57], while studies of Sony’s popular AIBO suggest that emotional attachment to a robot dog is possible [41]. Some authors suggest that individuals without dementia tend to reject robots that resemble living things as phony [99], but others point to the possible acceptance of pet-like robots by at least some people. Lazar et al. [48] report on focus group responses of independent-living older adults to several pet-type robots that participants physically handled during the discussions. Participants described robot pets as having certain practical advantages over living pets, such as being easier to maintain. At the same time, participants often preferred living pets because they give and require care, and a few found the robots ‘creepy’ and lacking authenticity. However, the authors suggest that if older adults ‘give in to the fiction’ of the artificial animals, this can lead to ‘fulfilment and an emotional connection’ [48:1039] with them.
Nonetheless, robot companions are not always perceived as desirable [15,32]. Some robots, such as the home assistant Jibo, have not fared well [1], and the manufacturer of a robot featured in the present study (Vector) has since stopped production for financial reasons (presumably due to insufficient interest) [82]. Furthermore, philosophers have criticized companion robots. For example, Sparrow argues that insofar as companion robots generate a sense of companionship, they deceive users about their insentient nature [84]. For some thinkers, robot pets like AIBO and PARO could affront the dignity of older people, while for others the threat to dignity is exaggerated or can be managed [10,78,79,89].
A recent study in Israel by Deutsch et al. [30], which used a similar video methodology to our study to explore independent older adults’ responses to six robots – including ElliQ, which we also used; Cozmo, an earlier model of Vector; and a fluffy robot animal (PARO) – found that older adults rejected robots that were not ‘authentic’ and that pretended to be companions, pets, or friends. Like recent research by Zuckerman et al. [99], this study suggests that robots should not have speech or life-like appearances that indicate companionship as their primary role. Rather, the authors argue, the primary function of robots for cognitively intact older people should be activities like assistance, play, and service – companionship should be a secondary function only. Deutsch et al. also found that older adults had strong needs for control and independence in relationships, which may be threatened by proactive assistant robots. Ethicists have discussed the fact that robots might either enhance older adults’ autonomy by decreasing their dependence on others or else diminish autonomy by making decisions for them [79,85].
In this paper, we extend the above work by providing a qualitative analysis of older adults’ views. Our study shows how perceptions of various kinds of dignity, autonomy, and styles of company might be affected differently by alternative companion robots, and it reveals the ways in which individuals can have starkly diverging views on these matters.
3. METHOD
The study was based on a qualitative design, drawing on interviews. It was approved by our University’s human ethics committee. Participants received a plain language statement explaining the aims and procedures of the research and, following the opportunity for discussion, signed an informed consent form.
3.1. Participants
We recruited 16 participants who were 65+ and living alone in an Australian city. A local council helped us identify individuals who were capable of providing informed consent and interested in participating. The council provided minimal home care services (e.g., weekly house-cleaning) to participants, who were all independent and largely self-sufficient. Table 1 provides an overview of the participants, using pseudonyms.
Table 1: Overview of participants

| Participant pseudonym | ID | Age | Gender | Marital Status |
|---|---|---|---|---|
| Jerry | P1 | 78 | Male | Single |
| Craig | P2 | 84 | Male | Widower |
| Louise | P3 | 82 | Female | Widow |
| Carmen | P4 | 82 | Female | Widow |
| Brianne | P5 | 74 | Female | Widow |
| Calvin | P6 | 71 | Male | Single |
| Stephanie | P7 | 74 | Female | Widow |
| Beth | P8 | 80 | Female | Widow |
| Arthur | P9 | 83 | Male | Widower |
| Sarah | P10 | 66 | Female | Divorced |
| Gwen | P11 | 89 | Female | Widow |
| Phoebe | P12 | 79 | Female | Widow |
| Joan | P13 | 77 | Female | Divorced |
| Mary | P14 | 65 | Female | Separated |
| Jill | P15 | 79 | Female | Widow |
| Lisa | P16 | 68 | Female | Single |
3.2. Data collection: Interviews
Working in pairs, the researchers conducted 16 semi-structured interviews with individual participants in their own homes. First, we administered a short sociodemographic questionnaire to collect data such as age and gender. Then, to understand participants’ preferences for companionship, we asked questions aimed at eliciting their attitudes towards human company (e.g., family, former partners, friends, social groups). Next, we showed all participants videos of the three robots in the order set out below, moving from an assistant robot with human-like qualities, to a toy-like robot, and then to an animal-like robot designed to emulate a pet. This order was the same for all participants, as we wanted to understand participants’ views as the discussion progressed from a robot designed primarily for assistance to one designed primarily for companionship.
This order may have introduced bias, but as we were not conducting an experiment, we did not have strong concerns about order effects. It is also common for semi-structured interviews to introduce discussion topics in a fixed order. We used our interview guide to encourage discussion and reflection on each device in turn and to facilitate comparisons between them. Questions were designed to elicit attitudes and preferences regarding the three different robots and their potential value as companions. For example, we asked: What do you think of the three robots? Would you want to use them and why? If not, why not? What would you use these robots for? Which of the three robots would you prefer and why? Do you think they could provide companionship? We took field notes during the interviews to account for body language, paralanguage, and context, ensuring participants’ reactions were more fully captured beyond their verbalizations. Interviews were audio-recorded and transcribed for analysis.
3.3. Three different companion robots
We used videos of each robot that were accessible on YouTube at the time and of a length that allowed them to be played and extensively discussed during a one-hour interview; the videos or video segments we showed were each about one minute long. Although video-based studies have limitations, they have been used in similar studies on robots [30,96]. We chose these videos because they allowed us to efficiently and consistently convey to all 16 participants clear depictions of each robot’s features [94]. While showing participants actual robots would have been useful, we sought to give them illustrations of how the robots can be used by other people, including older people. We told participants that these were videos we found on the internet and that we did not have any personal interest in the robots, but simply wanted to gather their perceptions of and attitudes towards these technologies.
Although many different companion robots exist [24], the ones we selected were or are commercially available and display variety in form, texture, movement, verbality, expressiveness, and interactivity. Moreover, they show variety in the kind of companionship they offer – from a simple pet-like robot that responds to stroking, to a playful toy-like companion, to a more utilitarian virtual assistant that offers advice and reminders. We describe the robots and videos below. It is important to note that all three videos present the robots in a positive fashion, and that the ElliQ and Vector videos have a marketing style, which may have overemphasised the robots’ responsiveness and apparent intelligence. This may well have affected participants’ perceptions and must be factored into the analysis. As such, we have included reflections on the possible impact of the tone and style of the videos when presenting our findings below. Figure 1 shows images of the three robots.
3.3.1. ElliQ
Created by Intuition Robotics, ElliQ is a non-mobile home assistant robot which comes with a separate screen that displays pictures and videos [37]. Unlike popular smart speakers like Google Home or Amazon Alexa, ElliQ has a head that can move around flexibly and lights up in the centre when interacting. Its moving parts, conversation, and female-sounding voice qualify ElliQ as a ‘companion robot’ despite its assistive or service role [13]. ElliQ has several assistive functions, such as suggesting music or social video calls to friends and family and recommending that the older person go for a walk or take their medication. This ability to proactively make suggestions further distinguishes ElliQ from other home assistants [30]. ElliQ can facilitate video calling for the less technologically adept and can learn and take on new tasks based on its interactions with individual people. Of the three robots in this study, ElliQ has the most sophisticated conversational ability and the most human-like presence despite lacking a human appearance. The video we showed to participants has a marketing style outlining a range of functions which ElliQ can perform [38]. In the video [39], an active older woman interacts and talks with ElliQ, which proactively and verbally gives reminders about upcoming events and makes suggestions, such as asking if the user would like to talk with family via Skype. The video shows the woman taking up ElliQ’s suggestions.
3.3.2. Vector
Vector, made by Anki [100], is a small robot which has caterpillar tank treads and can autonomously rove around its environment, altering its behaviour through interaction. It has green blinking eyes and a bulldozer-like ‘arm’ which it uses to express itself (e.g., in a ‘friendly’ greeting) or to interact with objects and people. Vector has a mechanical, shiny, toy-like appearance, makes robot-sounding noises, responds to spoken commands, and, like ElliQ, can go online to answer questions. As with ElliQ, the video we used promotes Vector by showing its possible uses in home settings [2]. The voiceover says Vector can talk and display ‘curiosity’, and claims: ‘You’ve got a robot in your house, waiting for you to get home. Autonomous like a Mars rover, aware enough to avoid falls like a tightrope walker, and always excited to welcome you home, like a loyal companion.’ Vector is shown receiving a ‘fist bump’ from a person and trundling around a home with people in the background. Unlike the other two videos, however, this video did not show robot interaction specifically with older people. Although Vector has the appearance of a toy, the video pitched it to an adult audience rather than explicitly to children. During our study, Anki stopped production of Vector due to financial problems.
3.3.3. Biscuit
Biscuit is a small, non-mobile animatronic dog covered in fur. It has a relatively lifelike appearance and can respond to touch and speech by moving and making dog-like vocalisations. The video we used was a positive BBC news story on the use of Biscuit in an aged care home [7]. In the video, carers speak approvingly of Biscuit’s presence in the home, and older people interact with Biscuit in apparently natural and engaged ways. Residents are seen stroking the robot dog and smiling when it responds to them by autonomously moving its ears and wagging its tail. Unlike ElliQ and Vector, Biscuit has no assistive functions or cloud-based connectivity. It does not converse, retrieve information, proactively issue reminders, or give advice. Rather, it simply behaves and interacts like a friendly, furry dog.
3.4. Data analysis
To answer our research questions, we employed thematic analysis to identify main themes in the qualitative data generated by the interviews [11]. Our approach facilitated insights into individual preferences and responses to the robots, while also enabling a collective analysis comparing group responses to the three robots. Two researchers independently coded the transcripts, then discussed and agreed upon the codes and the themes derived from them. Initial codes included pets, family, independence, and loneliness. As critical discussion proceeded, themes about social preferences, companionship, control, abilities and assistance, and dignity were identified and agreed upon.
4. FINDINGS
Below, we describe our findings that older people have deep needs and preferences in relation to five themes: 1) preferred social arrangements, 2) style of company, 3) control, 4) abilities in the context of receiving assistance, and 5) dignity.
4.1. Preferred Social Arrangements
Prior to discussing the robot videos, participants shared their preferences regarding human company. Participants expressed individual preferences, ranging from a desire for more human contact (expressed by many interviewees) to a preference to largely shun human contact. Many interviewees described the importance of the company of friends and family. Several participants explained that they were not experiencing their desired level of social companionship or at least companionship from particular individuals. Calvin and Beth, for instance, described themselves as very lonely and sorely missing old friends. Some lamented the inability to see their children and grandchildren very often. Many grieved the intimacy shared with lost partners and explained that they missed both physical intimacy and conversation.
Participants who missed deeper companionship with friends and partners found other opportunities for social connections. Beth explained how she would often sit in her wheelchair in her front yard hoping to chat with passersby. Conversing with another, even if not a friend or partner, was important to her. Others, such as Louise, Gwen, and Phoebe, described themselves as less inclined towards human companionship. Gwen said that she had been a loner her entire life: ‘I’m quite a solitary person… I’ve never been one to sort of be for friends. It was just me, my husband, my children.’ Phoebe said of visitors, ‘I just feel they’re squashing the air out of me.’ She explained that,
‘…I don’t have much company, which I don’t mind, I love my own company. I love my own space. If I want to get up and wander around in my PJs [pajamas] I can. If I don’t want to get dressed I can…’
Phoebe appreciated that she could do whatever she wanted when alone – a kind of freedom missing when other people were present. Gwen said that throughout her life she had always avoided adopting a façade: ‘I wouldn’t sort of make an effort to say, ‘Hello, how are you?’, and pretend I’m happy.’ For Gwen, being alone, or only with her husband and children, meant that she did not have to engage in deception [90]. She was very clear that she had no desire to talk to others, and said she was content in her solitude. And while Beth longed for human conversation, she also avoided social clubs because she thought that individuals in groups can be ‘bitchy’.
Although many participants desired company, they still emphasized their desire to choose whether and when to talk to and engage with other people: ‘Because sometimes you think you want company, but then there’s only a certain time limit that you’re prepared to take that company, and then you want to go off and do your own thing’ (Sarah). Beth, to give a different example, was gregarious and liked human company, but only ‘as long as they don’t stay too long’. Too much human company made her ‘get all uptight’. These findings about the preferred social arrangements of interviewees connect with and illuminate some of their responses to the videos of the three different robots, which we explore next.
4.2. Robots and style of company
In reacting to the presented videos and to the forms of companionship the different robots might offer, some participants found the proactive conversation provided by ElliQ appealing. Of the three robots we showed, ElliQ can converse most like a human; Vector has a more robotic voice and less sophisticated conversational ability, and Biscuit does not talk at all. Beth, who reported being isolated and lonely but was wary of some human company, liked ElliQ; she thought ElliQ’s chatter and the way she piped up would likely be comforting in a quiet and lonely household: ‘It breaks the silence of the day.’ For Sarah, the more linguistically sophisticated presence of ElliQ had a positive side, because it was like having a person in the house. In contrast, Gwen and Stephanie disliked this same feature. Stephanie said: ‘But that whole like another person in the house…like, no thanks’. Sarah was more gregarious and liked human company; Stephanie and Gwen much less so. Here we see an example of how ‘preferred social arrangements’ (addressed in the previous section) appeared to shape participants’ responses to different robot types. Another dimension of this theme related to the way in which robots talk, including their proactivity and assertiveness. Craig objected to how frequently ElliQ joined the conversation, saying: ‘Alright, but… It talks all the time’. Others found ElliQ’s manner abrasive:
‘I didn’t perhaps like her tone of voice.’
(Joan)
‘I thought it was a bit rude actually. The machine sounded a bit rude to me.’
(Louise)
Again, this recalls familiar responses to some human company. We might also recall here Beth’s personal view that social groups can be unpleasant and other people often rude. Some participants also described the two talking robots, ElliQ and Vector, as lacking feeling and being ‘cold’ rather than warm. Indeed, many explained that they preferred the (non-talking) dog, Biscuit. Strikingly, participants used terms like ‘beautiful’, ‘gorgeous’, ‘wonderful’, ‘empathy’, ‘affection’, and ‘compassion’ to describe Biscuit’s interactions (possibly reacting partly to the way they were presented in the positive news story). Louise said: ‘There’s more compassion with the dog than the first two [ElliQ and Vector]. The other two don’t give a damn… Whereas the little dog and the old fella [in the video], you can imagine that happening. But with those other two, no.’ Many interviewees explicitly described Biscuit’s resemblance to a living dog as an asset. This likeness was what appeared to call forth words like gorgeous, wonderful, and affection. For Louise, Biscuit showed ‘compassion’ because it displayed a dog-like affection; whereas Vector and ElliQ lacked this feature. Sarah explained it thus:
‘You could touch Biscuit. You could pat it. You could stroke it. It’s got eyes you can look in. So you get some sort of empathy back, or feeling back, the way he tilts his head when you talk to him. All that sort of thing. More like a real dog.’
Many interviewees expressed positive attitudes towards Biscuit as a potential companion for others and for themselves.
4.3. Robots and control
Though a rude-sounding human voice might be easily re-programmed, a concern more central to the goal of assistive robots such as ElliQ was participants’ resistance to the ways these kinds of devices might proactively direct their lives. In contrast to how Beth described her preferred limits on how long people stayed, or Phoebe’s satisfaction at being able to march around her house in pajamas, ElliQ could pipe up at any time to say, ‘Now this is [what’s on]…And don’t forget that’ (Brianne). Lisa described this as ‘nagging’ and quite a few participants referred to ElliQ as ‘invasive’ and ‘intrusive’. Brianne elaborated on how intrusiveness combined with unwanted direction was a particularly dangerous combination: ‘I don’t know whether that would drive me mental if it kept interrupting me and telling me what to do…I might want to get an axe and cut it up…’ The threat that these robots posed was far more than annoyance; there was a sense that ElliQ’s demanding talk could be linked to a feeling of losing agency over parts of one’s life.
‘I just wouldn’t want to turn that over to her [ElliQ] yet…I think so. Losing control. Yes. At my age … I don’t need someone sitting in the room telling me.’
(Sarah)
‘…you could feel pushed, you could get it to control your life.’
(Lisa)
‘I suppose it’s because I don’t like being told what to do. Like the other one saying, ‘Now this is, and don’t forget that’…But see, I Google everything for that.’
(Brianne)
Directing a search on Google, as mentioned in the last quote, may be done in one’s own time and manner and does not involve ‘being told what to do.’ However, it is important to note that even some participants who found fault with ElliQ said her reminders would be useful. That is, they had some positive reactions towards at least certain assistive functions provided by robots, even though they had concurrent negative reactions to other robot features and functions. This suggests that technology designed for older adults could potentially fill a need even when not fully aligned with people’s values or wishes.
Another important sense of having ‘control’ emerged from participants’ responses. Some participants who found ElliQ controlling said they preferred Biscuit, because they would have control over such a robot pet. Asked whether she preferred a talking robot versus a pet robot, Carmen said: ‘one that I could abuse I suppose.’ Although Carmen said this with a laugh, she was serious that the pet robot was a better fit for her desire for control. Similarly, for Brianne, the attraction to exerting power was focused on the pet-like robot: ‘And then I look at the dog one, and I just think that’s beautiful. I just thought, ‘I could pick that dog up and drag it around. Be in charge.’ I’m sure you get my picture, that I want to be in charge.’ Also referring to Biscuit, Sarah said that ‘you could say, ‘Go to your bed’…just like you do to your real dog. You know, ‘Come sit on my lap’.’
The ability to have control over interactions with Biscuit was highlighted in quotes such as these. Calvin explained how there was even more control with a robot pet versus a live pet: ‘Yeah well if you feel like you want a bit of company you could have it just on when you want it there but you’re not gonna be all day.’ Given that individuals watched the video showcasing Biscuit after viewing the ElliQ video, it seems that some comments praising the ability to control Biscuit were partly in reaction to the more human-like and assertive ElliQ. Hence, there appears to be an interesting difference in the way people responded to the assertive and talking ElliQ, and to Biscuit as an artificial animal that displayed similar qualities to a friendly and highly compliant dog. Additional research would be required to further tease out the sorts of robots and robot characteristics (e.g. various animal features, humanoid appearance, speech, etc.) that older people regard as controllable (or not) and the degree to which they seem to them to be so.
4.4. Robots and assistance
In some research, individuals who did not want companionship from robots still expressed eagerness to have robots provide assistance in the way a human housekeeper might [48]. As mentioned above, some participants did like aspects of the assistance robots can provide. Notably, however, many interviewees raised strong concerns about receiving assistance from ElliQ, and (less strongly) from Vector. As explained, ElliQ is distinguished from a smart speaker by its ability to suggest and prompt activities and to offer advice. A prominent concern, however, was that having this assistant do tasks such as booking appointments, calling friends, paying bills, and prompting actions like taking medication would reduce participants’ willingness or ability to do those tasks themselves. Jerry explained that he had intentionally begun to do odd jobs for a café: ‘it gets me out of the house, gets me on the move, where before I just sit here’. Given that Jerry was actively working to avoid laziness, he was averse to the idea of using a device to turn lights on or off, as ‘it could make me a lazier person’. Similarly, Louise saw automated assistance and electronic devices as contributing to a ‘lazy generation’:
‘You’ve got machines for this and machines for that. Damned mobile phones which, I’ve got one, but I’ve never have used it much. All your electronic things. It’s just making people and their generation lazy.’
While Louise saw these types of devices as embraced by a younger generation, others spoke of how ElliQ was actually suited for individuals who were older or less healthy than themselves:
‘Well, at the moment I don’t think I’m ready for that…I think I’m still too active.’
(Phoebe)
‘I’d rather not rely on a machine when I can think myself.’
(Jill)
In each of the above quotes, participants were imagining a future time when they would no longer be active or able to think for themselves, at which point these systems would become useful to them. For some participants, use of the voice assistant was actually seen as hastening the point at which they would lose physical or cognitive capabilities. Lisa and Louise described this way of thinking as follows:
‘What’s it going to do to our brain? It’s going to shrink our brain.’
(Lisa)
‘Because you’ve got a brain and if you don’t use it, well you do become a bit stupid.’
(Louise)
In addition to threats to their ability to maintain physical and cognitive wellness, robots could also pose a different kind of threat – namely, through negatively altering the opinions of others. Carmen summed up this fear by imagining a situation where she was conversing with Vector or ElliQ, saying, ‘Someone would come in the door and think, ‘She’s really going off her rocker [i.e. losing her mind]…Time for her to move on’.’ This fear, though expressed half-jokingly, is real and has two elements worth distinguishing. First, some participants were concerned that being seen to be playing or interacting with a companion robot might be embarrassing or even shameful. In other words, the fear related to damaged self-image mediated through the negative perceptions of other people, including loved ones and strangers.
Second, participants expressed a fear with more tangible effects on one’s life. Carmen was concerned that being perceived as no longer capable of safely looking after herself could have repercussions for her freedom and her ability to live independently and in her own home. (We note that Carmen’s concern may have been partly elicited by the video’s portrayal of ElliQ as an aid to independence.) This fear has been expressed by participants in previous research who did not want to be seen as ‘failing’ to use technology for fear that it would indicate they were no longer capable of being independent [90]. Our findings extend this work by showing that older adults sometimes engage in a negotiation between not wanting to be embarrassed or stigmatized as ‘dependent’ by using certain robots and also wanting to get (or imagining eventually wanting or needing) some assistance (such as reminders) from types of robots that they see as less ‘threatening’ to their cognitive and social independence. Further, the concern that interacting with robots might be shameful or embarrassing has crossovers with participant concerns about insults to dignity, which we now discuss.
4.5. Robots and Dignity
Some interviewees had few issues with the concept of receiving assistance from or engaging with companion robots. Several found ElliQ potentially helpful, and one participant (Sarah) said she would even buy a ‘fantastic’ and ‘engaging’ assistive robot like ElliQ. Sarah explained it would be great to have ‘a conversation other than with yourself.’ By contrast, some other participants found the concept of interaction with companion robots demeaning. Louise decried the mere idea of a robot assistant: ‘And I’m not a very religious person either, but God give you the brain to work…I would pitch it [the robotic assistant] out the back door.’ Craig was similarly condemnatory of ElliQ: ‘Crazy. I do think I’m a bit more intelligent than that.’ Stephanie said: ‘I like the idea [of ElliQ’s assistive functions], but I don’t like the sort of seeming humanizing of the whole thing. It’s just ridiculous.’ Some said that this assault on dignity from such robots was lamentable: Phoebe said, ‘I hope I don’t get to that stage [of needing or being tempted to interact with robots]. I’m hoping I’ll be dead.’
Participants associated robots with a perceived threat to dignity when they identified the robot as being oriented towards children. No participant compared ElliQ to a toy in this way, while some did so for Biscuit, the furred dog robot. Mostly, however, it was Vector, with its cute robot voice, trucklike appearance, and cool ‘fist-bump’ gestures, that was identified as toylike. Resemblance to a child’s toy was very occasionally perceived to be a good thing, but more often a bad thing. Participants described the conspicuously toylike Vector as patronizing, condescending, or, as one put it, ‘absolutely bizarre.’
‘I don’t need a cute little bug [Vector] going ‘ooh’ at me…I might need help, but I don’t need that. It’s a kids’ toy…They’re kind of patronizing. You’re an adult.’
(Stephanie)
‘It was more like a…Transformer toy type of thing…I’d hate to think that I’d come home to be greeted by something like that…I don’t know that people of my age group would find companionship in that…’
(Lisa)
While some took Vector to be patronizingly evocative of a child’s toy and therefore rejected it outright, others took a more ‘philosophical’ view of older adults’ use of child-like toys. Sarah found Biscuit to be ‘similar to a stuffed toy that kids play with,’ and reluctantly suggested that a return to playing with toys, a defining feature of infancy and very early youth, might in fact go with ageing:
‘Initially I thought, ‘Oh, that’s quite condescending to old people’. That’s when I’m going to kill myself with dignity…I actually felt quite sad that those poor people in the aged homes are relating to a toy. But then it has lots of applications that they can interact with… it’s just important for them to communicate, touch…It’s connecting them. So, unfortunately, it’s a reality of ageing, yeah. That you’re back to playing with toys.’
(Sarah)
Just as assistive robots may be seen by some older individuals as more appropriate, and perhaps inevitable, for people who are older and less capable, Stephanie and Sarah saw a case for childlike robots being appropriate for others. This stemmed from a recognition of how ageing may sometimes involve revisiting or reverting to interests and activities of childhood. As mentioned, Biscuit, a robot designed to closely resemble a living dog, was well-liked by many participants. Yet Biscuit also provoked some vigorous criticism that both the simulation of a living dog and the reactions it prompts are phony. Perhaps because its design is so close to a living dog, participants drew attention to the fact that ‘it’s not real’ (Stephanie). As to why this was troubling, individuals would say things like, ‘Because that’s artificial and that’s real, that’s live. That’s all it is’ (Phoebe). A more concrete reason given concerned the inauthenticity of robots – a live dog is ‘not programmed to be joyous when it sees you. That that’s a natural relationship that you’ve built up…You want someone who genuinely loves you for you…’ (Sarah).
The inauthenticity that was troubling for Gwen centered on an expectation that she would need to be false to engage with the robot. Gwen, who had described being appalled by the idea of engaging in a façade of false small talk or politeness with other humans, found Biscuit repellent because, ‘It’s false. It’s something that I’m trying to be happy or I’m trying to be something that I’m not.’ Gwen explained that Biscuit was ‘something that’s not there and pretending. I don’t want to pretend to anybody.’ The (unwelcome) work Gwen would need to put in has been referred to in past work on telepresence homecare devices as relation work [47]; our finding suggests such relation work may be required even with non-human systems. For some of the participants, the idea that people would be expected to relate to these non-living things, when companionship is so obviously a thing between living beings, was a major affront to dignity. For example, in response to the robot dog, Biscuit, Stephanie exclaimed:
‘I’ll try not to let the steam come out of my ears…it’s just causing me frustration that people would think of older people like that really.’
In cases such as this one, participants expressed themselves forcefully, both verbally and in the paralinguistic behaviors that accompanied their words, such as agitation and raised voices. Still, despite feeling dismay at the indignity and patronizing effects of inauthenticity, some of these individuals did see robots as a ‘second best’ option for people who could not have living animals. And, as noted above, some participants described the robot dog (despite it not being conscious) as ‘wonderful’ and as evincing ‘compassion’.
5. DISCUSSION
This study sought to understand older adults’ perceptions of three distinct companion robot types. By first asking about companionship needs and preferences, and then showing in turn videos of the more human-like voice assistant, the roving vehicle, and the dog robot, we elicited responses from participants that encouraged comparisons across living and non-living things. Below, we discuss older adults’ preferences regarding style of company. We then discuss some implications of companion robots for older adults’ wellbeing, including their sense of control and autonomy. Finally, we explore how robots might affect dimensions of dignity in specific, interesting ways. Pradhan et al. note that the CSCW community is increasingly interested in ‘understanding social interactions mediated by robots’ [70]. The following discussion of our findings adds to growing work in CSCW on the potential of robots as partners in our lives in various spaces [45] and, furthermore, on the need to treat older adults in ways that respect their autonomous choices, goals, and needs [50].
5.1. Designing for preferred styles of company
The three robot exemplars had features, such as appearances and behaviors, that were variously humanlike and animal-like. Participant responses to such features could sometimes be connected with their general attitudes towards companionship. Some individuals embraced ElliQ for its proactive conversational ability which, they explained, could reduce loneliness and provide both companionship and assistance. This was the case for several participants who disclosed being lonely, and for only one person who did not express being lonely. However, in contrast with the ways the video showed an older person relating to ElliQ as companionable and supportive, many participants regarded the voice as irritating, overly assertive, and intrusive. Importantly, ElliQ appears to have been even more off-putting for those who find human company difficult or exhausting and who prefer relative solitariness.
This finding complements existing research noting that older adults are reluctant to use technologies they consider intrusive. In a similar study that examined participants’ responses to the ElliQ video, Deutsch et al. found that older people may resent a robotic assistant’s proactivity and intrusiveness [30]. Our findings support this conclusion but add the insight that sensitivity to intrusiveness may be linked to individuals’ varying preferences for human social interaction. By exploring participants’ preferences for companionship prior to discussing the robot videos, we were able to make this link. For individuals who prefer less human company, a robotic presence that does not proactively talk, or is less conversationally interactive, may be a more acceptable option. We should note, however, that achieving finer discrimination of people’s preferences requires further work, such as introducing robot examples that both talk and have different appearances or behavior (e.g., animal-like ones), or that lack both speech and animal likeness. Nonetheless, the results reveal interesting links between attitudes towards human and robot company.
While attitudes to the robot with the human voice were decidedly mixed, attitudes to the robot dog were more uniform. Even some of those who were initially skeptical expressed more positive sentiments while viewing interactions between Biscuit and older people. As noted, participants used terms like ‘beautiful’, ‘gorgeous’, ‘wonderful’, ‘empathy’, ‘affection’, and ‘compassion’ for Biscuit. In part, this could be due to the nature of the video and the way Biscuit was portrayed: it was a news story rather than an ad. Still, participants often also advanced reflective reasons, including distinguishing Biscuit’s nature from both the assistant and toy-like robots. This finding contrasts with the contention that people tend to find robots that strive to emulate living creatures hard to warm to [99]. Consistent with our finding, participants found Vector too mechanical-looking, aesthetically displeasing, and robotic. Other studies suggest many people prefer dog robots over humanoid ones and are drawn to the way that dogs show emotions, personality, and attachment [44]. Our study extends this suggestion by connecting it with pet robot features such as simple, affectionate doglike behaviors.
Having control over a robot animal, with analogies drawn to living pets, was seen by some as desirable. Other similar robot studies have emphasized older adults’ desire for control and for maintaining independence [30]. Our findings reveal not only that older adults desire some control and independence in regard to robots, but also that animal-like robots like Biscuit may provide a distinctive (if slightly troubling [23]) means of giving people a sense of control. One participant, for example, said they would enjoy being in charge of a robot dog and picking it up and moving it around. In general, humans can control (and shape) animal behavior and wellbeing in numerous ways – something that distinguishes pets from many humans who are more assertive and less easily directed or manipulated (although there are of course exceptions, including some young children). Given that an emerging critique of technologies for aged care is that they may overly prescribe actions to the point of becoming coercive [60], it is essential to recognize that some ‘smarter’ technologies, such as ElliQ, may be more likely to lead to this feeling of being coerced, for some people at least.
The above results give rise to contrasts between our study and the findings of recent related studies [30,99]. In particular, our findings contrast with the view that robot companionship should be a secondary rather than primary function, on the grounds that older adults are likely to reject robots when a (potentially ‘inauthentic’) companionship function is strongly highlighted. Our study showed, especially through participants’ responses to a robot dog which has no function other than to provide company, and which many participants found ‘gorgeous’ and ‘wonderful’, that pure robot companionship is received positively by at least some older adults. As we will discuss shortly, however, some older adults do have concerns that ‘inauthentic’ robot companionship could affect their dignity.
5.2. Wellbeing and autonomy
The three robots raised a range of concerns regarding wellbeing and autonomy. One salient concern was that companion robots might be seen as undermining cognitive abilities and skills. These findings about feared losses of ability and control provide an empirical example, concerning robots and older people, of how technology may be perceived as causing a loss of autonomy or agency by fostering dependency [49]. Some participants engaged in activities such as voluntary work and chores partly in order to remain active, even though (or because) they received (albeit minor) help from local services with cleaning and transportation. A tangible fear was that allowing others, including robots, to take over certain tasks and activities could result in a deterioration of mental function and the erosion of independence, potentially representing a step on the road to residential aged care. Studies show that many older people are determined to remain independent and to live in their own homes for as long as possible, or until they die [86]. Given these concerns, a robot that appears to encourage dependence rather than self-sufficiency may be rejected. Past work has described the ways that older adults feel pressure to engage in physical and cognitive activities to maintain abilities [46]. Confirming this finding in relation to companion robots prompts a rethinking of many of our approaches to designing technologies for ageing: we may wish to shift from assisting people in tasks to assisting them in maintaining their abilities to conduct tasks. For example, instead of a device insistently issuing reminders to take a medication, older adults might be reminded only if they forget to take it and offered strategies to remember in the future.
The possible effects of robots on some individuals’ values and dispositions are significant in other ways. Some participants described interacting with robots as potentially shameful or embarrassing, signifying to others that they may be experiencing cognitive decline. This represented a threat to a person’s self-esteem and sense of identity, and even a loss of control over one’s own life (e.g., in the case of the individual fearing she would be removed from her own home). Clearly, many people want to make their own decisions about what to do and when, and furthermore, to carry out the actions associated with those decisions themselves, rather than having them performed, whether paternalistically or otherwise, by other parties. This raises the value of autonomy. Human autonomy has been identified as important in older adult–robot interaction [49]. However, researchers of technology for older adults might be well served by turning to perspectives from contemporary ethics which stress the value we now tend to place on forms of personal autonomy or self-governance [20]. It is widely accepted in the West that we should respect a mature person’s autonomous choices, which reflect their preferences and often their deepest values [95]. These could include the deep values of dignity, independence, and self-sufficiency. Respecting autonomy implies avoiding not only the coercion and manipulation of a person for the benefit of others, but also unwanted altruistic interference – that is, interference performed for the benefit of the person herself. Altruistic overriding of autonomy is called strong paternalism [8]. In liberal societies, strong paternalism is generally frowned upon. But despite the moral need to refrain from unwanted paternalism, there are forms of influencing a person’s decision-making that are consistent with respect for autonomy. This warrants further explanation.
Ethicists sometimes talk of relational autonomy, which is a kind of self-governance in living and decision-making. Relational autonomy refers to the ability to make our own decisions not entirely independently of others but rather by way of desired forms of input and support from the outside [68]. Relational autonomy can shed some light on the resistance our participants had to being instructed by an assistant robot. Insofar as an assistance robot is interpreted as interfering with a given individual’s desire to make their own decisions, it may threaten their self-governance or autonomy, and even in a sense become paternalistically coercive [60]. But if that robot can instead provide input and support of a kind that is genuinely welcomed, then, in a sense, the robot—or the provision of the robot to an older person—may be said to respect or enhance autonomy. Whether or not a robot leads to the enhancement or diminution of autonomy depends on the perspective of the individual concerned. Some people are willing to hand over many decisions and tasks to others in a way that is consistent with relational autonomy; other people are loath to do so. And, of course, the precise nature of the decision or task can make a difference to a person’s willingness to surrender their involvement to another. In this study, the reactions of some participants appeared to suggest that their relational autonomy might not be respected or enhanced by the assistant robot ElliQ. Others, however, welcomed ElliQ’s assistance. Given that ElliQ and robots like it resemble humans who proactively offer suggestions, reminders, and advice about what might be good for a given person, we could regard these robots either as paternalistic threats to autonomy or as enhancers of relational autonomy, depending on the person’s individual autonomous wishes.
The notion of relational autonomy recalls the concept of interdependence discussed in the HCI disability and Science and Technology Studies communities [9,34]. As ‘dependent rational creatures’, to borrow Alasdair MacIntyre’s term for us human beings [53], we may think of our autonomy as being promoted not through total independence from others (which MacIntyre regards as a pernicious fiction), but often through interdependence with other people and (perhaps in an extended sense) with human-like assistive technologies that are capable of performing tasks and helping us in often autonomous and unsolicited ways. Different individuals’ autonomy may then be promoted by their own preferred forms of assistance and (inter)dependence. Some assistive technologies, for example, may be seen by some as stigmatizing and damaging to self-confidence and identity, and hence rejected by autonomous individuals with those attitudes [81]. Companion robots that perform ever more assistive tasks autonomously, especially when they do so against a person’s wishes, may even run the risk of appearing to enact a machine-style of paternalism. Care might then be taken to ensure that future companion robots navigate a path between over-assisting and enabling appropriate kinds of dependence.
5.3. Dignity: Condescension and patronization
The potential impact of companion robots on human dignity was another major theme in this study. Dignity is a deep human need [1], and threats to dignity, and awareness of them, can increase as people age [41]. Dignity and its loss can be linked to physical and cognitive changes, to changes in one’s sense of self such as those resulting from various dependencies, and to being on the receiving end of ageist stereotyping [42]. The human need for dignity has been recognized in international law and human rights declarations [55]. While some thinkers believe that dignity is a vague and unhelpful idea [54], others hold it to be an illuminating and important moral concept [22]. In any case, ‘dignity’ is a complex notion [40]; it can refer, for example, to a person’s inherent value as a human being, or, as we discuss here, to a variety of ways that people may be or feel affronted or violated [66].
Robot-related discussions of dignity and its loss are largely theoretical and philosophical [78,88]. Thinkers have reflected on how robots may enhance dignity by, for instance, promoting human flourishing and independence through suitable forms of assistance [80]. Conversely, some have discussed how robots may diminish dignity by undermining a person's autonomy and control over their life [97]. Robots may also threaten older adults' dignity when they appear infantilizing and deceptive [71,79]. Our study adds empirical detail that extends these philosophical discussions. The results show how robots could affect older adults' dignity in various ways; in particular, they expose the importance of the concepts of condescension, patronization, and demeaning circumstances. The findings enable us to reflect at greater length on the kinds of threats robots may pose to dignity and to identify the different dimensions of dignity these threats point to.
A number of interviewees viewed companion robots as demeaning, using words like 'patronizing', 'condescending', and 'dignity'. This perception was linked to the degree to which the robots were regarded as, variously, offensively toylike, humanizing, and/or inauthentic. These forms of condescension constitute distinctive dimensions of dignity and deserve further discussion. Consider first the idea of being condescended to by robots that are too toylike. Being infantilized, such as through the use of child-oriented settings, activities, and babyish speech patterns, is a recognized concern in aged care and a component of ageist stereotyping [76], and a problem that researchers studying the design of robots for older people have been highlighting for some time [63]. The infantilization of older adults has been connected with being deceived about robot sentience, or with companion robots more generally being perceived as toys, as dolls are, for example [32,79]. For several participants, the Pixar-like Vector was infantilizingly toylike, whereas, interestingly, the robot dog was not, even though it too could easily be taken for a child's toy (as might the more famous robot PARO).
Some participants regarded ElliQ as patronizing, but not because it evoked a child's toy. Rather, ElliQ was seen as patronizing because of the way it attempted to emulate a human speaker and, as one participant put it, involved a kind of humanizing. On this view, assistive robots for older people are respectable, but their assistance becomes condescending when they pretend to human interaction and companionship. This notion of an inauthenticity [30] that leads to feeling demeaned also emerged in relation to the animal robot. Some reported that they would feel embarrassed if they were discovered talking to such robots. One participant (Stephanie) became visibly livid at what she saw as a patronizing disrespect for older people: such companionship, she said, cannot exist between humans and machines, even clever ones. She described, moreover, how such pretense was patronizing to older people as a group, a response likely to be in part a reaction against ageist stereotyping [62]. Putting these elements together, we can say that an older person might feel patronized in two important ways: as an individual and as a member of an (age-based) group.
Thinkers have suggested that indignity might result from the sense of being 'deceived' into believing robots are conscious or sentient [32,84,89]. However, no participants in our study made this objection, and one interviewee even stated that deception was unlikely in people without dementia. Rather than deception being the cause, then, the problem of being patronized arose from the notion of interacting with a robot as if it were sentient. As one participant said, a robot that is 'programmed to be joyous' when it sees you is phony. Although other studies have highlighted robot inauthenticity [30,48], it is important to stress that some of our participants not only reacted to certain robots as 'false things'; they also expressed the view that users' responses to them are therefore also 'false'. To compound the problem further, such robots were, once again, seen by some older adults as an insult not just to individuals but to older people in general. In addition, robots may be regarded as objectively and not just subjectively demeaning: some older adults may see them as patronizing or condescending to all members of their age group, including members who do not subjectively feel demeaned by them. This is a further distinction to emerge from this study.
Deutsch et al. [30] suggest that robots with primary companion functions are rejected by older adults as false and inauthentic. Lazar et al. [48], however, suggest that this problem of inauthenticity may dissolve when users knowingly 'give in to the fiction' by adopting a stance of make-believe towards robots [74]. On this view, such a stance may confer respectability upon apparently undignified interactions with 'unreal' companions. However, our findings suggest that Lazar et al.'s proposed move may not resolve the deeper values and beliefs at play. It is vital to note that some of our participants would regard 'giving in to the fiction' as itself false or inauthentic. One participant, for example, claimed that companionship is impossible with a 'machine that supposedly has got a personality', while another found it 'crazy' to 'pretend to say hello' to a machine and therefore (presumably) to adopt a stance of make-believe. Hence, encouraging older adults to engage in make-believe with robots, as some have recommended [58], could come across as deeply patronizing rather than as offering an escape route from an otherwise condescending interaction.
Accordingly, designing and marketing robots with this aim may alienate a segment of older people. Further empirical study of this possibility would be useful, both to understand the factors that lead to greater or lesser acceptance and to ensure that there are alternatives for people who reject this form of technology. Having highlighted these dignity-centered problems, however, we must stress that other responses from older adults were very different. As we noted, a number of participants embraced the idea of certain robots as companions [48], including robots with companionship as their primary function. These individuals gave no indication that they found such robots patronizing or condescending. Such findings highlight diversity amongst older people [65] in their attitudes towards companion robots.
5.4. Study limitations
Our study has several limitations. First, our findings emerge from interviews with a particular demographic, namely older adults in a Western country (Australia) with limited support needs, and should not be taken to be globally representative. Attitudes towards robots vary across cultural contexts [67], and older adults may be more open to robotic technologies when they view them as necessary to maintaining independence [14]. Second, participants did not have actual interactions or extended experience with the robots, so we could not observe changes in their views over time or with experience. Third, since the videos were uniformly enthusiastic towards robots (two being marketing-style videos and the other a positive news story), viewers may have been, at least partly, responding to the sentiment or use cases depicted in the videos rather than to the robots themselves. Responses might have been different had participants experienced the robots first-hand without the videos; for example, participants might then have reacted less (or more) strongly to the idea that robots threaten control and dignity. Also, the ElliQ and Biscuit videos showed older adults interacting with robots, while the Vector video did not. Hence, some caution in interpretation is required.
Even with these limitations, we were able to examine initial participant responses to three robot types and make comparisons between them. The three videos were equally 'positive' in tone, and each gave indications of how the robots might be used, either with older people or as general companions. Observing a range of recorded human-robot interactions may in fact offer participants a clearer basis for forming initial responses than providing actual robots or simply describing them, and it provides consistency in the viewing experience [30]. The order in which the videos were shown might have affected responses. However, our aim was not so much to measure the strength of participants' liking or disliking of the robots as to uncover the reasons behind their reactions. Nonetheless, further work might address the limitations of this study by providing opportunities to interact directly with robots or by presenting different kinds of robots in randomized order. Future research might also explore other general robot types to tease out older adults' responses to different characteristics and combinations of characteristics, including, for example, robots that are animal-like but can talk.
6. CONCLUSION
This study explored the immediate responses of older people living alone and independently to videos of three types of companion robot. The three distinctive robot types drew forth a variety of responses, showing that there is no 'one size fits all' robot type for older people. We unearthed detail about older people's attitudes towards preferred social arrangements with human beings and how these might affect their attitudes towards various robots as potential companions. For example, older adults who preferred relative solitude and found other adults often exasperating and intrusive tended to react negatively to the proactive and assertively conversational home assistant robot. In contrast, many responded very positively to the compliant and 'gorgeous' doglike characteristics of the robot dog. Indeed, many participants could imagine having the robot dog as a companion even though it did not perform any other role, such as providing services or assistance.
The study also interrogated the ways that different companion robots might affect personal wellbeing, autonomy, and dignity. Participants expressed concerns both about assistive robots making them feel they were losing control and about not having control over the robots themselves. Each of these possibilities can potentially damage older adults' autonomy. Furthermore, some older people are acutely concerned about the demeaning effects of toylike robots, humanized robots, and 'inauthentic' robots that engage in a pretense of feeling. Individuals may feel this patronizing effect both on their own behalf and on behalf of older adults as a group. They may also feel that even people who willingly embrace such robots objectively suffer a loss of dignity. Such concerns may be strongly felt, and the idea of striking a stance of make-believe may not allay them; indeed, recommending such a stance may compound those concerns. Other older adults, however, do not share these values and concerns and are willing to embrace a range of companion robot types, including ones that are toylike or that simulate human conversation or animal feeling and behavior. These findings may be taken into account by robot designers and researchers, and by those involved in selecting the types of robots offered to independent older people.
CCS Concepts:
• Human-centered computing → Human computer interaction (HCI) → HCI design and evaluation methods → Field studies • Human-centered computing → Human computer interaction (HCI) → Interaction devices
ACKNOWLEDGMENTS
We warmly thank the handling reviewer and the other reviewers for their very helpful feedback and advice. This work is supported by the National Science Foundation, under grant #1816145, and a grant from the Networked Society Institute.
Contributor Information
SIMON COGHLAN, The University of Melbourne, Australia.
JENNY WAYCOTT, The University of Melbourne, Australia.
AMANDA LAZAR, University of Maryland, USA.
BARBARA BARBOSA NEVES, Monash University, Australia.
REFERENCES
- [1]. Ackerman Evan. Jibo Is Probably Totally Dead Now. IEEE Spectrum: Technology, Engineering, and Science News. Retrieved May 26, 2020 from https://spectrum.ieee.org/automaton/robotics/home-robots/jibo-is-probably-totally-dead-now
- [2]. Anki. Vector by Anki: A Giant Roll Forward For Robotkind. Retrieved May 26, 2020 from https://www.youtube.com/watch?v=Qy2Z2TWAt6A&feature=youtu.be
- [3]. Armitage Richard and Nellums Laura B. 2020. COVID-19 and the consequences of isolating the elderly. The Lancet Public Health 5, 5 (2020), e256.
- [4]. Australian Government Health Department. 2018. Ageing and aged care. Australian Government Department of Health. Retrieved May 26, 2020 from https://www.health.gov.au/resources/corporate-plan-2018-2019/our-performance/ageing-and-aged-care
- [5]. Australian Institute of Health and Welfare. Australia's welfare 2005. Retrieved May 26, 2020 from https://www.aihw.gov.au/reports/australias-welfare/australias-welfare-2005/contents/table-of-contents
- [6]. Banks Marian R., Willoughby Lisa M., and Banks William A. 2008. Animal-assisted therapy and loneliness in nursing homes: use of robotic versus living dogs. Journal of the American Medical Directors Association 9, 3 (2008), 173–177. DOI: 10.1016/j.jamda.2007.11.007
- [7]. BBC News. "Robodog" helping people with dementia. BBC News. Retrieved May 26, 2020 from https://www.bbc.com/news/av/uk-england-dorset-43479791/robotic-dog-in-dorset-care-home-helps-elderly-residents
- [8]. Beauchamp Tom L. and Childress James F. 2001. Principles of biomedical ethics. Oxford University Press, USA.
- [9]. Bennett Cynthia L., Brady Erin, and Branham Stacy M. 2018. Interdependence as a frame for assistive technology research and design. In Proceedings of the 20th International ACM SIGACCESS Conference on Computers and Accessibility, 161–173. DOI: 10.1145/3234695.3236348
- [10]. Borenstein Jason and Pearson Yvette. 2010. Robot caregivers: harbingers of expanded freedom for all? Ethics Inf Technol 12, 3 (September 2010), 277–288. DOI: 10.1007/s10676-010-9236-4
- [11]. Braun Virginia and Clarke Victoria. 2006. Using thematic analysis in psychology. Qualitative Research in Psychology 3, 2 (2006), 77–101. DOI: 10.1191/1478088706qp063oa
- [12]. Breazeal Cynthia. 2003. Emotion and sociable humanoid robots. International Journal of Human-Computer Studies 59, 1–2 (2003), 119–155. DOI: 10.1016/S1071-5819(03)00018-1
- [13]. Breazeal Cynthia, Wang Andrew, and Picard Rosalind. 2007. Experiments with a robotic computer: body, affect and cognition interactions. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction, 153–160. DOI: 10.1145/1228716.1228737
- [14]. Broadbent E, Stafford R, and MacDonald B. 2009. Acceptance of Healthcare Robots for the Older Population: Review and Future Directions. Int J of Soc Robotics 1, 4 (November 2009), 319–330. DOI: 10.1007/s12369-009-0030-6
- [15]. Broadbent Elizabeth, Tamagawa Rie, Kerse Ngaire, Knock Brett, Patience Anna, and MacDonald Bruce. 2009. Retirement home staff and residents' preferences for healthcare robots. In RO-MAN 2009: The 18th IEEE International Symposium on Robot and Human Interactive Communication, IEEE, 645–650. DOI: 10.1109/ROMAN.2009.5326284
- [16]. Broekens J, Heerink M, and Rosendal H. 2009. Assistive social robots in elderly care: a review. Gerontechnology 8, 2 (April 2009), 94–103. DOI: 10.4017/gt.2009.08.02.002.00
- [17]. Can WSR and Seibt Johanna. 2016. Social robotics, elderly care, and human dignity: A recognition-theoretical approach. Proceedings of Robophilosophy 2016/TRANSOR 2016 290, (2016). DOI: 10.3233/978-1-61499-708-5-155
- [18]. Cassell Justine. 2001. Embodied conversational agents: representation and intelligence in user interfaces. AI Magazine 22, 4 (2001), 67–67. DOI: 10.1609/aimag.v22i4.1593
- [19]. Chesney Thomas and Lawson Shaun. 2007. The illusion of love: Does a virtual pet provide the same companionship as a real one? Interaction Studies 8, 2 (2007), 337–342. DOI: 10.1075/is.8.2.09che
- [20]. Childress James F. 1990. The place of autonomy in bioethics. The Hastings Center Report 20, 1 (1990), 12–17.
- [21]. Coeckelbergh Mark. 2011. Are emotional robots deceptive? IEEE Transactions on Affective Computing 3, 4 (2011), 388–393. DOI: 10.1109/T-AFFC.2011.29
- [22]. Coghlan Simon. 2018. The moral depth of human dignity. Philosophical Investigations 41, 1 (2018), 70–93. DOI: 10.1111/phin.12177
- [23]. Coghlan Simon, Vetere Frank, Waycott Jenny, and Neves Barbara Barbosa. 2019. Could Social Robots Make Us Kinder or Crueller to Humans and Animals? International Journal of Social Robotics 11, 5 (2019), 741–751. DOI: 10.1007/s12369-019-00583-2
- [24]. Coghlan Simon, Waycott Jenny, Neves Barbara Barbosa, and Vetere Frank. 2018. Using robot pets instead of companion animals for older people: a case of 'reinventing the wheel'? In Proceedings of the 30th Australian Conference on Computer-Human Interaction, 172–183. DOI: 10.1145/3292147.3292176
- [25]. Cornwell Erin York and Waite Linda J. 2009. Social disconnectedness, perceived isolation, and health among older adults. Journal of Health and Social Behavior 50, 1 (2009), 31–48. DOI: 10.1177/002214650905000103
- [26]. Courtin Emilie and Knapp Martin. 2017. Social isolation, loneliness and health in old age: a scoping review. Health & Social Care in the Community 25, 3 (2017), 799–812. DOI: 10.1111/hsc.12311
- [27]. Dautenhahn Kerstin. 2007. Socially intelligent robots: dimensions of human–robot interaction. Philosophical Transactions of the Royal Society B: Biological Sciences 362, 1480 (April 2007), 679–704. DOI: 10.1098/rstb.2006.2004
- [28]. Dautenhahn Kerstin, Woods Sarah, Kaouri Christina, Walters Michael L., Koay Kheng Lee, and Werry Iain. 2005. What is a robot companion – friend, assistant or butler? In 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, IEEE, 1192–1197. DOI: 10.1109/IROS.2005.1545189
- [29]. De Graaf Maartje MA, Allouch Somaya Ben, and Klamer Tineke. 2015. Sharing a life with Harvey: Exploring the acceptance of and relationship-building with a social robot. Computers in Human Behavior 43, (2015), 1–14. DOI: 10.1016/j.chb.2014.10.030
- [30]. Deutsch Inbal, Erel Hadas, Paz Michal, Hoffman Guy, and Zuckerman Oren. 2019. Home robotic devices for older adults: Opportunities and concerns. Computers in Human Behavior 98, (September 2019), 122–133. DOI: 10.1016/j.chb.2019.04.002
- [31]. Ezer Neta, Fisk Arthur D., and Rogers Wendy A. 2009. Attitudinal and intentional acceptance of domestic robots by younger and older adults. In International Conference on Universal Access in Human-Computer Interaction, Springer, 39–48. DOI: 10.1007/978-3-642-02710-9_5
- [32]. Frennert Susanne and Östlund Britt. 2014. Seven matters of concern of social robots and older people. International Journal of Social Robotics 6, 2 (2014), 299–310. DOI: 10.1007/s12369-013-0225-8
- [33]. Gee Nancy R., Mueller Megan K., and Curl Angela L. 2017. Human–animal interaction and older adults: An overview. Frontiers in Psychology 8, (2017), 1416. DOI: 10.3389/fpsyg.2017.01416
- [34]. Hamraie Aimi. 2013. Designing collective access: A feminist disability theory of universal design. Disability Studies Quarterly 33, 4 (2013). DOI: 10.18061/dsq.v33i4.3871
- [35]. Heerink Marcel, Kröse Ben, Evers Vanessa, and Wielinga Bob. 2008. The influence of social presence on acceptance of a companion robot by older people. (2008).
- [36]. Holt-Lunstad Julianne, Smith Timothy B., Baker Mark, Harris Tyler, and Stephenson David. 2015. Loneliness and social isolation as risk factors for mortality: a meta-analytic review. Perspectives on Psychological Science 10, 2 (2015), 227–237. DOI: 10.1177/1745691614568352
- [37]. Intuition Robotics. 2020. ElliQ, the sidekick for happier aging. Retrieved May 26, 2020 from https://elliq.com/
- [38]. Intuition Robotics. Our Beta Testing Journey. Retrieved May 26, 2020 from https://elliq.com/blogs/elliq-blog/our-beta-testing-journey
- [39]. Intuition Robotics. ELLIQ - The Active Aging Companion. Retrieved May 26, 2020 from https://www.youtube.com/watch?v=QfJU54nw5BU&feature=youtu.be
- [40]. Kateb George. 2011. Human dignity. Harvard University Press, Cambridge, MA.
- [41]. Kerepesi Andrea, Kubinyi Eniko, Jonsson Gudberg K., Magnússon Magnús S., and Miklósi Ádám. 2006. Behavioural comparison of human–animal (dog) and human–robot (AIBO) interactions. Behavioural Processes 73, 1 (2006), 92–99. DOI: 10.1016/j.beproc.2006.04.001
- [42]. Khosla Rajiv, Nguyen Khanh, and Chu Mei-Tai. 2017. Human robot engagement and acceptability in residential aged care. International Journal of Human–Computer Interaction 33, 6 (2017), 510–522. DOI: 10.1080/10447318.2016.1275435
- [43]. Kidd Cory D., Taggart Will, and Turkle Sherry. 2006. A sociable robot to encourage social interaction among the elderly. In Proceedings of the 2006 IEEE International Conference on Robotics and Automation (ICRA 2006), IEEE, 3972–3976. DOI: 10.1109/ROBOT.2006.1642311
- [44]. Konok Veronika, Korcsok Beáta, Miklósi Ádám, and Gácsi Márta. 2018. Should we love robots? The most liked qualities of companion dogs and how they can be implemented in social robots. Computers in Human Behavior 80, (2018), 132–142. DOI: 10.1016/j.chb.2017.11.002
- [45]. Lampe Cliff, Bauer Bob, Evans Henry, Robson Dave, Lau Tessa, and Takayama Leila. 2016. Robots As Cooperative Partners… We Hope… In Proceedings of the 19th ACM Conference on Computer Supported Cooperative Work and Social Computing Companion (CSCW '16 Companion), Association for Computing Machinery, New York, NY, USA, 188–192. DOI: 10.1145/2818052.2893360
- [46]. Lazar Amanda and Nguyen David H. 2017. Successful Leisure in Independent Living Communities: Understanding Older Adults' Motivations to Engage in Leisure Activities. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 7042–7056. DOI: 10.1145/3025453.3025802
- [47]. Lazar Amanda, Thompson Hilaire J., Lin Shih-Yin, and Demiris George. 2018. Negotiating Relation Work with Telehealth Home Care Companionship Technologies That Support Aging in Place. Proceedings of the ACM on Human-Computer Interaction 2, CSCW (2018), 1–19. DOI: 10.1145/3274372
- [48]. Lazar Amanda, Thompson Hilaire J., Piper Anne Marie, and Demiris George. 2016. Rethinking the design of robotic pets for older adults. In Proceedings of the 2016 ACM Conference on Designing Interactive Systems, 1034–1046. DOI: 10.1145/2901790.2901811
- [49]. Lee Hee Rin and Riek Laurel D. 2018. Reframing assistive robots to promote successful aging. ACM Transactions on Human-Robot Interaction (THRI) 7, 1 (2018), 1–23. DOI: 10.1145/3203303
- [50]. Light Ann, Leong Tuck W., and Robertson Toni. 2015. Ageing Well with CSCW. In ECSCW 2015: Proceedings of the 14th European Conference on Computer Supported Cooperative Work, 19–23 September 2015, Oslo, Norway, Springer International Publishing, Cham, 295–304. DOI: 10.1007/978-3-319-20499-4_16
- [51]. Luh Ding-Bang, Li Elena Carolina, and Kao Yu-Jung. 2015. The Development of a Companionship Scale for Artificial Pets. Interacting with Computers 27, 2 (March 2015), 189–201. DOI: 10.1093/iwc/iwt055
- [52]. Luhmann Maike and Hawkley Louise C. 2016. Age differences in loneliness from late adolescence to oldest old age. Developmental Psychology 52, 6 (2016), 943. DOI: 10.1037/dev0000117
- [53]. MacIntyre Alasdair C. 1999. Dependent rational animals: Why human beings need the virtues. Open Court Publishing, Chicago.
- [54]. Macklin Ruth. 2003. Dignity is a useless concept. BMJ 327, (2003), 1419. DOI: 10.1136/bmj.327.7429.1419
- [55]. McCrudden Christopher. 2008. Human dignity and judicial interpretation of human rights. European Journal of International Law 19, 4 (2008), 655–724. DOI: 10.1093/ejil/chn043
- [56]. McGlynn Sean A., Kemple Shawn, Mitzner Tracy L., King Chih-Hung Aaron, and Rogers Wendy A. 2017. Understanding the potential of PARO for healthy older adults. International Journal of Human-Computer Studies 100, (2017), 33–47. DOI: 10.1016/j.ijhcs.2016.12.004
- [57]. McGlynn Sean, Snook Braeden, Kemple Shawn, Mitzner Tracy L., and Rogers Wendy A. 2014. Therapeutic robots for older adults: investigating the potential of PARO. In Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction, 246–247. DOI: 10.1145/2559636.2559846
- [58]. Misselhorn Catrin, Pompe Ulrike, and Stapleton Mog. 2013. Ethical considerations regarding the use of social robots in the fourth age. GeroPsych (2013). DOI: 10.1024/1662-9647/a000088
- [59]. Mori Masahiro. 1970. The uncanny valley. Energy 7, 4 (1970), 33–35.
- [60]. Mort Maggie, Roberts Celia, and Callén Blanca. 2013. Ageing with telecare: care or coercion in austerity? Sociology of Health & Illness 35, 6 (2013), 799–812. DOI: 10.1111/j.1467-9566.2012.01530.x
- [61]. Moyle Wendy, Cooke Marie, Beattie Elizabeth, Jones Cindy, Klein Barbara, Cook Glenda, and Gray Chrystal. 2013. Exploring the effect of companion robots on emotional expression in older adults with dementia: a pilot randomized controlled trial. Journal of Gerontological Nursing 39, 5 (2013), 46–53. DOI: 10.3928/00989134-20130313-03
- [62]. Nelson Todd D. 2004. Ageism: Stereotyping and prejudice against older persons. MIT Press, Cambridge, MA.
- [63]. Neven Louis. 2010. 'But obviously not for me': robots, laboratories and the defiant identity of elder test users. Sociology of Health & Illness 32, 2 (2010), 335–347. DOI: 10.1111/j.1467-9566.2009.01218.x
- [64]. Neves Barbara Barbosa, Sanders Alexandra, and Kokanović Renata. 2019. "It's the worst bloody feeling in the world": Experiences of loneliness and social isolation among older people living in care homes. Journal of Aging Studies 49, (2019), 74–84. DOI: 10.1016/j.jaging.2019.100785
- [65]. Neves Barbara Barbosa and Vetere Frank (Eds.). 2019. Ageing and Emerging Digital Technologies. Springer, Singapore.
- [66]. Nordenfelt Lennart. 2009. Dignity in care for older people. John Wiley & Sons, Hoboken.
- [67]. Papadopoulos Irena and Koulouglioti Christina. 2018. The Influence of Culture on Attitudes Towards Humanoid and Animal-like Robots: An Integrative Review. Journal of Nursing Scholarship 50, 6 (2018), 653–665. DOI: 10.1111/jnu.12422
- [68]. Perkins Molly M., Ball Mary M., Whittington Frank J., and Hollingsworth Carole. 2012. Relational autonomy in assisted living: A focus on diverse care settings for older adults. Journal of Aging Studies 26, 2 (2012), 214–225. DOI: 10.1016/j.jaging.2012.01.001
- [69]. Pikhartova Jitka, Bowling Ann, and Victor Christina. 2014. Does owning a pet protect older people against loneliness? BMC Geriatrics 14, 1 (2014), 106. DOI: 10.1186/1471-2318-14-106
- [70]. Pradhan Alisha, Findlater Leah, and Lazar Amanda. 2019. "Phantom Friend" or "Just a Box with Information": Personification and Ontological Categorization of Smart Speaker-based Voice Assistants by Older Adults. Proc. ACM Hum.-Comput. Interact. 3, CSCW (November 2019), 214:1–214:21. DOI: 10.1145/3359316
- [71]. Preuß Dirk and Legal Friederike. 2017. Living with the animals: animal or robotic companions for the elderly in smart homes? Journal of Medical Ethics 43, 6 (June 2017), 407–410. DOI: 10.1136/medethics-2016-103603
- [72]. Pu Lihui, Moyle Wendy, Jones Cindy, and Todorovic Michael. 2019. The effectiveness of social robots for older adults: a systematic review and meta-analysis of randomized controlled studies. The Gerontologist 59, 1 (2019), e37–e51. DOI: 10.1093/geront/gny046
- [73]. Reher David and Requena Miguel. 2018. Living Alone in Later Life: A Global Perspective. Population and Development Review 44, 3 (2018), 427–454. DOI: 10.1111/padr.12149
- [74]. Rodogno Raffaele. 2016. Social robots, fiction, and sentimentality. Ethics and Information Technology 18, 4 (2016), 257–268. DOI: 10.1007/s10676-015-9371-z
- [75]. Rook Karen S. 1990. Social relationships as a source of companionship: Implications for older adults' psychological well-being. In Social support: An interactional view, Sarason BR, Sarason IG and Pierce GR (eds.). John Wiley & Sons, Hoboken, 219–250.
- [76]. Salari Sonia Miner. 2006. Infantilization as elder mistreatment: Evidence from five adult day centers. Journal of Elder Abuse & Neglect 17, 4 (2006), 53–91. DOI: 10.1300/J084v17n04_04
- [77]. Shankar Aparna, McMunn Anne, Demakakos Panayotes, Hamer Mark, and Steptoe Andrew. 2017. Social isolation and loneliness: Prospective associations with functional status in older adults. Health Psychology 36, 2 (2017), 179. DOI: 10.1037/hea0000437
- [78]. Sharkey Amanda. 2014. Robots and human dignity: a consideration of the effects of robot care on the dignity of older people. Ethics Inf Technol 16, 1 (March 2014), 63–75. DOI: 10.1007/s10676-014-9338-5
- [79]. Sharkey Amanda and Sharkey Noel. 2012. Granny and the robots: ethical issues in robot care for the elderly. Ethics Inf Technol 14, 1 (March 2012), 27–40. DOI: 10.1007/s10676-010-9234-6
- [80]. Sharkey Amanda and Wood Natalie. 2014. The Paro seal robot: demeaning or enabling? Proceedings of AISB 36, (2014).
- [81]. Shinohara Kristen and Wobbrock Jacob O. 2016. Self-conscious or self-confident? A diary study conceptualizing the social accessibility of assistive technology. ACM Transactions on Accessible Computing (TACCESS) 8, 2 (2016), 1–31. DOI: 10.1145/2827857
- [82]. Simon Michael. 2019. Anki is shutting down, but its adorable Cozmo and Vector robots deserve a new home. Macworld. Retrieved May 26, 2020 from https://www.macworld.com/article/3391411/anki-is-shutting-down-but-its-adorable-cozmo-and-vector-robots-need-a-new-home.html
- [83]. Sorkin Dara, Rook Karen S., and Lu John L. 2002. Loneliness, lack of emotional support, lack of companionship, and the likelihood of having a heart condition in an elderly sample. Annals of Behavioral Medicine 24, 4 (2002), 290–298. DOI: 10.1207/S15324796ABM2404_05
- [84]. Sparrow Robert. 2002. The March of the robot dogs. Ethics and Information Technology 4, 4 (December 2002), 305–318. DOI: 10.1023/A:1021386708994
- [85]. Sparrow Robert and Sparrow Linda. 2006. In the hands of machines? The future of aged care. Minds and Machines 16, 2 (2006), 141–161.
- [86]. Stones Damien and Gullifer Judith. 2016. 'At home it's just so much easier to be yourself': older adults' perceptions of ageing in place. Ageing & Society 36, 3 (2016), 449–481. DOI: 10.1017/S0144686X14001214
- [87]. Van Oost Ellen and Reed Darren. 2010. Towards a sociological understanding of robots as companions. In International Conference on Human-Robot Personal Relationship, Springer, 11–18.
- [88]. Vandemeulebroucke Tijs, de Casterlé Bernadette Dierckx, and Gastmans Chris. 2018. The use of care robots in aged care: A systematic review of argument-based ethics literature. Archives of Gerontology and Geriatrics 74, (January 2018), 15–25. DOI: 10.1016/j.archger.2017.08.014
- [89]. Vandemeulebroucke Tijs, de Casterlé Bernadette Dierckx, and Gastmans Chris. 2018. The use of care robots in aged care: A systematic review of argument-based ethics literature. Archives of Gerontology and Geriatrics 74, (January 2018), 15–25. DOI: 10.1016/j.archger.2017.08.014
- [90]. Waycott Jenny, Vetere Frank, Pedell Sonja, Morgans Amee, Ozanne Elizabeth, and Kulik Lars. 2016. Not for me: Older adults choosing not to participate in a social isolation intervention. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, 745–757.
- [91]. Wiles Janine L., Leibing Annette, Guberman Nancy, Reeve Jeanne, and Allen Ruth ES. 2012. The meaning of "aging in place" to older people. The Gerontologist 52, 3 (2012), 357–366. DOI: 10.1093/geront/gnr098
- [92]. Woll Anita and Bratteteig Tone. 2019. A trajectory for technology-supported elderly care work. Comput Supported Coop Work 28, 1 (April 2019), 127–168. DOI: 10.1007/s10606-018-9340-2
- [93]. Wu Ya-Huei, Cristancho-Lacroix Victoria, Fassert Christine, Faucounau Véronique, de Rotrou Jocelyne, and Rigaud Anne-Sophie. 2016. The attitudes and perceptions of older adults with mild cognitive impairment toward an assistive robot. Journal of Applied Gerontology 35, 1 (2016), 3–17. DOI: 10.1177/0733464813515092
- [94]. Wu Ya-Huei, Fassert Christine, and Rigaud Anne-Sophie. 2012. Designing robots for the elderly: Appearance issue and beyond. Archives of Gerontology and Geriatrics 54, 1 (January 2012), 121–126. DOI: 10.1016/j.archger.2011.02.003
- [95]. Young Robert. 2017. Personal autonomy: Beyond negative and positive liberty. Taylor & Francis, Melbourne, Australia.
- [96]. Zaga Cristina, de Vries Roelof A.J., Li Jamy, Truong Khiet P., and Evers Vanessa. 2017. A Simple Nod of the Head: The Effect of Minimal Robot Movements on Children's Perception of a Low-Anthropomorphic Robot. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17), Association for Computing Machinery, New York, NY, USA, 336–341. DOI: 10.1145/3025453.3025995
- [97]. Zardiashvili Lexo and Fosch-Villaronga Eduard. 2020. "Oh, Dignity too?" Said the Robot: Human Dignity as the Basis for the Governance of Robotics. Minds and Machines 30, (2020), 121–143. DOI: 10.1007/s11023-019-09514-6
- [98]. Zsiga Katalin, Tóth András, Pilissy Tamás, Péter Orsolya, Dénes Zoltán, and Fazekas Gábor. 2018. Evaluation of a companion robot based on field tests with single older adults in their homes. Assistive Technology 30, 5 (2018), 259–266. DOI: 10.1080/10400435.2017.1322158
- [99]. Zuckerman Oren, Walker Dina, Grishko Andrey, Moran Tal, Levy Chen, Lisak Barak, Wald Iddo Yehoshua, and Erel Hadas. 2020. Companionship Is Not a Function: The Effect of a Novel Robotic Object on Healthy Older Adults' Feelings of "Being-Seen". In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1–14. DOI: 10.1145/3313831.3376411
- [100]. Vector. Anki US. Retrieved May 26, 2020 from https://anki.com/en-us/vector.html