Abstract
Telepresence robots have recently been introduced as a way for children who are homebound due to medical conditions to attend their local schools. These robots provide a much richer learning experience than the typical home instruction services of 4–5 hours a week. Because the robots on the market today were designed for adults in work settings, they do not necessarily fit children in school settings. We carried out a study of 19 homebound students, interviewing and observing them as well as interviewing their parents, teachers, administrators, and classmates. We organized our findings around the various tasks and settings the child is in, developing a learner-centered analytic framework, followed by teacher-, classmate-, and homebound-controller-centered analytic frameworks. Although some features of current robots fit children in school settings, we found a number of cases where there was a mismatch or where additional features were needed. Our findings are described according to analytic frames that capture user experiences. Based on these user-centered findings, we provide recommendations for designing the robot and user interface to better fit children using robots for school and learning activities.
Keywords: Telepresence robots, inclusion, education, communication, identity, social norms, schools, appearance
1. INTRODUCTION
Advances in pediatric medicine have changed the outcome of many once-fatal childhood illnesses. As a result, millions of children and adolescents in the US now live with chronic conditions such as cancer and immune deficiency [42]. This has led to a growing population of children who are unable to physically attend school, but still need to learn academically and grow socially. In the US, traditional services for these students consist of 4–5 hours of home instruction per week along with sets of exercises and homework to complete by themselves [13,32]. While this may, in part, serve the child’s academic needs, it completely misses the social, developmental, and emotional needs of the child. Studies show that inclusive educational practices result in better learning [15], yet current practices exclude homebound children from the full school experience.
Recent advances in technology have created ways to include homebound children in school. Some educators and researchers have experimented with video conferencing to make a connection between the home and school [14]. One study examined hospital-bound children’s use of video conferencing on an experimental movable robot, called PEBBLES [52]. The “face” of the robot showed the child’s face, and the camera on the “head” could move to show the child different parts of the classroom. Although this gave the hospital-bound students some agency to look at what they wanted, the students still needed assistance moving the robot around the classroom or from one classroom to another, possibly incurring a social debt to their helpers.
More recently, commercially available mobile telepresence robots have been introduced into classrooms. A mobile telepresence robot is a video conferencing unit on a mobile robot base that is controlled by the homebound child. This mobility allows the child to move the robot around the classroom (e.g., for small group work or a story circle), go to lunch, music classes, assemblies, and even field trips as long as there is good Wi-Fi connectivity. Students claim to feel included in class, and parents note significant increases in their children’s interest and happiness at being with their friends [32].
The two telepresence robots used in our study were the VGo and the Double, shown in Figure 1. Both are small, roughly as tall as an elementary school child. VGo was designed to be at the height of a seated adult; Double, however, can change height from seated adult to standing adult. Their height and light weight (15–18 lbs; about 7–8 kg) make them suitable for use in schools. Other telepresence robots, such as the Beam and BeamPro, are used in office and hospital settings, but they are heavier and priced higher. The Beam weighs 39 lbs (17.7 kg), the BeamPro 90 lbs (40.8 kg). Although Beam robots have been used in some schools [11], we did not encounter any Beam robots in the school districts in this study.
Figure 1.
Telepresence robots in school. VGo on the left, Double on the right.
How well do these robots, built for adults in offices and health care settings, fit children going to school? What’s missing? What other features would make them more suitable for this kind of use and for this kind of user? In this paper, we examine the experiences of 19 homebound children using VGo or Double telepresence robots to attend school. We examine the details of the situations that students experience in class, while traveling to another location, and so on, then focus on the teachers’ tasks, the classmates’ tasks, and finally the student’s experience of the controls at home. We call this a learner-centered analytical framework, but we also extend it to the smaller but important tasks in teacher- and classmate-centered frameworks, then revisit the student at home using the controls in a homebound-controller-centered analytic framework. These frameworks organize our findings around the situations in which we uncovered important features not previously reported in the literature for homebound children using telepresence robots to go to school. Many features that we found useful in schools are not new; much work has been done on telepresence robots in other settings for other populations. However, we provide empirical data on a novel population of users (i.e., homebound students) and their daily experiences with these robots in a setting not well covered in the literature: traditional schools.
2. RELATED WORK
2.1. Telepresence Robots
2.1.1. Movable.
The earliest attempt to use robots to virtually include these students in traditional schools was a Canadian study of a movable telepresence robot called PEBBLES (Providing Education by Bringing Learning Environments to Students) [52]. PEBBLES combined videoconferencing with simple robotics to provide students with a presence in their classroom from a remote location such as a hospital or home, as shown in Figure 2. Case studies were carried out in three different classrooms with use ranging from six weeks to five months. These studies concentrated on evaluating the social, academic, and communication aspects of the system [17]. Investigators found that in time the students who used PEBBLES were able to take part in many of the same tasks as their peers and participate actively in their classroom without creating any excessive disturbances [52]. Real-time audio and video communication was valuable in maintaining or establishing connections with peers. However, the PEBBLES robot system was movable but not mobile (i.e., no remote-controlled mobility) and needed assistance when moving from one class to another. Students did not have control over their mobility and thus may have incurred implicit social debt to their peers. The burden of social debt has also been covered in the literature for adult use of telepresence technologies (e.g., with wearable and movable free-standing devices) by Rae et al. [39]. Similarly, in earlier studies on mobile telepresence robots in the classroom, classmates complained when the mobile telepresence robot lost connectivity and had to be carried or pushed on a cart [30].
Figure 2.
PEBBLES robot in a classroom.
2.1.2. Mobile.
Newly developed telepresence robots can be moved and controlled by a remote person. These robots provide real-time audio and video exchange, with the person’s face typically shown on the robot’s “head.” These robots differ from each other in significant ways. They have different mobility features; they may or may not allow pan and tilt of the camera; they have different microphone and speaker placements; and they have different network security features, among other things. Figure 3 shows four commercially available robots.
Figure 3.
Four commercially available telepresence robots (height in inches): VGo (48”), Double (47–59”), Beam (53”), and BeamPro (63”).
Desai and colleagues present a comparison of the features of various robots and how they might matter in workplace settings [12]. Two commercially available robots, VGo and Double, were used in our study by children to attend school. Table 1 shows a comparison of VGo and Double robot features.
Table 1.
Comparison of Double and VGo features.
Feature | Double | VGo |
---|---|---|
Battery life | 8–10 hours | 6 or 12 hour option |
Camera pan (left and right) | No | No |
Camera tilt (up and down) | No | Yes, 180 degrees |
Cliff sensors | No | Yes |
Drive | 1 large cylindrical wheel | 2 wheels and 2 casters |
Face screen, display static image | Yes | Yes |
Face screen, life-size | 9.7” LED, Yes | 6” LCD, No |
Microphones | 1 forward facing below screen | 4 around video screen (2 front, 2 back) |
Navigation control | Mouse, arrow keys, joystick | Mouse, arrow keys |
Number of cameras | 1 front facing and 1 always-on floor view | 1 front facing |
Resolution of cameras | 5 megapixel | 3 megapixel |
Speakers | 1 below face | 1 woofer in base, 1 tweeter in head |
Top speed | 1.6 mph | 2.75 mph |
Two-way audio & video | Yes | Yes |
Unit cost | $3K + cost of iPad | $5K |
Video encryption | 128-bit AES, HMAC-SHA1 | SSL |
Weight | 15 lbs. | 18 lbs. |
Wheels are Americans with Disabilities Act (ADA) compliant | Yes | Yes |
Wi-Fi access point switching | Yes | Yes |
2.2. Telepresence Robots In Work and School Settings
2.2.1. Work.
A number of papers report on the use of robots by adults in offices [12,21,22,26,45,48], health care [20,22,50], conferences [29,37], and aging in place [5,22,26,41,47,48]. Short field studies have also been conducted on robots as teaching assistants for traditional students learning a second language [35]. Among adult users, people reacted to the person on the robot as if they were physically present, successfully collaborating on projects through informal (hallway conversations) as well as formal interactions (participation in meetings). Those using the robots found it difficult to “walk and talk” because they had to concentrate on navigating the space. Local users also called for a way to identify who was on the robot physically, beyond just looking at the face on the screen. Volume was difficult to control, and remote users often projected a louder voice than intended for the setting. Remote users also wished to know where a sound was coming from so they could orient to a particular person. We found that a number of these features were also important in school settings, but for different expected tasks, a different population, and much longer periods of use than reported in the earlier literature.
2.2.2. School.
Earlier work has found that the robots are a feasible option in school settings [31,32]. While some of our findings are consistent with prior work on the use of telepresence robots for work, use by homebound students requires additional nuances. Schools differ from office and health care settings in a number of important ways. In office and health care settings, the users are adults; in schools the users are children. As such, the expected tasks to be accomplished via the robot are different. In health care settings, the physician on the robot is involved in diagnosing and offering medical solutions, clearly in an authority position. In office settings, the robot carries with it the authority of the remote worker, offering expertise, decision-making, and formal/informal interactions in order to meet work responsibilities. In schools, children use the robots to accomplish multi-level tasks within varied complex social settings.
A child using a robot to attend school is in a wide variety of situations that change daily. Children are expected to transition from stationary lectures, to walking the halls, to attending assemblies, to “eating” in the cafeteria, to going on field trips, and even to attending after school activities. In the classroom, children receive instruction and encouragement from the teacher, participate in social interactions with peers, and engage in critical intellectual as well as social-emotional development experiences. Our study participants engaged in all these traditional school activities and settings via their robot. They were able to experience these organic environments and interactions via a synthetic means (i.e., their home device + their robot). In this paper, we outline the robot design features that assisted, interfered, or could be improved in order to accomplish expected tasks in schools. Based on these findings, we provide recommendations for designers of telepresence robots.
3. METHOD
3.1. Research Question
Data collection for this study took place over a four-year period (2013–2017). Our first publication from this study examined the homebound student’s school experiences [32]. The second examined the ramifications for parents, teachers, and administrators, highlighting, for example, that connectivity opens up issues of privacy for both the classroom and the student’s home [31]. This paper focuses on the following research question: What are the robot design features that matter for accomplishing learning tasks via telepresence robots in schools? In our findings, after each user-centered group of tasks, we provide recommendations for designing the robot and user interface to better fit the needs of children using telepresence robots for learning and classroom activities.
3.2. Research Design
As we are the first to do a systematic study of homebound children using telepresence robots in traditional schools, our study is qualitative and exploratory in order to give voice to user experiences. The research methods used were semi-structured interviews, observations, and focus groups.
3.2.1. Interviews.
Over 45 hours of interviews were conducted with homebound children and their parents, teachers, and school and district officials. All interviews were semi-structured and lasted 30 to 60 minutes. Interview topics included the motivation for using the robot, technical aspects of robot use, academic experiences while using the robot, social experiences while using the robot, the child’s well-being, and general experiences with homebound educational services when applicable (e.g., not all children received home instruction services). Interviews took place in multiple sites, with child/parent interviews taking place in homes, a restaurant (the child was traveling to the hospital), and a hospital (per parent preference). Interviews with teachers and administrators took place on school or district campuses.
3.2.2. Observations and Focus Groups.
Observations took place in four classrooms where the robot was deployed. These observations lasted 45–60 minutes. Focus groups were conducted immediately after the observations in two of these classrooms. The focus groups, each with a full classroom averaging 22 students, lasted 5 to 10 minutes, limited by the school schedule. Discussions were limited to questions on the classmates’ attitudes and perceptions of the child attending school with a robot. Homebound children were present via robot and participated in the focus group discussions. Open responses were allowed for each question, with an average of two to three minutes allowed per response to each question.
3.3. Participants
We interviewed participants in 19 cases of children with chronic illness who were currently using or had previously used telepresence robots for virtual inclusion (Table 2). Participants in this study are from 5 different states and 19 different schools. The children in this study had a range of chronic illnesses including cancer (12), spinal muscular atrophy (3), immunodeficiency disorder (2), heart failure (1), and unintentional injury (1). The children ranged in age from 5 to 18 years old, with 10 male students and 9 female students. The median attendance was 10 hours per week, with a range of 3 to 32.5 hours. Some school districts limit the number of hours the student can occupy the robot; other schools let the student choose when and how long to occupy the robot. The number of hours per week spent on the robot is significantly more than the 4–5 hours of home tutoring students would normally receive, but not a full 30–35 hour school week. In no case was the robot used by more than one student, contrasting with robot use in conferences and some hospital settings.
Table 2.
Homebound children (i.e., cases) in this study.
Pseudonym a | Grade | Condition | Robot Used | Hours per week on the robot |
---|---|---|---|---|
Daphne* | K | Spinal Muscular Atrophy | Double | 10 |
Tina* | 5th | Cancer | Double | 11 |
Tara* | 6th | Cancer | Double | 11 |
Beth* | 7th | Cancer | Double | 11 |
Nick* | 9th | Unintentional Injury | Double | 11 |
Bailey | 11th | Autoimmune | Double | 11 |
Ben | 1st | Cancer | VGo | 15 |
Hannah* | 1st | Spinal muscular atrophy | VGo | 20 |
Ian | 1st | Cancer | VGo | 10 |
Robert* | 1st | Cancer | VGo | 22.5 |
Nancy | 2nd | Cancer | VGo | 12 |
Nathan | 2nd | Cancer | VGo | 30 |
David | 3rd | Immunodeficiency | VGo | 3 |
Samuel | 5th | Heart | VGo | 32.5 |
Daniel | 6th | Cancer | VGo | 10 |
Victor | 6th | Cancer | VGo | 22.5 |
Dana | 8th | Cancer | VGo | 20 |
Eileen | 9th | Cancer | VGo | 20 |
Marco | 12th | Spinal muscular atrophy | VGo | 25 |
* These children were not available for interview; instead, data was collected from the child’s parent or teacher.
a The real names of participants are not used in this paper; all names are pseudonyms to protect the confidentiality of all participants.
Of the 19 students covered in our study, 11 were interviewed in person. Not all students were available for interviews due to medical issues. Data for the remaining eight students was collected from our interviews with their parents or educators; these interviews followed the same questions and structure as the interviews conducted directly with homebound students.
In addition to collecting data from the homebound children (n=11), whenever possible, we interviewed and observed their parents/guardians (n=16), teachers (n=20), and school administrators (n=16) and conducted focus groups with the classmates (n=44). All told, 107 people participated in this study.
In addition to the N=107, classroom observations of 45 classmates were conducted in two separate classes where robots were deployed. The first author sat in the back of the classroom and noted the various interactions the students and teacher had with the child-on-the-robot, both with the remote person and with the robot as an object. These 45 classmates are not counted in the N=107 because, due to issues with district parental consent forms, we were not able to conduct focus groups with the classmates. However, the teachers in these classrooms were interviewed for the study and are counted in the N=107. Observations and field notes on classmate interactions with the robot were recorded. Notes from these two observations were consistent with data from interview and focus group sessions and, as such, support our recommendations.
Informed consent.
All participants were provided with university Institutional Review Board (IRB) and local school district approved study information sheets. Study information sheets were read aloud by the interviewer before each interview to provide ample time for questions about the study. Adult participants provided written or verbal consent before being interviewed. Child participants received parental permission and gave verbal assent to being interviewed before interviews were conducted.
3.4. Robots
The robots used in our study were the VGo, shown in Figure 4, and the Double, shown in Figure 5. In all of the cases, the robots moved around the classrooms, could move between rooms (e.g., to the computer lab, gymnasium, lunchroom), and in some cases even went on field trips. More details on the features of each robot can be seen in Table 1. All robots in our study were provided to students by their schools. Students did not share their robots with anyone else, allowing for consistent personalization (e.g., one robot was dressed in a pink tutu, the favorite color of the remote student). In all cases, the classmates did not have any prior experience with a fellow student on a robot.
Figure 4.
The VGo robot in the hall with other students, in class and the view from the student at home.
Figure 5.
The Double robot walking down the hall, in class interacting with a peer, and in a group work session.
3.5. Data Analysis
To increase trustworthiness in the data and confirm validity of the processes, data was collected from multiple sources. Triangulation of the data (protocols used to ensure accuracy and to rule out alternative explanations) was accomplished by asking similar interview questions of different study participants, by collecting data from different sources (i.e., homebound children, parents, classmates, and educators), and by using different methods (i.e., interviews, observations, and focus groups). It was expected that the concepts and themes related to the user experiences of operating telepresence robots in traditional schools would emerge from the multiple sources of data through inductive content analysis, open coding, and the constant comparative method. Interviews, observation field notes, and classmate focus groups were recorded, transcribed, and coded to identify patterns, similarities, and dissimilarities across all cases. Initial coding was performed on transcripts and field notes, where tentative labels were applied to sections of data; these labels were later classified under common concepts or categories as the data underwent multiple rounds of coding. A list of the code words for each transcript was compiled and compared across the individual cases. This allowed for checks to ensure that a code was used consistently throughout the transcripts. During these steps, notes were taken and recorded of emerging codes, the ideas they represented, and relationships between codes. The themes and concepts that emerged from the analysis were repeatedly compared with the transcripts to ensure their validity. The constant revision of the material allowed for some codes to be subsumed under broader and more abstract categories. Patterns and themes were characterized by similarity, frequency, and correspondence.
To answer the question, “What are the robot design features that matter for accomplishing learning tasks via telepresence robots in schools?” data was continually analyzed every year during a four-year period and underwent multiple rounds of coding. With every new batch of cases, consistent patterns across cases (e.g., all cases reported frustrations with connectivity) emerged. Coding resulted in four categories: 1) homebound student tasks on the robot; 2) teacher tasks; 3) classmate tasks; and 4) homebound student tasks at home. Under these four categories, the data was organized by sub-tasks that the participants expected to accomplish when operating or interacting with the telepresence robot. Evaluation of existing features and recommendations for improved robot design were based on participant feedback.
4. FINDINGS
In what follows, we present what was said and what we observed focusing on the expected tasks of the remote student while embodied in the robot. Data was collected from the viewpoints of the homebound children, classmates, parents, and educators where appropriate. We begin with what is required from the school infrastructure to bring the robot “live” for participation and progress to a description of the tasks required from the remote student, teachers, and classmates in the school environment. We end with a description of the home environment, the user interface, and the technology used to accomplish tasks from the home environment. All homebound students received the following technologies from their schools: a telepresence robot, a home device (such as a computer or tablet), and technical support from school personnel. No additional technologies were provided by the schools. Questions asked in the interviews were open ended in order to elicit responses that reflected salient user experiences. We provide counts of the offered responses only when a high number of participants reported challenges with a particular feature. Our sample size was too small to effectively identify effects of age or gender on our findings.
4.1. Participation Essentials
The robot alone is not able to provide the full virtual inclusion experience for a student. Virtual inclusion in school via telepresence robot requires consistent connectivity, power, and reliable communications. As school settings are vastly different in structure and technical support from healthcare, corporate, and conference settings, robot users in schools must ensure that 1) there is a technical infrastructure on campus that allows for strong connectivity; 2) remote students understand how their use of the robot affects battery life; and 3) there is a backup method for communication between the home and school when issues with connectivity or the battery occur.
4.1.1. Wi-Fi Connectivity.
The most cited frustration with the use of telepresence robots in this study was the Wi-Fi connectivity. Other researchers studying office and health care uses of telepresence robots have stressed the importance of connectivity [12,22,26,39,45,48]. Because students are mobile for long periods of time, connectivity is a particularly salient need for them. In our study, all 19 cases, their parents, teachers, administrators, and their classmates cited frustration with the connectivity of the robot and the remote student “turning off.”
Connectivity issues ranged from spotty connections throughout the day to prolonged absences caused by problems with the school’s network. Some were brief disruptions in which a student would suddenly be disconnected for a short time. Eileen said, “It loses connection a lot and like gets back on five seconds later and I miss like the middle of a sentence that the teacher would be saying.” Others were long-term disconnections. Eileen’s mother reported that, “There were times when she couldn’t go to class at all because we couldn’t get it to connect.” Nathan also reported that, “Sometimes it logs off and then it stays gray…takes like 30 minutes to log back on…” Victor had not been able to attend school for three months due to connectivity issues at the school. The principal of his school had provided the funds for Victor to have adequate Wi-Fi at his home, but the school’s technology team had not been able to work out the connectivity issues at the school in order for the robot to operate there. The robot sat unused and Victor reverted to limited home tutoring until the school’s technology team was able to provide the necessary hardware and connectivity. More commonly, the connectivity issues were spotty and related to the strength of the school’s Wi-Fi. Dan’s principal reported that “The big problem [was that]….we tested with no kids in the building and it ran…but once the students came…they got all those cell phones and tablets going and…suddenly there were dead spots that we didn’t find…and he would be driving…and it would just quit.” Even when additional routers were installed, administrators still failed to understand why the robot would disconnect while traveling through the school. An administrator blamed it on the robot’s system, saying, “It needs a stronger receiver system…cuz my phone and my tablet don’t lose connectivity where that robot’s going dead.” Understanding that connectivity might be an issue created opportunities for classmates to help in school areas where there were dead spots. Nick’s teacher pointed out that, “My understanding is that he drove himself with an escort, and the only time people needed to carry him was when the Wi-Fi knocked out or the Bluetooth knocked out in a dead zone in our school, which ironically is the hallway that you have to take to come out to where I am in the trailers…”
Hotspots.
Mobile hotspots are devices that tap into a cellular provider’s 3G or 4G wireless data service to deliver internet access at broadband speeds via a built-in Wi-Fi router. They work anywhere the data service receives a signal and vary in size and shape, but are typically the size of a cell phone or wallet. In this study, hotspots were used at some schools to provide consistent connectivity for the robot between transition points. However, the robots did not have designated ports or places to attach hotspots, and there were some issues reported with the placement of the hotspots. “We had to use a hot spot and that was spotty and it was on there with Velcro and so when they moved the robot, we were concerned it would fall off…and it still had some issues traveling…”
RECOMMENDATIONS:
Approaching this use of technology as a partnership between public schools and private industry would benefit the vulnerable children who are at the center of robot sales and marketing strategies. We recommend that designers of telepresence robots work closely with schools that purchase their robots to ensure that an adequate infrastructure is in place to support robot use for a full school day. In addition, to assist with dead spots in school hallways, designers should explore displaying a message on the screen that asks classmates or teachers to help move the robot to the next point of connectivity when failures happen. We also recommend that future robots incorporate cellular data connection capabilities or include a built-in port on the robot body for secure placement of a hotspot device.
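To make the hallway scenario concrete, the sketch below shows one way the on-screen request for help and a cellular fallback could be sequenced. Neither VGo nor Double exposes such an interface; the class, function, and banner text here are illustrative assumptions only.

```python
# A minimal sketch, assuming a hypothetical on-robot screen overlay.
class ScreenBanner:
    """Stand-in for the robot's face-screen overlay; a real robot would draw on its display."""
    def show(self, text):
        print(f"[screen] {text}")
    def clear(self):
        print("[screen] (cleared)")

def handle_connectivity_check(wifi_ok, cellular_available, banner):
    """One pass of a watchdog loop: keep the screen quiet while connected,
    otherwise fail over to cellular or ask nearby people for help."""
    if wifi_ok:
        banner.clear()
    elif cellular_available:
        banner.show("Switched to cellular - video quality may drop")
    else:
        banner.show("Connection lost - please roll me toward the nearest Wi-Fi point")

# Example: the robot hits a dead spot in the hallway and has no cellular fallback.
handle_connectivity_check(wifi_ok=False, cellular_available=False, banner=ScreenBanner())
```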
4.1.2. Battery Life.
In addition to losing connectivity due to issues with Wi-Fi service, users of the robots also struggled with loss of connectivity from limited battery life. We reported the importance of this feature in school settings in earlier work [31]. Cesta and colleagues [9] mention the importance of convenient battery-charging docking stations in office settings. Office and health care settings may have easily accessible charging stations, but schools struggle with the placement of docking stations, which at the time of this study typically cost $250 or more and require security and adult supervision.
Three of the nineteen homebound students expressed frustration with the battery life. The VGo comes with two battery options, offering a 6- or 12-hour battery life. At the time of this study, the extended battery (i.e., the 12-hour battery) cost an additional $149 and added 5 lbs to the weight of the robot. One set of classmates said they were frustrated at having to carry the robot when the battery ran out. A student who attends a full day of school must be able to rely on enough power to run at least six hours. There was one reported incident where the robot battery ran out completely while the robot was going up a ramp. A parent reported that the robot “shut off on one of the ramps…nobody was in the school and we’re calling…’my son is stuck…by the lunchroom…Could somebody take him and charge him?” An administrator commented, “The battery is lasting but he certainly can’t go all day,” and a student put it simply when she said, “It didn’t last all day.”
Many held the view that the battery ran out faster if the robot moved more. One student mentioned that she was told, “The more you roll, the more battery it wastes.” This was a problem for her because she was in high school and had to travel between classrooms spread out over two different floor levels. A teacher of another student commented that, “When he comes back from the gym, he’s almost out of battery ‘cause he’s been moving so much.” Another teacher suggested a design change: that the head move separately from the body, so the robot would make fewer gross movements when turning to see something. Four of the nineteen cases reported issues with battery life being affected by how much they used the robot to move around: three students expressed frustration (above) and one simply noted that moving used up the battery. All four cases were using VGos but were unaware of whether their robots were equipped with VGo’s optional extended-life battery.
Another teacher came up with a solution to compensate for the short battery life. “We just leave him on the charger so that the battery charges…battery, that’s a frustration…but we’ve been able to work around it…we keep the docking station at his desk…” While this work-around is good for a child who is in one classroom all day, this solution may not work for older students who travel to different classes throughout the day.
RECOMMENDATIONS:
Traditional students attend school for six to seven hours a day. In order to provide equitable and inclusive services for this population, robot designers need to provide a battery that will last for at least six hours. The six-hour duration of the battery must also allow for student movement, comparable to that of peers, throughout the school. Robot designers should provide users with realistic expectations of their mobility and the resulting impact on battery life. Docking stations should be designed to balance convenience and affordability with the security that is needed in school settings.
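As a rough illustration of why driving time, not just powered-on time, determines whether the charge lasts a school day, the back-of-the-envelope estimate below varies the fraction of the day spent moving. The capacity and power-draw figures are assumptions for illustration, not measured values for either robot.

```python
def hours_of_charge(capacity_wh, idle_draw_w, drive_draw_w, drive_fraction):
    """Estimated runtime given the fraction of the day spent driving."""
    avg_draw_w = drive_fraction * drive_draw_w + (1 - drive_fraction) * idle_draw_w
    return capacity_wh / avg_draw_w

# Assumed figures only: a 90 Wh pack, ~10 W idle while streaming video, ~35 W while driving.
print(round(hours_of_charge(90, 10, 35, 0.05), 1))  # ~8.0 h: sits at one desk most of the day
print(round(hours_of_charge(90, 10, 35, 0.30), 1))  # ~5.1 h: changes classes, goes to gym and lunch
```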
4.1.3. Secondary Communication Channels.
Kristoffersson and colleagues [22] stressed the need for a secondary communication channel when the robot loses connectivity. We found that for our study population, it was critical to have a backup method of communication. Whereas adults in offices or health-care settings have easy access to cell phones, access to student and teacher telephone/cell phone numbers varies according to school policies. Twelve of the nineteen participants and their parents reported using a cell phone to communicate with school faculty, staff, or peers when they encountered connectivity or battery issues with the robot. Teachers also reported using cell phones to communicate with the remote students when the robot did not function properly. One teacher, who could not share her personal cell phone number with students due to district policies, even reported asking a classmate (who was a close friend of the remote student) to text the remote student on his cell phone to see if he wanted to continue to attend class. Four of the students who used a cell phone for backup communication also used the classroom email function in Google Classroom. One student did not use a cell phone but used Google Classroom exclusively to communicate with her teacher and classmates when she experienced connectivity issues. A teacher recommended, “Have a backup plan …If something is going on where one of you can’t hear each other or there’s connectivity issue, have a chat opened separately…it can be through your Google Classroom…”
RECOMMENDATIONS:
Until Wi-Fi connectivity and adequate battery life are no longer issues in school robot use, a secondary communication channel is necessary for children using robots. Designers of telepresence robots may consider adding independent cell phone capabilities to the robot so that verbal communication is always enabled even if the robot loses connectivity to the school’s Wi-Fi. In addition, designers may want to explore integrating existing online services (such as Google Classroom) into the user interface.
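A minimal sketch of how a backup channel could be tried automatically when the primary video session drops is shown below. The channel callables (SMS via a cellular modem, a classroom chat post) are placeholders under stated assumptions; no particular service’s API is assumed.

```python
def notify_via_fallback(channels, message):
    """channels: ordered list of (name, send) pairs; send(message) returns True on success.
    Returns the name of the first channel that delivered the message, or None."""
    for name, send in channels:
        try:
            if send(message):
                return name
        except Exception:
            continue  # a failing channel should not block the remaining ones
    return None

# Placeholder channels: an uninstalled cellular modem, then a stand-in for a chat post.
channels = [
    ("cellular_sms", lambda msg: False),
    ("classroom_chat", lambda msg: print(f"[chat] {msg}") or True),
]
print(notify_via_fallback(channels, "Robot lost Wi-Fi in the east hallway at 10:14"))
```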
4.2. Homebound Student Tasks On The Robot
After the robot is purchased, the school’s Wi-Fi infrastructure is established, and connectivity is reliable, the remote student can begin actively participating in school activities. We outline expected tasks for these activities according to location and user. The first section describes expected tasks that take place in the school environment for the homebound student; the second section describes tasks that take place in the classroom for the teacher; the third section describes tasks that classmates may experience when interacting with the robot in the school environment; and the fourth section describes tasks in the home environment and on the user interface for the homebound student.
4.2.1. Attending Class.
In our study, students used the robots to attend traditional classes such as math, language arts, science, foreign languages, art, history, tutoring, physical education, and social studies. The opportunity to attend classes was appreciated by all participants; however, hours of class attendance varied by participant due to medical, physical, or school restrictions. Since homebound students needed some degree of flexibility in their class attendance, teachers appreciated knowing when the student was arriving in or leaving class, that is, when they were on or off the robot. Earlier research has found occupancy awareness, knowing that someone is on the robot and exactly who it is, to be an important social consideration for adults [12,48]. This social consideration is also important in schools, perhaps even more so, as students use the robots daily and the robot is their only method of interacting with peers. Three students reported that their VGos verbally announced when they had logged on to the robot and when they logged off. Although fitting the original purpose, this self-announcement turned out to be very disruptive when the robot was repeatedly going on and off due to spotty connectivity. When it announced “Samuel is in the room” even though Samuel had been in the room throughout class, it was at first comical; by the third or fourth time, it was annoying to the point that the teacher had to turn Samuel’s robot off entirely. This announcement feature was reported during year one of our data collection but not during years two and three. It is possible that VGo has discontinued this feature or made it optional. Double does not have an announcement feature when someone logs in to the robot. However, one classmate reported that it was “creepy” how quiet the Double was and that at times they weren’t aware that the robot was occupied until they saw it move.
RECOMMENDATIONS:
As humans, we make noise when entering a room and are physically visible to others. Robots located in classrooms may or may not be occupied. In order to facilitate the social presence of remote students and social acceptance of embodied robots by classmates, robot designers need to provide visible or audible features that signal occupancy of the robot. We recommend that robot designers introduce optional soft announcements for entrance and exit or a clearly visible (i.e., from all angles) light that turns on and remains on while the student is connected to the robot. Allowing different occupancy-awareness options that can be selected by the teacher or student would also give the remote student an opportunity to choose an announcement that suits their personality.
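The sketch below illustrates one way a selectable occupancy cue could behave, including a quiet period so that the repeated re-announcements caused by spotty connectivity (the behavior that disrupted Samuel’s class) are suppressed. The light and announcement hooks are hypothetical callables, not an existing robot API.

```python
import time

class OccupancyIndicator:
    def __init__(self, light_on, light_off, announce, mode="light", quiet_period_s=300):
        self.light_on, self.light_off, self.announce = light_on, light_off, announce
        self.mode = mode                    # "light", "announce", or "both" - chosen by student/teacher
        self.quiet_period_s = quiet_period_s
        self._last_announce = float("-inf")

    def student_connected(self, name):
        self.light_on()                     # always-visible cue that the robot is occupied
        now = time.monotonic()
        if self.mode in ("announce", "both") and now - self._last_announce > self.quiet_period_s:
            self.announce(f"{name} has joined the class")
            self._last_announce = now       # suppress repeats caused by a flaky connection

    def student_disconnected(self):
        self.light_off()                    # leave quietly; no announcement

indicator = OccupancyIndicator(lambda: print("light on"), lambda: print("light off"),
                               print, mode="both")
indicator.student_connected("Samuel")       # announces once
indicator.student_connected("Samuel")       # reconnect within the quiet period: light only
```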
4.2.2. Personalization.
Prior literature on telepresence robots in offices, conferences, and health care settings reports instances of personalizing the robots to easily identify who is occupying the robot [26,29,36,51]. For students, school context plays a critical role in their identity development. More personal elements, such as self-image and the integration of the student into the classroom group, are also related to identity development [24]. For homebound students using telepresence robots, how the robot looks and interacts represents their identity in the classroom group. Personalizing their robots with clothing or preferred “face” images may also help students integrate more easily into their school communities, where various personalities and age groups interact for different purposes.
Clothing.
Homebound children in our study often expressed their identity by dressing the robots, being particularly sensitive to how they came across to their friends. Ten of the nineteen cases in our study dressed and personalized their robots at least once for the school day or a school event. In a related local news story [6], a 2nd grader (known for wearing pink) dressed her robot in a pink tutu and necklace, shown in Figure 6.
Figure 6.
The Double personalized to represent the student.
Because neither the VGo nor the Double was built to be dressed, eight students using the VGo taped a hanger to the back and put a t-shirt on it to personalize their robots. The Double has a convenient opening on the back of the robot where a certain model of hanger can be placed. Two students using Double robots reported using this feature to personalize their robots with shirts. Even the color of the robot evoked a connection with the person. A classmate commented on the VGo, “I like that the robot is white, because white is one of my favorite colors.” Dressing the robot did have some drawbacks. An administrator recalled that a VGo robot was not operating properly because the hem of a jersey was blocking the cliff sensor and the robot was not able to move forward.
Live video feed.
Personalization of the robot also occurred through the image on the screen of the robot. For example, Nathan, who did not report personalizing his robot for school, still engaged in the social norm of getting ready for school. He dressed himself in his school uniform every day, even though he was attending school from home. His adherence to the school-required uniform was self-initiated and visible to anyone who interacted with him via the robot. One teacher, however, noted the downside of having a live video feed into the classroom. The student “… would be on his robot trying to take part in class and get physically ill, where he may start vomiting….and we would see that.” Even during these bouts of illness, the student insisted on remaining in class via the robot and not having his robot turned off. The teacher suggested that the camera of the homebound student be turned off, though the sound remain live. In addition to displays of physical illness, some students experience changes in their physical appearance due to illness. Liu and colleagues [28] found some children with chronic illness to be very concerned with changes in their appearance due to illness. The option of a static picture may help these students.
RECOMMENDATIONS:
To assist students with portraying their identity to peers and their school communities, we recommend that robot designers allow for personalization options, such as clothing, on the robot body that will not affect the sensors or cameras. Additionally, for times when the live video feed may not provide a desirable representation of the remote student, the robot should be able to project a static picture on the face screen, according to the child’s preference. Even with the static screen, the real-time audio feature can continue, as can the student’s ability to see and move in the classroom, supporting engagement in school communication and learning.
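A small sketch of the static-image option follows: the face screen switches from live video to a chosen picture while audio and driving remain active. The display hooks and method names are our own assumptions, not features of either robot’s software.

```python
class FaceDisplay:
    def __init__(self, show_live_video, show_image):
        self.show_live_video = show_live_video   # placeholder display hooks
        self.show_image = show_image
        self.mode = "live"

    def use_static_image(self, image_path):
        """Hide the camera feed (e.g., during illness) without leaving class."""
        self.show_image(image_path)
        self.mode = "static"          # microphone, speaker, and drive controls are unaffected

    def use_live_video(self):
        self.show_live_video()
        self.mode = "live"

# Usage with stand-in display hooks.
face = FaceDisplay(lambda: print("live video"), lambda p: print(f"showing {p}"))
face.use_static_image("school_photo.jpg")
```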
4.2.3. Getting Attention.
How we gain attention from others varies by setting. In an office setting we may signal with our hands, by leaning forward, or with our voice that we have something to say; in health care settings we may simply call out or press a buzzer. In schools, students are taught to raise their hands. The PEBBLES robot [17] came equipped with a hand for the child to raise to gain attention in the classroom. For the variety of settings a schoolchild faces (e.g., classroom, lunchroom, hallways, group work), gaining attention varies and is much more of a challenge via a robot.
The remote student using the VGo has three ways to get attention: speaking up, moving toward the target person, and turning on a blinking light. The remote student on the Double may speak up, move towards the target person, or raise their “head,” making the robot taller. For informal conversations, merely speaking up seemed to be sufficient for both models of robot. The audio was loud enough to be heard, for example, by fellow students walking with the robot down the hall.
The ability of the homebound student to move provided a second way to get attention in the classroom. Several teachers reported the robot “rolling right up” when the homebound student wanted to ask a question or join a group. The blinking light or raised head was used in more formal efforts to communicate. During our two focus group discussions, both remote students actively blinked their lights to signal that they were waiting for a turn to speak. Overall, when asked how they gained attention from the teacher, nine students reported blinking their lights (VGo); two reported simply calling out; two made themselves taller (Double); two used text messages to the teacher or a friend; and four provided no comments on how they got the teacher’s attention. The two students who raised the head of their Double to get attention reported that they did not use this feature very often because it was slow.
RECOMMENDATION:
The audio feature (i.e., calling out) on currently available, off-the-shelf telepresence robots is the most consistent feature for requesting the teacher’s attention. As not all students are comfortable calling out when they need assistance, and not all teachers allow calling out as a form of requesting help, we recommend that robot designers consider adding a visual cue (e.g., a visible light) or an audio cue (e.g., a subdued tone that is not disruptive to the class) to signal to the teacher that the student is “raising a hand.” A consistent design feature for “raising a hand” would allow remote students to continue following normative social protocols for requesting their teacher’s attention.
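The following sketch shows one possible “raise hand” behavior: a single action from the home interface turns on a visible light, optionally plays one subdued tone, and persists until cleared. The hardware hooks are assumed for illustration; neither robot exposes such an interface today.

```python
class HandRaise:
    def __init__(self, light_on, light_off, soft_tone=None):
        self.light_on = light_on
        self.light_off = light_off
        self.soft_tone = soft_tone
        self.raised = False

    def raise_hand(self):
        if not self.raised:            # a second press does not replay the tone
            self.light_on()
            if self.soft_tone:
                self.soft_tone()
            self.raised = True

    def lower_hand(self):              # cleared by the student or the teacher
        self.light_off()
        self.raised = False

hand = HandRaise(lambda: print("light on"), lambda: print("light off"),
                 lambda: print("soft tone"))
hand.raise_hand()
```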
4.2.4. Viewing Objects In The Classroom.
The importance of good vision for learning in schools has been documented since school vision testing was established in the United Kingdom in 1907, and vision screening remains universally recommended [16,44]. It is also important for adults to be able to view objects and documents in the office, workplace, or classroom. Desai and colleagues [12], as well as Venolia and colleagues [51], reported the need for camera pan and tilt features; the need for zoom capabilities was found by researchers of office and health care contexts [21,36,37,51]. Having the head move separately from the body has also been recommended in the literature [12,43,49].
Looking to the left and right.
Neither the VGo nor the Double camera can pan (i.e., move to look left and right). Consequently, if a child wants to look at something to the right or left of the robot, the entire robot has to turn. This was a challenge for Sam, who wanted to watch the teacher as she spoke while walking around the room. His mother commented on his vision via the robot, “you have…no peripheral vision…it’s more straight focus…if they’re like get this sheet out…he has to turn the whole robot around…and it makes everyone look up.” Sam’s mother was referring to the teacher holding up a worksheet; in order for Sam to know which sheet the teacher was referring to, he had to turn the entire robot to see. Many times, he turned too slowly or was not able to focus quickly enough and had to ask the teacher to say out loud which worksheet she was referring to.
Looking up and down.
The VGo camera can tilt up and down, which allowed students to look down to view materials on the desk. A teacher made a positive comment about the movement of the camera: “The robot can move its head [actually only the camera] up and down…so if he was working with another teacher…he would be able to face down and see what she was writing.” However, the Double camera cannot tilt up and down. Double users reported needing to back away from the desk to get a better view of materials on the desktop. As humans, we tend to move towards things we want to see better, and the practice of moving away from something for a better view was difficult for some students to grasp.
Viewing interactive white boards.
During class time, students must be able to read information off bulletin boards, chalkboards, and interactive whiteboards (large rear-projected screens). In all four classrooms that were observed, the robot was positioned near the front of the room in order to maximize visibility of the traditional or interactive whiteboard at the front of the room. The robot’s camera was best suited for the homebound student to read high-contrast information (i.e., black writing on a white background). Homebound students had complaints about their ability to see classroom material: “I couldn’t see everything that was written on the board.” “[She had trouble with] the document projector…cuz the white paper…the glare…she couldn’t see the writing on there.” “Depending on if the light is shining on it…We figured out…he can get in front and see like head on. It’s more difficult if it’s at an angle…’cause the light just reflects funny.” For a VGo user, “[She had trouble with] the SmartBoard, an interactive whiteboard. When she would zoom in, the words would get blurry.” Ten students reported having problems reading material on interactive whiteboards. One teacher, when asked what changes she would like, said, “[I’d like] a slate, a tablet-like device where if I’m teaching in my classroom and I’ve got the SmartBoard on,…instead of having the kid go up to the board and write…they could write on this [tablet] and then it appears on the Smartboard.” This would allow the homebound child’s work to be shown to classmates, just like physically going to the board.
RECOMMENDATIONS:
To more closely replicate how seated students use their eyes and head to view objects in a classroom, we recommend that robot designers provide features with similar “head” and “eye” movements, such as a camera with full pan and tilt capabilities. The need to control the full robot body to gain a clearer view of objects is not intuitive given how traditional students behave. Designers should also explore developing software that coordinates with tablets and related devices for students who have difficulty viewing interactive whiteboards. The tablet may allow the homebound student to “write” on the interactive whiteboard from home. We also recommend that robot designers improve cameras for better visibility of digital images (e.g., interactive whiteboards) via the robot.
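To illustrate the pan/tilt recommendation, the sketch below steers a hypothetical camera gimbal in small increments, clamped to assumed mechanical limits, so the student can glance around without rotating the robot base. The limits and the command hook are assumptions, not specifications of either robot.

```python
def clamp(value, low, high):
    return max(low, min(high, value))

class GazeController:
    PAN_LIMIT_DEG = 90     # assumed left/right range
    TILT_LIMIT_DEG = 45    # assumed up/down range

    def __init__(self, send_camera_command):
        self.send = send_camera_command
        self.pan = 0.0
        self.tilt = 0.0

    def nudge(self, d_pan, d_tilt):
        """Move the camera a little; the base does not turn."""
        self.pan = clamp(self.pan + d_pan, -self.PAN_LIMIT_DEG, self.PAN_LIMIT_DEG)
        self.tilt = clamp(self.tilt + d_tilt, -self.TILT_LIMIT_DEG, self.TILT_LIMIT_DEG)
        self.send(self.pan, self.tilt)

gaze = GazeController(lambda p, t: print(f"pan={p:.0f} tilt={t:.0f}"))
gaze.nudge(20, 0)    # glance right to follow the teacher
gaze.nudge(0, -30)   # look down at the worksheet on the desk
```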
4.2.5. Participating In Class Discussions.
Participating in class discussions is an essential aspect of being present in the classroom. The ability to hear what the teacher is saying, speak, and interact with peers is central to both academic and social learning.
Hearing.
Audio issues and the importance of hearing clearly via the robot have been noted in the existing literature [26,29,34,48]. The robot’s microphones provide hearing for the remote student. In the classroom, some homebound students reported that they were unable to detect where a voice was coming from if the person was not within visual range of the robot.
The microphone on the Double is located in the front beneath the face screen. On the VGo, microphones are located in the front and back, allowing students to hear everything around them. On both robot models, the remote student has the ability to control the volume of the microphones, including muting them. One student reported muting the back microphones because he was confused about the physical location of activities and people. His mother stated that he did that “so he can kinda track better because…when they’re both on, and somebody talks, he doesn’t know if they’re behind or beside him.” This student also reported that he turned off the back speakers because “It’ll echo like in the front and back.” Several students commented on the echoing of voices through the robot. One student using the VGo reported continually keeping his microphones muted because, “If it’s unmuted and the people on the other end say something, it’ll kind of echo through the robot.” During a focus group discussion, a classmate asked, “Why does sometimes your voice echo back? In the robot?” Nathan, the remote student in this case, attributed it to bad internet, but it is more likely due to sound from his speakers feeding back into his microphone.
Speaking.
Many researchers studying telepresence robots in office and health care settings noted the need for different volume levels for different interactions [26,29,34,48]. However, adults follow much more consistent norms for appropriate volume in office and health care settings than children do in school. This feature is especially important in schools because they have a wider range of physical settings in which different volume levels are needed for active participation, and children have a more dynamic range of what volume is acceptable across those settings. It is also difficult for homebound students to know how loud they are in these different environments, as they are not able to hear themselves the way physically collocated students can.
Classroom.
At times, student volume needs may change simply based on the energy in the room. Because one of the VGo speakers is on the front near the screen and the Double has its speaker directly underneath the face screen, the voice appropriately appeared to be coming from the homebound student’s mouth on both models of robot. However, the placement of the speakers did not appear to help with echoing: echoing sounds emitted from the speakers were reported as a challenge that interfered with the students’ ability to speak in the classroom. A school counselor who was responsible for troubleshooting technology issues on the Double shared that “there were some issues on days about either echoing or volume…I don’t know what device it was coming from or if it was just a joint thing through the program, that I don’t know.” In this case, the echoing issue occurred on the classroom end. If the issue was not resolved quickly, the teacher would mute the robot because it was disruptive to the class. When there was some down time, the teacher would unmute the robot and try to troubleshoot the problem with the remote student.
Groupwork.
The school environment has periods of quiet in the classroom (“indoor voices”), noisy periods, and moderate noise during groupwork. Groupwork is common in the classroom because the same instructional style may not fit everyone. Barr and Dreeben [2] found that teachers create subgroups of similar students to manage activities not easily handled in the classroom as a whole. Because the robot is mobile, the homebound student can move to join their peers for groupwork. A teacher noted, “He could roll right up to their desk.” Since the robot has a volume control on it, the students in the group can adjust the volume to suit the situation. Students reported, “We could turn the volume…just like the kids whispering, only the group could hear.” During certain group activities, if the remote student had to speak with only one person, s/he used the VGo headphone or earbud jack. The Double robot uses the iPad headphone port for a forward-facing microphone, so to use headphones with the Double, peers would need to know to unplug the microphone before plugging in the headphones.
Lunchroom.
Of the students who ate lunch in the cafeteria, a VGo user commented that, “They had trouble hearing me because the lunch room’s so loud and my robot is loud but not compared to the lunchroom…” Another student, also a VGo user, reported, “sometimes…I just keep calling them…if they keep not answering me. Sometimes it’s too loud at lunch.” Users of the Double robot reported eating lunch with friends but not in a lunchroom; these students ate lunch with their friends in classrooms. As such, they did not report challenges with their robot’s volume being adequate for a school lunchroom.
RECOMMENDATIONS:
Hearing: To better assist remote students with locating the source of a sound, we recommend that designers provide features such as echo-canceling microphones on the robot or a visual on the user interface that would allow students to locate the direction of incoming sounds.
Speaking: In real-world settings, we use sliding scales for volume on digital devices when we are in the room with the device’s speaker. This feature has not proven to be as useful for remote students using robots. To better provide these students with volume awareness, we recommend visual tools, beyond a sliding scale, on the user interface to assist students with determining appropriate sound levels for different situations. We recommend features like a numeric scale on the user interface to help students know which volume number is appropriate for which school situations (e.g., classroom volume could be at a “4” and lunchroom volume at a “9”), as sketched after these recommendations. Volume controls should allow for volume levels beyond what is needed in workplace settings to include activities in dynamic areas such as lunchrooms. It may be beneficial to add a “yell” feature if the robot is expected to attend sporting events or other activities where classmates will be allowed to yell.
In addition, to assist classmates in hearing the remote student via robot and replicate normative speaking practices, the speakers should be near the screen of the robot, projecting sound as if coming from the mouth. For situations when the remote student would like to have more personal communications similar to whispering, the robot should have an audio jack so that earbuds or headphones can be attached for one-on-one communication and groupwork.
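To make the numeric-scale idea concrete, the sketch below maps school situations to preset volume levels on a 1–10 scale. The classroom "4" and lunchroom "9" come from the recommendation above; the remaining presets and the set_robot_volume hook are illustrative assumptions rather than values drawn from our data.

```python
# Sketch of a situation-based volume preset control for the operator UI.
# The classroom ("4") and lunchroom ("9") levels come from the recommendation
# above; the other presets and set_robot_volume() are hypothetical.

VOLUME_PRESETS = {
    "classroom": 4,
    "groupwork": 3,   # assumed: quiet, small-group conversation
    "lunchroom": 9,
    "assembly": 7,    # assumed
    "yell": 10,       # assumed "yell" preset for sporting events
}

def set_robot_volume(level: int) -> None:
    # Placeholder for whatever volume command the robot platform exposes.
    print(f"[robot] volume set to {level}/10")

def apply_preset(situation: str) -> int:
    """Clamp the preset to the 1-10 scale and push it to the robot."""
    level = max(1, min(10, VOLUME_PRESETS.get(situation, 4)))
    set_robot_volume(level)
    return level

if __name__ == "__main__":
    apply_preset("classroom")   # e.g., "4" during instruction
    apply_preset("lunchroom")   # e.g., "9" in the cafeteria
```

Presenting these as labeled buttons rather than a slider would let a young operator pick "lunchroom" instead of guessing how far to drag a scale.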
4.2.6. Sitting Versus Standing.
Relative height influences ease of communication and conveys relative power [7,18,38]. Prior research has found adjustable height on a robot to be important [1,12,29,36,39]. The VGo stands 4 feet tall, about the height of younger elementary school children, so conversation with the VGo among middle and high school students is less natural, as shown in Figure 7. Its height is not adjustable; the robot cannot "stand" or "sit" to maintain eye contact with peers who choose to stand or sit while talking. The Double has adjustable height suitable for both sitting and standing. None of our students commented on robot height as a challenge; however, one of our teachers stressed the importance of adjustable height for classwork. She mentioned that even when "sitting," the adjustable height was helpful in adjusting the homebound student's sight line around something blocking the view of the board, the teacher, or other objects in the classroom. The value of this feature (i.e., sitting/standing) may be higher for educators, as they are the ones responsible for ensuring good visibility of materials for all students.
Figure 7.
VGo with middle school students.
RECOMMENDATION:
Sitting and standing activities occur frequently in school settings. To better emulate classmate body language and provide a sense of normalcy for the remote student, we recommend that the robot be able to "stand" and "sit" alongside peers via remote-controlled adjustable height. Remote-controlled height adjustment also allows for improved communication and viewing of objects without teacher assistance.
4.2.7. Completing Assignments and Taking Tests.
Unlike adults in workplaces and conferences, most students take tests or assessments to gauge learning. Having the remote student take tests in real time proved difficult to manage. Taking tests in real time currently followed this pattern for the robot users in our study: a test was sent to the student (usually emailed as a PDF); the student downloaded and printed the test; the student marked their responses; the student scanned the completed test; the student uploaded it and emailed it back to the teacher; the teacher downloaded it; and then the teacher either scored it online or printed it to be scored with the other tests. While this method for taking tests in real time is technically "accessible," it is not very usable, especially by younger children. One teacher had a clever work-around; she designated a local student to be the homebound student's agent: "Like an oral quiz. I'd have one student put on ear buds so only they can hear him, and they would read into his speaker…and he could give the oral…answer…and only she heard." Another teacher expressed her wish for fax machines both at home and in the classroom. She explained that she would like to feed the test into the classroom fax machine so it would print out at the home; the student could then feed the completed test into their fax machine so it would print out in the classroom, where the teacher could collect it in real time with the other tests.
Some teachers prepared tests online, which made test taking easier to manage. However, online test taking has not yet reached all schools, and not all schools are open to using web services such as Google Classroom. For a teacher who does not already design tests to be taken online, this type of test preparation would be an extra burden. Similar issues arise with daily handouts, worksheets, and assignments.
RECOMMENDATION:
We recommend that robot designers consider what technological features could be added to a telepresence robot to facilitate test-taking and assignment completion in real time. Ideally, future robot technologies would allow the teacher to hand out an assignment to the robot as easily as handing it to a local student, and allow the remote student to hand it back in as easily as if they were seated in the classroom.
4.2.8. Participating In Computer Labs.
A class in the computer lab presented another challenge. While students at school worked on their computers, the remote student had to change his computer screen from displaying the camera's view of the room to displaying the technical material, and when he did so he could no longer "see" the class. Daniel's teacher commented, "If we say 'let's go to this website and look at this' then he no longer sees us… He's like out of the classroom. He can hear but he can't see 'cause he's changed screens on his computer." Some students have attempted picture-in-picture on the same device, but this is less preferred because the picture of the classroom becomes smaller on the user interface, which may disrupt the immersive experience.
Samuel's mother reported that they worked around this problem by having two laptops open for computer class because, "You can't see what they're doing. That's the only downside…'cause the video still works and they're seeing him but he can't see what they're doing. So that's why we bring the other [laptop] so he can see what they're doing." A second computer or tablet would also be necessary if the student were going to take tests or do worksheets electronically, or "go to the board" when an interactive whiteboard is in use, as previously mentioned. Having two home devices for school attendance may be a solution, but it may not be financially feasible for school districts to issue two laptops to each remote student.
RECOMMENDATION:
As computer use in classrooms increases, remote students of all ages will need more than one screen. Some children, especially younger ones, struggle with splitting their screens so they can keep interacting via robot while completing work for a computer class. Other children may not have the resources to purchase two screens or devices for attending school. To meet these challenges, we recommend that robot designers provide a user interface with an easy-to-access split-screen mode for when the child is using software programs or websites for school work.
4.2.9. Moving Throughout The School.
The mobility of the robots is what allows students increased autonomy and engagement in their school communities. Central to mobility is the ability to see where you are going. The need for a wide-angle view has been noted in the literature [12,21,29,51] and remains important in schools because students may travel long distances in complex physical settings. The need for two cameras (one forward-facing, another facing downward) has also been stressed as important for navigation [12,22]. The Double has a downward-facing camera to help with navigation, and the VGo provides a semi-circle on the user interface to convey awareness of the robot's footprint.
In our study, robots were observed not only in four classrooms but also in three hallways and a gymnasium during a book fair. Students reported attending libraries, assemblies, church, gymnasiums, auditoriums, stadiums, robot clubs, and museums. They had to navigate hallways, elevators, and ramps via the robot, traversing door thresholds and several types of flooring: linoleum, carpet, tile, concrete, blacktop, wood, and more. Within each local environment there are combinations of desks, shelves, plants, and other obstacles for the homebound student to navigate around. Controls for navigating both models of robot in our study were arrow keys, track pads, joysticks, and computer mice.
Challenges included stairs, elevators, ramps, doorjambs, and walls. The VGo, weighing 18 lbs., is light enough to be picked up and carried, as shown in Figure 8 at the stairway. An administrator reported that the robot was "a little wobbly" going over the door threshold: "It went over…but we had to make sure it didn't fall…" One administrator reported feeling that the robot was "underpowered" when going up ramps. Going up ramps successfully requires a strong motor and sufficient tread on the wheels, whereas navigating door thresholds depends on the robot's balance.
Figure 8.
Picking up the VGo at the stairs.
Remote students on the robot also needed help opening doors and pushing elevator buttons. A remote student reported being late to class and finding the door closed. Unable to open it or even knock, he moved the robot to face the door a few feet from it. He then moved forward as quickly as he could and rammed the door to “knock” on it. Another challenge came when a student was left inside a classroom, “Sometimes I’ll get locked in a room and I can’t unlock the door or open it…so lots of times I just wait there…I have the lights on though…so I don’t sit in the dark.” Commercially available telepresence robots do not have arms to allow students to open doors or push buttons. Faced with an Americans with Disabilities Act (ADA)-compliant door button (Figure 9), one student was reported as “crashing” into it to push it. A fifth-grade homebound student (VGo user) reported feeling frustrated when he came to school and found his classroom dark and empty. He turned on his lights and rolled around in the classroom but could not open the classroom door to find his class.
Figure 9.
Disability Access Door Opener.
Collisions.
Collisions with the robots have been documented in earlier research on adult use [12,21,29]. In our study, one school administrator reported originally being concerned for the safety of other children if the robot crashed into them or fell over. However, he reported that his fears had been unfounded due to the light weight of the VGo. There were no reports of the robot falling or crashing into another student, but weight is a design consideration if the robot will be interacting with small children: while the robot is stable, small children may not be. When Nathan was at a book fair, we witnessed a classmate rush to get to a book and inadvertently bump into the VGo. The robot teetered for a few seconds but regained its balance and continued on its way. It did not fall over in this case, though it is easy to see how a stronger bump would have caused it to fall. At 18 lbs., running into the robot does not seem to pose any greater risk for small children than running into a real classmate.
Five students experienced crashing into things when learning to drive the robot, causing embarrassment for some students. Eileen reported, “In my first period class…every single person would stare at me and like crack up laughing if I ran into something.” For other students the crashing was reported as happening only “at first” when they were learning to “drive” the robot. In the two focus groups with classmates, students reported continued crashing and even falling. Nathan’s classmates commented, “Sometimes he bumps into a lot of desks” or, “He usually bumps into a lot of stuff.” Nathan himself said simply, “I crash a lot.”
Samuel felt that his range of vision affected his navigation, resulting in his bumping into things. He said, "I wish it had a wider screen so I wouldn't crash all the time…well a wider camera." Samuel's mother said, "I wish he had a backing camera so he knows where he's backing up." She also explained how she tried to help the classmates understand Samuel's vision via the camera: "I was trying to give a demonstration because they're like 'why do you keep running into the wall?'" She said that she told them, "If you close one eye and do this [make a circle with one eye and cover the other one] that's how the robot sees."
Falling.
Samuel also struggled with his VGo robot falling over. During the focus group, Samuel commented, “I hug the ground” and went on to explain that this meant the robot fell over. When questioned as to how many times he had actually ‘hugged the ground,’ he reported the robot falling four times and needing to be picked up by a buddy. His classmates confirmed that he “fell a lot.”
Erratic behavior.
Samuel also found that his VGo robot behaved erratically, which caused some frustration. His mother reported, "He gets really, really, frustrated when it will start spinning or run into the walls or things like that." Samuel followed with, "The wheels get a little crazy… it hit the door jamb…and then shut off…we couldn't get it back on for a while." When asked if that happened often, Samuel replied, "It ran into the wall yesterday after math, and then it shut off…" Samuel's classmates also reported, "When he's driving around, he crashes and everything." They tried to troubleshoot the problem, saying, "The robot's back tires, like, it gets messed up and he bumps into walls and stuff." The Double robot has also displayed some erratic behavior; it reportedly "lurched forward and back uncontrollably." This behavior influenced one school district's decision not to use the Double in its classrooms.
RECOMMENDATIONS:
To provide remote students mobility similar to "walking" around their school and among their peers, we recommend that robot designers produce robots that are lightweight enough to be safe around small children, stable enough to keep from falling over, and powerful enough to ascend ramps that comply with ADA guidelines. The camera should provide a wide angle of view and be augmented with a downward-facing view to assist with navigation. In addition, the camera should be sturdy enough not to break if the robot falls over or collides with objects or people. Also, if homebound students are going to find clever ways to "crash" into doors or buttons for ADA access to school areas, the body of the robot should be robust. It is unclear what causes the erratic behavior of the robots, but designers should be aware of this challenge.
4.2.10. Extracurricular Activities.
For children living with long-term medical conditions, developing peer relationships and the support of friends are vitally important [3]. In our study, students attended several extracurricular activities via the robot. Seven students reported eating lunch with friends via the robot: four in classrooms and three in the cafeteria. Many of the recommendations outlined above apply to these environments. Students also used the robot to go on field trips (e.g., a visit to the Capitol, a professional baseball game), attend extracurricular activities (e.g., choir auditions and rehearsals, Boy Scouts, a book fair, freshman orientation, homecoming, and a costume party), and attend religious functions such as mass. A classmate noted, "He goes up for his blessing just like the rest of the kids do." Attendees at some of these functions (choir, scouts, dances, costume parties) are expected to wear special outfits, which strengthens the earlier recommendation that the robot be easy to dress.
In these activities, the robot is likely to move outside the school's Wi-Fi infrastructure, so we recommend the use of a mobile hotspot. In one case, a student attempted to use her VGo to attend a public function that had strong, public Wi-Fi, but she was unable to connect because of restrictions the VGo places on its encrypted feed. In another instance, a school district was unable to use VGos in its schools because the VGos could not access the school's internet due to security features on the school's network. The encrypted feed of telepresence robots is needed for schools but caused access issues in our study.
RECOMMENDATIONS:
To provide remote students with physical access comparable to their peers', we recommend that future robots have the wheels and power needed to attend extracurricular activities, move along with groups of peers, navigate public spaces, and traverse various floor surfaces. In addition, the robot should have the option to connect to public Wi-Fi networks with parental permission (similar to the parental permission required for all students on field trips). Access to public Wi-Fi could be enabled with a restriction code entered on the robot, similar to the restriction codes found in parental controls on mobile devices, as sketched below.
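As a sketch of how the parental-permission idea might work (purely our assumption; neither robot studied offers this), the robot could require a short restriction code before joining any network that is not on a school-approved allowlist. The network names, the code, and the may_join_network helper below are hypothetical.

```python
import hashlib

# Illustrative values only: the approved network list and the parent-set code
# are assumptions, not configuration drawn from either robot.
APPROVED_NETWORKS = {"District-School-WiFi"}
RESTRICTION_CODE_HASH = hashlib.sha256(b"4821").hexdigest()  # store a hash, not the code

def may_join_network(ssid: str, entered_code: str = "") -> bool:
    """Allow school networks automatically; require the parental restriction
    code before joining any other (e.g., public) Wi-Fi network."""
    if ssid in APPROVED_NETWORKS:
        return True
    if not entered_code:
        return False
    return hashlib.sha256(entered_code.encode()).hexdigest() == RESTRICTION_CODE_HASH

if __name__ == "__main__":
    print(may_join_network("District-School-WiFi"))        # True: school network
    print(may_join_network("Museum-Public-WiFi"))          # False: no code entered
    print(may_join_network("Museum-Public-WiFi", "4821"))  # True: code matches
```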
4.3. Teacher Tasks
4.3.1. Handing Out and Receiving Assignments.
In our study, all remote students received packets of paper, manipulatives, and reading materials via parents, siblings, or home instruction teachers. Teachers and administrators expressed interest in features that would reduce the time spent assembling take-home materials, allow for increased teaching flexibility, and make the remote student "closer to being just another kid in the classroom." Many teachers in our study requested a better way to get a child his/her assignments in real time, as lesson plans were sometimes modified and new handouts that were not in the weekly packet of papers were distributed to classmates.
Both the VGo and the Double have a camera that can zoom and take snapshots. One administrator reported that the snapshot feature works, but not as well when something has to be handed in right away, like a quiz. He expressed, "so if there was a way to…scan and print out…and hand it right back…that would be really cool." Teachers often collect work from all students at the same time, and this would allow the remote student's assignments to be included along with other classmates' work. It would also ease the burden on the teacher when grading assignments or tests.
RECOMMENDATIONS:
To promote inclusive practices and better assist teachers in distributing handouts and assignments to the remote student in real time, we recommend that the robot's camera be able to snap pictures that are instantly printed out at home. To reduce the cost burden on families for printer ink, images could be instantly converted into black-and-white portable document format (PDF) files. Robot designers should also explore integrated software and hardware options that would allow the remote student to snap pictures/PDFs that instantly print out in the classroom.
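As a minimal sketch of the snapshot-to-PDF conversion step (assuming the Pillow imaging library and hypothetical file names; routing the resulting file to the home printer is not shown), a camera snapshot could be converted to a grayscale PDF before printing:

```python
from PIL import Image  # Pillow imaging library

def snapshot_to_grayscale_pdf(snapshot_path: str, pdf_path: str) -> None:
    """Convert a camera snapshot (e.g., a photographed worksheet) into a
    grayscale PDF so that it prints at home without using color ink."""
    image = Image.open(snapshot_path)
    grayscale = image.convert("L")  # "L" = single-channel grayscale
    grayscale.save(pdf_path, "PDF", resolution=150.0)

if __name__ == "__main__":
    # Hypothetical file names; actually routing the PDF to the home printer
    # would be handled by the robot/UI software and is not shown here.
    snapshot_to_grayscale_pdf("worksheet_snapshot.jpg", "worksheet.pdf")
```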
4.3.2. Maintaining Student Engagement.
Teacher-to-student and student-to-student interactions require that the homebound student be able to focus on the teacher or classmate and to read facial and gestural cues such as pointing at an object or giving a demonstration. The literature supports the positive effect of teacher-student eye contact on learning [18]. Having the face be life-size has been noted as important for adult teachers [19] and adults in distributed work teams [45]. In our study, one teacher reported that she made every attempt to keep the remote student engaged by "looking at his eyes and making sure he saw my eyes." When questioned about this practice, she said it was what she did for all of her 2nd graders; she felt it was important for maintaining the student's interest and engagement.
RECOMMENDATION:
Remote users of telepresence robots in other settings have been advised to show their face full-size along with some shoulder [19,45]. Because the robot in school is an embodiment of the child, we similarly recommend that the "face" be near life-size and that the screen display some shoulder so classmates can better read the remote student's body language. We also recommend that robot designers place the camera close to the student's eyes on the face screen to emulate eye contact.
4.3.3. Carrying The Robot.
While the remote students were largely in control of their connectivity, there were times when the robot lost connectivity and had to be carried. Younger children may not fully grasp the complexities of virtual inclusion and may prefer to have their remote classmate stay with them during emergency situations (whether real or part of a drill). A teacher reported that when a fire drill took place while the robot was in class, her 2nd grade students said, "We can't leave Nathan behind!" and Nathan said, "What about me?" Nathan was familiar with fire drills as a traditional student, but his role in the fire drill via the robot was unclear to him and to his classmates. The teacher said that she felt compelled to take the robot outside with them because she "could just see the school burning down and there's Nathan-- his little face on the robot, burning away…wondering 'what's going on?'" Sadly, the teacher had to carry the robot manually, as the connectivity failed when she took it outside the classroom.
RECOMMENDATION:
To better assist teachers with the responsibility of moving the physical robot when it loses connectivity, we recommend that robot designers provide a manual setting to disconnect the wheels from the motor (i.e., achieve a "neutral" position) so the robot can be easily rolled by hand instead of needing to be carried. This would allow classmates and teachers to "help" the remote student remain with the group during emergency situations and other school activities where the robot may become disconnected for a short period of time.
4.4. Classmate Tasks: Acting As Helpers.
A frustration with the robot centered on the need for helpers and the implicit social debt that the helping act incurred. Twelve students and their teachers reported a need for helpers for at least one of the following: opening doors, accessing elevators, filling out papers, carrying the robot when it could not move, and guiding the robot. During a focus group interview of 2nd graders, one student reported, "Sometimes he doesn't have enough charge and we have to carry him around to the classroom and he's kinda heavy." When asked if there were something they wish the robot could do, one student replied, "two things—I wish we wouldn't have to carry him back and forth when he had connection issues and I wish he didn't have connection issues." Even though these 2nd graders complained about having to carry their friend, they still wanted their homebound peer with them at all times, so much so that their teacher started using a round trash-can dolly to transport the robot. She would place the robot on the dolly, and the class would push it through outdoor hallways and other locations on the school campus when connectivity failed.
RECOMMENDATION:
If the robot does not have a “neutral” feature for the wheels on the base, robot designers should provide external wheeled bases for secure transportation of robots when connectivity fails. This would reduce the burden on classmates and teachers, and remove the need for users to adapt ill-fitting dollies to transport the robots.
4.5. Homebound Student Tasks
Many of the things that are important in the design of the robot and the accompanying best practices focus on the student's experience of being in the classroom, as described above. However, some behaviors, especially control of the user interface, are best described from the point of view of the homebound student. We focus here on the remote student's experience with the home device, the user interface, and the ability to see and hear what is happening.
4.5.1. Controlling The Robot From Home.
Home device.
Once students are ready to attend school, they log in to use the robot via their home device. Thirteen students in this study were given a district-issued laptop, four used a home computer, and two used a family iPad. All reported having adequate Wi-Fi connectivity in their home. As mentioned before, students used a combination of arrow keys, mice, and track pads (iPads) to control the robot. These devices all had a camera at the top center of the screen through which the students in the classroom could see the homebound student. These devices also had standard microphones to capture the homebound student’s voice.
Controlling the robot.
In this study, six students used the arrow keys, two used trackpads (iPads), and one used a joystick as their exclusive way to drive the robot. Two students reported using a combination of the arrow keys and a mouse. Eight cases did not share how they controlled their robots. One homebound student complained of having to keep pressing the arrow key to continue down a long hall, and another complained that his finger hurt after driving the robot all day. He then mused, "I wish we could hook up a joystick to the computer and then I could just use that to move the robot." When we suggested using the mouse, he replied, "That one's harder to do."
Walking and talking.
Driving the robot imposes a certain amount of cognitive load, and one student commented that he would put the robot on mute when he was driving because he could not speak with others at the same time: "I'll have to unmute it, say 'hi' and mute it and then keep on driving. But I can't drive while I unmute it…" When questioned about this, he explained that it took too much concentration to "walk and talk" at the same time. A number of solutions for this have been proposed in the literature: semi-automatic navigation and the ability to follow an accompanying person [12,26,46,49].
RECOMMENDATIONS:
To reduce fatigue from using arrow keys for long periods of time, we recommend that robot designers move beyond traditional laptop and home devices for controlling the robot. Designers should allow robot mobility (i.e., walking) to be controlled via more ergonomic options such as joysticks or Xbox, PlayStation, or other game controllers, echoing the findings of Tanaka et al. [46]. Arrow keys may be reserved for more detailed movements such as moving the head (camera and screen jointly), when this feature is available. We recommend that the user interface include a map for semi-autonomous navigation to desired locations to facilitate the ability to "walk and talk" and create a more immersive experience. To further reduce cognitive load on the remote student, robot designers should also explore adding the ability for the robot camera to "follow" an accompanying person (e.g., the teacher, a classmate) designated via the user interface.
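As one way a game controller could drive the robot (a sketch under our own assumptions, not an existing feature of the VGo or Double software), the pygame library can poll a controller's stick axes and map them to forward and turning commands. The send_drive_command function, the axis assignments, and the dead zone are hypothetical and would differ across robots and controllers.

```python
import pygame

def send_drive_command(forward: float, turn: float) -> None:
    # Placeholder: a real implementation would forward these values to the
    # robot's own drive interface over the network.
    print(f"drive forward={forward:+.2f} turn={turn:+.2f}")

def main() -> None:
    pygame.init()                 # initializes the event system pygame needs
    pygame.joystick.init()
    if pygame.joystick.get_count() == 0:
        raise SystemExit("No game controller detected.")
    stick = pygame.joystick.Joystick(0)

    clock = pygame.time.Clock()
    while True:
        pygame.event.pump()                    # refresh controller state
        forward = -stick.get_axis(1)           # assumed: up on left stick = forward
        turn = stick.get_axis(0)               # assumed: left/right on left stick
        # Small dead zone so the robot does not creep when the stick is at rest.
        forward = 0.0 if abs(forward) < 0.1 else forward
        turn = 0.0 if abs(turn) < 0.1 else turn
        send_drive_command(forward, turn)
        clock.tick(20)                         # ~20 updates per second

if __name__ == "__main__":
    main()
```

Holding a stick forward replaces holding down an arrow key, which addresses the finger fatigue students described above.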
4.5.2. Speaking To People At School.
The microphone through which the homebound student is heard is located on a laptop or tablet and is designed to pick up sounds from a wide physical range. Sibling tantrums, ice makers, television, conversations, and pets were all given as reasons for muting the home microphone, and ten students reported at least one instance of classmates hearing one of these sounds. The VGo also offers text-to-voice transmission: the student can mute the home microphone, type what they want to say, and the robot will speak for them. A student who was having feedback and lag issues when he spoke used this feature. Samuel's mother commented that, "depending on where it's at, they can't hear if he has an extra feedback, they can't hear him 'cause it's like 'err err' noise…so you can type into the robot…then it will say what you type."
RECOMMENDATIONS:
Homebound students need microphone options tailored for use in an occupied home, since most have family members or care providers present to assist with medical or other needs. For the user interface and home device, we recommend a high-quality microphone that is able to recognize and prioritize the voice of the remote student, ideally minimizing or suppressing surrounding sounds that are not the student's voice. We also recommend that the text-to-voice feature offer voice options that closely match what the homebound student sounds like in person.
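Real voice prioritization would likely require beamforming or speaker identification, but as a rough, minimal stand-in for the underlying idea (our own assumption, not an existing feature of either robot's software), a simple energy gate can keep quiet household sounds from being transmitted. The frame size, threshold, and gate_background_noise helper below are illustrative.

```python
import numpy as np

FRAME_SIZE = 1024       # samples per frame (assumed)
GATE_THRESHOLD = 0.02   # RMS level below which a frame is muted (assumed)

def gate_background_noise(audio: np.ndarray) -> np.ndarray:
    """Crude noise gate: zero out frames whose RMS energy falls below a
    threshold, so quiet household sounds are not sent to the classroom.
    A real 'prioritize the student's voice' feature would need something
    stronger, such as beamforming or speaker identification."""
    gated = audio.copy()
    for start in range(0, len(audio), FRAME_SIZE):
        frame = audio[start:start + FRAME_SIZE]
        rms = np.sqrt(np.mean(frame ** 2))
        if rms < GATE_THRESHOLD:
            gated[start:start + FRAME_SIZE] = 0.0
        # Frames at or above the threshold pass through unchanged.
    return gated

if __name__ == "__main__":
    # Toy signal: quiet background noise followed by a louder "voice" burst.
    rng = np.random.default_rng(0)
    background = 0.005 * rng.standard_normal(4096)
    voice = 0.2 * np.sin(2 * np.pi * 220 * np.arange(4096) / 16000)
    signal = np.concatenate([background, voice])
    print("Nonzero samples after gating:",
          np.count_nonzero(gate_background_noise(signal)))
```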
5. DISCUSSION
In this paper, we evaluate the design features currently available on two models of telepresence robots being used in schools. Both robots were designed for adults in work settings but are being adapted for use by children in schools. Our goal in writing this paper was to explore the robot design features that matter for accomplishing learning tasks via telepresence robots in schools. The robots facilitated daily attendance, and it is well known that school attendance matters for academic achievement and learning [23,33]. However, learning in schools goes beyond academic attendance and achievement to include valuable social learning. Ryan and Deci's self-determination theory of motivation posits that all humans have three basic needs: autonomy, relatedness, and competence [40]. Self-determination theory is supported by researchers in education who report that students who feel engaged, motivated, and have a sense of belonging are more likely to attend school [4,8,10,27]. When students attend school via telepresence robot, the robot and its features are central to conveying the school environment and the opportunities for having social and academic needs met. As a result, how well the robot operates influences whether homebound children are motivated to attend school and to continue attending. In exploring these features, our paper highlights the design elements that both facilitate and impede school activities via robot and provides recommendations for the design of future robots.
6. CONCLUSION
Our research provides a holistic view of telepresence robot design for child operators. In this space, we provide empirical data evaluating the effectiveness of existing robot design for accomplishing expected learning tasks. Our findings provide new insights on how these design features operate for learning via robot in traditional schools. Our evaluation of existing robot features for child operators in schools, and our recommendations for features beyond what the robots already offer, constitute our contributions to research in this area. The empirical data provided in this paper will aid designers in creating effective telepresence systems for academic settings and support users who want to establish or expand telepresence programs. Some of the recommended robot features found to be important in the school setting may also improve human-computer and human-robot experiences for others. In the past, a number of redesigns of technologies for the less-abled have benefitted the fully-abled as well; examples include curb cuts, ramps instead of stairs, and closed captioning [25].
Our earlier work has found that telepresence robots are successful in allowing the homebound student to attend school [31,32]. Indeed, as of this writing, hundreds of these robots have been purchased for use by homebound students to go to school. However, existing robots are not a perfect fit for this population. We have identified a number of aspects of the robot, its user interface, and auxiliary equipment that could improve the experience. We anticipate that the use of telepresence robots in schools will become more common as the robots become more affordable. At the time of this writing the prices are as follows: Double, $2995 plus the cost of an iPad, no subscription fees; VGo, $5995, no subscription fees; Beam, $2140, no fees for 3 years. The BeamPro, $14,945, no fees for 3 years, is not only expensive but perhaps too heavy for use in elementary schools. Many school districts consider some of these robots to be affordable and are piloting them in their schools. Designing effective telepresence robots for use in traditional schools is critical to the success of programs to virtually include homebound children.
Our future studies will continue to explore the use of telepresence robots in real-world settings that increase access to learning opportunities and facilitate remaining connected to one’s community. Our studies will continue to assess the success of robot design relative to the social contexts of the settings and expected tasks that humans would like to accomplish via these robots. We plan to extend our analytic frameworks to conduct more in-depth design studies of real-world users from different age and capability groups in the real-world settings of their communities.
CCS CONCEPTS.
Human-centered computing~Empirical studies in HCI
Human-centered computing~Empirical studies in interaction design
Human-centered computing~Empirical studies in accessibility
ACKNOWLEDGEMENTS
We thank our participants who shared their experiences and insight. We also thank collaborators Jacquelynne Eccles and Mark Warschauer for their work on the studies. The project described was supported by the National Center for Research Resources and the National Center for Advancing Translational Sciences, National Institutes of Health (NIH), through Grants UL1 TR000153 and TL1 TR001415. The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH. This work was also generously supported by the Children’s Hospital of Orange County Hyundai Cancer Institute, National Science Foundation Grant ACI-1322304, and a Focused Faculty Research Award from Google.
Footnotes
Google Classroom is a free web service that facilitates the sharing of files between teachers and students.
Contributor Information
Veronica Ahumada-Newhart, Institute for Clinical and Translational Science, University of California Irvine, Irvine, California, United States.
Judith S. Olson, Department of Informatics, University of California Irvine, Irvine, California, United States
REFERENCES
- [1]. Bae Ilhan and Han Jeonghye. 2017. Does Height Affect the Strictness of Robot Assisted Teacher? In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, 73–74.
- [2]. Barr Rebecca and Dreeben Robert. 2014. How Schools Work. In Schools and Society: A Sociological Approach to Education (5th ed.), Ballantine Jeanne H. and Spade Joan Z. (eds.). SAGE Publications, Thousand Oaks, CA, 127.
- [3]. Berntsson Leeni, Berg Marie, Brydolf Marianne, and Hellström Anna-Lena. 2007. Adolescents' experiences of well-being when living with a long-term illness or disability. Scand. J. Caring Sci. 21, 4 (2007), 419–425.
- [4]. Brewster Ann B. and Bowen Gary L. 2004. Teacher support and the school engagement of Latino middle and high school students at risk of school failure. Child Adolesc. Soc. Work J. 21, 1 (2004), 47–67.
- [5]. Broekens Joost, Heerink Marcel, and Rosendal Henk. 2009. Assistive social robots in elderly care: A review. Gerontechnology 8, 2 (2009), 94–103.
- [6]. Brown Robbie. 2013. A Swiveling Proxy That Will Even Wear a Tutu. The New York Times. Retrieved from http://www.nytimes.com/2013/06/08/education/for-homebound-students-a-robot-proxy-in-the-classroom.html
- [7]. Burgoon Judee K. and Johnson Michelle L. 1998. The nature and measurement of interpersonal dominance. Commun. Monogr. 65, 4 (1998), 308–335.
- [8]. Catterall James S. 1998. Risk and resilience in student transitions to high school. Am. J. Educ. 106, 2 (1998), 302–333.
- [9]. Cesta Amedeo, Cortellessa Gabriella, Orlandini Andrea, and Tiberio Lorenza. 2012. Evaluating telepresence robots in the field. In International Conference on Agents and Artificial Intelligence, 433–448.
- [10]. Croninger Robert G. and Lee VE. 2001. Social capital and dropping out of high school: benefits to at-risk students of teachers' support and guidance. Teach. Coll. Rec. 103, 4 (2001), 548–581.
- [11]. Dabaghi-Pacheco Omar. 2018. Chemo's no obstacle for this boy and his bot. Canadian Broadcasting Corporation (CBC). Retrieved from https://www.cbc.ca/news/canada/ottawa/robot-school-attendance-virtual-gloucester-verandrye-1.4543850
- [12]. Desai Munjal, Tsui Katherine M., Yanco Holly A., and Uhlik Chris. 2011. Essential features of telepresence robots. In Technologies for Practical Robot Applications (TePRA), 2011 IEEE Conference, 15–20. DOI: 10.1109/TEPRA.2011.5753474
- [13]. Disability Rights California. 2012. Special Education Rights and Responsibilities: Information on the Rights of Students with Significant Health Conditions.
- [14]. Ellis Sarah J., Drew Donna, Wakefield Claire E., Saikal Samra L., Punch Deborah, and Cohn Richard J. 2013. Results of a nurse-led intervention: connecting pediatric cancer patients from the hospital to the school using videoconferencing technologies. J. Pediatr. Oncol. Nurs. 30, 6 (2013), 333–341.
- [15]. Erwin Elizabeth J. and Guintini Margaret. 2000. Inclusion and Classroom Membership in Early Childhood. Int. J. Disabil. Dev. Educ. 47, 3 (2000), 237–257. DOI: 10.1080/713671117
- [16]. Evans-Jones G, Fielder AR, Jones RB, Markham R, and Stewart-Brown S. 1994. Report of a joint working party: Ophthalmological services for children. London, England.
- [17]. Fels Deborah, Waalen Judith, Zhai Shumin, and Weiss Patrice. 2001. Telepresence under exceptional circumstances: Enriching the connection to school for sick children. In Proceedings of INTERACT, 617–624.
- [18]. Fullwood Chris and Doherty-Sneddon Gwyneth. 2006. Effect of gazing at the camera during a video link on recall. Appl. Ergon. 37, 2 (2006), 167–175.
- [19]. Gweon Gahgene, Hong Donghee, Kwon Sunghee, and Han Jeonghye. 2015. The influence of head size in mobile remote presence (MRP) educational robots. In Robot and Human Interactive Communication (RO-MAN), 2015 24th IEEE International Symposium, 173–178.
- [20]. InTouch Technologies Inc. 2003. InTouch Health and Rehabilitation Institute at Santa Barbara Collaborate to Evaluate World's First Mobile Remote Presence Robot for Healthcare. Press Release. Retrieved from http://www.intouchhealth.com/pr1-15-03.html, accessed Sept. 2010.
- [21]. Johnson Steven, Rae Irene, Mutlu Bilge, and Takayama Leila. 2015. Can you see me now?: How field of view affects collaboration in robotic telepresence. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, 2397–2406.
- [22]. Kristoffersson Annica, Coradeschi Silvia, and Loutfi Amy. 2013. A review of mobile robotic telepresence. Adv. Human-Computer Interact. 2013 (2013). DOI: 10.1155/2013/902316
- [23]. Lamdin Douglas J. 1996. Evidence of student attendance as an independent variable in education production functions. J. Educ. Res. 89, 3 (1996), 155–162.
- [24]. Lannegrand-Willems Lyda and Bosma Harke A. 2006. Identity development-in-context: The school as an important context for identity development. Identity 6, 1 (2006), 85–113.
- [25]. Lazar Amanda, Demiris George, and Thompson Hillaire J. 2015. Involving family members in the implementation and evaluation of technologies for dementia: A dyad case study. J. Gerontol. Nurs. (2015).
- [26]. Lee Min Kyung and Takayama Leila. 2011. "Now, I have a body": Uses and social norms for mobile remote presence in the workplace. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 33–42. DOI: 10.1145/1978942.1978950
- [27]. Lee Valerie E. and Burkham David T. 2003. Dropping out of high school: The role of school organization and structure. Am. Educ. Res. J. 40, 2 (2003), 353–393.
- [28]. Liu Leslie S., Inkpen Kori, and Pratt Wanda. 2015. "I'm Not Like My Friends": Understanding How Children with a Chronic Illness Use Technology to Maintain Normalcy. In Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing, 1527–1539.
- [29]. Neustaedter Carman, Venolia Gina, Procyk Jason, and Hawkins Daniel. 2016. To Beam or not to Beam: A study of remote telepresence attendance at an academic conference. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing, 418–431.
- [30]. Ahumada Newhart Veronica. 2014. Virtual inclusion via telepresence robots in the classroom. In CHI'14 Extended Abstracts on Human Factors in Computing Systems, 951–956. DOI: 10.1145/2559206.2579417
- [31]. Ahumada Newhart Veronica and Olson Judith S. 2017. My student is a robot: How schools manage telepresence experiences for students. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 342–347. DOI: 10.1145/3025453.3025809
- [32]. Ahumada Newhart Veronica, Warschauer Mark, and Sender Leonard S. 2016. Virtual inclusion via telepresence robots in the classroom: An exploratory case study. Int. J. Technol. Learn. 23, 4 (2016), 9–25. Retrieved from http://escholarship.org/uc/item/9zm4h7nf
- [33]. Nichols Bonnie. 2003. Demographic characteristics of arts attendance, 2002. Washington, DC.
- [34]. Paepcke Andreas, Soto Bianca, Takayama Leila, Koenig Frank, and Gassend Blaise. 2011. Yelling in the hall: using sidetone to address a problem with mobile remote presence systems. In Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology, 107–116.
- [35]. Park Seong Ju, Han Jeonghye, Kang Bok Hyun, and Shin Kyung Chul. 2011. Teaching assistant robot, ROBOSEM, in English class and practical issues for its diffusion. Adv. Robot. its Soc. Impacts (2011), 8–11.
- [36]. Rae Irene, Mutlu Bilge, and Takayama Leila. 2014. Bodies in motion: mobility, presence, and task awareness in telepresence. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2153–2162.
- [37]. Rae Irene and Neustaedter Carman. 2017. Robotic Telepresence at Scale. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 313–324.
- [38]. Rae Irene, Takayama Leila, and Mutlu Bilge. 2013. The influence of height in robot-mediated communication. In Human-Robot Interaction (HRI), 2013 8th ACM/IEEE International Conference, 1–8.
- [39]. Rae Irene, Venolia Gina, Tang John C., and Molnar David. 2015. A framework for understanding and designing telepresence. In Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing (CSCW '15), 1552–1566. DOI: 10.1145/2675133.2675141
- [40]. Ryan Richard M. and Deci Edward L. 2000. Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. Am. Psychol. 55, 1 (2000), 68–78. DOI: 10.1037/0003-066X.55.1.68
- [41]. Sabelli Alessandra, Kanda Takayuki, and Hagita Norihiro. 2011. A conversational robot in an elderly care center: An ethnographic study. In 6th Intl. Conf. on Human-Robot Interaction. ACM, 37–44.
- [42]. Sexson Sandra B. and Madan-Swain Avi. 1993. School reentry for the child with chronic illness. J. Learn. Disabil. 26, 2 (1993), 115–125, 137. DOI: 10.1177/002221949302600204
- [43]. Sirkin David, Venolia Gina, Tang John, Robertson George, Kim Taemie, Inkpen Kori, Sedlins Mara, Lee Bongshin, and Sinclair Mike. 2011. Motion and attention in a kinetic videoconferencing proxy. In IFIP Conference on Human-Computer Interaction, 162–180.
- [44]. Spowart Katherine M., Simmers Anita, and Tappin David M. 1998. Vision testing in schools: an evaluation of personnel, tests, and premises. J. Med. Screen. 5, 3 (1998), 131–132.
- [45]. Takayama Leila and Go Janet. 2012. Mixing metaphors in mobile remote presence. In Proceedings of the ACM 2012 Conference on Computer Supported Cooperative Work, 495–504.
- [46]. Tanaka Fumihide, Takahashi Toshimitsu, Matsuzoe Shizuko, Tazawa Nao, and Morita Masahiko. 2013. Child-operated telepresence robot: a field trial connecting classrooms between Australia and Japan. In Intelligent Robots and Systems (IROS), 2013 IEEE/RSJ International Conference, 5896–5901.
- [47]. Tsai Tzung-Cheng, Hsu Yeh-Liang, Ma An-I, King Trevor, and Wu Chang-Huei. 2007. Developing a telepresence robot for interpersonal communication with the elderly in a home environment. Telemed. e-Health 13, 4 (2007), 407–424.
- [48]. Tsui Katherine M., Desai Munjal, Yanco Holly A., and Uhlik Chris. 2011. Exploring use cases for telepresence robots. In Proceedings of the 6th International Conference on Human-Robot Interaction, 11–18. DOI: 10.1145/1957656.1957664
- [49]. Tsui Katherine M., Norton Adam, Brooks David, Yanco Holly A., and Kontak Daniel. 2011. Designing telepresence robot systems for use by people with special needs. In Int. Symposium on Quality of Life Technologies: Intelligent Systems for Better Living.
- [50]. Tsui Katherine M. and Yanco Holly A. 2007. Assistive, rehabilitation, and surgical robots from the perspective of medical and healthcare professionals. In AAAI Workshop on Human Implications of Human-Robot Interaction.
- [51]. Venolia Gina, Tang John, Cervantes Ruy, Bly Sara, Robertson George, Lee Bongshin, and Inkpen Kori. 2010. Embodied social proxy: mediating interpersonal connection in hub-and-satellite teams. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1049–1058.
- [52]. Yeung Jason and Fels Deborah I. 2005. A remote telepresence system for high school classrooms. In Canadian Conference on Electrical and Computer Engineering, 1465–1468.