Children. 2022 Jun 25;9(7):953. doi: 10.3390/children9070953

Social Humanoid Robots for Children with Autism Spectrum Disorders: A Review of Modalities, Indications, and Pitfalls

Alfio Puglisi 1, Tindara Caprì 1,2, Loris Pignolo 3, Stefania Gismondo 1, Paola Chilà 1, Roberta Minutoli 1, Flavia Marino 1, Chiara Failla 1, Antonino Andrea Arnao 1, Gennaro Tartarisco 1, Antonio Cerasa 1,3,4,*, Giovanni Pioggia 1
Editors: Antonio Narzisi, Francisco Alcantud-Marín, Yurena Alonso-Esteban
PMCID: PMC9316169  PMID: 35883937

Abstract

Robot-assisted therapy (RAT) is a promising area of translational neuroscience for children with autism spectrum disorders (ASDs). It has been widely demonstrated that this kind of advanced technological tool provides a reliable and efficient intervention for promoting social skills and communication in children with ASD. This type of treatment consists of a human-assisted social robot acting as an intervention mediator to increase competence and skills in children with ASD. Several social robots have been validated in the literature; however, an explicit technical comparison among devices has never been performed. For this reason, in this article, we provide an overview of the main commercial humanoid robots employed for ASD children with an emphasis on indications for use, pitfalls to be avoided, and recent advances. We conclude that, in the near future, a new generation of devices with high levels of mobility, availability, safety, and acceptability should be designed for improving the complex triadic interaction among teachers, children, and robots.

Keywords: autism, robot-assisted therapy, humanoid

1. Introduction

Behavioral treatments are the major tool for reducing comorbidity and disability in children with autism spectrum disorder (ASD) [1]. Generally, these treatments focus on maximizing social communication abilities and skills [2]. Various behavioral approaches have been validated for ASD patients and can be classified into: (i) comprehensive, intensive interventions based on applied behavior analysis; (ii) targeted skill-based interventions (joint attention training, social skills teaching, and social skills training); and (iii) targeted behavioral interventions for anxiety and aggression (cognitive behavioral therapy) [3,4]. These treatments promote the development of several emotional and cognitive skills in children with ASD.

However, individuals with ASD show notable heterogeneity at genetic, behavioral, and neurophysiological levels, which could interfere with the efficacy of these interventions [5]. Moreover, it is well-known that ASD individuals engage more successfully in social interactions if social information is presented in an “attractive” manner [6].

The last two decades have seen the emergence of technology-based therapies, such as robot-assisted therapy (RAT), for improving the treatment of individuals with ASD [7]. Robot-mediated intervention studies have shown positive outcomes in improving (a) joint attention, (b) social communication, (c) imitation, and (d) social behaviors [8]. Interacting with a robot as an emulated peer is naturally more attractive: the limited eye contact that children with ASD typically make with a therapist can be successfully stimulated by a social robot [9].

The RAT approach has two fundamental advantages: (a) the opportunity to record objective data during therapy and (b) the ability of the robot to adaptively “learn” both interindividual differences at a given time point and intraindividual differences over time, thus partially overcoming the limitations due to clinical heterogeneity. The former is important for characterizing behavioral improvement, providing quantitative data about the developmental process [9].

In the last few years, several successful interventions have been developed using the RAT approach [8,9,10,11,12], although a rigorous comparison among technical devices has never been performed. In this review, we provide an explicit description of the strengths and limitations of the devices employed in clinical trials and generally considered by teachers and therapists the best tools for their practice: humanoid robots.

2. Social Humanoid Commercial Robots

In this qualitative analysis, we consider only humanoid social robots employed in social skills training for children with ASD that are commercially available and have already been validated in clinical trials. Considering these criteria, we reviewed the characteristics of: (1) NAO (Aldebaran SoftBank Robotics, Tokyo, Japan); (2) QTrobot (LuxAI S.A., Luxembourg); (3) KASPAR (Adaptive Systems Research Group, University of Hertfordshire, Hatfield, UK); (4) FACE (Enrico Piaggio Center for Robotics and Bioengineering, University of Pisa, Pisa, Italy); and (5) ZENO (Hanson Robotics, Hong Kong, China).

2.1. NAO Robot

NAO (dimensions: 574 × 311 × 275 mm; weight: 5.48 kg) has a plastic body with 25 DoF (four joints for each arm, two for each hand, five for each leg, two for the head, and one to control the hips). The internal processor is an Intel Atom E3845 running Linux as its operating system (compatible with Windows and macOS). NAO can also speak and supports a degree of non-verbal communication, capturing a large amount of information about the environment through its sensors and microphones. In detail, the NAO robot is equipped with:

  • Sonar sensors to estimate the distance to objects and people.

  • Tactile sensors on the hands and head.

  • Two cameras (OV5640, 2592 × 1944) and microphones for voice and facial recognition.

  • Speakers through which the robot plays sounds and speech.

  • Stepper motors that drive the robot’s body movements.

  • Stepper motors (see Figure 1) in the hands that allow prehensile movements very similar to those of a human hand.

  • An Ethernet and wireless network card.

Figure 1. Localization of the stepper motors inside the NAO robot.

The user-friendly software embedded in the robot works on Mac, Windows, and Linux platforms, although it is not supported by the latest versions of macOS. In any case, programming through the proprietary Choregraphe software is rather limited, but C++ and Python APIs are available, allowing the robot to be integrated into mobile or desktop applications.
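As a minimal illustration of the Python API route, the sketch below makes the robot speak, change its eye-LED color, and move to a predefined posture. It assumes the vendor's Python SDK (historically Python 2.7) is installed and the robot is reachable at a hypothetical IP address; the chosen posture and color are arbitrary examples.

```python
# Minimal sketch of driving NAO through the NAOqi Python API (Python 2.7 SDK).
# The IP address below is a hypothetical placeholder for a robot on the local network.
from naoqi import ALProxy

ROBOT_IP, PORT = "192.168.1.10", 9559   # 9559 is the default NAOqi port

tts = ALProxy("ALTextToSpeech", ROBOT_IP, PORT)
posture = ALProxy("ALRobotPosture", ROBOT_IP, PORT)
leds = ALProxy("ALLeds", ROBOT_IP, PORT)

posture.goToPosture("Crouch", 0.5)      # move to a stable predefined posture at half speed
leds.fadeRGB("FaceLeds", 0x00FF00, 1.0) # fade the eye LEDs to green over one second
tts.say("Hello, let's play together")   # spoken prompt for the child
```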

2.1.1. NAO: Clinical Validation

NAO is the best-known and most widely employed device for promoting emotional and cognitive rehabilitation in children with ASD [11]. Several studies have demonstrated its effectiveness as a mediator of behavioral interventions. For instance, Marino et al. [12] conducted the first randomized controlled trial using NAO in a socio-emotional understanding protocol for children with ASD. Fourteen children were randomly assigned to 10 sessions of a cognitive behavioral therapy intervention applied in a group setting, either with or without the assistance of NAO. The results demonstrated that the children treated with RAT significantly improved their socio-emotional skills with respect to the control group. Van den Berk-Smeekens et al. [13] conducted a randomized controlled trial using Pivotal Response Treatment (PRT) with and without NAO robot assistance for improving the social skills of children with ASD. Seventy-three children were randomly assigned to three groups (PRT: n = 25; PRT + robot: n = 25; standard intervention: n = 23). The results indicated that the PRT + robot group showed a larger improvement in social communication than the other two groups.

2.1.2. NAO: Advantages vs. Disadvantages

The main strengths of this device are: (a) autonomy, (b) motion, and (c) clinical validation [14] (Figure 2). With reference to autonomy, NAO can be used in three operating modes: full autonomy, semi-autonomy, and Wizard of Oz [15]. In the first mode, the robot autonomously detects the child’s behavior or eye gaze through tracking devices. In the second mode, the robot’s actions are triggered either autonomously or by a therapist or researcher. In the Wizard-of-Oz mode, the researcher or therapist remotely controls the robot’s behavior without the child noticing it. This mode is the most widely used, with NAO as well as with other robots. With reference to motion, NAO can support a large variety of human–robot interactions, increasing the types of actions that the robot and child can perform together [14]. Moreover, NAO can walk and has a high number of DoF. For this reason, NAO seems more human-like than robots that can move their arms only up and down in a single plane of motion [14]. In regard to clinical validation, as explained in the section above, NAO has been used in several clinical studies and validated for behavioral rehabilitation in children with ASD.
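To make the Wizard-of-Oz mode concrete, the sketch below shows a minimal operator console in which the therapist triggers a few predefined robot actions from a keyboard, out of the child's sight. It again assumes the NAOqi Python SDK and a hypothetical robot address; the key-to-action mapping is purely illustrative.

```python
# Minimal Wizard-of-Oz sketch: the operator presses a key, the robot performs a
# predefined action. Assumes the Python 2.7 NAOqi SDK; the robot IP is a placeholder.
from naoqi import ALProxy

ROBOT_IP, PORT = "192.168.1.10", 9559

tts = ALProxy("ALTextToSpeech", ROBOT_IP, PORT)
posture = ALProxy("ALRobotPosture", ROBOT_IP, PORT)

# Predefined actions the therapist can trigger remotely (illustrative set)
ACTIONS = {
    "1": lambda: tts.say("Well done!"),
    "2": lambda: tts.say("Look at me"),
    "3": lambda: posture.goToPosture("Stand", 0.5),
    "4": lambda: posture.goToPosture("Sit", 0.5),
}

while True:
    key = raw_input("Action [1-4, q to quit]: ").strip()  # Python 2 console input
    if key == "q":
        break
    action = ACTIONS.get(key)
    if action is not None:
        action()
```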

Figure 2. NAO robot.

On the other hand, NAO also has some limitations, related to its physical appearance and technological features. NAO’s eyes contain colored LEDs that help children focus their attention on the particular social cues relevant to the skill being trained. However, this could lead to overstimulation, and it is well-known that sensory overstimulation is a serious problem for many children with ASD [16]. Moreover, NAO cannot express facial emotions, which may not be helpful for children with difficulties in recognizing human facial emotional expressions [1,2,17]. Indeed, a robot cannot appear both extremely human-like and socially simple [14]. Thus, an alternative option for designers is to create evocative but visually simple robots by adding a screen to the robot’s head to display simple emotional facial expressions (see Section 2.2). Finally, although NAO comes with safety guidelines, it is not possible to anticipate and predict all potential situations that could occur when children and robots interact. Indeed, NAO could injure children’s hands and fingers, given its strong prehensile grip [18,19,20,21].

NAO was built to improve behavioral interventions for children with ASD; however, the robot has also been applied in other clinical domains, such as attention deficit hyperactivity disorder (ADHD), language disorders, and Down syndrome [19,20,21]. Taken together, these studies suggest that NAO has the potential to be translated to the treatment and education of children with different disabilities.

2.2. QTrobot

QTrobot is an expressive little humanoid robot (dimensions: 574 × 311 × 275 mm; weight: 5 kg), designed and built to assist therapists in teaching new skills (cognitive, social, communication, and emotional) to children with autism or special educational needs (Figure 3). The robot is characterized by high mobility of the neck and hands (12 DoF). It is equipped with: (a) a face display that can show animations, thus emulating basic emotional expressions; (b) a 3D Intel RealSense camera that enables vision and gesture recognition in space, as well as excellent resolution for facial recognition; and (c) microphones to recognize where sound is coming from and speakers that allow the robot to produce verbal communication or play sounds.

Figure 3. General characteristics of QTrobot.

An internal Raspberry Pi board (QTPI) controls the motors, display, and sensors and is connected to a Linux PC (QTPC), which uses ROS to send commands to the Raspberry Pi board. The QTPC and the Raspberry Pi board that make up the robot are connected to each other via an internal LAN, allowing easy configuration and programming, and information can also be exchanged with the manufacturer over the web. This architecture offers the opportunity to bring robot-assisted therapy into the Internet of Things (IoT) data domain.

Programming can be performed using the web app interface provided by the manufacturer, which offers an intuitive block-based editor. This allows routines to be created and executed on the robot from the Android tablets that come with the robot.

For customizing specific behaviors, QTrobot allows the use of the RealSense software, which comes installed on the robot and allows it to recognize gestures or faces through the assignment of key-point data in space. Specific commands can be written in Python or C++ that invoke the APIs already installed on the robot’s QTPC. The manufacturer’s site (LuxAI) provides extensive tutorials on the hardware and software characteristics.
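Because the robot's behaviors are exposed through ROS on the QTPC, a therapist-side script can command it by publishing standard ROS messages. The sketch below is a minimal example of this pattern; the topic names and the emotion label are illustrative placeholders in the style of the vendor's interface and should be checked against the LuxAI documentation.

```python
# Minimal sketch of commanding QTrobot over ROS 1 (rospy). The topic names and the
# emotion label are illustrative placeholders; consult the LuxAI documentation for
# the exact interface exposed on the QTPC.
import rospy
from std_msgs.msg import String

rospy.init_node("qtrobot_demo")

speech_pub = rospy.Publisher("/qt_robot/speech/say", String, queue_size=1)
emotion_pub = rospy.Publisher("/qt_robot/emotion/show", String, queue_size=1)

rospy.sleep(1.0)  # give the publishers time to register with the ROS master

emotion_pub.publish(String(data="QT/happy"))                      # show a facial animation
speech_pub.publish(String(data="Hello, can you smile like me?"))  # spoken prompt
```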

In detail, the QTrobot robot is equipped with (Figure 3):

  • An 8th Gen quad-core Intel® Core™ i5/i7 processor up to 4 × 4.5 GHz, up to 32 GB DDR4 RAM, and up to 512 GB M.2 SSD.

  • A camera (RealSense™ depth camera D435; field of view ≈ 87° × 58° × 95°) and microphones (four digital microphones supporting far-field voice capture; microphones: ST MP34DT01TR-M; sensitivity: −26 dBFS) for voice and facial recognition.

  • Speakers through which the robot plays sounds and speech (audio amplifier: stereo 2.8 W Class D; speaker frequency range: 800–7000 Hz).

  • A facial display (8-inch TFT LCD, 800 × 480).

  • An Ethernet and wireless network card.

2.2.1. QTrobot: Clinical Validation

QTrobot is a recently developed social robot. To date, only two studies have demonstrated its effectiveness as a mediator of behavioral interventions for children with ASD. Costa et al. [22] examined the use of QTrobot in a long emotional-ability training for ASD, providing limited evidence of positive effects of the robot-mediated intervention. In another study, Costa et al. [23] evaluated the usefulness of QTrobot by assessing children’s attention, imitation, and presence of repetitive and stereotyped behaviors, obtaining significant positive results on all of the considered parameters.

2.2.2. QTrobot: Advantages vs. Disadvantages

The most significant advantages of this device are its physical appearance and some of its technological features. With reference to physical appearance, QTrobot has more human-like features, with different levels of motion that allow for an easier identification of social actions and expressions, facilitating the transfer of skills learned in the human–robot context to human–human interaction [4,24,25,26]. QTrobot is built precisely to a child’s physical dimensions, and it moves its arms with multiple DoF. Its display allows the presentation of animated faces and emotional facial expressions combined with arm movements and voice. Concerning technological features, the architecture of QTrobot allows simple programming using the internal software and is easy to customize with different behaviors (e.g., via RealSense) useful for robot-assisted applications in the ASD domain [15]. Furthermore, QTrobot has been developed to be employed both at home and in therapy settings.

The most significant disadvantages are its limited sensory features and the fact that it can be used effectively only with digital tablets (Figure 4). Generally, robots employed in RAT should be able to detect the child’s position in order to orient the child in performing specific actions and responses [27]. QTrobot is only equipped with RealSense, which does not allow this kind of interactive spatial evaluation. Moreover, the child–robot interaction is mediated by a digital tablet, which could overstimulate the child. Another pitfall is the scarcity of applications in clinical trials: to date, only two studies have evaluated the effectiveness of QTrobot in reducing repetitive and stereotyped behaviors and in increasing joint attention and emotional skills in children with ASD [22,23].

Figure 4. QTrobot.

2.3. KASPAR

KASPAR is a humanoid social robot (dimensions: 55 × 50 × 45 cm; weight: 15 kg; six DoF in the neck and head, six in the arms, and two in the eyes). Its face is a silicone-rubber mask that can show a range of simplified expressions. The robot can respond to children’s touch and can move its head, arms, and eyes. It is equipped with tactile sensors (Figure 5), which allow it to react in ways predefined by its software.

Figure 5. KASPAR robot (a) and sensor localization (b).

The robot is programmed through a simple programming interface, which is, however, quite limiting, as it does not allow interaction with other devices and platforms to be developed.

In detail, the KASPAR robot is equipped with:

  • Sensors: cameras in the eyes; force-sensing resistor or capacitive touch sensors.

  • Actuators: Dynamixel AX-12A robot servos and RC servos.

  • Power: one 12 V, 7 Ah lead-acid battery; 4 hours of operation.

  • Computing: controlled by an external PC via USB, or wirelessly using an on-board mini PC.

  • Software: custom Java software; YARP, C++, and Python interfaces optional (see the sketch after this list).

  • Degrees of freedom (DoF): 17 (arms: 4 DoF × 2; neck: 3 DoF; mouth: 2 DoF; eyes: 2 DoF; eyelids: 1 DoF; torso: 1 DoF).

  • Materials: fiberglass body; aluminum frame and head parts; silicone-rubber face.
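Since KASPAR's custom control software optionally exposes YARP, C++, and Python interfaces, a hedged sketch of what a Python-side YARP client could look like is given below. The YARP calls are standard, but the port names and the command vocabulary are hypothetical, since the robot-side software is custom and not publicly specified.

```python
# Minimal sketch of sending a command over YARP from Python. The YARP API calls are
# standard; the port names and the "smile" command are hypothetical placeholders,
# since KASPAR's control software is custom.
import yarp

yarp.Network.init()

port = yarp.BufferedPortBottle()
port.open("/kaspar_demo/command:o")  # local output port (hypothetical name)
yarp.Network.connect("/kaspar_demo/command:o", "/kaspar/behavior:i")  # hypothetical robot-side port

bottle = port.prepare()
bottle.clear()
bottle.addString("smile")  # hypothetical behavior label handled by the robot-side software
port.write()

port.close()
yarp.Network.fini()
```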

2.3.1. KASPAR: Clinical Validation

The KASPAR robot has been employed in several clinical trials to demonstrate its effectiveness as a mediator of behavioral interventions for children with ASD. Marinoiu, Zanfir, Olaru, and Sminchisescu [28] used KASPAR to involve 13 children with ASD in different games designed to help them see the world from the robot’s perspective (i.e., theory of mind). The results indicated that robot-assisted therapy using KASPAR can be an effective intervention to improve theory of mind and visual perspective-taking in autism. More recently, other studies have demonstrated that robot-mediated interventions using KASPAR improved communication, psychomotor functions, social skills, and imitation in children with ASD [29]. Reviews on the effectiveness of KASPAR have highlighted the potential of this robot in interventions for children with ASD [30].

2.3.2. KASPAR: Advantages vs. Disadvantages

The most significant advantages of this device are: (a) the reduced complexity of its human-like facial emotional expressions; (b) its tactile sensors; and (c) the ease with which it can be customized to autism-specific needs (Figure 6). KASPAR has a realistic face with a less complex actuation system [29]: it can open and close its mouth, smile and frown, move its eyes up/down and left/right, and open and close its eyelids. This reduces the complexity of the social stimulus; consequently, KASPAR can be more predictable, less distracting, more trustworthy, and less ambiguous than a human would be [30]. Unlike other robots, KASPAR is equipped with tactile sensors; this means that children can observe the effect of pressing buttons on KASPAR’s motion, so they can benefit from a turn-taking interaction, given that children with ASD usually tend not to engage in such behavior [16].

Figure 6. KASPAR.

The main disadvantage of this device is its limited behavioral repertoire. KASPAR cannot walk, grasp, or fetch objects, or make fine gestures with its hands or fingers. Mobility is an important factor during human–robot interaction, because good movement capability increases the types of actions that the robot and child can engage in together [14]. Additionally, KASPAR is used in a semi-autonomous way, meaning that only a few predefined actions can be programmed on the remote control [29,30,31]. This limited autonomy constrains its application in rehabilitation settings, as well as the development of scenarios for child–robot interaction.

2.4. FACE (Facial Automaton for Conveying Emotions)

FACE is a passive body with an active head. Thirty servomotors simulate and modulate six basic emotions (anger, happiness, surprise, sadness, disgust, and fear). FACE cannot speak, but through its microphones and cameras, it can analyze the emotional reactions of individuals, react to them, and store all data.

The robot is programmed through Scratch-style block programming, which is very simple, even for beginners, but quite limiting, as it does not allow interaction with other devices and platforms to be developed.

In detail, the FACE robot is equipped with:

  • Sensors: external cameras and microphones positioned next to the android (used for teleoperation).

  • Actuators: pneumatic actuators in the face (eyes, forehead, eyebrows, eyelids, and cheeks) and body (neck and shoulder).

  • Power: standard 110 V/220 V power supply.

  • Computing: custom server and control infrastructure.

  • Software: Windows OS and a Java-based application.

  • Degrees of freedom (DoF): 12.

  • Materials: metal skeleton; silicone skin for the hands and face; wig made of human and artificial hair.

2.4.1. FACE: Clinical Validation

The FACE robot has been employed in some clinical trials to demonstrate its effectiveness as a mediator of behavioral interventions for children with ASD. One study [32] demonstrated that this device helped improve imitative skills and shared attention, although only a small group of children with ASD was enrolled. Another study [33] confirmed this preliminary evidence, showing that all participants improved their imitation abilities and social communication skills after RAT with FACE. Based on these preliminary data, researchers have suggested that treatment with FACE can develop pragmatic emotional responsiveness in children with ASD.

2.4.2. FACE: Advantages vs. Disadvantages

The most significant advantage of this device is its ability to express realistic emotions (Figure 7). Indeed, the FACE robot has been developed, based on biological principles, to be a realistic facial display system. It has servomotors to control facial movements and a biomimetic proprioceptive system; the motors allow it to express six basic emotions based on feedback from the sensing layer [32,33].

Figure 7. FACE robot.

On the other hand, FACE is also characterized by limitations, such as its lack of motion and mobility. It is unable to express complex emotions by combining facial emotional expressions with gestures. Moreover, the lack of mobility and motion reduces the variety of possible human–robot interactions [14]. Finally, another major disadvantage of this device could be the uncanny valley effect [34]. As described by Masahiro Mori [34], this effect concerns the relationship between the human-like appearance of a robot and the emotional response it evokes in people. Mori observed that people find robots more appealing the more human they appear, and this feeling induces positive emotions and reactions. However, this sense of familiarity only works up to a certain point: when the appearance of a humanoid robot moves from “somewhat human” to “fully human”, it provokes uncanny or strangely familiar feelings of revulsion in observers. For this reason, the FACE robot could fall into the uncanny valley for children with ASD.

2.5. ZENO

ZENO is a humanoid, child-size robot with a simple expressive face (maximum height: 0.635 m; weight: 6.5 kg). The robot has 8 DoF in the face, where motors are used to simulate facial expressions, 3 DoF in the neck, and 25 DoF in the body. The body is equipped with servomotors for the legs, hips, shoulders, and waist. The robot is programmed through a simple programming interface, which is, however, quite limiting, as it does not allow interaction with other devices and platforms to be developed.

In detail, the ZENO robot is equipped with:

  • Sensors: two 720p, 30 fps HD cameras (one in each motorized eye); a three-axis gyroscope, three-axis accelerometer, and compass; 21 joint load sensors; 30 joint position sensors; two cliff sensors; two ground contact sensors; two infrared obstacle-detection sensors; two bump sensors (feet); grip-load sensors in the hands; three microphones.

  • Actuators: three Cirrus CS-101 STD 4-gram micro servos; five Hitec HS-65MG motors (Frubber actuators); Dynamixel RX-64 servos (legs, hips, shoulders); Dynamixel RX-28 servos (waist).

  • Power: two 18.5 V lithium-ion batteries; 1 hour of operation.

  • Computing: 1 GHz Vortex86DX CPU, 1 GB RAM, Wi-Fi, Ethernet.

  • Software: Linux (Ubuntu).

  • Degrees of freedom (DoF): 36 (arms: 12 DoF; legs: 12 DoF; waist: 1 DoF; neck: 3 DoF; face: 8 DoF).

  • Materials: Frubber, plastic, and aluminum.

2.5.1. ZENO: Clinical Validation

The ZENO robot has been employed in some clinical trials to demonstrate its effectiveness as a mediator of behavioral interventions for children with ASD. In one study [35], researchers sought to stimulate facial emotion recognition skills in children with ASD compared to typically developing (TD) children. The results indicated no significant difference between the groups, although ZENO was able to successfully express six basic emotions. Recently, Lecciso et al. [35] enrolled 12 children with ASD, randomly subdivided into two groups: a robot-based intervention with ZENO and a computer-based intervention, both aimed at improving facial emotion recognition. The results showed no significant differences between the two groups: the robot and computer interventions produced similar improvements. Overall, future studies are necessary to validate the use of ZENO in the treatment of ASD.

2.5.2. ZENO: Advantages vs. Disadvantages

The most significant advantages of this device are facial emotional expression and mobility (Figure 8). ZENO is a child-sized and -shaped robot but with limited expressive abilities (only six basic emotions). However, this capability combined with motion (it can move its arms and legs) gives it a human-like physical appearance. It is known that physical appearance and mobility are two important factors that mediate human–robot interactions [24]. This is essential in the context of ASD, given that one of the major impairments in ASD is emotional understanding and recognition.

Figure 8. ZENO.

The main disadvantage of this device is its low number of DoF, which limits ZENO’s bodily capabilities and drastically reduces the possibility of designing human-like actions.

3. Discussion

The establishment of an adequate social robot is one of the most important clinical targets for increasing the efficiency of RAT approaches for children with ASD. Taken together, the results of the present analysis indicate that the most important factors for human–robot interaction, in the context of treatment for ASD, are physical appearance and mobility. The NAO robot has good mobility, even if it can be dangerous for a child’s fingers, but it is limited in its physical appearance. The QTrobot has a social, expressive, and simple appearance with a display for showing facial emotional expressions, but it is fixed on a stand and cannot walk or roll around its environment. The other robots included in this narrative evaluation show ambiguous physical features and limited mobility.

Overall, it is extremely difficult to design a robot that combines a human-like appearance with socially interactive capabilities and real-time imitation of the child’s movements. Several key challenges must be addressed. Within the scenario of socially assistive robotics for ASD, the main aims of child–robot interaction are to elicit joint attention, encourage imitative behaviors, promote socio-emotional understanding and facial emotion recognition, and support turn-taking between the child with ASD and the robot. Consequently, the challenge is to design a child-size, expressive humanoid robot with good mobility and verbal skills. The robot should be able to walk and move its arms and legs around the environment, while also being safe and socially attractive, with a human-like appearance.

From a technological point of view, the perception system of the robot must be able to detect the child’s position and movements, because the child is free to move around the room. Both the NAO and QT robots have a good perception system, whereas the other robots are limited in this function. Moreover, the robot must be able to express several complex emotions, not only basic ones. These are important factors for promoting a greater variety of potential actions between the child and the robot and for making the therapeutic session more life-like. In this case, only QTrobot can express both simple and complex emotions.
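As a concrete, robot-agnostic illustration of the kind of perception described above, the sketch below uses a standard camera and OpenCV's bundled face detector to estimate where a person is relative to the image center, a value a controller could use to turn the robot's head toward the child. It is a minimal sketch only; the camera index is a placeholder, and none of this is specific to the robots reviewed here.

```python
# Minimal sketch: estimate a person's horizontal position from a camera feed using
# OpenCV's bundled Haar-cascade face detector. Robot-agnostic; the camera index is
# a placeholder, and a real system would add depth data and a tracking filter.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # placeholder index for the robot's head camera

for _ in range(300):  # process roughly ten seconds of video, then stop
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        x, y, w, h = faces[0]
        # Normalized offset of the face from the image center (-0.5 .. 0.5);
        # a head controller could use this value to re-orient toward the child.
        offset = (x + w / 2.0) / frame.shape[1] - 0.5
        print("face offset from center: %.2f" % offset)

cap.release()
```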

From a researcher’s point of view, robot systems must have numerous capabilities, such as: sensing and interpreting the child’s actions; full autonomy within the experimental scenario setting; collecting and processing data over time; evaluating the interaction in terms of the quantity and quality of behaviors; altering behavior based on parameters chosen by the researcher or experimenter; and flexibility in the programming [25]. Again, the NAO and QT robots are equipped with a platform for researchers; however, further developments are needed in order to make these platforms more flexible.

In summary, the key idea is to connect the needs of robot developers, care professionals, researchers, and children in order to increase the efficiency of robot-assisted cognitive therapy for ASD and to design and develop a robot with high levels of utility, availability, safety, and acceptability.

4. Conclusions

To the best of our knowledge, this is the first rigorous comparison among technical devices presenting the indications for use, pitfalls to be avoided, and recent advances of the best-known humanoid robots used as intervention mediators to increase emotional and cognitive competence and skills in children with ASD. There are several reviews of the RAT approach in ASD, but none focus on the technical features of the robots. In accordance with previous studies [15,36], the present analysis suggests that, to design and develop meaningful robot-mediated interventions, the robot must address the needs of children with ASD, care professionals, and developers.

The current state of the art in socially assistive robotics has not yet reached its full potential in terms of physical appearance and technological features, the two key aspects highlighted in this review. The most widely used robots are operated in Wizard-of-Oz mode, which increases the burden on care professionals. Some robots are limited in their mobility, and they are visually and kinetically simple designs.

The challenge for the future is to design a new generation of child-size, expressive humanoid robots to improve the complex triadic interaction among teachers, children, and robots, also considering the adoption of artificial intelligence algorithms that can bring flexibility and learning capabilities to previously rigid applications.

Author Contributions

Conceptualization, A.C. and G.P.; methodology, A.P., T.C. and G.T.; investigation, L.P., S.G. and A.A.A.; resources, G.P.; data curation, R.M., C.F., P.C. and F.M.; writing—original draft preparation, T.C., A.P. and A.C.; writing—review and editing, G.P. and A.C.; supervision, L.P.; project administration, G.P. All authors have read and agreed to the published version of the manuscript.

Institutional Review Board Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Funding Statement

This research received no external funding.

Footnotes

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  • 1. Sharma S.R., Gonda X., Tarazi F.I. Autism Spectrum Disorder: Classification, diagnosis and therapy. Pharmacol. Ther. 2018;190:91–104. doi: 10.1016/j.pharmthera.2018.05.007.
  • 2. Leekam S. Social cognitive impairment and autism: What are we trying to explain? Philos. Trans. R Soc. Lond. B Biol. Sci. 2016;371:20150082. doi: 10.1098/rstb.2015.0082.
  • 3. Berkovits L., Eisenhower A., Blacher J. Emotion regulation in young children with autism spectrum disorders. J. Autism Dev. Disord. 2017;47:68–79. doi: 10.1007/s10803-016-2922-2.
  • 4. Lai M.C., Lombardo M.V., Baron-Cohen S. Autism. Lancet. 2014;383:896–910. doi: 10.1016/S0140-6736(13)61539-1.
  • 5. Maglione M.A., Gans D., Das L., Timbie J., Kasari C., Technical Expert Panel, HRSA Autism Intervention Research—Behavioral (AIR-B) Network. Nonmedical interventions for children with ASD: Recommended guidelines and further research needs. Pediatrics. 2012;130:S169–S178. doi: 10.1542/peds.2012-0900O.
  • 6. Quirmbach L.M., Lincoln A.J., Feinberg-Gizzo M.J., Ingersoll B.R., Andrews S.M. Social stories: Mechanisms of effectiveness in increasing game play skills in children diagnosed with autism spectrum disorder using a pretest posttest repeated measures randomized control group design. J. Autism Dev. Disord. 2009;39:299–321. doi: 10.1007/s10803-008-0628-9.
  • 7. Sartorato F., Przybylowski L., Sarko D.K. Improving therapeutic outcomes in autism spectrum disorders: Enhancing social communication and sensory processing through the use of interactive robots. J. Psychiatr. Res. 2017;90:1–11. doi: 10.1016/j.jpsychires.2017.02.004.
  • 8. Pennisi P., Tonacci A., Tartarisco G., Billeci L., Ruta L., Gangemi S., Pioggia G. Autism and social robotics: A systematic review. Autism Res. 2016;9:165–183. doi: 10.1002/aur.1527.
  • 9. Yun S.S., Choi J., Park S.K., Bong G.Y., Yoo H. Social skills training for children with autism spectrum disorder using a robotic behavioral intervention system. Autism Res. 2017;10:1306–1323. doi: 10.1002/aur.1778.
  • 10. Provoost S., Lau H.M., Ruwaard J., Riper H. Embodied conversational agents in clinical psychology: A scoping review. J. Med. Internet Res. 2017;19:6553. doi: 10.2196/jmir.6553.
  • 11. Saleh M.A., Hanapiah F.A., Hashim H. Robot applications for autism: A comprehensive review. Disabil. Rehabil. Assist. Technol. 2021;16:580–602. doi: 10.1080/17483107.2019.1685016.
  • 12. Marino F., Chilà P., Sfrazzetto S.T., Carrozza C., Crimi I., Failla C., Busà M., Bernava G., Tartarisco G., Vagni D., et al. Outcomes of a Robot-Assisted Social-Emotional Understanding Intervention for Young Children with Autism Spectrum Disorders. J. Autism Dev. Disord. 2020;50:1973–1987. doi: 10.1007/s10803-019-03953-x.
  • 13. van den Berk-Smeekens I., de Korte M.W., van Dongen-Boomsma M., Oosterling I.J., den Boer J.C., Barakova E.I., Lourens T., Glennon J.C., Staal W.G., Buitelaar J.K. Pivotal Response Treatment with and without robot-assistance for children with autism: A randomized controlled trial. Eur. Child. Adolesc. Psychiatry. 2021;9:79. doi: 10.1007/s00787-021-01804-8.
  • 14. Scassellati B., Admoni H., Matarić M. Robots for use in autism research. Annu. Rev. Biomed. Eng. 2012;14:275–294. doi: 10.1146/annurev-bioeng-071811-150036.
  • 15. Huijnen C.A.G.J., Lexis M.A.S., Jansens R., de Witte L.P. Mapping Robots to Therapy and Educational Objectives for Children with Autism Spectrum Disorder. J. Autism Dev. Disord. 2016;46:2100–2114. doi: 10.1007/s10803-016-2740-6.
  • 16. Johnson C.P., Myers S.M., American Academy of Pediatrics Council on Children With Disabilities. Identification and evaluation of children with autism spectrum disorders. Pediatrics. 2007;120:1183–1215. doi: 10.1542/peds.2007-2361.
  • 17. Fabio R.A., Esposito S., Carrozza C., Pino G., Caprì T. Correlations between facial emotion recognition and cognitive flexibility in autism spectrum disorder. Adv. Autism. 2020;6:95–204. doi: 10.1108/AIA-02-2019-0005.
  • 18. Woo H., LeTendre G.K., Pham-Shouse T., Xiong Y. The Use of Social Robots in Classrooms: A Review of Field-based Studies. Educ. Res. Rev. 2021;33:100388. doi: 10.1016/j.edurev.2021.100388.
  • 19. Estévez D., Terrón-López M.-J., Velasco-Quintana P.J., Rodríguez-Jiménez R.-M., Álvarez-Manzano V. A Case Study of a Robot-Assisted Speech Therapy for Children with Language Disorders. Sustainability. 2021;13:2771. doi: 10.3390/su13052771.
  • 20. Tleubayev B., Zhexenova Z., Zhakenova A., Sandygulova A. Robot-assisted therapy for children with ADHD and ASD: A pilot study; Proceedings of the 2019 2nd International Conference on Service Robotics Technologies, ICSRT; Beijing, China. 22−24 March 2019.
  • 21. Real M.J., Ochoa A., Escobedo D., Estrada-Medrano R., Martínez E., Maciel R., Larios-Rosillo V.M. Recognition of Colors through Use of a Humanoid Nao Robot in Therapies for Children with Down Syndrome in a Smart City. Res. Comput. Sci. 2019;148:239–252.
  • 22. Costa A., Kirsten L., Charpiot L., Steffgen G. Mental health benefits of a robot-mediated emotional ability training for children with autism: An exploratory study; Proceedings of the IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN); Montreal, QC, Canada. 1 May 2019.
  • 23. Costa A.P., Charpiot L., Lera F.J., Ziafati P., Nazarikhorram A., van der Torre L., Steffgen G. More Attention and Less Repetitive and Stereotyped Behaviors using a Robot with Children with Autism; Proceedings of the 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN); Chicago, IL, USA. 5 March 2018.
  • 24. Duquette A., Michaud F., Mercier H. Exploring the use of a mobile robot as an imitation agent with children with low-functioning autism. Auton. Robot. 2008;24:147–157. doi: 10.1007/s10514-007-9056-5.
  • 25. Feil-Seifer D., Matarić M. Robot-assisted therapy for children with autism spectrum disorders; Proceedings of the 7th International Conference on Interaction Design and Children; Chicago, IL, USA. 11−13 June 2008.
  • 26. Kozima H., Yano H. A robot that learns to communicate with human caregivers; Proceedings of the First International Workshop on Epigenetic Robotics; Lund, Sweden. 17−18 September 2001.
  • 27. Melo M., Mota F., Albuquerque V., Alexandria A. Development of a Robotic Airboat for Online Water Quality Monitoring in Lakes. Robotics. 2019;8:19. doi: 10.3390/robotics8010019.
  • 28. Marinoiu E., Zanfir M., Olaru V., Sminchisescu C. 3D human sensing, action and emotion recognition in robot assisted therapy of children with autism; Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; Salt Lake City, UT, USA. 18–23 June 2018; pp. 2158–2167.
  • 29. Dautenhahn K., Nehaniv C.L., Walters M.L., Robins B., Kose-Bagci H., Assif N., Blow M. KASPAR–a minimally expressive humanoid robot for human–robot interaction research. Appl. Bionics Biomech. 2009;6:369–397. doi: 10.1155/2009/708594.
  • 30. Robins B., Dautenhahn K., Boekhorst R.T., Billard A. Robotic assistants in therapy and education of children with autism: Can a small humanoid robot help encourage social interaction skills? Univ. Access Inf. Soc. 2005;4:105–120. doi: 10.1007/s10209-005-0116-3.
  • 31. Huijnen C.A.G.J., Lexis M.A.S., Jansens R., de Witte L.P. Roles, Strengths and Challenges of Using Robots in Interventions for Children with Autism Spectrum Disorder (ASD). J. Autism Dev. Disord. 2019;49:11–21. doi: 10.1007/s10803-018-3683-x.
  • 32. Pioggia G., Sica M.L., Ferro M., Igliozzi R., Muratori F., Ahluwalia A., De Rossi D. Human-robot interaction in autism: FACE, an android-based social therapy; Proceedings of RO-MAN 2007, the 16th IEEE International Symposium on Robot and Human Interactive Communication; Jeju, Korea. 26–29 August 2007.
  • 33. Pioggia G., Igliozzi R., Sica M.L., Ferro M., Muratori F., Ahluwalia A., De Rossi D. Exploring emotional and imitational android-based interactions in autistic spectrum disorders. J. Cyber Ther. Rehabil. 2008;1:49–61.
  • 34. Mori M. The uncanny valley. Energy. 1970;7:33–35.
  • 35. Lecciso F., Levante A., Fabio R.A., Caprì T., Leo M., Carcagnì P., Distante C., Mazzeo P.L., Spagnolo P., Petrocchi S. Emotional Expression in Children With ASD: A Pre-Study on a Two-Group Pre-Post-Test Design Comparing Robot-Based and Computer-Based Training. Front. Psychol. 2021;12:678052. doi: 10.3389/fpsyg.2021.678052.
  • 36. McCleery J.P. Comment on Technology-Based Intervention Research for Individuals on the Autism Spectrum. J. Autism Dev. Disord. 2015;45:3832–3835. doi: 10.1007/s10803-015-2627-y.
