PLOS One. 2024 May 15;19(5):e0303704. doi: 10.1371/journal.pone.0303704

Social robots in research on social and cognitive development in infants and toddlers: A scoping review

Solveig Flatebø 1,*, Vi Ngoc-Nha Tran 2, Catharina Elisabeth Arfwedson Wang 1, Lars Ailo Bongo 2
Editor: Simone Varrasi
PMCID: PMC11095739  PMID: 38748722

Abstract

There is currently no systematic review of the growing body of literature on using social robots in early developmental research. Designing appropriate methods for early childhood research is crucial for broadening our understanding of young children’s social and cognitive development. This scoping review systematically examines the existing literature on using social robots to study social and cognitive development in infants and toddlers aged between 2 and 35 months. Moreover, it aims to identify the research focus, findings, and reported gaps and challenges when using robots in research. We included empirical studies published between 1990 and May 29, 2023. We searched for literature in PsychINFO, ERIC, Web of Science, and PsyArXiv. Twenty-nine studies met the inclusion criteria and were mapped using the scoping review method. Our findings reveal that most studies were quantitative, with experimental designs conducted in a laboratory setting where children were exposed to physically present or virtual robots in a one-to-one situation. We found that robots were used to investigate four main concepts: animacy concept, action understanding, imitation, and early conversational skills. Many studies focused on whether young children regard robots as agents or social partners. The studies demonstrated that young children could learn from and understand social robots in some situations but not always. For instance, children’s understanding of social robots was often facilitated by robots that behaved interactively and contingently. This scoping review highlights the need to design social robots that can engage in interactive and contingent social behaviors for early developmental research.

Introduction

Early childhood encompasses the infant and toddler years, marked by gradual but rapid growth in both social and cognitive development [1, 2]. Social development involves acquiring skills to interact and build social bonds with others, whereas cognitive development refers to developing skills related to thinking and reasoning processes [1, 2]. Research in these two subdisciplines focuses on a diverse range of abilities, such as attachment [3], imitation [4], play [5, 6], memory [7], theory of mind [8], social cognition [4], and language acquisition [9, 10]. Theory of Mind (ToM), the ability to attribute underlying mental states like beliefs, desires, and intentions to others [11–13], has not previously been studied in pre-verbal infants [14, 15]. However, recent advances in methods have demonstrated that a rudimentary ToM may emerge earlier than the traditional assumption at the age of four [14, 15]. In line with this research, an interesting question is whether infants attribute mental states to non-human agents. Similarly, animacy understanding, the ability to classify entities as animate or inanimate [16–18], has been demonstrated in infants as young as two months [19–22], and by three years of age, children are good at understanding this distinction. Research on animacy examines how young children distinguish living beings and objects based on featural and dynamic cues such as faces, contingency behavior, and goal-directed or self-generated movement, which may involve using non-human agents possessing such cues [16, 23–27].

Developmental psychology uses diverse methodologies, designs, data-gathering instruments and materials, and formats for stimuli presentation, and the research can be conducted in various research settings [28]. Using social robots as part of research methods has emerged as a promising way to gain social and cognitive developmental insights [29–31]. Some pioneering studies have also demonstrated that social robots can contribute to cognitive assessments of elderly people and children with autism [32, 33]. These robots are designed for social interactions with humans; they are often physically embodied, have human- or animal-like qualities, and can be autonomous or pre-programmed to perform specific actions [34, 35]. Social robots often have an anthropomorphic design with human-like appearance and behavior. For example, they commonly have heads with facial features and can display various social behaviors such as facial expressions, eye contact, pointing, or postural cues [36–38]. Two social robots commonly used for research on social and cognitive development skills are Robovie [39] and NAO [40]. In research settings, social robots can serve various roles, such as social partners in interactions [e.g., 40, 41] or teaching aids delivering learning content [40, 42, 43], and they can be equipped with sensors and cameras to record child behaviors [39].

There are several research advantages of using social robots that are not easily achievable through other means when studying young children. First, they provide a level of control and consistency that can be challenging to achieve with human experimenters [32, 44]. Second, because social robots are designed for social interactions, they might have potential in research on social learning situations such as imitation studies. Third, the socialness of robots in appearance and behavior [45], together with their novelty, makes them potentially better suited to capturing a child's attention and sustaining engagement over longer time periods for a variety of testing purposes. Lastly, social robots offer a compelling avenue for advancing our understanding of young children's early ToM and animacy understanding related to non-human agents with rich social properties, and of how children represent social robots specifically.

The current review

Although social robots are increasingly used in various settings with children, little is known about their utility as research tools for investigating social and cognitive concepts in infants and toddlers. We need to determine at which stages in early childhood children are receptive to and can learn from these robots. Currently, there is no scoping review or systematic review of the available body of literature in this field. A review of the existing literature is needed to advance our understanding of social robots' relevance in research with younger age groups and to map the current state of knowledge in this field. Given the potential diversity in methodologies and research designs, and the wide range of developmental topics and concepts in the present research field, we decided to conduct a scoping review. Consequently, the main objective of the current scoping review is to provide a comprehensive overview and summary of the available literature on the use of social robots as research tools for studying the social and cognitive development of typically developing infants and toddlers aged 2 to 35 months.

Our focus is on research using social robots to inform child development, rather than research exclusively focusing on robot skills and application. We focus on typically developing children in the infancy and toddler years, younger than 3 years. We exclude neonates (0–2 months) and preschoolers (3–5 years) due to the notable distinctions in their developmental stages, which may necessitate different research methods compared to those used for infants and toddlers. Our definition of social robots is broad, encompassing all embodied robots exposed to children in a research context, irrespective of form and presentation format. However, we recognize the significance of eyes in early childhood communication [46] and, consequently, restrict our inclusion to only robots featuring eyes. Our definition covers both robots commonly defined as social robots as well as robots with social features in form and/or behavior. We chose this definition because both types of robots might be relevant for how non-human agents with richer social features can inform social and cognitive development.

This review will provide an overview of the research literature, covering research on concepts of social and cognitive development using robots, the research methods employed, and the types of robots used and their purposes. Also, our aim is to summarize the research trends by identifying the primary research focuses and findings. Finally, we want to summarize the reported gaps and challenges in this research field. Hopefully, the current review can be valuable for future research, helping to decide how to employ social robots in research settings with infants and toddlers and to support the development of age-appropriate robots for children.

Method

We conducted a scoping review, a review type that explores and maps the concepts and available literature in a given field of research [47]. Like systematic reviews, scoping reviews follow rigorous and transparent methods [47, 48]. However, unlike systematic reviews, scoping reviews ask broader rather than narrowly specified research questions in order to capture the extent and breadth of the available literature in a given field [47, 48]. We used the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews (PRISMA-ScR) (S1 Checklist) to improve this scoping review's methodological and reporting quality. We preregistered the protocol for this study on the Open Science Framework on May 19, 2023 (see the updated version of the protocol: https://osf.io/2vwpn/). We followed the recommendations of the Joanna Briggs Institute (JBI) [49] and the first five stages of the methodological framework of Arksey and O'Malley [47], along with Levac and O'Brien's advancements of this framework [50].

Stage 1: Identifying the research questions

The review was guided by three research questions: 1) What is the extent and nature of using social robots as a research tool to study social and cognitive development in infants and toddlers? 2) What are the primary research focus and findings? 3) What are the reported research gaps and challenges when using social robots as a research tool?

Stage 2: Identifying relevant studies

Inclusion criteria

We developed inclusion criteria related to the publication type, target child population, the robot type, and the research focus (Table 1) to focus the scope of the review.

Table 1. Inclusion criteria.

In the full-text screening, we excluded studies by the first unmet inclusion criterion, i.e., we checked whether the publication met the criteria for publication type first, then for the target population, robot type, and finally, the research focus.

Criterion Included
1. Publication type
Time frame 1990 until May 29, 2023
Availability Full texts available through open access or through our university subscription
Publication type Peer-reviewed journal articles, journal magazine articles, preprints, and conference proceedings with full papers for empirical studies
Language English
Research methodology Empirical studies using quantitative, qualitative, or mixed methods
2. Target child population
Participants Publications with an exclusive focus on typically developing children between 2 and 35 months of age
3. Robot type
Robot Humanoid or non-humanoid form. Embodied robots, including partly animated robots. Fully or partly autonomous. The robot must have eyes. The robot can be physically or virtually present in the child’s environment, either as a physical robot or appearing in a video. The authors of the studies do not need to define the robot as social
4. Research focus
Focus Focus on child development, i.e., the robot is used to assess social and/or cognitive development in children. The publication includes an experiment, a pilot study, or a trial to test social and/or cognitive child development
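The ordered exclusion rule described above can be sketched as a small function: a study is excluded by the first criterion it fails, checked in the fixed order of Table 1. This is an illustrative simplification, not the authors' actual screening tooling; the study fields and predicates below are hypothetical.

```python
# Sketch of the "first unmet criterion" rule from Table 1.
# Criteria are checked in a fixed order; the first failure determines
# the recorded exclusion reason. Fields/predicates are illustrative only.
CRITERIA = [
    ("publication type", lambda s: s["eligible_publication_type"]),
    ("target population", lambda s: s["min_age_months"] >= 2 and s["max_age_months"] <= 35),
    ("robot type", lambda s: s["robot_embodied"] and s["robot_has_eyes"]),
    ("research focus", lambda s: s["assesses_child_development"]),
]

def first_unmet_criterion(study: dict):
    """Return the label of the first failed criterion, or None if the study is included."""
    for label, check in CRITERIA:
        if not check(study):
            return label
    return None
```

Recording only the first unmet criterion, as in the PRISMA flowchart (Fig 1), keeps the exclusion counts mutually exclusive across reasons.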

We consulted multiple databases to identify studies, as social robotics is an interdisciplinary field. We included conference proceedings and preprints because studies within robotics are often published in this format [51–53].

Search strategy

We searched for literature in PsychINFO (OVID), Education Resources Information Center (ERIC, EMBASE), and Web of Science. We searched for preprints using the Preprint Citation Index in Web of Science and in PsyArXiv. All searches were done on 29 May 2023. In consultation with an academic librarian, we developed a search strategy and search terms, which are presented in the S1 File. We used controlled vocabulary in addition to keywords when searching in PsychINFO and ERIC. Web of Science and PsyArXiv lack their own controlled vocabulary, so PsychINFO and ERIC keywords were used in the searches. We categorized the search terms into three categories: robot type, target child population, and social and cognitive developmental concepts. For a comprehensive search, we used the search terms “robot*”, “robotics”, “social robotics”, and “human robot interaction” related to robot type category. Moreover, for the target child population category we used terms like “infan*”, “toddler*”, “child*”, “infant development”, and “childhood development”. Lastly, for developmental concepts we used terms such as “cognitive development”, “social development”, “social cognition”, and “psychological development”.
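As an illustration, the three term categories combine with OR within a category and AND between categories. The abbreviated term lists and generic Boolean syntax below are examples drawn from the text, not the exact database-specific strings, which are given in the S1 File.

```python
# Illustrative assembly of a Boolean search query from the three
# term categories (OR within a category, AND between categories).
# Term lists are abbreviated examples; see S1 File for the full strategy.
ROBOT_TERMS = ['robot*', 'robotics', '"social robotics"', '"human robot interaction"']
POPULATION_TERMS = ['infan*', 'toddler*', 'child*', '"infant development"']
CONCEPT_TERMS = ['"cognitive development"', '"social development"', '"social cognition"']

def or_group(terms):
    """Join the terms of one category into a parenthesized OR group."""
    return "(" + " OR ".join(terms) + ")"

query = " AND ".join(or_group(g) for g in (ROBOT_TERMS, POPULATION_TERMS, CONCEPT_TERMS))
# -> (robot* OR ...) AND (infan* OR ...) AND ("cognitive development" OR ...)
```

Each database search must match at least one term from every category, which narrows results to robot studies with young children on developmental concepts.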

Stage 3: Study selection

We developed a screening questionnaire a priori (doi.org/10.17605/OSF.IO/4BGX6), which all reviewers (SF, LAB, and VT) initially piloted on a random sample of studies. After revising the screening questionnaire, we started screening studies for eligibility in the web-based software Covidence [54]. We removed duplicates manually and by using the Covidence duplicate check tool. All studies were screened independently by two reviewers using the screening questionnaire. The first author (SF) screened all studies, whereas LAB and VT each screened half of the studies. We resolved disagreements by team discussion. The studies were screened through a two-step process: 1) screening of titles and abstracts; 2) screening of full texts. In full-text screening, we followed the exclusion reason order in Table 1 and excluded studies by the first unmet inclusion criterion.

Stage 4: Data charting

We developed a data charting template a priori in Covidence and used it to chart data from the included studies. The first author (SF) piloted the data charting template on five studies and iteratively modified it based on recommendations [50]. The main revisions included changes to the template layout, adding entities (i.e., final sample size and physical child-robot interaction (CRI) contact), and providing more charting instructions and explanations of the entities. The details about the newest version of the charting template and charted entities are available at OSF (doi.org/10.17605/OSF.IO/B32R6). The first author (SF) charted data from each publication, and a second reviewer (LAB or VT) checked the charted data for completeness and accuracy in Covidence. Disagreements were resolved by discussion in the research team. We charted data regarding general study characteristics (e.g., authors, publication year, publication type, and country of the first author), research aims, developmental concepts, methods (e.g., research methodology and design, research setting, procedure and conditions, material, outcome measures, and type of CRI), child population characteristics (e.g., sample size, age, and socioeconomic background), robot characteristics (e.g., robotic platform, developer, exposition, physical CRI contact, purpose of use, form, appearance, autonomy, and behavior), reported gaps and limitations, and research findings and conclusions. We exported the charted data from Covidence to Excel. All charted data are available at OSF (doi.org/10.17605/OSF.IO/WF48R).

Stage 5: Collation, summarizing, and reporting results

The reviewed studies are summarized, reported, and discussed in line with the fifth stage of Arksey and O’Malley’s scoping review framework in the following sections. We classified the studies based on the type of developmental concepts they involved.

Results

Search results

Overall, we identified 1747 studies from all database searches. After removing duplicates, and screening titles and abstracts, we screened 187 full texts for eligibility. Out of these, 158 studies were excluded. Finally, we included 29 studies in the review. Fig 1 shows the details of the search results and the study selection process in the PRISMA flowchart diagram [55].

Fig 1. PRISMA 2009 flowchart diagram.


The study selection process, including the identification and screening of studies. Studies were excluded based on a fixed order of exclusion reasons; only the first unmet reason for each study is counted in this diagram.

General characteristics

S1 Table provides an overview of all reviewed studies, including general characteristics, research methods, aims, sample characteristics, the robotic platform and other measures used, and a summary of the main findings and conclusions. There were 25 journal articles, three conference papers, and one magazine article. None of the studies were preprints. Studies were published between 1991 and 2023, and research activity grew slightly over the past three decades (Fig 2).

Fig 2. Studies per year.


The cumulative number of studies per year between 1990 and May 29, 2023.

The authors came from different countries, and most studies were conducted in Japan, followed by the United States and Canada (Table 2).

Table 2. Country distribution.

Countries of the lead authors (N = 29).

Country n
Japan 8
Taiwan 1
Italy 3
Romania 1
United Kingdom 2
Canada 5
United States 6
Australia 3

Research methods

Almost all studies (n = 25) used quantitative methodology, while only two studies used qualitative methodology and one used a mixed approach. Twenty-five of the studies used an experimental design, while the remaining four used a descriptive, correlational, case study, or ethnomethodology design. Twenty-four studies were conducted in a laboratory or in a controlled laboratory setting. Two studies were conducted in ecological settings, such as classrooms. The remaining three studies were conducted in different locations, one study in a naturalistic setting at a science museum, and two studies used various locations (i.e., laboratory, ecological and/or naturalistic location).

Child characteristics

The final sample sizes of the studies ranged from 6 to 230 participants, with the ages of participants ranging from 2 to 35 months. While some studies [56–62] included participants older than the target age, this review only focuses on findings related to children in the target age group. Twenty studies included toddlers who were 12 months or older, while seven studies included infants under 12 months. Five studies reported the socioeconomic status of the families [63–67], all belonging to the middle class. For more details about the samples, see S1 Table.

Robot characteristics and interaction types

We identified 16 social robots (Table 3 and Fig 3), most having a humanoid appearance (n = 24), whereas the remaining robots were animal-like (n = 4) or ball-shaped (n = 1). The robots used were Robie Sr., Robovie, Robovie2, NAO, Dr. Robot Inc, HOAP-2, RUBI, RUBI-6, iRobiQ, Sphero, ReplieeQ2, MyKeepon, Bee-Bot, 210 AIBO, MiRoE, and Opie. Robovie (versions 1 and 2) was most frequently used (n = 8). Most robots were pre-programmed to perform specific behaviors to examine children’s responses to these acts (n = 24), such as making eye contact or gazing in the direction of an object [e.g., 68], or performing specific actions with objects [e.g., 62]. Two studies used autonomous robot dogs that acted by themselves and reacted to the children’s behavior [60, 61]. Additionally, some studies [57, 58, 69] exposed children to robots that were autonomous or pre-programmed at different phases of the experiment.

Table 3. Robots used in the studies.

H = humanoid; NH = non-humanoid; n = number of studies using a given robot.

Robot Developer Purpose Form Appearance n Representative studies
Robie Sr. Radio Shack Animacy concept, early conversational skills H Small toy robot with a head, ears, eyes, and a mouth. Wore a sweater/T-shirt and a cap/hat. Mounted on a wheeled base with a single unit body, arms, and hands. 4 [63–65, 81]
Robovie ATR Media Information Science Laboratories; ATR Intelligence Robotics Laboratory Action understanding (e.g., gaze following, goal attribution, attribution of intention to failed actions) H Large robot with a moveable head, eyes with pupils, body, torso, arms, and hands. Mounted on a wheeled base. 7 [68, 82–87]
Robovie2 Hiroshi Ishiguro Laboratories Animacy concept, early conversational skills, action understanding (e.g., gaze following) H Large robot with a movable head, movable eyes with pupils, body, torso, arms, hands, and fingers. Wore white gloves. 2 [72, 88]
NAO SoftBank Robotics; Aldebaran Motor imitation, contingency learning H Medium-sized robot with a head, mouth, LED-eyes, body, torso, shoulders, arms, hands, fingers, legs, and feet. 3 [69, 89, 90]
Dr. Robot Inc. NR Action understanding (e.g., gaze following) H Medium-sized robot mounted on a wheeled base, with body, torso, fixed arms, moveable head, mouth, and eyes with pupils. Wore a red shirt. 1 [66]
HOAP-2 Fujitsu Laboratories Action understanding (e.g., gaze following) H Medium-sized robot with body, torso, arms, legs, hands that open/close, pan-tilt moveable head, black-circled eyes (rims of cameras), and a nose. 1 [67]
RUBI NR Animacy concept H Large robot with body, torso, arms, hands, head, eyes (cameras), a fixed nose, and a fixed mouth. Equipped with a computer screen on its torso. 1 [91]
RUBI-6 Movellan et al., (2009, 2005); Tanaka et al., (2006) Action imitation with objects H Medium-sized robot with body, torso, moveable arms, pincer hands, moveable head with an Apple iPad mini displaying an animated cartoon face with eyes, pupils, nose, mouth, and eyebrows. Mounted on a base. Equipped with a computer screen on its torso. 2 [58, 62]
iRobiQ Yujin Robots, Yujin Robot Co., Ltd Reading skills and interest H Medium-sized robot mounted on a base, with body, torso, moveable arms, moveable head with eyes and mouth. The face has LEDs and sounds. 1 [56]
Sphero NR Physical play and emotions NH Small, white-colored robotic ball. Blue drawing of a head with eyes. 1 [57]
ReplieeQ2 Osaka University and KOKORO Co. Ltd., Japan Animacy concept H 1) Android: Human-like head with black hair, a face with silicone skin, eyes, black eyebrows, a nose, and a mouth with lips. 2) Robot: Wore a plastic mask with a human-like appearance. Has eyebrows, fixed black eyes, a nose, and a mouth with lips. Both robots have bodies with necks, shoulders, arms, hands, fingers, and legs. 1 [77]
MyKeepon Kozima, Nakagawa, and Yano (2004) Animacy concept NH Small yellow snowman-shaped and creature-like robot in soft silicone rubber. Head with fixed eyes with pupils. Mounted on a base. 1 [92]
Bee-Bot TTS Group Ltd, 2021 Computational thinking, programming, and coding skills NH Small and bee-like robot with black and yellow stripes on its body. Colorful buttons on top. Head with a fixed mouth and fixed eyes with pupils. 1 [59]
210 AIBO Sony Animacy concept NH Small and dog-like robot with a head, eyes, nose, ears, legs, and a tail. Metallic form in black color. 1 [61]
MiRoE Consequential Robots Animacy concept NH Small and dog-like robot with a head, ears, eyes with pupils, moveable eyelids, body, neck, two wheeled legs. Wore a collar. 1 [60]
Opie NR Action imitation with objects H Large robot with an upper body, a head, animated eyes with pupils and eyelids, neck, torso, shoulders, arms, and hands. A head with a black-colored screen face. 1 [93]

Fig 3. Most of the robots in the review.


Images b, c, e, f, h, j, k, and l are modified cropped versions of the original work. Original images are licensed under CC-BY. For the robots Dr. Robot Inc., Opie, RUBI, and RUBI-6, we could not find images with a CC-BY (or similar) license. The android and mechanical configurations of the same robot are shown in image (h). The image sources are: a) [70]; b) [71]; c) [72]; d) [73]; e) [74]; f) [75]; g) [76]; h) [77]; i) [78]; j) [79]; k & l) [80].

In most studies, the robots were present in the same physical location as the child (n = 18), whereas the remaining robots were presented in video (n = 11). In most cases, the child-robot interaction did not involve any physical contact with the robot (n = 19). A total of 34 experiments were conducted in the 29 reviewed articles in which children were exposed to robots in some way. Most commonly, the child was exposed to the robot in a one-to-one interaction or situation (n = 20), including both live interactions and passive observations without social exchange. The remaining were bystander interactions (n = 5), where the child observed the robot interact with someone else, children-robot interactions in groups (n = 4), or a mixture of different interaction types (n = 5).

Outcome measures and other instruments and material

Details of the outcome measures are presented in the S1 Table. The most frequent measure in the studies was children’s looking behavior during stimuli presentation (n = 12). Looking behavior was measured using different instruments, such as eye tracking methods, video recordings captured by cameras, or observational notes. Various techniques were used to analyze looking behavior, such as visual habituation, preferential looking, violation of expectation, and anticipatory looking. Another common measure was children’s imitation behavior assessed in imitation tests by analyzing the performance of target actions (n = 7).

Research focus, key findings, and conclusions

The studies focused on several social and cognitive skills that we clustered into 4 main categories (Table 4). The key findings and conclusions of all studies are presented in the S1 Table.

Table 4. Research focus in the studies.

The other category includes the concepts of computational thinking (n = 1), reading interest and skills (n = 1), and physical play and emotions during robot interaction (n = 1).

Social and cognitive concepts Frequency Representative studies
Animacy concept 7 [60, 77, 81, 82, 91, 92, 94]
Action understanding 10 [66–68, 72, 83–88]
Imitation 6 [58, 62, 69, 89, 90, 93]
Early conversational skills 3 [63–65]
Other 3 [56, 57, 59]

Animacy understanding

Seven studies investigated children’s understanding of animacy (Table 4). They examined how children classify robots as animate or inanimate based on their appearance [77, 91], movements [81], and interactive behaviors [60, 61, 82, 91], using both humanoid and animal-like robots (Table 3 and Fig 3). The findings were diverse, with children sometimes perceiving robots as more like living beings when the robots had a highly human-like appearance [77] or behaved contingently [82, 91, 92]. For example, infants aged 6 to 14 months did not differentiate between a highly human-like android and a human, viewing both as animate, but they recognized the difference between a human and a mechanical-looking robot (Fig 3) [77]. Contingency behavior influenced children’s animacy understanding, with children’s reactions to robots varying depending on the robots’ contingency [82, 92]. Children aged 9 to 17 months who observed contingent interactions between a robot and a human were more likely to perceive the robot as a social being, suggesting the importance of responsive behavior in animacy perception [82, 92]. Nine- and twelve-month-old infants showed different expectations for human and robot movement, demonstrating increased negative affect when robots moved autonomously, suggesting that infants might consider robots inanimate regardless of self-generated motion [81]. Studies with robot dogs showed that children differentiated between robotic dogs and toy dogs, but they did not necessarily view the robotic dog as a living animal [60, 61]. However, they did engage with the robotic dog in a manner suggesting that they perceived it as a social partner [60, 61]. Observations of 12- to 24-month-old toddlers’ long-term interactions with a social robot indicated that they perceived the robot as a social partner [91]. The robot’s interactivity, appearance, and inscriptions of gender and social roles influenced toddlers’ attribution of animacy [91]. 
One study discussed anecdotal observations suggesting that toddlers may ascribe animacy to robots based on reciprocal vocalizations and social behaviors, such as inviting the robot to dance or apologizing to it after accidental contact [63]. Two studies connected children’s concepts of animacy with their understanding of actions, particularly goal-directed and contingent actions [77, 91], which will be discussed in the section below on action understanding.

Action understanding

Ten studies used humanoid social robots to examine children’s understanding of various actions (Tables 3 and 4), including referential actions [66, 67, 72, 84–86], goal-directed actions [83, 87, 88], and intentions behind failed actions [68]. Action understanding refers to the ability to recognize and respond appropriately to others’ actions, infer the goals of actions, and detect the intentions underlying the actions [95].

Studies on referential actions [66, 67, 72, 84–86] showed that children aged 10 to 18 months can follow the gaze of humanoid robots, but their understanding of the robot’s intentions varied. For example, 12-month-olds respond to robot gaze, and this response is not just an attentional reflex to the robot’s head movements [84], but they do not anticipate object appearance following robot gaze as they do for humans [84, 85]. Similarly, one study [72] found that 17-month-olds followed human gaze more frequently than robot gaze, suggesting that toddlers did not understand the referential intention of the robot’s gaze. Yet, toddlers may still understand a robot’s referential intentions, such as when the robot provides verbal cues during object learning [66, 86] or has previously engaged socially with adults [67]. Studies on goal-directed actions [83, 87, 88] showed that infants from 6.5 months could identify the goals of a humanoid robot moving towards a goal destination and evaluated whether the robot took the most efficient path to reach its goal [83]. However, they did not attribute goals to a featureless box, suggesting that the human-like appearance of an agent influences infants’ reasoning about an agent’s actions [83]. Moreover, 13-month-old toddlers did not expect cooperative actions between humans and robots, even with social cues present [87]. By 17 months, toddlers showed signs of predicting the goal-directed reaching actions towards a target of both humans and humanoid robots, indicating an understanding of goal-directed behavior irrespective of the agent [88]. Finally, toddlers aged 24 to 35 months recognized the intention behind a robot’s failed attempts to place beads inside a cup, but only when the robot made eye contact [68].

Imitation

Social robots were used to study two kinds of imitation in young children, i.e., their ability to learn by observing and imitating others [96]. Half of the studies focused on infants aged 2–8 months and their imitation of a humanoid robot’s bodily movements, also known as motor imitation, and contingency learning in a face-to-face interaction [69, 89, 90]. Although 2- to 5-month-olds paid more attention to the robot when it moved, only 6- to 8-month-olds imitated its motor movements and demonstrated contingency learning [69, 89, 90]. The remaining studies investigated 1- to 3-year-old toddlers’ imitation of a robot’s actions with objects, such as assembling a rattle and shaking it to make a sound [58, 62, 93]. The studies found that toddlers imitate both physically present [58] and on-screen robots [62] and that their imitation of robots increased with age [58, 62]. Toddlers who interacted more with the robot prior to the imitation test were more likely to imitate it [58], though they still imitated humans more frequently [58, 62]. Moreover, toddlers’ imitation of on-screen demonstrations by a human experimenter was not facilitated by embedding such videos in socially behaving robots [93].

Early conversational skills

Three studies used a toy robot to investigate early conversational skills in toddlers (Tables 3 and 4). The robot provided constant verbal stimulation through an in-built speaker. By using a robot, the researchers aimed to eliminate potential confounding nonverbal cues (e.g., gaze, gestures) inevitably present in human conversation that could affect toddlers' responses [63–65]. For 24-month-olds, when the robot reciprocated toddlers' utterances by repeating and expanding the topic, it led to more topic-maintaining conversation and increased linguistically mediated social play [63]. Moreover, 24-month-olds recognized when the robot's responses were semantically relevant and on-topic, and in these situations, toddlers were more likely to continue and expand the conversational topic compared to when the robot was off-topic [64]. Older toddlers, aged 27 and 33 months, demonstrated an understanding of pragmatic quantity rules in conversations by responding appropriately to specific and general queries when conversing with the robot [65].

Other concepts and related findings

The remaining studies used various social robots (Table 3) to examine: reading ability [56]; computational thinking, programming, and coding skills [59]; and physical play and emotional responses [57]. For more details about these studies, see S1 Table.

Gaps and challenges

To address our third research question, we summarize the gaps and challenges in using social robots as a research tool reported by the authors of the studies in the review. The most frequently reported gaps concerned children's familiarity with robots, testing the effects of specific robot appearance and/or behavior cues, the design of the robot, and testing across different settings. Many studies [58, 62, 72, 82, 85, 87, 88] discussed that future work should investigate whether children's familiarity with robots might influence their understanding of and responses to robots. For example, Okumura et al. [85] discuss that infants might have stronger expectations for referential cues, such as gaze, from humans than from robots due to their familiarity with human interaction. Moreover, future studies should investigate whether children's increased exposure to robots can enhance their ability to understand and respond to a robot's referential communication [85]. Several studies suggest that further research should investigate how a robot's physical appearance and behavior affect children's perception of, comprehension of, and learning from robots [66, 81–83, 85, 87]. For instance, Okumura et al. [86] suggest that future research should examine whether verbal cues provided by robots influence infants' object learning. Regarding gaps in robotic design, Peca and colleagues [92] propose that future work should aim to make robots that can interact autonomously with the child, without interference from a human operator. Most of the studies were conducted in experimental settings, and some [69, 72] suggest that future work should examine child-robot interactions in more naturalistic settings.

Most studies (n = 24) reported challenges or limitations related to using social robots as a research tool. Many (n = 10) reported challenges related to the robot's design, such as issues with its appearance and functionality. For example, additional human operators were required in the experimental procedures due to the robots' technical constraints, it was difficult to make the robots' movements resemble human movements, and robots sometimes failed to present stimuli correctly or respond appropriately during live interaction tasks. Several studies (n = 7) reported that children had challenges understanding the robot, such as its actions, communicative cues, and underlying intentions. Relatedly, some studies (n = 4) discussed that children's lack of familiarity and experience with robots may make robots harder to understand and more distracting. Several studies (n = 5) reported children experiencing challenges with task focus, including too little or too much interest in the robot, irritability during robot inactivity, or children becoming distracted and leaving the task activity. Some studies (n = 3) discussed ecological validity issues, such as whether findings generalize across settings and from specific robots to other robot types or to humans. Relatedly, we noticed that few studies used control groups with human or non-human agents alongside the robots, and there is limited discussion of the absence of these controls. An overview of commonly reported challenges is presented in Table 5.

Table 5. Reported challenges in using robots as a research tool in the included studies.

The category “no limitations reported” refers to studies that have not reported any challenges relevant to using social robots as a research tool.

Challenges reported Frequency Representative studies
Child
 Fear of robot 3 [58, 81, 93]
 Novelty of robot 4 [66, 72, 84, 87]
 Understanding the robot 7 [58, 66, 69, 72, 85–87]
 On-task engagement 5 [58, 59, 87, 89, 90]
 Sample bias 1 [65]
Robot
 Design 10 [58, 61, 62, 66, 67, 72, 77, 91, 92]
 Cost 2 [56, 69]
 Safety hazards 1 [62]
 Stimuli presentation 2 [66, 77]
Research design
 Ecological validity 3 [63, 69, 88]
 Chosen design 1 [92]
 Operationalizing 2 [61, 92]
 Setting or set-up 3 [60, 85, 90]
No limitations reported 5 [57, 64, 68, 82, 83]

Discussion

This scoping review is a novel contribution to the field as it is the first to systematically cover the breadth of the literature on how social robots have been used in early developmental research to investigate social and cognitive development. Our review provides an overview of general characteristics, methods, research focus, findings, and the reported gaps and challenges when social robots are used in early developmental research. Previous systematic reviews and scoping reviews have focused on using social robots with older children in other settings, such as in education [97], supporting autism development [98–102], or various health care contexts [103–106]. Although we maintained the wide approach of a scoping review, we found that an overarching research focus in the reviewed literature was to determine if social robots can act as social partners for young children. According to this literature, children sometimes classify social robots as social partners and can interpret the social cues and actions of robots in certain situations. Thus, the studies demonstrate the potential of using various social robots in early developmental research, but do not suggest that social robots can replace humans in research settings.

General characteristics and methods

The use of social robots in early developmental research is a small field; we identified 29 studies for this review. Most studies were quantitative with experimental designs and were conducted in controlled laboratory settings, in which the children were exposed to the robots in a one-to-one situation. Few studies used qualitative methodology [59, 60, 91], and only one study [91] observed child-robot interactions in a long-term context. Most robots were humanoid and pre-programmed to perform a specific social behavior of interest. We had a broad definition of social robots, including robots that fit typical descriptions of social robots, such as Robie Sr., Robovie, Robovie2, NAO, Dr. Robot Inc., HOAP-2, RUBI, RUBI-6, iRobiQ, ReplieeQ2, MyKeepon, AIBO, MiRo-E, and Opie (Table 3 and Fig 3). However, we also found robots not typically considered social robots, such as the robotic ball Sphero and Bee-Bot (Table 3 and Fig 3). Notably, the robots used in the studies varied in their level of advancement: some were relatively simple and immobile, like the Robie Sr. robot, while others were capable of autonomous action, such as the NAO robot (Table 3 and Fig 3). Naturally, some of the more advanced robots were unavailable when the first studies were conducted, and accordingly, the earliest published studies used simpler robots.

Research focus and key findings

Our review shows research trends in using social robots to study social and cognitive concepts such as animacy understanding, action understanding, imitation, and early conversational skills. Some studies also used robots to examine reading abilities, computational thinking, and emotions. We found that most studies focused on whether children classify robots as social partners to interact with and acquire information from, or whether humans are a privileged source of information at these developmental stages [58, 60, 62, 66–69, 72, 77, 81–94]. Only a few studies [63–65] used robots instead of humans to provide more constant stimuli, with a main focus on the developmental concepts examined. Furthermore, some had an additional focus on the application of robots [56, 59, 60], such as the therapeutic potential of robot dogs [60] or as a learning tool to improve reading [56]. Lastly, one study used a robot providing socially contingent behaviors to facilitate children's imitation learning from a human experimenter [93].

The limited number of studies means that caution is necessary when interpreting the findings. Furthermore, research findings from one age group cannot be generalized to others. However, some key findings indicate that infants are attentive to robots and can learn from them at an early stage of development in several situations. Thus, humans are not necessarily the only information source for young children. For instance, 2-month-olds tend to be more attentive to robots that move [90], while 6-month-olds imitate robots [69]. Furthermore, 6.5-month-olds can attribute goals to a robot's movement toward a specific destination [83]. Another key finding was that as children grow older, they show signs of becoming better at recognizing and interpreting the social cues provided by robots, and their learning from robots is enhanced. For example, 24- to 35-month-olds showed early signs of attributing intentions to robots by detecting what a robot intended to do when it failed to put beads inside a cup [68]. Additionally, 1- to 3-year-olds were able to imitate a robot's actions with objects both on-screen and in real life, and imitation increased with age [58, 62]. Yet, in several situations, children in the reviewed studies did not understand the robots' social behaviors and were not able to learn from them [66, 72, 84, 85, 87, 90]. Taken together, toddlers and infants may view robots as social partners, attributing mental states to them as older children do [107–110]. Moreover, this literature provides information on the ages at which young children can socially engage with social robots.

Yet another key finding was that it was not just the appearance of social robots but also how the robots behave that plays an important role in how young children perceive, understand, and respond to them [56, 58, 63, 64, 67, 82, 86, 91]. In particular, contingent and interactive behaviors facilitated how the robots were understood. For example, when young infants observed another person talking to or contingently interacting with a robot, they tended to classify the robot as animate [82, 92], and they showed increased sensitivity to its social cues, such as eye gaze [67]. Additionally, toddlers who interacted more with the robot prior to the imitation test were more likely to imitate it [58]. In conversations with robots, toddlers tended to stay more engaged when the robot reciprocated their verbalizations and stayed on-topic [63, 64]. Moreover, adding more social factors to the robot, such as verbal cueing, increased 12-month-old infants' ability to follow a robot's gaze to an object [86]. Relatedly, Csibra [111] proposes that it is not how an agent looks that is important for children to identify it as an agent, but how it behaves. It is possible that social robots, with appearances and social behaviors like those of living beings, blur the line between living and non-living beings, so that social robots come to be represented as a new ontological category in children. As a result, young children might perceive and treat these robots as social partners and not just machines. Relatedly, Manzi et al. [88] discuss that robots with human-like characteristics might activate social mechanisms in young infants. Yet, in some cases, appearance and contingent behavior were not enough to elicit an understanding of the robot's intention [66].

Gaps and challenges

The authors reported several gaps and challenges related to using social robots in early developmental research. Most commonly, the authors reported that future work should investigate whether children’s familiarity with robots impacts their responses. Although social robots possess human-like qualities and behaviors already familiar to the child, their novelty may result in different responses from children when compared to interactions with human agents. Frequently reported challenges were related to robot design. For instance, in some studies, a human experimenter had to accompany the robot during an experiment because of the technical constraints of the robots [66, 92]. Relatedly, Peca and colleagues [92] discuss that future work should aim to make robots that do not require human operators.

Limitations

This scoping review is not without limitations. Although we conducted extensive searches across multiple databases, it is possible that some relevant studies were not included. Our inclusion criteria were limited to studies published in English, and we did not manually search reference lists to identify additional studies, which may have resulted in the exclusion of relevant studies. Furthermore, as scoping reviews do not typically aim to assess the quality of evidence, we did not perform a formal quality assessment of the studies included.

Future directions

This review has allowed us to identify important directions for future research, primarily within developmental psychology but also in social robotics. First, it is unclear how effective social robots are when acting as agents in early developmental research. This is indicated by diverse findings on how children classify them as animate or inanimate and how children interpret their social cues and behaviors. Notably, few studies used any human or non-human controls for the robots. Thus, future studies should use other agent types in addition to robots to compare the effectiveness of social robots, humans, and other types of agents in early developmental research. Findings on which robot behaviors are crucial for young children may have implications for future work within social robotics aimed at developing age-appropriate robots. Second, we found that multiple robots were rarely used within the same study, and thus it is unclear whether findings generalize across robot types or are specific to a particular robot. Future work could use several robots to test generalizability across different robot types. Third, most studies investigated child-robot interactions in highly controlled settings that do not easily generalize to other environments. Future work should investigate naturalistic interactions between children and robots, in which the robots respond to the child's behavior in the moment rather than being pre-programmed to do a specific task. Fourth, we noticed that the included studies rarely reported the reasons behind their choice of a specific robot type or the amount of time spent preparing the robot, such as learning to program it or having a skilled programmer do it. We suggest reporting such information to ease replication and to improve planning for future studies.

Conclusion

Our scoping review of 29 studies shows a small and emerging field of using social robots to study social and cognitive development in infants and toddlers. We identified four main areas of focus: animacy understanding, action understanding, imitation, and early conversational skills. An important question in the field is whether young children perceive social robots as social partners or agents. Findings vary on how children classify and understand the behaviors of social robots. According to the studies, young children can, from an early age, pay attention to social robots, learn from them, and recognize their social signals, though not in all situations. The studies suggest that certain robot behaviors, particularly those that are interactive and contingent, are critical for enhancing children's perception of robots as social entities. Moreover, children's understanding of robots appears to improve with age. Our review indicates that even in infancy, social robots can be regarded as social partners, a perception that is essential in research settings that depend on social interaction. Consequently, our review highlights the need for careful selection of social robots that exhibit interactive and contingent behaviors to be effective in early developmental research. Furthermore, this review contributes knowledge on how children socially interact with and learn from non-human agents with rich social features. These insights are important for future studies within developmental psychology involving social robots and young children, and for future work within social robotics on designing appropriate robot behaviors to facilitate social interaction in early childhood.

Supporting information

S1 Checklist. Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) checklist.

(DOCX)

S1 File. Search strategy.

Search queries and search terms used in the databases and preprint repository.

(DOCX)

S1 Table. Overview of the included studies.

(DOCX)


Acknowledgments

We thank Torstein Låg, Senior Academic Librarian at the UiT The Arctic University of Norway, for support in developing search strategies.

Data Availability

All data are available from the OSF database doi.org/10.17605/OSF.IO/WF48R.

Funding Statement

The author(s) received no specific funding for this work.

References

  • 1. Smith PK, Hart CH, Abecassis M, Barrett MD, Bellmore A, Bissaker K, et al. Blackwell handbook of childhood social development. Oxford, UK: Blackwell Publishers; 2002.
  • 2. Goswami U. The Wiley-Blackwell handbook of childhood cognitive development. 2nd ed. Chichester: Wiley-Blackwell; 2011.
  • 3. Workman L, Taylor S, Barkow JH. Evolutionary perspectives on social development. In: The Wiley-Blackwell handbook of childhood social development. Wiley-Blackwell; 2022. p. 84–100.
  • 4. Meltzoff AN. Social cognition and the origins of imitation, empathy, and theory of mind. In: The Wiley-Blackwell handbook of childhood cognitive development. Wiley-Blackwell; 2010. p. 49–75.
  • 5. Lillard A, Pinkham AM, Smith E. Pretend play and cognitive development. In: The Wiley-Blackwell handbook of childhood cognitive development. Blackwell Publishing Ltd; 2010. p. 285–311.
  • 6. Nicolopoulou A, Smith PK. Social play and social development. In: The Wiley-Blackwell handbook of childhood social development. Wiley-Blackwell; 2022. p. 538–54.
  • 7. Bauer PJ, Larkina M, Deocampo J. Early memory development. In: The Wiley-Blackwell handbook of childhood cognitive development. Wiley-Blackwell; 2010. p. 153–79.
  • 8. Wellman HM. Developing a theory of mind. In: The Wiley-Blackwell handbook of childhood cognitive development. Wiley-Blackwell; 2010. p. 258–84.
  • 9. Tomasello M. Language development. In: The Wiley-Blackwell handbook of childhood cognitive development. Wiley-Blackwell; 2010. p. 239–57.
  • 10. Waxman SR, Leddon EM. Early word-learning and conceptual development. In: The Wiley-Blackwell handbook of childhood cognitive development. Wiley-Blackwell; 2010. p. 180–208.
  • 11. Gopnik A, Meltzoff AN. Words, thoughts, and theories. Cambridge, MA: MIT Press; 1997.
  • 12. Tomasello M. The cultural origins of human cognition. Cambridge, MA: Harvard University Press; 1999.
  • 13. Wellman HM. The child's theory of mind. Cambridge, MA: MIT Press; 1990.
  • 14. Gergely G. Kinds of agents: The origins of understanding instrumental and communicative agency. In: The Wiley-Blackwell handbook of childhood cognitive development. Wiley-Blackwell; 2010. p. 76–105.
  • 15. Johnson SC. The recognition of mentalistic agents in infancy. Trends in Cognitive Sciences. 2000;4(1):22–8. doi: 10.1016/s1364-6613(99)01414-x
  • 16. Opfer JE, Gelman SA. Development of the animate–inanimate distinction. In: The Wiley-Blackwell handbook of childhood cognitive development. Wiley-Blackwell; 2010. p. 213–38.
  • 17. Carey S. Conceptual change in childhood. Cambridge, MA: MIT Press; 1985.
  • 18. Keil FC. Concepts, kinds, and cognitive development. Cambridge, MA: MIT Press; 1989.
  • 19. Klein RP, Jennings KD. Responses to social and inanimate stimuli in early infancy. The Journal of Genetic Psychology. 1979;135(1):3–9. doi: 10.1080/00221325.1979.10533411
  • 20. Field TM. Visual and cardiac responses to animate and inanimate faces by young term and preterm infants. Child Development. 1979:188–94. doi: 10.2307/1129055
  • 21. Ellsworth CP, Muir DW, Hains SM. Social competence and person-object differentiation: An analysis of the still-face effect. Developmental Psychology. 1993;29(1):63–73. doi: 10.1037/0012-1649.29.1.63
  • 22. Legerstee M, Pomerleau A, Malcuit G, Feider H. The development of infants' responses to people and a doll: Implications for research in communication. Infant Behavior and Development. 1987;10(1):81–95. doi: 10.1016/0163-6383(87)90008-7
  • 23. Gelman SA, Gottfried GM. Children's causal explanations of animate and inanimate motion. Child Development. 1996;67(5):1970–87. doi: 10.1111/j.1467-8624.1996.tb01838.x
  • 24. Goren CC, Sarty M, Wu P. Visual following and pattern discrimination of face-like stimuli by newborn infants. Pediatrics. 1975;56(4):544–9.
  • 25. Csibra G, Gergely G, Bíró S, Koós O, Brockbank M. Goal attribution without agency cues: The perception of 'pure reason' in infancy. Cognition. 1999;72(3):237–67. doi: 10.1016/s0010-0277(99)00039-6
  • 26. Johnson S, Slaughter V, Carey S. Whose gaze will infants follow? The elicitation of gaze-following in 12-month-olds. Developmental Science. 1998;1(2):233–8. doi: 10.1111/1467-7687.00036
  • 27. Spelke ES, Phillips A, Woodward AL. Infants' knowledge of object motion and human action. 1995.
  • 28. Mukherji P, Albon D. Research methods in early childhood: An introductory guide. Sage; 2022.
  • 29. Kozima H, Nakagawa C, Yano H, editors. Using robots for the study of human social development. In: AAAI Spring Symposium on Developmental Robotics; 2005. Available from: http://mainline.brynmawr.edu/DevRob05/schedule/papers/kozima.pdf.
  • 30. Kozima H, Nakagawa C. Interactive robots as facilitators of children's social development. In: Mobile robots: Towards new applications. IntechOpen; 2006. Available from: https://www.intechopen.com/chapters/59.
  • 31. Scassellati B. How developmental psychology and robotics complement each other. In: NSF/DARPA workshop on development and learning; 2000. Available from: https://groups.csail.mit.edu/lbr/hrg/2000/WDL2000.pdf.
  • 32. Varrasi S, Di Nuovo S, Conti D, Di Nuovo A. Social robots as psychometric tools for cognitive assessment: A pilot test. In: Human Friendly Robotics. Cham: Springer International Publishing; 2019. doi: 10.1007/978-3-319-89327-3_8
  • 33. Conti D, Trubia G, Buono S, Di Nuovo S, Di Nuovo A. Evaluation of a robot-assisted therapy for children with autism and intellectual disability. In: Towards Autonomous Robotic Systems. Cham: Springer International Publishing; 2018. doi: 10.1007/978-3-319-96728-8_34
  • 34. Sarrica M, Brondi S, Fortunati L. How many facets does a "social robot" have? A review of scientific and popular definitions online. Information Technology & People. 2020;33(1):1–21. doi: 10.1108/ITP-04-2018-0203
  • 35. Henschel A, Laban G, Cross ES. What makes a robot social? A review of social robots from science fiction to a home or hospital near you. Current Robotics Reports. 2021;2(1):9–19. doi: 10.1007/s43154-020-00035-0
  • 36. Onyeulo EB, Gandhi V. What makes a social robot good at interacting with humans? Information. 2020;11(1):43. doi: 10.3390/info11010043
  • 37. Breazeal C, Scassellati B. A context-dependent attention system for a social robot. In: IJCAI International Joint Conference on Artificial Intelligence; 1999. vol. 2.
  • 38. Nakadai K, Hidai K-i, Mizoguchi H, Okuno H, Kitano H. Real-time auditory and visual multiple-object tracking for humanoids. In: Proceedings of the Seventeenth International Joint Conference on Artificial Intelligence (IJCAI); August 4–10, 2001; Seattle, WA, USA. p. 1425–36.
  • 39. Das D, Rashed MG, Kobayashi Y, Kuno Y. Supporting human–robot interaction based on the level of visual focus of attention. IEEE Transactions on Human-Machine Systems. 2015;45(6):664–75. doi: 10.1109/THMS.2015.2445856
  • 40. Amirova A, Rakhymbayeva N, Yadollahi E, Sandygulova A, Johal W. 10 years of human-NAO interaction research: A scoping review. Frontiers in Robotics and AI. 2021;8:744526. doi: 10.3389/frobt.2021.744526
  • 41. Kanda T, Sato R, Saiwaki N, Ishiguro H. A two-month field trial in an elementary school for long-term human-robot interaction. IEEE Transactions on Robotics. 2007;23(5):962–71. doi: 10.1109/TRO.2007.904904
  • 42. Belpaeme T, Kennedy J, Ramachandran A, Scassellati B, Tanaka F. Social robots for education: A review. Science Robotics. 2018;3:eaat5954. doi: 10.1126/scirobotics.aat5954
  • 43. Billard A. Robota: Clever toy and educational tool. Robotics and Autonomous Systems. 2003;42(3):259–69. doi: 10.1016/S0921-8890(02)00380-9
  • 44. Scassellati B. Investigating models of social development using a humanoid robot. In: Proceedings of the International Joint Conference on Neural Networks; 2003. p. 2704–9, vol. 4.
  • 45. Roesler E, Manzey D, Onnasch L. A meta-analysis on the effectiveness of anthropomorphism in human-robot interaction. Science Robotics. 2021;6(58):eabj5425. doi: 10.1126/scirobotics.abj5425
  • 46. Csibra G. Recognizing communicative intentions in infancy. Mind & Language. 2010;25(2):141–68. doi: 10.1111/j.1468-0017.2009.01384.x
  • 47. Arksey H, O'Malley L. Scoping studies: Towards a methodological framework. International Journal of Social Research Methodology. 2005;8(1):19–32. doi: 10.1080/1364557032000119616
  • 48. Munn Z, Peters MDJ, Stern C, Tufanaru C, McArthur A, Aromataris E. Systematic review or scoping review? Guidance for authors when choosing between a systematic or scoping review approach. BMC Medical Research Methodology. 2018;18(1):143. doi: 10.1186/s12874-018-0611-x
  • 49. Peters MD, Godfrey C, McInerney P, Munn Z, Tricco AC, Khalil H. Chapter 11: Scoping reviews. In: Joanna Briggs Institute reviewer's manual. p. 1–24.
  • 50. Levac D, Colquhoun H, O'Brien KK. Scoping studies: Advancing the methodology. Implementation Science. 2010;5(1):69. doi: 10.1186/1748-5908-5-69
  • 51. Xie B, Shen Z, Wang K. Is preprint the future of science? A thirty year journey of online preprint services. arXiv:2102.09066 [preprint]. 2021. Available from: https://ui.adsabs.harvard.edu/abs/2021arXiv210209066X.
  • 52. Baxter P, Kennedy J, Senft E, Lemaignan S, Belpaeme T. From characterising three years of HRI to methodology and reporting recommendations. In: 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI); 2016. IEEE.
  • 53. Shamir L. The effect of conference proceedings on the scholarly communication in computer science and engineering. Scholarly and Research Communication. 2010;1(2). doi: 10.22230/src.2010v1n2a25
  • 54. Covidence systematic review software. Veritas Health Innovation, Melbourne, Australia. Available from: www.covidence.org.
  • 55. Moher D, Liberati A, Tetzlaff J, Altman DG. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. BMJ. 2009;339:b2535. doi: 10.1136/bmj.b2535
  • 56. Hsiao H-S, Chang C-S, Lin C-Y, Hsu H-L. "iRobiQ": The influence of bidirectional interaction on kindergarteners' reading motivation, literacy, and behavior. Interactive Learning Environments. 2015;23(3):269–92. doi: 10.1080/10494820.2012.745435
  • 57. Boccanfuso L, Kim ES, Snider JC, Wang Q, Wall CA, DiNicola L, et al. Autonomously detecting interaction with an affective robot to explore connection to developmental ability. In: 2015 International Conference on Affective Computing and Intelligent Interaction (ACII); 2015. p. 1–7.
  • 58. Sommer K, Slaughter V, Wiles J, Owen K, Chiba AA, Forster D, et al. Can a robot teach me that? Children's ability to imitate robots. Journal of Experimental Child Psychology. 2021;203. doi: 10.1016/j.jecp.2020.105040
  • 59. Critten V, Hagon H, Messer D. Can pre-school children learn programming and coding through guided play activities? A case study in computational thinking. Early Childhood Education Journal. 2022;50(6):969–81. doi: 10.1007/s10643-021-01236-8
  • 60. Barber O, Somogyi E, McBride A, Proops L. Exploring the role of aliveness in children's responses to a dog, biomimetic robot, and toy dog. Computers in Human Behavior. 2023;142. doi: 10.1016/j.chb.2023.107660
  • 61. Kahn PH Jr., Friedman B, Perez-Granados DR, Freier NG. Robotic pets in the lives of preschool children. Interaction Studies. 2006;7(3):405–36. doi: 10.1075/is.7.3.13kah
  • 62. Sommer K, Redshaw J, Slaughter V, Wiles J, Nielsen M. The early ontogeny of infants' imitation of on screen humans and robots. Infant Behavior and Development. 2021;64:101614. doi: 10.1016/j.infbeh.2021.101614
  • 63. Dunham P, Dunham F, Tran S, Akhtar N. The nonreciprocating robot: Effects on verbal discourse, social play, and social referencing at two years of age. Child Development. 1991;62(6):1489–502. doi: 10.1111/j.1467-8624.1991.tb01620.x
  • 64. Dunham P, Dunham F. The semantically reciprocating robot: Adult influences on children's early conversational skills. Social Development. 1996;5(3):261–74. doi: 10.1111/j.1467-9507.1996.tb00085.x
  • 65. Ferrier S, Dunham P, Dunham F. The confused robot: Two-year-olds' responses to breakdowns in conversation. Social Development. 2000;9(3):337–47. doi: 10.1111/1467-9507.00129
  • 66. O'Connell L, Poulin-Dubois D, Demke T, Guay A. Can infants use a nonhuman agent's gaze direction to establish word–object relations? Infancy. 2009;14(4):414–38. doi: 10.1080/15250000902994073
  • 67. Meltzoff AN, Brooks R, Shon AP, Rao RPN. "Social" robots are psychological agents for infants: A test of gaze following. Neural Networks. 2010;23(8):966–72. doi: 10.1016/j.neunet.2010.09.005
  • 68. Itakura S, Ishida H, Kanda T, Shimada Y, Ishiguro H, Lee K. How to build an intentional android: Infants' imitation of a robot's goal-directed actions. Infancy. 2008;13(5):519–32. doi: 10.1080/15250000802329503
  • 69. Fitter NT, Funke R, Carlos Pulido J, Eisenman LE, Deng W, Rosales MR, et al. Socially assistive infant-robot interaction: Using robots to encourage infant leg-motion training. IEEE Robotics & Automation Magazine. 2019;26(2):12–23. doi: 10.1109/MRA.2019.2905644
  • 70.Somma R. Robie Sr. Robot. openverse (CC BY 2.0 https://creativecommons.org/licenses/by/2.0/?ref=openverse) https://openverse.org/image/b616072e-5dcf-44e3-8a42-a4338ae72c72?q=Somma%20Robie%20Sr.%20Robot [Google Scholar]
  • 71.Manzi F, Peretti G, Di Dio C, Cangelosi A, Itakura S, Kanda T, et al. A robot is not worth another: Exploring children’s mental state attribution to different humanoid robots. Frontiers in Psychology. 2020;11:Figure 1 (CC BY) https://www.frontiersin.org/journals/psychology/about#about-open. doi: 10.3389/fpsyg.2020.02011 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 72.Manzi F, Ishikawa M, Di Dio C, Itakura S, Kanda T, Ishiguro H, et al. The understanding of congruent and incongruent referential gaze in 17-month-old infants: an eye-tracking study comparing human and robot. Scientific Reports. 2020;10(1):11918: Figure 3 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/. doi: 10.1038/s41598-020-69140-6 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 73.dullhunk. Nao social humanoid robot from aldebaran robotics at animation 2012. openverse (CC BY 2.0 https://creativecommons.org/licenses/by/2.0/?ref=openverse) https://openverse.org/image/965747b9-7372-4ef0-bc45-8b3f5a77a7d9?q=Nao%20social%20humanoid%20robot%20from%20aldebaran.
  • 74.Kim J, Mishra AK, Limosani R, Scafuro M, Cauli N, Santos-Victor J, et al. Control strategies for cleaning robots in domestic applications: A comprehensive review. International Journal of Advanced Robotic Systems. 2019;16(4):Figure 4 (CC BY .0) https://creativecommons.org/licenses/by/4.0/. doi: 10.1177/1729881419857432 [DOI] [Google Scholar]
  • 75.Jeong G-M, Park C-W, You S, Ji S-H. A study on the education assistant system using smartphones and service robots for children. International Journal of Advanced Robotic Systems. 2014;11(4):71: Figure 1 (CC BT 3.0) https://creativecommons.org/licenses/by/3.0/. doi: 10.5772/58389 [DOI] [Google Scholar]
  • 76.Loimere. Sphero! openverse (CC BY 2.0 https://creativecommons.org/licenses/by/2.0/?ref=openverse) https://openverse.org/image/f8fe1444-9400-4a33-9597-dd2b8015d868?q=Sphero%21.
  • 77.Matsuda G, Ishiguro H, Hiraki K. Infant discrimination of humanoid robots. Frontiers in Psychology. 2015;6:Figure 1 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/. doi: 10.3389/fpsyg.2015.01397 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 78.gophodotcom. DSC_0096. openverse (CC BY 2.0 https://creativecommons.org/licenses/by/2.0/?ref=openverse) https://openverse.org/image/68971a4a-3deb-4a52-bc0d-811863c7bf4a?q=Keepon.
  • 79.Cervera N, Diago PD, Orcos L, Yáñez DF. The acquisition of computational thinking through mentoring: An exploratory study. Education Sciences. 2020;10(8):202: Figure 2 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/. doi: 10.3390/educsci10080202 [DOI] [Google Scholar]
  • 80.Riddoch KA, Hawkins RD, Cross ES. Exploring behaviours perceived as important for human—Dog bonding and their translation to a robotic platform. PLOS ONE. 2022;17(9):e0274353: Figure 1 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/. doi: 10.1371/journal.pone.0274353 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 81.Poulin-Doubois D, Lepage A, Ferland D. Infants’ concept of animacy. Cognitive Development. 1996;11(1):19–36. doi: 10.1016/S0885-2014(96)90026-X [DOI] [Google Scholar]
  • 82.Arita A, Hiraki K, Kanda T, Ishiguro H. Can we talk to robots? Ten-month-old infants expected interactive humanoid robots to be talked to by persons. Cognition. 2005;95(3):B49–B57. doi: 10.1016/j.cognition.2004.08.001 [DOI] [PubMed] [Google Scholar]
  • 83.Kamewari K, Kato M, Kanda T, Ishiguro H, Hiraki K. Six-and-a-half-month-old children positively attribute goals to human action and to humanoid-robot motion. Cognitive Development. 2005;20(2):303–20. doi: 10.1016/j.cogdev.2005.04.004 [DOI] [Google Scholar]
  • 84.Okumura Y, Kanakogi Y, Kanda T, Ishiguro H, Itakura S. The power of human gaze on infant learning. Cognition. 2013;128(2):127–33. doi: 10.1016/j.cognition.2013.03.011 [DOI] [PubMed] [Google Scholar]
  • 85.Okumura Y, Kanakogi Y, Kanda T, Ishiguro H, Itakura S. Infants understand the referential nature of human gaze but not robot gaze. Journal of Experimental Child Psychology. 2013;116(1):86–95. doi: 10.1016/j.jecp.2013.02.007 WOS:000321723500008. [DOI] [PubMed] [Google Scholar]
  • 86.Okumura Y, Kanakogi Y, Kanda T, Ishiguro H, Itakura S. Can infants use robot gaze for object learning? The effect of verbalization. Interaction Studies. 2013;14(3):351–65. doi: 10.1075/is.14.3.03oku WOS:000338351400004. [DOI] [Google Scholar]
  • 87.Wang Y, Park Y-H, Itakura S, Henderson AME, Kanda T, Furuhata N, Ishiguro H. Infants’ perceptions of cooperation between a human and robot. Infant and Child Development. 2020;29(2). doi: 10.1002/icd.2161 WOS:000501682800001. [DOI] [Google Scholar]
  • 88.Manzi F, Ishikawa M, Di Dio C, Itakura S, Kanda T, Ishiguro H, et al. Infants’ prediction of humanoid robot’s goal-directed action. International Journal of Social Robotics. 2022. doi: 10.1007/s12369-022-00941-7 WOS:000882557200001. [DOI] [Google Scholar]
  • 89.Deng W, Sargent B, Havens K, Vanderbilt D, Rosales M, Pulido JC, et al. Correlation between performance and quantity/variability of leg exploration in a contingency learning task during infancy. Infant Behavior & Development. 2023;70:1–10. doi: 10.1016/j.infbeh.2022.101788 [DOI] [PubMed] [Google Scholar]
  • 90.Funke R, Fitter NT, de Armendi JT, Bradley NS, Sargent B, Mataric MJ, et al. A data collection of infants’ visual, physical, and behavioral reactions to a small humanoid robot. 2018 IEEE Workshop on Advanced Robotics and its Social Impacts (ARSO); 20182018. p. 99–104. [Google Scholar]
  • 91.Alac M, Movellan J, Malmir M. Grounding a sociable robot’s movements in multimodal, situational engagements. New Frontiers in Artificial Intelligence (JSAI-ISAI 2013); 20142014. p. 267–81. [Google Scholar]
  • 92.Peca A, Simut R, Cao H-L, Vanderborght B. Do infants perceive the social robot Keepon as a communicative partner? Infant Behavior & Development. 2016;42:157–67. doi: 10.1016/j.infbeh.2015.10.005 WOS:000372389100017. [DOI] [PubMed] [Google Scholar]
  • 93.Sommer K, Slaughter V, Wiles J, Nielsen M. Revisiting the video deficit in technology-saturated environments: Successful imitation from people, screens, and social robots. Journal of experimental child psychology. 2023;232:105673–. doi: 10.1016/j.jecp.2023.105673 MEDLINE:37068443. [DOI] [PubMed] [Google Scholar]
  • 94.Kahn PH Jr., Gary HE, Shen S. Children’s social relationships with current and near-future robots. Child Development Perspectives. 2013;7(1):32–7. doi: 10.1111/cdep.12011 WOS:000314975500007. [DOI] [Google Scholar]
  • 95.Thompson EL, Bird G, Catmur C. Conceptualizing and testing action understanding. Neuroscience & Biobehavioral Reviews. 2019;105:106–14. doi: 10.1016/j.neubiorev.2019.08.002 [DOI] [PubMed] [Google Scholar]
  • 96.Heyes CM. Social learning in animals: Categories and mechanisms. Biological Reviews. 1994;69(2):207–31. doi: 10.1111/j.1469-185x.1994.tb01506.x [DOI] [PubMed] [Google Scholar]
  • 97.Papakostas GA, Sidiropoulos GK, Papadopoulou CI, Vrochidou E, Kaburlasos VG, Papadopoulou MT, et al. Social robots in special education: A systematic review. Electronics. 2021;10(12):1398. doi: 10.3390/electronics10121398 [DOI] [Google Scholar]
  • 98.Pennisi P, Tonacci A, Tartarisco G, Billeci L, Ruta L, Gangemi S, et al. Autism and social robotics: A systematic review. Autism Research. 2016;9(2):165–83. doi: 10.1002/aur.1527 [DOI] [PubMed] [Google Scholar]
  • 99.Kouroupa A, Laws KR, Irvine K, Mengoni SE, Baird A, Sharma S. The use of social robots with children and young people on the autism spectrum: A systematic review and meta-analysis. PLoS ONE. 2022;17(6). doi: 10.1371/journal.pone.0269800 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 100.Sani-Bozkurt S, Bozkus-Genc G. Social robots for joint attention development in autism spectrum disorder: A systematic review. International Journal of Disability, Development and Education. 2023;70(5):625–43. doi: 10.1080/1034912X.2021.1905153 [DOI] [Google Scholar]
  • 101.Kohli M, Kar AK, Sinha S. Robot facilitated rehabilitation of children with autism spectrum disorder: A 10 year scoping review. EXPERT SYSTEMS. 2023;40(5). doi: 10.1111/exsy.13204 WOS:000894239600001. [DOI] [Google Scholar]
  • 102.Alabdulkareem A, Alhakbani N, Al-Nafjan A. A systematic review of research on robot-assisted therapy for children with autism. Sensors. 2022;22(3):944. doi: 10.3390/s22030944 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 103.Kabacinska K, Prescott TJ, Robillard JM. Socially assistive robots as mental health interventions for children: A scoping review. International Journal of Social Robotics. 2021;13(5):919–35. doi: 10.1007/s12369-020-00679-0 WOS:000552929400001. [DOI] [Google Scholar]
  • 104.Dawe J, Sutherland C, Barco A, Broadbent E. Can social robots help children in healthcare contexts? A scoping review. BMJ paediatrics open. 2019;3(1). doi: 10.1136/bmjpo-2018-000371 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 105.Lau Y, Chee DGH, Chow XP, Wong SH, Cheng LJ, Lau ST. Humanoid robot-assisted interventions among children with diabetes: A systematic scoping review. International Journal of Nursing Studies. 2020;111:103749. doi: 10.1016/j.ijnurstu.2020.103749 [DOI] [PubMed] [Google Scholar]
  • 106.Triantafyllidis A, Alexiadis A, Votis K, Tzovaras D. Social robot interventions for child healthcare: A systematic review of the literature. Computer Methods and Programs in Biomedicine Update. 2023;3:100108. doi: 10.1016/j.cmpbup.2023.100108 [DOI] [Google Scholar]
  • 107.Melson GF, Kahn PH, Beck A, Friedman B, Roberts T, Garrett E, et al. Children’s behavior toward and understanding of robotic and living dogs. Journal of Applied Developmental Psychology. 2009;30(2):92–102. doi: 10.1016/j.appdev.2008.10.011 [DOI] [Google Scholar]
  • 108.Kahn PH Jr, Kanda T, Ishiguro H, Freier NG, Severson RL, Gill BT, et al. “Robovie, you’ll have to go into the closet now”: Children’s social and moral relationships with a humanoid robot. Developmental Psychology. 2012;48(2):303–14. doi: 10.1037/a0027033 [DOI] [PubMed] [Google Scholar]
  • 109.Desideri L, Bonifacci P, Croati G, Dalena A, Gesualdo M, Molinario G, et al. The mind in the machine: Mind perception modulates gaze aversion during child–robot interaction. International Journal of Social Robotics. 2021;13(4):599–614. doi: 10.1007/s12369-020-00656-7 [DOI] [Google Scholar]
  • 110.Di Dio C, Manzi F, Peretti G, Cangelosi A, Harris PL, Massaro D, et al. Shall I trust you? From child–robot interaction to trusting relationships. Frontiers in Psychology. 2020;11. doi: 10.3389/fpsyg.2020.00469 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 111.Csibra G. Teleological and referential understanding of action in infancy. Philosophical Transactions of the Royal Society of London Series B: Biological Sciences. 2003;358(1431):447–58. doi: 10.1098/rstb.2002.1235 [DOI] [PMC free article] [PubMed] [Google Scholar]

Decision Letter 0

Simone Varrasi

22 Mar 2024

PONE-D-24-04880

Social robots in research on social and cognitive development in infants and toddlers: A Scoping review

PLOS ONE

Dear Dr. Flatebø,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by May 06 2024 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Simone Varrasi

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at 

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and 

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. We note that Figure 3 includes an image of a participant in the study. 

As per the PLOS ONE policy (http://journals.plos.org/plosone/s/submission-guidelines#loc-human-subjects-research) on papers that include identifying, or potentially identifying, information, the individual(s) or parent(s)/guardian(s) must be informed of the terms of the PLOS open-access (CC-BY) license and provide specific permission for publication of these details under the terms of this license. Please download the Consent Form for Publication in a PLOS Journal (http://journals.plos.org/plosone/s/file?id=8ce6/plos-consent-form-english.pdf). The signed consent form should not be submitted with the manuscript, but should be securely filed in the individual's case notes. Please amend the methods section and ethics statement of the manuscript to explicitly state that the patient/participant has provided consent for publication: “The individual in this manuscript has given written informed consent (as outlined in PLOS consent form) to publish these case details”. 

If you are unable to obtain consent from the subject of the photograph, you will need to remove the figure and any other textual identifying information or case descriptions for this individual.

3. Please remove your figures from within your manuscript file, leaving only the individual TIFF/EPS image files, uploaded separately. These will be automatically included in the reviewers’ PDF.

4. We note that this manuscript is a systematic review or meta-analysis; our author guidelines therefore require that you use PRISMA guidance to help improve reporting quality of this type of study. Please upload copies of the completed PRISMA checklist as Supporting Information with a file name “PRISMA checklist”.

5. Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Additional Editor Comments:

Both Reviewers recognized the high value of your manuscript. They reported some suggestions that do not question the methodological structure of your work, yet they are important for improving its quality and for adherence to editorial guidelines. Please consider them as appropriate.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: N/A

Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Dear authors,

your scoping review about social robots in research on social and cognitive development in infants and toddlers is really interesting and scientifically useful.

It is fluid and well written.

Minor concerns are suggested:

From line 60, you deal with the scientific literature about the role of social robot as methodology of developmental psychology research. You could introduce this topic dealing with how social robots began to be used in assessment. These pioneering studies could help you: https://doi.org/10.1007/978-3-319-89327-3_8 ; https://doi.org/10.1007/978-3-319-96728-8_34 (in this case, the paper is focused on children with autism and intellectual disability, but it is really interesting);

Reviewer #2: Dear authors,

thank you for submitting your work to the journal PLOS ONE.

I reckon the study is very interesting.

I just found few details that may need to be checked.

1. Bibliography isn’t justified and the insertion of DOI would be appreciated.

2. Tables should have their description underneath, not above (e.g. tab. 3, 5, 6…)

3. Use grid lines, especially separating long lists. (S1 Table 1).

4. In which field would you like to expand the future perspective of the technique?

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2024 May 15;19(5):e0303704. doi: 10.1371/journal.pone.0303704.r002

Author response to Decision Letter 0


22 Apr 2024

Dr Simone Varrasi

Academic Editor

PLOS ONE

Dear Dr Simone Varrasi,

Thank you for your decision letter regarding our submission PONE-D-24-04880 “Social robots in research on social and cognitive development in infants and toddlers: A Scoping review”, and for inviting us to resubmit our manuscript after minor revisions. For clarity, throughout this letter, we use italics to mark your and the Reviewer’s comments and blue to mark our new text.

We have carefully read both reviews and your letter, and we have made a great effort to incorporate all the points raised in the letter and in the two reviews to improve our manuscript. We have also proofread the manuscript again and corrected minor typos and errors. Below is a detailed account of how we addressed all of your and the Reviewers’ points. We hope that with this minor revision, our manuscript will be accepted for publication in PLOS ONE.

Sincerely,

Solveig Flatebø

PhD student

UiT The Arctic University of Norway


solveig.flatebo@uit.no

+47 95 48 46 03

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

[Our response:]

We have double-checked, and our manuscript follows PLOS ONE’s style requirements, and the files are named correctly.

2. We note that Figure 3 includes an image of a participant in the study. As per the PLOS ONE policy (http://journals.plos.org/plosone/s/submission-guidelines#loc-human-subjects-research) on papers that include identifying, or potentially identifying, information, the individual(s) or parent(s)/guardian(s) must be informed of the terms of the PLOS open-access (CC-BY) license and provide specific permission for publication of these details under the terms of this license. Please download the Consent Form for Publication in a PLOS Journal(http://journals.plos.org/plosone/s/file?id=8ce6/plos-consent-form-english.pdf). The signed consent form should not be submitted with the manuscript, but should be securely filed in the individual's case notes. Please amend the methods section and ethics statement of the manuscript to explicitly state that the patient/participant has provided consent for publication: “The individual in this manuscript has given written informed consent (as outlined in PLOS consent form) to publish these case details” If you are unable to obtain consent from the subject of the photograph, you will need to remove the figure and any other textual identifying information or case descriptions for this individual.

[Our response:]

All images in Figure 3 are licensed under CC BY and can be freely used by others, including image “h” of the Repliee Q2 robot. Image “h” displays two configurations of the “female” Repliee Q2 android, developed by Kokoro, Osaka University, and Advanced Media, Inc. The left-hand picture shows the robot in its original android configuration, whereas the right-hand picture shows the same robot stripped down to its underlying mechanical appearance. Therefore, since this is a picture of a robot and not a human participant, we have not removed picture “h” from Figure 3. However, we acknowledge that the android can easily be mistaken for a human, and we have therefore added the following clarifying sentence to the figure’s note (lines 250–251 in the clean manuscript):

[Our new text:]

[…]. The Android and mechanical configurations of the same robot are shown in image (h).

3. Please remove your figures from within your manuscript file, leaving only the individual TIFF/EPS image files, uploaded separately. These will be automatically included in the reviewers’ PDF.

[Our response:]

Thanks for bringing this to our attention. All figures within the manuscript files have been removed, and the individual TIFF image files have been uploaded separately.

4. We note that this manuscript is a systematic review or meta-analysis; our author guidelines therefore require that you use PRISMA guidance to help improve reporting quality of this type of study. Please upload copies of the completed PRISMA checklist as Supporting Information with a file name “PRISMA checklist”.

[Our response:]

We have now uploaded the PRISMA checklist as Supporting Information with the correct file name, “S1_File_PRISMA_Checklist,” as requested. The PRISMA checklist was also uploaded in the first submission, but it did not have the correct file name.

5. Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

[Our response:]

We have reviewed our reference list, and it is complete and correct. We have not cited any retracted papers.

Reviewer #1: Dear authors, your scoping review about social robots in research on social and cognitive development in infants and toddlers is really interesting and scientifically useful. It is fluid and well written.

Reviewer #2: Thank you for submitting your work to the journal PLOS ONE. I reckon the study is very interesting.

[Our response:]

We thank both Reviewers for this positive feedback.

Reviewer #1: From line 60, you deal with the scientific literature about the role of social robot as methodology of developmental psychology research. You could introduce this topic dealing with how social robots began to be used in assessment. These pioneering studies could help you: https://doi.org/10.1007/978-3-319-89327-3_8 ; https://doi.org/10.1007/978-3-319-96728-8_34 (in this case, the paper is focused on children with autism and intellectual disability, but it is really interesting)

[Our response:]

We thank the reviewer for introducing us to these studies on robots' implications in psychological assessments. As the reviewer suggested, we read the papers linked to and read about the topic. In the Introduction, we added sentences that point to the importance of the topics raised by the Reviewer in connection with our study (lines 61-63 in the clean manuscript).

[Our new text:]

Some pioneering studies have also demonstrated that social robots can contribute to cognitive assessments of elderly people and children with autism [32, 33].

[Our response continued:]

Moreover, we added the reference https://doi.org/10.1007/978-3-319-89327-3_8 to our discussion about the advantages of using social robots in research in the Introduction (lines 75-76 in the clean manuscript).

[Our new text:]

[…]. Firstly, they provide a level of control and consistency that can be challenging to achieve with human experimenters [32, 44].

Reviewer #2: 1. Bibliography isn’t justified and the insertion of DOI would be appreciated.

[Our response:]

Thank you for bringing this matter to our attention. We have now applied the PLoS reference style, which includes the DOI of all papers.

Reviewer #2: 2. Tables should have their description underneath, not above (e.g. tab. 3, 5, 6…)

[Our response:]

We have now checked all tables in the manuscript and supporting information and changed them so that they meet the table requirements (https://journals.plos.org/plosone/s/tables). We have also corrected the tables so that the table labels and titles are presented in bold font underneath them.

Reviewer #2: 3. Use grid lines, especially separating long lists. (S1 Table 1).

[Our response:]

We have applied grid lines to all our tables, including the long lists in S1 Table 1 and in Table 4.

Reviewer #2: 4. In which field would you like to expand the future perspective of the technique?

[Our response:]

Thanks for raising this important issue. We believe that the manuscript primarily falls under the field of developmental psychology, but it also has implications for the field of social robotics. The manuscript highlights features of robots that are significant for young children and must be considered when developing age-appropriate robots in social robotics. However, the main topic in the reviewed literature is whether infants and toddlers perceive social robots as social partners, which is a fundamental research question in developmental psychology. We have highlighted this issue within the manuscript by adding several new sentences in the Future directions (lines 503–504 and 510–512) and by rewriting the last sentences in the Conclusion (lines 539–542 in the clean manuscript) to clarify the fields in which we would like to expand the future perspective.

[Our new text:]

This review has allowed us to identify important directions for future research, primarily within developmental psychology but also in social robotics. (Lines 503-504 in the clean manuscript).

[…] Findings on what robot behaviors are crucial for young children may have implications for future work within social robotics when aiming to develop age-appropriate robots. (Lines 510-512 in the clean manuscript).

These insights have implications for future studies within developmental psychology involving social robots and young children and future work within social robotics on designing appropriate robot behaviors to facilitate social interaction with robots in early childhood. (Lines 539-542 in the clean manuscript).

Attachment

Submitted filename: Response to Reviewers.docx

pone.0303704.s004.docx (83.3KB, docx)

Decision Letter 1

Simone Varrasi

30 Apr 2024

Social robots in research on social and cognitive development in infants and toddlers: A Scoping review

PONE-D-24-04880R1

Dear Dr. Flatebø,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice will be generated when your article is formally accepted. Please note, if your institution has a publishing partnership with PLOS and your article meets the relevant criteria, all or part of your publication costs will be covered. Please make sure your user information is up-to-date by logging into Editorial Manager® and clicking the ‘Update My Information' link at the top of the page. If you have any questions relating to publication charges, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Simone Varrasi

Academic Editor

PLOS ONE


Acceptance letter

Simone Varrasi

2 May 2024

PONE-D-24-04880R1

PLOS ONE

Dear Dr. Flatebø,

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now being handed over to our production team.

At this stage, our production department will prepare your paper for publication. This includes ensuring the following:

* All references, tables, and figures are properly cited

* All relevant supporting information is included in the manuscript submission

* There are no issues that prevent the paper from being properly typeset

If revisions are needed, the production department will contact you directly to resolve them. If no revisions are needed, you will receive an email when the publication date has been set. At this time, we do not offer pre-publication proofs to authors during production of the accepted work. Please keep in mind that we are working through a large volume of accepted articles, so please give us a few weeks to review your paper and let you know the next and final steps.

Lastly, if your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

If we can help with anything else, please email us at customercare@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Simone Varrasi

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Checklist. Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) checklist.

    (DOCX)

    pone.0303704.s001.docx (84.4KB, docx)
    S1 File. Search strategy.

    Search queries and search terms used in the databases and preprint repository.

    (DOCX)

    pone.0303704.s002.docx (30.5KB, docx)
    S1 Table. Overview of the included studies.

    (DOCX)

    pone.0303704.s003.docx (36.4KB, docx)

    Data Availability Statement

    All data are available from the OSF database doi.org/10.17605/OSF.IO/WF48R.

