Proceedings of the Royal Society B: Biological Sciences
2021 Apr 28;288(1949):20210070. doi: 10.1098/rspb.2021.0070

The development of body representations: an associative learning account

Carina C J M de Klerk, Maria Laura Filippetti, Silvia Rigato
PMCID: PMC8079995  PMID: 33906399

Abstract

Representing one's own body is of fundamental importance to interact with our environment, yet little is known about how body representations develop. One account suggests that the ability to represent one's own body is present from birth and supports infants' ability to detect similarities between their own and others’ bodies. However, in recent years evidence has been accumulating for alternative accounts that emphasize the role of multisensory experience obtained through acting and interacting with our own body in the development of body representations. Here, we review this evidence, and propose an integrative account that suggests that through experience, infants form multisensory associations that facilitate the development of body representations. This associative account provides a coherent explanation for previous developmental findings, and generates novel hypotheses for future research.

Keywords: body representations, associative learning, multisensory experience, infancy, development

1. Introduction

Our very first sensory experiences are inherently linked to our own body. Fetuses perform isolated limb movements from as early as the fifteenth gestational week [1], and when they do, this generates both proprioceptive and tactile feedback, for example when they touch their face or the uterine wall. From birth, infants' bodies provide the main tool for interacting with the external environment, and thus the development of infants’ bodily abilities is fundamentally linked with their ability to interact with, and learn from the world [2]. How infants represent this ever-present part of their existence is a fascinating question that has remained largely unanswered. This may be due to the fact that body representation is a multifaceted concept that has been defined in many different ways across the literature. For example, while for some, body representations relate to high-level concepts such as bodily self-awareness (e.g. [3]), for others these representations are more low-level and relate to the multisensory representations of the spatial disposition of our various body parts (e.g. [4]). In the current paper, we focus on the development of body representations in infancy defined as the ability to integrate multisensory (visual, proprioceptive and tactile) bodily information into coherent representations of one's own body.

The past two decades have seen an increasing interest in the study of body representations in adulthood using perceptual illusions such as the ‘rubber hand’ and ‘enfacement’ illusion (e.g. [5,6]). These studies have shown that visual, tactile, postural and anatomical information all contribute to body representations in adults. More recently, research has suggested that the ability to bind together such multisensory signals lies at the core of a gradual development of body representations from infancy onwards [7–9]. However, the exact mechanism through which these multisensory signals become integrated into coherent body representations remains unknown. One theory proposes that infants are able to combine information from multiple senses from birth (e.g. [10]). According to this view, infants are born with a supramodal representational system that is not restricted to modality-specific information and that allows them to process sensory representations of their own and others' bodies in a common framework [11]. This account has mainly focussed on how these supramodal representations allow infants to detect similarities between their own felt bodily acts and the perceived bodily acts of others to support neonatal imitation (a controversial claim, see [12–14] for recent debate). Based on the same supramodal representational system, one would also expect infants to show evidence of very early emerging multisensory body representations. To date, however, this topic remains poorly investigated. Although infants indeed seem able to represent unimodal bodily signals [15,16] and to detect intersensory body-related contingency from very early in life [7,9], in recent years, evidence for views that emphasize the role of experience in the development of multisensory body representations has started to accumulate.
For example, recent studies have shown a protracted developmental trajectory in infants’ abilities to integrate visual–tactile information presented to the limbs [17], to localize tactile stimulation on their body [18,19] and to distinguish typical from distorted body shapes [20,21]. These findings appear inconsistent with the idea that body representations are present from birth.

Expanding on existing associative learning accounts of visuo–motor integration [22] and visual–tactile integration [23], here we propose that through daily multisensory experience infants form associations between visual, tactile and proprioceptive signals that lead to gradually emerging multisensory representations of their own body. This account suggests that the kind of learning that leads to more coherent body representations occurs when there is correlated (i.e. contiguous and contingent) excitation of the sensory neurons that represent a certain body part. For example, when an infant sees her own hand touching an object, the correlated excitation of the visual, tactile and proprioceptive sensory neurons increases the strength of the connections between them, so that subsequent excitation of one of these types of sensory neurons, e.g. when the infant's hand is touched by an external object, leads to co-activation of the others. Thus, through correlated multisensory experience, unisensory body representations become multisensory body representations, which allow infants to represent multisensory bodily events in relation to each other. The learning that supports the formation of these multisensory body representations mainly occurs when infants observe themselves while they touch their body or external objects.
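The correlation-driven strengthening of connections described above is, in essence, a Hebbian learning rule. The following minimal sketch is our own illustration, not a model taken from the literature: the three "sensory units" (visual, tactile, proprioceptive), the learning rate and the number of learning episodes are all arbitrary assumptions chosen to make the mechanism concrete.

```python
import numpy as np

# Three hypothetical sensory units representing the same body part.
MODALITIES = ["visual", "tactile", "proprioceptive"]

def hebbian_update(W, activity, lr=0.1):
    """Strengthen connections between co-active units (Hebbian rule),
    keeping self-connections on the diagonal at zero."""
    a = np.asarray(activity, dtype=float)
    W = W + lr * np.outer(a, a)
    np.fill_diagonal(W, 0.0)
    return W

def coactivation(W, activity):
    """Activity spread to the other modalities through learned weights."""
    return W @ np.asarray(activity, dtype=float)

# Correlated multisensory experience: the infant repeatedly sees and
# feels her hand touching an object, so all three units fire together.
W = np.zeros((3, 3))
for _ in range(50):
    W = hebbian_update(W, [1.0, 1.0, 1.0])

# Later, touch alone (tactile unit only) co-activates the visual and
# proprioceptive units via the strengthened associations.
spread = coactivation(W, [0.0, 1.0, 0.0])
print(dict(zip(MODALITIES, spread.round(2))))
```

On this sketch, unisensory excitation after learning produces multisensory co-activation, which is the sense in which correlated experience turns unisensory representations into multisensory ones.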

Our bodies undergo several periods of significant change over the course of our lifespan, for example when we grow in infancy, childhood and adolescence, and as we gain or lose weight, become pregnant or age. Here, we argue that our associative account of body representation development offers a plausible explanation for how we update representations of our bodies as we develop and change, opening up new avenues for future research. We start by reviewing studies with adult and child participants that demonstrate that body representations are malleable and can be influenced by incoming multisensory signals. Hereafter we discuss what is known about the development of body representations in infancy, and provide evidence for our account by drawing on studies investigating the role of experience in this process. Throughout we provide suggestions for future research that would more directly test this associative hypothesis of body representation development.

2. Multisensory body representations in adulthood

(a). Evidence from bodily illusions

Research on the mechanisms underlying body representations in adulthood has focused on the role of multisensory integration in defining the perception of one's body. Through experimentally induced manipulations of multisensory inputs, these studies have shown that body representations are highly malleable (for a review see [6]). A well-known example of the plasticity of the representations of our body comes from the ‘rubber hand illusion’ (RHI; [24]). In this illusion, watching a rubber hand being stroked in synchrony with the real hidden hand causes a change in body ownership, whereby the rubber hand is experienced as part of one's own body and the real hand is felt to be closer to the rubber hand (for reviews see [2,6]). The RHI is not limited to the visual–tactile domain, and can also be elicited by synchronous tactile–proprioceptive [25] and visual–motor experience [26]. This illusion provides an indirect demonstration of how correlated multisensory experience during our typical day-to-day interactions likely plays a critical role in the perception of our own body. Indeed, changes in perceptual body representations as a result of the RHI are only significant when information from proprioceptive, visual and tactile sensory channels is coherent (i.e. spatially and temporally integrated [27,28]). Conversely, the illusion is diminished when the multisensory information provided is incongruent, either because it is temporally asynchronous [24], or because the postural and anatomical positions are disrupted, leading to a spatial mismatch between the rubber hand and the real hand [29].

While correlated visual, tactile and proprioceptive information is necessary for maintaining a stable representation of one's own hand, it is not sufficient for the RHI to take place. Top-down constraints, such as visual resemblance to the real hand, are also relevant for the illusion to occur [29,30], suggesting that perceptual body representations arise from an interactive process whereby the immediate sensory signals are compared with stored representations of the body [5]. Based on our hypothesis that representations of the body develop gradually through multisensory experience, one would expect that early in development correlated multisensory signals are likely to have a bigger impact on bodily illusions than top-down representations (i.e. expectations about the visual percept that should be associated with certain tactile and proprioceptive input). Indeed, recent research seems to support this proposal. For example, 6-to-8-year-old children are as likely to embody a rubber hand that is significantly larger than their own hand as they are to embody a rubber hand that is equal in size [31]. Similarly, multisensory processing of bodily stimuli shows a protracted development that is dependent upon the substantial physical changes that the body undergoes from infancy to late childhood [32–36]. Overall, this evidence suggests that children's body representations are more plastic, presumably because they have had less time to gain substantial multisensory experience with their bodies to establish strong expectations about how proprioceptive, visual and tactile representations typically co-occur.

Perceptual illusions similar to the RHI have also been demonstrated for other body parts, such as the face [37,38] and full body [39]. For example, in the enfacement illusion, synchronous visual–tactile stimulation of the participant's face and another person's face induces a change in self-identification. Thus even our own face, which is arguably the most distinguishable component of our personal identity, is susceptible to illusions induced by temporal and spatial sensory correspondences. Together, these studies suggest that our body representations are not solely derived from stored internal representations, or determined by information from one particular sensory domain, but that they are instead flexibly updated based on the available multisensory information.

(b). Evidence from tactile localization studies

The ability to combine incoming afferent information with a pre-existing body representation becomes critical when we need to determine the location of a specific body part. Adults rely on statistical information about the sensory input they receive to act on the environment, and therefore they estimate their body configuration, e.g. limb position, based on the high prior probability that limbs usually occupy particular locations with respect to the body. When limbs are not in their usual position, for example when the arms are crossed, the spatial correspondences between external stimuli (e.g. visual information) and the proprioceptive information (limb posture) need to be remapped. Consequently, when asked to localize touch on their hands, adults are less accurate when their hands are crossed than when they are uncrossed. This ‘crossed-hands deficit’ [40] suggests that the extensive correlated tactile, proprioceptive and visual experience we obtain with our typical body configuration (i.e. with the left hand in our left visual field and the right hand in the right visual field) promotes the emergence of multisensory associations that help localize touch when our hands are in their normal position, but that result in a conflict when our hands are crossed. In line with our account, studies on tactile remapping show that tactile processing and localization on the body indeed appear to be influenced by multisensory experience. For example, Azañón et al. [41] demonstrated that repeated visual, proprioceptive and tactile experience in a crossed posture can improve tactile localization and diminish the crossed-hands deficit. Furthermore, prolonged experience with unfamiliar postures leads to a reduction of the deficit in localizing touch across such postures [42–44], suggesting that, over time, multisensory experience can produce long-lasting changes in body representations.
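The logic of the crossed-hands deficit, and of its reduction with remapping experience, can be sketched as a toy decision rule. This is our own illustration under stated assumptions: the function name, the evidence scheme and the `prior_strength` values are arbitrary stand-ins, with high values standing in for heavy reliance on the canonical hand-to-visual-field mapping and low values for the down-weighting that remapping experience would produce.

```python
# Toy sketch of the crossed-hands deficit: a learned spatial prior helps
# localization in the canonical posture but conflicts with current
# proprioceptive input when the hands are crossed.

def localize_touch(touched_hand, posture, prior_strength):
    """Return the external side ('left'/'right') the model points to.

    touched_hand   : hand receiving the touch ('left' or 'right')
    posture        : 'uncrossed' or 'crossed'
    prior_strength : weight of the learned hand-to-field association
                     (an arbitrary illustrative parameter)
    """
    other = {"left": "right", "right": "left"}
    # Where the touched hand actually is in external space right now:
    true_side = touched_hand if posture == "uncrossed" else other[touched_hand]

    evidence = {"left": 0.0, "right": 0.0}
    evidence[true_side] += 1.0                 # current proprioceptive input
    evidence[touched_hand] += prior_strength   # learned canonical mapping
    return max(evidence, key=evidence.get)

# Canonical posture: prior and input agree, so localization is correct.
assert localize_touch("left", "uncrossed", prior_strength=1.5) == "left"

# Crossed hands, strong prior: the habitual mapping wins and the model
# points to the wrong side of space (the 'deficit').
assert localize_touch("left", "crossed", prior_strength=1.5) == "left"

# With the prior down-weighted (as after remapping experience), current
# input dominates and the deficit disappears.
assert localize_touch("left", "crossed", prior_strength=0.5) == "right"
```

The same scheme, with a strong prior, also captures the infant pattern discussed later, where responses follow the habitual side regardless of posture.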

The role of experience in coding multisensory spatial information of proprioceptive–tactile stimuli is also apparent when comparing performance across regions of peripersonal space that differ in the amount of visual representation accumulated over time. For example, when an accurate visual representation of the body is lacking (i.e. the space behind our backs, which we rarely get to see), individuals show better performance in a tactile temporal resolution task, compared to when the same task is performed in the frontal space of the body [44]. These results are in line with our account as they demonstrate that, in the absence of opportunities to form associations between seeing and feeling touch on one's hands in this posture (e.g. when hands are behind the back), the interference normally seen in the unfamiliar posture is less pronounced. This raises the intriguing question of how changes in body representations occur in the context of similar slow learning experiences such as the ones accumulated across development.

3. The development of body representations

One of the key processes in the development of infants' first rudimentary body representations is their ability to detect contingencies between multisensory information [45]. For example, preferential looking studies have shown that infants are able to match the proprioceptive information generated by their own performed arm and leg movements to those observed on a video display from at least five months of age [46,47]. At this age, infants also start to demonstrate the ability to localize vibrotactile stimulation on their limbs, successfully combining tactile and proprioceptive information [48]. From at least three months of age, infants respond differently when they observe a specular image of their own face compared to that of another infant [49], suggesting that they are able to detect the contingency between the visual, motor and proprioceptive information generated by their own actions. Even newborn infants show a differential looking time response to visual displays of an infant face being stroked synchronously or asynchronously to the tactile stimulation they are receiving themselves [9]. These early competencies are thought to rely on infants’ abilities to match sensory stimulation in one modality (e.g. tactile or proprioceptive) to stimulation in another modality (e.g. visual). Detecting similar intersensory contingencies during everyday exploratory behaviours is thought to play an important role in the development of infants' ability to identify their body as belonging to themselves, and as separate from the environment [50], and may facilitate the formation of multisensory body representations from early on.

Despite infants’ early competency in detecting contingencies between multisensory signals, it appears that more advanced body representations, which include expectations about the typical configuration and proportions of human bodies, develop much later in infancy. Indeed, habituation studies have demonstrated that infants only start to differentiate between typical and scrambled adult body configurations from about 15 months of age when these are presented as pictures [21], and from nine months when live human models or mannequins are used [20]. Nine-month-olds also appear to have an understanding of the typical proportions of adult human bodies, while five-month-olds do not [51]. Additionally, structural encoding of body configuration seems to emerge around 14 months of age when infants start to show a differential neural signature of body processing when they observe upright versus inverted bodies [52]. However, there are also preferential looking studies [53] and an ERP study [54] that suggest that infants are sensitive to the overall organization of body parts from as early as 3.5 months of age. At this age, infants can also discriminate between appropriately and inappropriately proportioned bodies after first being familiarized to the disproportional bodies [53]. Together, these studies suggest that infants have expectations about the first-order structure of bodies from relatively early in life, but that these expectations may be fragile and dependent on how closely the stimuli resemble the bodies infants observe in daily life, and on whether they can directly compare the typical and atypical stimuli. These previous studies all used adult human bodies, but it is unknown whether infants' representations of their own bodies follow a similar developmental trajectory.
As we explain below, our account would predict that the visual, motor and proprioceptive experience that infants obtain while they observe their own full body would be critical for the development of infants’ ability to form expectations about their own body's configuration and proportions.

4. The role of multisensory experience in the development of body representations

Our associative account suggests that situations in which infants receive contiguous and contingent multisensory input are critical for the development of multisensory body representations. The great majority of these experiences comes from infants' own actions (but also see Box 1). We propose that the order in which modalities become integrated into multisensory representations is dictated by the available modalities at different points in development. For example, while in utero proprioception and touch are likely to be dominant, after birth, proprioceptive–tactile representations will become associated with visual representations (figure 2). The earliest evidence for the idea that infants’ own experiences play a key role in the development of body representations comes from studies that suggest that fetuses use tactile and proprioceptive information to learn to differentiate between their own body and the external uterine environment, including other bodies. For example, Castiello et al. [59] used ultrasound to observe and compare touch movements of twins at 14 and 18 weeks of gestation towards the uterine wall, themselves and the other twin. They found that while movements towards the uterine wall did not change over time, the proportion of self-directed movements decreased with time, and movements directed toward the twin were instead greater at 18 than 14 weeks of gestation. This study suggests that infants already start to use correlated proprioceptive–tactile information to learn about their own bodies while they are still in the womb. Further evidence for the role of multisensory experience in the development of body representations can be separated into studies investigating representations of the limbs, the face and the full body.

Box 1. Social interactions.

While our review focuses on infants driving multisensory learning through actions and interactions with their own bodies, social interactions likely also play an important role in the development of multisensory body representations [55,56]. Infants receive multisensory proprioceptive and tactile experience during infant massage, and visual–tactile experience when they receive social touch (figure 1). These types of interactions are characterized by the use of ‘affective touch’—the slow caress-like touch that specifically targets the CT fibres [57]. Besides playing an important role in bonding, these experiences of affective touch also allow infants to form associations between visual, tactile and proprioceptive bodily representations. Indeed, a recent study [58] found that five-month-old infants showed a preference for body-related visual–tactile synchrony when they received slow velocity CT-optimal affective touch, but not when they received faster velocity non-affective touch. Although the relationship between this preference and the infants' previous experience with receiving affective touch was not investigated, these findings suggest that slow, caress-like touch may facilitate the development of multisensory body representations in infancy.

Figure 2.

Sources of multisensory bodily information during prenatal and postnatal development. We propose that through daily multisensory experience infants form associations between visual, tactile and proprioceptive signals that lead to gradually emerging representations of their own body. (Online version in colour.)

(a). Limbs

When infants start to reach for objects around four months of age, not only the visual and motor representation of that action, but also its tactile consequences are activated together. Through the process of associative learning, this repeated experience of seeing and feeling one's body being touched would be expected to result in a link between visual and tactile representations [22,23]. Given that infants do not spontaneously reach across the body midline until they are about six to eight months old [60], young infants would typically see their left hand touching objects in their left visual field and their right hand in their right visual field. It has been suggested that this consistent early experience promotes the emergence of representations about the most plausible locations of touch (i.e. spatial priors) and prototypical proprioceptive body postures (i.e. canonical posture) [61]. Findings from Bremner and colleagues [17,62,63] offer a developmental perspective on how infants' experience with reaching with their hands and feet plays a role in the development of the representation of body parts across postures. In the first six months of life, infants tend to respond to a vibrotactile sensation presented to one of their hands by moving the hand located on the side of the body where the stimulated hand would typically be, regardless of their posture (crossed or uncrossed hands) [62]. It is only in the second half of the first year of life, when infants have accumulated more experience with their limbs in a variety of postures, that the mechanism of postural remapping emerges [62]. This finding is consistent with our hypothesis that the correlated visual, proprioceptive and tactile experience infants obtain while reaching across the midline from about 6.5 months of age supports the integration of the multisensory spatial signals and, as a result, enables infants to make accurate manual responses to the stimulated hand across postures. 
A similar pattern of results has been found when vibrotactile stimulation was applied to the infants’ feet in crossed- and uncrossed-legs postures [17]. In this case, six-month-olds, but not four-month-olds, showed a tactile localization deficit with their feet crossed, indicating that while the four-month-olds relied on purely anatomical coding of touch, the six-month-olds attempted to integrate the visual–tactile information with the current posture of the body [17]. The authors conclude that because the influence of external spatial coordinates on tactile localization emerges between four and six months of age, this process is likely to be dependent on experience. Between four and six months of age infants increasingly start to reach for objects with their hands and their feet [64,65]. At these initial stages, when infants do not yet reach across the midline, the proprioceptive, tactile and visual information coming from their limbs typically is congruent, with the left limbs making contact with objects in the infant's left visual field, and the right limbs in the right visual field. Placing the limbs in an unfamiliar crossed position during tactile stimulation may, therefore, result in a conflict between the previously associated proprioceptive and visual representations of the limbs and their current representation, resulting in the observed ‘crossed-limb’ deficit.

Converging evidence for the idea that experience with crossing the midline is important comes from an EEG study in which a neural signature of visual–tactile integration across limb posture was observed in 10-month-olds, but not six-month-olds [66]. Posture also modulated somatosensory processing in a group of eight-month-olds who were proficient at reaching across the midline, but not in a group of infants with matched age and motor ability but who did not reach across the midline yet [66]. Altogether, this shows how early sensory experience promotes the emergence of representations about the most plausible locations of touch on a canonical limb posture, and how further experience is necessary for the infant to be able to update the postural coordinates and integrate multisensory spatial signals.

Although these studies suggest that experience plays a role in the development of visual–tactile integration, by relying on natural variability in motor skills they do not allow us to rule out alternative explanations such as general maturational processes affecting both visual–tactile integration and the ability to cross the midline. More direct evidence for the role of experience in multisensory integration comes from studies with individuals who had dense bilateral cataracts early in development [67,68]. For example, a participant whose vision was restored by two years of age did not show a crossed-hands deficit in a tactile localization task, suggesting he relied on anatomical rather than visual–external coding of touch [68]. However, individuals whose vision was restored by five months of age did show a typical crossed-hands deficit [67]. These studies suggest that there is a sensitive period between five months and two years of age during which visual experience is necessary for the development of crossmodal links between touch and vision. In the first two years of life, infants spend an increasing amount of time reaching for, touching, and exploring objects with their hands. This experience provides infants with a multitude of opportunities for forming multisensory associations that are fundamental for developing body representations.

(b). Face

While infants have ample opportunities to form multisensory associations for limbs via self-observation, there are fewer such opportunities for body parts that are visually opaque, such as the face. Without access to a mirror, infants may obtain contiguous and contingent multisensory information when they explore their face with their hands while performing orofacial gestures such as opening their mouth. Infants spend a significant amount of time touching their own face, both pre- and postnatally [69,70]. For example, between 24 and 36 weeks of gestation, fetuses increasingly touch the sensitive parts of their face (the mouth region and the lower part of the face) more than the relatively less sensitive areas of the face [69]. The ‘double-touch’ fetuses experience when they touch their own face provides a unique cue that specifies their face as being separate from the environment and from others. Indeed, when newborn infants touch their own face, they do not demonstrate the same rooting response as when an external object contacts their face [71], suggesting that prenatal multisensory experience contributes to early self/other distinction. Additionally, fetuses perform anticipatory mouth movements when they approach their face with their hand from as early as 24 weeks of gestational age [69]. Together with the observation of fetal thumb sucking, which can be seen as early as 10–15 weeks of gestation [1,72], the evidence of coordinated movements between hands and mouths observed in utero suggests that prenatal multisensory experience supports the integration of tactile and proprioceptive information and may play a fundamental role in the early development of body representations.

However, to integrate proprioceptive–tactile experiences with the visual representation of one's own face, infants would need to be able to see themselves in a mirror. There is evidence that infants show a great deal of self-exploration when they are placed in front of a mirror, observing their own movements and reaching for the part of the body reflected in the mirror [73]. However, given that most infants will only obtain experience with observing themselves in a mirror when their carer places them in front of one, or when a mirror is attached to a toy or their play pen, it may be unsurprising that it takes a relatively long time before infants show evidence of mirror self-recognition between 18 and 24 months of age [73,74]. If it indeed is the case that the formation of associations between visual, proprioceptive and tactile experiences aids the development of body representations, one would predict that it should be possible to speed up mirror self-recognition by giving infants additional mirror exposure. Studies with rhesus monkeys have provided evidence for this idea by showing that visual–somatosensory [75] and visual–proprioceptive [76] training induces self-directed behaviours in front of a mirror, similar to those observed in the classic rouge task. In the study by Chang et al. [75] monkeys were placed in front of a mirror and trained to touch an irritant laser light that was presented on their own face. After several weeks of training, the monkeys had formed an association between seeing a light spot in the mirror and touching the corresponding area of the face, which allowed them to touch the mark even in the absence of somatosensation. These findings suggest that the formation of multisensory associations supports the development of mirror-induced self-directed behaviours, and have implications for our understanding of what mirror self-recognition as measured by the mark test reflects. 
There has been lively debate about this in the last three decades, with some researchers suggesting that touching the mark reflects the development of self-awareness [74,77] while others have favoured lower-level interpretations [78,79]. The finding that mark-directed touch can be trained through multisensory experience in non-human primates suggests that it is unlikely that this behaviour always reflects true self-awareness. Instead, the development of multisensory associations may constitute a prerequisite process for the ability to identify the face as belonging to oneself. Future studies will need to develop experimental methodologies that will allow us to investigate whether similar training effects can be found in human infants.

(c). Body

Like our face, our full body is perceptually opaque as we cannot see the visual gestalt of our entire body unless we stand in front of a full-length reflective surface. As a result, we may expect not only infants' representations of their own face, but also those of their own full body to be relatively slow to develop. However, thus far the majority of studies investigating full-body representations in infancy have used stimuli of adult bodies (e.g. [20,21]), and as far as we are aware, no studies have investigated infants’ representations of their own full-body shape. We hypothesize that simultaneous multisensory experience across different parts of the body influences infants' representations of their own full-body configuration.

There is some preliminary evidence that infants' ability to represent the various parts of their body may indeed depend on the amount of multisensory experience they have acquired with these body parts. For example, a tactile-localization study in which vibrating stimuli were applied to different points on the head and arms of 7- to 21-month-old infants showed that the ability to reach to tactile stimuli on the body becomes established in the second half of the first year of life and is refined further during the second year [19]. Interestingly, and in line with our account, this study revealed that infants are able to localize targets near the mouth and on the hand at a younger age than targets near the ear, on the forehead, or on other areas of the arm. This developmental trajectory may reflect the amount of multisensory experience the infant acquires with specific body parts as they age, from the early prenatal stages onwards [80]. Infants are known to spend a significant amount of time touching their mouth with their hands from as early as the 24th week of gestation [69,70], and likely obtain considerably less correlated multisensory experience for the ear or forehead. We propose that infants' representations of their own full body are similarly influenced by the amount of full-body multisensory experience. For example, as infants start to locomote, there are increased opportunities for them to use their whole body in a coordinated fashion (e.g. crawling, walking), and thus to integrate proprioceptive, tactile and visual experiences (for a similar discussion see [52,81]). Supporting evidence comes from Slaughter et al. [21], who showed that 12-month-olds who could walk discriminated typical from scrambled body configurations, whereas non-walking 12-month-olds did not (but see also [82]).
Given that full-body actions are perceptually opaque, one would expect that mirror exposure while performing such actions is critical for the development of representations of one's own full body. Future research will need to examine whether multisensory experience obtained while performing whole-body actions such as crawling or walking indeed influences when infants start to represent their own full body. For example, this could be achieved by adapting paradigms that elicit full-body illusions [39,83] for use with infant populations to investigate the role of multisensory experience and mirror exposure.

5. Concluding remarks

To summarize, we propose that from the prenatal stages onwards, the correlated multisensory experience infants obtain when they act and interact with their body helps them form representations of their own body. Whether it is through touching the uterine wall, reaching for objects, crawling across the floor or exploring their face with their hands, the multisensory associations formed through these experiences help the infant update the relative positions of their body parts and enhance the accuracy of their body representations. Adult research is largely consistent with this account; studies using bodily illusions and modifications of the standard posture of the body have shown how multisensory experience can change existing body representations. However, the fact that body representations can be changed by multisensory experience in adults does not necessarily mean that they also develop through multisensory experience in infancy. Although there is preliminary evidence to suggest that multisensory experience plays an important role in the development of body representations in infancy (e.g. [7,48]), there is a need for longitudinal and training studies in which this experience is systematically manipulated. Such studies could shed light on how much, or what kind of, sensory experience (e.g. visual, motor, tactile) is crucial for infants to integrate multisensory information into more coherent body representations. For instance, we predict that if infants were trained to cross the midline to reach for objects, the correlated visual, tactile and proprioceptive information they would obtain during this experience would improve their ability to localize touch across arm postures.

Further support for our account comes from neural evidence demonstrating the recruitment of key multisensory cortical areas when infants and adults process body-related stimuli (e.g. [7,84]). Given the posterior parietal cortex's hypothesized role in integrating multisensory bodily signals [25,84,85], future research could use functional near-infrared spectroscopy (fNIRS) to measure activation over this area while infants obtain correlated multisensory experience. We would expect the amount of activation over the posterior parietal cortex to predict the extent to which infants' body representations are influenced by the multisensory experience they receive during a training study.

Our account also has implications for developmental disorders in which either the sensory input, or the ability to integrate multisensory signals may differ. For example, a recent longitudinal study that investigated midline crossing behaviours showed that at 10 months of age (but not at five and 14 months), infants at risk of ASD or ADHD produced fewer manual actions that involved their hand crossing the body midline into the contralateral side of space compared to low-risk infants [86]. This reduced level of midline crossing may play a role in the recently demonstrated delay in the ability to represent touch across body postures in children with ASD [87,88]. Individuals with ASD also demonstrate hypo- and/or hypersensitivity to individual sensory channels [89] and show disrupted multisensory integration processes [90]. These differences in processing and integrating multisensory signals may impact on the development of body representations. For example, it has been shown that children with ASD are less sensitive to the RHI [91,92] and evidence suggests that these children might require prolonged exposure to multisensory synchronous stimuli for a change in body ownership to take place [91]. This opens up avenues for future studies investigating the effectiveness of multisensory training on the development of body representations in this group.

In recent years, evidence has been accumulating for the idea that body representations are not only important for processing bodily events involving our own body but also those of others (e.g. [93]). For example, from early in infancy, somatosensory representations are activated both when our own body is touched, and when we observe touch on others’ bodies [15,94], and representations of our own body and those of our interaction partners are closely intertwined [95]. This self–other bodily overlap is an expected consequence of our body representations developing through associative learning. For instance, if an infant tends to look at their hand while it is being touched, the correlated visual–tactile experience results in a link between the two sensory representations, causing the tactile representation to become activated in response to the observation of a visual event that is physically similar, e.g. someone else's hand being touched [23,96]. Given that processes of self–other overlap are thought to play an important role in social cognitive abilities such as empathy [97], future research that investigates the developmental origins of body representations will have wider implications for understanding how infants start to make sense of the social world.
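The associative mechanism described above can be made concrete with a toy simulation. The following Python sketch is purely an illustrative assumption on our part, not a model drawn from the developmental literature: a single associative weight links a 'visual' unit (seeing a hand being touched) to a 'tactile' unit (feeling the touch), and a simple Hebbian covariance rule strengthens the weight only when the two signals reliably co-occur, as when an infant watches their own hand being touched.

```python
import random

def train_association(correlated: bool, trials: int = 500, lr: float = 0.05) -> float:
    """Toy Hebbian covariance rule: one weight links a 'visual' unit
    (seeing a hand being touched) to a 'tactile' unit (feeling the touch).
    The weight grows only when the two signals covary across trials."""
    random.seed(0)  # fixed seed for reproducibility
    w = 0.0
    for _ in range(trials):
        visual = 1.0 if random.random() < 0.5 else 0.0
        if correlated:
            # infant watches their own hand being touched: signals coincide
            tactile = visual
        else:
            # unrelated sensory events: no systematic co-occurrence
            tactile = 1.0 if random.random() < 0.5 else 0.0
        # covariance update: strengthens only under reliable co-activation
        w += lr * (visual - 0.5) * (tactile - 0.5)
        w = max(min(w, 1.0), -1.0)  # keep the weight bounded
    return w

w_correlated = train_association(correlated=True)
w_uncorrelated = train_association(correlated=False)
# Correlated visual-tactile experience drives the weight to its ceiling,
# so visual input alone (visual * w) now activates the tactile unit;
# uncorrelated experience produces no systematic strengthening.
```

On this sketch, the self–other overlap described above falls out for free: once the weight is strong, any physically similar visual event, including the sight of someone else's hand being touched, activates the tactile representation.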

Building on empirical research conducted with infants, adults and clinical populations, we have argued that body representations not only support our actions and interactions with the world, but are also formed by them. They are a consequence of the rich multisensory experiences we obtain with our own bodies from the prenatal stages onwards. The key challenge for future research will be to determine exactly how much, and what kind of multisensory experience infants need to form more coherent body representations.

Supplementary Material

Acknowledgement

We would like to thank H. Gillmeister for helpful comments on an earlier draft of this paper.

Data accessibility

This article has no additional data.

Authors' contributions

C.C.J.M.d.K., M.L.F. and S.R. came up with the idea of writing the paper. All authors contributed to the writing of the first draft. S.R., M.L.F. and C.C.J.M.d.K. contributed images for figures 1 and 2. C.C.J.M.d.K. created the figures, and critically revised the manuscript. All authors gave final approval for publication.

Figure 1.

Examples of the kinds of multisensory proprioceptive, visual and tactile experience infants receive from social touch. (Online version in colour.)

Competing interests

We declare we have no competing interests.

Funding

We received no funding for this study.

References

  • 1.De Vries JI, Visser G, Prechtl HF. 1985. The emergence of fetal behaviour. II. Quantitative aspects. Early Hum. Dev. 12, 99-120. ( 10.1016/0378-3782(85)90174-4) [DOI] [PubMed] [Google Scholar]
  • 2.Ehrsson HH. 2020. Multisensory processes in body ownership. Multisensory Percept. 1, 179-200. ( 10.1016/B978-0-12-812492-5.00008-5) [DOI] [Google Scholar]
  • 3.Berlucchi G, Aglioti S. 1997. The body in the brain: neural bases of corporeal awareness. Trends Neurosci. 20, 560-564. ( 10.1016/S0166-2236(97)01136-3) [DOI] [PubMed] [Google Scholar]
  • 4.Bremner AJ. 2016. Developing body representations in early life: combining somatosensation and vision to perceive the interface between the body and the world. Dev. Med. Child Neurol. 58, 12-16. ( 10.1111/dmcn.13041) [DOI] [PubMed] [Google Scholar]
  • 5.Longo MR, Azañón E, Haggard P. 2010. More than skin deep: body representation beyond primary somatosensory cortex. Neuropsychologia 48, 655-668. ( 10.1016/j.neuropsychologia.2009.08.022) [DOI] [PubMed] [Google Scholar]
  • 6.Tsakiris M. 2017. The multisensory basis of the self: from body to identity to others. Q. J. Exp. Psychol. 70, 597-609. ( 10.1080/17470218.2016.1181768) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7.Filippetti ML, Lloyd-Fox S, Longo MR, Farroni T, Johnson MH. 2015. Neural mechanisms of body awareness in infants. Cereb. Cortex. 25, 3779-3787. ( 10.1093/cercor/bhu261) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Filippetti ML, Orioli G, Johnson MH, Farroni T. 2015. Newborn body perception: sensitivity to spatial congruency. Infancy 20, 455-465. ( 10.1111/infa.12083) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Filippetti ML, Johnson MH, Lloyd-Fox S, Dragovic D, Farroni T. 2013. Body perception in newborns. Curr. Biol. 23, 2413-2416. ( 10.1016/j.cub.2013.10.017) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Meltzoff AN, Kuhl PK. 1994. Faces and speech: intermodal processing of biologically relevant signals in infants and adults. In The development of intersensory perception: comparative perspectives (eds DJ Lewkowicz, R Lickliter), pp. 335–369. Hillsdale, NJ: Lawrence Erlbaum Associates Publishers.
  • 11.Meltzoff AN. 2007. ‘Like me’: a foundation for social cognition. Dev. Sci. 10, 126-134. ( 10.1111/j.1467-7687.2007.00574.x) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Meltzoff AN, et al. 2019. Eliciting imitation in early infancy. Dev. Sci. 22, e12738. ( 10.1111/desc.12738) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.Oostenbroek J, et al. 2016. Comprehensive longitudinal study challenges the existence of neonatal imitation in humans. Curr. Biol. 26, 1334-1338. ( 10.1016/j.cub.2016.03.047) [DOI] [PubMed] [Google Scholar]
  • 14.Oostenbroek J, et al. 2018. Re-evaluating the neonatal imitation hypothesis. Dev. Sci. 22, e12720. [DOI] [PubMed] [Google Scholar]
  • 15.Meltzoff AN, Saby JN, Marshall PJ. 2019. Neural representations of the body in 60-day-old human infants. Dev. Sci. 22, e12698. ( 10.1111/desc.12698) [DOI] [PubMed] [Google Scholar]
  • 16.Milh M, Kaminska A, Huon C, Lapillonne A, Ben-Ari Y, Khazipov R. 2007. Rapid cortical oscillations and early motor activity in premature human neonate. Cereb. Cortex. 17, 1582-1594. ( 10.1093/cercor/bhl069) [DOI] [PubMed] [Google Scholar]
  • 17.Begum Ali J, Spence C, Bremner AJ. 2015. Human infants' ability to perceive touch in external space develops postnatally. Curr. Biol. 25, R978-R979. ( 10.1016/j.cub.2015.08.055) [DOI] [PubMed] [Google Scholar]
  • 18.Chinn LK, Hoffmann M, Leed JE, Lockman JJ. 2019. Reaching with one arm to the other: coordinating touch, proprioception, and action during infancy. J. Exp. Child Psychol. 183, 19-32. ( 10.1016/j.jecp.2019.01.014) [DOI] [PubMed] [Google Scholar]
  • 19.Leed JE, Chinn LK, Lockman JJ. 2019. Reaching to the self: the development of Infants’ ability to localize targets on the body. Psychol. Sci. 30, 1063-1073. ( 10.1177/0956797619850168) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20.Heron M, Slaughter V. 2010. Infants' responses to real humans and representations of humans. Int. J. Behav. Dev. 34, 34-45. ( 10.1177/0165025409345047) [DOI] [Google Scholar]
  • 21.Slaughter V, Heron M. 2004. Origins and early development of human body knowledge. Monogr. Soc. Res. Child Dev. 69, 1-102. ( 10.1111/j.0037-976X.2004.00287.x) [DOI] [PubMed] [Google Scholar]
  • 22.Heyes C. 2001. Causes and consequences of imitation. Trends Cogn. Sci. 5, 253-261. ( 10.1016/S1364-6613(00)01661-2) [DOI] [PubMed] [Google Scholar]
  • 23.Keysers C, Wicker B, Gazzola V, Anton JL, Fogassi L, Gallese V. 2004. A touching sight: SII/PV activation during the observation and experience of touch. Neuron 42, 335-346. ( 10.1016/S0896-6273(04)00156-4) [DOI] [PubMed] [Google Scholar]
  • 24.Botvinick M, Cohen J. 1998. Rubber hands ‘feel’ touch that eyes see. Nature 391, 756. ( 10.1038/35784) [DOI] [PubMed] [Google Scholar]
  • 25.Ehrsson HH, Holmes NP, Passingham RE. 2005. Touching a rubber hand: feeling of body ownership is associated with activity in multisensory brain areas. J. Neurosci. 25, 10 564-10 573. ( 10.1523/JNEUROSCI.0800-05.2005) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26.Kalckert A, Ehrsson HH. 2012. Moving a rubber hand that feels like your own: a dissociation of ownership and agency. Front. Hum. Neurosci. 6, 40. ( 10.3389/fnhum.2012.00040) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27.Makin TR, Holmes NP, Ehrsson HH. 2008. On the other hand: dummy hands and peripersonal space. Behav. Brain Res. 191, 1. ( 10.1016/j.bbr.2008.02.041) [DOI] [PubMed] [Google Scholar]
  • 28.Tsakiris M. 2010. My body in the brain: a neurocognitive model of body-ownership. Neuropsychologia 48, 703-712. ( 10.1016/j.neuropsychologia.2009.09.034) [DOI] [PubMed] [Google Scholar]
  • 29.Tsakiris M, Haggard P. 2005. The rubber hand illusion revisited: visuotactile integration and self-attribution. J. Exp. Psychol. Hum. Percept. Perform. 31, 80-91. ( 10.1037/0096-1523.31.1.80) [DOI] [PubMed] [Google Scholar]
  • 30.Tsakiris M, Carpenter L, James D, Fotopoulou A. 2010. Hands only illusion: multisensory integration elicits sense of ownership for body parts but not for non-corporeal objects. Exp. Brain Res. 204, 343-352. ( 10.1007/s00221-009-2039-3) [DOI] [PubMed] [Google Scholar]
  • 31.Filippetti ML, Crucianelli L. 2019. If I were a grown-up: children's response to the rubber hand illusion with different hand sizes. J. Exp. Child Psychol. 185, 191-205. ( 10.1016/j.jecp.2019.04.016) [DOI] [PubMed] [Google Scholar]
  • 32.Cowie D, Makin TR, Bremner AJ. 2013. Children's responses to the rubber-hand illusion reveal dissociable pathways in body representation. Psychol. Sci. 24, 762-769. ( 10.1177/0956797612462902) [DOI] [PubMed] [Google Scholar]
  • 33.Cowie D, Sterling S, Bremner AJ. 2016. The development of multisensory body representation and awareness continues to 10 years of age: evidence from the rubber hand illusion. J. Exp. Child Psychol. 142, 230-238. ( 10.1016/j.jecp.2015.10.003) [DOI] [PubMed] [Google Scholar]
  • 34.Gori M, Del Viva M, Sandini G, Burr DC. 2008. Young children do not integrate visual and haptic form information. Curr. Biol. 18, 694-698. ( 10.1016/j.cub.2008.04.036) [DOI] [PubMed] [Google Scholar]
  • 35.Nardini M, Jones P, Bedford R, Braddick O. 2008. Development of cue integration in human navigation. Curr. Biol. 18, 689-693. ( 10.1016/j.cub.2008.04.021) [DOI] [PubMed] [Google Scholar]
  • 36.Nava E, Bolognini N, Turati C. 2017. The development of a cross-modal sense of body ownership. Psychol. Sci. 28, 330-337. ( 10.1177/0956797616682464) [DOI] [PubMed] [Google Scholar]
  • 37.Sforza A, Bufalari I, Haggard P, Aglioti SM. 2010. My face in yours: visuo-tactile facial stimulation influences sense of identity. Soc. Neurosci. 5, 148-162. ( 10.1080/17470910903205503) [DOI] [PubMed] [Google Scholar]
  • 38.Tsakiris M. 2008. Looking for myself: current multisensory input alters self-face recognition. PLoS ONE 3, e4040–e4046. ( 10.1371/journal.pone.0004040) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 39.Petkova VI, Ehrsson HH. 2008. If I were you: perceptual illusion of body swapping. PLoS ONE 3, e3832. ( 10.1371/journal.pone.0003832) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 40.Yamamoto S, Kitazawa S. 2001. Reversal of subjective temporal order due to arm crossing. Nat. Neurosci. 4, 759-765. ( 10.1038/89559) [DOI] [PubMed] [Google Scholar]
  • 41.Azañón E, Stenner MP, Cardini F, Haggard P. 2015. Dynamic tuning of tactile localization to body posture. Curr. Biol. 25, 512-517. ( 10.1016/j.cub.2014.12.038) [DOI] [PubMed] [Google Scholar]
  • 42.Benedetti F. 1991. Perceptual learning following a long-lasting tactile reversal. J. Exp. Psychol. Hum. Percept. Perform. 17, 267-277. ( 10.1037/0096-1523.17.1.267) [DOI] [PubMed] [Google Scholar]
  • 43.Craig JC, Belser AN. 2006. The crossed-hands deficit in tactile temporal-order judgments: the effect of training. Perception 35, 1561-1572. ( 10.1068/p5481) [DOI] [PubMed] [Google Scholar]
  • 44.Kobor I, Furedi L, Kovacs G, Spence C, Vidnyanszky Z. 2006. Back-to-front: improved tactile discrimination performance in the space you cannot see. Neurosci. Lett. 400, 163-167. ( 10.1016/j.neulet.2006.02.037) [DOI] [PubMed] [Google Scholar]
  • 45.Rochat P. 2009. Others in mind: social origins of self-consciousness. Cambridge, UK: Cambridge University Press. [Google Scholar]
  • 46.Bahrick LE, Watson JS. 1985. Detection of intermodal proprioceptive–visual contingency as a potential basis of self-perception in infancy. Dev. Psychol. 21, 963. ( 10.1037/0012-1649.21.6.963) [DOI] [Google Scholar]
  • 47.Schmuckler MA. 1996. Visual–proprioceptive intermodal perception in infancy. Infant Behav. Dev. 19, 221-232. ( 10.1016/S0163-6383(96)90021-1) [DOI] [Google Scholar]
  • 48.Somogyi E, et al. 2018. Which limb is it? Responses to vibrotactile stimulation in early infancy. Br. J. Dev. Psychol. 36, 384-401. ( 10.1111/bjdp.12224) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 49.Field TM. 1979. Differential behavioral and cardiac responses of 3-month-old infants to a mirror and peer. Infant Behav. Dev. 2, 179-184. [Google Scholar]
  • 50.Rochat P, Striano T. 2002. Who's in the mirror? Self-other discrimination in specular images by four- and nine-month-old infants. Child Dev. 73, 35-46. ( 10.1111/1467-8624.00390) [DOI] [PubMed] [Google Scholar]
  • 51.Zieber N, Bhatt RS, Hayden A, Kangas A, Collins R, Bada H. 2010. Body representation in the first year of life. Infancy 15, 534-544. ( 10.1111/j.1532-7078.2009.00026.x) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 52.Gillmeister H, Stets M, Grigorova M, Rigato S. 2019. How do bodies become special? Electrophysiological evidence for the emergence of body-related cortical processing in the first 14 months of life. Dev. Psychol. 55, 2025-2038. ( 10.1037/dev0000762) [DOI] [PubMed] [Google Scholar]
  • 53.Zieber N, Kangas A, Hock A, Bhatt RS. 2014. The development of intermodal emotion perception from bodies and voices. J. Exp. Child Psychol. 126, 68-79. ( 10.1016/j.jecp.2014.03.005) [DOI] [PubMed] [Google Scholar]
  • 54.Gliga T, Dehaene-Lambertz G. 2005. Structural encoding of body and face in human infants and adults. J. Cogn. Neurosci. 17, 1328-1340. ( 10.1162/0898929055002481) [DOI] [PubMed] [Google Scholar]
  • 55.Crucianelli L, Filippetti ML. 2020. Developmental perspectives on interpersonal affective touch. Topoi 39, 575-586. ( 10.1007/s11245-018-9565-1) [DOI] [Google Scholar]
  • 56.Montirosso R, McGlone F. 2020. The body comes first. Embodied reparation and the co-creation of infant bodily-self. Neurosci. Biobehav. Rev. 113, 77-87. ( 10.1016/j.neubiorev.2020.03.003) [DOI] [PubMed] [Google Scholar]
  • 57.Croy I, Luong A, Triscoli C, Hofmann E, Olausson H, Sailer U. 2016. Interpersonal stroking touch is targeted to C tactile afferent activation. Behav. Brain Res. 297, 37-40. ( 10.1016/j.bbr.2015.09.038) [DOI] [PubMed] [Google Scholar]
  • 58.Della Longa L, Filippetti ML, Dragovic D, Farroni T. 2019. Synchrony of caresses: does affective touch help infants to detect body-related visual–tactile synchrony? Front. Psychol. 10, 2944. ( 10.3389/fpsyg.2019.02944) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 59.Castiello U, et al. 2010. Wired to be social: the ontogeny of human interaction. PLoS ONE 5, e13199. ( 10.1371/journal.pone.0013199) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 60.Morange F, Bloch H. 1996. Lateralization of the approach movement and the prehension movement in infants from 4 to 7 months. Int. J. Surg. Res. Pract. 5, 81-92. [Google Scholar]
  • 61.Tame L, Azañón E, Longo MR. 2019. A conceptual model of tactile processing across body features of size, shape, side, and spatial location. Front. Psychol. 10, 291. ( 10.3389/fpsyg.2019.00291) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 62.Bremner AJ, Mareschal D, Lloyd-Fox S, Spence C. 2008. Spatial localization of touch in the first year of life: early influence of a visual spatial code and the development of remapping across changes in limb position. J. Exp. Psychol. Gen. 137, 149-162. ( 10.1037/0096-3445.137.1.149) [DOI] [PubMed] [Google Scholar]
  • 63.Bremner AJ, van Velzen J. 2015. Sensorimotor control: retuning the body–world interface. Curr. Biol. 25, R159-R161. ( 10.1016/j.cub.2014.12.042) [DOI] [PubMed] [Google Scholar]
  • 64.Galloway JC, Thelen E. 2004. Feet first: object exploration in young infants. Infant Behav. Dev. 27, 107-112. ( 10.1016/j.infbeh.2003.06.001) [DOI] [Google Scholar]
  • 65.Thelen E, Corbetta D, Kamm K, Spencer JP, Schneider K, Zernicke RF. 1993. The transition to reaching: mapping intention and intrinsic dynamics. Child Dev. 64, 1058-1098. ( 10.2307/1131327) [DOI] [PubMed] [Google Scholar]
  • 66.Rigato S, Ali JB, van Velzen J, Bremner AJ. 2014. The neural basis of somatosensory remapping develops in human infancy. Curr. Biol. 24, 1222-1226. ( 10.1016/j.cub.2014.04.004) [DOI] [PubMed] [Google Scholar]
  • 67.Azañón E, Camacho K, Morales M, Longo MR. 2018. The sensitive period for tactile remapping does not include early infancy. Child Dev. 89, 1394-1404. ( 10.1111/cdev.12813) [DOI] [PubMed] [Google Scholar]
  • 68.Ley P, Bottari D, Shenoy BH, Kekunnaya R, Röder B. 2013. Partial recovery of visual–spatial remapping of touch after restoring vision in a congenitally blind man. Neuropsychologia 51, 1119-1123. ( 10.1016/j.neuropsychologia.2013.03.004) [DOI] [PubMed] [Google Scholar]
  • 69.Reissland N, Francis B, Aydin E, Mason J, Schaal B. 2014. The development of anticipation in the fetus: a longitudinal account of human fetal mouth movements in reaction to and anticipation of touch. Dev. Psychobiol. 56, 955-963. ( 10.1002/dev.21172) [DOI] [PubMed] [Google Scholar]
  • 70.Rochat P. 1998. Self-perception and action in infancy. Exp. Brain Res. 123, 102-109. ( 10.1007/s002210050550) [DOI] [PubMed] [Google Scholar]
  • 71.Butterworth G, Hopkins B. 1988. Hand–mouth coordination in the new-born baby. Br. J. Dev. Psychol. 6, 303-314. ( 10.1111/j.2044-835X.1988.tb01103.x) [DOI] [Google Scholar]
  • 72.Hepper PG, Shahidullah S, White R. 1991. Handedness in the human fetus. Neuropsychologia 29, 1107-1111. ( 10.1016/0028-3932(91)90080-R) [DOI] [PubMed] [Google Scholar]
  • 73.Amsterdam B. 1972. Mirror self-image reactions before age two. Dev. Psychobiol. 5, 297-305. ( 10.1002/dev.420050403) [DOI] [PubMed] [Google Scholar]
  • 74.Rochat P. 2003. Five levels of self-awareness as they unfold early in life. Conscious Cogn. 12, 717-731. ( 10.1016/S1053-8100(03)00081-3) [DOI] [PubMed] [Google Scholar]
  • 75.Chang L, Fang Q, Zhang S, Poo M-M, Gong N. 2015. Mirror-induced self-directed behaviors in rhesus monkeys after visual–somatosensory training. Curr. Biol. 25, 212-217. ( 10.1016/j.cub.2014.11.016) [DOI] [PubMed] [Google Scholar]
  • 76.Chang L, Zhang S, Poo M-M, Gong N. 2017. Spontaneous expression of mirror self-recognition in monkeys after learning precise visual–proprioceptive association for mirror images. Proc. Natl Acad. Sci. USA 114, 3258-3263. ( 10.1073/pnas.1620764114) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 77.Gallup GG Jr. 1998. Self-awareness and the evolution of social intelligence. Behav. Processes 42, 239-247. ( 10.1016/S0376-6357(97)00079-X) [DOI] [PubMed] [Google Scholar]
  • 78.Heyes CM. 1994. Reflections on self-recognition in primates. Anim. Behav. 47, 909-919. ( 10.1006/anbe.1994.1123) [DOI] [Google Scholar]
  • 79.Suddendorf T, Butler DL. 2013. The nature of visual self-recognition. Trends Cogn. Sci. 17, 121-127. ( 10.1016/j.tics.2013.01.004) [DOI] [PubMed] [Google Scholar]
  • 80.Yamada Y, et al. 2016. An embodied brain model of the human foetus. Sci. Rep. 6, 27893. ( 10.1038/srep27893) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 81.Fausey C, Jayaraman S, Smith LB. 2016. From faces to hands: changing visual input in the first two years. Cognition 152, 101-107. ( 10.1016/j.cognition.2016.03.005) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 82.Christie T, Slaughter V. 2009. Exploring links between sensorimotor and visuospatial body representations in infancy. Dev. Neuropsychol. 34, 448-460. ( 10.1080/87565640902964532) [DOI] [PubMed] [Google Scholar]
  • 83.Ehrsson HH. 2007. The experimental induction of out-of-body experiences. Science 317, 1048. ( 10.1126/science.1142175) [DOI] [PubMed] [Google Scholar]
  • 84.Gentile G, Petkova VI, Ehrsson HH. 2011. Integration of visual and tactile signals from the hand in the human brain: an FMRI study. J. Neurophysiol. 105, 910-922. ( 10.1152/jn.00840.2010) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 85.Limanowski J, Blankenburg F. 2016. Integration of visual and proprioceptive limb position information in human posterior parietal, premotor, and extrastriate cortex. J. Neurosci. 36, 2582-2589. ( 10.1523/JNEUROSCI.3987-15.2016) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 86.Begum Ali J, Charman T, Johnson MH, Jones EJ, Team BS. 2020. Early motor differences in infants at elevated likelihood of autism spectrum disorder and/or attention deficit hyperactivity disorder. J. Autism Dev. Disord. 50, 4367-4384. ( 10.1007/s10803-020-04489-1) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 87.Hense M, Badde S, Köhne S, Dziobek I, Röder B. 2019. Visual and proprioceptive influences on tactile spatial processing in adults with autism spectrum disorders. Autism. Res. 12, 1745-1757. ( 10.1002/aur.2202) [DOI] [PubMed] [Google Scholar]
  • 88.Wada M, Suzuki M, Takaki A, Miyao M, Spence C, Kansaku K. 2014. Spatio-temporal processing of tactile stimuli in autistic children. Sci. Rep. 4, 1-9. ( 10.1038/srep05985) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 89.Leekam SR, Nieto C, Libby SJ, Wing L, Gould J. 2007. Describing the sensory abnormalities of children and adults with autism. J. Autism Dev. Disord. 37, 894-910. ( 10.1007/s10803-006-0218-7) [DOI] [PubMed] [Google Scholar]
  • 90.Falck-Ytter T, et al. 2018. Reduced orienting to audiovisual synchrony in infancy predicts autism diagnosis at 3 years of age. J. Child Psychol. Psychiatry 59, 872-880. ( 10.1111/jcpp.12863) [DOI] [PubMed] [Google Scholar]
  • 91.Cascio CJ, Foss-Feig JH, Burnette CP, Heacock JL, Cosby AA. 2012. The rubber hand illusion in children with autism spectrum disorders: delayed influence of combined tactile and visual input on proprioception. Autism 16, 406-419. ( 10.1177/1362361311430404) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 92.Ropar D, Greenfield K, Smith AD, Carey M, Newport R. 2018. Body representation difficulties in children and adolescents with autism may be due to delayed development of visuo-tactile temporal binding. Dev. Cogn. Neurosci. 29, 78-85. ( 10.1016/j.dcn.2017.04.007) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 93.Goldman A, de Vignemont F. 2009. Is social cognition embodied? Trends Cogn. Sci. 13, 154-159. ( 10.1016/j.tics.2009.01.007) [DOI] [PubMed] [Google Scholar]
  • 94.Rigato S, Bremner AJ, Gillmeister H, Banissy MJ. 2019. Interpersonal representations of touch in somatosensory cortex are modulated by perspective. Biol. Psychol. 146, 107719. ( 10.1016/j.biopsycho.2019.107719) [DOI] [PubMed] [Google Scholar]
  • 95.Maister L, Hodossy L, Tsakiris M, Shinskey JL. 2020. Self or (M) other? Infants' sensitivity to bodily overlap with their mother reflects their dyadic coordination. Child Dev. 91, 1631-1649. ( 10.1111/cdev.13361) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 96.Heyes C. 2010. Where do mirror neurons come from? Neurosci. Biobehav. Rev. 34, 575-583. ( 10.1016/j.neubiorev.2009.11.007) [DOI] [PubMed] [Google Scholar]
  • 97.De Guzman M, Bird G, Banissy MJ, Catmur C. 2016. Self–other control processes in social cognition: from imitation to empathy. Phil. Trans. R. Soc. Lond. B 371, 20150079. ( 10.1098/rstb.2015.0079) [DOI] [PMC free article] [PubMed] [Google Scholar]



Articles from Proceedings of the Royal Society B: Biological Sciences are provided here courtesy of The Royal Society
