Published in final edited form as: J Cogn Neurosci. 2008 Feb;20(2):312–323. doi: 10.1162/jocn.2008.20022

Visuotactile learning and body representation: An ERP study with rubber hands and rubber objects

Clare Press, Cecilia Heyes, Patrick Haggard, Martin Eimer

Abstract

We studied how the integration of seen and felt tactile stimulation modulates somatosensory processing, and investigated whether visuotactile integration depends on the temporal contiguity of stimulation and on its coherence with a pre-existing body representation. During training, participants viewed a rubber hand or a rubber object that was tapped either synchronously with stimulation of their own hand, or in an uncorrelated fashion. In a subsequent test phase, somatosensory ERPs were recorded to tactile stimulation of the left or right hand, to assess how tactile processing was affected by previous visuotactile experience during training. An enhanced somatosensory N140 component was elicited after synchronous, compared with uncorrelated, visuotactile training, irrespective of whether participants viewed a rubber hand or rubber object. This early effect of visuotactile integration on somatosensory processing is interpreted as a candidate electrophysiological correlate of the rubber hand illusion that is determined by temporal contiguity, but not by pre-existing body representations. Additional ERP modulations were observed beyond 200ms post-stimulus, suggesting an attentional bias induced by visuotactile training. These late modulations were absent when the stimulation of a rubber hand and of participants' own hands was uncorrelated during training, suggesting that pre-existing body representations may affect later stages of tactile processing.

Introduction

To form an accurate representation of the body and of sensory events impinging on the bodily surface, the brain must integrate inputs arriving from various sensory modalities, and in particular from vision and touch. Such multisensory integration processes are mediated by several cortical and subcortical brain structures. Multisensory integration of visual (or auditory) and tactile inputs is typically strongest when the critical stimuli are temporally and spatially coincident or ‘contiguous’ (cf. Stein & Meredith, 1993), as these spatiotemporal correlations provide strong evidence that such sensory inputs are related (Keysers et al., 2004; Armel & Ramachandran, 2003; but see Zampini, Torresan, Spence, & Murray, 2007, and Gillmeister & Eimer, in press, for recent evidence that spatial contiguity may be less relevant for tactile-auditory integration). For example, Keysers et al. (2004) propose that when we observe ourselves being touched, the integration of vision and touch arises through a Hebbian process of associative learning where links between visual and somatosensory neurons are strengthened through repeated correlated activation (see Heyes, 2001 for a similar account of visuomotor integration). Neural responses can become bimodal through concurrent activation of visual and tactile neurons (Stein & Meredith, 1993). Indeed, somatosensory cortex cells are known to fire in response to visual stimuli only if these stimuli were previously paired with tactile information (Zhou & Fuster, 2000; Zhou & Fuster, 1997). Cells in the ventral premotor cortex and superior parietal lobule (area 5) of the macaque monkey are activated not only when a monkey is touched on the arm, but also when a visual stimulus is presented near the arm (e.g. Graziano, Cooke, & Taylor, 2000; Obayashi, Tanaka, & Iriki, 2000; Graziano, Yap, & Gross, 1994), suggesting that these cells might also be involved in multisensory integration. In addition, sensory responses to visual events are enhanced by the concurrent presence of touch (e.g. Kennett, Eimer, Spence, & Driver, 2001), and vice versa (e.g. Fiorio & Haggard, 2005; Ro, Wallace, Hagedorn, Farnè, & Pienkos, 2004).
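To make the Hebbian account concrete, the following minimal sketch (our illustration, not code from any of the studies cited) tracks a single visual-to-tactile connection weight under correlated versus uncorrelated stimulation. The learning rate, the decay term, and the trial count are illustrative assumptions; the 1/13 coincidence probability anticipates the uncorrelated training schedule described in the Methods below.

```python
import numpy as np

rng = np.random.default_rng(0)
N_TRIALS, ETA = 234, 0.05   # assumed learning rate; 234 = 26 taps x 3 phases x 3 blocks

def hebbian_weight(p_coincide: float) -> float:
    """Visual->tactile link strength after Hebbian training with decay.

    A seen and a felt tap occur on every trial; they count as coactive with
    probability p_coincide (1.0 under synchronous training, about 1/13 under
    uncorrelated training with 13 equiprobable onset times per trial).
    """
    w = 0.0
    for _ in range(N_TRIALS):
        coactive = float(rng.random() < p_coincide)
        w += ETA * (coactive - w)   # strengthen on coincidence, decay otherwise
    return w

print(f"synchronous:  w = {hebbian_weight(1.0):.2f}")    # converges towards 1.0
print(f"uncorrelated: w = {hebbian_weight(1/13):.2f}")   # hovers near 1/13
```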

Temporal correlations between visual and tactile events are not only necessary for multisensory integration processes at the neuronal level, but also have important consequences for the perception of our own body. They play a crucial role in determining whether visual objects are treated as part of one's own body, that is, whether they are incorporated into the ‘bodily self’. When participants observe stimulation of an external object, such as a rubber hand, while simultaneously feeling stimulation on their own hand (e.g. Botvinick & Cohen, 1998; Tsakiris & Haggard, 2005; Armel & Ramachandran, 2003; Ehrsson, Spence, & Passingham, 2004; see also Pavani, Spence, & Driver, 2000), they commonly experience an illusion that the rubber hand is part of their own body. This ‘rubber hand illusion’ (RHI) is attenuated or absent when the seen stimulation of the rubber hand and the felt stimulation of one's own hand are asynchronous.

Although it is generally difficult to demonstrate functional analogies between human and monkey anatomy, the brain activations found to be associated with the RHI appear to coincide with the ventral premotor areas where bimodal visuotactile neurons have previously been found by single-cell recording in primates (e.g. Graziano et al., 1994). For example, Ehrsson et al. (2004) found greater ventral premotor activation when participants observed a rubber hand being stroked synchronously with their own hand, relative to when stimulation was asynchronous, or the rubber hand was in a posture incongruent with their own hand, and that the level of this activation correlated with subjective reports of the RHI. Moreover, visuotactile integration processes that give rise to the RHI can also have systematic effects on early sensory-perceptual processing. Schaefer and colleagues demonstrated on the basis of neuromagnetic source imaging that synchronous stimulation of one's own, unseen little finger and a visible thumb can change the somatotopic locus of the representation of the little finger in primary somatosensory cortex (Schaefer, Noennig, Heinze, & Rotte, 2006).

Whereas temporal contiguity between visual and tactile events clearly plays a central role in the integration of visual objects within the bodily self, it is not clear whether temporally contiguous visual-tactile stimulation is sufficient for these integration processes to take place. Pre-existing representations of the body may also play a role in processing body-related stimuli. Such representations may be innate, or the product of prior associative or nonassociative learning. Recent studies have suggested that a mismatch between pre-existing representations of the body and current visual objects may prevent their integration, and thereby prevent the emergence of the RHI. For example, Tsakiris & Haggard (2005) found that the RHI was reduced or abolished by substituting the rubber hand with a neutral object such as a wooden stick, or a hand with a posture or anatomical identity incompatible with that of the participants' stimulated hand. In other words, the ability of temporally coincident visuotactile inputs to affect judgements about one's own body was dependent on the compatibility of visual stimuli with pre-existing body representations. The importance of pre-existing body representations in visuotactile integration has also been demonstrated by Blakemore, Bristow, Bird, Frith, & Ward (2005), who found that the observation of bodies being touched activated primary and secondary somatosensory cortex to a greater extent than the observation of touch applied to spatially matched non-body objects. Along similar lines, Taylor-Clarke, Kennett, & Haggard (2002) demonstrated that the N80 component of the somatosensory event-related brain potential, which is assumed to be generated in primary somatosensory cortex (Allison, McCarthy, & Wood, 1992), was enhanced when participants were observing their stimulated arm, relative to a condition where they viewed a neutral object at the same location.

However, in contrast to the results observed by Tsakiris & Haggard (2005), Armel & Ramachandran (2003) found evidence that the RHI can be elicited under a wide range of visual conditions, provided that visual and tactile events coincide. They stroked a rubber hand or the table top synchronously with the participant's own unseen hand, and subsequently measured the skin conductance response when a finger was pulled back on the rubber hand or a plaster was pulled off the table top. The authors found greater evidence of the RHI when participants viewed a rubber hand or a table top being stimulated synchronously with their own unseen hand, compared with control stimulation conditions, and this difference between test and control conditions was of similar magnitude for the two visual stimulus types. Further evidence consistent with the hypothesis that strict compatibility of visual stimuli with pre-existing body representations may not be required in situations like the RHI was obtained by Keysers et al. (2004). Using functional magnetic resonance imaging, those authors found that secondary somatosensory cortex was activated both when participants observed touching of body parts and when they observed touching of spatially matched inanimate objects such as binders and rolls of paper.

In summary, previous research has unequivocally demonstrated that the temporal contiguity between visual and tactile events is a necessary condition for visuotactile integration processes that give rise to the RHI. In contrast, the question of whether contiguity is sufficient is currently under debate. According to one view, contiguity is not itself sufficient, because the RHI context must additionally fit with a pre-existing, innate or learned, representation of the body. In the present experiment, we used event-related brain potentials (ERPs) to further explore the impact of contiguity and pre-existing body representations on somatosensory processing. More specifically, we investigated how previous visuotactile experience involving a rubber stimulus (training phase), together with the visible presence of this rubber stimulus, affects the processing of tactile stimuli presented to the left or right hand (test phase). Visuotactile integration was expected to result in an enhancement of tactile stimulus processing during the test phase. Effects of multisensory integration on early sensory-perceptual stages of tactile processing should be reflected by modulations of early sensory-specific somatosensory ERP components such as the N140, whereas effects on later post-perceptual processes should be reflected by longer-latency ERP modulations. We were particularly interested to know whether (1) the temporal contiguity of visual-tactile stimulation during training, and (2) the compatibility of the visual stimulus with a pre-existing body representation would influence perceptual or post-perceptual somatosensory processing.

We recorded ERPs to mechanical vibratory tactile stimuli delivered to the left or right hand during a test phase in which separate groups of participants viewed a left or right static rubber hand (in the View Hand group) or a similarly lateralized inanimate object (in the View Object group). Prior to each test phase, a series of training trials was presented, in which participants saw taps delivered to this rubber hand or rubber object, while their own hand was stimulated in a synchronous or temporally uncorrelated fashion. To ensure that participants paid attention to the stimulus display, their task was to monitor an LED that was located on the rubber hand or object, in order to detect infrequent visual target events (LED flashes containing a brief gap) that were presented with equal probability during the training and test phases.

This procedure allowed us to dissociate effects of the temporal contiguity of visuotactile input during the training phase on ERP responses to tactile stimuli during the test phase from effects that are due to the compatibility of visual input with pre-existing body representations. First, we investigated how the correlation between visual and tactile inputs during the training phase affected somatosensory processing during the test phase by comparing ERP responses during the test phase as a function of whether participants previously experienced synchronous or uncorrelated visuotactile input. If multisensory integration processes based on temporally contiguous visual and tactile stimuli resulted in modulations of somatosensory processing, as is commonly assumed, this should be reflected in main effects of training type on somatosensory ERP components such as the N140. We expected the amplitude of such components to be larger in response to tactile stimulation of the hand that followed synchronous training, relative to tactile stimulation following uncorrelated training.

Second, we addressed the role of pre-existing body representations on visuotactile integration. We therefore compared ERPs recorded during the test phase for participants in the View Hand group, where the compatibility between visual input and tactile stimulation was high, and for the View Object group, where this compatibility was much lower. In addition, we investigated more subtle effects of bodily compatibility, by directly comparing ERPs on trials where the hand stimulated during the training phase was anatomically compatible with the rubber stimulus visible during training and test phases (e.g., left-hand stimulation in blocks where participants viewed a left rubber hand or object) to ERPs on trials where the anatomically incompatible hand or object (e.g., right-hand stimulation in blocks where participants viewed a left rubber hand or object) was stimulated instead. If visuotactile integration processes were largely independent of the compatibility of visual stimuli with relevant body representations, as suggested by Armel & Ramachandran (2003), any impact of synchronous versus uncorrelated training on tactile processing should be present for both groups of participants, and for both anatomically compatible as well as anatomically incompatible trials. In other words, any main effect of training type on somatosensory ERP components such as the N140 should not interact with rubber stimulus type (hand versus object) or visuotactile compatibility. In contrast, if the compatibility between visual input and pre-existing body representations was important, modulations of somatosensory ERPs indicative of enhanced tactile processing as a result of synchronous training should only be present for the View Hand group, and perhaps only on trials where the stimulated hand is anatomically compatible with the seen rubber hand. Such a pattern of results would be reflected by significant interactions between training type, rubber stimulus type, and compatibility.

Materials and methods

Participants

Thirty-five paid healthy participants took part in this study. All were right-handed, had normal or corrected-to-normal vision, and were naïve with respect to the purpose of the experiment. Two participants were excluded due to insufficient eye movement control (see below), and one other participant was excluded due to excessive alpha activity. Thus, 32 participants (15 male, mean age 27.6 years, range 20-42 years) remained in the sample, and were randomly assigned in equal numbers to the View Hand and View Object groups. The experiment was performed with the approval of the ethics committee of the School of Psychology, Birkbeck College.

Stimuli

Experimental blocks consisted of sequential training and test phases. Throughout all blocks, participants viewed one of the four visual stimulus arrays depicted in Figure 1. The visual stimulus in the View Hand group consisted of either a left or right stuffed yellow rubber glove (extreme point length = 31.0°, extreme point width = 22.6°, sleeve of 15.9° width and 16.7° length), with an LED (target stimulus) wrapped around its index finger on the proximal ‘segment’. The visual stimulus in the View Object group consisted of a symmetrical stuffed yellow rubber block, which was matched to the hand stimuli in extreme point dimensions, surface area and luminance, with LED locations spatially matched to the positions on the hand stimuli (see Figure 1). The LED diameter was 2.30° and the luminance, measured with a SpectraScan PR650 luminance meter (Micron Techniques Ltd.) at a distance of 25cm from the LED, was 84cd/m2. Non-target LED flashes were of 200ms duration with no gap. Target LED flashes were of the same duration, but had a 30ms gap in the middle (i.e., the LED was on for 85ms, off for 30ms, then on again for 85ms). Visual stimulus arrays also included a red wooden stick (6.9° long, with a 4×4mm square cross-section), located perpendicular to the rubber stimulus. During the training phases in the View Hand group, this stick prodded the rubber hand for 50ms on the middle segment of the rubber hand index finger. During the training phases in the View Object group, the stick prodded the object for 50ms in a spatially matched location.

Figure 1.

Experimental set-up. The part of the rubber stimulus that was visible to participants was seen within the white square. This stimulus could be one of four types (A: left hand, B: right hand, C: left object, D: right object). Participants' own hands and arms were out of sight for the duration of the experiment. During training phases, participants received taps to the side of the middle segment of the index finger that was congruent with the viewed rubber stimulus (Tleft or Tright), while the rubber stimulus was tapped in a synchronous or uncorrelated fashion. During test phases, participants received vibrations to the distal segment of the left or right finger (Vleft or Vright), and no taps were delivered to the rubber stimulus.

Tactile stimuli were presented using 12V solenoids, driving a metal rod with a blunt conical tip, making contact with the fingers or the rubber objects whenever a current was passed through the solenoid. During the training phases, single taps of 50ms duration were presented to the outer side of the middle segment of the index finger of the hand corresponding to the viewed rubber stimulus (i.e., left hand stimulation with left rubber hand or left rubber object; right hand stimulation with right rubber hand or object). During the test phase, single vibratory tactile stimuli were delivered to the distal pad of the left or right index fingers via vibrators located underneath these fingerpads. These stimuli consisted of six successive 2ms stimulations with 18ms interstimulus intervals, resulting in a vibration with a frequency of 50Hz and a duration of 102ms.
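The timing arithmetic of the test-phase vibration can be made explicit with a short bookkeeping sketch (ours, not the stimulus-control code actually used): six 2ms pulses separated by 18ms gaps give a 20ms pulse period, i.e. a 50Hz vibration lasting 5 × 20 + 2 = 102ms.

```python
# Bookkeeping sketch of one test-phase vibration (not the stimulus-control code).
PULSE_MS, GAP_MS, N_PULSES = 2, 18, 6

onsets = [i * (PULSE_MS + GAP_MS) for i in range(N_PULSES)]  # 0, 20, ..., 100
duration_ms = onsets[-1] + PULSE_MS                          # 100 + 2 = 102
frequency_hz = 1000 / (PULSE_MS + GAP_MS)                    # 1000 / 20 = 50.0

print(onsets, duration_ms, frequency_hz)  # [0, 20, 40, 60, 80, 100] 102 50.0
```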

Procedure

Participants sat in a dimly lit experimental chamber, wearing a head-mounted microphone used to record vocal responses. Participants' left and right arms were positioned on a tabletop 17.5cm to the right and left of the body midline (see Figure 1). The rubber stimulus was positioned such that the LED was aligned with the body midline. With hands held palm-side down, the index fingers were resting in plastic casing, on top of the tactile stimulators used during the test phases. To prevent movement, fingers were held in place by Velcro on the distal and middle segments of the index finger. The index finger was separated from the other fingers on the hand with a strip of foam. Once hands were positioned, a black frame (Width = 85cm, Height = 11.5cm, Length = 40cm) was placed on the tabletop so that participants could not see their hands and the tactile stimulators. A 57.0° × 22.8° hole was cut in this frame so that participants could see the stimulus display (see Figure 1). Vision of the arms was prevented by a black cloth attached to the frame and tied around the participant's neck. White noise (78dB SPL) was continuously presented via a loudspeaker that was aligned with the body midline and located directly behind the black frame (see Figure 1) to mask any sounds made by the tactile stimulators.

All blocks consisted of sequential training and test phases. Transitions between training and test phases were not explicitly indicated to participants. During training phases, seen taps on the rubber stimulus and felt taps on the spatially corresponding hand were delivered either synchronously or in an uncorrelated fashion, and this was varied across blocks. In synchronous blocks, tactile and visual stimuli were presented simultaneously during the training phases. In uncorrelated blocks, the temporal relationship between tactile and visual stimuli was random. Each training trial lasted for 1300ms. In synchronous blocks, tactile and visual stimuli were always presented simultaneously and with equal probability at one of 13 possible time points between 0ms and 1200ms after training trial onset that were separated by 100ms (i.e., 0ms, 100ms, 200ms, … 1200ms). In uncorrelated blocks, tactile and visual stimuli were also presented at one of these time points, but now the point at which the tactile stimulus was presented was random with respect to the point at which the visual stimulus was presented.
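A hypothetical scheduler for these training trials might look as follows; this is a sketch of the scheduling logic as described above, not the software used in the experiment.

```python
import random

ONSETS_MS = range(0, 1300, 100)   # the 13 possible onset times: 0, 100, ..., 1200

def training_trial(synchronous: bool) -> tuple[int, int]:
    """Return (visual_onset_ms, tactile_onset_ms) for one 1300ms training trial."""
    visual = random.choice(ONSETS_MS)
    # Synchronous blocks: the felt tap coincides with the seen tap;
    # uncorrelated blocks: the tactile onset is drawn independently.
    tactile = visual if synchronous else random.choice(ONSETS_MS)
    return visual, tactile

print(training_trial(synchronous=True))    # e.g. (600, 600)
print(training_trial(synchronous=False))   # e.g. (300, 1100)
```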

During test phases, single vibratory stimuli were presented in a random order and with equal probability to the left or right hand, with an inter-trial interval varying between 1000ms and 1400ms (mean 1200ms). Test phases were identical in synchronous and uncorrelated blocks, and no stimulation of the seen rubber stimulus occurred during these phases. The different combinations of training and test phases are indicated in Table 1.

Table 1.

A table of conditions, according to the combination of within-subject factors of viewed stimulus type, stimulated hand and stimulation type in training phases and stimulated hand in test phases. In all conditions, participants in the View Hand group viewed a hand stimulus and participants in the View Object group viewed an object stimulus.

Condition             Viewed stimulus type (training and test) /   Stimulation type   Stimulated hand
                      stimulated hand (training)                   (training)         (test)

Right-Synchronous     Right                                        Synchronous        Right and left
Left-Synchronous      Left                                         Synchronous        Right and left
Right-Uncorrelated    Right                                        Uncorrelated       Right and left
Left-Uncorrelated     Left                                         Uncorrelated       Right and left

The participants' task was to monitor the LED located on the rubber stimulus throughout training and test blocks, and to respond (by saying ‘yes’) whenever they detected an LED flash containing a gap, but to refrain from responding to LED flashes without a gap. These LED trials were equally frequent during training and test phases, and consisted of a target or non-target LED flash, followed by a 1000 – 1400ms (mean 1200ms) interval. These trials were positioned randomly within the blocks and no additional visual or tactile stimulus was presented during LED trials.

Each experimental block consisted of three cycles of a training phase followed by a test phase. Each training phase included 26 visual/tactile trials, two LED target trials and two LED non-target trials, all presented in random order. Each test phase included 13 left-hand vibration stimuli, 13 right-hand vibration stimuli, two LED target trials and two LED non-target trials, again presented in random order. Each training phase and each test phase lasted for approximately 40 seconds.

The experiment consisted of 12 experimental blocks. Three consecutive blocks were run for each of the four combinations of rubber stimulus type (left stimulus vs. right stimulus) and training type (synchronous vs. uncorrelated stimulation). The six synchronous and the six uncorrelated blocks were each completed in immediate succession, with the order of these two sets counterbalanced across participants. Rubber stimulus type (left vs. right) was ordered in an ABBA fashion, with order again counterbalanced across participants. Prior to the start of each new set of three blocks, participants completed one practice block of 60 trials that consisted of one training phase followed by one test phase. Excluding breaks, the total duration of the study was approximately 50 minutes.
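As a back-of-envelope check of this design (our own arithmetic, under the stated parameters), the trial counts and session duration implied by the block structure can be computed as follows.

```python
# Back-of-envelope check of the session structure described above.
BLOCKS = 12
training_trials = BLOCKS * 3 * 26            # 3 training phases of 26 taps -> 936
test_vibrations_per_hand = BLOCKS * 3 * 13   # 3 test phases of 13 per hand -> 468
session_min = BLOCKS * 6 * 40 / 60           # six ~40s phases per block -> 48.0

print(training_trials, test_vibrations_per_hand, session_min)  # 936 468 48.0
```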

EEG recording and data analyses

EEG was recorded with Ag-AgCl electrodes and linked-earlobe reference from Fpz, F7, F3, Fz, F4, F8, FC5, FC6, T7, C3, Cz, C4, T8, CP5, CP6, P7, P3, Pz, P4, P8, and Oz (according to the 10-20 system), and from OL and OR (located halfway between O1 and P7, and O2 and P8, respectively). Horizontal EOG (HEOG) was recorded bipolarly from the outer canthi of both eyes. Electrode impedance was kept below 5kΩ, and the impedances of the earlobe electrodes were kept as equal as possible. Amplifier bandpass was 0.1 to 40Hz. EEG and EOG were sampled with a digitization rate of 200Hz and stored on disk.

EEG and EOG for test phase trials where a vibratory tactile stimulus was presented to the left or right hand were epoched off-line into 600ms periods, starting 100ms prior to tactile stimulus onset and ending 500ms after onset. Trials immediately following those where a target LED was presented were excluded to avoid contamination by vocal responses. Trials with vocal responses to tactile stimuli were also excluded from EEG analysis, as were non-target trials with eyeblinks (Fpz exceeding ±60μV), small horizontal eye movements (HEOG exceeding ±30μV), or other artifacts (a voltage exceeding ±80μV at any electrode) in the 500ms interval following tactile stimulus onset. Averaged HEOG waveforms obtained for each participant and task condition in this interval in response to left versus right-hand stimulation were scored for systematic deviations of eye position, indicating residual tendencies to move the eyes towards the stimulated hand. Two participants were excluded because residual HEOG deviations exceeded ±3μV.
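A minimal sketch of these rejection criteria is given below; the array layout, channel naming, and function boundaries are our assumptions for illustration, not the analysis code used in the study.

```python
import numpy as np

def epoch_is_clean(epoch_uV: np.ndarray, ch_names: list[str]) -> bool:
    """Apply the rejection thresholds described above to a single epoch.

    epoch_uV: channels x samples array in microvolts, covering the 500ms
    interval after tactile stimulus onset (layout assumed for illustration).
    """
    fpz = epoch_uV[ch_names.index("Fpz")]
    heog = epoch_uV[ch_names.index("HEOG")]
    if np.abs(fpz).max() > 60:        # eyeblinks
        return False
    if np.abs(heog).max() > 30:       # small horizontal eye movements
        return False
    if np.abs(epoch_uV).max() > 80:   # any other artifact, at any electrode
        return False
    return True
```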

The EEG obtained in the 500ms interval following the onset of a vibratory tactile stimulus in the test phase for each participant in the View Hand and View Object groups was averaged relative to a 100ms pre-stimulus baseline for all combinations of stimulated hand (left vs. right), type of training (synchronous vs. uncorrelated), and visuotactile compatibility (compatible: left-hand stimulation when viewing a left rubber stimulus or right-hand stimulation when viewing a right rubber stimulus; incompatible: right-hand stimulation in the presence of a left rubber stimulus or left-hand stimulation in the presence of a right rubber stimulus). ERP mean amplitudes were computed within measurement windows centered on the latency of early somatosensory P50 (30-65 ms), N80 (65-90 ms), P100 (90-115 ms) and N140 (120-160 ms) components, as well as within a longer-latency time window (200-450 ms post-stimulus). These mean amplitudes were analysed separately for lateral recording sites where the amplitudes of early somatosensory components are maximal (F3/4, FC5/6, C3/4, and CP5/6) and for midline electrodes (Fz, Cz, and Pz), for the within-subject factors type of training, visuotactile compatibility, stimulated hand, and laterality (ipsilateral vs. contralateral; for analyses of lateral electrodes only), and the between-subject factor group (View Hand vs. View Object).
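The computation of baseline-corrected mean amplitudes within the stated measurement windows could be sketched as follows, again under an assumed data layout (a single channel's epoch from -100ms to +500ms, sampled at 200Hz).

```python
import numpy as np

FS = 200   # Hz; so the 100ms pre-stimulus baseline spans 20 samples
WINDOWS_MS = {"P50": (30, 65), "N80": (65, 90), "P100": (90, 115),
              "N140": (120, 160), "late": (200, 450)}

def mean_amplitudes(epoch_uV: np.ndarray) -> dict[str, float]:
    """Baseline-corrected mean amplitude per window for one channel's epoch
    spanning -100ms to +500ms (120 samples at 200Hz)."""
    n_base = 100 * FS // 1000                          # 20 baseline samples
    corrected = epoch_uV - epoch_uV[:n_base].mean()    # subtract baseline mean
    def window_mean(t0: int, t1: int) -> float:
        i0 = n_base + t0 * FS // 1000
        i1 = n_base + t1 * FS // 1000
        return float(corrected[i0:i1].mean())
    return {name: window_mean(*win) for name, win in WINDOWS_MS.items()}
```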

Results

Behavioural performance

Performance data were analysed with repeated measures analyses of variance (ANOVAs) for the factors training type (synchronous vs. uncorrelated), phase of experiment (training vs. test), and group (View Hand vs. View Object). There were no significant main effects or interactions in either the vocal response time (RT) or error analyses. Mean vocal RTs to visual target events did not differ significantly between the View Hand and View Object groups (592ms, SEM: 20ms vs. 582ms, SEM: 15ms). The percentage of missed targets (Hand: 4.5%, SEM: 0.71% vs. Object: 4.5%, SEM: 0.63%) was identical in both groups. There were also no significant differences in RTs or error rates on target trials between training and test phases, and between synchronous and uncorrelated blocks. False alarms occurred on only 1.3% of all trials where non-target LEDs were presented, and the false alarm rate did not differ across groups, block types, or between training and test phases.

ERPs to vibratory tactile stimuli presented during the test phase

Figure 2 shows ERPs at midline electrode Cz and at electrodes C3/C4 ipsilateral and contralateral to the stimulated hand that were elicited in response to vibratory tactile stimuli during the test phases, displayed separately for blocks where training consisted of either synchronous (solid lines) or uncorrelated (dashed lines) stimulation of the rubber stimulus and the participant's hand, and for the View Hand and View Object groups. ERPs are collapsed across visuotactile compatible and incompatible trials. The amplitude of the somatosensory N140 component was enhanced when vibratory tactile stimuli were presented following synchronous training, relative to blocks with uncorrelated training. Moreover, this N140 enhancement was independent of whether participants viewed a rubber hand or a rubber object (see Figure 2). These observations were substantiated by the presence of a main effect of type of training on ERP mean amplitudes obtained in the N140 time window (120-160ms after stimulus onset) at lateral electrodes (F(1,30) = 6.8, p < 0.02) as well as at midline electrodes (F(1,30) = 7.7, p < 0.01), reflecting an enhanced negativity when vibratory tactile stimuli were presented after synchronous training blocks relative to tactile stimuli presented after uncorrelated training blocks. Importantly, this main effect of training was not affected by whether the seen visual stimulus was a rubber hand or a rubber object, as indicated by the absence of any group × type of training interaction at either lateral or midline electrodes (both F < 1). Similarly, there were no significant interactions between visuotactile compatibility and type of training at lateral or midline electrodes (both F < 1.5), demonstrating that the effect of synchronous training on N140 amplitudes was elicited regardless of whether the stimulated hand was compatible or incompatible with the seen rubber stimulus. No reliable main effects of type of training were found in any of the three earlier time windows (P50, N80, or P100), or for the longer-latency measurement interval (200-450 ms post-stimulus).

Figure 2.

Grand-averaged ERPs elicited in response to tactile stimuli delivered in test phases in the 500 ms interval after tactile stimulus onset at midline electrode Cz, and at central electrodes C3 and C4 ipsilateral (C34I) and contralateral (C34C) to the stimulated hand. ERPs are shown separately for the View Hand group (top panel) and the View Object group (bottom panel), and for blocks with synchronous visuotactile training (solid lines), and blocks with uncorrelated training (dashed lines).

Figure 3 shows ERPs at midline electrode Cz and at electrodes C3/C4 ipsilateral and contralateral to the stimulated hand that were elicited in response to vibratory tactile stimuli delivered in the test phases that were either compatible (e.g., left-hand stimulation while viewing a left rubber stimulus) or incompatible (e.g., left-hand stimulation while viewing a right rubber stimulus). Waveforms are shown separately for blocks with synchronous and uncorrelated training, and for the View Hand and View Object groups. No significant main effects or interactions involving visuotactile compatibility or group were observed for the early P50, N80, P100, and N140 time intervals. Systematic differences between these conditions only emerged during the 200-450 ms interval. Here, ERPs on compatible trials (i.e., trials where the tactile stimulus was applied to the hand matching the seen rubber stimulus) were generally more negative than ERPs on incompatible trials. As can be seen from Figure 3, this pattern of results was present for blocks with synchronous as well as uncorrelated training for the View Object group. In contrast, for the View Hand group, an enhanced negativity for compatible trials was only present for blocks with synchronous training, but not for blocks where training was uncorrelated.

Figure 3.

Grand-averaged ERPs elicited in response to tactile stimuli delivered in test phases in the 500 ms interval after tactile stimulus onset at midline electrode Cz, and at central electrodes C3 and C4 ipsilateral (C34I) and contralateral (C34C) to the stimulated hand. ERPs are shown separately for the View Hand and View Object groups, and for blocks with synchronous and uncorrelated training. ERPs on trials where the stimulated hand was compatible with the seen rubber object (solid lines) are compared to ERPs on trials where tactile stimuli were presented to the incompatible hand (dashed lines).

These observations were supported by statistical analysis of ERP mean amplitudes obtained in the 200-450 ms post-stimulus interval. A main effect of visuotactile compatibility was present at lateral (F(1,30) = 13.3, p < 0.002) as well as at midline electrodes (F(1,30) = 11.0, p < 0.005), reflecting the overall enhanced negativity for compatible trials visible in Figure 3. In addition, and importantly, a significant group × visuotactile compatibility × type of training interaction was obtained at lateral (F(1,30) = 6.3, p < 0.02) as well as at midline electrodes (F(1,30) = 5.5, p < 0.03). The nature of this interaction was further explored by conducting separate analyses for the View Hand and View Object groups. In the View Object group, main effects of visuotactile compatibility at lateral and midline electrodes (F(1,15) = 16.3, p < 0.002; F(1,15) = 17.9, p = 0.001, respectively) were present, without an indication of any two-way interactions between visuotactile compatibility and type of training (F(1,15) = 1.4, p = 0.3; F < 1, for lateral and midline sites, respectively). In marked contrast, no overall significant main effect of visuotactile compatibility was found for the View Hand group at lateral or midline electrodes (F(1,15) = 1.9, p = 0.2; F(1,15) = 1.2, p = 0.3, respectively). However, significant two-way interactions between visuotactile compatibility and type of training were present at lateral (F(1,15) = 6.2, p < 0.03) as well as at midline electrodes (F(1,15) = 9.7, p < 0.01). This interaction was due to the fact that there was a significant visuotactile compatibility effect for blocks with synchronous training in the View Hand group (F(1,15) = 8.4, p < 0.02; F(1,15) = 11.1, p = 0.005; for lateral and midline electrodes, respectively), whereas there was no such effect in blocks with uncorrelated training (both F < 1).

Discussion

This study explored whether the temporal contiguity of visual and tactile stimulation, and the compatibility between visual objects and pre-existing body representations, contribute to visuotactile integration processes. We employed a procedure that was similar to previous investigations of the RHI (e.g. Armel & Ramachandran, 2003; Tsakiris & Haggard, 2005). In the test phase, somatosensory ERPs were measured in response to tactile stimulation of the left or right hand as an indicator of how the efficiency of tactile processing was affected by visuotactile experience during a prior training phase. In the training phase, a rubber hand or a rubber object was tapped either synchronously or randomly with respect to the participant's own hand.

The most straightforward differential effect observed in the present experiment was induced by the temporal contingency of visuotactile stimulation delivered during the training phase. A reliably enhanced somatosensory N140 component was elicited after training with synchronous visual and tactile stimulation, relative to blocks where this stimulation was uncorrelated during training. Importantly, this effect was present, not only in the View Hand group, but also in the View Object group, suggesting that it was exclusively determined by the temporal contiguity of visual and tactile stimulation during the training phase, and not modulated by whether the visual stimulus context was a hand or an object. It therefore involved a ‘bottom-up’ effect of previous multisensory stimulation rather than a ‘top-down’ effect of compatibility with a pre-existing body representation (Tsakiris & Haggard, 2005).

The N140 component is assumed to be generated in secondary somatosensory cortex (S2; Allison et al., 1992). The observation that many neurons in secondary somatosensory cortex have bilateral receptive fields (Hari et al., 1984; see also Iwamura, Iriki, & Tanaka, 1994) may explain why the N140 enhancement found in the present study following synchronous training was triggered both for stimuli delivered to the anatomically compatible hand (e.g., the left hand when participants viewed a left rubber stimulus during training and test phases) and for stimulation of the anatomically incompatible hand. The observation that synchronous visuotactile training resulted in modulations of processing within sensory-specific somatosensory areas is also consistent with previous findings that synchronous visual and tactile stimulation can result in changes in neural processing in somatosensory cortex (Schaefer et al., 2006), although in that study, such changes were predominantly localised in primary somatosensory cortex. The finding that temporally coincident visuotactile stimulation can systematically affect tactile processing in somatosensory areas suggests that these areas may be modulated through back-projections from multimodal areas (e.g. Taylor-Clarke et al., 2002), or may themselves not be as unimodal as once thought.

Enhanced N140 amplitudes resulting from synchronous training were observed regardless of whether participants viewed a rubber hand or a non-hand rubber object. Although such an N140 enhancement suggests a modulation of tactile processing in somatosensory cortical areas, this additional finding indicates that this modulation is triggered independently of pre-existing higher-order body representations. If this effect had been contingent on the compatibility between a visible rubber stimulus and the representation of the concurrently stimulated hand, it should have been much more pronounced in the View Hand group (and within this group perhaps larger for trials where the stimulated hand was anatomically compatible with the rubber hand) than in the View Object group, which was clearly not the case (see Figure 2). This insensitivity of the N140 modulation to the difference between rubber hands and rubber objects also corresponds well with the observation of Keysers et al. (2004) that S2 was activated not only when observers watched another person being touched, but also when inanimate objects, such as rolls of paper towel, were touched instead. Together with the current findings, these results suggest that S2 activations that are triggered through visuotactile integration may not be mediated at all by pre-existing body representations, but are exclusively dependent on recent experience of temporal contiguity. In other words, associative learning based on brief correlated experience of seeing a rubber object being touched while simultaneously feeling touch during the training phase of the present experiment may have been sufficient to result in visuotactile integration.

As a possible alternative account of the N140 enhancements observed as a function of synchronous training, one could assume that this type of training is generally more efficient in focusing attention than the uncorrelated presentation of visual and tactile events, and that a stronger attentional focus on the tactile stimulus that is present during training persists during the subsequent test phase. However, given that such stimulus-driven (exogenous) modulations of attentional processing are typically short-lived, it seems unlikely that they would remain present for an extended period beyond the training phase. Even if this was the case, this attentional account would not necessarily be inconsistent with the presence of multisensory integration processes during synchronous training, as it is possible that more focussed spatial attention may itself be the result of such integration processes.

The presence of clear-cut N140 amplitude modulations as a result of synchronous versus uncorrelated training in the present study raises the obvious question of whether this effect should be interpreted as a direct electrophysiological marker of the RHI. We did not give participants a questionnaire to assess their experience of the RHI, because it would have been necessary to administer the questionnaire between blocks of trials (training type was varied within-subjects), and the questions might have suggested particular interpretations of the task situation that would influence subsequent electrophysiological responses. Although we cannot therefore provide direct evidence that the N140 amplitude enhancements produced by synchronous visuotactile training are correlated with the RHI, we do believe that this is likely to be the case. The synchronous training regime used in the present experiment resembles those which have been shown in previous studies to induce the RHI with hand stimuli (Tsakiris & Haggard, 2005), and with non-hand stimuli (Armel & Ramachandran, 2003). In addition, the effect observed on the N140 component in the present study is likely to reflect modulatory top-down influences from multimodal areas (Taylor-Clarke et al., 2002), and activation of cortical areas thought to contain bimodal neurons has previously been found to be correlated with reports of the RHI (Ehrsson et al., 2004).

If the N140 amplitude modulations produced by synchronous training are indeed correlated with the RHI, as suggested here, this would have the important consequence that the temporal synchrony of visuotactile stimulation is sufficient to produce the RHI, whereas the visual stimulus context (hand versus object) does not play a major role. Recall that in the present experiment, synchronous training resulted in enhanced N140 amplitudes for the View Hand as well as for the View Object group. While this conclusion is consistent with the observation by Armel & Ramachandran (2003) that the RHI is triggered in very different visual contexts as long as visual and tactile stimulation is synchronous, it appears to be at odds with several other studies that generally found a reduction of the RHI when non-hand objects were stimulated, or when the seen hand posture was incompatible with the participants' stimulated hand (e.g., Holmes, Snijders, & Spence, 2006; Lloyd, Morrison, & Roberts, 2006; Tsakiris & Haggard, 2005; Austen, Soto-Faraco, Enns, & Kingstone, 2004). These results suggest that in addition to synchrony of visuotactile stimulation, a match between visible object features and pre-existing body representations is another important factor responsible for the RHI.

However, a closer examination of these studies reveals that their results are not necessarily inconsistent with our current findings and conclusions. Most of these studies (Holmes et al., 2006; Lloyd et al., 2006; Austen et al., 2004; Pavani et al., 2000) did not include any visuotactile stimulation, and thus did not investigate links between the RHI and visuotactile learning. In addition, they measured behavioural effects that may only be indirectly linked with the subjective experience of the RHI. For example, Austen et al. (2004) measured interference effects of visual stimuli presented on a rubber hand on responses to tactile stimuli on participants' own hand, and found larger effects when the rubber hand posture was aligned with the real stimulated hand (see also Pavani et al., 2000). Holmes et al. (2006) reported that reaching movements were more affected by the visible presence of real or rubber hands than by non-hand objects, but found that this visually induced reaching bias was not strongly correlated with the RHI, as assessed by a questionnaire. Tsakiris & Haggard (2005) is the only study to date where the RHI was found to be larger following synchronous versus asynchronous stimulation for a rubber hand, whereas no such difference was observed for a non-hand object. However, it should be noted that the duration of visuotactile training for each stimulation condition in the Tsakiris & Haggard (2005) study was considerably shorter (four minutes) than the total amount of training that participants received in the course of our ERP study. It is possible that, had Tsakiris & Haggard (2005) provided a more extended period of synchronous training in their View Object condition, this would have given rise to an illusion of object ownership, and thus to the RHI. In contrast to the more objective, but less direct, electrophysiological correlates of the RHI measured in the present study, the measure used by Tsakiris & Haggard (2005) to quantify the RHI (proprioceptive drift) might in principle also be more susceptible to subtle demand effects that could have contributed to the observed differences between View Hand and View Object conditions.

In summary, our tentative proposal is that the N140 amplitude modulations observed in the present study as a consequence of synchronous training are a candidate correlate of the RHI. More research is obviously necessary to establish whether our procedure gives rise to the RHI in the View Hand condition, whether it also yields a similar illusion in the View Object condition, and, more generally, whether and under which conditions the RHI and modulations of somatosensory N140 amplitudes are directly linked.

In contrast with our early effect on N140 amplitudes, which was unaffected by the difference between rubber hands and rubber objects, and thus provides some preliminary evidence that visual stimulus context might not be a critical factor for the RHI (see above), the later ERP effects observed in this experiment do suggest some role for pre-existing body representations in visuotactile integration. These effects appeared considerably later than the N140 at latencies beyond 200ms post-stimulus. As can be seen in Figure 3, an enhanced sustained negativity was generally found for trials where a tactile stimulus was presented to the hand that was anatomically compatible with the visible rubber hand/object (i.e., the left hand in blocks where participants viewed a left rubber hand or rubber object during training and test, and the right hand when a right rubber hand or object was in view), relative to trials where the incompatible hand was stimulated instead. In the View Object group, this effect of visuotactile compatibility was present following blocks of uncorrelated visuotactile training as well as after blocks of synchronous visuotactile training. In contrast, for the View Hand group, visuotactile compatibility effects were present after synchronous training, but not after training that consisted of uncorrelated presentation of visual and tactile events. The direction of this three-way interaction between group, type of training, and visuotactile compatibility suggests that we cannot simply associate an enhanced late negativity for compatible trials with the involvement of pre-existing body representations, as this effect depended on temporal contiguity when viewing a rubber hand, but not when viewing a non-hand object.

We suggest that the late negativity results may reflect processes of spatial attention that are modulated by specific mental representations of the body. A sustained negativity in somatosensory ERPs at latencies beyond 200ms post-stimulus is usually interpreted as evidence for the spatially selective attentional processing of tactile stimuli at post-perceptual processing levels (e.g. Michie, 1984; Eimer & Forster, 2003). Thus, the presence of such a sustained negativity on compatible as compared to incompatible trials in the present study may indicate that vision of a left or right rubber stimulus resulted in a small but systematic bias of tactile attention towards the hand compatible with this stimulus. Such a bias is likely to result from the association of the lateralized features of the rubber stimulus with the side of tactile stimulation during training (e.g. viewing a left rubber stimulus associated with tactile stimulation of the participant's left hand). If this hypothesis is correct, the fact that no such enhanced negativity for compatible trials was observed for the View Hand group following uncorrelated training (see Figure 3) implies that no attentional bias towards the compatible hand was elicited for this condition. This might be explained with reference to the impact of pre-existing body representations. These include strong associations between visual experience of a hand being stimulated and simultaneously feeling one's own hand being stimulated, which result in a strong expectation of temporally correlated visuotactile stimulation. It is well established that the rate of both excitatory and inhibitory conditioning increases as the presence or absence of a second event becomes more surprising or unpredicted (Kamin, 1969). Therefore, violation of the expectation that visual and tactile experience of hand stimulation will co-occur, by uncorrelated training in the View Hand group, may result in rapid inhibitory learning, deactivating any excitatory links between current visual and tactile stimulation (Schultz & Dickinson, 2000), and thereby preventing any subsequent attentional bias towards the compatible side. In contrast, due to the absence of pre-existing visuotactile representations for objects that are not body parts, no such inhibitory learning will occur when viewing a non-hand rubber object during uncorrelated training.
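This inhibitory-learning account can be illustrated with a simple Rescorla-Wagner-style prediction-error model (our sketch, with illustrative parameters, not a model fitted to the data): a strong pre-existing expectation generates large negative prediction errors under uncorrelated stimulation and is therefore rapidly unlearned, whereas a near-zero prior for non-hand objects leaves little to inhibit.

```python
import random

def associative_strength(prior: float, p_coincide: float,
                         n_trials: int = 234, alpha: float = 0.1,
                         seed: int = 0) -> float:
    """Rescorla-Wagner-style update of a visual->tactile association.

    prior: pre-existing association (high for a seen hand, near zero for a
    non-hand object); p_coincide: proportion of training trials on which
    seen and felt touch coincide. All parameter values are illustrative.
    """
    rng = random.Random(seed)
    v = prior
    for _ in range(n_trials):
        outcome = 1.0 if rng.random() < p_coincide else 0.0
        v += alpha * (outcome - v)   # larger errors drive faster (un)learning
    return v

# Uncorrelated training (~1/13 coincidences): the strong hand-based prior is
# rapidly driven down, consistent with the absent compatibility effect ...
print(associative_strength(prior=1.0, p_coincide=1/13))
# ... whereas a non-hand object carries no prior association to inhibit.
print(associative_strength(prior=0.0, p_coincide=1/13))
```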

According to this account, pre-existing body representations may indeed have interacted with the temporal parameters of visuotactile stimulation by promoting the establishment of inhibitory associative links between visual and tactile events during uncorrelated training in the View Hand group, thus eliminating the normal attentional bias towards visuotactile compatible hands during the test phase. On this view, the interaction between pre-existing body representations and multisensory stimulation would be based on excorporation (see Holmes & Spence, 2006, for discussion of this concept) rather than the more familiar concept of incorporation. That is, when an object that could be part of the body on the basis of visual appearance is shown by multisensory experience not to be part of the body, then normal cross-modal links between visual and tactile spatial attention, which produce the late ERP compatibility effects, are inhibited and reduced. It should be acknowledged that at present, the three-way interaction between group, type of training, and visuotactile compatibility cannot easily be accounted for in the context of the existing literature on the RHI. This interaction needs to be confirmed in further studies, and our suggested interpretation may only be one of several possible alternative interpretations.

In summary, the present study has found clear-cut electrophysiological evidence that the temporal contiguity of seen and felt tactile stimulation modulates subsequent somatosensory processing at early sensory-specific stages. While these effects appear to be independent of pre-existing body representations, such representations may affect later stages in the processing of tactile events. We therefore suggest that two types of visuotactile integration process contribute to the ‘bodily self’. First, associative learning based on the temporal contiguity between visual and tactile stimulation enhances perceptual processing of subsequent tactile events. While this enhancement appears to occur independently of pre-existing representations of specific body parts, expectations based on these body representations may affect the rate of associative learning that modulates post-perceptual, possibly attentional, processing of tactile stimuli.

Acknowledgements

This research was supported by a PhD studentship awarded to CP by the Biotechnology and Biological Sciences Research Council (BBSRC). ME holds a Royal Society-Wellcome Research Merit Award. We are grateful to José van Velzen for technical assistance, and to an anonymous referee for numerous helpful suggestions.

Reference List

1. Allison T, McCarthy G, Wood CC. The relationship between human long-latency somatosensory evoked potentials recorded from the cortical surface and from the scalp. Electroencephalography and Clinical Neurophysiology. 1992;84:301–314. doi: 10.1016/0168-5597(92)90082-m.
2. Armel KC, Ramachandran VS. Projecting sensations to external objects: Evidence from skin conductance response. Proceedings of the Royal Society of London Series B. 2003;270:1499–1506. doi: 10.1098/rspb.2003.2364.
3. Austen EL, Soto-Faraco S, Enns JT, Kingstone A. Mislocalisation of touch to a fake hand. Cognitive, Affective, & Behavioral Neuroscience. 2004;4:170–181. doi: 10.3758/cabn.4.2.170.
4. Blakemore S-J, Bristow D, Bird G, Frith C, Ward J. Somatosensory activations during the observation of touch and a case of vision-touch synaesthesia. Brain. 2005;128:1571–1583. doi: 10.1093/brain/awh500.
5. Botvinick M, Cohen J. Rubber hands ‘feel’ touch that eyes see. Nature. 1998;391:756. doi: 10.1038/35784.
6. Ehrsson HH, Spence C, Passingham RE. That's my hand! Activity in premotor cortex reflects feeling of ownership of a limb. Science. 2004;305:875–877. doi: 10.1126/science.1097011.
7. Eimer M, Forster B. The spatial distribution of attentional selectivity in touch: Evidence from somatosensory ERP components. Clinical Neurophysiology. 2003;114:1298–1306. doi: 10.1016/s1388-2457(03)00107-x.
8. Fiorio M, Haggard P. Viewing the body prepares the brain for touch: Effects of TMS over somatosensory cortex. European Journal of Neuroscience. 2005;22:773–777. doi: 10.1111/j.1460-9568.2005.04267.x.
9. Gillmeister H, Eimer M. Tactile enhancement of auditory detection and perceived loudness. Brain Research. In press. doi: 10.1016/j.brainres.2007.03.041.
10. Graziano MSA, Yap GS, Gross CG. Coding of visual space by premotor neurons. Science. 1994;266:1054–1057. doi: 10.1126/science.7973661.
11. Graziano MSA, Cooke DF, Taylor CS. Coding the location of the arm by sight. Science. 2000;290:1782–1786. doi: 10.1126/science.290.5497.1782.
12. Hari R, Reinikainen K, Kaukoranta E, Hamalainen M, Ilmoniemi R, Penttinen A, et al. Somatosensory evoked cerebral magnetic fields from SI and SII in man. Electroencephalography and Clinical Neurophysiology. 1984;57:254–263. doi: 10.1016/0013-4694(84)90126-3.
13. Heyes CM. Causes and consequences of imitation. Trends in Cognitive Sciences. 2001;5:253–261. doi: 10.1016/s1364-6613(00)01661-2.
14. Holmes NP, Snijders HJ, Spence C. Reaching with alien limbs: Visual exposure to prosthetic hands in a mirror biases proprioception without accompanying illusions of ownership. Perception and Psychophysics. 2006;68:685–701. doi: 10.3758/bf03208768.
15. Holmes NP, Spence C. Beyond the body schema: Visual, prosthetic, and technological contributions to bodily perception and awareness. In: Knoblich G, Thornton I, Grosjean M, Shiffrar M, editors. Human body perception from the inside out. Oxford: Oxford University Press; 2006. pp. 15–64.
16. Iwamura Y, Iriki A, Tanaka M. Bilateral hand representation in the postcentral somatosensory cortex. Nature. 1994;369:554–556. doi: 10.1038/369554a0.
17. Kamin LJ. Selective association and conditioning. In: Mackintosh NJ, Honig WK, editors. Fundamental issues in associative learning. Halifax: Dalhousie University Press; 1969. pp. 42–64.
18. Kennett S, Eimer M, Spence C, Driver J. Tactile-visual links in exogenous spatial attention under different postures: Convergent evidence from psychophysics and ERPs. Journal of Cognitive Neuroscience. 2001;13:462–478. doi: 10.1162/08989290152001899.
19. Keysers C, Wicker B, Gazzola V, Anton J-L, Fogassi L, Gallese V. A touching sight: SII/PV activation during the observation and experience of touch. Neuron. 2004;42:335–346. doi: 10.1016/s0896-6273(04)00156-4.
20. Lloyd D, Morrison I, Roberts N. Role for human posterior parietal cortex in visual processing of aversive objects in peripersonal space. Journal of Neurophysiology. 2006;95:205–214. doi: 10.1152/jn.00614.2005.
21. Michie PT. Selective attention effects on somatosensory event-related potentials. Annals of the New York Academy of Sciences. 1984;425:250–255. doi: 10.1111/j.1749-6632.1984.tb23542.x.
22. Obayashi S, Tanaka M, Iriki A. Subjective image of invisible hand coded by monkey intraparietal neurons. Neuroreport. 2000;11:3499–3505. doi: 10.1097/00001756-200011090-00020.
23. Pavani F, Spence C, Driver J. Visual capture of touch: Out-of-the-body experiences with rubber gloves. Psychological Science. 2000;11:353–359. doi: 10.1111/1467-9280.00270.
24. Ro T, Wallace R, Hagedorn J, Farnè A, Pienkos E. Visual enhancing of tactile perception in the posterior parietal cortex. Journal of Cognitive Neuroscience. 2004;16:24–30. doi: 10.1162/089892904322755520.
25. Schaefer M, Noennig N, Heinze H-J, Rotte M. Fooling your feelings: Artificially induced referred sensations are linked to a modulation of the primary somatosensory cortex. NeuroImage. 2006;29:67–73. doi: 10.1016/j.neuroimage.2005.07.001.
26. Schultz W, Dickinson A. Neuronal coding of prediction errors. Annual Review of Neuroscience. 2000;23:473–500. doi: 10.1146/annurev.neuro.23.1.473.
27. Stein BE, Meredith MA. The merging of the senses. Cambridge, MA: MIT Press; 1993.
28. Taylor-Clarke M, Kennett S, Haggard P. Vision modulates somatosensory cortical processing. Current Biology. 2002;12:233–236. doi: 10.1016/s0960-9822(01)00681-9.
29. Tsakiris M, Haggard P. The rubber hand illusion re-visited: Visuotactile integration and self-attribution. Journal of Experimental Psychology: Human Perception and Performance. 2005;31:80–91. doi: 10.1037/0096-1523.31.1.80.
30. Zampini M, Torresan D, Spence C, Murray MM. Auditory-somatosensory multisensory interactions in front and rear space. Neuropsychologia. 2007;45:1869–1877. doi: 10.1016/j.neuropsychologia.2006.12.004.
31. Zhou Y-D, Fuster JM. Neuronal activity of somatosensory cortex in a cross-modal (visuo-haptic) memory task. Experimental Brain Research. 1997;116:551–555. doi: 10.1007/pl00005783.
32. Zhou Y-D, Fuster JM. Visuo-tactile cross-modal associations in cortical somatosensory cells. Proceedings of the National Academy of Sciences of the United States of America. 2000;97:9777–9782. doi: 10.1073/pnas.97.17.9777.
