Philosophical Transactions of the Royal Society B: Biological Sciences. 2015 Sep 19;370(1677):20140203. doi: 10.1098/rstb.2014.0203

Dissecting neural circuits for multisensory integration and crossmodal processing

Jeffrey M Yau 1, Gregory C DeAngelis 2, Dora E Angelaki 1
PMCID: PMC4528815  PMID: 26240418

Abstract

We rely on rich and complex sensory information to perceive and understand our environment. Our multisensory experience of the world depends on the brain's remarkable ability to combine signals across sensory systems. Behavioural, neurophysiological and neuroimaging experiments have established principles of multisensory integration and candidate neural mechanisms. Here we review how targeted manipulation of neural activity using invasive and non-invasive neuromodulation techniques have advanced our understanding of multisensory processing. Neuromodulation studies have provided detailed characterizations of brain networks causally involved in multisensory integration. Despite substantial progress, important questions regarding multisensory networks remain unanswered. Critically, experimental approaches will need to be combined with theory in order to understand how distributed activity across multisensory networks collectively supports perception.

Keywords: modulation, microstimulation, causal, network, perception, interactions

1. Introduction

Much of our knowledge of how networks of neurons contribute to perception and cognition is based on the ability of systems neuroscientists to correlate the activity of neurons with both the sensory stimulus and behavioural outcomes. For example, trial-by-trial correlations of the responses of sensory neurons with perceptual decisions have been used to probe for functional roles in perception [1,2]. However, a crucial test of any model relating neural activity to behaviour involves interrogating how behaviour changes upon causal manipulation of these networks. Large strides have been made in understanding basic neural circuits using such neuromodulatory techniques in recent years. Although studies of sensory processing have traditionally been performed one sense at a time, there is also a long and parallel history of multisensory research.

In this review, we highlight some of the most important discoveries in the field of multisensory and crossmodal processing derived from causal manipulations of brain activity. Multisensory integration refers to the process by which information from different sensory modalities is combined by the nervous system to form a stable and coherent percept of the world. Multisensory integration enhances our ability to perceive and understand our environment, enabling us to move and interact with objects in our surroundings. Crossmodal processing, as used in this review, refers to the influence of stimulation in one sensory modality on neural processing mechanisms supporting a different modality (e.g. auditory stimulation influencing neural activity in visual cortex).

Progress to date has been based on a multitude of techniques, including psychophysics, neurophysiology and neuroimaging. Here we will provide a brief synopsis of how neuromodulation approaches (figures 1 and 4) have built on these techniques to (i) provide a systems-level understanding of the areas involved in multisensory integration and the areas providing sensory inputs to these integrators; (ii) demonstrate that multisensory integration can be a distributed property across cortical networks; (iii) show that sensory areas traditionally considered to be dedicated to a single modality can be multimodal, with crossmodal responses being modulatory or serving explicit and specific functions; and (iv) demonstrate how multisensory integration can depend on microcircuit interactions between particular cell types.

Figure 1.

Non-invasive brain stimulation techniques. (a) With transcranial magnetic stimulation (TMS), current passed through wire loops (red arrows) creates a brief (100–200 µs), yet strong magnetic field (peak strengths of 1–2 T, blue dashed lines) that can induce electrical currents (yellow patch) non-invasively in nearby brain tissue [3]. TMS spatial specificity and penetration depth vary depending on coil size and geometry—commonly used coils achieve spatial resolutions on the order of 1 cm² on the brain surface with field strength decaying exponentially with distance from the coil. TMS is used in ‘online’ modes to acutely disrupt neural activity patterns in the targeted brain region. TMS is used in ‘offline’ modes to modulate cortical excitability—the modulatory effect depends on stimulation parameters such as the temporal rate and pattern. (b) With transcranial direct current stimulation (tDCS), low-amplitude (typically 1–2 mA) electrical currents (yellow arrow) passing between surface electrodes (red and blue boxes) modulate excitability of brain tissue residing beneath the electrodes [4]. Prolonged anodal or cathodal tDCS administration can induce persistent potentiation or depression of cortical excitability, respectively. Current spread depends on a variety of factors including electrode size (typically 10–40 cm²) and arrangement.

Figure 4.

Invasive neuromodulation techniques. (a) With cooling deactivation, cortical tissue temperature is reduced to block neural activity temporarily [71]. (b) With chemical deactivation, pharmacological agents are used to block neural activity temporarily [72]. (c) With electrical microstimulation, low-amplitude electrical currents are delivered through microelectrodes to drive neural activity in small groups of neurons [73]. Microstimulation effects can depend strongly on stimulation parameters like amplitude, pulse width, pulse train duration and pulse train frequency. (d) With optogenetics, light is used to drive activity in neuron populations that have been genetically sensitized to light through the expression of light-sensitive proteins [74]. Optogenetics can be used to excite or inhibit specific cell types or neuron populations generally.

Because neuromodulatory techniques differ between animal and human models of multisensory function, we first outline recent discoveries in multisensory processing in humans (see also [5]), followed by a brief review of parallel studies in animals.

2. Summary of advances using neuromodulatory techniques in human studies

This section focuses exclusively on studies that have used two non-invasive brain stimulation (NIBS) methods: transcranial magnetic stimulation (TMS) and transcranial direct current stimulation (tDCS). Here we provide a brief overview of these methods and interested readers are directed to detailed descriptions of TMS, tDCS and related NIBS methods [3,4,6,7]. The sections following the TMS and tDCS overview then review studies employing these methods to investigate the multisensory and crossmodal properties of (i) higher-order association regions like the posterior parietal cortex (PPC) and the superior temporal sulcus (STS), and (ii) sensory cortex.

TMS (figure 1a) is used to modulate neural activity via electromagnetic induction. A high-voltage current is briefly passed through a coil to generate a rapidly changing magnetic field—this time-varying field induces electrical current in the brain (according to Faraday's induction principle) when the coil is positioned against the scalp. TMS-induced currents interact with the targeted brain region's endogenous neural activity to temporarily alter neural processing. Accordingly, in an ‘online’ mode of use, single-pulse TMS (spTMS) or brief trains of repetitive TMS (rTMS) are used to induce acute changes (most often perturbations) in motor, perceptual or cognitive performance in order to demonstrate causally the targeted brain region's functional involvement in the task. Beyond acutely influencing neural activity, prolonged application of rTMS can also induce changes in neuronal excitability that persist after TMS cessation. Depending on the temporal rates or patterning of the stimulation trains, rTMS can induce long-term potentiation- and long-term depression-like changes in cortical excitability. For instance, with regularly patterned rTMS, pulse frequency generally determines the direction of neuroplastic change: low-frequency rTMS (less than 1 Hz) typically results in decreased cortical excitability while high-frequency rTMS (greater than 1 Hz) leads to increased cortical excitability. Because cortical excitability changes can influence (by enhancing or impairing) perception and behaviour, rTMS paradigms are used in ‘offline’ TMS studies to characterize the functions of different brain regions.
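The underlying physics is Faraday's law of induction, which relates the induced electromotive force to the rate of change of magnetic flux through a conducting region (stated here in its textbook form for orientation, not as a quantitative model of any particular coil or head geometry):

$$\mathcal{E} = -\frac{\mathrm{d}\Phi_B}{\mathrm{d}t}, \qquad \Phi_B = \int_S \mathbf{B}\cdot \mathrm{d}\mathbf{A}.$$

Because the TMS pulse rises and falls within roughly 100–200 µs (figure 1a), even a peak field of 1–2 T yields a very large rate of change of flux, and hence induced currents strong enough to depolarize neurons in the superficial cortex beneath the coil.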

With tDCS (figure 1b), electric currents can be used directly and non-invasively to stimulate neural tissue. tDCS involves delivering continuous and prolonged low-amplitude electric current (typically 1–2 mA) to brain tissue via surface electrodes. Although much of the current is shunted through the scalp and cranium, substantial currents are passed to cortical tissue residing immediately beneath the electrodes and these currents can alter the membrane potential of neuron cell bodies or axon terminals to influence signal transmission. Unlike TMS, tDCS does not initiate action potentials, but instead changes spontaneous and stimulus-evoked neural activity. These neural excitability changes can persist temporarily beyond tDCS cessation. Crucially, the neuromodulatory influences of tDCS depend on current polarity: simplistically, cortical excitability of tissue underneath the anode is enhanced while excitability of tissue underneath the cathode is depressed. These cortical excitability changes can modulate neural processing sufficiently to impact motor, perceptual and cognitive function. Accordingly, studies have used tDCS as a tool for examining the contributions of targeted brain areas to particular functions.

(a). Multisensory functions of posterior parietal cortex

(i). Attention and stimulus binding

NIBS studies support the view that PPC plays a critical role in functions related to attention allocation for unimodal and multisensory processing. Indeed, anodal tDCS, which enhances cortical excitability [4,7], speeds reaction times for detecting auditory, visual and bimodal auditory–visual targets when applied over right PPC [8]. While reflexive attention shifts can be advantageous, automatic orienting may distract and impede performance in particular contexts. For instance, reflexive orienting to a non-informative visual stimulus can impair localization of a subsequently presented tactile target. Online TMS targeting the right angular gyrus (AG) and right supramarginal gyrus (SMG) impairs suppression of non-informative visual and tactile distractors in the context of tactile target localization [9]. This pattern indicates that PPC contributes to controlling reflexive spatial attention shifts across and within modalities. Because TMS targeting the right AG also impairs the temporal binding of auditory and visual stimulation [10], multisensory PPC circuits may be involved in attentional control over space and time. Although PPC may function as a general filter and integrator of sensory information within and across modalities, different PPC subregions residing in the inferior parietal lobule and intraparietal sulcus (IPS) probably subserve attention functions for particular modalities over others [11]. Given the involvement of PPC in unimodal and multisensory attention functions, this region offers great potential as a neuromodulation target for learning and attention training interventions [12].

(ii). Reference frame remapping

Because sensory information conveyed by each modality may be represented in particular spatial reference frames (e.g. tactile information in body-centred coordinates), integration of sensory inputs, in particular to guide actions, may require remapping neural representations into crossmodally consistent coordinate systems. NIBS studies in humans indicate that the remapping of sensory information from body-centred to external coordinates relies on the PPC. Bolognini & Maravita [13] explored the role of PPC in aligning tactile and visual reference frames by exploiting the fact that a touch felt on the hand enhances visual cortex excitability only at visual field locations coinciding with the stimulated hand's position in visual space. Using phosphenes induced by TMS [14] as a measure of visual cortex excitability, Bolognini & Maravita [13] showed that phosphene sensitivity increases in the right hemifield only when the touched right hand occupies the same position as the reported phosphene location. Moreover, if the arms are crossed and the left hand is positioned to the right of body midline, touch to the left hand, but not the right, increases phosphene sensitivity in the right hemifield. This pattern reveals that the tactile influences on visual cortex excitability depend on the location of touch in external coordinates: touch location remaps to external space based on proprioceptive cues. (Note that touch location could have been remapped to world-centred, body-centred or head-centred external coordinate frames, but this was not tested in [13].) Critically, following 1-Hz rTMS perturbation of right PPC, only tactile stimulation of the right hand influenced visual cortex excitability [13], with and without arm crossing. This result reveals that PPC mediates the remapping of touch to an external reference frame. Importantly, TMS targeting the right PPC also disrupts participants' ability to localize and compare touched body locations in external coordinates [15], in the absence of visual inputs. Thus, PPC appears to be involved in remapping touch from egocentric to allocentric coordinates. In the context of multisensory stimulation, reference frame remapping by the PPC may be critical for aligning inputs to facilitate integration.

(iii). Mediating crossmodal interactions in sensory cortex

In traditional hierarchical sensory processing models, PPC has been viewed as a higher-order sensory association area that integrates modality-specific sensory representations [16]. The prevalence of multisensory neurons and the clear demonstration of multisensory processing in PPC support this view [17,18]; however, PPC need not be viewed exclusively as the recipient of inputs processed by the sensory cortices. PPC is also ideally situated to mediate interactions between the sensory systems by shaping processing in primary sensory areas via feedback projections [19]. For instance, just as touch can modulate visual cortex activity, vision can also modulate activity in somatosensory cortex. Indeed, visual feedback is tremendously helpful for guiding haptic processing by providing information about what objects may be touching the body, where to expect touch and when contact may occur.

Results from a growing body of NIBS studies implicate PPC as a mediator of the rich communications between vision and touch. For instance, concurrently viewing a hand being touched on video improves the speed with which participants identify which of their own fingers is actually touched in a localization task. Online TMS to PPC abolishes this facilitation and reduces video-supported performance to unisensory levels [20]. Furthermore, viewing a picture of a hand immediately before engaging in a tactile grating discrimination task improves tactile performance [21,22]. Online TMS delivered to anterior IPS immediately after visual stimulus presentation but prior to touch onset disrupts visual enhancement of touch [22]. Importantly, TMS to IPS that is concurrent with touch has no effect, but TMS targeting primary somatosensory cortex (S1) at the time of tactile stimulation effectively disrupts tactile processing [22]. This spatio-temporal pattern is consistent with a visually driven PPC exerting modulatory influences on somatosensory cortex activity, and this top-down signal may specifically affect S1 rather than secondary somatosensory cortex circuits [21]. Prolonged visual stimulation can even serve to recalibrate tactile sensitivity, and this recalibration relies on an intact PPC [23]. Finally, auditory stimulation that is localized near the hand can similarly enhance touch, and this facilitation is also abolished by rTMS targeting PPC [24]. Together, these findings imply that PPC mediates crossmodal influences on primary sensory cortex activity. For visual and auditory influences on somatosensory processing, in particular, PPC may be necessary for maintaining stable and useful multisensory representations of peripersonal space.

(b). Multisensory functions of the superior temporal sulcus

We rely on auditory and visual information to communicate and it is unsurprising that these senses interact in speech perception. In the McGurk effect, the perception of an auditory syllable depends on whether it is accompanied by a visual depiction of the speaker producing the same or a different syllable [25]. In this robust phenomenon, incongruent auditory and visual stimuli can result in the perception of illusory syllables not presented in either modality. Neuroimaging studies have consistently identified the STS as a region involved in auditory–visual speech processing [26] and, indeed, the homologous area in macaque monkeys contains neurons tuned for communication-related auditory and visual inputs [27]. To establish a causal contribution of the STS to auditory–visual integration for the McGurk effect, Beauchamp et al. [28] applied spTMS over STS (figure 2a) while participants reported their perception of auditory and auditory–visual speech stimuli. With spTMS over STS, participants report significantly fewer occurrences of the McGurk illusion compared with trials without TMS or with TMS applied over a control site located approximately 4 cm dorsal and posterior to the STS (figure 2b) [28]. Crucially, TMS targeting STS does not disrupt performance on control trials containing auditory-only and congruent auditory–visual stimuli, implying that STS is specifically involved in the generation of the McGurk illusion with incongruent auditory–visual stimuli. To establish the time course of STS involvement in the McGurk effect, Beauchamp et al. additionally conducted a chronometric TMS experiment (figure 2c) and found that TMS to STS disrupts the McGurk effect only when pulses were delivered during an interval spanning 100 ms before the onset of the auditory stimulus to 100 ms after stimulus onset [28]. These experiments collectively provide strong support for the role of the STS in auditory–visual integration for speech. The causal contributions of the STS to the McGurk effect have recently been corroborated by a study using tDCS [29]—cathodal stimulation, which depresses cortical excitability, decreases the likelihood of McGurk responses when applied over the STS. Interestingly, the authors also reported that anodal tDCS over PPC increases the likelihood of McGurk responses [29]. This result is consistent with neuroimaging studies reporting PPC activations related to auditory–visual speech processing and the McGurk effect [30,31]. Together, these findings imply that multiple brain areas contribute to multisensory speech perception, and future studies are needed to establish the functional relationships between regions like the STS and PPC.

Figure 2.

fMRI-guided TMS-disruption of the McGurk illusion (from Beauchamp et al. [28]). (a) Auditory–visual speech stimuli evoke strong activations in the left superior temporal sulcus (STS; red voxels) and other brain regions (green voxels). The dashed black line shows the fundus of the posterior STS. (b) Participants are highly likely to report experiencing the McGurk illusion when presented with incongruent auditory–visual speech stimuli without concurrent TMS (black) or with TMS delivered to a control site near the STS (blue). For the same auditory–visual speech stimuli, TMS targeting STS significantly reduces the likelihood of the McGurk percept (red). (c) Single-pulse TMS targeting STS impacts McGurk illusion generation only when TMS is delivered within a specific temporal window relative to the onset of the auditory speech stimulus.

NIBS studies indicate that the STS may be generally involved in binding auditory and visual inputs, regardless of whether they contain speech or biological motion. For example, tDCS applied over the STS influences the sound-induced fission illusion [32], a phenomenon in which a single flash appears to flicker when paired with multiple sounds [33]. Neuromodulation of the fission illusion is polarity-specific: anodal tDCS over STS increases fission while cathodal tDCS over STS decreases fission [32]. These results support the notion that the STS also serves as a locus for the fission illusion. Alternatively, the STS may be part of a network that supports auditory–visual binding and tDCS applied over STS may remotely modify excitability across this distributed network. Indeed, tDCS over primary visual cortex, whose activity has been shown to correlate with the strength of the fission illusion across individuals [34,35], can also modulate fission strength in the sound-induced illusion in a polarity-dependent manner. These considerations exemplify the potential challenges involved in interpreting NIBS results, but the collective results from these studies reveal the clear involvement of the STS in the binding of auditory–visual information.

(c). Multimodality of sensory cortex

In traditional models of sensory cortex organization, sensory information processing for each modality is initially segregated in modality-specific sensory areas before subsequent integration in higher-order association areas. This organizational scheme has been challenged recently by the demonstration of multimodal responses in primary and secondary sensory areas [16,36]. NIBS has been tremendously helpful for testing the modality-specificity of sensory cortex functions. For clarity, here we define the modality that is traditionally associated with a given sensory region as ‘preferred’ (i.e. visual inputs are preferred by visual cortex). Accordingly, the other modalities are defined as ‘non-preferred’ with respect to the same region (i.e. auditory and somatosensory inputs are non-preferred by visual cortex). In the following section, we review two types of multimodal responses in sensory cortex that have been revealed by NIBS. In the first case, NIBS reveals the contributions of traditionally-defined sensory areas to perceptual processing of non-preferred modalities. In the second case, NIBS reveals modulation of sensory cortex excitability by non-preferred sensory inputs.

(i). Crossmodal recruitment of sensory cortex

Sensory brain regions can be characterized according to their preferred input modalities and according to the functions and computations they perform. For instance, early processing stages in the ventral visual pathway are known to represent simple visual shape information like orientation and curvature, while higher-order ventral pathway areas support object processing [37,38]. Rather than being strictly dedicated to visual processing, these regions also respond to tactile inputs [39–42] in the absence of concurrent visual stimulation if these signals convey spatial or shape information. This response selectivity is consistent with the notion that visual cortex serves as a general operator for processing spatial or shape information [43]. Supporting this hypothesis, rTMS disrupting visual cortex processing in sighted individuals impairs Braille reading [42] and fine tactile dot spacing judgements [44]. Yau et al. [45] recently demonstrated that anodal tDCS applied over visual cortex, which enhances contrast sensitivity to visual grating patterns [46], also temporarily enhances tactile spatial acuity [45]. Participants performed a discrimination task in which they identified the orientation of tactile gratings that varied in grating width. Following anodal tDCS over visual cortex (figure 3a), but not sham stimulation over visual cortex or anodal tDCS over auditory cortex, participants were better able to discriminate the orientation of tactile gratings at smaller grating width values [45]. This pattern implies that visual cortex (probably V1/V2) contributes to tactile orientation perception, corroborating an earlier report showing that spTMS delivered to occipital cortex impairs participants' ability to discriminate tactile grating orientation [47]. Zangaladze et al. [47] employed a chronometric analysis to reveal that TMS disruption occurs for pulses delivered 180 ms after touch [47]. This temporal specificity indicates that tactile recruitment of visual cortex occurs relatively late in the processing stream.

Figure 3.

Crossmodal recruitment for spatial and temporal touch (from Yau et al. [45]). (a) Participants performed a tactile grating orientation discrimination task during sessions in which they received sham tDCS (grey-hued traces) or anodal tDCS (red-hued traces) over visual cortex (yellow crosshair in left panel). Performance scaled according to grating width and widths corresponding to 75% correct performance define the grating orientation threshold, a measure of tactile spatial acuity. Inset: grating orientation (GO) thresholds estimated for different test intervals during each session (Pre, Stim, Post1 and Post2) reveal a significant enhancement of tactile spatial acuity following anodal tDCS. (b) Participants performed a tactile frequency discrimination task during sessions in which they received sham tDCS (grey-hued traces) or anodal tDCS (red-hued traces) over auditory cortex (yellow crosshair in left panel). Psychometric plots indicate the probability that participants judged a comparison frequency (fc) as being higher in perceived frequency compared with a 300-Hz standard stimulus. Inset: sensitivity estimates for different test intervals during each session (Pre, Stim, Post1 and Post2) reveal a significant enhancement of tactile frequency sensitivity following anodal tDCS.

Involvement of visual cortex in tactile shape processing is not restricted to the perception of low-level shape features: rTMS targeting the lateral occipital complex (LOC), a higher-order visual area that also responds to tactile shapes [48], disrupts visual, tactile and visual–tactile instances of the Müller-Lyer illusion [49], a length illusion that requires integration of multiple line stimulus segments. Crucially, dorsal pathway visual areas also appear to support processing for non-preferred modalities. For instance, middle temporal area (area MT), which is well studied given its prominent role in visual motion processing, also activates in response to tactile motion [50–52]. rTMS targeting area MT impairs tactile motion speed judgements [53]. Collectively, these NIBS studies support the emerging view that brain regions which have been traditionally ascribed visual functions may also contribute to non-visual perception.

Auditory cortex also contributes to processing for non-preferred modalities. Tactile stimulation alone evokes responses in auditory cortex [54–58]. Because auditory cortex comprises neural circuits specialized for processing temporal information like frequency and timing, tactile responses in auditory cortex may reflect functional recruitment for tactile temporal processing. Indeed, anodal tDCS over auditory cortex, which modulates auditory frequency sensitivity [59], improves performance on a tactile frequency discrimination task by enhancing frequency sensitivity or perceptual learning in the frequency domain [45] (figure 3b). Additionally, spTMS targeting the superior temporal gyrus (STG) impairs performance on a tactile duration discrimination task [60,61]. In individuals with normal hearing, TMS targeting STG impairs tactile performance only when TMS pulses are presented 180 ms after touch; however, TMS delivered to STG 60 ms after touch impairs tactile performance in congenitally deaf individuals [61]. These studies reveal that tactile recruitment of auditory cortex may be ubiquitous in hearing and deaf individuals, and that the degree and nature of crossmodal processing are shaped by auditory experience.

Just as regions in visual and auditory cortex can be defined by the functions they support (i.e. spatial form and temporal frequency processing, respectively), rather than by modality per se, somatosensory cortex appears to contribute to processing of body-related information independent of input modality. This perspective, also articulated in theories of embodied cognition [62], proposes that visual perception of facial expressions recruits face-processing circuits in sensorimotor brain areas in addition to those in visual cortex. Supporting this view, Pitcher et al. [63] demonstrated that online rTMS targeting the somatosensory face region, but not the adjacent hand region, impairs participants' ability to discriminate visually presented facial expressions [63]. This result is consistent with a previous study demonstrating that spTMS targeting primary somatosensory cortex affects judgements of facial emotion, but not of eye-gaze direction [64]. The magnitude of S1-TMS disruption is comparable to the effect of delivering TMS to the occipital face area (OFA). Notably, the time course of TMS-induced perturbations differs between the two targets: TMS targeting OFA at 60 ms after stimulus onset maximally disrupts performance while the largest effects of S1-TMS occur with pulses delivered 130 ms after visual stimulus onset [63]. Visual processing related to non-face body parts also recruits somatosensory cortex, as online rTMS targeting S1 impairs discrimination performance in a task requiring participants to judge whether a video depicts a hand being touched [65]. Thus, somatosensory cortex may be specialized for processing body-related information across sensory modalities.

(ii). Crossmodal modulation of sensory cortex excitability

As discussed previously, a single TMS pulse delivered to occipital cortex can evoke acute visual sensations (i.e. phosphenes) in restricted portions of the visual field [14]. Because the likelihood of producing phosphenes depends on the excitability state of the visual cortex [66] in addition to TMS intensity, phosphene generation provides a unique tool for probing crossmodal influences on visual cortex excitability. Furthermore, just as chronometric TMS experiments track the time course of a brain region's involvement in a particular function (e.g. figure 2c), the time course of crossmodal influences on visual cortex excitability can be profiled. Ramos-Estebanez et al. [67] demonstrated that subthreshold tactile stimulation (electrocutaneous stimulation that was imperceptible) on the right hand increases phosphene occurrence in the right visual field when occipital TMS is delivered 60 ms after tactile stimulation [67]. This result reveals that the crossmodal somatosensory influences on visual cortex excitability are highly specific in space and time. Additionally, the fact that subthreshold tactile stimuli modulate visual cortex excitability implies crossmodal influences operate via automatic, pre-attentional mechanisms. Interestingly, Ramos-Estebanez et al. found that subthreshold stimulation of the right hand increases phosphene occurrence in the right visual field even if the arms are crossed and the hand occupies space in the left hemifield. Recall that Bolognini & Maravita [13] reported that a stimulated right hand positioned to the left of body midline did not modulate excitability in the right hemifield—the authors interpreted this as evidence for the tactile reference frame remapping function of PPC [13]. These contradictory results may be reconciled by considering that Bolognini and Maravita used suprathreshold tactile stimulation while the tactile stimulus used by Ramos-Estebanez et al. was subthreshold; conceivably, reference frame remapping specifically occurs for detectable or attended stimuli, consistent with PPC's ascribed role in attention allocation.

TMS induction of phosphenes has also been used to demonstrate that auditory stimulation modulates visual cortex excitability. Auditory modulation of visual cortex occurs with a temporal profile matching tactile-mediated excitability changes [68]. Furthermore, auditory modulation of visual cortex can depend on sound characteristics—looming sounds exert greater influences on visual cortex excitability compared with receding or stationary sounds [69]. Looming stimuli preferentially drive visual cortex excitability even for sound durations short enough to abolish participants' ability to discriminate sound characteristics. These results indicate that auditory modulation of visual cortex is pre-perceptual and feature-specific [69].

NIBS has been used not only to reveal crossmodal influences on visual cortex excitability, but also to identify potential cortical origins of crossmodal modulatory signals. Anodal tDCS targeting auditory cortex enhances the influence of looming sounds on visual cortex excitability [70]. Likewise, anodal tDCS targeting parietal cortex enhances the tactile influences on visual cortex excitability [70]. That auditory stimulation can combine with tactile stimulation to induce larger excitability changes in visual cortex than can be achieved by either modality alone [70] suggests that common mechanisms exist for crossmodal modulation of visual cortex. Together, these results imply that visual cortex excitability changes are shaped by the activity of auditory and somatosensory cortex. Because visual cortex excitability can be influenced by stimuli that are subthreshold [67] and pre-perceptual [69], interactions between sensory cortical systems appear to be automatic and nearly ubiquitous.

3. Summary of advances using neuromodulatory techniques in animal studies

In the following section, we summarize studies that characterize multisensory neural mechanisms using neuromodulatory methods in animal models. Animal models permit use of invasive techniques like electrical microstimulation, chemical and cryogenic deactivation, and more recently, optogenetics (figure 4). These approaches, combined with psychophysics and neurophysiology, have led to important advances in our understanding of the brain systems and mechanisms that support multisensory functions. Here we synthesize the relevant literature according to four general topics: (i) multisensory network interactions supporting attention and orienting behaviour, (ii) crossmodal activity in putative unisensory regions, (iii) optimal cue integration, and (iv) neural and synaptic architecture for multisensory integration.

(a). Multisensory networks for orienting behaviours and attention

(i). Multisensory integration in cat superior colliculus

The superior colliculus (SC) is a brainstem structure involved in spatial orienting. SC neurons display a wide range of unisensory and multisensory responses to visual, auditory and somatosensory inputs [75–78]. Because of these complex interactions, pioneering work on multisensory processing focused on SC neurophysiology. Much of the work performed in SC builds upon the observation that SC unit responses to bimodal stimuli often exceed the response to the most effective unimodal component stimulus. Such facilitation has been used to define multisensory enhancement (MSE) in SC [79,80] and this definition has been extended to quantify multisensory integration in other brain structures [81,82]. MSE strongly depends on the spatial and temporal relationships between the integrated sensory inputs. In general, MSE is found to be reduced when stimuli from two modalities are separated in either space or time [80]. MSE is also evident in behaviour, as animals trained to orient to visual, auditory or bimodal spatial cues perform best when bimodal cues are spatially and temporally congruent [77]. Importantly, an intact SC appears to be required for MSE effects in cat orienting behaviour because excitotoxic SC lesions abolish the MSE in behaviour and reduce performance to unimodal levels [83]. The well-defined neurophysiological substrate of orienting behaviour in the SC of cats has provided a rich model system for testing multisensory integration mechanisms using neuromodulatory methods.
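For orientation, MSE is typically quantified as the percentage change of the bimodal response relative to the largest unimodal response; the exact notation varies between studies, but a common form consistent with [79,80] is

$$\mathrm{MSE} = \frac{R_{VA} - \max(R_V, R_A)}{\max(R_V, R_A)} \times 100\%,$$

where $R_{VA}$ denotes the mean response to the combined stimulus and $R_V$ and $R_A$ denote the mean responses to the visual and auditory components presented alone. Positive values indicate multisensory enhancement and negative values indicate response depression.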

Substantial efforts have focused on characterizing the inputs that drive multisensory responses in cat SC. For instance, microstimulation applied to the frontal eye fields (FEF) preferentially activates multisensory, not unisensory, neurons in SC [84]. This suggests that frontotectal projections may serve specific functions in SC multisensory processing. Additionally, evidence from many studies using a variety of neuromodulation techniques highlights the importance of sensory cortex in sculpting multisensory interactions in SC (figure 5). In particular, electrical microstimulation has revealed that the SC receives projections from multiple sensory cortical areas including the lateral suprasylvian visual area (LS) and the anterior ectosylvian sulcus (AES), the latter of which comprises an auditory field (FAES), a visual region (AEV) and the fourth somatosensory area (SIV). Electrical microstimulation delivered to each of these cortical regions produces orthodromic activation of multisensory SC units [86]. Furthermore, microstimulation of sensory cortex produces modality-specific SC responses that are consistent with activity evoked by actual sensory stimulation [86]. Because sensory cortex contains both unisensory and multisensory neurons, either population (or both) could project to SC; however, microstimulation targeting SC only evokes antidromic responses in unisensory cortical neurons. This result suggests that unisensory, not multisensory, cortical neurons make corticotectal projections.

Figure 5.

Putative circuitry underlying multisensory integration in cat superior colliculus (from Rowland et al. [85]). Multisensory neurons in superior colliculus receive ascending and descending inputs from multiple sources. Cortico-collicular projections from different sensory cortex regions in the anterior ectosylvian sulcus (AES) target electrotonically coupled areas of the dendrite. Afferents also project to interneurons that inhibit multisensory output neurons. Cell labelled ‘I’ is an inhibitory interneuron; PAG, periaqueductal grey area.

Consistent with the notion that sensory cortex provides critical inputs that shape multisensory integration in the SC, chemical deactivation of sensory cortex selectively impacts orienting and approach behaviours that are based on bimodal cues [87]. Specifically, lidocaine injections in AES reduce both the behavioural enhancements that arise from congruent bimodal cues and the performance impairments that are caused by incongruent bimodal cues [87]. This demonstrates that modifying sensory cortex responses directly influences SC activity. Indeed, reversible cryogenic deactivation of AES and LS selectively impacts the multisensory responses of many SC neurons. Following cooling, SC responses to bimodal cues are reduced to levels that match the maximum unisensory responses [88]. This indicates that multisensory integration in the SC requires intact and functional sensory cortex. Importantly, these cooling experiments also reveal that SC neurons display different patterns of dependencies on sensory cortex: some SC neurons maintained MSE as long as either AES or LS remained functional, but MSE in a majority of SC neurons required functional integrity of both AES and LS [88]. Multisensory integration in SC thus appears to rely on interactions between auditory and visual sensory cortical areas (figure 5), and this ‘corticotectal synergism’ is also reflected in SC dependencies on the separate auditory and visual subfields contained within AES [89].

Because intact sensory cortical function is required for multisensory integration in SC neurons, it is unsurprising that sensory cortex also plays a major role in shaping the maturation of multisensory properties in SC neurons during development. Rowland et al. [90] recently demonstrated that prolonged, but temporary, deactivation of AES and LS, using muscimol-releasing packets implanted at postnatal week 3, impairs the development of MSE in SC neurons. Chemical deactivation of sensory cortex early in life led to an absence of MSE in behaviour and neurophysiological tests in 1-year-old animals [90]. Importantly, MSE appeared by 4 years of age in these same animals. Thus, the maturation of SC multisensory properties, although delayed by early cortical deactivation, can still occur later in life, provided sensory cortex remains intact.

(ii). Spatial attention mechanisms

Sensory responses can be modulated by top-down attentional signals [91]. In macaque monkeys, electrical microstimulation of FEF, a forebrain area involved in gaze control, increases both behavioural sensitivity and the responsiveness of V4 neurons to stimuli presented within the movement field of the stimulated FEF neurons [92,93]. In this manner, microstimulation of FEF is thought to mimic the neural and behavioural effects seen with endogenous visual spatial attention. A top-down mechanism for spatial attention has also been reported in the barn owl [94]. Microstimulation applied to the arcopallial gaze fields (AGF) in the barn owl modulates spatially-tuned auditory responses in the optic tectum (OT), suggesting that the AGF serves a function analogous to that of the mammalian FEF. Critically, the spatially-tuned neurons in barn owl OT respond to visual as well as auditory stimulation, so the OT can be considered a multisensory structure. Winkowski & Knudsen [95] demonstrated that electrical stimulation of AGF enhances the auditory and visual responsiveness and stimulus discriminability of OT neurons that are driven by the location in space represented by the AGF microstimulation site [95]. Conversely, AGF microstimulation suppresses OT responses to sensory inputs presented outside of the AGF receptive field (RF) [95].

These results indicate that microstimulation-induced AGF modulation of OT, which mimics endogenous spatial attention, depends on the spatial alignment of AGF and OT RFs. Furthermore, the ability of AGF microstimulation to produce these endogenous spatial attention effects depends on the integrity of a particular inhibitory circuit in the barn owl midbrain tegmentum, the nucleus isthmi pars magnocellularis (Imc). Chemical inactivation of Imc abolishes endogenous competitive suppression caused by AGF microstimulation [96]. Imc blockade also disrupts the suppressive influences of distractor stimuli on OT activity [96]. Thus, a common inhibitory circuit appears to mediate exogenous and endogenous attention effects on OT activity, which potentially impacts multisensory responses across multiple processing levels in the tectofugal pathway [97]. Together, these studies use neuromodulation to reveal how top-down control mechanisms influence multimodal spatial processing circuits in the barn owl.

(b). Multimodal processing in auditory brainstem and cortex

(i). Somatosensory activation of the dorsal cochlear nucleus

Based on its anatomical and physiological characteristics, the dorsal cochlear nucleus (DCN) is traditionally considered to be an auditory brainstem structure [98]. However, the DCN also receives a projection from the somatosensory dorsal column and spinal trigeminal nuclei (collectively termed the medullary somatosensory nuclei; MSN) [99]. This connectivity implies that tactile inputs also access DCN circuits. Neuromodulation has been used to characterize the nature of somatosensory inputs to DCN. Electrical microstimulation of MSN strongly modulates DCN activity in cats [99] through a mixture of inhibitory and excitatory effects that depend on cell type [100]. Microstimulation of the trigeminal nuclei also produces complex interaction patterns in the guinea pig DCN [101]. These results clearly indicate that the DCN does not exclusively respond to auditory stimulation.

The observation that mechanical stimulation of the pinna and vibrissae, but not other body regions, drives DCN activity [99] implies that somatosensory inputs to DCN may serve specific functions. In fact, muscle-related somatosensory stimulation of the pinna drives DCN activity most robustly [102]. These patterns are consistent with the hypothesis that somatosensory inputs to DCN modulate auditory responses in behaviourally meaningful ways: excitatory interactions that amplify auditory signals can result from pinna movements related to spatial orienting while inhibitory interactions can suppress sound processing related to self-generated movements [98,103]. Importantly, because the DCN occupies a relatively early processing stage in the auditory pathway, somatosensory information accessing DCN circuits necessarily propagates to all levels of the auditory system that receive direct or indirect inputs from DCN [99].

(ii). Crossmodal recruitment of auditory cortex

Neuromodulation studies have provided clear evidence for functional visual recruitment of auditory cortex in congenitally deaf and early-deafened cats. Lomber et al. [104] used cryogenic deactivation techniques to establish the neural mechanism mediating the supranormal visual capacities of deaf cats. Specifically, deaf cats have superior visual localization and lower visual movement detection thresholds in the peripheral field compared with hearing cats. Consistent with the hypothesis that auditory cortex in deaf animals is recruited to process visual inputs, substantially more visually responsive neurons are found in the FAES of deaf compared with hearing cats [105]. To demonstrate that FAES causally supports vision, Lomber and colleagues deactivated all or parts of FAES in cats performing visual tasks. Cooling all of FAES reduces supranormal visual performance to typical levels in all domains that are enhanced by deafness [104,105]. Moreover, deactivating only the posterior auditory field (PAF) selectively impairs visual localization and deactivating only the dorsal zone of auditory cortex (area DZ) selectively impairs visual motion detection [104]. This double dissociation indicates that distinct auditory cortical regions support separate visual functions in the deaf. This highly specific crossmodal recruitment of sensory cortex, which pairs particular visual functions with specific auditory cortical loci, may also account for the superior visual abilities of deaf humans [106].

(c). Optimal cue integration

(i). Visual and vestibular cue integration for perceiving self-motion

A popular theoretical framework for considering multisensory cue integration is based on maximum-likelihood estimation (MLE). According to the MLE framework, redundant sensory cues are combined linearly in a statistically optimal manner such that (i) the variance of perceptual estimates based on the combined cues is minimized and (ii) the perceptual weights assigned to each cue are proportional to the relative reliabilities of the cues, such that more reliable cues are weighted more heavily [107,108]. Near-optimal cue integration properties have been observed for a broad variety of sensory modalities and tasks [109–114], and substantial progress has been made in establishing the neurophysiological mechanisms of optimal cue integration [115]. Recent studies have used neuromodulation techniques to investigate the cortical circuits that support optimal cue integration.
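In its simplest two-cue form, and under the standard assumptions of independent Gaussian noise on each cue (the textbook formulation underlying [107,108]), the MLE prediction can be written as

$$\hat{s}_{\mathrm{comb}} = w_1\hat{s}_1 + w_2\hat{s}_2, \qquad w_i = \frac{1/\sigma_i^2}{1/\sigma_1^2 + 1/\sigma_2^2}, \qquad \sigma_{\mathrm{comb}}^2 = \frac{\sigma_1^2\,\sigma_2^2}{\sigma_1^2 + \sigma_2^2} \leq \min(\sigma_1^2, \sigma_2^2),$$

where $\hat{s}_i$ and $\sigma_i^2$ are the perceptual estimate and variance associated with cue $i$ alone. The weights track cue reliability (inverse variance) and the combined variance never exceeds that of the more reliable cue; these are the two signatures of optimality tested in the behavioural studies cited above.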

Visual and vestibular cues combine near-optimally to inform our perception of heading, which is the direction of translational self-motion [116]. Neurons in the dorsal medial superior temporal area (MSTd) exhibit directional tuning for visual and vestibular heading signals, making this region a prime candidate for mediating multisensory heading perception. If visual and/or vestibular signals in MSTd are used by the monkey to judge heading, electrical microstimulation targeting neuron clusters with similar tuning preferences is expected to add a net signal to the local circuit, thus biasing behaviour. Specifically, microstimulation is expected to shift the psychometric function toward the preferred heading of neural activity recorded at the stimulation site, with little or no effect on the slope of the function (figure 6a). By contrast, reversible chemical inactivation using muscimol, a GABA-A receptor agonist, is expected to flatten the psychometric function if a large enough volume is used to suppress activity in populations of neurons with diverse tuning preferences. In this case, chemical inactivation would be expected to increase heading thresholds, rather than inducing a choice bias (figure 6b).
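These two predictions can be summarized with a cumulative Gaussian psychometric function, a common descriptive model in these experiments (the parameterization here is illustrative):

$$P(\text{rightward choice} \mid h) = \Phi\!\left(\frac{h - \mu}{\sigma}\right),$$

where $h$ is the heading, $\mu$ is the point of subjective equality and $\sigma$ sets the discrimination threshold. Microstimulation of a site with, say, a leftward heading preference is predicted to shift $\mu$ rightward (more leftward choices) with little change in $\sigma$ (figure 6a), whereas muscimol inactivation is predicted to increase $\sigma$, flattening the function, with little systematic change in $\mu$ (figure 6b).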

Figure 6.

Hypothesized and actual changes in heading perception (from Gu et al. [117]). (a) Predicted effects of microstimulation on heading discrimination. If multi-unit (MU) activity at the stimulation site has a leftward heading preference, microstimulation should lead to more leftward choices such that the psychometric function is shifted to the right. By contrast, stimulating sites with a rightward heading preference should cause a leftward shift of the psychometric function. PSE, point of subjective equality. (b) Predicted effects of reversible inactivation on heading discrimination performance. Muscimol injection suppresses neural activity (inset) in a large region and is expected to reduce the precision of heading discrimination, leading to a shallower psychometric function. (c) Example microstimulation experiment at a site with a leftward heading preference. Psychometric functions for stimulated (dashed curves) and non-stimulated (solid curves) trials in response to vestibular (black), low-coherence visual (red) and combined (green) stimuli. (d) Example inactivation experiment. Visual psychometric functions were collected during four separate sessions: a control block before muscimol injection (‘Pre’, dashed curves and cross symbols), a block immediately after inactivation (‘0 h’, orange), a block 12 h after inactivation (‘12 h’, red) and a recovery block 36 h after inactivation (‘36 h’, black solid curves and open symbols).

In a series of experiments, Gu et al. [117] used electrical microstimulation and muscimol inactivation methods to test the causal involvement of area MSTd in multisensory heading perception [117,118]. Electrical microstimulation targeted to clusters of MSTd neurons with consistent heading tuning was found to induce systematic biases in heading percepts. Although electrical microstimulation greatly affected performance on vision-only trials, performance on vestibular-only and multisensory trials was minimally influenced (figure 6c). The weak effects on vestibular heading perception were probably due to the fact that tuning of MSTd neurons for heading based on vestibular signals is relatively weak. Because of this weak vestibular tuning, microstimulation may have failed to activate sufficiently large clusters of MSTd neurons with consistent vestibular heading preferences.

To counter some of the limitations of microstimulation, Gu et al. [117] also injected muscimol into MSTd to test how inactivation of this area impacts heading perception. Suppressing MSTd activity using large bilateral injections of muscimol significantly impaired heading perception and increased behavioural thresholds (figure 6d). The threshold changes were largest for the visual condition, intermediate for the multisensory condition and weakest for the vestibular condition, again revealing that visual perception is more strongly impacted by neuromodulation of MSTd. The consistently weaker effects on vestibular-only trials suggested that other brain regions also support vestibular heading perception. Candidate regions include the parieto-insular vestibular cortex [119], the visual posterior sylvian area [120] and the ventral intraparietal area (VIP) [121,122], all of which contain neurons with responses that are either dominated by vestibular inputs or have more balanced representations of visual and vestibular signals than MSTd, which is visually dominated. Macaque area VIP has also been shown to respond to tactile and auditory stimuli, as well as visual inputs [123–125]. Electrical microstimulation of VIP evokes behaviourally relevant motor actions, including defensive postures or avoidance movements [126].

Interestingly, even though microstimulation and inactivation of area MSTd biased and impaired heading perception, respectively, behavioural cue integration continued to be near-optimal [117]. These results, supported by model simulations, suggest that optimal cue integration for heading perception is a distributed operation that involves additional populations of heading selective neurons located outside of MSTd. Similarly, a recent study reported that reversible chemical deactivation of PPC impairs visual processing without affecting auditory sensitivity or auditory–visual integration in an event-counting task [127]. Thus, from these collective results, cue integration appears to be supported by distributed multisensory networks.

(ii). Optimal integration of intracortical microstimulation and sensory inputs

Remarkably, even arbitrary patterns of electrical microstimulation conveying sensory information can be optimally integrated with natural sensory cues, so long as the relationship between the microstimulation patterns and sensory information can be learned [128]. Macaque monkeys initially trained to perform centre-out reaches in a virtual reality environment based on visual cues can learn to perform reaches based solely on microstimulation feedback after a period of multisensory training that pairs the microstimulation patterns with visual cues. This learning is possible even if the microstimulation activates somatosensory neurons in an arbitrary fashion that ignores the tuning preferences of the stimulated cortical site, so long as the microstimulation is spatio-temporally correlated with the visual cues during training. Following training, the microstimulation-evoked somatosensory activity and the visual signals provide redundant feedback information and reaching performance with either cue alone is characterized by particular bias and variance levels. Critically, the visual and microstimulation cues combine near-optimally such that performance variance with bimodal cues is smaller than with either cue separately and the weight ascribed to each cue scales with its relative reliability. This study highlights not only the potential for electrical microstimulation to provide sensory inputs in neuroprosthetic applications but also the fundamental and ubiquitous nature of optimal cue integration.

(d). Neural and synaptic architecture for multisensory integration

Although our understanding of the brain areas and networks involved in multisensory integration has advanced tremendously, much less is known regarding mechanisms of multisensory integration at the cellular and synaptic levels. A recent study used optogenetics to characterize cortical microcircuits mediating multisensory integration [129]. Olcese et al. [129] focused their experiments on the visuotactile area RL, a region of parietal cortex in the mouse that lies between the visual and somatosensory cortices and projects to motor areas. As measured by postsynaptic potentials and action potentials, area RL neurons respond to visual, tactile and visual–tactile stimulation. In a critical set of experiments, Olcese et al. first established that both pyramidal cells and parvalbumin-positive interneurons (Pv-INs) respond to visual, tactile and combined inputs. Thus, both excitatory and inhibitory neural circuit components potentially process multisensory inputs. Notably, compared with pyramidal cells, Pv-INs were more likely to be bimodal (i.e. responsive to both visual and tactile inputs), but they displayed less MSE.

To test whether the multisensory properties of Pv-INs influence MSE in pyramidal cells, Olcese et al. selectively activated Pv-INs using optogenetics [129]. Pv-INs were activated by light while unimodal and bimodal sensory stimuli were presented. Photoactivation increased the spiking activity of Pv-INs on multisensory trials to a greater extent than on unisensory trials. Consequently, Pv-IN photoactivation resulted in larger reductions in the synaptic responses and spiking activity of pyramidal cells on multisensory compared with unisensory trials. Thus, artificially augmenting MSE in Pv-INs reduced MSE in pyramidal cells. This experiment demonstrates a causal relationship linking the multisensory properties of excitatory and inhibitory cell types in cortical microcircuits.

4. Discussion

As summarized here, the addition of causal neuromodulatory techniques to behavioural, neurophysiological and neuroimaging methods has greatly advanced our understanding of the neural mechanisms underlying multisensory integration. In particular, neuromodulation has helped to elucidate the constellations of brain structures that contribute to multisensory integration and perception. Multisensory processing clearly relies on coordinated activity over networks that span multiple scales: multisensory integration depends on complex interactions between neural populations in cortical microcircuits as well as interactions between cortical and subcortical brain structures. Despite recent progress, many unanswered questions remain. For example, why are there multiple areas with similar multisensory properties? What is the functional relationship between areas that comprise multisensory networks? Experiments that combine stimulation approaches with simultaneous recording and imaging will be critical for addressing these questions. In humans, multimodal non-invasive approaches like concurrent TMS-fMRI [130,131] and concurrent TMS-EEG [132] enable direct characterizations of functional connectivity. In animal models, the pairing of electrical microstimulation or optogenetic stimulation with electrode recordings or optical imaging offers even more powerful tools for interrogating multisensory networks, especially in systems that permit genetic control. Recent advances in such multimodal approaches position the field to explore not only the architecture of multisensory networks, but also the functional properties of these circuits.

A particularly important question regarding multisensory networks that neuromodulation methods may address concerns the coupling of network nodes: how do distributed brain areas collaborate to support integration and perception? The last decade has brought a growing appreciation of cortical oscillations and the range of cognitive functions they potentially support, including the binding of neuronal ensembles [133]. Indeed, recent evidence suggests that coupled oscillatory activity across distributed sensory brain regions can serve to bind information across the senses [134]. Multisensory integration and attention-driven stimulus binding and selection may similarly rely on mechanisms based on coherence of cortical oscillations [135,136]. Accordingly, some instances of crossmodal processing in sensory areas may be explained, in part, by attention-based mechanisms. Because neuromodulation can be used to entrain or shape cortical oscillations [137], stimulation methods may be particularly well suited for establishing a causal role of oscillations in multisensory integration.
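
As an illustration of how such coupling could be quantified, the sketch below computes spectral coherence between two synthetic signals that share a common gamma-band component; it is intended only to show the metric, not to reproduce any analysis from [134].

```python
import numpy as np
from scipy.signal import coherence

# Two synthetic 'area' signals sharing a 40 Hz (gamma-band) rhythm plus
# independent noise; coherence should peak near the shared frequency.

fs = 1000.0                    # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)   # 10 s of data
rng = np.random.default_rng(0)

shared_gamma = np.sin(2 * np.pi * 40 * t)
area_a = shared_gamma + rng.normal(scale=1.0, size=t.size)
area_b = 0.8 * shared_gamma + rng.normal(scale=1.0, size=t.size)

f, cxy = coherence(area_a, area_b, fs=fs, nperseg=1024)
peak = f[np.argmax(cxy)]
print(f"peak coherence {cxy.max():.2f} at {peak:.1f} Hz")  # expect a peak near 40 Hz
```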

Crucially, neuromodulation techniques suffer from some limitations, including limited spatial and temporal resolution and a lack of cell-type specificity. NIBS methods are certainly limited in their spatial and cell-type specificity, although clever experimental manipulations exploiting state-dependent stimulation effects have been developed for more targeted interventions [138]. Among invasive neuromodulation approaches, the older microstimulation, cooling and chemical inactivation techniques also have limited cell-type specificity, whereas the development of optogenetics has largely overcome this challenge in rodents. In primates and larger animals, optogenetic specificity remains lacking (although continued developments are promising [139–143]); however, as shown by the study of Olcese et al. [129], investigating multisensory integration in the mouse offers clear advantages. Of course, all neuromodulatory techniques, including optogenetics, carry the caveat that rapid plasticity mechanisms may alter network properties.

Finally, and perhaps most importantly, there is a great need to couple experimental approaches with theory. For example, it is generally assumed that a lack of behavioural deficits following inactivation of a brain area implies that the area does not make a causal contribution to behaviour. However, a recent theoretical analysis of the relationships between neural population codes and inactivation effects suggests that this assumption may not always hold [144]. Specifically, as summarized above, heading information is known to be distributed across at least two areas in the macaque cortex—MSTd and VIP. Surprisingly, although area VIP contains neurons with stronger choice-related activity than MSTd, inactivating VIP had no observable effect on discrimination thresholds [144], whereas MSTd inactivation did impair performance (figure 6).
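
A toy construction, sketched below, conveys the essence of this argument: when two areas carry redundant sensory signals, an optimal linear readout can assign negligible weight to one of them, so silencing that area leaves behavioural precision unchanged even though its activity correlates strongly with the choice. The simulation is our own illustration, not the analysis of [144], and it makes no attempt to reproduce the relative magnitudes of choice-related activity reported for VIP and MSTd.

```python
import numpy as np

# Toy illustration: an area whose activity correlates strongly with the choice can
# nonetheless receive negligible readout weight, so 'inactivating' it has little
# effect on decoding precision. All noise levels are arbitrary assumptions.

rng = np.random.default_rng(1)
n_trials = 20000
heading = rng.normal(scale=2.0, size=n_trials)             # heading signal (deg)

x_mstd = heading + rng.normal(scale=1.0, size=n_trials)    # 'MSTd' population signal
x_vip = x_mstd + rng.normal(scale=1.0, size=n_trials)      # 'VIP': redundant copy plus extra noise

# Least-squares readout of heading from both areas (the learned decision weights).
X = np.column_stack([x_mstd, x_vip])
w, *_ = np.linalg.lstsq(X, heading, rcond=None)
choice = X @ w

print("readout weights (MSTd, VIP):", np.round(w, 2))       # VIP weight is near zero
print("choice correlation, MSTd   :", round(np.corrcoef(choice, x_mstd)[0, 1], 2))
print("choice correlation, VIP    :", round(np.corrcoef(choice, x_vip)[0, 1], 2))

# 'Inactivation': silence one area under frozen readout weights and measure the
# residual decoding error, a crude proxy for the behavioural threshold.
def decoding_error(mask):
    return np.std(heading - (X * mask) @ w)

print("intact error  :", round(decoding_error(np.array([1.0, 1.0])), 2))
print("VIP silenced  :", round(decoding_error(np.array([1.0, 0.0])), 2))   # barely changes
print("MSTd silenced :", round(decoding_error(np.array([0.0, 1.0])), 2))   # degrades markedly
```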

Thus, with the development of better neuromodulatory techniques comes a greater need for theoretical advances to appropriately interpret experimental findings. As discussed, understanding the functional organization of multisensory networks remains a primary challenge, and focusing on the roles of feed-forward and feedback signalling in sensory processing may be particularly insightful in this endeavour. Indeed, neuroanatomical feedback is a hallmark of brain connectivity; yet how feedback affects early sensory properties remains mostly unexplored, and theoretical principles governing such feedback networks are largely lacking. Such theoretical developments would be especially important for exploring multisensory interactions, for which the interconnections across levels of the processing hierarchy are necessarily more extensive.

Competing interests

We declare we have no competing interests.

Funding

This work was supported by DC00720 and EY017866 (D.E.A.) and EY016178 (G.C.D.).

References

  • 1.Britten KH, Newsome WT, Shadlen MN, Celebrini S, Movshon JA. 1996. A relationship between behavioral choice and the visual responses of neurons in macaque MT. Vis. Neurosci. 13, 87–100. ( 10.1017/S095252380000715X) [DOI] [PubMed] [Google Scholar]
  • 2.Nienborg H, Cumming BG. 2014. Decision-related activity in sensory neurons may depend on the columnar architecture of cerebral cortex. J. Neurosci. 34, 3579–3585. ( 10.1523/JNEUROSCI.2340-13.2014) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 3.Hallett M. 2007. Transcranial magnetic stimulation: a primer. Neuron 55, 187–199. ( 10.1016/j.neuron.2007.06.026) [DOI] [PubMed] [Google Scholar]
  • 4.Nitsche MA, et al. 2008. Transcranial direct current stimulation: state of the art 2008. Brain Stimul. 1, 206–223. ( 10.1016/j.brs.2008.06.004) [DOI] [PubMed] [Google Scholar]
  • 5.Bolognini N, Maravita A. 2011. Uncovering multisensory processing through non-invasive brain stimulation. Front. Psychol. 2, 46 ( 10.3389/fpsyg.2011.00046) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Walsh V, Cowey A. 2000. Transcranial magnetic stimulation and cognitive neuroscience. Nat. Rev. Neurosci. 1, 73–79. ( 10.1038/35036239) [DOI] [PubMed] [Google Scholar]
  • 7.Priori A. 2003. Brain polarization in humans: a reappraisal of an old tool for prolonged non-invasive modulation of brain excitability. Clin. Neurophysiol. 114, 589–595. ( 10.1016/S1388-2457(02)00437-6) [DOI] [PubMed] [Google Scholar]
  • 8.Bolognini N, Olgiati E, Rossetti A, Maravita A. 2010. Enhancing multisensory spatial orienting by brain polarization of the parietal cortex. Eur. J. Neurosci. 31, 1800–1806. ( 10.1111/j.1460-9568.2010.07211.x) [DOI] [PubMed] [Google Scholar]
  • 9.Chambers CD, Payne JM, Mattingley JB. 2007. Parietal disruption impairs reflexive spatial attention within and between sensory modalities. Neuropsychologia 45, 1715–1724. ( 10.1016/j.neuropsychologia.2007.01.001) [DOI] [PubMed] [Google Scholar]
  • 10.Kamke MR, Vieth HE, Cottrell D, Mattingley JB. 2012. Parietal disruption alters audiovisual binding in the sound-induced flash illusion. Neuroimage 62, 1334–1341. ( 10.1016/j.neuroimage.2012.05.063) [DOI] [PubMed] [Google Scholar]
  • 11.Chambers CD, Stokes MG, Mattingley JB. 2004. Modality-specific control of strategic spatial attention in parietal cortex. Neuron 44, 925–930. ( 10.1016/j.neuron.2004.12.009) [DOI] [PubMed] [Google Scholar]
  • 12.Bolognini N, Fregni F, Casati C, Olgiati E, Vallar G. 2010. Brain polarization of parietal cortex augments training-induced improvement of visual exploratory and attentional skills. Brain Res. 1349, 76–89. ( 10.1016/j.brainres.2010.06.053) [DOI] [PubMed] [Google Scholar]
  • 13.Bolognini N, Maravita A. 2007. Proprioceptive alignment of visual and somatosensory maps in the posterior parietal cortex. Curr. Biol. 17, 1890–1895. ( 10.1016/j.cub.2007.09.057) [DOI] [PubMed] [Google Scholar]
  • 14.Merabet LB, Theoret H, Pascual-Leone A. 2003. Transcranial magnetic stimulation as an investigative tool in the study of visual function. Optom. Vis. Sci. 80, 356–368. ( 10.1097/00006324-200305000-00010) [DOI] [PubMed] [Google Scholar]
  • 15.Azanon E, Longo MR, Soto-Faraco S, Haggard P. 2010. The posterior parietal cortex remaps touch into external space. Curr. Biol. 20, 1304–1309. ( 10.1016/j.cub.2010.05.063) [DOI] [PubMed] [Google Scholar]
  • 16.Ghazanfar AA, Schroeder CE. 2006. Is neocortex essentially multisensory? Trends Cogn. Sci. 10, 278–285. ( 10.1016/j.tics.2006.04.008) [DOI] [PubMed] [Google Scholar]
  • 17.Kaas JH, Gharbawie OA, Stepniewska I. 2011. The organization and evolution of dorsal stream multisensory motor pathways in primates. Front. Neuroanat. 5, 34 ( 10.3389/fnana.2011.00034) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Sereno MI, Huang RS. 2014. Multisensory maps in parietal cortex. Curr. Opin. Neurobiol. 24, 39–46. ( 10.1016/j.conb.2013.08.014) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19.Macaluso E, Driver J. 2005. Multisensory spatial interactions: a window onto functional integration in the human brain. Trends Neurosci. 28, 264–271. ( 10.1016/j.tins.2005.03.008) [DOI] [PubMed] [Google Scholar]
  • 20.Pasalar S, Ro T, Beauchamp MS. 2010. TMS of posterior parietal cortex disrupts visual tactile multisensory integration. Eur. J. Neurosci. 31, 1783–1790. ( 10.1111/j.1460-9568.2010.07193.x) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.Fiorio M, Haggard P. 2005. Viewing the body prepares the brain for touch: effects of TMS over somatosensory cortex. Eur. J. Neurosci. 22, 773–777. ( 10.1111/j.1460-9568.2005.04267.x) [DOI] [PubMed] [Google Scholar]
  • 22.Konen CS, Haggard P. 2014. Multisensory parietal cortex contributes to visual enhancement of touch in humans: a single-pulse TMS study. Cereb. Cortex 24, 501–507. ( 10.1093/cercor/bhs331) [DOI] [PubMed] [Google Scholar]
  • 23.Ro T, Wallace R, Hagedorn J, Farne A, Pienkos E. 2004. Visual enhancing of tactile perception in the posterior parietal cortex. J. Cogn. Neurosci. 16, 24–30. ( 10.1162/089892904322755520) [DOI] [PubMed] [Google Scholar]
  • 24.Serino A, Canzoneri E, Avenanti A. 2011. Fronto-parietal areas necessary for a multisensory representation of peripersonal space in humans: an rTMS study. J. Cogn. Neurosci. 23, 2956–2967. ( 10.1162/jocn_a_00006) [DOI] [PubMed] [Google Scholar]
  • 25.McGurk H, MacDonald J. 1976. Hearing lips and seeing voices. Nature 264, 746–748. ( 10.1038/264746a0) [DOI] [PubMed] [Google Scholar]
  • 26.Beauchamp MS. 2005. See me, hear me, touch me: multisensory integration in lateral occipital-temporal cortex. Curr. Opin. Neurobiol. 15, 145–153. ( 10.1016/j.conb.2005.03.011) [DOI] [PubMed] [Google Scholar]
  • 27.Perrodin C, Kayser C, Logothetis NK, Petkov CI. 2014. Auditory and visual modulation of temporal lobe neurons in voice-sensitive and association cortices. J. Neurosci. 34, 2524–2537. ( 10.1523/JNEUROSCI.2805-13.2014) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Beauchamp MS, Nath AR, Pasalar S. 2010. fMRI-Guided transcranial magnetic stimulation reveals that the superior temporal sulcus is a cortical locus of the McGurk effect. J. Neurosci. 30, 2414–2417. ( 10.1523/JNEUROSCI.4865-09.2010) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.Marques LM, Lapenta OM, Merabet LB, Bolognini N, Boggio PS. 2014. Tuning and disrupting the brain-modulating the McGurk illusion with electrical stimulation. Front. Hum. Neurosci. 8, 533 ( 10.3389/fnhum.2014.00533) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Jones JA, Callan DE. 2003. Brain activity during audiovisual speech perception: an fMRI study of the McGurk effect. Neuroreport 14, 1129–1133. ( 10.1097/00001756-200306110-00006) [DOI] [PubMed] [Google Scholar]
  • 31.Kilian-Hutten N, Vroomen J, Formisano E. 2011. Brain activation during audiovisual exposure anticipates future perception of ambiguous speech. Neuroimage 57, 1601–1607. ( 10.1016/j.neuroimage.2011.05.043) [DOI] [PubMed] [Google Scholar]
  • 32.Bolognini N, Rossetti A, Casati C, Mancini F, Vallar G. 2011. Neuromodulation of multisensory perception: a tDCS study of the sound-induced flash illusion. Neuropsychologia 49, 231–237. ( 10.1016/j.neuropsychologia.2010.11.015) [DOI] [PubMed] [Google Scholar]
  • 33.Shams L, Kamitani Y, Shimojo S. 2000. Illusions. What you see is what you hear. Nature 408, 788 ( 10.1038/35048669) [DOI] [PubMed] [Google Scholar]
  • 34.Watkins S, Shams L, Josephs O, Rees G. 2007. Activity in human V1 follows multisensory perception. Neuroimage 37, 572–578. ( 10.1016/j.neuroimage.2007.05.027) [DOI] [PubMed] [Google Scholar]
  • 35.Watkins S, Shams L, Tanaka S, Haynes JD, Rees G. 2006. Sound alters activity in human V1 in association with illusory visual perception. Neuroimage 31, 1247–1256. ( 10.1016/j.neuroimage.2006.01.016) [DOI] [PubMed] [Google Scholar]
  • 36.Driver J, Noesselt T. 2008. Multisensory interplay reveals crossmodal influences on ‘sensory-specific’ brain regions, neural responses, and judgments. Neuron 57, 11–23. ( 10.1016/j.neuron.2007.12.013) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37.Connor CE, Brincat SL, Pasupathy A. 2007. Transformation of shape information in the ventral pathway. Curr. Opin. Neurobiol. 17, 140–147. ( 10.1016/j.conb.2007.03.002) [DOI] [PubMed] [Google Scholar]
  • 38.Kourtzi Z, Connor CE. 2011. Neural representations for object perception: structure, category, and adaptive coding. Annu. Rev. Neurosci. 34, 45–67. ( 10.1146/annurev-neuro-060909-153218) [DOI] [PubMed] [Google Scholar]
  • 39.Amedi A, Malach R, Hendler T, Peled S, Zohary E. 2001. Visuo-haptic object-related activation in the ventral visual pathway. Nat. Neurosci. 4, 324–330. ( 10.1038/85201) [DOI] [PubMed] [Google Scholar]
  • 40.James TW, Humphrey GK, Gati JS, Servos P, Menon RS, Goodale MA. 2002. Haptic study of three-dimensional objects activates extrastriate visual areas. Neuropsychologia 40, 1706–1714. ( 10.1016/S0028-3932(02)00017-9) [DOI] [PubMed] [Google Scholar]
  • 41.Haenny PE, Maunsell JH, Schiller PH. 1988. State dependent activity in monkey visual cortex II. Retinal and extraretinal factors in V4. Exp. Brain Res. 69, 245–259. ( 10.1007/BF00247570) [DOI] [PubMed] [Google Scholar]
  • 42.Merabet LB, Hamilton R, Schlaug G, Swisher JD, Kiriakopoulos ET, Pitskel NB, Kauffman T, Pascual-Leone A. 2008. Rapid and reversible recruitment of early visual cortex for touch. PLoS ONE 3, e3046 ( 10.1371/journal.pone.0003046) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 43.Pascual-Leone A, Hamilton R. 2001. The metamodal organization of the brain. Prog. Brain Res. 134, 427–445. [DOI] [PubMed] [Google Scholar]
  • 44.Merabet L, Thut G, Murray B, Andrews J, Hsiao S, Pascual-Leone A. 2004. Feeling by sight or seeing by touch? Neuron 42, 173–179. ( 10.1016/S0896-6273(04)00147-3) [DOI] [PubMed] [Google Scholar]
  • 45.Yau JM, Celnik P, Hsiao SS, Desmond JE. 2014. Feeling better: separate pathways for targeted enhancement of spatial and temporal touch. Psychol. Sci. 25, 555–565. ( 10.1177/0956797613511467) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 46.Antal A, Nitsche MA, Paulus W. 2001. External modulation of visual perception in humans. Neuroreport 12, 3553–3555. ( 10.1097/00001756-200111160-00036) [DOI] [PubMed] [Google Scholar]
  • 47.Zangaladze A, Epstein CM, Grafton ST, Sathian K. 1999. Involvement of visual cortex in tactile discrimination of orientation. Nature 401, 587–590. ( 10.1038/44139) [DOI] [PubMed] [Google Scholar]
  • 48.Lacey S, Tal N, Amedi A, Sathian K. 2009. A putative model of multisensory object representation. Brain Topogr. 21, 269–274. ( 10.1007/s10548-009-0087-4) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 49.Mancini F, Bolognini N, Bricolo E, Vallar G. 2011. Cross-modal processing in the occipito-temporal cortex: a TMS study of the Muller-Lyer illusion. J. Cogn. Neurosci. 23, 1987–1997. ( 10.1162/jocn.2010.21561) [DOI] [PubMed] [Google Scholar]
  • 50.Blake R, Sobel KV, James TW. 2004. Neural synergy between kinetic vision and touch. Psychol. Sci. 15, 397–402. ( 10.1111/j.0956-7976.2004.00691.x) [DOI] [PubMed] [Google Scholar]
  • 51.van Kemenade BM, Seymour K, Wacker E, Spitzer B, Blankenburg F, Sterzer P. 2014. Tactile and visual motion direction processing in hMT+/V5. Neuroimage 84, 420–427. ( 10.1016/j.neuroimage.2013.09.004) [DOI] [PubMed] [Google Scholar]
  • 52.Wacker E, Spitzer B, Lutzkendorf R, Bernarding J, Blankenburg F. 2011. Tactile motion and pattern processing assessed with high-field fMRI. PLoS ONE 6, e24860 ( 10.1371/journal.pone.0024860) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 53.Basso D, Pavan A, Ricciardi E, Fagioli S, Vecchi T, Miniussi C, Pietrini P. 2012. Touching motion: rTMS on the human middle temporal complex interferes with tactile speed perception. Brain Topogr. 25, 389–398. ( 10.1007/s10548-012-0223-4) [DOI] [PubMed] [Google Scholar]
  • 54.Kayser C, Petkov CI, Augath M, Logothetis NK. 2005. Integration of touch and sound in auditory cortex. Neuron 48, 373–384. ( 10.1016/j.neuron.2005.09.018) [DOI] [PubMed] [Google Scholar]
  • 55.Fu KM, Johnston TA, Shah AS, Arnold L, Smiley J, Hackett TA, Garraghty PE, Schroeder CE. 2003. Auditory cortical neurons respond to somatosensory stimulation. J. Neurosci. 23, 7510–7515. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 56.Butler JS, Foxe JJ, Fiebelkorn IC, Mercier MR, Molholm S. 2012. Multisensory representation of frequency across audition and touch: high density electrical mapping reveals early sensory-perceptual coupling. J. Neurosci. 32, 15 338–15 344. ( 10.1523/JNEUROSCI.1796-12.2012) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 57.Renier LA, Anurova I, De Volder AG, Carlson S, VanMeter J, Rauschecker JP. 2009. Multisensory integration of sounds and vibrotactile stimuli in processing streams for ‘what’ and ‘where’. J. Neurosci. 29, 10 950–10 960. ( 10.1523/JNEUROSCI.0910-09.2009) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 58.Nordmark PF, Pruszynski JA, Johansson RS. 2012. BOLD responses to tactile stimuli in visual and auditory cortex depend on the frequency content of stimulation. J. Cogn. Neurosci. 24, 2120–2134. ( 10.1162/jocn_a_00261) [DOI] [PubMed] [Google Scholar]
  • 59.Ladeira A, Fregni F, Campanha C, Valasek CA, De Ridder D, Brunoni AR, Boggio PS. 2011. Polarity-dependent transcranial direct current stimulation effects on central auditory processing. PLoS ONE 6, e25399 ( 10.1371/journal.pone.0025399) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 60.Bolognini N, Papagno C, Moroni D, Maravita A. 2010. Tactile temporal processing in the auditory cortex. J. Cogn. Neurosci. 22, 1201–1211. ( 10.1162/jocn.2009.21267) [DOI] [PubMed] [Google Scholar]
  • 61.Bolognini N, Cecchetto C, Geraci C, Maravita A, Pascual-Leone A, Papagno C. 2011. Hearing shapes our perception of time: temporal discrimination of tactile stimuli in deaf people. J. Cogn. Neurosci. 24, 276–286. ( 10.1162/jocn_a_00135) [DOI] [PubMed] [Google Scholar]
  • 62.Niedenthal PM. 2007. Embodying emotion. Science 316, 1002–1005. ( 10.1126/science.1136930) [DOI] [PubMed] [Google Scholar]
  • 63.Pitcher D, Garrido L, Walsh V, Duchaine BC. 2008. Transcranial magnetic stimulation disrupts the perception and embodiment of facial expressions. J. Neurosci. 28, 8929–8933. ( 10.1523/JNEUROSCI.1450-08.2008) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 64.Pourtois G, Sander D, Andres M, Grandjean D, Reveret L, Olivier E, Vuilleumier P. 2004. Dissociable roles of the human somatosensory and superior temporal cortices for processing social face signals. Eur. J. Neurosci. 20, 3507–3515. ( 10.1111/j.1460-9568.2004.03794.x) [DOI] [PubMed] [Google Scholar]
  • 65.Bolognini N, Rossetti A, Maravita A, Miniussi C. 2011. Seeing touch in the somatosensory cortex: a TMS study of the visual perception of touch. Hum. Brain. Mapp. 32, 2104–2114. ( 10.1002/hbm.21172) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 66.Ray PG, Meador KJ, Epstein CM, Loring DW, Day LJ. 1998. Magnetic stimulation of visual cortex: factors influencing the perception of phosphenes. J. Clin. Neurophysiol. 15, 351–357. ( 10.1097/00004691-199807000-00007) [DOI] [PubMed] [Google Scholar]
  • 67.Ramos-Estebanez C, Merabet LB, Machii K, Fregni F, Thut G, Wagner TA, Romei V, Amedi A, Pascual-Leone A. 2007. Visual phosphene perception modulated by subthreshold crossmodal sensory stimulation. J. Neurosci. 27, 4178–4181. ( 10.1523/JNEUROSCI.5468-06.2007) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 68.Romei V, Murray MM, Merabet LB, Thut G. 2007. Occipital transcranial magnetic stimulation has opposing effects on visual and auditory stimulus detection: implications for multisensory interactions. J. Neurosci. 27, 11 465–11 472. ( 10.1523/JNEUROSCI.2827-07.2007) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 69.Romei V, Murray MM, Cappe C, Thut G. 2009. Preperceptual and stimulus-selective enhancement of low-level human visual cortex excitability by sounds. Curr. Biol. 19, 1799–1805. ( 10.1016/j.cub.2009.09.027) [DOI] [PubMed] [Google Scholar]
  • 70.Convento S, Vallar G, Galantini C, Bolognini N. 2013. Neuromodulation of early multisensory interactions in the visual cortex. J. Cogn. Neurosci. 25, 685–696. ( 10.1162/jocn_a_00347) [DOI] [PubMed] [Google Scholar]
  • 71.Lomber SG, Payne BR, Horel JA. 1999. The cryoloop: an adaptable reversible cooling deactivation method for behavioral or electrophysiological assessment of neural function. J. Neurosci. Methods 86, 179–194. ( 10.1016/S0165-0270(98)00165-4) [DOI] [PubMed] [Google Scholar]
  • 72.Malpeli JG. 1999. Reversible inactivation of subcortical sites by drug injection. J. Neurosci. Methods 86, 119–128. ( 10.1016/S0165-0270(98)00161-7) [DOI] [PubMed] [Google Scholar]
  • 73.Histed MH, Ni AM, Maunsell JH. 2013. Insights into cortical mechanisms of behavior from microstimulation experiments. Prog. Neurobiol. 103, 115–130. ( 10.1016/j.pneurobio.2012.01.006) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 74.Fenno L, Yizhar O, Deisseroth K. 2011. The development and application of optogenetics. Annu. Rev. Neurosci. 34, 389–412. ( 10.1146/annurev-neuro-061010-113817) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 75.Krueger J, Royal DW, Fister MC, Wallace MT. 2009. Spatial receptive field organization of multisensory neurons and its impact on multisensory interactions. Hear. Res. 258, 47–54. ( 10.1016/j.heares.2009.08.003) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 76.Stein BE, Rowland BA. 2011. Organization and plasticity in multisensory integration: early and late experience affects its governing principles. Prog. Brain Res. 191, 145–163. ( 10.1016/b978-0-444-53752-2.00007-2) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 77.Stein BE. 1998. Neural mechanisms for synthesizing sensory information and producing adaptive behaviors. Exp. Brain Res. 123, 124–135. ( 10.1007/s002210050553) [DOI] [PubMed] [Google Scholar]
  • 78.Stein BE, Stanford TR, Rowland BA. 2014. Development of multisensory integration from the perspective of the individual neuron. Nat. Rev. Neurosci. 15, 520–535. ( 10.1038/nrn3742) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 79.Stein BE, Meredith MA. 2004. The merging of the senses. Cambridge, MA: MIT Press. [Google Scholar]
  • 80.Stein BE, Stanford TR. 2008. Multisensory integration: current issues from the perspective of the single neuron. Nat. Rev. Neurosci. 9, 255–266. ( 10.1038/nrn2331) [DOI] [PubMed] [Google Scholar]
  • 81.Beauchamp MS. 2005. Statistical criteria in fMRI studies of multisensory integration. Neuroinformatics 3, 93–113. ( 10.1385/NI:3:2:093) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 82.Laurienti PJ, Perrault TJ, Stanford TR, Wallace MT, Stein BE. 2005. On the use of superadditivity as a metric for characterizing multisensory integration in functional neuroimaging studies. Exp. Brain Res. 166, 289–297. ( 10.1007/s00221-005-2370-2) [DOI] [PubMed] [Google Scholar]
  • 83.Burnett LR, Stein BE, Chaponis D, Wallace MT. 2004. Superior colliculus lesions preferentially disrupt multisensory orientation. Neuroscience 124, 535–547. ( 10.1016/j.neuroscience.2003.12.026) [DOI] [PubMed] [Google Scholar]
  • 84.Meredith MA. 1999. The frontal eye fields target multisensory neurons in cat superior colliculus. Exp. Brain Res. 128, 460–470. ( 10.1007/s002210050869) [DOI] [PubMed] [Google Scholar]
  • 85.Rowland BA, Stanford TR, Stein BE. 2007. A model of the neural mechanisms underlying multisensory integration in the superior colliculus. Perception 36, 1431–1443. ( 10.1068/p5842) [DOI] [PubMed] [Google Scholar]
  • 86.Wallace MT, Meredith MA, Stein BE. 1993. Converging influences from visual, auditory, and somatosensory cortices onto output neurons of the superior colliculus. J. Neurophysiol. 69, 1797–1809. [DOI] [PubMed] [Google Scholar]
  • 87.Wilkinson LK, Meredith MA, Stein BE. 1996. The role of anterior ectosylvian cortex in cross-modality orientation and approach behavior. Exp. Brain Res. 112, 1–10. ( 10.1007/BF00227172) [DOI] [PubMed] [Google Scholar]
  • 88.Jiang W, Wallace MT, Jiang H, Vaughan JW, Stein BE. 2001. Two cortical areas mediate multisensory integration in superior colliculus neurons. J. Neurophysiol. 85, 506–522. [DOI] [PubMed] [Google Scholar]
  • 89.Alvarado JC, Stanford TR, Rowland BA, Vaughan JW, Stein BE. 2009. Multisensory integration in the superior colliculus requires synergy among corticocollicular inputs. J. Neurosci. 29, 6580–6592. ( 10.1523/JNEUROSCI.0525-09.2009) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 90.Rowland BA, Jiang W, Stein BE. 2014. Brief cortical deactivation early in life has long-lasting effects on multisensory behavior. J. Neurosci. 34, 7198–7202. ( 10.1523/JNEUROSCI.3782-13.2014) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 91.Reynolds JH, Chelazzi L. 2004. Attentional modulation of visual processing. Annu. Rev. Neurosci. 27, 611–647. ( 10.1146/annurev.neuro.26.041002.131039) [DOI] [PubMed] [Google Scholar]
  • 92.Moore T, Armstrong KM. 2003. Selective gating of visual signals by microstimulation of frontal cortex. Nature 421, 370–373. ( 10.1038/nature01341) [DOI] [PubMed] [Google Scholar]
  • 93.Armstrong KM, Fitzgerald JK, Moore T. 2006. Changes in visual receptive fields with microstimulation of frontal cortex. Neuron 50, 791–798. ( 10.1016/j.neuron.2006.05.010) [DOI] [PubMed] [Google Scholar]
  • 94.Winkowski DE, Knudsen EI. 2006. Top-down gain control of the auditory space map by gaze control circuitry in the barn owl. Nature 439, 336–339. ( 10.1038/nature04411) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 95.Winkowski DE, Knudsen EI. 2007. Top-down control of multimodal sensitivity in the barn owl optic tectum. J. Neurosci. 27, 13 279–13 291. ( 10.1523/JNEUROSCI.3937-07.2007) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 96.Mysore SP, Knudsen EI. 2013. A shared inhibitory circuit for both exogenous and endogenous control of stimulus selection. Nat. Neurosci. 16, 473–478. ( 10.1038/nn.3352) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 97.Reches A, Gutfreund Y. 2009. Auditory and multisensory responses in the tectofugal pathway of the barn owl. J. Neurosci. 29, 9602–9613. ( 10.1523/JNEUROSCI.6117-08.2009) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 98.Oertel D, Young ED. 2004. What's a cerebellar circuit doing in the auditory system? Trends Neurosci. 27, 104–110. ( 10.1016/j.tins.2003.12.001) [DOI] [PubMed] [Google Scholar]
  • 99.Young ED, Nelken I, Conley RA. 1995. Somatosensory effects on neurons in dorsal cochlear nucleus. J. Neurophysiol. 73, 743–765. [DOI] [PubMed] [Google Scholar]
  • 100.Davis KA, Miller RL, Young ED. 1996. Effects of somatosensory and parallel-fiber stimulation on neurons in dorsal cochlear nucleus. J. Neurophysiol. 76, 3012–3024. [DOI] [PubMed] [Google Scholar]
  • 101.Shore SE. 2005. Multisensory integration in the dorsal cochlear nucleus: unit responses to acoustic and trigeminal ganglion stimulation. Eur. J. Neurosci. 21, 3334–3348. ( 10.1111/j.1460-9568.2005.04142.x) [DOI] [PubMed] [Google Scholar]
  • 102.Kanold PO, Young ED. 2001. Proprioceptive information from the pinna provides somatosensory input to cat dorsal cochlear nucleus. J. Neurosci. 21, 7848–7858. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 103.Shore SE, Zhou J. 2006. Somatosensory influence on the cochlear nucleus and beyond. Hear. Res. 216–217, 90–99. ( 10.1016/j.heares.2006.01.006) [DOI] [PubMed] [Google Scholar]
  • 104.Lomber SG, Meredith MA, Kral A. 2010. Cross-modal plasticity in specific auditory cortices underlies visual compensations in the deaf. Nat. Neurosci. 13, 1421–1427. ( 10.1038/nn.2653) [DOI] [PubMed] [Google Scholar]
  • 105.Meredith MA, Kryklywy J, McMillan AJ, Malhotra S, Lum-Tai R, Lomber SG. 2011. Crossmodal reorganization in the early deaf switches sensory, but not behavioral roles of auditory cortex. Proc. Natl Acad. Sci. USA 108, 8856–8861. ( 10.1073/pnas.1018519108) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 106.Bavelier D, Tomann A, Hutton C, Mitchell T, Corina D, Liu G, Neville H. 2000. Visual attention to the periphery is enhanced in congenitally deaf individuals. J. Neurosci. 20, RC93. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 107.Angelaki DE, Gu Y, DeAngelis GC. 2009. Multisensory integration: psychophysics, neurophysiology, and computation. Curr. Opin. Neurobiol. 19, 452–458. ( 10.1016/j.conb.2009.06.008) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 108.Ernst MO, Bulthoff HH. 2004. Merging the senses into a robust percept. Trends Cogn. Sci. 8, 162–169. ( 10.1016/j.tics.2004.02.002) [DOI] [PubMed] [Google Scholar]
  • 109.Ernst MO, Banks MS. 2002. Humans integrate visual and haptic information in a statistically optimal fashion. Nature 415, 429–433. ( 10.1038/415429a) [DOI] [PubMed] [Google Scholar]
  • 110.Hillis JM, Ernst MO, Banks MS, Landy MS. 2002. Combining sensory information: mandatory fusion within, but not between, senses. Science 298, 1627–1630. ( 10.1126/science.1075396) [DOI] [PubMed] [Google Scholar]
  • 111.Alais D, Burr D. 2004. The ventriloquist effect results from near-optimal bimodal integration. Curr. Biol. 14, 257–262. ( 10.1016/j.cub.2004.01.029) [DOI] [PubMed] [Google Scholar]
  • 112.Landy MS, Kojima H. 2001. Ideal cue combination for localizing texture-defined edges. J. Opt. Soc. Am A 18, 2307–2320. ( 10.1364/JOSAA.18.002307) [DOI] [PubMed] [Google Scholar]
  • 113.Landy MS, Maloney LT, Johnston EB, Young M. 1995. Measurement and modeling of depth cue combination: in defense of weak fusion. Vis. Res. 35, 389–412. ( 10.1016/0042-6989(94)00176-M) [DOI] [PubMed] [Google Scholar]
  • 114.Ma WJ, Zhou X, Ross LA, Foxe JJ, Parra LC. 2009. Lip-reading aids word recognition most in moderate noise: a Bayesian explanation using high-dimensional feature space. PLoS ONE 4, e4638 ( 10.1371/journal.pone.0004638) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 115.Fetsch CR, DeAngelis GC, Angelaki DE. 2013. Bridging the gap between theories of sensory cue integration and the physiology of multisensory neurons. Nat. Rev. Neurosci. 14, 429–442. ( 10.1038/nrn3503) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 116.Gu Y, Angelaki DE, DeAngelis GC. 2008. Neural correlates of multisensory cue integration in macaque MSTd. Nat. Neurosci. 11, 1201–1210. ( 10.1038/nn.2191) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 117.Gu Y, DeAngelis GC, Angelaki DE. 2012. Causal links between dorsal medial superior temporal area neurons and multisensory heading perception. J. Neurosci. 32, 2299–2313. ( 10.1523/JNEUROSCI.5154-11.2012) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 118.Fetsch CR, Pouget A, DeAngelis GC, Angelaki DE. 2012. Neural correlates of reliability-based cue weighting during multisensory integration. Nat. Neurosci. 15, 146–154. ( 10.1038/nn.2983) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 119.Chen A, DeAngelis GC, Angelaki DE. 2010. Macaque parieto-insular vestibular cortex: responses to self-motion and optic flow. J. Neurosci. 30, 3022–3042. ( 10.1523/JNEUROSCI.4029-09.2010) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 120.Chen A, DeAngelis GC, Angelaki DE. 2011. Convergence of vestibular and visual self-motion signals in an area of the posterior sylvian fissure. J. Neurosci. 31, 11 617–11 627. ( 10.1523/JNEUROSCI.1266-11.2011) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 121.Chen A, DeAngelis GC, Angelaki DE. 2011. Representation of vestibular and visual cues to self-motion in ventral intraparietal cortex. J. Neurosci. 31, 12 036–12 052. ( 10.1523/JNEUROSCI.0395-11.2011) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 122.Chen A, DeAngelis GC, Angelaki DE. 2013. Functional specializations of the ventral intraparietal area for multisensory heading discrimination. J. Neurosci. 33, 3567–3581. ( 10.1523/JNEUROSCI.4522-12.2013) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 123.Avillac M, Deneve S, Olivier E, Pouget A, Duhamel JR. 2005. Reference frames for representing visual and tactile locations in parietal cortex. Nat. Neurosci 8, 941–949. ( 10.1038/nn1480) [DOI] [PubMed] [Google Scholar]
  • 124.Bremmer F, Klam F, Duhamel JR, Ben Hamed S, Graf W. 2002. Visual-vestibular interactive responses in the macaque ventral intraparietal area (VIP). Eur. J. Neurosci. 16, 1569–1586. ( 10.1046/j.1460-9568.2002.02206.x) [DOI] [PubMed] [Google Scholar]
  • 125.Schlack A, Sterbing-D'Angelo SJ, Hartung K, Hoffmann KP, Bremmer F. 2005. Multisensory space representations in the macaque ventral intraparietal area. J. Neurosci. 25, 4616–4625. ( 10.1523/JNEUROSCI.0455-05.2005) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 126.Cooke DF, Taylor CS, Moore T, Graziano MS. 2003. Complex movements evoked by microstimulation of the ventral intraparietal area. Proc. Natl Acad. Sci. USA 100, 6163–6168. ( 10.1073/pnas.1031751100) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 127.Raposo D, Kaufman MT, Churchland AK. 2014. A category-free neural population supports evolving demands during decision-making. Nat. Neurosci. 17, 1784–1792. ( 10.1038/nn.3865) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 128.Dadarlat MC, O'Doherty JE, Sabes PN. 2014. A learning-based approach to artificial sensory feedback leads to optimal integration. Nat. Neurosci. 18, 138–144. ( 10.1038/nn.3883) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 129.Olcese U, Iurilli G, Medini P. 2013. Cellular and synaptic architecture of multisensory integration in the mouse neocortex. Neuron 79, 579–593. ( 10.1016/j.neuron.2013.06.010) [DOI] [PubMed] [Google Scholar]
  • 130.Driver J, Blankenburg F, Bestmann S, Vanduffel W, Ruff CC. 2009. Concurrent brain-stimulation and neuroimaging for studies of cognition. Trends Cogn. Sci. 13, 319–327. ( 10.1016/j.tics.2009.04.007) [DOI] [PubMed] [Google Scholar]
  • 131.Yau JM, Hua J, Liao DA, Desmond JE. 2013. Efficient and robust identification of cortical targets in concurrent TMS-fMRI experiments. Neuroimage 76, 134–144. ( 10.1016/j.neuroimage.2013.02.077) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 132.Miniussi C, Thut G. 2010. Combining TMS and EEG offers new prospects in cognitive neuroscience. Brain Topogr. 22, 249–256. ( 10.1007/s10548-009-0083-8) [DOI] [PubMed] [Google Scholar]
  • 133.Buzsaki G, Draguhn A. 2004. Neuronal oscillations in cortical networks. Science 304, 1926–1929. ( 10.1126/science.1099745) [DOI] [PubMed] [Google Scholar]
  • 134.Senkowski D, Schneider TR, Foxe JJ, Engel AK. 2008. Crossmodal binding through neural coherence: implications for multisensory processing. Trends Neurosci. 31, 401–409. ( 10.1016/j.tins.2008.05.002) [DOI] [PubMed] [Google Scholar]
  • 135.Talsma D, Senkowski D, Soto-Faraco S, Woldorff MG. 2010. The multifaceted interplay between attention and multisensory integration. Trends Cogn. Sci. 14, 400–410. ( 10.1016/j.tics.2010.06.008) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 136.van Atteveldt N, Murray MM, Thut G, Schroeder CE. 2014. Multisensory integration: flexible use of general operations. Neuron 81, 1240–1253. ( 10.1016/j.neuron.2014.02.044) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 137.Thut G, Miniussi C, Gross J. 2012. The functional importance of rhythmic activity in the brain. Curr. Biol. 22, R658–R663. ( 10.1016/j.cub.2012.06.061) [DOI] [PubMed] [Google Scholar]
  • 138.Silvanto J, Pascual-Leone A. 2008. State-dependency of transcranial magnetic stimulation. Brain Topogr. 21, 1–10. ( 10.1007/s10548-008-0067-0) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 139.Diester I, Kaufman MT, Mogri M, Pashaie R, Goo W, Yizhar O, Ramakrishnan C, Deisseroth K, Shenoy KV. 2011. An optogenetic toolbox designed for primates. Nat. Neurosci. 14, 387–397. ( 10.1038/nn.2749) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 140.Gerits A, Vanduffel W. 2013. Optogenetics in primates: a shining future? Trends Genet 29, 403–411. ( 10.1016/j.tig.2013.03.004) [DOI] [PubMed] [Google Scholar]
  • 141.Ozden I, et al. 2013. A coaxial optrode as multifunction write-read probe for optogenetic studies in non-human primates. J. Neurosci. Methods 219, 142–154. ( 10.1016/j.jneumeth.2013.06.011) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 142.Ruiz O, Lustig BR, Nassi JJ, Cetin A, Reynolds JH, Albright TD, Callaway EM, Stoner GR, Roe AW. 2013. Optogenetics through windows on the brain in the nonhuman primate. J. Neurophysiol. 110, 1455–1467. ( 10.1152/jn.00153.2013) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 143.Dai J, Brooks DI, Sheinberg DL. 2014. Optogenetic and electrical microstimulation systematically bias visuospatial choice in primates. Curr. Biol. 24, 63–69. ( 10.1016/j.cub.2013.11.011) [DOI] [PubMed] [Google Scholar]
  • 144.Lakshminarasimhan K, Liu S, Klier EM, Gu Y, Pitkow X, DeAngelis GC, et al. 2014. Dissecting the contributions of area MSTd and VIP to heading perception. Computational and Systems Neuroscience Meeting (Cosyne), 27 February–2 March, Salt Lake City, UT. [Google Scholar]
