The Journal of Neuroscience. 2008 Feb 20;28(8):1787–1788. doi: 10.1523/JNEUROSCI.5368-07.2008

The Influences of Associative Cortices on Cross-Modal Integration in the Superior Colliculus

Etienne Vachon-Presseau and Luke Henry
PMCID: PMC6671439  PMID: 18287494

The creation of robust percepts is a key function of the CNS, necessary to overcome everyday challenges. To this end, the CNS possesses the capacity to synthesize spatially related unimodal stimuli into multisensory events. More specifically, many neurons of the superior colliculus (SC) have the capacity to govern multisensory integration and mediate orienting behaviors toward multisensory targets (Stein, 1998). In true gestalt manner, many of these neurons' responses to different spatially congruent sensory stimuli (e.g., visual and auditory) exceed the larger of the unimodal responses (Meredith and Stein, 1986) or even the arithmetic sum of the unimodal responses (King and Palmer, 1985). Moreover, multisensory integration in single SC neurons is an important predictor of the animal's behavior. For example, cats trained to orient toward a briefly illuminated light-emitting diode showed an increased success rate when a spatially coincident auditory cue was presented simultaneously (Stein et al., 1988). However, many studies have demonstrated that this multimodal integration is not computed exclusively within the SC, but is instead processed through a distributed neural network that involves the midbrain as well as cortical areas. Indeed, reversible cooling deactivation (Jiang et al., 2001) of the cat's anterior ectosylvian sulcus (AES) and rostral lateral suprasylvian sulcus (rLS) resulted in the loss of SC neurons' cross-modal integration abilities.
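The enhancement criterion described above can be made concrete with a brief sketch. The percent-enhancement index follows the convention used in the multisensory literature (comparing the combined response with the best unimodal response); the firing rates below are hypothetical and are not taken from the studies under discussion.

```python
def enhancement_index(visual, auditory, combined):
    """Percent multisensory enhancement: positive values indicate that the
    cross-modal response exceeds the larger of the two unimodal responses."""
    best_unimodal = max(visual, auditory)
    return 100.0 * (combined - best_unimodal) / best_unimodal

# Hypothetical mean responses (spikes per trial) for illustration only.
v, a, va = 4.0, 3.0, 12.0

idx = enhancement_index(v, a, va)   # 100 * (12 - 4) / 4 = 200.0 (% enhancement)

# A response exceeding the arithmetic sum of the unimodal responses
# (here, 12 > 4 + 3) is the stronger, superadditive form of enhancement.
superadditive = va > (v + a)
```

Under this convention, the inverse-effectiveness principle discussed below corresponds to the index growing larger as the unimodal responses shrink toward threshold.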

In a recent study, Alvarado et al. (2007a) observed that unisensory and multisensory integration are governed by different rules. Single-unit recordings in the deep layers of the cat's SC revealed that, generally, within-modal (e.g., two visual stimuli) integration produces response depression, whereas cross-modal (e.g., visual and auditory stimuli) integration produces response enhancement. Segregating these two mechanisms brought new evidence supporting a parallel organization, in which within-modal depression could serve primarily to focus on a single target feature among potential distracters, whereas multimodal enhancement would more likely improve stimulus detection. This raised the question of whether, and how, the neuronal networks underlying unimodal and multimodal processing differ from each other.

Alvarado et al. (2007b) recently published in The Journal of Neuroscience an attempt to resolve this issue by exploring potential differences in cortical influences over multisensory and unisensory neuronal integration in the deep layers of the SC. For this purpose, the authors deactivated the AES and the rLS of anesthetized cats using two cooling coils. Visual–auditory multisensory neurons (n = 45) and visual unisensory neurons (n = 25) were tested with two different light bars (V1 and V2), which were either moving or stationary, and a white-noise burst (A). Each neuron was tested with the three modality-specific stimuli presented alone (V1, V2, and A) and with combined stimuli: V1 with V2, V1 with A, and V2 with A. Because multisensory enhancement is most robust when the stimuli are presented at weak intensities [a phenomenon referred to as inverse effectiveness (Meredith and Stein, 1986)], the complete protocol was run at three levels of stimulation (threshold intensity, maximal-response intensity, and an intermediate intensity near the midpoint of the dynamic range) in each experimental condition (control, cortical deactivation, and cortical reactivation).

In the control condition, the majority of the multisensory neurons (89%, 40 of 45) showed cross-modal integration. Moreover, most of these neurons (77.5%, 31 of 40) followed the principle of inverse effectiveness. Finally, within-modal stimulation usually produced responses substantially lower than the sum of their components in both unisensory and multisensory neurons.

The results of the present study represent two major advances in the general understanding of multisensory integration in the SC. First, AES and rLS deactivation altered SC multisensory neurons' capacity to integrate cross-modal (V–A), but not within-modal (V1–V2), information. Corroborating the results of Jiang et al. (2001), most of the multisensory neurons (77.5%, 31 of 40) lost their cross-modal integration abilities after cortical deactivation and recovered them after cortical reactivation. In sharp contrast, cortical deactivation had little or no effect on within-modal integration in either multisensory or unisensory SC neurons. This suggests that the AES and rLS specifically target cross-modal, but not within-modal, integration in multisensory SC neurons. Moreover, the results show that cortical deactivation abolishes inverse effectiveness in cross-modal integrative neurons. Second, as illustrated by Alvarado et al. (2007b) [their Fig. 8 (http://www.jneurosci.org/cgi/content/full/27/47/12775/F8)], the authors observed different effects of cortical deactivation on the modality-specific responses (V1, V2, and A) of multisensory and unisensory neurons. The figure shows that during deactivation, multisensory neurons' firing rates decreased in response to modality-specific stimulation, whereas unisensory neurons' responses remained unchanged. Moreover, in contrast to the inverse effectiveness seen in cross-modal integration, the influence of cortical deactivation on modality-specific responses was most prominent for the most vigorous responses.

The present study demonstrates that the AES and rLS modulate cross-modal but not within-modal integration, and that modality-specific responses are affected by cortical deactivation in multisensory neurons but not in unisensory neurons. These observations further suggest that AES and rLS projections exclusively target multisensory neurons. The present study is of primary importance for understanding the wiring behind multisensory integration, not only in the SC but also, more generally, across the CNS. Indeed, recent data have revealed new avenues of research in the extrastriate cortex of the cat, in which subthreshold auditory stimulation enhanced visual responses (Allman and Meredith, 2007). It would be of interest to study the effect of the AES and rLS on such cortical areas to determine whether this form of midbrain multisensory processing is structure specific. This would help to disclose whether cross-modal integration processed within primary cortices relies on projections from associative regions, and thereby bring new evidence concerning the cortical network underlying multisensory integration.

Footnotes

Editor's Note: These short, critical reviews of recent papers in the Journal, written exclusively by graduate students or postdoctoral fellows, are intended to summarize the important findings of the paper and provide additional insight and commentary. For more information on the format and purpose of the Journal Club, please see http://www.jneurosci.org/misc/ifa_features.shtml.

References

  1. Allman BL, Meredith MA. Multisensory processing in “unimodal” neurons: cross-modal subthreshold auditory effects in cat extrastriate visual cortex. J Neurophysiol. 2007;98:545–549. doi:10.1152/jn.00173.2007.
  2. Alvarado JC, Vaughan JW, Stanford TR, Stein BE. Multisensory versus unisensory integration: contrasting modes in the superior colliculus. J Neurophysiol. 2007a;97:3193–3205. doi:10.1152/jn.00018.2007.
  3. Alvarado JC, Stanford TR, Vaughan JW, Stein BE. Cortex mediates multisensory but not unisensory integration in superior colliculus. J Neurosci. 2007b;27:12775–12786. doi:10.1523/JNEUROSCI.3524-07.2007.
  4. Jiang W, Wallace MT, Jiang H, Vaughan JW, Stein BE. Two cortical areas mediate multisensory integration in superior colliculus neurons. J Neurophysiol. 2001;85:506–522. doi:10.1152/jn.2001.85.2.506.
  5. King AJ, Palmer AR. Integration of visual and auditory information in bimodal neurones in the guinea-pig superior colliculus. Exp Brain Res. 1985;60:492–500. doi:10.1007/BF00236934.
  6. Meredith MA, Stein BE. Visual, auditory, and somatosensory convergence on cells in superior colliculus results in multisensory integration. J Neurophysiol. 1986;56:640–662. doi:10.1152/jn.1986.56.3.640.
  7. Stein BE. Neural mechanisms for synthesizing sensory information and producing adaptive behaviors. Exp Brain Res. 1998;123:124–135. doi:10.1007/s002210050553.
  8. Stein BE, Huneycutt WS, Meredith MA. Neurons and behavior: the same rules of multisensory integration apply. Brain Res. 1988;448:355–358. doi:10.1016/0006-8993(88)91276-0.
