Published in final edited form as: J Neurophysiol. 2006 Oct 25;97(1):3–4. doi: 10.1152/jn.01075.2006

Coordinating Different Sensory Inputs During Development. Focus on “Early Experience Determines How the Senses Will Interact”

Andrew J. King

Objects and events encountered in everyday life frequently generate cues that are registered by the sense organs of more than one modality. For instance, it is often the case when listening to someone’s voice that we also see their lips moving. The capacity of the brain to combine and coordinate the different sensory signals arising from a common source provides us with a unified perception of the world and is essential for directing attention and controlling movement within it. In this issue of the Journal of Neurophysiology (p. 921–926), Wallace and Stein show that experience during infancy can shape the way in which visual and auditory inputs interact to determine the responses of neurons in the superior colliculus (SC), a midbrain nucleus involved in the control of orienting movements.

Interactions between the senses can improve the likelihood of detecting and responding to an event and of identifying and localizing it accurately. On the other hand, if incongruent information is provided by different sensory modalities, then our perception of the event in question can be degraded or altered. This is well illustrated in humans by the “McGurk effect.” Although speech comprehension can be improved, particularly in a noisy environment, by lip-reading, watching a person articulate one speech syllable while listening to another typically results in the perception of a third sound that represents a combination of what was seen and heard (McGurk and MacDonald 1976). The perceptual consequences of cross-modal interactions therefore depend on the binding together of appropriate multisensory signals, i.e., those originating from the source in question, as opposed to other, unrelated stimuli.

This capacity to combine information across different sensory modalities to form a coherent multisensory representation of the world has its origin in the way in which different sensory systems interact during development. The importance of experience in this process has been demonstrated at a number of levels (Lickliter and Bahrick 2004) and particularly in the matching of spatial information provided by the different sensory systems. Recent studies have shown that multisensory convergence is more widespread in the brain than was previously thought to be the case (Ghazanfar and Schroeder 2006), but the SC has long been the region of choice for investigating the way in which the spatial cues provided by different sensory modalities are combined and integrated by individual neurons, both in adult animals and during the course of development (King 1999).

There are two related reasons for this. First, for each sensory modality, stimulus location is represented topographically in the SC to form overlapping maps of space. In principle, this allows the different sensory cues associated with a common source to activate a specific region of the SC motor map and therefore be transformed into motor commands that result in a change in gaze direction. Second, many of the neurons found in the deeper layers of this midbrain structure receive converging inputs from two or more sensory systems and generate higher spike discharge rates, and, in all likelihood, more accurate orienting responses, when combinations of stimuli are delivered in close temporal and spatial proximity.
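To make this spatial and temporal principle concrete, the following toy model, offered purely as an illustration and not drawn from any of the studies discussed here, simulates a single deep-layer SC neuron whose response to a visual-auditory pair is superadditively enhanced only when both stimuli fall within its receptive field and arrive within a common temporal window. The receptive field width, temporal window, enhancement factor, and function names are all assumed values.

```python
import math

def sc_response(visual_deg, auditory_deg, dt_ms,
                rf_center_deg=0.0, rf_width_deg=20.0, window_ms=100.0):
    """Toy model of a deep-layer SC multisensory neuron (illustrative only).

    Each stimulus drives the cell through a Gaussian spatial receptive
    field; spatially and temporally coincident pairs are enhanced
    superadditively, otherwise the stronger unisensory drive wins.
    """
    def drive(loc_deg):
        return math.exp(-((loc_deg - rf_center_deg) ** 2)
                        / (2 * rf_width_deg ** 2))

    v, a = drive(visual_deg), drive(auditory_deg)
    # Enhancement applies only when both stimuli lie well inside the
    # receptive field and fall within the temporal window (assumed values).
    if abs(dt_ms) < window_ms and v > 0.5 and a > 0.5:
        return 1.5 * (v + a)   # superadditive multisensory enhancement
    return max(v, a)           # no facilitation otherwise

# A coincident pair inside the receptive field yields an enhanced response;
# a spatially disparate pair does not.
print(sc_response(0.0, 5.0, dt_ms=20.0))    # ~2.95, greater than v + a alone
print(sc_response(0.0, 60.0, dt_ms=20.0))   # ~1.0, visual drive only
```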

Because spatial information is represented in different reference frames in the visual, auditory, and somatosensory systems, maintenance of intersensory map alignment in the SC requires that these signals be transformed into a common set of coordinates. Auditory and somatosensory receptive fields are indeed partially remapped into eye-centered coordinates, but this transformation appears to be incomplete, suggesting that multiple reference frames are employed (Pouget et al. 2002). The process of aligning the representations of the different modalities also depends on interactions that take place between the sensory inputs to the SC, particularly during development but also, to some extent, in later life. It has been shown, for example, that shifting the visual representation relative to the head by optical (Bergan et al. 2005; Knudsen and Brainard 1991) or surgical (King et al. 1988) means can produce a corresponding shift in the auditory spatial tuning of SC neurons. This guiding role for vision is further supported by the finding that degradation of visual input during infancy results both in the emergence of auditory and somatosensory receptive fields that are either abnormally large (Wallace et al. 2004; Withington-Wray et al. 1990) or inappropriately located (King and Carlile 1993; Knudsen et al. 1991) and in an absence of multisensory facilitation in the responses of SC neurons (Wallace et al. 2004).
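The coordinate transformation at issue can be sketched in the same hypothetical spirit. The function below shifts a head-centered sound location toward eye-centered coordinates by subtracting a fraction of the current eye position: a gain of 1 would be a complete remapping, whereas intermediate gains mimic the partial, incomplete remapping noted above (Pouget et al. 2002). Both the function and the gain value are illustrative assumptions rather than measured quantities.

```python
def eye_centered_estimate(sound_head_deg, eye_pos_deg, remap_gain=0.6):
    """Shift a head-centered auditory location toward eye-centered
    coordinates. remap_gain = 1.0 gives a full transformation
    (subtract the entire eye deviation); values between 0 and 1
    mimic a partial remapping. The default gain is illustrative.
    """
    return sound_head_deg - remap_gain * eye_pos_deg

# A sound 10 deg right of the head midline, with the eyes deviated
# 20 deg to the right:
print(eye_centered_estimate(10.0, 20.0))        # partial remap: -2.0
print(eye_centered_estimate(10.0, 20.0, 1.0))   # full eye-centered: -10.0
```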

Now Wallace and Stein (2007) have extended these findings by maintaining kittens in the dark and periodically exposing them to temporally coincident but spatially incongruent visual and auditory stimuli to determine whether a systematic change in the spatial relationship of these cues could alter the way in which they are synthesized within the brain. Recordings made when the animals were mature revealed that some neurons had abnormally large receptive fields and failed to show cross-modal interactions when visual and auditory stimuli were presented together, in accord with the previously reported effects of dark rearing alone. By contrast, other neurons had relatively small visual and auditory receptive fields, which, as a result of a systematic displacement in auditory spatial tuning, showed little overlap. As in normally raised controls, multisensory enhancement could be elicited in these neurons when visual and auditory stimuli were presented together from within their respective receptive fields. Because of the misalignment of these receptive fields, however, this required the stimuli to be presented from different locations.

These results show that exposure to an abnormal conjunction of visual and auditory inputs can shift the receptive field locations of SC neurons and, in so doing, change the intersensory spatial requirements for evoking multisensory facilitation in individual neurons. It seems likely that the key factor in binding different sensory stimuli, even when a spatial mismatch exists between them, is their temporal synchrony, although additional experiments will be needed to confirm this. The paper by Wallace and Stein (2007) further highlights the important role of experience in shaping the multisensory receptive fields and integrative properties of SC neurons during early postnatal development. However, given the well-studied effects of pairing spatially disparate auditory and visual stimuli on auditory spatial perception in adult humans (Zwiers et al. 2003) and monkeys (Woods and Recanzone 2004), plasticity of multisensory processing must be a property of the adult brain too.

References

1. Bergan JF, Ro P, Ro D, Knudsen EI. Hunting increases adaptive auditory map plasticity in adult barn owls. J Neurosci. 2005;25:9816–9820. doi: 10.1523/JNEUROSCI.2533-05.2005.
2. Ghazanfar AA, Schroeder CE. Is neocortex essentially multisensory? Trends Cogn Sci. 2006;10:278–285. doi: 10.1016/j.tics.2006.04.008.
3. King AJ. Sensory experience and the formation of a computational map of auditory space in the brain. Bioessays. 1999;21:900–911. doi: 10.1002/(SICI)1521-1878(199911)21:11<900::AID-BIES2>3.0.CO;2-6.
4. King AJ, Carlile S. Changes induced in the representation of auditory space in the superior colliculus by rearing ferrets with binocular eyelid suture. Exp Brain Res. 1993;94:444–455. doi: 10.1007/BF00230202.
5. King AJ, Hutchings ME, Moore DR, Blakemore C. Developmental plasticity in the visual and auditory representations in the mammalian superior colliculus. Nature. 1988;332:73–76. doi: 10.1038/332073a0.
6. Knudsen EI, Brainard MS. Visual instruction of the neural map of auditory space in the developing optic tectum. Science. 1991;253:85–87. doi: 10.1126/science.2063209.
7. Knudsen EI, Esterly SD, du Lac S. Stretched and upside-down maps of auditory space in the optic tectum of blind-reared owls: acoustic basis and behavioral correlates. J Neurosci. 1991;11:1727–1747. doi: 10.1523/JNEUROSCI.11-06-01727.1991.
8. Lickliter R, Bahrick LE. Perceptual development and the origins of multisensory responsiveness. In: Calvert G, Spence C, Stein BE, editors. The Handbook of Multisensory Processes. Cambridge, MA: MIT Press; 2004. pp. 643–654.
9. McGurk H, MacDonald J. Hearing lips and seeing voices. Nature. 1976;264:746–748. doi: 10.1038/264746a0.
10. Pouget A, Deneve S, Duhamel JR. A computational perspective on the neural basis of multisensory spatial representations. Nat Rev Neurosci. 2002;3:741–747. doi: 10.1038/nrn914.
11. Wallace MT, Perrault TJ Jr, Hairston WD, Stein BE. Visual experience is necessary for the development of multisensory integration. J Neurosci. 2004;24:9580–9584. doi: 10.1523/JNEUROSCI.2535-04.2004.
12. Wallace MT, Stein BE. Early experience determines how the senses will interact. J Neurophysiol. 2007;97:921–926. doi: 10.1152/jn.00497.2006.
13. Withington-Wray DJ, Binns KE, Keating MJ. The maturation of the superior collicular map of auditory space in the guinea pig is disrupted by developmental visual deprivation. Eur J Neurosci. 1990;2:682–692. doi: 10.1111/j.1460-9568.1990.tb00458.x.
14. Woods TM, Recanzone GH. Visually induced plasticity of auditory spatial perception in macaques. Curr Biol. 2004;14:1559–1564. doi: 10.1016/j.cub.2004.08.059.
15. Zwiers MP, Van Opstal AJ, Paige GD. Plasticity in human sound localization induced by compressed spatial vision. Nat Neurosci. 2003;6:175–181. doi: 10.1038/nn999.
