Author manuscript; available in PMC: 2013 May 20.
Published in final edited form as: Nat Neurosci. 2009 May 26;12(6):698–701. doi: 10.1038/nn.2308

Unraveling the principles of auditory cortical processing: can we learn from the visual system?

Andrew J King 1, Israel Nelken 2
PMCID: PMC3657701  EMSID: EMS53164  PMID: 19471268

Abstract

Studies of auditory cortex are often driven by the assumption, derived from our better understanding of visual cortex, that basic physical properties of sounds are represented there before being used by higher-level areas for determining sound-source identity and location. However, we only have a limited appreciation of what the cortex adds to the extensive subcortical processing of auditory information, which can account for many perceptual abilities. This is partly because of the approaches that have dominated the study of auditory cortical processing to date, and future progress will unquestionably profit from the adoption of methods that have provided valuable insights into the neural basis of visual perception. At the same time, we propose that there are unique operating principles employed by the auditory cortex that relate largely to the simultaneous and sequential processing of previously derived features and that therefore need to be studied and understood in their own right.

Introduction

Hearing provides us with an immensely rich source of information about the world around us. From the pattern of sound waves reaching the two ears, we can detect the presence of and distinguish between a vast array of objects, and very often tell how large they are, what type of material they are constructed from and where they are located. Our hearing also endows us with the capacity to recognize and communicate through speech, as well as to appreciate music. It is generally accepted that the auditory cortex has a critical role in these processes, but how it does so remains largely a mystery.

As in the visual system, auditory neurons respond to stimuli originating from sources in the surrounding environment. Audition and vision capture overlapping, but distinct, properties of the world. For example, although the spatial resolution of vision in humans, nonhuman primates and carnivores exceeds their auditory localization abilities, the temporal precision of the auditory system is far superior to that of the visual system1,2,3,4. Furthermore, although vision may have a limited spatial extent, audition is omnidirectional, allowing information from all directions in space to be sampled (admittedly, at a low spatial resolution). Nevertheless, the goal of both systems is to represent behaviorally important aspects of the external environment and it seems reasonable to assume that the two should have at least some common principles of organization and operation.

We know considerably more about the nature of processing in the visual system, particularly at the level of the cortex, than of any other sensory modality. In large part, this simply reflects the fact that much greater efforts have been made over many years to investigate the neural basis of visual perception. But it is also the case that, despite a marked increase in the number of studies that have been carried out over the last decade, the workings of the auditory cortex have been harder to unravel. Here we assess why this might be so and consider ways in which the field of auditory cortical research could be moved forward.

Emergent properties of cortical neurons

Well-known features of primary visual cortex (V1) neurons include their sensitivity to line orientation and binocular disparity and the presence of phase-dependent simple cells and phase-invariant complex cells. These properties arise at the level of V1 as a result of convergence among the ascending thalamic axons and of the circuitry of the cortex itself5. In contrast, most primary auditory cortex (A1) neuron properties, such as sensitivity to sound frequency, duration and amplitude, and the manner in which these neurons extract pitch or process binaural disparities, are already found at subcortical levels6. This highlights one of the most fundamental differences between the visual and auditory systems. Retinal ganglion cells project directly to the visual thalamus and thence to V1, whereas several subcortical relay stations exist below the level of the auditory thalamus. In particular, the outputs of the cochlear nucleus and superior olivary complex all synapse in the inferior colliculus, which then projects to the thalamus. Even allowing for the additional processing in the retina, signals may pass through several more synaptic relays before reaching the cortex in the auditory system than in the visual system. Indeed, in view of the sensitivity of its neurons to different sound properties, it has been proposed that the inferior colliculus occupies a processing level equivalent to that of V1 (ref. 7).

The highly preprocessed nature of the inputs to A1 may therefore help to explain why it has been so hard to identify emergent properties in the auditory cortex. The selectivity of A1 neurons to stimulus properties, such as sound frequency or location, is highly variable, but is often comparable to or less specific than that observed subcortically8, 9. Neurons in A1 are relatively promiscuous; they tend to respond to many different sounds, although often with somewhat low firing rates, even in the awake animal10, 11. After many years of looking, we believe it is safe to conclude that there are no standard simple stimuli that are particularly salient for the majority of neurons in A1 in the same way that moving, oriented bars, gratings and Gabor patches have been so effective for investigating the receptive fields of V1 neurons. This lack of an equivalent set of stimuli to probe the functional organization of A1 has undoubtedly hindered progress in this area.

One approach to finding neuron-specific best stimuli is to present large numbers of sounds and estimate the spectrotemporal receptive fields (STRFs) of the neurons by reverse correlation. In one study, the STRFs of A1 neurons were reported to have complex shapes, suggesting that they may be well suited to detecting ‘edges’ in frequency or time, analogous to orientation or directionally selective neurons in V1 (ref. 12). However, more recent studies, including those employing dynamic ripples, the auditory equivalent of moving gratings, suggest that the STRF structure of A1 neurons is typically much simpler, often comprising an excitatory region around the best frequency of the neuron, surrounded by one or more inhibitory regions13, 14. Attempts to use STRFs to predict the observed responses of A1 neurons to other sounds have met with mixed success. Although STRFs estimated using sequences of rapidly changing chords can successfully predict the spatial receptive fields of A1 neurons, as measured by presenting brief noise bursts in virtual acoustic space13, they are less able to account for the responses to rapidly changing or more complex natural sounds15, 16.
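The reverse-correlation procedure can be sketched in a few lines: with a dense, uncorrelated stimulus, the spike-triggered average of the preceding spectrogram converges on the neuron's STRF. The following toy simulation is our own construction (the ground-truth filter, model neuron and all parameters are invented for illustration, not taken from any of the cited studies); it recovers a simple excitatory/inhibitory STRF of the kind described above:

```python
import numpy as np

rng = np.random.default_rng(0)
n_freq, n_time, n_lags = 16, 20000, 12

# Dense, uncorrelated "spectrogram" stimulus (frequency x time), standing in
# for the rapidly varying chord or ripple stimuli used in STRF experiments.
stim = rng.standard_normal((n_freq, n_time))

# Invented ground-truth STRF: an excitatory region at one best frequency
# flanked by an inhibitory sideband -- the simple structure described above.
true_strf = np.zeros((n_freq, n_lags))
true_strf[8, 2:5] = 1.0     # excitation near the best frequency
true_strf[6, 2:5] = -0.5    # inhibitory sideband

# Model neuron: linear filtering, half-wave rectification, Poisson spiking.
drive = np.zeros(n_time)
for t in range(n_lags, n_time):
    drive[t] = np.sum(true_strf * stim[:, t - n_lags:t][:, ::-1])
spikes = rng.poisson(0.1 * np.clip(drive, 0.0, None))

# Reverse correlation: the spike-triggered average of the preceding stimulus.
# For uncorrelated stimuli this converges on the STRF up to a scale factor.
sta = np.zeros((n_freq, n_lags))
for t in range(n_lags, n_time):
    if spikes[t]:
        sta += spikes[t] * stim[:, t - n_lags:t][:, ::-1]
sta /= spikes.sum()

peak = np.unravel_index(np.abs(sta).argmax(), sta.shape)
print(peak)  # (frequency index, lag) of the recovered excitatory peak
```

With enough spikes the estimate peaks at the same frequency and lag as the true filter; real experiments face the further complication, noted above, that such linear estimates generalize poorly to complex natural sounds.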

Showing that individual neurons respond in different ways to physically different stimuli, a property that is already present in the auditory nerve, is only part of the story. The real problem for auditory perception is related to invariance. For example, the same pitch sensation can be evoked by many different stimuli, and speech can be understood even when delivered by many different voices, which can differ in their accent as much as those of the two authors of this review. Similarly, sounds from the same direction in space can be identified as such even if their spectrotemporal structure differs substantially. Visual research has addressed the problem of invariance both computationally17 and experimentally18. So far, there have been few studies of invariance in the auditory system, but invariant responses to particular sounds may be among the most important emergent properties of auditory cortical neurons, possibly in higher auditory areas.

Organization of response properties in the cortex

A second way in which the functional organization of V1 has influenced the investigation of auditory cortical processing is the presence of parameter maps across the cortical surface. Multiple cortical areas are found in both the visual and auditory systems of a range of mammalian species. A common feature of several of these areas is that they contain topographic representations of the receptor surface. Thus, neighboring neurons in V1 receive their ascending inputs, via the thalamus, from adjacent parts of the retina and therefore collectively form a retinotopic map of visual space. Retinotopic maps are also found, to varying extents, in higher visual areas. The same principle applies in A1 and some other auditory areas. In this case, however, the receptor hair cells located along the length of the cochlea are tuned to different sound frequencies rather than to different locations, giving rise to tonotopic maps of sound frequency in the cortex.

In the retinotopic framework, V1 neurons are organized into finer-scale, intertwined maps according to their preferences for different stimulus parameters, such as stimulus orientation or spatial frequency, and their ocular dominance (the eye of input). V1 is also characterized by its columnar organization, in which neurons have similar response properties across different layers or in which response properties differ with laminar location, reflecting different stages of cortical processing19.

In contrast to the two-dimensional receptor surface in the retina, the cochlea generates a one-dimensional representation of sound frequency along its length. In the tonotopically organized parts of the central auditory system, the representation of each point along the basilar membrane is expanded to form an isofrequency region. This potentially provides the basis for mapping other parameters while preserving the neighborhood relationships established in the cochlea. Inspired by the highly ordered organization of V1 (ref. 20), numerous efforts have been made to investigate whether other response properties vary either parallel to the tonotopic organization of A1 or along its isofrequency contours. These studies have shown that the neurons’ thresholds, dynamic range, the shape of their response-level functions, the sharpness of their frequency tuning, their sensitivity to frequency modulation and the type of binaural interaction they have are all distributed in a nonrandom and sometimes interrelated fashion21. In particular, local clusters of neurons with similar response properties can be found, but the overall order is weak, often variable, and lacks any clear functional importance, other than possibly forming conjunctions of particular stimulus properties that may be useful for processing at a later stage. In a further contrast with V1, how receptive field properties are organized across the different cortical layers of A1 also remains uncertain22.

Alternative views of cortical processing

Given the paucity of evidence for the existence in A1 of the type of emergent response properties that characterize V1, such as orientation selectivity, we have to look for alternative organizing principles. Because, as we suggested earlier, A1 may sit at a higher level of processing than V1, one way to proceed would be to adopt more natural stimuli, such as those used in previous studies23 to examine visual neurons in the inferotemporal cortex (IT). The use of natural sounds, such as species-specific vocalizations, to investigate auditory cortex has a long history (for example, ref. 24). Such studies have typically shown that cortical neurons respond to a broad range of stimuli, although the information that these responses convey about stimulus identity can be rather unrelated, even between neighboring, simultaneously recorded neurons25. Because cortical neurons have a tendency to respond to specific acoustic components even in the presence of other, louder components, it has been suggested that neurons in A1 represent ‘auditory objects’: specific, behaviorally relevant spectrotemporal patterns7. Selectivity to auditory objects could arise from combination sensitivity26: the presence of particularly strong responses to specific combinations of a number of different stimulus features10. This seems more analogous to the selectivity for complex visual features seen in IT and suggests that clues to understanding A1 may come from experiments in which the neurons are faced with a complex computational problem, such as the parsing of a complex auditory scene into its individual components.
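Combination sensitivity can be illustrated with a minimal toy unit (our own construction; the weights are arbitrary): its response to two features presented together exceeds the sum of its responses to either feature alone, by way of a supralinear interaction term.

```python
# Toy combination-sensitive unit (illustrative only): it responds weakly to
# either feature alone -- say, a biosonar pulse or its echo -- but strongly
# to their conjunction, via a supralinear (multiplicative) interaction term.
def response(pulse, echo, w=0.1, w_comb=1.0):
    return w * pulse + w * echo + w_comb * pulse * echo

# Each feature alone evokes 0.1; together they evoke 1.2, far more than
# the 0.2 predicted by summing the two individual responses.
print(response(1, 0), response(0, 1), response(1, 1))  # → 0.1 0.1 1.2
```

Such a unit signals the presence of a specific feature conjunction rather than any single feature, which is the sense in which combination sensitivity could underlie selectivity for auditory objects.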

Whether or not neurons in A1 are particularly sensitive to auditory objects, they have a number of other properties that strongly hint at a possible role in auditory perception. As in V1, A1 neurons are more sluggish than those found subcortically27, with most failing to lock to stimuli presented at rates above a few tens of hertz. On the other hand, they do show sensitivity to slower repetition rates, which are often found naturally, for example, in the temporal fluctuations of speech. This suggests a specialization for naturally occurring slow modulations28. The same sluggishness may also be involved in streaming, which is usually illustrated by the perceptual splitting of sequences of tones that alternate between two frequencies, when the rate of presentation is fast enough, into two separate ‘streams’ of tones, each corresponding to one of the two frequencies. Although streaming also depends on the frequency separation between the two tones, it typically occurs at presentation rates of around 10 Hz, the value at which cortical sluggishness starts to take effect.

As mentioned earlier, time is a particularly important property for audition. Temporal factors influence the responses of neurons at all levels of the auditory system, but particularly so in the cortex. In fact, A1 neurons show considerable adaptive plasticity across different time scales. In passive hearing conditions, their responses to a repeating stimulus adapt at relatively slow repetition rates (less than 1 Hz), whereas responses to even slightly different frequencies may remain strong, a phenomenon known as stimulus-specific adaptation29. Under active conditions, when a tone frequency has a particular behavioral meaning, the reverse may happen: the responses to that frequency can be enhanced30. Such plasticity can be extremely rapid, changing neuronal responses in a task-specific fashion in a matter of minutes14. Over longer time scales, auditory cortical plasticity in adult animals may serve to compensate for changes in sensory input31 or to improve performance during perceptual learning32, 33. Although plasticity of cortical processing in adulthood is, of course, also a characteristic of other sensory systems, it seems to be more pronounced in A1 than in V1. For example, learning in various auditory tasks is accompanied by changes in A1 response properties32, 33, whereas the clearest evidence for learning-induced plasticity in the visual pathway lies not in V1, but in those areas where the response properties of the neurons are most closely matched to the learned stimulus feature34, 35.
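The oddball logic behind stimulus-specific adaptation can be captured in a toy model (our own construction; the depression and recovery parameters are arbitrary): each tone frequency drives its own adapting input channel, so the common 'standard' tone is strongly depressed while the rare 'deviant' retains a large response.

```python
import numpy as np

rng = np.random.default_rng(1)

# Oddball sequence: a common "standard" tone and a rare "deviant" tone.
n_trials, p_deviant = 2000, 0.1
seq = rng.random(n_trials) < p_deviant      # True marks a deviant trial

# Toy model with arbitrary parameters: each tone frequency drives its own
# input channel, whose gain is depressed every time its tone is presented
# and recovers toward 1 on every trial in between.
gain = {False: 1.0, True: 1.0}              # keyed by tone identity
depress, recover = 0.4, 0.1
resp_std, resp_dev = [], []

for is_dev in map(bool, seq):
    r = gain[is_dev]                        # response = current channel gain
    (resp_dev if is_dev else resp_std).append(r)
    gain[is_dev] *= 1.0 - depress           # frequency-specific depression
    for k in gain:                          # slow recovery of both channels
        gain[k] += recover * (1.0 - gain[k])

# The rare tone adapts far less, so it evokes the larger average response.
print(round(np.mean(resp_std), 2), round(np.mean(resp_dev), 2))
```

The asymmetry emerges purely from the stimulus statistics acting on frequency-specific adaptation, which is one simple account of why responses to a repeated frequency fade while responses to a slightly different, rare frequency remain strong.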

The highly context-dependent responses seen in A1 suggest that this area may be important in the temporal organization of auditory perception. The extremely dynamic nature of auditory cortex function contrasts with the standard view of V1 as a set of relatively static, overlapping parameter maps. The more extensive distribution of descending corticofugal projections in the auditory system is probably related to this. Both auditory and visual cortices have massive corticothalamic projections, but the auditory system is unique in terms of the profound influence that the cortex has on processing at lower levels and particularly on the inferior colliculus in the midbrain31, 36, which in turn will presumably shape thalamocortical processing.

Functional specialization beyond the primary areas

Although V1 seems to have a general-purpose function in the processing of visual contours, extrastriate areas can have more specialized computational roles. This is illustrated, for example, by the presence of motion detector neurons in area MT or face detector neurons in IT. These cortical fields form a part of functional processing streams, beginning in V1, that extend either dorsally or ventrally in the brain, where they are involved in visuomotor control and object recognition, respectively37.

This finding has prompted numerous investigations into whether multiple auditory cortical fields show a comparable division of labor. Studies in one species of echo-locating bat have provided clear evidence that this is the case26. Neurons in separate non-cochleotopic fields are tuned to particular combinations of the bat’s emitted biosonar pulses and their returning echoes, forming neural maps of either target distance or relative velocity. This has all the hallmarks of visual processing in that these are emergent cortical properties that are topographically organized in different cortical areas. Echo-locating bats are, however, highly specialized and a comparable organization of higher cortical fields has not been discovered in other species, even among other bats. Nevertheless, the existence of distinct cortical regions that are involved in sound identification and localization is supported by functional imaging38, electrophysiological39 and anatomical40 studies, and by behavioral deficits observed following damage to or reversible inactivation of particular cortical areas41.

Given the growing evidence for extensive crosstalk between the cortical areas representing different modalities, and particularly for visual and somatosensory inputs into auditory cortex42, it makes sense that the largely parallel processing of spatial and nonspatial stimulus properties seen in the visual cortex should be mirrored in the auditory cortex. This would presumably make it easier to integrate corresponding multisensory features, such as vocalizations and their associated lip movements, or visual and auditory cues originating from the same direction in space. But the true extent to which different attributes of sound are separated in higher cortical areas remains controversial, particularly as recent electrophysiological studies have shown that sensitivity to any given feature can be distributed over multiple cortical areas43, 44, 45 and that there may be substantial interactions between the putative ‘what’ and ‘where’ pathways45.

Linking neural activity to perception

All the recording studies described above are correlational and cannot be used to relate neural activity causally to animal behavior, let alone to perception (which is only indirectly accessible in animal studies). Most studies of auditory cortex still use anesthetized animals, usually because of practical considerations such as the need to couple transducers to the ears to maximize stimulus control, or awake, nonbehaving preparations. However, recording from the brain in behaving animals is a necessary step for establishing a direct link between neural responses and behavior. Ground-breaking studies46 using intracortical microstimulation (ICMS) showed that it is possible to bias animal behavior by changing the electrical activity of small populations of appropriately selected neurons in visual cortex. Microstimulation experiments in auditory cortex are rare, although it has been shown that rats can discriminate between electrical stimuli administered to two different regions of the cortex47, 48.

The difficulty of using ICMS to activate specific regions of auditory cortex comes down to the relatively sparse11 and highly independent25 nature of the auditory responses of even nearby neurons. ICMS is more suited to structures with a better-defined functional organization, as in many visual or somatosensory cortical areas. The capacity to visualize the activity of large assemblies of neurons over time in awake animals and, through the use of genetic methods or viral vectors, to optically stimulate selected subpopulations of neurons49 offers considerable potential for investigating the neural circuits that give rise to perceptual decision making and other behaviors. For this powerful approach to be successful in the auditory cortex, it will be necessary to activate selected neurons with appropriate temporal patterns to determine whether the same behavior can be evoked as with natural stimuli.

Conclusions

The auditory cortex shares some important characteristics with the visual system, including the presence of topographic maps and multiple cortical areas. Although the similarities are certainly real, some of the most exciting aspects of our growing understanding of auditory cortex are not part of the textbook view of visual cortex. These include the presence of sensitivity to particular combinations of sounds, sparse and nonredundant representations of those stimuli, and the high level of adaptive plasticity that governs the function of auditory neurons and makes auditory processing so context dependent. Moreover, neurons in both primary and nonprimary auditory cortical fields seem to be particularly susceptible to nonsensory factors such as attention, as was first recognized 50 years ago50. Thus, auditory cortex is not a visual cortex translated into the auditory modality; it seems to be involved in processing stimuli beyond simple feature detection, combining sound components across frequency and over time to generate interpretations of the auditory scene. In that respect, A1 could be more analogous to higher visual areas such as IT, which pose the same difficulties for understanding their organization and function, than to V1.

Given the apparently higher complexity of auditory cortical processing that begins in A1, the computational questions we ask of neurons in auditory cortex may need to be the same as those normally reserved for higher visual areas. Thus, we need to focus on questions such as how invariant representations are formed or how learning and experience help to create and maintain the complex response properties of auditory cortical neurons. An important lesson the auditory cortex community can learn from the successes of visual research into these higher-order questions is the importance of working with awake animals. This makes it possible to link neural activity with behavior much more tightly than has been the case, leading eventually to the manipulation of behavior through the imposition of artificial activity patterns on specific subsets of neurons. The use of new optical techniques for visualizing and manipulating neural activity will be important in any such advances, as they will be throughout neuroscience. But although the vast resource of information available from the study of the visual cortex will continue to provide an important guide to the study of auditory cortical processing, it is clear that there are key differences that will require solutions unique to hearing.

Acknowledgments

We are grateful to B. Willmore for discussions. Financial support was provided by the Wellcome Trust (a Principal Research Fellowship to A.J.K.) and by the Israeli Science Foundation (I.N.).

References

1. Brown CH, May BJ. Comparative mammalian sound localization. In: Popper AN, Fay RR, editors. Sound Source Localization. Springer Handbook of Auditory Research. Springer; New York: 2005. pp. 124–178.
2. DeValois RL, DeValois KK. Spatial Vision. Oxford Psychology Series 14. Oxford University Press; Oxford, UK: 1990.
3. Tyler CW, Hamer RD. Analysis of visual modulation sensitivity. IV. Validity of the Ferry-Porter law. J. Opt. Soc. Am. A. 1990;7:743–758. doi: 10.1364/josaa.7.000743.
4. Viemeister NF, Plack CJ. Time analysis. In: Yost WA, Popper AN, Fay RR, editors. Human Psychophysics. Springer Handbook of Auditory Research. Springer; New York: 1993. pp. 116–154.
5. Hirsch JA, Martinez LM. Circuits that build visual cortical receptive fields. Trends Neurosci. 2006;29:30–39. doi: 10.1016/j.tins.2005.11.001.
6. Palmer AR. Anatomy and physiology of the auditory brainstem. In: Burkard RF, Don M, Eggermont JJ, editors. Auditory Evoked Potentials. Lippincott Williams and Wilkins; Baltimore: 2007. pp. 200–228.
7. Nelken I, Fishbach A, Las L, Ulanovsky N, Farkas D. Primary auditory cortex of cats: feature detection or something else? Biol. Cybern. 2003;89:397–406. doi: 10.1007/s00422-003-0445-3.
8. Read HL, Winer JA, Schreiner CE. Modular organization of intrinsic connections associated with spectral tuning in cat auditory cortex. Proc. Natl. Acad. Sci. USA. 2001;98:8042–8047. doi: 10.1073/pnas.131591898.
9. Clarey JC, Barone P, Imig TJ. Physiology of thalamus and cortex. In: Popper AN, Fay RR, editors. The Mammalian Auditory Pathway: Neurophysiology. Springer Handbook of Auditory Research. Springer; New York: 1992. pp. 232–335.
10. Wang X, Lu T, Snider RK, Liang L. Sustained firing in auditory cortex evoked by preferred stimuli. Nature. 2005;435:341–346. doi: 10.1038/nature03565.
11. Hromádka T, Deweese MR, Zador AM. Sparse representation of sounds in the unanesthetized auditory cortex. PLoS Biol. 2008;6:e16. doi: 10.1371/journal.pbio.0060016.
12. deCharms RC, Blake DT, Merzenich MM. Optimizing sound features for cortical neurons. Science. 1998;280:1439–1443. doi: 10.1126/science.280.5368.1439.
13. Schnupp JWH, Mrsic-Flogel TD, King AJ. Linear processing of spatial cues in primary auditory cortex. Nature. 2001;414:200–204. doi: 10.1038/35102568.
14. Fritz JB, Elhilali M, Shamma SA. Differential dynamic plasticity of A1 receptive fields during multiple spectral tasks. J. Neurosci. 2005;25:7623–7635. doi: 10.1523/JNEUROSCI.1318-05.2005.
15. Bar-Yosef O, Nelken I. The effects of background noise on the neural responses to natural sounds in cat primary auditory cortex. Front. Comput. Neurosci. 2007;1:1–14. doi: 10.3389/neuro.10.003.2007.
16. Machens CK, Wehr MS, Zador AM. Linearity of cortical receptive fields measured with natural sounds. J. Neurosci. 2004;24:1089–1100. doi: 10.1523/JNEUROSCI.4445-03.2004.
17. Serre T, Oliva A, Poggio T. A feedforward architecture accounts for rapid categorization. Proc. Natl. Acad. Sci. USA. 2007;104:6424–6429. doi: 10.1073/pnas.0700622104.
18. Kiani R, Esteky H, Mirpour K, Tanaka K. Object category structure in response patterns of neuronal population in monkey inferior temporal cortex. J. Neurophysiol. 2007;97:4296–4309. doi: 10.1152/jn.00024.2007.
19. Martinez LM, et al. Receptive field structure varies with layer in the primary visual cortex. Nat. Neurosci. 2005;8:372–379. doi: 10.1038/nn1404.
20. Ohki K, et al. Highly ordered arrangement of single neurons in orientation pinwheels. Nature. 2006;442:925–928. doi: 10.1038/nature05019.
21. Schreiner CE, Winer JA. Auditory cortex mapmaking: principles, projections and plasticity. Neuron. 2007;56:356–365. doi: 10.1016/j.neuron.2007.10.013.
22. Linden JF, Schreiner CE. Columnar transformations in auditory cortex? A comparison to visual and somatosensory cortices. Cereb. Cortex. 2003;13:83–89. doi: 10.1093/cercor/13.1.83.
23. Tanaka K. Inferotemporal cortex and object vision. Annu. Rev. Neurosci. 1996;19:109–139. doi: 10.1146/annurev.ne.19.030196.000545.
24. Wollberg Z, Newman JD. Auditory cortex of squirrel monkey: response patterns of single cells to species-specific vocalizations. Science. 1972;175:212–214. doi: 10.1126/science.175.4018.212.
25. Chechik G, et al. Reduction of information redundancy in the ascending auditory pathway. Neuron. 2006;51:359–368. doi: 10.1016/j.neuron.2006.06.030.
26. Suga N. Principles of auditory information-processing derived from neuroethology. J. Exp. Biol. 1989;146:277–286. doi: 10.1242/jeb.146.1.277.
27. Joris PX, Schreiner CE, Rees A. Neural processing of amplitude-modulated sounds. Physiol. Rev. 2004;84:541–577. doi: 10.1152/physrev.00029.2003.
28. Chi T, Ru P, Shamma SA. Multiresolution spectrotemporal analysis of complex sounds. J. Acoust. Soc. Am. 2005;118:887–906. doi: 10.1121/1.1945807.
29. Ulanovsky N, Las L, Nelken I. Processing of low-probability sounds by cortical neurons. Nat. Neurosci. 2003;6:391–398. doi: 10.1038/nn1032.
30. Weinberger NM. Specific long-term memory traces in primary auditory cortex. Nat. Rev. Neurosci. 2004;5:279–290. doi: 10.1038/nrn1366.
31. King AJ, et al. Physiological and behavioral studies of spatial coding in the auditory cortex. Hear. Res. 2007;229:106–115. doi: 10.1016/j.heares.2007.01.001.
32. Polley DB, Steinberg EE, Merzenich MM. Perceptual learning directs auditory cortical map reorganization through top-down influences. J. Neurosci. 2006;26:4970–4982. doi: 10.1523/JNEUROSCI.3771-05.2006.
33. Schnupp JWH, Hall TM, Kokelaar RF, Ahmed B. Plasticity of temporal pattern codes for vocalization stimuli in primary auditory cortex. J. Neurosci. 2006;26:4785–4795. doi: 10.1523/JNEUROSCI.4330-05.2006.
34. Kobatake E, Wang G, Tanaka K. Effects of shape-discrimination training on the selectivity of inferotemporal cells in adult monkeys. J. Neurophysiol. 1998;80:324–330. doi: 10.1152/jn.1998.80.1.324.
35. Yang T, Maunsell JH. The effect of perceptual learning on neuronal responses in monkey visual area V4. J. Neurosci. 2004;24:1617–1626. doi: 10.1523/JNEUROSCI.4442-03.2004.
36. Suga N. Role of corticofugal feedback in hearing. J. Comp. Physiol. A Neuroethol. Sens. Neural. Behav. Physiol. 2008;194:169–183. doi: 10.1007/s00359-007-0274-2.
37. Ungerleider LG, Haxby JV. ‘What’ and ‘where’ in the human brain. Curr. Opin. Neurobiol. 1994;4:157–165. doi: 10.1016/0959-4388(94)90066-3.
38. Alain C, Arnott SR, Hevenor S, Graham S, Grady CL. ‘What’ and ‘where’ in the human auditory system. Proc. Natl. Acad. Sci. USA. 2001;98:12301–12306. doi: 10.1073/pnas.211209098.
39. Tian B, Reser D, Durham A, Kustov A, Rauschecker JP. Functional specialization in rhesus monkey auditory cortex. Science. 2001;292:290–293. doi: 10.1126/science.1058911.
40. Kaas JH, Hackett TA. ‘What’ and ‘where’ processing in auditory cortex. Nat. Neurosci. 1999;2:1045–1047. doi: 10.1038/15967.
41. Lomber SG, Malhotra S. Double dissociation of ‘what’ and ‘where’ processing in auditory cortex. Nat. Neurosci. 2008;11:609–616. doi: 10.1038/nn.2108.
42. Ghazanfar AA, Schroeder CE. Is neocortex essentially multisensory? Trends Cogn. Sci. 2006;10:278–285. doi: 10.1016/j.tics.2006.04.008.
43. Harrington IA, Stecker GC, Macpherson EA, Middlebrooks JC. Spatial sensitivity of neurons in the anterior, posterior, and primary fields of cat auditory cortex. Hear. Res. 2008;240:22–41. doi: 10.1016/j.heares.2008.02.004.
44. Recanzone GH. Representation of con-specific vocalizations in the core and belt areas of the auditory cortex in the alert macaque monkey. J. Neurosci. 2008;28:13184–13193. doi: 10.1523/JNEUROSCI.3619-08.2008.
45. Bizley JK, Walker KMM, Silverman BW, King AJ, Schnupp JWH. Interdependent encoding of pitch, timbre and spatial location in auditory cortex. J. Neurosci. 2009;29:2064–2075. doi: 10.1523/JNEUROSCI.4755-08.2009.
46. Salzman CD, Britten KH, Newsome WT. Cortical microstimulation influences perceptual judgments of motion direction. Nature. 1990;346:174–177. doi: 10.1038/346174a0.
47. Otto KJ, Rousche PJ, Kipke DR. Microstimulation in auditory cortex provides a substrate for detailed behaviors. Hear. Res. 2005;210:112–117. doi: 10.1016/j.heares.2005.08.004.
48. Yang Y, DeWeese MR, Otazu GH, Zador AM. Millisecond-scale differences in neural activity in auditory cortex can drive decisions. Nat. Neurosci. 2008;11:1262–1263. doi: 10.1038/nn.2211.
49. Huber D, et al. Sparse optical microstimulation in barrel cortex drives learned behavior in freely moving mice. Nature. 2008;451:61–64. doi: 10.1038/nature06445.
50. Hubel DH, Henson CO, Rupert A, Galambos R. Attention units in the auditory cortex. Science. 1959;129:1279–1280. doi: 10.1126/science.129.3358.1279.
