Abstract
One of the most important tasks for 3D vision is tracking the movement of objects in space. The ability of early blind individuals to understand motion in the environment from noisy and unreliable auditory information is an impressive example of cortical adaptation that is only just beginning to be understood. Here, we compare visual and auditory motion processing, and discuss the effect of early blindness on the perception of auditory motion. Blindness leads to cross-modal recruitment of the visual motion area hMT+ for auditory motion processing. Meanwhile, the planum temporale, associated with auditory motion in sighted individuals, shows reduced selectivity for auditory motion. We discuss how this dramatic shift in the cortical basis of motion processing might influence the perceptual experience of motion in early blind individuals.
This article is part of a discussion meeting issue ‘New approaches to 3D vision’.
Keywords: vision, audition, motion, blindness, plasticity
1. Introduction
Understanding the motion of objects is critical to the ability of individuals to track and interact with objects in the 3D environment. In sighted individuals, these tasks are almost exclusively carried out using visual information. Most of us experience little to no disorientation weaving our way through a moving crowd to rapidly exit an overly noisy rock concert, but would be understandably reluctant to cross a busy intersection with our eyes closed. Indeed, the word ‘blind’ is commonly used as a synonym for confusion or a general lack of situational awareness—as her irate professor approached, she was blind to the very real danger …
Yet, blind individuals are by no means ‘blindly’ and helplessly stumbling about, incapable of crossing roads. Instead, blind individuals, especially those who become blind early in life, show remarkable fluency navigating a world designed by, and for, sighted people. This navigation relies primarily on auditory information; among the four remaining senses, audition is the only modality that provides information about distant space. This switch to audition as the primary sense mediating the representation of 3D space requires remarkable sensory adaptations, including the development of novel specialized responses within deprived occipital cortex (cross-modal plasticity) and the development of hyper-expertise within existing areas. However, the mechanisms underlying this cortical plasticity are only just beginning to be understood.
Here, we focus on how early blind individuals perceive auditory motion. We begin by comparing what is known about visual versus auditory motion processing in sighted individuals. Then we discuss how blindness early in life leads to cross-modal plasticity, whereby hMT+, an area known for its selectivity for visual motion in sighted individuals, shows novel responses to auditory motion stimuli. By contrast, the planum temporale (PT), associated with auditory motion in sighted individuals, shows reduced selectivity to auditory motion as a result of blindness early in life. Finally, we discuss the effects of blindness on the perception of auditory motion.
Understanding how early blind individuals perceive auditory motion in space provides important insights both into the neuroanatomical basis of cortical plasticity, and into how the computations underlying our perception of 3D space are optimized to match the available sensory information.
2. Auditory versus visual spatial resolution
One important factor to consider when comparing the computational mechanisms underlying visual and auditory motion processing is the difference in the precision of spatial localization [1] between these two modalities. Humans are capable of detecting visual spatial offsets as small as 2–5 arcsec [2,3], and thresholds for detecting positional displacements for successively presented visual stimuli can be as low as 5 arcmin [4,5]. By contrast, free-field localization of auditory click trains produces discrimination thresholds on the order of 1° [6,7].
The visual and auditory systems also represent spatial location very differently. Throughout early visual cortex, retinal position is directly encoded with remarkable fidelity: neural tuning incorporates information about retinal position, with receptive fields that represent less than 1° within the fovea [8]. In comparison, auditory spatial tuning is broad, within both the primary auditory cortex and PT [9,10], and neural tuning to sound-source location seems to be represented by an opponent process, based on differences in the activity of two broadly tuned channels formed by contralaterally and ipsilaterally preferring neurons [9,11,12].
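The opponent-channel idea can be made concrete with a minimal sketch (the sigmoidal channel shapes and slope parameter below are our own illustrative assumptions, not values from the cited studies): azimuth is read out from the difference between two broadly tuned channels rather than from a map of narrowly tuned units.

```python
import numpy as np

# Minimal sketch of opponent-channel coding of azimuth (illustrative
# parameters, not fitted to any of the cited studies). Two broadly tuned
# channels each prefer one hemifield; location is read out from their
# difference rather than from a map of narrowly tuned units.

def channel_response(azimuth_deg, preferred_side, slope=0.03):
    """Broad sigmoidal tuning: response grows toward the preferred hemifield."""
    return 1.0 / (1.0 + np.exp(-slope * preferred_side * azimuth_deg))

azimuths = np.linspace(-90, 90, 181)
right_channel = channel_response(azimuths, preferred_side=+1)
left_channel = channel_response(azimuths, preferred_side=-1)

# The opponent signal (right minus left) varies monotonically with azimuth
# and changes most steeply near the midline, so even this coarse two-channel
# code supports relatively fine left/right judgements near straight ahead.
opponent_signal = right_channel - left_channel
print(opponent_signal[::45])  # samples from far-left to far-right
```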
By contrast to the robust retinotopic organization observed in all early visual areas [13,14], no map of auditory location (or of proxy cues, such as interaural level or phase differences) exists in mammalian auditory cortex [15]. The reason for this is not clear, but it may be related to the fact that retinotopic location and auditory frequency are directly encoded within the retina and the cochlea, respectively. Early visual area maps do not represent spatiotopic position [16], which must be computed by combining retinal signals, head position and eye position. Similarly, early visual area maps do not reflect depth, which is inferred using a combination of cues such as stereopsis and motion parallax. Like spatiotopic location and depth, sound location must be inferred through a combination of cues, including interaural level differences, interaural time differences, pinna shadows and Doppler shift. Although these cues support extremely high temporal sensitivity (e.g. gap detection on the order of 1.5–3 ms [17]), spatial resolution is poor [1].
Finally, visual information, in the absence of occlusion, is continuous, whereas auditory information is sporadic. Take two examples—watching a basketball player, and moving through a crowded airport terminal without bumping into anyone. In both scenarios the visual motion information projected from the 3D world consists of continuous object trajectories with only brief periods of occlusion. By contrast, the auditory signals projected from the 3D world are sporadic: when the player is standing still without the ball she effectively disappears. Similarly, when she is heavily guarded by the opposition, there is no way of spatially resolving her from the other players—she rapidly becomes the auditory equivalent of a sardine in a school of fish. In the airport, one can listen for footsteps, conversations or the sounds of a rolling bag, but the vast majority of objects around one remain acoustically invisible or only sporadically visible. Thus, tracking auditory motion essentially requires inferring continuously moving objects from occasional auditory ‘clues’.
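As a toy illustration of this inference problem (a minimal sketch under assumed parameters, not a model drawn from the literature discussed here), a constant-velocity Kalman filter can maintain a continuous estimate of an object's position even when imprecise auditory observations arrive only sporadically, coasting on its internal model during silent gaps.

```python
import numpy as np

# Toy tracker illustrating inference from sporadic cues (all parameters
# are hypothetical): a constant-velocity Kalman filter estimates position,
# updating only when an imprecise auditory observation arrives and
# coasting on its internal dynamics model during silence.

dt = 0.1
F = np.array([[1, dt], [0, 1]])          # constant-velocity dynamics
H = np.array([[1.0, 0.0]])               # only position is ever observed
Q = np.diag([1e-4, 1e-3])                # process noise
R = np.array([[25.0]])                   # noisy localization (~5 deg s.d.)

x = np.array([[0.0], [0.0]])             # state: [position, velocity]
P = np.eye(2)

rng = np.random.default_rng(0)
true_pos = 10.0 * np.arange(0, 5, dt)    # object moving at 10 deg/s

for pos in true_pos:
    # Predict: assume the object keeps moving while it is silent.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update only on the occasional steps where a sound is emitted.
    if rng.random() < 0.2:                # sporadic auditory 'clues'
        z = np.array([[pos + rng.normal(0, 5.0)]])
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (z - H @ x)
        P = (np.eye(2) - K @ H) @ P

print(f"true final position {true_pos[-1]:.1f}, estimate {x[0, 0]:.1f}")
```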
3. The neural basis of visual motion processing in sighted individuals
In sighted individuals, the hub of visual motion processing is hMT+ (human middle temporal cortex) [18]. hMT+ consists of regions analogous to macaque areas MT/MST/FST, along with some possible additional areas tuned for complex types of motion [19–21]. The ecological importance of MT and its surrounding motion areas cannot be overestimated: hMT+ is the largest ‘specialized’ area in the visual hierarchy. In non-human primates, MT neurons have exquisite selectivity for motion direction [22] and speed [23], as well as more complex motion patterns [24]. Neurons in MT and MST also play a crucial role in the perception of 3D space [25], including tracking objects moving in depth [26,27], inference of 3D structure from motion [28,29], motion parallax [30,31] and optic flow [32–35]. Importantly, responses in both MT and MST are associated with the conscious perception of motion. Given a stimulus with an ambiguous direction of motion, the responses of MT and MST neurons correlate with the ‘perceptual choice’ of the animal on a trial-by-trial basis [36,37], and lesions of area MT cause selective impairments of motion perception [19]. The human analogue, the hMT+ complex, has similarly been shown to have selectivity to perceived motion direction [38] and responds to complex motion patterns [39,40] and motion in depth [41,42]. Patients with damage in brain regions overlapping with hMT+ can report akinetopsia or ‘stroboscopic vision’ in which moving objects appear as frames of a cinema reel, and have great difficulties parsing moving objects [43,44].
Responses in MT neurons are classically described in terms of non-separable spatio-temporal tuning [45]. Figure 1a left panel shows a spatio-temporal description of a vertical bar moving to the right. As described by Adelson & Bergen [45], if we ignore the y-dimension (since a vertical bar is unchanging along that dimension), we can represent this stimulus using only the x–t plane (figure 1a, right panel). The moving bar becomes a slanted strip, and detecting motion is equivalent to detecting its spatio-temporal orientation. In other words, any 1D motion signal can be conceptualized as a diagonally oriented template in the space–time plane, where the orientation of the template represents velocity, v = Δx/Δt, with a steeper slant representing tuning for lower speeds.
Figure 1.
(a) Spatio-temporal depiction of a vertical bar moving to the right, redrawn from Adelson & Bergen [45]. In the left panel all three dimensions are shown, and in the right panel only the x–t slice is shown. (b) Classic separable (upper panel) and non-separable (lower panel) spatio-temporal templates in a space–time plot. Bright and dark regions correspond to excitatory and inhibitory inputs to the template, respectively. Both templates are selective for rightward motion. Responses to six spatio-temporal stimuli are shown for each template.
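The space–time construction in figure 1a is easy to reproduce numerically. In the minimal sketch below (the grid resolution and bar speed are arbitrary illustrative choices), a drifting bar is rendered as an x–t image, and its velocity is recovered as the slant of the resulting strip.

```python
import numpy as np

# Sketch of the space-time representation described above (illustrative
# resolution and speed): a 1D bar moving rightward traces a slanted strip
# in the x-t plane, so its velocity is simply the orientation of that strip.

nx, nt = 64, 64
v = 0.5                                   # bar speed, pixels per frame
stimulus = np.zeros((nt, nx))             # rows = time, columns = space
for t in range(nt):
    x_pos = int(round(v * t))             # bar position at frame t
    stimulus[t, x_pos] = 1.0

# Recover the slant (velocity) from the stimulus itself: regress bar
# position against time. The fitted slope matches the generating speed.
ts, xs = np.nonzero(stimulus)
slope = np.polyfit(ts, xs, 1)[0]
print(f"generating speed {v}, recovered slope {slope:.2f}")
```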
The upper panel of figure 1b represents a classic spatio-temporally separable template, termed separable because it is constructed as the outer product of separate spatial and temporal tuning functions (shown with dotted grey lines). The grey bars show the predicted responses of this template to a variety of moving stimuli. Although this template is clearly responsive to rightward motion, the strongest responses are elicited by a discrete flash on the left followed by one on the right. Consequently, this type of template is considered to be tuned to discrete changes in position over time rather than to continuous motion. The final three bars represent responses to analogous spatio-temporal stimuli moving leftward: all three produce suppressive responses.
The lower panel of figure 1b shows a classic non-separable template, typical of MT tuning. The largest response is to a stimulus whose velocity matches the template. Weaker responses are obtained for a stimulus whose speed is lower than the ideal stimulus for the template. It is worth noting that discrete flashes elicit strong responses, as observed by Adelson & Bergen [45]. Spatio-temporal stimuli moving leftward produce suppressive responses.
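The distinction between the two template types can be made concrete with a short sketch (the filter shapes and parameters below are our own illustrative choices, not the actual filters used to generate figure 1b). The separable template is built as the outer product of a spatial and a temporal tuning function; the non-separable template is oriented in space–time and is therefore tuned to a particular velocity.

```python
import numpy as np

# Illustrative sketch of the two template types in figure 1b. The separable
# template is the outer product of a spatial and a temporal tuning function;
# the non-separable template is oriented in space-time (velocity-tuned).

x = np.linspace(-1, 1, 65)
t = np.linspace(-1, 1, 65)
T, X = np.meshgrid(t, x, indexing="ij")        # rows = time, columns = space

spatial = np.sin(2 * np.pi * x) * np.exp(-x ** 2 / 0.2)
temporal = np.sin(2 * np.pi * t) * np.exp(-t ** 2 / 0.2)
separable = np.outer(temporal, spatial)        # outer product -> separable

v = 1.0                                        # template's preferred velocity
nonseparable = np.cos(2 * np.pi * (X - v * T)) * np.exp(-(X ** 2 + T ** 2) / 0.2)

def moving_bar(v_stim):
    """A bright bar drifting at velocity v_stim traces a slanted x-t strip."""
    return np.exp(-((X - v_stim * T) ** 2) / 0.01)

for v_stim in (1.0, 0.5, -1.0):                # matched, slower, and leftward
    r_sep = np.sum(separable * moving_bar(v_stim))
    r_non = np.sum(nonseparable * moving_bar(v_stim))
    print(f"v = {v_stim:+.1f}: separable {r_sep:+8.2f}, non-separable {r_non:+8.2f}")
```

With these choices, the printed responses show the pattern described in the text: the non-separable template responds most strongly when the stimulus velocity matches its preferred velocity, while the separable template responds positively to rightward and negatively (suppressively) to leftward stimuli.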
This computational specialization for visual motion in MT is evolutionarily ancient: analogues of MT can be traced through the ancestral predecessors of primates over more than 50 million years [46]. Both retinotopic organization and motion selectivity in MT appear to be at least partially ‘innate’; simple selectivity for visual motion can be observed in infant monkeys as soon as measurements can be made [47]. Thus, it seems highly likely that the non-separable spatio-temporal selectivity of MT reflects innate mechanisms designed to support the evolutionarily critical task of processing self-motion and the movement of objects in space.
4. The perception of visual motion in sighted individuals
Non-separable spatio-temporal tuning is also observed in human behavioural measurements of visual motion sensitivity. For example, using a psychophysical reverse correlation paradigm, Neri [48] estimated the templates underlying visual motion detection in sighted observers. Participants discriminated a horizontal bar moving vertically up or down (figure 2a, left panel) which was embedded in noise that consisted of stationary bars that appeared in random locations in space and time with luminance randomly drawn from a Gaussian distribution (right panel). The shape of the perceptual templates was inferred by contrasting the noise present in ‘hit’ trials (where the bar direction was correctly identified) versus ‘miss’ trials (where the bar direction was not correctly identified). The resulting estimated visual motion templates (figure 2b) resemble the non-separable spatio-temporal template of the lower panel in figure 1b—selectivity that mirrors the classic model of MT neurons [45].
Figure 2.
Visual motion templates estimated psychophysically using a reverse correlation technique, replotted from Neri [48, fig. 2l]. (a) Observers were asked to discriminate an upward versus downward moving horizontal bar (signal); a single frame is shown (left panel). Spatio-temporal noise was added to the moving signal bars; a single frame is shown (right panel). (b) Because the signal did not vary along the horizontal dimension of space, the stimulus can be fully described in a two-dimensional space–time plot (the remaining spatial dimension, denoted x below, against time t), illustrated by the yellow diagonal overlaid on the estimated perceptual visual motion template. The templates were constructed by sorting trials into four categories based on the signal direction (up, q = 1, or down, q = 0) and the participant's response (correct, z = 1, or incorrect, z = 0). The resulting template F is the difference between the average noise fields on correct and incorrect trials, F(x, t) = ⟨n(x, t)⟩_(z = 1) − ⟨n(x, t)⟩_(z = 0), where n(x, t) denotes the noise sample over space (x) and time (t), and ⟨·⟩ denotes the average across trials. The spatial axis is mirror-inverted (−x) for noise samples from downward-signal (q = 0) trials to align the template orientations. Figure from [49].
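The logic of this analysis can be illustrated with a toy simulation (the observer, stimulus dimensions and noise levels below are hypothetical). A simulated observer discriminates direction by template matching; averaging the noise fields sorted by correctness, after mirror-inverting the downward-signal trials, recovers the shape of the observer's internal template, exactly as in the formula F above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy simulation of the reverse-correlation logic (observer, dimensions
# and noise levels are hypothetical). A simulated observer discriminates
# an upward- vs downward-drifting bar in spatio-temporal noise by opponent
# template matching; sorting noise fields by correctness (z), after
# mirror-inverting downward-signal (q = 0) trials, recovers the template.

nt = nx = 9
truth = np.zeros((nt, nx))
for t in range(nt):
    truth[t, t] = 1.0                      # the 'upward' drifting-bar signal
t_up = truth - truth.mean()
t_down = t_up[:, ::-1]                     # mirror image: 'downward' template
opponent = t_up - t_down                   # observer's decision template

sum_z1 = np.zeros((nt, nx)); n_z1 = 0
sum_z0 = np.zeros((nt, nx)); n_z0 = 0
for _ in range(20000):
    q = rng.integers(2)                    # 1 = up signal, 0 = down signal
    signal = truth if q == 1 else truth[:, ::-1]
    noise = rng.normal(0, 1.5, (nt, nx))
    choice = 1 if np.sum(opponent * (0.3 * signal + noise)) > 0 else 0
    z = int(choice == q)                   # 1 = correct, 0 = incorrect
    aligned = noise if q == 1 else noise[:, ::-1]   # mirror-invert (-x)
    if z == 1:
        sum_z1 += aligned; n_z1 += 1
    else:
        sum_z0 += aligned; n_z0 += 1

# F = <n>_(z=1) - <n>_(z=0): noise that mimics the template pushes trials
# toward 'correct', so the difference image recovers the template's shape.
F = sum_z1 / n_z1 - sum_z0 / n_z0
print(np.corrcoef(F.ravel(), opponent.ravel())[0, 1])
```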
5. The neural basis of auditory motion processing in sighted individuals
In striking contrast to the clear evidence for specialized visual motion mechanisms, it is not yet clear whether or not genuine selectivity for auditory motion exists in the human brain.
In non-primate animal models, direction selectivity for auditory motion has been observed in inferior colliculus neurons, but this directional selectivity seems to reflect spatial adaptation rather than spatio-temporal tuning [50–52]. Within auditory cortex, the large majority of directionally selective neurons show separable spatio-temporal tuning (figure 1b, upper panel) [53–56], such that their maximal response occurs for sound bursts presented at two discrete locations in space at two discrete times. In general, directionally selective neural responses in auditory cortex [57] have been shown to reflect a combination of a preference for the spatial location of sounds at onset/offset and binaural interactions [58]. For example, a neuron that shows selective responses to motion might have a strong preference for a sound onset in the contralateral ear, combined with a preference for a simultaneous decrease and increase in sound intensity at the contralateral and ipsilateral ears, respectively, rather than being tuned for motion per se.
Numerous studies point to PT as the cortical region that processes auditory motion in humans and non-human primates. In non-human primates, BOLD responses to auditory motion stimuli in this area cannot simply be predicted on the basis of a linear combination of location preference and spectrotemporal tuning [59], although it remains ambiguous whether this reflects separable or non-separable tuning. In humans, PT shows stronger responses to coherent auditory motion than to stationary stimuli appearing at one location in space [60–62], responses in this area can be used to classify the direction of motion [63,64], and cortical damage in this region can lead to selective impairments in the perception of auditory direction [65]. However, whether the computation in PT is specialized for continuous auditory motion rather than discrete spatial changes over time remains an open question. BOLD responses in PT are not statistically different in magnitude (and fail to show selective event-related adaptation) for genuine auditory motion versus stationary 1 s noise bursts that vary in location [66,67]. Similarly, although ERP data in humans show distinct neural signatures for moving versus stationary stimuli [68], the patterns are similar for smooth motion, random scattered changes in sound location and abrupt displacement to a contralateral location over time [69].
6. The perception of auditory motion in sighted individuals
Human behavioural data similarly suggest a heavy reliance on separable spatio-temporal tuning. A variety of psychophysical studies support a ‘snapshot’ model of auditory motion in which the perception of a moving sound is constructed from successive, temporally integrated frames [70–72]. Consistent with this model, it is easier to discriminate stimuli of similar speeds but different durations than it is to discriminate stimuli of similar durations but different speeds [73], suggesting that duration and distance cues dominate human auditory speed perception.
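A back-of-the-envelope simulation shows why a snapshot code has this signature (the noise values below are illustrative assumptions: degrees-scale positional uncertainty versus milliseconds-scale temporal uncertainty). Under the snapshot model, speed is not sensed directly but reconstructed from two noisy positional snapshots plus precisely encoded timing, so derived speed estimates are far less reliable than the duration estimates they are built from.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy snapshot observer (all noise values are illustrative): speed is
# reconstructed from two noisy positional 'snapshots' and a precise
# duration estimate. Spatial noise (~degrees) dwarfs temporal noise
# (~milliseconds), so speed estimates inherit large variability.

speed, duration = 20.0, 1.0                    # deg/s, s
sigma_x, sigma_t = 5.0, 0.002                  # localization vs gap detection

n = 100000
x0 = rng.normal(0.0, sigma_x, n)               # snapshot at motion onset
x1 = rng.normal(speed * duration, sigma_x, n)  # snapshot at motion offset
dur = rng.normal(duration, sigma_t, n)

speed_hat = (x1 - x0) / dur
print(f"duration CV: {np.std(dur) / np.mean(dur):.3f}")
print(f"speed    CV: {np.std(speed_hat) / np.mean(speed_hat):.3f}")
```

The coefficient of variation for duration is orders of magnitude smaller than that for derived speed, consistent with duration and distance cues dominating auditory speed judgements.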
A direct measurement of perceptual auditory motion templates also suggests the use of separable filters [49]. Here, using a psychophysical reverse correlation paradigm directly analogous to that Neri [48] used to characterize visual motion templates (figure 2), participants were asked to discriminate left versus right auditory motion embedded in spatio-temporal noise (figure 3a). By contrast to the result for visual motion, measured auditory motion templates were space–time separable (figure 3b, left panel), with large weights for sound locations at the onset and offset.
Figure 3.
Auditory motion templates for sighted and early blind individuals estimated using a psychophysical reverse correlation technique that was a direct analogue of Neri [48]. (a) Early blind (n = 8) and age- and musical-experience-matched sighted participants (n = 8) were asked to discriminate the direction of signal motion (left panel; broadband noise, 500–14 000 Hz, left versus right; leftward motion depicted). Spatio-temporal noise bursts (right panel; amplitudes drawn from a Gaussian distribution) were added to the moving signal. (b) Estimated perceptual auditory motion templates for sighted (left panel) and blind (right panel) individuals. Intensity in each cell in (a) represents sound amplitude (scale = [0 1], from black to white). Note that the temporal and spatial scales were designed to maximize the perception of auditory motion [70] and are very different from those used to measure visual motion templates by Neri [48]. Figure from [49].
Why do the templates underlying auditory motion processing differ so substantially from those for visual motion, as measured using closely analogous paradigms? An ideal observer analysis (figure 4) suggests that spatio-temporally separable templates may be optimal given the poor spatial resolution of the auditory input. When performance on these analogous motion tasks was predicted using a variety of templates, a narrowly tuned non-separable template (light grey curve), which resembles classic visual motion detectors, predicted the best performance (the lowest signal amplitude threshold), especially at low internal noise levels. However, if the spatial tuning of the filter is assumed to be broader than the width of the stimulus, or if internal noise levels are high, then the separable template (green curve) outperforms all non-separable templates with broader spatial tuning (darker grey curves). Thus, given the poor spatial selectivity of auditory signals, the optimal solution for auditory motion discrimination may be a spatio-temporally separable template.
Figure 4.
An ideal observer analysis, simulating performance in a task analogous to Neri [48] using a set of non-separable templates with varying tuning width (grey curves) and a single separable template (green curve). On each trial, the stimulus (signal + random external spatio-temporal noise, as described in the main text) was passed through each template. Internal noise was added to the template response. The model assumes a correct response when the appropriately oriented template (e.g. leftward) had a larger response than the opposite (e.g. rightward) tuned template. Signal amplitude was varied to find the predicted perceptual threshold as a function of internal noise. The separable template (green curve) performed better than broadly tuned non-separable models (darker grey) and comparably to the most narrowly tuned non-separable template (light grey) except for very low levels of internal noise.
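The following is a hedged reconstruction of this simulation logic (the templates, stimulus dimensions and noise levels are our own choices rather than the published implementation). Oppositely tuned templates compete, and performance is the fraction of trials on which the correctly oriented template wins.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hedged reconstruction of the figure 4 logic (templates, noise levels and
# stimulus dimensions are illustrative, not the published implementation).
# Rightward- and leftward-tuned templates compete; performance is the
# fraction of trials on which the correctly oriented template wins.

nt = nx = 15
T, X = np.meshgrid(np.arange(nt), np.arange(nx), indexing="ij")

def nonseparable(width):
    """Diagonal (velocity-tuned) template with a given spatial blur."""
    return np.exp(-((X - T) ** 2) / (2 * width ** 2))

def separable():
    """Onset-left/offset-right template: spatial x temporal outer products."""
    spat_l = np.exp(-(np.arange(nx) - 2) ** 2 / 8.0)
    spat_r = np.exp(-(np.arange(nx) - 12) ** 2 / 8.0)
    temp_on = np.exp(-(np.arange(nt) - 2) ** 2 / 8.0)
    temp_off = np.exp(-(np.arange(nt) - 12) ** 2 / 8.0)
    return np.outer(temp_on, spat_l) + np.outer(temp_off, spat_r)

def percent_correct(template, amp, internal_sd, n=2000):
    right = template / np.linalg.norm(template)
    left = right[:, ::-1]                  # opposite-direction template
    signal = amp * np.exp(-((X - T) ** 2) / 2.0)   # rightward stimulus
    hits = 0
    for _ in range(n):
        stim = signal + rng.normal(0, 1.0, (nt, nx))   # external noise
        r_r = np.sum(right * stim) + rng.normal(0, internal_sd)
        r_l = np.sum(left * stim) + rng.normal(0, internal_sd)
        hits += r_r > r_l
    return hits / n

for label, tmpl in [("narrow non-sep", nonseparable(1.0)),
                    ("broad non-sep", nonseparable(5.0)),
                    ("separable", separable())]:
    print(label, percent_correct(tmpl, amp=0.5, internal_sd=1.0))
```

Sweeping `amp` at each level of `internal_sd` would trace out threshold curves like those sketched in figure 4; with these illustrative settings, the broadly tuned non-separable template falls toward chance while the separable template remains competitive.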
7. The neural basis of auditory motion processing in early blind individuals
Numerous neuroimaging studies have shown selectivity for auditory motion in hMT+ as a result of early blindness [74–77]. One of the earliest of these studies, by Saenz et al. [78], compared the location of auditory motion responses in typically sighted individuals and in two sight recovery participants (who had lacked vision for most of their lives and had some vision restored in adulthood; figure 5). Both sight recovery individuals showed overlapping visual and auditory motion responses in hMT+, suggesting that auditory cross-modal responses coexisted with regained visual responses within the visual cortex. These auditory responses in hMT+ were specific to motion; the region lacked sensitivity to other complex auditory stimuli, including frequency sweeps and forward versus reversed speech. This finding has been replicated in numerous studies of early blind individuals [75–77,79–81]. The recruitment of hMT+ seems to require blindness during development: directional tuning has been observed in both early blind and sight recovery individuals, but not in late blind or typically sighted participants [76,81].
Figure 5.
Surface maps of auditory motion responses and hMT+ in two sight recovery (SR) participants. Auditory motion was generated by manipulating interaural level differences. Yellow regions responded more to moving versus stationary auditory white noise. Green and blue regions show hMT+ location (green, hMT+ overlapped by auditory motion responses; blue, hMT+ not overlapped by auditory motion responses). Note the near-complete overlap (very little blue), indicating co-localization of auditory responses with their visually defined hMT+. Figure from [78].
It has been suggested that hMT+ may be inherently multisensory, capable of representing auditory as well as visual motion signals [82]. However, auditory motion does not modulate spiking responses in marmoset MT/MST [83], and the majority of studies have failed to find auditory BOLD responses within individually localized hMT+ in sighted individuals [64,75,84] (although auditory motion responses are found in a nearby region [75]). One recent study did show successful classification of up–down versus left–right auditory motion in hMT+ in sighted individuals [85]. However, in the same dataset, motion direction could not be classified within an axis: neither up versus down nor left versus right. One concern is that up–down discrimination (74–78% correct) was much harder than left–right discrimination (96% correct), so successful classification of horizontal versus vertical motion could have been driven by arousal differences mediated by task difficulty. Another possibility is that successful classification was driven by differences in the spatial spread of cross-modal spatial attention as participants switched their attention between vertical and horizontal motion trajectories [86].
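The arousal concern is easy to demonstrate in simulation (all numbers below are hypothetical): if two conditions share an identical multivoxel pattern and differ only in overall response gain, a simple classifier will still decode condition well above chance, without any direction-specific pattern information being present.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy illustration of the arousal confound (all numbers hypothetical):
# two conditions share an identical multivoxel pattern and differ only in
# overall response gain (e.g. one task is harder). A simple classifier
# still decodes 'condition' above chance despite the absence of any
# condition-specific pattern information.

n_voxels, n_trials = 50, 200
base_pattern = rng.normal(1.0, 0.3, n_voxels)     # same pattern for both

gain = np.repeat([1.0, 1.2], n_trials // 2)       # condition = gain only
labels = np.repeat([0, 1], n_trials // 2)
data = gain[:, None] * base_pattern + rng.normal(0, 0.3, (n_trials, n_voxels))

# Split-half nearest-mean decoding: train means on even trials, test on odd.
train, test = slice(0, None, 2), slice(1, None, 2)
means = [data[train][labels[train] == c].mean(0) for c in (0, 1)]
dists = np.stack([np.linalg.norm(data[test] - m, axis=1) for m in means])
accuracy = np.mean(np.argmin(dists, axis=0) == labels[test])
print(f"decoding accuracy from a pure gain difference: {accuracy:.2f}")
```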
Curiously, the recruitment of hMT+ for auditory motion processing in early blind individuals seems to be accompanied by a loss of auditory motion selectivity within PT [64], as demonstrated by a reduced ability to classify motion direction from the pattern of BOLD responses in right PT [75,87]. Almost all examples of cortical sensory plasticity in the current literature, whether at the cellular (e.g. [88,89]) or the map (e.g. [90–92]) level, can be interpreted as the result of competition between inputs. For example, competition between inputs can explain responses to auditory motion within hMT+ in early blind individuals [75,79,81]: auditory inputs to hMT+ (whether top-down or bottom-up) benefit from a lack of competition with visual input. However, competition between inputs cannot explain the loss of selectivity in right PT, whose typical auditory input remains unperturbed. This finding suggests that competition between cortical areas may also play a significant role in developmental cortical specification, with the recruitment of hMT+ competitively reducing the role of right PT in auditory motion processing (figure 6).
Figure 6.
In early blind individuals, the altered cortical locus of auditory motion processing seems to be driven both by competition between inputs and by competition between areas for functional role.
8. The perception of auditory motion in early blind individuals
The effects of blindness on auditory motion perception are still not entirely clear. Early blind individuals perform worse than sighted individuals in a speed judgement task [93]; however, they outperform sighted individuals on a simple auditory motion discrimination task—the minimum detectable audible movement angle [94]. Early blind individuals also show an enhanced ability to understand complex auditory motion under tasks designed to mimic more ecologically valid circumstances. For example, in a complex auditory motion task involving integration of both frequency and motion information, designed to replicate the experience of crossing a road at a busy intersection, early blind individuals outperformed their sighted counterparts when asked to report the overall direction of perceived motion [79] (figure 7).
Figure 7.
(a) Stimuli consisted of eight ‘noise bands’, each occupying a different frequency range (generated by filtering white noise in the Fourier domain). For each noise band we simulated motion from right to left or vice versa, using naturalistic interaural time differences, interaural level differences and Doppler shift. The frequency range of each noise band remained constant as it travelled from left to right (or vice versa), apart from the frequency shift induced by simulating the Doppler effect. (b) For the data shown here, six noise bands travelled from left to right and two travelled from right to left. Participants were asked to report the overall direction of perceived motion. (c) Early blind participants are significantly better at determining the direction of auditory motion. Replotted from Jiang et al. [79,87].
These mixed findings for the effects of blindness on auditory motion processing resemble the differences between early blind and sighted individuals in spatial localization. Blind individuals outperform sighted individuals when judging the relative location of two sound sources [95,96], but are less accurate at auditory spatial bisection [97]. One possibility is that the ability to encode auditory spatio-temporal cues is enhanced in blind individuals, improving performance on some spatial and motion tasks [79,94], but 3D spatial calibration is impaired in the absence of vision [98,99], resulting in poorer performance on spatial and motion tasks that require a precise encoding of Euclidean auditory space [94,97].
Given the known spatio-temporal selectivity for visual motion in hMT+ discussed above, one intriguing possibility is that the enhanced auditory motion perception observed in early blind individuals on some tasks might result from recruiting non-separable spatio-temporal mechanisms innate to hMT+. However, the perceptual templates estimated for early blind individuals using the analogue of Neri [48] are also spatio-temporally separable [49], showing only subtle differences from the templates observed in sighted individuals (figure 3b, right panel). This result suggests that if auditory motion is processed in hMT+ in early blind individuals, then this must involve a shift from non-separable to separable templates within this region: a significant modification of hMT+'s normal computational operations.
One possibility is that the signals within hMT+ are not motion signals per se, but instead reflect spatial and temporal signals from either subcortical (perhaps a recruited superior colliculus [100]) or cortical [101] areas. If so, the ability to classify the direction of auditory motion in hMT+ in early blind individuals might be mediated by its retinotopic organization (if, for example, neural responses were biased towards the sound onset) rather than by its spatio-temporal tuning. It remains possible that these signals in hMT+ nonetheless play an important role, both in mediating the perceptual experience of motion and in propagating motion information to other areas in the brain. hMT+ has reciprocal projections [102] to a variety of sensorimotor areas, including parietal V6 and V6A (object motion recognition and the control of reach-to-grasp movements [103,104]), intraparietal AIP (visuo-motor transformations for grasp [105]), MIP (coordination of hand movements and visual targets [106]) and LIP (saccadic target selection), as well as frontal A4ab (motor cortex), prefrontal A8aV (frontal eye fields) and A8C (premotor cortex).
Given that it is not possible to classify motion direction based on the pattern of BOLD responses in PT in early blind individuals [75,87], it seems unlikely that the signals in hMT+ come from PT. Indeed, the loss of motion selectivity in PT as a result of early blindness seems to have no discernible perceptual consequence. This suggests that simple motion discrimination may be mediated subcortically; PT's responses to auditory motion in sighted individuals may not reflect the computation of auditory motion per se, but rather functions analogous to those of the higher level areas of hMT+ [107,108], including the segregation, identification and direction of attention to moving objects [109,110].
9. How does our qualitative experience of motion arise?
Over a wide range of conditions we perceive visual signals as motion (rather than as changes in space and time), even when the underlying stimulus consists of discrete signals [111]. This likely reflects both physiological constraints (such as a lack of sensitivity to high spatial frequencies) within the computational machinery used to compute motion within MT [112] and an ecological prior that objects move continuously in space. In either case, hMT+ seems to play a crucial role, with damage to this area causing ‘stroboscopic’ vision [43,44]. This ecological prior seems not to require statistical experience with visual motion. Sight recovery participant MM became blind at 3.5 years, had no visual memories, and had sight restored in his forties. After sight recovery he seemed to have no difficulty understanding visual motion, with no evidence of ‘stroboscopic’ motion perception, in striking contrast to his deeply and permanently impaired 3D form processing [113,114]. He did not subjectively report his first experience of visual motion as a ‘novel sensation of motion’ (as he did for colour), and he could perform tasks like biological motion recognition, which rely heavily on continuous motion perception. Similar immediate fluency with visual motion is also reported in other sight recovery studies, some involving congenitally blind individuals [115,116].
In the auditory domain, for both sighted and early blind individuals, there is a similarly strong bias towards interpreting auditory signals that change in space and time as representing motion. It seems unlikely that this prior is generated via a ‘bottom-up’ interpretation of experienced auditory signals, given that audition rarely provides continuous motion information. In sighted individuals visual information might ‘scaffold’ the interpretation of auditory signals, but this cannot explain why early blind individuals also have an understanding of motion. One possibility is that signals from hMT+ are ‘innately’ interpreted as motion by other regions of the brain, so blind individuals' understanding of motion may result from the propagation of signals from hMT+. Another possibility is that, even in the absence of vision, an understanding of motion can be mediated by higher level amodal knowledge about object permanence.
10. Conclusion
It is tempting to think of auditory and visual motion as being direct analogues of each other—representing corresponding information about the movement of objects in space, albeit in different modalities. Another perspective, however, is to appreciate that both auditory and visual motion signals represent the movement of objects in the world, but accept that they are deeply incommensurate in terms of the information that is carried about the 3D world, the quality of that information, and how this information is represented neurally. Visual information, in the absence of occlusion, is continuous, whereas auditory information is sporadic. The quality of spatial information also differs substantially across the two modalities. Visual motion is directly encoded by the changes in firing over time in retinal cells with remarkable spatial precision [8], and is carried by ‘labelled lines’ through most of the early visual hierarchy [13,14]. By contrast, sound location must be inferred through a combination of indirect cues [1]. It seems plausible that the difference in how auditory and visual motion is represented neurally is driven by differences in the nature of the input.
In early blind individuals the templates underlying auditory motion discrimination are very similar to those of sighted individuals [49]. This similarity is somewhat surprising, given the significant neural reorganization that occurs as a result of early blindness, wherein hMT+ is recruited for auditory motion [75–81] and selectivity for motion in right PT is reduced [75,79]. One possibility is that auditory motion is processed within hMT+ in early blind individuals, with a substantial change in hMT+ tuning from non-separable to separable; the similarity between early blind and sighted templates would then simply reflect homologies of function driven by the statistics of the auditory input. A second possibility is that the templates underlying auditory motion perception reflect separable computations carried out either prior to hMT+ or via feedback to it. If so, the recruitment of hMT+ may reflect its selectivity for retinotopic location rather than its neural spatio-temporal tuning. The recruitment of hMT+ may still be functionally important, given that this area provides the motion information necessary for navigating and interacting with the 3D world to numerous cortical areas [102]. The recruitment of hMT+ may also enhance the perceptual experience of continuous motion from discontinuous auditory input.
In conclusion, blind individuals make their way through the world with limited and noisy information, using neural motion representations that are heavily constrained by limitations in the sensory input. Navigating our way through a 3D world is an impressive feat of inference for sighted individuals. In the case of blind individuals, it represents a genuine triumph of cortical plasticity that is only just beginning to be understood.
Ethics
Collection of the previously unpublished data included in this article was approved by the University of Washington's Institutional Review Board and was carried out in accordance with the Code of Ethics of the Declaration of Helsinki.
Data accessibility
Data and code used in generating figures 1, 3, and 4 are provided at https://github.com/VisCog.
Authors' contributions
I.F.: conceptualization, funding acquisition, writing—original draft, writing—review and editing; W.J.P.: conceptualization, writing—original draft, writing—review and editing.
All authors gave final approval for publication and agreed to be held accountable for the work performed therein.
Conflict of interest declaration
We declare we have no competing interests.
Funding
NIH National Eye Institute Grant R01EY014645 (to I.F.) and a Weill Neurohub Postdoctoral Fellowship (to W.J.P.).
References
- 1. Mohl JT, Pearson JM, Groh JM. 2020. Monkeys and humans implement causal inference to simultaneously localize auditory and visual stimuli. J. Neurophysiol. 124, 715-727. (doi:10.1152/jn.00046.2020)
- 2. Westheimer G, McKee SP. 1977. Spatial configurations for visual hyperacuity. Vis. Res. 17, 941-947. (doi:10.1016/0042-6989(77)90069-4)
- 3. Westheimer G. 1987. Visual acuity and hyperacuity: resolution, localization, form. Am. J. Optom. Physiol. Opt. 64, 567-574. (doi:10.1097/00006324-198708000-00002)
- 4. Verdon-Roe GM, Westcott MC, Viswanathan AC, Fitzke FW, Garway-Heath DF. 2006. Exploration of the psychophysics of a motion displacement hyperacuity stimulus. Invest. Ophthalmol. Vis. Sci. 47, 4847-4855. (doi:10.1167/iovs.05-1487)
- 5. MacVeigh D, Whitaker D, Elliott DB. 1991. Spatial summation determines the contrast response of displacement threshold hyperacuity. Ophthalmic Physiol. Opt. 11, 76-80. (doi:10.1111/j.1475-1313.1991.tb00198.x)
- 6. Mills AW. 1958. On the minimum audible angle. J. Acoust. Soc. Am. 30, 237-246. (doi:10.1121/1.1909553)
- 7. Perrott DR, Saberi K. 1990. Minimum audible angle thresholds for sources varying in both elevation and azimuth. J. Acoust. Soc. Am. 87, 1728-1731. (doi:10.1121/1.399421)
- 8. Nauhaus I, Nielsen KJ, Callaway EM. 2016. Efficient receptive field tiling in primate V1. Neuron 91, 893-904. (doi:10.1016/j.neuron.2016.07.015)
- 9. Derey K, Valente G, de Gelder B, Formisano E. 2015. Opponent coding of sound location (azimuth) in planum temporale is robust to sound-level variations. Cereb. Cortex 26, 450-464. (doi:10.1093/cercor/bhv269)
- 10. van der Heijden K, Rauschecker JP, Formisano E, Valente G, de Gelder B. 2018. Active sound localization sharpens spatial tuning in human primary auditory cortex. J. Neurosci. 38, 8574-8587. (doi:10.1523/jneurosci.0587-18.2018)
- 11. Stecker GC, Harrington IA, Middlebrooks JC. 2005. Location coding by opponent neural populations in the auditory cortex. PLoS Biol. 3, e78. (doi:10.1371/journal.pbio.0030078)
- 12. McAlpine D, Jiang D, Palmer AR. 2001. A neural code for low-frequency sound localization in mammals. Nat. Neurosci. 4, 396-401. (doi:10.1038/86049)
- 13. Engel SA, Glover GH, Wandell BA. 1997. Retinotopic organization in human visual cortex and the spatial precision of functional MRI. Cereb. Cortex 7, 181-192. (doi:10.1093/cercor/7.2.181)
- 14. Wandell BA, Dumoulin SO, Brewer AA. 2007. Visual field maps in human cortex. Neuron 56, 366-383. (doi:10.1016/j.neuron.2007.10.012)
- 15. Middlebrooks JC. 2021. A search for a cortical map of auditory space. J. Neurosci. 41, 5772-5778. (doi:10.1523/JNEUROSCI.0501-21.2021)
- 16. Gardner JL, Merriam EP, Movshon JA, Heeger DJ. 2008. Maps of visual space in human occipital cortex are retinotopic, not spatiotopic. J. Neurosci. 28, 3988-3999. (doi:10.1523/jneurosci.5476-07.2008)
- 17. Kumar P, Sanju HK, Nikhil J. 2016. Temporal resolution and active auditory discrimination skill in vocal musicians. Int. Arch. Otorhinolaryngol. 20, 310-314. (doi:10.1055/s-0035-1570312)
- 18. Park WJ, Tadin D. 2018. Motion perception. In Stevens' handbook of experimental psychology and cognitive neuroscience (eds Stevens S, Wixted JT). New York, NY: John Wiley & Sons, Inc.
- 19. Newsome WT, Paré EB. 1988. A selective impairment of motion perception following lesions of the middle temporal visual area (MT). J. Neurosci. 8, 2201-2211. (doi:10.1523/jneurosci.08-06-02201.1988)
- 20. Born RT, Bradley DC. 2005. Structure and function of visual area MT. Annu. Rev. Neurosci. 28, 157-189. (doi:10.1146/annurev.neuro.26.041002.131052)
- 21. Desimone R, Ungerleider LG. 1986. Multiple visual areas in the caudal superior temporal sulcus of the macaque. J. Comp. Neurol. 248, 164-189. (doi:10.1002/cne.902480203)
- 22. Albright TD, Desimone R, Gross CG. 1984. Columnar organization of directionally selective cells in visual area MT of the macaque. J. Neurophysiol. 51, 16-31. (doi:10.1152/jn.1984.51.1.16)
- 23. Priebe NJ, Cassanello CR, Lisberger SG. 2003. The neural representation of speed in macaque area MT/V5. J. Neurosci. 23, 5650-5661. (doi:10.1523/jneurosci.23-13-05650.2003)
- 24. Rust NC, Mante V, Simoncelli EP, Movshon JA. 2006. How MT cells analyze the motion of visual patterns. Nat. Neurosci. 9, 1421-1431. (doi:10.1038/nn1786)
- 25. Cormack LK, Czuba TB, Knöll J, Huk AC. 2017. Binocular mechanisms of 3D motion processing. Annu. Rev. Vis. Sci. 3, 297-318. (doi:10.1146/annurev-vision-102016-061259)
- 26. Czuba TB, Huk AC, Cormack LK, Kohn A. 2014. Area MT encodes three-dimensional motion. J. Neurosci. 34, 15522-15533. (doi:10.1523/jneurosci.1081-14.2014)
- 27. Sanada TM, DeAngelis GC. 2014. Neural representation of motion-in-depth in area MT. J. Neurosci. 34, 15508-15521. (doi:10.1523/JNEUROSCI.1072-14.2014)
- 28. Bradley DC, Chang GC, Andersen RA. 1998. Encoding of three-dimensional structure-from-motion by primate area MT neurons. Nature 392, 714-717. (doi:10.1038/33688)
- 29. Kim HR, Angelaki DE, DeAngelis GC. 2015. A functional link between MT neurons and depth perception based on motion parallax. J. Neurosci. 35, 2766-2777. (doi:10.1523/jneurosci.3134-14.2015)
- 30. Nadler JW, Angelaki DE, DeAngelis GC. 2008. A neural representation of depth from motion parallax in macaque visual cortex. Nature 452, 642-645. (doi:10.1038/nature06814)
- 31. Roy J-P, Wurtz RH. 1990. The role of disparity-sensitive cortical neurons in signalling the direction of self-motion. Nature 348, 160-162. (doi:10.1038/348160a0)
- 32. Duffy CJ. 1998. MST neurons respond to optic flow and translational movement. J. Neurophysiol. 80, 1816-1827. (doi:10.1152/jn.1998.80.4.1816)
- 33. Bremmer F, Duhamel JR, Ben Hamed S, Graf W. 2002. Heading encoding in the macaque ventral intraparietal area (VIP). Eur. J. Neurosci. 16, 1554-1568. (doi:10.1046/j.1460-9568.2002.02207.x)
- 34. Duffy CJ, Wurtz RH. 1991. Sensitivity of MST neurons to optic flow stimuli. I. A continuum of response selectivity to large-field stimuli. J. Neurophysiol. 65, 1329-1345. (doi:10.1152/jn.1991.65.6.1329)
- 35. Lagae L, Maes H, Raiguel S, Xiao D, Orban GA. 1994. Responses of macaque STS neurons to optic flow components: a comparison of areas MT and MST. J. Neurophysiol. 71, 1597-1626. (doi:10.1152/jn.1994.71.5.1597)
- 36. Britten KH, Newsome WT, Shadlen MN, Celebrini S, Movshon JA. 1996. A relationship between behavioral choice and the visual responses of neurons in macaque MT. Vis. Neurosci. 13, 87-100. (doi:10.1017/S095252380000715X)
- 37. Yu X, Gu Y. 2018. Probing sensory readout via combined choice-correlation measures and microstimulation perturbation. Neuron 100, 715-727. (doi:10.1016/j.neuron.2018.08.034)
- 38. Serences JT, Boynton GM. 2007. The representation of behavioral choice for motion in human visual cortex. J. Neurosci. 27, 12893-12899. (doi:10.1523/JNEUROSCI.4021-07.2007)
- 39. Huk AC, Heeger DJ. 2002. Pattern-motion responses in human visual cortex. Nat. Neurosci. 5, 72-75. (doi:10.1038/nn774)
- 40. Braddick OJ, O'Brien JMD, Wattam-Bell J, Atkinson J, Hartley T, Turner R. 2001. Brain areas sensitive to coherent visual motion. Perception 30, 61-72. (doi:10.1068/p3048)
- 41. Likova LT, Tyler CW. 2007. Stereomotion processing in the human occipital cortex. Neuroimage 38, 293-305. (doi:10.1016/j.neuroimage.2007.06.039)
- 42. Rokers B, Cormack LK, Huk AC. 2009. Disparity- and velocity-based signals for three-dimensional motion perception in human MT+. Nat. Neurosci. 12, 1050. (doi:10.1038/nn.2343)
- 43. Zihl J, von Cramon D, Mai N. 1983. Selective disturbance of movement vision after bilateral brain damage. Brain 106, 313-340. (doi:10.1093/brain/106.2.313)
- 44. Zeki S. 1991. Cerebral akinetopsia (visual motion blindness): a review. Brain 114, 811-824. (doi:10.1093/brain/114.2.811)
- 45. Adelson EH, Bergen JR. 1985. Spatiotemporal energy models for the perception of motion. J. Opt. Soc. Am. A 2, 284-299. (doi:10.1364/JOSAA.2.000284)
- 46. Kaas JH. 2012. The evolution of neocortex in primates. Prog. Brain Res. 195, 91-102. (doi:10.1016/B978-0-444-53860-4.00005-2)
- 47. Movshon JA, Rust NC, Kohn A, Kiorpes L, Hawken M. 2003. Receptive field properties of MT neurons in infant macaques. In 2003 Neuroscience Meeting Planner, New Orleans, LA. Society for Neuroscience. See https://www.sfn.org/meetings/past-and-future-annual-meetings/abstract-archive/abstract-archive-details?absID=2808&absyear=2003.
- 48. Neri P. 2014. Dynamic engagement of human motion detectors across space–time coordinates. J. Neurosci. 34, 8449-8461. (doi:10.1523/jneurosci.5434-13.2014)
- 49. Park WJ, Fine I. 2022. The perception of auditory motion in sighted and early blind individuals. bioRxiv. (doi:10.1101/424622)
- 50. Wang Y, Peña JL. 2013. Direction selectivity mediated by adaptation in the owl's inferior colliculus. J. Neurosci. 33, 19167-19175. (doi:10.1523/jneurosci.2920-13.2013)
- 51. Ingham NJ, Hart HC, McAlpine D. 2001. Spatial receptive fields of inferior colliculus neurons to auditory apparent motion in free field. J. Neurophysiol. 85, 23-33. (doi:10.1152/jn.2001.85.1.23)
- 52. McAlpine D, Jiang D, Shackleton TM, Palmer AR. 2000. Responses of neurons in the inferior colliculus to dynamic interaural phase cues: evidence for a mechanism of binaural adaptation. J. Neurophysiol. 83, 1356-1365. (doi:10.1152/jn.2000.83.3.1356)
- 53. Ahissar M, Ahissar E, Bergman H, Vaadia E. 1992. Encoding of sound-source location and movement: activity of single neurons and interactions between adjacent neurons in the monkey auditory cortex. J. Neurophysiol. 67, 203-215. (doi:10.1152/jn.1992.67.1.203)
- 54. Sovijärvi AR, Hyvärinen J. 1974. Auditory cortical neurons in the cat sensitive to the direction of sound source movement. Brain Res. 73, 455-471. (doi:10.1016/0006-8993(74)90669-6)
- 55. Jenison RL, Schnupp JW, Reale RA, Brugge JF. 2001. Auditory space–time receptive field dynamics revealed by spherical white-noise analysis. J. Neurosci. 21, 4408-4415. (doi:10.1523/jneurosci.21-12-04408.2001)
- 56. Massoudi R, Van Wanrooij MM, Versnel H, Van Opstal AJ. 2015. Spectrotemporal response properties of core auditory cortex neurons in awake monkey. PLoS ONE 10, e0116118. (doi:10.1371/journal.pone.0116118)
- 57. Stumpf E, Toronchuk JM, Cynader MS. 1992. Neurons in cat primary auditory cortex sensitive to correlates of auditory motion in three-dimensional space. Exp. Brain Res. 88, 158-168. (doi:10.1007/bf02259137)
- 58. Toronchuk JM, Stumpf E, Cynader MS. 1992. Auditory cortex neurons sensitive to correlates of auditory motion: underlying mechanisms. Exp. Brain Res. 88, 169-180. (doi:10.1007/BF02259138)
- 59. Poirier C, et al. 2017. Auditory motion-specific mechanisms in the primate brain. PLoS Biol. 15, e2001379. (doi:10.1371/journal.pbio.2001379)
- 60. Griffiths TD, Green GG, Rees A, Rees G. 2000. Human brain areas involved in the analysis of auditory movement. Hum. Brain Mapp. 9, 72-80.
- 61. Griffiths TD, Rees A, Witton C, Shakir RAA, Henning GB, Green GG. 1996. Evidence for a sound movement area in the human cerebral cortex. Nature 383, 425-427. (doi:10.1038/383425a0)
- 62. Warren JD, Zielinski BA, Green GG, Rauschecker JP, Griffiths TD. 2002. Perception of sound-source motion by the human brain. Neuron 34, 139-148. (doi:10.1016/S0896-6273(02)00637-2)
- 63. Battal C, Rezk M, Mattioni S, Vadlamudi J, Collignon O. 2019. Representation of auditory motion directions and sound source locations in the human planum temporale. J. Neurosci. 39, 2208-2220. (doi:10.1523/jneurosci.2289-18.2018)
- 64. Alink A, Euler F, Kriegeskorte N, Singer W, Kohler A. 2012. Auditory motion direction encoding in auditory cortex and high-level visual cortex. Hum. Brain Mapp. 33, 969-978. (doi:10.1002/hbm.21263)
- 65. Thaler L, et al. 2016. A selective impairment of perception of sound motion direction in peripheral space: a case study. Neuropsychologia 80, 79-89. (doi:10.1016/j.neuropsychologia.2015.11.008)
- 66. Smith KR, Okada K, Saberi K, Hickok G. 2004. Human cortical auditory motion areas are not motion selective. Neuroreport 15, 1523-1526. (doi:10.1097/01.wnr.0000130233.43788.4b)
- 67. Smith KR, Saberi K, Hickok G. 2007. An event-related fMRI study of auditory motion perception: no evidence for a specialized cortical system. Brain Res. 1150, 94-99. (doi:10.1016/j.brainres.2007.03.003)
- 68. Ducommun CY, Murray MM, Thut G, Bellmann A, Viaud-Delmon I, Clarke S, Michel CM. 2002. Segregated processing of auditory motion and auditory location: an ERP mapping study. Neuroimage 16, 76-88. (doi:10.1006/nimg.2002.1062)
- 69. Getzmann S, Lewald J. 2012. Cortical processing of change in sound location: smooth motion versus discontinuous displacement. Brain Res. 1466, 119-127. (doi:10.1016/j.brainres.2012.05.033)
- 70. Grantham DW. 1986. Detection and discrimination of simulated motion of auditory targets in the horizontal plane. J. Acoust. Soc. Am. 79, 1939-1949. (doi:10.1121/1.393201)
- 71. Roggerone V, Vacher J, Tarlao C, Guastavino C. 2019. Auditory motion perception emerges from successive sound localizations integrated over time. Sci. Rep. 9, 16437. (doi:10.1038/s41598-019-52742-0)
- 72. Carlile S, Leung J. 2016. The perception of auditory motion. Trends Hear. 20, 2331216516644254. (doi:10.1177/2331216516644254)
- 73. Freeman TCA, Leung J, Wufong E, Orchard-Mills E, Carlile S, Alais D. 2014. Discrimination contours for moving sounds reveal duration and distance cues dominate auditory speed perception. PLoS ONE 9, e102864. (doi:10.1371/journal.pone.0102864)
- 74. Lewis LB, Saenz M, Fine I. 2010. Mechanisms of cross-modal plasticity in early-blind subjects. J. Neurophysiol. 104, 2995-3008. (doi:10.1152/jn.00983.2009)
- 75. Dormal G, Rezk M, Yakobov E, Lepore F, Collignon O. 2016. Auditory motion in the sighted and blind: early visual deprivation triggers a large-scale imbalance between auditory and ‘visual’ brain regions. Neuroimage 134, 630-644. (doi:10.1016/j.neuroimage.2016.04.027)
- 76. Bedny M, Konkle T, Pelphrey K, Saxe R, Pascual-Leone A. 2010. Sensitive period for a multimodal response in human visual motion area MT/MST. Curr. Biol. 20, 1900-1906. (doi:10.1016/j.cub.2010.09.044)
- 77. Poirier C, Collignon O, Scheiber C, Renier L, Vanlierde A, Tranduy D, Veraart C, De Volder AG. 2006. Auditory motion perception activates visual motion areas in early blind subjects. Neuroimage 31, 279-285. (doi:10.1016/j.neuroimage.2005.11.036)
- 78. Saenz M, Lewis LB, Huth AG, Fine I, Koch C. 2008. Visual motion area MT+/V5 responds to auditory motion in human sight-recovery subjects. J. Neurosci. 28, 5141-5148. (doi:10.1523/JNEUROSCI.0803-08.2008)
- 79. Jiang F, Stecker GC, Fine I. 2014. Auditory motion processing after early blindness. J. Vis. 14, 4. (doi:10.1167/14.13.4)
- 80. Wolbers T, Zahorik P, Giudice NA. 2011. Decoding the direction of auditory motion in blind humans. Neuroimage 56, 681-687. (doi:10.1016/j.neuroimage.2010.04.266)
- 81. Jiang F, Stecker GC, Boynton GM, Fine I. 2016. Early blindness results in developmental plasticity for auditory motion processing within auditory and occipital cortex. Front. Hum. Neurosci. 10, 324.
- 82. Pascual-Leone A, Hamilton R. 2001. The metamodal organization of the brain. Prog. Brain Res. 134, 427-445. (doi:10.1016/s0079-6123(01)34028-1)
- 83. Chaplin TA, Allitt BJ, Hagan MA, Rosa MGP, Rajan R, Lui LL. 2018. Auditory motion does not modulate spiking activity in the middle temporal and medial superior temporal visual areas. Eur. J. Neurosci. 48, 2013-2029. (doi:10.1111/ejn.14071)
- 84. Lewis JW, Beauchamp MS, DeYoe EA. 2000. A comparison of visual and auditory motion processing in human cerebral cortex. Cereb. Cortex 10, 873-888. (doi:10.1093/cercor/10.9.873)
- 85. Rezk M, Cattoir S, Battal C, Occelli V, Mattioni S, Collignon O. 2020. Shared representation of visual and auditory motion directions in the human middle-temporal cortex. Curr. Biol. 30, 2289-2299. (doi:10.1016/j.cub.2020.04.039)
- 86. Ciaramitaro VM, Buracas GT, Boynton GM. 2007. Spatial and cross-modal attention alter responses to unattended sensory information in early visual and auditory human cortex. J. Neurophysiol. 98, 2399-2413. (doi:10.1152/jn.00580.2007)
- 87. Jiang F, Huber E, Thomas J, Stecker GC, Boynton G, Fine I. 2015. Frequency-tuned auditory motion responses within hMT+ as a result of early blindness. J. Vis. 15, 128. (doi:10.1167/15.12.128)
- 88. Mitchell DE, Timney B. 2011. Postnatal development of function in the mammalian visual system. In Handbook of physiology: Section I. The nervous system (ed. Darian-Smith I), pp. 507-555. Bethesda, MD: American Physiological Society.
- 89. Katz LC, Shatz CJ. 1996. Synaptic activity and the construction of cortical circuits. Science 274, 1133-1138. (doi:10.1126/science.274.5290.1133)
- 90. Darian-Smith C, Gilbert C. 1995. Topographic reorganization in the striate cortex of the adult cat and monkey is cortically mediated. J. Neurosci. 15, 1631-1647. (doi:10.1523/jneurosci.15-03-01631.1995)
- 91. Merzenich MM, Kaas JH, Wall J, Nelson RJ, Sur M, Felleman D. 1983. Topographic reorganization of somatosensory cortical areas 3b and 1 in adult monkeys following restricted deafferentation. Neuroscience 8, 33-55. (doi:10.1016/0306-4522(83)90024-6)
- 92. Keliris GA, Shao Y, Schmid MC, Augath M, Logothetis NK, Smirnakis SM. 2022. Macaque area V2/V3 reorganization following homonymous retinal lesions. Front. Neurosci. 16, 757091. (doi:10.3389/fnins.2022.757091)
- 93. Bertonati G, Amadeo MB, Campus C, Gori M. 2021. Auditory speed processing in sighted and blind individuals. PLoS ONE 16, e0257676. (doi:10.1371/journal.pone.0257676)
- 94. Lewald J. 2013. Exceptional ability of blind humans to hear sound motion: implications for the emergence of auditory space. Neuropsychologia 51, 181-186. (doi:10.1016/j.neuropsychologia.2012.11.017)
- 95. Battal C, Occelli V, Bertonati G, Falagiarda F, Collignon O. 2020. General enhancement of spatial hearing in congenitally blind people. Psychol. Sci. 31, 1129-1139. (doi:10.1177/0956797620935584)
- 96. Röder B, Teder-Sälejärvi W, Sterr A, Rösler F, Hillyard SA, Neville HJ. 1999. Improved auditory spatial tuning in blind humans. Nature 400, 162-166. (doi:10.1038/22106)
- 97. Gori M, Sandini G, Martinoli C, Burr DC. 2014. Impairment of auditory spatial localization in congenitally blind human subjects. Brain 137, 288-293. (doi:10.1093/brain/awt311)
- 98. Finocchietti S, Cappagli G, Gori M. 2017. Auditory spatial recalibration in congenital blind individuals. Front. Neurosci. 11, 76. (doi:10.3389/fnins.2017.00076)
- 99. Finocchietti S, Cappagli G, Gori M. 2015. Encoding audio motion: spatial impairment in early blind individuals. Front. Psychol. 6, 1357. (doi:10.3389/fpsyg.2015.01357)
- 100. Coullon GSL, Jiang F, Fine I, Watkins KE, Bridge H. 2015. Subcortical functional reorganization due to early blindness. J. Neurophysiol. 113, 2889-2899. (doi:10.1152/jn.01031.2014)
- 101. Gurtubay-Antolin A, Battal C, Maffei C, Rezk M, Mattioni S, Jovicich J, Collignon O. 2021. Direct structural connections between auditory and visual motion-selective regions in humans. J. Neurosci. 41, 2393-2405. (doi:10.1523/JNEUROSCI.1552-20.2021)
- 102. Abe H, et al. 2018. Axonal projections from the middle temporal area in the common marmoset. Front. Neuroanat. 12, 89. (doi:10.3389/fnana.2018.00089)
- 103. Pitzalis S, Fattori P, Galletti C. 2013. The functional role of the medial motion area V6. Front. Behav. Neurosci. 6, 91. (doi:10.3389/fnbeh.2012.00091)
- 104. Gamberini M, Galletti C, Bosco A, Breveglieri R, Fattori P. 2011. Is the medial posterior parietal area V6A a single functional area? J. Neurosci. 31, 5145-5157. (doi:10.1523/jneurosci.5489-10.2011)
- 105. Nelissen K, Fiave PA, Vanduffel W. 2017. Decoding grasping movements from the parieto-frontal reaching circuit in the nonhuman primate. Cereb. Cortex 28, 1245-1259. (doi:10.1093/cercor/bhx037)
- 106. Grefkes C, Fink GR. 2005. The functional organization of the intraparietal sulcus in humans and monkeys. J. Anat. 207, 3-17. (doi:10.1111/j.1469-7580.2005.00426.x)
- 107. Tanaka K, Sugita Y, Moriya M, Saito H-A. 1993. Analysis of object motion in the ventral part of the medial superior temporal area of the macaque visual cortex. J. Neurophysiol. 69, 128-142. (doi:10.1152/jn.1993.69.1.128)
- 108. Nelissen K, Vanduffel W, Orban GA. 2006. Charting the lower superior temporal region, a new motion-sensitive region in monkey superior temporal sulcus. J. Neurosci. 26, 5929-5947. (doi:10.1523/jneurosci.0824-06.2006)
- 109. Hirnstein M, Westerhausen R, Hugdahl K. 2013. The right planum temporale is involved in stimulus-driven, auditory attention—evidence from transcranial magnetic stimulation. PLoS ONE 8, e57316. (doi:10.1371/journal.pone.0057316)
- 110. Griffiths TD, Warren JD. 2002. The planum temporale as a computational hub. Trends Neurosci. 25, 348-353. (doi:10.1016/S0166-2236(02)02191-4)
- 111. Liu T, Slotnick SD, Yantis S. 2004. Human MT+ mediates perceptual filling-in during apparent motion. Neuroimage 21, 1772-1780. (doi:10.1016/j.neuroimage.2003.12.025)
- 112. Morgan MJ, Barlow HB, Longuet-Higgins HC, Sutherland NS. 1980. Analogue models of motion perception. Phil. Trans. R. Soc. Lond. B 290, 117-135. (doi:10.1098/rstb.1980.0086)
- 113. Fine I, Wade AR, Brewer AA, May MG, Goodman DF, Boynton GM, Wandell BA, MacLeod DI. 2003. Long-term deprivation affects visual perception and cortex. Nat. Neurosci. 6, 915-916. (doi:10.1038/nn1102)
- 114. Huber E, Webster JM, Brewer AA, MacLeod DI, Wandell BA, Boynton GM, Wade AR, Fine I. 2015. A lack of experience-dependent plasticity after more than a decade of recovered sight. Psychol. Sci. 26, 393-401. (doi:10.1177/0956797614563957)
- 115. Ostrovsky Y, Andalman A, Sinha P. 2006. Vision following extended congenital blindness. Psychol. Sci. 17, 1009-1014. (doi:10.1111/j.1467-9280.2006.01827.x)
- 116. Ostrovsky Y, Meyers E, Ganesh S, Mathur U, Sinha P. 2009. Visual parsing after recovery from blindness. Psychol. Sci. 20, 1484-1491. (doi:10.1111/j.1467-9280.2009.02471.x)