Published in final edited form as: Neuropsychologia. 2014 Aug 13;64:105–123. doi: 10.1016/j.neuropsychologia.2014.08.005

The Construct of the Multisensory Temporal Binding Window and its Dysregulation in Developmental Disabilities

Mark T Wallace 1,2,3,4, Ryan A Stevenson 5

Abstract

Behavior, perception and cognition are strongly shaped by the synthesis of information across the different sensory modalities. Such multisensory integration often results in performance and perceptual benefits that reflect the additional information conferred by having cues from multiple senses providing redundant or complementary information. The spatial and temporal relationships of these cues provide powerful statistical information about how they should be integrated or “bound” in order to create a unified perceptual representation. Much recent work has examined the temporal factors that are integral to multisensory processing, with much of it focused on the construct of the multisensory temporal binding window – the epoch of time within which stimuli from different modalities are likely to be integrated and perceptually bound. Emerging evidence suggests that this temporal window is altered in a series of neurodevelopmental disorders, including autism, dyslexia and schizophrenia. Beyond their role in sensory processing, these deficits in multisensory temporal function may play an important role in the perceptual and cognitive weaknesses that characterize these clinical disorders. Within this context, a focus on improving the acuity of multisensory temporal function may have important implications for the amelioration of the “higher-order” deficits that serve as the defining features of these disorders.

Introduction

We live in a world rich with information about the events and objects around us. This information comes in a variety of different forms; forms that we generally ascribe to our different senses. Although neuroscience has generally approached the study of sensory processes on a modality-by-modality basis, our perceptual view of the world is an integrated and holistic one in which these sensory cues are blended seamlessly into a singular perceptual Gestalt. Such a multisensory perspective cries out for an intensive investigation of how information from the different senses is combined by the brain to influence our behaviors and shape our perceptions, a field that has emerged over the past 25 years and which is now growing at an impressive pace.

Beyond simply acknowledging the necessity of merging information from the different senses in order to build our perceptual reality, it must also be pointed out that the synthesis of multisensory information confers powerful behavioral and perceptual advantages (for recent reviews see [1-4]). Indeed, the driving evolutionary forces that undoubtedly led to multisensory systems are the powerful adaptive benefits seen when information is available from more than a single sense. For example, in animal behavior, the presence of cues from multiple senses has been shown to result in improvements in stimulus detection, discrimination and localization that manifest as faster and more accurate responses. In a similar manner, human studies have revealed multisensory-mediated performance benefits in a host of behavioral and perceptual tasks. Several of the more salient of these include the speeding of simple reaction times under paired visual-auditory stimulation and increased intelligibility of a speech signal when presented in a multisensory (i.e., audiovisual) context within a noisy environment [5-13].

A great deal of work has gone into examining the neural correlates of these multisensory-mediated changes in behavior and perception. These studies have detailed the presence and organization of a number of cortical and subcortical structures within which information from multiple senses converges, and the neural integration that accompanies this convergence in both humans [6, 7, 14-52] and animals [52-82]. In addition, a great deal of recent work has gone into describing the modulatory influences that a “non-dominant” modality can have on information processing within the “dominant” modality, such as examining how visual information can affect the processing of sounds within auditory cortex [83, 84]. Indeed, these observations have spurred a debate as to whether or not the entire cerebral cortex (and by extension the entire brain) can be considered “multisensory” [85, 86]. Collectively, these studies have greatly illuminated our understanding of how information from the different senses interacts to influence neural and network responses, and how these responses are ultimately correlated with behavior and perception.

A “principled” view into multisensory processing

Along with detailing how neuronal, behavioral and perceptual responses are altered under multisensory conditions, prior work has also revealed key operational characteristics of these multisensory interactions. Perhaps most important among these is the general finding that the physical characteristics of the stimuli to be combined are important determinants of the end product of a multisensory interaction. First studied at the level of the individual neuron, these stimulus factors include the characteristics of space, time and effectiveness. With regard to space and time, multisensory (e.g., visual-auditory) stimuli that are spatially and temporally proximate typically result in the largest enhancements in neuronal response [56, 58, 66, 87-90]. In addition, stimuli that are weakly effective when presented on their own result in proportionately larger enhancements when combined [71, 91, 92]. These basic integrative principles make a great deal of intuitive sense, in that space and time are powerful statistical indicators of the likelihood that stimuli arise from the same event, and in that a highly salient or effective stimulus in one modality needs little amplification. Recent work has added to our understanding of the role that these stimulus factors play in multisensory interactions by highlighting their interdependency [58, 93-97]. Thus, one cannot view space, time and effectiveness as independent entities, since manipulations of one, for example spatial location, will also impact the effectiveness of those stimuli and the temporal firing patterns associated with them.
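
The proportional enhancements described above are conventionally quantified with the interactive (enhancement) index of Meredith and Stein: the percent gain of the multisensory response over the most effective unisensory response. The brief Python sketch below restates that formula with invented spike counts (not measured data) simply to illustrate how weakly effective stimuli yield proportionately larger gains, i.e., inverse effectiveness.

```python
def enhancement_index(multi, best_uni):
    """Percent multisensory gain relative to the most effective unisensory response."""
    return 100.0 * (multi - best_uni) / best_uni

# Hypothetical mean spike counts (illustrative only): a weakly effective and a
# highly effective visual-auditory pairing for the same neuron.
weak = {"visual": 2.0, "auditory": 3.0, "combined": 9.0}
strong = {"visual": 20.0, "auditory": 25.0, "combined": 32.0}

for label, r in (("weakly effective", weak), ("highly effective", strong)):
    gain = enhancement_index(r["combined"], max(r["visual"], r["auditory"]))
    print(f"{label} stimuli: {gain:.0f}% enhancement")
# The weakly effective pairing yields the larger proportional gain (inverse effectiveness).
```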

Following the description of these principles at the neuronal level, a number of studies have followed up on this work in the behavioral and perceptual realms, and have shown that these principles often extend into these domains as well. Thus, behavioral and perceptual facilitations have been shown to be greatest for stimuli that are close together in space and time [90, 98-128], and the proportional benefits of combining stimuli across different modalities appear to be greatest when the individual stimuli are weakly effective [18, 21, 22, 113]. In addition, and much like for the neuronal data described above, recent studies have also illustrated the interdependency of these principles in human performance and perception [20, 58, 129].

One area of very active research is the applicability of these principles for describing all aspects of human performance and perception. Although first driven by studies showing exceptions to the spatial, temporal and effectiveness principles described above, more recent thinking is converging toward a more dynamic and contextual view of the applicability of these principles [130, 131]. In addition to illustrating the flexibility inherent in multisensory processes, there are strong suggestions as to the mechanistic underpinnings of such adaptive networks and integrative processes, including oscillatory phase resetting and divisive normalization [131]. As a more concrete example, in the context of a task in which temporal factors are relatively unimportant (e.g., stimulus or target localization), it is expected that there would be less (if any) weighting placed on the temporal structure of the stimulus complex. Thus, current thinking invokes a flexibly specified set of interactive rules or principles tightly related to task performance that ultimately dictate the final product of a multisensory stimulus combination.

In the current review, we have chosen to focus on temporal factors, in large measure because of the recent accumulation of evidence that has outlined how multisensory temporal function changes during typical development, and because of the growing acknowledgment that multisensory temporal acuity is altered in a number of neurodevelopmental disabilities – three of which, autism, dyslexia and schizophrenia, are highlighted in this review. Although this review is framed from the perspective of temporal function for these reasons, we must point out that, as alluded to above, both spatial and spatiotemporal factors are powerful players in the construction of our multisensory perceptual gestalt. Indeed, much work has focused on describing how these spatial and spatiotemporal factors influence multisensory interactions at the neural, behavioral and perceptual levels [20, 58, 89, 90, 97, 116, 117, 128, 132-138], and any accounting of multisensory function is necessarily incomplete without acknowledgement of the important role these factors play as “filters” for multisensory systems.

The temporal principle expanded: the multisensory temporal binding window

The concept of temporal factors, originally defined on the basis of the temporal tuning functions of individual multisensory neurons (Figure 1A) [87], has been expanded to capture the effects of temporal factors on human psychophysical performance. Although the temporal properties of human performance very much resemble their neuronal counterparts (Figure 1B) [100], when placed in the context of a judgment about the unity of an audiovisual stimulus complex (i.e., did they occur at the same time or not), they point to a thresholding process in which the observer must make a probabilistic judgment about the nature of the stimulus complex. More concretely, in the example shown in Figure 1B, the subject is making judgments about the simultaneity of a visual-auditory stimulus pair that is presented at varying stimulus onset asynchronies (SOAs) or delays. Note that when the stimuli are objectively simultaneous (i.e., an SOA of 0 ms), the subject has a high probability of correctly reporting this simultaneity. However, even with delays of a hundred milliseconds or more, the subject still reports on a high percentage of trials that the stimuli are simultaneous. Such a broad interval within which simultaneity continues to be reported suggests a degree of temporal tolerance for stimulus asynchrony, in essence creating a “window” of time within which multisensory stimuli are highly likely to be perceptually bound or integrated [99, 105, 115, 139-144].
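
In practice, the width of this window is typically estimated by fitting a psychometric function to the simultaneity reports. The Python sketch below is a minimal illustration of one common convention: fit a Gaussian to the proportion of “simultaneous” responses across SOAs and define the TBW as the span over which the fit exceeds a criterion (here 75% of its peak). The response proportions are invented, and published studies differ in the exact function (e.g., paired sigmoids) and criterion used.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical proportions of "simultaneous" reports at each SOA (ms);
# negative SOAs = auditory leading, positive = visual leading.
soas = np.array([-400, -300, -200, -100, 0, 100, 200, 300, 400], float)
p_sim = np.array([0.05, 0.15, 0.45, 0.80, 0.95, 0.90, 0.70, 0.35, 0.10])

def gaussian(soa, amp, mu, sigma):
    """Amplitude, point of subjective simultaneity (mu), and width (sigma)."""
    return amp * np.exp(-(soa - mu) ** 2 / (2 * sigma ** 2))

(amp, mu, sigma), _ = curve_fit(gaussian, soas, p_sim, p0=[1.0, 0.0, 150.0])

# One common convention: the TBW is the span over which the fitted curve
# exceeds a criterion, here 75% of its peak.
criterion = 0.75
half_width = sigma * np.sqrt(-2 * np.log(criterion))
print(f"PSS ~ {mu:.0f} ms; TBW ~ {2 * half_width:.0f} ms wide at {criterion:.0%} of peak")
```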

Figure 1. Representative multisensory temporal “filters” for neurons and perception. Panel on the left (A) shows the temporal tuning function for a neuron in the cat superior colliculus in response to paired audiovisual stimuli. Plotted is the gain in neuronal response (i.e., multisensory interactive gain) as a function of the stimulus onset asynchrony between the visual and auditory stimuli. Negative values represent conditions in which the auditory stimulus precedes the visual stimulus. Panel on the right (B) shows the responses of a representative human subject for a simultaneity judgment task. Plotted is the percentage of reports of simultaneity as a function of stimulus onset asynchrony. Note the similarities in the neuronal and psychophysical distributions.

This construct of a multisensory temporal binding window (TBW) is highly adaptive, in that it allows multisensory information to be bound even when it originates at differing distances from the subject. The biological utility of this is grounded in the substantial differences in the propagation times for visual and auditory energy. Consider a visual-auditory event happening 1 meter from you vs. 34 meters away. In the first case, the arrival of the visual and auditory energies to the eye and ear is nearly simultaneous, whereas in the second circumstance the auditory information arrives at the ear approximately 100 ms after the visual information impinges on the eye (sound travels at about 340 m/s). Additional evidence for the importance of these biological delays can be seen through measures of the point of subjective simultaneity (PSS), the exact temporal offset (measured at the sensory organ) at which an individual is most likely to perceive two inputs as synchronous. One might initially expect the PSS to lie at 0 ms. However, the PSS in most individuals is typically observed when the auditory component of a stimulus pair slightly lags the visual stimulus component [for review, see 141].
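
The distances quoted above follow directly from the relative propagation speeds of sound and light. The toy calculation below simply restates that arithmetic (speeds are rounded, and transduction and neural conduction latencies, which also differ across the senses, are ignored).

```python
SPEED_OF_SOUND = 340.0   # m/s, the approximate value used in the text
SPEED_OF_LIGHT = 3.0e8   # m/s; effectively instantaneous at everyday distances

def auditory_lag_ms(distance_m):
    """Arrival time of sound relative to light at the observer, in milliseconds."""
    return (distance_m / SPEED_OF_SOUND - distance_m / SPEED_OF_LIGHT) * 1000.0

for d in (1, 10, 34):
    print(f"{d:>3} m: sound arrives ~{auditory_lag_ms(d):.0f} ms after light")
# ~3 ms at 1 m (essentially simultaneous) versus ~100 ms at 34 m,
# matching the example in the text.
```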

In recent years a number of salient characteristics of this TBW have been discovered (Figure 2). First, the window differs in size for different stimuli, being smallest for simple audiovisual stimulus pairs such as flashes and beeps, intermediate in size for more complex environmental stimuli such as a hammer hitting a nail, and largest for the most complex of naturalistic multisensory stimuli – speech [140, 141, 145-147]. Second, the TBW exhibits a marked degree of variability from subject to subject [148]. Third, the TBW continues to mature late into development, remaining broader than in adults well into adolescence [101, 120]. Finally, the TBW has been shown to be malleable in multiple ways, both adjusting to the temporal statistics of the environment (recalibration [122, 149-153]) and narrowing substantially with feedback training in perceptual plasticity studies [37, 103, 154, 155].

Figure 2. Methods of characterizing multisensory temporal function in human subjects. A. Individual subjects’ points of subjective simultaneity (PSS) determined using three different temporal tasks: a two-alternative forced-choice simultaneity judgment (SJ2; “same time or different time?”), a three-alternative forced-choice simultaneity judgment (SJ3; “audio first, same time, or visual first?”), and a temporal order judgment (TOJ; “which came first?”). Note that with all three tasks, individuals’ PSS values fell in the visual-leading range. Adapted from van Eijk et al., 2008. B. The size of the multisensory temporal binding window (TBW) is highly variable between subjects, but within each subject the sizes of the left (auditory-leading) and right (visual-leading) windows are strongly correlated. Adapted from Stevenson, Zemtsov and Wallace, 2012. C. The width of the TBW is strongly dependent upon the type of stimuli presented, with narrower windows (higher temporal acuity) being seen for simple stimuli (i.e., flashes and beeps), and the widest windows being observed for speech stimuli. Adapted from Stevenson and Wallace, 2013. D. Perceptual learning paradigms have been shown to reliably increase individuals’ multisensory temporal acuity, as indexed by a narrowing of the TBW. Adapted from Schlesinger et al., in press.

Collectively, these studies point to the multisensory TBW as an important component of our perceptual view of the world, structured to make strong statistical inferences about the likelihood that multisensory stimuli originate from the same object or event. As highlighted below, individual differences in this window and alterations in its size are likely to have important implications for the construction of our perceptual (and cognitive) representations.

The neural correlates of multisensory temporal function

As alluded to earlier, much of the foundational work regarding the neural correlates of multisensory function, including its temporal constraints, has come from a midbrain structure, the superior colliculus (SC). The primary role of the SC is in the initiation and control of gaze (i.e., combined eye and head) movements to a stimulus of interest. Following the principles described earlier, these movements are facilitated (i.e., are generally faster and more accurate) to multisensory stimuli that are spatially and temporally proximate [109, 113, 116, 117, 156-160]. However, for perceptual judgments such as the evaluations of simultaneity described earlier, it is unlikely that the SC plays a major role. Rather, these perceptual (as opposed to sensorimotor) processes appear to be the province of cortical regions likely to play a central role in stimulus “binding.” One of the central cortical hubs for the processing of audiovisual timing relations appears to be the cortex surrounding the posterior superior temporal sulcus (pSTS). The pSTS is well positioned for this role in that it lies at the junction between occipital (visual) and temporal (auditory) cortex, and it receives substantial convergent input from visual and auditory cortical domains. Moreover, the pSTS is differentially active during the presentation of synchronous versus asynchronous audiovisual stimulus pairs, suggesting an important role in evaluations of audiovisual timing [20, 24, 36, 37, 161]. The pSTS has also been shown to signal the perceptual binding of an audiovisual stimulus pairing, responding more efficiently to a pairing with identical temporal relations when it is perceived as a single event than when it is perceived as two distinct events [36]. An additional piece of evidence in support of a central role for the pSTS in multisensory temporal function is that following perceptual training that narrows the TBW, activity changes as indexed by fMRI are seen in a cortical network centered on the pSTS [37]. Finally, numerous studies have shown the pSTS to be an important site for the processing of audiovisual speech cues [7, 15, 18, 19, 26, 162, 163], including work showing that deactivation of the pSTS via transcranial magnetic stimulation (TMS) can abolish the McGurk illusion – in which the pairing of discordant visual and auditory speech tokens typically results in a novel fused percept [162]. Collectively, these studies point to the pSTS as a key node for multisensory convergence and integration, and for the evaluation of temporal factors in the perceptual determination of stimulus binding.

The development of multisensory function

Somewhat surprisingly, although we know a great deal about the characteristics, function and behavioral/perceptual correlates of multisensory integration in the adult, these processes have been far less well characterized during development. Animal model studies have shown that multisensory neurons and their associated integrative properties mature over a protracted period of developmental life that extends well into “adolescence” [61, 164-167]. In addition, these studies have shown remarkable plasticity in the development of these processes, such that changes in the statistical structure (i.e., spatial and temporal stimulus relations) of the early sensory world result in the development of integrative properties that match these statistics [53, 168-170].

Much of the work that has examined multisensory function in human development has focused on the period soon after birth. These studies have shown an orderly sequential development in the infant's ability to evaluate (and likely bind) multisensory relations, with the capacity to evaluate simple features of a multisensory stimulus complex (e.g., duration) maturing prior to the ability to evaluate more complex features (e.g., rhythm) [171-173]. Examples most germane to the temporal dimension, the focus of the current review, include the findings that infants begin life with much larger temporal binding windows for both audiovisual non-speech and speech stimuli (with those for speech being the largest [174-176]). In addition, it has been found that the window for speech stimuli does not begin to narrow until around 5 years of age [177]. More recent work from our group has shown that these developmental processes continue to mature well into older ages. Thus, we have shown that the multisensory TBW remains larger than that of adults well into adolescence (Figure 4) [101, 178]. Intriguingly, this enlarged window appears to depend on the nature of the stimuli that are being combined. Thus, whereas the window appears larger for the pairing of simple low-level visual and auditory stimuli (i.e., flashes and beeps), it is of normal size in these children and adolescents for more complex speech-related stimuli.

Figure 4. The size of the multisensory temporal binding window (TBW) is smaller in adults than in children and adolescents. Bar graph displays mean window size for children (ages 6-11, left), adolescents (ages 12-17, middle) and adults (ages 18-23, right) (n = 15 participants/group). * = p < .05. Error bars indicate ±1 standard error of the mean (SEM). Adapted from Hillock-Dunn and Wallace, 2012.

Although far from providing a comprehensive characterization of how multisensory processes develop in the period leading up to adulthood, these studies have illustrated the long developmental interval over which these processes mature, and the marked plasticity that characterizes the maturation of multisensory function. With this as a backdrop, it should come as little surprise to see that multisensory abilities are frequently altered in the context of developmental disabilities.

Multisensory integration in developmental disabilities

As we have seen, the ability of individuals to perceptually bind sensory information allows for significant behavioral benefits and serves to create a coherent and unified perception of the external world. If these processes develop in an atypical manner, then it should come as little surprise that detrimental behavioral, perceptual and cognitive consequences result. Here, we will discuss such atypical multisensory function in the context of three developmental disabilities: autism spectrum disorders, dyslexia, and schizophrenia. In each case, we will highlight the current behavioral and perceptual evidence for atypical multisensory temporal processing, describe the evidence for the possible neural correlates of these dysfunctions, and outline areas in which further work is needed.

Autism and emerging evidence for sensory dysfunction

Autism spectrum disorders (ASD) make up a constellation of neurodevelopmental disabilities characterized by deficits in social communicative skills and by the presence of restricted interests and/or repetitive behaviors. The most recent evidence suggests that the prevalence of ASD may be as high as 1 child in 88 [179], making it a substantial public health problem with large societal and economic costs. Although initially characterized and diagnosed on the basis of deficits in a “triad” of domains – language and communication, social reciprocity and restricted/repetitive interests – the presence of sensory deficits is now widely acknowledged, warranting their inclusion in the recent revision of the DSM [180].

The challenges in describing and defining sensory dysfunction in the context of ASD have arisen in part because of the enormous heterogeneity in these changes – ranging from striking hyporesponsivity and underreactivity to sensory stimuli to hyperresponsivity, and from sensory aversions to sensory-seeking behaviors [180]. Despite this phenotypic variability, the fact that upwards of 90% of children with autism have some form of sensory alteration suggests that it is a core component of autism.

One of the great challenges with assessing the nature of these sensory changes has been that the overwhelming majority of the data has come from anecdotal evidence, caregiver reports, or self-report survey measures, limiting the ability to have a comprehensive and empirically grounded picture of the nature of these changes. This is currently changing as a number of studies are beginning to provide a more objective and systematic view into sensory function in autism. This work has served to bolster the more subjective reports, reinforcing the presence of processing deficits in a number of sensory modalities, including vision [181-194], audition [184, 195-206], and touch [207-209].

However, and seemingly at odds with this evidence, a number of these studies have also revealed the presence of normal or even enhanced sensory function in certain children and in certain domains [210-230]. Although initially enigmatic, these normal or improved abilities appear to be restricted to tasks that tap into low-level sensory function or require extensive local (as opposed to global) processing, suggesting that early sensory processing and the neural architecture that subserves it may be preserved (or even enhanced) in the autistic brain. This finding fits within the hypothetical framework that in autism local cortical organization and connectivity are preserved, but processes that rely upon communication across brain networks are impaired (see the model section below for more detail). As an elegant example of this, Bertone and colleagues found that in a visual grating orientation task in which the gratings were specified by luminance, children with autism outperformed typically developing children [223]. In contrast, when the gratings were specified by changes in texture rather than luminance, the children with autism performed more poorly. Whereas the neural mechanisms for determining orientation from luminance are believed to reside in primary visual cortex (V1), the computations for deriving orientation from texture are believed to take place at later processing stages within the visual hierarchy. This example highlights evidence in support of but one of the many neurobiologically inspired models for describing autism and the associated changes in sensory function.

Neurobiological models of autism

A multitude of brain-based theories of autism have been put forth, each with varying degrees of supporting evidence. Several of the more prominent of these, described briefly in this section, have been used to explain differences in sensory function in ASD (along with the more widely established changes in social communicative function).

The concept of weak central coherence is closely related to the observations described above, in that it suggests that communication across brain networks is preferentially impaired in autism [231-233]. In its simplest form, the concept suggests strong deficits in holistic or “Gestalt” processing, in which individuals with autism have striking difficulties in the processing of global features, but in which the processing of local features is relatively intact or even enhanced. One of the hallmark tests used to differentiate local vs. global processing is the so-called “embedded figures” test, in which participants are asked to report on the number of simple shapes (e.g., triangles) contained within a larger image (e.g., the drawing of a clock). Numerous studies have shown that individuals with ASD outperform typically developing individuals on this task, but disagree on the nature of the global deficits it reveals [234-237]. In many respects, weak central coherence can be subsumed within ideas of autism as a functional disconnection syndrome or a connectopathy, in which the core deficits are founded in weaknesses in connectivity across brain networks, weaknesses that have been seen in both structural and functional connectivity studies [238-241]. Although framed at a different level, these changes in network function can also be seen as a result of changes in the excitatory/inhibitory balance, another prevailing model of the pathophysiology of autism [242]. In this model, the core deficit in autism is a disruption of the carefully balanced ratio of excitation and inhibition within and across brain networks, a disruption that can have dramatic effects on network communication and the associated functional correlates. Another emerging model of autism suggests an important role for increases in noise, or a degraded signal-to-noise ratio, in the etiology of the disorder [243-246]. The presence of increased noise (which could come from a number of sources) would degrade the quality of information processing, with increasing effects as one ascends the information processing hierarchy and thus taps greater and greater integrative abilities (since the noise would be cumulative). Finally, the temporal binding deficit hypothesis posits timing-related deficits as a core feature of autism [247]. Indeed, temporal integration is a core feature of processing within all sensory systems, and disruptions in timing-related circuits could give rise to supramodal or multisensory processing deficits. Although these theories have been espoused by different groups at different times, there are striking similarities among them that suggest marked commonalities and shared mechanistic relations. As just one example, the temporal deficit described above could be the result of alterations in connectivity, in excitatory/inhibitory balance, and/or in noisy sensory and perceptual encoding.

Multisensory contributions to autism

The prevalence of observations highlighting deficits in multiple sensory systems, coupled with evidence that integrative functions across brain networks may be preferentially impacted, has led to an examination of the role that multisensory dysfunction may play in autism [248]. Although as highlighted above there is now clear evidence for changes in function within the individual sensory systems, this work is predicated on the view that these unisensory deficits may not completely capture the nature of the changes in processes that index integration across the different sensory systems. In recent years, a number of labs, including our own, have attempted to provide a better view into the nature of these changes in multisensory function in those with autism.

To date, the picture that has been generated by these studies has been a complex and confusing one. Thus, although a number of studies have reported changes in multisensory function that extend beyond those predicted on the basis of changes in unisensory function, others have found either normal multisensory abilities, or deficits that can be completely explained based on unisensory performance. One of the best illustrations of this complexity is in work that has explored the susceptibility of individuals with autism to the McGurk effect – the perceptual illusion that indexes the synthesis of visual and auditory speech signals [249]. Whereas some groups have found weaknesses in this perceptual fusion [250-256], others have found normal McGurk percepts [257, 258] or changes in McGurk reports that can be accounted for by changes in responsiveness to the visual or auditory speech tokens [259]. The likely explanations for the substantial disparities across studies include differences in the composition of the ASD cohorts (with age and severity of symptoms being significant factors) and differences in how the specific tasks are structured. Thus, even for the McGurk effect, different stimuli and response modes have been used to assay the illusion.

Changes in multisensory temporal function in autism

Despite this confusion, one of the more robust findings in autism is poorer multisensory temporal acuity – a finding that typically manifests as a broadening of the multisensory TBW [146, 147, 196, 260, 261]. In addition to their concordance with the general finding of sensory changes in children with autism, these results are also in agreement with a substantial body of evidence pointing to deficits in timing or temporally-based processes in autism. Indeed, these deficits have been encapsulated within one of the neurobiologically-inspired theories of autism described earlier – namely the temporal binding deficit hypothesis [247].

Changes in multisensory temporal function in autism have been found using a number of different tasks, including simultaneity judgments [147], temporal order judgments [146, 196], the perception of the sound-induced flash illusion [260], and preferential looking tasks [261] (Figure 5). In each of these studies, the basic finding is that individuals with ASD perceive paired visual-auditory stimuli as originating from the same event over longer time intervals than control groups (i.e., they report simultaneity even when the stimuli are substantially asynchronous). One interesting, and to date unresolved, difference between these studies is whether the TBW is extended for all types of visual-auditory stimuli, or only for specific stimulus types more closely related to the well-established domains of weakness (e.g., speech). Thus, whereas much work supports differences only for speech-related stimuli [147, 261], other studies suggest more generalized temporal deficits that extend to pairs of very simple stimuli (i.e., flashes and beeps) [146]. Although future work will need to resolve these differences, it is important to point out here that although there are likely to be commonalities in the brain networks supporting multisensory (or at least audiovisual) temporal function, there are also likely to be separate mechanisms governing the integration of low- vs. higher-level audiovisual stimuli. For example, whereas the integration of lower-level flashes and beeps (which can be considered to represent an “arbitrary” pairing) is unlikely to involve brain regions concerned with contextual or semantic congruence (another important facet of multisensory binding), the integration of higher-level stimuli such as object or speech cues will also entail activation in network components performing such contextual computations.

Figure 5. Differences in the multisensory temporal binding window are a characteristic feature of autism that relate to other domains of deficit. A. Temporal binding windows (TBW) measured in individuals with and without ASD show differences across stimulus type. A main effect of complexity was seen in both groups, with more complex stimuli associated with wider TBWs. An interaction effect showed that this effect of complexity was greater in individuals with ASD, and most notably, that a difference specific to the processing of audiovisual speech stimuli was observed between the ASD and TD groups. Adapted from Stevenson et al., J Neurosci, 2014. B. In individuals with ASD, the ability to perceive the McGurk effect is negatively correlated with the width of the TBW. That is, as individuals’ multisensory temporal acuity decreased (wider TBWs), so too did their ability to perceptually bind audiovisual speech in order to perceive the McGurk effect. This relationship was seen when the TBW was measured using simple (i.e., flash-beep), complex non-speech (i.e., tools) and speech stimuli, suggesting that this relationship is based, at least in part, on low-level multisensory temporal processing. Adapted from Stevenson et al., J Neurosci, 2014. C. Individuals who showed more atypical auditory processing, as measured via the auditory processing score of the Sensory Processing Caregiver Questionnaire (SP), showed lower rates of McGurk perceptions (r = 0.51, p < 0.05). Similarly, individuals who showed atypical attention, as measured via the inattention score of the SP, showed weaker McGurk perceptions (r = 0.61, p = 0.01). Adapted from Woynaroski et al., J Autism Dev Disord, 2014. D. Individuals who showed greater difficulties with communication, as measured by the Autism Diagnostic Observation Schedule's (ADOS) communication domain score, were less likely to accurately perceive congruent audiovisual speech (r = −0.58, p < 0.05). A similar trend was seen with the ADOS reciprocal social interaction domain score, but failed to reach significance (r = −0.29, p > 0.05), likely a result of the relatively small sample size. Adapted from Woynaroski et al., J Autism Dev Disord, 2014.

Why would an enlarged TBW in autism necessarily be a bad thing? One reason is that the temporal fidelity or tuning of multisensory systems determines which stimuli should be bound and which should not. The binding of stimuli over longer temporal intervals is likely to result in the creation of poor or “fuzzy” multisensory perceptual representations in which there is a great deal of ambiguity about stimulus identity. Typically, the temporal relationship between two sensory inputs is an important cue as to whether those inputs should be bound. When the perception of this temporal relationship is less acute, subjective temporal synchrony loses its reliability as a cue to bind. The end result of losing such a salient cue and important piece of information is that the individual shows weaker binding overall. In support of this idea is recent work from our laboratory that has illustrated a strong relationship between the multisensory TBW and the strength of perceptual binding [147, 148]. In this study, the width of the TBW was found to be strongly negatively correlated with perceptual fusions as indexed by the McGurk effect (Figure 5). This finding lends strong support to the linkage between multisensory temporal function and the creation of perceptual representations, an area of inquiry that we believe will be extremely informative moving forward.

Multisensory temporal function and the creation of veridical perceptual and cognitive representations

In our view, the importance of the work cited above extends well beyond the links that have currently been established. Sensory, and by extension multisensory, processes form the building blocks upon which perceptual and cognitive representations are created. These input streams are crucial not only for the “maps” that form the cornerstone of early subcortical and cortical sensory representations, but also for so-called “higher-order” processes that are dependent on the integrity of the information within the incoming sensory streams. Such a framework predicts that changes in sensory and multisensory processes will have cascading effects upon the information processing hierarchy, ultimately impacting cognitive domains such as attention, executive function, language and communication, and social interactions [147, 262-266].

Focusing on the social and communicative pieces because of their relationship to autism, it must be acknowledged that both are not only highly dependent upon sensory information, but also are dependent upon the integration of information across the different sensory modalities [266]. Language and communicative function are highly multisensory, depending not only upon the auditory channel but also upon the associated visual cues such as articulatory gestures that provide vital information for the comprehension of the speech signal (particularly in noisy environments – see [5-13]). In a similar fashion, the interpretation of social cues is keenly dependent upon multisensory processes. Inflections of the voice, facial gestures, and touch convey important social information that must be properly integrated in order to fully understand the content of the social setting.

Although intuitively appealing, much work needs to be done in order to establish these critical links between sensory and multisensory function and higher-order abilities. Indeed, ongoing work in our laboratory is using large-scale correlational matrices in order to identify important associations between a battery of sensory and multisensory tasks that we now routinely use, and a host of measures of cognitive abilities. In an associated manner, we have recently examined multisensory speech perception in a cohort of ASD and typically-developing (TD) children between the ages of 8 and 17 [147]. Consistent with our prior work, ASD children showed an increased width to their multisensory TBW, as well as differences in the degree to which they fused concordant audiovisual speech stimuli. Children with ASD also exhibited a strong relationship between the strength of their perceptual binding on concordant audiovisual trials and the communication subscore of the ADOS, with lower (i.e., more typical) scores being associated with greater binding [267] (Figure 5). Thus, the temporal acuity of individuals’ multisensory binding is directly correlated with their abilities to integrate audiovisual speech, and the correlation between multisensory temporal processing and ADOS communication scores suggests that this relationship may extend into clinical manifestations of ASD. Although this work suggests important links between some of the key diagnostic features of autism and multisensory function, much more needs to be done in order to fully elucidate the nature of these relationships.
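
As a schematic of the kind of correlational analysis described above, the sketch below assembles hypothetical per-participant measures (TBW width, McGurk fusion rate, an ADOS communication subscore) and computes their Pearson correlation matrix. The variable names and data are placeholders for illustration, not the actual battery or dataset used in the studies cited.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 30  # hypothetical participants

# Placeholder data: wider TBWs loosely coupled to weaker McGurk fusion and
# higher (less typical) communication scores, plus noise.
tbw_width = rng.normal(350, 80, n)                              # ms
mcgurk_rate = 0.9 - 0.001 * tbw_width + rng.normal(0, 0.05, n)  # proportion of fused percepts
ados_comm = 2 + 0.01 * tbw_width + rng.normal(0, 1.0, n)        # communication subscore

df = pd.DataFrame({
    "tbw_width_ms": tbw_width,
    "mcgurk_fusion_rate": mcgurk_rate,
    "ados_communication": ados_comm,
})
print(df.corr(method="pearson").round(2))  # pairwise Pearson correlations
```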

In addition to the recent data linking multisensory temporal acuity, speech integration and ADOS communication scores in ASD populations, ongoing research has begun to examine these relationships to autistic-like traits in the general population (referred to as the broader or extended phenotype). Autistic traits are found to varying degrees in the population at large, and can be indexed through scales such as the Autism-spectrum Quotient [ASQ; 268] or the Broad Autism Phenotype Questionnaire [269]. These traits can then be correlated with any number of perceptual measures. For example, a recent study by Donohue and colleagues [270] showed that the point of subjective simultaneity (PSS) varies with the (non-clinical) level of autistic traits an individual exhibits. The PSS, the point in time at which an individual perceives a visual and auditory event to be absolutely synchronous, tends to be observed when the auditory stimulus component slightly lags the visual component, reflecting the statistics of the natural environment (i.e., auditory information travels more slowly than visual information). Individuals showing greater levels of autistic traits, however, tend to have PSS measurements closer to absolute synchrony, reflecting a decrease in adaptation to the statistics of the external environment.

The neurobiological substrate for an extended multisensory TBW in ASD

As described earlier, the cortex of the posterior superior temporal sulcus (pSTS) has been implicated as a major node in the computation of multisensory temporal relations. Hence, with the wealth of evidence suggestive of alterations in multisensory temporal function with autism, a logical biological basis for these differences would be changes in the structure and/or function of pSTS. Indeed, some of the most characteristic structural alterations in the brains of those with autism are differences in gray and white matter associated with the pSTS [271-275]. Furthermore, a number of functional studies (i.e., fMRI) have pointed to differences in the activation patterns within pSTS in autism, as has work looking at both functional and structural connectivity of the pSTS [276-281]. Finally, our lab has shown that training that is focused on improving multisensory temporal acuity results in changes in activation and connectivity in a network centered on the pSTS [37]. Collectively, these studies suggest that changes in pSTS in individuals with autism may represent the neural bases for altered multisensory temporal function, and may be a key node in networks responsible for the changes in social and communicative function.

Multisensory temporal contributions to developmental dyslexia

Although autism represents the clinical condition in which multisensory function has been best characterized, evidence suggests that multisensory deficits, and specifically those in the temporal domain, are not unique to autism. Both sensory and multisensory changes have been found to accompany dyslexia, a reading disability in which affected individuals have profound reading difficulties against a background of normal or even above-normal intelligence. As with autism, numerous neurobiological theories have been espoused for dyslexia, with most being centered on the substantial differences in phonological processing seen in these individuals. Although many of these theories are centered on changes in brain structures responsible for the processing of phonology and phonological relations (e.g., see [282-286]), others have suggested that these phonological deficits may be a result of processing difficulties at earlier stages. One of the most well-developed of these views centers on the magnocellular layers of the visual thalamus (i.e., the lateral geniculate nucleus) [287, 288]. In this view, selective deficits in the magnocellular visual stream, which subserves the processing of motion, play a key role in dyslexia. Supporting evidence for this theory comes from reports of abnormal eye movements in dyslexia, and from altered activation patterns in areas of the cerebral cortex specialized for processing stimulus motion [289].

The evidence for changes in both visual and auditory function in dyslexia suggests that it may be fruitful to consider the disorder in a more pansensory or multisensory framework. Indeed, some of the original clinical descriptions of dyslexia from the neurologist Samuel Orton are rife with multisensory references [290], and to date the most widely adopted intervention approach, the Orton-Gillingham method, is founded on multisensory principles [291]. In addition, several early studies of reading-disabled and reading-delayed individuals found changes in cross-modal (visual-auditory) temporal function, consistent with a multisensory contribution to reading dysfunction [292, 293]. In order to attribute a specific multisensory contribution to the disorder, however, it is first necessary to show that the nature of the multisensory changes cannot be ascribed simply to changes in unisensory function. Stated a bit differently, it would not be terribly surprising (or interesting) to see multisensory changes accompanying changes in visual (and/or auditory) function. Of interest is whether these changes go beyond those that can be predicted based on unisensory differences.

In an effort to examine specific multisensory alterations in dyslexia, we adopted a multisensory version of the familiar and frequently employed visual temporal order judgment (TOJ) task. Prior work in typical subjects had found that the introduction of a pair of task-irrelevant sounds during performance of the visual TOJ task could improve performance, most notably when the second sound lagged the appearance of the second light [294]. Taking advantage of this task, we were able to show striking differences between dyslexic and typical readers – specifically in the time window within which the second auditory stimulus could enhance visual performance (Figure 6) [142]. Dyslexic readers received benefits from this sound over intervals more than twice as long as typical readers, suggesting that they are “binding” visual and auditory stimuli over unusually long periods of time. We speculate that such an extended TBW will present substantial difficulties for the construction of strong reading representations, in that it will present greater ambiguity as to which auditory elements of the written word (i.e., phonemes) belong with which visual elements (i.e., graphemes). In support of this, EEG studies have shown that as readers progress to fluency, letters and speech-sounds are combined early and automatically in the auditory association cortex, and that this processing is strongly dependent on the relative timing of the paired stimuli [295, 296]. Furthermore, it was found that for dyslexic readers, this progression to automaticity failed to take place [297].
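
One simple way to express the group difference described here is to compute, for each group, the range of auditory delays over which the accessory sound reliably improves visual TOJ accuracy. The sketch below does this with invented accuracy values and an arbitrary improvement criterion; it is meant only to make the logic concrete, and the published analysis [142] is more involved.

```python
# Hypothetical proportion correct on the visual TOJ task as a function of the
# delay of the second sound; values are illustrative only.
delays_ms = [0, 50, 100, 150, 200, 250, 350, 500]
baseline = 0.70  # assumed accuracy with no accessory sounds

typical = [0.70, 0.71, 0.78, 0.80, 0.79, 0.76, 0.70, 0.69]
dyslexic = [0.72, 0.74, 0.78, 0.80, 0.80, 0.79, 0.78, 0.76]

def facilitation_window(delays, accuracy, baseline, criterion=0.02):
    """Return the delays at which accuracy exceeds baseline by the criterion."""
    return [d for d, a in zip(delays, accuracy) if a - baseline >= criterion]

for label, acc in (("typical readers", typical), ("dyslexic readers", dyslexic)):
    window = facilitation_window(delays_ms, acc, baseline)
    print(f"{label}: facilitation from {window[0]} to {window[-1]} ms")
# The invented dyslexic profile shows benefits over a much longer range of delays,
# mirroring the enlarged binding window reported in the study.
```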

Figure 6. Alterations in multisensory temporal function in developmental dyslexia. A. Typical readers show a pattern of benefits on a visual temporal order judgment (TOJ) task in which an accessory auditory stimulus can facilitate task performance, but only when presented with a specific temporal structure relative to the visual stimuli. Specifically, accuracy improvements are seen only when the second auditory stimulus is delayed by 100-250 ms. B. In contrast, for dyslexic readers both the magnitude and the temporal pattern of benefits differ substantially. Most importantly, performance improvements are now seen at all tested delays, suggesting an enlargement of the audiovisual temporal binding window. Adapted from Hairston et al., 2005.

Additional evidence that sits outside of the domain of temporal function has been gathered in support of multisensory alterations in dyslexia. For example, deficits in spatial attention to both visual and auditory stimuli have been linked to phonological skills in dyslexia [298]. In addition, Blau and colleagues have shown using fMRI that dyslexic readers underactivate regions of the superior temporal cortex when binding the auditory and visual components of a speech signal [299]. As highlighted earlier, the cortex surrounding the pSTS is a critical node for the convergence of auditory and visual information, and appears to play a key role in the temporal binding of these signals. Indeed, the pSTS and its associated gyrus (the superior temporal gyrus) have been implicated as key regions of difference between typical and dyslexic readers (e.g., see [300-304]). Overall, these studies point to an important role for multisensory function in dyslexia, but much more work needs to be done to better understand how these changes ultimately result in poor reading performance [305].

Evidence for multisensory abnormalities in schizophrenia

Schizophrenia is a complex psychiatric disorder best characterized by changes in thought and emotional reactivity. Frequently accompanying schizophrenia are delusions and hallucinations, with the latter resulting in research into the nature of sensory (i.e., auditory) processing differences and their contributions to the cognitive changes seen in schizophrenia [306-312]. Although these studies have indeed highlighted changes in auditory and visual processes and cortical organization in schizophrenia, no clear picture as to how sensory dysfunction contributes to the overall schizophrenia phenotype has emerged. Nonetheless, as for autism and dyslexia, the presence of these sensory changes across multiple modalities begs for an examination of multisensory function.

Clinical reports have long suggested changes in multisensory function in schizophrenia, most notably seen in the ability to match stimuli across the different sensory modalities (i.e., cross-modal matching, see [313]). More empirically directed work subsequently found there to be deficits in the integration of audiovisual stimuli in a schizophrenia cohort, and that this deficit appeared to be restricted to speech-related audiovisual stimuli and was amplified under noisy conditions [314-318]. A subset of these studies also revealed differences in multisensory performance specifically when the tasks indexed the emotional valence of the auditory (voice) and visual (face) stimuli. Other work has suggested the presence of deficits in lower-level multisensory integration, specifically in demonstrating reduced facilitation of reaction times on a visual-auditory target detection task [319]. In a recent EEG study comparing those with schizophrenia and controls, it was found that the neural signatures associated with typical audiovisual integration were absent or compromised in the patients with schizophrenia [320].

One crucial issue as it relates to the establishment of specific multisensory deficits in schizophrenia (or in any other clinical condition) is to show that changes in performance and/or perception are either unique to the multisensory conditions, or cannot be predicted based on the changes seen in unisensory function. For this reason, it is essential that measures of multisensory function are contrasted against unisensory measures. Although such unisensory-multisensory contrasts are becoming increasingly common, some of the earlier studies failed to test for unisensory changes, making it difficult to interpret the differences in multisensory performance.
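
One standard way of making this contrast in reaction-time paradigms, though not necessarily the analysis used in the studies cited here, is to test redundant-target data against the race-model inequality (Miller, 1982), which bounds how fast responses can be if the two unisensory channels are processed independently. The sketch below applies that test to invented reaction times; violations of the bound indicate facilitation beyond what unisensory processing alone can explain.

```python
import numpy as np

def ecdf(rts, t_grid):
    """Empirical cumulative distribution of reaction times evaluated on a time grid."""
    rts = np.sort(np.asarray(rts, float))
    return np.searchsorted(rts, t_grid, side="right") / rts.size

# Hypothetical reaction times (ms) for one participant; placeholders only.
rng = np.random.default_rng(1)
rt_a = rng.normal(330, 40, 200)    # auditory alone
rt_v = rng.normal(360, 45, 200)    # visual alone
rt_av = rng.normal(290, 35, 200)   # audiovisual

t_grid = np.arange(150, 600, 10)
race_bound = np.minimum(ecdf(rt_a, t_grid) + ecdf(rt_v, t_grid), 1.0)
violation = ecdf(rt_av, t_grid) - race_bound

# Positive values indicate multisensory facilitation beyond what a "race"
# between independent unisensory channels could produce.
print(f"maximum race-model violation: {violation.max():.3f}")
```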

Numerous prior studies have suggested that in addition to sensory-based problems, individuals with schizophrenia have alterations in temporal perception [312, 321-323]. Indeed, prior work has merged these areas of inquiry, and has shown changes in both unisensory and multisensory temporal perception in schizophrenia, which manifest as a lessened acuity in judging the simultaneity between visual, auditory and combined visual-auditory stimulus pairs [322, 324]. In an effort to follow up on this work with an emphasis on the TBW and on the specificity of these effects for multisensory integration, we have recently embarked on a study designed to detail the nature of these changes and their relationships to the constellation of clinical symptoms. Although preliminary, this work is suggestive of changes in multisensory temporal function that we believe may be important factors in the schizophrenia phenotype.

Training as a therapeutic tool to engage unisensory and multisensory plasticity

As alluded to in the prior section, ongoing work in our laboratory has focused on using approaches grounded in perceptual plasticity to train sensory and multisensory systems. In addition to its application for those wearing cochlear implants, we believe that such methods also hold promise for clinical conditions such as autism and schizophrenia, most notably in their possible utility for improving sensory and multisensory temporal acuity. As highlighted in an earlier section, we have successfully trained individuals to narrow the width of their TBW [103, 154], with these changes accompanied by changes in a brain network centered on the pSTS (Figure 7) [37]. Most encouraging in these normative studies was the finding that those who benefited the most from training (i.e., showed the largest changes in the size of their TBW) were those for whom the TBW was the largest prior to training [103, 154]. Hence, our findings of enlarged multisensory TBW in autism, dyslexia and schizophrenia suggest that these individuals may be highly susceptible to perceptual training methods.

Figure 7. Training on a multisensory temporal task results in activation changes in a network of areas centered on the pSTS. A. Red and yellow colors represent two ROIs in the pSTS that showed activation during visual and auditory conditions and that were altered following perceptual training. B-D. Mean percent signal change for all voxels in the posterior pSTS ROI (yellow box, B), the anterior pSTS ROI (red box, D) and for the two combined (orange box, C). Note that significant decreases in the BOLD response were found following training for stimulus onset asynchronies (SOAs) that represent the easiest conditions (i.e., SOA 0 and SOA 300). In contrast, little change was seen for the intermediate SOA that defined the borders of each individual's TBW (SOA RWS). E-G. Mean percent signal change as a function of accuracy for SOA 300 and SOA RWS trials for the posterior (E), anterior (G) and combined (F) ROIs. Adapted from Powers, Hevey and Wallace, 2012.

In preliminary work in autism, we have shown this to be the case, with several days of training resulting in a significant narrowing of the TBW. Although very exciting, this work needs to be extended to show that this training results in changes beyond the trained task and domain. We are encouraged by our results in our typical cohort, which have shown that training using low-level stimuli (i.e., flashes and beeps) on one task (i.e., simultaneity judgment) can result in changes in the TBW for the processing of higher-level (i.e., speech) stimuli in the context of a different task (i.e., perceptual fusions as indexed by the McGurk effect). The presence of such generalization is extraordinarily exciting, but must now be extended to see if these training regimens can engender meaningful change in measures of real world function – such as improvements in social skills and communication. Although still in their early stages, we feel that these perceptual plasticity-based approaches hold great promise as potential tools that can be incorporated into behaviorally-based remediation methods.

Concluding remarks

Sensory and multisensory dysfunction accompanies many developmental disabilities. Although widely acknowledged, the presence of these deficits is often overlooked from the perspective of how they can inform and contribute to the characteristics that are considered defining for the disorder. Using autism as an example, it is only with the recent update to the DSM-5 that sensory problems are considered a core feature of the disorder. Even with this important acknowledgment, little empirical evidence exists to better characterize the nature of these sensory disturbances, and perhaps more importantly, to relate these changes to higher cognitive abilities. This landscape is changing, and is beginning to reveal the importance of a more integrated and holistic view into these interactions and interrelationships.

The current review focuses on but one facet of these sensory changes – multisensory temporal processes – and on but a few of the clinical conditions in which a picture is beginning to emerge. The presented evidence illustrates that both unisensory (i.e., within modality) and multisensory (i.e., across modality) processes are frequently affected in autism, dyslexia and schizophrenia. There is surprising commonality in the way in which multisensory function is altered in these three disorders, with the principal finding being an enlargement in the width of the multisensory temporal binding window – that epoch of time within which stimuli from different modalities interact and influence one another's processing. How such an enlarged time window ultimately impacts the perceptual and cognitive features that define each of these conditions, particularly given the striking phenotypic differences between them, remains to be determined. Nonetheless, these results bring into focus the critical importance of sensory and multisensory function. They also highlight the need to employ a battery of tasks designed to index various aspects of (multi)sensory function, and to relate performance on these tasks to cognitive and perceptual abilities in order to establish sensory-perceptual-cognitive links. In conjunction with brain-based physiological measures such as EEG and MRI, and the associated connectivity and network analyses, these approaches should reveal key pathophysiological features of each of these clinical conditions.

Finally, in addition to providing a more integrated and detailed view of these behavioral, perceptual and neurobiological characteristics, the current work holds great promise from an interventional perspective. We predicate this concept on the view that (multi)sensory function forms the building blocks for higher-order representations. Thus, training methods that improve (multi)sensory function are also likely to have effects that cascade beyond the trained tasks and domains.

Highlights

  • The review focuses on altered multisensory function in developmental disabilities

  • Multisensory temporal acuity is altered in autism, dyslexia and schizophrenia

  • The construct of the multisensory temporal binding window is critical in perception

  • Perceptual training may have utility in improving multisensory function

Figure 3.

The cortex surrounding the posterior superior temporal sulcus (pSTS) is integrally involved in the integration of visual and auditory information. A. The pSTS, defined here as the conjunction of brain regions responding more to intact auditory stimuli than to their scrambled equivalents AND more to intact visual stimuli than to their scrambled equivalents. The pSTS is located directly between auditory-only and visual-only responsive regions, making it a logical site for audiovisual convergence and integration. Adapted from James et al., 2012. B. Subregions of pSTS appear to be engaged in different multisensory processes. The subregion of pSTS defined as responding more to synchronous as opposed to asynchronous stimuli does not respond differentially according to the individual's perceptual reports (pink and light blue bars). In contrast, the subregion of pSTS defined as a conjunction of auditory and visual responsive regions (see Figure 3A) responds differentially according to the individual's perceptual reports (pink and light blue bars), even when the stimuli are identical. These data suggest that this may be the region of perceptual “binding” for the auditory and visual stimuli. Adapted from Stevenson et al., 2011. C. When given perceptual feedback training to improve multisensory temporal processing, the neural analogs of this change are centered about the pSTS. The BOLD responses from these regions show more efficient processing of synchronous (and thus likely perceptually bound) stimuli. Adapted from Powers et al., 2012. D. When TMS is applied to pSTS during presentation of stimulus pairs that typically result in the McGurk illusion, individuals’ ability to perceptually bind the auditory and visual components is impaired, resulting in decreases in perception of the illusion. Adapted from Beauchamp et al., 2010.
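For readers unfamiliar with conjunction-defined ROIs, the sketch below illustrates the logic described in panel A: a voxel enters the pSTS ROI only when it shows an intact-versus-scrambled preference in both the auditory and the visual contrast. The statistical maps, volume dimensions and threshold are hypothetical placeholders, not values from the cited studies.

```python
# Minimal sketch of the conjunction definition described in Figure 3A: a voxel
# belongs to the ROI when it responds more to intact than to scrambled stimuli
# in BOTH modalities. Maps and threshold are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(1)
shape = (64, 64, 40)                              # hypothetical volume dimensions
t_aud = rng.normal(0.0, 1.5, size=shape)          # intact > scrambled, auditory contrast
t_vis = rng.normal(0.0, 1.5, size=shape)          # intact > scrambled, visual contrast
threshold = 3.0                                   # hypothetical t threshold

# A voxel survives only if it exceeds threshold in both contrasts.
conjunction_mask = (t_aud > threshold) & (t_vis > threshold)

print(f"Voxels in the conjunction-defined ROI: {int(conjunction_mask.sum())}")
```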


Bibliography

  • 1.Stein BE, Meredith MA. The Merging of the Senses. MIT Press; Cambridge, MA: 1993. [Google Scholar]
  • 2.King AJ, Calvert GA. Multisensory integration: perceptual grouping by eye and ear. Curr Biol. 2001;11(8):R322–5. doi: 10.1016/s0960-9822(01)00175-0. [DOI] [PubMed] [Google Scholar]
  • 3.Stein BE, et al. Multisensory integration. In: Ramachandran V, editor. Encyclopedia of the Human Brain. Elsevier; Amsterdam: 2002. pp. 227–241. [Google Scholar]
  • 4.Calvert GA, Spence C, Stein BE, editors. The Handbook of Multisensory Processes. The MIT Press; Cambridge, MA: 2004. [Google Scholar]
  • 5.Sumby WH, Pollack I. Visual contribution to speech intelligibility in noise. Journal of the Acoustical Society of America. 1954;26:212–215. [Google Scholar]
  • 6.Stevenson RA, James TW. Audiovisual integration in human superior temporal sulcus: Inverse effectiveness and the neural processing of speech and object recognition. Neuroimage. 2009;44(3):1210–23. doi: 10.1016/j.neuroimage.2008.09.034. [DOI] [PubMed] [Google Scholar]
  • 7.Bishop CW, Miller LM. A multisensory cortical network for understanding speech in noise. J Cogn Neurosci. 2009;21(9):1790–805. doi: 10.1162/jocn.2009.21118. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Girin L, Schwartz J-L, Feng G. Audio-visual enhancement of speech in noise. J Acoust Soc Am. 2001;109:3007. doi: 10.1121/1.1358887. [DOI] [PubMed] [Google Scholar]
  • 9.MacLeod A, Summerfield Q. Quantifying the contribution of vision to speech perception in noise. Br J Audiol. 1987;21(2):131–41. doi: 10.3109/03005368709077786. [DOI] [PubMed] [Google Scholar]
  • 10.Grant KW, Walden BE, Seitz PF. Auditory-visual speech recognition by hearing-impaired subjects: consonant recognition, sentence recognition, and auditory-visual integration. J Acoust Soc Am. 1998;103(5 Pt 1):2677–90. doi: 10.1121/1.422788. [DOI] [PubMed] [Google Scholar]
  • 11.Grant KW, Walden BE. Evaluating the articulation index for auditory-visual consonant recognition. J Acoust Soc Am. 1996;100(4 Pt 1):2415–24. doi: 10.1121/1.417950. [DOI] [PubMed] [Google Scholar]
  • 12.Erber NP. Auditory-visual perception of speech. The Journal of speech and hearing disorders. 1975;40(4):481–92. doi: 10.1044/jshd.4004.481. [DOI] [PubMed] [Google Scholar]
  • 13.Robert-Ribes J, et al. Complementarity and synergy in bimodal speech: auditory, visual, and audio-visual identification of French oral vowels in noise. J Acoust Soc Am. 1998;103(6):3677–89. doi: 10.1121/1.423069. [DOI] [PubMed] [Google Scholar]
  • 14.Amedi A, et al. Functional imaging of human crossmodal identification and object recognition. Exp Brain Res. 2005;166(3-4):559–71. doi: 10.1007/s00221-005-2396-5. [DOI] [PubMed] [Google Scholar]
  • 15.Baum SH, et al. Multisensory speech perception without the left superior temporal sulcus. Neuroimage. 2012 doi: 10.1016/j.neuroimage.2012.05.034. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Beauchamp MS. See me, hear me, touch me: multisensory integration in lateral occipital-temporal cortex. Curr Opin Neurobiol. 2005;15(2):145–53. doi: 10.1016/j.conb.2005.03.011. [DOI] [PubMed] [Google Scholar]
  • 17.Beauchamp MS, et al. Integration of auditory and visual information about objects in superior temporal sulcus. Neuron. 2004;41(5):809–23. doi: 10.1016/s0896-6273(04)00070-4. [DOI] [PubMed] [Google Scholar]
  • 18.Nath AR, Beauchamp MS. Dynamic changes in superior temporal sulcus connectivity during perception of noisy audiovisual speech. J Neurosci. 2011;31(5):1704–14. doi: 10.1523/JNEUROSCI.4853-10.2011. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19.Nath AR, Beauchamp MS. A neural basis for interindividual differences in the McGurk effect, a multisensory speech illusion. Neuroimage. 2012;59(1):781–7. doi: 10.1016/j.neuroimage.2011.07.024. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20.Macaluso E, et al. Spatial and temporal factors during processing of audiovisual speech: a PET study. Neuroimage. 2004;21(2):725–32. doi: 10.1016/j.neuroimage.2003.09.049. [DOI] [PubMed] [Google Scholar]
  • 21.Kim S, James TW. Enhanced effectiveness in visuo-haptic object-selective brain regions with increasing stimulus salience. Hum Brain Mapp. 2010;31(5):678–93. doi: 10.1002/hbm.20897. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22.Kim S, Stevenson RA, James TW. Visuo-haptic neuronal convergence demonstrated with an inversely effective pattern of BOLD activation. J Cogn Neurosci. 2012;24(4):830–42. doi: 10.1162/jocn_a_00176. [DOI] [PubMed] [Google Scholar]
  • 23.Stevenson R, et al. Inverse Effectiveness and Multisensory Interactions in Visual Event-Related Potentials with Audiovisual Speech. Brain Topography. 2012:1–19. doi: 10.1007/s10548-012-0220-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24.Stevenson RA, et al. Neural processing of asynchronous audiovisual speech perception. Neuroimage. 2010;49(4):3308–18. doi: 10.1016/j.neuroimage.2009.12.001. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25.Stevenson RA, et al. Inverse effectiveness and multisensory interactions in visual event-related potentials with audiovisual speech. Brain Topogr. 2012;25(3):308–26. doi: 10.1007/s10548-012-0220-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26.Stevenson RA, Kim S, James TW. An additive-factors design to disambiguate neuronal and areal convergence: measuring multisensory interactions between audio, visual, and haptic sensory streams using fMRI. Exp Brain Res. 2009;198(2-3):183–94. doi: 10.1007/s00221-009-1783-8. [DOI] [PubMed] [Google Scholar]
  • 27.Calvert GA. Crossmodal processing in the human brain: insights from functional neuroimaging studies. Cereb Cortex. 2001;11(12):1110–23. doi: 10.1093/cercor/11.12.1110. [DOI] [PubMed] [Google Scholar]
  • 28.Calvert GA, et al. Response amplification in sensory-specific cortices during crossmodal binding. Neuroreport. 1999;10(12):2619–23. doi: 10.1097/00001756-199908200-00033. [DOI] [PubMed] [Google Scholar]
  • 29.Calvert GA, Campbell R, Brammer MJ. Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex. Curr Biol. 2000;10(11):649–57. doi: 10.1016/s0960-9822(00)00513-3. [DOI] [PubMed] [Google Scholar]
  • 30.Calvert GA, et al. Detection of audio-visual integration sites in humans by application of electrophysiological criteria to the BOLD effect. Neuroimage. 2001;14(2):427–38. doi: 10.1006/nimg.2001.0812. [DOI] [PubMed] [Google Scholar]
  • 31.De Gelder B, Vroomen J, Pourtois G. Multisensory perception of emotion, its time course, and its neural basis. In: Calvert G, Spence C, Stein BE, editors. The Handbook of Multisensory Processes. MIT Press; Cambridge, MA: 2004. pp. 581–597. [Google Scholar]
  • 32.Lloyd DM, et al. Multisensory representation of limb position in human premotor cortex. Nat Neurosci. 2003;6(1):17–8. doi: 10.1038/nn991. [DOI] [PubMed] [Google Scholar]
  • 33.O'Doherty J, Rolls ET, Kringelbach ML. Neuroimaging studies of cross-modal integration for emotion. In: Calvert G, Spence C, Stein BE, editors. The Handbook of Multisensory Processes. MIT Press; Cambridge, MA: 2004. pp. 563–580. [Google Scholar]
  • 34.James TW, et al. Multisensory perception of action in posterior temporal and parietal cortices. Neuropsychologia. 2011;49(1):108–14. doi: 10.1016/j.neuropsychologia.2010.10.030. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 35.Stevenson RA, Geoghegan ML, James TW. Superadditive BOLD activation in superior temporal sulcus with threshold non-speech objects. Experimental Brain Research. 2007;179(1):85–95. doi: 10.1007/s00221-006-0770-6. [DOI] [PubMed] [Google Scholar]
  • 36.Stevenson RA, et al. Discrete neural substrates underlie complementary audiovisual speech integration processes. Neuroimage. 2011;55(3):1339–45. doi: 10.1016/j.neuroimage.2010.12.063. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37.Powers AR, 3rd, Hevey MA, Wallace MT. Neural correlates of multisensory perceptual learning. J Neurosci. 2012;32(18):6263–74. doi: 10.1523/JNEUROSCI.6138-11.2012. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 38.Laurienti PJ, et al. Deactivation of sensory-specific cortex by cross-modal stimuli. J Cogn Neurosci. 2002;14(3):420–9. doi: 10.1162/089892902317361930. [DOI] [PubMed] [Google Scholar]
  • 39.Laurienti PJ, et al. Cross-modal sensory processing in the anterior cingulate and medial prefrontal cortices. Hum Brain Mapp. 2003;19(4):213–23. doi: 10.1002/hbm.10112. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 40.Werner S, Noppeney U. Superadditive Responses in Superior Temporal Sulcus Predict Audiovisual Benefits in Object Categorization. Cereb Cortex. 2009 doi: 10.1093/cercor/bhp248. [DOI] [PubMed] [Google Scholar]
  • 41.Werner S, Noppeney U. Distinct functional contributions of primary sensory and association areas to audiovisual integration in object categorization. J Neurosci. 2010;30(7):2662–75. doi: 10.1523/JNEUROSCI.5091-09.2010. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 42.Werner S, Noppeney U. Superadditive responses in superior temporal sulcus predict audiovisual benefits in object categorization. Cereb Cortex. 2010;20(8):1829–42. doi: 10.1093/cercor/bhp248. [DOI] [PubMed] [Google Scholar]
  • 43.Werner S, Noppeney U. The contributions of transient and sustained response codes to audiovisual integration. Cereb Cortex. 2011;21(4):920–31. doi: 10.1093/cercor/bhq161. [DOI] [PubMed] [Google Scholar]
  • 44.Cappe C, et al. Selective integration of auditory-visual looming cues by humans. Neuropsychologia. 2009;47(4):1045–52. doi: 10.1016/j.neuropsychologia.2008.11.003. [DOI] [PubMed] [Google Scholar]
  • 45.Cappe C, et al. Auditory-visual multisensory interactions in humans: timing, topography, directionality, and sources. J Neurosci. 2010;30(38):12572–80. doi: 10.1523/JNEUROSCI.1099-10.2010. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 46.Foxe JJ, et al. Multisensory auditory-somatosensory interactions in early cortical processing revealed by high-density electrical mapping. Brain Res Cogn Brain Res. 2000;10(1-2):77–83. doi: 10.1016/s0926-6410(00)00024-0. [DOI] [PubMed] [Google Scholar]
  • 47.Foxe JJ, et al. Auditory-somatosensory multisensory processing in auditory association cortex: an fMRI study. J Neurophysiol. 2002;88(1):540–3. doi: 10.1152/jn.2002.88.1.540. [DOI] [PubMed] [Google Scholar]
  • 48.Martuzzi R, et al. Multisensory interactions within human primary cortices revealed by BOLD dynamics. Cereb Cortex. 2007;17(7):1672–9. doi: 10.1093/cercor/bhl077. [DOI] [PubMed] [Google Scholar]
  • 49.Molholm S, et al. Multisensory auditory-visual interactions during early sensory processing in humans: a high-density electrical mapping study. Brain Res Cogn Brain Res. 2002;14(1):115–28. doi: 10.1016/s0926-6410(02)00066-6. [DOI] [PubMed] [Google Scholar]
  • 50.Murray MM, et al. Grabbing your ear: rapid auditory-somatosensory multisensory interactions in low-level sensory cortices are not constrained by stimulus alignment. Cereb Cortex. 2005;15(7):963–74. doi: 10.1093/cercor/bhh197. [DOI] [PubMed] [Google Scholar]
  • 51.Romei V, et al. Preperceptual and stimulus-selective enhancement of low-level human visual cortex excitability by sounds. Curr Biol. 2009;19(21):1799–805. doi: 10.1016/j.cub.2009.09.027. [DOI] [PubMed] [Google Scholar]
  • 52.Wallace MT, Murray MM, editors. Frontiers in the Neural Basis of Multisensory Processes. Taylor & Francis; London: 2011. [Google Scholar]
  • 53.Carriere BN, et al. Visual deprivation alters the development of cortical multisensory integration. J Neurophysiol. 2007;98(5):2858–67. doi: 10.1152/jn.00587.2007. [DOI] [PubMed] [Google Scholar]
  • 54.Jiang W, et al. Two cortical areas mediate multisensory integration in superior colliculus neurons. J Neurophysiol. 2001;85(2):506–22. doi: 10.1152/jn.2001.85.2.506. [DOI] [PubMed] [Google Scholar]
  • 55.Kadunce DC, et al. Mechanisms of within- and cross-modality suppression in the superior colliculus. J Neurophysiol. 1997;78(6):2834–47. doi: 10.1152/jn.1997.78.6.2834. [DOI] [PubMed] [Google Scholar]
  • 56.Meredith MA, Wallace MT, Stein BE. Visual, auditory and somatosensory convergence in output neurons of the cat superior colliculus: multisensory properties of the tecto-reticulo-spinal projection. Exp Brain Res. 1992;88(1):181–6. doi: 10.1007/BF02259139. [DOI] [PubMed] [Google Scholar]
  • 57.Perrault TJ, Jr., et al. Superior colliculus neurons use distinct operational modes in the integration of multisensory stimuli. J Neurophysiol. 2005;93(5):2575–86. doi: 10.1152/jn.00926.2004. [DOI] [PubMed] [Google Scholar]
  • 58.Royal DW, Carriere BN, Wallace MT. Spatiotemporal architecture of cortical receptive fields and its impact on multisensory interactions. Exp Brain Res. 2009;198(2-3):127–36. doi: 10.1007/s00221-009-1772-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 59.Stein BE, Wallace MT. Comparisons of cross-modality integration in midbrain and cortex. Prog Brain Res. 1996;112:289–99. doi: 10.1016/s0079-6123(08)63336-1. [DOI] [PubMed] [Google Scholar]
  • 60.Stein BE, et al. Cortex governs multisensory integration in the midbrain. Neuroscientist. 2002;8(4):306–14. doi: 10.1177/107385840200800406. [DOI] [PubMed] [Google Scholar]
  • 61.Wallace MT, et al. The development of cortical multisensory integration. J Neurosci. 2006;26(46):11844–9. doi: 10.1523/JNEUROSCI.3295-06.2006. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 62.Wallace MT, Meredith MA, Stein BE. Integration of multiple sensory modalities in cat cortex. Exp Brain Res. 1992;91(3):484–8. doi: 10.1007/BF00227844. [DOI] [PubMed] [Google Scholar]
  • 63.Wallace MT, Meredith MA, Stein BE. Converging influences from visual, auditory, and somatosensory cortices onto output neurons of the superior colliculus. J Neurophysiol. 1993;69(6):1797–809. doi: 10.1152/jn.1993.69.6.1797. [DOI] [PubMed] [Google Scholar]
  • 64.Wallace MT, Meredith MA, Stein BE. Multisensory integration in the superior colliculus of the alert cat. J Neurophysiol. 1998;80(2):1006–10. doi: 10.1152/jn.1998.80.2.1006. [DOI] [PubMed] [Google Scholar]
  • 65.Wallace MT, Stein BE. Cross-modal synthesis in the midbrain depends on input from cortex. J Neurophysiol. 1994;71(1):429–32. doi: 10.1152/jn.1994.71.1.429. [DOI] [PubMed] [Google Scholar]
  • 66.Wallace MT, Wilkinson LK, Stein BE. Representation and integration of multiple sensory inputs in primate superior colliculus. J Neurophysiol. 1996;76(2):1246–66. doi: 10.1152/jn.1996.76.2.1246. [DOI] [PubMed] [Google Scholar]
  • 67.Benevento LA, et al. Auditory-visual interaction in single cells in the cortex of the superior temporal sulcus and the orbital frontal cortex of the macaque monkey. Exp Neurol. 1977;57(3):849–72. doi: 10.1016/0014-4886(77)90112-1. [DOI] [PubMed] [Google Scholar]
  • 68.Allman BL, Keniston LP, Meredith MA. Subthreshold auditory inputs to extrastriate visual neurons are responsive to parametric changes in stimulus quality: sensory-specific versus non-specific coding. Brain Res. 2008;1242:95–101. doi: 10.1016/j.brainres.2008.03.086. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 69.Allman BL, Meredith MA. Multisensory processing in “unimodal” neurons: cross-modal subthreshold auditory effects in cat extrastriate visual cortex. J Neurophysiol. 2007;98(1):545–9. doi: 10.1152/jn.00173.2007. [DOI] [PubMed] [Google Scholar]
  • 70.Meredith MA. On the neuronal basis for multisensory convergence: a brief overview. Brain Res Cogn Brain Res. 2002;14(1):31–40. doi: 10.1016/s0926-6410(02)00059-9. [DOI] [PubMed] [Google Scholar]
  • 71.Meredith MA, Stein BE. Interactions among converging sensory inputs in the superior colliculus. Science. 1983;221(4608):389–91. doi: 10.1126/science.6867718. [DOI] [PubMed] [Google Scholar]
  • 72.Meredith MA, Stein BE. Visual, auditory, and somatosensory convergence on cells in superior colliculus results in multisensory integration. J Neurophysiol. 1986;56(3):640–62. doi: 10.1152/jn.1986.56.3.640. [DOI] [PubMed] [Google Scholar]
  • 73.Stein B, Meredith MA. The Merging of the Senses. MIT Press; Boston, MA: 1993. p. 224. [Google Scholar]
  • 74.Stein BE, Meredith MA, Wallace MT. The visually responsive neuron and beyond: multisensory integration in cat and monkey. Prog Brain Res. 1993;95:79–90. doi: 10.1016/s0079-6123(08)60359-3. [DOI] [PubMed] [Google Scholar]
  • 75.Alvarado JC, et al. A neural network model of multisensory integration also accounts for unisensory integration in superior colliculus. Brain Res. 2008;1242:13–23. doi: 10.1016/j.brainres.2008.03.074. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 76.Alvarado JC, et al. Multisensory integration in the superior colliculus requires synergy among corticocollicular inputs. J Neurosci. 2009;29(20):6580–92. doi: 10.1523/JNEUROSCI.0525-09.2009. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 77.Alvarado JC, et al. Multisensory versus unisensory integration: contrasting modes in the superior colliculus. J Neurophysiol. 2007;97(5):3193–205. doi: 10.1152/jn.00018.2007. [DOI] [PubMed] [Google Scholar]
  • 78.Stein BE. Neural mechanisms for synthesizing sensory information and producing adaptive behaviors. Exp Brain Res. 1998;123(1-2):124–35. doi: 10.1007/s002210050553. [DOI] [PubMed] [Google Scholar]
  • 79.Stein BE, Meredith MA. Multisensory integration. Neural and behavioral solutions for dealing with stimuli from different sensory modalities. Ann N Y Acad Sci. 1990;608:51–65. doi: 10.1111/j.1749-6632.1990.tb48891.x. discussion 65-70. [DOI] [PubMed] [Google Scholar]
  • 80.Stein BE, Stanford TR, Rowland BA. The neural basis of multisensory integration in the midbrain: Its organization and maturation. Hear Res. 2009 doi: 10.1016/j.heares.2009.03.012. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 81.Schroeder CE, Foxe JJ. The timing and laminar profile of converging inputs to multisensory areas of the macaque neocortex. Brain Res Cogn Brain Res. 2002;14(1):187–98. doi: 10.1016/s0926-6410(02)00073-3. [DOI] [PubMed] [Google Scholar]
  • 82.Schroeder CE, et al. Somatosensory input to auditory association cortex in the macaque monkey. J Neurophysiol. 2001;85(3):1322–7. doi: 10.1152/jn.2001.85.3.1322. [DOI] [PubMed] [Google Scholar]
  • 83.Hackett TA, Schroeder CE. Multisensory integration in auditory and auditory-related areas of cortex. Hear Res. 2009;258(1-2):1–3. doi: 10.1016/j.heares.2009.10.016. [DOI] [PubMed] [Google Scholar]
  • 84.Zion Golumbic E, et al. Visual input enhances selective speech envelope tracking in auditory cortex at a “cocktail party”. J Neurosci. 2013;33(4):1417–26. doi: 10.1523/JNEUROSCI.3675-12.2013. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 85.Driver J, Noesselt T. Multisensory interplay reveals crossmodal influences on ‘sensory-specific’ brain regions, neural responses, and judgments. Neuron. 2008;57(1):11–23. doi: 10.1016/j.neuron.2007.12.013. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 86.Schroeder CE, Foxe J. Multisensory contributions to low-level, ‘unisensory’ processing. Curr Opin Neurobiol. 2005;15(4):454–8. doi: 10.1016/j.conb.2005.06.008. [DOI] [PubMed] [Google Scholar]
  • 87.Meredith MA, Nemitz JW, Stein BE. Determinants of multisensory integration in superior colliculus neurons. I. Temporal factors. J Neurosci. 1987;7(10):3215–29. doi: 10.1523/JNEUROSCI.07-10-03215.1987. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 88.Meredith MA, Stein BE. Spatial factors determine the activity of multisensory neurons in cat superior colliculus. Brain Res. 1986;365(2):350–4. doi: 10.1016/0006-8993(86)91648-3. [DOI] [PubMed] [Google Scholar]
  • 89.Meredith MA, Stein BE. Spatial determinants of multisensory integration in cat superior colliculus neurons. J Neurophysiol. 1996;75(5):1843–57. doi: 10.1152/jn.1996.75.5.1843. [DOI] [PubMed] [Google Scholar]
  • 90.Wallace MT, et al. Unifying multisensory signals across time and space. Exp Brain Res. 2004;158(2):252–8. doi: 10.1007/s00221-004-1899-9. [DOI] [PubMed] [Google Scholar]
  • 91.Stein BE, et al. Challenges in quantifying multisensory integration: alternative criteria, models, and inverse effectiveness. Exp Brain Res. 2009;198(2-3):113–26. doi: 10.1007/s00221-009-1880-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 92.James TW, Stevenson RA, Kim S. Inverse effectiveness in multisensory processing. In: Stein BE, editor. The New Handbook of Multisensory Processes. MIT Press; Cambridge, MA: 2012. [Google Scholar]
  • 93.Carriere BN, Royal DW, Wallace MT. Spatial heterogeneity of cortical receptive fields and its impact on multisensory interactions. J Neurophysiol. 2008;99(5):2357–68. doi: 10.1152/jn.01386.2007. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 94.Krueger J, et al. Spatial receptive field organization of multisensory neurons and its impact on multisensory interactions. Hear Res. 2009;258(1-2):47–54. doi: 10.1016/j.heares.2009.08.003. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 95.Sarko DK, Ghose D, Wallace MT. Convergent approaches toward the study of multisensory perception. Front Syst Neurosci. 2013;7:81. doi: 10.3389/fnsys.2013.00081. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 96.Ghose D, Wallace MT. Heterogeneity in the spatial receptive field architecture of multisensory neurons of the superior colliculus and its effects on multisensory integration. Neuroscience. 2014;256:147–62. doi: 10.1016/j.neuroscience.2013.10.044. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 97.Sarko DK, et al. Spatial and Temporal Features of Multisensory Processes: Bridging Animal and Human Studies. In: Murray MM, Wallace MT, editors. The Neural Bases of Multisensory Processes. Boca Raton (FL): 2012. [PubMed] [Google Scholar]
  • 98.Lewald J, Ehrenstein WH, Guski R. Spatio-temporal constraints for auditory-visual integration. Behav Brain Res. 2001;121(1-2):69–79. doi: 10.1016/s0166-4328(00)00386-7. [DOI] [PubMed] [Google Scholar]
  • 99.Conrey B, Pisoni DB. Auditory-visual speech perception and synchrony detection for speech and nonspeech signals. J Acoust Soc Am. 2006;119(6):4065–73. doi: 10.1121/1.2195091. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 100.Dixon NF, Spitz L. The detection of auditory visual desynchrony. Perception. 1980;9(6):719–21. doi: 10.1068/p090719. [DOI] [PubMed] [Google Scholar]
  • 101.Hillock AR, Powers AR, Wallace MT. Binding of sights and sounds: age-related changes in multisensory temporal processing. Neuropsychologia. 2011;49(3):461–7. doi: 10.1016/j.neuropsychologia.2010.11.041. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 102.Keetels M, Vroomen J. The role of spatial disparity and hemifields in audio-visual temporal order judgments. Exp Brain Res. 2005;167(4):635–40. doi: 10.1007/s00221-005-0067-1. [DOI] [PubMed] [Google Scholar]
  • 103.Powers AR, 3rd, Hillock AR, Wallace MT. Perceptual training narrows the temporal window of multisensory binding. J Neurosci. 2009;29(39):12265–74. doi: 10.1523/JNEUROSCI.3501-09.2009. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 104.van Atteveldt NM, et al. The effect of temporal asynchrony on the multisensory integration of letters and speech sounds. Cereb Cortex. 2007;17(4):962–74. doi: 10.1093/cercor/bhl007. [DOI] [PubMed] [Google Scholar]
  • 105.van Wassenhove V, Grant KW, Poeppel D. Temporal window of integration in auditory-visual speech perception. Neuropsychologia. 2007;45(3):598–607. doi: 10.1016/j.neuropsychologia.2006.01.001. [DOI] [PubMed] [Google Scholar]
  • 106.Zampini M, et al. Audio-visual simultaneity judgments. Percept Psychophys. 2005;67(3):531–44. doi: 10.3758/BF03193329. [DOI] [PubMed] [Google Scholar]
  • 107.Bertelson P, Radeau M. Cross-modal bias and perceptual fusion with auditory-visual spatial discordance. Percept Psychophys. 1981;29(6):578–84. doi: 10.3758/bf03207374. [DOI] [PubMed] [Google Scholar]
  • 108.Alais D, Newell FN, Mamassian P. Multisensory processing in review: from physiology to behaviour. Seeing Perceiving. 2010;23(1):3–38. doi: 10.1163/187847510X488603. [DOI] [PubMed] [Google Scholar]
  • 109.Colonius H, Diederich A. Multisensory interaction in saccadic reaction time: a time-window-of-integration model. J Cogn Neurosci. 2004;16(6):1000–9. doi: 10.1162/0898929041502733. [DOI] [PubMed] [Google Scholar]
  • 110.Colonius H, Diederich A. The optimal time window of visual-auditory integration: a reaction time analysis. Front Integr Neurosci. 2010;4:11. doi: 10.3389/fnint.2010.00011. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 111.Colonius H, Diederich A. Computing an optimal time window of audiovisual integration in focused attention tasks: illustrated by studies on effect of age and prior knowledge. Exp Brain Res. 2011;212(3):327–37. doi: 10.1007/s00221-011-2732-x. [DOI] [PubMed] [Google Scholar]
  • 112.Colonius H, Diederich A, Steenken R. Time-window-of-integration (TWIN) model for saccadic reaction time: effect of auditory masker level on visual-auditory spatial interaction in elevation. Brain Topogr. 2009;21(3-4):177–84. doi: 10.1007/s10548-009-0091-8. [DOI] [PubMed] [Google Scholar]
  • 113.Diederich A, Colonius H. Bimodal and trimodal multisensory enhancement: effects of stimulus onset and intensity on reaction time. Percept Psychophys. 2004;66(8):1388–404. doi: 10.3758/bf03195006. [DOI] [PubMed] [Google Scholar]
  • 114.Diederich A, Colonius H. Modeling spatial effects in visual-tactile saccadic reaction time. Percept Psychophys. 2007;69(1):56–67. doi: 10.3758/bf03194453. [DOI] [PubMed] [Google Scholar]
  • 115.Diederich A, Colonius H. Crossmodal interaction in speeded responses: time window of integration model. Prog Brain Res. 2009;174:119–35. doi: 10.1016/S0079-6123(09)01311-9. [DOI] [PubMed] [Google Scholar]
  • 116.Diederich A, et al. Visual-tactile spatial interaction in saccade generation. Exp Brain Res. 2003;148(3):328–37. doi: 10.1007/s00221-002-1302-7. [DOI] [PubMed] [Google Scholar]
  • 117.Frens MA, Van Opstal AJ, Van der Willigen RF. Spatial and temporal factors determine auditory-visual interactions in human saccadic eye movements. Percept Psychophys. 1995;57(6):802–16. doi: 10.3758/bf03206796. [DOI] [PubMed] [Google Scholar]
  • 118.Hay-McCutcheon MJ, Pisoni DB, Hunt KK. Audiovisual asynchrony detection and speech perception in hearing-impaired listeners with cochlear implants: a preliminary analysis. Int J Audiol. 2009;48(6):321–33. doi: 10.1080/14992020802644871. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 119.Hay-McCutcheon MJ, Pisoni DB, Kirk KI. Audiovisual speech perception in elderly cochlear implant recipients. Laryngoscope. 2005;115(10):1887–94. doi: 10.1097/01.mlg.0000173197.94769.ba. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 120.Hillock-Dunn A, Wallace MT. Developmental changes in the multisensory temporal binding window persist into adolescence. Developmental Science. 2012;15(5):688–96. doi: 10.1111/j.1467-7687.2012.01171.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 121.Keetels M, Stekelenburg J, Vroomen J. Auditory grouping occurs prior to intersensory pairing: evidence from temporal ventriloquism. Exp Brain Res. 2007;180(3):449–56. doi: 10.1007/s00221-007-0881-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 122.Keetels M, Vroomen J. No effect of auditory-visual spatial disparity on temporal recalibration. Exp Brain Res. 2007;182(4):559–65. doi: 10.1007/s00221-007-1012-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 123.Keetels M, Vroomen J. Tactile-visual temporal ventriloquism: no effect of spatial disparity. Percept Psychophys. 2008;70(5):765–71. doi: 10.3758/pp.70.5.765. [DOI] [PubMed] [Google Scholar]
  • 124.Kuling IA, et al. Effects of stimulus duration on audio-visual synchrony perception. Exp Brain Res. 2012;221(4):403–12. doi: 10.1007/s00221-012-3182-9. [DOI] [PubMed] [Google Scholar]
  • 125.Lewkowicz DJ. Perception of auditory-visual temporal synchrony in human infants. Journal of Experimental Psychology: Human Perception and Performance. 1996;22(5):1094. doi: 10.1037//0096-1523.22.5.1094. [DOI] [PubMed] [Google Scholar]
  • 126.Lewkowicz DJ. The development of intersensory temporal perception: an epigenetic systems/limitations view. Psychol Bull. 2000;126(2):281. doi: 10.1037/0033-2909.126.2.281. [DOI] [PubMed] [Google Scholar]
  • 127.Massaro DW, Cohen MM, Smeele PM. Perception of asynchronous and conflicting visual and auditory speech. J Acoust Soc Am. 1996;100(3):1777–86. doi: 10.1121/1.417342. [DOI] [PubMed] [Google Scholar]
  • 128.Neil PA, et al. Development of multisensory spatial integration and perception in humans. Dev Sci. 2006;9(5):454–64. doi: 10.1111/j.1467-7687.2006.00512.x. [DOI] [PubMed] [Google Scholar]
  • 129.Stevenson RA, et al. Interactions between the spatial and temporal stimulus factors that influence multisensory integration in human performance. Exp Brain Res. 2012 doi: 10.1007/s00221-012-3072-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 130.Doehrmann O, Naumer MJ. Semantics and the multisensory brain: how meaning modulates processes of audio-visual integration. Brain Res. 2008;1242:136–50. doi: 10.1016/j.brainres.2008.03.071. [DOI] [PubMed] [Google Scholar]
  • 131.van Atteveldt N, et al. Multisensory integration: flexible use of general operations. Neuron. 2014;81(6):1240–53. doi: 10.1016/j.neuron.2014.02.044. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 132.Bell AH, et al. The influence of stimulus properties on multisensory processing in the awake primate superior colliculus. Can J Exp Psychol. 2001;55(2):123–32. doi: 10.1037/h0087359. [DOI] [PubMed] [Google Scholar]
  • 133.Harrington LK, Peck CK. Spatial disparity affects visual-auditory interactions in human sensorimotor processing. Exp Brain Res. 1998;122(2):247–52. doi: 10.1007/s002210050512. [DOI] [PubMed] [Google Scholar]
  • 134.Meredith MA, Stein BE. Spatial factors determine the activity of multisensory neurons in cat superior colliculus. Brain Res. 1986;365:350–354. doi: 10.1016/0006-8993(86)91648-3. [DOI] [PubMed] [Google Scholar]
  • 135.Stevenson RA, et al. Interactions between the spatial and temporal stimulus factors that influence multisensory integration in human performance. Exp Brain Res. 2012;219(1):121–37. doi: 10.1007/s00221-012-3072-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 136.Teder-Salejarvi WA, et al. Effects of spatial congruity on audio-visual multimodal integration. J Cogn Neurosci. 2005;17(9):1396–409. doi: 10.1162/0898929054985383. [DOI] [PubMed] [Google Scholar]
  • 137.Van Wanrooij MM, et al. The effect of spatial-temporal audiovisual disparities on saccades in a complex scene. Exp Brain Res. 2009;198(2-3):425–37. doi: 10.1007/s00221-009-1815-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 138.Zampini M, et al. Auditory-somatosensory multisensory interactions in front and rear space. Neuropsychologia. 2007;45(8):1869–77. doi: 10.1016/j.neuropsychologia.2006.12.004. [DOI] [PubMed] [Google Scholar]
  • 139.Conrey BL, Pisoni DB. Detection of Auditory-Visual Asynchrony in Speech and Nonspeech Signals. In: Pisoni DB, editor. Research on Spoken Language Processing. Indiana University; Bloomington: 2004. pp. 71–94. [Google Scholar]
  • 140.Stevenson RA, Wallace MT. Multisensory temporal integration: task and stimulus dependencies. Exp Brain Res. 2013;227(2):249–61. doi: 10.1007/s00221-013-3507-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 141.van Eijk RL, et al. Audiovisual synchrony and temporal order judgments: effects of experimental method and stimulus type. Percept Psychophys. 2008;70(6):955–68. doi: 10.3758/pp.70.6.955. [DOI] [PubMed] [Google Scholar]
  • 142.Hairston WD, et al. Altered temporal profile of visual-auditory multisensory interactions in dyslexia. Exp Brain Res. 2005;166(3-4):474–80. doi: 10.1007/s00221-005-2387-6. [DOI] [PubMed] [Google Scholar]
  • 143.Hairston WD, et al. Multisensory enhancement of localization under conditions of induced myopia. Exp Brain Res. 2003;152(3):404–8. doi: 10.1007/s00221-003-1646-7. [DOI] [PubMed] [Google Scholar]
  • 144.Stevenson RA, Wallace MT, Altieri N. The interaction between stimulus factors and cognitive factors during multisensory integration of audiovisual speech. Front Psychol. 2014;5:352. doi: 10.3389/fpsyg.2014.00352. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 145.Vatakis A, Spence C. Audiovisual synchrony perception for music, speech, and object actions. Brain Res. 2006;1111(1):134–42. doi: 10.1016/j.brainres.2006.05.078. [DOI] [PubMed] [Google Scholar]
  • 146.de Boer-Schellekens L, Eussen M, Vroomen J. Diminished sensitivity of audiovisual temporal order in autism spectrum disorder. Frontiers in integrative neuroscience. 2013;7:8. doi: 10.3389/fnint.2013.00008. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 147.Stevenson RA, et al. Multisensory Temporal Integration in Autism Spectrum Disorders. Journal of Neuroscience. doi: 10.1523/JNEUROSCI.3615-13.2014. In Press. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 148.Stevenson RA, Zemtsov RK, Wallace MT. Individual Differences in the Multisensory Temporal Binding Window Predict Susceptibility to Audiovisual Illusions. J Exp Psychol Hum Percept Perform. 2012 doi: 10.1037/a0027339. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 149.Navarra J, et al. Exposure to asynchronous audiovisual speech extends the temporal window for audiovisual integration. Brain Res Cogn Brain Res. 2005;25(2):499–507. doi: 10.1016/j.cogbrainres.2005.07.009. [DOI] [PubMed] [Google Scholar]
  • 150.Fujisaki W, et al. Recalibration of audiovisual simultaneity. Nat Neurosci. 2004;7(7):773–8. doi: 10.1038/nn1268. [DOI] [PubMed] [Google Scholar]
  • 151.Keetels M, Vroomen J. Temporal recalibration to tactile-visual asynchronous stimuli. Neurosci Lett. 2008;430(2):130–4. doi: 10.1016/j.neulet.2007.10.044. [DOI] [PubMed] [Google Scholar]
  • 152.Vroomen J, et al. Recalibration of temporal order perception by exposure to audio-visual asynchrony. Brain Res Cogn Brain Res. 2004;22(1):32–5. doi: 10.1016/j.cogbrainres.2004.07.003. [DOI] [PubMed] [Google Scholar]
  • 153.Stetson C, et al. Motor-sensory recalibration leads to an illusory reversal of action and sensation. Neuron. 2006;51(5):651–9. doi: 10.1016/j.neuron.2006.08.006. [DOI] [PubMed] [Google Scholar]
  • 154.Stevenson RA, et al. The effects of visual training on multisensory temporal processing. Exp Brain Res. 2013;225(4):479–89. doi: 10.1007/s00221-012-3387-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 155.Schlesinger JJ, et al. Improving pulse oximetry pitch perception with multisensory perceptual training. Anesth Analg. 2014;118(6):1249–53. doi: 10.1213/ANE.0000000000000222. [DOI] [PubMed] [Google Scholar]
  • 156.Stein B, et al. Behavioral indices of multisensory integration: orientation to visual cues is affected by auditory stimuli. J Cogn Neurosci. 1989;1:12–24. doi: 10.1162/jocn.1989.1.1.12. [DOI] [PubMed] [Google Scholar]
  • 157.Frens MA, Van Opstal AJ. Visual-auditory interactions modulate saccade-related activity in monkey superior colliculus. Brain Res Bull. 1998;46(3):211–24. doi: 10.1016/s0361-9230(98)00007-0. [DOI] [PubMed] [Google Scholar]
  • 158.Hughes HC, Nelson MD, Aronchick DM. Spatial characteristics of visual-auditory summation in human saccades. Vision Res. 1998;38(24):3955–63. doi: 10.1016/s0042-6989(98)00036-4. [DOI] [PubMed] [Google Scholar]
  • 159.Hughes HC, et al. Visual-auditory interactions in sensorimotor processing: saccades versus manual responses. J Exp Psychol Hum Percept Perform. 1994;20(1):131–53. doi: 10.1037//0096-1523.20.1.131. [DOI] [PubMed] [Google Scholar]
  • 160.Diederich A, Colonius H. Crossmodal interaction in saccadic reaction time: separating multisensory from warning effects in the time window of integration model. Exp Brain Res. 2008;186(1):1–22. doi: 10.1007/s00221-007-1197-4. [DOI] [PubMed] [Google Scholar]
  • 161.Miller LM, D'Esposito M. Perceptual fusion and stimulus coincidence in the cross-modal integration of speech. J Neurosci. 2005;25(25):5884–93. doi: 10.1523/JNEUROSCI.0896-05.2005. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 162.Beauchamp MS, Nath AR, Pasalar S. fMRI-Guided transcranial magnetic stimulation reveals that the superior temporal sulcus is a cortical locus of the McGurk effect. J Neurosci. 2010;30(7):2414–7. doi: 10.1523/JNEUROSCI.4865-09.2010. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 163.Nath AR, Fava EE, Beauchamp MS. Neural correlates of interindividual differences in children's audiovisual speech perception. J Neurosci. 2011;31(39):13963–71. doi: 10.1523/JNEUROSCI.2605-11.2011. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 164.Wallace MT, Stein BE. Development of multisensory neurons and multisensory integration in cat superior colliculus. J Neurosci. 1997;17(7):2429–2444. doi: 10.1523/JNEUROSCI.17-07-02429.1997. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 165.Wallace MT. The development of multisensory processes. Cognitive Processing. 2004;5(2):69–83. [Google Scholar]
  • 166.Wallace MT. The development of multisensory integration. In: Calvert GA, Spence C, Stein BE, editors. The Handbook of Multisensory Processes. The MIT Press; Cambridge, MA: 2004. pp. 683–700. [Google Scholar]
  • 167.Wallace MT, McHaffie JG, Stein BE. Visual response properties and visuotopic representation in the newborn monkey superior colliculus. J Neurophysiol. 1997;78(5):2732–41. doi: 10.1152/jn.1997.78.5.2732. [DOI] [PubMed] [Google Scholar]
  • 168.Polley DB, et al. Development and plasticity of intra- and intersensory information processing. J Am Acad Audiol. 2008;19(10):780–98. doi: 10.3766/jaaa.19.10.6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 169.Wallace MT, et al. Visual experience is necessary for the development of multisensory integration. J Neurosci. 2004;24(43):9580–4. doi: 10.1523/JNEUROSCI.2535-04.2004. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 170.Wallace MT, Stein BE. The role of experience in the development of multisensory integration. Soc. Neurosci. Abstr. 2000;26:1220. [Google Scholar]
  • 171.Lewkowicz DJ. Early experience and multisensory perceptual narrowing. Dev Psychobiol. 2014;56(2):292–315. doi: 10.1002/dev.21197. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 172.Lewkowicz DJ, Lickliter R. The development of intersensory perception: comparative perspectives. Lawrence Erlbaum Associates; Hillsdale, NJ: 1994. [Google Scholar]
  • 173.Lewkowicz DJ, Ghazanfar AA. The emergence of multisensory systems through perceptual narrowing. Trends Cogn Sci. 2009;13(11):470–8. doi: 10.1016/j.tics.2009.08.004. [DOI] [PubMed] [Google Scholar]
  • 174.Lewkowicz DJ. Infants' perception of the audible, visible, and bimodal attributes of multimodal syllables. Child Dev. 2000;71(5):1241–57. doi: 10.1111/1467-8624.00226. [DOI] [PubMed] [Google Scholar]
  • 175.Lewkowicz DJ. Infant perception of audio-visual speech synchrony. Dev Psychol. 2010;46(1):66–77. doi: 10.1037/a0015579. [DOI] [PubMed] [Google Scholar]
  • 176.Lewkowicz DJ. Perception of auditory-visual temporal synchrony in human infants. J Exp Psychol Hum Percept Perform. 1996;22(5):1094–106. doi: 10.1037//0096-1523.22.5.1094. [DOI] [PubMed] [Google Scholar]
  • 177.Lewkowicz DJ, Flom R. The audiovisual temporal binding window narrows in early childhood. Child Dev. 2014;85(2):685–94. doi: 10.1111/cdev.12142. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 178.Hillock-Dunn A, Wallace MT. Developmental changes in the multisensory temporal binding window persist into adolescence. Dev Sci. 2012;15(5):688–96. doi: 10.1111/j.1467-7687.2012.01171.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 179.Autism and Developmental Disabilities Monitoring Network Surveillance Year 2008 Principal Investigators, Centers for Disease Control and Prevention (CDC). Prevalence of autism spectrum disorders--Autism and Developmental Disabilities Monitoring Network, 14 sites, United States, 2008. MMWR Surveill Summ. 2012;61(3):1–19. [PubMed] [Google Scholar]
  • 180.American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders (DSM-5). 2013. [DOI] [PubMed]
  • 181.Bertone A, et al. Enhanced and diminished visuo-spatial information processing in autism depends on stimulus complexity. Brain : a journal of neurology. 2005;128(Pt 10):2430–41. doi: 10.1093/brain/awh561. [DOI] [PubMed] [Google Scholar]
  • 182.Kern JK, et al. The pattern of sensory processing abnormalities in autism. Autism. 2006;10(5):480–94. doi: 10.1177/1362361306066564. [DOI] [PubMed] [Google Scholar]
  • 183.Kern JK, et al. Sensory correlations in autism. Autism. 2007;11(2):123–34. doi: 10.1177/1362361307075702. [DOI] [PubMed] [Google Scholar]
  • 184.Marco EJ, et al. Sensory processing in autism: a review of neurophysiology findings. Pediatr Res. 2011;69(5 Pt 2):48R–54R. doi: 10.1203/PDR.0b013e3182130c54. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 185.Pellicano E, et al. Abnormal global processing along the dorsal visual pathway in autism: a possible mechanism for weak visuospatial coherence? Neuropsychologia. 2005;43(7):1044–53. doi: 10.1016/j.neuropsychologia.2004.10.003. [DOI] [PubMed] [Google Scholar]
  • 186.Simmons DR, et al. Vision in autism spectrum disorders. Vision Res. 2009;49(22):2705–39. doi: 10.1016/j.visres.2009.08.005. [DOI] [PubMed] [Google Scholar]
  • 187.Kroger A, et al. Visual event-related potentials to biological motion stimuli in autism spectrum disorders. Soc Cogn Affect Neurosci. 2013 doi: 10.1093/scan/nst103. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 188.Frey HP, et al. Atypical cortical representation of peripheral visual space in children with an autism spectrum disorder. Eur J Neurosci. 2013;38(1):2125–38. doi: 10.1111/ejn.12243. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 189.Tsermentseli S, O'Brien JM, Spencer JV. Comparison of form and motion coherence processing in autistic spectrum disorders and dyslexia. J Autism Dev Disord. 2008;38(7):1201–10. doi: 10.1007/s10803-007-0500-3. [DOI] [PubMed] [Google Scholar]
  • 190.Takarae Y, et al. Patterns of visual sensory and sensorimotor abnormalities in autism vary in relation to history of early language delay. J Int Neuropsychol Soc. 2008;14(6):980–9. doi: 10.1017/S1355617708081277. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 191.Spencer JV, O'Brien JM. Visual form-processing deficits in autism. Perception. 2006;35(8):1047–55. doi: 10.1068/p5328. [DOI] [PubMed] [Google Scholar]
  • 192.Davis RA, et al. Subjective perceptual distortions and visual dysfunction in children with autism. J Autism Dev Disord. 2006;36(2):199–210. doi: 10.1007/s10803-005-0055-0. [DOI] [PubMed] [Google Scholar]
  • 193.Wainwright-Sharp JA, Bryson SE. Visual orienting deficits in high-functioning people with autism. J Autism Dev Disord. 1993;23(1):1–13. doi: 10.1007/BF01066415. [DOI] [PubMed] [Google Scholar]
  • 194.Greenaway R, Davis G, Plaisted-Grant K. Marked selective impairment in autism on an index of magnocellular function. Neuropsychologia. 2013;51(4):592–600. doi: 10.1016/j.neuropsychologia.2013.01.005. [DOI] [PubMed] [Google Scholar]
  • 195.Groen WB, et al. Intact spectral but abnormal temporal processing of auditory stimuli in autism. Journal of autism and developmental disorders. 2009;39(5):742–50. doi: 10.1007/s10803-008-0682-3. [DOI] [PubMed] [Google Scholar]
  • 196.Kwakye LD, et al. Altered auditory and multisensory temporal processing in autism spectrum disorders. Front Integr Neurosci. 2011;4:129. doi: 10.3389/fnint.2010.00129. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 197.Visser E, et al. Atypical vertical sound localization and sound-onset sensitivity in people with autism spectrum disorders. J Psychiatry Neurosci. 2013;38(6):398–406. doi: 10.1503/jpn.120177. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 198.Kujala T, Lepisto T, Naatanen R. The neural basis of aberrant speech and audition in autism spectrum disorders. Neurosci Biobehav Rev. 2013;37(4):697–704. doi: 10.1016/j.neubiorev.2013.01.006. [DOI] [PubMed] [Google Scholar]
  • 199.Gandal MJ, et al. Validating gamma oscillations and delayed auditory responses as translational biomarkers of autism. Biol Psychiatry. 2010;68(12):1100–6. doi: 10.1016/j.biopsych.2010.09.031. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 200.Russo NM, et al. Biological changes in auditory function following training in children with autism spectrum disorders. Behav Brain Funct. 2010;6:60. doi: 10.1186/1744-9081-6-60. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 201.Roberts TP, et al. MEG detection of delayed auditory evoked responses in autism spectrum disorders: towards an imaging biomarker for autism. Autism Res. 2010;3(1):8–18. doi: 10.1002/aur.111. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 202.Russo NM, et al. Deficient brainstem encoding of pitch in children with Autism Spectrum Disorders. Clin Neurophysiol. 2008;119(8):1720–31. doi: 10.1016/j.clinph.2008.01.108. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 203.Lepisto T, et al. The discrimination of and orienting to speech and non-speech sounds in children with autism. Brain Res. 2005;1066(1-2):147–57. doi: 10.1016/j.brainres.2005.10.052. [DOI] [PubMed] [Google Scholar]
  • 204.Teder-Salejarvi WA, et al. Auditory spatial localization and attention deficits in autistic adults. Brain Res Cogn Brain Res. 2005;23(2-3):221–34. doi: 10.1016/j.cogbrainres.2004.10.021. [DOI] [PubMed] [Google Scholar]
  • 205.Szelag E, et al. Temporal processing deficits in high-functioning children with autism. Br J Psychol. 2004;95(Pt 3):269–82. doi: 10.1348/0007126041528167. [DOI] [PubMed] [Google Scholar]
  • 206.Gage NM, et al. Cortical sound processing in children with autism disorder: an MEG investigation. Neuroreport. 2003;14(16):2047–51. doi: 10.1097/00001756-200311140-00008. [DOI] [PubMed] [Google Scholar]
  • 207.Cascio CJ. Somatosensory processing in neurodevelopmental disorders. J Neurodev Disord. 2010;2(2):62–9. doi: 10.1007/s11689-010-9046-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 208.Cascio CJ, et al. The rubber hand illusion in children with autism spectrum disorders: delayed influence of combined tactile and visual input on proprioception. Autism. 2012;16(4):406–19. doi: 10.1177/1362361311430404. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 209.Foss-Feig JH, Heacock JL, Cascio CJ. Tactile Responsiveness Patterns and Their Association with Core Features in Autism Spectrum Disorders. Res Autism Spectr Disord. 2012;6(1):337–344. doi: 10.1016/j.rasd.2011.06.007. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 210.Falter CM, et al. Enhanced access to early visual processing of perceptual simultaneity in autism spectrum disorders. J Autism Dev Disord. 2013;43(8):1857–66. doi: 10.1007/s10803-012-1735-1. [DOI] [PubMed] [Google Scholar]
  • 211.Chamberlain R, et al. Local processing enhancements associated with superior observational drawing are due to enhanced perceptual functioning, not weak central coherence. Q J Exp Psychol (Hove) 2013;66(7):1448–66. doi: 10.1080/17470218.2012.750678. [DOI] [PubMed] [Google Scholar]
  • 212.Stanutz S, Wapnick J, Burack JA. Pitch discrimination and melodic memory in children with autism spectrum disorders. Autism. 2014;18(2):137–47. doi: 10.1177/1362361312462905. [DOI] [PubMed] [Google Scholar]
  • 213.Almeida RA, et al. Visual search targeting either local or global perceptual processes differs as a function of autistic-like traits in the typically developing population. J Autism Dev Disord. 2013;43(6):1272–86. doi: 10.1007/s10803-012-1669-7. [DOI] [PubMed] [Google Scholar]
  • 214.Falter CM, Elliott MA, Bailey AJ. Enhanced visual temporal resolution in autism spectrum disorders. PLoS One. 2012;7(3):e32774. doi: 10.1371/journal.pone.0032774. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 215.Chen Y, et al. Enhanced local processing of dynamic visual information in autism: evidence from speed discrimination. Neuropsychologia. 2012;50(5):733–9. doi: 10.1016/j.neuropsychologia.2012.01.007. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 216.Samson F, et al. Enhanced visual functioning in autism: an ALE meta-analysis. Hum Brain Mapp. 2012;33(7):1553–81. doi: 10.1002/hbm.21307. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 217.Bonnel A, et al. Enhanced pure-tone pitch discrimination among persons with autism but not Asperger syndrome. Neuropsychologia. 2010;48(9):2465–75. doi: 10.1016/j.neuropsychologia.2010.04.020. [DOI] [PubMed] [Google Scholar]
  • 218.Joseph RM, et al. Why is visual search superior in autism spectrum disorder? Dev Sci. 2009;12(6):1083–96. doi: 10.1111/j.1467-7687.2009.00855.x. [DOI] [PubMed] [Google Scholar]
  • 219.Smith H, Milne E. Reduced change blindness suggests enhanced attention to detail in individuals with autism. J Child Psychol Psychiatry. 2009;50(3):300–6. doi: 10.1111/j.1469-7610.2008.01957.x. [DOI] [PubMed] [Google Scholar]
  • 220.Manjaly ZM, et al. Neurophysiological correlates of relatively enhanced local visual search in autistic adolescents. Neuroimage. 2007;35(1):283–91. doi: 10.1016/j.neuroimage.2006.11.036. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 221.O'Riordan M, Passetti F. Discrimination in autism within different sensory modalities. J Autism Dev Disord. 2006;36(5):665–75. doi: 10.1007/s10803-006-0106-1. [DOI] [PubMed] [Google Scholar]
  • 222.Jarrold C, Gilchrist ID, Bender A. Embedded figures detection in autism and typical development: preliminary evidence of a double dissociation in relationships with visual search. Dev Sci. 2005;8(4):344–51. doi: 10.1111/j.1467-7687.2005.00422.x. [DOI] [PubMed] [Google Scholar]
  • 223.Bertone A, et al. Enhanced and diminished visuo-spatial information processing in autism depends on stimulus complexity. Brain. 2005;128(Pt 10):2430–41. doi: 10.1093/brain/awh561. [DOI] [PubMed] [Google Scholar]
  • 224.O'Riordan MA. Superior visual search in adults with autism. Autism. 2004;8(3):229–48. doi: 10.1177/1362361304045219. [DOI] [PubMed] [Google Scholar]
  • 225.Bonnel A, et al. Enhanced pitch sensitivity in individuals with autism: a signal detection analysis. J Cogn Neurosci. 2003;15(2):226–35. doi: 10.1162/089892903321208169. [DOI] [PubMed] [Google Scholar]
  • 226.O'Riordan M, Plaisted K. Enhanced discrimination in autism. Q J Exp Psychol A. 2001;54(4):961–79. doi: 10.1080/713756000. [DOI] [PubMed] [Google Scholar]
  • 227.O'Riordan MA, et al. Superior visual search in autism. J Exp Psychol Hum Percept Perform. 2001;27(3):719–30. doi: 10.1037//0096-1523.27.3.719. [DOI] [PubMed] [Google Scholar]
  • 228.Mottron L, Peretz I, Menard E. Local and global processing of music in high-functioning persons with autism: beyond central coherence? J Child Psychol Psychiatry. 2000;41(8):1057–65. [PubMed] [Google Scholar]
  • 229.Plaisted K, O'Riordan M, Baron-Cohen S. Enhanced visual search for a conjunctive target in autism: a research note. J Child Psychol Psychiatry. 1998;39(5):777–83. [PubMed] [Google Scholar]
  • 230.Plaisted K, O'Riordan M, Baron-Cohen S. Enhanced discrimination of novel, highly similar stimuli by adults with autism during a perceptual learning task. J Child Psychol Psychiatry. 1998;39(5):765–75. [PubMed] [Google Scholar]
  • 231.Happe F. Autism: cognitive deficit or cognitive style? Trends Cogn Sci. 1999;3(6):216–222. doi: 10.1016/s1364-6613(99)01318-2. [DOI] [PubMed] [Google Scholar]
  • 232.Happe F, Frith U. The weak coherence account: detail-focused cognitive style in autism spectrum disorders. J Autism Dev Disord. 2006;36(1):5–25. doi: 10.1007/s10803-005-0039-0. [DOI] [PubMed] [Google Scholar]
  • 233.Frith U, Happe F. Autism: beyond “theory of mind”. Cognition. 1994;50(1-3):115–32. doi: 10.1016/0010-0277(94)90024-8. [DOI] [PubMed] [Google Scholar]
  • 234.Jolliffe T, Baron-Cohen S. Are people with autism and Asperger syndrome faster than normal on the Embedded Figures Test? J Child Psychol Psychiatry. 1997;38(5):527–34. doi: 10.1111/j.1469-7610.1997.tb01539.x. [DOI] [PubMed] [Google Scholar]
  • 235.Shah A, Frith U. An islet of ability in autistic children: a research note. J Child Psychol Psychiatry. 1983;24(4):613–20. doi: 10.1111/j.1469-7610.1983.tb00137.x. [DOI] [PubMed] [Google Scholar]
  • 236.Mottron L, et al. Locally oriented perception with intact global processing among adolescents with high-functioning autism: evidence from multiple paradigms. J Child Psychol Psychiatry. 2003;44(6):904–13. doi: 10.1111/1469-7610.00174. [DOI] [PubMed] [Google Scholar]
  • 237.Bolte S, et al. Gestalt perception and local-global processing in high-functioning autism. J Autism Dev Disord. 2007;37(8):1493–504. doi: 10.1007/s10803-006-0231-x. [DOI] [PubMed] [Google Scholar]
  • 238.Geschwind DH, Levitt P. Autism spectrum disorders: developmental disconnection syndromes. Curr Opin Neurobiol. 2007;17(1):103–11. doi: 10.1016/j.conb.2007.01.009. [DOI] [PubMed] [Google Scholar]
  • 239.Melillo R, Leisman G. Autistic spectrum disorders as functional disconnection syndrome. Rev Neurosci. 2009;20(2):111–31. doi: 10.1515/revneuro.2009.20.2.111. [DOI] [PubMed] [Google Scholar]
  • 240.Just MA, et al. Cortical activation and synchronization during sentence comprehension in high-functioning autism: evidence of underconnectivity. Brain. 2004;127(Pt 8):1811–21. doi: 10.1093/brain/awh199. [DOI] [PubMed] [Google Scholar]
  • 241.Just MA, et al. Functional and anatomical cortical underconnectivity in autism: evidence from an FMRI study of an executive function task and corpus callosum morphometry. Cerebral Cortex. 2007;17(4):951–961. doi: 10.1093/cercor/bhl006. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 242.Rubenstein JL, Merzenich MM. Model of autism: increased ratio of excitation/inhibition in key neural systems. Genes, brain, and behavior. 2003;2(5):255–67. doi: 10.1034/j.1601-183x.2003.00037.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 243.Perez Velazquez JL, Galan RF. Information gain in the brain's resting state: A new perspective on autism. Front Neuroinform. 2013:37. doi: 10.3389/fninf.2013.00037. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 244.Milne E. Increased intra-participant variability in children with autistic spectrum disorders: evidence from single-trial analysis of evoked EEG. Front Psychol. 2011;2:51. doi: 10.3389/fpsyg.2011.00051. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 245.Jones TB, et al. Sources of group differences in functional connectivity: an investigation applied to autism spectrum disorder. Neuroimage. 2010;49(1):401–14. doi: 10.1016/j.neuroimage.2009.07.051. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 246.Dinstein I, et al. Unreliable evoked responses in autism. Neuron. 2012;75(6):981–91. doi: 10.1016/j.neuron.2012.07.026. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 247.Brock J, et al. The temporal binding deficit hypothesis of autism. Dev Psychopathol. 2002;14(2):209–24. doi: 10.1017/s0954579402002018. [DOI] [PubMed] [Google Scholar]
  • 248.Stevenson RA, et al. The impact of multisensory integration deficits on speech perception in children with autism spectrum disorders. Front Psychol. 2014;5:379. doi: 10.3389/fpsyg.2014.00379. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 249.McGurk H, MacDonald J. Hearing lips and seeing voices. Nature. 1976;264(5588):746–8. doi: 10.1038/264746a0. [DOI] [PubMed] [Google Scholar]
  • 250.de Gelder B, Vroomen J, Van der Heide L. Face recognition and lip-reading in autism. European Journal of Cognitive Psychology. 1991;3(1):69–86. [Google Scholar]
  • 251.Williams J, et al. Visual-auditory integration during speech imitation in autism. Research in Developmental Disabilities. 2004;25(6):559–575. doi: 10.1016/j.ridd.2004.01.008. [DOI] [PubMed] [Google Scholar]
  • 252.Irwin JR, et al. Can children with autism spectrum disorders “hear” a speaking face? Child Development. 2011;82(5):1397–1403. doi: 10.1111/j.1467-8624.2011.01619.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 253.Mongillo E, et al. Audiovisual processing in children with and without autism spectrum disorders. Journal of Autism and Developmental Disorders. 2008;38(7):1349–1358. doi: 10.1007/s10803-007-0521-y. [DOI] [PubMed] [Google Scholar]
  • 254.Taylor N, Isaac C, Milne E. A comparison of the development of audiovisual integration in children with autism spectrum disorders and typically developing children. J Autism Dev Disord. 2010;40(11):1403–11. doi: 10.1007/s10803-010-1000-4. [DOI] [PubMed] [Google Scholar]
  • 255.Stevenson RA, et al. Brief Report: Arrested Development of Audiovisual Speech Perception in Autism Spectrum Disorders. Journal of Autism and Developmental Disorders. In press:1–8. doi: 10.1007/s10803-013-1992-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 256.Bebko JM, Schroeder JH, Weiss JA. The McGurk Effect in Children With Autism and Asperger Syndrome. Autism Research. 2013. doi: 10.1002/aur.1343. [DOI] [PubMed] [Google Scholar]
  • 257.Iarocci G, et al. Visual influences on speech perception in children with autism. Autism. 2010;14(4):305–20. doi: 10.1177/1362361309353615. [DOI] [PubMed] [Google Scholar]
  • 258.Woynaroski TG, et al. Multisensory Speech Perception in Children with Autism Spectrum Disorders. J Autism Dev Disord. 2013. doi: 10.1007/s10803-013-1836-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 259.Mongillo EA, et al. Audiovisual processing in children with and without autism spectrum disorders. J Autism Dev Disord. 2008;38(7):1349–58. doi: 10.1007/s10803-007-0521-y. [DOI] [PubMed] [Google Scholar]
  • 260.Foss-Feig JH, et al. An extended multisensory temporal binding window in autism spectrum disorders. Exp Brain Res. 2010;203(2):381–9. doi: 10.1007/s00221-010-2240-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 261.Bebko JM, et al. Discrimination of temporal synchrony in intermodal events by children with autism and children with developmental disabilities without autism. J Child Psychol Psychiatry. 2006;47(1):88–98. doi: 10.1111/j.1469-7610.2005.01443.x. [DOI] [PubMed] [Google Scholar]
  • 262.Bahrick LE. Intermodal perception and selective attention to intersensory redundancy: Implications for typical social development and autism. Blackwell handbook of infant development. 2010;1:120–165. [Google Scholar]
  • 263.Bahrick LE, Lickliter R. Intersensory redundancy guides attentional selectivity and perceptual learning in infancy. Dev Psychol. 2000;36(2):190–201. doi: 10.1037//0012-1649.36.2.190. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 264.Bahrick LE, Lickliter R. Intersensory redundancy guides early perceptual and cognitive development. Advances in child development and behavior. 2003;30:153–187. doi: 10.1016/s0065-2407(02)80041-6. [DOI] [PubMed] [Google Scholar]
  • 265.Bahrick LE, Todd JT. Multisensory Processing in Autism Spectrum Disorders: Intersensory Processing Disturbance as a Basis for Atypical Development. In: Wallace MT, Murray MM, editors. Frontiers in the Neural Basis of Multisensory Processes. Taylor & Francis Group; London: 2011. [Google Scholar]
  • 266.Stevenson RA, et al. Multisensory temporal integration in autism spectrum disorders. J Neurosci. 2014;34(3):691–7. doi: 10.1523/JNEUROSCI.3615-13.2014. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 267.Woynaroski TG, et al. Multisensory Speech Perception in High-Functioning Children with Autism Spectrum Disorders. Journal of Autism and Developmental Disorders. doi: 10.1007/s10803-013-1836-5. In Press. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 268.Baron-Cohen S, et al. The autism-spectrum quotient (AQ): Evidence from asperger syndrome/high-functioning autism, males and females, scientists and mathematicians. Journal of Autism and Developmental Disorders. 2001;31(1):5–17. doi: 10.1023/a:1005653411471. [DOI] [PubMed] [Google Scholar]
  • 269.Hurley RS, et al. The broad autism phenotype questionnaire. J Autism Dev Disord. 2007;37(9):1679–90. doi: 10.1007/s10803-006-0299-3. [DOI] [PubMed] [Google Scholar]
  • 270.Donohue SE, Darling EF, Mitroff SR. Links between multisensory processing and autism. Experimental Brain Research. 2012;222(4):377–387. doi: 10.1007/s00221-012-3223-4. [DOI] [PubMed] [Google Scholar]
  • 271.Levitt JG, et al. Cortical sulcal maps in autism. Cereb Cortex. 2003;13(7):728–35. doi: 10.1093/cercor/13.7.728. [DOI] [PubMed] [Google Scholar]
  • 272.Boddaert N, et al. Superior temporal sulcus anatomical abnormalities in childhood autism: a voxel-based morphometry MRI study. Neuroimage. 2004;23(1):364–9. doi: 10.1016/j.neuroimage.2004.06.016. [DOI] [PubMed] [Google Scholar]
  • 273.Boddaert N, et al. MRI findings in 77 children with non-syndromic autistic disorder. PLoS One. 2009;4(2):e4415. doi: 10.1371/journal.pone.0004415. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 274.Zilbovicius M, et al. Temporal lobe dysfunction in childhood autism: a PET study. Positron emission tomography. Am J Psychiatry. 2000;157(12):1988–93. doi: 10.1176/appi.ajp.157.12.1988. [DOI] [PubMed] [Google Scholar]
  • 275.Zilbovicius M, Meresse I, Boddaert N. [Autism: neuroimaging] Rev Bras Psiquiatr. 2006;28(Suppl 1):S21–8. doi: 10.1590/s1516-44462006000500004. [DOI] [PubMed] [Google Scholar]
  • 276.Lee JE, et al. Diffusion tensor imaging of white matter in the superior temporal gyrus and temporal stem in autism. Neurosci Lett. 2007;424(2):127–132. doi: 10.1016/j.neulet.2007.07.042. [DOI] [PubMed] [Google Scholar]
  • 277.Minshew NJ, Williams DL. The new neurobiology of autism: cortex, connectivity, and neuronal organization. Arch Neurol. 2007;64(7):945. doi: 10.1001/archneur.64.7.945. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 278.Barnea-Goraly N, et al. White matter structure in autism: preliminary evidence from diffusion tensor imaging. Biol Psychiatry. 2004;55(3):323–326. doi: 10.1016/j.biopsych.2003.10.022. [DOI] [PubMed] [Google Scholar]
  • 279.Keller TA, Kana RK, Just MA. A developmental study of the structural integrity of white matter in autism. Neuroreport. 2007;18:23–27. doi: 10.1097/01.wnr.0000239965.21685.99. [DOI] [PubMed] [Google Scholar]
  • 280.Ke X, et al. White matter impairments in autism, evidence from voxel-based morphometry and diffusion tensor imaging. Brain Res. 2009;1265:171–177. doi: 10.1016/j.brainres.2009.02.013. [DOI] [PubMed] [Google Scholar]
  • 281.Amaral DG, Schumann CM, Nordahl CW. Neuroanatomy of autism. Trends Neurosci. 2008;31(3):137–145. doi: 10.1016/j.tins.2007.12.005. [DOI] [PubMed] [Google Scholar]
  • 282.Peterson RL, Pennington BF. Developmental dyslexia. Lancet. 2012;379(9830):1997–2007. doi: 10.1016/S0140-6736(12)60198-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 283.Shaywitz SE, Shaywitz BA. Paying attention to reading: the neurobiology of reading and dyslexia. Dev Psychopathol. 2008;20(4):1329–49. doi: 10.1017/S0954579408000631. [DOI] [PubMed] [Google Scholar]
  • 284.Shaywitz SE, Shaywitz BA. Dyslexia (specific reading disability). Biol Psychiatry. 2005;57(11):1301–9. doi: 10.1016/j.biopsych.2005.01.043. [DOI] [PubMed] [Google Scholar]
  • 285.Ramus F. Developmental dyslexia: specific phonological deficit or general sensorimotor dysfunction? Curr Opin Neurobiol. 2003;13(2):212–8. doi: 10.1016/s0959-4388(03)00035-7. [DOI] [PubMed] [Google Scholar]
  • 286.Ramus F, et al. Theories of developmental dyslexia: insights from a multiple case study of dyslexic adults. Brain. 2003;126(Pt 4):841–65. doi: 10.1093/brain/awg076. [DOI] [PubMed] [Google Scholar]
  • 287.Livingstone MS, et al. Physiological and anatomical evidence for a magnocellular defect in developmental dyslexia. Proc Natl Acad Sci USA. 1991;88(18):7943–7. doi: 10.1073/pnas.88.18.7943. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 288.Stein J. The magnocellular theory of developmental dyslexia. Dyslexia. 2001;7(1):12–36. doi: 10.1002/dys.186. [DOI] [PubMed] [Google Scholar]
  • 289.Eden GF, et al. Abnormal processing of visual motion in dyslexia revealed by functional brain imaging. Nature. 1996;382(6586):66–9. doi: 10.1038/382066a0. [DOI] [PubMed] [Google Scholar]
  • 290.Henry MK. Structured, sequential, multisensory teaching: the Orton legacy. Annals of Dyslexia. 1998;48:3–26. [Google Scholar]
  • 291.Oakland T, et al. An evaluation of the dyslexia training program: a multisensory method for promoting reading in students with reading disabilities. J Learn Disabil. 1998;31(2):140–7. doi: 10.1177/002221949803100204. [published erratum appears in J Learn Disabil 1998 Jul-Aug;31(4):385].
  • 292.Birch HG, Belmont L. Auditory-Visual Integration in Normal and Retarded Readers. Am J Orthopsychiatry. 1964;34:852–61. doi: 10.1111/j.1939-0025.1964.tb02240.x. [DOI] [PubMed] [Google Scholar]
  • 293.Muehl S, Kremenak S. Ability to match information within and between auditory and visual sense modalities and subsequent reading achievement. Journal of Educational Psychology. 1966;57:230–238. [Google Scholar]
  • 294.Morein-Zamir S, Soto-Faraco S, Kingstone A. Auditory capture of vision: examining temporal ventriloquism. Brain Res Cogn Brain Res. 2003;17(1):154–63. doi: 10.1016/s0926-6410(03)00089-2. [DOI] [PubMed] [Google Scholar]
  • 295.Froyen D, et al. Cross-modal enhancement of the MMN to speech-sounds indicates early and automatic integration of letters and speech-sounds. Neurosci Lett. 2008;430(1):23–8. doi: 10.1016/j.neulet.2007.10.014. [DOI] [PubMed] [Google Scholar]
  • 296.Froyen DJ, et al. The long road to automation: neurocognitive development of letter-speech sound processing. J Cogn Neurosci. 2009;21(3):567–80. doi: 10.1162/jocn.2009.21061. [DOI] [PubMed] [Google Scholar]
  • 297.Froyen D, Willems G, Blomert L. Evidence for a specific cross-modal association deficit in dyslexia: an electrophysiological study of letter-speech sound processing. Dev Sci. 2011;14(4):635–48. doi: 10.1111/j.1467-7687.2010.01007.x. [DOI] [PubMed] [Google Scholar]
  • 298.Facoetti A, et al. Multisensory spatial attention deficits are predictive of phonological decoding skills in developmental dyslexia. J Cogn Neurosci. 2010;22(5):1011–25. doi: 10.1162/jocn.2009.21232. [DOI] [PubMed] [Google Scholar]
  • 299.Blau V, et al. Reduced neural integration of letters and speech sounds links phonological and reading deficits in adult dyslexia. Current biology : CB. 2009;19(6):503–8. doi: 10.1016/j.cub.2009.01.065. [DOI] [PubMed] [Google Scholar]
  • 300.Richlan F, Kronbichler M, Wimmer H. Structural abnormalities in the dyslexic brain: a meta-analysis of voxel-based morphometry studies. Hum Brain Mapp. 2013;34(11):3055–65. doi: 10.1002/hbm.22127. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 301.Blau V, et al. Deviant processing of letters and speech sounds as proximate cause of reading failure: a functional magnetic resonance imaging study of dyslexic children. Brain. 2010;133(Pt 3):868–79. doi: 10.1093/brain/awp308. [DOI] [PubMed] [Google Scholar]
  • 302.Maisog JM, et al. A meta-analysis of functional neuroimaging studies of dyslexia. Ann N Y Acad Sci. 2008;1145:237–59. doi: 10.1196/annals.1416.024. [DOI] [PubMed] [Google Scholar]
  • 303.Steinbrink C, et al. The contribution of white and gray matter differences to developmental dyslexia: insights from DTI and VBM at 3.0 T. Neuropsychologia. 2008;46(13):3170–8. doi: 10.1016/j.neuropsychologia.2008.07.015. [DOI] [PubMed] [Google Scholar]
  • 304.Rimrodt SL, et al. Functional MRI of sentence comprehension in children with dyslexia: beyond word recognition. Cereb Cortex. 2009;19(2):402–13. doi: 10.1093/cercor/bhn092. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 305.Wallace MT. Dyslexia: bridging the gap between hearing and reading. Curr Biol. 2009;19(6):R260–2. doi: 10.1016/j.cub.2009.01.025. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 306.Javitt DC. Sensory processing in schizophrenia: neither simple nor intact. Schizophr Bull. 2009;35(6):1059–64. doi: 10.1093/schbul/sbp110. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 307.Brenner CA, et al. Steady state responses: electrophysiological assessment of sensory function in schizophrenia. Schizophr Bull. 2009;35(6):1065–77. doi: 10.1093/schbul/sbp091. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 308.Freedman R, et al. The genetics of sensory gating deficits in schizophrenia. Curr Psychiatry Rep. 2003;5(2):155–61. doi: 10.1007/s11920-003-0032-2. [DOI] [PubMed] [Google Scholar]
  • 309.Carvill S. Sensory impairments, intellectual disability and psychiatry. J Intellect Disabil Res. 2001;45(Pt 6):467–83. doi: 10.1046/j.1365-2788.2001.00366.x. [DOI] [PubMed] [Google Scholar]
  • 310.Behrendt RP. Dysregulation of thalamic sensory “transmission” in schizophrenia: neurochemical vulnerability to hallucinations. J Psychopharmacol. 2006;20(3):356–72. doi: 10.1177/0269881105057696. [DOI] [PubMed] [Google Scholar]
  • 311.Behrendt RP, Young C. Hallucinations in schizophrenia, sensory impairment, and brain disease: a unifying model. Behav Brain Sci. 2004;27(6):771–87. doi: 10.1017/s0140525x04000184. discussion 787-830. [DOI] [PubMed] [Google Scholar]
  • 312.Hughes G, Desantis A, Waszak F. Mechanisms of intentional binding and sensory attenuation: the role of temporal prediction, temporal control, identity prediction, and motor prediction. Psychol Bull. 2013;139(1):133–51. doi: 10.1037/a0028566. [DOI] [PubMed] [Google Scholar]
  • 313.Maurage P, Campanella S. Experimental and clinical usefulness of crossmodal paradigms in psychiatry: an illustration from emotional processing in alcohol-dependence. Front Hum Neurosci. 2013;7:394. doi: 10.3389/fnhum.2013.00394. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 314.Szycik GR, et al. Audiovisual integration of speech is disturbed in schizophrenia: an fMRI study. Schizophr Res. 2009;110(1-3):111–8. doi: 10.1016/j.schres.2009.03.003. [DOI] [PubMed] [Google Scholar]
  • 315.Pearl D, et al. Differences in audiovisual integration, as measured by McGurk phenomenon, among adult and adolescent patients with schizophrenia and age-matched healthy control groups. Compr Psychiatry. 2009;50(2):186–92. doi: 10.1016/j.comppsych.2008.06.004. [DOI] [PubMed] [Google Scholar]
  • 316.de Jong JJ, et al. Audiovisual emotion recognition in schizophrenia: reduced integration of facial and vocal affect. Schizophr Res. 2009;107(2-3):286–93. doi: 10.1016/j.schres.2008.10.001. [DOI] [PubMed] [Google Scholar]
  • 317.Ross LA, et al. Impaired multisensory processing in schizophrenia: deficits in the visual enhancement of speech comprehension under noisy environmental conditions. Schizophr Res. 2007;97(1-3):173–83. doi: 10.1016/j.schres.2007.08.008. [DOI] [PubMed] [Google Scholar]
  • 318.de Gelder B, et al. Multisensory integration of emotional faces and voices in schizophrenics. Schizophr Res. 2005;72(2-3):195–203. doi: 10.1016/j.schres.2004.02.013. [DOI] [PubMed] [Google Scholar]
  • 319.Williams LE, et al. Reduced multisensory integration in patients with schizophrenia on a target detection task. Neuropsychologia. 2010;48(10):3128–36. doi: 10.1016/j.neuropsychologia.2010.06.028. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 320.Stekelenburg JJ, et al. Deficient multisensory integration in schizophrenia: an event-related potential study. Schizophr Res. 2013;147(2-3):253–61. doi: 10.1016/j.schres.2013.04.038. [DOI] [PubMed] [Google Scholar]
  • 321.Carroll CA, et al. Temporal processing dysfunction in schizophrenia. Brain Cogn. 2008;67(2):150–61. doi: 10.1016/j.bandc.2007.12.005. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 322.Martin B, et al. Temporal event structure and timing in schizophrenia: preserved binding in a longer “now”. Neuropsychologia. 2013;51(2):358–71. doi: 10.1016/j.neuropsychologia.2012.07.002. [DOI] [PubMed] [Google Scholar]
  • 323.Shin YW, et al. Increased temporal variability of auditory event-related potentials in schizophrenia and Schizotypal Personality Disorder. Schizophr Res. 2010;124(1-3):110–8. doi: 10.1016/j.schres.2010.08.008. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 324.Foucher JR, et al. Low time resolution in schizophrenia: Lengthened windows of simultaneity for visual, auditory and bimodal stimuli. Schizophr Res. 2007;97(1-3):118–27. doi: 10.1016/j.schres.2007.08.013. [DOI] [PubMed] [Google Scholar]