Abstract
Voluntary orienting of visual spatial attention has been shown to be associated with activation in a distributed network of frontal and parietal brain areas. Neuropsychological data suggest that at least some of these areas should be sensitive to the direction in which attention is shifted. The aim of this study was to use rapid event‐related functional magnetic resonance imaging to investigate whether spatial attention in the auditory modality is subserved by the same or different brain areas as in the visual modality, and whether the auditory and visual attention networks show any degree of hemispheric lateralisation or sensitivity to the direction of attention shifts. The results suggest that auditory and visual spatial attention shifts are controlled by a supramodal network of frontal, parietal and temporal areas. Areas activated included the precuneus and superior parietal cortex, the inferior parietal cortex and temporo‐parietal junction, as well as the premotor and supplementary motor areas and dorsolateral prefrontal cortex (DLPFC). In the auditory task, some of these areas, in particular the precuneus as well as the inferior parietal cortex and temporo‐parietal junction, showed ‘relative’ asymmetry, in that they responded more strongly to attention shifts towards the contralateral than the ipsilateral hemispace. Some areas, such as the right superior parietal cortex and left DLPFC, showed ‘absolute’ asymmetry, in that they responded more strongly in one than in the other hemisphere.
Keywords: auditory system, attention: auditory, attention: visual, functional MRI, hemispheric specialization
INTRODUCTION
Space is the most important cue in guiding selective attention in the visual modality. In contrast, the role of space in auditory selective attention is less straightforward. While spatial attention yields reliable performance improvements in visual tasks, e.g., when detecting a target at a cued compared to an uncued location [Fan et al., 2002; Posner, 1980], spatial‐orienting benefits in auditory tasks have proven much harder to elicit, apparently depending on both task demands and cueing procedure [Hugdahl and Nordby, 1994; McDonald and Ward, 1999; Roberts et al., 2006; Spence and Driver, 1994]. However, the difference in the effects of spatial orienting in the auditory and visual modalities may be related to differences in the sensory representations of spatial and non‐spatial information rather than in the mechanisms of executive attentional control. In the visual system, space is the basic organising feature, which is mapped on the retina and upon which the processing of all other features (e.g., colour, shape or motion) is contingent. In contrast, the basic organising feature of the auditory system is sound frequency. The perception of sound location is based on specific binaural (interaural time and level differences) and monaural (spectral) cues, the processing of which appears to occur in parallel to the processing of non‐spatial features.
Neuropsychological data indicate that spatial processing and the control of visual spatial attention are distributed asymmetrically across the two cerebral hemispheres. Unilateral lesions in either hemisphere can lead to acute deficits in orienting attention towards the contralesional hemispace, a condition referred to as hemispatial neglect [for review, see, e.g., Halligan et al., 2003; Mesulam, 1999]. However, deficits tend to occur more frequently and be more enduring following lesions to the right than to the left hemisphere [Heilman et al., 2003]. These results suggest a contralateral distribution of visuospatial attentional control with a superimposed right‐hemisphere bias [Mesulam, 1981, 1999]. Results from studies that have applied transcranial magnetic stimulation (TMS) to the parietal [Walsh et al., 1999] or frontal cortex [Grosbras and Paus, 2002, 2003] seem to be broadly consistent with this hypothesis. In contrast, imaging and electrophysiological studies provide a more inconsistent picture. Event‐related potential (ERP) studies have shown that voluntary (endogenous) shifting of attention to a cued location in the left or right visual space elicits neuroelectric responses at posterior electrode sites contralateral to the direction of the attention shift [Harter et al., 1989; Talsma et al., 2005; Yamaguchi et al., 1994]. Source analysis suggests that these responses arise from the posterior parietal cortex [van der Lubbe et al., 2006]. Some studies have also observed contralateral attention‐shift responses at frontal electrode sites [Eimer et al., 2002; Nobre et al., 2000; Talsma et al., 2005]. In contrast, the blood oxygen level‐dependent (BOLD) response associated with spatial orienting of visual attention, measured with functional magnetic resonance imaging (fMRI), was mostly found to be bilateral, regardless of which hemispace attention was oriented towards [e.g., Corbetta et al., 2000; Hopfinger et al., 2000; Pollmann and Morrillo, 2003; Thiel et al., 2004; Vandenberghe et al., 1997, 2001]. Some studies have observed hemispheric asymmetries in the activations produced by visual attentional orienting. However, these asymmetries were mostly incongruous with the asymmetry suggested by the lesion studies [e.g., Vandenberghe et al., 2000]. In a positron emission tomography (PET) study, Nobre et al. [1997] observed asymmetric activation patterns in the posterior parietal cortex and superior temporal sulcus that were consistent with the asymmetry of neglect, but failed to verify the statistical significance of these effects. While auditory neglect has been more difficult to define than visual neglect [for review, see Marshall, 2001], spatial processing in the auditory domain has consistently exhibited a neglect‐like pattern of right‐biased hemispheric asymmetry. Imaging and electrophysiological data converge in showing that sounds lateralised to the left auditory hemispace produce a predominantly contralateral response, whereas the response to right‐lateralised sounds is more bilateral [Deouell et al., 1998; Hine and Debener, 2007; Kaiser et al., 2000; Krumbholz et al., 2005, 2007; Petit et al., 2007; Schönwiesner et al., 2007].
The aim of this study was to investigate whether the control of spatial attentional orienting in the auditory modality involves the same or different brain areas as in the visual modality, and whether the auditory and visual attention networks show any degree of hemispheric asymmetry or sensitivity to the direction of attention shifts. To this end, brain activity associated with voluntary shifts of attention between two stimulus streams, presented either in the auditory or visual modality and located in opposite hemispaces, was measured using rapid event‐related fMRI. Under the assumption that spatial attention is controlled by a supramodal system, the attention shifts would be expected to activate the same areas, irrespective of stimulus modality. Based on previous ERP results, at least some of these areas would be expected to respond more strongly to attention shifts towards the contralateral than the ipsilateral hemispace. The hemispheric asymmetry of neglect would suggest that the degree of this contralaterality may be larger in the left than in the right hemisphere.
METHODS
Brain activity associated with spatial shifts of auditory or visual attention was measured using a modified version of the rapid serial presentation task devised by Yantis et al. [2002; see Reeves and Sperling 1986, for the original perceptual version of the task] and its auditory equivalent [Shomstein and Yantis, 2006]. A serial task was deemed preferable to a trial‐based cued target detection task, as used in most previous imaging studies of visual spatial attention, because of the unreliability of cue validity effects in auditory versions of such tasks [see, e.g., Roberts et al., 2006, and the earlier studies cited in the Introduction]. Using a discrimination rather than a detection task, as suggested by Spence and Driver [1994] and McDonald and Ward [1999], to circumvent the difficulty of eliciting auditory cue validity effects would have entailed the risk of confounding attentional processing with processing associated with target discrimination, and was thus not an option for the current experiment.
The experiment consisted of a mixed blocked and event‐related (hybrid) design. Within each block, two continuous streams of either auditory or visual stimuli were presented, one in each hemispace. The participant had to attend to one of the streams and detect infrequent target and cue events in the attended stream while maintaining fixation at the centre of the visual field. The cue events signalled the participant to shift attention to the unattended stream in the opposite hemispace.
Stimuli and Task
Stimuli were presented once per second (Fig. 1A) and consisted of three vowels (Fig. 1B). Each hemispace contained one stimulus stream. Most of the stimuli were so‐called null events (‘N’ in Fig. 1A), which were task‐irrelevant and consisted of a randomly chosen combination of three different vowels from the set a, e, i and o (second row in Fig. 1B). In both visual and auditory blocks, every fourth stimulus on the attended side was equally likely (a one‐in‐three chance each) to be a target ('T'), a cue ('C') or another null event. Targets (first row in Fig. 1B) consisted of two o's and one of the vowels a, e or i, chosen randomly, whereas cues (third row in Fig. 1B) consisted of three a's. In the visual blocks, the vowels were presented simultaneously in a triangular arrangement to the left or right of the fixation cross (Fig. 1B, left column). The triangles consisted of one medial and two stacked lateral letters. The two o's in the visual targets always appeared at the medial and bottom lateral locations within the triangle. Stimulus duration was 750 ms. In the auditory blocks, the vowels were presented sequentially and were perceived as coming from external, fully left‐ or right‐lateralised sources (Fig. 1B, right column). They had a duration of 100 ms and were presented with 150‐ms silent gaps between them. The two o's in the auditory targets were always presented in the second and third temporal positions. The stimuli on the two sides were interleaved by delaying the stimulus on one side by 125 ms relative to the stimulus on the other side. The side containing the delayed stimulus was chosen randomly for each participant.
Figure 1.

Schematic representation of stimuli (not to scale): (A) General sequential structure of stimuli in left and right hemispaces. N: null event; T: target; C: cue. (B) Visual (left) and auditory (right) examples of the three stimulus pairs marked 1, 2 and 3 in panel A. In both panels, attended stimuli are shown in black and unattended stimuli are shown in grey. In the example shown, attention is first directed to the stimulus on the left side and then shifts to the right side after the cue event (see panel A). The task‐relevant stimuli (targets and cues) are underlined.
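The event logic illustrated in Figure 1A lends itself to a compact description. The following Python sketch is illustrative only (the original stimuli were programmed in Presentation and Matlab); the function and variable names are ours, and the code simply implements the probabilities and the cue‐triggered side switch described above.

```python
import random

def make_block_sequence(n_stimuli=60, start_side="left", seed=None):
    """Generate one 60-s block of per-second events for the attended stream.

    Every fourth stimulus on the attended side is drawn with equal
    (one-in-three) probability as a target ('T'), a cue ('C') or a null
    event ('N'); all other stimuli are null events.  A cue shifts
    attention to the stream in the opposite hemispace.
    """
    rng = random.Random(seed)
    attended = start_side
    events = []                      # (stimulus index, attended side, event type)
    for i in range(n_stimuli):
        if (i + 1) % 4 == 0:         # every fourth stimulus is potentially task-relevant
            event = rng.choice(["T", "C", "N"])
        else:
            event = "N"
        events.append((i, attended, event))
        if event == "C":             # cue: attention moves to the other side
            attended = "right" if attended == "left" else "left"
    return events

# e.g., print the first 12 events of a block starting on the left
for idx, side, ev in make_block_sequence(seed=1)[:12]:
    print(idx, side, ev)
```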
The vowels were presented sequentially in the auditory condition, because source location is at best a weak cue for the perceptual segregation of concurrent sounds [Culling and Summerfield, 1995; Darwin and Hukin, 1999, 2000]; presenting the vowels concurrently would thus have rendered them largely unintelligible. Informal testing revealed that presenting the visual stimuli sequentially like the auditory ones would have made it difficult to memorise them, highlighting an important difference in the role of time in auditory versus visual perception. In fact, it has been hypothesised that sequences of unrelated visually presented verbal items first have to be converted to phonological traces in order to be memorised [see Baddeley, 2003, for review]; in the current experiment, the rate of presentation of the vowels would have been too fast for that conversion to be performed efficiently.
Participants were asked to fixate a cross at the centre of the visual field, which was presented during both visual and auditory blocks. The task was to attend to one of the two streams of stimuli (highlighted in black in Fig. 1A) and respond to targets appearing in the attended stream with a button press (Lumitouch, Photon Control, Burnaby, B. C., Canada) using the right or left index finger; the response hand was counterbalanced across participants. The cues instructed participants to shift attention to the unattended stream while maintaining fixation. Once shifted, attention then remained focussed on the newly attended stream until the next cue event occurred in that stream. Note that, in the current paradigm, the cues had a different meaning than in trial‐based paradigms (e.g., cued detection tasks); in the current paradigm, the cues signalled the participant to shift attention rather than to expect the occurrence of a target [see also Yantis et al., 2002].
Fixation was monitored with an MR‐compatible infrared eye tracker system (Applied Science Laboratories, model 540, Bedford, MA) and analysed using the ILAB software [Gitelman, 2002]. For technical reasons, eye position could not be reliably recorded in 7 out of 22 participants. Analysis of the remaining eye data revealed that, on average, participants fixated (eyes within 25% of the distance between the fixation cross and the visual stimuli, or 2.4° visual angle) for 97% of the 60‐s task blocks. There was no significant difference in fixation between the visual and auditory task blocks, and there was no correlation between eye movements and the occurrence of the cue stimuli.
All visual stimuli including the fixation cross were presented in white on a black background using an equi‐spaced font ('Courier New'), and subtended a visual angle of 1.73° at a viewing distance of 29 cm. They were back‐projected onto a screen in front of the participant and viewed through a mirror system mounted on top of the head coil. The letter triangles subtended 3.45° of visual angle in both width and height (to the centres of the letters) and their centres were located 9.8° to the left or right of the fixation cross. The presentation of the visual stimuli and recording of behavioural responses were controlled with Presentation (Neurobehavioral Systems, Albany, CA).
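For completeness, the physical dimensions of the display elements can be recovered from the stated visual angles and viewing distance; the following conversion is a derived check rather than part of the original methods.

```latex
% size s subtending visual angle \theta at viewing distance d = 29\,\mathrm{cm}:
s = 2d\tan(\theta/2)
\;\Rightarrow\;
s_{\mathrm{letter}} \approx 2(29\,\mathrm{cm})\tan(0.865^{\circ}) \approx 0.88\,\mathrm{cm},\quad
s_{\mathrm{triangle}} \approx 1.75\,\mathrm{cm},\quad
x_{\mathrm{centre}} \approx (29\,\mathrm{cm})\tan(9.8^{\circ}) \approx 5.0\,\mathrm{cm}.
```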
Auditory stimuli were generated digitally with a sampling rate of 12 kHz and a 24‐bit amplitude resolution using Matlab (The Mathworks, Natick, MA) and TDT System 3 (Tucker Davis Technologies, Alachua, FL) and delivered through high‐end electrostatic transducers (Koss, Milwaukee, WI), mounted into professional, metal‐free ear defenders (Bilsom) to shield the participant from the scanner noise. Vowels were synthesized by multiplying sinusoids at the first three formant frequencies with a periodic envelope mimicking the stream of glottal pulses, weighting them according to a sloping spectral profile of −6 dB per octave, and then summing them [see, e.g., Uppenkamp et al., 2006]. The shape of each glottal pulse was approximated by a γ‐function with a sharp attack and an exponential decay with a half‐life of 2.5 ms. In order to produce a realistic, externalised spatial perception, the stimuli were lateralised towards the left or right ear by filtering them with generic head‐related transfer functions (HRTFs) for azimuths of 90 and 270°. The HRTFs were obtained from the CIPIC database for the KEMAR head with large pinnae (http://interface.cipic.ucdavis.edu) and implemented as 54th‐order finite impulse response filters. In order to facilitate perceptual segregation of the lateralised sound streams, the vowels in the two streams were generated to be perceived as being produced by different male speakers, one with a slightly lower voice than the other. The glottal‐pulse rate of one speaker was set to 83 Hz and that of the other to 125 Hz. The side containing the lower voice was chosen randomly for each participant. The difference in voice quality produced by the pitch difference was relatively subtle, because the pitch difference was relatively small and was not accompanied by a commensurate change in the formant frequencies [Smith and Patterson, 2005]. Thus, while the pitch difference is likely to have had a facilitatory effect on sound‐source segregation, it is unlikely that participants used the pitch difference—rather than stimulus location—to guide attention. The finding that the response to the auditory attention shifts showed sensitivity to the direction of the attention shifts corroborates this notion (see Results).
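The vowel synthesis and lateralisation can be summarised in a few lines of code. The sketch below is a simplified Python/NumPy illustration of the recipe described above, not the actual Matlab/TDT implementation; the formant values and function names are ours, and the head‐related impulse responses (hrir_left, hrir_right) would have to be taken from the CIPIC database.

```python
import numpy as np

FS = 12000  # sampling rate used in the study (Hz)

def glottal_envelope(duration_s, pulse_rate_hz, half_life_s=0.0025, fs=FS):
    """Periodic train of gamma-like pulses: sharp attack, exponential decay
    with a 2.5-ms half-life, repeating at the glottal-pulse rate."""
    t = np.arange(int(duration_s * fs)) / fs
    t_in_period = np.mod(t, 1.0 / pulse_rate_hz)       # time since last pulse
    decay = np.log(2) / half_life_s
    env = t_in_period * np.exp(-decay * t_in_period)   # gamma-shaped pulse
    return env / env.max()

def synth_vowel(formants_hz, duration_s=0.1, pulse_rate_hz=125, fs=FS):
    """Sum of sinusoids at the first three formant frequencies, each multiplied
    by the glottal envelope and weighted with a -6 dB/octave spectral slope."""
    t = np.arange(int(duration_s * fs)) / fs
    env = glottal_envelope(duration_s, pulse_rate_hz, fs=fs)
    vowel = np.zeros_like(t)
    for f in formants_hz:
        weight = formants_hz[0] / f                    # amplitude ~ 1/f, i.e. -6 dB per octave
        vowel += weight * env * np.sin(2 * np.pi * f * t)
    return vowel / np.max(np.abs(vowel))

def lateralise(signal, hrir_left, hrir_right):
    """Convolve a mono signal with a pair of head-related impulse responses
    (e.g., 54-tap FIRs derived from the CIPIC KEMAR measurements)."""
    left = np.convolve(signal, hrir_left)[: len(signal)]
    right = np.convolve(signal, hrir_right)[: len(signal)]
    return np.stack([left, right], axis=1)

# An /a/-like vowel from the lower-pitched (83-Hz) speaker; the formant
# values are textbook approximations, not those used in the study.
vowel_a = synth_vowel([730, 1090, 2440], pulse_rate_hz=83)
```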
Experimental Protocol and fMRI Data Acquisition
The experiment consisted of a total of 10 auditory and 10 visual blocks, which were alternated. The starting modality was counterbalanced across participants. Each block lasted 70 s and started with a 10‐s period, during which the participant was given written instruction about the upcoming stimulus modality and the side that was to be attended first. The instruction also contained a reminder of the characteristics of the target and cue stimuli. To make sure that participants were attending to the correct side, a small dot was presented next to the fixation cross on the to‐be‐attended side during the first four stimuli of each block (for 4 s) and then disappeared.
Participants were asked to complete a dry run of the experiment one or two days before the actual experiment for practice. The dry run was carried out in a sound‐attenuated chamber (IAC GmbH, Niederkrüchten, Germany), using AKG K240 DF headphones (Vienna, Austria) for auditory stimulation and an ordinary 17″ CRT monitor for visual stimulation, but was otherwise identical to the actual experiment.
A total of 480 BOLD contrast images were acquired for each participant using a 1.5‐T whole‐body scanner (Sonata, Siemens Medical Solutions, Erlangen, Germany) and gradient echo planar imaging (TR = 3 s, TE = 66 ms). Functional images comprised 25 ascending 4‐mm slices with an in‐plane resolution of 3.125 × 3.125 mm² and a 0.4‐mm inter‐slice gap. Functional images were acquired continuously. A high‐resolution structural image (3D MP‐RAGE) was acquired at the end of each measurement.
Data Analysis
Structural and functional images were analysed using SPM2 (http://www.fil.ion.ucl.ac.uk/spm). The first four functional images were discarded to allow the magnetisation to reach a steady state. The remaining images were realigned, slice‐time corrected, co‐registered with the structural images and normalised to a symmetrical version of the standard SPM MNI template. A symmetrical rather than the standard template was used, because one analysis involved comparing contrast images across hemispheres by contrasting the original images with left–right flipped versions of the same images. Unless a symmetrical template is used, any differences between the flipped and unflipped images might conceivably be due to inter‐hemispheric differences in the normalisation process. A symmetrical template was created by averaging the standard template with a flipped version of the template. Finally, the functional images were spatially smoothed using a Gaussian kernel with 10‐mm full width at half maximum.
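As an illustration of the template symmetrisation step (the actual processing was done in SPM2), a left-right symmetrical template can be created along the following lines; the file names are placeholders, and the first voxel axis is assumed to run left-right with the volume centred on the mid-sagittal plane.

```python
# Minimal sketch (not the SPM2 implementation) of building a left-right
# symmetrical template by averaging a template with its mirror image.
import nibabel as nib
import numpy as np

template = nib.load("T1_template.nii")
data = template.get_fdata()
flipped = np.flip(data, axis=0)            # mirror about the mid-sagittal plane
symmetric = 0.5 * (data + flipped)         # average original and mirrored template

nib.save(nib.Nifti1Image(symmetric, template.affine, template.header),
         "T1_template_symmetric.nii")
```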
The data of each participant were modelled with a general linear model containing regressors for the instruction periods, for sensory‐driven effects during the auditory or visual stimulation periods (auditory and visual block regressors) and for four different types of events (left and right target and cue events) for each modality (auditory or visual). Regressors were generated by convolving a boxcar function with the appropriate duration (10 s for the instructions, 60 s for the auditory and visual block regressors and 1 s for the event regressors) with the canonical haemodynamic response function [Friston et al., 1998]. The data were highpass‐filtered (256 s) to remove low‐frequency noise, and serial correlations were accounted for.
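To make the regressor construction concrete, the sketch below shows how a boxcar convolved with a canonical (double-gamma) haemodynamic response can be built in Python; the HRF parameters and onset times are illustrative defaults rather than the exact SPM2 implementation.

```python
import numpy as np
from scipy.stats import gamma

def canonical_hrf(tr, duration_s=32.0):
    """Double-gamma approximation of the canonical haemodynamic response
    (peak near 5 s, undershoot near 15 s); parameters are common defaults,
    not copied from SPM2."""
    t = np.arange(0.0, duration_s, tr)
    hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0
    return hrf / hrf.sum()

def make_regressor(onsets_s, event_duration_s, n_scans, tr=3.0):
    """Boxcar of the given event duration convolved with the canonical HRF
    and sampled once per TR (a simplification of SPM's super-sampling)."""
    boxcar = np.zeros(n_scans)
    for onset in onsets_s:
        start = int(round(onset / tr))
        stop = int(round((onset + event_duration_s) / tr)) + 1
        boxcar[start:min(stop, n_scans)] = 1.0
    return np.convolve(boxcar, canonical_hrf(tr))[:n_scans]

# e.g., a cue-event regressor (1-s events at illustrative onsets) for the
# 476 scans remaining after discarding the first four images
cue_regressor = make_regressor([45.0, 180.0, 322.0], 1.0, n_scans=476)
```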
Contrast images for each participant were submitted to voxel‐wise one‐sample t‐tests (one‐tailed). Unless stated otherwise, the resulting random‐effects t‐maps were thresholded at a voxel level of P = 0.001 (t ≥ 3.53) and a cluster level of P = 0.05 (corrected). As the activation to the visual cues was overall weaker than that to the auditory cues, but generally comprised the same or similar areas (see Results), the activations to the auditory and visual cues were thresholded based on their OR conjunction (voxel threshold P = 0.001, corrected cluster threshold P = 0.05). The OR conjunction of two contrasts is defined by the maximum of their t‐maps, rather than their minimum as in the more commonly used AND conjunction [Nichols et al., 2005]. The OR conjunction between the auditory and visual cue regressors (thresholded at a voxel level of P = 0.001 and a cluster level of P = 0.05) was also used for region‐of‐interest analyses of the effect of attention shift direction (leftward or rightward) and of hemispheric asymmetries (see Results).
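The OR conjunction itself amounts to a voxel-wise maximum of the two t-maps, as in the following minimal sketch; the cluster-level correction that was applied in the actual analysis is omitted.

```python
import numpy as np

def or_conjunction(t_map_a, t_map_b, t_thresh=3.53):
    """OR conjunction of two t-maps: the voxel-wise maximum (an AND
    conjunction would use the minimum; Nichols et al., 2005), returned
    together with a simple voxel-level threshold mask."""
    t_or = np.maximum(t_map_a, t_map_b)
    return t_or, t_or >= t_thresh
```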
For visualisation, the statistical maps were rendered onto a reconstruction of the cerebral and cerebellar grey‐matter surface of a single participant. The surface was generated using SPM2 for reducing intensity non‐uniformities in the original structural image and Caret 5 (http://brainmap.wustl.edu/caret) for grey/white‐matter segmentation and surface reconstruction. Statistical maps were rendered onto the surface by assigning each surface node the maximum voxel value within a 10‐mm cube around the node and converted to RGB maps by scaling the statistical values (t‐values) between zero intensity (black) for t = 0 and the maximum colour intensity for the maximum t‐value of the respective map.
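The node-wise projection and colour scaling can be expressed compactly; the brute-force sketch below assumes node and voxel coordinates in the same millimetre space and is intended only to make the mapping rule explicit, not to reproduce the Caret/SPM2 pipeline.

```python
import numpy as np

def map_t_to_surface(node_xyz, voxel_xyz, t_values, half_width_mm=5.0):
    """Assign each surface node the maximum t-value of all voxels within a
    10-mm cube centred on the node, then scale to [0, 1] for colouring
    (0 = black, 1 = full colour intensity)."""
    node_t = np.zeros(len(node_xyz))
    for i, node in enumerate(node_xyz):
        inside = np.all(np.abs(voxel_xyz - node) <= half_width_mm, axis=1)
        node_t[i] = t_values[inside].max() if inside.any() else 0.0
    t_max = node_t.max()
    return node_t / t_max if t_max > 0 else node_t
```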
Participants
A total of 22 right‐handed (Oldfield test) volunteers (15 male, 7 female, mean age 24 years) with no known history of audiological, psychiatric or neurological disease participated in the experiment after having given written informed consent. The experimental procedures were approved by the local ethics committee.
RESULTS
Behavioural Data
The behavioural results indicate that all participants managed to perform the attentional task with high accuracy. Auditory and visual targets were detected with an average hit minus false‐alarm (Hit‐FA) rate of 94% (±1.3%) and 97% (±0.6%), respectively (Fig. 2A). The average response time (RT) for visual and auditory target detection amounted to 593 ms (±19.4 ms) and 446 ms (±16.6 ms), respectively (Fig. 2B). RT was measured from the onset of the information‐bearing stimulus, which, in the auditory modality, was the onset of the third vowel in the three‐vowel sequence (see Fig. 1B). Hit‐FA rates and RTs were submitted to two‐way repeated‐measures analyses of variance with stimulus modality (auditory and visual) and attended hemispace (left and right) as within‐participant factors. These analyses revealed that the main effect of stimulus modality was significant for both Hit‐FA rate [F(1,21) = 5.214, P = 0.033] and RT [F(1,21) = 88.031, P < 0.001]. The main effect of attended hemispace was not significant in either case, but there was a significant interaction between modality and hemispace in the RTs [F(1,21) = 5.828, P = 0.025], which was due to the average RT for left auditory targets being shorter [417 ms (±20.2 ms)] than that for right auditory targets [475 ms (±21.6 ms); Fig. 2B]. The difference in Hit‐FA rate between the auditory and visual task conditions is consistent with participants' reports of finding the auditory task more difficult than the visual one.
Figure 2.

Average hit minus false‐alarm rate (Hit‐FA; A) and response time (RT; B) for target detection, plotted as a function of stimulus modality (visual: grey bars, auditory: black bars) and attended hemispace (left, right): Error bars show the standard error of the mean across participants. Significant differences are marked with stars.
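A repeated-measures analysis of this kind can be reproduced with standard statistical software; the sketch below uses Python's statsmodels with an illustrative long-format table (the file and column names are ours, not part of the original analysis).

```python
# 2 x 2 repeated-measures ANOVA on the behavioural data
# (within-participant factors: modality, attended hemispace)
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# long format: one row per participant x modality x hemispace,
# with columns 'subject', 'modality', 'hemispace' and 'rt'
rt_data = pd.read_csv("behavioural_rt_long.csv")

anova = AnovaRM(rt_data, depvar="rt", subject="subject",
                within=["modality", "hemispace"]).fit()
print(anova)   # F and P values for the main effects and their interaction
```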
Imaging Data
Sensory activation
To reveal activations associated with the sensory processing of the auditory and visual stimuli, the auditory block regressor was contrasted with the visual block regressor and vice versa. As expected, the auditory (green highlight in Fig. 3A) and visual (blue highlight) block regressors yielded widespread bilateral activation in the auditory and visual cortices, respectively, as compared with the respective other block regressor. The auditory‐evoked activation also encompassed the left inferior frontal lobe in the region of Broca's area, which was probably due to the verbal character of the stimuli. The visual block regressor yielded greater activation in the occipital lobe, as well as the posterior parietal lobe, the hippocampus and the anterior cingulate cortex.
Figure 3.

(A) Activation to the auditory (green) and visual (blue) block regressors as compared with the respective other block regressor. (B) Activation to the target regressors (averaged over modality and hemispace) for participants who had used their left (green) or right (red) hand as response hand. Both activation patterns were rendered onto a reconstruction of the grey‐matter surface of a single participant (see Methods).
Given that the analysis of the sensory activations involved directly comparing the auditory and visual block regressors, the resulting contrasts would be expected to reflect not only increases but also decreases in the BOLD signal, caused by cross‐modal deactivation [Johnson and Zatorre, 2005; Laurienti et al., 2002].
Button presses
Activations associated with button‐press responses to auditory and visual target events were revealed by testing the target regressors (averaged across modality and hemispace) separately for participants who had used their left (green highlight in Fig. 3B) or right (red highlight) hand as response hand. As expected, the target regressors yielded activity in sensorimotor areas contralateral to the response hand. In addition to the sensorimotor activation, the right‐hand group also activated the right cerebellum and bilateral anterior cingulate cortex.
Attention shifts
The cue events initiated a shift of attention, which would be expected to produce a transient response in areas involved in endogenous attentional control [Shomstein and Yantis, 2004, 2006; Yantis et al., 2002]. The blue and purple highlights in Figure 4A show that the activations associated with the auditory cue regressor (averaged across hemispace) comprised the medial parietal cortex, henceforth referred to as the precuneus, bilaterally (1), the right superior and left posterior parietal cortex (2, 3), the temporo‐parietal junction (TPJ) and inferior parietal cortex bilaterally, though more strongly on the right (4), the right temporo‐occipital junction (TOJ; 5), the premotor cortex and supplementary motor area (SMA) bilaterally (6), as well as the left dorsolateral prefrontal cortex (DLPFC, 7). Activations associated with the visual cue regressor were overall weaker than the auditory cue‐related activations, but comprised largely the same areas (red and purple highlights in Fig. 4A; see also Table I). The visual cue‐related response did not reach significance in the right superior parietal cortex or in the right premotor cortex and SMA.
Figure 4.

(A) Activation to the auditory (blue) and visual (red) cue regressors (averaged over hemispace) and their overlap (purple); 1: precuneus; 2: right superior parietal cortex; 3: left posterior parietal cortex; 4: temporo‐parietal junction (TPJ) and inferior parietal cortex; 5: right temporo‐occipital junction; 6: premotor cortex and supplementary motor area (SMA); 7: left dorsolateral prefrontal cortex (DLPFC). (B) Activation associated with the recognition of the cue stimuli, as reflected by the conjunction between the cue and target regressors (each averaged across modality and hemispace), plotted onto the activation to the auditory cue regressor (blue), replotted from panel A; the cue‐target conjunction is shown in green (cyan where it overlaps the auditory cue‐related activation). Both activation patterns were rendered onto the same grey‐matter surface reconstruction as used in Figure 3 (see Methods). See also Table I.
Table I.
MNI coordinates (in mm) and t‐values of most significant voxels as well as number of voxels in cluster (k) for activations associated with processing of cue events (Fig. 4)
| Contrast | Brain region | Coordinates x, y, z | k | t‐value |
|---|---|---|---|---|
| Auditory cues | Precuneus | 8, −64, 50 | 5,971 | 8.33 |
| | Right TPJ and IPC | 58, −36, 28 | 1,317 | 7.78 |
| | PreMC and SMA | −30, −10, 58 | 1,175 | 7.18 |
| | Left DLPFC | −38, 36, 36 | 423 | 6.29 |
| | Left PPC | −54, −60, 30 | 1,107 | 6.29 |
| Visual cues | Precuneus | −10, −62, 62 | 848 | 8.51 |
| | Left DLPFC | −38, 42, 30 | 125 | 5.19 |
| | Left PreMC and SMA | −22, −10, 72 | 78 | 4.49 |
| | Right TPJ and IPC | 66, −38, 30 | 35 | 4.39 |
| | Left TPJ and IPC | −48, −56, 34 | 141 | 4.28 |
| Cue‐target conjunction | Right TPJ and IPC | 60, −32, 32 | 119 | 5.15 |
| | Left TPJ and IPC | −64, −28, 24 | 380 | 4.50 |
TPJ, temporo‐parietal junction; IPC, inferior parietal cortex; PreMC, premotor cortex; SMA, supplementary motor area; PPC, posterior parietal cortex; DLPFC, dorsolateral prefrontal cortex.
Part of the activations associated with the cue regressors in Figure 4A can be assumed to be related to the recognition of the cue stimuli rather than the actual shifting of attention. Activation associated with stimulus recognition would be expected to be present for both the cue and the target events, because both kinds of events involved the recognition of a predefined stimulus pattern, and would thus be expected to be reflected by the conjunction [Nichols et al., 2005] between the cue and target regressors. The conjunction was thresholded in the same way as the activations relating to sensory processing and button‐press responses (P = 0.001 voxel level and P = 0.05 cluster level, corrected). The green and cyan highlights in Figure 4B suggest that part of the inferior parietal activation to the cue regressors was related to the recognition of the cue stimuli.
Lateralisation of attention‐shift responses
To test for sensitivity to the direction of attention shifts (leftward versus rightward) in the areas activated by the auditory and visual cue events, the regressors for auditory and visual cue events that had occurred in the right hemispace (i.e., triggered leftward attention shifts) were contrasted with those for cue events in the left hemispace (rightward attention shifts) and vice versa. As the statistical power of the resulting contrasts was relatively weak, a region of interest analysis was performed, whereby the contrasts were masked by the OR conjunction between the auditory and visual cue regressors (see Methods) and thresholded at a relatively lenient voxel threshold of P = 0.05. This analysis revealed that the temporo‐parietal and inferior parietal region as well as the precuneus showed a significant and largely consistent preference for auditory attention shifts towards the contralateral hemispace (highlighted by white circles in Fig. 5A). Thus, the left TPJ and precuneus responded more strongly to rightward than to leftward attention shifts (green highlight in left panels of Fig. 5A and white bars in Fig. 6), whereas the corresponding areas in the right hemisphere were activated more strongly by leftward than by rightward shifts (red highlight in right panels of Fig. 5A and grey bars in Fig. 6). The β‐values for the leftward and rightward auditory attention shifts at those voxels in the TPJ and precuneus that exhibited the greatest sensitivity to shift direction (see Table II) showed that there was no difference in the degree of this contralaterality of the shift responses between the left and right hemispheres [Fig. 6; TPJ: F(1,21) = 1.258, P = 0.275; precuneus: F(1,21) = 0.082, P = 0.777]. In other words, the difference between the contralateral and ipsilateral shift responses was similar in the two hemispheres.
Figure 5.

Comparison between leftward and rightward auditory (A) and visual (B) attention shifts, thresholded at a voxel level of P = 0.05 (t ≥ 1.72) and masked by the OR conjunction between the auditory and visual cue regressors (see Methods). Regions that responded more strongly to the leftward than the rightward attention shifts are shown in red; regions that responded more strongly to the rightward shifts are shown in green. See also Table II.
Figure 6.

Parameter estimates (i.e., β values, referred to as “size of effect”) for leftward and rightward auditory attention shifts at those voxels in the TPJ (A) and precuneus (B) that were most sensitive to shift direction. The white and grey bars show the β‐values from the left and right hemisphere, respectively. The most shift‐sensitive voxels in the left hemisphere were located at −42, −68, 26 mm (TPJ) and −10, −62, 52 mm (precuneus), and at 62, −40, 28 mm (TPJ) and 6, −56, 58 mm (precuneus) in the right hemisphere (see Table II).
Table II.
MNI coordinates (in mm) and t‐values of most significant voxels as well as number of voxels in cluster (k) for activation differences related to the direction of auditory attention shifts (Fig. 5A)
| Contrast | Brain region | Coordinates x, y, z | k | t‐value |
|---|---|---|---|---|
| Aud. cue right–left | Right TPJ | 62, −40, 28 | 501 | 4.37 |
| | Left TPJ | −46, −52, 24 | 47 | 2.93 |
| | Right precuneus | 6, −56, 58 | 471 | 2.58 |
| Aud. cue left–right | Left TPJ | −42, −68, 26 | 68 | 3.38 |
| | Left IPC | −60, −30, 20 | 45 | 3.16 |
| | Left precuneus | −10, −62, 52 | 420 | 2.87 |
TPJ, temporo‐parietal junction; IPC, inferior parietal cortex.
The picture conveyed by the visual cues was less consistent. The visual cues generally activated the left hemisphere more strongly than the right (see Fig. 4A), irrespective of attention shift direction. The only region that exhibited any sensitivity to shift direction was the TPJ, with the left TPJ showing a preference for visual attention shifts towards the left (ipsilateral) hemispace (triggered by cues in the right hemispace) and a more inconsistent pattern of direction preference in the right TPJ (Fig. 5B).
Figure 4A suggests that some of the brain areas that were activated by the attention shifts responded more strongly in one than in the other hemisphere, irrespective of shift direction. For instance, the DLPFC (7 in Fig. 4A) seems to be activated in the left, but not the right hemisphere, whereas the superior parietal cortex (2 in Fig. 4A) appears to be activated in the right but not the left hemisphere. To test the significance of these ‘absolute’ asymmetries between the hemispheres, the contrast images for the auditory and visual cue regressors (averaged across hemispace as in Fig. 4A) were compared with the respective left–right flipped versions of these contrasts (see Fig. 7). Comparing a contrast with the flipped version of the same contrast reveals whether the activation at a given voxel in one hemisphere is significantly greater than the activation at the corresponding voxel in the other hemisphere (see also Methods). These hemispheric comparisons showed that the visual cues (red and purple in Fig. 7) elicited a stronger response in the left than the right hemisphere in all activated regions [precuneus (1), posterior parietal cortex (3), TPJ and inferior parietal cortex (4), premotor cortex and SMA (6) and DLPFC (7)]. In contrast, activation to the auditory cues (blue and purple) was stronger in the left precuneus (1), the right superior and left posterior parietal cortex (2, 3), the right TPJ, inferior parietal cortex and TOJ (4, 5) and the left DLPFC (7) than in the corresponding regions in the opposite hemisphere.
Figure 7.

Comparison of the contrast images for the auditory (blue and purple) and visual (red and purple) cue regressors (averaged across hemispace as in Fig. 4) with the respective right–left flipped contrasts, thresholded and masked in the same way as the data shown in Fig. 5. The activations show areas where the auditory or visual cues produced a stronger activation in one than in the other hemisphere.
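In essence, the hemispheric comparison amounts to a voxel-wise paired test between each participant's contrast image and its mirror image; the sketch below illustrates this logic in Python (the actual analysis was performed in SPM2 on images normalised to the symmetrical template, and the array layout here is an assumption).

```python
import numpy as np
from scipy.stats import ttest_rel

def hemispheric_asymmetry(contrast_imgs):
    """Voxel-wise paired comparison between each participant's contrast image
    and its left-right mirror image (array shape: participants x X x Y x Z,
    with the X axis running left-right in the symmetrical template space).
    Positive t-values mark voxels that respond more strongly than their
    homologues in the opposite hemisphere."""
    flipped = np.flip(contrast_imgs, axis=1)
    t, p = ttest_rel(contrast_imgs, flipped, axis=0)
    return t, p
```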
DISCUSSION
Supramodal Attention System?
The attention shift‐related activations observed in the current study are broadly consistent with findings from previous studies on attentional orienting [see, e.g., Giessing et al., 2004; Himmelbach et al., 2006; Shomstein and Yantis, 2004, 2006; Yantis et al., 2002]. By including visual and auditory attention conditions in the same experiment, the current study allowed a direct comparison of the patterns of attention‐related activation between the auditory and visual modalities. The results showed that spatial attentional orienting in the auditory modality activates the same frontal, parietal and temporal brain regions as attentional orienting in the visual modality. The results are reminiscent of those of Macaluso et al. [2002], who showed that orienting to visual or somatosensory stimuli produced common activation patterns in the TPJ and the inferior and superior parietal cortices. The current results are also consistent with findings by Eimer et al. [2002, 2003], who reported similar ERP components during attentional orienting in vision, audition and touch. The current results also accord with those of Lewis et al. [2000], who showed that auditory and visual motion processing co‐activate areas in parietal and frontal cortices. Taken together, these findings lend strong support to the notion of a supramodal system for spatial attention [e.g., Eimer and Driver, 2001; Farah et al., 1989]. However, there are some data that suggest that at least some of the relevant brain regions contain modality‐specific subregions. For instance, in the current data, the dorsolateral prefrontal activation to the visual cues was located more anteriorly than that to the auditory cues (see region 7 in Fig. 4A). Using TMS, Chambers et al. [2004] showed that parts of the inferior parietal cortex play a critical role in attentional orienting in the visual but not the somatosensory domain. Finally, Green and McDonald [2006] observed modality‐specific differences in the attentional ERP components in a study using cross‐modal (audiovisual) attentional cueing. In particular, a frontal response (see Introduction) was present when the cue was visual and the target auditory, but absent in the reverse situation, and the topography of one component of the posterior response depended on the modality of the target stimuli. More advanced fMRI methods with a higher spatial resolution [Frahm et al., 2004] might be able to reveal whether or not the attention system contains modality‐specific submodules.
Spatial Versus Non‐Spatial Attention
Previous results on spatial and non‐spatial attentional orienting in the auditory and visual modality suggest that some of the areas that were activated by the shifting of spatial attention in the current experiment may also be involved in the control of non‐spatial attention [Serences et al., 2004; Shomstein and Yantis, 2006; Zatorre et al., 1999]. Zatorre et al. [1999], for instance, showed that focusing attention on either pitch or location in a random sequence of tones activates essentially the same regions in the parietal and frontal lobes. Using a paradigm similar to the one used in the current study, Shomstein and Yantis [2004] showed that the precuneus and the inferior parietal lobe are also activated by shifting attention between sensory modalities (vision and audition); the locations of the activations reported by Shomstein and Yantis appear to be similar to those of the cue‐related activations observed in the current experiment.
Hemispheric Lateralisation
Some of the areas activated by the cue events were sensitive to the direction of the attention shifts initiated by the cues. Sensitivity to attention shift direction was largely limited to the parietal cortex (particularly the precuneus) and adjacent areas (TPJ). In the case of the auditory cues, responses in these shift direction‐sensitive areas were almost exclusively larger to the contralateral than the ipsilateral attention shifts. In the case of the visual cues, sensitivity to attention shift direction was limited to a relatively small area in the region of the TPJ, which exhibited a larger response to the ipsilateral attention shifts in the left hemisphere and a mixed pattern in the right hemisphere. The results for the auditory cues are consistent with previous ERP results, which have also found contralateral direction sensitivity in some components of the neuroelectric attention‐shift response [Eimer et al., 2002; Harter et al., 1989; Nobre et al., 2000; Talsma et al., 2005; Yamaguchi et al., 1994]. The current study allows the precise localisation of these shift direction‐sensitive components and, unlike the ERP studies, also makes it possible to delineate them from those components that are insensitive to the direction of attention shifts.
Contrary to expectations based on the hemispheric asymmetry of neglect, the degree of contralaterality in the responses to the auditory cues did not show any significant differences between the hemispheres. However, some of the attention areas exhibited an absolute hemispheric asymmetry, that is, they were activated more strongly in one than the other hemisphere. While some of these areas were located in the left rather than the right hemisphere, particularly for the visual cues, the auditory cues activated the TPJ and TOJ and the superior parietal cortex more strongly in the right than in the left hemisphere. Both the temporal, parietal and occipital junctional area (TPJ and TOJ) and the superior parietal cortex have previously been found to be associated with neglect [Heilman et al., 1983; see also Karnath et al., 2001; Ringman et al., 2004].
The inconsistencies between the lateralisation results for the auditory and visual cues are probably due to the relatively poor statistical power of the overall responses to the visual cue events. However, they may also be at least partly due to non‐trivial reasons. Three possibilities spring to mind. First, it may be the case that spatial attention shifts are controlled not within the hemisphere contralateral to the shift direction, but contralateral to the hemispace within which the shift is executed. Attention shifts that span only one hemispace, as used in most of the ERP studies on attention shifting, would thus be expected to produce a predominantly contralateral pattern of response. Attention shifts spanning both hemispaces, as used in the current and most of the previous imaging (fMRI, PET) studies of attentional control, would be expected to produce a largely bilateral overall activation. Temporal snapshots of the activation could be lateralised to either hemisphere, depending on the time point of the snapshot.
Alternatively, it could be the case that the process of disengaging attention from the currently attended stimulus prior to the attention shift [Posner et al., 1984] is sensitive to the hemispace within which it occurs. In the current study, as in most previous imaging studies of attentional control, the location from which attention was disengaged was located in the opposite hemispace to where attention was shifted. Thus, if activity associated with disengagement is lateralised to the hemisphere contralateral to the currently attended location and activity associated with the attention shift is contralateral to the shift direction, the overall activation would again be bilateral. Again, snapshots of the activation could be lateralised to either hemisphere depending on whether the snapshot is dominated by the disengagement or the shifting process.
Finally, because of difficult‐to‐overcome technical limitations, the auditory attention shifts spanned a much wider angle than the visual ones, and the discrepancies between the lateralisation results for the auditory and visual cues may be related to this difference. The auditory cues were perceived as fully lateralised to the left or right ear (±90°), whereas the visual stimuli were lateralised by only ±9.8°. If the auditory and visual attention shifts were controlled by a common, supramodal, attention system based on a common spatial map, the weaker activation and more inconsistent lateralisation pattern produced by the visual compared to the auditory cues may be due to the size and lateralisation of the shift responses increasing with increasing shift angle. These hypotheses may be tested by comparing brain activations associated with attention shifts within versus across hemispaces or to different eccentricities within a given hemispace. It is improbable that the leftward asymmetry of the visual cue responses was due to the verbal character of the stimuli; previous studies indicate that the left‐hemisphere dominance for speech processing is linked to, or starts with, the lexical mapping of speech items, rather than their physical or phonological attributes [Shtyrov et al., 2005; for review, see Hickok and Poeppel, 2000; Scott and Wise, 2004; see also Stephan et al., 2003]. Moreover, due to the triangular arrangement of the letters in the visual task, the auditory stimuli would have been expected to be perceived as more speech‐like, and thus to be more affected by the left lateralisation of language, than the visual ones.
Stimulus Recognition Versus Attention Shifting
The current results suggest that part of the inferior parietal cortex and the TPJ were involved in the recognition of the cue and target events (Fig. 4B). This is consistent with previous findings on the detection of behaviourally relevant stimuli [Astafiev et al., 2006; Corbetta and Shulman, 2002; Kincade et al., 2005]. However, these areas overlapped considerably with the temporo‐parietal regions sensitive to the direction of attention shifts (see Fig. 5), suggesting that the networks subserving voluntary attentional orienting and the detection of behaviourally relevant stimuli in the temporo‐parietal region are at least partially overlapping. The notion that the TPJ and inferior parietal cortex are involved in voluntary attentional orienting is also supported by previous imaging [Himmelbach et al., 2006] and neuropsychological data [Karnath et al., 2001].
CONCLUSIONS
The current results show that spatial attention shifts in the auditory and visual modalities activate essentially the same areas in the parietal, frontal and temporal lobes. These areas include the precuneus and superior parietal cortex, the temporal, parietal and occipital junctional area, the premotor cortex and SMA, as well as the DLPFC. The results lend strong support to the notion that spatial attention is controlled by a supramodal system. Results from previous studies suggest that the same system may also be involved in the control of non‐spatial attention and that at least some of its elements may contain modality‐specific submodules. Some areas responded more strongly to attention shifts towards the contralateral than the ipsilateral hemispace. The degree of this contralaterality was similar between the two hemispheres. This is contrary to expectations based on the hemispheric asymmetry of neglect, according to which the right hemisphere should have responded more equally to ipsilateral and contralateral attention shifts than the left hemisphere. This suggests that the asymmetry of neglect may be due to absolute, rather than relative, hemispheric asymmetries in the spatial attention network. In the current experiment, the auditory attention shifts activated the TPJ and TOJ as well as the superior parietal cortex more strongly in the right than in the left hemisphere. Both the superior parietal cortex and the temporal, parietal and occipital junctional area have previously been shown to be associated with neglect.
Acknowledgements
We thank our colleagues from the MR and cognitive neurology groups at the INB‐3 for their support.
REFERENCES
- Astafiev SV, Shulman GL, Corbetta M (2006): Visuospatial reorienting signals in the human temporo‐parietal junction are independent of response selection. Eur J Neurosci 23: 591–596.
- Baddeley A (2003): Working memory and language: An overview. J Commun Disord 36: 189–208.
- Chambers CD, Stokes MG, Mattingley JB (2004): Modality‐specific control of strategic spatial attention in parietal cortex. Neuron 16: 925–930.
- Corbetta M, Shulman GL (2002): Control of goal‐directed and stimulus‐driven attention in the brain. Nat Rev Neurosci 3: 201–215.
- Corbetta M, Kincade JM, Ollinger JM, McAvoy MP, Shulman GL (2000): Voluntary orienting is dissociated from target detection in human posterior parietal cortex. Nat Neurosci 3: 292–297.
- Culling JF, Summerfield Q (1995): Perceptual separation of concurrent speech sounds: Absence of across‐frequency grouping by common interaural delay. J Acoust Soc Am 98: 785–797.
- Darwin CJ, Hukin RW (1999): Auditory objects of attention: The role of interaural time differences. J Exp Psychol Hum Percept Perform 25: 617–629.
- Darwin CJ, Hukin RW (2000): Effectiveness of spatial cues, prosody, and talker characteristics in selective attention. J Acoust Soc Am 107: 970–977.
- Deouell LY, Bentin S, Giard MH (1998): Mismatch negativity in dichotic listening: Evidence for interhemispheric differences and multiple generators. Psychophysiology 35: 355–365.
- Eimer M, Driver J (2001): Crossmodal links in endogenous and exogenous spatial attention: Evidence from event‐related brain potential studies. Neurosci Biobehav Rev 25: 497–511.
- Eimer M, van Velzen J, Driver J (2002): Cross‐modal interactions between audition, touch, and vision in endogenous spatial attention: ERP evidence on preparatory states and sensory modulations. J Cogn Neurosci 14: 254–271.
- Eimer M, van Velzen J, Forster B, Driver J (2003): Shifts of attention in light and in darkness: An ERP study of supramodal attentional control and crossmodal links in spatial attention. Brain Res Cogn Brain Res 15: 308–323.
- Farah MJ, Wong AB, Monheit MA, Morrow LA (1989): Parietal lobe mechanisms of spatial attention: Modality‐specific or supramodal? Neuropsychologia 27: 461–470.
- Fan J, McCandliss BD, Sommer T, Raz A, Posner MI (2002): Testing the efficiency and independence of attentional networks. J Cogn Neurosci 14: 340–347.
- Frahm J, Dechent P, Baudewig J, Merboldt KD (2004): Advances in functional MRI of the human brain. Prog Nucl Magn Reson Spectrosc 44: 1–32.
- Friston KJ, Fletcher P, Josephs O, Holmes A, Rugg MD, Turner R (1998): Event‐related fMRI: Characterizing differential responses. Neuroimage 7: 30–40.
- Giessing C, Thiel CM, Stephan KE, Rösler F, Fink GR (2004): Visuospatial attention: How to measure effects of infrequent, unattended events in a blocked stimulus design. Neuroimage 23: 1370–1381.
- Gitelman DR (2002): ILAB: A program for postexperimental eye movement analysis. Behav Res Methods Instrum Comput 34: 605–612.
- Green JJ, McDonald JJ (2006): An event‐related potential study of supramodal attentional control and crossmodal attention effects. Psychophysiology 43: 161–171.
- Grosbras MH, Paus T (2002): Transcranial magnetic stimulation of the human frontal eye field: Effects on visual perception and attention. J Cogn Neurosci 14: 1109–1120.
- Grosbras MH, Paus T (2003): Transcranial magnetic stimulation of the human frontal eye field facilitates visual awareness. Eur J Neurosci 18: 3121–3126.
- Halligan PW, Fink GR, Marshall JC, Vallar G (2003): Spatial cognition: Evidence from visual neglect. Trends Cogn Sci 7: 125–133.
- Harter MR, Miller SL, Price NJ, LaLonde ME, Keyes AL (1989): Neural processes involved in directing attention. J Cogn Neurosci 1: 223–237.
- Heilman KM, Watson RT, Valenstein E, Damasio AR (1983): Localization of lesions in neglect. In: Kertesz A, editor. Localization in Neuropsychology. New York: Academic Press; pp 471–492.
- Heilman KM, Watson RT, Valenstein E (2003): Neglect and related disorders. In: Heilman KM, Valenstein E, editors. Clinical Neuropsychology. London: Oxford University Press; pp 296–346.
- Hickok G, Poeppel D (2000): Towards a functional neuroanatomy of speech perception. Trends Cogn Sci 4: 131–138.
- Himmelbach M, Erb M, Karnath HO (2006): Exploring the visual world: The neural substrate of spatial orienting. Neuroimage 32: 1747–1759.
- Hine J, Debener S (2007): Late auditory evoked potentials asymmetry revisited. Clin Neurophysiol 118: 1274–1285.
- Hopfinger JB, Buonocore MH, Mangun GR (2000): The neural mechanisms of top‐down attentional control. Nat Neurosci 3: 284–291.
- Hugdahl K, Nordby H (1994): Electrophysiological correlates to cued attentional shifts in the visual and auditory modalities. Behav Neural Biol 62: 21–32.
- Johnson JA, Zatorre RJ (2005): Attention to simultaneous unrelated auditory and visual events: Behavioral and neural correlates. Cereb Cortex 15: 1609–1620.
- Kaiser J, Lutzenberger W, Preissl H, Ackermann H, Birbaumer N (2000): Right‐hemisphere dominance for the processing of sound‐source lateralization. J Neurosci 20: 6631–6639.
- Karnath HO, Ferber S, Himmelbach M (2001): Spatial awareness is a function of the temporal not the posterior parietal lobe. Nature 411: 950–953.
- Kincade JM, Abrams RA, Astafiev SV, Shulman GL, Corbetta M (2005): An event‐related functional magnetic resonance imaging study of voluntary and stimulus‐driven orienting of attention. J Neurosci 25: 4593–4604.
- Krumbholz K, Schönwiesner M, von Cramon DY, Rübsamen R, Shah NJ, Zilles K, Fink GR (2005): Representation of interaural temporal information from left and right auditory space in the human planum temporale and inferior parietal lobe. Cereb Cortex 15: 317–324.
- Krumbholz K, Hewson‐Stoate N, Schönwiesner M (2007): Cortical response to auditory motion suggests an asymmetry in the reliance on inter‐hemispheric connections between the left and right auditory cortices. J Neurophysiol 97: 1649–1655.
- Laurienti PJ, Burdette JH, Wallace MT, Yen YF, Field AS, Stein BE (2002): Deactivation of sensory‐specific cortex by cross‐modal stimuli. J Cogn Neurosci 14: 420–429.
- Lewis JW, Beauchamp MS, DeYoe EA (2000): A comparison of visual and auditory motion processing in human cerebral cortex. Cereb Cortex 10: 873–888.
- Macaluso E, Frith CD, Driver J (2002): Supramodal effects of covert spatial orienting triggered by visual or tactile events. J Cogn Neurosci 14: 389–401.
- Marshall JC (2001): Editorial: Auditory neglect and right parietal cortex. Brain 124: 645–646.
- Mesulam MM (1981): A cortical network for directed attention and unilateral neglect. Ann Neurol 10: 309–325.
- Mesulam MM (1999): Spatial attention and neglect: Parietal, frontal and cingulate contributions to the mental representation and attentional targeting of salient extrapersonal events. Philos Trans R Soc Lond B Biol Sci 354: 1325–1346.
- McDonald JJ, Ward LM (1999): Spatial relevance determines facilitatory and inhibitory effects of auditory covert spatial orienting. J Exp Psychol Hum Percept Perform 25: 1234–1252.
- Nichols T, Brett M, Andersson J, Wager T, Poline JB (2005): Valid conjunction inference with the minimum statistic. Neuroimage 25: 653–660.
- Nobre AC, Sebestyen GN, Gitelman DR, Mesulam MM, Frackowiak RS, Frith CD (1997): Functional localization of the system for visuospatial attention using positron emission tomography. Brain 120: 515–533.
- Nobre AC, Sebestyen GN, Miniussi C (2000): The dynamics of shifting visuospatial attention revealed by event‐related potentials. Neuropsychologia 38: 964–974.
- Petit L, Simon G, Joliot M, Andersson F, Bertin T, Zago L, Mellet E, Tzourio‐Mazoyer N (2007): Right hemisphere dominance for auditory attention and its modulation by eye position: An event related fMRI study. Restor Neurol Neurosci 25: 211–225.
- Pollmann S, Morrillo M (2003): Left and right occipital cortices differ in their response to spatial cueing. Neuroimage 18: 273–283.
- Posner MI (1980): Orienting of attention. Q J Exp Psychol 32: 3–25.
- Posner MI, Walker JA, Friedrich FJ, Rafal RD (1984): Effects of parietal injury on covert orienting of attention. J Neurosci 4: 1863–1874.
- Reeves A, Sperling G (1986): Attention gating in short‐term visual memory. Psychol Rev 93: 180–206.
- Ringman JM, Saver JL, Woolson RF, Clarke WR, Adams HP (2004): Frequency, risk factors, anatomy, and course of unilateral neglect in an acute stroke cohort. Neurology 63: 468–474.
- Roberts KL, Summerfield AQ, Hall DA (2006): Presentation modality influences behavioral measures of alerting, orienting, and executive control. J Int Neuropsychol Soc 12: 485–492.
- Serences JT, Schwarzbach J, Courtney SM, Golay X, Yantis S (2004): Control of object‐based attention in human cortex. Cereb Cortex 14: 1346–1357.
- Schönwiesner M, Krumbholz K, Rübsamen R, Fink GR, von Cramon DY (2007): Hemispheric asymmetry for auditory processing in the human auditory brain stem, thalamus, and cortex. Cereb Cortex 17: 492–499.
- Scott SK, Wise RJ (2004): The functional neuroanatomy of prelexical processing in speech perception. Cognition 92: 13–45.
- Shomstein S, Yantis S (2004): Control of attention shifts between vision and audition in human cortex. J Neurosci 24: 10702–10706.
- Shomstein S, Yantis S (2006): Parietal cortex mediates voluntary control of spatial and nonspatial auditory attention. J Neurosci 26: 435–439.
- Shtyrov Y, Pihko E, Pulvermüller F (2005): Determinants of dominance: Is language laterality explained by physical or linguistic features of speech? Neuroimage 27: 37–47.
- Smith DR, Patterson RD (2005): The interaction of glottal‐pulse rate and vocal‐tract length in judgements of speaker size, sex, and age. J Acoust Soc Am 118: 3177–3186.
- Spence CJ, Driver J (1994): Covert spatial orienting in audition: Exogenous and endogenous mechanisms. J Exp Psychol Hum Percept Perform 20: 555–574.
- Stephan KE, Marshall JC, Friston KJ, Rowe JB, Ritzl A, Zilles K, Fink GR (2003): Lateralized cognitive processes and lateralized task control in the human brain. Science 301: 384–386.
- Talsma D, Slagter HA, Nieuwenhuis S, Hage J, Kok A (2005): The orienting of visuospatial attention: An event‐related brain potential study. Brain Res Cogn Brain Res 25: 117–129.
- Thiel CM, Zilles K, Fink GR (2004): Cerebral correlates of alerting, orienting and reorienting of visuospatial attention: An event‐related fMRI study. Neuroimage 21: 318–328.
- Uppenkamp S, Johnsrude IS, Norris D, Marslen‐Wilson W, Patterson RD (2006): Locating the initial stages of speech‐sound processing in human temporal cortex. Neuroimage 31: 1284–1296.
- Vandenberghe R, Duncan J, Dupont P, Ward R, Poline JB, Bormans G, Michiels J, Mortelmans L, Orban GA (1997): Attention to one or two features in left or right visual field: A positron emission tomography study. J Neurosci 17: 3739–3750.
- Vandenberghe R, Duncan J, Arnell KM, Bishop SJ, Herrod NJ, Owen AM, Minhas PS, Dupont P, Pickard JD, Orban GA (2000): Maintaining and shifting attention within left or right hemifield. Cereb Cortex 10: 706–713.
- van der Lubbe RH, Neggers SF, Verleger R, Kenemans JL (2006): Spatiotemporal overlap between brain activation related to saccade preparation and attentional orienting. Brain Res 1072: 133–152.
- Vandenberghe R, Gitelman DR, Parrish TB, Mesulam MM (2001): Location‐ or feature‐based targeting of peripheral attention. Neuroimage 14: 37–47.
- Walsh V, Ellison A, Ashbridge E, Cowey A (1999): The role of the parietal cortex in visual attention hemispheric asymmetries and the effects of learning: A magnetic stimulation study. Neuropsychologia 37: 245–251.
- Yamaguchi S, Tsuchiya H, Kobayashi S (1994): Electroencephalographic activity associated with shifts of visuospatial attention. Brain 117: 553–562.
- Yantis S, Schwarzbach J, Serences JT, Carlson RL, Steinmetz MA, Pekar JJ, Courtney SM (2002): Transient neural activity in human parietal cortex during spatial attention shifts. Nat Neurosci 5: 995–1002.
- Zatorre RJ, Mondor TA, Evans AC (1999): Auditory attention to space and frequency activates similar cerebral systems. Neuroimage 10: 544–554.
