Proceedings of the National Academy of Sciences of the United States of America. 2016 Apr 11;113(17):4842–4847. doi: 10.1073/pnas.1524087113

Prestimulus influences on auditory perception from sensory representations and decision processes

Stephanie J Kayser a,1, Steven W McNair a,1, Christoph Kayser a,2
PMCID: PMC4855557  PMID: 27071110

Significance

The likelihood of perceiving a faint stimulus depends not only on the stimulus itself but also on the state of rhythmic brain activity preceding the stimulus. Previous neuroimaging results could not determine whether this state dependency arises from early sensory representations or from later decision-related computations. We show that state affects perception via two mechanisms: one by which the power of ongoing rhythmic activity scales the early sensory evidence, and another by which the timing (phase) of this activity influences sensory decisions within frontoparietal areas governing cognitive processes. These findings reconcile the wide range of previous reports and delineate two distinct mechanisms.

Keywords: perception, oscillatory brain activity, EEG, single-trial decoding, prestimulus effects

Abstract

The qualities of perception depend not only on the sensory inputs but also on the brain state before stimulus presentation. Although the collective evidence from neuroimaging studies for a relation between prestimulus state and perception is strong, the interpretation in the context of sensory computations or decision processes has remained difficult. In the auditory system, for example, previous studies have reported a wide range of effects in terms of the perceptually relevant frequency bands and state parameters (phase/power). To dissociate influences of state on earlier sensory representations and higher-level decision processes, we collected behavioral and EEG data in human participants performing two auditory discrimination tasks relying on distinct acoustic features. Using single-trial decoding, we quantified the relation between prestimulus activity, relevant sensory evidence, and choice in different task-relevant EEG components. Within auditory networks, we found that phase had no direct influence on choice, whereas power in task-specific frequency bands affected the encoding of sensory evidence. Within later-activated frontoparietal regions, theta and alpha phase had a direct influence on choice, without involving sensory evidence. These results delineate two consistent mechanisms by which prestimulus activity shapes perception. However, the timescales of the relevant neural activity depend on the specific brain regions engaged by the respective task.


Sensory percepts depend not only on the environmental inputs but also on the internal brain state before stimulus presentation (1). Many studies have shown that the accuracy and speed of sensory performance change with the power and timing (phase) of rhythmic activity during a prestimulus period (2, 3). Studies in the auditory system, for example, have demonstrated that performance in detecting sounds and gaps in noise, or the discrimination of lexical stimuli, varies with the power and phase of rhythmic activity between about 1 and 12 Hz (4–9).

Although the collective evidence makes a strong case that prestimulus state shapes the processing and perceptual consequences of sensory inputs, the functional interpretation of these findings in the context of specific sensory computations or higher cognitive processes has remained difficult (7, 10, 11). Electrophysiological studies in animals have described the state dependency of firing rates relative to cortical oscillations (12–15). Hence, it is tempting to interpret the reported prestimulus effects in neuroimaging studies as direct evidence for a link between the neural gain of early sensory cortices and perception. However, this is difficult for two reasons. First, previous studies have used different behavioral protocols (detection and discrimination) and stimuli (tones in silence or noise, gaps in noise, or speech), and each has implied different frequency bands and state parameters as relevant (from 1 to 12 Hz, reporting effects for phase, power, or both). Second, given the coarse spatial resolution of neuroimaging, it has often been difficult to localize the observed correlation of prestimulus state with perception to a specific neural process or brain region. Hence, it remains unclear whether previously reported prestimulus influences indeed originate from auditory cortices, possibly reflecting changes in sensory gain, or result from other high-level regions that are involved in general decision making.

To disambiguate these two possibilities, we collected behavioral and EEG data during two auditory discrimination tasks relying on distinct acoustic features in the same participants. To dissect different stages of the sensory–perceptual cascade, we used single-trial decoding to separate earlier auditory from later decision-related activity (16–18). We then used linear modeling to quantify the relation between prestimulus activity, task-relevant sensory evidence, and perceptual choice within each of these components. This allowed us to directly quantify whether putative correlations of prestimulus activity with perceptual choice are mediated by an impact of prestimulus state on early auditory evidence, or arise from higher cognitive processes activated subsequent to early sensory representations.

Results

Behavioral Results.

The discrimination tasks used here were modeled on a previous target-in-background detection task that had revealed pretarget influences on perception similar to those reported in other auditory studies (4). Subjects performed frequency and intensity discrimination tasks on different days and judged which of two brief tones was higher in pitch (louder). Each tone lasted 50 ms (with a 50-ms intertone interval), and the second tone was always the standard, whereas the first was higher or lower in pitch (or intensity) across seven levels of difficulty titrated around each participant’s threshold (Fig. 1A). Targets were presented on a background cacophony created from the superposition of many naturalistic sounds (4). The complete sensory evidence necessary to perform the task (i.e., both target tones) was available 150 ms after target onset.

Fig. 1.

Experimental paradigm and behavioral data. (A) Experimental paradigm. Target sounds (pure tones, each lasting 50 ms, 50-ms intertone interval) were presented at one of six possible time points (2.4 + n*0.033 s, n = 0, …, 5) during a continuous acoustic background. The first tone varied in frequency (intensity) and was either increased or decreased by one of seven levels (titrated around each participant’s psychometric threshold; Δ) relative to the second tone, which was the standard. (B) Group-level psychometric curves for each task, averaged across target positions (mean and SEM across participants; n = 13 participants). (C) Group-level performance (mean and SEM across participants) vs. target position, averaged across stimulus levels. There was no significant effect of target position (main text).

Group-level psychometric curves (n = 13 participants) are shown in Fig. 1B and demonstrate comparable performance across tasks [paired t test, t(12) = 2.993, P = 0.096]. The target tones were presented at six different time points relative to the background to avoid expectancy effects (19, 20). There was no significant effect of target position on performance [frequency: ANOVA, F(5, 72) = 0.33, P = 0.892; intensity: F(5, 72) = 1.41, P = 0.231], suggesting that any dependency of perception on pretarget activity arises from neural processes not strictly entrained by the acoustic background (4, 5, 9).
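For illustration, the following Python sketch (the original analyses were performed in MATLAB) fits a logistic psychometric function and tests for an effect of target position with a one-way ANOVA; the response proportions and accuracies are simulated placeholders, not the actual data.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import f_oneway

def psychometric(delta, threshold, slope):
    # logistic function mapping the signed stimulus level to P("first tone higher")
    return 1.0 / (1.0 + np.exp(-(delta - threshold) / slope))

# illustrative proportions of "higher" responses at the 7 signed difficulty levels
levels = np.linspace(-1.0, 1.0, 7)
p_resp = np.array([0.08, 0.20, 0.35, 0.50, 0.66, 0.80, 0.93])
params, _ = curve_fit(psychometric, levels, p_resp, p0=[0.0, 0.3])
print("threshold = %.3f, slope = %.3f" % tuple(params))

# one-way ANOVA on accuracy across the six target positions (13 participants each)
rng = np.random.default_rng(0)
acc_by_position = [rng.normal(0.75, 0.05, size=13) for _ in range(6)]
F, p = f_oneway(*acc_by_position)
print("target position: F(5, 72) = %.2f, P = %.3f" % (F, p))
```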

Decoding Task-Relevant Sensory Evidence from EEG Activity.

Using single-trial decoding applied across all stimulus levels, we searched for one-dimensional projections that maximally differed between the two conditions that participants were discriminating, namely which of the two tones was higher (in pitch or intensity). For both tasks, the decoding performance quantified by the receiver operating characteristic (ROC) was significant shortly after target presentation (randomization test, P < 0.01, corrected for multiple comparisons along time; Fig. 2A). We assessed the relevance of each projection for encoding task-relevant sensory evidence by computing neurometric curves for stimulus discrimination and neurobehavioral correlations (Fig. 2B). For both tasks, the neurobehavioral correlations were significant shortly after target presentation (randomization test, P < 0.01). Example neurometric curves are shown in Fig. 2C.
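The logic of this sliding-window decoding can be sketched as follows (in Python; the original implementation used regularized linear discriminant analysis in MATLAB, see Materials and Methods). Averaging each 80-ms window over time and using scikit-learn's shrinkage LDA as the regularized classifier are simplifying assumptions, as are the data shapes.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedKFold

def sliding_window_az(eeg, labels, fs=150, win_ms=80):
    """eeg: (n_trials, n_channels, n_samples); labels: 0/1 condition per trial.
    Returns the cross-validated Az (ROC area) for each window start sample."""
    win = int(round(win_ms / 1000 * fs))
    n_trials, _, n_samp = eeg.shape
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=1)
    az = []
    for start in range(0, n_samp - win + 1):
        # one feature per channel: mean activity within the 80-ms window
        X = eeg[:, :, start:start + win].mean(axis=2)
        scores = np.zeros(n_trials)
        for train, test in cv.split(X, labels):
            clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
            clf.fit(X[train], labels[train])
            scores[test] = clf.decision_function(X[test])
        az.append(roc_auc_score(labels, scores))
    return np.array(az)

# simulated data: 200 trials, 64 channels, 1 s at 150 Hz, weak condition effect
rng = np.random.default_rng(1)
labels = rng.integers(0, 2, 200)
eeg = rng.standard_normal((200, 64, 150)) + 0.2 * labels[:, None, None]
print(sliding_window_az(eeg, labels)[:5])
```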

Fig. 2.

Single-trial decoding of task-relevant evidence. (A) Performance of a linear classifier discriminating the two stimulus conditions (which tone was higher/lower) quantified using the ROC. Classification was performed using the EEG activity within 80-ms sliding windows (the time axis refers to the beginning of each window). Time is relative to target onset, and significance is relative to a randomization test (dashed lines; P < 0.01). (B) Correlation of psychometric curves with neurometric curves constructed from each classifier, with significance relative to a randomization test (dashed lines; P < 0.01). (C) Illustration of neurometric curves (derived from component 2, red, in D) together with the group-averaged psychometric curves (taken from Fig. 1). (D) K-means clustering of the scalp projections of the classification components revealed three temporally continuous clusters that were consistent across participants and tasks. The first cluster spanned the period of target presentation (black), the second reflected early auditory activations (red), and the third reflected later frontoparietal activations (blue). (E) Stimulus discrimination performance and neurobehavioral correlations for each of the three components, with significance relative to a randomization test (dashed lines; P < 0.01). Lines and shaded areas as well as error bars show mean and SEM across participants (n = 13).

The scalp projections of the discriminating EEG components exhibited a systematic temporal pattern. Clustering projections in time revealed three distinct components for each task, which shared similarities between tasks. The average scalp projections and time epochs for each component are shown in Fig. 2D. The first component covered the time epoch during which the two target tones were being presented (up to t ∼0.150 s). The second reflected typical auditory activations with central topographies (spanning 0.14–0.28 s for intensity and 0.13–0.28 s for frequency; termed the “auditory” component). For the intensity task this component was bilateral, whereas for the frequency task it was more right-lateralized, in agreement with the understanding that distinct auditory networks are involved in intensity and frequency judgments (21, 22). The third component comprised frontoparietal activations (spanning 0.28–0.4 s; termed “frontoparietal”), likely reflecting the transformation from auditory inputs to awareness (17, 23). The topography of this component differed between tasks in line with the notion that late frontoparietal EEG components reflect the heterogeneity of response strategies and latencies in sensory decision tasks (24). The between-subject similarities of each component did not differ between tasks [Fig. S1A; paired t tests, t(12) = 1.7, P = 0.10; t = 1.3, P = 0.2; and t = 1.2, P = 0.22, respectively]. The second and third components exhibited significant sensitivity to the stimulus condition in both tasks (ROC above 0.5; randomization test, P < 0.01), whereas the first component did not (Fig. 2E). Given that the first component did not capture significant task-relevant evidence and spanned a time period during which target presentation was not yet complete, we excluded this component from subsequent analysis. Neurobehavioral correlations were significant and strongest for the auditory component in both tasks, and for the frequency task they were significant only for this component (Fig. 2E).

Fig. S1.

(A) Similarity of the component topographies between participants. For each EEG component, we computed the Pearson correlation of each subject’s own topography with the group average. Results are shown as mean and SEM across participants (n = 13). (B) Regression of sensory evidence, Ytask, on the intensity of the background sound before target onset. Lines indicate the group-level regression t statistics, which did not reach significance when corrected for multiple comparisons across tasks, EEG components, and time (FDR; at P < 0.01).

Pretarget Influences in Auditory Networks.

Identifying the EEG components that characterize the networks carrying task-relevant sensory evidence allowed us to ask whether pretarget activity within these networks had a significant influence on perceptual choice. We therefore exploited the low-dimensional projections defined by these classification components as windows onto specific neural processes involved in the sensory–perceptual transformation (16–18). Importantly, by extracting the components separately for each participant and task, we avoided the assumption of a common localization of effects across tasks or subjects, which has often been made in previous studies. Rather, we investigated prestimulus activity within the most relevant activity components for each participant and task. Based on previous studies, we expected to find an influence of pretarget activity on perception. However, it remained unclear whether these influences would be consistent across tasks and whether they would be mediated by the sensory evidence reflected by the respective component.

Based on the weights associated with each component, we derived projections of the relevant single-trial activity separately for each task and participant (Fig. 3A). From these projections, we then extracted the oscillatory power and phase during a pretarget period and determined for which time-frequency bins there was a significant relation between these and choice (“direct influence” model I; Fig. 3B). For both tasks, this revealed no effects of either power or phase on choice (Fig. 3C; at P < 0.05; here and in the following, all results are derived using cluster-based permutation controlling for multiple comparisons across time-frequency bins and are corrected for comparisons across regression models, parameters, and tasks using the false discovery rate). We then asked whether pretarget activity was related to the sensory evidence (Ytask) encoded by the respective EEG components (model II; Fig. 3C). Indeed, for both tasks, this relation was significant: for the frequency task at low frequencies (2–6 Hz, −0.6 to −0.1 s; Tsum = 66, P = 0.001) and in the beta band (16–36 Hz, −0.3 to −0.1 s; Tsum = 77, P = 0.002), and for the intensity task in the alpha and beta bands (8–18 Hz, −0.6 to −0.1 s; Tsum = 46, P = 0.01). For neither task was there a significant effect of phase on evidence. Further, for both tasks, sensory evidence had a significant influence on choice [model III; frequency: t(12) = 3.3, P = 0.006; intensity: t = 3.3, P = 0.006], as expected from the neurobehavioral correlations reported above. Using additional analyses, we ruled out that changes in the loudness of the acoustic background before the target sounds had a systematic influence on evidence or choice (cf. Fig. S1B and SI Results). These results suggest that pretarget activity in auditory networks has no direct effect on choice. Rather, the power in task-specific frequency bands influences the single-trial sensory evidence reflected by these networks, which in turn influences choice. Hence, any potential influence of power on choice (that may not have reached significance here) is likely mediated through an effect on sensory evidence.
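To illustrate how models I and II (Fig. 3B) can be set up for a single time-frequency bin, the following Python sketch regresses choice and unsigned sensory evidence on power and on the sine/cosine of phase. The simulated data, default regularization settings, and variable names are illustrative assumptions, not the settings of the actual analysis.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

def model_I(choice, power, phase):
    # logistic regression of choice (0/1) on z-scored power and sin/cos of phase
    X = np.column_stack([power, np.sin(phase), np.cos(phase)])
    return LogisticRegression(max_iter=1000).fit(X, choice).coef_.ravel()

def model_II(evidence, power, phase):
    # linear regression of unsigned sensory evidence |y| on the same regressors
    X = np.column_stack([power, np.sin(phase), np.cos(phase)])
    return LinearRegression().fit(X, np.abs(evidence)).coef_

# simulated single-trial data for one time-frequency bin:
# power scales evidence (a model II effect), evidence drives choice, phase is irrelevant
rng = np.random.default_rng(2)
n = 400
power = rng.standard_normal(n)
phase = rng.uniform(-np.pi, np.pi, n)
evidence = (1.0 + 0.3 * power) * rng.standard_normal(n)
choice = (evidence + 0.5 * rng.standard_normal(n) > 0).astype(int)

print("model I  betas [power, sin, cos]:", np.round(model_I(choice, power, phase), 3))
print("model II betas [power, sin, cos]:", np.round(model_II(evidence, power, phase), 3))
```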

Fig. 3.

Pretarget activity within auditory networks and relation to choice. (A) Single-trial activity within the auditory EEG component (cf. Fig. 2) was derived by projecting the data, X(t), onto the respective subspace carrying the task-relevant information, Ytask(t), separately for each participant, trial, and task. (Lower) Single-trial activity, Ytask(t), for one participant across trials is displayed. The gray box indicates the time window at which the discriminating component was computed. (B) Possible pathways by which pretarget activity could influence perception. Model I tested for a direct influence of power/phase on choice without involving sensory evidence; model II tested for an influence of power/phase on sensory evidence (Ytask); model III tested for a combined effect of power/phase and evidence on choice. Comparing models II and III allowed testing for a mediation effect of state influences on choice through evidence, hence detecting statistical influences of pretarget activity on choice that are most likely explained by a putative influence on evidence. (C) Group-level regression statistics for both tasks and models I and II (showing group-level t values for power and group-level averages for phase). Significant time-frequency clusters are indicated in black (at P < 0.05, corrected for multiple comparisons across time-frequency bins, and FDR-corrected across regression models and tasks; n = 13 participants). The effect of sensory evidence on choice was significant for both tasks (main text). Within these auditory networks there was no direct effect of power or phase on choice.

Pretarget Influences in Frontoparietal Networks.

For this EEG component, we found consistent effects of pretarget phase on choice (Fig. 4). For the frequency task this effect was prominent in the alpha band (7–14 Hz; −0.4 to −0.1 s; Tsum = 5.0, P = 0.003), and for the intensity task in the theta band (2–6 Hz; −0.6 to −0.1 s; Tsum = 3.9, P = 0.005). For the intensity task there was also an influence of alpha power on choice (10–16 Hz; −0.4 to −0.1 s; Tsum = 31, P = 0.013). Importantly, for neither task did we observe a significant effect of either phase or power on sensory evidence. In addition, for neither task did sensory evidence have a significant influence on choice, although the effect was close to significance for intensity [frequency: t(12) = 1.1, P = 0.27; intensity: t = 2.1, P = 0.056], similar to the neurobehavioral correlations reported above. This suggests that the influence of pretarget phase on choice is unlikely to be mediated by an influence of phase on the sensory evidence carried by this frontoparietal component. To further rule out this possibility, we investigated the statistical mediation effect of phase and power on choice through evidence (cf. Fig. 3B). Mediation effects for the time-frequency clusters with significant choice influences were not significant for either power or phase in either of the two tasks (at P < 0.05). Again we ruled out an influence of background loudness on evidence or choice (cf. Fig. S1B and SI Results). These results suggest that pretarget influences emerging within frontoparietal networks are mediated by mechanisms not directly involving the sensory evidence but rather reflect later-activated and more general decision-driving processes.

Fig. 4.

Pretarget activity within frontoparietal networks and relation to choice. Group-level regression statistics for both tasks and models (as in Fig. 3) for the frontoparietal component. Significant time-frequency clusters are indicated in black (P < 0.05). The effect of sensory evidence on choice was not significant for either task (main text). Within these networks, phase had a direct effect on choice that was not mediated through sensory evidence.

SI Results

We tested whether fluctuations in the loudness of the acoustic background sound were related to the sensory evidence carried by the different EEG components or to choice. Because the background was the same in each trial, trial-by-trial variations in the intensity of the background relative to the target sounds could only arise from the six different target positions relative to the background. The lack of a main effect of target position on performance (cf. Results) already suggests that such an influence is unlikely. We performed two additional control analyses to confirm this. First, we used linear regression to test for an effect of background intensity on sensory evidence in each of the two EEG components of interest. For this, the intensity of the background sound was calculated as the root-mean-square level in 20-ms windows at 5-ms time steps. Group-level regression statistics were derived in the same way as for the regression of sensory evidence on power/phase. The results, shown in Fig. S1B, revealed no clusters with a significant relation between background intensity and evidence for either task or component (at P < 0.01, corrected for multiple comparisons along time and across tasks using the false discovery rate). Second, we used logistic regression to test for an effect of background intensity on choice. Given that this analysis is independent of the EEG activity and the background intensity spanned only six different levels, only one regression was necessary per task. The group-level statistics revealed no effect for the intensity task [t(12) = 1.0, P = 0.33] but a significant relation for the frequency task [t(12) = 3.3, P = 0.006]. However, the negligible (cross-validated) ROC values for both logistic regression models (intensity: Az = 0.49 ± 0.001, mean ± SEM; frequency: Az = 0.50 ± 0.001; both P > 0.05 in a randomization test) indicate that any potential group-level consistency in the sign of the relation between background intensity and choice is not sufficient to predict behavioral performance from variations in the intensity of the acoustic background.
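For illustration, a Python sketch of the two control computations follows: the sliding rms level of the background sound and a cross-validated logistic regression of choice on the pre-target background level. Placeholder noise stands in for the actual cacophony, and choice is simulated as unrelated to the background, so the cross-validated Az is near 0.5.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedKFold, cross_val_predict

def sliding_rms(signal, fs, win_ms=20, step_ms=5):
    # root-mean-square level of a 1-D sound in 20-ms windows at 5-ms steps
    win, step = int(fs * win_ms / 1000), int(fs * step_ms / 1000)
    starts = np.arange(0, len(signal) - win + 1, step)
    return np.array([np.sqrt(np.mean(signal[s:s + win] ** 2)) for s in starts])

fs = 44100
rng = np.random.default_rng(3)
background = rng.standard_normal(4 * fs)          # placeholder for the cacophony
rms = sliding_rms(background, fs)

n_trials = 500
pre_target_level = rng.choice(rms, n_trials)[:, None]   # the real design had only 6 levels
choice = rng.integers(0, 2, n_trials)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
prob = cross_val_predict(LogisticRegression(), pre_target_level, choice,
                         cv=cv, method="predict_proba")[:, 1]
print("cross-validated Az = %.3f" % roc_auc_score(choice, prob))
```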

Discussion

These results delineate two mechanisms by which prestimulus activity shapes perception consistently across tasks: one affecting the quality of early sensory representations through the power of rhythmic activity, and one involving changes in later decision-making processes aligned to the phase of theta and alpha activity. They thereby reconcile previous reports from auditory studies by attributing the effects to two separate mechanisms. Further, they suggest that the observed variability in the relevant frequency bands arises from the engagement of distinct sensory and decision-related networks in each task.

Sensory and Decision-Related Origins of Prestimulus Influences.

Rhythmic brain activity can affect the quality of early sensory representations (13, 25) and can influence decision criteria or the likelihood of evoking a motor response within frontoparietal areas (11, 26, 27). Using linear discriminant analysis, we were able to consistently separate earlier auditory from later-activated frontoparietal networks. This allowed us to separately quantify the impact of pretarget activity within each of them. Within auditory networks the sensory evidence was significantly related to perception, and hence pretarget activity could shape behavior via two distinct but not mutually exclusive mechanisms: by scaling the quality of sensory evidence or via another mechanism independent of the sensory input. We found that only power, but not phase, had an influence on the sensory decision process in these networks, and this influence was specific to the scaling of sensory evidence. Within later-activated frontoparietal networks, we found a consistent and direct influence of phase on choice. This influence was not mediated by sensory evidence, as shown by the weak influence of evidence on choice in this component and the absence of a statistical mediation effect. These results confirm previously described influences of oscillatory phase on hearing (4–7) and localize these to higher-level brain regions (see ref. 7 for a similar interpretation). In the intensity task, we also found a direct effect of alpha power on choice. Hence, depending on the task, both the amplitude and timing of oscillatory activity in frontoparietal networks can shape sensory decisions (see ref. 11 for a similar conclusion in the somatosensory system).

Studies on prestimulus influences in EEG activity have often focused on activity over selected electrodes, such as those carrying the strongest oscillatory power in a band of interest (28, 29) or those generally known to yield strong evoked responses for the modality of interest (6, 20). Our approach avoids these a priori assumptions by considering the activity in a linear EEG component, selected to recover the strongest EEG-based evidence for discriminating the task-relevant stimuli. As shown by recent work on the neural correlates of perceptual decisions (16–18, 27), such a signal-driven selection of neuroimaging activity may yield more powerful insights than the focus on individual electrodes, especially as it naturally accounts for intersubject variability and differences in task-relevant networks.

Previous studies using a target-in-background detection task found that performance varied systematically with the target position relative to the background (4–6). We did not observe such an effect here. One explanation is that, in contrast to those studies, the target sounds here were clearly audible and easy to localize in time, whereas the extraction of their relative features was difficult. A prominent theory suggests that the auditory system engages in a rhythmic listening mode aligned to sensory regularities when scanning complex scenes (30, 31). However, in the present paradigm, detection was easy and a rhythmic mode was possibly not engaged. As a result, we report the neural underpinnings of rhythmic perceptual gating when the auditory system is not entrained by the acoustic environment (9).

Reconciling Previous Work and Common Principles Across Modalities.

Previous studies diverged as to the specific frequency bands in which prestimulus activity seems to guide perception. For example, auditory detection was shown to depend on theta power and phase (4), on delta phase (5, 6), or even on multifrequency states (9). Further, temporal predictions were reported to depend on delta and beta power and delta phase (19), whereas speech discrimination was found to vary with alpha phase (7). In the visual domain, alpha activity predicts detection performance via power (32, 33), phase (3), or both (34), and similar findings were reported in the somatosensory system (10, 11, 35). By directly comparing two tasks within the same participants, our results demonstrate consistent patterns of prestimulus influences (power–sensory evidence; phase–choice), but also show that the relevant frequency bands differ between tasks. Hence, the diversity of previous reports is most parsimoniously explained by differences in task-relevant networks that possibly engage neural activity at distinct timescales, and by the intermingling of sensory representations and choice-related activity in the analyzed signatures of brain activity.

In line with this, we found that the topographies of the relevant auditory components differed between tasks. This fits with the understanding that the processing of pitch and loudness may preferentially engage the ventral and dorsal auditory streams and their frontal projections (21, 36–38). In both tasks, sensory evidence was related to alpha/beta power. Previously, alpha power has been linked to listening effort in challenging environments (39) and may reflect the inhibition of interfering acoustic streams (40), whereas beta activity has been linked to corticocortical communication (10, 41). The observed alpha/beta effects could reflect intrinsic processes within early auditory regions, such as changes in the signal-to-noise ratio of sensory representations (13). However, we cannot rule out network-level effects such as changes in effective connectivity between auditory regions (10, 42). The phase of pretarget activity was relevant for perception only within the frontoparietal networks. Studies on perceptual decision making have described the rhythmic accumulation of sensory evidence (26) and implicated rhythmic phase as critical for implementing attentional selection in time (43). In line with this, our results provide strong evidence for a high-level origin of prestimulus phase effects on perception. In the intensity task, we also observed an influence of alpha power on choice. Although we can only speculate as to the origin of this difference between tasks, changes in decision criteria with alpha power have been described previously (11, 27).

How does the absence of phase effects in the auditory component fit with studies showing that the phase of rhythmic auditory activity is dynamically aligned (“entrained”) to acoustic regularities (44–46)? In principle, phase entrainment should lead to an influence of phase on subsequent stimuli, a mechanism that has been implicated in the segmentation of acoustic scenes (47, 48). In the present study, perception and auditory activity were not strongly entrained by the background sound, and hence the mechanisms linking phase and neural gain were possibly not sufficiently engaged. The power of slow auditory cortical activity has been related to changes in background spiking activity (13). Background activity can change the quality of sensory representations similarly to the changes in sensory gain associated with phase (13), and this may explain the power dependency of sensory evidence reported here.

It is important to note that our results are based on detecting significant interactions between multiple variables. This leaves the possibility that weaker (here not significant) effects may exist. Furthermore, although the phase and power derived from the same signal are frequently interpreted as reflecting distinct neural processes, this may be ambiguous as to the underlying neural generators (cf. SI Discussion). Still, the present results show that the phase and power derived from the same band but within distinct task-relevant networks can reflect neural processes that contribute differentially to the sensory–decision cascade. Notably, the present results do not speak to the complementarity of the phase/power influences in the auditory and frontoparietal networks. Some studies have suggested that top–down interactions between cognitive processes and sensory regions determine the patterns of sensory encoding in sensory cortices (49, 50), and future work is required to investigate the possibility that the two mechanisms described here are part of the same large-scale process.

SI Discussion

Can the phase and power derived from the same EEG frequency band reflect distinct processes? Whereas the phase and power derived from a single signal are not statistically independent, the phase and power derived from mesoscopic brain activity such as field potentials or magnetoencephalogram/EEG signals are often used to characterize what are apparently distinct neural processes. For example, within sensory cortices, neural spiking is differentially related to the phase and power of slow network signatures (12, 13, 45), and sensory perception seems to be differentially related to phase and power in different tasks and sensory modalities (Discussion). In turn, EEG signals reflect both the power and synchronization of neural spiking (62). Although a better understanding of the neural origins of the phase and power components of EEG signals requires further advances in the neural modeling of large-scale brain signals (63–65), the present results reinforce the notion that the phase and power derived from the same band-limited EEG signal can index distinct neural processes. Along this line, we note that the frontoparietal regions underlying the late EEG component analyzed here are likely closer to the scalp than the auditory regions, which may have affected the sensitivity of our analysis to phase or power effects in the two components. Better knowledge about the origins of phase and power, and how they are affected by the superposition of many active sources in the brain, is required to rule out a potential bias in favor of either signal component in the present or related analyses.

Materials and Methods

Data were obtained from 16 healthy adult participants following written informed consent. The study was conducted in accordance with the Declaration of Helsinki and was approved by the local ethics committee (College of Science and Engineering, University of Glasgow). See SI Materials and Methods for full details. In brief, we used two auditory discrimination tasks based on pure tone targets embedded in a background sound, similar to previous work (4) (Fig. 1A). In the frequency (intensity) task, subjects compared the pitch (loudness) of subsequent tones and judged which of the two was higher (louder). The second tone was always the standard, whereas the first was higher or lower, varying over seven different levels. These levels were equally spaced (in octaves or decibels) between a difference of 0 Hz (0 dB) and twice each participant’s threshold. Frequency thresholds were 0.09 ± 0.103 octaves (mean ± SD) and intensity thresholds were 2.95 ± 1.60 dB. Data from three participants had to be excluded, and results are presented for the 13 participants who each performed both tasks reliably. We used multivariate discriminant analysis across all trials and stimulus levels to localize EEG components that discriminated the two stimulus conditions classified by the participants. Cluster analysis yielded three systematically different components (Fig. 2D), for each of which we derived neurometric curves and projections of single-trial activity. We exploited these projections as an estimator of the underlying task-relevant activity (17, 18, 51), and studied the relation between pretarget activity, sensory evidence (the projection value), and choice using multiple regression models (Fig. 3B): model I, logistic regression of choice on power/phase to determine whether state influences choice (as expected); and model II, linear regression of evidence on power/phase to test whether state influences sensory representations. Any potential effect of state on choice could be mediated through an effect on evidence (direct and indirect pathways in Fig. 3B). Mediation analysis was used to dissociate these two possibilities (52) via a third model, model III: logistic regression of choice on evidence and power/phase. Group-level statistics were based on a cluster-based permutation procedure correcting for multiple comparisons across time-frequency bins (53) and further corrected for multiple comparisons across regression models, EEG components, and tasks using the false discovery rate (FDR; at P < 0.05).

SI Materials and Methods

Data were obtained from 16 healthy adult participants (6 males; mean age, 23.9 y) following written informed consent. All had self-reported normal hearing and were briefed about the nature of the study. The study was conducted in accordance with the Declaration of Helsinki and was approved by the local ethics committee (College of Science and Engineering, University of Glasgow).

Experimental Design and Stimulus Material.

We used two auditory discrimination tasks based on pure tone targets. These were embedded in a background sound consisting of a cacophony of naturalistic sounds, similar to a background sound used in our previous work (4). In particular, this cacophony comprised a wide set of environmental noises (forest, city), animal sounds, and sounds originating from tools. The same background cacophony was used in each trial. In both tasks, two pure tones were presented after a pseudorandom delay following the onset of the background (with delays taking one of the following six values: 2,400 + n*33 ms, with n = 0, …, 5; Fig. 1A). Each tone lasted 50 ms (including 5-ms cosine on/off ramps), and the two tones were separated by a 50-ms interval. Hence, full sensory evidence was available 150 ms after target onset. The signal-to-noise ratio of the standard tone relative to the background was +2 dB, based on the rms level. In the frequency task, subjects compared the pitch of the tones and judged which of the two was higher. In the intensity task, they had to indicate which of the two tones was louder. The second tone was always fixed (i.e., the standard) in frequency (1,024 Hz) or intensity (+2 dB), whereas the first was higher or lower (balanced and pseudorandomized across trials). This design was chosen so that the pretarget activity could readily influence the representation of the varying tone, without any potential influences from an intermediate sound.
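A minimal Python sketch of this target construction follows. The audio sampling rate and the noise background are placeholder assumptions; in the experiment the background was the naturalistic cacophony described above.

```python
import numpy as np

FS = 44100  # assumed audio sampling rate (not stated in the text)

def ramped_tone(freq, dur=0.05, ramp=0.005, fs=FS):
    # 50-ms pure tone with 5-ms raised-cosine on/off ramps
    t = np.arange(int(dur * fs)) / fs
    tone = np.sin(2 * np.pi * freq * t)
    n = int(ramp * fs)
    env = np.ones_like(tone)
    env[:n] = 0.5 * (1 - np.cos(np.pi * np.arange(n) / n))
    env[-n:] = env[:n][::-1]
    return tone * env

def build_trial(background, first_freq, std_freq=1024.0, snr_db=2.0, position=0, fs=FS):
    # scale the tones so the standard sits +2 dB (rms) above the background,
    # then insert [first tone, 50-ms gap, standard] at onset 2.4 s + position*33 ms
    std, first = ramped_tone(std_freq, fs=fs), ramped_tone(first_freq, fs=fs)
    gain = np.sqrt(np.mean(background ** 2)) / np.sqrt(np.mean(std ** 2)) * 10 ** (snr_db / 20)
    target = np.concatenate([first, np.zeros(int(0.05 * fs)), std]) * gain
    onset = int((2.4 + position * 0.033) * fs)
    trial = background.copy()
    trial[onset:onset + len(target)] += target
    return trial

rng = np.random.default_rng(4)
bg = 0.05 * rng.standard_normal(5 * FS)            # placeholder background, 5 s
trial = build_trial(bg, first_freq=1100.0, position=3)
```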

For each task, we used seven different levels of pitch (intensity) differences. These were equally spaced (in octaves or decibels) between a difference of 0 Hz (0 dB) and twice each participant’s own perceptual threshold in the respective discrimination task. The latter was determined using a staircase procedure before the actual experiment (see below). Trials were separated by intertrial periods of 1,700–2,200 ms (uniformly distributed) and presented in a block design (504 trials in total per task and participant). For each of the participants, we obtained data for both tasks, each recorded on a different day. Subjects were instructed to respond as accurately as possible, and the background noise terminated once the response was provided, or after 4 s.

Before the experiment, participants performed training trials without background noise to familiarize themselves with the task (usually 40–80 trials). We then determined each participant’s perceptual threshold for the target-in-background variant of the frequency and intensity discrimination tasks with a 1-up 2-down staircase procedure, based on three interleaved staircases whose thresholds were averaged. Across participants, frequency thresholds were 0.09 ± 0.103 octaves (mean ± SD) and intensity thresholds were 2.95 ± 1.60 dB. Data from three participants had to be excluded from further analysis; one performed poorly (<60% correct across all levels) in both tasks, and two performed poorly in one of the tasks. The results presented here are from the 13 participants who each performed both tasks reliably.
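A sketch of a 1-up 2-down staircase with three tracks and a simulated observer follows. Step sizes, trial counts, and the reversal-averaging rule are illustrative assumptions, not the exact settings used in the experiment.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulated_observer(delta, true_threshold=0.10, spread=0.03):
    # probability of a correct response rises with the stimulus difference delta
    p = 0.5 + 0.5 / (1.0 + np.exp(-(delta - true_threshold) / spread))
    return rng.random() < p

def one_up_two_down(start=0.40, step=0.05, n_trials=60):
    # difference decreases after 2 consecutive correct responses and increases after
    # each error; converges near the 70.7%-correct point of the psychometric function
    delta, streak, direction, reversals = start, 0, 0, []
    for _ in range(n_trials):
        if simulated_observer(delta):
            streak += 1
            if streak == 2:
                streak = 0
                if direction == +1:
                    reversals.append(delta)
                direction = -1
                delta = max(delta - step, 0.0)
        else:
            streak = 0
            if direction == -1:
                reversals.append(delta)
            direction = +1
            delta += step
    return np.mean(reversals[-6:]) if reversals else delta

# in the experiment the three staircases were interleaved trial by trial;
# running them back to back gives the same estimate for a stationary observer
thresholds = [one_up_two_down() for _ in range(3)]
print("estimated threshold: %.3f" % np.mean(thresholds))
```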

EEG Recording Procedures.

Experiments were performed in a dark and electrically shielded room. Acoustic stimuli were presented binaurally using Sennheiser headphones, and stimulus presentation was controlled through MATLAB (MathWorks) using routines from the Psychophysics toolbox. Sound levels were calibrated using a sound-level meter (model 2250; Brüel & Kjær) to an average 65-dB rms level. EEG signals were continuously recorded using an active 64-channel BioSemi system with Ag-AgCl electrodes mounted on an elastic cap (BioSemi) according to the 10/20 system. Four additional electrodes were placed at the outer canthi and below the eyes to obtain the electrooculogram (EOG). Electrode offsets were kept below 25 mV. Data were acquired at a sampling rate of 500 Hz using a low-pass filter of 208 Hz.

General Data Analysis.

Data analysis was carried out offline with MATLAB, using the FieldTrip toolbox (54) and custom-written routines following procedures used in our previous work (4). The EEG data from different recording blocks were preprocessed separately as follows. The data were band-pass–filtered between 1 and 70 Hz, resampled to 150 Hz, and subsequently denoised using independent component analysis (ICA). Usually one or two components reflecting eye movement-related artifacts were identified and removed following definitions provided by Debener et al. (55). In addition, for some subjects, highly localized components reflecting muscular artifacts were detected and removed (56, 57). To detect potential artifacts pertaining to remaining blinks or eye movements, we computed horizontal, vertical, and radial EOG signals following established procedures (58). We rejected trials in which the peak signal on any electrode exceeded a level of ±100 μV or during which potential eye movements were detected based on a threshold of 3 SDs above the mean of the high-pass–filtered EOGs using the procedures suggested by Keren et al. (58). We also excluded trials in which participants responded with reaction times shorter than 400 ms relative to the onset of the first tone, to avoid potential motor activations within the time period that was used for the main analysis (−0.6 to +0.4 s around target sound onset). Together, these criteria led to the rejection of 6.18 ± 11.1% of trials (mean ± SD). For all further analysis, the EEG signals were referenced to the common average reference.
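A simplified Python analogue of these preprocessing steps is sketched below; the original pipeline used MATLAB/FieldTrip, and the ICA-based denoising and EOG-based eye-movement rejection are omitted here for brevity. Data shapes are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, resample_poly

def preprocess(continuous, fs_in=500, fs_out=150, band=(1.0, 70.0)):
    # band-pass filter 1-70 Hz, then resample to 150 Hz; continuous: (channels, samples) in uV
    b, a = butter(4, [band[0] / (fs_in / 2), band[1] / (fs_in / 2)], btype="band")
    return resample_poly(filtfilt(b, a, continuous, axis=1), fs_out, fs_in, axis=1)

def keep_mask(epochs, reaction_times, amp_crit=100.0, rt_crit=0.4):
    # reject trials exceeding +/-100 uV on any channel or with RT < 400 ms
    amp_ok = np.abs(epochs).max(axis=(1, 2)) < amp_crit
    return amp_ok & (reaction_times >= rt_crit)

rng = np.random.default_rng(6)
raw = 20.0 * rng.standard_normal((64, 500 * 10))          # 10 s of 64-channel data at 500 Hz
clean = preprocess(raw)

epochs = 20.0 * rng.standard_normal((100, 64, 150))        # 100 epochs of 1 s at 150 Hz
keep = keep_mask(epochs, rng.uniform(0.2, 1.5, 100))
epochs = epochs[keep]
epochs -= epochs.mean(axis=1, keepdims=True)               # common average reference
print("kept %d of 100 trials" % keep.sum())
```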

Single-Trial Decoding.

We used multivariate linear discriminant analysis to localize EEG components that discriminated the two stimulus conditions classified by the participants, namely whether the first or second tone was higher in frequency (intensity). Classification was based on regularized linear discriminant analysis to identify a projection in the multidimensional EEG data, x(t), that maximally discriminated between the two conditions across all stimulus levels (51, 59). Each projection is defined by a weight vector, w, which describes a one-dimensional projection y(t) of the EEG data

y(t) = \sum_i w_i x_i(t) + c,

with i summing over all channels and c a constant. The regularization parameter was optimized in preliminary tests and kept fixed for all analyses. The classifier was applied to the EEG activity in time windows of 80 ms using a sliding window spanning pre- and posttarget periods, hence providing a discriminating component (defined by w) for each time window.

The performance of the classifier was quantified using the receiver operating characteristic (ROC), referred to as Az, based on 10-fold cross-validation within each participant. Statistical significance of this performance was determined using a randomization approach, by shuffling condition labels and computing the distribution of group-averaged Az values based on 1,000 randomized datasets after taking the maximal value along time to correct for multiple comparisons (60). Based on the linear model, we derived scalp topographies for each discriminating component by estimating the corresponding forward model, defined as the normalized correlation between the discriminating component and the original EEG activity (51).
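Both steps can be illustrated with simulated data, as in the sketch below: the forward model is taken as the correlation of the component time course with each channel, and a max-statistic randomization test is applied to an Az time course. Data shapes and effect sizes are illustrative assumptions.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def forward_model(X, y):
    # scalp topography of a component: Pearson correlation of the component time
    # course y (n_obs,) with each EEG channel in X (n_obs, n_channels)
    Xc, yc = X - X.mean(0), y - y.mean()
    return (Xc.T @ yc) / (len(y) * Xc.std(0) * yc.std())

rng = np.random.default_rng(7)
X = rng.standard_normal((1000, 64))
topo = forward_model(X, X @ rng.standard_normal(64))

# max-statistic randomization test for an Az time course: shuffle condition labels,
# recompute Az at every time point, and keep only the maximum per iteration,
# correcting for multiple comparisons along time
labels = rng.integers(0, 2, 300)
scores = rng.standard_normal((300, 40)) + 0.3 * labels[:, None]   # (trials, time points)
observed = np.array([roc_auc_score(labels, scores[:, t]) for t in range(40)])

null_max = np.empty(500)
for i in range(500):
    perm = rng.permutation(labels)
    null_max[i] = max(roc_auc_score(perm, scores[:, t]) for t in range(40))

significant = observed > np.quantile(null_max, 0.99)               # P < 0.01, corrected
print("significant time points:", np.where(significant)[0])
```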

To select components of interest that reflect temporally stable and consistent activation patterns, we used k-means clustering of the correlation values between the scalp projections of the components associated with different time points. This was applied to the collective set of topographies across participants (over the range of t = 0 to +0.4 s posttarget onset; Fig. 2). Visual inspection of the clustering trees suggested that three components provide a suitable separation of the activations within this time frame, and using more than three components produced at least two that were similar in shape. Clustering resulted in continuous time intervals corresponding to three systematically different components (Fig. 2D). To study these components in further detail, we used for each participant the weight (w) from the time point within each component that provided the maximal Az (Fig. 3A).
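The component-selection step can be sketched as follows on simulated topographies, clustering their mutual correlations with k-means (k = 3); the templates, noise level, and shapes are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(8)
n_subj, n_times, n_chan = 13, 60, 64

# three "true" template topographies, each active in one third of the time windows
templates = rng.standard_normal((3, n_chan))
epoch = np.repeat([0, 1, 2], n_times // 3)
topos = np.stack([templates[epoch] + 0.5 * rng.standard_normal((n_times, n_chan))
                  for _ in range(n_subj)])          # (subjects, times, channels)

# cluster the topographies on their mutual correlations, as described in the text
flat = topos.reshape(-1, n_chan)
corr = np.corrcoef(flat)                            # (subj*times, subj*times)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(corr)

# cluster membership over time for the first participant: three contiguous epochs
print(labels.reshape(n_subj, n_times)[0])
```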

In the context of linear EEG analysis, y(t) provides an aggregate representation of the data across sensors and is assumed to be a better estimator (in terms of signal-to-noise ratio and interference) of the underlying task-relevant activity than the data on individual channels (18, 51). Here we exploited y as a measure of sensory evidence, as larger values of y (positive or negative) correspond to a better separability of the two conditions. For a given discriminating component, defined by w computed across all stimulus levels, we derived neurometric curves by calculating ROC values between stimulus conditions from the sensory evidence separately for each stimulus level. In addition, we computed the single-trial activity of each discriminating component, y(t), by applying the weights from each of the three components to all time points throughout the prestimulus and trial periods, effectively computing the time course of each discriminating component during the entire trial (Fig. 3A). This one-dimensional projection of single-trial task-relevant activity was then analyzed further.

Time-Frequency Analysis.

Complex valued time-frequency (TF) representations were obtained using wavelet analysis in FieldTrip. Frequencies ranged from 2 to 40 Hz, in steps of 1 Hz below 16 Hz and steps of 2 Hz above. The width of individual wavelets was scaled with frequency to induce stronger frequency smoothing at higher frequencies. Importantly, to rule out the potential carryover of posttarget activations into the pretarget period during TF analysis, we applied a 40-ms Hanning window to the last 40 ms before target onset, effectively setting the posttarget period to zero for TF analysis (61). TF data were calculated at time points from −0.6 to −0.1 s prestimulus in 50-ms steps. The power was z-scored within each participant and frequency band, across time and trials.
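A Python sketch of this procedure is given below; the actual analysis used FieldTrip's wavelet implementation, so the wavelet width, the pre-target tapering details, and the simulated data here are illustrative assumptions rather than the original settings.

```python
import numpy as np

def morlet_power_phase(trials, fs, freqs, n_cycles=5):
    # trials: (n_trials, n_samples). Returns power and phase, (n_trials, n_freqs, n_samples).
    n_trials, n_samp = trials.shape
    power = np.empty((n_trials, len(freqs), n_samp))
    phase = np.empty_like(power)
    for fi, f in enumerate(freqs):
        sigma_t = n_cycles / (2 * np.pi * f)            # Gaussian width scales with 1/f
        t = np.arange(-3 * sigma_t, 3 * sigma_t, 1 / fs)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t ** 2 / (2 * sigma_t ** 2))
        wavelet /= np.abs(wavelet).sum()
        for tr in range(n_trials):
            conv = np.convolve(trials[tr], wavelet, mode="same")
            power[tr, fi] = np.abs(conv) ** 2
            phase[tr, fi] = np.angle(conv)
    return power, phase

fs = 150
freqs = np.concatenate([np.arange(2, 16, 1), np.arange(16, 42, 2)])   # 2-40 Hz grid

rng = np.random.default_rng(9)
y = rng.standard_normal((100, 450))            # 3 s of component activity per trial
onset = 400                                    # target onset (sample index)

# 40-ms Hanning ramp ending at target onset, zero afterwards (no post-target carryover)
taper = np.ones(y.shape[1])
ramp = int(0.04 * fs)
taper[onset - ramp:onset] = np.hanning(2 * ramp)[ramp:]
taper[onset:] = 0.0

power, phase = morlet_power_phase(y * taper, fs, freqs)
# z-score power within each frequency across trials and time
power_z = (power - power.mean(axis=(0, 2), keepdims=True)) / power.std(axis=(0, 2), keepdims=True)
```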

Statistical Analysis.

Our hypotheses concern the relation between pretarget activity (phase/power in specific bands), sensory evidence (derived for each of the three components), and perceptual choice at the single-trial level (Fig. 3B). We tested these relations using three regression models to dissect the different dependencies of interest. Model I: logistic regression of choice on power/phase to determine the influence of pretarget activity on choice. Model II: linear regression of sensory evidence (y) on power/phase to quantify whether pretarget state influences the sensory evidence reflected by the respective EEG component. Assuming that a specific state (defined by power or phase in a specific frequency band) influences both choice and evidence, the apparent influence on choice could be mediated by an influence on evidence, which in turn drives choice; alternatively, this influence could originate from a different mechanism not involving evidence (direct and indirect pathways; Fig. 3B). Mediation analysis of regression models was used to dissociate these two possibilities (52) using a third model, model III: logistic regression of choice on sensory evidence and power/phase. Comparing models II and III allowed dissociating the two scenarios based on mediation.

All models were computed separately for each task, power, and phase, and for each of the time-frequency points of pretarget activity. For the regression of evidence on power/phase, y was coded as an unsigned variable (larger values of y reflected stronger sensory evidence regardless of stimulus condition) and was z-scored within each stimulus level. Consequently, positive regression coefficients, for example for power, indicate that higher power relates to stronger task-relevant sensory evidence. Logistic regression was implemented using regularized regression (51). For phase, we entered both the sine- and cosine-transformed phase angles into the regression model. Possible mediation effects of power (phase) on choice through sensory evidence were computed using the mediation effect, defined as the difference in regression parameters between models II and III; we adjusted regression coefficients for dichotomous outcomes (52). Mediation analysis was performed only for those time-frequency intervals exhibiting a significant effect of pretarget activity on choice, and hence where mediation effects could be expected.
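The mediation logic can be illustrated with a generic difference-in-coefficients estimate for a binary outcome, as in the sketch below; the rescaling of logistic regression coefficients for dichotomous outcomes applied in the actual analysis (ref. 52) is omitted, and the simulated data are illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def mediation_estimate(choice, regressor, evidence):
    # difference-in-coefficients estimate: the effect of the prestimulus regressor on
    # choice without (total) and with (direct) the mediator, sensory evidence
    c_total = LogisticRegression(C=1e6, max_iter=1000).fit(regressor[:, None], choice).coef_[0, 0]
    X = np.column_stack([regressor, evidence])
    c_direct = LogisticRegression(C=1e6, max_iter=1000).fit(X, choice).coef_[0, 0]
    return c_total - c_direct

# simulation: power drives evidence and evidence drives choice -> positive mediation
rng = np.random.default_rng(10)
n = 2000
power = rng.standard_normal(n)
evidence = 0.8 * power + rng.standard_normal(n)
choice = (evidence + rng.standard_normal(n) > 0).astype(int)
print("mediation estimate: %.3f" % mediation_estimate(choice, power, evidence))
```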

To test individual regression parameters for statistical significance, we implemented a two-level approach. First-level contrasts reflecting systematic effects of individual parameters were derived from single-subject data. Beta values for regression parameters were obtained (where applicable) for each time-frequency bin, whereby effects for phase were defined as the root-mean-square average of sine and cosine components. Mediation effects were similarly computed for each time-frequency bin. Group-level analysis was then implemented using a cluster-based permutation procedure that compares clusters of time-frequency bins in the actual data with a distribution of clusters in surrogate data obtained by shuffling condition labels, correcting for multiple comparisons across time-frequency bins (53); detailed parameters: 1,000 iterations, clustering bins with individual values above (below) the 95th (5th) percentiles of the surrogate data, requiring a cluster size of at least four neighbors, computing the cluster mass within each cluster, performing a two-sided test at P < 0.05 on the clustered data. For phase, this was based on the group average of the respective betas, whereas for power and evidence we entered the associated group-level t values. The significance of mediation effects was similarly derived by comparing the mediation effect in the actual data with those in the surrogate data. Because we were interested in the effects of multiple parameters and models, we additionally corrected for multiple comparisons across regression models, EEG components, and tasks using the false discovery rate (at P < 0.05). Effect sizes for cluster-based statistics are reported as the cluster mass across all bins within a cluster (Tsum). We provide exact P values where possible, but values below 10⁻⁵ are abbreviated as such.
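A compact sketch of the cluster-mass permutation statistic and the subsequent FDR correction on simulated time-frequency maps follows. Simple 4-connectivity replaces the neighbourhood and minimum-cluster-size criteria of the actual analysis, and the surrogate maps here are pure noise rather than maps from shuffled condition labels.

```python
import numpy as np
from scipy.ndimage import label
from scipy.stats import false_discovery_control   # requires SciPy >= 1.11

def cluster_masses(tmap, lo, hi):
    # sum the map within each suprathreshold cluster (4-connectivity), separately
    # for positive (> hi) and negative (< lo) excursions
    masses = []
    for mask in (tmap > hi, tmap < lo):
        labeled, n = label(mask)
        masses += [tmap[labeled == k].sum() for k in range(1, n + 1)]
    return masses

rng = np.random.default_rng(11)
observed = rng.standard_normal((27, 10))            # first-level t values: freqs x time bins
observed[5:9, 2:7] += 2.0                           # an embedded "effect"
surrogates = rng.standard_normal((1000, 27, 10))    # stand-in for shuffled-label maps

lo, hi = np.percentile(surrogates, [5, 95])         # clustering thresholds
null_max = np.empty(len(surrogates))
for i, s in enumerate(surrogates):
    null_max[i] = max((abs(m) for m in cluster_masses(s, lo, hi)), default=0.0)

obs_masses = cluster_masses(observed, lo, hi)
p_cluster = [(null_max >= abs(m)).mean() for m in obs_masses]

# FDR (Benjamini-Hochberg) across cluster P values pooled over models/components/tasks
p_fdr = false_discovery_control(p_cluster)
print(sorted(zip(p_cluster, p_fdr))[:3])
```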

Acknowledgments

We thank Marcel Schulze for help with collecting pilot data. This work was supported by the UK Biotechnology and Biological Sciences Research Council (BBSRC; BB/L027534/1). S.W.M. is supported by a studentship from the UK Economic and Social Research Council (ESRC). C.K. is supported by the European Research Council (ERC-2014-CoG; Grant 646657).

Footnotes

The authors declare no conflict of interest.

This article is a PNAS Direct Submission.

This article contains supporting information online at www.pnas.org/lookup/suppl/doi:10.1073/pnas.1524087113/-/DCSupplemental.

References

1. Jensen O, Bonnefond M, VanRullen R. An oscillatory mechanism for prioritizing salient unattended stimuli. Trends Cogn Sci. 2012;16(4):200–206. doi: 10.1016/j.tics.2012.03.002.
2. Vanrullen R, Busch NA, Drewes J, Dubois J. Ongoing EEG phase as a trial-by-trial predictor of perceptual and attentional variability. Front Psychol. 2011;2:60. doi: 10.3389/fpsyg.2011.00060.
3. Busch NA, Dubois J, VanRullen R. The phase of ongoing EEG oscillations predicts visual perception. J Neurosci. 2009;29(24):7869–7876. doi: 10.1523/JNEUROSCI.0113-09.2009.
4. Ng BS, Schroeder T, Kayser C. A precluding but not ensuring role of entrained low-frequency oscillations for auditory perception. J Neurosci. 2012;32(35):12268–12276. doi: 10.1523/JNEUROSCI.1877-12.2012.
5. Henry MJ, Herrmann B, Obleser J. Entrained neural oscillations in multiple frequency bands comodulate behavior. Proc Natl Acad Sci USA. 2014;111(41):14935–14940. doi: 10.1073/pnas.1408741111.
6. Henry MJ, Obleser J. Frequency modulation entrains slow neural oscillations and optimizes human listening behavior. Proc Natl Acad Sci USA. 2012;109(49):20095–20100. doi: 10.1073/pnas.1213390109.
7. Strauß A, Henry MJ, Scharinger M, Obleser J. Alpha phase determines successful lexical decision in noise. J Neurosci. 2015;35(7):3256–3262. doi: 10.1523/JNEUROSCI.3357-14.2015.
8. Neuling T, Rach S, Wagner S, Wolters CH, Herrmann CS. Good vibrations: Oscillatory phase shapes perception. Neuroimage. 2012;63(2):771–778. doi: 10.1016/j.neuroimage.2012.07.024.
9. Henry MJ, Herrmann B, Obleser J. Neural microstates govern perception of auditory input without rhythmic structure. J Neurosci. 2016;36(3):860–871. doi: 10.1523/JNEUROSCI.2191-15.2016.
10. Weisz N, et al. Prestimulus oscillatory power and connectivity patterns predispose conscious somatosensory perception. Proc Natl Acad Sci USA. 2014;111(4):E417–E425. doi: 10.1073/pnas.1317267111.
11. Baumgarten TJ, Schnitzler A, Lange J. Prestimulus alpha power influences tactile temporal perceptual discrimination and confidence in decisions. Cereb Cortex. 2016;26(3):891–903. doi: 10.1093/cercor/bhu247.
12. Lakatos P, et al. An oscillatory hierarchy controlling neuronal excitability and stimulus processing in the auditory cortex. J Neurophysiol. 2005;94(3):1904–1911. doi: 10.1152/jn.00263.2005.
13. Kayser C, Wilson C, Safaai H, Sakata S, Panzeri S. Rhythmic auditory cortex activity at multiple timescales shapes stimulus-response gain and background firing. J Neurosci. 2015;35(20):7750–7762. doi: 10.1523/JNEUROSCI.0268-15.2015.
14. Reig R, Zerlaut Y, Vergara R, Destexhe A, Sanchez-Vives MV. Gain modulation of synaptic inputs by network state in auditory cortex in vivo. J Neurosci. 2015;35(6):2689–2702. doi: 10.1523/JNEUROSCI.2004-14.2015.
15. Haegens S, Nácher V, Luna R, Romo R, Jensen O. α-Oscillations in the monkey sensorimotor network influence discrimination performance by rhythmical inhibition of neuronal spiking. Proc Natl Acad Sci USA. 2011;108(48):19377–19382. doi: 10.1073/pnas.1117190108.
16. Mostert P, Kok P, de Lange FP. Dissociating sensory from decision processes in human perceptual decision making. Sci Rep. 2015;5:18253. doi: 10.1038/srep18253.
17. Marti S, King JR, Dehaene S. Time-resolved decoding of two processing chains during dual-task interference. Neuron. 2015;88(6):1297–1307. doi: 10.1016/j.neuron.2015.10.040.
18. Philiastides MG, Heekeren HR, Sajda P. Human scalp potentials reflect a mixture of decision-related signals during perceptual choices. J Neurosci. 2014;34(50):16877–16889. doi: 10.1523/JNEUROSCI.3012-14.2014.
19. Arnal LH, Doelling KB, Poeppel D. Delta-beta coupled oscillations underlie temporal prediction accuracy. Cereb Cortex. 2015;25(9):3077–3085. doi: 10.1093/cercor/bhu103.
20. Stefanics G, et al. Phase entrainment of human delta oscillations can mediate the effects of expectation on reaction speed. J Neurosci. 2010;30(41):13578–13585. doi: 10.1523/JNEUROSCI.0703-10.2010.
21. Belin P, et al. The functional anatomy of sound intensity discrimination. J Neurosci. 1998;18(16):6388–6394. doi: 10.1523/JNEUROSCI.18-16-06388.1998.
22. Zatorre RJ, Evans AC, Meyer E. Neural mechanisms underlying melodic perception and memory for pitch. J Neurosci. 1994;14(4):1908–1919. doi: 10.1523/JNEUROSCI.14-04-01908.1994.
23. Giani AS, Belardinelli P, Ortiz E, Kleiner M, Noppeney U. Detecting tones in complex auditory scenes. Neuroimage. 2015;122:203–213. doi: 10.1016/j.neuroimage.2015.07.001.
24. Gerson AD, Parra LC, Sajda P. Cortical origins of response time variability during rapid discrimination of visual objects. Neuroimage. 2005;28(2):342–353. doi: 10.1016/j.neuroimage.2005.06.026.
25. McGinley MJ, David SV, McCormick DA. Cortical membrane potential signature of optimal states for sensory signal detection. Neuron. 2015;87(1):179–192. doi: 10.1016/j.neuron.2015.05.038.
26. Wyart V, de Gardelle V, Scholl J, Summerfield C. Rhythmic fluctuations in evidence accumulation during decision making in the human brain. Neuron. 2012;76(4):847–858. doi: 10.1016/j.neuron.2012.09.015.
27. Lou B, Li Y, Philiastides MG, Sajda P. Prestimulus alpha power predicts fidelity of sensory encoding in perceptual decision making. Neuroimage. 2014;87:242–251. doi: 10.1016/j.neuroimage.2013.10.041.
28. Busch NA, VanRullen R. Spontaneous EEG oscillations reveal periodic sampling of visual attention. Proc Natl Acad Sci USA. 2010;107(37):16048–16053. doi: 10.1073/pnas.1004801107.
29. Thut G, Nietzel A, Brandt SA, Pascual-Leone A. Alpha-band electroencephalographic activity over occipital cortex indexes visuospatial attention bias and predicts visual target detection. J Neurosci. 2006;26(37):9494–9502. doi: 10.1523/JNEUROSCI.0875-06.2006.
30. Schroeder CE, Lakatos P. Low-frequency neuronal oscillations as instruments of sensory selection. Trends Neurosci. 2009;32(1):9–18. doi: 10.1016/j.tins.2008.09.012.
31. Lakatos P, et al. The leading sense: Supramodal control of neurophysiological context by attention. Neuron. 2009;64(3):419–430. doi: 10.1016/j.neuron.2009.10.014.
32. van Dijk H, Schoffelen JM, Oostenveld R, Jensen O. Prestimulus oscillatory activity in the alpha band predicts visual discrimination ability. J Neurosci. 2008;28(8):1816–1823. doi: 10.1523/JNEUROSCI.1853-07.2008.
33. Romei V, et al. Spontaneous fluctuations in posterior alpha-band EEG activity reflect variability in excitability of human visual areas. Cereb Cortex. 2008;18(9):2010–2018. doi: 10.1093/cercor/bhm229.
34. Mathewson KE, Gratton G, Fabiani M, Beck DM, Ro T. To see or not to see: Prestimulus alpha phase predicts visual awareness. J Neurosci. 2009;29(9):2725–2732. doi: 10.1523/JNEUROSCI.3963-08.2009.
35. Lange J, Halacz J, van Dijk H, Kahlbrock N, Schnitzler A. Fluctuations of prestimulus oscillatory power predict subjective perception of tactile simultaneity. Cereb Cortex. 2012;22(11):2564–2574. doi: 10.1093/cercor/bhr329.
36. Hart HC, Palmer AR, Hall DA. Heschl’s gyrus is more sensitive to tone level than non-primary auditory cortex. Hear Res. 2002;171(1-2):177–190. doi: 10.1016/s0378-5955(02)00498-7.
37. Hickok G, Poeppel D. The cortical organization of speech processing. Nat Rev Neurosci. 2007;8(5):393–402. doi: 10.1038/nrn2113.
38. Du Y, et al. Rapid tuning of auditory “what” and “where” pathways by training. Cereb Cortex. 2015;25(2):496–506. doi: 10.1093/cercor/bht251.
39. Obleser J, Weisz N. Suppressed alpha oscillations predict intelligibility of speech and its acoustic details. Cereb Cortex. 2012;22(11):2466–2477. doi: 10.1093/cercor/bhr325.
40. Strauß A, Wöstmann M, Obleser J. Cortical alpha oscillations as a tool for auditory selective inhibition. Front Hum Neurosci. 2014;8:350. doi: 10.3389/fnhum.2014.00350.
41. Arnal LH, Wyart V, Giraud AL. Transitions in neural oscillations reflect prediction errors generated in audiovisual speech. Nat Neurosci. 2011;14(6):797–801. doi: 10.1038/nn.2810.
42. Bowers AL, Saltuklaroglu T, Harkrider A, Wilson M, Toner MA. Dynamic modulation of shared sensory and motor cortical rhythms mediates speech and non-speech discrimination performance. Front Psychol. 2014;5:366. doi: 10.3389/fpsyg.2014.00366.
43. Landau AN, Schreyer HM, van Pelt S, Fries P. Distributed attention is implemented through theta-rhythmic gamma modulation. Curr Biol. 2015;25(17):2332–2337. doi: 10.1016/j.cub.2015.07.048.
  • 44.Ng BSW, Logothetis NK, Kayser C. EEG phase patterns reflect the selectivity of neural firing. Cereb Cortex. 2013;23(2):389–398. doi: 10.1093/cercor/bhs031. [DOI] [PubMed] [Google Scholar]
  • 45.Kayser C, Montemurro MA, Logothetis NK, Panzeri S. Spike-phase coding boosts and stabilizes information carried by spatial and temporal spike patterns. Neuron. 2009;61(4):597–608. doi: 10.1016/j.neuron.2009.01.008. [DOI] [PubMed] [Google Scholar]
  • 46.Schroeder CE, Lakatos P, Kajikawa Y, Partan S, Puce A. Neuronal oscillations and visual amplification of speech. Trends Cogn Sci. 2008;12(3):106–113. doi: 10.1016/j.tics.2008.01.002. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 47.Peelle JE, Davis MH. Neural oscillations carry speech rhythm through to comprehension. Front Psychol. 2012;3:320. doi: 10.3389/fpsyg.2012.00320. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 48.Giraud AL, Poeppel D. Cortical oscillations and speech processing: Emerging computational principles and operations. Nat Neurosci. 2012;15(4):511–517. doi: 10.1038/nn.3063. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 49.Nienborg H, Cumming BG. Decision-related activity in sensory neurons reflects more than a neuron’s causal effect. Nature. 2009;459(7243):89–92. doi: 10.1038/nature07821. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 50.Nienborg H, Roelfsema PR. Belief states as a framework to explain extra-retinal influences in visual cortex. Curr Opin Neurobiol. 2015;32:45–52. doi: 10.1016/j.conb.2014.10.013. [DOI] [PubMed] [Google Scholar]
  • 51.Parra LC, Spence CD, Gerson AD, Sajda P. Recipes for the linear analysis of EEG. Neuroimage. 2005;28(2):326–341. doi: 10.1016/j.neuroimage.2005.05.032. [DOI] [PubMed] [Google Scholar]
  • 52.MacKinnon DP, Fairchild AJ, Fritz MS. Mediation analysis. Annu Rev Psychol. 2007;58:593–614. doi: 10.1146/annurev.psych.58.110405.085542. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 53.Maris E, Oostenveld R. Nonparametric statistical testing of EEG- and MEG-data. J Neurosci Methods. 2007;164(1):177–190. doi: 10.1016/j.jneumeth.2007.03.024. [DOI] [PubMed] [Google Scholar]
  • 54.Oostenveld R, Fries P, Maris E, Schoffelen JM. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data. Comput Intell Neurosci. 2011;2011:156869. doi: 10.1155/2011/156869. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 55.Debener S, Thorne JD, Schneider TR, Viola FC. 2010. Using ICA for the analysis of multi-channel EEG data. Simultaneous EEG and fMRI: Recording, Analysis, and Application, eds Ullsperger M, Debener S (Oxford Univ Press, Oxford), pp 121–134.
  • 56.Hipp JF, Siegel M. Dissociating neuronal gamma-band activity from cranial and ocular muscle activity in EEG. Front Hum Neurosci. 2013;7:338. doi: 10.3389/fnhum.2013.00338. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 57.O’Beirne GA, Patuzzi RB. Basic properties of the sound-evoked post-auricular muscle response (PAMR) Hear Res. 1999;138(1-2):115–132. doi: 10.1016/s0378-5955(99)00159-8. [DOI] [PubMed] [Google Scholar]
  • 58.Keren AS, Yuval-Greenberg S, Deouell LY. Saccadic spike potentials in gamma-band EEG: Characterization, detection and suppression. Neuroimage. 2010;49(3):2248–2263. doi: 10.1016/j.neuroimage.2009.10.057. [DOI] [PubMed] [Google Scholar]
  • 59.Blankertz B, Lemm S, Treder M, Haufe S, Müller KR. Single-trial analysis and classification of ERP components—A tutorial. Neuroimage. 2011;56(2):814–825. doi: 10.1016/j.neuroimage.2010.06.048. [DOI] [PubMed] [Google Scholar]
  • 60.Nichols TE, Holmes AP. Nonparametric permutation tests for functional neuroimaging: A primer with examples. Hum Brain Mapp. 2002;15(1):1–25. doi: 10.1002/hbm.1058. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 61.Henry MJ, Herrmann B. Low-frequency neural oscillations support dynamic attending in temporal context. Timing Time Percept. 2014;2(1):62–86. [Google Scholar]
  • 62.Whittingstall K, Logothetis NK. Frequency-band coupling in surface EEG reflects spiking activity in monkey visual cortex. Neuron. 2009;64(2):281–289. doi: 10.1016/j.neuron.2009.08.016. [DOI] [PubMed] [Google Scholar]
  • 63.Einevoll GT, Kayser C, Logothetis NK, Panzeri S. Modelling and analysis of local field potentials for studying the function of cortical circuits. Nat Rev Neurosci. 2013;14(11):770–785. doi: 10.1038/nrn3599. [DOI] [PubMed] [Google Scholar]
  • 64.Panzeri S, Macke JH, Gross J, Kayser C. Neural population coding: Combining insights from microscopic and mass signals. Trends Cogn Sci. 2015;19(3):162–172. doi: 10.1016/j.tics.2015.01.002. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 65.Mazzoni A, et al. Computing the local field potential (LFP) from integrate-and-fire network models. PLOS Comput Biol. 2015;11(12):e1004584. doi: 10.1371/journal.pcbi.1004584. [DOI] [PMC free article] [PubMed] [Google Scholar]
