Abstract
Perception of temporal patterns is critical for speech, movement, and music. In the auditory domain, perception of a regular pulse, or beat, within a sequence of temporal intervals is associated with basal ganglia activity. Two alternative accounts of this striatal activity are possible: “searching” for temporal regularity in early stimulus processing stages or “prediction” of the timing of future tones after the beat is found (relying on continuation of an internally generated beat). To resolve between these accounts, we used functional magnetic resonance imaging (fMRI) to investigate different stages of beat perception. Participants heard a series of beat and nonbeat (irregular) monotone sequences, with each sequence providing a temporal beat context for the sequence that followed it. Beat sequences were preceded by nonbeat sequences, requiring the beat to be found anew (“beat finding” condition), or by beat sequences with the same beat rate (“beat continuation”) or a different rate (“beat adjustment”). Detection of regularity is highest during beat finding, whereas generation and prediction are highest during beat continuation. We found the greatest striatal activity for beat continuation, less for beat adjustment, and the least for beat finding. Thus, the basal ganglia's response profile suggests a role in beat prediction, not in beat finding.
Keywords: auditory perception, basal ganglia, beat perception, fMRI, music, prediction, rhythm, timing
Introduction
To learn about and interact with our environment, we must grasp the structure of recurring patterns and make predictions about events (Friston and Kiebel 2009). For example, accurate perception of temporal patterns is crucial to hearing, speech, motor control, and music. Our sensitivity to these patterns is evident even in infancy (Hannon and Trehub 2005; Hannon and Trainor 2007; Soley and Hannon 2010). Sensitivity to some temporal regularities appears to be unique to humans, such as our responsiveness to musical rhythm, and in particular, our ability to rapidly identify the central structural component: the “beat.” The beat is the regular time interval that we may tap to and against which other time intervals in the rhythm can be measured (Large and Palmer 2002; London 2004). Beat structure is important for perception and behavior, improving temporal performance and discrimination (Drake and Gerard 1989; Ross and Houtsma 1994; Hebert and Cuddy 2002; Patel et al. 2005; Grube and Griffiths 2009) as well as performance of nontemporal task goals (Barnes and Jones 2000; Correa et al. 2005; Praamstra and Pope 2006).
At the neuronal level, the presence of beat structure increases activation in the putamen (Grahn and Brett 2007; Grahn and Rowe 2009). Impaired discrimination of beat rhythms in Parkinson's disease suggests that the basal ganglia are crucial for processing of the beat (Grahn and Brett 2009). However, 2 conflicting accounts exist for the basal ganglia's role in this previous work: they could engage either in the search for structure (“finding” the beat) or in making subsequent predictions and internally generating the beat based on the detected structure (“continuing” the beat).
These processes are readily illustrated when tapping to music. Tapping begins only after a few seconds of music have passed, when the beat has been “found.” Beat finding requires detection of salient or “accented” events (e.g., bass/drum sounds, loudness changes, or long notes; Cooper and Meyer 1960; Povel and Okkerman 1981; Palmer and Krumhansl 1990). Time between accented events gives the beat interval, which is generally identified as between 300 and 900 ms, equivalent to 67–200 beats per minute (bpm) (Parncutt 1994; van Noorden and Moelants 1999). When tapping begins, beat continuation occurs: the beat interval is continually generated to time future taps, and future beat accents are predicted accurately. Changes in the beat rate require “beat adjustment,” through beat prediction error (did I tap early?) followed by modification of subsequent predictions. Recent work suggests that prediction (indicative of the internal continuation of the beat) is associated with activation of the putamen, whereas prediction error is processed by different parts of the basal ganglia, the caudate, and ventral striatum (Haruno and Kawato 2006; Schiffer and Schubotz 2011). One important factor that may influence beat processing is musical training, either through enabling of better predictions that are honed by years of experience or through creation of a richer internal model produced by explicit knowledge of musical rules.
Here, we examine beat finding, adjustment, and continuation using functional magnetic resonance imaging (fMRI) to assess the role of the cortex and basal ganglia. Participants with different levels of musical training listened to sequentially presented beat and nonbeat sequences (see Fig. 1 for an outline of the paradigm). Beat sequences that were preceded by nonbeat sequences required beat finding, but when they were preceded by other beat sequences, they required beat continuation (if the beat rate was the same) or beat adjustment (if the beat rate changed). In addition, within the beat adjustment condition, we compared responses to slower and faster rates, because adjustment processes differ between unexpectedly early events (which elicit larger response shifts) and unexpectedly late events (Coull et al. 2000). Importantly, our design also distinguished the relative speeding and slowing of the beat rate from the absolute beat rate (e.g., 75 bpm is a faster beat rate after 65 bpm, but a slower one after 85 bpm).
Figure 1.
Schematic depictions of experimental design. Beat and nonbeat rhythms were played consecutively, with the previous rhythm determining how the subsequent rhythm was classified. Absolute rhythm rate is indexed by the smallest rhythmic interval used in the sequence (between 140 and 260 ms, in 30 ms steps). Shorter intervals result in faster rates. The type of beat processing required is indicated in the bottom row (finding of the beat, continuation of the beat at the same rate as the previous rhythm, or adjustment of the beat rate from the previous rhythm).
We predicted that putamen activation would be associated with internal beat continuation, in keeping with previous work investigating external and internal factors in beat perception. For example, Grahn and Rowe (2009) found that when strong external markers of the beat were present, putamen activity was lower than when internal generation of the beat was required. This also fits with a generic striatal model of prediction that is supported by studies of associative learning, motor control, and reward (Sardo et al. 2000; Schultz 2006; Wasson et al. 2010). Therefore, we predicted that activity should be greater for beat continuation than beat finding. Under the alternate hypothesis, that the putamen mediates the analysis of the temporal input to find the beat, activity would be higher for beat finding than beat continuation.
Materials and Methods
Participants and Stimuli
Twenty-four healthy right-handed participants (11 male, 13 female, age range 22–44 years, mean 27 years) took part after providing written informed consent. The study was approved by the Cambridge Psychology Research Ethics Committee.
Beat and nonbeat rhythmic stimuli were used. See Figure 1 for schematic depictions of the stimuli. The beat rhythms were constructed from 24 patterns based on those used in previous experiments (Grahn and Brett 2007, 2009; Grahn and Rowe 2009) that rely on the temporal context (i.e., the relative durations) to give rise to the perception of a regular beat (Povel and Okkerman 1981; Grahn and Brett 2007). See Table 1 for a list of the patterns used. The shortest interval was chosen from 140, 170, 200, 230, or 260 ms. All other intervals were integer multiples of the shortest interval. The beat rhythms were thus similar to 4/4 time Western musical rhythms, with the quarter note (fastest note) occurring at an interonset interval of 140–260 ms.
Table 1.
Beat sequences used in the study
| 22314 | 22431 | 31422 | 43122 | 112314 | 112422 |
| 112431 | 211224 | 211314 | 211431 | 222114 | 411231 |
| 422112 | 1111431 | 1122114 | 1123122 | 2112231 | 2113122 |
| 2211114 | 2211231 | 3122112 | 3141111 | 4111131 | 4221111 |
Note: 1 = 140, 170, 200, 230, or 260 ms. All other intervals in the sequence are integer multiples of the length chosen for the “1” interval.
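To make the construction concrete, the short sketch below (illustrative only, not the authors' stimulus-generation code; the function name expand_pattern is hypothetical) expands a Table 1 pattern into interonset intervals given one of the five base durations.

```python
# Illustrative sketch only: expand a Table 1 pattern into interonset
# intervals, given a base duration for the "1" unit.
BASE_INTERVALS_MS = (140, 170, 200, 230, 260)  # possible lengths of the "1" unit

def expand_pattern(pattern: str, base_ms: int) -> list[int]:
    """Convert a pattern such as '211314' into interval lengths in ms."""
    return [int(digit) * base_ms for digit in pattern]

print(expand_pattern("211314", 140))  # [280, 140, 140, 420, 140, 560]
```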
For each beat rhythm, a nonbeat rhythm was created by jittering the lengths of the intervals in the beat sequence. In each sequence, 1/3 of the intervals were increased in length by 80 ms, 1/3 were decreased by 80 ms, and 1/3 were kept the same length. Thus, the number of intervals, overall rhythm length, and RMS intensity were equivalent between the beat and the nonbeat conditions.
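A minimal sketch of that jittering step follows (again illustrative; the text does not specify how intervals were assigned to the lengthened, shortened, and unchanged thirds, so the random assignment here is an assumption).

```python
import random

def make_nonbeat(intervals_ms: list[int], jitter_ms: int = 80) -> list[int]:
    """Jitter a beat sequence: roughly 1/3 of the intervals are lengthened by
    jitter_ms, 1/3 shortened, and the rest left unchanged, so the number of
    intervals and the overall sequence length are preserved."""
    n = len(intervals_ms)
    shifts = [jitter_ms] * (n // 3) + [-jitter_ms] * (n // 3)
    shifts += [0] * (n - len(shifts))      # remainder left unchanged
    random.shuffle(shifts)                 # assumption: random assignment
    return [iv + s for iv, s in zip(intervals_ms, shifts)]
```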
The sequences were 1.82–3.38 s long (mean = 2.6 s). For all conditions, 500 Hz sine tones (rise/fall times of 8 ms) sounded for the duration of each interval, ending 40 ms before the specified interval length to create a silent gap that demarcated the intervals. The sequences used filled intervals, which provide the benefit of attenuation of environmental noise (such as that experienced during MRI). Each beat pattern was used from 5 to 10 times for each session (the exact number for each sequence varied as rhythms were randomly selected for each participant), for a total of 180 beat trials per session. Each nonbeat sequence was used between 2 and 3 times per session (randomly selected for each participant), for a total of 70 nonbeat trials per session. Sixty null trials (blank screens) were also interspersed through the session, experienced by the participants as extended intertrial intervals.
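The tone generation can be sketched as follows (illustrative; the sample rate and the linear shape of the 8 ms ramps are assumptions not stated in the text).

```python
import numpy as np

SR = 44100  # assumed sample rate (not specified in the text)

def tone(duration_ms: float, freq: float = 500.0, ramp_ms: float = 8.0) -> np.ndarray:
    """A 500 Hz sine tone with 8 ms onset/offset ramps (linear ramps assumed)."""
    n = int(SR * duration_ms / 1000)
    t = np.arange(n) / SR
    y = np.sin(2 * np.pi * freq * t)
    r = int(SR * ramp_ms / 1000)
    env = np.ones(n)
    env[:r] = np.linspace(0.0, 1.0, r)
    env[n - r:] = np.linspace(1.0, 0.0, r)
    return y * env

def render_sequence(intervals_ms: list[int], gap_ms: float = 40.0) -> np.ndarray:
    """Fill each interval with a tone that ends 40 ms early, leaving a silent
    gap that demarcates consecutive intervals."""
    chunks = []
    for iv in intervals_ms:
        chunks.append(tone(iv - gap_ms))
        chunks.append(np.zeros(int(SR * gap_ms / 1000)))
    return np.concatenate(chunks)
```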
The order of the trials was chosen such that beat rhythms could be divided into different conditions, based on the type of rhythm that had preceded them. Each sequence belonged to 1 of 6 conditions. 1) The “nonbeat” condition: the irregular sequences described above in which no regular beat was present. 2) The “new” condition: beat sequences that were immediately preceded by a nonbeat sequence. The new condition maximally engaged beat finding, as the beat had to be found anew. 3) and 4) These were the beat continuation conditions: “same rate” and “same rate + rhythm.” These were beat sequences that were preceded by another beat sequence with the same beat rate, thus involving beat continuation, with no beat finding or beat adjustment. Half of the beat continuation sequences were in the same rate condition and consisted of a different rhythmic pattern carried out at the same beat rate. The other half of the beat continuation sequences were the same rhythmic pattern carried out at the same rate (same rate + rhythm). In both cases, successful beat prediction is occurring, but in the latter case, the prediction accuracy is maximal as not only can the beat rate be predicted but also the timing of all other tone onsets in the sequence. 5) and 6) Beat adjustment conditions: “slower” and “faster.” These conditions consisted of beat sequences that were preceded by beat sequences at different rates. The second sequence could be at a slower rate than the first (slower) or at a faster rate (faster). An example of a short section of stimulus presentation is available at http://www.jessicagrahn.com/grahnrowe2012.html.
fMRI Experimental Design
Before scanning, participants familiarized themselves with the stimuli and task using a randomly selected sample of the rhythms. They listened to the rhythms and were told that during scanning, they would need to respond to probes that asked them to rate how much they felt a beat in the most recent rhythm that had been played. The button box in the scanner had 4 buttons, so the rating scale ranged from 1 to 4.
Rhythms were presented diotically over headphones with 30 dB attenuation of scanner noise by insert earplugs (3M 1100 earplugs; 3M United Kingdom PLC, Bracknell, UK). Although some studies with auditory stimuli have used “sparse” imaging (Hall et al. 1999; Gaab et al. 2002), the continuous nature of the rhythm presentation made it unfeasible to present all stimuli in intermittent silent periods. In addition, sparse sequences can cause a significant reduction in statistical power relative to continuous imaging (Peelle et al. 2010). Finally, previous studies using these stimuli have been successfully conducted using continuous imaging (Grahn and Brett 2007; Grahn and Rowe 2009). In the current study, none of the participants reported difficulty in hearing the rhythms or focusing on the task. Head fixation used foam pads.
To ensure participants' attention to the rhythms, a probe screen occasionally appeared with the question “How much did the most recent rhythm have a beat?” Participants rated the most recent rhythm by pressing 1 of 4 buttons on the button box. The words “beat” and “no beat” were listed above the numbers 1–4. On half the trials, beat was on the left side and no beat on the right; on the other half, no beat was on the left and beat on the right. Participants therefore could not know ahead of time whether button 1 would indicate a strong beat rating or a no beat rating, and had to wait for the appearance of the screen to prepare any response. As the main function of the task was simply to ensure attention to the rhythms, the probe occurred only 19 times throughout each session.
Image Acquisition and Analysis
A 3-T Siemens Tim Trio MRI scanner was used to collect 2 runs with 560 echo-planar imaging (EPI) volumes in each. All EPI data had 36 slices, matrix size of 64 × 64, echo time (TE) = 30 ms, repetition time (TR) = 2.19 s, field of view = 19.2 × 19.2 cm, flip angle = 78°, slice thickness 3 mm, interslice distance of 0.75 mm, and in-plane resolution of 3 × 3 mm. High-resolution magnetization prepared rapid gradient echo (MP-RAGE) anatomical images (TR = 2250 ms, TE = 2.99 ms, flip angle = 9°, TI = 900 ms, 256 × 256 × 192 isotropic 1 mm voxels) were collected for anatomical localization and coregistration.
SPM5 was used for data analysis (SPM5; Wellcome Department of Imaging Neuroscience, London, UK). The first 5 EPI volumes of each run were discarded to allow for T1 equilibration. Images were sinc-interpolated in time to correct for acquisition time differences and realigned spatially with respect to the first image of the first run using trilinear interpolation. The coregistered MP-RAGE image was segmented and normalized using affine and smoothly nonlinear transformations to the T1 template in Montreal Neurological Institute space. The normalization parameters were then applied to the EPIs, and all normalized EPI images were spatially smoothed with a Gaussian kernel of full-width half-maximum of 8 mm.
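Preprocessing was performed in SPM5; purely as an illustration of the final step (not the SPM implementation), the sketch below shows how an 8 mm FWHM Gaussian kernel translates into a per-axis sigma (FWHM = sigma × 2√(2 ln 2)) when smoothing a normalized image with generic Python tools.

```python
import numpy as np
import nibabel as nib
from scipy.ndimage import gaussian_filter

def smooth_fwhm(in_path: str, out_path: str, fwhm_mm: float = 8.0) -> None:
    """Smooth an image with an isotropic Gaussian kernel, converting the
    FWHM in mm to a sigma in voxels for each spatial axis."""
    img = nib.load(in_path)
    data = img.get_fdata()
    vox_mm = img.header.get_zooms()[:3]                        # voxel size in mm
    sigma = [fwhm_mm / (2.0 * np.sqrt(2.0 * np.log(2.0))) / v for v in vox_mm]
    if data.ndim == 4:
        sigma = sigma + [0.0]                                  # never smooth over time
    smoothed = gaussian_filter(data, sigma=sigma)
    nib.save(nib.Nifti1Image(smoothed, img.affine, img.header), out_path)
```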
Subject-specific first level models included epochs representing the 6 conditions (with epoch duration equivalent to the length of the rhythm) and a transient event for the button press, convolved by the canonical hemodynamic response function. EPI volumes associated with discrete artifacts were included as covariates of no interest (nulling regressors). This included volume displacements >4 mm or spikes of high variance in which scaled volume to volume variance was 4 times greater than the mean variance of the run. Autocorrelations were modeled using a first-order autocorrelation process, and low-frequency noise was removed with a standard high-pass filter of 128 s.
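The logic of these nulling regressors can be sketched as follows (the exact displacement and scaled-variance metrics computed by the authors' procedure are not given, so the calculations below are assumptions).

```python
import numpy as np

def artifact_flags(displacement_mm: np.ndarray, volumes: np.ndarray,
                   disp_thresh: float = 4.0, var_factor: float = 4.0) -> np.ndarray:
    """Flag volumes with displacement > 4 mm or whose volume-to-volume
    variance exceeds 4x the mean variance of the run (assumed metrics)."""
    diffs = np.diff(volumes, axis=0)                           # (T-1, x, y, z)
    v2v_var = diffs.reshape(diffs.shape[0], -1).var(axis=1)
    v2v_var = np.concatenate([[0.0], v2v_var])                 # first volume has no predecessor
    spikes = v2v_var > var_factor * v2v_var.mean()
    return spikes | (displacement_mm > disp_thresh)

def nulling_regressors(flags: np.ndarray) -> np.ndarray:
    """One covariate of no interest per flagged volume: a column that is 1
    at that volume and 0 elsewhere, appended to the design matrix."""
    idx = np.flatnonzero(flags)
    regs = np.zeros((flags.size, idx.size))
    regs[idx, np.arange(idx.size)] = 1.0
    return regs
```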
The contrast images estimated from single-participant models were entered into second-level random effects analyses for group inference (Penny and Holmes 2003). It is important to note that the unequal trial numbers between conditions do not affect the level of activity observed in statistical contrasts. Correlations with musical training were conducted by assigning each subject to 1 of 3 levels of musical training: little to none (less than 1 year of formal training), moderate (between 1 and 6 years of training), and advanced (greater than 6 years of training). All whole-brain analyses were thresholded at P < 0.05, whole-brain FDR corrected.
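Conceptually, the second-level random effects analysis amounts to a voxelwise one-sample t-test on the subjects' contrast images; the sketch below is a minimal illustration of that logic and of the three-level training coding, not SPM's actual implementation.

```python
import numpy as np
from scipy import stats

def random_effects_ttest(contrast_maps: np.ndarray):
    """Voxelwise one-sample t-test across subjects' first-level contrast
    estimates (array of shape subjects x voxels); FDR thresholding at
    P < 0.05 would then be applied to the resulting map."""
    return stats.ttest_1samp(contrast_maps, popmean=0.0, axis=0)

def training_level(years: float) -> int:
    """Code musical training as 1 of 3 levels: <1 year, 1-6 years, >6 years."""
    if years < 1:
        return 1
    return 2 if years <= 6 else 3
```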
Results
Behavioral Results
The beat ratings obtained during scanning are illustrated in Figure 2. The time stamp of the button presses was recorded for all participants, although the identity of the button that was pressed was only obtained for 16 participants. A repeated measures analysis of variance on beat ratings indicated a significant effect of condition (F(5,75) = 41.66, P < 0.001).
Figure 2.
Behavioral ratings. The average beat rating given to each condition when probed during fMRI scanning. Rating scale is from 1 (no beat) to 4 (beat).
Post hoc paired t-tests indicated that the nonbeat condition was rated significantly lower than all beat conditions (new: t(15) = 10.0, P < 0.001; faster: t(15) = 10.0, P < 0.001; slower: t(15) = 7.2, P < 0.001; same rate: t(15) = 10.6, P < 0.001; same rate + rhythm: t(15) = 7.4, P < 0.001). The faster rhythm condition was rated significantly lower than the new (t(15) = 3.0, P < 0.01) and same rate conditions (t(15) = 3.3, P < 0.01). The slower condition was rated significantly lower than the same rate condition (t(15) = 2.4, P < 0.05) and marginally significantly lower than the new condition (t(15) = 2.1, P = 0.051). No other significant rating differences were obtained.
Regional Changes in Brain Activation
A main effect of Beat (beat rhythms–nonbeat rhythms) was found in the putamen bilaterally (see Table 2 and Fig. 3), replicating 2 previous studies (Grahn and Brett 2007; Grahn and Rowe 2009). Activity was also seen in the supplementary motor area (SMA), cingulate gyrus, medial orbitofrontal cortex, premotor cortex, and left temporoparietal junction. Peaks from the reverse contrast (nonbeat–beat) are reported in Table 3 and include the left cerebellum, bilateral superior temporal gyri, right parietal cortex, and right inferior frontal operculum.
Table 2.
Stereotaxic locations of peak voxels in beat rhythms–nonbeat rhythms contrast
| Brain area | BA | Cluster* | t | P (FDR) | x | y | z |
| L superior frontal gyrus | BA 32 | Cluster 8 | 3.53 | 0.028 | −18 | 36 | 42 |
| L medial superior frontal gyrus | BA 10 | Cluster 2 | 4.85 | 0.035 | −6 | 57 | 3 |
| L medial orbitofrontal cortex | BA 10/11 | Cluster 2 | 4.14 | 0.007 | −6 | 45 | −9 |
| R medial orbitofrontal cortex | BA 11 | Cluster 2 | 3.39 | 0.013 | 6 | 30 | −12 |
| L anterior cingulate | BA 24 | Cluster 1 | 4.15 | 0.013 | −6 | 0 | 45 |
| L SMA | BA 6 | Cluster 1 | 3.88 | 0.007 | −6 | −9 | 60 |
| R SMA | BA 6 | Cluster 1 | 4.66 | 0.017 | 9 | −6 | 60 |
| R SMA | BA 6 | Cluster 1 | 4.22 | 0.03 | 9 | 0 | 48 |
| L premotor cortex | BA 6 | Cluster 6 | 4.71 | 0.007 | −36 | −15 | 51 |
| L middle temporal gyrus | BA 39 | Cluster 5 | 3.94 | 0.011 | −45 | −63 | 24 |
| L precuneus | BA 30 | Cluster 7 | 3.46 | 0.015 | −6 | −57 | 15 |
| R middle cingulate gyrus | BA 31 | Cluster 1 | 3.4 | 0.018 | 9 | −9 | 48 |
| L middle cingulate gyrus | BA 23 | Cluster 1 | 4.31 | 0.035 | −12 | −48 | 39 |
| L middle cingulate gyrus | BA 23 | Cluster 1 | 3.92 | 0.007 | −12 | −27 | 39 |
| L posterior cingulate gyrus | BA 23 | Cluster 1 | 4.03 | 0.031 | −9 | −54 | 36 |
| L putamen | | Cluster 3 | 5.05 | 0.007 | −27 | −6 | −3 |
| L putamen | | Cluster 3 | 4.71 | 0.007 | −21 | 3 | −9 |
| R putamen | | Cluster 4 | 4.66 | 0.007 | 27 | 3 | −6 |
| R putamen | | Cluster 4 | 4.14 | 0.013 | 30 | −9 | 3 |
Note: This table shows the brain region, t values, and stereotaxic coordinates (in mm) of peak voxels (P < 0.05 whole-brain FDR corrected) in Montreal Neurological Institute space. *Cluster volumes: 1 = 294 voxels, 2 = 193 voxels, 3 = 190 voxels, 4 = 109 voxels, 5 = 78 voxels, 6 = 73 voxels, 7 = 42 voxels, and 8 = 21 voxels.
Figure 3.

SPM analyses. The beat versus nonbeat activation contrast overlaid on a template brain, thresholded at P < 0.05 (FDR corrected). Z refers to the level of the axial slice shown in stereotaxic Montreal Neurological Institute space.
Table 3.
Stereotaxic locations of peak voxels in the nonbeat rhythms–beat rhythms contrast
| Brain area | Cluster* | t | P (FDR) | x | y | z |
| R inferior frontal, p. triangularis | Cluster 5 | 3.8 | 0.01 | 45 | 24 | 6 |
| R inferior frontal operculum | Cluster 5 | 3.21 | 0.032 | 54 | 12 | 24 |
| R inferior frontal operculum | Cluster 5 | 4.07 | 0.007 | 48 | 15 | 12 |
| R middle frontal gyrus | Cluster 7 | 3.91 | 0.009 | 42 | 48 | 12 |
| R middle frontal gyrus | Cluster 7 | 3.54 | 0.017 | 33 | 60 | 21 |
| Medial superior frontal gyrus | Cluster 6 | 3.85 | 0.01 | 6 | 27 | 39 |
| Supplementary motor area | Cluster 6 | 4.3 | 0.005 | 6 | 9 | 63 |
| R premotor cortex | Cluster 10 | 3.42 | 0.021 | 51 | 6 | 48 |
| L anterior insula | Cluster 8 | 3.46 | 0.02 | −33 | 21 | 3 |
| R anterior insula | Cluster 5 | 4.03 | 0.007 | 36 | 21 | 0 |
| R precuneus | Cluster 9 | 3.23 | 0.031 | 9 | −66 | 54 |
| L superior temporal gyrus | Cluster 4 | 4.67 | 0.003 | −66 | −36 | 9 |
| R superior temporal gyrus | Cluster 2 | 5.88 | 0.001 | 57 | −21 | 0 |
| R superior temporal gyrus | Cluster 2 | 4.71 | 0.003 | 48 | −30 | 0 |
| R inferior parietal | Cluster 3 | 4.13 | 0.006 | 42 | −48 | 39 |
| L cerebellum, Crus 1 | Cluster 1 | 5.21 | 0.001 | −36 | −66 | −27 |
| L cerebellum, Crus 1 | Cluster 1 | 4.78 | 0.002 | −9 | −78 | −24 |
| L cerebellum, Crus 1 | Cluster 1 | 4.56 | 0.004 | −18 | −69 | −27 |
| L cerebellum, VIII lobule | Cluster 1 | 4.43 | 0.004 | −33 | −63 | −51 |
Note: This table shows the brain region, t values, and stereotaxic coordinates (in mm) of peak voxels (P < 0.05 whole-brain FDR corrected) in Montreal Neurological Institute space. *Cluster volumes: 1 = 603 voxels, 2 = 453 voxels, 3 = 201 voxels, 4 = 198 voxels, 5 = 191 voxels, 6 = 132 voxels, 7 = 82 voxels, 8 = 26 voxels, 9 = 12 voxels, and 10 = 10 voxels.
To compare putamen activity among the different beat conditions and to determine whether beat finding, adjustment, or continuation differentially affected putamen activity, a series of t-tests was conducted. First, 2 regions of interest (ROIs) were created from putamen peaks defined from the beat–nonbeat contrast (5 mm radius spheres: left putamen peak coordinate: x = −21, y = 3, z = −9; right putamen peak coordinate: x = 30, y = −9, z = 3). The mean signal for each beat condition was extracted from the ROIs, and the different conditions were compared in a series of paired t-tests. Note that the contrast, which defined the putamen ROIs (beat–nonbeat), was orthogonal to the subsequent statistical comparisons (between different beat conditions).
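A sketch of this ROI analysis is given below (illustrative only; the file names are hypothetical, and the actual extraction was performed on SPM contrast estimates).

```python
import numpy as np
import nibabel as nib
from scipy import stats

def sphere_mask(img: nib.Nifti1Image, center_mm, radius_mm: float = 5.0) -> np.ndarray:
    """Boolean mask of voxels lying within radius_mm of an MNI coordinate."""
    shape = img.shape[:3]
    ijk = np.indices(shape).reshape(3, -1).T                   # voxel indices
    xyz = nib.affines.apply_affine(img.affine, ijk)            # voxel -> mm
    dist = np.linalg.norm(xyz - np.asarray(center_mm), axis=1)
    return (dist <= radius_mm).reshape(shape)

# Hypothetical usage: one mean contrast value per subject within each ROI,
# then paired t-tests between beat conditions (file names are invented).
left_peak, right_peak = (-21, 3, -9), (30, -9, 3)
# img = nib.load("con_new_vs_nonbeat_sub01.nii")
# roi_mean = img.get_fdata()[sphere_mask(img, left_peak)].mean()
# ...collect one value per subject and condition, then:
# t, p = stats.ttest_rel(new_vals, same_rhythm_vals)
```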
The findings are consistent with a role for the putamen in beat prediction but not in beat finding, with beat adjustment falling between these 2 conditions (see Fig. 4). In general, putamen activity was highest for the same rate + rhythm condition and lowest for the new condition. Specifically, for the left putamen, activity in the new condition was significantly lower than in the same rate and same rhythm conditions and marginally significantly lower than in the slower condition (new vs. same rate: t(23) = 3.21, P = 0.004; new vs. same rhythm: t(23) = 4.65, P < 0.001; new vs. slower: t(23) = 1.94, P = 0.065). Activity in the faster, slower, and same rate conditions was significantly lower than in the same rhythm condition (faster vs. same rhythm: t(23) = 2.24, P = 0.035; slower vs. same rhythm: t(23) = 3.23, P = 0.004; same rate vs. same rhythm: t(23) = 2.36, P = 0.027). For the right putamen, activity in the new condition was significantly lower than in the slower, same rate, and same rhythm conditions (new vs. slower: t(23) = 2.34, P = 0.028; new vs. same rate: t(23) = 3.37, P = 0.003; new vs. same rhythm: t(23) = 4.31, P < 0.001). Activity in the faster condition was significantly lower than in the same rhythm condition (faster vs. same rhythm: t(23) = 2.1, P = 0.046).
Figure 4.
Mean activation from left and right putamen ROIs for each beat condition (relative to the nonbeat control condition). A positive value means greater activity for that particular beat condition compared with the nonbeat condition.
Whole-brain contrasts were also carried out to compare beat continuation (same rate) with beat adjustment (speeding up and slowing down) and also to compare between the different types of beat adjustment (speeding up vs. slowing down). The “same rate–slower” and “slower–same rate” contrasts produced no significant differences at whole-brain levels of correction. “Same rate–faster,” however, activated left frontal cortex (Brodmann area [BA] 9), medial precuneus and prefrontal cortex, bilateral angular gyri, bilateral posterior putamen, and right hippocampus (for a full list of coordinates, see Supplementary Table 1). “Faster–same rate” showed right inferior frontal (BA 44), left SMA, and bilateral anterior insula (see Supplementary Table 2). In addition, the 2 beat adjustment conditions were compared. The “faster–slower” contrast showed no areas with significant activity. The “slower–faster” contrast showed left frontal cortex (BA 9), right ventrolateral prefrontal cortex (BA 47), bilateral cerebellum, posterior putamen, and hippocampal activity (for a full list of coordinates, see Supplementary Table 3). Despite the apparent overlap in areas observed for same rate–faster and slower–faster, a conjunction analysis of same rate–faster and slower–faster did not show any significant overlap.
Effects of Musical Training
Finally, a significant positive correlation with musical training was found in the left superior temporal gyrus (x = −66, y = −33, z = 3, P = 0.001, FDR corrected). Negative correlations with musical training were found in left premotor cortex, bilateral superior temporal gyri, superior and middle occipital gyri, fusiform gyri, and lingual gyri (see Supplementary Table 4). To examine whether musical training interacted with differences between beat and nonbeat conditions in the putamen, additional analyses were conducted. In whole-brain analyses, no correlations were found with musical training in the putamen, even with a very liberal statistical threshold (P < 0.05 uncorrected). The activity in the putamen ROIs was subjected to a correlation analysis with musical training and to a series of t-tests comparing putamen activity between musicians and nonmusicians for each rhythm condition. No significant correlations with training or differences between groups were found.
Discussion
The results support our principal hypothesis: the putamen is associated with the prediction of beat timing within sequences of auditory stimuli. Putamen activity was greater for beat continuation than beat finding, across all levels of musical experience. Before considering these imaging results in detail, we first discuss the behavioral results because of their importance for the interpretation of the imaging data.
Behavioral Findings
During scanning, participants rated all beat rhythms significantly higher than nonbeat rhythms on the beat perception scale. Ratings of the beat were similar across faster, slower, new, and same beat rates. In particular, the crucial conditions for the fMRI comparisons, the new and same rate conditions, were not rated significantly differently. Therefore, any measured neural differences between the new and same conditions cannot be attributed to differences in the level of beat perception.
The behavioral findings are also consistent with everyday experience: it only takes a couple of seconds to feel the beat. Thus, even when listening to a new rhythm, a beat is easily perceived by the end of the sequence, which is when the rating is made. Minor fluctuations in beat rate are well tolerated, as evidenced by similar ratings for changing rates and the same rate. This is consistent with previous work using musical stimuli. In music, the temporal rate of events often slows down or speeds up as part of expressive performance, but people still readily perceive a beat when this occurs (Large et al. 2002) and are still able to synchronize their movement to the music (Drake et al. 2000). Taken together, the current and previous work indicates that beat perception is based on moderately flexible timekeeping mechanisms. Interestingly, timekeeping in these situations is not entirely determined by stimulus characteristics: there are also exposure-related or cultural influences (Drake and Ben El Heni 2003); for example, individuals choose a faster beat rate when tapping along to culturally unfamiliar music than to culturally familiar music.
fMRI Findings
The Putamen's Role in Beat Processing
Beat rhythms compared with nonbeat rhythms significantly activated the putamen bilaterally. This finding, together with prior studies, demonstrates that the putamen responds to beat perception during rhythm discrimination, pitch deviant detection in rhythmic stimuli, and beat-rating tasks (Grahn and Brett 2007; Grahn and Rowe 2009). Thus, the putamen responds to the beat in a range of perceptual contexts and even when the rhythm is unrelated to the task (as in pitch deviant discrimination). In the visual domain, there is behavioral evidence that processing and prediction of temporal regularity cannot be suppressed, even when participants are specifically instructed to ignore it (Rohenkohl et al. 2011). The fact that putamen activation is consistently observed for beat sequences regardless of whether rhythmic aspects of the stimuli are task relevant may indicate that beat perception is another type of obligatorily encoded stimulus regularity.
Previous observations of putamen activity during beat perception could have been explained by 2 alternate accounts—a role in the search for structure (beat finding) or in predictions based on internal generation of the beat (beat continuation). The current results clearly support the latter: The putamen responded significantly more to conditions associated with beat continuation than beat finding. Indeed, the putamen activity during beat finding (the new condition) was not significantly different from the condition in which no beat existed (the nonbeat condition).
Activation in the basal ganglia did not parallel the behavioral ratings of beat presence, suggesting that basal ganglia activity is not necessarily correlated with the “strength” of beat perception but rather with having established a predictable sense of the beat. Participants rated beat presence highest for the 2 same conditions and the new condition, with the nonbeat condition rated very low. Activation in the putamen, however, was highest for the 2 same conditions, with the new condition extremely low. The pattern of activation suggests that the critical difference between the conditions is whether internal prediction or generation of the beat can be accomplished. Beat prediction is most difficult in the new condition and easiest in the same rhythm condition, suggesting that the basal ganglia activity corresponds to degree of predictability.
The function of the putamen that we identify is consistent with the generalized framework of predictive coding through neuronal generative models. Accurate prediction is a key factor in this process (Friston 2010). In this context, the striatum has been shown to be important for predictions that vary from reward (Schultz 2006) to temporal precision of motor responses (Praamstra and Pope 2007). In particular, the caudate and ventral striatum appear to be sensitive to violations of prediction (prediction error), whereas the putamen encodes the prediction (Haruno and Kawato 2006; Schiffer and Schubotz 2011). The prediction in our study reflects the continuation or maintenance of an internal model of the beat. The proposed role for the putamen in prediction (rather than prediction error) is confirmed and extended here by the fact that a direct repetition of a rhythm at the same beat rate elicited significantly greater activity than repetition of only the beat rate (with a different rhythmic pattern). In a repeated rhythm, accurate predictions can be made not only about the timing of the beat but also about the timing of the individual durations that comprise the rhythm.
The behavioral importance of internal beat models is seen in the tolerance of “counterevidence,” such as syncopation, in which synchrony to the beat is maintained despite sound events occurring “off” the beat (Snyder and Krumhansl 2001). We believe that the “internal” aspect of the prediction is important for inducing putamen activity; stimuli that require internal generation of the beat produce higher levels of putamen activity than stimuli with strong external markers of the beat (Grahn and Rowe 2009). In addition, Parkinson's patients, with abnormal putamen activity, also show deficits in discrimination of changes in rhythms that have a beat (Grahn and Brett 2009). However, they do not report problems with perceiving or finding the beat when listening to music. Therefore, the deficit may not be in perceiving the beat during initial presentations of the rhythm but rather in the ability to internally generate the model of the rhythm during the discrimination phase of the task. For nonbeat rhythms, accurate internal models of the rhythm are very difficult to generate in just 1 or 2 hearings, even for controls. Thus, when neither patients nor controls can rely on a strong internal model, patients are no longer significantly impaired relative to controls.
The basal ganglia have been shown in other work to be active during predictive imagery of learned auditory sequences compared with novel auditory sequences (Leaver et al. 2009). In addition, hearing familiar relative to randomized music, or even familiar relative to unfamiliar music, activates the putamen (Peretz et al. 2009). Temporal predictions have important behavioral consequences: they can enhance the speed of perceptual organization of the sequence and reduce working memory load. There is also evidence that attention is increased at time points coinciding with expectancy of the beat (Jones and Boltz 1989; Jones and Pfordresher 1997), facilitating processing of stimulus features occurring at that time (Jones et al. 2002).
An alternative explanation for greater putamen activity is that interval durations are encoded more deeply or accurately in the same rate and same rate + rhythm conditions. Coull and Nobre (2008) found that activity in the left putamen during temporal encoding correlated with accuracy in a temporal discrimination task. On the other hand, previous work with the rhythms used in the current study on a task that equated difficulty between beat and nonbeat rhythms (Grahn and Brett 2007) still found greater basal ganglia activity for beat rhythms, even though accuracy of the discrimination was not different. Thus, accuracy seems unlikely to be sufficient to explain basal ganglia activity.
Other Brain Areas
In addition to putamen responses, our whole-brain analysis showed that the supplementary motor area and left premotor cortex were more active for beat than nonbeat rhythms. The supplementary motor area is strongly interconnected with the basal ganglia and has been shown to be active when listening to beat rhythms compared with nonbeat rhythms in a previous fMRI study (Grahn and Brett 2007). Premotor cortex is implicated in studies of rhythm and timing. For example, Schubotz (2007) has proposed that the ventral premotor cortex (vPMC) plays a crucial role in temporal prediction, much like that proposed here for the basal ganglia. It is suggested that the premotor representation of the mouth area is also active during temporal prediction, as a result of a “body map.” The idea is that the brain “represents not only imagined movement of one's own body but also (current or expected) sensory features of the environment in reference to one's own body” (Schubotz and von Cramon 2003). In Schubotz's work, however, vPMC activation is present when attention is directed to prediction, but not when temporal regularity is present but unattended (Schubotz et al. 2000). In contrast, basal ganglia activity is observed even when participants are not directed to attend to the beat (Grahn and Rowe 2009). However, connectivity between putamen and premotor cortex has been shown to increase during beat rhythms, and the 2 structures may work in tandem. For example, the basal ganglia may be involved in more implicit, or automatic, predictions of regularity rather than task-based attention to prediction, whereas the explicit task may induce participants to rely on motor representations or imagery with premotor cortex activation.
In contrast, nonbeat rhythms (vs. beat rhythms) significantly activated the left cerebellum, right parietal cortex, right inferior frontal operculum, and bilateral superior temporal gyri. These nonbeat rhythms have no temporal regularity, and each interval length must be encoded individually, as opposed to beat rhythms in which all intervals can be encoded relative to the beat interval. Therefore, nonbeat rhythms make considerably more demands on absolute nonrelative timing mechanisms than beat rhythms do. The cerebellar activation for nonbeat rhythms is consistent with a role in absolute timing (Grube et al. 2010; Teki et al. 2011). However, in our task, attentional demands may have differed between beat and nonbeat rhythms. For beat sequences, once the beat is found, the response to the rating task may be chosen. For nonbeat sequences, however, participants may feel that a beat structure could emerge later in the sequence and therefore maintain greater attention throughout the sequence. It is not possible to distinguish whether the nonbeat minus beat activations, particularly in inferior frontal and parietal cortices, are the result of timing-related differences, or the greater attentional demands potentially required by nonbeat sequences.
Addressing Potential Confounds
There are several potential confounding variables that we addressed in this study. Familiarity can be difficult to dissociate from beat perception, not least because it may be unclear where the line is between increasing familiarity with the beat and formation of an internal representation of the beat. Over the course of the experiment, the general structure of the rhythms would have become very familiar, as all beat rhythms were constructed from the same 6 “chunks,” or subpatterns of intervals (as shown in Table 1, all rhythms are composed of 3 chunks of 4 units each). To assess familiarity-related activity changes, we compared basal ganglia activity in the second session relative to the first session (data not shown). If greater familiarity increased basal ganglia activity, activation in the second session should have been correspondingly greater. No second session increases were observed, indicating that greater familiarity is not an explanation for the greater activity in the same conditions relative to the new condition.
The absolute rate is also a likely determinant of cortical and subcortical activation (Riecker et al. 2003, 2006). In our study, absolute rate was deconfounded from the “experienced,” or contextual, rate (faster, slower, or same) by the experimental design. Therefore, the differences between maintaining the same rate and modulating to different rates could be compared without confounding differences in absolute timing. The fact that activations to the slower and same rate conditions did not significantly differ, but the faster and same rate conditions did differ, suggests there were no generic “mismatch” or rate change–related activations. Instead, speeding up and slowing down elicited different neural responses. This is consistent with behavioral research showing that early violations of temporal prediction are more disruptive than those that are late (McAuley and Jones 2003) and are associated with different patterns of activation (Coull et al. 2000). The current study, however, is the first to index temporal expectancy for a particular structural aspect of a temporal sequence instead of expectancy of a single event: The violation of expectancy could only occur for the underlying beat, not the onset of individual tones in the rhythm, since the latter could not be predicted in these conditions. In previous work with visual stimuli (Coull et al. 2000), unexpectedly early appearances activated visual cortex. We did not find primary sensory increases in the faster condition relative to the same or slower conditions, suggesting that primary area increases may only occur for more basic expectancy violations, rather than for predictions of higher-level features such as the beat.
We did not use simultaneous electromyography to exclude limb or digit tapping during scanning; instead, visual observation and careful instructions were used to minimize this potential confound. Moreover, previous work in our laboratory using similar participant populations, sequences, and instructions has shown that the task can be performed without limb movement.
Work in other domains has found decreased neural responses to repeated stimuli: a phenomenon known as repetition suppression or neural adaptation (Desimone 1996; Henson and Rugg 2003). One might therefore expect the same rate + rhythm condition to show a “reduction” in activation. Although this was not a classical repetition suppression paradigm, we found no evidence of such a reduction. This may be because our stimuli were long (∼2 to 5 s) in comparison with many studies of repetition suppression (<1 s). Alternatively, the overall similarity of the stimuli may cause a small general suppression across all conditions but no additional specific suppression in the same rate + rhythm condition.
Relationship between Beat Perception and Musical Training
Musical training modulates levels of neural activity during rhythm processing (Besson et al. 1994; Vuust et al. 2005, 2006) and results in greater interactions between auditory and premotor cortex (Chen et al. 2006; Grahn and Rowe 2009). In the current study, there were negative correlations (less musical training resulting in greater activation) in left premotor cortex and bilateral auditory cortex. Those with less musical training may have found the task more difficult and therefore maintained greater focus on the stimuli and subsequent response, resulting in greater sensory and premotor activation. Musical expertise has been associated with decreased activation in other musical tasks (Jäncke et al. 2000; Koeneke et al. 2004; Meister et al. 2005; Berkowitz and Ansari 2010), consistent with reduced difficulty or expertise-related efficiency in processing. However, there were positive correlations with musical training in the left superior temporal gyrus, consistent with previous results for passive listening to rhythms (Limb et al. 2006) and music (Ohnishi et al. 2001).
Whether musically trained or not, beat perception occurs spontaneously in most people without great effort. The lack of musical training–related differences in putamen activity is consistent with this observation and with previous work (Grahn and Brett 2007). Cognitive theories of beat perception suggest that the beat in music is indicated by several types of salient changes or accent: volume, duration, melodic, harmonic, timbral, etc. When attempting to find a beat, people generate hypotheses about the beat location based on the perceived accents (Zanto et al. 2006) and predict that future accented events are likely to occur “on the beat” (Povel 1984; Essens and Povel 1985; Povel and Essens 1985; Essens 1995). Although determinations of accent location in music are both melodic and rhythmic, Temperley and Bartlette (2002) list 4 factors that are important in beat finding from a rhythmic point of view: 1) the beat coincides with tone onsets, 2) beats coincide with longer tones, 3) the beat should be regular, and 4) grouping plays a role (the strongest tones tend to be at the beginning of a group of tones). However, this does not account for the fact that most rhythm perception occurs in an ongoing fashion: We are in the process of perceiving the very beginning of a rhythmic sequence only a minority of the time. Therefore, the internal predictions that are set up, driven in part by previous context, also play a significant role in how a rhythm is perceived. We suggest that this prediction is a core part of the basal ganglia role in beat perception, regardless of musical training.
Conclusions
The findings indicate that the putamen does not respond preferentially to the discovery of regular beat structure but instead reflects the prediction or internal generation of future beats continuing after the temporal structure has been found. When modifications to beat predictions are required during minor rate changes, basal ganglia activity is moderately reduced. In contrast, irregular rhythms are associated with decreased basal ganglia activity and increased activity in the cerebellum. These findings are consistent with a generic role for the basal ganglia in internally generated predictions.
Funding
Medical Research Council (J.A.G., U.1055.01.0003.00001.01; J.B.R., MC_US_A060_0016) and Wellcome Trust (088324 to J.B.R.).
Supplementary Material
Acknowledgments
Conflict of Interest: None declared.
References
- Barnes R, Jones MR. Expectancy, attention, and time. Cogn Psychol. 2000;41:254–311. doi: 10.1006/cogp.2000.0738.
- Berkowitz AL, Ansari D. Expertise-related deactivation of the right temporoparietal junction during musical improvisation. Neuroimage. 2010;49:712–719. doi: 10.1016/j.neuroimage.2009.08.042.
- Besson M, Faïta F, Requin J. Brain waves associated with musical incongruities differ for musicians and non-musicians. Neurosci Lett. 1994;168:101–105. doi: 10.1016/0304-3940(94)90426-x.
- Chen JL, Zatorre RJ, Penhune VB. Interactions between auditory and dorsal premotor cortex during synchronization to musical rhythms. Neuroimage. 2006;32:1771–1781. doi: 10.1016/j.neuroimage.2006.04.207.
- Cooper G, Meyer LB. The rhythmic structure of music. Chicago (IL): The University of Chicago Press; 1960.
- Correa A, Lupianez J, Tudela P. Attentional preparation based on temporal expectancy modulates processing at the perceptual level. Psychon Bull Rev. 2005;12:328–334. doi: 10.3758/bf03196380.
- Coull JT, Frith CD, Buchel C, Nobre AC. Orienting attention in time: behavioural and neuroanatomical distinction between exogenous and endogenous shifts. Neuropsychologia. 2000;38:808–819. doi: 10.1016/s0028-3932(99)00132-3.
- Coull JT, Nobre AC. Dissociating explicit timing from temporal expectation with fMRI. Curr Opin Neurobiol. 2008;18:137–144. doi: 10.1016/j.conb.2008.07.011.
- Desimone R. Neural mechanisms for visual memory and their role in attention. Proc Natl Acad Sci U S A. 1996;93:13494–13499. doi: 10.1073/pnas.93.24.13494.
- Drake C, Ben El Heni J. Synchronizing with music: intercultural differences. Ann N Y Acad Sci. 2003;999:429–437. doi: 10.1196/annals.1284.053.
- Drake C, Gerard C. A psychological pulse train: how young children use their cognitive framework to structure simple rhythms. Psychol Res. 1989;51:16–22. doi: 10.1007/BF00309271.
- Drake C, Penel A, Bigand E. Tapping in time with mechanically and expressively performed music. Music Percept. 2000;18:1–24.
- Essens PJ. Structuring temporal sequences: comparison of models and factors of complexity. Percept Psychophys. 1995;57:519–532. doi: 10.3758/bf03213077.
- Essens PJ, Povel DJ. Metrical and nonmetrical representations of temporal patterns. Percept Psychophys. 1985;37:1–7. doi: 10.3758/bf03207132.
- Friston K. The free-energy principle: a unified brain theory? Nat Rev Neurosci. 2010;11:127–138. doi: 10.1038/nrn2787.
- Friston K, Kiebel S. Predictive coding under the free-energy principle. Philos Trans R Soc Lond B Biol Sci. 2009;364:1211–1221. doi: 10.1098/rstb.2008.0300.
- Gaab N, Gaser C, Zaehle T, Jaencke L, Schlaug G. Functional anatomy of pitch memory–an fMRI study with sparse temporal sampling. Neuroimage. 2003;19:1417–1426. doi: 10.1016/s1053-8119(03)00224-6.
- Grahn JA, Brett M. Rhythm perception in motor areas of the brain. J Cogn Neurosci. 2007;19:893–906. doi: 10.1162/jocn.2007.19.5.893.
- Grahn JA, Brett M. Impairment of beat-based rhythm discrimination in Parkinson's disease. Cortex. 2009;45:54–61. doi: 10.1016/j.cortex.2008.01.005.
- Grahn JA, Rowe JB. Feeling the beat: premotor and striatal interactions in musicians and non-musicians during beat processing. J Neurosci. 2009;29:7540–7548. doi: 10.1523/JNEUROSCI.2018-08.2009.
- Grube M, Cooper FE, Chinnery PF, Griffiths TD. Dissociation of duration-based and beat-based auditory timing in cerebellar degeneration. Proc Natl Acad Sci U S A. 2010;107:11597–11601. doi: 10.1073/pnas.0910473107.
- Grube M, Griffiths TD. Metricality-enhanced temporal encoding and the subjective perception of rhythmic sequences. Cortex. 2009;45:72–79. doi: 10.1016/j.cortex.2008.01.006.
- Hall DA, Haggard MP, Akeroyd MA, Palmer AR, Summerfield AQ, Elliott MR, Gurney EM, Bowtell RW. “Sparse” temporal sampling in auditory fMRI. Hum Brain Mapp. 1999;7:213–223. doi: 10.1002/(SICI)1097-0193(1999)7:3<213::AID-HBM5>3.0.CO;2-N.
- Hannon EE, Trainor LJ. Music acquisition: effects of enculturation and formal training on development. Trends Cogn Sci. 2007;11:466–472. doi: 10.1016/j.tics.2007.08.008.
- Hannon EE, Trehub S. Metrical categories in infancy and adulthood. Psychol Sci. 2005;16:48–55. doi: 10.1111/j.0956-7976.2005.00779.x.
- Haruno M, Kawato M. Different neural correlates of reward expectation and reward expectation error in the putamen and caudate nucleus during stimulus-action-reward association learning. J Neurophysiol. 2006;95:948–959. doi: 10.1152/jn.00382.2005.
- Hebert S, Cuddy LL. Detection of metric structure in auditory figural patterns. Percept Psychophys. 2002;64:909–918. doi: 10.3758/bf03196795.
- Henson RNA, Rugg MD. Neural response suppression, haemodynamic repetition effects, and behavioural priming. Neuropsychologia. 2003;41:263–270. doi: 10.1016/s0028-3932(02)00159-8.
- Jäncke L, Shah NJ, Peters M. Cortical activations in primary and secondary motor areas for complex bimanual movements in professional pianists. Cogn Brain Res. 2000;10:177–183. doi: 10.1016/s0926-6410(00)00028-8.
- Jones MR, Boltz M. Dynamic attending and responses to time. Psychol Rev. 1989;96:459–491. doi: 10.1037/0033-295x.96.3.459.
- Jones MR, Moynihan H, MacKenzie N, Puente J. Temporal aspects of stimulus-driven attending in dynamic arrays. Psychol Sci. 2002;13:313–319. doi: 10.1111/1467-9280.00458.
- Jones MR, Pfordresher PQ. Tracking musical patterns using joint accent structure. Can J Exp Psychol. 1997;51:271–290.
- Koeneke S, Lutz K, Wustenberg T, Jancke L. Long-term training affects cerebellar processing in skilled keyboard players. Neuroreport. 2004;15:1279–1282. doi: 10.1097/01.wnr.0000127463.10147.e7.
- Large EW, Fink P, Kelso JAS. Tracking simple and complex sequences. Psychol Res. 2002;66:3–17. doi: 10.1007/s004260100069.
- Large EW, Palmer C. Perceiving temporal regularity in music. Cogn Sci. 2002;26:1–37.
- Leaver AM, Van Lare J, Zielinski B, Halpern AR, Rauschecker JP. Brain activation during anticipation of sound sequences. J Neurosci. 2009;29:2477–2485. doi: 10.1523/JNEUROSCI.4921-08.2009.
- Limb CJ, Kemeny S, Ortigoza EB, Rouhani S, Braun AR. Left hemispheric lateralization of brain activity during passive rhythm perception in musicians. Anat Rec A Discov Mol Cell Evol Biol. 2006;288A:382–389. doi: 10.1002/ar.a.20298.
- London J. Hearing in time: psychological aspects of musical meter. New York: Oxford University Press; 2004.
- McAuley JD, Jones MR. Modeling effects of rhythmic context on perceived duration: a comparison of interval and entrainment approaches to short-interval timing. J Exp Psychol Hum Percept Perform. 2003;29:1102–1125. doi: 10.1037/0096-1523.29.6.1102.
- Meister I, Krings T, Foltys H, Boroojerdi B, Muller M, Topper R, Thron A. Effects of long-term practice and task complexity in musicians and nonmusicians performing simple and complex motor tasks: implications for cortical motor organization. Hum Brain Mapp. 2005;25:345–352. doi: 10.1002/hbm.20112.
- Ohnishi T, Matsuda H, Asada T, Aruga M, Hirakata M, Nishikawa M, Katoh A, Imabayashi E. Functional anatomy of musical perception in musicians. Cereb Cortex. 2001;11:754–760. doi: 10.1093/cercor/11.8.754.
- Palmer C, Krumhansl CL. Mental representations for musical meter. J Exp Psychol Hum Percept Perform. 1990;16:728–741. doi: 10.1037//0096-1523.16.4.728.
- Parncutt R. A perceptual model of pulse salience and metrical accent in musical rhythms. Music Percept. 1994;11:409–464.
- Patel AD, Iversen JR, Chen Y, Repp BH. The influence of metricality and modality on synchronization with a beat. Exp Brain Res. 2005;163:226–238. doi: 10.1007/s00221-004-2159-8.
- Peelle JE, Eason RJ, Schmitter S, Schwarzbauer C, Davis MH. Evaluating an acoustically quiet EPI sequence for use in fMRI studies of speech and auditory processing. Neuroimage. 2010;52:1410–1419. doi: 10.1016/j.neuroimage.2010.05.015.
- Penny W, Holmes AP. Random effects analysis. In: Frackowiack RSJ, Friston KJ, Frith CD, Dolan R, Price CJ, Ashburner J, Penny W, Zeki S, editors. Human brain function II. 2nd ed. San Diego (CA): Elsevier Academic Press; 2003.
- Peretz I, Gosselin N, Belin P, Zatorre RJ, Plailly J, Tillmann B. Musical lexical networks: the cortical organization of music recognition. Ann N Y Acad Sci. 2009;1169:256–265. doi: 10.1111/j.1749-6632.2009.04557.x.
- Povel D-J, Okkerman H. Accents in equitone sequences. Percept Psychophys. 1981;30:565–572. doi: 10.3758/bf03202011.
- Povel DJ. A theoretical framework for rhythm perception. Psychol Res. 1984;45:315–337. doi: 10.1007/BF00309709.
- Povel DJ, Essens PJ. Perception of temporal patterns. Music Percept. 1985;2:411–440.
- Praamstra P, Pope P. Neurophysiology of implicit timing in serial choice reaction-time performance. J Neurosci. 2006;26:5448–5455. doi: 10.1523/JNEUROSCI.0440-06.2006.
- Praamstra P, Pope P. Slow brain potential and oscillatory EEG manifestations of impaired temporal preparation in Parkinson's disease. J Neurophysiol. 2007;98:2848–2857. doi: 10.1152/jn.00224.2007.
- Riecker A, Wildgruber D, Mathiak K, Grodd W, Ackermann H. Parametric analysis of rate-dependent hemodynamic response functions of cortical and subcortical brain structures during auditorily cued finger tapping: a fMRI study. Neuroimage. 2003;18:731–739. doi: 10.1016/s1053-8119(03)00003-x.
- Riecker A, Kassubek J, Gröschel K, Grodd W, Ackermann H. The cerebral control of speech tempo: opposite relationship between speaking rate and BOLD signal changes at striatal and cerebellar structures. Neuroimage. 2006;29:46–53. doi: 10.1016/j.neuroimage.2005.03.046.
- Rohenkohl G, Coull JT, Nobre AC. Behavioural dissociation between exogenous and endogenous temporal orienting of attention. PLoS One. 2011;6:e14620. doi: 10.1371/journal.pone.0014620.
- Ross J, Houtsma AJ. Discrimination of auditory temporal patterns. Percept Psychophys. 1994;56:19–26. doi: 10.3758/bf03211687.
- Sardo P, Ravel S, Legallet E, Apicella P. Influence of the predicted time of stimuli eliciting movements on responses of tonically active neurons in the monkey striatum. Eur J Neurosci. 2000;12:1801–1816. doi: 10.1046/j.1460-9568.2000.00068.x.
- Schiffer AM, Schubotz RI. Caudate nucleus signals for breaches of expectation in a movement observation paradigm. Front Hum Neurosci. 2011;5:38. doi: 10.3389/fnhum.2011.00038.
- Schubotz RI. Prediction of external events with our motor system: towards a new framework. Trends Cogn Sci. 2007;11:211–218. doi: 10.1016/j.tics.2007.02.006.
- Schubotz RI, Friederici AD, von Cramon DY. Time perception and motor timing: a common cortical and subcortical basis revealed by fMRI. Neuroimage. 2000;11:1–12. doi: 10.1006/nimg.1999.0514.
- Schubotz RI, von Cramon DY. Functional-anatomical concepts of human premotor cortex: evidence from fMRI and PET studies. Neuroimage. 2003;20(Suppl 1):S120–S131. doi: 10.1016/j.neuroimage.2003.09.014.
- Schultz W. Behavioral theories and the neurophysiology of reward. Annu Rev Psychol. 2006;57:87–115. doi: 10.1146/annurev.psych.56.091103.070229.
- Snyder JS, Krumhansl CL. Tapping to ragtime: cues to pulse-finding. Music Percept. 2001;18:455–489.
- Soley G, Hannon EE. Infants prefer the musical meter of their own culture: a cross-cultural comparison. Dev Psychol. 2010;46:286–292. doi: 10.1037/a0017555.
- Teki S, Grube M, Kumar S, Griffiths TD. Distinct neural substrates of duration-based and beat-based auditory timing. J Neurosci. 2011;31:3805–3812. doi: 10.1523/JNEUROSCI.5561-10.2011.
- Temperley D, Bartlette C. Anatomy of a performance: sources of musical expression. Music Percept. 2002;20:117–149.
- van Noorden L, Moelants D. Resonance in the perception of musical pulse. J New Music Res. 1999;28:43–66.
- Vuust P, Pallesen KJ, Bailey C, van Zuijen TL, Gjedde A, Roepstorff A, Ostergaard L. To musicians, the message is in the meter: pre-attentive neuronal responses to incongruent rhythm are left-lateralized in musicians. Neuroimage. 2005;24:560–564. doi: 10.1016/j.neuroimage.2004.08.039.
- Vuust P, Roepstorff A, Wallentin M, Mouridsen K, Ostergaard L. It don't mean a thing. Keeping the rhythm during polyrhythmic tension, activates language areas (BA47). Neuroimage. 2006;31:832–841. doi: 10.1016/j.neuroimage.2005.12.037.
- Wasson P, Prodoehl J, Coombes SA, Corcos DM, Vaillancourt DE. Predicting grip force amplitude involves circuits in the anterior basal ganglia. Neuroimage. 2010;49:3230–3238. doi: 10.1016/j.neuroimage.2009.11.047.
- Zanto TP, Snyder JS, Large EW. Neural correlates of rhythmic expectancy. Adv Cogn Psychol. 2006;2:221–231.