Abstract
Naturally occurring signals in audition and touch can be complex and marked by temporal variations in frequency and amplitude. Auditory frequency sweep processing has been studied extensively; however, much less is known about sweep processing in touch since studies have primarily focused on the perception of simple sinusoidal vibrations. Given the extensive interactions between audition and touch in the frequency processing of pure tone signals, we reasoned that these senses might also interact in the processing of higher-order frequency representations like sweeps. In a series of psychophysical experiments, we characterized the influence of auditory distractors on the ability of participants to discriminate tactile frequency sweeps. Auditory frequency sweeps systematically biased the tactile perception of sweep direction. Importantly, auditory cues exerted little influence on tactile sweep direction perception when the sounds and vibrations occupied different absolute frequency ranges or when the sounds consisted of intensity sweeps. Thus, audition and touch interact in frequency sweep perception in a frequency- and feature-specific manner. Our results demonstrate that audio-tactile interactions are not constrained to the processing of simple sinusoids. Because higher-order frequency representations may be synthesized from simpler representations, our findings imply that multisensory interactions in the temporal frequency domain span multiple hierarchical levels in sensory processing.
Keywords: multisensory, crossmodal, audiotactile, integration, spectral
Our ability to sense and discriminate vibrations is critical for perceiving our physical environment. While extant somatosensory research has largely focused on the processing of simple vibrations (i.e., sinusoids), naturally occurring stimuli we encounter during interactions with objects and textures typically consist of complex and dynamic input patterns. Indeed, the scanning of fine textures across the skin produces complex vibration patterns that are transduced by mechanoreceptors in the skin (Manfredi et al., 2014). Our exquisite ability to discriminate and identify fine textures, such as wool from cotton, likely depends on the nervous system’s ability to transmit, represent, and analyze the idiosyncratic vibrations associated with each texture (Bensmaia & Hollins, 2003; Manfredi et al., 2014). How texture-related vibrations are processed is the focus of many recent efforts (Saal, Harvey, & Bensmaia, 2015). In addition to spectral complexity, vibration patterns produced during natural interactions between our hands and objects are also marked by temporal variations in vibration frequency and amplitude. Critically, how we perceive vibrotactile signals whose properties change over time (e.g., frequency sweeps) remains poorly understood. Because dynamic environmental signals may be especially salient and informative, the somatosensory modality may be characterized by selectivity for time-varying stimulation in the temporal domain as it is in the spatial domain (Yau, Kim, Thakur, & Bensmaia, 2016).
Complex, dynamic frequency patterns are critical in our auditory experience as well. Because these signals play outsized roles in language processing, the perception of frequency sweeps by audition has been extensively studied. Humans can perceive and discriminate auditory frequency sweeps (Madden & Fire, 1997; Shu, Swindale, & Cynader, 1993). Results from psychophysical adaptation and conditioning studies imply the existence of specialized processing mechanisms that may explicitly support frequency sweep processing (Gardner & Wilson, 1979; Kay & Matthews, 1972). Neural populations can exhibit highly selective tuning for frequency sweep direction and depth (Mendelson & Cynader, 1985; Tian, 2004; Ye, Poo, Dan, & Zhang, 2010; Zhang, Tan, Schreiner, & Merzenich, 2003). In hierarchical cortical models for auditory processing, the neural coding of higher-order auditory features, like frequency sweeps, may be based on representations of simpler acoustic signals (Sadagopan & Wang, 2009). If tactile sweep perception relates to its auditory counterpart, similar neural mechanisms may support higher-order frequency representations in the two modalities.
Audition and touch are intimately linked in the perception of simple temporal frequency signals. There are clear correspondences in how the frequency of a pure tone stimulus is perceived by touch and audition (Bensmaia, Hollins, & Yau, 2005). Additionally, touch and audition exhibit strong perceptual interactions (Occelli, Spence, & Zampini, 2011). Concurrent sounds can modulate detection of vibrations and perception of vibration frequency (Ro, Hsu, Yasar, Caitlin Elmore, & Beauchamp, 2009; Wilson, Reed, & Braida, 2010; Yau, Olenczak, Dammann, & Bensmaia, 2009). These interactions can be highly specific and reciprocal in the frequency domain (Yau, Weber, & Bensmaia, 2010). Such behavioral effects, along with the observation that auditory adaptation can selectively modulate subsequent tactile frequency perception (Crommett, Pérez-Bellido, & Yau, 2017), support the notion that common or overlapping neural circuits support simple frequency processing by the two senses (Foxe et al., 2002; Fu et al., 2003; Kayser, Petkov, Augath, & Logothetis, 2005; Pérez-Bellido, Barnes, Crommett, & Yau, 2017; Schürmann, Caetano, Jousmäki, & Hari, 2004). Shared or interactive mechanisms potentially link audition and touch in the processing of more complex oscillatory patterns, as suggested by perceptual interactions in texture perception (Guest, Catmur, Lloyd, & Spence, 2002; Jousmäki & Hari, 1998; Lederman, 1979; Yau, Hollins, & Bensmaia, 2009). If auditory and tactile functions are associated across multiple levels of complexity in the temporal frequency domain, we reasoned that the two senses should interact in the perception of frequency sweeps. Thus, characterizing the nature of interactions between audition and touch in higher-order frequency representations like sweeps would be key to elucidating the hierarchical organization of sensory systems dedicated to spectral processing.
Here, we tested the hypothesis that audition and touch interact at multiple representational levels of temporal frequency processing. Similar to frequency-specific auditory influences on the tactile perception of pure tone stimuli, we predicted that auditory frequency sweeps would systematically bias tactile sweep direction judgments. We performed a series of psychophysical experiments characterizing the influence of audition on tactile sweep perception and found evidence that audiotactile interactions in frequency sweep perception result from cue combination in sensory representations rather than from cognitive or decisional biases. We conclude with considerations of simple neural network models that describe potential mechanisms by which auditory frequency sweeps can alter tactile frequency sweep perception.
General Method
We conducted a series of 4 experiments investigating the influence of auditory distractors on tactile frequency sweep perception. Experiment 1 characterized the direction-specific influences of auditory sweeps on tactile sweep direction perception. Experiment 2 characterized the dependence of auditory effects on distractor magnitude, defined by sweep depth (i.e., the range of frequencies spanned by the sweep). Experiment 3 tested whether interactions between audition and touch in frequency sweep perception depended on absolute frequency. Experiment 4 tested whether auditory distractors comprising variations in intensity (i.e., intensity sweeps) rather than frequency could also influence tactile frequency sweep perception.
Participants.
We tested a total of 36 subjects across the 4 experiments. Four subjects participated in all four experiments, three subjects participated in three experiments, and seven subjects participated in two experiments. All but two participants were right-handed according to the Edinburgh Handedness Inventory (Oldfield, 1971; mean scores ± s.e.m.: Experiment 1: 0.66 ± 0.06; Experiment 2: 0.78 ± 0.07; Experiment 3: 0.59 ± 0.17; Experiment 4: 0.69 ± 0.10). The vast majority of participants were naïve to the hypotheses; the authors (LEC and DM) were aware of the hypotheses when they participated in Experiments 1 and 2 and Experiment 1, respectively. All participants had normal auditory and tactile sensibilities according to self-reports. All testing procedures were performed in compliance with the policies and procedures of the Baylor College of Medicine Institutional Review Board. All participants gave their written informed consent and were paid for their participation.
Frequency sweep discrimination task.
Before the start of an experimental session, participants received training on the tactile frequency sweep discrimination task in order to ensure that they understood the instructions and could perform the task. During each task block (Figure 1A), participants discriminated tactile frequency sweep direction in a 1-interval, 2-alternative forced choice (2AFC) paradigm. On each trial, a vibrotactile frequency sweep stimulus (Figure 1B,C) was delivered to the participant’s left index finger, in conjunction with an auditory stimulus (see Auditory stimulation). Participants were instructed to ignore the auditory stimulus and report whether the vibration was perceived to be increasing or decreasing in frequency. Stimuli were presented for 500 ms. All tactile stimuli were centered on 230 Hz. Tactile sweep depths, defined as the difference in frequency between the start and end of the sweep (FS and FE, respectively), were ±300, ±250, ±200, ±150, ±100, ±50, and 0 Hz. Positive depths indicate “upward” sweeps (i.e., stimuli in which FE was higher than FS). Negative depths indicate “downward” sweeps (i.e., stimuli in which FE was lower than FS). A sweep depth of 0 indicated a 230 Hz stimulus that did not vary in frequency. Tactile sweep stimuli were equated for perceived intensity (see Equating tactile stimulus intensity) to ensure that participants could only discriminate sweep direction by relying on frequency information. Tactile sweep depth and auditory conditions were randomized across trials. No feedback was provided.
Figure 1.
Experimental setup and design. (A) Tactile frequency sweep direction identification was tested in the presence of simultaneous auditory frequency sweep distractors. Subjects reported whether each tactile stimulus was perceived to be increasing or decreasing in the tactile frequency domain, FT. Auditory distractors comprised upward and downward sweeps in the auditory frequency domain, FA, or white noise. (B) Example displacement profile for a tactile sweep beginning at 80 Hz and ending at 380 Hz. (C) Spectrogram indicating the power of the frequencies present in the example tactile sweep shown in B. (D) Custom-built sound-attenuation chamber encapsulating an assembly that holds the shaker motor. Participants placed their supinated left hand in the chamber with the left index finger positioned under the probe. The view shown does not include the front panel with the entry hole.
Tactile stimulation.
Frequency sweeps were defined digitally in Matlab using the formulation:

X(t) = sin(2π[FS·t + ((FE − FS)/(2TS))·t²]),

where X(t) is the probe position at time t, FS and FE are the frequencies at the start and end of the sweep, respectively, and TS is the sweep duration in seconds. Vibrotactile stimulation was delivered using a system previously described (Crommett et al., 2017). Frequency sweep stimuli were delivered along the axis perpendicular to the skin surface by a plastic stylus (8-mm diameter) mounted on a shaker motor (Figure 1D; K2007E01, The Modal Shop, Inc., Cincinnati, OH). The probe tip was initially indented into the skin by 1 mm to ensure contact throughout stimulus presentation. An accelerometer (Type 8702B50M1, Kistler Instrument Corporation, Amherst, NY) with a dynamic range of ±50 g was affixed to the shaker motor. Output from the accelerometer was amplified and conditioned using a piezotron coupler (Type 5134A, Kistler Instrument Corporation, Amherst, NY), digitized (PCI-6229, National Instruments, Austin, TX; sampling rate = 20 kHz), and read into a PC (Windows 7).
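As an illustration of this formulation, the sweep waveform can be synthesized in a few lines. The sketch below is written in Python rather than the original Matlab, and the function and parameter names are ours:

```python
import numpy as np

def linear_sweep(f_start, f_end, duration, sample_rate=20000):
    """Synthesize a linear frequency sweep (chirp) from f_start to f_end Hz.

    The instantaneous frequency changes linearly over the sweep, so the
    phase is the integral of f(t) = f_start + (f_end - f_start) * t / duration.
    """
    t = np.arange(0, duration, 1.0 / sample_rate)
    phase = 2.0 * np.pi * (f_start * t + (f_end - f_start) * t**2 / (2.0 * duration))
    return np.sin(phase)

# A 500-ms sweep from 80 Hz to 380 Hz (center frequency 230 Hz,
# depth +300 Hz), matching the example sweep shown in Figure 1B.
x = linear_sweep(80.0, 380.0, 0.5)
```

The 20-kHz default sampling rate matches the digitization rate reported for the stimulation system.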
In producing some high frequency vibrations, the shaker motor could also generate audible sounds. We used two approaches to attenuate the sounds produced by the shaker motor (Crommett et al., 2017; Yau, Olenczak, et al., 2009). First, all participants wore noise-attenuating earmuffs (Peltor H10A Optime 105 Earmuff, 3M). Second, we designed a sound-attenuation chamber to enclose the shaker motor (Figure 1D). The walls of the chamber consisted of three layers: a hard polyurethane board (84775K23, McMaster-Carr, Robbinsville, NJ), 1-inch thick polyurethane acoustical foam insulation (5692T49, McMaster-Carr, Robbinsville, NJ), and a 3-inch thick egg-carton polyurethane foam sheet (9710T46, McMaster-Carr, Robbinsville, NJ). The shaker motor was mounted to an adjustable stage (UMR8.51, Newport Corp., Irvine, CA) supported by a custom-built aluminum frame. Each participant placed his or her left hand through an entry hole (lined with foam) and rested his or her index finger on a support platform mounted below the shaker motor and contact probe. In preliminary experiments, we confirmed that participants were unable to rely on acoustic cues from the shaker motor to perform the frequency sweep direction discrimination task: Performance fell to chance level when participants performed the task in the absence of finger contact with the stimulator (Figure S1).
Equating tactile stimulus intensity.
To minimize the likelihood that subjects discriminated tactile frequency sweep direction based on intensity cues, we modulated the amplitude of the tactile sweeps using an iso-intensity function. The iso-intensity function represents the amplitudes over which vibrations at different frequencies are perceived to be equally intense (see Crommett, Pérez-Bellido, & Yau, 2017 for details regarding the generation of the iso-intensity curve). Briefly, we determined subjectively-matched amplitudes for a set of pure tone vibrations spanning 100–550 Hz. We then fitted an iso-intensity function on the matched amplitudes and used this iso-intensity function to modulate the amplitude of each sweep. In digitally constructing the tactile sweep stimuli, each stimulus was initially defined with frequency components having identical nominal amplitudes. The component amplitudes were then proportionately scaled according to the values in the iso-intensity function corresponding to the components’ frequencies. This ensured that perceived intensity was approximately unchanging throughout the sweep duration. Tactile stimuli, although clearly supra-threshold, still fell below levels required for detection via bone-conduction (Dirks, Kamm, & Gilman, 1976; Yau et al., 2010).
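The amplitude-scaling step can be sketched as follows. This is an illustrative Python sketch, not the authors' code: the iso-intensity curve shown is a hypothetical placeholder (the actual curve was measured psychophysically, per Crommett et al., 2017), and scaling each sample by the value at its instantaneous frequency is our assumption about the implementation:

```python
import numpy as np

def equalize_sweep(signal, f_start, f_end, duration, iso_intensity,
                   sample_rate=20000):
    """Scale a linear sweep sample-by-sample so perceived intensity stays
    approximately constant over the sweep duration.

    iso_intensity: callable mapping frequency (Hz) to the relative amplitude
    judged equally intense at that frequency.
    """
    t = np.arange(len(signal)) / sample_rate
    inst_freq = f_start + (f_end - f_start) * t / duration  # linear sweep
    return signal * iso_intensity(inst_freq)

# Hypothetical iso-intensity curve for illustration only: frequencies near
# peak vibrotactile sensitivity (~250 Hz) require less amplitude to feel as
# intense as frequencies toward the extremes of the 100-550 Hz range.
iso = lambda f: 1.0 + ((f - 250.0) / 300.0) ** 2
```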
Auditory stimulation.
Auditory distractors consisted of frequency sweeps (Experiments 1, 2, and 3) and intensity sweeps (Experiment 4). In all experiments, trials comprising Gaussian white noise auditory distractors served as baseline conditions to evaluate tactile performance in the presence of auditory distractors that contained no systematic temporal variations. Gaussian white noise stimuli (duration: 500 ms; sample rate: 44.1 kHz) were digitally generated in Matlab by first defining a vector of random samples (Gaussian distribution with zero mean) which were subsequently normalized. Auditory distractors were presented at volumes that were clearly suprathreshold, but not aversively loud (<53 dB SPL). The amplitude of the intensity sweeps tested in Experiment 4 increased or decreased linearly over the stimulus duration: The minimum amplitude corresponded to a 10-mV signal (Vrms; measured as the output of the amplifier) and the maximum amplitude corresponded to a 22-mV signal.
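The two distractor waveforms described above can be sketched in Python as follows. The peak normalization of the noise and the exact form of the amplitude ramp are our assumptions; only the duration, sample rate, tone frequency, and 10–22 mV endpoints come from the text:

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded for reproducibility

def white_noise(duration=0.5, sample_rate=44100):
    """Zero-mean Gaussian white noise, normalized (here, to unit peak;
    the normalization used in the study is not specified)."""
    samples = rng.standard_normal(int(duration * sample_rate))
    return samples / np.max(np.abs(samples))

def intensity_sweep(freq=230.0, duration=0.5, sample_rate=44100,
                    a_start=10.0, a_end=22.0):
    """Fixed-frequency tone whose amplitude ramps linearly from a_start to
    a_end (in mV at the amplifier output, per the reported levels)."""
    t = np.arange(0, duration, 1.0 / sample_rate)
    amp = a_start + (a_end - a_start) * t / duration
    return amp * np.sin(2.0 * np.pi * freq * t)
```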
In preliminary experiments (n = 6), we verified that subjects could reliably discriminate the direction of the auditory frequency sweeps used as distractors in Experiments 1 and 3 (98.6 ± 0.9% accuracy in both experiments). Stimuli were generated digitally using Matlab (MathWorks; Natick, MA, USA) and converted to analog signals using a digital-to-analog card (PCI-6229, National Instruments, Austin, TX; sampling rate = 20 kHz). The analog signals were amplified (PTA2 stereo power amplifier, Pyle) before being delivered binaurally via noise-cancelling in-ear headphones (ATH-ANC23, Audio-Technica U.S., Inc). The amplifier was set to 25% of its maximum intensity output. Participants wore noise-attenuating earmuffs (3M Tekk Protection) over the in-ear headphones.
Data Analysis.
To quantify each participant’s ability to discriminate tactile sweep direction, we fitted the participant’s choice probability data with a Gaussian cumulative distribution function (cdf):

p(up) = ½ [1 + erf((D − μ) / (σ√2))],

where p(up) is the proportion of trials on which a given tactile stimulus was judged to be an upward sweep, D is the sweep depth of the stimulus, μ and σ are free parameters corresponding to estimates of the participant’s point of subjective equality (PSE) and just-noticeable difference (JND), respectively, and erf(x) is the error function of x. The PSE is a measure of bias and indicates the sweep depth that is equally likely to be judged as an upward or downward sweep. The JND is a measure of sensitivity: This perceptual threshold corresponds to the change in sweep depth, with respect to the PSE, that yields 84% performance. Separate psychometric curves were fitted to the choice probability data of each subject under each distractor condition, and we calculated the variance explained (r²) as a goodness-of-fit measure for each fitted curve.
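A minimal sketch of this fitting procedure, in Python with SciPy rather than the original Matlab (least-squares fitting is our assumption; the synthetic choice data are for illustration only):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

def gaussian_cdf(depth, mu, sigma):
    """Gaussian cdf psychometric function: p(up) as a function of sweep
    depth D, with mu = PSE and sigma = JND. At depth = mu + sigma the
    predicted p(up) is ~0.84, matching the 84% threshold criterion."""
    return 0.5 * (1.0 + erf((depth - mu) / (sigma * np.sqrt(2.0))))

def fit_psychometric(depths, p_up):
    """Estimate (PSE, JND) by least-squares fitting of the cdf."""
    (mu, sigma), _ = curve_fit(gaussian_cdf, depths, p_up, p0=[0.0, 100.0])
    return mu, sigma

# The tactile sweep depths used in the task, and synthetic choice
# probabilities for an unbiased observer with a 120-Hz JND.
depths = np.array([-300, -250, -200, -150, -100, -50, 0,
                   50, 100, 150, 200, 250, 300], dtype=float)
p_up = gaussian_cdf(depths, 0.0, 120.0)
mu_hat, sigma_hat = fit_psychometric(depths, p_up)
```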
In group-level analyses for each experiment, we determined whether PSE and JND estimates differed significantly as a function of auditory distractor condition. Group-averaged parameter estimates and standard errors of the mean are reported for each experiment. We conducted repeated-measures ANOVAs (rmANOVA) with auditory distractor condition as the within-subjects factor. We excluded a participant’s data from these analyses if his/her JND (σ) for any distractor condition differed from the group mean for that condition by more than 2 standard deviations (Crommett et al., 2017). This ensured that the analyses only included data from subjects who reliably discriminated tactile sweep direction in the presence of auditory distractors. Psychometric function fitting and statistical testing were performed using Matlab (MathWorks; Natick, MA, USA). If the rmANOVA yielded a significant main effect of distractor condition, we performed post-hoc paired t-tests corrected for multiple comparisons (Bonferroni). When performing post-hoc tests, we also calculated Bayes factors to quantify the relative support for the null and alternative hypotheses. Effect sizes are reported for all statistical tests.
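The 2-standard-deviation exclusion rule can be sketched as follows (a Python illustration; whether a candidate outlier's own data enter the group mean and SD is not specified in the text, so including them is our assumption):

```python
import numpy as np

def subjects_to_keep(jnds):
    """jnds: (n_subjects, n_conditions) array of JND estimates.

    Returns a boolean mask that is False for any subject whose JND in any
    distractor condition deviates from that condition's group mean by more
    than 2 standard deviations.
    """
    jnds = np.asarray(jnds, dtype=float)
    mean = jnds.mean(axis=0)
    sd = jnds.std(axis=0, ddof=1)  # sample SD across subjects
    deviant = np.abs(jnds - mean) > 2.0 * sd
    return ~deviant.any(axis=1)
```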
Experiment 1: Influence of auditory sweep direction on tactile sweep direction perception
This experiment was designed to test whether auditory distractors that signaled different frequency sweep directions modulated the perceived direction of tactile frequency sweeps in a direction-specific manner.
Method
Participants.
Twenty-four subjects (13 females; mean age ± s.e.m.: 24.3 ± 0.66 years old) participated in Experiment 1. Data from 4 subjects were identified as outliers and excluded from the analysis (see Data analysis).
Procedure.
Auditory distractors in Experiment 1 comprised 3 conditions: white noise, upward frequency sweeps, and downward frequency sweeps. Distractor sweeps had depths of ±150 Hz and were centered on 230 Hz, matching the center frequency of the tactile stimuli. The white noise distractor, which lacked systematic variations in frequency and intensity, served as a baseline condition against which we could compare performance achieved with the frequency sweep distractors. This condition ensured that an auditory stimulus always co-occurred with the tactile stimuli, obviating concerns that performance differences associated with the distractor sweeps were simply due to distractor effects on attention. Twenty-four observations were obtained for each combination of tactile sweep depth and auditory condition. An experimental session comprised 6 blocks each consisting of 156 randomized trials. Subjects were provided 3–5 minutes to rest between each block.
Results
Figure 2A shows average choice probability data for 20 subjects depicting the likelihood that a tactile sweep of a given depth was perceived as an upward sweep. Participants reliably identified tactile sweep direction in the presence of auditory distractors: Psychometric function fits were good across participants and conditions (mean r² = 0.90 ± 0.01). Figure 2B shows that auditory distractors biased tactile sweep judgments (PSE; ANoise: 28.29 ± 26.01, ADown: 170.67 ± 45.30, AUp: −90.98 ± 33.97) in a manner that depended significantly on distractor condition (rmANOVA: F(2,38) = 13.99, P = 0.00003, ηp² = 0.42). PSE estimates with downward distractor sweeps differed significantly from estimates with upward distractor sweeps (t(19) = 4.09, Pcorr = 0.002, d = 0.63). A Bayes factor (BF) analysis indicated substantial evidence favoring the alternative hypothesis that the PSE estimates with ADown and AUp differed over the null hypothesis that there were no PSE differences between the two distractor conditions (BF = 55.33). Distractor sweeps exerted direction-specific biasing effects in nearly all subjects (Figure 2C), and PSE estimates with noise distractors differed significantly from estimates with upward sweeps (t(19) = 3.6, Pcorr = 0.006, d = 0.55; BF = 21.08) and downward sweeps (t(19) = 3.05, Pcorr = 0.02, d = 0.47; BF = 7.25). Auditory distractors also differentially modulated tactile perceptual thresholds (Figure 2D; F(2,38) = 5.45, P = 0.008, ηp² = 0.22), with reduced tactile sensitivity observed in the auditory sweep conditions (JND; ANoise: 137.24 ± 13.99, ADown: 224.77 ± 32.5, AUp: 216.92 ± 48.29). Relative to JND estimates associated with ANoise, tactile sensitivity was significantly impaired with ADown (t(19) = 3.27, Pcorr = 0.01, d = 0.503; BF = 10.99) and AUp (t(19) = 2.78, Pcorr = 0.04, d = 0.43; BF = 4.42), though JND estimates did not differ according to distractor sweep direction (t(19) = 0.95, Pcorr = 1, d = 0.15; BF = 0.35).
Figure 2.
Experiment 1 results: Tactile frequency sweep direction identification with auditory white noise and frequency sweep distractors (n=20). (A) Average choice probability data and psychometric functions with white noise (black), the downward sweeping sounds (dark gray, filled circles), and upward sweeping sounds (light gray, filled circles). Data indicate the likelihood a vibration at a given frequency sweep depth was judged as increasing in frequency. (B) Average PSE (μ) estimates in each distractor condition. Circles denote PSE estimates for individual subjects. (C) Scatterplot shows the difference between the PSE estimates with white noise and the PSE estimates with upward and downward auditory sweeps for each subject and the group average (filled marker with error bars indicating s.e.m.). The concentration of points in the upper left quadrant reveals the consistency in direction-specific distractor effects across subjects. (D) Average JND (σ) estimates in each distractor condition. Circles denote JND estimates for individual subjects. Error bars indicate s.e.m.
Experiment 2: Dependence of distractor efficacy on auditory sweep depth
The results of Experiment 1 indicate that auditory frequency sweeps systematically bias the perceived direction of tactile sweeps: Upward auditory sweeps cause a pure tone tactile stimulus (which is stationary in the frequency domain over time) to be experienced as sweeping upward while downward auditory sweeps cause the same tactile stimulus to be experienced as sweeping downward. In Experiment 2, we tested whether the magnitude of the bias effects depended on auditory sweep depth by characterizing the influences of auditory sweeps that varied in depth but whose directions were matched and clearly identifiable. If auditory effects on tactile sweep direction perception were due to cognitive or decisional biases, we reasoned that distractor effects would categorically relate to distractor sweep direction and would be invariant to manipulations of distractor depth. Alternatively, distractor efficacy could scale with distractor depth, which would implicate integration processes that are sensitive to sweep depth as well as direction.
Method
Participants.
Eleven subjects (7 females; 24.1 ± 0.69 years old) participated in Experiment 2. Data from one subject were identified as an outlier and excluded from the analysis.
Procedure.
Auditory distractors in Experiment 2 comprised upward auditory sweeps presented at 3 depths (100, 200, and 300 Hz) in addition to white noise. Auditory sweeps were centered at 230 Hz, matching the center frequency of the tactile stimuli. Twenty observations were obtained for each combination of tactile sweep depth and auditory condition. An experimental session included 5 task blocks of 208 trials each. Subjects were provided time to rest between each block.
Results
Figure 3A shows average choice probability data for 10 subjects. Participants reliably identified tactile sweep direction in the presence of auditory distractors, yielding good psychometric function fits (mean r² = 0.91 ± 0.01). Consistent with the Experiment 1 results, upward auditory sweeps increased the likelihood that participants reported feeling an upward tactile sweep (Figure 3B) (PSE; ANoise: 51.08 ± 33.43, A100U: 21.11 ± 29.06, A200U: −28.22 ± 24.13, A300U: −77.76 ± 17.04). These biasing effects depended significantly on distractor condition (F(3,27) = 16.93, P < 0.0001, ηp² = 0.65): Distractor sweeps with larger depths were associated with greater PSE changes. Although there was a nominal positive bias in the PSE estimates achieved with ANoise, these PSE values did not differ significantly from 0 (one-sample t-test, t(9) = 1.53, P = 0.16, d = 0.48). Critically, while PSE estimates with 100-Hz auditory sweeps were not significantly different from estimates with noise (t(9) = 1.36, Pcorr = 0.62, d = 0.43; BF = 0.637), PSE estimates for both the A200U distractor sweeps (t(9) = 3.86, Pcorr = 0.012, d = 1.22; BF = 13.39) and the A300U distractor sweeps (t(9) = 4.88, Pcorr = 0.003, d = 1.54; BF = 45.76) were significantly lower compared to the noise condition. JND estimates (Figure 3C) (ANoise: 143.33 ± 18.72, A100U: 167.33 ± 31.98, A200U: 133.48 ± 20.11, A300U: 154.86 ± 26.72) did not differ significantly across distractor conditions (F(3,27) = 0.95, P = 0.430, ηp² = 0.096). These results replicate the significant direction-specific effects observed in Experiment 1 and demonstrate that auditory biasing of tactile sweep direction perception scales with auditory sweep depth.
Figure 3.
Experiment 2 results: Tactile frequency sweep direction identification with auditory white noise and upward frequency sweep distractors at multiple sweep depths (n=10). (Conventions as in Fig. 2) (A) Average choice probability data and psychometric functions in each distractor condition. (B) Average PSE (μ) estimates in each distractor condition. (C) Average JND (σ) estimates in each distractor condition. Circles denote parameter estimates for individual subjects. Error bars indicate s.e.m.
Experiment 3: Dependence of audiotactile sweep interactions on absolute frequency
The results of Experiments 1 and 2 demonstrate that auditory frequency sweeps bias tactile sweep direction perception in a direction- and depth-specific manner. The depth-dependence of the auditory biasing effects implies that the perceptual interactions are not simply based on categorical evaluations of sweep direction, as might be predicted by decisional accounts. Experiment 3 was designed to further test the specificity of auditory influences on tactile sweep perception by evaluating the dependence of sweep interactions on absolute frequency. In Experiments 1 and 2, the sweep stimuli presented in both modalities were centered on the same absolute frequency; these interactions alone are insufficient for addressing whether integration occurs on frequency sweep representations that are maintained in absolute frequency, consistent with lower-level sensory processing (Alais, Orchard-Mills, & Van der Burg, 2015), rather than higher-order sweep representations that may be frequency invariant. To distinguish between these possibilities, we repeated the paradigm used in Experiment 1, but we tested auditory sweeps that occupied an absolute frequency range outside of the vibration frequencies perceivable by touch.
Method
Participants.
Twelve subjects (9 females; 21.9 ± 0.86 years old) participated in Experiment 3. Data from 2 subjects were identified as outliers and excluded from the analysis.
Procedure.
As in Experiment 1, auditory distractors comprised 3 conditions: white noise, upward sweeps, and downward sweeps. Auditory sweeps were centered at 2000 Hz and had depths of ±1300 Hz. Using this sweep depth allowed us to match the depth-to-center-frequency ratio of the sweep distractors used in Experiments 1 and 3, thereby ensuring that subjects perceived the direction of these auditory stimuli with matched accuracies (see Auditory stimulation). Twenty-four observations were obtained for each combination of tactile sweep depth and auditory condition. An experimental session comprised 6 blocks of 156 trials each. Subjects were provided time to rest between each block.
Results
Figure 4A shows average choice probability data for the 10 subjects performing the tactile sweep direction identification task. Performance was reliable, yielding good psychometric function fits (mean r² = 0.91 ± 0.02). Unlike the effects observed with distractor sweeps centered on 230 Hz, auditory sweeps centered on 2000 Hz did not systematically bias tactile direction judgments (Figure 4B) (PSE; ANoise: 9.79 ± 27.62, ADown: 41.47 ± 22.92, AUp: −0.16 ± 23.74). PSE estimates did not differ significantly across auditory distractor conditions (F(2,18) = 1.42, P = 0.268, ηp² = 0.14), and there were no consistent direction-selective biasing effects across subjects (Figure 4C). Distractor sweeps centered on 2000 Hz also failed to systematically and significantly modulate tactile perceptual thresholds (Figure 4D) (ANoise: 183.75 ± 50.09, ADown: 169.01 ± 39.40, AUp: 174.20 ± 39.97; F(2,18) = 0.19, P = 0.830, ηp² = 0.02). Thus, auditory frequency sweep distractors, whose directions are clearly identifiable, fail to modulate tactile perception of frequency sweeps when the sounds occupy absolute frequency ranges that are too high to be perceived by touch.
Figure 4.
Experiment 3 results: Tactile frequency sweep direction identification with auditory white noise and frequency sweep distractors occurring in high absolute frequency range (n=10). (Conventions as in Fig. 2) (A) Average choice probability data and psychometric functions in each distractor condition. (B) Average PSE (μ) estimates in each distractor condition. (C) Scatterplot shows the difference between the PSE estimates with white noise and the PSE estimates with upward and downward auditory sweeps for each subject and the group average (filled marker with error bars indicating s.e.m.). The dispersion of points reflects the inconsistency in effects across conditions and subjects with auditory sweeps occurring at frequency ranges imperceptible by touch. (D) Average JND (σ) estimates in each distractor condition. Circles denote JND estimates for individual subjects. Error bars indicate s.e.m.
Experiment 4: Influence of auditory intensity sweeps on tactile frequency sweep perception
The auditory distractors tested in Experiments 1–3 were characterized by temporal variations in the frequency domain. Experiment 4 tested whether auditory distractors characterized by variations in signal intensity could also modulate tactile frequency sweep perception. Tactile frequency sweep processing may be unaffected by changes in auditory stimulus intensity just as tactile frequency perception of pure tone vibrations does not relate to the intensity of concurrent auditory pure tones (Yau, Olenczak, et al., 2009). However, amplitude-modulated sounds may be more salient and distracting than auditory pure tones because of their time-varying nature. Furthermore, variations in auditory stimulus intensity can sometimes be associated with perceived variations in stimulus frequency (Neuhoff et al., 1996). Thus, auditory intensity sweep distractors could conceivably bias tactile judgments of frequency sweep direction.
Method
Participants.
Eleven subjects (6 females; 23.9 ± 1.04 years old) participated in Experiment 4. Outlier data from one subject were excluded from the analysis.
Procedure.
Auditory distractors in Experiment 4 comprised 3 conditions: white noise, upward intensity sweeps, and downward intensity sweeps. The frequency of the auditory intensity sweeps was fixed at 230 Hz. The amplitude of the sweep increased or decreased linearly over the stimulus duration (see Auditory stimulation). Twenty-four observations were obtained for each combination of tactile sweep depth and auditory condition. An experimental session comprised 6 blocks of 156 trials each. Subjects were given time to rest between blocks.
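An intensity sweep of this kind can be sketched as a fixed-frequency carrier with a linear amplitude ramp. The duration, sampling rate, and amplitude endpoints below are illustrative placeholders, not the experimental values (which are given in the Auditory stimulation section):

```python
import numpy as np

def intensity_sweep(direction, dur=1.0, fs=44100, f0=230.0,
                    amp_lo=0.2, amp_hi=1.0):
    # 230-Hz tone whose amplitude ramps linearly up or down over the
    # stimulus duration. dur, fs, amp_lo, and amp_hi are illustrative.
    t = np.arange(int(dur * fs)) / fs
    ramp = np.linspace(amp_lo, amp_hi, t.size)
    if direction == "down":
        ramp = ramp[::-1]
    return ramp * np.sin(2 * np.pi * f0 * t)

up = intensity_sweep("up")
```

Unlike the frequency sweeps of Experiments 1–3, the spectral content of this stimulus is constant; only its envelope varies in time.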
Results
Figure 5A shows average choice probability data for 10 subjects identifying the direction of tactile frequency sweeps. Subjects achieved reliable performance, yielding good psychometric function fits (mean r2 = 0.92 ± 0.02). Although there was a weak pattern of distractor biasing effects on tactile direction judgments (ANoise: 22.75 ± 22.35 Hz, AIDown: 111.16 ± 86.65, AIUp: −48.51 ± 43.88), PSE differences across auditory distractor conditions (Figure 5B) did not achieve statistical significance (F2,18 = 1.7, P = 0.21, ηp2 = 0.16). Relative to the noise condition, PSE estimates associated with the intensity sweeps did not reflect consistent direction-specific effects (Figure 5C), unlike the effects observed with auditory frequency sweeps. Tactile perceptual thresholds (Figure 5D) (ANoise: 145.04 ± 20.94 Hz, AIDown: 208.30 ± 43.51, AIUp: 196.96 ± 31.16) were also only marginally modulated by the intensity sweep distractors (F2,18 = 2.83, P = 0.09, ηp2 = 0.24), with these marginal effects reflecting a trend for distractor sweeps to impair tactile sweep sensitivity more than noise.
Figure 5.
Experiment 4 results: Tactile frequency sweep direction identification with auditory white noise and intensity sweep distractors (n=10). (Conventions as in Fig. 2) (A) Average choice probability data and psychometric functions in each distractor condition. (B) Average PSE (μ) estimates in each distractor condition. (C) Scatterplot shows the difference between the PSE estimates with white noise and the PSE estimates with upward and downward auditory intensity sweeps for each subject and the group average (filled marker with error bars indicating s.e.m.). Auditory intensity sweeps did not consistently exert strong influences on tactile frequency sweep judgments. (D) Average JND (σ) estimates in each distractor condition. Circles denote JND estimates for individual subjects. Error bars indicate s.e.m.
General Discussion
Audition and touch have previously been shown to interact in the temporal frequency processing of pure tone signals. Our results build on this work by demonstrating that audition also interacts with touch in the processing of higher-order temporal signals like frequency sweeps. We found that auditory frequency sweeps bias tactile judgments of frequency sweep direction in a direction-specific manner. These effects are not simply categorical with respect to auditory sweep direction; the magnitude of the biasing effects also varies with the auditory sweep depth, as larger changes in auditory frequency produce larger biases in tactile sweep direction judgments. Consistent with the notion that audiotactile interactions in sweep processing are sensitive to low-level stimulus features, we also found that auditory sweeps only influence tactile sweep perception when both stimuli occupy the same absolute frequency range. Lastly, we found that tones characterized by amplitude modulation, unlike frequency modulation, exerted minimal biasing effects on tactile frequency sweep perception. Combined with previous findings demonstrating audiotactile interactions in the perception of pure tone signals, our collective results support the hypothesis that audition and touch interact at multiple representational levels of temporal frequency processing.
Perceptual interactions between sensory modalities have traditionally been considered to be supported by mechanisms based on cognitive or decisional biases (which operate on representations of perceptual judgments) or mechanisms involved in sensory integration (which operate on representations of sensory features). Based on our paradigm, which involved the formation of categorical tactile sweep direction judgments, sweep perception interactions would depend simply on the perceived directions of the auditory and tactile signals if they were a consequence of decisional biases. Alternatively, sweep interactions would be marked by strong dependencies on stimulus features if they were mediated by sensory integration mechanisms. Our results clearly support the sensory account. First, auditory sweep distractors in an absolute frequency range that fell outside of the range of tactile vibration perception failed to significantly bias tactile sweep direction judgments despite the fact that the direction of the auditory sweeps could be perceived clearly. If auditory biasing effects resulted from a corruption of subjects’ tactile sweep judgments by the simple experience of a concurrent and identifiable distractor sweep, auditory sweeps whose directions are equally perceivable should influence touch similarly regardless of their particular spectral content. Second, auditory distractors of different sweep depths biased tactile perception to different degrees. A simple decisional account would not predict that auditory sweeps of varying depths but comparably identifiable directions would differentially bias tactile sweep judgments. Lastly, auditory influences on tactile sweep perception were limited to the frequency domain, as salient variations in sound intensity did not significantly modulate tactile sweep direction perception. 
This specificity suggests that the mere presence of a time-varying auditory stimulus is insufficient for inducing large biases in tactile frequency sweep perception. However, in some subjects it was notable that auditory intensity sweeps appeared to induce weak biasing effects (Figure 5C): Sweeps that increased in amplitude could increase the likelihood that subjects felt an upward sweeping vibration. This pattern would be consistent with the fact that amplitude variations in a pure tone sound can sometimes induce illusory perceptions of frequency variations (Neuhoff et al., 1996). Collectively, our findings demonstrate that auditory influences on tactile sweep perception depend critically on a number of stimulus features, which implicates an interaction mechanism by sensory integration rather than decisional biases.
How might a neural network combine auditory and tactile sweep information? We constructed two deliberately simple models (Figure 6A; see Supplemental material for full model details) to highlight potential network architectures that could account for the influence of auditory sweep signals on the perception of concurrent tactile frequency sweeps. Briefly, these models assume the nervous system uses maximum-likelihood decoding to relate neural responses to stimulus perception (Crommett et al., 2017; Jazayeri & Movshon, 2006). Both models assume the existence of sensory neuron populations in an “encoding” layer that are tuned for frequency sweep direction and depth (i.e., selectivity for positive and negative sweep depth values). In both models, the population activity over the sensory neurons is pooled by a separate neural population in a “recoding” layer that represents the likelihood function: the probability that any given sweep depth is signaled by the noisy activity profile in the sensory neuron pool. Finally, a decision rule is applied to “read out” the most probable stimulus (given by the peak of the likelihood distribution), a computation that is assumed to be performed by other neurons involved in processing decision signals in the context of the perceptual task. The two models differed only with respect to the modality selectivity of the sensory neurons in the encoding layer (Figure 6A). In the supramodal sensory neuron (SSN) model, we assumed that the sensory neurons were responsive to both auditory and tactile sweeps. In the unimodal sensory neuron (USN) model, we assumed that sensory neurons were responsive only to inputs in a single modality; however, the neurons representing the likelihood function pooled over both of the modality-specific sensory neuron samples. To maintain the simplicity of the models, we assumed fixed relationships in the response properties of the sensory neurons in the encoding layer: 1) the tuning peak locations (μi) (i.e., the preferred sweep depths) were uniformly distributed in the depth domain (−600 to 600 Hz), 2) the tuning widths (σi) were identical for all neurons, and 3) the response gain (ai), indicating the maximum response level for each neuron, was identical for all neurons. Because of these assumptions, the SSN model comprised only 3 free parameters (a, σ, WA) corresponding to response gain, tuning selectivity, and the relative weight assigned to the auditory inputs. Because the USN model consisted of 2 unimodal sensory neuron populations, this model comprised 5 free parameters (aT, aA, σT, σA, WA) corresponding to the response gains and tuning selectivities of the unimodal neuron pools and the relative weight assigned to the auditory inputs. We found that both models were able to reproduce human performance patterns on the sweep direction identification task in Experiment 1 (Figure 6B). Moreover, each model also replicated the depth-dependent interaction patterns observed in Experiment 2 (Figure 6C), even when the model parameters were estimated from independent data associated with Experiment 1. These modeling results indicate that neural networks performing maximum-likelihood estimation can account for the direction- and depth-specific interactions between audition and touch in frequency sweep processing. Given that both models perform comparably in predicting the interaction patterns, one could conclude that the model comprising a single sensory neuron pool is more probable given that it contains fewer free parameters (Supplemental material). However, our limited behavioral data are clearly insufficient for a rigorous comparison between these models, and we present these toy models as a first step in considering potential mechanisms rather than to make a strong case for either network architecture. Additional studies are necessary to test and constrain these models.
For instance, testing for reciprocal interactions by evaluating the effect of tactile distractor sweeps on auditory perception could inform how pooling mechanisms weight inputs from the modalities under different states and contexts. A crossmodal adaptation paradigm could implicate sensory neural populations that exhibit tuning for both sensory modalities, which would distinguish between our two models. Neurophysiology recording experiments testing for bimodal responsiveness could also provide strong tests for these and related network models.
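The encoding–recoding–readout scheme described above can be sketched compactly. The following is a minimal sketch of the SSN variant only, assuming Gaussian tuning curves uniformly tiling the depth domain, Poisson response noise, and a weighted combination of the two modality inputs; all parameter values (a, σ, WA) are illustrative placeholders, not the fitted estimates reported in the Supplemental material:

```python
import numpy as np

rng = np.random.default_rng(0)

# Encoding layer: sensory neurons with Gaussian tuning for sweep depth,
# preferred depths (mu_i) uniformly tiling -600..600 Hz, and a shared
# gain (a) and tuning width (sigma). Values here are illustrative.
mu_i = np.linspace(-600, 600, 61)
a, sigma, w_a = 20.0, 150.0, 0.4  # gain, tuning width, auditory weight (WA)

def tuning(depth):
    return a * np.exp(-0.5 * ((depth - mu_i) / sigma) ** 2)

def decode(tactile_depth, auditory_depth):
    # Supramodal neurons respond to a weighted combination of the tactile
    # and auditory sweep components; responses carry Poisson noise.
    spikes = rng.poisson((1 - w_a) * tuning(tactile_depth)
                         + w_a * tuning(auditory_depth))
    # Recoding layer: log likelihood of each candidate depth given the
    # noisy population response (Poisson log likelihood, constants dropped).
    cand = np.linspace(-600, 600, 241)
    f = a * np.exp(-0.5 * ((cand[:, None] - mu_i[None, :]) / sigma) ** 2)
    loglik = (spikes * np.log(f + 1e-9) - f).sum(axis=1)
    # Readout: the most probable sweep depth (peak of the likelihood).
    return cand[np.argmax(loglik)]

# An upward auditory distractor pulls the decoded tactile depth upward.
est_up = np.mean([decode(0.0, 400.0) for _ in range(200)])
```

Because the auditory component shifts the population response profile, the likelihood peak (and hence the decoded tactile depth) is biased toward the auditory sweep, qualitatively reproducing the direction-specific biases observed behaviorally.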
Figure 6.
Modeling frequency sweep encoding and decoding. (A) Two possible network architectures. (left) Supramodal sensory neuron (SSN) model, (right) Unimodal sensory neuron (USN) model. Both models assume a feed-forward architecture that computes the log likelihood function representing the perceived frequency sweep depth. At the encoding level, sensory neurons tuned for sweep depth respond to a bimodal stimulus containing a tactile (black vertical line) and auditory (gray vertical line) sweep component with noisy activity. Sensory neuron responses are pooled by units in the recoding level, which collectively represent the full log likelihood function. The peak of the likelihood function indicates the most probable tactile sweep depth given the activity pattern in the encoding level. The USN model assumes that distinct pools of sensory neurons are dedicated to processing tactile and auditory sweeps. (B) Simulated choice probability data over 10 simulations of Experiment 1 and estimated psychophysical parameters in each distractor condition for SSN model (left) and USN model (right). (C) Simulated choice probability data over 10 simulations of Experiment 2 and estimated psychophysical parameters in each distractor condition for SSN model (left) and USN model (right). Error bars indicate s.e.m. Both models qualitatively reproduce the interaction patterns observed in Experiments 1 and 2 (Supplemental material).
Where in the brain might auditory and tactile frequency sweep representations interact? Multiple regions in human auditory cortex exhibit response modulation according to the direction and rate of auditory frequency sweeps (Brechmann, Baumgart, & Scheich, 2002; Brechmann & Scheich, 2005; Hsieh, Fillmore, Rong, Hickok, & Saberi, 2012; Joanisse & DeSouza, 2014; Johnsrude, Penhune, & Zatorre, 2000; Mendelson & Cynader, 1985; Sams & Näätänen, 1991; Ye et al., 2010; Zhang et al., 2003). These regions may be homologues of the auditory cortical regions in non-human primates which contain neurons responsive to frequency modulated tones (Sadagopan & Wang, 2009; Tian, 2004). Given such response properties, and to the extent that neurons in these areas also respond to tactile sweep inputs, these auditory regions are clear candidate sites for audiotactile interactions in sweep processing. Indeed, auditory cortical regions have been shown to respond to tactile inputs alone (Butler, Foxe, Fiebelkorn, Mercier, & Molholm, 2012; Caetano & Jousmäki, 2006; Foxe et al., 2002; Kayser et al., 2005; Nordmark, Pruszynski, & Johansson, 2012; Schürmann, Caetano, Hlushchuk, Jousmäki, & Hari, 2006) even at the level of unit responses (Fu et al., 2003; Lemus, Hernández, Luna, Zainos, & Romo, 2010; Schroeder et al., 2001). Intriguingly, the tendency for downward auditory sweeps to exert nominally stronger influences on tactile sweep perception may reflect a neural bias in representing downward sweeps more than upward sweeps, as has been reported (Andoni, Li, & Pollak, 2007; Fuzessery, 2006; Razak & Fuzessery, 2006). 
Interactions may also occur outside of auditory regions: Somatosensory regions can also respond to auditory stimulation (Beauchamp & Ro, 2008; Lemus et al., 2010; Liang, Mouraux, Hu, & Iannetti, 2013; Pérez-Bellido et al., 2017) so sweep interactions could occur in parietal lobe regions as well, although it is notable that explicit neural representations of frequency-modulated tactile stimuli have not been reported. Direct and indirect anatomical projections connect auditory and somatosensory areas (Cappe, Rouiller, & Barone, 2012; Hackett et al., 2007; Ro, Ellmore, & Beauchamp, 2013; Smiley & Falchier, 2009) so auditory and tactile signals could be mutually transmitted and processed across a distributed network of sensory regions. Additionally, auditory and somatosensory signals could converge in regions of frontal (Vergara, Rivera, Rossi-Pool, & Romo, 2016) and posterior parietal cortex (Levine & Schwarzbach, 2017) that support perceptual decision making.
Sensory information is represented and elaborated across processing hierarchies in subcortical and cortical systems (Felleman & Van Essen, 1991). For instance, representations of spatial features are transformed across brain regions comprising the visual cortical system (Connor, Brincat, & Pasupathy, 2007; Yamins & DiCarlo, 2016; Yau, Pasupathy, Brincat, & Connor, 2013) and the somatosensory cortical system (Bensmaia, Denchev, Dammann, Craig, & Hsiao, 2008; Bodegård, Geyer, Grefkes, Zilles, & Roland, 2001). The implementation of analogous neural codes (Yau, Pasupathy, Fitzgerald, Hsiao, & Connor, 2009) and the parallel transformation of feature representations across the visual and somatosensory systems (Yau, Connor, & Hsiao, 2013) may facilitate crosstalk between these two senses in the spatial domain. Representations of the temporal features of sounds similarly become elaborated across the auditory neuraxis (Rauschecker, Tian, & Hauser, 1995; Recanzone, 2011), but whether temporal features experienced by touch undergo similar representational transformations is unclear (Saal, Wang, & Bensmaia, 2016). Given the interactions between audition and touch in the perception of pure tone stimuli (Crommett et al., 2017; Ro et al., 2009; Wilson et al., 2010; Yau, Olenczak, et al., 2009; Yau et al., 2010) and spectrally complex signals like those associated with textures (Guest et al., 2002; Jousmäki & Hari, 1998; Lederman, 1979; Yau, Hollins, et al., 2009), it is conceivable that auditory and tactile temporal frequency information are also elaborated across parallel and interactive hierarchical processing streams. In this framework, processing at intermediate levels of these hierarchies would support higher-order temporal signals like frequency-modulated sweeps, and our behavioral results provide clear evidence for multisensory crosstalk at this intermediate representational level.
Conclusion
Our results show that tactile frequency sweep perception is systematically biased by concurrent auditory frequency sweeps. These results reveal that audition and touch interact not only in perceiving the frequency of simple pure tone signals, but also in the processing of higher-order temporal signals characterized by variations in the frequency domain. These behavioral results imply that neural interactions between auditory and tactile representations extend over cortical processing hierarchies.
Context of the Research
We experience and perceive environmental oscillations by audition and touch. We and others have shown that audition and touch interact intimately in the processing of temporal frequency information. Specifically, the manner by which we perceive the frequency of a pure tone signal in one sensory modality is systematically modulated by pure tone signals experienced in the other sensory modality. Such demonstrations of frequency-specific crosstalk between audition and touch imply the existence of shared neural circuits for representing low-level temporal frequency information. However, the signals that we typically encounter are not merely sinusoids. Instead, environmental signals are complex and marked by temporal variations in frequency and amplitude. Because neural representations of higher-order temporal frequency signals may be synthesized from representations of simple pure-tone signals, we designed the current study to test whether audition and touch might also interact in the processing of higher-order temporal frequency signals. Our main finding – that audition and touch interact in frequency sweep processing – implies that multisensory interactions can operate at multiple levels in sensory processing hierarchies. The present findings fit within a broader research program focused on understanding both perceptual and neural processing of multisensory stimuli. We plan to extend the current work using functional neuroimaging to investigate how the human brain represents auditory and tactile frequency sweep signals and multisensory interactions in frequency sweep processing.
Supplementary Material
Acknowledgements:
This research was supported by Alfred P. Sloan Research Fellowship (JMY), R01NS097462 (JMY), and NSF IGERT fellowship (LEC). We thank Yau Lab members for thoughtful discussions, W. Nash, W. Quinlain, and V. Careagas for assistance with the noise attenuation box, and J. Killebrew for technical assistance. This work was performed in the Neuromodulation and Behavioral Testing Facility of BCM’s Core for Advanced MRI (CAMRI). We gratefully acknowledge the assistance and computing resources provided by the CIBR Center for Computational and Integrative Biomedical Research of BCM in the completion of this work.
References
- Alais D, Orchard-Mills E, & Van der Burg E (2015). Auditory frequency perception adapts rapidly to the immediate past. Attention, Perception, and Psychophysics, 77(3), 896–906. 10.3758/s13414-014-0812-2
- Andoni S, Li N, & Pollak GD (2007). Spectrotemporal receptive fields in the inferior colliculus revealing selectivity for spectral motion in conspecific vocalizations. Journal of Neuroscience, 27(18), 4882–4893. 10.1523/JNEUROSCI.4342-06.2007
- Beauchamp MS, & Ro T (2008). Neural substrates of sound-touch synesthesia after a thalamic lesion. Journal of Neuroscience, 28(50), 13696–13702. 10.1523/JNEUROSCI.3872-08.2008
- Bensmaïa S, Hollins M, & Yau J (2005). Vibrotactile intensity and frequency information in the Pacinian system: A psychophysical model. Perception & Psychophysics, 67(5), 828–841. 10.3758/BF03193536
- Bensmaia SJ, Denchev PV, Dammann JF, Craig JC, & Hsiao SS (2008). The representation of stimulus orientation in the early stages of somatosensory processing. Journal of Neuroscience, 28(3), 776–786. 10.1523/JNEUROSCI.4162-07.2008
- Bensmaïa SJ, & Hollins M (2003). The vibrations of texture. Somatosensory & Motor Research, 20(1), 33–43. 10.1080/0899022031000083825
- Bodegård A, Geyer S, Grefkes C, Zilles K, & Roland PE (2001). Hierarchical processing of tactile shape in the human brain. Neuron, 31(2), 317–328. 10.1016/S0896-6273(01)00362-2
- Brechmann A, Baumgart F, & Scheich H (2002). Sound-level-dependent representation of frequency modulations in human auditory cortex: A low-noise fMRI study. Journal of Neurophysiology, 87(1), 423–433. 10.1152/jn.00187.2001
- Brechmann A, & Scheich H (2005). Hemispheric shifts of sound representation in auditory cortex with conceptual listening. Cerebral Cortex, 15(5), 578–587. 10.1093/cercor/bhh159
- Butler JS, Foxe JJ, Fiebelkorn IC, Mercier MR, & Molholm S (2012). Multisensory representation of frequency across audition and touch: High density electrical mapping reveals early sensory-perceptual coupling. Journal of Neuroscience, 32(44), 15338–15344. 10.1523/JNEUROSCI.1796-12.2012
- Caetano G, & Jousmäki V (2006). Evidence of vibrotactile input to human auditory cortex. NeuroImage, 29(1), 15–28. 10.1016/j.neuroimage.2005.07.023
- Cappe C, Rouiller EM, & Barone P (2012). Cortical and thalamic pathways for multisensory and sensorimotor interplay. In The Neural Bases of Multisensory Processes (pp. 13–28). 10.1201/b11092-4
- Connor CE, Brincat SL, & Pasupathy A (2007). Transformation of shape information in the ventral pathway. Current Opinion in Neurobiology, 17(2), 140–147. 10.1016/j.conb.2007.03.002
- Crommett LE, Pérez-Bellido A, & Yau JM (2017). Auditory adaptation improves tactile frequency perception. Journal of Neurophysiology, 117(3). 10.1152/jn.00783.2016
- Dirks DD, Kamm C, & Gilman S (1976). Bone conduction thresholds for normal listeners in force and acceleration units. Journal of Speech and Hearing Research, 19(1), 181–186. 10.1044/jshr.1901.181
- Felleman DJ, & Van Essen DC (1991). Distributed hierarchical processing in the primate cerebral cortex. Cerebral Cortex, 1(1), 1–47. 10.1093/cercor/1.1.1-a
- Foxe JJ, Wylie GR, Martinez A, Schroeder CE, Javitt DC, Guilfoyle D, … Murray MM (2002). Auditory-somatosensory multisensory processing in auditory association cortex: An fMRI study. Journal of Neurophysiology, 88(1), 540–543. 10.1152/jn.00694.2001
- Fu KM, Johnston TA, Shah AS, Arnold L, Smiley J, Hackett TA, … Schroeder CE (2003). Auditory cortical neurons respond to somatosensory stimulation. Journal of Neuroscience, 23(20), 7510–7515.
- Fuzessery ZM (2006). Neural mechanisms underlying selectivity for the rate and direction of frequency-modulated sweeps in the inferior colliculus of the pallid bat. Journal of Neurophysiology, 96(3), 1320–1336. 10.1152/jn.00021.2006
- Gardner RB, & Wilson JP (1979). Evidence for direction-specific channels in the processing of frequency modulation. The Journal of the Acoustical Society of America, 66(3), 704–709.
- Guest S, Catmur C, Lloyd D, & Spence C (2002). Audiotactile interactions in roughness perception. Experimental Brain Research, 146(2), 161–171. 10.1007/s00221-002-1164-z
- Hackett TA, Smiley JF, Ulbert I, Karmos G, Lakatos P, De La Mothe LA, & Schroeder CE (2007). Sources of somatosensory input to the caudal belt areas of auditory cortex. Perception, 36(10), 1419–1430. 10.1068/p5841
- Hsieh I-H, Fillmore P, Rong F, Hickok G, & Saberi K (2012). FM-selective networks in human auditory cortex revealed using fMRI and multivariate pattern classification. Journal of Cognitive Neuroscience, 24(9), 1896–1907. 10.1162/jocn_a_00254
- Jazayeri M, & Movshon JA (2006). Optimal representation of sensory information by neural populations. Nature Neuroscience, 9(5), 690–696. 10.1038/nn1691
- Joanisse MF, & DeSouza DD (2014). Sensitivity of human auditory cortex to rapid frequency modulation revealed by multivariate representational similarity analysis. Frontiers in Neuroscience, 8, 306. 10.3389/fnins.2014.00306
- Johnsrude IS, Penhune VB, & Zatorre RJ (2000). Functional specificity in the right human auditory cortex for perceiving pitch direction. Brain, 123(1), 155–163. 10.1093/brain/123.1.155
- Jousmäki V, & Hari R (1998). Parchment-skin illusion: Sound-biased touch. Current Biology, 8(6), R190. 10.1016/S0960-9822(98)70120-4
- Kay RH, & Matthews DR (1972). On the existence in human auditory pathways of channels selectively tuned to the modulation present in frequency-modulated tones. The Journal of Physiology, 225(3), 657–677.
- Kayser C, Petkov CI, Augath M, & Logothetis NK (2005). Integration of touch and sound in auditory cortex. Neuron, 48(2), 373–384. 10.1016/j.neuron.2005.09.018
- Lederman SJ (1979). Auditory texture perception. Perception, 8(1), 93–103. 10.1068/p080093
- Lemus L, Hernández A, Luna R, Zainos A, & Romo R (2010). Do sensory cortices process more than one sensory modality during perceptual judgments? Neuron, 67(2), 335–348. 10.1016/j.neuron.2010.06.015
- Levine SM, & Schwarzbach J (2017). Decoding of auditory and tactile perceptual decisions in parietal cortex. NeuroImage, 162, 297–305. 10.1016/j.neuroimage.2017.08.060
- Liang M, Mouraux A, Hu L, & Iannetti GD (2013). Primary sensory cortices contain distinguishable spatial patterns of activity for each sense. Nature Communications, 4. 10.1038/ncomms2979
- Madden JP, & Fire KM (1997). Detection and discrimination of frequency glides as a function of direction, duration, frequency span, and center frequency. The Journal of the Acoustical Society of America, 102(5 Pt 1), 2920–2924. 10.1121/1.420346
- Manfredi LR, Saal HP, Brown KJ, Zielinski MC, Dammann JF, Polashock VS, & Bensmaia SJ (2014). Natural scenes in tactile texture. Journal of Neurophysiology, 111, 1792–1802. 10.1152/jn.00680.2013
- Mendelson JR, & Cynader MS (1985). Sensitivity of cat primary auditory cortex (AI) neurons to the direction and rate of frequency modulation. Brain Research, 327(1–2), 331–335. 10.1016/0006-8993(85)91530-6
- Neuhoff JG, & McBeath MK (1996). The Doppler illusion: The influence of dynamic intensity change on perceived pitch. Journal of Experimental Psychology: Human Perception and Performance, 22(4), 970–985. 10.1037/0096-1523.22.4.970
- Nordmark PF, Pruszynski JA, & Johansson RS (2012). BOLD responses to tactile stimuli in visual and auditory cortex depend on the frequency content of stimulation. Journal of Cognitive Neuroscience, 24(10), 2120–2134. 10.1162/jocn_a_00261
- Occelli V, Spence C, & Zampini M (2011). Audiotactile interactions in temporal perception. Psychonomic Bulletin and Review, 18(3), 429–454. 10.3758/s13423-011-0070-4
- Pérez-Bellido A, Barnes KA, Crommett LE, & Yau JM (2017). Auditory frequency representations in human somatosensory cortex. Cerebral Cortex, 1–14. 10.1093/cercor/bhx255
- Rauschecker JP, Tian B, & Hauser M (1995). Processing of complex sounds in the macaque nonprimary auditory cortex. Science, 268(5207), 111–114. 10.1126/science.7701330
- Razak KA, & Fuzessery ZM (2006). Neural mechanisms underlying selectivity for the rate and direction of frequency-modulated sweeps in the auditory cortex of the pallid bat. Journal of Neurophysiology, 96(3), 1303–1319. 10.1152/jn.00020.2006
- Recanzone GH (2011). Perception of auditory signals. Annals of the New York Academy of Sciences, 1224(1), 96–108. 10.1111/j.1749-6632.2010.05920.x
- Ro T, Ellmore TM, & Beauchamp MS (2013). A neural link between feeling and hearing. Cerebral Cortex, 23(7), 1724–1730. 10.1093/cercor/bhs166
- Ro T, Hsu J, Yasar NE, Elmore LC, & Beauchamp MS (2009). Sound enhances touch perception. Experimental Brain Research, 195(1), 135–143. 10.1007/s00221-009-1759-8
- Saal HP, Harvey MA, & Bensmaia SJ (2015). Rate and timing of cortical responses driven by separate sensory channels. eLife, 4, e10450. 10.7554/eLife.10450
- Saal HP, Wang X, & Bensmaia SJ (2016). Importance of spike timing in touch: An analogy with hearing? Current Opinion in Neurobiology. 10.1016/j.conb.2016.07.013
- Sadagopan S, & Wang X (2009). Nonlinear spectrotemporal interactions underlying selectivity for complex sounds in auditory cortex. Journal of Neuroscience, 29(36), 11192–11202. 10.1523/JNEUROSCI.1286-09.2009
- Sams M, & Näätänen R (1991). Neuromagnetic responses of the human auditory cortex to short frequency glides. Neuroscience Letters, 121(1–2), 43–46. 10.1016/0304-3940(91)90645-A
- Schroeder CE, Lindsley RW, Specht C, Marcovici A, Smiley JF, & Javitt DC (2001). Somatosensory input to auditory association cortex in the macaque monkey. Journal of Neurophysiology, 85(3), 1322–1327.
- Schürmann M, Caetano G, Hlushchuk Y, Jousmäki V, & Hari R (2006). Touch activates human auditory cortex. NeuroImage, 30(4), 1325–1331. 10.1016/j.neuroimage.2005.11.020
- Schürmann M, Caetano G, Jousmäki V, & Hari R (2004). Hands help hearing: Facilitatory audiotactile interaction at low sound-intensity levels. The Journal of the Acoustical Society of America, 115(2), 830–832. 10.1121/1.1639909
- Shu ZJ, Swindale NV, & Cynader MS (1993). Spectral motion produces an auditory after-effect. Nature, 364(6439), 721–723. 10.1038/364721a0
- Smiley JF, & Falchier A (2009). Multisensory connections of monkey auditory cerebral cortex. Hearing Research, 258(1–2), 37–46. 10.1016/j.heares.2009.06.019
- Tian B (2004). Processing of frequency-modulated sounds in the lateral auditory belt cortex of the rhesus monkey. Journal of Neurophysiology, 92(5), 2993–3013. 10.1152/jn.00472.2003
- Vergara J, Rivera N, Rossi-Pool R, & Romo R (2016). A neural parametric code for storing information of more than one sensory modality in working memory. Neuron, 89(1), 54–62. 10.1016/j.neuron.2015.11.026
- Wilson EC, Reed CM, & Braida LD (2010). Integration of auditory and vibrotactile stimuli: Effects of frequency. The Journal of the Acoustical Society of America, 127(5), 3044–3059. 10.1121/1.3365318
- Yamins DLK, & DiCarlo JJ (2016). Using goal-driven deep learning models to understand sensory cortex. Nature Neuroscience, 19(3), 356–365. 10.1038/nn.4244
- Yau JM, Connor CE, & Hsiao SS (2013). Representation of tactile curvature in macaque somatosensory area 2. Journal of Neurophysiology, 109(12), 2999–3012. 10.1152/jn.00804.2012
- Yau JM, Hollins M, & Bensmaia SJ (2009). Textural timbre: The perception of surface microtexture depends in part on multimodal spectral cues. Communicative and Integrative Biology, 2(4), 344–346. 10.4161/cib.2.4.8551
- Yau JM, Kim SS, Thakur PH, & Bensmaia SJ (2016). Feeling form: The neural basis of haptic shape perception. Journal of Neurophysiology, 115(2), 631–642. 10.1152/jn.00598.2015
- Yau JM, Olenczak JB, Dammann JF, & Bensmaia SJ (2009). Temporal frequency channels are linked across audition and touch. Current Biology, 19(7), 561–566. 10.1016/j.cub.2009.02.013
- Yau JM, Pasupathy A, Brincat SL, & Connor CE (2013). Curvature processing dynamics in macaque area V4. Cerebral Cortex, 23(1), 198–209. 10.1093/cercor/bhs004
- Yau JM, Pasupathy A, Fitzgerald PJ, Hsiao SS, & Connor CE (2009). Analogous intermediate shape coding in vision and touch. Proceedings of the National Academy of Sciences of the United States of America, 106(38), 16457–62. 10.1073/pnas.0904186106 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Yau JM, Weber AI, & Bensmaia SJ (2010). Separate mechanisms for audio-tactile pitch and loudness interactions. Frontiers in Psychology, 1(OCT), 1–11. 10.3389/fpsyg.2010.00160 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ye C, Poo M, Dan Y, & Zhang X (2010). Synaptic mechanisms of direction selectivity in primary auditory cortex. Journal of Neuroscience, 30(5), 1861–1868. 10.1523/JNEUROSCI.3088-09.2010 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Zhang LI, Tan AYY, Schreiner CE, & Merzenich MM (2003). Topography and synaptic shaping of direction selectivity in primary auditory cortex. Nature, 424(6945), 201–205. 10.1038/nature01796 [DOI] [PubMed] [Google Scholar]