
This is a preprint; it has not yet been peer reviewed by a journal.


bioRxiv
[Preprint]. 2026 Mar 19:2026.03.18.712638. [Version 1] doi: 10.64898/2026.03.18.712638

From Breath to Behavior: Respiratory Features Predict Visual Detection Performance

Emily Skog 1, Deepa Issar 1,3, Madison Grigg 1, Samantha E Nelson 2, Jana M Kainerstorfer 1,2,3,4,5,*, Matthew A Smith 1,2,3,4,5,6,*
PMCID: PMC13015306  PMID: 41889949

SUMMARY

Breathing is a continuous bodily rhythm that not only sustains physiology but also shapes brain function and behavior. Here we investigated how respiration interacts with perceptual performance in nonhuman primates performing a visual detection task. Using continuous recordings, we extracted detailed features from each respiratory cycle, including timing, duration, phase, depth, and volume, aligned to trial onset. Analyses revealed that timing-related features, such as inhalation onset and respiration cycle length, were the most reliable markers of trial outcome, whereas amplitude-based measures contributed less consistently. These findings demonstrate that the temporal structure of breathing, rather than its magnitude, plays a dominant role in shaping behavior on a moment-to-moment basis. By uncovering how fine-grained features of respiration align with perceptual success, our work establishes respiration as a strong correlate of cognition and highlights the value of feature-based approaches for linking interoceptive rhythms to behavior.

Keywords: Respiration, interoception, non-human primate, visual perception, body-brain interactions

INTRODUCTION

Respiration is a continuous, partially volitional internal bodily rhythm whose functions extend far beyond oxygen metabolism. Accumulating evidence from human and animal studies suggests that respiration dynamically interacts with neural circuits and modulates cognitive performance. For example, the phase, volume, and rate of breathing have been shown to entrain neural oscillations across multiple brain regions, including the prefrontal cortex1-4, hippocampus1,2,5, and sensory areas2,6-9. These respiration-locked fluctuations also influence perception, as reflected in changes in reaction times10-12 and task accuracy2,13-17. Thus, the act of breathing is tightly interwoven with how the brain perceives, decides, and acts.

More broadly, respiration is a central component of interoception, the process by which the brain senses and integrates internal bodily signals18. Interoception is fundamental for maintaining physiological balance and guiding adaptive behavior, and its disruption has been linked to neuropsychiatric and neurodegenerative disorders, including schizophrenia19-21, depression22-24, and Alzheimer’s disease25,26. Studying respiration therefore offers a unique lens into how interoceptive signals modulate neural activity and cognition.

Prior research on respiration and behavior has largely emphasized slower, minutes-to-hours timescales27,28, such as in sleep29,30, anxiety31-33, or respiratory disease detection34-36. Far less is known about the interactions between respiration and behavior at finer resolutions, such as moment-to-moment perceptual decisions. In addition, many analyses rely on single-dimensional summaries of breathing (e.g., rate, depth, or a single phase value)14,15, which capture broad dynamics but may overlook the rich temporal and structural features embedded in the respiratory signal. This is particularly important because respiration is inherently variable: features such as phase, depth, flow, duration, and rate covary in complex, nonlinear ways from cycle to cycle. This variability introduces a rich but underexamined source of information27,37. Additionally, studies often emphasize that neural and behavioral measures are aligned to inhalation, treating it as a key respiratory feature for task success, yet this focus may obscure subtler features distributed across the entire respiratory cycle15,38.

To overcome these limitations, we present a novel feature-based framework that decodes behavior directly from trial-level respiratory dynamics in a perceptual decision-making task in non-human primates (NHPs). Using high-resolution respiration recordings from NHPs performing a visual detection task, we extracted a comprehensive set of features from each respiratory cycle, including timing, phase, duration, depth, and volume, aligned to the onset of each trial. We then used these features to predict behavioral outcomes on a single-trial basis, using interpretable machine learning models. We found that timing-related features (such as rate) were consistently more predictive of behavioral performance than amplitude-based features (such as depth or volume). Overall, this study supports a role for respiration in modulating behavior.

RESULTS

Respiration was recorded during a visual perceptual task in NHPs

We recorded high-resolution respiration waveforms from two male rhesus macaque monkeys (Monkey RA and Monkey AB) during a visual color-change detection task (Figure 1A). Monkeys fixated on a central dot and withheld eye movements until the fixation changed color. Then they made a saccade to a peripheral target to obtain reward (Figure 1B). Trials were classified as correct, false alarm, or miss, as described in the STAR Methods. Perceptual difficulty was manipulated by varying the magnitude of the color change (Figure 1C). This design allowed us to examine how respiration interacts with varying task difficulty.

Figure 1. Respiration recording and behavioral performance during a visual perceptual task in nonhuman primates.

Figure 1.

(A) Experimental setup for simultaneous respiration recordings during perceptual change detection.

(B) Schematic of the color change detection task, which involved a change in the color of the fixation spot quantified by a rotation in CIELUV color space (Monkey RA: ~3°, ~5°, ~9°, ~15°; Monkey AB: ~4°, ~8°, ~10°, ~12°).

(C) Manipulation of perceptual difficulty by varying the degree of color rotation between fixation and target; larger rotations were easier to detect and smaller rotations more difficult.

(D) Psychometric curve of Monkey RA’s behavioral performance. Hit rate (the number of correct trials divided by the total number of correct and miss trials) is plotted as a function of color-change difficulty (ranked from 1, the easiest, to 4, the hardest). Light gray lines represent individual session performance, and the dark gray line shows the across-session mean (± SEM).

(E) Behavioral performance summary across all sessions for each monkey. Pie charts indicate the proportion of correct (blue), false alarm (orange), and miss (yellow) trials for Monkey RA (left) and Monkey AB (right).

We analyzed each animal’s behavioral performance as a function of the color change magnitude, creating a psychophysical response function. Detection performance was defined as hit rate, calculated as the number of correct trials divided by the total number of correct and miss trials. This psychometric curve demonstrated a clear improvement in detection performance with increasing color change magnitude for individual sessions and the across-session average (light and dark gray curves, respectively, in Figure 1D for Monkey RA). We titrated the change difficulty values such that overall performance was similar for our two subjects, with approximately 70% of trials resulting in a correct detection (Figure 1E). Thus, our task structure created a robust framework for linking respiration with trial-by-trial behavioral outcomes.
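The hit-rate definition above can be made concrete with a minimal sketch. The analyses themselves were run in MATLAB; this Python version uses made-up trial labels purely for illustration.

```python
import numpy as np

def hit_rate(outcomes):
    # Hit rate as defined in the text: correct / (correct + miss);
    # false alarms are excluded from the denominator.
    outcomes = np.asarray(outcomes)
    n_correct = np.sum(outcomes == "correct")
    n_miss = np.sum(outcomes == "miss")
    return n_correct / (n_correct + n_miss)

# Made-up labels: 3 correct, 1 miss, 1 false alarm -> 3 / 4
print(hit_rate(["correct", "correct", "miss", "false_alarm", "correct"]))  # 0.75
```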

Session-level analysis revealed consistent differences in respiration timing based on trial outcome

To extract the features of respiration traces, we applied a multi-step signal processing pipeline (detailed description in STAR methods) and compared respiration waveforms across trial outcomes (Figure 2A-C). The overall frequency content of the respiration signal was relatively constant across the session, as seen in a spectrogram from one example session (Figure 2A). The dominant band of power was found between 0.1–0.5 Hz, around 15 breaths per minute. Example traces from Monkey RA (Figure 2B) illustrate how bandpass filtering with a third-order Butterworth filter (0.1–0.5 Hz) preserved the overall cycle structure while isolating the dominant frequency. These filtered respiration traces revealed trial outcome–specific differences in waveform shape. When aligned to trial start, average filtered traces across sessions (Figure 2C) showed subtle differences between correct and incorrect (false-alarm and miss) outcomes, motivating further feature-based analyses of respiration as a predictor of behavioral performance.

Figure 2. Signal processing pipeline and outcome-specific differences in respiration waveforms.

Figure 2.

(A) Respiration spectrogram from an example session. Power was concentrated between 0.1–0.5 Hz.

(B) Example respiration traces from Monkey RA. Top: raw respiration signal across several trials. Bottom: bandpass-filtered signal (0.1–0.5 Hz, third-order Butterworth filter). Blue shading marks correct trials, and red shading marks incorrect trials (false alarms + misses).

(C) Average respiration waveforms aligned to trial onset, shown separately for correct (blue) and incorrect (red) trials. Shaded regions denote ± 95% confidence intervals across all sessions.

Our session-level analysis thus revealed consistent differences in the average respiration waveform depending on trial outcome. We next wondered whether specific components of the respiratory cycle related to behavior on individual trials. To do this, we focused on the first full cycle after each trial’s onset and extracted a comprehensive set of temporal and amplitude-based features (Figure 3A). These included the onset times and durations of inhalation and exhalation, overall cycle length, and measures of volume and depth.

Figure 3. Session-level respiration timing features differ between correct and incorrect trials.

Figure 3.

(A–E) Session-level scatter plots of respiration timing-based features (purple), calculated as the mean feature value for correct trials (x-axis) vs. the mean for incorrect trials (y-axis). Each point corresponds to one session, with Monkey RA shown as circles (n = 41 sessions) and Monkey AB as squares (n = 48 sessions). The Wilcoxon signed-rank test was used to determine significance.

(F) Schematic illustration of respiration feature extraction. Timing features (purple) include inhalation/exhalation onset times, durations, and overall cycle length; amplitude features (green) include volume and depth. See STAR Methods for detailed definitions.

(G–K) Same as (A-E) but grouped by respiration amplitude-based features (green).

Across sessions, timing features showed the strongest and most consistent outcome-related effects (Figure 3A-E). For example, inhalation onset relative to trial start (Figure 3A) was earlier for correct trials for both monkeys (Monkey RA: p < 0.001; Monkey AB: p < 0.001; pooled: p < 0.001). Respiration cycle length was also longer in correct trials for Monkey RA and the pooled data, though not for Monkey AB (Figure 3B), while exhalation length showed consistent increases across both animals (Figure 3C). In contrast, amplitude features did not show consistent effects across monkeys (Figure 3G-K). These effects point to a systematic alignment of the respiratory cycle with successful performance, suggesting that temporal aspects of breathing play a key role in shaping trial-by-trial behavior.

Importantly, not all timing features were the same between monkeys. Exhalation onset, for example, was significantly shifted in Monkey AB but not Monkey RA (Figure 3E), suggesting some degree of individual variability. Nevertheless, the overall pattern highlights that respiration timing reliably distinguished correct from incorrect outcomes at the session level. This motivated the next step of our analysis: moving beyond average session effects to ask whether these temporal features could predict behavior on a single-trial basis.

Session-level distributions reveal that respiration features are discriminable within individual recording days

To complement our across-session comparisons, we next asked whether respiration features could differentiate between trial outcomes within a single recording session. Specifically, we examined whether respiration features that differed across sessions between correct and incorrect trials also showed separable trial-by-trial distributions within an individual day of recording. We focused on two respiration features that consistently showed outcome-related differences across sessions: respiration cycle length and exhalation volume. For illustration, we examined example sessions from the dataset shown in Figure 3. In one session, the distribution of respiration cycle length differed between outcomes, with correct trials tending to have longer respiration cycles than incorrect trials (Figure 4A). In another session, the distribution of exhalation volume was shifted such that correct trials tended to have larger exhalation volumes than incorrect trials (Figure 4B).

Figure 4. Respiration features are discriminable by trial outcome within an example session.

Figure 4.

(A) Example-session histograms of respiration length (first complete cycle after trial onset), plotted separately for correct (blue) and incorrect (red) trials. The mean of each distribution is indicated with a vertical line matched in color to the histogram.

(B) Same as (A), but for exhalation volume from another example session.

For both panels, distributions were compared using a two-sample t test, showing robust outcome-related differences (p ≪ 0.001 for each feature).

Although the trial-by-trial respiration values showed substantial overlap between outcomes, the distributions were significantly different within these example sessions (two-sample t-test; p ≪ 0.001 for both comparisons). These results demonstrate that respiration features carry information about behavioral outcomes that can be detected even within a single day of recording. We next turned to an approach that could aggregate across all of the respiration features to predict trial outcomes on a single-trial basis.

Trial outcomes are well predicted by a combination of respiratory features

To test whether respiratory dynamics could predict behavior at the single-trial level, we trained a set of classifiers using features extracted from the first full respiration cycle following trial onset. Models included Gradient Boosting (GB), Random Forest (RF), Support Vector Machines (SVM), k-nearest neighbors (KNN) and multi-layer perceptrons (MLPs), providing a mix of interpretable ensemble methods and a neural network architecture.

Model performance was evaluated under two complementary cross-validation schemes (Table 1). We used 5-fold cross-validation (5CV) and leave-one-session-out (LOSO). The latter provided a particularly stringent test of robustness by assessing whether models trained on one set of recording days could generalize to entirely new sessions. Both validation schemes revealed that respiration features reliably predicted trial outcome. In Monkey RA, GB and MLP achieved the highest accuracy under 5CV (75%), while the GB performed best under LOSO (72%), closely followed by the MLP (71%). In Monkey AB, RF outperformed other models, reaching 73% in 5CV and 70% in LOSO. These results indicate that models can generalize both within and across sessions, confirming that respiration features carry behaviorally predictive information beyond session-specific variability.

Table 1.

Cross-validated classifier accuracies for both monkeys

Test accuracy (%)

Model Type           5CV, Monkey RA   LOSO, Monkey RA   5CV, Monkey AB   LOSO, Monkey AB
Gradient Boosting    75               72                72               70
MLP                  75               71                72               70
SVM                  73               70                71               67
Random Forest        73               70                73               70
KNN                  69               67                69               65

Because respiration is a process that can span the boundary between trials, and can potentially be affected by recent events such as sipping at the reward tube, we asked whether the relationship between respiration and behavioral outcome was influenced by trial history. To do this, we reran the GB classifier on the subset of trials that followed a correct trial, so that trial selection varied by current outcome while holding the preceding outcome constant. We found that classifier accuracy remained comparable to prediction on the full set of trials, reaching 74% in Monkey RA and 72% in Monkey AB. This demonstrates that respiration features continue to predict behavior even after controlling for trial history.
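The trial-history control amounts to selecting trials whose preceding trial was correct. A minimal sketch with illustrative labels (the original selection was done in MATLAB):

```python
import numpy as np

def trials_after_correct(outcomes):
    # Indices of trials whose preceding trial was correct
    # (trial 0 has no history and is never selected).
    outcomes = np.asarray(outcomes)
    prev_correct = outcomes[:-1] == "correct"
    return np.where(prev_correct)[0] + 1

# Illustrative labels: trials 1, 3, and 4 follow a correct trial
print(trials_after_correct(["correct", "miss", "correct", "correct", "false_alarm"]))
```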

To interpret which aspects of respiration drove model performance, we examined feature importance scores from the GB model (Figure 5A-B). Timing features, such as respiration length and inhalation start time, were consistently more informative than amplitude-based features, aligning with the results in Figure 3. We next addressed feature redundancy using an exhaustive feature-combination analysis (Figure 5C-D). For each possible number of features, all combinations were evaluated, and test accuracy was summarized as the mean across all possible sets (gray line) and the best-performing set (red line). In both animals, accuracy increased with the addition of features, and the best set of 3-6 features achieved performance on par with the full set. Timing features were again prominent in these smaller feature combinations, with amplitude features providing modest complementary value as the model approached its plateau. We also built a model that pooled trials across both monkeys into a single dataset, yielding comparable accuracy (72% with GB), which indicates that despite individual variability the predictive structure of respiration features generalized across animals.
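The exhaustive feature-combination procedure can be sketched as follows. The scoring function here is a toy stand-in on synthetic data; the actual analysis used 5-fold cross-validated Gradient Boosting in MATLAB.

```python
import numpy as np
from itertools import combinations

def exhaustive_combo_search(X, y, score_fn):
    # For each feature count k, evaluate every k-feature subset and record
    # the mean score (gray curve) and best score (red curve), as in Fig. 5C-D.
    n_features = X.shape[1]
    summary = {}
    for k in range(1, n_features + 1):
        scores = [score_fn(X[:, list(c)], y)
                  for c in combinations(range(n_features), k)]
        summary[k] = {"mean": float(np.mean(scores)), "best": float(np.max(scores))}
    return summary

# Toy demonstration with synthetic data and a stand-in scorer
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)
X = rng.normal(size=(200, 4)) + y[:, None] * np.array([1.0, 0.5, 0.0, 0.0])

def threshold_score(X_sub, y):
    # Crude accuracy of a median-threshold rule on the feature mean
    s = X_sub.mean(axis=1)
    pred = (s > np.median(s)).astype(int)
    return max(np.mean(pred == y), np.mean(pred != y))

result = exhaustive_combo_search(X, y, threshold_score)
```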

Figure 5. Features related to respiration timing were the strongest predictors of trial outcome.

Figure 5.

(A) Gradient Boosting feature importance scores for Monkey RA, with features color-coded by type (timing = purple, amplitude = green).

(B) Same as (A), for Monkey AB.

(C) Test accuracy as a function of the number of features included in the model for Monkey RA. For each feature count, all possible feature combinations were evaluated. The gray line shows mean accuracy across combinations, and the red line indicates the best-performing combination (5-fold CV). The colored grid below the curve depicts which features were present in the best-performing combination at each feature count: each square corresponds to one feature (y-axis, in the same order as the feature-importance rankings in (A) and (B)) at a given feature count (x-axis), with purple denoting timing features and green denoting amplitude features.

(D) Same as (C), for Monkey AB.

DISCUSSION

Our findings demonstrate that respiration carries reliable predictive information about perceptual performance on a single-trial basis. By using interpretable, feature-based models, we identified timing-based respiratory features, particularly the latency to inhalation onset and the duration of the first cycle after trial onset, as the strongest predictors of trial outcome. In contrast, amplitude-based features such as depth or volume contributed far less. A key strength of our approach is the systematic extraction of cycle-level features (onset times, durations, phase, depth, and volume) aligned to task behavior.

Across trials, correct responses were associated with earlier inhalations and longer cycle durations, indicating that the timing of inhalation relative to trial onset was behaviorally meaningful. This aligns closely with human behavioral work showing that the inhalation phase of breathing often synchronizes with task engagement and increases the probability of detecting near-threshold stimuli. For example, respiration phase at stimulus onset predicts visual detection performance14, and humans spontaneously time inhalations to anticipated cognitive demands15,39. Although most prior studies describe effects in terms of phase, our results show that explicit timing (i.e., when inhalation begins relative to trial onset) provides an equally powerful description of how respiration shapes perception. In mice, combinations of temporal features such as inspiratory and expiratory durations reliably differentiate behavioral states over extended timescales27. Our results extend this logic to moment-to-moment perceptual decisions, demonstrating that fine-grained temporal features of a single respiration can predict whether a stimulus will be detected.

Together, these behavioral findings complement a growing neural literature showing that respiration rhythmically modulates cortical excitability. Human EEG/MEG studies reveal respiration-locked fluctuations in alpha and beta power, with phases of heightened excitability corresponding to improved stimulus detection14,15. Such oscillatory modulations suggest a form of sensory gating, where inhalation-aligned increases in cortical activity make the brain more receptive to incoming sensory information. This framework provides a mechanistic interpretation of our behavioral results: trials that begin during “high-excitability” respiratory phases, marked by earlier inhalation onset or longer initial cycles, may benefit from increased neural excitability and therefore show higher perceptual accuracy. This work supports a hypothesis in which respiration shapes trial-by-trial fluctuations in perceptual performance. A natural next step is to add neural recordings to directly test whether these behaviorally advantageous respiratory mechanisms correspond to measurable changes in cortical excitability and sensory processing.

This strong contribution of timing-based respiratory features to predicting trial outcomes aligns well with broader evidence that systemic physiology is connected to brain activity and behavior. In ECG and pupil studies, trial onset is often accompanied by reliable shifts in heart rate and pupil size, reflecting preparatory autonomic adjustments16,40-43. Similarly, work in monkeys and humans has shown that neural activity in the visual cortex modulates prior to stimulus onset in anticipation of task demands40-43. These findings suggest the existence of latent preparatory states that coordinate interoceptive and neural rhythms to optimize sensory processing.

More broadly, theories of interoception suggest that bodily rhythms provide context signals that dynamically influence cognition44,45. One account proposes a common arousal mechanism, in which brainstem nuclei such as the locus coeruleus coordinate global state changes that simultaneously modulate cortical excitability and peripheral physiology, including respiration46,47. A complementary framework posits a direct respiratory-neural coupling, where respiratory rhythm generators and afferent pathways interact with cortical and thalamic circuits, allowing breathing dynamics themselves to entrain neuronal oscillations and shape sensory processing48,49. Our data provide complementary evidence that respiration participates in this preparatory alignment and may improve detection accuracy. One interpretation is that breathing rhythms help structure the timing of cortical excitability, such that trials aligned to favorable phases of the respiratory cycle are more likely to succeed. Future work should test this causal role directly, for example, by manipulating respiration timing relative to task events and assessing the impact on performance.

Several considerations point to clear directions for future work. First, while our classifiers exceeded chance and generalized across sessions, they likely capture only a portion of the relevant variance, leaving room to test whether combining respiration with complementary signals (e.g., pupil, ECG, LFP/EEG) and richer temporal models improves predictive power and mechanistic insight31,32,50. Second, recording respiration is indirect and often relies on secondary measures51. In our case, the temperature in the nose was a high-resolution measure for respiration but lacked an absolute amplitude calibration. Future designs can incorporate stable calibration references to better capture amplitude effects. Third, we studied two animals performing a single task. An advantage of our paradigm in non-human primates is that we collected highly replicable behavior in well trained subjects across thousands of trials and many sessions. Nonetheless, expanding to larger, mixed-sex cohorts and multiple task contexts will test generalizability and boundary conditions.

To summarize, this study shows that respiration is not merely a background rhythm but a dynamic indicator of behavior. By revealing that fine-grained features of respiration, especially its task-related timing, predict perceptual performance at the single-trial level, we emphasize the need to incorporate interoceptive rhythms into models of cognition. This framework opens the door to testing whether respiration-aligned interventions could enhance perception, attention, or learning in health and disease.

STAR★METHODS

EXPERIMENTAL MODEL

Two male rhesus macaque monkeys (Macaca mulatta) were used in this study. They were 10 (Monkey RA) and 9 (Monkey AB) years old at the time of data collection. The animals were housed singly in a room operating on a standard 12 hour light/dark cycle and were provided with an enhanced enrichment program. All experimental procedures were approved by the Institutional Animal Care and Use Committees of Carnegie Mellon University and were in accordance with the United States National Research Council’s Guide for the Care and Use of Laboratory Animals.

METHOD DETAILS

Respiration recordings

Respiration was recorded continuously using a custom-built nasal thermocouple probe positioned at the entrance of one nostril in each monkey. The probe consisted of a Type-K thermocouple (glass braid insulated) connected to an AD8495 thermocouple amplifier (Adafruit). The thermocouple wire was threaded through a thin plastic rod and mounted on a cable tension based flexible arm to allow precise positioning in front of the nostril without contacting the animal.

Respiration was measured via temperature fluctuations in the airflow from the nostril. Exhalation caused an increase in temperature relative to the ambient room air (~70 °F), and inhalation then caused a relative decrease in temperature as room air was drawn past the thermocouple probe, producing corresponding voltage changes at the amplifier output (an increase in voltage for an increase in temperature). The analog signal from the thermocouple amplifier was routed through a BNC connection to the data recording system (Ripple Inc; Salt Lake City, UT) and acquired using Trellis software on an analog channel sampled at 1 kHz.

Eye Tracking

Eye position was recorded using an infrared eye tracking system at a rate of 1000 Hz (EyeLink 1000; SR research).

Behavioral task

Subjects performed a color-change detection task in which their goal was to detect and respond as quickly as possible to a subtle change in the color of a central fixation dot. Each trial began with a blank screen presented for a brief inter-trial interval (trial start), after which a blue fixation point appeared at the center of the screen. The monkey initiated the trial by directing its gaze to this fixation point (fixation onset; 321 ± 57 ms from trial start) and was required to maintain fixation for an initial period of 400–600 ms (chosen randomly).

After this initial fixation period, a peripheral target appeared at one of eight possible locations spaced 45° apart around the fixation point. Unlike a simple visually guided saccade task, the animal was required to continue fixating through an additional delay period while the target remained present. The duration of this delay was drawn from a truncated exponential distribution (mean: ~2000 ms, range: 1700–3000 ms), preventing temporal predictability of the go cue.

At the end of the delay, the central fixation dot changed color. This color rotation in CIELUV color space served as the go cue, indicating that the monkey should look toward the previously presented peripheral target. The magnitude of the color rotation varied across trials (Monkey RA: by ~3°, ~5°, ~9°, ~15°; Monkey AB: by ~4°, ~8°, ~10°, ~12°) with larger rotations being perceptually easier to detect.
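The color change can be illustrated as a planar rotation. This is a sketch, not the stimulus code: it assumes the rotation acts on the chromaticity vector in the CIELUV (u*, v*) plane about the origin, with L* held fixed; the actual rotation center is not specified here.

```python
import numpy as np

def rotate_uv(u, v, deg):
    # Rotate the (u*, v*) chromaticity vector by `deg` degrees; the rotation
    # center assumed here (the origin) is an illustrative assumption.
    th = np.deg2rad(deg)
    return u * np.cos(th) - v * np.sin(th), u * np.sin(th) + v * np.cos(th)
```

Larger angles move the color farther along the hue circle, which is why larger rotations were perceptually easier to detect.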

A trial was counted as correct if the monkey made a saccade to the target within the allowed reaction time window (<500 ms to start a saccade and <200 ms to complete it). Trials were labeled false alarms if the monkey broke fixation and looked toward the target before the color change occurred, and misses if the monkey failed to initiate a saccade during the reaction time window after the color change. Successful trials earned a fixed juice reward.

Following trial completion, the timing of the subsequent trial depended on the outcome of the preceding trial. After correct trials, the next trial began on average 0.99 ± 0.06 s after the saccade to the target (this interval included both an enforced wait period and the time required for the experimental control system to prepare the next trial). Following false alarms, a timeout penalty was imposed, resulting in a longer inter-trial interval; the next trial began 1.62 ± 0.05 s after the saccade. After miss trials, in which no saccade was made to the target, a longer timeout period was enforced before the next trial began, resulting in an average interval of 2.00 ± 0.10 s after the color change.

Data Analysis

Respiration waveform feature extraction

The respiration signal was processed using a third-order Butterworth bandpass filter (0.1–0.5 Hz) to isolate the typical frequency range of respiration. To eliminate residual high-frequency noise, additional notch filters were applied using the “iirnotch” function (MATLAB 2024b, MathWorks Inc.). Following filtering, the signal was inverted so that rising phases corresponded to inhalation and falling phases to exhalation. Finally, the signal was z-scored to account for potential variability in probe placement across recording sessions and then epoched into trials.
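A Python equivalent of this preprocessing can be sketched with SciPy in place of the MATLAB butter/iirnotch calls. Filter parameters follow the description above; the notch frequencies are left as an argument, since the exact values are not listed.

```python
import numpy as np
from scipy import signal
from scipy.stats import zscore

def preprocess_respiration(raw, fs=1000.0, band=(0.1, 0.5), notch_hz=None):
    raw = np.asarray(raw, dtype=float)
    # Third-order Butterworth bandpass isolating the respiration band
    sos = signal.butter(3, band, btype="bandpass", fs=fs, output="sos")
    x = signal.sosfiltfilt(sos, raw)
    # Optional notch filters for residual narrowband noise
    for f0 in (notch_hz or []):
        b, a = signal.iirnotch(f0, Q=30.0, fs=fs)
        x = signal.filtfilt(b, a, x)
    # Invert so that rising phases correspond to inhalation
    x = -x
    # Z-score to normalize across sessions / probe placements
    return zscore(x)
```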

A range of respiration features were extracted using different signal processing methods. For each trial, inhalation and exhalation phases were identified using a tailored implementation of the “findpeaks” function (MATLAB 2024b, MathWorks Inc.). Peaks (marking the start of exhalation) and valleys (marking the start of inhalation) were detected to determine the timing and structure of each respiratory cycle. From these, respiration length (the interval between successive peaks), as well as inhalation and exhalation durations (the intervals between peaks and valleys, and vice versa), were calculated. The onsets of inhalation and exhalation were measured relative to trial start to assess timing relative to the task. Additionally, the prominence of each detected peak was extracted to quantify inhalation/exhalation depth. Finally, the respiration signal was numerically integrated between inhalation and exhalation boundaries to compute relative inhalation and exhalation volumes, as well as total respiration volume.
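The peak/valley-based feature extraction can be sketched with scipy.signal.find_peaks standing in for MATLAB's findpeaks. The prominence threshold and the integration bounds used here are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def cycle_features(resp, fs=1000.0, trial_start_s=0.0):
    # Peaks mark exhalation onset; valleys (peaks of -resp) mark inhalation
    # onset. The prominence threshold (0.1, in z-units) is an assumption.
    peaks, peak_props = find_peaks(resp, prominence=0.1)
    valleys, _ = find_peaks(-resp, prominence=0.1)
    t_peaks = peaks / fs
    t_valleys = valleys / fs
    feats = {
        # timing features (purple in Figure 3)
        "inhalation_onsets": t_valleys - trial_start_s,
        "exhalation_onsets": t_peaks - trial_start_s,
        "respiration_lengths": np.diff(t_peaks),  # peak-to-peak cycle length
        # amplitude features (green in Figure 3)
        "depths": peak_props["prominences"],
    }
    # Relative volume: Riemann-sum integral over one full cycle (valley to valley)
    if len(valleys) >= 2:
        feats["cycle_volume"] = np.sum(np.abs(resp[valleys[0]:valleys[1]])) / fs
    return feats
```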

Respiration phase was established by applying the Hilbert transform to the filtered, z-scored respiration signal of the whole session. The “angle” function (MATLAB 2024b, MathWorks Inc.) was then applied to obtain the instantaneous phase (in degrees) across the session. Finally, the respiration phase in degrees was computed separately relative to the trial start and fixation onset.
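A sketch of the phase computation, with SciPy's hilbert in place of MATLAB's hilbert/angle:

```python
import numpy as np
from scipy.signal import hilbert

def respiration_phase_deg(resp_filtered):
    # Analytic signal of the filtered, z-scored session-long trace;
    # np.angle plays the role of MATLAB's angle function.
    analytic = hilbert(resp_filtered)
    return np.degrees(np.angle(analytic))

# Phase at an event (e.g., trial start) is then a lookup at that sample:
# phase_at_trial_start = respiration_phase_deg(resp)[trial_start_sample]
```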

Data preparation

For each trial, a set of respiratory features was extracted from the first complete respiration cycle following trial start. This cycle was defined as the interval from the first inhalation onset to the subsequent inhalation onset. Focusing on this initial cycle allowed for consistent feature extraction across trials and ensured that all features were temporally aligned relative to the behavioral task.

Each trial was represented as a row in a feature matrix of size [number of trials × number of features], with the features mentioned above.

Before modeling, each feature column was z-score normalized across all trials to place all features on a common scale. The finalized design matrix and label vector were then used as inputs for all machine learning classifiers.
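Assembling the design matrix can be sketched as follows; the dict-of-features input format is an assumption for illustration:

```python
import numpy as np

def build_design_matrix(trial_features):
    """Stack per-trial feature dicts into a [trials x features] matrix
    and z-score each column, as described in the text.

    Sketch only; the input format is an illustrative assumption.
    """
    names = sorted(trial_features[0])                       # fixed column order
    X = np.array([[f[n] for n in names] for f in trial_features], dtype=float)
    mu, sd = X.mean(axis=0), X.std(axis=0)
    sd[sd == 0] = 1.0                                       # guard constant columns
    return (X - mu) / sd, names
```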

Machine Learning classification

Respiration waveforms from 41 recording sessions from Monkey RA and 48 sessions from Monkey AB were used to predict behavioral outcomes from features extracted from the respiratory signal. Two validation strategies were implemented to evaluate model performance: (1) five-fold cross-validation, with 80% of trials used for hyperparameter tuning and training and 20% for testing, and (2) leave-one-session-out (LOSO) validation, where models were trained on all sessions except one, which was held out for testing. Both approaches ensured that evaluation was robust across trials and sessions, helping to minimize overfitting and assess generalizability across animals and experimental conditions.
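The LOSO scheme amounts to partitioning trial indices by session label; a minimal sketch (the five-fold scheme is the analogous partition over shuffled trial indices):

```python
import numpy as np

def loso_splits(session_ids):
    """Leave-one-session-out train/test index splits.

    Sketch of the LOSO validation described in the text: each unique
    session is held out once while all remaining sessions form the
    training set.
    """
    session_ids = np.asarray(session_ids)
    for s in np.unique(session_ids):
        test = np.where(session_ids == s)[0]
        train = np.where(session_ids != s)[0]
        yield train, test
```

Splitting by session (rather than by trial) is what tests generalization to unseen recording days.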

All models were implemented in MATLAB. The classifiers tested were random forest, support vector machine (SVM), k-nearest neighbors (k-NN), a boosted random forest (an ensemble method using decision trees), and a multi-layer perceptron (MLP). Model performance was quantified using the area under the ROC curve (AUC), confusion matrices, and accuracy (%).
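AUC, the headline metric, has a simple self-contained definition via the Mann–Whitney rank identity: the fraction of (positive, negative) trial pairs that the classifier scores in the correct order, counting ties as half. A sketch, independent of any particular classifier:

```python
import numpy as np

def auc_score(y_true, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity.

    Illustrative analog of the AUC metric used to summarize model
    performance; not the authors' MATLAB implementation.
    """
    y_true = np.asarray(y_true)
    scores = np.asarray(scores, dtype=float)
    pos = scores[y_true == 1]
    neg = scores[y_true == 0]
    # Pairwise score differences between positive and negative trials
    diffs = pos[:, None] - neg[None, :]
    # Correctly ordered pairs count 1, ties count 1/2
    return (diffs > 0).mean() + 0.5 * (diffs == 0).mean()
```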

QUANTIFICATION AND STATISTICAL ANALYSIS

We report n as the number of experimental sessions per animal (Monkey R: 41; Monkey A: 48), each recorded on a separate day. Session-level differences in respiration features between correct and incorrect outcomes were assessed with two-sided Wilcoxon signed-rank tests (α = 0.05) and two-sample t-tests (α = 0.05). For single-trial decoding, model performance is summarized by classification accuracy (%).
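The session-level paired comparison can be sketched with SciPy's `wilcoxon`; the input here (one value per session for each outcome) is the assumed pairing structure:

```python
from scipy.stats import wilcoxon

def session_level_test(correct_vals, incorrect_vals):
    """Two-sided Wilcoxon signed-rank test on paired session-level values.

    Sketch of the statistical comparison described in the text;
    assumes one summary value per session for each outcome.
    """
    res = wilcoxon(correct_vals, incorrect_vals, alternative="two-sided")
    return res.statistic, res.pvalue
```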

All analyses were implemented in MATLAB R2024b (MathWorks; Statistics and Machine Learning Toolbox); circular tests used standard implementations (e.g., Circular Statistics Toolbox). Our internal lab software randomized the trial conditions; analyses were automated and performed blind to session identity. Sessions/trials failing predefined quality-control thresholds (e.g., corrupted respiration or insufficient incorrect trials for comparison) were excluded prior to analysis, with criteria and totals detailed in figure legends.

KEY RESOURCES TABLE

REAGENT OR RESOURCE | SOURCE | IDENTIFIER
Deposited data
Respiration and behavioral data | Original data | Will be made available upon publication
Experimental models: Organisms/strains
Nonhuman primates: rhesus macaque | Primera Science Center |
Software and algorithms
MATLAB | MathWorks | https://www.mathworks.com/products/matlab.html
Code for respiratory feature extraction and classification | Original code | Will be made available upon publication
Code to reproduce figures | Original code | Will be made available upon publication
Other
Respiration probe | Built in-lab | Components described in Methods
Eyelink 1000 eye tracker | SR Research | https://www.sr-research.com/

ACKNOWLEDGMENTS

The authors would like to thank our animal care staff for their support. This work was supported by NIH R01 MH128393. E.S. was supported by NIH T32 EB029365 and the Ronald F. and Janice A. Zollo Fellowship. D.I. was supported by NIH F30 MH129056, the Ronald F. and Janice A. Zollo Fellowship, the Thomas-Pittsburgh Chapter ARCS Award and the Carnegie Mellon University Neuroscience Institute Carnegie Fellowship.

Footnotes

RESOURCE AVAILABILITY

Lead contact

Further information and requests for resources should be directed to and will be fulfilled by the lead contact, Matthew A. Smith (mattsmith@cmu.edu).

Materials availability

This study did not generate new materials.

Data and code availability

Data and analysis code used in this paper will be available as of the date of publication.

DECLARATION OF INTERESTS

The authors declare no competing interests.

DECLARATION OF GENERATIVE AI AND AI-ASSISTED TECHNOLOGIES

During the preparation of this work, the author(s) used ChatGPT to assist with wording and editing of the writing. After using this tool, the author(s) reviewed and edited the content as needed and take full responsibility for the content of the publication.

REFERENCES

  • 1.Biskamp J., Bartos M., and Sauer J.-F. (2017). Organization of prefrontal network activity by respiration-related oscillations. Sci Rep 7, 45508. 10.1038/srep45508. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Tort A.B.L., Laplagne D.A., Draguhn A., and Gonzalez J. (2025). Global coordination of brain activity by the breathing cycle. Nat. Rev. Neurosci., 1–21. 10.1038/s41583-025-00920-7. [DOI] [Google Scholar]
  • 3.Folschweiller S., and Sauer J.-F. (2023). Controlling neuronal assemblies: a fundamental function of respiration-related brain oscillations in neuronal networks. Pflugers Arch - Eur J Physiol 475, 13–21. 10.1007/s00424-022-02708-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4.Folschweiller S., and Sauer J.-F. (2023). Behavioral State-Dependent Modulation of Prefrontal Cortex Activity by Respiration. J. Neurosci. 43, 4795–4807. 10.1523/JNEUROSCI.2075-22.2023. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5.Nakamura N.H., Furue H., Kobayashi K., and Oku Y. (2023). Hippocampal ensemble dynamics and memory performance are modulated by respiration during encoding. Nat Commun 14, 4391. 10.1038/s41467-023-40139-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Dasgupta D., Schneider-Luftman D., Schaefer A.T., and Harris J.J. (2024). Wireless monitoring of respiration with EEG reveals relationships between respiration, behaviour and brain activity in freely moving mice. Journal of Neurophysiology, jn.00330.2023. 10.1152/jn.00330.2023. [DOI] [Google Scholar]
  • 7.Dias A.L.A., Drieskens D., Belo J.A., Duarte E.H., Laplagne D.A., and Tort A.B.L. (2025). Breathing Modulates Network Activity in Frontal Brain Regions during Anxiety. J. Neurosci. 45, e1191242024. 10.1523/JNEUROSCI.1191-24.2024. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Gonzalez J., Torterolo P., Bolding K.A., and Tort A.B.L. (2024). Communication subspace dynamics of the canonical olfactory pathway. iScience 27, 111275. 10.1016/j.isci.2024.111275. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Karjalainen S., Kujala J., and Parviainen T. (2025). Neural activity is modulated by spontaneous and volitionally controlled breathing. Biological Psychology 197, 109026. 10.1016/j.biopsycho.2025.109026. [DOI] [PubMed] [Google Scholar]
  • 10.Johannknecht M., and Kayser C. (2022). The influence of the respiratory cycle on reaction times in sensory-cognitive paradigms. Sci Rep 12, 2586. 10.1038/s41598-022-06364-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 11.Saltafossi M., Zaccaro A., Kluger D.S., Perrucci M.G., Ferri F., and Costantini M. (2025). Respiration facilitates behaviour during multisensory integration. Preprint at bioRxiv, https://doi.org/10.1101/2025.01.10.632352 10.1101/2025.01.10.632352. [DOI] [Google Scholar]
  • 12.Münch E.E., Vögele C., Van Diest I., and Schulz A. (2019). Respiratory modulation of intensity ratings and psychomotor response times to acoustic startle stimuli. Neuroscience Letters 711, 134388. 10.1016/j.neulet.2019.134388. [DOI] [PubMed] [Google Scholar]
  • 13.Nakamura N.H., Fukunaga M., Yamamoto T., Sadato N., and Oku Y. (2022). Respiration-timing-dependent changes in activation of neural substrates during cognitive processes. Cerebral Cortex Communications 3, tgac038. 10.1093/texcom/tgac038. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Kluger D.S., Balestrieri E., Busch N.A., and Gross J. (2021). Respiration aligns perception with neural excitability. eLife 10, e70907. 10.7554/eLife.70907. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15.Perl O., Ravia A., Rubinson M., Eisen A., Soroka T., Mor N., Secundo L., and Sobel N. (2019). Human non-olfactory cognition phase-locked with inhalation. Nat Hum Behav 3, 501–512. 10.1038/s41562-019-0556-z. [DOI] [PubMed] [Google Scholar]
  • 16.Grund M., Al E., Pabst M., Dabbagh A., Stephani T., Nierhaus T., Gaebler M., and Villringer A. (2022). Respiration, Heartbeat, and Conscious Tactile Perception. J. Neurosci. 42, 643–656. 10.1523/JNEUROSCI.0592-21.2021. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Zelano C., Jiang H., Zhou G., Arora N., Schuele S., Rosenow J., and Gottfried J.A. (2016). Nasal Respiration Entrains Human Limbic Oscillations and Modulates Cognitive Function. J. Neurosci. 36, 12448–12467. 10.1523/JNEUROSCI.2586-16.2016. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Critchley H.D., and Garfinkel S.N. (2017). Interoception and emotion. Current Opinion in Psychology 17, 7–14. 10.1016/j.copsyc.2017.04.020. [DOI] [PubMed] [Google Scholar]
  • 19.Yao B., Gu P., Lasagna C.A., Peltier S., Taylor S.F., Tso I.F., and Thakkar K.N. (2023). Structural connectivity of an interoception network in schizophrenia. Psychiatry Research: Neuroimaging 331, 111636. 10.1016/j.pscychresns.2023.111636. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20.Ardizzi M., Ambrosecchia M., Buratta L., Ferri F., Peciccia M., Donnari S., Mazzeschi C., and Gallese V. (2016). Interoception and Positive Symptoms in Schizophrenia. Front. Hum. Neurosci. 10. 10.3389/fnhum.2016.00379. [DOI] [Google Scholar]
  • 21.Yao B., and Thakkar K. (2022). Interoception abnormalities in schizophrenia: A review of preliminary evidence and an integration with Bayesian accounts of psychosis. Neuroscience & Biobehavioral Reviews 132, 757–773. 10.1016/j.neubiorev.2021.11.016. [DOI] [PubMed] [Google Scholar]
  • 22.Pollatos O., Traut-Mattausch E., and Schandry R. (2009). Differential effects of anxiety and depression on interoceptive accuracy. Depression and Anxiety 26, 167–173. 10.1002/da.20504. [DOI] [PubMed] [Google Scholar]
  • 23.Harshaw C. (2015). Interoceptive dysfunction: Toward an integrated framework for understanding somatic and affective disturbance in depression. Psychological Bulletin 141, 311–363. 10.1037/a0038101. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24.Paulus M.P., and Stein M.B. (2010). Interoception in anxiety and depression. Brain Struct Funct 214, 451–463. 10.1007/s00429-010-0258-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25.Hazelton J.L., Fittipaldi S., Fraile-Vazquez M., Sourty M., Legaz A., Hudson A.L., Cordero I.G., Salamone P.C., Yoris A., Ibañez A., et al. (2023). Thinking versus feeling: How interoception and cognition influence emotion recognition in behavioural-variant frontotemporal dementia, Alzheimer’s disease, and Parkinson’s disease. Cortex 163, 66–79. 10.1016/j.cortex.2023.02.009. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26.Sun W., Ueno D., and Narumoto J. (2022). Brain Neural Underpinnings of Interoception and Decision-Making in Alzheimer’s Disease: A Narrative Review. Front. Neurosci. 16. 10.3389/fnins.2022.946136. [DOI] [Google Scholar]
  • 27.Janke E., Zhang M., Ryu S.E., Bhattarai J.P., Schreck M.R., Moberly A.H., Luo W., Ding L., Wesson D.W., and Ma M. (2022). Machine learning-based clustering and classification of mouse behaviors via respiratory patterns. iScience 25, 105625. 10.1016/j.isci.2022.105625. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Melnychuk M.C., Murphy P.R., Robertson I.H., Balsters J.H., and Dockree P.M. (2020). Prediction of attentional focus from respiration with simple feed-forward and time delay neural networks. Neural Comput & Applic 32, 14875–14884. 10.1007/s00521-020-04841-7. [DOI] [Google Scholar]
  • 29.Kazemi K., Abiri A., Zhou Y., Rahmani A., Khayat R.N., Liljeberg P., and Khine M. (2024). Improved sleep stage predictions by deep learning of photoplethysmogram and respiration patterns. Computers in Biology and Medicine 179, 108679. 10.1016/j.compbiomed.2024.108679. [DOI] [PubMed] [Google Scholar]
  • 30.Krauss D., Richer R., Küderle A., Jukic J., German A., Leutheuser H., Regensburger M., Winkler J., and Eskofier B.M. (2025). Incorporating respiratory signals for machine learning-based multimodal sleep stage classification: a large-scale benchmark study with actigraphy and heart rate variability. Sleep 48, zsaf091. 10.1093/sleep/zsaf091. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 31.Abdelfattah E., Joshi S., and Tiwari S. (2025). Machine and Deep Learning Models for Stress Detection Using Multimodal Physiological Data. IEEE Access 13, 4597–4608. 10.1109/ACCESS.2024.3525459. [DOI] [Google Scholar]
  • 32.Seo W., Kim N., Kim S., Lee C., and Park S.-M. (2019). Deep ECG-Respiration Network (DeepER Net) for Recognizing Mental Stress. Sensors 19, 3021. 10.3390/s19133021. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33.Xiang J.-Z., Wang Q.-Y., Fang Z.-B., Esquivel J.A., and Su Z.-X. (2025). A multi-modal deep learning approach for stress detection using physiological signals: integrating time and frequency domain features. Front Physiol 16, 1584299. 10.3389/fphys.2025.1584299. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 34.Chen A., Zhang J., Zhao L., Rhoades R.D., Kim D.-Y., Wu N., Liang J., and Chae J. (2021). Machine-learning enabled wireless wearable sensors to study individuality of respiratory behaviors. Biosensors and Bioelectronics 173, 112799. 10.1016/j.bios.2020.112799. [DOI] [PubMed] [Google Scholar]
  • 35.Alkhodari M., and Khandoker A.H. (2022). Detection of COVID-19 in smartphone-based breathing recordings: A pre-screening deep learning tool. PLOS ONE 17, e0262448. 10.1371/journal.pone.0262448. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36.Hamza A., Himeur Y., Amira A., and Oulefki A. (2024). Predicting Asthma Attacks Through AI-Powered Thermal Imaging Analysis of Breathing Patterns. In 2024 IEEE 10th International Conference on Big Data Computing Service and Machine Learning Applications (BigDataService), pp. 82–86. 10.1109/BigDataService62917.2024.00018. [DOI] [Google Scholar]
  • 37.Brændholt M., Nikolova N., Vejlø M., Banellis L., Fardo F., Kluger D.S., and Allen M. (2025). The respiratory cycle modulates distinct dynamics of affective and perceptual decision-making. PLOS Computational Biology 21, e1013086. 10.1371/journal.pcbi.1013086. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 38.Kayser C., Hehemann L., and Stetza L. (2025). The alignment of respiration to sensory-motor events is shaped by expected effort. Preprint at bioRxiv, https://doi.org/10.1101/2025.09.24.678231 10.1101/2025.09.24.678231. [DOI] [Google Scholar]
  • 39.Shibata H., and Ohira H. Pre-movement respiration–action coupling is specific to self-initiated action: Evidence for action alignment with ongoing breathing. [Google Scholar]
  • 40.Skora L.I., Livermore J.J.A., and Roelofs K. (2022). The functional role of cardiac activity in perception and action. Neuroscience & Biobehavioral Reviews 137, 104655. 10.1016/j.neubiorev.2022.104655. [DOI] [PubMed] [Google Scholar]
  • 41.Jennings J.R., Van Der Molen M. w., and Tanase C. (2009). Preparing hearts and minds: Cardiac slowing and a cortical inhibitory network. Psychophysiology 46, 1170–1178. 10.1111/j.1469-8986.2009.00866.x. [DOI] [PubMed] [Google Scholar]
  • 42.Moresi S., Adam J.J., Rijcken J., Van Gerven P.W.M., Kuipers H., and Jolles J. (2008). Pupil dilation in response preparation. International Journal of Psychophysiology 67, 124–130. 10.1016/j.ijpsycho.2007.10.011. [DOI] [PubMed] [Google Scholar]
  • 43.Jainta S., Vernet M., Yang Q., and Kapoula Z. (2011). The pupil reflects motor preparation for saccades - even before the eye starts to move. Front. Hum. Neurosci. 5, 11905. 10.3389/fnhum.2011.00097. [DOI] [Google Scholar]
  • 44.Gu X., and FitzGerald T.H.B. (2014). Interoceptive inference: homeostasis and decision-making. Trends in Cognitive Sciences 18, 269–270. 10.1016/j.tics.2014.02.001. [DOI] [PubMed] [Google Scholar]
  • 45.Seth A.K. (2013). Interoceptive inference, emotion, and the embodied self. Trends in Cognitive Sciences 17, 565–573. 10.1016/j.tics.2013.09.007. [DOI] [PubMed] [Google Scholar]
  • 46.Melnychuk M., Dockree P., O’Connell R., Murphy P., Balsters J., and Robertson I. (2018). Coupling of Respiration and Attention via the Locus Coeruleus: Effects of Meditation and Pranayama. 10.1111/psyp.13091. [DOI] [Google Scholar]
  • 47.Iwamoto M., Yonekura S., Atsumi N., Hirabayashi S., Kanazawa H., and Kuniyoshi Y. (2023). Respiratory entrainment of the locus coeruleus modulates arousal level to avoid physical risks from external vibration. Sci Rep 13, 7069. 10.1038/s41598-023-32995-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 48.Sheriff A., Pandolfi G., Nguyen V.S., and Kay L.M. (2021). Long-Range Respiratory and Theta Oscillation Networks Depend on Spatial Sensory Context. J. Neurosci. 41, 9957–9970. 10.1523/JNEUROSCI.0719-21.2021. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 49.Kluger D.S., and Gross J. (2021). Respiration modulates oscillatory neural network activity at rest. PLOS Biology 19, e3001457. 10.1371/journal.pbio.3001457. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 50.Zhang Q., Chen X., Zhan Q., Yang T., and Xia S. (2017). Respiration-based emotion recognition with deep learning. Computers in Industry 92–93, 84–90. 10.1016/j.compind.2017.04.005. [DOI] [Google Scholar]
  • 51.Kunimatsu J., Akiyama Y., Toyoshima O., and Matsumoto M. (2022). A Noninvasive Method for Monitoring Breathing Patterns in Nonhuman Primates Using a Nasal Thermosensor. eNeuro 9. 10.1523/ENEURO.0352-22.2022. [DOI] [Google Scholar]
