Abstract
When watching a negative emotional movie, we differ from person to person in the ease with which we engage and the difficulty with which we disengage throughout a temporally evolving narrative. We investigated the neural underpinnings of emotional processing by considering inter‐individual synchronization in subjective emotional engagement and disengagement. These shared responses are ideally studied in naturalistic scenarios like movie viewing, wherein individuals emotionally engage and disengage at their own time and pace over the course of a narrative. Despite the rich data that naturalistic designs can provide, determining time‐resolved behavioral markers of subjective engagement and disengagement and their underlying neural responses remains challenging. In a within‐subject cross‐over design, 22 subjects watched clips of either neutral or sad content while undergoing functional magnetic resonance imaging (fMRI). Participants then watched the same movies a second time while continuously annotating the perceived emotional intensity, enabling the mapping of brain activity onto emotional experience. Our analyses revealed that between‐participant similarity in the waxing (engagement) and waning (disengagement) of emotional intensity was directly related to between‐participant similarity in spatiotemporal patterns of brain activation during the movies. Similar patterns of engagement reflected common activation in the bilateral ventromedial prefrontal cortex, a region often involved in self‐referenced evaluation and the generation of negative emotions. Similar patterns of disengagement reflected common activation in central executive and default mode network regions often involved in top‐down emotion regulation. Together, this work helps to better understand the cognitive and neural mechanisms underpinning engagement and disengagement from emotionally evocative narratives.
Keywords: disengagement, emotional intensity, engagement, fMRI, idiosyncrasy, RSA, subjective
Subjective disengagement, as opposed to engagement, shows stronger inter‐individual alignment, specifically during sad movies, suggesting a collective response to emotional material. We identified two distinctive neural response synchronization patterns underlying engagement and disengagement, potentially supporting emotion regulation and self‐referenced evaluation of emotional material.

1. INTRODUCTION
Some individuals have little trouble disengaging from a negative emotional experience, while others struggle to disengage or even over‐engage (Blicher et al., 2020; Koole, 2009; Rudaizky et al., 2014; Scherer et al., 2001). The neural signatures that give rise to these individual differences are poorly understood. This is partly due to a long tradition in the field of controlled task designs optimized to assess shared responses for group‐level inferences rather than inter‐individual variability (Finn et al., 2020). How humans engage and disengage from emotional experiences has traditionally been investigated using highly controlled scenarios wherein individuals attend and react to isolated emotional stimuli, such as pictures or short audio‐visual stimuli (Braunstein et al., 2017). This kind of approach obscures the fact that emotions rarely emerge in isolation; rather, they are embedded in an ongoing narrative with varying emotional intensity and valence, which determines where and for how long we direct our attention. Naturalistic designs, such as movie watching, address this issue: events are connected through a common, temporally evolving narrative, allowing individuals to engage, disengage, and re‐engage at varying time points and for different lengths of time. Despite, or because of, this rich dynamic scenario, behavioral and neural markers underlying engagement and disengagement have been difficult to characterize.
In this study, we explore emotional engagement and disengagement at the behavioral level. These episodes within the emotional process lack a definitive and precise description in the neuroscience literature, a lack of clarity acknowledged by Dmochowski and colleagues, who advocate for specific operational definitions tailored to the context under investigation (Dmochowski et al., 2012). For instance, within the educational realm, emotional engagement (EE) is defined as affective responses that encompass a sense of belonging within the school environment, while emotional disengagement (ED) pertains to withdrawal from educational activities (Steenberghs et al., 2021). Beyond the educational sphere, EE has also been defined as the degree of emotional salience, whereas Fitzpatrick and collaborators characterize EE as mindful awareness and ED as distraction (Fitzpatrick & Kuo, 2022). Hence, we operationally define these episodes as the positive or negative changes, respectively, in the intensity of emotional impact (affective state) experienced by individuals at varying moments throughout the movie.
Continuous physiological markers like heart rate, skin conductance, and pupil dilation have been previously recorded during movie watching (Hasson et al., 2009; Nguyen et al., 2019; Sharma et al., 2019; van der Meer et al., 2020), but their roles in engagement and disengagement are poorly understood. In contrast, continuous subjective annotations have been used to map fluctuations of emotional arousal or emotional intensity, indicating how emotionally moved one feels from moment to moment throughout movie watching (Hudson et al., 2020; Nummenmaa et al., 2012; Song et al., 2021; Wallentin et al., 2011). A common approach is to watch the same movie twice, retroactively providing continuous annotations during the second watch which reflect one's emotional arousal whilst watching the movie for the first time (Canini et al., 2010; Hanjalic & Xu, 2005; Li et al., 2015; Ruef & Levenson, 2007). While real‐time annotations during scanning affect neural responses to emotional stimulation (therefore obstructing attempts at brain imaging), repeated movie watching only minimally affects subjective annotations (Hutcherson et al., 2005; Lieberman et al., 2007), as demonstrated by high test–retest reliability (Cronbach's α = 0.8–0.9; Metallinou & Narayanan, 2013). In this regard, fluctuations in subjectively experienced emotional intensity could help identify alternations between engagement in and disengagement from emotional stimuli throughout movie watching.
A prominent approach to detect shared neural responses in naturalistic designs is inter‐subject correlation analysis (ISC), which evaluates how neural responses synchronize across subjects when they process the same stimulus at the same time (Hasson et al., 2004). Thus, higher and lower neural synchronization would reflect more and less similar stimulus processing, respectively. Even though this assumption implies that shared neural responses should reflect shared behavioral responses, it has rarely been investigated whether shared neural responses indeed underlie shared behavioral responses. Instead, associations to behavior have been based on individual magnitudes, such as individual scoring on a questionnaire, as opposed to the degree of inter‐individual similarity (but see Chen et al. (2020) and the supplement in Nummenmaa et al. (2012)). Thus, given the sparse understanding of how individuals resemble each other in their subjective emotional experiences during naturalistic designs, the underlying shared neural responses are poorly understood.
Using more trial‐based designs, several studies have shown that the ventro‐medial prefrontal cortex (vmPFC) is involved in both the generation and regulation of negative emotions and might be a relay for generalizable representations of negative emotions (Kragel et al., 2018). Further, the vmPFC has been implicated in self‐referenced evaluation, such as self‐attributing personality traits or recalling autobiographical memories (Northoff et al., 2006; Svoboda et al., 2006). Not only during movie viewing but even in the absence of sensory stimulation (resting state), the vmPFC exhibits higher inter‐individual variation in neural responses as compared to other brain regions (see L. Chang et al., 2018, L. J. Chang et al., 2021, Hasson et al., 2010, Hasson et al., 2004, and Xie et al., 2021, for naturalistic paradigms and Mueller et al., 2013, for resting state).
Previous work has shown across two different data sets (movie and audio‐book) that the degree to which neural responses synchronized within the default mode network (DMN) was related to how engaging the narrative was perceived (Song et al., 2021). The researchers conclude that the DMN could reflect “a modality‐general network involved in attention and narrative processing” (2021). In their study, neural synchronization was related to a group manifold of continuous subjective engagement derived from an independent sample that was not scanned. Thus, it is still unclear whether and, if so, to which degree, neural synchronization occurs as a function of synchronized subjective engagement and disengagement, which would require both scanning and subjective annotations within one and the same individuals. Further, we hypothesized individuals would alternate between discrete moments of engagement in and disengagement from a narrative. Rather than being interested in the neural activity associated with moments of engagement and disengagement, we sought to investigate how similar patterns of emotional engagement and disengagement were reflected in synchronized patterns of spatiotemporal brain activity across the entire movie clip.
To address these research questions, we instructed 22 subjects to watch two movie clips—one with sad and one with neutral content—while undergoing functional magnetic resonance imaging (fMRI). Roughly 15 minutes after the scanning session and outside of the scanner, subjects watched the same movies again while simultaneously annotating their ongoing continuous perceived emotional intensity. We hypothesized that greater synchronization of subjective annotations during moments of engagement and disengagement would be reflected in distinct synchronization of neural response patterns. Specifically, we speculated that engagement and disengagement similarity patterns should map onto brain regions associated with more similar self‐referential evaluation of emotions and top‐down regulation, respectively.
As outlined above, we hypothesized the vmPFC to be a critical region for subjective emotion processing. However, given the novelty of the method mapping bi‐directional similarity (in both behavior and the brain), we conducted our analysis including brain regions related to emotional and perceptual processing. Alignments in engagement reports were related to more synchronized neural responses mainly within the bilateral vmPFC, together with limbic and visual areas. In contrast, alignments in disengagement reports were related to more synchronized neural responses within frontally distributed regions of the executive and DMN networks. Together these findings suggest the presence of disparate neural systems possibly supporting appraisal of affective relevance and top‐down emotion regulation, respectively.
2. MATERIALS AND METHODS
2.1. Participants
Twenty‐two females (aged 20–49 years, mean age 28.1 ± 6.5 years) volunteered for the study. Subjects were screened for absence of any neurological or psychiatric disorders using the short version of the Structured Clinical Interview for DSM‐IV (SCID; Wittchen et al., 1997). The study protocols were in accordance with the latest version of the Declaration of Helsinki and approved by the institutional review board of the Charité, Germany. The data were previously published with entirely different research questions, not involving continuous annotations of emotional intensity (Borchardt et al., 2018; Fan et al., 2019).
2.2. Task paradigm
To investigate inter‐individually shared patterns of neural responses together with inter‐individually shared patterns of subjective emotional intensity responses, we presented two movie clips of either sad or neutral content. The sad clip was an excerpt from the movie “21 Grams” (Iñárritu, 2003) which presents a mother learning about the death of her two daughters in a car accident. The video excerpt has a duration of 4.45 min. The neutral clip was an excerpt from the movie “La Stanza Del Figlio” (Moretti, 2001), where scenes of a family are presented in daily life (e.g., having a casual conversation at the dinner table, reading the newspaper), with a duration of 4.54 min. We selected the neutral clip based on its resemblance to the sad clip in terms of low‐level features, such as the presence of human faces, scenes of social interaction, and domestic environments. Both movie clips were shown in a dubbed German version. These movies have been used in previous studies to induce sad and neutral emotions, respectively (Borchardt et al., 2018; Gaviria et al., 2021; Hanich et al., 2014; Shiota & Levenson, 2009). Every subject started with a 10‐min resting‐state recording, followed by the presentation of one of the two movie clips. No specific instructions were given other than watching the movie clips. The order of presentation of the movie clips was counterbalanced across subjects. To minimize possible carry‐over effects, 15 min of resting state was recorded between both movies. After the second movie, another resting state of 10 min was recorded.
2.3. Subjective ratings
To assess the overall emotional induction experience of watching these video clips, measures of different emotional dimensions have been previously reported on this data. Specifically, subjects reported higher scores on arousal, dominance, and one's own emotional experience in the negative video compared to the neutral one. Additionally, the negative video was rated as significantly more negative than the neutral video (Borchardt et al., 2018).
2.4. Subjective continuous annotations
Roughly 20 min after the scanning session and outside of the scanner, subjects watched both movie clips again in the same order as presented in the scanner while concurrently annotating how emotionally moving their experience was with regard to when they first watched the movie (Figure 1). The instruction was as follows: “Following, are 2 short ratings of the movie clips that you previously watched. You are going to see both movie clips again on the PC screen and are subsequently asked to rate how emotional you felt while watching.” (German original in supplement). Subjective emotional intensity annotations were performed through a visual analog scale in which respondents moved the mouse to the desired position from “not at all” to “very much” (emotionally moving) on a vertical bar on the right side of the screen. The scale was arbitrary, ranging from 0 to 250, where 0 was set as “not emotionally moving” and 250 as “very emotionally moving”. The annotations were recorded with a sampling rate of 30 Hz and then down‐sampled to the MRI scanner sampling rate (0.5 Hz).
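The down‐sampling step from the 30 Hz annotation trace to the 0.5 Hz scanner rate can be sketched as follows. This is a minimal illustration, not the study's actual code: it assumes averaging within non‐overlapping windows (the text does not specify the resampling method), and the function name is hypothetical.

```python
import numpy as np

def downsample_annotation(samples, fs_in=30.0, fs_out=0.5):
    """Down-sample a continuous annotation trace (e.g., 30 Hz mouse
    samples on the 0-250 scale) to one value per fMRI TR (0.5 Hz)
    by averaging within non-overlapping windows."""
    win = int(round(fs_in / fs_out))       # samples per output bin (here 60)
    n_bins = len(samples) // win           # drop any incomplete tail bin
    trimmed = np.asarray(samples[: n_bins * win], dtype=float)
    return trimmed.reshape(n_bins, win).mean(axis=1)
```

With a 2‐s TR, each output sample summarizes 60 raw mouse positions; an anti‐aliasing filter or linear interpolation would be equally plausible choices here.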
FIGURE 1.

Subjective emotional intensity annotations for the sad and the neutral movie clip (top and bottom, respectively). Individual time courses are shown in color; the degree of similarity of annotations across participants is shown in black (Euclidean distance based). Grey segments show three movie scenes with the highest annotation accordance (distance >2 standard deviations above the mean).
2.5. Image acquisition
Functional MRI images were acquired on a Siemens Trio 3 T scanner (Siemens, Erlangen, Germany) with a 12‐channel radiofrequency head coil, using a T2*‐weighted Echo Planar Imaging (EPI) sequence (37 axial slices of 3 mm thickness covering the whole brain, TR = 2000 ms, TE = 30 ms, 70° flip angle, 64 × 64 matrix, field of view = 192 × 192 mm², in‐plane resolution = 3 × 3 mm²). The natural‐viewing sessions consisted of 147 volumes. T1‐weighted anatomical reference images were acquired using a 3D‐MPRAGE sequence (176 sagittal slices covering the whole brain, 1 × 1 × 1 mm³ isotropic resolution, TR = 1900 ms, TE = 2.52 ms, 9° flip angle, 256 × 256 matrix).
2.6. Image preprocessing
Preprocessing was performed using fMRIPrep 1.3.2 (Esteban et al., 2019), which is based on Nipype 1.1.9 (K. Gorgolewski et al., 2011; K. J. Gorgolewski et al., 2019). Steps included slice time and motion correction, susceptibility distortion correction, realignment, smoothing, and registration to common space. ICA‐AROMA was performed to identify and remove motion‐related components (Pruim et al., 2015). Physiological nuisance was corrected by extracting the mean time course of cerebrospinal fluid (CSF) and white matter (WM) calculated from the preprocessed images. This method of motion correction was chosen after comparison with two other motion correction methods, (1) FIX (Griffanti et al., 2014; Salimi‐Khorshidi et al., 2014) and (2) ICA only, by assessing the quality of the functional images through visual inspection of the group independent components according to the criteria described in Griffanti et al. (2017). The first two volumes of the fMRI sequence of both movies were discarded to avoid T1 saturation effects.
2.7. Neurosynth brain parcellation and term maps
We used a brain parcellation from the Neurosynth database provided by Vega et al. (2016; http://neurosynth.org). Since the parcellation does not distinguish hemispheres, we divided each parcel into its left and right hemisphere parts. We then used search terms associated with our research question to identify regions of interest within this parcellation, resulting in a total of 119 ROIs. This ROI identification was performed using uniformity test statistical inference maps (Yarkoni et al., 2011), which quantify, for each voxel, how consistently it is activated across the studies associated with a given search term. We then used these uniformity test maps to identify our regions of interest within the Neurosynth parcellation, including a parcel if its occupancy volume within a term map was larger than 30% (Bossier et al., 2020).
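The 30% occupancy criterion can be illustrated with a small sketch. This assumes "occupancy" means the fraction of a parcel's voxels falling inside the binarized term map (one plausible reading of the criterion); function names and inputs are hypothetical.

```python
import numpy as np

def occupancy_fraction(parcel_mask, term_mask):
    """Fraction of a parcel's voxels that fall inside a (binarized)
    Neurosynth term map; both inputs are boolean arrays of equal shape."""
    parcel = np.asarray(parcel_mask, bool)
    term = np.asarray(term_mask, bool)
    return np.logical_and(parcel, term).sum() / parcel.sum()

def select_parcels(parcellation, term_mask, threshold=0.30):
    """Return labels of parcels whose occupancy within the term map
    exceeds the threshold. `parcellation` is an integer-labeled array
    (0 = background)."""
    labels = np.unique(parcellation)
    labels = labels[labels != 0]
    return [lab for lab in labels
            if occupancy_fraction(parcellation == lab, term_mask) > threshold]
```

In practice the parcellation and term maps would be 3‐D NIfTI volumes resampled to a common grid; the logic is the same.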
The search terms were chosen based on previous work on naturalistic stimulus processing highlighting the importance of lower‐level sensory (“visual perception” and “auditory”) and higher‐level cognitive features like emotion processing (“emotional responses”, “sad”, “arousal”, “cognitive‐emotional”) and self‐reflection (“awareness”; see Table S1; see Saarimäki, 2021). Further, as an exploratory analysis, instead of treating brain regions as individual entities, we repeated our analysis assuming brain regions belonging to one search term as one single parcel.
2.8. Emotional engagement and disengagement
Our first aim was to isolate moments in time when individuals indicated the beginning of emotional engagement and disengagement. We hypothesized that these two phases of emotion processing could be derived from rises and falls in the continuous fluctuations of subjective emotional intensity annotations. Specifically, we isolated time points where subjective emotional intensity scores fulfilled two conditions: they (a) exhibited a positive difference between two time points (engagement) or a negative difference between two time points (disengagement), and (b) fell within an intra‐individually defined range (33rd percentile < emotional intensity at time point t < 67th percentile). This algorithm has previously been used to extract phases in neural oscillations (Dessu et al., 2020; Kato et al., 2015; Shine et al., 2019). This calculation yields individual onsets of engagement and disengagement as binary variables (see Figure S1), which were subsequently employed in the computation of ISC. We specifically extracted rises and falls, as opposed to rare highs and lows, because rises and falls are more frequent and short‐lived and thus provide higher sensitivity to individual differences in engagement and disengagement.
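The two conditions above can be sketched in a few lines. This is an illustrative reading of the procedure, not the study's code: it assumes first differences of the annotation trace and an intra‐individual 33rd–67th percentile window; the function name and the handling of the first sample are assumptions.

```python
import numpy as np

def engagement_onsets(intensity, lo=33, hi=67):
    """Binarize a subject's emotional-intensity time course into
    engagement (rises) and disengagement (falls).
    A time point t counts only if (a) the first difference at t is
    positive (engagement) or negative (disengagement), and (b) the
    intensity at t lies between the subject's own `lo`th and `hi`th
    percentiles (mid-range rises/falls rather than extreme highs/lows)."""
    x = np.asarray(intensity, float)
    diff = np.diff(x, prepend=x[0])            # first sample: no change
    p_lo, p_hi = np.percentile(x, [lo, hi])
    in_range = (x > p_lo) & (x < p_hi)
    engage = ((diff > 0) & in_range).astype(int)
    disengage = ((diff < 0) & in_range).astype(int)
    return engage, disengage
```

The two binary vectors returned per subject are the inputs to the Hamming‐distance ISC described below in Section 2.9.2.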
2.9. Inter‐subject correlation analyses
2.9.1. Synchronization of neural responses
We first sought to assess to what degree subjects align in their behavioral and neural responses. To this aim, we used inter‐subject correlation (ISC; Hasson et al., 2004). First, we extracted the BOLD time series of each brain parcel from individual participants. Subsequently, for each brain parcel, we calculated the Pearson correlation distance among all possible pairs of BOLD time courses derived from subject‐specific parcels. This process yielded a single ISC score for each parcel. For mapping inter‐subject similarities in behavior, we calculated the Pearson correlation distance between all possible pairs of subjects' subjective emotional annotations.
Pearson correlation distance is defined as:

$d(x, y) = 1 - \dfrac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n}(x_i - \bar{x})^2}\,\sqrt{\sum_{i=1}^{n}(y_i - \bar{y})^2}}$  (1)

where $x$ and $y$ are a pair of vectors with $n$ observations. Intuitively, by subtracting the Pearson correlation coefficient from 1, we obtain a distance matrix (Figure 2) containing all pair‐wise distances between participants. ISC analysis was performed using functions from the nltools package (L. Chang et al., 2018).
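A compact illustration of Equation (1) applied to all subject pairs follows. This is a sketch of the idea rather than the nltools implementation; it simply converts the pairwise Pearson correlation matrix into a distance matrix.

```python
import numpy as np

def isc_matrix(timeseries):
    """Pairwise Pearson correlation distance (1 - r) between subjects'
    time courses. `timeseries` is (n_subjects, n_timepoints); returns an
    (n_subjects, n_subjects) matrix with zeros on the diagonal."""
    r = np.corrcoef(np.asarray(timeseries, float))
    return 1.0 - r
```

Identical time courses give a distance of 0, uncorrelated ones 1, and perfectly anti‐correlated ones 2, matching the 0–2 range discussed in Section 2.11.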
FIGURE 2.

Scheme for how Inter‐Subject Representational Similarity Analysis (IS‐RSA) maps the association between inter‐individually shared subjective emotional intensity responses and inter‐individually shared neural responses. From left to right: (a) BOLD time series of a brain region of interest (top) and emotional intensity time series from subjective annotations (bottom) are extracted for each subject separately. (b) Inter‐subject similarity is then calculated for neural responses in that representative brain region (top) and for subjective annotation time courses (bottom). Thus, every parcel in the triangle depicts the correlation strength between two individuals' signal time courses. These neural inter‐subject correlation coefficients (upper triangle) and behavioral inter‐subject coefficients (lower triangle) are then correlated with each other (panel c; color indicates density of ISC scores, with darker colors indicating lower, and brighter colors indicating higher density of ISCs). The resulting correlation coefficient is then transformed into correlation distance for better interpretability (d). This analysis was repeated with different transformations of the subjective annotations: (1) by dichotomizing the subject's emotional annotation time course into moments of increases (yes/no), here called engagement, and (2) by dichotomizing a subject's emotional annotation time course into moments of decreases (yes/no), here called disengagement (see Section 2). To assess inter‐subject similarity between these binary time courses, Hamming distance was used instead of Pearson correlation distance. ISC, inter‐subject‐correlation.
2.9.2. Synchronization of subjective emotional engagement and disengagement
Utilizing Hamming distance for dichotomous variables (Ayala et al., 2013), we computed ISC coefficients for the previously estimated engagement and disengagement episodes. This analysis resulted in two between‐subject correlation matrices (one for engagement and one for disengagement). ISC coefficients for the continuous annotations (no distinction of phases) were also calculated by using Pearson correlation distance.
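For the binary engagement/disengagement vectors, the pairwise Hamming distance can be sketched as below. This is an illustrative implementation (the normalized form: fraction of time points at which two subjects disagree); the function name is hypothetical.

```python
import numpy as np

def hamming_isc(binary_courses):
    """Pairwise (normalized) Hamming distance between subjects' binary
    engagement or disengagement time courses. `binary_courses` is
    (n_subjects, n_timepoints); returns an (n_subjects, n_subjects)
    matrix of disagreement fractions."""
    b = np.asarray(binary_courses, bool)
    n = b.shape[0]
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            d[i, j] = np.mean(b[i] != b[j])   # fraction of mismatches
    return d
```

Note that a subject with no engagement phases (an all‐zero vector) still contributes informative distances, consistent with the point made in Section 3.1 that phase‐free subjects are not excluded.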
2.10. Subjective emotional intensity annotations
To examine whether inter‐subject correlation (ISC) of emotional engagement and disengagement differs across neutral and sad movies, we built a linear mixed‐effects regression model (Baayen et al., 2008) predicting ISC from phase type (engagement/disengagement) and emotion category (sad/neutral), with subject‐pair ID as a random intercept. Linear and generalized mixed‐effects modeling was performed using the lme4 package in R (Bates et al., 2014). Correction for multiple comparisons was performed with the Benjamini–Hochberg False Discovery Rate (FDR) approach (Benjamini & Hochberg, 1995).
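The study fits this model in R with lme4; a rough Python analogue using statsmodels might look as follows. Column names (`isc`, `phase`, `category`, `pair_id`) are hypothetical, and this sketch is not equivalent in every detail to the lme4 fit.

```python
import statsmodels.formula.api as smf

def fit_isc_model(df):
    """Mixed-effects sketch of the model described above: predict
    pairwise ISC from phase (engagement/disengagement) and emotion
    category (sad/neutral), with a random intercept per subject pair.
    `df` is a pandas DataFrame with columns isc, phase, category,
    and pair_id."""
    model = smf.mixedlm("isc ~ phase * category", data=df,
                        groups=df["pair_id"])
    return model.fit()
```

The `phase * category` term expands to both main effects plus their interaction, mirroring the engagement‐by‐category interaction reported in Section 3.1.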
2.11. Inter‐subject representation similarity analysis
Having investigated behavioral and neural similarity, we next sought to understand the association between the two. That is, we asked whether the degree of behavioral similarity was systematically related to how strongly a brain region's neural responses synchronize across individuals. To this aim, we used inter‐subject representational similarity analysis, calculating the Spearman correlation between ISC scores of brain parcels' BOLD time courses and ISC scores of individuals' subjective emotional intensity time courses. A Mantel permutation test (Mantel, 1967) was performed, which recalculates the correlation for each of 10,000 random permutations obtained by shuffling rows and columns of one subject correlation matrix, creating a null distribution from which p‐values were derived in a one‐tailed test with the alternative hypothesis that correlations are greater than 0. Finally, permuted p‐values were thresholded with the FDR method (from nltools; L. Chang et al., 2018) (Benjamini & Yekutieli, 2001) to correct for multiple comparisons (119 regions). This resulted in Spearman correlation coefficients bounded between −1 and +1. For the ISC analysis and the IS‐RSA, we used code by Chen et al. (2020; https://github.com/cosanlab/affective_ISRSA) under the Python environment (Python 3.7.12; Van Rossum & Drake, 2009). To avoid misinterpretations in the context of RSA (for discussion see Dimsdale‐Zucker and Ranganath (2018)), we re‐scaled Spearman correlation coefficients using a correlation distance metric (Kaufman & Rousseeuw, 2009). This approach transforms coefficients to a range between 0 and +2, where values close to zero represent higher similarity (Popal et al., 2019), as previously done in studies using RSA (Kiani et al., 2007; Kriegeskorte et al., 2008; Kriegeskorte & Kievit, 2013).
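The IS‐RSA step with its Mantel test can be sketched as follows. This is a simplified stand‐in for the affective_ISRSA code: it correlates the upper triangles of a neural and a behavioral ISC matrix and builds a null distribution by jointly permuting rows and columns of one matrix; function names are illustrative.

```python
import numpy as np
from scipy.stats import spearmanr

def upper_tri(m):
    """Vectorize the upper triangle (excluding the diagonal) of a
    square subject-by-subject matrix."""
    m = np.asarray(m)
    return m[np.triu_indices_from(m, k=1)]

def mantel_isrsa(neural_isc, behav_isc, n_perm=1000, seed=0):
    """IS-RSA sketch: Spearman correlation between the upper triangles
    of a neural and a behavioral ISC matrix, with a one-tailed Mantel
    permutation test (rows/columns of the behavioral matrix shuffled
    jointly to preserve its structure)."""
    neural_isc = np.asarray(neural_isc, float)
    behav_isc = np.asarray(behav_isc, float)
    rng = np.random.default_rng(seed)
    neural_vec = upper_tri(neural_isc)
    r_obs, _ = spearmanr(neural_vec, upper_tri(behav_isc))
    n = behav_isc.shape[0]
    null = np.empty(n_perm)
    for k in range(n_perm):
        perm = rng.permutation(n)
        shuffled = behav_isc[np.ix_(perm, perm)]   # same permutation on rows and columns
        null[k], _ = spearmanr(neural_vec, upper_tri(shuffled))
    p = (np.sum(null >= r_obs) + 1) / (n_perm + 1)   # one-tailed, r > 0
    return r_obs, p
```

The resulting coefficient could then be re‐scaled to the 0–2 correlation‐distance range (1 − ρ) as described above.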
Instead of treating each ROI as an individual unit, we repeated our analyses treating all ROIs corresponding to each search term (arousal, auditory, awareness, cognitive emotional, emotional responses, sad, and visual perception) entered into the Neurosynth database as one system: for each subject, we first extracted the individual BOLD time series of each parcel belonging to a search term. We then aggregated over the individual parcel time courses, resulting in one time course per search term and subject. ISC was then performed on each subject's search‐term time series, resulting in an ISC matrix per term. Finally, we applied the RSA as outlined above (see Section 2), correlating the search‐term ISC matrix and the behavioral ISC matrix.
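The term‐level variant can be sketched as below. This assumes the aggregation step is a simple average over parcel time courses (the text says "aggregated" without specifying the operator); data structures and names are hypothetical.

```python
import numpy as np

def term_isc(parcel_timeseries, term_to_parcels):
    """For each search term, average the BOLD time courses of its
    parcels into one time course per subject, then compute the
    pairwise correlation-distance (1 - r) ISC matrix for that term.
    `parcel_timeseries` maps parcel label -> (n_subjects, n_timepoints)
    array; `term_to_parcels` maps term -> list of parcel labels."""
    out = {}
    for term, parcels in term_to_parcels.items():
        stacked = np.stack([parcel_timeseries[p] for p in parcels])
        mean_ts = stacked.mean(axis=0)             # (n_subjects, n_time)
        out[term] = 1.0 - np.corrcoef(mean_ts)     # ISC as correlation distance
    return out
```

Each returned matrix can then be fed into the same Mantel/IS‐RSA procedure as the parcel‐level matrices.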
3. RESULTS
3.1. Subjective emotional intensity annotations
We hypothesized that the emotionally evocative sad movie would result in (1) higher synchronization of emotional reports and (2) higher scores of subjective emotional intensity across subjects, in contrast to the neutral movie. Paired t‐tests showed that mean continuous annotations of subjective emotional intensity were significantly higher for the sad as compared to the neutral movie clip (t(21) = 7.37, p < .001). Mixed‐effects modeling showed that the sad movie clip was associated with higher inter‐individual similarity than the neutral movie clip (sad vs. neutral: β = 1.56; 95% CI: [1.45, 1.67]; see Figure 3b and Table S3).
FIGURE 3.

Estimated inter‐subject similarity of behavioral and neural responses as a function of emotion induction condition (sad/neutral movie) using linear mixed‐effects modeling. (a) Inter‐subject similarity in engaging and disengaging phases, derived from rises and falls in subjective emotional intensity annotations, respectively. (b) Estimated inter‐subject similarity in continuous subjective emotional intensity. (c) Inter‐subject similarity in neural responses throughout the 119 regions of interest. ISC, inter‐subject correlation.
We further examined whether the degree of inter‐individual similarity varied as a function of whether individuals engaged or disengaged, and whether this association differed for the neutral movie clip. Engagement and disengagement were identified by increases and decreases in subjective emotional intensity annotations, (1) considering the direction of signal change and (2) the range of the signal defined by within‐subject percentile thresholds (see Section 2).
For the sad movie, there were on average 6.90 ± 4.82 engagement phases and 2.40 ± 3.14 disengagement phases (see Figure S1), with mean durations of 13.81 ± 9.65 and 4.81 ± 6.28 s, respectively. For the neutral movie, there were on average 1.31 ± 1.64 engagement phases and 0.9 ± 1.63 disengagement phases, with mean durations of 2.63 ± 3.28 and 1.81 ± 3.26 s, respectively (see Figure S1). During the sad movie, disengagement phases were absent for 10 subjects. Note that the absence of phases does not exclude subjects from the analyses but instead provides valuable information for the calculation of inter‐subject correlations. For the neutral movie, 10 subjects had no engagement phases and 13 subjects had no disengagement phases; of these, 10 had neither engagement nor disengagement phases.
Next, we built a mixed‐effects model assessing how the inter‐subject similarity of subjective emotional intensity annotations varied as a function of emotional category (sad/neutral) and emotional phase (engagement/disengagement). There was a significant main effect of emotional category (sad vs. neutral: β = 2.14; 95% CI: [2.10, 2.17]), indicating that inter‐subject similarity was higher during the sad as opposed to the neutral condition (Table S4). There was a significant interaction of emotional category with phase (engagement vs. disengagement: β = −0.41; 95% CI: [−0.46, −0.36]), indicating that, for the sad movie clip, subjects were more similar to each other in disengagement as opposed to engagement phases. Taken together, compared to the neutral movie, the sad movie induced higher similarity in subjective emotional processing, as indicated by higher inter‐subject similarity in subjective emotional intensity annotations, and subjects aligned more strongly in their disengagement as opposed to engagement behavior. Critically, this differential effect for engagement and disengagement was only present during the sad, but not the neutral, movie (Figure 3a), suggesting that the sad movie elicited more consistent emotional fluctuations than the neutral one.
3.2. Inter‐subject similarity in neural responses
Given previous research reporting that individuals synchronize their neural responses during more engaging parts of a movie (Gruskin et al., 2020; Maffei, 2020; Sachs et al., 2020; Song et al., 2021; Trost et al., 2015), we sought to replicate this finding in our data. We first calculated ISCs for each of the 119 brain regions for every movie clip (sad/neutral) separately. We then constructed a mixed‐effects model predicting ISC scores as a function of movie clip (sad/neutral). This analysis revealed significantly higher brain‐wide inter‐subject similarity in neural responses for the sad as compared to the neutral movie condition (sad vs. neutral: β = 0.42; 95% CI: [0.32, 0.52]; see Table S2). Higher ISC values were concentrated in occipital and temporal areas, involving the right occipital fusiform gyrus (mean ISC = .51), followed by the left and right lingual gyrus (mean ISC = .51 and .51, respectively). For the neutral condition, higher ISC values were concentrated in occipital areas (Figure 3c), such as the left and right lingual gyrus (mean ISC = .40; .35) and the occipital pole (mean ISC = .37).
3.3. Inter‐subject representational similarity analysis
Having established that the sad movie clip was associated with (1) stronger alignment in emotional intensity annotations and (2) stronger alignment in neural responses across subjects, we used representational similarity analysis (Brooks et al., 2019; Chen et al., 2020; Finn et al., 2020; Sachs et al., 2020) to assess the degree to which neural and behavioral synchronization covary. The analysis revealed that higher inter‐individual synchronization of engagement was underpinned by higher synchronization of neural responses within occipital and prefrontal regions (Figure 4a), specifically within the bilateral occipital fusiform gyrus (OFG), the bilateral ventromedial prefrontal cortex (vmPFC), and the right subgenual anterior cingulate cortex (sgACC; Table S5).
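The logic of inter-subject RSA can be sketched as a Mantel-style test: for each region, correlate the vectorized subject-by-subject neural similarity matrix with the behavioral (e.g., engagement) similarity matrix, and build a null distribution by jointly permuting subject labels. The rank correlation and permutation scheme below are assumptions for illustration; the paper's exact metric may differ.

```python
# Sketch of inter-subject RSA for one brain region (illustrative only).
import numpy as np
from scipy.stats import spearmanr

def isc_rsa(neural_sim, behav_sim, n_perm=1000, seed=0):
    """Spearman correlation between upper triangles of two subject-by-subject
    similarity matrices, with a subject-permutation p-value."""
    n = neural_sim.shape[0]
    iu = np.triu_indices(n, k=1)
    rho, _ = spearmanr(neural_sim[iu], behav_sim[iu])
    rng = np.random.default_rng(seed)
    null = np.empty(n_perm)
    for i in range(n_perm):
        p = rng.permutation(n)                  # relabel subjects jointly
        perm = behav_sim[np.ix_(p, p)]          # permute rows and columns together
        null[i], _ = spearmanr(neural_sim[iu], perm[iu])
    p_val = (np.sum(null >= rho) + 1) / (n_perm + 1)
    return rho, p_val

rng = np.random.default_rng(2)
b = rng.normal(size=(22, 22)); b = (b + b.T) / 2       # behavioral similarity
noise = rng.normal(size=(22, 22)); noise = (noise + noise.T) / 2
rho, p = isc_rsa(b + 0.2 * noise, b, n_perm=500)       # a "related" region
print(round(rho, 2), p)
```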
FIGURE 4.

Inter‐subject representational similarity analysis results. Thresholded similarity maps for variations in subjective engagement and disengagement during sad (a) and neutral (b) movie clips. (a, top) Engagement. vmPFC, ventromedial prefrontal cortex; FG, fusiform gyrus; sgACC, subgenual anterior cingulate cortex. (a, bottom) Disengagement. dmPFC, dorsomedial prefrontal cortex; pgACC, perigenual anterior cingulate cortex; dACC, dorsal anterior cingulate cortex; TPJ, temporoparietal junction; vlPFC, ventrolateral prefrontal cortex; dlPFC, dorsolateral prefrontal cortex; slOC, superior lateral occipital cortex; Cd, caudate nucleus. (b, top) Engagement. ILOC, inferior lateral occipital cortex; AI, anterior insula; dACC; preCG, precentral gyrus; pCG, paracingulate gyrus; pSMG, posterior supramarginal gyrus; pgACC; sPL, superior parietal lobe. (b, bottom) No significant effects. The color bar illustrates the correlation distance ranging between 0 and 2, where values closer to zero reflect a higher association between inter‐subject similarity in neural responses and inter‐subject similarity in subjective engagement and disengagement, respectively. Note that effects for engagement in the neutral condition arise mainly from negative correlations, resulting in higher correlation distance. ISC, inter‐subject correlation.
Higher inter‐individual synchronization of disengagement was underpinned by stronger synchronization of neural responses within brain regions mostly belonging to the DMN (Yeo et al., 2011), that is, the left ventrolateral prefrontal cortex (vlPFC), the left vmPFC, the left perigenual anterior cingulate cortex (pgACC), and the left temporo‐parietal junction (TPJ; see Figure 4a). Further, effects with the same directionality were found in the left dorsomedial prefrontal cortex (dmPFC), belonging to the salience and somatomotor networks (Yeo et al., 2011); the right dorsolateral prefrontal cortex (dlPFC) and the dorsal anterior cingulate cortex (dACC), belonging to the central executive network (CEN); the subcortical caudate nucleus; and the superior lateral occipital cortex (slOC; see Table S6).
For the neutral movie, no significant effects were present for disengagement. For engagement, effects were derived from negative correlation scores between synchronization in engagement and neural responses (Kriegeskorte et al., 2008; Table S10). Given that RSA presumes positive correlations as evidence of similarity, interpreting negative associations has proven difficult. Here we follow the interpretation suggested by Dimsdale‐Zucker and Ranganath (2018), who argue that negative correlation scores in RSA reflect very low similarity between neural responses, which is better represented by converting correlation scores to a distance metric. In contrast to the sad movie clip, the RSA for the neutral movie clip yielded mainly negative correlations (except for the TPJ and slOC in disengagement), complicating the interpretation of these findings. Additionally, when using the continuous subjective emotional intensity (without distinguishing between phases), significant regions emerged exclusively for the sad movie (Table S7), in contrast to the neutral film (Figure S2).
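The correlation-to-distance conversion referenced here (and used for the color bar in Figure 4) is the standard correlation distance d = 1 − r, which maps correlations in [−1, 1] onto distances in [0, 2], so that strongly negative correlations appear as large distances. A minimal sketch:

```python
# Correlation distance: maps r in [-1, 1] to d in [0, 2].
import numpy as np

def corr_distance(r):
    """d = 1 - r; d = 0 for perfect positive correlation, d = 2 for perfect negative."""
    return 1.0 - np.asarray(r, dtype=float)

print(corr_distance([-1.0, 0.0, 1.0]))  # -> [2. 1. 0.]
```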
3.4. Exploratory analyses
As an exploratory analysis, we extended the analyses outlined above by treating groups of regions belonging to an individual search term from the Neurosynth database as one single brain parcel (see Section 2). For the sad movie, regions commonly reported with the search term "sad" were the only ones associated with both emotional engagement and disengagement (Figure 5): greater similarity in brain activity was related to similarity in both engagement and disengagement. For the remaining ROI groups identified with search terms ("auditory," "arousal," "visual perception," "awareness," and "cognitive emotional"), inter‐subject similarity in emotional engagement was positively associated with inter‐subject similarity in brain activity (see Table S8). For the search term "emotional responses," there was an association with neither engagement nor disengagement. Permuted p‐values were thresholded with FDR (p = .05) across ROIs. For the neutral condition, no significant effects were found for brain parcels associated with the search terms (Table S9).
FIGURE 5.

Inter‐individual similarity of engagement and disengagement in relation to inter‐individual similarity in neural responses for brain regions belonging to search terms from the Neurosynth database. Significant correlations are marked with an asterisk. ISC, inter‐subject correlation.
4. DISCUSSION
We used a sad and a neutral movie clip with simultaneous continuous annotations of subjective emotional intensity to assess whether, and if so to what degree, individuals with similar patterns of emotional experience recruit neural circuits similarly during movie watching. In particular, we hypothesized that individuals vary in their tendency to respond to emotional material presented during movie watching, as reflected in varying patterns of onsets and durations of episodes of subjective engagement and disengagement. We sought to identify neural response patterns that synchronize together with the degree of inter‐individual alignment in subjective experience. Specifically, we hypothesized that increases and decreases in the time course of emotional intensity annotations reflect the subjectively noticed beginning and end of episodes in which individuals emotionally engaged and disengaged, respectively. To this aim, for every individual, we dichotomized the continuous subjective emotional intensity time courses into moments of engagement (present/not present) and disengagement (present/not present) by identifying time points when the signal rose or fell beyond intra‐individually defined thresholds. We asked whether engagement would be more similar or dissimilar across individuals compared to disengagement, and whether the degree of shared responses differs for sad compared to neutral content. During the sad movie clip, disengagement reports were significantly more aligned across individuals than engagement reports. In contrast, the neutral movie showed less inter‐individual synchronization overall and did not significantly differentiate engagement from disengagement. One could speculate that more similarly experienced disengagement reflects more similar usage of emotion regulation strategies, whereas noticed onsets of emotional engagement were more prone to subjective interpretation. However, further research is needed to investigate this.
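The dichotomization step described above can be sketched as follows, assuming a percentile-based intra-individual threshold on the slope of the annotation timecourse; the paper's exact threshold definition may differ, so this is an illustration of the idea, not the authors' implementation.

```python
# Illustrative dichotomization of a continuous intensity timecourse into
# engagement (rising) and disengagement (falling) indicators.
import numpy as np

def phase_masks(intensity: np.ndarray, pct: float = 75.0):
    """Boolean masks for steep rises (engagement) and falls (disengagement)."""
    slope = np.gradient(intensity)
    thr = np.percentile(np.abs(slope), pct)  # intra-individual threshold (assumed form)
    engagement = slope > thr                  # signal rising steeply
    disengagement = slope < -thr              # signal falling steeply
    return engagement, disengagement

t = np.linspace(0, 4 * np.pi, 300)
eng, dis = phase_masks(np.sin(t))  # toy oscillating "intensity" annotation
print(eng.sum(), dis.sum())
```

Per-subject masks of this kind can then be compared across individuals (e.g., via pairwise similarity of the binary vectors) to quantify the alignment of engagement and disengagement reports.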
It has been hypothesized that the degree to which individuals resemble one another in their subjective affective experience is reflected in the degree to which they resemble one another in their neural responses (Hasson et al., 2004, 2008; Nummenmaa et al., 2012); empirically, however, their interrelation is poorly understood. For the sad movie clip, neural responses within the fusiform gyrus exhibited higher synchronization when engagement (but not disengagement) was synchronized across individuals. This effect was absent for the neutral movie clip. The fusiform gyrus has consistently been found to be a face‐responsive region (Duchaine & Yovel, 2015; Kawasaki et al., 2012; Liang et al., 2017), especially when facial expressions are presented audio‐visually (Wild et al., 2001), with higher activation for emotional than for neutral facial expressions (Apicella et al., 2013). During movie viewing, the fusiform gyrus was previously shown to exhibit the highest inter‐subject similarity across the entire brain network when faces versus non‐faces were shown (Hasson et al., 2004). It is worth mentioning that the films were not quantitatively matched on low‐level features such as the presence of faces, so these differences may partly be attributable to factors other than emotion.
For both engagement and disengagement, our synchronization analysis drew on the vmPFC. The vmPFC has been reported in many different contexts, including autonomic arousal regulation (Tranel & Damasio, 1994; Zahn et al., 1999; Zhang et al., 2014), negative affect generation with and without regulation (Etkin et al., 2011; Winecoff et al., 2013), and self‐directed cognition (Northoff & Bermpohl, 2004). It has been hypothesized that the vmPFC does not drive affective responses per se, but instead is crucial for transducing conceptual information to generate affective meaning that drives affective behavior (Roy et al., 2012). In this regard, more synchronized engagement and disengagement reports, together with more synchronized vmPFC responses, could reflect that sad scenes within the movie were similarly contextualized across individuals, which is critical for determining whether a stimulus is subjectively experienced as emotionally arousing.
During the sad movie clip, subjects were more similar in reporting moments of disengagement than moments of engagement. One can only speculate whether synchronized disengagement reflects more similar use of emotion regulation strategies and, if so, whether these strategies are of a more explicit or automatic nature. It was previously shown that the induction of sadness results in more analytic strategies compared with other negative emotions such as anger (Bodenhausen et al., 1994). In this regard, it is possible that inter‐individual similarities in disengagement were driven by inter‐individual similarities in explicit emotion regulation strategy usage. Our neural findings support this: stronger alignment in disengagement across subjects was related to more synchronized neural responses in frontally distributed regions belonging to the DMN, including the left vlPFC and vmPFC, and regions belonging to the central executive network, including the right dlPFC and sections of the left dACC. These regions are typically involved when negative emotions are reappraised, as compared with merely attended to without trying to alter the elicited feelings (Buhle et al., 2014; Diekhof et al., 2011; Kohn et al., 2014; Ochsner et al., 2012; for a review see Etkin et al., 2015). Specifically, it was shown that brain stimulation‐induced increases in dlPFC excitability improved both the up‐ and down‐regulation of negative emotions, and self‐reports indicated that this improvement was attributable to successful reappraisal of negative emotions (Feeser et al., 2014). Even though these studies did not investigate inter‐individual synchronization, their findings could suggest that the frontal synchronization in our data reflects more inter‐individually similar employment of emotion regulation for coping with negative emotions. However, further research is needed to corroborate this assumption and to investigate what type of strategies were employed.
We would like to highlight the sgACC and pgACC as potentially playing critical, complementary roles in the dynamics of emotion processing, related to synchronized engagement and disengagement, respectively. The sgACC drives changes in physiological arousal by predicting current or upcoming emotionally relevant events (Barrett & Simmons, 2015); sgACC metabolism has consistently predicted individual differences in plasma cortisol (Jahn et al., 2010; Sullivan & Gratton, 2002), reflecting high inter‐individual variability in autonomic arousal regulation (Dixon et al., 2017). One could speculate that more inter‐individually synchronized engagement during movie watching reflects more similar physiological arousal regulation as a result of more synchronized predictions about the emotional relevance of stimuli in the movie. The pgACC has previously been hypothesized to serve as the next relay for the context‐dependent self‐referenced evaluation of interoceptive sensations that were first autonomically processed by the sgACC (Lane et al., 2015). Whereas the subgenual part of the ACC connects to regions involved in autonomic arousal regulation, the pregenual part connects with DMN regions such as the vlPFC, enabling the attribution of affective valence to subjective feeling states (Ernst et al., 2014; Lane et al., 2015). Thus, whereas synchronized sgACC processing during the sad movie could reflect similar arousal induction as a result of similar predictions of emotional relevance, pgACC synchronization could reflect the self‐referenced evaluation of interoceptive sensations necessary for initiating and incorporating suitable explicit emotion regulation strategies.
As an exploratory analysis, we grouped brain regions based on their mutual involvement in studies associated with specific search terms in the Neurosynth database (sad, arousal, awareness, visual perception, emotional responses, cognitive emotional, auditory; www.neurosynth.org; Yarkoni et al., 2011). When repeating the ISC‐RSA with these groups of brain regions treated as one entity, the neutral movie clip did not reveal a significant association between neural and behavioral synchronization. In contrast, for the sad movie clip, almost all search‐term region groups showed a significant positive association with engagement synchronization. For instance, the brain region group for "arousal," involving subcortical areas such as the anterior insula (Critchley et al., 2001) and thalamus (Coull et al., 2004) as well as the salience network (Critchley et al., 2005) and DMN (Mourao‐Miranda et al., 2003), synchronized together with engagement. The only group of brain regions that revealed a positive association with both engagement and disengagement belonged to the search term "sad" (bilateral central and basolateral amygdala and the bilateral anterior insular cortex; Abou Elseoud et al., 2014; Du et al., 2015; Talati et al., 2013; Yang et al., 2019). In contrast, region groups belonging to the search term "emotional responses," involving the right dACC, the bilateral central amygdala, and the bilateral anterior insula (Hare et al., 2005; Zaki et al., 2012), showed an association with neither engagement nor disengagement. Together these findings suggest that inter‐individually shared emotional activation and deactivation is specific to regions associated with sad emotional responses rather than emotional responses in general.
The present study relies on correlation analysis to understand the association between inter‐individual behavioral and neural response similarity. However, future work should also investigate the causal interdependencies between neural signatures implicated in subjective emotional experience. For instance, Grosbras et al. (2012) found that low‐frequency repetitive transcranial magnetic stimulation (rTMS) over the right posterior parietal cortex enhanced the subjective positive rating of a dance performance recording, as compared with control (vertex) stimulation. Further, facilitating activation in the vmPFC with transcranial direct current stimulation, as compared with sham stimulation, resulted in reduced experienced emotional intensity together with greater activation within the vmPFC, sgACC, and ventral striatum during negative, but not neutral, film viewing.
4.1. Limitations
The present movie‐viewing task design enabled us to induce a richer and emotionally more immersive experience than traditional repeated stimulus–response designs with static pictures, while at the same time mapping temporal alternations of emotional engagement and disengagement. This naturalistic task design comes at a cost: dynamic fluctuations in emotional experience can arise from multiple cognitive sources, in that different explicit or implicit regulation processes could have led to one and the same emotional outcome. Additionally, our findings could be confounded by the mood subjects were in before watching the movies. Future movie fMRI studies should assess individuals' mood and arousal state with questionnaires.
In terms of methodology, the employment of a functionally oriented parcellation (Neurosynth atlas) entails a discernible trade‐off in hemispheric specificity. Despite our effort to divide the parcellation into two hemispheres, the resultant parcel distribution is uneven, with 60 parcels in the left hemisphere and 59 in the right. This uneven allocation can be recognized as a limitation of the approach.
The inclusion of only female participants in the present study can be regarded as a drawback. Our decision to concentrate on female participants is grounded in our research goal of mitigating recognized gender‐related discrepancies in emotional processing (Fine et al., 2009; Hofer et al., 2007; Schulte‐Rüther et al., 2008). While this participant selection strategy might limit the broader applicability of our findings, we recognize the significance of incorporating a more comprehensive range of participants in future investigations. Additionally, we focused on the induction of sad (compared with neutral) emotions, and further research is needed to assess the specificity of our results to each emotional category. Further, our movie clips were relatively short (10 min in total, 5 min each), and one could argue that this duration was too short to characterize emotion fluctuations, especially given that engagement/disengagement "on" periods constitute only a fraction of the time course. However, our behavioral results show pronounced inter‐ and intra‐individually variable fluctuations, suggesting that the movies induced sufficient fluctuations in emotion processing to investigate our research questions. Lastly, future research needs to replicate and extend these results in a larger sample.
4.2. Conclusion
Taken together, the degree of inter‐individually shared emotional responses (as opposed to their individual magnitude) reveals neural signatures underlying affective processing and regulation in naturalistic paradigms. While the present work aimed at identifying the inter‐individual similarity of neural response patterns underlying inter‐individual similarity of subjective emotional response patterns, further research is needed to extend these findings for specific moments in time with dynamic ISC‐RSA approaches (e.g., Simony et al., 2016).
CONFLICT OF INTEREST STATEMENT
The authors declare that they have no competing financial interests or personal relationships that could have appeared to influence the work reported in this article.
Supporting information
Data S1: Supporting Information.
ACKNOWLEDGMENT
This research was supported by a scholarship to Melanni Nanni‐Zepeda by the National Council of Science and Technology of Mexico (486354). Open Access funding enabled and organized by Projekt DEAL.
Nanni‐Zepeda, M. , DeGutis, J. , Wu, C. , Rothlein, D. , Fan, Y. , Grimm, S. , Walter, M. , Esterman, M. , & Zuberer, A. (2024). Neural signatures of shared subjective affective engagement and disengagement during movie viewing. Human Brain Mapping, 45(4), e26622. 10.1002/hbm.26622
Contributor Information
Melanni Nanni‐Zepeda, Email: nanni.melanni@gmail.com.
Agnieszka Zuberer, Email: azuberer@gmail.com.
DATA AVAILABILITY STATEMENT
All code for statistical and image analysis is available at https://github.com/attentionaffectlab/Engagement-and-Disengagement. The data used in this study are confidential.
REFERENCES
- Abou Elseoud, A. , Nissilä, J. , Liettu, A. , Remes, J. , Jokelainen, J. , Takala, T. , Aunio, A. , Starck, T. , Nikkinen, J. , Koponen, H. , Zang, Y. F. , Tervonen, O. , Timonen, M. , & Kiviniemi, V. (2014). Altered resting‐state activity in seasonal affective disorder. Human Brain Mapping, 35(1), 161–172. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Apicella, F. , Sicca, F. , Federico, R. R. , Campatelli, G. , & Muratori, F. (2013). Fusiform gyrus responses to neutral and emotional faces in children with autism spectrum disorders: A high density ERP study. Behavioural Brain Research, 251, 155–162. [DOI] [PubMed] [Google Scholar]
- Ayala, J. , Shang, D. , & Yakovlev, A. (2013). Proceedings of integrated circuit and system design. Power and timing modeling, optimization and simulation. Springer.
- Baayen, R. H. , Davidson, D. J. , & Bates, D. M. (2008). Mixed‐effects modeling with crossed random effects for subjects and items. Journal of Memory and Language, 59(4), 390–412. [Google Scholar]
- Barrett, L. F. , & Simmons, W. K. (2015). Interoceptive predictions in the brain. Nature Reviews Neuroscience, 16(7), 419–429. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bates, D. , Mächler, M. , Bolker, B. , & Walker, S. (2014). Fitting linear mixed‐effects models using lme4. arXiv preprint arXiv:1406.5823 .
- Benjamini, Y. , & Hochberg, Y. (1995). Controlling the false discovery rate: A practical and powerful approach to multiple testing. Journal of the Royal Statistical Society: Series B (Methodological), 57(1), 289–300. [Google Scholar]
- Benjamini, Y. , & Yekutieli, D. (2001). The control of the false discovery rate in multiple testing under dependency. Annals of Statistics, 29, 1165–1188. [Google Scholar]
- Blicher, A. , Reinholdt‐Dunne, M. L. , Hvenegaard, M. , Winding, C. , Petersen, A. , & Vangkilde, S. (2020). Engagement and disengagement components of attentional bias to emotional stimuli in anxiety and depression. Journal of Experimental Psychopathology, 11(3), 2043808720943753. [Google Scholar]
- Bodenhausen, G. V. , Sheppard, L. A. , & Kramer, G. P. (1994). Negative affect and social judgment: The differential impact of anger and sadness. European Journal of Social Psychology, 24(1), 45–62. [Google Scholar]
- Borchardt, V. , Fan, Y. , Dietz, M. , Melendez, A. L. H. , Bajbouj, M. , Gärtner, M. , Li, M. , Walter, M. , & Grimm, S. (2018). Echoes of affective stimulation in brain connectivity networks. Cerebral Cortex, 28(12), 4365–4378. 10.1093/cercor/bhx290 [DOI] [PubMed] [Google Scholar]
- Bossier, H. , Roels, S. P. , Seurinck, R. , Banaschewski, T. , Barker, G. J. , Bokde, A. L. , Quinlan, E. B. , Desrivières, S. , Flor, H. , Grigis, A. , Garavan, H. , Gowland, P. , Heinz, A. , Ittermann, B. , Martinot, J.‐L. , Artiges, E. , Nees, F. , Orfanos, D. P. , Poustka, L. , … Moerkerke, B. (2020). The empirical replicability of task‐based FMRI as a function of sample size. NeuroImage, 212, 116601. [DOI] [PubMed] [Google Scholar]
- Braunstein, L. M. , Gross, J. J. , & Ochsner, K. N. (2017). Explicit and implicit emotion regulation: A multi‐level framework. Social Cognitive and Affective Neuroscience, 12(10), 1545–1557. 10.1093/scan/nsx096 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Brooks, J. A. , Chikazoe, J. , Sadato, N. , & Freeman, J. B. (2019). The neural representation of facial‐emotion categories reflects conceptual structure. Proceedings of the National Academy of Sciences, 116(32), 15861–15870. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Buhle, J. T. , Silvers, J. A. , Wager, T. D. , Lopez, R. , Onyemekwu, C. , Kober, H. , Weber, J. , & Ochsner, K. N. (2014). Cognitive reappraisal of emotion: A meta‐analysis of human neuroimaging studies. Cerebral Cortex, 24(11), 2981–2990. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Canini, L. , Gilroy, S. , Cavazza, M. , Leonardi, R. , & Benini, S. (2010). Users' response to affective film content: A narrative perspective. 2010 International Workshop on Content Based Multimedia Indexing (CBMI), pp. 1–6.
- Chang, L. , Jolly, E. , Cheong, J. , Burnashev, A. , & Chen, A. (2018). Cosanlab/nltools. 0.3. 11.
- Chang, L. J. , Jolly, E. , Cheong, J. H. , Rapuano, K. M. , Greenstein, N. , Chen, P.‐H. A. , & Manning, J. R. (2021). Endogenous variation in ventromedial prefrontal cortex state dynamics during naturalistic viewing reflects affective experience. Science Advances, 7(17), eabf7129. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Chen, P.‐H. A. , Jolly, E. , Cheong, J. H. , & Chang, L. J. (2020). Intersubject representational similarity analysis reveals individual variations in affective experience when watching erotic movies. NeuroImage, 216, 116851. 10.1016/j.neuroimage.2020.116851 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Coull, J. T. , Jones, M. E. , Egan, T. D. , Frith, C. D. , & Maze, M. (2004). Attentional effects of noradrenaline vary with arousal level: Selective activation of thalamic pulvinar in humans. NeuroImage, 22(1), 315–322. [DOI] [PubMed] [Google Scholar]
- Critchley, H. D. , Mathias, C. J. , & Dolan, R. J. (2001). Neural activity in the human brain relating to uncertainty and arousal during anticipation. Neuron, 29(2), 537–545. [DOI] [PubMed] [Google Scholar]
- Critchley, H. D. , Rotshtein, P. , Nagai, Y. , O'Doherty, J. , Mathias, C. J. , & Dolan, R. J. (2005). Activity in the human brain predicting differential heart rate responses to emotional facial expressions. NeuroImage, 24(3), 751–762. [DOI] [PubMed] [Google Scholar]
- Dessu, S. B. , Price, R. M. , Kominoski, J. S. , Davis, S. E. , Wymore, A. S. , McDowell, W. H. , & Gaiser, E. E. (2020). Percentile‐range indexed mapping and evaluation (PRIME): A new tool for long‐term data discovery and application. Environmental Modelling & Software, 124, 104580. 10.1016/j.envsoft.2019.104580 [DOI] [Google Scholar]
- Diekhof, E. K. , Geier, K. , Falkai, P. , & Gruber, O. (2011). Fear is only as deep as the mind allows: A coordinate‐based meta‐analysis of neuroimaging studies on the regulation of negative affect. NeuroImage, 58(1), 275–285. [DOI] [PubMed] [Google Scholar]
- Dimsdale‐Zucker, H. R. , & Ranganath, C. (2018). Chapter 27—Representational similarity analyses: A practical guide for functional MRI applications. In Manahan‐Vaughan D. (Ed.), Handbook of behavioral neuroscience (pp. 509–525). Elsevier. 10.1016/B978-0-12-812028-6.00027-6 [DOI] [Google Scholar]
- Dixon, M. L. , Thiruchselvam, R. , Todd, R. , & Christoff, K. (2017). Emotion and the prefrontal cortex: An integrative review. Psychological Bulletin, 143(10), 1033–1081. [DOI] [PubMed] [Google Scholar]
- Dmochowski, J. P. , Sajda, P. , Dias, J. , & Parra, L. C. (2012). Correlated components of ongoing EEG point to emotionally laden attention–A possible marker of engagement? Frontiers in Human Neuroscience, 6, 112. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Du, Y. , Pearlson, G. D. , Liu, J. , Sui, J. , Yu, Q. , He, H. , Castro, E. , & Calhoun, V. D. (2015). A group ica based framework for evaluating resting FMRI markers when disease categories are unclear: Application to schizophrenia, bipolar, and schizoaffective disorders. NeuroImage, 122, 272–280. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Duchaine, B. , & Yovel, G. (2015). A revised neural framework for face processing. Annual Review of Vision Science, 1, 393–416. [DOI] [PubMed] [Google Scholar]
- Ernst, J. , Böker, H. , Hättenschwiler, J. , Schüpbach, D. , Northoff, G. , Seifritz, E. , & Grimm, S. (2014). The association of interoceptive awareness and alexithymia with neurotransmitter concentrations in insula and anterior cingulate. Social Cognitive and Affective Neuroscience, 9(6), 857–863. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Esteban, O. , Markiewicz, C. J. , Blair, R. W. , Moodie, C. A. , Isik, A. I. , Erramuzpe, A. , Kent, J. D. , Goncalves, M. , DuPre, E. , Snyder, M. , Oya, H. , Ghosh, S. S. , Wright, J. , Durnez, J. , Poldrack, R. A. , & Gorgolewski, K. J. (2019). fMRIPrep: A robust preprocessing pipeline for functional MRI. Nature Methods, 16(1), 111–116. 10.1038/s41592-018-0235-4 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Etkin, A. , Büchel, C. , & Gross, J. J. (2015). The neural bases of emotion regulation. Nature Reviews Neuroscience, 16(11), 693–700. 10.1038/nrn4044 [DOI] [PubMed] [Google Scholar]
- Etkin, A. , Egner, T. , & Kalisch, R. (2011). Emotional processing in anterior cingulate and medial prefrontal cortex. Trends in Cognitive Sciences, 15(2), 85–93. 10.1016/j.tics.2010.11.004 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Fan, Y. , Borchardt, V. , von Düring, F. , Leutritz, A. L. , Dietz, M. , Herrera‐Meléndez, A. L. , Bajbouj, M. , Li, M. , Grimm, S. , & Walter, M. (2019). Dorsal and ventral posterior cingulate cortex switch network assignment via changes in relative functional connectivity strength to noncanonical networks. Brain Connectivity, 9(1), 77–94. [DOI] [PubMed] [Google Scholar]
- Feeser, M. , Prehn, K. , Kazzer, P. , Mungee, A. , & Bajbouj, M. (2014). Transcranial direct current stimulation enhances cognitive control during emotion regulation. Brain Stimulation, 7(1), 105–112. [DOI] [PubMed] [Google Scholar]
- Fine, J. G. , Semrud‐Clikeman, M. , & Zhu, D. C. (2009). Gender differences in bold activation to face photographs and video vignettes. Behavioural Brain Research, 201(1), 137–146. [DOI] [PubMed] [Google Scholar]
- Finn, E. S. , Glerean, E. , Khojandi, A. Y. , Nielson, D. , Molfese, P. J. , Handwerker, D. A. , & Bandettini, P. A. (2020). Idiosynchrony: From shared responses to individual differences during naturalistic neuroimaging. NeuroImage, 215, 116828. 10.1016/j.neuroimage.2020.116828 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Fitzpatrick, S. , & Kuo, J. R. (2022). Predicting the effectiveness of engagement and disengagement emotion regulation based on emotional reactivity in borderline personality disorder. Cognition and Emotion, 36(3), 473–491. [DOI] [PubMed] [Google Scholar]
- Gaviria, J. , Rey, G. , Bolton, T. , Delgado, J. , van de Ville, D. , & Vuilleumier, P. (2021). Brain functional connectivity dynamics at rest in the aftermath of affective and cognitive challenges. Human Brain Mapping, 42(4), 1054–1069. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Gorgolewski, K. , Burns, C. D. , Madison, C. , Clark, D. , Halchenko, Y. O. , Waskom, M. L. , & Ghosh, S. S. (2011). Nipype: A flexible, lightweight and extensible neuroimaging data processing framework in python. Frontiers in Neuroinformatics, 5, 13. 10.3389/fninf.2011.00013 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Gorgolewski, K. J. , Esteban, O. , Markiewicz, C. J. , Ziegler, E. , Ellis, D. G. , Jarecka, D. , Johnson, H. , Notter, M. P. , Burns, C. , Manhães‐Savio, A. , Hamalainen, C. , Yvernault, B. , Salo, T. , Goncalves, M. , Jordan, K. , Waskom, M. , Wong, J. , Modat, M. , Clark, D. , … Ghosh, S. (2019, September 7). Nipy/nipype: 1.2.2. Zenodo . 10.5281/zenodo.3402255 [DOI]
- Griffanti, L. , Douaud, G. , Bijsterbosch, J. , Evangelisti, S. , Alfaro‐Almagro, F. , Glasser, M. F. , Duff, E. P. , Fitzgibbon, S. , Westphal, R. , Carone, D. , Beckmann, C. F. , & Smith, S. M. (2017). Hand classification of fMRI ICA noise components. NeuroImage, 154, 188–205. 10.1016/j.neuroimage.2016.12.036 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Griffanti, L. , Salimi‐Khorshidi, G. , Beckmann, C. F. , Auerbach, E. J. , Douaud, G. , Sexton, C. E. , Zsoldos, E. , Ebmeier, K. P. , Filippini, N. , Mackay, C. E. , Moeller, S. , Xu, J. , Yacoub, E. , Baselli, G. , Ugurbil, K. , Miller, K. L. , & Smith, S. M. (2014). ICA‐based artefact removal and accelerated FMRI acquisition for improved resting state network imaging. NeuroImage, 95, 232–247. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Grosbras, M.‐H. , Tan, H. , & Pollick, F. (2012). Dance and emotion in posterior parietal cortex: A low‐frequency RTMS study. Brain Stimulation, 5(2), 130–136. [DOI] [PubMed] [Google Scholar]
- Gruskin, D. C. , Rosenberg, M. D. , & Holmes, A. J. (2020). Relationships between depressive symptoms and brain responses during emotional movie viewing emerge in adolescence. NeuroImage, 216, 116217. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hanich, J. , Wagner, V. , Shah, M. , Jacobsen, T. , & Menninghaus, W. (2014). Why we like to watch sad films. The pleasure of being moved in aesthetic experiences. Psychology of Aesthetics, Creativity, and the Arts, 8(2), 130–143. 10.1037/a0035690 [DOI] [Google Scholar]
- Hanjalic, A. , & Xu, L.‐Q. (2005). Affective video content representation and modeling. IEEE Transactions on Multimedia, 7(1), 143–154. [Google Scholar]
- Hare, T. A. , Tottenham, N. , Davidson, M. C. , Glover, G. H. , & Casey, B. J. (2005). Contributions of amygdala and striatal activity in emotion regulation. Biological Psychiatry, 57(6), 624–632. [DOI] [PubMed] [Google Scholar]
- Hasson, U. , Avidan, G. , Gelbard, H. , Vallines, I. , Harel, M. , Minshew, N. , & Behrmann, M. (2009). Shared and idiosyncratic cortical activation patterns in autism revealed under continuous real‐life viewing conditions. Autism Research, 2(4), 220–231. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hasson, U. , Landesman, O. , Knappmeyer, B. , Vallines, I. , Rubin, N. , & Heeger, D. J. (2008). Neurocinematics: The neuroscience of film. Projections, 2(1), 1–26. [Google Scholar]
- Hasson, U. , Malach, R. , & Heeger, D. J. (2010). Reliability of cortical activity during natural stimulation. Trends in Cognitive Sciences, 14(1), 40–48. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hasson, U. , Nir, Y. , Levy, I. , Fuhrmann, G. , & Malach, R. (2004). Intersubject synchronization of cortical activity during natural vision. Science, 303(5664), 1634–1640. 10.1126/science.1089506 [DOI] [PubMed] [Google Scholar]
- Hofer, A. , Siedentopf, C. M. , Ischebeck, A. , Rettenbacher, M. A. , Verius, M. , Felber, S. , & Fleischhacker, W. W. (2007). Sex differences in brain activation patterns during processing of positively and negatively valenced emotional words. Psychological Medicine, 37(1), 109–119. [DOI] [PubMed] [Google Scholar]
- Hudson, M. , Seppälä, K. , Putkinen, V. , Sun, L. , Glerean, E. , Karjalainen, T. , Karlsson, H. K. , Hirvonen, J. , & Nummenmaa, L. (2020). Dissociable neural systems for unconditioned acute and sustained fear. NeuroImage, 216, 116522. 10.1016/j.neuroimage.2020.116522 [DOI] [PubMed] [Google Scholar]
- Hutcherson, C. , Goldin, P. R. , Ochsner, K. N. , Gabrieli, J. , Barrett, L. F. , & Gross, J. J. (2005). Attention and emotion: Does rating emotion alter neural responses to amusing and sad films? NeuroImage, 27(3), 656–668. [DOI] [PubMed] [Google Scholar]
- Iñárritu, A. G. (2003). 21 Grams [Film].
- Jahn, A. L. , Fox, A. S. , Abercrombie, H. C. , Shelton, S. E. , Oakes, T. R. , Davidson, R. J. , & Kalin, N. H. (2010). Subgenual prefrontal cortex activity predicts individual differences in hypothalamic‐pituitary‐adrenal activity across different contexts. Biological Psychiatry, 67(2), 175–181. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kato, S. , Kaplan, H. S. , Schrödel, T. , Skora, S. , Lindsay, T. H. , Yemini, E. , Lockery, S. , & Zimmer, M. (2015). Global brain dynamics embed the motor command sequence of Caenorhabditis elegans . Cell, 163(3), 656–669. [DOI] [PubMed] [Google Scholar]
- Kaufman, L. , & Rousseeuw, P. J. (2009). Finding groups in data: An introduction to cluster analysis (Vol. 344). John Wiley & Sons. [Google Scholar]
- Kawasaki, H. , Tsuchiya, N. , Kovach, C. K. , Nourski, K. V. , Oya, H. , Howard, M. A. , & Adolphs, R. (2012). Processing of facial emotion in the human fusiform gyrus. Journal of Cognitive Neuroscience, 24(6), 1358–1370. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kiani, R. , Esteky, H. , Mirpour, K. , & Tanaka, K. (2007). Object category structure in response patterns of neuronal population in monkey inferior temporal cortex. Journal of Neurophysiology, 97(6), 4296–4309. [DOI] [PubMed] [Google Scholar]
- Kohn, N. , Eickhoff, S. B. , Scheller, M. , Laird, A. R. , Fox, P. T. , & Habel, U. (2014). Neural network of cognitive emotion regulation—An ALE meta‐analysis and MACM analysis. NeuroImage, 87, 345–355. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Koole, S. L. (2009). The psychology of emotion regulation: An integrative review. Cognition & Emotion, 23(1), 4–41. 10.1080/02699930802619031 [DOI] [Google Scholar]
- Kragel, P. A. , Kano, M. , Van Oudenhove, L. , Ly, H. G. , Dupont, P. , Rubio, A. , Delon‐Martin, C. , Bonaz, B. L. , Manuck, S. B. , Gianaros, P. J. , Ceko, M. , Reynolds Losin, E. A. , Woo, C.‐W. , Nichols, T. E. , & Wager, T. D. (2018). Generalizable representations of pain, cognitive control, and negative emotion in medial frontal cortex. Nature Neuroscience, 21(2), 283–289. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kriegeskorte, N. , & Kievit, R. A. (2013). Representational geometry: Integrating cognition, computation, and the brain. Trends in Cognitive Sciences, 17(8), 401–412. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kriegeskorte, N. , Mur, M. , & Bandettini, P. A. (2008). Representational similarity analysis‐connecting the branches of systems neuroscience. Frontiers in Systems Neuroscience, 2, 4. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lane, R. D. , Weihs, K. L. , Herring, A. , Hishaw, A. , & Smith, R. (2015). Affective agnosia: Expansion of the alexithymia construct and a new opportunity to integrate and extend Freud's legacy. Neuroscience & Biobehavioral Reviews, 55, 594–611. [DOI] [PubMed] [Google Scholar]
- Li, T. , Baveye, Y. , Chamaret, C. , Dellandréa, E. , & Chen, L. (2015). Continuous arousal self‐assessments validation using real‐time physiological responses. In Proceedings of the 1st International Workshop on Affect & Sentiment in Multimedia Analysis (ASM '15). 10.1145/2813524.2813527 [DOI]
- Liang, Y. , Liu, B. , Xu, J. , Zhang, G. , Li, X. , Wang, P. , & Wang, B. (2017). Decoding facial expressions based on face‐selective and motion‐sensitive areas. Human Brain Mapping, 38(6), 3113–3125. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lieberman, M. D. , Eisenberger, N. I. , Crockett, M. J. , Tom, S. M. , Pfeifer, J. H. , & Way, B. M. (2007). Putting feelings into words. Psychological Science, 18(5), 421–428. 10.1111/j.1467-9280.2007.01916.x [DOI] [PubMed] [Google Scholar]
- Maffei, A. (2020). Spectrally resolved EEG intersubject correlation reveals distinct cortical oscillatory patterns during free‐viewing of affective scenes. Psychophysiology, 57(11), e13652. [DOI] [PubMed] [Google Scholar]
- Mantel, N. (1967). The detection of disease clustering and a generalized regression approach. Cancer Research, 27(2 Part 1), 209–220. [PubMed] [Google Scholar]
- Metallinou, A. , & Narayanan, S. (2013). Annotation and processing of continuous emotional attributes: Challenges and opportunities. In 2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG) (pp. 1–8).
- Moretti, N. (2001, March 9). La stanza del figlio [Translated title: The Son's Room].
- Mourao‐Miranda, J. , Volchan, E. , Moll, J. , de Oliveira‐Souza, R. , Oliveira, L. , Bramati, I. , Gattass, R. , & Pessoa, L. (2003). Contributions of stimulus valence and arousal to visual activation during emotional perception. NeuroImage, 20(4), 1955–1963. [DOI] [PubMed] [Google Scholar]
- Mueller, S. , Wang, D. , Fox, M. D. , Yeo, B. T. , Sepulcre, J. , Sabuncu, M. R. , Shafee, R. , Lu, J. , & Liu, H. (2013). Individual variability in functional connectivity architecture of the human brain. Neuron, 77(3), 586–595. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Nguyen, M. , Vanderwal, T. , & Hasson, U. (2019). Shared understanding of narratives is correlated with shared neural responses. NeuroImage, 184, 161–170. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Northoff, G. , & Bermpohl, F. (2004). Cortical midline structures and the self. Trends in Cognitive Sciences, 8(3), 102–107. [DOI] [PubMed] [Google Scholar]
- Northoff, G. , Heinzel, A. , De Greck, M. , Bermpohl, F. , Dobrowolny, H. , & Panksepp, J. (2006). Self‐referential processing in our brain—A meta‐analysis of imaging studies on the self. NeuroImage, 31(1), 440–457. [DOI] [PubMed] [Google Scholar]
- Nummenmaa, L. , Glerean, E. , Viinikainen, M. , Jääskeläinen, I. P. , Hari, R. , & Sams, M. (2012). Emotions promote social interaction by synchronizing brain activity across individuals. Proceedings of the National Academy of Sciences, 109(24), 9599–9604. 10.1073/pnas.1206095109 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ochsner, K. N. , Silvers, J. A. , & Buhle, J. T. (2012). Functional imaging studies of emotion regulation: A synthetic review and evolving model of the cognitive control of emotion. Annals of the New York Academy of Sciences, 1251, E1–E24. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Popal, H. , Wang, Y. , & Olson, I. R. (2019). A guide to representational similarity analysis for social neuroscience. Social Cognitive and Affective Neuroscience, 14(11), 1243–1253. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Pruim, R. H. R. , Mennes, M. , van Rooij, D. , Llera, A. , Buitelaar, J. K. , & Beckmann, C. F. (2015). ICA‐AROMA: A robust ICA‐based strategy for removing motion artifacts from fMRI data. NeuroImage, 112, 267–277. 10.1016/j.neuroimage.2015.02.064 [DOI] [PubMed] [Google Scholar]
- Roy, M. , Shohamy, D. , & Wager, T. D. (2012). Ventromedial prefrontal‐subcortical systems and the generation of affective meaning. Trends in Cognitive Sciences, 16(3), 147–156. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Rudaizky, D. , Basanovic, J. , & MacLeod, C. (2014). Biased attentional engagement with, and disengagement from, negative information: Independent cognitive pathways to anxiety vulnerability? Cognition & Emotion, 28(2), 245–259. [DOI] [PubMed] [Google Scholar]
- Ruef, A. M. , & Levenson, R. W. (2007). Continuous measurement of emotion. In Handbook of Emotion Elicitation and Assessment (pp. 286–297). Oxford University Press. [Google Scholar]
- Saarimäki, H. (2021). Naturalistic stimuli in affective neuroimaging: A review. Frontiers in Human Neuroscience, 15, 675068. 10.3389/fnhum.2021.675068 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Sachs, M. E. , Habibi, A. , Damasio, A. , & Kaplan, J. T. (2020). Dynamic intersubject neural synchronization reflects affective responses to sad music. NeuroImage, 218, 116512. 10.1016/j.neuroimage.2019.116512 [DOI] [PubMed] [Google Scholar]
- Salimi‐Khorshidi, G. , Douaud, G. , Beckmann, C. F. , Glasser, M. F. , Griffanti, L. , & Smith, S. M. (2014). Automatic denoising of functional MRI data: Combining independent component analysis and hierarchical fusion of classifiers. NeuroImage, 90, 449–468. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Scherer, K. R. , Schorr, A. , & Johnstone, T. (2001). Appraisal processes in emotion: Theory, methods, research. Oxford University Press. [Google Scholar]
- Schulte‐Rüther, M. , Markowitsch, H. J. , Shah, N. J. , Fink, G. R. , & Piefke, M. (2008). Gender differences in brain networks supporting empathy. NeuroImage, 42(1), 393–403. [DOI] [PubMed] [Google Scholar]
- Sharma, K. , Castellini, C. , van den Broek, E. L. , Albu‐Schaeffer, A. , & Schwenker, F. (2019). A dataset of continuous affect annotations and physiological signals for emotion analysis. Scientific Data, 6(1), 1–13. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Shine, J. M. , Breakspear, M. , Bell, P. T. , Ehgoetz Martens, K. A. , Shine, R. , Koyejo, O. , Sporns, O. , & Poldrack, R. A. (2019). Human cognition involves the dynamic integration of neural activity and neuromodulatory systems. Nature Neuroscience, 22(2), 289–296. 10.1038/s41593-018-0312-0 [DOI] [PubMed] [Google Scholar]
- Shiota, M. N. , & Levenson, R. W. (2009). Effects of aging on experimentally instructed detached reappraisal, positive reappraisal, and emotional behavior suppression. Psychology and Aging, 24(4), 890–900. 10.1037/a0017896 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Simony, E. , Honey, C. J. , Chen, J. , Lositsky, O. , Yeshurun, Y. , Wiesel, A. , & Hasson, U. (2016). Dynamic reconfiguration of the default mode network during narrative comprehension. Nature Communications, 7(1), 1–13. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Song, H. , Finn, E. S. , & Rosenberg, M. D. (2021). Neural signatures of attentional engagement during narratives and its consequences for event memory. Proceedings of the National Academy of Sciences, 118(33), e2021905118. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Steenberghs, N. , Lavrijsen, J. , Soenens, B. , & Verschueren, K. (2021). Peer effects on engagement and disengagement: Differential contributions from friends, popular peers, and the entire class. Frontiers in Psychology, 12, 726815. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Sullivan, R. M. , & Gratton, A. (2002). Prefrontal cortical regulation of hypothalamic–pituitary–adrenal function in the rat and implications for psychopathology: Side matters. Psychoneuroendocrinology, 27(1–2), 99–114. [DOI] [PubMed] [Google Scholar]
- Svoboda, E. , McKinnon, M. C. , & Levine, B. (2006). The functional neuroanatomy of autobiographical memory: A meta‐analysis. Neuropsychologia, 44(12), 2189–2208. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Talati, A. , Pantazatos, S. P. , Schneier, F. R. , Weissman, M. M. , & Hirsch, J. (2013). Gray matter abnormalities in social anxiety disorder: Primary, replication, and specificity studies. Biological Psychiatry, 73(1), 75–84. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Tranel, D. , & Damasio, H. (1994). Neuroanatomical correlates of electrodermal skin conductance responses. Psychophysiology, 31(5), 427–438. [DOI] [PubMed] [Google Scholar]
- Trost, W. , Frühholz, S. , Cochrane, T. , Cojan, Y. , & Vuilleumier, P. (2015). Temporal dynamics of musical emotions examined through intersubject synchrony of brain activity. Social Cognitive and Affective Neuroscience, 10(12), 1705–1721. [DOI] [PMC free article] [PubMed] [Google Scholar]
- van der Meer, J. N. , Breakspear, M. , Chang, L. J. , Sonkusare, S. , & Cocchi, L. (2020). Movie viewing elicits rich and reliable brain state dynamics. Nature Communications, 11(1), 1–14. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Van Rossum, G. , & Drake, F. L. (2009). Python 3 reference manual. CreateSpace.
- de la Vega, A. , Chang, L. J. , Banich, M. T. , Wager, T. D. , & Yarkoni, T. (2016). Large‐scale meta‐analysis of human medial frontal cortex reveals tripartite functional organization. Journal of Neuroscience, 36(24), 6553–6562. 10.1523/JNEUROSCI.4402-15.2016 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Wallentin, M. , Nielsen, A. H. , Vuust, P. , Dohn, A. , Roepstorff, A. , & Lund, T. E. (2011). Amygdala and heart rate variability responses from listening to emotionally intense parts of a story. NeuroImage, 58(3), 963–973. 10.1016/j.neuroimage.2011.06.077 [DOI] [PubMed] [Google Scholar]
- Wild, B. , Erb, M. , & Bartels, M. (2001). Are emotions contagious? Evoked emotions while viewing emotionally expressive faces: Quality, quantity, time course and gender differences. Psychiatry Research, 102(2), 109–124. [DOI] [PubMed] [Google Scholar]
- Winecoff, A. , Clithero, J. A. , Carter, R. M. , Bergman, S. R. , Wang, L. , & Huettel, S. A. (2013). Ventromedial prefrontal cortex encodes emotional value. Journal of Neuroscience, 33(27), 11032–11039. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Wittchen, H.‐U. , Wunderlich, U. , Gruschwitz, S. , & Zaudig, M. (1997). SKID I. Strukturiertes Klinisches Interview für DSM‐IV. Achse I: Psychische Störungen. Interviewheft und Beurteilungsheft [SCID I. Structured Clinical Interview for DSM‐IV, Axis I: Mental disorders. Interview booklet and rating booklet; a German‐language, extended adaptation of the American original version of the SCID I].
- Xie, T. , Cheong, J. H. , Manning, J. R. , Brandt, A. M. , Aronson, J. P. , Jobst, B. C. , Bujarski, K. A. , & Chang, L. J. (2021). Minimal functional alignment of ventromedial prefrontal cortex intracranial EEG signals during naturalistic viewing. bioRxiv.
- Yang, X. , Liu, J. , Meng, Y. , Xia, M. , Cui, Z. , Wu, X. , Hu, X. , Zhang, W. , Gong, G. , Gong, Q. , Sweeney, J. A. , & He, Y. (2019). Network analysis reveals disrupted functional brain circuitry in drug‐naive social anxiety disorder. NeuroImage, 190, 213–223. [DOI] [PubMed] [Google Scholar]
- Yarkoni, T. , Poldrack, R. A. , Nichols, T. E. , van Essen, D. C. , & Wager, T. D. (2011). Large‐scale automated synthesis of human functional neuroimaging data. Nature Methods, 8(8), 665–670. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Yeo, B. T. , Krienen, F. , Sepulcre, J. , Sabuncu, M. , & Lashkari, D. (2011). The organization of the human cerebral cortex estimated by intrinsic functional connectivity. Journal of Neurophysiology, 106(3), 1125–1165. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Zahn, T. P. , Grafman, J. , & Tranel, D. (1999). Frontal lobe lesions and electrodermal activity: Effects of significance. Neuropsychologia, 37(11), 1227–1241. [DOI] [PubMed] [Google Scholar]
- Zaki, J. , Davis, J. I. , & Ochsner, K. N. (2012). Overlapping activity in anterior insula during interoception and emotional experience. NeuroImage, 62(1), 493–499. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Zhang, S. , Hu, S. , Chao, H. H. , Ide, J. S. , Luo, X. , Farr, O. M. , & Li, C.‐S. R. (2014). Ventromedial prefrontal cortex and the regulation of physiological arousal. Social Cognitive and Affective Neuroscience, 9(7), 900–908. [DOI] [PMC free article] [PubMed] [Google Scholar]
Associated Data
Supplementary Materials
Data S1: Supporting Information.
Data Availability Statement
All code for the statistical and image analyses is available at https://github.com/attentionaffectlab/Engagement-and-Disengagement. The data used in this study are confidential.
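For readers tracing the Mantel (1967) and representational similarity analysis (Kriegeskorte et al., 2008) entries above: the core statistic behind analyses of this kind is a correlation between the upper triangles of two subject‐by‐subject similarity matrices (e.g., behavioral engagement similarity vs. neural pattern similarity), with significance assessed by jointly permuting the rows and columns of one matrix. The sketch below is an illustrative toy implementation, not the authors' released code; the matrices here are random placeholders and the function name `mantel` is our own.

```python
import numpy as np

def mantel(behav_rdm, neural_rdm, n_perm=5000, seed=0):
    """Permutation-based Mantel test between two symmetric subject-by-subject
    (dis)similarity matrices. Rows and columns of one matrix are permuted
    together, preserving the dependence structure within each matrix."""
    iu = np.triu_indices_from(behav_rdm, k=1)          # upper-triangle indices
    obs = np.corrcoef(behav_rdm[iu], neural_rdm[iu])[0, 1]
    rng = np.random.default_rng(seed)
    null = np.empty(n_perm)
    for i in range(n_perm):
        p = rng.permutation(behav_rdm.shape[0])
        null[i] = np.corrcoef(behav_rdm[np.ix_(p, p)][iu], neural_rdm[iu])[0, 1]
    # One-sided p-value with the +1 correction for the observed statistic.
    pval = (np.sum(null >= obs) + 1) / (n_perm + 1)
    return obs, pval

# Toy example: 22 "subjects" (matching the study's sample size),
# random symmetric similarity matrices with unit diagonals.
rng = np.random.default_rng(1)
a = rng.random((22, 22)); a = (a + a.T) / 2; np.fill_diagonal(a, 1.0)
b = rng.random((22, 22)); b = (b + b.T) / 2; np.fill_diagonal(b, 1.0)
r, p = mantel(a, b, n_perm=1000)
```

Because the two toy matrices are independent noise, the observed correlation should be near zero and the permutation p-value unremarkable; with real behavioral and neural similarity matrices, a small p indicates that subjects who engage similarly also show similar brain responses.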
