Author manuscript; available in PMC 2021 Apr 1.
Published in final edited form as: Psychophysiology. 2020 Feb 3;57(4):e13522. doi: 10.1111/psyp.13522

Common circuit or paradigm shift? The functional brain in emotional scene perception and emotional imagery

Nicola Sambuco 1, Margaret M Bradley 1, David R Herring 1, Peter J Lang 1
PMCID: PMC7446773  NIHMSID: NIHMS1065652  PMID: 32011742

Abstract

Meta-analytic and experimental studies investigating the neural basis of emotion often compare functional activation across different emotion induction contexts, assessing evidence for a "core affect" or "salience" network. Meta-analyses, however, necessarily aggregate effects across diverse paradigms and different samples, ignoring potential neural differences specific to the method of affect induction. Repeated measures data are scarce, and existing studies report contradictory results based on small samples. In the current study, functional brain activity is assessed in a large (N = 61) group of healthy participants during two common emotion inductions - scene perception and narrative imagery - to evaluate cross-paradigm consistency. Results indicate that limbic and paralimbic regions, together with visual and parietal cortex, are reliably engaged during emotional scene perception. During emotional imagery, in contrast, enhanced functional activity is found in several cerebellar regions, the hippocampus, caudate, and dorso-medial prefrontal cortex, consistent with the conception that imagery is an action disposition. Taken together, the data suggest that an identical emotion network is not engaged across paradigms; rather, the specific neural regions activated during emotional processing can vary significantly with the context of the emotional induction.

Keywords: emotion, fMRI, scene perception, narrative imagery

1. INTRODUCTION

Investigating the neural basis of emotion has spawned hundreds of separate studies exploring emotional processing in a variety of different contexts, including perception, attention, learning, memory, and more. A number of meta-analyses (Kober, Barrett, Joseph, Bliss-Moreau, Lindquist, & Wager, 2008; Lindquist, Wager, Kober, Bliss-Moreau, & Barrett, 2012; Satpute, Kang, Bickart, Yardley, Wager, & Barrett, 2015) aggregate data across such studies to identify a common emotional network. However, in these investigations, functional activity is compared not only across different participants, but also across quite disparate contexts of emotional induction (e.g., viewing emotional expressions or scenes, Wright, Wedig, Williams, Rauch, & Albert, 2006; sniffing odorant stimuli, Anderson et al., 2003; imagining emotional events, Mantani, Okamoto, Shirao, Okada, & Yamawaki, 2005; anticipating panic attacks, Bystritsky, Pontillo, Powers, Sabb, Craske, & Bookheimer, 2001; receiving reward or punishment, O'Doherty, Kringelbach, Rolls, Hornak, & Andrews, 2001). In the current fMRI study, we instead assess functional activation in two commonly studied emotional contexts - scene perception and narrative imagery - in a large (N = 61) group of individuals using a repeated measures design, which provides a compelling assessment of whether emotional processing elicits similar patterns of functional activity regardless of induction context.

Prior neuroimaging studies comparing functional activation across different emotion contexts in the same participants have reported divergent results, and the sample sizes were generally small. For instance, Shinkareva et al. (2014) presented 8 participants with either emotional pictures or emotional sounds and reported no overlap, although the investigation was presumably underpowered. This conclusion differed from Peelen, Atkinson, and Vuilleumier (2010), who presented movies of facial expressions, body expressions, and nonlinguistic utterances to 18 participants, reporting overlap in medial prefrontal cortex (mPFC) and left superior temporal sulcus (STS). Relatedly, Royet et al. (2000) presented olfactory, visual, or auditory stimuli to 18 participants and found a common set of regions (e.g., orbitofrontal cortex, temporal pole, and superior frontal gyrus) activated across contexts. Using multivariate pattern analysis (MVPA), Saarimäki et al. (2016) compared functional activation when watching movie clips or imagining different situations that cued unspecified autobiographical memories inducing disgust, fear, happiness, sadness, or neutral emotion, and reported the same classification pattern across induction contexts.

In an early meta-analysis, emotional activation assessed across 106 studies unsurprisingly resulted in a functional image that included activation in almost every brain region (Murphy, Nimmo-Smith, & Lawrence, 2003). While the specific modality-independent regions reported vary across meta-analytic studies, limbic and paralimbic regions - sometimes referred to as core limbic or core affect regions, such as the amygdala, anterior insular cortex, and thalamus - have been reported as showing modality-independent activation (e.g., Lindquist et al., 2012, 2016). It has accordingly been proposed that the recruitment of these regions in both animals and humans is independent of sensory modality, providing evidence for a network of regions mediating appetitive and aversive processing (Hayes & Northoff, 2011; Kober et al., 2008).

On the other hand, Phan, Wager, Taylor, and Liberzon (2002) compared functional activity across visual and memory recall studies and found that activation of subcortical regions - such as the amygdala - was reported less often during memory recall. Moreover, the specific set of limbic and paralimbic regions is not always consistent across different induction contexts (e.g., Brown, Gao, Tisdelle, Eickhoff, & Liotti, 2011; Chikazoe, Lee, Kriegeskorte, & Anderson, 2014). For example, Satpute et al. (2015) found that the amygdala was activated in only two of five different sensory modalities (visual and auditory in the left amygdala, and visual and olfactory in the right amygdala), suggesting that the existence of a general, supramodal emotional system engaged across induction contexts lacks strong empirical support.

One region more consistently engaged across different induction contexts appears to be the anterior insula (and the adjacent inferior frontal gyrus). Brown et al.'s (2011) meta-analysis reported anterior insula involvement in vision, audition, gustation, and olfaction, and Satpute et al. (2015) found anterior insula activation in four (visual, auditory, olfactory, and gustatory) of the five sensory modalities investigated. Importantly, the anterior insula is a major node in the salience network, defined in intrinsic (i.e., resting state) connectivity analyses (Menon, 2015; Seeley et al., 2007; Shirer, Ryali, Rykhlevskaia, Menon, & Greicius, 2012), which was originally described as being involved in creating a salient representation of the visual environment. A number of regions in the salience, as well as the core affect, network largely overlap with regions consistently reported as activated during emotional visual perception, in which functional activity during emotional, compared to neutral, scenes is enhanced in the amygdala, thalamus, bilateral inferior frontal gyrus (including portions of the anterior insula), and cingulate cortex, as well as in visual and parietal cortex (Adolphs, 2002; Bradley, Sabatinelli, Lang, Fitzsimmons, King, & Desai, 2003; Britton, Phan, Taylor, Welsh, Berridge, & Liberzon, 2006; Chang, Gianaros, Manuck, Krishnan, & Wager, 2015; Frank & Sabatinelli, 2014; Hariri, Tessitore, Mattay, Fera, & Weinberger, 2002; Lang et al., 1998; Northoff et al., 2004; Padmala, Sirbu, & Pessoa, 2017; Pessoa & Adolphs, 2010; Sabatinelli et al., 2005, 2007, 2009, 2011).

The current study readdresses the issue of whether a core affect or salience network is cross-paradigmatically involved in emotional processing in a large repeated measures design. Functional activity accompanying emotional processing is assessed in different visual induction contexts - picture viewing and script-driven imagery - providing an additional critical investigation of cross-paradigmatic functional activation during emotional processing. Neural activity is compared during emotional scene perception and emotional narrative imagery in the same individuals in order to identify whether regional activation is similar or different. During emotional scene perception, pleasant, neutral, and unpleasant scenes are presented in a free viewing context using rapid serial visual presentation (RSVP), in which 18 pictures of the same hedonic content (e.g., pleasant) are presented in 6 s. Previous fMRI studies using rapid picture presentation have replicated the pattern of emotional enhancement in visual cortex found at slower rates, as well as significant emotional enhancement in limbic regions (Junghöfer, Sabatinelli, Bradley, Schupp, Elbert, & Lang 2006; Sambuco, Bradley, Herring, Hillbrandt, & Lang, 2019).

Emotional imagery is induced by visually presenting short scripts that describe a series of standard pleasant, neutral, or unpleasant events. Emotional imagery is a key induction context when investigating individual differences in emotionality, such as post-traumatic stress disorder (PTSD; e.g., Clark and Mackay, 2015), since developing a script that describes a personally relevant traumatic event can often target the trauma more precisely than seeking to find an appropriate visual cue. Early PET studies reported differences in functional activity based on whether narrative imagery involved personally relevant (autobiographical) or “standard” (fictitious) scenes, with personally relevant scenes prompting significant amygdala activation (Markowitsch, Thiel, Reinkemeier, Kessler, Koyuncu, & Heiss, 2000; Fink, Markowitsch, Reinkemeier, Bruckbauer, Kessler, & Heiss, 1996). To provide a test of differences and similarities when imagining personally relevant, compared to standard, scenes, “personal” scripts that described a highly unpleasant or pleasant event were constructed based on a brief interview with each participant.

We expected to replicate the pattern of enhanced functional activity when viewing emotional, compared to neutral, scenes previously found during scene perception in the amygdala, thalamus, anterior insula and inferior frontal gyrus (IFG), as well as in visual and parietal cortex. Of interest was the extent to which the same regions were activated during emotional imagery in the same participants. To the extent that a “salience” or “core affect” network is activated when processing emotionally engaging events, regardless of context, we expected to find significant overlap between regional functional activation during emotional perception and imagery, with the possibility that imagining personally relevant scripts, compared to standard, might show stronger limbic effects.

Considering that the salience network was identified using intrinsic (i.e., resting state) connectivity analyses (e.g., Menon, 2015; Seeley et al., 2007), functional connectivity analyses (generalized context-dependent psychophysiological interaction, gPPI; McLaren, Ries, Xu, & Johnson, 2012) were also performed in addition to the classical functional analyses. Context-modulated functional connectivity provided an additional test of the role of the anterior insula and amygdala (seed regions) in emotional processing as a function of the induction context. If the anterior insula is cross-modally involved in emotional processing, regardless of the induction context, as suggested by core affect and salience network accounts, we expected increased functional connectivity between the anterior insula and limbic/paralimbic regions during both emotional perception and emotional imagery. In addition to assessing effects of emotionality, we also directly compared appetitive and aversive processing during perception and imagery to determine whether similar patterns of functional differences are found when identical regions are assessed in the same individuals.

2. METHOD

2.1. Participants

Participants were 61 adults (35 female; age: M = 20, SD = 3), including 30 students in Introductory Psychology courses at the University of Florida who participated for course credit or financial compensation, and 31 participants recruited through flyers and community advertisements who received financial compensation. All participants had normal or corrected-to-normal vision. The University of Florida Institutional Review Board approved the study, and informed consent was obtained before entering the scanner. Prior to entering the bore of the scanner, participants were fitted with earplugs and headphones and given a patient-alarm squeeze ball. Cushions were placed inside the head coil to limit head motion, and explicit verbal instructions were provided to discourage movement. Each participant received a structural scan, followed by the emotional imagery assessment and then emotional scene perception.

2.2. Procedures

2.2.1. Emotional Perception.

Stimuli were 108 grayscale pictures1 - 36 pleasant, 36 neutral, and 36 unpleasant - selected from the International Affective Picture System (IAPS; Lang, Bradley, & Cuthbert, 2008), with each hedonic content divided into two sets of 18 scenes. Pictures were presented using rapid serial visual presentation (RSVP), in which 18 pictures of the same hedonic content (e.g., pleasant) were presented in a 6-s interval at the rate of 3 per s (i.e., 333 ms each). Each set of 18 scenes was presented three times, with all exemplars presented before the next repetition. A variable ITI (inter-trial interval) of 9 s or 12 s was used between the streams of pictures, resulting in a total scan time of approximately five minutes.

All of the participants saw the same set of 90 pictures, which consisted of 30 pictures for each of the 3 hedonic contents. In addition, 18 scenes (6 per hedonic content) were presented to approximately half of the participants, and a different set of 18 was presented to the other half. This procedure was designed to counterbalance whether a scene was new or old in a later recognition test (not reported in the current analysis). Twelve different orders were constructed that varied the serial position in which pictures were presented across the study, as well as the order of specific scenes within each 6-s stream.

2.2.2. Emotional Imagery.

Materials included 12 "standard" scenes1, comprising unpleasant (6), neutral (3), and pleasant (3) events that were identical for each participant (see Appendix for scripts). Two additional scripts were developed based on a brief interview that preceded the scanning session, describing one highly unpleasant and one highly pleasant event that the participant had previously experienced. Each script was approximately 30 words long and was displayed on 3 lines of the computer screen. A comparison of sentence properties indicated that emotional (pleasant or unpleasant) and neutral sentences did not differ in the number of words, F(2,9) < 2, p = .51, or syllables, F(2,9) < 2, p = .23. To familiarize participants with the scripts prior to scanning, participants rated the pleasure and arousal of each of the 14 scripts using the Self-Assessment Manikin (SAM; Bradley & Lang, 1994; Lang, 1980).

On each trial, a script first appeared on the screen for 9 s, and the participant was instructed to read the script and begin to vividly imagine being an active participant in the described situation. Following script offset, a visual cue (a circle) signaled that the participant should continue to vividly imagine themselves interacting in the described event. Twelve seconds later, the onset of a fixation cross signaled the intertrial interval (9-12 s), during which the participant could relax until the next trial began.

Following three practice trials, two blocks of 16 trials were presented in which each of the standard scripts was presented once (n = 12) and each of the personal scenes was presented twice (n = 4). Two orders were constructed that varied the serial position of a specific script across blocks and participants. The scan time was approximately twenty minutes.

2.3. Image acquisition, processing, and analysis

Data were collected in a 3T Philips scanner with a 32-channel head coil. The scanning sequence began with acquisition of a 160-slice sagittal scout set using a standard T1-weighted fast-field echo sequence. Functional volumes for both emotional perception and imagery were 51 3.5-mm coronal slices acquired using a T2*-weighted echo planar imaging sequence with a 3-s TR, 30 ms TE, 90-degree flip angle, 72 × 72 acquisition matrix, and 180 mm FOV (2.5 × 2.5 in-plane voxel resolution). Offline, the functional data were slice-time adjusted, motion corrected, spatially smoothed (5.0 mm FWHM Gaussian kernel), and converted to percent BOLD signal change (for each voxel based on the mean across the entire time series) using the Analysis of Functional Neuroimages software (AFNI, Cox, 1996).
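For illustration, the per-voxel conversion to percent BOLD signal change can be sketched in a few lines of numpy. This is a minimal stand-in for the AFNI step described above, not the AFNI implementation itself; the array shapes and names are hypothetical.

```python
import numpy as np

def percent_signal_change(bold, axis=-1):
    """Convert a raw BOLD time series to percent signal change.

    Each voxel is scaled relative to its own mean across the entire
    time series, mirroring the per-voxel normalization described in
    the text. `bold` has shape (..., n_timepoints).
    """
    baseline = bold.mean(axis=axis, keepdims=True)
    return 100.0 * (bold - baseline) / baseline

# Hypothetical usage: 1000 voxels, 100 TRs of simulated data
ts = 100.0 + np.random.randn(1000, 100)
psc = percent_signal_change(ts)
```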

For the analysis of emotional perception, the hemodynamic time series for each individual was deconvolved using a cubic spline response function (15-s) that coded pleasant, neutral, and unpleasant scene presentations and motion parameters (6). The resulting impulse response function for each hedonic valence was spatially normalized to a Talairach template and resampled to 2.5-mm isotropic voxel size. Based on the resulting waveforms, BOLD activity from 6 to 12 s after picture onset was averaged to analyze effects during emotional perception.
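As a sketch of the deconvolution logic, the following uses a simplified finite-impulse-response (FIR) basis in place of the cubic spline basis used in the actual analysis (one free response estimate per TR across the 15-s window, i.e., five per condition). The onsets, the simulated data, and the omission of motion regressors are illustrative assumptions.

```python
import numpy as np

def fir_deconvolve(bold, onsets_by_condition, tr=3.0, window=15.0):
    """Estimate condition-specific impulse responses by deconvolution.

    Builds one delta regressor per condition and per TR lag within the
    response window (15 s / 3-s TR = 5 lags) and solves the resulting
    least-squares problem. `bold` is a 1-D voxel time series in percent
    signal change; `onsets_by_condition` maps condition name -> onset
    TR indices (hypothetical inputs).
    """
    n = len(bold)
    n_lags = int(window // tr)                 # 5 response estimates per condition
    cols, names = [], []
    for cond, onsets in onsets_by_condition.items():
        for lag in range(n_lags):
            col = np.zeros(n)
            for onset in onsets:
                if onset + lag < n:
                    col[onset + lag] = 1.0     # stimulus present at this lag
            cols.append(col)
            names.append((cond, lag * tr))
    X = np.column_stack(cols + [np.ones(n)])   # add an intercept column
    betas, *_ = np.linalg.lstsq(X, bold, rcond=None)
    return dict(zip(names, betas[:-1]))        # drop the intercept estimate

# Hypothetical usage: three 6-s picture streams per content across 70 TRs
onsets = {"pleasant": [0, 21, 42], "neutral": [7, 28, 49], "unpleasant": [14, 35, 56]}
irf = fir_deconvolve(np.random.randn(70), onsets)
```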

A whole-brain ANOVA, using a false discovery rate (FDR) of p < .05 (p < .008 uncorrected) with a minimum cluster size of 300μl, computed using 2.5mm3 voxels, assessed the difference during emotional (pleasant and unpleasant) and neutral scene perception. Additional criteria required that t-tests comparing viewing neutral and either pleasant or unpleasant scenes alone were significant (p < .05) to eliminate the possibility that clusters were activated during only pleasant or unpleasant scene perception. To determine the direction of the effect for significant clusters, follow up t-tests assessed whether BOLD change during picture viewing was significantly different from baseline (i.e., the mean functional activity across the time series for each voxel) separately for emotional and neutral scenes. Differences due to hedonic content (pleasant vs unpleasant2) were tested in a whole-brain ANOVA that compared functional activity when viewing pleasant or unpleasant scenes, thresholded at FDR < .05 (p < .005 uncorrected) with a minimum cluster size of 300μl.
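The FDR step can be illustrated with a generic Benjamini-Hochberg threshold over voxelwise p values; AFNI's own FDR machinery differs in detail, so this is only a sketch of the underlying procedure, with simulated p values.

```python
import numpy as np

def fdr_threshold(p_values, q=0.05):
    """Benjamini-Hochberg threshold for a set of voxelwise p values.

    Returns the largest p value that survives FDR control at level q,
    or None if no test survives.
    """
    p = np.sort(np.asarray(p_values))
    m = len(p)
    below = p <= (np.arange(1, m + 1) / m) * q  # BH step-up criterion
    return p[below].max() if below.any() else None

# Hypothetical usage with simulated voxelwise p values
rng = np.random.default_rng(0)
pvals = np.concatenate([rng.uniform(0, 0.001, 50),   # "active" voxels
                        rng.uniform(0, 1, 5000)])    # null voxels
print(fdr_threshold(pvals, q=0.05))
```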

For the analysis of emotional imagery, the hemodynamic time series for each individual was deconvolved using a cubic spline response function (21-s; beginning with the 9-s script presentation3, followed by the 12-s imagery-alone period) using 5 contents (standard pleasant, unpleasant, and neutral; personal pleasant and unpleasant) and motion parameters (6). The resulting impulse response function for each hedonic valence was spatially normalized to a Talairach template and resampled to 2.5-mm isotropic voxel size. Based on the resulting waveforms, BOLD activity during imagery was averaged from 12 s to 21 s after script onset and analyzed using a whole-brain ANOVA conducted first on standard scenes that compared emotional (pleasant and unpleasant) and neutral imagery. A false discovery rate of p < .05 (p < .006 uncorrected) and a minimum cluster extent of 300μl were used to threshold the data. To exclude the possibility that clusters were active only during either pleasant or unpleasant imagery, additional criteria required that t-tests comparing neutral imagery and either pleasant or unpleasant imagery were both significant (p < .05). In significant clusters, separate follow-up analyses tested whether functional activity during emotional or neutral imagery was significantly greater than (or less than) baseline. Differences in hedonic content (pleasant vs. unpleasant2) during emotional imagery were tested using a whole-brain ANOVA that compared functional activity when imagining pleasant and unpleasant standard scenes, thresholded at FDR < .05 (p < .007 uncorrected) and a minimum cluster extent of 300μl.

Differences due to whether scenes were personally relevant or not were tested using a whole-brain ANOVA that compared functional activity when imagining personal (pleasant, unpleasant) and standard (pleasant, unpleasant) emotional scenes, thresholded at FDR < .05 (p < .003 uncorrected) and a minimum cluster extent of 300μl. For significant clusters, follow-up analyses separately tested whether BOLD changes during personal or standard emotional imagery were significantly different from baseline.

To further assess the role of a salience or core affect network in emotional processing, functional connectivity analyses (generalized psychophysiological interaction, or gPPI; Cisler, Bush, & Steele, 2013; McLaren et al., 2012) were conducted using as seed regions either 1) the bilateral amygdala (a hub in the core affect network) or 2) the dorsal anterior insula (a hub in the salience network). The average time series of the seed region was extracted, detrended, and deconvolved using a hemodynamic response function based on the sampling rate (3-s TR). The interaction regressors for emotional and neutral conditions were created separately by multiplying the condition codes with the deconvolved seed region time series, and were added, together with the seed region time series, as new regressors in the original deconvolution analysis. This procedure properly accounts for all sources of variability in the data (McLaren et al., 2012). The beta values corresponding to the interactions involving emotional and neutral contents were then contrasted at the group level using a paired t-test with FDR < .05 and a minimum cluster extent of 300μl.
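The construction of the gPPI interaction regressors follows directly from this description: each condition's code vector is multiplied with the deconvolved seed time series. A minimal numpy sketch is given below, with hypothetical inputs; re-convolution of the interaction terms with the hemodynamic response is omitted for brevity.

```python
import numpy as np

def gppi_regressors(seed_neural, condition_codes):
    """Build gPPI interaction regressors from a deconvolved seed series.

    Each condition's (binary) code vector is multiplied elementwise with
    the seed region's deconvolved "neural" time series, yielding one
    interaction regressor per condition. These are entered into the
    design alongside the seed time series and the task regressors.
    """
    return {cond: code * seed_neural for cond, code in condition_codes.items()}

# Hypothetical usage: 200 TRs, emotional vs. neutral condition codes
n = 200
seed = np.random.randn(n)                       # deconvolved seed series
codes = {"emotional": np.zeros(n), "neutral": np.zeros(n)}
codes["emotional"][20:40] = 1.0
codes["neutral"][60:80] = 1.0
ppi = gppi_regressors(seed, codes)
```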

3. RESULTS

3.1. Emotional perception.

Figure 1 illustrates regions (red) showing significant emotional modulation during scene perception (see also Table 1). Increased functional activity is found when viewing emotional, compared to neutral, contents in ventral visual cortex (inferior occipital cortex, fusiform gyrus, and inferior temporal cortex), parietal cortex, thalamus, amygdala, and inferior frontal gyrus. During emotional perception, two patterns of BOLD change characterize emotional discrimination. Significant increases in BOLD signal change (above baseline) were found in the visual and parietal cortex, as well as in the posterior thalamus (in the habenula region) and precentral gyrus, when viewing any picture, whether emotional (t(60) = 17.1, 11.9, 9.7, 6.6; all p's < .001) or neutral (t(60) = 13.2, 7.7, 3.8, 3.5; all p's < .001), with additional enhancement when viewing emotional, compared to neutral, scenes. In the amygdala, hippocampus, dorsomedial thalamus, and anterior insula/inferior frontal gyrus (IFG), on the other hand, viewing neutral scenes did not prompt a BOLD change greater than baseline, whereas functional activity when viewing emotional scenes was significantly enhanced, t(60) = 6.8, 6.2, 4.6, 9.1, all p's < .001.

Figure 1.

Middle Panel: Emotional perception and imagery. Regions uniquely involved in emotional visual perception (red) and emotional imagery (aqua), ordered from more posterior (y = 74) to more anterior (y = −52) regions. Enhanced functional activity was found during emotional scene perception (left panel) in ventral visual cortex (inferior occipital cortex, fusiform gyrus, and inferior temporal cortex), parietal cortex, thalamus, amygdala, and inferior frontal gyrus. During emotional imagery (right panel), enhanced functional activity was found in the amygdala-hippocampal region, lateral cerebellum, vermis, and dorso-medial prefrontal cortex (dmPFC).

Table 1.

Regions showing significantly enhanced BOLD activity during emotional, compared to neutral, perception and during emotional, compared to neutral, imagery.

Region                      BOLD % change          Hem    N      t     +L    +P    +I
                            Pls    Unp    Neu
Emotional Scene Perception
Visual Cortex               .70a   .60b   .42      L    1407   11.2    46    66    −4
                                                   R    1539   11.1   −46    61    −1
Parietal Cortex             .35a   .27a   .17      L     417    6.7    29    54   −49
                                                   R     329    6.8   −26    56   −49
Thalamus (posterior)        .30b   .32a   .13      M      47    4.6    11    31     1
Temporal Pole               .16a   .16a   .05      R      73    4.9   −24     1    29
Precentral Gyrus            .23a   .18    .11      R      28    4.0    34     6   −46
Amygdala                    .12a   .15a  −.05      L      65    8.3    19     6     9
                                                   R      59    6.9   −19     1     6
Hippocampus                 .10a   .10a  −.02      L      56    6.2    21     9     9
                                                   R      17    5.0   −19     9     6
Thalamus (Dorsomedial)      .10a   .09a  −.02      L     102    6.1     9     9    −9
                                                   R      58    5.2    −4    11   −11
Inferior Frontal Gyrus      .14a   .13a   .001     L    1193    6.4    29   −19     6
                                                   R    1112    6.9   −36   −29    −6
Mid Cingulate Cortex        .08a   .05a  −.06      M     184    5.8    −1    −9   −26
Presupp. Motor Cortex       .10b   .07c  −.02      M      67    4.2    −4    −9   −54
Cerebellum C8               .07b   .06b  −.03      R      51    5.7    −9    66    39
Supramarginal Gyrus         .07a   .07a  −.07      L      73    7.5    54    29   −31
                                                   R      69    4.7   −64    34   −26
Frontal Pole                .16a   .08   −.14b     M      37    4.9    −1   −59     4
Emotional Imagery
Cerebellum C1               .20c   .18c   .05      L     141    5.7    26    71    31
                                                   R     192    5.4   −26    71    31
Vermis                      .09a   .07b  −.02      M     403    6.3    −1    46    39
Hippocampus                 .10a   .07b  −.02      L      71    5.9    24    11    11
                                                   R      20    4.1   −21    16     9
Caudate                     .06b   .04c  −.03      L      55    5.2    14    −1   −24
                                                   R      51    5.0   −16    −1   −24
Frontal Pole                .30a   .17b  −.04      M      72    5.9    −9   −56     1
Dorsal mPFC                 .10a   .06b  −.06      M     811    6.1    −6   −46   −26

Note. Hem = hemisphere (L = left; R = right; M = medial); N = number of voxels in cluster; t = peak t-statistic.

+L +P +I: Coordinates in Talairach space.

mPFC = medial prefrontal cortex

Significantly different from neutral at (a) p < .001, (b) p < .01, (c) p < .05.

3.2. Emotional imagery.

Figure 1 illustrates regions (aqua) showing significant BOLD enhancement when participants imagined emotional (pleasant or unpleasant) or neutral standard scenes4 (see also Table 1). Enhanced functional activity when imagining emotional, compared to neutral, events was found in the hippocampus, lateral cerebellum5, vermis, dorsal caudate, and dorso-medial prefrontal cortex (dmPFC). Effects of emotion in the hippocampus, lateral cerebellum (C1), vermis, caudate, and frontal pole were driven by significant BOLD signal changes (above baseline) during emotional imagery (t(60) = 4.9, 5.5, 5.3, 3.2, 4.5, all p's < .005), whereas imagining neutral scenes did not prompt significant changes in any of these regions. Notably, the only region with a slightly different pattern of functional activity was dmPFC, in which functional activity when imagining emotional scenes was greater than baseline, t(60) = 3.7, p < .001, whereas imagining neutral scenes was significantly lower than baseline, t(60) = −2.2, p < .03.

3.3. Emotional imagery and perception: Common regions.

Regions that overlap during emotional perception and emotional imagery are few. As Figure 2 illustrates, a small cluster (n = 25 voxels) on the boundary between the amygdala and hippocampus (yellow) overlapped during emotional imagery and perception. Since a whole-brain atlas identifies these overlapping voxels as lying in both the amygdala and hippocampus, follow-up ROI analyses were used to further assess the involvement of these two regions in emotional scene perception and imagery. A ROI analysis using the anterior hippocampus indicated enhanced functional activity when imagining emotional, compared to neutral, scenes (F(1,120) = 4.95, p = .028), as well as enhanced functional activity during emotional perception, F(1,120) = 12.67, p = .001. A ROI analysis of the amygdala, however, showed no difference between emotional and neutral imagery, although there was a small but significant change above baseline when imagining emotional events (M = .04; t(60) = 2.92, p = .005), but not during neutral imagery, M = .02, t(60) = 1.17, p = .25. In addition to these regions, only a very small (n = 11 voxels) region in the frontal pole showed an overlap.

Figure 2.

A small overlap (yellow) between emotional scene perception (red) and emotional imagery (aqua) was found between the posterior portion of the amygdala and the anterior hippocampus. The functional cluster involved in emotional perception appears more anterior than the one involved in emotional imagery.

Functional connectivity analysis (gPPI) also revealed no overlap between emotional scene perception and emotional imagery. When either the dorsal anterior insula or the amygdala was used as the seed region, enhanced functional connectivity was found between the seed region and visual cortex (see Supplement Figure S1 and Table S1) when viewing emotional, compared to neutral, scenes. However, no significant voxels were detected during emotional imagery.

3.4. Appetitive and aversive processing.

Figure 3 illustrates regions that were significantly enhanced when processing pleasant, compared to unpleasant, contents in either perception (red) or imagery (aqua). Pleasant imagery activated a large dorsal region of mPFC (1137 voxels; peak voxel LPI coordinates: ±6, −51, −1), whereas the region activated during pleasant perception was smaller (257 voxels; peak voxel LPI coordinates: ±1, −56, −4), more ventral, and closer to orbitofrontal cortex. In the vmPFC region in which overlapping activation was found (197 voxels), both pleasant imagery (M: pleasant = .21, unpleasant = −.07) and pleasant perception (M: pleasant = .07, unpleasant = −.14) prompted a significant increase above baseline (imagery: t(60) = 3.3, p = .002; perception: t(60) = 2.2, p = .03), with changes during aversive processing below baseline in both imagery, t(60) = −5.8, p < .001, and perception, t(60) = −5.1, p < .001.

Figure 3.

Regions showing functional enhancement during pleasant, compared to unpleasant, processing in scene perception (red) and imagery (aqua), and their overlap (yellow) in the ventral medial prefrontal cortex (top) and striatum (bottom).

In the striatum, a small overlap (6 voxels) between pleasant imagery (peak voxel LPI coordinates: ±1, −11, −1) and perception (peak voxel LPI coordinates: ±1, −6, 6) was found, in which imagery (M: pleasant = .06, unpleasant = −.07; t(60) = 2.6, p = .01), but not perception (M: pleasant = .04, unpleasant = −.15; t(60) = 1.2, p = .24), prompted a significant increase above baseline. Moreover, functional activity during aversive processing was significantly lower than baseline in both imagery, t(60) = −3.7, p < .001, and perception, t(60) = −3.7, p < .001.

3.5. Personal vs Standard Imagery.

Regions in which functional activity was enhanced when imagining personal, compared to standard, emotional scenes included a large cluster in posterior parietal cortex comprising parts of the precuneus, cuneus, and posterior cingulate cortex (Figure 4), and a cluster in lateral parietal cortex in the region of the bilateral angular gyrus (see also Table 2), with no difference in amygdala activation6. The pattern of discrimination between personal and standard imagery in the precuneus and posterior cingulate, as well as in the superior frontal gyrus, middle frontal gyrus, and cerebellum (C2), was characterized by a significant BOLD increase above baseline when imagining personal contents (t(60) = 6.3, 6.3, 3.9, 3.0, respectively; all p's < .005), whereas standard scenes did not prompt significant changes in functional activity. In the angular gyrus, lateral frontal cortex, and anterior mid-cingulate cortex, on the other hand, the pattern was slightly different, with only standard scenes showing significantly less functional activity, compared to baseline, t(60) = −2.7, −4.3, −2.43, respectively, all p's < .05.

Figure 4.

Differences in functional activity when imagining personal (bold lines in the waveforms), compared to standard scenes (dotted lines in the waveforms) included regions in the posterior (top) and lateral (bottom) parietal cortex, such as precuneus, posterior cingulate cortex (pCC), and angular gyrus.

Table 2.

Regions showing enhanced BOLD activity when imagining personal, compared to standard, emotional scenes.

Region                      BOLD % change          Hem    N      t     +L    +P    +I
                            (emotional contents)
                            Personal  Standard
Precuneus - pCC             .22       .005         M    2368    8.5     4    69   −26
Angular Gyrus               .12      −.05          L     540    5.7    36    66   −34
                                                   R     708    6.4   −44    64   −41
Cerebellum C2               .07      −.03          L     112    4.2    34    64    39
                                                   R      23    3.9   −34    69    41
Anterior Mid-Cingulate      .04      −.04          M      60    4.2    −9   −29   −29
Mid Frontal Gyrus           .09      −.04          L      68    4.2    29   −14   −46
                                                   R     165    5.1   −31    −6   −49
Superior Frontal Gyrus      .29       .05          M     267    5.3     4   −31   −54
Lateral Frontal Cortex      .05      −.12          L     218    4.9    26   −61   −14
                                                   R     279    5.8   −36   −49    −9

Note. Hem = hemisphere (L = left; R = right; M = medial); N = number of voxels in cluster; t = peak t-statistic.

+L +P +I: Coordinates in Talairach space.

pCC = posterior cingulate cortex

When compared to imagining neutral scenes, imagining personally relevant emotional scenes enhanced functional activity in all of the regions that were active during standard emotional imagery, including the cerebellum (M = .25, F(1,120) = 15.0, p < .001), vermis (M = .09, F(1,120) = 20.1, p < .001), hippocampus (M = .11, F(1,120) = 15.7, p < .001), caudate (M = .05, F(1,120) = 9.4, p = .003), dmPFC (M = .14, F(1,120) = 27.7, p < .001), and frontal pole (M = .29, F(1,120) = 22.5, p < .001).

4. DISCUSSION

Regions of enhanced functional activity during emotional, compared to neutral, processing were assessed in a repeated measures design with a large group (N = 61) of participants during emotional scene perception and emotional imagery. The research aim was to determine if a similar network of regions is activated in different emotional contexts. Consistent with prior research (Brown et al., 2011; Satpute et al., 2015; Phan et al., 2002), the data suggest little overlap in regions activated during emotional scene perception and emotional imagery. Whereas viewing emotionally engaging scenes prompted significant functional enhancement in the amygdala, thalamus, anterior insula/IFG and in visual and parietal cortex, as found previously (e.g., Sabatinelli et al., 2011), these regions were not significantly enhanced during emotional imagery. Instead, emotional imagery showed a pattern of functional enhancement in the hippocampus, lateral cerebellum, dmPFC, and caudate nucleus. Taken together, these data do not support the view that a general "core limbic" or "salience" network is activated independently of the context of emotional instigation.

Few differences were found during emotional imagery in the majority of "core limbic" regions identified in earlier meta-analyses of functional activity across dozens of different paradigms (e.g., Kober et al., 2008), whereas these regions - amygdala, thalamus, insula - were reliably activated during emotional scene perception. This may be due to the larger number of studies on emotional visual perception usually contributing to large meta-analyses, compared to other induction contexts. This interpretation is further supported by the current finding of enhanced activation during emotional scene perception in large regions of the extrastriate cortex, which also appear in a number of meta-analytic reports of cross-modal emotional activation (Kober et al., 2008; Lindquist et al., 2012).

The data are consistent with previous studies reporting differential functional activity during emotional processing as a function of sensory modality (e.g., auditory, gustatory, etc.), which have reported that amygdala activation is most reliable during emotional picture viewing (Brown et al., 2011; Phan et al., 2002; Satpute et al., 2015). Although the data do not support a hypothesis of common activation in key regions of the core affect or salience networks across induction contexts, multivariate analyses such as representational similarity analysis or multivariate pattern recognition might show that subregions in either or both networks are able to predict (or classify) functional activity from one context to another (e.g., Haynes, 2015; Miskovic & Anderson, 2018). These analyses would benefit from many more training trials than were included in the current paradigm, and the predicted null effect (i.e., no common regions) is best supported by a much larger N (e.g., Chang et al., 2015).

The "salience" network (Menon, 2015) was initially determined based on intrinsic connectivity analyses of resting state brain data and described as activated when creating a salient representation of the visual environment - perceiving stimuli that are deviant from the dominant context, or when processing rewarding or emotionally engaging stimuli. The current data are consistent with reliable activation of the salience network during visual perception (Brown et al., 2011; Sabatinelli et al., 2011; Satpute et al., 2015). Thus, emotional scene viewing prompted significant activation of its key regions of anterior insula, cingulate cortex, and thalamus. In addition, whereas context-dependent connectivity analyses (gPPI) indicated that one of the major nodes of the salience network - the anterior insula - showed enhanced connectivity with the visual cortex, this was found during emotional (compared to neutral) scene viewing, but not during emotional imagery.

Emotional imagery did not prompt activation in key regions of the salience network, either in whole brain analyses, functional connectivity analysis, or in followup ROI analyses. Rather, unique and significant enhancement in functional activity during emotional, compared to neutral, imagery was found bilaterally in the hippocampus, cerebellum, vermis, caudate, and dorsal medial prefrontal cortex (mPFC). Whereas studies of visual perception and visual imagery often report overlapping activation in visual cortex (e.g., Fulford et al., 2018; Pearson, Naselaris, Holmes, & Kosslyn, 2015), narrative imagery is, according to Lang's bioinformational theory (1977, 1979; see also Lang, Cuthbert, & Bradley, 1998), primarily an action context in which the physiological and biological activities prompted by the stimulus context and its accompanying reactions (e.g., fleeing, freezing, fighting, running, etc.) are central elements of the activated mental image, prompting response profiles similar to those occurring in the actual context. The extensive activation of multiple cerebellar regions during emotional imagery supports an action account of emotional imagery. Moreover, both animal and human data suggest that the dorsal region of mPFC, also activated during emotional imagery in the current study, is critical in determining and directing motor output through strong connectivity with motor and pre-motor regions (Euston, Gruber, & McNaughton, 2012).

The current repeated measures investigation allows a direct comparison of two adjacent regions - amygdala and hippocampus - that were significantly activated during emotional scene perception or emotional imagery, respectively. ROI analysis of the anterior hippocampus resulted in significant enhancement during emotional, compared to neutral, processing in both scene perception and narrative imagery; on the other hand, ROI analyses of the amygdala did not result in significant enhancement during emotional, compared to neutral, imagery. Taken together, these data suggest that whereas emotional perception reliably activates a small region of anterior hippocampus, amygdala activation is not as reliable during emotional imagery. These data are consistent with an earlier study that reported significant amygdala activation when imagining emotional, compared to neutral, scenes (Costa, Lang, Sabatinelli, Versace, & Bradley, 2010), as the location of functional activation in that study was in the region of the posterior amygdala/anterior hippocampus found here, with the current study raising additional questions that can be pursued in future investigations.

Despite the minimal overlap in regions co-activated during emotional perception and imagery, pleasant, compared to aversive, processing was associated with significant co-activation in a ventral region of mPFC and in the striatum across paradigms. These regions have previously been reported to show enhancement during appetitive, compared to aversive, imagery (e.g., Costa et al., 2010), scene perception (e.g., Sabatinelli, Bradley, Lang, Costa, & Versace, 2007), anticipation (e.g., Sege, Bradley, Weymar, & Lang, 2017), and in a variety of reward contexts. In the current study, whereas pleasant imagery activated a large dorsal region of vmPFC, pleasant perception activated a smaller region that was more ventral and closer to orbitofrontal cortex. Nonetheless, a relatively large overlapping cluster in vmPFC was significantly enhanced during appetitive processing, regardless of context - a region similar to that reported in a recent meta-analysis assessing appetitive processing in multiple contexts, including viewing erotic scenes as well as receiving food or monetary reward (Sescousse, Caldú, Segura, & Dreher, 2013).

Although some regions showed selective activation for pleasant, compared to unpleasant, processing, there were no regions in the brain that were uniquely activated in the context of aversive processing. Rather, all of the regions activated during aversive processing shared common activation during appetitive processing. These findings parallel recent meta-analytic (Lindquist et al., 2016) and experimental findings (Chikazoe et al., 2014) that report extensive overlap in functional activation whether emotional processing is aversive or appetitive. Assuming a basic biological chassis for both aversive and appetitive motivation, the documented similarity in neural (as well as physiological) activity in the context of either aversive or appetitive stimulation (Bradley & Lang, 2007; Lang & Bradley, 2010) could reflect the fact that both contexts benefit from similar processes of selective attention, orienting, and preparation for action (Bradley, 2009).

Similar to imagining standard emotional scenes, imagining personally relevant emotional events did not strongly activate the amygdala. Rather, imagining personal, compared to standard, scenes activated large clusters in the posterior parietal cortex, including regions of the precuneus/cuneus and bilateral angular gyrus that are frequently reported in studies of autobiographical memory retrieval (Donaldson, Petersen, & Buckner, 2001; Guerin & Miller, 2009; Yassa & Stark, 2008; Kompus, Eichele, Hugdahl, & Nyberg, 2010; Wagner, Shannon, Kahn, & Buckner, 2005; Svoboda, McKinnon, & Levine, 2006). Moreover, similar posterior parietal activation has been implicated in explicit or spontaneous retrieval in laboratory tasks in which words, sentences, or scenes are presented in the context of an episodic memory task (e.g., Kim, 2017; Weymar, Bradley, Sege, & Lang, 2018), suggesting that posterior parietal activation is a feature of episodic retrieval. In a recent study (Bradley, Costa, Ferrari, Codispoti, Fitzsimmons, & Lang, 2015), we found similar posterior parietal activation simply when repetitions of scenes were spaced across an incidental encoding phase, consistent with theories suggesting that distributed, but not massed, repetitions automatically retrieve earlier episodic representations, facilitating later episodic memory performance.

Taken together, the data suggest that the precuneus and other regions of posterior parietal cortex are associated with retrieving mental representations of a specific, previously experienced event. Of course, even a "standard" scene could contact a related episodic representation if the cue is similar enough to retrieve a personally experienced event. Although one might argue that standard neutral scenes (e.g., climbing stairs, shopping at a store) describe events that participants have personally experienced, it is likely that, because these cues are contextually overloaded, a specific episodic occurrence is not routinely retrieved. A second possibility is that posterior parietal activation in the current study results instead from the fact that participants generated the personal scenes prior to the scanning session, which could be tested by asking participants to generate familiar neutral activities. On the other hand, because all of the scenes were read and rated prior to scanning, the posterior parietal activation found during personal, compared to standard, imagery cannot be due to sheer familiarity with the script content.

In summary, functional brain activity was assessed in the same individuals during emotional scene perception and narrative imagery. Analyses found little overlap in the regions showing enhanced activity during emotional, compared to neutral, processing, consistent with a context-dependent view of emotion (e.g., Bradley, 2000; Bradley & Lang, 2018). Functional activity was enhanced in some of the regions held to be part of the "core limbic" or "salience" network, including the amygdala, anterior insula, thalamus, and cingulate cortex, but these regions were activated specifically in the context of emotional perception. In contrast, emotional imagery enhanced functional activity in the hippocampus, lateral cerebellum, cerebellar vermis, and dorsal mPFC, supporting the view that imagery of emotionally arousing events is an action context (Lang, 1979). These data support previous studies finding different patterns of functional activation across emotional contexts (Brown et al., 2011; Satpute et al., 2015; Phan et al., 2002), and are not consistent with the view that emotional processing activates a general network that is the same in all emotional contexts (i.e., Lindquist et al., 2012, 2016; Touroutoglou, Lindquist, Dickerson, & Barrett, 2015). Rather, the data indicate that the brain regions active in emotional processing can vary significantly with different methods of emotion induction.

Supplementary Material

Supp info

ACKNOWLEDGMENTS

Thanks to Mathias Weymar, Robert Henderson, Katja Hillbrandt and Vanessa Dominguez for their initial assistance with various aspects of the research. David Herring is now at Murray State University, Murray, Kentucky. This work was supported in part by the National Institutes of Health [grant numbers MH094386, MH098078] to P.J.L.

Appendix

1. I just won fifty million dollars! I can’t believe this is happening, but it’s true! I bought the winning ticket in the lottery. It's amazing. I cry, scream, and jump around! (Pleasant)

2. They yell “surprise!” when I open the door. All of my friends are at my house cheering. My heart races, and I smile and laugh out loud, realizing they’ve thrown me a party. (Pleasant)

3. A long kiss. My body responds slowly at first and then with an urgent rhythm. I'm breathless and my skin tingles. I feel gentle hands touching me and my back arches. (Pleasant)

4. I wake up gasping as smoke fills my lungs. I stumble blindly from the bed, crashing into a chair. The fire quickly spreads everywhere, burning my skin when I try to escape. (Unpleasant)

5. I hear the screech of brakes and look up to see a speeding car slam into my friend as she crosses the street. Her leg is crushed, the artery torn, and blood pumps on the road. (Unpleasant)

6. It's my turn to speak in front of the group. They all look at me. My mouth is dry. The words won’t come out. My heart pounds in the silent room. Someone starts laughing. (Unpleasant)

7. I’m alone in a corner, tense and sweaty. Everyone else is enjoying the party – talking, laughing. I don’t know anyone. People look at me and turn away. My face flushes. (Unpleasant)

8. I’m trapped! In the checkout line, endlessly waiting. People crowd against me. Suddenly, I can’t breathe, my chest is tight, and I’m choking. Am I having a heart attack? (Unpleasant)

9. Panic comes out of the blue with no warning. My heart is racing and my stomach is churning. I feel like I’m going to suffocate. Am I going crazy? Am I going to die? (Unpleasant)

10. I get the groceries out of the car. Sliding my hands around the heavy brown bag, I pick it up. I hold it tightly against my chest and lean over to close the trunk. (Neutral)

11. I grab the last sock from the dryer and toss it into the full basket. I smell the fresh scent of the clean clothes, warm from the dryer, and bend to lift the basket. (Neutral)

12. I run the comb through my damp hair, straighten my shirt, and smooth out the wrinkles in my pants. The water is running in the sink so I turn it off before I leave. (Neutral)

Footnotes

1. Mean pleasure and arousal ratings for stimulus materials using the Self-Assessment Manikin (SAM; Bradley & Lang, 1994; Lang, 1980), in which pleasure varies from 1 (unpleasant) to 9 (pleasant) and arousal varies from 1 (calm) to 9 (excited). Pleasant pictures: pl = 7.2, aro = 5.6; neutral pictures: pl = 5.0, aro = 3.3; unpleasant pictures: pl = 3.0, aro = 5.8; pleasant scripts: pl = 8.2, aro = 7.2; neutral scripts: pl = 5.6, aro = 4.1; unpleasant scripts: pl = 2.5, aro = 6.8.

2. The contrast of unpleasant > pleasant processing was performed for both scene perception and imagery, but no voxels survived the threshold (FDR < .05).

3. During script presentation (3-9 s), functional activity was heightened for standard emotional (M = .27), compared to neutral (M = .13), scripts only in the anterior portion of the mid temporal cortex (left: cluster size = 349 voxels, peak LPI = 49, −6, 11; right: cluster size = 56 voxels, peak LPI = −46, −9, 14).

4. Enhanced functional activity was found for both emotional and neutral imagery in the dorsal-anterior insula, t(60) = 4.72 and 4.8, respectively, p's < .001, but there was no significant difference in functional activation due to emotionality. For the thalamus, ventral-anterior insula, and posterior insula, there were no significant differences in BOLD signal change during either emotional or neutral imagery, and neither prompted changes different from baseline.

5. Functional activity in the lateral cerebellum replicates our previous finding of enhanced BOLD signal change when imagining emotional, compared to neutral, scenes (Sabatinelli et al., 2006). Following up on the significant activation of SMA found during emotional imagery in that study, significant enhancement of BOLD activation (greater than baseline) was found during both emotional, M = .14, t(60) = 6.63, p < .001, and neutral imagery, M = .15, t(60) = 5.88, p < .001, in the current study, with no difference between them, F(1,120) < 1, p = .65.

6. Follow-up ROI analyses indicated that imagining personally relevant emotional scenes prompted a significant BOLD increase (above baseline) in the amygdala, t(60) = 4.0, p < .001, that did not differ from standard emotional scenes.

References

  1. Adolphs R (2002). Neural systems for recognizing emotion. Current Opinion in Neurobiology, 12, 169–177. 10.1016/S0959-4388(02)00301-X [DOI] [PubMed] [Google Scholar]
  2. Amaral DG, Price JL, Pitkanen A, & Carmichael ST (1992). Anatomical organization of the primate amygdaloid complex In Aggleton JP (Ed.), The Amygdala: Neurobiological aspects of emotion, memory, and mental dysfunction (pp. 1–66). New York, NY: Wiley-Liss. [Google Scholar]
  3. Anderson AK, Christoff K, Stappen I, Panitz D, Ghahremani DG, Glover G, Gabrieli JD, & Sobel N (2003). Dissociated neural representations of intensity and valence in human olfaction. Nature Neuroscience, 6(2), 196–202. 10.1038/nn1001 [DOI] [PubMed] [Google Scholar]
  4. Bradley MM (2000). Emotion and motivation In Cacioppo JT, Tassinary LG, & Berntson G (Eds.), Handbook of Psychophysiology (pp. 602–642). New York: Cambridge University Press. [Google Scholar]
  5. Bradley MM (2009). Natural selective attention: orienting and emotion. Psychophysiology, 46, 1–11. 10.1111/j.1469-8986.2008.00702.x [DOI] [PMC free article] [PubMed] [Google Scholar]
  6. Bradley MM, Costa VD, Ferrari V, Codispoti M, Fitzsimmons JR, & Lang PJ (2015). Imaging distributed and massed repetitions of natural scenes: Spontaneous retrieval and maintenance. Human Brain Mapping, 36, 1381–1392. 10.1002/hbm.2278 [DOI] [PMC free article] [PubMed] [Google Scholar]
  7. Bradley MM, & Lang PJ (1994). Measuring emotion: The Self-Assessment Manikin and the Semantic Differential. Journal of Behavioral Therapy and Experimental Psychiatry, 25(1), 49–59. [DOI] [PubMed] [Google Scholar]
  8. Bradley MM, & Lang PJ (2007). The International Affective Picture System (IAPS) in the study of emotion and attention In Coan JA and Allen JJB (Eds). Handbook of emotion elicitation and assessment (pp. 29–46). Oxford University Press. [Google Scholar]
  9. Bradley MM & Lang PJ (2018). Emotion in body and brain: Context dependent action and reaction In Davidson R, Shackman A, Fox A & Lapate R (Eds.), The nature of emotion, Oxford University Press. [Google Scholar]
  10. Bradley MM, Sabatinelli D, Lang PJ, Fitzsimmons JR, King WM, & Desai P (2003). Activation of the visual cortex in motivated attention. Behavioral Neuroscience, 117(2), 369–380. 10.1037/0735-7044.117.2.369 [DOI] [PubMed] [Google Scholar]
  11. Britton JC, Phan KL, Taylor SF, Welsh RC, Berridge KC, Liberzon I (2006). Neural correlates of social and nonsocial emotions: an fMRI study. NeuroImage, 31, 397–409. 10.1016/j.neuroimage.2005.11.027 [DOI] [PubMed] [Google Scholar]
  12. Brown S, Gao X, Tisdelle L, Eickhoff SB, Liotti M (2011). Naturalizing aesthetics: brain areas for aesthetic appraisal across sensory modalities. NeuroImage, 58, 250–258. 10.1016/j.neuroimage.2011.06.012 [DOI] [PMC free article] [PubMed] [Google Scholar]
  13. Bystritsky A, Pontillo D, Powers M, Sabb FW, Craske MG, & Bookheimer SY (2001). Functional MRI changes during panic anticipation and imagery exposure. Neuroreport, 12(18), 3953–3957. 10.1097/00001756-200112210-00020 [DOI] [PubMed] [Google Scholar]
  14. Camille N, Tsuchida A, & Fellows LK (2011). Double dissociation of stimulus-value and action-value learning in humans with orbitofrontal or anterior cingulate cortex damage. The Journal of Neuroscience, 31, 15048–15052. 10.1523/JNEUROSCI.3164-11.2011 [DOI] [PMC free article] [PubMed] [Google Scholar]
  15. Chang LJ, Gianaros PJ, Manuck SB, Krishnan A & Wager TD (2015). A sensitive and specific neural signature for picture-induced negative affect. PLoS Biology, 13, e1002180 10.1371/journal.pbio.1002180 [DOI] [PMC free article] [PubMed] [Google Scholar]
  16. Chikazoe J, Lee DH, Kriegeskorte N, Anderson AK (2014). Population coding of affect across stimuli, modalities and individuals. Nature Neuroscience, 17, 1114–1122. 10.1038/nn.3749 [DOI] [PMC free article] [PubMed] [Google Scholar]
  17. Cisler JM, Bush K, & Steele JS (2013). A comparison of statistical methods for detecting context-modulated functional connectivity in fMRI. NeuroImage, 84(1), 1042–1052. 10.1016/j.neuroimage.2013.09.018 [DOI] [PMC free article] [PubMed] [Google Scholar]
  18. Clark IA, Mackay CE (2015). Mental imagery and post-traumatic stress disorder: A neuroimaging and experimental psychopathology approach to intrusive memories of trauma. Frontiers in Psychiatry, 6, 1–12. 10.3389/fpsyt.2015.00104 [DOI] [PMC free article] [PubMed] [Google Scholar]
  19. Costa VD, Lang PJ, Sabatinelli D, Versace F, Bradley MM (2010). Emotional imagery: Assessing pleasure and arousal in the brain’s reward circuitry. Human Brain Mapping, 31(9), 1446–1457. 10.1002/hbm.20948 [DOI] [PMC free article] [PubMed] [Google Scholar]
  20. Cox RW (1996). AFNI: Software for analysis and visualization of functional magnetic resonance neuroimages. Computers and Biomedical Research, 29(3), 10.1006/cbmr.1996.0014 [DOI] [PubMed] [Google Scholar]
  21. Donaldson DI, Petersen SE, & Buckner RL (2001). Dissociating memory retrieval processes using fMRI: Evidence that priming does not support recognition memory. Neuron, 31, 1047–1059. [DOI] [PubMed] [Google Scholar]
  22. Euston DR, Gruber AJ, McNaughton BL (2012) The role of medial prefrontal cortex in memory and decision making. Neuron, 76, 1057–1070. 10.1016/j.neuron.2012.12.002 [DOI] [PMC free article] [PubMed] [Google Scholar]
  23. Fanselow MS & Dong HW (2010). Are the dorsal and ventral hippocampus functionally distinct structures? Neuron, 65, 7–19 (2010). 10.1016/j.neuron.2009.11.031 [DOI] [PMC free article] [PubMed] [Google Scholar]
  24. Fink GR, Markowitsch HJ, Reinkemeier M, Bruckbauer T, Kessler J, & Heiss WD (1996). Cerebral representation of one’s own past: Neural networks involved in autobiographical memory. The Journal of Neuroscience, 16, 4275–4282. [DOI] [PMC free article] [PubMed] [Google Scholar]
  25. Frank DW, & Sabatinelli D (2014). Human thalamic and amygdala modulation in emotional scene perception. Brain Research, 1587, 69–76. 10.1016/j.brainres.2014.08.061 [DOI] [PubMed] [Google Scholar]
  26. Fulford J, Milton F, Salas D, Smith A, Simler A, Winlove C, & Zeman A (2018). The neural correlates of visual imagery vividness – An fMRI study and literature review. Cortex, 105, 26–24. 10.1016/j.cortex.2017.09.014 [DOI] [PubMed] [Google Scholar]
27. Gottfried JA, O'Doherty J, & Dolan RJ (2002). Appetitive and aversive olfactory learning in humans studied using event-related functional magnetic resonance imaging. The Journal of Neuroscience, 22(24), 10829–10837. 10.1523/JNEUROSCI.22-24-10829.2002
28. Guerin SA, & Miller MB (2009). Lateralization of the parietal old/new effect: An event-related fMRI study comparing recognition memory for words and faces. NeuroImage, 44, 232–242. 10.1016/j.neuroimage.2008.08.035
29. Hariri AR, Tessitore A, Mattay VS, Fera F, & Weinberger DR (2002). The amygdala response to emotional stimuli: a comparison of faces and scenes. NeuroImage, 17, 317–323. 10.1006/nimg.2002.1179
30. Haynes JD (2015). A primer on pattern-based approaches to fMRI: principles, pitfalls, and perspectives. Neuron, 87, 257–270. 10.1016/j.neuron.2015.05.025
31. Hayes DJ, & Northoff G (2011). Identifying a network of brain regions involved in aversion-related processing: a cross-species translational investigation. Frontiers in Integrative Neuroscience, 5, 1–21. 10.3389/fnint.2011.00049
32. Junghöfer M, Sabatinelli D, Bradley MM, Schupp H, Elbert T, & Lang PJ (2006). Fleeting images: Rapid affect discrimination in the visual cortex. NeuroReport, 17(2), 225–229. 10.1097/01.wnr.0000198437.59883.bb
33. Kim H (2017). Brain regions that show repetition suppression and enhancement: A meta-analysis of 137 neuroimaging experiments. Human Brain Mapping, 38(4), 1894–1913. 10.1002/hbm.23492
34. Kober H, Barrett LF, Joseph J, Bliss-Moreau E, Lindquist K, & Wager TD (2008). Functional grouping and cortical-subcortical interactions in emotion: A meta-analysis of neuroimaging studies. NeuroImage, 42, 998–1031. 10.1016/j.neuroimage.2008.03.059
35. Kompus K, Eichele T, Hugdahl K, & Nyberg L (2010). Multimodal imaging of incidental retrieval: The low route to memory. Journal of Cognitive Neuroscience, 23(4), 947–960.
36. Kragel P, & LaBar K (2016). Decoding the nature of emotion in the brain. Trends in Cognitive Sciences, 20(6), 444–455. 10.1016/j.tics.2016.03.011
37. Lang PJ (1977). Imagery in therapy: An information processing analysis of fear. Behavior Therapy, 8, 862–886.
38. Lang PJ (1979). A bio-informational theory of emotional imagery. Psychophysiology, 16, 495–512.
39. Lang PJ (1980). Behavioral treatment and bio-behavioral assessment: Computer applications. In Sidowski JB, Johnson JH, & Williams TA (Eds.), Technology in mental health care delivery systems (pp. 119–137). Norwood, NJ: Ablex Publishing.
40. Lang PJ, & Bradley MM (2010). Emotion and the motivational brain. Biological Psychology, 84(3), 437–450. 10.1016/j.biopsycho.2009.10.007
41. Lang PJ, Bradley MM, & Cuthbert BN (2008). International affective picture system (IAPS): Affective ratings of pictures and instruction manual. Technical Report A-8. University of Florida, Gainesville, FL.
42. Lang PJ, Bradley MM, Fitzsimmons JR, Cuthbert BN, Scott JD, Moulder B, & Nangia V (1998). Emotional arousal and activation of the visual cortex: An fMRI analysis. Psychophysiology, 35, 199–210. 10.1111/1469-8986.3520199
43. Lang PJ, Cuthbert BN, & Bradley MM (1998). Measuring emotion in therapy: Imagery, activation, and feeling. Behavior Therapy, 29, 655–674. 10.1016/S0005-7894(98)80024-5
44. Lindquist KA, Wager TD, Kober H, Bliss-Moreau E, & Barrett LF (2012). The brain basis of emotion: a meta-analytic review. Behavioral and Brain Sciences, 35, 121–143. 10.1017/S0140525X11000446
45. Lindquist KA, Satpute AB, Wager TD, Weber J, & Barrett LF (2016). The brain basis of positive and negative affect: evidence from a meta-analysis of the human neuroimaging literature. Cerebral Cortex, 26(5), 1910–1922. 10.1093/cercor/bhv001
46. Mantani T, Okamoto Y, Shirao N, Okada G, & Yamawaki S (2005). Reduced activation of posterior cingulate cortex during imagery in subjects with high degrees of alexithymia: a functional magnetic resonance imaging study. Biological Psychiatry, 57, 982–990. 10.1016/j.biopsych.2005.01.047
47. Markowitsch HJ, Thiel A, Reinkemeier M, Kessler J, Koyuncu A, & Heiss WD (2000). Right amygdalar and temporofrontal activation during autobiographic, but not during fictitious memory retrieval. Behavioural Neurology, 12, 117–127.
48. Menon V (2015). Salience network. In Toga AW (Ed.), Brain mapping: An encyclopedic reference (Vol. 2, pp. 597–611). Academic Press: Elsevier.
49. McLaren DG, Ries ML, Xu G, & Johnson SC (2012). A generalized form of context-dependent psychophysiological interactions (gPPI): A comparison to standard approaches. NeuroImage, 61(4), 1277–1286. 10.1016/j.neuroimage.2012.03.068
50. Miskovic V, & Anderson AK (2018). Modality general and modality specific coding of hedonic valence. Current Opinion in Behavioral Sciences, 19, 91–97. 10.1016/j.cobeha.2017.12.012
51. Murphy FC, Nimmo-Smith I, & Lawrence AD (2003). Functional neuroanatomy of emotions: A meta-analysis. Cognitive, Affective, & Behavioral Neuroscience, 3, 207–233.
52. Northoff G, Heinzel A, Bermpohl F, Niese R, Pfennig A, Pascual-Leone A, & Schlaug G (2004). Reciprocal modulation and attenuation in the prefrontal cortex: an fMRI study on emotional–cognitive interaction. Human Brain Mapping, 21, 202–212. 10.1002/hbm.20002
53. O'Doherty J, Kringelbach ML, Rolls ET, Hornak J, & Andrews C (2001). Abstract reward and punishment representations in the human orbitofrontal cortex. Nature Neuroscience, 4(1), 95–102. 10.1038/82959
54. Padmala S, Sirbu M, & Pessoa L (2017). Potential reward reduces the adverse impact of negative distractor stimuli. Social Cognitive and Affective Neuroscience, 12(9), 1402–1413. 10.1093/scan/nsx067
55. Pessoa L, & Adolphs R (2010). Emotion processing and the amygdala: From a “low road” to “many roads” of evaluating biological significance. Nature Reviews Neuroscience, 11, 773–783. 10.1038/nrn2920
56. Pearson J, Naselaris T, Holmes EA, & Kosslyn SM (2015). Mental imagery: functional mechanisms and clinical applications. Trends in Cognitive Sciences, 19, 590–602. 10.1016/j.tics.2015.08.003
57. Peelen MV, Atkinson AP, & Vuilleumier P (2010). Supramodal representations of perceived emotions in the human brain. Journal of Neuroscience, 30, 10127–10134. 10.1523/JNEUROSCI.2161-10.2010
58. Phan KL, Wager T, Taylor SF, & Liberzon I (2002). Functional neuroanatomy of emotion: A meta-analysis of emotion activation studies in PET and fMRI. NeuroImage, 16, 331–348. 10.1006/nimg.2002.1087
59. Royet JP, Zald D, Versace R, Costes N, Lavenne F, Koenig O, & Gervais R (2000). Emotional responses to pleasant and unpleasant olfactory, visual, and auditory stimuli: a positron emission tomography study. Journal of Neuroscience, 20, 7752–7759. 10.1523/JNEUROSCI.20-20-07752.2000
60. Saarimäki H, Gotsopoulos A, Jääskeläinen IP, Lampinen J, Vuilleumier P, Sams M, & Nummenmaa L (2016). Discrete neural signatures of basic emotions. Cerebral Cortex, 26(6), 2563–2573. 10.1093/cercor/bhv086
61. Sabatinelli D, Bradley MM, Fitzsimmons JR, & Lang PJ (2005). Parallel amygdala and inferotemporal activation reflect emotional intensity and fear relevance. NeuroImage, 24, 1265–1270. 10.1016/j.neuroimage.2004.12.015
62. Sabatinelli D, Bradley MM, Lang PJ, Costa VD, & Versace F (2007). Pleasure rather than salience activates human nucleus accumbens and medial prefrontal cortex. Journal of Neurophysiology, 98, 1374–1379. 10.1152/jn.00230.2007
63. Sabatinelli D, Fortune EE, Li Q, Siddiqui A, Krafft C, Oliver WT, Beck S, & Jeffries J (2011). Emotional perception: meta-analyses of face and natural scene processing. NeuroImage, 54, 2524–2533. 10.1016/j.neuroimage.2010.10.011
64. Sabatinelli D, Lang PJ, Bradley MM, Costa VD, & Keil A (2009). The timing of emotional discrimination in human amygdala and ventral visual cortex. Journal of Neuroscience, 29, 14864–14868. 10.1523/JNEUROSCI.3278-09.2009
65. Sabatinelli D, Lang PJ, Bradley MM, & Flaisch T (2006). The neural basis of narrative imagery: emotion and action. Progress in Brain Research, 156, 93–103. 10.1016/S0079-6123(06)56005-4
66. Sambuco N, Bradley MM, Herring DR, Hillbrandt K, & Lang PJ (2019). Transdiagnostic trauma severity in anxiety and mood disorders: Functional brain activity during emotional scene perception. Psychophysiology. 10.1111/psyp.13349
67. Satpute A, Kang J, Bickart K, Yardley H, Wager T, & Barrett LF (2015). Involvement of sensory regions in affective experience: A meta-analysis. Frontiers in Psychology, 6, 1860. 10.3389/fpsyg.2015.01860
68. Seeley WW, Menon V, Schatzberg AF, Keller J, Glover GH, Kenna H, et al. (2007). Dissociable intrinsic connectivity networks for salience processing and executive control. The Journal of Neuroscience, 27, 2349–2356. 10.1523/JNEUROSCI.5587-06.2007
69. Sege CT, Bradley MM, Weymar M, & Lang PJ (2017). A direct comparison of appetitive and aversive anticipation: Overlapping and distinct neural activation. Behavioural Brain Research, 326, 96–102. 10.1016/j.bbr.2017.03.005
70. Shinkareva SV, Wang J, Kim J, Facciani MJ, Baucom LB, & Wedell DH (2014). Representations of modality-specific affective processing for visual and auditory stimuli derived from functional magnetic resonance imaging data. Human Brain Mapping, 35, 3558–3568. 10.1002/hbm.22421
71. Sescousse G, Caldú X, Segura B, & Dreher J-C (2013). Processing of primary and secondary rewards: a quantitative meta-analysis and review of human functional neuroimaging studies. Neuroscience and Biobehavioral Reviews, 37, 681–696. 10.1016/j.neubiorev.2013.02.002
72. Shirer WR, Ryali S, Rykhlevskaia E, Menon V, & Greicius MD (2012). Decoding subject-driven cognitive states with whole-brain connectivity patterns. Cerebral Cortex, 22, 158–165. 10.1093/cercor/bhr099
73. Svoboda E, McKinnon MC, & Levine B (2006). The functional neuroanatomy of autobiographical memory: a meta-analysis. Neuropsychologia, 44, 2189–2208. 10.1016/j.neuropsychologia.2006.05.023
74. Touroutoglou A, Lindquist KA, Dickerson BC, & Barrett LF (2015). Intrinsic connectivity in the human brain does not reveal networks for ‘basic’ emotions. Social Cognitive and Affective Neuroscience, 10(9), 1257–1265. 10.1093/scan/nsv013
75. Wagner AD, Shannon BJ, Kahn I, & Buckner RL (2005). Parietal lobe contributions to episodic memory retrieval. Trends in Cognitive Sciences, 9, 445–453.
76. Weymar M, Bradley MM, Sege CT, & Lang PJ (2018). Neural activation and memory of natural scenes: Explicit and spontaneous retrieval. Psychophysiology. 10.1111/psyp.13197
77. Wright CI, Wedig MM, Williams D, Rauch SL, & Albert MS (2006). Novel fearful faces activate the amygdala in healthy young and elderly adults. Neurobiology of Aging, 27(2), 361–374. 10.1016/j.neurobiolaging.2005.01.014
78. Yassa MA, & Stark CEL (2008). Multiple signals of recognition memory in the medial temporal lobe. Hippocampus, 18(9), 945–954. 10.1002/hipo.20452
