Author manuscript; available in PMC: 2014 Nov 26.
Published in final edited form as: Hum Brain Mapp. 2009 Nov;30(11):3748–3758. doi: 10.1002/hbm.20803

Frontolimbic Responses to Emotional Face Memory: The Neural Correlates of First Impressions

Theodore D Satterthwaite a, Daniel H Wolf a, Ruben C Gur a,b,c, Kosha Ruparel a, Jeffrey N Valdez a, Raquel E Gur a,b,c, James Loughead a
PMCID: PMC4244703  NIHMSID: NIHMS564364  PMID: 19530218

Abstract

First impressions, especially of emotional faces, may critically impact later evaluation of social interactions. Activity in limbic regions, including the amygdala and ventral striatum, has previously been shown to correlate with identification of emotional content in faces; however, little work has been done describing how these signals may influence emotional face memory. We report an event-related fMRI study in 21 healthy adults where subjects attempted to recognize a neutral face that was previously viewed with a threatening (angry or fearful) or non-threatening (happy or sad) affect. In a hypothesis-driven region of interest analysis, we found that neutral faces previously presented with a threatening affect recruited the left amygdala. In contrast, faces previously presented with a non-threatening affect activated the left ventral striatum. A whole-brain analysis revealed increased response in the right orbitofrontal cortex to faces previously seen with threatening affect. These effects of prior emotion were independent of task performance, with differences being seen in the amygdala and ventral striatum even if only incorrect trials were considered. The results indicate that a network of frontolimbic regions may provide emotional bias signals during facial recognition.

Keywords: amygdala, ventral striatum, fMRI, face, memory, emotion, orbitofrontal cortex

Introduction

The neural architecture for the recognition of faces has been well described, with multiple studies implicating a network of brain regions including the fusiform gyrus and the superior temporal sulcus (Vuilleumier and Pourtois 2007). Prior research also suggests that the emotional information conveyed by facial affect may have similarities to other emotional stimuli (Britton, et al. 2006b) and be processed by a network of limbic and cortical regions including the amygdala, ventral striatum, and orbitofrontal cortex (Hariri, et al. 2003; Killgore and Yurgelun-Todd 2004; Loughead, et al. 2008). Building on a large literature examining fear conditioning in rodents (LeDoux 2000), multiple functional magnetic resonance imaging (fMRI) experiments have demonstrated that the amygdala responds robustly to threatening faces (Breiter, et al. 1996; Morris, et al. 1996; Phan, et al. 2002). Similarly, the orbitofrontal cortex (OFC) has been found to respond to displays of angry faces, suggesting that it may be important in modulating the emotional context of facial expression (Blair, et al. 1999; Dougherty, et al. 1999), or may be involved in the inhibition of a behavioral response to a perceived threat (Coccaro, et al. 2007). In contrast, the ventral striatum is most often associated with reward-related behavior in both electrophysiological studies in animals (Schultz 2002) and human neuroimaging experiments (Knutson, et al. 2001; Monk, et al. 2008; O'Doherty, et al. 2001). Several imaging studies have demonstrated that the ventral striatum is recruited by the rewarding properties of beautiful or happy faces (Aharon, et al. 2001; Monk, et al. 2008). However, other studies suggest that amygdala (Davis and Whalen 2001; Fitzgerald, et al. 2006) and ventral striatal responses (Zink, et al. 2006; Zink, et al. 2003) are better explained by salience than by specific threat or reward-related properties.

This body of literature has almost uniformly investigated the neural correlates of passive viewing of emotional faces or facial affect identification. Beyond one small study, which found insular activation to the memory of neutral faces initially paired with a description of a disgusting behavior (Todorov, et al. 2007), remarkably little research has investigated the neural correlates of emotional memory of faces (Gobbini and Haxby 2007). The aphorism “You never have a second chance to make a first impression” attests to the importance of the affective content of a face when it is first encountered, as such information inevitably informs the interpretation of subsequent social interactions (Pessoa 2005; Sommer, et al. 2008). Identifying the neural basis of this phenomenon is important for further understanding of both social cognition and its dysfunction in psychiatric disorders such as schizophrenia (Gur, et al. 2007; Holt, et al. 2006; Holt, et al. 2005; Kohler, et al. 2000; Mathalon and Ford 2008), post-traumatic stress disorder (Liberzon, et al. 2007), and depression (Fu, et al. 2007; Fu, et al. 2008).

We investigated the neural correlates of emotional face memory using an incidental memory paradigm and event-related fMRI. Subjects made an ‘old’ vs. ‘new’ recognition judgment regarding previously viewed and novel faces. Importantly, the ‘old’ faces had been originally presented with either threatening (angry or fearful) or non-threatening (happy or sad) facial expressions, while in the forced-choice recognition task all faces were shown with neutral expressions. We hypothesized that faces displaying a threatening affect on “first impression” would recruit the amygdala during the recognition task despite their current neutral affect. In contrast, we expected that neutral faces initially displayed with a non-threatening affect would activate the ventral striatum. As described below, these hypotheses were supported. Furthermore, in an exploratory whole-brain analysis, we found that faces previously viewed with threatening expressions recruited the orbitofrontal cortex. These results delineate a frontolimbic network that may provide emotional bias signals to modulate facial memory, integrating critical information learned from an affect-laden first impression.

Experimental Procedures

Subjects

We studied 24 right-handed participants, who were free from psychiatric or neurologic comorbidity. After a complete description of the study, subjects provided written informed consent. Three subjects were excluded for excessive in-scanner motion (mean volume-to-volume displacement or mean absolute displacement greater than two standard deviations above the group average). This left a final sample of 21 subjects (47.6% male, mean age 32.0 years, SD=8.6).
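The motion-exclusion rule can be expressed compactly. The sketch below is illustrative only; the function and variable names are ours, not from the study's pipeline:

```python
import numpy as np

def motion_outliers(rel_disp, abs_disp):
    """Flag subjects whose mean volume-to-volume (relative) or mean absolute
    displacement exceeds the group mean by more than two standard deviations."""
    def above_2sd(x):
        x = np.asarray(x, dtype=float)
        return x > x.mean() + 2 * x.std()
    return above_2sd(rel_disp) | above_2sd(abs_disp)

# Example: one clearly high-motion subject among 24
rel = [0.10] * 23 + [1.00]   # mean relative displacement (mm) per subject
absd = [0.30] * 24           # mean absolute displacement (mm) per subject
excluded = motion_outliers(rel, absd)   # flags only the last subject
```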

Task

The face recognition experiment was preceded by an emotion identification task similar to those previously reported by our laboratory (Gur, et al. 2002b; Gur, et al. 2002c; Loughead, et al. 2008). In the emotion identification experiment, subjects viewed 30 faces displaying neutral, happy, sad, angry, or fearful affect, and were asked to label the emotion displayed (Figure 1A). Stimulus construction and validation are described in detail elsewhere (Gur, et al. 2002a). Briefly, the stimuli were color photographs of actors (50% female) who had volunteered to participate in a study on emotion. They were coached by professional directors to express facial displays of different emotions. For the purpose of constructing the affect identification task, a subset of 30 intense expressions was selected based on a high degree of identification accuracy among raters. During the affect identification task, subjects were not instructed to remember the faces or informed of the memory component of the experiment. The emotion identification and the face recognition experiments were separated by a diffusion tensor imaging acquisition lasting 10 minutes. Results from the emotion identification task will be reported elsewhere.

Figure 1.

Figure 1

Experimental Paradigm. A. Encoding task. Subjects initially performed an emotion identification task in which they identified the facial affect displayed. Four emotional labels were available, including two non-threatening affects (happy and sad), two threatening affects (angry and fearful), and neutral. Subjects were not instructed to remember the faces displayed. B. Face recognition task. Following the affect identification task, subjects were asked to make a forced-choice facial recognition judgment. Thirty faces (targets) from the affect identification task and thirty novel faces (foils) were displayed for 2 seconds each. Subjects made a simple ‘old’ vs. ‘new’ judgment as to whether the face had been previously displayed in the affect identification task. Faces were separated by a variable (0-12s) interval of crosshair fixation.

The face recognition experiment (Figure 1B) presented 30 faces from the emotion identification experiment (targets) along with 30 novel faces (foils). Faces were displayed for two seconds and subjects were asked to make a simple ‘old’ vs. ‘new’ judgment using a two-button response pad. Target faces initially presented with happy, sad, angry, or fearful affect were displayed with neutral expressions. Foil stimuli not previously seen were also displayed with a neutral expression. Faces shown with neutral expressions in the emotion identification experiment were treated as a covariate of no interest in the face recognition analysis: whereas target threat or non-threat faces were displayed initially with a strong affect and subsequently re-presented with a neutral affect, neutral faces were displayed with the same expression in both experiments, making neutral trials difficult to compare to other targets. Previous experiments in our laboratory (Gur, et al. 2002b; Loughead, et al. 2008) have conceptualized emotions along the dimension of threat-relatedness. In the current experiment, faces originally displayed with an angry or fearful affect were modeled together as threat; faces originally displayed with a happy or sad affect were modeled as non-threat. This grouping of emotions is suggested by the theories of Gray (Gray 1990), by previous data from our laboratory (Loughead, et al. 2008), as well as by other studies (Hariri, et al. 2000; Suslow, et al. 2006). Overall, there were 6 trials per emotion, yielding 12 threat and 12 non-threat trials per subject. Interspersed with the 60 task trials (30 targets, 30 foils) were 60 baseline trials of variable duration (0-12 seconds) during which a crosshair fixation point was displayed on a complex background (degraded face). No stimulus was presented twice, and total task duration was 4 minutes and 16 seconds.

fMRI Procedures

Participants were required to demonstrate understanding of the task instructions and the response device, and to complete one practice trial prior to acquisition of face memory data. Earplugs were used to muffle scanner noise and head fixation was aided by foam-rubber restraints mounted on the head coil. Stimuli were rear-projected to the center of the visual field using a PowerLite 7300 video projector (Epson America, Inc.; Long Beach, CA) and viewed through a head coil mounted mirror. Stimulus presentation was synchronized with image acquisition using the Presentation software package (Neurobehavioral Systems, Inc., Albany, CA). Subjects were randomly assigned to use their dominant or non-dominant hand; responses were recorded via a non-ferromagnetic keypad (fORP; Current Designs, Inc.; Philadelphia, PA).

Image Acquisition

BOLD fMRI was acquired with a Siemens Trio 3 Tesla (Erlangen, Germany) system using a whole-brain, single-shot gradient-echo (GE) echoplanar (EPI) sequence with the following parameters: TR/TE=2000/32 ms, FOV=220 mm, matrix= 64 × 64, slice thickness/gap=3/0mm, 40 slices, effective voxel resolution of 3 × 3 × 3mm. To reduce partial volume effects in orbitofrontal regions, EPI was acquired obliquely (axial/coronal). The slices provided adequate brain coverage, from the superior cerebellum and inferior temporal lobe ventrally to the hand-motor area in the primary motor cortex dorsally. Prior to time-series acquisition, a 5-minute magnetization-prepared, rapid acquisition gradient-echo T1-weighted image (MPRAGE, TR 1620ms, TE 3.87 ms, FOV 250 mm, matrix 192x256, effective voxel resolution of 1 × 1 × 1mm) was collected for anatomic overlays of functional data and to aid spatial normalization to a standard atlas space (Talairach and Tournoux 1988).

Performance Analysis

Median percent correct and reaction time (RT, in milliseconds) were calculated for threat and non-threat trials separately. Differences in percent correct were evaluated using the Wilcoxon Signed-Rank test; RT between trial types was compared with paired two-tailed t-tests. Performance across all trials was evaluated by calculating the discrimination index Pr and the response bias Br (Snodgrass and Corwin 1988). These were calculated using the following formulas:

  • Pr = hit rate (correct ‘old’ responses to targets) − false-alarm rate (incorrect ‘old’ responses to foils)

  • Br = false-alarm rate / [1 − Pr] − 0.5

The discrimination index Pr provides an unbiased measure of the accuracy of response, with higher values corresponding to a greater degree of accuracy. In contrast, the response bias Br provides an independent measure of the overall tendency of subjects to make ‘old’ or ‘new’ responses regardless of accuracy. Positive values correspond to a familiarity bias (i.e., more likely to say ‘old’ to a new item) whereas negative values correspond to conservative responses with a novelty bias (i.e., more likely to say ‘new’ to an old item).
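In code, these performance measures reduce to a few lines. The sketch below is illustrative (function and variable names are ours); it assumes group-level hit and false-alarm rates and per-subject accuracy and RT arrays:

```python
import numpy as np
from scipy import stats

def discrimination_and_bias(hit_rate, fa_rate):
    """Two-high-threshold indices of Snodgrass & Corwin (1988).

    hit_rate: proportion of targets correctly judged 'old'
    fa_rate:  proportion of foils incorrectly judged 'old'
    """
    pr = hit_rate - fa_rate               # discrimination index
    br = fa_rate / (1.0 - pr) - 0.5       # response bias; 0 = neutral
    return pr, br

# A conservative responder: few hits, few false alarms
pr, br = discrimination_and_bias(hit_rate=0.40, fa_rate=0.13)  # pr = 0.27, br ~ -0.32

# Paired comparisons across subjects (illustrative random data)
rng = np.random.default_rng(0)
acc_threat, acc_nonthreat = rng.uniform(0, 1, 21), rng.uniform(0, 1, 21)
rt_threat, rt_nonthreat = rng.normal(1069, 149, 21), rng.normal(1020, 211, 21)
w = stats.wilcoxon(acc_threat, acc_nonthreat)   # non-parametric, paired
t = stats.ttest_rel(rt_threat, rt_nonthreat)    # paired two-tailed t-test
```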

Image Analysis

fMRI data were preprocessed and analyzed using FEAT (fMRI Expert Analysis Tool) Version 5.1, part of FSL (FMRIB's Software Library, www.fmrib.ox.ac.uk/fsl). Images were slice-time corrected, motion corrected to the median image using a tri-linear interpolation with six degrees of freedom (Jenkinson, et al. 2002), high pass filtered (100s), spatially smoothed (6mm FWHM, isotropic), and grand-mean scaled using mean-based intensity normalization. BET was used to remove non-brain areas (Smith 2002). The median functional and anatomical volumes were coregistered, and then transformed into the standard anatomical space (T1 MNI template, voxel dimensions of 2x2x2 mm) using tri-linear interpolation.

Subject level time-series statistical analysis was carried out using FILM (FMRIB's Improved General Linear Model) with local autocorrelation correction (Woolrich, et al. 2001). The seven condition events (four target emotions, neutrals, foils, and non-responses) were modeled in the GLM after convolution with a canonical hemodynamic response function; temporal derivatives of each condition were also included in the model. Six rigid body movement parameters were included as nuisance covariates.
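FILM/FEAT performs this convolution internally; purely as an illustration of the modeling step, here is a minimal numpy/scipy sketch of one condition regressor (the gamma parameters only approximate the canonical HRF shape, and the temporal-derivative columns would be built analogously):

```python
import numpy as np
from scipy.stats import gamma

def canonical_hrf(tr, duration=32.0):
    """Double-gamma HRF sampled every TR seconds (parameters approximate
    the widely used canonical shape: ~5 s peak, late undershoot)."""
    t = np.arange(0.0, duration, tr)
    hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0
    return hrf / hrf.sum()

def event_regressor(onsets_sec, n_vols, tr):
    """Convolve brief event onsets with the HRF, one value per volume."""
    box = np.zeros(n_vols)
    box[(np.asarray(onsets_sec) / tr).astype(int)] = 1.0
    return np.convolve(box, canonical_hrf(tr))[:n_vols]

# e.g. one condition's onsets (s) in a 128-volume run acquired at TR = 2 s
reg = event_regressor([10.0, 34.0, 60.0], n_vols=128, tr=2.0)
```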

The contrast map of interest, threat vs. non-threat [i.e., (anger + fear) > (happy + sad)], was subjected to an a priori region of interest (ROI) analysis in two structures implicated in threat and reward processing: the amygdala and ventral striatum. The amygdala ROI was defined using the AAL atlas (Tzourio-Mazoyer, et al. 2002); the ventral striatum was defined from a publication-based probabilistic MNI Atlas used as a binary mask at the threshold of 0.75 probability (Fox and Lancaster 1994; for more information see http://hendrix.imm.dtu.dk/services/jerne/ninf/voi/indexalphabetic). Small volume correction (Friston 1997) was used to identify clusters of at least 10 contiguous 2x2x2 voxels within these ROIs.

In addition to the ROI analysis, two whole-brain mixed effects analyses were performed (FLAME 2, FMRIB's Local Analysis of Mixed Effects): 1) each subject's whole-brain contrast of threat vs. non-threat was entered into a one-sample t-test and 2) each subject's mean activation of the task (all trials) relative to baseline was calculated and entered into a one-sample t-test. We corrected for multiple comparisons using a Monte Carlo method with AFNI AlphaSim (R. W. Cox, National Institutes of Health) at a Z threshold of 2.58 and a probability of spatial extent p<0.01. Identified clusters were then labeled according to anatomical regions using the Talairach Daemon database. Coordinates of cluster locations in all analyses are reported in MNI space. To characterize BOLD response amplitudes, mean scaled beta coefficients (% signal change) for threat and non-threat conditions were extracted from significant clusters in the a priori ROI analysis and whole brain analysis.
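The extraction step amounts to averaging scaled betas over a cluster's voxels. A schematic numpy sketch follows (array and function names are ours; FSL tools such as featquery handle this bookkeeping, including regressor-height scaling, in practice):

```python
import numpy as np

def roi_percent_signal_change(beta_map, cluster_mask, mean_map):
    """Mean % signal change over cluster voxels.

    beta_map:     3-D GLM beta estimates for one condition vs. baseline
    cluster_mask: boolean 3-D array of above-threshold voxels
    mean_map:     voxelwise mean signal used for grand-mean scaling
    """
    vox = cluster_mask & (mean_map > 0)   # guard against zero-signal voxels
    return 100.0 * np.mean(beta_map[vox] / mean_map[vox])

# toy 3-voxel "cluster": betas of 2 on a mean signal of 100 -> 2% change
beta = np.full((4, 4, 4), 2.0)
mask = np.zeros((4, 4, 4), bool)
mask[:3, 0, 0] = True
mean = np.full((4, 4, 4), 100.0)
pct = roi_percent_signal_change(beta, mask, mean)
```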

Finally, we conducted three exploratory analyses to further examine these effects. First, we investigated the effect of individual emotions within significantly activated clusters from the a priori ROIs and certain regions from the whole-brain analysis. To do this, we extracted beta coefficients from each target emotion (happy, sad, angry, fearful) versus a crosshair baseline. Significant differences between individual emotions were assessed with one-tailed paired t-tests. Second, we investigated whether the observed effects were affected by task performance. We modeled seven performance-based regressors at the subject level: threat correct, threat incorrect, non-threat correct, non-threat incorrect, foils, neutrals, and non-responses. Beta coefficients were extracted from above-threshold voxels as described above, and entered into a 2×2 repeated-measures ANOVA (threat vs. non-threat × correct vs. incorrect) to test the main effect of correct response and interaction effects. In this analysis the main effect of threat vs. non-threat was of limited utility as the contrast had already been used to select the voxels under investigation. Third, in order to ascertain if the threat vs. non-threat differences noted in our a priori ROI analysis were present even when subjects did not make a correct response, we used the performance-based regressors and restricted the threat vs. non-threat contrast to incorrect trials only.
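For a fully within-subject 2×2 design, each F(1, N−1) in such an ANOVA equals the square of a paired t-test on the corresponding contrast, so the analysis can be sketched with scipy alone (names are ours; a dedicated repeated-measures ANOVA routine would be an equivalent route):

```python
import numpy as np
from scipy import stats

def rm_anova_2x2(tc, ti, nc, ni):
    """2x2 fully repeated-measures ANOVA via paired contrasts (F = t**2).

    tc, ti, nc, ni: per-subject mean betas for threat-correct,
    threat-incorrect, non-threat-correct, non-threat-incorrect.
    """
    tc, ti, nc, ni = map(np.asarray, (tc, ti, nc, ni))
    effects = {
        "threat":      stats.ttest_rel((tc + ti) / 2, (nc + ni) / 2),
        "performance": stats.ttest_rel((tc + nc) / 2, (ti + ni) / 2),
        "interaction": stats.ttest_rel(tc - ti, nc - ni),
    }
    return {k: (r.statistic ** 2, r.pvalue) for k, r in effects.items()}

# illustrative random data for 21 subjects -> F(1,20) and p per effect
rng = np.random.default_rng(1)
res = rm_anova_2x2(*(rng.normal(size=21) for _ in range(4)))
```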

Results

Task Performance

The response rate (81.2%) indicated that the task was difficult. Response accuracy was low (median 25% overall among the four target emotions), resulting in a very limited discrimination index (Pr=0.27). The low accuracy was associated with a conservative response bias, whereby subjects were more likely to judge previously presented faces as ‘new’ (Br=-0.29). There were no significant differences between threat and non-threat trials in either response accuracy or response time (Table 1). Although neutral faces and foils were not included in our analysis, accuracy for these trials was significantly higher: 50% for neutral trials and 76% for foils.

Table 1. Performance measures during face recognition task.

Performance measure        Median   S.D.
Percent Correct
  Non-threat               33%      19%
  Threat                   25%      14%
Response Time (msec)
  Non-threat               1020     211
  Threat                   1069     149
  Non-threat Correct       1139     173
  Non-threat Incorrect     1005     241
  Threat Correct           1121     137
  Threat Incorrect         1058     152

Task Activation

The analysis of all trials vs. fixation revealed significant activation for the task in a distributed network including bilateral frontal, parietal, medial temporal, thalamic, anterior cingulate, and insular regions (see Supplementary Table and Figure S1). This is consistent with general accounts of recognition memory (Shannon and Buckner 2004; Wheeler and Buckner 2003), as well as studies of face recognition more specifically (Calder and Young 2005; Ishai and Yago 2006).

Region of Interest Analysis

The a priori ROI analysis revealed significant effects in the left amygdala and left ventral striatum (Figure 2A). The left amygdala responded more to threat than non-threat stimuli in a cluster of 28 contiguous voxels surviving small volume correction (peak voxel: -30, -6, -28). The left ventral striatum showed the opposite effect, activating more to faces initially encountered with a non-threatening affect; 28 voxels (peak voxel: -14, 16, -8) survived small volume correction. While significant effects were only seen on the left, the right amygdala and ventral striatum exhibited similar sub-threshold responses. Overall, these results are consistent with our hypotheses, suggesting that during the recall of emotionally valenced memories, the amygdala is differentially recruited by threat and the ventral striatum by non-threat even when the recognition cues lack emotional content.

Figure 2.

Figure 2

Imaging results. A. Region of interest analysis. The amygdala and the ventral striatum were identified as a priori regions of interest for analysis of target responses. The left amygdala responded more to target faces previously displayed with a threatening (angry or fearful) affect. In contrast, the left ventral striatum (right panels) responded more to target faces previously displayed with a non-threatening (happy or sad) affect. B. Whole-brain analysis. In a voxelwise analysis, the right orbitofrontal cortex (OFC) responded to target faces previously displayed with a threatening affect.

Whole-Brain Analysis

In order to examine brain regions beyond those selected in our a priori analysis, we performed a whole-brain analysis of the threat > non-threat contrast. The right orbitofrontal cortex showed significantly greater BOLD response during threat trials compared to non-threat, corrected at a Z threshold of 2.58 and a probability of spatial extent p<0.01 (90 voxels, peak at 44, 56, -12; see Figure 2B). The left orbitofrontal cortex exhibited a similar response that was not above this threshold. Two left lateral temporal clusters (posterior: 125 voxels, peak at -64, -38, -4; anterior: 96 voxels, peak at -48, -14, -8) were also found to be above threshold.

Exploratory Analysis of Individual Emotions

In order to probe the source of these threat vs. non-threat differences, we examined the effect of individual emotions within the voxels found to be significant in the above analyses. The results indicate that the threat > non-threat difference was not driven by any single emotion (see Supplementary Figure S2). In the left amygdala, there were significant differences between each of the threatening and non-threatening affects: anger > happy [t(40)=2.2, p=0.01], anger > sad [t(40)=2.3, p=0.01], fear > happy [t(40)=1.8, p<0.05], fear > sad [t(40)=1.8, p<0.05]. There were no differences within the threat or non-threat affects. The same pattern was seen in the right OFC, with anger > happy [t(40)=2.6, p<0.01] and anger > sad [t(40)=3.5, p<0.001], as well as fear > happy [t(40)=2.5, p<0.01] and fear > sad [t(40)=3.5, p=0.001]. Again, there were no differences within the threat or non-threat emotions. Finally, in the ventral striatum, there was a significant difference between sad and fear [t(40)=2.0, p<0.05], as well as a trend toward a difference between happy and fear [t(40)=1.4, p=0.07]. There were no other significant differences between emotions in the ventral striatum; anger showed an intermediate response.

Exploratory Performance-Based Analyses

Next, we examined the effect of recognition performance on the observed threat vs. non-threat differences. Overall, we found that the effects described above were not significantly influenced by performance (Supplementary Figure S3). There was no main effect of performance in the left amygdala [F(1,20)=0.03, p=0.85], left ventral striatum [F(1,20)=0.004, p=0.95], or right OFC [F(1,20)=2.69, p=0.11]. Furthermore, there were no interactions between performance and trial type in any of the above regions. However, this analysis was likely underpowered due to the limited number of correct trials.

When we restricted our analysis to incorrect trials only, we found that differences between threat and non-threat trials persisted in our a priori ROIs. In the left amygdala, 4 voxels survived small volume correction when only incorrect trials were included; 50 voxels surpassed an uncorrected p=0.05 threshold. In the left ventral striatum, 20 voxels survived small volume correction.

Discussion

This study investigated the impact of an intense facial affect at first presentation on neural activation during a subsequent recognition task. We found that faces previously seen with a threatening expression (fearful or angry) provoked a greater BOLD response in the left amygdala and right orbitofrontal cortex, whereas the left ventral striatum responded more to faces initially viewed with a non-threatening expression. Furthermore, these differences were independent of performance. The implications and limitations of these findings are discussed below.

Amygdala and OFC respond to previously threatening faces

Past experiments have demonstrated the important role of the amygdala in fear conditioning in animals (for a review, see LeDoux, 2000), in humans with amygdalar lesions (Bechara, et al. 1995; LaBar, et al. 1995), and in healthy controls in neuroimaging paradigms (Dolcos, et al. 2004; LaBar and Cabeza 2006). In addition to conditioning paradigms, the amygdala has been critically implicated in emotional learning across a wide variety of tasks (see LaBar and Cabeza 2006). Specifically, amygdalar activity during encoding of emotional stimuli predicts successful recollection (Canli, et al. 2000; Dolcos, et al. 2004). Furthermore, neutral stimuli encoded in emotional contexts have been shown to recruit the amygdala upon retrieval (Dolan, et al. 2000; Maratos, et al. 2001; Smith, et al. 2004; Taylor, et al. 1998). Work by our group (Gur, et al. 2002b; Gur, et al. 2007) and others (Anderson, et al. 2003; Williams, et al. 2004) has demonstrated that the amygdala responds to displays of threatening faces; however, no previous experiment has investigated limbic contributions to facial affective memory.

Consistent with previous studies using non-facial stimulus categories (Dolan, et al. 2000; Maratos, et al. 2001; Smith, et al. 2004) we found that faces initially encountered with a threatening affect elicit amygdalar response upon re-presentation with neutral affect. Gobbini and Haxby (2007) have proposed a core system of facial recognition that is modulated by emotional information through limbic inputs. Our results suggest that modulatory signals from limbic regions reflect not only the current emotional context, but also represent the prior emotional context from a previous exposure. For example, the negative “first impression” created by a threatening facial expression may produce amygdalar activation initially (Gur, et al. 2002b; LeDoux 2000). Later, when the face is re-encountered with a neutral expression, the amygdala is again recruited in response to the emotional context present during encoding. Thus, these results outline a neural system that integrates social information about individuals over time.

The OFC is known to occupy a privileged role at the intersection of emotion and cognition. It has extensive reciprocal connections to the medial temporal lobe (Cavada, et al. 2000) and there is ample evidence for its role in the encoding of memories (Frey and Petrides 2002), specifically the aspects of memory associated with rewarding or aversive information (Morris and Dolan 2001; O'Doherty, et al. 2001; Rolls 2000). Two previous studies (Blair, et al. 1999; Lee, et al. 2008) have found that the OFC responds to angry faces; here we found that, like the amygdala, the right OFC responds to neutral faces previously presented with a threatening affect. Like the amygdala and OFC, two left lateral temporal clusters were also found to respond to previously threatening faces. These clusters are situated near the superior temporal sulcus, which has been shown to be involved in numerous functions (Hein and Knight 2008), including face perception (Haxby, et al. 1999), gaze direction (Grosbras, et al. 2005; Hoffman and Haxby 2000), and biological motion (Grossman and Blake 2002). Extending findings that the STS may be involved in processing displays of facial affect (Williams, et al. 2008), our results suggest that lateral temporal regions may also be recruited by previously encountered affect.

Ventral striatum responds to previously non-threatening faces

While much research on emotional memory has focused on the amygdala (Phelps 2004), there is increasing evidence that the striatum also plays an important role in the encoding of emotional information (Adcock, et al. 2006; Brewer, et al. 2008; Britton, et al. 2006a; Phan, et al. 2004; Satterthwaite, et al. 2007). Studies by Monk et al. (2008) and Aharon et al. (2001) have demonstrated that the ventral striatum is recruited by displays of happy and beautiful faces, suggesting that pleasant faces are processed as social reinforcers. Like happy faces, sad faces have been theorized to be rewarding because of their pro-social, affiliative properties, as well as their role in signaling submission within a social hierarchy (Bonanno, et al. 2008; Eisenberg, et al. 1989; Eisenberg and Miller 1987; Killgore and Yurgelun-Todd 2004; Lewis and Haviland-Jones 2008). Evidence that the human brain is particularly sensitive to affective displays that provide information on social context and hierarchy is accumulating (Britton, et al. 2006a; Carter and Pelphrey 2008; Fliessbach, et al. 2007; Guroglu, et al. 2008; Kim, et al. 2007; Rilling, et al. 2008; Zink, et al. 2008). Also, two prior fMRI studies have found sad faces to activate the striatum (Beauregard, et al. 1998; Fu, et al. 2004). Our results suggest that reward pathways are recruited not just when a non-threatening face is first encountered, but also when it is re-encountered in a neutral context. Thus, the ventral striatum may provide a reward signal that acts in opposition to the amygdala when making threat vs. non-threat distinctions regarding previously encountered individuals.

Threat vs. non-threat differences persist even when faces are not recognized

In an exploratory analysis, we did not find an effect of task performance on brain activation that would account for the differential response of these brain regions. Indeed, when we excluded correct trials and examined amygdalar and striatal responses to incorrect trials only, the results persisted (although at somewhat lower levels of significance). This potentially important finding suggests that successful recognition is not required for facial affective information to be preserved on a neural level. This is relevant to the ongoing controversy as to whether amygdalar responses to emotional information depend on active attention or if they occur automatically (Pessoa, et al. 2005; Vuilleumier and Driver 2007). Several backward-masking fMRI studies have shown that the amygdala responds to fearful faces presented without awareness (Dannlowski, et al. 2007; Whalen, et al. 1998; Williams, et al. 2006), providing evidence for an automatic “bottom-up” response that does not require attention (LeDoux 2000; Phelps and LeDoux 2005). Our task was not designed to isolate such stimulus-driven properties of amygdala response. However, the results do suggest that a “first impression” that is not explicitly remembered may be encoded on a neural level and implicitly influence subsequent social interactions.

Limitations

Two limitations of this study should be acknowledged. First, the grouping of the emotions into threat and non-threat categories may obscure differences between the individual component emotions. For example, while an angry face represents a direct threat indicated by gaze, a fearful face may indicate a more ambiguous environmental threat (Ewbank, et al. 2008), and some studies (Whalen, et al. 2001) have found that the amygdala may respond more to fearful than angry faces. However, other studies have demonstrated that the amygdala does reliably respond to anger (Beaver, et al. 2008; Evans, et al. 2008; Ewbank, et al. 2008; Hariri, et al. 2000; Nomura, et al. 2004; Sprengelmeyer, et al. 1998; Stein, et al. 2002; Suslow, et al. 2006). In the current study, anger and fear trials produced a similar response in amygdala and OFC clusters, indicating that the results were not driven by one particular emotion. In contrast, the findings in the ventral striatum were more ambiguous, and distinctions other than the threat vs. non-threat effect may have impacted the results. Future studies are necessary to investigate potentially important differences within categories of threatening and non-threatening faces.

Second, low accuracy in target recognition left the study underpowered for certain performance-based comparisons. Performance was likely limited by the general difficulty of face recognition tasks (Calkins, et al. 2005; Ishai and Yago 2006) and by the fact that subjects were not told during the affect identification task that they would later be asked to recognize the faces. We observed a strong conservative response bias, suggesting that low target accuracy was not due to simple guessing. In particular, while we did not find an interaction of task performance with the threat vs. non-threat differences reported, this analysis was limited by the small number of correct trials in each group. In contrast, the high proportion of incorrect trials allowed us to examine those trials separately in a sub-analysis, which showed that the threat vs. non-threat differences in the amygdala and ventral striatum were present even when only incorrect trials were considered. Nonetheless, it will be important in future work to further assess the effects of correct recognition with a design that produces more correct trials and incorporates behavioral metrics of recognition confidence.
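The conservative response bias described above can be quantified with the two-high-threshold discrimination (Pr) and bias (Br) indices of Snodgrass and Corwin (1988), cited in the references; a Br below 0.5 indicates a conservative criterion. The sketch below is illustrative only, not the authors' analysis code, and the trial counts in the usage example are hypothetical.

```python
def recognition_indices(hits, misses, false_alarms, correct_rejections):
    """Two-high-threshold discrimination (Pr) and bias (Br) indices
    after Snodgrass & Corwin (1988)."""
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    pr = hit_rate - fa_rate        # discrimination: hit rate minus false-alarm rate
    br = fa_rate / (1.0 - pr)      # response bias: < 0.5 indicates a conservative criterion
    return pr, br

# Hypothetical counts: few hits and very few false alarms, as in a
# conservative responder with low target accuracy.
pr, br = recognition_indices(hits=10, misses=30,
                             false_alarms=2, correct_rejections=38)
print(pr, br)  # Pr = 0.20, Br = 0.0625 (well below 0.5: conservative bias)
```

Under this model, low target accuracy paired with a low Br is consistent with a strict response criterion rather than random guessing, which would instead yield Pr near zero with Br near 0.5.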

Conclusions

In summary, the current study provides the first evidence for frontolimbic responses that could provide emotional bias signals during facial recognition. The amygdala and OFC appear to signal previously threatening faces, while the ventral striatum may oppose this system by responding to previously non-threatening faces. These differences do not appear to depend on accurate recognition of faces, suggesting a process that does not require explicit awareness. Emotional face memory plays an important role in social cognition, and these results provide an intriguing first look into its neural correlates. Furthermore, such processes could contribute to the deficits in face recognition and social interaction seen in neuropsychiatric conditions such as schizophrenia, mood disorders, and brain injury (Calkins, et al. 2005; Fu, et al. 2008; Gainotti 2007; Holt, et al. 2006; Holt, et al. 2005).

Supplementary Material

Fig S1

Supplementary Figure 1. Task activation. Compared to fixation, the task activated a distributed network including bilateral frontal, parietal, medial temporal, thalamic, anterior cingulate, and insular regions; bilateral amygdala was robustly task-activated as well.

Supplementary Table. Task activation

Fig S2

Supplementary Figure 2. Exploratory analysis of individual emotions. In the left amygdala and right OFC clusters, faces previously seen with threat emotions (anger and fear) evoked significantly greater responses than faces previously seen with non-threat emotions (happy and sad). In the left ventral striatum, previously non-threatening (happy or sad) faces prompted a greater response than previously fearful faces.

Fig S3

Supplementary Figure 3. Exploratory performance-based analysis. Task performance did not significantly influence effects seen in the amygdala, ventral striatum, or OFC.

Acknowledgments

Supported by grants from the National Institute of Mental Health MH 60722 and MH 19112.

PREVIOUS PRESENTATION: NONE

We thank Masaru Tomita and Kathleen Lesko for data collection. Caryn Lerman provided valuable discussion regarding first impressions.

References

  1. Adcock RA, Thangavel A, Whitfield-Gabrieli S, Knutson B, Gabrieli JD. Reward-motivated learning: mesolimbic activation precedes memory formation. Neuron. 2006;50(3):507–17. doi: 10.1016/j.neuron.2006.03.036.
  2. Aharon I, Etcoff N, Ariely D, Chabris CF, O'Connor E, Breiter HC. Beautiful faces have variable reward value: fMRI and behavioral evidence. Neuron. 2001;32(3):537–51. doi: 10.1016/s0896-6273(01)00491-3.
  3. Anderson AK, Christoff K, Panitz D, De Rosa E, Gabrieli JD. Neural correlates of the automatic processing of threat facial signals. J Neurosci. 2003;23(13):5627–33. doi: 10.1523/JNEUROSCI.23-13-05627.2003.
  4. Beauregard M, Leroux JM, Bergman S, Arzoumanian Y, Beaudoin G, Bourgouin P, Stip E. The functional neuroanatomy of major depression: an fMRI study using an emotional activation paradigm. Neuroreport. 1998;9(14):3253–8. doi: 10.1097/00001756-199810050-00022.
  5. Beaver JD, Lawrence AD, Passamonti L, Calder AJ. Appetitive motivation predicts the neural response to facial signals of aggression. J Neurosci. 2008;28(11):2719–25. doi: 10.1523/JNEUROSCI.0033-08.2008.
  6. Bechara A, Tranel D, Damasio H, Adolphs R, Rockland C, Damasio AR. Double dissociation of conditioning and declarative knowledge relative to the amygdala and hippocampus in humans. Science. 1995;269(5227):1115–8. doi: 10.1126/science.7652558.
  7. Blair RJ, Morris JS, Frith CD, Perrett DI, Dolan RJ. Dissociable neural responses to facial expressions of sadness and anger. Brain. 1999;122(Pt 5):883–93. doi: 10.1093/brain/122.5.883.
  8. Bonanno G, Goorin L, Coifman K. Sadness and Grief. In: Lewis M, Haviland-Jones J, Barrett L, editors. Handbook of Emotions. New York, NY: Guilford Publications; 2008.
  9. Breiter HC, Etcoff NL, Whalen PJ, Kennedy WA, Rauch SL, Buckner RL, Strauss MM, Hyman SE, Rosen BR. Response and habituation of the human amygdala during visual processing of facial expression. Neuron. 1996;17(5):875–87. doi: 10.1016/s0896-6273(00)80219-6.
  10. Brewer JA, Worhunsky PD, Carroll KM, Rounsaville BJ, Potenza MN. Pretreatment Brain Activation During Stroop Task Is Associated with Outcomes in Cocaine-Dependent Patients. Biol Psychiatry. 2008. doi: 10.1016/j.biopsych.2008.05.024.
  11. Britton JC, Phan KL, Taylor SF, Welsh RC, Berridge KC, Liberzon I. Neural correlates of social and nonsocial emotions: An fMRI study. Neuroimage. 2006a;31(1):397–409. doi: 10.1016/j.neuroimage.2005.11.027.
  12. Britton JC, Taylor SF, Sudheimer KD, Liberzon I. Facial expressions and complex IAPS pictures: common and differential networks. Neuroimage. 2006b;31(2):906–19. doi: 10.1016/j.neuroimage.2005.12.050.
  13. Calder AJ, Young AW. Understanding the recognition of facial identity and facial expression. Nat Rev Neurosci. 2005;6(8):641–51. doi: 10.1038/nrn1724.
  14. Calkins ME, Gur RC, Ragland JD, Gur RE. Face recognition memory deficits and visual object memory performance in patients with schizophrenia and their relatives. Am J Psychiatry. 2005;162(10):1963–6. doi: 10.1176/appi.ajp.162.10.1963.
  15. Canli T, Zhao Z, Brewer J, Gabrieli JD, Cahill L. Event-related activation in the human amygdala associates with later memory for individual emotional experience. J Neurosci. 2000;20(19):RC99. doi: 10.1523/JNEUROSCI.20-19-j0004.2000.
  16. Carter EJ, Pelphrey KA. Friend or foe? Brain systems involved in the perception of dynamic signals of menacing and friendly social approaches. Soc Neurosci. 2008;3(2):151–63. doi: 10.1080/17470910801903431.
  17. Cavada C, Company T, Tejedor J, Cruz-Rizzolo RJ, Reinoso-Suarez F. The anatomical connections of the macaque monkey orbitofrontal cortex. A review. Cereb Cortex. 2000;10(3):220–42. doi: 10.1093/cercor/10.3.220.
  18. Coccaro EF, McCloskey MS, Fitzgerald DA, Phan KL. Amygdala and orbitofrontal reactivity to social threat in individuals with impulsive aggression. Biol Psychiatry. 2007;62(2):168–78. doi: 10.1016/j.biopsych.2006.08.024.
  19. Dannlowski U, Ohrmann P, Bauer J, Kugel H, Arolt V, Heindel W, Suslow T. Amygdala reactivity predicts automatic negative evaluations for facial emotions. Psychiatry Res. 2007;154(1):13–20. doi: 10.1016/j.pscychresns.2006.05.005.
  20. Davis M, Whalen PJ. The amygdala: vigilance and emotion. Mol Psychiatry. 2001;6(1):13–34. doi: 10.1038/sj.mp.4000812.
  21. Dolan RJ, Lane R, Chua P, Fletcher P. Dissociable temporal lobe activations during emotional episodic memory retrieval. Neuroimage. 2000;11(3):203–9. doi: 10.1006/nimg.2000.0538.
  22. Dolcos F, LaBar KS, Cabeza R. Interaction between the amygdala and the medial temporal lobe memory system predicts better memory for emotional events. Neuron. 2004;42(5):855–63. doi: 10.1016/s0896-6273(04)00289-2.
  23. Dougherty DD, Shin LM, Alpert NM, Pitman RK, Orr SP, Lasko M, Macklin ML, Fischman AJ, Rauch SL. Anger in healthy men: a PET study using script-driven imagery. Biol Psychiatry. 1999;46(4):466–72. doi: 10.1016/s0006-3223(99)00063-3.
  24. Eisenberg N, Fabes RA, Miller PA, Fultz J, Shell R, Mathy RM, Reno RR. Relation of sympathy and personal distress to prosocial behavior: a multimethod study. J Pers Soc Psychol. 1989;57(1):55–66. doi: 10.1037//0022-3514.57.1.55.
  25. Eisenberg N, Miller PA. The relation of empathy to prosocial and related behaviors. Psychol Bull. 1987;101(1):91–119.
  26. Evans KC, Wright CI, Wedig MM, Gold AL, Pollack MH, Rauch SL. A functional MRI study of amygdala responses to angry schematic faces in social anxiety disorder. Depress Anxiety. 2008;25(6):496–505. doi: 10.1002/da.20347.
  27. Ewbank MP, Lawrence AD, Passamonti L, Keane J, Peers PV, Calder AJ. Anxiety predicts a differential neural response to attended and unattended facial signals of anger and fear. Neuroimage. 2008. doi: 10.1016/j.neuroimage.2008.09.056.
  28. Fitzgerald DA, Angstadt M, Jelsone LM, Nathan PJ, Phan KL. Beyond threat: amygdala reactivity across multiple expressions of facial affect. Neuroimage. 2006;30(4):1441–8. doi: 10.1016/j.neuroimage.2005.11.003.
  29. Fliessbach K, Weber B, Trautner P, Dohmen T, Sunde U, Elger CE, Falk A. Social comparison affects reward-related brain activity in the human ventral striatum. Science. 2007;318(5854):1305–8. doi: 10.1126/science.1145876.
  30. Fox PT, Lancaster JL. Neuroscience on the net. Science. 1994;266(5187):994–6. doi: 10.1126/science.7973682.
  31. Frey S, Petrides M. Orbitofrontal cortex and memory formation. Neuron. 2002;36(1):171–6. doi: 10.1016/s0896-6273(02)00901-7.
  32. Friston KJ. Testing for anatomically specified regional effects. Hum Brain Mapp. 1997;5(2):133–6. doi: 10.1002/(sici)1097-0193(1997)5:2<133::aid-hbm7>3.0.co;2-4.
  33. Fu CH, Williams SC, Brammer MJ, Suckling J, Kim J, Cleare AJ, Walsh ND, Mitterschiffthaler MT, Andrew CM, Pich EM, et al. Neural responses to happy facial expressions in major depression following antidepressant treatment. Am J Psychiatry. 2007;164(4):599–607. doi: 10.1176/ajp.2007.164.4.599.
  34. Fu CH, Williams SC, Cleare AJ, Brammer MJ, Walsh ND, Kim J, Andrew CM, Pich EM, Williams PM, Reed LJ, et al. Attenuation of the neural response to sad faces in major depression by antidepressant treatment: a prospective, event-related functional magnetic resonance imaging study. Arch Gen Psychiatry. 2004;61(9):877–89. doi: 10.1001/archpsyc.61.9.877.
  35. Fu CH, Williams SC, Cleare AJ, Scott J, Mitterschiffthaler MT, Walsh ND, Donaldson C, Suckling J, Andrew C, Steiner H. Neural Responses to Sad Facial Expressions in Major Depression Following Cognitive Behavioral Therapy. Biol Psychiatry. 2008. doi: 10.1016/j.biopsych.2008.04.033.
  36. Gainotti G. Face familiarity feelings, the right temporal lobe and the possible underlying neural mechanisms. Brain Res Rev. 2007;56(1):214–35. doi: 10.1016/j.brainresrev.2007.07.009.
  37. Gobbini MI, Haxby JV. Neural systems for recognition of familiar faces. Neuropsychologia. 2007;45(1):32–41. doi: 10.1016/j.neuropsychologia.2006.04.015.
  38. Gray J. Brain systems that mediate both emotion and cognition. Cognition and Emotion. 1990;4:269–288.
  39. Grosbras MH, Laird AR, Paus T. Cortical regions involved in eye movements, shifts of attention, and gaze perception. Hum Brain Mapp. 2005;25(1):140–54. doi: 10.1002/hbm.20145.
  40. Grossman ED, Blake R. Brain Areas Active during Visual Perception of Biological Motion. Neuron. 2002;35(6):1167–75. doi: 10.1016/s0896-6273(02)00897-8.
  41. Gur RC, Sara R, Hagendoorn M, Marom O, Hughett P, Macy L, Turner T, Bajcsy R, Posner A, Gur RE. A method for obtaining 3-dimensional facial expressions and its standardization for use in neurocognitive studies. J Neurosci Methods. 2002a;115(2):137–43. doi: 10.1016/s0165-0270(02)00006-7.
  42. Gur RC, Schroeder L, Turner T, McGrath C, Chan RM, Turetsky BI, Alsop D, Maldjian J, Gur RE. Brain activation during facial emotion processing. Neuroimage. 2002b;16(3 Pt 1):651–62. doi: 10.1006/nimg.2002.1097.
  43. Gur RE, Loughead J, Kohler CG, Elliott MA, Lesko K, Ruparel K, Wolf DH, Bilker WB, Gur RC. Limbic activation associated with misidentification of fearful faces and flat affect in schizophrenia. Arch Gen Psychiatry. 2007;64(12):1356–66. doi: 10.1001/archpsyc.64.12.1356.
  44. Gur RE, McGrath C, Chan RM, Schroeder L, Turner T, Turetsky BI, Kohler C, Alsop D, Maldjian J, Ragland JD, et al. An fMRI study of facial emotion processing in patients with schizophrenia. Am J Psychiatry. 2002c;159(12):1992–9. doi: 10.1176/appi.ajp.159.12.1992.
  45. Guroglu B, Haselager GJ, van Lieshout CF, Takashima A, Rijpkema M, Fernandez G. Why are friends special? Implementing a social interaction simulation task to probe the neural correlates of friendship. Neuroimage. 2008;39(2):903–10. doi: 10.1016/j.neuroimage.2007.09.007.
  46. Hariri AR, Bookheimer SY, Mazziotta JC. Modulating emotional responses: effects of a neocortical network on the limbic system. Neuroreport. 2000;11(1):43–8. doi: 10.1097/00001756-200001170-00009.
  47. Hariri AR, Mattay VS, Tessitore A, Fera F, Weinberger DR. Neocortical modulation of the amygdala response to fearful stimuli. Biol Psychiatry. 2003;53(6):494–501. doi: 10.1016/s0006-3223(02)01786-9.
  48. Haxby JV, Ungerleider LG, Clark VP, Schouten JL, Hoffman EA, Martin A. The effect of face inversion on activity in human neural systems for face and object perception. Neuron. 1999;22(1):189–99. doi: 10.1016/s0896-6273(00)80690-x.
  49. Hein G, Knight RT. Superior temporal sulcus--It's my area: or is it? J Cogn Neurosci. 2008;20(12):2125–36. doi: 10.1162/jocn.2008.20148.
  50. Hoffman EA, Haxby JV. Distinct representations of eye gaze and identity in the distributed human neural system for face perception. Nat Neurosci. 2000;3(1):80–4. doi: 10.1038/71152.
  51. Holt DJ, Kunkel L, Weiss AP, Goff DC, Wright CI, Shin LM, Rauch SL, Hootnick J, Heckers S. Increased medial temporal lobe activation during the passive viewing of emotional and neutral facial expressions in schizophrenia. Schizophr Res. 2006;82(2-3):153–62. doi: 10.1016/j.schres.2005.09.021.
  52. Holt DJ, Weiss AP, Rauch SL, Wright CI, Zalesak M, Goff DC, Ditman T, Welsh RC, Heckers S. Sustained activation of the hippocampus in response to fearful faces in schizophrenia. Biol Psychiatry. 2005;57(9):1011–9. doi: 10.1016/j.biopsych.2005.01.033.
  53. Ishai A, Yago E. Recognition memory of newly learned faces. Brain Res Bull. 2006;71(1-3):167–73. doi: 10.1016/j.brainresbull.2006.08.017.
  54. Jenkinson M, Bannister P, Brady M, Smith S. Improved optimization for the robust and accurate linear registration and motion correction of brain images. Neuroimage. 2002;17(2):825–41. doi: 10.1016/s1053-8119(02)91132-8.
  55. Killgore WD, Yurgelun-Todd DA. Activation of the amygdala and anterior cingulate during nonconscious processing of sad versus happy faces. Neuroimage. 2004;21(4):1215–23. doi: 10.1016/j.neuroimage.2003.12.033.
  56. Kim H, Adolphs R, O'Doherty JP, Shimojo S. Temporal isolation of neural processes underlying face preference decisions. Proc Natl Acad Sci U S A. 2007;104(46):18253–8. doi: 10.1073/pnas.0703101104.
  57. Knutson B, Adams CM, Fong GW, Hommer D. Anticipation of increasing monetary reward selectively recruits nucleus accumbens. J Neurosci. 2001;21(16):RC159. doi: 10.1523/JNEUROSCI.21-16-j0002.2001.
  58. Kohler CG, Bilker W, Hagendoorn M, Gur RE, Gur RC. Emotion recognition deficit in schizophrenia: association with symptomatology and cognition. Biol Psychiatry. 2000;48(2):127–36. doi: 10.1016/s0006-3223(00)00847-7.
  59. LaBar KS, Cabeza R. Cognitive neuroscience of emotional memory. Nat Rev Neurosci. 2006;7(1):54–64. doi: 10.1038/nrn1825.
  60. LaBar KS, LeDoux JE, Spencer DD, Phelps EA. Impaired fear conditioning following unilateral temporal lobectomy in humans. J Neurosci. 1995;15(10):6846–55. doi: 10.1523/JNEUROSCI.15-10-06846.1995.
  61. LeDoux JE. Emotion circuits in the brain. Annu Rev Neurosci. 2000;23:155–84. doi: 10.1146/annurev.neuro.23.1.155.
  62. Lee BT, Seok JH, Lee BC, Cho SW, Yoon BJ, Lee KU, Chae JH, Choi IG, Ham BJ. Neural correlates of affective processing in response to sad and angry facial stimuli in patients with major depressive disorder. Prog Neuropsychopharmacol Biol Psychiatry. 2008;32(3):778–85. doi: 10.1016/j.pnpbp.2007.12.009.
  63. Lewis M, Haviland-Jones J. Handbook of Emotions. New York, NY: Guilford Publications; 2008.
  64. Liberzon I, King AP, Britton JC, Phan KL, Abelson JL, Taylor SF. Paralimbic and medial prefrontal cortical involvement in neuroendocrine responses to traumatic stimuli. Am J Psychiatry. 2007;164(8):1250–8. doi: 10.1176/appi.ajp.2007.06081367.
  65. Loughead J, Gur RC, Elliott M, Gur RE. Neural circuitry for accurate identification of facial emotions. Brain Res. 2008;1194:37–44. doi: 10.1016/j.brainres.2007.10.105.
  66. Maratos EJ, Dolan RJ, Morris JS, Henson RN, Rugg MD. Neural activity associated with episodic memory for emotional context. Neuropsychologia. 2001;39(9):910–20. doi: 10.1016/s0028-3932(01)00025-2.
  67. Mathalon DH, Ford JM. Divergent approaches converge on frontal lobe dysfunction in schizophrenia. Am J Psychiatry. 2008;165(8):944–8. doi: 10.1176/appi.ajp.2008.08050735.
  68. Monk CS, Telzer EH, Mogg K, Bradley BP, Mai X, Louro HM, Chen G, McClure-Tone EB, Ernst M, Pine DS. Amygdala and ventrolateral prefrontal cortex activation to masked angry faces in children and adolescents with generalized anxiety disorder. Arch Gen Psychiatry. 2008;65(5):568–76. doi: 10.1001/archpsyc.65.5.568.
  69. Morris JS, Dolan RJ. Involvement of human amygdala and orbitofrontal cortex in hunger-enhanced memory for food stimuli. J Neurosci. 2001;21(14):5304–10. doi: 10.1523/JNEUROSCI.21-14-05304.2001.
  70. Morris JS, Frith CD, Perrett DI, Rowland D, Young AW, Calder AJ, Dolan RJ. A differential neural response in the human amygdala to fearful and happy facial expressions. Nature. 1996;383(6603):812–5. doi: 10.1038/383812a0.
  71. Nomura M, Ohira H, Haneda K, Iidaka T, Sadato N, Okada T, Yonekura Y. Functional association of the amygdala and ventral prefrontal cortex during cognitive evaluation of facial expressions primed by masked angry faces: an event-related fMRI study. Neuroimage. 2004;21(1):352–63. doi: 10.1016/j.neuroimage.2003.09.021.
  72. O'Doherty J, Kringelbach ML, Rolls ET, Hornak J, Andrews C. Abstract reward and punishment representations in the human orbitofrontal cortex. Nat Neurosci. 2001;4(1):95–102. doi: 10.1038/82959.
  73. Pessoa L. To what extent are emotional visual stimuli processed without attention and awareness? Curr Opin Neurobiol. 2005;15(2):188–96. doi: 10.1016/j.conb.2005.03.002.
  74. Pessoa L, Padmala S, Morland T. Fate of unattended fearful faces in the amygdala is determined by both attentional resources and cognitive modulation. Neuroimage. 2005;28(1):249–55. doi: 10.1016/j.neuroimage.2005.05.048.
  75. Phan KL, Taylor SF, Welsh RC, Ho SH, Britton JC, Liberzon I. Neural correlates of individual ratings of emotional salience: a trial-related fMRI study. Neuroimage. 2004;21(2):768–80. doi: 10.1016/j.neuroimage.2003.09.072.
  76. Phan KL, Wager T, Taylor SF, Liberzon I. Functional neuroanatomy of emotion: a meta-analysis of emotion activation studies in PET and fMRI. Neuroimage. 2002;16(2):331–48. doi: 10.1006/nimg.2002.1087.
  77. Phelps EA. Human emotion and memory: interactions of the amygdala and hippocampal complex. Curr Opin Neurobiol. 2004;14(2):198–202. doi: 10.1016/j.conb.2004.03.015.
  78. Phelps EA, LeDoux JE. Contributions of the amygdala to emotion processing: from animal models to human behavior. Neuron. 2005;48(2):175–87. doi: 10.1016/j.neuron.2005.09.025.
  79. Rilling JK, Dagenais JE, Goldsmith DR, Glenn AL, Pagnoni G. Social cognitive neural networks during in-group and out-group interactions. Neuroimage. 2008;41(4):1447–1461. doi: 10.1016/j.neuroimage.2008.03.044.
  80. Rolls ET. The orbitofrontal cortex and reward. Cereb Cortex. 2000;10(3):284–94. doi: 10.1093/cercor/10.3.284.
  81. Satterthwaite TD, Green L, Myerson J, Parker J, Ramaratnam M, Buckner RL. Dissociable but inter-related systems of cognitive control and reward during decision making: evidence from pupillometry and event-related fMRI. Neuroimage. 2007;37(3):1017–31. doi: 10.1016/j.neuroimage.2007.04.066.
  82. Schultz W. Getting formal with dopamine and reward. Neuron. 2002;36(2):241–63. doi: 10.1016/s0896-6273(02)00967-4.
  83. Shannon BJ, Buckner RL. Functional-anatomic correlates of memory retrieval that suggest nontraditional processing roles for multiple distinct regions within posterior parietal cortex. J Neurosci. 2004;24(45):10084–92. doi: 10.1523/JNEUROSCI.2625-04.2004.
  84. Smith AP, Henson RN, Dolan RJ, Rugg MD. fMRI correlates of the episodic retrieval of emotional contexts. Neuroimage. 2004;22(2):868–78. doi: 10.1016/j.neuroimage.2004.01.049.
  85. Smith SM. Fast robust automated brain extraction. Hum Brain Mapp. 2002;17(3):143–55. doi: 10.1002/hbm.10062.
  86. Snodgrass JG, Corwin J. Pragmatics of measuring recognition memory: applications to dementia and amnesia. J Exp Psychol Gen. 1988;117(1):34–50. doi: 10.1037//0096-3445.117.1.34.
  87. Sommer M, Dohnel K, Meinhardt J, Hajak G. Decoding of affective facial expressions in the context of emotional situations. Neuropsychologia. 2008. doi: 10.1016/j.neuropsychologia.2008.04.020.
  88. Sprengelmeyer R, Rausch M, Eysel UT, Przuntek H. Neural structures associated with recognition of facial expressions of basic emotions. Proc Biol Sci. 1998;265(1409):1927–31. doi: 10.1098/rspb.1998.0522.
  89. Stein MB, Goldin PR, Sareen J, Zorrilla LT, Brown GG. Increased amygdala activation to angry and contemptuous faces in generalized social phobia. Arch Gen Psychiatry. 2002;59(11):1027–34. doi: 10.1001/archpsyc.59.11.1027.
  90. Suslow T, Ohrmann P, Bauer J, Rauch AV, Schwindt W, Arolt V, Heindel W, Kugel H. Amygdala activation during masked presentation of emotional faces predicts conscious detection of threat-related faces. Brain Cogn. 2006;61(3):243–8. doi: 10.1016/j.bandc.2006.01.005.
  91. Talairach J, Tournoux P. Co-planar stereotaxic atlas of the human brain. New York: Thieme; 1988.
  92. Taylor SF, Liberzon I, Fig LM, Decker LR, Minoshima S, Koeppe RA. The effect of emotional content on visual recognition memory: a PET activation study. Neuroimage. 1998;8(2):188–97. doi: 10.1006/nimg.1998.0356.
  93. Todorov A, Gobbini MI, Evans KK, Haxby JV. Spontaneous retrieval of affective person knowledge in face perception. Neuropsychologia. 2007;45(1):163–73. doi: 10.1016/j.neuropsychologia.2006.04.018.
  94. Tzourio-Mazoyer N, Landeau B, Papathanassiou D, Crivello F, Etard O, Delcroix N, Mazoyer B, Joliot M. Automated anatomical labeling of activations in SPM using a macroscopic anatomical parcellation of the MNI MRI single-subject brain. Neuroimage. 2002;15(1):273–89. doi: 10.1006/nimg.2001.0978.
  95. Vuilleumier P, Driver J. Modulation of visual processing by attention and emotion: windows on causal interactions between human brain regions. Philos Trans R Soc Lond B Biol Sci. 2007;362(1481):837–55. doi: 10.1098/rstb.2007.2092.
  96. Vuilleumier P, Pourtois G. Distributed and interactive brain mechanisms during emotion face perception: evidence from functional neuroimaging. Neuropsychologia. 2007;45(1):174–94. doi: 10.1016/j.neuropsychologia.2006.06.003.
  97. Whalen PJ, Rauch SL, Etcoff NL, McInerney SC, Lee MB, Jenike MA. Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. J Neurosci. 1998;18(1):411–8. doi: 10.1523/JNEUROSCI.18-01-00411.1998.
  98. Whalen PJ, Shin LM, McInerney SC, Fischer H, Wright CI, Rauch SL. A functional MRI study of human amygdala responses to facial expressions of fear versus anger. Emotion. 2001;1(1):70–83. doi: 10.1037/1528-3542.1.1.70.
  99. Wheeler ME, Buckner RL. Functional dissociation among components of remembering: control, perceived oldness, and content. J Neurosci. 2003;23(9):3869–80. doi: 10.1523/JNEUROSCI.23-09-03869.2003.
  100. Williams LM, Liddell BJ, Kemp AH, Bryant RA, Meares RA, Peduto AS, Gordon E. Amygdala-prefrontal dissociation of subliminal and supraliminal fear. Hum Brain Mapp. 2006;27(8):652–61. doi: 10.1002/hbm.20208.
  101. Williams MA, McGlone F, Abbott DF, Mattingley JB. Stimulus-driven and strategic neural responses to fearful and happy facial expressions in humans. Eur J Neurosci. 2008;27(11):3074–82. doi: 10.1111/j.1460-9568.2008.06264.x.
  102. Williams MA, Morris AP, McGlone F, Abbott DF, Mattingley JB. Amygdala responses to fearful and happy facial expressions under conditions of binocular suppression. J Neurosci. 2004;24(12):2898–904. doi: 10.1523/JNEUROSCI.4977-03.2004.
  103. Woolrich MW, Ripley BD, Brady M, Smith SM. Temporal autocorrelation in univariate linear modeling of FMRI data. Neuroimage. 2001;14(6):1370–86. doi: 10.1006/nimg.2001.0931.
  104. Zink CF, Pagnoni G, Chappelow J, Martin-Skurski M, Berns GS. Human striatal activation reflects degree of stimulus saliency. Neuroimage. 2006;29(3):977–83. doi: 10.1016/j.neuroimage.2005.08.006.
  105. Zink CF, Pagnoni G, Martin ME, Dhamala M, Berns GS. Human striatal response to salient nonrewarding stimuli. J Neurosci. 2003;23(22):8092–7. doi: 10.1523/JNEUROSCI.23-22-08092.2003.
  106. Zink CF, Tong Y, Chen Q, Bassett DS, Stein JL, Meyer-Lindenberg A. Know your place: neural processing of social hierarchy in humans. Neuron. 2008;58(2):273–83. doi: 10.1016/j.neuron.2008.01.025.
