Abstract
The amygdala plays a central role in processing facial affect, responding to diverse expressions and features shared between expressions. Although speculation exists regarding the nature of relationships between expression- and feature-specific amygdala reactivity, this matter has not been fully explored. We used functional magnetic resonance imaging and principal component analysis (PCA) in a sample of 300 young adults, to investigate patterns related to expression- and feature-specific amygdala reactivity to faces displaying neutral, fearful, angry or surprised expressions. The PCA revealed a two-dimensional correlation structure that distinguished emotional categories. The first principal component separated neutral and surprised from fearful and angry expressions, whereas the second principal component separated neutral and angry from fearful and surprised expressions. This two-dimensional correlation structure of amygdala reactivity may represent specific feature-based cues conserved across discrete expressions. To delineate which feature-based cues characterized this pattern, face stimuli were averaged and then subtracted according to their principal component loadings. The first principal component corresponded to displacement of the eyebrows, whereas the second principal component corresponded to increased exposure of eye whites together with movement of the brow. Our results suggest a convergent representation of facial affect in the amygdala reflecting feature-based processing of discrete expressions.
Keywords: affect, classification, correlation, fMRI, perception, principal component analysis
INTRODUCTION
Interpretation of emotional signals represented in the human face is necessary for proper social interaction and is frequently aberrant in psychiatric disorders. Quantitative meta-analyses of neuroimaging studies using emotionally expressive face stimuli reveal that the amygdala is engaged by the majority of emotional categories investigated (Sergerie et al., 2008; Fusar-Poli et al., 2009). However, differences in amygdala reactivity to specific emotional expressions have also been reported (Fusar-Poli et al., 2009; Vytal and Hamann, 2010). Two models have been proposed to explain these patterns. The first model posits that the amygdala responds more strongly to certain categories of emotional expression (Adolphs, 2002), while the second argues that the amygdala is responsive to certain feature-based cues represented to varying degrees across expressions (Engell et al., 2007). The latter has been supported by both behavioral and neuroimaging studies (Jacobs et al., 2012). Evidence for the former model comes chiefly from studies suggesting that angry facial expressions are more easily recognized than happy facial expressions in a crowd of neutral faces (Ohman et al., 2001). However, research demonstrates that different low-level perceptual features can influence reaction times to schematic faces and abstract forms during this paradigm (Coelho et al., 2010), suggesting that the so-called ‘face in the crowd’ effect stems from certain feature-based cues receiving privileged processing rather than from broad categories of emotional expression.
Neuroimaging experiments have demonstrated that certain stimulus features preferentially engage the amygdala. For example, the amygdala is more responsive to the letter ‘V’, which has similar features to a frowning face, than to an inverted ‘V’ (Larson et al., 2009). The amygdala is also more responsive to sharp as opposed to curved objects (Bar and Neta, 2007) and shows increased activity to specific facial features associated with emotional arousal such as large eye whites (Whalen et al., 2004) or increased pupil size (Demos et al., 2008) irrespective of the emotional expression. Thus, amygdala reactivity to emotional expressions may reflect features represented to varying degrees across categories of facial expressions. If this is indeed the case, one would expect amygdala reactivity to correlate across categories of facial expression as the amygdala would track these specific features regardless of the categorical emotional expression of the face.
To address these predictions, we used blood oxygen level-dependent functional magnetic resonance imaging (BOLD fMRI) to measure amygdala reactivity to neutral, fearful, angry and surprised facial expressions in a cohort of 300 young adults. These specific expressions were employed because our parent protocol (see ‘Methods’ section) seeks to map neurogenetic correlates of individual differences in stress responsiveness and threat sensitivity, and because previous research suggests that each conveys unique information about potential threat in the environment. Fearful and angry expressions are hypothesized to represent implicit and explicit threat, respectively, as well as conditioned stimuli predicting negative outcomes (Davis et al., 2011). Neutral expressions represent threat arising from encounters with novel conspecifics. Surprised expressions represent threat arising from ambiguity regarding the nature of one’s local environment (i.e. positive or negative unexpected events can elicit surprise). BOLD fMRI amygdala responses to each category of emotional expression were then subjected to a principal component analysis (PCA) to determine whether principal component loadings reliably distinguished between emotion categories, indicating a well-defined correlational structure. We then analyzed which features corresponded to the loadings on the principal components by averaging the face stimuli according to their weightings on the two principal components. In doing so, we attempted to address whether amygdala responsiveness reflects specific stimulus features varying across categorical emotional expressions.
METHODS
Participants
A total of 350 subjects completed an ongoing parent protocol, the Duke Neurogenetics Study (DNS), which assesses a wide range of behavioral and biological traits among non-patient, young adult, student volunteers. In total, 35 participants were excluded for poor accuracy (<75%) during the fMRI task, three were excluded based on anatomical abnormalities, three were excluded because of scanner malfunction and nine were excluded based on poor data quality (for details, see ‘BOLD fMRI data preprocessing’ section), leaving data from 300 participants (174 females; mean age = 19.6 ± 1.3 years) in the current analyses. All participants provided informed consent in accordance with Duke University guidelines and were in good general health. Participants were free of the following study exclusions: (i) medical diagnoses of cancer, stroke, head injury with loss of consciousness, untreated migraine headaches, diabetes requiring insulin treatment, chronic kidney or liver disease, or lifetime history of psychotic symptoms; (ii) use of psychotropic, glucocorticoid or hypolipidemic medication; and (iii) conditions affecting cerebral blood flow and metabolism (e.g. hypertension). Participants were not excluded based on diagnosis of any past or current Axis I disorder as defined by the Diagnostic and Statistical Manual of Mental Disorders, 4th Edition (DSM-IV) (First et al., 1995), as the DNS seeks to establish broad variability in multiple behavioral phenotypes related to psychopathology. However, as stated earlier, no subjects were taking psychotropic medication at the time of, or for at least 10 days prior to, study participation.
Amygdala reactivity paradigm
The experimental fMRI paradigm (Figure 1) consists of four blocks of a face-processing task interleaved with five blocks of a sensorimotor control task (Brown et al., 2005; Fakra et al., 2009; Hariri et al., 2009). Participant performance (accuracy and reaction time) is monitored during all scans using an MR-compatible button box. During the face-processing task, participants viewed a trio of faces (with neutral expressions or expressing anger, fear or surprise) and selected one of two faces (bottom) identical to a target face (top). Each face-processing block consisted of six images of the same expression, balanced for gender, all of which are derived from a standard set of pictures of facial affect (Ekman and Friesen, 1976). The order of the face-processing blocks was counter-balanced across participants. During the sensorimotor control blocks, participants viewed a trio of simple geometric shapes (circles and ellipses) and selected one of two shapes (bottom) identical to a target shape (top). Each sensorimotor control block consisted of six different shape trios. All blocks were preceded by a brief instruction (‘Match faces’ or ‘Match shapes’) that lasted 2 s. In the face-processing blocks, each of the six face trios was presented for 4 s with a variable interstimulus interval (ISI) of 2–6 s (mean, 4 s), for a total block length of 48 s. A variable ISI was used to minimize expectancy effects and resulting habituation, and maximize amygdala reactivity throughout the paradigm. In the sensorimotor control blocks, each of the six shape trios was presented for 4 s with a fixed interstimulus interval of 2 s, for a total block length of 36 s. Total task length was 390 s.
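The block and task durations reported above follow directly from the stated trial timings; a minimal sketch, using only the values given in the text, confirms the arithmetic:

```python
# Timing parameters taken verbatim from the paradigm description.
INSTRUCTION_S = 2        # "Match faces" / "Match shapes" cue duration
TRIAL_S = 4              # each face or shape trio is shown for 4 s
FACE_ISI_MEAN_S = 4      # variable ISI of 2-6 s with a mean of 4 s
SHAPE_ISI_S = 2          # fixed ISI in sensorimotor control blocks
TRIALS_PER_BLOCK = 6

# Block lengths: six trials plus their (mean) interstimulus intervals.
face_block_s = TRIALS_PER_BLOCK * (TRIAL_S + FACE_ISI_MEAN_S)   # 48 s
shape_block_s = TRIALS_PER_BLOCK * (TRIAL_S + SHAPE_ISI_S)      # 36 s

# Nine instruction screens precede the 4 face and 5 shape blocks.
total_s = 9 * INSTRUCTION_S + 4 * face_block_s + 5 * shape_block_s

print(face_block_s, shape_block_s, total_s)  # 48 36 390
```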
Fig. 1.
Amygdala reactivity paradigm. Participants match faces or geometric shapes. Face matching blocks contain neutral, angry, fearful or surprised expressions with block order counterbalanced across participants.
BOLD fMRI data acquisition
Each participant was scanned using a research-dedicated GE MR750 3T scanner equipped with high-power high-duty-cycle 50-mT/m gradients at 200 T/m/s slew rate and an eight-channel head coil for parallel imaging at high bandwidth up to 1 MHz at the Duke-UNC Brain Imaging and Analysis Center. A semiautomated high-order shimming program was used to ensure global field homogeneity. A series of 34 interleaved axial functional slices aligned with the anterior commissure–posterior commissure plane were acquired for full-brain coverage using an inverse-spiral pulse sequence to reduce susceptibility artifact (TR/TE/flip angle = 2000 ms/30 ms/60°; FOV = 240 mm; 3.75 × 3.75 × 4 mm voxels; interslice skip = 0). Four initial RF excitations were performed (and discarded) to achieve steady-state equilibrium. To allow for spatial registration of each participant’s data to a standard coordinate system, structural images were acquired in 34 axial slices co-planar with the functional scans (TR/TE/flip angle = 7.7 s/3.0 ms/12°; voxel size = 0.9 × 0.9 × 4 mm; FOV = 240 mm; interslice skip = 0).
BOLD fMRI data preprocessing
Preprocessing was conducted using SPM8 (www.fil.ion.ucl.ac.uk/spm). Images for each participant were realigned to the first volume in the time series to correct for head motion, spatially normalized into a standard stereotactic space (Montreal Neurological Institute template) using a 12-parameter affine model (final resolution of functional images = 2 mm isotropic voxels) and smoothed to minimize noise and residual differences in gyral anatomy with a Gaussian filter set at 6 mm full-width at half-maximum. Next, the Artifact Detection Tool (ART; http://www.nitrc.org/projects/artifact_detect) was used to generate regressors accounting for the possible confounding effects of volumes with large motion deflections. Specifically, individual whole-brain BOLD fMRI volumes whose mean signal intensity deviated by more than 4 standard deviations from the mean signal of all volumes in the time series, or individual volumes where scan-to-scan movement exceeded 2 mm translation or 2° rotation in any direction, were assigned a lower weight in analyses. Five participants had >5% outlier volumes and were excluded from analysis.
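The two outlier criteria above (global-signal deviation and scan-to-scan motion) can be sketched as follows. This is a hypothetical NumPy illustration of the flagging logic, not the ART implementation itself; the function name and thresholds mirror the values stated in the text.

```python
import numpy as np

def flag_outlier_volumes(volume_means, translations_mm, rotations_deg,
                         z_thresh=4.0, trans_thresh=2.0, rot_thresh=2.0):
    """Flag volumes by the two criteria described in the text (sketch only).

    A volume is flagged if its mean signal deviates by more than 4 SD from
    the time-series mean, or if scan-to-scan movement exceeds 2 mm of
    translation or 2 degrees of rotation in any direction.
    """
    volume_means = np.asarray(volume_means, dtype=float)
    z = (volume_means - volume_means.mean()) / volume_means.std()
    intensity_flags = np.abs(z) > z_thresh

    # Scan-to-scan motion: absolute difference between consecutive volumes.
    motion_flags = np.zeros_like(intensity_flags)
    motion_flags[1:] = (
        (np.abs(np.diff(np.asarray(translations_mm), axis=0)) > trans_thresh).any(axis=1)
        | (np.abs(np.diff(np.asarray(rotations_deg), axis=0)) > rot_thresh).any(axis=1)
    )
    return intensity_flags | motion_flags
```

In the actual pipeline such flags become nuisance regressors (down-weighting the volumes) rather than hard exclusions, except for the participants whose outlier fraction exceeded 5%.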
BOLD fMRI data analysis
The general linear model of SPM8 (http://www.fil.ion.ucl.ac.uk/spm) was used to conduct fMRI data analyses. Following preprocessing, linear contrasts employing canonical hemodynamic response functions were used to estimate main effects of expression (e.g. Fear > Shapes) for each individual. Individual contrast images were then used in second-level random effects models accounting for scan-to-scan and participant-to-participant variability to determine mean condition-specific regional responses using one-sample t-tests with a voxel level statistical threshold of p < 0.05, family wise error corrected for multiple comparisons across the entire search volume, and a cluster threshold of 10 contiguous voxels. Contrast estimates were then extracted from functional clusters exhibiting a main effect of task using the above threshold within anatomically defined amygdala regions of interest (ROIs).
Data quality assurance (QA)
Because of the relatively extensive signal loss and noise typically observed in the amygdala, single-subject BOLD fMRI data were included in subsequent analyses only if there was a minimum of 90% signal coverage in the amygdala, defined using the Automated Anatomical Labeling atlas in Wake Forest University Pick Atlas software (Maldjian et al., 2003). Four participants were excluded for having coverage <90% in bilateral amygdala ROIs.
Principal component analysis
PCA was used to visualize the pattern of correlations between right and left amygdala reactivity to blocks of fearful, angry, neutral or surprised facial expressions. We employed this strategy because principal components are easily interpretable, yielding distinct loadings for well-defined correlational structures and non-separable loadings in the absence of clear patterns. PCA was implemented in Matlab (The Mathworks inc., Natick, MA, USA) to decompose the correlation matrix (matrix size: 8 × 8) estimated from the sample (n = 300) into principal components. A scatter plot of the first two principal component loadings was used to identify whether the principal component loadings differentiated between emotional expression categories.
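The decomposition described above can be sketched in a few lines. The study used Matlab; this is an equivalent Python/NumPy illustration in which random numbers stand in for the 300 × 8 matrix of contrast estimates (2 hemispheres × 4 expressions per participant):

```python
import numpy as np

# Placeholder data standing in for the participants x measures matrix of
# amygdala contrast estimates (300 participants, 8 measures).
rng = np.random.default_rng(0)
reactivity = rng.standard_normal((300, 8))

# Estimate the 8 x 8 correlation matrix and decompose it into principal
# components via eigendecomposition of the symmetric matrix.
corr = np.corrcoef(reactivity, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)

# eigh returns ascending eigenvalues; reorder so PC1 explains most variance.
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()   # proportion of variance per component
loadings = eigvecs[:, :2]             # 8 x 2 loadings used for the scatter plot
```

With the real data, plotting the two columns of `loadings` against each other reproduces the scatter plot used to inspect whether expression categories separate.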
Next we asked if the correlation structure observed in the full sample was robust enough to be detected in smaller samples. To this end, the total sample was divided into subsets of 10–150 participants, increasing sample size by steps of 10 and the performance of a linear classifier was tested on the principal component loadings (matrix size: 8 × 2) in each subset. The classification model consisted of a linear discriminant function (Fisher, 1936), trained and tested on non-overlapping data partitions using Monte Carlo re-sampling. For each sample size, we used holdout cross-validation with 1000 re-samplings to estimate the mean and standard error of the classification accuracy, which was compared to accuracy when amygdala reactivity was randomly assigned to categories with a paired t-test. The significance level for the t-test was set to P < 0.05. Principal components were extracted separately for the test and training sample. Because the direction of principal components is arbitrary, we tested whether those from the training and test sets were similar in orientation by comparing the correlation coefficient between the two. If the correlation coefficient was <0, the principal component in the test sample was multiplied by −1 to ensure that axis orientations were matched. To compare whether more information related to emotion categories was contained in the principal component loadings than in amygdala reactivity to separate emotions, the classification performance using right and left amygdala BOLD parameter estimates for each emotional category was also evaluated for each sample size.
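The sign-alignment step described above is the one non-obvious detail of this procedure, so a small sketch may help. This is an illustrative NumPy version (the function name is ours): a component extracted from the test sample is flipped whenever it correlates negatively with the corresponding training component, so that the two axes share an orientation before classification.

```python
import numpy as np

def align_components(train_loadings, test_loadings):
    """Match the arbitrary sign of test-set principal components to the
    training set by flipping any component that correlates negatively
    with its training counterpart (sketch of the step in the text)."""
    aligned = test_loadings.copy()
    for k in range(aligned.shape[1]):
        r = np.corrcoef(train_loadings[:, k], aligned[:, k])[0, 1]
        if r < 0:
            aligned[:, k] *= -1
    return aligned
```

After alignment, the linear discriminant trained on the training partition's loadings can be applied to the test partition's loadings on a common axis orientation.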
Averaging and subtraction of face stimuli
Face stimuli were averaged according to corresponding loadings on the first and second principal components. Face averaging was performed using online software (www.faceresearch.org). Landmarks around the face were first delineated semi-manually, guided by the software. Face stimuli were then warped into a common template and averaged. The pupils in the average face images were aligned. Averaged faces were then transformed to z-scores before being subtracted from each other using Matlab (The MathWorks Inc.). The method we used to detect changes in facial features associated with the first two principal components is similar to reverse correlation (Ahumada and Lovell, 1971), and has previously been used in face detection paradigms to detect biologically relevant stimulus features (Smith et al., 2012).
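The final subtraction step is simple: each averaged face image is standardized to z-scores and the two are subtracted, so that large absolute values in the difference map mark the features that differ most. A minimal NumPy sketch (with random arrays standing in for the averaged face images, which we cannot reproduce here):

```python
import numpy as np

def zscore_difference(img_a, img_b):
    """Standardize two images to z-scores and subtract them, yielding a
    difference map highlighting where the images diverge most."""
    za = (img_a - img_a.mean()) / img_a.std()
    zb = (img_b - img_b.mean()) / img_b.std()
    return za - zb

# Placeholder 64 x 64 grayscale images in place of the averaged faces.
rng = np.random.default_rng(1)
avg_fear_anger = rng.random((64, 64))
avg_neutral_surprise = rng.random((64, 64))
diff_map = zscore_difference(avg_fear_anger, avg_neutral_surprise)
```

Rendering `diff_map` as a heat map gives images analogous to the right-hand panels of Figure 4.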
RESULTS
Amygdala reactivity
Consistent with previous reports, our fMRI challenge paradigm elicited robust bilateral amygdala reactivity to all four categories of emotional facial expressions (Table 1 and Supplementary Figure S1).
Table 1.
Mean amygdala reactivity and cluster statistics
| Expression block | Hemisphere | Mean contrast value (s.e.m.) | t-value |
| --- | --- | --- | --- |
| Fear | left | 0.26 (0.03) | 8.57*** |
| Fear | right | 0.22 (0.02) | 9.00*** |
| Anger | left | 0.28 (0.03) | 9.73*** |
| Anger | right | 0.27 (0.03) | 10.77*** |
| Neutral | left | 0.23 (0.03) | 7.69*** |
| Neutral | right | 0.21 (0.03) | 8.48*** |
| Surprise | left | 0.20 (0.03) | 7.35*** |
| Surprise | right | 0.21 (0.02) | 8.90*** |
***P < 0.001.
Principal component analysis
PCA was initially performed using contrast values for each emotional category extracted from bilateral amygdala ROIs in all 300 participants. The first principal component loaded more on fearful and angry expressions relative to neutral and surprised expressions, while the second principal component loaded more on neutral and angry expressions relative to fearful and surprised expressions (Figure 2). The first and second principal components accounted for 33% and 26% of the variance in amygdala reactivity, respectively. Together, the two principal components separated all four expression categories (Figure 2). This observation indicates a well-defined correlational pattern of amygdala responses across emotional expression categories.
Fig. 2.
Principal component loadings on the four categories of emotional facial expressions. Responses to neutral and surprised expressions have similar loadings on PC1 as do responses to expressions of fear and anger. On PC2, responses to neutral and angry expressions have similar loadings as do responses to expressions of fear and surprise. PC, principal component; Right, right amygdala; Left, left amygdala.
To test the robustness of this correlational pattern, we evaluated whether emotional categories could be correctly classified based on loadings on the first two principal components in smaller samples. We found classification accuracy in subsamples to be significantly above chance (25% accuracy) (all P-values < 0.001) in all sample sizes tested (n = 10 to n = 150) indicating high reproducibility (Figure 3). As expected, classification accuracy increased with sample size from 44% in samples of 10 participants to 83% in samples of 150 participants. In contrast, direct classification of amygdala reactivity (i.e. BOLD parameter estimates) to the four emotion categories yielded poor accuracy (Figure 3) due to comparable reactivity magnitudes across categories (Table 1). Classification accuracy was 25% in sample sizes of 10 participants and increased only marginally to 27% in sample sizes of 150. Classification accuracy was significantly greater than chance across all sample sizes when using the principal component loadings but only in the larger samples (n = 70, n ≥ 90) when using amygdala reactivity values (Table 2).
Fig. 3.
Mean classification accuracy of emotional category as a function of principal component loadings or amygdala reactivity. Bars represent standard error of the mean. Means and standard errors are based on 1000 re-samplings. The blue line represents accuracy when the classifier was trained on the loadings of the first two principal components and tested on independent subsamples of participants. The green line marks accuracy when the classifier was trained directly on amygdala reactivity to each emotional category and then tested in independent subsamples. The magenta line represents classification accuracy when categories were scrambled randomly. Accuracy is plotted for sample sizes ranging from 10 to 150 participants increasing in steps of 10. PC, principal component.
Table 2.
t-tests comparing classification accuracies relative to chance using loadings on principal components or amygdala reactivity for different sample sizes
| Sample size | Principal components (t-value) | Amygdala reactivity (t-value) |
| --- | --- | --- |
| 10 | 20.61*** | 0.02 |
| 20 | 23.00*** | 0.50 |
| 30 | 22.67*** | 0.72 |
| 40 | 24.58*** | 1.19 |
| 50 | 25.24*** | 0.96 |
| 60 | 26.49*** | 1.22 |
| 70 | 32.03*** | 2.71** |
| 80 | 30.74*** | 1.68 |
| 90 | 35.07*** | 2.46* |
| 100 | 39.54*** | 3.22** |
| 110 | 41.48*** | 2.20* |
| 120 | 45.89*** | 3.58*** |
| 130 | 52.26*** | 3.71*** |
| 140 | 56.14*** | 3.93*** |
| 150 | 60.80*** | 2.75** |

*P < 0.05; **P < 0.01; ***P < 0.001.
To isolate specific feature-based cues characterizing the two principal components, face stimuli were averaged together according to principal component loadings. For the first principal component, we averaged face stimuli from the surprised and neutral categories to form one image and those from the angry and fearful categories to form a second image. When these two images were subtracted, a change in eyebrow position best distinguished the two stimulus groups (Figure 4). This indicates that variation along the first principal component is associated with changes in eyebrow position.
Fig. 4.
Face stimuli averaged over emotional categories according to amygdala reactivity loadings on the first two principal components. Top: Face stimuli were averaged over the categories fear/anger (left) and neutral/surprise (middle) according to the first principal component loading. In the heat map representing the difference between the two images (right), the eyebrows differ most. Bottom: Face stimuli were averaged over the categories fear/surprise (left) and neutral/anger (middle) according to the second principal component loading. In the heat map representing the difference between the two images (right), eye whites and the position of the upper eyelids differ most.
For the second principal component, face stimuli from the surprised and fearful categories were averaged to form one image and those from the neutral and angry categories to form a second image. Subtraction of the group averages revealed that the size of the eyes and the position of the medial brow were most different (Figure 4), suggesting these features are of importance for explaining variation in amygdala loadings along the second principal component.
DISCUSSION
Our results suggest that amygdala reactivity to emotional facial expressions is associated with feature-based cues, namely eyebrow position and eye size, which vary across emotional categories. Loadings on the first principal component corresponded to changes in eyebrow position, whereas those on the second component were associated with changes in the size of the eyes and position of the medial brow. Pure holistic processing of facial affect would predict independent processing of fearful, angry, neutral and surprised facial expressions, whereas feature-based processing would predict correlated responses to these expressions as a function of the degree to which the feature is present across categories. Our results support the latter model. Selective amygdala reactivity to specific feature-based cues rather than broad categories may support more adaptive responses necessary for decoding subtle social signals during dynamic interactions before the emergence of recognizable emotional states.
The pattern we observe at the systems level may reflect the presence of neuronal populations in the amygdala with similar response profiles across broad classes of social stimuli, consistent with research in non-human primates (Brothers et al., 1990). Single unit recordings in humans corroborate these findings by showing that a large proportion of neurons in the amygdala are excited or inhibited by presentation of a range of facial stimuli, whereas only a fraction respond selectively to one emotional expression (Fried et al., 1997; Rutishauser et al., 2011). These results are consistent with theories of brain function emphasizing a convergence of sensory inputs at the level of the amygdala resulting in correlated patterns of activity across a spectrum of stimuli (Mesulam, 1998). It is worth noting that our paradigm was based on perceptual matching and that different principal components might emerge from neural responses within higher order structures necessary for the identification (e.g. labeling) of discrete emotions.
In this study, the features in the face stimuli that were associated with the correlational pattern of amygdala responses were centered on the eye region. These findings support previous research investigating the role of stimulus features on amygdala activity (Bar and Neta, 2007; Demos et al., 2008). Comparing amygdala loadings on the first principal component via the averaged faces yielded a difference map, which highlights the facial features that change as a function of loadings on that component. In this case, displacement of the eyebrows was identified as the facial feature most associated with the amygdala’s response to angry and fearful compared with neutral and surprised expressions. The second principal component was associated with amygdala responses to fearful and surprised compared with angry and neutral expressions, and the corresponding difference image revealed a raising of the upper eyelid exposing more eye white, together with a displacement of the brow. That eye whites are a feature of importance in driving amygdala reactivity is consistent with previous research demonstrating that experimental manipulation of the eye whites is associated with differential amygdala responses (Whalen et al., 2004). Our data extend this earlier finding by demonstrating that amygdala reactivity is sensitive to variation in this feature across categories of emotional expression.
Although substantial evidence suggests that the amygdala is involved in processing of eye features (Kawashima et al., 1999; Adams et al., 2003; Whalen et al., 2004), few studies have investigated amygdala responses to changes in eyebrow position. However, behavioral studies have shown that depression of the medial, but not the lateral, brow is perceived as anger or disgust (Knoll et al., 2008) and changes in brow position are important for decoding facial affect (Frois-Wittmann, 1930; Ekman and Friesen, 1978; Lundqvist et al., 1999). At the level of the amygdala, studies comparing angry schematic faces that varied in the shape of the mouth and the brow to their neutral counterparts observed relatively increased reactivity to the angry schematic face (Wright et al., 2002; Britton et al., 2008). Although these studies show that varying only a few central features in a face can alter amygdala reactivity, it is not possible to isolate whether variation in mouth or eyebrow position, or both, are necessary. Future studies could investigate whether systematic variation in eyebrow position is uniquely associated with amygdala reactivity. Such studies could also use dynamic facial stimuli to move beyond our reverse correlation method to isolate which feature-based cues are driving the pattern of amygdala reactivity.
As a component of a larger research protocol designed to map neurogenetic mechanisms of individual differences in sensitivity to stress and related behaviors (e.g. Carre et al., 2012), our fMRI paradigm focuses on threat-related facial expressions. Thus, our data do not allow extrapolation of the identified feature-based patterns to other expressions (e.g. happy, disgust). However, our focus on threat-related expressions may be of specific relevance for understanding previously documented biases of amygdala reactivity in psychopathology. For example, increased amygdala reactivity to angry and fearful expressions, which represent explicit and implicit threat, respectively, has been documented in anxiety disorders such as social phobia (Phan et al., 2006). Increased reactivity to neutral expressions, which could be interpreted as threatening if displayed by unfamiliar individuals, and angry expressions has been documented in impulse control disorders such as intermittent explosive disorder (Coccaro et al., 2007). Our current data suggest that these biases may reflect the shared feature-based patterns across these expressions (i.e. PC1 in social phobia but PC2 in intermittent explosive disorder) rather than differential reactivity to distinct categories of emotion and may help inform the etiology and possibly treatment of these disorders.
SUPPLEMENTARY DATA
Supplementary data are available at SCAN online.
Acknowledgments
The Duke Neurogenetics Study is supported by Duke University and NIDA grants R01DA031579 and R01DA026222 to A.R.H. The Swedish Research Council and the Sweden-American Foundation provided support to F.A.
REFERENCES
- Adams RB, Gordon HL, Baird AA, Ambady N, Kleck RE. Effects of gaze on amygdala sensitivity to anger and fear faces. Science. 2003;300:1536–6. doi: 10.1126/science.1082244. [DOI] [PubMed] [Google Scholar]
- Adolphs R. Neural systems for recognizing emotion. Current Opinion in Neurobiology. 2002;12:169–77. doi: 10.1016/s0959-4388(02)00301-x. [DOI] [PubMed] [Google Scholar]
- Ahumada A, Lovell J. Stimulus features in signal detection. Journal of the Acoustical Society of America. 1971;49:1751. [Google Scholar]
- Bar M, Neta M. Visual elements of subjective preference modulate amygdala activation. Neuropsychologia. 2007;45:2191–200. doi: 10.1016/j.neuropsychologia.2007.03.008. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Britton JC, Shin LM, Barrett LF, Rauch SL, Wright CI. Amygdala and fusiform gyrus temporal dynamics: Responses to negative facial expressions. BMC Neuroscience. 2008;9:44. doi: 10.1186/1471-2202-9-44. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Brothers L, Ring B, Kling A. Response of neurons in the macaque amygdala to complex social stimuli. Behavioural Brain Research. 1990;41:199–213. doi: 10.1016/0166-4328(90)90108-q. [DOI] [PubMed] [Google Scholar]
- Brown SM, Peet E, Manuck SB, et al. A regulatory variant of the human tryptophan hydroxylase-2 gene biases amygdala reactivity. Molecular Psychiatry. 2005;10:884–88. doi: 10.1038/sj.mp.4001716. [DOI] [PubMed] [Google Scholar]
- Carre JM, Hyde LW, Neumann CS, Viding E, Hariri AR. The neural signatures of distinct psychopathic traits. Social Neuroscience. 2013;8:122–35. doi: 10.1080/17470919.2012.703623. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Coccaro EF, McCloskey MS, Fitzgerald DA, Phan KL. Amygdala and orbitofrontal reactivity to social threat in individuals with impulsive aggression. Biological Psychiatry. 2007;62:168–78. doi: 10.1016/j.biopsych.2006.08.024. [DOI] [PubMed] [Google Scholar]
- Coelho CM, Cloete S, Wallis G. The face-in-the-crowd effect: When angry faces are just cross(es) Journal of Vision. 2010;10(1):7, 1–14. doi: 10.1167/10.1.7. [DOI] [PubMed] [Google Scholar]
- Davis FC, Somerville LH, Ruberry EJ, Berry ABL, Shin LM, Whalen PJ. A tale of two negatives: differential memory modulation by threat-related facial expressions. Emotion. 2011;11:647–55. doi: 10.1037/a0021625.
- Demos KE, Kelley WM, Ryan SL, Davis FC, Whalen PJ. Human amygdala sensitivity to the pupil size of others. Cerebral Cortex. 2008;18:2729–34. doi: 10.1093/cercor/bhn034.
- Ekman P, Friesen W. Pictures of Facial Affect. Palo Alto, CA: Consulting Psychologists Press; 1976.
- Ekman P, Friesen W. Facial Action Coding System: A Technique for the Measurement of Facial Movement. Palo Alto, CA: Consulting Psychologists Press; 1978.
- Engell AD, Haxby JV, Todorov A. Implicit trustworthiness decisions: Automatic coding of face properties in the human amygdala. Journal of Cognitive Neuroscience. 2007;19:1508–19. doi: 10.1162/jocn.2007.19.9.1508.
- Fakra E, Hyde LW, Gorka A, et al. Effects of HTR1A C(-1019)G on amygdala reactivity and trait anxiety. Archives of General Psychiatry. 2009;66:33–40. doi: 10.1001/archpsyc.66.1.33.
- First M, Spitzer M, Williams J, Gibbon M. Structured Clinical Interview for DSM-IV (SCID). Washington: American Psychiatric Association; 1995.
- Fisher RA. The use of multiple measurements in taxonomic problems. Annals of Eugenics. 1936;7:179–88.
- Fried I, MacDonald KA, Wilson CL. Single neuron activity in human hippocampus and amygdala during recognition of faces and objects. Neuron. 1997;18:753–65. doi: 10.1016/s0896-6273(00)80315-3.
- Frois-Wittmann J. The judgment of facial expression. Journal of Experimental Psychology. 1930;13:113–51.
- Fusar-Poli P, Placentino A, Carletti F, et al. Functional atlas of emotional faces processing: a voxel-based meta-analysis of 105 functional magnetic resonance imaging studies. Journal of Psychiatry & Neuroscience. 2009;34:418–32.
- Hariri AR, Gorka A, Hyde LW, et al. Divergent effects of genetic variation in endocannabinoid signaling on human threat- and reward-related brain function. Biological Psychiatry. 2009;66:9–16. doi: 10.1016/j.biopsych.2008.10.047.
- Jacobs RH, Renken R, Aleman A, Cornelissen FW. The amygdala, top-down effects, and selective attention to features. Neuroscience and Biobehavioral Reviews. 2012;36:2069–84. doi: 10.1016/j.neubiorev.2012.05.011.
- Kawashima R, Sugiura M, Kato T. The human amygdala plays an important role in gaze monitoring: A PET study. Brain. 1999;122:779–83. doi: 10.1093/brain/122.4.779.
- Knoll BI, Attkiss KJ, Persing JA. The influence of forehead, brow, and periorbital aesthetics on perceived expression in the youthful face. Plastic and Reconstructive Surgery. 2008;121:1793–802. doi: 10.1097/PRS.0b013e31816b13fe.
- Larson CL, Aronoff J, Sarinopoulos IC, Zhu DC. Recognizing threat: a simple geometric shape activates neural circuitry for threat detection. Journal of Cognitive Neuroscience. 2009;21:1523–35. doi: 10.1162/jocn.2009.21111.
- Lundqvist D, Esteves F, Ohman A. The face of wrath: Critical features for conveying facial threat. Cognition & Emotion. 1999;13:691–711. doi: 10.1080/02699930244000453.
- Maldjian JA, Laurienti PJ, Kraft RA, Burdette JH. An automated method for neuroanatomic and cytoarchitectonic atlas-based interrogation of fMRI data sets. Neuroimage. 2003;19:1233–9. doi: 10.1016/s1053-8119(03)00169-1.
- Mesulam MM. From sensation to cognition. Brain. 1998;121:1013–52. doi: 10.1093/brain/121.6.1013.
- Ohman A, Lundqvist D, Esteves F. The face in the crowd revisited: A threat advantage with schematic stimuli. Journal of Personality and Social Psychology. 2001;80:381–96. doi: 10.1037/0022-3514.80.3.381.
- Phan KL, Fitzgerald DA, Nathan PJ, Tancer ME. Association between amygdala hyperactivity to harsh faces and severity of social anxiety in generalized social phobia. Biological Psychiatry. 2006;59:424–9. doi: 10.1016/j.biopsych.2005.08.012.
- Rutishauser U, Tudusciuc O, Neumann D, et al. Single-unit responses selective for whole faces in the human amygdala. Current Biology. 2011;21:1654–60. doi: 10.1016/j.cub.2011.08.035.
- Sergerie K, Chochol C, Armony JL. The role of the amygdala in emotional processing: A quantitative meta-analysis of functional neuroimaging studies. Neuroscience and Biobehavioral Reviews. 2008;32:811–30. doi: 10.1016/j.neubiorev.2007.12.002.
- Smith ML, Gosselin F, Schyns PG. Measuring internal representations from behavioral and brain data. Current Biology. 2012;22:191–6. doi: 10.1016/j.cub.2011.11.061.
- Vytal K, Hamann S. Neuroimaging support for discrete neural correlates of basic emotions: a voxel-based meta-analysis. Journal of Cognitive Neuroscience. 2010;22:2864–85. doi: 10.1162/jocn.2009.21366.
- Whalen PJ, Kagan J, Cook RG, et al. Human amygdala responsivity to masked fearful eye whites. Science. 2004;306:2061. doi: 10.1126/science.1103617.
- Wright CI, Martis B, Shin LM, Fischer H, Rauch SL. Enhanced amygdala responses to emotional versus neutral schematic facial expressions. Neuroreport. 2002;13:785–90. doi: 10.1097/00001756-200205070-00010.