Social Cognitive and Affective Neuroscience
. 2017 Apr 11;12(7):1197–1207. doi: 10.1093/scan/nsx046

The effect of constraining eye-contact during dynamic emotional face perception—an fMRI study

Nouchine Hadjikhani 1,2, Nicole R Zurcher 1, Amandine Lassalle 1,3, Loyse Hippolyte 4, Noreen Ward 1, Jakob Åsberg Johnels 2,5
PMCID: PMC5490673  PMID: 28402536

Abstract

Eye-contact modifies how we perceive emotions and modulates activity in the social brain network. Here, using fMRI, we demonstrate that adding a fixation cross in the eye region of dynamic facial emotional stimuli significantly increases activation in the social brain of healthy, neurotypical participants when compared with activation for the exact same stimuli observed in a free-viewing mode. In addition, using PPI analysis, we show that the degree of amygdala connectivity with the rest of the brain is enhanced for the constrained view for all emotions tested except for fear, and that anxiety and alexithymia modulate the strength of amygdala connectivity for each emotion differently. Finally, we show that autistic traits have opposite effects on amygdala connectivity for fearful and angry emotional expressions, suggesting that these emotions should be treated separately in studies investigating facial emotion processing.

Keywords: eye-contact effect, dynamic face perception, social brain, amygdala connectivity, emotion perception


Le surveillant, on l’appelle le Bouillon, quand il n’est pas là, bien sûr. On l’appelle comme ça, parce qu’il dit tout le temps: « Regardez-moi dans les yeux », et dans le bouillon il y a des yeux. Moi non plus je n’avais pas compris tout de suite, c’est des grands qui me l’ont expliqué. Le Bouillon a une grosse moustache et il punit souvent, avec lui, il ne faut pas rigoler.

We call our supervisor Broth, when he is not there, of course. We call him that because he is always saying: ‘Look me in the eyes’, and in broth there are eyes. I did not understand right away either; the older kids had to explain it to me. Broth has a big mustache and he punishes often; with him, you must not fool around.

René Goscinny, Le Petit Nicolas, Chapter 3, ‘Le Bouillon’.

Introduction

The eyes of others truly hold a special status in human social perception, and eye contact serves as an important non-verbal channel for communication and social interaction throughout life (Darwin, 1872/1965). Four-day-old newborns distinguish between a face looking toward them and a face looking away (Farroni et al., 2002), and from the age of three months onwards, humans look more at a person's eyes than at other parts of the face (Haith et al., 1977). Still, even healthy adults do not look at other people’s eyes all of the time during face perception (Janik et al., 1978). In the present study, we sought to unravel how people react neurophysiologically when they are constrained to look consistently into others' eyes, compared with when they are free to allocate their gaze during perception of faces with different emotional expressions. We also took an individual-differences approach to this issue, since it has been shown that what counts as a ‘comfortable’ amount of eye contact varies between otherwise healthy individuals (Kleinke, 1986; Binetti et al., 2016).

The presentation of faces activates a network of brain areas collectively termed ‘the social brain’ (Kennedy and Adolphs, 2012). The social brain consists of the amygdala network (amygdala and orbitofrontal cortex), the mentalizing network (posterior superior temporal sulcus, temporal pole, posterior cingulate and dorso-median prefrontal cortex), the empathy network (insula and anterior cingulate) and the action-perception network (inferior frontal gyrus and superior parietal lobule). In addition, face-specific areas are located in the inferior occipital gyrus [referred to as IOG, or occipital face area, OFA (Rossion et al., 2003)] and the fusiform gyrus [referred to as the fusiform face area, FFA (Kanwisher et al., 1997)].

Activation of the fusiform gyrus by faces is modulated by where people look when face stimuli are presented. Morris and colleagues demonstrated that activation in the fusiform gyrus correlates with the amount of fixation on the eyes (Morris et al., 2007). Burra et al. (2013) reported that the amygdala is also sensitive to the eye-contact effect, even in the absence of primary visual cortex activation. The amygdala modulates activation of cortical areas during face perception, and the interaction between the amygdala and the fusiform gyrus is an important component of face perception (Vuilleumier et al., 2001; Vuilleumier and Pourtois, 2007). The amygdala is strongly connected with the prefrontal cortex (Ghashghaei et al., 2007), and its dynamic interaction with the social brain modulates the recognition and regulation of affective states (Banks et al., 2007; Miyahara et al., 2013). Anatomical connectivity between the amygdala and brain areas involved in emotion perception is modulated by trait anxiety (Greening and Mitchell, 2015), and amygdala functional connectivity can be pharmacologically altered during social-emotional processing (Gorka et al., 2013, 2015). Whether amygdala connectivity varies depending on the emotion presented, and to what extent it is modulated not only by anxiety but also by other personality traits, has, to our knowledge, not been specifically studied.

In their landmark paper, Senju and Johnson (2009b) coined the term ‘eye contact effect’ to describe the modulation of cognitive processing following the perception of eye contact with another human. They demonstrated that eye contact can modulate activity in the social brain network, and developed the hypothesis that this is accomplished through the detection of perceived direct gaze by brain areas of the subcortical route. Recent studies support this subcortical hypothesis, suggesting that the automatic establishment of eye contact is mediated by fast subcortical pathways (Rothkirch et al., 2015).

Direct eye contact is indeed a powerful signal, one that conveys different meanings depending on the facial expression of the person. Many studies examining the effect of eye contact have not addressed the interaction between gaze and emotion (Calder et al., 2002; George and Conty, 2008; Hietanen et al., 2008; Wieser et al., 2009; Helminen et al., 2011), but those that did (Roelofs et al., 2010; Soussignan et al., 2013) concluded that reactivity and behavior depended on the self-relevance of the combination of gaze and emotion in terms of approach or avoidance. When examining the effect of gaze, its interaction with emotional expression needs to be considered, as a direct gaze has a completely different effect on the viewer depending on the facial expression it accompanies: a smiling face with a direct gaze can be interpreted as an invitation for further interaction, while an angry face with a direct gaze signals a potential threat (Darwin, 1872/1965; Argyle and Cook, 1976; Kleinke, 1986; Baron-Cohen, 1995; Emery, 2000).

Emotional processing can be affected by several personality traits, such as anxiety, alexithymia and the presence of autistic traits. Here, we tested whether these traits affect how constrained gaze is perceived, both across all emotions and for specific emotional expressions; we also examined how amygdala connectivity is modulated by these traits, using questionnaires specifically addressing each of them. Previous studies have demonstrated that anxious individuals show shortened viewing times in the eye region (Daly, 1978; Garner et al., 2006; Moukheiber et al., 2010), and recent work by Myllyneva et al. (2015) has revealed increased autonomic arousal and shorter self-controlled viewing times for direct gaze in adolescents with social anxiety disorders compared with controls. If anxiety traits form a continuum in the population, we would expect anxiety to potentiate the effect of constraining attention to the eye region of emotional faces, and this effect to be maximal for anger, given that this emotion represents a direct threat to the observer. Our first aim was to examine the effect of constraining gaze to the eye region when looking at faces, by comparing the exact same stimuli in a free-viewing condition and in a condition where typical healthy participants were specifically asked to look at a cross situated in the eye region. Our second aim was to examine the effect of distinct facial expressions on the differences in brain activation induced by constraining gaze, by analyzing each facial expression separately. We used short movies of morphed facial expressions; dynamic facial stimuli are more ecologically valid and are known to elicit increased activity in the social brain (Arsalidou et al., 2011). It is important to note that we did not examine the effect of gaze direction, as all stimuli had a direct gaze.

In summary, we hypothesized that constrained eye-contact would increase activation in the social brain, and lead to increased connectivity between the amygdala and the rest of the social brain (Kawashima et al., 1999; Huijgen et al., 2015). We also hypothesized that anxiety, alexithymia and autistic traits would modulate functional connectivity between brain regions.

Materials and methods

Participants

The Lausanne University Hospital ethics committee approved all procedures. All adult participants gave written informed consent before the start of the study; minor participants provided assent and one of their parents gave written consent. All procedures followed the Declaration of Helsinki. Twenty-five healthy participants with no history of psychiatric or neurological disorders were enrolled in the study. Five subjects were excluded from the data analysis due to excessive movement or failure to perform the task during the scan. Thus, 20 participants (17 males; mean age ± SD: 23.5 ± 8.14 years; range: 12.7–42.9) were included in the final analysis. Performance Intelligence Quotient (PIQ) was assessed using the Wechsler Nonverbal Scale of Ability (Wechsler and Naglieri, 2006) or the Wechsler Abbreviated Scale of Intelligence (Wechsler, 1999); all participants had a PIQ in the normal range (mean: 113.4 ± 10.7).

Levels of anxiety were evaluated with the State-Trait Anxiety Inventory (STAI) (Spielberger et al., 1983) in adults and with the Revised Children’s Manifest Anxiety Scale (RCMAS-2) (Reynolds et al., 1999) in adolescents. In addition, the Toronto Alexithymia Scale (TAS) was used in its adult and child versions (Bagby et al., 1994). Autistic traits were evaluated in all participants using the Autism Spectrum Quotient (AQ) self-report questionnaire (Baron-Cohen et al., 2001, 2006).

Visual stimuli and design

Twenty-four movies representing morphs of facial expressions from NEUTRAL to FEAR, HAPPY or ANGRY were created from the NimStim database (Tottenham et al., 2009) with Morph Age Pro (Creaceed). Each movie lasted 5 s and consisted of a dynamic morph lasting 3 s, followed by 2 s of the final emotional expression. NEUTRAL movies were created by morphing between left-right mirror images of neutral faces, so that this condition also had a dynamic component. A red fixation cross was present for 1 s between movies, as well as in blocks of 6 s at the beginning and end of the presentation, plus seven short blocks of 3 s each interspersed with the face blocks. During a run, two 48-s blocks of each expression were presented in a pseudo-randomized order. Two versions of these movies were created, one of which contained a red fixation cross in the region of the eyes. Each participant viewed both versions (CROSS and NO-CROSS) during the scanning session (which also included other tasks not reported in the present manuscript). The order of the CROSS and NO-CROSS versions was counterbalanced across participants, so that about half of them saw the NO-CROSS stimuli first. Participants were instructed to look carefully at the videos and, in order to monitor their attention, to press a button every time they saw a blue cross between the stimuli, which happened four times during CROSS and four times during NO-CROSS. The stimuli presented during CROSS and NO-CROSS were identical; the only difference was the presence of a fixation cross in the CROSS version.
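The durations above imply eight movies per 48-s expression block (each 5-s movie followed by a 1-s fixation cross); this per-block count is our inference from the stated durations, not given explicitly in the text. A quick arithmetic check:

```python
# Timing sketch of one stimulus block, reconstructed from the Methods.
# MOVIES_PER_BLOCK is inferred, not stated in the paper.
MORPH_S = 3           # dynamic morph
STATIC_S = 2          # final expression held
FIX_S = 1             # red fixation cross between movies
MOVIES_PER_BLOCK = 8  # inferred: 8 * (5 + 1) = 48 s

movie_s = MORPH_S + STATIC_S
block_s = MOVIES_PER_BLOCK * (movie_s + FIX_S)
print(movie_s, block_s)  # 5 48
```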

Imaging data acquisition and analysis

Anatomical and functional MR images were collected in all participants with a 12-channel RF coil in a Siemens 3T scanner (Siemens Tim Trio, Erlangen) at the Centre d’Imagerie BioMédicale in Lausanne. Anatomical images were acquired using a multi-echo magnetization prepared rapid gradient echo sequence (ME-MPRAGE: 176 slices; 256 × 256 matrix; 1 × 1 × 1 mm voxels; echo time (TE): TE1: 1.64 ms, TE2: 3.5 ms, TE3: 5.36 ms, TE4: 7.22 ms; repetition time (TR): 2530 ms; flip angle 7°). Functional data were obtained using an echo planar imaging (EPI) sequence (47 AC-PC slices; 3 × 3 × 3 mm voxels; 64 × 64 matrix; FOV: 216; TE: 30 ms; TR: 3000 ms; flip angle 90°) lasting 425 s.

Functional MRI data preprocessing and processing were carried out using FSL 5.0.2.2. Non-brain tissue was removed from high-resolution anatomical images using Christian Gaser's VBM8 toolbox for SPM8 (Ashburner et al., 2000) before the images were fed into FEAT. Data were motion corrected using MCFLIRT, and motion parameters were added as confound variables to the model. Residual outlier timepoints were identified using FSL’s motion outlier detection program and integrated as additional confound variables in the first-level general linear model (GLM) analysis. Preprocessing included spatial smoothing using a Gaussian kernel of 8 mm, grand-mean intensity normalization and high-pass temporal filtering with sigma = 50.0 s.
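The smoothing and filtering parameters above follow FSL conventions: FEAT specifies the smoothing kernel as a FWHM while the underlying Gaussian is parameterized by its sigma, and a high-pass sigma of 50 s corresponds to roughly a 100-s cutoff if FSL's cutoff ≈ 2 × sigma convention applies (an assumption on our part, not stated in the text). A minimal sketch of the conversions:

```python
import math

# A Gaussian's FWHM and sigma are related by FWHM = 2 * sqrt(2 * ln 2) * sigma.
def fwhm_to_sigma(fwhm_mm):
    return fwhm_mm / (2.0 * math.sqrt(2.0 * math.log(2.0)))

smooth_sigma_mm = fwhm_to_sigma(8.0)  # 8 mm kernel from the Methods
hp_cutoff_s = 2 * 50.0                # sigma = 50 s -> ~100 s cutoff (assumed convention)
print(round(smooth_sigma_mm, 2))      # 3.4
```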

Subject-level statistical analysis was carried out for the contrasts [ALL EMO > FIXATION], as well as [NEUTRAL > FIXATION], [HAPPY > FIXATION], [ANGRY > FIXATION] and [FEAR > FIXATION] using FILM with local autocorrelation correction for both the CROSS and the NO-CROSS runs. Registration to high-resolution structural images was carried out using FLIRT, and registration to MNI standard space was then further refined using FNIRT nonlinear registration. For each subject, a fixed-effects analysis was used to compare activation for each emotion in the [CROSS > NO-CROSS] and the [NO-CROSS > CROSS] conditions. Group-level analyses for the [CROSS vs NO-CROSS] conditions were performed using mixed-effects GLM analysis with FLAME 1 + 2 and automatic outlier detection. By modeling subject variability, this kind of analysis allows inference about the population from which the subjects are drawn. fMRI data processing was carried out using FEAT (FMRI Expert Analysis Tool) version 6.00, part of FSL (FMRIB’s Software Library, www.fmrib.ox.ac.uk/fsl). Z statistic images were thresholded using clusters determined by Z > 2.3 and a corrected cluster significance threshold of P = 0.05 (Worsley, 2001). Cluster-corrected images were displayed on a standard brain (fsaverage for the surface and the MNI template for the volume).

Correlation analyses

Correlations were computed between full-brain activation for each emotion and age, as well as with TAS, STAI trait, STAI state and AQ. In addition, we examined activation and Spearman correlations in a series of anatomical ROIs belonging to the social brain, as defined by Kennedy and Adolphs (2012). These consisted of: for the amygdala network, the amygdala (AMY) and orbitofrontal cortex (OFC); for the mentalizing network, the right posterior STS, the temporal pole (TP), the posterior cingulate (post Cing) and the dorso-median prefrontal cortex (dmPFC); for the empathy network, the insula and the anterior cingulate (ant Cing); and for the action-perception network, the right inferior frontal gyrus (IFG) and the right superior parietal lobule (SPL). Activation was also examined in face-specific areas, including the right occipital fusiform gyrus and the right inferior occipital cortex (IOG).

Anatomical masks were created from the Harvard-Oxford cortical and subcortical atlases. For each subject, the mean percentage BOLD signal change was extracted for each of these regions of interest for the contrast [CROSS > NO-CROSS] in the [ALL EMOTIONS], [NEUTRAL], [HAPPY], [ANGRY] and [FEAR] conditions, using the Featquery tool in FSL.

PPI analyses

A psychophysiological interaction (PPI) analysis was conducted to examine the effect of the presence of a fixation cross in the eye region, for each emotion, on amygdala functional connectivity with the rest of the brain. We also examined the relationships between the difference in amygdala functional connectivity strength between the CROSS and NO-CROSS conditions and the STAI state, STAI trait, TAS and AQ scores (O'Reilly et al., 2012). Seed regions were anatomically defined in the left and right amygdala in MNI space and transformed into each subject's space. Mean-activity time courses were extracted for the left and right amygdala of each subject using fslmeants. Task-specific changes were examined for each subject, emotion and condition, using the mean-centered task time course and the demeaned seed ROI time course, as described by O'Reilly et al. (2012). A within-subject fixed-effects comparison was then made for each emotion and condition. Higher-level statistics were conducted using mixed-effects FLAME 1 + 2 for the CROSS vs NO-CROSS conditions for each emotion separately. Finally, we examined, for each emotion, the correlations between behavioral scores and the CROSS vs NO-CROSS PPI interaction.
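The PPI regressor construction described above can be sketched as follows. The data here are synthetic and the dimensions (140 volumes, one 16-TR task block) are only illustrative of the O'Reilly et al. (2012) approach, not the study's actual design matrix:

```python
import numpy as np

# PPI sketch: the interaction regressor is the product of the demeaned
# seed (amygdala) time course and the mean-centered task regressor.
rng = np.random.default_rng(0)
n_vols = 140                          # ~425 s at TR = 3 s (illustrative)

task = np.zeros(n_vols)               # boxcar for one emotion condition
task[20:36] = 1.0                     # one 48-s block = 16 TRs
task_centred = task - task.mean()     # mean-centered task regressor

seed = rng.standard_normal(n_vols)    # synthetic amygdala time course
seed_demeaned = seed - seed.mean()

ppi = seed_demeaned * task_centred    # psychophysiological interaction

# GLM with intercept, task, seed and PPI regressors
X = np.column_stack([np.ones(n_vols), task_centred, seed_demeaned, ppi])
voxel = rng.standard_normal(n_vols)   # synthetic voxel signal
beta, *_ = np.linalg.lstsq(X, voxel, rcond=None)
print(beta.shape)  # (4,)
```

The coefficient on the PPI column estimates how seed-to-voxel coupling changes when the task condition is on, which is the quantity compared between CROSS and NO-CROSS at the higher level.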

Eye-tracking data of spontaneous face viewing

A subset of the participants (N = 13) took part in a parallel eye-tracking experiment with emotional faces. In this experiment, similar NO-CROSS face stimuli were shown while gaze patterns were recorded with a Tobii T120 eye tracker during the first three seconds of presentation (Åsberg Johnels et al., 2016). Analyzing these data allows us to characterize how the current sample spontaneously allocated attention to the eyes under non-constrained presentation. Proportional averaged fixation durations show that the participants viewed the eyes of the faces for 79, 77, 79 and 64% of the total gaze time for neutral, fearful, angry and happy faces, respectively. This result is consistent with previous research showing that neurotypical, healthy adults look more at people's eyes than at other parts of the face, although less than 100% of the time [around 70% of the time in Pelphrey et al. (2002)]. It also mirrors previous results showing somewhat less eye gaze to happy facial expressions compared with other emotional expressions (all differences in eye gaze for happiness compared with the other emotions were P < 0.06 according to Wilcoxon signed-rank tests), presumably because a smiling mouth attracts relatively more attention (Eisenbarth and Alpers, 2011). Supplementary Table S14 specifies gaze patterns for each emotion to the eyes, the mouth and non-eyes/non-mouth areas of the screen (i.e. the rest of the screen).
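The paired comparison reported above can be illustrated as follows. The per-subject proportions below are synthetic values drawn around the reported group means, not the study's data; only the test procedure (Wilcoxon signed-rank on paired gaze proportions) matches the text:

```python
import numpy as np
from scipy.stats import wilcoxon

# Synthetic per-subject proportions of gaze time on the eyes,
# loosely matching the reported group means (ANGRY 0.79, HAPPY 0.64).
rng = np.random.default_rng(1)
n = 13  # number of eye-tracked participants

angry = np.clip(rng.normal(0.79, 0.08, n), 0.0, 1.0)
happy = np.clip(rng.normal(0.64, 0.08, n), 0.0, 1.0)

# Paired, non-parametric comparison, as in the text
stat, p = wilcoxon(angry, happy)
print(0.0 <= p <= 1.0)  # True
```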

Results

Behavioral results

Anxiety scales

The adults (n = 15) had a mean T score of 46.13 ± 6.78 on the State-Trait Anxiety Inventory (STAI) trait scale (range: 33–57) and of 45.73 ± 5.93 on the STAI state scale (range: 34–54). Adolescents (n = 5) had a mean T score of 54.6 ± 6.87 on the Revised Children’s Manifest Anxiety Scale (RCMAS) (range: 49–66). We report T scores, as there are no clear cut-off points for these anxiety scales. These scores indicate that all participants had very low to moderate levels of anxiety, and none met the cut-off score for severe anxiety.

Toronto alexithymia score (TAS)

The TAS-20 uses a 5-point Likert scale in adults and a 2-point Likert scale in children and adolescents. To combine both, we transformed the 2-point scores onto the 5-point scale using the formula x′ = 2x + 1, where x is the initial score and x′ the transformed score. The mean score for the group (n = 20) was 40.0 ± 12.3 (range: 17–59). Despite the individual variation we observed, none of the individuals met criteria for alexithymia (cut-off score ≥ 61).
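The rescaling formula can be expressed directly. Note that the text does not specify whether x is an item score or a scale total, so the function below simply implements x′ = 2x + 1 as stated:

```python
# Rescale a 2-point TAS score onto the adult 5-point range, per the
# formula given in the text: x' = 2x + 1.
def rescale_tas(x):
    return 2 * x + 1

# The two possible 2-point responses map to 3 and 5 on the 5-point scale.
print([rescale_tas(x) for x in (1, 2)])  # [3, 5]
```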

Autism spectrum quotient (AQ): None of the participants scored above the cut-off of 26 (mean: 12.85 ± 5.71, range: 4–22) (Woodbury-Smith et al., 2005).

Neuroimaging

Effect of age. We performed whole-brain and ROI analyses with and without age as a regressor and found no difference. Therefore, we report only the analyses without age as a covariate in the text, figures and tables.

Whole brain analyses

All-emotions combined

Free viewing of the stimuli [NO-CROSS > CROSS] did not elicit any increased activation compared to constrained viewing.

Adding a fixation cross to the stimulus [CROSS > NO-CROSS] significantly increased brain activation in a number of areas of the social brain. These consisted of facial visual processing areas (IOG, fusiform gyrus) as well as areas from the amygdala network (AMY and OFC), the mentalizing network (posterior STS, temporal pole, dmPFC), the empathy network (INS, ACC) and the action-perception network (IFG, IPL). In addition, activation was observed in the ventral striatum and in the cerebellum (Figure 1 and Table 1).

Fig. 1.

Map of activation showing the effect of having a fixation cross in the eye region during the perception of dynamic emotional stimuli, with all conditions (NEUTRAL, HAPPY, ANGRY and FEAR) combined. Data are shown at P < 0.05, corrected.

Table 1.

Activation for the CROSS > NO-CROSS condition, all emotions analyzed together

CROSS > NO-CROSS, all emotions; coordinates are in MNI space.

| Brain area | Side | Cluster size (voxels) | Z-max | X | Y | Z |
|---|---|---|---|---|---|---|
| Inferior occipital gyrus | R | 21177 | 6.89 | 48 | −64 | −14 |
| Inferior occipital gyrus | L | | 4.11 | −46 | −78 | −14 |
| Lateral occipital cortex | R | | 4.29 | 42 | −88 | 0 |
| Lateral occipital cortex | L | | 6.49 | −38 | −84 | 22 |
| Precuneus | L | | 5.71 | −6 | −76 | 34 |
| Precuneus | R | | 4.31 | 8 | −74 | 44 |
| Occipital fusiform gyrus | R | | 5.34 | 40 | −62 | −12 |
| Occipital fusiform gyrus | L | | 4.73 | −34 | −78 | −16 |
| Intraparietal sulcus | L | | 4.74 | −26 | −80 | 30 |
| Intraparietal sulcus | R | | 3.85 | 34 | −74 | 34 |
| Superior parietal lobule | R | | 5.21 | 32 | −60 | 48 |
| Superior parietal lobule | L | | 4.87 | 20 | −56 | 62 |
| Posterior superior temporal cortex | R | | 3.62 | 58 | −20 | 0 |
| Posterior superior temporal cortex | L | | 3.17 | −58 | −38 | 8 |
| Orbitofrontal cortex | R | 10129 | 5.58 | 42 | 20 | −6 |
| Orbitofrontal cortex | L | | 3.65 | −42 | 20 | −12 |
| Inferior prefrontal cortex | R | | 5.22 | 38 | 34 | 32 |
| Temporal pole | R | | 4.81 | 50 | 14 | −8 |
| Temporal pole | L | | 4.32 | −54 | 18 | −8 |
| Dorsolateral prefrontal cortex | R | | 4.22 | 28 | 58 | 18 |
| Dorsolateral prefrontal cortex | L | | 2.77 | −42 | 40 | 18 |
| Inferior frontal gyrus opercularis | R | | 4.65 | 58 | 20 | 18 |
| Inferior frontal gyrus triangularis | R | | 3.98 | 46 | 30 | 4 |
| Precentral gyrus | R | | 3.90 | 54 | 0 | 32 |
| Insula | R | | 4.62 | 30 | 24 | −4 |
| Insula | L | | 4.11 | −42 | 18 | −4 |
| Putamen | R | | 3.07 | 18 | 12 | −10 |
| Putamen | L | | 2.85 | −16 | 14 | −4 |
| Accumbens | L | | 3.60 | −12 | 8 | −10 |
| Accumbens | R | | 2.52 | 10 | 16 | −6 |
| Amygdala | R | | 3.08 | 16 | −6 | −14 |
| Posterior cingulate | R | 2542 | 5.00 | 4 | −28 | 42 |
| Posterior cingulate | L | | 3.05 | −12 | −24 | 40 |
| Anterior cingulate | R | | 4.37 | 4 | 0 | 36 |
| Anterior cingulate | L | | 3.42 | −4 | 26 | 30 |
| Paracingulate gyrus | R | | 3.89 | 6 | 10 | 44 |
| Paracingulate gyrus | L | | 3.85 | −2 | 14 | 40 |
| Precentral cortex | L | | 3.59 | −30 | −6 | 44 |
| | | 1138 | 5.33 | −28 | 56 | 15 |
| Cerebellum VIIIa | L | | 4.28 | −28 | −44 | −52 |
| Cerebellum VIIIa | R | | 3.53 | 30 | −58 | −56 |
| Cerebellum VIIIb | R | | 3.08 | 16 | −58 | −54 |

Correlations with behavioral data and ROI data analyses are reported in supplementary material.

Differences between emotions

When each emotion was considered separately, we observed similar general patterns of activation, although at different levels for each emotion. The number of significantly activated voxels (Z > 2.3, corrected cluster significance of P = 0.05) was highest for ANGRY (4 clusters totaling 46,089 voxels, Zmax = 4.46), followed by NEUTRAL (5 clusters totaling 25,872 voxels, Zmax = 4.41), HAPPY (5 clusters totaling 22,606 voxels, Zmax = 4.22) and FEAR (5 clusters totaling 18,973 voxels, Zmax = 4.11) (Supplementary Figure S1 and Table S1; see supplementary material for more details).

Correlation with behavioral measures

STAI-S. Levels of state anxiety predicted activation in the left TPJ for ANGRY and HAPPY faces, as well as in the SMA for ANGRY faces (Table 2). There were no correlations between STAI-S and brain activation for NEUTRAL or FEAR faces.

Table 2.

Activation for the positive correlations with behavioral measures for the CROSS > NO-CROSS condition, for each emotion separately (Z > 2.3, P = 0.05); coordinates are in MNI space.

| Measure | Brain area | Side | Emotion | # voxels | Zmax | X | Y | Z |
|---|---|---|---|---|---|---|---|---|
| STAI-S | TPJ | L | ANGRY | 1170 | 4.09 | −54 | −32 | 28 |
| STAI-S | TPJ | L | HAPPY | 1006 | 3.6 | −54 | −36 | 32 |
| STAI-S | SMA | L/R | ANGRY | 1665 | 3.77 | 2 | −10 | 66 |
| STAI-T | Insula | R | ANGRY | 895 | 3.42 | 42 | 26 | 0 |
| STAI-T | Insula | R | NEUTRAL | 664 | 3.34 | 42 | 28 | −2 |
| STAI-T | Insula | L | ANGRY | 720 | 3.31 | −36 | 10 | 4 |
| STAI-T | SMA | L/R | ANGRY | 1030 | 3.95 | −16 | 6 | 70 |
| STAI-T | DLPFC | L | ANGRY | 731 | 4.31 | −30 | 44 | 12 |
| STAI-T | DLPFC | L | FEAR | 681 | 3.48 | −28 | 44 | 14 |
| TAS | Insula | L | ANGRY | 889 | 3.12 | −32 | 12 | 6 |
| TAS | Insula | R | ANGRY | 1172 | 3.5 | 36 | 14 | 2 |
| AQ | Precentral gyrus | | HAPPY | 868 | 3.75 | 50 | 8 | 42 |

STAI-T. Levels of trait anxiety predicted activation in the right anterior insula for NEUTRAL; in the left dlPFC, bilateral anterior insula and SMA for ANGRY; and in the left dlPFC for FEAR. There were, however, no correlations between STAI-T and activation for HAPPY.

TAS. Levels of alexithymic traits predicted activation in the insula bilaterally for ANGRY faces. There were no other correlations between TAS and brain activation for the other emotions.

AQ. Levels of autistic traits predicted activation in the precentral cortex and the middle frontal gyrus for HAPPY faces, but not for other emotions.

PPI analysis results

NEUTRAL. PPI analysis revealed that the amygdala had increased functional connectivity in the CROSS vs NO-CROSS condition for neutral faces with the bilateral pericalcarine cortex, IOG, FFA, inferior temporal cortex, left STS, left TPJ, left superior frontal gyrus, PAG and the cerebellum (bilateral crus I and II, left VIIIb, left IX, vermis VI and right VI; Figure 2; Supplementary Table S7).

Fig. 2.

PPI analysis. Maps of increased connectivity between the amygdala and the rest of the cortex for NEUTRAL (top left), HAPPY (top right), ANGRY (bottom left) and FEAR (bottom right). Red-to-yellow depicts increased connectivity between the amygdala and the rest of the brain for the CROSS condition, and cyan-to-blue depicts increased connectivity between the amygdala and the rest of the brain for the NO-CROSS condition. All data are shown at P < 0.05, corrected.

HAPPY. PPI analysis revealed that the amygdala had increased functional connectivity in the CROSS vs NO-CROSS condition for happy faces with the pericalcarine, cuneal and lateral occipital cortices bilaterally, the right STS, the bilateral orbitofrontal and medial prefrontal cortices, the right IFG and the right temporal pole, as well as with the accumbens and the caudate bilaterally (Supplementary Table S8).

ANGRY. PPI analysis revealed increased connectivity between the amygdala and the hypothalamus, the thalamus, the left insula and the bilateral primary motor and somatosensory cortices for the CROSS vs NO-CROSS condition (Supplementary Table S9).

FEAR. For fearful faces, there were no differences in amygdala functional connectivity for the CROSS > NO-CROSS condition. However, there was an increase in amygdala functional connectivity for the NO-CROSS > CROSS condition in the bilateral orbitofrontal cortex, right medial prefrontal cortex, bilateral anterior cingulate and bilateral insula, as well as in the right temporal pole, right superior frontal gyrus and occipital pole (Supplementary Table S10).

PPI analysis results—correlations with behavioral measures

All behavioral scores examined predicted the strength of amygdala functional connectivity with the rest of the cortex, although differently for each of the four emotions (Figures 3 and 4). More details are given in the supplementary material.

Fig. 3.

PPI analysis—correlation with STAI trait. Maps of increased connectivity between the amygdala and the rest of the cortex correlated with trait anxiety level, for NEUTRAL (top left), HAPPY (top right), ANGRY (bottom left) and FEAR (bottom right). Red-to-yellow depicts increased connectivity between the amygdala and the rest of the brain for the CROSS condition, and cyan-to-blue depicts increased connectivity between the amygdala and the rest of the brain for the NO-CROSS condition. All data are shown at P < 0.05, corrected.

Fig. 4.

PPI analysis—correlation with AQ. Maps of increased connectivity between the amygdala and the rest of the cortex correlated with autistic traits (AQ), for ANGRY (left) and FEAR (right). Red-to-yellow depicts increased connectivity between the amygdala and the rest of the brain for the CROSS condition, and cyan-to-blue depicts increased connectivity between the amygdala and the rest of the brain for the NO-CROSS condition. All data are shown at P < 0.05, corrected.

Correlation between eye-tracking data and behavioral data. See Supplementary Material.

ROI analyses. The results of the ROI analyses are given in the Supplementary Material.

Discussion

In the present study, we investigated whether constraining participants’ attention to the eyes of dynamic emotional faces would enhance their brain response in regions involved in perceiving social signals. In accordance with our hypothesis, we show that activity in social brain regions is enhanced for all emotions when attention is constrained to the eye region. It is important to stress that the current study only includes healthy neurotypical participants who already spontaneously look more at the eyes than at other parts of people’s faces. Previous studies have shown, however, that the amount of time people spend looking at the eyes is less than 100% (Janik et al., 1978; Pelphrey et al., 2002), and that there are individual differences in what counts as a ‘comfortable’ amount of eye gaze (Binetti et al., 2016). Given that eye contact is so pivotal in human social interactions, and that information about abnormal eye contact is used diagnostically for clinical conditions such as autism and schizophrenia, it is surprising that ‘normal’ eye-contact behavior remains so poorly understood (Binetti et al., 2016). From this perspective, the current study contributes new, valuable insights. Specifically, even in a group who spontaneously orient towards the eyes, constraining the gaze to remain stably in the eye region produces neural activation patterns in the social brain markedly different from those observed when these individuals are free to allocate their visual attention during face perception. Moreover, this constrained eye-contact effect has somewhat different signatures depending on the emotional expression. Finally, we show that this effect is important for understanding individual differences, since its magnitude for the different emotions was modulated by individual differences in anxiety, alexithymia and autistic traits.

It might be surprising that the constrained eye-contact effect was evident for all emotions, including NEUTRAL faces. A possible mechanism is that direct gaze is processed prior to affect evaluation and potentiates the process, as suggested by ERP studies showing that mutual eye contact evokes a response as early as 85 ms, while the effect of the emotional expression is observed at around 115 ms (Pizzagalli et al., 1999; Eimer and Holmes, 2002; Eger et al., 2003; Klucharev and Sams, 2004). Still, although all emotions yielded more activation in the social brain in the constrained fixation condition, the effect was most pronounced for ANGRY faces. This is unsurprising, given that an angry face looking directly at us is considered particularly powerful (Tiedens, 2001) and elicits strong arousal (Garfinkel et al., 2016). Angry faces are detected more efficiently (Fox et al., 2000), as they potentially represent a biological threat (Stussi et al., 2015). When we think of ourselves as less powerful than our angry social partner (e.g. child vs adult), this social partner represents a direct threat (Ewbank et al., 2009), and our typical reaction is to avert our gaze in order to avoid confrontation (Marsh et al., 2005). In fact, when parents are angry and berate their kids, they very often say: ‘look me in the eyes’. Angry faces with direct gaze are indeed more self-relevant, as they may signal a danger of being attacked (Sander et al., 2003), and they are processed more rapidly when they display a mutual gaze (Adams and Kleck, 2005; Graham and LaBar, 2007; Sander et al., 2007; Bindemann et al., 2008).

Constraining participants to look into the eyes of NEUTRAL faces had the next strongest effect on brain activation. NEUTRAL faces are often ambiguous and can be perceived as negatively valenced (Thomas et al., 2001; Donegan et al., 2003; Somerville et al., 2004; Iidaka et al., 2005; Lobaugh et al., 2006), and even as threatening by socially anxious individuals (Yoon and Zinbarg, 2007, 2008). Here, we also observed that self-reported anxiety traits predicted activation for NEUTRAL faces in the right anterior insula, an area involved in interoception and the representation of conscious emotional experience (Hogeveen et al., 2016).

Constraining participants to look into the eyes of FEAR faces had the least effect. Given that fearful faces automatically attract attention to the eye region (Schyns et al., 2007), the eye region might already have been fixated in the free-viewing condition, so constraining fixation there would have little additional effect. In typical participants, it has been suggested that the amygdala triggers reflexive orienting towards fearful eyes (Gamer and Buchel, 2009; Gamer et al., 2010). In addition, individuals with amygdala damage do not direct their attention to the eye region, which prevents them from recognizing fear (Adolphs et al., 2005). However, it has also been shown that averted gaze, compared with direct gaze, facilitates the identification of fear, in direct contrast to what is the case for more ‘approach-oriented’ emotions such as anger and happiness (Adams and Kleck, 2003).

Personality traits modulating constrained emotional face perception

Anxiety has been found to bias emotional processing (Rossignol et al., 2005). Previous studies have reported that individuals with high levels of social anxiety tend to fixate less on the eye region, especially when processing emotional faces (Horley et al., 2003; Schulze et al., 2013). In individuals with social anxiety disorder (SAD), gaze avoidance is present for both positive and negative social stimuli (Weeks et al., 2013). This avoidance is associated with stronger orienting towards gaze in angry faces in anxious individuals (Holmes et al., 2006), demonstrating heightened sensitivity to these ‘threat’ stimuli. Although we measured general trait anxiety in this study, as opposed to social anxiety specifically, the two are strongly associated. Imaging studies of patients with SAD have shown that direct (vs averted) eye contact in neutral faces increased activity in the vmPFC, the parahippocampal gyrus and the posterior cingulate compared with controls (Schneier et al., 2011), and adolescents with anxiety disorders show enhanced autonomic reactions and self-evaluated arousal during the perception of direct gaze (Myllyneva et al., 2015). In the current study, a subset of participants were eye-tracked during perception of face stimuli similar to those used in the free-viewing condition in the scanner. Importantly, we found the association between diminished spontaneous eye gaze and anxiety traits described in previous research, while no clear associations were seen between eye gaze and either alexithymic or autistic traits. Even though these findings are limited by the small sample size and the fact that eye-tracking was performed outside the scanner, the results highlight a more general possibility: the modulation of brain activity in the constrained vs unconstrained condition could reflect differences in the eye fixations that occur during unconstrained face perception.
Given that we find robust modulations in brain activity as a function of all these personality traits, an important avenue for future research would be to parse the brain-behavior associations between specific patterns of both spontaneous and constrained eye contact and a range of personality traits in larger groups, including in clinical samples.

In accordance with our hypothesis, we found that anxiety potentiated the effect of constraining attention to the eye region of emotional faces, especially for anger, which represents a direct threat.

Alexithymia, a personality trait characterized by difficulty identifying and describing emotions, has been associated with deficits in detecting and matching emotional facial expressions in healthy populations [for review, see Grynberg et al. (2012)]. People with alexithymia are impaired at recognizing others’ facial expressions (Parker et al., 1993; Mann et al., 1994; Prkachin et al., 2009), possibly because of their atypical attention to the eyes in faces (Bird et al., 2011). None of our participants had TAS scores that would warrant a diagnosis of alexithymia (61 and above). Yet, we found positive correlations between alexithymia scores and the level of activity in the anterior insula for all four emotions in the CROSS vs NO-CROSS comparison, indicating that the more difficulty participants had identifying their own emotions, the greater their increase in insula activation when having to sustain direct gaze vs looking at faces without constraints. The strongest correlations between alexithymia and brain activation were found for the ANGRY condition and the more ambiguous NEUTRAL condition.

In addition, for the ANGRY and NEUTRAL conditions we found that alexithymia scores were positively correlated with the strength of connectivity between the amygdala and the mPFC, the STS, the OFC and the vmPFC, as well as the ventral striatum, the thalamus and the brainstem. The present results therefore support the view that both alexithymia and anxiety play an important role in gaze perception (Cook et al., 2013).

Autistic traits. There has been a debate as to whether the social and communication impairments of individuals with autism result from their difficulty attending to the eye region [for review, see Senju and Johnson (2009a)]. In the present study, conducted in the general population, the only correlation between social brain activation and autistic traits was found in the fusiform gyrus when all emotions were combined. The fusiform gyrus is involved in processing invariant aspects of faces [see Haxby et al. (2000) for a review]. The reason why some people with high levels of autistic traits attend less to the eye region of faces is unclear, but it may reflect active avoidance of this facial area because of its arousing properties [see Tanaka and Sung (2016) for an in-depth description of this hypothesis].

The PPI analysis was performed to specifically examine the influence of constrained direct gaze on amygdala connectivity with the rest of the brain for different emotional expressions. In a pioneering study, George et al. (2001) reported that direct gaze (in neutral faces) increases the functional connectivity between the amygdala and the fusiform gyrus, consistent with the concept of modulation of the fusiform gyrus by amygdala activation. Using fMRI, Sato et al. (2004) demonstrated that the left amygdala showed increased activation to angry faces with direct gaze compared with angry faces with averted gaze, and with neutral faces with direct or averted gaze. Here, we show that constraining gaze to the eye region has a very different effect depending on which emotion is processed, increasing amygdala connectivity in different areas of the brain for NEUTRAL, HAPPY and ANGRY faces, while reducing it for FEAR. For NEUTRAL faces, as expected, we observed increased connectivity for the CROSS vs NO-CROSS condition essentially in the face-processing network, whilst for HAPPY faces, effects were observed not only in the visual cortex but also in areas involved in emotional and reward processing, including the ventromedial and orbital prefrontal cortex, the temporal pole and the ventral striatum. Constraining gaze into the eyes of ANGRY faces increased the connectivity of the amygdala with the face region of the somatosensory and motor cortex, as well as with the hypothalamus, the thalamus and the insula. Activity in the hypothalamus may represent a stress response to having to look into the eyes of a threatening face, and it is noteworthy that ANGRY was the only emotion for which we observed increased amygdala connectivity with the hypothalamus. Note that the same areas (insula and hypothalamus) are involved in the social regulation of neural responses to threat (Coan et al., 2006).

A limitation of this study is that we could not collect eye-tracking data while participants were engaged in face perception in the scanner. Such information would have helped us better characterize the participants’ behavioral response to the constrained gaze manipulation. Even so, our data show that constraining typically developed participants to look consistently into the eyes of neutral and emotional dynamic face stimuli markedly alters brain activation, and that this effect is most pronounced for threatening social stimuli (anger). Finally, our data demonstrate that amygdala connectivity varies as a function of the emotional expression presented and is modulated by behavioral traits such as alexithymia, anxiety and autistic traits.

Supplementary data

Supplementary data are available at SCAN online.

Acknowledgements

We wish to thank Ophélie Rogier for her help with data acquisition, Karine Métrailler and Carole Burget for their support with participant recruitment and administrative assistance, and A. Lissot and T. Ruest for their help with data analysis.

Funding

This work was supported by the Swiss National Science Foundation (PP00P3-130191 to NH), the Centre d’Imagerie BioMédicale (CIBM) of the University of Lausanne (UNIL), as well as the Foundation Rossi Di Montalera and the LifeWatch Foundation. The funders had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; and preparation, review or approval of the manuscript.

Conflict of interest. None declared.

References

  1. Adams R.B. Jr., Kleck R.E. (2003). Perceived gaze direction and the processing of facial displays of emotion. Psychological Science, 14(6), 644–7. [DOI] [PubMed] [Google Scholar]
  2. Adams R.B. Jr., Kleck R.E. (2005). Effects of direct and averted gaze on the perception of facially communicated emotion. Emotion, 5(1), 3–11. [DOI] [PubMed] [Google Scholar]
  3. Adolphs R., Gosselin F., Buchanan T.W., Tranel D., Schyns P., Damasio A.R. (2005). A mechanism for impaired fear recognition after amygdala damage. Nature 433(7021), 68–72. [DOI] [PubMed] [Google Scholar]
  4. Argyle M., Cook M. (1976). Gaze and Mutual Gaze. New York, NY: Cambridge University Press. [Google Scholar]
  5. Arsalidou M., Morris D., Taylor M.J. (2011). Converging evidence for the advantage of dynamic facial expressions. Brain Topography, 24(2), 149–63. [DOI] [PubMed] [Google Scholar]
  6. Åsberg Johnels J., Hovey D., Zurcher N.R., et al. (2016). Autism and emotional viewing. Autism Research, in press. [DOI] [PubMed] [Google Scholar]
  7. Ashburner J., Andersson J.L., Friston K.J. (2000). Image registration using a symmetric prior–in three dimensions. Human Brain Mapping, 9(4), 212–25. [DOI] [PMC free article] [PubMed] [Google Scholar]
  8. Bagby R.M., Parker J.D.A., Taylor G.J. (1994). The twenty-item Toronto Alexithymia Scale-I. Item selection and cross-validation of the factor structure. Journal of Psychosomatic Research, 38, 23–32. [DOI] [PubMed] [Google Scholar]
  9. Banks S.J., Eddy K.T., Angstadt M., Nathan P.J., Phan K.L. (2007). Amygdala-frontal connectivity during emotion regulation. Social Cognitive and Affective Neuroscience, 2(4), 303–12. [DOI] [PMC free article] [PubMed] [Google Scholar]
  10. Baron-Cohen S. (1995). Mindblindness: An Essay on Theory of Mind. Cambridge, MA: MIT Press. [Google Scholar]
  11. Baron-Cohen S., Hoekstra R.A., Knickmeyer R., Wheelwright S. (2006). The Autism-Spectrum Quotient (AQ)–adolescent version. Journal of Autism and Developmental Disorders, 36(3), 343–50. [DOI] [PubMed] [Google Scholar]
  12. Baron-Cohen S., Wheelwright S., Skinner R., Martin J., Clubley E. (2001). The autism-spectrum quotient (AQ): evidence from Asperger syndrome/high-functioning autism, males and females, scientists and mathematicians. Journal of Autism and Developmental Disorders, 31(1), 5–17. [DOI] [PubMed] [Google Scholar]
  13. Bindemann M., Burton A., Langton S. (2008). How do eye gaze and facial expression interact? Visual Cognition, 16(6), 37–41. [Google Scholar]
  14. Binetti N., Harrison C., Coutrot A., Johnston A., Mareschal I. (2016). Pupil dilation as an index of preferred mutual gaze duration. Royal Society Open Science, 3(7). [DOI] [PMC free article] [PubMed] [Google Scholar]
  15. Bird G., Press C., Richardson D.C. (2011). The role of alexithymia in reduced eye-fixation in Autism Spectrum Conditions. Journal of Autism and Developmental Disorders, 41(11), 1556–64. [DOI] [PubMed] [Google Scholar]
  16. Burra N., Hervais-Adelman A., Kerzel D., Tamietto M., de Gelder B., Pegna A.J. (2013). Amygdala activation for eye contact despite complete cortical blindness. Journal of Neuroscience, 33(25), 10483–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  17. Calder A.J., Lawrence A.D., Keane J., et al. (2002). Reading the mind from eye gaze. Neuropsychologia, 40(8), 1129–38. [DOI] [PubMed] [Google Scholar]
  18. Coan J.A., Schaefer H.S., Davidson R.J. (2006). Lending a hand: social regulation of the neural response to threat. Psychological Science, 17(12), 1032–9. [DOI] [PubMed] [Google Scholar]
  19. Cook R., Brewer R., Shah P., Bird G. (2013). Alexithymia, not autism, predicts poor recognition of emotional facial expressions. Psychological Science, 24(5), 723–32. [DOI] [PubMed] [Google Scholar]
  20. Daly S. (1978). Behavioral correlates of social anxiety. British Journal of Social and Clinical Psychology, 17(2), 117–20. [DOI] [PubMed] [Google Scholar]
  21. Darwin C. (1872/1965). The Expression of the Emotions in Man and Animals. Chicago, IL: University of Chicago. [Google Scholar]
  22. Donegan N.H., Sanislow C.A., Blumberg H.P., et al. (2003). Amygdala hyperreactivity in borderline personality disorder: implications for emotional dysregulation. Biological Psychiatry, 54(11), 1284–93. [DOI] [PubMed] [Google Scholar]
  23. Eger E., Jedynak A., Iwaki T., Skrandies W. (2003). Rapid extraction of emotional expression: evidence from evoked potential fields during brief presentation of face stimuli. Neuropsychologia, 41(7), 808–17. [DOI] [PubMed] [Google Scholar]
  24. Eimer M., Holmes A. (2002). An ERP study on the time course of emotional face processing. Neuroreport, 13(4), 427–31. [DOI] [PubMed] [Google Scholar]
  25. Eisenbarth H., Alpers G.W. (2011). Happy mouth and sad eyes: scanning emotional facial expressions. Emotion, 11(4), 860–5. [DOI] [PubMed] [Google Scholar]
  26. Emery N.J. (2000). The eyes have it: the neuroethology, function and evolution of social gaze. Neuroscience and Biobehavioral Reviews, 24(6), 581–604. [DOI] [PubMed] [Google Scholar]
  27. Ewbank M.P., Jennings C., Calder A.J. (2009). Why are you angry with me? Facial expressions of threat influence perception of gaze direction. Journal of Vision, 9(12), 16.1–7. [DOI] [PubMed] [Google Scholar]
  28. Farroni T., Csibra G., Simion F., Johnson M.H. (2002). Eye contact detection in humans from birth. Proceedings of the National Academy of Sciences of the United States of America, 99(14), 9602–5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  29. Fox E., Lester V., Russo R., Bowles R.J., Pichler A., Dutton K. (2000). Facial expressions of emotion: are angry faces detected more efficiently? Cognition and Emotion, 14(1), 61–92. [DOI] [PMC free article] [PubMed] [Google Scholar]
  30. Gamer M., Buchel C. (2009). Amygdala activation predicts gaze toward fearful eyes. Journal of Neuroscience, 29(28), 9123–6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  31. Gamer M., Zurowski B., Buchel C. (2010). Different amygdala subregions mediate valence-related and attentional effects of oxytocin in humans. Proceedings of the National Academy of Sciences of the United States of America, 107(20), 9400–5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  32. Garfinkel S.N., Tiley C., O'Keeffe S., Harrison N.A., Seth A.K., Critchley H.D. (2016). Discrepancies between dimensions of interoception in autism: implications for emotion and anxiety. Biological Psychology, 114, 117–26. [DOI] [PubMed] [Google Scholar]
  33. Garner M., Mogg K., Bradley B.P. (2006). Orienting and maintenance of gaze to facial expressions in social anxiety. Journal of Abnormal Psychology, 115(4), 760–70. [DOI] [PubMed] [Google Scholar]
  34. George N., Conty L. (2008). Facing the gaze of others. Neurophysiologie Clinique, 38(3), 197–207. [DOI] [PubMed] [Google Scholar]
  35. George N., Driver J., Dolan R.J. (2001). Seen gaze-direction modulates fusiform activity and its coupling with other brain areas during face processing. Neuroimage, 13(6 Pt 1), 1102–12. [DOI] [PubMed] [Google Scholar]
  36. Ghashghaei H.T., Hilgetag C.C., Barbas H. (2007). Sequence of information processing for emotions based on the anatomic dialogue between prefrontal cortex and amygdala. Neuroimage, 34(3), 905–23. [DOI] [PMC free article] [PubMed] [Google Scholar]
  37. Gorka S.M., Fitzgerald D.A., de Wit H., Phan K.L. (2015). Cannabinoid modulation of amygdala subregion functional connectivity to social signals of threat. International Journal of Neuropsychopharmacology, 18(3). [DOI] [PMC free article] [PubMed] [Google Scholar]
  38. Gorka S.M., Fitzgerald D.A., King A.C., Phan K.L. (2013). Alcohol attenuates amygdala-frontal connectivity during processing social signals in heavy social drinkers: a preliminary pharmaco-fMRI study. Psychopharmacology (Berlin), 229(1),141–54. [DOI] [PMC free article] [PubMed] [Google Scholar]
  39. Graham R., LaBar K.S. (2007). Garner interference reveals dependencies between emotional expression and gaze in face perception. Emotion, 7(2), 296–313. [DOI] [PubMed] [Google Scholar]
  40. Greening S.G., Mitchell D.G. (2015). A network of amygdala connections predict individual differences in trait anxiety. Human Brain Mapping, 36(12), 4819–30. [DOI] [PMC free article] [PubMed] [Google Scholar]
  41. Grynberg D., Chang B., Corneille O., et al. (2012). Alexithymia and the processing of emotional facial expressions (EFEs): systematic review, unanswered questions and further perspectives. PLoS One, 7(8), e42429. [DOI] [PMC free article] [PubMed] [Google Scholar]
  42. Haith M.M., Bergman T., Moore M.J. (1977). Eye contact and face scanning in early infancy. Science, 198(4319), 853–5. [DOI] [PubMed] [Google Scholar]
  43. Haxby J.V., Hoffman E.A., Gobbini M.I. (2000). The distributed human neural system for face perception. Trends in Cognitive Science, 4(6), 223–33. [DOI] [PubMed] [Google Scholar]
  44. Helminen T.M., Kaasinen S.M., Hietanen J.K. (2011). Eye contact and arousal: the effects of stimulus duration. Biological Psychology, 88(1), 124–30. [DOI] [PubMed] [Google Scholar]
  45. Hietanen J.K., Leppanen J.M., Peltola M.J., Linna-Aho K., Ruuhiala H.J. (2008). Seeing direct and averted gaze activates the approach-avoidance motivational brain systems. Neuropsychologia, 46(9), 2423–30. [DOI] [PubMed] [Google Scholar]
  46. Hogeveen J., Bird G., Chau A., Krueger F., Grafman J. (2016). Acquired alexithymia following damage to the anterior insula. Neuropsychologia, 82, 142–8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  47. Holmes A., Richards A., Green S. (2006). Anxiety and sensitivity to eye gaze in emotional faces. Brain Cognition, 60(3), 282–94. [DOI] [PubMed] [Google Scholar]
  48. Horley K., Williams L.M., Gonsalvez C., Gordon E. (2003). Social phobics do not see eye to eye: a visual scanpath study of emotional expression processing. Journal of Anxiety Disorders, 17(1), 33–44. [DOI] [PubMed] [Google Scholar]
  49. Huijgen J., Dinkelacker V., Lachat F., et al. (2015). Amygdala processing of social cues from faces: an intracerebral EEG study. Social Cognitive and Affective Neuroscience, 10(11), 1568–76. [DOI] [PMC free article] [PubMed] [Google Scholar]
  50. Iidaka T., Ozaki N., Matsumoto A., et al. (2005). A variant C178T in the regulatory region of the serotonin receptor gene HTR3A modulates neural activation in the human amygdala. Journal of Neuroscience, 25(27), 6460–6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  51. Janik S.W., Wellens A.R., Goldberg M.L., Dell'Osso L.F. (1978). Eyes as the center of focus in the visual examination of human faces. Perceptual and Motor Skills, 47(3 Pt 1), 857–8. [DOI] [PubMed] [Google Scholar]
  52. Kanwisher N., McDermott J., Chun M.M. (1997). The fusiform face area: a module in human extrastriate cortex specialized for face perception. Journal of Neuroscience, 17(11), 4302–11. [DOI] [PMC free article] [PubMed] [Google Scholar]
  53. Kawashima R., Sugiura M., Kato T., et al. (1999). The human amygdala plays an important role in gaze monitoring. A PET study. Brain, 122 (Pt 4), 779–83. [DOI] [PubMed] [Google Scholar]
  54. Kennedy D.P., Adolphs R. (2012). The social brain in psychiatric and neurological disorders. Trends in Cognitive Science, 16, 559–72. [DOI] [PMC free article] [PubMed] [Google Scholar]
  55. Kleinke C.L. (1986). Gaze and eye contact: a research review. Psychological Bulletin, 100(1), 78–100. [PubMed] [Google Scholar]
  56. Klucharev V., Sams M. (2004). Interaction of gaze direction and facial expressions processing: ERP study. Neuroreport, 15(4), 621–5. [DOI] [PubMed] [Google Scholar]
  57. Lobaugh N.J., Gibson E., Taylor M.J. (2006). Children recruit distinct neural systems for implicit emotional face processing. Neuroreport, 17(2), 215–9. [DOI] [PubMed] [Google Scholar]
  58. Mann L.S., Wise T.N., Trinidad A., Kohanski R. (1994). Alexithymia, affect recognition, and the five-factor model of personality in normal subjects. Psychological Reports, 74(2), 563–7. [DOI] [PubMed] [Google Scholar]
  59. Miyahara M., Harada T., Ruffman T., Sadato N., Iidaka T. (2013). Functional connectivity between amygdala and facial regions involved in recognition of facial threat. Social Cognitive and Affective Neuroscience, 8(2), 181–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  60. Morris J.P., Pelphrey K.A., McCarthy G. (2007). Controlled scanpath variation alters fusiform face activation. Social Cognitive and Affective Neuroscience, 2(1), 31–8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  61. Moukheiber A., Rautureau G., Perez-Diaz F., et al. (2010). Gaze avoidance in social phobia: objective measure and correlates. Behaviour Research and Therapy, 48(2), 147–51. [DOI] [PubMed] [Google Scholar]
  62. Myllyneva A., Ranta K., Hietanen J.K. (2015). Psychophysiological responses to eye contact in adolescents with social anxiety disorder. Biological Psychology, 109, 151–8. [DOI] [PubMed] [Google Scholar]
  63. O'Reilly J.X., Woolrich M.W., Behrens T.E., Smith S.M., Johansen-Berg H. (2012). Tools of the trade: psychophysiological interactions and functional connectivity. Social Cognitive and Affective Neuroscience, 7(5), 604–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  64. Parker J.D., Taylor G.J., Bagby R.M. (1993). Alexithymia and the recognition of facial expressions of emotion. Psychotherapy and Psychosomatics, 59(3–4), 197–202. [DOI] [PubMed] [Google Scholar]
  65. Pelphrey K.A., Sasson N.J., Reznick J.S., Paul G., Goldman B.D., Piven J. (2002). Visual scanning of faces in autism. Journal of Autism and Developmental Disorders, 32(4), 249–61. [DOI] [PubMed] [Google Scholar]
  66. Pizzagalli D., Regard M., Lehmann D. (1999). Rapid emotional face processing in the human right and left brain hemispheres: an ERP study. Neuroreport, 10(13), 2691–8. [DOI] [PubMed] [Google Scholar]
  67. Prkachin G., Casey C., Prkachin K.M. (2009). Alexithymia and perception of facial expressions of emotion. Personality and Individual Differences, 46(4), 412–7. [Google Scholar]
  68. Reynolds C.R., Richmond B.O., Castro D. (1999). Échelle Révisée D’Anxiété Manifeste Pour Enfants (R-CMAS). Paris: Les Éditions du Centre de Psychologie Appliquée. [Google Scholar]
  69. Roelofs K., Putman P., Schouten S., Lange W.G., Volman I., Rinck M. (2010). Gaze direction differentially affects avoidance tendencies to happy and angry faces in socially anxious individuals. Behaviour Research and Therapy, 48(4), 290–4. [DOI] [PubMed] [Google Scholar]
  70. Rossignol M., Philippot P., Douilliez C., Crommelinck M., Campanella S. (2005). The perception of fearful and happy facial expression is modulated by anxiety: an event-related potential study. Neuroscience Letters, 377(2), 115–20. [DOI] [PubMed] [Google Scholar]
  71. Rossion B., Caldara R., Seghier M., Schuller A.M., Lazeyras F., Mayer E. (2003). A network of occipito-temporal face-sensitive areas besides the right middle fusiform gyrus is necessary for normal face processing. Brain, 126, 2381–95. [DOI] [PubMed] [Google Scholar]
  72. Rothkirch M., Madipakkam A.R., Rehn E., Sterzer P. (2015). Making eye contact without awareness. Cognition, 143, 108–14. [DOI] [PubMed] [Google Scholar]
  73. Sander D., Grafman J., Zalla T. (2003). The human amygdala: an evolved system for relevance detection. Reviews in Neuroscience, 14(4), 303–16. [DOI] [PubMed] [Google Scholar]
  74. Sander D., Grandjean D., Kaiser S., Wehrle T., Scherer K.R. (2007). Interaction effects of perceived gaze direction and dynamic facial expression: evidence for appraisal theories of emotion. European Journal of Cognitive Psychology, 19, 470–80. [Google Scholar]
  75. Sato W., Yoshikawa S., Kochiyama T., Matsumura M. (2004). The amygdala processes the emotional significance of facial expressions: an fMRI investigation using the interaction between expression and face direction. Neuroimage, 22(2), 1006–13. [DOI] [PubMed] [Google Scholar]
  76. Schneier F.R., Pomplun M., Sy M., Hirsch J. (2011). Neural response to eye contact and paroxetine treatment in generalized social anxiety disorder. Psychiatry Research, 194(3), 271–8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  77. Schulze L., Renneberg B., Lobmaier J.S. (2013). Gaze perception in social anxiety and social anxiety disorder. Frontiers in Human Neuroscience, 7, 872. [DOI] [PMC free article] [PubMed] [Google Scholar]
  78. Schyns P.G., Petro L.S., Smith M.L. (2007). Dynamics of visual information integration in the brain for categorizing facial expressions. Current Biology, 17(18), 1580–5. [DOI] [PubMed] [Google Scholar]
  79. Senju A., Johnson M.H. (2009a). Atypical eye contact in autism: models, mechanisms and development. Neuroscience and Biobehavioral Reviews, 33(8), 1204–14. [DOI] [PubMed] [Google Scholar]
  80. Senju A., Johnson M.H. (2009b). The eye contact effect: mechanisms and development. Trends in Cognitive Sciences, 13(3), 127–34. [DOI] [PubMed] [Google Scholar]
  81. Somerville L.H., Kim H., Johnstone T., Alexander A.L., Whalen P.J. (2004). Human amygdala responses during presentation of happy and neutral faces: correlations with state anxiety. Biological Psychiatry, 55(9), 897–903. [DOI] [PubMed] [Google Scholar]
  82. Soussignan R., Chadwick M., Philip L., Conty L., Dezecache G., Grezes J. (2013). Self-relevance appraisal of gaze direction and dynamic facial expressions: effects on facial electromyographic and autonomic reactions. Emotion, 13(2), 330–7. [DOI] [PubMed] [Google Scholar]
  83. Spielberger C.D., Gorsuch R.L., Lushene R., Vagg P.R., Jacobs G.A. (1983). Manual for the State-Trait Anxiety Inventory. Palo Alto, CA: Consulting Psychologists Press. [Google Scholar]
  84. Stussi Y., Brosch T., Sander D. (2015). Learning to fear depends on emotion and gaze interaction: The role of self-relevance in fear learning. Biological Psychology, 109, 232–8. [DOI] [PubMed] [Google Scholar]
  85. Tanaka J.W., Sung A. (2016). The “Eye Avoidance” hypothesis of autism face processing. Journal of Autism and Developmental Disorders, 46(5), 1538–52. [DOI] [PMC free article] [PubMed] [Google Scholar]
  86. Thomas K.M., Drevets W.C., Dahl R.E., et al. (2001). Amygdala response to fearful faces in anxious and depressed children. Archives of General Psychiatry, 58(11), 1057–63. [DOI] [PubMed] [Google Scholar]
  87. Tiedens L.Z. (2001). The effect of anger on the hostile inferences of aggressive and nonaggressive people: Specific emotions, cognitive processing, and chronic accessibility. Motivation and Emotion, 25(3), 233–51. [Google Scholar]
  88. Tottenham N., Tanaka J.W., Leon A.C., et al. (2009). The NimStim set of facial expressions: judgments from untrained research participants. Psychiatry Research, 168(3), 242–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  89. Vuilleumier P., Armony J.L., Driver J., Dolan R.J. (2001). Effects of attention and emotion on face processing in the human brain: an event-related fMRI study. Neuron, 30(3), 829–41. [DOI] [PubMed] [Google Scholar]
  90. Vuilleumier P., Pourtois G. (2007). Distributed and interactive brain mechanisms during emotion face perception: evidence from functional neuroimaging. Neuropsychologia, 45(1), 174–94. [DOI] [PubMed] [Google Scholar]
  91. Wechsler D. (1999). Wechsler Abbreviated Scale of Intelligence (WASI). San Antonio, TX: Harcourt Assessment. [Google Scholar]
  92. Wechsler D., Naglieri J. (2006). Wechsler Nonverbal Scale of Ability. San Antonio, TX: PsychCorp Edition, A Brand of Harcourt Assessment. [Google Scholar]
  93. Weeks J.W., Howell A.N., Goldin P.R. (2013). Gaze avoidance in social anxiety disorder. Depression and Anxiety, 30(8), 749–56. [DOI] [PubMed] [Google Scholar]
  94. Wieser M.J., Pauli P., Alpers G.W., Muhlberger A. (2009). Is eye to eye contact really threatening and avoided in social anxiety? An eye-tracking and psychophysiology study. Journal of Anxiety Disorders, 23(1), 93–103. [DOI] [PubMed] [Google Scholar]
  95. Woodbury-Smith M.R., Robinson J., Wheelwright S., Baron-Cohen S. (2005). Screening adults for Asperger Syndrome using the AQ: a preliminary study of its diagnostic validity in clinical practice. Journal of Autism and Developmental Disorders, 35(3), 331–5. [DOI] [PubMed] [Google Scholar]
  96. Worsley K.J. (2001). Statistical analysis of activation images. In: Jezzard P., Matthews P.M., Smith S.M., editors. Functional MRI: An Introduction to Methods. Oxford: Oxford University Press. [Google Scholar]
  97. Yoon K.L., Zinbarg R.E. (2007). Threat is in the eye of the beholder: social anxiety and the interpretation of ambiguous facial expressions. Behaviour Research and Therapy, 45(4), 839–47. [DOI] [PubMed] [Google Scholar]
  98. Yoon K.L., Zinbarg R.E. (2008). Interpreting neutral faces as threatening is a default mode for socially anxious individuals. Journal of Abnormal Psychology, 117(3), 680–5. [DOI] [PubMed] [Google Scholar]
