Author manuscript; available in PMC: 2017 Mar 3.
Published in final edited form as: Brain Cogn. 2015 Aug 13;99:97–111. doi: 10.1016/j.bandc.2015.05.007

Fixation to features and neural processing of facial expressions in a gender discrimination task

Karly N Neath 1,*, Roxane J Itier 1
PMCID: PMC5336384  CAMSID: CAMS6549  PMID: 26277653

Abstract

Early face encoding, as reflected by the N170 ERP component, is sensitive to fixation to the eyes. We investigated whether this sensitivity varies with facial expressions of emotion and whether it extends to other ERP components such as the P1 and EPN. Using eye-tracking to manipulate fixation on facial features, we found that the N170 was the only eye-sensitive component, and this was true for fearful, happy and neutral faces. A different effect of fixation to features was seen for the earlier P1 that likely reflected a general sensitivity to face position. An early effect of emotion (~120 ms) for happy faces was seen at occipital sites and was sustained until ~350 ms post-stimulus. For fearful faces, an early effect was seen around 80 ms, followed by a later effect from ~150 ms to ~300 ms at lateral posterior sites. Results suggest that in this emotion-irrelevant gender discrimination task, processing of fearful and happy expressions occurred early and largely independently of the eye sensitivity indexed by the N170. Processing of the two emotions involved different underlying brain networks active at different times.

Keywords: Eyes, Facial expression of emotions, Eye-tracking, P1, N170, EPN

1. Introduction

It is generally agreed that the earliest reliable neural signature of face perception is detected in the EEG about 170 ms after stimulus presentation, and manifests as a negative ERP component (N170) over occipito-temporal electrode sites (e.g., Bentin, Allison, Puce, Perez, & McCarthy, 1996; Ganis, Smith, & Schendan, 2012; Jemel et al., 2003; Rossion & Caharel, 2011; Rossion et al., 2000). This component is thought to reflect encoding of the structure of the face (Bentin & Deouell, 2000; Eimer, 2000; Itier & Taylor, 2002; Itier & Taylor, 2004), and the bulk of the literature supports the view that it reflects holistic processing, the processing of the face as an indecomposable whole (Rossion, 2009). However, two recent studies controlling for fixation position using an eye-tracker and a gaze-contingent procedure have shown that the N170 is also sensitive to features within the face, and in particular to the eyes. Larger N170s were indeed reported for fixation on the eyes compared to fixation on the mouth of upright faces (de Lissa et al., 2014; Nemrodov, Anderson, Preston, & Itier, 2014; see also Zerouali, Lina, & Jemel, 2013) or compared to fixation on the nose, forehead and even nasion (Nemrodov et al., 2014). This finding echoes previous reports of larger N170s for eye regions presented in isolation compared to whole upright faces (Bentin et al., 1996; Itier, Alain, Sedore, & McIntosh, 2007; Itier, Latinus, & Taylor, 2006; Itier, Van Roon, & Alain, 2011; Taylor, Edmonds, McCarthy, & Allison, 2001) and confirms a special role for eyes in the early processing of the face structure, as also suggested by reverse correlation techniques (e.g., Rousselet, Ince, van Rijsbergen, & Schyns, 2014; Schyns, Jentzsch, Johnson, Schweinberger, & Gosselin, 2003; Schyns, Petro, & Smith, 2007; Schyns, Petro, & Smith, 2009). Importantly, however, these recent eye-tracking-EEG studies demonstrated the sensitivity of the N170 to eyes in full faces whose configuration was intact (configuration is altered when eyes are presented in isolation or when only portions of faces are revealed, as in the reverse correlation technique Bubbles). In addition, Nemrodov et al. (2014) showed that this eye sensitivity disappeared in eyeless faces, demonstrating it was due to the presence of the eyes at fovea. These findings, along with numerous others, led the authors to develop the Lateral Inhibition Face Template and Eye Detector (LIFTED) model, which proposes that the N170 reflects both the activity of an eye detector and the processing of the face as a whole, in a complex interplay between information at fovea and information in parafovea (Nemrodov et al., 2014). In addition to providing a new theoretical account of holistic and featural processing at the neural level, this study highlights the importance of controlling for fixation to face features in ERP face research.

Whether the N170 is sensitive to facial emotions is still debated. Several studies have reported an increased response to fearful faces (e.g., Batty & Taylor, 2003; Blau, Maurer, Tottenham, & McCandliss, 2007; Caharel, Courtay, Bernard, & Lalonde, 2005; Leppänen, Hietanen, & Koskinen, 2008; Leppänen, Moulson, Vogel-Farley, & Nelson, 2007). However, many others have reported no modulation of the N170 by facial emotion (e.g., Ashley, Vuilleumier, & Swick, 2004; Balconi & Lucchiari, 2005; Herrmann et al., 2002; Krolak-Salmon, Fischer, Vighetto, & Mauguière, 2001; Münte et al., 1998; Pourtois, Dan, Grandjean, Sander, & Vuilleumier, 2005; Smith, Weinberg, Moran, & Hajcak, 2013, see Eimer & Holmes, 2007 and Hinojosa, Mercado, & Carretié, 2015 for reviews). The eye region is used most prominently when discriminating fear from other expressions (Smith, Cottrell, Gosselin, & Schyns, 2005) and eyes have been shown to convey threat even when presented in isolation (Fox & Damjanovic, 2006; Whalen et al., 2004). Recently, it has been shown that participants make spontaneous saccades toward the eyes of emotional faces presented for as little as 150 ms (Gamer, Schmitz, Tittgemeyer, & Schilbach, 2013). Given that previous ERP studies reporting modulations of the N170 with fearful faces did not use an eye-tracker to confirm gaze position, and that the eyes are salient in fearful faces, it is possible that participants made small eye movements toward the eyes, or attended more to the eyes, for fearful faces than for other expressions. These possible movements to the eyes and attentional effects might have driven the N170 modulations with fearful expressions reported previously in the literature. The present study tested this hypothesis by manipulating fixation to features of facial expressions using a gaze-contingent procedure.

As the use of gaze-contingent procedures is very new in ERP face research, we also aimed to investigate more thoroughly the effect of fixation to features, and in particular the sensitivity to the eyes, on ERP components other than the N170, namely the preceding P1 and the following Early Posterior Negativity (EPN). The P1 is a positive component occurring at ~80–120 ms over occipital sites; it is known to respond to low-level stimulus characteristics such as contrast, luminance, color and spatial frequencies (Rossion & Jacques, 2007) and is also sensitive to attention (Luck, Woodman, & Vogel, 2000; Mangun, 1995). However, it is unclear whether the P1 is sensitive to fixation to features, and especially to the eyes, and its sensitivity to emotional expressions has been controversial (see Vuilleumier & Pourtois, 2007 for a review). The EPN, beginning at ~150 ms and largest between ~200 and 350 ms at occipito-temporal sites, is a well-known marker of emotion processing, with a more negative-going response for threatening faces (i.e., angry and fearful faces) compared to happy and neutral expressions (e.g., Rellecke, Palazova, Sommer, & Schacht, 2011; Rellecke, Sommer, & Schacht, 2013; Schupp, Junghöfer, Weike, & Hamm, 2004). No study to date has investigated whether the EPN could be sensitive to fixation to facial features bearing emotional significance, such as the eyes in fearful faces or the mouth in happy faces.

We investigated whether fixation to the eyes of fearful, happy and neutral faces modulates P1, N170 and EPN responses. Faces were presented with fixation locations on the left eye, right eye, nose and mouth during a gender discrimination task. To ensure correct point of gaze, eye-tracking was used with a fixation-contingent stimulus presentation, and any trial in which gaze deviated by more than 1.4° of visual angle around that fixation location was excluded. To further prevent participants from using anticipatory strategies, the fixation-cross was always presented in the center of the screen, while faces were moved around to obtain fixation on the desired feature, as done in Nemrodov et al. (2014) and de Lissa et al. (2014). Given this experimental manipulation we expected an interaction between eye fixation location and hemisphere for the P1 amplitude as most of the face was situated in the left hemifield when fixation was on the right eye (the eye situated on the right side of the participant) and in the right hemifield when fixation was on the left eye (e.g., Luck, Heinze, Mangun, & Hillyard, 1990). We also expected to replicate Nemrodov et al.’s findings (2014) of a larger N170 response for fixation on the eyes compared to fixation on the nose and mouth. Crucially, if attention to the eyes was driving the previously reported N170 increase for fearful faces, we expected to see an emotion by fixation interaction with an enhanced N170 response for fearful faces only when fixation was on the eyes. Alternatively, if the emotional content, processed holistically (Bimler, Skwarek, & Paramei, 2013; Derntl, Seidel, Kainz, & Carbon, 2009; McKelvie, 1995), was driving these N170 modulations, we expected to see a larger N170 response to fearful faces irrespective of fixation location. Finally, the possibility remained that no modulation of the N170 by emotion would be found but we expected a modulation of the EPN, a classic marker of emotion, with a more negative-going response for fearful compared to happy and neutral faces as reported previously. Whether EPN could also respond more to fixation on the eyes of fearful faces than to fixation on other facial features was unpredictable, although the fact that EPN was sensitive to emotion even when eyes were covered (Leppänen et al., 2008) led us to predict that fixation on the eyes would not matter at this stage.

2. Materials and Methods

2.1 Participants

Forty-nine undergraduate students from the University of Waterloo (UW) were recruited and received course credit for their participation. All participants reported normal or corrected-to-normal vision as well as no history of neurological or psychiatric disorder. Informed consent was obtained before starting the experiment and the study was approved by the Research Ethics Board at UW. Ten participants failed to achieve eye-tracking calibration and were therefore not tested. Of the 39 participants tested, 19 were rejected for the following reasons. To ensure overt attention to the fixated feature, trials with fixations greater than 1.4° of visual angle from the fixation location (see procedure below) were removed. For eleven participants, this procedure put the overall number of trials below our cut-off score of 40 (correct) trials per condition (i.e., less than 50% of the trials) for multiple conditions; these 11 participants were thus removed. Five participants were rejected due to too many artifacts, also resulting in too few trials per condition. Finally, three were rejected due to high anxiety scores. Anxiety is known to interact with the processing of emotions like fear (e.g., Dugas, Gosselin, & Ladouceur, 2001); therefore, only participants with scores in the normal range, below 43 on the State-Trait Inventory for Cognitive and Somatic Anxiety questionnaire (STICSA; Ree, French, MacLeod, & Locke, 2008; Van Dam, Gros, Earleywine, & Antony, 2013), were included. The remaining 20 participants (8 females, 18–23 years, M = 20.06 years) were included in the data analyses.

2.2 Stimuli and Procedure

Photographs of 8 individuals (4 males, 4 females), each with fearful, happy and neutral expressions, were selected from the MacBrain Face Stimulus Set (Tottenham et al., 2009). Images were converted to grayscale in Adobe™ Photoshop CS5 and an elliptical mask was applied on each picture so hair, ears, and shoulders were not visible. All faces subtended a visual angle of 6.30° horizontally and 10.44° vertically, and were presented on a white background for an image visual angle of 9.32° horizontally and 13.68° vertically (see Figure 1). Images did not differ significantly in RMS contrast and pixel intensity between emotions (see analyses and results sections below).

Figure 1.

Figure 1

Left panel: examples of one neutral face presented at each fixation location. Participants fixated the center of the monitor, represented here by each rectangle, and the face was presented offset so that gaze fell on one of 4 possible face locations: left eye, right eye, nose and mouth. Note that eye positions are from the viewer's perspective (i.e., the left eye is on the left of the image). This resulted in the face being situated almost entirely in the upper visual field when fixation was on the mouth, mostly in the left visual field when fixation was on the right eye, and mostly in the right visual field when fixation was on the left eye. Right panel, top: one neutral face exemplar with picture size and angular distances between fixation locations (averaged across all emotions and face identities). The yellow circles represent the interest areas of 1.4° centered on each feature that were used to reject eye gaze deviations in each fixation condition (i.e., foveated areas which did not overlap) and to calculate local RMS contrast and pixel intensity for each picture. Right panel, bottom: exemplars of the fearful, happy and neutral expressions used in the present study (from the NimStim database).

For each stimulus, exact coordinates corresponding to 4 feature locations on the face were recorded: left eye, right eye, nose and mouth. The nose and mouth fixation locations were aligned with one another along an axis passing through the middle of the nose and face. Eye coordinates were determined by placing the cross on the center of the pupil. A unique central fixation-cross was used and each face was presented offset so that the predetermined center of each feature would land on the center of the fixation-cross (Figure 1). No picture was presented in exactly the same location, due to minor variations in the coordinates of each feature between the eight identities and the three expressions used.

Participants sat in a sound-attenuated, Faraday-cage protected booth, 70cm from a Viewsonic P95f+ CRT 19-inch colour monitor driven by an Intel Quad CPU Q6700, with a refresh rate of 75Hz. Participants performed a gender discrimination task using a game controller to record their responses. Using the index fingers, one key was pressed for male stimuli and another for female stimuli, and key-press side for each gender was counter-balanced across participants. Before the experiment started, participants were given an 8-trial practice session to introduce them to the experimental procedure. Each trial began with a fixation-cross presented for a jittered duration (0–107ms). Participants were instructed to fixate on the black centered fixation-cross to initiate the trial and to remain fixated there until the response screen appeared. To ensure that participants were fixating on the fixation-cross, a fixation-contingent trigger enforced the fixation on the cross for 307ms. The face stimulus was then presented for 257ms, followed by a white screen with a question mark prompting the response. This response screen was presented until the participant responded, or for a maximum of 907ms (Figure 2). On average, participants took 621ms (S.D. = 118ms) to respond (RTs were calculated from stimulus onset; see Table 1). Participants were instructed to categorize faces by their gender as quickly and accurately as possible. After their response, a screen reading “BLINK” appeared for 507ms. Participants were instructed to blink during this time to prevent, as much as possible, eye movement artifacts during the first 500ms of trial recording. If the participant did not respond, or responded during the “blink” screen, the trial was considered a “miss” and was eliminated from further analysis (Table 1).

Figure 2.

Figure 2

Trial example with right eye fixation: Participants were tested on 960 trials as follows. First the fixation point was displayed on the screen for a jittered amount of time (0–107ms) with a fixation trigger of 307ms. Then the grayscale picture was flashed for 257ms, immediately followed by a white screen with a question mark for 907ms during which participants indicated their response. Lastly, a blink screen appeared for 307ms.

Table 1.

Mean (A) percent error, (B) reaction time (RT) and (C) misses/no-response values for fearful, happy and neutral expressions presented during the gender discrimination task (standard errors of the means in parentheses).

          Mean Error (%)   Mean RT (ms)      Mean Misses (%)
Fear      9.7 (1.0)        629.54 (12.61)    9.3 (0.9)
Happy     8.1 (1.0)        619.94 (12.45)    9.1 (0.9)
Neutral   7.6 (0.9)        614.86 (12.51)    9.6 (0.9)

The block of 96 face trials (3 emotions X 4 fixation locations X 8 identities) was repeated 10 times with a different, randomized trial order, yielding 80 trials per condition across blocks. Participants then completed the 21-item trait version of the State-Trait Inventory for Cognitive and Somatic Anxiety (STICSA; Ree et al., 2008). The STICSA is a Likert-scale questionnaire assessing cognitive and somatic symptoms of anxiety as they pertain to one's mood in general.

2.3 Electrophysiological Recordings

The EEG was recorded continuously at 516Hz with an ActiveTwo Biosemi system from 72 recording sites: 66 channels in an electrode-cap following the extended 10/20 system, and three pairs of additional electrodes. Two pairs of electrodes, situated on the outer canthi and infra-orbital ridges, monitored eye movements; one pair was placed over the mastoids. A Common Mode Sense (CMS) active-electrode and a Driven Right Leg (DRL) passive-electrode acted as the ground during recordings. The electrodes were average-referenced offline.

2.4 Eye-Tracking Recordings

Eye movements were recorded using a remote EyeLink 1000 eye-tracker (SR Research) with a sampling rate of 1000Hz. The eye-tracker was calibrated to each participant's dominant eye, but viewing was binocular. If a participant took more than 10s to successfully fixate on the cross, a drift correction was applied. After two drift corrections, a mid-block recalibration was performed. Calibration used a nine-point automated calibration accuracy test and was repeated if the error at any point was more than 1°, or if the average error across all points was greater than 0.5°. Participants' head position was stabilized with a head and chin rest to keep viewing position and distance constant.

2.5 Data Processing and Analyses

Only correctly answered trials were used for analysis. Trials in which a saccadic eye movement was recorded beyond 1.4° visual angle (70px) around the fixation-location were removed from further analysis (see Figure 1 for interest areas around each fixation location). This size ensured that the areas of interest around the features were non-overlapping. This step in the pre-processing removed an average of 2.0% (±1.7) of trials across the 20 participants included in the final sample.
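
In code terms, this rejection rule reduces to a radius check on every gaze sample of a trial. The following Matlab sketch is illustrative only; the variable names (gazeX, gazeY, featX, featY) are hypothetical, as the actual rejection was done with EyeLink interest areas:

    % Reject trials in which gaze strayed beyond 1.4 degrees (70px) of the fixated feature.
    % Assumed inputs: gazeX, gazeY are nTrials-by-nSamples matrices of gaze position in
    % pixels; featX, featY are nTrials-by-1 pixel coordinates of the fixated feature.
    maxDevPx = 70;                                      % 1.4 degrees of visual angle, per the text
    devX = bsxfun(@minus, gazeX, featX);                % horizontal deviation of each sample
    devY = bsxfun(@minus, gazeY, featY);                % vertical deviation of each sample
    dist = sqrt(devX.^2 + devY.^2);                     % Euclidean distance from the feature
    keep = all(dist <= maxDevPx, 2);                    % keep a trial only if every sample
                                                        % stays within the 70px radius
    fprintf('Removed %.1f%% of trials\n', 100*mean(~keep));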

The data were processed offline using the EEGLAB (Delorme & Makeig, 2004) and ERPLAB (http://erpinfo.org/erplab) toolboxes implemented in Matlab (Mathworks, Inc.). Epochs of 500ms were generated (100ms pre-stimulus-onset to 400ms post-stimulus-onset) and digitally band-pass filtered (0.01–30Hz) using a two-way least-squares FIR filter. Trials containing artifacts exceeding ±70μV were then rejected (a 100μV threshold was used for 6 participants). Trials were then visually inspected and those still containing artifacts were rejected. After trial rejection, participants with fewer than 40 trials per condition (out of 80 initial trials) were rejected.
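
A condensed sketch of these steps using standard EEGLAB functions is given below. The event code 'face' and the baseline-correction call are assumptions (the authors' scripts were not published); pop_eegfilt is the legacy EEGLAB routine implementing the two-way least-squares FIR filter mentioned above:

    % Epoch, filter and threshold-reject single trials in EEGLAB (illustrative sketch).
    EEG = pop_epoch(EEG, {'face'}, [-0.1 0.4]);     % 500ms epochs, -100 to +400ms
                                                    % ('face' is a placeholder event code)
    EEG = pop_rmbase(EEG, [-100 0]);                % baseline correction on the pre-stimulus
                                                    % interval (step assumed, not stated)
    EEG = pop_eegfilt(EEG, 0.01, 30);               % two-way least-squares FIR band-pass
    uV  = 70;                                       % 100uV was used for 6 participants
    EEG = pop_eegthresh(EEG, 1, 1:72, -uV, uV, ...  % reject epochs exceeding +/-70uV
                        EEG.xmin, EEG.xmax, 0, 1);  % on any of the 72 channels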

2.5.1 Contrast and Pixel intensity

To evaluate possible influences of low-level factors, we measured the mean pixel intensity and root mean squared (RMS) contrast of each picture using an in-house Matlab program, and compared them across emotions using paired sample t-tests with p-values corrected for multiple comparisons. For each picture, the mean RMS contrast and pixel intensity were also calculated for circular areas of 1.4° of visual angle around each fixation location (Figure 1) and were analyzed using a 3 (emotion) X 4 (fixation location) repeated-measures analysis of variance (ANOVA).
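
The program itself was not published; a minimal Matlab sketch of the two measures, on intensities normalized to [0,1] and within a circular area of assumed 70px radius around a feature centre (cx, cy), could look like this:

    % Mean pixel intensity and RMS contrast, for the full image and a circular area.
    img = im2double(imread('face.png'));            % hypothetical file; values scaled to [0,1]
    if ndims(img) == 3, img = rgb2gray(img); end    % ensure a grayscale image
    meanIntensity = mean(img(:));                   % mean pixel intensity of the full image
    rmsContrast   = std(img(:));                    % RMS contrast = SD of normalized intensities
    % Local measures around a feature centre (cx, cy), radius r in pixels (70px assumed).
    [X, Y]  = meshgrid(1:size(img,2), 1:size(img,1));
    inArea  = (X - cx).^2 + (Y - cy).^2 <= r^2;     % logical mask of the circular area
    localIntensity = mean(img(inArea));
    localContrast  = std(img(inArea));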

2.5.2 Percent error and mean Reaction Times (RTs)

Repeated-measures ANOVAs were conducted separately for percent errors and mean RTs. For each participant, only RTs within 2.5 standard deviations from the mean of each condition were kept in the mean RT calculation (Van Selst & Jolicoeur, 1994), which excluded 7.67% of the total number of trials. Within-subject factors included facial expression (3: fear, happiness, neutral) and fixation location (4: left eye, right eye, nose, mouth). Interactions were followed up with separate one-factor ANOVAs (emotion, 3 levels) for each fixation location.
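
The trimming rule amounts to the following Matlab sketch, where rt is a hypothetical vector of correct-trial RTs for one condition of one participant:

    % Non-recursive outlier trimming: keep RTs within 2.5 SD of the condition mean.
    mu     = mean(rt);
    sd     = std(rt);
    keep   = abs(rt - mu) <= 2.5 * sd;    % trials retained for the mean RT calculation
    meanRT = mean(rt(keep));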

2.5.3 ERP Analysis

For most participants, the P1 component was maximal at electrodes O1 and O2 and was thus measured at these sites between 80 and 130ms post-stimulus-onset (peak around 100ms) using automatic peak detection. Careful inspection of the data also suggested some emotion differences on P1 at Oz, so P1 was also analyzed separately at Oz. In contrast to P1, the N170 component was maximal at different electrodes across participants, and within a given participant the N170 was often maximal at different electrodes across the two hemispheres. Thus, to best capture that component, the N170 peak was measured between 120 and 200ms at the electrode where it was maximal for each subject and each hemisphere (see Table 2). To measure the time course of the fixation and emotion effects, mean amplitudes were also calculated within six consecutive 50ms windows from 50 to 350ms. Preliminary inspection of the data revealed different effects over two electrode clusters, at occipital sites (O1, O2 and Oz) and at lateral-posterior sites (CB1/2, P9/10, P7/8 and PO7/8). Thus, for each time window, separate analyses were conducted over these two clusters. Note that the lateral-posterior electrodes are those where the N170 was measured across participants; this cluster also included the visual P2 component (peaking around 200ms post-face onset) as well as the Early Posterior Negativity (EPN) component involved in emotion processing. P2 and EPN are broader components and are best measured by mean amplitudes.
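
For concreteness, a sketch of these measurements on one averaged waveform is given below; erp (a single channel's average) and t (its time vector in ms) are hypothetical names:

    % Automatic peak detection and 50ms mean-amplitude windows on an averaged ERP.
    p1Win = t >= 80 & t <= 130;                   % P1: positive peak between 80 and 130ms
    [p1Amp, i] = max(erp(p1Win));
    idx = find(p1Win);  p1Lat = t(idx(i));        % P1 latency in ms

    n170Win = t >= 120 & t <= 200;                % N170: negative peak between 120 and 200ms,
    [n170Amp, j] = min(erp(n170Win));             % taken at the subject's maximal electrode
    idx = find(n170Win);  n170Lat = t(idx(j));

    edges = 50:50:350;                            % six consecutive 50ms windows, 50-350ms
    meanAmp = zeros(1, numel(edges)-1);
    for k = 1:numel(edges)-1
        meanAmp(k) = mean(erp(t >= edges(k) & t < edges(k+1)));
    end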

Table 2.

Number of subjects for whom the N170 was maximal at left (PO7, CB1, P9) and right hemisphere (PO8, CB2, P10, and TP10) electrodes.

Left Hemisphere        Right Hemisphere
P9     6               P10    8
CB1    8               CB2    5
PO7    6               PO8    5
                       TP10   2

Repeated-measures ANOVAs were conducted using SPSS Statistics 22. Within-subject factors included hemisphere (2: left, right), facial expression (3: fear, happiness, neutral) and fixation location (4: left eye, right eye, nose, mouth) for P1 and N170 peaks. For mean amplitude analyses, an electrode factor was added for occipital sites (3: O1, O2, Oz) and lateral-posterior sites (4: CB1/2, P9/10, P7/8, PO7/8). If necessary, further analyses of the interactions found were completed with separate ANOVAs for each fixation location or each emotion. All ANOVAs used Greenhouse-Geisser adjusted degrees of freedom and pair-wise comparisons used Bonferroni corrections for multiple comparisons.
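
The analyses were run in SPSS; for readers working in Matlab, an approximately equivalent sketch uses fitrm/ranova from the Statistics and Machine Learning Toolbox, reading Greenhouse-Geisser corrected p-values from the pValueGG column (amp and the condition ordering are hypothetical):

    % Repeated-measures ANOVA sketch for the 2 x 3 x 4 peak design.
    % amp: 20 subjects x 24 conditions, columns ordered fixation-fastest, hemisphere-slowest.
    [f, e, h] = ndgrid(1:4, 1:3, 1:2);                 % numeric grids over factor levels
    hemiL = {'left','right'};
    emoL  = {'fear','happy','neutral'};
    fixL  = {'LE','RE','nose','mouth'};
    within = table(categorical(hemiL(h(:))), categorical(emoL(e(:))), ...
                   categorical(fixL(f(:))), ...
                   'VariableNames', {'hemisphere','emotion','fixation'});
    data = array2table(amp, 'VariableNames', compose('c%d', (1:24)'));
    rm   = fitrm(data, 'c1-c24 ~ 1', 'WithinDesign', within);
    tbl  = ranova(rm, 'WithinModel', 'hemisphere*emotion*fixation');
    disp(tbl(:, {'F','pValueGG'}));                    % Greenhouse-Geisser corrected p-values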

3. Results

3.1 Pixel intensity and RMS contrast

Post-hoc paired t-tests confirmed no differences between emotions (p > .05 for all comparisons) for mean pixel intensity and mean contrast values.

For mean RMS contrast in areas of 1.4° visual angle around each fixation, the highest contrast was seen for the left and right eyes (which did not significantly differ), followed by the mouth and then the nose (which did not significantly differ) (effect of fixation location, F(1.34, 9.40) = 16.34, p < .001, ηp2 = .70, all paired comparisons at p-values < .05; see Table 3). However, the emotion by fixation interaction (F(2.74, 19.18) = 11.48, p < .001, ηp2 = .62) revealed that this specific pattern was significant only for neutral faces (F = 41.71, p < .001; all significant fixation location paired comparisons at p < .05). For happy faces, a larger contrast was seen for the mouth compared to the nose fixation (F = 6.07, p < .05). For fearful faces, there was an effect of fixation location (F = 5.22, p < .05); however, pair-wise comparisons were not significant.

Table 3.

Mean pixel intensity and RMS contrast values for full faces and within a 1.4° radius of the left eye, right eye, nose and mouth, for fearful, happy and neutral expressions (standard errors of the means in parentheses).

Mean pixel intensity (std error)
          Full Face    Left Eye     Right Eye    Nose         Mouth
Fear      .58 (.01)    .43 (.02)    .44 (.04)    .52 (.02)    .49 (.03)
Happy     .57 (.01)    .44 (.03)    .44 (.03)    .51 (.02)    .55 (.02)
Neutral   .57 (.01)    .43 (.03)    .43 (.02)    .51 (.03)    .50 (.02)

Mean RMS Contrast (std error)
          Full Face    Left Eye     Right Eye    Nose         Mouth
Fear      .33 (.01)    .14 (.01)    .14 (.01)    .12 (.01)    .13 (.02)
Happy     .34 (.01)    .13 (.01)    .14 (.02)    .11 (.01)    .13 (.01)
Neutral   .34 (.01)    .14 (.03)    .14 (.01)    .11 (.01)    .10 (.01)

The lowest pixel intensity was seen for the left and right eyes (which did not significantly differ), followed by the mouth and the nose (which did not significantly differ) (effect of fixation location, F(1.77, 12.36) = 42.29, p < .001, ηp2 = .86, all paired comparisons at p-values < .01). The emotion by fixation interaction was also significant (F(2.54, 17.81) = 6.79, p < .05, ηp2 = .49), due to larger pixel intensity on the mouth for happy compared to fearful and neutral faces (p-values < .05).

3.2 Behavioral Analyses

Participants tended to make more errors for fearful than neutral faces (main effect of emotion; F(1.79, 34.04) = 4.76, p < .05, ηp2 = .20; fearful-neutral paired comparison at p = .06; see Table 1). Additionally, fewer errors were made during fixation to the nose compared to fixation on the left and right eye (main effect of fixation location; F(2.31, 43.89) = 3.58, p < .05, ηp2 = .16; paired comparisons significant at p < .05). No differences were seen for miss rates.

Responses were slowest for fearful expressions (main effect of emotion, F(1.71, 32.41) = 21.57, p < .001, ηp2 = .53; fearful-neutral and fearful-happy paired comparisons significant at p < .001; Table 1) and faster for nose fixation (main effect of fixation location, F(2.62, 49.77) = 7.42, p < .01, ηp2 = .28; significantly faster for the nose than the mouth and right eye, p < .05). No other effects were seen.

3.3 ERP Analyses

3.3.2 Effects of fixation location and emotion at occipital sites

P1 Peak Amplitude

At O1/2, P1 amplitude was overall largest for fixation to the mouth (main effect of fixation, F(2.73, 51.92) = 11.19, p < .001, ηp2 = .37) (see Fig. 3A). An interaction between fixation location and hemisphere was also found (F(2.19, 41.56) = 18.96, p < .001, ηp2 = .50) due to eye fixations yielding opposite effects. On the left hemisphere, P1 was larger for the mouth and left eye (which did not differ significantly) compared to the right eye and the nose fixations (which did not differ) (F = 11.22, p < .001). On the right hemisphere, P1 was larger for the mouth and right eye (which did not differ significantly) compared to the left eye and nose fixations which did not differ (F = 15.83, p < .001; significant paired comparisons right eye-left eye/nose p < .05, mouth-left eye/nose p < .001). No effects of emotion or emotion by fixation interaction were seen (Fig. 3B).

Figure 3.

Figure 3

(A) Grand averages featuring the P1 component for neutral faces at O1, O2, and Oz, showing effects of fixations with larger amplitudes for mouth fixation and opposite hemispheric effects for eye fixations. (B) Difference waveforms generated by subtracting ERPs to fearful from ERPs to neutral faces (solid line) and ERPs to happy from ERPs to neutral faces (dashed line) at O1, O2 and Oz. A clear difference peak for neutral-happy was seen between 100–150ms at Oz and O2 (grey band, peak of the effect around 130ms) and was confirmed by mean amplitude analysis at occipital sites during that time window (see main text and Table 4). (C) Grand averaged waveforms for fearful, happy and neutral faces (across fixation locations) at Oz. The early effect of emotion for happy faces started on the P1 peak at Oz. The grey interval (100–150ms) is where the effect emerged, peaking at 130ms. The red vertical lines represent the limits of the period during which mean amplitudes were analyzed (50–350ms). The topographic map shows the voltage distribution of the difference between neutral and happy faces at 130ms where the “happy effect” was maximal at medial occipital site.

P1 at Oz was also larger for fixation to the mouth compared to the left eye, right eye and nose, which did not differ significantly from each other (main effect of fixation location, F(2.66, 50.44) = 10.43, p < .001, ηp2 = .35; significant paired comparisons with mouth fixation at p < .05) (Fig. 3A). However, in contrast to O1 and O2, an effect of emotion was found on P1 at Oz due to a reduced positivity for happy compared to fearful and neutral expressions (main effect of emotion, F(1.74, 33.13) = 6.68, p < .01, ηp2 = .26; significant happy-fearful paired comparison p < .05, happy-neutral paired comparison p = .06) (see Fig 3C). Difference waveforms (neutral-fear and neutral-happy) confirmed this localized effect of emotion at the medial occipital site and revealed that this “happy effect” was in fact largest around 130ms (Fig. 3B and 3C map), i.e., in between the P1 and N170 peaks. This effect was confirmed statistically with mean amplitude analyses during the 100–150ms window (see below).

Mean Amplitudes over Six Time Windows at occipital sites (O1, O2, Oz)

Statistical results for these analyses (50–350ms) are reported in Table 4 and visually depicted in Figures 3 and 4.

Table 4.

Statistical effects on mean amplitudes analyzed over six 50ms time windows at occipital sites (O1, Oz, O2), with F, p and ηp2 values. LH, left hemisphere; RH, right hemisphere; LE, left eye; RE, right eye; No, nose; Mo, mouth; F, fear; H, happy; N, neutral. Main effects p values: p* < .05; p** < .01; p*** < .001; p**** < .0001; ns, not significant. Paired comparison tests are also reported (e.g., H < F + N means that the main effect of emotion is due to smaller mean amplitude for happy compared to both fearful and neutral expressions).

Main effects and interactions by time window (only significant effects are listed; all other windows were ns):

Electrode
• 100–150ms: F = 6.50, p*, ηp2 = .26; O1 + O2 > Oz
• 200–250ms: F = 3.73, p*, ηp2 = .16; O1 + O2 > Oz
• 250–300ms: F = 5.31, p*, ηp2 = .22; O1 + O2 > Oz

Fixation location
• 50–100ms: F = 11.23, p***, ηp2 = .37; Mo > all
• 100–150ms: F = 25.37, p****, ηp2 = .57; Mo > all
• 250–300ms: F = 3.57, p*, ηp2 = .16; (Mo + LE +) RE > No
• 300–350ms: F = 5.03, p**, ηp2 = .21; Mo > No

Emotion
• 100–150ms: F = 11.51, p***, ηp2 = .38; H < F + N
• 150–200ms: F = 8.55, p**, ηp2 = .31; H + F < N
• 200–250ms: F = 7.87, p**, ηp2 = .29; H + F < N
• 250–300ms: F = 6.97, p**, ηp2 = .27; H + F < N
• 300–350ms: F = 9.29, p**, ηp2 = .33; H < F + N

Electrode X Fixation location
• 50–100ms: F = 16.05, p****, ηp2 = .46
  – O1: F = 11.22, p****, ηp2 = .37; Mo + LE > RE + No
  – O2: F = 15.83, p****, ηp2 = .46; Mo + RE > LE + No
  – Oz: F = 10.43, p****, ηp2 = .35; Mo > all
• 100–150ms: F = 4.2, p**, ηp2 = .18
  – O1: F = 13.28, p****, ηp2 = .41; Mo + LE > RE + No
  – O2: F = 25.21, p****, ηp2 = .57; Mo > all
  – Oz: F = 28.47, p****, ηp2 = .60; Mo > all
• 300–350ms: F = 4.03, p**, ηp2 = .18
  – O1: F = 7.51, p**, ηp2 = .28; Mo > LE + No
  – O2: no effect
  – Oz: F = 6.35, p**, ηp2 = .25; Mo > No

Electrode X Emotion
• 100–150ms: F = 3.70, p*, ηp2 = .16
  – O1: no effect
  – O2: F = 5.77, p**, ηp2 = .23; H < N + F
  – Oz: F = 18.35, p****, ηp2 = .49; H < N + F

Emotion X Fixation location
• 150–200ms: F = 3.03, p*, ηp2 = .14
  – Mo: F = 22.58, p****, ηp2 = .54; H + F < N
  – LE, RE, No: no effect

Figure 4.
Figure 4.

Figure 4

Mean voltage distribution maps of the difference waveforms between fear and neutral (F-N) and happy and neutral faces (H-N) across six 50ms time intervals from 50ms to 350ms (averaged across fixation location).

More positive amplitudes were seen when fixation was to the mouth compared to the other facial features between 50 and 150ms, and this effect was strongest at Oz. During that time period at O1 and O2, the fixation to one eye yielded opposite effects between the two hemispheres, with larger amplitudes for left eye fixation than right eye fixation on the left hemisphere and vice versa for the right hemisphere. These effects of fixation were virtually identical to those seen on P1 peak (analyzed at O1/2) and disappeared by 150ms. They reappeared more weakly between 250–350ms, with larger amplitudes for mouth fixation during the last time window (Fig. 3A).

An emotion effect was first seen during the 100–150ms time window, with smaller amplitudes for happy compared to fearful and neutral expressions (Fig. 3C). An electrode by emotion interaction revealed this effect was only seen at O2 and Oz electrodes. This effect of emotion mirrors that found on the P1 peak at Oz reported previously. During the 150–200ms interval, smaller amplitudes were seen for both fearful and happy compared to neutral expressions, but only for the mouth fixation condition (emotion by fixation interaction). Between 200 and 300ms, both fearful and happy expressions elicited smaller amplitudes compared to neutral expressions regardless of fixation location. During the 300–350ms window this effect of emotion was significant for happy faces only (happy-neutral comparison p = .003; happy-fearful p = .067). Thus, happy faces elicited smaller amplitudes than neutral faces at occipital sites from 100 until 350ms, as clearly seen on the difference waveforms and their topographic maps (Fig. 4; see also Fig. 3B). In contrast, fearful faces elicited smaller amplitudes than neutral faces a bit later, starting at 150ms and vanishing by 300ms. Figure 4 also suggests that the effect for fear was mostly lateral (as discussed next) and only weakly occipital, while the opposite was found for happy faces.

3.3.3 Effects of fixation location and emotion at lateral-posterior sites (CB1/2, P9/10, P7/8, PO7/8)

N170 Peak Amplitude

The N170 amplitude was larger for fixation to the left and right eye (which did not differ) compared to fixation to the mouth and nose which did not differ significantly (main effect of fixation location, F(2.46, 46.80) = 16.43, p < .0001, ηp2 = .46; all paired comparisons at p-values < .01) (Fig. 5A). No other significant effects were seen and in particular no effect of emotion.

Figure 5.

Figure 5

(A) Grand averages featuring the N170 component for neutral faces at P9 and P10 as a function of fixation location. (B) The early and later effect of emotion for fearful faces at temporal-parietal sites. Top: Grand average for fearful, happy and neutral faces (across fixation locations) at PO7 site where the effect was maximal. Bottom: Difference waveforms generated by subtracting ERPs to neutral from ERPs to fearful faces (solid line) and ERPs to neutral from ERPs to happy faces (dashed line) at PO7. The grey intervals (50–100ms) and (150–300ms) are where the early and later emotion effects for fear are seen. The maps show the voltage difference between neutral and fearful faces (F-N) across the scalp at the latency at which the early (80ms) and late (180ms) effects were largest.

P1-to-N170 amplitude

As effects of fixation location were found for P1 at occipital sites, we performed peak-to-peak analyses to track possible influences of P1 onto N170 measures at these lateral sites. We took the amplitude differences between the P1 and N170 at the electrode at which the N170 was largest for each hemisphere and each subject. Amplitude differences were larger in the right compared to the left hemisphere (main effect of hemisphere, F(1, 19) = 5.14, p < .05, ηp2 = .21; significant paired comparison p < .05) and were larger during fixation to both the left and right eye (which did not differ) compared to the nose and mouth (which did not differ) (main effect of fixation location, F(1.79, 34.07) = 32.27, p < .001, ηp2 = .63; significant left eye-nose/mouth and right eye-nose/mouth paired comparisons p < .001). This confirmed the fixation location effect found for the N170 peak. In addition, however, an interaction between fixation location and hemisphere was seen (F(2.64, 50.22) = 16.42, p < .0001, ηp2 = .46), due to opposite effects of eye fixation in each hemisphere, driven by the P1 (Fig. 5A). On the left hemisphere, the main effect of fixation location (F(2.43, 46.27) = 19.4, p < .0001, ηp2 = .51) was due to larger amplitude differences for the left eye compared to all other fixation locations (amplitude for the right eye was also larger than for the mouth). On the right hemisphere, the fixation location effect (F(1.87, 35.67) = 34.31, p < .0001, ηp2 = .64) was due to amplitudes being larger for both eye fixations (which did not differ) compared to both nose and mouth (which did not differ).

Interestingly, in contrast to the lack of emotion effect for N170 peak, there were significant interactions between hemisphere and emotion (F(1.91, 36.29) = 4.54, p = .019, ηp2 = .19) and hemisphere, emotion and fixation location (F(4.81, 91.31) = 2.96, p = .017, ηp2 = .14). On the left hemisphere, an emotion by fixation location interaction (F(4.20, 79.71) = 3.15, p = .017, ηp2 = .14) was due to a larger P1-N170 amplitude difference for happy faces compared to fearful and neutral faces (significant paired comparisons p < .01) that was seen only when fixation was on the mouth (mouth: effect of emotion, F = 10.10, p < .001; significant paired comparisons p < .01). There was no effect of emotion when fixation was on the left eye (p = .27), right eye (p = .54) or nose (p = .24). On the right hemisphere, the P1-N170 amplitude difference was larger for happy compared to neutral expressions regardless of fixation location (main effect of emotion, F(1.85, 35.44) = 4.59, p = .019, ηp2 = .20; significant happy-neutral paired comparison p < .05, fear-neutral paired comparison borderline at p = .053).

Overall, taking P1 into account revealed interactions between fixation and hemisphere as seen for P1 peak, as well as emotion effects not seen on either the P1 (at O1/2 sites) or the N170 peaks.

N170-to-P2 amplitude

Amplitude analyses were also performed between the N170 and the amplitude recorded at 200ms, as visual inspection of the grand average revealed emotion effects emerging after the peak of the N170. The N170-P2 difference was defined as the subtraction of the peak N170 amplitude (at the maximal electrodes picked for each subject) from the amplitude at 200ms post-stimulus for each subject. This was done to avoid peaking the P2 itself, which is too broad to be peaked reliably in each participant. Individual inspection of the data suggested that for all participants, 200ms corresponded to the start or the middle of the P2.
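
Both peak-to-peak measures thus reduce to simple subtractions; a sketch reusing the hypothetical peak variables from the snippet in Section 2.5.3:

    % Peak-to-peak measures at the electrode where the N170 was maximal for a subject.
    p1toN170 = p1Amp - n170Amp;              % P1-to-N170 amplitude difference
    amp200   = erp(find(t >= 200, 1));       % amplitude at 200ms post-stimulus
    n170toP2 = amp200 - n170Amp;             % N170-to-P2 difference as defined in the text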

The amplitude difference was less positive in the left hemisphere compared to the right (main effect of hemisphere, F(1, 19) = 5.34, p < .05, ηp2 = .22; paired comparison p < .05). A main effect of fixation location (F(2.37, 45.03) = 3.07, p < .05, ηp2 = .14) was due to overall smaller amplitude for the mouth compared to the other fixation locations. The amplitude difference was also smaller for fearful compared to happy and neutral expressions (main effect of emotion, F(1.78, 33.89) = 11.64, p < .001, ηp2 = .38; significant paired comparisons fear-happy and fear-neutral p < .01). This effect of emotion was confirmed by the mean amplitude analysis over the time window 150–200ms as reported below.

Mean Amplitude analyses over Six Time Windows (CB1/2, P7/8, P9/10, PO7/8)

Statistical results for these analyses (50–350ms) are reported in Table 5 and visually depicted in Figures 4, 5 and 6.

Table 5.

Statistical effects on mean amplitudes analyzed over six 50ms time windows at lateral-posterior sites (CB1/2, P7/8, PO7/8, P9/10), with F, p and ηp2 values. LH, left hemisphere; RH, right hemisphere; LE, left eye; RE, right eye; No, nose; Mo, mouth; F, fear; H, happy; N, neutral. Main effects p values: p* < .05; p** < .01; p*** < .001; p**** < .0001; ns, not significant. Paired comparison tests are also reported (e.g., H < F + N means that the main effect of emotion is due to smaller mean amplitude for happy compared to both fearful and neutral expressions).

Main effects and interactions by time window (only significant effects are listed; all other windows were ns):

Electrode
• 50–100ms: F = 4.91, p*, ηp2 = .21; PO7/8 < P9/10 < CB1/2 < P7/8
• 100–150ms: F = 16.10, p****, ηp2 = .46; P9/10 < CB1/2 < P7/8 < PO7/8
• 150–200ms: F = 21.27, p***, ηp2 = .53; P9/10 < CB1/2 + P7/8 < PO7/8
• 200–250ms: F = 61.83, p****, ηp2 = .77; P9/10 < CB1/2 < P7/8 < PO7/8
• 250–300ms: F = 53.27, p****, ηp2 = .74; P9/10 < CB1/2 < P7/8 < PO7/8
• 300–350ms: F = 47.69, p****, ηp2 = .72; P9/10 < CB1/2 < P7/8 < PO7/8

Hemisphere
• 200–250ms: F = 5.72, p*, ηp2 = .23; LH < RH
• 300–350ms: F = 14.52, p**, ηp2 = .43; LH < RH

Fixation location
• 100–150ms: F = 14.97, p****, ηp2 = .44; No < all
• 150–200ms: F = 14.18, p****, ηp2 = .43; LE + RE < No + Mo

Emotion
• 150–200ms: F = 13.87, p****, ηp2 = .42; F < H + N
• 200–250ms: F = 23.79, p****, ηp2 = .56; F < H < N
• 250–300ms: F = 5.88, p**, ηp2 = .24; F < N
• 300–350ms: F = 5.50, p*, ηp2 = .23; H < N

Electrode X Fixation location
• 100–150ms: F = 9.46, p****, ηp2 = .33
  – CB1/2: F = 11.04, p****, ηp2 = .38; No < all
  – P7/8: F = 12.01, p****, ηp2 = .39; No < all
  – P9/10: F = 16.53, p****, ηp2 = .47; No < Mo < LE + RE
  – PO7/8: F = 13.46, p****, ηp2 = .42; No < LE + RE < Mo
• 150–200ms: F = 5.32, p**, ηp2 = .22
  – CB1/2: F = 16.64, p***, ηp2 = .47; LE + RE < No + Mo
  – P7/8: F = 3.59, p*, ηp2 = .16; no significant paired comparisons
  – P9/10: F = 15.94, p***, ηp2 = .46; LE + RE < No + Mo
  – PO7/8: F = 5.67, p**, ηp2 = .23; LE (+ RE) < No (+ Mo)

Hemisphere X Fixation location
• 50–100ms: F = 9.38, p***, ηp2 = .33
  – LH: F = 5.57, p**, ηp2 = .23; LE > RE
  – RH: F = 3.49, p*, ηp2 = .16; RE > LE (p = .082)
• 100–150ms: F = 11.91, p****, ηp2 = .39
  – LH: F = 11.64, p****, ηp2 = .38; LE > all
  – RH: F = 14.96, p****, ηp2 = .44; RE > all

Hemisphere X Emotion
• 250–300ms: F = 4.14, p*, ηp2 = .18
  – LH: F = 11.45, p***, ηp2 = .38; F < H + N
  – RH: ns
• 300–350ms: F = 3.61, p*, ηp2 = .16
  – LH: F = 7.72, p**, ηp2 = .29; F + H < N
  – RH: ns

Electrode X Hemisphere X Emotion
• 50–100ms: F = 3.23, p*, ηp2 = .15
  – PO7: F = 12.38, p****, ηp2 = .39; F < H + N
  – P7: F = 8.44, p***, ηp2 = .31; F < H + N

Figure 6.
Figure 6.

Figure 6

Difference waveforms generated by subtracting neutral from fearful and happy conditions (F-N and H-N, averaged across fixation locations) at lateral-posterior sites (CB1/2, P7/8, PO7/8, P9/10). The grey zones highlight the time windows during which the effect for fear was significant: an early effect restricted to P7 and PO7 electrodes during 50–100ms, and a later effect at all lateral posterior sites (150–300ms).

A hemisphere by fixation location interaction was seen between 50 and 100ms, due to fixation to the eyes yielding opposite effects in each hemisphere, with larger amplitude for fixation to the left eye compared to the right eye on the left hemisphere and vice versa for the right hemisphere. At 100–150ms, this fixation by hemisphere interaction became stronger, with larger amplitude for the left eye fixation compared to all other fixations on the left hemisphere, and larger amplitude for the right eye than the other fixations on the right hemisphere. These effects between 50 and 150ms were driven by the P1, as clearly seen in Fig. 5A and as also found in the P1-N170 amplitude difference. Between 150–200ms, in line with the fixation effect seen for the N170, mean amplitudes were larger for both the left and the right eye fixations compared to both the nose and mouth fixations, an effect that was most pronounced at P9/P10 and CB1/CB2 sites. No effect of fixation location was seen after 200ms.

A very early effect of emotion was seen between 50–100ms that was restricted to PO7 and P7 (left hemisphere) electrodes, with smaller amplitude for fearful faces compared to both happy and neutral faces (Fig. 5B). This effect peaked at ~80ms. This emotion effect for fearful faces appeared again later beginning at 150ms and lasting until 300ms, this time at all posterior lateral sites (Fig. 5B, Fig. 6). During that time, amplitudes for fearful faces were smaller than amplitudes for both happy and neutral faces, with the fearful-neutral difference peaking around 180ms (Fig. 5B). Mean amplitudes for happy faces were also significantly smaller than for neutral faces between 200–250ms and again later between 300–350ms (Fig. 6). Interestingly, between 250–350ms the emotion effects were seen only for the left hemisphere (P7, PO7, CB1 and P9), as clearly seen on Fig. 4.

4. Discussion

In the present gender categorization task performed on facial expressions of emotion, we tested the effect of fixation to facial features on scalp-recorded ERPs between 50–350ms, encompassing well-studied components (P1, N170 and EPN). We also tested the idea that fixation to fearful eyes might be driving the debated N170 modulation by emotion. Using eye-tracking to enforce correct fixation to facial features, we found that the P1 and N170 peaks were sensitive to fixation location but not to emotion. Emotion effects, however, were seen between these early peaks and later on at posterior sites, mostly medially and occipitally for happy expressions, and mostly laterally for fearful expressions. Importantly, the emotion effects occurred largely independently of fixation location. We discuss these effects and their implications for our understanding of early face and facial emotion perception.

4.1 Fixation location and facial emotion influenced gender discrimination

Behavioural performance was impacted by fixation location, with fewer errors and shorter RTs when fixation was on the nose compared to the other fixated locations. This result is in line with the idea that when the whole face is available, gender discrimination requires holistic processing (Brown & Perrett, 1993; Zhao & Hayward, 2010), which is most efficient around the face center of gravity situated close to the nose (e.g., Bindemann, Scheepers, & Burton, 2009; de Heering et al., 2008). Note that this does not undermine the idea that some face parts might convey face gender better than others when presented in isolation, as recently reported (Best, Minshew & Strauss, 2010). Most importantly, despite the task being emotion-irrelevant, emotion impacted behavioural performance, as seen in participants' trend toward larger error rates for fearful compared to neutral faces and longer RTs for fearful faces compared to both happy and neutral faces. Similar results have been reported by Scheller, Büchel & Gamer (2012), with lower hit rates for fearful and neutral than happy faces in a gender categorization task where faces were presented for 2 seconds. This finding is sensible given that face gender discrimination requires virtually no attention (Reddy, Wilken & Koch, 2004), leaving attentional resources available to process the emotions, as demonstrated here. Facial expression categories were thus processed during the present emotion-irrelevant task and impacted gender discrimination.

4.2 Different sensitivity to fixation location for P1 and N170

The P1 component is sensitive to low-level stimulus characteristics such as contrast, luminance, color and spatial frequencies (Luck, Woodman, & Vogel, 2000; Rossion and Jacques, 2008). When low-level factors are controlled for, P1 does not reliably differ between object categories while the N170 does, supporting the view that the two components reflect distinct stages of visual processing, with only the N170 reflecting high-level vision and face categorization (e.g., Ganis et al., 2012; Jemel et al., 2003; Rossion & Caharel, 2011; Tarkiainen, Cornelissen, & Salmelin, 2002).

In the present study, a clear fixation effect was seen with larger P1 amplitude for the right than for the left eye on the right hemisphere, and vice versa for the left hemisphere. In fact, analysis of mean amplitudes revealed this effect as early as 50–100ms at occipital sites (Fig. 3A) and between 50–150ms at posterior-lateral sites (Fig. 5A). This fixation effect was also found in the P1-to-N170 analysis, where it was driven by the P1. This effect of fixation reflects hemifield presentation effects, as most of the facial information was in the left visual field when fixation was on the right eye and in the right visual field when fixation was on the left eye (Fig. 1). This hemifield effect was also reported in three recent studies using similar gaze-contingent presentations (de Lissa et al., 2014; Nemrodov et al., 2014; Zerouali et al., 2013). In addition, at occipital sites, a delayed and larger P1 response was seen when fixation was on the mouth compared to each of the other locations. In fact, this effect was seen at occipital sites during the entire epoch (although not significantly between 150–300ms, Fig. 3A, Table 4) and likely reflected sensitivity to the position of the face on the screen. Most of the facial information is in the upper visual field when fixation falls on the mouth compared to the eyes or the nose. Interestingly, the P1 recorded to simple checkerboards presented in the four visual field quadrants has been shown to vary with ipsi/contra-lateral presentations but does not vary appreciably between the upper and lower visual fields (Clark, Fan, & Hillyard, 1994; Di Russo, Martínez, Sereno, Pitzalis, & Hillyard, 2002, p. 101). It is possible that the visual system is more sensitive to the upper visual field for meaningful stimuli such as faces, which are often seen in that area.

As predicted, we replicated the finding by Nemrodov et al. (2014) of larger N170 amplitude for fixation on the left and right eyes (which did not differ significantly) compared to the nose and the mouth, which also did not differ significantly (see also de Lissa et al., 2014). This effect is not attributable to a simple face position effect as seen for the P1. The N170 amplitude has been shown to decrease with face eccentricity (Rousselet, Husk, Bennett, & Sekuler, 2005); therefore, if this N170 amplitude modulation reflected a face position effect, we would expect to see smaller, rather than larger, N170 amplitude for fixation to the eyes, given the more lateral position of the face for these fixation locations compared to the midline fixation locations (nose and mouth). Further demonstration that this N170 modulation reflects a true eye sensitivity was provided by Nemrodov et al. (2014), who showed that the same eye fixation locations did not yield these larger N170 amplitudes when the eyes were not present in fovea (in eyeless faces), despite the same positions of those faces on the screen. This sensitivity of the N170 component to eyes has been shown using isolated eye stimuli, with larger N170s to isolated eyes than to full faces (e.g., Bentin et al., 1996; Itier et al., 2006; 2007; 2011), an effect seen as early as four years of age (Taylor et al., 2001). The N170 sensitivity to eyes has also been shown using reverse correlation techniques such as the Bubbles technique, which reveals portions of the face, in gender and emotion discrimination tasks (e.g., Schyns et al., 2003; 2007; 2009; Rousselet et al., 2014). The eye sensitivity within full faces as shown here provides further support for the hypothesis of an eye detector at work during the processing of the face structure (Nemrodov et al., 2014). The current study demonstrates that this eye sensitivity is also seen for faces expressing fear and happiness and is thus largely facial-expression invariant. While mean pixel intensity and contrast did not differ between pictures, local pixel intensity and contrast did. In particular, higher contrast and lower pixel intensity were seen for the eyes compared to the nose and mouth. Therefore, the hypothesized eye detector might rely on low-level cues such as local contrast and pixel intensity, a possibility that will have to be tested by future studies.

In contrast to the P1 and N170 components, there was no effect of fixation location after 200ms at lateral posterior sites (where P2 and EPN were seen), as also predicted. This result is in line with the idea that the eye sensitivity is specific to the face structural encoding stage as indexed by the N170.

4.3 Early and later occipital effects for happy facial expressions

Using stimuli that did not significantly differ in overall mean pixel intensity and contrast, an early effect was seen for happy faces at the medial occipital site Oz (smaller amplitudes for happy than neutral faces) that began around 100ms and peaked around 130ms, i.e., between the P1 and N170 peaks. After 150ms this effect was seen more broadly, including at lateral occipital sites O1 and O2, and was sustained until 350ms. This effect was seen as a negative amplitude difference at occipital sites (happy-neutral difference waves) along with a positive counterpart at frontal sites on topographic maps (Figs. 3 and 4). The effect spread a little to the posterior lateral sites between 200–250ms and 300–350ms (Table 5), with a seemingly left-dominant distribution (Fig. 4; hemisphere interaction only between 300–350ms). The P1-N170 analysis at lateral sites also revealed a difference between happy and neutral expressions, which was seen regardless of fixation location on the right hemisphere but only for fixation on the mouth on the left hemisphere. The only other time window during which emotion interacted with fixation location was between 150–200ms at occipital sites (Table 4), with again a difference between happy and neutral expressions seen only for the mouth fixation.

Few studies have focused on the ERPs in response to happy faces and this occipital distribution is not often reported. The few studies that have found effects of facial expression on the P1 have reported larger P1 for fearful than neutral or happy faces (see Vuilleumier and Pourtois, 2007 for review) but typically no difference between happy and neutral faces. However, the present data suggest a very localized happy effect at the midline site for P1, rather than at the classic lateral sites (including O1/2), which might have been missed by most previous studies. In an explicit emotion categorization task, Morel et al. (2014) recently reported an early happy effect with a larger P1 for happy than neutral faces (i.e., the opposite of what was found here) in the right hemisphere, but this was seen only for highly anxious participants and was thus likely the result of attentional demands rather than emotional effects per se. In non-anxious participants, no emotion difference was seen on the lateral P1 (similar to our non-anxious sample); medial occipital sites were not analyzed. During a face-decision task (categorizing faces as intact or smeared), Schacht and Sommer (2009) reported an enhanced negativity for happy compared to neutral (and angry) faces between 128–144ms at parieto-occipital sites. Their topographic map resembles our present occipital distribution, although it also included parietal areas. Midline sites were not measured in that study. Note that although the present happy-neutral difference started on the P1, it was maximal after the P1, around 130ms, and no such effect was seen for fearful faces, which makes it unlikely to reflect a general emotional effect or a simple attentional effect. The data suggest that this effect was specific to the processing of happy expressions.

Our occipital effect for happy faces echoes results reported by Halgren et al. (2000) who recorded magnetic fields in response to various stimuli including happy and sad faces while participants identified repeated faces. Results indicated a midline occipital source in or near the calcarine fissure (around areas V1–V2) that discriminated happy from neutral expressions between 100–120ms post-stimulus. That source was separate from the more lateral and later source that corresponded to the magnetic equivalent of the N170, and was also sensitive to more sensory aspects of the stimuli. Halgren et al. (2000) proposed that a fast discrimination of diagnostic cues such as the smile, based on luminance and contrast, could occur within 100–120ms in those early visual areas and then be relayed rapidly to the amygdala by direct V2-amygdala connections. This explanation is possible here given the local pixel and contrast differences between emotions seen for the mouth area of our stimuli.

The current findings, however, further suggest that this occipital activity is seen all the way until at least 350ms. From 150–350ms, it was accompanied by more temporal negativities as well as frontal positivities, which suggests changes of the underlying generators with time. Overall, this “happy effect” appears to recruit spatio-temporal networks, with distinctive scalp distributions, different from those involved in the commonly reported rapid processing of fearful faces discussed below.

4.4 Early and later lateral posterior effects for fearful expressions

Early effects of fearful faces have been debated. Most studies have reported no modulation of the P1 by emotion (Palermo and Rhodes, 2007; Vuilleumier and Pourtois, 2007); however, a few have reported enhanced P1 for fearful compared to neutral faces in gender discrimination tasks (Pourtois et al., 2005; Wijers and Banis, 2012), oddball detection tasks (Batty and Taylor, 2003), and passive viewing of emotional faces (Smith et al., 2013). The current results, however, suggest modulations before the P1, and only for fearful faces; no effect of fear was seen on the P1 itself at either lateral or medial occipital sites. This early effect of fearful expressions was localized to the left hemisphere, seen clearly at PO7 and to a lesser extent at P7 during the 50–100ms time window, peaking around 80ms (Figs. 4 and 6). It is unclear what this very early modulation represents, and it will have to be reproduced before any conclusion can be drawn.

After this very early effect, modulations of ERPs by fearful faces were next seen right after the N170 component and all the way until 300ms (Table 5, Figs. 4 and 6). The effect of facial emotions on the N170 has been debated, with several studies reporting no modulation by emotion (see reviews by Eimer & Holmes, 2007, and Hinojosa et al., 2015; see also Rellecke et al., 2013) while others did report increased N170 with fearful faces (e.g., Batty and Taylor, 2003; Blau et al., 2007; Leppänen et al., 2008). However, previous studies have not controlled for gaze fixation on the features of facial emotional stimuli. This is important given recent reports of spontaneous saccades toward the eyes of fearful expressions even with stimuli presented for only 150ms (Gamer et al., 2013). We hypothesized that the early ERP modulations of the N170 by fearful faces previously reported might have been driven by attention to the eyes. We reasoned that if this was the case, then early ERP responses would be larger for fearful than happy or neutral faces when fixation was on the eyes but not when fixation was on the nose or mouth. The present results revealed no modulation of the N170 peak amplitude by emotional faces and no interaction of emotion with fixation location. This result is in line with the lack of modulation of the N170 by emotion reported in previous gender discrimination tasks (Pourtois et al., 2005; Sato et al., 2001; Wijers and Banis, 2012; Wronka & Walentowska, 2011). The lack of an emotion-by-fixation interaction on the N170 suggests that the eye sensitivity demonstrated by this component is largely independent of facial expressions of emotion, as mentioned earlier. Attention to the eyes is thus unlikely to be the reason why previous studies reported early emotional differences.

The effect for fearful faces was mostly seen at lateral posterior sites (and to a lesser extent at occipital sites) and emerged ~150ms, during the descending part of the N170 toward the P2 component. It peaked around 180–200ms and was significantly different from neutral and happy faces until 300ms (Figs. 4 and 6, Table 5). The distribution of this fearful effect across the scalp was similar to that reported by Eimer and Holmes (2002; topographic maps of that study are reported in Eimer and Holmes, 2007), with bilateral posterior-temporal negativities along with a fronto-central positivity. Eimer and Holmes (2002), however, reported this effect starting around 110–120ms, i.e. earlier than in the present study, and suggested the involvement of frontal brain areas. In contrast, as most of the present effects were seen at posterior sites, we believe that the frontal distribution is mostly the positive counterpart of a posterior negativity that likely originates in posterior visual brain regions. At these lateral posterior sites, this negativity never interacted with fixation location and thus seems to reflect activity linked to the processing of fear added onto the normal activity related to processing neutral faces, as proposed by other groups (Rellecke et al., 2013; Schacht and Sommer, 2009). This added negativity started around the same time as the N170 but was seen mostly after the peak and, again, did not interact with fixation location, suggesting it was distinct from the structural encoding reflected by the N170 component.

This “fearful effect” was thus seen right after the N170 until around 300ms and encompassed the visual P2 (~200ms) component and the well-known marker of emotion processing, the early posterior negativity (EPN; Rellecke, Sommer, & Schacht, 2012; Rellecke et al., 2011; Schupp et al., 2004). Our results are in line with previous reports of emotion effects starting around or right after the N170 and lasting 100ms or more (Eimer et al., 2003; Eimer and Kiss, 2007; Leppänen et al., 2007; Schupp et al., 2004; Sprengelmeyer and Jentzsch, 2006), here until about 300ms. This added negativity related to the processing of fear has been suggested to arise from enhanced processing of emotionally salient stimuli in cortical visual areas (Schupp et al., 2004). The timing of this fear-related process coincides with amygdala activation reported in intracranial ERP studies in response to fearful faces ~150–200ms post-stimulus (Meletti et al., 2012; Krolak-Salmon, Hénaff, Vighetto, Bertrand, & Mauguière, 2004; Pourtois, Spinelli, Seeck, & Vuilleumier, 2010a) as well as in a recent MEG study (Dumas et al., 2013). However, amygdala activity per se is very unlikely to be recorded on the scalp with EEG; this fear effect is thus more likely the result of the enhancement, by the amygdala, of activity in perceptual visual areas such as the fusiform gyrus. Modulations of the fusiform gyrus by the amygdala have indeed been reported around similar times by a few intracranial studies (Pourtois, Spinelli, Seeck, & Vuilleumier, 2010b) and MEG studies (e.g. Dumas et al., 2013).

4.5 Conclusion

In this gender discrimination task where facial expressions were task-irrelevant, differential effects of fixation location and emotion were seen across various ERP components. A sensitivity to face position was seen early, on the P1 component. An eye sensitivity that was independent of the emotion expressed by the face was seen on the N170 component, possibly reflecting the activity of an eye detector in the processing of the face structure. The N170 peak was not sensitive to emotion; however, effects were seen right after the peak. A “happy effect” was seen at occipital sites that started around 100ms and lasted until 350ms. For fearful faces, an effect was seen around 50–100ms, localized to the left hemisphere at lateral-posterior sites, followed by a later bilateral effect from 150 to 300ms, although stronger on the left hemisphere between 250–350ms. Results suggest that facial emotion processing is largely independent of the processing of facial features and face structure, and that happy and fearful expressions recruit different spatio-temporal networks with distinctive scalp distributions. Results also highlight the importance of quantifying neural activity around the P1 and N170 peaks, as emotion effects may be missed by measuring only these commonly studied ERP markers.
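This last methodological point can be made concrete with a toy simulation: two conditions with virtually identical N170 peak amplitudes can still differ reliably in a window just after the peak, so a peak-only measure misses the effect. This is a minimal sketch with invented waveforms and parameters; none of the numbers are the study's data.

```python
import numpy as np

sfreq = 512.0
t = np.arange(0.0, 0.4, 1.0 / sfreq)  # 0-400 ms post-stimulus, in seconds

# A shared N170-like trough at ~170 ms (amplitudes in microvolts, invented).
n170 = -5.0 * np.exp(-((t - 0.17) ** 2) / (2 * 0.02 ** 2))
# An added negativity peaking at ~250 ms, present only in the "fearful" condition.
post = -2.0 * np.exp(-((t - 0.25) ** 2) / (2 * 0.025 ** 2))

neutral = n170
fearful = n170 + post

# Peak measure: virtually identical -> the emotion effect is missed.
print(f"N170 peak: neutral {neutral.min():.2f} uV, fearful {fearful.min():.2f} uV")

# Mean amplitude in a 200-300 ms window: the post-peak difference emerges.
win = (t >= 0.2) & (t < 0.3)
print(f"200-300 ms mean: neutral {neutral[win].mean():.2f} uV, "
      f"fearful {fearful[win].mean():.2f} uV")
```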

Acknowledgments

This research was supported by grants from the Natural Sciences and Engineering Research Council of Canada (NSERC Discovery Grant #418431), the Ontario government (Early Researcher Award, ER11-08-172), the Canada Foundation for Innovation (CFI, #213322), and the Canada Research Chair (CRC, #959-213322) program to RJI, as well as by a doctoral NSERC grant to KNN. We would like to warmly thank Frank Preston for all his technical help.

Footnotes

1

Note that this high attrition rate indirectly shows that many participants make frequent eye movements even with 257ms presentation times, and that, although tiny, these eye movements are sufficient to bring fixation onto another facial feature given the size of the stimuli.

2

Development of the MacBrain Face Stimulus Set was overseen by Nim Tottenham and supported by the John D. and Catherine T. MacArthur Foundation Research Network on Early Experience and Brain Development. Please contact Nim Tottenham at tott0006@tc.umn.edu for more information concerning the stimulus set.

3

In practice, it took some time for participants to be correctly fixated on the fixation trigger for the minimum of 307ms, resulting in an average of 964ms (SD = 1214ms) between the first onset of the fixation cross and the onset of the stimulus presentation. When this time exceeded 10s, a mid-block calibration was performed again.
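For clarity, the gaze-contingent logic described in this footnote can be sketched as follows. This is an illustrative reconstruction, not the actual experiment code: on_trigger() and recalibrate() are hypothetical callbacks standing in for a real eye-tracker API, and only the 307ms hold and 10s recalibration values come from the text.

```python
import time

FIXATION_HOLD_S = 0.307      # minimum stable fixation on the trigger (from the text)
RECALIBRATE_AFTER_S = 10.0   # redo calibration if fixation is not achieved in time

def wait_for_fixation(on_trigger, recalibrate):
    """Block until gaze has remained on the fixation trigger for 307 ms.

    on_trigger() -> bool and recalibrate() are hypothetical placeholders
    for the eye-tracker API.
    """
    block_start = time.monotonic()
    hold_start = None
    while True:
        now = time.monotonic()
        if on_trigger():
            if hold_start is None:
                hold_start = now            # gaze just landed on the trigger
            if now - hold_start >= FIXATION_HOLD_S:
                return                      # criterion met; present the stimulus
        else:
            hold_start = None               # gaze left the trigger; restart the hold
        if now - block_start >= RECALIBRATE_AFTER_S:
            recalibrate()                   # mid-block calibration, as in the study
            block_start = time.monotonic()
            hold_start = None
```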

4

Note that the P1 had to be re-measured at these lateral posterior sites for this analysis.

References

  1. Ashley V, Vuilleumier P, Swick D. Time course and specificity of event-related potentials to emotional expressions. Neuroreport. 2004;15(1):211–216. doi: 10.1097/00001756-200401190-00041.
  2. Balconi M, Lucchiari C. Event-related potentials related to normal and morphed emotional faces. The Journal of Psychology. 2005;139(2):176–192. doi: 10.3200/JRLP.139.2.176-192.
  3. Batty M, Taylor M. Early processing of the six basic facial emotional expressions. Cognitive Brain Research. 2003;17(3):613–620. doi: 10.1016/s0926-6410(03)00174-5.
  4. Bentin S, Allison T, Puce A, Perez E, McCarthy G. Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience. 1996;8:551–565. doi: 10.1162/jocn.1996.8.6.551.
  5. Bentin S, Deouell LY. Structural encoding and identification in face processing: ERP evidence for separate mechanisms. Cognitive Neuropsychology. 2000;17(1–3):35–55. doi: 10.1080/026432900380472.
  6. Best CA, Minshew NJ, Strauss MS. Gender discrimination of eyes and mouths by individuals with autism. Autism Research. 2010;3(2):88–93. doi: 10.1002/aur.125.
  7. Bimler DL, Skwarek SJ, Paramei GV. Processing facial expressions of emotion: Upright vs. inverted images. Frontiers in Psychology. 2013;4(54):1–12. doi: 10.3389/fpsyg.2013.00054.
  8. Bindemann M, Scheepers C, Burton AM. Viewpoint and center of gravity affect eye movements to human faces. Journal of Vision. 2009;9(2):1–16. doi: 10.1167/9.2.7.
  9. Blau VC, Maurer U, Tottenham N, McCandliss BD. The face-specific N170 component is modulated by emotional facial expression. Behavioral and Brain Functions. 2007;3(7):1–13. doi: 10.1186/1744-9081-3-7.
  10. Brown E, Perrett D. What gives a face its gender? Perception. 1993;22:829–840. doi: 10.1068/p220829.
  11. Caharel S, Courtay N, Bernard C, Lalonde R. Familiarity and emotional expression influence an early stage of face processing: An electrophysiological study. Brain and Cognition. 2005;59(1):96–100. doi: 10.1016/j.bandc.2005.05.005.
  12. Clark VP, Fan S, Hillyard SA. Identification of early visual evoked potential generators by retinotopic and topographic analyses. Human Brain Mapping. 1994;2(3):170–187.
  13. de Heering A, Rossion B, Turati C, Simion F. Holistic face processing can be independent of gaze behaviour: Evidence from the composite face illusion. Journal of Neuropsychology. 2008;2(Pt 1):183–195. doi: 10.1348/174866407x251694.
  14. de Lissa P, McArthur G, Hawelka S, Palermo R, Mahajan Y, Hutzler F. Fixation location on upright and inverted faces modulates the N170. Neuropsychologia. 2014;57:1–11. doi: 10.1016/j.neuropsychologia.2014.02.006.
  15. Delorme A, Makeig S. EEGLAB: An open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. Journal of Neuroscience Methods. 2004;134:9–21. doi: 10.1016/j.jneumeth.2003.10.009.
  16. Derntl B, Seidel EM, Kainz E, Carbon CC. Recognition of emotional expressions is affected by inversion and presentation time. Perception. 2009;38(12):1849–1862. doi: 10.1068/p6448.
  17. Di Russo F, Martínez A, Sereno MI, Pitzalis S, Hillyard SA. Cortical sources of the early components of the visual evoked potential. Human Brain Mapping. 2002;15(2):95–111. doi: 10.1002/hbm.10010.
  18. Dugas MJ, Gosselin P, Ladouceur R. Intolerance of uncertainty and worry: Investigating specificity in a nonclinical sample. Cognitive Therapy and Research. 2001;25:551–558.
  19. Dumas T, Dubal S, Attal Y, Chupin M, Jouvent R, Morel S, George N. MEG evidence for dynamic amygdala modulations by gaze and facial emotions. PLoS ONE. 2013;8(9):e74145. doi: 10.1371/journal.pone.0074145.
  20. Eimer M. Event-related brain potentials distinguish processing stages involved in face perception and recognition. Clinical Neurophysiology. 2000;111(4):694–705. doi: 10.1016/s1388-2457(99)00285-0.
  21. Eimer M, Holmes A. An ERP study of the time course of emotional face processing. Neuroreport. 2002;13:427–431. doi: 10.1097/00001756-200203250-00013.
  22. Eimer M, Holmes A. Event-related brain potential correlates of emotional face processing. Neuropsychologia. 2007;45(1):15–31. doi: 10.1016/j.neuropsychologia.2006.04.022.
  23. Eimer M, Kiss M. Attentional capture by task-irrelevant fearful faces is revealed by the N2pc component. Biological Psychology. 2007;74(1):108–112. doi: 10.1016/j.biopsycho.2006.06.008.
  24. Eimer M, Holmes A, McGlone FP. The role of spatial attention in the processing of facial expression: An ERP study of rapid brain responses to six basic emotions. Cognitive, Affective, & Behavioral Neuroscience. 2003;3(2):97–110. doi: 10.3758/cabn.3.2.97.
  25. Fox E, Damjanovic L. The eyes are sufficient to produce a threat superiority effect. Emotion. 2006;6(3):534–539. doi: 10.1037/1528-3542.6.3.534.
  26. Gamer M, Schmitz AK, Tittgemeyer M, Schilbach L. The human amygdala drives reflexive orienting towards facial features. Current Biology. 2013;23(20):R917–R918. doi: 10.1016/j.cub.2013.09.008.
  27. Ganis G, Smith D, Schendan HE. The N170, not the P1, indexes the earliest time for categorical perception of faces, regardless of interstimulus variance. NeuroImage. 2012;62(3):1563–1574. doi: 10.1016/j.neuroimage.2012.05.043.
  28. Halgren E, Raij T, Marinkovic K, Jousmäki V, Hari R. Cognitive response profile of the human fusiform face area as determined by MEG. Cerebral Cortex. 2000;10:69–81. doi: 10.1093/cercor/10.1.69.
  29. Herrmann MJ, Aranda D, Ellgring H, Mueller TJ, Strik WK, Heidrich A, Fallgatter AJ. Face-specific event-related potential in humans is independent from facial expression. International Journal of Psychophysiology. 2002;45(3):241–244. doi: 10.1016/s0167-8760(02)00033-8.
  30. Hinojosa JA, Mercado F, Carretié L. N170 sensitivity to facial expression: A meta-analysis. Neuroscience and Biobehavioral Reviews. 2015;55:498–509. doi: 10.1016/j.neubiorev.2015.06.002.
  31. Hung Y, Smith ML, Bayle DJ, Mills T, Cheyne D, et al. Unattended emotional faces elicit early lateralized amygdala-frontal and fusiform activations. NeuroImage. 2010;50(2):727–733. doi: 10.1016/j.neuroimage.2009.12.093.
  32. Itier RJ, Alain C, Sedore K, McIntosh AR. Early face processing specificity: It’s in the eyes! Journal of Cognitive Neuroscience. 2007;19(11):1815–1826. doi: 10.1162/jocn.2007.19.11.1815.
  33. Itier RJ, Latinus M, Taylor MJ. Face, eye and object early processing: What is the face specificity? NeuroImage. 2006;29(2):667–676. doi: 10.1016/j.neuroimage.2005.07.041.
  34. Itier RJ, Taylor MJ. Inversion and contrast polarity reversal affect both encoding and recognition processes of unfamiliar faces: A repetition study using ERPs. NeuroImage. 2002;15(2):353–372. doi: 10.1006/nimg.2001.0982.
  35. Itier RJ, Taylor MJ. N170 or N1? Spatiotemporal differences between object and face processing using ERPs. Cerebral Cortex. 2004;14(2):132–142. doi: 10.1093/cercor/bhg111.
  36. Itier RJ, Van Roon P, Alain C. Species sensitivity of early face and eye processing. NeuroImage. 2011;54(1):705–713. doi: 10.1016/j.neuroimage.2010.07.031.
  37. Jemel B, Schuller A, Cheref-Khan Y, Goffaux V, Crommelinck M, Bruyer R. Stepwise emergence of the face-sensitive N170 event-related potential component. Neuroreport. 2003;14(16):2035–2039. doi: 10.1097/00001756-200311140-00006.
  38. Krolak-Salmon P, Fischer C, Vighetto A, Mauguière F. Processing of facial emotional expression: Spatio-temporal data as assessed by scalp event-related potentials. European Journal of Neuroscience. 2001;13(5):987–994. doi: 10.1046/j.0953-816x.2001.01454.x.
  39. Krolak-Salmon P, Hénaff M, Vighetto A, Bertrand O, Mauguière F. Early amygdala reaction to fear spreading in occipital, temporal, and frontal cortex: A depth electrode ERP study in human. Neuron. 2004;42:665–676. doi: 10.1016/s0896-6273(04)00264-8.
  40. Leppänen JM, Hietanen JK, Koskinen K. Differential early ERPs to fearful versus neutral facial expressions: A response to the salience of the eyes? Biological Psychology. 2008;78:150–158. doi: 10.1016/j.biopsycho.2008.02.002.
  41. Leppänen JM, Moulson MC, Vogel-Farley VK, Nelson CA. An ERP study of emotional face processing in the adult and infant brain. Child Development. 2007;78(1):232–245. doi: 10.1111/j.1467-8624.2007.00994.x.
  42. Luck SJ, Heinze HJ, Mangun GR, Hillyard SA. Visual event-related potentials index focused attention within bilateral stimulus arrays. II. Functional dissociation of P1 and N1 components. Electroencephalography and Clinical Neurophysiology. 1990;75(6):528–542. doi: 10.1016/0013-4694(90)90139-b.
  43. Luck SJ, Woodman GF, Vogel EK. Event-related potential studies of attention. Trends in Cognitive Sciences. 2000;4(11):432–440. doi: 10.1016/s1364-6613(00)01545-x.
  44. Mangun GR. Neural mechanisms of visual selective attention. Psychophysiology. 1995;32(1):4–18. doi: 10.1111/j.1469-8986.1995.tb03400.x.
  45. McKelvie S. Emotional expressions in upside-down faces: Evidence for configurational and componential processing. British Journal of Social Psychology. 1995;34:325–334. doi: 10.1111/j.2044-8309.1995.tb01067.x.
  46. Meletti S, Cantalupo G, Benuzzi F, Mai R, Tassi L, Gasparini E, et al. Fear and happiness in the eyes: An intra-cerebral event-related potential study from the human amygdala. Neuropsychologia. 2012;50:44–54. doi: 10.1016/j.neuropsychologia.2011.10.020.
  47. Morel S, George N, Foucher A, Chammat M, Dubal S. ERP evidence for an early bias towards happy faces in trait anxiety. Biological Psychology. 2014;99:183–192. doi: 10.1016/j.biopsycho.2014.03.011.
  48. Münte TF, Brack M, Grootheer O, Wieringa BM, Matzke M, Johannes S. Brain potentials reveal the timing of face identity and expression judgments. Neuroscience Research. 1998;36(1):25–34. doi: 10.1016/s0168-0102(97)00118-1.
  49. Nemrodov D, Anderson T, Preston FF, Itier RJ. Early sensitivity for eyes within faces: A new neuronal account of holistic and featural processing. NeuroImage. 2014;97:81–94. doi: 10.1016/j.neuroimage.2014.04.042.
  50. Palermo R, Rhodes G. Are you always on my mind? A review of how face perception and attention interact. Neuropsychologia. 2007;45(1):75–92. doi: 10.1016/j.neuropsychologia.2006.04.025.
  51. Pourtois G, Dan ES, Grandjean D, Sander D, Vuilleumier P. Enhanced extrastriate visual response to bandpass spatial frequency filtered fearful faces: Time course and topographic evoked-potentials mapping. Human Brain Mapping. 2005;26:65–79. doi: 10.1002/hbm.20130.
  52. Pourtois G, Spinelli L, Seeck M, Vuilleumier P. Temporal precedence of emotion over attention modulations in the lateral amygdala: Intracranial ERP evidence from a patient with temporal lobe epilepsy. Cognitive, Affective, & Behavioral Neuroscience. 2010a;10(1):83–93. doi: 10.3758/CABN.10.1.83.
  53. Pourtois G, Spinelli L, Seeck M, Vuilleumier P. Modulation of face processing by emotional expression and gaze direction during intracranial recordings in right fusiform cortex. Journal of Cognitive Neuroscience. 2010b;22(9):2086–2107. doi: 10.1162/jocn.2009.21404.
  54. Reddy L, Wilken P, Koch C. Face-gender discrimination is possible in the near absence of attention. Journal of Vision. 2004;4(2):106–117. doi: 10.1167/4.2.4.
  55. Ree MJ, French D, MacLeod C, Locke V. Distinguishing cognitive and somatic dimensions of state and trait anxiety: Development and validation of the State-Trait Inventory for Cognitive and Somatic Anxiety (STICSA). Behavioural and Cognitive Psychotherapy. 2008;36(3):313–332.
  56. Rellecke J, Palazova M, Sommer W, Schacht A. On the automaticity of emotion processing in words and faces: Event-related brain potentials evidence from a superficial task. Brain and Cognition. 2011;77(1):23–32. doi: 10.1016/j.bandc.2011.07.001.
  57. Rellecke J, Sommer W, Schacht A. Emotion effects on the N170: A question of reference? Brain Topography. 2013;26(1):62–71. doi: 10.1007/s10548-012-0261-y.
  58. Rossion B. Distinguishing the cause and consequence of face inversion: The perceptual field hypothesis. Acta Psychologica. 2009;132(3):300–312. doi: 10.1016/j.actpsy.2009.08.002.
  59. Rossion B, Caharel S. ERP evidence for the speed of face categorization in the human brain: Disentangling the contribution of low-level visual cues from face perception. Vision Research. 2011;51(12):1297–1311. doi: 10.1016/j.visres.2011.04.003.
  60. Rossion B, Jacques C. Does physical interstimulus variance account for early electrophysiological face sensitive responses in the human brain? Ten lessons on the N170. NeuroImage. 2007;39(4):1959–1979. doi: 10.1016/j.neuroimage.2007.10.011.
  61. Rousselet GA, Husk JS, Bennett PJ, Sekuler AB. Spatial scaling factors explain eccentricity effects on face ERPs. Journal of Vision. 2005;5(10):755–763. doi: 10.1167/5.10.1.
  62. Rousselet GA, Ince RAA, van Rijsbergen NJ, Schyns PG. Eye coding mechanisms in early human face event-related potentials. Journal of Vision. 2014;14(13):7, 1–24. doi: 10.1167/14.13.7.
  63. Sander D. Models of emotion. In: Armony J, Vuilleumier P, editors. The Cambridge Handbook of Human Affective Neuroscience. Cambridge University Press; 2013. pp. 5–53.
  64. Sato W, Kochiyama T, Uono S, Matsuda K, Usui K, Inoue Y, Toichi M. Rapid amygdala gamma oscillations in response to fearful facial expressions. Neuropsychologia. 2011;49:612–617. doi: 10.1016/j.neuropsychologia.2010.12.025.
  65. Schacht A, Sommer W. Emotions in word and face processing: Early and late cortical responses. Brain and Cognition. 2009;69:538–550. doi: 10.1016/j.bandc.2008.11.005.
  66. Scheller E, Büchel C, Gamer M. Diagnostic features of emotional expressions are processed preferentially. PLoS ONE. 2012;7(7):e41792. doi: 10.1371/journal.pone.0041792.
  67. Schupp HT, Junghöfer M, Weike AI, Hamm AO. The selective processing of briefly presented affective pictures: An ERP analysis. Psychophysiology. 2004;41(3):441–449. doi: 10.1111/j.1469-8986.2004.00174.x.
  68. Schyns PG, Jentzsch I, Johnson M, Schweinberger SR, Gosselin F. A principled method for determining the functionality of brain responses. Neuroreport. 2003;14(13):1665–1669. doi: 10.1097/00001756-200309150-00002.
  69. Schyns PG, Petro LS, Smith ML. Dynamics of visual information integration in the brain for categorizing facial expressions. Current Biology. 2007;17(18):1580–1585. doi: 10.1016/j.cub.2007.08.048.
  70. Schyns PG, Petro LS, Smith ML. Transmission of facial expressions of emotion co-evolved with their efficient decoding in the brain: Behavioral and brain evidence. PLoS ONE. 2009;4(5):e5625. doi: 10.1371/journal.pone.0005625.
  71. Smith ML, Cottrell GW, Gosselin F, Schyns PG. Transmitting and decoding facial expressions. Psychological Science. 2005;16(3):184–189. doi: 10.1111/j.0956-7976.2005.00801.x.
  72. Smith E, Weinberg A, Moran T, Hajcak G. Electrocortical responses to NIMSTIM facial expressions of emotion. International Journal of Psychophysiology. 2013;88(1):17–25. doi: 10.1016/j.ijpsycho.2012.12.004.
  73. Sprengelmeyer R, Jentzsch I. Event related potentials and the perception of intensity in facial expressions. Neuropsychologia. 2006;44(14):2899–2906. doi: 10.1016/j.neuropsychologia.2006.06.020.
  74. Tarkiainen A, Cornelissen PL, Salmelin R. Dynamics of visual feature analysis and object-level processing in face versus letter-string perception. Brain: A Journal of Neurology. 2002;125(Pt 5):1125–1136. doi: 10.1093/brain/awf112.
  75. Taylor MJ, Edmonds GE, McCarthy G, Allison T. Eyes first! Eye processing develops before face processing in children. NeuroReport. 2001;12:1671–1676. doi: 10.1097/00001756-200106130-00031.
  76. Tottenham N, Tanaka JW, Leon AC, McCarry T, Nurse M, Hare TA, … Nelson C. The NimStim set of facial expressions: Judgments from untrained research participants. Psychiatry Research. 2009;168(3):242–249. doi: 10.1016/j.psychres.2008.05.006.
  77. Van Dam NT, Gros DF, Earleywine M, Antony MM. Establishing a trait anxiety threshold that signals likelihood of anxiety disorders. Anxiety Stress Coping. 2013;26(1):70–86. doi: 10.1080/10615806.2011.631525.
  78. Van Selst M, Jolicoeur P. A solution to the effect of sample size on outlier elimination. The Quarterly Journal of Experimental Psychology Section A: Human Experimental Psychology. 1994;47(3):631–650.
  79. Vlamings PH, Goffaux V, Kemner C. Is the early modulation of brain activity by fearful facial expressions primarily mediated by coarse low spatial frequency information? Journal of Vision. 2009;9(5):12.1–13. doi: 10.1167/9.5.12.
  80. Vuilleumier P, Pourtois G. Distributed and interactive brain mechanisms during emotion face perception: Evidence from functional neuroimaging. Neuropsychologia. 2007;45:174–194. doi: 10.1016/j.neuropsychologia.2006.06.003.
  81. Whalen PJ, Kagan J, Cook RG, Davis FC, Kim H, Polis S, et al. Human amygdala responsivity to masked fearful eye whites. Science. 2004;306(5704):2061. doi: 10.1126/science.1103617.
  82. Wijers AA, Banis S. Foveal and parafoveal spatial attention and its impact on the processing of facial expressions: An ERP study. Clinical Neurophysiology. 2012;123:513–526. doi: 10.1016/j.clinph.2011.07.040.
  83. Zerouali Y, Lina J, Jemel B. Optimal eye-gaze fixation position for face-related neural responses. PLoS ONE. 2013;8(6):e60128. doi: 10.1371/journal.pone.0060128.
  84. Zhao M, Hayward WG. Holistic processing underlies gender judgments of faces. Attention, Perception, & Psychophysics. 2010;72(3):591–596. doi: 10.3758/APP.72.3.591.
