Author manuscript; available in PMC: 2008 Oct 22.
Published in final edited form as: Schizophr Res. 2007 Jun 20;94(1-3):253–263. doi: 10.1016/j.schres.2007.05.001

Facial Emotion Recognition in Schizophrenia: When and Why Does It Go Awry?

Bruce I Turetsky 1, Christian G Kohler 1, Tim Indersmitten 1, Mahendra T Bhati 1, Dorothy Charbonnier 1, Ruben C Gur 1
PMCID: PMC2571079  NIHMSID: NIHMS27864  PMID: 17583481

Abstract

Objective

Schizophrenia patients demonstrate impaired emotional processing that may be due, in part, to impaired facial emotion recognition. This study examined event-related potential (ERP) responses to emotional faces in schizophrenia patients and controls to determine when, in the temporal processing stream, patient abnormalities occur.

Method

Sixteen patients and 16 healthy control participants performed a facial emotion recognition task. Very sad, somewhat sad, neutral, somewhat happy, and very happy faces were each presented for 100 ms. Subjects indicated whether each face was “Happy”, “Neutral”, or “Sad”. Evoked potential data were obtained using a 32-channel EEG system.

Results

Controls performed better than patients in recognizing facial emotions. In patients, better recognition of happy faces correlated with less severe negative symptoms. Four ERP components corresponding to the P100, N170, N250, and P300 were identified. Group differences were noted for the N170 “face processing” component that underlies the structural encoding of facial features, but not for the subsequent N250 “affect modulation” component. Higher amplitude of the N170 response to sad faces was correlated with less severe delusional symptoms. Although P300 abnormalities were found, the variance of this component was explained by the earlier N170 response.

Conclusion

Patients with schizophrenia demonstrate abnormalities in the early visual encoding of facial features, a stage that precedes the ERP response typically associated with facial affect recognition. This suggests that affect recognition deficits, at least for happy and sad discrimination, are secondary to faulty structural encoding of faces. The association of abnormal face encoding with delusions may denote the physiological basis for clinical misidentification syndromes.

Keywords: Schizophrenia, Emotion, Face Recognition, Event-Related Potentials, N170

1. Introduction

Impaired emotional functioning is a core feature of schizophrenia described by Eugen Bleuler (1911) nearly 100 years ago. Emotional abnormalities in schizophrenia are now receiving more attention by clinicians and investigators and include a variety of symptoms such as flat or constricted affect, inappropriate affect, and depression (Kohler et al., 2000a). In addition to negative symptoms' influence on the experience and expression of emotions, there is evidence that schizophrenia patients are impaired in recognizing and discriminating facial emotions (Morrison et al., 1988; Mandal et al., 1998; Edwards et al., 2001; Kohler et al., 2003). It is unclear whether emotion recognition deficits represent a specific or generalized form of cognitive impairment in schizophrenia (Kerr and Neale, 1993; Whittaker et al., 2001), yet recent studies show that emotion processing deficits are uniquely related to clinical symptoms (Kohler et al., 2000b; Silver et al., 2002; Sachs et al., 2004).

A small but growing number of studies have examined the neural substrates of emotion processing in schizophrenia. Animal and human investigations of neural networks involved in emotional behavior implicate the limbic system (primarily the amygdala), hypothalamus, mesocorticolimbic dopaminergic systems, and cortical regions including the orbitofrontal, dorsolateral prefrontal, temporal, and portions of the parietal cortex (LeDoux, 1995; Adolphs et al., 1996). Functional neuroimaging studies in healthy people employing emotionally expressive faces as stimuli reveal the neural circuitry modulating emotion recognition and discrimination. Amygdala activation is consistently observed across paradigms, particularly when subjects attend to the emotional valence of perceived faces (i.e., whether the emotion is positive (happy) or negative (sad, angry, fearful, disgusted)) (Gur et al., 2002a). Cognitive or evaluative processing of facial expressions seems to attenuate this amygdala response (Phillips et al., 1998; Hariri et al., 2000) and possibly shift the activation leftward (Phelps et al., 2001). Schizophrenia patients tend to show less activation overall when making an emotional discrimination and fail to recruit limbic regions (Phillips et al., 1999; Gur et al., 2002b).

Among the methods used to investigate brain activity, event-related potentials (ERPs) are unique in revealing the rapid temporal sequence of neural processing underlying emotion recognition and discrimination. ERPs can distinguish the relative contributions of early perceptual and later cognitive factors during perceptual tasks, including emotional face perception. Processing deficits are distinguished based on when the abnormality occurs in the temporal stream of information processing. Affective stimuli elicit distinctive ERP responses as early as 80 ms (Pizzagalli et al., 1999; Pizzagalli et al., 2000; Keil et al., 2001) and as late as 1000 ms (Eimer and Holmes, 2002) post-stimulus. Aversive and socially salient stimuli tend to elicit rapid responses—between 80 and 170 ms post-stimulus (Pizzagalli et al., 1999; Streit et al., 1999; Halgren et al., 2000; Halit et al., 2000; Pizzagalli et al., 2000; Eimer and Holmes, 2002). The simple perception of a face, perhaps the most socially salient of all stimuli, elicits a vertex-positive/occiput-negative ERP component approximately 165–170 ms post-stimulus. This face-specific ERP response, the N170, reflects the early perceptual process involving structural encoding of the face (Eimer, 2000). The N170 is thought to arise primarily from the fusiform gyrus and can be readily distinguished from the perceptual ERP response to other classes of stimuli (Botzel et al., 1995; Herrmann et al., 2005; Iidaka et al., 2006).

Discerning the emotional expression of a face is traditionally thought of as a higher-order cognitive process in which the structural representation of the face is associated with semantic and cognitive information (Bruce and Young, 1986; Balconi and Lucchiari, 2005). Consistent with this model, facial affect recognition has been associated with ERP responses occurring between 400 and 600 ms post-stimulus (Potter and Parker, 1989; Hautecoeur et al., 1993; Halgren and Marinkovic, 1995). However, increasing evidence shows that facial affect recognition can occur earlier. Several studies have reported affect-related ERPs in the 180–250 ms post-stimulus interval, immediately following the structural face encoding response (Marinkovic and Halgren, 1998; Streit et al., 1999; Streit et al., 2001). More recent evidence suggests that the emotional expression of a face can modulate face-specific ERPs (Williams et al., 2006), with negative emotions eliciting larger face-specific ERPs (Caharel et al., 2005).

Limited data exist regarding ERP abnormalities associated with emotion processing deficits in schizophrenia. One study showed that negative emotional expressions elicited larger P300 amplitudes than positive expressions in healthy subjects, but the opposite was true for schizophrenia patients (An et al., 2003). There is a suggestion that this differential response is mediated by clinical disease subtype and is not found in the paranoid subtype (Ueno et al., 2004). Other studies showed reduced affect-modulated ERP amplitudes over frontal electrode sites 180–250 ms after face presentation in schizophrenia patients (Horley et al., 2001; Streit et al., 2001). Streit et al. described this as a distinct deficit that could be distinguished from perceptual face processing, which remained intact. However, others have suggested that emotional face recognition deficits in schizophrenia are secondary “flow-on” effects of broader deficits in visual face processing (Johnston et al., 2005). Consistent with the latter view, Herrmann et al. (2004) reported schizophrenia deficits in the face-specific N170 ERP response, indicating dysfunction of early visual processing of faces independent of emotional content. Furthermore, these investigators failed to replicate the early 180–250 ms affect-modulated recognition deficit reported by Streit (Herrmann et al., 2006).

In this study we investigated the temporal dynamics of emotional face recognition in schizophrenia patients. We employed an emotion recognition task with happy, sad, and neutral faces as stimuli. Our design uniquely used an extremely brief duration of stimulus presentation to selectively probe affectively mediated visual processing in relative isolation from both early structural encoding and later higher-order cognitive processing.

2. Materials and methods

2.1 Subjects

This study was carried out in accordance with the Declaration of Helsinki regarding human research and with approval of the university’s Institutional Review Board. Participants included 16 patients (12 male, 4 female) with a DSM-IV diagnosis of schizophrenia and 16 healthy comparison subjects (12 male, 4 female) recruited at the Schizophrenia Research Center of the University of Pennsylvania. All patients were stable outpatients at the time of testing. All subjects received a semi-structured psychiatric interview (DIGS, Diagnostic Interview for Genetic Studies). Patients were excluded for any concurrent Axis I diagnosis other than schizophrenia. Control subjects were excluded for any history of an Axis I diagnosis, Axis II Cluster A (schizotypal, schizoid, or paranoid) personality disorder, or family history of an Axis I psychotic disorder. Across groups, subjects were excluded for any history of a neurological disorder, including head trauma with loss of consciousness, any lifetime history of substance dependence, history of substance abuse within the preceding six months, or any medical condition that might affect cerebral functioning. IRB-approved written informed consent was obtained from all subjects.

Demographic and clinical descriptions of the control and patient samples are presented in Table 1 and Table 2. As expected, patients had less formal education than the comparison subjects [t(30) = 5.82, p < 0.00001] but did not differ in age distribution [t(30) = 1.18, p = 0.25] or handedness [χ²(2) = 3.69, p = 0.16]. Patients were assessed with the Brief Psychiatric Rating Scale (BPRS) (Overall and Gorham, 1962), the Scale for Assessment of Negative Symptoms (SANS) (Andreasen, 1983), the Scale for Assessment of Positive Symptoms (SAPS) (Andreasen, 1984) and the Hamilton Depression Rating Scale (Ham-D) (Hamilton, 1960). These indicated a relatively mild level of psychiatric symptomatology. Ten patients were taking atypical antipsychotic medications at the time of study; six patients were unmedicated.

Table 1.

Demographic Profile of the Sample (mean ± SD)

Patients (n = 16) Controls (n = 16)
Men (#) 12 12
Women (#) 4 4
Age (years) 30.5 ± 6.0 28.1 ± 5.4
Age range (min – max) 22 – 38 18 – 38
Education (years) 12.5 ± 1.8 17.1 ± 2.5 *
Right handed (#) 12 13
Left handed (#) 3 2
Mixed handed (#) 1 1
* Subject-group difference, p < 0.00001.

Table 2.

Clinical Profile of the Patient Sample (mean ± SD)

Male (n = 12) Female (n = 4)
Age (years) 30.3 ± 6.4 31.3 ± 5.1
Age at onset (years) 22.1 ± 6.4 23.3 ± 6.5
Age range (min-max) 21 – 38 24 – 36
Duration of illness (years) 7.5 ± 5.5 10.5 ± 8.3
Number of hospitalizations 1.9 ± 3.1 1.3 ± 1.0
Deficit / Nondeficit (#) 6 / 6 1 / 3
Medicated / Unmedicated (#) 7 / 5 3 / 1
Brief Psychiatric Rating Scale (BPRS) 33.1 ± 12.8 27.0 ± 1.7
Hamilton Depression Rating Scale 13.7 ± 9.4 5.0 ± 2.6
Negative Symptom Scale (SANS)
     Mean Global Rating 1.8 ± 0.7 0.9 ± 0.3
       Affect 1.9 ± 1.4 1.7 ± 1.5
       Alogia 1.5 ± 1.4 0.0 ± 0.0
       Avolition 1.7 ± 1.5 1.0 ± 1.7
       Anhedonia 3.5 ± 0.7 1.7 ± 1.5**
       Attention 1.9 ± 1.1 0.0 ± 0.0
Positive Symptom Scale (SAPS)
     Mean Global Rating 1.4 ± 0.7 0.3 ± 0.3 *
       Hallucinations 1.9 ± 1.0 0.9 ± 0.5
       Delusions 2.7 ± 1.1 0.7 ± 1.2 *
       Bizarre behavior 0.3 ± 0.9 0.0 ± 0.0
       Formal thought disorder 1.8 ± 1.0 0.0 ± 0.0
* Male–female difference, p < 0.05.

** p < 0.01.

2.2 Stimulus materials and procedure

The behavioral task was constructed from Penn facial emotion stimuli (Erwin et al., 1992). Subjects were asked to identify emotional expressions from grayscale photographs of 2 male and 2 female faces, each varying in both emotional intensity and valence (happy versus sad). There were 5 photographs for each of the 4 faces, depicting very sad (VS), somewhat sad (S), neutral (N), somewhat happy (H), and very happy (VH) emotions. In a forced-choice design, subjects were instructed to press a button indicating whether each face was “Happy”, “Sad”, or “Neutral”. Since neutral faces had no associated emotional intensity, happy (H and VH) and sad (S and VS) faces were each repeated 10 times, whereas neutral (N) faces were repeated 20 times. This ensured that the total number of happy, sad, and neutral faces was equal. A total of 240 faces were presented in pseudorandom order, with a stimulus duration of 100 ms. The brief stimulus duration was intended to prevent any extensive perusal and cognitive assessment of the stimuli, so as to elicit a more spontaneous evaluation of the emotional features of each face. During task development we found that a 100 ms stimulus duration allowed subjects to readily identify the emotional expression of each face even though they often could not identify the individual photograph.
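For concreteness, the sketch below reproduces the trial bookkeeping described above in Python. The face identity labels, the fixed random seed, and the list-based shuffling are illustrative assumptions, not details of the original presentation software.

```python
import random

# Illustrative reconstruction of the stimulus list (labels and seed are assumptions).
# 4 face identities x 5 expressions; H, VH, S, and VS repeat 10 times each and N repeats
# 20 times, so happy, sad, and neutral trials are equally frequent (80 each, 240 total).
FACES = ["male_1", "male_2", "female_1", "female_2"]
REPEATS = {"VS": 10, "S": 10, "N": 20, "H": 10, "VH": 10}
STIM_DURATION_MS = 100   # each face was shown for 100 ms

trials = [(face, emotion)
          for face in FACES
          for emotion, n_reps in REPEATS.items()
          for _ in range(n_reps)]

random.seed(0)           # fixed seed stands in for the pseudorandom presentation order
random.shuffle(trials)

assert len(trials) == 240
assert sum(e in ("H", "VH") for _, e in trials) == 80   # happy = sad = neutral = 80 trials
```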

2.3 Data recording and processing

Electrodes were applied to the scalp using a 32-channel Quickcap (Neuroscan, El Paso, TX) in an expanded 10–20 system scalp montage referenced to the left mastoid. Bipolar electrodes recorded horizontal and vertical eye movements. Electrical potentials were amplified with a Neuroscan Synamps 32-channel amplifier (gain: 1000; range: 5.5 mV; resolution: 0.084 µV; bandpass filter settings: 0.10–50.0 Hz). Data were digitally sampled at 500 Hz and stored for off-line analysis. The continuous EEG was visually scanned for gross artifact and digitally filtered with a 0.50–20.0 Hz zero phase-shift bandpass filter (12 dB/octave). An eye-movement correction algorithm with established reliability and validity (Semlitsch et al., 1986) was applied to reduce EOG contamination. The continuous recordings were segmented into epochs extending from 100 ms pre-stimulus to 600 ms post-stimulus and re-referenced to the mathematical average of all scalp potentials at each time point, to provide a set of unbiased reference-free measures. Average evoked potential waveforms were then computed from epochs for each stimulus type (VS, S, N, H, and VH). Grand averages of the ERP responses to different emotion conditions, at selected electrode sites, are illustrated in Figure 2.
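A comparable preprocessing pipeline can be sketched with MNE-Python. This is not the software used in the original study; the file name and trigger codes are assumptions, and the regression-based ocular correction is only noted as a comment rather than implemented.

```python
import mne  # illustrative sketch only; not the Neuroscan pipeline used in the study

raw = mne.io.read_raw_cnt("subject01.cnt", preload=True)   # hypothetical Neuroscan file
raw.filter(l_freq=0.5, h_freq=20.0, phase="zero")          # 0.5-20 Hz zero phase-shift bandpass
# A regression-based EOG correction (Semlitsch et al., 1986) would be applied here.

events, _ = mne.events_from_annotations(raw)
event_id = {"VS": 1, "S": 2, "N": 3, "H": 4, "VH": 5}      # assumed trigger codes

epochs = mne.Epochs(raw, events, event_id,
                    tmin=-0.1, tmax=0.6,                    # 100 ms pre- to 600 ms post-stimulus
                    baseline=(None, 0), preload=True)
epochs.set_eeg_reference("average")                         # re-reference to the common average

evokeds = {cond: epochs[cond].average() for cond in event_id}  # per-condition average ERPs
```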

Fig. 2. Grand average common-average referenced ERP data for patients and controls from selected channels, for the VS, N, and VH stimulus conditions.

2.4 ERP component identification

The multi-channel ERP data were decomposed into constituent spatio-temporal components by computation of the global field power (GFP). GFP corresponds to the spatial standard deviation, across all electrodes, of the electrical potentials recorded at each time point. Computationally it is defined as:

GFP(t) = \sqrt{ \frac{ \sum_{i=1}^{N} \left( v_i(t) - \bar{v}(t) \right)^2 }{ N } }

where v_i(t) is the voltage recorded at electrode i at time point t, \bar{v}(t) is the mean voltage across all electrodes at time t, and N is the total number of electrodes. This provides a reference-independent measure of the variability of the potential field across the scalp, as a function of time. Intervals with high GFP denote discrete periods of evoked potential activity with characteristic stable scalp topographies (Skrandies, 1990). Figure 3 illustrates the GFP grand average computed across all subjects and experimental conditions. The GFP revealed 4 discrete ERP components, each with a distinctive scalp topography. The first component spanned the interval from 85 ms to 135 ms with bilateral maxima over the left and right occipital cortices, consistent with the visual P100. The second component occurred between 135 and 185 ms, with a vertex maximum and bilateral posterior temporal minima. This represents the face-specific N170. Component 3, denoted as N250, spanned the interval from 185 to 285 ms, with a mid-sagittal fronto-central minimum and a posterior parieto-occipital maximum. This component corresponds to the period immediately following structural face encoding that has been implicated, in some studies, in early affect modulation. The fourth component, between 350 and 450 ms post-stimulus, had a broad midline positivity and a parietal maximum, consistent with the visual P300.
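As a worked example, the definition above translates directly into a few lines of NumPy; the array shapes, placeholder data, and sampling parameters below are assumptions matching the recording description in Section 2.3.

```python
import numpy as np

def global_field_power(erp):
    """GFP(t): spatial standard deviation across electrodes at each time point.

    erp : array of shape (n_electrodes, n_timepoints), average-referenced, in microvolts.
    """
    v_bar = erp.mean(axis=0, keepdims=True)             # mean voltage across electrodes at each t
    return np.sqrt(((erp - v_bar) ** 2).mean(axis=0))   # root mean squared deviation from the mean

# Mean GFP in the N170 window (135-185 ms) for an epoch spanning -100 to 600 ms at 500 Hz.
fs, t_start = 500, -0.1
erp = np.random.randn(32, 351)                          # placeholder data: 32 channels, 351 samples
gfp = global_field_power(erp)
n170_window = slice(int((0.135 - t_start) * fs), int((0.185 - t_start) * fs) + 1)
mean_n170_gfp = gfp[n170_window].mean()
```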

Fig. 3. Above, time course and scalp topography of the grand-averaged GFP in both normal and schizophrenia subjects during emotional face recognition. Below, two-dimensional scalp topography of each ERP component, including the P100, N170, N250, and P300.

2.5 Statistical analysis

The electrophysiological data were analyzed using repeated measures analysis of variance (ANOVA) with Diagnosis (patients vs. controls) as a between-subjects factor and emotional valence of the stimulus (VS, S, N, H, or VH), designated as Emotion, as a within-subject factor. Separate analyses were conducted for each of the 4 ERP components. The dependent measure was the mean GFP within the specified time interval. Significant main effects of Diagnosis or Emotion (p < 0.05) were followed by separate group comparisons for each Emotion valence, using a Bonferroni correction for 5 post-hoc tests (p < 0.01).
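The same mixed design can be sketched in Python using the pingouin package; this is not the statistics software used in the original analysis, and the data file and column names are assumptions.

```python
import pandas as pd
import pingouin as pg

# Long-format data, one row per subject x emotion valence (column names assumed):
#   subject, diagnosis ("patient"/"control"), emotion ("VS","S","N","H","VH"), gfp
df = pd.read_csv("n170_gfp_long.csv")   # hypothetical file of mean GFP in one component window

# Mixed-design ANOVA: Diagnosis between subjects, Emotion within subjects.
aov = pg.mixed_anova(data=df, dv="gfp", within="emotion",
                     subject="subject", between="diagnosis")
print(aov)

# Post-hoc group comparisons for each emotion valence, Bonferroni-corrected for 5 tests.
pvals = df.groupby("emotion").apply(
    lambda g: pg.ttest(g.loc[g.diagnosis == "patient", "gfp"],
                       g.loc[g.diagnosis == "control", "gfp"])["p-val"].iloc[0])
print((pvals * 5).clip(upper=1.0))      # Bonferroni-adjusted p-values
```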

Behavioral data were analyzed similarly, with Diagnosis as a between-subjects factor and Emotion as a within-subject factor. The dependent measure was the arcsine of the percent of correctly classified faces for each Emotion valence. The arcsine transform provided an approximately normally distributed measure of task performance in the presence of ceiling effects. Behavioral data for one female control subject were unavailable due to technical problems.
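For clarity, the transform referred to here is assumed to be the standard arcsine square-root (angular) transform of the proportion correct; a minimal sketch follows.

```python
import numpy as np

def arcsine_transform(p_correct):
    """Angular transform, assuming the standard arcsin(sqrt(p)) form of the arcsine transform."""
    return np.arcsin(np.sqrt(np.clip(p_correct, 0.0, 1.0)))

# e.g. a subject who correctly classified 72 of 80 happy faces:
print(arcsine_transform(72 / 80))   # ~1.249 radians; perfect performance (p = 1.0) maps to pi/2
```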

For dependent measures that exhibited patient abnormalities, comparisons between medicated and unmedicated patients and correlational analyses were conducted to determine if there were any associations with clinical symptoms or treatment. Given the relatively small sample size and the exploratory nature of these analyses, there were no corrections for multiple comparisons.

3. Results

3.1 Behavioral data

There was a significant main effect for Diagnosis [F(1,29) = 24.04, p < 0.0001] with control subjects performing better, overall, than patients. There was also a significant main effect for Emotion independent of Diagnosis [F(4,116) = 3.98, p < 0.01]. As illustrated in Figure 4, the two more subtle expressions of emotion (S, H) were more difficult to identify than the more extreme emotions (VS, VH) or the neutral faces. Patients and controls exhibited similar response profiles, and separate paired contrasts for each Emotion valence showed that patients were significantly impaired when responding to all Emotion valences (Bonferroni-corrected p < 0.01) except VH (corrected p = 0.19).

Fig. 4. Mean ± SD of the arcsine transform of the number of correctly identified emotions for each category of emotional face stimuli.

Among patients, the ability to recognize both VH and H faces was inversely correlated with the severity of negative symptoms as indexed by the overall SANS score. This was especially true for the more difficult identification of H faces (r = −0.68, p < 0.01). Analysis of the individual subscales of the SANS revealed a significant inverse correlation with the alogia symptom complex (r = −0.86, p < 0.001). There were no associations with positive symptoms as indicated by either total SAPS or individual SAPS subscale scores, and no associations with the Ham-D.

3.2 Electrophysiological Data

Mean GFP for patients and controls, for each ERP component and Emotion valence, is presented in Table 3.

Table 3.

Global Field Power (µV²), mean ± SD

Schizophrenia Controls
P100 N170 N250 P300 P100 N170 N250 P300


VS 1.47±.68 1.57±.70* 1.48±.78 1.02±.28*** 1.66±.49 2.39±1.15 1.94±.81 1.78±.55
S 1.40±.66 1.49±.71** 1.53±.89 1.01±.33*** 1.48±.44 2.34±.97 2.10±.85 1.75±.57
N 1.32±.66 1.48±.79* 1.47±1.00 0.93±.40** 1.55±.53 2.11±.91 1.94±.90 1.50±.59
H 1.50±.83 1.63±.77* 1.53±.84 1.03±.37** 1.75±.54 2.41±.89 2.01±.85 1.57±.50
VH 1.45±.61 1.68±.86* 1.49±.88 1.03±.30*** 1.77±.53 2.43±1.13 1.95±.83 1.82±.79
* Group differences in mean GFP amplitude: p < 0.05.

** p < 0.01.

*** p < 0.001.

P100

There was no difference between patients and control subjects in the P100 response. However, there was a significant effect of Emotion across groups [F(4,120) = 3.97, p < 0.01]. Deviation contrasts, which compared the response for each individual Emotion valence to the mean response of all the other valences, indicated that the P100 was selectively reduced for S [F(1,30) = 11.79, p < 0.01] and N [F(1,30) = 6.47, p < 0.05] faces compared to all others, with the largest responses occurring for the more intense emotional faces.

N170

Schizophrenia subjects exhibited smaller N170 responses overall when compared to healthy controls [F(1,30) = 6.27, p < 0.05]. There was also a significant main effect of Emotion [F(4,120) = 4.13, p < 0.01]. As illustrated in Figure 5, the response to N faces was smaller, across groups, than the response to faces that overtly expressed emotion [F(1,30)=13.89, p < 0.001]. This reduced N170 response was especially notable in healthy controls. In separate group contrasts, patients had markedly reduced N170 responses across all 5 Emotion valences, but only the S valence survived Bonferroni correction for multiple comparisons [F(1,30) = 4.51, corrected p < 0.05].

Fig. 5. Mean ± SD GFP of the N170 response during perception of emotional faces in normal and schizophrenia subjects.

Among patients, the N170 response to S and VS faces was inversely correlated with positive symptoms, as indicated by the overall SAPS score [S: r = −0.56, p < 0.05; VS: r = −0.54, p < 0.05]. Examination of the individual SAPS subscales revealed an isolated inverse relationship between the N170 response to sad faces and the level of delusional symptoms [r = −0.50, p < 0.05 1-tailed]. There were no associations with other SAPS subscales, with any SANS ratings, or with Ham-D. There was also no difference between medicated and unmedicated patients.

N250

There was a non-significant trend towards patients having reduced N250 amplitudes [F(1,30) = 2.70, p = 0.11]. However, there was no main effect of Emotion [F(4,120) = 1.31, p = 0.27], and the difference between patients and controls was not affected by Emotion valence.

P300

Schizophrenia patients had significantly reduced P300 amplitudes compared to controls across all Emotion valences [F(1,30) = 18.41, p < 0.001]. There was also a significant main effect of Emotion on the P300 response, independent of Diagnosis [F(4,120) = 4.58, p < 0.01]. Paralleling what we found for the earlier N170, the P300 response to N faces was selectively reduced relative to the response to emotionally expressive faces [F(1,30) = 13.79, p < 0.001]. There was also a trend suggesting an interaction between Diagnosis and Emotion [F(4,120) = 2.08, p = 0.09]. As illustrated in Figure 6, the relative difference between N faces and the other Emotion valences was greater for the controls than for the schizophrenia patients. Patients exhibited less P300 variability across Emotion valences. However, individual paired contrasts supported a robust patient-control difference in P300 response for each Emotion valence (Bonferroni-corrected p < 0.01). There was no effect of medication status on patient responses. The only association to clinical symptomatology was an inverse relationship between the P300 response to VS faces and the Ham-D rating (r = −0.56, p < 0.05).

Fig. 6. Mean ± SD GFP of the P300 response during perception of emotional faces in normal and schizophrenia subjects.

Given the parallel N170 and P300 deficits in patients, we considered whether these two abnormalities were related to each other. If so, the P300 abnormality might be a "downstream" consequence of the earlier encoding deficit. We therefore conducted a canonical correlation analysis of these two sets of variables within the patient sample. This revealed one significant canonical root, with canonical R = 0.96 [χ²(25) = 45.9, p = 0.007]. Based on this single canonical factor, patients' N170 amplitudes accounted for 69.9% of the variance of the P300. Total P300 variance redundancy, based on all 5 canonical factors, was 77.8%. A parallel analysis in the control sample revealed no statistically significant canonical correlates.
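For readers who wish to reproduce this kind of analysis, the sketch below computes canonical correlations between two sets of five measures with scikit-learn. The placeholder arrays stand in for the patients' per-valence N170 and P300 GFP values, and the chi-square significance test reported above is not reimplemented here.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
X = rng.normal(size=(16, 5))      # placeholder: patients' N170 GFP for VS, S, N, H, VH
Y = rng.normal(size=(16, 5))      # placeholder: patients' P300 GFP for the same valences

cca = CCA(n_components=5).fit(X, Y)
U, V = cca.transform(X, Y)        # canonical variates of X and Y

# Canonical correlation for each of the 5 canonical roots.
canonical_r = [np.corrcoef(U[:, k], V[:, k])[0, 1] for k in range(5)]

# Redundancy-style index: share of Y (P300) variance explained by the first X (N170) variate.
p300_variance_explained = np.mean(
    [np.corrcoef(U[:, 0], Y[:, j])[0, 1] ** 2 for j in range(Y.shape[1])])
print(canonical_r, p300_variance_explained)
```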

3.3 Behavioral - Electrophysiological Relationships

We considered whether behavioral performance measures were related to specific electrophysiological measures by conducting canonical correlation analyses between the 5 behavioral measures and the set of 5 GFP measures computed for each individual ERP component. The only significant relationship was between performance and P300 amplitude, with canonical R = 0.81 [χ²(25) = 48.6, p = 0.003]. Behavioral performance accounted for 48.0% of P300 variance; P300 accounted for 25.5% of the variance in performance. An examination of the underlying factor structures revealed that both the behavioral and P300 canonical factors had their greatest factor loadings for the VS, S and N faces, with reduced loading for H and VH faces. None of the earlier ERP components exhibited significant associations with performance.

4. Discussion

The behavioral findings in this study are relatively consistent with prior studies of emotional face recognition in schizophrenia. Controls performed better overall when compared to schizophrenia patients, and both groups were better able to recognize more intense expressions of emotion (VH and VS) compared to more subtle expressions of the same emotions. The finding that impaired emotional face recognition in schizophrenia patients was less obvious for VH faces agrees with previous reports of a more severe recognition deficit for negative emotions (Kohler et al., 2003; Leppanen et al., 2006). However, the inverse relationship that we observed, between negative symptoms and the ability to identify happy faces, suggests that this finding of more severe recognition deficits for negative emotions is modulated by specific clinical symptomatology. While it was easier overall for patients to identify happy rather than sad faces, those with more severe negative symptoms, particularly alogia, were less able to take advantage of this reduction in task difficulty.

The electrophysiological data provide insights into the neural mechanisms underlying these behavioral findings. We observed a significant effect of emotional valence on the P100 visual evoked potential, which precedes face-specific structural encoding. Reduced P100 amplitudes were observed during perception of neutral and more subtle sad expressions. An effect of emotional valence at this early stage of visual processing has not been previously reported. Our observation of this effect may reflect differences in stimulus duration between our experiment and previous studies. The brief stimulus duration that we employed prevented any protracted or strategic observation of stimulus features. Rather, subjects had to very quickly process the elemental facial features that were indicative of one or another emotional valence. This may have accentuated the top-down effect of selective attention on the P100 response to the more easily identified intense emotions. Importantly, there were no patient-control differences in the P100 response. This is consistent with Johnston et al. (2005), who also found the P100 component to be intact in patients during face processing. Therefore, although P100 abnormalities have been reported in schizophrenia (Contreras et al., 1990), differences in affect recognition do not reflect a deficit at this relatively non-specific stage of early visual processing.

However, the two groups did differ in their N170 responses, indicating a fundamental deficit in patients for the structural encoding and perceptual integration of facial features. Coupled with the lack of a significant patient–control difference in the subsequent affect-modulated N250 response, this suggests that facial affect recognition deficits are the secondary or "downstream" sequelae of earlier deficits in visual face processing. Our conclusions are therefore consistent with the findings of Johnston et al. (2005) and Herrmann et al. (2004), and contrary to Streit's (2001) contention that schizophrenia patients have a specific deficit in facial emotion recognition but otherwise intact face perceptual processing. The finding of reduced N170 amplitude to face stimuli in schizophrenia is not specific to the emotion recognition task. It has also been reported in a visual target detection task, in which pictures of emotionally neutral faces were intermixed with pictures of other objects (Onitsuka et al., 2006). This N170 amplitude decrement was correlated with decreased fusiform gyrus gray matter volume. Since the fusiform gyrus is thought to be the source of the N170 (Watanabe et al., 1999), this finding supports the idea of a primary disturbance in the neural substrate underlying structural face encoding.

We also found significant patient deficits in P300 amplitude. These were evident across all emotional valences, but were especially notable for the negative emotions. In this sense, our data are consistent with those of An et al. (2003). However, the canonical analysis demonstrated that reduced P300 amplitudes, in patients, were highly correlated with amplitude reductions in the early N170 response. While this does not establish a causative relationship, it suggests that disturbances in higher-order evaluative processes may be secondary consequences of earlier disruptions in sensory encoding of facial stimuli.

There are several points to note about these findings. First, there is increasing evidence that schizophrenia patients exhibit visual processing deficits that are specific to the magnocellular visual system (Butler et al., 2001). Functionally, this manifests, in part, as an inability to integrate discrete elements of an image into a total gestalt (Doniger et al., 2002). We recently demonstrated that patients are selectively impaired in processing "global" vs. "local" features of a composite visual stimulus, and that this is associated with reduced amplitude of the N150 visual ERP (Johnson et al., 2005). Face perception requires the ability to integrate discrete features into a global composite representation. So, the schizophrenia deficit in face processing may not be a unique impairment but may instead be one specific manifestation of a broader deficit in global perceptual integration. Additional studies, examining face perception in conjunction with other perceptual object integration tasks within the same patient sample, would allow this hypothesis to be tested directly.

Second, the N170 response in this study was not independent of the emotional valence of the face. Faces that expressed emotions elicited larger responses than the neutral faces. This suggests that the distinction between the N170, as an ERP index of structural encoding of facial features, and the subsequent N250, as an index of affect-related modulation of this perceptual response, may not be entirely valid. The fact that N250 amplitude did not vary with emotional valence further supports the notion that these two aspects of face processing are not discrete and separable. Rather, the features that contribute to the affective connotation of a face also seem to facilitate its basic perceptual encoding. Such a functional overlap is not typically observed. However, a similar finding, of enhanced N170 amplitude in response to fearful faces compared to neutral expressions, has recently been reported (Blau et al., 2007). It may be significant, in this regard, that the Blau et al. study also used a 100 ms stimulus duration for their faces. Possibly the functional overlap is accentuated or unmasked in some way by the brief stimulus presentation employed in these experiments. It would be worthwhile, therefore, to examine whether our findings also generalize to fearful or angry faces.

Third, the amplitude of the N170 response to sad faces was inversely associated, among patients, with the severity of positive symptoms and, more specifically, with the presence of delusions. This is an intriguing finding, as misidentification syndromes, such as Capgras and Fregoli delusions, are dramatic - albeit rare - clinical phenomena in schizophrenia (Ellis et al., 1994; Feinberg and Roane, 2005). Little is known about the neurophysiological origins of misidentification syndromes, but it has been suggested that impaired visual processing may contribute to their etiology (Phillips and David, 1995). While the patients in our study did not have overt misidentification delusions, the association between paranoid delusions and decreased N170 amplitude is consistent with the hypothesis that disturbances in early visual face encoding could contribute to such clinical phenomena.

Fourth, the question of gender differences was one that we could not address in this study, due to the limited number of female participants. Behaviorally, male schizophrenia patients perform worse than female patients on facial emotion recognition tasks (Kohler et al., 2000b). In contrast, healthy women tend to respond preferentially to "local" rather than "global" features of complex visual stimuli (Roalf et al., 2006). So, if emotion recognition deficits reflect an underlying impairment in "global" stimulus processing in schizophrenia, we might expect female patients to perform worse, rather than better, on this task. However, an initial study of "global" vs. "local" stimulus processing in schizophrenia found a "global" processing deficit in patients that was independent of normal gender differences (Johnson et al., 2005). Additional ERP studies of face perception and emotion recognition, with samples large enough to address gender differences, are needed to tease apart what is likely a complex multi-factorial relationship.

Finally, there was an apparent inconsistency between the electrophysiological and the behavioral data that requires comment. Behavioral performance was associated with negative symptoms, while the N170 ERP response was related to positive symptoms. However, the association between task performance and symptomatology was specific to happy faces, which were easier for the patients to identify, while the N170 ERP association was specific to sad faces. This suggests that, for patients with negative symptoms, there may have been a performance ceiling effect that limited their ability to take full advantage of the decreased task demands associated with happy face recognition. However, this appears to be independent of the physiological abnormality indexed by the N170. This is consistent with the canonical correlation analysis which found task performance to be significantly related to P300 but not N170 amplitude, despite the fact that N170 and P300 shared a substantial proportion of variance.

In conclusion, this study indicates that patients with schizophrenia exhibit a physiological abnormality in relatively early visual processing mechanisms underlying structural face encoding. This deficit is relatively independent of the emotional expression of the face, though structural encoding appears to be enhanced by emotional expression. The abnormality may be a specific example of a more diffuse "global vs. local" visual processing deficit, and it may underlie clinical misidentification syndromes. However, the data do not support the hypothesis that schizophrenia patients have a specific impairment in the ability to recognize the emotional expressions of faces, as opposed to other elements of face processing. Future studies need to examine whether these conclusions hold for emotions other than happy and sad.

Fig. 1. Examples of the facial emotion stimuli, ranging in valence and intensity from VS (left), through N (center), to VH (right).

Acknowledgement

No specific acknowledgements.

Footnotes

Conflict of Interest

None of the authors have any actual or potential conflicts of interest that could inappropriately influence, or be perceived to influence, this work.


References

1. Adolphs R, Damasio H, Tranel D, et al. Cortical systems for the recognition of emotion in facial expressions. J. Neurosci. 1996;16(23):7678–7687. doi: 10.1523/JNEUROSCI.16-23-07678.1996.
2. An SK, Lee SJ, Lee CH, et al. Reduced P3 amplitudes by negative facial emotional photographs in schizophrenia. Schizophr. Res. 2003;64(2–3):125–135. doi: 10.1016/s0920-9964(02)00347-x.
3. Andreasen NC. The Scale for the Assessment of Negative Symptoms (SANS). Iowa City: The University of Iowa; 1983.
4. Andreasen NC. The Scale for the Assessment of Positive Symptoms (SAPS). Iowa City: The University of Iowa; 1984.
5. Balconi M, Lucchiari C. Event-related potentials related to normal and morphed emotional faces. J. Psychol. 2005;139(2):176–192. doi: 10.3200/JRLP.139.2.176-192.
6. Blau VC, Maurer U, Tottenham N, et al. The face-specific N170 component is modulated by emotional facial expression. Behav. Brain Funct. 2007;3:7. doi: 10.1186/1744-9081-3-7.
7. Bleuler E. Dementia praecox oder die Gruppe der Schizophrenien. In: Aschaffenburg G, editor. Handbuch der Psychiatrie. Leipzig: Deuticke; 1911.
8. Botzel K, Schulze S, Stodieck SR. Scalp topography and analysis of intracranial sources of face-evoked potentials. Exp. Brain Res. 1995;104(1):135–143. doi: 10.1007/BF00229863.
9. Bruce V, Young A. Understanding face recognition. Br. J. Psychol. 1986;77(3):305–327. doi: 10.1111/j.2044-8295.1986.tb02199.x.
10. Butler PD, Schechter I, Zemon V, et al. Dysfunction of early-stage visual processing in schizophrenia. Am. J. Psychiatry. 2001;158:1126–1133. doi: 10.1176/appi.ajp.158.7.1126.
11. Caharel S, Courtay N, Bernard C, et al. Familiarity and emotional expression influence an early stage of face processing: an electrophysiological study. Brain Cogn. 2005;59(1):96–100. doi: 10.1016/j.bandc.2005.05.005.
12. Contreras CM, Garnica R, Torres-Ruiz A, et al. Visual evoked potentials in a sample of schizophrenic patients. Bol. Estud. Med. Biol. 1990;38:22–28.
13. Doniger GM, Foxe JJ, Murray MM, et al. Impaired visual object recognition and dorsal/ventral stream interaction in schizophrenia. Arch. Gen. Psychiatry. 2002;59:1011–1020. doi: 10.1001/archpsyc.59.11.1011.
14. Edwards J, Pattison PE, Jackson HJ, et al. Facial affect and affective prosody recognition in first-episode schizophrenia. Schizophr. Res. 2001;48(2–3):235–253. doi: 10.1016/s0920-9964(00)00099-2.
15. Eimer M. Event-related brain potentials distinguish processing stages involved in face perception and recognition. Clin. Neurophysiol. 2000;111(4):694–705. doi: 10.1016/s1388-2457(99)00285-0.
16. Eimer M, Holmes A. An ERP study on the time course of emotional face processing. Neuroreport. 2002;13(4):427–431. doi: 10.1097/00001756-200203250-00013.
17. Ellis HD, Luaute JP, Retterstol N. Delusional misidentification syndromes. Psychopathology. 1994;27(3–5):117–120. doi: 10.1159/000284856.
18. Erwin RJ, Gur RC, Gur RE, et al. Facial emotion discrimination: I. Task construction and behavioral findings in normal subjects. Psychiatry Res. 1992;42(3):231–240. doi: 10.1016/0165-1781(92)90115-j.
19. Feinberg TE, Roane DM. Delusional misidentification. Psychiatr. Clin. North Am. 2005;28:665–683. doi: 10.1016/j.psc.2005.05.002.
20. Gur RC, Schroeder L, Turner T, McGrath C, Chan RM, Turetsky BI, Alsop D, Maldjian J, Gur RE. Brain activation during facial emotion processing. Neuroimage. 2002a;16:651–662. doi: 10.1006/nimg.2002.1097.
21. Gur RE, McGrath C, Chan RM, et al. An fMRI study of facial emotion processing in patients with schizophrenia. Am. J. Psychiatry. 2002b;159(12):1992–1999. doi: 10.1176/appi.ajp.159.12.1992.
22. Halgren E, Marinkovic K. Neurophysiological networks integrating human emotions. In: Gazzaniga MS, editor. The Cognitive Neurosciences. Cambridge: MIT Press; 1995. pp. 1137–1150.
23. Halgren E, Raij T, Marinkovic K, et al. Cognitive response profile of the human fusiform face area as determined by MEG. Cereb. Cortex. 2000;10(1):69–81. doi: 10.1093/cercor/10.1.69.
24. Halit H, de Haan M, Johnson MH. Modulation of event-related potentials by prototypical and atypical faces. Neuroreport. 2000;11(9):1871–1875. doi: 10.1097/00001756-200006260-00014.
25. Hamilton M. A rating scale for depression. J. Neurol. Neurosurg. Psychiatry. 1960;23:56–62. doi: 10.1136/jnnp.23.1.56.
26. Hariri AR, Bookheimer SY, Mazziotta JC. Modulating emotional responses: effects of a neocortical network on the limbic system. Neuroreport. 2000;11(1):43–48. doi: 10.1097/00001756-200001170-00009.
27. Hautecoeur P, Debruyne P, Forzy G, et al. Visual evoked potentials and face recognition. Influence of celebrity and emotional expression. Rev. Neurol. 1993;149(3):207–212.
28. Herrmann MJ, Ehlis AC, Muehlberger A, et al. Source localization of early stages of face processing. Brain Topogr. 2005;18(2):77–85. doi: 10.1007/s10548-005-0277-7.
29. Herrmann MJ, Ellgring H, Fallgatter AJ. Early-stage face processing dysfunction in patients with schizophrenia. Am. J. Psychiatry. 2004;161(5):915–917. doi: 10.1176/appi.ajp.161.5.915.
30. Herrmann MJ, Reif A, Jabs BE, et al. Facial affect decoding in schizophrenic disorders: a study using event-related potentials. Psychiatry Res. 2006;141(3):247–252. doi: 10.1016/j.psychres.2005.09.015.
31. Horley K, Gonsalvez C, Williams L, et al. Event-related potentials to threat-related faces in schizophrenia. Int. J. Neurosci. 2001;107(1–2):113–130. doi: 10.3109/00207450109149761.
32. Iidaka T, Matsumoto A, Haneda K, et al. Hemodynamic and electrophysiological relationship involved in human face processing: evidence from a combined fMRI-ERP study. Brain Cogn. 2006;60(2):176–186. doi: 10.1016/j.bandc.2005.11.004.
33. Johnson SC, Lowery N, Kohler C, et al. Global-local visual processing in schizophrenia: evidence for an early visual processing deficit. Biol. Psychiatry. 2005;58:937–946. doi: 10.1016/j.biopsych.2005.04.053.
34. Johnston PJ, Stojanov W, Devir H, et al. Functional MRI of facial emotion recognition deficits in schizophrenia and their electrophysiological correlates. Eur. J. Neurosci. 2005;22(5):1221–1232. doi: 10.1111/j.1460-9568.2005.04294.x.
35. Keil A, Muller MM, Gruber T, et al. Effects of emotional arousal in the cerebral hemispheres: a study of oscillatory brain activity and event-related potentials. Clin. Neurophysiol. 2001;112(11):2057–2068. doi: 10.1016/s1388-2457(01)00654-x.
36. Kerr SL, Neale JM. Emotion perception in schizophrenia: specific deficit or further evidence of generalized poor performance? J. Abnorm. Psychol. 1993;102(2):312–318. doi: 10.1037//0021-843x.102.2.312.
37. Kohler CG, Gur RC, Gur RE. Emotional processes in schizophrenia: a focus on affective states. In: Borod JC, editor. The Neuropsychology of Emotion. New York: Oxford University Press; 2000a. pp. 432–455.
38. Kohler CG, Bilker W, Hagendoorn M, et al. Emotion recognition deficit in schizophrenia: association with symptomatology and cognition. Biol. Psychiatry. 2000b;48(2):127–136. doi: 10.1016/s0006-3223(00)00847-7.
39. Kohler CG, Turner TH, Bilker WB, et al. Facial emotion recognition in schizophrenia: intensity effects and error pattern. Am. J. Psychiatry. 2003;160(10):1768–1774. doi: 10.1176/appi.ajp.160.10.1768.
40. LeDoux JE. Emotion: clues from the brain. Annu. Rev. Psychol. 1995;46:209–235. doi: 10.1146/annurev.ps.46.020195.001233.
41. Leppanen JM, Niehaus DJ, Koen L, et al. Emotional face processing deficit in schizophrenia: a replication study in a South African Xhosa population. Schizophr. Res. 2006;84(2–3):323–330. doi: 10.1016/j.schres.2006.02.007.
42. Mandal MK, Pandey R, Prasad AB. Facial expressions of emotions and schizophrenia: a review. Schizophr. Bull. 1998;24(3):399–412. doi: 10.1093/oxfordjournals.schbul.a033335.
43. Marinkovic K, Halgren E. Human brain potentials related to the emotional expression, repetition, and gender of faces. Psychobiology. 1998;26:348–356.
44. Morrison RL, Bellack AS, Mueser KT. Deficits in facial-affect recognition and schizophrenia. Schizophr. Bull. 1988;14(1):67–83. doi: 10.1093/schbul/14.1.67.
45. Onitsuka T, Niznikiewicz MA, Spencer KM, et al. Functional and structural deficits in brain regions subserving face perception in schizophrenia. Am. J. Psychiatry. 2006;163:455–462. doi: 10.1176/appi.ajp.163.3.455.
46. Overall JE, Gorham DR. The brief psychiatric rating scale. Psychol. Rep. 1962;10:799–812.
47. Phelps EA, O'Connor KJ, Gatenby JC, et al. Activation of the left amygdala to a cognitive representation of fear. Nat. Neurosci. 2001;4(4):437–441. doi: 10.1038/86110.
48. Phillips ML, David AS. Facial processing in schizophrenia and delusional misidentification: cognitive neuropsychiatric approaches. Schizophr. Res. 1995;17:109–114. doi: 10.1016/0920-9964(95)00035-k.
49. Phillips ML, Williams L, Senior C, et al. A differential neural response to threatening and non-threatening negative facial expressions in paranoid and non-paranoid schizophrenics. Psychiatry Res. 1999;92(1):11–31. doi: 10.1016/s0925-4927(99)00031-1.
50. Phillips ML, Young AW, Scott SK, et al. Neural responses to facial and vocal expressions of fear and disgust. Proc. Biol. Sci. 1998;265(1408):1809–1817. doi: 10.1098/rspb.1998.0506.
51. Pizzagalli D, Lehmann D, Koenig T, et al. Face-elicited ERPs and affective attitude: brain electric microstate and tomography analyses. Clin. Neurophysiol. 2000;111(3):521–531. doi: 10.1016/s1388-2457(99)00252-7.
52. Pizzagalli D, Regard M, Lehmann D. Rapid emotional face processing in the human right and left brain hemispheres: an ERP study. Neuroreport. 1999;10(13):2691–2698. doi: 10.1097/00001756-199909090-00001.
53. Potter DD, Parker DM. Electrophysiological correlates of facial identity and expression processing. In: Crawford JR, Parker DM, editors. Developments in Clinical and Experimental Neuropsychology. New York: Plenum Press; 1989. pp. 143–155.
54. Roalf DR, Lowery N, Turetsky BI. Behavioral and physiological findings of gender differences in global-local visual processing. Brain Cogn. 2006;60:32–42. doi: 10.1016/j.bandc.2005.09.008.
55. Sachs G, Steger-Wuchse D, Kryspin-Exner I, et al. Facial recognition deficits and cognition in schizophrenia. Schizophr. Res. 2004;68(1):27–35. doi: 10.1016/S0920-9964(03)00131-2.
56. Semlitsch HV, Anderer P, Schuster P, et al. A solution for reliable and valid reduction of ocular artifacts, applied to the P300 ERP. Psychophysiology. 1986;23(6):695–703. doi: 10.1111/j.1469-8986.1986.tb00696.x.
57. Shtasel DL, Gur RE, Mozley PD, et al. Volunteers for biomedical research. Recruitment and screening of normal controls. Arch. Gen. Psychiatry. 1991;48(11):1022–1025. doi: 10.1001/archpsyc.1991.01810350062010.
58. Silver H, Shlomo N, Turner T, et al. Perception of happy and sad facial expressions in chronic schizophrenia: evidence for two evaluative systems. Schizophr. Res. 2002;55(1–2):171–177. doi: 10.1016/s0920-9964(01)00208-0.
59. Skrandies W. Global field power and topographic similarity. Brain Topogr. 1990;3(1):137–141. doi: 10.1007/BF01128870.
60. Streit M, Ioannides AA, Liu L, et al. Neurophysiological correlates of the recognition of facial expressions of emotion as revealed by magnetoencephalography. Brain Res. Cogn. Brain Res. 1999;7(4):481–491. doi: 10.1016/s0926-6410(98)00048-2.
61. Streit M, Wolwer W, Brinkmeyer J, et al. EEG-correlates of facial affect recognition and categorisation of blurred faces in schizophrenic patients and healthy volunteers. Schizophr. Res. 2001;49(1–2):145–155. doi: 10.1016/s0920-9964(00)00041-4.
62. Ueno T, Morita K, Shoji Y, et al. Recognition of facial expression and visual P300 in schizophrenic patients: differences between paranoid type patients and non-paranoid patients. Psychiatry Clin. Neurosci. 2004;58(6):585–592. doi: 10.1111/j.1440-1819.2004.01307.x.
63. Watanabe S, Kakigi R, Koyama S, et al. Human face perception traced by magneto- and electro-encephalography. Brain Res. Cogn. Brain Res. 1999;8:125–142. doi: 10.1016/s0926-6410(99)00013-0.
64. Whittaker JF, Deakin JF, Tomenson B. Face processing in schizophrenia: defining the deficit. Psychol. Med. 2001;31(3):499–507. doi: 10.1017/s0033291701003701.
65. Williams LM, Palmer D, Liddell BJ, et al. The 'when' and 'where' of perceiving signals of threat versus non-threat. Neuroimage. 2006;31(1):458–467. doi: 10.1016/j.neuroimage.2005.12.009.
