Author manuscript; available in PMC: 2017 Jul 1.
Published in final edited form as: Clin Psychol Sci. 2015 Dec 2;4(4):651–660. doi: 10.1177/2167702615609595

Impaired visual cortical processing of affective facial information in schizophrenia

S Maher 1, T Ekstrom 1, Y Chen 1,*
PMCID: PMC5098551  NIHMSID: NIHMS722766  PMID: 27833789

Abstract

Facial emotion perception impairment in schizophrenia is currently viewed as abnormal affective processing. Facial emotion perception also relies on visual processing. Yet, visual cortical processing of facial emotion is not well understood in this disorder. We measured perceptual thresholds for detecting facial fear and happiness in patients (n=23) and controls (n=23), and adjusted emotion intensity of facial stimuli (via morphing between images of neutral and emotive expressions) for each subject. We then evaluated activations of the visual cortex and amygdala during the performance of perceptually-equated facial emotion detection tasks. Patients had significantly lower fear- and happiness-induced activations in the visual cortex and amygdala. Activations between the visual cortex and amygdala were largely correlated, but the correlations in patients occurred abnormally early in response time course during fear perception. In schizophrenia, visual processing of facial emotion is deficient and visual and affective processing of negative facial emotion may be prematurely associated.

Keywords: Face, schizophrenia, fear, happiness, vision, emotion


Perception of facial emotion plays an important role in social functioning. This important perceptual function is impaired in schizophrenia (Walker, McGuire et al. 1984; Heimberg, Gur et al. 1992; Kohler, Turner et al. 2003; Kohler, Walker et al. 2010). Affective processing of facial emotion information is mediated primarily in the limbic system of the brain (Adolphs, Damasio et al. 1996; Williams, Phillips et al. 2001; Gur, Schroeder et al. 2002). Previous neuroimaging studies have established the limbic system as the brain substrate for impaired facial emotion perception in schizophrenia (Gur, McGrath et al. 2002; Holt, Kunkel et al. 2006; Gur, Loughead et al. 2007).

On the other hand, affective processing of facial emotion information in the limbic system is closely linked to visual processing in the occipital cortex. Activities in amygdala significantly modulate activities in the visual cortex (Vuilleumier, Armony et al. 2003; Vuilleumier, Richardson et al. 2004). Changes in visual signals not only impact activities in the visual cortex but also those in amygdala. Recent behavioral studies have shown that impaired facial emotion perception and impaired visual perception were associated in schizophrenia (Butler, Abeles et al. 2009; Norton, McBain et al. 2009). These functional relationships pose a question as to what role the visual system plays in impaired processing of face emotion information and how the visual system and the limbic system are functionally linked in this psychiatric disorder. To address this question, we need to separate activations in these respective brain systems.

Previous neuroimaging studies primarily examined activations of the limbic system in patients. As such, the contributions of the visual system to facial emotion perception impairments in schizophrenia could not be inferred from these previous results. In this study, we acquired and evaluated activations of the visual cortex and the amygdala during performance of facial emotion perception tasks.

To make valid group comparisons, we also needed to equate emotion intensity levels of face images across participants (and therefore, groups) while acquiring brain activations. Some previous neuroimaging studies showed reduced activations in amygdala while patients performed emotion judgment tasks (Gur, McGrath et al. 2002) or viewed face images with emotive expressions (Schneider, Weiss et al. 1998). Other studies however showed increased amygdala activations in response to emotive or even neutral facial expressions (Kosaka, Omori et al. 2002; Holt, Kunkel et al. 2006). The diverse pattern of amygdala activations could be due to patients’ perceptual misjudgment about face emotions as their capacities for perceiving facial emotion vary significantly and are biased (Phillips, Drevets et al. 2003). The perceptual misjudgment and uncertainty about facial emotions might result in different levels of brain activations when patients saw identical face images with emotive or neutral expressions. In this study, we resolved this issue first by measuring perceptual sensitivities to detect facial emotions in all subjects and then by adjusting emotion intensity of face images for individual subjects based on their own perceptual capacities. We acquired activations of the respective brain regions with these perceptually equated face images.

In schizophrenia, functional activations of the visual cortex are altered during basic visual perception (Braus, Weber-Fahr et al. 2002; Chen, Grossman et al. 2008; Martinez, Hillyard et al. 2008). The fact that facial emotion perception involves visual processing implies that activations in the visual cortex may be similarly affected when patients perform visual and facial emotion perception tasks. For the limbic regions, previous studies have shown that activations to face images with emotive expressions were reduced in patients (Phillips, Williams et al. 1999; Gur, McGrath et al. 2002; Williams, Das et al. 2004). Perceptual studies have also found that modulation of visual aspects of face images has a significant effect on patients’ behavioral performance on facial emotion perception (McBain, Norton et al. 2010; Lee, Gosselin et al. 2011), suggesting a potential role of visual processing in affective processing of facial emotion in schizophrenia. One fMRI resting-state study has found that hyper-connectivity existed between the visual cortex and amygdala in patients with visual hallucinations (Ford, Palzes et al. 2014).

Considering the evidence for a reduction in activity in amygdala and the evidence of increased resting state connectivity between visual cortices and amygdala in schizophrenia, we hypothesized that schizophrenia patients will show greater functional connectivity between amygdala and visual cortices, but lower activation levels in both structures compared to healthy controls during a facial emotion perception task. Heightened connectivity between these structures in schizophrenia might also mean that altered activation timecourse in one structure predicts similar alterations in the other.

Method

Participants

Twenty-three schizophrenia patients participated. They all met the DSM-IV criteria for schizophrenia or schizoaffective disorder. Diagnoses were made independently by experienced clinicians, based on a review of a structured clinical interview for DSM-IV (First, Spitzer et al. 1994) and by evaluating available medical records. Twenty patients were taking antipsychotic medication (daily chlorpromazine equivalent: mean = 493 mg, SD = 496 mg) (Woods 2003).

Twenty-three healthy controls participated. None met DSM-IV criteria for any Axis I psychiatric disorder according to the Structured Clinical Interview for DSM-IV, non-patient edition (First, Spitzer et al. 2002). None had a family history of psychosis.

No participant had any diagnosed organic brain disease or any history of substance abuse or dependence during the past 2 years. The groups did not differ in age (patients: M=44.65 years, SD=13.37; controls: M=39.13 years, SD=15.38) or gender (patients: 10 females; controls: 11 females), but controls had a slightly higher verbal IQ (t(17)=2.29, p=0.03).

The study protocol was approved by the McLean Hospital Institutional Review Board. Written informed consent was obtained from all participants prior to testing.

Equating facial stimuli for fMRI

Stimuli were images from the NimStim Face Stimulus Set (http://www.macbrain.org) that included happy, fearful, and neutral expressions. Prior to fMRI, psychophysical testing was conducted to determine perceptual thresholds, defined as the lowest emotion intensity level at which a participant can reliably (i.e. 80% correct) perceive a particular emotion (Figure 1a). Morphing between face images with neutral expression (0%) and face images with emotive expressions (100%) created five additional levels of emotion intensity (3%, 6%, 12%, 24% and 48%) (Norton, McBain et al. 2009; Maher, Ekstrom et al. 2015). These images were presented using a method of constant stimuli (Chen, Bidwell et al. 2005).
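The graded emotion intensities described above can be sketched as interpolation between a neutral (0%) and a fully emotive (100%) image. The snippet below is a hypothetical simplification: a pixel-wise cross-fade, whereas true face morphing also warps facial geometry. The function name `blend_faces` is illustrative, not from the study.

```python
import numpy as np

def blend_faces(neutral, emotive, intensity):
    """Return an image at `intensity` (0.0-1.0) between neutral and emotive.

    Pixel-wise linear cross-fade; a stand-in for true morphing, which also
    warps feature geometry between the two expressions.
    """
    return (1.0 - intensity) * neutral + intensity * emotive

# The five intermediate emotion intensity levels used in psychophysical testing
LEVELS = [0.03, 0.06, 0.12, 0.24, 0.48]
```

At intensity 0.0 the result equals the neutral image; at 1.0 it equals the emotive image; intermediate levels parameterize the stimulus continuum used in the method of constant stimuli.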

Figure 1.
(a) Result of pre-MRI psychophysical testing. The type of facial emotion is depicted on the x axis. Perceptual thresholds of patients and controls are plotted on the y axis (the higher the threshold, the worse the performance). Error bars indicate ±1 standard error. (b) Illustration of psychophysics-determined fearful and happy face stimuli for fMRI testing. For this illustration, the Th and Th2 conditions use the average perceptual threshold and twice the average perceptual threshold across all participants. (c) Perceptual performance during fMRI. The emotion intensity levels of the face images are plotted on the x axis. The proportion of trials judged as emotive is plotted on the y axis. Error bars indicate ±1 standard error.

Participants performed two facial emotion perception tasks (happiness and fear). Each task trial included two sequentially presented images – one with a neutral expression and the other with a specific level of emotional intensity. Each emotion intensity level was repeated eight times. The order of the seven emotion intensity levels and the order of neutral and emotive images within each trial were counterbalanced. Participants indicated whether the first or the second image was more emotive (happier or more fearful) – a forced-choice paradigm. Perceptual thresholds for each participant were determined by fitting the percentage of correct trials (accuracy) at each emotion intensity level to a standard psychometric function (Weibull regression: y = 100 − 50·exp[−(x/a)^b]). Details of the psychophysical method used for this study are also described elsewhere (Maher, Ekstrom et al. 2014). The derived thresholds were then used to equate emotion intensity of facial stimuli across subjects for subsequent fMRI.
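The threshold-fitting step above can be sketched in code. This is a minimal illustration, not the study's actual analysis script: the starting values and bounds for the Weibull parameters `a` (scale) and `b` (slope) are assumptions. The 80%-correct threshold follows by inverting the fitted function.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull(x, a, b):
    """Weibull psychometric function: chance (50%) at x=0, approaching 100%."""
    return 100.0 - 50.0 * np.exp(-(x / a) ** b)

def fit_threshold(intensities, pct_correct, criterion=80.0):
    """Fit (a, b) to accuracy data and return the emotion intensity (in % morph)
    at which accuracy reaches `criterion` percent correct.

    Starting values and bounds are illustrative assumptions.
    """
    (a, b), _ = curve_fit(weibull, intensities, pct_correct,
                          p0=(20.0, 2.0),
                          bounds=([1e-3, 0.1], [100.0, 10.0]))
    # Invert criterion = 100 - 50 exp[-(x/a)^b] for x
    return a * np.log(50.0 / (100.0 - criterion)) ** (1.0 / b)
```

For example, accuracy data generated by the function itself with a=20, b=2 yields an 80%-correct threshold of a·(ln 2.5)^(1/b), about 19.1% emotion intensity.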

For the fMRI, happy and fearful face stimuli for each participant were set at four emotion intensity levels: 0% emotion (neutral expression, a baseline condition), perceptual threshold, two times perceptual threshold, and 100% emotion (original emotive expression). The perceptual threshold condition (Th) and the two-times-perceptual-threshold condition (Th2) had emotion intensity set at each participant's pre-fMRI-determined perceptual threshold and at twice that threshold, respectively. The 0% and 100% emotion conditions used identical emotion intensity levels for all participants. Figure 1b shows the face stimuli at the neutral expression, average perceptual threshold, average two-times-perceptual-threshold, and 100% emotion intensity levels. As can be seen in Figure 1c, performance at threshold levels dropped from the presumed 80% during fMRI scanning, possibly due to the MR environment, which involves less-than-optimal stimulus presentation, supine positioning of subjects, and loud scanner noise. That said, these factors were the same for both groups and should not influence group differences.

Procedure

Participants lay in a supine position and were shown visual stimuli that were back-projected on a screen and reflected in a mirror attached to the MRI head-coil. The fMRI protocol used an event-related design. It contained four runs: two for fear perception and two for happiness perception. Each run had 96 trials (4 emotion intensity levels × 24 repeats). The stimulus presentation sequences were determined by optseq2 (Dale, Fischl et al. 1999) such that overlapping hemodynamic responses for each condition could be efficiently de-convolved. In each trial, a stimulus was displayed for 300 ms. The average inter-stimulus interval was 2056 ms. The task for participants was to indicate whether a face stimulus contained an emotive expression (happiness or fear) or not. Perceptual judgments were given using an MR-compatible button box.

MRI Acquisition

All MRI images were collected at the McLean Brain Imaging Center (Belmont, MA) on a 3.0 Tesla Siemens TIM Trio scanner (Siemens AG, Erlangen, Germany) using a 12-channel head coil. High-resolution structural images (TR=2.1 s, TE=3.3 ms, slices=128, matrix=256×256, flip angle=7°, resolution=1.0 mm×1.0 mm×1.33 mm) were used to register subjects’ imaging data to a standard space. Structural imaging data were read and interpreted by a clinical neuroradiologist to rule out neurological abnormalities.

To increase temporal resolution while ensuring sufficient spatial resolution, fMRI data were acquired using a multiband imaging technique. Each functional run had 32 3.5 mm axial slices with TR/TE=400 ms/30 ms, flip angle=30°, matrix=64×64 on a 220 mm×220 mm FOV, in plane resolution=3.44 mm × 3.44 mm, and multiband factor=8 (Feinberg, Moeller et al. 2010; Tong and Frederick 2014).

Analysis of fMRI Data

Analysis of fMRI data was performed using the fMRI Expert Analysis Tool (FEAT) version 6 (FMRIB’s Software Library, http://www.FMRIb.ox.ac.uk/fsl) (Smith, Jenkinson et al. 2004). General linear modeling (GLM) was performed at the single subject and group levels. After pre-processing (Supplement), data were analyzed with a three-level hierarchical modeling strategy. For the first two levels, a standard fixed effects analysis was performed. The first-level analysis was for each individual run, whereas the second level analysis combined runs for the same emotion type. The third level analysis used analysis of variance (ANOVA) on second-level parameter estimates averaged over regions-of-interest (visual cortex and amygdala) across subjects for each condition and group. An exploratory third-level whole brain analysis was also performed using a GLM to combine subjects in a mixed effects analysis (Flame 1) by group for the purpose of generating brain maps.

Dependent variables used to test the primary hypotheses were mean percent signal change values for each of the regions-of-interest (visual cortex and amygdala), extracted from spatial maps of regression coefficients using Featquery in FSL for each respective condition and group. ROIs were constructed from relevant neuroanatomy based on the Harvard-Oxford Cortical Structural Atlas (Desikan, Segonne et al. 2006). ROIs were further constrained to include only significant voxels (p < 0.05) from the 100% (emotive) vs. 0% (neutral) contrast for controls and patients combined. Coordinates for ROIs in MNI space (depicted in Figure 2a) were: left visual cortex (x=2, y=−90, z=25); right visual cortex (x=2, y=90, z=25); left amygdala (x=−24, y=−5, z=−22); right amygdala (x=−24, y=5, z=−22).
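The ROI summary measure described above can be sketched as follows. This is a simplified stand-in for FSL's Featquery, under the assumption that percent signal change is the regression coefficient scaled by the baseline signal and averaged over in-mask voxels; the function name is hypothetical.

```python
import numpy as np

def roi_percent_signal_change(beta_map, baseline_map, roi_mask):
    """Mean percent signal change within an ROI.

    Simplified stand-in for FSL Featquery: scale each voxel's regression
    coefficient by its baseline signal, then average over the ROI mask.
    """
    vox = roi_mask & (baseline_map > 0)  # guard against zero baseline
    return float(np.mean(100.0 * beta_map[vox] / baseline_map[vox]))
```

These per-subject, per-condition values are the inputs to the third-level ANOVAs on the visual cortex and amygdala ROIs.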

Figure 2.
(a) Summary of fMRI results. Top and bottom panels are for amygdala and visual cortex, respectively. On the left, the BOLD signal changes (Z-maps) during fear discrimination and happiness discrimination for controls and patients. The white enclosures in each panel outline anatomical locations of the ROIs: amygdala and visual cortex. On the right, means of emotion-induced BOLD signal changes (i.e. after subtraction of neutral expression from emotive expressions) in amygdala and visual cortex are depicted for controls (blue) and patients (red). Task (fear and happiness) and emotion intensity level (Th vs 0%, Th2 vs 0%, and 100% vs 0%) are depicted on the x axis. Error bars denote ±1 standard error. (b) Time course results. Top panels depict time course correlation between BOLD responses in visual cortex and amygdala. Bottom panels depict time course of emotion-induced activation in visual cortex and amygdala. Error bars denote ±1 standard error. (c) Scatter plot for correlation between BOLD response in visual cortex and amygdala. Each data point represents a participant during either fear or happiness perception. Best fitting line is for average across all participants and both emotion types.

Results

Behavioral responses

For pre-fMRI behavioral testing, perceptual thresholds for happiness perception were higher (worse performance) in patients (M=22.6%, SD=8.1%) than in controls (M=20.1%, SD=8.3%). Similarly, for fear perception, perceptual thresholds were higher in patients (M=31.6%, SD=8.4%) than in controls (M=23.1%, SD=10.0%). An ANOVA of the perceptual thresholds showed a significant interaction between emotion type (happiness vs. fear) and group (patient vs. control) (F=4.25, p<0.05). Follow-up analyses showed higher perceptual thresholds (poorer performance) in patients than in controls during fear perception (p<0.005), but not during happiness perception (Figure 1a).

During fMRI scanning, behavioral accuracy (% of correct judgments) was assessed by an ANOVA with factors of group (patient vs. control), emotion type (fear vs. happiness) and emotion intensity level (0%, Th, Th2, 100%). No effects of group or interactions with group were found (Figure 1c). An overall effect of emotion intensity level was found (F[1,45]=52.7, p<0.001) where behavioral performance improved as emotion intensity level increased (Figure 1c).

ROI BOLD responses

Two ANOVAs of BOLD signal change, one for visual cortex and the other for amygdala, were performed with factors of group by task by emotion intensity level. The values used for this analysis were extracted from parameter estimate contrasts of the neutral expression condition versus each of the other three emotion intensity levels (i.e. ANOVA contrast levels were: Th vs. 0%, Th2 vs. 0%, 100% vs. 0%) for each task.

These ANOVAs yielded significant group effects for the visual cortex (F[2,90]=6.127, p<0.02; Figure 2a) and for amygdala (F[2,90]=14.55, p<0.001; Figure 2a). These group effects indicate that during happiness and fear perception, activations of visual cortex and amygdala were significantly reduced in patients. The effect of emotion intensity across both groups was also significant (visual cortex: F[2,88]=4.88, p<0.01; amygdala: F[2,88]=7.56, p<0.001), indicating that during fear and happiness perception, activations of visual cortex and amygdala increased with increasing emotion intensity. No other main effects or interactions were found.

In other face-sensitive regions (Fusiform Face Area, Occipital Face Area and Superior Temporal Sulcus), emotion-induced activation was negligible or absent and is not described further here.

Timecourse of BOLD response

The timecourse of the BOLD response for visual cortex and amygdala was extracted using Perl Event-related Average Time-course Extraction (PEATE; www.jonaskaplan.com/peate/peate-tk.html). All timecourse analyses were performed with the 0% emotion intensity level subtracted from the 100% emotion intensity level. These data were then averaged across hemispheres. An ANOVA with factors of time point (2 to 8 seconds) and group was performed for each emotion type and each ROI. Timecourse analyses showed that patients had earlier activations in amygdala and visual cortex than did controls, as 1) significant interactions of group by time point were found for both emotion types in amygdala (Fs[6, 264]>2.9, ps<0.01), and 2) significant main effects of group were found for happiness in both visual cortex and amygdala (Fs[1, 44]>4.08, ps<0.05) (Figure 2b). Main effects of time point were also found for happiness in amygdala and visual cortex (Fs[6, 264]>4.5, ps<0.001).
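The event-related averaging underlying the timecourse extraction can be sketched as below. This is a simplified, hypothetical version of what a PEATE-style tool does: it aligns fixed-length epochs to event onsets and averages them, with no overlap correction or deconvolution.

```python
import numpy as np

def event_related_average(timeseries, onsets, window, tr):
    """Average fixed-length epochs of a BOLD timeseries following event onsets.

    Simplified PEATE-style extraction: onsets (in seconds) are rounded to the
    nearest TR; epochs that run past the end of the series are dropped.
    """
    n = int(round(window / tr))          # epoch length in TRs
    epochs = []
    for t in onsets:
        i = int(round(t / tr))           # onset index in TRs
        if i + n <= len(timeseries):
            epochs.append(timeseries[i:i + n])
    return np.mean(epochs, axis=0)
```

Group-by-time-point ANOVAs are then run on these averaged timecourses (here, 100% minus 0% intensity, averaged across hemispheres).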

Follow up analysis revealed greater activation for controls in the amygdala for happiness at time points 2 to 5 (ps<0.05). For fear, a marginal difference was found at time point 5 (p=0.056), where activation was greater in controls than in patients.

For within-group analyses, controls showed significant differences for time points 2 vs. 4 and 2 vs. 5 (ps<.05) for fear, where activation increased over the time course. Patients, however, showed a significant decrease in activation over the time course, with lower activation at time point 8 than at time point 2 (p=0.049). For happiness, comparisons between time point 2 and all other time points were significant (ps<0.05), except for the earliest comparison in patients (time point 2 vs. 3) and the last two in controls (time points 2 vs. 7 and 2 vs. 8). All comparisons for happiness showed increasing activation over the time course.

Relationship between activations of visual cortex and amygdala

For comparison between activations of the two brain areas, BOLD signal changes were averaged across hemispheres and intensity levels (with the 0% level first subtracted from each of the other levels). During fear perception, the activations of visual cortex and amygdala were correlated in both controls (r=0.64, p<0.01) and patients (r=0.69, p<0.001). During happiness perception, activations of visual cortex and amygdala were correlated in controls (r=0.48, p<0.05). In patients, this correlation was moderate but did not reach statistical significance (r=0.31, p>0.05). These results indicate functional associations between activations of the two brain regions (Figure 2c).

To further illustrate the functional association, the correlations between visual cortex and amygdala were examined at each point of the BOLD time course, averaged across hemispheres. We used data from the 100% emotion intensity level (with 0% subtracted from it). In patients, significant correlations between fear-induced activations in visual cortex and amygdala occurred early (time points 2 [r=0.74, p<0.0001] & 3 [r=0.67, p<0.0001]). In controls, however, significant correlations occurred later (time points 4 [r=0.44, p<0.05] & 6 [r=0.42, p<0.05]). For happiness-induced activations, the correlations were marginal for controls but significant later for patients (time points 5 [r=0.43, p<0.05] & 6 [r=0.61, p<0.001]).
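The time-point-wise analysis above amounts to correlating, across subjects, the emotion-induced responses of the two regions at each time point. A minimal sketch, assuming the data are arranged as subjects × time points (the function name is illustrative):

```python
import numpy as np
from scipy.stats import pearsonr

def timepoint_correlations(vis, amy):
    """Correlate visual-cortex and amygdala responses across subjects,
    separately at each time point.

    vis, amy: arrays of shape (n_subjects, n_timepoints) holding
    emotion-induced BOLD responses (e.g., 100% minus 0% intensity).
    Returns a list of (r, p) pairs, one per time point.
    """
    return [pearsonr(vis[:, t], amy[:, t]) for t in range(vis.shape[1])]
```

Inspecting at which time points r first reaches significance in each group is what reveals the earlier visual-amygdala coupling in patients during fear perception.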

Relationship between cortical activations and behavioral responses

In patients, the activations (averaged across emotion intensity levels) of visual cortex during fear perception were significantly correlated with perceptual threshold (r=−.43, p<0.05). In controls, no significant correlation was found.

In patients, activations of visual cortex and amygdala were not correlated with the PANSS scores. Activation of visual cortex during perception of fearful facial expression was correlated with the index of antipsychotic drug dosage (CPZ) (r=0.51, p<0.05).

Discussion

This study found that during facial emotion perception (fear and happiness), emotion-induced activations of patients were reduced in visual cortex as well as in amygdala. In patients, the activations of visual cortex and amygdala were correlated earlier in the time course during fear perception whereas in controls, the correlation occurred later in the time course. These results indicate that visual processing of facial affective information is dampened in schizophrenia. The visual and the limbic systems of patients appear to be associated prematurely for facial fear perception.

Visual processing of affective facial information

Recently, it has been established that visual processing and affective processing do not operate in isolation but mutually influence one another (Vuilleumier, Richardson et al. 2004; Damaraju, Huang et al. 2009; McBain, Norton et al. 2012). In the present study, the examination of the activation difference between face images with emotive versus neutral expressions demonstrated that in controls, visual cortex was responsive to both affective signals and visual signals from face images (Figure 3). In patients, however, this visual processing of affective facial signals was dampened: their emotion-induced activations were reduced in both the visual cortex and amygdala (Figure 3). This result indicates that schizophrenia patients have deficient processing of facial emotion not only in affective domains, as shown earlier, but also in visual domains. Considering the importance of perceiving faces and emotional expressions in one's environment (Javitt and Freedman 2015), this visual processing factor must be considered in understanding impaired facial emotion perception in schizophrenia and in developing therapeutic interventions.

Comparing visual processing of different emotional expressions is complicated by the varying reliance of different emotion types on particular facial features. For example, perception of fear relies on the eyes, while happiness depends more on the mouth (Smith, Cottrell et al. 2005; Lee, Gosselin et al. 2011) or on the change in distance between eyes and mouth (Maher, Ekstrom et al. 2014). One may ask whether the differential cortical responses to fear and happiness in patients here reflect the processing of visual signals from different facial regions, rather than emotion per se. This interesting question is best answered through empirical studies of perception of different facial regions with and without emotive expressions, which have been done in healthy populations (Leppanen, Kauppinen et al. 2007) but only sparsely in schizophrenia (Lee, Gosselin et al. 2011).

Affective processing of facial emotions

Although abnormal affective processing in the limbic system, including the amygdala, has been widely reported in schizophrenia, one issue remains unsettled: the diverse pattern of amygdala activations shown in patients. In the present study, reduced activations of the amygdala in patients were found after facial emotion signals were perceptually equated (Figure 3). The perceptual equating eliminated potential misjudgment or bias of facial emotions, a likely reason that previous studies found diverse results of amygdala activations, and ensured unambiguous facial emotion perception by patients. With the visual capacity factor eliminated, the resulting reduction of amygdala activations provides unequivocal evidence for dampened affective processing of facial emotion in schizophrenia.

Functional associations between visual cortex and amygdala

Visual cortex and amygdala are connected at neuroanatomical and functional levels (Amaral, Behniea et al. 2003; Freese and Amaral 2006; Vuilleumier and Pourtois 2007). Activations of these two brain regions are presumably associated for the processing of facial emotion information. The present study supports this presumption by showing correlated activations between the two brain regions during perception of facial emotions in controls (Figure 2c). In patients, this correlation was found during perception of fearful facial expressions and was somewhat weakened during perception of happy facial expressions, suggesting that the functional association of the two brain regions is largely preserved.

However, in schizophrenia, the earlier-than-normal association of activation between the visual and affective systems during fear perception suggests premature interaction of visual and affective processing. As precise visual and emotion perception is supported by interactions of these cortical systems (Vuilleumier and Pourtois 2007), the premature association between visual cortex and amygdala in patients may degrade the quality of perception and interpretation of facial emotions, ultimately compromising social interactions. For example, such "jumping-the-gun" cortical interactions may prime patients to perceive negative emotions from faces even when the emotions are physically absent (Kohler, Turner et al. 2003).

This result is also consistent with previous behavioral studies in which visual modulation of facial emotion images produced significant effects on facial emotion perception in patients (McBain, Norton et al. 2010; Lee, Gosselin et al. 2011). Together, these results suggest that the visual and affective systems in schizophrenia are dampened on their own and are functionally associated prematurely.

Relationships between brain activations and behavioral responses

In patients, activations of visual cortex during perception of fearful facial expressions (100%) were marginally correlated with perceptual thresholds (r=−0.41, p=0.06). This modest inverse correlation suggests that the less the visual cortex is activated, the poorer the performance (i.e., the higher the threshold) on perception of fearful facial expressions. In controls, no correlations were significant.

Limitations

The present study has a few limitations. First, only two types of facial emotion were evaluated. It remains to be seen whether deficient visual processing of affective facial information in schizophrenia holds for other types of facial emotion. Second, only two major regions of the visual and limbic systems were evaluated. Visual and affective processing may involve other brain regions, especially given the possibility that reduced activations in visual cortex and amygdala in patients may be compensated for by activations elsewhere in the brain. Third, the demonstrated relationship between visual cortex and amygdala was characterized by correlation data, because fMRI in this study was acquired using a task activation paradigm that highlights activations in each ROI. To illustrate and compare connectivity between ROIs in patients and controls, resting-state fMRI may yield additional insights about the functional association of visual and affective processing of facial emotion in this psychiatric disorder. Fourth, the correlation between activation of visual cortex and the dose of antipsychotic drug is difficult to interpret: no viable interpretation is available to explain why the correlation appeared in visual cortex (but not in amygdala) and for perception of fearful (but not happy) facial expressions. Note that CPZ, while reflecting drug treatment, may also be confounded by illness severity. This correlation should be replicated in future studies. Finally, the use of a neutral baseline here comes with the caveat that increased amygdala activation to neutral-expression faces has been found in schizophrenia (Hall, Whalley et al. 2008; Habel, Chechko et al. 2010). Indeed, our data showed that the raw activations for neutral faces were greater for patients in both emotions and hemispheres (ts(44)>2.24, ps<.03), although no group differences for neutral faces were found in the visual cortex.
Compared to the raw baseline results, the relative (contrast-based) activations are the more interpretable measure. As has been discussed previously (Stark and Squire 2001), fMRI activations are only valid as a relative contrast between cognitive tasks. Our finding of reduced relative activation of affective vs. non-affective processing in schizophrenia is thus informative but should be viewed with potentially altered non-affective processing in mind.

Conclusion

Using a psychophysically guided fMRI paradigm, this study found that visual processing of affective facial information is deficient in schizophrenia. This finding extends previous work on deficient affective processing of facial emotion in this psychiatric disorder and demonstrates a premature functional association between the visual cortex and amygdala during perception of fearful facial expressions. These results suggest that therapeutic interventions targeting this socially important function in schizophrenia should address both visual and affective domains.

Supplementary Material

Acknowledgments

This study was supported in part by a grant (R01 MH 096793) from the NIH. We thank Dr. Ongur for supervision of clinical assessment and Drs. Frederick, Nickerson and Tong for help with MRI acquisition.

Footnotes

Supplement

Pre-processing steps included motion correction (Smith, Jenkinson et al. 2004), spatial smoothing with a Gaussian kernel of full-width-at-half-maximum 5.0 mm, and high-pass temporal filtering (high-pass filter cutoff: 100 sec). After non-brain tissue was removed from T1-weighted structural scans using BET (Smith 2002), functional scans were registered to structural images using FLIRT (Jenkinson, Bannister et al. 2002). Structural images were registered to the MNI standard space with FNIRT (Smith, Jenkinson et al. 2004), after which functional images were registered to MNI standard space using these two transformations.

References

  1. Adolphs R, Damasio H, et al. Cortical systems for the recognition of emotion in facial expressions. J Neurosci. 1996;16(23):7678–87. doi: 10.1523/JNEUROSCI.16-23-07678.1996.
  2. Amaral DG, Behniea H, et al. Topographic organization of projections from the amygdala to the visual cortex in the macaque monkey. Neuroscience. 2003;118(4):1099–120. doi: 10.1016/s0306-4522(02)01001-1.
  3. Braus DF, Weber-Fahr W, et al. Sensory information processing in neuroleptic-naive first-episode schizophrenic patients. Arch Gen Psychiatry. 2002;59:696–701. doi: 10.1001/archpsyc.59.8.696.
  4. Butler PD, Abeles IY, et al. Sensory contributions to impaired emotion processing in schizophrenia. Schizophr Bull. 2009;35(6):1095–107. doi: 10.1093/schbul/sbp109.
  5. Chen Y, Bidwell LC, et al. Visual motion integration in schizophrenia patients, their first-degree relatives, and patients with bipolar disorder. Schizophr Res. 2005;74(2–3):271–81. doi: 10.1016/j.schres.2004.04.002.
  6. Chen Y, Grossman ED, et al. Differential activation patterns of occipital and prefrontal cortices during motion processing: evidence from normal and schizophrenic brains. Cogn Affect Behav Neurosci. 2008;8(3):293–303. doi: 10.3758/cabn.8.3.293.
  7. Dale AM, Fischl B, et al. Cortical surface-based analysis. I. Segmentation and surface reconstruction. Neuroimage. 1999;9:179–94. doi: 10.1006/nimg.1998.0395.
  8. Damaraju E, Huang YM, et al. Affective learning enhances activity and functional connectivity in early visual cortex. Neuropsychologia. 2009;47(12):2480–7. doi: 10.1016/j.neuropsychologia.2009.04.023.
  9. Desikan RS, Segonne F, et al. An automated labeling system for subdividing the human cerebral cortex on MRI scans into gyral based regions of interest. Neuroimage. 2006;31(3):968–80. doi: 10.1016/j.neuroimage.2006.01.021.
  10. Feinberg DA, Moeller S, et al. Multiplexed echo planar imaging for sub-second whole brain FMRI and fast diffusion imaging. PLoS One. 2010;5(12):e15710. doi: 10.1371/journal.pone.0015710.
  11. First MB, Spitzer RL, et al. Structured Clinical Interview for DSM-IV-TR Axis I Disorders, Non-patient Edition (SCID-I/NP, 11/2002 revision). New York, NY: Biometric Research Department, New York State Psychiatric Institute; 2002.
  12. First MB, Spitzer RL, et al. Structured Clinical Interview for DSM-IV Disorders (SCID). Washington, DC: American Psychiatric Press; 1994.
  13. Ford JM, Palzes VA, et al. Visual hallucinations are associated with hyperconnectivity between the amygdala and visual cortex in people with a diagnosis of schizophrenia. Schizophr Bull. 2014;41(1):223–32. doi: 10.1093/schbul/sbu031.
  14. Freese JL, Amaral DG. Synaptic organization of projections from the amygdala to visual cortical areas TE and V1 in the macaque monkey. J Comp Neurol. 2006;496(5):655–67. doi: 10.1002/cne.20945.
  15. Gur RC, Schroeder L, et al. Brain activation during facial emotion processing. Neuroimage. 2002;16(3 Pt 1):651–62. doi: 10.1006/nimg.2002.1097.
  16. Gur RE, Loughead J, et al. Limbic activation associated with misidentification of fearful faces and flat affect in schizophrenia. Arch Gen Psychiatry. 2007;64(12):1356–66. doi: 10.1001/archpsyc.64.12.1356.
  17. Gur RE, McGrath C, et al. An fMRI study of facial emotion processing in patients with schizophrenia. Am J Psychiatry. 2002;159(12):1992–9. doi: 10.1176/appi.ajp.159.12.1992.
  18. Habel U, Chechko N, et al. Neural correlates of emotion recognition in schizophrenia. Schizophr Res. 2010;122(1–3):113–23. doi: 10.1016/j.schres.2010.06.009.
  19. Hall J, Whalley HC, et al. Overactivation of fear systems to neutral faces in schizophrenia. Biol Psychiatry. 2008;64(1):70–3. doi: 10.1016/j.biopsych.2007.12.014.
  20. Heimberg C, Gur RE, et al. Facial emotion discrimination: III. Behavioral findings in schizophrenia. Psychiatry Res. 1992;42:253–65. doi: 10.1016/0165-1781(92)90117-l.
  21. Holt DJ, Kunkel L, et al. Increased medial temporal lobe activation during the passive viewing of emotional and neutral facial expressions in schizophrenia. Schizophr Res. 2006;82(2–3):153–62. doi: 10.1016/j.schres.2005.09.021.
  22. Javitt DC, Freedman R. Sensory processing dysfunction in the personal experience and neuronal machinery of schizophrenia. Am J Psychiatry. 2015;172(1):17–31. doi: 10.1176/appi.ajp.2014.13121691.
  23. Jenkinson M, Bannister P, et al. Improved optimization for the robust and accurate linear registration and motion correction of brain images. Neuroimage. 2002;17(2):825–41. doi: 10.1016/s1053-8119(02)91132-8.
  24. Kohler CG, Turner TH, et al. Facial emotion recognition in schizophrenia: intensity effects and error pattern. Am J Psychiatry. 2003;160:1768–74. doi: 10.1176/appi.ajp.160.10.1768.
  25. Kohler CG, Walker JB, et al. Facial emotion perception in schizophrenia: a meta-analytic review. Schizophr Bull. 2010;36(5):1009–19. doi: 10.1093/schbul/sbn192.
  26. Kosaka H, Omori M, et al. Differential amygdala response during facial recognition in patients with schizophrenia: an fMRI study. Schizophr Res. 2002;57(1):87–95. doi: 10.1016/s0920-9964(01)00324-3.
  27. Lee J, Gosselin F, et al. How do schizophrenia patients use visual information to decode facial emotion? Schizophr Bull. 2011;37(5):1001–8. doi: 10.1093/schbul/sbq006.
  28. Leppanen JM, Kauppinen P, et al. Differential electrocortical responses to increasing intensities of fearful and happy emotional expressions. Brain Res. 2007;1166:103–9. doi: 10.1016/j.brainres.2007.06.060.
  29. Maher S, Ekstrom T, et al. Greater perceptual sensitivity to happy facial expression. Perception. 2014;43:1353–64. doi: 10.1068/p7806.
  30. Martinez A, Hillyard SA, et al. Magnocellular pathway impairment in schizophrenia: evidence from functional magnetic resonance imaging. J Neurosci. 2008;28(30):7492–500. doi: 10.1523/JNEUROSCI.1852-08.2008.
  31. McBain R, Norton D, et al. Differential roles of low and high spatial frequency content in abnormal facial emotion perception in schizophrenia. Schizophr Res. 2010;122(1–3):151–5. doi: 10.1016/j.schres.2010.03.034.
  32. McBain R, Norton D, et al. Differential spatial frequency modulations on facial emotion detection and perception. Open Behav Sci J. 2012;6:1–7.
  33. Norton D, McBain R, et al. Association of impaired facial affect recognition with basic facial and visual processing deficits in schizophrenia. Biol Psychiatry. 2009;65(12):1094–8. doi: 10.1016/j.biopsych.2009.01.026.
  34. Phillips ML, Drevets WC, et al. Neurobiology of emotion perception II: Implications for major psychiatric disorders. Biol Psychiatry. 2003;54(5):515–28. doi: 10.1016/s0006-3223(03)00171-9.
  35. Phillips ML, Williams L, et al. A differential neural response to threatening and nonthreatening negative facial expressions in paranoid and non-paranoid schizophrenics. Psychiatry Res. 1999;92(1):11–31. doi: 10.1016/s0925-4927(99)00031-1.
  36. Schneider F, Weiss U, et al. Differential amygdala activation in schizophrenia during sadness. Schizophr Res. 1998;34(3):133–42. doi: 10.1016/s0920-9964(98)00085-1.
  37. Smith ML, Cottrell GW, et al. Transmitting and decoding facial expressions. Psychol Sci. 2005;16(3):184–9. doi: 10.1111/j.0956-7976.2005.00801.x.
  38. Smith SM. Fast robust automated brain extraction. Hum Brain Mapp. 2002;17(3):143–55. doi: 10.1002/hbm.10062.
  39. Smith SM, Jenkinson M, et al. Advances in functional and structural MR image analysis and implementation as FSL. Neuroimage. 2004;23(Suppl 1):S208–19. doi: 10.1016/j.neuroimage.2004.07.051.
  40. Stark CE, Squire LR. When zero is not zero: the problem of ambiguous baseline conditions in fMRI. Proc Natl Acad Sci U S A. 2001;98(22):12760–6. doi: 10.1073/pnas.221462998.
  41. Tong Y, Frederick BD. Studying the spatial distribution of physiological effects on BOLD signals using ultrafast fMRI. Front Hum Neurosci. 2014;8:196. doi: 10.3389/fnhum.2014.00196.
  42. Vuilleumier P, Armony JL, et al. Distinct spatial frequency sensitivities for processing faces and emotional expressions. Nat Neurosci. 2003;6(6):624–31. doi: 10.1038/nn1057.
  43. Vuilleumier P, Pourtois G. Distributed and interactive brain mechanisms during emotion face perception: evidence from functional neuroimaging. Neuropsychologia. 2007;45(1):174–94. doi: 10.1016/j.neuropsychologia.2006.06.003.
  44. Vuilleumier P, Richardson MP, et al. Distant influences of amygdala lesion on visual cortical activation during emotional face processing. Nat Neurosci. 2004;7(11):1271–8. doi: 10.1038/nn1341.
  45. Walker E, McGuire M, et al. Recognition and identification of facial stimuli by schizophrenics and patients with affective disorders. Br J Clin Psychol. 1984;23:37–44. doi: 10.1111/j.2044-8260.1984.tb00624.x.
  46. Williams LM, Das P, et al. Dysregulation of arousal and amygdala-prefrontal systems in paranoid schizophrenia. Am J Psychiatry. 2004;161(3):480–9. doi: 10.1176/appi.ajp.161.3.480.
  47. Williams LM, Phillips ML, et al. Arousal dissociates amygdala and hippocampal fear responses: evidence from simultaneous fMRI and skin conductance recording. Neuroimage. 2001;14(5):1070–9. doi: 10.1006/nimg.2001.0904.
  48. Woods SW. Chlorpromazine equivalent doses for the newer atypical antipsychotics. J Clin Psychiatry. 2003;64(6):663–7. doi: 10.4088/jcp.v64n0607.
