Social Cognitive and Affective Neuroscience
. 2014 Feb 24;9(12):1897–1903. doi: 10.1093/scan/nst188

Three stages of emotional word processing: an ERP study with rapid serial visual presentation

Dandan Zhang 1,*, Weiqi He 2,*, Ting Wang 3, Wenbo Luo 2,3, Xiangru Zhu 4, Ruolei Gu 5, Hong Li 3, Yue-jia Luo 1
PMCID: PMC4249467  PMID: 24526185

Abstract

Rapid responses to emotional words play a crucial role in social communication. This study employed event-related potentials (ERPs) to examine the time course of neural dynamics involved in emotional word processing. Participants performed a dual-target task in which positive, negative and neutral adjectives were rapidly presented. The early occipital P1 was larger in response to negative words, indicating that the first stage of emotional word processing mainly differentiates between non-threatening and potentially threatening information. The N170 and the early posterior negativity were larger for positive and negative words, reflecting the emotional/non-emotional discrimination stage of word processing. The late positive component not only distinguished emotional words from neutral words, but also differentiated between positive and negative words, representing the third stage of emotional word processing, emotion separation. The present results indicated that, similar to the three-stage model of facial expression processing, the neural processing of emotional words can also be divided into three stages. These findings suggest that the nature of emotion can be analyzed by the brain independent of stimulus type, and that the three-stage scheme may be a common model for emotional information processing in the context of limited attentional resources.

Keywords: emotional word, three stages, rapid serial visual presentation, event-related potential

INTRODUCTION

Rapid responses to emotional stimuli play a crucial role in human survival. To date, most research on human emotional processing has presented photographic renderings of emotionally evocative objects, such as facial expressions and emotional pictures, to experimental subjects. As a highly symbolic species, however, humans can use much more abstract signals, namely language, to communicate with each other about themselves and their environment. Thus, in contrast to facial expressions and emotional pictures, which are direct biological cues, words can represent emotions on a symbolic level. Recent research suggests that the emotional processes elicited by different stimuli, such as facial expressions, pictures and words, may be similar (Sprengelmeyer and Jentzsch, 2006; Kissler et al., 2007; Schacht and Sommer, 2009a). Luo et al. (2010) proposed a three-stage model of facial expression processing. The present study aimed to determine whether the processing of emotional words also involves three stages.

Numerous studies using the event-related potential (ERP) technique in human subjects have separately investigated the time courses with which various types of emotional stimuli are processed. Several studies have indicated that the presentation of emotional faces and words can affect the early C1 (Rellecke et al., 2011) and the P1 and N1 components (Batty and Taylor, 2003; Eger et al., 2003; Hofmann et al., 2009; Kissler and Herbert, 2013). For example, van Hooff et al. (2008) observed larger P1 amplitudes in response to negative words than to neutral words in an emotional Stroop task. Additionally, the face-sensitive N170 component, measured over the occipito-temporal regions, can be modulated by facial expression, with larger amplitudes elicited in response to fearful and happy, compared with neutral, expressions (Williams et al., 2006).

In addition, many ERP studies have shown that an early posterior negativity (EPN) can be modulated by affective pictures and facial expressions (Junghofer et al., 2004; Schacht and Sommer, 2009a; Frühholz et al., 2011). For instance, Schupp et al. (2003) found that the EPN, which peaked over occipito-temporal sites between 250 and 350 ms, was more prominent in response to emotional than to neutral stimuli. Similarly, emotional word studies have observed a more negative-going EPN component for emotional words than for neutral words between 200 and 300 ms (Kissler et al., 2007; Herbert et al., 2008; Kissler et al., 2009; Scott et al., 2009) or between 300 and 400 ms (Schacht and Sommer, 2009a,b; Palazova et al., 2011), often appearing in the occipito-temporal region and temporally after the N170 component (Scott et al., 2009; Frühholz et al., 2011; Rellecke et al., 2011).

The amplitude of the late positive component (LPC, also termed P3, P3b or late positive potential) has been shown to increase at centro-parietal electrode sites in participants viewing emotional stimuli, such as images (Cuthbert et al., 1999; Schupp et al., 2004a), faces (Schupp et al., 2004b; Schutter et al., 2004) and words (Fischler and Bradley, 2006; Herbert et al., 2006; Frühholz et al., 2011). The LPC is the most consistently reported late effect of emotional word categories on the ERP (Kissler et al., 2006, 2009). This component reflects a more elaborate processing stage and may be associated with task demands, such as attentional capture, evaluation and memory encoding (Herbert et al., 2008; Palazova et al., 2011; Bayer et al., 2012). Previous studies have found that the amplitude of the LPC differs in response to positive vs negative words (Bernat et al., 2001; Palazova et al., 2011; Bayer et al., 2012), indicating that it can discriminate emotional valence.

As the natural covariation between arousal and valence ratings of emotional words takes a U-shape (Lang et al., 1998), most previous studies have used positive and negative words with arousal ratings that are similar to one another but higher than those of neutral words (e.g. Kissler et al., 2006, 2009). For instance, Kissler and Herbert (2013) recently found that (i) the amplitude of the N1 component was larger for negative than for neutral words, (ii) the EPN component was more negative in response to positive and negative words compared with neutral words and (iii) the LPC was more positive for positive than for negative words. Although previous investigations have been in-depth and comprehensive, we believe that the observed emotional effect may result from variations in valence (positive vs negative) and/or arousal (relaxing vs arousing). In general, the arousal dimension of emotional stimuli can produce robust ERP changes; pure valence effects have been less studied and reliable valence effects have been reported less often than arousal effects (Conroy and Polich, 2007; Bayer et al., 2012). The pure influence of emotional valence on different stages of visual word processing remains unclear. The current study employed emotional words with different valence ratings (negative, neutral and positive) but the same arousal rating to investigate the possible effects of emotional valence on ERP data.

The rapid serial visual presentation (RSVP) task is sufficiently sensitive to detect the time course of emotion processing in the brain in the context of limited attention resources. Using an RSVP paradigm, Keil and colleagues (2004, 2006) found that accuracy rates were higher and ERP amplitudes were larger for high-arousal pleasant and unpleasant words than for low-arousal neutral words. We previously used the RSVP task to investigate the time course of facial expression processing and proposed a three-stage model (Luo et al., 2010). The N1 and P1 components may reflect the first stage, during which negative facial expressions are distinguished from others. During the second stage, the vertex positive potential (VPP) and N170 reflect discrimination between emotional and neutral facial stimuli. Finally, in the third stage, emotional facial expressions are differentiated as positive or negative. The N3 and P3 components likely reflect further evaluation of the affective valence of stimuli during this stage. In the present study, we hypothesized that facial expressions and words would elicit similar emotional processing, as the emotional valence, rather than type, of stimulus may be the major determinant of the brain’s processing pattern. The potential similarity of different types of emotional stimulus being processed by the brain prompted us to investigate whether three stages of emotional word processing could be distinguished using the RSVP paradigm.

METHODS

Subjects

Twenty undergraduates (10 females) aged 18–24 years (mean age = 20.8 years) participated in this study. All were right-handed and had normal vision (with or without correction). Informed consent was obtained prior to the experiment according to procedures approved by the local ethics committee.

Stimuli

Considering that adjectives usually describe characteristics, states or traits and may be related more directly to emotions than are nouns and verbs (Palazova et al., 2011), emotional adjectives were used in this study to investigate the pure effects of emotional valence on ERP components.

The stimuli consisted of 18 Chinese adjectives, 12 Chinese pseudowords and four strings of four repeated digits (i.e. 1111, 2222, 5555 and 6666; see the Supplementary Material for more details). The meaningless pseudowords were presented upside-down during the experiment to avoid a 'floor effect', as it was difficult to detect upright emotional adjectives among a series of rapidly presented upright pseudowords. Three types of adjectives (six positive, six negative and six neutral) were selected from the Chinese Affective Words System (CAWS)1 (Wang et al., 2008; Zhang et al., 2010) with similar occurrence frequencies [F(2, 15) < 1; positive = 44.83 ± 12.78 (M ± SD), neutral = 39.67 ± 9.16, negative = 38.33 ± 11.91 times per million]. There was no significant difference in the number of strokes across the three emotional conditions [F(2, 15) < 1; positive = 17.67 ± 4.59, neutral = 16.00 ± 3.29, negative = 16.33 ± 2.66], while the conditions differed significantly in valence [F(2, 15) = 378, P < 0.001, ηp² = 0.981; positive = 6.75 ± 0.18, neutral = 4.83 ± 0.34, negative = 2.87 ± 0.17]. To prevent our results from being contaminated by arousal differences across the three emotional conditions, the arousal level of the selected words was controlled [F(2, 15) < 1; positive = 4.82 ± 0.32, neutral = 4.88 ± 0.41, negative = 4.84 ± 0.14]. The characters were presented in Song Ti font, size 48. All stimuli (including two-character words and repeated digits) were presented in white on a black background with the same contrast and brightness. All stimuli were adjusted to the same size (142 × 74 pixels), subtending a visual angle of 2.4° × 4.5°. The screen resolution was 72 pixels per inch. Subjects were seated in a sound-proof room with their eyes approximately 90 cm from a 17-inch screen. All stimuli were displayed in the center of the screen.
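The matching checks above are one-way ANOVAs across the three six-word sets. A minimal sketch of such a check, here for the arousal ratings, is shown below; the individual ratings are hypothetical illustrations (only the group means and SDs are reported above), and the actual CAWS values are not reproduced here.

```python
# Sketch: verifying that three word sets are matched on a control variable
# (e.g. arousal) with a one-way ANOVA. The ratings below are hypothetical,
# chosen only to illustrate the F(2, 15) test used in the Methods.
from scipy import stats

positive_arousal = [4.5, 5.1, 4.6, 4.9, 5.0, 4.8]   # hypothetical ratings
neutral_arousal  = [4.4, 5.3, 4.7, 5.2, 4.6, 5.1]   # hypothetical ratings
negative_arousal = [4.7, 4.9, 4.8, 5.0, 4.7, 4.9]   # hypothetical ratings

f, p = stats.f_oneway(positive_arousal, neutral_arousal, negative_arousal)
print(f"F(2, 15) = {f:.2f}, p = {p:.3f}")  # a non-significant p indicates matched sets
```

With six items per condition the degrees of freedom are 2 and 15, matching the F(2, 15) statistics reported above.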

Procedure

Consistent with our previous research (Luo et al., 2010), the present study used the dual-target RSVP task (Raymond et al., 1992). The experimental procedure was programmed with E-Prime 1.2 (Psychology Software Tools, Inc., Pittsburgh, PA). As shown in Figure 1, each trial started with a 500-ms white fixation cross, followed by a 300-ms blue fixation cross. Then a stimulus sequence containing 12 distracting stimuli and two target stimuli was presented (100 ms duration per item). The first target (T1) was one of the four upright digit strings, each appearing with equal probability. The second target (T2) was one of the three categories of upright adjectives, presented equiprobably across emotional conditions. T1 appeared randomly and equiprobably at the fourth, fifth, sixth or seventh position in the stimulus series, while T2 appeared in the third position after T1 (SOA = 300 ms). To remove the superposed electrical activity elicited by the preceding and following distractors and to obtain the ERPs elicited purely by T2, a baseline condition was designed with a blank black screen at T2 (Vogel et al., 1998; Sergent et al., 2005). The four conditions, defined by T2 (neutral word, positive word, negative word and blank screen), were presented in random order during the experiment. After each trial, subjects were required to answer two questions as accurately as possible regarding the parity of T1 (pressing Key '1' if it was odd and Key '2' if it was even) and the emotional category of T2, using a response box operated with the right index finger (pressing Key '1' if T2 was positive, Key '2' if neutral, Key '3' if negative, and Key '4' if they did not see T2). The two questions were presented serially in a fixed order at the end of each trial (Figure 1). Each question disappeared when the subject responded (i.e. responses had no time limit).
The next trial began after a 600-ms period during which the screen remained black and blank. No feedback was presented throughout the experiment. The formal test was divided into four blocks of 96 trials each.
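The stimulus stream described above can be sketched as follows. This is a simplified illustration of the sequence structure only (fixation crosses and response screens are omitted), and the item names are placeholders rather than the actual Chinese stimuli.

```python
# Sketch of one RSVP stimulus stream: 14 items (12 distractors + T1 + T2)
# at 100 ms each, with T1 at the 4th-7th position and T2 three positions
# after T1 (SOA = 300 ms). Item names are placeholders.
import random

def build_trial(pseudowords, digit_targets, t2_item):
    """Return a list of (stimulus, duration_ms) pairs for one trial."""
    t1_pos = random.choice([3, 4, 5, 6])           # 0-based: 4th-7th item
    t2_pos = t1_pos + 3                            # third position after T1
    stream = [random.choice(pseudowords) for _ in range(14)]
    stream[t1_pos] = random.choice(digit_targets)  # T1: upright digit string
    stream[t2_pos] = t2_item                       # T2: adjective (or blank)
    return [(item, 100) for item in stream]        # 100 ms per item

trial = build_trial([f"pw{i}" for i in range(12)],
                    ["1111", "2222", "5555", "6666"],
                    "positive_word_1")             # hypothetical T2 label
```

In the baseline condition, `t2_item` would simply be a blank black screen rather than an adjective.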

Fig. 1.


The RSVP paradigm used in this experiment. All the stimuli were in Chinese (see the Supplementary Material for more details). (A) Each trial contained twelve pseudo-words (pw), two target stimuli (T1 and T2) and two questions (Q1 and Q2). The time interval between the T1 and T2 was 200 ms. (B) The two questions in each trial.

Electrophysiological recording and analysis

Brain electrical activity was recorded with reference to the left mastoid and re-referenced off-line to the average reference, using a 64-channel amplifier with electrodes placed according to the standard 10–20 system (Brain Products, Gilching, Germany). The horizontal electrooculogram (EOG) was recorded from two electrodes at the outer canthi of the two eyes. The vertical EOG was recorded from electrodes placed in the infra-orbital and supra-orbital regions of the right eye. Electrode impedance was maintained below 5 kΩ. Electroencephalogram (EEG) signals were continuously sampled at 500 Hz and filtered within 0.01–100 Hz. The EEG data were corrected for eye movements using the method proposed by Gratton and Coles (1989), as implemented in the Brain Vision Analyzer software (Version 2.0; Brain Products, Gilching, Germany). Trials contaminated with large artifacts (peak-to-peak deflections exceeding ±80 µV) were excluded from averaging (approximately 16 trials per individual were rejected).
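The amplitude-based artifact rejection described above can be sketched as follows. This is a minimal illustration on simulated data, assuming one common reading of the ±80 µV criterion (any sample exceeding ±80 µV in any channel rejects the epoch); the actual rejection was performed in the analysis software.

```python
# Sketch of amplitude-based artifact rejection: epochs in which any channel
# exceeds +/-80 uV are dropped. The epochs array (n_epochs x n_channels x
# n_samples) is simulated EEG noise, not real data.
import numpy as np

rng = np.random.default_rng(0)
epochs = rng.normal(0.0, 10.0, size=(100, 64, 700))   # simulated EEG, in uV
epochs[5, 3, :] += 200.0                              # inject one large artifact

keep = np.abs(epochs).max(axis=(1, 2)) < 80.0         # True if epoch is clean
clean = epochs[keep]                                  # epoch 5 is rejected
```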

The EEG data were segmented relative to T2 onset, beginning 200 ms prior to stimulus onset and lasting for 1200 ms. Trials were accepted only if the responses for both T1 and T2 were correct. Stimulus-locked average ERPs in the positive, neutral and negative conditions were computed separately for each participant as the difference between the emotional and baseline conditions (i.e. the average ERP in the baseline condition was subtracted from the average ERP in each emotional word condition). This study analyzed the P1, N170, EPN and LPC components across different sets of electrodes according to grand-mean ERP topographies (Williams et al., 2006; Righart and de Gelder, 2007; Luo et al., 2010). Time windows for mean amplitude calculation were centered at the peak latencies of the components in the grand-mean waveforms, with shorter windows for early components and longer windows for late components. The mean amplitude of the P1 component was calculated at Pz, P3, P4, POz, PO3 and PO4 (time window: 165–195 ms). The mean amplitudes of the N170 and EPN components were analyzed at P7, P8, PO7 and PO8 (time windows: 250–290 ms for the N170 and 300–400 ms for the EPN). Nine electrode sites (FCz, FC3, FC4, Cz, C3, C4, CPz, CP3 and CP4) were selected for statistical analysis of the LPC mean amplitude (time window: 480–530 ms).
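The averaging scheme above (emotional-condition ERP minus baseline-condition ERP, then a mean amplitude in a component window) can be sketched on simulated single-electrode data. The sampling rate follows the Methods (500 Hz); the trial counts and data are simulated, not real.

```python
# Sketch of T2-locked ERP computation: average over trials in an emotional
# condition, subtract the average of the blank-screen baseline condition,
# then take the mean amplitude in a component window (here the LPC,
# 480-530 ms). One electrode, simulated data.
import numpy as np

fs = 500                                       # sampling rate, Hz
times = (np.arange(600) - 100) / fs            # 600 samples: -200 to 998 ms

rng = np.random.default_rng(1)
emo_trials = rng.normal(0, 5, (40, len(times)))    # simulated emotional trials
base_trials = rng.normal(0, 5, (40, len(times)))   # simulated baseline trials

erp = emo_trials.mean(axis=0) - base_trials.mean(axis=0)   # emotional - baseline

win = (times >= 0.480) & (times <= 0.530)      # LPC window mask
lpc_mean = erp[win].mean()                     # mean amplitude in the window
```

In the actual analysis this quantity would be computed per participant and electrode, and then averaged over the electrode set listed above.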

Three-way repeated-measures analyses of variance (ANOVAs) on the mean amplitudes of the P1, N170, EPN and LPC components were conducted with emotion (three levels: positive, neutral and negative), hemisphere (two levels for the N170 and EPN: right and left; three levels for the P1 and LPC: right, midline and left) and electrode site (see the previous paragraph for the electrode sites of the different components) as within-subjects factors. Post hoc testing of significant main effects used the Bonferroni method. P-values were adjusted using the Greenhouse–Geisser correction.
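The Greenhouse–Geisser correction adjusts repeated-measures degrees of freedom by an epsilon estimated from the covariance of the within-subject conditions. A minimal sketch of that epsilon, on a simulated subjects × conditions matrix, is shown below; it is illustrative only and not the pipeline actually used here.

```python
# Sketch of the Greenhouse-Geisser epsilon: computed from the double-centered
# covariance matrix of the within-subject conditions. Degrees of freedom are
# multiplied by epsilon before looking up the corrected p-value.
import numpy as np

def gg_epsilon(X):
    """Greenhouse-Geisser epsilon for an n_subjects x k_conditions matrix."""
    k = X.shape[1]
    S = np.cov(X, rowvar=False)             # covariance across conditions
    C = np.eye(k) - np.ones((k, k)) / k     # centering matrix
    Sc = C @ S @ C                          # double-centered covariance
    return np.trace(Sc) ** 2 / ((k - 1) * np.trace(Sc @ Sc))

rng = np.random.default_rng(2)
data = rng.normal(0, 1, (20, 3))            # 20 subjects x 3 emotion levels (simulated)
eps = gg_epsilon(data)                      # bounded: 1/(k-1) <= eps <= 1
```

Epsilon equals 1 under perfect sphericity and approaches 1/(k − 1) as sphericity is maximally violated; with k = 3 emotion levels the corrected degrees of freedom are 2ε and 38ε.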

RESULTS

Behavioral data

The ANOVA on accuracy showed a significant main effect of emotional type [F(2,38) = 9.44, P = 0.003, ηp² = 0.332]. Pairwise comparisons showed that accuracy was higher for positive words (M ± SD, 87.8 ± 10.4%, P = 0.003) and negative words (86.6 ± 11.8%, P = 0.034) than for neutral words (77.2 ± 15.0%). There was no significant difference between positive and negative words (P > 0.9).

ERP data

P1

The main effect of hemisphere and the interaction between word emotional type and hemisphere on P1 amplitudes were significant [F(2,38) = 10.84, P < 0.001, ηp² = 0.363; F(4,76) = 3.52, P = 0.011, ηp² = 0.156]. P1 amplitudes were larger over the left hemisphere (M ± SE, 2.73 ± 0.42 µV, P < 0.001) and the right hemisphere (2.25 ± 0.41 µV, P = 0.007) than over the midline (0.89 ± 0.30 µV). Simple effect analysis showed that negative words (3.10 ± 0.48 µV) elicited larger P1 amplitudes than did positive (2.55 ± 0.42 µV, P = 0.018) and neutral words (2.53 ± 0.40 µV, P = 0.009) over the left hemisphere [F(2,18) = 4.50, P = 0.026, ηp² = 0.333]. However, there was no significant emotional effect over the midline or the right hemisphere [F(2,18) = 1.55, P = 0.239, ηp² = 0.147; F(2,18) < 1] (Figure 2).

Fig. 2.


Grand average ERPs of the P1 and LPC components at the indicated electrode sites.

N170

The main effect of hemisphere on N170 amplitudes was significant [F(1,19) = 6.84, P = 0.017, ηp² = 0.265], as was the interaction between word emotional type and hemisphere [F(2,38) = 13.27, P < 0.001, ηp² = 0.411] (Figure 3). Simple effect analysis showed that negative (−6.75 ± 0.64 µV, P = 0.001) and positive words (−6.37 ± 0.68 µV, P = 0.005) elicited larger N170 amplitudes than did neutral words (−5.39 ± 0.68 µV) over the left hemisphere [F(2,18) = 7.65, P = 0.004, ηp² = 0.459]. However, over the right hemisphere there was no significant difference between negative (−4.48 ± 0.66 µV) and neutral words (−4.68 ± 0.50 µV, P = 0.555), or between positive (−4.37 ± 0.56 µV, P = 0.223) and neutral words [F(2,18) < 1].

Fig. 3.


Grand average ERPs of the N170 and EPN components at the indicated electrode sites.

EPN

The main effects of word emotional type and hemisphere on EPN amplitudes were significant [F(2,38) = 14.73, P < 0.001, ηp² = 0.437; F(1,19) = 6.57, P = 0.019, ηp² = 0.257]. Positive (−0.52 ± 0.36 µV, P < 0.001) and negative words (−0.32 ± 0.33 µV, P < 0.001) elicited a more negative-going EPN than did neutral words (0.45 ± 0.34 µV), while there was no significant difference between the two emotional conditions (P = 1.000) (Figure 3). The interactions between word emotional type and hemisphere, and between electrode and hemisphere, were significant [F(2,38) = 5.10, P = 0.011, ηp² = 0.212; F(1,19) = 14.75, P = 0.001, ηp² = 0.437]. Simple effect analysis showed that negative (−0.85 ± 0.38 µV, P < 0.001) and positive words (−1.13 ± 0.41 µV, P < 0.001) elicited a more negative-going EPN than did neutral words (0.32 ± 0.37 µV) over the left hemisphere [F(2, 18) = 31.42, P < 0.001, ηp² = 0.777]. However, over the right hemisphere there was no significant difference between negative (0.21 ± 0.39 µV) and neutral words (0.58 ± 0.40 µV, P = 0.163), or between positive (0.09 ± 0.41 µV, P = 0.058) and neutral words [F(2,18) = 1.94, P = 0.173, ηp² = 0.177].

LPC

The LPC amplitudes showed significant main effects of word emotional type, hemisphere and electrode site [F(2,38) = 15.51, P < 0.001, ηp² = 0.449; F(2,38) = 4.00, P = 0.026, ηp² = 0.174; F(2,38) = 6.98, P = 0.007, ηp² = 0.269]. Pairwise comparisons showed that positive words (1.62 ± 0.22 µV) elicited larger LPC amplitudes than did negative (1.30 ± 0.19 µV; P = 0.045) and neutral words (0.91 ± 0.22 µV; P < 0.001), and negative words elicited larger LPC amplitudes than did neutral words (P = 0.029) (Figure 2). LPC amplitudes were larger over the midline (1.76 ± 0.31 µV) than over the right hemisphere (0.77 ± 0.23 µV, P = 0.030). There was no significant difference between the right and left (1.30 ± 0.29 µV, P = 0.630) hemispheres, or between the left hemisphere and the midline (P = 0.369). The scalp topographies of the grand-mean ERPs for the three emotional conditions are shown in Figure 4.

Fig. 4.


Grand average ERP topographies of the P1, N170, EPN and LPC components across three emotional conditions.

DISCUSSION

In the present study, ERPs were measured while participants performed a dual-target RSVP task in order to examine the time course of emotional word processing. Our behavioral data showed that accuracy in response to positive and negative words was higher than that in response to neutral words. This is in agreement with the idea that emotional stimuli are prioritized in visual processing because they are highly relevant for biological adaptation. At the same time, our ERP data showed that the emotional valence of the stimuli modulated multiple ERP components. First, negative words elicited a larger P1 component than did positive and neutral words. Second, both positive and negative words elicited larger N170/EPN components than did neutral words. Finally, positive words elicited larger LPC amplitudes than did negative and neutral words, while negative words elicited larger LPC amplitudes than did neutral words.

We observed that negative words elicited larger P1 amplitudes over the parieto-occipital areas than did positive and neutral words. Considering that the P1 is generally presumed to reflect early attention allocation in the extrastriate cortex (Hillyard et al., 1998), the present results indicated the presence of a ‘negativity bias’ (defined as an early attentional bias towards negative stimuli) (Cacioppo et al., 1999; Crawford and Cacioppo, 2002). The P1 component may reflect the first stage of emotional word processing, which mainly differentiates potentially threatening stimuli from non-threatening stimuli. Enhanced P1 amplitudes have also been reported in response to fearful expressions (Batty and Taylor, 2003; Pourtois et al., 2005) and negative words (Li et al., 2007; van Hooff et al., 2008). This enhancement of P1 in response to negative words may be attributed to the fast information processing of negative stimuli occurring via both subcortical and cortical pathways (Morris et al., 1999; Pessoa and Adolphs, 2010).

The present study showed that positive and negative words elicited larger N170 amplitudes and more negative-going EPN component than did neutral words, while the difference between positive and negative words was not significant within this time window. Thus, we suggest that the N170 and the EPN reflect emotional word processing in the second stage. Previous studies (Bentin et al., 1996; Rossion et al., 2000; Rebai et al., 2001) have demonstrated larger N170 amplitudes for faces as compared to objects; this phenomenon was usually interpreted as a reflection of face-specific encoding. Other reports have indicated that visual presentation of words also elicited the N170 component (Rossion et al., 2003; Simon et al., 2007; Frühholz et al., 2011). In this study, we found that emotional words elicited larger N170 amplitudes than did neutral words, which is consistent with our previous study of emotional face processing (Luo et al., 2010). Therefore, our data indicated that the N170 component mainly differentiates emotional from neutral words and does not differentiate between positive and negative emotions when the arousal values of the stimuli are matched.

Besides the N170, this study also observed that the EPN successfully differentiated between neutral and emotional adjectives at the second stage of emotional word processing, which is consistent with previous studies (Kissler et al., 2007; Herbert et al., 2008; Kissler et al., 2009; Scott et al., 2009). A recent study (Bayer et al., 2012) showed that EPN effects are modulated by stimulus-related arousal rather than emotional valence, which seems to contradict the current findings. This discrepancy may be due to the different emotional words (nouns in Bayer et al. vs adjectives in this study) and different task requirements between the studies. Bayer et al. (2012) indicated that the EPN modulations in their study were limited to the lexical decision task, whereas our subjects were required to discriminate emotional valence. The notion that emotional modulation of the EPN component is sensitive to task requirements is also supported by Hinojosa et al. (2010), who found that emotional modulations were absent in low-level perceptual tasks. To sum up, the two components (N170 and EPN) in the time interval of the second stage (250–400 ms) revealed that positive and negative adjectives elicited stronger neural activations than neutral adjectives in an emotional discrimination task, and that this effect was independent of the arousal level of the words.

In a previous study (Luo et al., 2010), we found that the brain distinguishes between positive and negative emotions in the third stage of facial expression processing. The present study consistently found significant differences in the amplitude of the LPC induced by positive, neutral and negative words, suggesting that the LPC reflects the third stage of emotional word processing. Fischler and Bradley (2006) reviewed a series of studies involving the processing of words and simple phrases. They concluded that fronto-central positivity 300–600 ms after stimulus onset was enhanced in response to pleasant and unpleasant stimuli compared with neutral stimuli. In agreement, others have suggested that enhanced LPC amplitudes reflect the elaborate processing of high-arousal stimuli (Cuthbert et al., 1999). However, the LPC effect was evident even after controlling for the arousal levels of different stimulus types in the present study. Our data suggest that the amplitude of the LPC reflects differences not only in arousal, but also in valence among emotional words. These results are consistent with previous studies showing emotional effects on LPC amplitudes at a constant arousal level (Delplanque et al., 2004; Conroy and Polich, 2007). In addition, we observed that the largest LPC amplitude was induced by positive words, in agreement with previous studies (Schapkin et al., 2000; Herbert et al., 2006, 2008; Kissler et al., 2009). Whereas early stimulus registration usually prioritizes unpleasant stimuli (i.e. the negativity bias), pleasant words may enhance the elaborative processes of attentional capture, evaluation, categorical decision making and/or memory encoding at a later word processing stage (Herbert et al., 2006). This later effect of 'positivity offset' results in larger LPC amplitudes in response to positive words (Ferré, 2003; Kissler et al., 2006).

As expected, the ERP dynamics observed in the present study differed from those documented in the previous facial expression study (Luo et al., 2010). First, we previously found a negativity bias of the anterior N1 component (∼100 ms), whereas the present study found no difference in N1 between emotional conditions. Because facial expressions are direct biological cues, the processing of fearful facial expressions is likely to be more rapid than that of negative words. Second, our previous study found that fearful faces elicited larger P3 amplitudes than did happy faces. However, the present study found that positive words elicited larger LPC amplitudes than did negative words, in agreement with several studies documenting larger LPC amplitudes evoked by pleasant words than by unpleasant words (Schapkin et al., 2000; Herbert et al., 2006, 2008; Kissler et al., 2009). Third, our data indicated that while the N170 component in the right hemisphere appeared to respond most to facial expressions, the processing of emotional words does not have this right-hemisphere advantage. Previous studies have shown that the visual presentation of words can elicit the N170 (Rossion et al., 2003; Simon et al., 2007; Mercure et al., 2008) and EPN (Frühholz et al., 2011) components in regions lateralized more to the left hemisphere. The left-hemisphere advantage of the N170 and EPN components observed in our study may reflect enhanced activity of the visual word-form area, which is particularly responsive to visually presented words and is located in the left fusiform gyrus (McCandliss et al., 2003).

Readers may notice an atypical use of component terms in the present study. For instance, the typical peak latencies for the P1 and N170 components are 100–130 ms and 170 ms (Luck, 2005). However, we used the terms P1 and N170 to name the two ERP components with peak latencies of ∼180 and 260 ms (Figures 2 and 3). We made these designations mainly because the scalp topographies of the two components were consistent with expectations for parieto-occipital P1 and occipito-temporal N170. The latency delays observed for P1, N170 and EPN in the current study are likely attributable to the RSVP paradigm employed in this experiment, which is a relatively difficult task requiring participants to discriminate the parity of a digit string at T1 and then to identify the emotional category of an adjective 200 ms later at T2 in a rapidly presented stimulus stream (see also Kessler et al., 2005; Luo et al., 2013 for a similar ERP latency delay using a dual-target RSVP paradigm).

Our results provide new evidence supporting the hypothesis that a common model may explain the processing of emotional information of various types in the context of limited attentional resources. However, current accumulated knowledge remains insufficient to conclude that the brain can analyze the nature of emotion independent of stimulus type. We used only six adjectives per emotional condition in this study, because the Chinese Affective Words System contains a total of 1500 two-character words, with only 500 each of adjectives, nouns and verbs. To investigate the pure effect of emotional valence, we used only words with significantly different valence but similar arousal, occurrence frequency and stroke number; few words met these criteria. Therefore, further work is needed to verify the three-stage model using (i) a larger sample of words; (ii) other word categories, such as verbs and nouns (e.g. Kissler and Herbert, 2013) and (iii) other stimulus types, such as emotional images and body postures.

CONCLUSION

In the current study, adjectives were used to investigate the pure effects of emotional valence on ERP components during visual word processing. The data demonstrated that, similar to facial expression processing, emotional word processing occurs in three stages (P1, N170/EPN and LPC) and is modulated independently by the emotional valence of stimuli.

SUPPLEMENTARY DATA

Supplementary data are available at SCAN online.

Conflict of Interest

None declared.

Supplementary Material

Supplementary Data

Acknowledgments

The work was supported by the National Natural Science Foundation of China (31371033, 31300867, 31170984 and 91132704), the Special Public-welfare Project of the Ministry of Health (201002003), and the 973 Program (2011CB711000, 2014CB744600).

Footnotes

1The CAWS contains two-character words selected from the Modern Chinese Dictionary of Commonly Used Words. All 1500 two-character words were evaluated in terms of valence, arousal and dominance by 124 participants on a 9-point scale.

REFERENCES

  1. Batty M, Taylor MJ. Early processing of the six basic facial emotional expressions. Cognitive Brain Research. 2003;17:613–20. doi: 10.1016/s0926-6410(03)00174-5.
  2. Bayer M, Sommer W, Schacht A. P1 and beyond: functional separation of multiple emotion effects in word recognition. Psychophysiology. 2012;49:959–69. doi: 10.1111/j.1469-8986.2012.01381.x.
  3. Bentin S, Allison T, Puce A, Perez E, McCarthy G. Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience. 1996;8:551–65. doi: 10.1162/jocn.1996.8.6.551.
  4. Bernat E, Bunce S, Shevrin H. Event-related brain potentials differentiate positive and negative mood adjectives during both supraliminal and subliminal visual processing. International Journal of Psychophysiology. 2001;42:11–34. doi: 10.1016/s0167-8760(01)00133-7.
  5. Cacioppo JT, Gardner WL, Berntson GG. The affect system has parallel and integrative processing components: form follows function. Journal of Personality and Social Psychology. 1999;76:839–55.
  6. Conroy MA, Polich J. Affective valence and P300 when stimulus arousal level is controlled. Cognition & Emotion. 2007;21:891–901.
  7. Crawford LE, Cacioppo JT. Learning where to look for danger: integrating affective and spatial information. Psychological Science. 2002;13:449–53. doi: 10.1111/1467-9280.00479.
  8. Cuthbert BN, Schupp HT, Bradley MM, Birbaumer N, Lang PJ. Brain potentials in affective picture processing: covariation with autonomic arousal and affective report. Biological Psychology. 1999;52:95–112. doi: 10.1016/s0301-0511(99)00044-7.
  9. Delplanque S, Lavoie ME, Hot P, Silvert L, Sequeira H. Modulation of cognitive processing by emotional valence studied through event-related potentials in humans. Neuroscience Letters. 2004;356:1–4. doi: 10.1016/j.neulet.2003.10.014.
  10. Eger E, Jedynak A, Iwaki T, Skrandies W. Rapid extraction of emotional expression: evidence from evoked potential fields during brief presentation of face stimuli. Neuropsychologia. 2003;41:808–17. doi: 10.1016/s0028-3932(02)00287-7.
  11. Ferré P. Effects of level of processing on memory for affectively valenced words. Cognition & Emotion. 2003;17:859–80.
  12. Fischler I, Bradley M. Event-related potential studies of language and emotion: words, phrases, and task effects. Progress in Brain Research. 2006;156:185–203. doi: 10.1016/S0079-6123(06)56009-1.
  13. Frühholz S, Jellinghaus A, Herrmann M. Time course of implicit processing and explicit processing of emotional faces and emotional words. Biological Psychology. 2011;87:265–74. doi: 10.1016/j.biopsycho.2011.03.008.
  14. Gratton G, Coles MGH. Generalization and evaluation of eye movement correction procedures. Journal of Psychophysiology. 1989;3:14–16.
  15. Herbert C, Junghofer M, Kissler J. Event related potentials to emotional adjectives during reading. Psychophysiology. 2008;45:487–98. doi: 10.1111/j.1469-8986.2007.00638.x.
  16. Herbert C, Kissler J, Junghofer M, Peyk P, Rockstroh B. Processing of emotional adjectives: evidence from startle EMG and ERPs. Psychophysiology. 2006;43:197–206. doi: 10.1111/j.1469-8986.2006.00385.x.
  17. Hillyard SA, Vogel EK, Luck SJ. Sensory gain control (amplification) as a mechanism of selective attention: electrophysiological and neuroimaging evidence. Philosophical Transactions of the Royal Society of London. Series B: Biological Sciences. 1998;353:1257–70. doi: 10.1098/rstb.1998.0281.
  18. Hinojosa JA, Mendez-Bertolo C, Pozo MA. Looking at emotional words is not the same as reading emotional words: behavioral and neural correlates. Psychophysiology. 2010;47:748–57. doi: 10.1111/j.1469-8986.2010.00982.x.
  19. Hofmann MJ, Kuchinke L, Tamm S, Vo MLH, Jacobs AM. Affective processing within 1/10th of a second: high arousal is necessary for early facilitative processing of negative but not positive words. Cognitive Affective & Behavioral Neuroscience. 2009;9:389–97. doi: 10.3758/9.4.389.
  20. Junghofer M, Peyk P, Kissler J, Herbert C, Flaisch T, Stockburger J, Schupp H. Rapid serial visual presentation in studies on early motivated attention: an overview. Psychophysiology. 2004;41:S17.
  21. Keil A, Ihssen N. Identification facilitation for emotionally arousing verbs during the attentional blink. Emotion. 2004;4:23–35. doi: 10.1037/1528-3542.4.1.23.
  22. Keil A, Ihssen N, Heim S. Early cortical facilitation for emotionally arousing targets during the attentional blink. BMC Biology. 2006;4:23. doi: 10.1186/1741-7007-4-23.
  23. Kessler K, Schmitz F, Gross J, Hommel B, Shapiro K, Schnitzler A. Target consolidation under high temporal processing demands as revealed by MEG. NeuroImage. 2005;26:1030–41. doi: 10.1016/j.neuroimage.2005.02.020.
  24. Kissler J, Assadollahi R, Herbert C. Emotional and semantic networks in visual word processing: insights from ERP studies. Progress in Brain Research. 2006;156:147–83. doi: 10.1016/S0079-6123(06)56008-X.
  25. Kissler J, Herbert C. Emotion, etmnooi, or emitoon? Faster lexical access to emotional than to neutral words during reading. Biological Psychology. 2013;92:464–79. doi: 10.1016/j.biopsycho.2012.09.004.
  26. Kissler J, Herbert C, Peyk P, Junghofer M. Buzzwords: early cortical responses to emotional words during reading. Psychological Science. 2007;18:475–80. doi: 10.1111/j.1467-9280.2007.01924.x.
  27. Kissler J, Herbert C, Winkler I, Junghofer M. Emotion and attention in visual word processing: an ERP study. Biological Psychology. 2009;80:75–83. doi: 10.1016/j.biopsycho.2008.03.004.
  28. Lang PJ, Bradley MM, Cuthbert BN. Emotion, motivation, and anxiety: brain mechanisms and psychophysiology. Biological Psychiatry. 1998;44:1248–63. doi: 10.1016/s0006-3223(98)00275-3.
  29. Li W, Zinbarg RE, Paller KA. Trait anxiety modulates supraliminal and subliminal threat: brain potential evidence for early and late processing influences. Cognitive Affective & Behavioral Neuroscience. 2007;7:25–36. doi: 10.3758/cabn.7.1.25.
  30. Luck SJ. An Introduction to the Event-related Potential Technique. London, UK: The MIT Press; 2005.
  31. Luo WB, Feng WF, He WQ, Wang NY, Luo YJ. Three stages of facial expression processing: ERP study with rapid serial visual presentation. NeuroImage. 2010;49:1857–67. doi: 10.1016/j.neuroimage.2009.09.018.
  32. Luo WB, He WQ, Feng WF, et al. Electrophysiological evidence of facial inversion with rapid serial visual presentation. Biological Psychology. 2013;92:395–402. doi: 10.1016/j.biopsycho.2012.11.019.
  33. McCandliss BD, Cohen L, Dehaene S. The visual word form area: expertise for reading in the fusiform gyrus. Trends in Cognitive Sciences. 2003;7:293–9. doi: 10.1016/s1364-6613(03)00134-7.
  34. Mercure E, Dick F, Halit H, Kaufman J, Johnson MH. Differential lateralization for words and faces: category or psychophysics? Journal of Cognitive Neuroscience. 2008;20:2070–87. doi: 10.1162/jocn.2008.20137.
  35. Morris JS, Öhman A, Dolan RJ. A subcortical pathway to the right amygdala mediating "unseen" fear. Proceedings of the National Academy of Sciences of the United States of America. 1999;96:1680–5. doi: 10.1073/pnas.96.4.1680.
  36. Palazova M, Mantwill K, Sommer W, Schacht A. Are effects of emotion in single words non-lexical? Evidence from event-related brain potentials. Neuropsychologia. 2011;49:2766–75. doi: 10.1016/j.neuropsychologia.2011.06.005.
  37. Pessoa L, Adolphs R. Emotion processing and the amygdala: from a 'low road' to 'many roads' of evaluating biological significance. Nature Reviews Neuroscience. 2010;11:773–83. doi: 10.1038/nrn2920.
  38. Pourtois G, Thut G, de Peralta RG, Michel C, Vuilleumier P. Two electrophysiological stages of spatial orienting towards fearful faces: early temporo-parietal activation preceding gain control in extrastriate visual cortex. NeuroImage. 2005;26:149–63. doi: 10.1016/j.neuroimage.2005.01.015.
  39. Raymond JE, Shapiro KL, Arnell KM. Temporary suppression of visual processing in an RSVP task: an attentional blink? Journal of Experimental Psychology: Human Perception and Performance. 1992;18:849–60. doi: 10.1037//0096-1523.18.3.849.
  40. Rebai M, Poiroux S, Bernard C, Lalonde R. Event-related potentials for category-specific information during passive viewing of faces and objects. International Journal of Neuroscience. 2001;106:209–26. doi: 10.3109/00207450109149750.
  41. Rellecke J, Palazova M, Sommer W, Schacht A. On the automaticity of emotion processing in words and faces: event-related brain potentials evidence from a superficial task. Brain and Cognition. 2011;77:23–32. doi: 10.1016/j.bandc.2011.07.001.
  42. Righart R, de Gelder B. Impaired face and body perception in developmental prosopagnosia. Proceedings of the National Academy of Sciences of the United States of America. 2007;104:17234–8. doi: 10.1073/pnas.0707753104.
  43. Rossion B, Gauthier I, Tarr MJ, et al. The N170 occipito-temporal component is delayed and enhanced to inverted faces but not to inverted objects: an electrophysiological account of face-specific processes in the human brain. Neuroreport. 2000;11:69–74. doi: 10.1097/00001756-200001170-00014.
  44. Rossion B, Joyce CA, Cottrell GW, Tarr MJ. Early lateralization and orientation tuning for face, word, and object processing in the visual cortex. NeuroImage. 2003;20:1609–24. doi: 10.1016/j.neuroimage.2003.07.010.
  45. Schacht A, Sommer W. Emotions in word and face processing: early and late cortical responses. Brain and Cognition. 2009a;69:538–50. doi: 10.1016/j.bandc.2008.11.005.
  46. Schacht A, Sommer W. Time course and task dependence of emotion effects in word processing. Cognitive Affective & Behavioral Neuroscience. 2009b;9:28–43. doi: 10.3758/CABN.9.1.28.
  47. Schapkin SA, Gusev AN, Kuhl J. Categorization of unilaterally presented emotional words: an ERP analysis. Acta Neurobiologiae Experimentalis. 2000;60:17–28. doi: 10.55782/ane-2000-1321.
  48. Schupp HT, Junghofer M, Weike AI, Hamm AO. Emotional facilitation of sensory processing in the visual cortex. Psychological Science. 2003;14:7–13. doi: 10.1111/1467-9280.01411.
  49. Schupp HT, Junghofer M, Weike AI, Hamm AO. The selective processing of briefly presented affective pictures: an ERP analysis. Psychophysiology. 2004a;41:441–9. doi: 10.1111/j.1469-8986.2004.00174.x.
  50. Schupp HT, Ohman A, Junghofer M, Weike AI, Stockburger J, Hamm AO. The facilitated processing of threatening faces: an ERP analysis. Emotion. 2004b;4:189–200. doi: 10.1037/1528-3542.4.2.189.
  51. Schutter DJLG, de Haan EHF, van Honk J. Functionally dissociated aspects in anterior and posterior electrocortical processing of facial threat. International Journal of Psychophysiology. 2004;53:29–36. doi: 10.1016/j.ijpsycho.2004.01.003.
  52. Scott GG, O'Donnell PJ, Leuthold H, Sereno SC. Early emotion word processing: evidence from event-related potentials. Biological Psychology. 2009;80:95–104. doi: 10.1016/j.biopsycho.2008.03.010.
  53. Sergent C, Baillet S, Dehaene S. Timing of the brain events underlying access to consciousness during the attentional blink. Nature Neuroscience. 2005;8:1391–400. doi: 10.1038/nn1549.
  54. Simon G, Petit L, Bernard C, Rebai M. N170 ERPs could represent a logographic processing strategy in visual word recognition. Behavioral and Brain Functions. 2007;3:21. doi: 10.1186/1744-9081-3-21.
  55. Sprengelmeyer R, Jentzsch I. Event related potentials and the perception of intensity in facial expressions. Neuropsychologia. 2006;44:2899–906. doi: 10.1016/j.neuropsychologia.2006.06.020.
  56. van Hooff JC, Dietz KC, Sharma D, Bowman H. Neural correlates of intrusion of emotion words in a modified Stroop task. International Journal of Psychophysiology. 2008;67:23–34. doi: 10.1016/j.ijpsycho.2007.09.002.
  57. Vogel EK, Luck SJ, Shapiro KL. Electrophysiological evidence for a postperceptual locus of suppression during the attentional blink. Journal of Experimental Psychology: Human Perception and Performance. 1998;24:1656–74. doi: 10.1037//0096-1523.24.6.1656.
  58. Wang YN, Zhou LM, Luo YJ. The pilot establishment and evaluation of Chinese affective words system. Chinese Mental Health Journal. 2008;22:608–12.
  59. Williams LM, Palmer D, Liddell BJ, Song L, Gordon E. The 'when' and 'where' of perceiving signals of threat versus non-threat. NeuroImage. 2006;31:458–67. doi: 10.1016/j.neuroimage.2005.12.009.
  60. Zhang Q, Li XH, Gold BT, Jiang Y. Neural correlates of cross-domain affective priming. Brain Research. 2010;1329:142–51. doi: 10.1016/j.brainres.2010.03.021.
