Abstract
The current study used event-related potentials (ERPs) to examine the ability of task demand to modulate the effect of reward association on the processing of emotional faces. In the learning phase, a high or low reward probability was paired with male or female facial photos of angry, happy, or neutral expressions. Then, in the test phase, task demand was manipulated by asking participants to discriminate the emotionality or the gender of the pre-learned face with no reward at stake. The ERP results in the test phase revealed that the fronto-central N1 (90–140 ms) and the VPP (160–210 ms) components were sensitive to the interaction between reward and emotion, in that the differences between the mean amplitudes for the high- and low-reward conditions were significantly larger in the neutral-face and angry-face conditions than in the happy-face condition. Moreover, reward association and task demand showed a significant interaction over the right hemisphere for the N170 component (140–180 ms), with the amplitude difference between the high- and low-reward conditions being larger in the emotion task than in the gender task. The later N2pc component exhibited an interaction between task demand and emotionality, in that happy faces elicited larger N2pc difference waves than angry and neutral faces did in the emotion task, whereas neutral faces elicited larger N2pc difference waves than angry faces did in the gender task. The N2pc effect aligned with behavioral performance. These results suggest that reward association acts as an ‘emotional tag’, imbuing neutral or angry faces with motivational significance at early time windows. Task demand functions in a top-down way to modulate the deployment of attentional resources at the later attentional-selection stage, but does not affect the early automatic processing of either emotion or reward association.
Keywords: Reward association, Emotion, Facial expression, Task demand, Event-related potential
Introduction
Recent studies have demonstrated that reward-associated stimuli have processing priority, such that they are more likely to capture attention as targets and are more difficult to suppress as distractors (Anderson 2013; Anderson and Folk 2010; Hickey et al. 2010a, b, 2011; Kiss et al. 2009; Krebs et al. 2010; Pollmann et al. 2016). This value-learning process can also affect the processing of stimuli that already enjoy high attentional salience, such as emotional faces (Chen and Wei 2019; Park et al. 2019; Yao et al. 2014; Yokoyama et al. 2015; Zhou et al. 2019).
Using the event-related potential (ERP) technique and the value-learning paradigm, recent studies have reported modulated ERP responses to emotional stimuli after pairing with high or low reward values. For example, a study by Yao and colleagues (2014) used schematic faces in three phases to investigate the effects of reward learning. The angry faces elicited shorter reaction times (RTs) and a larger N2pc (signaling stronger spatial selection; Luck and Hillyard 1994; Eimer 1996) in the baseline phase. However, after a learning phase in which happy faces were paired with high-reward values and angry faces were paired with low-reward values, the anger superiority effect disappeared in the last phase. The authors concluded that angry faces became less salient in capturing attention after being paired with low values. Moreover, in a study from our group (Chen and Wei 2019), participants chose between face pairs to establish reward associations with face identities of angry or happy expressions in the learning phase, then performed a singleton discrimination task to judge the emotion of the learned face singleton (angry or happy) with no reward at stake in the test phase. ERP results revealed significant modulation effects for the early N1 and P2 components, with smaller amplitude differences between angry and happy faces in the high-reward condition than in the low-reward condition. This reward-learning effect suggests that high reward modulated the initial visual encoding of emotional faces, especially angry ones.
These studies suggest that the reward-association process modifies the salience of emotional information in early perception and its ability to capture attention. However, it should be noted that these studies typically used emotion-relevant tasks in which participants directed their attention to the emotional features of the target in the test phase. Therefore, it is intriguing to ask whether reward learning can still modulate the superiority effect in emotional face processing if the task requirements are changed to emotion-irrelevant demands. It is well established that task demand functions in a top-down way by boosting the salience of task-relevant stimuli so they can compete for more attentional resources (Awh et al. 2012; Baluch and Itti 2011; Fecteau and Munoz 2006). If attentional resources are directed to the non-emotional dimensions of faces in the test phase (e.g., gender or physical facial features), the direct allocation of attentional resources to the emotional dimension should be reduced, so the reward-to-emotion modulation would likely be reduced or eliminated. Therefore, exploring how task demand affects the mechanisms by which reward learning modulates responses to emotional faces can help us further understand how the attentional system regulates the relationship between reward learning and emotional processing.
In the current study, we used a reward-association procedure in the learning phase similar to that of our previous study (Chen and Wei 2019), in which face identities of different emotions and genders were paired with high or low reward probabilities. In each trial of the learning phase, a pair of faces with the same gender and the same emotional expression was presented for participants to choose between. Choosing one of the faces led to a high probability of reward feedback, while choosing the other led to a low reward probability. This manipulation ensured that participants learned the relationship between reward probabilities and face identities, rather than a particular property of the face (such as emotionality or gender; Chen and Wei 2019; Raymond and O’Brien 2009). In the test phase, participants were then instructed either to discriminate the emotion of the target face (i.e., the emotion categorization task) or to discriminate the gender of the target face (i.e., the gender task) presented in the periphery. These two tasks were used to manipulate the top-down allocation of attentional resources to emotional information after reward associations were established, and thus to investigate the effect of task demand on how reward learning modulates the processing of emotional faces.
We recorded ERPs in the test phase, with interest in the emotion-sensitive components, N1, N170, and VPP, and the attentional capture component, N2pc. The fronto-central N1 (around 100 ms post-stimulus onset) is assumed to reflect the rapid discernment of emotional meaning from coarse visual cues and to be related to early attentional capture (Foti et al. 2009; Luo et al. 2010; Vuilleumier and Pourtois 2007; Zhang et al. 2013, 2017). Moreover, at around 200 ms, both the N170 and the vertex positive potential (VPP) are sensitive to the configural processing of faces (Eimer 2000; Rossion et al. 2003), but they might have independent neural generators (Bentin et al. 1996; Eimer 2000; George et al. 2005). Finally, we were also interested in a widely used marker of spatial selective attention – the N2pc (around 250 ms), which reflects attentional bias and the deployment of attentional resources to task-relevant, or salient but task-irrelevant, items in the periphery (Hickey et al. 2010a; Luck 2006; Luck and Hillyard 1994; Wei and Ji 2021; Yao et al. 2014).
According to previous studies (Nummenmaa and Calvo 2015; Park et al. 2019; Wei and Kang 2014), we expected: (1) faster RTs in the gender discrimination task than in the emotion categorization task; (2) faster RTs in the high-reward-association condition than in the low-reward-association condition; (3) faster RTs to happy faces than to angry and neutral faces; (4) because neither a behavioral effect of emotion in the learning phase nor an interaction of reward and emotion in the test phase was found in our previous studies using a similar reward-learning procedure (Chen and Wei 2019, 2023), we predicted the same behavioral pattern here. However, based on our previous results, we predicted an interaction between reward probability and emotion in the ERP results of the emotion categorization task, since this task requires the same level of attention to emotional information in the test phase as that in Chen and Wei (2019). Specifically, we hypothesized larger differences among ERP responses to the different emotional faces in the low-reward condition than in the high-reward condition, both in the earlier time windows (N1, N170, and VPP) and in the later attentional capture time window (N2pc), demonstrating a modulating effect of reward learning on emotional processing. For the gender discrimination task, based on recent observations that late but not early emotional processing is affected by task demand (Durston and Itier 2021; Hudson et al. 2021), we predicted that emotional and reward information might be processed automatically in the early time window, but that participants would then need to focus on emotion-irrelevant information (i.e., gender) to perform the task better in the late time window. Therefore, the reward modulation effect for emotional faces may still exist at the early stage of processing but may disappear at the late stage.
Methods
Participants
According to G*Power (Faul et al. 2009), a sample size of 22 participants was obtained by setting partial η2 to 0.3, α to 0.05, and power (1 − β) to 0.95. Considering potential drop-outs and previous related studies (e.g., Calvo and Beltrán 2014; Yao et al. 2014), twenty-four undergraduate and graduate students participated in this study. Data from two participants were excluded due to excessive EEG artifacts (fewer than 30 trials remained per condition after removing eyeblink or muscle artifacts; see EEG recordings and analyses). All remaining participants (13 females and 9 males, between 19 and 26 years of age) were right-handed. They all had normal or corrected-to-normal vision, with no neurological or neuropsychological disorders. This study was approved by the Ethics Committee of the Department of Psychology of Capital Normal University, and all participants gave their informed consent before the experiment, in accordance with the Declaration of Helsinki.
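As a rough check on the power analysis above, the assumed partial η2 can be converted to Cohen's f, the effect-size metric G*Power uses for F tests. This is a minimal sketch only; the repeated-measures calculation G*Power actually performs additionally depends on the assumed correlation among repeated measures, which is not shown here.

```python
import math

def cohens_f(partial_eta_sq: float) -> float:
    """Convert partial eta-squared to Cohen's f: f = sqrt(eta2 / (1 - eta2))."""
    return math.sqrt(partial_eta_sq / (1.0 - partial_eta_sq))

# the partial eta-squared of 0.3 assumed in the power analysis
f = cohens_f(0.3)
print(round(f, 3))  # → 0.655, a large effect by conventional benchmarks
```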
Design and materials
A 2 × 3 × 2 within-participant factorial design was used for this experiment, with the first factor being the reward probability (high vs. low), the second factor being the emotional valence of the target face (angry, happy, or neutral), and the third factor being the task demand (emotion task vs. gender task).
The experimental materials included 12 pictures from the Chinese Facial Affective Picture System (CFAPS) (Wang and Luo 2005), with the valence and arousal levels rated on a 9-point scale (‘1’ means negative valence or low arousal, whereas ‘9’ means positive valence or high arousal). There were 4 happy faces, 4 angry faces, and 4 neutral faces, with half female faces and half male faces in each emotional category. We controlled picture luminance using a preset luminance template in Photoshop. The arousal levels of happy and angry faces were matched (mean [M] ± standard deviation [SD]: happy = 6.80 ± 0.19; angry = 7.33 ± 0.54). Neutral faces differed significantly from happy and angry faces in arousal level (M ± SD: neutral = 4.42 ± 0.17). The valence ratings of the three categories all differed from one another (M ± SD: happy = 6.93 ± 0.32; neutral = 4.45 ± 0.21; angry = 2.57 ± 0.32).
For each target face, a corresponding scrambled image was created by slicing, randomly splicing, and low-pass filtering the original face picture. On each trial, the scrambled image was presented lateralized at the same time as its original target face (see Fig. 1).
Fig. 1.
Trial sequence. a An example of the trial sequence in the value-learning phase. The task in this phase was to choose the left or right face to maximize earnings on each trial. b Example of the trial sequence in the testing phase. Two kinds of tasks were used in this phase: during the emotion task, participants were instructed to discriminate the emotion of the target face (angry, happy, or neutral), and during the gender task they were instructed to discriminate the gender of the target face (female or male)
Procedures
Value learning phase
We used Presentation software (https://www.neurobs.com) to present the experimental materials and to record the behavioral responses. Participants were seated in front of a computer at a distance of 65 cm. The experimental cabin was sound-attenuated and dimly lit. A white fixation cross (0.4° × 0.4° visual angle) was displayed at the center of a black screen for 500 ms at the start of each trial (Fig. 1). Two faces of the same gender and the same emotionality (3.52° × 4.47° visual angle) were then presented, one in the left and one in the right periphery. The center of each face picture was 2.10° away from the central fixation. There were six pairs of faces: two pairs of angry faces, two pairs of happy faces, and two pairs of neutral faces, with one female pair and one male pair in each category. After the participant chose a face, a message of ‘+25’ or ‘+0’ was presented according to the reward probability of that face, along with the participant's accumulated reward points. For each pair of faces, choosing one face yielded the ‘+25’ feedback with a high probability (80%) and the ‘+0’ feedback with a low probability (20%); choosing the other face yielded the reverse pattern. The reward probabilities assigned to particular faces were counterbalanced between participants. Upon the presentation of the feedback, a high-pitch tone (1500 Hz) or a low-pitch tone (500 Hz) was also played for 500 ms (Chen and Wei 2019). The high-pitch tone (or low-pitch tone) accompanied the ‘+25’ (or ‘+0’) feedback for half of the participants, with the mapping reversed for the other half. Participants were asked to choose faces so as to maximize their reward points, without being informed about the reward probabilities.
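The 80/20 feedback rule described above can be sketched as a small simulation (an illustrative sketch, not the Presentation script used in the experiment; the function and variable names are hypothetical):

```python
import random

def reward_feedback(chose_optimal: bool, rng: random.Random) -> int:
    """Return the points shown as feedback on one trial.

    Choosing the 'optimal' face of a pair yields '+25' with probability
    0.80 and '+0' otherwise; choosing the other face reverses the odds.
    """
    p_reward = 0.8 if chose_optimal else 0.2
    return 25 if rng.random() < p_reward else 0

rng = random.Random(0)
points = sum(reward_feedback(True, rng) for _ in range(10_000))
# over many trials, the optimal face earns roughly 0.8 * 25 = 20 points per trial
```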
There were 10 blocks in total, with 60 trials in each block; each face pair was used ten times per block. After the experiment, participants were paid a base payment (40 Chinese Yuan) plus money converted from their experimental reward points (45 Chinese Yuan on average). They were informed before the experiment that the points they earned in the training phase would be converted into real money.
Test phase - emotion task/gender task
In this phase, participants performed the emotion or gender task with no reward at stake. At the start of each trial, a fixation cross (0.4° × 0.4°) appeared in the center of the screen for 800 ms. Then, the face and the corresponding scrambled image were presented together with the fixation cross for 150 ms. These images were the same size and were presented at the same locations as the images in the value-learning phase. Participants were instructed to discriminate the emotionality or the gender of the target face by pressing the corresponding keys on a computer keyboard. In the emotion task, the response keys were ‘D’ (or ‘F’) for happy faces, ‘F’ (or ‘D’) for angry faces, and ‘J’ for neutral faces; or ‘J’ (or ‘K’) for happy faces, ‘K’ (or ‘J’) for angry faces, and ‘F’ for neutral faces. In the gender task, the response keys were ‘F’ (or ‘J’) for male faces and ‘J’ (or ‘F’) for female faces. The response keys were counterbalanced across participants. Participants were instructed to respond as accurately and quickly as possible upon the presentation of the target. After target offset, the fixation cross remained on screen for 1500 ms; the screen then went blank for 1000 to 1500 ms as the inter-trial interval. The inter-trial interval was randomized to prevent the ERPs elicited in the next trial from being systematically influenced by the preceding ERPs (see Luck 2005). Participants were instructed to keep their eyes fixated on the center of the screen and to minimize eye movements and blinks throughout the experiment.
After the learning phase, half of the participants completed the emotion task first and then the gender task, while the other half did the reverse. Each task contained 384 trials, divided into eight blocks of 48 trials, with 8 trials per condition presented in a pseudorandomized order (each of the 12 faces was presented four times per block). Participants received two practice blocks of 12 trials before each task.
EEG recordings and analyses
Electroencephalographic (EEG) data were recorded in the test phase and were measured by 62 Ag/AgCl electrodes embedded in an elastic cap with the NeuroScan SynAmps system (NeuroScan Inc. Sterling, Virginia, USA), based on the international 10–20 system. The left mastoid served as the reference online. All channels were then re-referenced off-line to the averaged mastoids. We used two additional pairs of electrodes to monitor horizontal and vertical electrooculographic (EOG) recordings (VEOGs and HEOGs) of blinks and eye movements online. One pair of electrodes was placed above and below the left eye, and the other pair was placed 10 mm from the lateral canthi. EEG and EOG were amplified using a 0.01–100 Hz band pass and sampled at 500 Hz. Impedance was kept below 5 kΩ.
The EEG data were analyzed using the EEGLAB toolbox (Delorme and Makeig 2004). Data were band-pass filtered from 0.05 to 40 Hz offline. EOG blink and movement artifacts were corrected using an independent component analysis (ICA) algorithm (Delorme and Makeig 2004). Each epoch was 1000 ms long, including 100 ms recorded prior to stimulus onset. Baseline correction was performed for each epoch using the 100 ms pre-stimulus interval. Any epoch with an incorrect response to the target or deflections exceeding ±75 μV was excluded from the analysis. In total, fewer than 20% of trials (error trials and artifact trials) were excluded in each condition, leaving more than 50 valid trials per condition for each participant.
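Two of the preprocessing steps described above, baseline correction over the 100 ms pre-stimulus interval and rejection of epochs exceeding ±75 μV, can be sketched in NumPy. This is an illustrative re-implementation under assumed array layout, not the authors' EEGLAB pipeline:

```python
import numpy as np

def preprocess_epochs(epochs: np.ndarray, sfreq: float = 500.0,
                      baseline_ms: float = 100.0, thresh_uv: float = 75.0):
    """Baseline-correct and threshold-reject epoched EEG data.

    epochs: (n_epochs, n_channels, n_samples) array in microvolts, where
    the first `baseline_ms` of each epoch precedes stimulus onset.
    Returns the corrected surviving epochs and a boolean keep-mask.
    """
    n_base = int(baseline_ms / 1000.0 * sfreq)  # samples in the baseline
    # subtract each channel's mean over the pre-stimulus interval
    corrected = epochs - epochs[:, :, :n_base].mean(axis=2, keepdims=True)
    # keep only epochs whose absolute amplitude never exceeds the threshold
    keep = np.abs(corrected).max(axis=(1, 2)) <= thresh_uv
    return corrected[keep], keep
```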
We analyzed the N1, VPP, N170, and N2pc components based on the topographical distribution of the grand-averaged ERP activities and related ERP findings (e.g., Calvo and Beltrán 2014; Luo et al. 2010; Wei et al. 2014). The N1 typically peaks at fronto-central electrode sites (Calvo and Beltrán 2014; Calvo, Beltrán, et al. 2014a; Eimer and Holmes 2002, 2007), so we assessed this component at the average of the Cz and FCz electrodes between 90 and 140 ms. At the same electrodes, we also analyzed the VPP in the 160–210 ms time window. Average amplitudes for each condition in each time window were analyzed using analysis of variance (ANOVA) with three within-participant factors: task demand (emotion task vs. gender task), reward probability (high-reward vs. low-reward condition), and emotional valence (angry, happy, or neutral). In addition, we analyzed the N170 component (140–180 ms) at lateral temporo-occipital electrodes (P3, P5, PO5, P4, P6, and PO6). Average amplitudes from the left electrodes (P3, P5, and PO5) and the right electrodes (P4, P6, and PO6) were analyzed by ANOVAs with four within-participant factors: task demand (emotion task vs. gender task), reward probability (high-reward vs. low-reward condition), emotional valence (angry, happy, or neutral), and electrode topography (left vs. right).
Finally, we analyzed the PO7 and PO8 electrodes between 220 and 280 ms, indexing N2pc component (Eimer 1996; Holmes et al. 2009; Luck and Hillyard 1994). The ipsilateral (contralateral) waveform was measured as the average amplitudes of the left-side electrode to the left-side (right-side) targets and the average amplitudes of the right-side electrode to the right-side (left-side) targets. We then analyzed the data using ANOVAs with four within-participant factors: task demand (emotion task vs. gender task), reward probability (high-reward condition vs. low-reward condition), emotional faces (angry, happy, or neutral), and contralaterality (ipsilateral vs. contralateral to the target location).
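The contralateral/ipsilateral measurement described above can be sketched as follows (a hypothetical NumPy illustration of the N2pc logic; the variable names and array layout are assumptions, not the authors' code):

```python
import numpy as np

def n2pc_difference(left_elec: np.ndarray, right_elec: np.ndarray,
                    target_side: np.ndarray) -> np.ndarray:
    """Compute the trial-averaged contra-minus-ipsi difference wave.

    left_elec / right_elec: (n_trials, n_samples) waveforms at the
    left-side (e.g., PO7) and right-side (e.g., PO8) electrodes.
    target_side: array of 'L'/'R' marking the target's visual field.
    Contralateral = electrode opposite the target; ipsilateral = same side.
    """
    left_target = target_side == 'L'
    # contralateral: right electrode for left targets, left electrode for right targets
    contra = np.where(left_target[:, None], right_elec, left_elec)
    ipsi = np.where(left_target[:, None], left_elec, right_elec)
    # N2pc difference wave, averaged across trials
    return (contra - ipsi).mean(axis=0)
```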
For all ANOVAs, the significance level was set to α = 0.05; and where appropriate, either Bonferroni pairwise or simple main effects comparisons were conducted. We used the Greenhouse-Geisser correction for all effects with two or more degrees of freedom in the numerator. We reported all repeated measures ANOVA results with uncorrected degrees of freedom, but with corrected p values.
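For reference, the partial η2 values reported throughout the Results follow the standard definition, sketched here assuming the effect and error sums of squares come from the repeated-measures ANOVA:

```python
def partial_eta_squared(ss_effect: float, ss_error: float) -> float:
    """Partial eta-squared: the effect's sum of squares divided by the
    sum of the effect and its associated error term."""
    return ss_effect / (ss_effect + ss_error)
```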
Results
Behavioral results
Learning task
We calculated the percentage of trials on which the high-reward-associated face was chosen in each block as the value-learning index, following previous studies (Chen and Wei 2023; Raymond and O’Brien 2009; Yokoyama et al. 2015). The results revealed a gradual increase in this index (Fig. 2, left panel). An ANOVA was conducted on the choice probability, with block (first vs. last block) and facial emotional valence (angry, happy, vs. neutral) as within-participant factors. We found a significant main effect of block, F(1,42) = 242.98, p < 0.001, η2p = 0.920. Bonferroni-corrected pairwise comparisons revealed a higher probability of choosing the high-reward face in the last block (97.12%) than in the first block (59.96%). Neither the main effect of facial emotional valence nor the interaction reached significance, both ps > 0.05.
Fig. 2.
Average probability (left panel) and RTs (right panel) of choosing the optimal face across participants in each block. Error bars denote the standard error of the mean
We also analyzed the RTs of optimal choices with the same ANOVA; the results revealed a gradual acceleration of optimal choices (Fig. 2, right panel). We found a significant main effect of block, F(1,42) = 47.97, p < 0.001, η2p = 0.696. Bonferroni-corrected pairwise comparisons revealed faster high-reward face selection in the last block (1225 ms) than in the first block (2120 ms). Neither the main effect of facial emotional valence nor the interaction reached significance, both ps > 0.05.
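The value-learning index used above, the percentage of high-reward choices per block, can be computed as in this sketch (the variable names are hypothetical, chosen only for illustration):

```python
import numpy as np

def learning_index(chose_high: np.ndarray, block: np.ndarray,
                   n_blocks: int = 10) -> np.ndarray:
    """Percentage of trials per block on which the high-reward-associated
    face was chosen.

    chose_high: boolean array, one entry per trial.
    block: integer block label (0 .. n_blocks - 1) for each trial.
    """
    return np.array([100.0 * chose_high[block == b].mean()
                     for b in range(n_blocks)])
```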
Test phase
The mean RTs and response accuracy for each experimental condition are shown in Table 1. An ANOVA on RTs with task demand (emotion task vs. gender task), reward association (high vs. low probability), and the target’s emotional valence (angry, happy, or neutral) as within-participant factors revealed a significant main effect of task demand, F(1,21) = 30.588, p < 0.001, η2p = 0.593, with faster RTs in the gender task than in the emotion task (771 vs. 849 ms), as well as a main effect of target emotional valence, F(2,42) = 23.883, p < 0.001, η2p = 0.532. Bonferroni-corrected pairwise comparisons revealed that RTs in the happy-face condition (809 ms) and the neutral-face condition (820 ms) were faster than RTs in the angry-face condition (869 ms; both ps < 0.001), but there was no difference between RTs in the happy- and neutral-face conditions (p > 0.1). The main effect of reward probability was not significant, F(1,21) < 1. Moreover, the interaction between task demand and the target’s emotional valence was significant, F(2,42) = 8.419, p = 0.001, η2p = 0.286.
Table 1.
Mean RTs and accuracy rates for each experimental condition
|  |  | High-reward |  |  | Low-reward |  |  |
|---|---|---|---|---|---|---|---|
|  |  | Angry | Happy | Neutral | Angry | Happy | Neutral |
| Emotion task | RTs (SE) | 949 (35) | 847 (21) | 897 (25) | 937 (27) | 854 (21) | 883 (23) |
|  | Accuracy (SE) | 86.9 (12.6) | 96.0 (5.1) | 96.5 (3.1) | 90.6 (9.8) | 96.9 (2.3) | 97.3 (2.7) |
| Gender task | RTs (SE) | 792 (27) | 765 (24) | 747 (25) | 802 (28) | 771 (24) | 751 (27) |
|  | Accuracy (SE) | 89.4 (14.2) | 96.0 (4.4) | 97.7 (3.4) | 87.4 (17.3) | 93.9 (7.9) | 97.3 (2.9) |

Note: RTs are in ms; accuracy rates are percentages. SE = standard error of the mean
Further one-way ANOVAs were separately conducted for the emotion task and the gender task, with emotional valence as a within-participant factor. The results revealed that target emotional valence had a significant main effect in the emotion task, F(2,42) = 17.353, p < 0.001, η2p = 0.452. Bonferroni-corrected pairwise comparisons revealed that RTs in the happy-face condition (850 ms) and the neutral-face condition (890 ms) were faster than the RTs in the angry-face condition (942 ms; both ps < 0.05). In the gender task, the target’s emotional valence also had a significant main effect, F(2,42) = 14.961, p < 0.001, η2p = 0.416. Bonferroni-corrected pairwise comparisons revealed that RTs in the neutral-face condition (749 ms) were faster than RTs in the happy-face condition (768 ms) and the angry-face condition (797 ms; both ps < 0.01), and that RTs in the happy-face condition were also faster than RTs in the angry-face condition (p < 0.05). No other effects or interactions reached significance.
Analyses of the accuracy rates revealed a significant main effect of target emotional valence, F(2,42) = 24.565, p < 0.001, η2p = 0.539. Bonferroni-corrected pairwise comparisons revealed that the accuracy rates for the neutral-face (97.1%) were higher than those for the happy-face (95.7%) and the angry-face (88.8%; both ps < 0.001), and the accuracy rates for the happy-face were also higher than those for the angry-face (p < 0.01). No other effects reached significance.
ERP results
Analyses of the N1 component (Fig. 3) revealed a significant main effect of the target emotional valence, F(2,42) = 3.502, p = 0.039, η2p = 0.143. Bonferroni-corrected pairwise comparisons revealed that the mean amplitude in the angry-face condition (-4.377 µV) was significantly more negative than in the neutral-face condition (-4.039 µV, p = 0.033), but there was no difference between the happy-face condition (-3.942 µV) and either the angry-face or neutral-face conditions, both ps > 0.05. Moreover, we found a significant interaction between reward and emotionality, F(2,42) = 3.840, p = 0.029, η2p = 0.155. Bonferroni-corrected pairwise comparisons revealed that the difference between the mean amplitudes for the high- and low-reward probabilities was marginally significantly larger in the neutral-face condition (mean difference = 0.419 µV) than in the happy-face condition (mean difference = -0.325 µV), p = 0.056, but there was no difference between the angry-face condition (mean difference = 0.260 µV) and either the happy-face or neutral-face conditions, both ps > 0.1. No other effects or interactions reached significance.
Fig. 3.
Grand average waveforms at the Cz electrode, showing the potentials produced in response to the presentation of the target stimulus in the experiment. For all the waveforms, the high-reward association conditions are shown in red lines and the low-reward association conditions are shown in black lines. The topographies of the N1 and VPP are shown below each waveform
Moreover, analysis of the VPP component (Fig. 3) revealed a significant main effect of target emotional valence, F(2,42) = 4.489, p = 0.017, η2p = 0.176. Bonferroni-corrected pairwise comparisons revealed that the amplitudes in the angry-face condition (0.929 µV) were significantly less positive than those in the happy-face (1.514 µV, p = 0.027) or neutral-face (1.399 µV, p = 0.034) conditions. We also found a significant interaction between reward and emotionality, F(2,42) = 5.248, p = 0.009, η2p = 0.200. Bonferroni-corrected pairwise comparisons of the VPP component revealed that the difference between the mean amplitudes for the high- and low-reward probabilities was larger in the angry-face condition (0.668 µV) than in the happy-face (-0.414 µV) or the neutral-face conditions (-0.368 µV), both ps < 0.05, but there was no difference between the happy-face condition and the neutral-face condition, p > 0.1. No other effects or interactions reached significance.
Analysis of the N170 component (Fig. 4) revealed a significant main effect of reward, F(1,21) = 4.875, p = 0.038, η2p = 0.188, such that the amplitude in the high-reward-probability condition (-1.115 µV) was significantly less negative than that in the low-reward-probability condition (-1.312 µV). A significant main effect of emotional valence was also found, F(2,42) = 3.372, p = 0.033, η2p = 0.150. Bonferroni-corrected pairwise comparisons revealed that the mean amplitude in the angry-face condition (-1.377 µV) was significantly more negative than that in the happy-face condition (-1.108 µV, p = 0.007), but there was no difference between the neutral-face condition (-1.155 µV) and either the angry-face or happy-face conditions, both ps > 0.1. In addition, we found a significant interaction between task demand, reward, and electrode topography. Subsequent separate ANOVAs revealed that, for the left electrodes, there was a significant main effect of task demand, F(1,21) = 9.384, p = 0.006, η2p = 0.309, with a more negative amplitude in the emotion-task condition than in the gender-task condition (-0.943 vs. -0.484 µV), but neither the main effect of reward nor the interaction was significant, ps > 0.1. For the right electrodes, there was a significant main effect of reward, F(1,21) = 10.053, p < 0.01, η2p = 0.324, with a more negative amplitude in the low-reward condition than in the high-reward condition (-1.864 vs. -1.563 µV), but the main effect of task demand was not significant, F(1,21) < 1. Importantly, the interaction between task demand and reward was significant, F(1,21) = 7.340, p = 0.013, η2p = 0.259. Subsequent pairwise comparisons revealed that in the emotion task, the ERP responses for low reward (-2.066 µV) were more negative than those for high reward (-1.491 µV), t(21) = 2.723, p < 0.05, whereas the responses for the low- and high-reward conditions were comparable in the gender task, t(21) = 0.168, p > 0.1. No other effects or interactions reached significance.
Fig. 4.
Grand average waveforms at the P5 (left hemisphere) and P6 (right hemisphere) electrodes, showing the potentials produced in response to the presentation of the target stimulus in the experiment. Positive voltage is plotted downwards. Bars of the mean amplitudes of N170 are shown below. Error bars denote the standard error of the mean
Finally, analysis of the N2pc component (Fig. 5) revealed a significant main effect of emotional valence, F(2,42) = 6.829, p = 0.003, η2p = 0.245. Bonferroni-corrected pairwise comparisons revealed that the mean amplitude in the angry-face condition (2.504 µv) was significantly more negative than the mean amplitude in the neutral-face condition (2.862 µv), p = 0.008, but there was no difference between the happy-face condition (2.714 µv) and either the angry-face or the neutral-face conditions, both ps > 0.1. We also found a significant interaction between emotionality and contralaterality, F(2,42) = 11.706, p < 0.001, η2p = 0.358. Subsequent Bonferroni-corrected pairwise comparisons revealed that the difference between the mean amplitudes for the contralateral and ipsilateral electrodes was less negative in the angry-face condition (-3.870 µv) than in the happy-face condition (-4.535 µv, p < 0.001) and the neutral-face condition (-4.344 µv, p < 0.01), but there was no difference between the happy-face condition and the neutral-face condition, p > 0.1. Additionally, we found a significant interaction between task demand, emotionality, and contralaterality, F(2,42) = 3.712, p = 0.033, η2p = 0.150. Subsequent Bonferroni-corrected pairwise comparisons revealed that in the emotion task, the difference between the mean amplitudes for the contralateral and ipsilateral electrodes was more negative in the happy-face condition (-4.886 µv) than in either the angry-face condition (-4.062 µv) and the neutral-face condition (-4.347 µv), both ps < 0.05, but there was no difference between the angry-face condition and the neutral-face condition, p > 0.1. 
In the gender task, the contralateral-minus-ipsilateral difference was more negative in the neutral-face condition (-4.340 µV) than in the angry-face condition (-3.687 µV, p < 0.01), whereas the happy-face condition (-4.183 µV) differed neither from the neutral-face condition nor from the angry-face condition, both ps > 0.1.
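For readers unfamiliar with how N2pc difference waves and mean amplitudes are derived, the arithmetic can be sketched as follows. This is a minimal illustration in Python/NumPy using simulated single-trial data; the sampling rate, epoch window, trial counts, and noise level are illustrative assumptions, not the study's actual recording parameters. Only the electrode pair (PO7/PO8) and the 220–280 ms measurement window follow the text.

```python
import numpy as np

# Assumed recording parameters (illustrative, not the study's):
# 500 Hz sampling, epochs from -100 to 498 ms (300 samples), voltages in µV.
FS = 500.0
T0 = -0.100  # epoch start in seconds

def n2pc_difference(po7, po8, target_side):
    """Contra-minus-ipsi difference wave, averaged over trials.

    po7, po8    : arrays of shape (n_trials, n_samples)
    target_side : array of 'L'/'R' per trial (visual field of the target face)
    """
    # For a left-field target, the contralateral electrode is PO8 (right hemisphere).
    contra = np.where((target_side == 'L')[:, None], po8, po7)
    ipsi   = np.where((target_side == 'L')[:, None], po7, po8)
    return (contra - ipsi).mean(axis=0)

def mean_amplitude(wave, t_start, t_end):
    """Mean voltage of a waveform within [t_start, t_end] seconds."""
    times = T0 + np.arange(wave.size) / FS
    mask = (times >= t_start) & (times <= t_end)
    return wave[mask].mean()

# Toy data: inject an N2pc-like negativity at contralateral sites around 250 ms.
rng = np.random.default_rng(0)
n_trials, n_samples = 40, 300
times = T0 + np.arange(n_samples) / FS
bump = -2.0 * np.exp(-((times - 0.250) ** 2) / (2 * 0.020 ** 2))
side = rng.choice(np.array(['L', 'R']), n_trials)
po7 = rng.normal(0, 0.5, (n_trials, n_samples)) \
    + np.where((side == 'R')[:, None], bump, 0.0)  # PO7 is contra to right-field targets
po8 = rng.normal(0, 0.5, (n_trials, n_samples)) \
    + np.where((side == 'L')[:, None], bump, 0.0)  # PO8 is contra to left-field targets

diff = n2pc_difference(po7, po8, side)
amp = mean_amplitude(diff, 0.220, 0.280)
print(round(amp, 2))  # negative, recovering the injected contralateral negativity
```

A more negative `amp` corresponds to a larger N2pc, i.e., stronger attentional selection of the target's hemifield; the condition means reported above are exactly this quantity computed per participant and condition.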
Fig. 5.
The left panel displays the ERPs elicited at PO7 and PO8 in response to the target face, shown separately for trials in which the face appeared contralateral (solid lines) and ipsilateral (dashed lines) to the recording electrode. The right panel displays difference waves (contralateral minus ipsilateral recordings) elicited by angry (red line), happy (blue line), and neutral faces (black line)
Discussion
The current study investigated the ability of task demand to modify the interaction between reward and emotional processing. Participants learned to associate facial identities with a high or low reward probability in the learning phase and then discriminated the emotionality or the gender of the learned faces in the test phase. The ERP results in the test phase revealed a significant interaction between reward probability and emotion for the fronto-central N1 and the VPP components. Moreover, reward association and task demand exhibited a significant interaction over the right hemisphere for the N170 component, in which the response differences between the high- and low-reward conditions were stronger in the emotion task than in the gender task. Finally, we found a significant interaction of emotion and task demand on the N2pc component (220–280 ms), with happy faces eliciting larger N2pc difference waves than angry and neutral faces did in the emotion task, but neutral faces eliciting larger N2pc difference waves than angry faces did in the gender task.
The behavioral results revealed a ‘happy superiority’ effect in the emotion task, with faster RTs to happy faces than to angry faces. Recent studies have demonstrated that happy faces readily attract more attentional resources during emotion categorization tasks (Calvo and Beltrán 2014; Calvo, Fernández-Martín, et al. 2014b; Nummenmaa and Calvo 2015), since happiness is the only positive emotion among the six basic emotions (happiness, anger, disgust, fear, sadness, and surprise). In contrast, the gender task revealed a ‘neutral face’ advantage, suggesting that happy and angry emotional information may compete for attentional resources and thereby interfere with gender discrimination (e.g., Wronka and Walentowska 2011; Wu et al. 2019).
For both the fronto-central N1 and the VPP components, we found a main effect of emotion, with angry faces eliciting more negative amplitudes than happy and neutral faces, consistent with previous studies (Feldmann-Wüstefeld et al. 2011; Nummenmaa and Calvo 2015). The N1 is thought to reflect the rapid discernment of emotional meaning from coarse visual cues and early attentional capture (Foti et al. 2009; Luo et al. 2010; Vuilleumier and Pourtois 2007; Zhang et al. 2013, 2017). The VPP is likewise an emotion-sensitive component that indexes the rapid extraction of emotional information (Eimer and Holmes 2007; Ping et al. 2014; Smith et al. 2013; Williams et al. 2006). This early advantage of angry over happy and neutral faces indicates that participants were especially sensitive to the danger signals conveyed by angry faces at a very early stage, which carries a potential survival benefit of allowing danger to be avoided as early as possible (Calvo, Beltrán, et al. 2014a; Vuilleumier and Pourtois 2007).
Moreover, both the N1 and the VPP components exhibited an interaction between reward and emotion, although with somewhat different patterns. Across both tasks, high reward-associated faces elicited more positive-going waveforms than low reward-associated faces, in agreement with recent findings that reward-associated stimuli enjoy high motivational salience (Hammerschmidt et al. 2017, 2018a, b). In the current results, the differences between high and low reward-associated stimuli were most pronounced for neutral faces in the N1 component but for angry faces in the VPP component. Recent studies have suggested that an association history with high reward may serve as an ‘emotional tagging’ mechanism that facilitates the sensory processing of stimuli in a bottom-up way (Bourgeois et al. 2016; Steinberg et al. 2013). Neutral faces may acquire emotional value via associative learning and may be processed in the same way as emotional faces in the early time window (Gupta et al. 2016; Hammerschmidt et al. 2017; Müller-Bardorff et al. 2016; Wentura et al. 2014). For example, Hammerschmidt et al. (2017) employed a similar reward-association paradigm with only neutral faces (no emotional faces) and found that reward association enhanced the early occipital P1 component (75–125 ms, a time window similar to that of the current N1 component). These results indicate that reward history may modulate early visual sensory processing for fast salience detection in the extrastriate visual cortex (Rossi et al. 2017). Moreover, the ‘emotional tagging’ conferred by the approach-related motivational significance of reward association had a stronger effect on neutral and angry faces than on happy faces; we speculate that this is because happiness shares the same motivational direction as reward and is therefore less affected (Hu et al. 2013; Paul and Pourtois 2017; Paul et al. 2020; Pessoa 2015).
As mentioned in the Introduction, we aimed to test whether the interaction between reward and emotion was affected by task demand. However, no task-demand effect was found for the early N1 and VPP components, where the interaction between reward and emotion was observed. Recent studies have suggested that the early processing of facial expressions is independent of task demand, indicating that different emotional faces are encoded automatically at an early stage (Calvo and Nummenmaa 2016; Palermo and Rhodes 2007). Our results further imply that reward-association effects in these early time windows are likewise independent of task requirements.
Interestingly, we found an interaction between task demand and reward for the N170 component, in which response differences between high and low reward-associated faces were stronger in the emotion task than in the gender task over the right hemisphere. As mentioned earlier, reward association may act as an emotional signal that is relatively implicit compared with the intrinsic emotional characteristics of a photographic face (Chen and Wei 2019). Consequently, reward association is processed in a less automatic way than facial emotion (Hudson et al. 2021), so task demands can exert a top-down influence on reward processing, but not on emotional processing, at this stage. Moreover, although the N170 has long been suggested to reflect facial or emotional processing (Bentin et al. 1996; Jonas et al. 2016; Lochy et al. 2019; Rossion 2014), recent studies have consistently reported that it is modulated by reward-associated stimuli, suggesting that the N170 component has a broader sensitivity to stimuli with both emotional and motivational significance (Jia et al. 2021; Marini et al. 2011; Wu et al. 2019).
We did not find any reward-related effect in the later time window reflecting attentional selection (i.e., the N2pc component). However, task demand clearly modulated brain responses at this stage, such that the emotion-by-task-demand interaction on the N2pc showed the same pattern as the behavioral results. In particular, the happy-face condition elicited a more negative N2pc than the angry-face and neutral-face conditions in the emotion task, whereas the neutral-face condition elicited a more negative N2pc than the angry-face condition in the gender task. The N2pc component reflects spatial selective attention, and a larger N2pc indicates stronger attentional capture and, in turn, faster responses and higher accuracy for targets (Luck and Hillyard 1994; Yao et al. 2014). Task demand thus functions in a top-down way to modulate the allocation of attentional resources at the late attentional-selection stage in the service of task performance. However, as no reward was at stake in the test phase, the reward-modulation effect had diminished by this late processing stage.
Comparing the differential effects of reward in the behavioral and ERP results, two factors might explain the absence of reward effects in the behavioral data. First, reward processing was relatively implicit in the test phase, as neither the emotion task nor the gender task placed any direct requirement on processing reward information; these task requirements may thus have restrained the expression of reward information at the behavioral level. Moreover, because we used photographic faces, which carry much more complex information than schematic faces (Nummenmaa and Calvo 2015), this redundant information might also have interfered with the processing of reward information. Nevertheless, we still found reward effects on the early ERP components, such as the N1, N170, and VPP, which indicate an implicit and early arousal effect for reward-related stimuli and are consistent with previous studies of reward association with emotional stimuli (Chen and Wei 2019, 2023; Hammerschmidt et al. 2017, 2018a, b).
Finally, some limitations of the current study should be considered. First, the stimulus set was not large: the twelve faces were used repeatedly in the learning phase and in both the emotion and gender tasks of the test phase. Although a small stimulus set helps participants learn the reward associations, the repeated presentation of these stimuli in the test phase may have reduced their emotional and/or reward salience and hence weakened the behavioral and electrophysiological effects. Second, we manipulated task demand only in the test phase and used the same task throughout the learning phase. It would be of theoretical interest to manipulate task demand in the learning phase to examine how the task relevance of emotional information shapes the formation of reward associations and, in turn, the effects in the later test phase. Moreover, there were three response keys in the emotion task but two in the gender task, which may have caused longer RTs in the former than in the latter. Finally, as we did not record EEG data in the learning phase, it is difficult to explore the link between emotional effects during learning and performance in the test phase. Future studies are needed to further explore the role of emotional information in regulating reward learning.
In conclusion, the reward-modulation effect on emotional faces operates mainly in the early time windows, imbuing neutral or angry faces with motivational significance. Task demand functions in a top-down way to modulate the deployment of attentional resources at the later attentional-selection stage, but does not affect the early automatic processing of either emotion or reward association.
Author contributions
Conceptualization: Ning-Xuan Chen and Ping Wei; Methodology: Ning-Xuan Chen; Validation: Ning-Xuan Chen and Ping Wei; Formal analysis: Ning-Xuan Chen; Investigation: Ning-Xuan Chen; Data curation: Ning-Xuan Chen and Ping Wei; Writing—original draft preparation: Ning-Xuan Chen; Writing—review and editing: Ning-Xuan Chen and Ping Wei; Visualization: Ning-Xuan Chen; Supervision: Ping Wei; All authors read and approved the final manuscript.
Funding
This work was supported by the National Natural Science Foundation of China (31971030, 31470979), the Youth Beijing Scholar Project, the Support Project of High-level Teachers in Beijing Municipal Universities in the Period of 13th Five-year Plan, and the Capacity Building for Sci-Tech Innovation-Fundamental Scientific Research Funds (131-20530290058).
Data availability
The data presented in this study are available from the corresponding author on reasonable request.
Declarations
Ethical approval
This study was approved by the Ethics Committee of the Department of Psychology of Capital Normal University, and all participants gave their informed consent before the experiment, in accordance with the Declaration of Helsinki.
Competing interests
The authors have no competing interests to declare that are relevant to the content of this article.
Footnotes
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- Anderson BA (2013) A value-driven mechanism of attentional selection. J Vis 13. 10.1167/13.3.7
- Anderson BA, Folk CL (2010) Variations in the magnitude of attentional capture: testing a two-process model. Atten Percept Psychophys 72:342–352. 10.3758/APP.72.2.342
- Awh E, Belopolsky AV, Theeuwes J (2012) Top-down versus bottom-up attentional control: a failed theoretical dichotomy. Trends Cogn Sci 16:437–443. 10.1016/j.tics.2012.06.010
- Baluch F, Itti L (2011) Mechanisms of top-down attention. Trends Neurosci 34:210–224. 10.1016/j.tins.2011.02.003
- Bentin S, Allison T, Puce A, Perez E, McCarthy G (1996) Electrophysiological studies of face perception in humans. J Cogn Neurosci 8:551–565. 10.1162/jocn.1996.8.6.551
- Bourgeois A, Chelazzi L, Vuilleumier P (2016) How motivation and reward learning modulate selective attention. Prog Brain Res 229:325–342. 10.1016/bs.pbr.2016.06.004
- Calvo MG, Beltrán D (2014) Brain lateralization of holistic versus analytic processing of emotional facial expressions. NeuroImage 92:237–247. 10.1016/j.neuroimage.2014.01.048
- Calvo MG, Beltrán D, Fernández-Martín A (2014a) Processing of facial expressions in peripheral vision: neurophysiological evidence. Biol Psychol 100:60–70. 10.1016/j.biopsycho.2014.05.007
- Calvo MG, Fernández-Martín A, Nummenmaa L (2014b) Facial expression recognition in peripheral versus central vision: role of the eyes and the mouth. Psychol Res 78:180–195. 10.1007/s00426-013-0492-x
- Calvo MG, Nummenmaa L (2016) Perceptual and affective mechanisms in facial expression recognition: an integrative review. Cogn Emot 30:1081–1106. 10.1080/02699931.2015.1049124
- Chen N, Wei P (2019) Reward association alters brain responses to emotional stimuli: ERP evidence. Int J Psychophysiol 135:21–32. 10.1016/j.ijpsycho.2018.11.001
- Chen NX, Wei P (2023) Reward history modulates the processing of task-irrelevant emotional faces in a demanding task. Brain Sci 13. 10.3390/brainsci13060874
- Delorme A, Makeig S (2004) EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. J Neurosci Methods 134:9–21. 10.1016/j.jneumeth.2003.10.009
- Durston AJ, Itier RJ (2021) The early processing of fearful and happy facial expressions is independent of task demands: support from mass univariate analyses. Brain Res 1765:147505. 10.1016/j.brainres.2021.147505
- Eimer M (1996) The N2pc component as an indicator of attentional selectivity. Electroencephalogr Clin Neurophysiol 99:225–234. 10.1016/0013-4694(96)95711-9
- Eimer M (2000) Effects of face inversion on the structural encoding and recognition of faces: evidence from event-related brain potentials. Brain Res Cogn Brain Res 10:145–158. 10.1016/s0926-6410(00)00038-0
- Eimer M, Holmes A (2002) An ERP study on the time course of emotional face processing. NeuroReport 13:427–431. 10.1097/00001756-200203250-00013
- Eimer M, Holmes A (2007) Event-related brain potential correlates of emotional face processing. Neuropsychologia 45:15–31. 10.1016/j.neuropsychologia.2006.04.022
- Faul F, Erdfelder E, Buchner A, Lang AG (2009) Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses. Behav Res Methods 41:1149–1160. 10.3758/brm.41.4.1149
- Fecteau JH, Munoz DP (2006) Salience, relevance, and firing: a priority map for target selection. Trends Cogn Sci 10:382–390. 10.1016/j.tics.2006.06.011
- Feldmann-Wüstefeld T, Schmidt-Daffy M, Schubö A (2011) Neural evidence for the threat detection advantage: differential attention allocation to angry and happy faces. Psychophysiology 48:697–707. 10.1111/j.1469-8986.2010.01130.x
- Foti D, Hajcak G, Dien J (2009) Differentiating neural responses to emotional pictures: evidence from temporal-spatial PCA. Psychophysiology 46:521–530. 10.1111/j.1469-8986.2009.00796.x
- George N, Jemel B, Fiori N, Chaby L, Renault B (2005) Electrophysiological correlates of facial decision: insights from upright and upside-down Mooney-face perception. Brain Res Cogn Brain Res 24:663–673. 10.1016/j.cogbrainres.2005.03.017
- Gupta R, Hur YJ, Lavie N (2016) Distracted by pleasure: effects of positive versus negative valence on emotional capture under load. Emotion 16:328–337. 10.1037/emo0000112
- Hammerschmidt W, Sennhenn-Reulen H, Schacht A (2017) Associated motivational salience impacts early sensory processing of human faces. NeuroImage 156:466–474. 10.1016/j.neuroimage.2017.04.032
- Hammerschmidt W, Kagan I, Kulke L, Schacht A (2018a) Implicit reward associations impact face processing: time-resolved evidence from event-related brain potentials and pupil dilations. NeuroImage 179:557–569. 10.1016/j.neuroimage.2018.06.055
- Hammerschmidt W, Kulke L, Broering C, Schacht A (2018b) Money or smiles: independent ERP effects of associated monetary reward and happy faces. PLoS ONE 13:e0206142. 10.1371/journal.pone.0206142
- Hickey C, Chelazzi L, Theeuwes J (2010a) Reward changes salience in human vision via the anterior cingulate. J Neurosci 30:11096–11103. 10.1523/JNEUROSCI.1026-10.2010
- Hickey C, Chelazzi L, Theeuwes J (2010b) Reward guides vision when it’s your thing: trait reward-seeking in reward-mediated visual priming. PLoS ONE 5:e14087. 10.1371/journal.pone.0014087
- Hickey C, Chelazzi L, Theeuwes J (2011) Reward has a residual impact on target selection in visual search, but not on the suppression of distractors. Vis Cogn 19:117–128. 10.1080/13506285.2010.503946
- Holmes A, Bradley BP, Kragh Nielsen M, Mogg K (2009) Attentional selectivity for emotional faces: evidence from human electrophysiology. Psychophysiology 46:62–68. 10.1111/j.1469-8986.2008.00750.x
- Hu K, Padmala S, Pessoa L (2013) Interactions between reward and threat during visual processing. Neuropsychologia 51:1763–1772. 10.1016/j.neuropsychologia.2013.05.025
- Hudson A, Durston AJ, McCrackin SD, Itier RJ (2021) Emotion, gender and gaze discrimination tasks do not differentially impact the neural processing of angry or happy facial expressions: a mass univariate ERP analysis. Brain Topogr 34:813–833. 10.1007/s10548-021-00873-x
- Jia Y, Cui L, Pollmann S, Wei P (2021) The interactive effects of reward expectation and emotional interference on cognitive conflict control: an ERP study. Physiol Behav 234:113369. 10.1016/j.physbeh.2021.113369
- Jonas J, Jacques C, Liu-Shuang J, Brissart H, Colnat-Coulbois S, Maillard L, Rossion B (2016) A face-selective ventral occipito-temporal map of the human brain with intracerebral potentials. Proc Natl Acad Sci U S A 113:E4088–4097. 10.1073/pnas.1522033113
- Kiss M, Driver J, Eimer M (2009) Reward priority of visual target singletons modulates event-related potential signatures of attentional selection. Psychol Sci 20:245–251. 10.1111/j.1467-9280.2009.02281.x
- Krebs RM, Boehler CN, Woldorff MG (2010) The influence of reward associations on conflict processing in the Stroop task. Cognition 117:341–347. 10.1016/j.cognition.2010.08.018
- Lochy A, de Heering A, Rossion B (2019) The non-linear development of the right hemispheric specialization for human face perception. Neuropsychologia 126:10–19. 10.1016/j.neuropsychologia.2017.06.029
- Luck SJ (2005) Ten simple rules for designing ERP experiments. In: Event-related potentials: a methods handbook
- Luck SJ (2006) The operation of attention – millisecond by millisecond – over the first half second
- Luck SJ, Hillyard SA (1994) Electrophysiological correlates of feature analysis during visual search. Psychophysiology 31:291–308. 10.1111/j.1469-8986.1994.tb02218.x
- Luo W, Feng W, He W, Wang NY, Luo YJ (2010) Three stages of facial expression processing: ERP study with rapid serial visual presentation. NeuroImage 49:1857–1867. 10.1016/j.neuroimage.2009.09.018
- Marini F, Marzi T, Viggiano MP (2011) Wanted! The effects of reward on face recognition: electrophysiological correlates. Cogn Affect Behav Neurosci 11:627–643. 10.3758/s13415-011-0057-7
- Müller-Bardorff M, Schulz C, Peterburs J, Bruchmann M, Mothes-Lasch M, Miltner W, Straube T (2016) Effects of emotional intensity under perceptual load: an event-related potentials (ERPs) study. Biol Psychol 117:141–149. 10.1016/j.biopsycho.2016.03.006
- Nummenmaa L, Calvo MG (2015) Dissociation between recognition and detection advantage for facial expressions: a meta-analysis. Emotion 15:243–256. 10.1037/emo0000042
- Palermo R, Rhodes G (2007) Are you always on my mind? A review of how face perception and attention interact. Neuropsychologia 45:75–92. 10.1016/j.neuropsychologia.2006.04.025
- Park HRP, Kostandyan M, Boehler CN, Krebs RM (2019) Winning smiles: signalling reward by overlapping and non-overlapping emotional valence differentially affects performance and neural activity. Neuropsychologia 122:28–37. 10.1016/j.neuropsychologia.2018.11.018
- Paul K, Pourtois G (2017) Mood congruent tuning of reward expectation in positive mood: evidence from FRN and theta modulations. Soc Cogn Affect Neurosci 12:765–774
- Paul K, Pourtois G, Harmon-Jones E (2020) Modulatory effects of positive mood and approach motivation on reward processing: two sides of the same coin? Cogn Affect Behav Neurosci 20:236–249
- Pessoa L (2015) Multiple influences of reward on perception and attention. Vis Cogn 23:272–290. 10.1080/13506285.2014.974729
- Ping W, Guanlan K, Jinhong D, Chunyan G (2014) Monetary incentives modulate the processing of emotional facial expressions: an ERP study. Acta Physiol Sinica 46:437
- Pollmann S, Estocinova J, Sommer S, Chelazzi L, Zinke W (2016) Neural structures involved in visual search guidance by reward-enhanced contextual cueing of the target location. NeuroImage 124:887–897. 10.1016/j.neuroimage.2015.09.040
- Raymond JE, O’Brien JL (2009) Selective visual attention and motivation: the consequences of value learning in an attentional blink task. Psychol Sci 20:981–988. 10.1111/j.1467-9280.2009.02391.x
- Rossi V, Vanlessen N, Bayer M, Grass A, Pourtois G, Schacht A (2017) Motivational salience modulates early visual cortex responses across task sets. J Cogn Neurosci 29:968–979. 10.1162/jocn_a_01093
- Rossion B (2014) Understanding face perception by means of human electrophysiology. Trends Cogn Sci 18:310–318. 10.1016/j.tics.2014.02.013
- Rossion B, Joyce CA, Cottrell GW, Tarr MJ (2003) Early lateralization and orientation tuning for face, word, and object processing in the visual cortex. NeuroImage 20:1609–1624. 10.1016/j.neuroimage.2003.07.010
- Smith E, Weinberg A, Moran T, Hajcak G (2013) Electrocortical responses to NIMSTIM facial expressions of emotion. Int J Psychophysiol 88:17–25. 10.1016/j.ijpsycho.2012.12.004
- Steinberg C, Bröckelmann AK, Rehbein M, Dobel C, Junghöfer M (2013) Rapid and highly resolving associative affective learning: convergent electro- and magnetoencephalographic evidence from vision and audition. Biol Psychol 92:526–540. 10.1016/j.biopsycho.2012.02.009
- Vuilleumier P, Pourtois G (2007) Distributed and interactive brain mechanisms during emotion face perception: evidence from functional neuroimaging. Neuropsychologia 45:174–194. 10.1016/j.neuropsychologia.2006.06.003
- Wang Y, Luo Y-j (2005) Standardization and assessment of college students’ facial expression of emotion. Chin J Clin Psychol 13:396–398
- Wei P, Ji L (2021) Reward expectation modulates N2pc for target selection: electrophysiological evidence. Psychophysiology 58:e13837. 10.1111/psyp.13837
- Wei P, Kang G (2014) Task relevance regulates the interaction between reward expectation and emotion. Exp Brain Res 232:1783–1791. 10.1007/s00221-014-3870-8
- Wei P, Kang G, Ding J, Guo C (2014) Monetary incentives modulate the processing of emotional facial expressions: an ERP study. Acta Physiol Sinica 46:437
- Wentura D, Müller P, Rothermund K (2014) Attentional capture by evaluative stimuli: gain- and loss-connoting colors boost the additional-singleton effect. Psychon Bull Rev 21:701–707. 10.3758/s13423-013-0531-z
- Williams LM, Palmer D, Liddell BJ, Song L, Gordon E (2006) The ‘when’ and ‘where’ of perceiving signals of threat versus non-threat. NeuroImage 31:458–467. 10.1016/j.neuroimage.2005.12.009
- Wronka E, Walentowska W (2011) Attention modulates emotional expression processing. Psychophysiology 48:1047–1056. 10.1111/j.1469-8986.2011.01180.x
- Wu L, Müller HJ, Zhou X, Wei P (2019) Differential modulations of reward expectation on implicit facial emotion processing: ERP evidence. Psychophysiology 56:e13304. 10.1111/psyp.13304
- Yao S, Ding C, Qi S, Yang D (2014) Value associations of emotional faces can modify the anger superiority effect: behavioral and electrophysiological evidence. Soc Cogn Affect Neurosci 9:849–856. 10.1093/scan/nst056
- Yokoyama T, Padmala S, Pessoa L (2015) Reward learning and negative emotion during rapid attentional competition. Front Psychol 6:269. 10.3389/fpsyg.2015.00269
- Zhang D, Luo W, Luo Y (2013) Single-trial ERP analysis reveals facial expression category in a three-stage scheme. Brain Res 1512:78–88. 10.1016/j.brainres.2013.03.044
- Zhang D, Liu Y, Wang L, Ai H, Luo Y (2017) Mechanisms for attentional modulation by threatening emotions of fear, anger, and disgust. Cogn Affect Behav Neurosci 17:198–210. 10.3758/s13415-016-0473-9
- Zhou X, Du B, Wei Z, He W (2019) Attention capture of non-target emotional faces: an evidence from reward learning. Front Psychol 10:3004. 10.3389/fpsyg.2019.03004