Author manuscript; available in PMC: 2013 Mar 1.
Published in final edited form as: J Exp Soc Psychol. 2012 Mar 1;48(2):583–586. doi: 10.1016/j.jesp.2011.11.009

Who Expressed What Emotion? Men Grab Anger, Women Grab Happiness

Rebecca Neel a, D Vaughn Becker b, Steven L Neuberg a, Douglas T Kenrick a
PMCID: PMC3285231  NIHMSID: NIHMS341754  PMID: 22368303

Abstract

When anger or happiness flashes on a face in the crowd, do we misperceive that emotion as belonging to someone else? Two studies found that misperception of apparent emotional expressions – “illusory conjunctions” – depended on the gender of the target: male faces tended to “grab” anger from neighboring faces, and female faces tended to grab happiness. Importantly, the evidence did not suggest that this effect was due to the general tendency to misperceive male or female faces as angry or happy, but instead indicated a more subtle interaction of expectations and early visual processes. This suggests a novel aspect of affordance-management in human perception, whereby cues to threat, when they appear, are attributed to those with the greatest capability of doing harm, whereas cues to friendship are attributed to those with the greatest likelihood of providing affiliation opportunities.

Keywords: affordance management, face perception, anger, happiness, emotional expressions, threat, gender differences


Can happiness or anger on one face perceptually jump to a nearby face? Previous research suggests faces can sometimes “grab” emotions from their neighbors. That is, we sometimes misperceive an emotional expression as belonging to one face when it is actually expressed by a neighboring face. Are these patterns of emotion “grabbing” random, or is there predictable logic to how the perceptual system rearranges apparent expressions on faces?

Even for non-social targets, at the earliest stage of visual processing, objects’ features do not always appear to belong to their actual owners. Sometimes—particularly under conditions of limited attention—features move between adjacent objects (Treisman & Schmidt, 1982; Treisman, 1986). For example, the redness of a circle presented briefly next to a triangle will sometimes appear to belong to the triangle: The triangle has “grabbed” the redness from the circle. Although illusory conjunctions often occur randomly, top-down influences can constrain them (Becker, Neel, & Anderson, 2010; Goldfarb & Treisman, 2010). Recent dual-system proposals for feature integration (Hommel & Colzato, 2009; VanRullen, 2009) suggest that features conjoin randomly for unfamiliar objects, but that, for familiar objects, the mind’s stored schemata may exert top-down influence on perception to produce schema-congruent images.

Given that expectations can shape misperception, expectations of who is most likely to be happy vs. angry might guide illusory conjunctions of emotional facial expressions. An affordance-based perspective suggests we are attuned to both the threats and the opportunities others pose to us (Gibson, 1979; Johnson & Freeman, 2010; McArthur & Baron, 1983; Neuberg, Kenrick, & Schaller, 2010; Zebrowitz & Collins, 1997). Combining this approach with the idea of “error management” – by which many cognitive biases reflect processes that minimize those mistakes that have the most costly fitness consequences (Galperin & Haselton, in press; Haselton & Nettle, 2006; McKay & Efferson, 2010) – perceptual biases may be calibrated to minimize costly threats and enhance beneficial opportunities (e.g., Miller, Maner, & Becker, 2010). One such bias may be to misperceive anger on those seen to pose the greatest ability and inclination to do us harm (males), and to misperceive happiness on those seen to pose the greatest ability and inclination to nurture and provide social connection (females). Indeed, perceivers gauge a target’s potential for threat in part from facial cues to both emotion and gender (Oosterhof & Todorov, 2008; 2009). In line with this perspective, both stereotypes (Hess, Adams, & Kleck, 2005; Plant, Hyde, Keltner, & Devine, 2000; Plant, Kling, & Smith, 2004) and facial morphology (Becker, Kenrick, Neuberg, Blackwell, & Smith, 2007) contribute to perceptions that men are relatively more angry and women are relatively more happy (see also Brody & Hall, 2008; Hugenberg & Sczesny, 2006). Therefore, both top-down and bottom-up processes lead to the prediction that emotions would illusorily conjoin in ways that best manage the affordances implied by common sex stereotypes.

We predicted that female faces disproportionately grab happiness and male faces disproportionately grab anger. Two studies tested the specific hypotheses that (1) anger illusorily conjoins to male faces, at higher rates than anger conjoins to female faces, (2) happiness illusorily conjoins to female faces, at higher rates than happiness conjoins to male faces, and (3) these effects occur above and beyond any simple stereotype-consistent biases to view men as angry and women as happy when no expressions are present.

Study 1

Method

Participants

Fifty-eight undergraduates (29 female, 25 male, 4 did not indicate) participated in exchange for course credit. Participant sex produced no main effects and did not moderate any observed effects.¹

Materials

Four male and four female White posers were selected from the Ekman and Friesen (1976) stimulus set. Angry and happy expressions of each poser were used.

Procedure

Each trial consisted of a 250 ms display of two faces, flanked by an integer on each side. Participants’ primary task was to add the two numbers. After they entered their solution with a key press, a dot appeared where one of the faces had been, and participants were asked to report either the expression (25% of trials) or the gender (75% of trials) of the face that had been there. A total of 128 trials were administered, equally representing all combinations of face type (e.g., happy male with angry female, happy female with happy female, etc.).
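For concreteness, the trial structure can be sketched in code. The following is a minimal illustration, not the original experiment script: PsychoPy is assumed as the presentation library, and the stimulus file names, screen layout, and response keys are hypothetical.

    # Minimal sketch of one trial; PsychoPy assumed, stimulus files and
    # layout hypothetical. Not the original experiment script.
    import random
    from psychopy import visual, core, event

    win = visual.Window(size=(1024, 768), color='gray', units='pix')

    def run_trial(left_face, right_face, left_digit, right_digit, probe_side):
        # 250 ms display: two faces flanked by an integer on each side
        stims = [
            visual.TextStim(win, text=str(left_digit), pos=(-300, 0)),
            visual.ImageStim(win, image=left_face, pos=(-100, 0)),
            visual.ImageStim(win, image=right_face, pos=(100, 0)),
            visual.TextStim(win, text=str(right_digit), pos=(300, 0)),
        ]
        for s in stims:
            s.draw()
        win.flip()
        core.wait(0.25)
        win.flip()  # clear the screen

        # Primary task: report the sum of the two digits (single keypress
        # here for simplicity; two-digit sums would need a second press)
        sum_keys = event.waitKeys(keyList=[str(i) for i in range(10)])

        # Probe: a dot appears where one face had been; the participant
        # reports its expression (25% of trials) or gender (75% of trials)
        dot_x = -100 if probe_side == 'left' else 100
        visual.Circle(win, radius=5, pos=(dot_x, 0), fillColor='white').draw()
        win.flip()
        if random.random() < 0.25:
            probe_keys = event.waitKeys(keyList=['a', 'h'])  # angry / happy
        else:
            probe_keys = event.waitKeys(keyList=['m', 'f'])  # male / female
        return sum_keys, probe_keys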

Results and Discussion

To examine the rates at which faces grabbed anger and happiness, respectively, we calculated the error rates for a given type of target face when paired with a distractor expressing the other emotion (e.g., how many times participants mistakenly responded that an angry face was happy when it was next to a happy distractor). To compare these illusory conjunction rates to general tendencies to see anger or happiness on a face when it was absent from the display, we also calculated the rates at which participants misidentified the emotional expression on a target face when the target and distractor had the same expression (and thus the misidentified emotion had not appeared in the display).
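In code, these two error rates can be computed from trial-level responses roughly as follows. This is a sketch assuming a hypothetical long-format data file; the column names are illustrative, not the authors’ analysis script.

    # Sketch of the two error-rate computations; the data file and column
    # names are hypothetical (one row per trial).
    import pandas as pd

    trials = pd.read_csv('emotion_probe_trials.csv')
    # Expected columns: subject, probe ('emotion'/'gender'), target_gender,
    # target_emotion ('angry'/'happy'), distractor_emotion, reported_emotion
    probed = trials[trials.probe == 'emotion']

    # Illusory-conjunction rate: with two emotions in play, reporting an
    # emotion other than the target's on different-emotion trials means
    # reporting the distractor's emotion
    diff = probed[probed.target_emotion != probed.distractor_emotion]
    ic_rate = (diff.reported_emotion != diff.target_emotion).groupby(
        [diff.subject, diff.target_gender, diff.target_emotion]).mean()

    # Base rate: emotion misreported when it was absent from the display
    # (target and distractor expressed the same emotion)
    same = probed[probed.target_emotion == probed.distractor_emotion]
    base_rate = (same.reported_emotion != same.target_emotion).groupby(
        [same.subject, same.target_gender, same.target_emotion]).mean()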

A three-way ANOVA on these error rates, with Target Gender (male, female), Target Emotion (angry, happy), and Misidentification Type (misidentified emotion on distractor face, misidentified emotion absent from display) as factors, revealed a main effect of Misidentification Type, F(1,57) = 12.45, p = .001, ηp² = .18, and a two-way interaction of Target Gender and Target Emotion, F(1,57) = 12.45, p = .001, ηp² = .18. These effects were qualified by the expected three-way interaction, F(1,57) = 6.19, p = .016, ηp² = .10; for all other effects, ps > .26.
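This omnibus test can be reproduced on the per-participant error rates with a standard repeated-measures ANOVA; below is a sketch using statsmodels’ AnovaRM on a hypothetical aggregated data frame (one row per participant per cell). Note that AnovaRM reports F, degrees of freedom, and p, but not ηp², which would be computed separately.

    # Sketch of the 2 x 2 x 2 repeated-measures ANOVA on per-participant
    # error rates; the data frame and column names are hypothetical.
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    # One row per subject x target_gender x target_emotion x misid_type
    # cell, with the mean error rate for that cell in 'error_rate'
    rates = pd.read_csv('error_rates_by_cell.csv')

    model = AnovaRM(data=rates,
                    depvar='error_rate',
                    subject='subject',
                    within=['target_gender', 'target_emotion', 'misid_type'])
    print(model.fit())  # F, df, and p for all main effects and interactions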

For trials in which the target and distractor expressed different emotions—thereby allowing for the possibility of illusory conjunctions—an ANOVA with Target Gender (male, female) and Target Emotion (angry, happy) revealed a significant two-way interaction, F(1,57) = 15.48, p < .001, ηp² = .21 (see Figure 1). More specifically, as predicted, male faces grabbed anger at higher rates than did female faces, t(57) = 2.55, p = .014, d = .38. We also predicted that female faces would grab happiness at higher rates than would male faces, and this was supported as well, t(57) = 3.12, p = .003, d = .41. Also, males grabbed anger more than they grabbed happiness, t(57) = 3.36, p = .001, d = .44, whereas females grabbed happiness marginally more than they grabbed anger, t(57) = 1.62, p = .11, d = .21. (There were no main effects of Target Gender or Target Emotion, Fs < 1.05.)
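The follow-up contrasts are paired comparisons across participants. A sketch with scipy is shown below; the wide-format columns are hypothetical, and Cohen’s d is computed here as the mean difference over the standard deviation of the differences, one common convention for paired designs (the paper does not specify its formula).

    # Sketch of one follow-up paired contrast (male vs. female faces
    # grabbing anger); file and column names are hypothetical.
    import pandas as pd
    from scipy.stats import ttest_rel

    wide = pd.read_csv('conjunction_rates_wide.csv')
    male_anger = wide['male_grab_anger'].to_numpy()
    female_anger = wide['female_grab_anger'].to_numpy()

    t, p = ttest_rel(male_anger, female_anger)
    diffs = male_anger - female_anger
    d = diffs.mean() / diffs.std(ddof=1)  # one convention for paired Cohen's d
    print(f"t({len(diffs) - 1}) = {t:.2f}, p = {p:.3f}, d = {d:.2f}")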

Figure 1.

Figure 1

Illusory conjunction rates in Study 1 (real faces): Distractor faces had a different emotion from the target. Error bars represent standard errors.

For trials in which the target and distractor expressed the same emotion—thereby reflecting the insertion of happiness or anger onto the target face when that expression did not appear in the display—an ANOVA with Target Gender (male, female) and Target Emotion (angry, happy) as factors produced no interaction of Target Gender and Emotion, and no main effects of Target Gender or Emotion, all Fs < 1 (see Figure 2). Specific comparisons likewise showed no difference in the perception of anger on male and female targets, t(57) = 1.40, p = .17, d = .18; nor was happiness perceived on female targets at higher rates than on male targets, t(57) = 0.10, p = .92, d = .01.

Figure 2.

Figure 2

Emotion misperception base-rates in Study 1 (real faces): Distractor faces had the same emotion as the target. Error bars represent standard errors.

The data from Study 1 support our hypotheses: Male faces grabbed anger from their neighbors at higher rates than female faces did, whereas female faces grabbed happiness from their neighbors at higher rates than male faces did. Analyses suggested that this was not purely due to a general tendency to see men as angry and women as happy. Even so, we sought to replicate the hypothesized effects in Study 2 with a larger participant sample and a different stimulus set.

The faces used in Study 1 offer the external validity of real targets posing natural emotional expressions. Yet photographs of real faces are by nature idiosyncratic, with possible differences in the expressive intensity of component features (for example, closed- vs. open-mouthed expressions). We thus sought to replicate the effects with computer-generated faces for which cues to gender and the extremity of emotional expression could be held constant across targets. Effects observed consistently across both approaches are less likely to be driven by the weaknesses of either one, allowing more robust inferences about the observed results.

Study 2

Method

Participants

One hundred twenty-nine undergraduates (74 male, 48 female, 7 did not indicate) participated in exchange for course credit. Participant sex did not produce main or interactive effects.

Materials

Stimuli were created using FaceGen Modeller. Eleven male and eleven female Caucasian/European faces were created, with the program’s gender control set approximately equidistant from androgyny in the male or female direction. For each face, a neutrally expressive image was produced and then cropped so that the edges of the face were not visible and the person did not appear bald (a possible cue of male gender). All faces were then pre-rated by 21 undergraduates on the clarity of their apparent gender (1 = definitely male, 7 = definitely female). Six faces rated as clearly male (< 3) and six rated as clearly female (> 5) were selected. Angry and happy morphs for each selected face were created using FaceGen’s morphing tools, with the target’s emotion set at the maximum of anger or happiness. Both the angry and happy morphs displayed open-mouthed expressions to produce consistent tooth exposure across emotion expressions. Only the happy and angry morphs, and not the pre-rated neutral faces, were used in the experiment (see Figure 3).
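The gender-clarity screening amounts to a simple cutoff on mean pre-test ratings. As a minimal sketch, assuming a hypothetical ratings file (one row per rater per face):

    # Sketch of the face-selection cutoff on mean gender ratings
    # (1 = definitely male, 7 = definitely female); data layout hypothetical.
    import pandas as pd

    ratings = pd.read_csv('pretest_gender_ratings.csv')  # face_id, intended_gender, rating
    means = ratings.groupby(['face_id', 'intended_gender']).rating.mean().reset_index()

    clear_male = means[(means.intended_gender == 'male') & (means.rating < 3)]
    clear_female = means[(means.intended_gender == 'female') & (means.rating > 5)]
    selected = pd.concat([clear_male.head(6), clear_female.head(6)])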

Figure 3.

Figure 3

Example of stimuli generated for Study 2.

Procedure

The procedure followed that of Study 1, except for the use of computer-generated faces instead of photographs of actual faces.

Results and Discussion

As before, the three-way ANOVA revealed both a main effect of Misidentification Type, F(1,128) = 81.12, p < .001, ηp² = .39, and a two-way interaction of Target Gender and Target Emotion, F(1,128) = 90.02, p < .001, ηp² = .41. These effects were again qualified by a significant three-way interaction, F(1,128) = 32.22, p < .001, ηp² = .20 (for all other effects, ps > .05).

For trials in which the target and distractor expressed different emotions, an ANOVA with Target Gender (male, female) and Target Emotion (angry, happy) revealed a main effect of Target Emotion, F(1,128) = 6.59, p = .01, ηp² = .05, whereby faces grabbed anger at higher rates than they grabbed happiness (see Figure 4). There was no main effect of Target Gender, F(1,128) = 1.97, p = .16, ηp² = .02. The predicted two-way interaction of Target Gender and Target Emotion emerged, F(1,128) = 108.08, p < .001, ηp² = .46. Paired-samples t-tests revealed that male faces grabbed anger at higher rates than did female faces, t(128) = 8.26, p < .001, d = .73, and female faces grabbed happiness at higher rates than did male faces, t(128) = 6.70, p < .001, d = .59. As before, males grabbed anger more than they grabbed happiness, t(128) = 8.80, p < .001, d = .78, whereas females grabbed happiness more than they grabbed anger, this time with statistical significance, t(128) = 4.41, p < .001, d = .39.

Figure 4.

Figure 4

Illusory conjunction rates in Study 2 (computer-generated faces): Distractor faces had a different emotion from the target. Error bars represent standard errors.

The Target Gender (male, female) × Target Emotion (angry, happy) ANOVA for trials in which the target and distractor expressed the same emotion produced no significant main effects, Fs < 1.50, but did reveal a significant two-way interaction, F(1,128) = 13.42, p < .001, ηp² = .10 (see Figure 5). Paired-samples t-tests revealed that angry females were called happy more often than were angry males, t(128) = 3.42, p = .001, d = .30; the complementary effect was marginal, as happy males were nonsignificantly more likely to be called angry than were happy females, t(128) = 1.69, p = .09, d = .15. As revealed by the significant three-way interaction reported above, the interaction of Target Gender and Target Emotion was much stronger when the target and distractor emotional expressions did not match—thereby revealing the predicted pattern of illusory conjunctions, above and beyond any more general tendency to see men as angry and women as happy.

Figure 5.

Figure 5

Emotion misperception base-rates in Study 2 (computer-generated faces): Distractor faces had the same emotion as the target. Error bars represent standard errors.

Employing a new, emotion- and gender-calibrated face set, Study 2 replicated the findings of Study 1: Male faces grabbed anger, and female faces grabbed happiness, from their neighbors.

General Discussion

Across two studies employing real and computer-generated stimuli, male faces were more likely to grab anger from neighboring faces than were female faces. Given that those most capable of doing harm are also those most likely to display anger (Sell, Tooby, & Cosmides, 2009), such a bias to see existing anger as emanating from men may serve a protective function, minimizing the potentially more costly error of failing to see a man as angry. We also saw a complementary effect, whereby female faces grabbed happiness at higher rates than male faces, and grabbed happiness more than they grabbed anger. Study 1 suggested, and Study 2 confirmed, that these effects emerge beyond the simple overperception of anger and happiness on male and female faces, revealing a separate process by which expectations can influence perception of emotional expressions.

Affordance Management versus Stereotyping?

Do these results suggest that gender-based stereotypes played no role in the misperception of apparent emotion? No. In fact, these results are compatible with a view of affordance management by which stereotypes can inform who is most likely to pose particular threats and opportunities. Thus, although we did not find evidence for a strong stereotyping effect (e.g., misperceiving men as angry and women as happy regardless of the distractor’s emotional expression), our findings suggest a weaker form of stereotyping, by which emotions, when they appeared, were misperceived in stereotype-congruent ways. This finding adds to other work showing that, for White participants, Black male targets (stereotyped as more threatening than White males) tend to draw angry expressions from nearby faces, whereas White male targets tend to pull neutrality (Becker et al., 2010). The current studies thus expand a growing body of work on affordance management via emotion perception, suggesting a nuanced, early-stage application of stereotypical knowledge, whereby limited attention leads to systematic errors conforming to stereotypes.

Should these target gender effects always hold? An affordance-based perspective would suggest not. In contexts where women are considered to be more formidable and threatening than men, as well as those where men might be perceived to afford relative comfort and friendliness, we would expect to find these patterns diminished or reversed. Compatible with the proposition that perceptions of relative dominance vs. affiliation drive gender-based emotion perception biases (e.g., Hess et al., 2004; 2005), in contexts where gender stereotypes predict female dominance or aggression (e.g., in protecting a child), or male affiliation (e.g., as part of a coalition), we might expect the current effect to be reversed. Future work could fruitfully explore the extent to which specific contexts moderate the effects observed here.

Taken together, the current studies offer a novel direction for the growing literature on functional approaches to emotion perception (e.g., Ackerman et al., 2006; Johnson & Freeman, 2010; Maner et al., 2005; Marsh, Adams, & Kleck, 2005; Oosterhof & Todorov, 2008; Shapiro et al., 2009; Zebrowitz, Kikuchi, & Fellous, 2007; 2010) and demonstrate the usefulness of this affordance-based approach for revealing the nuances of social perception.

Highlights.

• Two studies examine illusory conjunctions of angry and happy expressions.

• Male faces grab anger from neighboring faces, whereas female faces grab happiness.

• This was not due to a general tendency to perceive men as angry and women as happy.

• Findings support an affordance-management perspective on emotion perception.

Footnotes


¹ Although participant sex effects are common in studies of mating motives and behaviors, for which men and women confront very different trade-offs, studies of self-protective threats, which pose similar problems for both sexes, typically find few sex differences (e.g., Griskevicius, Goldstein, Mortensen, Cialdini, & Kenrick, 2006; Li, Kenrick, Griskevicius, & Neuberg, in press).

References

  1. Ackerman JM, Shapiro JR, Neuberg SL, Kenrick DT, Becker DV, Griskevicius V, Maner JK, Schaller M. They all look the same to me (unless they’re angry): From out-group homogeneity to out-group heterogeneity. Psychological Science. 2006;17:836–840. doi: 10.1111/j.1467-9280.2006.01790.x.
  2. Becker DV, Kenrick DT, Neuberg SL, Blackwell KC, Smith DM. The confounded nature of angry men and happy women. Journal of Personality and Social Psychology. 2007;92:179–190. doi: 10.1037/0022-3514.92.2.179.
  3. Becker DV, Neel R, Anderson US. Illusory conjunctions of angry facial expressions follow intergroup biases. Psychological Science. 2010;21:938–940. doi: 10.1177/0956797610373374.
  4. Brody LR, Hall JA. Gender and emotion in context. In: Lewis M, Haviland-Jones JM, Feldman-Barrett L, editors. Handbook of emotions. 3rd ed. Guilford Press; New York: 2008. pp. 395–408.
  5. Ekman P, Friesen WV. Pictures of Facial Affect. Consulting Psychologists Press; Palo Alto, CA: 1976.
  6. Galperin A, Haselton MG. Error management and the evolution of cognitive bias. In: Forgas JP, Fiedler K, Sedikides C, editors. Social Thinking and Interpersonal Behavior. Psychology Press; New York: in press.
  7. Gibson JJ. The ecological approach to visual perception. Houghton Mifflin; Boston: 1979.
  8. Goldfarb L, Treisman A. Are some features easier to bind than others? The congruency effect. Psychological Science. 2010;21:676–681. doi: 10.1177/0956797610365130.
  9. Griskevicius V, Goldstein N, Mortensen C, Cialdini RB, Kenrick DT. Going along versus going alone: When fundamental motives facilitate strategic (non)conformity. Journal of Personality and Social Psychology. 2006;91:281–294. doi: 10.1037/0022-3514.91.2.281.
  10. Haselton MG, Nettle D. The paranoid optimist: An integrative evolutionary model of cognitive biases. Personality and Social Psychology Review. 2006;10:47–66. doi: 10.1207/s15327957pspr1001_3.
  11. Hess U, Adams RB, Kleck RE. Facial appearance, gender, and emotion expression. Emotion. 2004;4:378–388. doi: 10.1037/1528-3542.4.4.378.
  12. Hess U, Adams RB, Kleck RE. Who may frown and who should smile? Dominance, affiliation, and the display of happiness and anger. Cognition and Emotion. 2005;19:515–536.
  13. Hommel B, Colzato LS. When an object is more than a binding of its features: Evidence for two mechanisms of visual feature integration. Visual Cognition. 2009;17:120–140.
  14. Hugenberg K, Sczesny S. On wonderful women and seeing smiles: Social categorization moderates the happy face response latency advantage. Social Cognition. 2006;24:516–539.
  15. Johnson KL, Freeman JB. A “New Look” at person construal: Seeing beyond dominance and discreteness. In: Balcetis E, Lassiter D, editors. The Social Psychology of Visual Perception. Psychology Press; New York: 2010. pp. 253–272.
  16. Li YJ, Kenrick DT, Griskevicius V, Neuberg SL. Economic biases in evolutionary perspective: How mating and self-protection motives alter loss aversion. Journal of Personality and Social Psychology. in press. doi: 10.1037/a0025844.
  17. Maner JK, Kenrick DT, Becker DV, Robertson TE, Hofer B, Neuberg SL, Delton AW, Butner J, Schaller M. Functional projection: How fundamental social motives can bias interpersonal perception. Journal of Personality and Social Psychology. 2005;88:63–78. doi: 10.1037/0022-3514.88.1.63.
  18. Marsh AA, Adams RB, Kleck RE. Why do fear and anger look the way they do? Form and social function in facial expressions. Personality and Social Psychology Bulletin. 2005;31:73–86. doi: 10.1177/0146167204271306.
  19. McArthur LZ, Baron RM. Toward an ecological theory of social perception. Psychological Review. 1983;90:215–238.
  20. McKay R, Efferson C. The subtleties of error management. Evolution and Human Behavior. 2010;31:309–319.
  21. Miller SL, Maner JK, Becker DV. Self-protective biases in group categorization: Threat cues shape the boundary between “us” and “them”. Journal of Personality and Social Psychology. 2010;99:62–77. doi: 10.1037/a0018086.
  22. Neuberg SL, Kenrick DT, Schaller M. Evolutionary social psychology. In: Fiske ST, Gilbert D, Lindzey G, editors. Handbook of social psychology. John Wiley & Sons; New York: 2010.
  23. Oosterhof NN, Todorov A. The functional basis of face evaluation. Proceedings of the National Academy of Sciences. 2008;105:11087–11092. doi: 10.1073/pnas.0805664105.
  24. Oosterhof NN, Todorov A. Shared perceptual basis of emotional expressions and trustworthiness impressions from faces. Emotion. 2009;9:128–133. doi: 10.1037/a0014520.
  25. Plant EA, Hyde JS, Keltner D, Devine PG. The gender stereotyping of emotions. Psychology of Women Quarterly. 2000;24:81–92.
  26. Plant EA, Kling KC, Smith GL. The influence of gender and social role on the interpretation of facial expressions. Sex Roles. 2004;51:187–196.
  27. Sell A, Tooby J, Cosmides L. Formidability and the logic of human anger. Proceedings of the National Academy of Sciences. 2009;106:15073–15078. doi: 10.1073/pnas.0904312106.
  28. Shapiro JR, Ackerman JM, Neuberg SL, Maner JK, Becker DV, Kenrick DT. Following in the wake of anger: When not discriminating is discriminating. Personality and Social Psychology Bulletin. 2009;35:1356–1367. doi: 10.1177/0146167209339627.
  29. Treisman A. Features and objects in visual processing. Scientific American. 1986;254:114–125.
  30. Treisman AM, Schmidt H. Illusory conjunctions in the perception of objects. Cognitive Psychology. 1982;14:107–141. doi: 10.1016/0010-0285(82)90006-8.
  31. VanRullen R. Binding hardwired versus on-demand feature conjunctions. Visual Cognition. 2009;17:103–119.
  32. Zebrowitz LA, Collins MA. Accurate social perception at zero acquaintance: The affordances of a Gibsonian approach. Personality and Social Psychology Review. 1997;1:204–223. doi: 10.1207/s15327957pspr0103_2.
  33. Zebrowitz LA, Kikuchi M, Fellous JM. Are effects of emotion expression on trait impressions mediated by babyfaceness? Evidence from connectionist modeling. Personality and Social Psychology Bulletin. 2007;33:648–662. doi: 10.1177/0146167206297399.
  34. Zebrowitz LA, Kikuchi M, Fellous JM. Facial resemblance to emotions: Group differences, impression effects, and race stereotypes. Journal of Personality and Social Psychology. 2010;98:175–198. doi: 10.1037/a0017990.
