Abstract
Younger adults (YA) attribute emotion-related traits to people whose neutral facial structure resembles an emotion (emotion overgeneralization). The fact that older adults (OA) show deficits in accurately labeling basic emotions suggests that they may be relatively insensitive to variations in the emotion resemblance of neutral expression faces that underlie emotion overgeneralization effects. On the other hand, the fact that OA, like YA, show a ‘pop-out’ effect for anger, more quickly locating an angry than a happy face in a neutral array, suggests that both age groups may be equally sensitive to emotion resemblance. We used computer modeling to assess the degree to which neutral faces objectively resembled emotions and assessed whether that resemblance predicted trait impressions. We found that both OA and YA showed anger and surprise overgeneralization in ratings of danger and naiveté, respectively, with no significant differences in the strength of the effects for the two age groups. These findings suggest that well-documented OA deficits on emotion recognition tasks may be more due to processing demands than to an insensitivity to the social affordances of emotion expressions.
Keywords: Aging, Face perception, Emotion resemblance, Overgeneralization, Trait impressions
Research investigating age-related changes in emotion recognition has documented reliable declines in the ability to accurately label facial expressions of emotion. According to a meta-analysis of 17 studies, older adults (OA) are significantly worse than younger adults (YA) at labeling prototypical facial expressions (Ruffman et al. 2008). Nevertheless, OA do remain sensitive to emotion expressions, as they typically perform at above chance levels in labeling anger and other expressions, even when their accuracy is significantly lower than that of YA (Isaacowitz et al. 2007). Additionally, OA, like YA, consistently show a ‘pop out’ effect, whereby they are quicker to locate an angry schematic face among an array of neutral schematic faces than to locate a happy schematic face (Hahn et al. 2006; Mather and Knight 2006; Ruffman et al. 2009). Although it has been established that OA show deficits in labeling full-blown emotion expressions but retain sensitivity to intense anger expressions in an array of faces, research has not investigated age differences in responsiveness to more subtle emotion cues that have been shown to influence younger adults' trait impressions (e.g., Zebrowitz et al. 2010). The present study investigated whether OA trait impressions are less responsive than those of YA to subtle variations in the resemblance of neutral expression faces to emotion expressions.
Our focus on trait impressions builds on research demonstrating that facial expressions of emotion not only communicate emotional states but also elicit trait impressions in YA, an effect that Secord (1958) dubbed temporal extension. For example, people displaying transient angry expressions are perceived to have stable traits associated with low warmth and high dominance (Knutson 1996; Montepare and Dobish 2003). In addition to perceiving people displaying emotion expressions to have traits congruent with those emotions, YA perceivers also show emotion face overgeneralization (Zebrowitz 1996, 1997), attributing emotion-related traits to people whose neutral expression facial structure resembles an emotion expression. For example, posed neutral expression faces that show more resemblance to an angry expression are perceived by YA as less likeable and trustworthy and more dominant, hostile, and threatening, with opposite impressions of neutral faces showing greater resemblance to a happy expression. These effects have been shown when resemblance to an angry expression is assessed by human raters (Montepare and Dobish 2003) or by objective computer methods (Said et al. 2009; Zebrowitz et al. 2010), as well as when resemblance is manipulated by lowering eyebrow height (Keating et al. 1981). The converse is also true, with neutral expression faces perceived as low or high in trustworthiness resembling angry or happy expressions, respectively (Oosterhof and Todorov 2008). According to the ecological theory of social perception (McArthur and Baron 1983; Zebrowitz et al. 2011), emotion overgeneralization is a byproduct of the adaptive value of responding appropriately to emotional expressions. For example, because responding appropriately to an angry person is highly important, falsely perceiving hostility when none is present is less maladaptive than failing to perceive it when warranted. Although it can be adaptive to respond appropriately to actual emotional expressions and to show false positives rather than miss an emotion, the adaptive value of emotion overgeneralization is uncertain, since it hinges on whether people whose neutral expression faces resemble particular emotions actually have the associated traits. It is possible that some people do, but it is also likely that many do not. For example, people who look surprised due to large eyes are not necessarily naïve. In the absence of any information to indicate that the people posing neutral expression faces in the present study had personality traits that matched the emotion their faces resembled, we refer to the attribution of such traits as an emotion overgeneralization effect, not an accuracy effect.
In the present study, we examined whether OA trait impressions from faces would show anger, surprise, and happy emotion overgeneralization effects equivalent to those shown by YA. Given OA deficits in accurately labeling strong emotion expressions, one might expect OA to show little or no sensitivity to variations in the subtle resemblance of neutral expression faces to emotions. This would yield either no significant emotion overgeneralization or significantly weaker overgeneralization than that shown by YA, particularly in the case of anger overgeneralization, for which OA show greater labeling deficits (mean effect size r = .34) than they do for surprise (r = .07) and happiness (r = .08) (Ruffman et al. 2008). On the other hand, the fact that OA do show some accuracy in labeling emotions as well as sensitivity to anger in the emotion pop out paradigm suggests that they may show significant emotion overgeneralization.
Method
Participants
Sixteen young adult (YA) participants (8 men) and 16 older adult (OA) participants (8 men) completed the study. YA participants, aged 18–21 (M = 18.8, SD = .91), were recruited from a university and received course credit or $15 payment. OA participants, aged 65–93 (M = 75.6, SD = 7.3), were recruited from the local community and were paid $25. OA were screened for dementia using the Mini-Mental State Examination (Folstein et al. 1975), and all scored above 26 out of 30 (M = 28.6, SD = 1.3).
Control Measures
Measures of vision, affect, and cognitive function were administered to assess differences between our OA and YA samples and to determine whether any age differences in emotion overgeneralization could be attributed to differences in these more general processes. The results revealed several differences between OA and YA, most of which are consistent with previous aging research (see Table 2). OA performed worse than YA on the Reading the Mind in the Eyes Test (Eyes test; Baron-Cohen et al. 2001), which assessed the ability to read and label mental states from variations in the appearance of the eye region. OA also performed worse on the Benton Facial Recognition Test (Benton et al. 1983), which assessed the ability to match faces of the same identity presented with different orientations, expressions, and lighting conditions, although OA performance on this test was in the normal range for normative samples of OA (e.g., Eslinger et al. 1985). In addition, visual acuity (Snellen Eye Chart) and contrast sensitivity (Mars Letter Contrast Sensitivity Test, Mars Perceptrix, Chappaqua, NY, USA) were lower for OA, but in the normal range for both age groups. There were no age differences in color vision (Ishihara's Tests for Color Deficiency, Ishihara 2010). Consistent with previous findings (Mather and Knight 2005, 2006), OA reported more positive affect than YA but did not differ in negative affect on a computerized version of the PANAS (Watson et al. 1988), which was administered because mood can affect emotion perception (Bouhuys et al. 1995). OA also scored higher than YA on the Shipley Vocabulary test (Shipley 1946), consistent with their higher education level and the maintenance of crystallized intelligence in older adulthood (Horn and Cattell 1967). The lower scores of OA than YA on a timed Pattern Comparison Task (Salthouse 1993) are consistent with decreases in processing speed in older adulthood (Salthouse 1996). Further, OA showed poorer executive control, performing worse on a short-form, 48-item computerized version of the Wisconsin Card Sort Task, the Berg Card Sort Task (BCST; downloaded from http://pebl.sourceforge.net/battery.html and validated by Piper et al. 2012), although they showed equivalent levels of perseverative errors, which are the most diagnostic of executive dysfunction (Greve et al. 2005).
Table 2. OA and YA scores on control measures.
| Measure | Older adults (N = 16) M | SD | Younger adults (N = 16) M | SD | t value | p value |
|---|---|---|---|---|---|---|
| Snellen visual acuity (denominator) | 31.25 | 9.22 | 19.38 | 5.25 | 4.48 | <.001 |
| Mars letter contrast sensitivity | 1.64 | .11 | 1.74 | .04 | 3.02 | .005 |
| Ishihara's test for color deficiency | 13.88 | .50 | 13.88 | .34 | 0 | 1.00 |
| Benton facial recognition test | 45.50 | 3.74 | 47.63 | 3.54 | 1.65 | .109 |
| Pattern comparison test | 29.06 | 5.56 | 40.13 | 6.70 | 5.08 | <.001 |
| Shipley vocabulary test | 35.13 | 4.37 | 31.88 | 2.87 | 2.49 | .019 |
| PANAS positive affect | 3.57 | .52 | 2.77 | .82 | 3.24 | .003 |
| PANAS negative affect | 1.47 | .34 | 1.65 | .41 | 1.31 | .20 |
| Mind in the Eyes Test | 21.38 | 4.62 | 26.31 | 5.39 | 2.78 | .009 |
| BCST correct responses | 24.50 | 9.53 | 34.94 | 5.23 | 3.84 | .001 |
| BCST perseverative errors | 7.06 | 5.72 | 5.81 | .98 | .86 | .396 |
| BCST non-perseverative errors | 16.44 | 11.55 | 7.3 | 5.25 | 2.90 | .007 |
| BCST total errors | 23.50 | 9.52 | 13.06 | 5.24 | 3.84 | .001 |
| BCST trials to complete first category | 12.31 | 11.59 | 10.81 | 5.14 | .473 | .639 |
| Level of education^a | 4.31 | 1.58 | 2.63 | .50 | 4.07 | <.001 |

^a Level of education was coded for highest level attained: (1) no high school diploma, (2) high school diploma, (3) some college, (4) Bachelor's degree, (5) some graduate work, (6) Master's degree, (7) Doctorate degree
Stimuli
The images used in the present study were 120 White young adult neutral expression faces (60 men), taken from a multi-race set of 360 faces previously used by Zebrowitz et al. (2010). These facial images were selected from four different sources: the University of Stirling PICS database (http://pics.psych.stir.ac.uk/), the AR face database (Martinez and Benavente 1998), NimStim, and 2 yearbooks (one high school and one university yearbook). Images were selected to have neutral expressions, no head tilt, no eyeglasses, and no facial hair. The images were color frontal facial photographs of people displaying neutral expressions and were cropped to remove clothing and non-facial cues. Images were sized to 310 × 380 pixels and displayed on a 17-inch CRT monitor set to a resolution of 1,280 × 1,024 pixels, at an approximate size of 7.6 × 10.2 cm, subtending a visual angle of 7.2 by 9.7° with participants seated 60 cm away. To verify that faces had neutral expressions, four YA judges (2 men) provided smile ratings on a 5-point scale, with endpoints labeled no smile/big smile. Eleven faces (3 male) were removed from our analysis because their smile ratings were >2, yielding a total N = 109. Although it is theoretically possible that other emotional expressions may have been present in the stimuli, we believe that it is unlikely that someone asked to pose a neutral expression would instead pose an angry or surprised expression; thus any resemblance of the faces to those expressions would result only from facial structure.
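The reported visual angles follow from the standard formula relating stimulus size to viewing distance; the short sketch below (in Python, using only the dimensions given above) reproduces the approximate 7.2° × 9.7° values.

```python
import math

def visual_angle_deg(size_cm: float, distance_cm: float) -> float:
    """Visual angle (in degrees) subtended by a stimulus of a given size at a given viewing distance."""
    return math.degrees(2 * math.atan(size_cm / (2 * distance_cm)))

# Stimulus dimensions (7.6 x 10.2 cm) viewed from 60 cm, as reported above.
width_deg = visual_angle_deg(7.6, 60)    # ~7.2 degrees
height_deg = visual_angle_deg(10.2, 60)  # ~9.7 degrees
print(f"{width_deg:.1f} x {height_deg:.1f} degrees")
```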
Measures
Emotion Resemblance
The degree to which each generalization face resembled anger, surprise, and happy facial expressions was taken from the Zebrowitz et al. (2010) study, which used connectionist modeling to provide objective indices of emotion resemblance for each face. In this method, a computer network is trained to differentiate two categories of faces (e.g., an angry and neutral expression) based on their facial metrics. When subsequently tested on metrics from the faces rated in this study, the network indicates the probability that each face belongs to one of the trained categories (e.g., the probability that it is an angry face). This probability is then used to predict human judges' impressions of the faces. More details about the modeling procedure are provided in Zebrowitz et al. (2007, 2010).
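The published models were trained on specific facial metrics and network architectures detailed in Zebrowitz et al. (2007, 2010); the following is only a minimal sketch of the general logic, using hypothetical metric data and a generic scikit-learn classifier rather than the authors' implementation.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical training data: rows are faces, columns are facial metrics
# (e.g., brow height, mouth curvature); labels are 1 = angry pose, 0 = neutral pose.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 20))
y_train = rng.integers(0, 2, size=200)

# Train a small network to separate angry from neutral expression faces.
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)

# Metrics for the neutral-expression faces rated in the study (hypothetical here).
X_neutral = rng.normal(size=(109, 20))

# The classifier's probability that each neutral face is "angry" serves as its
# objective anger-resemblance score, later used to predict trait impressions.
anger_resemblance = model.predict_proba(X_neutral)[:, 1]
```

The key point is that a graded probability, rather than a binary label, serves as the objective emotion-resemblance index entered into the regressions reported below.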
Trait and Appearance Ratings
Participants rated each generalization face on four traits (hostile, trustworthy, naïve, and warm) and two appearance qualities (attractive and babyfaced) using 7-point scales with endpoints labeled (1) not at all (hostile, naïve, trustworthy, warm, attractive, or babyfaced) and (7) very (hostile, naïve, trustworthy, warm, attractive, or babyfaced). Although ratings of ‘naïve’ were not included in the Zebrowitz et al. (2010) study, we included this trait to assess surprise expression overgeneralization, both because Zebrowitz et al. found no significant surprise overgeneralization with the traits they used and also because previous research found that prototypical surprise expressions were associated with impressions of greater naiveté (Zebrowitz et al. 2007). Ratings of “warm” also were not included in the study by Zebrowitz et al. (2010). Rather, they had assessed “likeable”. Because they found no significant effects of resemblance to happy expressions on ratings of the likeability of the Caucasian faces used in the present study, we substituted the trait rating “warm”, which we thought might be more sensitive to emotion overgeneralization than the more subjective rating of “likeable”.
Procedure
After providing informed consent, participants were seated in front of a computer screen. They first completed a computerized version of the PANAS (Watson et al. 1988). Next, MediaLab software (Empirisoft, New York City, NY) was used to present the trait rating task. Before rating the faces, participants viewed an instruction screen telling them to rate each face based only on their first impression of the face and explicitly stating the following: “Do not be concerned about whether your judgments are right or wrong. We are simply interested in whether people show agreement in their first impressions of faces.” Prior to rating the 120 faces for this study, participants rated a different set of 24 male faces on four trait scales (aggressive, dominant, manipulative, and trustworthy) for a separate study. Following these ratings, participants rated the faces included in this study in the following trait order: trustworthy, hostile, naïve, and warm. After the trait ratings, participants rated the 24 male faces for the other study, followed by the 120 faces for this study, on the appearance qualities of attractiveness and babyfaceness, with the order of these two qualities counterbalanced across participants. Faces for the present study were blocked by sex, and the order of male and female faces was counterbalanced across participants. This procedure was employed so that variations in trait impressions would be less likely to reflect sex stereotypes, as we were interested in whether the first impressions of OA were responsive to variations in emotion resemblance within demographic groups, as has been previously shown for YA (Zebrowitz et al. 2010). Participants viewed each face for 4 s and made their rating after the face disappeared. The rating scale remained on the screen until the rating was made, at which time it disappeared and a new face appeared. The entire rating portion lasted about an hour. Following this, participants completed the BCST and a computerized version of the Eyes Test and then completed the other control measures as well as a demographic and health questionnaire.
Results
Inter-rater Agreement in Impressions
With one exception, Cronbach alphas revealed that the individual trait and appearance ratings of both OA and YA judges showed acceptable inter-rater reliabilities, indicating that OA, like YA, show consensual first impressions from faces (all other alphas > .70; see Table 1). Given these inter-rater reliabilities, we computed mean ratings for each face across OA judges and across YA judges, and used face rather than rater as the unit of analysis, paralleling the analysis in the Zebrowitz et al. (2010) study. Also following the procedure of Zebrowitz et al. (2010), we created a danger composite by combining the hostile and reverse-scored trustworthy ratings, which were highly correlated for both OA (r(14) = .81, p < .0001) and YA (r(14) = .79, p < .0001). The single exception to acceptable reliability was YA ratings of naiveté (α = .53), which could not be raised by dropping any deviant participants. Despite this low alpha, we included YA naïve ratings in our analyses in order to compare effects with OA ratings. However, we recognize that finding weaker emotion overgeneralization effects for YA than OA impressions of naiveté could reflect the lower reliability of the YA ratings.
Table 1. Alphas for YA and OA agreement in face ratings.
| Rating | YA | OA |
|---|---|---|
| Danger | .85 | .86 |
| Trustworthy | .81 | .78 |
| Hostile | .76 | .79 |
| Naïve | .56 | .73 |
| Warm | .82 | .84 |
| Attractive | .86 | .88 |
| Babyfaced | .83 | .85 |
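For concreteness, the reliability and composite-scoring steps described above can be sketched as follows, using hypothetical rating matrices (not the study's data) and assuming the conventional reverse-scoring rule of 8 − rating for a 7-point scale.

```python
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha across judges: rows are faces, columns are judges."""
    k = ratings.shape[1]
    item_var_sum = ratings.var(axis=0, ddof=1).sum()
    total_var = ratings.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical 7-point ratings: 109 faces x 16 OA judges per trait.
rng = np.random.default_rng(1)
hostile_oa = rng.integers(1, 8, size=(109, 16)).astype(float)
trustworthy_oa = rng.integers(1, 8, size=(109, 16)).astype(float)

print(cronbach_alpha(hostile_oa))  # agreement among OA judges on 'hostile'

# Mean rating per face across judges, then a danger composite from hostile
# and reverse-scored trustworthy ratings (assumed reverse score: 8 - rating).
hostile_mean = hostile_oa.mean(axis=1)
trustworthy_rev_mean = (8 - trustworthy_oa).mean(axis=1)
danger_oa = (hostile_mean + trustworthy_rev_mean) / 2
```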
Emotion Overgeneralization Regression Analysis
Using face as the unit of analysis, we performed six separate regression analyses to predict OA and YA impressions (dangerous, naïve, warm) from resemblance to angry expressions, surprise expressions, and happy expressions as well as three other appearance variables. Specifically, to examine the effect of resemblance to emotion expressions on trait impressions with other appearance qualities controlled, we entered face sex (coded 0 for female, 1 for male), mean ratings of attractiveness and babyfaceness for each face by the relevant age group, along with the measures of how much each face resembled anger, happiness, and surprise, as determined by the connectionist model. We controlled other appearance qualities because each is correlated with emotion resemblance (Zebrowitz et al. 2007, 2010), and each has effects on trait impressions that parallel the predicted effects of emotion resemblance (Eagly et al. 1991; Kite et al. 2008; Montepare and Zebrowitz 1998). Therefore, controlling these appearance qualities ensures that differences in impressions of the faces reflect a response to their emotion resemblance, and not to face sex, attractiveness, or babyfaceness. All variables were standardized and all regression coefficients reported below also are standardized. The overall R2 values were significant for each of the six regressions: For ratings of danger, OA: R2 = .35, F(4,104) = 14.15, p < .001; YA: R2 = .33, F(4,104) = 12.75, p < .001; for ratings of warmth, OA: R2 = .43, F(4,104) = 19.47, p < .001; YA: R2 = .35, F(4,104) = 13.74, p < .001; and for ratings of naïveté, OA: R2 = .53, F(4,104) = 29.77, p < .001; YA: R2 = .56, F(4,104) = 33.29, p < .001.
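A minimal sketch of one such regression (here, predicting OA danger ratings) using hypothetical face-level data and the statsmodels OLS interface; the variable names are illustrative, not taken from the authors' materials.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical face-level data: one row per face (N = 109), with the mean OA
# rating and the model-derived emotion-resemblance scores as columns.
rng = np.random.default_rng(2)
df = pd.DataFrame({
    "danger_oa": rng.normal(size=109),
    "face_sex": rng.integers(0, 2, size=109),      # 0 = female, 1 = male
    "attractive_oa": rng.normal(size=109),
    "babyfaced_oa": rng.normal(size=109),
    "anger_resemblance": rng.normal(size=109),
    "happy_resemblance": rng.normal(size=109),
    "surprise_resemblance": rng.normal(size=109),
})

# Standardize all variables so the fitted coefficients are standardized betas.
z = (df - df.mean()) / df.std(ddof=1)

X = sm.add_constant(z.drop(columns="danger_oa"))
fit = sm.OLS(z["danger_oa"], X).fit()
print(fit.params, fit.rsquared)
```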
Emotion Overgeneralization
Faces with greater resemblance to anger were rated higher in danger by both OA (βOA = .20, p = .045) and YA (βYA = .22, p = .023). They were also rated as less naïve by OA (βOA = −.14, p = .018), whereas this effect was not significant for YA (βYA = −.06, p = .14). Anger resemblance did not significantly predict warmth ratings for either OA (βOA = .03, p = .65) or YA, although the latter showed a slight trend in the predicted direction (βYA = −.11, p = .13). Faces with greater resemblance to surprise were rated higher in naiveté by both OA (βOA = .18, p < .001) and YA (βYA = .10, p = .003). Surprise resemblance did not predict ratings of warmth or danger (all βs < .1, all ps > .25). Resemblance to happiness did not significantly predict any of the ratings (all βs < .07, all ps > .28), although OA ratings of warmth showed a slight trend in the predicted direction (βOA = .09, p = .12).
In order to examine age differences in the magnitude of the emotion overgeneralization coefficients, we performed planned comparisons, using the approach described in Rosenthal and Rosnow (1984, p. 506) to examine whether two effects are significantly different from one another. This test involves converting effect sizes to Z-scores and testing whether the Z-scores of the two effects differ significantly. With one exception, these analyses revealed no significant age differences in emotion overgeneralization (all ts < 1.14, ps > .3). The exception was a marginally significant tendency for anger resemblance to have a stronger influence on YA than OA warmth ratings, t(120) = 1.81, p = .072. However, as noted above, resemblance to anger did not significantly influence warmth ratings for either age group, which calls for caution in interpreting this effect.
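Rosenthal and Rosnow (1984) describe this class of comparisons; one common form converts each effect size r to Fisher's Z and tests the difference between two independent effects, as in the sketch below (hypothetical values; not necessarily the exact computation used in the study).

```python
import math

def compare_effect_sizes(r1: float, n1: int, r2: float, n2: int) -> float:
    """Z statistic for the difference between two independent correlations,
    via Fisher's r-to-z transformation (one common form of the comparison)."""
    z1 = math.atanh(r1)  # Fisher's Z for the first effect
    z2 = math.atanh(r2)  # Fisher's Z for the second effect
    se = math.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))
    return (z1 - z2) / se

# Hypothetical example: comparing an OA and a YA overgeneralization effect,
# each estimated across 109 faces.
print(compare_effect_sizes(0.20, 109, 0.22, 109))
```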
Other Appearance Qualities
We summarize here the effects of attractiveness and babyfaceness ratings across the six regression analyses. Attractiveness predicted lower danger and higher warmth ratings for both OA and YA: danger (βOA = −.59 to −.61, all ps < .001; βYA = −.54 to −.60, all ps < .001), warmth (βOA = .54 to .55, all ps < .001; βYA = .43 to .45, all ps < .001). Attractiveness also predicted lower ratings of naiveté by YA (βYA = −.10 to −.12, all ps < .015), with only nonsignificant trends for OA ratings (βOA = −.10 to −.13, ps = .085–.14). On the other hand, babyfaceness predicted higher naiveté ratings for both OA (βOA = .56 to .62, all ps < .001) and YA (βYA = .45 to .46, all ps < .001). Babyfaceness also predicted lower ratings of danger by YA (βYA = −.33 to −.40, all ps < .002) but not OA (βOA = −.10 to −.18, ps = .10–.29); the same pattern held for higher ratings of warmth (βYA = .25 to .28, all ps < .002; βOA = .02 to .03, all ps > .5).
Discussion
Our results reveal that the tendency for YA impressions of neutral expression faces to be influenced by their objective resemblance to emotion expressions extends to OA, with no significant age differences in the magnitude of these effects. Both OA and YA perceived greater danger in neutral expression faces that showed greater resemblance to anger, consistent with previous emotion overgeneralization results for YA (Zebrowitz et al. 2010). Both also perceived greater naiveté in faces that showed greater resemblance to surprise. In addition, OA perceived less naiveté in neutral expression faces that showed more resemblance to anger, consistent with evidence that people with highly-expressive angry expressions look less naïve (Zebrowitz et al. 2007). Finally, it is noteworthy that the effect of anger resemblance on impressions of danger and the effect of surprise resemblance on impressions of naiveté held true controlling for attractiveness and babyfaceness, which were associated with danger and naiveté ratings by OA and/or YA.
One may question whether these results should be construed as emotion overgeneralization or accuracy. Had we asked participants to tell us what emotion the faces resemble and their choices of anger, surprise, or happy matched the emotion identified by the computer modeling, then it would be appropriate to talk about accuracy. However our questions focused on trait impressions, which is why we frame our results in terms of emotion overgeneralization. For example, if participants rate faces that look subtly surprised as more naïve, we cannot call that accuracy unless we know that the people whose faces more resembled surprise are in fact more naïve. Nevertheless, it is true that participants' trait impressions reveal that they are detecting a subtle, objective resemblance of the faces to an emotion expression, and it is striking that OA do this to the same extent as YA.
Unlike resemblance to anger and surprise, resemblance to happy expressions did not influence the trait impressions of either YA or OA. This result may be due to the restriction of faces used in the present study to the Caucasian faces from Zebrowitz et al. (2010), who found that Caucasian judges showed happy emotion overgeneralization effects on impressions of the likeability of Black and Korean faces but not White faces. On the other hand, they showed stronger anger emotion overgeneralization effects on impressions of the danger of White than Black faces. The moderation of the overgeneralization effects by face race was attributed to the baseline level of impressions, with negative emotion resemblance affecting impressions of own-race faces that were rated relatively positively, and happy resemblance affecting impressions of other-race faces that were rated relatively negatively.
Our finding that OA trait impressions were sensitive to very subtle emotion information conveyed in neutral expression faces, with emotion overgeneralization effects equal to those shown by YA, stands in contrast to the well-documented OA deficits in labeling basic emotion expressions as well as more complex mental states and traits on the Eyes Test (e.g., Pardini and Nichelli 2009), which was replicated in the present study. This contrast suggests a dissociation in the processes that are engaged by traditional emotion labeling tasks and the Mind in the Eyes Test and those being tapped by our emotion overgeneralization task. The latter may engage processes more like those involved when angry faces in a neutral array “pop out” more than happy faces do for both OA and YA (Hahn et al. 2006; Mather and Knight 2006; Ruffman et al. 2009).
One possible explanation for the discrepancies between OA emotion labeling deficits and tasks that show intact emotional processing is that the labeling tasks involve controlled processing aimed at getting the ‘correct’ response, whereas the anger “pop out” effects and the emotion overgeneralization first impressions may both engage more automatic processing. Studies of YA neural activation during emotion labeling tasks support the argument that they involve controlled processing. Compared with passive viewing, which is an automatic processing task, emotion labeling yields a reduction in amygdala activation and an increase in prefrontal cortex activation, a signature of controlled processing (Hariri et al. 2000; Lange et al. 2003). Previous work has suggested that automatic, as compared to controlled, processing remains relatively intact in OA (Peters et al. 2007). As considerable research suggests the automatic nature of trait inferences (Bar et al. 2006; Dovidio et al. 1997; Rule and Ambady 2008; Willis and Todorov 2006), OA emotion overgeneralization may be due to intact automatic processing of subtle cues to emotion in their first impressions of faces. Although this provides a plausible explanation for our results, it would be useful to examine OA emotion overgeneralization using shorter stimulus presentations that more directly implicate automatic processing.
The distinction between automatic and controlled processing also may explain why previous research found that age differences in accurate emotion labeling explained age differences in the detection of deceit (Stanley and Blanchard-Fields 2008; Ruffman et al. 2011) and social gaffes (Halberstadt et al. 2011), despite our finding of no age differences in first impressions of neutral faces based on their resemblance to emotions. Like emotion labeling, detecting deception and gaffes arguably involves greater controlled processing than our first impression task. Participants in the present study were explicitly told that there were no right or wrong answers and that they should just give their first impression of each face. In contrast, participants in the study of social gaffe detection were asked to rate how appropriate behaviors seemed after being told that some were socially appropriate and others were socially inappropriate, and those in studies of deceit detection were asked to rate how truthful statements seemed after being told that some were truthful and some were lies. These tasks make clear that there are correct and incorrect responses, which is likely to engage controlled processing. Indeed, detecting social gaffes involves determining what is culturally appropriate and comparing behaviors to these expectations, while detecting deceit involves evaluating participants' stories as well as their behaviors to determine if they sound truthful.
It is also possible that stereotype threat undermines OA performance in emotion labeling and deceit and social gaffe detection. Although research on stereotype threat effects has focused on the tendency for OA memory to be adversely affected when tasks are presented in a way that raises performance anxiety due to beliefs about age-related declines in memory function (Chasteen et al. 2012), it is possible that perceptual processes like emotion, deceit, or social gaffe recognition are similarly diminished due to concerns about age-related declines in accuracy, which would not affect first impressions of faces when participants are explicitly told not to be concerned about whether their judgments are right or wrong. Although future research investigating these two possibilities would be worthwhile, our results clearly show that processing of subtle facial cues to emotion under conditions that do not engage performance anxiety is intact in OA.
Our evidence that OA are able to use threat-related information in their impressions of faces contrasts with previous evidence that OA show more positive impressions of faces than do YA, with OA rating faces as less threatening (Ruffman et al. 2006) as well as more trustworthy and less hostile (Zebrowitz et al. 2013). However, Zebrowitz et al. also found that OA and YA showed significant agreement with each other in their trait impressions, despite OA greater positivity. This suggests that OA and YA use some of the same processes in their trait impressions, and the results of the current study indicate that emotion overgeneralization is one of those processes.
OA performed worse than YA not only on the Reading the Mind in the Eyes Test, but also on several other control tasks, effects that replicated previous research findings as discussed above and demonstrated that our OA sample was typical of healthy community dwelling individuals. Yet, despite somewhat poorer face recognition ability, visual acuity, and contrast sensitivity, OA sensitivity to subtle facial resemblances to emotion was equal to that of YA. OA also reported more positive affect on the PANAS, a positivity effect that has been associated with OA greater avoidance of negative stimuli (Isaacowitz et al. 2006; Mather and Carstensen 2005). However, any such tendency did not impair their sensitivity to subtle facial cues to negative emotion, as OA were as likely as YA to form more negative impressions of faces that showed more resemblance to anger.
It should be noted that our failure to find any significant age differences cannot readily be attributed to low power to detect them, since using face as the unit of analysis (N = 109) provided reasonable power. Of course, it is possible that a larger sample of faces would allow us to detect small age differences in emotion overgeneralization. Specifically, we found nonsignificant trends suggesting that OA tended to associate happiness resemblance with greater warmth, and YA tended to associate anger resemblance with less warmth. It is possible that with a larger sample, these effects could become significant. However, the effects most likely to achieve significance with a larger sample of faces were the tendencies for OA to show stronger emotion overgeneralization for impressions of naiveté, which was likely due to lower reliability of YA ratings of this trait. It is also notable that using only younger faces provided a liberal test of age differences, since previous research has provided evidence for an own-age bias in emotion recognition (Malatesta et al. 1987), as well as other face perception tasks (Anastasi and Rhodes 2005; Bartlett and Fulton 1991; Voelkle et al. 2011). Despite the fact that own-age biases might weaken OA emotion overgeneralization when judging younger faces, they still showed effects equal to YA.
Conclusions
OA and YA showed equivalent emotion overgeneralization in trait impressions of neutral expression faces that varied subtly in their resemblance to anger or surprise, and these effects were independent of attractiveness, babyfaceness, and face sex. The anger emotion overgeneralization shown by OA suggests that any age-related reductions in the processing of negative material do not impair OA processing of subtle facial cues to negative emotions. More generally, our findings suggest that well-documented OA deficits on emotion recognition tasks may be due more to deficits in labeling expressions than to insensitivity to their social affordances, such as danger. This demonstrates the importance of the task context in assessing effects of aging, consistent with an ecological approach to aging (see Isaacowitz and Stanley 2011).
Acknowledgments
This research was supported by NIH grant AG38375 to L. A. Zebrowitz. The authors would like to thank Jasmine Boshyan and Bosiljka Milosavljevic for their help in data collection and Derek Isaacowitz for his helpful comments on an earlier draft.
Contributor Information
Robert G. Franklin, Jr., Email: rgfran@brandeis.edu, Department of Psychology, Brandeis University, MS062, Waltham, MA 02454-9110, USA.
Leslie A. Zebrowitz, Email: zebrowitz@brandeis.edu, Department of Psychology, Brandeis University, MS062, Waltham, MA 02454-9110, USA.
References
- Anastasi JS, Rhodes MG. An own-age bias in face recognition for children and older adults. Psychonomic Bulletin and Review. 2005;12(6):1043–1047. doi: 10.3758/bf03206441.
- Bar M, Neta M, Linz H. Very first impressions. Emotion. 2006;6(2):269. doi: 10.1037/1528-3542.6.2.269.
- Baron-Cohen S, Wheelwright S, Hill J, Raste Y, Plumb I. The ‘Reading the mind in the eyes’ Test revised version: A study with normal adults, and adults with Asperger syndrome or high-functioning autism. Journal of Child Psychology and Psychiatry. 2001;42(2):241–251.
- Bartlett JC, Fulton A. Familiarity and recognition of faces in old age. Memory and Cognition. 1991;19:229–238. doi: 10.3758/bf03211147.
- Benton A, Van Allen M, Hamsher K, Levin H. Test of facial recognition manual. Iowa City: Benton Laboratory of Neuropsychology; 1983.
- Bouhuys AL, Bloem GM, Groothuis TG. Induction of depressed and elated mood by music influences the perception of facial emotional expressions in healthy subjects. Journal of Affective Disorders. 1995;33(4):215–226. doi: 10.1016/0165-0327(94)00092-n.
- Chasteen AL, Kang SK, Remedios JD. Aging and stereotype threat: Development, process, and interventions. In: Inzlicht M, Schmader T, editors. Stereotype threat: Theory, process, and application. New York: Oxford University Press; 2012. pp. 202–216.
- Dovidio JF, Kawakami K, Johnson C, Johnson B, Howard A. On the nature of prejudice: Automatic and controlled processes. Journal of Experimental Social Psychology. 1997;33:510–540.
- Eagly AH, Ashmore RD, Makhijani MG, Longo LC. What is beautiful is good, but…: A meta-analytic review of research on the physical attractiveness stereotype. Psychological Bulletin. 1991;110(1):109–128.
- Eslinger PJ, Damasio AR, Benton AL, Van Allen M. Neuropsychologic detection of abnormal mental decline in older persons. The Journal of the American Medical Association. 1985;253(5):670–674.
- Folstein MF, Folstein SE, McHugh PR. Mini-mental state: A practical method for grading the cognitive state of patients for the clinician. Journal of Psychiatric Research. 1975;12:189–198. doi: 10.1016/0022-3956(75)90026-6.
- Greve KW, Stickle TR, Love JM, Bianchini KJ, Stanford MS. Latent structure of the Wisconsin card sorting test: A confirmatory factor analytic study. Archives of Clinical Neuropsychology. 2005;20(3):355–364. doi: 10.1016/j.acn.2004.09.004.
- Hahn S, Carlson C, Singer S, Gronlund SD. Aging and visual search: Automatic and controlled attentional bias to threat faces. Acta Psychologica. 2006;123(3):312–336. doi: 10.1016/j.actpsy.2006.01.008.
- Halberstadt J, Ruffman T, Murray J, Taumoepeau M, Ryan M. Emotion perception explains age-related differences in the perception of social gaffes. Psychology and Aging. 2011;26:133–136. doi: 10.1037/a0021366.
- Hariri AR, Bookheimer SY, Mazziotta JC. Modulating emotional responses: Effects of a neocortical network on the limbic system. NeuroReport. 2000;11:43–48. doi: 10.1097/00001756-200001170-00009.
- Horn JL, Cattell RB. Age differences in fluid and crystallized intelligence. Acta Psychologica. 1967;26:107–129. doi: 10.1016/0001-6918(67)90011-x.
- Isaacowitz DM, Löckenhoff CE, Lane RD, Wright R, Sechrest L, Riedel R, et al. Age differences in recognition of emotion in lexical stimuli and facial expressions. Psychology and Aging. 2007;22(1):147–159. doi: 10.1037/0882-7974.22.1.147.
- Isaacowitz DM, Stanley JT. Bringing an ecological perspective to the study of aging and recognition of emotional facial expressions: Past, current, and future methods. Journal of Nonverbal Behavior. 2011;35:261–278. doi: 10.1007/s10919-011-0113-6.
- Isaacowitz DM, Wadlinger HA, Goren D, Wilson HR. Selective preference in visual fixation away from negative images in old age? An eye-tracking study. Psychology and Aging. 2006;21(1):40. doi: 10.1037/0882-7974.21.1.40.
- Ishihara S. Test for colour-blindness. Kanehara Trading Company; 2010.
- Keating CF, Mazur A, Segall MH, Cysneiros PG, Kilbride JE, Leahy P, et al. Culture and the perception of social dominance from facial expression. Journal of Personality and Social Psychology. 1981;40(4):615.
- Kite ME, Deaux K, Haines EL. Gender stereotypes. In: Denmark FL, Paludi MA, editors. Psychology of women: A handbook of issues and theories. 2nd ed. Westport, CT: Praeger Publishers/Greenwood Publishing Group; 2008.
- Knutson B. Facial expressions of emotion influence interpersonal trait inferences. Journal of Nonverbal Behavior. 1996;20(3):165–182.
- Lange K, Williams LM, Young AW, Bullmore ET, Brammer MJ, Williams SCR, et al. Task instructions modulate neural responses to fearful facial expressions. Biological Psychiatry. 2003;53(3):226–232. doi: 10.1016/s0006-3223(02)01455-5.
- Malatesta CZ, Izard CE, Culver C, Nicolich M. Emotion communication skills in young, middle-aged, and older women. Psychology and Aging. 1987;2:193–203. doi: 10.1037//0882-7974.2.2.193.
- Martinez AM, Benavente R. The AR face database. CVC Technical Report #24; 1998.
- Mather M, Carstensen LL. Aging and motivated cognition: The positivity effect in attention and memory. Trends in Cognitive Sciences. 2005;9(10):496–502. doi: 10.1016/j.tics.2005.08.005.
- Mather M, Knight M. Goal-directed memory: The role of cognitive control in older adults' emotional memory. Psychology and Aging. 2005;20(4):554–570. doi: 10.1037/0882-7974.20.4.554.
- Mather M, Knight MR. Angry faces get noticed quickly: Threat detection is not impaired among older adults. The Journals of Gerontology, Series B. 2006;61(1):54–57. doi: 10.1093/geronb/61.1.p54.
- McArthur LZ, Baron RM. Toward an ecological theory of social perception. Psychological Review. 1983;90(3):215–238.
- Montepare JM, Dobish H. The contribution of emotion perceptions and their overgeneralizations to trait impressions. Journal of Nonverbal Behavior. 2003;27(4):237–254.
- Montepare JM, Zebrowitz LA. Person perception comes of age: The salience and significance of age in social judgments. In: Zanna MP, editor. Advances in experimental social psychology. Vol. 30. San Diego, CA: Academic Press; 1998. pp. 93–163.
- Oosterhof NN, Todorov A. The functional basis of face evaluation. Proceedings of the National Academy of Sciences of the United States of America. 2008;105(32):11087–11092. doi: 10.1073/pnas.0805664105.
- Pardini M, Nichelli PF. Age-related decline in mentalizing skills across adult life span. Experimental Aging Research. 2009;35(1):98–106. doi: 10.1080/03610730802545259.
- Peters E, Hess TM, Västfjäll D, Auman C. Adult age differences in dual information processes: Implications for the role of affective and deliberative processes in older adults' decision making. Perspectives on Psychological Science. 2007;2:1–23. doi: 10.1111/j.1745-6916.2007.00025.x.
- Piper BJ, Li V, Eiwaz MA, Kobel YV, Benice TS, Chu AM, et al. Executive function on the psychology experiment building language tests. Behavior Research Methods. 2012;44:110–123. doi: 10.3758/s13428-011-0096-6.
- Rosenthal R, Rosnow RL. Essentials of behavioral research: Methods and data analysis. New York: McGraw-Hill; 1984.
- Ruffman T, Henry JD, Livingstone V, Phillips LH. A meta-analytic review of emotion recognition and aging: Implications for neuropsychological models of aging. Neuroscience and Biobehavioral Reviews. 2008;32(4):863–881. doi: 10.1016/j.neubiorev.2008.01.001.
- Ruffman T, Murray J, Halberstadt J, Vater T. Age-related differences in deception. Psychology and Aging. 2011;27:543–549. doi: 10.1037/a0023380.
- Ruffman T, Ng M, Jenkin T. Older adults respond quickly to angry faces despite labeling difficulty. The Journals of Gerontology, Series B. 2009;64B(2):171–179. doi: 10.1093/geronb/gbn035.
- Ruffman T, Sullivan S, Edge N. Differences in the way older and younger adults rate threat in faces but not situations. The Journals of Gerontology, Series B. 2006;61(4):187–194. doi: 10.1093/geronb/61.4.p187.
- Rule NO, Ambady N. Brief exposures: Male sexual orientation is accurately perceived at 50 ms. Journal of Experimental Social Psychology. 2008;44(4):1100–1105.
- Said CP, Sebe N, Todorov A. Structural resemblance to emotional expressions predicts evaluation of emotionally neutral faces. Emotion. 2009;9(2):260–264. doi: 10.1037/a0014681.
- Salthouse TA. Speed and knowledge as determinants of adult age differences in verbal tasks. Journal of Gerontology. 1993;48:29–36. doi: 10.1093/geronj/48.1.p29.
- Salthouse TA. The processing-speed theory of adult age differences in cognition. Psychological Review. 1996;103(3):403–428. doi: 10.1037/0033-295x.103.3.403.
- Secord PF. Facial features and inference processes in interpersonal perception. In: Tagiuri R, Petrullo L, editors. Person perception and interpersonal behavior. Stanford, CA: Stanford University Press; 1958. pp. 300–315.
- Shipley WC. Institute of living scale. Los Angeles: Western Psychological Services; 1946.
- Stanley J, Blanchard-Fields F. Challenges older adults face in detecting deceit: The role of emotion recognition. Psychology and Aging. 2008;23(1):24–32. doi: 10.1037/0882-7974.23.1.24.
- Voelkle MC, Ebner NC, Lindenberger U, Riediger M. Let me guess how old you are: Effects of age, gender, and facial expression on perceptions of age. Psychology and Aging. 2011. doi: 10.1037/a0025065.
- Watson D, Clark LA, Tellegen A. Development and validation of brief measures of positive and negative affect: The PANAS scales. Journal of Personality and Social Psychology. 1988;54(6):1063–1070. doi: 10.1037//0022-3514.54.6.1063.
- Willis J, Todorov A. First impressions: Making up your mind after a 100-ms exposure to a face. Psychological Science. 2006;17(7):592–598. doi: 10.1111/j.1467-9280.2006.01750.x.
- Zebrowitz LA. Physical appearance as a basis of stereotyping. In: MacRae N, Hewstone M, Stangor C, editors. Foundations of stereotypes and stereotyping. New York: Guilford Press; 1996. pp. 79–120.
- Zebrowitz LA. Reading faces: Window to the soul? Boulder, CO: Westview Press; 1997.
- Zebrowitz LA, Bronstad MP, Montepare JM. An ecological theory of face perception. In: Adams R, Ambady N, Nakayama K, Shimojo S, editors. The science of social vision. Oxford: Oxford University Press; 2011. pp. 3–30.
- Zebrowitz LA, Franklin RG, Hillman S, Boc H. Comparing older and younger adults' first impressions from faces. Psychology and Aging. 2013;28:202–212. doi: 10.1037/a0030927.
- Zebrowitz LA, Kikuchi M, Fellous J. Are effects of emotion expression on trait impressions mediated by babyfaceness? Evidence from connectionist modeling. Personality and Social Psychology Bulletin. 2007;33(5):648–662. doi: 10.1177/0146167206297399.
- Zebrowitz LA, Kikuchi M, Fellous JM. Facial resemblance to emotions: Group differences, impression effects, and race stereotypes. Journal of Personality and Social Psychology. 2010;98(2):175–189. doi: 10.1037/a0017990.