Author manuscript; available in PMC: 2010 Jun 1.
Published in final edited form as: Emotion. 2009 Jun;9(3):329–339. doi: 10.1037/a0015179

Young and Older Emotional Faces: Are there Age-Group Differences in Expression Identification and Memory?

Natalie C Ebner 1, Marcia K Johnson 1
PMCID: PMC2859895  NIHMSID: NIHMS149737  PMID: 19485610

Abstract

Studies finding that older compared to young adults are less able to identify facial expressions and have worse memory for negative than positive faces have used only young faces. Studies finding that both age groups are more accurate at recognizing faces of their own than other ages have used mostly neutral faces. Thus, age-differences in processing faces may not extend to older faces, and preferential memory for own-age faces may not extend to emotional faces. To investigate these possibilities, we had young and older participants view young and older faces presented either with happy, angry, or neutral expressions and identify the expressions displayed, and then complete a surprise face recognition task. Older compared to young participants were less able to identify expressions of angry young and older faces and (based on participants’ categorizations) remembered angry faces less well than happy faces. There was no evidence of an own-age bias in memory, but self-reported frequency of contact with young and older adults and awareness of own emotions played a role in expression identification of and/or memory for young and older faces.

Keywords: age-differences, expression identification, face recognition, own-age bias, frequency of social contact


Attention and memory are selective; that is, people attend more to some information than to other information and remember some of their experiences better than others. Several factors influence whether information is attended to and encoded in memory. From earlier studies we know, for example, that emotionally evocative information is more likely to be attended to and recalled than emotionally neutral information (Bradley, Greenwald, Petry, & Lang, 1992; Charles, Mather, & Carstensen, 2003; Ochsner, 2000).

In the present study we were interested in a specific type of information, namely human faces. Human faces constitute a unique category of objects that we see frequently in our daily lives from very early on, and other people are of great relevance for our physical, social, and emotional well-being. During the course of our lives we are exposed to, and learn to recognize, a large number of faces. Faces are characterized by a high level of similarity, and yet processing and recognizing faces is a task at which we become very skilled. With increasing age, however, various cognitive and perceptual abilities decline (Faubert, 2002; Park, Polk, Mikels, Taylor, & Marshuetz, 2001; Salthouse, 2004), including slower responding and reduced accuracy on tasks requiring perceptual matching of faces (Grady, McIntosh, Horwitz, & Rapoport, 2000) and reduced accuracy in face recognition (Bartlett, Leslie, Tubbs, & Fulton, 1989; Crook & Larrabee, 1992; Grady et al., 1995). Here we are particularly interested in age-group differences in the identification of facial expressions and in memory for faces, as discussed next.

Age-Group Differences in Facial Expression Identification and Memory for Emotional Faces

The overall pattern of results regarding age-group differences in facial expression identification is very consistent. As summarized in a recent meta-analysis by Ruffman, Henry, Livingstone, and Phillips (2008) that considered data across 15 published studies from 962 young (mean age 24 years) and 705 older participants (mean age 70 years), the predominant pattern is that of age-related decline in identification of emotions expressed across different modalities (faces, voices, bodies, matching faces to voices): Older compared to young adults are worse at identifying facial expressions of anger, sadness, and fear, with age-group differences in the same direction but substantially smaller for happy and surprised faces. In addition, older adults have more difficulty identifying anger, sadness, and fear, compared to disgust, surprise, and happiness, whereas young adults have more difficulty identifying fear and disgust, followed by anger, surprise, sadness, and happiness.

In contrast, evidence about age-group differences in attention to and memory for emotional faces is less consistent. Some studies find that for older adults positive material (e.g., happy faces) is favored over negative material (e.g., angry, sad, or fearful faces) in memory, and that negative material is less preferentially attended to or is selectively forgotten in older but not in young adults (for a review, see Mather & Carstensen, 2005). For example, Mather and Carstensen (2003) showed participants pairs of faces, one emotional (happy, sad, angry) and one neutral, followed by a dot that appeared in the location of one of the faces. Participants were to press a key corresponding to the location of the dot as fast as possible. Older adults responded more slowly to the dot when it was presented on the same side as a negative face than when it appeared on the same side as a neutral face, and faster when the dot appeared on the same side as a happy face than on the same side as a neutral face. Young adults did not exhibit an attention bias toward any face category. In a study recording eye movements during free viewing of faces, older adults looked less at angry and fearful faces than young adults did, but the age groups did not differ with respect to happy or sad faces (Isaacowitz, Wadlinger, Goren, & Wilson, 2006; but see Sullivan, Ruffman, & Hutton, 2007). With respect to memory, Mather and Carstensen (2003) showed that older but not young adults remembered positive faces better than negative faces. Grady, Hongwanishkul, Keightley, Lee, and Hasher (2007) found that young but not older adults showed better memory for negative than for positive or neutral faces. Leigland, Schulz, and Janowsky (2004) found that both age groups had better memory for neutral and positive than for negative faces but that older adults recognized proportionally fewer of the negative faces than young adults did. Finally, D’Argembeau and Van der Linden (2004) showed that both age groups were better at remembering happy than angry faces.

Taken together, the findings to date consistently suggest that older compared to young adults are worse at identifying facial expressions. Furthermore, several studies suggest that young and older adults differ in their attention and memory biases, with older but not young adults showing an attention preference toward positive and away from negative faces and better memory for positive than negative faces. Nevertheless, evidence regarding memory for emotional faces is more mixed. Inconsistencies across studies may result from differences in the tasks used and, perhaps more important, from age-group differences in facial expression identification, which are likely to have an impact on which faces are remembered. That is, age-group differences in memory for emotional faces should be examined in relation to participants’ own categorizations of facial expressions, as done in the present study. Objective (i.e., normative) categorizations of expressions are usually based on ratings from young (or middle-aged) adults, and assessing memory for normatively categorized faces may underestimate older adults’ memory for faces they do identify as showing a certain expression (e.g., anger).

Potential Factors Underlying Age-Group Differences in Facial Expression Identification and Memory for Emotional Faces

Why might there be age-group differences in facial expression identification and/or memory? There is evidence that the ability to understand and regulate one’s own feelings and emotions and to appraise emotions in others is well preserved or even increases with age (Blanchard-Fields, 2007; Carstensen, Isaacowitz, & Charles, 1999), perhaps due to age-related changes in motivational orientation that render emotional goals more important (Carstensen, 1992). We might also expect that practice, that is, experience-based factors such as accumulated life and interpersonal experience, would increase older adults’ ability to identify their own and other people’s emotions and feeling states. Both of these considerations would predict better overall expression identification and memory for emotional faces in older compared to young adults, which clearly has not been found (Mather & Carstensen, 2003; Ruffman et al., 2008). There is also evidence that older adults become more motivated to maximize positive affect and minimize negative affect as an adaptive emotion regulation strategy (Carstensen et al., 1999). This should result in an impaired ability to identify negative but not necessarily positive facial expressions and in better memory for positive than negative faces, predictions that, as noted above, are partly consistent with the literature (Carstensen & Mikels, 2005; Mather & Carstensen, 2005).

Another potentially important factor is general age-related cognitive decline (e.g., in processing speed; Salthouse, 2000). However, there is no evidence to date that differences in processing speed contribute to differences in expression identification or memory for emotional faces (Keightley et al., 2006; Ruffman et al., 2008). Finally, there is evidence that some regions that are involved in emotional face processing, such as frontal and temporal regions, show substantial age-related changes (Gunning-Dixon et al., 2003; Iidaka et al., 2002), which might contribute to age-related deficits in facial expression identification and/or memory (Calder et al., 2003; Ruffman et al., 2008).

Own-Age Bias in Face Processing

It is important to note that studies conducted so far in the domain of age-group differences in processing emotional faces have used faces of young (some also including middle-aged) individuals but have not systematically varied the age of the presented faces. This might have been, at least in part, due to the lack of appropriate research stimuli (i.e., faces of different ages showing varying facial expressions). Research using neutral faces has shown, however, that adults of different ages are more accurate and faster on recognition of faces of their own as opposed to other ages (referred to as the “own-age bias”; Bäckman, 1991; Lamont, Stewart-Williams, & Podd, 2005). Such findings suggest that the age-relevance of the face constitutes one important factor that influences how faces are attended to, encoded, and/or remembered. Evidence of an own-age bias clearly challenges any interpretation of observed age-group differences in emotional face processing, as older participants may have been at a disadvantage relative to young participants when stimuli were faces of only young individuals. The present study therefore examined the own-age bias in face processing in emotional (happy, angry) as well as neutral faces and explicitly asked whether the age of the face contributes to age-group differences in facial expression identification and/or memory for emotional faces.

The own-age effect is sometimes explained by the amount of exposure and frequency of contact individuals have with persons of their own as opposed to other age groups (Anastasi & Rhodes, 2006; Mason, 1986). It is assumed that daily routines and environments typically result in more frequent encounters with own-age than other-age persons. Consequently, people are more familiar with, or skilled at, processing and remembering own-age than other-age faces. It is also possible that individuals are more motivated to attend to and remember own-age faces at the cost of other-age faces, as these two types of faces are differentially likely to represent potential interaction partners and are thus differentially interesting and socially relevant. Consequently, less effort might be invested in decoding expressions of, and remembering, other-age than own-age faces. Indeed, Bartlett and Fulton (1991) found that older compared to young participants rated novel older individuals’ faces as more familiar, whereas young faces were rated as more familiar by young than by older participants. However, to our knowledge, the present study is the first to explicitly examine the relation between self-reported frequency of contact with own-age and other-age individuals and expression identification of and memory for own-age and other-age faces.

Influence of Awareness of Own Feelings

Another factor that might influence the ability to identify facial expressions and remember emotional faces is individual differences in awareness of own feelings. A person who is aware of his or her own emotional states might have more interest in, and might be more sensitive to, other people’s feeling states, and might therefore be better at identifying facial expressions and remembering emotional faces. The few studies addressing this issue suggest that difficulties in identifying and describing one’s own emotions are negatively related to accurate identification of emotional faces (Parker, Taylor, & Bagby, 1993). Furthermore, more emotionally aware older (but not young) adults seem better able to identify angry facial expressions (Keightley et al., 2006) and to remember negative faces (Grady et al., 2007).

Overview of the Present Study

The present study (a) varied the age and the expression of the presented faces; (b) combined investigation of both age-group differences in facial expression identification and memory for emotional faces; (c) used participants’ subjective instead of normative categorizations of facial expressions to take into account potential age-group differences in expression identification in evaluating emotional face memory; and (d) assessed two factors that might contribute to differences in facial expression identification and/or memory for emotional faces, self-reported frequency of contact with the own and the other age group and awareness of own feelings.

Based on the literature, we expected that both age groups would be better at identifying expressions for own-age than other-age faces and that, overall, older compared to young adults would be less able to correctly identify angry and perhaps neutral and happy facial expressions. With respect to memory, we expected that both age groups would be better at remembering own-age as opposed to other-age faces, and that older, but not young, participants would be relatively better at remembering happy than angry or neutral faces. We hypothesized that both age groups would report more frequent contact with people of their own than the other age group, and that more frequent, self-reported own-age contact would show a positive correlation with the ability to identify facial expressions of, and remember, own-age faces and a negative correlation for other-age faces. Finally, we hypothesized that emotional awareness of own feelings would predict better facial expression identification and memory for emotional faces.

Methods

Participants

Thirty-two young adults (age range 18–22 years, M = 19.3, SD = 1.34) were recruited through the university’s undergraduate participant pool, and 24 older adults (age range 65–84 years, M = 74.8, SD = 4.78) were recruited from the community via flyers in, for example, retirement communities and senior citizen centers. Fifty-six percent of the young participants and 50 % of the older participants were female. All of the young participants were Yale University undergraduates (varying majors). Older participants reported a mean of 15.7 years of education (SD = 2.4). Young and older participants differed with respect to their demographic distribution (χ2(1, N = 56) = 11.46, p < .01; young participants: Hispanic/Latino (13 %), Asian (28 %), Black/African American (16 %), and White (63 %); older participants: White (100 %; one older participant reported being of both White and Hispanic/Latino origin)). In addition, 88 % of the participants indicated that they were born and had always lived in the USA, whereas 12 % had moved to the USA later in their lives, with no significant differences between the two age groups. Neither factor (i.e., White versus Non-White; born in USA versus moved to USA later) had an influence on facial expression identification (White versus Non-White1: tY(32) = .07, p = .95; born in USA versus moved to USA later: tY(32) = −.13, p = .90; tO(24) = −.53, p = .60) or face recognition (White versus Non-White1: tY(32) = −.71, p = .48; born in USA versus moved to USA later: tY(32) = 1.37, p = .18; tO(22) = −1.44, p = .16) in young or older participants (note that all presented faces were Caucasian), and these factors were therefore not considered further.

The age groups differed in their visual motor processing speed, with young participants scoring higher than older participants (MY = 68.6, SD = 9.31; MO = 44.7, SD = 9.74; F(1, 54) = 87.2, p < .01, ηp2 = .62; max score = 93), but they did not differ in their vocabulary (max score = 30; MY = 22.9, SD = 4.20; MO = 21.7, SD = 4.85). Neither visual motor processing speed nor vocabulary was related to facial expression identification (visual motor processing speed: βY = −.19, tY(32) = −1.06, p = .30; βO = .22, tO(24) = 1.04, p = .31; vocabulary: βY = .05, tY(32) = .26, p = .80; βO = .22, tO(24) = −.35, p = .73) or face recognition (visual motor processing speed: βY = −.01, tY(32) = −.05, p = .96; βO = .40, tO(22) = 1.96, p = .06; vocabulary: βY = −.11, tY(32) = −.61, p = .55; βO = .33, tO(22) = 1.58, p = .13) in young or older participants. The age groups did not differ in their general physical health and general emotional well-being, but older participants reported more current positive affect (MY = 2.63, SD = 0.62; MO = 3.54, SD = 0.74; F(1, 54) = 25.26, p < .01, ηp2 = .32) and less current negative affect (MY = 1.39, SD = 0.52; MO = 1.12, SD = 0.16; F(1, 54) = 5.95, p < .05, ηp2 = .10) than young participants.

Procedure, Design, Measures, and Materials

Overview

Participants were first informed about the testing procedure and signed a consent form. They then filled out the short version of the Positive and Negative Affect Schedule (PANAS; Watson, Clark, & Tellegen, 1988) as a measure of current positive and negative affect. They indicated on a scale ranging from 1 (“very slightly or not at all”) to 5 (“extremely”) the extent to which they felt the respective emotions (e.g., active, inspired, scared, upset) at the present moment. Higher scores indicated more positive affect and more negative affect, respectively (αPositive affect = .94, αNegative affect = .89). Next, they worked on the Facial Expression Identification and Face Recognition Task (described below). For both phases of this task the experimenter gave verbal instructions, and the computer program provided additional written instructions and several practice runs. Between the encoding phase (i.e., facial expression identification) and the face recognition phase, participants filled in a short demographic questionnaire on paper. The retention interval ranged from 4:10 to 6:34 minutes (M = 5:05, SD = 0:23), with no significant age-group difference. Next, participants completed an abbreviated version of the vocabulary subtest of the Wechsler Adult Intelligence Scale (WAIS; Wechsler, 1981) and responded to various questionnaires on paper, including items on general physical health (“In general [i.e., over the past year], how would you rate your health and physical well-being?”; response options: 1 = “poor” to 5 = “excellent”), general emotional well-being (“In general [i.e., over the past year], how would you rate your emotional well-being?”; response options: 1 = “depressed” to 5 = “nearly always upbeat and happy”), and items from the Trait Meta-Mood Scale (TMMS; Salovey, Mayer, Goldman, Turvey, & Palfai, 1995). At the end of the session, the Digit-Symbol Substitution test (Wechsler, 1981) was administered as a measure of visual motor processing speed, and participants responded to several items on frequency of contact with persons of their own and the other age group. Young participants were compensated with course credit and older participants received monetary compensation for their participation.

Facial Expression Identification and Face Recognition

The experiment consisted of two main phases (see Figure 1): (1) an incidental encoding phase: the Facial Expression Identification Task, and (2) the Face Recognition Task. The stimulus presentation was controlled using E-Prime (Schneider, Eschman, & Zuccolotto, 2002). During both phases, responses as well as response times were recorded.

Figure 1. Experimental Tasks: (A) Facial Expression Identification Task and (B) Face Recognition Task.

During the Facial Expression Identification Task, participants were shown color photographs of 48 individual faces, one at a time. Each face displayed either a happy, an angry, or a neutral facial expression. Participants were asked to decide as quickly and as accurately as possible whether the displayed face showed a happy, an angry, or a neutral expression, or none of these three but some other expression (e.g., surprise or fear), and to press the corresponding button on the keyboard.2 This procedure was based on propositions from Discrete Emotions Theory (Izard, 1977; Izard & Malatesta, 1987) that emotions are discrete from one another and packaged with distinctive sets of bodily and facial reactions. To standardize encoding time, the presentation time for each face was fixed at five seconds, independent of the button press.

The presented faces belonged to one of two age groups (young faces: 18–31 years, older faces: 69–80 years). Within each age group, there were equal numbers of male and female faces. Eight stimuli were presented for each age of face by facial expression combination. The pairing of a specific face with a specific expression was counterbalanced across participants. The presentation orders were pseudo-randomized with the constraints that within every twelve items each combination of the categories (age of face, gender of face, facial expression) was represented and that no more than two faces of the same category appeared in a row. The orders were further controlled for the attractiveness of the faces as rated by six independent raters (see information below) as well as for the hair color of the persons displayed.
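To make these ordering constraints concrete, the sketch below generates one such 48-trial encoding order. It is illustrative only: the category structure (2 face ages × 2 genders × 3 expressions, each represented once per 12-trial block) follows the description above, but the rejection-sampling approach and function names are assumptions, and the matching for attractiveness and hair color is not modeled.

```python
import itertools
import random

# 2 face ages x 2 genders x 3 expressions = 12 stimulus categories;
# each 12-trial block contains every category once (4 blocks = 48 encoding trials).
CATEGORIES = list(itertools.product(["young", "older"],
                                    ["female", "male"],
                                    ["happy", "angry", "neutral"]))

def longest_run(order):
    """Length of the longest run of consecutive trials sharing the same category."""
    longest = run = 1
    for prev, cur in zip(order, order[1:]):
        run = run + 1 if cur == prev else 1
        longest = max(longest, run)
    return longest

def make_order(n_blocks=4, max_run=2):
    """Shuffle each 12-category block independently and verify the run constraint.

    With every category appearing exactly once per block, runs can never exceed
    two, so the check is redundant here; it is kept to document the constraint
    and to guard other block configurations."""
    while True:
        order = []
        for _ in range(n_blocks):
            block = CATEGORIES[:]
            random.shuffle(block)
            order.extend(block)
        if longest_run(order) <= max_run:
            return order

if __name__ == "__main__":
    order = make_order()
    print(len(order), order[:3])  # 48 trials; show the first three category tuples
```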

During the Face Recognition Task, participants were shown 72 faces (48 target and 24 distracter faces). Again, each face displayed either a happy, an angry, or a neutral facial expression. Twelve stimuli were presented for each age of face by facial expression combination. Participants first made an old–new judgment (“yes”, “no”) for each presented face and then indicated the confidence level of their decision on a scale ranging from 1 (“not certain at all”) to 4 (“very certain”). The next face appeared on the screen as soon as participants had given their responses. There were equal numbers of male and female young and older distracter faces. Target and distracter faces were counterbalanced across participants (i.e., one half of the target faces in three of the presentation orders were used as distracter faces in the other three orders). Target faces from each quarter of the presentation orders at encoding were equally distributed across the presentation orders at recognition, with the distracter faces pseudo-randomly distributed in between. No more than two faces of the same category (age of face, gender of face, facial expression) and no more than three target or distracter faces appeared in a row.

The faces used in this experiment were taken from the FACES database (for detailed information, see Ebner, Riediger, & Lindenberger, 2008). Based on independent ratings of attractiveness and distinctiveness from four young and two older adults for 114 young and older neutral faces from this database3, we selected 72 faces for which young and older raters’ evaluations of attractiveness and distinctiveness did not differ, and that displayed sufficiently unblended versions of happy, angry, and neutral expressions.

Self-Reported Frequency of Contact with Own and Other Age Group

Participants reported the frequency of their personal contact (“How often do you have personal (i.e., face-to-face) contact with young adults/older adults (approx. between 18 to 30 years of age/approx. 65 years of age and older)?”) and of other types of contact (“How often do you have other types of contact (e.g., phone, e-mail, letter) with young adults/older adults (approx. between 18 to 30 years of age/approx. 65 years of age and older)?”) with their own and the other age group. Response options ranged from 1 to 8 (“less often”, “once per year”, “2–3 times per year”, “once per month”, “2–3 times per month”, “once per week”, “2–3 times per week”, “daily”). As personal and other types of contact were highly correlated (contact with young adults: r = .76, p < .01; contact with older adults: r = .73, p < .01), we computed one score for self-reported frequency of contact with young adults and one score for self-reported frequency of contact with older adults, with higher scores indicating more self-reported exposure to the respective age group.
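Concretely, collapsing the two contact items into one score per target age group can look roughly like the following; the column names are hypothetical and averaging the two items is an assumption, since the text states only that one composite score per age group was computed.

```python
import pandas as pd

# Hypothetical item-level responses on the 1-8 frequency scale.
df = pd.DataFrame({
    "personal_contact_young": [8, 7, 6],
    "other_contact_young":    [8, 8, 5],
    "personal_contact_older": [3, 5, 4],
    "other_contact_older":    [4, 5, 3],
})

# Check that personal and other types of contact correlate highly before collapsing them.
print(df["personal_contact_young"].corr(df["other_contact_young"]))

# One composite score per target age group (higher = more self-reported exposure).
df["contact_young"] = df[["personal_contact_young", "other_contact_young"]].mean(axis=1)
df["contact_older"] = df[["personal_contact_older", "other_contact_older"]].mean(axis=1)
print(df[["contact_young", "contact_older"]])
```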

Attention to and Clarity of Own Feelings

We administered a selection of seven items from two of the subscales (four items from the subscale ‘attention to feelings’, three items from the subscale ‘clarity of feelings’) of the TMMS to assess participants’ attention to and clarity of own feelings. Participants indicated the extent to which they agreed with each of the statements with respect to their own emotional life. Response options ranged from 1 (“strongly disagree”) to 5 (“strongly agree”). We computed one composite mean score with higher scores indicating more attention to and clarity of own feelings.

Results

Facial Expression Identification

Table 1 shows the percent correct facial expression identification for each of the face categories. Two (age group of participant) × 2 (age group of face) repeated-measures analyses of variance (ANOVAs), conducted separately for each of the three facial expressions, showed that participants were better at identifying expressions in young than in older faces for angry (Wilks’ λ = .87, F(1, 54) = 8.31, p < .01, ηp2 = .13) and neutral faces (Wilks’ λ = .78, F(1, 54) = 15.27, p < .01, ηp2 = .22), but age of face did not matter for happy faces. In addition, young and older participants differed in identification of angry (F(1, 54) = 5.00, p < .05, ηp2 = .09) but not happy or neutral faces (see Figure 2). An additional comparison of the types of expressions found that participants were better at identifying happy than angry faces (t(55) = 37.77, p < .01) or neutral faces (t(55) = 24.16, p < .01), and were worse at identifying angry than neutral faces (t(55) = −3.29, p < .01). Table 2 shows correct and erroneous categorizations of happy, angry, and neutral faces separately for young and older participants. Most errors made by young and older participants were assignments of angry or neutral faces to the ‘other’ category.
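For readers who want to rerun this kind of analysis, a 2 (participant age group, between subjects) × 2 (face age group, within subjects) design can be tested, for example, with the pingouin package. The reported Wilks’ λ values indicate the original analyses used a multivariate GLM, which for a two-level within-subject factor yields F tests equivalent to the univariate mixed ANOVA sketched here; the data, column names, and package choice are illustrative assumptions, not the authors’ analysis code.

```python
import pandas as pd
import pingouin as pg

# Illustrative long-format data: one row per participant x face-age cell,
# e.g., percent correct identification of angry expressions.
df = pd.DataFrame({
    "participant": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
    "participant_group": ["young"] * 6 + ["older"] * 6,
    "face_age": ["young", "older"] * 6,
    "pct_correct": [84, 70, 80, 74, 82, 72, 70, 64, 66, 60, 68, 62],
})

# 2 (participant age group, between) x 2 (face age group, within) mixed ANOVA.
aov = pg.mixed_anova(data=df, dv="pct_correct", within="face_age",
                     subject="participant", between="participant_group")
print(aov.round(3))
```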

Table 1.

Percent Correct Facial Expression Identification

                        Young Participants          Older Participants
                        Mean        SD              Mean        SD

Happy Faces
    Young Faces         98.0        4.61            96.9        6.65
    Older Faces         96.5        6.53            94.8        8.17
Angry Faces
    Young Faces         82.0        21.52           67.7        26.81
    Older Faces         71.9        18.24           62.5        21.81
Neutral Faces
    Young Faces         89.8        14.35           81.8        18.42
    Older Faces         77.3        20.19           72.4        25.53

Note. Young participants missed responding to 1.0 % of the trials and older participants missed responding to 2.9 % of the trials within the five-second time interval.

Figure 2. Age-Group Differences in Facial Expression Identification for Happy, Angry, and Neutral Faces.

Table 2.

Percent Correct Reponses and Errors in Identification of Facial Expressions

                        Categorized as
                        Happy      Angry      Neutral      Other
Young Participants
    Happy Faces         97.3       0.0        0.8          1.0
    Angry Faces         0.6        77.0       3.3          18.2
    Neutral Faces       1.8        4.9        83.6         8.8
Older Participants
    Happy Faces         95.8       0.5        1.3          1.0
    Angry Faces         1.6        65.1       5.7          23.4
    Neutral Faces       1.3        2.9        77.1         15.9

Note. Rows do not add up to 100 % because young participants missed responding to 1.0 % of the trials and older participants missed responding to 2.9 % of the trials within the five-second time interval. Values on the diagonal (e.g., happy faces categorized as happy) are correct identifications of facial expressions; all other values are erroneous identifications.

Recognition Memory

The results of the facial expression identification task indicated that subjective categorizations of facial expressions deviated in some cases from the objectively assigned categories and that the age groups differed in this regard (see Table 2). To take these age-group differences into account when examining memory for emotional faces, we considered participants’ subjective categorizations of facial expressions. That is, for each participant, each item was assigned to an emotion category based on that participant’s expression identification responses. Thus, if a participant did not identify a face as angry, it was not included in the analysis of angry faces. To investigate whether young and older participants differed in their corrected face recognition of young and older faces for happy, angry, or neutral faces, we conducted 2 (age group of participant) × 2 (age group of face) repeated-measures ANOVAs separately for each of the three facial expressions.4 Corrected face recognition, the dependent variable, was the percentage of correctly recognized target (‘hit’) faces minus the percentage of incorrectly recognized distracter (‘false alarm’) faces. Age of the face did not produce significant effects for any of the facial expressions; thus the data shown in Figure 3 are collapsed across this factor.5 As shown in Figure 3, young participants had better recognition memory than older participants for happy (F(1, 54) = 6.94, p < .01, ηp2 = .12), angry (F(1, 49) = 9.61, p < .01, ηp2 = .18), and neutral faces (F(1, 53) = 8.71, p < .01, ηp2 = .15). In additional comparisons of type of expression for each age group separately, we found that older, but not young, participants were better at remembering happy than angry faces (t(22) = 2.43, p < .05).
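The corrected recognition score and the subjective-categorization filter described above amount to a simple per-participant computation. The sketch below is a minimal illustration with hypothetical column names; in particular, how distracter faces (which were never categorized at encoding) were assigned to expression categories is an assumption here, not something stated in the text.

```python
import pandas as pd

def corrected_recognition(trials: pd.DataFrame, expression: str) -> float:
    """Hit rate minus false-alarm rate (in percent) for one participant.

    The `expression` column is assumed to hold the participant's own
    categorization for target faces (faces not identified as, e.g., angry are
    excluded from the angry analysis); grouping of distracters is an assumption."""
    faces = trials[trials["expression"] == expression]
    hits = faces[faces["is_target"] & (faces["response"] == "old")]
    false_alarms = faces[~faces["is_target"] & (faces["response"] == "old")]
    n_targets = faces["is_target"].sum()
    n_distracters = (~faces["is_target"]).sum()
    return 100 * (len(hits) / n_targets - len(false_alarms) / n_distracters)

# Hypothetical trial-level recognition data for one participant.
trials = pd.DataFrame({
    "expression": ["angry", "angry", "angry", "happy", "happy", "happy"],
    "is_target":  [True, True, False, True, True, False],
    "response":   ["old", "new", "old", "old", "old", "new"],
})
print(corrected_recognition(trials, "angry"))  # 50.0 - 100.0 = -50.0
print(corrected_recognition(trials, "happy"))  # 100.0 - 0.0 = 100.0
```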

Figure 3. Differences in Memory for Happy, Angry, and Neutral Faces Based on Subjective Categorizations of Expressions for Young and Older Participants.

Self-Reported Difficulty in Identifying Facial Expressions and Ease in Remembering Faces

At the end of the session, participants were asked to indicate for which type of face (age of face, facial expression) it was most difficult to identify the expression and which type of face was easiest to remember, or whether there was no difference, respectively. The majority of young participants (59 %) reported that identifying facial expressions of older faces was most difficult, whereas the majority of older participants (75 %) reported no difference in difficulty of expression identification between young and older faces (χ2(2, N = 56) = 11.13, p < .01). The age groups did not differ in terms of self-reported ease of remembering young and older faces. Across age of participant and age of face, the majority of participants indicated that identifying expressions of angry (33 %) or neutral faces (51 %) was most difficult; only 9 % indicated that identifying happy faces was most difficult, and 7 % indicated no difference between the faces (χ2(3, N = 55) = 28.56, p < .01). Happy faces (56 %) were reported as easier to remember than angry (24 %) or neutral faces (2 %; no difference between faces: 18 %; χ2(3, N = 55) = 34.54, p < .01). This effect was more pronounced in older than in young participants (χ2(3, N = 55) = 9.38, p < .05).

Self-Reported Frequency of Contact with Own and Other Age Group

As shown in Table 3, young compared to older participants reported more frequent contact with young adults (Mean rankY = 40.3; Mean rankO = 12.7; Mann-Whitney U = 5.5, p < .01) and less frequent contact with older adults (Mean rankY = 18.5, Mean rankO = 41.8; Mann-Whitney U = 64.0, p < .01). In neither of the age groups did self-reported frequency of contact with the own age group predict better expression identification for own-age faces. However, the more frequent contact young participants reported having with their own age group, the less well they were able to identify expressions of older faces (β = −.43, t(31) = −2.59, p < .05). Similarly, the more frequent contact older participants reported with other older adults, the less well they were able to identify expressions in young faces, with this latter effect, however, reaching only marginal significance (β = −.35, t(23) = −1.75, p = .09). With respect to memory for faces, the more contact young participants reported with older adults, the better they were able to correctly recognize older faces (β = .43, t(31) = 2.64, p < .01). The reverse effect was not significant in older participants.

Table 3.

Percent of Participants Indicating Contact at Each Frequency Level

                                 Young Participants                Older Participants
Score   Response Option          Contact with    Contact with      Contact with    Contact with
                                 Young Adults    Older Adults      Young Adults    Older Adults
8       Daily                    92.2            0.0               4.2             25.0
7       2–3 times per week       7.8             18.7              37.4            56.2
6       Once per week            0.0             15.5              14.5            8.3
5       2–3 times per month      0.0             20.3              16.7            10.5
4       Once per month           0.0             14.1              18.8            0.0
3       2–3 times per year       0.0             22.0              2.1             0.0
2       Once per year            0.0             1.6               2.1             0.0
1       Less often               0.0             7.8               4.2             0.0

Attention to and Clarity of Own Feelings

Young and older participants did not differ in attention to and clarity of their own feelings as assessed by a subset of items of the TMMS. As expected, the more attention to and clarity of own feelings participants reported, the better was their overall ability to identify expressions in young and older faces (β = .30, t(55) = 2.30, p < .05). This correlation was only independently significant for angry faces (β = .35, t(55) = 2.70, p < .01). Emotional awareness was not related to face recognition memory.

Discussion

Correct identification of facial emotional displays and memory for faces have adaptive value and are essential for successful social interactions and interpersonal relationships (Carstensen, Gross, & Fung, 1998). The present study integrates research on age-group differences in facial expression identification and memory for emotional faces with research on the own-age bias in face processing. We report several novel findings.

Young and Older Adults Differ in Their Ability to Identify Expressions of Young and Older Emotional Faces

We asked young and older participants to identify the expression of happy, angry, and neutral faces displayed by persons of different ages (i.e., young, older). Older compared to young participants were less able to identify angry expressions but did not differ for happy or neutral expressions.6 The fact that older adults showed poorer expression identification for angry but not happy or neutral faces is consistent with other reports suggesting angry faces are more likely to show age-group differences than happy faces. However, it should be noted that the identification of happy faces was near ceiling in both young and older adults (but identification of neutral faces was below ceiling). An age-related deficit in identifying happy expressions might occur under other circumstances or for other positive facial expressions (e.g., love, positive surprise). The finding that older adults showed poorer expression identification for angry faces is consistent with the general finding of age-related decline in facial expression identification (Ruffman et al., 2008) and extends this finding to older as well as young faces. Thus, in agreement with earlier findings using only young faces, our results provide no support for the idea that older adults might improve in identifying emotions from facial displays as a consequence of accumulated interpersonal experience, increased interest in emotions (Carstensen et al., 1999), or improved emotion regulation strategies (Blanchard-Fields, 2007).

Participants were Better at Identifying Expressions of Young than Older Faces

Interestingly, both age groups were more accurate in identifying facial expressions in young compared to older faces. This better performance with young as opposed to older faces could reflect some preference for young over older adults (Hummert, Garstka, O’Brien, Greenwald, & Mellot, 2002), or could indicate that, due to age-related changes in physical features (e.g., muscle tissue, wrinkles), expressions in older faces are more ambiguous than in young faces, which makes facial expression decoding more difficult. This finding suggests that there may be important life situations in which older adults are more likely to be misinterpreted than young adults (e.g., discussions with doctors or lawyers, and social situations in general). Of course, it is important to mention that the present study used faces of persons who had been instructed to move their facial muscles to produce prototypical expressions, which do not necessarily reflect spontaneous feeling states. Thus, it would be important to follow up with studies investigating the identification of more naturally occurring expressions in young and older faces.

Older Adults Have a Better Memory for Happy than Angry Young and Older Faces

After a short retention interval, participants were asked to indicate whether they had seen a face before. Overall face recognition memory was good and showed the expected main effect of age group, with young participants outperforming older participants. Considering participants’ subjective categorizations of facial expressions, older, but not young, participants remembered angry faces less well than happy faces, indicating poorer memory for angry faces independent of their poorer ability to identify angry faces. Thus, the present findings strengthen previous findings showing that older but not young adults remember positive faces better than negative faces (Grady et al., 2007; Mather & Carstensen, 2003), an effect that is often explained as reflecting older adults’ preferential attention to positive over negative information (or attention away from, or suppression of, negative relative to positive information; Carstensen & Mikels, 2005; Mather & Carstensen, 2005). Older adults’ poorer memory for angry faces did not differ for young and older faces, arguing against the possibility that previous findings of older adults’ poor memory for negative faces were simply a consequence of an own-age bias in memory for emotional faces.

It is important to note that the present study included only happy, angry, and neutral faces. Given evidence of at least partially distinct neural circuits subserving different emotions and evidence of age-related changes in brain regions involved in processing certain emotions (for a discussion, see Ruffman et al., 2008), as well as age-group differences in visual scan patterns for different facial expressions (Sullivan et al., 2007; Wong, Cronin-Golomb, & Neargarder, 2005), it would be interesting to examine in future studies whether the present findings also generalize to other facial expressions (e.g., disgust, sadness, surprise).

It should also be noted that we did not find any evidence of an own-age bias in memory for emotional faces, contrary to previous studies that focused on neutral faces (Bäckman, 1991; Lamont et al., 2005). One possibility is that with neutral faces, age is the most salient dimension along which the faces vary, and such stimuli therefore reveal attentional biases that may be induced by this dimension. In contrast, when faces vary in emotional expression as well as age, and participants are asked to attend to facial expression as they process each face (as in the expression identification incidental encoding task used here), differential attention to faces based on the ages of the faces is less likely to occur.

Why are Older Adults Less Able to Identify Angry Facial Expressions and Why Do They Remember Angry Faces Less Well?

The literature offers various explanations for older compared to young adults’ decreased ability to identify expressions of, and their reduced memory for, angry faces. One such explanation is based on observations suggesting that different brain structures modulate the effects of negative versus positive stimuli for perception and identification of emotional faces (Calder et al., 2003; Ruffman et al., 2008). There is some evidence that perceiving anger and sadness involves the frontal cortex and right temporal pole (Blair, Morris, Frith, Perrett, & Dolan, 1999), whereas perceiving other emotions depends more on regions such as the amygdala (fear; Adolphs et al., 1999; Anderson & Phelps, 2001), or the insula and basal ganglia (disgust; Davidson & Irwin, 1999). Combining these findings with evidence of age-related volume reductions and neuronal loss in the medial temporal cortices (Iidaka et al., 2002), atrophy of the frontal lobes (Gunning-Dixon et al., 2003), and less rapid, but still significant, age-related reductions in the volume of the amygdala (Mu, Xie, Wen, & Shuyun, 1999) offers possible hypotheses about why older adults show declines in their ability to identify and remember facial expressions, especially anger.

Another explanation draws upon evidence of age-related differences in visual scan patterns of emotional faces. Certain visual scan patterns appear to be more efficient for some than for other facial expressions (Calder, Young, Keane, & Dean, 2000). Specifically, identification of happiness and disgust seems to be associated with viewing the lower half of a face, identification of anger, fear, and sadness seems to be associated with examining the upper half of a face, and identification of surprise can be made by viewing either the top or the bottom half of faces. Only a few studies have examined the relation between age-related changes in visual scan patterns and facial expression identification (Sullivan et al., 2007; Wong et al., 2005). Wong et al. (2005) found that regardless of the facial expression of young faces, mean fixation duration was longer in older than in young adults, and young adults made more fixations on the faces. In addition, compared with young adults, older adults fixated less on the top half of the faces, which may have adversely affected their ability to identify fearful, angry, and sad faces but improved their ability to detect disgust. Furthermore, it is possible that age-related declines in visual acuity (e.g., Andersen & Ni, 2008) mediate older adults’ ability to identify facial expressions and to remember faces since impairment in visual acuity may disrupt perception of relevant features. An interesting question for future studies is whether potential age-group differences in vision differentially affect expression identification and memory for faces with different expressions.

Evidence of age-related decreases in automatic mimicry of facial expressions offers another explanation for age-associated difficulties in decoding and remembering angry faces. Facial electromyography (EMG) studies that have compared young and older adults’ responses to emotional scenes (not faces) found that the age groups show similar patterns of facial responding (Reminger, Kaszniak, & Dalby, 2000) but different magnitudes of activity (Smith, Hillman, & Duley, 2005). Using young emotional faces, Bailey, Henry, and Nangle (in press) found that angry faces evoked greater brow muscle activity than happy and neutral faces in both young and older adults, and that the magnitude of early brow muscle activity was reduced for neutral and, marginally so, angry, but not happy faces in older relative to young participants.

Influence of Ethnicity of Perceiver, Self-Reported Frequency of Contact, and Awareness of Own Feelings

Young and older participants in the present study varied in terms of their demographic distribution (i.e., White versus Non-White; born in USA versus moved to USA later). However, neither factor had an influence on facial expression identification or face memory. These non-significant findings are somewhat surprising considering the literature on differences in processing faces of one’s own as opposed to other ethnic groups (Anthony, Cooper, & Mullen, 1992; Elfenbein & Ambady, 2002). A possible explanation is that young participants in the present study were all Yale University undergraduate students who were exposed on a daily basis to faces of different origins. Moreover, the majority of young and older participants were born in the USA, which made it likely that they had been exposed to faces of various ethnicities throughout their lives. This explanation is consistent with evidence from a meta-analysis by Elfenbein and Ambady (2002) showing that the in-group advantage in facial expression identification was smaller for samples that had greater cross-cultural exposure to each other. It is also likely that, since all faces presented in the present study were Caucasian, the age of the face and the different expressions constituted more salient features than the race of the face (kept constant and thus likely reduced in prominence), which might have overridden potential effects of ethnicity of the face as a function of ethnicity of the perceiver. It would be an important future direction to examine the interaction between ethnicity and age of the face and ethnicity and age of the perceiver in face processing. For example, from our findings, we might expect that accurate reading of the expressions of members of another racial group would become even more difficult the older the face.

It is often argued that the preference for own-age faces is related to the amount of exposure to own-age as opposed to other-age persons (Anastasi & Rhodes, 2006; Mason, 1986). To our knowledge, the present study is the first that explicitly assessed the relation between self-reported frequency of contact with the own and the other age group and facial expression identification of and memory for young and older faces. We found that, indeed, both age groups reported more frequent contact with their own and less contact with the other age group. Contrary to our expectations, for neither of the age groups was the ability to identify expressions of own-age faces positively related to self-reported frequency of contact with persons of the own age group. However, the more contact young and older participants reported with their own age group, the less well they were able to identify expressions of faces of the other age group (this effect was marginally significant in older adults). Thus, interestingly, frequency of contact with the own age group may provide a more sensitive index of frequency of contact with other-age individuals, or a more sensitive index of interest in other-age individuals, than asking directly about frequency of contact with other-age individuals. Also, other-age faces might be less interesting and less likely to represent potential interaction partners for people who have frequent contact with own-age persons, and such people might therefore invest less effort in decoding expressions in other-age faces. Nevertheless, in terms of memory for faces, the more contact young, but not older, participants reported with older adults, the better they were able to correctly recognize older faces, suggesting that contact with older adults familiarized young adults with older faces, helping them distinguish one older face from another. This pattern of findings suggests that the features that benefit identification of facial expressions and face recognition are not necessarily identical (e.g., one might be able to identify anger in a face without necessarily knowing whether or not the face had been seen before). Future studies should examine objective as well as subjective measures of exposure to own versus other age groups, and differences between cues to facial expression and cues to facial identity and how they influence face recognition.

The present study finally addressed awareness of own feelings and emotions as another factor that might influence expression identification of and memory for young and older faces. Largely consistent with earlier findings (Grady et al., 2007; Keightley et al., 2006; Parker et al., 1993), more emotionally aware participants were better able to identify angry young and older faces. This suggests that attending to and being clear about one’s own feelings is helpful when inferring other people’s feeling states, especially when facial expression identification is relatively difficult, as when identifying anger, but not when the expressions are relatively easy to identify, as with happy faces. We did not, however, find an effect of emotional awareness on memory for emotional faces, again suggesting that the factors important for facial expression identification are not necessarily the same as those important for face recognition.

In conclusion, the present study is the first to compare young and older adults on facial expression identification using both young and older faces and to compare young and older adults’ face memory using young and older faces with neutral in addition to emotional expressions. We found that older compared to young adults showed poorer identification of and memory for angry expressions and that this did not depend on the age of the faces but generalized to young and older faces. These findings strongly argue that previous reports of age-related decreases in facial expression identification and of worse memory for negative than positive faces in older adults are not due to older adults’ increased difficulty in reading and remembering faces of the other (i.e., the young) age group. In addition, we found evidence that self-reported frequency of contact with persons of the own and the other age group contributes to performance in expression identification of and memory for own-age as opposed to other-age faces and that, consistent with earlier findings, awareness of own feelings and emotions is positively related to decoding expressions in other people’s faces.

Acknowledgments

This research was conducted at Yale University and supported by the National Institute on Aging Grant AG09253 awarded to MKJ and by a grant from the German Research Foundation (DFG EB 436/1-1) awarded to NCE. The authors wish to thank the Yale Cognition Project group for discussions of the studies reported in this paper and Kathleen Muller for assistance in data collection.

Footnotes


1. Since 100 % of the older participants reported White origin, this analysis was conducted only for young participants.

2. We selected happy and angry faces because they represent clearly positive (as compared to, for instance, surprise) and clearly negative (as compared to, for instance, sadness) stimuli, respectively (neutral faces were included for comparison). We provided one response option for each of the three target expressions and a fourth response option representing the ‘other’ category (encompassing positive and negative expressions) to avoid biasing responses with more options referring to negative than positive expressions. Another, practical reason for limiting the number of expressions presented and response options provided was to avoid any difficulties older participants might have had with handling more than four response buttons.

3. The FACES database comprises 115 young and older faces. Since equal numbers of young and older faces were needed in the present study, pictures of one young face model were not included.

4. For these analyses, two older male participants were dropped because they experienced difficulties coordinating the response buttons in this part of the task, resulting in various error responses. In addition, some participants did not produce responses for all possible age of face by facial expression categories, resulting in reduced sample sizes in the respective analyses.

5. Note that young participants reported higher confidence for correctly recognized young (M = 3.56, SD = 0.24) than older faces (M = 3.37, SD = 0.34; Wilks’ λ = .90, F(1, 52) = 5.39, p < .05, ηp2 = .10), whereas older participants’ level of confidence did not differ for young (M = 3.48, SD = 0.43) and older faces (M = 3.44, SD = 0.41).

6. Note that with the sample sizes in the present study, the power to detect potentially small differences between the age groups with respect to happy (1–β = .19) and neutral faces (1–β = .29) was low.

References

  1. Adolphs R, Tranel D, Hamann S, Young AW, Calder AJ, Phelps EA, Anderson A, Lee GP, Damasio AR. Recognition of facial emotion in nine individuals with bilateral amygdala damage. Neuropsychologia. 1999;37:1111–1117. doi: 10.1016/s0028-3932(99)00039-1. [DOI] [PubMed] [Google Scholar]
  2. Anastasi JS, Rhodes MG. Evidence for an own-age bias in face recognition. North American Journal of Psychology. 2006;8:237–252. [Google Scholar]
  3. Andersen GJ, Ni R. Aging and visual processing: Declines in spatial not temporal integration. Vision Research. 2008;48:109–118. doi: 10.1016/j.visres.2007.10.026. [DOI] [PubMed] [Google Scholar]
  4. Anderson AK, Phelps EA. Lesions of the human amygdala impair enhanced perception of emotionally salient events. Nature. 2001;411:305–309. doi: 10.1038/35077083. [DOI] [PubMed] [Google Scholar]
  5. Anthony T, Cooper C, Mullen B. Cross-racial facial identification: A social cognitive integration. Personality and Social Psychology Bulletin. 1992;18:296–301. [Google Scholar]
  6. Bäckman L. Recognition memory across the adult life span: The role of prior knowledge. Memory & Cognition. 1991;19:63–71. doi: 10.3758/bf03198496. [DOI] [PubMed] [Google Scholar]
  7. Bailey PE, Henry JD, Nangle MR. Electromyographic evidence for agerelated differences in mimicry of anger. Psychology and Aging. doi: 10.1037/a0014112. (in press) [DOI] [PubMed] [Google Scholar]
  8. Bartlett JC, Fulton A. Familiarity and recognition of faces in old age. Memory & Cognition. 1991;19:229–238. doi: 10.3758/bf03211147. [DOI] [PubMed] [Google Scholar]
  9. Bartlett JC, Leslie JE, Tubbs A, Fulton A. Aging and memory for pictures of faces. Psychology and Aging. 1989;4:276–283. doi: 10.1037//0882-7974.4.3.276. [DOI] [PubMed] [Google Scholar]
  10. Blair RJR, Morris JS, Frith CD, Perrett DI, Dolan RJ. Dissociable neural responses to facial expressions of sadness and anger. Brain. 1999;122:883–893. doi: 10.1093/brain/122.5.883. [DOI] [PubMed] [Google Scholar]
  11. Blanchard-Fields F. Everyday problem solving and emotion: An adult developmental perspective. Current Directions in Psychological Science. 2007;16:26–31. [Google Scholar]
  12. Bradley MM, Greenwald MK, Petry MC, Lang PJ. Remembering pictures: Pleasure and arousal in memory. Journal of Experimental Psychology: Learning, Memory, & Cognition. 1992;18:379–390. doi: 10.1037//0278-7393.18.2.379. [DOI] [PubMed] [Google Scholar]
  13. Calder AJ, Keane J, Manlya T, Sprengelmeyer R, Scott S, Nimmo-Smith I, Young AW. Facial expression recognition across the adult life span. Neuropsychologia. 2003;41:195–202. doi: 10.1016/s0028-3932(02)00149-5. [DOI] [PubMed] [Google Scholar]
  14. Calder AJ, Young AW, Keane J, Dean M. Configural information in facial expression perception. Journal of Experimental Psychology: Human Perception and Performance. 2000;26:527–551. doi: 10.1037//0096-1523.26.2.527.
  15. Carstensen LL. Social and emotional patterns in adulthood: Support for socioemotional selectivity theory. Psychology and Aging. 1992;7:331–338. doi: 10.1037//0882-7974.7.3.331.
  16. Carstensen LL, Gross JJ, Fung H. The social context of emotional experience. In: Schaie KW, Lawton MP, editors. Annual Review of Gerontology and Geriatrics. Vol. 17. New York: Springer; 1998. pp. 325–352.
  17. Carstensen LL, Isaacowitz DM, Charles ST. Taking time seriously: A theory of socioemotional selectivity. American Psychologist. 1999;54:165–181. doi: 10.1037//0003-066x.54.3.165.
  18. Carstensen LL, Mikels JA. At the intersection of emotion and cognition: Aging and the positivity effect. Current Directions in Psychological Science. 2005;14:117–121.
  19. Charles S, Mather M, Carstensen LL. Aging and emotional memory: The forgettable nature of negative images for older adults. Journal of Experimental Psychology: General. 2003;132:310–324. doi: 10.1037/0096-3445.132.2.310.
  20. Crook TH, Larrabee GJ. Changes in facial recognition memory across the adult life span. Journal of Gerontology: Psychological Sciences. 1992;47:P138–P141. doi: 10.1093/geronj/47.3.p138.
  21. D’Argembeau A, Van der Linden M. Identity but not expression memory for unfamiliar faces is affected by ageing. Memory. 2004;12:644–654. doi: 10.1080/09658210344000198.
  22. Davidson RJ, Irwin W. The functional neuroanatomy of emotion and affective style. Trends in Cognitive Sciences. 1999;3:11–21. doi: 10.1016/s1364-6613(98)01265-0.
  23. Ebner NC, Riediger M, Lindenberger U. FACES—A database of facial expressions in young, middle-aged, and older women and men: Development and validation. Germany: Max Planck Institute for Human Development; 2008. Unpublished manuscript.
  24. Elfenbein HA, Ambady N. On the universality and cultural specificity of emotion recognition: A meta-analysis. Psychological Bulletin. 2002;128:203–235. doi: 10.1037/0033-2909.128.2.203.
  25. Faubert J. Visual perception and aging. Canadian Journal of Experimental Psychology. 2002;56:164–176. doi: 10.1037/h0087394.
  26. Grady CL, Hongwanishkul D, Keightley M, Lee W, Hasher L. The effect of age on memory for emotional faces. Neuropsychology. 2007;21:371–380. doi: 10.1037/0894-4105.21.3.371.
  27. Grady CL, McIntosh AR, Horwitz B, Maisog JM, Ungerleider LG, Mentis MJ, Pietrini P, Schapiro MB, Haxby JV. Age-related reductions in human recognition memory due to impaired encoding. Science. 1995;269:218–221. doi: 10.1126/science.7618082.
  28. Grady CL, McIntosh AR, Horwitz B, Rapoport SI. Age-related changes in the neural correlates of degraded and nondegraded face processing. Cognitive Neuropsychology. 2000;17:165–186. doi: 10.1080/026432900380553.
  29. Gunning-Dixon FM, Gur RC, Perkins AC, Schroeder L, Turner T, Turetsky BI, Chan RM, Loughead JW, Alsop DC, Maldjian J, Gur RE. Age-related differences in brain activation during emotional face processing. Neurobiology of Aging. 2003;24:285–295. doi: 10.1016/s0197-4580(02)00099-4.
  30. Hummert ML, Garstka TA, O’Brien LT, Greenwald AG, Mellot DS. Using the Implicit Association Test to measure age differences in implicit social cognitions. Psychology and Aging. 2002;17:482–495. doi: 10.1037//0882-7974.17.3.482.
  31. Iidaka T, Okada T, Murata T, Omori M, Kosaka H, Sadato N, Yonekura Y. Age-related differences in the medial temporal lobe responses to emotional faces as revealed by fMRI. Hippocampus. 2002;12:352–362. doi: 10.1002/hipo.1113.
  32. Isaacowitz DM, Wadlinger HA, Goren D, Wilson HR. Selective preference in visual fixation away from negative images in old age? An eye-tracking study. Psychology and Aging. 2006;21:40–48. doi: 10.1037/0882-7974.21.1.40.
  33. Izard CE. Human emotions. New York: Plenum Press; 1977.
  34. Izard CE, Malatesta CZ. Perspectives on emotional development 1: Differential emotions theory of early emotional development. In: Osofsky JD, editor. Handbook of infant development. New York: Wiley; 1987. pp. 494–554.
  35. Keightley ML, Winocur G, Burianova H, Hongwanishkul D, Grady CL. Age effects on social cognition: Faces tell a different story. Psychology and Aging. 2006;21:558–572. doi: 10.1037/0882-7974.21.3.558.
  36. Lamont AC, Stewart-Williams S, Podd J. Face recognition and aging: Effects of target age and memory load. Memory & Cognition. 2005;33:1017–1024. doi: 10.3758/bf03193209.
  37. Leigland LA, Schulz LE, Janowsky JS. Age related changes in emotional memory. Neurobiology of Aging. 2004;25:1117–1124. doi: 10.1016/j.neurobiolaging.2003.10.015.
  38. Mason SE. Age and gender as factors in facial recognition and identification. Experimental Aging Research. 1986;12:151–154. doi: 10.1080/03610738608259453.
  39. Mather M, Carstensen LL. Aging and attentional biases for emotional faces. Psychological Science. 2003;14:409–415. doi: 10.1111/1467-9280.01455.
  40. Mather M, Carstensen LL. Aging and motivated cognition: The positivity effect in attention and memory. Trends in Cognitive Sciences. 2005;9:496–502. doi: 10.1016/j.tics.2005.08.005.
  41. Mu Q, Xie J, Wen Z, Shuyun Z. A quantitative MR study of the hippocampal formation, the amygdala, and the temporal horn of the lateral ventricle in healthy subjects 40–90 years of age. American Journal of Neuroradiology. 1999;20:207–211.
  42. Ochsner KN. Are affective events richly recollected or simply familiar? The experience and process of recognizing feelings past. Journal of Experimental Psychology: General. 2000;129:242–261. doi: 10.1037//0096-3445.129.2.242.
  43. Park DC, Polk TA, Mikels JA, Taylor SF, Marshuetz C. Cerebral aging: integration of brain and behavioral models of cognitive function. Dialogues in Clinical Neuroscience. 2001;3:151–165. doi: 10.31887/DCNS.2001.3.3/dcpark.
  44. Parker JD, Taylor GJ, Bagby RM. Alexithymia and the recognition of facial expressions of emotion. Psychotherapy and Psychosomatics. 1993;59:197–202. doi: 10.1159/000288664.
  45. Reminger SL, Kaszniak AW, Dalby PR. Age-invariance in the asymmetry of stimulus-evoked emotional facial muscle activity. Aging, Neuropsychology, & Cognition. 2000;7:156–168.
  46. Ruffman T, Henry JD, Livingstone V, Phillips LH. A meta-analytic review of emotion recognition and aging: Implications for neuropsychological models of aging. Neuroscience and Biobehavioral Reviews. 2008;32:863–881. doi: 10.1016/j.neubiorev.2008.01.001.
  47. Salovey P, Mayer JD, Goldman S, Turvey C, Palfai T. Emotional attention, clarity, and repair: Exploring emotional intelligence using the Trait Meta-Mood Scale. In: Pennebaker J, editor. Emotion, disclosure, and health. Washington, DC: American Psychological Association; 1995. pp. 125–154.
  48. Salthouse TA. Steps towards the explanation of adult age differences in cognition. In: Perfect TJ, Maylor EA, editors. Models of cognitive aging. Oxford: Open University Press; 2000. pp. 19–49.
  49. Salthouse TA. What and when of cognitive aging. Current Directions in Psychological Science. 2004;13:140–144.
  50. Schneider W, Eschman A, Zuccolotto A. E-Prime Reference Guide. Pittsburgh: Psychology Software Tools Inc.; 2002.
  51. Smith DP, Hillman CH, Duley AR. Influences of age on emotional reactivity during picture processing. Journal of Gerontology: Psychological Sciences. 2005;60B:P49–P56. doi: 10.1093/geronb/60.1.p49.
  52. Sullivan S, Ruffman T, Hutton SB. Age differences in emotion recognition skills and the visual scanning of emotion faces. Journal of Gerontology: Psychological Sciences. 2007;62B:P53–P60. doi: 10.1093/geronb/62.1.p53.
  53. Watson D, Clark LA, Tellegen A. Development and validation of brief measures of positive and negative affect: The PANAS scales. Journal of Personality and Social Psychology. 1988;54:1063–1070. doi: 10.1037//0022-3514.54.6.1063.
  54. Wechsler D. Manual for the Wechsler Adult Intelligence Scale - Revised (WAIS-R). New York: Psychological Corporation; 1981.
  55. Wong B, Cronin-Golomb A, Neargarder S. Patterns of visual scanning as predictors of emotion identification in normal aging. Neuropsychology. 2005;19:739–749. doi: 10.1037/0894-4105.19.6.739.
