Abstract
Traditional emotion perception tasks show that older adults are less accurate than young adults at recognizing facial expressions of emotion. Recently, we proposed that socioemotional factors might explain why older adults seem impaired in lab tasks but less so in everyday life (Isaacowitz & Stanley, 2011). Thus, in the present research we empirically tested whether socioemotional factors such as motivation and familiarity can alter this pattern of age effects. In the first task, accountability instructions eliminated age differences in a traditional emotion perception task. In a second task, using a novel emotion perception paradigm featuring spontaneous, dynamic facial expressions of a familiar romantic partner versus a same-age stranger, we found that age differences in emotion perception accuracy were attenuated in the familiar partner condition relative to the stranger condition. Taken together, the results suggest that both overall accuracy and the pattern of age effects differ appreciably between traditional emotion perception tasks and emotion perception within a socioemotional context.
Keywords: emotion perception, aging, ecological validity, motivation, familiarity
Accurate emotion perception is important for social adjustment and maintaining relationships (Carton, Kessler, & Pape, 1999; Engelberg & Sjöberg, 2004). Yet, older adults are less accurate than young adults at recognizing facial expressions of emotion (Ruffman, Henry, Livingstone, & Phillips, 2008) in traditional lab tasks. This finding is somewhat surprising given older adults’ greater life experience and research that suggests age-related maintenance or gains in the emotional domain (Gross et al., 1997; Kunzmann, Little, & Smith, 2000; Orgeta, 2009). What factors might be responsible for age-related differences in traditional emotion perception tasks, and do these differences remain when the task more closely resembles emotion perception in daily life (i.e., when older adults care more and know more)?
A number of studies have failed to account for age-related differences with tests of perceptual abilities, fluid intelligence, working memory, inhibition, education, or even gaze patterns (Keightley, Winocur, Burianova, Hongwanishkul, & Grady, 2006; Murphy & Isaacowitz, 2010; Phillips, MacLean, & Allen, 2002; Sullivan & Ruffman, 2004), suggesting that age differences in emotion perception abilities remain even when controlling for the “usual suspects” of mediators in aging research (i.e., age-related cognitive and perceptual decline) as well as attentional strategy differences. In the present research, we took a different approach: rather than investigating correlates of age and emotion perception accuracy, we manipulated the emotion perception task to test the boundary conditions of age effects. In one task, we manipulated the motivation of participants with accountability instructions (the “caring more” manipulation). In a second task we adapted a paradigm from the empathic accuracy literature to examine the influence of familiarity on age differences in emotion perception accuracy (the “knowing more” manipulation). We used a modified standard interview paradigm with a yoked-subject design in which pairs of perceivers viewed videos of emotional expressions of either their romantic partner or a same-age stranger (Ickes & Hodges, 2013). If older adults’ accuracy on the emotion perception task varied with these manipulations, it would suggest that age differences in emotion perception may be more tied to mutable socioemotional factors than permanent differences in ability.
Next, we describe age differences in traditional emotion perception tasks and the ways in which traditional emotion perception tasks fail to include the rich context of emotion perception in everyday life. Then, against this backdrop, we present evidence that manipulations to increase the ecological validity in other social judgment tasks, namely, by increasing motivation (caring more) and knowledge (knowing more), may improve accuracy and attenuate age differences.
Traditional Emotion Perception Tasks Lack Ecological Validity
The most common way of assessing emotion perception ability is with a single forced-choice response to a series of photographs of standardized facial expressions of emotion posed at maximum intensity (Isaacowitz & Stanley, 2011). This is referred to as a traditional emotion perception task (Isaacowitz & Stanley, 2011; Rauers, Blanke, & Riediger, 2013). Although the largest age effects show young adults outperforming older adults at correctly labeling angry, sad, and fearful facial expressions in a traditional emotion perception task (Ruffman et al., 2008), overall accuracy is quite high in this task and some emotions even exhibit ceiling effects (Charles & Campos, 2011; Isaacowitz et al., 2007). For example, in one study with typical results, the poorest accuracy was for older adults’ fear recognition at 72% and the highest accuracy was 97%, for young adults’ happiness recognition (Isaacowitz et al., 2007). Thus, even the lowest accuracy scores are typically well above chance (e.g., about 14% in a task with seven emotion choices). As discussed by Charles and Campos (2011), the extent to which significant age differences in emotion perception are also meaningful differences is an open question. In other words, traditional emotion perception tasks have high reliability, but their ecological validity has not been rigorously tested (Barrett, Mesquita, & Gendron, 2011; Isaacowitz & Stanley, 2011).
Why might older adults be differentially facilitated in content-rich tasks versus tightly controlled but content-poor tasks? One possibility is that when the ecological validity of tasks is increased, older adults are able to rely on their greater social expertise relative to young adults (T. M. Hess, Osowski, & Leclerc, 2005). Extant research on the effect of rich content on age differences in emotion perception is consistent with these ideas: when presented with video without audio of a woman talking about an emotional event, young adults were better than older adults at identifying what the woman was feeling; but when both audio and video were included, these age differences were eliminated (Richter, Dietzel, & Kunzmann, 2010). Similarly, when emotionally congruent vocal and facial expressions are provided, age differences in emotion perception are eliminated (Hunter, Phillips, & MacPherson, 2010). Taken together, it is clear that additional (congruent) visual or audio information can attenuate or even eliminate age-related deficits on emotional tasks. Yet the vast majority of research on age differences in emotion perception has relied on traditional emotion perception tasks with static stimuli only. Therefore, the present study attempted to add to the literature on the ways in which a more ecologically valid emotion perception task can reduce age-related differences in emotion perception accuracy.
Caring More: The Role of Motivation
In addition to social expertise, age differences in emotion perception may be affected by how motivated individuals feel to expend cognitive resources, which become scarcer with age (Park, 2000; Salthouse, 1996). Motivation can be a powerful influence on behavior, including accuracy in emotion-related tasks. For example, a financial incentive paid to young adult participants for good performance eliminated gender differences in empathic accuracy (Klein & Hodges, 2001). Because increasing age is associated with fewer cognitive resources (Park, 2000; Salthouse, 1996), older individuals are more selective than younger individuals in how they expend these limited resources (Baltes & Baltes, 1990). Thus, age differences in traditional emotion perception tasks may be due to a lack of sufficient motivation for older adults to invest scarce cognitive resources in the task. Consistent with this possibility, age-related differences in emotion perception were eliminated when participants were told that the target shared many common interests with them, relative to controls (Zhang, Fung, Stanley, & Isaacowitz, 2013). The perceived closeness manipulation was interpreted as producing greater motivation and thus better performance for older adults. Furthermore, research on emotion identification for in-group versus out-group members shows that identification with the social group expressing emotions can lead to better emotion perception, because individuals are more motivated to interpret the emotional expressions of members of their in-group than members of their out-group (Thibault, Bourgeois, & Hess, 2006).
Knowing More: The Role of Familiarity
Another factor that might contribute to greater accuracy in an emotion perception task is simply being familiar with the person expressing the emotion. Participants are better at recognizing the emotional expressions of a familiar partner than a stranger (Ma-Kellams & Blascovich, 2012; Sabatelli, Buck, & Dreyer, 1982; Sternglanz & DePaulo, 2004; Stinson & Ickes, 1992). Additionally, the literature on the broader concept of empathic accuracy, or the ability to correctly infer another person’s thoughts or feelings (Ickes & Simpson, 1997), suggests that people are more accurate at knowing what a close other is feeling than a stranger (Ickes & Hodges, 2013; Sternglanz & DePaulo, 2004). This effect of familiarity may be attributable to the perceiver both knowing more and caring more about a close other’s feelings than a stranger’s (Ickes & Hodges, 2013). This may be especially salient for older adults, who prioritize close partners over acquaintances more than young adults (Fredrickson & Carstensen, 1990). Older adults might not only prioritize their close relationships more than young adults, they might also know their social partners better than young adults. In a novel demonstration of the power of familiarity on empathic accuracy, both young and older adults were better than chance at inferring the feelings of their partner when they were not physically together (Rauers et al., 2013). This was true for members of a couple but not strangers. The authors suggest that better-than-chance empathic accuracy when the partner is not present is due to acquired knowledge about the partner and how he or she typically feels during different parts of the day.
Given the longer duration of close relationships with age, do older couples know more about their partners? Recent work suggests that the knowledge gained in close relationships is at least maintained with age. An experience-sampling study found that young couples are more empathically accurate than older couples when the pairs are physically together, but age differences are eliminated when the pairs are physically separated (Rauers et al., 2013), suggesting that young and older couples are equally able to use what they know about their partner to accurately infer their thoughts and feelings. When partners were present, however, the young couples may have been able to read the emotional expressions of their partner more accurately than older couples, resulting in better accuracy among young couples when partners were physically present. That study used an affect balance score, computed by subtracting negative affect endorsement from positive affect endorsement. In the present studies, we investigated emotion perception in strangers and in close relationships from a discrete emotions perspective.
Traditional Emotion Perception Task with Motivation Manipulation
In the first task, we tested whether we could improve older adults’ performance on a traditional emotion perception task by making them care more, which might compel them to spend more cognitive resources on the task, thereby increasing accuracy. We manipulated motivation using accountability instructions. Participants in the motivation condition were informed prior to completing the traditional emotion perception task that they would later need to defend their response choices. This accountability manipulation has been shown in the past to be more involving for participants, resulting in greater engagement in effortful processing (e.g., Chen, 2004; Tetlock, 1985, 1992). Furthermore, accountability instructions have reduced age effects in memory and social judgment tasks by differentially improving the performance of older adults relative to young adults (T. M. Hess, Germain, Swaim, & Osowski, 2009; T. M. Hess, Rosenberg, & Waters, 2001).
Familiar Partner Emotion Perception Task
In the second task, we investigated whether age differences in emotion perception could be reduced when participants know more, by creating stimuli of familiar partners and comparing emotion perception accuracy of a familiar partner to that of a same-age stranger. In addition to the primary manipulation of familiarity, this emotion perception task provides more content than a traditional emotion perception task: the videos are dynamic rather than static and the expressions are spontaneous rather than posed. Participants are better at recognizing emotions from dynamic displays than static displays (Ambadar, Schooler, & Cohn, 2005; Bould & Morris, 2008; Lederman et al., 2007). Although no work to date has directly compared age differences in recognizing discrete emotions in dynamic versus static stimuli, two studies have found that young and older adults are equally able to identify the global valence of emotional expressions (i.e., positive or negative) from dynamic displays (Krendl & Ambady, 2010; Sze, Goodkind, Gyurak, & Levenson, 2012). In addition, older adults are more sensitive than young adults at distinguishing genuine “Duchenne” smiles from posed smiles presented in a video display (Murphy, Lehrfeld, & Isaacowitz, 2010), but are less able than young adults to determine the emotional meaning when three different types of emotional experiences (positive, negative, or neutral) underlie different kinds of smiles in videos (Riediger, Studtmann, Westphal, Rauers, & Weber, 2014). Thus, the literature to date is mixed on whether age-related differences remain when dynamic emotional stimuli are presented.
Unlike traditional emotion perception tasks, which use expressions posed at maximum intensity and frozen in time at the apex of the emotional expression, this task presents natural, spontaneous facial expressions of emotion in response to emotion-evoking film clips. These more subtle facial expressions may be more difficult for all participants to decode than expressions posed at maximum intensity. Thus, while the motion cues provide important context for identifying the emotion (Bould & Morris, 2008), overall accuracy will likely be much lower than in traditional emotion perception tasks, because these expressions are spontaneous and subtle rather than a “caricature” of a facial expression (Barrett, 2006). This difference will also likely eliminate the ceiling effects that typically restrict traditional emotion perception tasks.
One factor held constant in the present research is the age-matching of the perceiver to the target. Although there is evidence that age-matching is an influential moderator of age effects in social judgment tasks (T. M. Hess et al., 2001), studies specifically focusing on emotion perception accuracy (both static and dynamic) have not found support for an own-age bias (Ebner & Johnson, 2009; U. Hess, Adams, Simard, Stevenson, & Kleck, 2012; Malatesta, Fiore, & Messina, 1987; Malatesta, Izard, Culver, & Nicolich, 1987; Murphy et al., 2010; Riediger, Voelkle, Ebner, & Lindenberger, 2011; Sze et al., 2012).
Hypotheses
Hypothesis 1
We expected to replicate age differences in the traditional emotion perception task. Specifically, we expected an Age Group × Emotion interaction for emotion perception accuracy, with young adults outperforming older adults at recognizing several of the discrete emotions.
Hypothesis 2
We expected to find an Age Group × Motivation Condition interaction for emotion perception accuracy in the traditional emotion perception task, with age differences attenuated in the explicit motivation condition compared to the no motivation condition.
Hypothesis 3
For the familiar partner emotion perception task, we expected that age differences would be attenuated in the familiar partner condition compared to the stranger condition.
Past research has reported differential age effects in emotion perception accuracy for different discrete emotions, although the specific pattern of these emotion effects differs somewhat across studies (Ruffman et al., 2008). Additionally, there is not a clear theoretical reason to predict specific emotion effects, so while we expected there might be some differences across emotions in the patterns of age effects, we did not explicitly predict specific discrete emotion effects.
Method
The data for this study were collected across two lab sessions. In Session 1, participants completed the perceptual, cognitive, and affective tests as well as the traditional emotion perception task. Participants also had their facial expressions video-recorded in Session 1, from which we developed the stimuli for the Dynamic Emotion Perception Task for Session 2.
Participants
Previous research has reported a medium to large effect size for age differences in emotion perception accuracy (Halberstadt, Ruffman, Murray, Taumoepeau, & Ryan, 2011; Murphy & Isaacowitz, 2010). Using G*Power 3.1.9.2 (a publicly available software package for power calculations; Faul, Erdfelder, Buchner, & Lang, 2009; Faul, Erdfelder, Lang, & Buchner, 2007), we determined that a total sample size of 40 (for a large effect, f = .40) or 98 (for a medium effect, f = .25) would be the minimum necessary to provide adequate power (.80) to detect the main effects and the within-between interaction with two groups and two measurements.
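As a rough cross-check of this target, the calculation can be approximated outside of G*Power. The sketch below is not the original power analysis; it assumes the common G*Power convention for the between-subjects factor in a repeated-measures design (two measurements with an assumed correlation of .5 between them) and uses statsmodels.

```python
# Approximate cross-check of the reported sample-size target (not G*Power itself).
# Assumption: between-subjects effect in a repeated-measures design with m = 2
# measurements and correlation rho = .5 among them (a common default).
import math
from statsmodels.stats.power import FTestAnovaPower

f = 0.25          # medium effect size (Cohen's f)
m, rho = 2, 0.5   # two measurements, assumed correlation
f_adj = f * math.sqrt(m / (1 + (m - 1) * rho))  # adjusted effect size

n_total = FTestAnovaPower().solve_power(effect_size=f_adj, k_groups=2,
                                        alpha=0.05, power=0.80)
print(round(n_total))  # should land near the reported minimum of 98
# (small deviations are possible due to differing df conventions)
```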
We recruited young and older adult couples who had been in an exclusive romantic relationship for at least three months. Fifty-two young adults (aged 18-30 years; M = 21.35, SD = 3.16; 28 females and 24 males) and 57 community-dwelling older adults (aged 60-91 years; M = 74.88, SD = 6.60; 29 females and 28 males) living in the northeastern region of the United States participated in Session 1. Two older adults were excluded from analyses because they did not speak English fluently and did not understand the instructions. All remaining participants spoke English fluently. This left 55 older adults for analyses (aged 60-91 years; M = 74.96, SD = 6.63; 28 females and 27 males). Younger couples had an average relationship duration of 32.33 months (SD = 31.44), with a range of 4-121 months, while older couples had an average relationship duration of 562.40 months (SD = 185.61), with a range of 120-885 months. Young adults were recruited from an introductory psychology course and with flyers posted on campus. Older adults were recruited from a lifelong learning class and with advertisements. Participants received either course credit or a monetary stipend.
Eighty-eight percent of the sample self-identified as Caucasian, 8% Asian American, 3% other, <1% African American, and <1% American Indian. Table 1 displays descriptive statistics and t-test results for age differences in demographic, perceptual, cognitive, and affective/personality variables. On average, participants reported being in fairly good health (M = 3.75, SD = .95) on a 5-point Likert-type scale (1 = poor, 2 = fair, 3 = good, 4 = very good, 5 = excellent). Young adults reported being in significantly better health than older adults, p < .05. However, the average rating for both age groups was between good and very good, suggesting an overall healthy sample. Participants in this sample were highly educated: most had attended some college or were college graduates. Not surprisingly, because young adult participants were primarily college students still completing their education, older adults reported more years of formal education than young adults, p < .001. Participants in this sample exhibited typical age patterns of perceptual and cognitive functioning, with young adults exhibiting better near and far visual acuity and contrast sensitivity than older adults, ps < .001, and older adults outperforming young adults on a vocabulary test, p < .001. We screened for dementia with the Mini-Mental State Exam (MMSE; Folstein, Folstein, & McHugh, 1975). All older adult participants scored 26 or above on the MMSE, above the conventional cut-off indicating normal cognitive functioning.
Table 1.
Descriptive Statistics and T-tests for Age Differences in Demographic, Perceptual, Cognitive, and Affective/Personality Variables
| Category | Variable | Max. or Best Score | Young Adults (N = 52) Mean | Young Adults SE | Older Adults (N = 55) Mean | Older Adults SE | df | t | p | d |
|---|---|---|---|---|---|---|---|---|---|---|
| Demographic | Health | 5 | 3.96 | .12 | 3.54 | .13 | 104 | 2.36 | .02 | .45 |
| Demographic | Years Education | 21 | 14.67 | .31 | 16.48 | .37 | 104 | 3.74 | < .001 | .73 |
| Perceptual | Far Visual Acuity | 20 | 26.42 | 1.89 | 43.33 | 3.64 | 104 | 4.08 | < .001 | .80 |
| Perceptual | Near Visual Acuity | 20 | 21.25 | .51 | 49.27 | 7.57 | 105 | 3.59 | < .001 | .70 |
| Perceptual | Contrast Sensitivity | 2.25 | 1.63 | .01 | 1.46 | .02 | 105 | 7.21 | < .001 | 1.35 |
| Perceptual | Facial Discrimination | 27 | 19.20 | .75 | 17.62 | .69 | 102 | 1.54 | .13 | .30 |
| Cognitive | Vocabulary | 20 | 13.29 | .28 | 14.80 | .29 | 104 | 3.77 | < .001 | .73 |
| Affective/Social | Depressive Symptoms | 60 | 10.48 | .85 | 7.63 | .91 | 104 | 2.28 | .03 | .44 |
| Affective/Social | Future Time Perspective | 7 | 5.46 | .11 | 3.52 | .13 | 104 | 11.51 | < .001 | 2.25 |
Note. Degrees of freedom differ due to missing data. Higher scores indicate better performance/greater endorsement unless otherwise indicated. Age differences with p < .05 were considered significant. The visual acuity score is the denominator of the visual acuity ratio (20/20) representing the letter size in M-units that the participant can read at a distance of 20 meters. Thus, lower scores indicate better visual acuity. The Rosenbaum near vision acuity score also represents the denominator of the ratio for distance equivalents (20/20), with lower scores representing better near visual acuity. One older adult female was missing data for the Health, Education, Far Visual Acuity, Vocabulary, and Affective/Social variables. Two older adult females and one young adult female were missing the Facial Discrimination data.
Materials and Measures
Perceptual functioning
In order to ensure that all participants could adequately perceive the visual stimuli of facial expressions, we tested participants’ visual functioning on four measures: the standard Snellen Eye Chart for visual acuity (Hetherington, 1954), the Rosenbaum Screener for near vision (Rosenbaum, 1984), the Pelli-Robson Contrast Sensitivity Chart (Pelli, Robson, & Wilkins, 1988), and the Benton Facial Discrimination Test-Short Form (Levin, Hamsher, & Benton, 1975). Although the Benton Facial Discrimination Test is a neuropsychological measure of prosopagnosia, a disorder in the ability to recognize faces, in accordance with past research (Borod et al., 2000; Stanley & Blanchard-Fields, 2008) we used the test as a way to determine whether participants could discriminate between different human faces. In this 27-item test, participants must choose pictures of the same person as the target face out of an array of six faces. The internal consistency of the test is excellent (Cronbach’s α = .90). Scores were the number of target faces matched correctly.
Cognitive functioning
Crystallized intelligence was measured with the 20-item Shipley Vocabulary Test (Zachary, 1986). Scores were the number of multiple-choice words defined correctly.
Affective and personality measures
Depressive symptomology was measured with the Center for Epidemiological Studies Depression Scale (Radloff, 1977). We also assessed whether participants view their future as open-ended or limited with the Future Time Perspective Scale (Carstensen & Lang, 1996). Participants indicated the extent of their agreement with 10 items using a 7-point Likert-type scale (1 = very untrue, 7 = very true). A sample item is: Many opportunities await me in the future. Higher scores on this scale indicate a more expansive future time perspective.
Traditional emotion perception task
Participants were asked to identify the emotion expressed by 42 faces from the Pictures of Facial Affect set (Ekman & Friesen, 1976). Six different faces for each of the six basic emotions plus neutral (anger, disgust, fear, happy, sad, surprise, and neutral) were presented one at a time in a randomized order on a 17-in. display (resolution 1280 × 1024, refresh rate 60 Hz) using GazeTracker software (Eyetellect, LLC, Charlottesville, VA). Participants were seated approximately 80 cm from the monitor, and the faces were centered on the screen, subtending a visual angle of approximately 5.11° (width) × 6.92° (height). The seven emotion labels corresponded to numbers, and participants responded with a key press to indicate their choice for each face. Trials were self-paced, and labels for the emotions and corresponding numbers were available throughout the task.
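For readers who want to reproduce the stimulus geometry, the relation between visual angle and on-screen size is simple trigonometry. The snippet below is illustrative only (the function name is ours, not from the original materials): at roughly 80 cm viewing distance, the reported 5.11° × 6.92° corresponds to a face of roughly 7 cm × 10 cm on screen.

```python
# Illustrative conversion of the reported visual angles to physical stimulus size.
import math

def visual_angle_to_size(angle_deg: float, distance_cm: float) -> float:
    """Physical extent (cm) that subtends a given visual angle at a given distance."""
    return 2 * distance_cm * math.tan(math.radians(angle_deg) / 2)

width_cm = visual_angle_to_size(5.11, 80)   # ~7.1 cm
height_cm = visual_angle_to_size(6.92, 80)  # ~9.7 cm
print(f"{width_cm:.1f} cm x {height_cm:.1f} cm")
```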
Participants were pseudo-randomly (balanced across age and gender) assigned to one of two motivation conditions: explicit motivation or no motivation. The only difference between the two conditions was that after receiving instructions on how to complete the emotion perception task, participants in the explicit motivation condition were also told, “Once you have completed the emotion perception task, we will go back through the test together, face by face, and you will explain to me why you chose the emotion you chose for each face.” In actuality, we did not follow through on this procedure. The additional accountability for the explicit motivation condition was intended to increase motivation for emotion perception accuracy in the task (Chen, 2004; Tetlock, 1985, 1992). Participants in the explicit motivation condition were debriefed about the purpose of this deception at the end of the study session.
Dynamic familiar partner emotion perception task stimuli development
During Session 1, participants were video-recorded while watching film clips intended to evoke seven different emotions: anger, disgust, fear, happy, sad, surprise, and neutral. Participants reported the emotion they felt and intensity ratings for each of the seven film clips.
Film task materials and procedure
Participants watched seven film clips validated to evoke the six basic emotions plus neutral in young and older adults (Rottenberg, Ray, & Gross, 2007), in one of four different counterbalanced orders. Each member of a couple was tested individually, but in order to reduce the possibility that partners would give away how they felt by discussing the clips they saw after participating, each member of a couple saw a completely different set of seven emotion-eliciting film clips. The clips ranged in duration from 14 seconds to 4 minutes.
Participants were informed that they would be video-recorded while they watched the film clips and were asked to, “Express whatever you are feeling on your face so that someone who was watching you would know exactly what you are feeling.” Prior to each film clip, participants completed a pre-film questionnaire in which they indicated, 1) the emotion they felt the most “right now” (angry, disgusted, afraid, happy/amused, sad, surprised, neutral/no emotion), 2) rated how intensely they were feeling that emotion (0 = not feeling even the slightest bit of the emotion to 8 = the most you could imagine feeling the emotion in a lab setting), and 3) indicated any other emotions they felt and the intensity of those emotions.
Immediately following each film clip, participants completed a post-film affect questionnaire, which was identical to the pre-film questionnaire except that participants were asked how they felt “during the film clip.” Before going on to the next pre-film questionnaire, participants were instructed to clear their minds and completed a 40-second distractor task in which they copied an abstract geometric drawing. This distractor task, adapted from Gross and colleagues (Gross, Sutton, & Ketelaar, 1998), was used in order to reduce emotional carryover effects. Participants were instructed to copy, sketch, or draw an abstract geometric figure that was presented for 40 seconds on the computer screen. Participants were assured that the quality of the drawings was not important and that the task was simply to give them a break between the films.
Editing and coding of dynamic facial expressions
Each participant yielded seven different clips of facial expressions, one for each of the seven emotions. The clips were edited down to approximately 30 seconds each, with the stipulation that each clip begin with a neutral expression and then unfold into the most intense emotion shown for each of the seven emotions. The average duration of the edited face videos was M = 23.70 seconds, SD = 2.81. The young adult clips (M = 22.08 sec., SD = 2.35) were significantly shorter than the older adult clips (M = 25.43 sec., SD = 2.16), t(89) = 7.08, p < .001, d = 1.48, perhaps because the emotional expression unfolded more slowly or later in the older adult videos than in the young adult videos. Next, all of the emotion clips were independently coded by two researchers familiar with the Facial Action Coding System (FACS; Ekman, Friesen, & Hager, 2002) according to the quality of the emotion expressed (0 = target emotion not present, 1 = target emotion minimally present, 2 = target emotion present, 3 = target emotion clearly present). To assess inter-rater reliability, a two-way mixed, consistency, single-measures Intraclass Correlation Coefficient (ICC) was computed for each of the six emotions, across participants and separately by age group, for these quality codes (see Table 2). ICCs ranged from moderate to substantial (Landis & Koch, 1977), depending on the emotion. Comparing the 95% CIs for young and older adults for each emotion shows that the reliability of the quality codes for most of the emotions was similar for the two age groups (i.e., the confidence intervals overlap). One exception was the reliability for coding the quality of happy facial expressions: the ICC was higher for older adults than young adults and the CIs around the ICCs did not overlap, suggesting that the two coders were more likely to agree on the quality score for happy videos of older adults than of young adults.
Table 2.
Intraclass Correlation Coefficients (ICCs) for Inter-rater Reliability of Quality Codes for Dynamic Emotional Videos
| Emotion | Across Age [95% CI] | Young Adults [95% CI] | Older Adults [95% CI] |
|---|---|---|---|
| Anger | .65 [.51, .75] | .65 [.45, .79] | .64 [.43, .79] |
| Disgust | .65 [.52, .76] | .61 [.39, .76] | .69 [.49, .82] |
| Fear | .50 [.32, .64] | .49 [.24, .68] | .51 [.25, .70] |
| Happy | .75 [.64, .82] | .48 [.22, .67] | .80 [.67, .89] |
| Sad | .58 [.42, .70] | .62 [.41, .77] | .52 [.27, .71] |
| Surprise | .62 [.47, .73] | .73 [.56, .84] | .51 [.25, .70] |
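As a point of reference for readers replicating the reliability analysis, the following is a minimal sketch (not the authors' original syntax; the data layout and column names are hypothetical) of computing a two-way mixed, consistency, single-measures ICC, which corresponds to ICC(3,1), with the pingouin package.

```python
# Minimal sketch of a two-way mixed, consistency, single-measures ICC (ICC(3,1)).
# Data layout is hypothetical: one row per video per coder.
import pandas as pd
import pingouin as pg

ratings = pd.DataFrame({
    "video":   [1, 1, 2, 2, 3, 3, 4, 4],
    "coder":   ["A", "B"] * 4,
    "quality": [2, 3, 1, 1, 3, 2, 0, 1],   # 0-3 quality codes
})

icc = pg.intraclass_corr(data=ratings, targets="video",
                         raters="coder", ratings="quality")
# "ICC3" = two-way mixed effects, consistency, single rater/measurement.
print(icc.loc[icc["Type"] == "ICC3", ["ICC", "CI95%"]])
```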
Average quality codes for each emotion, separately by age group, are displayed in Table 3. To assess whether there were age or emotion differences in the coder-rated quality of the videos, we conducted a 2 (Age Group) × 6 (Emotion) mixed-design ANOVA on quality scores. There was a main effect of Emotion, F(4.09, 355.53) = 31.02, p < .001, ηp2 = .26. Follow-up pairwise comparisons with Bonferroni adjustments indicated that the quality codes for fear (M = 1.43, SE = .09) were significantly lower than the quality codes for all other emotions, ps < .001. Conversely, the quality codes for happy (M = 2.77, SE = .06) were significantly higher than the quality codes for all other emotions, ps ≤ .001. The Age × Emotion interaction was not significant, F(4.09, 355.53) = .34, p = .86, ηp2 = .004. The main effect of Age Group was not significant, F(1, 87) = .25, p = .62, ηp2 = .003.
Table 3.
Average Quality Codes of Dynamic Facial Expression Videos
| Emotion | Young Adults (n = 47) Mean | Young Adults SD | Older Adults (n = 44) Mean | Older Adults SD |
|---|---|---|---|---|
| Anger | 2.04 | .91 | 1.91 | .89 |
| Disgust | 2.34 | .89 | 2.30 | .83 |
| Fear | 1.43 | .83 | 1.43 | .85 |
| Happy | 2.85 | .36 | 2.70 | .73 |
| Sad | 1.94 | .92 | 2.02 | .76 |
| Surprise | 2.09 | .93 | 1.98 | .93 |
Emotional intensity ratings
Because 42% of participants did not respond to the intensity rating for the neutral clip, we limited our intensity rating analyses to the six emotional film types. The majority of participants indicated they felt the target emotion for each of the six emotional clips. The anger clips were the most likely to evoke a different emotion (n = 18) with participants reporting feeling disgusted or sad rather than angry. The fear and disgust clips were the next most likely to fail to elicit the intended emotion. Twelve participants reported feeling neutral rather than afraid for the fear clips. Twelve participants did not report feeling disgusted for the disgust clip, with the majority of these reporting feeling surprised rather than disgusted. These emotional videos were considered failed inductions and as such were not included in accuracy analyses for the dynamic emotion perception task.
To minimize loss of data due to missing values for the intensity ratings, we used the replace missing values function in SPSS 22.0 to replace missing values with the mean intensity rating for each age group for each missing emotion intensity rating. Young adults had the following number of missing values for each emotion: 11 for anger, 3 for disgust, 6 for fear, 1 for happy, 1 for sad, and 1 for surprise. Older adults had the following number of missing values for each emotion: 7 for anger, 9 for disgust, 7 for fear, 4 for happy, 0 for sad, and 2 for surprise. Thus, out of 637 emotion inductions (91 participants induced into 7 emotions), 52 were not successful and we replaced the null value with the mean by age group for these 52 values (8% of the intensity data). Descriptive statistics for intensity ratings by age group can be found in Table 4. To determine whether there were age differences in the intensity ratings of the emotional film clips, we conducted a 2 (Age Group) × 6 (Emotion) mixed-design ANOVA on the intensity ratings with missing values replaced. Results revealed a significant main effect of Age Group, F(1, 89) = 31.00, p < .001, ηp2 = .26, with older adults (M = 6.21, SE = .17) reporting higher intensity ratings than young adults (M = 4.91, SE = .16). There was also a significant main effect of Emotion, F(5, 445) = 19.68, p < .001, ηp2 = .18. Follow-up pairwise comparisons with Bonferroni adjustments showed that the fear clip (M = 4.37, SE = .19) was rated as significantly less intense than all of the other clips, ps < .001. In addition, disgust (M = 6.24, SE = .18) was rated as significantly more intense than anger (M = 5.57, SE = .16) and surprise (M = 5.50, SE = .18), ps < .05. These main effects were qualified by a significant Emotion × Age Group interaction, F(5, 445) = 2.30, p = .044, ηp2 = .03. Follow-up comparisons with Bonferroni adjustments (see Table 4) showed that for all six emotions, older adults rated their emotional experience as significantly more intense than young adults, albeit to a differing extent (hence the interaction).
Table 4.
Descriptive Statistics and Pairwise Comparisons for Film Intensity Ratings by Age Group
| Emotion | Young Adults (n = 47) Mean (SD) | Older Adults (n = 44) Mean (SD) | Mean Difference | p | d |
|---|---|---|---|---|---|
| Anger | 4.50 (1.82) | 6.65 (1.22) | 2.15 | < .001 | 1.39 |
| Disgust | 5.57 (2.06) | 6.91 (1.25) | 1.35 | < .001 | .79 |
| Fear | 3.85 (1.71) | 4.89 (1.82) | 1.04 | .006 | .59 |
| Happy | 5.54 (1.72) | 6.45 (1.67) | .91 | .012 | .54 |
| Sad | 5.09 (1.90) | 6.27 (1.47) | 1.19 | .001 | .69 |
| Surprise | 4.91 (1.89) | 6.10 (1.39) | 1.18 | .001 | .72 |
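The group-mean replacement described above is straightforward to express in code. The sketch below is only an illustration of the logic (the analyses were run in SPSS 22.0; the data frame and column names here are hypothetical): missing intensity ratings are filled with the mean for the same age group and emotion.

```python
# Illustration of age-group mean imputation for missing intensity ratings.
import numpy as np
import pandas as pd

ratings = pd.DataFrame({
    "age_group": ["YA", "YA", "YA", "OA", "OA", "OA"],
    "emotion":   ["anger"] * 6,
    "intensity": [4.0, np.nan, 5.0, 7.0, 6.0, np.nan],
})

ratings["intensity"] = (
    ratings.groupby(["age_group", "emotion"])["intensity"]
           .transform(lambda x: x.fillna(x.mean()))
)
print(ratings)  # NaNs replaced by the YA/OA mean for that emotion
```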
Examining all of these results on our created stimuli together, it appears that the emotion induction for fear was not successful. Thirteen percent of the sample reported feeling neutral rather than afraid during the fear clip. In addition, those participants who did report feeling afraid reported it at a significantly lower intensity than all of the other emotions (with a mean of 4.37 on a scale of 0-8). Finally, the coder-rated quality for fear was significantly lower than all other emotions, suggesting that participants were also not clearly expressing fear in the video clips. For these reasons, we considered the fear induction unsuccessful and so we did not include fear in the dynamic emotion perception task analyses, leaving six emotions (anger, disgust, happiness, sadness, surprise, and neutral) for analysis of emotion perception accuracy.
Dynamic emotion perception task
The duration between Session 1 and Session 2 ranged from 0 to 9 months. An independent samples t-test indicated that there was more time between the sessions for older adults (M = 4.37 months, SD = 3.19) than young adults (M = 3.05 months, SD = 1.61), t(89) = 2.56, p = .012, d = .52. A total of 47 young adults and 44 older adults returned for the second session of the study, during which they were asked to identify the emotions expressed in 7 videos of their familiar partner (one for each emotion) and 7 videos of an age-and-sex-matched stranger. That is, the characteristics of the stranger video were matched to the characteristics of the familiar video for each participant so that if an older adult male saw his older adult female partner for the familiar condition, he was assigned an older adult female video set for his stranger video condition. The order of the stranger and familiar partner video sets was counterbalanced between participants, and the order of emotion type within each block was also counterbalanced within and across participants. Prior to viewing the stranger block, each participant was presented with a still photo of the stranger and asked to confirm that the person was unknown to them. If the participant knew the stranger, another set of stranger videos was selected. (This happened once for a young adult.) Participants watched each of the 14 dynamic emotional expression videos twice, in succession. Because the videos were relatively short (about 30 seconds), we wanted to ensure participants had enough time to view the emotions. For this reason, we showed the emotional videos twice for all emotions, conditions, and participants. Participants were given the following instructions:
Next, you will watch short clips of facial expressions. You will watch each clip twice. After the second viewing, please indicate the emotion that is exhibited the most intensely in the clip by circling one of the following emotion choices on your answer sheet: happy, neutral, sad, fear, surprise, disgust, anger.
Results
H1 and H2: Age Differences in Emotion Perception Accuracy and Effect of Motivation Condition
Table 1 in the supplementary material displays the confusion matrix and hit-rate accuracies as percentages of responses for young and older adults for each emotion, separately by motivation condition. To test (1) whether there were age differences in emotion perception accuracy in the traditional emotion perception task and (2) whether age differences in emotion perception accuracy were attenuated in the explicit motivation condition compared to the no motivation condition, we conducted a mixed-design ANOVA with Age Group and Motivation Condition as between-subjects factors and Emotion as a within-subjects factor. The dependent variable was the proportion of correct responses out of the number of responses present for each of the seven discrete emotions, such that scores could range from 0 to 1.0. Of note, and consistent with past findings (Isaacowitz et al., 2007), the hit-rate accuracies for all cells were well above chance (14% in a decision task with seven choices). Happy faces were recognized best, while fearful faces had the lowest accuracy. All analyses for the traditional emotion perception task were also computed using the unbiased hit rate to correct for any systematic response biases. Because the patterns of effects using the unbiased hit rates for accuracy were identical to those using the raw accuracy scores, we present the raw accuracy scores to increase the interpretability of the scores and allow for comparisons across studies.
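For clarity, the two accuracy scores referred to here can be written out explicitly. The sketch below (hypothetical counts, our own function names) computes the raw hit rate and the unbiased hit rate following Wagner's (1993) formulation described in the footnotes, followed by the arcsine transformation used to normalize the proportions.

```python
# Raw hit rate vs. Wagner's (1993) unbiased hit rate (hypothetical counts).
import numpy as np

def hit_rate(hits: int, n_stimuli: int) -> float:
    """Proportion of stimuli of a given emotion that were labeled correctly."""
    return hits / n_stimuli

def unbiased_hit_rate(hits: int, n_stimuli: int, n_label_uses: int) -> float:
    """Hu = hits^2 / (stimuli presented x total times that label was chosen)."""
    return (hits ** 2) / (n_stimuli * n_label_uses) if n_label_uses else 0.0

# Example: 5 of 6 angry faces labeled "angry"; "angry" was chosen 8 times in total.
raw = hit_rate(5, 6)                 # ~0.83
hu = unbiased_hit_rate(5, 6, 8)      # ~0.52
hu_arcsine = np.arcsin(np.sqrt(hu))  # arcsine normalization before the ANOVA
print(raw, hu, hu_arcsine)
```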
Consistent with past work (Ruffman et al., 2008), the main effect of Age Group was significant, F(1, 102) = 10.66, p = .001, ηp2 = .10, with young adults (M = .85, SE = .01) exhibiting greater emotion perception accuracy overall than older adults (M = .79, SE = .01). There was also a significant main effect of Emotion, F(4.01, 409.15) = 65.92, p < .001, ηp2 = .39, which was qualified by the predicted Age Group × Emotion interaction, F(4.01, 409.15) = 4.93, p = .001, ηp2 = .05. Follow-up comparisons with Bonferroni adjustments revealed significant age differences, with young outperforming older adults, for accurate recognition of angry (YA: M = .85, SE = .03; OA: M = .72, SE = .03; p = .008), sad (YA: M = .85, SE = .03; OA: M = .68, SE = .02; p < .001), and neutral faces (YA: M = .96, SE = .02; OA: M = .88, SE = .02; p = .004).
The predicted Age Group × Motivation Condition interaction was also significant, F(1, 102) = 9.84, p = .002, ηp2 = .09 (Figure 1). Simple effects tests conducted separately by motivation condition revealed significant age differences in the no motivation condition, F(1, 51) = 17.16, p < .001, ηp2 = .25, with young adults (M = .88, SE = .02) outperforming older adults (M = .77, SE = .02). As predicted, age differences were not significant in the explicit motivation condition, F(1, 51) = .01, p = .92, ηp2 < .001. This attenuation of age differences reflects both lower accuracy among young adults and somewhat higher accuracy among older adults in the explicit motivation condition relative to the no motivation condition. Follow-up tests within age groups confirmed that young adults’ accuracy was significantly worse in the explicit motivation condition than in the no motivation condition, F(1, 49) = 8.19, p = .006, ηp2 = .14, whereas older adults’ accuracy did not significantly differ by motivation condition, F(1, 53) = 3.13, p = .082, ηp2 = .06. Nevertheless, the older adults’ means were in the expected direction (and opposite to young adults’), with a trend toward slightly better performance in the explicit motivation condition than in the no motivation condition. The main effect of Motivation Condition was not significant, F(1, 102) = .16, p = .69, ηp2 = .002, and neither the Motivation Condition × Emotion interaction, F(4.01, 409.15) = 1.75, p = .14, ηp2 = .02, nor the Motivation Condition × Age Group × Emotion interaction, F(4.01, 409.15) = 1.46, p = .21, ηp2 = .01, was significant.
Figure 1.
Average emotion perception accuracy in the traditional emotion perception task as a function of motivation condition and age group. Bars are standard errors of the mean.
H3: Age Differences in Dynamic Emotion Perception Accuracy as a Function of Familiarity
Before analyzing the dynamic emotion perception task data, we wanted to determine whether the scores needed to be corrected for response biases. For example, if the older adults were more likely than the young adults to respond with “disgust” when they were unsure of the emotion, then the older adults might have higher disgust accuracy than young adults due to this disgust bias. To check whether there were systematic biases that differed by age group for endorsing the various response options, we conducted a 2 (Familiarity) × 6 (Emotion) × 2 (Age Group) mixed-design ANOVA on the total responses submitted for each emotion type. The main effect of age group on the number of responses across emotions was not significant, F(1, 89) = 3.47, p = .07, ηp2 = .04. Additionally, the Familiarity × Age Group interaction was not significant, F(1, 89) = 2.20, p = .14, ηp2 = .02, and neither was the Emotion × Age Group interaction, F(3.96, 352.46) = 2.34, p = .056, ηp2 = .03, nor the Familiarity × Emotion × Age Group interaction, F(3.68, 327.40) = .22, p = .91, ηp2 = .003. Because we did not find evidence for systematic age differences in response biases, we moved forward with data analysis with the raw accuracy scores.
In order to reduce the amount of data loss due to failed emotion inductions (see above), the replace missing values function in SPSS 22.0 was used to impute the mean accuracy separately by age group for each of the 14 emotion judgments on the sample of participants who returned for the second session (i.e., 47 young adults and 44 older adults). This resulted in 80 emotion judgments out of 1,274 (8%) being replaced by the age group mean. The number of values replaced with the mean for each emotion judgment is presented in Table 5. Hit-rate accuracy percentages and percentages of each error type are provided in Table 2 of the supplementary materials, separately for strangers and familiar partners. It is clear that this task was much more difficult than the traditional emotion perception task: hit-rate accuracies were much lower, with some hovering near chance (e.g., for anger recognition). One-sample t-tests conducted separately by age group revealed that young adults were not significantly different from chance (14%) at identifying angry expressions in strangers or familiar partners, ps > .05. Older adults were significantly worse than chance at identifying anger in their familiar partner, p < .05. In addition, older adults were not significantly different from chance at recognizing sadness in their familiar partner, or at recognizing anger, disgust, sadness, or surprise in the stranger condition, ps > .05.
Table 5.
Number of Missing Values Replaced by the Age Group Mean for Emotion Judgment in the Familiar Partner (FP) and Stranger (ST) Dynamic Emotion Perception Task
| Emotion Judgment | Young Adults (n = 47) | Older Adults (n = 44) | Total |
|---|---|---|---|
| FP Anger | 12 | 8 | 20 |
| FP Disgust | 3 | 10 | 13 |
| FP Happy | 1 | 5 | 6 |
| FP Sad | 1 | 1 | 2 |
| FP Surprise | 0 | 3 | 3 |
| FP Neutral | 0 | 1 | 1 |
| ST Anger | 11 | 6 | 17 |
| ST Disgust | 3 | 8 | 11 |
| ST Happy | 1 | 2 | 3 |
| ST Sad | 1 | 0 | 1 |
| ST Surprise | 1 | 2 | 3 |
| ST Neutral | 0 | 0 | 0 |
| Total | 34 | 46 | 80 |
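The chance-level comparisons reported above follow directly from the task structure: with seven response options, chance is 1/7 (about 14%). A minimal sketch of such a one-sample test, with a hypothetical vector of per-perceiver hit rates, is shown below.

```python
# One-sample t-test of hit-rate accuracy against chance (1/7) for a
# seven-alternative forced choice; the accuracy vector is hypothetical.
import numpy as np
from scipy import stats

chance = 1 / 7
anger_accuracy = np.array([0, 0, 1, 0, 0, 1, 0, 0], dtype=float)  # 0/1 per perceiver
t, p = stats.ttest_1samp(anger_accuracy, popmean=chance)
print(f"t = {t:.2f}, p = {p:.3f}")
```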
To determine whether the familiarity of the target influenced age differences in emotion perception accuracy, we conducted a 2 (Age Group) × 6 (Emotion) × 2 (Familiarity Condition) mixed-design ANCOVA on the raw emotion perception accuracy scores. Because there was a wide range for the duration between sessions one and two (0-9 months) and young and older adults significantly differed on this variable, we included duration between sessions as a covariate to control for possible differential history or maturation effects. There was a main effect of Age Group, F(1, 87) = 45.47, p < .001, ηp2 = .34, with young adults (M = .62, SE = .03) outperforming older adults (M = .34, SE = .03). There was also a main effect of Emotion, F(5, 435) = 15.43, p < .001, ηp2 = .15. These main effects were qualified by an Emotion × Age Group interaction, F(5, 435) = 3.53, p = .004, ηp2 = .04. Pairwise comparisons with Bonferroni adjustments revealed significant age differences for all emotions, with young adults exhibiting higher accuracy than older adults for all six emotions. Both groups were best at recognizing happy facial expressions and worst at identifying angry expressions. However, the range of accuracy differed for the two groups. Young adult accuracy ranged from 17% (anger) to 95% (happy), while older adults’ accuracy scores only spanned 5% (anger) to 56% (happy).
There was also a significant Age Group × Familiarity interaction, F(1, 87) = 4.77, p = .032, ηp2 = .05, such that the advantage of familiar partners over strangers was greater for older adults than young adults (see Figure 2). To decompose this interaction and examine the prediction that age differences would be attenuated in the familiarity condition relative to the stranger condition, we performed Emotion × Age Group mixed-design ANCOVAs for each condition separately (controlling for duration between sessions). As expected, in the stranger condition, young adults (M = .63, SE = .03) were more accurate than older adults (M = .29, SE = .03), F(1, 88) = 50.64, p < .001, ηp2 = .37. However, also in the familiar condition, young adults (M = .61, SE = .03) were more accurate than older adults (M = .38, SE = .04), F(1, 87) = 20.87, p < .001, ηp2 = .19. Thus, age differences were significant for both conditions, but the stranger condition had a larger effect size (ηp2 = .37) than the familiar condition (ηp2 = .19).
Figure 2.
Average emotion perception accuracy for the dynamic emotion perception task as a function of familiarity and age group. Bars represent standard errors of the mean.
The main effect of Familiarity was not significant, F(1, 87) = .36, p = .55, ηp2 = .004, nor was the Familiarity × Emotion interaction, F(3.77, 327.85) = .82, p = .51, ηp2 = .009. The Familiarity × Emotion × Age Group interaction was also not significant, F(3.77, 327.85) = 1.48, p = .21, ηp2 = .017. Duration between sessions was not a significant covariate, and it did not interact with any of the other variables, ps > .05.
Discussion
We set out to determine whether age differences in emotion perception accuracy could be attenuated when participants care more about their performance (motivation) or know more about the targets (familiarity) in order to better understand whether age-related differences in emotion perception are biologically-bound or mutable. We found evidence that these factors can influence the pattern of age effects. Experimenter-provided motivation eliminated age differences and the familiarity of the target attenuated age differences.
Caring More: Accountability Instructions Eliminate Age Differences in Emotion Perception
The accountability instructions prior to the traditional emotion perception task did eliminate age differences in emotion perception accuracy (Figure 1), suggesting that motivation to care more about the task improved the performance of older adults. This is an important finding because it speaks to whether age differences in emotion perception reflect motivational shifts (Carstensen, 2006) or brain-based changes (Cacioppo, Bernston, Bechara, Tranel, & Hawkley, 2011). Our findings are consistent with recent work by Zhang and colleagues (2013) in which a closeness manipulation eliminated age differences in an emotion perception task. In the present study, however, young adults’ accuracy was actually hampered by the accountability instructions, perhaps because the instructions changed the way in which young adults approached the task. It may be that young adults in the no motivation condition relied on intuitive, automatic processing of the faces, but when warned that they would be held accountable for explaining their choices in the motivation condition, their strategy changed to a more deliberate, analytical style, which led to worse performance. Emotional facial expressions can be identified automatically (Mather & Knight, 2006; Tracy & Robins, 2008). It may be that the natural strategy employed by young and older adults in the traditional emotion perception task differs, such that young adults may use an automatic, “gut” response style while older adults might typically use a deliberate, effortful processing style. The accountability instructions may have compelled young adults to process the faces using more controlled processing, and older adults to effortfully engage even more cognitive resources in the task. This possibility is consistent with research on “thin slice” judgments, where manipulations that require providing reasons for what are usually intuitive impressions impair performance (Ambady, 2010). This suggests that age differences in emotion perception could be due to young and older adults naturally using different processing styles (automatic vs. effortful) when making emotion judgments, and perhaps older adults should be encouraged to rely on more automatic processing to improve emotion perception accuracy.
Accountability instructions may also trigger a self-oriented focus, rather than an other-oriented focus (Ma-Kellams & Blascovich, 2013). For example, students promised a financial incentive for greater empathic accuracy performed worse than students who were not given a financial incentive (Ma-Kellams & Blascovich, 2013). At this point, these ideas are speculative and future replications of this manipulation and investigation into the underlying mechanisms are warranted.
Knowing More: Target Familiarity Attenuated Age Differences
In the dynamic emotion perception task, the most striking result was the extremely low overall accuracy when videos of genuinely felt emotions were used as stimuli. Compared to the traditional emotion perception task, where accuracy ranged from 55% to 98%, accuracy in our familiar partner emotion perception task ranged from 5% to 95%, with young adults no different from chance at recognizing anger in strangers or familiar partners. Older adults were at or below chance at recognizing anger in both types of targets. The dynamic stimuli of the familiar partner emotion perception task, which relied on targets expressing their felt emotions, clearly provided a weaker signal than the static, posed facial expressions in the traditional emotion perception task. The familiar partner task was therefore harder than the traditional task and overcame one of the challenges of traditional emotion perception studies, namely, ceiling effects (Isaacowitz et al., 2007). This seemed especially true for older adults, whose maximum average score was only 56%, for happiness identification.
Nevertheless, age differences were attenuated in the familiar partner condition relative to the stranger condition. Although young adults were more accurate than older adults in both conditions, the smaller effect size for the age difference in the familiar condition, relative to the stranger condition, suggests that familiarity with the target brings the accuracy of young and older adults closer together.
Limitations and Future Directions
These findings are tempered by several limitations. The overall low accuracy in our dynamic emotion perception task is problematic because it suggests the task may have been too difficult. While we were successful at avoiding the ceiling effects that typically handicap this literature, we traded that problem for floor effects. Despite low overall accuracy, a similar pattern of accuracy to the traditional emotion perception task emerged, with happiness being the easiest emotion to recognize and anger being among the most difficult. The fact that the pattern of accuracy differences by discrete emotion is mimicked in our dynamic task helps to validate this new type of task and situate it in the literature. Low accuracy for recognizing naturally expressed emotions is consistent with the contention that the category boundaries of emotions do not necessarily map on to the boundaries of functional emotion concepts (Barrett, 2006; Barrett, Lindquist, & Gendron, 2007).
A second limitation was the age difference in the duration of the edited clips in the dynamic emotion perception task; older adult target clips were significantly longer than young adults’. This difference is confounded with age group because older adults only made emotion judgments about older adult targets and young adults only made judgments about young adult targets. Although we would have preferred equal clip durations across age groups, older adults viewed the longer clips yet still performed worse than young adults; assuming that longer clips provide more information for accurate judgments, we believe it is unlikely that this mean difference of roughly 3.4 seconds can account for the observed pattern of results. Of course, this is an empirical question that deserves further study. A third limitation is that the fear induction was not successful, so we could not include fear in the dynamic emotion perception task. Future work should use a better fear induction technique so that age differences in fear recognition of familiar partners can be studied.
Future research should also investigate the effects of motivation and knowledge longitudinally to determine whether age-related differences are due to developmental shifts or cohort effects. Furthermore, it would be interesting to examine how motivation and familiarity influence perception across generations (i.e., young adults interpreting the facial expressions of older adults and vice versa).
Conclusion
The main contributions of this study were the simultaneous investigation of boundary conditions on age effects in emotion perception and the introduction of a more ecologically valid dynamic emotion perception task. Understanding a social partner's emotion likely depends on a constellation of factors: situational context, physiology, history with the partner, motivation, biases, goals, and bodily, vocal, and lexical cues. Although aging may reduce the accuracy afforded by a single cue (visual perception of facial emotion), older adults may adapt to these changes in everyday life by reserving judgment and/or relying on other cues to determine the emotion of their social partners. In traditional emotion perception tasks, however, participants are forced to make a judgment based on the limited (and perhaps degraded, for older adults) information in a static photograph of a stranger's facial expression.
Accurate emotion perception is likely related to better interpersonal functioning, and this is probably true at any age. Many of the elements present in everyday emotion-laden interactions are absent from traditional emotion perception tasks, so lab tasks may suggest emotion perception problems that older adults do not actually experience in daily life. The current findings hint at important differences in the type of processing that occurs when making emotion judgments in traditional emotion perception tasks versus situations in which motivation or knowledge is increased.
More generally, these findings point to basic elements of emotion perception that may be missing in standard tasks with participants of any age. Across age groups, person-level variables such as motivation, as well as dyad-level variables such as familiarity, can be key drivers of emotion perception and performance. A key implication for social perception in general is thus that interpersonal aspects of the emotion perception task may be critical yet understudied.
Acknowledgments
This research was supported by National Institute on Aging grants T32 AG-00204 and R01 AG-026323. We are grateful to Hanna Negami, Amy Stricoff, and Nia Fogelman for assistance with data collection.
Footnotes
Portions of these data were presented at the biennial Cognitive Aging Conference (2010) and the annual meeting of the Gerontological Society of America (2011).
The number of male and female participants is unbalanced because two same-sex couples participated in the study.
One older adult female participated whose male partner did not participate.
The date of the second experimental session was missing for three young adults. For these three participants, we replaced the missing value for the duration between sessions with the average between-session duration for young adults (M = 3.05).
We also analyzed these data using the unbiased hit rate (Wagner, 1993) to correct for possible response biases, which is especially important given the age comparisons in this study (Isaacowitz et al., 2007). Because the pattern of results was remarkably similar to the analyses using the raw scores, we present the raw scores in the main text to facilitate interpretation and comparison across studies. We computed the unbiased hit rate (hits squared, divided by the total number of items multiplied by the base rate of responses submitted for that emotion across all trials) and then normalized the proportions with an arcsine transformation. The main effect of Age Group was significant, F(1, 102) = 9.36, p = .003, ηp2 = .08, with young adults (M = .99, SE = .03) exhibiting greater emotion perception accuracy overall than older adults (M = .84, SE = .03). There was also a significant main effect of Emotion, F(4.34, 442.28) = 113.30, p < .001, ηp2 = .53, which was qualified by the predicted Age Group × Emotion interaction, F(4.34, 442.28) = 3.17, p = .011, ηp2 = .03. Follow-up comparisons with Bonferroni adjustments revealed significant age differences, with young adults outperforming older adults, for accurate recognition of angry (YA: M = .82, SE = .05; OA: M = .61, SE = .05; p = .004), sad (YA: M = .95, SE = .05; OA: M = .66, SE = .04; p < .001), and neutral faces (YA: M = 1.33, SE = .06; OA: M = 1.06, SE = .06; p = .001). The predicted Age Group × Motivation Condition interaction was also significant, F(1, 102) = 9.36, p = .003, ηp2 = .08. Simple effects tests conducted separately by motivation condition revealed significant age differences in the no motivation condition, F(1, 51) = 19.77, p < .001, ηp2 = .28, with young adults (M = 1.07, SE = .05) outperforming older adults (M = .79, SE = .04). As predicted, age differences were not significant in the explicit motivation condition, F(1, 51) = .39, p = .54, ηp2 = .008. However, this attenuation of age differences reflects both the lower accuracy of young adults and the higher accuracy of older adults in the explicit motivation condition relative to the no motivation condition. Follow-up tests within age groups confirmed that young adults' accuracy was significantly worse in the explicit motivation condition than in the no motivation condition, F(1, 49) = 6.86, p = .012, ηp2 = .12, whereas older adults' accuracy did not significantly differ by motivation condition, F(1, 53) = 2.88, p = .096, ηp2 = .05. The means for older adults were nonetheless in the expected direction, and opposite to young adults, with a trend toward slightly better performance in the explicit motivation condition than in the no motivation condition.
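Expressed as formulas, the computation described above can be sketched as follows (the symbols A, B, and C are our labels rather than Wagner's original notation, and the arcsine variant is assumed to be the common arcsine-of-square-root form, since the text specifies only "an arcsine transformation"):
H_u = \frac{A^2}{B \cdot C}, \qquad H_u^{\ast} = \arcsin\!\left(\sqrt{H_u}\right),
where, for a given emotion category, A is the number of hits, B is the total number of items presenting that emotion, and C is the total number of times that emotion was submitted as a response across all trials.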
One young adult female participant was missing data for the traditional emotion recognition task, reducing the young adult sample to 51 for these analyses.
When the assumption of sphericity was violated, statistics applying the Greenhouse-Geisser correction are reported.
We also checked whether duration of the relationship might account for differences in emotion perception accuracy better than age. Relationship duration was significantly negatively correlated with emotion perception accuracy in the familiar condition, r(86) = −.31, p < .001. When total emotion perception accuracy across all six emotions in the familiar partner condition was regressed on age group, age group was a significant predictor, b = −.64, SE = .15, p < .001. When duration of the relationship was added to the model as a predictor, age group remained significant, b = −.77, SE = .33, p = .021, whereas relationship duration was not, b = .001, SE = .001, p = .492. We also examined the correlations between relationship duration and emotion perception accuracy separately by age group; none of these correlations were significant, ps > .30.
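As a sketch of the model comparison described above (our notation; the coding of the age group variable is assumed rather than stated in the text), the two regressions are:
\text{Accuracy}_i = b_0 + b_1 \cdot \text{AgeGroup}_i + \varepsilon_i
\text{Accuracy}_i = b_0 + b_1 \cdot \text{AgeGroup}_i + b_2 \cdot \text{RelationshipDuration}_i + \varepsilon_i,
with age group remaining a significant predictor in the second model (b_1 = −.77) while relationship duration did not contribute (b_2 = .001).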
References
- Ambadar Z, Schooler JW, Cohn JF. Deciphering the enigmatic face: The importance of facial dynamics in interpreting subtle facial expressions. Psychological Science. 2005;16:403–410. doi: 10.1111/j.0956-7976.2005.01548.x.
- Ambady N. The perils of pondering: Intuition and thin slice judgments. Psychological Inquiry. 2010;21:271–278. doi: 10.1080/1047840X.2010.524882.
- Baltes PB, Baltes MM. Psychological perspectives on successful aging: The model of selective optimization with compensation. In: Baltes PB, Baltes MM, editors. Successful aging: Perspectives from the behavioral sciences. Cambridge University Press; New York, NY: 1990. pp. 1–34.
- Barrett LF. Are emotions natural kinds? Perspectives on Psychological Science. 2006;1:28–58. doi: 10.1111/j.1745-6916.2006.00003.x.
- Barrett LF, Lindquist KA, Gendron M. Language as context for the perception of emotion. Trends in Cognitive Sciences. 2007;11:327–332. doi: 10.1016/j.tics.2007.06.003.
- Barrett LF, Mesquita B, Gendron M. Context in emotion perception. Current Directions in Psychological Science. 2011;20:286–290. doi: 10.1177/0963721411422522.
- Borod JC, Pick LH, Hall S, Sliwinski M, Madigan N, Obler LK, et al. Relationships among facial, prosodic, and lexical channels of emotional perceptual processing. Cognition & Emotion. 2000;14:193–211. doi: 10.1080/026999300378932.
- Bould E, Morris N. Role of motion signals in recognizing subtle facial expressions of emotion. British Journal of Psychology. 2008;99:167–189. doi: 10.1348/000712607X206702.
- Cacioppo JT, Bernston GG, Bechara A, Tranel D, Hawkley LC. Could an aging brain contribute to subjective well being?: The value added by a social neuroscience perspective. In: Todorov A, Fiske ST, Prentice D, editors. Social Neuroscience: Toward Understanding the Underpinnings of the Social Mind. Oxford University Press; New York, NY: 2011. pp. 249–262.
- Carstensen LL. The influence of a sense of time on human development. Science. 2006;312:1913–1915. doi: 10.1126/science.1127488.
- Carstensen LL, Lang FR. Future Time Perspective Scale. Stanford University; Stanford, CA: 1996.
- Carton JS, Kessler EA, Pape CL. Nonverbal decoding skills and relationship well-being in adults. Journal of Nonverbal Behavior. 1999;23:91–100. doi: 10.1023/A:1021339410262.
- Charles ST, Campos B. Age-related changes in emotion recognition: How, why, and how much of a problem? Journal of Nonverbal Behavior. 2011;35:287–295. doi: 10.1007/s10919-011-0117-2.
- Chen Y. Age differences in correction of context-induced biases: Source monitoring and timing of accountability. Aging, Neuropsychology, & Cognition. 2004;11:58–67. doi: 10.1076/anec.11.1.58.29359.
- Ebner NC, Johnson MK. Young and older emotional faces: Are there age group differences in expression identification and memory? Emotion. 2009;9:329–339. doi: 10.1037/a0015179.
- Ekman P, Friesen WV. Pictures of Facial Affect [Photos]. Consulting Psychologists Press; Palo Alto, CA: 1976.
- Ekman P, Friesen WV, Hager J. The Facial Action Coding System. 2002.
- Engelberg E, Sjöberg L. Emotional intelligence, affect intensity, and social adjustment. Personality and Individual Differences. 2004;37:533–542. doi: 10.1016/j.paid.2003.09.024.
- Faul F, Erdfelder E, Buchner A, Lang A-G. Statistical power analyses using G*Power 3.1: Tests for correlation and regression analyses. Behavior Research Methods. 2009;41:1149–1160. doi: 10.3758/BRM.41.4.1149.
- Faul F, Erdfelder E, Lang A-G, Buchner A. G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods. 2007;39:175–191. doi: 10.3758/bf03193146.
- Folstein MF, Folstein SE, McHugh PR. Mini-mental state: A practical method for grading the cognitive states of patients for the clinician. Journal of Psychiatric Research. 1975;12:189–198. doi: 10.1016/0022-3956(75)90026-6.
- Fredrickson BL, Carstensen LL. Choosing social partners: How old age and anticipated endings make people more selective. Psychology and Aging. 1990;5:335–347. doi: 10.1037/0882-7974.5.3.335.
- Gross JJ, Carstensen LL, Pasupathi M, Tsai J, Götestam Skorpen C, Hsu AYC. Emotion and aging: Experience, expression, and control. Psychology and Aging. 1997;12:590–599. doi: 10.1037/0882-7974.12.4.590.
- Gross JJ, Sutton SK, Ketelaar T. Relations between affect and personality: Support for the affect-level and affective reactivity views. Personality and Social Psychology Bulletin. 1998;24:279–288. doi: 10.1177/0146167298243005.
- Halberstadt J, Ruffman T, Murray J, Taumoepeau M, Ryan M. Emotion perception explains age-related differences in the perception of social gaffes. Psychology and Aging. 2011;26:133–136. doi: 10.1037/a0021366.
- Hess TM, Germain CM, Swaim EL, Osowski NL. Aging and selective engagement: The moderating impact of motivation on older adults' resource utilization. Journals of Gerontology: Psychological Sciences. 2009;64B:447–456. doi: 10.1093/geronb/gbp020.
- Hess TM, Osowski NL, Leclerc CM. Age and experience influences on the complexity of social inferences. Psychology and Aging. 2005;20:447–459. doi: 10.1037/0882-7974.20.3.447.
- Hess TM, Rosenberg DC, Waters SJ. Motivation and representational processes in adulthood: The effects of social accountability and information relevance. Psychology and Aging. 2001;16:629–642. doi: 10.1037/0882-7974.16.4.629.
- Hess U, Adams RB Jr., Simard A, Stevenson MT, Kleck RE. Smiling and sad wrinkles: Age-related changes in the face and the perception of emotions and intentions. Journal of Experimental Social Psychology. 2012;48:1377–1380. doi: 10.1016/j.jesp.2012.05.018.
- Hetherington R. The Snellen Chart as a test of visual acuity. Psychologische Forschung. 1954;24:349–357. doi: 10.1007/BF00422033.
- Hunter EM, Phillips LH, MacPherson SE. Effects of age on cross-modal emotion perception. Psychology and Aging. 2010;25:779–787. doi: 10.1037/a0020528.
- Ickes W, Hodges SD. Empathic accuracy in close relationships. In: Simpson JA, Campbell L, editors. The Oxford Handbook of Close Relationships. Oxford University Press; New York, NY: 2013. pp. 348–373.
- Ickes W, Simpson JA. Managing empathic accuracy in close relationships. In: Ickes WJ, editor. Empathic Accuracy. Guilford Press; New York, NY: 1997. pp. 218–250.
- Isaacowitz DM, Loeckenhoff C, Lane R, Wright R, Sechrest L, Riedel R, et al. Age differences in recognition of emotion in lexical stimuli and facial expressions. Psychology and Aging. 2007;22:147–159. doi: 10.1037/0882-7974.22.1.147.
- Isaacowitz DM, Stanley JT. Bringing an ecological perspective to the study of aging and emotion recognition: Past, current, and future methods. Journal of Nonverbal Behavior. 2011;35:261–278. doi: 10.1007/s10919-011-0113-6.
- Keightley ML, Winocur G, Burianova H, Hongwanishkul D, Grady CL. Age effects on social cognition: Faces tell a different story. Psychology and Aging. 2006;21:558–572. doi: 10.1037/0882-7974.21.3.558.
- Klein KJK, Hodges SD. Gender differences, motivation, and empathic accuracy: When it pays to understand. Personality and Social Psychology Bulletin. 2001;27:720–730.
- Krendl AC, Ambady N. Older adults' decoding of emotions: Role of dynamic versus static cues and age-related cognitive decline. Psychology and Aging. 2010;25:788–793. doi: 10.1037/a0020607.
- Kunzmann U, Little TD, Smith J. Is age-related stability of subjective well-being a paradox? Cross-sectional and longitudinal evidence from the Berlin Aging Study. Psychology and Aging. 2000;15:511–526. doi: 10.1037/0882-7974.15.3.511.
- Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33:159–174. doi: 10.2307/2529310.
- Lederman SJ, Klatzky RL, Abramowicz A, Salsman K, Kitada R, Hamilton C. Haptic recognition of static and dynamic expressions of emotion in the live face. Psychological Science. 2007;18:158–164. doi: 10.1111/j.1467-9280.2007.01866.x.
- Levin HS, Hamsher KDS, Benton AL. A short form of the test of facial recognition for clinical use. The Journal of Psychology. 1975;91:223–228.
- Ma-Kellams C, Blascovich J. Inferring the emotions of friends versus strangers: The role of culture and self-construal. Personality and Social Psychology Bulletin. 2012;38:933–945. doi: 10.1177/0146167212440291.
- Ma-Kellams C, Blascovich J. The ironic effect of financial incentive on empathic accuracy. Journal of Experimental Social Psychology. 2013;49:65–71. doi: 10.1016/j.jesp.2012.08.014.
- Malatesta CZ, Fiore MJ, Messina JJ. Affect, personality, and facial expressive characteristics of older people. Psychology and Aging. 1987;2:64–69. doi: 10.1037/0882-7974.2.1.64.
- Malatesta CZ, Izard CE, Culver C, Nicolich M. Emotion communication skills in young, middle-aged, and older women. Psychology and Aging. 1987;2:193–203. doi: 10.1037/0882-7974.2.2.193.
- Mather M, Knight M. Angry faces get noticed quickly: Threat detection is not impaired among older adults. Journal of Gerontology. 2006;61:54–57. doi: 10.1093/geronb/61.1.P54.
- Murphy NA, Isaacowitz DM. Age effects and gaze patterns in recognising emotional expressions: An in-depth look at gaze measures and covariates. Cognition & Emotion. 2010;24:436–452. doi: 10.1080/02699930802664623.
- Murphy NA, Lehrfeld JM, Isaacowitz DM. Recognition of posed and spontaneous dynamic smiles in young and older adults. Psychology and Aging. 2010;25:811–821. doi: 10.1037/a0019888.
- Orgeta V. Specificity of age differences in emotion regulation. Aging & Mental Health. 2009;13:818–826. doi: 10.1080/13607860902989661.
- Park DC. The basic mechanisms accounting for age-related decline in cognitive function. In: Park DC, Schwarz N, editors. Cognitive aging: A primer. Psychology Press; New York, NY: 2000. pp. 3–21.
- Pelli DG, Robson JG, Wilkins AJ. The design of a new letter chart for measuring contrast sensitivity. Clinical Vision Sciences. 1988;2:187–199.
- Phillips LH, MacLean RDJ, Allen R. Age and the understanding of emotions: Neuropsychological and sociocognitive perspectives. Journal of Gerontology: Psychological Sciences. 2002;57B:526–530. doi: 10.1093/geronb/57.6.p526.
- Radloff LS. The CES-D Scale: A self-report depression scale for research in the general population. Applied Psychological Measurement. 1977;1:385–401. doi: 10.1177/014662167700100306.
- Rauers A, Blanke E, Riediger M. Everyday empathic accuracy in younger and older couples: Do you need to see your partner to know his or her feelings? Psychological Science. 2013;24:2210–2217. doi: 10.1177/0956797613490747.
- Richter D, Dietzel C, Kunzmann U. Age differences in emotion recognition: The task matters. Journal of Gerontology: Psychological Sciences. 2010;60B:48–55. doi: 10.1093/geronb/gbq068.
- Riediger M, Studtmann M, Westphal A, Rauers A, Weber H. No smile like another: Adult age differences in identifying emotions that accompany smiles. Frontiers in Psychology. 2014;5. doi: 10.3389/fpsyg.2014.00480.
- Riediger M, Voelkle MC, Ebner NC, Lindenberger U. Beyond "Happy, Angry, or Sad?": Age-of-poser and age-of-rater effects on multi-dimensional emotion perception. Cognition & Emotion. 2011;25:968–982. doi: 10.1080/02699931.2010.540812.
- Rosenbaum JG. The biggest reward for my invention isn't money. Medical Economics. 1984;61:152–163.
- Rottenberg J, Ray RD, Gross JJ. Emotion elicitation using films. In: Coan JA, Allen JJB, editors. The handbook of emotion elicitation and assessment. Oxford University Press; London: 2007. pp. 9–28.
- Ruffman T, Henry JD, Livingstone V, Phillips LH. A meta-analytic review of emotion recognition and aging: Implications for neuropsychological models of aging. Neuroscience & Biobehavioral Reviews. 2008;32:863–881. doi: 10.1016/j.neubiorev.2008.01.001.
- Sabatelli RM, Buck R, Dreyer A. Nonverbal communication accuracy in married couples: Relationship with marital complaints. Journal of Personality and Social Psychology. 1982;43:1088–1097. doi: 10.1037/0022-3514.43.5.1088.
- Salthouse TA. The processing-speed theory of adult age differences in cognition. Psychological Review. 1996;103:403–428. doi: 10.1037/0033-295X.103.3.403.
- Stanley JT, Blanchard-Fields F. Challenges older adults face in detecting deceit: The role of emotion recognition. Psychology and Aging. 2008;23:24–32. doi: 10.1037/0882-7974.23.1.24.
- Sternglanz RW, DePaulo BM. Reading nonverbal cues to emotions: The advantages and liabilities of relationship closeness. Journal of Nonverbal Behavior. 2004;28:245–266. doi: 10.1007/s10919-004-4158-7.
- Stinson L, Ickes W. Empathic accuracy in the interactions of male friends versus male strangers. Journal of Personality and Social Psychology. 1992;62:787–797. doi: 10.1037/0022-3514.62.5.787.
- Sullivan S, Ruffman T. Emotion recognition deficits in the elderly. International Journal of Neuroscience. 2004;114:403–432. doi: 10.1080/00207450490270901.
- Sze JA, Goodkind MS, Gyurak A, Levenson RW. Aging and emotion recognition: Not just a losing matter. Psychology and Aging. 2012;27:940–950. doi: 10.1037/a0029367.
- Tetlock PE. Accountability: A social check on the fundamental attribution error. Social Psychology Quarterly. 1985;48:227–236. doi: 10.2307/3033683.
- Tetlock PE. The impact of accountability on judgment and choice: Toward a social contingency model. In: Zanna MP, editor. Advances in experimental social psychology. Vol. 25. Academic Press; San Diego, CA: 1992. pp. 331–376.
- Thibault P, Bourgeois P, Hess U. The effect of group-identification on emotion recognition: The case of cats and basketball players. Journal of Experimental Social Psychology. 2006;42:676–683. doi: 10.1016/j.jesp.2005.10.006.
- Tracy JL, Robins RW. The automaticity of emotion recognition. Emotion. 2008;8:81–95. doi: 10.1037/1528-3542.8.1.81.
- Wagner HL. On measuring performance in category judgment studies of nonverbal behavior. Journal of Nonverbal Behavior. 1993;17:3–28. doi: 10.1007/BF00987006.
- Zachary R. Shipley Institute of Living Scale, Revised Manual. Western Psychological Services; Los Angeles: 1986.
- Zhang X, Fung HH, Stanley JT, Isaacowitz DM. Perspective taking in older age revisited: A motivational perspective. Developmental Psychology. 2013;49:1848–1858. doi: 10.1037/a0031211.