Published in final edited form as: Dev Psychol. 2018 Oct 18;54(12):2240–2247. doi: 10.1037/dev0000588

Recognition of facial emotions of varying intensities by three-year-olds.

Laurie Bayet 1,2, Hannah F Behrendt 1,3, Julia K Cataldo 1, Alissa Westerlund 1, Charles A Nelson 1,2,4

Abstract

Early facial emotion recognition is hypothesized to be critical to later social functioning. However, relatively little is known about the typical intensity thresholds for recognizing facial emotions in preschoolers (2-4 years of age). This study employed a behavioral sorting task to examine the recognition of happy, fearful, and angry expressions of varying intensity in a large sample of 3-year-old children (N = 208). Thresholds were similar for all expressions; accuracy, however, was significantly lower for fear. Fear and anger expressions above threshold were confused with one another significantly more often than with other expressions. In contrast, neutral faces were interpreted as happy significantly more often than as angry or fearful. These results provide a comparison point for future studies of early facial emotion recognition in typical and atypical populations of children in this age group.

Keywords: child, emotion, face, recognition, intensity


The ability to accurately recognize facial expressions of emotion is central to social function and nonverbal communication. Perhaps the most compelling evidence for its complexity is the range and variety of conditions and developmental circumstances that affect it, including autism, affective disorders, and experiences of physical abuse (e.g. Brennan, Harris, & Williams, 2014; Lozier, Vanmeter, & Marsh, 2014; Moulson et al., 2014; Muñoz, 2009; Pollak & Sinha, 2002). In addition, early deficits in facial emotion perception have been suggested to act as developmental risk factors for later impairments in social-emotional functioning (Székely et al., 2011).

A considerable wealth of data has documented the development of facial emotion perception in infancy, i.e., the first year of life (for reviews, see e.g. Leppänen & Nelson, 2009; Leppänen, 2011), and facial emotion recognition in middle to late childhood (for reviews, see e.g. Gross & Ballif, 1991; Herba & Phillips, 2004). In contrast, there is relatively little research mapping facial emotion recognition in toddlers or young preschoolers (2-4 year-olds), a key developmental period during which aspects of explicit theory of mind and emotion understanding are coming online (Flavell, 2000). Existing data in this age group, focusing on high-intensity emotions, have generally shown that in explicit verbal tasks happy expressions are the most accurately identified and fearful expressions the least accurately identified (Kujawa et al., 2014; Nelson & Russell, 2011; Székely et al., 2011; Widen & Russell, 2003). Performance in labelling tasks additionally suggests that preschoolers spontaneously produce facial emotion labels that reflect the valence, but not necessarily the specific category, of different facial emotions (Widen & Russell, 2003, 2008, 2010; Widen, 2013). Internalizing symptoms (Kujawa et al., 2014), experiences of physical neglect (Pollak, Cicchetti, Hornung, & Reed, 2000), and, to a lesser degree, experiences of physical abuse (Pollak et al., 2000) have all been associated with deficits in facial emotion recognition at this early age; such findings exemplify the sensitivity of facial emotion recognition to a range of risk factors for atypical development, such as individual predispositions or early adverse experiences, and further motivate the investigation of typical facial emotion recognition abilities in this age range.

Recent behavioral studies have used morphed facial emotions of varying intensities to uncover the typical development of perceptual thresholds and associated psychometric curves for the recognition of facial emotions in older children, from the age of 5 years and up (Gao & Maurer, 2009, 2010), with notable implications for understanding the developmental impact of deprivation and maltreatment on emotion recognition (Bick, Luyster, Fox, Zeanah, & Nelson, 2017; Moulson et al., 2015; Pollak & Kistler, 2002). For example, children with a history of physical abuse exhibit higher thresholds for detecting fearful or sad facial expressions morphed with angry facial expressions (Pollak & Kistler, 2002), in line with an increased attention to and heightened detection of anger cues (Pollak & Sinha, 2002; Shackman, Shackman, & Pollak, 2007). Studies using morphed facial emotion stimuli of varying intensities have additionally shown that 8- and 12-year-olds with a history of early institutional rearing have increased thresholds for recognizing happy facial emotions (Bick et al., 2017; Moulson et al., 2015). These children did not differ from never-institutionalized children in their recognition of full intensity faces (Bick et al., 2017; Moulson et al., 2015), clearly demonstrating the utility of morphed facial emotion stimuli of varying intensities in probing subtle variations in facial emotion recognition abilities. However, no previous studies have examined the recognition of facial emotion stimuli of varying intensities in toddlers or preschoolers, limiting the understanding of typical and atypical development of facial emotion recognition abilities in this key developmental period. For example, it has generally been reported that smiling expressions are best recognized by preschoolers at high intensity (Székely et al., 2011; Widen & Russell, 2008), but it is unknown whether this is also associated with a lower intensity threshold for detecting this emotion, as reported in older children (Gao & Maurer, 2010). Similarly, it is currently unknown whether the experience-specific effects of adversity (such as institutionalization or physical abuse) on facial emotion recognition thresholds that have been reported in older children (Bick et al., 2017; Moulson et al., 2015; Pollak & Kistler, 2002) may emerge as early as the preschool period.

In the current study, we aimed to map the recognition of simple facial expressions of emotion (happiness, anger, and fear) in typically developing 3-year-olds, as a function of the expression intensity. To this end, we presented 3-year-old children with happy, fearful, and angry expressions at varying levels of intensities, and derived the corresponding psychometric curves, perceptual thresholds, and confusion matrices.

Method

All caregivers provided written, informed consent before the experiment, which was approved by the Institutional Review Board of Boston Children’s Hospital (IRB-P00002876, The Development and Neural Bases of Emotion Processing).

Participants

A total of 208 3-year-olds (93 girls; mean age 38.22 months, range 35.71-45.05 months) from a densely populated state of the United States of America participated in this study. Detailed demographic and socio-economic characteristics of the participants are provided in Supplementary Table S1. In short, compared to the population of the surrounding US state (“Census Reporter,” 2017), families in the recruited sample were more likely to report being White, Caucasian, or of Mixed Race; more likely to report higher incomes and education; and more likely to report English as the sole primary language at home. Participants were recruited at age 5, 7, or 12 months as part of a large longitudinal cohort. All were born with normal birth weight and had no history of neurological or seizure disorder, vision impairment, or pre- or perinatal complications. Enrolled families were invited to a follow-up assessment when their child reached 3 years of age (retention rate of about 49%), with the exception of those meeting any of the following criteria: maternal use of anticonvulsants, antipsychotics, or opioids during pregnancy, child diagnosis of autism spectrum disorder, or child diagnosis of genetic disorder.

Stimuli

Stimuli were created as part of an earlier study (Gao & Maurer, 2009). Happy, angry, fearful, and neutral frontal view faces from the same 2 female models (models 3 and 10) were selected from the MacBrain Stimulus set (Tottenham et al., 2009) and cropped onto a white background. Stimuli of 20-80% intensity were created by morphing neutral with 100% intensity emotional faces using MorphX (http://www.norrkross.com/software/morphx/morphx.php) with 160 pre-defined points. Distortions created by morphing were fixed in Photoshop. Color stimuli were printed onto 10 cm by 14.7 cm cards and laminated. Each child participant saw a total of 22 faces from 1 of the 2 female models, counterbalanced across participants. These faces were happy, angry, and fearful faces, each at 7 levels of intensity (20%, 30%, 40%, 50%, 60%, 70%, and 100%), and 1 neutral face, all presented in random order. Example stimuli are shown in Figure 1. Similar morphed stimuli have been used previously to study facial emotion perception in children and adults (e.g. Morris et al., 1996; Pollak & Kistler, 2002), with generally close agreement between perceived and morphed intensity (Hess, Blairy, & Kleck, 1997; Morris et al., 1996).

Figure 1. Example stimuli.

Adapted with permission from Tottenham et al. (2009).

Procedure

Each child participant interacted with the experimenter in a quiet and child-friendly room. The experimenter presented 4 cardboard houses to the child sequentially as follows: “The people in this house are feeling really [scared because they looked out their front window and they saw a bear; happy because their parents said they get to go out for ice cream; calm, they are sitting on the couch and having a quiet time; mad because someone made a mess but they both got in trouble for it].” The experimenter used emotive language to describe each emotional scenario, and placed a scenario picture (bear, ice cream, couch, or spilled glass) onto the roof of each house to identify the houses (respectively, the fearful, happy, neutral, and angry emotional scenarios). After reviewing the 4 houses one more time (“The people in this house are feeling really [scared; happy; mad; calm]”, with the emotion word spoken in a corresponding emotional tone), the experimenter introduced the task to the child by saying: “I have a whole lot of people right here and I need you to look at each person and think about how they're feeling, and put them in the house that matches.” Children were not shown any emotional face stimuli or schematic emotional faces when reviewing the 4 houses. The experimenter then presented the child with one card at a time, asking “How do you think she’s feeling?” for each card, repeating instructions as necessary without feedback or hints. There were no training trials. The child placed each card into the house of their choice through a narrow slot on the roof. The task proceeded until all 22 cards were sorted. Similar procedures have been used with children of 5 years of age or older (Gao & Maurer, 2009, 2010; Moulson et al., 2015). The current procedure differed from these previous studies in that the sorting boxes were identified by scenario pictures rather than cartoon facial expressions; thus, the task could not be completed purely by perceptual matching. Two additional, minor modifications were: 1) the choice to present 3 emotions (happy, fearful, and angry), compared to 6 emotions in Gao & Maurer (2010), 4 emotions in Moulson et al. (2015), and a focus on happy, sad, and fearful in Gao & Maurer (2009); and 2) the choice to present a slightly lower number of intensity levels (7, compared to 10 in Gao & Maurer, 2009, 2010, and Moulson et al., 2015). These modifications were implemented to study the early recognition of happy versus threat-relevant expressions and to accommodate the younger age group.

Data analysis

Data management.

Demographic and socio-economic data were collected and managed using REDCap (Research Electronic Data Capture) tools hosted at Boston Children’s Hospital (Harris et al., 2009). Experimental data were collected and managed using IBM SPSS Statistics v. 24.

Pre-processing.

Data preprocessing consisted of importing the data for analysis in R. Responses were considered accurate if they matched the displayed expression, and inaccurate otherwise. A total of 13 trials (0.28% of all data points) had missing data (no response recorded, for example if the child failed to complete the task). No further data points were excluded.

Modeling of psychometric curves.

Child psychometric curves for accuracy exhibited an upper asymptote well below the theoretical level of 100% accuracy, and a lower asymptote well above the theoretical level of 0% accuracy (Figure 2A). This characteristic of the data (which has been observed in other developmental data, e.g., Bayet et al., 2017) violates assumptions of standard psychometric models. To accommodate this, we fitted custom psychometric functions with the following formula:

$$f(x) = Y_0 + \frac{Y_{100} - Y_0}{1 + \exp[-a(x - \theta)]}$$

where f(x) is the fitted accuracy, x the intensity, a the slope (or scaling factor), Y0 the lower asymptote (0% intensity), Y100 the upper asymptote (100% intensity), and θ the perceptual threshold. When the intensity x is equal to the threshold θ, the fitted accuracy f(x) is halfway between Y0 and Y100. Model selection diagnostics (Akaike and Bayesian Information Criteria; Supplementary Table S4) and inspection of the resulting fits (Figure S1) clearly confirmed that this modeling choice was more appropriate for these data than the relatively more standard approach of fixing Y0 and Y100 to 0% and 100% accuracy, respectively. This more standard approach has typically been used with data from adults or older children, whose lower and upper asymptotes are close to the theoretical values of 0% and 100% accuracy, respectively. In that context, an accuracy of 50% coincides with the tipping (inflexion) point of the psychometric curve, between the two asymptotes. However, this is not the case when either the lower or upper asymptote deviates from these values, as in the current data. The custom psychometric functions used in the current analysis accommodate this characteristic by allowing the lower and upper asymptotes to vary; there is at least one precedent for using this approach with developmental data (Bayet et al., 2017).
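To make the fitting procedure concrete, the sketch below (not the authors' code) fits this custom psychometric function to group-mean accuracies using R's nls function, which the Software section names as the fitting routine. The data frame, its column names, the starting values, and the fixed slope are illustrative assumptions; the paper reports fixing the slope only for the bias fits, so fixing it here is a simplification.

```r
## Minimal sketch, not the authors' code. Assumes a data frame `curve`
## with one row per intensity level for one emotion: `intensity`
## (20-100) and mean `accuracy` (proportion correct, 0-1).
psychometric <- function(x, Y0, Y100, a, theta) {
  Y0 + (Y100 - Y0) / (1 + exp(-a * (x - theta)))
}

## Illustrative (fabricated) group-mean accuracies
curve <- data.frame(
  intensity = c(20, 30, 40, 50, 60, 70, 100),
  accuracy  = c(0.30, 0.42, 0.61, 0.72, 0.78, 0.80, 0.80)
)

Y0_fixed <- 0.25  # lower asymptote anchored to the neutral-face response rate
a_fixed  <- 0.1   # slope; fixed here for simplicity (an assumption)

fit <- nls(
  accuracy ~ psychometric(intensity, Y0_fixed, Y100, a_fixed, theta),
  data  = curve,
  start = list(Y100 = 0.8, theta = 40)
)
coef(fit)  # Y100 = plateau accuracy, theta = perceptual threshold
```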

Figure 2. Emotion sorting behavior at three years of age.


A. Behavioral accuracy. B. Response rates to neutral faces. *** p < .001; NS p > .05. C. Fitted psychometric curves. D. Sensitivity. E. Conservative bias. F. Average confusion matrix.

Models were fitted separately for each emotion (happy, angry, and fearful). The number of parameters to be fitted was reduced by estimating Y0 directly from the response rates to neutral faces, i.e., 0% intensity emotional faces. Non-parametric 95% confidence intervals were derived for each parameter of interest (N = 10,000 bootstrap samples). Confidence intervals obtained with other methods are provided in Supplementary materials for comparison purposes.
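A minimal sketch of such a nonparametric bootstrap follows, reusing psychometric(), Y0_fixed, and a_fixed from the sketch above. It resamples children with replacement, which is one plausible choice (the paper does not specify the resampling unit), and assumes trial-level data `trials` with columns `subj`, `intensity`, and `acc` (1 = correct, 0 = incorrect); all of these names are assumptions.

```r
## Resample children with replacement, refit the curve, and collect the
## threshold estimate; the 2.5th and 97.5th percentiles give the 95% CI.
## Slow but straightforward with 10,000 replicates.
set.seed(123)
boot_theta <- replicate(10000, {
  ids  <- sample(unique(trials$subj), replace = TRUE)
  boot <- do.call(rbind, lapply(ids, function(s) trials[trials$subj == s, ]))
  m    <- aggregate(acc ~ intensity, data = boot, FUN = mean)
  fit  <- try(nls(acc ~ psychometric(intensity, Y0_fixed, Y100, a_fixed, theta),
                  data = m, start = list(Y100 = 0.8, theta = 40)),
              silent = TRUE)
  if (inherits(fit, "try-error")) NA_real_ else coef(fit)[["theta"]]
})
quantile(boot_theta, c(0.025, 0.975), na.rm = TRUE)  # 95% CI for the threshold
```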

Signal detection measures.

Group-level measures of sensitivity (d’) and bias (c-bias) were derived for each emotion and intensity by defining each target emotion as “signal” and non-target emotions as “noise” (Calvo, Avero, Fernández-Martín, & Recio, 2016). For a discussion of the choice of sensitivity and bias measures (e.g. d’ versus A’), see Stanislaw & Todorov (1999). Non-parametric 95% confidence intervals for these group-level estimates were obtained by bootstrapping for each emotion and intensity level (N = 5,000 samples, to reduce computational time). The resulting curves were fitted with custom psychometric curves as described above, with the following modifications. For sensitivity, the baseline asymptote (Y0) was fixed at 0, as sensitivity should be equal to 0 in the absence of signal. For c-bias, Y0 was directly estimated from the data. To reduce the number of parameters to be fitted, the perceptual thresholds for c-bias were fixed to those estimated from the sensitivity curves, and the slope or scaling parameter a was fixed to 0.1.
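As a concrete illustration of these measures, the sketch below computes d′ and c from hit and false-alarm counts using the standard Gaussian formulae discussed by Stanislaw & Todorov (1999). The counts and the correction for extreme rates are illustrative choices, not values from the paper.

```r
## d' and c for one target emotion at one intensity: trials showing the
## target emotion are "signal"; trials showing the other emotions are "noise".
sdt <- function(n_hit, n_signal, n_fa, n_noise) {
  hit <- (n_hit + 0.5) / (n_signal + 1)  # log-linear correction avoids
  fa  <- (n_fa  + 0.5) / (n_noise + 1)   # qnorm(0) or qnorm(1) = +/-Inf
  c(dprime = qnorm(hit) - qnorm(fa),        # sensitivity
    cbias  = -(qnorm(hit) + qnorm(fa)) / 2) # positive = conservative bias
}

## e.g., fearful at 100% intensity: 208 fearful trials vs. 416 trials of
## the other two emotions at the same intensity (illustrative counts)
sdt(n_hit = 128, n_signal = 208, n_fa = 42, n_noise = 416)
```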

Confusion (misidentification) rates.

Confusion matrices were obtained for each participant by collapsing over intensities above perceptual thresholds (40% intensity and up, see Results), then averaged over all participants to generate an average confusion matrix.
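A sketch of this computation under the same assumed trial-level layout as above (columns `subj`, `intensity`, `emotion` for the displayed expression, and `response` for the chosen house; all names are assumptions). In this design each child contributes trials for every emotion above threshold, so the row-wise proportions are well defined.

```r
## Per-participant confusion matrices over above-threshold trials,
## then averaged across participants.
emos  <- c("happy", "angry", "fearful")
boxes <- c("happy", "angry", "fearful", "neutral")
above <- subset(trials, intensity >= 40)  # above-threshold intensities

per_child <- lapply(split(above, above$subj), function(d) {
  prop.table(table(factor(d$emotion, emos), factor(d$response, boxes)),
             margin = 1)  # row-normalize: response rates per displayed emotion
})
avg_confusion <- Reduce(`+`, per_child) / length(per_child)
round(100 * avg_confusion, 1)  # percentages; rows = displayed emotion
```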

Software.

Data import and preprocessing were conducted in Matlab 8.2.0.701 (R2013b, The MathWorks, Inc., Natick, MA, USA). Data analyses were conducted in R 3.3.0 (R Core Team, 2016). In particular, non-linear least-square models were fitted using the “nls” function in R.

Data and code availability.

Data are available upon request. Analysis code is openly accessible online at http://dx.doi.org/10.6084/m9.figshare.6108662.

Results

Responses to neutral faces

Each child sorted one neutral face, allowing baseline response rates for each of the three emotions (happy, angry, and fearful) to be derived at the group level. Neutral faces were categorized as neutral 37.98% of the time, and as emotional (happy, angry, or fearful) 62.02% of the time. Critically, neutral faces were not equally likely to be categorized as happy, angry, or fearful (χ2[2] = 26.84, total N = 129, p < .001). More specifically, neutral faces were more likely to be categorized as happy than as fearful or angry (Figure 2B; Happy versus Fear, χ2[1] = 11.67, total N = 105, p < .001; Happy versus Angry, χ2[1] = 22.51, total N = 94, p < .001; Fear versus Angry, χ2[1] = 2.05, total N = 59, p = .152).
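The response counts behind these tests can be recovered exactly from the reported pairwise Ns (happy + fearful = 105, happy + angry = 94, fearful + angry = 59, so happy = 70, fearful = 35, angry = 24), and the reported statistics match equal-probability chi-square goodness-of-fit tests, as in this sketch:

```r
## Counts implied by the reported Ns; the tests reproduce the reported
## chi-square statistics exactly.
counts <- c(happy = 70, fearful = 35, angry = 24)
chisq.test(counts)                          # X^2(2) = 26.84, p < .001
chisq.test(counts[c("happy", "fearful")])   # X^2(1) = 11.67, p < .001
chisq.test(counts[c("happy", "angry")])     # X^2(1) = 22.51, p < .001
chisq.test(counts[c("fearful", "angry")])   # X^2(1) = 2.05,  p = .152
```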

In the following analysis of psychometric curves for categorizing emotional faces, these baseline response rates to neutral faces are used to anchor the lower asymptote for each emotion (i.e., the expected accuracy at 0% intensity). Doing so reduced the number of parameters to be fitted for each model.

Psychometric curves

We fit psychometric curve models for each emotion (20-100% intensity faces, see Methods), with two parameters estimating the perceptual threshold and plateau accuracy (Figure 2B-C, Supplementary Table S2). Perceptual thresholds are defined as the intensity level at which accuracy reaches a level intermediate between the fitted plateau accuracy (100% asymptote) and the accuracy at 0% intensity (response rate to neutral faces), i.e., the inflexion point of the psychometric curve. Some studies have defined the perceptual threshold as the intensity at which accuracy reaches 50% (e.g. Gao & Maurer, 2009, 2010; Luyster, Bick, Westerlund, & Nelson, 2017; Luyster, Powell, Tager-Flusberg, & Nelson, 2014; Moulson et al., 2014). However, in such studies the lower and upper asymptotes were fixed at 0% and 100% accuracy respectively, i.e., the criterion of 50% accuracy did correspond to the inflexion point intermediate between the lower and upper asymptotes. In the current approach, the lower and upper asymptotes are not fixed at 0% and 100% accuracy, which allows the perceptual thresholds (the inflexion points) to be estimated more flexibly (see Supplementary Table S4 and Figure S1 for a direct comparison of both approaches, demonstrating that the current approach was preferable for these data).

Perceptual thresholds were consistent across emotions at about 35% intensity, i.e., the confidence intervals for perceptual thresholds clearly overlapped across emotions. In contrast, plateau accuracy differed significantly across emotions, as evidenced by non-overlapping confidence intervals (Table 1). In particular, plateau accuracy was highest for happy (89.85%), followed by angry (80.03%) and fearful (61.30%). Plateau accuracy for fear was significantly lower than for happy and angry. Plateau accuracy for angry was only marginally lower than for happy (i.e., the 95% but not the 90% confidence intervals overlapped; 90% CI for happy plateau accuracy [85.66, 95.25]; 90% CI for angry plateau accuracy [76.44, 84.09]).

Table 1. Perceptual thresholds and plateau accuracy for each emotional expression.

Confidence intervals estimated by the bootstrap method (N = 10,000).

Parameter                        Emotion   Estimate   2.5% CI   97.5% CI   Interpretation
Perceptual threshold             Happy     34.68      30.33     39.90      H = F = A
(% intensity, 0-100)             Fearful   34.54      30.75     38.69
                                 Angry     36.51      33.93     39.33
Plateau accuracy                 Happy     89.85      84.90     96.48      F < A ≤ H
(% accurate, 0-100)              Fearful   61.30      57.27     66.19
                                 Angry     80.03      75.76     84.94

Similar results were obtained when using the profile or z-score method (Supplementary Table S3); the only difference was that plateau accuracy was significantly lower for angry than for happy when using the profile method (versus marginally lower when using the bootstrap or z-score method).

Signal detection analyses

In line with the analysis of accuracy, an analysis of sensitivity (d’) revealed similar perceptual thresholds across the three emotions, and significantly higher plateau sensitivity for happy, followed by angry and fearful (Table 2; Supplementary Table S5 and Figure S2). In addition, an analysis of bias (c-bias) revealed a significant conservative bias for all emotions, stronger for fear than for happy and angry at higher intensity values (Table 3, Figure 2D-E; Supplementary Table S6 and Figure S2). At lower intensity values, the conservative bias was highest for angry, followed by fearful and happy. Similar results were obtained with the profile method, except that plateau sensitivity did not significantly differ between fear and anger (Supplementary Table S7).

Table 2. Perceptual thresholds and plateau sensitivity for each emotional expression.

Confidence intervals estimated by the bootstrap method (N = 10,000).

Parameter                        Emotion   Estimate   2.5% CI   97.5% CI   Interpretation
Perceptual threshold             Happy     40.86      37.08     45.11      H = F = A
(% intensity, 0-100)             Fearful   37.37      33.95     40.79
                                 Angry     35.82      33.57     38.21
Plateau sensitivity              Happy     2.87       2.65      3.16       F < A < H
(z-score)                        Fearful   1.60       1.49      1.76
                                 Angry     1.96       1.85      2.08

Table 3. Baseline and plateau conservative bias for each emotional expression.

Confidence intervals estimated by the bootstrap method (N = 10,000). Positive values indicate a bias against categorizing expressions as being of the target emotion.

Parameter                        Emotion   Estimate   2.5% CI   97.5% CI   Interpretation
Baseline conservative bias       Happy     0.26       0.17      0.34       0 < H < F < A
(z-score)                        Fearful   0.82       0.75      0.89
                                 Angry     1.14       1.04      1.25
Plateau conservative bias        Happy     0.24       0.17      0.30       0 < A = H < F
(z-score)                        Fearful   0.47       0.43      0.51
                                 Angry     0.15       0.10      0.21

Confusion rates

We next examined the confusion rates for all emotions at intensities above threshold (40% and up). In particular, we tested whether angry and fearful faces were more often confused with one another than angry and happy or fearful and happy faces, as predicted by their emotional valence. The observed confusion matrix was in line with this notion (Figure 2F). Bootstrapped confidence intervals (N = 10,000 samples) for the mean misidentification rates confirmed that angry faces above threshold were misidentified (i.e. categorized) as fearful (Mean 17.79%, 95% CI [14.62, 21.35]) more often than as happy (Mean 6.39%, 95% CI [4.74, 8.61]) or neutral (Mean 5.51%, 95% CI [4.04, 7.48]). Reciprocally, fearful faces above threshold were categorized as angry (Mean 20.16%, 95% CI [16.98, 23.75]) more often than as happy (Mean 12.40%, 95% CI [9.81, 15.38]) or neutral (Mean 10.39%, 95% CI [8.27, 12.79]).

Discussion

In the current study, we used a behavioral sorting task with facial emotions of varying intensities to assess facial emotion recognition in a relatively large sample of typically developing 3-year-olds from a densely populated state of the US. To our knowledge, this study is the first to investigate the recognition of facial emotions of varying intensities during the preschool (2-4 year-old) period. We found similar perceptual thresholds for recognizing happy, angry, and fearful expressions; a lower accuracy and higher conservative bias for recognizing high intensity fearful faces; and higher confusion rates between fear and anger than between these expressions and neutral or happy expressions at high (40-100%) intensities. In addition, 3-year-olds exhibited a tendency to categorize neutral faces as happy rather than angry or fearful.

We found equal perceptual (intensity) thresholds for the recognition of happy, fearful, and angry faces in 3-year-olds. Independent of emotional category (happy, angry, or fearful), emotional faces at intensity levels above 35% were recognized by 3-year-olds with an accuracy intermediate between that for baseline and full intensity faces. Accuracies at intensity levels of 60% and up were equivalent to those for full intensity faces for all emotional categories (happy, angry, or fearful). This result contrasts with other studies that have reported perceptual threshold differences in similar tasks in older children (5 years and up; see e.g. Gao & Maurer, 2010). To our knowledge, no previous study of emotion recognition in this or a similar age group has modeled baseline response rates (lower asymptotes) and plateau performance (upper asymptotes) when estimating perceptual thresholds from psychometric curves. Thus, differences in modeling choices for defining and estimating psychometric curves and thresholds may underlie at least some of these differences. Indeed, omitting these parameters when modeling the current data yielded large estimated differences in perceptual thresholds across emotions (including a lower threshold for happy), but also a clear decline in model quality, indicating that the current approach was more appropriate (Supplementary Table S4 and Figure S1). This suggests that estimating lower and upper asymptotes could be important when modeling psychometric curves in young age groups who show differences in behavioral performance compared to adults. Previous studies in 5-year-old children have reported thresholds of about 25-30% intensity for differentiating emotional from neutral faces with 50% accuracy (Gao & Maurer, 2009, 2010). We report higher perceptual thresholds in 3-year-olds, which may reflect methodological differences in defining and estimating perceptual thresholds, or may correspond to a decrease in perceptual thresholds over development, as expected based on previous work demonstrating higher thresholds for detecting emotional expressions in 5-10 year-old children than in adults (Gao & Maurer, 2009, 2010). Similarly, the lower perceptual threshold for happy faces at 5 years (Gao & Maurer, 2009, 2010) but not yet at 3 years could either reflect methodological differences in estimating thresholds or indicate a developmental change occurring between 3 and 5 years of age. Future work should address, preferably longitudinally, the development of psychometric curves for different emotions across early childhood. Importantly, the current threshold estimates will prove instrumental in the design (e.g., choice of target intensities) of future studies examining facial emotion perception in 3-year-olds using morphed stimuli.

We found a higher baseline categorization rate for happy expressions, along with a bias against categorizing lower intensity faces as angry or fearful. In other words, in the absence of more information or “when in doubt”, 3-year-olds tended to interpret facial expressions as positive or neutral. To our knowledge, this finding has not been previously reported, and it does not appear well accounted for by existing models of the development of facial emotion perception in preschoolers. Most notably, according to the differentiation model (Widen, 2013), which was derived from children’s labelling of full intensity emotional faces, 3-year-olds would be expected to reliably differentiate (low or high intensity) happy and angry faces while including fearful faces under the umbrella category of angry or sad faces. Future research on the cognitive mechanisms of the perception of low-intensity emotional faces in preschoolers is needed to reconcile the current empirical finding of a higher baseline categorization rate for happy (versus angry or fearful) expressions with existing models of emotional face perception development at this age. A different question that arises is whether individual differences in interpreting emotional faces in early childhood (e.g. categorizing neutral faces as negative) may predict later social-emotional outcomes. For example, previous studies in adults have shown that a tendency to interpret neutral faces as negative is linked to current depressive mood (e.g. Leppänen, Milders, Bell, Terriere, & Hietanen, 2004). However, it remains unclear whether this tendency reflects differences in cognitive style that predispose to social-emotional disorders, or current emotional state. Longitudinal studies will be critical for understanding the early development of emotion processing, and for identifying and monitoring children at risk.

We found large differences in peak performance across emotions, specifically a lower plateau accuracy for fearful compared to angry and happy faces: Three-year-olds were significantly less accurate in recognizing fearful faces even at the highest intensity. In addition, 3-year-olds confused fearful and angry expressions with one another more often than with happy or neutral expressions. A higher conservative bias against categorizing high intensity faces as fearful (compared to happy or angry), regardless of their true expression, combined with a lower sensitivity (d’) for categorizing full intensity fearful faces (compared to happy or angry faces), underlay the lower accuracy for categorizing full intensity fearful faces at that age. That is, the low accuracy for categorizing full intensity fearful faces in this sample of 3-year-olds could be attributed not only to a higher difficulty in differentiating these from other emotional faces, but also to a tendency to refrain from categorizing any face as fearful. This behavioral pattern is consistent with previous reports in 3-year-olds testing high-intensity faces only, which have found higher accuracy for recognizing happy (versus angry or fearful) faces at this age (Camras & Allison, 1985; Herba & Phillips, 2004; Kujawa et al., 2014; Székely et al., 2011; Widen & Russell, 2008) as well as a tendency to interpret fearful faces as angry (Widen & Russell, 2008, 2010; Widen, 2013). The current findings further demonstrate that the poorer behavioral performance in recognizing fear cannot be attributed to a higher perceptual threshold for fear than for other emotions. The poorer performance for fear also cannot be attributed to the difficulty of recalling the correct label for fearful faces, as children were not required to verbally label the stimuli. Further, it cannot be attributed to the ambiguity of fearful expressions among the emotional faces presented, as the current study did not include expressions with similar facial features, such as surprised faces, that can be confused with fear (in contrast to e.g. Rodger, Vizioli, Ouyang, & Caldara, 2015). It has been suggested that fearful expressions, which indirectly signal the presence of a threat, could be more complex to understand than angry expressions, which are directly threatening to the perceiver (Leppänen & Nelson, 2009, 2012). However, in the current study, the happy, fearful, and angry scenarios all involved the child understanding that an object (ice cream) or third party (bear, person) caused the target emotion. Thus, a different explanation for the observed behavior in the current study could be that the relative rarity of fearful faces in the everyday experience of young children in this cultural context led to a higher difficulty in linking these facial expressions to the fear scenario, regardless of their perceived intensity. While the specific fear scenario used in this study (being scared of seeing a bear) corresponds to a situation that few children would have directly experienced in everyday life, it should still be familiar to 3-year-olds because the scenario of being scared by an animal is often explored in children’s fiction, and animal fears are relatively common in preschoolers (Gullone, 1996, 2000). Angry facial expressions may be more frequently and more intensely expressed than fearful expressions, e.g. by caregivers in disciplinary situations or in interactions with siblings or peers, which might explain the higher accuracy in linking high-intensity angry expressions to their corresponding scenario. Taken together, we suspect that the relative rarity of fearful compared to angry or happy faces in the everyday life experience of children in our sample could explain their observed tendency to refrain from categorizing even high intensity faces as fearful (i.e. a conservative bias). Including novel expressions in similar paradigms could clarify whether the poor recognition performance for fear is specific to the emotional concept or to the novelty of the stimulus.

The current findings contrast with the body of evidence pointing to early biases to fearful faces in infancy (e.g. Bayet et al., 2017; Jessen & Grossmann, 2014; Leppänen & Nelson, 2009, 2012). However, at least one previous study with 3-year-olds (Székely et al., 2011) found poorer performance for recognizing, but better performance for perceptually matching, fearful faces. This suggests that early biases promoting the processing of fear exist despite a developmental lag in understanding this emotional concept. An intriguing hypothesis is that early biases for fear may compensate for the relative cognitive difficulty of understanding fearful expressions. Early biases for fear may allow infants and young children to produce an adaptive response (increased attention, e.g. see LoBue, 2013) without understanding the concept of fear.

The current results should be interpreted in the context of the following limitations: a small number of expressions (happy, angry, and fearful), a small number of trials per child (one trial per intensity and emotion), and a single set of emotion scenarios and associated pictures. In particular, the set of emotions presented included only one positive (happy) but two negative (angry and fearful) emotions, which could have influenced the choice behavior of children. Similar limitations apply to most studies of emotion recognition, especially in childhood. In addition, the specific scenarios and pictures that were used to contextualize the task and describe each emotion may have contributed to differences in task difficulty across emotions and influenced children’s performance. Future research should determine whether the observed pattern of results, which is consistent with the extant literature on high intensity facial emotion recognition at this age, persists when using different scenarios or pictures to describe each emotion. Moreover, the small number of trials per child did not allow for the simultaneous estimation of all parameters (perceptual threshold, lower asymptote, and upper asymptote) in the models for each emotion, but this was offset by a large sample size (N = 208).

In conclusion, here we measured facial emotion recognition in typically developing 3-year-olds as a function of emotion (happy, angry, fear) and expression intensity. We found similar perceptual thresholds for all emotions, but markedly lower plateau accuracy and higher conservative bias for recognizing fearful face expressions. These results inform developmental theories of emotion processing, and provide comparison points for future studies of facial emotion recognition in typical and atypical populations.


Acknowledgments

This work was funded by NIH Grant R01MH078829 to C.A.N. and Michelle Bosquet, and a Philippe Foundation grant to LB. We are grateful to the families, students, and research assistants that took part in the study, Rhiannon Luyster for her assistance in adapting the task for 3-year-olds, and Johanna Bick for helpful discussions.

References

1. Bayet L, Quinn PC, Laboissiere R, Caldara R, Lee K, & Pascalis O (2017). Fearful but not happy expressions boost face detection in human infants. Proceedings of the Royal Society B: Biological Sciences, 284, 20171054. https://doi.org/10.1098/rspb.2017.1054
2. Bick J, Luyster R, Fox NA, Zeanah CH, & Nelson CA (2017). Effects of early institutionalization on emotion processing in 12-year-old youth. Development and Psychopathology, 29, 1749–1761. https://doi.org/10.1017/S0954579417001377
3. Brennan AM, Harris AWF, & Williams LM (2014). Neural processing of facial expressions of emotion in first onset psychosis. Psychiatry Research, 219, 477–485.
4. Calvo MG, Avero P, Fernández-Martín A, & Recio G (2016). Recognition thresholds for static and dynamic emotional faces. Emotion, 16(8), 1186–1200. https://doi.org/10.1037/emo0000192
5. Camras LA, & Allison K (1985). Children’s understanding of emotional facial expressions and verbal labels. Journal of Nonverbal Behavior, 9, 84–94. https://doi.org/10.1007/BF00987140
6. Census Reporter. (2017). Retrieved from https://censusreporter.org
7. Flavell JH (2000). Development of children’s knowledge about the mental world. International Journal of Behavioral Development, 24, 15–23.
8. Gao X, & Maurer D (2009). Influence of intensity on children’s sensitivity to happy, sad, and fearful facial expressions. Journal of Experimental Child Psychology, 102, 503–521. https://doi.org/10.1016/j.jecp.2008.11.002
9. Gao X, & Maurer D (2010). A happy story: Developmental changes in children’s sensitivity to facial expressions of varying intensities. Journal of Experimental Child Psychology, 107, 67–86. https://doi.org/10.1016/j.jecp.2010.05.003
10. Gross AL, & Ballif B (1991). Children’s understanding of emotion from facial expressions and situations: A review. Developmental Review, 11, 368–398. https://doi.org/10.1016/0273-2297(91)90019-K
11. Gullone E (1996). Developmental psychopathology and normal fear. Behaviour Change, 13, 143–155. https://doi.org/10.1017/S0813483900004927
12. Gullone E (2000). The development of normal fear: A century of research. Clinical Psychology Review, 20, 429–451. https://doi.org/10.1016/S0272-7358(99)00034-3
13. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, & Conde JG (2009). Research electronic data capture (REDCap)—A metadata-driven methodology and workflow process for providing translational research informatics support. Journal of Biomedical Informatics, 42, 377–381. https://doi.org/10.1016/j.jbi.2008.08.010
14. Herba C, & Phillips M (2004). Annotation: Development of facial expression recognition from childhood to adolescence: Behavioural and neurological perspectives. Journal of Child Psychology and Psychiatry, 45, 1185–1198. https://doi.org/10.1111/j.1469-7610.2004.00316.x
15. Hess U, Blairy S, & Kleck RE (1997). The intensity of emotional facial expressions and decoding accuracy. Journal of Nonverbal Behavior, 21, 241–257. https://doi.org/10.1023/a:1024952730333
16. Jessen S, & Grossmann T (2014). Unconscious discrimination of social cues from eye whites in infants. Proceedings of the National Academy of Sciences, 111, 16208–16213. https://doi.org/10.1073/pnas.1411333111
17. Kujawa A, Dougherty L, Durbin CE, Laptook R, Torpey D, & Klein DN (2014). Emotion recognition in preschool children: Associations with maternal depression and early parenting. Development and Psychopathology, 26, 159–170. https://doi.org/10.1017/S0954579413000928
18. Leppänen JM (2011). Neural and developmental bases of the ability to recognize social signals of emotions. Emotion Review, 3, 179–188. https://doi.org/10.1177/1754073910387942
19. Leppänen JM, Milders M, Bell JS, Terriere E, & Hietanen JK (2004). Depression biases the recognition of emotionally neutral faces. Psychiatry Research, 128, 123–133. https://doi.org/10.1016/j.psychres.2004.05.020
20. Leppänen JM, & Nelson CA (2009). Tuning the developing brain to social signals of emotions. Nature Reviews Neuroscience, 10, 37–47. https://doi.org/10.1038/nrn2554
21. Leppänen JM, & Nelson CA (2012). Early development of fear processing. Current Directions in Psychological Science, 21, 200–204. https://doi.org/10.1177/0963721411435841
22. LoBue V (2013). What are we so afraid of? How early attention shapes our most common fears. Child Development Perspectives, 7, 38–42. https://doi.org/10.1111/cdep.12012
23. Lozier LM, Vanmeter JW, & Marsh AA (2014). Impairments in facial affect recognition associated with autism spectrum disorders: A meta-analysis. Development and Psychopathology, 26, 933–945. https://doi.org/10.1017/S0954579414000479
24. Luyster RJ, Bick J, Westerlund A, & Nelson CA (2017). Testing the effects of expression, intensity and age on emotional face processing in ASD. Neuropsychologia. https://doi.org/10.1016/j.neuropsychologia.2017.06.023
25. Luyster RJ, Powell C, Tager-Flusberg H, & Nelson CA (2014). Neural measures of social attention across the first years of life: Characterizing typical development and markers of autism risk. Developmental Cognitive Neuroscience, 8, 131–143. https://doi.org/10.1016/j.dcn.2013.09.006
26. Morris JS, Frith CD, Perrett DI, Rowland D, Young AW, Calder AJ, & Dolan RJ (1996). A differential neural response in the human amygdala to fearful and happy facial expressions. Nature, 383, 812–815. https://doi.org/10.1038/383812a0
27. Moulson MC, Shutts K, Fox NA, Zeanah CH, Spelke ES, & Nelson CA (2015). Effects of early institutionalization on the development of emotion processing: A case for relative sparing? Developmental Science, 18, 298–313. https://doi.org/10.1111/desc.12217
28. Muñoz LC (2009). Callous-unemotional traits are related to combined deficits in recognizing afraid faces and body poses. Journal of the American Academy of Child and Adolescent Psychiatry, 48(5), 554–562. https://doi.org/10.1097/CHI.0b013e31819c2419
29. Nelson NL, & Russell JA (2011). Preschoolers’ use of dynamic facial, bodily, and vocal cues to emotion. Journal of Experimental Child Psychology, 110, 52–61. https://doi.org/10.1016/j.jecp.2011.03.014
30. Pollak SD, Cicchetti D, Hornung K, & Reed A (2000). Recognizing emotion in faces: Developmental effects of child abuse and neglect. Developmental Psychology, 36, 679–688. https://doi.org/10.1037//0012-1649.36.5.679
31. Pollak SD, & Kistler DJ (2002). Early experience is associated with the development of categorical representations for facial expressions of emotion. Proceedings of the National Academy of Sciences, 99, 9072–9076. https://doi.org/10.1073/pnas.142165999
32. Pollak SD, & Sinha P (2002). Effects of early experience on children’s recognition of facial displays of emotion. Developmental Psychology, 38, 784–791. https://doi.org/10.1037/0012-1649.38.5.784
33. R Core Team. (2016). R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing. Retrieved from https://www.r-project.org/
34. Rodger H, Vizioli L, Ouyang X, & Caldara R (2015). Mapping the development of facial expression recognition. Developmental Science, 18, 926–939. https://doi.org/10.1111/desc.12281
35. Shackman JE, Shackman AJ, & Pollak SD (2007). Physical abuse amplifies attention to threat and increases anxiety in children. Emotion, 7, 838–852. https://doi.org/10.1037/1528-3542.7.4.838
36. Stanislaw H, & Todorov N (1999). Calculation of signal detection theory measures. Behavior Research Methods, 31, 137–149.
37. Székely E, Tiemeier H, Arends LR, Jaddoe VWV, Hofman A, Verhulst FC, & Herba CM (2011). Recognition of facial expressions of emotions by 3-year-olds. Emotion, 11, 425–435. https://doi.org/10.1037/a0022587
38. Tottenham N, Tanaka JW, Leon AC, McCarry T, Nurse M, Hare TA, ... Nelson CA (2009). The NimStim set of facial expressions: Judgments from untrained research participants. Psychiatry Research, 168, 242–249. https://doi.org/10.1016/j.psychres.2008.05.006
39. Widen SC (2013). Children’s interpretation of facial expressions: The long path from valence-based to specific discrete categories. Emotion Review, 5, 72–77. https://doi.org/10.1177/1754073912451492
40. Widen SC, & Russell JA (2003). A closer look at preschoolers’ freely produced labels for facial expressions. Developmental Psychology, 39, 114–128. https://doi.org/10.1037/0012-1649.39.1.114
41. Widen SC, & Russell JA (2008). Children acquire emotion categories gradually. Cognitive Development, 23, 291–312. https://doi.org/10.1016/j.cogdev.2008.01.002
42. Widen SC, & Russell JA (2010). Differentiation in preschooler’s categories of emotion. Emotion, 10, 651–661. https://doi.org/10.1037/a0019005
