Author manuscript; available in PMC: 2025 Sep 4.
Published in final edited form as: Cleft Palate Craniofac J. 2024 Aug 28;62(10):1695–1704. doi: 10.1177/10556656241271650

Facial Expressions of Emotion in Children with Cleft Lip and Palate

Robert Brinton Fujiki 1, Fangyun Zhao 2, Paula M Niedenthal 2, Susan L Thibeault 1
PMCID: PMC11868462  NIHMSID: NIHMS2054382  PMID: 39193752

Abstract

Objective:

To examine the facial movements children with cleft lip and palate (CLP) employ to express basic emotions. Ability of observers to interpret facial expressions of children with CLP was also considered.

Design:

Prospective case-control design.

Setting:

Outpatient craniofacial anomalies clinic.

Patients:

Twenty-five children with CLP (age 8 to 12) and 25 age/sex-matched controls.

Outcome Measures:

Children were video recorded making facial expressions representing anger, disgust, fear, happiness, sadness, and surprise. Magnitude of children’s facial movements was quantified and compared using OpenFace. Subsequently, emotion videos were presented to 19 adults who were asked to identify the emotion conveyed in each facial expression. Accuracy of emotion recognition was compared across groups.

Results:

Compared with controls, children with CLP employed significantly (P < .05) smaller magnitude superior and lateral perioral movements to express disgust (Cohen’s d = .50), happiness (Cohen’s d = 1.1), and fear (Cohen’s d = .93). For disgust and sadness, children with CLP employed significantly greater magnitude movements of the nose and chin, presumably to compensate for reduced perioral range of motion. For anger, happiness, and sadness, children with CLP employed smaller magnitude movements of the upper face when compared with controls. Observers identified disgust (OR = 1.26) and fear (OR = 2.44) significantly less accurately in children with CLP when compared with controls.

Conclusions:

Children with CLP employed different facial movements to express certain emotions. Observers less accurately identified some emotions conveyed by facial expressions in children with CLP when compared with controls, likely due in part to differences in facial movements. Future research should explore the implications of these differences for social communication.

Keywords: cleft lip and palate, pediatrics, psychosocial adjustment

Introduction

Children’s ability to share emotion through facial expression is important to social development, interpersonal interaction, friendship formation, and academic learning.1–5 Facial movements play a key role in emotion expression,6 and differences in facial morphology and physiology may affect this expression.7 For example, cleft lip and palate (CLP) may impact facial expression in a number of ways. CLP occurs when tissues of the upper lip, alveolus, and soft or hard palate do not fuse during fetal development.8,9 Specifically, cleft lip results from the failure of the frontonasal and maxillary processes to fuse, and cleft palate occurs when the palatal shelves of the maxillary processes fail to fuse.8 CLP can occur in isolation or may be associated with a variety of syndromes and congenital anomalies.10,11 CLP is estimated to occur in 10.2 per 10 000 births in the United States.12 Children with CLP are at increased risk for language,13,14 speech,15,16 resonance,17,18 voice,19,20 and some cognitive difficulties,21,22 as well as a variety of other medical complications.23–25 Additionally, as a result of scarring, children with CLP often present with reduced circumoral movements,26,27 which may affect facial expression and the ability to mimic the facial expressions of others.28 It remains unclear, however, whether these differences in facial morphology and physiology functionally affect the way that children with CLP express emotion.

Although it is acknowledged that asymmetries of circumoral morphology are common,29–32 it is less clear how these differences impact facial movements associated with specific emotions. Most available research has focused on smiling.29,30 Studies have reported that asymmetries in the vermilion of the upper lip increase at maximum smile when compared to rest,30 and other investigations have indicated that nasolabial region asymmetry decreases when smiling.29 Scarring may also reduce upward and backward movement of the scar area and result in compensatory downward movement of the lower lip.31 Additionally, children with CLP have self-reported limitations in their ability to round their lips symmetrically, particularly in the context of pouting.33 Research is warranted to examine the magnitude of facial movements children with CLP demonstrate when expressing emotion.

Determining how magnitude of facial movements influences the expression of emotions is a complicated endeavor. For example, asking an individual to smile may only partially recreate the manner in which that individual employs facial expression to convey happiness. In conjunction with other facial movements, smiles may convey a variety of emotions in various social contexts.34,35 Thus, when examining the relationship between CLP and emotion expression, it is insufficient to isolate any single facial feature. Rather, it is important to consider the combination of facial actions that are associated with emotion expression. For this reason, the study of facial expression has employed systems such as the Facial Action Coding System (FACS), which divides the face into so-called “action units” that quantify specific facial movements.36 Past study has identified the combinations of action units evident when basic emotions are expressed (anger, disgust, fear, happiness, sadness, and surprise).37

If children with CLP show limited or atypical facial movements when expressing emotions, it would be crucial to understand how these differences might affect the ability of interactional partners to interpret emotions being expressed. Past study suggests individuals visually scan adolescents with CLP differently than they scan those with typical facial morphology.38 Thus, the ability of observers to interpret facial expressions of emotion in children with CLP may be affected both by the children’s morphology and by the observers’ visual scan path. Although it has been reported that individuals with CLP are perceived more negatively than their unaffected peers,39–41 it is unknown if their expressions of emotion are perceived and interpreted differently as well.

The current study examined facial movements children with CLP use to express basic emotions. It was hypothesized that the magnitude of facial movements in children with CLP would be reduced when compared with controls – particularly for movements of the upper lip. Observers’ ability to accurately recognize the facial expressions of children with CLP and controls was also considered. Raters without regular experience interacting with children with CLP were asked to identify the emotions children expressed. It was hypothesized that raters would recognize facial expressions of basic emotions such as happiness and sadness at similar levels of accuracy but would identify expressions of more complex emotions such as anger, fear, disgust, and surprise more accurately in the control group.

Methods

The study employed a cross-sectional, prospective case-control design. All study procedures were approved by the relevant Institutional Review Board (#2022-0108). Two experiments were performed. First, children with CLP and controls were video recorded as they attempted to display facial expressions of anger, disgust, fear, happiness, sadness, and surprise. Next, the magnitude of facial movements was quantified and compared using OpenFace, an open-source toolkit designed to detect and track facial landmarks.42 To determine whether differences in the magnitude of facial movements influenced observers’ emotion recognition, the recorded emotion videos were then presented to adults, who were asked to identify which emotion was being expressed.

Child Population

Twenty-five children with CLP and 25 sex- and age-matched controls were recruited (females = 7, males = 18). All participants were between the ages of 8 and 12 and native English speakers. Patients with CLP were identified from the craniofacial anomalies clinic associated with a large pediatric hospital and presented with a history of cleft lip diagnosed by a plastic surgeon. Children with CLP had undergone cleft lip surgery as infants, and no further lip revisions were judged to be indicated at the time of study enrollment. Age- and sex-matched controls were recruited from the community as well as from the outpatient voice and swallow clinic of the same hospital. For age-matching, birthdays were required to be within 6 months of one another. Children from both groups were excluded if they presented with developmental language disorder or autism spectrum disorder, conditions known to affect emotion recognition accuracy.43–47 Additionally, children with hemifacial microsomia, chromosomal or genetic disorders, or dysmorphic facial features were excluded to ensure accurate facial feature tracking.

Emotion Expression Elicitation

Facial expressions for the following emotions were elicited: anger, disgust, fear, happiness, sadness, and surprise. Procedures for eliciting the emotion expressions were as follows. First, each child was asked to describe to the researcher a time they felt the target emotion (eg, tell me about a time you felt happy). Once the memory had been described, children were asked to show the researcher the face they made when expressing that emotion (eg, “Show me the face you make when you’re happy”). Videos were then clipped to contain a single facial expression, starting and ending in a neutral facial position (resting position). If the facial expression never returned to neutral, the clip was trimmed one second following the final visually discernible facial movement. Emotions were elicited over secure Zoom. Children completed the task using a computer in a well-lit room. Children were not allowed to wear hats or have other faces in the video frame. Prior to study participation, detailed instructions were provided to participants’ parents to ensure adequate lighting, internet connectivity, and camera angles during data collection. Lighting and camera angles were adjusted as needed before the experiment began.

Facial Feature Tracking

Facial expressions were tracked using OpenFace 2.0, an open-source toolkit for measuring facial movements. OpenFace detects and tracks facial landmarks, estimates head position, estimates eye gaze, and performs facial expression recognition.42 Facial expressions are tracked by detecting action units (AUs) corresponding to individual components of muscle movement. Action units are defined according to the Facial Action Coding System (FACS),36,48 an anatomy-driven system for describing facial movements discernible to the eye. Action unit definitions and numbers are presented in Table 1. OpenFace 2.0 measures 18 of the action units described by FACS using machine learning algorithms.49 Action unit intensities were used to measure emotional facial expressions according to the emotional FACS.36,48 Anger, disgust, fear, happiness, sadness, and surprise were defined using the combinations of action units identified in previous research.50,51 Magnitude of movement was calculated as the maximal score across all frames of each clip, as has been validated in previous investigations.50,52

Table 1.

Action Unit Numbers, Definitions, and Combinations for Each Emotion.

Action Unit FACS Label Primary Facial Muscle(s)
1 Inner Brow Raiser Frontalis, pars medialis
2 Outer Brow Raiser Frontalis, pars lateralis
4 Brow Lowering Depressor glabellae, depressor supercilii, corrugator
5 Upper Lid Raiser Levator palpebrae superioris
6 Cheek Raiser Orbicularis oculi, pars orbitalis
7 Lid Tightener Orbicularis oculi, pars palpebralis
9 Nose Wrinkler Levator labii superioris alaeque nasi
10 Upper Lip Raiser Levator labii superioris, caput infraorbitalis
12 Lip Corner Puller Zygomatic major
15 Lip Corner Depressor Depressor anguli oris
17 Chin Raiser Mentalis
20 Lip Stretcher Risorius
23 Lip Tightener Orbicularis oris
25 Lips Part Depressor labii, orbicularis oris
26 Jaw Drop Masseter, temporal and internal pterygoid
Emotion Action Units FACS Labels
Anger 4, 5, 7, 23 Brow lowering, upper lid raiser, lid tightener, lip tightener
Disgust 9, 10, 15, 17 Nose wrinkler, upper lip raiser, lip corner depressor, chin raiser
Fear 1, 2, 4, 5, 7, 20, 26 Inner & outer brow raisers, brow lowering, upper lid raiser, lid tightener, lip stretcher, jaw drop
Happy 6, 12, 25 Cheek raiser, lip corner puller, lips apart
Sad 1, 4, 6, 15, 17 Inner brow raiser, brow lowering, cheek raiser, lip corner depressor, chin raiser
Surprise 1, 2, 5, 26 Inner & outer brow raisers, upper lid raiser, jaw drop

Action units and corresponding musculature as presented in Ekman, Friesen, & Hager (2002) and Ko et al. (2021).

The accuracy of OpenFace in identifying facial landmarks in children with CLP was piloted prior to study initiation using images of individuals found online. Accuracy was assessed using both visual inspection by expert raters and OpenFace landmark identification confidence levels. All videos were visually inspected to ensure accurate landmark identification prior to analysis, and OpenFace action unit identification was required to reach a confidence level of at least 97%.
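
As an illustration of this processing step, the sketch below (not the authors’ analysis code) reads the per-frame CSV that OpenFace’s FeatureExtraction tool writes for a single clip, discards low-confidence frames, and takes the maximal intensity of each emotion-relevant action unit across frames, using the emotion–action unit combinations listed in Table 1. Column names such as confidence, success, and AU09_r follow typical OpenFace 2.0 output but may vary by version; the file name in the usage comment is hypothetical.

```python
import pandas as pd

# Emotion-to-action-unit combinations from Table 1 (intensity columns use
# OpenFace's "AUxx_r" naming convention).
EMOTION_AUS = {
    "anger":    ["AU04_r", "AU05_r", "AU07_r", "AU23_r"],
    "disgust":  ["AU09_r", "AU10_r", "AU15_r", "AU17_r"],
    "fear":     ["AU01_r", "AU02_r", "AU04_r", "AU05_r", "AU07_r", "AU20_r", "AU26_r"],
    "happy":    ["AU06_r", "AU12_r", "AU25_r"],
    "sad":      ["AU01_r", "AU04_r", "AU06_r", "AU15_r", "AU17_r"],
    "surprise": ["AU01_r", "AU02_r", "AU05_r", "AU26_r"],
}

def clip_magnitudes(csv_path: str, emotion: str, min_confidence: float = 0.97) -> pd.Series:
    """Maximal intensity of each emotion-relevant action unit across frames of one clip."""
    frames = pd.read_csv(csv_path, skipinitialspace=True)
    # Keep only frames where tracking succeeded with high landmark confidence.
    frames = frames[(frames["success"] == 1) & (frames["confidence"] >= min_confidence)]
    return frames[EMOTION_AUS[emotion]].max()

# Example usage (hypothetical file name):
# print(clip_magnitudes("participant01_disgust.csv", "disgust"))
```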

Emotion Recognition Stimuli

Twenty-five children with CLP and 25 controls completed the emotion expression task, resulting in 150 videos per group (300 videos in total). From this pool, a random number generator was used to select 60 videos that were presented to raters. The 60 videos consisted of 10 videos of each of the 6 basic emotions (50% from the CLP group and 50% from the control group). Videos were presented along with multiple choice options (anger, disgust, fear, happy, sad, surprise). If raters were unsure what emotion was expressed, they were asked to make their best guess. A random number generator determined the order in which videos were presented to raters. Raters were allowed to view videos multiple times, and no time limit was placed on their responses. Four videos (6%) were randomly selected (two from the CLP group and two from the control group) to be presented twice for intra-rater reliability calculations. The number of repeated videos was limited to 4 in order to minimize the possibility that raters would become cognizant that some videos were presented twice. Ratings were collected using an online Qualtrics survey.
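
The stimulus-selection procedure described above can be summarized in a short sketch (illustrative only; the dictionary structure, clip identifiers, and random seed are assumptions rather than details reported in the manuscript):

```python
import random

EMOTIONS = ["anger", "disgust", "fear", "happy", "sad", "surprise"]
GROUPS = ["CLP", "control"]

def build_rating_sequence(all_videos, seed=0):
    """all_videos maps (group, emotion) -> list of 25 clip identifiers."""
    rng = random.Random(seed)
    selected = []
    # 10 clips per emotion: 5 drawn at random from each group (60 clips total).
    for emotion in EMOTIONS:
        for group in GROUPS:
            for clip in rng.sample(all_videos[(group, emotion)], 5):
                selected.append((group, emotion, clip))
    # Four repeated clips (two per group, ~6%) for intra-rater reliability.
    repeats = (rng.sample([s for s in selected if s[0] == "CLP"], 2)
               + rng.sample([s for s in selected if s[0] == "control"], 2))
    sequence = selected + repeats
    rng.shuffle(sequence)  # random presentation order
    return sequence
```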

Adult Raters

Nineteen raters were identified from the community. Raters were >18 years of age and fluent speakers of English. All raters worked in occupations associated with the medical profession in either research or clinical contexts. Raters were excluded if any members of their immediate family presented with CLP, if they worked directly with patients with CLP, or if they worked professionally with children. It was noted whether raters were parents of children of any age (yes/no). Raters were 73.7% White (n = 14) and 26.3% Asian (n = 5).

Statistics

Two-sided independent-samples t-tests were used to compare OpenFace measures of the magnitude of facial movements across the CLP and control groups; Levene’s test was used to determine whether the assumption of equal variances was met. For significant differences, Cohen’s d was calculated to evaluate effect size.
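
A minimal sketch of this group comparison, assuming two arrays of per-child action unit magnitudes and using SciPy, is shown below (illustrative, not the authors’ analysis code; switching to Welch’s t-test when Levene’s test fails is one common way to operationalize the equal-variance check):

```python
import numpy as np
from scipy import stats

def compare_groups(clp_values, control_values, alpha=0.05):
    """Two-sided t-test between groups, with Cohen's d reported for significant results."""
    clp = np.asarray(clp_values, dtype=float)
    ctrl = np.asarray(control_values, dtype=float)
    # Levene's test for equality of variances; fall back to Welch's t-test if violated.
    equal_var = stats.levene(clp, ctrl).pvalue > alpha
    t_stat, p_value = stats.ttest_ind(clp, ctrl, equal_var=equal_var)
    # Cohen's d with pooled standard deviation (equal group sizes), absolute value.
    pooled_sd = np.sqrt((clp.var(ddof=1) + ctrl.var(ddof=1)) / 2)
    cohens_d = abs(clp.mean() - ctrl.mean()) / pooled_sd
    return t_stat, p_value, (cohens_d if p_value < alpha else None)
```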

For rater responses from the emotion expression recognition task, intraclass correlation coefficients (ICCs) were used to calculate intra- and inter-rater reliability. Intra-rater reliability was calculated on 6% of the data, and inter-rater reliability was calculated on the entire data set. Logistic regression was used to identify predictors of whether the intended emotions were correctly identified from facial expressions. Fixed factors in the model included emotion (anger, disgust, fear, happiness, sadness, and surprise), group (CLP, controls), sex of child (male, female), and age of child. A group by emotion interaction was originally included; however, it was not significant, and model fit was better with the interaction term removed. Group and emotion were identified as significant predictors. As such, the proportion of correct responses for each emotion was calculated in both groups. Chi-square tests were used to compare the proportion of correct answers across stimuli from the CLP and control groups, as well as across emotion expressions. For significant differences, odds ratios were calculated to determine the likelihood of an emotion expression being correctly identified in the control group versus the CLP group. A second logistic regression model was run to examine the influence of rater characteristics on emotion identification accuracy. Factors in this model included rater age, rater sex, and rater parental status (whether raters were parents of children of any age). Due to the exploratory nature of this study, an unadjusted alpha of .05 was used to determine statistical significance.
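
The sketch below shows one standard way to set up the recognition-accuracy analyses using statsmodels and SciPy (illustrative only, not the authors’ code). It assumes a long-format DataFrame named ratings with one row per rater response and columns correct (0/1), emotion, group, child_sex, and child_age; the exact variable coding and the way the reported odds ratios were derived are not specified in the manuscript.

```python
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import chi2_contingency

def fit_accuracy_model(ratings: pd.DataFrame):
    """Logistic regression with emotion, group, child sex, and child age as fixed factors
    (the group x emotion interaction dropped, as in the final model described above)."""
    return smf.logit(
        "correct ~ C(emotion) + C(group) + C(child_sex) + child_age",
        data=ratings,
    ).fit()

def group_comparison(correct_clp, n_clp, correct_ctrl, n_ctrl):
    """Pearson chi-square on the 2 x 2 correct/incorrect table, plus the odds ratio
    for correct identification in controls relative to the CLP group."""
    table = [[correct_ctrl, n_ctrl - correct_ctrl],
             [correct_clp, n_clp - correct_clp]]
    chi2, p_value, _, _ = chi2_contingency(table, correction=False)  # no Yates correction
    odds_ratio = (table[0][0] * table[1][1]) / (table[0][1] * table[1][0])
    return chi2, p_value, odds_ratio
```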

Results

Magnitude of Facial Movements

Twenty-five children with CLP (mean age = 10.08, SD = 1.02; females = 7, males = 18; White = 22, Asian = 3) and 25 controls (mean age = 10.05, SD = 1.03; females = 7, males = 18; White = 24, Asian = 1) were recruited for the study. In the CLP group, 10 children presented with Veau IV clefts and 15 with Veau III clefts. Means for action unit magnitudes are presented in Table 2.

Table 2.

Means and Standard Deviations for Facial Movements Across Group and Emotions.

Action Unit Cleft Lip & Palate Controls P-Value Cohen’s d a
ANGER
04 1.55 (.71) 2.13 (.74) .015* .80
05 .684 (.40) .844 (.48) .265 -
07 1.62 (.92) 2.57 (1.3) .012* .83
23 1.04 (.74) 1.05 (.60) .954 -
DISGUST
09 1.25 (.88) .756 (.38) .028* .73
10 1.57 (.84) 2.09 (1.2) .036* .50
15 1.43 (.74) .991 (.64) .050* .64
17 1.58 (.70) 1.74 (.80) .505 -
HAPPY
06 1.68 (.51) 2.32 (.81) .005* .94
12 2.10 (.89) 3.11 (.93) .001* 1.1
25 1.42 (.68) 1.43 (.61) .950 -
SAD
01 1.08 (.55) 1.61 (.55) .004* .96
04 1.09 (.88) 1.72 (1.0) .041* .69
06 .833 (.64) 1.01 (.53) .354 -
15 .762 (.51) .949 (.48) .243 -
17 2.21 (1.1) 1.62 (.96) .050* .64
FEAR
01 1.03 (.52) 1.32 (.61) .132 -
02 1.02 (.67) .875 (.44) .436 -
04 .950 (.71) 1.51 (1.0) .055 -
05 .936 (.57) .734 (.42) .550 -
07 .755 (.44) 1.07 (.62) .078 -
20 1.40 (.88) .763 (.41) .009* .93
26 1.14 (.59) 1.59 (1.3) .184 -
SURPRISE
01 1.24 (1.0) 1.52 (.96) .384 -
02 1.18 (.75) 1.58 (1.0) .177 -
05 1.12 (.81) 1.10 (.54) .942 -
26 1.51 (.94) 2.53 (1.2) .006* .93
* Indicates statistical significance at alpha = .05.
a Cohen’s d absolute values.

Anger.

Significantly greater movement of the brow lowering action unit (AU04) was observed in the control group (M = 2.13, SD = .74) than for children with CLP (M = 1.55, SD = .71; P = .015, Cohen’s d = .80). Similarly, significantly greater movement was observed for the lid tightener action unit (AU07) in the control group (M = 2.57, SD = 1.3) than for those with CLP (M = 1.62, SD = .92; P = .012, Cohen’s d = .83).

Disgust.

Significantly greater movement was observed for the upper lip raiser action unit (AU10) in the control group (M = 2.09, SD = 1.2) than for those with CLP (M = 1.57, SD = .84; P = .036, Cohen’s d = .50). Conversely, significantly larger values for the nose wrinkle action unit (AU9) were observed in the CLP group (M = 1.25, SD = .88; P = .028, Cohen’s d = .73) when compared with the control group (M = .756, SD = .38). Similarly, significantly larger values were observed for the lip corner depressor action unit (AU15) in the CLP group (M = 1.43, SD = .74; P = .050, Cohen’s d = .64) than the control group (M = .991, SD = .64).

Happiness.

Significantly larger values for the cheek raiser (AU06) were observed in the control group (M = 2.32, SD = .81) than the CLP group (M = 1.68, SD = .51; P = .005, Cohen’s d = .94). Similarly, significantly larger values were observed for the lip corner puller (AU12) in the control group (M = 3.11, SD = .93; P = .001, Cohen’s d = 1.1) than the CLP (M = 2.10, SD = .89).

Sadness.

Significantly larger values were observed for the inner brow raiser action unit (AU01) in the control group (M = 1.61, SD = .55) when compared with the CLP group (M = 1.08, SD = .55; P = .004, Cohen’s d = .96). Likewise, significantly larger values were observed for brow lowering (AU04) in the control group (M = 1.72, SD = 1.0) when compared with the CLP group (M = 1.09, SD = .88; P = .041, Cohen’s d = .69). For the chin raiser action unit (AU17), significantly larger values were observed in the CLP group (M = 2.21, SD = 1.1; P = .050, Cohen’s d = .54) than the control group (M = 1.62, SD = .96).

Fear.

Significantly larger values were observed for the lip stretcher action unit (AU20) in the CLP group (M = 1.40, SD = .88; P = .009, Cohen’s d = .93) when compared with the control group (M = .763, SD = .41).

Surprise.

Significantly larger values for the jaw drop action unit (AU26) were observed in the control group (M = 2.53, SD = 1.2) than the CLP group (M = 1.51, SD = .94; P = .006, Cohen’s d = .93).

Emotion Expression Ratings

Nineteen raters were recruited for this study (mean age = 32.1, SD = 7.6; females = 13; parents of at least one child = 6). The proportions of correct responses for each emotion are presented in Figure 1. A statistical summary is presented in Table 3. Overall accuracy ranged from 68% to 90%. Intra-rater reliability was in the excellent range (ICC = .92, 95% CI = .85–.96, P < .001) and inter-rater reliability was in the good range (ICC = .78, 95% CI = .71–.84, P < .001).

Figure 1. Proportion of correct responses for the emotion recognition task across group and emotion.

Table 3.

Emotion Recognition Results Across Group and Emotion.

Emotion Cleft Lip & Palate % Correct (N) Controls % Correct (N) Chi-Square P-Value Odds Ratio (95% CI) Odds Ratio P-Value
Anger 85.3% (81) 87.4% (83) .833 - -
Disgust 68.4% (65) 86.3% (82) .005* 1.26 (1.1–1.9) .029*^
Fear 30.5% (29) 74.7% (71) <.001* 2.44 (1.4–4.1) <.001*^
Happy 100% (95) 97.9% (93) .497 - -
Sad 76.8% (73) 85.3% (81) .195 - -
Surprise 70.5% (67) 97.9% (93) <.001* 1.38 (.90–2.1) .129
All Emotions 76.4% (435) 83.7% (477) .002* 1.09 (.92–1.3) .299
% Correct Responses Across Emotions (N)
Anger Disgust Fear Happy Sad Surprise
86.3%b (164) 77.4%b (147) 52.6%c (100) 98.9%a (188) 81.1%b (154) 84.2%b (160)
* Indicates statistical significance.
^ Odds ratios indicate the increased likelihood of correctly identifying an emotion in the control group compared with the cleft lip and palate group.
a,b,c Each subscript letter denotes that the proportion of correct responses does not significantly differ from the other emotions with that same letter at the .05 level.
CI = confidence interval.

Group (CLP group versus control group) was a significant predictor of expression recognition accuracy (β = 9.33, P = .002). Specifically, chi-square tests indicated that rates of correctly identifying disgust differed across the CLP and control groups (Pearson χ²(1) = 8.6; P < .001). Disgust was 26% more likely to be correctly identified in the control group than in the CLP group (P = .029, OR = 1.26, 95% CI = 1.1–1.9). Chi-square tests also indicated significant differences in accurately identifying fear (Pearson χ²(1) = 37.2; P < .001). Fear was over two times more likely to be correctly identified in the control group than in the CLP group (P < .001, OR = 2.44, 95% CI = 1.4–4.1). Likewise, rates of correctly identifying surprise differed across the CLP and control groups (Pearson χ²(1) = 26.7; P < .001). Surprise was 38% more likely to be correctly identified in the control group than in the CLP group; however, this odds ratio was not statistically significant (P = .129). Finally, when all expressions were pooled, accuracy rates significantly differed across the CLP and control groups (Pearson χ²(1) = 9.3; P = .002). Across all emotions, expressions were 9% more likely to be correctly identified in the control group (mean difference of 7.3%); however, this odds ratio was not statistically significant (P = .299). Anger, happiness, and sadness were identified at statistically similar rates across children with CLP and controls (Table 3).

Emotion was also a significant predictor of accuracy (β = −.072, P = .002). This occurred because, across all children, happiness was identified more accurately than the other emotions (98.9%; Pearson χ²(1) = 9.3; P = .002) (Table 3). Accuracy rates for anger (86.3%), disgust (77.4%), sadness (81.1%), and surprise (84.2%) were statistically similar; however, fear (52.6%) was identified significantly less accurately than all other emotions (Pearson χ²(1) = 33.2; P < .001). Neither age (β = −.009, P = .176) nor sex (β = −.015, P = .633) of the child was a significant predictor of emotion expression recognition accuracy.

Regarding rater characteristics, neither age (β = −.030, P = .06), nor sex (β = −.108, P = .606), nor parent status (β = −.001, P = .995) predicted accuracy.

Discussion

Differences in facial morphology and physiology, particularly reduced capacity for circumoral movement,27 may functionally influence facial expression of emotion in children with CLP. Likewise, these differences may affect the accuracy with which interaction partners recognize children’s facial expressions. These differences have important implications for communicating socially, forming relationships, and developing emotion knowledge. This study examined the magnitude of facial movements that children with CLP and controls employed to express basic emotions. It was hypothesized that the magnitude of facial movements in children with CLP would be reduced when compared with controls. This hypothesis was partially confirmed, as the magnitude of facial movements for multiple action units differed between groups. In most cases, greater movements were observed in the control group. In order to consider the possible impact of any differences in facial movement, observers’ ability to recognize and identify facial expressions was also investigated. It was hypothesized that observers would identify facial expressions of more complex emotions such as fear, disgust, and surprise less accurately in the CLP group. This hypothesis was partially confirmed as observers identified fear, disgust, and surprise significantly more accurately in the control group; however, the odds ratio for surprise was not significant.

Several factors could potentially explain why certain facial movements were reduced in magnitude in children with CLP. For example, impaired muscle function could have reduced children’s ability to displace soft tissues.31 This is plausible considering that electromyography has revealed an asymmetric distribution of muscle activity in cleft lips, suggesting impaired motor function.53 Aberrant perioral neurosensory function, which has been reported in patients with CLP,56 may also contribute, as the perioral sensory system influences perioral motor function.54,55 In addition, children’s expression of emotion might be influenced by the affective input they receive from interaction partners. For instance, eye-tracking studies suggest that interactional partners perceive children with CLP differently than they do children with typical facial morphology.57 Children with CLP may therefore observe different facial expressions as they interact with others, which could, in turn, influence their own expressive output. Additional study should further explore these possibilities.

As expected, findings varied for specific emotion expressions. In expressing disgust, children with CLP presented with reduced movements of the upper lip raiser action unit. These children may have presented with decreased ability to displace the upper lip vertically58 due to increased elastic modulus at the upper lip, presumably caused by stiff or fibrous scar tissue.31,59 Children in the CLP group also demonstrated greater magnitude movements of the nose wrinkling and lip corner depressor action units. These children may have employed greater magnitude movements in these muscle groups to compensate for the reduced vertical range of motion in the upper lip. As such, children were more likely to pull the corners of their mouth down and wrinkle their nose to convey disgust. Differences in expressions of disgust in children with CLP are important considering that observers were significantly less likely to interpret those expressions accurately. The fact that raters had more difficulty identifying disgust in the CLP group may indicate that vertical movement of the upper lip helped differentiate disgust from emotions such as fear or sadness in the control group. Further work is needed to delineate the etiology of these findings.

When expressing fear, children with CLP and controls engaged most action units similarly. Greater movement of the lip stretcher, however, was evident in the controls. In other words, children with CLP displayed less lateral movement of the lips. Previous research has reported asymmetric lateral lip movements in patients with CLP,60 which may have led to less overall lateral movement. This finding was somewhat surprising in that previous research has suggested that vertical lip movement is generally more impacted than lateral movement in individuals with cleft lip.58 It could be that the thin upper lip often associated with cleft lip resulted in less visually discernible change when the lip stretcher was engaged.26

In expressing surprise, controls exhibited significantly greater movement only in the jaw drop action unit. This finding may reflect the fact that the other action units measured for surprise tracked the upper face, which is less likely to be impacted by CLP. In any event, this greater magnitude jaw-dropping was likely helpful for raters, as they identified surprise more accurately in the control group. It would be interesting to determine if this finding extends to more functional contexts, as evidence suggests that adults primarily express surprise by raising their eyebrows.61

In expressing happiness, differences were observed in the cheek raiser and lip corner puller action units, suggesting that children in the control group displayed greater overall movement when smiling. This finding is consistent with past work indicating reduced upward and backward movement of the upper lip when children with CLP smile.31 Evidently, these differences in movement did not influence raters’ recognition of happiness. This might be expected considering that facial expression of happiness is one of the most easily identified in general.62,63

In expressing anger, children in the control group displayed greater movements in brow lowering and eyelid tightening. This difference is unlikely to be the direct effect of differences in facial morphology in that the upper face is less affected in individuals with CLP.64 As was the case with happiness, differences in movement evidently did not influence raters’ ability to identify anger from facial expression.

When expressing sadness, children in the CLP group presented with significantly less inner brow elevation and brow lowering than children in the control group. Additionally, children with CLP produced less elevation of the chin when compared with controls. Once again, these differences did not appear to impact emotion recognition, as no significant difference in rater ability to identify sadness was observed between the groups. It is interesting to note that the lip corner depressor action unit was similar in both groups, suggesting that children with CLP did not have difficulty depressing the lower lip. Children with CLP may compensate for reduced upper lip range of motion with movement of the lower lip.65

In interpreting the current findings, it should be noted that the current study employed elicited facial expressions, which may differ from the more variable expressions produced during spontaneous communication.7,66–69 This limitation, however, might be expected to affect children in both the CLP and control groups similarly. An advantage of the current study design is that the emotion each child intended to convey with each expression was known. Future study should examine facial expressions in more natural communicative contexts. In addition, it is important to note that tracking facial movements using machine learning algorithms is not a perfect methodology. It is encouraging that expert inspection and OpenFace confidence levels indicated that landmark identification was highly accurate; however, the software was originally developed using typical adult facial morphology. Thus, future study should optimize the measurement of facial expressions in children with CLP.

It should also be noted that the current study population was racially and culturally homogenous. Additionally, having been recruited from limited geographical areas, children came from relatively similar socioeconomic backgrounds. Socioeconomic status has been reported to influence emotion skills,70,71 and past study suggests that facial expressions of emotion are not culturally universal.72 Thus, future study should examine facial expressions of emotion in more diverse populations. Additionally, raters were all either medical professionals or involved with medical-related research. Although they denied regular interaction with individuals with CLP, this population could be more habituated to viewing a range of facial morphologies. As such, future work should examine raters with other professions. Future work may also consider children with more significant scarring or poorer surgical outcomes, as the current population did not have future lip surgeries planned. Perhaps most importantly, it will be critical to explore how the differences observed in this study affect social communication and emotional development in children with CLP.

Conclusion

In expressing emotion, children with CLP employed facial movements differently than did controls. Specifically, children with CLP employed smaller magnitude superior and lateral perioral movements to express disgust, happiness, and fear. In expressing disgust and sadness, they employed greater magnitude movements in other structures, presumably to compensate for reduced perioral range of motion. In expressing anger, happiness, and sadness, children with CLP employed smaller magnitude movements of the upper face. These differences in movements may have affected raters’ ability to identify facial expressions of more complex emotions, including disgust, fear, and surprise. Raters identified expressions of happiness, sadness, and anger with equal accuracy across children with CLP and controls. Additional study is warranted to explore the implications of these differences in the social and emotional development of children with CLP.

Acknowledgements

Portions of this manuscript were presented at the ACPA Annual Meeting in Denver, CO on 5/24.

Funding

The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: National Institute on Deafness and Other Communication Disorders (grant number T32-DC009401).

Footnotes

Declaration of Conflicting Interests

The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

References

  • 1.Niedenthal PM, Brauer M. Social functionality of human emotion. Annu Rev Psychol. 2012;63:259–285. doi: 10.1146/annurev.psych.121208.131605 [DOI] [PubMed] [Google Scholar]
  • 2.Crivelli C, Fridlund AJ. Facial displays are tools for social influence. Trends Cogn Sci. 2018;22(5):388–399. doi: 10.1016/j.tics.2018.02.006 [DOI] [PubMed] [Google Scholar]
  • 3.Garcia SE, Tully EC. Children’s recognition of happy, sad, and angry facial expressions across emotive intensities. J Exp Child Psychol. 2020;197:104881. doi: 10.1016/j.jecp.2020.104881 [DOI] [PubMed] [Google Scholar]
  • 4.Brinton B, Fujiki M. Emotion talk: Helping caregivers facilitate emotion understanding and emotion regulation. Top Lang Disord. 2011;31(3):262. doi: 10.1097/TLD.0b013e318227bcaa [DOI] [Google Scholar]
  • 5.Slifer KJ, Pulbrook V, Amari A, et al. Social acceptance and facial behavior in children with oral clefts. Cleft Palate Craniofac J. 2006;43(2):226–236. doi: 10.1597/05-018.1 [DOI] [PubMed] [Google Scholar]
  • 6.Ekman P. Facial expression and emotion. American Psychologist. 1993;48(4):384–392. doi: 10.1037/0003-066X.48.4.384 [DOI] [PubMed] [Google Scholar]
  • 7.Slifer KJ, Diver T, Amari A, et al. Assessment of facial emotion encoding and decoding skills in children with and without oral clefts. J Craniomaxillofac Surg. 2003;31(5):304–315. doi: 10.1016/s1010-5182(03)00057-x [DOI] [PubMed] [Google Scholar]
  • 8.Vyas T, Gupta P, Kumar S, Gupta R, Gupta T, Singh HP. Cleft of lip and palate: A review. J Family Med Prim Care. 2020;9(6):2621–2625. doi: 10.4103/jfmpc.jfmpc_472_20 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Dixon MJ, Marazita ML, Beaty TH, Murray JC. Cleft lip and palate: Understanding genetic and environmental influences. Nat Rev Genet. 2011;12(3):167–178. doi: 10.1038/nrg2933 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Alois CI, Ruotolo RA. An overview of cleft lip and palate. JAAPA. 2020;33(12):17. doi: 10.1097/01.JAA.0000721644.06681.06 [DOI] [PubMed] [Google Scholar]
  • 11.Abbott MA. Cleft lip and palate. Pediatr Rev. 2014;35(5):177–181. doi: 10.1542/pir.35-5-177 [DOI] [PubMed] [Google Scholar]
  • 12.IPDTOC Working Group. Prevalence at birth of cleft lip with or without cleft palate: Data from the International Perinatal Database of Typical Oral Clefts (IPDTOC). Cleft Palate Craniofac J. 2011;48(1):66–81. doi: 10.1597/09-217 [DOI] [PubMed] [Google Scholar]
  • 13.Cavalheiro MG, Lamônica DAC, de Vasconsellos Hage SR, Maximino LP. Child development skills and language in toddlers with cleft lip and palate. Int J Pediatr Otorhinolaryngol. 2019;116:18–21. doi: 10.1016/j.ijporl.2018.10.011 [DOI] [PubMed] [Google Scholar]
  • 14.Deatherage J, Bourgeois T, O’Brien M, Baylis AL. Examining risk of speech-language disorders in children with cleft lip. J Craniofac Surg. 2022;33(2):395–399. doi: 10.1097/SCS.0000000000008000 [DOI] [PubMed] [Google Scholar]
  • 15.Zhang B, Guo C, Yin H, Zheng Q, Shi B, Li J. The correlation between consonant articulation and velopharyngeal function in patients with unoperated submucous cleft palate. J Craniofac Surg. 2020;31(4):1070–1073. doi: 10.1097/SCS.0000000000006300 [DOI] [PubMed] [Google Scholar]
  • 16.Wyatt R, Sell D, Russell J, Harding A, Harland K, Albery E. Cleft palate speech dissected: A review of current knowledge and analysis. Br J Plast Surg. 1996;49(3):143–149. doi: 10.1016/s0007-1226(96)90216-7 [DOI] [PubMed] [Google Scholar]
  • 17.Fujiki RB, Kostas G, Thibeault SL. Relationship between auditory-perceptual and objective measures of resonance in children with cleft palate: Effects of intelligibility and dysphonia. Cleft Palate Craniofac J. Published online March 8, 2023. doi: 10.1177/10556656231162238 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Kummer AW. Disorders of resonance and airflow secondary to cleft palate and/or velopharyngeal dysfunction. Semin Speech Lang. 2011;32(2):141–149. doi: 10.1055/s-0031-1277716 [DOI] [PubMed] [Google Scholar]
  • 19.Navya A, Pushpavathi M, Preeti P, Supra M. Voice perturbations in repaired cleft lip and palate. Glob J Otolaryngol. 2017;8(1):555729. doi: 10.19080/GJO.2017.08.555729 [DOI] [Google Scholar]
  • 20.Fujiki RB, Thibeault SL. Are children with cleft palate at increased risk for laryngeal pathology? Cleft Palate Craniofac J. 2023;60(11):1385–1394. doi: 10.1177/10556656221104027 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.Gallagher ER, Collett BR. Neurodevelopmental and academic outcomes in children with orofacial clefts: A systematic review. Pediatrics. 2019;144(1):e20184027. doi: 10.1542/peds.2018-4027 [DOI] [PubMed] [Google Scholar]
  • 22.Kuhlmann E, van der Plas E, Axelson E, Conrad AL. Brain developmental trajectories in children and young adults with isolated cleft lip and/or cleft palate. Dev Neuropsychol. 2021;46(4):314–326. doi: 10.1080/87565641.2021.1946691 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.Demir T, Karacetin G, Baghaki S, Aydin Y. Psychiatric assessment of children with nonsyndromic cleft lip and palate. Gen Hosp Psychiatry. 2011;33(6):594–603. doi: 10.1016/j.genhosppsych.2011.06.006 [DOI] [PubMed] [Google Scholar]
  • 24.Miller CK. Feeding issues and interventions in infants and children with clefts and craniofacial syndromes. Semin Speech Lang. 2011;32(2):115–126. doi: 10.1055/s-0031-1277714 [DOI] [PubMed] [Google Scholar]
  • 25.Abdollahi Fakhim S, Shahidi N, Lotfi A. Prevalence of associated anomalies in cleft lip and/or palate patients. Iran J Otorhinolaryngol. 2016;28(85):135–139. Accessed August 6, 2023. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4881882/ [PMC free article] [PubMed] [Google Scholar]
  • 26.Doğan S, Onçağ G, Akin Y. Craniofacial development in children with unilateral cleft lip and palate. Br J Oral Maxillofac Surg. 2006;44(1):28–33. doi: 10.1016/j.bjoms.2005.07.023 [DOI] [PubMed] [Google Scholar]
  • 27.Trotman CA, Faraway JJ, Phillips C, van Aalst J. Effects of lip revision surgery in cleft lip/palate patients. J Dent Res. 2010;89(7):728–732. doi: 10.1177/0022034510365485 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Fujiki RB, Zhao F, Niedenthal PM, Thibeault SL. Facial mimicry and emotion recognition in children with cleft lip and palate. In: American Cleft Palate-Craniofacial Association 80th Annual Meeting. Podium presentation; 2023. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.Dindaroglu F, Dogan E, Dogan S. Is the nasolabial region symmetric in individuals with cleft lip and palate? Cleft Palate Craniofac J. 2024;61(1):12–19. doi: 10.1177/10556656221116535 [DOI] [PubMed] [Google Scholar]
  • 30.Al Rudainy D, Ju X, Mehendale F, Ayoub A. The effect of facial expression on facial symmetry in surgically managed unilateral cleft lip and palate patients (UCLP). J Plast Reconstr Aesthet Surg. 2019;72(2):273–280. doi: 10.1016/j.bjps.2018.10.004 [DOI] [PubMed] [Google Scholar]
  • 31.Lee D, Tanikawa C, Yamashiro T. Impairment in facial expression generation in patients with repaired unilateral cleft lip: Effects of the physical properties of facial soft tissues. PLOS ONE. 2021;16(4):e0249961. doi: 10.1371/journal.pone.0249961 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32.Brons S, Meulstee JW, Loonen TGJ, et al. Three-dimensional facial development of children with unilateral cleft lip and palate during the first year of life in comparison with normative average faces. PeerJ. 2019;7:e7302. doi: 10.7717/peerj.7302 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33.Montes ABM, Oliveira TM, Gavião MBD, Barbosa TdS. Orofacial functions and quality of life in children with unilateral cleft lip and palate. Braz Oral Res. 2019;33:e0061. doi: 10.1590/1807-3107bor-2019.vol33.0061 [DOI] [PubMed] [Google Scholar]
  • 34.Martin J, Rychlowska M, Wood A, Niedenthal P. Smiles as multipurpose social signals. Trends Cogn Sci. 2017;21(11):864–877. doi: 10.1016/j.tics.2017.08.007 [DOI] [PubMed] [Google Scholar]
  • 35.Rychlowska M, Jack RE, Garrod OGB, Schyns PG, Martin JD, Niedenthal PM. Functional smiles: Tools for love, sympathy, and war. Psychol Sci. 2017;28(9):1259–1270. doi: 10.1177/0956797617706082 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36.Ekman P, Friesen WV, Hager JC. Facial action coding system. A Human Face; 2002. [Google Scholar]
  • 37.Friesen W, Ekman P. EMFACS-7: Emotional facial action coding system. 1983. Accessed August 1, 2023. https://www.semanticscholar.org/paper/EMFACS-7%3A-Emotional-Facial-Action-Coding-System-Friesen-Ekman/233a5c9460816298e208b6e42b979888415a6d5d
  • 38.Quast A, Batschkus S, Brinkmann J, et al. Effect of cleft lip on adolescent evaluation of faces: An eye-tracking study. Pediatr Dent. 2022;44(2):108–113. [PubMed] [Google Scholar]
  • 39.Meyer-Marcotty P, Kochel J, Boehm H, Linz C, Klammert U, Stellzig-Eisenhauer A. Face perception in patients with unilateral cleft lip and palate and patients with severe class III malocclusion compared to controls. J Craniomaxillofac Surg. 2011;39(3):158–163. doi: 10.1016/j.jcms.2010.05.001 [DOI] [PubMed] [Google Scholar]
  • 40.Gkantidis N, Papamanou DA, Christou P, Topouzelis N. Aesthetic outcome of cleft lip and palate treatment. Perceptions of patients, families, and health professionals compared to the general public. J Craniomaxillofac Surg. 2013;41(7):e105–e110. doi: 10.1016/j.jcms.2012.11.034 [DOI] [PubMed] [Google Scholar]
  • 41.Hartmann GC, Guimarães LK, Maggioni CG, et al. Social attractiveness perception of a cleft lip repair in an adolescent via eye-tracking. J Stomatol Oral Maxillofac Surg. 2022;123(5):e526–e532. doi: 10.1016/j.jormas.2022.01.007 [DOI] [PubMed] [Google Scholar]
  • 42.Baltrušaitis T, Zadeh A, Lim YC, Morency LP. Openface 2.0: facial behavior analysis toolkit. 2018 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018); 2018; IEEE:59–66. doi: 10.1109/FG.2018.00019 [DOI] [Google Scholar]
  • 43.Brinton B, Spackman MP, Fujiki M, Ricks J. What should chris say? The ability of children with specific language impairment to recognize the need to dissemble emotions in social situations. J Speech Lang Hear Res. 2007;50(3):798–811. doi: 10.1044/1092-4388(2007/055) [DOI] [PubMed] [Google Scholar]
  • 44.Fujiki M, Spackman MP, Brinton B, Illig T. Ability of children with language impairment to understand emotion conveyed by prosody in a narrative passage. Int J Lang Commun Disord. 2008;43(3):330–345. doi: 10.1080/13682820701507377 [DOI] [PubMed] [Google Scholar]
  • 45.Brinton B, Fujiki M, Hurst NQ, Jones ER, Spackman MP. The ability of children with language impairment to dissemble emotions in hypothetical scenarios and natural situations. Lang Speech Hear Serv Sch. 2015;46(4):325–336. doi: 10.1044/2015_LSHSS-14-0096 [DOI] [PubMed] [Google Scholar]
  • 46.Fujiki M, Spackman MP, Brinton B, Hall A. The relationship of language and emotion regulation skills to reticence in children with specific language impairment. J Speech Lang Hear Res. 2004;47(3):637–646. doi: 10.1044/1092-4388(2004/049) [DOI] [PubMed] [Google Scholar]
  • 47.Fitzpatrick P, Frazier JA, Cochran D, Mitchell T, Coleman C, Schmidt RC. Relationship between theory of mind, emotion recognition, and social synchrony in adolescents with and without autism. Front Psychol. 2018;9:1337. doi: 10.3389/fpsyg.2018.01337 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 48.Ekman P, Friesen WV. Facial action coding system: Manual. Consulting Psychologists Press; 1978. [Google Scholar]
  • 49.Baltrušaitis T, Mahmoud M, Robinson P. Cross-dataset learning and person-specific normalisation for automatic action unit detection. 2015 11th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG); 2015. Vol 06:1–6. doi: 10.1109/FG.2015.7284869 [DOI] [Google Scholar]
  • 50.Ko H, Kim K, Bae M, et al. Changes in computer-analyzed facial expressions with age. Sensors. 2021;21(14):4858. doi: 10.3390/s21144858 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 51.Gupta P, Maji S, Mehra R. Compound facial emotion recognition based on facial action coding system and SHAP values. IRJASH. 2023;5(Issue 05S):26–34. doi: 10.47392/irjash.2023.S004 [DOI] [Google Scholar]
  • 52.Olderbak S, Hildebrandt A, Pinkpank T, Sommer W, Wilhelm O. Psychometric challenges and proposed solutions when scoring facial emotion expression codes. Behav Res. 2014;46(4):992–1006. doi: 10.3758/s13428-013-0421-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 53.Radeke J, van Dijk JP, Holobar A, Lapatki BG. Electrophysiological method to examine muscle fiber architecture in the upper lip in cleft-lip patients. J Orofac Orthop. 2014; 75(1):51–61. doi: 10.1007/s00056-013-0193-5 [DOI] [PubMed] [Google Scholar]
  • 54.Essick GK. Effects of hypesthesia on oral behaviors of the orthognathic surgery patient. J Oral Maxillofac Surg. 1998;56(2):158–160. doi: 10.1016/S0278-2391(98)90857-6 [DOI] [PubMed] [Google Scholar]
  • 55.Stranc MF, Fogel M, Dische S. Comparison of lip function: Surgery vs radiotherapy. Br J Plast Surg. 1987;40(6):598–604. doi: 10.1016/0007-1226(87)90154-8 [DOI] [PubMed] [Google Scholar]
  • 56.Essick GK, Dorion C, Rumley S, Rogers L, Young M, Trotman CA. Report of altered sensation in patients with cleft lip. Cleft Palate Craniofac J. 2005;42(2):178–184. doi: 10.1597/03-124.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 57.Meyer-Marcotty P, Gerdes ABM, Reuther T, Stellzig-Eisenhauer A, Alpers GW. Persons with cleft lip and palate are looked at differently. J Dent Res. 2010;89(4):400–404. doi: 10.1177/0022034509359488 [DOI] [PubMed] [Google Scholar]
  • 58.Trotman CA, Phillips C, Essick GK, et al. Functional outcomes of cleft lip surgery. Part I: Study design and surgeon ratings of lip disability and need for lip revision. Cleft Palate Craniofac J. 2007;44(6):598–606. doi: 10.1597/06-124.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 59.Barlow SM, Trotman CA, Chu SY, Lee J. Modification of perioral stiffness in patients with repaired cleft lip and palate. Cleft Palate Craniofac J. 2012;49(5):524–529. doi: 10.1597/10-092 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 60.Offerman RE, Cleall JF, Subtelny JD. Symmetry of lip activity in repaired unilateral clefts of the lip. Cleft Palate J. 1964;1(3):347–356. http://cleftpalatejournal.pitt.edu/ojs/cleftpalate/article/view/41. Accessed August 6, 2023. [PubMed] [Google Scholar]
  • 61.Reisenzein R, Bördgen S, Holtbernd T, Matz D. Evidence for strong dissociation between emotion and facial displays: The case of surprise. J Pers Soc Psychol. 2006;91(2):295–315. doi: 10.1037/0022-3514.91.2.295 [DOI] [PubMed] [Google Scholar]
  • 62.Spackman MP, Fujiki M, Brinton B. Understanding emotions in context: The effects of language impairment on children’s ability to infer emotional reactions. Int J Lang Commun Disord. 2006;41(2):173–188. doi: 10.1080/13682820500224091 [DOI] [PubMed] [Google Scholar]
  • 63.Samadiani N, Huang G, Hu Y, Li X. Happy emotion recognition from unconstrained videos using 3D hybrid deep features. IEEE Access. 2021;9:35524–35538. doi: 10.1109/ACCESS.2021.3061744 [DOI] [Google Scholar]
  • 64.Duffy S, Noar JH, Evans RD, Sanders R. Three-Dimensional analysis of the child cleft face. The Cleft Palate Craniofacial Journal. 2000;37(2):137–144. doi: 10.1597/1545-1569_2000_037_0137_tdaotc_2.3.co_2 [DOI] [PubMed] [Google Scholar]
  • 65.Trotman CA, Barlow SM, Faraway JJ. Functional outcomes of cleft lip surgery. Part III: Measurement of lip forces. The Cleft Palate Craniofacial Journal. 2007;44(6):617–623. doi: 10.1597/06-138.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 66.Motley MT, Camden CT. Facial expression of emotion: A comparison of posed expressions versus spontaneous expressions in an interpersonal communication setting. West J Comm. 1988;52(1):1–22. doi: 10.1080/10570318809389622 [DOI] [Google Scholar]
  • 67.Namba S, Makihara S, Kabir RS, Miyatani M, Nakao T. Spontaneous facial expressions are different from posed facial expressions: Morphological properties and dynamic sequences. Curr Psychol. 2017;36(3):593–605. doi: 10.1007/s12144-016-9448-9 [DOI] [Google Scholar]
  • 68.Schmidt KL, Ambadar Z, Cohn JF, Reed LI. Movement differences between deliberate and spontaneous facial expressions: Zygomaticus Major action in smiling. J Nonverbal Behav. 2006;30(1):37–52. doi: 10.1007/s10919-005-0003-x [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 69.Park S, Lee K, Lim JA, et al. Differences in facial expressions between spontaneous and posed smiles: Automated method by action units and three-dimensional facial landmarks. Sensors (Basel). 2020;20(4):1199. doi: 10.3390/s20041199 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 70.Côté S, Gyurak A, Levenson RW. The ability to regulate emotion is associated with greater well-being, income, and socioeconomic Status. Emotion. 2010;10(6):923–933. doi: 10.1037/a0021156 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 71.Elsayed NM, Luby JL, Barch DM. Contributions of socioeconomic status and cognition to emotion processes and internalizing psychopathology in childhood and adolescence: A systematic review. Neurosci Biobehav Rev. 2023;152:105303. doi: 10.1016/j.neubiorev.2023.105303 [DOI] [PubMed] [Google Scholar]
  • 72.Jack RE, Garrod OGB, Yu H, Caldara R, Schyns PG. Facial expressions of emotion are not culturally universal. Proc Natl Acad Sci U S A. 2012;109(19):7241–7244. doi: 10.1073/pnas.1200155109 [DOI] [PMC free article] [PubMed] [Google Scholar]
