Frontiers in Psychology
2021 Feb 4;12:591682. doi: 10.3389/fpsyg.2021.591682

How Can People Express Their Trait Self-Esteem Through Their Faces in 3D Space?

Xiaoyang Wang 1,2, Xiaoqian Liu 1,2,*, Yuqian Wang 1, Tingshao Zhu 1,2
PMCID: PMC7889801  PMID: 33613379

Abstract

Background

Trait self-esteem reflects a stable self-evaluation and affects patterns of social interaction. However, whether and how trait self-esteem is expressed through behavior remains controversial. Considering that facial expressions effectively convey information about personal traits, the present study investigated the three-dimensional (3D) facial movements related to self-esteem level and the sex differences therein.

Methods

The sample comprised 238 participants (46.2% male, 53.8% female). Trait self-esteem was assessed with the Rosenberg Self-Esteem Scale (SES) (47.9% low self-esteem, 52.1% high self-esteem). During self-introductions, facial movements in 3D space were recorded with a Microsoft Kinect. Two-way ANOVAs were performed to analyze the effects of self-esteem level and gender on 3D facial movements. In addition, logistic regression models were built to describe the relationship between 3D facial movements and self-esteem level for each gender.

Results

The two-way ANOVAs revealed main effects of trait self-esteem level on cheek and lip movements, as well as a significant interaction between trait self-esteem and gender on the variability of lip movements. In addition, combinations of facial movements identified trait self-esteem level in men and women with 75.5% and 68.0% accuracy, respectively.

Conclusion

The present results suggest that the 3D facial expressions of individuals differ by trait self-esteem level and that this difference is moderated by gender. Our study points to a possible route through which trait self-esteem operates in social interaction and provides a basis for automatic self-esteem recognition.

Keywords: facial expressions, three-dimensional data, gender, logistic regression, trait self-esteem, two-way ANOVA

Introduction

Self-esteem is a person's positive or negative evaluation of himself or herself (Rosenberg, 1965) and can be divided into trait self-esteem (stable over long periods) and state self-esteem (fluctuating over short periods) (Kernis, 1993). While state self-esteem is influenced by social evaluation (Leary et al., 2003), trait self-esteem is shaped by social evaluation and in turn affects social behavior (Greenwald and Banaji, 1995), social decision-making (Anthony et al., 2007), and, ultimately, individual well-being (Baumeister et al., 2003). Individuals with higher trait self-esteem usually appear self-confident to others (Krause et al., 2016). They make a more positive first impression (Back et al., 2011), are more likely to be admitted during interviews (Imada and Hakel, 1977), and show higher sales performance at work (Miner, 1962). Conversely, the perception of others' self-esteem is an important factor in determining the mode of communication (Macgregor et al., 2013). One way to understand how trait self-esteem affects social interaction is to investigate how trait self-esteem is expressed.

Previous studies have reported that males have slightly higher self-esteem than females (Feingold, 1994). Some comparisons have shown that gender is an important factor in the relationship between self-esteem and body image (Israel and Ivanova, 2002), alcohol use (Neumann et al., 2009), and life satisfaction (Moksnes and Espnes, 2013). In addition, the self-reported personality traits of men and women with high self-esteem differ somewhat (Robins et al., 2001): agentic traits such as extraversion are more important for men's self-esteem, whereas communal traits such as agreeableness are more important for women's self-esteem. However, only a few studies have investigated sex differences in the expression of trait self-esteem. Given that men and women with high self-esteem tend to show different traits and behaviors, they are likely to express their trait self-esteem in different ways.

Many researchers have studied the perception of self-esteem rather than its expression, with mixed results. One study suggested that self-esteem may not be perceived well through explicit behavioral cues (e.g., maintaining eye contact or hesitating when speaking) (Savin-Williams and Jaquish, 1981). Other studies found no correlation between self-reported self-esteem and others' perceptions of it (Yeagley et al., 2007; Kilianski, 2008). However, a recent study showed that, when acquaintance is controlled, people can identify trait self-esteem from self-introduction videos fairly accurately (Hirschmueller et al., 2018). The perception of self-esteem is influenced by prior knowledge and individual differences (Ostrowski, 2010; Nicole et al., 2015), and it is difficult for observers to pick up fine-grained behavioral information. Using technology to encode the expression of trait self-esteem may therefore be a more objective way to study its interpersonal effects. For example, previous research found a relationship between encoded gait information and trait self-esteem (Sun et al., 2017).

Compared with other nonverbal channels, facial movements convey more information about personal traits (Harrigan et al., 2004) and attract more attention. Facial expressions also play an important role in social interaction and communication (Berry and Landry, 1997; Ishii et al., 2018). Facial movement is therefore likely to be one channel through which trait self-esteem is expressed. On the one hand, facial movements are related to personality and depression (de Melo et al., 2019; Suen et al., 2019), which are in turn linked to self-esteem (Robins et al., 2001). On the other hand, gender differences exist in facial movements (Hu et al., 2010). Taken together, these findings suggest that the facial movements related to trait self-esteem may differ by sex, so it is important to distinguish the facial expressions of men and women with high versus low self-esteem. Moreover, three-dimensional (3D) facial data are generally considered to contain more information than 2D facial data and perform better in identifying traits (Jones et al., 2012; de Melo et al., 2019). It is therefore more comprehensive to explore the facial expression of trait self-esteem in 3D space.

The purpose of this study was to investigate the 3D facial movements related to trait self-esteem and whether this relationship is influenced by gender. Based on previous findings, we hypothesized that facial movements would be influenced by trait self-esteem level and by gender separately. In addition, we expected an interaction between self-esteem level and gender in their effect on facial movements.

Materials and Methods

Participants

A total of 240 students and employees of the University of Chinese Academy of Sciences participated in this study. Participants had to meet three inclusion criteria: (1) at least 18 years old; (2) fluent in Mandarin; and (3) healthy and able to make normal facial expressions. After excluding participants who frequently covered their faces with their hands, 238 qualified samples remained (110 male, 128 female), a retention rate of 99.17%. The average age was 22.8 years (SD = 2.8).

A post hoc power analysis using G*Power (Faul et al., 2007), with N = 238, an alpha level of 0.05, and a medium effect size, indicated a statistical power of 0.97 (MacCallum et al., 1996).

Measures

Rosenberg Self-Esteem Scale

This study used the Chinese version of the Rosenberg Self-Esteem Scale (SES) (Rosenberg, 1965; Ji and Yu, 1993) to measure trait self-esteem (Kang, 2019). The scale contains 10 items, and total scores range from 10 to 40; the higher the score, the higher the level of self-esteem. In general, scores below 15 indicate very low self-esteem, scores of 15–20 low self-esteem, scores of 20–30 normal self-esteem, scores of 30–35 high self-esteem, and scores above 35 very high self-esteem. In this study, we divided participants into two groups based on their self-esteem scores: scores above 30 formed the high self-esteem group (HSE), and scores of 30 or below formed the low self-esteem group (LSE).
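The grouping rule described above can be summarized in a short illustrative sketch (this is not code used in the study; the function name is ours, while the cut-offs and band labels follow the text):

```python
def classify_self_esteem(ses_score: int) -> tuple[str, str]:
    """Map a total Rosenberg SES score (10-40) to the descriptive band
    described in the text and to the binary group used in this study."""
    if not 10 <= ses_score <= 40:
        raise ValueError("SES total score must lie between 10 and 40")
    if ses_score < 15:
        band = "very low"
    elif ses_score < 20:
        band = "low"
    elif ses_score <= 30:
        band = "normal"
    elif ses_score <= 35:
        band = "high"
    else:
        band = "very high"
    # Study-specific binary split: above 30 -> HSE, 30 or below -> LSE.
    group = "HSE" if ses_score > 30 else "LSE"
    return band, group

print(classify_self_esteem(32))  # ("high", "HSE")
```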

Kinect

A Kinect camera can automatically recognize facial movements (Zaletelj and Košir, 2017) (Figure 1). Its sampling frequency is 30 Hz, i.e., it records 30 frames of facial activity per second. To describe 3D facial movements quantitatively, the Kinect captures and tracks 17 key action units (AUs) (Ekman and Friesen, 1978) around the facial features in 3D space. The specific meaning of each AU is given in Table 1, and the corresponding facial movements are illustrated in Figure 2; three of the AUs move in the depth dimension. Most AUs are expressed as a numeric weight between 0 and 1. Three of the AUs – jaw slide right, right eyebrow lower, and left eyebrow lower – vary between −1 and +1. The absolute value of the weight represents the moving distance of the corresponding AU. The Kinect thus produces one time series of weights for each AU.
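As a rough sketch only (the actual Kinect SDK calls are not shown, and all names below are our own), the recorded output described above can be thought of as one 30 Hz weight series per AU:

```python
import numpy as np

# The 17 AUs tracked by the Kinect (Table 1); the order here is arbitrary.
AU_NAMES = [
    "JawOpen", "LipPucker", "JawSlideRight", "LipStretcherRight",
    "LipStretcherLeft", "LipCornerPullerLeft", "LipCornerPullerRight",
    "LipCornerDepressorLeft", "LipCornerDepressorRight", "LeftcheekPuff",
    "RightcheekPuff", "LefteyeClosed", "RighteyeClosed",
    "RighteyebrowLower", "LefteyebrowLower", "LowerlipDepressorLeft",
    "LowerlipDepressorRight",
]
# AUs whose weights can be negative (range [-1, 1] instead of [0, 1]).
BIPOLAR_AUS = {"JawSlideRight", "RighteyebrowLower", "LefteyebrowLower"}

SAMPLING_RATE_HZ = 30   # 30 frames of facial activity per second
DURATION_S = 60         # the 1-min self-introduction

# One weight per AU per frame: a (frames x 17) array, i.e. one time series
# per AU, which later feature extraction can operate on.
n_frames = SAMPLING_RATE_HZ * DURATION_S
au_recording = np.zeros((n_frames, len(AU_NAMES)))
```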

FIGURE 1. The Kinect camera and a brief introduction to its main components.

TABLE 1. The specific meaning of each AU.

Facial action units Action description Value range
JawOpen Dropping of the chin without opening the mouth (0, 1)
LipPucker Pouting (0, 1)
JawSlideRight Slide the chin to the right (a negative value indicates a slide of the chin to the left) (−1, 1)
LipStretcherRight Stretch the right corners of mouth to the right (0, 1)
LipStretcherLeft Stretch the left corners of mouth to the left (0, 1)
LipCornerPullerLeft Pull the left corners of mouth to the upper left (0, 1)
LipCornerPullerRight Pull the right corners of mouth to the upper right (0, 1)
LipCornerDepressorLeft Pull the left corners of mouth to the left and down (0, 1)
LipCornerDepressorRight Pull the right corners of mouth to the right and down (0, 1)
LeftcheekPuff Puff up the left cheek (0, 1)
RightcheekPuff Puff up the right cheek (0, 1)
LefteyeClosed Close the left eye (0, 1)
RighteyeClosed Close the right eye (0, 1)
RighteyebrowLower Right eyebrow down (the negative number indicates a raised right eyebrow) (−1, 1)
LefteyebrowLower Left eyebrow down (the negative number means the left eyebrow is raised) (−1, 1)
LowerlipDepressorLeft Pull the lower lip down to the left (0, 1)
LowerlipDepressorRight Pull the lower lip down to the right (0, 1)

The value represents the magnitude of the facial action unit's movement relative to the resting state; a value of 0.8, for example, indicates a very large movement.

FIGURE 2. The schematic diagram of each AU.

Design and Procedure

To analyze the relationship between self-esteem and 3D facial movements, we designed a facial data collection experiment based on a self-introduction situation, with the following steps: (1) demographic information was recorded; (2) the level of trait self-esteem was measured with the SES; and (3) during the experimental task (a self-introduction), the participants' facial movements in 3D space (the movement of AUs) were captured by the Kinect.

Specifically, a self-introduction situation was used for the experimental task. Compared with conversational contexts and emotion-induction scenes, a self-introduction delivered to a camera is least affected by social evaluation; we therefore assume that in this situation participants express their trait rather than state self-esteem (Park et al., 2007). Participants were asked to give a 1-min self-introduction following this outline: (1) introduce yourself and your hometown in detail; (2) give a detailed account of your major and the interesting research work you did at school; and (3) describe your plans for the future and the kind of work you would like to do. Illumination, relative position, and the distance between the Kinect and the participant were strictly controlled: data were collected in a bright room without direct sunlight, participants were asked to keep their bodies as still as possible, and the camera-to-participant distance was fixed at 3 m.

The whole process of the experiment is shown in Figure 3. The same steps were followed for each subject independently. The entire process took approximately 10 min.

FIGURE 3. The whole process of the experiment.

Statistical Analysis

After the experiment, we had demographic information, an SES score, and 3D facial movement data for each participant. For the self-esteem scores, participants scoring 30 or below were assigned to the LSE group and those scoring above 30 to the HSE group, as described above. For the AU data, we normalized each AU time series and removed outliers lying more than three standard deviations from the mean. The mean and standard deviation of each AU were then calculated to describe the magnitude and variability of the facial movements: the larger the absolute mean, the larger the movement amplitude, and the larger the standard deviation, the greater the facial variability.
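A minimal sketch of this feature-extraction step, assuming each AU has already been stored as a one-dimensional time series (the paper does not specify the normalization scheme, so only the outlier removal and the mean/SD features are shown; function names are ours):

```python
import numpy as np

def au_features(series):
    """Return the mean (movement amplitude) and standard deviation
    (movement variability) of one AU time series, after dropping frames
    more than three standard deviations from the series mean."""
    series = np.asarray(series, dtype=float)
    mu, sigma = series.mean(), series.std()
    kept = series[np.abs(series - mu) <= 3 * sigma]  # remove outlier frames
    return kept.mean(), kept.std()

# Example: 1,800 frames (60 s at 30 Hz) of synthetic data for one AU.
rng = np.random.default_rng(0)
frames = rng.normal(loc=0.2, scale=0.05, size=1800)
amplitude, variability = au_features(frames)  # e.g. LeftcheekPuff_mean / _std
```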

For each AU indicator, descriptive analyses yielded means and SDs for each group (LSE vs. HSE), also broken down by gender (male vs. female). Considering the ratio of the number of variables to the sample size, differences across group × gender were tested with two-way ANOVAs. Multivariate analyses using logistic regression models were then conducted to identify the relationship between AU movements and self-esteem level, and odds ratios (ORs) were calculated. Stepwise logistic regression was used to remove unimportant independent variables and obtain the best-fitting models. All statistical analyses were performed in SPSS 22.0, with a two-tailed P < 0.05 considered statistically significant.
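The analyses themselves were run in SPSS 22.0. For readers who prefer code, roughly equivalent steps could be sketched in Python as below; this is an illustration, not the authors' pipeline: statsmodels is used for the 2 × 2 ANOVA, a plain logistic regression stands in for SPSS's stepwise procedure, continuous predictors stand in for the quartile-binned predictors of Table 4, and the file and column names are hypothetical.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import logit, ols

# One row per participant: AU features plus 'sex' and 'se_level' labels.
df = pd.read_csv("au_features.csv")  # hypothetical file

# Two-way ANOVA: main effects of sex and self-esteem level plus their interaction.
model = ols("LeftcheekPuff_mean ~ C(sex) * C(se_level)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))

# Logistic regression of self-esteem group on AU features, fitted for men only
# (the study built separate models for men and women).
men = df[df["sex"] == "male"].copy()
men["is_hse"] = (men["se_level"] == "HSE").astype(int)
lr = logit(
    "is_hse ~ LipPucker_mean + LeftcheekPuff_mean + LipCornerDepressorLeft_std",
    data=men,
).fit()
print(lr.summary())
```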

Results

A total of 110 males (46.22%) and 128 females (53.78%) were included. There were no significant sex × SE-level differences in age distribution. The means and standard deviations of AU movements are reported as M ± SD in Table 2. The significant results of the 2 × 2 ANOVAs by SE level (LSE vs. HSE) and sex (male vs. female) are shown in Table 3, and the significant models predicting self-esteem level from AU movements are shown in Table 4.

TABLE 2. Participant characteristics.

Male (n = 110)
Female (n = 128)
LSE (n = 50) HSE (n = 60) LSE (n = 64) HSE (n = 64)
M ± SD M ± SD M ± SD M ± SD
Age 22.84 ± 1.973 23.33 ± 3.079 21.98 ± 2.504 23.17 ± 3.215
SES score 27.00 ± 3.369 34.77 ± 2.513 26.66 ± 3.433 34.31 ± 2.085
Mean value of facial AUs
Jaw_open_mean 0.105 ± 0.056 0.113 ± 0.061 0.102 ± 0.062 0.104 ± 0.055
LipPucker_mean 0.383 ± 0.217 0.425 ± 0.195 0.409 ± 0.214 0.393 ± 0.184
JawSlideRight_mean 0.012 ± 0.07 -0.001 ± 0.063 0.013 ± 0.055 0.006 ± 0.058
LipStretcherRight_mean 0.094 ± 0.081 0.123 ± 0.1 0.083 ± 0.087 0.09 ± 0.08
LipStretcherLeft_mean 0.102 ± 0.092 0.125 ± 0.093 0.11 ± 0.092 0.121 ± 0.087
LipCornerPullerLeft_mean 0.128 ± 0.129 0.133 ± 0.123 0.156 ± 0.134 0.127 ± 0.131
LipCornerPullerRight_mean 0.147 ± 0.123 0.168 ± 0.13 0.197 ± 0.154 0.162 ± 0.138
LipCornerDepressorLeft_mean 0.131 ± 0.086 0.101 ± 0.072 0.136 ± 0.085 0.12 ± 0.087
LipCornerDepressorRight_mean 0.126 ± 0.084 0.111 ± 0.095 0.146 ± 0.101 0.131 ± 0.096
LeftcheekPuff_mean 0.19 ± 0.171 0.278 ± 0.177 0.2 ± 0.193 0.228 ± 0.18
RightcheekPuff_mean 0.228 ± 0.194 0.308 ± 0.195 0.216 ± 0.208 0.255 ± 0.226
LefteyeClosed_mean 0.264 ± 0.13 0.289 ± 0.142 0.267 ± 0.136 0.253 ± 0.113
RighteyeClosed_mean 0.356 ± 0.158 0.36 ± 0.158 0.348 ± 0.168 0.319 ± 0.128
RighteyebrowLower_mean 0.181 ± 0.311 0.233 ± 0.371 0.2 ± 0.289 0.105 ± 0.292
LefteyebrowLower_mean 0.197 ± 0.343 0.269 ± 0.391 0.243 ± 0.322 0.138 ± 0.333
LowerlipDepressorLeft_mean 0.038 ± 0.049 0.039 ± 0.039 0.071 ± 0.074 0.048 ± 0.046
LowerlipDepressorRight_mean 0.047 ± 0.054 0.044 ± 0.035 0.074 ± 0.066 0.056 ± 0.05
Std value of facial AUs
Jaw_open_std 0.045 ± 0.015 0.048 ± 0.015 0.055 ± 0.016 0.055 ± 0.016
LipPucker_std 0.094 ± 0.039 0.111 ± 0.049 0.127 ± 0.052 0.125 ± 0.044
JawSlideRight_std 0.049 ± 0.02 0.054 ± 0.022 0.05 ± 0.017 0.046 ± 0.013
LipStretcherRight_std 0.056 ± 0.034 0.068 ± 0.038 0.055 ± 0.037 0.062 ± 0.039
LipStretcherLeft_std 0.056 ± 0.031 0.067 ± 0.033 0.062 ± 0.035 0.066 ± 0.033
LipCornerPullerLeft_std 0.086 ± 0.056 0.094 ± 0.058 0.116 ± 0.069 0.09 ± 0.07
LipCornerPullerRight_std 0.089 ± 0.054 0.103 ± 0.057 0.117 ± 0.064 0.096 ± 0.06
LipCornerDepressorLeft_std 0.077 ± 0.033 0.072 ± 0.046 0.087 ± 0.038 0.073 ± 0.037
LipCornerDepressorRight_std 0.075 ± 0.032 0.074 ± 0.045 0.086 ± 0.038 0.077 ± 0.04
LeftcheekPuff_std 0.062 ± 0.049 0.089 ± 0.045 0.076 ± 0.05 0.081 ± 0.041
RightcheekPuff_std 0.091 ± 0.065 0.112 ± 0.051 0.091 ± 0.054 0.093 ± 0.057
LefteyeClosed_std 0.098 ± 0.03 0.108 ± 0.028 0.112 ± 0.029 0.116 ± 0.028
RighteyeClosed_std 0.115 ± 0.036 0.119 ± 0.031 0.123 ± 0.033 0.13 ± 0.034
RighteyebrowLower_std 0.084 ± 0.037 0.089 ± 0.045 0.095 ± 0.045 0.087 ± 0.034
LefteyebrowLower_std 0.090 ± 0.047 0.092 ± 0.047 0.097 ± 0.044 0.096 ± 0.041
LowerlipDepressorLeft_std 0.039 ± 0.044 0.048 ± 0.041 0.073 ± 0.056 0.053 ± 0.045
LowerlipDepressorRight_std 0.041 ± 0.048 0.046 ± 0.037 0.070 ± 0.052 0.053 ± 0.045

LSE, low trait self-esteem group; HSE, high trait self-esteem group; _mean, the mean value of the AU’s corresponding temporal records; _std, the standard deviation of the AU’s corresponding temporal records.

TABLE 3. Significant results of the two-way analysis of variance.

Source Variable SS F P η2 Pairwise comparisons
Sex LowerlipDepressorLeft_mean 0.026 8.716** 0.003 0.036 Males < females
LowerlipDepressorRight_mean 0.023 8.485** 0.004 0.035 Males < females
Jaw_open_std 0.004 16.951*** 0.000 0.068 Males < females
LipPucker_std 0.033 15.169*** 0.000 0.061 Males < females
LefteyeClosed_std 0.008 9.426** 0.002 0.039 Males < females
RighteyeClosed_std 0.006 5.312* 0.022 0.022 Males < females
LowerlipDepressorLeft_std 0.022 9.956** 0.002 0.041 Males < females
LowerlipDepressorRight_std 0.018 8.668** 0.004 0.036 Males < females
SES level LipCornerDepressorLeft_mean 0.030 4.386* 0.037 0.018 LSE > HSE
LeftcheekPuff_mean 0.199 6.080* 0.014 0.025 LSE < HSE
RightcheekPuff_mean 0.212 4.945* 0.027 0.021 LSE < HSE
LipStretcherRight_std 0.005 3.865 0.050 0.016 LSE < HSE
LeftcheekPuff_std 0.014 6.801* 0.010 0.028 LSE < HSE
Sex × SES level LipCornerPullerLeft_std 0.016 3.961* 0.048 0.017 LSE:Male < HSE:Male LSE:Female > HSE:Female
LipCornerPullerRight_std 0.017 4.975* 0.027 0.021 LSE:Male < HSE:Male LSE:Female > HSE:Female
LowerlipDepressorLeft_std 0.013 5.870* 0.016 0.024 LSE:Male < HSE:Male LSE:Female > HSE:Female

SS, sum of squares; LSE, low trait self-esteem group; HSE, high trait self-esteem group. *P < 0.05; **P < 0.01; ***P < 0.001.

TABLE 4. The relationship between AU movements and self-esteem.

Gender Variables β 95% CI P value
Male LipPucker_mean 0.022*
LipPucker_mean1 0.347 0.093–1.293 0.115
LipPucker_mean2 3.079 0.695–13.645 0.139
LipPucker_mean3 0.765 0.210–2.793 0.686
LeftcheekPuff_mean 0.003**
LeftcheekPuff_mean1 0.075 0.018–0.310 0.000***
LeftcheekPuff_mean2 0.483 0.124–1.871 0.292
LeftcheekPuff_mean3 0.248 0.070–0.877 0.031*
LipCornerDepressorLeft_std 0.008**
LipCornerDepressorLeft_std1 2.482 0.565–10.915 0.229
LipCornerDepressorLeft_std2 0.249 0.060–1.031 0.055
LipCornerDepressorLeft_std3 0.403 0.101–1.612 0.199
Female LefteyebrowLower_mean 0.016**
LefteyebrowLower_mean1 5.553 0.823–5.866 0.116
LefteyebrowLower_mean2 2.389 1.849–15.123 0.002**
LefteyebrowLower_mean3 1.383 0.549–3.939 0.443
LipCornerDepressorLeft_std 0.013*
LipCornerDepressorLeft_std1 3.478 0.559–9.882 0.037*
LipCornerDepressorLeft_std2 5.507 0.116–2.096 0.003**
LipCornerDepressorLeft_std3 1.261 0.112–1.902 0.663

AU_mean(std)1, Q0–Q25 score section; AU_mean(std)2, Q25–Q50 score section; AU_mean(std)3, Q50–Q75 score section. *P < 0.05; **P < 0.01; ***P < 0.001.

There were significant main effects of SE level on LipCornerDepressorLeft_mean [F (1, 234) = 4.386, P = 0.037], LeftcheekPuff_mean [F (1, 234) = 6.080, P = 0.014], RightcheekPuff_mean [F (1, 234) = 4.945, P = 0.027], LipStretcherRight_std [F (1, 234) = 3.865, P = 0.050], and LeftcheekPuff_std [F (1, 234) = 6.801, P = 0.010]. The amplitude and richness of facial movements increased with self-esteem, with the exception of the amplitude of the LipCornerDepressorLeft AU.

There were also significant interaction effects between sex and self-esteem level for LipCornerPullerLeft_std [F (1, 234) = 3.961, P = 0.048], LipCornerPullerRight_std [F (1, 234) = 4.975, P = 0.027], and LowerlipDepressorLeft_std [F (1, 234) = 5.870, P = 0.016]. In men, these three AUs moved less variably in the low self-esteem group than in the high self-esteem group, whereas in women the opposite was true.

Forward stepwise logistic regression models were built to check whether combinations of key AU movements could predict self-esteem level in each gender. Table 4 presents the multivariable associations between AU movements in 3D space and self-esteem. Three independent variables entered the men's model: LipPucker_mean (P = 0.022), LeftcheekPuff_mean (P = 0.003), and LipCornerDepressorLeft_std (P = 0.008). Two independent variables entered the women's model: LefteyebrowLower_mean (P = 0.016) and LipCornerDepressorLeft_std (P = 0.013). These AU indicators may therefore be powerful predictors of self-esteem when combined. Both logistic regression models were significant (P < 0.001), with classification accuracies of 75.5% for men and 68.0% for women. We further confirmed that building separate models for men and women performed better than building a combined model (Table 5).

TABLE 5. The results of the logistic regression models.

Accuracy (%)a Precision (%)b Recall (%)c
Model for all 67.2 65.0 68.4
Model for female 68.0 67.1 70.3
Model for male 75.5 76.7 66.0

aThe number of correctly classified/the total number of samples. bThe number of LSE samples correctly classified/the total number of samples classified as LSE. cThe number of LSE samples correctly classified as positive/the total number of LSE samples.
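For completeness, the three metrics defined in the footnote can be computed directly from predicted and true group labels; the short sketch below treats LSE as the positive class, following the footnote (function and variable names are ours):

```python
def lse_metrics(y_true, y_pred):
    """Accuracy, precision, and recall with LSE as the positive class,
    per the definitions in the footnote of Table 5."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(t == "LSE" and p == "LSE" for t, p in pairs)  # LSE correctly classified
    fp = sum(t == "HSE" and p == "LSE" for t, p in pairs)  # HSE classified as LSE
    fn = sum(t == "LSE" and p == "HSE" for t, p in pairs)  # LSE classified as HSE
    accuracy = sum(t == p for t, p in pairs) / len(pairs)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return accuracy, precision, recall
```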

Discussion

The purpose of the study was to explore the relationship between 3D facial movements and trait self-esteem. Kinect was used to quantify facial movements during self-introductions, and statistical analyses examined the relationship between facial movements and self-esteem level. The results were consistent with our hypotheses, showing that some 3D facial movements are related to trait self-esteem and that sex affects this relationship.

Previous studies found that self-esteem may not be perceived well through explicit behavioral cues (Savin-Williams and Jaquish, 1981; Kilianski, 2008). In contrast, the present findings indicate that, during self-introductions, trait self-esteem has a substantial effect on the movements of some facial AUs, and a specific combination of AU movements can effectively predict trait self-esteem level. This extends recent work showing that strangers' self-esteem judgments based on self-introduction videos were fairly accurate (r = 0.31, P = 0.002) (Hirschmueller et al., 2018). Specifically, compared with the LSE group, individuals in the HSE group puffed both cheeks more and showed greater variability in stretching the right corner of the mouth and in puffing the left cheek. In contrast, people with high self-esteem pulled the left lip corner down less, a movement that is highly characteristic of sadness (coded as AU15 in FACS) (Ekman and Friesen, 1978). One potential explanation is that trait self-esteem is expressed through different emotions during self-presentation. While individuals with high self-esteem tend to express more positive emotions such as pride (Tracy and Robins, 2003; Agroskin et al., 2014) and happiness (Kim et al., 2019) when introducing themselves, those with low self-esteem are more likely to exhibit negative emotions such as sadness and disgust (Satoh, 2001). It is also noteworthy that the left cheek was the area most affected by trait self-esteem (amplitude: η2 = 0.025; variability: η2 = 0.028), which suggests that trait self-esteem may be a contributing factor to the left cheek bias (Nicholls et al., 2002; Lindell and Savill, 2010; Lindell, 2013). Future research on the left cheek bias could take trait self-esteem into consideration.

Moreover, there was an interaction between trait self-esteem and gender in their effect on facial movements. In terms of the variability of lip movements, high self-esteem males moved more variably than low self-esteem males, whereas low self-esteem females moved more variably than high self-esteem females. One possible explanation for why men and women express their trait self-esteem differently is that self-esteem is largely derived from interpersonal experience (Leary et al., 1995) and gender identity (Cvencek et al., 2016). Men who appear more dominant may enjoy higher social status and thus higher self-esteem, whereas women who show more affinity may have better interpersonal relationships and thus higher self-esteem (Baldwin and Keelan, 1999). An alternative explanation is that men and women with high self-esteem have different personality traits: agentic traits such as extraversion are more important for men's self-esteem, while communal traits (e.g., agreeableness) are more crucial for women's self-esteem (Robins et al., 2001). Although the mechanism is unclear, our study indicates that the facial expression of trait self-esteem is related to gender and that building self-esteem prediction models separately by gender outperforms a combined model. Future studies on self-esteem should therefore consider gender as an important factor.

This study has several implications. First, we collected self-introduction video data and employed 3D facial movement quantification to determine the relationship between 3D facial expressions and trait self-esteem. To the authors' knowledge, this is the first study to investigate quantified 3D facial expressions of trait self-esteem, and the results demonstrate the value of doing so. Second, the study reveals the effect of gender on the facial expression of self-esteem and provides suggestions for future nonverbal research on self-esteem. Finally, the results provide a basis for predicting self-esteem from 3D facial data. Given that self-report methods may elicit faked answers in interview and job contexts (McLarnon et al., 2019), a facial prediction method offers a supplementary strategy for measuring self-esteem.

However, the findings must be considered in light of several limitations. First, because this study was conducted in China, it is uncertain how far the findings generalize to other cultures. Previous studies indicate that Asian participants may show different facial expressions than participants from other cultural backgrounds (Matsumoto and Ekman, 1989), and cultures also differ in the magnitude of the gender effect on self-esteem. Future cross-cultural research could help clarify cultural differences in the facial expression of self-esteem. Second, according to the lens model (Watson and Brunswik, 1958), the influence of trait self-esteem on social interaction operates through both the expression of trait self-esteem and the perception of that expression. This study objectively explored the relationship between facial expressions and self-esteem but did not investigate the extent to which human perceivers are affected by these expressions. A further limitation is the possible influence of participants' knowledge background on the self-report measure.

Regarding future lines of research, stable trait self-esteem prediction models can be built on the basis of research on trait self-esteem expression. Because it is less affected by social approval and knowledge background, automatic self-esteem recognition could be used to measure self-esteem in interview settings or in special populations. Furthermore, multimodal data (such as eye movements, facial movements, and gestures) could be combined to describe the expression of self-esteem more fully and to establish more accurate prediction models.

Data Availability Statement

The datasets presented in this article are not readily available because raw data cannot be made public. If necessary, we can provide behavioral characteristic data. Requests to access the datasets should be directed to XL.

Ethics Statement

The studies involving human participants were reviewed and approved by the scientific research ethics committee of the Chinese Academy of Sciences Institute of Psychology (H15010). The patients/participants provided their written informed consent to participate in this study.

Author Contributions

TZ contributed to the conception and design of the study. XL collected the data and developed the instrument. XW performed most of the statistical analysis and wrote the manuscript with input from all authors. YW performed part of the statistical analysis and helped draft part of the text. All authors contributed to the article and approved the submitted version.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

We wish to thank all subjects for their participation in this study.

Footnotes

Funding. This research was funded by the Key Research Program of the Chinese Academy of Sciences (No. ZDRW-XH-2019-4).

References

1. Agroskin D., Klackl J., Jonas E. (2014). The self-liking brain: a VBM study on the structural substrate of self-esteem. PLoS One 9:e86430. 10.1371/journal.pone.0086430
2. Anthony D. B., Wood J. V., Holmes J. G. (2007). Testing sociometer theory: self-esteem and the importance of acceptance for social decision-making. J. Exp. Soc. Psychol. 43, 425–432. 10.1016/j.jesp.2006.03.002
3. Back M. D., Schmukle S. C., Egloff B. (2011). A closer look at first sight: social relations lens model analysis of personality and interpersonal attraction at zero acquaintance. Eur. J. Pers. 25, 225–238. 10.1002/per.790
4. Baldwin M. W., Keelan J. P. R. (1999). Interpersonal expectations as a function of self-esteem and sex. J. Soc. Pers. Relationsh. 16, 822–833. 10.1177/0265407599166008
5. Baumeister R. F., Campbell J. D., Krueger J. I., Vohs K. D. (2003). Does high self-esteem cause better performance, interpersonal success, happiness, or healthier lifestyles? Psychol. Sci. Public Interest 4, 1–44. 10.1111/1529-1006.01431
6. Berry D. S., Landry J. C. (1997). Facial maturity and daily social interaction. J. Pers. Soc. Psychol. 72, 570–580. 10.1037/0022-3514.72.3.570
7. Cvencek D., Greenwald A. G., Meltzoff A. N. (2016). Implicit measures for preschool children confirm self-esteem's role in maintaining a balanced identity. J. Exp. Soc. Psychol. 62, 50–57. 10.1016/j.jesp.2015.09.015
8. de Melo W. C., Granger E., Hadid A. (2019). "Combining global and local convolutional 3D networks for detecting depression from facial expressions," in Proceedings of the 2019 14th IEEE International Conference on Automatic Face and Gesture Recognition (Lille), 554–561.
9. Ekman P., Friesen W. V. (1978). Facial Action Coding System (FACS): a technique for the measurement of facial action. Riv. Psichiatr. 47, 126–138.
10. Faul F., Erdfelder E., Lang A.-G., Buchner A. (2007). G*Power 3: a flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behav. Res. Methods 39, 175–191. 10.3758/BF03193146
11. Feingold A. (1994). Gender differences in personality: a meta-analysis. Psychol. Bull. 116, 429–456. 10.1037/0033-2909.116.3.429
12. Greenwald A. G., Banaji M. R. (1995). Implicit social cognition: attitudes, self-esteem, and stereotypes. Psychol. Rev. 102, 4–27. 10.1037/0033-295x.102.1.4
13. Harrigan J. A., Wilson K., Rosenthal R. (2004). Detecting state and trait anxiety from auditory and visual cues: a meta-analysis. Pers. Soc. Psychol. Bull. 30, 56–66. 10.1177/0146167203258844
14. Hirschmueller S., Schmukle S. C., Krause S., Back M. D., Egloff B. (2018). Accuracy of self-esteem judgments at zero acquaintance. J. Pers. 86, 308–319. 10.1111/jopy.12316
15. Hu Y., Yan J., Shi P. (2010). "A fusion-based method for 3D facial gender classification," in Proceedings of the 2010 2nd International Conference on Computer and Automation Engineering (ICCAE) (Singapore), 369–372. 10.1109/ICCAE.2010.5451407
16. Imada A. S., Hakel M. D. (1977). Influence of nonverbal communication and rater proximity on impressions and decisions in simulated employment interviews. J. Appl. Psychol. 62, 295–300. 10.1037/0021-9010.62.3.295
17. Ishii L. E., Nellis J. C., Boahene K. D., Byrne P., Ishii M. (2018). The importance and psychology of facial expression. Otolaryngol. Clin. North Am. 51, 1011–1017. 10.1016/j.otc.2018.07.001
18. Israel A. C., Ivanova M. Y. (2002). Global and dimensional self-esteem in preadolescent and early adolescent children who are overweight: age and gender differences. Int. J. Eat. Disord. 31, 424–429. 10.1002/eat.10048
19. Ji Y., Yu X. (1993). The self-esteem scale, SES. Chin. Ment. Health J. 7, 251–252.
20. Jones A. L., Kramer R. S. S., Ward R. (2012). Signals of personality and health: the contributions of facial shape, skin texture, and viewing angle. J. Exp. Psychol. Hum. Percept. Perform. 38, 1353–1361. 10.1037/a0027078
21. Kang Y. (2019). The relationship between contingent self-esteem and trait self-esteem. Soc. Behav. Pers. Int. J. 47:e7575. 10.2224/sbp.7575
22. Kernis M. H. (1993). "The roles of stability and level of self-esteem in psychological functioning," in Self-Esteem, ed. Baumeister R. F. (New York, NY: Springer US), 167–182. 10.1007/978-1-4684-8956-9_9
23. Kilianski S. E. (2008). Who do you think I think I am? Accuracy in perceptions of others' self-esteem. J. Res. Pers. 42, 386–398. 10.1016/j.jrp.2007.07.004
24. Kim E. S., Hong Y.-J., Kim M., Kim E. J., Kim J.-J. (2019). Relationship between self-esteem and self-consciousness in adolescents: an eye-tracking study. Psychiatry Invest. 16, 306–313. 10.30773/pi.2019.02.10.3
25. Krause S., Back M. D., Egloff B., Schmukle S. C. (2016). Predicting self-confident behaviour with implicit and explicit self-esteem measures. Eur. J. Pers. 30, 648–662. 10.1002/per.2076
26. Leary M. R., Gallagher B., Fors E., Buttermore N., Baldwin E., Kennedy K., et al. (2003). The invalidity of disclaimers about the effects of social feedback on self-esteem. Pers. Soc. Psychol. Bull. 29, 623–636. 10.1177/0146167203029005007
27. Leary M. R., Tambor E. S., Terdal S. K., Downs D. L. (1995). Self-esteem as an interpersonal monitor. J. Pers. Soc. Psychol. 68, 270–274. 10.1207/s15327965pli1403
28. Lindell A. K. (2013). The silent social/emotional signals in left and right cheek poses: a literature review. Laterality 18, 612–624. 10.1080/1357650x.2012.737330
29. Lindell A. K., Savill N. J. (2010). Time to turn the other cheek? The influence of left and right poses on perceptions of academic specialisation. Laterality 15, 639–650. 10.1080/13576500903201784
30. MacCallum R. C., Browne M. W., Sugawara H. M. (1996). Power analysis and determination of sample size for covariance structure modeling. Psychol. Methods 1, 130–149. 10.1037/1082-989x.1.2.130
31. Macgregor J. C. D., Fitzsimons G. M., Holmes J. G. (2013). Perceiving low self-esteem in close others impedes capitalization and undermines the relationship. Pers. Relatsh. 20, 690–705. 10.1111/pere.12008
32. Matsumoto D., Ekman P. (1989). American-Japanese cultural differences in intensity ratings of facial expressions of emotion. Motiv. Emot. 13, 143–157. 10.1007/BF00992959
33. McLarnon M. J. W., DeLongchamp A. C., Schneider T. J. (2019). Faking it! Individual differences in types and degrees of faking behavior. Pers. Individ. Dif. 138, 88–95. 10.1016/j.paid.2018.09.024
34. Miner J. B. (1962). Personality and ability factors in sales performance. J. Appl. Psychol. 46, 6–13. 10.1037/h0044317
35. Moksnes U. K., Espnes G. A. (2013). Self-esteem and life satisfaction in adolescents: gender and age as potential moderators. Qual. Life Res. 22, 2921–2928. 10.1007/s11136-013-0427-4
36. Neumann C. A., Leffingwell T. R., Wagner E. F., Mignogna J., Mignogna M. (2009). Self-esteem and gender influence the response to risk information among alcohol using college students. J. Subst. Use 14, 353–363. 10.3109/14659890802654540
37. Nicholls M. E. R., Clode D., Lindell A. K., Wood A. G. (2002). Which cheek to turn? The effect of gender and emotional expressivity on posing behavior. Brain Cogn. 48, 480–484. 10.1006/brcg.2001.1402
38. Nicole W., Wilhelm F. H., Birgit D., Jens B. (2015). Gender differences in experiential and facial reactivity to approval and disapproval during emotional social interactions. Front. Psychol. 6:1372. 10.3389/fpsyg.2015.01372
39. Ostrowski T. (2010). Self-esteem and social support in the occupational stress-subjective health relationship among medical professionals. Polish Psychol. Bull. 40, 13–19. 10.2478/s10059-009-0003-5
40. Park L. E., Crocker J., Kiefer A. K. (2007). Contingencies of self-worth, academic failure, and goal pursuit. Pers. Soc. Psychol. Bull. 33, 1503–1517. 10.1177/0146167207305538
41. Robins R. W., Tracy J. L., Trzesniewski K., Potter J., Gosling S. D. (2001). Personality correlates of self-esteem. J. Res. Pers. 35, 463–482. 10.1006/jrpe.2001.2324
42. Rosenberg M. (1965). Society and the Adolescent Self-Image. Princeton, NJ: Princeton University Press.
43. Satoh Y. (2001). Self-disgust and self-affirmation in university students. Jpn. J. Educ. Psychol. 49, 347–358. 10.5926/jjep1953.49.3_347
44. Savin-Williams R. C., Jaquish G. A. (1981). The assessment of adolescent self-esteem: a comparison of methods. J. Pers. 49, 324–335. 10.1111/j.1467-6494.1981.tb00940.x
45. Suen H.-Y., Hung K.-E., Lin C.-L. (2019). TensorFlow-based automatic personality recognition used in asynchronous video interviews. IEEE Access 7, 61018–61023. 10.1109/access.2019.2902863
46. Sun B., Zhang Z., Liu X., Hu B., Zhu T. (2017). Self-esteem recognition based on gait pattern using Kinect. Gait Posture 58, 428–432. 10.1016/j.gaitpost.2017.09.001
47. Tracy J. L., Robins R. W. (2003). "Does pride have a recognizable expression?," in Emotions Inside Out: 130 Years after Darwin's The Expression of the Emotions in Man and Animals, Vol. 1000, eds Ekman P., Campos J. J., Davidson R. J., DeWaal F. B. M. (New York, NY: New York Academy of Sciences), 313–315. 10.1196/annals.1280.033
48. Watson A. J., Brunswik E. (1958). Perception and the representative design of psychological experiments. Philos. Q. 8:382. 10.2307/2216617
49. Yeagley E., Morling B., Nelson M. (2007). Nonverbal zero-acquaintance accuracy of self-esteem, social dominance orientation, and satisfaction with life. J. Res. Pers. 41, 1099–1106. 10.1016/j.jrp.2006.12.002
50. Zaletelj J., Košir A. (2017). Predicting students' attention in the classroom from Kinect facial and body features. EURASIP J. Image Video Process. 2017:80. 10.1186/s13640-017-0228-8
