Proceedings of the National Academy of Sciences of the United States of America. 2013 Jan 8;110(3):E181–E182. doi: 10.1073/pnas.1211865110

Reply to Sauter and Eisner: Differences outweigh commonalities in the communication of emotions across human cultures

Rachael E Jack a,b,1, Oliver G B Garrod b, Hui Yu b, Roberto Caldara c, Philippe G Schyns b
PMCID: PMC3549138; PMID: 23437444

In response to our work (1), Sauter and Eisner (2) argue, “… [our] strong claims … are not supported by [our] data.” Herein, we show that their arguments are unsubstantiated, primarily reflecting misunderstandings.

Sauter and Eisner argue “…[EA] understanding of English emotion terms ... may have been more variable … because of varying lexical correspondence … as well as … English proficiency.”

East Asian (EA) observers had a minimum International English Language Testing System score of 6.0 (1, p. 2), which indicates the ability to use fairly complex language. Because all emotion labels are high-frequency words, they were well within the observers' capability. Furthermore, before testing, we established comprehension by obtaining descriptions and synonyms of each emotion label.

Sauter and Eisner argue “Because the Chinese ‘cultural group’ includes many cultural and linguistic subgroups, variability even for in-group signals should be expected.”

We show that variance arises primarily between emotion categories, refuting the notion that both cultures represent each basic emotion with the same set of facial movements. If variability arose from linguistic differences among subcultures (and if universality were supported), one would expect variance only within emotion categories, as illustrated in the sketch below.
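To make this distinction concrete, the following minimal sketch contrasts within-category and between-category variance for facial-movement models coded as Action Unit (AU) activation vectors. The data, dimensions, and separability measure are illustrative assumptions only; they are not the analysis or values of ref. 1.

```python
import numpy as np

# Illustrative stand-in data: each reconstructed model is a vector of
# Action Unit (AU) activations; values are random placeholders, not data from ref. 1.
rng = np.random.default_rng(0)
emotions = ["happy", "surprise", "fear", "disgust", "anger", "sad"]
n_models_per_emotion, n_aus = 15, 41
models = {e: rng.random((n_models_per_emotion, n_aus)) for e in emotions}

# Within-category variance: spread of models around their own category mean
# (the signature one would expect from linguistic heterogeneity among subgroups).
within = np.mean([np.var(models[e], axis=0).mean() for e in emotions])

# Between-category variance: spread of category means around the grand mean
# (low values relative to 'within' indicate overlapping AU patterns across emotions).
category_means = np.stack([models[e].mean(axis=0) for e in emotions])
between = np.var(category_means, axis=0).mean()

print(f"within-category variance:  {within:.3f}")
print(f"between-category variance: {between:.3f}")
print(f"separability (between/within): {between / within:.2f}")
```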

Sauter and Eisner argue “… [Jack et al.,] found an optimal solution with fewer than six clusters in the East Asian (EA) group…”

Our analysis demonstrates an optimal fit of six clusters for the Western Caucasian (WC) group only. No optimal fit exists for the EA group, regardless of the number of clusters. In fact, Fig. S1 in ref. 1 explicitly shows that EA Mutual Information never reaches a level comparable to that of the WC group, because of the overlap in facial movements between basic emotion categories.
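For readers unfamiliar with this type of analysis, the sketch below illustrates the general logic of scoring a k-cluster partition of facial-movement models by the Mutual Information between cluster membership and emotion labels. The use of k-means and the synthetic data are assumptions for illustration only, not the pipeline or data of ref. 1.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import mutual_info_score

# Synthetic stand-in data: rows are facial-movement models (AU activation
# vectors); labels are the emotion categories observers assigned to them.
rng = np.random.default_rng(1)
emotions = ["happy", "surprise", "fear", "disgust", "anger", "sad"]
n_per_emotion, n_aus = 30, 41
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(n_per_emotion, n_aus))
               for i in range(len(emotions))])
labels = np.repeat(np.arange(len(emotions)), n_per_emotion)

# Mutual Information between cluster assignment and emotion label as a
# function of the number of clusters k; a clear peak at k = 6 indicates six
# well-separated categories, whereas overlapping categories keep MI low for all k.
for k in range(2, 9):
    clusters = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    mi = mutual_info_score(labels, clusters)
    print(f"k = {k}: MI = {mi:.3f}")  # natural-log units (nats)
```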

Sauter and Eisner argue “…[our methods] fail to account for [configural facial expression processing]… and …dynamic cues.”

Our methods captured the dynamic face information significantly associated with the cultural perception of each emotion (figure 1 in ref. 1). In fact, in Movie S2 in ref. 1, the EA “happy” model displays synchronous upward eyebrow motion that precedes the mouth movements (Fig. 1 here shows the corresponding temporal curves). Our methods therefore do capture relevant dynamic information, including the order, speed, and synchrony of facial configurations.

Fig. 1.

Dynamics of an East Asian “happy” model (see Movie S2 in ref. 1 for optimal understanding). Color-coded curves represent the temporal dynamics of the Action Units (AUs) significantly correlated with the perception of “happy” in an East Asian observer. High Intensity (Movie S2, right column): AU2 (outer brow raiser, dark blue curve) has an earlier onset relative to the other AUs (Movie S2 shows early eyebrow flick), demonstrating specific ordering of AUs. In contrast, the other AUs are more synchronous, as illustrated by similar temporal curves. Low Intensity (Movie S2, left column): AU synchrony is further demonstrated. Thus, our methods do capture a variety of temporal differences in the mental representations of facial expressions of emotion [see also Fig. 3 in our work (1) for culture-specific spatiotemporal locations of emotional intensity representation in different facial expressions].
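As an illustration of how AU ordering and synchrony can be quantified from such temporal curves, the sketch below derives onset times and pairwise curve correlations from synthetic activation curves. The curves, onset threshold, and AU set are hypothetical, chosen only to mirror the pattern described in the caption, not the estimates behind Fig. 1.

```python
import numpy as np

# Synthetic AU activation curves over normalized animation time
# (placeholders, not data from ref. 1).
t = np.linspace(0.0, 1.0, 100)

def ramp(onset, rate=12.0):
    # Smooth sigmoidal onset centered at 'onset'.
    return 1.0 / (1.0 + np.exp(-rate * (t - onset)))

curves = {
    "AU2 (outer brow raiser)":  ramp(onset=0.20),  # earlier onset
    "AU6 (cheek raiser)":       ramp(onset=0.45),
    "AU12 (lip corner puller)": ramp(onset=0.47),
    "AU25 (lips part)":         ramp(onset=0.46),
}

# Onset time: first time point at which activation exceeds a threshold.
threshold = 0.5
for name, y in curves.items():
    onset = t[np.argmax(y >= threshold)]
    print(f"{name}: onset ~ {onset:.2f}")

# Synchrony: pairwise correlation of the temporal curves (high = synchronous).
names = list(curves)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        r = np.corrcoef(curves[names[i]], curves[names[j]])[0, 1]
        print(f"corr({names[i]}, {names[j]}) = {r:.2f}")
```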

Sauter and Eisner argue “… [dynamic reverse correlation] method is imperfect even in the ‘baseline’ case of Caucasian Europeans.”

Sauter and Eisner (2) base their argument on a different study (3), which reports aggregate normalized correlations between the facial movements of our WC models and proposed prototypes and variants (4). Their statement is unwarranted for three reasons.

First, each aggregate correlation was statistically significant, demonstrating a close fit of our WC models to “baseline.” Maximum-fit analysis revealed perfect and near-perfect fits for all six expressions.
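The logic of such a maximum-fit analysis can be sketched as follows: correlate each reconstructed model's AU pattern with every proposed prototype and variant, and retain the best-fitting candidate. The AU vectors and names below are hypothetical placeholders, not the values used in refs. 3 and 4.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical AU activation vectors (placeholders, not values from refs. 3 or 4).
rng = np.random.default_rng(2)
n_aus = 41
prototypes = {                                  # proposed prototypes / variants
    "surprise (open mouth)":   rng.random(n_aus),
    "surprise (closed mouth)": rng.random(n_aus),
    "happy":                   rng.random(n_aus),
}
# One reconstructed WC model, simulated here as a noisy copy of one prototype.
model = prototypes["surprise (open mouth)"] + rng.normal(0.0, 0.1, n_aus)

# Maximum-fit analysis: correlate the model with each candidate and keep the best.
fits = {name: pearsonr(model, proto) for name, proto in prototypes.items()}
best = max(fits, key=lambda name: fits[name][0])
for name, (r, p) in fits.items():
    print(f"{name}: r = {r:.2f}, p = {p:.3g}")
print(f"best fit: {best}")
```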

Second, although proposed prototypes and variants describe different ways of expressing the same emotion (e.g., “surprise” with an open or closed mouth), their natural variance (i.e., how each is distributed within and between cultures) is unknown. Our data-driven methods, which harness subjective cultural perceptions, do capture the natural variance of facial expressions, including a broader spectrum than currently proposed (e.g., “happy” with parted lips).

Third, the models are recognized with high accuracy, comparable to that reported for standardized 2D faces (e.g., refs. 5 and 6).

Thus, existing facts demonstrate that our methods can expand current knowledge beyond basic facial expressions, revealing natural variance within a culture.

Footnotes

The authors declare no conflict of interest.

References

1. Jack RE, Garrod OGB, Yu H, Caldara R, Schyns PG. Facial expressions of emotion are not culturally universal. Proc Natl Acad Sci USA. 2012;109(19):7241–7244. doi: 10.1073/pnas.1200155109.
2. Sauter D, Eisner F. Commonalities outweigh differences in the communication of emotions across human cultures. Proc Natl Acad Sci USA. 2013;110(3):E180. doi: 10.1073/pnas.1209522110.
3. Yu H, Garrod OGB, Schyns PG. Perception-driven facial expression synthesis. Comput Graph. 2012;36:152–162.
4. Ekman P, Friesen WV. Facial Action Coding System: A Technique for the Measurement of Facial Movement. Palo Alto, CA: Consulting Psychologists Press; 1978.
5. Jack RE, Blais C, Scheepers C, Schyns PG, Caldara R. Cultural confusions show that facial expressions are not universal. Curr Biol. 2009;19(18):1543–1548. doi: 10.1016/j.cub.2009.07.051.
6. Nelson N, Russell JA. Universality revisited. Emotion Review. doi: 10.1177/1754073912457227.

