Abstract
Over the past few decades, little research has focused on similarities and differences in the form and function of emotional signals in nonhuman primates, or on whether these communication systems are homologous with those of humans. This is partly because detailed, objective measurement tools for answering such questions have not been systematically developed for nonhuman primate research. In contrast, emotion research in humans has benefited for over 30 years from an objective, anatomically based facial-measurement tool: the Facial Action Coding System. In collaboration with other researchers, we have now developed a similar system for chimpanzees (ChimpFACS) and, in the process, have made exciting new discoveries regarding chimpanzees’ perception and categorization of emotional facial expressions, similarities in the facial anatomy of chimpanzees and humans, and homologous facial movements in the two species. Investigating similarities and differences in primate emotional communication systems is essential if we are to understand unique evolutionary specializations among different species.
Keywords: emotion, facial expression, chimpanzee Facial Action Coding System, evolution, communication
Facial expression is a necessary component of social communication in primates. Darwin (1872) initiated the study of comparative facial expression by speculating that the expressive signals produced by animals might have causes and consequences similar to those shown by people. Although many facial expressions appear to be highly conserved across primate species, there are also numerous species-specific expressions, suggesting that some species have undergone evolutionary adaptations dependent on their specific social and ecological needs. Specific functional differences will only become clear when researchers make detailed comparisons between humans and related species such as other extant primates and, in particular, the chimpanzee, our closest living relative.
Recent studies have demonstrated striking similarities in the facial expression repertoires of humans and chimpanzees (Parr, Waller, Vick, & Bard, in press), in the perceptual cues used by both species to discriminate among facial expression categories (Parr, Hopkins, & de Waal, 1998), and in the organization of their underlying facial musculature (Burrows, Waller, Parr, & Bonar, 2006; Waller et al., 2006). These studies have considerably advanced our understanding of the evolution of facial expressions, and they pave the way for analyses of the social function of facial expressions in ongoing social interactions (Waller & Dunbar, 2005). It is only by comparing facial expressions across primate species that we can begin to understand how perceptual systems have evolved to cope with such stimuli, and how they mediate social interactions.
PERCEPTION OF FACIAL EXPRESSIONS IN CHIMPANZEES
Human studies have shown that faces and facial expressions are recognized using configural cues, or the spatial relationship among the features (Calder, Young, Keane, & Dean, 2000). Over a decade of research at the Yerkes Primate Center has confirmed a configural bias for unfamiliar face discrimination in chimpanzees that is remarkably similar to human face processing (Parr, Dove, & Hopkins, 1998; Parr, Heintz, & Akamagwuna, 2006; Parr, Winslow, Hopkins, & de Waal, 2000). However, only a handful of studies have compared facial expression categorization between chimpanzees and humans, despite the existence of a broad repertoire of distinct facial expressions in the chimpanzee, including the bared-teeth display, pant-hoot, relaxed open mouth expression (or play face), scream, and relaxed-lip face (Parr, Cohen, & de Waal, 2005; see Fig. 1).
Fig. 1.
Prototypical chimpanzee facial expressions and their probability of correct category assignment, as identified through discriminant function analyses of ChimpFACS codes. (Photographs courtesy L.A. Parr and the Living Links Center, Emory University.)
One of the first attempts to study expression categorization in chimpanzees presented five adult subjects with a computerized task and required them to discriminate among five basic categories of facial expressions (see Fig. 2; Parr, Hopkins, & de Waal, 1998). The goals of this study were (a) to determine whether chimpanzees could visually distinguish among different examples of expressions and (b) to understand the role of distinctive features versus overall configuration in achieving these categorizations. In this initial experiment, each expression was paired with a neutral face as the nonmatching example. The results showed that chimpanzees were able to categorize most facial expressions, including screams, play faces, and bared-teeth displays, on day one of testing but required more presentations in order to discriminate the pant-hoot (Parr, Hopkins, & de Waal, 1998). Moreover, they never learned to discriminate the relaxed-lip face—an emotionally neutral expression—from the neutral portrait, suggesting that perhaps emotional content plays a role in these expression categorizations.
Fig. 2.
An illustration of the matching-to-sample (MTS) facial expression discrimination task. The subject is first presented with a sample stimulus, a facial expression (in this case, a bared-teeth display), together with a cross-shaped cursor on the computer screen over a black background (Fig. 2a). The sample stimulus is the image to match, and the subject must first orient toward it by touching it with the joystick-controlled cursor. After this, the sample clears the screen and the subject is presented with two alternative stimuli (Fig. 2b): one matches the sample by showing the same category of expression made by a different individual (left), while the other shows a different expression (a relaxed-lip face, right). The correct choice is to select the stimulus that looks most like the sample.
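Although the testing software itself is not described here, the trial logic is straightforward. The following is a minimal sketch in Python; every name in it (run_trial, stimuli_by_category, the file names) is a hypothetical illustration of the MTS structure, not the system actually used at Yerkes.

```python
# Minimal sketch of one matching-to-sample (MTS) trial. All names are
# hypothetical illustrations, not the actual Yerkes testing software.
import random

def run_trial(stimuli_by_category, sample_category, foil_category):
    """Present a sample expression, then two alternatives; return the trial."""
    # Sample: one photograph from the target expression category.
    sample = random.choice(stimuli_by_category[sample_category])
    # Match: the same expression category, but a different individual's photo.
    match = random.choice([s for s in stimuli_by_category[sample_category]
                           if s != sample])
    # Foil: a photograph from a different expression category.
    foil = random.choice(stimuli_by_category[foil_category])
    # Randomize the left/right placement of the two comparison stimuli.
    alternatives = [match, foil]
    random.shuffle(alternatives)
    return sample, alternatives, alternatives.index(match)

# Example: a bared-teeth sample with a relaxed-lip foil, as in Figure 2.
stimuli = {"bared-teeth": ["bt_01.jpg", "bt_02.jpg"],
           "relaxed-lip": ["rl_01.jpg", "rl_02.jpg"]}
sample, alternatives, correct = run_trial(stimuli, "bared-teeth", "relaxed-lip")
```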
These results are interesting with regard to a more recent experiment that examined the role of multimodal cues (audio and visual) in expression categorization (Parr, 2004). Researchers have long been interested in whether combining signals from different modalities can alter the meaning of a message. In this experiment, videos of facial expressions were paired either with their appropriate vocalization (e.g., a scream face with a scream) or with an incongruent vocalization (e.g., a scream face with hooting). The two comparison photographs showed expressions that matched either the visual or the audio component of the sample (i.e., a scream face or a pant-hoot). Chimpanzees spontaneously matched some expressions according to their auditory salience and others according to their visual salience. Screams, for example, were identified most accurately when their visual component was present in the sample, regardless of the auditory feature, whereas the pant-hoot was matched most accurately when hooting was the audio component, regardless of the visual feature in the sample. This may explain why, in the initial experiment (Parr, Hopkins, & de Waal, 1998), the pant-hoot was the last expression type to be visually categorized: It appears to be more salient as an auditory stimulus.
Parr and colleagues went on to investigate whether expressions were being categorized using their overall configuration or through the extraction of specific facial features (Parr, Hopkins, & de Waal, 1998). To do this, the five main expression types (bared-teeth, play face, pant-hoot, relaxed-lip, and scream) were characterized according to specific features, such as mouth open, teeth visible, and so on. Every possible combination of expression pairs was then presented, totalling 20 different dyads. In 10 of the dyads, the target and nonmatching expressions shared three or more features, whereas the other 10 dyads shared fewer than two features. The hypothesis was that if the chimpanzees were categorizing expressions using distinctive features, such as teeth visible, their performance would be better on distinct dyads than on similar dyads. This prediction was supported: Overall, subjects’ performance was significantly better when discriminating expression dyads that had little feature overlap than when discriminating those that looked similar. However, this turned out to be true only for some expression types and not others. This suggests an interaction between expression category and mode of processing but does not support an overall configural bias for expression categorization in this species, as has been shown in humans and is clearly important in basic face perception (Calder et al., 2000).
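To make the dyad construction concrete, here is a minimal sketch in Python. The feature assignments below are illustrative guesses, not the actual feature coding used by Parr, Hopkins, and de Waal (1998); the point is the pairwise-overlap logic that divides the 20 ordered dyads into similar and distinct sets.

```python
# Hedged reconstruction of the feature-overlap logic. The feature sets are
# illustrative guesses, not the published coding.
from itertools import permutations

FEATURES = {
    "bared-teeth": {"teeth visible", "mouth open", "lips retracted"},
    "play face":   {"mouth open", "jaw dropped", "lower teeth visible"},
    "pant-hoot":   {"mouth open", "lips funneled"},
    "relaxed-lip": {"lips parted", "lower lip drooping"},
    "scream":      {"teeth visible", "mouth open", "lips retracted", "jaw dropped"},
}

# Every ordered (target, nonmatch) pair of the five expressions: 5 * 4 = 20 dyads.
for target, foil in permutations(FEATURES, 2):
    shared = FEATURES[target] & FEATURES[foil]
    kind = "similar" if len(shared) >= 3 else "distinct"
    print(f"{target} vs. {foil}: {len(shared)} shared feature(s) -> {kind}")
```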
One limitation of these studies was that chimpanzee expressions were broken down into categories by human experimenters who, although experts in chimpanzee communication, could rely only on their own subjective impressions. Expressive communication is often subtle, involving blended signals that are not always prototypically displayed or flashed on and off at peak intensities. Moreover, human studies have demonstrated that even subtle, individual facial movements can bias subjective impressions of the overall facial configuration. Therefore, a more standardized and objective measurement tool is needed, both for complete analyses of the role of specific facial features and to advance our understanding of emotional communication in chimpanzees.
CHIMPFACS: A NEW TOOL FOR DESCRIBING CHIMPANZEE FACIAL EXPRESSIONS
The Facial Action Coding System (FACS) is an anatomically based coding system that describes human facial appearance changes based on underlying muscle action (Ekman & Friesen, 1978; Ekman, Friesen, & Hager, 2002). Each movement is denoted by a standardized numeric code, called an action unit (AU), corresponding to a minimal unit of facial movement. This system eliminates any need to infer emotion when labelling facial expressions and thus provides an objective method for comparing them across different populations. Consequently, FACS has become the gold standard for analyzing human facial movement. In order to describe more accurately the complex communicative facial repertoire of our closest living relative, and to assess facial movements that may be homologous with those of humans, we have recently developed a chimpanzee facial action coding system, ChimpFACS (Vick, Waller, Parr, Pasqualini-Smith, & Bard, 2007). In developing this system, we first needed to fully understand the facial musculature of chimpanzees in relation to humans. To this end, Burrows and colleagues (Burrows et al., 2006) conducted the first modern dissection of chimpanzee facial muscles and confirmed that all 23 facial (mimetic) muscles present in humans are present in chimpanzees and share roughly the same anatomical organization. The only differences were subtle, involving the size and connectivity of some muscles, providing important anatomical clarification for developing ChimpFACS (Burrows et al., 2006).
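As a concrete illustration of what an AU-based record looks like, the sketch below represents a coded expression simply as the set of AUs observed, with no emotion label attached. The AU names follow the human FACS manual (Ekman, Friesen, & Hager, 2002); the coded photograph itself is hypothetical.

```python
# Minimal sketch of FACS-style coding: an expression is recorded as the set
# of action units (AUs) observed, with no emotion label attached. AU names
# follow the human FACS manual; the coded photograph below is hypothetical.
AU_NAMES = {
    1: "inner brow raiser",
    4: "brow lowerer",
    10: "upper lip raiser",
    12: "lip corner puller",
    25: "lips part",
    26: "jaw drop",
}

coded_expression = {12, 25, 26}  # AUs scored for one hypothetical photograph

for au in sorted(coded_expression):
    print(f"AU{au}: {AU_NAMES[au]}")
```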
Despite this shared musculature, the chimpanzee face itself differs dramatically from the human face: Chimpanzees, for example, have a heavy brow ridge, lack fatty cheeks, and do not have a protruding nose. Therefore, we undertook a facial-muscle stimulation study in both chimpanzees and humans to document how muscle action changes the appearance of the face (Waller et al., 2006). Thin microelectrodes were inserted directly into the main body of facial muscles in awake humans and anesthetized chimpanzees and then stimulated to achieve contraction. The results confirmed that (a) the movements of the human face were equivalent to human FACS action units, validating the anatomical basis of FACS, and (b) the stimulation of equivalent muscles in the chimpanzees produced appearance changes very similar to those in humans. Thus, regardless of the differences in facial morphology, similar muscular action produced similar appearance changes in both species (Waller et al., 2006). With the anatomical and functional bases for comparative facial movement validated, ChimpFACS was created by identifying the spontaneous occurrence of each specific movement in videos and photographs (Vick et al., 2007). In total, 43 AUs were described for the chimpanzee, 17 of which related to specific facial muscles; the remaining 26 were miscellaneous action descriptors (ADs), such as head and eye movements, similar to those described by FACS. Interestingly, some movements common in humans, such as brow knitting caused by the contraction of the corrugator and associated muscles (AU4), were never observed in the chimpanzee, despite the presence of the corresponding muscles (Burrows et al., 2006).
ADVANCING THE CATEGORIZATION OF CHIMPANZEE FACIAL EXPRESSIONS USING ChimpFACS
Unlike traditional ethological approaches, which work from the top down, examining whole expression configurations before inquiring about specific component movements, ChimpFACS is a bottom-up technique, building categories of facial expressions from their component movements. We were curious whether this bottom-up approach could be used to validate, and perhaps even advance, knowledge of the existing chimpanzee facial expression categories. Over 250 facial expression examples were categorized according to overall expression configuration using published guidelines (Parr et al., 2005), and their AUs were also coded using ChimpFACS (Parr et al., in press). The resulting codes and categories were then subjected to discriminant function analyses, a statistical method that predicts category membership from regularities in sets of independent variables. In this case, the method was used to predict the correct classification of facial expressions based on action units (the probability of correct category assignment for each expression category, based on its action-unit features, can be seen in Fig. 1). Remarkably, the bottom-up technique of ChimpFACS reliably predicted expression categories; moreover, for each of these expressions, a unique combination of muscle movements was identified (Parr et al., in press). These prototypical configurations will be invaluable for future studies of expression categorization, as they represent expressions at their peak intensity, similar to the posed stimuli used most often in human studies.
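This analytic step can be illustrated with a minimal sketch in Python, using scikit-learn's linear discriminant analysis as a stand-in for the discriminant function analyses reported here. The AU columns and coded examples are invented placeholders, not the actual data set of 250-plus expressions.

```python
# Sketch of predicting expression categories from binary AU-presence vectors.
# Data are invented placeholders; scikit-learn's LDA stands in for the
# discriminant function analyses reported in the text.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Columns: presence (1) / absence (0) of an illustrative subset of AUs,
# e.g., [AU10, AU12, AU16, AU22, AU25, AU26].
X = np.array([
    [1, 1, 1, 0, 1, 0],   # coded bared-teeth examples
    [1, 1, 0, 0, 1, 0],
    [0, 1, 0, 0, 1, 1],   # coded play-face examples
    [1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1],   # coded pant-hoot examples
    [0, 0, 0, 1, 1, 0],
])
y = ["bared-teeth", "bared-teeth", "play face", "play face",
     "pant-hoot", "pant-hoot"]

lda = LinearDiscriminantAnalysis().fit(X, y)

# Classify a newly coded expression and inspect the per-category
# probabilities (compare the assignment probabilities in Fig. 1).
new_expression = np.array([[0, 1, 0, 0, 1, 1]])
print(lda.predict(new_expression))        # e.g., ['play face']
print(lda.predict_proba(new_expression))  # probability of each category
```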
To illustrate the potential homology between human and chimpanzee facial expressions, Figure 3 compares prototypical chimpanzee facial configurations with the homologous facial movements in humans. There are many apparent similarities in the emotional countenance of the human and the chimpanzee, again suggesting a strong homologous basis for facial expressions in these species. Unfortunately, there are few data on the emotional meaning of chimpanzee facial expressions, so comparisons with humans are limited to similarities in the physical appearance of the face. For example, the configuration AU12 (lip corner puller), AU25 (lips part), and AU26 (jaw drop) is common to both the chimpanzee play face and the human laugh. Although other researchers have suggested that these expressions are homologous (e.g., van Hooff, 1972), this is the first time these similarities have been confirmed using an anatomically based reference system. Moreover, comparisons of the physical similarities in facial appearance between species reveal some expressions that might be unique to chimpanzees, such as the pant-hoot (a long-distance call), which does not map onto a meaningful expression in humans.
Fig. 3.
Proposed facial expression homologues in chimpanzees and humans. From left to right, the chimpanzee expressions show the bared-teeth display, pant-hoot, play face, scream face, and bulging-lip face. Corresponding human expressions are shown in the top row, along with the action units (AUs) shared by the expressions in both species. (Human photos from Ekman, Friesen, & Hager, 2002; chimpanzee photos courtesy L.A. Parr and the Living Links Center, Emory University.)
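As a toy illustration of the configuration matching summarized in Figure 3, the sketch below compares set-valued AU prototypes across species. Only the play-face/laugh configuration (AUs 12, 25, and 26) comes from the text; the pant-hoot AUs are hypothetical placeholders included to show how a species-specific expression fails to find a human counterpart.

```python
# Toy sketch of cross-species configuration matching using set-valued AU
# prototypes. Only the play-face/laugh configuration (AUs 12, 25, 26) is
# taken from the text; the pant-hoot AUs here are hypothetical placeholders.
CHIMP_PROTOTYPES = {
    "play face": frozenset({12, 25, 26}),  # lip corner puller, lips part, jaw drop
    "pant-hoot": frozenset({22, 25, 26}),  # hypothetical: lip funneler, open mouth
}
HUMAN_PROTOTYPES = {
    "laugh": frozenset({12, 25, 26}),
}

for chimp_expr, aus in CHIMP_PROTOTYPES.items():
    matches = [h for h, h_aus in HUMAN_PROTOTYPES.items() if h_aus == aus]
    print(chimp_expr, "->", matches if matches else "no human counterpart")
```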
Given that a subtle facial movement—a raised brow, a snarled lip—has the power to change human social dynamics, it seems crucial that researchers use more rigorous analysis systems, such as ChimpFACS, to examine how such minute changes can influence spontaneous social interactions. The function of emotional signals in primates is typically assessed by quantifying their social consequences through observation (van Hooff, 1972; Waller & Dunbar, 2005), but researchers undertaking contextual analyses of social interactions rarely conduct accompanying micro-analyses of facial behavior. In part, this has been due to the lack of a rigorous system for measuring facial behavior in the necessary detail. With the development of ChimpFACS, however, we are now in a position to objectively and precisely measure facial behavior during chimpanzee social interactions and gain a better understanding of how emotional signals are used, what they mean, and how they can effectively mediate social exchanges in primate societies.
CONCLUSION
Numerous advances have been made over the last decade in our understanding of the evolution of communication. In computerized tasks, chimpanzees discriminate facial expressions, and this appears to involve a combination of configural and feature-based cues in addition to specific multimodal features. ChimpFACS is a new tool that will advance the study of facial expressions and their evolutionary interpretation. First, it provides a common language for referencing facial behavior across studies and species. Second, facial expressions can be recorded in terms of their component movements with no a priori assumptions about specific expression categories or emotional meaning. Future studies will examine how chimpanzees perceive the component movements of facial expressions and how different movements contribute to the overall configural interpretation. This will be particularly useful, as chimpanzee facial expressions, like those of humans, are not always used as peak-intensity signals, and thus the salience of each component movement may contribute differently to the interpretation of the signal (Parr et al., 2005). Most importantly for an evolutionary perspective, ChimpFACS enables facial expressions to be compared with human expressions at the level of both basic anatomical organization and outward appearance.
Emotional signals are undoubtedly crucial to human social interactions and group processes, and by comparing how these systems function in related species we can begin to address why and how emotional processes evolved. We have suggested that, in much the same way as language has been proposed to bond social groups, emotional communication functions to maintain social relationships by reducing uncertainty and facilitating social cohesion (Waller & Dunbar, 2005). Having a truly comparative tool to study facial expression in other primate species enables a broader investigation of emotion and begins to build a long-awaited evolutionary psychology of emotional communication. Future studies will investigate the emotional salience of these signals and how they function in ongoing social interactions, adding functional data to the morphological comparisons described here. Such data will help researchers conduct rigorous comparative, evolutionary analyses at a new level of detail, which is essential if we are to understand the relationship between facial expressions and emotional communication and the impact of the latter on social interactions.
Recommended Reading
Parr, L.A., Waller, B.M., Vick, S.J., & Bard, K.A. (in press). (See References)
van Hooff, J.A.R.A.M. (1972). (See References)
Vick, S.J., Waller, B.M., Parr, L.A., Pasqualini-Smith, M.C., & Bard, K.A. (2007). (See References)
Waller, B.M., Vick, S.J., Parr, L.A., Bard, K.A., Smith Pasqualini, M.C., Gothard, K.M., & Fuglevand, A.J. (2006). (See References)
Acknowledgments
Support for this paper was provided by grant RR-00165 from the NIH/NCRR to the Yerkes National Primate Research Center, grant R01-MH068791 to Lisa Parr, and a Leverhulme Trust Research Interchange Grant (F/00678/E, PI: Kim Bard) to the University of Portsmouth. Thanks to the Living Links Center, Emory University, and Paul Ekman for the use of photographs. Special thanks to Kim Bard and Marcia Pasqualini-Smith for initial collaborations on the development of ChimpFACS.
REFERENCES
- Burrows AM, Waller BM, Parr LA, Bonar CJ. Muscles of facial expression in the chimpanzee (Pan troglodytes): Descriptive, comparative and phylogenetic contexts. Journal of Anatomy. 2006;208:153–168. doi: 10.1111/j.1469-7580.2006.00523.x.
- Calder AJ, Young AW, Keane J, Dean M. Configural information in facial expression perception. Journal of Experimental Psychology: Human Perception and Performance. 2000;26:527–551. doi: 10.1037//0096-1523.26.2.527.
- Darwin C. The expression of the emotions in man and animals. 3rd ed. Oxford University Press; New York: 1998. (Original work published 1872)
- Ekman P, Friesen WV. Facial action coding system. Consulting Psychologists Press; Palo Alto, CA: 1978.
- Ekman P, Friesen WV, Hager JC. Facial action coding system. Research Nexus; Salt Lake City: 2002.
- Parr LA. Perceptual biases for multimodal cues in chimpanzee affect recognition. Animal Cognition. 2004;7:171–178. doi: 10.1007/s10071-004-0207-1.
- Parr LA, Cohen M, de Waal FBM. The influence of social context on the use of blended and graded facial displays in chimpanzees (Pan troglodytes). International Journal of Primatology. 2005;26:73–103.
- Parr LA, Dove T, Hopkins WD. Why faces may be special: Evidence for the inversion effect in chimpanzees (Pan troglodytes). Journal of Cognitive Neuroscience. 1998;10:615–622. doi: 10.1162/089892998563013.
- Parr LA, Heintz M, Akamagwuna U. Three studies of configural face processing by chimpanzees. Brain and Cognition. 2006;62:30–42. doi: 10.1016/j.bandc.2006.03.006.
- Parr LA, Hopkins WD, de Waal FBM. The perception of facial expressions in chimpanzees (Pan troglodytes). Evolution of Communication. 1998;2:1–23.
- Parr LA, Waller BM, Vick SJ, Bard KA. Classifying chimpanzee facial expressions by muscle action. Emotion. in press. doi: 10.1037/1528-3542.7.1.172.
- Parr LA, Winslow JT, Hopkins WD, de Waal FBM. Recognizing facial cues: Individual recognition in chimpanzees (Pan troglodytes) and rhesus monkeys (Macaca mulatta). Journal of Comparative Psychology. 2000;114:47–60. doi: 10.1037/0735-7036.114.1.47.
- van Hooff JARAM. A comparative approach to the phylogeny of laughter and smiling. In: Hinde RA, editor. Non-verbal communication. Cambridge University Press; Cambridge: 1972. pp. 209–240.
- Vick SJ, Waller BM, Parr LA, Pasqualini-Smith MC, Bard KA. A cross-species comparison of facial morphology and movement in humans and chimpanzees using FACS. Journal of Nonverbal Behavior. 2007;31:1–20. doi: 10.1007/s10919-006-0017-z.
- Waller BM, Dunbar RIM. Differential behavioural effects of silent bared teeth display and relaxed open mouth display in chimpanzees (Pan troglodytes). Ethology. 2005;111:129–142.
- Waller BM, Vick SJ, Parr LA, Bard KA, Smith Pasqualini MC, Gothard KM, Fuglevand AJ. Intramuscular electrical stimulation of facial muscles in humans and chimpanzees: Duchenne revisited and extended. Emotion. 2006;6:367–382. doi: 10.1037/1528-3542.6.3.367.