Abstract
Human facial expressions are unique in their ability to express our emotions and communicate them to others. The mimic expression of basic emotions is very similar across different cultures and also has many features in common with other mammals. This suggests a common genetic origin of the association between facial expressions and emotion. However, recent studies also show cultural influences and differences. The recognition of emotions from facial expressions, as well as the process of expressing one’s emotions facially, occurs within an extremely complex cerebral network. Due to the complexity of this cerebral processing system, a variety of neurological and psychiatric disorders can significantly disrupt the coupling of facial expressions and emotions. Wearing masks also limits our ability to convey and recognize emotions through facial expressions. Facial expressions, however, can convey not only “real” emotions but also acted ones. They thus open up the possibility of faking socially desired expressions and of consciously feigning emotions. However, these pretenses are mostly imperfect and can be accompanied by short-lived facial movements that betray the emotions actually present (microexpressions). These microexpressions are of very short duration and often barely perceptible to humans, but they are an ideal application area for computer-aided analysis. The automatic identification of microexpressions has not only received scientific attention in recent years; its use is also being tested in security-related areas. This article summarizes the current state of knowledge on facial expressions and emotions.
Key words: emotion, facial expression, microexpression, cerebral emotion processing
1. Introduction
Humans are social creatures who can hardly survive on their own. The everyday life of most people consists of a multitude of interactions with fellow human beings. These interactions convey not only factual information but also reveal information about the sender (self-revelation level), the relationship of the interactors (relationship level), and what we are trying to achieve from our communication partner (appeal level). Beyond the factual level, sending and perceiving emotions is crucial for successful communication. Emotion perception is helpful for the contextual understanding of information and the interpretation of others’ actions and intentions. Emotions can be conveyed in a variety of ways in human communication; for example, speech or gestures can be used in addition to facial expressions. However, facial expressions are by far the most commonly used channel in human communication. The correct interpretation of other people’s emotions from their facial expressions can therefore confer considerable advantages in social interaction 1 2 .
Over the past 50 years, great progress has been made in our understanding of mimic emotion transmission. In contrast, our understanding of emotions themselves lags far behind, and they remain one of the most fascinating and mysterious products of brain function. This is evident in the difficulty of defining emotions.
In this article, we will first address the definition of facial expression, communication, and emotion, and then discuss the voluntary and involuntary aspects of our facial expression of emotion. We will discuss the societal benefits, evolutionary origins, and current possibilities of automated emotion recognition.
2. Facial expression, communication, and emotions
The term facial expression describes visible movements of the facial surface produced by specific patterns of muscle activity. Facial expressions can be perceived and interpreted by others and are a central element of human expressive behavior, particularly suited to expressing emotions and communicating them to the outside world. The interpretation of emotions from facial expressions requires higher cerebral processing capacities, which, besides the evaluation of the current visual input, also strongly depend on contextual information and the availability of memory content 2 . In contrast to facial expressions and communication, the concept of emotion is not clearly defined, and there are still controversial discussions about what an emotion is in the first place and whether consciousness is a prerequisite for emotions 3 4 5 6 7 8 . Some authors believe that emotions are conscious abstractions of one’s internal state, which are assembled cognitively. According to this definition, an emotion requires awareness of one’s self, and thus emotion as a concept would no longer be applicable to many other species. Other authors argue that consciousness is merely a property of emotions and their processing, which is additive to the basal emotion properties 9 10 11 .
Emotions are often described by words (happy, sad, angry). In psychological research, “valence” and “arousal” have become accepted as a way of describing the concept numerically. Valence describes the affective content of an emotion as a continuum from pleasant to unpleasant or good to bad, while arousal describes the degree of internal activation by the emotion.
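This two-dimensional description can be sketched in code. The following is a minimal illustration only; the coordinates assigned to the example emotions are assumptions for demonstration, as real placements in valence-arousal space are determined empirically (e.g., via rating studies):

```python
from dataclasses import dataclass
from math import hypot

@dataclass(frozen=True)
class Affect:
    """A point in the two-dimensional valence-arousal space.

    valence: unpleasant (-1) to pleasant (+1)
    arousal: calm (0) to highly activated (1)
    """
    valence: float
    arousal: float

# Illustrative coordinates only -- not empirically derived values.
EMOTIONS = {
    "happy": Affect(valence=0.8, arousal=0.6),
    "sad":   Affect(valence=-0.7, arousal=0.2),
    "angry": Affect(valence=-0.6, arousal=0.8),
    "calm":  Affect(valence=0.4, arousal=0.1),
}

def affective_distance(a: Affect, b: Affect) -> float:
    """Euclidean distance between two points in valence-arousal space."""
    return hypot(a.valence - b.valence, a.arousal - b.arousal)

# "sad" and "angry" share negative valence but differ strongly in arousal:
print(round(affective_distance(EMOTIONS["sad"], EMOTIONS["angry"]), 2))  # → 0.61
```

The design point is that two emotions with similar verbal labels of negativity can still lie far apart once arousal is taken into account, which is exactly what the two-dimensional scheme is meant to capture.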
In this paper, the latter definition will be used and thus emotions considered as evolutionarily conserved functional states of the brain 7 12 13 . According to this view, emotions emerged from evolutionary processes and primitive forms may still be present in humans as conserved features. However, it is undisputed that emotions trigger certain reactions on biochemical, physiological, and behavioral levels 14 .
To interpret an emotion from a person’s facial expression, we do not always need the entire face; single parts are often sufficient (e.g., sad eyes or happy mouth) 15 16 . The integration of a wide variety of individual features across the entire face is a highly integrative performance that is significantly more complex than just a summation of individual properties 17 18 .
The significance of individual facial regions for decoding emotions differs between emotions. For example, the eyes are particularly important for the perception of sadness 19 , anger 20 21 , and fear 22 23 . Although other parts of the face also provide cues for these emotions, it has been shown, for example for fear, that emotion recognition from the upper part of the face is more accurate than from the lower one 24 . In contrast, people find it easier to recognize happy facial expressions from the lower part of the face 24 . Accordingly, observers of happy faces focus longer on the mouth region 22 23 25 . For the nose region, a particular importance in decoding facial expressions of disgust has been demonstrated 19 .
In addition to the interpretation of individual facial regions, facial expression is also processed and interpreted holistically, i.e., there is still a processing of facial expression that is not composed of the individual features and is thus more than the sum of its parts 26 . The significance of this holistic facial interpretation for the recognition of emotions differs between emotions. However, it is true for all emotions that their interpretation is composed neither purely of the perception and processing of individual features of subregions of the face nor exclusively of this holistic face interpretation, but consists of a combination of these two parallel processes 16 27 .
3. Voluntary and involuntary facial expression, classification of facial expression – microexpressions
Humans can control their facial expressions voluntarily. This voluntary control can be used to feign an emotional state. This deception is mostly directed at a communication partner, but can also be directed at oneself. Voluntary control of facial expressions has a target state, which corresponds to an internal model (what we want to express). In addition to voluntary facial movements, however, felt emotions are generally expressed involuntarily by facial expressions, and the corresponding muscles are involuntarily controlled. These movements break through our conscious control and are reflected in our face. This emotional leakage does not always involve large muscle movements, but can in part also be very discreet. Such movements are summarized under the term microexpressions. Specifically, microexpressions are understood as facial movements that are short-lived, involuntary, and automatic. Certain muscle groups contract or relax in fixed patterns. It is further assumed that these patterns have a genetic basis and are closely coupled to a person’s emotional state. The genetic basis has further led to the assumption that there is a set of basal emotional states that are universal across cultures 28 , although this assumption continues to be debated to this day 29 . Because these movements are involuntary, it is assumed that they allow conclusions to be drawn about a person’s actual emotional state.
Of particular interest is the interaction of voluntary and involuntary facial movements. Voluntary facial movement is often used to conceal the actual emotion felt and thus to deceive the communication partner. The question is whether these voluntary facial movements are fast enough and, more importantly, complete enough to suppress involuntary microexpressions that hint at the genuine emotion.
In a classic experiment, Paul Ekman and Wallace Friesen filmed a psychiatric patient trying to convince her therapist that she was fine and could be discharged from the psychiatric ward. At the end of the interview, the patient broke down crying and confessed that she continued to be unwell. During the phase of pretending to be well, various short-lived microexpressions could be detected that clearly indicated deception 30 . Numerous subsequent studies on this topic ultimately concluded that deceivers cannot always maintain a deception perfectly and that emotions often escape involuntarily via facial expressions in the form of microexpressions.
4. Cultural similarities and differences in facial expression and emotion
The dominance of the Western world in academia over the past century has resulted in the majority of studies on the perception of facial expressions and emotions being conducted on culturally westernized Caucasians. Few studies in the last century examined the relationship between facial expressions and emotions across cultures and ethnicities. The first significant studies were conducted by the American psychologist Paul Ekman in the 1960s and 1970s, who concluded that the facial expression of emotions was universal across cultures 28 31 . This was thought to be true for at least six basic emotions 32 . This view remained unchallenged for a long time, but more recent studies now suggest cultural differences 29 . This is especially true when emotional expression is specifically examined according to contextual and cultural influences. For example, people of Western and East Asian cultures show common perceptions of pain, but differences in the mimic expression of pleasure have also been found 29 .
In our everyday life and communication with others, however, we assume as a matter of course that our model of emotions and emotion perception is universal. Seemingly contrary to this statement, however, there is also a racial bias that influences the perception of the mimically expressed emotions of people from other cultures 33 .
5. Influence of diseases on facial expressions and emotions
Facial expressions – as well as the interpretation of the facial expressions of others – can be altered by various diseases. The classic example of a disturbance of the mimic expression of emotions is Parkinson’s syndrome, a neurodegenerative disease associated with a reduction in facial movements 34 35 . In later stages, the ability to modulate facial expressions is so severely restricted that the term “mask face” is used 35 . At this stage, patients no longer show any emotionally triggered facial expression. This diminished or, in some cases, absent mimic expression has a significant impact on the patients’ ability to communicate and on how they are perceived by others 36 37 38 . Observers perceive the corresponding patients as less empathetic, bored, anxious, and cranky 36 37 38 . This leads to less interest in relationships with these patients 36 . The consequences are increasing deterioration of social relationships, even with close people, increasing social isolation, and a concomitant deterioration in perceived quality of life 34 35 . These significant effects highlight the importance of facial expressions for social interaction.
Reduced or absent facial expressions also have a direct impact on communication behavior. In medical conversations, for example, doctors often deliver monologues in which disease models are illustrated or further therapy steps are explained. Although in such situations often only one person speaks, the speaker receives a multitude of mimic messages expressing, e.g., the degree of understanding of what is said, but also giving information about the emotional state of the interlocutor. If the speaker becomes aware of these messages, he/she modulates the content according to the feedback received. The complete absence of any mimic response in such a conversational situation is a striking experience, since it gives the feeling of conducting the conversation without orientation. The speaker has no indication of whether the interlocutor has understood what has been said, nor does it become clear whether the interlocutor reacts with annoyance, rejection, or acceptance. This condition leads to uncertainty and discomfort for the speaker. There seems to be an evolutionarily determined need for communicative feedback. The “still face” experiments conducted with babies and toddlers by Edward Tronick point in this direction 39 . In these experiments, the mothers suddenly stop all mimic expression. The children then initiate various actions in order to restore the interaction. They show increasing reactions of discomfort until they usually end up crying 39 .
These experiments, as well as our own experiences as adults in similar situations, not only emphasize the importance of facial expressions for communication but also force interlocutors to change their communication behavior. Longer monologues are no longer possible, since one has to rely on verbal feedback about what is said. This also makes clear how strongly even “monologues” depend on a constant non-verbal exchange of information.
The effect of reduced or absent facial expressions described in the example of Parkinson’s syndrome is directly transferable to other diseases that are accompanied by a reduced ability to express emotions facially, e.g., muscular dystrophies, myasthenia gravis, or amyotrophic lateral sclerosis. The negative effects on communication and perception of the affected person can also be found in unilateral disorders of facial motor function such as peripheral facial nerve palsy 40 .
However, in addition to the reduced ability to express emotions mimically, Parkinson patients also suffer from a reduced ability to perceive the mimic expressions of others 41 42 43 and also have fundamental difficulties with processing their emotional content 44 45 46 . This is a characteristic of many neurodegenerative diseases. The accompanying cerebral changes of the disease are often considered the cause. However, the influence of one’s own restricted facial expressions has also been discussed for some years as a cause of the limited ability to interpret the emotional content of the facial expressions of others. Thus, limitations in the interpretation of the emotional content of others’ facial expressions are also found in myasthenia gravis 47 , myotonic dystrophy 48 , Duchenne muscular dystrophy 49 , and amyotrophic lateral sclerosis 50 . Although all these diseases primarily affect the patients’ motor function, they also have central effects. Interestingly, however, changes in the interpretation of the emotional facial expressions of others can also be observed in purely peripheral disorders such as peripheral facial nerve palsy 51 52 , supporting the theory that limitations in one’s own purely motor ability to express emotions facially have a negative effect on the ability to interpret emotions in the facial expressions of others. This theory is supported by the finding that, for example, in amyotrophic lateral sclerosis, mimic emotion detection is reduced even when other cognitive abilities are still preserved 50 . Also, Parkinson patients as well as patients with Huntington’s disease show a direct correlation between the ability to express themselves mimically and the ability to interpret the emotions of others 53 54 55 . In summary, the main common feature of peripheral facial nerve palsy and all these neurodegenerative diseases is the reduced ability to control facial expressions motorically.
Nevertheless, all these diseases show common effects on emotion recognition and the cerebral processing of emotions. This suggests that even in the presence of possible alternative causes of reduced emotion recognition, the purely motor disorder may impair it. These findings also align well with the currently dominant theory of embodied recognition. This theory states that an important mechanism for perceiving facial emotions is their sensorimotor simulation in one’s own brain, which requires a mental replication of the perceived facial expression. When one’s own facial expression is disturbed, this mental replication is also altered, which ultimately limits the ability to simulate and thus recognize facial emotions 56 57 .
However, in addition to these peripheral and/or central disorders described above, there are also many central disorders that interfere with emotion detection of others’ facial expressions. For example, deficits are found in patients with schizophrenia 58 , chronic alcoholism 59 , borderline personality disorder 60 , and multiple sclerosis 61 62 63 .
6. Facial expression and emotions in the era of masks
Wearing face masks became commonplace in the era of the COVID pandemic. Face masks represent the most cost-effective protective measure against the SARS-CoV-2 virus and allow continued personal social contact 64 . Nevertheless, considerable resistance to the wearing of masks was shown in many countries. The main reason discussed is the feeling of restriction of personal freedom 65 66 . That wearing a face mask is perceived as such a strong restriction of personal freedom despite its considerable advantages seems to be related to the perceived restriction of physical and psychological freedom by the mask. Most probably, a piece of textile that had to be worn as a band on the upper arm, for example, would not have been associated with the same feeling of restricted personal freedom. Wearing a face mask is certainly an inconvenience and can also lead to an increase in headaches 67 or acne 68 in some individuals. However, these side effects are rare. It is much more likely that the feeling of restricted freedom arises from the fact that the mask covers most of the very regions of the body that make us communicative and social creatures. Our acceptance of wearing the mask in turn affects how we are perceived by others: negative feelings when wearing a mask also affect how other mask wearers perceive us socially 69 . Wearing masks naturally also disturbs the recognition of emotions from the facial expressions of our fellow human beings and makes social interactions more difficult 70 . Since mimic emotion recognition is not composed of the sum of facial subareas but also involves a strong holistic process, wearing masks worsens this ability more than would be expected from the mere lack of information from the covered facial areas 71 .
It is further argued that the resulting deficits in nonverbal communication give people a feeling of insecurity and discouragement and can also promote the emergence of psychological disorders if there is a corresponding predisposition 72 .
The effects described above are currently being reduced by the increasing abolition of the obligation to wear face masks. However, it is still obligatory in the health care system and thus a particular challenge for physicians regarding empathy and conversation skills. It is crucial that both communication partners are aware of the resulting perception deficits and that these deficits are compensated by changing and, above all, increasing bidirectional communication behavior.
7. Facial expressions and emotions in other species
In many species, the expression of emotion takes place on biochemical and physiological levels similar to human responses. For example, responses to fear are very similar, with accelerated respiration, changes in blood distribution, hormone release, and pupil dilation. In facial expressions, there are significantly stronger differences between species, which are due to the different numbers of muscle groups in the face and the ability to control them.
Darwin already noted that animals can develop specific facial expressions adapted to their species 73 . The “Facial Action Coding System” developed by Ekman and Friesen for the anatomy-based classification of emotions in humans 74 has also been successfully applied to seven other species, including chimpanzees 75 76 77 . Even in rodents, mimic expressions with at least an affective content could be detected 78 79 .
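The logic of the Facial Action Coding System can be illustrated with a small sketch. The action-unit (AU) sets below are commonly cited prototypes for a few basic emotions, but exact AU combinations vary across FACS revisions, so they should be read as illustrative rather than authoritative:

```python
# Commonly cited FACS action-unit prototypes for basic emotions.
# The exact AU sets vary between FACS revisions; these are illustrative.
EMOTION_PROTOTYPES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "anger":     {4, 5, 7, 23},  # brow lowerer + lid raiser/tightener + lip tightener
    "disgust":   {9, 15, 16},    # nose wrinkler + lip depressors
}

def match_emotions(observed_aus: set[int]) -> list[str]:
    """Return emotions whose full AU prototype is contained in the observation."""
    return [emotion for emotion, aus in EMOTION_PROTOTYPES.items()
            if aus <= observed_aus]

# An observation containing AU6 + AU12 (plus an unrelated AU25):
print(match_emotions({6, 12, 25}))  # → ['happiness']
```

The subset test mirrors the anatomy-based idea of FACS: an emotion label is assigned from the combination of activated muscle units, not from any single movement in isolation.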
However, it is undisputed that the possibilities of facial expression are reduced with a lower level of development 75 . In this context, the question of how much self-awareness is necessary to express emotions mimically and to recognize them in others remains of great interest.
8. Cerebral processing of mimic emotion perception
Functional imaging studies show that the processing of emotional faces is associated with increased activity in visual, limbic, temporoparietal, cerebellar, as well as prefrontal areas and in the putamen 80 ( Fig. 1 ). Visual information first reaches the visual cortex. Here, processing of emotional faces occurs in several subareas (inferior occipital gyrus, lingual gyrus, fusiform gyrus). Processing in these regions appears to be deep in the processing hierarchy and independent of the emotion processed or its valence 81 82 . The cerebellum is also involved in emotion processing. Although the cerebellar areas have strong anatomical connections to the cortical association cortex and various limbic structures 83 , they, like the visual areas, appear to be independent of the emotion itself being processed 84 85 86 .
Fig. 1.
Regions of increased neuronal activity in response to happy, sad, angry, fearful, and disgusted human faces compared with neutral faces (FG fusiform gyrus, ACC anterior cingulate cortex, LG lingual gyrus, IOG inferior occipital gyrus, MFG medial frontal gyrus) 80 .
Fearful faces activate the medial frontal cortex, which plays a key role in conscious experience and the regulation of emotion experience.
The amygdala is activated when looking at sad, fearful, or even happy faces, whereas angry or disgusted faces do not cause a change in amygdala activity. For the latter two emotions, particularly strong activation was found in the insular region 80 that seems to respond more strongly to disgusted faces than to angry ones. Limbic regions such as the amygdala and hippocampus are thought to be more involved in processing external positive emotional sensory information, while the insular region is activated primarily for interoceptive stimuli and distressing external stimuli 87 . The cingulate cortex is particularly activated by the perception of happy faces 87 .
This distribution makes clear that the processing of emotional contents of faces occurs in a complex network of brain regions. Different emotions involve different brain regions to different degrees, but many brain regions are involved in the processing of most emotions.
However, the cerebral activation pattern in the processing of emotional faces seems to depend not only on the emotion presented but also on the attention paid to it 88 89 90 . In general, there is consensus that face processing shows right-hemisphere dominance. This is a long-standing finding for the general processing of faces for identity recognition, emotion, and attention 88 89 90 . It is also congruent with the finding that in patients with parietal lesions, right-sided lesions lead to stronger neglect symptoms than left-sided ones 91 . This right-sided dominance for the general processing of faces also seems to extend to the processing of emotional faces. Thus, bilateral presentation of fearful faces elicited greater activation and connectivity of the right visual cortex and right amygdala compared with neutral faces 92 . Even when little attention is paid to faces, right-lateralized increased activity is found in the amygdala and temporal pole 93 .
The processing of emotional faces described above takes place explicitly, i.e., with awareness of the information. This contrasts with the implicit, i.e., unconscious, processing of emotional information from faces. Experimentally, this unconscious perception can be induced, e.g., by backward masking with other stimuli or by interocular suppression. Implicit processing is particularly interesting because it was shown early on that patients with structural lesions of the primary visual cortex could discriminate specific characteristics of such stimuli 94 . Subsequent studies revealed that this ability relates particularly to the perception of emotional faces 95 – a phenomenon that has been termed affective blindsight. Affected patients can correctly identify emotions from faces with a disproportionately high frequency, although they cannot consciously see them 96 97 . They show increased activity in the left amygdala, left striatum, and right middle temporal gyrus 93 . The processing of unconsciously perceived faces also depends on the expressed emotions. In the processing of unconsciously perceived faces with positive emotions, the temporal and parietal cortex are more involved than in the unconscious processing of neutral faces. In contrast, unconsciously perceived facial expressions with negative emotional content lead to stronger activity in subcortical regions 93 .
Overall, the parallel existence of a subcortical processing pathway to the right amygdala via the midbrain and thalamus has been proposed for the processing of emotional faces, alongside a parallel cortical pathway for conscious face perception 93 98 99 100 . However, this view is controversial, as some authors argue that the subconscious processing of emotional faces is also cortical 101 102 103 . The existence of a subcortical parallel pathway would also explain the fact that basal emotions in faces can be detected in prefrontal areas with latencies as short as 120 ms 104 , which is surprisingly fast considering that motion-independent foveal information reaches the occipital cortex only after about 60 ms and the complex compound P100 component is measurable only after 100 ms. Despite this considerable speed, however, a fast cortical pathway cannot be ruled out here either. In addition to these fast components, significantly later components are also measurable in the prefrontal cortex. It appears that the interpretation of emotions occurs in a temporally staggered process. The prefrontal cortex is also of particular importance in this context because it modulates visual information processing in other brain regions 104 . This iterative information processing seems particularly plausible because it shows strong similarities to the parallel organization of other sensory systems 105 106 . The existence of parallel, very fast responses also indicates the considerable evolutionary importance of emotion recognition, since this additional processing pathway must have provided an evolutionary advantage due to its speed.
9. Interpretation of emotions from facial expressions in their meaning in society
It is generally agreed that the emotions in the faces we look at have an effect on us. Although we would not cognitively classify the friendly smile of the hotel employee at the counter as a sign of joy about our visit, we mostly perceive it as pleasant, or at least the absence of a smile as unpleasant. In the legal systems of Western democracies, the interpretation of emotions by judges is also seen as influencing the sentence. This presupposes the conviction that we can correctly classify emotions. Thus, the display of grief, both in facial expressions and in words, often has a sentence-reducing effect. In this regard, many people consider facial expressions to be more decisive, under the assumption that they allow deeper insight into emotions than words of regret read aloud. For example, US Supreme Court Justice Anthony Kennedy wrote in 1992 that reading the defendant’s emotions is necessary to “know the heart and mind of the offender”.
Even if the interpretation of emotions and their assessment on sentencing is considered fair in our legal system, the same is not true for emotion interpretation in other areas of society. The American psychologist Paul Ekman developed and implemented a training program for the US Transportation Security Administration in 2007 that trained staff to observe passengers for suspicious signs of stress, deception, or anxiety. The program is based on Paul Ekman’s research from the late 1970s, during which Ekman published several articles on identifying emotions from minimal changes in facial expression 107 . It is further based on the assumption that the attempt to hide or lie about emotions leads to the leakage of emotions in the form of microexpressions in facial expressions.
However, the considerable inaccuracy and racial bias led to substantial criticism from scientists, members of the US Congress, and various human rights organizations 108 . Leaving aside ethical and moral aspects, this program is highly interesting from a purely scientific perspective, as it explores the hypothesis that mere observation of people can determine whether they are deceiving or planning hostile acts. The number of people involved gives an idea of the size of the program: in 2015, 2,800 people were employed in this so-called SPOT program, and between 2007 and 2013, 900 million US dollars were invested. Of course, no academically driven research can reach this scale. The enormous data volume and the various technology deployments make this program particularly interesting.
The aforementioned hypothesis has found many supporters, and thus personnel have been hired at London Heathrow Airport for behavioral observation, and the US Department of Homeland Security is also pursuing a program for electronic assessment of nonverbal behavior. In this context, the study of facial expressions is a particularly important aspect. Critics argue that there is no scientific evidence to support the assumption that conclusions about intent or future behavior can be drawn from observations 109 . This assessment was shared by a review commission of the program by the US Congress. In response to these concerns, the Transportation Security Administration commissioned an independent study, the results of which are currently pending.
Automated analysis and categorization of emotions from faces has been intensively researched over the last two decades. Current algorithms are able to identify emotions from static pictures with more than 90% accuracy. In the last 10 years, the analysis of still pictures has taken a back seat in research, while the analysis of emotions from video sequences has received increasing attention. Here, too, significant success has been achieved in the automated analysis and categorization of emotions from faces, owing to the considerable increase in computing power and, above all, to advances in the field of computer vision 110 111 112 .
The analysis of videos is much more complex than the analysis of static images. It is not simply a matter of analyzing each frame in isolation; rather, the emotion estimate for a given frame depends on the appearance of the face in preceding and subsequent frames. In particular, and unlike in static images, voluntary and involuntary facial movements can be distinguished in videos. It is precisely the “leakage” of a genuine emotion through a feigned one that can be analyzed, and it is this interplay of voluntary and involuntary movements, together with the associated detection of microexpressions, that is the subject of very active research 113 .
The “leakage” of genuine emotions through an acted facial expression lasts only a very short time and is often difficult for the human eye to detect. The duration of microexpressions is reported to be in the range of 40–200 ms 30 114 . A further property that makes microexpressions hard for the human eye to detect is that they are often fragmentary: only part of the muscle groups expressing a particular emotion is activated during these microexpressions 115 .
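A simple calculation shows why this short duration makes microexpressions hard to capture: the sketch below (plain Python, illustrative only) counts how many video frames an event of 40–200 ms spans at various camera frame rates.

```python
def frames_spanned(duration_ms: float, fps: float) -> int:
    """Number of whole frames covered by an event of the given duration."""
    return int(duration_ms / 1000.0 * fps)

# A microexpression lasting 40-200 ms (values from the text):
for fps in (25, 100, 200):
    lo = frames_spanned(40, fps)
    hi = frames_spanned(200, fps)
    print(f"{fps:>3} fps: {lo}-{hi} frames")
```

At a standard 25 fps, a 40 ms microexpression falls within a single frame, which is one reason high frame rates are favored when recording microexpression datasets.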
These characteristics of microexpressions make them an ideal application area for computational image analysis. There are many potential applications, e.g., in business negotiations, in psychiatric interviews, and in forensics 108 116 . A first approach was developed by Paul Ekman in 2002: he trained people, with the help of a software tool, to detect 7 different categories of microexpressions 117 . The results were sobering, however, as students who used this tool to recognize microexpressions performed worse than US Coast Guard staff who worked without assistance 118 .
To date, the development of analysis methods has been significantly limited by the paucity of available datasets. To create datasets containing as many microexpressions as possible, subjects are asked to watch videos with emotional content and are simultaneously offered rewards for not showing emotions. Labeling these data remains difficult, however, and as a result only a few freely accessible datasets of limited size currently exist 113 119 .
In addition, analyses are complicated by head movements of the recorded person and by facial movements without emotional significance (e.g., blinking). Besides the interpretation of the emotional content of a microexpression, the temporal localization of the microexpression, i.e., identifying the phases of its occurrence, poses a considerable problem 120 . These subproblems have attracted a growing number of investigations, and the quality of results has improved significantly, especially over the last 12 years. If the number of emotions to be recognized is limited (e.g., to positive, negative, surprised, and others), accuracies of around 70% can be obtained 121 122 123 .
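One common way to approach this temporal localization, sketched below under strongly simplifying assumptions (Python with NumPy; the motion signal, threshold, and all numeric values are hypothetical), is to compute a per-frame facial-motion magnitude, e.g., from frame differences or optical flow, and keep only contiguous above-threshold intervals whose duration falls within the reported microexpression range:

```python
import numpy as np

def spot_intervals(motion, fps, thresh, min_ms=40, max_ms=200):
    """Return (start, end) frame indices of contiguous above-threshold
    runs whose duration lies within the microexpression range."""
    above = motion > thresh
    # Run boundaries are where the boolean mask changes value.
    edges = np.flatnonzero(np.diff(above.astype(int)))
    starts = [0] if above[0] else []
    starts += [e + 1 for e in edges if not above[e]]
    ends = [e + 1 for e in edges if above[e]]
    if above[-1]:
        ends.append(len(above))
    out = []
    for s, e in zip(starts, ends):
        dur_ms = (e - s) / fps * 1000.0
        if min_ms <= dur_ms <= max_ms:
            out.append((s, e))
    return out

# Synthetic motion signal at 200 fps: a 20-frame (100 ms) burst
# starting at frame 50, embedded in low-level noise.
rng = np.random.default_rng(0)
motion = rng.uniform(0.0, 0.1, 300)
motion[50:70] += 1.0
print(spot_intervals(motion, fps=200, thresh=0.5))  # -> [(50, 70)]
```

Note that the duration filter also rejects longer, voluntary facial movements; real spotting methods additionally have to suppress blinks and head motion, which this toy signal ignores.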
These results, however, have so far been achieved on single databases (i.e., without evaluating performance across databases) and with a restricted set of target categories, with reported accuracies of 65% to more than 70%. Current methods focus on so-called convolutional neural networks, which can achieve an accuracy of over 70% with various auxiliary methods 124 , in some cases even across different datasets 125 . Despite this significant progress, there is still considerable room for improvement in emotion detection from videos and in the detection and interpretation of microexpressions; the current quality of results clearly limits practical use.
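The basic building block of such networks, a 2-D convolution over the image, can be illustrated without any deep-learning framework. The sketch below (NumPy; the hand-picked edge-detection kernel stands in for the filters a real network learns from data) shows the feature-extraction step that these models stack many times:

```python
import numpy as np

def conv2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Valid-mode 2-D cross-correlation, the core operation of a CNN layer."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# Toy "image": dark left half, bright right half. A vertical-edge kernel
# (hand-picked here; a CNN learns such filters during training) responds
# most strongly at the brightness boundary.
img = np.zeros((5, 6))
img[:, 3:] = 1.0
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)
resp = conv2d(img, sobel_x)
print(resp.max())  # strongest response at the vertical edge
```

A trained network applies many such filters per layer, interleaved with nonlinearities and pooling, and in microexpression recognition the convolution is typically extended over time as well as space.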
10. Conclusion
The interplay of facial expressions and emotions is an important part of daily social interaction. Scientific research over the past 50 years has yielded a substantial increase in knowledge. In particular, deeper insights have been gained into the evolutionary origins and behavioral properties of facial expression and emotion. Cerebral processing has also been studied intensively, and its anatomical substrates can now largely be assigned. However, functional models of how emotions are recognized in and expressed through facial expressions, which would significantly advance our understanding of the cerebral processing mechanisms, are still lacking. In the coming years, the recognition of emotions in videos, and in particular the recognition of microexpressions, will be a major scientific and, increasingly, economic challenge.
Footnotes
Conflict of interest The authors declare that no conflict of interest exists.
References
- 1.Frank MG, Stennett J. The forced-choice paradigm and the perception of facial expressions of emotion. J Pers Soc Psychol. 2001;80:75–85. doi: 10.1037//0022-3514.80.1.75. [DOI] [PubMed] [Google Scholar]
- 2.Grossmann T, Johnson MH. The development of the social brain in human infancy. Eur J Neurosci. 2007;25:909–919. doi: 10.1111/j.1460-9568.2007.05379.x. [DOI] [PubMed] [Google Scholar]
- 3.Barrett LF. Functionalism cannot save the classical view of emotion. Soc Cogn Affect Neurosci. 2017;12:34–36. doi: 10.1093/scan/nsw156. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 4.Adolphs R, Mlodinow L, Barrett LF. What is an emotion? Curr Biol. 2019;29:R1060–R1064. doi: 10.1016/j.cub.2019.09.008. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 5.LeDoux J. Rethinking the emotional brain. Neuron. 2012;73:653–676. doi: 10.1016/j.neuron.2012.02.004. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 6.Shackman AJ, Wager TD. The emotional brain: Fundamental questions and strategies for future research. Neurosci Lett. 2019;693:68–74. doi: 10.1016/j.neulet.2018.10.012. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 7.Anderson DJ, Adolphs R. A framework for studying emotions across species. Cell. 2014;157:187–200. doi: 10.1016/j.cell.2014.03.003. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 8.Panksepp J, Lane RD, Solms M, Smith R. Reconciling cognitive and affective neuroscience perspectives on the brain basis of emotional experience. Neurosci Biobehav Rev. 2017;76:187–215. doi: 10.1016/j.neubiorev.2016.09.010. [DOI] [PubMed] [Google Scholar]
- 9.Fanselow MS, Pennington ZT. A return to the psychiatric dark ages with a two-system framework for fear. Behav Res Ther. 2018;100:24–29. doi: 10.1016/j.brat.2017.10.012. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 10.Chikazoe J, Lee DH, Kriegeskorte N, Anderson AK. Population coding of affect across stimuli, modalities and individuals. Nat Neurosci. 2014;17:1114–1122. doi: 10.1038/nn.3749. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 11.Adolphs R. Emotion. Curr Biol. 2010;20:R549–552. doi: 10.1016/j.cub.2010.05.046. [DOI] [PubMed] [Google Scholar]
- 12.Adolphs R. How should neuroscience study emotions? by distinguishing emotion states, concepts, and experiences. Soc Cogn Affect Neurosci. 2017;12:24–31. doi: 10.1093/scan/nsw153. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 13.Adolphs R, Anderson D. The Neuroscience of Emotion: A New Synthesis. Princeton University Press; 2018. [DOI] [Google Scholar]
- 14.Damasio A, Carvalho GB. The nature of feelings: evolutionary and neurobiological origins. Nat Rev Neurosci. 2013;14:143–152. doi: 10.1038/nrn3403. [DOI] [PubMed] [Google Scholar]
- 15.Ellison JW, Massaro DW. Featural evaluation, integration, and judgment of facial affect. J Exp Psychol Hum Percept Perform. 1997;23:213–226. doi: 10.1037//0096-1523.23.1.213. [DOI] [PubMed] [Google Scholar]
- 16.Beaudry O, Roy-Charland A, Perron M, Cormier I, Tapp R. Featural processing in recognition of emotional facial expressions. Cogn Emot. 2014;28:416–432. doi: 10.1080/02699931.2013.833500. [DOI] [PubMed] [Google Scholar]
- 17.McKone E, Davies AA, Darke H, Crookes K, Wickramariyaratne T, Zappia S, Fiorentini C, Favelle S, Broughton M, Fernando D. Importance of the inverted control in measuring holistic face processing with the composite effect and part-whole effect. Front Psychol. 2013;4:33. doi: 10.3389/fpsyg.2013.00033. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 18.Tobin A, Favelle S, Palermo R. Dynamic facial expressions are processed holistically, but not more holistically than static facial expressions. Cogn Emot. 2016;30:1208–1221. doi: 10.1080/02699931.2015.1049936. [DOI] [PubMed] [Google Scholar]
- 19.Schurgin MW, Nelson J, Iida S, Ohira H, Chiao JY, Franconeri SL. Eye movements during emotion recognition in faces. J Vis. 2014;14:14. doi: 10.1167/14.13.14. [DOI] [PubMed] [Google Scholar]
- 20.Eisenbarth H, Alpers GW. Happy mouth and sad eyes: Scanning emotional facial expressions. Emotion. 2011;11:860–865. doi: 10.1037/a0022758. [DOI] [PubMed] [Google Scholar]
- 21.Blais C, Fiset D, Roy C, Saumure Régimbald C, Gosselin F. Eye fixation patterns for categorizing static and dynamic facial expressions. Emotion. 2017;17:1107–1119. doi: 10.1037/emo0000283. [DOI] [PubMed] [Google Scholar]
- 22.Gillespie SM, Rotshtein P, Wells LJ, Beech AR, Mitchell IJ. Psychopathic traits are associated with reduced attention to the eyes of emotional faces among adult male non-offenders. Front Hum Neurosci. 2015;9:552. doi: 10.3389/fnhum.2015.00552. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 23.Wells LJ, Gillespie SM, Rotshtein P. Identification of Emotional Facial Expressions: Effects of Expression, Intensity, and Sex on Eye Gaze. PLoS One. 2016;11:e0168307. doi: 10.1371/journal.pone.0168307. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 24.Calder AJ, Young AW, Keane J, Dean M. Configural information in facial expression perception. Journal of Experimental Psychology: Human Perception and Performance. 2000;26:527–551. doi: 10.1037/0096-1523.26.2.527. [DOI] [PubMed] [Google Scholar]
- 25.Calvo MG, Nummenmaa L. Detection of emotional faces: salient physical features guide effective visual search. J Exp Psychol Gen. 2008;137:471–494. doi: 10.1037/a0012771. [DOI] [PubMed] [Google Scholar]
- 26.Tanaka JW, Simonyi D. The “Parts and Wholes” of Face Recognition: A Review of the Literature. Quarterly Journal of Experimental Psychology. 2016;69:1876–1889. doi: 10.1080/17470218.2016.1146780. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27.Kilpeläinen M, Salmela V. Perceived emotional expressions of composite faces. PLoS ONE. 2020;15:e0230039. doi: 10.1371/journal.pone.0230039. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 28.Ekman P, Friesen WV. Constants across cultures in the face and emotion. Journal of Personality and Social Psychology. 1971;17:124–129. doi: 10.1037/h0030377. [DOI] [PubMed] [Google Scholar]
- 29.Chen Z, Whitney D. Tracking the affective state of unseen persons. Proceedings of the National Academy of Sciences. 2019;116:7559–7564. doi: 10.1073/pnas.1812250116. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 30.Ekman P, Friesen WV. Nonverbal leakage and clues to deception. Psychiatry. 1969;32:88–106. doi: 10.1080/00332747.1969.11023575. [DOI] [PubMed] [Google Scholar]
- 31.Ekman P, Sorenson ER, Friesen WV. Pan-Cultural Elements in Facial Displays of Emotion. Science. 1969;164:86–88. doi: 10.1126/science.164.3875.86. [DOI] [PubMed] [Google Scholar]
- 32.Ekman P. An argument for basic emotions. Cognition and Emotion. 1992;6:169–200. doi: 10.1080/02699939208411068. [DOI] [Google Scholar]
- 33.Ruba AL, McMurty R, Gaither SE, Wilbourn MP. How White American Children Develop Racial Biases in Emotion Reasoning. Affect Sci. 2022;3:21–33. doi: 10.1007/s42761-022-00111-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 34.Jankovic J. Parkinson’s disease: clinical features and diagnosis. J Neurol Neurosurg Psychiatry. 2008;79:368–376. doi: 10.1136/jnnp.2007.131045. [DOI] [PubMed] [Google Scholar]
- 35.Bologna M, Fabbrini G, Marsili L, Defazio G, Thompson PD, Berardelli A. Facial bradykinesia. J Neurol Neurosurg Psychiatry. 2013;84:681–685. doi: 10.1136/jnnp-2012-303993. [DOI] [PubMed] [Google Scholar]
- 36.Hemmesch AR, Tickle-Degnen L, Zebrowitz LA. The influence of facial masking and sex on older adults’ impressions of individuals with Parkinson’s disease. Psychol Aging. 2009;24:542–549. doi: 10.1037/a0016105. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 37.Pentland B, Gray JM, Riddle WJ, Pitcairn TK. The effects of reduced non-verbal communication in Parkinson’s disease. Br J Disord Commun. 1988;23:31–34. doi: 10.3109/13682828809019874. [DOI] [PubMed] [Google Scholar]
- 38.Tickle-Degnen L, Lyons KD. Practitioners’ impressions of patients with Parkinson’s disease: the social ecology of the expressive mask. Soc Sci Med. 2004;58:603–614. doi: 10.1016/s0277-9536(03)00213-2. [DOI] [PubMed] [Google Scholar]
- 39.Tronick E. The neurobehavioral and social-emotional development of infants and children. 1st ed. W. W. Norton & Co; 2007. [Google Scholar]
- 40.Dobel C, Miltner WHR, Witte OW, Volk GF, Guntinas-Lichius O. [Emotional impact of facial palsy] Laryngorhinootologie. 2013;92:9–23. doi: 10.1055/s-0032-1327624. [DOI] [PubMed] [Google Scholar]
- 41.Herrera E, Cuetos F, Rodríguez-Ferreiro J. Emotion recognition impairment in Parkinson’s disease patients without dementia. J Neurol Sci. 2011;310:237–240. doi: 10.1016/j.jns.2011.06.034. [DOI] [PubMed] [Google Scholar]
- 42.Dujardin K, Blairy S, Defebvre L, Duhem S, Noël Y, Hess U, Destée A. Deficits in decoding emotional facial expressions in Parkinson’s disease. Neuropsychologia. 2004;42:239–250. doi: 10.1016/s0028-3932(03)00154-4. [DOI] [PubMed] [Google Scholar]
- 43.Ibarretxe-Bilbao N, Junque C, Tolosa E, Marti M-J, Valldeoriola F, Bargallo N, Zarei M. Neuroanatomical correlates of impaired decision-making and facial emotion recognition in early Parkinson’s disease. Eur J Neurosci. 2009;30:1162–1171. doi: 10.1111/j.1460-9568.2009.06892.x. [DOI] [PubMed] [Google Scholar]
- 44.Poewe W. Non-motor symptoms in Parkinson’s disease. Eur J Neurol. 2008;15(Suppl 1):14–20. doi: 10.1111/j.1468-1331.2008.02056.x. [DOI] [PubMed] [Google Scholar]
- 45.Péron J, Dondaine T, Le Jeune F, Grandjean D, Vérin M. Emotional processing in Parkinson’s disease: a systematic review. Mov Disord. 2012;27:186–199. doi: 10.1002/mds.24025. [DOI] [PubMed] [Google Scholar]
- 46.Reijnders JSAM, Ehrt U, Weber WEJ, Aarsland D, Leentjens AFG. A systematic review of prevalence studies of depression in Parkinson’s disease. Mov Disord. 2008;23:183–189; quiz 313. doi: 10.1002/mds.21803. [DOI] [PubMed] [Google Scholar]
- 47.Lázaro E, Amayra I, López-Paz JF, Jometón A, Martín N, Caballero P, De Nicolás L, Hoffmann H, Kessler H, Ruiz B et al. Facial affect recognition in myasthenia gravis. Span J Psychol. 2013;16:E52. doi: 10.1017/sjp.2013.59. [DOI] [PubMed] [Google Scholar]
- 48.Winblad S, Hellström P, Lindberg C, Hansen S. Facial emotion recognition in myotonic dystrophy type 1 correlates with CTG repeat expansion. J Neurol Neurosurg Psychiatry. 2006;77:219–223. doi: 10.1136/jnnp.2005.070763. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 49.Hinton VJ, Fee RJ, De Vivo DC, Goldstein E. Poor facial affect recognition among boys with duchenne muscular dystrophy. J Autism Dev Disord. 2007;37:1925–1933. doi: 10.1007/s10803-006-0325-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 50.Sedda A. Disorders of emotional processing in amyotrophic lateral sclerosis. Curr Opin Neurol. 2014;27:659–665. doi: 10.1097/WCO.0000000000000147. [DOI] [PubMed] [Google Scholar]
- 51.Korb S, Wood A, Banks CA, Agoulnik D, Hadlock TA, Niedenthal PM. Asymmetry of Facial Mimicry and Emotion Perception in Patients With Unilateral Facial Paralysis. JAMA Facial Plast Surg. 2016;18:222–227. doi: 10.1001/jamafacial.2015.2347. [DOI] [PubMed] [Google Scholar]
- 52.Storbeck F, Schlegelmilch K, Streitberger K-J, Sommer W, Ploner CJ. Delayed recognition of emotional facial expressions in Bell’s palsy. Cortex. 2019;120:524–531. doi: 10.1016/j.cortex.2019.07.015. [DOI] [PubMed] [Google Scholar]
- 53.Wagenbreth C, Wattenberg L, Heinze H-J, Zaehle T. Implicit and explicit processing of emotional facial expressions in Parkinson’s disease. Behav Brain Res. 2016;303:182–190. doi: 10.1016/j.bbr.2016.01.059. [DOI] [PubMed] [Google Scholar]
- 54.Ricciardi L, Visco-Comandini F, Erro R, Morgante F, Bologna M, Fasano A, Ricciardi D, Edwards MJ, Kilner J. Facial Emotion Recognition and Expression in Parkinson’s Disease: An Emotional Mirror Mechanism? PLoS One. 2017;12:e0169110. doi: 10.1371/journal.pone.0169110. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 55.Trinkler I, Devignevielle S, Achaibou A, Ligneul RV, Brugières P, Cleret de Langavant L, De Gelder B, Scahill R, Schwartz S, Bachoud-Lévi A-C. Embodied emotion impairment in Huntington’s Disease. Cortex. 2017;92:44–56. doi: 10.1016/j.cortex.2017.02.019. [DOI] [PubMed] [Google Scholar]
- 56.Wood A, Rychlowska M, Korb S, Niedenthal P. Fashioning the Face: Sensorimotor Simulation Contributes to Facial Expression Recognition. Trends Cogn Sci. 2016;20:227–240. doi: 10.1016/j.tics.2015.12.010. [DOI] [PubMed] [Google Scholar]
- 57.Price TF, Harmon-Jones E. Embodied emotion: the influence of manipulated facial and bodily states on emotive responses. Wiley Interdiscip Rev Cogn Sci. 2015;6:461–473. doi: 10.1002/wcs.1370. [DOI] [PubMed] [Google Scholar]
- 58.Belge J-B, Maurage P, Mangelinckx C, Leleux D, Delatte B, Constant E. Facial decoding in schizophrenia is underpinned by basic visual processing impairments. Psychiatry Res. 2017;255:167–172. doi: 10.1016/j.psychres.2017.04.007. [DOI] [PubMed] [Google Scholar]
- 59.Maurage P, Campanella S, Philippot P, Martin S, de Timary P. Face processing in chronic alcoholism: a specific deficit for emotional features. Alcohol Clin Exp Res. 2008;32:600–606. doi: 10.1111/j.1530-0277.2007.00611.x. [DOI] [PubMed] [Google Scholar]
- 60.Domes G, Schulze L, Herpertz SC. Emotion recognition in borderline personality disorder-a review of the literature. J Pers Disord. 2009;23:6–19. doi: 10.1521/pedi.2009.23.1.6. [DOI] [PubMed] [Google Scholar]
- 61.Parada-Fernández P, Oliva-Macías M, Amayra I, López-Paz JF, Lázaro E, Martínez Ó, Jometón A, Berrocoso S, García de Salazar H, Pérez M. Accuracy and reaction time in recognition of facial emotions in people with multiple sclerosis. Rev Neurol. 2015;61:433–440. [PubMed] [Google Scholar]
- 62.Berneiser J, Wendt J, Grothe M, Kessler C, Hamm AO, Dressel A. Impaired recognition of emotional facial expressions in patients with multiple sclerosis. Mult Scler Relat Disord. 2014;3:482–488. doi: 10.1016/j.msard.2014.02.001. [DOI] [PubMed] [Google Scholar]
- 63.Cecchetto C, Aiello M, D’Amico D, Cutuli D, Cargnelutti D, Eleopra R, Rumiati RI. Facial and bodily emotion recognition in multiple sclerosis: the role of alexithymia and other characteristics of the disease. J Int Neuropsychol Soc. 2014;20:1004–1014. doi: 10.1017/S1355617714000939. [DOI] [PubMed] [Google Scholar]
- 64.Li T, Liu Y, Li M, Qian X, Dai SY. Mask or no mask for COVID-19: A public health and market study. PLoS One. 2020;15:e0237691. doi: 10.1371/journal.pone.0237691. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 65.Kiefer B. Masque, Covid et liberté. Rev Med Suisse. 2020;16:1780. [PubMed] [Google Scholar]
- 66.Miller FG. Liberty and Protection of Society During a Pandemic: Revisiting John Stuart Mill. Perspect Biol Med. 2021;64:200–210. doi: 10.1353/pbm.2021.0016. [DOI] [PubMed] [Google Scholar]
- 67.Ong JJY, Bharatendu C, Goh Y, Tang JZY, Sooi KWX, Tan YL, Tan BYQ, Teoh H, Ong ST, Allen DM et al. Headaches Associated With Personal Protective Equipment – A Cross-Sectional Study Among Frontline Healthcare Workers During COVID-19. Headache: The Journal of Head and Face Pain. 2020;60:864–877. doi: 10.1111/head.13811. [DOI] [PubMed] [Google Scholar]
- 68.Teo W. The “Maskne” microbiome – pathophysiology and therapeutics. Int J Dermatology. 2021;60:799–809. doi: 10.1111/ijd.15425. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 69.Biermann M, Schulze A, Unterseher F, Atanasova K, Watermann P, Krause-Utz A, Stahlberg D, Bohus M, Lis S. Trustworthiness appraisals of faces wearing a surgical mask during the Covid-19 pandemic in Germany: An experimental study. PLoS ONE. 2021;16:e0251393. doi: 10.1371/journal.pone.0251393. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 70.Carbon C-C. Wearing Face Masks Strongly Confuses Counterparts in Reading Emotions. Front. Psychol. 2020;11:566886. doi: 10.3389/fpsyg.2020.566886. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 71.Ferrari GRA, Möbius M, van Opdorp A, Becker ES, Rinck M. Can’t Look Away: An Eye-Tracking Based Attentional Disengagement Training for Depression. Cogn Ther Res. 2016;40:672–686. doi: 10.1007/s10608-016-9766-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 72.Matuschek C, Moll F, Fangerau H, Fischer JC, Zänker K, van Griensven M, Schneider M, Kindgen-Milles D, Knoefel WT, Lichtenberg A et al. Face masks: benefits and risks during the COVID-19 crisis. Eur J Med Res. 2020;25:32. doi: 10.1186/s40001-020-00430-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 73.Darwin C, Ekman P. The expression of the emotions in man and animals. 4th ed., 200th anniversary ed. Oxford University Press; 2009. [Google Scholar]
- 74.Ekman P, Friesen WV. Facial Action Coding System. 2019 doi: 10.1037/t27734-000. [DOI] [Google Scholar]
- 75.Zych AD, Gogolla N. Expressions of emotions across species. Curr Opin Neurobiol. 2021;68:57–66. doi: 10.1016/j.conb.2021.01.003. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 76.Parr LA, Waller BM, Vick SJ, Bard KA. Classifying chimpanzee facial expressions using muscle action. Emotion. 2007;7:172–181. doi: 10.1037/1528-3542.7.1.172. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 77.Parr LA, Waller BM, Burrows AM, Gothard KM, Vick SJ. Brief communication: MaqFACS: A muscle-based facial movement coding system for the rhesus macaque. Am J Phys Anthropol. 2010;143:625–630. doi: 10.1002/ajpa.21401. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 78.Grill HJ, Norgren R. The taste reactivity test. I. Mimetic responses to gustatory stimuli in neurologically normal rats. Brain Res. 1978;143:263–279. doi: 10.1016/0006-8993(78)90568-1. [DOI] [PubMed] [Google Scholar]
- 79.Grill HJ, Norgren R. Chronically decerebrate rats demonstrate satiation but not bait shyness. Science. 1978;201:267–269. doi: 10.1126/science.663655. [DOI] [PubMed] [Google Scholar]
- 80.Fusar-Poli P, Placentino A, Carletti F, Landi P, Allen P, Surguladze S, Benedetti F, Abbamonte M, Gasparotti R, Barale F et al. Functional atlas of emotional faces processing: a voxel-based meta-analysis of 105 functional magnetic resonance imaging studies. J Psychiatry Neurosci. 2009;34:418–432. [PMC free article] [PubMed] [Google Scholar]
- 81.Adolphs R. Neural systems for recognizing emotion. Curr Opin Neurobiol. 2002;12:169–177. doi: 10.1016/s0959-4388(02)00301-x. [DOI] [PubMed] [Google Scholar]
- 82.Barton JJS. Face processing in the temporal lobe. Handb Clin Neurol. 2022;187:191–210. doi: 10.1016/B978-0-12-823493-8.00019-5. [DOI] [PubMed] [Google Scholar]
- 83.Baillieux H, De Smet HJ, Paquier PF, De Deyn PP, Mariën P. Cerebellar neurocognition: insights into the bottom of the brain . Clin Neurol Neurosurg. 2008;110:763–773. doi: 10.1016/j.clineuro.2008.05.013. [DOI] [PubMed] [Google Scholar]
- 84.Reiman EM, Lane RD, Ahern GL, Schwartz GE, Davidson RJ, Friston KJ, Yun LS, Chen K. Neuroanatomical correlates of externally and internally generated human emotion. Am J Psychiatry. 1997;154:918–925. doi: 10.1176/ajp.154.7.918. [DOI] [PubMed] [Google Scholar]
- 85.Sacchetti B, Baldi E, Lorenzini CA, Bucherelli C. Cerebellar role in fear-conditioning consolidation. Proc Natl Acad Sci U S A. 2002;99:8406–8411. doi: 10.1073/pnas.112660399. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 86.Turner BM, Paradiso S, Marvel CL, Pierson R, Boles Ponto LL, Hichwa RD, Robinson RG. The cerebellum and emotional experience. Neuropsychologia. 2007;45:1331–1341. doi: 10.1016/j.neuropsychologia.2006.09.023. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 87.Husted DS, Shapira NA, Goodman WK. The neurocircuitry of obsessive-compulsive disorder and disgust. Prog Neuropsychopharmacol Biol Psychiatry. 2006;30:389–399. doi: 10.1016/j.pnpbp.2005.11.024. [DOI] [PubMed] [Google Scholar]
- 88.Thiebaut de Schotten M, Dell’Acqua F, Forkel SJ, Simmons A, Vergani F, Murphy DGM, Catani M. A lateralized brain network for visuospatial attention. Nat Neurosci. 2011;14:1245–1246. doi: 10.1038/nn.2905. [DOI] [PubMed] [Google Scholar]
- 89.Demaree HA, Everhart DE, Youngstrom EA, Harrison DW. Brain lateralization of emotional processing: historical roots and a future incorporating “dominance.” Behav Cogn Neurosci Rev. 2005;4. doi: 10.1177/1534582305276837. [DOI] [PubMed] [Google Scholar]
- 90.Vuilleumier P, Mohr C, Valenza N, Wetzel C, Landis T. Hyperfamiliarity for unknown faces after left lateral temporo-occipital venous infarction: a double dissociation with prosopagnosia. Brain. 2003;126:889–907. doi: 10.1093/brain/awg086. [DOI] [PubMed] [Google Scholar]
- 91.Parton A, Malhotra P, Husain M. Hemispatial neglect. J Neurol Neurosurg Psychiatry. 2004;75:13–21. [PMC free article] [PubMed] [Google Scholar]
- 92.Noesselt T, Driver J, Heinze H-J, Dolan R. Asymmetrical activation in the human brain during processing of fearful faces. Curr Biol. 2005;15. doi: 10.1016/j.cub.2004.12.075. [DOI] [PubMed] [Google Scholar]
- 93.Qiu Z, Lei X, Becker SI, Pegna AJ. Neural activities during the Processing of unattended and unseen emotional faces: a voxel-wise Meta-analysis. Brain Imaging Behav. 2022;16:2426–2443. doi: 10.1007/s11682-022-00697-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 94.Weiskrantz L, Warrington EK, Sanders MD, Marshall J. Visual capacity in the hemianopic field following a restricted occipital ablation. Brain. 1974;97:709–728. doi: 10.1093/brain/97.1.709. [DOI] [PubMed] [Google Scholar]
- 95.de Gelder B, Pourtois G, van Raamsdonk M, Vroomen J, Weiskrantz L. Unseen stimuli modulate conscious visual experience: evidence from inter-hemispheric summation. Neuroreport. 2001;12:385–391. doi: 10.1097/00001756-200102120-00040. [DOI] [PubMed] [Google Scholar]
- 96.de Gelder B, Vroomen J, Pourtois G, Weiskrantz L. Non-conscious recognition of affect in the absence of striate cortex. Neuroreport. 1999;10:3759–3763. doi: 10.1097/00001756-199912160-00007. [DOI] [PubMed] [Google Scholar]
- 97.Pegna AJ, Khateb A, Lazeyras F, Seghier ML. Discriminating emotional faces without primary visual cortices involves the right amygdala. Nat Neurosci. 2005;8:24–25. doi: 10.1038/nn1364. [DOI] [PubMed] [Google Scholar]
- 98.Méndez-Bértolo C, Moratti S, Toledano R, Lopez-Sosa F, Martínez-Alvarez R, Mah YH, Vuilleumier P, Gil-Nagel A, Strange BA. A fast pathway for fear in human amygdala. Nat Neurosci. 2016;19:1041–1049. doi: 10.1038/nn.4324. [DOI] [PubMed] [Google Scholar]
- 99.Morris JS, Ohman A, Dolan RJ. A subcortical pathway to the right amygdala mediating “unseen” fear. Proc Natl Acad Sci U S A. 1999;96:1680–1685. doi: 10.1073/pnas.96.4.1680. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 100.Kragel PA, Čeko M, Theriault J, Chen D, Satpute AB, Wald LW, Lindquist MA, Feldman Barrett L, Wager TD. A human colliculus-pulvinar-amygdala pathway encodes negative emotion. Neuron. 2021;109:2404–2412.e8. doi: 10.1016/j.neuron.2021.06.001. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 101.Sanchez-Lopez J, Cardobi N, Pedersini CA, Savazzi S, Marzi CA. What cortical areas are responsible for blindsight in hemianopic patients? Cortex. 2020;132:113–134. doi: 10.1016/j.cortex.2020.08.007. [DOI] [PubMed] [Google Scholar]
- 102.Cauchoix M, Crouzet SM. How plausible is a subcortical account of rapid visual recognition? Front Hum Neurosci. 2013;7:39. doi: 10.3389/fnhum.2013.00039. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 103.Pessoa L, Adolphs R. Emotion processing and the amygdala: from a “low road” to “many roads” of evaluating biological significance. Nat Rev Neurosci. 2010;11:773–783. doi: 10.1038/nrn2920. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 104.Kawasaki H, Kaufman O, Damasio H, Damasio AR, Granner M, Bakken H, Hori T, Howard MA, Adolphs R. Single-neuron responses to emotional visual stimuli recorded in human ventral prefrontal cortex. Nat Neurosci. 2001;4:15–16. doi: 10.1038/82850. [DOI] [PubMed] [Google Scholar]
- 105.Klingner CM, Brodoehl S, Huonker R, Witte OW. The Processing of Somatosensory Information Shifts from an Early Parallel into a Serial Processing Mode: A Combined fMRI/MEG Study. Frontiers in systems neuroscience. 2016;10:103. doi: 10.3389/fnsys.2016.00103. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 106.Klingner CM, Brodoehl S, Huonker R, Gotz T, Baumann L, Witte OW. Parallel processing of somatosensory information: Evidence from dynamic causal modeling of MEG data. Neuroimage. 2015;118:193–198. doi: 10.1016/j.neuroimage.2015.06.028. [DOI] [PubMed] [Google Scholar]
- 107.Ekman P, Oster H. Facial expressions of emotion. Annual Review of Psychology. 1979;30:527–554. doi: 10.1146/annurev.ps.30.020179.002523. [DOI] [Google Scholar]
- 108.Weinberger S. Airport security: Intent to deceive? Nature. 2010;465:412–415. doi: 10.1038/465412a. [DOI] [PubMed] [Google Scholar]
- 109.Honts CR, Hartwig M, Kleinman S, Meissner CA. Credibility Assessment at Portals. Portals Committee Report; 2009.
- 110.Zeng Z, Pantic M, Roisman GI, Huang TS. A survey of affect recognition methods: audio, visual, and spontaneous expressions. IEEE Trans Pattern Anal Mach Intell. 2009;31:39–58. doi: 10.1109/TPAMI.2008.52. [DOI] [PubMed] [Google Scholar]
- 111.Bettadapura V. Face Expression Recognition and Analysis: The State of the Art. 2012 doi: 10.48550/ARXIV.1203.6722. [DOI] [Google Scholar]
- 112.Sariyanidi E, Gunes H, Cavallaro A. Automatic Analysis of Facial Affect: A Survey of Registration, Representation, and Recognition. IEEE Trans Pattern Anal Mach Intell. 2015;37:1113–1133. doi: 10.1109/TPAMI.2014.2366127. [DOI] [PubMed] [Google Scholar]
- 113.Oh Y-H, See J, Le Ngo AC, Phan RC-W, Baskaran VM. A Survey of Automatic Facial Micro-Expression Analysis: Databases, Methods, and Challenges. Front Psychol. 2018;9:1128. doi: 10.3389/fpsyg.2018.01128. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 114.Ekman P. Telling Lies: Clues to Deceit in the Marketplace, Politics, and Marriage. Reissue ed. W. W. Norton & Co; 2009. [Google Scholar]
- 115.Porter S, ten Brinke L. Reading between the lies: identifying concealed and falsified emotions in universal facial expressions. Psychol Sci. 2008;19:508–514. doi: 10.1111/j.1467-9280.2008.02116.x. [DOI] [PubMed] [Google Scholar]
- 116.Frank M, Herbasz M, Sinuk K, Keller A, Nolan C. I see how you feel: Training laypeople and professionals to recognize fleeting emotions. 2009:1–35. [Google Scholar]
- 117.Ekman P. 2002.
- 118.Seidenstat P, Splane FX. 2009.
- 119.Guerdelli H, Ferrari C, Barhoumi W, Ghazouani H, Berretti S. Macro- and Micro-Expressions Facial Datasets: A Survey. Sensors (Basel) 2022;22:1524. doi: 10.3390/s22041524. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 120.Davison AK, Lansley C, Costen N, Tan K, Yap MH. SAMM: A Spontaneous Micro-Facial Movement Dataset. IEEE Trans. Affective Comput. 2018;9:116–129. doi: 10.1109/TAFFC.2016.2573832. [DOI] [Google Scholar]
- 121.Zong Y, Huang X, Zheng W, Cui Z, Zhao G. Learning From Hierarchical Spatiotemporal Descriptors for Micro-Expression Recognition. IEEE Transactions on Multimedia. 2018;20:3160–3172. doi: 10.1109/TMM.2018.2820321. [DOI] [Google Scholar]
- 122.Zheng H. Micro-Expression Recognition based on 2D Gabor Filter and Sparse Representation. J. Phys.: Conf. Ser. 2017;787:12013. doi: 10.1088/1742-6596/787/1/012013. [DOI] [Google Scholar]
- 123.Kim DH, Baddar WJ, Ro YM. Micro-Expression Recognition with Expression-State Constrained Spatio-Temporal Feature Representations. In Proceedings of the 24th ACM international conference on Multimedia MM ’16. (Association for Computing Machinery) 2016:pp. 382–386. doi: 10.1145/2964284.2967247. [DOI] [Google Scholar]
- 124.Leong S-M, Noman F, Phan RC-W, Baskaran VM, Ting C-M. GraphEx: Facial Action Unit Graph for Micro-Expression Classification. In 2022 IEEE International Conference on Image Processing (ICIP) 2022:3296–3300. doi: 10.1109/ICIP46576.2022.9897873. [DOI] [Google Scholar]
- 125.Song B, Zong Y, Li K, Zhu J, Shi J, Zhao L. Cross-Database Micro-Expression Recognition Based on a Dual-Stream Convolutional Neural Network. IEEE Access. 2022;10:66227–66237. doi: 10.1109/ACCESS.2022.3185132. [DOI] [Google Scholar]