ABSTRACT.
Recognizing other people's emotions is an important skill for social contexts and can be modulated by variables such as gender, age, and race. A number of studies have sought to develop specific face databases to assess the recognition of basic emotions in different contexts.
Objectives:
This systematic review sought to gather these studies, describing and comparing the methodologies used in their elaboration.
Methods:
The databases used to select the articles were the following: PubMed, Web of Science, PsycInfo, and Scopus. The following search string was used: “Facial expression database OR Stimulus set AND development OR Validation.”
Results:
A total of 36 articles were included. Most of the studies used actors to express emotions that were elicited from specific situations in order to generate the most spontaneous expression possible. The databases were mainly composed of colored, static stimuli. In addition, most of the studies sought to establish and describe standards for recording the stimuli, such as the color of the garments worn and the background. The psychometric properties of the databases are also described.
Conclusions:
The data presented in this review point to methodological heterogeneity among the studies. Nevertheless, we describe their common patterns, contributing to the planning of new research studies that seek to create databases for new contexts.
Keywords: Facial Expression, Validation Study, Emotions, Facial Recognition, Psychometrics
INTRODUCTION
Emotions play an important role in social life, as they enable interaction among people. According to evolutionary theories, all emotions derive from a set of basic emotions that are common to humans and animals and genetically determined 1,2 . One of the ways we recognize another person's emotion is through facial expressions, since the face is one of the most expressive visual stimuli in social life 3 . The ability to recognize emotions through the face can already be observed in newborns, which supports the innate nature of this skill 4 .
Based on a study using a systematized task, Ekman and Friesen 5 postulated six basic emotions that are related to evolutionary adaptations and can be universally recognized, namely, happiness, sadness, fear, disgust, surprise, and anger. In addition, they found that cultural aspects did not modulate the way in which these emotions were expressed 5 . Thus, the evidence indicated that all human beings display the same facial muscle movements under certain circumstances 6,7 , turning the ability to express emotions into a behavioral phenotype.
However, a number of studies began to notice that, within this phenotype common to human beings, some variables could modulate the recognition of these facial expressions, such as cultural context 8 , age 9 , gender 10 , and race 11 . Taking these variables into account, several studies started to construct and validate specific face databases to assess the ability to recognize emotions through facial expressions 12–16 , since, when selecting a set of facial expression stimuli, it is necessary to consider the characteristics of the models who express the emotions, as well as of those who will recognize them.
Therefore, the existing facial expression databases present great diversity with regard to the physical characteristics of those who express the emotions, the way in which emotions are induced during the construction of the image database, and how they are presented in the validation stage 12–14 . Despite the methodological differences across the studies, they follow important standards for the construction and validation of the series of stimuli. Comparing the methodology used by the studies in the creation of these databases, regardless of the characteristics of who expresses the stimuli, can contribute to the planning of new research studies that seek to create face databases for new contexts. Thus, the objective of this systematic review was to gather studies that constructed face databases to assess the recognition of facial expressions of basic emotions, describing and comparing the methodologies used in the stimuli construction phase.
METHODS
Search strategies and eligibility criteria
The search strategy for this systematic review was created and implemented prior to study selection, in accordance with the checklist presented in the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 17 . The databases used to select the articles were the following: PubMed, Web of Science, PsycInfo, and Scopus. The following search string was used: “Facial expression database OR Stimulus set AND development OR Validation.” The searches were conducted between June and December 8, 2021.
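PubMed, like the other databases searched, typically processes Boolean operators from left to right unless terms are grouped with parentheses, so search strings of this kind are usually entered with explicit grouping. As a minimal illustration, the sketch below submits a parenthesized version of the search string to PubMed through the public NCBI E-utilities endpoint; the grouping, parameter values, and helper function are assumptions for demonstration and do not reproduce the exact queries run for this review.

```python
import requests

# Hypothetical, explicitly parenthesized version of the review's search string.
# The grouping is an assumption; the review reports the string without parentheses.
QUERY = '("Facial expression database" OR "Stimulus set") AND (development OR validation)'

# NCBI E-utilities esearch endpoint for PubMed (one of the four databases searched).
ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_search(query: str, max_results: int = 100) -> list[str]:
    """Return PubMed IDs matching the query (illustrative sketch only)."""
    params = {
        "db": "pubmed",
        "term": query,
        "retmax": max_results,
        "retmode": "json",
    }
    response = requests.get(ESEARCH_URL, params=params, timeout=30)
    response.raise_for_status()
    return response.json()["esearchresult"]["idlist"]

if __name__ == "__main__":
    pmids = pubmed_search(QUERY)
    print(f"{len(pmids)} records retrieved for screening")
```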
The lists of references of the selected articles were also searched for additional sources. The inclusion criteria were studies that constructed face databases to assess the recognition of basic emotions, published as original articles or disclosed on official websites, without language or time restrictions. Letters to the editor, books and book chapters, reviews, comments, notes, errata, theses, dissertations, and bibliographic/systematic reviews were excluded. In addition, it is worth noting that only the construction stage of the databases was included in this review.
Therefore, additional studies conducted after construction, such as normative data studies, were not included in the analysis.
Study selection
All the articles found in the databases were saved in the Rayyan electronic reference manager. After removing duplicate articles and according to the inclusion criteria of this study, all articles were evaluated by two independent researchers (DF and BF) through their titles and abstracts. In this stage, the researchers classified the articles as “yes,” “no,” or “perhaps.” Subsequently, the researchers reached consensus as to whether the articles recorded as “perhaps” should be included in the review.
After the inclusion of these studies, three researchers (DM, BF, and MB) read the articles in full and extracted information such as year of publication and study locus, name of the database built, characteristics of the participants who expressed the emotions (number of participants, place of recruitment, gender, age and race), basic emotions expressed, and final total of stimuli included in the database and their specific characteristics (Table 1) 12–16,25–63 . Subsequently, the methodological characteristics of the databases were collected, such as the method used to elicit the emotions, patterns in the capture of stimuli, criteria used in the validation stage, sample characteristics in the validation stage, and psychometric qualities assessed (Table 2) 12–16,25–63 .
Table 1. General characteristics of face databases.
Authors and year of publication | Study location | Name of the built database | Theoretical reference | Characteristics of participants who expressed emotions | Basic emotions expressed | Total of stimuli | Specific characteristics of stimuli |
---|---|---|---|---|---|---|---|
Benda and Scherf (2020) 25 | The United States | Complex Emotion Expression Database (CEED) | Recognition of complex emotions in young people (Empirical) | 8 professional actors | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise | 243 images | |
Chung et al. (2019) 26 | South Korea | Yonsei Face Database (YFace DB) | Basic emotions (Ekman and Friesen, 1975) 56 | 74 local community and university volunteers | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 1,480 stimuli | |
Conley et al. (2018) 16 | The United States | The racially diverse affective expression (RADIATE) | Racial heterogeneity in emotion recognition (Empirical) | 109 community adults | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 1,721 images | |
Dalrymple et al. (2013) 27 | The United States | The Dartmouth Database of Children's Faces | Recognition of emotions in children (Empirical) | 80 community children | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 964 images | |
Donadon et al. (2019) 28 | Brazil | Baby Faces | Ekman's Neurocultural Theory (1972) 57 | 20 babies | 1) Happiness 2) Sadness 3) Fear 4) Anger 5) Surprise 6) Neutral | 57 images | |
Ebner et al. (2010) 13 | Germany | FACES--a life-span Database of Facial Expressions | Age differences in emotion recognition (Ruffman et al., 2008) 58 | 179 actors and extras recruited from a modeling agency | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Neutral | 2,052 images | |
Egger et al. (2011) 29 | The United States | NIMH Child Emotional Faces Picture Set (NIMH-ChEFS) | Recognition of emotions in children (Empirical) | 59 child actors | 1) Happiness 2) Sadness 3) Fear 4) Anger 5) Neutral | 482 images | |
Ekman and Friesen (1976) 30 | The United States | Pictures of Facial Affect (POFA) | Pan-cultural elements in facial expressions of emotions (Ekman et al., 1969) 5 | 10 individuals | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 110 images | |
Fujimura and Umemura (2018) 31 | Japan | A facial expression database based on the dimensional and categorical model of emotions | The influence of angles on emotion recognition (Borod et al., 1998) 59 | 8 professional actors | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 920 stimuli | |
Franz et al. (2021) 32 | Germany | Picture-Set of Young Children's Affective Facial Expressions (PSYCAFE) | Recognition of emotions in children (Empirical) | 35 children | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 104 images | |
Garrido et al. (2017) 33 | Portugal | Stills and Videos of facial Expressions (SAVE database) | Recognition of emotions in dynamic stimuli (Empirical) | 20 students | 1) Happiness 2) Neutral | 120 stimuli | |
Giuliani et al. (2017) 15 | The United States | The DuckEES child and adolescent dynamic facial expressions stimulus set | Recognition of emotions in dynamic stimuli (Empirical) | 37 children and teenage actors | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Neutral | 120 videos | |
Happy et al. (2015) 34 | India | The Indian Spontaneous Expression Database for Emotion Recognition (ISED) | Basic emotions (Ekman and Friesen, 1975) 56 | 50 individuals | 1) Happiness 2) Sadness 3) Disgust 4) Surprise | 428 videos | |
Kaulard et al. (2012) 35 | Germany | The MPI Facial Expression Database | Language and emotions (Empirical) | 19 native Germans without professional acting experience | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger | 18,800 videos | |
Keutmann et al. (2015) 36 | The United States | Visual and vocal emotional expressions of adult and child actors | Item Response Theory in face database construction (Empirical) | 150 actors (adults: n=139; children: n=11) | 1) Happiness 2) Sadness 3) Fear 4) Anger 5) Neutral | 152 stimuli | |
Kim et al. (2017) 37 | South Korea | Korea University Facial Expression Collection – Second Edition (KUFEC-II) | The role of culture in recognizing emotions (Empirical) | 57 actors | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 399 images | |
Langner et al. (2010) 38 | Netherlands | Radboud Faces Database | The influence of angles and direction of gaze on emotion recognition (Empirical) | 49 young adults and children (young adults: 39; children: 10) | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 5,880 images | |
LoBue and Thrasher (2015) 14 | The United States | The Child Affective Facial Expression (CAFE) | Recognition of emotions in children's faces of different races (Empirical) | 154 children | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 1,192 images | |
Lundqvist et al. (1998) 39 | Sweden | Karolinska Directed Emotional Faces (KDEF) Database | - | 70 actors | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 490 images | |
Ma et al. (2020) 40 | China | Han, Hui, and Tibetan Chinese facial expression database | The role of culture in recognizing emotions (Empirical) | 630 volunteers | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 930 images | |
Ma et al. (2015) 41 | The United States | Chicago Face Database (CFD) | Limitations of existing face databases (Empirical) | 158 individuals from the University of Chicago Laboratory and amateur actors | 1) Happiness 2) Fear 3) Neutral | 158 images | |
Maack et al. (2017) 42 | Norway | The Tromso Infant Faces Database (TIF) | Influence of child stimuli on the adult attention system (Brosch et al., 2007; Parsons et al., 2011; Borgi et al., 2014) 60-62 | 18 babies | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 119 images | |
Meuwissen et al. (2017) 43 | The United States | Developmental Emotional Faces Stimulus Set (DEFSS) | Limitations of existing face databases (Empirical) | 116 volunteers (42 children, 44 teenagers, 30 adults) | 1) Happiness 2) Sadness 3) Fear 4) Anger 5) Neutral | 404 images | |
Minear and Park (2004) 44 | The United States | A lifespan database of adult facial stimuli | Influence of age on emotion recognition (Empirical) | 576 community volunteers | 1) Happiness 2) Neutral | 1,142 images | |
Negrão et al. (2021) 45 | Brazil | The Child Emotion Facial Expression Set | Recognition of emotions in children (Empirical) | 132 children | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 971 stimuli | |
Novello et al. (2018) 46 | Brazil | Youth Emotion Picture Set | Recognition of facial emotions in teens (Empirical) | 31 randomly selected volunteers | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 42 images | |
O'Reilly et al. (2016) 47 | The United Kingdom | The EU-Emotion Stimulus Set | Limitations of existing face databases (Empirical) | 19 actors | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 249 videos | |
Olszanowski et al. (2015) 48 | Poland | Warsaw Set of Emotional Facial Expression Pictures (WSEFEP) | Limitations of existing face databases (Empirical) | 30 professional actors | 1) Happiness 2) Sadness 3) Fear 4) Anger 5) Surprise 6) Neutral | 210 images | |
Passareli et al. (2018) 49 | Italy | Facial Expression Recognition Test (FERT) | Basic emotions (Ekman and Friesen, 1975) 56 and Item Response Theory (Reise and Revicki, 2014) 63 | 6 professional actors | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 42 images | |
Romani-Sponchiado et al. (2015) 50 | Brazil | Child Emotions Picture Set | Recognition of facial emotions in children (Empirical) | 18 children | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 225 images | |
Samuelsson et al. (2012) 51 | Sweden | Umeå University Database of Facial Expressions | Limitations of existing face databases (Empirical) | 60 community individuals | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 424 images | |
Sharma and Bhushan (2019) 52 | India | Indian Affective Picture | Basic emotions (Ekman and Friesen, 1975) 56 and limitations of existing face databases (Empirical) | 4 professional actors | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 140 images | |
Tottenham et al. (2009) 12 | The United States | The NimStim set of facial expressions | Basic emotions (Ekman and Friesen, 1975) 56 and limitations of existing face databases (Empirical) | 43 professional actors | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 672 images | |
Tracy et al. (2009) 53 | Canada | University of California, Davis, Set of Emotion Expressions (UCDS) | Basic emotions (Ekman and Friesen, 1975) 56 and limitations of existing face databases (Empirical) | 28 community individuals | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise | 73 images | |
Vaiman et al. (2017) 54 | Argentina | Expresiones de Emociones Faciales (FACS) | The role of culture in recognizing emotions (Empirical) | 14 Argentines from the community | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 60 images | |
Yang et al. (2020) 55 | China | Tsinghua facial expression database | The role of culture in recognizing emotions (Empirical) | 63 young and 47 elderly Chinese natives with an interest in acting | 1) Happiness 2) Sadness 3) Fear 4) Disgust 5) Anger 6) Surprise 7) Neutral | 880 images | |
ND: not declared; M: male; F: female; SD: standard deviation.
Additional features of the face database.
Table 2. Methodological characteristics used in the studies to create the databases.
Authors and year of publication | Name of the database elaborated | Method used to elicit the emotions | Patterns in stimulus capture | Criteria used in the validation stage for inclusion of stimuli in the final database | Sample characteristics in the stage for the validation of the stimuli | Psychometric properties assessed |
---|---|---|---|---|---|---|
Benda and Scherf (2020) 25 | Complex Emotion Expression Database (CEED) | 1) Presentation of an equivalent photograph expressing the emotion 2) Emotions elicited from specific situations | | Accuracy ≥50% | 796 volunteers recruited through MTurk | |
Chung et al. (2019) 26 | Yonsei Face Database (YFace DB) | 1) Presentation of an equivalent photograph expressing the emotion 2) Instruction on muscle movement of the emotions based on the FACS 3) Emotions elicited from specific situations | | Accuracy, intensity, and naturalness | 212 students from the Seoul University | |
Conley et al. (2018) 16 | The racially diverse affective expression (RADIATE) | Presentation of an equivalent photograph expressing the emotion | | Accuracy and Cohen's kappa | 662 participants recruited through MTurk | |
Dalrymple et al. (2013) 27 | The Dartmouth Database of Children's Faces | Emotions elicited from specific situations | | Images recognized with ≥70% accuracy | 163 students and members of the Dartmouth College academic community | |
Donadon et al. (2019) 28 | Baby Faces | The parents were instructed and trained to provoke the intended emotions | ND | Rasch model to minimize floor and ceiling effects, with values from 0.50 to 1.50; rate of correct answers according to Kringelbach et al. (2008) 64 | 119 volunteers from the community | |
Ebner et al. (2010) 13 | FACES--a life-span Database of Facial Expressions | 1) Emotion induction through photographs and videos 2) Emotions elicited from specific situations | | Agreement among evaluators for (1) purity of the facial expression and (2) high intensity of the facial expression | 154 students | |
Egger et al. (2011) 29 | NIMH Child Emotional Faces Picture Set (NIMH-ChEFS) | | | The cutoff point for the image to be included was that ≥15 evaluators identified the intended emotion | 20 professors and employees of the Duke University Medical Center | |
Ekman and Friesen (1976) 30 | Pictures of Facial Affect (POFA) | Instruction on muscle movement of the emotions based on FACS | ND | ND | ND | ND |
Fujimura and Umemura (2018) 31 | A facial expression database based on the dimensional and categorical model of emotions | 1) Emotions elicited from specific situations 2) Instruction on muscle movement of the emotions based on FACS | | Agreement among the evaluators (mean of 69% agreement, SD=21%) | 39 university students | |
Franz et al. (2021) 32 | Picture-Set of Young Children's Affective Facial Expressions (PSYCAFE) | 1) Guidance of emotions in theater workshops 2) Directed Facial Action Task used to guide the movement of anatomical landmarks | | Step 1: confirmatory hierarchical cluster analysis (Ward's method); Step 2: intensity, authenticity, and likeability; accuracy (77-100%) and AFFDEX software | Step 1: 197 volunteers from the community | |
Garrido et al. (2017) 33 | Stills and Videos of facial Expressions (SAVE database) | Emotions elicited from specific situations | | Stimuli with an assessment of 2.5 SD above or below the mean | 120 university students | |
Giuliani et al. (2017) 15 | The DuckEES child and adolescent dynamic facial expressions stimulus set | Emotions elicited from specific situations | | Images recognized with ≥70% accuracy | 36 volunteers from the University of Oregon | |
Happy et al. (2015) 34 | The Indian Spontaneous Expression Database for Emotion Recognition (ISED) | Emotion induction through videos | | Agreement among the evaluators (Fleiss’ kappa) | Four trained evaluators | |
Kaulard et al. (2012) 35 | The MPI Facial Expression Database | Emotions elicited from specific situations | | Consistency among the evaluators (Fleiss’ kappa) | 20 German natives | |
Keutmann et al. (2015) 36 | Visual and vocal emotional expressions of adult and child actors | Emotions elicited from specific situations | | Accuracy | 510 students: 226 from Drexel University and 284 from the University of Central Florida | |
Kim et al. (2017) 37 | Korea University Facial Expression Collection – Second Edition (KUFEC-II) | Instruction on muscle movement of the emotions based on FACS | | Internal consistency; accuracy | 75 evaluators | |
Langner et al. (2010) 38 | Radboud Faces Database | Instruction on muscle movement of the emotions based on FACS | | Accuracy | 276 students from Radboud University | |
LoBue and Thrasher (2015) 14 | The Child Affective Facial Expression (CAFE) | Instruction on muscle movement of the emotions based on FACS, carried out during improvised games | | Images recognized with ≥60% accuracy | | |
Lundqvist et al. (1998) 39 | Karolinska Directed Emotional Faces (KDEF) Database | The participants were free to express the emotion as they wished | Background: neutral; clothes: gray T-shirt; distractors removed: beard, mustache, earrings, glasses, and makeup | ND | ND | ND |
Ma et al. (2020) 40 | Han, Hui, and Tibetan Chinese facial expression database | 1) Emotion induction through photographs and videos 2) Instruction on muscle movement of the emotions based on FACS | | Images recognized with ≥60% accuracy | | |
Ma et al. (2015) 41 | Chicago Face Database (CFD) | 1) Emotions expressed from verbal instructions 2) Presentation of an equivalent photograph expressing the emotion | | Two independent judges assessed how believable the expression was on a Likert scale from 1 to 9 (1=not at all believable; 9=very believable) | 1,087 evaluators (convenience sample) | |
Maack et al. (2017) 42 | The Tromso Infant Faces Database (TIF) | The parents were instructed to elicit the intended emotions with games and specific stimuli | | The photographs with the best agreement among the evaluators were selected; mean classification of clarity and intensity below 2.5; validation: (a) expression portrayed, (b) clarity of expression, (c) intensity of expression, and (d) valence of expression | 720 participants | |
Meuwissen et al. (2017) 43 | Developmental Emotional Faces Stimulus Set (DEFSS) | 1) Emotions elicited from specific situations 2) Presentation of an equivalent photograph expressing the emotion | | Images recognized by less than 55% of the evaluators were excluded | 228 university students (undergraduate and graduate) and children enrolled by their families via the Internet | |
Minear and Park (2004) 44 | A lifespan database of adult facial stimuli | Emotions expressed from verbal instructions | | ND | ND | ND |
Negrão et al. (2021) 45 | The Child Emotion Facial Expression Set | 1) Presentation of an equivalent photograph expressing the emotion 2) Emotions elicited from specific situations | | Step 1: 100% agreement between two evaluators; Step 2: 100% agreement between another two evaluators | Four judges (two in each step) | |
Novello et al. (2018) 46 | Youth Emotion Picture Set | 1) Emotions elicited from specific situations 2) Presentation of an equivalent photograph expressing the emotion 3) Presentation of videos and a game to specifically elicit the emotion of anger | | Images recognized with ≥75% accuracy | Adults: 101 volunteers recruited through the snowball method | |
O'Reilly et al. (2016) 47 | The EU-Emotion Stimulus Set | Emotions elicited from specific situations | | Accuracy | 1,231 volunteers | |
Olszanowski et al. (2015) 48 | Warsaw Set of Emotional Facial Expression Pictures (WSEFEP) | Instruction on muscle movement of the emotions based on FACS | | Agreement in recognition | 1,362 participants | |
Passareli et al. (2018) 49 | Facial Expression Recognition Test (FERT) | Presentation of an equivalent photograph expressing the emotion | | Unidimensional model | 794 volunteers from the community | |
Romani-Sponchiado et al. (2015) 50 | Child Emotions Picture Set | Emotion induction through videos | | Images recognized with ≥60% accuracy | 30 psychologists with experience in child development | |
Samuelsson et al. (2012) 51 | Umeå University Database of Facial Expressions | Instruction on muscle movement of the emotions based on FACS | | Accuracy | 526 participants | |
Sharma and Bhushan (2019) 52 | Indian Affective Picture | 1) Presentation of an equivalent photograph expressing the emotion 2) Emotions elicited from specific situations | | Accuracy; intensity (9-point scale) | 350 undergraduate students | |
Tottenham et al. (2009) 12 | The NimStim set of facial expressions | Emotions expressed from verbal instructions | | Validity (accuracy and Cohen's kappa) and reliability | Group 1: 47 university students | |
Tracy et al. (2009) 53 | University of California, Davis, Set of Emotion Expressions (UCDS) | Instruction on muscle movement of the emotions based on FACS | | Accuracy (the most recognized emotion of each expression was included in the final database) | Study 1: 175 undergraduate students | |
Vaiman et al. (2017) 54 | Expresiones de Emociones Faciales (FACS) | Emotions elicited from specific situations | | Images recognized with ≥70% accuracy | 466 students from the School of Psychology of the National University of Córdoba | |
Yang et al. (2020) 55 | Tsinghua facial expression database | 1) Emotions elicited from specific situations 2) Instruction on muscle movement of the emotions based on FACS | | Images recognized with ≥70% accuracy | 34 young individuals and 31 older adults, Chinese | |
ND: not declared; M: male; F: female; MTurk: Amazon Mechanical Turk; FACS: Facial Action Coding System (Ekman and Friesen, 1978) 65 ; ANCOVA: analysis of covariance; ANOVA: repeated-measures analysis of variance.
Only images with ≥50% accuracy were included in the final database;
Satisfactory indexes; ‡There was a significant difference in precision between the analyzed variables;
The mean rate of correct identification of the emotions was 62.5%; //Only images recognized by ≥15 evaluators were included in the final database;
There was no significant difference in precision between the analyzed variables;
The mean rate of correct identification of the emotions was 66%;
Only images with ≥60% accuracy were included in the final database;
Accuracy is presented for each emotion and varied from 44 to 100%;
Only images recognized by at least 55% of the evaluators were included in the final database. The mean recognition of the final database was 63%;
The mean recognition rate of the final database varied from 47 to 94%.
Risk of bias
The studies selected in this review involve the construction of face databases. In this sense, the traditional risk of bias tools used in randomized and nonrandomized studies are not applicable. The task elaborated by each study must offer valid and interpretable data for the assessment of facial recognition of basic emotions in certain contexts. Therefore, the quality of the included studies can be observed based on the analyses performed for the reliability and validity of the elaborated databases 18,19 .
Data analysis
We analyzed the psychometric properties assessed by the studies in the stage for the validation of the stimuli (Table 2) 64,65 . This information is important to assess the quality of the database that was elaborated. Qualitatively, we followed the standards for educational and psychological testing of the American Educational Research Association 20 and the stages specified in Resolution 09-2018 of the Brazilian Federal Council of Psychology 21 , which regulates the dimensions necessary for the assessment of psychological tests. Consequently, information based on the analysis of the database items and the measures for validity evidence were obtained (Table 2).
In addition, we sought to indicate in Table 2 when the psychometric measure assessed by a study presented satisfactory indexes. For accuracy, we used as a reference standard the consensus among most face database construction studies, which is to include stimuli with recognition rates ≥70%. In some cases, the studies established other recognition rates, which are indicated with symbols in the table.
Since accuracy is a fundamental indicator for stimulus selection and has been widely used as a quality parameter in construction studies, this variable is included in the table as an indicator of both precision and content-based validity evidence, since it is a precision measure that was used to validate the database content. For agreement among the evaluators, the studies generally use Cohen's or Fleiss' kappa; we therefore used a value ≥0.60 as a reference 22,23 . For internal consistency, we used a Cronbach's alpha value >0.70 as a reference 24 .
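As a minimal, hypothetical sketch of how these reference indexes could be computed from validation data, the example below calculates per-stimulus recognition accuracy, Cohen's kappa between two raters, and Cronbach's alpha from an item score matrix. The data, variable names, and use of scikit-learn are illustrative assumptions and do not reproduce the analyses of any study included in this review.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Accuracy: proportion of raters identifying the intended emotion per stimulus.
# Rows = stimuli, columns = raters; 1 = intended emotion recognized, 0 = not recognized.
hits = np.array([
    [1, 1, 1, 0, 1],
    [1, 0, 1, 1, 1],
    [0, 1, 1, 1, 0],
])  # hypothetical ratings
accuracy_per_stimulus = hits.mean(axis=1)
print("Stimuli meeting the >=70% criterion:", np.where(accuracy_per_stimulus >= 0.70)[0])

# Agreement: Cohen's kappa between two raters labeling the same stimuli.
rater_a = ["happiness", "sadness", "fear", "anger", "fear"]
rater_b = ["happiness", "sadness", "anger", "anger", "fear"]
kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa = {kappa:.2f} (reference used in this review: >= 0.60)")

# Internal consistency: Cronbach's alpha over a respondents-by-items score matrix.
def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: respondents x items matrix of item scores."""
    n_items = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (n_items / (n_items - 1)) * (1 - item_variances / total_variance)

scores = np.array([
    [1, 1, 0, 1],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
    [1, 0, 1, 1],
])  # hypothetical item scores
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f} (reference: > 0.70)")
```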
RESULTS
Selection and presentation of the studies
Figure 1 presents the search and selection process for the 36 articles included in this systematic review 12–17,25–63 .
Figure 1. The article selection process according to the PRISMA initiative recommendations 17 .
Table 1 presents the general characteristics of the face databases included and Table 2 presents the methodological characteristics used to create each of them.
General characteristics of the face databases included
The articles included were published between 1976 and 2021, the majority between 2015 and 2017. Of the 36 articles included, 30.56% were carried out in the United States. In relation to the theoretical framework used for the construction of the databases, 75% of the studies were empirically based; in other words, the limitations of previously built databases were the basis for their construction.
Most of the articles (61.1%) elaborated databases comprising the six basic emotions (i.e., happiness, sadness, fear, anger, disgust, and surprise) as well as neutral faces. Some databases did not include neutral faces or the emotions of surprise and disgust. Two databases included only happiness and neutral faces, one database included only happiness, fear, and neutral faces, and another included only happiness, sadness, disgust, and surprise.
In relation to the participants, 41.7% of the selected studies resorted to actors (either amateur or professional) to express the emotions. The mean age of the actors varied from 13.24 to 73.2 years, with four studies including different age groups in their databases. Only five of the studies with actors included different races in their samples, and seven studies included a single specific race, namely, Caucasian, Japanese, Korean, Polish, Indian, or Chinese. Three studies did not report the actors’ race.
The other studies, that is, those in which the basic emotions were expressed by community-dwelling individuals from various contexts, included participants with ages varying from 4 months to 93 years, and five of these studies included volunteers of different ages. Of these, 10 studies included participants of different races, and the remaining studies included only one race, namely, Korean, Caucasian, Indian, or Chinese. Three studies did not report the participants’ race. With regard to the presentation of the stimuli, 86.1% of the studies included colored faces in their databases, four studies used black-and-white faces, and one study included both colored and black-and-white faces.
Most of the databases included (75%) present static stimuli, four comprise dynamic stimuli, and five include both static and dynamic stimuli. Five studies presented open- and closed-mouth expressions, and other studies included additional features, such as varying intensities and angles. The final number of stimuli included in the databases varied from 42 to 18,800.
Methodological characteristics used in the studies
Method used to elicit the emotions
The method used to elicit the emotions varied across the studies, and, in general, more than one method was used at this stage. Predominantly, 44.4% of the studies used specific situations as one of the ways to elicit the intended emotions, such as “Imagine that you have just won the lottery” or “Imagine that you have just lost a loved one.” Other studies used instructions based on the muscle movements of the emotions, following protocols such as the Investigator's Guide for the Facial Action Coding System (FACS); others used a photograph as a model; and others elicited the emotions from photographs and/or videos.
Two studies that built databases of infant and child faces used an instructional protocol, performed by the parents, to elicit the intended emotions. In one study, the individuals could express the emotion any way they wanted. Three studies elicited emotions through verbal instructions, such as “Make a happy face,” and one study used workshops to teach children how to express basic emotions, as well as a Directed Facial Action Task to guide the movement of anatomical landmarks.
Recording the stimuli
Most of the studies sought to establish and describe standards for recording the stimuli. For example, the images were photographed against a white, black, or gray background, and the individuals wore black or white garments. In addition, 55.6% of the studies established distractors that should be removed from the volunteers before the images were recorded, such as jewelry, accessories, and strong makeup.
Validation stage
The number of participants who validated the faces constructed by the studies varied from 4 to 1,362, and most of them were recruited in a university context. The way to validate the final stimuli in the database varied across the studies. The majority included recognition accuracy as one of the criteria, with included images reaching recognition rates from >50% to ≥75%. The studies also used other criteria to include the stimuli in the final database, such as agreement among the evaluators.
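As an illustration of the most common inclusion rule described above, the sketch below keeps only the stimuli whose intended emotion was recognized by at least 70% of the raters. The column names and the long-format layout are assumptions; as noted, the actual thresholds varied from >50% to ≥75% across studies.

```python
import pandas as pd

# Hypothetical long-format validation data: one row per rater response.
responses = pd.DataFrame({
    "stimulus_id": ["s01", "s01", "s01", "s02", "s02", "s02"],
    "intended_emotion": ["fear", "fear", "fear", "happiness", "happiness", "happiness"],
    "chosen_emotion": ["fear", "surprise", "fear", "happiness", "happiness", "happiness"],
})

# A hit occurs when the rater's choice matches the intended emotion.
responses["hit"] = responses["intended_emotion"] == responses["chosen_emotion"]

# Recognition rate per stimulus and inclusion decision at the 70% criterion.
recognition = responses.groupby("stimulus_id")["hit"].mean().rename("recognition_rate")
included = recognition[recognition >= 0.70].index.tolist()
print(recognition)
print("Stimuli retained in the final database:", included)
```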
Psychometric properties of the final database
Only one study did not include accuracy as a precision measure. In most cases, accuracy was also used to validate the task content and even for item analysis. One study also used the split-half method as a precision measure. In 66.7% of the studies, the stimuli were recognized with ≥70% accuracy.
Test-retest reliability was used to assess task precision in four studies, all presenting satisfactory indexes for this dimension. Regarding the measures of validity evidence, 10 studies used Cohen's or Fleiss' kappa to validate the task content according to the agreement among the evaluators, all presenting satisfactory indexes in this dimension. Only one study used Cronbach's alpha to assess internal consistency, also reporting a satisfactory value.
Six studies analyzed item difficulty. Three studies used Item Response Theory (IRT); one study analyzed difficulty according to intensity and representativeness scores; one study used Classical Test Theory (CTT); and one study analyzed item discrimination.
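As a minimal sketch of the classical test theory indices mentioned above, item difficulty can be estimated as the proportion of raters who recognize the intended emotion of each stimulus, and item discrimination as the corrected item-total correlation. The response matrix below is hypothetical and only illustrates the computation.

```python
import numpy as np

# Hypothetical response matrix: respondents x stimuli, 1 = intended emotion recognized.
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 1, 0, 1],
    [1, 1, 0, 0],
])

# CTT item difficulty: proportion of respondents recognizing each stimulus.
difficulty = responses.mean(axis=0)

# CTT item discrimination: correlation between each item and the total score
# computed without that item (corrected item-total correlation).
n_items = responses.shape[1]
discrimination = np.empty(n_items)
for item in range(n_items):
    rest_score = responses[:, np.arange(n_items) != item].sum(axis=1)
    discrimination[item] = np.corrcoef(responses[:, item], rest_score)[0, 1]

print("Difficulty (proportion correct):", np.round(difficulty, 2))
print("Discrimination (corrected item-total r):", np.round(discrimination, 2))
```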
Two studies presented validity evidence based on the internal structure: one used exploratory factor analysis and the other used factor analysis through a two-parameter Bayesian model. In addition, one study presented validity evidence based on convergent relationships, descriptively comparing the database built with the POFA set, with satisfactory indexes.
Fourteen (38.9%) studies presented validity evidence based on the relationship with other variables.
DISCUSSION
The ability to recognize emotional facial expressions can be modulated by variables such as gender, age, and race. In this sense, a number of studies have sought to elaborate valid facial expression databases to assess the recognition of emotions in specific populations and contexts. However, the methodological heterogeneity among construction studies can make it difficult to establish patterns for constructing these stimuli, regardless of the context and of the characteristics of those who express them. This systematic review sought to gather the studies that built face databases to assess the recognition of basic emotions, describing and comparing the methodologies used in their development.
General characteristics of the face databases included
The way to present the stimuli of an emotion recognition test has already been the target of discussion among researchers in the area, since a pioneering study showed that the recognition of static and dynamic facial emotional stimuli involves different neural areas 66 . In this review, most of the studies consist of static stimulus databases. The difference in the recognition of static versus dynamic stimuli remains an open question, given that some studies report a higher recognition rate for dynamic stimuli 67,68 , while others point to a minimal or no difference between them 69,70 .
Khosdelazad et al. 71 investigated differences in performance across three emotion recognition tests in 84 healthy participants. The results point to a clear difference in performance between tests with static and dynamic stimuli, with stimuli that change from a neutral face to the intended emotion (dynamic) being the most difficult to recognize, given the low performance in that test 71 . However, it is noteworthy that variables such as age and schooling also modulated performance, highlighting the importance of normative data regardless of the type of stimulus chosen 71 .
Several stimulus databases of facial expressions of emotions were developed to be used in specific populations and cultures 72 . Cultural issues must be taken into account when understanding these emotional expressions, as they can influence their recognition 73 . A study that considered ethnicity as an influencing factor in the performance of emotion recognition tasks, comparing the ability to identify emotions between Australian and Chinese individuals, verified that people perform worse when classifying emotions expressed on faces of another ethnicity 74 . In this sense, the cultural characteristics of the stimulus presented can also modulate performance in the test.
In addition to the difference in response pattern when recognizing emotions from another culture, studies showed that there is also a difference in the pattern of intensity recognized, regardless of the race or gender of the stimulus presented 75,76 . This probably happens because we manage our emotions according to what we learn throughout our lives, clearly shaped by the cultural context in which we are inserted 76,77 . Thus, we learn in certain situations to hide or amplify our emotions, which consequently affects how we recognize emotions and highlights the clear influence of culture on our social and cognitive abilities 76,78 .
Furthermore, when we think about the modulating character of the cultural context in the recognition of emotions, it is important to highlight the impact that socioeconomic status can also have on this ability. In particular, countries and regions with greater socioeconomic disparities may show different patterns of cognitive abilities 79 . For example, a large international study investigated the influence of nationality on core social cognition skills in 587 participants from 12 countries 80 .
After controlling for other modulating variables such as age, sex, and education, the results showed that 20.76% (95%CI 8.26–35.69) of the variation in the score of the test that evaluated emotion recognition could be attributed to the nationality of the individuals evaluated 80 . These results make us reflect on the cultural disparities that exist in underdeveloped countries and on how these aspects can influence social and cognitive variables, including the recognition of emotions discussed here.
In addition, aspects related to the stimuli themselves can also interfere with task performance. Five studies in this review presented open- and closed-mouth expressions, and other studies included additional features such as varying intensities, gaze directions, and angles. These variables can also modulate task performance. Emotions expressed with the mouth open seem to increase the intensity of the emotion perceived by the observer 81,82 . Consequently, incorporating this facial variation into the database can be important to assess the emotion experienced by the individual who recognizes the stimuli. In addition, open-mouthed facial expressions seem to draw the respondent's attention more than closed-mouthed expressions 81 .
Hoffmann et al. 83 found a correlation between the intensity of an emotion and recognition accuracy, in which higher intensities were associated with greater accuracy in the perception of the face. However, Wingenbach et al. 84 did not find effects of intensity level on expression recognition. Despite these conflicting results, emotion intensity can still be an important variable to take into account in the construction of databases, in order to compare recognition across different degrees of intensity.
The perception of the expressed emotion can also be modulated by the gaze direction of the person expressing it 85 , such that recognition is better when the gaze is directed at the participant than when it is averted 86 . In addition, photographing the expressions from different angles can increase the ecological validity of the database 38 .
Methodological characteristics used in the studies
Method used to elicit the emotions
An important methodological choice in studies that elaborate face databases is the way in which the stimuli will be elicited and who is going to express them. Our results show that most of the studies included in this systematic review resorted to actors (either amateur or professional) to express the emotions. Such a methodological choice can be justified by the fact that people with acting experience are able to express more realistic emotions than individuals without any experience 87 . Thus, resorting to actors to act out emotions can be advantageous with regard to bringing the expressed emotions closer to a real context.
The literature indicates that there are three different ways to induce emotions, namely, posed, induced, and spontaneous emotions:
Posed emotions are those expressed by actors or under specific guidance, and tend to be less representative of an emotion expressed in a real context 89 . Induced emotions have a more genuine character than posed emotions, as varied eliciting stimuli are presented to the participant in order to generate the most spontaneous emotion possible 89 . However, this way of inducing emotion can also have limitations as to its veracity, since the induction is carried out in a context controlled by the researcher 89 . Spontaneous emotions are considered the closest to a real-life context; however, recording them would only be possible when individuals are not aware that they are being recorded, so any research procedure can bias this spontaneity 89 .
To increase induction effectiveness, the studies use a combination of techniques and procedures to facilitate achievement of the intended emotions. Among the 36 studies analyzed in this review, 44.4% used specific hypothetical situations as one of the ways to elicit the intended emotions, such as “Imagine that you have just won the lottery; imagine that you have just lost a loved one.” Thus, despite induction being generated in a controlled context, using hypothetical everyday situations aims at remedying the limitation of expressions that are not very representative of real life.
Recording the stimuli
All construction studies try to capture the stimuli following some kind of standard; some describe this standard in detail, while others are more succinct. Despite this, the data included in this review indicate that it is important to standardize the clothes worn by the participants and the background against which they are positioned during stimulus capture.
In addition, most construction studies have established distractors that should be removed prior to image capture, such as jewelry, accessories, and strong makeup. Our hypothesis is that these distractors could direct the attention of those who respond to the task and exert an impact on recognition performance, since attention can be a modulating variable in emotional tasks 90 .
Validation stage
The way to validate the stimuli in the elaborated databases varies greatly across studies, with the validation criteria defined based on the methods used in construction. Accuracy is the precision indicator most used in the development and validation of face databases that assess recognition of emotions 12,13 , which is why it was presented in most of the studies included. A recognition rate ≥70% is the most frequently used criterion. However, the choice of criterion at this stage varies, and it is common to adopt other rates and criteria to validate the database, such as intensity, clarity, and agreement between evaluators.
Psychometric properties of the final database
We sought to follow the standards established by Resolution 09-2018 of the Brazilian Federal Council of Psychology, which regulates the dimensions necessary for the assessment of psychological tests, to verify the psychometric qualities of the databases. Although the studies describe the construction of tasks rather than full instruments, recognition of emotions is an important skill that allows interaction in society and can be used to assess social cognition and to predict the diagnosis of mental disorders 91 .
The analyses presented by the studies at this stage are also heterogeneous. However, some dimensions are strictly necessary to verify the quality of the elaborated database. With regard to the technical requirements, it is important to evaluate dimensions related to precision and to validity evidence of the constructed task 20,21 . It is worth noting that normative data are also important to assess the quality of the task; however, this variable and other important analyses were not included in this review, as they are reported in separately published articles.
This review showed that the studies that elaborate face databases for the recognition of emotions present heterogeneous methods. However, similarities between the studies allow us to trace important patterns for the development of these stimuli, such as using more than one method to elicit the most spontaneous emotion possible, standardizing the characteristics of the volunteers for capturing the stimuli, validating the database based on preestablished criteria, and presenting data referring to precision and validity evidence. With regard to future directions related to the research methods, greater standardization of the methods for eliciting and validating emotions would make the choice of the type of task to be used in each context more reliable.
Footnotes
This study was conducted by the Study and Research Group on Mental Health, Cognition and Aging – ProViVe, Universidade Federal de São Carlos, São Carlos, SP, Brazil.
Funding: This study was financed in part by the Brazilian fostering agencies: Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES [Coordination for the Advancement of Higher Education Personnel]), finance code 001. DMF is a recipient of a scholarship from CAPES (grant: # 88887.338752/2019-00) and MAMB is a recipient of a scholarship from Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP [State of São Paulo Research Assistance Foundation], process: 20/04936-4).
REFERENCES
- 1.Darwin C. The expression of the emotions in man and animals. Chicago: University of Chicago Press; 2015. [Google Scholar]
- 2.Plutchik R. The nature of emotions: human emotions have deep evolutionary roots, a fact that may explain their complexity and provide tools for clinical practice. American Scientist. 2001;89(4):344–350. [Google Scholar]
- 3.Palermo R, Rhodes G. Are you always on my mind? A review of how face perception and attention interact. Neuropsychologia. 2007;45(1):75–92. doi: 10.1016/j.neuropsychologia.2006.04.025. [DOI] [PubMed] [Google Scholar]
- 4.Pascalis O, Slater A. The development of face processing in early childhood. New York: Nova Science Publishers; 2003. [Google Scholar]
- 5.Ekman P, Sorenson ER, Friesen WV. Pan-cultural elements in facial displays of emotion. Science. 1969;164(3875):86–88. doi: 10.1126/science.164.3875.86. [DOI] [PubMed] [Google Scholar]
- 6.Ekman P. Facial expression and emotion. Am Psychol. 1993;48(4):384–392. doi: 10.1037/0003-066X.48.4.384. [DOI] [PubMed] [Google Scholar]
- 7.Schmidt KL, Cohn JF. Human facial expressions as adaptations: evolutionary questions in facial expression research. Am J Phys Anthropol. 2001;(Suppl 33):3–24. doi: 10.1002/ajpa.20001. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 8.Barrett LF, Mesquita B, Gendron M. Context in emotion perception. Current Directions in Psychological Science. 2011;20(5):286–290. doi: 10.1177/0963721411422522. [DOI] [Google Scholar]
- 9.Ebner NC. Age of face matters: age-group differences in ratings of young and old faces. Behav Res Methods. 2008;40(1):130–136. doi: 10.3758/brm.40.1.130. [DOI] [PubMed] [Google Scholar]
- 10.Chaplin TM, Aldao A. Gender differences in emotion expression in children: a meta-analytic review. Psychol Bull. 2013;139(4):735–765. doi: 10.1037/a0030737. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 11.Zebrowitz LA, Kikuchi M, Fellous JM. Facial resemblance to emotions: group differences, impression effects, and race stereotypes. J Pers Soc Psychol. 2010;98(2):175–189. doi: 10.1037/a0017990. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 12.Tottenham N, Tanaka JW, Leon AC, McCarry T, Nurse M, Hare TA, et al. The NimStim set of facial expressions: judgments from untrained research participants. Psychiatry Res. 2009;168(3):242–249. doi: 10.1016/j.psychres.2008.05.006. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 13.Ebner NC, Riediger M, Lindenberger U. FACES--a database of facial expressions in young, middle-aged, and older women and men: development and validation. Behav Res Methods. 2010;42(1):351–362. doi: 10.3758/BRM.42.1.351. [DOI] [PubMed] [Google Scholar]
- 14.LoBue V, Thrasher C. The Child Affective Facial Expression (CAFE) set: validity and reliability from untrained adults. Front Psychol. 2015;5:1532–1532. doi: 10.3389/fpsyg.2014.01532. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 15.Giuliani NR, Flournoy JC, Ivie EJ, Von Hippel A, Pfeifer JH. Presentation and validation of the DuckEES child and adolescent dynamic facial expressions stimulus set. Int J Methods Psychiatr Res. 2017;26(1):e1553–e1553. doi: 10.1002/mpr.1553. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 16.Conley MI, Dellarco DV, Rubien-Thomas E, Cohen AO, Cervera A, Tottenham N, et al. The racially diverse affective expression (RADIATE) face stimulus set. Psychiatry Res. 2018;270:1059–1067. doi: 10.1016/j.psychres.2018.04.066. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 17.Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372(71) doi: 10.1136/bmj.n71. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 18.Cook DA, Beckman TJ. Current concepts in validity and reliability for psychometric instruments: theory and application. Am J Med. 2006;119(2):166.e7–166.e16. doi: 10.1016/j.amjmed.2005.10.036. [DOI] [PubMed] [Google Scholar]
- 19.Pittman J, Bakas T. Measurement and instrument design. J Wound Ostomy Continence Nurs. 2010;37(6):603–607. doi: 10.1097/WON.0b013e3181f90a60. [DOI] [PubMed] [Google Scholar]
- 20.American Educational Research Association, American Psychological Association, National Council on Measurement in Education . Standards for educational and psychological testing. Washington: American Educational Research Association; 2014. [Google Scholar]
- 21.Brasil. Conselho Federal de Psicologia Estabelece diretrizes para a realização de Avaliação Psicológica no exercício profissional da psicóloga e do psicólogo, regulamenta o Sistema de Avaliação de Testes Psicológicos - SATEPSI e revoga as Resoluções n° 002/2003, n° 006/2004 e n° 005/2012 e Notas Técnicas n° 01/2017 e 02/2017. [[cited on Dec 01, 2022]]. Resolução n° 9, de 25 de abril de 2018. Available from: https://satepsi.cfp.org.br/docs/ResolucaoCFP009-18.pdf .
- 22.Cohen J. A coefficient of agreement for nominal scales. Education and Psychological Measurement. 1960;20(1):37–46. doi: 10.1177/001316446002000104. [DOI] [Google Scholar]
- 23.Fleiss JL. Measuring nominal scale agreement among many raters. Psychological Bulletin. 1971;76(5):378–382. doi: 10.1037/h0031619. [DOI] [Google Scholar]
- 24.Cortina JM. What is coefficient alpha? An examination of theory and applications. J Appl Psychol. 1993;78(1):98–104. doi: 10.1037/0021-9010.78.1.98. [DOI] [Google Scholar]
- 25.Benda MS, Scherf KS. The complex emotion expression database: a validated stimulus set of trained actors. PLoS One. 2020;15(2):e0228248. doi: 10.1371/journal.pone.0228248. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26.Chung KM, Kim S, Jung WH, Kim Y. Development and validation of the Yonsei face database (YFace DB) Front Psychol. 2019;10:2626–2626. doi: 10.3389/fpsyg.2019.02626. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27.Dalrymple KA, Gomez J, Duchaine B. The dartmouth database of children's faces: acquisition and validation of a new face stimulus set. PLoS One. 2013;8(11):e79131. doi: 10.1371/journal.pone.0079131. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 28.Donadon MF, Martin-Santos R, Osório FL. Baby faces: development and psychometric study of a stimuli set based on babies’ emotions. J Neurosci Methods. 2019;311:178–185. doi: 10.1016/j.jneumeth.2018.10.021. [DOI] [PubMed] [Google Scholar]
- 29.Egger HL, Pine DS, Nelson E, Leibenluft E, Ernst M, Towbin KE, et al. The NIMH Child Emotional Faces Picture Set (NIMH-ChEFS): a new set of children's facial emotion stimuli. Int J Methods Psychiatr Res. 2011;20(3):145–156. doi: 10.1002/mpr.343. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 30.Ekman P, Friesen WV. Pictures of facial affect. Palo Alto: Consulting Psychologists Press; 1976. [Google Scholar]
- 31.Fujimura T, Umemura H. Development and validation of a facial expression database based on the dimensional and categorical model of emotions. Cogn Emot. 2018;32(8):1663–1670. doi: 10.1080/02699931.2017.1419936. [DOI] [PubMed] [Google Scholar]
- 32.Franz M, Müller T, Hahn S, Lundqvist D, Rampoldt D, Westermann JF, et al. Creation and validation of the Picture-Set of Young Children's Affective Facial Expressions (PSYCAFE). PLoS One. 2021;16(12):e0260871. doi: 10.1371/journal.pone.0260871.
- 33.Garrido MV, Lopes D, Prada M, Rodrigues D, Jerónimo R, Mourão RP. The many faces of a face: comparing stills and videos of facial expressions in eight dimensions (SAVE database). Behav Res Methods. 2017;49(4):1343–1360. doi: 10.3758/s13428-016-0790-5.
- 34.Happy SL, Patnaik P, Routray A, Guha R. The Indian spontaneous expression database for emotion recognition. IEEE Transactions on Affective Computing. 2015;8(1):131–142. doi: 10.1109/TAFFC.2015.2498174.
- 35.Kaulard K, Cunningham DW, Bülthoff HH, Wallraven C. The MPI facial expression database – a validated database of emotional and conversational facial expressions. PLoS One. 2012;7(3):e32321. doi: 10.1371/journal.pone.0032321.
- 36.Keutmann MK, Moore SL, Savitt A, Gur RC. Generating an item pool for translational social cognition research: methodology and initial validation. Behav Res Methods. 2015;47(1):228–234. doi: 10.3758/s13428-014-0464-0.
- 37.Kim SM, Kwon YJ, Jung SY, Kim MJ, Cho YS, Kim HT, et al. Development of the Korean facial emotion stimuli: Korea University facial expression collection 2nd edition. Front Psychol. 2017;8:769. doi: 10.3389/fpsyg.2017.00769.
- 38.Langner O, Dotsch R, Bijlstra G, Wigboldus DHJ, Hawk ST, van Knippenberg A. Presentation and validation of the Radboud Faces Database. Cognition and Emotion. 2010;24(8):1377–1388. doi: 10.1080/02699930903485076.
- 39.Lundqvist D, Flykt A, Öhman A. The Karolinska directed emotional faces – KDEF (CD-ROM). Stockholm: Karolinska Institute, Department of Clinical Neuroscience, Psychology Section; 1998.
- 40.Ma J, Yang B, Luo R, Ding X. Development of a facial-expression database of Chinese Han, Hui and Tibetan people. Int J Psychol. 2020;55(3):456–464. doi: 10.1002/ijop.12602.
- 41.Ma DS, Correll J, Wittenbrink B. The Chicago face database: a free stimulus set of faces and norming data. Behav Res Methods. 2015;47(4):1122–1135. doi: 10.3758/s13428-014-0532-5.
- 42.Maack JK, Bohne A, Nordahl D, Livsdatter L, Lindahl ÅAW, Øvervoll M, et al. The Tromso Infant Faces Database (TIF): development, validation and application to assess parenting experience on clarity and intensity ratings. Front Psychol. 2017;8:409. doi: 10.3389/fpsyg.2017.00409.
- 43.Meuwissen AS, Anderson JE, Zelazo PD. The creation and validation of the developmental emotional faces stimulus set. Behav Res Methods. 2017;49(3):960–966. doi: 10.3758/s13428-016-0756-7.
- 44.Minear M, Park DC. A lifespan database of adult facial stimuli. Behav Res Methods Instrum Comput. 2004;36(4):630–633. doi: 10.3758/bf03206543.
- 45.Negrão JG, Osorio AAC, Siciliano RF, Lederman VRG, Kozasa EH, D'Antino MEF, et al. The child emotion facial expression set: a database for emotion recognition in children. Front Psychol. 2021;12:666245. doi: 10.3389/fpsyg.2021.666245.
- 46.Novello B, Renner A, Maurer G, Musse S, Arteche A. Development of the youth emotion picture set. Perception. 2018;47(10-11):1029–1042. doi: 10.1177/0301006618797226.
- 47.O'Reilly H, Pigat D, Fridenson S, Berggren S, Tal S, Golan O, et al. The EU-emotion stimulus set: a validation study. Behav Res Methods. 2016;48(2):567–576. doi: 10.3758/s13428-015-0601-4.
- 48.Olszanowski M, Pochwatko G, Kuklinski K, Scibor-Rylski M, Lewinski P, Ohme RK. Warsaw set of emotional facial expression pictures: a validation study of facial display photographs. Front Psychol. 2015;5:1516. doi: 10.3389/fpsyg.2014.01516.
- 49.Passarelli M, Masini M, Bracco F, Petrosino M, Chiorri C. Development and validation of the Facial Expression Recognition Test (FERT). Psychol Assess. 2018;30(11):1479–1490. doi: 10.1037/pas0000595.
- 50.Romani-Sponchiado A, Sanvicente-Vieira B, Mottin C, Hertzog-Fonini D, Arteche A. Child Emotions Picture Set (CEPS): development of a database of children's emotional expressions. Psychology & Neuroscience. 2015;8(4):467–478. doi: 10.1037/h0101430.
- 51.Samuelsson H, Jarnvik K, Henningsson H, Andersson J, Carlbring P. The Umeå university database of facial expressions: a validation study. J Med Internet Res. 2012;14(5):e136. doi: 10.2196/jmir.2196.
- 52.Sharma U, Bhushan B. Development and validation of Indian Affective Picture Database. Int J Psychol. 2019;54(4):462–467. doi: 10.1002/ijop.12471.
- 53.Tracy JL, Robins RW, Schriber RA. Development of a FACS-verified set of basic and self-conscious emotion expressions. Emotion. 2009;9(4):554–559. doi: 10.1037/a0015766.
- 54.Vaiman M, Wagner MA, Caicedo E, Pereno GL. Development and validation of an Argentine set of facial expressions of emotion. Cogn Emot. 2017;31(2):249–260. doi: 10.1080/02699931.2015.1098590.
- 55.Yang T, Yang Z, Xu G, Gao D, Zhang Z, Wang H, et al. Tsinghua facial expression database – a database of facial expressions in Chinese young and older women and men: development and validation. PLoS One. 2020;15(4):e0231304. doi: 10.1371/journal.pone.0231304.
- 56.Ekman P, Friesen WV. Unmasking the face: a guide to recognizing emotions from facial clues. New Jersey: Prentice-Hall; 1975.
- 57.Ekman P. Universals and cultural differences in facial expressions of emotion. In: Cole J, editor. Nebraska Symposium on Motivation. Lincoln: University of Nebraska Press; 1972. pp. 207–282.
- 58.Ruffman T, Henry JD, Livingstone V, Phillips LH. A meta-analytic review of emotion recognition and aging: implications for neuropsychological models of aging. Neurosci Biobehav Rev. 2008;32(4):863–881. doi: 10.1016/j.neubiorev.2008.01.001.
- 59.Borod JC, Koff E, Yecker S, Santschi C, Schmidt JM. Facial asymmetry during emotional expression: gender, valence, and measurement technique. Neuropsychologia. 1998;36(11):1209–1215. doi: 10.1016/s0028-3932(97)00166-8.
- 60.Brosch T, Sander D, Scherer KR. That baby caught my eye… attention capture by infant faces. Emotion. 2007;7(3):685–689. doi: 10.1037/1528-3542.7.3.685.
- 61.Parsons CE, Young KS, Kumari N, Stein A, Kringelbach ML. The motivational salience of infant faces is similar for men and women. PLoS One. 2011;6(5):e20632. doi: 10.1371/journal.pone.0020632.
- 62.Borgi M, Cogliati-Dezza I, Brelsford V, Meints K, Cirulli F. Baby schema in human and animal faces induces cuteness perception and gaze allocation in children. Front Psychol. 2014;5:411. doi: 10.3389/fpsyg.2014.00411.
- 63.Reise SP, Revicki DA. Handbook of item response theory modeling. New York: Taylor & Francis; 2014.
- 64.Kringelbach ML, Lehtonen A, Squire S, Harvey AG, Craske MG, Holliday IE, et al. A specific and rapid neural signature for parental instinct. PLoS One. 2008;3(2):e1664. doi: 10.1371/journal.pone.0001664.
- 65.Ekman P, Friesen WV. Facial action coding system. Palo Alto: Consulting Psychologists Press; 1978.
- 66.Humphreys GW, Donnelly N, Riddoch MJ. Expression is computed separately from facial identity, and it is computed separately for moving and static faces: neuropsychological evidence. Neuropsychologia. 1993;31(2):173–181. doi: 10.1016/0028-3932(93)90045-2.
- 67.Cunningham DW, Wallraven C. Dynamic information for the recognition of conversational expressions. J Vis. 2009;9(13):7.1–7.17. doi: 10.1167/9.13.7.
- 68.Knappmeyer B, Thornton IM, Bülthoff HH. The use of facial motion and facial form during the processing of identity. Vision Res. 2003;43(18):1921–1936. doi: 10.1016/s0042-6989(03)00236-0.
- 69.Gold JM, Barker JD, Barr S, Bittner JL, Bromfield WD, Chu N, et al. The efficiency of dynamic and static facial expression recognition. J Vis. 2013;13(5):23. doi: 10.1167/13.5.23.
- 70.Fiorentini C, Viviani P. Is there a dynamic advantage for facial expressions? J Vis. 2011;11(3):17. doi: 10.1167/11.3.17.
- 71.Khosdelazad S, Jorna LS, McDonald S, Rakers SE, Huitema RB, Buunk AM, et al. Comparing static and dynamic emotion recognition tests: performance of healthy participants. PLoS One. 2020;15(10):e0241297. doi: 10.1371/journal.pone.0241297.
- 72.Ferreira BLC, Fabrício DM, Chagas MHN. Are facial emotion recognition tasks adequate for assessing social cognition in older people? A review of the literature. Arch Gerontol Geriatr. 2021:104277. doi: 10.1016/j.archger.2020.104277.
- 73.Matsumoto D, Hwang HS, Yamada H. Cultural differences in the relative contributions of face and context to judgments of emotions. Journal of Cross-Cultural Psychology. 2012;43(2):198–218. doi: 10.1177/0022022110387426.
- 74.Craig BM, Zhang J, Lipp OV. Facial race and sex cues have a comparable influence on emotion recognition in Chinese and Australian participants. Atten Percept Psychophys. 2017;79(7):2212–2223. doi: 10.3758/s13414-017-1364-z.
- 75.Matsumoto D. Ethnic differences in affect intensity, emotion judgments, display rule attitudes, and self-reported emotional expression in an American sample. Motiv Emot. 1993;17:107–123. doi: 10.1007/BF00995188.
- 76.Engelmann JB, Pogosyan M. Emotion perception across cultures: the role of cognitive mechanisms. Front Psychol. 2013;4:118. doi: 10.3389/fpsyg.2013.00118.
- 77.Ekman P, Friesen WV. Constants across cultures in the face and emotion. J Pers Soc Psychol. 1971;17(2):124–129. doi: 10.1037/h0030377.
- 78.Park DC, Huang CM. Culture wires the brain: a cognitive neuroscience perspective. Perspect Psychol Sci. 2010;5(4):391–400. doi: 10.1177/1745691610374591.
- 79.Daugherty JC, Puente AE, Fasfous AF, Hidalgo-Ruzzante N, Pérez-Garcia M. Diagnostic mistakes of culturally diverse individuals when using North American neuropsychological tests. Appl Neuropsychol Adult. 2017;24(1):16–22. doi: 10.1080/23279095.2015.1036992.
- 80.Quesque F, Coutrot A, Cox S, de Souza LC, Baez S, Cardona JF, et al. Culture shapes our understanding of others’ thoughts and emotions: an investigation across 12 countries. PsyArXiv. 2020. doi: 10.31234/osf.io/tg2ay.
- 81.Langeslag SJE, Gootjes L, van Strien JW. The effect of mouth opening in emotional faces on subjective experience and the early posterior negativity amplitude. Brain Cogn. 2018;127:51–59. doi: 10.1016/j.bandc.2018.10.003.
- 82.Horstmann G, Lipp OV, Becker SI. Of toothy grins and angry snarls – open mouth displays contribute to efficiency gains in search for emotional faces. J Vis. 2012;12(5):7. doi: 10.1167/12.5.7.
- 83.Hoffmann H, Kessler H, Eppel T, Rukavina S, Traue HC. Expression intensity, gender and facial emotion recognition: women recognize only subtle facial emotions better than men. Acta Psychol (Amst). 2010;135(3):278–283. doi: 10.1016/j.actpsy.2010.07.012.
- 84.Wingenbach TSH, Ashwin C, Brosnan M. Sex differences in facial emotion recognition across varying expression intensity levels from videos. PLoS One. 2018;13(1):e0190634. doi: 10.1371/journal.pone.0190634.
- 85.Adams RB, Jr, Kleck RE. Effects of direct and averted gaze on the perception of facially communicated emotion. Emotion. 2005;5(1):3–11. doi: 10.1037/1528-3542.5.1.3.
- 86.Strick M, Holland RW, van Knippenberg A. Seductive eyes: attractiveness and direct gaze increase desire for associated objects. Cognition. 2008;106(3):1487–1496. doi: 10.1016/j.cognition.2007.05.008.
- 87.Scherer KR, Bänziger T. On the use of actor portrayals in research on emotional expression. In: Scherer KR, Bänziger T, Roesch E, editors. Blueprint for affective computing: a sourcebook. Oxford: Oxford University Press; 2010. pp. 166–176.
- 88.Wu CH, Lin JC, Wei WL. Survey on audiovisual emotion recognition: databases, features, and data fusion strategies. APSIPA Transactions on Signal and Information Processing. 2014;3(1):e12. doi: 10.1017/ATSIP.2014.11.
- 89.Haamer RE, Rusadze E, Lüsi I, Ahmed T, Escalera S, Anbarjafari G. Review on emotion recognition databases. In: Anbarjafari G, Escalera S, editors. Human-robot interaction: theory and application. London: IntechOpen; 2017. pp. 39–63.
- 90.Srivastava P, Srinivasan N. Emotional information modulates the temporal dynamics of visual attention. Perception. 2008;37:1–29.
- 91.American Psychiatric Association. Diagnostic and statistical manual of mental disorders. 5th ed. Washington: American Psychiatric Press; 2013.