Table 2. Methodological characteristics of the studies that created the databases.
| Authors and year of publication | Name of the database elaborated | Method used to elicit the emotions | Patterns in stimulus capture | Criteria used in the validation stage for inclusion of stimuli in the final database | Sample characteristics in the stage for the validation of the stimuli | Psychometric properties assessed |
|---|---|---|---|---|---|---|
| Benda and Scherf (2020) 25 | Complex Emotion Expression Database (CEED) | 1) Presentation of an equivalent photograph expressing the emotion; 2) emotions elicited from specific situations | | Accuracy ≥50% | 796 volunteers recruited through MTurk | |
| Chung et al. (2019) 26 | Yonsei Face Database (YFace DB) | 1) Presentation of an equivalent photograph expressing the emotion; 2) instruction on muscle movement of the emotions based on the FACS; 3) emotions elicited from specific situations | | Accuracy, intensity, and naturalness | 212 students from Seoul University | |
| Conley et al. (2018) 16 | The Racially Diverse Affective Expression (RADIATE) | Presentation of an equivalent photograph expressing the emotion | | Accuracy and Cohen's kappa | 662 participants recruited through MTurk | |
| Dalrymple et al. (2013) 27 | The Dartmouth Database of Children's Faces | Emotions elicited from specific situations | | Images recognized with ≥70% accuracy | 163 students and members of the Dartmouth College academic community | |
| Donadon et al. (2019) 28 | Baby Faces | The parents were instructed and trained to provoke the intended emotions | ND | Rasch model to minimize floor and ceiling effects, with values from 0.50 to 1.50; rate of correct answers according to Kringelbach et al. (2008) 64 | Validation: 119 volunteers from the community | |
| Ebner et al. (2010) 13 | FACES: a life-span database of facial expressions | 1) Emotion induction through photographs and videos; 2) emotions elicited from specific situations | | Agreement among evaluators on (1) purity of the facial expression and (2) high intensity of the facial expression | 154 students | |
| Egger et al. (2011) 29 | NIMH Child Emotional Faces Picture Set (NIMH-ChEFS) | | | The cutoff point for inclusion of an image was that ≥15 evaluators identified the intended emotion | 20 professors and employees of the Duke University Medical Center | |
| Ekman and Friesen (1976) 30 | Pictures of Facial Affect (POFA) | Instruction on muscle movement of the emotions based on FACS | ND | ND | ND | ND |
| Fujimura and Umemura (2018) 31 | A facial expression database based on the dimensional and categorical model of emotions | 1) Emotions elicited from specific situations; 2) instruction on muscle movement of the emotions based on FACS | | Agreement among the evaluators; mean of 69% agreement among the evaluators (SD=21%) | 39 university students | |
| Franz et al. (2021) 32 | Picture-Set of Young Children's Affective Facial Expressions (PSYCAFE) | 1) Guidance of emotions in theater workshops; 2) Directed Facial Action Task used to guide the movement of anatomical landmarks | | Step 1: confirmatory hierarchical cluster analysis (Ward's method); Step 2: intensity, authenticity, and likeability; accuracy (77-100%) and AFFDEX software | Step 1: 197 volunteers from the community | |
| Garrido et al. (2017) 33 | Stills and Videos of Facial Expressions (SAVE database) | Emotions elicited from specific situations | | Stimuli with an assessment 2.5 SD above or below the mean | 120 university students | |
| Giuliani et al. (2017) 15 | The DuckEES child and adolescent dynamic facial expressions stimulus set | Emotions elicited from specific situations | | Images recognized with ≥70% accuracy | 36 volunteers from the University of Oregon | |
| Happy et al. (2015) 34 | The Indian Spontaneous Expression Database for Emotion Recognition (ISED) | Emotion induction through videos | | Agreement among the evaluators (Fleiss' kappa) | Four trained evaluators | |
| Kaulard et al. (2012) 35 | The MPI Facial Expression Database | Emotions elicited from specific situations | | Consistency among the evaluators (Fleiss' kappa) | 20 German natives | |
| Keutmann et al. (2015) 36 | Visual and vocal emotional expressions of adult and child actors | Emotions elicited from specific situations | | Accuracy | 510 students: 226 from Drexel University and 284 from the University of Central Florida | |
| Kim et al. (2017) 37 | Korea University Facial Expression Collection – Second Edition (KUFEC-II) | Instruction on muscle movement of the emotions based on FACS | | Internal consistency; accuracy | 75 evaluators | |
| Langner et al. (2010) 38 | Radboud Faces Database | Instruction on muscle movement of the emotions based on FACS | | Accuracy | 276 students from Radboud University | |
| LoBue and Thrasher (2015) 14 | The Child Affective Facial Expression (CAFE) | Instruction on muscle movement of the emotions based on FACS, carried out during improvised games | | Images recognized with ≥60% accuracy | | |
| Lundqvist et al. (1998) 39 | Karolinska Directed Emotional Faces (KDEF) database | The participants were free to express the emotion as they wished | Background: neutral; clothes: gray T-shirt; distractors removed: beard, mustache, earrings, glasses, and makeup | ND | ND | ND |
| Ma et al. (2020) 40 | Han, Hui, and Tibetan Chinese facial expression database | 1) Emotion induction through photographs and videos; 2) instruction on muscle movement of the emotions based on FACS | | Images recognized with ≥60% accuracy | | |
| Ma et al. (2015) 41 | Chicago Face Database (CFD) | 1) Emotions expressed from verbal instructions; 2) presentation of an equivalent photograph expressing the emotion | | Two independent judges assessed how believable the expression was on a Likert scale from 1 to 9 (1=not at all believable; 9=very believable) | 1,087 evaluators (convenience sample) | |
| Maack et al. (2017) 42 | The Tromsø Infant Faces Database (TIF) | The parents were instructed to elicit the intended emotions with games and specific stimuli | | The photographs with the best agreement among the evaluators were selected; mean classification of clarity and intensity below 2.5; validation: (a) expression portrayed, (b) clarity of the expression, (c) intensity of the expression, and (d) valence of the expression | 720 participants | |
| Meuwissen et al. (2017) 43 | Developmental Emotional Faces Stimulus Set (DEFSS) | 1) Emotions elicited from specific situations; 2) presentation of an equivalent photograph expressing the emotion | | Images recognized by fewer than 55% of the evaluators were excluded | 228 undergraduate and graduate university students, and children recruited by their families via the Internet | |
| Minear and Park (2004) 44 | A life span database of adult facial stimuli | Emotions expressed from verbal instructions | | ND | ND | ND |
| Negrão et al. (2021) 45 | The Child Emotion Facial Expression Set | 1) Presentation of an equivalent photograph expressing the emotion; 2) emotions elicited from specific situations | | Step 1: 100% agreement between two evaluators; Step 2: 100% agreement between another two evaluators (two in each step) | Four judges | |
| Novello et al. (2018) 46 | Youth Emotion Picture Set | 1) Emotions elicited from specific situations; 2) presentation of an equivalent photograph expressing the emotion; 3) presentation of videos and a game to specifically elicit the emotion of anger | | Images recognized with ≥75% accuracy | Adults: 101 volunteers recruited through the snowball method | |
| O'Reilly et al. (2016) 47 | The EU-Emotion Stimulus Set | Emotions elicited from specific situations | | Accuracy | 1,231 volunteers | |
| Olszanowski et al. (2015) 48 | Warsaw Set of Emotional Facial Expression Pictures (WSEFEP) | Instruction on muscle movement of the emotions based on FACS | | Agreement in recognition | 1,362 participants | |
| Passarelli et al. (2018) 49 | Facial Expression Recognition Test (FERT) | Presentation of an equivalent photograph expressing the emotion | | Unidimensional model | 794 volunteers from the community | |
| Romani-Sponchiado et al. (2015) 50 | Child Emotions Picture Set | Emotion induction through videos | | Images recognized with ≥60% accuracy | 30 psychologists with experience in child development | |
| Samuelsson et al. (2012) 51 | Umeå University Database of Facial Expressions | Instruction on muscle movement of the emotions based on FACS | | Accuracy | 526 participants | |
| Sharma and Bhushan (2019) 52 | Indian Affective Picture | 1) Presentation of an equivalent photograph expressing the emotion; 2) emotions elicited from specific situations | | Accuracy; intensity (9-point scale) | 350 undergraduate students | |
| Tottenham et al. (2009) 12 | The NimStim set of facial expressions | Emotions expressed from verbal instructions | | Validity (accuracy and Cohen's kappa) and reliability | Group 1: 47 university students | |
| Tracy et al. (2009) 53 | University of California, Davis, Set of Emotion Expressions (UCDS) | Instruction on muscle movement of the emotions based on FACS | | Accuracy (the most recognized emotion of each expression was included in the final database) | Study 1: 175 undergraduate students | |
| Vaiman et al. (2017) 54 | FACS | Emotions elicited from specific situations | | Images recognized with ≥70% accuracy | 466 students from the School of Psychology of the National University of Córdoba | |
| Yang et al. (2020) 55 | Tsinghua facial expression database | 1) Emotions elicited from specific situations; 2) instruction on muscle movement of the emotions based on FACS | | Images recognized with ≥70% accuracy | 34 young and 31 older Chinese adults | |
ND: not declared; M: male; F: female; MTurk: Amazon Mechanical Turk; FACS: Facial Action Coding System (Ekman and Friesen, 1978) 65 ; ANCOVA: analysis of covariance; ANOVA: repeated-measures analysis of variance.
Only images with ≥50% accuracy were included in the final database.
Satisfactory indexes. ‡There was a significant difference in precision between the analyzed variables.
The mean rate of correct identification of the emotions was 62.5%. //Only images recognized by ≥15 evaluators were included in the final database.
There was no significant difference in precision between the analyzed variables.
The mean rate of correct identification of the emotions was 66%.
Only images with ≥60% accuracy were included in the final database.
Accuracy is presented for each emotion and varied from 44 to 100%.
Only images recognized by at least 55% of the evaluators were included in the final database; the mean recognition of the final database was 63%.
The mean recognition rate of the final database varied from 47 to 94%.
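Two validation criteria recur throughout Table 2: a minimum per-stimulus recognition accuracy (≥50-75%, depending on the study) and chance-corrected inter-rater agreement (Cohen's or Fleiss' kappa). The sketch below illustrates both; it is a minimal illustration, not code from any of the cited studies — the function names, the toy ratings, and the count matrix are invented for this example, and Fleiss' kappa is computed with its standard formula.

```python
# Minimal sketch of two validation criteria from Table 2 (hypothetical data):
# (1) per-stimulus accuracy with a fixed inclusion threshold, and
# (2) Fleiss' kappa for agreement among a fixed panel of raters.
import numpy as np


def stimulus_accuracy(ratings: list[str], intended: str) -> float:
    """Share of raters who identified the intended emotion."""
    return sum(r == intended for r in ratings) / len(ratings)


def passes_threshold(ratings: list[str], intended: str,
                     threshold: float = 0.70) -> bool:
    """Inclusion rule of the kind used by, e.g., Dalrymple et al. (2013):
    keep the image only if accuracy is at least the threshold (here 70%)."""
    return stimulus_accuracy(ratings, intended) >= threshold


def fleiss_kappa(counts: np.ndarray) -> float:
    """Fleiss' kappa for an (n_items x n_categories) matrix of rating
    counts, assuming the same number of raters for every item."""
    n = counts.sum(axis=1)[0]                # raters per item
    p_j = counts.sum(axis=0) / counts.sum()  # overall category proportions
    # Observed per-item agreement, averaged over items:
    P_i = ((counts ** 2).sum(axis=1) - n) / (n * (n - 1))
    P_bar, P_e = P_i.mean(), (p_j ** 2).sum()
    return (P_bar - P_e) / (1 - P_e)


# Example: 10 raters judge two stimuli whose intended emotion is "happiness".
ratings_a = ["happiness"] * 8 + ["surprise"] * 2
ratings_b = ["happiness"] * 5 + ["neutral"] * 3 + ["sadness"] * 2
for name, r in [("A", ratings_a), ("B", ratings_b)]:
    print(name, stimulus_accuracy(r, "happiness"),
          passes_threshold(r, "happiness"))
# -> A 0.8 True / B 0.5 False

# Agreement example: 3 stimuli, 3 response categories, 10 raters per stimulus.
counts = np.array([[9, 1, 0],
                   [1, 8, 1],
                   [0, 1, 9]])
print(round(fleiss_kappa(counts), 3))  # -> 0.611
```

Fleiss' kappa generalizes Cohen's kappa from two raters to a panel of raters, which is why the studies that used small trained panels (e.g., the four evaluators in Happy et al., 2015) report it rather than raw percent agreement.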