Frontiers in Neurology. 2020 Feb 6;10:1415. doi: 10.3389/fneur.2019.01415

Visual Behavior, Pupil Dilation, and Ability to Identify Emotions From Facial Expressions After Stroke

Anny Maza 1, Belén Moliner 2, Joan Ferri 2, Roberto Llorens 1,2,*
PMCID: PMC7016192  PMID: 32116988

Abstract

Social cognition is the innate human ability to interpret the emotional state of others from contextual verbal and non-verbal information, and to self-regulate accordingly. Facial expressions are one of the most relevant sources of non-verbal communication, and their interpretation has been extensively investigated in the literature using both behavioral and physiological measures, such as those derived from visual activity and visual responses. The decoding of facial expressions of emotion is performed by conscious and unconscious cognitive processes that involve a complex brain network, which can be damaged after cerebrovascular accidents. A diminished ability to identify facial expressions of emotion has been reported after stroke, and has traditionally been attributed to impaired emotional processing. While this may be true, an alteration in visual behavior after brain injury could also contribute negatively to this ability. This study investigated the accuracy, distribution of responses, visual behavior, and pupil dilation of individuals with stroke while they identified emotional facial expressions. Our results corroborated impaired performance after stroke and revealed decreased attention to the eyes, evidenced by shorter fixation times and fewer fixations on this area compared with healthy subjects, together with comparable pupil dilation. For some emotions, the differences in visual behavior reached statistical significance when individuals with stroke and impaired performance were compared with healthy subjects, but not when individuals post-stroke with comparable performance were considered. This performance dependence of visual behavior, although not determinant, might indicate that altered visual behavior is a negatively contributing factor for emotion recognition from facial expressions.

Keywords: social cognition, theory of mind, facial expressions, emotion, visual behavior, gaze, pupil dilation, stroke

Introduction

Social cognition is the innate human ability to interpret others' feelings and emotions and to regulate one's own behavior accordingly (1). This ability involves a combination of conscious and unconscious processes that facilitate social behavior and has supported human evolution from our ape-like ancestors to our current status as humans (2). Verbal and non-verbal forms of communication during social interaction are intertwined and mutually reinforcing, enabling an interpretation of the social context. Body posture (3) and movements (4) and, especially, facial expressions (5, 6) are common sources of non-verbal information that allow us to identify and, to a certain degree, discriminate between the emotional states of others. Specifically, the ability to recognize emotional expressions on faces has been repeatedly investigated in the literature, evidencing certain universal patterns across cultures (7), ages (8), and sexes (9). The recording and analysis of eye movements and gaze patterns through eye-tracking technology have provided cognitive neuroscientists with insights into both the cognitive and physiological processing of visual information (10), which is especially interesting for investigating the ability to recognize facial expressions of emotion. Eye-tracking studies have consistently shown that the eyes, mouth, and nose are the most thoroughly explored facial structures when scrutinizing emotional expressions (11–13). Moreover, the visual exploration of these areas has been shown to depend on the expressed emotion (13, 14), its intensity (11), the visual perspective of the face (15), and the resolution (12) and size of the visual stimuli (16). Apart from visual behavior, eye-tracking technology also allows the temporal variation of pupil size to be registered. Pupil dilation is controlled by both the sympathetic and parasympathetic nervous systems (17) in response not only to light changes (18), but also to cognitive processes that involve alertness (19), memory (20), language (21), decision making (22), and emotional processing (18, 23–27). In the latter category, variations in pupil size have been described during the visualization of pictures with emotional attributes in comparison to neutral pictures (24, 26), and similar results have been reported with auditory stimulation (23). Importantly, pupil dilation has been related to an increase in sympathetic activity during emotional processing (24).

The acquisition, processing, and recognition of emotional information from faces involve a complex network of peripheral and central systems. In addition to the visual cortex and cortical association areas, which are commonly involved in the processing of visual information (17, 28), other brain regions, such as the fusiform face area in the ventral temporal lobes, are recruited when a human face is within sight (29). Other structures, such as the inferior occipital gyrus, the superior temporal sulcus (30), and the amygdala (31), are likewise engaged in the decoding of emotional information. The distributed nature of this brain circuitry makes it particularly vulnerable to both focal and diffuse injuries, such as those derived from cerebrovascular and traumatic accidents, which is supported by the high incidence of impairment in the ability to discriminate among emotions after an injury to the brain (32–44). The great majority of studies on facial emotion recognition have focused on individuals with traumatic brain injury (35–42) and have evidenced an apparently greater difficulty in recognizing negative expressions, such as anger, disgust, sadness, or fear (35). A smaller, but still substantial, number of studies have investigated this ability after stroke (34, 43–45), showing worse performance in subjects with lesions in the right hemisphere (34). Concurrent with impaired performance, altered visual exploration behavior has also been reported after brain injuries of different severity (46, 47).

The clinical relevance of difficulties in identifying facial expressions lies in their association with different neurobehavioral symptoms, ranging from changes in personality (32–42) to impaired self-awareness (48), which can negatively impact social integration (49, 50). These sequelae and other neurobehavioral changes after an acquired brain injury can compromise the quality of life of both patients and their caregivers (51).

The diminished ability to identify facial expressions of emotion after brain lesions has traditionally been explained by impaired emotional processing (33); however, alterations in visual exploration could bias the integration of visual information and, consequently, have an additional negative effect on the performance of emotional tasks. While this hypothesis has been investigated in other pathologies with associated social cognition deficits, such as schizophrenia (52) and autism spectrum disorders (53), its plausibility after cerebrovascular injury is still unknown.

In light of the existing evidence, we hypothesized that individuals with stroke would perform worse than healthy subjects at identifying emotions from facial expressions, and that this effect would also be apparent when considering each emotion separately. We additionally hypothesized that impaired performance after stroke could be partially explained by altered visual exploration of the face and evidenced by altered variation in pupil dilation. Consequently, the objectives of this study were to investigate performance accuracy, visual behavior, and pupil dilation in a sample of individuals with stroke during the identification of emotional facial expressions.

Methods

Participants

A convenience sample of individuals with stroke was recruited from the outpatient neurorehabilitation services of Vithas Hospital Valencia al Mar (València, Spain), Vithas Hospital Aguas Vivas (Carcaixent, Spain), and the Brain Injury Center of Vithas Vinalopó (Elx, Spain). The inclusion criteria for this group were a diagnosis of stroke confirmed by CT and/or MRI, age over 18 years, fairly preserved cognitive function, defined as a score above 23 on the Mini-Mental State Examination (54), and the ability to follow instructions, defined as a score above 45 on the receptive language index of the Mississippi Aphasia Screening Test (55). Individuals were excluded if they had disabling visual deficits, such as hemianopsia or impaired visual acuity, which would prevent appropriate visual stimulation and interaction. An additional group of healthy subjects, over 18 years of age, with no known cognitive or psychiatric impairments, was enrolled as controls.

A total of 111 individuals, 46 with stroke and 65 healthy controls, participated in the study. The group of individuals with stroke—either ischemic (n = 18) or hemorrhagic (n = 28)—consisted of 23 women and 23 men with a median time since injury of 428.0 (222–678) days and a median age of 53.5 (44–58) years. The control group consisted of 35 women and 30 men, with a median age of 48 (30–79) years. No significant differences were found in any demographic variable between these groups (Table 1).

Table 1.

Characteristics of healthy subjects and individuals with stroke.

Healthy subjects Individuals with stroke Significance
Sex (n, %) p = 0.836
    Women 35 (53.8%) 23 (50.0%)
    Men 30 (46.2%) 23 (50.0%)
Age (years) 48.0 (36–62) 53.5 (44–58) p = 0.382
Etiology (n, %)
    Ischemic 18 (39.1%)a
        TACI 9 (47.4%)b
        PACI 5 (26.3%)b
        LACI 5 (26.3%)b
    Hemorrhagic 28 (60.9%)a
Localization of the injury (n, %)
    Right anterior circulation 20 (43.5%)
    Left anterior circulation 17 (37.0%)
    Posterior circulation 9 (19.5%)
Time since injury (days) 428.0 (222–678)
Visual perception and cognition
    Letter cancelation test (n) 10.0 (10–10)
    Wechsler Memory Scale IV
        Visual reproduction 8.0 (7–9)
    Rey–Osterrieth complex figure copy 32.0 (30–34)
    Color trail test
        Part A (s) 52.0 (38–68)
        Part B (s) 110.0 (80.5–138)
    Wechsler Adult Intelligence Scale IV
        Symbol search 19.50 (14.25–26)
        Matrix reasoning 17.5 (12–22)

Demographic and clinical characteristics of healthy participants and individuals with stroke. TACI, Total anterior circulation infarcts; PACI, Partial anterior circulation infarcts; LACI, Lacunar infarcts. Age, time since injury, and performance in the neuropsychological tests are expressed as median and interquartile range.

a Percentage of all participants with stroke.

b Percentage of participants with ischemic stroke.

All subjects who satisfied the inclusion criteria and accepted the terms of participation in the study provided informed written consent before enrolment. Ethical approval for this study was obtained from the Ethics Committees of the clinical institutions involved.

Instrumentation

Gaze behavior and pupil dilation were estimated using a Tobii TX300 screen-based eye tracker (Tobii AB, Stockholm, Sweden). This device captures gaze data from the corneal reflection of emitted infrared (IR) light at 300 Hz. The system includes a 23″ screen, with a resolution of 1,920 × 1,080 pixels, which provides the visual stimulation, and an eye-tracking unit, which comprises an array of IR illuminators (transmitters) and sensors (receptors). In addition, the eye tracker is controlled by a dedicated computer with a secondary screen that allows the trial to be managed and supervised without interfering with the visual stimulation.

Visual stimuli were designed using Tobii Studio 3.2.1 (Tobii AB, Stockholm, Sweden). They consisted of 28 images extracted from the Karolinska Directed Emotional Faces database (56). The images depicted four subjects (two men and two women) randomly selected from the 70 available identities. The images showed facial expressions of fear, anger, disgust, happiness, sadness, surprise, or an absence of emotion (neutrality). Each image was displayed in the center of the screen, covering its entire height, which resulted in a picture size of 21 × 28 cm; the remaining areas of the screen were black. The minimum time required to explore an entire face has been reported to be 4 s (57). Taking this into account, the stimuli were displayed in randomized order for 5 s each, 1 s longer than this minimum, during which gaze behavior and pupil dilation were recorded. Before each image was shown, a black screen was displayed for 500 ms to provide a baseline for subtractive baseline correction (58). After each image was shown, a thumbnail of the picture, along with seven words corresponding to the seven possible emotions, was displayed for a maximum of 30 s.
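For illustration of this trial timeline only, the sketch below reproduces the 500 ms baseline, the 5 s free-viewing period, and the response screen of up to 30 s using PsychoPy in Python. This is a hypothetical stand-in: the study itself presented the stimuli through Tobii Studio, and the image paths, on-screen sizes, and keyboard-based response capture are placeholder assumptions.

```python
# Hypothetical sketch of the trial timeline described above, using PsychoPy.
# The study used Tobii Studio; image paths and sizes below are placeholders.
import random
from psychopy import visual, core, event

IMAGE_PATHS = ["stimuli/face01.jpg", "stimuli/face02.jpg"]  # placeholder KDEF images

win = visual.Window(size=(1920, 1080), fullscr=True, color="black", units="pix")
random.shuffle(IMAGE_PATHS)                 # stimuli presented in randomized order

for path in IMAGE_PATHS:
    win.flip()                              # plain black screen
    core.wait(0.5)                          # 500 ms baseline before each image
    face = visual.ImageStim(win, image=path, size=(810, 1080))  # full screen height
    face.draw()
    win.flip()
    core.wait(5.0)                          # 5 s of free viewing; gaze and pupil recorded
    face.size = (203, 270)                  # thumbnail shown with the 7 emotion labels
    face.draw()
    win.flip()
    keys = event.waitKeys(maxWait=30.0)     # stand-in for the 30 s (verbal) response window
    # keys is None if no response arrived within 30 s (trial counted as unanswered)

win.close()
```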

Procedure

The experiment took place in a dedicated, quiet room in one of the three clinical facilities, which was free of distractors and had controlled lighting conditions. The same experimenter conducted the study at all three sites. The participants were briefly introduced to the task and were then asked to sit comfortably in a chair facing the eye tracker, with their head at an approximate distance of 65 cm from the screen. The eye tracker was calibrated for each participant. After the calibration process, the accuracy of the calibration was experimentally determined, using the deviation between target points on the screen and the superimposed estimated fixation points. If the accuracy proved insufficient, the calibration process was repeated. Once the calibration was successful, the experiment was started. The participants were asked to stare at the faces that appeared on the screen for 5 s and then to identify the emotion that, in their judgment, best matched each facial expression, choosing from the seven words shown on the screen. The participants were asked to name the emotion, and the experimenter noted down each answer and then continued the study. If a participant was not able to answer within 30 s, that picture was considered unanswered and the experiment continued. Consequently, the total duration of the study, without considering the calibration process, varied according to the time each participant needed to identify each emotion.
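As a worked example of the calibration check described above, the minimal sketch below converts the on-screen deviation between a calibration target and the estimated fixation point into degrees of visual angle, using the 23″ 1080p screen and the approximate 65 cm viewing distance reported here. The paper does not state an acceptance threshold, so none is assumed.

```python
import math

# Screen geometry: 23" display at 1920 x 1080 -> approx. 0.27 mm per pixel.
MM_PER_PIXEL = 23 * 25.4 / math.hypot(1920, 1080)   # screen diagonal in mm / in pixels
VIEWING_DISTANCE_MM = 650                            # approx. 65 cm, as in the protocol

def calibration_error_deg(target_px, fixation_px):
    """Angular deviation (degrees) between a target point and the estimated fixation."""
    dx = (fixation_px[0] - target_px[0]) * MM_PER_PIXEL
    dy = (fixation_px[1] - target_px[1]) * MM_PER_PIXEL
    offset_mm = math.hypot(dx, dy)
    return math.degrees(math.atan2(offset_mm, VIEWING_DISTANCE_MM))

# Example: a 40-pixel deviation corresponds to roughly 0.9 degrees of visual angle.
print(round(calibration_error_deg((960, 540), (1000, 540)), 2))
```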

The participants were also assessed using a battery of neuropsychological tests evaluating cognitive abilities that involve visual-perceptual skills. This assessment included the letter cancelation test, the visual reproduction subtest of the Wechsler Memory Scale IV, the Rey–Osterrieth complex figure, the color trail test, and the symbol search and matrix reasoning subtests of the Wechsler Adult Intelligence Scale IV.

Data Analysis

The accuracy in identifying the emotions from the facial expressions was estimated as the percentage of correct identifications of each emotion, as in previous works (8, 11, 12, 15, 16). According to this value, two subgroups of individuals with stroke were defined: those with performance comparable to the healthy subjects (equal to or better than the median performance of the healthy controls) and those with worse performance (below the median performance of the healthy controls).
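A minimal sketch of this scoring and median split is shown below in Python with pandas; the file name and column names are assumptions made for illustration and are not part of the study's materials.

```python
import pandas as pd

# responses: one row per participant per image, with hypothetical columns
# 'participant', 'group' ('healthy'/'stroke'), 'emotion', and 'correct' (0/1).
responses = pd.read_csv("responses.csv")

# Accuracy per participant: percentage of correct identifications across all images.
accuracy = (responses.groupby(["participant", "group"])["correct"]
            .mean().mul(100).rename("accuracy_pct").reset_index())

healthy_median = accuracy.loc[accuracy["group"] == "healthy", "accuracy_pct"].median()

stroke = accuracy[accuracy["group"] == "stroke"].copy()
# Comparable performance: at or above the healthy median; worse performance: below it.
stroke["subgroup"] = stroke["accuracy_pct"].apply(
    lambda a: "SG-CP" if a >= healthy_median else "SG-WP")
print(stroke["subgroup"].value_counts())
```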

Gaze behavior was defined in terms of the number of fixations—also known as fixation count—and the total time spent—also known as total fixation duration—on the eyes, nose, and mouth, which, as mentioned above, have been identified as being the most representative areas involved in a visual scan of the face (11, 12, 14, 15, 59). These areas were manually defined for each visual stimulus image, in accordance with previous studies (11, 12, 15, 59). The averaged pupil diameter variation was also extracted, as in previous studies (18, 60, 61). The results of all the eye-tracking measures represent the averaged behavior of both eyes. Finally, performance at identifying emotions from facial expressions was defined as the percentage of correct identifications.
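The sketch below illustrates how fixation count and total fixation duration per area of interest could be computed from exported fixation data. The rectangular coordinates and column names are hypothetical; in the study the areas were drawn manually on each stimulus image and the metrics were extracted with Tobii Studio.

```python
import pandas as pd

# Hypothetical rectangular areas of interest in screen pixels (x_min, y_min, x_max, y_max);
# in the study these were defined manually for every stimulus image.
AOIS = {"eyes": (760, 300, 1160, 450),
        "nose": (860, 450, 1060, 620),
        "mouth": (820, 620, 1100, 760)}

def aoi_metrics(fixations: pd.DataFrame) -> pd.DataFrame:
    """Fixation count and total fixation duration (s) per AOI.

    Expects columns 'x', 'y' (fixation centroid, pixels) and 'duration' (s)."""
    rows = []
    for name, (x0, y0, x1, y1) in AOIS.items():
        inside = fixations[(fixations["x"].between(x0, x1)) &
                           (fixations["y"].between(y0, y1))]
        rows.append({"aoi": name,
                     "fixation_count": len(inside),
                     "total_fixation_duration": inside["duration"].sum()})
    return pd.DataFrame(rows)
```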

Prior to the computation of pupil dilation, the pupil data were pre-processed, as follows. First, those images or baselines that presented a ratio of missing data >50% in either eye were discarded (23, 60, 62, 63). Second, the first 2 s of the stimuli were also discarded to remove the initial pupil contraction (60). Third, the non-physiological variations in pupil size, identified as those changes occurring at a faster rate than 5 mm/s, were removed. Fourth, the remaining time windows of missing data were linearly interpolated (23, 62, 63). Fifth, the time series were low-pass filtered at 8.3 Hz to reveal the low-frequency trend (23, 62). Finally, variations in pupil size were obtained through subtractive baseline correction, in which pupil size is converted to an absolute difference from baseline pupil size to that during the stimuli (corrected pupil size = pupil size – baseline) (58).
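A minimal sketch of this preprocessing chain is given below in Python with NumPy/SciPy, assuming a single-trial pupil trace in millimeters sampled at the tracker's 300 Hz. The Butterworth filter order is an assumption, since the paper specifies only the 8.3 Hz cutoff.

```python
from typing import Optional

import numpy as np
from scipy.signal import butter, filtfilt

FS = 300.0  # Tobii TX300 sampling rate (Hz)

def preprocess_pupil(trial: np.ndarray, baseline: np.ndarray) -> Optional[np.ndarray]:
    """Return the baseline-corrected pupil trace (mm), or None if the trial is discarded."""
    # 1. Discard trials or baselines with more than 50% missing samples (NaN).
    if np.isnan(trial).mean() > 0.5 or np.isnan(baseline).mean() > 0.5:
        return None
    # 2. Drop the first 2 s of the stimulus to remove the initial pupil contraction.
    trial = trial[int(2 * FS):].astype(float).copy()
    # 3. Mark non-physiological changes faster than 5 mm/s as missing.
    rate = np.abs(np.diff(trial, prepend=trial[0])) * FS
    trial[rate > 5.0] = np.nan
    # 4. Linearly interpolate the remaining gaps.
    idx = np.arange(trial.size)
    valid = ~np.isnan(trial)
    trial = np.interp(idx, idx[valid], trial[valid])
    # 5. Low-pass filter at 8.3 Hz to keep the low-frequency trend
    #    (a 4th-order Butterworth filter is assumed here).
    b, a = butter(4, 8.3 / (FS / 2))
    trial = filtfilt(b, a, trial)
    # Subtractive baseline correction: corrected pupil size = pupil size - baseline.
    return trial - np.nanmean(baseline)
```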

Differences between the groups of participants, in terms of demographic and clinical variables, visual behavior, and performance, were investigated using independent-sample Mann–Whitney U tests, except for sex distribution, etiology, and laterality of injury, which were investigated using chi-squared tests. The level of alpha was set to 0.05 for all analyses.
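For illustration, the sketch below runs the same two kinds of tests with SciPy, using hypothetical per-participant accuracy values and the sex counts from Table 1; the actual analyses were performed in SPSS, as noted below.

```python
from scipy.stats import mannwhitneyu, chi2_contingency

# Hypothetical per-participant accuracy values (%) for each group.
healthy_acc = [85.7, 92.9, 78.6, 89.3]
stroke_acc = [71.4, 67.9, 82.1, 60.7]

u_stat, p_value = mannwhitneyu(healthy_acc, stroke_acc, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.3f}")

# Sex distribution (women, men) per group, as a 2 x 2 contingency table (from Table 1).
contingency = [[35, 30],   # healthy subjects
               [23, 23]]   # individuals with stroke
chi2, p_sex, dof, _ = chi2_contingency(contingency)
print(f"chi2({dof}) = {chi2:.2f}, p = {p_sex:.3f}")
```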

Data regarding fixation duration, fixation count, and pupil dilation were extracted using Tobii Studio 3.2.1. Signal processing was performed using MATLAB 2018b (MathWorks Inc., MA, USA). The statistical analyses were performed using SPSS for Windows v.22 (IBM, Armonk, NY, USA).

Results

Accuracy

The accuracy in identifying facial expressions of emotion showed significant differences between the individuals with stroke and the healthy subjects, with the former showing decreased accuracy (p = 0.012) (Figure 1). The detrimental effect of the cerebrovascular accident was consistent across all emotions, but was particularly severe for anger (p = 0.030), happiness (p = 0.034), neutrality (p = 0.016), and surprise (p = 0.016), for which the differences reached statistical significance.

Figure 1.

Accuracy of each group of participants by emotion. Percentage of correct answers obtained by all groups of participants in all emotions and in each emotion, separately. HG, Healthy participants; SG, Participants with stroke; SG-CP, Participants with stroke with comparable performance to healthy participants; SG-WP, Participants with stroke with worse performance than healthy participants. **p < 0.01, *p < 0.05, ~p < 0.1.

A more in-depth analysis indicated that 21 participants with stroke showed accuracy comparable to that of the healthy subjects, while the remaining 25 participants performed worse (Table 2). Differences in overall accuracy between this latter group and the healthy controls reached statistical significance (p < 0.001); however, no differences were detected in the cognitive abilities involving visual-perceptual skills, except for part B of the color trail test. When analyzing performance by emotion, the individuals with stroke and worse performance showed significantly decreased accuracy in comparison to the healthy subjects at identifying anger (p < 0.001), happiness (p = 0.006), neutrality (p < 0.001), sadness (p = 0.048), and surprise (p = 0.002) (Figure 1).

Table 2.

Characteristics of individuals with stroke grouped according to their performance.

Individuals post-stroke with comparable performance Individuals post-stroke with worse performance Significance
HG vs. SG-CP HG vs. SG-WP SG-CP vs. SG-WP
Sex (n, %) p = 0.520 p = 0.242 p = 0.236
    Women 13 (61.9%) 10 (40.0%)
    Men 8 (38.1%) 15 (60.0%)
Age (years) 48.0 (37–58) 55 (46–60) p = 0.706 p = 0.098 p = 0.114
Etiology (n, %) p = 0.864
    Ischemic 9 (42.9%)a 9 (36.0%)a
        TACI 3 (33.3%)b 5 (55.6%)b
        PACI 2 (22.2%)b 3 (33.3%)b
        LACI 4 (44.4%)b 1 (11.1%)b
    Hemorrhagic 12 (57.1%)a 16 (64.0%)a
Localization of the injury (n, %) p = 0.410
    Right anterior circulation 5 (23.8%) 15 (60.0%)
    Left anterior circulation 9 (42.9%) 8 (32.0%)
    Posterior 7 (33.3%) 2 (8.0%)
Time since injury (days) 421.0 (230–641) 431.0 (214–1054) p = 0.700
Visual perception and cognition
    Letter cancelation test (n) 10.0 (10–10) 10.0 (10–10) p = 0.801
    Wechsler Memory Scale IV
        Visual reproduction 8.0 (8–9) 8.0 (7–10) p = 0.851
    Rey–Osterrieth complex figure copy 33.0 (31–34) 31.0 (27–34) p = 0.134
    Color trail test
        Part A (s) 43.5 (35–59) 57.0 (38–75) p = 0.141
        Part B (s) 95.5 (78–121) 125 (92–165) p = 0.034
    Wechsler Adult Intelligence Scale IV
        Symbol search 23.5 (14–32) 18.0 (14–26) p = 0.281
        Matrix reasoning 16.0 (11–22) 18.0 (12–22) p = 0.972

Demographic and clinical characteristics of individuals with stroke grouped according to their performance. HG, healthy subjects; SG-CP, individuals with stroke with comparable performance to healthy subjects; SG-WP, individuals with stroke with worse performance than healthy subjects; TACI, Total anterior circulation infarcts; PACI, Partial anterior circulation infarcts; LACI, Lacunar infarcts. Age, time since injury, and performance in the neuropsychological tests are expressed as median and interquartile range.

a Percentage of participants in the corresponding subgroup.

b Percentage of participants with ischemic stroke in the corresponding subgroup.

Visual Behavior

No significant differences emerged when comparing the visual behavior of the individuals with stroke as a whole with that of the healthy subjects; however, the individuals with stroke showed a tendency to spend less time (p = 0.073) and make fewer fixations (p = 0.056) on the eyes in comparison to the healthy subjects. While the healthy subjects focused their attention on the eyes, nose, and mouth, in that order, the individuals with stroke focused mostly on the nose rather than the eyes. These differences were consistent across all emotions, and were statistically significant for fear (p = 0.039) and surprise (p = 0.019) (Figures 2, 3).

Figure 2.

Total fixation duration on the eyes of each group of participants by emotion. Total fixation duration on the eyes shown by all groups of participants across all emotions and for each emotion separately. HG, Healthy participants; SG, Participants with stroke; SG-CP, Participants with stroke with comparable performance to healthy participants; SG-WP, Participants with stroke with worse performance than healthy participants. **p < 0.01, *p < 0.05, ~p < 0.1.

Figure 3.

Number of fixations on the eyes of each group of participants by emotion. Number of fixations on the eyes shown by all groups of participants across all emotions and for each emotion separately. HG, Healthy group; SG, Stroke group; SG-CP, Stroke group with comparable performance; SG-WP, Stroke group with worse performance. *p < 0.05, ~p < 0.1.

No differences in visual behavior were detected between the healthy controls and the participants post-stroke with comparable performance, either when considering all emotions or when analyzing each emotion separately (Figures 2, 3). In contrast, when compared to the healthy controls, individuals post-stroke with poorer performance showed a tendency toward significance in time spent on the eyes (p = 0.059) and fixations made on the eyes (p = 0.076), both variables having lower values than those of the healthy group. The separate analysis of each emotion showed significant differences between these groups in terms of time spent on the eyes for happiness (p = 0.040) and surprise (p = 0.008), and in the number of fixations for surprise (p = 0.021). Tendencies toward significance appeared in the time spent on the eyes for fear (p = 0.053) and in the number of fixations for fear (p = 0.053), happiness (p = 0.055), and sadness (p = 0.092) (Figures 2, 3).

No differences in visual behavior were found between any of the groups for the mouth or the nose.

Pupil Dilation

No significant differences were found in the variation in pupil dilation between the healthy subjects and the individuals with stroke, either in general or by emotion (Figure 4).

Figure 4.

Pupil dilation variation of each group of participants by emotion. Variation in pupil size shown by all groups of participants across all emotions and for each emotion separately. HG, Healthy group; SG, Stroke group; SG-CP, Stroke group with comparable performance; SG-WP, Stroke group with worse performance. ~p < 0.1.

No significant differences emerged between the healthy subjects and the individuals with stroke with similar or worse performance (Figure 4). However, the individuals with stroke with worse performance than the healthy subjects showed a tendency toward significance for fear (p = 0.059) and anger (p = 0.098) (Figure 4).

Discussion

This study investigated the accuracy of responses to visual stimuli, the visual behavior, and the pupil dilation of individuals with stroke while identifying emotional facial expressions, in comparison to healthy subjects. The individuals with stroke showed significantly worse overall performance than the healthy subjects, which was also evident when analyzing each emotion separately. Although the different performances of the groups did not correspond to significantly different visual behaviors or pupillary activity, the individuals with stroke appeared to direct less attention toward the eyes and exhibited a diminished pupil response. Importantly, when considering only those individuals with stroke with impaired performance, these differences were significant for specific emotions. In contrast, the post-stroke individuals with performance comparable to the control group did not differ from the healthy subjects in visual behavior or pupillary response. No relevant differences were found between the participants post-stroke with different performance in any demographic or clinical variable, which supports the idea that an impaired ability to identify emotional facial expressions could be partially caused by altered visual behavior.

The ability of the healthy subjects in our study to identify facial expressions of emotion is similar to that reported in previous works, with the greatest accuracy for happiness and surprise and the lowest for fear (12, 15, 16). Their accuracy was, however, slightly lower for all emotions than in other reports (12, 15, 16). This effect was especially evident for fear, which had the lowest accuracy values. This might be explained by the fact that the healthy participants in our study, whose ages matched those of the individuals with brain injury, were considerably older than the participants in other studies, who were mostly recruited from university student bodies and were, therefore, mostly in their 20s (12, 15, 16). As reported in previous studies, poorer performance at identifying emotional expressions is expected at older ages (64, 65). In addition, although the images used in our study were extracted from the same database used in other studies (12, 15, 16) and were randomly selected, the emotions shown in them may have been more difficult to recognize than those in the images used elsewhere. The visual behavior of the healthy participants was also consistent with existing reports, confirming that the eyes, nose, and mouth are the most relevant facial structures for identifying facial expressions (12–15, 59). The hierarchical distribution of attention to the eyes, followed by the nose and the mouth, is also supported by most of the existing literature (11, 15, 16). In this study, the eyes were especially relevant when identifying surprise and fear, but seemed to draw less attention for disgust, which is consistent with a previous study (15). Nonetheless, it is important to highlight that there is no fixed or common pattern of visual behavior across different emotions, as evidenced both by our study and by previous reports (12, 15, 16). Additionally, our results must be interpreted considering that assessing accuracy by a simple count of correct identifications, without regard for false alarms or bias in the use of response categories, is the most common approach to analyzing this behavior (8, 11, 12, 15, 16) but can be misleading (66).

The variation in pupil dilation in our study was greater than that reported in previous studies (24, 60, 67). This dissimilarity may derive from the use of different images, which, despite having been normalized, might have promoted different levels of arousal and consequently modulated the pupil response in a different way. Our results are, however, supported by a previous study, which reported the lowest variation in pupil dilation for expressions of happiness and neutrality, and the highest variation for fear (60). Despite this, it should be taken into consideration that variations in pupil dilation are triggered by different mechanisms, from simple autonomic processes, such as the pupillary light reflex (18, 19), to higher executive functions (22, 27), so a definitive identification of the source of the variation is not possible with this technology. In addition, although the methodology of our study has been used repeatedly in previous investigations (25, 57, 60), it is important to consider that the use of a black screen as a baseline may have hampered the identification of the source of the pupil variation.

Individuals with stroke showed impaired performance at identifying facial expressions of emotion, in line with previous studies (34, 43–45). Interestingly, these studies grouped the emotions by their attributes, reporting that individuals post-stroke performed better at identifying positively attributed emotions than negatively attributed emotions (43–45). This effect is also supported by our results, which showed the greatest accuracy for happiness and surprise, and the worst performance for fear. Nevertheless, the differences between the healthy participants and the individuals with stroke were not significant for all emotions, in contrast to what was reported in a previous study (43). The use of different images might explain these dissimilarities: some emotions in our study might have been particularly difficult to interpret, affecting both groups in a similar way. The decreased attention toward the eyes exhibited by the individuals post-stroke in comparison to the healthy subjects is suggestive of an altered perception of visual information, which could partially explain their impaired ability to identify facial expressions of emotion. This hypothesis is supported by the differences in visual behavior for happiness and surprise, which were particularly challenging for this group of participants. In contrast, differences in the accuracy of identifying anger and neutrality were not associated with differences in visual behavior when observing these emotions. The inconsistency of these results might reflect the complexity of the perceptual and cognitive processes underlying the decoding of facial expressions (3, 30). Although not statistically significant, the individuals post-stroke showed a slightly diminished pupillary response compared to the healthy subjects, which, if endorsed in further studies, might reflect diminished emotional arousal, in keeping with previous reports (18, 23, 24). It is important to highlight, however, that pupil dilation is also driven by the co-activation of multiple brain areas (19, 68), which might be affected by a cerebrovascular accident.

In general, the comparison of visual behavior and pupillary activity between the healthy subjects and the individuals with stroke as a whole only showed a decreased attention to the eyes, which did not reach statistical significance. Although these results might support a degree of comparability between both groups, a separate analysis of the individuals with stroke according to their performance revealed significant differences. Differences between the individuals with stroke with impaired performance and the healthy subjects were stronger and significant for happiness, surprise, and fear. Pupil dilation was also lower in this subgroup, with a tendency toward significance for fear and anger. In contrast, the participants with comparable performance showed visual behaviors and pupillary responses similar to those of the healthy subjects. The differences detected between groups with different performance, in the absence of any other clinical or demographic dissimilarities, suggest that altered visual behavior, rather than the neurological condition itself, could be a contributing factor to impaired performance. Altered visual behavior, together with impaired emotional processing, which has been repeatedly reported after stroke (34, 43), could explain the accuracy of these individuals in identifying emotional facial expressions.

Conclusions

This study corroborated the negative effect of a cerebrovascular accident on the ability to identify facial expressions of emotion, which was also supported by analyzing the emotions separately. Our results showed that individuals with stroke looked at the eyes for a shorter time and less often than healthy subjects, although their overall pattern of observation did not differ significantly from that of the healthy subjects. These differences were, however, accentuated when analyzing the individuals with stroke according to their performance. While no differences were detected between the healthy subjects and the individuals post-stroke with comparable performance, the individuals post-stroke with worse performance showed greater, and significant, differences from the healthy subjects in several measures, suggesting that altered visual behavior might be associated with, and be a contributing factor to, difficulties in identifying facial expressions of emotion after stroke. No significant differences were found in pupil dilation between the healthy controls and the individuals with stroke.

Data Availability Statement

The datasets generated for this study will not be made publicly available as they contain confidential data.

Ethics Statement

Ethical approval for the study was granted by the Institutional Review Board of Vithas Hospital Valencia al Mar. All participants provided written informed consent before taking part in the study.

Author Contributions

RL designed the study. BM, JF, and RL defined the clinical aspects regarding individuals with stroke, and BM assessed their condition. AM conducted the experimental sessions and analyzed the data. All the authors discussed the results of the experiment.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Footnotes

Funding. This study was funded by Conselleria de Educación, Cultura y Deporte of Generalitat Valenciana of Spain (Project SEJI/2019/017), and Universitat Politècnica de València (Grant PAID-10-18).

References

  • 1.Nijsse B, Spikman JM, Visser-Meily JMA, de Kort PLM, van Heugten CM. Social cognition impairments are associated with behavioural changes in the long term after stroke. PLoS ONE. (2019) 14:e0213725. 10.1371/journal.pone.0213725 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Feldman RS, White JB, Lobato D. Social skills and nonverbal behavior. In: Robert SF. editor. Development of Nonverbal Behavior in Children. New York, NY: Springer New York; (1982). p. 259–277. [Google Scholar]
  • 3.Vuilleumier P, Pourtois G. Distributed and interactive brain mechanisms during emotion face perception: evidence from functional neuroimaging. Neuropsychologia. (2007) 45:174–94. 10.1016/j.neuropsychologia.2006.06.003 [DOI] [PubMed] [Google Scholar]
  • 4.Frith CD, Frith U. How we predict what other people are going to do. Brain Res. (2006) 1079:36–46. 10.1016/j.brainres.2005.12.126 [DOI] [PubMed] [Google Scholar]
  • 5.Zinchenko O, Yaple ZA, Arsalidou M. Brain responses to dynamic facial expressions: a normative meta-analysis. Front Hum Neurosci. (2018) 12:227. 10.3389/fnhum.2018.00227 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Ekman P, David SL, Matsumoto R, Oster H, Rosenberg EL, Scherer KR. Facial expression and emotion. Am Psychol. (1993) 48:384–92. 10.1037/0003-066X.48.4.384 [DOI] [PubMed] [Google Scholar]
  • 7.Srinivasan R, Martinez AM. Cross-cultural and cultural-specific production and perception of facial expressions of emotion in the wild. IEEE Trans Affect Comput. (2018) 14:1–15. 10.1109/TAFFC.2018.2887267 [DOI] [Google Scholar]
  • 8.Smith ML, Grühn D, Bevitt A, Ellis M, Ciripan O, Scrimgeour S, et al. Transmitting and decoding facial expressions of emotion during healthy aging: more similarities than differences. J Vis. (2018) 18:10. 10.1167/18.9.10 [DOI] [PubMed] [Google Scholar]
  • 9.Thompson AE, Voyer D. Sex differences in the ability to recognise non-verbal displays of emotion: a meta-analysis. Cogn Emot. (2014) 28:1164–95. 10.1080/02699931.2013.875889 [DOI] [PubMed] [Google Scholar]
  • 10.Doležal J, Fabian V. Application of eye tracking in neuroscience. Clin Neurophysiol. (2015) 126:e44. 10.1016/j.clinph.2014.10.200 [DOI] [Google Scholar]
  • 11.Guo K. Holistic gaze strategy to categorize facial expression of varying intensities. PLoS ONE. (2012) 7:e42585. 10.1371/journal.pone.0042585 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Guo K, Soornack Y, Settle R. Expression-dependent susceptibility to face distortions in processing of facial expressions of emotion. Vision Res. (2019) 157:112–22. 10.1016/j.visres.2018.02.001 [DOI] [PubMed] [Google Scholar]
  • 13.Eisenbarth H, Alpers GW. Happy mouth and sad eyes: scanning emotional facial expressions. Emotion. (2011) 11:860–5. 10.1037/a0022758 [DOI] [PubMed] [Google Scholar]
  • 14.Schurgin MW, Nelson J, Iida S, Ohira H, Chiao JY, Franconeri SL. Eye movements during emotion recognition in faces. J Vis. (2015) 14:1–16. 10.1167/14.13.14 [DOI] [PubMed] [Google Scholar]
  • 15.Guo K, Shaw H. Face in profile view reduces perceived facial expression intensity: an eye-tracking study. Acta Psychol (Amst). (2015) 155:19–28. 10.1016/j.actpsy.2014.12.001 [DOI] [PubMed] [Google Scholar]
  • 16.Guo K. Size-invariant facial expression categorization and associated gaze allocation within social interaction space. Perception. (2013) 42:1027–42. 10.1068/p7552 [DOI] [PubMed] [Google Scholar]
  • 17.Hall JE, John E, Guyton AC. Guyton and Hall Textbook of Medical Physiology. Philadelphia, PA: Saunders Elsevier; (2011). [Google Scholar]
  • 18.Sirois S, Brisson J. Pupillometry. Wiley Interdiscip Rev Cogn Sci. (2014) 5:679–92. 10.1002/wcs.1323 [DOI] [PubMed] [Google Scholar]
  • 19.Eckstein MK, Guerra-Carrillo B, Miller Singley AT, Bunge SA. Beyond eye gaze: what else can eyetracking reveal about cognition and cognitive development? Dev Cogn Neurosci. (2017) 25:69–91. 10.1016/j.dcn.2016.11.001 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20.Ariel R, Castel AD. Eyes wide open: enhanced pupil dilation when selectively studying important information. Exp Brain Res. (2014) 232:337–44. 10.1007/s00221-013-3744-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.Zekveld AA, Kramer SE. Cognitive processing load across a wide range of listening conditions: Insights from pupillometry. Psychophysiology. (2014) 51:277–84. 10.1111/psyp.12151 [DOI] [PubMed] [Google Scholar]
  • 22.De Gee JW, Knapen T, Donner TH. Decision-related pupil dilation reflects upcoming choice and individual bias. Proc Natl Acad Sci USA. (2014) 111:1–8. 10.1073/pnas.1317557111 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.Babiker A, Faye I, Malik A. Pupillary behavior in positive and negative emotions. In: IEEE ICSIPA 2013 - IEEE International Conference on Signal and Image Processing. (Melaka: ) (2013). p. 7–11. [Google Scholar]
  • 24.Bradley MM, Miccoli L, Escrig MA, Lang PJ. The pupil as a measure of emotional arousal and autonomic activation. Psychophysiology. (2008) 45:602–7. 10.1111/j.1469-8986.2008.00654.x [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25.Duque A, Sanchez A, Vazquez C. Gaze-fixation and pupil dilation in the processing of emotional faces: the role of rumination. Cogn Emot. (2014) 28:1347–66. 10.1080/02699931.2014.881327 [DOI] [PubMed] [Google Scholar]
  • 26.Lanata A, Armato A, Valenza G, Scilingo EP. Eye tracking and pupil size variation as response to affective stimuli: a preliminary study. In: 5th International Conference on Pervasive Computing Technologies for Healthcare, Pervasive Health. Dublin: (2011). p. 78–84. [Google Scholar]
  • 27.Peinkhofer C, Knudsen GM, Moretti R, Kondziella D. Cortical modulation of pupillary function: systematic review. Peer J. (2019) 7:e6882. 10.7717/peerj.6882 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Huff T, Tadi P. Neuroanatomy, Visual Cortex. Treasure Island, FL: StatPearls Publishing; (2019). [PubMed] [Google Scholar]
  • 29.Grill-Spector K, Knouf N, Kanwisher N. The fusiform face area subserves face perception, not generic within-category identification. Nat Neurosci. (2004) 7:555–62. 10.1038/nn1224 [DOI] [PubMed] [Google Scholar]
  • 30.Ferretti V, Papaleo F. Understanding others: emotion recognition in humans and other animals. Genes Brain Behav. (2019) 18:1–12. 10.1111/gbb.12544 [DOI] [PubMed] [Google Scholar]
  • 31.Sergerie K, Chochol C, Armony JL. The role of the amygdala in emotional processing: a quantitative meta-analysis of functional neuroimaging studies. Neurosci Biobehav Rev. (2008) 32:811–30. 10.1016/j.neubiorev.2007.12.002 [DOI] [PubMed] [Google Scholar]
  • 32.Rapcsak SZ, Galper SR, Comer JF, Reminger SL, Nielsen L, Kaszniak AW, et al. Fear recognition deficits after focal brain damage: a cautionary note. Neurology. (2000) 54:575. 10.1212/WNL.54.3.575 [DOI] [PubMed] [Google Scholar]
  • 33.Radice-Neumann D, Zupan B, Tomita M, Willer B. Training emotional processing in persons with brain injury. J Head Trauma Rehabil. (2009) 24:313–23. 10.1097/HTR.0b013e3181b09160 [DOI] [PubMed] [Google Scholar]
  • 34.Yuvaraj R, Murugappan M, Norlinah MI, Sundaraj K, Khairiyah M. Review of emotion recognition in stroke patients. Dement Geriatr Cogn Disord. (2013) 36:179–96. 10.1159/000353440 [DOI] [PubMed] [Google Scholar]
  • 35.Babbage DR, Yim J, Zupan B, Neumann D, Tomita MR, Willer B. Meta-analysis of facial affect recognition difficulties after traumatic brain injury. Neuropsychology. (2011) 25:277–85. 10.1037/a0021908 [DOI] [PubMed] [Google Scholar]
  • 36.Milders M, Fuchs S, Crawford JR. Neuropsychological impairments and changes in emotional and social behaviour following severe traumatic brain injury. J Clin Exp Neuropsychol. (2003) 25:157–72. 10.1076/jcen.25.2.157.13642 [DOI] [PubMed] [Google Scholar]
  • 37.Genova HM, Genualdi A, Goverover Y, Chiaravalloti ND, Marino C, Lengenfelder J. An investigation of the impact of facial affect recognition impairments in moderate to severe TBI on fatigue, depression, and quality of life. Soc Neurosci. (2017) 12:303–7. 10.1080/17470919.2016.1173584 [DOI] [PubMed] [Google Scholar]
  • 38.Rigon A, Voss MW, Turkstra LS, Mutlu B, Duff MC. Different aspects of facial affect recognition impairment following traumatic brain injury: the role of perceptual and interpretative abilities. J Clin Exp Neuropsychol. (2018) 40:805–19. 10.1080/13803395.2018.1437120 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 39.Rosenberg H, McDonald S, Dethier M, Kessels RPC, Westbrook RF. Facial emotion recognition deficits following moderate-severe traumatic brain injury (TBI): re-examining the valence effect and the role of emotion intensity. J Int Neuropsychol Soc. (2014) 20:994–1003. 10.1017/S1355617714000940 [DOI] [PubMed] [Google Scholar]
  • 40.Lancelot C, Gilles C. How does visual context influence recognition of facial emotion in people with traumatic brain injury? Brain Inj. (2019) 33:4–11. 10.1080/02699052.2018.1531308 [DOI] [PubMed] [Google Scholar]
  • 41.McDonald S. Impairments in social cognition following severe traumatic brain injury. J Int Neuropsychol Soc. (2013) 19:231–46. 10.1017/S1355617712001506 [DOI] [PubMed] [Google Scholar]
  • 42.Vallat-Azouvi C, Azouvi P, Le-Bornec G, Brunet-Gouet E. Treatment of social cognition impairments in patients with traumatic brain injury: a critical review. Brain Inj. (2019) 33:87–93. 10.1080/02699052.2018.1531309 [DOI] [PubMed] [Google Scholar]
  • 43.Tippett DC, Godin BR, Oishi K, Oishi K, Davis C, Gomez Y, et al. Impaired recognition of emotional faces after stroke involving right amygdala or insula. Semin Speech Lang. (2018) 39:87–99. 10.1055/s-0037-1608859 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 44.Abbott JD, Cumming G, Fidler F, Lindell AK. The perception of positive and negative facial expressions in unilateral brain-damaged patients: a meta-analysis. Laterality. (2013) 18:437–59. 10.1080/1357650X.2012.703206 [DOI] [PubMed] [Google Scholar]
  • 45.Abbott JD, Wijeratne T, Hughes A, Perre D, Lindell AK. The perception of positive and negative facial expressions by unilateral stroke patients. Brain Cogn. (2014) 86:42–54. 10.1016/j.bandc.2014.01.017 [DOI] [PubMed] [Google Scholar]
  • 46.Delazer M, Sojer M, Ellmerer P, Boehme C, Benke T. Eye-tracking provides a sensitive measure of exploration deficits after acute right MCA stroke. Front Neurol. (2018) 9:359. 10.3389/fneur.2018.00359 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 47.Lech M, Kucewicz MT, Czyzewski A. Human computer interface for tracking eye movements improves assessment and diagnosis of patients with acquired brain injuries. Front Neurol. (2019) 10:6. 10.3389/fneur.2019.00006 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 48.Spikman JM, Milders MV, Visser-Keizer AC, Westerhof-Evers HJ, Herben-Dekker M, van der Naalt J. Deficits in facial emotion recognition indicate behavioral changes and impaired self-awareness after moderate to severe traumatic brain injury. PLoS ONE. (2013) 8:e0065581. 10.1371/journal.pone.0065581 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 49.Knox L, Douglas J. Long-term ability to interpret facial expression after traumatic brain injury and its relation to social integration. Brain Cogn. (2009) 69:442–9. 10.1016/j.bandc.2008.09.009 [DOI] [PubMed] [Google Scholar]
  • 50.Struchen MA, Clark AN, Sander AM, Mills MR, Evans G, Kurtz D. Relation of executive functioning and social communication measures to functional outcomes following traumatic brain injury. NeuroRehabilitation. (2008) 23:185–98. 10.3233/NRE-2008-23208 [DOI] [PubMed] [Google Scholar]
  • 51.Ferro JM, Caeiro L, Santos C. Poststroke Emotional and Behavior Impairment: a narrative review. Cerebrovasc Dis. (2009) 27:197–203. 10.1159/000200460 [DOI] [PubMed] [Google Scholar]
  • 52.Bortolon C, Capdevielle D, Raffard S. Face recognition in schizophrenia disorder: a comprehensive review of behavioral, neuroimaging and neurophysiological studies. Neurosci Biobehav Rev. (2015) 53:79–107. 10.1016/j.neubiorev.2015.03.006 [DOI] [PubMed] [Google Scholar]
  • 53.Harms MB, Martin A, Wallace GL. Facial emotion recognition in autism spectrum disorders: a review of behavioral and neuroimaging studies. Neuropsychol Rev. (2010) 20:290–322. 10.1007/s11065-010-9138-6 [DOI] [PubMed] [Google Scholar]
  • 54.Folstein MF, Folstein SE, McHugh PR. “Mini-mental state”. A practical method for grading the cognitive state of patients for the clinician. J Psychiatr Res. (1975) 12:189–98. 10.1016/0022-3956(75)90026-6 [DOI] [PubMed] [Google Scholar]
  • 55.Romero M, Sánchez A, Marín C, Navarro MD, Ferri J, Noé E. Clinical usefulness of the Spanish version of the Mississippi Aphasia Screening Test (MASTsp): validation in stroke patients. Neurol English Ed. (2012) 27:216–24. 10.1016/j.nrleng.2011.06.001 [DOI] [PubMed] [Google Scholar]
  • 56.Lundqvist D, Flykt A, Öhman A. The Karolinska Directed Emotional Faces – KDEF. Stockholm: CD ROM from Department of Clinical Neuroscience, Psychology section, Karolinska Institutet; (1998). [Google Scholar]
  • 57.Aguillon-Hernandez N, Roché L, Bonnet-Brilhault F, Roux S, Barthelemy C, Martineau J. Eye movement monitoring and maturation of human face exploration. Med Princ Pract. (2016) 25:548–54. 10.1159/000447971 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 58.Mathôt S, Fabius J, Van Heusden E, Van der Stigchel S. Safe and sensible preprocessing and baseline correction of pupil-size data. Behav Res Methods. (2018) 50:94–106. 10.3758/s13428-017-1007-2 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 59.Green C, Guo K. Factors contributing to individual differences in facial expression categorisation. Cogn Emot. (2018) 32:37–48. 10.1080/02699931.2016.1273200 [DOI] [PubMed] [Google Scholar]
  • 60.Burley DT, Gray NS, Snowden RJ. As far as the eye can see: relationship between psychopathic traits and Pupil response to affective stimuli. PLoS ONE. (2017) 12:e0167436. 10.1371/journal.pone.0167436 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 61.Partala T, Surakka V. Pupil size variation as an indication of affective processing. Int J Hum Comput Stud. (2003) 59:185–98. 10.1016/S1071-5819(03)00017-X [DOI] [Google Scholar]
  • 62.Babiker A, Faye I, Malik A. Differentiation of pupillary signals using statistical and functional analysis. In: 5th International Conference on Intelligent and Advanced Systems (ICIAS). Kuala Lumpur: (2014). p. 1–6. [Google Scholar]
  • 63.Gotham KO, Siegle GJ, Han GT, Tomarken AJ, Crist RN, Simon DM, et al. Pupil response to social-emotional material is associated with rumination and depressive symptoms in adults with autism spectrum disorder. PLoS ONE. (2018) 13:e0200340. 10.1371/journal.pone.0200340 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 64.Demenescu LR, Mathiak KA, Mathiak K. Age- and gender-related variations of emotion recognition in pseudowords and faces. Exp Aging Res. (2014) 40:187–207. 10.1080/0361073X.2014.882210 [DOI] [PubMed] [Google Scholar]
  • 65.Grainger SA, Henry JD, Phillips LH, Vanman EJ, Allen R. Age deficits in facial affect recognition: the influence of dynamic cues. J Gerontol Ser B Psychol Sci Soc Sci. (2017) 72:622–632. 10.1093/geronb/gbv100 [DOI] [PubMed] [Google Scholar]
  • 66.Wagner HL. On measuring performance in category judgment studies of nonverbal behavior. J Nonverbal Behav. (1993) 17:3–28. 10.1007/BF00987006 [DOI] [Google Scholar]
  • 67.Bradley MM, Lang PJ. Memory, emotion, and pupil diameter: repetition of natural scenes. Psychophysiology. (2015) 52:1186–93. 10.1111/psyp.12442 [DOI] [PubMed] [Google Scholar]
  • 68.Larsen RS, Waters J. Neuromodulatory correlates of pupil dilation. Front Neural Circuits. (2018) 12:21. 10.3389/fncir.2018.00021 [DOI] [PMC free article] [PubMed] [Google Scholar]
