Author manuscript; available in PMC: 2019 Jul 25.
Published in final edited form as: IEEE Trans Neural Syst Rehabil Eng. 2016 Jul 14;25(6):739–749. doi: 10.1109/TNSRE.2016.2591556

Design of a Virtual Reality System for Affect Analysis in Facial Expressions (VR-SAAFE); Application to Schizophrenia

E Bekele 1, D Bian 2, J Peterman 3, S Park 4, N Sarkar 5
PMCID: PMC6657800  NIHMSID: NIHMS887502  PMID: 27429438

Abstract

Schizophrenia is a life-long, debilitating psychotic disorder with poor outcome that affects about 1% of the population. Although pharmacotherapy can alleviate some of the acute psychotic symptoms, residual social impairments present a significant barrier that prevents successful rehabilitation. With limited resources and access to social skills training opportunities, innovative technology has emerged as a potentially powerful tool for intervention. In this paper, we present a novel virtual reality (VR)-based system for understanding facial emotion processing impairments that may lead to poor social outcome in schizophrenia. We henceforth call it a VR System for Affect Analysis in Facial Expressions (VR-SAAFE). This system integrates a VR-based task presentation platform that can minutely control the facial expressions of an avatar, with or without accompanying verbal interaction, with an eye tracker that quantitatively measures a participant's real-time gaze and a set of physiological sensors that infer his/her affective states, to allow an in-depth understanding of the emotion recognition mechanisms of patients with schizophrenia based on quantitative metrics. A usability study with 12 patients with schizophrenia and 12 healthy controls was conducted to examine processing of the emotional faces. Preliminary results indicated significant differences in the way patients with schizophrenia processed and responded to the emotional faces presented in the VR environment compared with healthy control participants. These preliminary results underscore the utility of a VR-based system that enables precise and quantitative assessment of social skill deficits in patients with schizophrenia.

Index Terms: Virtual Reality, Affective Computing, Eye Gaze, Physiology, Schizophrenia Therapy

I. INTRODUCTION

SCHIZOPHRENIA is a debilitating psychotic disorder that affects about 1% of the population and is one of the most disabling conditions in terms of occupational and social functioning [1]. Schizophrenia is characterized by hallucinations, delusions, thought disorders, and abnormal social and affective behaviors, reflecting fragmentation and disintegration of cognitive, emotional and social functions [2], [3]. Some of the psychotic symptoms such as hallucinations and thought disorder are partly ameliorated by antipsychotic drugs, but the route to recovery is mainly hampered by social impairments [4]. Marked social impairments are present at all stages of this illness, beginning well before the first signs of florid psychosis and persisting throughout the course of illness [5], [6], but currently there are no effective pharmacological treatments targeting the social impairments. Behavioral treatments have been developed to address these social deficits, with some far-transfer functional improvements [7]. Many of these behavioral treatments focus on improving facial emotion recognition through various training programs [7]–[9]. Thus, understanding the causal mechanisms of these social cognitive deficits would be the first step toward designing and implementing effective social interventions. Moreover, developing innovative quantitative methods of measuring social cognitive deficits in schizophrenia will contribute towards elucidating the neural and cognitive bases of abnormal social behavior.

A relatively unexplored aspect of emotion recognition deficits in schizophrenia is the emotional state of the individual during emotion perception. Previous research in individuals with unipolar depression indicates an emotion recognition deficit when individuals were experiencing an active mood episode [10], [11]. The emotional experiences and expressions of individuals with schizophrenia, and in particular the impaired coherence of their emotional responses, have long been of interest. Various studies investigated the self-report of emotional experiences in response to film clips [12], [13] and still images [14]. Individuals with schizophrenia report experiencing the same or even stronger emotions compared with controls [15], but when presented with emotionally evocative stimuli, individuals with schizophrenia do not show facial expressions of emotion even though they report experiencing an emotion [16]–[18]. Therefore, it is important to investigate the potential interplay between emotion recognition deficits and aberrations of emotional responsivity. Furthermore, by elucidating the role of internal emotional experience during the recognition of others' emotions, we may be able to develop treatments that take this interplay into account.

Much of the emotion research in schizophrenia has focused on facial emotion recognition using static pictures of faces. However, outside of the laboratory, emotions are expressed, perceived and experienced dynamically over time. A virtual reality (VR) interface allows the use of stimuli that convey the temporal nature of emotional expressions, which is lost in traditional static emotion recognition tasks. Thus, a VR environment provides a more ecologically valid simulation of an emotional expression, with increased precision in its temporal dynamics. Recent reviews of VR-based treatments for psychiatric disorders indicate that they can be highly effective [19], [20]. For example, VR-based training improved cognitive functioning and vocational outcome in schizophrenic patients, and the efficacy of VR training was greater than that of therapist-based training [21]. VR-based treatments may show significant efficacy because the VR environment allows realistic simulations of social interactions that may take place in the real world. Repeated practice of social skills with an emotionally expressive avatar in the VR environment supports social learning. Presenting dynamic stimuli to evoke emotional experiences, while recording physiological signals from the peripheral nervous system and real-time eye gaze patterns, provides an effective strategy for investigating the psychophysiological correlates of emotions felt by individuals with schizophrenia. Thus, the apparent disconnect between the outward display of emotion by SZ patients and their subjective, internal feelings could be studied by examining the involuntary peripheral physiological responses of the autonomic nervous system. Moreover, the ability to monitor eye gaze will help clarify the role of visual attention in emotion processing, and how gaze patterns differ between schizophrenia patients and healthy participants. However, to our knowledge, there is no existing system that provides such functionalities. These studies may eventually lead to using these internal measures to model the emotional state of the patient for an online, closed-loop interaction for SZ intervention in the future.

In this work, we present a customized VR-based system for SZ intervention that was originally designed for adolescents with autism [22]. This system is capable of presenting various dynamic emotional facial expressions of virtual avatars at modifiable intensities (e.g., high anger, medium happiness, etc.) within a social context. The system is integrated with a peripheral physiological signal measurement system and an eye tracker to precisely index a participant's physiological and eye gaze responses while he/she is engaged in an emotion recognition task. This novel system is then used in a usability study to assess its usefulness in understanding emotion processing in SZ patients. Thus, the primary contributions of this paper are twofold: 1) the design and development of a novel SZ intervention system for emotion recognition tasks, and 2) an assessment of the usefulness of such a system in a usability study to generate objective data that might potentially be helpful in understanding the underlying emotion processing mechanisms of patients with SZ.

The rest of the paper is organized as follows. Section 2 presents a brief survey of related work in the area of VR based SZ intervention. Section 3 presents the design and development of the integrated VR-based system. Sections 4 and 5 present the methods and procedure of the usability study, and its results, respectively. Finally, conclusions and contributions of the paper are summarized in Section 6.

II. RELATED WORK AND SCOPE FOR CONTRIBUTION

SZ is characterized by social and emotional impairments, including deficits in emotion identification and processing [3]. More specifically, disparate emotional experiences are of particular interest in SZ intervention [2]. Traditionally, emotion recognition and affect analysis in response to stimuli in SZ have been investigated using static pictures and isolated video clips [7], [23]. However, more natural, contextual emotion recognition requires dynamic stimulus presentation and emotional response analysis. The use of VR technology holds promise for SZ intervention for reasons of flexibility, controllability, and adaptability. In addition, integrating VR technology with physiological and eye gaze monitoring may lead to a more in-depth understanding of emotion processing than static images and isolated, non-interactive video clips can provide.

A. VR for social intervention in Schizophrenia

In the context of technology-based SZ intervention, VR systems have been investigated with SZ for symptom assessment [24], training of medication management skills [25], hallucinations [26], social perception [27], role playing [28], improving the diagnosis of SZ [29], and job training [21]. The use of VR circumvents the limitations of conventional social interventions (e.g., the time and effort required of therapists, and the lack of personalization). Rapidly advancing technologies such as VR offer many advantages for personalized intervention: flexibility, controllability, an extensive (almost infinite) repertoire of stimuli, low burden, low cost, and safety [30]. A few studies have examined emotion recognition using dynamic virtual faces and found performance deficits in individuals with schizophrenia [31], [32]. Such results indicate a clear need to incorporate VR in the training and treatment of emotion recognition in social interactions. These previous interventions, however, relied on user reporting and outward measures of performance. Self-reports from SZ patients may be unreliable [8], and outward performance measures alone may not be sufficient to understand the emotion processing mechanism. Monitoring physiological responses and eye gaze indices, in addition to performance measures, may offer an opportunity to delve more deeply into the emotion processing mechanism. Thus, incorporating affect analysis into these VR treatments could provide a greater understanding of the individual's response during treatment and could allow the treatment to be adjusted for optimal skill building in the future.

B. Physiology and Eye gaze in Schizophrenia

The apparent disconnect between internal subjective experience and the outward expression of emotional states presents a complex problem for understanding emotional deficits in individuals with schizophrenia. Early investigations into the physiological reactivity of individuals with schizophrenia using galvanic skin response (GSR), a measure of autonomic arousal, suggested that subgroups of patients with schizophrenia show different GSR profiles in response to the same stimuli [33]. However, some other studies have not found a difference between individuals with schizophrenia and controls using emotionally evocative images [1], [34]. The use of electromyography (EMG) to measure the activity of muscles involved in the production of emotional expressions (e.g., the zygomaticus major and the corrugator supercilii) has provided insight into the expressive deficit in individuals with SZ [35]–[37]. Although visible expressions are reduced in individuals with SZ, Kring and colleagues [18] found that patients and controls showed similar valence-dependent zygomatic activity (e.g., greater activity in response to positive images) and corrugator activity (e.g., greater activity in response to negative images). Other studies have shown reduced facial muscle activity in response to emotionally arousing stimuli in patients, but with the muscles still activated in a valence-congruent way [36], [37]. These EMG studies suggest that while patients may not produce overt, emotionally congruent expressions, they engage the musculature associated with these expressions, albeit at an attenuated level. Other physiological modalities such as breathing rate, heart rate, and systolic blood pressure have been employed in studies involving individuals with SZ [34]. Additionally, several studies have explored the eye gaze patterns of individuals with SZ [38], [39]. Moreover, the startle eye blink paradigm has revealed a reversed pattern of emotional response in individuals with schizophrenia [40], [41]: typically, the startle eye blink response is attenuated when paired with a positively valenced stimulus and enhanced when paired with a negatively valenced stimulus [42].

C. Scope for contributions

As the literature surveyed above shows, there exist a large number of studies on physiology- and eye gaze-based affect analysis of SZ patients. Most of these results, however, are based on static stimuli or video clips. In recent years, VR has been used for SZ intervention, but mostly as a performance-based system without any built-in internal measures. The scope of the present work is to bridge this gap and develop a VR-based SZ intervention system that is integrated with both peripheral physiology and eye gaze monitoring, with the aim of precisely understanding the implicit response to VR-based dynamic representations of facial expressions. We believe that by precisely controlling emotional expressions in VR and gathering objective, individualized eye gaze and physiological responses related to emotion recognition, as well as performance data, new efficient intervention paradigms can be developed in the future. The presented novel VR-based system and the findings of this study may inform the future development of affect-sensitive virtual social interaction tasks for SZ intervention.

III. SYSTEM DESIGN

In this section we describe how the overall integrated system was designed. The system was originally designed for autism spectrum disorder (ASD) intervention [22]. Given the several commonalities in social impairments between SZ and ASD [43], such as atypical emotion processing and lack of affective display, the original system was deemed appropriate for use with SZ patients with minor modifications. Since the readership for ASD and SZ is likely to differ, and since the experiment cannot be described without system details, we present the complete VR-based system in this paper. The VR system was composed of three major components: the task presentation environment, the eye tracking component, and the peripheral physiology monitoring component. The task presentation environment was based on the Unity3D game engine by Unity Technologies (http://unity3d.com). A remote desktop eye tracker by Tobii Technologies (www.tobii.com), the Tobii X120, was employed for gaze tracking. A wireless physiological signal acquisition device, the BioNomadix by Biopac Inc. (www.biopac.com), with 8 channels of peripheral physiology electrodes, was used to record the physiological signals. Each component ran separately while communicating via a network interface (Fig. 1).
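The paper does not specify the message format used over this network interface. As a rough illustration only, the sketch below shows one simple way a task engine could broadcast timestamped event markers to separate logging applications so that gaze and physiology streams can be aligned offline; the endpoints, ports, and text protocol are hypothetical, not the system's actual implementation.

```python
# Hypothetical event-marker broadcast for aligning data streams offline.
# Endpoints, ports, and message format are illustrative assumptions.
import socket
import time

LOGGER_ENDPOINTS = [
    ("127.0.0.1", 5005),  # hypothetical eye-tracker logger
    ("127.0.0.1", 5006),  # hypothetical physiology logger
]

def broadcast_event(label: str) -> None:
    """Send a timestamped marker so every logger can align its stream."""
    msg = f"{time.time():.3f}\t{label}".encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        for endpoint in LOGGER_ENDPOINTS:
            sock.sendto(msg, endpoint)

# Example: called by the task engine at trial boundaries.
# broadcast_event("TRIAL_START emotion=anger level=high")
```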

Fig. 1. Overall system diagram.

A. The VR Emotion Presentation Engine

Facial emotional expressions and lip-syncing were the major animations of this project. A range of weights from 0 (no deformation) to 20 (maximum deformation) was assigned for each emotional expression. The seven universally accepted emotional expressions proposed by Ekman were used in this project [44]: enjoyment, surprise, contempt, sadness, fear, disgust, and anger. Each facial expression had four arousal levels: low, medium, high, and extreme. The four levels were chosen through careful evaluation by the clinical psychologists involved in this project. The facial expressions were constructed based on the facial expressions of taped actors and were used in an earlier study for children with autism, where they were validated by a group of typical college students (n = 9). In addition to the facial expressions, seven phonetic viseme poses were created using the same set-driven key controller technique. The phonemes were L, E, M, A, U, O, and I. These phonemes were used to create lip-synced speech animations for storytelling; the stories gave context to the emotional expressions. A total of 16 stories were lip-synced for each character. The avatars were customized and rigged using an online animation service, Mixamo (www.mixamo.com), together with Autodesk Maya (www.autodesk.com/maya). All the facial expressions and the lip-syncing for the contextual stories narrated by the avatars were animated in Maya. Seven avatars, 4 boys and 3 girls, were selected. Close to 20 facial bone rigs were controlled by set-driven key controllers to produce realistic facial expressions and the phonetic visemes for lip-syncing. A total of 252 animations (16 lip-synced stories plus 20 emotional expressions and a neutral pose for each character) were developed and imported into the Unity3D game engine for task presentation.
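To make the parameterization concrete, the sketch below maps an (emotion, arousal level) pair onto the paper's 0 (no deformation) to 20 (maximum deformation) blend-shape weight scale. The evenly spaced per-level weights are an assumption for illustration; the project's actual weight settings per expression are not reported.

```python
# Illustrative (emotion, arousal level) -> deformation weight mapping on the
# 0-20 scale. The per-level values are assumptions, not the project's actual
# settings.
EMOTIONS = ["enjoyment", "surprise", "contempt", "sadness",
            "fear", "disgust", "anger"]
LEVEL_WEIGHTS = {"low": 5, "medium": 10, "high": 15, "extreme": 20}

def expression_weight(emotion: str, level: str) -> int:
    """Return the deformation weight driving the rig for one expression."""
    if emotion not in EMOTIONS:
        raise ValueError(f"unknown emotion: {emotion}")
    return LEVEL_WEIGHTS[level]

# e.g., expression_weight("anger", "high") -> 15
```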

Before proceeding any further, we conducted a separate usability study with a control group of college students to determine which of these emotions were ambiguous in our design. We found that most subjects confused contempt with disgust, and fear with anger. We therefore dropped contempt and fear and used the remaining five emotions in our tasks. The created animations and characters were imported into Unity, which served as the main VR engine for presenting the facial expressions of emotion.

B. Peripheral Monitoring Systems

The Tobii X120 eye tracker recorded at a 120 Hz frame rate, allowing free head movement within a 30 × 22 × 30 cm (width × height × depth) volume at a distance of approximately 70 cm. We used two applications connected to the eye tracker: one for diagnostic visualization as the experiment progressed, and another to record, preprocess, and log the eye tracking data. The main eye tracker application computed eye physiological indices (PI) such as pupil diameter (PD) and blink rate (BR), and behavioral indices (BI) [45] such as fixation duration (FD), from the raw gaze data (Fig. 2).
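The paper does not state which algorithms the logging application used to derive these indices; the sketch below shows one common approach, assuming blink detection from blink-length gaps in sample validity and a dispersion-threshold (I-DT) fixation detector, with all thresholds being illustrative assumptions.

```python
# A sketch of the three basic eye indices (PD, BR, FD) from raw gaze samples.
# Thresholds are illustrative; gaps/NaNs are assumed handled upstream for FD.
import numpy as np

FS = 120.0  # Tobii X120 sampling rate (Hz)

def mean_pupil_diameter(pd_samples: np.ndarray) -> float:
    """Mean pupil diameter (mm) over valid (non-NaN) samples."""
    return float(np.nanmean(pd_samples))

def blink_rate(valid: np.ndarray) -> float:
    """Blinks/min: runs of invalid samples of blink-like length
    (~25-500 ms, i.e., 3-60 samples at 120 Hz) count as blinks."""
    blinks, run = 0, 0
    for v in np.append(valid, True):  # sentinel closes a trailing run
        if not v:
            run += 1
        else:
            blinks += 3 <= run <= 60
            run = 0
    return blinks / (len(valid) / FS / 60.0)

def mean_fixation_duration(x, y, disp=30.0, min_samples=12) -> float:
    """I-DT: grow a window while x-range + y-range <= disp (pixels);
    windows of at least min_samples (100 ms here) count as fixations."""
    durations, start = [], 0
    for end in range(len(x)):
        win = slice(start, end + 1)
        if np.ptp(x[win]) + np.ptp(y[win]) > disp:
            if end - start >= min_samples:
                durations.append((end - start) / FS * 1000.0)  # ms
            start = end
    return float(np.mean(durations)) if durations else 0.0
```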

Fig. 2. The eye tracker application. The application computed the three basic eye indices, i.e., pupil diameter (PD), blink rate (BR), and fixation duration (FD), online and logged them for further offline analysis.

The wireless BioNomadix physiological monitoring system, with a total of 8 channels of physiological signals, was used at 1000 Hz. The physiological signals monitored were: pulse photoplethysmogram (PPG), skin temperature (SKT), galvanic skin response (GSR), 3 electromyograms (EMG), and respiration (RSP) (Fig. 3). Due to the apparent disconnect between what patients with SZ feel and their outward expressions, they are not usually expressive of their internal affective states, and these states often are not visible externally [46]. Physiological signals may be relatively less affected by these impairments and can be useful in understanding internal psychological states and patterns [1]. Among the signals we monitored, GSR and PPG are directly affected by the sympathetic response of the autonomic nervous system (ANS) [47], although PPG also changes with the parasympathetic response.

Fig. 3. Peripheral physiological electrode placement. The sensors included three EMG sensors, two on the face and one on the upper shoulder as shown in the figure, with PPG, GSR, and SKT on the fingers: the two black strips are the bipolar GSR electrodes placed on the distal phalanges of the index and ring fingers, with SKT on the middle finger and PPG on the thumb.

C. Offline Data Analysis

The collected physiological data were processed to extract useful features and decipher any differences between the two subject groups, patients with SZ and a control group, across the selected emotional expression presentation conditions and a neutral baseline condition. We extracted 81 features from PPG, GSR, RSP, SKT, and EMG for this analysis. These features were chosen because of their correlation with engagement and the emotion recognition process, as noted in the psychophysiology literature [1], [47]–[49]. The PPG signal was used to extract heart rate (HR), a cardiac index used to measure stress and certain emotions [22]. The GSR was decomposed into its two major components, i.e., the phasic and tonic components, from which features such as the skin conductance response rate (SCRrate) and the mean skin conductance level (SCL) were extracted. The RSP signal was used to extract the breathing rate (BR), and the mean skin temperature was obtained from the SKT signal. The EMG signal is also well studied in facial emotion recognition tasks. For the eye tracking data, we extracted the following features: pupil diameter (PD), fixation duration (FD), sum of fixation counts (SFC), saccade path length (SPL), and blink rate (BR). The statistical analyses employed to investigate possible group differences in the physiological, electromyographic, and eye tracking variables are described in the following section.
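The paper does not give the filter settings or peak-detection criteria used for these features. The sketch below illustrates the general recipe under simple assumptions: a moving-average tonic/phasic split for GSR (dedicated decomposition tools would be more faithful) and threshold-based peak picking for HR, SCR rate, and BR.

```python
# Illustrative extraction of HR, SCL, SCRrate, and BR; window lengths,
# peak heights, and minimum peak distances are assumptions.
import numpy as np
from scipy.signal import find_peaks

FS = 1000.0  # BioNomadix sampling rate (Hz)

def heart_rate(ppg: np.ndarray) -> float:
    """Beats/min from PPG pulse peaks (>= 0.4 s apart, i.e., < 150 bpm)."""
    peaks, _ = find_peaks(ppg, distance=int(0.4 * FS))
    return len(peaks) / (len(ppg) / FS / 60.0)

def gsr_features(gsr: np.ndarray, win_s: float = 4.0):
    """Tonic = slow moving average; phasic = residual. Returns mean SCL
    and SCR rate (phasic peaks/min above a 0.05 uS threshold)."""
    win = int(win_s * FS)
    tonic = np.convolve(gsr, np.ones(win) / win, mode="same")
    phasic = gsr - tonic
    scrs, _ = find_peaks(phasic, height=0.05, distance=int(1.0 * FS))
    return float(np.mean(tonic)), len(scrs) / (len(gsr) / FS / 60.0)

def breathing_rate(rsp: np.ndarray) -> float:
    """Breaths/min from respiration-belt peaks (>= 1.5 s apart)."""
    peaks, _ = find_peaks(rsp, distance=int(1.5 * FS))
    return len(peaks) / (len(rsp) / FS / 60.0)
```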

IV. METHODS AND PROCEDURE

A. Experimental Setup

The presentation engine ran on Unity, while eye tracking and peripheral physiological monitoring were performed in parallel by separate applications on separate machines that communicated with the Unity-based presentation engine via a network interface. The VR task was presented on a 24-inch flat LCD monitor (1920 × 1080 resolution), while the IAPS pictures were presented on the same monitor at a resolution of 1024 × 768 in order to preserve the original resolution of the images. The experiment was performed in a laboratory with two rooms separated by one-way glass windows for observation. The researchers sat in the outer room. In the inner room, the subject sat in front of the task computer, whose display was also routed to the outer room for observation by the researchers. The session was video recorded for the whole duration of participation. The study was approved by the Institutional Review Board (IRB) of Vanderbilt University.

B. Subjects

A total of 12 patients with SZ (male: n=8, female: n=4; age M=45.7, SD=9.4) and 12 age- and IQ-matched healthy non-psychiatric controls (male: n=6, female: n=6; age M=44.9, SD=9.9) were recruited and participated in the usability study. All patient subjects were recruited through existing clinical research programs and had an established clinical diagnosis (Table I).

TABLE I.

Profile of the 12 Subjects in the SZ Group and the CTR Group

Demographic Information   CTR Mean (SD)    SZ Mean (SD)     t            p
Age                       44.9 (9.9)       45.7 (9.4)       -0.21        0.83
Gender                    6 F / 6 M        4 F / 8 M        chi2 = 0.68  0.40
Education, years          13.4 (2.1)       15.4 (2.2)       -2.2         0.03
IQ                        107.1 (6.8)      106.4 (7.4)      0.22         0.82
SAPS                      N/A              12.9 (7.7)       N/A          N/A
SANS                      N/A              24.5 (11.4)      N/A          N/A
SPQ Positive              2.75 (4.1)       N/A              N/A          N/A
SPQ Negative              4.00 (3.8)       N/A              N/A          N/A
SPQ Disorganized          2.58 (3.8)       N/A              N/A          N/A
CPZ (mg/day)              N/A              359.62 (247.6)   N/A          N/A

Premorbid intelligence was assessed using the North American Adult Reading Test [50]. Antipsychotic medication dose is reported as the chlorpromazine equivalent (mg/day) [51]. Symptoms over the past month were assessed with semi-structured clinical interviews: the Brief Psychiatric Rating Scale (BPRS) [52], the Scale for the Assessment of Positive Symptoms (SAPS) [3], [53], and the Scale for the Assessment of Negative Symptoms (SANS) [53]. SPQ = Schizotypal Personality Questionnaire; CPZ = chlorpromazine equivalent dose; N/A = not applicable.

The control group was recruited from the local community. The IQ measures were used to potentially screen for intellectual competency to complete the tasks.

C. Tasks and Procedure

The VR-based system presented a total of 20 trials corresponding to the 5 emotional expressions, each expression having 4 levels. Each trial was 12–15 s long. In each trial, the character first narrated a context story that was linked to the emotional expression displayed for the following 5 seconds; the avatar exhibited a neutral face during storytelling. A separate VR session was also conducted with just the facial expressions, without the vignette stories, to compare against the VR session with contextual stories and thereby control for the effect of the contextual story on facial emotion recognition. The two tasks were presented in randomized order to avoid ordering effects. A typical laboratory visit was approximately 1 hour and 30 minutes long. During the first 15 minutes, a trained therapist prepared the subject for the experiment by placing the physiological sensors on the participant. Before the task began, the eye tracker was calibrated using a fast nine-point calibration that took about 10–15 s. After this procedure, the subject was instructed to sit still in the absence of stimuli so that baseline physiological responses could be collected. At the start of each task, a welcome screen greeted the subject and described what was about to happen and how the subject was to interact with the system. Immediately after the welcome screen, the trials started. At the end of each trial, a questionnaire popped up asking the subject which emotion he/she thought the avatar displayed and how confident he/she was in that choice. To avoid other confounding factors arising from the context stories, the stories were recorded in a monotone and the avatars displayed no specific facial expression during the context period of the first VR task.

D. Statistical Analysis

For the eye tracking and physiological variables listed in Section III-C (Offline Data Analysis), pairwise unequal-variance (Welch) t-tests were performed. Due to a recording device malfunction that corrupted the eye movement data of one individual in the patient group, the analysis of the eye tracking variables included 11 participants in the patient group and 12 in the control group. The behavioral indices (% accuracy, confidence, and latency of response) were analyzed by repeated measures ANOVA, with task type (VR1 and VR2) as the within-group variable and group (SZ and CTR) as the between-groups variable.
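The statistical software used is not reported; the sketch below reproduces the two reported analyses under common tooling assumptions: SciPy's Welch t-test for the feature comparisons, and pingouin's mixed ANOVA for the 2 (group) × 2 (task) behavioral design.

```python
# Sketch of the reported tests; the choice of SciPy/pingouin is an assumption.
import numpy as np
import pandas as pd
import pingouin as pg
from scipy.stats import ttest_ind

def welch_t(sz: np.ndarray, ctr: np.ndarray):
    """Pairwise unequal-variance (Welch) t-test for one feature."""
    return ttest_ind(sz, ctr, equal_var=False)

def behavioral_anova(df: pd.DataFrame) -> pd.DataFrame:
    """Repeated measures ANOVA with task (VR1/VR2) within subjects and
    group (SZ/CTR) between subjects.
    Expected columns: subject, group, task, accuracy."""
    return pg.mixed_anova(data=df, dv="accuracy", within="task",
                          between="group", subject="subject")
```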

V. RESULTS

The collected eye gaze and physiological data for both groups were analyzed as described in the offline data analysis section above. In this section, we first present the comparative feature-level analysis for both the eye gaze and the physiological data. Subsequently, we present a clustering analysis based on the physiological data to show the discriminability of elevated responses towards the VR emotional stimuli. For the feature-level comparison, five features each from gaze and from the physiological signals were extracted to compare the responses elicited during the facial emotion recognition tasks in the virtual environment. The data from prominent positive (joy and surprise) and prominent negative (anger and disgust) emotions at high and extreme arousal levels were combined to generate the positive and negative category datasets, respectively. We also extracted baseline features for the physiological data to compare against the responses in these categories in the clustering analysis.

A. Eye Gaze Features Analysis

Most of the eye gaze indices showed differences between the two groups of participants (Table II). In all the cases considered, the patient group showed a more fragmented gaze pattern, as indicated by lower SFC, lower FD, and shorter SPL. These indices are known to correlate with one's engagement [54], and pupil constriction has been associated with engagement [9]. Based on these data, therefore, the patients were less engaged than the control group in the VR session. Moreover, notably lower blink rates for the patient group indicated an abnormal gaze pattern, with lower than average blinks per minute while processing such stimuli [54]. Pairwise t-tests indicated that during the positive emotion presentations the patient group had statistically significantly lower fixation durations (SZ M=155.98, CTR M=308.35, p<0.05) and blink rates (SZ M=1.83, CTR M=3.31, p<0.05) than the healthy control group. During the negative emotion presentations, 4 of the 5 eye features, namely PD (SZ M=2.51, CTR M=2.9, p<0.05), FD (SZ M=147.53, CTR M=329.82, p<0.05), SPL (SZ M=58, CTR M=77.55, p<0.05), and BR (SZ M=2.06, CTR M=3.08, p<0.05), were statistically significantly lower for the patient group than for the control group. No statistically significant difference arose between the positive and negative emotion presentations within either group. A similar pattern of lower engagement in the patient group relative to the control group was observed in the VR session even when the emotions were presented without the contextual stories.

TABLE II.

Eye Tracking Features for the VR Session

                      PD (mm)   FD (ms)   SFC (count)   SPL (pix)   BR (bpm)
Positive Category
  SZ   Ave             2.63      155.98      42.58        70.32       1.83
       SD              1.15      174.67      31.00        80.13       1.16
  CTR  Ave             2.87      308.35      51.98        83.81       3.31
       SD              0.39      273.85      42.12        45.32       1.66
Negative Category
  SZ   Ave             2.51      147.53      39.94        58.00       2.06
       SD              1.22      129.06      27.18        40.30       1.49
  CTR  Ave             2.90      329.82      53.94        77.55       3.08
       SD              0.38      304.77      43.55        42.22       1.95

B. Intergroup Eye Gaze Visualization

To determine whether there were any differences in gaze scanning patterns between the patient group and the control group, and within each group between positive and negative emotions, we clustered the gaze points onto distinct regions of the avatars' faces; see [22] for details. For qualitative analysis of these gaze clusters, we generated heat map and masked map visualizations of the scanning gaze patterns to compare group variations. Visualizations were generated for all five emotional expressions presented in the VR experiments, for all avatars, averaged across all participants in each group. In almost all cases, the patients exhibited asymmetrically atypical scanning patterns compared with the control group. Fig. 4 shows an example comparison for one avatar between the patient (top) and control (bottom) groups for the emotion anger, averaged across all trials and all participants within each group. It can be observed that the patients scanned in an asymmetric pattern and focused more on context-irrelevant areas such as the forehead, and even outside the facial region. By contrast, the healthy control participants focused more on context-relevant areas such as the mouth and eyes, with a more symmetric scanning pattern. In summary, the visualizations indicate an atypical facial scanning pattern in the patient group, with a more asymmetric and dispersed gaze, as opposed to the more symmetric gaze of the healthy controls, concentrated on important regions of interest such as the eyes and the mouth.

Fig. 4. Masked maps shown alongside heat map visualizations. The top figure shows the SZ group gaze and the bottom figure the CTR group gaze, aggregated across subjects and trials.

C. Physiological Features Analysis

As shown in Table III, the patient group had higher emotional response indicators, including higher heart rate and skin conductance response rate, and a comparable breathing rate, relative to the control group for both the positive and negative emotion presentations. Both skin temperature (SKT) and skin conductance level (SCL) were statistically significantly different, with patients exhibiting significantly lower SKT and SCL in both the positive (SKT: SZ/CTR M=90.27/94.81; SCL: SZ/CTR M=6.79/9.24, p<0.05) and negative (SKT: SZ/CTR M=90.16/94.96; SCL: SZ/CTR M=6.49/8.79, p<0.05) emotion categories. Also, the breathing rate (BR) of the patient group was higher in the negative than in the positive emotion category (BR SZ P/N M=17.96/22.39, p<0.05).

TABLE III.

Physiological Features for the VR Session

                      HR (bpm)   SKT (°F)   BR (bpm)   SCL (μS)   SCR (rpm)
Positive Category
  SZ   Ave             87.83       90.27      17.96       6.79       3.12
       SD              13.83        9.17       9.84       3.65       2.79
  CTR  Ave             85.43       94.81      18.49       9.24       2.42
       SD              15.41        2.48       6.16       6.87       2.77
Negative Category
  SZ   Ave             87.88       90.16      22.39       6.49       2.91
       SD              14.37        9.10      12.55       3.41       2.70
  CTR  Ave             86.53       94.96      20.67       8.79       2.13
       SD              14.08        2.57      11.23       6.50       2.65

Baseline features were computed from a 3-minute period at the beginning of each session during which participants were monitored without any stimuli. For each group, the baseline physiological response was compared with the responses in the positive and negative conditions. The SZ group showed significantly greater SCR in the positive condition compared with baseline (t(11) = −3.36, p = 0.006). Additionally, there was a trend towards greater SCR in the SZ group in the negative condition compared with baseline (t(11) = −2.06, p = 0.06). The greater SCR of the SZ group in both the positive and negative categories suggests higher phasic arousal irrespective of the stimuli; this result indicates an undifferentiated response to emotional conditions in schizophrenia.

D. Physiological Clustering Analysis

To get a clearer understanding of which features are responsible for the differences between the two groups, the two conditions, and the baseline, we further explored the physiological responses by clustering the data points from trials of the positive and negative conditions using the full 8 channels of physiology and the 81 features extracted from them. Although the limited feature-level comparison with the selected 5 features is easy to apply for demonstrating response differences between and within the groups, feature-level analysis is not suitable for the full 81-feature set.

The clustering analysis helps probe whether the physiological response patterns in the presence of positive and negative emotional stimuli differ from the baseline, and from each other, in both groups. If the clustering analysis reveals clear pattern differences, supervised models could be built for closed-loop, physiology-based online interaction in the future. Two clustering methods were used in this work: the widely used k-means, and a recent technique based on density peaks [55]. We observed that k-means had difficulty extracting non-spherical clusters and hence was not able to capture highly non-linear pattern differences; results from the density-based clustering method outperformed those from k-means for this particular dataset. Therefore, in this paper we present our analysis using the density-based clustering method. Fig. 5 shows an example clustering analysis for the control (top) and SZ (bottom) groups using the density peak clustering method [56]. The decision graph plots, for each data point, the minimum distance to any point of higher density (δ) versus the local density around that point (ρ). Cluster centroids are selected using the product of these two parameters: points with high local density that lie relatively far from any point of higher density are likely cluster centroids. Using both these cues, cluster centroids were chosen, and the resulting clustering is shown in a multidimensional scaling of the pairwise dissimilarities of the data points. For details of the clustering method, please refer to [55]. We compared the negative and positive emotion categories of the VR stimuli for both groups, and additionally compared these two emotional stimulus categories with the baseline physiological responses of each group. We further performed sequential forward feature selection to identify the most prominent features in the physiological responses.
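For concreteness, the sketch below implements the core of the density-peaks method of [55] as described above: each point receives a local density ρ (neighbors within a cutoff distance) and a distance δ to the nearest point of higher density; points with large ρ·δ are taken as centroids, and every remaining point inherits the label of its nearest denser neighbor. The cutoff distance and centroid count are assumptions supplied by the analyst.

```python
# Minimal density-peaks clustering (Rodriguez & Laio style), as a sketch.
import numpy as np
from scipy.spatial.distance import pdist, squareform

def density_peaks(X: np.ndarray, d_c: float, n_clusters: int):
    D = squareform(pdist(X))            # pairwise distances
    n = len(X)
    rho = (D < d_c).sum(axis=1) - 1     # local density (exclude self)
    order = np.argsort(-rho)            # densest point first
    delta = np.full(n, D.max())         # densest point keeps the max
    nearest_denser = np.full(n, -1)
    for rank in range(1, n):
        i = order[rank]
        higher = order[:rank]           # all points denser than i
        j = higher[np.argmin(D[i, higher])]
        delta[i], nearest_denser[i] = D[i, j], j
    # Decision graph: large rho * delta marks cluster centroids.
    centroids = np.argsort(-(rho * delta))[:n_clusters]
    labels = np.full(n, -1)
    labels[centroids] = np.arange(n_clusters)
    for i in order:                     # assign in decreasing-density order
        if labels[i] < 0:
            j = nearest_denser[i]
            labels[i] = labels[j] if j >= 0 else 0
    return labels, rho, delta
```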

Fig. 5. Physiological clustering using the density peaks method. Positive vs. negative categories for SZ (top) and CTR (bottom).

Fig. 6 presents the clustering results from the density clustering method. Accuracy was computed as the percentage of data points correctly clustered according to the ground truth clusters. Each comparison takes two of the following as its two clusters: the two conditions, i.e., the positive (pos) and negative (neg) emotion categories; the two groups, i.e., SZ and CTR; and the baseline (BL). The horizontal axes indicate the features included by the sequential forward selection (SFS) method, where [1, N] denotes features 1 through N inclusive. The top and middle plots show that responses to positive and negative emotions were clearly separable from the baseline in both groups, demonstrating that the VR stimuli were successful in eliciting emotional responses relative to baseline. The results also indicate no clear difference between the responses to the positive and negative emotion categories, as demonstrated by the low accuracy, close to chance (50%), for those two clusters; lower clustering accuracy indicates an absence of differences between the two clusters considered. Of particular importance is the variation across features shown by the SFS results. In both groups and conditions compared with the baseline, the facial EMG features (features 10 to 39) and the respiratory features (features 76 to 81) contributed most to the pronounced emotional responses, with the GSR features (features 40 to 45) the next largest contributors. The cardiovascular features (features 1 to 10) did not produce a sufficient response in either group.
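The sketch below shows the greedy sequential forward selection loop implied by these plots: starting from an empty set, the feature that most improves clustering accuracy is added at each step. The scoring function is a stand-in for the density-peaks clustering accuracy used in the paper.

```python
# Sequential forward selection over feature columns; `score` is assumed to
# return a clustering accuracy in [0, 1] for a given feature subset.
import numpy as np

def sfs(X: np.ndarray, score, max_features: int = 10):
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < max_features:
        best_f, best_acc = None, -np.inf
        for f in remaining:
            acc = score(X[:, selected + [f]])  # try adding feature f
            if acc > best_acc:
                best_f, best_acc = f, acc
        selected.append(best_f)
        remaining.remove(best_f)
    return selected
```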

Fig. 6. Clustering accuracies using the density peaks method.

E. Performance Data Analysis

Three performance metrics were considered for analysis: the percentage of correctly identified emotional expressions; the subjects' confidence, with completely confident being 100% and completely unconfident being 0%; and the time the subjects took to make their selection once the emotion presentation had finished. The performance data were first collapsed into the two virtual reality sessions, i.e., VR facial expressions with vignette stories (VR1) and VR facial expressions without stories (VR2). Fig. 7 indicates performance differences between the two groups in both sessions, with the patients exhibiting lower performance than the control group in both VR1 and VR2. However, these differences were not statistically significant (F(1,22) = 1.94, p = 0.18). Across both groups, performance was significantly better on the VR2 task than on the VR1 task (F(1,22) = 22.26, p<0.0001, Cohen's d = 2.02). These performance differences across both groups for the two tasks could be explained by the vignette stories in VR1. The addition of contextual stories recorded in a monotone (no vocal emotional prosody) may have introduced ambiguity that interfered with identifying the emotion expressed by the avatar. This finding suggests that isolated emotion recognition is easier than emotion recognition in the presence of contextual social interaction. The capacity and flexibility of VR for dynamically presenting and manipulating multiple attributes, features, and components of stimuli are big advantages that traditional static pictures and non-modifiable video clips cannot emulate. A VR environment can be programmed to present a very wide range of vocal intonations and other social interaction parameters to maximize the potential for training the interpretation of emotional faces in a social context.

Fig. 7. Overall performance metrics.

Patients and controls did not differ in their confidence ratings for recognition responses in either VR session (F(1,22) = 0.005, p>0.05). The control group was significantly more confident in VR2 than in VR1 (t(11) = 2.92, p = 0.01, Cohen's d = 0.50). This suggests that the monotone (no prosody) stories could have contributed to the mis-recognition; the introduction of the recorded vignettes may have reduced confidence in the ability to correctly identify the emotion expressed by the avatars. Monotone presentation of social context may therefore not be the best approach moving forward in the development of social training VR programs. On the last performance metric, response latency, the groups did not differ in their response times across both tasks (F(1,22) = 1.91, p>0.05). Across both groups, though, response latencies were longer for the VR1 task than for the VR2 task (F(1,22) = 5.50, p<0.05, Cohen's d = 0.50). The groups did not differ in the degree of slowing between the two tasks (F(1,22) = 0.23, p>0.05).

Emotion recognition performance is known to be biased by a subject's preferred choice regardless of the stimulus presented, i.e., personal bias. To account for personal bias, we first broke the performance down by emotion. The top plot of Fig. 8 shows that, of the 5 emotions presented, the control group outperformed the schizophrenia group except on anger in both VR sessions and on joy (enjoyment) in the VR2 session. Moreover, both groups performed better in the VR2 session than in VR1 for all emotions except disgust for the patient group.

Fig. 8. Per-emotion raw (biased) and bias-corrected performance.

We then computed the response bias, i.e., the degree to which subjects reported a certain emotion regardless of the stimulus. Table IV shows the response counts and percentages. The expected response rate for 5 emotions is 20%, and most of the observed rates are not far from this value, indicating balanced responses; emotions whose response rates deviate substantially from the expected 20% are biased. We used these per-emotion response biases to adjust the raw performance, dividing the raw performance by the percentage bias and rescaling to 100% (a sketch of this adjustment follows Table IV). The bottom plot of Fig. 8 shows the bias-corrected performance. After the bias adjustment, the CTR group performed better than the patients on all emotions except disgust in VR1, and on all emotions except surprise in VR2. The notable result after the bias adjustment is that both groups performed better in VR2 than in VR1 on all the emotions. Taken together, these results suggest that patients do not show an erratic or idiosyncratic response bias to the emotional expressions of the avatars. Such similar performance profiles might indicate that patients are able to pick up on the emotion-specific signals in the expression, albeit at a slightly impaired level.

TABLE IV.

Response Bias of Subjects for Each Emotion

Subject      VR1                        VR2
Response     SZ          CTR            SZ          CTR
             n     %     n     %        n     %     n     %
Anger        56   23.33  49   20.42     55   22.92  41   17.08
Disgust      48   20.00  59   24.58     44   18.33  54   22.50
Joy          35   14.58  32   13.33     46   19.17  35   14.58
Sadness      56   23.33  59   24.58     43   17.92  50   20.83
Surprise     45   18.75  41   17.08     52   21.67  60   25.00
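As referenced above, the bias adjustment divides each emotion's raw accuracy by that emotion's response-bias percentage. The rescaling step is not spelled out in the paper; the sketch below takes one plausible reading, normalizing the adjusted scores so the maximum is 100%.

```python
# Sketch of the per-emotion bias correction; the max-normalization is an
# assumed interpretation of "rescaling to 100%".
import numpy as np

def bias_corrected(raw_acc: np.ndarray, bias_pct: np.ndarray) -> np.ndarray:
    """raw_acc and bias_pct are per-emotion arrays (in %)."""
    adjusted = raw_acc / bias_pct       # penalize over-reported emotions
    return 100.0 * adjusted / adjusted.max()
```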

VI. DISCUSSION AND CONCLUSION

The objective of this research was to design and test a VR-based schizophrenia intervention system that would offer opportunities to analyze the emotion recognition mechanisms of patients with SZ in ways that are not possible with static picture or isolated video clip based protocols. The novel system, VR-SAAFE, presents realistic facial emotional expressions in VR that can be precisely controlled in terms of both valence and arousal, and that can be embedded within contextual vignettes. Furthermore, VR-SAAFE is integrated and synchronized with both physiology and eye gaze data acquisition, so that targeted physiology- and gaze-based analyses can be performed during the emotion recognition tasks. Since emotion recognition is a dynamic process, such capabilities will likely have a positive impact on understanding emotion recognition in patients with SZ and, eventually, on designing new interventions. We are not aware of any similar systems in the literature.

We performed a usability study to test the usefulness of VR-SAAFE. Eye tracking and physiological signals were collected during the VR task and analyzed offline. While these data are preliminary and based on one usability study, they provide interesting findings and thus indicate the potential usefulness of a system such as VR-SAAFE. The results of the gaze and physiological feature-level analyses show that these signals are viable indicators of the internal emotional states of patients with SZ. The patient group overall responded more intensely to the positive emotion presentations than to both the negative and neutral (baseline, in the case of VR) conditions for almost all the features. The analysis of the eye gaze features indicated that the SZ group exhibited shorter fixation durations and scan-path lengths, corroborating previous eye tracking literature using static faces [38], [39], [57]. With shorter scan-path lengths and shorter fixation durations, individuals with SZ did not strategically scan the face in a manner conducive to perceiving emotional expression. As the visualizations show qualitatively, the SZ group showed an aberrant pattern of fixation allocation to the important features of the face. Additionally, the SZ group displayed reduced pupil dilation and blink rates while performing the task. Both of these indices are associated with engagement [58] and cognitive effort [42], suggesting that the SZ group did not engage with the task to the same extent as the control group. This interpretation is tempered by the fact that all individuals in the SZ group were taking antipsychotics when they participated, and the association between dopamine and oculomotor function may have played a role in the group differences found [59].

The results of the physiological feature and cluster analyses suggest that, at the feature level, there are some indications of physiological differentiation based on emotional valence. Specifically, the control group showed differences between the positive and negative stimuli in skin temperature and skin conductance, and these two signals also differed significantly from the SZ group for both emotional valence conditions. These findings are in line with previous studies reporting reduced tonic skin conductance responses in a subset of individuals with SZ [2], [3], [5]. Interestingly, the SZ group showed a significant increase in the phasic skin conductance response rate that was not seen in the control group.
The increase in SCR during the social cognition task suggests a failure to habituate [3], [6]. These indicators provide initial evidence that such measures could be used in future social cognition interventions that take into account one's emotional state when interacting with others.

This conclusion is bolstered by the findings of the cluster-level analyses, which indicate that, in both groups, one's emotional state during the social cognition tasks is distinguishable from one's baseline state. This suggests a distinct change in participants' physiological states when performing the task and provides preliminary evidence for the potential use of this physiological state monitoring system in social cognitive intervention programs. By tracking an individual's physiological state while he/she interacts with the VR-based intervention system, it may be possible to tailor the interactions to best suit that particular individual. In this initial study, even though the interactions were fairly limited (e.g., presentation of emotional expressions accompanied by social vignettes), a difference in physiological state was still found in the participants. Future studies may be able to capitalize on these findings and incorporate more complex social interactions, which may provide even richer physiological information for understanding emotional and physiological states. Importantly, the current study did not find a significant effect of valence in the cluster-level analysis of the physiological data. This is perhaps to be expected, given that GSR was one of the strongest predictors in the cluster analysis and GSR is believed to be more associated with arousal than with valence [7]. Taken together, the current system provides initial evidence for its use in understanding the emotional states of individuals, via analysis of physiological signals, when they engage in social cognitive tasks.

Finally, while there were no group differences in performance on the VR tasks, both groups performed better on the task in which emotional expressions were presented without verbal social vignettes. As noted above, this performance differential may have arisen because the monotone vocal intonation in which the vignettes were presented increased uncertainty in the participants. By removing the emotional prosody from the speech, the incongruence between auditory and visual cues may have made it more difficult for participants to correctly identify the emotions expressed by the avatars. The fact that the control group reported significantly greater confidence in their performance on the VR task with just the emotional expressions supports this interpretation. Future investigations should take care to provide congruent prosodic presentations of social vignettes in order to better simulate the multimodal presentation of emotional and social information in social interactions.

While it is beyond the scope of the current work, this preliminary study using the presented novel system could inform future adaptive VR applications for SZ therapy that harness the inherent processing patterns of patients with SZ, as captured from their gaze and peripheral physiological signals. Such an implicit mode of interaction may be advantageous over performance-only interaction for objective, extensive, and natural interaction with virtual social avatars. However, although the results of the usability study are promising, we would like to acknowledge two major limitations of the current study. First, the social interaction provided by the current system was limited and needs further development to support truly versatile, meaningful bidirectional social interaction. Second, the sample size of the usability study, although similar to that of many other usability studies in this field, is still small; the results of this study should therefore be interpreted with care and should not be generalized to clinical significance until a thorough longitudinal study is conducted. Despite these limitations, we believe that this initial study demonstrates the value of adaptive VR-based SZ intervention systems. For example, the ability to subtly adjust the emotional expressions of the avatars, to integrate this platform into more relevant social paradigms, and to use online physiological and gaze data to guide interactions could prove quite useful for understanding the psychological states of patients with SZ. We believe such capabilities will lead to the eventual development of a more adaptive, individualized, and autonomous therapeutic system that can provide insight into the emotion processing mechanisms of patients with SZ.

Acknowledgment

We would like to thank Lindsey McIntosh, Channing Cochran, Jamie Michael, Rachel Aaron and Megan Ichinose for helping with the recruitment and assessment of the participants. Jamie Michael and Rachel Aaron also assisted with the selection of the IAPS stimuli.

Contributor Information

E. Bekele, Department of Electrical and Computer Engineering, Vanderbilt University, Nashville, TN, 37212 USA.

D. Bian, Department of Electrical and Computer Engineering, Vanderbilt University, Nashville, TN, 37212 USA

J. Peterman, Department of Psychology.

S. Park, Department of Psychology.

N. Sarkar, Department of Mechanical Engineering and Computer Engineering.

References

  • [1].Hempel RJ, Tulen JH, van Beveren NJ, van Steenis HG, Mulder PG, and Hengeveld MW, “Physiological responsivity to emotional pictures in schizophrenia,” Journal of psychiatric research, vol. 39, no. 5, pp. 509–518, 2005. [DOI] [PubMed] [Google Scholar]
  • [2].Dawson ME and Nuechterlein KH, “Psychophysiological dysfunctions in the developmental course of schizophrenic disorders,” Schizophrenia Bulletin, vol. 10, no. 2, pp. 204–232, 1984. [DOI] [PubMed] [Google Scholar]
  • [3].Ikezawa S, Corbera S, Liu J, and Wexler BE, “Empathy in elec-trodermal responsive and nonresponsive patients with schizophrenia,” Schizophrenia research, vol. 142, no. 1, pp. 71–76, 2012. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [4].Couture SM, Granholm EL, and Fish SC, “A path model investigation of neurocognition, theory of mind, social competence, negative symptoms and real-world functioning in schizophrenia,” Schizophrenia research, vol. 125, no. 2, pp. 152–160, 2011. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [5].Bernstein AS, Frith CD, Gruzelier JH, Patterson T, Straube E, Venables PH, and Zahn TP, “An analysis of the skin conductance orienting response in samples of american, british, and german schizophrenics,” Biological Psychology, vol. 14, no. 3, pp. 155–211, 1982. [DOI] [PubMed] [Google Scholar]
  • [6].Gruzelier J and Venables P, “Skin conductance orienting activity in a hetrogenous sample of schizophrenics: Possible evidence of limbic dysfunction,” The Journal of nervous and mental disease, vol. 155, no. 4, pp. 277–287, 1972. [DOI] [PubMed] [Google Scholar]
[7] Bradley MM, Cuthbert BN, and Lang PJ, "Picture media and emotion: Effects of a sustained affective context," Psychophysiology, vol. 33, no. 6, pp. 662–670, 1996.
[8] Myin-Germeys I and Delespaul PA, "Schizophrenia patients are more emotionally active than is assumed based on their behavior," Schizophrenia Bulletin, vol. 26, no. 4, pp. 847–854, 2000.
[9] Anderson CJ, Colombo J, and Jill Shaddy D, "Visual scanning and pupillary responses in young children with autism spectrum disorder," Journal of Clinical and Experimental Neuropsychology, vol. 28, no. 7, pp. 1238–1256, 2006.
[10] Loi F, Vaidya JG, and Paradiso S, "Recognition of emotion from body language among patients with unipolar depression," Psychiatry Research, vol. 209, no. 1, pp. 40–49, 2013.
[11] Naudin M, Carl T, Surguladze S, Guillen C, Gaillard P, Belzung C, El-Hage W, and Atanasova B, "Perceptive biases in major depressive episode," PLoS ONE, vol. 9, no. 2, p. e86832, 2014.
[12] Kring AM, Kerr SL, Smith DA, and Neale JM, "Flat affect in schizophrenia does not reflect diminished subjective experience of emotion," Journal of Abnormal Psychology, vol. 102, no. 4, p. 507, 1993.
[13] Kring AM and Neale JM, "Do schizophrenic patients show a disjunctive relationship among expressive, experiential, and psychophysiological components of emotion?" Journal of Abnormal Psychology, vol. 105, no. 2, p. 249, 1996.
[14] Herbener ES, Song W, Khine TT, and Sweeney JA, "What aspects of emotional functioning are impaired in schizophrenia?" Schizophrenia Research, vol. 98, no. 1, pp. 239–246, 2008.
[15] Cohen AS and Minor KS, "Emotional experience in patients with schizophrenia revisited: meta-analysis of laboratory studies," Schizophrenia Bulletin, vol. 36, no. 1, pp. 143–150, 2010.
[16] Berenbaum H and Oltmanns TF, "Emotional experience and expression in schizophrenia and depression," Journal of Abnormal Psychology, vol. 101, no. 1, p. 37, 1992.
[17] Earnst KS and Kring AM, "Emotional responding in deficit and non-deficit schizophrenia," Psychiatry Research, vol. 88, no. 3, pp. 191–207, 1999.
[18] Kring AM and Moran EK, "Emotional response deficits in schizophrenia: insights from affective science," Schizophrenia Bulletin, vol. 34, no. 5, pp. 819–834, 2008.
[19] Valmaggia LR, Latif L, Kempton MJ, and Rus-Calafell M, "Virtual reality in the psychological treatment for mental health problems: A systematic review of recent evidence," Psychiatry Research, 2016.
[20] Veling W, Moritz S, and van der Gaag M, "Brave new worlds: review and update on virtual reality assessment and treatment in psychosis," Schizophrenia Bulletin, vol. 40, no. 6, pp. 1194–1197, 2014.
[21] Tsang MM and Man DW, "A virtual reality-based vocational training system (VRVTS) for people with schizophrenia in vocational rehabilitation," Schizophrenia Research, vol. 144, no. 1, pp. 51–62, 2013.
[22] Bekele E, Zheng Z, Swanson A, Crittendon J, Warren Z, and Sarkar N, "Understanding how adolescents with autism respond to facial expressions in virtual reality environments," IEEE Transactions on Visualization and Computer Graphics, vol. 19, no. 4, pp. 711–720, 2013.
[23] Lang PJ, Bradley MM, and Cuthbert BN, "International affective picture system (IAPS): Technical manual and affective ratings," 1999.
[24] Freeman D, "Studying and treating schizophrenia using virtual reality: a new paradigm," Schizophrenia Bulletin, vol. 34, no. 4, pp. 605–610, 2008.
[25] Kurtz MM, Baker E, Pearlson GD, and Astur RS, "A virtual reality apartment as a measure of medication management skills in patients with schizophrenia: a pilot study," Schizophrenia Bulletin, vol. 33, no. 5, pp. 1162–1170, 2007.
[26] Yellowlees PM and Cook JN, "Education about hallucinations using an internet virtual reality system: a qualitative survey," Academic Psychiatry, vol. 30, no. 6, pp. 534–539, 2006.
[27] Kim K, Kim J-J, Kim J, Park D-E, Jang HJ, Ku J, Kim C-H, Kim IY, and Kim SI, "Characteristics of social perception assessed in schizophrenia using virtual reality," Cyberpsychology & Behavior, vol. 10, no. 2, pp. 215–219, 2006.
[28] Park K-M, Ku J, Choi S-H, Jang H-J, Park J-Y, Kim SI, and Kim J-J, "A virtual reality application in role-plays of social skills training for schizophrenia: a randomized, controlled trial," Psychiatry Research, vol. 189, no. 2, pp. 166–172, 2011.
[29] Sorkin A, Weinshall D, Modai I, and Peled A, "Improving the accuracy of the diagnosis of schizophrenia by means of virtual reality," American Journal of Psychiatry, 2006.
[30] Strickland D, "Virtual reality for the treatment of autism," Studies in Health Technology and Informatics, pp. 81–86, 1997.
[31] Souto T, Baptista A, Tavares D, Queirós C, and António M, "Facial emotional recognition in schizophrenia: preliminary results of the virtual reality program for facial emotional recognition," Revista de Psiquiatria Clínica, vol. 40, no. 4, pp. 129–134, 2013.
[32] Dyck M, Winbeck M, Leiberg S, Chen Y, and Mathiak K, "Virtual faces as a tool to study emotion recognition deficits in schizophrenia," Psychiatry Research, vol. 179, no. 3, pp. 247–252, 2010.
[33] Venables P and Wing J, "Level of arousal and the subclassification of schizophrenia," Archives of General Psychiatry, vol. 7, no. 2, pp. 114–119, 1962.
[34] Hempel RJ, Tulen JH, van Beveren NJ, Mulder PG, and Hengeveld MW, "Subjective and physiological responses to emotion-eliciting pictures in male schizophrenic patients," International Journal of Psychophysiology, vol. 64, no. 2, pp. 174–183, 2007.
[35] Kring AM, Kerr SL, and Earnst KS, "Schizophrenic patients show facial reactions to emotional facial expressions," Psychophysiology, vol. 36, no. 2, pp. 186–192, 1999.
[36] Mattes R, Schneider F, Heimann H, and Birbaumer N, "Reduced emotional response of schizophrenic patients in remission during social interaction," Schizophrenia Research, vol. 17, no. 3, pp. 249–255, 1995.
[37] Wolf K, Mass R, Kiefer F, Wiedemann K, and Naber D, "Characterization of the facial expression of emotions in schizophrenia patients: preliminary findings with a new electromyography method," Canadian Journal of Psychiatry, vol. 51, no. 6, p. 335, 2006.
[38] Manor BR, Gordon E, Williams LM, Rennie CJ, Bahramali H, Latimer CR, Barry RJ, and Meares RA, "Eye movements reflect impaired face processing in patients with schizophrenia," Biological Psychiatry, vol. 46, no. 7, pp. 963–969, 1999.
[39] Williams LM, Loughland CM, Gordon E, and Davidson D, "Visual scanpaths in schizophrenia: is there a deficit in face recognition?" Schizophrenia Research, vol. 40, no. 3, pp. 189–199, 1999.
[40] Volz M, Hamm AO, Kirsch P, and Rey E-R, "Temporal course of emotional startle modulation in schizophrenia patients," International Journal of Psychophysiology, vol. 49, no. 2, pp. 123–137, 2003.
[41] Vrana SR, Spence EL, and Lang PJ, "The startle probe response: A new measure of emotion?" Journal of Abnormal Psychology, vol. 97, no. 4, p. 487, 1988.
[42] Piquado T, Isaacowitz D, and Wingfield A, "Pupillometry as a measure of cognitive effort in younger and older adults," Psychophysiology, vol. 47, no. 3, pp. 560–569, 2010.
[43] Cheung C, Yu K, Fung G, Leung M, Wong C, Li Q, Sham P, Chua S, and McAlonan G, "Autistic disorders and schizophrenia: related or remote? An anatomical likelihood estimation," PLoS ONE, vol. 5, no. 8, p. e12233, 2010.
[44] Ekman P, "Facial expression and emotion," American Psychologist, vol. 48, no. 4, p. 384, 1993.
[45] Lahiri U, Warren Z, and Sarkar N, "Design of a gaze-sensitive virtual social interactive system for children with autism," IEEE Transactions on Neural Systems and Rehabilitation Engineering, no. 99, pp. 1–1, 2012.
[46] Bleuler E, "Dementia praecox or the group of schizophrenias," International Universities Press, New York, 1911.
[47] Cacioppo J, Tassinary L, and Berntson G, Handbook of Psychophysiology. Cambridge University Press, 2007.
[48] Liu C, Conn K, Sarkar N, and Stone W, "Online affect detection and robot behavior adaptation for intervention of children with autism," IEEE Transactions on Robotics, vol. 24, no. 4, pp. 883–896, 2008.
[49] Welch KC, Lahiri U, Liu C, Weller R, Sarkar N, and Warren Z, "An affect-sensitive social interaction paradigm utilizing virtual reality environments for autism intervention," in International Conference on Human-Computer Interaction. Springer, 2009, pp. 703–712.
[50] Blair JR and Spreen O, "Predicting premorbid IQ: a revision of the National Adult Reading Test," The Clinical Neuropsychologist, vol. 3, no. 2, pp. 129–136, 1989.
[51] Andreasen NC, Pressler M, Nopoulos P, Miller D, and Ho B-C, "Antipsychotic dose equivalents and dose-years: a standardized method for comparing exposure to different drugs," Biological Psychiatry, vol. 67, no. 3, pp. 255–262, 2010.
[52] Overall JE and Gorham DR, "The brief psychiatric rating scale," Psychological Reports, vol. 10, no. 3, pp. 799–812, 1962.
[53] Andreasen NC, "Scale for the assessment of positive/negative symptoms," Iowa City: University of Iowa, 1984.
[54] Lahiri U, Warren Z, and Sarkar N, "Design of a gaze-sensitive virtual social interactive system for children with autism," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 19, no. 4, pp. 443–452, 2011.
[55] Rodriguez A and Laio A, "Clustering by fast search and find of density peaks," Science, vol. 344, no. 6191, pp. 1492–1496, 2014.
[56] Robins B, Dautenhahn K, and Dubowski J, "Investigating autistic children's attitudes towards strangers with the theatrical robot: a new experimental paradigm in human-robot interaction studies," in Proc. IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN), 2004, pp. 557–562.
[57] Green MJ, Williams LM, and Davidson D, "Visual scanpaths to threat-related faces in deluded schizophrenia," Psychiatry Research, vol. 119, no. 3, pp. 271–285, 2003.
[58] Van Orden KF, Limbert W, Makeig S, and Jung T-P, "Eye activity correlates of workload during a visuospatial memory task," Human Factors: The Journal of the Human Factors and Ergonomics Society, vol. 43, no. 1, pp. 111–121, 2001.
[59] Karson CN, Bigelow L, Kleinman J, Weinberger D, and Wyatt R, "Haloperidol-induced changes in blink rates correlate with changes in BPRS score," The British Journal of Psychiatry, vol. 140, no. 5, pp. 503–507, 1982.
