Basic and Clinical Neuroscience. 2022 May 1;13(3):285–294. doi: 10.32598/bcn.2021.632.3

A Predictive Model for Emotion Recognition Based on Individual Characteristics and Autonomic Changes

Ateke Goshvarpour 1,2,*, Atefeh Goshvarpour 3, Ataollah Abbasi 3
PMCID: PMC9706293  PMID: 36457877

Abstract

Introduction:

Studies have repeatedly stressed the importance of individual differences in emotion recognition. The primary focus of this study is to predict Heart Rate Variability (HRV) changes due to affective stimuli from individual characteristics: age (A), gender (G), linguality (L), and sleep (S). In addition, the best combination of individual variables for estimating emotional HRV was explored.

Methods:

To this end, HRV indices of 47 college students exposed to images with four emotional categories of happiness, sadness, fear, and relaxation were analyzed. Then, a novel predictive model was introduced based on the regression equation.

Results:

The results show that different emotional situations make different combinations of individual variables important. The best variable arrangements for predicting HRV changes due to emotional provocation are LS, GL, GA, ALS, and GALS. However, these combinations varied across individual subjects.

Conclusion:

The suggested simple model effectively offers new insight into emotion studies regarding subject characteristics and autonomic parameters.

Highlights

  • HRV changes in affective states were predicted using individual characteristics.

  • A novel predictive model was proposed utilizing regression.

  • Distinct emotional situations make different combinations of individual variables important.

  • A close association exists between gender and physiological changes in emotional states.

Plain Language Summary

In everyday life, emotions play a critical role in health, social relationships, and daily functions. Among physiological measures, Autonomic Nervous System (ANS) activity, especially Heart Rate Variability (HRV), plays an important role in many recent theories of emotion. Many studies have analyzed HRV differences in the physiological mechanism of emotional reactions as a function of individual variables such as age, gender, and linguality, as well as other factors like sleep duration. This is the first study to explore the importance of the involvement and combination of individual characteristics in emotion prediction based on an HRV parameter. To this effect, an emotion predictive model was proposed based on linear combinations of individual differences, with acceptable performance.

Keywords: Emotion, Heart rate, Individual differences, Model

1. Introduction

Emotions play a crucial role in health, social relationships, and daily functions. Given the importance of emotions, emotion recognition via physiological parameters (including Galvanic Skin Response (GSR), respiration, Electrocardiogram (ECG), blood pressure, or Electroencephalogram (EEG)) has attracted several researchers in the field of affective computing (Frantzidis et al., 2010; Goshvarpour et al., 2015; Nardelli et al., 2015; Valenza et al., 2014). Among these, Autonomic Nervous System (ANS) activity is a fundamental component in many recent theories of emotion. Of all autonomic measures, Heart Rate (HR) (and Heart Rate Variability (HRV)) is the most often reported (Kreibig, 2010). Numerous approaches, such as standard features and nonlinear indices, have been used in the literature to analyze the HRV signal quantitatively. However, the main focus has been on simple standard features (Chang, Zheng, & Wang, 2010; Choi & Woo, 2005; Greco, Valenza, Lanata, Rota, & Scilingo, 2014; Haag, Goronzy, Schaich, & Williams, 2004; Jang, Park, Park, Kim, & Sohn, 2015; Katsis, Katertsidis, Ganiatsas, & Fotiadis, 2008; Katsis, Katertsidis, & Fotiadis, 2011; Kim, Bang, & Kim, 2004; Li & Chen, 2006; Liu, Conn, Sarkar, & Stone, 2008; Niu, Chen, & Chen, 2011; Picard, Vyzas, & Healey, 2001; Rainville, Bechara, Naqvi, & Damasio, 2006; Rani, Liu, Sarkar, & Vanman, 2006; Yannakakis & Hallam, 2008; Yoo, Lee, Park, Kim, Lee, & Jeong, 2005; Zhai & Barreto, 2006).

Also, different HRV patterns have been reported in the context of different emotion-related autonomic responses (Kreibig, 2010). Many studies have analyzed these differences in the physiological mechanism of emotional reactions as a function of individual variables such as age, gender, and linguality, as well as other factors like sleep duration (Bayrami et al., 2012; Chen, Liu, Wu, Ding, Dong, & Hirota, 2015; Franzen, Buysse, Dahl, Thompson, & Siegle, 2009; Yoo, Gujar, Peter, Jolesz, & Walker, 2007). The appendix presents a short review of the literature. These studies mainly evaluated one or two parameters separately, and the relations and interactions among these factors on emotional reactions were not considered simultaneously. For example: 1) Are the emotional responses of the two genders the same under insufficient sleep? 2) If the subject's age is also considered, how do the emotional responses change? 3) Which factor has the maximum effect on emotional autonomic changes? We suppose that individual information can jointly affect emotional conformation. Therefore, these relationships and interactions should be considered in an affect analysis system.

The present study aimed to evaluate the simultaneous effects of age, gender, linguality, and sleep duration on the autonomic responses associated with emotional induction, and it attempted to offer a predictive model for these interactions. The rest of this manuscript is organized as follows. Section 2 describes the materials and methods used in this work. Section 3 reports the experimental results of the proposed procedure. Finally, section 4 presents the study's conclusion.

2. Materials and Methods

Data collection

The ECG of 47 college students attending the Sahand University of Technology was collected. All participants were Iranian students. To elicit emotions in the participants, images from the International Affective Picture System (IAPS) were used (Lang, Bradley, & Cuthbert, 2005). Based on the dimensional structure of the emotional space, the IAPS images were chosen to correspond to four classes of emotions (Goshvarpour et al., 2015): relaxation, happiness, sadness, and fear. Upon arrival in the laboratory, all participants were requested to read and sign a consent form to participate in the experiment. All participants reported no history of neurological, cardiovascular, epileptic, or hypertensive diseases. The subjects were asked not to consume caffeine or salty or fatty foods two hours before data recording and to remain still during the experiment, particularly avoiding movements of their fingers, hands, and legs.

The procedure took about 15 minutes, and images were presented after two minutes of rest. In the initial baseline measurement, subjects were instructed to keep their eyes open and watch a blank screen. Then, 28 blocks of pictorial stimuli were randomly shown on the screen to prevent habituation in subjects. Furthermore, the blocks were balanced among subjects. Each block consisted of five pictures from the same emotional class, displayed for about 15 s with a 10-s blank-screen period at the end. This process was done to ensure the stability of the emotion over time. The blank-screen period was applied to allow physiological fluctuations to return to baseline and to assure regularity in the presentation of different emotional images. The blank screen was followed by a white plus sign (for 3 s) to prompt the subjects to concentrate on the center of the screen and prepare them for the next block. They were also asked to self-assess their emotional states. Figure 1 demonstrates the protocol. All signals were recorded in the computational neuroscience laboratory of the Sahand University of Technology, using a 16-channel PowerLab (manufactured by AD Instruments) with a sampling rate of 400 Hz; a digital notch filter was used to remove power-line noise.

Figure 1. Protocol description

The emotional model

A linear relationship between variables usually describes observed data well and also makes possible a reasonable prediction of new observations. Previous studies have shown that changes in HRV can serve as a valuable and effective tool for analyzing affective states. Consequently, in the current study, the ratio of HRV changes during rest to each emotional state is considered the dependent (or response) variable for modeling the affective states.

It has been shown that (refer to the appendix): first, women experience emotions more intensely; second, older adults apply different emotion-regulation strategies and experience heightened positive emotions; third, different brain functions are activated in bilinguals during the presentation of emotional stimuli. In addition, augmented reactivity to negative emotions has been reported after sleep deprivation. Therefore, stronger emotional weights should be considered for women, older adults, bilinguals, and sleep-deprived participants.

Based on the individual characteristics and the HRV index, an evaluative model for emotion recognition can be constructed with a regression equation. Consequently, the evaluation index of affective autonomic changes can be calculated by Equation 1:

f = ax1 + bx2 + cx3 + dx4 + e (Equation 1)

In this equation, x1 represents the gender (G) characteristic: 1 for men and 2 for women. x2 is the subjects' age (A) range: 1 for subjects aged 19–22 years and 2 for subjects aged 22–25 years. x3 carries the linguality (L) information, coded 1 if the subject is monolingual and 2 if the subject is bilingual. Sleep (S) is coded by x4: 1 for subjects with normal sleep and 2 for sleep deprivation. The ratio of HRV changes during rest to each affective state is captured by f. Therefore, f1, f2, f3, and f4 are formed for the happiness, relaxation, sadness, and fear affective states, respectively.
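As an illustration, Equation 1 can be fitted by ordinary least squares. The sketch below is ours, not the authors' code: the function names (`encode_subject`, `fit_model`, `predict`) are hypothetical, and NumPy's `lstsq` is used as one reasonable fitting choice.

```python
import numpy as np

# Encode one subject's characteristics with the paper's coding scheme:
# x1 gender (1 = man, 2 = woman), x2 age group (1 = 19-22, 2 = 22-25),
# x3 linguality (1 = monolingual, 2 = bilingual), x4 sleep (1 = normal, 2 = deprived).
def encode_subject(gender, age_group, linguality, sleep):
    return np.array([gender, age_group, linguality, sleep], dtype=float)

def fit_model(X, f):
    """Least-squares fit of f = a*x1 + b*x2 + c*x3 + d*x4 + e.

    X: (n_subjects, 4) matrix of encodings; f: (n_subjects,) HRV ratios.
    Returns the coefficient vector [a, b, c, d, e]."""
    A = np.hstack([X, np.ones((X.shape[0], 1))])  # append intercept column for e
    coeffs, *_ = np.linalg.lstsq(A, f, rcond=None)
    return coeffs

def predict(coeffs, x):
    """Evaluate Equation 1 for one encoded subject x."""
    return float(coeffs[:4] @ x + coeffs[4])
```

With exact linear data, the fit recovers the generating coefficients; with real HRV ratios it returns the best least-squares approximation.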

3. Results

Eighty percent of the recorded data were randomly selected to calculate the coefficients a, b, c, d, and e by a linear fitting of HRV data, and the rest (20%) was used for testing. Different combinations of individual characteristics are also considered in the model. To this end, the coefficients play an essential role. For instance, to evaluate the role of age and gender (GA) in the model, the coefficients c and d should be set to zero. As a result, the following conditions are considered in the model evaluation (Equation 2):

GA: c = d = 0; GL: b = d = 0; GS: b = c = 0; AL: a = d = 0; AS: a = c = 0; LS: a = b = 0; GAL: d = 0; GAS: c = 0; GLS: b = 0; ALS: a = 0; GALS: all coefficients retained (Equation 2)

To evaluate the performance of the model, the mean error (the difference between the real data and the estimate) and the Root Mean Square Error (RMSE) were calculated. Table 1 outlines the results.
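The coefficient-zeroing scheme of Equation 2 can be sketched as follows. This is an assumed reconstruction, not the authors' implementation: each combination keeps only the corresponding columns in the least-squares fit, and held-out data are scored with the mean (absolute) error and RMSE. All names are illustrative.

```python
import numpy as np

# Which of the coefficients [a, b, c, d] (gender, age, linguality, sleep)
# are kept in each setting of Equation 2; the rest are fixed at zero.
COMBINATIONS = {
    "GA": [1, 1, 0, 0], "GL": [1, 0, 1, 0], "GS": [1, 0, 0, 1],
    "AL": [0, 1, 1, 0], "AS": [0, 1, 0, 1], "LS": [0, 0, 1, 1],
    "GAL": [1, 1, 1, 0], "GAS": [1, 1, 0, 1], "GLS": [1, 0, 1, 1],
    "ALS": [0, 1, 1, 1], "GALS": [1, 1, 1, 1],
}

def fit_combination(X, f, mask):
    """Least-squares fit using only the columns selected by `mask`;
    excluded coefficients are returned as zero. Result: [a, b, c, d, e]."""
    keep = [i for i, m in enumerate(mask) if m]
    A = np.hstack([X[:, keep], np.ones((X.shape[0], 1))])
    sol, *_ = np.linalg.lstsq(A, f, rcond=None)
    coeffs = np.zeros(5)
    coeffs[keep] = sol[:-1]
    coeffs[4] = sol[-1]
    return coeffs

def evaluate(coeffs, X_test, f_test):
    """Mean (absolute) error and RMSE on held-out subjects; the paper
    reports ME and RMSE, and mean absolute error is our assumption."""
    err = f_test - (X_test @ coeffs[:4] + coeffs[4])
    return float(np.mean(np.abs(err))), float(np.sqrt(np.mean(err ** 2)))
```

In the paper's procedure, `fit_combination` would be run on the 80% training split and `evaluate` on the remaining 20%, once per combination and per affective state f1–f4.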

Table 1.

The performance of the proposed model

Rows: per-subject error (E) for each model (f1–f4), followed by the mean error (ME) and RMSE. Columns: individual-characteristic combinations.

Model Subject GA GL GS AL AS LS GAL GAS GLS ALS GALS
f1 1 0.5747 0.4099 0.2742 0.394 0.2298 0.1486 0.5599 0.4189 0.2525 0.2282 0.3991
2 0.0716 0.0623 0.2289 0.1118 0.2733 0.3423 0.0834 0.0842 0.2118 0.2726 0.0695
3 0.3959 0.2619 0.0954 0.2124 0.051 0.0181 0.4077 0.24 0.1124 0.0517 0.2548
4 0.2217 0.0575 0.2427 0.1251 0.2675 0.1799 0.2134 0.3802 0.2319 0.2667 0.3702
5 0.0754 0.174 0.0074 0.0512 0.1936 0.106 0.0877 0.0653 0.0245 0.1928 0.0499
6 0.6026 0.4384 0.6236 0.506 0.6484 0.5608 0.5944 0.7612 0.6128 0.6476 0.7511
7 0.1726 0.1042 0.1987 0.2123 0.306 0.2489 0.1804 0.2668 0.2105 0.3067 0.2772
8 0.0535 0.1107 0.0745 0.0431 0.0993 0.0117 0.0453 0.2121 0.0637 0.0985 0.202
9 0.1674 0.0996 0.0443 0.2912 0.1813 0.0939 0.1817 0.04 0.0236 0.1827 0.0582
ME 0.2595 0.1910 0.1989 0.2163 0.2500 0.1900 0.2615 0.2743 0.1938 0.2497 0.2702
RMSE 0.9853 0.7066 0.7947 0.7867 0.8922 0.7541 0.9802 1.0422 0.7747 0.8910 1.0265
f2 1 0.1737 0.0569 0.2354 0.2036 0.3259 0.2734 0.1031 0.2794 0.1694 0.306 0.2048
2 0.0218 0.1211 0.0168 0.0957 0.2008 0.1453 0.0566 0.0714 0.0169 0.1939 0.0349
3 0.3454 0.4844 0.5478 0.5367 0.6386 0.7009 0.3942 0.4597 0.5969 0.6585 0.5126
4 0.389 0.342 0.1635 0.3132 0.1661 0.1254 0.4663 0.2621 0.2294 0.1866 0.3436
5 0.3864 0.5078 0.3554 0.4902 0.3851 0.4406 0.3994 0.2684 0.3703 0.392 0.2815
6 0.0234 0.0236 0.2021 0.0524 0.1995 0.2401 0.1007 0.1035 0.1361 0.1789 0.0219
7 0.0237 0.0187 0.0854 0.0817 0.1759 0.1481 0.0434 0.1294 0.1022 0.1834 0.1511
8 0.3786 0.3141 0.4326 0.2153 0.3095 0.2817 0.4201 0.5091 0.4683 0.317 0.5543
9 0.2561 0.2809 0.4739 0.1106 0.2577 0.2983 0.2006 0.4077 0.4268 0.2371 0.3497
ME 0.2220 0.2388 0.2792 0.2333 0.2955 0.2949 0.2427 0.2767 0.2796 0.2948 0.2727
RMSE 0.8127 0.8977 0.9842 0.8632 0.9814 1.0222 0.8803 0.9424 1.0027 0.9877 0.9778
f3 1 0.4276 0.3521 0.3035 0.3775 0.3437 0.2654 0.3864 0.3357 0.2715 0.2998 0.3028
2 0.0072 0.0111 0.0375 0.0571 0.0661 0.0756 0.0441 0.0759 0.0695 0.11 0.1062
3 0.1569 0.0814 0.197 0.1068 0.2235 0.1518 0.1157 0.2186 0.1551 0.1747 0.1763
4 0.0555 0.0738 0.1893 0.0056 0.147 0.1442 0.0186 0.1403 0.1475 0.0983 0.1007
5 0.0462 0.1217 0.0061 0.0963 0.0204 0.0513 0.0874 0.0155 0.048 0.0284 0.0268
6 0.1416 0.0379 0.0078 0.0398 0.0501 0.0933 0.0202 0.0568 0.0997 0.0474 0.0515
7 0.1961 0.0571 0.0437 0.0886 0.0988 0.0445 0.1034 0.0858 0.0353 0.0014 0.006
8 0.5607 0.475 0.5206 0.4591 0.4941 0.4196 0.4436 0.499 0.4132 0.3966 0.3933
9 0.6447 0.6669 0.4923 0.6889 0.5475 0.5504 0.7104 0.5345 0.5627 0.5962 0.6026
ME 0.2485 0.2086 0.1998 0.2133 0.2213 0.1996 0.2144 0.2180 0.2003 0.1948 0.1962
RMSE 1.0006 0.9089 0.8268 0.9281 0.8664 0.7825 0.9408 0.8552 0.7907 0.8112 0.8157
f4 1 0.1898 0.2176 0.3582 0.1491 0.3115 0.3431 0.1534 0.3015 0.333 0.2748 0.2701
2 0.0706 0.0428 0.0978 0.1113 0.0511 0.0827 0.107 0.0411 0.0726 0.0144 0.0097
3 0.125 0.1529 0.3521 0.0844 0.2468 0.2784 0.0722 0.2812 0.3153 0.2101 0.2359
4 0.3448 0.2466 0.1458 0.3098 0.2351 0.148 0.316 0.2128 0.1291 0.211 0.197
5 0.2245 0.355 0.3929 0.2942 0.3462 0.4584 0.3009 0.3362 0.4408 0.3979 0.388
6 0.1695 0.1742 0.2707 0.2452 0.2914 0.2777 0.2373 0.3211 0.307 0.3477 0.3649
7 0.0979 0.0701 0.0705 0.1386 0.0238 0.0554 0.1343 0.0138 0.0453 0.0129 0.0176
8 0.2356 0.2633 0.1625 0.1948 0.126 0.1647 0.1992 0.1083 0.1458 0.0971 0.0866
9 0.2749 0.2471 0.1065 0.3156 0.1532 0.1216 0.3113 0.1632 0.1318 0.1899 0.1947
ME 0.1925 0.1966 0.2174 0.2048 0.1983 0.2145 0.2035 0.1977 0.2134 0.1951 0.1961
RMSE 0.6287 0.6520 0.7440 0.6642 0.6786 0.7458 0.6645 0.6868 0.7493 0.6994 0.7049

E: error; ME: mean error; RMSE: root mean square error.

Different combinations of individual characteristics can serve to predict the HRV indices in different affective states (Table 1). Based on the mean error results, the LS and GL combinations best track the HRV changes due to happy stimuli, while GA outperforms the others for relaxation and fear. In addition, ALS and GALS give the best prediction of HRV indices for sad stimuli. Thus, different individual parameters are involved in predicting each emotional state. Different results are also obtained when subjects are considered individually; for example, predictions for subject 2 are accurate in all emotional conditions.

4. Discussion

In the current study, a simple predictive model of emotion was presented based on an autonomic feature. For the first time, the importance of the involvement and combination of individual characteristics was examined in emotion prediction based on an HRV parameter. Following a two-dimensional emotion theory, four categories of affective states were introduced: happiness, relaxation, sadness, and fear. The ratio of HRV changes during rest to each affective state was considered the dependent variable. An affect-predictive model was proposed based on linear combinations of individual differences, with acceptable performance.

The results of this study showed that different subject characteristics are involved in predicting HRV indices of affective states. LS, GL, GA, ALS, and GALS are the preferred arrangements for predicting HRV changes due to emotional provocation. We can conclude that a close association exists between gender and physiological changes in emotional states. This result is consistent with a large body of published articles in which the role of gender in emotion recognition was stated (Chen et al., 2015). However, the effects of some factors, such as sleep quantity or linguality, on autonomic indices affected by visual affective stimuli have not been explored so far. Different results were obtained for each subject, with different combinations of individual characteristics incorporated into the affect prediction. Previous studies confirm the role of subject differences in emotion perception (Donges, Kersting, & Suslow, 2012; Martin, Berry, Dobranski, Horne, & Dodgson, 1996). Different emotion perceptions arise from various individual characteristics, as well as from past experiences of emotions (Barrett, Mesquita, Ochsner, & Gross, 2007). Based on the role of several individual characteristics, a new perspective on the emotion predictive model was presented in the current study. However, more data are required to establish the role of individual characteristics in predicting autonomic emotional states.

Appendix

Gender differences and emotions

A review of articles on emotion suggests that men and women employ different strategies to control and express their emotions. Compared to men, women experience positive and negative emotions more intensely (Grossman & Wood, 1993), and they are more emotional (Grewal & Salovey, 2005). When dealing with frightening situations, women have reported more fear (Gordon & Riger, 1991). Also, some studies have found different emotional valence and arousal ratings between the two genders (Murnen & Stockton, 1997): more emotional arousal in men and higher valence ratings in women. Recent evidence has suggested higher brain activity in females compared to males in some Electroencephalogram (EEG) studies. Guntekin and Basar (2007) observed a larger beta response in women when viewing facial expressions; however, it was independent of the type of emotion. In women, greater electrodermal responses (Kring & Gordon, 1998), more facial electromyographic reactions, and higher heart rates (Bradley, Codispoti, Sabatinelli, & Lang, 2001) have been observed during unpleasant stimuli compared to men.

Age and emotional reactions

Previous works provide some confirmation of close relations between age and the type of emotion being recognized. Specifically, anger, sadness, fear, happiness, and surprise are identified less accurately by older adults than by younger participants (Isaacowitz et al., 2007; Ruffman, Henry, Livingstone, & Phillips, 2008). However, older adults could correctly recognize positive stimuli. In contrast, despite higher recognition rates of negative stimuli, younger adults are easily distracted by these types of emotions (Thomas & Hasher, 2006). In older adults, low-arousal positive affect increased and negative affect decreased across both low- and high-arousal levels; however, no age differences were observed in high-arousal positive affect (Kessler & Staudinger, 2009). To explain such positive affect in elders, some scientists examined brain activity and claimed that changes in the functional organization of the brain are the main reason (Cacioppo, Bernston, Bechara, Tranel, & Hawkley, 2011). The literature also suggests that different strategies are used to regulate emotional reactions (Urry & Gross, 2010).

Bilingualism and emotion

There is great interest in emotional information processing in bilinguals. It has been believed that the first language is a language of emotional expressiveness, whereas the second is a language of emotional distance (Dewaele, 2008; Pavlenko, 2002). In recalling emotional stimuli, the native language (Spanish) was compared with the second language (English) (Anooshian & Hertel, 1994). Results showed that emotional stimuli were better recalled than neutral ones in the first language. However, considering the valence dimension, the authors reported altered emotional recall in the native language (Aycicegi & Harris, 2004): negative stimuli (except taboo words) were recalled less well than neutral ones. Shorter reaction times in monolinguals (Altarriba, 2006) and stronger emotional weight in the first language of bilinguals (Dewaele, 2008; Pavlenko, 2002) have been reported.

There is evidence that different brain structures and functions (Kim, Relkin, Lee, & Hirsch, 1997) are activated in bilinguals during the presentation of emotional stimuli. From the perspective of physiological responses, most attention has been devoted to the skin conductance (SC) responses of bilinguals (Caldwell-Harris & Aycicegi-Dinn, 2009; Harris, Aycicegi, & Gleason, 2003). Greater SC responses to emotional stimuli have been reported in monolinguals (Harris et al., 2003). In addition, the heart rates of Turkish-Persian and Kurdish-Persian bilinguals have been evaluated (Bayrami et al., 2012). The authors found that in both groups of bilinguals, negative stimuli triggered a greater heart rate in the native language than in the second language.

Sleep duration and emotional responses

Short or long sleep duration causes undesirable effects on mood, cognition, physiological function, alertness, and memory (Taub, 1980; Taub et al., 1971). According to the evidence, a close relationship exists between emotions and sleep (Berger, Miller, Seifer, Cares, & Lebourgeois, 2012; Walker & Harvey, 2010). Disrupted emotional memories, decreased emotional reactivity, weakened sensitivity to positive stimuli, and consolidated sensitivity to negative stimuli are some outcomes of sleep deficiency (Franzen et al., 2009; Gujar et al., 2011; Pilcher & Huffcutt, 1996; Sterpenich et al., 2007). Enough sleep is needed for optimal processing and evaluation of emotion; insufficient sleep may bias the processing of negative-valence stimuli (Gujar et al., 2011). Notably, augmented reactivity to negative emotions, including anger and fear, has been documented throughout a day without sleep (Gujar et al., 2011). Using functional magnetic resonance imaging, different brain responses (augmented amygdala reactivity) to negative emotional stimuli have been reported after a night of sleep deprivation (Yoo et al., 2007). By evaluating autonomic reactivity, researchers demonstrated a larger pupillary response to negative pictures under sleep deficiency (Franzen et al., 2009).

5. Conclusion

In the current study, a new perspective on the emotion predictive model was presented based on the role of several individual characteristics. For modeling, we evaluated different models as well as different parameter values, aiming to choose the simplest and, at the same time, most efficient model; on this basis, the proposed model was selected. This model may not work well for other HRV parameters. Moreover, more data are needed to establish the role of individual characteristics in the prediction of autonomic emotional states. The number of participants should be greatly increased so that this model can be used more confidently, and a larger sample should have enough variety in terms of gender, age, bilingualism, and sleep quantity.

Ethical Considerations

Compliance with ethical guidelines

All ethical principles were considered in this article. The participants were informed of the purpose of the research and its implementation stages. They were also assured of the confidentiality of their information, were free to leave the study whenever they wished, and, if desired, could have the research results made available to them. Written informed consent was obtained from the subjects. The principles of the Helsinki Declaration were also observed.

Acknowledgments

The authors gratefully acknowledge Computational Neuroscience Laboratory, where the data were collected, and all the subjects volunteered for the study.

Footnotes

Funding

This research did not receive any grant from funding agencies in the public, commercial, or non-profit sectors.

Authors' contributions

Conceptualization and Funding acquisition and Resources: All authors; Investigation, Data analysis, Methodology, Data collection and Writing-original draft: Atefeh Goshvarpour and Ateke Goshvarpour; Writing – review & editing: All authors.

Conflict of interest

The authors declared no conflict of interest.

References

  1. Altarriba J. (2006). Cognitive approaches to the study of emotion-laden and emotion words in monolingual and bilingual memory. In Pavlenko A. (Ed.), Bilingual minds: Emotional experience, expression and representation (pp. 232–256). United Kingdom: Multilingual Matters. https://books.google.com/books/about/Bilingual_Minds.html?id=x0YWC_g23isC [Google Scholar]
  2. Anooshian J., Hertel P. (1994). Emotionality in free recall: Language specificity in bilingual memory. Cognition and Emotion, 8(6), 503–514. [DOI: 10.1080/02699939408408956] [DOI] [Google Scholar]
  3. Aycicegi A., Harris C. (2004). Bilinguals recall and recognition of emotion words. Cognition and Emotion, 18(7), 977–987. [DOI: 10.1080/02699930341000301] [DOI] [Google Scholar]
  4. Bayrami M., Ashayeri H., Modarresi Y., Bakhshipur A., Farhangdoost H. (2012). [Physiological evidence for perceptive difference of emotional words among native and second language Turkish and Kurdish bilinguals (Persian)]. Journal of Kermanshah University of Medical Sciences, 16(5), 411–420. https://brieflands.com/articles/jkums-77356.html [Google Scholar]
  5. Barrett L., Mesquita B., Ochsner K., Gross J. (2007). The Experience of Emotion. Annual Review of Psychology, 58, 373–403. [DOI: 10.1146/annurev.psych.58.110405.085709] [DOI] [PMC free article] [PubMed] [Google Scholar]
  6. Berger R., Miller A., Seifer R., Cares S., Lebourgeois M. (2012). Acute sleep restriction effects on emotion responses in 30- to 36-month-old children. Journal of Sleep Research, 21(3), 235-246. [DOI: 10.1111/j.1365-2869.2011.00962.x] [DOI] [PMC free article] [PubMed] [Google Scholar]
  7. Bradley M., Codispoti M., Sabatinelli D., Lang P. (2001). Emotion and motivation II: Sex differences in picture processing. Emotion, 1(3), 300–319. [DOI: 10.1037/1528-3542.1.3.300] [DOI] [PubMed] [Google Scholar]
  8. Cacioppo J., Bernston G., Bechara A., Tranel D., Hawkley L. (2011). Could an aging brain contribute to subjective well being: The value added by a social neuroscience perspective. In Todorov A., Fiske S., Prentice D. (Eds.), Social neuroscience: Toward understanding the underpinnings of the social mind (pp. 249–262). New York: Oxford University Press. [DOI: 10.1093/acprof:oso/9780195316872.003.0017] [DOI] [Google Scholar]
  9. Caldwell-Harris C., Aycicegi-Dinn A. (2009). Emotion and lying in a non-native language. International Journal of Psychophysiology, 71(3), 193–204. [DOI: 10.1016/j.ijpsycho.2008.09.006] [DOI] [PubMed] [Google Scholar]
  10. Chang C. Y., Zheng J. Y., Wang C. J. (2010). Based on support vector regression for emotion recognition using physiological signals. International Joint Conference on Neural Networks (IJCNN), Barcelona, Spain, 18–23 July 2010. [DOI: 10.1109/IJCNN.2010.5596878] [DOI] [Google Scholar]
  11. Chen L.-F., Liu Z.-T., Wu M., Ding M., Dong F.-Y., Hirota K. (2015). Emotion-age-gender-nationality based intention understanding in human-robot interaction using two-layer fuzzy support vector regression. International Journal of Social Robotics, 7(5), 709–729. [DOI: 10.1007/s12369-015-0290-2] [DOI] [Google Scholar]
  12. Choi A., Woo W. (2005). Physiological sensing and feature extraction for emotion recognition by exploiting acupuncture spots. In Tao J., Tan T., Picard R. W. (Eds.), 1st International Conference in Affective Computing and Intelligent Interaction (pp. 590–597). Berlin: Springer-Verlag. [DOI: 10.1007/11573548_76] [DOI] [Google Scholar]
  13. Dewaele J. M. (2008). The emotional weight of I love you in multilinguals’ languages. Journal of Pragmatics, 40(10), 1753–1780. [DOI: 10.1016/j.pragma.2008.03.002] [DOI] [Google Scholar]
  14. Donges U.-S., Kersting A., Suslow T. (2012). Women’s greater ability to perceive happy facial emotion automatically: Gender differences in affective priming. Plos One, 7(7), e41745. [DOI: 10.1371/journal.pone.0041745] [DOI] [PMC free article] [PubMed] [Google Scholar]
  15. Frantzidis C., Bratsas C., Papadelis C., Konstantinidis E., Pappas C., D Bamidis P. (2010). Toward emotion aware computing: An integrated approach using multichannel neurophysiological recordings and affective visual stimuli. IEEE Transactions on Information Technology in Biomedicine, 14(3), 589–597. [DOI: 10.1109/TITB.2010.2041553] [DOI] [PubMed] [Google Scholar]
  16. Franzen P., Buysse D., Dahl R., Thompson W., Siegle G. (2009). Sleep deprivation alters pupillary reactivity to emotional stimuli in healthy young adults. Biological Psychology, 80(3), 300–305. [DOI: 10.1016/j.biopsycho.2008.10.010] [DOI] [PMC free article] [PubMed] [Google Scholar]
  17. Gordon M., Riger S. (1991). The female fear: The social costs of rape. Chicago: University of Illinois Press. https://books.google.com/books/about/The_Female_Fear.html?id=4N6i2tfpp5sC [Google Scholar]
  18. Goshvarpour A., Abbasi A., Goshvarpour A. (2015). Affective visual stimuli: Characterization of the picture sequences impacts by means of nonlinear approaches. Basic and Clinical Neuroscience, 6(4), 209–221. [PMC free article] [PubMed] [Google Scholar]
  19. Greco A., Valenza G., Lanata A., Rota G., Scilingo E. (2014). Electrodermal activity in bipolar patients during affective elicitation. IEEE Journal of Biomedical and health Informatics, 18(6), 1865–1873. [DOI: 10.1109/JBHI.2014.2300940] [DOI] [PubMed] [Google Scholar]
  20. Grewal D., Salovey P. (2005). Feeling smart: The science of emotional intelligence. American Scientist, 93(4), 330–339. [DOI: 10.1511/2005.4.330]
  21. Grossman M., Wood W. (1993). Sex differences in intensity of emotional experience: A social role interpretation. Journal of Personality and Social Psychology, 65(5), 1010–1022. [DOI: 10.1037/0022-3514.65.5.1010]
  22. Gujar N., McDonald S., Nishida M., Walker M. (2011). A role for REM sleep in recalibrating the sensitivity of the human brain to specific emotions. Cerebral Cortex, 21(1), 115–123. [DOI: 10.1093/cercor/bhq064]
  23. Guntekin B., Basar E. (2007). Gender differences influence brain’s beta oscillatory responses in recognition of facial expressions. Neuroscience Letters, 424(4), 94–99. [DOI: 10.1016/j.neulet.2007.07.052]
  24. Haag A., Goronzy S., Schaich P., Williams J. (2004). Emotion recognition using biosensors: First steps towards an automatic system. In Tutorial and research workshop on affective dialogue systems (pp. 36–48). Berlin: Springer. [DOI: 10.1007/978-3-540-24842-2_4]
  25. Harris C., Aycicegi A., Gleason J. (2003). Taboo words and reprimands elicit greater autonomic reactivity in a first language than in a second language. Applied Psycholinguistics, 24(4), 561–579. [DOI: 10.1017/S0142716403000286]
  26. Isaacowitz D., Löckenhoff C., Lane R., Wright R., Sechrest L., Riedel R., et al. (2007). Age differences in recognition of emotion in lexical stimuli and facial expressions. Psychology and Aging, 22(1), 147–159. [DOI: 10.1037/0882-7974.22.1.147]
  27. Jang E. H., Park B. J., Park M. S., Kim S. H., Sohn J. H. (2015). Analysis of physiological signals for recognition of boredom, pain, and surprise emotions. Journal of Physiological Anthropology, 34(1), 25. [DOI: 10.1186/s40101-015-0063-5]
  28. Katsis C., Katertsidis N., Fotiadis D. (2011). An integrated system based on physiological signals for the assessment of affective states in patients with anxiety disorders. Biomedical Signal Processing and Control, 6(3), 261–268. [DOI: 10.1016/j.bspc.2010.12.001]
  29. Katsis C., Katertsidis N., Ganiatsas G., Fotiadis D. (2008). Toward emotion recognition in car-racing drivers: A biosignal processing approach. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, 38(3), 502–512. [DOI: 10.1109/TSMCA.2008.918624]
  30. Kessler E. M., Staudinger U. (2009). Affective experience in adulthood and old age: The role of affective arousal and perceived affect regulation. Psychology and Aging, 24(2), 349–362. [DOI: 10.1037/a0015352]
  31. Kim K., Bang S., Kim S. (2004). Emotion recognition system using short-term monitoring of physiological signals. Medical and Biological Engineering and Computing, 42(3), 419–427. [DOI: 10.1007/BF02344719]
  32. Kim K., Relkin N., Lee K., Hirsch J. (1997). Distinct cortical areas associated with native and second languages. Nature, 388(6638), 171–174.
  33. Kreibig S. (2010). Autonomic nervous system activity in emotion: A review. Biological Psychology, 84(3), 394–421. [DOI: 10.1016/j.biopsycho.2010.03.010]
  34. Kring A., Gordon A. (1998). Sex differences in emotion: Expression, experience, and physiology. Journal of Personality and Social Psychology, 74(3), 686–703. [DOI: 10.1037/0022-3514.74.3.686]
  35. Lang P., Bradley M., Cuthbert B. (2005). International affective picture system (IAPS): Digitized photographs, instruction manual and affective ratings, Technical Report A-6. Gainesville, FL: University of Florida. [DOI: 10.1037/t66667-000]
  36. Li L., Chen J. H. (2006, November). Emotion recognition using physiological signals. In International conference on artificial reality and telexistence (pp. 437–446). Berlin: Springer. [DOI: 10.1007/11941354_44]
  37. Liu C., Conn K., Sarkar N., Stone W. (2008). Physiology-based affect recognition for computer-assisted intervention of children with autism spectrum disorder. International Journal of Human-Computer Studies, 66(9), 662–677. [DOI: 10.1016/j.ijhcs.2008.04.003]
  38. Martin R., Berry G., Dobranski T., Horne M., Dodgson P. (1996). Emotion perception threshold: Individual differences in emotional sensitivity. Journal of Research in Personality, 30(2), 290–305. [DOI: 10.1006/jrpe.1996.0019]
  39. Murnen S., Stockton M. (1997). Gender and self-reported sexual arousal in response to sexual stimulation: A meta-analytic review. Sex Roles, 37(3–4), 135–153. [DOI: 10.1023/A:1025639609402]
  40. Nardelli M., Valenza G., Greco A., Lanata A., Scilingo E. (2015). Recognizing emotions induced by affective sounds through heart rate variability. IEEE Transactions on Affective Computing, 6(4), 385–394. [DOI: 10.1109/TAFFC.2015.2432810]
  41. Niu X., Chen L., Chen Q. (2011). Research on genetic algorithm based on emotion recognition using physiological signals. In International Conference on Computational Problem-Solving (ICCP) (pp. 614–618). Chengdu: IEEE. [DOI: 10.1109/ICCPS.2011.6092256]
  42. Pavlenko A. (2002). Bilingualism and emotions. Multilingua, 21(1), 45–78. [DOI: 10.1515/mult.2002.004]
  43. Picard R., Vyzas E., Healey J. (2001). Toward machine emotional intelligence: Analysis of affective physiological state. IEEE Transactions on Pattern Analysis and Machine Intelligence, 23(10), 1175–1191. [DOI: 10.1109/34.954607]
  44. Pilcher J., Huffcutt A. (1996). Effects of sleep deprivation on performance: A meta-analysis. Sleep, 19(4), 318–326. [DOI: 10.1093/sleep/19.4.318]
  45. Rainville P., Bechara A., Naqvi N., Damasio A. (2006). Basic emotions are associated with distinct patterns of cardiorespiratory activity. International Journal of Psychophysiology, 61(1), 5–18. [DOI: 10.1016/j.ijpsycho.2005.10.024]
  46. Rani P., Liu C., Sarkar N., Vanman E. (2006). An empirical study of machine learning techniques for affect recognition in human-robot interaction. Pattern Analysis and Applications, 9(1), 58–69. [DOI: 10.1007/s10044-006-0025-y]
  47. Ruffman T., Henry J., Livingstone V., Phillips L. (2008). A meta-analytic review of emotion recognition and aging: Implications for neuropsychological models of aging. Neuroscience & Biobehavioral Reviews, 32(4), 863–881. [DOI: 10.1016/j.neubiorev.2008.01.001]
  48. Sterpenich V., Albouy G., Boly M., Vandewalle G., Darsaud A., Balteau E., et al. (2007). Sleep-related hippocampo-cortical interplay during emotional memory recollection. PLoS Biology, 5(11), e282. [DOI: 10.1371/journal.pbio.0050282]
  49. Taub J. (1980). Effects of ad lib extended-delayed sleep on sensorimotor performance, memory and sleepiness in the young adult. Physiology & Behavior, 25(1), 77–87. [DOI: 10.1016/0031-9384(80)90185-7]
  50. Taub J., Globus G., Phoebus E., Drury R. (1971). Extended sleep and performance. Nature, 233(5315), 142–143. [DOI: 10.1038/233142a0]
  51. Thomas R., Hasher L. (2006). The influence of emotional valence on age differences in early processing and memory. Psychology and Aging, 21(4), 821–825. [DOI: 10.1037/0882-7974.21.4.821]
  52. Urry H., Gross J. (2010). Emotion regulation in older age. Current Directions in Psychological Science, 19(6), 352–357. [DOI: 10.1177/0963721410388395]
  53. Valenza G., Citi L., Lanata A., Scilingo E., Barbieri R. (2014). Revealing real-time emotional responses: A personalized assessment based on heartbeat dynamics. Scientific Reports, 4, 4998. [DOI: 10.1038/srep04998]
  54. Walker M., Harvey A. (2010). Obligate symbiosis: Sleep and affect. Sleep Medicine Reviews, 14(4), 215–217. [DOI: 10.1016/j.smrv.2010.02.003]
  55. Yannakakis G., Hallam J. (2008). Entertainment modeling through physiology in physical play. International Journal of Human-Computer Studies, 66(10), 741–755. [DOI: 10.1016/j.ijhcs.2008.06.004]
  56. Yoo S., Lee C., Park Y., Kim N., Lee B., Jeong K. (2005). Neural network based emotion estimation using heart rate variability and skin resistance. In Wang L., Chen K., Ong Y. S. (Eds.), 1st International Conference in Advances in Natural Computation (pp. 818–824). Berlin: Springer-Verlag. [DOI: 10.1007/11539087_110]
  57. Yoo S. S., Gujar N., Hu P., Jolesz F., Walker M. (2007). The human emotional brain without sleep - a prefrontal amygdala disconnect. Current Biology, 17(20), R877–R878. [DOI: 10.1016/j.cub.2007.08.007]
  58. Zhai J., Barreto A. (2006). Stress detection in computer users based on digital signal processing of noninvasive physiological variables. In 28th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (pp. 1355–1358). New York City: IEEE. [DOI: 10.1109/IEMBS.2006.259421]

Articles from Basic and Clinical Neuroscience are provided here courtesy of Iranian Neuroscience Society