Data in Brief. 2026 Mar 24;66:112720. doi: 10.1016/j.dib.2026.112720

Physiological responses to emotional video stimuli: ECG, EDA, and temperature data

António Oseas Pataca a, Saif Al-jumaili b, Paulo Jorge Coelho c,d, Nuno M. Garcia b,e, Carlos Albuquerque f,g,h, Ivan Miguel Pires a
PMCID: PMC13068604  PMID: 41970170

Abstract

This dataset contains synchronised electrocardiogram (ECG), electrodermal activity (EDA), skin temperature, and accelerometer recordings collected from 44 adult participants while viewing 18 emotionally evocative video clips. The stimuli were selected from validated affective video repositories and represent diverse emotional categories, including neutral, positive, fear-related, social stress, sadness, humour, anger, shame, and interpersonal conflict. Physiological signals were acquired using BITalino (r)evolution boards at a sampling rate of 1000 Hz and stored as timestamped raw text files, with one file per stimulus to ensure consistent segmentation across participants. Each participant folder also includes structured demographic metadata in JSON format. Emotional categories correspond to stimulus-level intended elicitation targets and are not treated as participant-level ground truth. The fixed presentation sequence enables temporal comparability across subjects while preserving alignment of multimodal recordings. The dataset supports research in affective computing, psychophysiology, stress analysis, and emotion recognition, and complements existing resources by providing synchronised raw physiological signals aligned with discrete, validated video stimuli. The dataset is openly accessible via Mendeley Data.

Keywords: Emotion recognition, Physiological data, Stress detection, Sensors, Dataset


Specifications Table

Subject Computer Sciences
Specific subject area Affective Computing, Psychophysiology, Emotion Recognition
Type of data Raw data presented as text files
Data collection Data collection was performed using BITalino (r)evolution boards (PLUX Wireless Biosignals, S.A.), which recorded ECG, EDA, skin temperature, and accelerometer signals. Participants wore the NK-G04E-VR Virtual Reality Glasses during the experiment to view 18 emotionally evocative videos designed to elicit a range of emotions. Data were captured at a 1000 Hz sampling rate and stored with timestamps. Demographic data (age, gender, health conditions) were also collected. Participants were healthy adults, and those with neurological disorders were excluded. No data normalization was applied.
Data source location Data were collected in the Viseu District (40.6569° N, 7.9125° W) and Aveiro District (40.6405° N, 8.6538° W), Portugal. Participants viewed emotional video stimuli through NK-G04E-VR Virtual Reality Glasses in a non-laboratory setting, and physiological signals were recorded using BITalino sensors.
Data accessibility Repository name: Physiological Responses to Emotional Video Stimuli: ECG, EDA, and Temperature Data
Data identification number: 10.17632/8dr667fxhz.1
Direct URL to data: https://data.mendeley.com/datasets/8dr667fxhz/1
Instructions for accessing these data: Data can be accessed directly through the provided DOI link. No authentication is required for download.
Related research article None

1. Value of the Data

  • The dataset spans a broad range of emotional categories, enabling the study of diverse affective responses within a single experimental framework.

  • The synchronised ECG, EDA, and temperature signals provide a multimodal resource for investigating physiological correlates of emotion and stress.

  • The fixed sequence of video stimuli allows analysis of the temporal evolution of physiological responses across trials.

  • The inclusion of demographic metadata enables exploration of inter-individual variability, including age- and gender-related effects.

  • The dataset may support the development and comparative evaluation of emotion-recognition approaches in wearable sensing, stress monitoring, and human-computer interaction research.

2. Background

This dataset enables researchers to examine how the body responds to emotional stimuli, which is essential across fields such as affective computing, psychophysiology, and mental health. Emotional states lead to measurable changes in the autonomic nervous system that can be tracked using signals such as electrocardiography (ECG), electrodermal activity (EDA), skin temperature, and accelerometers [1]. These signals offer objective methods of identifying emotions and support applications in human-computer interaction, stress monitoring, and mental health evaluation [2,3].

Existing datasets in the literature, such as AMIGOS [4], CASE [5], and WESAD [6], have limitations in terms of emotion diversity, signal timing, and participant diversity. Our dataset complements these by providing ECG, EDA, and temperature data from 44 participants recorded while they watched 18 video clips designed to evoke different emotions. These videos span a wide range of emotions, from basic to complex, including humour, intense fear, shame, and guilt. This dataset enables investigation of physiological dynamics across multiple emotional categories and supports the development of emotion-recognition approaches, making it a useful resource for examining stress and emotion in multiple contexts.

Table 1 highlights differences in sample size, sensing modalities, stimulus structure, and annotation strategy, clarifying the positioning and complementary nature of the proposed dataset.

Table 1.

Comparison of selected physiological emotion datasets.

Dataset | Participants | Physiological Signals | Stimuli Type | Annotation Type | Experimental Structure
AMIGOS [4] | 40 | ECG, EDA, EEG, GSR | Short and long emotional videos | Self-report valence/arousal | Individual and group settings
CASE [5] | 30 | ECG, EDA, respiration | Continuous affective stimuli | Continuous valence/arousal annotations | Continuous annotation protocol
WESAD [6] | 15 | ECG, EDA, EMG, respiration, temperature | Stress induction tasks (Trier Social Stress Test and others) | Stress condition labels | Controlled laboratory stress protocol
Proposed dataset | 44 | ECG, EDA, skin temperature, accelerometer | 18 validated emotional video clips | Stimulus-level emotional categories (no participant-level self-report) | Fixed sequence, one file per stimulus

3. Data Description

The dataset presented in this paper is stored in a Mendeley Data repository [7]. The data are organised into folders, one per participant. Each folder contains the physiological recordings associated with the 18 video clips. For each stimulus, a separate file named bitalino_n.txt is created, where n ranges from 1 to 18, corresponding to the video number. In addition, a user_data.json file is included in the same folder, containing the participant's demographic metadata.

Each recording file (bitalino_n.txt) has the following structure:

  • First column: Timestamp of each sample (ms)

  • Columns 2–5: Digital inputs and outputs (I1, I2, O1, O2)

  • Column 6: AI1 channel (unused in this experiment)

  • Column 7: AI2 channel, recording ECG signal (mV)

  • Column 8: AI3 channel, recording EDA signal (mV)

  • Column 9: AI4 channel, recording Skin Temperature (mV)

  • Column 10: AI5 channel, recording Accelerometer signals (mV)

Example data from bitalino_1.txt:

Time (ms), I1, I2, O1, O2, AI1, AI2, AI3, AI4, AI5
1755790652815, 0, 0, 0, 0, 0, 521, 15, 502, 628
1755790652825, 0, 0, 0, 0, 0, 519, 16, 496, 627
1755790652835, 0, 0, 0, 0, 0, 513, 15, 498, 629
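As a minimal sketch, each recording file can be parsed with the Python standard library. The column aliases below are our own naming for the layout listed above, and the code assumes every file begins with a header row like the one shown; verify this against the downloaded files.

```python
import csv

# Our own aliases for the ten columns described in the Data Description.
COLUMNS = ["time_ms", "I1", "I2", "O1", "O2",
           "AI1", "AI2_ecg", "AI3_eda", "AI4_temp", "AI5_acc"]

def load_bitalino(path):
    """Load one raw bitalino_n.txt file into a list of dicts keyed by channel.

    Values are kept as raw integers (timestamps in ms, channel readings as
    stored); no normalisation is applied, matching the dataset description.
    """
    rows = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the "Time (ms), I1, I2, ..." header line
        for raw in reader:
            values = [int(v.strip()) for v in raw]
            rows.append(dict(zip(COLUMNS, values)))
    return rows
```

A call such as `load_bitalino("1/bitalino_1.txt")` then yields one dict per sample, so the ECG trace for that clip is `[r["AI2_ecg"] for r in rows]`.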

The following subsections will explain the relevant data for this study, including ECG, EDA, and Temperature data. The AI5 channel contains accelerometer data that may be used to identify motion artefacts during recording.
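As a hedged illustration of that use of the AI5 channel, motion can be screened with a simple per-window variance rule. The 1-second window (1000 samples at 1000 Hz) and the variance threshold are illustrative assumptions, not values validated for this dataset.

```python
# Sketch: flag windows where accelerometer variance suggests movement.
# window=1000 corresponds to 1-second windows at the 1000 Hz sampling rate;
# the threshold is an arbitrary example value and should be tuned per setup.

def flag_motion_windows(acc, window=1000, threshold=25.0):
    """Return one boolean per full window: True where the variance of the
    raw AI5 samples exceeds `threshold` (possible motion artefact)."""
    flags = []
    for start in range(0, len(acc) - window + 1, window):
        seg = acc[start:start + window]
        mean = sum(seg) / window
        var = sum((x - mean) ** 2 for x in seg) / window
        flags.append(var > threshold)
    return flags
```

Flagged windows can then be excluded from, or down-weighted in, downstream ECG/EDA analyses.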

3.1. ECG data (AI2)

The ECG signal was recorded from electrodes placed on the upper chest using channel AI2. Values are stored in millivolts (mV). This channel provides cardiac activity patterns. Fig. 1 shows an example ECG waveform over time.

Fig. 1.


Example ECG waveform from one participant during a single video clip (AI2 channel). This trace is shown for illustrative purposes, indicating the signal format and typical morphology; it is not an aggregate statistic across participants.
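As a hedged sketch of what the AI2 channel supports, R-peaks can be located with a simple amplitude-plus-refractory rule. The 70 % amplitude threshold and 200 ms refractory period are our own assumptions; dedicated ECG toolkits are preferable for serious analysis of this dataset.

```python
# Sketch: naive R-peak detection on raw AI2 samples (illustrative only).

def detect_r_peaks(ecg, fs=1000, refractory_ms=200):
    """Return sample indices of local maxima above a fixed amplitude
    threshold, at least `refractory_ms` apart. fs matches the 1000 Hz
    sampling rate of the dataset."""
    threshold = min(ecg) + 0.7 * (max(ecg) - min(ecg))  # assumed threshold
    refractory = int(refractory_ms * fs / 1000)
    peaks, last = [], -refractory
    for i in range(1, len(ecg) - 1):
        if (ecg[i] >= threshold and ecg[i] >= ecg[i - 1]
                and ecg[i] > ecg[i + 1] and i - last >= refractory):
            peaks.append(i)
            last = i
    return peaks

def heart_rate_bpm(peaks, fs=1000):
    """Mean heart rate from consecutive R-R intervals, or None if too few."""
    if len(peaks) < 2:
        return None
    intervals = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(intervals) / len(intervals))
```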

3.2. EDA data (AI3)

Electrodermal activity was recorded through the AI3 channel. Values are expressed in millivolts (mV), reflecting skin conductance changes associated with sympathetic nervous system activity. Fig. 2 illustrates the EDA signal captured during one of the clips.

Fig. 2.


Example EDA signal from one participant during a single video clip (AI3 channel). This trace is provided as an illustrative example of the recorded signal and is not an aggregate statistic across participants.

3.3. Temperature data (AI4)

Skin temperature was recorded with the AI4 channel; values are stored in millivolts (mV), and conversion details are available in the BITalino temperature sensor documentation [8]. This sensor provides insight into peripheral thermal regulation during emotional elicitation. Example data show relatively stable readings across trials (Fig. 3).
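As a hedged sketch only, one common form of the BITalino TMP transfer function (MCP9700-based sensor: 500 mV at 0 °C, 10 mV/°C) is shown below. It assumes raw 10-bit ADC samples and a 3.3 V operating voltage; verify both assumptions, and the channel scaling, against the sensor datasheet [8] before interpreting absolute temperatures.

```python
# Sketch: raw ADC counts -> degrees Celsius under the assumptions above.

def adc_to_celsius(adc, n_bits=10, vcc=3.3):
    """Convert a raw temperature sample to degrees Celsius, assuming the
    MCP9700-style transfer function from the BITalino TMP datasheet."""
    voltage = (adc / (2 ** n_bits)) * vcc   # ADC counts -> volts
    return (voltage - 0.5) / 0.01           # 500 mV offset, 10 mV/°C
```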

Fig. 3.


Example skin temperature signal from one participant during a single video clip (AI4 channel). This trace is shown as an illustrative example of the recorded data and is not an aggregate statistic across participants.

3.4. Other data

Along with sensor data, each participant folder has a file called user_data.json that has metadata on the participant's age, gender, exercise habits, diet, health issues, and other important information.

The file is formatted in JSON. Here is an example of the metadata that was gathered for one of the participants:

Example data

 {
  "ID": 1,
  "Gender": "Male",
  "Age": 27,
  "Weight (Kg)": 58,
  "Height (cm)": 174,
  "Do you follow any specific diet?": "No",
  "How many portions of fruit or vegetables do you consume per day?": "1–2",
  "Approximately how much water do you consume per day?": "From 0.6L to 1L (one large bottle)",
  "How often do you exercise (including walking, running, swimming, cycling, or playing other team or individual sports)?": "4 - 6 times per week",
  "For how long, on average, do you do it (each time)?": "Up to 30 min",
  "What types of exercises do you prefer?": "Individual Sports",
  "Do you have any health conditions that might impact your stress level?": "No",
  "If yes, which ones? [Musculoskeletal]": "No",
  "If yes, which ones? [Respiratory]": "No",
  "If yes, which ones? [Cardiac]": "No",
  "If yes, which ones? [Metabolic]": "No",
  "If yes, which ones? [Neurological]": "No",
  "If yes, which ones? [Others]": "No",
  "Do you usually use any strategy to relieve stress?": "Yes",
  "If yes, which ones? [Benzodiazepines (e.g., diazepam, alprazolam)]": "No",
  "If yes, which ones? [Antidepressants (e.g., fluoxetine, sertraline)]": "No",
  "If yes, which ones? [Natural supplements (e.g., valerian, lemon balm)]": "No",
  "If yes, which ones? [Others (please specify): playing the piano]": "Yes",
  "If yes, which ones? [I don't know/I don't remember the name]": "No",
  "Does tobacco consumption influence your stress level?": "No",
  "Does alcohol consumption affect your stress level?": "I do not consume alcohol",
  "On average, how many hours do you usually sleep per night?": "6 h",
  "How would you describe your physical state at the moment?": "Good",
  "How do you rate your mental/cognitive state (e.g., concentration, memory) at the moment?": "Good",
  "How would you describe your emotional/psychological state (e.g., anxiety level, mood) at the moment?": "Balanced",
  "On a scale of 1 (very low) to 5 (very high), what is your interest in participating in this study?": 5
 }
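A minimal sketch for reading one metadata file with the standard library. The field names follow the example above; which fields to extract, and the lowercase output keys, are our own choices for illustration.

```python
import json

def load_participant(path):
    """Read a user_data.json file and return a small demographic summary.

    Keys such as "ID", "Gender", and "Age" match the metadata example;
    the remaining questionnaire fields stay available in the full dict.
    """
    with open(path, encoding="utf-8") as f:
        meta = json.load(f)
    return {
        "id": meta["ID"],
        "gender": meta["Gender"],
        "age": meta["Age"],
    }
```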

4. Experimental Design, Materials, and Methods

4.1. Participants

A total of 44 volunteers participated in this study, comprising 12 women and 32 men. Participants ranged in age from 25 to 72 years, with an average of 36.2 years (SD = 11.9). A summary of the demographic variables is presented in Table 2. Before inclusion, all individuals signed an informed consent form that guaranteed the confidentiality and anonymisation of their data. The consent procedure was conducted in accordance with ethical standards, ensuring that participants were fully informed about the study's objectives and any potential risks. Only individuals who agreed to these conditions were included in the dataset. The research protocol was reviewed and approved by the Ethics Committee of the University of Aveiro (reference 69-CED/2024), and participants were informed that their anonymised data would be made publicly accessible via Mendeley Data [7].

Table 2.

Participant demographics.

Data Acquired Distribution (Total)
Age 25 (Min), 36.18 ± 11.87 (Mean), 72 (Max) years
Gender 32 (Male), and 12 (Female)
Weight 54 (Min), 74.51 ± 12.88 (Mean), 102 (Max) kg
Height 160 (Min), 167.28 ± 8.88 (Mean), 189 (Max) cm
Sleep Duration 6 (Min), 6.0 ± 0.0 (Mean), 6 (Max) hours

The sample's health profile was diverse. Some participants reported conditions that could influence stress or overall health, including musculoskeletal (n = 1), respiratory (n = 2), cardiac (n = 2), and metabolic (n = 1) disorders. No cases of neurological disorders were identified, and two individuals reported other health issues not covered by the predefined categories. This variability in clinical conditions contributes to a richer dataset, enabling the exploration of how different pathologies may interact with stress responses and physiological monitoring.

4.2. Materials

The experimental setup consisted of three main components: the BITalino (r)evolution board, the NK-G04E-VR Virtual Reality Glasses, and disposable electrodes.

The BITalino (r)evolution board (PLUX Wireless Biosignals, S.A.) [9,10], Fig. 4, is a versatile biosignal acquisition device that integrates multiple physiological sensors and supports both Bluetooth (BT) and Bluetooth Low Energy (BLE) communication. It includes a rechargeable Li-Po battery, a micro-USB charging port, and expansion ports for connecting electrodes and additional sensors. The board supports sampling rates up to 1000 Hz with a 10-bit-resolution ADC. Among its integrated sensors are ECG, EDA, EMG, and an accelerometer, making it suitable for acquiring multimodal physiological data.

Fig. 4.


BITalino (r)evolution biosignal acquisition board.

The NK-G04E-VR Virtual Reality Glasses (Fig. 5) were used to present the emotional video stimuli. The headset is made of ABS material and features an adjustable head strap and ergonomic belt for participant comfort. It features optical lenses with adjustable pupil and focal distance, providing a viewing angle of 90–100°.

Fig. 5.


NK-G04E-VR virtual reality glasses.

Finally, disposable pre-gelled electrodes were employed to ensure reliable skin contact for biosignal acquisition. These electrodes were connected to the BITalino device through colour-coded electrode cables, following standard placement protocols for ECG and EDA recordings.

4.3. Experimental setup

The emotional video stimuli used in this study were selected from validated scientific repositories and previous research on affective video databases. The initial pool of videos was gathered from four primary sources:

  • The EmoStim Database, a validated repository of emotional film clips developed at the University of Geneva [11].

  • The dataset described by Maffei and Angrilli [12] provides standardised emotional video stimuli.

  • The emotional video set curated by researchers from the University of Minho, focusing on affective responses in Portuguese participants [13].

  • A recent dataset published on Research Square that explores validated clips for experimental emotion elicitation [14].

After careful selection and validation, 18 video clips were selected for the experiment. The clips were categorised according to the predominant emotion they were expected to elicit. The final distribution of emotional categories is presented in Table 3. The emotional categories assigned to the selected video clips correspond to the intended elicitation targets as defined in the source repositories. These categories should be interpreted as stimulus-level labels and not as confirmed participant-level emotional ground truth. The dataset does not include subjective self-report emotion ratings for each clip.

Table 3.

Stimuli videos.

1. Neutral
2. Neutral
3. Positive / Relaxing
4. Sadness
5. Mild Fear / Anxiety
6. Mild Fear / Anxiety 2
7. Intense Joy
8. Social Stress
9. Anger / Irritation
10. Humor
11. Shame / Guilt
12. Intense Fear
13. Satisfaction / Love
14. Interpersonal Conflict
15. Surprise
16. Sadness – Isolation
17. Neutral
18. Neutral

Each participant was exposed to all 18 clips while their physiological responses (ECG, electrodermal activity, and skin temperature) were continuously recorded using the experimental setup described in the Materials and Procedure subsections. The videos were presented in the same predetermined sequence for all participants. This approach was chosen to ensure temporal comparability across subjects and to facilitate consistent segmentation and alignment of physiological recordings. While this design improves between-subject comparability, it may introduce order, habituation, or carryover effects, which should be considered in downstream analyses.
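Because the presentation order is fixed, a single lookup table suffices to attach the intended stimulus-level category from Table 3 to each bitalino_n.txt file. The mapping below transcribes Table 3; as noted above, these are elicitation targets, not participant-level ground truth.

```python
# Stimulus number -> intended emotional category (Table 3).
STIMULUS_CATEGORIES = {
    1: "Neutral", 2: "Neutral", 3: "Positive / Relaxing", 4: "Sadness",
    5: "Mild Fear / Anxiety", 6: "Mild Fear / Anxiety 2", 7: "Intense Joy",
    8: "Social Stress", 9: "Anger / Irritation", 10: "Humor",
    11: "Shame / Guilt", 12: "Intense Fear", 13: "Satisfaction / Love",
    14: "Interpersonal Conflict", 15: "Surprise", 16: "Sadness - Isolation",
    17: "Neutral", 18: "Neutral",
}

def category_for_file(filename):
    """Map a recording filename such as 'bitalino_7.txt' to its category,
    relying on the fixed presentation sequence shared by all participants."""
    n = int(filename.removeprefix("bitalino_").removesuffix(".txt"))
    return STIMULUS_CATEGORIES[n]
```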

4.4. Procedure

Data acquisition for Physiological Responses to Emotional Video Stimuli was conducted using the BITalino integrated circuit, with data recorded via the BitalinoScientiSST custom Android application installed on the researcher's smartphone [15]. Participants watched emotional video clips using a VR headset while their physiological responses were captured. The following steps outline the procedure used to collect the dataset:

  • Prepare the equipment: The participant wore a VR headset to watch the emotional video clips. Electrodes from the BITalino device were securely attached to the participant's skin to capture physiological signals.

  • Open the application: The researcher launched the BitalinoScientiSST custom-built Android application (Fig. 6a). In the settings menu, the researcher searched for and paired the BITalino integrated device via Bluetooth (Fig. 6b).

  • Enter participant ID: A unique numeric participant ID was entered into the application to ensure anonymisation and to organise data storage.

  • Start the test: Once all devices were connected and the participant was ready, the researcher pressed the "Start" button (Fig. 6a) to initiate physiological data collection.

  • Splitting the collection: Since the experimental protocol involved 18 video clips, the researcher pressed the "New" button before each video to start a new collection segment corresponding to that specific stimulus (Fig. 6c and d).

  • Data collection: The BITalino sensors recorded ECG, electrodermal activity (EDA), and skin temperature. Data were stored in a numbered folder. Inside this folder, the system created multiple text files, including:
    • bitalino_1.txt, bitalino_2.txt, …, bitalino_18.txt – containing data splits for each of the 18 videos
  • End the test: After the participant completed the test, the researcher pressed the "Stop" button in the application to terminate recording. The VR glasses and electrodes were then removed.

  • Store additional data: In the same folder as the physiological data, the user_data.json file contained additional demographic information collected from the participant.
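The folder layout produced by the steps above (one numbered folder per participant, each holding user_data.json and bitalino_1.txt through bitalino_18.txt) can be indexed with a short sketch; `root` is a placeholder for wherever the Mendeley archive was extracted.

```python
import os

def index_dataset(root):
    """Walk the dataset root and return {participant_folder: sorted list of
    bitalino_n.txt recording files}, matching the layout described above."""
    index = {}
    for entry in sorted(os.listdir(root)):
        folder = os.path.join(root, entry)
        if not os.path.isdir(folder):
            continue  # skip any stray files at the top level
        recs = sorted(
            f for f in os.listdir(folder)
            if f.startswith("bitalino_") and f.endswith(".txt")
        )
        index[entry] = recs
    return index
```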

Fig. 6.


Mobile application (BitalinoScientiSST) interface used during data acquisition. (a) The main acquisition screen is used to start and stop recording after the participant ID is entered. (b) The Bluetooth device discovery and pairing screen is used to connect the BITalino device. (c) New-segment action used immediately before each video stimulus to split the session into stimulus-aligned recordings. (d) Example of the acquisition screen after creating a new segment for the next stimulus. Pressing New creates a new output file bitalino_n.txt (n = 1 to 18), with one file per video clip, matching the file organisation described in the Data Description section.

4.5. Sensor placement

For this study, BITalino sensors were used to record physiological signals, including electrocardiogram (ECG), electrodermal activity (EDA), and skin temperature. To ensure consistency across participants, electrodes were attached at standardised body locations, as illustrated in the figures below, and each signal was stored according to a predefined naming convention. The ECG electrodes were placed according to the conventions for limb and precordial leads, with precordial electrodes placed on the left side, near the heart, at specific anatomical locations. This configuration enabled the capture of reliable cardiac activity. The EDA electrodes were attached to the ventral side of the forearm to reliably capture skin conductance activity. The temperature sensor was placed on the arm to detect temperature variations associated with emotional reactions; this location provided a stable reading of peripheral skin temperature while minimising interference from participant movements. Fig. 7 shows an example of electrode placement during one experiment.

Fig. 7.


Example of Electrode placement during one experiment.

Limitations

The dataset includes 44 participants, which limits statistical power for subgroup analyses. The sex distribution is imbalanced, with 32 male and 12 female participants, and the age distribution, although broad (25–72 years), is not evenly represented across age groups.

Sex- and age-related differences in autonomic nervous system function are well documented. Physiological markers such as heart rate variability, electrodermal activity, and peripheral temperature regulation may vary according to hormonal, cardiovascular, and metabolic factors. Consequently, models trained directly on this dataset may reflect patterns more representative of the predominant demographic groups and may not generalise equally to underrepresented populations.

All participants viewed the 18 video clips in a fixed order, so order and fatigue effects cannot be excluded. Physiological signals are provided in raw form; motion artefacts, sensor contact variability, and preprocessing choices may influence downstream analyses. Health information is self-reported and not clinically verified. Additionally, emotional categories represent intended stimulus classifications and were not individually validated through self-report measures for each participant.

Ethics Statement

The research protocol was reviewed and approved by the Ethics Committee of the University of Aveiro (reference 69-CED/2024), and participants provided informed consent.

CRediT Author Statement

António Oseas Pataca: Conceptualization; Methodology; Software; Formal analysis; Investigation; Data Curation; Writing - Original Draft; Writing - Review & Editing; Saif Al-jumaili: Conceptualization; Methodology; Software; Formal analysis; Writing - Original Draft; Writing - Review & Editing; Paulo Jorge Coelho: Conceptualization; Methodology; Software; Formal analysis; Writing - Original Draft; Writing - Review & Editing; Nuno M. Garcia: Conceptualization; Methodology; Validation; Writing - Review & Editing; Funding acquisition; Carlos Albuquerque: Conceptualization; Methodology; Validation; Writing - Review & Editing; Supervision; Funding acquisition; Ivan Miguel Pires: Conceptualization; Methodology; Validation; Formal analysis; Resources; Data Curation; Writing - Review & Editing; Supervision; Funding acquisition.

Acknowledgements

This work is funded by national funds through FCT – Fundação para a Ciência e a Tecnologia, I.P., and, when eligible, co-funded by EU funds under project/support UID/50008/2025 – Instituto de Telecomunicações, with DOI identifier - https://doi.org/10.54499/UID/50008/2025.

This work is also funded by national funds through FCT – Fundação para a Ciência e a Tecnologia, I.P., and, when eligible, co-funded by EU funds under project/support UID/00308/2025 – Instituto de Engenharia de Sistemas e Computadores de Coimbra - INESC Coimbra, with DOI identifier - https://doi.org/10.54499/UID/00308/2025.

This work is also funded by national funds through FCT – Fundação para a Ciência e a Tecnologia, I.P., and, when eligible, co-funded by EU funds under project/support UID/00645/2025, with DOI identifier - https://doi.org/10.54499/UID/00645/2025.

This work is also funded by national funds through FCT – Fundação para a Ciência e a Tecnologia, I.P., and, when eligible, co-funded by EU funds under project/support UID/00742/2025, with DOI identifier - https://doi.org/10.54499/UID/00742/2025.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Data Availability

The dataset is openly available on Mendeley Data (doi: 10.17632/8dr667fxhz.1) [7].
References

  • 1. Liao Y., Gao Y., Wang F., Zhang L., Xu Z., Wu Y. Emotion recognition with multiple physiological parameters based on ensemble learning. Sci. Rep. 2025;15. doi: 10.1038/s41598-025-96616-0.
  • 2. Huang X., Zhu S., Wang Z., He Y., Jin H., Liu Z. EVA-MED: enhanced valence-arousal multimodal emotion dataset. 2025. doi: 10.1038/s41597-025-06214-y. https://arxiv.org/abs/2503.16584
  • 3. Strizhkova V., Kachmar H., Chaptoukaev H., Kalandadze R., Kukhilava N., Tsmindashvili T., Abo-Alzahab N., Zuluaga M.A., Balazia M., Dantcheva A., Brémond F., Ferrari L.M. MVP: multimodal emotion recognition based on video and physiological signals. European Conference on Computer Vision, Milan, Italy, September 29–October 4, 2024, Proceedings, Part XV; pp. 101–116. doi: 10.1007/978-3-031-91581-9.
  • 4. Miranda-Correa J.A., Abadi M.K., Sebe N., Patras I. AMIGOS: a dataset for affect, personality and mood research on individuals and groups. IEEE Trans. Affect. Comput. 2018;12:479–493.
  • 5. Sharma R. CASE: a dataset of continuous affect annotations and physiological signals for emotion analysis. Sci. Data. 2019;6:1–13. doi: 10.1038/s41597-019-0209-0.
  • 6. Schmidt P., Reiss A., Duerichen R., Marberger C., Van Laerhoven K. Introducing WESAD, a multimodal dataset for wearable stress and affect detection. Proc. 20th ACM Int. Conf. Multimodal Interact. 2018; pp. 400–408.
  • 7. Pataca A.O., Albuquerque C., Pires I.M. Physiological responses to emotional video stimuli: ECG, EDA, and temperature data. Mendeley Data. 2025;V1. doi: 10.17632/8dr667fxhz.1.
  • 8. PLUX Wireless Biosignals, S.A. Temperature (TMP) Sensor Data Sheet. 2016. https://support.pluxbiosignals.com/wp-content/uploads/2021/11/revolution-tmp-sensor-datasheet.pdf
  • 9. Da Silva H.P., Guerreiro J., Lourenço A., Fred A., Martins R. BITalino: a novel hardware framework for physiological computing. Proc. Int. Conf. Physiol. Comput. Syst. 2014; pp. 246–253.
  • 10. Batista D., Plácido da Silva H., Fred A., Moreira C., Reis M., Ferreira H.A. Benchmarking of the BITalino biomedical toolkit against an established gold standard. Healthc. Technol. Lett. 2019;6:32–36. doi: 10.1049/htl.2018.5037.
  • 11. Somarathna R., Vuilleumier P., Mohammadi G. EmoStim: a database of emotional film clips with discrete and componential assessment. IEEE Trans. Affect. Comput. 2023;15(3):1202–1212. doi: 10.1109/TAFFC.2023.3328900.
  • 12. Maffei A., Angrilli A. E-MOVIE: experimental MOVies for induction of emotions in neuroscience. PLoS One. 2019;14. doi: 10.1371/journal.pone.0223124.
  • 13. Carvalho S., Leite J., Galdo-Álvarez S., Gonçalves Ó.F. The emotional movie database (EMDB): a self-report and psychophysiological study. Appl. Psychophysiol. Biofeedback. 2012;37(4):279–294. doi: 10.1007/s10484-012-9201-6.
  • 14. Carvalho S., Coelho C.G., Mendes A.J., Gonçalves Ó., Leite J. The emotional movie database (EMDB): an expanded toolkit for emotion research. Motiv. Emot. 2026:1–26. doi: 10.1007/s11031-026-10220-x.
  • 15. Pataca A.O., Albuquerque C., Pires I.M. Approach framework on data collection, processing, and classification method for stress detection using wearable sensor data. Proc. 22nd Int. Conf. Mob. Syst. Pervasive Comput. (MobiSPC 2025); Leuven, Belgium; 2025. pp. 404–411.
