International Journal of Developmental Disabilities. 2022 Dec 15;70(5):887–903. doi: 10.1080/20473869.2022.2154928

The missing piece. Physiological data as a factor for identifying emotions of people with profound intellectual and multiple disabilities

Torsten Hammann 1, Jakob Valič 2, Gašper Slapničar 2, Mitja Luštrek 2

Abstract

Introduction: The preferences of people with profound intellectual and multiple disabilities (PIMD) often remain unfulfilled because it is challenging to decode their idiosyncratic behavior, which negatively affects their quality of life (QoL). Physiological data (i.e. heart rate (variability) and motion data) might be the missing piece for identifying emotions of people with PIMD, with a positive effect on their QoL.

Method: Machine learning (ML) processes and statistical analyses are integrated to discern and predict the potential relationship between physiological data and emotional states (i.e. numerical emotional states, descriptive emotional states and emotional arousal) in everyday interactions and activities of two participants with PIMD.

Results: Emotional profiles were created enabling a differentiation of the individual emotional behavior. Using ML classifiers and statistical analyses, the results regarding the phases partially confirm previous research; the findings for the descriptive emotional states were good, and those for the emotional arousal even better.

Conclusion: The results show the potential of the emotional profiles, especially for practitioners, and the possibility of gaining better insight into the emotional experience of people with PIMD by including physiological data. This seems to be the missing piece for better recognizing the emotions of people with PIMD, with a positive impact on their QoL.

Keywords: profound intellectual and multiple disabilities, emotions, physiological data, heart rate (variability), quality of life, everyday life, machine learning

Introduction

People with profound intellectual and multiple disabilities (PIMD) are a comparatively small group – estimated at 0.15 to 0.25% of the general population (Mohr 2011) – with quite heterogeneous causes of their disabilities and individual competences (Axelsson et al. 2014, Nakken and Vlaskamp 2007). Usually, people with PIMD are characterized by significantly low intellectual functioning, low adaptive behavior, sensory and motor impairments, complex health problems (Nakken and Vlaskamp 2007, World Health Organization 2019), and non-use of verbal language or symbolic communication (Bellamy et al. 2010, Maes et al. 2007). In order to approach this group of people, it is important to consider the International Classification of Functioning, Disability and Health (ICF) as well, since the ICF takes into account not only the described impairments but also environmental factors with promoting or inhibiting effects (World Health Organization 2001). Part of these environmental factors are direct interaction partners, especially with regard to communicative processes (Hostyn and Maes 2009). A high level of adaptation is required from the interaction partners to be understood by persons with PIMD, as well as to decode their mostly unconventional and idiosyncratic behavior signals, such as certain facial, physical or vocal cues. Especially for unfamiliar interaction partners, this is a challenging task. If the environment lacks the necessary knowledge about a specific person with PIMD (Brady et al. 2012), the wishes and feelings of these people often remain unfulfilled (Petry and Maes 2006). This in turn has a negative impact on the satisfaction of their physiological and social needs, the development of cognitive, communicative and emotional competences, and their quality of life (QoL) as a whole (Axelsson et al. 2014, Maes et al. 2007, van der Putten et al. 2018).

There has been increasing scientific interest in QoL concepts for people with PIMD (Hammann and Engelhardt 2022, Nieuwenhuijse et al. 2017). In essence, these concepts provide a framework to assess both objective living conditions and subjective well-being, to identify strategies for improving QoL, and in turn to evaluate these strategies for their effectiveness (Felce and Perry 1995, Petry and Maes 2009). The domains of QoL, i.e. physical wellbeing, material wellbeing, social wellbeing, development and activity, and emotional wellbeing (Felce and Perry 1995), are valid and relevant for people with PIMD (Hammann and Engelhardt 2022). Unfortunately, measuring QoL in people with PIMD is challenging, as common approaches including direct questioning are mostly not applicable (Hammann and Engelhardt 2022, Nieuwenhuijse et al. 2022). In particular, QoL domains such as social wellbeing, development and activity, and emotional wellbeing could be positively influenced by better insight into the emotional expression of people with PIMD (Bermejo et al. 2014, Krämer and Zentel 2020).

In contrast to the field of QoL, studies in emotion research with people with PIMD are still scarce (Adams and Oliver 2011, Krämer and Zentel 2020). When conducting emotion research, the interpretation of participants’ reactions always relies on the underlying understanding of emotions. According to appraisal theories, an emotion is not a fixed state but a recursive and largely unconscious process with mutually influencing components: The cognitive appraisal of a stimulus (e.g. objects, activities or social interactions) based on personal experiences leads to a motivational change in action tendencies followed by both physiological responses in the autonomic nervous system (e.g. cardiovascular changes) and behavioral responses (e.g. changes in facial expression, vocalization or movements) (Lazarus 1991, Moors et al. 2013). These responses can be classified using the well-known Circumplex model as an example of a dimensional approach that locates emotional states according to the intersections of the two dimensions arousal (low–high) and valence (positive–negative) (Russell 1980). Empirical findings support the dimensional approach especially with regard to the subjective, physiological and expressive components (e.g. Bradley et al. 2001, Bradley and Lang 2000).

In terms of people with PIMD, the greatest difficulty lies in the subjective component and related self-reports. Typically, emotion research uses standardized stimuli to elicit the emotions being investigated across all participants. For people with PIMD, this procedure is often not feasible since the necessary self-reports commonly used as an important information source (Petry and Maes 2006, Schalock et al. 2002) can hardly be obtained due to the communicative difficulties described previously. Therefore, individualized stimuli known from their everyday life are usually selected by close direct support persons (DSPs), who can be confident that these stimuli evoke the specific emotions (Krämer and Zentel 2020, Lima et al. 2013, Vos et al. 2010, Vos et al. 2012, Vos et al. 2013).

The aforementioned study by Vos and colleagues (Vos et al. 2010) was partly, and the studies by Krämer, Lima, and colleagues (Krämer and Zentel 2020, Lima et al. 2013) entirely, performed in laboratory settings with the aim of reducing confounding influences. In contrast, a less frequently used approach is based on observations directly in the field, i.e. in the participants’ daily lives, followed by emotional coding by DSPs (Marteau et al. 2016, Petry and Maes 2006). The advantage of this approach is that participants with PIMD can remain in their familiar environment and the influence of research activities (e.g. when using stimuli) is reduced as well. However, this complicates the effort to exclude confounding influences.

In addition to mostly inapplicable self-reports when including people with PIMD, common proxy reports by DSPs tend to be lacking in quality as well, which is why a triangulated approach, i.e. linking different methods and sources of information, is recommended (Lyons et al. 2017). Especially regarding emotional expressions, the missing piece of this triangulation, besides observing the highly individual behavior signals of people with PIMD, might be the inclusion of the physiological component (Munde et al. 2012).

When presented with negative stimuli compared to positive ones, people with PIMD show higher heart rate (HR) variability, more respiratory sinus arrhythmia, slower breathing and lower skin conductance (Vos et al. 2010, 2013). In addition to HR, skin temperature allows the same conclusions about positive and negative emotions in both people with and without disabilities (Vos et al. 2012). Moreover, analyzing HR and respiration can be valuable for assessing alertness in people with PIMD (Munde et al. 2012). This also has implications for the assessment of emotions, as attention is usually turned away from negative stimuli (Vos et al. 2010). Thereby, the use of the same stimuli usually produces consistent physiological responses of HR and skin conductance (Lima et al. 2013). Another way to measure emotions in people with PIMD might be startle reflex modulation, usually measured by eye blink magnitude (Lyons et al. 2013). Presenting live information on physiological changes in people with PIMD to their DSPs can support interaction quality and thus emotional bonding between the two partners in everyday life (Frederiks et al. 2019).

A new step for research in social science is the use of artificial intelligence – more specifically, machine learning (ML). ML means that complex algorithms derive knowledge from sample data; the resulting model can in turn be applied to yet unknown data for analysis (Grimmer et al. 2021). This approach has already been implemented for emotion prediction (e.g. Domínguez-Jiménez et al. 2020), but it remains novel and challenging here, especially due to the person-centered evaluation within single case design (SCD) studies required in research including people with PIMD.

Our research integrates, inter alia, ML processes to discern and predict the potential relationship between physiological data, i.e. heart rate (variability) (HR(V)) as well as motion changes, and emotional states in everyday interactions and activities of two participants with PIMD following the subsequent research questions:

  • How can emotional states be distinguished in people with PIMD by analyzing their behavioral signals?

  • Are differences between phases of baseline and activities related to changes in HR(V) and movement data?

  • Are the emotional states related to changes in HR(V) and motion data?

Method

Participants

Our study involved two participants, Adam and Hanna (pseudonyms for anonymity), who were 8.0 and 6.5 years old when the data collection started in 2019. They lived at home with their families and attended the kindergarten of an institution for people with disabilities in Poland daily. In accordance with the principles of research ethics (see Ethics approval), both were selected by the DSPs to be part of the study due to their profound intellectual disabilities in combination with their lack of verbal language and symbolic communication. They were not able to walk independently and mostly stayed in their wheelchair, in a lying position on a mattress or similar, or on the lap of a DSP. For maintaining a sitting position, changing positions and physiological needs, they required assistance from a DSP. Other diagnoses mentioned for Hanna were vesicorenal reflux and cerebral palsy. The latter was manifested by increased muscle tension, a tonic labyrinthine reflex and a psychomotor delay while manipulating objects. Adam also had high muscle tone due to cerebral palsy and tetraparesis, and additionally a severe form of epilepsy (Lennox-Gastaut syndrome). Furthermore, the correct position was important to allow him to breathe comfortably. While no visual or hearing impairments were reported for Hanna, Adam had a visual impairment with alternating strabismus.

Measures

The Empatica E4 is a non-invasive and battery-powered wristband with a size of 44 × 40 × 16 mm and a weight of 40 g. Using a photoplethysmography (PPG) sensor on the top of the wrist, the E4 records blood volume pulse (BVP) with two green and two red LEDs (sampling frequency: 64 Hz; resolution: .9 nW/digit). A 3-axis accelerometer (ACC) allows for capturing motion-based activity (Empatica S.R.L. 2018). The electrodermal activity sensor on the ventral inner wrist measures changes in certain electrical properties of the skin using two stainless steel electrodes (sampling frequency: 4 Hz; resolution: 1 digit ∼900 pSiemens) and the infrared thermopile records skin temperature (sampling frequency: 4 Hz, resolution: .02 °C). The wireless E4 is similar to a wristwatch, which makes it comfortable and unobtrusive to wear (McCarthy et al. 2016). Based on the manufacturer’s instructions, it was placed on the wrist and worn just tight enough to allow both comfortable wearing and good signal quality (Empatica S.R.L. 2018). The focus of the present study was on the PPG and ACC sensors.
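For readers who want to work with comparable recordings, the following minimal Python sketch loads E4-style exports. It assumes the common Empatica CSV layout (first row: session start as a UNIX timestamp; second row: sampling frequency in Hz; remaining rows: samples) and the file names BVP.csv and ACC.csv; these are conventions of the export tooling, not details reported in this study.

```python
import numpy as np
import pandas as pd

def load_e4_csv(path):
    """Load one Empatica E4 export file (e.g. BVP.csv or ACC.csv).

    Assumed layout: row 0 = session start (UNIX time), row 1 = sampling
    frequency in Hz, remaining rows = samples (one column for BVP,
    three columns for the 3-axis ACC).
    """
    raw = pd.read_csv(path, header=None)
    start = float(raw.iloc[0, 0])              # session start (UNIX time)
    fs = float(raw.iloc[1, 0])                 # sampling frequency in Hz
    samples = raw.iloc[2:].to_numpy(dtype=float)
    t = start + np.arange(len(samples)) / fs   # per-sample timestamps
    return t, fs, samples

# Example: BVP sampled at 64 Hz, ACC at 32 Hz, per the E4 specification
t_bvp, fs_bvp, bvp = load_e4_csv("BVP.csv")
t_acc, fs_acc, acc = load_e4_csv("ACC.csv")
```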

In parallel with the physiological measurements, a video recording was made from two perspectives, as unobtrusively as possible, in order to record the behavior of the participants for the further procedure including the creation of the emotional profiles and the annotation process (see Procedure, Data Processing and Figure 1).

Figure 1. Overall research process analyzing emotional states and phases.

Procedure

The methodology of the study is oriented toward an SCD approach with AB design (Horner et al. 2005), meeting international standards regarding the number of measurement points (U.S. Department of Education 2017). Starting in November 2019, approximately 7.6 h of data (i.e. video recordings and physiological data) were collected on 18 measurement days for Adam and 6.5 h of data on 14 measurement days for Hanna over a period of 7.7 months (see Figure 1 for the overall research process). The total data set was recorded in the everyday life of the participants. Subsequently, the recordings were classified into A-phases and B-phases according to the SCD (see Table 1), resulting in multiple and repeated AB-phases of different durations. The A-phases – also known as the baseline phases – are defined by the absence of any specific activity offered by the DSP and no direct interaction. For ethical reasons, the A-phases are usually short in order to ensure the DSPs’ pedagogical objective of providing ongoing support. B-phases comprise one-to-one situations with the close DSPs from the daily life of Adam and Hanna in kindergarten, divided by content into categories common for the target group (van der Putten and Vlaskamp 2011). Table 2 shows the distribution of these categories.

Table 1.

Overview of the recordings including classification into phases, activities and excluded data.

                 Adam         Hanna
Total data set   7 h 37 min   6 h 32 min
A-phase          40 min       53 min
B-phase          6 h 57 min   5 h 39 min
Excluded data (unclear or covered behavioral signals and technical difficulties)   2 h 7 min   32 min
Final data set   5 h 30 min   6 h 0 min

The data sets always include video recordings and physiological data.

Table 2.

Distribution of the activity categories in the B-phases.

  Adam Hanna
Eating and drinking 36% 10%
Physically oriented activities 23% 40%
Audiovisual activities 24% 37%
Care 2% /
Transfer/position change 6% 4%
Little/moderate interaction 9% 9%

/ means no data.

In order to achieve the objective of annotating and further analyzing the emotional expression of both participants in all recordings, it was necessary to create individual emotional profiles of the participants. This creation initially required triangulated proxy reporting (Lyons et al. 2017) with close involvement of DSPs, as they are most familiar with the participants with PIMD and capable of evaluating their idiosyncratic behavior signals. Therefore, the INSENSION Questionnaire (Engelhardt et al. 2020) was filled in by these DSPs in the first step of the analysis of emotional expression. This questionnaire covers:

  • general data about the person with PIMD (i.e. information on age, gender, medical status, general competencies regarding motor, visual, and hearing skills as well as personal preferences or dislikes)

  • and information on his/her specific emotional behavior based on the occurrence frequency (i.e. always, often, or sometimes) related to the respective valence (i.e. pleasure, neutral, or displeasure).

In the second step, the DSPs went through 30% of the total data set of both participants and annotated the emotional expression based on a valence scale oriented to the Circumplex model (Russell 1980):

  • ‘7’ to ‘9’ stands for pleasure

  • ‘4’ to ‘6’ stands for neutral

  • ‘1’ to ‘3’ stands for displeasure

An annotation of ‘1’ means that extreme discomfort was recognized, while an annotation of ‘9’ corresponds to a very intense pleasure moment. Furthermore, a more extreme emotional state is accompanied by a higher level of arousal, which means that the annotation of ‘5’ represents the lowest arousal. The triangulation was completed by an additional validation of specific behavior signals. For this purpose, the researchers selected five ambiguous sequences for Adam and three for Hanna, with a length of five to 30 s depending on how long the specific behavior was shown in the recordings. Ambiguous sequences are defined as situations in which the researchers could no longer clearly analyze participants’ behavioral signals by using only the emotional profiles. Therefore, the DSPs, as experts on the participants with PIMD, evaluated the behavior signals in these recorded sequences without bias and decided whether this behavior was most likely an indication of pleasure, neutral or displeasure. Based on the results of the INSENSION Questionnaire, the validation of the ambiguous sequences and the annotation process, the emotional profiles (see Tables 6 and 7) were created using the expert knowledge of the DSPs (see Figure 1).

Table 6.

Emotional profile of Adam.

Valence scale Behavior description
Always Often Sometimes
Pleasure 9
  • Smiling widely and vocalizing/shouting for a long duration at a high volume

  • Moving jerkily his body, arms and legs in recurring manner

  • Overstretching his head backwards

 
8
  • Smiling widely

  • Vocalizing for a medium duration at a medium volume

  • Moving jerkily his body, arms and legs for a short duration

 
7
  • Smiling slightly

  • Vocalizing for a short duration at low volume

   
Neutral 6
  • Looking around with open eyes and partly opened mouth

 
  • Moving a little bit his arms and legs

5
  • Moving very little and slowly with his whole body

  • Semi-closed eyes

 
4
  • Acting passively or even a bit rejectively

  • Making snoring sound while breathing

 
Displeasure 3
  • Hitting his head backwards or turning his head away

  • Coughing briefly leading to slight tightening and tensing of arms and legs

  • Rubbing his legs together and outstretching slightly his arms

  • Staring gaze with widened eyes and tossing slightly his head from left to right for a short duration

 
2
  • Coughing several times leading to strong tightening and cramping of the arms and legs

  • Rubbing quickly and intensely his legs together and outstretching and raising his arms

  • Staring gaze with widened eyes and tossing his head from left to right for a short duration in recurring manner

 
1
  • Crying continuously and screaming at a high volume with narrowed eyes, wide-open mouth and tensely bent arms

   

Table 7.

Emotional profile of Hanna.

Valence scale Behavior description
Always Often Sometimes
Pleasure 9
  • Smiling widely and vocalizing/shouting at a high volume and a high pitch with narrowed eyes

  • Moving jerkily her body, arms and legs in a repetitive, struggling way

  • Overstretching strongly her body backwards

8
  • Smiling widely and vocalizing with a medium duration at a medium volume

  • Moving jerkily her body, arms and legs for a short duration

  • Overstretching slightly her body backwards

7
  • Smiling slightly for a short time

  • Vocalizing at a very low volume

  • Consciously manipulating objects with her arms and hands

  • Raising or bending slightly her arms

   
Neutral 6
  • Moving a little bit her body, arms or legs with raised eyebrows

   
5
  • Moving very little and slowly with her whole body

   
4
  • Moving her head, body, arms or legs slightly jerkily

 
  • Slight frown

  • Wrinkled nose

Displeasure 3
  • Turning down the corners of her mouth

  • Strong frown and wrinkled nose

  • Coughing briefly

  • Vocalizing shortly at a medium volume and a low pitch

  • Moving her head, body, arms or legs slightly in jerky manner

 
2
  • Vocalizing/moaning intermittently at a high volume and a low pitch with downturned corners of mouth, a strong frown and a wrinkled nose

  • Moving her head, body, arms or legs jerkily

 
1
  • Crying with her eyes closed and frowning heavily

   

Using the emotional profiles, the researchers annotated the same 30% of the total data set of each participant as the DSPs by means of the aforementioned valence scale. To assess the degree of agreement between the annotations of the researchers and the DSPs, inter-rater reliability was calculated in the ELAN software (ELAN (Version 6.0) 2020) using the modified Cohen’s kappa (Holle and Rein 2015) and an overlap criterion of 60%. Cohen’s kappa was 0.73 for Adam and 0.71 for Hanna, which corresponds to substantial agreement (Landis and Koch 1977). Afterwards, the researchers annotated the remaining 70% of the total data set of each participant (see Figure 1).
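As a rough illustration of the agreement statistic, the sketch below computes plain Cohen’s kappa over two coders’ interval labels. It is a simplified stand-in for the modified kappa of Holle and Rein (2015), which the authors computed inside ELAN and which additionally handles time-aligned annotations via an overlap criterion (60% here); the function and variable names are ours.

```python
import numpy as np

def cohens_kappa(coder_a, coder_b):
    """Plain Cohen's kappa between two coders' labels for the same
    sequence of intervals (chance-corrected agreement)."""
    a, b = np.asarray(coder_a), np.asarray(coder_b)
    p_observed = np.mean(a == b)                       # raw agreement
    categories = np.union1d(a, b)
    # expected agreement if both coders labeled independently
    p_expected = sum(np.mean(a == c) * np.mean(b == c) for c in categories)
    return (p_observed - p_expected) / (1 - p_expected)

# e.g. two coders rating five intervals on the 1-9 valence scale
print(cohens_kappa([7, 8, 5, 4, 2], [7, 8, 5, 5, 2]))  # -> 0.75
```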

Recordings with low-quality physiological data, which were in most cases due to technical difficulties of the Empatica wristband or artifacts caused by movements, were excluded. In addition, sequences in which participants showed completely unclear behavioral signals or in which certain body parts important for interpreting their behavior (especially facial expressions) could not be recognized (e.g. facing away from the camera or body parts hidden by objects) were not annotated by human coders. Therefore, the final data set was reduced to 5.5 h for Adam and 6 h for Hanna (see Table 1). In addition, Table 3 shows the distribution of each emotional state annotated after excluding data due to unclear or unrecognizable behavioral signals and technical difficulties. These results of the annotation process were used for further analysis including training the ML models and calculating statistical differences (see Figure 1).

Table 3.

Distribution of each emotional state annotated after excluding data due to unclear or unrecognizable behavioral signals and technical difficulties.

Valence scale B. H. A.
Pleasure 9 0.9% 1.4% 0.4%
8 10.4% 18.3% 2.8%
7 17.8% 31.5% 4.5%
Neutral 6 16.2% 22.1% 10.5%
5 24.9% 9.1% 40.3%
4 23.9% 13.3% 34.1%
Displeasure 3 2.8% 0.5% 5.0%
2 3.1% 3.8% 2.3%
1 / / /

A.: Adam; B.: both; H.: Hanna. / means no data.

Data processing

In order to use the PPG and ACC signals measured by the Empatica E4 wristband for further analysis, the recordings were divided into intervals of 30 s with a minimum length of 15 s, which is required for reliable extraction of HR(V) features. Using a custom peak detector based on the lowest and highest HR (50–150 bpm), a comprehensive set of HR(V) features known from previous studies (e.g. Domínguez-Jiménez et al. 2020) was extracted from the PPG signal for each interval. In addition, a large set of motion features developed over years of activity recognition research (Janko et al. 2018) was extracted from the ACC signal for each interval. These features are ‘the observed properties of the unit of analysis’ (Grimmer et al. 2021, p. 399) – in this case, of physiological data. For example, this includes the mean heart rate (HRmean) or the standard deviation of NN intervals (SDNN) in terms of HR(V) features, and velocity on the three axes (velocity (XBand, YBand, ZBand)) or the mean value for all axes (absoluteMean (XBand, YBand, ZBand)) in terms of motion features. Excluding those features with almost permanently constant values, the set of features was reduced to the most promising ten HR(V) features and 80 motion features (see Table 4).

Table 4.

Extracted features regarding HR(V) from the BVP measured by the PPG sensor and motion data from the ACC.

Signals Features
PPG HRmean, RMSSD, HRmedian, IBImedian, SDNN, SDSD, SDbonus1, SDbonus2, LF, HF
ACC absoluteArea (AllBand, XBand, YBand, ZBand), absoluteMean (XBand, YBand, ZBand), amplitude (XBand, YBand, ZBand), areaUnderAccelerationMagnitude, area (XLow, YLow, ZLow), averageVectorLength, averageVectorLengthPower, interQuartileRange (MagnitudesBand, XBand, YBand, ZBand), manipulationLow, meanCrossingRate (XBand, YBand, ZBand), meanKineticEnergy (XBand, YBand, ZBand), mean (XLow, YLow, ZLow), peaks (CountLow, PeakAvgLow, SumLow), pitchStdDevLow, postureDistance (XLow, ZLow), quartilesMagnitudes (XBand, YBand), quartilesQ1 (XBand, YBand, ZBand), quartilesQ2 (XBand, YBand, ZBand), quartilesQ3 (XBand, YBand, ZBand), roll (AvgLow, MotionAmountLow, StdDevLow), squareSumOfComponent (X, Y, Z), sumOfSquareComponents, sumPerComponent (XBand, YBand, ZBand), totalAbsoluteAreaBand, totalEnergy (XBand, YBand, ZBand), totalKineticEnergyBand, totalMagnitudeBand, totalMeanLow, variance (XBand, YBand, ZBand), velocity (XBand, YBand, ZBand)
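The interval segmentation and HR(V) extraction described before Table 4 can be sketched as follows. This is a minimal illustration, not the authors’ custom peak detector: it approximates the 50–150 bpm constraint via scipy’s find_peaks, the window lengths and feature names follow the paper, and everything else (function names, thresholds) is our assumption.

```python
import numpy as np
from scipy.signal import find_peaks

FS_BVP = 64  # E4 BVP sampling frequency in Hz

def windows(signal, fs, length_s=30, min_length_s=15):
    """Yield 30 s windows; a shorter trailing window is kept only if it
    is at least 15 s long, as required for reliable HR(V) extraction."""
    step = length_s * fs
    for start in range(0, len(signal), step):
        w = signal[start:start + step]
        if len(w) >= min_length_s * fs:
            yield w

def hrv_features(bvp_window, fs=FS_BVP):
    """Basic HR(V) features from one BVP window, constraining detected
    beats to a plausible 50-150 bpm range."""
    peaks, _ = find_peaks(bvp_window, distance=int(fs * 60 / 150))
    ibi = np.diff(peaks) / fs                # inter-beat intervals in s
    ibi = ibi[ibi <= 60 / 50]                # drop beats slower than 50 bpm
    if len(ibi) < 3:
        return None                          # too few beats in this window
    return {
        "HRmean": 60 / ibi.mean(),
        "IBImedian": float(np.median(ibi)),
        "SDNN": ibi.std(ddof=1),
        "RMSSD": float(np.sqrt(np.mean(np.diff(ibi) ** 2))),
    }

# usage (with bvp and FS_BVP from the wristband export):
# features = [hrv_features(w) for w in windows(bvp, FS_BVP)]
```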

Afterwards, the analyses of phases and emotional states were conducted, whereas the analysis of emotional states required a multi-step approach (see Figure 1):

  1. Analysis of the numerical emotional states (i.e. ‘2’ to ‘9’), whereas the annotation of ‘1’, i.e. a clear displeasure moment, did not appear in the final set of recordings

  2. Analysis of the descriptive emotional states (i.e. pleasure vs. neutral vs. displeasure)

  3. Analysis of the emotional arousal (i.e. pleasure and displeasure combined vs. neutral)

The first step investigates the potential relationship between feature changes and the numerical classes of emotional states (i.e. ‘2’ to ‘9’). For this purpose, each 30 s interval was assigned to the most dominant annotated emotional state. During the annotation process, it appeared that the more extreme Hanna’s and Adam’s (dis)pleasure moments were (i.e. ‘2’, ‘8’, and ‘9’), the shorter their duration was. Precisely these most extreme emotional states are important from a pedagogical perspective to gain insights into the emotional experience of the participants with PIMD and a related potential need for action (e.g. the possible desire to stop an activity). Thus, in order not to lose these very meaningful moments, additional weighting (see factors in Table 5) was necessary. In summary, the clearest pleasure, displeasure and neutral moments (i.e. ‘2’, ‘5’, ‘8’, and ‘9’) were weighted more heavily than the transition classes in between (i.e. ‘3’, ‘4’, ‘6’, and ‘7’). After the duration of each annotation had been multiplied by the weighting factor, the emotional state with the highest result was defined as most dominant and assigned to the specific 30 s interval.

Table 5.

Overview of the different types of emotional states and their weighting factors.

Weighting factor Numerical emotional states Descriptive emotional states Emotional arousal
3 ‘9’ Pleasure  
2 ‘8’ Aroused
1 ‘7’  
1 ‘6’ Neutral  
2 ‘5’ Not aroused
1 ‘4’  
1 ‘3’ Displeasure  
2 ‘2’ Aroused
3 ‘1’  

The second step shifts the focus to the three descriptive emotional states: pleasure, displeasure and neutral. After weighting, the eight individual classes were first mapped to the three main classes (see Table 5). Subsequently, the descriptive emotional states with the highest results were assigned to the respective 30 s intervals, as described in the previous paragraph.

In the third step, the two descriptive emotional states, pleasure and displeasure, were combined and contrasted with the neutral states, which should provide insight into the emotional arousal (see Table 5).
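Taken together, the three steps amount to a weighted-duration vote per 30 s interval. The sketch below is a minimal reading of this procedure using the weights and class mappings from Table 5; the function and dictionary names are ours.

```python
from collections import defaultdict

# Weighting factors and class mappings from Table 5
WEIGHT = {1: 3, 2: 2, 3: 1, 4: 1, 5: 2, 6: 1, 7: 1, 8: 2, 9: 3}
DESCRIPTIVE = {1: "displeasure", 2: "displeasure", 3: "displeasure",
               4: "neutral", 5: "neutral", 6: "neutral",
               7: "pleasure", 8: "pleasure", 9: "pleasure"}
AROUSAL = {s: "not aroused" if DESCRIPTIVE[s] == "neutral" else "aroused"
           for s in DESCRIPTIVE}

def dominant_label(annotations, mapping=None):
    """Weighted-duration vote over one 30 s interval.

    `annotations` is a list of (numerical_state, duration_s) pairs.
    With mapping=None the vote is over the numerical states (step 1);
    passing DESCRIPTIVE (step 2) or AROUSAL (step 3) first maps each
    state to its main class before the weighted durations accumulate.
    """
    score = defaultdict(float)
    for state, duration in annotations:
        key = state if mapping is None else mapping[state]
        score[key] += duration * WEIGHT[state]
    return max(score, key=score.get)

# 12 s of '8' (weight 2) outweighs 18 s of '7' (weight 1) in step 1
interval = [(8, 12.0), (7, 18.0)]
print(dominant_label(interval))               # -> 8
print(dominant_label(interval, DESCRIPTIVE))  # -> 'pleasure'
print(dominant_label(interval, AROUSAL))      # -> 'aroused'
```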

Results

As is common for SCD studies, the results are presented per person, but also for both participants together to obtain a larger database, which is essential for the ML approach. The results are divided into the presentation of the emotional profiles as well as the evaluation of the trained ML models and the statistical analysis regarding the different types of phases and emotional states.

Emotional profiles

The emotional profiles of Adam and Hanna (see Tables 6 and 7) were created based on the expert knowledge of close DSPs in order to implement the further research activities. These profiles provide a differentiation of the individual emotional expression of the two participants by assigning the observed behavioral signals to the numerical emotional states. These behavioral signals are divided into three categories: always, often and sometimes. For instance, this means that Adam always signals the greatest pleasure, i.e. a ‘9’ on the valence scale, by smiling widely and vocalizing or shouting for a long duration at a high volume. In addition, he often moves his body, arms and legs jerkily in a recurring manner or overstretches his head backwards. If the behavioral signals of a specific category are divided into multiple bullet points, the occurrence of one behavioral signal is sufficient for the assignment to an emotional state, which is the case for Adam regarding an ‘8’ on the valence scale.

Machine learning

ML was used in this research to test whether the different kinds of emotional states or phases can be predicted from physiological data. A positive result would enable better insight into the emotional expression of people with PIMD.

For predicting the phases and the three kinds of emotional states, several classification methods were used, including Decision Trees (DT), Random Forest (RF), Support Vector Classifier (SVC), AdaBoost (ADA) and Extreme Gradient Boosting (XGB). In addition to the models trained with these algorithms, a baseline classifier, which always predicts the majority class, was added to the comparison. Using hyper-parameter optimization and 5-fold cross-validation, the accuracy and the F1-score of each classification method were calculated.
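A training and evaluation loop of this kind could look roughly as follows in Python with scikit-learn and xgboost. The feature matrix, labels and hyper-parameter grids below are placeholders of our own; the paper names the algorithms but not the grids it searched, and the macro-averaged F1 used here is one common choice for imbalanced multi-class data.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, cross_val_predict
from sklearn.ensemble import RandomForestClassifier
from sklearn.dummy import DummyClassifier
from sklearn.metrics import accuracy_score, f1_score
from xgboost import XGBClassifier

# Placeholders: one row of the ~90 HR(V) + motion features per 30 s
# interval, labeled e.g. with the three descriptive emotional states
X = np.random.rand(400, 90)
y = np.random.randint(0, 3, 400)

models = {
    "Baseline": (DummyClassifier(strategy="most_frequent"), {}),
    "RF": (RandomForestClassifier(), {"n_estimators": [100, 300]}),
    "XGB": (XGBClassifier(), {"max_depth": [3, 6], "n_estimators": [100, 300]}),
}
for name, (clf, grid) in models.items():
    # hyper-parameter optimization nested inside 5-fold cross-validation
    search = GridSearchCV(clf, grid, cv=5, scoring="f1_macro")
    y_pred = cross_val_predict(search, X, y, cv=5)
    print(f"{name}: accuracy={accuracy_score(y, y_pred):.2f}, "
          f"F1={f1_score(y, y_pred, average='macro'):.2f}")
```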

While the accuracy represents the proportion of correct predictions made by a classification method, the F1-score ignores true negatives and therefore weights false negatives and false positives more heavily. The higher the accuracy and the F1-score (i.e. worst performance is 0.0, best performance is 1.0), the better a classifier can predict the respective phases or emotional states. Especially for datasets with imbalanced classes, the F1-score is most suitable for estimating the performance of the classifiers. Overall, the main objective is to achieve a higher score than the baseline classifier, since that classifier always selects the majority class.

Accuracy is defined as

\[ \text{accuracy} = \frac{\text{true positives} + \text{true negatives}}{\text{true positives} + \text{true negatives} + \text{false positives} + \text{false negatives}} \]

F1-score is defined as

\[ \text{F1-score} = \frac{\text{true positives}}{\text{true positives} + \tfrac{1}{2}\,(\text{false positives} + \text{false negatives})} \]

Table 8 lists the accuracy and the F1-score of each classification method. In almost all cases, XGB is the most accurate classifier for predicting the particular phase and the emotional arousal. Regarding the numerical and descriptive emotional states, RF usually achieves the best results for both participants. For Adam, four classifiers have the highest accuracy regarding descriptive emotional states, whereas the baseline classifier has the lowest F1-score of these four classifiers. Focusing exclusively on emotional states, it is noteworthy that the results improve successively from numerical to descriptive emotional states to emotional arousal.

Table 8.

Results of the ML classifiers regarding phases, numerical emotional states, descriptive emotional states and emotional arousal.

  Phases | Numerical emotional states | Descriptive emotional states | Emotional arousal (columns in each group: B. H. A.)
Baseline 0.87 (0.47) 0.85 (0.46) 0.89 (0.47) 0.25 (0.05) 0.31 (0.06) 0.4 (0.07) 0.65 (0.26) 0.52 (0.23) 0.85 (0.31) 0.65 (0.39) 0.56 (0.36) 0.85 (0.46)
XGB 0.9 (0.73) 0.87 (0.68) 0.98 (0.93) 0.4 (0.25) 0.32 (0.2) 0.57 (0.29) 0.74 (0.54) 0.61 (0.47) 0.85 (0.42) 0.77 (0.74) 0.7 (0.69) 0.86 (0.64)
DT 0.82 (0.64) 0.79 (0.63) 0.93 (0.81) 0.37 (0.26) 0.33 (0.21) 0.43 (0.24) 0.67 (0.51) 0.58 (0.44) 0.74 (0.43) 0.71 (0.69) 0.64 (0.63) 0.77 (0.57)
RF 0.9 (0.71) 0.85 (0.57) 0.97 (0.9) 0.43 (0.27) 0.37 (0.22) 0.59 (0.27) 0.74 (0.52) 0.66 (0.45) 0.85 (0.38) 0.76 (0.73) 0.67 (0.65) 0.87 (0.65)
SVC 0.87 (0.47) 0.85 (0.46) 0.89 (0.47) 0.31 (0.1) 0.31 (0.06) 0.48 (0.14) 0.65 (0.26) 0.52 (0.35) 0.85 (0.31) 0.65 (0.39) 0.56 (0.36) 0.85 (0.46)
ADA 0.89 (0.72) 0.85 (0.62) 0.97 (0.92) 0.32 (0.14) 0.32 (0.12) 0.4 (0.19) 0.65 (0.45) 0.54 (0.43) 0.72 (0.41) 0.74 (0.71) 0.68 (0.67) 0.83 (0.66)

The accuracy for each classification method is listed first, with the F1-score in parentheses. The classifiers with the best performance are printed in bold.

A.: Adam; B.: both; H.: Hanna.

Statistical analysis

The purpose of the statistical analysis was to investigate whether the different emotional states and phases differ significantly according to the physiological data. Such a finding would assist in discerning the emotional expression of people with PIMD. Based on the χ2 test, none of the features had a normal distribution, which is why the non-parametric Mann-Whitney U test was used, in combination with a step-down multitest method using Sidak adjustments to correct the p values and ensure that features were not found significant by chance. The exact results for the phases, the descriptive emotional states and the emotional arousal are shown in Table A1 in the appendix, and the distribution of significant features is summarized in Table 9.

Table 9.

Distribution of significant HR(V) and motion features regarding phases, descriptive emotional states and emotional arousal based on the Mann-Whitney U test results.

  Phases: A-phase vs. B-phase | Descriptive emotional states: Pleasure vs. neutral, Neutral vs. displeasure, Displeasure vs. pleasure | Emotional arousal: (Dis)pleasure vs. neutral (columns in each comparison: B. H. A.)
All features 32% 21% 51% 82% 46% 51% 25% 32% 21% 44% 9% 11% 78% 49% 42%
motion features 36% 23% 51% 83% 46% 49% 27% 37% 24% 50% 11% 12% 78% 49% 39%
HR(V) features 0% 0% 50% 79% 50% 71% 7% 0% 0% 7% 0% 7% 79% 50% 64%

A.: Adam; B.: both; H.: Hanna.
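The per-feature testing and correction procedure behind these tables can be sketched as follows; ‘holm-sidak’ is statsmodels’ name for a step-down Šidák adjustment, and the data layout (dicts of per-interval feature values) is an assumption of ours.

```python
from scipy.stats import mannwhitneyu
from statsmodels.stats.multitest import multipletests

def compare_conditions(features_a, features_b, alpha=0.05):
    """Mann-Whitney U test per feature across two conditions (e.g.
    A-phase vs. B-phase), with step-down Sidak correction over all
    features tested.

    features_a / features_b: dicts mapping feature name -> sequence of
    per-interval values observed under each condition.
    """
    names = sorted(features_a)
    pvals = [mannwhitneyu(features_a[n], features_b[n]).pvalue for n in names]
    reject, p_adj, _, _ = multipletests(pvals, alpha=alpha, method="holm-sidak")
    return {n: (p, bool(sig)) for n, p, sig in zip(names, p_adj, reject)}
```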

Regarding both participants, a significant difference between phase A and phase B can be seen for 40% of all features, but for none of the HR(V) features. Focusing merely on Hanna, one fifth of all features are significantly different and again none of the HR(V) features. For Adam, two thirds of the motion features as well as 70% of the HR(V) features show a significant difference between the two phases.

The descriptive emotional states were divided into three main class comparisons: pleasure vs. neutral, neutral vs. displeasure, and displeasure vs. pleasure. The results for both participants show more significant differences in the comparisons of pleasure vs. neutral and displeasure vs. pleasure. This is particularly evident in the comparison of pleasure vs. neutral, in which four-fifths of the motion features and all HR(V) features indicate significant differences. Additionally, this is the only comparison in which the HR(V) features are significant more often than the motion features. If displeasure is part of the comparison, the number of significant features decreases substantially. This is especially evident in the individual analysis of displeasure vs. pleasure.

Regarding emotional arousal, a significant difference between neutral states and aroused states can be seen for both participants for more than 70% of all features. Focusing only on Hanna, half of all features are significantly different. For Adam, 40% of all features show a significant difference, but all of the HR(V) features do.

Discussion

Assessing QoL in people with PIMD is a challenging task in which insight into their emotional expression might be beneficial. For this purpose, the first research question deals with the distinction of the personal emotional expression of the two participants. The individual, triangulated approach combining observations in everyday life, proxy reports, video analyses and the final validation process – again including the close DSPs – enabled a very detailed understanding of the emotional expression and built the foundation for the further research activities. In general, the procedure itself and the emotional profiles created can be used as a valuable orientation for research and for direct support of the target group as a guiding alternative to mostly missing self-reports. Moreover, the hypothesis of Lyons (2005) – people with PIMD show a consistent behavioral repertoire within their emotional expression – could be confirmed as well.

Following the second research question, the next step was the analysis of the different phases identified in the everyday life of the participants. Oriented to the SCD approach, the recordings were divided into A-phases, i.e. without any specific activity or interaction offer, and B-phases. The only specification for the B-phases was that they should consist of everyday activities in one-to-one situations. Based on the different activities, a large variance was observed even between the two participants (see Table 2).

It was assumed that more interactions and activities in B-phases would lead to more arousal and to more movement of the participants. The results of the ML classifiers confirm this assumption especially for Adam and indicate a good distinction between the more relaxed and the more activated phases in both the accuracy and the F1-score. This also verifies the research findings of Lima et al. (2013) and Krämer and Zentel (2020), as the same triggers – in the case of this study, this means the A-phases and B-phases – elicited comparable physiological responses.

For Hanna, the results of the ML classifiers in terms of the potential prediction of the phases are weaker, for several potential reasons. The imbalanced data situation between the phases (see distribution of phases in Table 1) certainly hinders all further analyses. The evaluation of the recordings was based solely on the objective phase criteria described above, without including further specifics (e.g. the participants’ level of activity). Compared to Adam, Hanna moved more in situations without a DSP or during longer phases without activities (i.e. during A-phases), which in turn weakened the contrast between phases. This also affected the HR(V) features and motion features, which is supported by the statistical analyses for Hanna, since fewer features provided significant results.

Regarding the third research question, a three-step approach was used in the analysis of the emotional states, investigating first the exact emotional valence with numerical values, then the descriptive emotional states, and finally the emotional arousal. Focusing on Hanna and on both participants together, the ML classifiers achieve good results for the descriptive emotional states and even better results for the emotional arousal in terms of accuracy and F1-score. Moreover, the statistical analysis of the emotional arousal, in contrast to the phases, shows consistently significant results for both motion features and HR(V) features, especially when focusing on both participants.

In the case of Adam, the ML classifiers perform only slightly better than the baseline classifier in terms of descriptive emotional states and emotional arousal (i.e. especially in the F1-score). This can be explained, inter alia, by the results of the statistical analysis regarding the descriptive emotional states. While pleasure and neutral are highly significantly different, this is not the case for the other two comparisons. The weak differentiation between pleasure and displeasure makes the analysis step toward emotional arousal reasonable, since pleasure and displeasure are combined in this step. For better results with the ML classifiers for both descriptive emotional states and emotional arousal, however, a better distinction between neutral and displeasure would have been necessary. This also explains why there is almost no improvement in the accuracy of the classifiers between these two analysis steps. In conclusion, it shows how important more annotations of displeasure moments would have been for both participants (see distribution of emotional states in Table 3).

With regard to the descriptive emotional states and emotional arousal, it can be summarized that motion features and HR(V) features provide acceptable differentiation and prediction, especially for Hanna alone and for both participants together, which confirms the research findings of Vos et al. (2010, 2012, 2013).

From a pedagogical perspective, it can be very beneficial for DSPs to obtain information about the current emotional arousal of the person with PIMD in everyday life (e.g. during a care situation) to support direct interaction, as also shown by Frederiks et al. (2019).

Insight into the exact valence (i.e. the numerical emotional states) would allow persons with PIMD to better communicate their emotional experience and in turn allow DSPs a more appropriate response. However, this challenge still remains: although the results of the ML classifiers in predicting the numerical emotional states are better than the baseline classifier, they are worse than those for predicting the descriptive emotional states and emotional arousal.

Regarding the limitations of this study, it was possible to collect and annotate quite a large data corpus, especially for research including this target group. Nevertheless, both in terms of collection frequency and overall study length, the ML approach in particular shows that more data – especially of displeasure moments – would probably yield better results.

Although DSPs were continuously involved and the methodological implementation regarding the creation of the emotional profiles, the annotation process and the further analyses could be realized without any major problems, some uncertainty remains as to whether the annotated behaviors actually match the emotional states, since no self-reports of the participants themselves were possible.

The realistic setting in everyday life was intentionally chosen in order to gain direct insight into the lives of people with PIMD. Nevertheless, it also has disadvantages compared to a laboratory setting. For example, it is more difficult to exclude confounding factors that may have an influence on the emotional state of the participants. Everyday life does not provide perfect research data. For instance, only a few clear moments of pleasure and, in particular, displeasure could be captured (see Table 3). Furthermore, this situation was negatively influenced by the fact that the more extreme the emotional state of the participants was assessed (i.e. annotations of ‘2’, ‘8’ or ‘9’), the more movement occurred, which in turn led to more motion artifacts (see Procedure) and finally to the exclusion of more of this important and rare data. In addition, it must be noted that even this study has covered only a small glimpse of the participants’ lives.

In conclusion, assessing the emotional expression of people with PIMD, especially in everyday life, is a challenging task. Nevertheless, the creation and use of the emotional profiles is a technology-free and comparatively simple solution for practitioners, which has the potential to improve mutual understanding in the direct support of people with PIMD. Moreover, the overall results show that it is possible to gain better insight into the emotional experience of people with PIMD if the expert knowledge of DSPs and observations from everyday life are combined with analyses of HR(V) and motion data using state-of-the-art ML algorithms. Although there is not yet an off-the-shelf solution for the experimental setting used in this study, this combination seems to be the missing piece to better recognize emotions of people with PIMD. Thereby, this missing piece shows the potential to increase their self-determination by enabling decision-making opportunities (e.g. a more precise understanding of personal preferences or aversions for certain stimuli) and better direct participation, and subsequently to provide a positive impact on their QoL. Furthermore, it can be assumed that direct insight into the emotional arousal and especially into the emotional valence of a person with PIMD, with simultaneous presentation of the analysis results to the counterpart – comparable to the procedure of Frederiks et al. (2019) – would improve the direct interaction, especially when the two parties are unfamiliar with each other.

Future research should follow this path and attempt to collect a larger corpus of data (e.g. capturing more moments of clear displeasure and clear pleasure and including other physiological parameters such as skin conductance or skin temperature) over an even longer period of time in different life contexts to give the ML approach a better foundation for deeper analysis. Reintegrating such findings into everyday life and investigating their feasibility in live interactions and activities should be an objective of future studies as well. Finally, non-contact sensors should be used in future research to monitor physiological data in order to be even more unobtrusive to participants with PIMD and thus more respectful of their privacy.

Appendix.

Table A1.

Mann-Whitney U test results regarding phases, descriptive emotional states and emotional arousal.

  Phases
Descriptive emotional states
Emotional arousal
  A-phase vs. B-phase
Pleasure vs. neutral
Neutral vs. displeasure
Displeasure vs. pleasure
(Dis)pleasure vs. neutral
Feature B. H. A. B. H. A. B. H. A. B. H. A. B. H. A.
absolute Area
AllBand
0.0634 0.1563 <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* 0.4354 0.3497 <0.0001* <0.0001* <0.0001*
absoluteArea
XBand
0.0005* 0.0861 0.0009* <0.0001* 0.0166 0.0278 0.0444 0.4014 0.0177 <0.0001* 0.214 0.0003* <0.0001* 0.0201 0.466
absoluteArea
YBand
<0.0001* <0.0001* 0.0598 0.216 0.3664 0.4022 0.3596 <0.0001* 0.0099 0.2079 <0.0001* 0.1032 0.2834 0.1111 0.0419
absoluteArea
ZBand
<0.0001* 0.0861 <0.0001* 0.0022* 0.0166 0.3704 0.0037 0.4014 0.0019 <0.0001* 0.214 0.0003* 0.0548 0.0201 0.044
absoluteMean
XBand
0.0004* 0.0904 0.0007* <0.0001* 0.0168 0.0268 0.0406 0.3778 0.0121 <0.0001* 0.2345 0.0021 <0.0001* 0.0197 0.4306
absoluteMean
YBand
<0.0001* <0.0001* 0.0453 0.2183 0.3825 0.4041 0.3497 <0.0001* 0.0084 0.1955 <0.0001* 0.0918 0.2888 0.1151 0.0385
absoluteMean
ZBand
<0.0001* 0.0904 <0.0001* 0.0023* 0.0168 0.3689 0.0035 0.3778 0.0012 <0.0001* 0.2345 0.0021 0.0559 0.0197 0.0354
amplitude
XBand
0.0001* 0.1625 <0.0001* <0.0001* <0.0001* 0.0178 0.0325 0.1552 0.0394 0.2163 0.0164 0.0019 <0.0001* <0.0001* 0.3918
amplitude
YBand
<0.0001* 0.4083 <0.0001* <0.0001* <0.0001* 0.0004* 0.009 0.0009* 0.4568 0.0564 0.147 0.0025 <0.0001* <0.0001* 0.0089
amplitude
ZBand
0.0049 0.1625 0.0023 <0.0001* <0.0001* 0.0086 0.0531 0.1552 0.0614 0.2163 0.0164 0.0019 <0.0001* <0.0001* 0.2705
areaUnder
Acceleration Magnitude
0.2216 0.2151 0.4505 <0.0001* <0.0001* 0.0005* 0.3738 0.054 0.4536 0.0015 0.3707 0.0074 <0.0001* <0.0001* 0.0096
area
XLow
0.0001* 0.015 0.0017* <0.0001* 0.013 0.0472 0.0313 0.4886 0.0263 <0.0001* 0.1874 0.0029 <0.0001* 0.0179 0.4435
area
YLow
<0.0001* <0.0001* 0.0619 0.0501 0.3044 0.1792 0.4718 <0.0001* 0.0104 0.1912 <0.0001* 0.1892 0.076 0.3637 0.0145
area
ZLow
<0.0001* 0.015 <0.0001* 0.0018* 0.013 0.4252 0.0022 0.4886 0.0027 <0.0001* 0.1874 0.0029 0.0544 0.0179 0.0417
average
VectorLength
0.191 0.0893 0.3702 <0.0001* <0.0001* 0.0001* 0.2746 0.0286 0.49 0.0005* 0.2255 0.0017 <0.0001* <0.0001* 0.0064
average
VectorLength Power
0.191 0.0893 0.3702 <0.0001* <0.0001* 0.0001* 0.2746 0.0286 0.49 0.0005* 0.2255 0.0017 <0.0001* <0.0001* 0.0064
interQuartile Range
Magnitudes Band
0.1581 0.0078 <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* 0.0001* 0.1575 0.2152 <0.0001* <0.0001* <0.0001*
interQuartile Range
XBand
0.0907 0.2365 <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* 0.0001* <0.0001* 0.1282 0.0492 <0.0001* <0.0001* <0.0001*
interQuartile Range
YBand
0.0004* 0.1429 <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* 0.0005* <0.0001* 0.1676 0.002 <0.0001* <0.0001* <0.0001*
interQuartile Range
ZBand
0.0425 0.2365 <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* 0.0001* <0.0001* 0.1282 0.0492 <0.0001* <0.0001* <0.0001*
manipulation
Low
<0.0001* <0.0001* 0.2841 0.0005* 0.0001* 0.3444 0.0067 0.448 0.0001* <0.0001* 0.0399 0.0021 0.0189 0.0002* 0.0021
meanCrossing Rate
XBand
<0.0001* <0.0001* 0.0011* 0.0001* 0.0066 <0.0001* 0.0752 0.2215 0.0098 0.2537 0.0145 0.1455 0.0001* 0.0145 <0.0001*
meanCrossing Rate
YBand
<0.0001* <0.0001* <0.0001* 0.0205 0.1444 0.061 0.0184 0.0541 0.0001* 0.0057 0.0047 0.0172 0.137 0.2461 0.0002*
meanCrossing Rate
ZBand
<0.0001* <0.0001* 0.0107 0.0002* 0.0066 <0.0001* 0.0883 0.2215 0.0119 0.2537 0.0145 0.1455 0.0002* 0.0145 <0.0001*
meanKinetic Energy
XBand
0.0094 0.4783 0.0024 <0.0001* 0.1625 0.0004* 0.4979 0.39 0.304 0.0052 0.2544 0.0023 <0.0001* 0.1918 0.0256
meanKinetic Energy
YBand
0.0001* 0.2685 <0.0001* 0.2253 <0.0001* 0.0105 0.0535 0.0146 0.2886 0.0891 0.1625 0.01 0.1108 <0.0001* 0.1113
meanKinetic Energy
ZBand
0.1055 0.4783 0.1084 <0.0001* 0.1625 0.0001* 0.4468 0.39 0.3741 0.0052 0.2544 0.0023 <0.0001* 0.1918 0.011
mean
XLow
0.0001* 0.0146 0.0015* <0.0001* 0.0145 0.0456 0.0288 0.4683 0.0213 <0.0001* 0.2033 0.0022 <0.0001* 0.0192 0.4245
mean
YLow
<0.0001* <0.0001* 0.0589 0.0491 0.3112 0.1747 0.4584 <0.0001* 0.0099 0.1745 <0.0001* 0.1804 0.0764 0.3458 0.0137
Mean
ZLow
<0.0001* 0.0146 <0.0001* 0.0018* 0.0145 0.4233 0.002 0.4683 0.0022 <0.0001* 0.2033 0.0022 0.0566 0.0192 0.0378
peaks
CountLow
<0.0001* 0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* 0.0112 <0.0001* 0.0375 0.0106 0.4566 <0.0001* <0.0001* <0.0001*
peaks
PeakAvgLow
<0.0001* 0.1449 <0.0001* 0.0283 <0.0001* 0.2699 0.0023 0.0256 0.0155 0.0093 0.2323 0.0053 0.0039 <0.0001* 0.154
peaks
SumLow
<0.0001* 0.0004* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* 0.0079 <0.0001* 0.3106 0.0667 0.1955 <0.0001* <0.0001* <0.0001*
pitch
StdDevLow
0.348 0.0036 <0.0001* <0.0001* 0.1001 <0.0001* 0.2306 0.2453 0.0818 <0.0001* 0.4369 0.0001* <0.0001* 0.0904 <0.0001*
postureDistance
XLow
0.0137 0.0002* 0.1035 0.0077 0.0717 0.4549 0.0355 0.0012 0.0003* 0.0013 0.02 0.0002* 0.0635 0.0247 0.0141
postureDistance
ZLow
<0.0001* 0.0002* <0.0001* 0.1074 0.0717 0.2281 0.0052 0.0012 <0.0001* 0.0013 0.02 0.0002* 0.4134 0.0247 0.0003*
quartiles Magnitudes
XBand
0.0052 0.0639 <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* 0.0005* <0.0001* 0.1436 0.0128 <0.0001* <0.0001* <0.0001*
quartiles Magnitudes
YBand
0.0389 0.2856 <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* 0.0001* <0.0001* 0.1298 0.0272 <0.0001* <0.0001* <0.0001*
quartilesQ1
XBand
0.0837 0.3524 <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* 0.0001* <0.0001* 0.1819 0.0609 <0.0001* <0.0001* <0.0001*
quartilesQ1
YBand
0.0004* 0.0995 <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* 0.0015 <0.0001* 0.1298 0.0019 <0.0001* <0.0001* <0.0001*
quartilesQ1
ZBand
0.0252 0.3524 <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* 0.0001* <0.0001* 0.1819 0.0609 <0.0001* <0.0001* <0.0001*
quartilesQ2
XBand
0.0219 0.0221 0.4108 0.1554 0.012 0.3868 0.1826 0.2878 0.251 0.4158 0.3623 0.475 0.1133 0.0128 0.2584
quartilesQ2
YBand
0.0556 0.3286 0.0071 0.0531 0.1159 0.4477 0.2732 0.2466 0.092 0.2771 0.142 0.2665 0.0506 0.1606 0.1635
quartilesQ2
ZBand
0.0093 0.0221 0.2303 0.1356 0.012 0.3542 0.1712 0.2878 0.2323 0.4158 0.3623 0.475 0.0966 0.0128 0.2273
quartilesQ3
YBand
0.0005* 0.1675 <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* 0.0005* <0.0001* 0.2658 0.0145 <0.0001* <0.0001* <0.0001*
quartilesQ3
ZBand
0.0689 0.1683 <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* 0.0001* 0.0001* 0.093 0.0593 <0.0001* <0.0001* <0.0001*
roll
AvgLow
0.305 <0.0001* <0.0001* <0.0001* 0.0198 <0.0001* 0.0024 0.0001* 0.0048 0.0001* 0.001 0.0095 <0.0001* 0.004 <0.0001*
roll
MotionAmount Low
0.427 <0.0001* <0.0001* <0.0001* 0.0014 <0.0001* 0.0004* <0.0001* 0.0033 0.0007* 0.0001* 0.0361 <0.0001* 0.0001* <0.0001*
roll
StdDevLow
0.3018 <0.0001* <0.0001* <0.0001* 0.0111 <0.0001* 0.0007* <0.0001* 0.0054 0.0007* 0.0001* 0.0153 <0.0001* 0.0012* <0.0001*
squareSumOf
Component
0.2261 0.3163 0.4793 <0.0001* 0.0008* 0.0007* 0.4531 0.0885 0.3931 0.0012 0.4892 0.0159 <0.0001* 0.0006* 0.0087
squareSumOf
Component_X
0.0097 0.496 0.0022 <0.0001* 0.1313 0.0004* 0.4937 0.4006 0.3074 0.0054 0.2443 0.0025 <0.0001* 0.1576 0.0248
squareSumOf
Component_Y
0.0001* 0.2702 <0.0001* 0.2072 <0.0001* 0.0096 0.0551 0.0141 0.282 0.0958 0.1634 0.0097 0.1017 <0.0001* 0.1092
squareSumOf
Component_Z
0.1095 0.496 0.107 <0.0001* 0.1313 0.0001* 0.4515 0.4006 0.3768 0.0054 0.2443 0.0025 <0.0001* 0.1576 0.0106
sumOf
Square Components
0.1843 0.3718 0.3565 <0.0001* 0.0004* 0.0007* 0.3359 0.0431 0.3873 0.0011* 0.4078 0.0276 <0.0001* 0.0002* 0.0084
sumPer Component
XBand
0.0098 0.4717 0.0017* <0.0001* 0.1183 0.0006* 0.4535 0.4229 0.2638 0.0065 0.2598 0.0028 <0.0001* 0.1412 0.0352
sumPer Component
YBand
0.0001* 0.2686 <0.0001* 0.1433 <0.0001* 0.0076 0.0511 0.0134 0.3035 0.1147 0.1602 0.0082 0.0661 <0.0001* 0.0912
sumPer Component
ZBand
0.1112 0.4717 0.0902 <0.0001* 0.1183 0.0002* 0.4859 0.4229 0.3371 0.0065 0.2598 0.0028 <0.0001* 0.1412 0.0154
totalAbsolute
AreaBand
0.0634 0.1563 <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* <0.0001* 0.4354 0.3497 <0.0001* <0.0001* <0.0001*
totalEnergy
XBand
0.0486 0.1512 <0.0001* <0.0001* <0.0001* 0.0643 <0.0001* 0.0019 0.0002* 0.0192 0.1177 0.0001* <0.0001* <0.0001* 0.0969
totalEnergy
YBand
<0.0001* 0.2793 <0.0001* <0.0001* <0.0001* 0.0002* 0.0018 <0.0001* 0.3453 0.0001* 0.3328 0.0025 <0.0001* <0.0001* 0.0035
totalEnergy
ZBand
0.2537 0.1512 <0.0001* <0.0001* <0.0001* 0.0337 <0.0001* 0.0019 0.0004* 0.0192 0.1177 0.0001* <0.0001* <0.0001* 0.1653
totalKinetic
EnergyBand
0.2226 0.3249 0.4524 <0.0001* 0.001* 0.0008* 0.4496 0.0902 0.3923 0.0013 0.4885 0.0156 <0.0001* 0.0008* 0.0089
Total
Magnitude Band
0.4261 0.2033 0.0001* <0.0001* <0.0001* 0.237 <0.0001* 0.0004* 0.0014 0.0033 0.2043 0.0032 <0.0001* <0.0001* 0.0558
Total
MeanLow
0.0594 <0.0001* 0.0639 <0.0001* 0.0015 0.0767 0.0226 0.0094 0.1046 <0.0001* 0.0001* 0.0555 <0.0001* 0.0097 0.4422
variance
XBand
0.1072 0.0954 <0.0001* <0.0001* <0.0001* 0.1499 <0.0001* 0.0003* <0.0001* 0.0178 0.2317 <0.0001* <0.0001* <0.0001* 0.0181
variance
YBand
<0.0001* 0.1998 <0.0001* <0.0001* <0.0001* 0.0001* 0.0007* <0.0001* 0.2916 0.0001* 0.2443 0.0023 <0.0001* <0.0001* 0.002
variance
ZBand
0.346 0.0954 <0.0001* <0.0001* <0.0001* 0.0927 <0.0001* 0.0003* <0.0001* 0.0178 0.2317 <0.0001* <0.0001* <0.0001* 0.0338
velocity
XBand
0.0054 0.2428 0.0026 <0.0001* 0.011 0.0439 0.0622 0.649 0.0371 <0.0001* 0.1513 0.0086 <0.0001* 0.016 0.4937
velocity
YBand
<0.0001* <0.0001* 0.0761 0.1617 0.1862 0.1008 0.3668 0.0001* 0.0032 0.4086 0.0007* 0.2536 0.1606 0.0645 0.0034
velocity
ZBand
<0.0001* 0.2428 <0.0001* 0.0008* 0.011 0.3648 0.0061 0.649 0.0042 <0.0001* 0.1513 0.0086 0.0261 0.016 0.0634
HRmean 0.0526 0.1939 0.1329 <0.0001* <0.0001* <0.0001* 0.0002* 0.0106 0.0039 0.1719 0.3934 0.1168 <0.0001* <0.0001* <0.0001*
HRmedian 0.0062 0.1337 0.0083 <0.0001* <0.0001* <0.0001* 0.0013 0.013 0.0293 0.3666 0.1717 0.1199 <0.0001* <0.0001* <0.0001*
IBImedian 0.0062 0.1337 0.0083 <0.0001* <0.0001* <0.0001* 0.0013 0.013 0.0293 0.3666 0.1717 0.1199 <0.0001* <0.0001* <0.0001*
SDNN 0.1775 0.0847 0.0001* <0.0001* <0.0001* <0.0001* 0.0062 0.0174 0.0459 0.0028 0.3315 0.001 <0.0001* <0.0001* <0.0001*
SDSD 0.0611 0.0422 <0.0001* <0.0001* 0.0041 <0.0001* 0.0352 0.0447 0.1332 0.0056 0.4371 0.0023 <0.0001* 0.0024 0.0001*
RMSSD 0.2865 0.152 0.0003* <0.0001* <0.0001* <0.0001* 0.019 0.0205 0.1443 0.0003* 0.9407 0.0005* <0.0001* <0.0001* <0.0001*
SDbonus1 0.1775 0.0847 0.0001* <0.0001* <0.0001* <0.0001* 0.0062 0.0174 0.0459 0.0028 0.3315 0.001 <0.0001* <0.0001* <0.0001*
SDbonus2 0.0267 0.0543 <0.0001* <0.0001* 0.106 0.0001* 0.0889 0.0995 0.1928 0.0418 0.1935 0.0053 <0.0001* 0.0775 0.0007*
LF 0.0571 0.4742 0.0003* <0.0001* 0.0028 <0.0001* 0.0027 0.134 0.0047 0.1235 0.4435 0.0472 <0.0001* 0.0024 <0.0001*
HF 0.0527 0.489 0.0006* <0.0001* 0.0001* <0.0001* 0.05 0.3405 0.0404 0.0029 0.1043 0.0097 <0.0001* 0.0001* <0.0001*
* p < .05, using the step-down multiple-testing method with Sidak adjustments.

A.: Adam; B.: both; H.: Hanna.

Funding Statement

The research presented here was conducted within the INSENSION project, which has received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement no. 780819.

Disclosure statement

No potential conflict of interest was reported by the authors.

Ethics approval

The study was approved by the Ethics Committee at the Heidelberg University of Education (approval number EV2019/04) and the Bioethical Committee at the Poznan University of Medical Sciences (approval number 10/21). In line with international standards and research recommendations (Hammann 2022, Mietola et al. 2017, Williams 2008), the legal representatives of the participants were informed about the research approach and its benefits and risks before giving consent, and were additionally asked to take the perspective of the participants as far as possible. Throughout the study, all persons involved engaged in continuous and transparent reflection, with the option to readjust the procedure. If signs of rejection or discomfort had been detected in the participants or reported by a DSP, the study would have been discontinued.

References

  1. Adams, D. and Oliver, C.. 2011. The expression and assessment of emotions and internal states in individuals with severe or profound intellectual disabilities. Clinical Psychology Review, 31, 293–306. [DOI] [PubMed] [Google Scholar]
  2. Axelsson, A. K., Imms, C. and Wilder, J.. 2014. Strategies that facilitate participation in family activities of children and adolescents with profound intellectual and multiple disabilities: Parents’ and personal assistants’ experiences. Disability and Rehabilitation, 36, 2169–2177. [DOI] [PubMed] [Google Scholar]
  3. Bellamy, G., Croot, L., Bush, A., Berry, H. and Smith, A.. 2010. A study to define: Profound and multiple learning disabilities (PMLD). Journal of Intellectual Disabilities: JOID, 14, 221–235. [DOI] [PubMed] [Google Scholar]
  4. Bermejo, B. G., Mateos, P. M. and Sánchez-Mateos, J. D.. 2014. The emotional experience of people with intellectual disability: An analysis using the international affective pictures system. American Journal on Intellectual and Developmental Disabilities, 119, 371–384. [DOI] [PubMed] [Google Scholar]
  5. Bradley, M. M., Codispoti, M., Cuthbert, B. and Lang, P. J.. 2001. Emotion and motivation I: Defensive and appetitive reactions in picture processing. Emotion (Washington, D.C.), 1, 276–298. [PubMed] [Google Scholar]
  6. Bradley, M. M. and Lang, P. J.. 2000. Measuring emotion: Behavior, feeling, and physiology. In: Lane R. and Nadel L., eds., Cognitive neuroscience of emotion. New York: Oxford University Press, pp.242–276. [Google Scholar]
  7. Brady, N. C., Fleming, K., Thiemann-Bourque, K., Olswang, L., Dowden, P., Saunders, M. D. and Marquis, J.. 2012. Development of the communication complexity scale. American Journal of Speech-Language Pathology, 21, 16–28. [DOI] [PMC free article] [PubMed] [Google Scholar]
  8. Domínguez-Jiménez, J. A., Campo-Landines, K. C., Martínez-Santos, J. C., Delahoz, E. J. and Contreras-Ortiz, S. H.. 2020. A machine learning model for emotion recognition from physiological signals. Biomedical Signal Processing and Control, 55, 101646. [Google Scholar]
  9. ELAN (Version 6.0), 2020. Nijmegen: Max Planck Institute for Psycholinguistics, The Language Archive. Available at: <https://archive.mpi.nl/tla/elan> [accessed 31 May 2022].
  10. Empatica S.R.L., 2018. E4 wristband from Empatica: User’s manual. Available at: <https://empatica.app.box.com/v/E4-User-Manual> [accessed 31 May 2022].
  11. Engelhardt, M., Krämer, T., Marzini, M., Sansour, T. and Zentel, P.. 2020. Communication assessment in people with PIMD. Evaluating the use of the INSENSION Questionnaire – Longform (InQL). Psychoeducational Assessment, Intervention and Rehabilitation, 2, 1–14. [Google Scholar]
  12. Felce, D. and Perry, J.. 1995. Quality of life: Its definition and measurement. Research in Developmental Disabilities, 16, 51–74. [DOI] [PubMed] [Google Scholar]
  13. Frederiks, K., Sterkenburg, P., Barakova, E. and Feijs, L.. 2019. The effects of a bioresponse system on the joint attention behaviour of adults with visual and severe or profound intellectual disabilities and their affective mutuality with their caregivers. Journal of Applied Research in Intellectual Disabilities: JARID, 32, 890–900. [DOI] [PubMed] [Google Scholar]
  14. Grimmer, J., Roberts, M. E. and Stewart, B. M.. 2021. Machine learning for social science: An agnostic approach. Annual Review of Political Science, 24, 395–419. [Google Scholar]
  15. Hammann, T. 2022. Forschungsethische Diskussion im Kontext komplexer Behinderung. Zeitschrift Für Heilpädagogik, 73, 382–393. [Google Scholar]
  16. Hammann, T. and Engelhardt, M.. 2022. Lebensqualität und schwere und mehrfache Behinderung. In: Zentel P., ed. Lebensqualität und geistige Behinderung: Theorien, Diagnostik, Konzepte. Stuttgart: Kohlhammer Verlag, pp.223–241. [Google Scholar]
  17. Holle, H. and Rein, R.. 2015. Easydiag: A tool for easy determination of interrater agreement. Behavior Research Methods, 47, 837–847. [DOI] [PubMed] [Google Scholar]
  18. Horner, R. H., Carr, E. G., Halle, J., McGee, G., Odom, S. and Wolery, M.. 2005. The use of single-subject research to identify evidence-based practice in special education. Exceptional Children, 71, 165–179. [Google Scholar]
  19. Hostyn, I. and Maes, B.. 2009. Interaction between persons with profound intellectual and multiple disabilities and their partners: A literature review. Journal of Intellectual & Developmental Disability, 34, 296–312. [DOI] [PubMed] [Google Scholar]
  20. Janko, V., Reščič, N., Mlakar, M., Drobnič, V., Gams, M., Slapničar, G., Gjoreski, M., Bizjak, J., Marinko, M. and Luštrek, M.. 2018. A new frontier for activity recognition: The Sussex-Huawei locomotion challenge, Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers, October 8 - 12, 2018. Singapore: Association for Computing Machinery, pp.1511–1520. [Google Scholar]
  21. Krämer, T. and Zentel, P.. 2020. Expression of emotions of people with profound intellectual and multiple disabilities. A single-case design including physiological data. Psychoeducational Assessment, Intervention and Rehabilitation, 2, 15–29. [Google Scholar]
  22. Landis, J. R. and Koch, G. G.. 1977. The measurement of observer agreement for categorical data. Biometrics, 33, 159–174. [PubMed] [Google Scholar]
  23. Lazarus, R. S. 1991. Progress on a cognitive-motivational-relational theory of emotion. The American Psychologist, 46, 819–834. [DOI] [PubMed] [Google Scholar]
  24. Lima, M., Silva, K., Amaral, I., Magalhães, A. and Sousa, L. d.. 2013. Beyond behavioural observations: A deeper view through the sensory reactions of children with profound intellectual and multiple disabilities. Child: Care, Health and Development, 39, 422–431. [DOI] [PubMed] [Google Scholar]
  25. Lyons, G. S. 2005. The Life Satisfaction Matrix: An instrument and procedure for assessing the subjective quality of life of individuals with profound multiple disabilities. Journal of Intellectual Disability Research: JIDR, 49, 766–769. [DOI] [PubMed] [Google Scholar]
  26. Lyons, G. S., De Bortoli, T. and Arthur-Kelly, M.. 2017. Triangulated proxy reporting: A technique for improving how communication partners come to know people with severe cognitive impairment. Disability and Rehabilitation, 39, 1814–1820. [DOI] [PubMed] [Google Scholar]
  27. Lyons, G. S., Walla, P. and Arthur-Kelly, M.. 2013. Towards improved ways of knowing children with profound multiple disabilities: Introducing startle reflex modulation. Developmental Neurorehabilitation, 16, 340–344. [DOI] [PubMed] [Google Scholar]
  28. Maes, B., Lambrechts, G., Hostyn, I. and Petry, K.. 2007. Quality-enhancing interventions for people with profound intellectual and multiple disabilities. Journal of Intellectual & Developmental Disability, 32, 163–178. [DOI] [PubMed] [Google Scholar]
  29. Marteau, F., Dalmat-Kasten, M., Kaye, K., Castillo, M.C. and Montreuil, M.. 2016. Exploratory study of a scale of emotional state for children with intellectual and multiple disabilities (EREEP). Child & Family Behavior Therapy, 38, 1–14. [Google Scholar]
  30. McCarthy, C., Pradhan, N., Redpath, C. and Adler, A.. 2016. Validation of the Empatica E4 wristband, 2016 IEEE EMBS International Student Conference (ISC), 29–31 May 2016, Ottawa, Canada. [Google Scholar]
  31. Mietola, R., Miettinen, S. and Vehmas, S.. 2017. Voiceless subjects? Research ethics and persons with profound intellectual disabilities. International Journal of Social Research Methodology, 20, 263–274. [Google Scholar]
  32. Mohr, L. 2011. Schwerste Behinderung und theologische Anthropologie. Oberhausen: ATHENA-Verlag. [Google Scholar]
  33. Moors, A., Ellsworth, P. C., Scherer, K. R. and Frijda, N. H.. 2013. Appraisal theories of emotion: State of the art and future development. Emotion Review, 5, 119–124. [Google Scholar]
  34. Munde, V., Vlaskamp, C., Vos, P., Maes, B. and Ruijssenaars, W.. 2012. Physiological measurements as validation of alertness observations: An exploratory case study of three individuals with profound intellectual and multiple disabilities. Intellectual and Developmental Disabilities, 50, 300–310. [DOI] [PubMed] [Google Scholar]
  35. Nakken, H. and Vlaskamp, C.. 2007. A need for a taxonomy for profound intellectual and multiple disabilities. Journal of Policy and Practice in Intellectual Disabilities, 4, 83–87. [Google Scholar]
  36. Nieuwenhuijse, A. M., Willems, D. L., van Goudoever, J. B., Echteld, M. A. and Olsman, E.. 2017. Quality of life of persons with profound intellectual and multiple disabilities: A narrative literature review of concepts, assessment methods and assessors. Journal of Intellectual & Developmental Disability, 44, 261–271. [Google Scholar]
  37. Nieuwenhuijse, A. M., Willems, D. L., van Goudoever, J. B. and Olsman, E.. 2022. The perspectives of professional caregivers on quality of life of persons with profound intellectual and multiple disabilities: A qualitative study. International Journal of Developmental Disabilities, 68, 190–197. [DOI] [PMC free article] [PubMed] [Google Scholar]
  38. Petry, K. and Maes, B.. 2006. Identifying expressions of pleasure and displeasure by persons with profound multiple disabilities. Journal of Intellectual & Developmental Disability, 31, 28–38. [DOI] [PubMed] [Google Scholar]
  39. Petry, K. and Maes, B.. 2009. Quality of life: People with profound intellectual and multiple disabilities. In: Pawlyn J. and Carnaby S., eds. Profound intellectual and multiple disabilities. Oxford, UK: Wiley-Blackwell. pp.15–36. [Google Scholar]
  40. Russell, J. A. 1980. A circumplex model of affect. Journal of Personality and Social Psychology, 39, 1161–1178. [DOI] [PubMed] [Google Scholar]
  41. Schalock, R. L., Brown, I., Brown, R., Cummins, R. A., Felce, D., Matikka, L., Keith, K. D. and Parmenter, T.. 2002. Conceptualization, measurement, and application of quality of life for persons with intellectual disabilities: Report of an international panel of experts. Mental Retardation, 40, 457–470. [DOI] [PubMed] [Google Scholar]
  42. U.S. Department of Education, 2017. What Works Clearinghouse™. Standards handbook. Version 4.0. Available at: <https://ies.ed.gov/ncee/wwc/Docs/referenceresources/wwc_standards_handbook_v4.pdf> [accessed 31 May 2022].
  43. van der Putten, A. A. J., ter Haar, A., Maes, B. and Vlaskamp, C.. 2018. Tausendfüßler: Erkenntnisse zur Unterstützung von Menschen mit (sehr) schweren geistigen und mehrfachen Behinderungen - ein Literatur-Review. Teilhabe, 57, 55–62. [Google Scholar]
  44. van der Putten, A. A. J. and Vlaskamp, C.. 2011. Day services for people with profound intellectual and multiple disabilities: An analysis of thematically organized activities. Journal of Policy and Practice in Intellectual Disabilities, 8, 10–17. [Google Scholar]
  45. Vos, P., De Cock, P., Munde, V., Petry, K., Van Den Noortgate, W. and Maes, B.. 2012. The tell-tale: What do heart rate, skin temperature and skin conductance reveal about emotions of people with severe and profound intellectual disabilities? Research in Developmental Disabilities, 33, 1117–1127. [DOI] [PubMed] [Google Scholar]
  46. Vos, P., De Cock, P., Petry, K., van den Noortgate, W. and Maes, B.. 2010. Do you know what I feel? A first step towards a physiological measure of the subjective well-being of persons with profound intellectual and multiple disabilities. Journal of Applied Research in Intellectual Disabilities, 23, 366–378. [Google Scholar]
  47. Vos, P., De Cock, P., Petry, K., van den Noortgate, W. and Maes, B.. 2013. See me, feel me. Using physiology to validate behavioural observations of emotions of people with severe or profound intellectual disability. Journal of Intellectual Disability Research: JIDR, 57, 452–461. [DOI] [PubMed] [Google Scholar]
  48. Williams, J. R. 2008. The declaration of Helsinki and public health. Bulletin of the World Health Organization, 86, 650–652. [DOI] [PMC free article] [PubMed] [Google Scholar]
  49. World Health Organization, 2001. International classification of functioning, disability and health (ICF). World Health Organization.
  50. World Health Organization, 2019. ICD-11. International classification of diseases 11th revision: 6A00.3 Disorder of intellectual development, profound. Available at: <https://icd.who.int/browse11/l-m/en#/http%3a%2f%2fid.who.int%2ficd%2fentity%2f1017992057> [accessed 31 May 2022].
