2025 Oct 30;15(1):20–54. doi: 10.1177/18796397251390252

Emotion recognition in people with Huntington's disease: A comprehensive systematic review

Nicolò Zarotti 1,2,3, Alice Storey 2, Sarah Lloyd 2, Laura Mesia Guevara 2, Helen Caswell 2, Cliff Chen 4, Jane Simpson 3
PMCID: PMC12847465  PMID: 41166343

Abstract

Background

Deficits of emotion recognition have received increasing attention in people with Huntington's disease (HD) in the three decades since the discovery of the HD gene. However, the characterisation of such deficits across different disease stages, types of stimuli, and sensory modalities is currently unclear.

Objective

This study aimed to provide a comprehensive review of the evidence on emotion recognition deficits in HD gene carriers (both manifest and premanifest) over the three decades since definitive gene testing.

Method

A systematic review was carried out from January 1993 to January 2025 across MEDLINE, PsycINFO, Academic Search Complete, and CINAHL (PROSPERO registration: CRD42023398649).

Results

From 9735 initial citations, 59 studies were eventually included. In manifest HD, facial recognition of negative emotions such as anger, fear, disgust, and sadness was consistently impaired, whereas happiness and neutral expressions were generally spared. A few auditory studies showed consistent deficits for disgust, fear, and anger, while happiness and sadness appeared less affected. Only preliminary evidence is currently available for deficits involving body language, visual and written vignettes, videos, and olfactory and gustatory tasks. Although sparser, the evidence for premanifest HD suggests that some individuals may develop significant recognition difficulties prior to motor onset, particularly due to early frontostriatal deterioration and white matter disruption.

Conclusions

Impairments of facial recognition of negative emotions are reported consistently in manifest HD, while only preliminary results are available for other modalities. The evidence involving premanifest HD is much sparser. Key implications for clinical practice and future research are outlined and discussed.

Keywords: neuropsychology, psychology, psychiatry, neuropsychiatry, cognition, social aspects of HD, emotion recognition, body language, social cognition

Plain language summary

People with Huntington's disease can face psychological problems that affect their daily lives. One common issue is difficulty understanding what emotions other people feel. Scientists have studied this topic since they found the Huntington's gene in 1993, yet the full picture remains unclear. This review looked at 59 studies on emotion recognition in Huntington's disease published between January 1993 and January 2025 across four major databases. The results show that, in people with manifest HD, recognising angry, fearful, disgusted, and sad faces is often difficult. On the other hand, spotting happy or neutral faces is usually fine. A few experiments that used voices instead of faces showed similar weaknesses for disgust, fear, and anger in manifest HD, although happiness and sadness were less impaired. Evidence for understanding emotions from body postures, short stories, films, smells, or tastes is still limited. Some but not all people with premanifest HD also struggle with emotion recognition, probably because of early damage in frontal areas of the brain. These findings are important for researchers and people with HD alike, as they can help families, friends, caregivers, and patients respond better to daily challenges.

Introduction

Huntington's disease (HD) is a progressive neurodegenerative disorder linked to CAG repeat expansion in a mutated gene (Huntingtin) on the short arm of chromosome 4. 1 It causes progressive basal ganglia damage, particularly involving the corpus striatum (caudate nucleus and putamen). This in turn leads to the development of motor impairments such as chorea (involuntary movements), dystonia, bradykinesia, dysarthria, dysphagia, and rigidity. 2 The transmission mechanism is autosomal dominant, meaning that affected individuals' children have a 50% probability of inheriting the gene. 1 Since 1993, genetic testing has been available to ascertain positive gene status. 3

The prevalence of HD in the UK is around 12.3 people per 100,000, 4 while the global pooled prevalence is estimated to be 4.88 per 100,000. 5 The development of motor difficulties, which tends to occur around the age of 40, 1 is normally considered the onset of the condition, after which people with HD (pwHD) are considered ‘manifest’ or ‘symptomatic’. People who have tested positive but do not yet experience movement issues are considered ‘premanifest’ or ‘presymptomatic’, while symptom-free individuals with a family history of the condition but no genetic testing are often described as ‘at-risk’.

HD progression is also associated with a wide range of cognitive impairments. 6 These represent one of the earliest detectable clinical signs, with reduced speed of information processing being one of the strongest predictors of disease onset at the premanifest stage and disease progression at the manifest stage, particularly due to early involvement of the striatum. 7 Other common cognitive impairments involve reduced executive functions (e.g., planning, organisation, attention-shifting, self-monitoring, mental flexibility, and goal-directed behaviour; linked to disruptions to frontostriatal circuits), attention and automation (particularly evident on dual tasks, such as walking and talking at the same time), memory (mostly due to executive deficits rather than hippocampal disruption), and social cognition.6,8–13

Among the problems involving social cognition, deficits of emotion recognition – defined as the ability to perceive and interpret affective information correctly in and from others14,15 – have been consistently described in pwHD, especially regarding facial recognition of negative emotions such as anger, fear, and disgust.16–18 While a number of recent reviews have addressed broader social cognition issues in this population,19–21 only one systematic review has so far focused on emotion recognition deficits in pwHD specifically. 22 This included studies between 1993 and 2010 and concluded that the vast majority of studies showing emotion recognition impairments in HD used visual tasks involving the identification of emotions from facial stimuli, with the evidence based on different types of stimuli (e.g., emotional body language)23,24 or other sensory modalities (e.g., voices, tastes, and odours)25,26 being much scarcer. Similarly, the evidence around impairments in people with premanifest HD appeared to be sparser, with fewer studies involving this population and more inconclusive findings. 22

Consequently, as 15 years have passed since the previous major review, and the literature and assessment of emotion recognition have progressed over this time period, an update is now warranted. In addition, the review by Henley and colleagues 22 only included comparisons between manifest or premanifest individuals and a control group of healthy volunteers, excluding any studies comparing single pwHD groups with published normative data or between-group comparisons across different HD stages. Thus, the present study aimed to fill these gaps by providing a more comprehensive systematic review of emotion recognition in Huntington's disease since the discovery of the Huntingtin gene.

Methods

For the purpose of this study, a systematic review approach was adopted, following the latest Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. 27 More specifically, the present review was guided by the following research questions:

  1. Is emotion recognition impaired in people with HD?

  2. Are impairments different between premanifest and manifest stages?

  3. Are impairments different across emotions?

  4. Are impairments different across stimuli and/or sensory modalities?

  5. What methods are used to assess emotion recognition in people with Huntington's disease?

Due to the high level of heterogeneity in study designs, operationalisation of variables, methods of measuring emotion recognition, and reporting of key aspects of findings (e.g., effect sizes) highlighted within the literature by previous comparable reviews,22,28 a meta-analytic approach was not considered viable for this study.

Search strategy

A comprehensive search was performed from January 1st 1993 (year of the discovery of the Huntingtin gene) to January 1st 2025, using a combination of free-text terms across four major databases: Academic Search Ultimate (ASU), CINAHL, MEDLINE, and PsycINFO. Hand searches were also performed in reference lists of included studies in order to identify further relevant citations. Guidance was provided by a subject-specific librarian regarding the search terms and strategy. The search logic grid as well as the full search strategy can be consulted in the Supplementary Information.

Inclusion criteria

To be included in the present review, studies had to a) include adults (aged 18 and over) with manifest or premanifest HD; b) report quantitative data allowing for the analysis of group differences or effect sizes; c) compare HD gene carriers (either manifest or premanifest) to another group and/or normative data on tasks requiring the identification of human emotions based on stimuli from any sensory modality (e.g., visual, auditory). The decision to include only studies involving adult participants was made in light of the evidence that the biological and clinical manifestations of juvenile-onset HD differ significantly from those of adult-onset HD. 29

Quantitative observational or experimental studies with a cross-sectional, between-subjects, or quasi-experimental design were included in the present review. Due to financial limitations, only studies published fully in English were considered eligible. No geographical limits were applied to the database searches for published literature.

Studies not adhering to the concept under investigation, not published in full in the English language, involving participants aged 17 or below, or involving and/or relating to animals were excluded. Systematic reviews, reviews, commentaries, editorials, letters, and qualitative studies were also excluded. Grey literature was not included to ensure all evidence had undergone a formal peer review process and adhered to the highest level of scientific scrutiny.

Studies primarily focused on stimulus congruence, reaction times, or eliciting emotion expressions or experiences were not included; when studies mixed these with more traditional emotion recognition tasks, the recognition results were still included when available, even if they were not the main focus of the investigation.
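The inclusion and exclusion rules above can be sketched as a single screening predicate. This is a minimal illustrative sketch only — the field names (`adults_only`, `design`, etc.) are hypothetical and do not reflect any tooling actually used in the review.

```python
# Hypothetical sketch of the eligibility screen described above.
# All field names are illustrative, not taken from the review's methods.
from dataclasses import dataclass


@dataclass
class Study:
    adults_only: bool      # all participants aged 18 or over
    quantitative: bool     # group differences / effect sizes reported
    has_comparison: bool   # HD group vs controls and/or normative data
    in_english: bool       # published fully in English
    peer_reviewed: bool    # grey literature excluded
    design: str            # e.g. "cross-sectional", "review", "qualitative"


# Designs excluded outright by the review's criteria
EXCLUDED_DESIGNS = {"review", "systematic review", "commentary",
                    "editorial", "letter", "qualitative"}


def eligible(s: Study) -> bool:
    """Return True only if a study meets every inclusion criterion."""
    return (s.adults_only and s.quantitative and s.has_comparison
            and s.in_english and s.peer_reviewed
            and s.design not in EXCLUDED_DESIGNS)
```

A between-groups quantitative study of adults in English passes the predicate, while a qualitative study or a commentary fails it, mirroring the exclusion list above.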

Study selection

Results from searches of electronic databases were imported into reference management software, where duplicate citations were removed. Studies were selected using the eligibility criteria described above. In the first stage, all titles were screened by two reviewers, and those that clearly did not meet the inclusion criteria were excluded. In the second stage, all abstracts and full-text articles were screened for eligibility by two reviewers; in case of disagreement, a third reviewer provided a further level of scrutiny.

Data extraction

Selected data items were coded and extracted into an Excel sheet. All data were extracted by three reviewers and double checked by a further three to ensure accuracy. Data items for extraction included authors, year of publication, country of origin/study, type of study (e.g., design), key characteristics of the population (e.g., number, gender, sample power, context of recruitment, confirmation of HD status, premanifest or manifest, disease stage), types of emotion recognition investigated, emotion recognition measures adopted, and relevant results (e.g., means, SDs) and conclusions.

Data synthesis

The extracted data were initially synthesised by two reviewers, who produced a preliminary narrative summary. This was then checked for consistency by a further two reviewers, before being revised by the whole research team during the manuscript write-up.

Quality assessment

The appraisal of quality and risk of bias of the included studies was first performed by two reviewers independently, who then cross-checked the results. In light of the design of the majority of the studies (i.e., non-interventional, between-groups) as well as the heterogeneity of adopted methods, the appraisal criteria did not include standardised tools but were instead informed by previous comparable reviews.22,28 More specifically, these focused on a technical appraisal of elements such as sample size and power analysis, demographic data, control groups, the nature of stimuli and their presentation, response options, and potential confounding variables. Any disagreements between the two reviewers were resolved through collective discussion with the whole research team.

Results

From an initial return of 9735 citations across MEDLINE, PsycINFO, CINAHL, and ASU, 6577 were retained after removing duplicates and restricting results to peer-reviewed articles fully published in English. Of these, 6478 citations were excluded after screening by title and abstract, leaving 99 full-text articles to be considered for inclusion. Following full-text screening, 59 studies were eventually included in the present review. Figure 1 illustrates the PRISMA diagram for the selection of the studies, while their main characteristics are summarised below and outlined in Tables 1–8. See Supplementary Information for the list of citations excluded following full-text screening.
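The screening counts reported above are internally consistent, as a quick arithmetic check shows. The variable names below are illustrative only; the numbers are those stated in the text.

```python
# Arithmetic check of the PRISMA screening counts reported in the Results.
initial_citations = 9735          # combined return from the four databases
retained = 6577                   # after duplicate removal and English/peer-review limits
excluded_title_abstract = 6478    # excluded at title/abstract screening

full_text_assessed = retained - excluded_title_abstract
included = 59                     # studies in the final review
excluded_full_text = full_text_assessed - included

print(full_text_assessed)   # 99 full-text articles assessed for eligibility
print(excluded_full_text)   # 40 articles excluded at the full-text stage
```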

Figure 1.

PRISMA flow diagram for selection of studies.

Table 1.

Sensory modalities across included studies.

Study Visual Auditory Olfactory Gustatory Tactile Multimodality
[Per-study modality markers did not survive text extraction; the studies covered are 10, 17, 18, 23–26, 30–57, 62, 63, 74–81, 83–88, and 102–109.]

Table 2.

Technical appraisal of included studies.

Study Control present Longitudinal design Power analysis performed Sample size considered Normality tested FWER addressed IQ reported CAG reported ES reported
[Per-study appraisal markers did not survive text extraction; the studies covered are 10, 17, 18, 23–26, 30–57, 62, 63, 74–81, 83–88, and 102–109.]

Note. ES=effect size; FWER=Family-Wise Error Rate.

Table 3.

Findings for visual emotion recognition.

Comparison Study Stimuli Measure Anger Fear Disgust Happiness Neutral Sadness Surprise T/C
Manifest/Ctrl 102 Eyes RMET
34 Faces POFA
42 Faces POFA
Eyes RMET
31 Eyes RMET
Faces ADEFS
Audioless Videos ADEFS
45 Faces ADEFS
50 Faces POFA
Faces EHT
Written Vignettes Custom
79 Faces KDEF
24 Body language Custom
49 Visual Vignettes IAPS
Written Vignettes Custom
83 Eyes RMET
77 Eyes RMET
57 Eyes RMET
Faces POFA
26 Words Custom
Visual Vignettes IAPS
48 Faces EHT
Faces FEEST
Visual Vignettes IAPS
74 Faces EHT
Eyes RMET
52 Faces EHT
Eyes RMET
Faces POFA
80 Faces KDEF
85 Faces KDEF
37 Faces POFA
104 Faces POFA
38 Faces POFA
51 Faces EHT
Eyes RMET
Audioless Videos TASIT-EET
78 Eyes RMET
44 Faces POFA
105 Faces Custom
39 Faces POFA
17 Faces MFS
Faces Custom
108 Faces POFA
86 Faces KDEF
76 Faces POFA
Faces FEEST
Faces MFS
Eyes Custom
43 Faces POFA
Eyes RMET
55 Faces POFA
Faces EHT
32 Faces EHT
62 Faces POFA, KDEF
63 Faces POFA, KDEF
41 Faces KDEF
81 Faces FEEST
Faces POFA
109 Faces EHT
35 Faces Custom
36 Faces JeFEE
Faces ADEFS
47 Faces BESST
Body Language BESST
Manifest/Norms 33 Faces FAB
Premanifest/Ctrl 23 Faces POFA
Body Language Custom
56 Eyes RMET
77 Eyes RMET
84 Faces POFA
74 Faces EHT
Eyes RMET
52 Faces EHT
Eyes RMET
53 Faces POFA
87 Faces FEEST
103 Faces EHT
38 Faces POFA
51 Faces EHT
Eyes RMET
Audioless Videos TASIT-EET
78 Eyes RMET
44 Faces POFA
88 Faces POFA
106 Eyes RMET
40 Eyes RMET
54 Faces FEEST
81 Faces FEEST
Faces POFA
75 Faces ERT
10 Eyes RMET
Premanifest/Manifest 77 Eyes RMET
74 Faces EHT
Eyes RMET
52 Faces EHT
Eyes RMET
53 Faces POFA
44 Faces POFA
107 Faces ACS-AN
Premanifest/Norms 30 Faces FAB

Note. ◇=no significant difference; ◆=significant difference; ACS-AN=Advanced Clinical Solutions Affect Naming; ADEFS=Amsterdam Dynamic Facial Expression Set; BESST=Bochum Emotional Stimulus Set; EHT=Emotion Hexagon Test; ERT=Emotion Recognition Task; FEEST=Facial Expressions of Emotions: Stimuli and Tests (i.e., POFA + EHT); IAPS=International Affective Picture System; JeFEE=Jerusalem Facial Expressions of Emotion; KDEF=Karolinska Directed Emotional Faces; MFS=Manchester Face Set; NimStim=NimStim Set of Facial Expressions; POFA=Pictures of Facial Affect; RMET=Reading the Mind in the Eyes Test; T/C=total or composite score.

Table 4.

Findings for auditory emotion recognition.

Comparison Study Stimuli Measure Anger Fear Disgust Happiness Neutral Sadness Surprise T/C
Manifest/Ctrl 50 Nonverbal Sounds Custom
26 Nonverbal Sounds Custom
48 Nonverbal Sounds Custom
17 Nonverbal Sounds Custom
Nonverbal Sounds Custom
18 Nonverbal Sounds Custom
76 Nonverbal Sounds Custom
55 Nonverbal Sounds Custom
Premanifest/Ctrl 54 Nonverbal Sounds Custom
Premanifest/Norms 32 Verbal Sounds FAB

Note. ◇=no significant difference; ◆=significant difference; Ctrl=Control.

Table 5.

Findings for olfactory emotion recognition.

Comparison Study Stimuli Measure Anger Fear Disgust Happiness Neutral Sadness Surprise T/C
Manifest/Ctrl 26 Odorants Custom
25 Odorants Custom

Note. ◆=significant difference; Ctrl=Control.

Table 6.

Findings for gustatory emotion recognition.

Comparison Study Stimuli Measure Anger Fear Disgust Happiness Neutral Sadness Surprise T/C
Manifest/Ctrl 26 Liquids Custom
25 Foodstuff Custom

Note. ◇=no significant difference; ◆=significant difference; Ctrl=Control.

Table 7.

Findings for multimodality emotion recognition.

Comparison Study Stimuli Measure Anger Fear Disgust Happiness Neutral Sadness Surprise T/C
Manifest/Ctrl 34 Video Vignettes TASIT-EET
45 Video Vignettes Custom
46 Video Vignettes TASIT-EET
Premanifest/Norms 30 Faces + Verbal Sounds FAB

Note. ◇=no significant difference; ◆=significant difference; Ctrl=Control; FAB=Florida Affect Battery; TASIT-EET=The Awareness of Social Inference Test – Emotion Evaluation Task.

Table 8.

Main characteristics of included studies.

Study Country Design HD Stage HD Group
N
Age (M, SD)
Gender (%F)
Ctrl Group
N
Age (M, SD)
Gender (%F)
ER Modality ER Stimuli ER Measures Key ER results
10 UK Between-groups Premanifest 117
37.38 (11.06)
70
217
40 (15.39)
51
Visual Eyes RMET No significant difference observed between groups.
17 UK Between-groups Manifest 15
52.29 (9.41)
80
17
56.31 (8.92)
39
Visual
Auditory
Faces
Nonverbal Sounds
MFS
Custom
HD group significantly worse than controls at recognising anger, disgust and fear, but not happiness, sadness, or surprise.
18 France Between-groups Manifest 14
51.29 (7.69)
42.8
15
46.80 (11.18)
46.6
Auditory Nonverbal Sounds Custom HD group significantly worse than controls at recognising nonverbal sounds of happiness (achievement and pleasure), anger, disgust, fear, and pleasure.
23 Canada Between-groups Premanifest 21
48.3 (10.1)
57.1
27
49.2
55.6
Visual Faces
Body language
POFA
Custom
HD group significantly worse at recognising anger and disgust in isolated facial expressions.
No difference between HD group and controls in anger, sadness and disgust from body language.
24 France Between-groups Manifest 19
52.0 (9.1)
47.4
19
48.2 (8.0)
42.1
Visual Body Language Custom HD group significantly worse than controls with full-body expressions. No impairment of fear or sadness.
25 UK Between-groups Manifest 8
NR
NR
8
NR
NR
Olfactory
Gustatory
Odorants
Foodstuff
Custom HD group significantly worse at recognising disgusting odours compared to controls.
HD group significantly less prone to recognise inappropriate food combinations as disgusting, but no differences in reaction to gustatory stimuli.
26 Australia Between-groups Manifest 14
54.6 (11.6)
40
14
51.3 (9.25)
40
Visual
Auditory
Olfactory
Gustatory
Visual Vignettes
Words
Nonverbal Sounds
Odorants
Liquids
IAPS
Custom
HD group significantly impaired at recognising disgust from written vignettes, but not fear, happiness, sadness, or surprise.
No impairment from words.
Impaired recognition of anger and disgust from nonverbal sounds but not fear and sadness.
General odour naming impairment.
No significant impairment using taste.
30 Ireland Single case study (longitud.) Premanifest 1
50
0
Visual
Auditory
Multimodality
Faces
Verbal Sounds
Faces + Verbal Sounds
FAB Baseline:
Deficits found on facial, verbal, and multimodal emotion recognition.
2-year follow-up:
No significant change.
31 France Between-groups Manifest 20
37.41 (8.96)
45
20
38.49 (10.14)
45
Visual Faces
Audioless Videos
ADEFS No significant difference between HD and controls.
32 Germany Between-groups Premanifest 14
31 (1.8)
64.2
Gene-negative: 8
75
38.25 (14.51)
Matched:
37
36.23 (12.6)
54.1
Visual
Auditory
Faces
Nonverbal Sounds
Baseline and 6-month:
Faces: premanifest group significantly impaired on faces of disgust, but no other emotion compared to gene-negative; premanifest significantly impaired on faces of surprise, fear, disgust, and anger compared to matched controls.
Sounds: no significant differences between premanifest and gene-negative groups on any emotions; no matched control comparison.
12-month:
Faces: premanifest group significantly impaired on faces of disgust and surprise, but no other emotion; no matched control comparison.
Sounds: no significant differences between premanifest and gene-negative groups on any emotions; no matched control comparison.
Recognition of faces and voices expressing happiness, surprise, fear, and anger correlated significantly in the premanifest group.
33 USA Case series Manifest 5
41
NR
Visual Faces FAB Significant general impairments of emotion recognition in four out of five manifest HD patients.
34 Argentina Between-groups At-risk
Manifest
At-risk:
19
29.2 (9.6)
68.4
Manifest:
18
43.8 (10.3)
50
At-risk Ctrl:
18
29.5 (10.2)
38
Manifest Ctrl: 18
43.2 (10.3)
66

Visual
Multimodality
Faces POFA
TASIT-EET
At-risk and HD participants impaired in recognition of negative emotions with isolated faces (POFA) but normal in multimodality (TASIT-EET).
35 China Between-groups Manifest 6
44.8 (4.16)
33.33
16
45.19 (4.97)
37.5
Visual Faces Custom HD group significantly worse than controls at recognising surprise, fear, sadness, disgust, and anger, but not happiness.
36 Israel Between-groups Manifest 21
47.38 (13.20)
45
21
44.75 (13.98)
50
Visual Faces JeFEE
ADEFS
HD group significantly worse than controls across all emotional stimuli. However, performance in the HD group approached chance level when more ecologically valid facial expressions were introduced.
37 Australia
Canada
USA
Between-groups Premanifest
Manifest
464
41.43 (9.63)
62.93
57
43.01 (10.13)
61.4
Visual Faces POFA HD group significantly worse than controls on all negative emotions (anger, disgust, fear and sadness).
No significant difference found for happiness, surprise, or neutral stimuli.
38 Canada
France
Netherlands
UK
Between-groups Premanifest
Stage 1 Manifest
Stage 2 Manifest
Pre-HD A:
61
41.0 (8.7)
52.5
Pre-HD B:
54
40.5 (9.2)
57.4
Stage 1 HD:
113
47.2 (10.3)
59.7
Stage 2 HD:
51.2 (8.7)
43.9
116
45.8 (10.3)
54
Visual Faces POFA Premanifest HD groups significantly worse than controls at recognising anger, fear, and surprise. Nonsignificant trends for happiness, sadness, and disgust.
Both Stage 1 and Stage 2 Manifest HD groups significantly worse than controls across all stimuli: happiness, sadness, anger, fear, disgust, surprise, and neutral.
39 Canada
France
Netherlands
UK
Between-groups Manifest Group 1:
43
48.43 (9.78)
47
Group 2:
67
48.20 (10.05)
61
107
46.13 (10.14)
54
Visual Faces POFA HD group worse than controls on all emotional stimuli.
HD participants with apathy worse than non-apathetic HD participants.
Specific impairment in the recognition of happiness in HD participants with apathy compared to non-apathetic HD participants.
40 Italy Between-groups Premanifest
Manifest
Premanifest:
20
34.9 (8.9)
50
Manifest:
40
45.3 (10.1)
50
Premanifest Ctrl:
40
31.7 (4.7)
50
Manifest Ctrl:
40
47.2 (7.2)
50
Visual Eyes RMET Recognition significantly worse in premanifest and manifest HD compared to controls.
Manifest group also significantly worse than premanifest.


41 Italy Between-groups Manifest 12
61.08 (11.90)
31
11
65.45 (10.34)
27.3
Visual Faces KDEF Baseline:
HD group significantly worse in recognition of anger, fear, and sadness.
Follow-up:
No between-groups or within-groups analysis available.
42 Mexico Between-groups Manifest 12
42.7 (median)
67
Relatives:
12
44.7 (median)
58
Manifest Ctrl:
12
37.1 (median)
75
Visual
Auditory
Eyes
Faces
POFA
RMET
HD group significantly worse on emotion recognition through eyes (RMET) but not faces (POFA).
Germany
UK
Case series Manifest Case 1
1
41
0
Case 2:
1
30
0
15
43.3 (12.1)
53
Visual Faces EHT Significant impairment of recognition of disgust and fear in both patients.
Normal recognition of happiness, surprise, sadness, and anger.
44 UK Between-groups Premanifest
Manifest
Premanifest:
20
47.6 (8.45)
40
Manifest:
20
38.4 (9.5)
65
20
47.9 (9.3)
40
Visual Faces POFA Manifest impaired on anger, fear, disgust, and sadness compared to both controls and premanifest; no significant differences on other emotions.
Premanifest not significantly different from controls on any emotion.
Benton, phonemic fluency, semantic fluency, and WAIS-III vocabulary found to contribute to explanation of variance in emotion recognition in regression models.
45 France Single case study Manifest 1
47 (NR)
0
20
46 (7.5)
NR
Visual
Multimodality
Eyes
Faces
ADEFS
RMET
No significant impairment observed.
46 Australia Between-groups Manifest 17
61 (12)
47
24
62 (9)
63
Multimodality Video Vignettes TASIT-EET HD group significantly worse than controls on negative emotions (anger, fear, disgust, sadness), but not positive ones (happiness, surprise) or neutral. No specific scores available for each emotion.
47 UK Between-groups Manifest 13
53.46 (5.11)
69
12
52.17 (7.907)
58.3
Visual Faces
Body language
BESST HD group significantly worse than controls in recognising disgust and anger from facial stimuli, and fear, sadness, and neutral stimuli from body language.
48 Australia Between-groups Manifest 14
54.6 (11.17)
42.8
14
51.8 (8.37)
50
Visual
Auditory
Faces
Visual Vignettes
Nonverbal Sounds
EHT
FEEST
Custom
HD group significantly worse than controls at recognising anger, disgust, fear, sadness and surprise from faces, but not happiness.
Only disgust impaired from visual vignettes.
49 UK Between-groups Manifest 16
57.5
50
16
56.6
56.25
Visual Eyes RMET HD group significantly worse than controls.
50 UK Between-groups Manifest Study 1:
21
50.43 (8.70)
42.8
Study 2:
19
42.43 (11.35)
47.7
Study 1:
Different for each task
Study 2:
14
42.43 (11.35)
64
Visual
Auditory
Faces
Nonverbal Sounds
Written Vignettes
POFA
EHT
Custom
HD group significantly worse at recognising anger, fear, and disgust in facial and auditory tasks. Only anger impaired on written vignettes.
51 Denmark Between-groups Premanifest
Manifest
Premanifest:
50
37 (NR)
42
Manifest:
50
51 (NR)
40
39
41 (NR)
56
Visual Faces
Eyes
Video Vignettes
EHT
RMET
TASIT-EET
No significant impairments observed in the Premanifest HD group across all tasks compared to controls.
Manifest HD group significantly worse than controls at recognising emotions through faces (EHT), eyes (RMET), and video vignettes (TASIT-EET).
52 Denmark Between-groups (longitud.) Premanifest
Manifest
T1 Premanifest:
50
36.5 (8.8)
42
T1 manifest:
48
51.2 (12)
39.5
T2 premanifest:
34
41.15 (8.5)
38.2
T2 manifest:
46
52.89 (11.6)
45.6
46
42.0 (13.4)
56.5
Visual Faces
Eyes
EHT
RMET
POFA
Premanifest HD group not significantly impaired compared to controls at either baseline or 6-year follow-up.
Manifest HD group consistently impaired compared to premanifest HD group and controls at both baseline and 6-year follow-up.
53 UK Between-groups Premanifest
Manifest
Premanifest:
21
37.2 (7.9)
52
Manifest:
40
48.5 (9.6)
50
20
44.9 (10.5)
65
Visual Faces POFA HD group significantly worse than controls at recognising surprise, disgust, anger and fear, and worse than premanifest HD at recognising disgust and anger.
54 UK Between-groups Premanifest 11
NR
NR
17
50.7 (14.3)
47
Visual
Auditory
Faces
Nonverbal Sounds
FEEST
Custom
HD group significantly worse than controls at recognising anger, fear, disgust, sadness, and surprise, but not happiness, from facial stimuli.
Impairment of fear, disgust, happiness, and surprise, but not anger and sadness from auditory stimuli.
55 Germany Between-groups Manifest 41
48.7 (10.0)
NR
26
47.0 (9.5)
NR
Visual Faces
Eyes
POFA
RMET
HD group significantly worse than controls at recognising anger, fear, disgust, sadness, and surprise, but not happiness, from facial stimuli. Effect sizes highest for disgust and anger.
General impairment in recognising emotions from eyes.
56 UK Between-groups Manifest 13
53.1
38.5
12
53.1
50
Visual Visual Vignettes
Written Vignettes
IAPS
Custom
HD group significantly worse at recognising fear from visual vignettes but not disgust or happiness.
No impairment on fear, disgust, or happiness with written vignettes.
57 Spain Between-groups Manifest 22
58.09 (9.73)
59
19
52.00 (9.69)
53
Visual Eyes
Faces
RMET
POFA
HD group significantly worse than controls with both eyes (RMET) and faces (POFA).
62 France Between-groups Manifest 13
54.1 (7.2)
46
18
52.3 (5.4)
33
Visual Faces POFA
KDEF
HD group significantly worse than controls in recognition of fear, disgust, and sadness, but not fear, happiness and surprise.
63 France Between-groups Manifest 28
50 (8)
42.8
24
49 (10)
50
Visual Faces POFA
KDEF
HD group significantly worse than controls in recognition of anger, fear, disgust, happiness, sadness, and surprise.
74 Denmark Between-groups Premanifest
Manifest
Premanifest:
40
41.3 (11.0)
40
Manifest:
40
51.7 (11.9)
47.5
32
48.1 (14.1)
59
Visual Faces
Eyes
EHT
RMET
HD group significantly worse than controls across both measures.
75 USA Between-groups Premanifest 21
54 (9)
57
16
56 (12)
62
Visual Faces ERT Premanifest HD group significantly less likely than controls to recognise anger, fear, and sadness, but not disgust, happiness, surprise, or neutral stimuli.
76 UK Between-groups Manifest 10
47 (9)
50
12
57 (9)
33
Visual
Auditory
Faces
Eyes
Nonverbal Sounds
POFA
FEEST
MFS
Custom
HD group significantly impaired on both visual and auditory tasks, with predominant impairment for negative emotions.
77 UK Between-groups Premanifest
Manifest
Premanifest:
16
42.13 (13.49)
NR
Manifest:
16
13.25 (2.11)
NR
28
46.7 (13.4)
54
Visual Eyes RMET Both premanifest and manifest HD groups both significantly worse than controls.
Manifest HD group significantly worse than premanifest HD group.
78 UK Between-groups Premanifest
Early manifest
Moderate manifest
Late manifest
Premanifest:
29
43.5 (9.5)
51.7
Early manifest:
12
54.1 (11.5)
25
Moderate manifest:
18
52.8 (14.4)
50
Late manifest:
20
56.1 (10.2)
35
26
59 (11.7)
46.1
Visual Eyes RMET No significant impairments observed in the Premanifest HD group compared to controls.
All Manifest HD groups significantly worse than controls.
79 Australia Between-groups Manifest 11
56.82 (9.81)
37.5
11
55.64 (7.06)
NR
Visual Faces KDEF HD group significantly worse than controls with neutral, angry, and disgust facial expressions.
80 Austria Between-groups Manifest 18
51.9 (10.4)
44
18
49.2 (10.3)
44
Visual Faces KDEF HD group significantly less accurate than controls at recognising sadness, anger, and disgust, but not fear and surprise.
HD group significantly better than controls at recognising happiness.
81 Portugal Between-groups Premanifest
Manifest
Premanifest:
16
36.2 (1.8)
87.5
Manifest:
9
48.8 (4.6)
0
22
41.0 (2.3)
59
Visual Faces FEEST
POFA
Premanifest HD group not significantly impaired compared to controls.
Manifest HD group significantly impaired on all emotional stimuli compared to controls. Smaller impairment for happiness.
83 UK Between-groups Premanifest 20
45.0 (14.0)
70
26
45.7 (14.4)
69.2
Visual Eyes RMET HD group significantly worse than controls.
84 UK Between-groups Premanifest 23
38.53 (11.24)
NR
15
38.26 (11.82)
NR
Visual Faces POFA Premanifest significantly poorer at recognising disgust than controls; no difference on any other emotions.
85 Austria Between-groups Manifest 28
48.4 (9.4)
39
28
47.2 (7.5)
39
Visual Faces KDEFS HD group significantly less accurate than controls at recognising anger, disgust, surprise, and sadness.
86 Austria Between-groups Manifest 18
51.9 (10.4)
44.4
18
49.2 (10.3)
44.4
Visual Faces KDEF HD group significantly worse than controls for anger, but not fear, disgust, happiness, sadness, or surprise.

87 Germany Between-groups Premanifest 9
37.4 (5.4)
44
9
NR
44
Visual Faces FEEST Premanifest HD group significantly worse than controls at recognising disgust, but not anger, fear, happiness, sadness, or surprise.
88 UK Between-groups Premanifest 16
43.81 (8.30)
75
14
39.43 (11.40)
71
Visual Faces POFA No significant impairment observed.
102 France Between-groups Manifest 18
50.7 (8.8)
44.5
18
47.5
38.9
Visual Eyes RMET HD group significantly worse than control on RMET.
103 Australia Between-groups Premanifest 17
43.8 (10)
47
13
42.0 (11.4)
30.7
Visual Faces EHT No significant differences observed.
104 Australia Between-groups Premanifest
Manifest
23
48.83 (8.90)
56
25
49.64 (8.86)
60
Visual Faces ADFES HD group significantly worse at recognising happiness, anger, disgust sadness, and surprise.
105 Netherlands Between-groups Manifest 8
46.4 (11.2)
37.5
30
39 (11.1)
53
Visual Faces Custom HD group impaired in recognition of disgust and anger, but not fear, happiness, sadness, and surprise.
106 Italy Between-groups Premanifest 18
35.6 (7.2)
50
18
37.3 (9.6)
50
Visual Eyes RMET Premanifest HD group significantly worse than controls.
107 USA Between-groups Premanifest
Manifest
Premanifest:
14
47.43 (10.83)
50
Manifest:
62
50.29 (13.12)
55
Visual Faces ACS-AN Manifest HD group significantly worse than Premanifest HD group.
108 Spain Between groups Manifest 21
58.1 (9.7)
59.1
22
52 (9.7)
53
Visual Faces POFA HD group significantly worse than controls.
109 Denmark Between-groups Manifest 52
51.0 (11.8)
40.3
166
47.9 (20.9)
58.4
Visual Faces EHT HD group significantly worse than controls.

Note. ACS-AN = Advanced Clinical Solutions Affect Naming; ADFES = Amsterdam Dynamic Facial Expression Set; EHT = Emotion Hexagon Test; ER = emotion recognition; ERT = Emotion Recognition Task; FEEST = Facial Expressions of Emotions: Stimuli and Tests (POFA + EHT); IAPS = International Affective Picture System; JeFEE = Jerusalem Facial Expressions of Emotion; KDEF = Karolinska Directed Emotional Faces; MFS = Manchester Face Set; NimStim = NimStim Set of Facial Expressions; NR = not reported; POFA = Pictures of Facial Affect; RMET = Reading the Mind in the Eyes Test; T/C = total or composite score.

Design

The vast majority of the investigations (55, 93.2%) adopted a cross-sectional between-groups design comparing people with HD at different stages with some form of control group. Three papers adopted a single-case design,30–32 while only one study carried out a case series of five individuals. 33

Countries

Most of the included studies were carried out in high-income countries, with Western Europe alone contributing just over 70% of the evidence base. With regard to specific countries, the United Kingdom was the most represented (16, 27.1%), followed by France (7, 11.9%), Australia (6, 10.2%), and Germany and Denmark (4 studies, 6.8% each). North America only accounted for four studies – three from the United States (5.1%) and one from Mexico (1.7%) – while China, Israel, and Argentina were represented by one study each.34–36

Sample size

Sample sizes for pwHD ranged from single case studies to large multicentre investigations enrolling over 400 participants (e.g., the TRACK-HD cohorts).37–39 Control samples were similarly variable, ranging between one and 217 participants, the latter from a large international online study. 10 However, median group sizes were modest (i.e., around 18 for pwHD and 20 for controls), indicating predominantly small-to-medium scale studies, consistent with the rarity of HD.

Type of HD Participants

Just over half of the included studies (35, 59%) recruited only pwHD at the manifest stage, while 11 (19%) recruited only premanifest individuals. Another fifth (13, 22%) enrolled both manifest and premanifest pwHD.

Demographic information

The mean age of pwHD was around 47 years; manifest groups, as expected, tended towards the late 40s or early 50s, while premanifest groups were typically five to seven years younger. The whole range of means spanned from age 30 in premanifest samples to around 61 in manifest ones. Control groups showed comparable distributions, with means between 31.7 years 40 and 65.5 years. 41

HD samples were on average predominantly male (i.e., 44% female), while control groups were slightly more balanced (51% female). Where reported, HD groups showed a median of around 12.7 years of formal education, with a range between 4.1 years 38 and 20 years. 31 Controls showed almost identical figures (median: ≈ 12.5), though the upper end of the range was lower (16 years). 42 Mean IQ levels for HD participants fell within the average range (108), ranging from 99 35 to 117. 32 The mean of controls was similar (107), with a range between 104.5 43 and 109. 44 Neither type of sample ever exceeded the superior range on average.

Sensory modalities

Out of the 59 included studies, 56 (94.9%) included tasks investigating the visual modality (e.g., faces, eyes, visual vignettes). Auditory stimuli were a distant second, adopted by only nine studies (15.3%). Olfactory and gustatory stimuli were adopted concurrently by only two studies (3.4%),25,26 while multimodal stimuli (i.e., combining two or more sensory modalities, such as audio-video) were adopted in four investigations.30,34,45,46 No study investigated emotion recognition through tactile stimuli. Table 1 summarises the results by study, while Figure 2 illustrates the distribution of sensory modalities across all included studies.

Figure 2. Emotion recognition modalities across included studies. Note. Total included studies = 59. Some studies investigated multiple modalities.

Of the 56 studies adopting tasks investigating visual emotion recognition, an overwhelming majority (48, 85.7%) used facial stimuli, while eyes alone accounted for 24 studies (42.9%). Only around 17% of all included studies investigated other visual emotion recognition modalities, such as body language,23,24,47 visual vignettes,26,48,49 written vignettes,49,50 words, 26 and audioless videos.31,51 Figure 3 outlines the distribution of specific types of stimuli across the studies which explored the visual modality.

Figure 3. Types of visual stimuli across included studies. Note. Total visual studies = 56. Some studies adopted multiple stimulus types.

Technical appraisal

As mentioned above, the criteria for the technical appraisal of the included studies were informed by previous comparable reviews.22,28 The details of the appraisal are outlined below, while Table 2 provides a summary for each study.

Control samples

Almost all included studies (56, 94.9%) compared pwHD with some form of control group, with 44 comparing people with manifest HD against controls and 20 comparing premanifest individuals against controls. Only one study each compared manifest 33 or premanifest 30 pwHD against standardised norms. Most included studies used healthy controls matched for age, gender, and/or education. A smaller number of studies included gene-negative individuals as controls.34,52–54

Longitudinal designs

Only five studies adopted longitudinal designs,30,41,52,54,55 following participants for up to six years. Of these, one was a single case study, 30 and the remaining ones adopted a between-subjects design. Two enrolled only premanifest individuals30,54 and two only manifest ones,41,55 while only one study 52 recruited both.

Power analysis and sample size considerations

The majority of the studies relied on convenience samples, with only four reporting an a priori power calculation.10,36,56,57 Three of these10,36,57 were also among the 11 investigations (18.6%) that included some explicit consideration of, or justification for, the adopted sample size.

Data normality

Fewer than half of the included studies (24, 40.7%) reported considerations around data normality, suggesting that violations of test assumptions may have gone unrecognised in other investigations.

Family-wise error rates (FWER)

Just over half of the investigations (34, 57%) clearly addressed family-wise error rates (FWER) when running multiple comparisons, either by applying corrections such as Bonferroni or False Discovery Rate (FDR), or by providing a rationale for not applying any adjustment (e.g., excessive conservativeness with smaller samples). Thus, the risk of false-positive findings may have been increased in the remaining 25 studies, which did not report any consideration of FWER.

Clinical characteristics

The reporting of some of the essential clinical characteristics of pwHD was quite inconsistent across studies. While CAG-repeat lengths were reported by a large majority (41, 69.5%), only around a quarter of the included studies (16, 27.1%) reported actual or estimated IQ scores.

Effect sizes

Fewer than one third of the included studies (18, 30.5%) reported standardised effect sizes when presenting their results, which severely limits insight into the clinical significance of the reported findings. However, the fact that the studies which did report effect sizes were published more recently suggests a potential improving trend in this regard.

Emotion recognition measures

No measures of emotion recognition specifically developed for or validated with pwHD are currently available. Thus, all included studies relied on measures developed for other populations or on custom tools. While efforts have been made to describe the most common measures as clearly as possible, custom tools as well as some of the lesser-known tests often require their specific context of use for a full characterisation. Readers are therefore invited to consult the relevant citations to learn more about these.

Visual measures

The investigation of visual emotion recognition relied heavily on face-based measures, with 26 studies adopting Paul Ekman's classic Pictures of Facial Affect (POFA) 58 – a stimulus set consisting of 110 frontal photographs depicting six basic emotions (anger, fear, surprise, happiness, sadness, and disgust), along with a neutral expression. It should be noted that studies adopting the Ekman 60 Faces Test (Ek-60F) 58 were also counted as using the POFA since the test draws from the same 110-picture stimulus set.

The second most common measure, adopted by 23 studies, was the Reading the Mind in the Eyes Test (RMET), 59 which focuses on the recognition of emotional expressions from photographs of eyes alone. The Emotion Hexagon Test (EHT), 60 characterised by facial images morphed along a hexagonal continuum with expressions blending two different emotions, was adopted by 14 investigations, while non-standardised custom tasks were developed ad hoc by nine studies. The Karolinska Directed Emotional Faces set (KDEF), 61 a set of 4900 pictures of 70 individuals displaying seven different emotional expressions from five different angles, was used by seven studies – on two occasions in a custom combination with the POFA.62,63 The Facial Expressions of Emotions: Stimuli and Tests (FEEST; a computerised, updated version of the POFA) 64 was adopted by six investigations, while the visual components of the Florida Affect Battery (FAB) 65 – a test measuring facial emotional discrimination, emotional prosody discrimination, and facial-auditory cross-modality tasks – were adopted by one study.

Some of the less commonly adopted measures (i.e., by fewer than five studies) included the Amsterdam Dynamic Facial Expression Set (ADFES), 66 the International Affective Picture System (IAPS), 67 the Manchester Face Set (MFS), 68 the Bochum Emotional Stimulus Set (BESST), 69 the Jerusalem Facial Expressions of Emotion (JeFEE), 70 the Emotion Recognition Task (ERT), 71 and the Advanced Clinical Solutions – Affect Naming (ACS-AN). 72

Auditory measures

Studies exploring auditory emotion recognition relied predominantly on custom tasks based on non-verbal sounds, with only one standardised measure, the emotional prosody discrimination component of the FAB, 65 being adopted by one investigation. 32

Olfactory and gustatory measures

No standardised measures were adopted by the two studies investigating olfactory and gustatory emotion recognition, with both using custom odour and taste emotional labelling protocols.25,26

Multimodality measures

Almost all of the six studies which explored emotion recognition across multiple modalities used tasks based on video vignettes, the most common being The Awareness of Social Inference Test – Emotion Evaluation Task (TASIT-EET) 73 and the Amsterdam Dynamic Facial Expression Set videos (ADFES). 66 One investigation 45 also adopted a custom video vignette measure, while another 30 used a composite score based on a combination of faces and non-verbal sounds from the FAB. 65

Emotion recognition findings

The main findings on all emotion recognition modalities are outlined below, divided by type of comparison. Due to their stimulus heterogeneity, the results for the visual modality are presented according to the specific type of task. For these, results from studies adopting larger participant cohorts (e.g., TRACK-HD) are also highlighted where relevant. Tables 3–7 summarise the findings for each modality, while Table 8 provides an outline of the main characteristics and key findings of all included studies.

It should be noted that not all investigations within each modality explored all basic emotions (i.e., anger, fear, disgust, happiness, neutral, sadness, surprise), and some studies carried out multiple comparisons per modality using different measures. As a consequence, the number of studies reported for each emotion below may not match the total number of studies in each modality.

Facial emotion recognition

Manifest v. Controls

When comparing manifest individuals against matched controls on facial emotion recognition tasks, impairments in pwHD were reported by over 60% of the studies adopting a total or composite emotion recognition score (7/11). On tests of specific emotions, significantly poorer facial emotion recognition for negative emotions (e.g., anger, sadness) was also reported consistently in the HD group. More specifically, around three-quarters or more of the studies that measured anger (29/35), fear (26/35), and disgust (30/35), and more than 60% of those exploring sadness (22/34), reported significantly poorer performance in the HD group – including all the studies from the TRACK-HD multicentre cohorts.37–39 Recognition of surprise was also found to be impaired in more than half of the investigations (19/32); however, no impairment was reported in one of the TRACK-HD studies. 37 Finally, performance on facial stimuli for happiness was largely intact across studies, with just over one in five reporting impairments (8/35); the same was observed for neutral stimuli, albeit based on far fewer investigations (4/10). However, it should be noted that significant deficits for both happiness and neutral items were found in two of the TRACK-HD studies.38,39

Manifest v. Norms

Only one study compared manifest pwHD to published norms on facial emotion recognition, 33 finding a significant impairment using the total Facial Affect Recognition Scale of the FAB. 65

Premanifest v. Controls

When comparing premanifest individuals against matched controls, impairments in facial emotion recognition were reported less frequently. None of the three studies adopting a facial composite score found significant differences between groups.51,52,74 Similarly, fewer than half of the emotion-specific investigations reported deficits for anger (5/12), fear (3/10), and disgust (5/12). Only one study reported impairments for sadness, 75 while the TRACK-HD premanifest cohort evaluated by Labuschagne et al. 38 was one of only two investigations (along with Sprengelmeyer and colleagues 54 ) to report deficits in the recognition of surprise. Facial recognition of happiness and neutral faces was entirely preserved in people with premanifest HD across all studies.

Manifest v. Premanifest

Compared to premanifest individuals, people with manifest HD were found to be impaired in most studies adopting a facial composite or total score (2/3). The two studies which compared participants on specific emotions44,53 both found significant impairments in the recognition of anger, fear, and disgust in manifest pwHD; however, only Henley et al. 53 found deficits for the recognition of surprise and happiness, while only Milders and colleagues 44 found a deficit for sadness. The latter was also the only study to compare manifest and premanifest people on neutral facial stimuli, finding no significant difference between groups. 44

Premanifest v. Norms

Only one single case study compared the facial emotion recognition performance of a premanifest individual against published norms, 30 finding a significant discrepancy with normative data using the total Facial Affect Recognition Scale of the FAB. 65

Eyes emotion recognition

Manifest v. Controls

Eleven out of 12 investigations which compared manifest individuals to matched controls on tasks involving eye-based emotion recognition adopted the RMET, 59 which only yields a total score. Of these, 10 found a significant impairment in pwHD. The only study which measured specific emotions using a custom eyes-based task 76 found significant impairments for recognition of fear, disgust, and sadness, but not of anger, happiness, or surprise.

Premanifest v. Controls

All the nine studies which compared premanifest individuals against controls on eyes emotion recognition adopted the RMET. Unlike with manifest participants, however, the results of these investigations were less consistent, with more than half of the studies (5/9) finding no significant impairment in premanifest pwHD – including an international online study enrolling 117 participants across Italy and the UK. 10

Manifest v. Premanifest

The three studies comparing premanifest and manifest individuals on eyes emotion recognition all adopted the RMET and found significantly poorer performance in the manifest group than in the premanifest group.52,74,77

Body language emotion recognition

Manifest v. Controls

Two studies used emotional body language pictures with hidden facial expressions to compare manifest individuals with matched controls. De Gelder et al. 24 developed a custom task and found impairments for anger and neutral stimuli, but not for fear and sadness. Zarotti et al. 47 instead adopted 70 frontal body language pictures from the standardised Bochum Emotional Stimulus Set (BESST) 69 and found impairments for recognition of fear, sadness, and neutral stimuli, but not of anger, disgust, happiness, or surprise.

Premanifest v. Controls

Only one study explored emotional body language recognition in premanifest pwHD, comparing them to matched controls. 23 The results, based on a custom task, showed no significant group differences for anger, disgust, or sadness.

Other visual emotion recognition

Manifest v. Controls

Three studies adopted visual vignette stimuli from the International Affective Picture System (IAPS) 67 – a standardised set of emotionally evocative pictures including both human (e.g., faces) and non-human (e.g., rooms) stimuli – to compare manifest pwHD with matched controls. On one hand, Eddy et al. 49 found impairments for fear, but not disgust or happiness. On the other, Hayes et al. 26 found that manifest individuals showed a deficit for disgust, but not fear, happiness, or neutral stimuli. These results were corroborated by a later study by the same group, which again found significant impairments for disgust but none for anger, fear, or sadness. 48 Written emotional stimuli, whereby participants had to read emotional vignettes or words (and hence rely on vision), were also adopted by three studies.26,49,50 Among these, however, only Calder and colleagues 50 found a significant difference between groups, specifically for anger. Finally, Larsen and colleagues 51 adopted an audioless version of the videos from the TASIT-EET 73 and found a significant impairment in manifest pwHD on the measure's total score. In contrast, Caillaud et al. 31 reported a case study in which a manifest individual showed no impairment on the audioless video stimuli from the ADFES. 66

Premanifest v. Controls

An audioless version of the TASIT-EET 73 was used by Larsen and colleagues 51 to compare premanifest individuals with matched controls, finding no significant differences between groups on its total score.

Auditory emotion recognition

Manifest v. Controls

Each of the seven studies which used auditory emotion recognition tasks to compare manifest individuals against matched controls found a significant impairment in the recognition of disgust, with fear and anger a close second (6/7 and 5/7 studies, respectively). On the other hand, recognition of happiness and surprise was found to be impaired in fewer than half of the investigations (2/5 each), and only one study reported deficits in the recognition of sadness. 17 None of the studies measured neutral expressions or adopted composite or total scores.

Premanifest v. Control

Only one study compared premanifest individuals to controls using a custom auditory task, finding no significant differences on a composite score. 54

Premanifest v. Norms

One study compared premanifest pwHD to published norms, 30 reporting a significant impairment in the total Vocal Affect Recognition Scale of the FAB. 65

Olfactory and gustatory emotion recognition

Manifest v. Controls

Only two studies investigated olfactory and gustatory emotion recognition in pwHD by comparing manifest individuals with matched controls on custom measures for both modalities. Hayes and colleagues 26 found a significant difference on the composite score for olfactory emotion recognition based on odorants, but not for gustatory stimuli (i.e., liquids). On the other hand, Mitchel et al. 25 found a significant impairment in the HD group on both olfactory (i.e., odorants) and gustatory (i.e., foodstuff) tasks.

Multimodality emotion recognition

Manifest v. Controls

The three studies which compared manifest individuals with matched controls on multimodal emotion recognition tasks (all based on video vignettes) yielded contrasting results. Philpott et al. 46 found a significant group difference using the full (i.e., both audio and video) TASIT-EET, 73 whereas Caillaud et al. 45 did not find any significant impairments when adopting a custom video task (‘Pierre and Marie’). Similarly, Baez et al. 34 found no group differences based on the TASIT-EET total score or any of its specific emotional items.

Premanifest v. Norms

Burke et al. 30 reported a case study in which a premanifest individual was found to be impaired on the multimodality component (i.e., matching emotional prosody with emotional faces and vice versa) of the FAB. 65

Emotion Recognition Associations

Demographic Associations

Johnson et al. 37 found that poorer emotion recognition performance was associated with increasing age, lower estimated IQ, and lower educational level. Most studies reported no gender effect, although Johnson et al. 37 found that females performed better than males. Larsen et al. 51 and Rees et al. 17 found emotion recognition deficits even after controlling for social and environmental factors, suggesting these may not be contributing variables.

Clinical associations

A number of studies investigated the potential correlation between performance in emotion recognition and clinical variables such as CAG repeat length, CAG-Age-Product (CAP) scores, or estimated time to disease onset in premanifest individuals. Mason et al. 78 and Larsen et al. 51 reported that higher CAP scores and estimated time to disease onset were associated with worse emotion recognition performance. However, Croft et al. 79 reported that CAG repeat length was not associated with emotion recognition scores. Similarly, Ille, Schäfer, et al. 80 and van Asselen et al. 81 found no correlations between emotion recognition and CAG repeats, symptom duration, or the Total Motor Score (TMS) of the Unified Huntington's Disease Rating Scale (UHDRS). 82

However, other studies did find that the UHDRS TMS correlated with emotion recognition from faces 36 and body language, 24 while Total Functional Capacity scores from the same scale were positively correlated with body language recognition. 47 Furthermore, emotion recognition was found to predict decline in functional capacity over a six-year follow-up, independently of executive function, depression, and baseline disease severity. 74 However, Eddy et al., 83 Tinkler et al., 62 and Ille, Holl, et al. 80 found no correlations between emotion recognition and disease duration, age of motor onset, or disease burden. Finally, higher levels of insulin-like growth factor 1 (IGF-1) were found to be significantly associated with better facial emotion recognition performance in a small sample of manifest pwHD. 57

Cognitive associations

Across the included studies, some found associations between cognitive status or executive function and emotion recognition in HD participants,37,41,44,46,76 while others did not.36,80,81,84 In several investigations, single measures of executive functioning correlated with emotion recognition.48,77 However, Baez et al. 34 found no correlation between emotion recognition and executive functioning, and a longitudinal six-year follow-up study showed that emotion recognition and functional decline were independent of executive functions. 74

Some studies also considered whether a deficit in basic facial recognition was associated with performance on emotion recognition tasks. In this regard, the evidence appears again inconsistent, with some findings showing that better face recognition predicted more accurate emotion recognition,17,37,44 but others reporting no association between facial recognition and emotion recognition.48,50 Furthermore, after controlling for impairments on face matching in their analyses, Calder et al. 50 and Rees et al. 17 found the effect of poor recognition of negative emotions remained significant for HD participants.

Psychological associations

Some studies controlled for anxiety and depression levels when analysing emotion recognition performance. Hendel et al. 74 found deficits even after controlling for depression in regression analyses. However, Croft et al. 50 and Johnson et al. 37 found no association between emotion recognition performance and levels of depression. In addition, difficulties in emotion regulation showed no relationship with emotion recognition performance across two studies.10,47 Results for apathy appear more mixed, with Hendel et al. 52 reporting that apathy correlated with emotion recognition, while no association was found in another study. 63 Finally, Baez et al. 34 found no correlation between emotion recognition and empathy in pwHD.

Medication associations

Osborne-Crowley et al. 39 found no effect of taking antidepressants or antipsychotics on emotion recognition. However, in early-stage HD, Labuschagne et al. 38 found that neuroleptics were associated with worse recognition performance, while selective serotonin reuptake inhibitors (SSRIs) were associated with better scores on recognition tasks. Both investigations enrolled large samples from the TRACK-HD cohorts. Similar findings for antidopaminergic medications were also reported by a later study. 57

Cross-Modality associations

Only two studies investigated potential associations of emotion recognition performances across multiple modalities in pwHD, both between the visual and auditory modality. Sprengelmeyer et al. 54 found significant positive correlations between the recognition of faces and voices expressing happiness, surprise, fear, and anger. However, no significant association was found for sadness and disgust. On the other hand, Hayes et al. 26 found a significant positive correlation between the recognition of vocal expressions of disgust and disgusting vignettes from the IAPS. 67

Neuroimaging findings

Neuroimaging investigations were carried out by around one-third of the included studies (18/59). Most adopted structural methods – such as computerised tomography (CT) scans, volumetric magnetic resonance imaging (MRI), or voxel-based morphometry (VBM) – and hypothesised frontostriatal atrophy as the anatomical substrate of emotion recognition impairments in pwHD. More specifically, early CT results by Sprengelmeyer et al. 55 linked impaired recognition of negative facial expressions to caudate and frontal volume loss. Subsequent studies corroborated these findings by showing that poorer emotion recognition performance was associated with reduced grey matter in the caudate, putamen, insula, and orbitofrontal cortex in manifest HD.53,85,86 Similarly, two VBM analyses showed that subtle frontostriatal degeneration was significantly correlated with poorer affective social cognition in premanifest individuals.31,45 In their VBM study, Gil-Polo and colleagues 57 found that reduced frontotemporal grey matter volume and cortical thinning were significantly associated with lower IGF-1 levels in manifest HD, which were in turn significantly correlated with lower emotion recognition performance. In addition, a diffusion-tensor imaging (DTI) study found that white matter disruptions in the corpus callosum, the frontal gyrus, right anterior cingulate cortex, insula and amygdala regions, cerebellum, and brainstem were significantly associated with poorer emotion recognition from faces and eyes in manifest pwHD. 43

Although fewer in number, functional neuroimaging studies showed hypoactivation within areas such as the precuneus, anterior insula, anterior and posterior cingulate, and supramarginal gyrus in manifest HD during tasks involving emotion recognition.77,87 Moreover, lower amygdala–fusiform connectivity and reduced activation of the superior temporal sulcus were observed in premanifest individuals when judging facial emotional expressions.75,78 Finally, Novak et al. 88 found that reduced neural activity in premanifest pwHD – comparable to the other studies – could be partially distinguished based on the processing of three specific emotions (i.e., disgust, anger, and happiness).

Discussion

The present work aimed to provide a comprehensive systematic review of empirical studies on emotion recognition in people with HD over the three decades since the consistent identification of the HD gene through direct testing. From 9735 initial records, 59 studies were considered eligible for inclusion. Even when accounting for the more encompassing inclusion criteria of this work, the number of included studies has increased more than threefold compared to a previous review from 15 years ago, 22 highlighting a considerable surge of interest in emotion recognition within the HD literature. Most included studies adopted a cross-sectional design, with only five following participants longitudinally and four reporting single cases or case series. Around 70% of the evidence came from Western Europe, and especially the United Kingdom, while other continents were much less represented. Sample sizes ranged from one to over 400, with median values around 20.

The results in people with manifest HD showed robust evidence of deficits in the recognition of negative emotions from facial cues, with over 75% of relevant studies consistently reporting impairments in the recognition of anger, fear, and disgust, and more than half for sadness and surprise. Facial recognition of happiness and neutral expressions appears, by contrast, to be relatively preserved, although two of the studies based on the large TRACK-HD cohorts were notable exceptions.38,39 On tests focused on eye stimuli, 90% of studies with manifest individuals reported evidence of global emotion recognition impairments, while one also reported specific deficits for fear, disgust, and sadness. 76 Auditory studies, although sparser than facial ones, all showed consistent deficits for the recognition of disgust, fear, and anger, while happiness and sadness appeared less affected. Only preliminary evidence is currently available for deficits in emotion recognition from body language,24,47 visual and written vignettes,26,49,50 audioless videos, 51 multimodal video vignettes, 46 olfactory tasks based on odorants,25,26 and gustatory tasks based on foodstuff. 25

On the other hand, the evidence involving people with premanifest HD is currently much less consistent, with no significant differences from matched controls found on any composite or total scores of facial emotion recognition. When testing for specific emotions, impairments in the recognition of anger, fear, and disgust were reported in fewer than 40% of relevant studies, while only isolated investigations reported issues with sadness 75 and surprise. 54 Facial recognition of happiness and neutral expressions was preserved across all premanifest studies. Findings involving tasks based on eye expressions were mixed, with around half showing no overall impairments, including a large international online sample. 10 Studies exploring recognition across other sensory modalities in premanifest HD were severely lacking, with isolated contrasting findings available on auditory tasks,32,54 only a single norm-based report of multimodal deficits, 30 and no investigation available for olfactory or gustatory tasks.

Across both manifest and premanifest participants, older age, lower education or IQ, and greater motor impairments (as measured by the UHDRS-TMS) were linked to poorer emotion recognition, while gender and CAG repeat length showed weak or no associations. Associations with medications appear tentative, with some evidence suggesting that neuroleptics may lower recognition performance, while SSRIs may improve it. Only two investigations looked into associations of emotion recognition deficits across modalities, particularly visual and auditory, finding contrasting results.26,54 Cognitive abilities such as executive functioning and basic facial recognition skills were associated with emotion recognition only in some studies, showing an overall inconsistent pattern which warrants further investigation due to the wide range of cognitive issues pwHD may experience at all disease stages.6,89 Similarly, the exploration of associations between psychological difficulties and emotion recognition performance showed mixed results, with only sparse evidence available for anxiety, depression, and apathy. This again represents an issue which needs further attention, especially in light of the wide range of (often unrecognised) psychological difficulties experienced by people affected by HD90–92 and the current severe lack of psychological support available for this population.93,94 Finally, structural and functional neuroimaging studies consistently associated poorer emotion recognition with frontostriatal atrophy, white matter disruption, and hypoactivation of large cortico-subcortical brain networks in both manifest and, more subtly, premanifest participants. These findings also appear consistent with neurobiological evidence that loss or disruption of hypothalamic neuropeptides such as oxytocin and vasopressin in pwHD may be implicated in abnormal frontostriatal activation as well as reduced emotional processing and recognition.95,96

Overall, the findings from this review highlight that impairments of emotion recognition in pwHD likely follow a progressive pattern which may not always be equally apparent across disease stages. Before motor onset, most premanifest individuals are still able to recognise emotions effectively from faces and eyes, but a small subset may show early subtle difficulties with specific emotional cues, although these patterns have yet to be clearly characterised. Such difficulties appear to be linked with the early involvement of frontostriatal areas which are essential for social cognitive tasks, and are consistent with the other early cognitive difficulties which can often become evident before motor phenoconversion.7,20 At the manifest stage, deficits become significantly more apparent across visual (faces, eyes, body language) and auditory modalities, and may also present in olfactory, gustatory, or multimodal tasks. In contrast to some of the earlier HD literature, which suggested that emotion recognition impairments in affected individuals were selective for disgust, 97 it is now clearer that these difficulties predominantly affect all negative emotions, while relatively sparing the recognition of positive affect (e.g., happiness) and neutral stimuli. When compared with premanifest individuals, manifest pwHD are also likely to show significantly poorer emotion recognition performance across the board.

Implications for clinical practice

A number of important clinical implications can be drawn from this review. Since the evidence indicates that, similarly to other cognitive skills, emotion recognition abilities in HD may deteriorate early in the premanifest stage and become more pronounced as the disease evolves, routine assessments of facial emotion recognition (as well as of other modalities as further evidence accrues) may support the disease staging process and act as a potential cognitive biomarker of disease progression. This idea was already suggested by Henley and colleagues in their review 15 years ago 22 and now – with a body of evidence three times larger – it should receive renewed attention in HD clinical practice. With regard to facial recognition measures, Paul Ekman's classic Pictures of Facial Affect (POFA) 58 may be considered the first choice, particularly due to its reliability, its rich evidence base with pwHD and other conditions, and its availability through a variety of delivery methods – e.g., pencil and paper, or computerised in the form of the FEEST. 64

In addition, when performing medication reviews, HD clinicians should consider the potential effects of different pharmacological treatment options, such as those based on neuroleptics or SSRIs, on emotion recognition. This appears especially relevant considering the potential negative impact of emotion recognition deficits on social interactions and quality of life in HD. 22

Finally, evidence has shown that psychotherapy can play a pivotal role in improving emotional awareness as well as general social cognitive skills.98–100 Therefore, as further evidence accrues on the potential association between other neuropsychological difficulties (e.g., anxiety, depression, executive dysfunction) and emotion recognition, the development of targeted cognitive and behavioural interventions for pwHD should be prioritised with the aim of improving the social cognition and quality of life of affected individuals and their families.93,101

Methodological limitations and future directions

Although interest in the topic of emotion recognition in pwHD has clearly increased in the past few decades, several methodological limitations exist within the literature, which prevented the adoption of meta-analytic approaches and place caveats on the conclusions of this review. The issues highlighted previously 22 around power, heterogeneity of methods, small sample sizes, and poor reporting of key covariates (e.g., CAG repeats, IQ) still persist to an extent – albeit with some signs of improvement. For example, the TRACK-HD study now allows for the investigation of emotion recognition in large international cohorts of pwHD.

In future studies, significantly more focus is therefore needed on the inclusion of crucial methodological elements such as increased power, adjustment for multiple comparisons, consideration of data normality, and reporting of effect sizes. As the present article only included literature published in English, future reviews should also aim to include studies published in other languages.

Methodological heterogeneity may also help explain some of the conflicting findings observed across studies, particularly when investigating multiple emotions, as the way specific elements of affect or emotion are operationalised may vary significantly across measures. For instance, facial emotion recognition from front-facing images (e.g., the POFA) may be easier than recognition from angled pictures (e.g., the KDEF 61 or EHT 60 ). However, such differences are rarely accounted for when total scores are reported. Future studies should therefore aim to provide a more comprehensive characterisation of the subtler differences across measures which may have an impact on participants’ results.

In addition, further investigations are strongly warranted on emotion recognition based on visual stimuli other than faces and eyes (e.g., body language, visual vignettes), as well as on currently neglected sensory modalities – such as auditory, olfactory, gustatory, and multimodal tasks. This is even more relevant for individuals with premanifest HD, given the currently sparse nature of the literature at this stage. Such investigations should adopt some of the standardised measures which have shown preliminary significant results in this review (e.g., the verbal components of the FAB 65 ) rather than relying on custom tasks.

Finally, the further development of standardised assessment batteries across all sensory modalities, and their specific validation with pwHD, remains a high priority to reduce the impact of methodological heterogeneity, improve ecological validity, and shed new light on the different facets of emotion recognition impairments in this population.

Conclusions

In manifest HD, facial recognition of negative emotions such as anger, fear, disgust, and sadness is consistently impaired, whereas happiness and neutral expressions are generally spared. A small number of auditory studies show consistent deficits for disgust, fear, and anger, while happiness and sadness appear less affected. Only preliminary evidence is currently available for deficits involving body language, visual and written vignettes, videos, and olfactory and gustatory tasks. The evidence involving premanifest individuals is currently sparser; however, studies suggest that sporadic areas of significant emotion recognition weakness may develop in some people prior to the onset of motor symptoms, particularly due to early degeneration of frontostriatal pathways and disruption of white matter tracts. Clinicians should consider routine assessments of emotion recognition to aid staging and inform new targeted interventions. Future research should also focus on adopting more adequately powered, longitudinal designs using standardised and validated emotion recognition tests, with the potential to establish a gold standard for the assessment of emotion recognition in this population.

Supplemental Material

sj-docx-1-hun-10.1177_18796397251390252 - Supplemental material for Emotion recognition in people with Huntington's disease: A comprehensive systematic review

Supplemental material, sj-docx-1-hun-10.1177_18796397251390252 for Emotion recognition in people with Huntington's disease: A comprehensive systematic review by Nicolò Zarotti, Alice Storey, Sarah Lloyd, Laura Mesia Guevara, Helen Caswell, Cliff Chen and Jane Simpson in Journal of Huntington's Disease

Footnotes

ORCID iD: Nicolò Zarotti https://orcid.org/0000-0002-8129-6151

Authors’ contributions: All authors contributed to the study conception and design. Material preparation, data collection, and analysis were performed by all authors. The first draft of the manuscript was written by NZ and AS and all authors contributed to further versions of the manuscript. All authors read and approved the final manuscript.

Funding: The authors received no financial support for the research, authorship, and/or publication of this article.

The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Data availability: Datasets from the present work can be accessed from the authors upon request.

Supplemental material: Supplemental material for this article is available online.

References

  • 1.Walker FO. Huntington’s disease. Lancet 2007; 369: 218–228. [DOI] [PubMed] [Google Scholar]
  • 2.Novak MJU, Tabrizi SJ. Huntington’s disease. BMJ (Online) 2010; 340: 34–40. [DOI] [PubMed] [Google Scholar]
  • 3.Huntington’s Disease Collaborative Research Group . A novel gene containing a trinucleotide repeat that is expanded and unstable on Huntington’s disease chromosomes. Cell 1993; 72: 971–983. [DOI] [PubMed] [Google Scholar]
  • 4.Evans SJW, Douglas I, Rawlins MD, et al. Prevalence of adult Huntington’s disease in the UK based on diagnoses recorded in general practice records. J Neurol Neurosurg Psychiatry 2013; 84: 1156–1160. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5.Medina A, Mahjoub Y, Shaver L, et al. Prevalence and incidence of Huntington’s disease: an updated systematic review and meta-analysis. Mov Disord 2022; 37: 2327–2335. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Wahlin T, Byrne G. Cognition in Huntington’s disease. Huntington’s disease—Core concepts …, http://cdn.intechopen.com/pdfs-wm/28331.pdf (2012, accessed 12 July 2014).
  • 7.Corey-Bloom J, Williams ME, Beltran-Najera I, et al. Central cognitive processing speed is an early marker of Huntington’s disease onset. Mov Disord Clin Pract 2021; 8: 100–105. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Migliore S, D’Aurizio G, Maffi S, et al. Cognitive and behavioral associated changes in manifest Huntington disease: a retrospective cross-sectional study. Brain Behav 2021; 11: 1–10. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Migliore S, D’Aurizio G, Curcio G, et al. Task-switching abilities in pre-manifest Huntington’s disease subjects. Parkinsonism Relat Disord 2019; 60: 111–117. [DOI] [PubMed] [Google Scholar]
  • 10.Zarotti N, Simpson J, Fletcher I, et al. Exploring emotion regulation and emotion recognition in people with presymptomatic Huntington’s disease: the role of emotional awareness. Neuropsychologia 2018; 112: 1–9. [DOI] [PubMed] [Google Scholar]
  • 11.Hunefeldt T, Maffi S, Migliore S, et al. Emotion recognition and inhibitory control in manifest and pre-manifest Huntington’s disease: evidence from a new Stroop task. Neural Regen Res 2020; 15: 1518–1525. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Williams JK, Kim J-I, Downing N, et al. Everyday cognition in prodromal Huntington disease. Neuropsychology 2015; 29: 255–267. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.Bora E, Velakoulis D, Walterfang M. Social cognition in Huntington’s disease: a meta-analysis. Behav Brain Res 2015; 297: 131–140. [DOI] [PubMed] [Google Scholar]
  • 14.Ekman P. Facial expression and emotion. Am Psychologist 1993; 48: 384–392. [DOI] [PubMed] [Google Scholar]
  • 15.Ekman P. Expression and the nature of emotion. In: Scherer K and Ekman P (eds) Approaches Emotion. HIllsdale, NJ: Lawrence Erlbaum, 1984, pp. 319–344. [Google Scholar]
  • 16.Bates G, Tabrizi S, Jones L. Huntington’s Disease. New York: Oxford University Press, 2014. [Google Scholar]
  • 17.Rees EM, Farmer R, Cole JH, et al. Inconsistent emotion recognition deficits across stimulus modalities in Huntington’s disease. Neuropsychologia 2014; 64C: 99–104. [DOI] [PubMed] [Google Scholar]
  • 18.Robotham L, Sauter DA, Bachoud-Levi AC, et al. The impairment of emotion recognition in Huntington’s disease extends to positive emotions. Cortex 2011; 47: 880–884. [DOI] [PubMed] [Google Scholar]
  • 19.Escudero-Cabarcas J, Pineda-Alhucema W, Martinez-Banfi M, et al. Theory of mind in Huntington’s disease: a systematic review of 20 years of research. J Huntingtons Dis 2024; 13: 15–31. [DOI] [PubMed] [Google Scholar]
  • 20.Cavallo M, Sergi A, Pagani M. Cognitive and social cognition deficits in Huntington’s disease differ between the prodromal and the manifest stages of the condition: a scoping review of recent evidence. Br J Clin Psychol 2022; 61: 214–241. [DOI] [PubMed] [Google Scholar]
  • 21.Rizzo G, Martino D, Avanzino L, et al. Social cognition in hyperkinetic movement disorders: a systematic review. Soc Neurosci 2023; 18: 331–354. [DOI] [PubMed] [Google Scholar]
  • 22.Henley SMD, Novak MJU, Frost C, et al. Emotion recognition in Huntington’s disease: a systematic review. Neurosci Biobehav Rev 2012; 36: 237–253. [DOI] [PubMed] [Google Scholar]
  • 23.Aviezer H, Bentin S, Hassin RR, et al. Not on the face alone: perception of contextualized face expressions in Huntington’s disease. Brain 2009; 132: 1633–1644. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24.de Gelder B, Van den Stock J, Balaguer RD, et al. Huntington’s disease impairs recognition of angry and instrumental body language. Neuropsychologia 2008; 46: 369–373. [DOI] [PubMed] [Google Scholar]
  • 25.Mitchell IJ, Heims H, Neville EA, et al. Huntington’s disease patients show impaired perception of disgust in the gustatory and olfactory modalities. J Neuropsychiatry Clin Neurosci 2005; 17: 119–121. [DOI] [PubMed] [Google Scholar]
  • 26.Hayes CJ, Stevenson RJ, Coltheart M. Disgust and Huntington’s disease. Neuropsychologia 2007; 45: 1135–1151. [DOI] [PubMed] [Google Scholar]
  • 27.Page MJ, Moher D, Bossuyt PM, et al. PRISMA 2020 explanation and elaboration: updated guidance and exemplars for reporting systematic reviews. Br Med J 2021; 372: n160. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Monti G, Meletti S. Emotion recognition in temporal lobe epilepsy: a systematic review. Neurosci Biobehav Rev 2015; 55: 280–293. [DOI] [PubMed] [Google Scholar]
  • 29.Fusilli C, Migliore S, Mazza T, et al. Biological and clinical manifestations of juvenile Huntington’s disease: a retrospective analysis. Lancet Neurol 2018; 17: 986–993. [DOI] [PubMed] [Google Scholar]
  • 30.Burke T, Healy D, Casey P, et al. Atypical social cognitive processing in premotor Huntington’s disease: a single case study. Ir J Psychol Med 2017; 34: 53–58. [DOI] [PubMed] [Google Scholar]
  • 31.Caillaud M, Laisney M, Bejanin A, et al. Specific cognitive theory of mind and behavioral dysfunctions in early manifest Huntington disease: a case report. Neurocase 2020; 26: 36–41. [DOI] [PubMed] [Google Scholar]
  • 32.Sprengelmeyer R, Young AW, Sprengelmeyer A, et al. Recognition of facial expressions: selective impairment of specific emotions in Huntington’s disease. Cogn Neuropsychol 1997; 14: 839–879. [Google Scholar]
  • 33.Jacobs DH, Shuren J, Heilman KM. Impaired perception of facial identity and facial affect in Huntington’s disease. Neurology 1995; 45: 1217–1218. [DOI] [PubMed] [Google Scholar]
  • 34.Baez S, Herrera E, Gershanik O, et al. Impairments in negative emotion recognition and empathy for pain in Huntington’s disease families. Neuropsychologia 2015; 68: 158–167. [DOI] [PubMed] [Google Scholar]
  • 35.Wang K, Hoosain R, Yang R-MM, et al. Impairment of recognition of disgust in Chinese with Huntington’s or Wilson’s disease. Neuropsychologia 2003; 41: 527–537. [DOI] [PubMed] [Google Scholar]
  • 36.Yitzhak N, Gurevich T, Inbar N, et al. Recognition of emotion from subtle and non-stereotypical dynamic facial expressions in Huntington’s disease. Cortex 2020; 126: 343–354. [DOI] [PubMed] [Google Scholar]
  • 37.Johnson SA, Stout JC, Solomon AC, et al. Beyond disgust: impaired recognition of negative emotions prior to diagnosis in Huntington’s disease. Brain 2007; 130: 1732–1744. [DOI] [PubMed] [Google Scholar]
  • 38.Labuschagne I, Jones R, Callaghan J, et al. Emotional face recognition deficits and medication effects in pre-manifest through stage-II Huntington’s disease. Psychiatry Res 2013; 207: 118–126. [DOI] [PubMed] [Google Scholar]
  • 39.Osborne-Crowley K, Andrews SC, Labuschagne I, et al. Apathy associated with impaired recognition of happy facial expressions in Huntington’s disease. J Int Neuropsychol Soc 2019; 25: 453–461. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 40.Olivetti Belardinelli M, Huenefeldt T, Maffi S, et al. Effects of stimulus-related variables on mental states recognition in Huntington’s disease. Int J Neurosci 2019; 129: 563–572. [DOI] [PubMed] [Google Scholar]
  • 41.Unti E, Mazzucchi S, Frosini D, et al. Social cognition and oxytocin in Huntington’s disease: new insights. Brain Sci 2018; 8: 161. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 42.Bayliss L, Galvez V, Ochoa-Morales A, et al. Theory of mind impairment in Huntington’s disease patients and their relatives. Arq Neuropsiquiatr 2019; 77: 574–578. [DOI] [PubMed] [Google Scholar]
  • 43.Sprengelmeyer R, Young AW, Baldas EM, et al. The neuropsychology of first impressions: evidence from Huntington’s disease. Cortex 2016; 85: 100–115. [DOI] [PubMed] [Google Scholar]
  • 44.Milders M, Crawford JR, Lamb A, et al. Differential deficits in expression recognition in gene-carriers and patients with Huntington’s disease. Neuropsychologia 2003; 41: 1484–1492. [DOI] [PubMed] [Google Scholar]
  • 45.Caillaud M, Laisney M, Bejanin A, et al. Social cognition profile in early Huntington disease: insight from neuropsychological assessment and structural neuroimaging. J Huntingtons Dis 2024; 13: 467–477. [DOI] [PubMed] [Google Scholar]
  • 46.Philpott AL, Andrews SC, Staios M, et al. Emotion evaluation and social inference impairments in Huntington’s disease. J Huntingtons Dis 2016; 5: 175–183. [DOI] [PubMed] [Google Scholar]
  • 47.Zarotti N, Fletcher I, Simpson J. New perspectives on emotional processing in people with symptomatic Huntington’s disease: impaired emotion regulation and recognition of emotional body language. Arch Clin Neuropsychol 2019; 34: 610–624. [DOI] [PubMed] [Google Scholar]
  • 48.Hayes CJ, Stevenson RJ, Coltheart M. The processing of emotion in patients with Huntington’s disease: variability and differential deficits in disgust. Cogn Behav Neurol 2009; 22: 249–257. [DOI] [PubMed] [Google Scholar]
  • 49.Eddy CM, Mitchell IJ, Beck SR, et al. Altered subjective fear responses in Huntington’s disease. Parkinsonism Relat Disord 2011; 17: 386–389. [DOI] [PubMed] [Google Scholar]
  • 50.Calder AJ, Keane J, Young AW, et al. The relation between anger and different forms of disgust: implications for emotion recognition impairments in Huntington’s disease. Neuropsychologia 2010; 48: 2719–2729. [DOI] [PubMed] [Google Scholar]
  • 51.Larsen IU, Vinther-Jensen T, Gade A, et al. Do I misconstrue? Sarcasm detection, emotion recognition, and theory of mind in Huntington disease. Neuropsychology 2016; 30: 181–189. [DOI] [PubMed] [Google Scholar]
  • 52.Hendel RK, Hellem MNN, Larsen IU, et al. Impairments of social cognition significantly predict the progression of functional decline in Huntington’s disease: a 6-year follow-up study. Appl Neuropsychol Adult 2024; 31: 777–786. [DOI] [PubMed] [Google Scholar]
  • 53.Henley SMD, Wild EJ, Hobbs NZ, et al. Defective emotion recognition in early HD is neuropsychologically and anatomically generic. Neuropsychologia 2008; 46: 2152–2160. [DOI] [PubMed] [Google Scholar]
  • 54.Sprengelmeyer R, Schroeder U, Young AW, et al. Disgust in pre-clinical Huntington’s disease: a longitudinal study. Neuropsychologia 2006; 44: 518–533. [DOI] [PubMed] [Google Scholar]
  • 55.Sprengelmeyer R, Young AW, Calder AJ, et al. Loss of disgust perception of faces and emotions in Huntington’s disease. Brain 1996; 119: 1647–1665. [DOI] [PubMed] [Google Scholar]
  • 56.Eddy CM, Rickards HE. Theory of mind can be impaired prior to motor onset in Huntington’s disease. Neuropsychology 2015; 29: 792–798. [DOI] [PubMed] [Google Scholar]
  • 57.Gil-Polo C, Martinez-Horta SI, Sampedro Santalo F, et al. Association between insulin-like growth factor-1 and social cognition in Huntington’s disease. Mov Disord Clin Pract 2023; 10: 279–284. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 58.Ekman P, Friesen WV. Pictures of Facial Affect. Palo Alto: Consulting Psychologists Press, 1976. [Google Scholar]
  • 59.Baron-Cohen S, Wheelwright S, Hill J, et al. The ‘Reading the mind in the Eyes’ test revised version: a study with normal adults, and adults with Asperger syndrome or high-functioning autism. J Child Psychol Psychiatry 2001; 42: 241–251. [PubMed] [Google Scholar]
  • 60.Sprengelmeyer R, Rausch M, Eysel UT, et al. Neural structures associated with recognition of facial expressions of basic emotions. Proc R Soc B: Biol Sci 1998; 265: 1927–1931. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 61.Lundqvist D, Flykt A, Öhman A. The Karolinska Directed Emotional Faces (KDEF). Stockholm: Department of Clinical Neuroscience, Psychology Section, Karolinska Institutet, 1998. [Google Scholar]
  • 62.Trinkler I, Cleret de Langavant L, Bachoud-Lévi A-C. Joint recognition–expression impairment of facial emotions in Huntington’s disease despite intact understanding of feelings. Cortex 2013; 49: 549–558. [DOI] [PubMed] [Google Scholar]
  • 63.Trinkler I, Devignevielle S, Achaibou A, et al. Embodied emotion impairment in Huntington’s disease. Cortex 2017; 92: 44–56. [DOI] [PubMed] [Google Scholar]
  • 64.Young AW, Perrett DI, Calder AJ, et al. Facial expressions of emotion: Stimuli and tests (FEEST). Bury St Edmunds: Thames Valley Test Company, 2002. [Google Scholar]
  • 65.Bowers D, Blonder L, Heilman K. Florida affect battery, https://com-neurology.sites.medinfo.ufl.edu/files/2011/12/Florida-Affect-Battery-Manual.pdf (1998, accessed 13 June 2025).
  • 66.van der Schalk J, Hawk ST, Fischer AH, et al. Moving faces, looking places: validation of the Amsterdam dynamic facial expression set (ADFES). Emotion 2011; 11: 907–920. [DOI] [PubMed] [Google Scholar]
  • 67.Lang PJ, Bradley MM, Cuthbert BN. International Affective Picture System (IAPS): Technical Manual and Affective Ratings. 1997.
  • 68.Whittaker JF, Deakin JFW, Tomenson B. Face processing in schizophrenia: defining the deficit. Psychol Med 2001; 31: 499–507. [DOI] [PubMed] [Google Scholar]
  • 69.Thoma P, Soria Bauser D, Suchan B. BESST (Bochum Emotional Stimulus Set)-A pilot validation study of a stimulus set containing emotional bodies and faces from frontal and averted views. Psychiatry Res 2013; 209: 98–109. [DOI] [PubMed] [Google Scholar]
  • 70.Yitzhak N, Giladi N, Gurevich T, et al. Gently does it: humans outperform a software classifier in recognizing subtle, nonstereotypical facial expressions. Emotion 2017; 17: 1187–1198. [DOI] [PubMed] [Google Scholar]
  • 71.Montagne B, Kessels RPC, De Haan EHF, et al. The emotion recognition task: a paradigm to measure the perception of facial emotional expressions at different intensities. Percept Mot Skills 2007; 104: 589–598. [DOI] [PubMed] [Google Scholar]
  • 72.Holdnack JA, Drozdick LW. Social cognition. In: Holdnack J, Drozdick L. (eds) Advanced clinical solutions for WAIS-IV and WMS-IV: clinical and interpretive manual. San Antonio, TX: Pearson Clinical, 2009, pp.299–372. [Google Scholar]
  • 73.McDonald S, Flanagan S, Rollins J, et al. A new clinical tool for assessing social perception after traumatic brain injury. J Head Trauma Rehabil 2003; 18: 219–238. [DOI] [PubMed] [Google Scholar]
  • 74.Hendel RK, Hellem MNN, Hjermind LE, et al. On the association between apathy and deficits of social cognition and executive functions in Huntington’s disease. J Int Neuropsychol Soc 2023; 29: 369–376. [DOI] [PubMed] [Google Scholar]
  • 75.Rosas HD, Lewis L, Connors N, et al. Are you angry? Neural basis of impaired facial expression recognition in pre-manifest Huntington’s. Parkinsonism Relat Disord 2023; 109: 105289. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 76.Snowden JS, Austin NA, Sembi S, et al. Emotion recognition in Huntington’s disease and frontotemporal dementia. Neuropsychologia 2008; 46: 2638–2649. [DOI] [PubMed] [Google Scholar]
  • 77.Eddy CM, Rickards HE, Hansen PC. Through your eyes or mine? The neural correlates of mental state recognition in Huntington’s disease. Hum Brain Mapp 2018; 39: 1354–1366. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 78.Mason SL, Zhang J, Begeti F, et al. The role of the amygdala during emotional processing in Huntington’s disease: from pre-manifest to late stage disease. Neuropsychologia 2015; 70: 80–89. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 79.Croft RJ, McKernan F, Gray M, et al. Emotion perception and electrophysiological correlates in Huntington’s disease. Clin Neurophysiol 2014; 125: 1618–1625. [DOI] [PubMed] [Google Scholar]
  • 80.Ille R, Holl AK, Kapfhammer H-P, et al. Emotion recognition and experience in Huntington’s disease: is there a differential impairment? Psychiatry Res 2011; 188: 377–382. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 81.van Asselen M, Júlio F, Januário C, et al. Scanning patterns of faces do not explain impaired emotion recognition in Huntington disease: evidence for a high level mechanism. Front Psychol 2012; 3: 31. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 82.Huntington Study Group . Unified Huntington’s disease rating scale: reliability and consistency. Mov Disord 1996; 11: 136–142. [DOI] [PubMed] [Google Scholar]
  • 83.Eddy CM, Sira Mahalingappa S, Rickards HE. Is Huntington’s disease associated with deficits in theory of mind? Acta Neurol Scand 2012; 126: 376–383. [DOI] [PubMed] [Google Scholar]
  • 84.Gray JM, Young AW, Barker WA, et al. Impaired recognition of disgust in Huntington’s disease gene carriers. Brain 1997; 120: 2029–2038. [DOI] [PubMed]
  • 85.Ille R, Schäfer A, Scharmüller W, et al. Emotion recognition and experience in Huntington disease: a voxel-based morphometry study. J Psychiatry Neurosci 2011; 36: 383–390. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 86.Scharmüller W, Ille R, Schienle A, et al. Cerebellar contribution to anger recognition deficits in Huntington’s disease. Cerebellum 2013; 12: 819–825. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 87.Hennenlotter A, Schroeder U, Erhard P, et al. Neural correlates associated with impaired disgust processing in pre-symptomatic Huntington’s disease. Brain 2004; 127: 1446–1453. [DOI] [PubMed] [Google Scholar]
  • 88.Novak MJU, Warren JD, Henley SMD, et al. Altered brain mechanisms of emotion processing in pre-manifest Huntington’s disease. Brain 2012; 135: 1165–1179. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 89.Dumas EEM, van den Bogaard SJA, Middelkoop HAM, et al. A review of cognition in Huntington’s disease. Front Biosci (Schol Ed) 2013; 5: 1–18. [DOI] [PubMed] [Google Scholar]
  • 90.Zarotti N, Dale M, Eccles FJR, et al. More than just a brain disorder: a five-point manifesto for psychological care for people with Huntington’s disease. J Pers Med 2022; 12: 64. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 91.Dale M, Wood A, Zarotti N, et al. Using a clinical formulation to understand psychological distress in people affected by huntington’s disease: a descriptive, evidence-based model. J Pers Med 2022; 12: 1–19. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 92.Maffi S, Zarotti N, Scocchia M, et al. Childhood trauma and psychological distress during adulthood in children from Huntington’s disease families: an exploratory retrospective analysis. J Huntingtons Dis 2025; 14: 162–170. [DOI] [PubMed] [Google Scholar]
  • 93.Zarotti N, Dale M, Eccles F, et al. Psychological interventions for people with Huntington’s disease: a call to arms. J Huntingtons Dis 2020; 9: 231–243. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 94.Zarotti N, D’Alessio B, Scocchia M, et al. ‘I wouldn’t even know what to ask for’: patients’ and caregivers’ experiences of psychological support for Huntington’s disease in Italy. NeuroSci 2024; 5: 98–113. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 95.Hellem MNN, Cheong RY, Tonetto S, et al. Decreased CSF oxytocin relates to measures of social cognitive impairment in Huntington’s disease patients. Parkinsonism Relat Disord 2022; 99: 23–29. [DOI] [PubMed] [Google Scholar]
  • 96.Labuschagne I, Poudel G, Kordsachia C, et al. Oxytocin selectively modulates brain processing of disgust in Huntington’s disease gene carriers. Prog Neuropsychopharmacol Biol Psychiatry 2018; 81: 11–16. [DOI] [PubMed] [Google Scholar]
  • 97.Halligan PW. Inability to recognise disgust in Huntington’s disease. Lancet 1998; 351: 464. [DOI] [PubMed] [Google Scholar]
  • 98.Ajilchi B, Kisely S, Nejati V, et al. Effects of intensive short-term dynamic psychotherapy on social cognition in major depression. J Mental Health 2020; 29: 40–44. [DOI] [PubMed] [Google Scholar]
  • 99.Renner F, Cuijpers P, Huibers MJH. The effect of psychotherapy for depression on improvements in social functioning: a meta-analysis. Psychol Med 2014; 44: 2913–2926. [DOI] [PubMed] [Google Scholar]
  • 100.Lane RD, Subic-Wrana C, Greenberg L, et al. The role of enhanced emotional awareness in promoting change across psychotherapy modalities. J Psychother Integr 2022; 32: 131–150. [Google Scholar]
  • 101.Simpson J, Eccles F, Zarotti N. Extended evidence-based guidance on psychological interventions for psychological difficulties in individuals with Huntington's disease, Parkinson's disease, motor neurone disease, and multiple sclerosis. Zenodo 2021. DOI: https://doi.org/10.5281/zenodo.4593883. [Google Scholar]
  • 102.Allain P, Havet-Thomassin V, Verny C, et al. Evidence for deficits on different components of theory of mind in Huntington’s disease. Neuropsychology 2011; 25: 741–751. [DOI] [PubMed] [Google Scholar]
  • 103.Kipps CM, Duggins AJ, McCusker EA, et al. Disgust and happiness recognition correlate with anteroventral insula and amygdala volume respectively in preclinical Huntington’s disease. J Cogn Neurosci 2007; 19: 1206–1217. [DOI] [PubMed] [Google Scholar]
  • 104.Kordsachia CC, Labuschagne I, Andrews SC, et al. Diminished facial EMG responses to disgusting scenes and happy and fearful faces in Huntington’s disease. Cortex 2018; 106: 185–199. [DOI] [PubMed] [Google Scholar]
  • 105.Montagne B, Kessels RPC, Kammers MPM, et al. Perception of emotional facial expressions at different intensities in early-symptomatic Huntington’s disease. Eur Neurol 2006; 55: 151–154. [DOI] [PubMed] [Google Scholar]
  • 106.Olivetti Belardinelli M, Hünefeldt T, Meloni R, et al. Abnormal visual scanning and impaired mental state recognition in pre-manifest Huntington disease. Exp Brain Res 2021; 239: 141–150. [DOI] [PubMed] [Google Scholar]
  • 107.Rossetti MA, Anderson KM, Hay KR, et al. An exploratory pilot study of neuropsychological performance in two Huntington disease centers of excellence clinics. Arch Clin Neuropsychol 2024; 39: 24–34. [DOI] [PubMed] [Google Scholar]
  • 108.Saiz-Rodríguez M, Gil-Polo C, Diez-Fairen M, et al. Polymorphisms in the oxytocin receptor and their association with apathy and impaired social cognition in Huntington’s disease. Neurol Sci 2022; 43: 6079–6085. [DOI] [PubMed] [Google Scholar]
  • 109.Vogel A, Jørgensen K, Larsen IU. Normative data for Emotion Hexagon test and frequency of impairment in behavioral variant frontotemporal dementia, Alzheimer’s disease and Huntington’s disease. Appl Neuropsychol Adult 2022; 29: 127–132. [DOI] [PubMed] [Google Scholar]

Associated Data


Supplementary Materials

Supplemental material, sj-docx-1-hun-10.1177_18796397251390252 for "Emotion recognition in people with Huntington's disease: A comprehensive systematic review" by Nicolò Zarotti, Alice Storey, Sarah Lloyd, Laura Mesia Guevara, Helen Caswell, Cliff Chen and Jane Simpson in Journal of Huntington's Disease.


Articles from Journal of Huntington's Disease are provided here courtesy of SAGE Publications
