Author manuscript; available in PMC: 2019 Dec 27.
Published in final edited form as: J Intellect Disabil Res. 2016 Jul 26;60(10):993–1009. doi: 10.1111/jir.12319

Processing of stimulus content but not of emotional valence is altered in persons with Williams syndrome

A P Key 1, E M Dykens 2
PMCID: PMC6933513  NIHMSID: NIHMS1062711  PMID: 27457303

Abstract

Background

Individuals with Williams syndrome (WS) exhibit hypersociability and may respond atypically to emotional information in social and nonsocial stimuli. It is not yet clear whether these difficulties are specific to emotional content or stimulus type. This study examined the neural processes supporting social and emotional information processing in WS.

Method

Visual event-related potentials were recorded in 19 adults with WS and 10 typical peers during a picture-viewing task requiring detection of smiling faces among other social and nonsocial images with positive and negative emotional content.

Results

The participant groups were not significantly different in affective processing of positive and negative stimuli and perceived faces as different from nonsocial images. Participants with WS showed subtle differences in face-specific perceptual processes (e.g., face inversion, N170 lateralization), suggesting a more feature-based processing. They also demonstrated reduced attention and arousal modulation (P3, LPP) in response to faces vs. nonsocial images. These differences were independent of IQ.

Conclusions

There was no evidence of greater than typical perceptual, attentional, or affective processing of social information in WS. The results support the idea that altered face perception processes and not the increased salience of social stimuli or difficulties with emotion discrimination may contribute to the hypersocial phenotype in WS.

Keywords: attention, emotion, ERP, face, nonsocial, Williams syndrome


Williams syndrome (WS) is a rare disorder (1:7,500; Strømme et al., 2002) caused by a deletion of at least 24 genes on chromosome 7 (Hillier et al., 2003). It involves mild to moderate intellectual disabilities (Bellugi et al., 2000; Mervis & Klein-Tasman 2000; Martens et al., 2008) with deficits in visuo-spatial processing and relative strengths in verbal domains (Bellugi et al., 1999; Mervis et al., 2000; Karmiloff-Smith et al., 2003).

Prominent features of the WS behavioural phenotype include hypersociability (Bellugi et al., 2007; Fishman et al., 2011; Mervis & Klein-Tasman, 2000), interest in faces (Jones et al., 2000; Mervis et al., 2000), and prolonged visual attention towards the eyes (Jones et al., 2000; Mervis, 2003; Riby & Hancock, 2008). Individuals with WS are successful at face recognition (Tager-Flusberg et al., 2003; Mobbs et al., 2004; Karmiloff-Smith et al., 2004; Annaz et al., 2009), demonstrate enhanced empathy (Mervis & Klein-Tasman, 2000; Tager-Flusberg & Sullivan, 2000), and can identify mental states of others by looking at the eyes or whole faces (Riby & Back, 2010; Tager-Flusberg et al., 1998). However, while some reported intact facial expression recognition in face-matching tasks (Bellugi et al., 1994; Tager-Flusberg & Sullivan, 2000) and accurate happy/sad identification of schematic faces (Karmiloff-Smith et al., 1995), others noted IQ-related difficulties with verbal labeling of facial affect (Gagliardi et al., 2003; Lacroix et al., 2009; Plesa-Skwerer et al., 2006; Porter et al., 2007).

The hypersocial phenotype in WS is also associated with indiscriminate friendliness (Jarvinen et al., 2013) and a greater tendency to view strangers as approachable (Fishman et al., 2011; Jones et al., 2000; Martens et al., 2009), which could present safety concerns in daily life. Behavioural studies have proposed several explanations, including altered perceptual processing of faces (Jarvinen-Pasley et al., 2010), atypically increased salience of social stimuli (Frigerio et al., 2006), and difficulties inhibiting responses to social information (Porter et al., 2007), but no consensus has been reached.

The mechanisms underlying social-emotional functioning in WS have been probed using neuroimaging methods, with alterations noted in the size and connectivity of the amygdala (Reiss et al., 2004), the orbitofrontal cortex, and the frontostriatal circuit (Meyer-Lindenberg et al., 2005; Mobbs et al., 2007; Munoz et al., 2010). Compared to typical peers, persons with WS showed reduced amygdala activation in response to negative emotional faces (Haas et al., 2009; Mimura et al., 2010) but increased activation for happy vs. neutral expressions (Haas et al., 2009) and for nonsocial stimuli (Meyer-Lindenberg et al., 2005, but see Plesa-Skwerer et al., 2011). Persons with WS also demonstrated absent (Meyer-Lindenberg et al., 2005) or atypically reversed (Mimura et al., 2010) patterns of orbitofrontal cortical activity during facial emotion discrimination, suggesting a more positive perception of negative expressions.

Event-related potentials (ERP) may offer additional insights into social-emotional information processing in WS. Previous studies in typical populations have identified several ERP responses that differentiate between social and nonsocial stimuli as well as between emotional and neutral content. An occipito-temporal negative peak at 170 ms after stimulus onset (N170) is larger for faces than other visual stimuli (Bentin & Deouell, 2000; Jemel et al., 2003), and for inverted compared to upright faces (Bentin et al., 1996; Rossion et al., 2000). A preceding positive peak, P1, may also be sensitive to face vs. non-face differences (Batty & Taylor, 2003), although whether it is due to physical vs. social aspects is debatable (Rossion & Caharel, 2011). The effects of emotion on the N170 response are inconsistent, with some reporting increased amplitudes for negative (fear, disgust) compared to neutral faces (Batty & Taylor, 2003; Blau et al., 2007; Caharel et al., 2005), while others observed no effects (Dong & Lu, 2010; Eimer & Holmes, 2007). These discrepancies may be explained by differences in stimulus types or attentional demands (Blau et al., 2007).

Affective information processing in typical participants is reflected by a temporo-occipital early processing negativity (EPN) and by a centro-parietal late positive potential (LPP), which vary in amplitude between emotional and neutral stimuli (Hajcak et al., 2012; Olofsson et al., 2008; Schupp et al., 2000). The EPN appears in both active and passive tasks approximately 250 ms after stimulus onset, often as a relative negativity in response to emotional compared to neutral stimuli (Hajcak et al., 2012). It reflects enhanced attention to emotional content (Schupp et al., 2006; Foti et al., 2009) attributed to the amygdala's feedback to sensory cortices (Eimer & Holmes, 2007; Vuilleumier & Pourtois, 2007). Larger EPNs have been observed in anxious individuals (Muhlberger et al., 2009) and may also differentiate between emotions, with larger responses to positive than negative stimuli (Schupp et al., 2006; Weinberg & Hajcak, 2010).

The LPP response begins 300–500 ms after stimulus onset and is larger for both positive and negative compared to neutral stimuli (Olofsson et al., 2008; Schupp et al., 2000). It varies with the stimulus emotional intensity and/or arousal levels and reflects downstream effects of amygdala activation (Sabatinelli et al., 2005, 2007; Hajcak et al., 2010). The LPP overlaps spatially and temporally with the P3 response that indexes motivational properties of stimuli (Hajcak et al., 2012) and attentional and memory processes (Polich, 2007). However, the LPP is functionally distinct because it is elicited even in tasks that do not require explicit affective evaluation (Cacioppo et al., 1996; Hajcak et al., 2010). The anterior LPP differentiates valence: negative emotions elicit larger amplitudes over the right hemisphere while the reverse is observed over the left hemisphere (Cunningham et al., 2005).

In contrast to the general population, very few ERP studies have examined face or emotion processing in WS. Mills et al. (2000) noted that during matching of sequentially presented faces, persons with WS demonstrated a larger than typical N200 amplitude that correlated with better performance on the Benton Test of Facial Recognition. Face recognition responses in the 300–500 ms period were comparable to those of the typical group but delayed. However, the face-specific N170 was not examined, and no nonsocial stimuli were included, making it difficult to evaluate whether the observed results were face-specific. More recently, Key & Dykens (2015) reported larger than typical N170 responses to faces in young adults with WS while they passively viewed photographs depicting unfamiliar individuals and houses.

In a combined fMRI/ERP study of emotional face processing (Haas et al., 2009), persons with WS demonstrated greater than typical amygdala activation and a larger amplitude of the posterior positive response in the 300–500 ms window (consistent with the LPP) for happy vs. neutral facial expressions. They also differentiated fearful from neutral faces at anterior scalp locations within 200–280 ms after stimulus onset, while no such differences were observed in the typical group.

Thus, existing data suggest that persons with WS may generate exaggerated responses to social-emotional information, but it is not yet clear whether these differences are specific to emotion (positive vs. negative) and/or stimulus type (faces vs. nonsocial stimuli). The current study examined how emotional valence and stimulus content affect brain responses in persons with WS. We hypothesised that because of their heightened social interest, persons with WS may process faces (but not nonsocial stimuli) more extensively compared to the typical peers, manifested by the increased amplitude of P1/N170 responses at the perceptual stage, and/or by the greater P3 amplitude to faces vs. other stimuli suggesting increased attentional resource allocation. Furthermore, group differences in the processing of social and nonsocial emotional content could be evident in the amplitude of the EPN and LPP responses: if participants with WS are more sensitive to the emotional information, then they would generate larger amplitudes compared to the typical group.

Method

Participants

Nineteen adults with WS (12 males; M age=27.39, SD=11.91, range 19–59 years) and 10 typical adults (7 males; M age=26.95, SD=3.89, range 21–35 years) participated in the study. Three participants with WS were left-handed; the rest were right-handed (M LQ=.56±.68) as determined by the Edinburgh Handedness Inventory (Oldfield 1971). All typical participants were right-handed (M LQ=.76±.26). For participants with WS, the mean total IQ was 68.79 (SD=18.70) as assessed by the Kaufman Brief Intelligence Test-2 (Kaufman & Kaufman 2004). Typical participants were university students or employed in the community, with presumed intellectual functioning in the average range. All participants had normal or corrected-to-normal vision. Typical participants and parents or legal guardians of persons with WS provided written informed consent, and participants with WS provided written assent. The study was independently reviewed and approved by the university Institutional Review Board.

ERP task

Stimuli

Thirty-two color photographs of upright and inverted faces (Ekman & Matsumoto, 1993) and nonsocial objects (household objects, non-primate animals) served as the stimuli. One half of the images had positive affective value (e.g., a smiling face, a birthday cake), while the other half was negative (e.g., an angry face, a snake). The nonsocial stimuli depicted situations commonly eliciting positive (e.g., a present, a sunny beach) and negative (e.g., lightning, a graveyard) reactions in persons with and without developmental disabilities. Photographs were presented centered on a black computer screen and, from a viewing distance of 90 cm, subtended a visual angle of 8.91° (h) × 6.68° (w).
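For readers reconstructing the display geometry, the reported visual angle follows directly from stimulus size and viewing distance. A minimal sketch; the 14-cm image height used below is a back-computed assumption for illustration, not a value reported in the text:

```python
import math

def visual_angle_deg(size_cm: float, distance_cm: float) -> float:
    """Visual angle (degrees) subtended by a stimulus of a given size
    viewed from a given distance: 2 * atan(size / (2 * distance))."""
    return math.degrees(2 * math.atan((size_cm / 2) / distance_cm))

# A roughly 14-cm-tall image at the reported 90-cm viewing distance
# subtends approximately the reported 8.91 degrees of vertical angle.
angle = visual_angle_deg(14.0, 90.0)
```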

Electrodes

A 128-electrode geodesic sensor net (EGI, Inc., Eugene, OR) connected to a high-impedance amplifier was used to record ERPs. Electrode impedances were kept at or below 40 kOhm. Data were sampled at 250 Hz with filters set to 0.1–30 Hz. All electrodes were referenced to the vertex during recording and re-referenced offline to an average reference.
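The offline re-referencing step amounts to subtracting, at each time sample, the mean across all channels. A minimal numpy sketch, assuming a channels-by-samples array layout (the function name is ours):

```python
import numpy as np

def average_reference(eeg: np.ndarray) -> np.ndarray:
    """Re-reference vertex-referenced EEG to the average reference by
    subtracting the across-channel mean at every time sample.

    eeg: array of shape (n_channels, n_samples).
    """
    return eeg - eeg.mean(axis=0, keepdims=True)
```

After this transform the channel mean is zero at every sample, which is the defining property of the average reference.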

Procedure

Each combination of stimulus category (faces, inverted faces, objects) and emotional content (positive, negative) was presented equally often, in random order. Participants were asked to press one button on a hand-held response box for smiling faces and another button for all other stimuli. The specific button assignment was counterbalanced across participants. The smiling faces (upright and inverted) appeared on 48 of 144 trials (33%).

Each stimulus was presented for 1000 ms, and the response window extended for another 1000 ms. The intertrial interval was marked by a blank black screen and varied between 1200 and 1600 ms to prevent habituation and the development of trial-onset expectations. E-prime 2.0 (PST, Inc., Pittsburgh, PA) controlled stimulus presentation. The entire task included 144 trials and lasted ~10 minutes. During periods of inattention or motor activity, stimulus presentation was suspended until the participant was ready to continue.
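The trial structure above can be sketched as follows. The condition labels and helper name are illustrative, but the counts follow the description: 144 trials, six equiprobable category × emotion cells, and a 1200–1600 ms jittered intertrial interval:

```python
import random

# Illustrative labels for the 3 (category) x 2 (emotion) design
CONDITIONS = [
    ("face", "positive"), ("face", "negative"),
    ("inverted_face", "positive"), ("inverted_face", "negative"),
    ("nonsocial", "positive"), ("nonsocial", "negative"),
]

def build_trial_list(n_trials=144, seed=None):
    """Equiprobable conditions in random order, each paired with a
    jittered blank-screen intertrial interval of 1200-1600 ms."""
    per_cell = n_trials // len(CONDITIONS)  # 24 trials per cell
    trials = [cond for cond in CONDITIONS for _ in range(per_cell)]
    rng = random.Random(seed)
    rng.shuffle(trials)
    return [(cat, emo, rng.randint(1200, 1600)) for cat, emo in trials]
```

With this structure, smiling-face targets (upright and inverted positive faces) make up 48 of 144 trials, matching the reported 33% target rate.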

Data Analysis

Behavioural Data

Accuracy and reaction time (RT) data were collected for each stimulus condition and analysed using a mixed model ANOVA with Group (2: WS, TD) x Stimulus Category (3: face, inverted face, nonsocial images) x Emotion (2: positive, negative) factors and Huynh-Feldt correction.

ERP

The EEG was segmented at stimulus onset to include a 100-ms prestimulus baseline and an 800-ms post-stimulus interval. Trials containing ocular or movement artifacts were excluded using an automated screening algorithm in NetStation followed by a manual review. Data for individual electrodes with poor signal quality were reconstructed using spherical spline interpolation. Trials with more than 15% interpolated electrodes were discarded. Trial retention rates were comparable across conditions and groups (WS M=20.16±2.78; TD M=20.63±3.03 per condition).
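The segmentation step, together with the prestimulus baseline correction applied before averaging, can be sketched with numpy. The sampling rate and window lengths come from the text; the array layout and event-sample inputs are assumptions:

```python
import numpy as np

def epoch_and_baseline(eeg, onsets, sfreq=250.0, pre_ms=100, post_ms=800):
    """Cut stimulus-locked segments (100-ms prestimulus, 800-ms
    post-stimulus at 250 Hz, as described above) and subtract each
    epoch's mean prestimulus amplitude channel-wise.

    eeg: (n_channels, n_samples) array; onsets: stimulus-onset sample
    indices. Returns an (n_trials, n_channels, n_timepoints) array.
    """
    pre = int(pre_ms / 1000 * sfreq)    # 25 samples at 250 Hz
    post = int(post_ms / 1000 * sfreq)  # 200 samples
    epochs = np.stack([eeg[:, t - pre : t + post] for t in onsets])
    baseline = epochs[:, :, :pre].mean(axis=2, keepdims=True)
    return epochs - baseline
```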

Artifact-free ERPs were averaged and baseline-corrected using the 100-ms prestimulus interval. To reduce the number of analysed electrodes, only data averaged within a priori selected electrode clusters were used (see Figure 1). These were based on previous studies (e.g., Blau et al., 2007; Cunningham et al., 2005; Schupp et al., 2004) and corresponded to left and right occipito-temporal locations optimal for examining P1, face-specific N170, and emotion-sensitive EPN responses; central and parietal midline locations optimal for attention- and emotion-sensitive P3 and LPP responses; and left and right fronto-temporal regions where emotion-specific LPP differences have been reported.

Figure 1.


Electrode layout and clusters used in the analyses.

Next, mean amplitudes for P1 (100–150ms), N170 (140–190ms), EPN (200–300ms), P3 (300–500ms), and LPP (500–800ms) were obtained for each electrode cluster (see Table 2). These time windows were selected a priori based on published studies, including a previous application of the same experimental paradigm in individuals with other intellectual disabilities (e.g., Key et al., 2013). The resulting values were entered into a mixed ANOVA using Group (2: WS, TD) x Stimulus Category (3: face, inverted face, nonsocial stimulus) x Emotion (2: positive, negative) x Electrode (2: left/right occipito-temporal for P1, N170 and EPN; central/parietal for P3 and LPP; left/right fronto-temporal for anterior LPP) design and Huynh-Feldt corrections. Significant main effects and interactions were followed with post-hocs. False discovery rate (FDR; Benjamini & Hochberg, 1995) was used to control for multiple significance tests.
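The FDR correction cited above is the Benjamini-Hochberg step-up procedure; a self-contained sketch (the function name is ours):

```python
import numpy as np

def fdr_bh(pvals, alpha=0.05):
    """Benjamini-Hochberg (1995) step-up FDR procedure.

    Sort the p-values, find the largest k with p_(k) <= (k/m) * alpha,
    and reject the k hypotheses with the smallest p-values.
    Returns a boolean rejection mask in the original order.
    """
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    below = p[order] <= (np.arange(1, m + 1) / m) * alpha
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()
        reject[order[: k + 1]] = True
    return reject
```

Note the step-up logic: a p-value above its own threshold can still be rejected if a larger p-value falls below its threshold, which is what distinguishes FDR control from a simple per-test cutoff.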

Table 2.

Mean amplitudes (mean, SD) for the analysed peaks in participants with TD and WS.

TD Left (Mean, SD) TD Right (Mean, SD) WS Left (Mean, SD) WS Right (Mean, SD)
Occipito-temporal P1 Left Right Left Right
Smiling Face 1.625 2.580 1.601 2.199 1.189 1.109 1.453 1.781
Negative Face 1.472 2.226 1.980 1.997 0.846 1.581 1.327 1.986
Inverted Smiling Face 1.663 2.237 2.073 2.208 0.780 1.505 1.563 1.868
Inverted Negative Face 1.870 1.686 2.136 1.706 1.404 1.304 2.061 1.767
Positive Picture 0.709 2.242 1.237 1.971 0.884 1.211 1.229 1.921
Negative Picture 0.792 1.398 1.060 2.290 0.526 1.280 0.785 1.433
Occipito-temporal N170 Left Right Left Right
Smiling Face 0.425 3.243 −0.396 3.432 −1.808 2.294 −1.104 2.351
Negative Face 0.297 2.520 0.185 3.153 −2.024 2.736 −1.179 2.837
Inverted Smiling Face 0.144 3.179 −0.556 3.639 −1.504 2.634 −0.766 2.773
Inverted Negative Face 0.320 2.827 −0.604 3.355 −0.836 2.671 0.471 3.106
Positive Picture 1.185 2.035 1.796 2.118 −0.504 2.356 −0.292 2.285
Negative Picture 0.686 2.108 1.095 2.995 −0.902 2.766 −0.372 2.159
Occipito-temporal EPN Left Right Left Right
Smiling Face 2.188 2.100 4.073 1.184 5.363 3.399 7.388 4.346
Negative Face 2.206 1.796 4.188 1.976 5.058 2.569 7.004 3.892
Inverted Smiling Face 1.070 1.614 2.432 1.955 3.842 2.809 5.797 3.445
Inverted Negative Face 1.392 1.670 2.742 2.017 4.508 3.517 6.555 4.404
Positive Picture 3.065 2.800 4.496 2.223 5.372 2.884 6.884 4.004
Negative Picture 2.001 2.122 3.367 2.884 3.725 3.385 5.390 3.968
P3 Central Parietal Central Parietal
Smiling Face −0.126 2.210 3.399 1.991 0.486 2.655 1.840 2.297
Negative Face −0.457 1.696 3.142 1.423 0.154 2.382 1.478 2.331
Inverted Smiling Face 1.082 1.813 2.862 2.067 2.053 2.435 2.203 2.561
Inverted Negative Face 1.026 2.132 2.196 1.490 1.367 2.171 2.240 1.966
Positive Picture −0.568 1.958 2.352 2.163 0.245 2.310 2.837 2.790
Negative Picture −0.932 1.780 1.926 2.321 0.095 2.351 2.782 2.704
Central/Parietal LPP Central Parietal Central Parietal
Smiling Face 0.415 1.742 0.454 1.752 1.273 2.251 −1.019 2.535
Negative Face 0.570 1.656 0.509 1.706 1.348 1.723 −1.254 1.806
Inverted Smiling Face 1.431 1.782 0.078 2.745 2.251 1.725 −1.062 2.161
Inverted Negative Face 1.170 1.396 −0.907 2.230 2.094 1.885 −1.442 1.559
Positive Picture 0.249 1.333 −1.082 2.056 1.113 1.547 −1.014 2.291
Negative Picture 0.150 1.191 −0.132 1.887 1.573 1.555 −0.369 2.485
Fronto-Temporal LPP Left Right Left Right
Smiling Face −0.341 1.720 0.344 1.388 0.466 2.030 1.143 1.875
Negative Face −1.029 1.693 0.864 1.649 0.587 1.436 1.126 2.095
Inverted Smiling Face −0.296 2.609 0.593 2.109 0.383 1.695 0.927 1.901
Inverted Negative Face 0.156 2.612 0.745 1.705 0.836 2.441 0.819 2.575
Positive Picture 0.417 2.248 0.758 1.427 1.110 1.601 −0.032 1.882
Negative Picture −1.762 2.667 1.154 0.836 −0.440 2.027 1.038 1.974

Results

Behavioural performance

Analyses of the behavioural accuracy data revealed main effects of Group, F(1,27)=6.241, p=.019, partial η2=.188, and Emotion, F(1,27)=5.325, p=.029, partial η2=.165, but no interactions. Despite their overall high performance, individuals with WS were less accurate than the typical group. All participants were less accurate at identifying positive than negative images. Participants with WS with higher IQs identified inverted smiling faces (r=.533, p=.019) and negative pictures (r=.719, p=.001) more accurately.

For reaction time, main effects of Group, F(1,27)=12.879, p=.001, partial η2=.323, Stimulus, F(2,54)=8.968, p<.001, partial η2=.249, and Emotion, F(1,27)=10.465, p=.003, partial η2=.279, were significant, as was the Stimulus x Emotion x Group interaction, F(2,54)=6.282, p=.004, partial η2=.189. Participants with WS responded more slowly than the typical group to all stimuli, but their RTs did not correlate with IQ (p's=.138–.335). Within the WS group, smiling faces elicited faster responses than negative faces in both the upright and inverted conditions, t(18)=3.486, p=.003, d=.800 and t(18)=4.333, p<.001, d=.994, respectively. Responses to inverted negative faces were slower than to upright negative faces and to negative nonsocial stimuli, t(18)=3.426, p=.003, d=.786 and t(18)=4.020, p=.001, d=.922, respectively. The typical group responded more slowly to inverted smiling faces than to positive nonsocial pictures, t(9)=3.921, p=.004, d=1.240. The other contrasts were not significant after correction for multiple comparisons. See Table 1 for a summary of the behavioural results.
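The paired contrasts above report Cohen's d alongside t. For paired samples both can be derived from the per-participant difference scores; a minimal sketch using one common definition of paired d (mean difference over the SD of the differences — the article does not specify which formula was used, and the helper name is ours):

```python
import math
import statistics

def paired_stats(x, y):
    """Paired-samples t statistic and Cohen's d from difference scores:
    d = mean(diff) / sd(diff), and t = d * sqrt(n), where n is the
    number of paired observations."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    d = statistics.mean(diffs) / statistics.stdev(diffs)
    t = d * math.sqrt(n)
    return t, d
```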

Table 1.

Summary of behavioural performance on the picture classification task.

Stimulus Emotion WS (Mean SD) TD (Mean SD) Total (Mean SD) WS vs. TD p-value
Accuracy
Faces + 0.92 0.12 0.94 0.11 0.93 0.11 0.675
Faces − 0.95 0.07 0.98 0.03 0.96 0.06 0.317
Inverted Faces + 0.85 0.24 0.96 0.05 0.88 0.20 0.148
Inverted Faces − 0.94 0.11 0.99 0.02 0.96 0.09 0.168
Nonsocial images + 0.81 0.27 1.00 0.00 0.88 0.24 0.04
Nonsocial images − 0.90 0.13 0.99 0.02 0.93 0.11 0.028
Reaction Time (ms)
Faces + 730.68 99.15 606.70 85.70 687.93 110.80 0.002
Faces − 775.66 103.83 631.88 73.82 726.08 116.28 0.001
Inverted Faces + 773.74 155.67 648.74 99.06 730.63 149.63 0.03
Inverted Faces − 842.14 169.98 654.43 87.45 777.42 171.10 0.003
Nonsocial images + 785.93 148.96 572.70 75.33 712.40 163.48 <.001
Nonsocial images − 765.19 140.86 599.27 93.30 707.97 148.30 0.002

ERP findings

Occipito-temporal P1 (100–150ms)

The main effect of Stimulus, F(2,54)=7.683, p=.001, partial η2=.222, reflected more positive amplitudes for upright and inverted faces than nonsocial images, t(28)=2.224, p=.034, d=.413, and t(28)=4.042, p<.001, d=.751, respectively. There were no emotion-related effects for the P1 response.

Occipito-temporal N170 (140–190ms)

There was a main effect of Stimulus, F(2,54)=6.536, p=.003, partial η2=.195, and a Stimulus x Electrode x Group interaction, F(2,54)=8.905, p<.001, partial η2=.248. Post-hocs indicated no significant group differences in ERPs elicited by any particular stimulus condition, but the topographic distribution of stimulus discrimination effects varied between groups (Figure 2). The typical group demonstrated larger N170 amplitudes in response to upright and inverted faces compared to nonsocial stimuli over the right hemisphere, t(9)=3.097, p=.013, d=.979 and t(9)=3.159, p=.012, d=.999, respectively. Although prominent in waveform plots, the amplitude differences between upright vs. inverted faces did not reach significance. The WS group showed a larger left N170 to faces than nonsocial stimuli, t(18)=3.251, p=.004, d=.746, while larger right hemisphere responses were recorded for upright than inverted faces, t(18)=2.781, p=.012, d=.638. There were no emotion-related effects for the N170 response.

Figure 2.


Responses to faces vs. nonsocial stimuli at left and right occipito-temporal (P1, N170) locations in participants with TD and WS.

Occipito-temporal EPN (200–300ms)

There were main effects of Stimulus, F(2,54)=7.087, p=.002, partial η2=.208, Emotion, F(1,27)=4.892, p=.036, partial η2=.153, and Electrode, F(1,27)=18.981, p<.001, partial η2=.413, as well as a Stimulus x Emotion interaction, F(2,54)=9.126, p<.001, partial η2=.253. Post-hocs indicated that negative compared to positive nonsocial stimuli were associated with larger EPNs, t(28)=5.295, p<.001, d=.983. Additionally, inverted happy faces elicited more negative amplitudes than upright happy faces or positive nonsocial stimuli, t(28)=4.148, p<.001, d=.770 and t(28)=4.286, p<.001, d=.796, respectively.

Central/Parietal P3 (300–500ms)

There were main effects of Stimulus, F(2,54)=7.809, p=.001, partial η2=.224, and Electrode, F(1,27)=17.702, p<.001, partial η2=.396, as well as a Stimulus x Electrode interaction, F(2,54)=16.416, p<.001, partial η2=.378. Group differences were reflected in the interactions of Stimulus x Group, F(2,54)=4.686, p=.013, partial η2=.148, and Stimulus x Electrode x Group, F(2,54)=4.589, p=.014, partial η2=.145. Post-hoc analyses revealed that groups differed in the topography of stimulus discrimination effects but not the response to any particular stimulus (see Figure 3). The typical group generated larger parietal P3 amplitudes to faces than nonsocial stimuli, t(9)=2.722, p=.024, d=.861. At central locations, more positive amplitudes were observed for inverted than upright faces and nonsocial stimuli, t(9)=4.889, p=.001, d=1.546 and t(9)=4.691, p=.001, d=1.483, respectively. Participants with WS also evidenced larger central P3 amplitudes for inverted than upright faces and nonsocial stimuli, t(18)=5.075, p<.001, d=1.164 and t(18)=7.507, p<.001, d=1.722, respectively. However, they showed larger parietal P3 amplitudes for nonsocial stimuli than upright faces, t(18)=3.389, p=.003, d=.777. There were no emotion-related effects for the central/parietal P3 response.

Figure 3.


Central and parietal P3 responses to social (faces, inverted faces) vs. nonsocial stimuli in participants with TD and WS.

Central/Parietal LPP (500–800ms)

There was a main effect of Electrode, F(1,27)=21.429, p<.001, partial η2=.442, as well as Stimulus x Group, F(2,54)=3.548, p=.036, partial η2=.116, and Stimulus x Electrode, F(2,54)=12.125, p<.001, partial η2=.310, interactions. Post-hocs indicated that more positive amplitudes were recorded at central than parietal locations for all stimuli (faces: t(28)=4.110, p<.001, d=.763; inverted faces: t(28)=6.786, p<.001, d=1.260; nonsocial stimuli: t(28)=3.593, p=.001, d=.667). At central locations, larger amplitudes were recorded for inverted than upright faces, t(28)=5.408, p<.001, d=1.00, and nonsocial stimuli, t(28)=4.954, p<.001, d=.920. No significant stimulus-related differences were observed for the parietal cluster. In typical participants, upright and inverted faces elicited a larger LPP than nonsocial stimuli, t(9)=3.282, p=.009, d=1.038, and t(9)=2.476, p=.035, d=.783, respectively. Stimulus-related differences did not reach significance in participants with WS (p's=.076–.579).

Emotion-related effects were present in the form of a Stimulus x Emotion interaction, F(2,54)=5.124, p=.009, partial η2=.160. Post-hocs revealed larger LPP responses to negative than positive nonsocial stimuli, t(28)=2.626, p=.014, d=.488. Also, inverted happy faces elicited a larger LPP than positive nonsocial stimuli, t(28)=3.635, p=.001, d=.675.

Fronto-temporal LPP (500–800 ms)

There were Stimulus x Emotion, F(2,54)=3.334, p=.043, partial η2=.110, Emotion x Electrode, F(1,27)=11.765, p=.002, partial η2=.303, and Stimulus x Emotion x Electrode interactions, F(2,54)=11.711, p<.001, partial η2=.303. Post-hocs identified larger left LPP responses for positive than negative nonsocial stimuli, t(28)=4.602, p<.001, d=.855, while negative stimuli were associated with larger right LPP amplitudes, t(28)=3.564, p=.001, d=.662.

None of these ERP measures correlated with IQ in persons with WS.

Discussion

In an effort to explore the mechanisms underlying hypersocial phenotype in WS, this study examined whether persons with WS process content (social vs. nonsocial) and emotional valence (positive vs. negative) of visual stimuli differently from typical peers. Group differences were observed in the behavioural responses and in the ERP indices of stimulus content but not in emotional valence discrimination.

Despite their statistically lower accuracy and slower responses compared to typical peers, participants with WS identified smiling faces among social and nonsocial images very well (>90%), with the lowest accuracy scores observed for inverted smiling faces (85%) and positive nonsocial images (81%). While IQ correlated with a small subset of accuracy values (e.g., inverted smiling faces), it did not correlate with the RTs. Combined with the high response accuracy, we interpret the slower RTs in participants with WS as reflecting a speed-accuracy trade-off associated with the desire to comply with the task instructions and perform to the best of their ability.

There were no significant group differences at the earliest stage of perceptual processing. In both groups, faces elicited larger P1 amplitudes than nonsocial stimuli, consistent with greater attention allocation to social stimuli (Clark & Hillyard, 1996). Participants with TD and with WS also demonstrated larger N170 amplitudes to faces compared to nonsocial stimuli but differed in the lateralization of this response. Typical participants evidenced stimulus discrimination over the right hemisphere, consistent with prior reports of N170 hemispheric asymmetry (Rossion et al., 1999; Itier & Taylor, 2002). Conversely, in persons with WS, the same discrimination was observed over the left hemisphere. ERPs recorded over the right hemisphere generally reflect more global or holistic processing of stimuli, while the left hemisphere is associated with more feature-based processing (Deruelle & de Schonen, 1998; Scott & Nelson, 2006). This interpretation fits well with previous behavioural and eye-tracking studies reporting that persons with WS may process faces in a featural manner (e.g., Annaz et al., 2009; Deruelle et al., 1999; Karmiloff-Smith et al., 2004; Riby & Hancock, 2008, 2009). The lack of N170 enhancement for inverted faces in participants with WS provides additional evidence of altered face perception. In typical individuals, inverted compared to upright faces generate a larger N170 amplitude (Rossion et al., 1999; Eimer, 2000). Conversely, reduced N170 amplitudes for inverted faces were observed in response to more abstract Mooney faces or Arcimboldo paintings, where inversion made the stimuli appear less face-like (George et al., 2005; Caharel et al., 2013).

We interpret our N170 findings as extending previous behavioral results of Jarvinen-Pasley et al. (2010) suggesting that atypical face perception may contribute to the hypersocial phenotype in WS. A recent report of a correlation between the reduced N170 amplitude to inverted faces and increased severity of autism symptoms in adults with Prader-Willi syndrome (Key et al., 2013), a developmental disability characterised by social difficulties and IQ similar to that of WS, further supports the proposed connection between face perception and social behavior.

Responses to emotional content were first evident in the EPN (200–300ms) and were not significantly different between the groups. Larger posterior EPN amplitudes were observed for negative than positive nonsocial stimuli, and for inverted smiling faces than upright smiling faces or positive nonsocial stimuli. Although several studies report an enhanced EPN to negative vs. neutral stimuli (e.g., Eimer et al., 2003; Sato et al., 2001; Schupp et al., 2004), others note a larger EPN to positive stimuli when comparing positive and negative affective valence (Schupp et al., 2006; Weinberg & Hajcak, 2010). The reversed direction of valence-related differences in the present study could be due to the stimulus set, because the EPN response may reflect perceptual characteristics affecting selective attention in addition to emotional arousal (Schupp et al., 2006). Nevertheless, while further studies are needed, our findings suggest that, just like their typical peers, persons with WS were sensitive to differences in the emotional valence of the images.

Analysis of the P3 responses examined attentional processes involved in the task of detecting smiling faces among other social and nonsocial images. Contrary to expectations, there was no target-specific enhancement of the P3 amplitude in either group. In typical participants, all upright faces (smiling and non-smiling) compared to inverted faces or nonsocial stimuli elicited a larger parietal P3. At central sites, the P3 amplitude increase was observed for inverted faces compared to all other stimuli. Persons with WS demonstrated a similar increase in the more anterior P3 response to the inverted faces, but elicited a larger parietal P3 response to nonsocial stimuli rather than to upright faces.

The lack of P3 enhancement to smiling faces could be attributed to the targets not being sufficiently rare. Although upright and inverted smiling faces comprised approximately 33% of the trials, 50% of all stimuli were faces. Compared to nonsocial stimuli, faces were likely processed as potential targets (regardless of orientation or emotional expression), possibly due to their high motivational relevance as social stimuli. Also, multiple exemplars of smiling faces were used rather than a single target, which could necessitate attention to all faces in order to accurately identify smiling targets.

The finding of greater attention to nonsocial stimuli in WS, even when smiling faces were explicitly identified as the most task-relevant stimuli, is inconsistent with prior behavioural reports of increased interest in faces in individuals with WS (e.g., Riby et al., 2011; Plesa-Skwerer et al., 2009, 2011). However, it aligns well with existing neuroimaging evidence of greater amygdala activation to nonsocial scenes than to faces (Meyer-Lindenberg et al., 2005). The nonsocial stimuli used in the present study were selected to be particularly relevant to the interests and fears of persons with WS and thus could have attracted greater attention than faces. The lack of an enhanced parietal P3 for faces could also be due to the general attentional difficulties common in WS (e.g., Pober & Dykens, 1996; Greer et al., 1997). Our paradigm required participants to actively monitor a varied stimulus stream in order to detect the relatively infrequent targets. Thus, the lack of the expected increase in the parietal P3 to smiling faces could reflect insufficient attention or greater distractibility during the task. The reduced accuracy of behavioural responses in participants with WS lends further support to this interpretation.

The centro-parietal LPP response, indexing stimulus-related arousal (Hajcak et al., 2012), reflected group differences in the processing of stimulus content but not affective valence. In typical participants, upright faces were associated with greater arousal than inverted faces and nonsocial stimuli. The lack of similar responses in persons with WS is consistent with recent studies of autonomic nervous system function indicating reduced physiological reactivity to faces in WS (Capitão et al., 2011; Doherty-Sneddon et al., 2009; Plesa-Skwerer et al., 2009).

Emotion-related effects were reflected in increased centro-parietal LPP amplitudes for negative vs. positive nonsocial stimuli and for inverted smiling faces vs. other positive stimuli. While prior studies (e.g., Hajcak & Dennis, 2009) did not observe valence-specific modulation of the centro-parietal LPP amplitude using stimuli from the IAPS set (Lang, Bradley, & Cuthbert, 2005), it is possible that our negative nonsocial images were more arousing to the participants than the positive ones. In contrast, the fronto-temporal LPP responses demonstrated the expected lateralised valence discrimination effects (Cunningham et al., 2005), with larger amplitudes for positive than negative nonsocial stimuli over the left hemisphere and the opposite pattern over the right hemisphere. The likely reason for the emotion discrimination effects being observed primarily for the nonsocial stimuli is their greater visual complexity compared to faces (Britton et al., 2006).

In sum, our ERP data did not support the hypothesis of increased perceptual and/or attentional processing of social information in WS. Participants with WS perceived faces as categorically different from nonsocial images but, in contrast to typical controls, displayed a bias toward featural rather than configural processing and demonstrated reduced attention and arousal in response to faces compared to nonsocial images. Our data also provided no evidence of greater than typical processing of affective content in WS, as the groups did not differ significantly in their responses to positive and negative emotional stimuli.

Limitations

While many results from this study replicate previous findings in typical populations and expand them to a group with developmental disabilities, this work also has several limitations. It is possible that our stimuli were not optimal for eliciting the selected ERP responses. In contrast to the face stimuli, the nonsocial images were not obtained from a standardised set. Instead, they were selected to represent content likely to elicit positive and negative affective reactions in persons with WS. Nevertheless, the presence of N170 amplitude differences for faces vs. nonsocial stimuli in both participant groups indicates that the two stimulus categories were perceived as perceptually and categorically distinct. Furthermore, the emotion-related modulation of the EPN and LPP amplitudes suggests that our nonsocial stimuli did elicit the expected emotional reactions.

Another potential limitation is the unequal sample sizes for the TD and WS groups. ERPs averaged across a smaller number of samples tend to be larger (Thomas et al., 2004), but the lack of significant group differences in ERP responses to specific stimulus conditions suggests minimal bias due to sample size. Finally, the study did not include informant or other behavioural measures of inattention, which may have shed light on relations between ERP results and difficulties sustaining or shifting attention in WS.

Conclusions

The study demonstrates that persons with WS process the emotional content (positive vs. negative) of visual stimuli similarly to their typical peers and perceptually differentiate social and nonsocial images, but show specific alterations in face processing. Brain responses of participants with WS suggested feature-based perceptual processing and did not demonstrate the anticipated increase in attention or arousal in response to faces compared to nonsocial stimuli, even though the salience of faces was emphasised by the task instructions. These findings contradict the behavioural reports of increased interest in faces and support the idea that altered face perception, and not the increased arousal or salience of social stimuli, may contribute to the hypersocial phenotype in WS. Future studies need to examine in greater detail the association between neural mechanisms of face processing and social strengths and weaknesses in persons with WS.

Figure 4.

Emotion-sensitive EPN (occipito-temporal) and LPP (centro-parietal and fronto-temporal) responses to positive and negative nonsocial stimuli in participants with TD and WS.

Acknowledgements

This research was supported by National Institute of Child Health and Development Grant P30HD15052 and U54HD083211 to the Vanderbilt Kennedy Center for Research on Human Development, and by a Vanderbilt University Discovery grant (EMD). The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH. We would like to thank the participants and their families for their support of the study. We are grateful to Elizabeth Roof, MA, Senior Research Scientist, for assistance in coordinating research participants and Susan M. Williams for help with ERP acquisition and processing.

References

  1. Annaz D, Karmiloff-Smith A, Johnson MH, & Thomas MSC (2009). A cross-syndrome study of the development of holistic face recognition in children with autism, Down syndrome, and Williams syndrome. Journal of Experimental Child Psychology, 102(4), 456–486.
  2. Batty M, & Taylor MJ (2003). Early processing of the six basic facial emotional expressions. Brain Research. Cognitive Brain Research, 17(3), 613–620. doi: 10.1016/S0926-6410(03)00174-5
  3. Bellugi U, Wang P, & Jernigan T (1994). Williams syndrome: An unusual neuropsychological profile. In Browman S & Grafman J (Eds.), Atypical Cognitive Deficits in Developmental Disorders: Implications for Brain Function. Hillsdale, NJ: Lawrence Erlbaum.
  4. Bellugi U, Järvinen-Pasley A, Doyle TF, Reilly J, Reiss AL, & Korenberg JR (2007). Affect, social behavior, and the brain in Williams syndrome. Current Directions in Psychological Science, 16(2), 99–104.
  5. Bellugi U, Lichtenberger L, Jones W, Lai Z, & St George M (2000). I. The neurocognitive profile of Williams syndrome: A complex pattern of strengths and weaknesses. Journal of Cognitive Neuroscience, 12(Suppl 1), 7–29.
  6. Bellugi U, Lichtenberger L, Mills D, Galaburda A, & Korenberg JR (1999). Bridging cognition, the brain and molecular genetics: Evidence from Williams syndrome. Trends in Neurosciences, 22(5), 197–207.
  7. Benjamini Y, & Hochberg Y (1995). Controlling the false discovery rate: A practical and powerful approach to multiple testing. Journal of the Royal Statistical Society. Series B (Methodological), 57(1), 289–300.
  8. Bentin S, & Deouell LY (2000). Structural encoding and identification in face processing: ERP evidence for separate mechanisms. Cognitive Neuropsychology, 17(1–3), 35–55.
  9. Bentin S, Allison T, Puce A, & Perez E (1996). Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience, 8(6), 551–565.
  10. Blau VC, Maurer U, Tottenham N, & McCandliss BD (2007). The face-specific N170 component is modulated by emotional facial expression. Behavioral and Brain Functions, 3, 7.
  11. Britton JC, Phan KL, Taylor SF, Welsh RC, Berridge KC, & Liberzon I (2006). Neural correlates of social and nonsocial emotions: An fMRI study. NeuroImage, 31(1), 397–409. doi: 10.1016/j.neuroimage.2005.11.027
  12. Cacioppo JT, Crites SL, & Gardner WL (1996). Attitudes to the right: Evaluative processing is associated with lateralized late positive event-related brain potentials. Personality and Social Psychology Bulletin, 22(12), 1205–1219. doi: 10.1177/01461672962212002
  13. Caharel S, Courtay N, Bernard C, Lalonde R, & Rebaï M (2005). Familiarity and emotional expression influence an early stage of face processing: An electrophysiological study. Brain and Cognition, 59(1), 96–100. doi: 10.1016/j.bandc.2005.05.005
  14. Capitão L, Sampaio A, Férnandez M, Sousa N, Pinheiro A, & Gonçalves ÓF (2011). Williams syndrome hypersociability: A neuropsychological study of the amygdala and prefrontal cortex hypotheses. Research in Developmental Disabilities, 32(3), 1169–1179.
  15. Cashon CH, Ha O-R, DeNicola CA, & Mervis CB (2013). Toddlers with Williams syndrome process upright but not inverted faces holistically. Journal of Autism and Developmental Disorders. doi: 10.1007/s10803-013-1804-0
  16. Clark VP, & Hillyard SA (1996). Spatial selective attention affects early extrastriate but not striate components of the visual evoked potential. Journal of Cognitive Neuroscience, 8(5), 387–402.
  17. Cunningham WA, Espinet SD, DeYoung CG, & Zelazo PD (2005). Attitudes to the right- and left: Frontal ERP asymmetries associated with stimulus valence and processing goals. NeuroImage, 28(4), 827–834. doi: 10.1016/j.neuroimage.2005.04.044
  18. Deruelle C, & De Schonen S (1998). Do the right and left hemispheres attend to the same visuospatial information within a face in infancy? Developmental Neuropsychology, 14(4), 535–554. doi: 10.1080/87565649809540727
  19. Deruelle C, Mancini J, Livet M, Casse-Perrot C, & de Schonen S (1999). Configural and local processing of faces in children with Williams syndrome. Brain and Cognition, 41(3), 276–298.
  20. Doherty-Sneddon G, Riby DM, Calderwood L, & Ainsworth L (2009). Stuck on you: Face-to-face arousal and gaze aversion in Williams syndrome. Cognitive Neuropsychiatry, 14(6), 510–523. doi: 10.1080/13546800903043336
  21. Dong G, & Lu S (2010). The relation of expression recognition and affective experience in facial expression processing: An event-related potential study. Psychology Research and Behavior Management, 3, 65–74.
  22. Dykens EM (2003). Anxiety, fears, and phobias in persons with Williams syndrome. Developmental Neuropsychology, 23(1–2), 291–316. doi: 10.1080/87565641.2003.9651896
  23. Eimer M (2000). Effects of face inversion on the structural encoding and recognition of faces: Evidence from event-related brain potentials. Brain Research. Cognitive Brain Research, 10(1–2), 145–158.
  24. Eimer M, & Holmes A (2007). Event-related brain potential correlates of emotional face processing. Neuropsychologia, 45(1), 15–31. doi: 10.1016/j.neuropsychologia.2006.04.022
  25. Eimer M, Holmes A, & McGlone FP (2003). The role of spatial attention in the processing of facial expression: An ERP study of rapid brain responses to six basic emotions. Cognitive, Affective & Behavioral Neuroscience, 3(2), 97–110.
  26. Ekman P, & Matsumoto D (1993). Japanese and Caucasian Facial Expressions of Emotion (JACFEE). Palo Alto, CA: Consulting Psychologists Press.
  27. Fishman I, Yam A, Bellugi U, & Mills D (2011). Language and sociability: Insights from Williams syndrome. Journal of Neurodevelopmental Disorders, 3(3), 185–192. doi: 10.1007/s11689-011-9086-3
  28. Foti D, Hajcak G, & Dien J (2009). Differentiating neural responses to emotional pictures: Evidence from temporal-spatial PCA. Psychophysiology, 46(3), 521–530. doi: 10.1111/j.1469-8986.2009.00796.x
  29. Frigerio E, Burt DM, Gagliardi C, Cioffi G, Martelli S, Perrett DI, & Borgatti R (2006). Is everybody always my friend? Perception of approachability in Williams syndrome. Neuropsychologia, 44(2), 254–259.
  30. Gagliardi C, Frigerio E, Burt DM, Cazzaniga I, Perrett DI, & Borgatti R (2003). Facial expression recognition in Williams syndrome. Neuropsychologia, 41(6), 733–738.
  31. Haas BW, Mills D, Yam A, Hoeft F, Bellugi U, & Reiss A (2009). Genetic influences on sociability: Heightened amygdala reactivity and event-related responses to positive social stimuli in Williams syndrome. Journal of Neuroscience, 29(4), 1132–1139. doi: 10.1523/JNEUROSCI.5324-08.2009
  32. Hajcak G, & Dennis TA (2009). Brain potentials during affective picture processing in children. Biological Psychology, 80(3), 333–338. doi: 10.1016/j.biopsycho.2008.11.006
  33. Hajcak G, MacNamara A, & Olvet DM (2010). Event-related potentials, emotion, and emotion regulation: An integrative review. Developmental Neuropsychology, 35(2), 129–155. doi: 10.1080/87565640903526504
  34. Hajcak G, Weinberg A, MacNamara A, & Foti D (2012). ERPs and the study of emotion. In Luck SJ & Kappenman ES (Eds.), Oxford Handbook of ERP Components (pp. 441–474). New York: Oxford University Press.
  35. Hillier LW, Fulton RS, Fulton LA, Graves TA, Pepin KH, Wagner-McPherson C, et al. (2003). The DNA sequence of human chromosome 7. Nature, 424(6945), 157–164. doi: 10.1038/nature01782
  36. Itier RJ, & Taylor MJ (2002). Inversion and contrast polarity reversal affect both encoding and recognition processes of unfamiliar faces: A repetition study using ERPs. NeuroImage, 15(2), 353–372. doi: 10.1006/nimg.2001.0982
  37. Järvinen-Pasley A, Adolphs R, Yam A, Hill KJ, Grichanik M, Reilly J, Mills D, Reiss A, Korenberg J, & Bellugi U (2010). Affiliative behavior in Williams syndrome: Social perception and real-life social behavior. Neuropsychologia, 48(7), 2110–2119.
  38. Järvinen-Pasley A, Bellugi U, Reilly J, Mills DL, Galaburda A, Reiss AL, & Korenberg JR (2008). Defining the social phenotype in Williams syndrome: A model for linking gene, the brain, and behavior. Development and Psychopathology, 20(1). doi: 10.1017/S0954579408000011
  39. Järvinen A, Korenberg JR, & Bellugi U (2013). The social phenotype of Williams syndrome. Current Opinion in Neurobiology, 23(3), 414–422.
  40. Jemel B, Pisani M, Calabria M, Crommelinck M, & Bruyer R (2003). Is the N170 for faces cognitively penetrable? Evidence from repetition priming of Mooney faces of familiar and unfamiliar persons. Brain Research. Cognitive Brain Research, 17(2), 431–446.
  41. Jones W, Bellugi U, Lai Z, Chiles M, Reilly J, Lincoln A, & Adolphs R (2000). II. Hypersociability in Williams syndrome. Journal of Cognitive Neuroscience, 12(Suppl 1), 30–46.
  42. Karmiloff-Smith A, Brown JH, Grice S, & Paterson S (2003). Dethroning the myth: Cognitive dissociations and innate modularity in Williams syndrome. Developmental Neuropsychology, 23(1–2), 227–242. doi: 10.1080/87565641.2003.9651893
  43. Karmiloff-Smith A, Klima E, Bellugi U, & Grant J (1995). Is there a social module? Language, face processing, and theory of mind in individuals with Williams syndrome. Journal of Cognitive Neuroscience, 7(2), 196–208.
  44. Karmiloff-Smith A, Thomas M, Annaz D, Humphreys K, Ewing S, Brace N, et al. (2004). Exploring the Williams syndrome face-processing debate: The importance of building developmental trajectories. Journal of Child Psychology and Psychiatry, 45(7), 1258–1274.
  45. Kaufman A, & Kaufman N (2004). Kaufman Brief Intelligence Test, Second Edition. Circle Pines, MN: AGS Publishing.
  46. Key A, & Dykens E (2015). Face repetition detection and social interest: An ERP study in adults with and without Williams syndrome. Social Neuroscience.
  47. Key AP, Jones D, & Dykens EM (2013). Social and emotional processing in Prader-Willi syndrome: Genetic subtype differences. Journal of Neurodevelopmental Disorders, 5(1), 7. doi: 10.1186/1866-1955-5-7
  48. Lacroix A, Guidetti M, Rogé B, & Reilly J (2009). Recognition of emotional and nonemotional facial expressions: A comparison between Williams syndrome and autism. Research in Developmental Disabilities, 30(5), 976–985. doi: 10.1016/j.ridd.2009.02.002
  49. Lang PJ, Bradley MM, & Cuthbert BN (2005). International Affective Picture System (IAPS): Affective ratings of pictures and instruction manual. Technical Report A-6. Gainesville, FL: University of Florida.
  50. Martens MA, Wilson SJ, & Reutens DC (2008). Williams syndrome: A critical review of the cognitive, behavioral, and neuroanatomical phenotype. Journal of Child Psychology and Psychiatry, 49(6), 576–608. doi: 10.1111/j.1469-7610.2008.01887.x
  51. Martens MA, Wilson SJ, Dudgeon P, & Reutens DC (2009). Approachability and the amygdala: Insights from Williams syndrome. Neuropsychologia, 47(12), 2446–2453.
  52. Mervis CB (2003). Williams syndrome: 15 years of psychological research. Developmental Neuropsychology, 23(1–2), 1–12. doi: 10.1080/87565641.2003.9651884
  53. Mervis CB, & Klein-Tasman BP (2000). Williams syndrome: Cognition, personality, and adaptive behavior. Mental Retardation and Developmental Disabilities Research Reviews, 6(2), 148–158.
  54. Mervis CB, Robinson BF, Bertrand J, Morris CA, Klein-Tasman BP, & Armstrong SC (2000). The Williams syndrome cognitive profile. Brain and Cognition, 44(3), 604–628. doi: 10.1006/brcg.2000.1232
  55. Meyer-Lindenberg A, Hariri AR, Muñoz KE, Mervis CB, Mattay VS, Morris CA, & Berman KF (2005). Neural correlates of genetically abnormal social cognition in Williams syndrome. Nature Neuroscience, 8(8), 991–993. doi: 10.1038/nn1494
  56. Mills DL, Alvarez TD, St George M, Appelbaum LG, Bellugi U, & Neville H (2000). III. Electrophysiological studies of face processing in Williams syndrome. Journal of Cognitive Neuroscience, 12(Suppl 1), 47–64.
  57. Mimura M, Hoeft F, Kato M, Kobayashi N, Sheau K, Piggot J, et al. (2010). A preliminary study of orbitofrontal activation and hypersociability in Williams syndrome. Journal of Neurodevelopmental Disorders, 2(2), 93–98. doi: 10.1007/s11689-009-9041-8
  58. Mobbs D, Eckert MA, Menon V, Mills D, Korenberg J, Galaburda AM, et al. (2007). Reduced parietal and visual cortical activation during global processing in Williams syndrome. Developmental Medicine & Child Neurology, 49(6), 433–438. doi: 10.1111/j.1469-8749.2007.00433.x
  59. Mobbs D, Garrett A, Menon V, Rose F, Bellugi U, & Reiss AL (2004). Anomalous brain activation during face and gaze processing in Williams syndrome. Neurology, 62, 2070–2076.
  60. Mühlberger A, Wieser MJ, Herrmann MJ, Weyers P, Tröger C, & Pauli P (2009). Early cortical processing of natural and artificial emotional faces differs between lower and higher socially anxious persons. Journal of Neural Transmission, 116(6), 735–746. doi: 10.1007/s00702-008-0108-6
  61. Muñoz KE, Meyer-Lindenberg A, Hariri AR, Mervis CB, Mattay VS, Morris CA, & Berman KF (2010). Abnormalities in neural processing of emotional stimuli in Williams syndrome vary according to social vs. non-social content. NeuroImage, 50(1), 340–346. doi: 10.1016/j.neuroimage.2009.11.069
  62. Oldfield RC (1971). The assessment and analysis of handedness: The Edinburgh inventory. Neuropsychologia, 9, 97–113.
  63. Olofsson JK, Nordin S, Sequeira H, & Polich J (2008). Affective picture processing: An integrative review of ERP findings. Biological Psychology, 77(3), 247–265. doi: 10.1016/j.biopsycho.2007.11.006
  64. Plesa-Skwerer D, Ammerman E, André M-C, Ciciolla L, Fine AB, & Tager-Flusberg H (2011). A multimeasure approach to investigating affective appraisal of social information in Williams syndrome. Journal of Neurodevelopmental Disorders, 3(4), 325–334. doi: 10.1007/s11689-011-9100-9
  65. Plesa-Skwerer D, Borum L, Verbalis A, Schofield C, Crawford N, Ciciolla L, & Tager-Flusberg H (2009). Autonomic responses to dynamic displays of facial expressions in adolescents and adults with Williams syndrome. Social Cognitive and Affective Neuroscience, 4(1), 93–100. doi: 10.1093/scan/nsn041
  66. Plesa-Skwerer D, Verbalis A, Schofield C, Faja S, & Tager-Flusberg H (2006). Social-perceptual abilities in adolescents and adults with Williams syndrome. Cognitive Neuropsychology, 23(2), 338–349. doi: 10.1080/02643290542000076
  67. Polich J (2007). Updating P300: An integrative theory of P3a and P3b. Clinical Neurophysiology, 118(10), 2128–2148. doi: 10.1016/j.clinph.2007.04.019
  68. Pober BR, & Dykens EM (1996). Williams syndrome: An overview of medical, cognitive, and behavioral features. Child and Adolescent Psychiatric Clinics of North America, 5(4), 929–943.
  69. Porter MA, Coltheart M, & Langdon R (2007). The neuropsychological basis of hypersociability in Williams and Down syndrome. Neuropsychologia, 45(12), 2839–2849. doi: 10.1016/j.neuropsychologia.2007.05.006
  70. Reiss AL (2004). An experiment of nature: Brain anatomy parallels cognition and behavior in Williams syndrome. Journal of Neuroscience, 24(21), 5009–5015. doi: 10.1523/JNEUROSCI.5272-03.2004
  71. Riby DM, & Back E (2010). Can individuals with Williams syndrome interpret mental states from moving faces? Neuropsychologia, 48(7), 1914–1922. doi: 10.1016/j.neuropsychologia.2010.03.010
  72. Riby DM, & Hancock PJB (2008). Viewing it differently: Social scene perception in Williams syndrome and autism. Neuropsychologia, 46(11), 2855–2860. doi: 10.1016/j.neuropsychologia.2008.05.003
  73. Riby DM, & Hancock PJB (2009). Do faces capture the attention of individuals with Williams syndrome or autism? Evidence from tracking eye movements. Journal of Autism and Developmental Disorders, 39(3), 421–431.
  74. Riby DM, Hanley M, Kirk H, Clark F, Little K, Fleck R, et al. (2013). The interplay between anxiety and social functioning in Williams syndrome. Journal of Autism and Developmental Disorders, 44(5), 1220–1229. doi: 10.1007/s10803-013-1984-7
  75. Rossion B, Delvenne JF, Debatisse D, Goffaux V, Bruyer R, Crommelinck M, & Guérit JM (1999). Spatio-temporal localization of the face inversion effect: An event-related potentials study. Biological Psychology, 50(3), 173–189.
  76. Rossion B, & Caharel S (2011). ERP evidence for the speed of face categorization in the human brain: Disentangling the contribution of low-level visual cues from face perception. Vision Research, 51(12), 1297–1311. doi: 10.1016/j.visres.2011.04.003
  77. Rossion B, Dricot L, Devolder A, Bodart JM, Crommelinck M, Gelder B, & Zoontjes R (2000). Hemispheric asymmetries for whole-based and part-based face processing in the human fusiform gyrus. Journal of Cognitive Neuroscience, 12(5), 793–802.
  78. Sabatinelli D, Bradley MM, Fitzsimmons JR, & Lang PJ (2005). Parallel amygdala and inferotemporal activation reflect emotional intensity and fear relevance. NeuroImage, 24(4), 1265–1270. doi: 10.1016/j.neuroimage.2004.12.015
  79. Sabatinelli D, Lang PJ, Keil A, & Bradley MM (2007). Emotional perception: Correlation of functional MRI and event-related potentials. Cerebral Cortex, 17(5), 1085–1091. doi: 10.1093/cercor/bhl017
  80. Sato W, Kochiyama T, Yoshikawa S, & Matsumura M (2001). Emotional expression boosts early visual processing of the face: ERP recording and its decomposition by independent component analysis. NeuroReport, 12(4), 709–714.
  81. Schupp HT, Cuthbert BN, Bradley MM, Cacioppo JT, Ito T, & Lang PJ (2000). Affective picture processing: The late positive potential is modulated by motivational relevance. Psychophysiology, 37(2), 257–261.
  82. Schupp H, Cuthbert B, Bradley M, Hillman C, Hamm A, & Lang P (2004). Brain processes in emotional perception: Motivated attention. Cognition & Emotion, 18(5), 593–611. doi: 10.1080/02699930341000239
  83. Schupp HT, Flaisch T, Stockburger J, & Junghöfer M (2006). Emotion and attention: Event-related brain potential studies. In Anders, Ende, Junghöfer M, Kissler J, Wildgruber (Eds.), Progress in Brain Research (Vol. 156, pp. 31–51). Elsevier. doi: 10.1016/S0079-6123(06)56002-9
  84. Scott LS, & Nelson CA (2006). Featural and configural face processing in adults and infants: A behavioral and electrophysiological investigation. Perception, 35(8), 1107–1128. doi: 10.1068/p5493
  85. Strømme P, Bjørnstad PG, & Ramstad K (2002). Prevalence estimation of Williams syndrome. Journal of Child Neurology, 17(4), 269–271.
  86. Tager-Flusberg H, Plesa-Skwerer D, Faja S, & Joseph RM (2003). People with Williams syndrome process faces holistically. Cognition, 89(1), 11–24.
  87. Tager-Flusberg H, & Sullivan K (2000). A componential view of theory of mind: Evidence from Williams syndrome. Cognition, 76(1), 59–90.
  88. Tager-Flusberg H, Boshart J, & Baron-Cohen S (1998). Reading the windows to the soul: Evidence of domain-specific sparing in Williams syndrome. Journal of Cognitive Neuroscience, 10(5), 631–639.
  89. Thomas DG, Grice JW, Najim-Briscoe RG, & Miller J (2004). The influence of unequal numbers of trials on comparisons of average event-related potentials. Developmental Neuropsychology, 26(3), 753–774.
  90. Vuilleumier P, & Pourtois G (2007). Distributed and interactive brain mechanisms during emotion face perception: Evidence from functional neuroimaging. Neuropsychologia, 45(1), 174–194. doi: 10.1016/j.neuropsychologia.2006.06.003
  91. Weinberg A, & Hajcak G (2010). Beyond good and evil: The time-course of neural activity elicited by specific picture content. Emotion, 10(6), 767–782. doi: 10.1037/a0020242
