Abstract
While well-represented on clinical measures, co-speech gesture production has never been formally studied in autistic adults. Twenty-one verbally fluent autistic adults and 21 typically developing controls engaged in a controlled conversational task. Group differences were observed in both semantic/pragmatic and motoric features of spontaneously produced co-speech gestures. Autistic adults prioritized different functions of co-speech gesture. Specifically, they used gesture more than controls to facilitate conversational turn-taking, demonstrating a novel nonverbal strategy for regulating conversational dynamics. Autistic adults were more likely to gesture unilaterally than bilaterally, a motoric feature of gesture that was individually associated with autism symptoms. Co-speech gestures may provide a link between nonverbal communication symptoms and known differences in motor performance in autism.
Keywords: Autism spectrum disorder, gesture, nonverbal communication, motor skills, conversation, adulthood
Communicative co-speech gestures are spontaneously produced hand movements that occur during speech and add both semantic and pragmatic content to communication. Impairments in nonverbal communication (including gesture) are now required for a diagnosis of autism spectrum disorder (ASD; American Psychiatric Association 2013), and the absence of early gestures is considered a red flag for ASD in toddlers (Robins et al. 2014). Although gestures produced by young children with ASD during language acquisition (e.g., pointing) have been extensively studied, with robust delays reported in both production and comprehension (Charman et al. 2003; Mundy et al. 1990; Winder et al. 2013), gestures that are used during the course of developed, fluent speech are relatively understudied in this population. In fact, co-speech gestures have never been examined in verbally fluent autistic adults1.
Co-speech gestures are social and communicative, with positive effects on listener comprehension. Gestures engage communicative partners during verbal interactions, and both children and adults glean novel information from gesture that is not present in speech (Cassell et al. 1998; Kelly 2001). A recent meta-analysis of studies involving typically developing individuals found that co-speech gestures benefit overall comprehension with an average effect size of Cohen’s d = .61, when comparing speech produced with and without gestures (Hostetter 2011). The beneficial effect of gesture on utterance comprehension is strongest in contexts in which the power of the linguistic signal is somehow weakened, for example, when paired with ambiguous speech or in a noisy environment, as well as in children, who are still developing fluent language (Hostetter 2011; Obermeier et al. 2012; Rogers 1978). These findings emphasize the power gesture has in conveying information, and highlight gesture’s potential assistive role for individuals with spoken language differences.
In addition to improving listener comprehension, gestures can add content to verbal interactions. For example, gestures add visuospatial information that may not be conveyed by speech (e.g., imagine someone describing an egg they found on a hike while holding their finger and thumb a mere half-inch apart – the listener would see that the egg was tiny without the speaker saying it). They can add emphasis or signal uncertainty. They also serve critical pragmatic functions that are often overlooked, such as linking the context of the physical environment to the contents of speech, allowing listeners to make inferences about a speaker’s intended meaning (e.g., imagine a parent who says to their child, “it’s almost time to go,” while pointing to the child’s mess on the floor; Kelly et al. 1999).
Gestures can be used to convey information, but they are also socially engaging. In an unpublished study, speakers featured in the TED talks that went the most viral (i.e., those that were viewed the most) used nearly twice as many co-speech gestures as speakers in talks that did not go viral (Van Edwards n.d.). Taken together, the evidence demonstrates that gesture benefits the communicative signal both by supporting and enhancing information conveyed by speech, and by engaging listeners’ attention and interest during discourse. If autistic people do not achieve these goals with their co-speech gestures, they may miss out on important opportunities to engage their social partners and convey information to them.
Critically, gesture has the potential to support dyadic social interaction during conversations. For instance, gesture might be used to cue turn-taking during fluent conversation, to indicate certainty/uncertainty associated with a statement, or to refer to a conversational partner’s contribution (Bavelas et al. 1992). Conversational turn-taking can be challenging for some autistic people (Kaczmarek 2002), in part because engaging in a smooth back-and-forth conversation relies on the harmonious deployment of multiple linguistic/pragmatic skills that differ in ASD. For example, autistic people may pause longer (Bone et al. 2013), speak more slowly in general (Parish-Morris et al. 2016), and are less likely to adhere to a conversational topic established by another person (Volden et al. 2007), compared to individuals without ASD. Many of these differences become more pronounced in highly social contexts (Bone et al. 2013), making conversation an especially important research topic for verbal autistic people who want to improve their social skills for the purposes of maintaining jobs, friendships, and romantic relationships. Visual aids that support social interaction (e.g., co-speech gestures) could be an important additional cue when language signals are insufficient to support successful conversations.
The role of gesture in supporting the social structure of conversations has not been studied in ASD to date. Other communicative functions of gestures (e.g., how gesture adds content to speech) have been studied in verbally fluent children and adolescents on the autism spectrum. The functions of gestures are analyzed by categorizing gestures based on their meanings, or semantic contribution, often denoted as gesture “types”. Because gestures resist easy categorization, a number of different systems have been developed to sort them into types. The most widely used system, developed by David McNeill (1992), divides gestures into iconic, metaphorical, deictic, beat, and emblem gestures. Iconic gestures – “descriptives” on the Autism Diagnostic Observation Schedule (ADOS; Lord et al. 2012) – take advantage of the visuospatial power of gesture to illustrate physical characteristics of a referent, such as its shape or size; metaphorical gestures similarly have prominent visuospatial properties but present information in a more abstract manner, for example moving a hand upward and outward to describe building knowledge or a skill. Deictic gestures are generally pointing gestures that reference a specific object in the environment. Beat gestures are rhythmic hand movements executed in close temporal synchrony with important words or phrases. Finally, emblem gestures (“conventional” gestures on the ADOS) are gestures with a defined meaning (e.g., the “ok” gesture – palm facing away from the gesturer’s body, index finger and thumb forming a circle, with the other three fingers extended and pointing up) that differ from the other gesture types in that they are culturally defined and consistently carry the same meaning.
Several studies have now applied the McNeill system to ASD. Different patterns of gesture type distribution have been observed, generally demonstrating that autistic speakers tend to use more iconic gestures relative to other types (Medeiros and Winsler 2014; Morett et al. 2016; So et al. 2014; So and Wong 2016; though see also de Marchena & Eigsti 2010 and Silverman, Eigsti, & Bennetto 2017 for null findings). The over-use of iconic gestures by autistic children and adolescents suggests that autistic people may use gestures predominantly for concrete rather than abstract functions.
Interactive gestures are not included in the McNeill system, but are a distinct type of co-speech gesture that make reference to the conversation or the conversational partner, rather than presenting information; their function is to maintain the social structure of a conversation (Bavelas et al. 1992). For example, a speaker might gesture toward their conversational partner when referencing something that interlocutor had said earlier, as if citing that person’s contribution, or gesture along with specific words/phrases to signal emphasis or uncertainty. Interactive gestures, despite being fundamentally social, have not been studied in ASD to date. One reason for this could be that most gesture research in ASD has been conducted using primarily one-sided interactions, such as narratives. Narrative paradigms have the advantage of being more easily controlled than conversational tasks; however, because they are one-sided they preclude the study of interpersonal dynamics, such as turn-taking, which are critically important for understanding social variation in individuals with ASD. Because they are less dynamic, narratives offer fewer opportunities to observe interactive gestures compared to natural conversation. Thus, the primary goal of the current study is to examine the communicative functions of gesture production in verbally fluent autistic adults during back-and-forth conversation.
Co-speech gesture is a communicative behavior, but it is also a highly motoric behavior. Autistic people demonstrate a wide range of differences in the motor domain (for reviews, see: Bhat et al. 2011; Fournier et al. 2010; Gowen and Hamilton 2013), including demonstrable weaknesses in the execution of skilled movements and pantomimes (i.e., "praxis", Dowell et al. 2009; Dziuk et al. 2007; Gizzonio et al. 2015; MacNeil and Mostofsky 2012; Mostofsky et al. 2006). Qualitative atypicalities have also been reported in the motor behaviors of many autistic people (Kanner 1943), and are associated with “frank” ASD (i.e., the phenomenon that ASD diagnosis is evident within moments of observing some individuals), despite not appearing at all within the ASD diagnostic criteria (de Marchena and Miller 2017). As such, co-speech gestures may also provide a link between basic motor skill differences and nonverbal communication. Thus, a secondary goal of the current study is to examine motoric features associated with gesture production in autistic adults.
Current Study
This study addresses three major questions: (1) are atypicalities in co-speech gesture production evident in verbally fluent autistic adults? (2) what communicative functions do co-speech gestures serve in autistic adults? In particular, might gestures serve to compensate for known differences in (verbal) pragmatic communication? And (3) are differences evident in both communicative and motoric aspects of co-speech gestures?
To address these questions, we developed a referential communication task, designed to elicit back-and-forth conversation between adult participants and trained confederates. Referential communication tasks, in which conversational partners must communicate reciprocally in order to accomplish a shared goal, carry the advantage of splitting the difference between experimental control and naturalism. Developed by psycholinguists specifically to study back-and-forth communication in a relatively controlled setting, referential communication tasks strike a balance between internal and external validity and have been used to study both verbal (e.g., Krauss and Weinheimer 1966) and nonverbal communication (Holler and Wilkin 2011). The task was relatively long (~20 minutes), providing ample opportunity to elicit a large number and wide variety of co-speech gestures. Gestures spontaneously produced during the course of the interaction were coded for both communicative and motoric features. We predicted that co-speech gestures would be used differently by adults with and without ASD, and that differences would be evident across both communicative and motoric aspects of production.
Method
Participants
Participants were 21 verbally fluent autistic adults and 21 typically developing control (TDC) adults. Participants were matched at the group level on age and full-scale IQ, as measured by the Wechsler Abbreviated Scale of Intelligence – Second Edition (WASI-2; Wechsler 2011); participants were required to score at or above a standard score of 70 on both the Verbal Comprehension Index and the Perceptual Reasoning Index to be included in the study, see Table 1. All participants were able to use fluent, complex language, as judged by a clinician per the requirements for administration of Module 4 of the Autism Diagnostic Observation Schedule, 2nd Edition (ADOS-2; Lord et al. 2012). Informed consent was obtained from all individual participants included in the study. All procedures performed in studies involving human participants were conducted in accordance with the ethical standards of the institutional and/or national research committee and in compliance with the 1964 Helsinki declaration and its later amendments.
Table 1:
Participant characterization variables.
| | ASD (n = 21; 18 male) | TDC (n = 21; 17 male) | p-value | Cohen’s d |
|---|---|---|---|---|
| Chronological age (years) | 26.71 (6.71); 20–46 | 28.24 (9.20); 20–48 | .54 | 0.19 |
| FSIQ (WASI-II) | 106.19 (18.78); 73–137 | 111.10 (10.26); 97–136 | .30 | 0.38 |
| VCI (WASI-II) | 111.38 (22.55); 76–160 | 111.95 (11.83); 93–142 | .92 | 0.03 |
| PRI (WASI-II) | 99.62 (18.70); 70–142 | 107.86 (12.58); 91–131 | .10 | 0.53 |
| ADOS-2 Total Score | 13.52 (4.29); 5–21 | 1.00 (0.84); 0–3 | < .001 | −5.67 |
| SCQ (n = 20 ASD, n = 19 TDC) | 17.50 (6.86) | 1.26 (0.93) | < .001 | −4.72 |
| SRS-2 Total T-score | 64.29 (8.01); 47–75 | 46.71 (6.35); 38–59 | < .001 | −2.49 |
Note: Data presented as Mean (SD); Range. ASD = autism spectrum disorder. TDC = typically developing control. FSIQ = Full Scale IQ. WASI-II = Wechsler Abbreviated Scale of Intelligence – Second Edition. VCI = Verbal Comprehension Index. PRI = Perceptual Reasoning Index. ADOS-2 = Autism Diagnostic Observation Schedule, Second Edition. SCQ = Social Communication Questionnaire, Lifetime Form. SRS-2 = Social Responsiveness Scale, 2nd Edition, Adult Self-Report.
Autistic participants were involved in a pilot study of a novel treatment program to improve social functioning, called TUNE In (Training to Understand and Navigate Emotions and Interactions (Pallathra et al. 2018); NIMH R34MH104407; PI: Brodkin). They were recruited from a variety of sources, including the Center for Autism Research at the Children’s Hospital of Philadelphia, ASD service providers and media advertisements, and the Adult Autism Spectrum Program at Penn Medicine. All participants in the ASD group met diagnostic criteria for ASD per the Diagnostic and Statistical Manual of Mental Disorders (DSM-5; American Psychiatric Association 2013), as determined by a clinician with expertise in ASD in adulthood and informed by the ADOS-2 (Lord et al. 2012; all participants completed Module 4) and the Social Communication Questionnaire, Lifetime Form (SCQ; Rutter et al. 2003). The SCQ was completed by a family member who knew the participant during early childhood. Psychological assessments, including diagnostic and cognitive testing, were conducted before participants enrolled in TUNE In. All experimental data included in the current study was collected after participants completed treatment.
TDC participants were recruited through advertisements at the Children's Hospital of Philadelphia, the University of Pennsylvania, local businesses, and a local community college. In addition, study information was sent to adults from a mailing list maintained by the University of Pennsylvania for adults without ASD who had participated in previous studies at our research center. Study advertisements stated that the goal of the study was to understand motor behavior and social communication in adults. Matched controls all scored below clinical cutoffs on the ADOS-2, SCQ, and SRS-2 (Social Responsiveness Scale, Second Edition, Adult Self-Report; Constantino 2012), which was used as a measure of autism symptoms. Participant characterization variables are presented in Table 1.
Experimental Task
Design.
Participants in the current study completed a collaborative referential communication task with a trained confederate. Participants and confederates sat across a table from one another in a quiet room, each with a networked, touch-screen laptop in front of them, see Figure 1. Each player had a 2×4 grid displayed on their screen, along with eight abstract figures. One player – the director – had a completed grid on their screen, with all eight figures in pre-determined locations; the other player – the matcher – had an empty grid with all eight figures randomly placed outside of the grid, see Figure 2. Players could not see each other’s screens.
Figure 1: Experimental setup for referential communication task.
Participants sat across a table from confederates, each with a networked, touch-screen laptop in front of them.
Figure 2: Screen view and stimuli for Blocks 1 (2a and 2b) and 2 (2c and 2d).
(2a) Participant screen at the beginning of Block 1 (Director role)
(2b) Confederate screen at the beginning of Block 1 (Matcher role)
(2c) Participant screen at the beginning of Block 2 (Matcher role)
(2d) Confederate screen at the beginning of Block 2 (Director role)
The goal of the game was for the two players to have identically matching grids at the end of each trial; that is, the matcher was to place their figures in the same grid locations as displayed on the director’s screen. Participants and confederates were instructed to work together to accomplish this goal, and to say anything they wanted to get there; thus, participant and confederate went back and forth in a collaborative fashion to describe their figures and get their grids to match.
The game was played over two blocks. In the first block, the participant was always the director and the confederate was always the matcher. Players switched roles for the second block, so that the participant was the matcher and the confederate was the director, see Figure 2. Having participants play both roles held constant the amount of time they spent in a primarily speaking role (i.e., director) and in a primarily listening role (i.e., matcher), although participants in both groups spoke and listened in both roles.
Each block consisted of five trials. All five trials in a given block used a single set of eight figures; however, the target locations of the eight figures varied by trial. A different set of eight figures was used in the second block.
Procedure.
A research assistant explained how to play the game (Appendix). Players were instructed to describe each figure on their grid to the confederate in numerical order, from one to eight, so that the other player could place each figure in the target location using touchscreen drag-and-drop. Players were explicitly instructed that matchers should say “okay” or “got it” after they placed the correct figure in the correct location, so that directors knew to move on to the next figure. This was added following Nadig and colleagues (2015) to increase the chances of task success in the ASD group, thus facilitating group comparisons of communication behavior. In the rare cases in which dyads did not complete the grid correctly, they were instructed to keep trying until they got it, before moving on to the next trial. Prior to starting each block, participants and confederates completed two practice trials, using images of familiar objects (e.g., apple, car).
The entire interaction was video recorded from a side viewing angle so that all gestures produced by both the participant and the confederate could be captured and later coded offline. Gesture was never mentioned, and participants were not aware that gesture was the focus of the study.
Stimuli.
Stimuli for the practice trials were easily identifiable common objects (e.g., apple, car, book, chair). Stimuli for the experimental conditions were sixteen three-dimensional monochromatic figures, modeled after stimuli used in Hummel and Biederman (1992) and created in Google SketchUp; all experimental stimuli are shown in Figure 2. Visuospatial concepts, including images that are not easily given a verbal label, tend to elicit more gestures than verbal concepts (for a review, see Alibali, 2005). Some figures shared specific visuospatial features. For example, multiple figures had a cylindrical component that varied in size and position. In each condition, two figures were identical but were rotated to different viewing angles. These manipulations were included to increase the chances that participants would rely more on gesture, as the figures could not be named and were challenging to describe verbally, while still being visually simple.
The task was created to be easy, while still taking time and effort to complete. Pilot testing confirmed that adults with and without ASD could successfully complete the task, and that the task and all figures included in the final study design elicited back-and-forth conversation and gesturing. Spontaneous verbal descriptions of the objects, and associated gestures, produced by pilot participants were used to help develop a verbal and gestural script for confederates (Appendix).
Confederates and confederate training.
Traditional referential communication studies invite two participants to complete the task at once; thus the dyad is the focus of study. Given our aim (to compare gesture behaviors produced by autistic participants and controls), our approach was to train a group of confederates to play the game in a semi-structured way with participants, using a flexible script that could be adjusted based on participant responses. This allowed us to exert some degree of experimental control over the interaction while still allowing the conversations to unfold naturally.
Confederates were college students and research assistants at our center (in their 20’s). Participants were always paired with a confederate of the opposite gender; thus, the majority of confederates were young women. When in the matcher role, confederates were trained to respond to participants’ initial descriptions of the figures with a scripted (i.e., pre-planned) term and co-speech gesture. When in the director role, confederates gave a scripted description of each figure along with a scripted gesture (see Appendix). The script was visible to the confederate on their laptop screen (see Figure 2). After using the initial script on Trial 1, confederates could say or ask whatever they wanted to complete the grid collaboratively. Confederates were, however, trained to wait longer than they normally would to ask questions or offer suggestions, to allow for differences in processing time that might exist between groups. Confederates completed practice administrations with other research assistants until they reached fidelity on the parameters described above (determined by the first author).
Behavioral coding.
All spontaneously produced co-speech hand gestures were identified and coded from video recordings of the interaction. Coders were undergraduate research assistants from the University of Pennsylvania. All coders were naive to the study hypotheses and participant diagnosis. Coders were extensively trained (to a criterion of at least 80% agreement on all dependent measures of interest) before beginning coding. Gestures were coded using Datavyu software (Datavyu Team 2014).
Identifying and categorizing gesture.
All gestures produced during the game were identified, categorized, and counted. Hand movements were classified as gestures if they were: (1) spontaneously produced communicative hand movements, and (2) distinct from adaptors (e.g., hand-scratching, squirming, hair-pulling) or self-stimulatory behaviors. Hand movements not visible to the interlocutor (i.e., to the other player), including gestures produced under the table or behind a computer screen (from the interlocutor’s visual perspective), were included. In the rare case that a suspected hand movement was not visible to the camera, it was not coded as a gesture.
Gestures were coded for both the participant and the trained confederate. Of the 3567 total gestures produced by all participants and confederates over the course of the task, 1624 (46%) were coded by two independent coders for the purposes of calculating inter-rater reliability, which is provided below.
With the exception of reliability calculations, only participant gestures were included in the current study. A total of 1579 participant gestures were identified and coded (including 811 gestures produced by autistic adults, and 768 produced by controls), making this the largest corpus of co-speech gestures in ASD of which we are aware.
Coding semantic/pragmatic features of gesture.
Gestures were categorized according to three semantic/pragmatic features: (1) gesture type, (2) inclusion of new information not present in speech, and (3) coders’ confidence in their own decision making about the gesture’s purpose.
Gesture type.
Gestures were categorized into one of six gesture types: interactive, representational, deictic, beat, numerical, or other, see Table 2 for descriptions. This classification system was based primarily on the work of David McNeill (1992) and Janet Bavelas (1992). Kappa for gesture type was .79, indicating good agreement.
Table 2:
Name, description, and example of gesture types.
| Name | Description | Examples |
|---|---|---|
| Interactive | Gestures referring directly to the other person in the conversation and serving functions related to the interaction requirements of the dialogue, such as (a) citing the interlocutor’s contribution or (b) indicating uncertainty/hedging. | (a) The one you said was the ice cream cone [hand moves toward interlocutor with palm open and facing up]; (b) I’m not sure [hand flips back-and-forth from palm-up to palm-down] |
| Representational | Gestures depicting physical or metaphorical properties of an object/idea (e.g., size, shape, motion). “Descriptive” gestures on the ADOS-2. | It’s long and skinny [fingers pinch together at midline and hands spread apart horizontally] |
| Deictic | Pointing gestures, including (a) pointing to physical things in the room and (b) pointing to locations of ideas for future reference. | (a) The slanted one [points to a figure on the computer screen]; (b) I left my glasses at home [points toward the door, while saying] |
| Beat | Small rhythmic hand movements produced in time with speech, often used for emphasis. | A cone on top of a square [makes rapid chopping motion with fist on word “cone”] |
| Numerical | Gestures indicating quantity. | The second one is… [holds up two fingers] |
| Other | Movements that coders believed to be gestures, but that did not fit into one of the above categories. | Hmm [hand goes to chin in thinking position] |
Note: In “Examples” column, speech is presented in italics and associated gesture is presented in brackets. ADOS-2 = Autism Diagnostic Observation Schedule, Second Edition.
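For readers who wish to reproduce this kind of categorical agreement statistic, the brief sketch below shows one way to compute the Cohen's kappa reported above for gesture type, using scikit-learn; the coder labels and codes are invented for illustration and are not the study data.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical gesture-type codes assigned by two independent coders
# to the same ten gestures (invented labels, not the study data).
coder_a = ["representational", "beat", "interactive", "deictic", "representational",
           "beat", "interactive", "representational", "other", "numerical"]
coder_b = ["representational", "beat", "interactive", "deictic", "representational",
           "beat", "representational", "representational", "other", "numerical"]

# Cohen's kappa corrects raw percent agreement for chance agreement.
print(round(cohen_kappa_score(coder_a, coder_b), 2))
```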
New information.
Representational gestures were further coded for how much new information they added to the conversation that was not presented in the semantically-related speech, on a 0 to 2 scale. A 0 indicated that the gesture and the speech overlapped semantically almost completely, and thus no new information was added by the gesture. For example, moving one’s hand across the body in a straight, horizontal, and flat motion, while saying “it is straight across,” does not add any new information to what was said in speech. A score of 1 indicated that the gesture added some new information to the speech. For example, showing a long and skinny cylinder with one’s hand and moving it from side to side, while saying, “it is a small cylinder,” adds information that the cylinder is long and skinny, and not a prototypically-proportioned cylinder. A 2 indicated that the information presented in the gesture and the speech did not overlap at all; thus, the gesture added completely new information to the speech. For example, making a ball shape with the hands, while saying “the big one,” adds completely new information. Intra-class correlation coefficients (ICC; two-way mixed, consistency, single measures), which are recommended for both continuous and interval data, were used to measure inter-rater reliability of all non-categorical variables. ICC for new information was .69, in the good range.
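As a concrete illustration of this reliability statistic, the sketch below computes a two-way mixed, consistency, single-measures ICC (reported as ICC3 by the pingouin library) from a toy set of double-coded ratings; the long-format layout and column names are assumptions made for the example, not the study's coding files.

```python
import pandas as pd
import pingouin as pg  # third-party: pip install pingouin

# Hypothetical long-format ratings: each gesture scored on the 0-2
# "new information" scale by two independent coders (not the study data).
ratings = pd.DataFrame({
    "gesture": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
    "coder":   ["A", "B"] * 6,
    "score":   [0, 0, 1, 2, 2, 2, 1, 1, 0, 1, 2, 2],
})

icc = pg.intraclass_corr(data=ratings, targets="gesture",
                         raters="coder", ratings="score")
# ICC3 = two-way mixed effects, consistency, single measures.
print(icc.loc[icc["Type"] == "ICC3", ["Type", "ICC", "CI95%"]])
```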
Confidence.
Coders gave gestures a brief gloss (i.e., a translation or meaning of the gesture) and rated their confidence in the gloss of the gesture on a 0 to 2 scale (0 indicating not confident, 1 indicating somewhat confident, and 2 indicating confident). Unlike for the other codes, raters were not specifically trained in how to make this distinction, and were not aware that this code would be used to assess group differences. Thus, these codes likely varied based on both how easy the individual gesture was to code and how confident each coder felt about their own decision making. ICC for confidence was .55, indicating fair agreement. Thus, while reliability for this variable was not as high as for the variables for which coders were trained to consensus, it was still considered reliable enough to include in analyses.
Coding motoric features of gesture.
Three motoric features of participants’ hand movements were coded for each gesture: (1) gesture height, (2) gesture size, and (3) hand(s) used to execute gesture. Height and size were collapsed into a “saliency” variable for all analyses, per previous research (Chu et al. 2014). For both height and size codes, if the two hands moved in different ways, the highest code received was recorded.
Height.
Height was measured on a 3-point scale as the highest point of the gesture, relative to the table and the gesturer’s body. The gesture was classified as either under the table (code: 1), between the table and the gesturer’s chin (code: 2), or above the gesturer’s chin (code: 3). ICC for height was .90, in the excellent range (Hallgren 2012).
Size.
Size was measured on a 4-point scale and captured the relative involvement of body parts (specifically the hand and arm) used to execute the gesture during the stroke phase of the gesture (i.e., the actual movement of the gesture, not including preparation of the hands or relaxation after execution of the gesture). The gesture was classified as either using the finger(s) only (e.g., finger extends, but palm, elbow, and wrist remain stationary; code: 1), the hand(s) (e.g., hand flicks and palm moves, but elbow and wrist remain stationary; code: 2), the forearm(s) (e.g., forearm moves across body and wrist moves through space, but elbow remains stationary; code: 3), or the full arm(s) (e.g., full arm including elbow moves through space; code: 4). ICC for size was .77, in the excellent range.
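To make the saliency composite concrete, a minimal sketch of how the height and size codes could be combined into the saliency score described above is shown below, assuming per-gesture lists of codes for each hand; the helper function and its name are illustrative, not the authors' coding software.

```python
def gesture_saliency(height_codes, size_codes):
    """Combine the height (1-3) and size (1-4) codes into a saliency score (2-7).

    If the two hands received different codes, the higher code is used,
    following the coding rule described above. Illustrative helper only.
    """
    return max(height_codes) + max(size_codes)

# Example: left hand stays below the table (height 1) moving only the hand (size 2),
# while the right hand rises above the chin (height 3) moving the full arm (size 4).
print(gesture_saliency(height_codes=[1, 3], size_codes=[2, 4]))  # -> 7
```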
Hand.
Gestures were coded for the hand used to enact the gesture. Gestures were coded as being produced with the right hand only, the left hand only, with both hands in a symmetrical movement, or with both hands in an asymmetrical movement (i.e., both hands doing something different). Kappa for gesture hand was .90, in the excellent range.
Analytic Approach
Count variables, including the total number of gestures, counts of each gesture type, and counts of which hand(s) were used to produce each gesture, were square-root transformed because they showed evidence of a Poisson distribution. The general linear model was used for all group comparisons. For categorical variables, including gesture type and hand use, mixed-model ANCOVAs were performed, with group as the between-subjects factor, the gesture variable (i.e., gesture type category or hand(s) used) as the repeated measure, and gesture count as a covariate to control for individual differences in overall gesture production. Individual differences in gesture production were substantial, ranging from 1 to 115 gestures produced over the course of the task (Mean = 37.6, SD = 30.8). To avoid drawing conclusions from very low gesture counts, all analyses except gesture rate (which can be validly computed even for low-frequency gesturers) included only participants who produced at least 5 gestures on the task. Thus, for all analyses except rate, 2 TDC adults and 3 autistic adults were excluded, resulting in a slightly reduced sample of n = 19 controls and n = 18 autistic adults. We note that all analyses were also conducted with all participants included, and the pattern of results was comparable.
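As a rough computational sketch of this analytic approach, the example below square-root transforms per-type gesture counts and fits a group-by-type model with total gesture count as a covariate, using a linear mixed model with a random intercept per participant as a stand-in for the mixed-model ANCOVA; the synthetic data, column names, and model specification are all assumptions for illustration, not the authors' analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic long-format data: one row per participant x gesture type,
# with a raw count, group membership, and the participant's total count.
rng = np.random.default_rng(0)
types = ["interactive", "representational", "deictic", "beat", "numeric", "other"]
rows = []
for pid in range(40):
    group = "ASD" if pid < 20 else "TDC"
    counts = rng.poisson(6, size=len(types))
    for t, c in zip(types, counts):
        rows.append({"participant": pid, "group": group, "gesture_type": t,
                     "count": c, "total": counts.sum()})
df = pd.DataFrame(rows)

# Square-root transform the (Poisson-distributed) counts, then model
# group x type with total gesture count as a covariate and a random
# intercept per participant (approximating the mixed-model ANCOVA).
df["sqrt_count"] = np.sqrt(df["count"])
df["sqrt_total"] = np.sqrt(df["total"])
model = smf.mixedlm("sqrt_count ~ group * gesture_type + sqrt_total",
                    data=df, groups="participant").fit()
print(model.summary())
```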
Results
Task Performance
All participants except two (both in the ASD group) performed perfectly on every trial (meaning that they and their interlocutor ended each trial with perfectly matching pieces). Average task accuracy did not differ by group, t(40) = 1.13, p = .26, Cohen’s d = 0.34; thus, any group differences reported are taken to be the result of differences in communication style, and not task performance. Autistic adults took significantly longer to complete the task, with an average of 9.57 minutes (SD = 2.41) of active description time (i.e., excluding task instructions, practice, and chitchat between trials/conditions) compared to 6.35 minutes (SD = 2.29) in the TDC sample, t(40) = 4.424, p < .001, Cohen’s d = 1.40. Next we turn to group differences in how gestures were used over the course of the task.
Rate
Overall gesture rate was computed as gestures/minute (i.e., the total gesture count for each participant was divided by the total active task time). TDC participants gestured at a marginally higher rate (Mean (SD) = 6.15 (4.29) gestures/min) than autistic participants (Mean (SD) = 4.01 (3.78)), t(40) = 1.71, p = .09, Cohen’s d = 0.54.
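For clarity, gesture rate and the accompanying effect size can be computed as in the short sketch below; the per-participant values are invented placeholders (with equal group sizes assumed for the pooled standard deviation), not the study data.

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant totals (not the study data): gesture counts
# and active task time in minutes for each group.
asd_gestures = np.array([40, 12, 55, 8, 30, 22])
asd_minutes  = np.array([10.1, 8.5, 11.2, 9.0, 9.6, 10.4])
tdc_gestures = np.array([38, 60, 25, 44, 52, 31])
tdc_minutes  = np.array([6.2, 7.0, 5.8, 6.9, 6.4, 6.6])

asd_rate = asd_gestures / asd_minutes   # gestures per minute
tdc_rate = tdc_gestures / tdc_minutes

t, p = stats.ttest_ind(tdc_rate, asd_rate)  # independent-samples t-test
# Cohen's d with a pooled SD; averaging the variances is exact here
# because the (invented) groups are the same size.
pooled_sd = np.sqrt((asd_rate.var(ddof=1) + tdc_rate.var(ddof=1)) / 2)
d = (tdc_rate.mean() - asd_rate.mean()) / pooled_sd
print(f"t = {t:.2f}, p = {p:.3f}, d = {d:.2f}")
```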
Semantic and Pragmatic Features of Co-Speech Gesture
Type.
Gestures were categorized into one of five gesture types, plus an “other” category for gestures that could not be classified. There was a very large main effect of gesture type, F(5,30) = 22.79, p < .001, partial η2 = .79, with participants in both groups using far more representational gestures than any other type. There was also a significant interaction between type and group, F(5,30) = 4.81, p = .002, partial η2 = .45, with a large effect size, demonstrating that autistic adults used a different distribution of gesture types compared to controls. Planned follow-up contrasts (see Table 3) demonstrated that this effect was driven primarily by a higher rate of both interactive gestures and other/unclassifiable gestures in ASD.
Table 3:
Gesture types, by diagnostic group.
| | ASD (n = 18) Mean (SD) | TDC (n = 19) Mean (SD) | p | Cohen’s d |
|---|---|---|---|---|
| Interactive | 8.3 (7.1) | 3.4 (3.1) | .03 | 0.78 |
| Representational | 22.9 (21.2) | 29.9 (19.2) | .15 | 0.50 |
| Deictic | 2.6 (5.1) | 0.7 (1.0) | .22 | 0.42 |
| Beat | 8.6 (8.1) | 5.8 (6.1) | .16 | 0.49 |
| Numeric | 0.9 (2.6) | 0.1 (0.3) | .11 | 0.58 |
| Other | 1.4 (1.4) | 0.3 (0.7) | < .01 | 1.00 |
Note: ASD = autism spectrum disorder. TDC = typically developing control.
Interactive gestures: Exploratory analyses.
We conducted a follow-up analysis to further explore the surprising finding that autistic adults used more interactive gestures than controls. All interactive gestures (n = 215) were additionally classified according to the functions they served during the task. Functions included in this analysis were selected based on the work of Janet Bavelas and colleagues (1992), and included: gestures that mark uncertainty/hedging, gestures to regulate conversational turn-taking, gestures that signal that verbal information being presented was shared across interlocutors, gestures signaling agreement, and gestures signaling disagreement. Exploratory independent-samples t-tests, presented in Table 4, revealed that the group difference in interactive gestures was driven by a dramatically increased use of gestures to regulate turn-taking by autistic adults compared to controls. On average, autistic adults used gesture to regulate turn-taking over three times as often as controls.
Table 4:
Functions of interactive gestures, by group
| | ASD n | TDC n | ASD Mean (SD) | TDC Mean (SD) | p | Cohen’s d |
|---|---|---|---|---|---|---|
| Uncertainty | 10 | 6 | 2.4 (3.1) | 0.9 (1.6) | .10 | 0.57 |
| Turn Taking | 14 | 10 | 2.4 (2.4) | 0.7 (0.7) | .01 | 0.98 |
| Shared Information | 11 | 7 | 1.4 (1.6) | 0.7 (1.3) | .13 | 0.52 |
| Agreement | 9 | 6 | 0.3 (0.6) | 0.6 (1.1) | .94 | 0.34 |
| Disagreement | 4 | 4 | 0.3 (0.6) | 0.1 (0.3) | .17 | 0.03 |
Note: n refers to the number of participants in each group (out of n=18 ASD and n=19 TDC) who used a gesture to fulfill the given interactive function. ASD = autism spectrum disorder. TDC = typically developing control.
New information.
To determine whether gestures contributed additional information beyond what was already conveyed through speech, coders rated how much “new information” was presented in participants’ representational gestures (relative to their speech) on a 0, 1, or 2 scale. On average, there was no group difference in how much new information was presented in gesture, t(22.42) = 0.60, p = .55, d = 0.25.
Interpretability.
To determine how interpretable gestures were, we compared coders’ ratings of their own confidence in the gesture’s function/meaning (on a 0, 1, or 2 scale). Confidence scores were high in both groups (ASD Mean(SD) = 1.74(0.26), TDC Mean(SD) = 1.90(0.09)); however, mean scores still statistically differed between groups, t(35) = 2.39, p = .03, d = 1.05, with lower confidence in gesture meaning/function in the ASD group. Variance also differed between groups, Levene’s statistic = 14.04, p = .001, with the ASD group showing much greater variation in how confident coders felt about their ratings.
Motor Features of Co-Speech Gesture
Saliency.
Previous research has demonstrated a relationship between gesture saliency and social cognitive features such as empathy (Chu et al. 2014). Thus, coders rated saliency on two dimensions: (1) the height at which the gesture was executed, and (2) how much of the arm was used to execute the gesture; these ratings were summed to create a total saliency score ranging from 2 to 7 points. Groups did not differ on gesture saliency, F(1,35) = 2.19, p = .15, Cohen’s d = 0.49 (ASD Mean (SD) = 4.56 (0.52), TDC Mean (SD) = 4.79 (0.42)).
Hand use.
As an additional measure of motor behavior during gestural communication, coders rated which hand or hands participants used to produce each gesture (right hand only, left hand only, two hands symmetrically, two hands asymmetrically). There was a large main effect of hand selection, F(3,33) = 7.50, p = .001, partial η2 = .41, with single-handed gestures executed by the right hand being the most common overall, see Figure 3. A large group by hand selection interaction was also observed, F(3,33) = 4.88, p = .006, partial η2 = .31. This interaction effect appeared to be driven primarily by a relatively increased tendency to use single-handed gestures in the ASD group, as well as a relatively increased tendency to use two-handed gestures in the TDC group.
Figure 3: Hand(s) spontaneously selected to execute gesture.
Across groups, participants were most likely to gesture with their right hand only. Autistic adults and typical adults showed a different pattern of spontaneous hand selection across the task. Error bars represent standard error.
** p < .01
To directly compare single-handed vs. two-handed gestures, follow-up analyses collapsed across left and right single-handed gestures, and across symmetrical and asymmetrical two-handed gestures. This analysis was also theoretically motivated by our prediction that autistic adults may show different motoric patterns during communication. There was no main effect of single-handed vs. two-handed gesturing, F(1,34) = 1.22, p = .28, partial η2 = .04; however, there was a large interaction between group and the number of hands used during gesturing, F(1,34) = 5.93, p = .02, partial η2 = .15. Follow-up paired-samples t-tests demonstrated that controls used roughly the same number of single-handed vs. two-handed gestures, t(18) = −1.01, p = .32, Cohen’s d = −0.48; in contrast, autistic adults produced twice as many single-handed gestures compared to two-handed gestures, t(17) = 2.20, p = .04, Cohen’s d = 1.07, see Figure 4.
Figure 4: Single-handed vs. two-handed gestures, across groups.
Typical adults were equally likely to gesture with one vs. two hands. In contrast, autistic adults were twice as likely to use a single hand to gesture compared to two hands, suggesting reduced motoric effort or complexity during communication. Error bars represent standard error.
* p < .05
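A minimal sketch of this follow-up analysis, collapsing the four hand-use categories into single- vs. two-handed counts and running a within-group paired comparison, is shown below; the counts and column names are invented for illustration, not the study data.

```python
import pandas as pd
from scipy import stats

# Hypothetical per-participant counts in the four hand-use categories
# coded above (invented numbers, not the study data).
hands = pd.DataFrame({
    "right_only": [20, 8, 31, 15, 12, 27],
    "left_only":  [4, 10, 6, 3, 5, 2],
    "both_symm":  [9, 12, 7, 14, 6, 10],
    "both_asymm": [2, 3, 1, 4, 2, 1],
})

# Collapse to single- vs. two-handed gestures, as in the follow-up analysis,
# then run a within-group paired-samples t-test.
single_handed = hands["right_only"] + hands["left_only"]
two_handed = hands["both_symm"] + hands["both_asymm"]
t, p = stats.ttest_rel(single_handed, two_handed)
print(f"t({len(hands) - 1}) = {t:.2f}, p = {p:.3f}")
```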
It is possible that this effect of hand use may have been driven by a confound: specifically, if interactive gestures are more likely to be produced by a single hand compared to other gesture types, then the ASD group may have used more single-handed gestures due to differences in the communicative functions of gesture, thus limiting the interpretation that greater use of single-handed gestures is driven by motoric differences. To address this possibility, we conducted a repeated-measures ANOVA with the three most common gesture types (interactive, representational, and beat) as the independent variable, and the proportion of times the gesture type was executed with a single hand as the dependent variable. On average, interactive gestures were executed with a single hand 57% of the time (SD = 35%), representational gestures 52% of the time (SD = 31%), and beat gestures 58% of the time (SD = 38%), F(2,26) = 0.665, p = .53, partial η2 = .05. Thus, the three predominant gesture types in the current study were equally likely to be executed with one vs. two hands, supporting the conclusion that hand selection was associated more with motor features of nonverbal communication than with gesture function.
Relationship Between Gesture Use and Measures of ASD Symptoms
Two multiple linear regressions were conducted to test the relationship between features of gesture production and ASD symptoms. ADOS-2 scores were taken as a measure of the behavioral presentation of autism symptoms, as ADOS-2 scores are based on an individual’s in-the-moment behaviors, as rated by an experienced clinician. SRS-2 (adult self-report) scores were taken as a measure of general ASD symptoms, as these scores are based on an individual’s broader day-to-day experiences (Hus et al. 2013).
First, we conducted a multiple linear regression with total ADOS-2 score as the dependent variable2. All variables that showed significant or marginal group differences were entered as predictors: thus, rate, confidence, and proportion of single-handed gestures were all entered, along with proportion of interactive gestures as a measure of gesture type. This analysis was only conducted for the ASD sample, given the limited variability in ADOS-2 scores in the TDC group. Taken together, gesture variables explained 54% of the variance in ADOS-2 scores, F(4,13) = 3.81, p = .03. Among the four variables entered as predictors, proportion of single-handed gestures was the only significant independent predictor, β = .663, p = .01, see Figure 5. The values of the three non-significant predictors were as follows: for rate, β = −.237, p = .25; for confidence, β = .043, p = .85; and for gesture type (i.e., proportion of interactive gestures), β = .008, p = .97. Overall, this model suggests that features of gesture production are closely associated with the behavioral presentation of ASD in adults.
Figure 5: Relationship between hand selection and ASD symptoms.
The tendency to select a single hand while gesturing was strongly positively correlated with ADOS-2 scores in autistic adults, r(18) = .70, p = .001, showing that individuals who were most likely to gesture with only one hand had the most severe symptom presentation, as measured by the ADOS-2.
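The regression described above could be approximated as in the sketch below, which z-scores the variables so that coefficients can be read as standardized betas; the data are synthetic and the variable names are assumptions made for the example, not the study dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic per-participant data for an ASD-sized sample (n = 18);
# the variable names mirror the predictors described above but the
# values are invented, not the study data.
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "ados_total":       rng.integers(5, 22, 18).astype(float),
    "rate":             rng.uniform(0.5, 12.0, 18),
    "confidence":       rng.uniform(1.0, 2.0, 18),
    "prop_single":      rng.uniform(0.2, 0.9, 18),
    "prop_interactive": rng.uniform(0.0, 0.4, 18),
})

# z-score the outcome and predictors so the coefficients can be read as
# standardized betas, then fit the four-predictor multiple regression.
z = (df - df.mean()) / df.std(ddof=1)
model = smf.ols("ados_total ~ rate + confidence + prop_single + prop_interactive",
                data=z).fit()
print(model.rsquared, model.f_pvalue)  # overall model fit
print(model.params)                    # standardized coefficients
```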
Next, we completed the same analysis, with SRS-2 total raw scores as the dependent variable. This model was not significant, F(4,13) = 0.91, p = .49, and none of the entered gesture variables were independent predictors of SRS-2 scores. Values for the four predictors were as follows: for proportion of single-handed gestures, β = −.209, p = .45; for rate, β = −.416, p = .68; for confidence, β = −.127, p = .66; and for gesture type (i.e., proportion of interactive gestures), β = −.376, p = .22. Taken together, this model suggests that features of gesture production are not related to general ASD symptoms, as measured by adult self-report.
Discussion
This study is the first to systematically investigate communicative co-speech gestures in autistic adults, demonstrating that differences in this domain of nonverbal communication persist into adulthood, even in verbally fluent individuals. Differences were observed across both semantic/pragmatic and motoric features of gesture production. Below we summarize these findings and discuss their implications for ASD research.
Autistic adults gestured at a marginally lower rate than controls, consistent with the bulk of the literature on co-speech gestures in ASD, which has sometimes reported group differences and sometimes has not (de Marchena and Eigsti 2010; de Marchena and Eigsti, 2014; Garcia-Perez et al. 2007; Medeiros and Winsler 2014; Morett et al. 2016; Silverman et al. 2017). This soft finding, in combination with prior literature, suggests that if there is a true group difference in gesture rate in ASD, the effect is small and noisy. This may not be surprising, given that gesture rate is neither unambiguously semantic/pragmatic nor motoric; rather, it shares elements of both domains.
Gesture rate differences are not consistently associated with ASD, but differences in gesture production quality are widely cited in the clinical literature. This suggests that qualitative features (i.e., how gestures are used or enacted) might best capture gestural communication differences in this population. In fact, prior research suggests that broad measures of gesture quality are associated with clinical ASD measures, while measures of gesture rate are not (Silverman et al. 2017). In this study, we sought to identify some of the specific ways in which gestures may be qualitatively distinct in ASD. The nonverbal communication symptom of ASD is defined both in terms of quantity of nonverbal behavior (i.e., “total lack of…”) and in terms of quality of nonverbal behavior (i.e., “abnormalities in…”; American Psychiatric Association 2013). Quantity of nonverbal communication (i.e., rate/frequency) is relatively well-characterized on questionnaires about social communication symptoms, like the SCQ (Rutter et al. 2003) and the SRS (Constantino 2012), with fewer gestures being indicative of ASD. The ADOS-2 (Lord et al. 2012), which includes two gesture-specific items when used to assess verbally fluent adults, prompts clinicians to attend to both quantity and quality of different gesture types. However, the specific ways that gesture might differ qualitatively in ASD have yet to be adequately characterized empirically. We found that autistic adults used gesture to prioritize different communicative functions, produced gestures that were more difficult for neurotypical raters to interpret, and were less likely to spontaneously use both hands to produce their gestures. We suggest that these variables, and others yet to be discovered, provide an inroad for specifying – and quantifying – distinctive features of co-speech gesture in ASD that have traditionally been labeled as “qualitative.”
Semantic/Pragmatic Features of Co-Speech Gesture in ASD
Gestures serve a wide range of communicative functions, which were measured in this study by categorizing gestures into one of five “types.” Autistic adults produced a significantly different distribution of gesture types compared to controls, suggesting that they use gesture for different communicative functions. Specifically, autistic adults were more than twice as likely to use interactive gestures.
Interactive gestures serve a variety of discourse functions. In an exploratory follow-up analysis, we found that the increased rate of interactive gestures in ASD was driven primarily by a dramatically increased use of gestures to regulate conversational turn-taking, compared to controls. As shown in Figure 6, turn-taking interactive gestures were used both to hold the floor (i.e., don’t interrupt me!, Figure 6a), and to indicate that it is the other person’s turn to speak (i.e., you go ahead, Figure 6b). Conversational turn-taking, though understudied empirically in ASD, is an area with clear practical applications. We speculate that the autistic adults in our study may have found gesture to be an accessible way to regulate turn-taking, compared to other relevant domains, such as language, prosody, or eye gaze3. This speculation may run counter to clinical wisdom that young, language-delayed autistic children are less likely to compensate for their limited language by gesturing, compared both to typically developing children and to language-delayed children without ASD. Co-speech gestures, however, may function differently than early pre-verbal gestures. For example, Braddock and colleagues (2016) found that verbally fluent autistic adolescents with lower scores on the Children’s Communication Checklist (Bishop and Volkmar 2003) gestured more often, suggesting that these teens were using gesture to compensate for broad weaknesses in verbal communication. Future research can test whether autistic adults who use more interactive gestures are less likely to show verbal disruptions in turn-taking, as evidenced, for example, in fewer interruptions or awkward pauses.
Figure 6: Examples of Interactive gestures used to regulate turn-taking.
Interactive gestures are used to regulate conversational turn-taking in a number of ways, including signaling that a speaker would like to hold the floor (6a; gloss: “don’t interrupt me,” “let me finish”), and signaling that a speaker is ready to yield the floor (6b; gloss: “you go ahead,” “what do you think?”)
Gesture is also used to supplement information provided in speech. In fact, a common approach to understanding the relationship between speech and gesture is to look at how much supplementary information gesture provides. Previous work has demonstrated that autistic children (So et al. 2014) and adolescents (Morett et al. 2016) supplement less via gesture; however, we did not replicate this phenomenon in adults. Discrepant results across studies could reflect differences in tasks or stimuli. Alternatively, our findings could reflect convergence across groups over time, either as typically developing children use fewer supplementary combinations with development (Alibali et al. 2009), or as autistic children and adolescents increasingly use gesture to supplement with age.
Beyond these narrow differences in the communicative functions of gestures, we also found broad differences in the communicative effectiveness of gesture in autistic adults. This was demonstrated in two ways. First, we found that coders’ confidence in the meaning of each coded gesture, while high overall, was significantly lower in the ASD group. Coders were not aware that this rating would be compared by diagnostic group, and in fact believed it would be used as a metric for evaluating coding fidelity. Thus, differences in how clearly a gesture expressed its intended meaning were apparent even to highly trained coders who were unaware of participants’ diagnostic status. Second, we found that gestures produced by autistic adults were more likely to be classified as “other” gestures, with a large effect size (a full standard deviation difference between groups). This demonstrates that autistic adults were more likely to produce movements that coders believed to be gestures, but were unable to classify. In the real world, we can imagine that gestures produced by autistic adults may not achieve their intended goal if they are not easily interpreted by interlocutors. We note one important caveat to these findings: all raters involved in the current study were neurotypical college students. It is quite possible that these “hard to interpret” gestures may be much easier to interpret when the recipient is, for example, another autistic adult, family member, or close friend. If this is the case, then co-speech gestures may not be an appropriate treatment target. Future research with a variety of conversational dyads is necessary to answer these important questions.
Motoric Features of Co-Speech Gesture in ASD
Next we turn to differences in the motor features of gestures produced by autistic adults. Two motor variables were included in the present study: (1) overall gesture size, measured by the height of the gesture relative to the body, and the number of hand/arm joints needed to execute the movement, and (2) the number of hands used to produce each gesture. We predicted that autistic adults would produce smaller gestures relative to controls, given evidence that individual differences in gesture size are associated with social cognition (Chu et al. 2014). However, we found no group difference in gesture size, suggesting that gesture size may not be associated broadly with social-communication.
A simple measure of hand selection (i.e., whether the participant used one or two hands to produce each gesture) was robustly associated with an ASD diagnosis. TDC adults were equally likely to spontaneously use one or two hands to gesture; however, autistic adults used twice as many single-handed gestures relative to two-handed gestures. Further, within the ASD group, an individual participant’s tendency toward using single-handed gestures was strongly positively correlated with ADOS-2 scores (i.e., ASD symptoms), and in fact was the only gesture variable in the current study that independently predicted ADOS-2 scores (predicting a full 49% of the variability), with single-handed gesturers having more observable ASD symptoms.
The tendency to rely more on single-handed gestures may be related to differences in interhemispheric connectivity that have been observed in ASD. Specifically, atypicalities in the corpus callosum, the large, white matter tract connecting the left and right hemispheres, have consistently been observed in ASD across development (for a meta-analysis, see Lefebvre et al. 2015). Atypical corpus callosum growth (surface area and thickness) is seen as early as 6 months of age, particularly in the anterior/genu sections, which are associated with motor control (Wolff et al. 2015). Beyond measures of area, differences in white matter integrity have consistently been demonstrated in ASD, including atypical development of white matter microstructure in the corpus callosum (Travers et al. 2015), reduced fractional anisotropy (Aoki et al. 2013; Shukla et al. 2010; Vogan et al. 2016), and increased mean diffusivity (Aoki et al. 2013). Our study looked only at behavior, but shows promise that certain measures taken from naturalistic interaction samples could be linked to brain-based measures, someday providing a direct link between atypical brain development and real-world social-communication functioning in ASD.
From a behavioral perspective, autistic adults’ reduced use of two-handed gestures may reflect difficulties with bilateral motor coordination (Eliassen et al. 2000). Bilateral coordination has been understudied in ASD; however, there is evidence of lateral motor asymmetries in ASD relative to both typical and developmentally delayed control groups (Esposito et al. 2009; Teitelbaum et al. 1998). Further, basic differences in bilateral coordination of rhythmic movements may be related to challenges with interpersonal synchrony (Isenhower et al. 2012), which itself is associated with a broad range of prosocial behaviors. More research is needed to understand both basic capabilities for bilateral coordination in ASD, and how this domain of motor skills may be related to nonverbal communication and interpersonal coordination.
Limitations and Future Directions
One limitation of the current study is that all participants in our ASD sample were verbally fluent, and it is unknown how well our results might generalize to autistic adults who also have limited cognitive and language skills – adults who are in even greater need of support for communication. A further limitation is that we focused almost exclusively on one communicative modality: co-speech gestures. Communication is inherently multi-modal (e.g., language, prosody, gesture, facial expressions, eye gaze), and in fact, integrating across modalities in both production and comprehension may be one of the greatest communication challenges faced by autistic people. Because we did not measure linguistic output during the task, we can only speculate on the relationship between verbal strategies for conversational turn-taking and nonverbal strategies; future studies will more directly measure this link. Similarly, we did not specifically measure motor skills using a validated motor skills assessment – rather, we measured motor behavior within the context of communication. There is evidence that motor skills are broadly related to social communication symptoms in ASD (Dziuk et al. 2007); future research in our lab aims to probe this relationship more specifically by concurrently assessing both motor skills and nonverbal communication. We appreciate the challenges inherent in conducting fine-grained analyses across a wide range of behaviors at once, and, as others have argued, consider this to be one of the most important frontiers facing autism research (Fein and Helt 2017). The increase in automated, computational approaches and “behavioral imaging” (Rehg et al. 2014) will improve the field’s ability to measure a wider number of behaviors at once. We hope that our findings will encourage other ASD researchers to include communicative gestures among their measured behaviors.
Conclusions
The findings reported here demonstrate differences across both semantic/pragmatic and motoric features of gesture in ASD. Autistic adults may use gestures to facilitate conversational turn-taking, a strategy that can be viewed as a strength and that has clear clinical implications. Autistic adults in this study were much more likely to use single-handed than two-handed gestures, a pattern that might reflect underlying differences in interhemispheric connectivity and may point to tractable neural foundations of gestural differences in this population. Communicative co-speech gestures may provide a link between known differences in motor behavior and nonverbal communication symptoms in ASD. Future research can test this relationship directly, for example, by comparing spontaneously produced co-speech gestures with tests of motor representation or performance.
Acknowledgements
We thank the participants who made this research possible, and who impressed us all with their creativity in describing the task stimuli. This project would not have been possible without the support of a large team at the Center for Autism Research at CHOP, including especially: Leslie Adeoye, Leila Bateman, Madeline Conca, Zachary Dravis, Fiona Fergusson, Emily Ferguson, Ashley Pallathra, Juhi Pandey, Alison Pomykacz, Leah Wang, Yuchen Zhang, and Alisa Zoltowski. Special thanks to the scientists who generously shared both their wisdom and their resources to make this work stronger: Karen Adolph and Janet Bavelas. We acknowledge funding from the following sources: U54 HD86984 (National Institute of Child Health and Human Development), R34MH104407 (National Institute of Mental Health), T32NS007413 (National Institute of Neurological Disorders and Stroke), the Eagles Charitable Foundation, and the McMorris Family Foundation. Finally, we thank three anonymous peer reviewers for their thoughtful and constructive feedback on earlier drafts of this manuscript. Portions of this work were presented at the Meeting of the International Society for Autism Research in 2017 and 2018.
Appendix
Note:
Italicized = speak
Italicized and bold = gesture and speak at the same time
- = nothing scripted
Confederate Script—Participant as Director / Confederate as Matcher Role
Figure 1: Mug
Gesture: [Hold up mug to chest]
Figure 2: Microphone
Gesture: [Hold up microphone to mouth/chest]
Figure 3: -
Figure 4: Seesaw
Gesture: [Hold up forearm in horizontal position in front of chest]
Figure 5: -
Figure 6: -
Figure 7: Backwards C
Gesture: [Make backwards C shape with hand]
Figure 8: -
Confederate Script—Participant as Matcher / Confederate as Director Role
Figure 1: The first one looks like an ice cream cone.
Gesture: [Hold up ice cream cone to mouth/chest]
Figure 2: Uh, number 2 is like triangular… [PAUSE] it looks like the slice of pizza.
Gesture: [Start with palms facing each other at center of body and move out and away from body in triangle shape]
Figure 3: Square 3 has like um a cone… [PAUSE] a cone on top of a square.
Gesture: [“I don’t know” conventional gesture with both hands, slightly shrugging shoulders]
Figure 4: This one kinda looks like a sauté pan, the handle, circle with a handle [PAUSE] – with the handle up and to the left.
Gesture: [Grab sauté pan handle off to side/center of body and slide hand back and forward]
Figure 5: Um see through square.
Gesture: [Draw square shape with fingers using full arms in front of face starting at top center]
Figure 6: Square 6 is the other sauté pan.
Gesture: [abstract deictic gesture, i.e., pointing to an abstract location in space indicative of “other”]
Figure 7: Um, uh, it’s a cylinder, it looks like a can of soda.
Gesture: [Hold up soda can to mouth/chest]
Figure 8: And eight is the only one left.
Gesture: [Beat gesture]
Footnotes
“Autistic person” is the term preferred by many adult self-advocates on the autism spectrum (e.g., Kenny et al. 2016; Lydia 2015), and will therefore be used throughout this paper.
On the ADOS-2, Module 4, one of the algorithm items (A10) specifically assesses co-speech gestures. To check for circularity, we therefore re-ran the same analyses with all algorithm items except A10 as the dependent variable; the pattern of results was identical.
We note that all autistic adults in the current sample had recently participated in the TUNE In social skills intervention. Conversational turn-taking skills were explicitly addressed and practiced as part of this treatment, although gestures were not specifically targeted. It is possible that some adults in this sample spontaneously learned to use gestures to regulate turn-taking as part of their involvement in this treatment.
At the time of the study, all authors were affiliated with the Center for Autism Research at the Children’s Hospital of Philadelphia. Ashley de Marchena was also affiliated with the University of the Sciences, Department of Behavioral and Social Sciences. Armen Bagdasarov was also affiliated with the University of Pennsylvania, Department of Psychology. Julia Parish-Morris was also affiliated with the Perelman School of Medicine at the University of Pennsylvania, Department of Psychiatry. Brenna B. Maddox was also affiliated with the Perelman School of Medicine at the University of Pennsylvania, Department of Psychiatry, Center for Mental Health Policy and Services Research. Edward S. Brodkin was also affiliated with the Perelman School of Medicine at the University of Pennsylvania, Department of Psychiatry, Center for Neurobiology and Behavior, Translational Research Laboratory. Robert T. Schultz was also affiliated with the Perelman School of Medicine at the University of Pennsylvania, Department of Psychiatry, and the Children's Hospital of Philadelphia, Department of Pediatrics.
Compliance with Ethical Standards
This study was approved by the Institutional Review Boards at both the University of Pennsylvania and the Children’s Hospital of Philadelphia. Prior to enrolling in the study and completing any study measures, all participants gave written informed consent.
Conflict of Interest: The authors declare that they have no conflict of interest.
References
- Alibali MW (2005). Gesture in spatial cognition: Expressing, communicating, and thinking about spatial information. Spatial Cognition & Computation, 5(4), 307–331. doi: 10.1207/s15427633scc0504_2
- Alibali MW, Evans JL, Hostetter AB, Ryan K, & Mainela-Arnold E (2009). Gesture–speech integration in narrative: Are children less redundant than adults? Gesture, 9(3), 290–311. doi: 10.1075/gest.9.3.02ali
- American Psychiatric Association. (2013). Diagnostic and statistical manual of mental disorders (DSM-5). American Psychiatric Publishing.
- Aoki Y, Abe O, Nippashi Y, & Yamasue H (2013). Comparison of white matter integrity between autism spectrum disorder subjects and typically developing individuals: A meta-analysis of diffusion tensor imaging tractography studies. Molecular Autism, 4(1), 25. doi: 10.1186/2040-2392-4-25
- Bavelas JB, Chovil N, Lawrie DA, & Wade A (1992). Interactive gestures. Discourse Processes, 15(4), 469–489. doi: 10.1080/01638539209544823
- Bhat AN, Landa RJ, & Galloway JCC (2011). Current perspectives on motor functioning in infants, children, and adults with autism spectrum disorders. Physical Therapy, 91(7), 1116–1129. doi: 10.2522/ptj.20100294
- Bishop D, & Volkmar F (2003). The Children’s Communication Checklist: CCC-2. ASHA.
- Bone D, Lee C-C, Chaspari T, Black MP, Williams ME, Lee S, et al. (2013). Acoustic-prosodic, turn-taking, and language cues in child-psychologist interactions for varying social demand. In INTERSPEECH (pp. 2400–2404).
- Braddock BA, Gabany C, Shah M, Armbrecht ES, & Twyman KA (2016). Patterns of gesture use in adolescents with autism spectrum disorder. American Journal of Speech-Language Pathology, 1. doi: 10.1044/2015_AJSLP-14-0112
- Cassell J, McNeill D, & McCullough K-E (1998). Speech-gesture mismatches: Evidence for one underlying representation of linguistic and nonlinguistic information. Pragmatics and Cognition, 6, 1–24. doi: 10.1075/pc.7.1.03cas
- Charman T, Drew A, Baird C, & Baird G (2003). Measuring early language development in preschool children with autism spectrum disorder using the MacArthur Communicative Development Inventory (Infant Form). Journal of Child Language, 30(1), 213–236. doi: 10.1017/S0305000902005482
- Chu M, Meyer A, Foulkes L, & Kita S (2014). Individual differences in frequency and saliency of speech-accompanying gestures: The role of cognitive abilities and empathy. Journal of Experimental Psychology: General, 143(2), 694–709. doi: 10.1037/a0033861
- Constantino JN (2012). Social Responsiveness Scale, Second Edition. Los Angeles, CA: Western Psychological Services.
- Datavyu Team. (2014). Datavyu: A video coding tool. Databrary Project, New York University. http://datavyu.org
- de Marchena A, & Eigsti IM (2010). Conversational gestures in autism spectrum disorders: Asynchrony but not decreased frequency. Autism Research, 3(6), 311–322. doi: 10.1002/aur.159
- de Marchena A, & Eigsti IM (2014). Context counts: The impact of social context on gesture rate in verbally fluent adolescents with autism spectrum disorder. Gesture, 14(3), 375–393. doi: 10.1075/gest.14.3.05mar
- de Marchena A, & Miller J (2017). “Frank” presentations as a novel research construct and element of diagnostic decision-making in autism spectrum disorder. Autism Research, 10(4), 653–662. doi: 10.1002/aur.1706
- Dowell LR, Mahone EM, & Mostofsky SH (2009). Associations of postural knowledge and basic motor skill with dyspraxia in autism: Implication for abnormalities in distributed connectivity and motor learning. Neuropsychology, 23(5), 563–570. doi: 10.1037/a0015640
- Dziuk MA, Larson JCG, Apostu A, Mahone EM, Denckla MB, & Mostofsky SH (2007). Dyspraxia in autism: Association with motor, social, and communicative deficits. Developmental Medicine and Child Neurology, 49, 734–739. doi: 10.1111/j.1469-8749.2007.00734.x
- Eliassen JC, Baynes K, & Gazzaniga MS (2000). Anterior and posterior callosal contributions to simultaneous bimanual movements of the hands and fingers. Brain, 123(12), 2501–2511. doi: 10.1093/brain/123.12.2501
- Esposito G, Venuti P, Maestro S, & Muratori F (2009). An exploration of symmetry in early autism spectrum disorders: Analysis of lying. Brain and Development, 31(2), 131–138. doi: 10.1016/j.braindev.2008.04.005
- Fein DA, & Helt MS (2017). Facilitating autism research. Journal of the International Neuropsychological Society, 23(9–10). doi: 10.1017/S1355617717001096
- Fournier KA, Hass CJ, Naik SK, Lodha N, & Cauraugh JH (2010). Motor coordination in autism spectrum disorders: A synthesis and meta-analysis. Journal of Autism and Developmental Disorders, 40(10), 1227–1240. doi: 10.1007/s10803-010-0981-3
- Garcia-Perez RM, Lee A, & Hobson RP (2007). On intersubjective engagement in autism: A controlled study of nonverbal aspects of conversation. Journal of Autism and Developmental Disorders, 37, 1310–1322. doi: 10.1007/s10803-006-0276-x
- Gizzonio V, Avanzini P, Campi C, Orivoli S, Piccolo B, Cantalupo G, et al. (2015). Failure in pantomime action execution correlates with the severity of social behavior deficits in children with autism: A praxis study. Journal of Autism and Developmental Disorders, 45(10), 3085–3097. doi: 10.1007/s10803-015-2461-2
- Gowen E, & Hamilton A (2013). Motor abilities in autism: A review using a computational context. Journal of Autism and Developmental Disorders, 43(2), 323–344. doi: 10.1007/s10803-012-1574-0
- Hallgren KA (2012). Computing inter-rater reliability for observational data: An overview and tutorial. Tutorials in Quantitative Methods for Psychology, 8(1), 23–34.
- Holler J, & Wilkin K (2011). Co-speech gesture mimicry in the process of collaborative referring during face-to-face dialogue. Journal of Nonverbal Behavior, 35(2), 133–153. doi: 10.1007/s10919-011-0105-6
- Hostetter AB (2011). When do gestures communicate? A meta-analysis. Psychological Bulletin, 137(2), 297–315. doi: 10.1037/a0022128
- Hummel JE, & Biederman I (1992). Dynamic binding in a neural network for shape recognition. Psychological Review, 99(3), 480.
- Hus V, Bishop S, Gotham K, Huerta M, & Lord C (2013). Factors influencing scores on the Social Responsiveness Scale. Journal of Child Psychology and Psychiatry, 54(2), 216–224. doi: 10.1111/j.1469-7610.2012.02589.x
- Isenhower RW, Marsh KL, Richardson MJ, Helt M, Schmidt RC, & Fein D (2012). Rhythmic bimanual coordination is impaired in young children with autism spectrum disorder. Research in Autism Spectrum Disorders, 6(1), 25–31. doi: 10.1016/j.rasd.2011.08.005
- Kaczmarek LA (2002). Assessment of social-communicative competence: An interdisciplinary model. Paul H. Brookes Publishing.
- Kanner L (1943). Autistic disturbances of affective contact. Nervous Child, 2, 217–250.
- Kelly SD (2001). Broadening the units of analysis in communication: Speech and nonverbal behaviours in pragmatic comprehension. Journal of Child Language, 28(2), 325–349. doi: 10.1017/S0305000901004664
- Kelly SD, Barr DJ, Church RB, & Lynch K (1999). Offering a hand to pragmatic understanding: The role of speech and gesture in comprehension and memory. Journal of Memory and Language, 40(4), 577–592. doi: 10.1006/jmla.1999.2634
- Kenny L, Hattersley C, Molins B, Buckley C, Povey C, & Pellicano E (2016). Which terms should be used to describe autism? Perspectives from the UK autism community. Autism, 20(4), 442–462. doi: 10.1177/1362361315588200
- Krauss RM, & Weinheimer S (1966). Concurrent feedback, confirmation, and the encoding of referents in verbal communication. Journal of Personality and Social Psychology, 4(3), 343–346. doi: 10.1037/h0023705
- Lefebvre A, Beggiato A, Bourgeron T, & Toro R (2015). Neuroanatomical diversity of corpus callosum and brain volume in autism: Meta-analysis, analysis of the Autism Brain Imaging Data Exchange project, and simulation. Biological Psychiatry, 78(2), 126–134. doi: 10.1016/j.biopsych.2015.02.010
- Lord C, Rutter M, DiLavore PC, Risi S, Gotham K, & Bishop S (2012). Autism Diagnostic Observation Schedule: ADOS-2. Los Angeles, CA: Western Psychological Services.
- Lydia B (2015, March 18). Identity-first language. Autistic Self Advocacy Network. http://autisticadvocacy.org/about-asan/identity-first-language/. Accessed 20 November 2017
- MacNeil LK, & Mostofsky SH (2012). Specificity of dyspraxia in children with autism. Neuropsychology, 26(2), 165. doi: 10.1037/a0026955
- McNeill D (1992). Hand and mind: What gestures reveal about thought. Chicago: University of Chicago Press.
- Medeiros K, & Winsler A (2014). Parent–child gesture use during problem solving in autistic spectrum disorder. Journal of Autism and Developmental Disorders, 44(8), 1946–1958. doi: 10.1007/s10803-014-2069-y
- Morett LM, O’Hearn K, Luna B, & Ghuman AS (2016). Altered gesture and speech production in ASD detract from in-person communicative quality. Journal of Autism and Developmental Disorders, 46(3), 998–1012. doi: 10.1007/s10803-015-2645-9
- Mostofsky SH, Dubey P, Jerath VK, Jansiewicz EM, Goldberg MC, & Denckla MB (2006). Developmental dyspraxia is not limited to imitation in children with autism spectrum disorders. Journal of the International Neuropsychological Society, 12, 314–326. doi: 10.1017/S1355617706060437
- Mundy P, Sigman M, & Kasari C (1990). A longitudinal study of joint attention and language development in autistic children. Journal of Autism and Developmental Disorders, 20, 115–128. doi: 10.1007/BF02206861
- Nadig AS, Seth S, & Sasson M (2015). Global similarities and multifaceted differences in the production of partner-specific referential pacts by adults with autism spectrum disorders. Frontiers in Psychology, 6, 1888. doi: 10.3389/fpsyg.2015.01888
- Obermeier C, Dolk T, & Gunter TC (2012). The benefit of gestures during communication: Evidence from hearing and hearing-impaired individuals. Cortex, 48(7), 857–870. doi: 10.1016/j.cortex.2011.02.007
- Pallathra AA, Day-Watkins J, Calkins ME, Maddox BB, Miller J, Parish-Morris J, et al. (2018). Improvement in social functioning following participation in TUNE In, a novel cognitive-behavioral treatment program – results from a 2nd cohort of adults with ASD. Presented at the Meeting of the International Society for Autism Research, Rotterdam, The Netherlands.
- Parish-Morris J, Liberman M, Ryant N, Cieri C, Bateman L, Ferguson E, & Schultz R (2016). Exploring autism spectrum disorders using HLT. In Proceedings of the Third Workshop on Computational Linguistics and Clinical Psychology (pp. 74–84).
- Rehg JM, Rozga A, Abowd GD, & Goodwin MS (2014). Behavioral imaging and autism. IEEE Pervasive Computing, 13(2), 84–87. doi: 10.1109/MPRV.2014.23
- Robins DL, Casagrande K, Barton M, Chen C-MA, Dumont-Mathieu T, & Fein D (2014). Validation of the Modified Checklist for Autism in Toddlers, Revised with Follow-up (M-CHAT-R/F). Pediatrics, 133(1), 37–45. doi: 10.1542/peds.2013-1813
- Rogers WT (1978). The contribution of kinesic illustrators toward the comprehension of verbal behavior within utterances. Human Communication Research, 5(1), 54–62. doi: 10.1111/j.1468-2958.1978.tb00622.x
- Rutter M, Bailey A, & Lord C (2003). The Social Communication Questionnaire. Los Angeles, CA: Western Psychological Services.
- Shukla DK, Keehn B, Lincoln AJ, & Müller R-A (2010). White matter compromise of callosal and subcortical fiber tracts in children with autism spectrum disorder: A diffusion tensor imaging study. Journal of the American Academy of Child and Adolescent Psychiatry, 49(12), 1269–1278, 1278.e1–2. doi: 10.1016/j.jaac.2010.08.018
- Silverman LB, Eigsti I-M, & Bennetto L (2017). I tawt I taw a puddy tat: Gestures in Canary Row narrations by high-functioning youth with autism spectrum disorder. Autism Research. doi: 10.1002/aur.1785
- So W-C, & Wong MK-Y (2016). I use my space not yours: Use of gesture space for referential identification among children with autism spectrum disorders. Research in Autism Spectrum Disorders, 26, 33–47. doi: 10.1016/j.rasd.2016.03.005
- So W-C, Wong MK-Y, Lui M, & Yip V (2014). The development of co-speech gesture and its semantic integration with speech in 6- to 12-year-old children with autism spectrum disorders. Autism. doi: 10.1177/1362361314556783
- Teitelbaum P, Teitelbaum O, Nye J, Fryman J, & Maurer RG (1998). Movement analysis in infancy may be useful for early diagnosis of autism. Proceedings of the National Academy of Sciences, 95(23), 13982–13987. doi: 10.1073/pnas.95.23.13982
- Travers BG, Tromp DPM, Adluru N, Lange N, Destiche D, Ennis C, et al. (2015). Atypical development of white matter microstructure of the corpus callosum in males with autism: A longitudinal investigation. Molecular Autism, 6, 15. doi: 10.1186/s13229-015-0001-8
- Van Edwards V (n.d.). 5 secrets of a successful TED talk. Science of People. http://www.scienceofpeople.com/ted/. Accessed 31 May 2017
- Vogan VM, Morgan BR, Leung RC, Anagnostou E, Doyle-Thomas K, & Taylor MJ (2016). Widespread white matter differences in children and adolescents with autism spectrum disorder. Journal of Autism and Developmental Disorders, 46(6), 2138–2147. doi: 10.1007/s10803-016-2744-2
- Volden J, Magill-Evans J, Goulden K, & Clarke M (2007). Varying language register according to listener needs in speakers with autism spectrum disorder. Journal of Autism and Developmental Disorders, 37(6), 1139–1154. doi: 10.1007/s10803-006-0256-1
- Wechsler D (2011). Wechsler Abbreviated Scale of Intelligence–Second Edition. Bloomington, MN: Pearson.
- Winder BM, Wozniak RH, Parladé MV, & Iverson JM (2013). Spontaneous initiation of communication in infants at low and heightened risk for autism spectrum disorders. Developmental Psychology, 49(10), 1931. doi: 10.1037/a0031061
- Wolff JJ, Gerig G, Lewis JD, Soda T, Styner MA, Vachet C, et al. (2015). Altered corpus callosum morphology associated with autism over the first 2 years of life. Brain, 138(7), 2046–2058. doi: 10.1093/brain/awv118