Author manuscript; available in PMC: 2010 Jan 5.
Published in final edited form as: Aphasiology. 2007 Aug 1;21(6-8):717–725. doi: 10.1080/02687030701192273

Gesture and aphasia: Helping hands?

Victoria L Scharp 1, Connie A Tompkins 1, Jana M Iverson 1
PMCID: PMC2801920  NIHMSID: NIHMS159310  PMID: 20054430

Abstract

Background

The study of communicative gestures is of considerable interest for aphasia, in relation to theory, diagnosis, and treatment. Significant limitations currently permeate the general (psycho)linguistic literature on gesture production, and attention to these limitations is essential for both continued investigation and clinical application of gesture for people with aphasia.

Aims

The aims of this paper are to discuss issues imperative to advancing the gesture production literature and to provide specific suggestions for applying the material herein to studies in gesture production for people with aphasia.

Main Contribution

Two primary perspectives in the gesture production literature are distinct in their proposals about the function of gesture, and about where gesture arises in the communication stream. These two perspectives will be discussed, along with three elements considered to be prerequisites for advancing the research on gesture production. These include: operational definitions, coding systems, and the temporal synchrony characteristics of gesture.

Conclusions

Addressing the specific elements discussed in this paper will provide essential information for both continued investigation and clinical application of gesture for people with aphasia.


The study of communicative gestures is of considerable interest for aphasia, in relation to theory, diagnosis, and treatment. For example, definitions and theories of aphasia differ in whether or not language impairment must cross all modalities of communication, including gesture. Some type or form of gesture is included in subtests of the Porch Index of Communicative Ability (PICA, Porch, 1967), scored in test batteries like the Communicative Abilities in Daily Living (CADL, Holland, 1980) and the pantomime recognition test (Duffy & Duffy, 1975, 1981), and modelled in the Promoting Aphasics’ Communicative Effectiveness (PACE) treatment programme (Davis & Wilcox, 1985). Also, aphasia researchers and clinicians consider gesture as both a means of communicative facilitation (e.g., Records, 1994) and compensation (e.g., Tompkins & Scharp, 2006).

Despite the broad potential relevance of gesture in aphasia, relatively little research has been done in this area. Thus the question remains as to how we can best advance the body of knowledge on gesture production in aphasia. Significant limitations currently permeate the general (psycho)linguistic literature on gesture production, and attention to these limitations is essential for both continued investigation and clinical application of gesture for people with aphasia.

At the core of the limitations in the general gesture production literature is a theoretical difference among authors concerning the function of gesture during the communication process. More specifically, gesture is viewed either as a part of the communicative process that is driving language expression from its inception or as a supplemental aspect of communication enacted primarily during word finding. This paper discusses several fundamental differences between these two perspectives and pinpoints aspects of the current literature that require particular consideration, including: an operational definition of gesture, description and implementation of coding systems for studying gesture production, and the nature of the relationship between temporal characteristics of gesture and speech. Finally, specific examples of necessary gesture production studies, particularly for patients with aphasia, will be provided.

CONCEPTUALISING GESTURE PRODUCTION

Existing models of gesture production reflect two major perspectives that differ on the central question: What is the function of gesture during communication? The proposed function of gesture for communication is in turn directly connected to its point of entry into the communication process. The two primary perspectives in the gesture production literature are distinct in their proposals about the function of gesture, and about where gesture arises in the communication stream. While individual elements of particular models of gesture production and comparisons between models are beyond the scope of this paper (for details of the models see de Ruiter, 2000; Krauss, Chen, & Gottesman, 2000; McNeill, 1992, 2005; Rose, 2006), this section outlines some general principles that underlie these two perspectives, as a springboard for discussing how these points of view motivate research questions and the execution of research in gesture production.

The first perspective on the function of gesture during communication is that gesture is an integral component at the conceptual level of expression. In his most recent book, McNeill (2005) proposes that gesture is integrated at the earliest stages of communication, and is part of the driving force of communication. According to McNeill, gestures are “conceived of as ingredients in an imagery–language dialectic that fuels speech and thought” (p. 3). The act of gesturing for communication is inseparable from the verbal message and rests at the conceptual level. Verbal communication and gesture are produced in parallel, and gesture is a potentially equal participant in the conceptual/planning stages. Communicative expression thus occurs via both verbal and spatial means, providing a temporally linked, non-redundant, multidimensional, content-rich message.

The Sketch model of gesture and speech production by de Ruiter (2000) is consistent with the perspective that gesture is enmeshed with verbal communication at the conceptual level. The Sketch model (designated by de Ruiter as a “speech” model because it is based on Levelt’s 1989 model of speech production) proposes that the primary function of gesture is communication and that gesture emerges from the conceptualiser component of Levelt’s model. The Sketch model and McNeill’s perspective are also consistent with the Mutually Adaptive Modalities hypothesis (MAM, de Ruiter, 2006; Melinger & Levelt, 2004). The MAM states that if a speaker is unable to utilise one communication modality effectively (e.g., verbal communication in an excessively noisy place; gesture when face-to-face communication is not possible), the proportion of spatial information is skewed either towards gesture use or towards spatially loaded verbal expression, respectively. While the Sketch model and MAM hypothesis are proposed to be able to accommodate the second major perspective outlined below, these principles clearly align with the view that gestures are conceptually driven.

The lexical retrieval hypothesis (Krauss et al., 2000) is the second major perspective on gesture production. In this view, gestures serve to assist in the retrieval of words from the mental lexicon. Krauss and colleagues’ work focuses on “lexical gestures”, which are defined as “spontaneous, complex, articulate hand-arm movements that seem related to the ideational content of the speech they accompany” (Morsella & Krauss, 2005, p. 415). Lexical gestures in this perspective are used predominantly as a supplemental mechanism to facilitate spoken language, and occur for the speaker’s benefit (e.g., Butterworth & Beattie, 1978; Morrel-Samuels & Krauss, 1992). Krauss and colleagues (Krauss, Chen, & Chawla, 1996), in conjunction with Butterworth and Hadar (1989), posit that lexical gestures are engaged as a preverbal priming mechanism and are enacted most frequently during word finding, specifically when additional (spatial) information is needed to prime and access a word. This view of gesture production is particularly relevant to aphasia and normal ageing, given the widely reported word-finding difficulties that can accompany both.

Research cited in support of the lexical retrieval hypothesis includes studies that report an increase in “lexical” gestures during dysfluencies in language production (Butterworth & Beattie, 1978; Morrel-Samuels & Krauss, 1992) and studies utilising gesture restriction paradigms that yield an increase in dysfluent verbal productions (Rauscher, Krauss, & Chen, 1996; Rime & Schiaratura, 1991). The rationale for the hypothesis investigated in these studies is the link between “lexical” gestures and verbal fluency. As a speaker experiences a halting or stalled moment in verbal communication, ostensibly due to a word-finding delay, lexical gestures increase; when a speaker is unable to use gestures, pauses and dysfluencies increase.

The lexical retrieval hypothesis in its current instantiation cannot accommodate some of the fundamental observations that motivate the MAM hypothesis. However, a subset of communicative gestures beyond the lexical gestures of interest to Krauss and colleagues may fill this gap. As acknowledged by Krauss and Hadar (1999), this additional subset of gestures includes deictics, emblems, and pantomimes from Kendon/McNeill’s continuum (Kendon, 1980; McNeill, 2005). The potential distinction between gesturing for communicative intent (gesturing for the listener) versus using gesture to enhance lexical retrieval (gesturing for self) again reflects the basic question of the function served by spontaneous movements of the hands and arms during communication.

When considering conceptual frameworks on gesture production, we are in agreement with a recent commentary by Rodriguez and Gonzalez Rothi (2006) that discusses the limitations of these models, and we underscore the need to view models as guiding tools that must continue to be developed, tested, and challenged. We also raise one further, specific consideration for both investigators and clinicians. The two perspectives on gesture production discussed above emphasise a key difference in the purpose and point of entry for gesture during communication, but they are not necessarily mutually exclusive. However, the proposed difference in the function of gesture during communication does impact the research emanating from each perspective. That is, the nature of the research questions, the methods used to elicit gesture, and the coding systems for setting boundaries of what is counted as a gesture are grounded in the perspective of the investigator, a fact that should be kept in mind when interpreting, comparing, and applying the results of various studies.

This conceptual framework debate has numerous implications for research on gesture production in aphasia, whether it be basic or clinical in nature. Four examples are provided here. First, aphasia research has not adequately considered the potentially important theoretical or clinical differences between gestures that are produced for the benefit of the (aphasic) speaker’s communication partners, versus gestures that are produced for the aphasic person him/herself. Perhaps there are at least partially differentiable systems that support each of these gestural functions, and that can be differentially impaired or spared, or differentially trained or generalised. Investigating such questions in aphasia could also provide evidence about the theoretical (and anatomic) coherence or fractionability of the two conceptual frameworks. A second, more focused example refers to the types of gestures deemed relevant for study in each conceptual framework. It may be fundamental for advancing gesture research in aphasia to determine whether there are normative gestural profiles for various elements or types of gesture in various communicative contexts; e.g., with topics or communication partners that vary in familiarity. Third, the perspective on the function of gesture should help to determine the most relevant gesture elicitation methods. For example, the lexical retrieval perspective calls for a task designed to accentuate word-finding difficulties, while the stance that gestures emerge from a conceptual origin could be more fruitfully evaluated in a task that, for example, tapped gesture production incidental to the spatial working memory demands of the communication. Finally, the relevant profiles on which to classify adults with aphasia may vary depending on the conceptual framework. From the lexical retrieval perspective on gesture, it may turn out to be just as critical to establish aphasia participant profiles based on their gesture use as it is to classify and compare them on verbal expression. By contrast, if gesture is deemed to arise from the conceptual level of communication, verbal expressive characteristics may suffice for patient classification.

DEFINITIONS AND CODING SYSTEMS

A clear definition of the gestures of interest in any investigation or clinical application is a mandatory element for interpretation, but this crucial aspect is often left for the reader to surmise. It can be difficult to disentangle definitions from the coding systems used to outline what is coded as a gesture and whether that gesture is considered “communicative”. Additionally, the context in which a communicative gesture is produced impacts the analysis of the contribution of the gesture (e.g., naturalistic context, face-to-face communicative exchanges, spontaneous description, confrontation naming; Power & Code, 2006). Historically, operational definitions of gesture have been overlooked, and it is only recently that both definitions and models/frameworks of gesture production have been discussed in the same investigative circles (Rose, 2006).

Partly reflecting the typical lack of operational specificity, the gesture production literature is rife with terminological confusion. For example, the terms “emblems” (conventionalised symbolic gestures such as the okay sign; McNeill, 2005) and “iconics” (movements relating to the semantic content of the verbal expression; McNeill, 1992) are often interchanged in this literature. The term “gesticulation”, used by Kendon (1980) to describe hand/arm movements of an illustrative nature, is now instantiated by McNeill (2005) as “motion that embodies a meaning relatable to the accompanying speech” (p. 5). Further complicating the picture, different terms and descriptions have been applied across perspectives and disciplines. For example, the “lexical-only” gesture perspective does not reference “beats” (McNeill, 1992, 2005) and “batons” (Efron, 1941), gestures that adhere to the prosodic/rhythmic features of language production and are, per McNeill, integral for emphasis and communicative flow.

Pantomimes are a unique gesture type that most authors agree encompasses movements of the hands, fingers, and arms without co-occurring verbal expression. Because these isolated and variable movements of the hands/arms (pantomimes can be executed in a variety of ways, unlike emblems which have a predictable static presentation; McNeill, 2005) are typically performed without verbal expression, the “point of entry” issue discussed in the prior section of this paper is irrelevant. Gesture researchers also tend to segregate self-touching or grooming movements of the hands and arms into a category distinct from gestures used during communication (Blonder, Burns, Bowers, Moore, & Heilman, 1995). These self-touching/grooming gestures are not tied to verbal expression, which sets them apart from communicative gesture. Given that the field can agree on definitions for gestures that are predominantly executed independent from verbal expression, one might wonder why it is so difficult to define the co-verbal movements of the hands and arms.

We propose that the problem of inconsistent definitions rests to a large degree on the point of entry distinction discussed earlier. If there is a fundamental difference in the function of communicative gestures, then a range of definitions may be needed to encompass the relevant phenomena. Any single definition of communicative gesture would have to capture evolving perspectives on the function of gesture. In our view, in order to derive an inclusive but specific definition for “gesture”, questions of the following sort need to be considered: Is it necessary to have a single definition that can cross all frameworks and populations, and that can encompass multiple gesture types? Are the gestures used in an adult/developed system fundamentally different in function from those that emerge during development or in an impaired system? Another definitional issue concerns whether gesture indeed falls into discrete categories, or whether it resides along a fluid continuum, as proposed by McNeill (2005), and how to capture apparent variations in gesture production in either case.

Rosenbek, LaPointe, and Wertz (1989) underscore both inclusionary and exclusionary aspects as critical components of definitions that are sufficiently comprehensive and specific, an emphasis with which we concur. While some may not consider an operational definition imperative, the lack of specifically and consistently defined terms in the gesture production literature challenges the integration of perspectives from multiple disciplines (e.g., neurology, cognition, and psycholinguistics), the continued development of conceptual frameworks and models, and the application of results from existing studies. Until the field is farther along in achieving definitional precision and consistency, individual researchers and clinicians would do a great service by explicitly stating the boundary conditions of the gestures of interest in their work, specifying both what they include and what they exclude from consideration.

As a starting point, we sought out explicit definitions provided in the gesture and aphasia literature. An early example indicates that “gesture was conceived as a unit of expression which might consist of a number of individual movements … some of these gestural units could be grouped together into larger, more complex gestural phrases” (Cicone, Wapner, Foldi, Zurif, & Gardner, 1979, p. 329). A more recent definition from Cunningham and Ward (2003) stated that “gesture was defined very broadly as a purposeful, symbolic hand signal, purposeful pointing, or purposeful facial expression” (p. 691). Another example comes from Foundas et al. (1995), who write: “a single gesture was defined as a discrete movement of the arm and/or hand that resulted in one continuous motion followed by a visible pause in the action” (p. 207). While it is laudable that there are statements describing how gesture was defined in these studies, the three definitions overlap only slightly, in considering gesture to be some form of movement(s) of the hands/arms. The Cicone et al. definition also encompasses a series of movements that count as a singular gesture, and facial expression is included in the definition of Cunningham and Ward. This very slight overlap in explicit definitions leads to obvious difficulty in cross-comparison of studies of gesture in aphasia.

To begin to fill the definitional gap, we propose the following as a potential working definition of a communicative gesture: Communicative gesture includes spontaneous movements of the hands, arms, and fingers that are typically co-verbal and provide information that is consistent with the content of the verbal message, but can also provide additional information not contained in the verbal expression. Communicative gestures do not include self-touching, grooming gestures, or body-focused movements (Butterworth, Swallow, & Grimston, 1981). Additional features would need to be added to this definition depending on the type and context in which a gesture is to be studied. Such features could be captured in an explicit coding system, developed for that purpose.
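To make such boundary conditions concrete, they could be written down as an explicit inclusion/exclusion checklist. The following sketch (in Python) is purely illustrative: the annotation fields and the decision rule are our assumptions about how a research team might operationalise the working definition above, not an established protocol, and study-specific rules (for instance, whether non-co-verbal pantomimes are counted) would still need to be added.

```python
from dataclasses import dataclass

# Hypothetical annotation record for a single hand/arm movement observed in a
# video-recorded communication sample. Field names are illustrative, not drawn
# from the paper or from any published coding manual.
@dataclass
class MovementAnnotation:
    articulators: set[str]           # e.g., {"hand", "arm", "fingers"}
    spontaneous: bool                # not elicited by explicit instruction
    co_verbal: bool                  # temporally overlaps the speaker's utterance
    self_touching_or_grooming: bool  # e.g., scratching, adjusting clothing
    body_focused: bool               # movement directed at the speaker's own body

COMMUNICATIVE_ARTICULATORS = {"hand", "arm", "fingers"}

def is_communicative_gesture(m: MovementAnnotation) -> bool:
    """Apply the working definition above as an explicit checklist."""
    # Exclusions named in the working definition: self-touching, grooming,
    # and body-focused movements are not counted as communicative gestures.
    if m.self_touching_or_grooming or m.body_focused:
        return False
    # Inclusion: spontaneous movement of the hands, arms, and/or fingers.
    if not (m.articulators & COMMUNICATIVE_ARTICULATORS):
        return False
    # Co-verbality is "typical" rather than criterial in the definition above;
    # a study-specific rule would decide how non-co-verbal movements such as
    # pantomimes are handled, so it is recorded but not enforced here.
    return m.spontaneous
```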

A coding system is a template that guides authors in their analysis of gestures executed in an experimental context. Variation in coding systems also confounds the gesture production literature. The coding systems implemented to measure and study gesture are tightly knitted to the types of gestures being studied. While there are several templates for gesture coding systems (Krauss et al., 1996; McNeill, 1992), a single coding system (i.e., a shared specification of what “counts” as a specific type of communicative gesture) has yet to be agreed upon. This puts the reliability and validity of individual gesture studies in question, and makes virtually impossible the integration or meta-analysis of findings across gesture production studies.

Two examples will serve to illustrate the problem of coding system variability. In her study of cued recall for verbal targets, Frick-Horbury (2002) coded or classified elicited gestures into four categories: iconic, metaphoric, body-focused movements, and vague gestures. These four categories were used to guide analysis and interpretation of gesture use in that study. A quite different coding system was used by Alibali, Bassok, Solomon, Syc, and Goldin-Meadow (1999), who classified gestures into continuous representations (smooth, continuous motions of the hands/arms), discrete representations (taps, points, or beats), “both” representations (movements that incorporated both continuous and discrete representations), or “neither” representations. As with definitions, coding systems have also been applied differently by gesture point of entry, population, and communicative context. For example, studies motivated from the lexical retrieval perspective limit attention to a subset of “lexical” gestures, whereas work that reflects the view of McNeill and colleagues explores a range of gestures along a continuum (Kendon, 1980; McNeill, 2005). Coding is further complicated when any single gesture can involve several aspects of different gesture subtypes (see, e.g., McNeill, 2005).
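To illustrate how scheme-bound such counts are, the sketch below renders the two cited category sets as explicitly declared schemes and tallies coded gestures against them. The category names follow the cited studies; the tallying helper and the example data are hypothetical conventions for exposition only, not published tools, and no crosswalk between the two schemes is implied.

```python
from enum import Enum

# Category sets from the two studies discussed above, declared explicitly so
# that any reported frequency is tied to a named scheme.
class FrickHorbury2002(Enum):
    ICONIC = "iconic"
    METAPHORIC = "metaphoric"
    BODY_FOCUSED = "body-focused movement"
    VAGUE = "vague gesture"

class AlibaliEtAl1999(Enum):
    CONTINUOUS = "continuous representation"  # smooth, continuous hand/arm motion
    DISCRETE = "discrete representation"      # taps, points, or beats
    BOTH = "both"
    NEITHER = "neither"

def tally(labels: list[Enum], scheme: type[Enum]) -> dict[str, int]:
    """Count coded gestures, rejecting labels outside the declared scheme,
    so that frequencies always refer to a specific category set."""
    counts = {member.name: 0 for member in scheme}
    for label in labels:
        if not isinstance(label, scheme):
            raise ValueError(f"{label!r} is not a category of {scheme.__name__}")
        counts[label.name] += 1
    return counts

# Example: counts produced under each scheme are not directly comparable
# across studies without an explicit mapping between the category sets.
sample_a = [FrickHorbury2002.ICONIC, FrickHorbury2002.VAGUE, FrickHorbury2002.ICONIC]
sample_b = [AlibaliEtAl1999.CONTINUOUS, AlibaliEtAl1999.DISCRETE, AlibaliEtAl1999.BOTH]
print(tally(sample_a, FrickHorbury2002))
print(tally(sample_b, AlibaliEtAl1999))
```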

In summary, along with the fundamental differences in perspectives on the function of gesture during communication, vague or idiosyncratic definitions and varying coding systems complicate the implementation and interpretation of gesture studies. These factors should be carefully considered by aphasiologists when investigating and considering treatment approaches that incorporate gesture for patients with aphasia.

TEMPORAL CHARACTERISTICS OF GESTURE

The timing features of gesture are of interest due to the predominantly co-verbal nature of communicative gestures (McNeill, 1992, 2005). Studying communicatively disordered populations, a common practice across aphasiology and related disciplines, may provide valuable insight into the intact gesture-language system for communication. For example, in one study Mayberry and Jaques (2000) observed that people who stutter halt their gesture stroke (while maintaining the hand-shape) until their dysfluency resolves, at which point the gesture stroke is continued or completed. Studies that incorporate hand/arm restriction paradigms also provide evidence on the synchronous nature of gesture and verbal communication (Rauscher et al., 1996; Rime & Schiaratura, 1991), pointing to a possible trade-off in the spatial content of the message between verbal expression and gesture, depending on whether movements are restricted (e.g., by crossing the arms), akin to the MAM hypothesis discussed above.

The temporal characteristics of gesture are particularly relevant in a model proposed by Tuite (1993). In this model, gesture and speech are rooted in a “rhythmic” or “viscero-motor” component of communication. To illustrate the close relation of gestures to an internal “rhythmic pulse” (an unspecified term), Tuite superimposes a speaker’s intonation peaks onto a movement display. Only limited data are provided as examples of this proposed phenomenon and, to our knowledge, no subsequent studies have been conducted to test this view. As such, Tuite’s rhythmic hypothesis raises additional questions rather than offering conclusions.

However, the notion of a rhythmic pulse is an interesting one for research and practice on gesture production in aphasia and other communicatively impaired populations. One might investigate, for example, the concurrence between a rhythmic interruption and an unfilled pause during a word-finding attempt. Perhaps such a relationship could be quantified with respect to the halting of a gesture stroke, as described in studies of people who stutter (Mayberry & Jaques, 2000).

The temporal relationship between spoken language and spontaneous hand, arm, and finger movements during communication remains an understudied area in gesture production. The precise parameters of the gestural movement and the spoken utterance have typically been judged perceptually, leaving much to interpretation, and inter-judge reliability is almost absent from published studies in this domain. A replicable and valid method for studying this relationship is needed in typical populations prior to its application to disordered populations like those with aphasia. For example, studies that directly manipulate the variable(s) predicted to impact the temporal relationship between speech and gesture (e.g., prosodic stress, perturbation of movement, communicative context) are needed in order to gain any precision in measuring this elusive relationship. Once a fuller picture of the timing of the speech and gesture relationship is captured, perhaps more specific diagnostic and treatment approaches for people with aphasia can be shaped.
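As one illustration of what a more replicable approach might look like, the sketch below assumes time-aligned gesture-stroke and word annotations (for example, exported from video/audio annotation software) and computes signed onset asynchronies rather than relying on perceptual judgement. The interval structure, the nearest-onset pairing rule, and the sample timings are all illustrative assumptions, not a validated or published method.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Interval:
    """A time-aligned annotation: a gesture stroke or a spoken word."""
    label: str
    onset: float   # seconds from recording start
    offset: float  # seconds from recording start

def onset_asynchrony(gesture_strokes: list[Interval],
                     spoken_words: list[Interval]) -> list[float]:
    """For each gesture stroke, find the spoken word with the nearest onset
    and return the signed onset difference (word onset minus stroke onset).
    Negative values mean the stroke began after the affiliated word;
    positive values mean the stroke led the word."""
    asynchronies = []
    for stroke in gesture_strokes:
        nearest = min(spoken_words, key=lambda w: abs(w.onset - stroke.onset))
        asynchronies.append(nearest.onset - stroke.onset)
    return asynchronies

# Example with fabricated timings, for illustration only.
strokes = [Interval("stroke", 1.20, 1.80), Interval("stroke", 4.05, 4.60)]
words = [Interval("cup", 1.35, 1.70), Interval("table", 4.00, 4.40)]
diffs = onset_asynchrony(strokes, words)
print(f"mean onset asynchrony: {mean(diffs):.3f} s")
```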

CONCLUSIONS

This paper presents a discussion of issues that, when appropriately considered, will enhance the conduct and interpretation of studies on gesture production in aphasia. There are clear limitations in the available body of gesture production literature in typical populations, ranging from vague and inconsistent definitions and coding systems to inadequate attention to temporal characteristics of gesture. Clinicians and investigators who wish to incorporate gesture into the diagnosis and treatment of a traditionally variable disordered population like those with aphasia will benefit from considering these issues.

As one step along a path to improving subsequent research efforts, we have provided a general working definition of communicative gesture, while emphasising that it will need to be tailored for specific purposes. We have also illustrated how the investigator’s stance on theoretical perspectives for the function of gesture can affect other crucial study elements like the development of research questions, choice of gesture elicitation methods, and identification of an explicit coding scheme for identifying, analysing, and interpreting gesture. Furthermore, we have suggested a path for studies of the temporal synchrony between speech and gesture in which a precise relationship has yet to be established.

As a final comment, in order to gain further insight into the nature and function of gesture, it is vital to understand individual differences in gesture production. Knowledge about individual variability in typical populations will also provide a backdrop against which to assess and manage gesture use in a disordered population like those with aphasia.

In sum, a marriage is needed of sound theoretical orientation, operational definitions, and measurement approaches to yield valid and replicable studies of gesture production in aphasia. Only then will we make progress in disentangling whether, and how, incorporating gesture in aphasia diagnosis and treatment will provide a “helping hand”.

References

  1. Alibali MW, Bassok M, Solomon KO, Syc SE, Goldin-Meadow S. Illuminating mental representations through speech and gesture. Psychological Science. 1999;10(4):327–333. [Google Scholar]
  2. Blonder LX, Burns AF, Bowers D, Moore RW, Heilman KM. Spontaneous gestures following right hemisphere infarct. Neuropsychologia. 1995;33(2):203–213. doi: 10.1016/0028-3932(94)00099-b. [DOI] [PubMed] [Google Scholar]
  3. Butterworth B, Beattie G. Gesture and silence as indicators of planning speech. In: Campbell R, Smith P, editors. Recent advances in the psychology of language. New York: Plenum; 1978. pp. 347–360. [Google Scholar]
  4. Butterworth B, Hadar U. Gesture, speech and computational stages: A reply to McNeill. Psychological Review. 1989;96(1):168–174. doi: 10.1037/0033-295x.96.1.168. [DOI] [PubMed] [Google Scholar]
  5. Butterworth B, Swallow J, Grimston M. Gestures and lexical processes in jargonaphasia. In: Brown J, editor. Jargonaphasia. New York: Academic Press; 1981. pp. 113–124. [Google Scholar]
  6. Cicone M, Wapner W, Foldi N, Zurif E, Gardner H. The relation between gesture and language in aphasic communication. Brain and Language. 1979;8:324–349. doi: 10.1016/0093-934x(79)90060-9. [DOI] [PubMed] [Google Scholar]
  7. Cunningham R, Ward CD. Evaluation of a training programme to facilitate conversation between people with aphasia and their partners. Aphasiology. 2003;17(8):687–707. [Google Scholar]
  8. Davis GA, Wilcox MJ. Adult aphasia rehabilitation: Applied pragmatics. San Diego, CA: College Hill Press; 1985. [Google Scholar]
  9. de Ruiter J. The production of gesture and speech. In: McNeill D, editor. Language and gesture. Cambridge, UK: Cambridge University Press; 2000. pp. 284–311. [Google Scholar]
  10. de Ruiter J. Can gesticulation help aphasic people speak, or rather, communicate? Advances in Speech-Language Pathology. 2006;8(2):124–127. [Google Scholar]
  11. Duffy RJ, Duffy JR. Pantomime recognition in aphasics. Journal of Speech and Hearing Disorders. 1975;44:156–168. doi: 10.1044/jshr.1801.115. [DOI] [PubMed] [Google Scholar]
  12. Duffy RJ, Duffy JR. Three studies of deficits in pantomimic expression and pantomimic recognition in aphasia. Journal of Speech and Hearing Research. 1981;46:70–84. doi: 10.1044/jshr.2401.70. [DOI] [PubMed] [Google Scholar]
  13. Efron D. Gesture and environment. Morningside Heights, NY: King’s Crown Press; 1941. [Google Scholar]
  14. Foundas AL, Macauley BL, Raymer AM, Maher LM, Heilman KM, Gonzalez Rothi LJ. Gesture laterality in aphasic and apraxic stroke patients. Brain and Cognition. 1995;29:204–213. doi: 10.1006/brcg.1995.1277. [DOI] [PubMed] [Google Scholar]
  15. Frick-Horbury D. The use of hand gestures as self-generated cues for recall of verbally associated targets. American Journal of Psychology. 2002;115(1):1–20. [PubMed] [Google Scholar]
  16. Holland AL. Communicative abilities in daily living. Baltimore, MD: University Park Press; 1980. [Google Scholar]
  17. Kendon A. Gesticulation and speech: Two aspects of the process of utterance. In: Key MR, editor. The relation between verbal and nonverbal communication. The Hague: Mouton; 1980. pp. 207–227. [Google Scholar]
  18. Krauss R, Chen Y, Chawla P. Nonverbal behavior and nonverbal communication: What do conversational hand gestures tell us? In: Zanna M, editor. Advances in experimental social psychology. Vol. 26. New York: Academic Press; 1996. pp. 389–450. [Google Scholar]
  19. Krauss R, Chen Y, Gottesman R. Lexical gestures and lexical access: A process model. In: McNeill D, editor. Language and gesture. Cambridge, UK: Cambridge University Press; 2000. pp. 261–283. [Google Scholar]
  20. Krauss R, Hadar U. The role of speech-related arm/hand gestures in word retrieval. In: Campbell R, Messing L, editors. Gesture, speech, and sign. Oxford, UK: Oxford University Press; 1999. pp. 63–116. [Google Scholar]
  21. Levelt W. Speaking–from intention to articulation. Cambridge, MA: MIT Press; 1989. [Google Scholar]
  22. Mayberry RI, Jaques J. Gesture production during stuttered speech: Insights into the nature of gesture-speech integration. In: McNeill D, editor. Language and gesture. Cambridge, UK: Cambridge University Press; 2000. pp. 199–214. [Google Scholar]
  23. McNeill D. Hand and mind: What gestures reveal about thought. Chicago, IL: University of Chicago Press; 1992. [Google Scholar]
  24. McNeill D. Gesture and thought. Chicago, IL: University of Chicago Press; 2005. [Google Scholar]
  25. Melinger A, Levelt W. Gesture and the communicative intention of the speaker. Gesture. 2004;4:119–141. [Google Scholar]
  26. Morrel-Samuels P, Krauss RM. Word familiarity predicts temporal asynchrony of hand gestures and speech. Journal of Experimental Psychology: Learning, Memory, and Cognition. 1992;18(3):615–622. [Google Scholar]
  27. Morsella E, Krauss RM. Muscular activity in the arm during lexical retrieval: Implications for gesture-speech theories. Journal of Psycholinguistic Research. 2005;34(4):415–427. doi: 10.1007/s10936-005-6141-9. [DOI] [PubMed] [Google Scholar]
  28. Porch BE. Porch index of communicative ability. Palo Alto, CA: Consulting Psychologists Press; 1967. [Google Scholar]
  29. Power E, Code C. Waving not drowning: Utilising gesture in the treatment of aphasia. Advances in Speech-Language Pathology. 2006;8(2):149–152. [Google Scholar]
  30. Rauscher FH, Krauss RM, Chen Y. Gesture, speech, and lexical access: The role of lexical movements in speech production. Psychological Science. 1996;7(4):226–231. [Google Scholar]
  31. Records N. A measure of the contribution of gesture to the perception of speech in listeners with aphasia. Journal of Speech and Hearing Research. 1994;37:1086–1099. doi: 10.1044/jshr.3705.1086. [DOI] [PubMed] [Google Scholar]
  32. Rime B, Schiaratura L. Gesture and speech. In: Feldman R, Rime B, editors. Fundamentals of nonverbal behaviour. Cambridge, UK: Cambridge University Press; 1991. pp. 239–284. [Google Scholar]
  33. Rodriguez AD, Gonzalez Rothi LJ. Even broken clocks are right twice a day: The utility of models in the clinical reasoning process. Advances in Speech-Language Pathology. 2006;8(2):120–123. [Google Scholar]
  34. Rose ML. The utility of arm and hand gestures in the treatment of aphasia. Advances in Speech-Language Pathology. 2006;8(2):92–109. [Google Scholar]
  35. Rosenbek JC, LaPointe LL, Wertz RT. Aphasia: A clinical approach. Boston: Little, Brown & Co; 1989. [Google Scholar]
  36. Tompkins CA, Scharp VL. Communicative value of self cues in aphasia: A re-evaluation. Aphasiology. 2006;20(7):684–704. doi: 10.1080/02687030500334076. [DOI] [PMC free article] [PubMed] [Google Scholar]
  37. Tuite K. The production of gesture. Semiotica. 1993;93(1–2):83–105. [Google Scholar]
