Philosophical Transactions of the Royal Society B: Biological Sciences
2019 Nov 18;375(1789):20180408. doi: 10.1098/rstb.2018.0408

Becoming human: human infants link language and cognition, but what about the other great apes?

Miriam A. Novack, Sandra Waxman
PMCID: PMC6895556  PMID: 31735145

Abstract

Human language has no parallel elsewhere in the animal kingdom. It is unique not only for its structural complexity but also for its inextricable interface with core cognitive capacities such as object representation, object categorization and abstract rule learning. Here, we (i) review recent evidence documenting how (and how early) language interacts with these core cognitive capacities in the mind of the human infant, and (ii) consider whether this link exists in non-human great apes—our closest genealogical cousins. Research with human infants demonstrates that well before they begin to speak, infants have already forged a link between language and core cognitive capacities. Evident by just three months of age, this language–cognition link unfolds in a rich developmental cascade, with each advance providing the foundation for subsequent, more precise and more powerful links. This link supports our species' capacity to represent and convey abstract concepts and to communicate beyond the immediate here and now. By contrast, although the communication systems of great apes are sophisticated in their own right, there is no conclusive evidence that apes establish reference, convey information declaratively or pass down communicative devices via cultural transmission. Thus, the evidence currently available reinforces the uniqueness of human language and the power of its interface to cognition.

This article is part of the theme issue ‘What can animal communication teach us about human language?’

Keywords: language, cognition, infancy, primates, communication, gesture

1. Introduction

Language is unique to humans. Acquiring a human language involves mastering a complex, multi-layered symbolic system that weaves together several components—phonetics, phonology, morphology, syntax, semantics, pragmatics. Moreover, language does not occur in isolation, but rather in exquisite interaction with a host of non-linguistic cues, including gesture, within a social communicative network [1,2]. Perhaps most importantly for the current paper, human language interfaces seamlessly and spontaneously with core cognitive capacities—complex processes such as object representation, object categorization and abstract rule learning [3,4]. When we say that language links to these core cognitive processes, we mean that linguistic signals feed back to influence our representations of the world [5]. This language–cognition interface makes possible unparalleled communicative precision and enhances conceptual flexibility. For example, in the context of viewing a single scene, we can specify either a particular individual (that monarch hovering by the lowest branch), the object category to which the individual belongs (butterflies), a property of that individual (its colour) or the action in which it is engaged (hovering). This flexibility, precision and representational power are hallmarks of human language: together, they permit us to move beyond the ‘here and now’, to imagine possible futures and evoke the past, to generate new symbolic systems, to create poetry and maths and to convey the contents of our hearts and minds across generations. This language–cognition link serves as a conduit for learning and cultural transmission.

Importantly, however, human language does not emerge in the infant fully formed. Instead, it evolves gradually over the first years of life, shaped not only by our native endowments but also by our experience with the particular native language(s) to which we are exposed [6]. This has raised several fundamental questions. Some researchers have asked how, and how early, infants begin to link the language they hear to the world around them [3]. Others have asked which aspects of human language, if any, are shared with non-human animals. Certainly, all species communicate; among great apes, for example, communication systems include vocalizations, gestures and facial expressions [7]. By examining these communicative signals, researchers have identified certain, perhaps isolated, components of the human linguistic system in non-humans. For example, it is now clear that some non-human species show rule-governed serial ordering of elements (protogrammar), while others show hints of reference [8]. Still other researchers have sought to identify which fundamental cognitive capacities of humans are shared with non-humans. Thus far, the evidence suggests that several core cognitive capacities are shared, including the ability to distinguish among distinct individual objects, to form object categories, to navigate space, to detect basic aspects of quantity and to understand causality [4,9]. There is also evidence to suggest that non-human primates, and most domesticated canines, share with humans certain social–cognitive skills [10,11].

In this paper, we take a new tack. We use the existing evidence from human infants and non-human great apes (hereafter, apes) to ask whether the communicative systems of apes influence core cognitive capacities, as language does for human infants. To the best of our knowledge, this question has not directly been tested.

To gain traction on this question, we cast a wide net, considering both language and gestures. Among humans, although gesture is considered non-linguistic or extra-linguistic, there is broad agreement that it is nonetheless a powerful communicative device, perhaps especially so for infants, who produce gestures before they produce their first words [12]. Among apes, too, gesture is prevalent from infancy and throughout their lifespan [13]. Thus, considering the gestural communication among human infants and apes provides traction on two theoretical issues. First, it permits us to examine communicative development broadly and, in doing so, to tease apart effects of language, per se, from effects of social communication more generally. Second, it provides insights into the communicative tools we share with apes and the ways in which we differ.

To foreshadow, despite the considerable evidence of sophisticated gestural communication systems among apes, we find no evidence that these systems link to core cognitive capacities as language does for human infants. In particular, apes (i) do not typically point or use their gestures to establish reference, (ii) use gesture primarily for imperative purposes, (iii) communicate primarily about the here and now, depending on present contextual support, and (iv) do not use gesture for cultural transmission. Moreover, in contrast with the evidence from human infants, there is scant evidence of substantial developmental change in the communicative capacities of apes. This, we argue, is consistent with the view that language and its increasingly specific links to core cognitive processes are unique to humans.

2. Early language acquisition and its link to cognition in the first years of life

In this section, we review evidence documenting the developmental origin of a precocious link between language and core cognitive capacities in human infants and how it unfolds in the first year of life. This review reveals a rich cascading process in which infants' earliest links between language and cognitive processes (evident as early as three months of age) provide the foundation for subsequent, more precise and more powerful links at the interface of language and cognition.

(a). The earliest interface between language and cognition: infants in their first year

It is now well known that human infants do not approach the task of language acquisition as blank slates. At birth, they prefer listening to the vocalizations of human and non-human primates over other sounds; within months, their listening preferences narrow as they become attuned to human language, and to their own native language(s) in particular [14]. This developmental process of perceptual tuning, in which exposure is instrumental [15,16], is adaptive. It ensures that infants devote their attention increasingly towards the communicative signals of our species and the individuals who will serve as their communicative and pedagogical partners [17].

Importantly, however, infants’ perceptual preferences for the sounds of language cannot reveal whether (and when) they begin to link these sounds to core cognitive capacities, including object and event categorization, object individuation and abstract rule learning. Addressing this issue requires identifying whether (and when) listening to language influences infants' performance in cognitive tasks like object categorization.

Evidence from infants’ performance in object categorization tasks reveals that well before infants begin to speak, a powerful link between language and core cognitive processes is already in place. For infants as young as three months of age, listening to human language boosts the formation of object categories—a building block of cognition. By contrast, listening to other sounds, including tone sequences and backwards speech, fails to support infant object categorization [18,19]. Thus, by three months of age, infants are not only tuned perceptually to the communicative signals of their communities, but are also tuned to a principled link between the sounds of language and core cognitive processes that will ultimately constitute the foundations of meaning.

This precocious link emerges as part of an initially broader template of signals that includes human language as well as vocalizations of non-human primates. At three and four months of age, listening to the vocalizations of the Madagascar blue-eyed lemur (Eulemur macaco flavifrons) provides precisely the same cognitive advantage as listening to human speech [19]. Yet by six months, lemur vocalizations no longer exert this advantageous effect [19]. Thus, by six months of age, infants have increasingly specified which signals they will link to core cognitive capacities.

But what are the mechanisms by which infants tune this language–cognition interface? The evidence reveals two distinct routes, both of which underscore infants’ exquisite sensitivity to experience.

First, experience influences which of the initially privileged links to core cognitive capacities (including those to human language and to lemur calls) infants will maintain and which they will sever. Between four and six months, infants gain considerable exposure to human vocalizations, but little, if any, to lemur vocalizations. Might they maintain the link to human vocalizations because they are exposed to this signal, and sever their initial link to lemur calls because they are ‘deprived’ of experience with this signal? We addressed this question by testing infants at six months, because at this point they have already ‘tuned out’ the link between lemur vocalizations and cognitive processes [19]. When we exposed infants systematically to lemur vocalizations between four and six months, they maintained (rather than tuned out) the link between lemur calls and categorization when tested at six months [20]. Importantly, exposing infants to signals outside the initially privileged set (e.g. backwards speech) conferred no advantage for categorization. Thus, although experience is instrumental in determining which of the initially privileged links infants will maintain and which they will sever, experience cannot on its own create new links between signals and core cognitive processes where none existed.

Second, for signals other than human and non-human primate vocalizations (that is, signals not included in infants' initially broad template), a different kind of experience is required. We know that humans readily infuse otherwise non-linguistic signals (e.g. sine-wave tone sequences) with communicative status (e.g. Morse code). We also know that listening to tone sequences, presented on their own, does not boost infant cognition at any age [18]. Recall, too, that mere exposure is insufficient to link these types of signals to cognition [20]. But what if such signals were embedded in an explicitly social, communicative interchange? Would six-month-old infants then link these otherwise ‘inert’ signals to core cognitive capacities like categorization and abstract rule learning?

To test this, we embedded sine-wave tone sequences into a rich communicative episode. We created a videotaped dialogue between two women—one speaking in English and the other responding in ‘beeps’ (i.e. sine-wave tones). When embedded within this social, communicative episode, these signals, which had previously been outside the initially privileged set, boosted both object categorization [21] and abstract rule learning [22]. Moreover, embedding otherwise inert signals in language can also be effective for older infants and toddlers [21,23,24]. Thus, the conceptual or symbolic power of these otherwise inert signals is ignited when they are embedded in language.

(b). Tuning the link between language and cognition in the second year of life

As development proceeds, the interface between language and core cognitive processes becomes further specified. Between 9 and 12 months, infants successfully trace not only when a novel word is uttered, but also to which objects or events it has been applied. As a result, the ways in which objects are named guide infants' mental representations of objects and categories. Object categorization in both infants and adults is facilitated when the same word is applied consistently to a set of distinct objects [25–27]. Conversely, infants represent objects as distinct individuals or distinct categories when different words are applied to distinct objects [28–32].

The language–cognition link, as expressed in very young infants, does not remain constant as infants enter their second and third year of life. On the contrary, infants’ expectations about the conceptual consequences of naming become increasingly powerful and precise as infants begin to tease apart distinct grammatical categories. They can use the position of a word within a sentence to distinguish among grammatical categories [33] and forge increasingly precise links between distinct grammatical forms and their distinct kinds of meaning.

These more specific links unfold in a cascading fashion. Until roughly 12 months of age, infants appear to be ‘generalists’: novel words, be they presented as nouns or adjectives, highlight any kind of commonality among objects [27,34,35]. Yet by 13 months, infants have more precise expectations. They extend novel nouns on the basis of category-based, and not property-based, commonalities, although they have not yet established a comparably precise expectation for adjectives [34]. Indeed, for most of their second year, infants continue to link novel adjectives to either category-based (e.g. horse) or property-based (e.g. colour, texture) commonalities [34–37]. It is not until 18–24 months that they begin to map novel adjectives specifically to property-based commonalities [36,38]. Lastly, infants' expectations for verbs also appear to follow a protracted developmental course: it is not until close to their second birthday that infants reliably map novel verbs to event categories [39–43].

Thus far, we have shown that within the first few years of life human infants show impressive links between language, communication and core cognitive processes: (i) a link between language and object categorization is evident as early as three months of age [18]; (ii) it emerges as part of a broad template that initially encompasses not only vocalizations of humans, but also those of non-human primates. This offers a glimpse into both the ontogenetic and phylogenetic origins of human language acquisition; (iii) by six months, infants have tuned this initially broad link specifically to human vocalizations [19]; (iv) this tuning process is experience-driven: if infants are exposed to signals that were part of the initially broad template during this tuning period (from four to six months), they maintain (rather than tune out) the link between these signals and cognition [20]; (v) another way to forge a link between a non-linguistic signal and cognition is to embed it within a rich social communicative episode; (vi) between 9 and 12 months, infants further tune their language–cognition interface. They expect that specific words, not just language in general, link to concepts; and (vii) during their second year, infants tease apart different kinds of words and map them to different kinds of meaning. This permits them to specify which of the myriad possible commonalities, present within a particular set of entities, a speaker is referring to. This is not to say that the linguistic sophistication of a 2-year-old approaches that of a 3-year-old or an adult, but rather that 2-year-olds are participating in a truly linguistic system.

Infants’ advances in the first years are impressive, but leave two key questions unanswered. First, it remains to be seen whether at the earliest developmental juncture (in the first year of life), language permits infants to move beyond the here-and-now. Second, additional research is required to identify the mechanisms by which listening to language boosts core cognitive processes [44]. It is possible that for very young infants (three to six months), simple attentional mechanisms may be at play; but by 12 months, if not earlier, attention is no longer a sufficient account [18,45]. Thus, in human infants, what may begin as an attentional mechanism evolves considerably over development.

(c). The development of gestural communication in human infants

For human infants, language does not develop in isolation, but in concert with other social communicative tools, most notably gesture. Infants' first gestures, emerging at just eight to nine months, typically include reaching to request objects, stretching out their arms to be picked up, as well as ritualized routines such as waving to others or playing peekaboo [46]. Around 12 months, infants across cultures begin to produce deictic points [12,47]. Because points typically precede infants’ production of their first words [12], they constitute infants' first productive, intentional representational tools. Infants point for declarative purposes (that is, to get others to feel or know things), as well as for imperative purposes (that is, to request objects, actions or information) [12,48–51]. In this way, infants’ precocious pointing gestures serve as an index of their underlying social–cognitive abilities.

Notably, infants’ early gestures, their pointing gestures in particular, predict their broader communicative development, including the attainment of near-term as well as longer-term lexical milestones. For example, early pointing predicts the onset of new words. Objects that infants point to (but do not yet name) tend to appear in their productive vocabularies just a few months later, suggesting that gesture can pave the way for early word learning [52]. Early gesture production (at 14 months) is also predictive of child vocabulary 3 years later [53]. Finally, and more specifically, pointing seems to be linked to the establishment of lexical reference—an infant is more likely to learn a novel label for an object if the labelling occurs while the infant is pointing to the object than if the infant is only looking at it or reaching towards it [51].

As infants begin to combine words in the second year of life, gesture continues to play an important role. Often, infants' gestures provide information that is redundant with what they say in speech (e.g. pointing to a cup while saying ‘cup’), but this is not always the case. Infants also produce supplementary speech + gesture combinations (e.g. pointing at a cup while saying ‘mummy’ to indicate ‘mummy's cup’). Combinations like these represent infants' earliest syntactic constructions and predict the onset of two-word combinations a few months later [52,54]. Thus, gesture brings additional sophistication to the interface between language and cognition. Note that the role of gesture in communication changes as children's linguistic capacities evolve. That is, although children first communicate primarily with gesture, gestures subsequently give way to formal language as children acquire the corresponding linguistic forms [55,56].

In addition, in the second year of life, iconic gestures begin to appear robustly in infants' production and comprehension [57–61]. Iconic gestures are those in which the form of the gesture represents the form of the intended referent (e.g. wiggling one's fingers to mimic the legs of a spider; moving one's hand to indicate the path of a train moving along a track). Although relatively infrequent in infants' spontaneous production [55,58], iconic gestures convey rich representational information and therefore signal a huge leap in infants’ communicative power. In particular, the combination of speech and iconic gesture supports conceptual advances in domains ranging from number learning [62] to Piagetian conservation [63] and mathematical equivalence [64].

Perhaps even more compelling evidence comes from infants who are born deaf to hearing, non-signing parents. For these infants, gesture constitutes an especially powerful linguistic element [65]. Although deprived of linguistic input from speech, these deaf infants do have access to a host of other communicative signals, including touch, eye gaze and gesture. Like hearing children, they gesture to communicate, yet in contrast with hearing children, their gestures become increasingly systematic and elaborate as they create systems known as homesigns [65,66]. Careful analyses of homesign systems reveal linguistic features that appear resiliently even in the absence of any linguistic input. For example, homesigning children impose consistent serial ordering of gestural elements [67], establish displaced reference [68], mark negation [69] and mark distinctions between nouns and verbs [70]. Importantly, this systematicity is imposed by the children themselves; it is absent in the gestures of their parents [71]. These resilient features highlight human infants' innate capacity to build a language; they also underscore our species’ prowess in appropriating signals other than spoken language into our communicative repertoires.

Nevertheless, in the absence of linguistic interchange, homesign can only take a child so far. It is only when homesigning children come together that their gestural communication develops into a more complex and linguistically sophisticated system [72]. Moreover, homesign gestural systems do not appear to have the same advantageous effect as a full shared language when it comes to certain core concepts such as large exact number or spatial representations [73,74]. This suggests that an important ingredient in the link between language and core cognitive processes may be the element of cultural transmission.

Together, the evidence from hearing and deaf children reveals four ways in which human infants link their communicative systems (language and gesture) to core cognitive processes: (i) infants as young as three months spontaneously link the language they hear to core cognitive processes. It remains to be seen whether young infants forge this same link for communicative signals more broadly, including gesture; (ii) gesture is an integral component of infant communicative development and foreshadows subsequent semantic and syntactic advances; (iii) gestures are typically infants' first productive communicative tools, but as infants acquire language, gestures give way, taking on a supplementary (secondary) role; and (iv) finally, in the absence of linguistic input (i.e. in the case of deaf children born to hearing parents), gestures have the capacity to convey language-like features (i.e. homesigns). However, homesign on its own does not support the acquisition of foundational concepts (like numerical or spatial representations) in the same way that language does. Only when these gesture systems come together within a linguistic community does this link to foundational concepts emerge.

3. Beyond humans: gestural communication among non-human great apes

Our goal in this last section is to look beyond human infants and consider the communicative abilities of the other great apes.

Like humans, apes use a range of communicative signals—including vocalizations, facial expressions and gestures—to convey information and to influence the behaviour of others [75]. There has been substantial research on primate vocalizations, much of it focusing specifically on vocalizations produced in the context of evolutionarily urgent survival functions such as responding to predators and communicating sources of food [76,77]. Certainly, these vocalizations are impressive and some have argued that they reflect a variety of sophisticated cognitive and social functions [78–81]. But others have argued that ape gestures—more than their vocalizations—provide the most compelling comparisons to human language [82–85]. Therefore, we focus primarily on ape gesture and its communicative status in comparison to human language and human gesture.

(a). The gestural repertoires of apes

Gestures are part of the communicative repertoires of all species of great apes [75]. Typically emerging at around nine months of age [86,87], these gestures are produced deliberately and voluntarily in social interactions [88–90]. Ape gestures have been defined as ‘discrete, mechanically ineffective physical movements of the body observed during periods of communication’ [88, p. 749] and include tactile gestures (involving bodily contact with another individual, e.g. hitting another), auditory gestures (incorporating non-vocal sounds, e.g. stomping) and visible gestures (those that can be seen from a distance, e.g. arm raising) [75,88,90]. Note that within the ape and human literatures, the definition of what constitutes a gesture often differs; definitions of human gesture tend to focus primarily on silent, empty-handed movements that make no physical contact with objects or other people (see [91] for a discussion of differences in definition and coding). Nevertheless, important cross-species comparisons can be made.

Most researchers agree that apes' gestures share two key features with those of humans: flexibility and intentionality [13,88,90,92–94]. Regarding flexibility, apes produce a variety of gesture types (e.g. arm raise; poke; object shake) flexibly across different situations (e.g. affiliation, grooming, resting, social play). They can use a single gesture type across multiple contexts, as well as multiple gesture types within a single context [7]. Regarding intentionality, apes’ gestures are often responsive to the attentional states of their would-be communicative partners: for instance, when the intended partner is looking elsewhere, chimpanzees tend to initiate communication with tactile gestures (e.g. touching the other) or auditory gestures (e.g. banging on the ground) [88,95].

(b). How do ape gestures differ from human gestures?

Like those of humans, the gestures of apes reveal an intention to communicate and an insight into the attentional states of conspecifics. But apes' gestures also differ considerably from those of humans. Most striking are species differences in the presence versus absence of (i) pointing and the establishment of reference, (ii) gesturing for declarative purposes, (iii) communicating without contextual support, and (iv) evidence for learning processes and developmental cascades. Together, these differences raise intriguing questions about whether the communicative systems of apes link to their core cognitive capacities, as is the case for humans.

First, apes in the wild do not use pointing gestures to communicate with conspecifics [96–100]. In one comparative study, researchers used the same criteria to characterize spontaneous referential gestures produced by 1- to 2-year-old human children and chimpanzees in natural settings. Among children, nearly 25% of the gestures produced were classified as potentially referential (e.g. directed to an external location or third party). Among chimpanzees, fewer than 0.1% of their gestures met this criterion [91]. Human-reared or captive apes can be taught to use pointing gestures; however, these points typically occur only in communication with humans and only in contexts where the goal is to convey imperative information, typically a request for food [101–106].

Second, beyond the case of pointing, ape gestures appear to be reserved exclusively for imperative purposes. Apes gesture to regulate face-to-face interactions in the here-and-now such as play, grooming, fighting or tandem travel [98]. For example, most gestures between apes are produced in dyadic contexts, aimed at getting the attention of a would-be social partner [107]. There is little to no evidence that apes gesture declaratively to direct another's attention to an object or event simply for the sake of sharing interest in it or commenting on it. By contrast, human infants frequently gesture for declarative purposes, sharing their intentions with their carers [48,98,102].

Third, ape gestures are considerably more dependent on the contextual support of the present than are those of human children. Although both children and captive apes can use gesture to refer to non-present entities (e.g. they can point to an empty plate that used to contain food), these gestures are still dependent on referencing present objects (i.e. the now-empty plate) [104–106]. In wild ape populations, spontaneous gestures typically require the use of a present object. For example, to request ‘play’, an ape may hit a conspecific; to request ‘being carried’, a juvenile may place their hand on their mother's back. Humans, too, can use contextual support to express ideas via gesture (e.g. pointing to an object that we want). Yet, in addition, we also ubiquitously gesture in the absence of any referent object (e.g. using one's hands to describe the shape of a missing puzzle piece; demonstrating how to cut with scissors, even when none are present). There is no clear evidence of this type of iconic gesture production in apes [108].

Fourth, there is little evidence that ape gestural repertoires are readily learned through imitation or through cultural transmission [88,93,98,109–111]. Instead, the ape gestural repertoire consists primarily of species-typical behaviours [88–90]. Some have interpreted the scant variability in gestural repertoires across groups of apes as evidence that ape gestures are innate, acquired primarily through genetic transmission [88,93]. Others have claimed that certain types of ape gestures are adaptations of full-fledged actions into more restricted gestural forms that elicit a target behaviour, a process known as ontogenetic ritualization [111]. For example, an infant who wants to climb onto its mother's back may first push down on her rear end to gain access. Over time, this behaviour is streamlined: to elicit the response, the infant need only touch the mother's back [112]. Certainly, this process involves learning, but the learning occurs only within that particular dyad, in that particular interaction, concerning that particular action.

Finally, there are dramatic differences in the developmental course of gesture systems among apes and humans. The most striking difference is that, in humans, early gestures are integrated spontaneously into a rapidly burgeoning linguistic system; this system is at once more comprehensive in its communicative and symbolic reach and more precise in its expression than the systems observed among apes. Human infants initially rely heavily on gestures, but this reliance decreases steadily [52,56]. As their linguistic capacities advance, infants move systematically from producing gestures alone to producing gesture + language (and then language + language) combinations. By contrast, apes' reliance on gesture for communication does not seem to change, even among apes trained by humans to acquire new symbolic signals [113].

Certainly, there are cases in which apes, raised by humans, learn complex symbol systems including spoken words, pictograms or sign language [114–119] (see [120] for a comprehensive review and discussion of the many controversies surrounding this topic). In such cases, learning to use discrete symbols is achieved only through considerable repetition or reward-based training paradigms. Evidence like this offers insight into the capabilities of the ape mind, given a set of symbols. Nevertheless, we are cautious in drawing strong conclusions from these examples, as they are rare and have been observed only as a result of human intervention.

The fact that apes are capable of learning new symbol systems speaks to their impressive intelligence, and to the obvious evolutionary links between the ape brain and the human brain. Additionally, there is evidence that some language-trained apes can successfully group novel exemplars into lexical categories, raising the intriguing possibility that learning human-like abstract symbols may support object categorization in non-human apes [121]. However, it typically takes apes many trials of reward-based learning to acquire even basic use of these symbols. Note that this differs from how human infants spontaneously acquire language, as well as how they can easily adopt novel symbols as category markers or object labels, given only a single session of seeing these symbols embedded in a communicative interaction [21,23]. One perhaps important counter-example bears mention: two infant bonobos, Kanzi and Mulika, may have spontaneously learned symbols on which their mother had previously been trained [118,119].

Even for an ape that has mastered productive use of a symbol system with its human trainers, there are sharp boundary conditions on that use. Language-trained apes use acquired symbol systems almost exclusively for imperative purposes in interactions with humans [122,123]. Furthermore, in stark contrast to humans, there seems to be an upper limit to apes’ combinatorial abilities. Even language-trained apes overwhelmingly produce symbols in isolation; the virtual absence of combinations that exceed two symbols reveals a compelling difference between children and apes [124].

Taken together, the existing evidence reveals that although apes in the wild make impressive use of communicative gestures, produced intentionally and with flexibility, these gesture systems differ dramatically from human communication (for a more nuanced discussion, see [88]). Apes do not make use of pointing gestures; they gesture only for imperative purposes, typically require present contextual support to gesture, do not pass down their gestures through cultural transmission and do not undergo significant developmental shifts in gesture use. Finally, although some apes have, with extensive training, learned a limited set of human-like symbol systems, their learning processes are distinct from human language learning and their use of these symbols remains limited.

4. Conclusion

Like humans, apes have flexible and intentional communication systems. However, the representational scope and precision of these systems differ from those of human language. Human infants are equipped with an innate predisposition to acquire language, one that is shaped powerfully by their tuning to communicative input. Only human infants reveal an intricate developmental cascade of communicative capacities, each serving as the springboard for the next. Only humans appear to communicate referentially and declaratively, and to engage in cultural learning and transmission.

Human infants as young as three months of age link language to core cognitive processes such as categorization, and this link continues to unfold with development. But what about apes? Certainly, apes' gestures influence their conspecifics' planning, expectations and behaviour. But do these gestures feed back to influence core cognitive processes, like categorization or rule learning, as language does for humans? To the best of our knowledge, this question has not been tested. Given the evidence reviewed here, we question whether apes' gestures would have the capacity to link to core cognitive processes in the way that language does for humans. That is, without components like referential communication, declarative functions, abstraction and cultural transmission, it seems unlikely that apes' gestural signals would be capable of influencing their core cognitive processes. We look forward to future research that will directly address this question. Evidence like this may shed light on whether the gestural systems of apes provided the evolutionary groundwork for the emergence of human language, as well as on the evolutionary origins of the powerful and intricate link between language and cognition.

Data accessibility

This article has no additional data.

Authors' contributions

M.A.N. and S.W. both contributed to the theoretical framework and to the writing of the manuscript.

Competing interests

We declare we have no competing interests.

Funding

This work was supported by NIH (grant no. R01HD083310) to S.W. and NIH (grant no. F32HD095580) to M.A.N.

References

  • 1.McNeill D. 1992. Hand and mind: what gestures reveal about thought. Chicago, IL: University of Chicago Press. [Google Scholar]
  • 2.Kendon A. 2004. Gesture: visible action as utterance. Cambridge, UK: Cambridge University Press. [Google Scholar]
  • 3.Perszyk DR, Waxman SR. 2018. Linking language and cognition in infancy. Annu. Rev. Psychol. 69, 231–250. ( 10.1146/annurev-psych-122216-011701) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4.Spelke ES. 2003. What makes us smart? Core knowledge and natural language. In Language in mind: advances in the investigation of language and thought (eds Gentner D, Goldin-Meadow S). Cambridge, MA: MIT Press. [Google Scholar]
  • 5.Miller GA. 1990. The place of language in a scientific psychology. Psychol. Sci. 1, 7–14. ( 10.1111/j.1467-9280.1990.tb00059.x) [DOI] [Google Scholar]
  • 6.Mehler J, Christophe A. 1995. Maturation and learning of language in the first year of life. In The cognitive neurosciences (ed. Gazzaniga MS.), pp. 943–954. Cambridge, MA: The MIT Press. [Google Scholar]
  • 7.Call J, Tomasello M. 2007. The gestural repertoire of chimpanzees (Pan troglodytes). In The gestural communication of apes and monkeys (eds Call J, Tomasello M), pp. 17–39. New York, NY: Taylor & Francis; ( 10.1075/gest.8.3.18ken) [DOI] [Google Scholar]
  • 8.Hauser MD, Chomsky N, Fitch WT. 2002. The faculty of language: what is it, who has it, and how did it evolve? Science 298, 1569–1579. ( 10.1126/science.298.5598.1569) [DOI] [PubMed] [Google Scholar]
  • 9.Carey S. 2009. The origin of concepts. Oxford, UK: Oxford University Press. [Google Scholar]
  • 10.Hare B, Brown M, Williamson C, Tomasello M. 2002. The domestication of social cognition in dogs. Science 298, 1634–1636. ( 10.1126/science.1072702) [DOI] [PubMed] [Google Scholar]
  • 11.Call J, Tomasello M. 2008. Does the chimpanzee have a theory of mind? 30 years later. Trends Cogn. Sci. 12, 187–192. [DOI] [PubMed] [Google Scholar]
  • 12.Bates E, Camaioni L, Volterra V. 1975. The acquisition of performatives prior to speech. Merrill Palmer Q. Behav. Dev. 21, 205–226. [Google Scholar]
  • 13.Tomasello M, George BL, Kruger AC, Jeffrey M, Farrar A, Evans A. 1985. The development of gestural communication in young chimpanzees. J. Hum. Evol. 14, 175–186. ( 10.1016/S0047-2484(85)80005-1) [DOI] [Google Scholar]
  • 14.Vouloumanos A, Hauser MD, Werker JF, Martin A. 2010. The tuning of human neonates’ preference for speech. Child Dev. 81, 517–527. ( 10.1111/j.1467-8624.2009.01412.x) [DOI] [PubMed] [Google Scholar]
  • 15.Werker JF, Tees RC. 1984. Cross-language speech perception: evidence for perceptual reorganization during the first year of life. Infant Behav. Dev. 7, 49–63. ( 10.1016/S0163-6383(84)80022-3) [DOI] [Google Scholar]
  • 16.Kuhl PK, Williams KA, Lacerda F, Stevens KN, Lindblom B. 1992. Linguistic experience alters phonetic perception in infants by 6 months of age. Science 255, 606–608. ( 10.1126/science.1736364) [DOI] [PubMed] [Google Scholar]
  • 17.Vouloumanos A, Waxman SR. 2014. Listen up! Speech is for thinking during infancy. Trends Cogn. Sci. 18, 642–646. ( 10.1016/j.tics.2014.10.001) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Ferry AL, Hespos SJ, Waxman SR. 2010. Categorization in 3- and 4-month-old infants: an advantage of words over tones. Child Dev. 81, 472–479. ( 10.1111/j.1467-8624.2009.01408.x) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19.Ferry AL, Hespos SJ, Waxman SR. 2013. Nonhuman primate vocalizations support categorization in very young human infants. Proc. Natl Acad. Sci. USA 110, 15 231–15 235. ( 10.1073/pnas.1221166110) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20.Perszyk DR, Waxman SR. 2016. Listening to the calls of the wild: the role of experience in linking language and cognition in young infants. Cognition 153, 175–181. ( 10.1016/j.cognition.2016.05.004) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.Ferguson B, Waxman SR. 2016. What the [beep]? Six-month-olds link novel communicative signals to meaning. Cognition 146, 185–189. ( 10.1016/j.cognition.2015.09.020) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22.Ferguson B, Lew-Williams C. 2016. Communicative signals support abstract rule learning by 7-month-old infants. Sci. Rep. 6, 25434 ( 10.1038/srep25434) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.Woodward AL, Hoyne KL. 1999. Infants’ learning about words and sounds in relation to objects. Child Dev. 70, 65–77. ( 10.1111/1467-8624.00006) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24.Namy LL, Waxman SR. 1998. Words and gestures: infants' interpretations of different forms of symbolic reference. Child Dev. 69, 295–308. ( 10.1111/j.1467-8624.1998.tb06189.x) [DOI] [PubMed] [Google Scholar]
  • 25.Gelman SA, Heyman GD. 1999. Carrot-eaters and creature-believers: the effects of lexicalization on children's inferences about social categories. Psychol. Sci. 10, 489–493. ( 10.1111/1467-9280.00194) [DOI] [Google Scholar]
  • 26.Lupyan G, Rakison DH, McClelland JL. 2007. Language is not just for talking: redundant labels facilitate learning of novel categories. Psychol. Sci. 18, 1077–1083. ( 10.1111/j.1467-9280.2007.02028.x) [DOI] [PubMed] [Google Scholar]
  • 27.Waxman SR, Markow DB. 1995. Words as invitations to form categories: evidence from 12- to 13-month-old infants. Cogn. Psychol. 29, 257–302. ( 10.1006/cogp.1995.1016) [DOI] [PubMed] [Google Scholar]
  • 28.Feigenson L, Halberda J. 2008. Conceptual knowledge increases infants' memory capacity. Proc. Natl Acad. Sci. USA 105, 9926–9930. ( 10.1073/pnas.0709884105) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.Ferguson B, Havy M, Waxman SR. 2015. The precision of 12-month-old infants' link between language and categorization predicts vocabulary size at 12 and 18 months. Front. Psychol. 6, 1319 ( 10.3389/fpsyg.2015.01319) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Waxman SR, Braun I. 2005. Consistent (but not variable) names as invitations to form object categories: new evidence from 12-month-old infants. Cognition 95, B59–B68. ( 10.1016/j.cognition.2004.09.003) [DOI] [PubMed] [Google Scholar]
  • 31.Xu F. 2002. The role of language in acquiring object kind concepts in infancy. Cognition 85, 223–250. ( 10.1016/S0010-0277(02)00109-9) [DOI] [PubMed] [Google Scholar]
  • 32.Xu F, Cote M, Baker A. 2005. Labeling guides object individuation in 12-month-old infants. Psychol. Sci. 16, 372–377. ( 10.1111/j.0956-7976.2005.01543.x) [DOI] [PubMed] [Google Scholar]
  • 33.Waxman SR, Lidz J. 2006. Early word learning. In Handbook of child psychology: cognition, perception, and language (eds Kuhn D, Siegler RS), pp. 229–335, 6th edn Hoboken, NJ: Wiley. [Google Scholar]
  • 34.Waxman SR. 1999. Specifying the scope of 13-month-olds’ expectations for novel words. Cognition 70, B35–B50. ( 10.1016/S0010-0277(99)00017-7) [DOI] [PubMed] [Google Scholar]
  • 35.Waxman S, Booth A. 2003. The origins and evolution of links between word learning and conceptual organization: new evidence from 11-month-olds. Dev. Sci. 6, 128–135. ( 10.1111/1467-7687.00262) [DOI] [Google Scholar]
  • 36.Booth AE, Waxman SR. 2009. A horse of a different color: specifying with precision infants' mappings of novel nouns and adjectives. Child Dev. 80, 15–22. ( 10.1111/j.1467-8624.2008.01242.x) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37.Imai M, Gentner D. 1997. A cross-linguistic study of early word meaning: universal ontology and linguistic influence. Cognition 62, 169–200. ( 10.1016/S0010-0277(96)00784-6) [DOI] [PubMed] [Google Scholar]
  • 38.Waxman SR, Markow DB. 1998. Object properties and object kind: twenty-one-month-old infants’ extension of novel adjectives. Child Dev. 69, 1313–1329. ( 10.2307/1132268) [DOI] [PubMed] [Google Scholar]
  • 39.Arunachalam S, Waxman SR. 2010. Meaning from syntax: evidence from 2-year-olds. Cognition 114, 442–446. ( 10.1016/j.cognition.2009.10.015) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 40.Arunachalam S, Escovar E, Hansen MA, Waxman SR. 2013. Out of sight, but not out of mind: 21-month-olds use syntactic information to learn verbs even in the absence of a corresponding event. Lang. Cogn. Process. 28, 417–425. ( 10.1080/01690965.2011.641744) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 41.Syrett K, Arunachalam S, Waxman SR. 2014. Slowly but surely: adverbs support verb learning in 2-year-olds. Lang. Learn. Dev. 10, 263–278. ( 10.1080/15475441.2013.840493) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 42.Waxman SR, Lidz JL, Braun IE, Lavin T. 2009. Twenty four-month-old infants' interpretations of novel verbs and nouns in dynamic scenes. Cognit. Psychol. 59, 67–95. ( 10.1016/j.cogpsych.2009.02.001) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 43.Yuan S, Fisher C. 2009. ‘Really? She Blicked the Baby?’: two-year-olds learn combinatorial facts about verbs by listening. Psychol. Sci. 20, 619–626. ( 10.1111/j.1467-9280.2009.02341.x) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 44.Ferguson B, Waxman SR. 2017. Linking language and categorization in infancy. J. Child Lang. 44, 527–552. ( 10.1017/S0305000916000568) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 45.Gelman SA, Waxman SR. 2009. Response to Sloutsky: taking development seriously: theories cannot emerge from associations alone. Trends Cogn. Sci. 13, 332–333. ( 10.1016/j.tics.2009.05.004) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 46.Fenson L, Dale PS, Reznick JS, Bates E, Thal DJ, Pethick SJ, Tomasello M, Mervis CB, Stiles J. 1994. Variability in early communicative development. Monogr. Soc. Res. Child Dev. 59, i–185. ( 10.2307/1166093) [DOI] [PubMed] [Google Scholar]
  • 47.Liszkowski U, Brown P, Callaghan T, Takada A, de Vos C. 2012. A prelinguistic gestural universal of human communication. Cogn. Sci. 36, 698–713. ( 10.1111/j.1551-6709.2011.01228.x) [DOI] [PubMed] [Google Scholar]
  • 48.Tomasello M, Carpenter M, Liszkowski U. 2007. A new look at infant pointing. Child Dev. 78, 705–722. ( 10.1111/j.1467-8624.2007.01025.x) [DOI] [PubMed] [Google Scholar]
  • 49.Begus K, Southgate V. 2012. Infant pointing serves an interrogative function. Dev. Sci. 15, 611–617. ( 10.1111/j.1467-7687.2012.01160.x) [DOI] [PubMed] [Google Scholar]
  • 50.Kovács ÁM, Tauzin T, Téglás E, Gergely G, Csibra G. 2014. Pointing as epistemic request: 12-month-olds point to receive new information. Infancy 19, 543–557. ( 10.1111/infa.12060) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 51.Lucca K, Wilbourn MP. 2018. Communicating to learn: infants' pointing gestures result in optimal learning. Child Dev. 89, 941–960. ( 10.1111/cdev.12707) [DOI] [PubMed] [Google Scholar]
  • 52.Iverson JM, Goldin-Meadow S. 2005. Gesture paves the way for language development. Psychol. Sci. 16, 367–371. ( 10.1111/j.0956-7976.2005.01542.x) [DOI] [PubMed] [Google Scholar]
  • 53.Rowe ML, Goldin-Meadow S. 2009. Early gesture selectively predicts later language learning. Dev. Sci. 12, 182–187. ( 10.1111/j.1467-7687.2008.00764.x) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 54.Özçalışkan Ş, Goldin-Meadow S. 2009. When gesture-speech combinations do and do not index linguistic change. Lang. Cogn. Process. 24, 190–217. ( 10.1080/01690960801956911) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 55.Iverson JM, Capirci O, Caselli MC. 1994. From communication to language in two modalities. Cogn. Dev. 9, 23–43. ( 10.1016/0885-2014(94)90018-3) [DOI] [Google Scholar]
  • 56.Goldin-Meadow S, Butcher C. 2003. Pointing toward two-word speech in young children. In Pointing: where language, culture and cognition meet (ed. Kita S.), pp. 85–107. Mahwah, NJ: Erlbaum; ( 10.4324/9781410607744) [DOI] [Google Scholar]
  • 57.Stanfield C, Williamson R, Özçalişkan Ş. 2014. How early do children understand gesture–speech combinations with iconic gestures? J. Child Lang. 41, 462–471. ( 10.1017/S0305000913000019) [DOI] [PubMed] [Google Scholar]
  • 58.Özçalışkan Ş, Goldin-Meadow S. 2011. Is there an iconic gesture spurt at 26 months? In Integrating gestures: the interdisciplinary nature of gesture (eds Stam G, Ishino M). Amsterdam, The Netherlands: John Benjamins. [Google Scholar]
  • 59.Cartmill EA, Rissman L, Novack MA, Goldin-Meadow S. 2017. The development of iconicity in children's co-speech gesture and homesign. Lang Interact Acquis. 8, 42–68. ( 10.1075/lia.8.1.03car) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 60.Novack MA, Goldin-Meadow S, Woodward AL. 2015. Learning from gesture: how early does it happen? Cognition 142, 138–147. ( 10.1016/j.cognition.2015.05.018) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 61.Goodrich W, Hudson Kam CL. 2009. Co-speech gesture as input in verb learning. Dev. Sci. 12, 81–87. ( 10.1111/j.1467-7687.2008.00735.x) [DOI] [PubMed] [Google Scholar]
  • 62.Gibson DJ, Gunderson EA, Spaepen E, Levine SC, Goldin-Meadow S. 2019. Number gestures predict learning of number words. Dev Sci. 22, e12791 ( 10.1111/desc.12791) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 63.Breckinridge Church R, Goldin-Meadow S. 1986. The mismatch between gesture and speech as an index of transitional knowledge. Cognition 23, 43–71. ( 10.1016/0010-0277(86)90053-3) [DOI] [PubMed] [Google Scholar]
  • 64.Perry M, Breckinridge Church R, Goldin-Meadow S. 1988. Transitional knowledge in the acquisition of concepts. Cogn. Dev. 3, 359–400. ( 10.1016/0885-2014(88)90021-4) [DOI] [Google Scholar]
  • 65.Goldin-Meadow S. 2003. The resilience of language: what gesture creation in deaf children can tell us about how all children learn language. New York, NY: Psychology Press; ( 10.4324/9780203943267) [DOI] [Google Scholar]
  • 66.Goldin-Meadow S, Mylander C. 1984. Gestural communication in deaf children: the effects and noneffects of parental input on early language development. Monogr. Soc. Res. Child Dev. 49, 1–151. ( 10.2307/1165838) [DOI] [PubMed] [Google Scholar]
  • 67.Feldman H, Goldin-Meadow S, Gleitman L. 1987. Beyond Herodotus: the creation of a language by linguistically deprived deaf children. In Action, symbol, and gesture: the emergence of language (ed. Lock A.), pp. 351–414. New York, NY: Academic Press. [Google Scholar]
  • 68.Butcher C, Mylander C, Goldin-Meadow S. 1991. Displaced communication in a self-styled gesture system: pointing at the nonpresent. Cogn. Dev. 6, 315–342. ( 10.1016/0885-2014(91)90042-C) [DOI] [Google Scholar]
  • 69.Franklin A, Giannakidou A, Goldin-Meadow S. 2011. Negation, questions, and structure building in a homesign system. Cognition 118, 398–416. ( 10.1016/j.cognition.2010.08.017) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 70.Goldin-Meadow S, Butcher C, Mylander C, Dodge M. 1994. Nouns and verbs in a self-styled gesture system: what's in a name? Cogn. Psychol. 27, 259–319. ( 10.1006/cogp.1994.1018) [DOI] [PubMed] [Google Scholar]
  • 71.Goldin-Meadow S, Mylander C. 1983. Gestural communication in deaf children: noneffect of parental input on language development. Science 221, 372–374. ( 10.1126/science.6867713) [DOI] [PubMed] [Google Scholar]
  • 72.Senghas A, Kita S, Özyürek A. 2004. Children creating core properties of language: evidence from an emerging sign language in Nicaragua. Science 305, 1779–1782. ( 10.1126/science.1100199) [DOI] [PubMed] [Google Scholar]
  • 73.Spaepen E, Coppola M, Spelke ES, Carey SE, Goldin-Meadow S. 2011. Number without a language model. Proc. Natl Acad. Sci. USA 108, 3163–3168. ( 10.1073/pnas.1015975108) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 74.Pyers JE, Shusterman A, Senghas A, Spelke ES, Emmorey K. 2010. Evidence from an emerging sign language reveals that language supports spatial cognition. Proc. Natl Acad. Sci. USA 107, 12 116–12 120. ( 10.1073/pnas.0914044107) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 75.Call J, Tomasello M (eds). 2007. The gestural communication of apes and monkeys. New York, NY: Taylor & Francis Group/Lawrence Erlbaum Associates. [Google Scholar]
  • 76.Seyfarth RM, Cheney DL. 2003. Signalers and receivers in animal communication. Annu. Rev. Psychol. 54, 145–173. ( 10.1146/annurev.psych.54.101601.145121) [DOI] [PubMed] [Google Scholar]
  • 77.Seyfarth RM, Cheney DL. 2010. Production, usage, and comprehension in animal vocalizations. Brain Lang. 115, 92–100. ( 10.1016/j.bandl.2009.10.003) [DOI] [PubMed] [Google Scholar]
  • 78.Schel AM, Townsend SW, Machanda Z, Zuberbühler K, Slocombe KE. 2013. Chimpanzee alarm call production meets key criteria for intentionality. PLoS ONE 8, e76674 ( 10.1371/journal.pone.0076674) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 79.Crockford C, Wittig RM, Mundry R, Zuberbühler K. 2012. Wild chimpanzees inform ignorant group members of danger. Curr. Biol. 22, 142–146. ( 10.1016/j.cub.2011.11.053) [DOI] [PubMed] [Google Scholar]
  • 80.Zuberbühler K, Gomez J-C. 2018. Primate intentional communication. In The international encyclopedia of anthropology (ed. Callan H.), pp. 1–10. Hoboken, NJ: Wiley Blackwell. [Google Scholar]
  • 81.Lameira AR. 2017. Bidding evidence for primate vocal learning and the cultural substrates for speech evolution. Neurosci. Biobehav. Rev. 83, 429–439. ( 10.1016/j.neubiorev.2017.09.021) [DOI] [PubMed] [Google Scholar]
  • 82.Arbib MA, Liebal K, Pika S. 2008. Primate vocalization, gesture, and the evolution of human language. Curr. Anthropol. 49, 1053–1076. ( 10.1086/593015) [DOI] [PubMed] [Google Scholar]
  • 83.Corballis MC, Corballis MC. 2002. From hand to mouth: the origins of language. Princeton, NJ: Princeton University Press; ( 10.1017/s0022226702221982) [DOI] [Google Scholar]
  • 84.Armstrong DF, Stokoe WC, Wilcox SE. 1995. Gesture and the nature of language. Cambridge, UK: Cambridge University Press. [Google Scholar]
  • 85.Hewes GW, et al. 1973. Primate communication and the gestural origin of language (and comments and reply). Curr. Anthropol. 14, 5–24. ( 10.1086/201401) [DOI] [Google Scholar]
  • 86.Plooij FX. 1984. The behavioral development of free-living chimpanzee babies and infants. Norwood, NJ: Ablex Publishing Corp. [Google Scholar]
  • 87.Schneider C, Call J, Liebal K. 2012. Onset and early use of gestural communication in nonhuman great apes. Am. J. Primatol. 74, 102–113. ( 10.1002/ajp.21011) [DOI] [PubMed] [Google Scholar]
  • 88.Hobaiter C, Byrne RW. 2011. The gestural repertoire of the wild chimpanzee. Anim. Cogn. 14, 745–767. ( 10.1007/s10071-011-0409-2) [DOI] [PubMed] [Google Scholar]
  • 89.Byrne RW, Cartmill E, Genty E, Graham KE, Hobaiter C, Tanner J. 2017. Great ape gestures: intentional communication with a rich set of innate signals. Anim. Cogn. 20, 755–769. ( 10.1007/s10071-017-1096-4) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 90.Hobaiter C, Byrne RW. 2011. Serial gesturing by wild chimpanzees: its nature and function for communication. Anim. Cogn. 14, 827–838. ( 10.1007/s10071-011-0416-3) [DOI] [PubMed] [Google Scholar]
  • 91.Kersken V, Gómez J-C, Liszkowski U, Soldati A, Hobaiter C. 2018. A gestural repertoire of 1- to 2-year-old human children: in search of the ape gestures. Anim. Cogn. 22, 577–595. ( 10.1007/s10071-018-1213-z) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 92.Liebal K, Call J, Tomasello M. 2004. Use of gesture sequences in chimpanzees. Am. J. Primatol. 64, 377–396. ( 10.1002/ajp.20087) [DOI] [PubMed] [Google Scholar]
  • 93.Genty E, Breuer T, Hobaiter C, Byrne RW. 2009. Gestural communication of the gorilla (Gorilla gorilla): repertoire, intentionality and possible origins. Anim. Cogn. 12, 527–546. ( 10.1007/s10071-009-0213-4) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 94.Cartmill EA, Byrne RW. 2010. Semantics of primate gestures: intentional meanings of orangutan gestures. Anim. Cogn. 13, 793–804. ( 10.1007/s10071-010-0328-7) [DOI] [PubMed] [Google Scholar]
  • 95.Liebal K, Call J, Tomasello M, Pika S. 2004. To move or not to move: how apes adjust to the attentional state of others. Interact. Stud. 5, 199–219. ( 10.1075/is.5.2.03lie) [DOI] [Google Scholar]
  • 96.Gómez JC, Sarriá E, Tamarit J. 1993. The comparative study of early communication and theories of mind: ontogeny, phylogeny, and pathology. Oxford, UK: Oxford University Press. [Google Scholar]
  • 97.Krause MA. 1997. Comparative perspectives on pointing and joint attention in children and apes. Int. J. Comp. Psychol. 10, 137–157. [Google Scholar]
  • 98.Tomasello M, Camaioni L. 1997. A comparison of the gestural communication of apes and human infants. Hum. Dev. 40, 7–24. ( 10.1159/000278540) [DOI] [Google Scholar]
  • 99.Tomasello M. 2007. If they're so good at grammar, then why don't they talk? Hints from apes' and humans' use of gestures. Lang. Learn. Dev. 3, 133–156. ( 10.1080/15475440701225451) [DOI] [Google Scholar]
  • 100.Tomasello M. 2008. Why don't apes point? Trends Linguist. Stud. Monogr. 197, 375. [Google Scholar]
  • 101.Leavens DA, Hopkins WD. 1998. Intentional communication by chimpanzees: a cross-sectional study of the use of referential gestures. Dev. Psychol. 34, 813–822. ( 10.1037/0012-1649.34.5.813) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 102.Bullinger AF, Zimmermann F, Kaminski J, Tomasello M. 2011. Different social motives in the gestural communication of chimpanzees and human children. Dev. Sci. 14, 58–68. ( 10.1111/j.1467-7687.2010.00952.x) [DOI] [PubMed] [Google Scholar]
  • 103.Call J, Tomasello M. 1996. The effect of humans on the cognitive development of apes. In Reaching into thought: the minds of the great apes (eds Russon AE, Bard KA, Taylor Parker S), pp. 371–403. Cambridge, UK: Cambridge University Press. [Google Scholar]
  • 104.Bohn M, Call J, Tomasello M. 2015. Communication about absent entities in great apes and human infants. Cognition 145, 63–72. ( 10.1016/j.cognition.2015.08.009) [DOI] [PubMed] [Google Scholar]
  • 105.Bohn M, Call J, Tomasello M. 2016. The role of past interactions in great apes' communication about absent entities. J. Comp. Psychol. 130, 351–357. ( 10.1037/com0000042) [DOI] [PubMed] [Google Scholar]
  • 106.Lyn H, Russell JL, Leavens DA, Bard KA, Boysen ST, Schaeffer JA, Hopkins WD. 2014. Apes communicate about absent and displaced objects: methodology matters. Anim. Cogn. 17, 85–94. ( 10.1007/s10071-013-0640-0) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 107.Pika S, Liebal K, Call J, Tomasello M. 2005. Gestural communication of apes. Gesture 5, 41–56. ( 10.1075/gest.5.1-2.05pik) [DOI] [Google Scholar]
  • 108.Pika S. 2008. Gestures of apes and pre-linguistic human children: similar or different? First Lang. 28, 116–140. ( 10.1177/0142723707080966) [DOI] [Google Scholar]
  • 109.Tomasello M, Gust D, Frost GT. 1989. A longitudinal investigation of gestural communication in young chimpanzees. Primates 30, 35–50. ( 10.1007/BF02381209) [DOI] [Google Scholar]
  • 110.Call J, Tomasello M. 1994. Production and comprehension of referential pointing by orangutans (Pongo pygmaeus). J. Comp. Psychol. 108, 307–317. ( 10.1037/0735-7036.108.4.307) [DOI] [PubMed] [Google Scholar]
  • 111.Tomasello M, Call J, Nagell K, Olguin R, Carpenter M. 1994. The learning and use of gestural signals by young chimpanzees: a trans-generational study. Primates 35, 137–154. ( 10.1007/BF02382050) [DOI] [Google Scholar]
  • 112.Halina M, Rossano F, Tomasello M. 2013. The ontogenetic ritualization of bonobo gestures. Anim. Cogn. 16, 653–666. ( 10.1007/s10071-013-0601-7) [DOI] [PubMed] [Google Scholar]
  • 113.Gillespie-Lynch K, Greenfield P, Feng Y, Savage-Rumbaugh S, Lyn H. 2013. A cross-species study of gesture and its role in symbolic development: implications for the gestural theory of language evolution. Front. Psychol. 4, 160 ( 10.3389/fpsyg.2013.00160) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 114.Premack D. 1971. Language in chimpanzee. Science 172, 808–822. ( 10.1126/science.172.3985.808) [DOI] [PubMed] [Google Scholar]
  • 115.Gardner RA, Gardner BT, Van Cantfort TE. 1989. Teaching sign language to chimpanzees. Albany, NY: State University of New York Press; ( 10.1002/ajpa.1330840214) [DOI] [Google Scholar]
  • 116.Miles H. 1990. The cognitive foundations for reference in a signing orangutan. Cambridge, UK: Cambridge University Press. [Google Scholar]
  • 117.Patterson FG. 1978. The gestures of a gorilla: language acquisition in another pongid. Brain Lang. 5, 72–97. ( 10.1016/0093-934X(78)90008-1) [DOI] [PubMed] [Google Scholar]
  • 118.Savage-Rumbaugh ES, Murphy J, Sevcik RA, Brakke KE, Williams SL, Rumbaugh DM, Bates E. 1993. Language comprehension in ape and child. Monogr. Soc. Res. Child Dev. 58, i–252. ( 10.2307/1166068) [DOI] [PubMed] [Google Scholar]
  • 119.Savage-Rumbaugh ES. 1986. Ape language: from conditioned response to symbol. New York, NY: Columbia University Press; ( 10.1007/bf02735178) [DOI] [Google Scholar]
  • 120.Lyn H. 2012. Apes and the evolution of language: taking stock of 40 years of research. In Oxford library of psychology: the Oxford handbook of comparative evolutionary psychology (eds Vonk J, Shackelford TK), pp. 356–378. New York, NY: Oxford University Press. [Google Scholar]
  • 121.Savage-Rumbaugh ES, Rumbaugh DM, Smith ST, Lawson J. 1980. Reference: the linguistic essential. Science 210, 922–925. [DOI] [PubMed] [Google Scholar]
  • 122.Greenfield PM, Savage-Rumbaugh ES. 1990. Grammatical combination in Pan paniscus: processes of learning and invention in the evolution and development of language. Cambridge, UK: Cambridge University Press. [Google Scholar]
  • 123.Rivas E. 2005. Recent use of signs by chimpanzees (Pan troglodytes) in interactions with humans. J. Comp. Psychol. 119, 404 ( 10.1037/0735-7036.119.4.404) [DOI] [PubMed] [Google Scholar]
  • 124.Wynne C. 2008. Aping language: a skeptical analysis of the evidence for nonhuman primate language. Skeptic (Altadena, CA) 13, 10–15. [Google Scholar]

Data Availability Statement

This article has no additional data.

