Abstract
Accumulating evidence has suggested the existence of a human action recognition system involving inferior frontal, parietal, and superior temporal regions that may participate in both the perception and execution of actions. However, little is known about the specificity of this system in response to different forms of human action. Here we present data from PET neuroimaging studies of passive viewing of three distinct action types: intransitive self-oriented actions (e.g., stretching, rubbing one’s eyes), transitive object-oriented actions (e.g., opening a door, lifting a cup to the lips to drink), and the abstract, symbolic actions (signs) used in American Sign Language. Our results show that these different classes of human actions engage a frontal/parietal/STS human action recognition system in a highly similar fashion. However, the results indicate that this neural consistency across motion classes holds primarily for hearing subjects. Data from deaf signers show a non-uniform response to different classes of human actions. As expected, deaf signers engaged left-hemisphere perisylvian language areas during the perception of signed language signs. Surprisingly, these subjects did not engage the expected frontal/parietal/STS circuitry during passive viewing of non-linguistic actions, but rather reliably activated middle-occipital temporal-ventral regions, which are known to participate in the detection of human bodies, faces, and movements. Comparisons with data from hearing subjects establish statistically significant contributions of middle-occipital temporal-ventral regions during the processing of non-linguistic actions in deaf signers. These results suggest that during human motion processing, deaf individuals may engage specialized neural systems that allow for rapid, online differentiation of meaningful linguistic actions from non-linguistic human movements.
Keywords: Human action perception, mirror neuron, American Sign Language (ASL)
Introduction
Interest in characterizing the neural systems and mechanisms involved in the perception of human actions has been fueled, in part, by recent studies of the macaque monkey. These papers report a unique neurophysiological response of a selective set of mirror neurons: cells which appear to couple the execution of goal-directed actions with the perception of similar goal-directed actions in another. In the original studies, a small population of these neurons was found to reside in a ventral premotor region (F5) (Gallese et al., 1996; Rizzolatti et al., 1996). Subsequent research has found neurons with mirror properties in area 7b (area PF of von Economo, 1929) of parietal cortex (Fogassi et al., 1998; Gallese et al., 2002). This F5-7b circuit in macaque, often referred to as the mirror-neuron circuit, is speculated to be part of a larger mirror neuron system (Rizzolatti & Craighero, 2004) forming the biological basis for understanding a wide range of human actions, including such complex behavioral constructs as imitation, social intent, empathy, and even human language (e.g., Rizzolatti & Arbib, 1998; Iacoboni et al., 1999; Rizzolatti et al., 2001; Ferrari et al., 2003; Rizzolatti & Craighero, 2004).
Data from functional neuroimaging have been used to argue for a human homologue of the mirror-neuron system. A meta-analysis of PET data investigating the observation and imitation of hand actions (Grezes & Decety, 2001) identified a largely bilateral network that contributes to these action/perception pairings. This network includes the superior temporal sulcus, the intraparietal sulcus, the inferior parietal lobule, and the premotor cortex. Functional MRI studies have further localized specialized cortical regions with properties that emulate those of mirror-like neurons. For example, Grezes et al. (2003) reported significant co-activation for executed and observed grasping in bilateral intraparietal sulcus, dorsal premotor cortex, superior temporal sulcus, and right parietal operculum (SII). In addition, activity was reported in the ventral limb of the left precentral sulcus (BA 6) with extension to the pars opercularis (BA 44) of the inferior frontal gyrus.
While a great deal of attention has been paid to identifying the anatomical loci that form the individual units of the mirror network, the functional mechanisms of this system are not as well understood. The sheer abundance of highly abstract functions now attributed to or associated with a frontal-parietal mirror-neuron system is impressive. Though theoretically parsimonious, the notion that a single cortical network mediates this wealth of behaviors, including language, is pragmatically challenging. For example, the notion of resonance is often used in descriptions of mirror responses in the nervous system, yet how this term relates to the neural regions involved in action/perception pairings is largely unspecified. Other researchers have invoked forward and inverse models of sensorimotor control and perception as a possible theoretical construct for understanding mirror systems (Miall, 2003; Carr et al., 2003; Iacoboni & Zaidel, 2003). In this scheme, an inverse model describes the involvement of the STS, PF, and F5 in the perception of action, while a forward model linking F5 to PF to STS is used to generate predictions of movement outcome during imitated actions (Miall, 2003; Carr et al., 2003; Iacoboni, 2005).
Relevant to the current aims, it remains unclear whether a human mirror neuron system is equally reactive to all forms of human action. An early study by Grezes, Costes, and Decety (1999) showed convincingly that the engagement of the human action recognition system may be modified both by the content of the gestures observed (i.e., whether an action is known or unknown to the viewer) and by the intention of the subject while viewing the action (i.e., watching an action with or without the goal of subsequent imitation). However, little is known about the specific correlates of different action types. Consider the following situations: An observer watches a person bring her hand to her mouth to act as a megaphone while shouting out the name of a child. In another instance, this same person brings her hand to her mouth to guard a sneeze. In yet another instance, the person has raised her hand, holding an ice-cream cone, to her mouth. In a fourth instance, the hand is brought into contact with the chin in a conventionalized manner: the arbitrary combined configurations of hand, mouth, and motion type signaling the concept “mother” in American Sign Language (ASL). Does a human action/mirror system become equivalently engaged by each of these instances of distinct action classes?
In the present study, we sought to determine whether the focus and extent of neural activity during passive viewing of human actions is modulated as a function of the type of human action observed and the experience of the viewer. We examined the perception of three classes of actions (self-oriented, object-oriented, and communicative) in two groups of subjects (hearing individuals unfamiliar with signed language and deaf users of signed language). The three classes of action were chosen to reflect increasing degrees of meaningfulness. Self-oriented actions, such as scratching one’s head or rubbing one’s eyes, are highly frequent and may not trigger conceptual elaboration. Impressionistically, in everyday interactions one tends to “look past” these gestures, perhaps because they are largely irrelevant for the viewer. Object-oriented actions (throwing a ball, folding a shirt, etc.) may be considered goal-directed actions that have clear, highly-specific, and predictable functional consequences. Finally, gestures used in manual signing systems of the deaf, such as American Sign Language (ASL), are clearly communicative in nature even for individuals who are not users of signed languages.
For deaf individuals who use a visual-manual language as their primary form of communication, the successful perception of each of these types of human motion is especially vital. Not only must signers be attuned to the usual plethora of non-linguistic actions produced by those around them, they must also be able to quickly detect the presence of linguistic movements produced by fellow signers. This entails being able to parse sign language motions from other kinds of human movements that co-occur in the visual environment and map these sign language actions to linguistic movement patterns stored in memory.
A growing literature reports that the perception of signed and spoken languages engages left hemisphere perisylvian, inferior frontal, and posterior temporal-parietal regions. In addition, some studies have shown a greater role for right hemisphere regions in sign comprehension (Neville et al., 1998; Bavelier et al., 1998; Newman et al., 2002; for recent reviews, see Corina, in press; Corina & Knapp, in press). While a broad neural specialization for sign language processing has been documented, it is largely unknown how the parsing and mapping of these linguistic movements occurs, and critically, how this may differ from the processing of other forms of human actions. It is also unknown whether deaf subjects’ experience with visual-manual linguistic actions alters the processing of non-linguistic human actions.
In the present study, we sought to compare the engagement of neural systems during passive viewing of the classes of human actions described above. We consider the similarities and differences between two groups of perceivers: deaf native signers and sign-naive hearing subjects. Using PET to examine the neural responses of these two groups during the perception of distinct classes of human action stimuli can help further determine the specificity and malleability of a hypothesized human mirror neuron system.
Results
Hearing non-signers
Perception of self-oriented actions
Regional cerebral blood flow (rCBF) increases during perception of self-oriented actions compared to baseline were located in the left inferior frontal gyrus (BA 46), middle frontal gyrus (BA 9), precentral gyrus (BA 6), postcentral gyrus (BA 7), cerebellum, right superior frontal gyrus (BA 11), cingulate gyrus (BA 29/31/32), middle occipital gyrus (BA 19), caudate, and bilateral inferior parietal lobule (BA 40), inferior occipital gyrus (BA 18), and insula (BA 13) (Figure 1a, Table 1a, left panel).
Figure 1.
Hearing Subjects: (a) Self-oriented Actions > Baseline, (b) Object-oriented Actions > Baseline, (c) ASL > Baseline.
Table 1.
| | Hearing | | | | | | | | | | Deaf | | | | | | | | | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | Left Hemisphere | | | | | Right Hemisphere | | | | | Left Hemisphere | | | | | Right Hemisphere | | | | |
| Region | Area | x | y | z | Z score | Area | x | y | z | Z score | Area | x | y | z | Z score | Area | x | y | z | Z score |
| Table 1a. Self-oriented Actions > Baseline. | ||||||||||||||||||||
| Frontal | ||||||||||||||||||||
| Superior frontal gyrus | 11 | 24 | 58 | −22 | 3.81 | |||||||||||||||
| 11 | 18 | 50 | −26 | 3.70 | ||||||||||||||||
| Medial frontal gyrus | 6 | 10 | −24 | 66 | 3.91 | |||||||||||||||
| Middle frontal gyrus | 9 | −54 | 20 | 26 | 3.62 | 46 | 42 | 14 | 22 | 3.34 | ||||||||||
| 6 | −34 | −2 | 52 | 3.39 | ||||||||||||||||
| 46 | −47 | 28 | 18 | 3.52 | ||||||||||||||||
| Inferior frontal gyrus | 46 | −46 | 48 | 4 | 3.30 | |||||||||||||||
| Precentral gyrus | 6 | −62 | 0 | 32 | 3.55 | 6 | −30 | −16 | 68 | 3.57 | ||||||||||
| 6 | −58 | 4 | 38 | 3.27 | ||||||||||||||||
| Cingulate | 31 | 16 | −20 | 40 | 3.94 | |||||||||||||||
| 32 | 18 | 34 | 18 | 3.49 | ||||||||||||||||
| Temporal | ||||||||||||||||||||
| Insula | 13 | −38 | 8 | 14 | 4.62 | 13 | 36 | 8 | 18 | 3.66 | ||||||||||
| 13 | −40 | 12 | 4 | 3.79 | ||||||||||||||||
| Middle temporal gyrus | 21 | 38 | −4 | −26 | 3.61 | |||||||||||||||
| 21 | 56 | 4 | −30 | 3.65 | ||||||||||||||||
| Hippocampus | 27 | −40 | 2 | 3.73 | ||||||||||||||||
| Inferior temporal gyrus | 20 | −30 | 0 | −40 | 3.29 | |||||||||||||||
| Parahippocampal gyrus | 19 | −30 | −50 | −5 | 3.50 | 28 | 24 | −8 | −26 | 4.27 | ||||||||||
| Posterior cingulate | 29 | −2 | −44 | 2 | 3.35 | |||||||||||||||
| Fusiform gyrus | 19 | 52 | −70 | −14 | 3.96 | |||||||||||||||
| 37 | 42 | −46 | −18 | 3.38 | ||||||||||||||||
| Parietal | ||||||||||||||||||||
| Postcentral gyrus | 7 | −6 | −54 | 68 | 3.93 | 2 | −34 | −36 | 68 | 6.64 | 2 | 24 | −36 | 68 | 3.57 | |||||
| 2 | −58 | −24 | 46 | 3.50 | ||||||||||||||||
| Superior parietal | 7 | −36 | −46 | 58 | 3.35 | 39 | 42 | 14 | 22 | 3.34 | ||||||||||
| Inferior parietal | 40 | −45 | −54 | 52 | 3.87 | 40 | 52 | −58 | 48 | 3.78 | ||||||||||
| Subcortical | ||||||||||||||||||||
| Caudate | 24 | −34 | 12 | 4.33 | ||||||||||||||||
| Occipital | ||||||||||||||||||||
| Inferior occipital gyrus | 18 | −32 | −84 | −10 | 3.43 | 18 | 34 | −94 | −12 | 3.36 | 18 | −16 | −90 | −15 | 3.75 | 17 | 30 | −96 | −10 | 3.56 |
| 18 | −22 | −94 | −12 | 3.69 | 18 | 42 | −90 | −10 | 3.45 | |||||||||||
| Middle occipital gyrus | 19 | 52 | −70 | −15 | 3.72 | 18 | 46 | −78 | −8 | 4.16 |
| Cerebellum | −28 | −50 | −42 | 3.32 | ||||||||||||||||
| Table 1b. Object-oriented Actions > Baseline. | ||||||||||||||||||||
| Frontal | ||||||||||||||||||||
| Superior frontal gyrus | 11 | 18 | 48 | −26 | 4.31 | |||||||||||||||
| 11 | 24 | 58 | −22 | 3.71 | ||||||||||||||||
| Medial frontal gyrus | 6 | 10 | −24 | 66 | 4.81 | |||||||||||||||
| Middle frontal gyrus | 9 | −54 | 20 | 28 | 3.49 | 9 | 18 | 34 | 18 | 3.53 | 47 | 58 | 36 | −8 | 3.32 | |||||
| 10 | −46 | 48 | 4 | 3.43 | ||||||||||||||||
| 46 | −46 | 28 | 20 | 3.37 | ||||||||||||||||
| Inferior frontal gyrus | 47 | −32 | 20 | −8 | 3.56 | |||||||||||||||
| Precentral gyrus | 6 | −62 | 0 | 32 | 3.37 | 4 | −28 | −22 | 68 | 3.36 | ||||||||||
| 6 | −48 | −6 | 54 | 3.61 | ||||||||||||||||
| Cingulate | 31 | 20 | −22 | 38 | 3.87 | |||||||||||||||
| Temporal | ||||||||||||||||||||
| Middle temporal gyrus | 21 | 54 | 2 | −30 | 3.95 | |||||||||||||||
| Parahippocampal gyrus | 27 | −22 | −33 | 0 | 3.62 | 30 | 25 | −50 | 6 | 3.60 | 30 | −26 | −52 | 2 | 3.66 | 28 | 24 | −8 | −26 | 4.26 |
| Insula | 13 | −38 | 8 | 14 | 4.60 | 37 | 62 | −64 | 4 | 3.58 | ||||||||||
| Parietal | ||||||||||||||||||||
| Postcentral gyrus | 7 | −4 | −56 | 70 | 4.42 | 2 | −38 | −38 | 63 | 3.76 | 2 | 54 | −18 | 30 | 3.38 | |||||
| 5 | −37 | −42 | 62 | 3.50 | ||||||||||||||||
| Superior parietal | 7 | 34 | −68 | 50 | 3.66 | 7 | −35 | −46 | 58 | 3.64 | ||||||||||
| Inferior parietal | 40 | −44 | −51 | 58 | 3.58 | 40 | 50 | −55 | 52 | 3.56 | ||||||||||
| Occipital | ||||||||||||||||||||
| Middle occipital gyrus | 19 | 48 | −76 | −6 | 4.32 | |||||||||||||||
| 19 | 52 | −72 | −10 | 4.01 | ||||||||||||||||
| Subcortical | ||||||||||||||||||||
| Caudate | 26 | −36 | 12 | 4.34 | ||||||||||||||||
| Cerebellum | −28 | −44 | −34 | 3.30 | −18 | −90 | −22 | 3.54 | ||||||||||||
| Table 1c. ASL > Baseline. | ||||||||||||||||||||
| Frontal | ||||||||||||||||||||
| Superior frontal gyrus | 10 | 36 | 60 | 14 | 3.28 | |||||||||||||||
| 11 | 18 | 48 | −24 | 4.25 | ||||||||||||||||
| 11 | 24 | 58 | −22 | 3.57 | ||||||||||||||||
| Middle frontal gyrus | 6 | −34 | −2 | 52 | 3.47 | 11 | −24 | 48 | −12 | 4.69 | 6 | 10 | −24 | 66 | 3.81 | |||||
| 46 | 42 | 14 | 22 | 3.55 | ||||||||||||||||
| Inferior frontal gyrus | 46 | −46 | 48 | 4 | 3.67 | 9 | −34 | 6 | 26 | 3.81 | ||||||||||
| Precentral gyrus | 6 | −48 | −6 | 54 | 3.41 | 6 | 44 | 2 | 38 | 3.34 | 6 | −20 | −16 | 68 | 3.60 | |||||
| 6 | −42 | 2 | 60 | 3.48 | ||||||||||||||||
| 4 | −60 | −18 | 38 | 3.21 | ||||||||||||||||
| Cingulate | 29 | −10 | −40 | 10 | 3.25 | 32 | 16 | 36 | 14 | 3.80 | ||||||||||
| Temporal | ||||||||||||||||||||
| Superior temporal gyrus | 38 | −36 | 18 | −32 | 3.63 | 41 | 46 | −28 | 12 | 3.89 | ||||||||||
| Middle temporal gyrus | 21 | 56 | 4 | −30 | 3.27 | |||||||||||||||
| Parahippocampal gyrus | 30 | −18 | −36 | 6 | 4.03 | 30 | 18 | −10 | −12 | 3.43 | ||||||||||
| Insula | 13 | −38 | 8 | 14 | 4.21 | 13 | −40 | 22 | 2 | 3.27 | ||||||||||
| Uncus | 28 | 24 | −8 | −26 | 4.37 | |||||||||||||||
| Hippocampus | 30 | −42 | 6 | 3.40 | ||||||||||||||||
| Parietal | ||||||||||||||||||||
| Postcentral gyrus | 7 | −4 | −50 | 68 | 4.09 | 7 | 16 | −78 | 46 | 3.38 | 5 | −36 | −40 | 60 | 3.52 | 5 | 22 | −38 | 62 | 3.62 |
| Inferior parietal | 40 | −48 | −46 | 36 | 3.49 | 40 | 46 | −56 | 52 | 3.75 | 39 | −52 | −66 | 46 | 3.29 | 40 | 50 | −26 | 26 | 3.52 |
| Subcortical | ||||||||||||||||||||
| Caudate | 26 | −38 | 12 | 4.35 | ||||||||||||||||
| Thalamus | 24 | −14 | 22 | 4.06 | ||||||||||||||||
| Cerebellum | −28 | −48 | −42 | 3.37 | 26 | −56 | −44 | 3.36 | ||||||||||||
Perception of object-oriented actions
When the perception of object-oriented actions is compared to baseline, increased activations were observed in the left inferior frontal gyrus (BA 47), middle frontal gyrus (BA 9/46), precentral gyrus (BA 6), postcentral gyrus (BA 7), insula (BA 13), cerebellum, right superior frontal gyrus (BA 11), middle frontal gyrus (BA 9), superior parietal lobule (BA 7), cingulate gyrus (BA 31), caudate, and bilateral inferior parietal lobule (BA 40) and parahippocampal gyrus (BA 27) (Figure 1b, Table 1b, left panel).
Perception of ASL in hearing non-signers
The perception of ASL signs compared to baseline was associated with increased activations in the left inferior frontal gyrus (BA 46), middle frontal gyrus bilaterally (BA 6), left insula (BA 13), and cerebellum, the right superior frontal gyrus (BA 10/11), hippocampus, caudate, and bilateral cingulate gyrus (BA 32), postcentral gyrus (BA 7), and inferior parietal lobule (BA 40) (Figure 1c, Table 1c, left panel).
Common activations
The most striking feature of these data is the tremendous commonality of neuro-anatomical recruitment across all human action categories tested: self-oriented, object-oriented, and linguistic-communicative. A conjunction procedure (Nichols et al., 2004) was used to identify statistically significant regions common to all three conditions (Figure 2, Table 2, left panel). Common activation areas included left frontal regions (the inferior frontal gyrus, BA 46, and middle frontal gyrus, BA 6); the right superior frontal lobe (BA 11), superior parietal cortex (BA 7), and cingulate; and bilateral insula (BA 13) and inferior parietal cortex.
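To make the conjunction logic concrete, the following minimal sketch illustrates a minimum-statistic conjunction over voxelwise z-maps, in the spirit of Nichols et al. (2004). It assumes flattened, equally sized z-arrays for each condition-versus-baseline contrast; the function and variable names are hypothetical and do not reproduce the actual analysis pipeline used here.

```python
import numpy as np

def minimum_statistic_conjunction(z_maps, z_thresh=3.09):
    """Minimum-statistic conjunction: a voxel counts as commonly
    active only if it exceeds threshold in every condition map,
    i.e., the minimum z across maps is tested against threshold."""
    z_stack = np.stack(z_maps)        # shape: (n_conditions, n_voxels)
    min_z = z_stack.min(axis=0)       # worst case across conditions
    return min_z, min_z > z_thresh

# Hypothetical usage with three condition-vs-baseline z-maps:
# min_z, common_mask = minimum_statistic_conjunction([z_self, z_object, z_asl])
```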
Figure 2.

Common Activations of Self-oriented Actions, Object-oriented Actions, and ASL: Hearing Subjects.
Table 2.
Common Activations: Self-oriented Actions, Object-oriented Actions, and ASL.
| | Hearing | | | | | | | | | | Deaf | | | | | | | | | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | Left Hemisphere | | | | | Right Hemisphere | | | | | Left Hemisphere | | | | | Right Hemisphere | | | | |
| Region | Area | x | y | z | Z score | Area | x | y | z | Z score | Area | x | y | z | Z score | Area | x | y | z | Z score |
| Frontal | ||||||||||||||||||||
| Superior frontal gyrus | 11 | 18 | 50 | −26 | 3.7 | |||||||||||||||
| Middle frontal gyrus | 6 | −34 | −2 | 52 | 3.39 | |||||||||||||||
| Medial frontal gyrus | 6 | −26 | −16 | 72 | 3.33 | 6 | 12 | −24 | 66 | 3.30 | ||||||||||
| Cingulate | 32 | 18 | 34 | 18 | 3.56 | |||||||||||||||
| Inferior frontal gyrus | 46 | −46 | 48 | 4 | 3.30 | |||||||||||||||
| Temporal | ||||||||||||||||||||
| Insula | 13 | −38 | 8 | 14 | 4.21 | 13 | 26 | −38 | 12 | 4.31 | ||||||||||
| Parahippocampal gyrus | 28 | 28 | −8 | 26 | 4.26 | |||||||||||||||
| Middle temporal | 21 | 56 | 4 | −30 | 3.27 | |||||||||||||||
| Parietal | ||||||||||||||||||||
| Superior parietal | 7 | −6 | −54 | 74 | 3.91 | |||||||||||||||
| Inferior parietal | 40 | −48 | −46 | 36 | 3.13 | 40 | 52 | −58 | 52 | 3.56 | ||||||||||
| Postcentral gyrus | 5 | −36 | −40 | 66 | ||||||||||||||||
Regions unique to self-oriented grooming actions and object-oriented actions
Direct contrasts between the self-oriented and object-oriented action conditions were used to evaluate the selective neural systems associated with each condition. Regions showing a statistically greater response to the observation of self-oriented actions (Table 3) relative to object-oriented actions were mainly in visual association cortex. These areas included bilateral posterior aspects of the fusiform gyrus and middle-occipital cortex (−52, −72, −6; 50, −76, −16). Regions showing statistically greater activity in response to transitive action processing (Table 4) included the left anterior cingulate, right lingual gyrus, and bilateral frontal-orbital cortex (BA 11).
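Schematically, each such direct contrast amounts to a voxelwise paired comparison of the two conditions' normalized rCBF images. The sketch below is illustrative only: the array names and shapes are assumptions, and it stands in for, rather than reproduces, the statistical procedure actually used.

```python
import numpy as np
from scipy import stats

def paired_condition_contrast(rcbf_a, rcbf_b):
    """Voxelwise paired contrast (condition A > condition B).

    rcbf_a, rcbf_b: (n_subjects, n_voxels) arrays holding each
    subject's mean normalized rCBF per condition. Returns a z-map
    in which positive values favor condition A."""
    t, _ = stats.ttest_rel(rcbf_a, rcbf_b, axis=0)
    df = rcbf_a.shape[0] - 1
    # Convert t to z via matched upper-tail probabilities
    return stats.norm.isf(stats.t.sf(t, df))
```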
Table 3.
Hearing Subjects. Self-oriented Actions > Object-oriented Actions.
| | Left Hemisphere | | | | | Right Hemisphere | | | | |
|---|---|---|---|---|---|---|---|---|---|---|
| Region | Area | x | y | z | Z score | Area | x | y | z | Z score |
| Temporal | ||||||||||
| Insula | 38 | 16 | 24 | 3.48 | ||||||
| Parietal | ||||||||||
| Precuneus | 18 | 30 | −102 | −2 | 4.34 | |||||
| Occipital | ||||||||||
| Middle occipital gyrus | 18 | −52 | −72 | −6 | 3.61 | 19 | 50 | −76 | −16 | 4.88 |
| Inferior occipital gyrus | 18 | −24 | −94 | −14 | 4.38 | |||||
Table 4.
Hearing Subjects. Object-oriented Actions > Self-oriented Actions.
| | Left Hemisphere | | | | | Right Hemisphere | | | | |
|---|---|---|---|---|---|---|---|---|---|---|
| Region | Area | x | y | z | Z score | Area | x | y | z | Z score |
| Frontal | ||||||||||
| Orbital gyrus | 11 | −2 | 36 | −28 | 3.70 | 11 | 14 | 44 | −22 | 3.82 |
| Cingulate | 32 | −14 | 48 | −6 | 4.53 | |||||
| 24 | −8 | 30 | 6 | 3.41 | ||||||
| Parietal | ||||||||||
| Precuneus | 7 | 8 | −52 | 46 | 3.88 | |||||
| Occipital | ||||||||||
| Lingual gyrus | 18 | 2 | −72 | 0 | 4.98 | |||||
| Cerebellum/Fusiform | −12 | −46 | −14 | 3.36 | 19 | 28 | −58 | 2 | 3.90 | |
Summary
Our primary finding from the data of normally-hearing, sign-naïve persons is that neural activity in response to passively viewing human action was remarkably consistent across all three (very different) classes of human motion. Moreover, these primary foci included regions previously identified as critical to a human action recognition system: most notably, superior parietal (BA 40/7), ventral premotor (BA 6), and inferior frontal regions (BA 46).
Parietal
An inferior frontal/superior parietal network factors significantly in emerging models of action-perception mirror systems (Miall, 2003; Carr et al., 2003; Iacoboni, 2003). In the human, both the anterior intraparietal sulcus and the supramarginal gyrus have been considered to be possible homologues to macaque 7b/AIP/PF (Grezes et al., 2003). These areas are closely associated with the ventral premotor cortex (vPMC) both anatomically and functionally, especially with regard to sensori-motor mappings (Decety et al., 1997). In our data, parietal activation was typically bilateral and showed both superior (BA 7) and inferior extent (BA 40).
Frontal
Activations within inferior premotor and inferior frontal regions have been a hallmark of human action processing studies. Numerous researchers have suggested homologies between ventral premotor region (F5) identified in macaques and inferior frontal gyrus in humans, including Broca’s area (Schubotz & von Cramon, 2003; Rizzolatti & Craighero, 2004; Buccino et al., 2004; Aziz-Zadeh et al., 2006). In our data, consistent activation of ventral premotor region (BA 6) and inferior frontal region (BA 46) was observed. Also present was consistent and robust activation of the right medial frontal orbital region (BA 11). Frontal orbital regions have been implicated in social cognition and evaluation of reward properties in social interaction and in response to emotion perspective taking (Moll et al., 2003; Rolls, 2004; Hynes et al., 2006).
Temporal
More controversial is the role of superior temporal regions in the mirror system (Puce & Perrett, 2003). In our data, superior temporal activation was not consistently detected; however, insula activation was robust.
While common activations were present across conditions, some differences were apparent. For example, in the direct comparison between self-oriented actions and object-oriented actions, prominent activity was noted in visual association regions bilaterally, encompassing the right fusiform and left middle occipital gyri. The right fusiform gyrus is a region in which Decety et al. (1997) reported greater activation in response to meaningless sequences of hand and arm actions compared to meaningful actions. This is interesting when one considers that viewing another person’s self-oriented grooming actions is considerably less meaningful than viewing canonical actions performed with common, well-known objects. A second region that is more active in response to viewing self-oriented versus object-oriented actions is a visual area in the left middle occipital lobe that is proximal to the extrastriate body area (EBA) described by Downing et al. (2001).
The primarily lateral posterior occipital activations found for viewing self-oriented actions stand in contrast to the visual regions more active in response to object-oriented actions, where medial posterior occipital activations are prominent. Interestingly, similar lingual gyrus activations have been seen in cases in which subjects were required to make inferences about other human participants, including their personality traits (Kjaer et al., 2002) and the motives governing unfair monetary offers from a human partner (Sanfey et al., 2003). These data suggest that medial visual areas are engaged when people speculate about the attributes or actions of others. Coupled with activation of visual regions involved in the discernment of socially relevant cues was activation in bilateral medial frontal-orbital cortex (BA 11). We note that some stimuli included in this condition contained valenced actions: for example, a vignette in which the model sneaks off with a wallet and another in which the model hides a gun. The engagement of these medial posterior occipital and frontal orbital regions is consistent with a more cognitive-evaluative assessment of these human actions.
Finally, activations in the inferior temporal visual regions with spatial extension into the cerebellum may reflect fusiform activation similar to that reported in Beauchamp et al. (2003) in a study comparing the perception of tool movement versus human movement. In the present study, we are contrasting the perception of humans with objects to humans without objects and thus in a similar fashion are examining neural correlates of object movement.
Taken together, these data suggest a common neural system responsive to human action processing that involves a bilateral superior parietal (BA 40/7), left ventral premotor (BA 6), and inferior frontal (BA 46) circuit. In addition, differences between self-oriented and object-oriented actions suggest higher-level visual-object and social-evaluative encoding of the object-oriented transitive actions, which engaged medial visual and frontal systems. This stands in contrast to the more prominent lateral posterior occipital activation for self-oriented actions, which we speculate may be an index of attention to body form and movement. We consider the hearing subjects’ responses to the communicative ASL gestures in the next section.
Deaf signers
Perception of self-oriented grooming actions
Regional cerebral blood flow (rCBF) increases during perception of self-oriented actions compared to baseline were associated with activations in the left hemisphere precentral gyrus (BA 6), inferior temporal gyrus (BA 20), and superior parietal lobule. Right hemisphere activation included medial (BA 6) and lateral middle frontal gyrus (BA 46) and the middle temporal lobe (BA 21) with extension to temporal-occipital regions (BA 19/37). Bilateral activation of inferior occipital (BA 17/18), parahippocampal gyrus, superior parietal regions, and postcentral sulcus (BA 2) was noted (see Figure 3a, Table 1a, right panel).
Figure 3.
Deaf Subjects: (a) Self-oriented Actions > Baseline, (b) Object-oriented Actions > Baseline, (c) ASL > Baseline.
Perception of transitive actions
When the perception of object-oriented actions is compared to baseline (Figure 3b, Table 1b, right panel), increased rCBF was observed in the left hemisphere precentral gyrus (BA 4), superior parietal lobule (BA 7/5), and cerebellum. Right hemisphere activation included medial (BA 6) and lateral middle frontal gyrus (BA 47) and the middle temporal lobe (BA 21) with extension to temporal-occipital regions (BA 19/37). Bilateral activation of the parahippocampal gyrus and postcentral sulcus (BA 2) was noted.
Perception of ASL in deaf signers
The rCBF changes in response to the perception of ASL compared to baseline were associated with activations in the left middle frontal gyrus (BA 11), inferior frontal gyrus (BA 9), precentral gyrus (BA 6/4), insula (BA 13), and superior temporal gyrus (BA 38). The right hemisphere was activated in the middle frontal gyrus (BA 6/46), superior and middle temporal gyri (BA 41/21), uncus (BA 28), thalamus, and cerebellum (Figure 3c, Table 1c, right panel). Bilateral activation included the postcentral gyrus (BA 5), inferior parietal regions (BA 39/40), and parahippocampal gyrus (BA 30).
Common activations
Only a small set of regions was found to be common to all three conditions in the deaf signers (Figure 4, Table 2, right panel). These regions included the right parahippocampus and middle temporal gyrus (BA 21), the medial frontal gyrus bilaterally, and the left postcentral gyrus (BA 5).
Figure 4.

Common Activations of Self-oriented Actions, Object-oriented Actions, and ASL: Deaf Subjects.
Summary
Our primary finding is that, in contrast to hearing subjects, who exhibited extensive overlap in the neural regions encoding these three different classes of movement, deaf subjects’ neural responses to human actions (relative to baseline) showed greater sensitivity (i.e., less uniformity) to the class of action being viewed. While the neural responses to self-oriented and object-oriented actions showed a fair degree of similarity to one another (notably, activation within right anterior middle temporal gyrus, right middle occipital visual association regions, and superior pre- and post-central sulcus bilaterally), the neural responses to ASL, not unexpectedly, appear different. In the ASL condition, activation is found in the left pre- and post-central sulcus and right temporal gyrus; conspicuously lacking is activation in posterior visual association areas. Further differences in the ASL data are observed in the frontal lobe, including activations in inferior frontal regions and lateral premotor region (BA 6) bilaterally.
The variability in these comparisons may reflect the relatively small number of subjects who contributed scans to these contrasts (n=6). To provide additional confirmation of these findings, we consider patterns of activation derived from direct comparisons of the main conditions of interest (ASL, Object-oriented Actions, and Self-oriented Actions), which permit inclusion of all deaf subjects’ scans (n=10). Our findings of condition variability are statistically confirmed in the individual condition contrasts presented below.
Regions unique to ASL processing relative to viewing of self-oriented and object-oriented actions
To highlight the neural regions in the deaf that are sensitive to properties of ASL, we contrast data from ASL with the combined responses to self-oriented and object-oriented actions (Figure 5, shown in red). These contrasts provide an evaluation of linguistic versus non-linguistic human action processing in the deaf signers (Figure 5, Table 5). We also consider these same contrasts in the hearing subjects (Figure 6, Table 6).
Figure 5.

Deaf Subjects. ASL > Non-linguistic Gestures (red), Non-linguistic Gestures > ASL (green).
Table 5.
Deaf Subjects.
| | ASL > Non-linguistic Actions | | | | | | | | | | Non-linguistic Actions > ASL | | | | | | | | | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | Left Hemisphere | | | | | Right Hemisphere | | | | | Left Hemisphere | | | | | Right Hemisphere | | | | |
| Region | Area | x | y | z | Z score | Area | x | y | z | Z score | Area | x | y | z | Z score | Area | x | y | z | Z score |
| Frontal | ||||||||||||||||||||
| Superior frontal gyrus | 10 | −2 | 64 | 30 | 3.9 | 10 | 20 | 66 | 14 | 3.38 | ||||||||||
| Middle Frontal Gyrus | 46 | 54 | 42 | 14 | 3.22 | |||||||||||||||
| Medial Frontal Gyrus | 6 | 12 | −14 | 52 | 3.27 | |||||||||||||||
| Inferior frontal gyrus | 46 | −48 | 28 | 16 | 3.33 | |||||||||||||||
| 9 | −52 | 22 | 22 | 3.11 | ||||||||||||||||
| Temporal | ||||||||||||||||||||
| Superior temporal gyrus | 41 | 50 | −28 | 14 | 3.47 | 38 | 34 | 16 | −40 | 4.01 | ||||||||||
| Middle temporal gyrus | 21 | 54 | 10 | −38 | 3.79 | |||||||||||||||
| Inferior temporal gyrus | 20 | −30 | −4 | −44 | 4.89 | 20 | 38 | −14 | −42 | 3.29 | ||||||||||
| Parahippocampal gyrus | 37 | −16 | −16 | −14 | 3.35 | |||||||||||||||
| Insula | 13 | −44 | 14 | 0 | 3.24 | |||||||||||||||
| Parietal | ||||||||||||||||||||
| Postcentral gyrus | 5 | 26 | −46 | 68 | 3.71 | |||||||||||||||
| Precuneus | 7 | 10 | −60 | 48 | 3.41 | |||||||||||||||
| Cerebellum | −22 | −52 | −36 | 4.41 | 12 | −52 | −30 | 4.15 | ||||||||||||
| Occipital | ||||||||||||||||||||
| Middle occipital gyrus | 19 | −52 | −76 | 0 | 3.17 | 19 | 50 | −76 | −10 | 5.17 | ||||||||||
| 19 | 60 | −64 | 16 | 3.45 | ||||||||||||||||
| 18 | 26 | −88 | 14 | 4.46 | ||||||||||||||||
Figure 6.

Hearing Subjects. ASL > Non-linguistic Gestures (red), Non-linguistic Gestures > ASL (green).
Table 6.
Hearing Subjects.
| | ASL > Non-linguistic Actions | | | | | | | | | | Non-linguistic Actions > ASL | | | | | | | | | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | Left Hemisphere | | | | | Right Hemisphere | | | | | Left Hemisphere | | | | | Right Hemisphere | | | | |
| Region | Area | x | y | z | Z score | Area | x | y | z | Z score | Area | x | y | z | Z score | Area | x | y | z | Z score |
| Frontal | ||||||||||||||||||||
| Middle Frontal Gyrus | 6 | 36 | 0 | 64 | 3.34 | |||||||||||||||
| Medial Frontal Gyrus | 10 | −16 | 66 | 4 | 3.31 | 6 | 8 | −24 | 50 | 3.71 | ||||||||||
| Precentral gyrus | 6 | 36 | −16 | 68 | 3.44 | |||||||||||||||
| Cingulate | 32 | 8 | 36 | 8 | 3.28 | 24 | −18 | 8 | 28 | 3.4 | ||||||||||
| Temporal | ||||||||||||||||||||
| Middle temporal gyrus | 39 | −54 | −70 | 10 | 3.42 | |||||||||||||||
| Parahippocampal gyrus | 30 | −8 | −48 | 4 | 3.47 | |||||||||||||||
| Uncus | 36 | 26 | −2 | −32 | 3.72 | |||||||||||||||
| Parietal | ||||||||||||||||||||
| Postcentral gyrus | 5 | 26 | −46 | 66 | 3.24 | |||||||||||||||
| Precuneus | 7 | −6 | −76 | 56 | 3.65 | |||||||||||||||
| Occipital | ||||||||||||||||||||
| Middle occipital gyrus | 18 | 32 | −88 | 4 | 5.04 | |||||||||||||||
| Inferior occipital gyrus | 18 | 44 | −80 | −8 | 4.74 | |||||||||||||||
| Lingual gyrus | 18 | 6 | −74 | −20 | 3.5 | |||||||||||||||
These direct contrasts reveal that sign language perception engenders neural activity in classic frontal and posterior superior temporal language areas, including left inferior frontal (BA 46/9) and superior temporal (BA 41) regions and the insula (BA 13). Right hemisphere activation included the medial frontal gyrus (BA 6) and superior parietal lobe (BA 5/7), with bilateral activation in the cerebellum. The participation of left-hemisphere perisylvian and inferior frontal cortical regions in the perception of signed languages of the deaf has been confirmed in several neuroimaging studies (for recent reviews, see Corina & McBurney, 2001; Emmorey, 2002; Corina & Knapp, in press). The reliance of signed language processing on left hemisphere regions is further underscored by the unequivocal evidence for sign language aphasia following left hemisphere lesions (Poizner et al., 1998; Hickok et al., 1997). It is also noteworthy that activations within the left hemisphere auditory association area (BA 41) were observed in this population. These data are consistent with observations that, despite auditory deprivation, deaf signers show activity within classical auditory regions in response to visual linguistic stimuli (Petitto et al., 2000; Bavelier et al., 2001; Emmorey et al., 2002; MacSweeney et al., 2002; Fine et al., 2005). Brain imaging studies of signers have reported robust right hemisphere perisylvian activation for signing (Neville et al., 1998; Bavelier et al., 2001; MacSweeney et al., 2004; Newman et al., 2002). The present data, which use individual sign stimuli, show little evidence of right perisylvian activation when contrasted against our baseline task (Figure 3c, Table 1c, right panel) and in the direct contrast to non-linguistic gestures (Figure 5, Table 5, left panel). However, it is important to note that studies which have reported extensive right hemisphere effects for sign comprehension have all involved the presentation of sentential material. As with spoken languages, syntactic- and discourse-level factors may more fully engage right hemisphere perisylvian regions than does the perception of single signs (Xu et al., 2005; Robertson et al., 2000; St. George et al., 1999).
When non-linguistic actions are directly contrasted with ASL (Figure 5, shown in green), prominent activity associated with their viewing was found in right middle occipital posterior visual association areas (BA 19/18) extending into the ventral temporal lobe (BA 20). Activation was also seen in anterior regions of the middle and superior temporal gyri. Right frontal lobe activation included the middle frontal gyrus (BA 46) and bilateral superior frontal activation (BA 10) (Table 5, right panel). Left hemisphere activation included middle occipital regions (BA 19) as well as left inferior temporal and parahippocampal gyrus activation.
Posterior middle occipital-temporal association areas are associated with specialized, high-level classes of visual stimulation. For example, regions of lateral posterior visual association cortex have been proposed to be selective for faces (Kanwisher et al., 1997; Kanwisher, 2000), images of bodies (Downing, et al., 2001), and biological movement (Grossman et al., 2000; Grossman & Blake, 2001; Grossmann & Blake, 2002).
A growing number of studies have begun further characterizing both the functional overlap and the functional-anatomical specificities within these regions (Downing et al., 2005; Morris et al., 2006; Peelen et al., 2006). It is interesting that in our data there is a clear focus of activity in right hemisphere occipital-temporal association areas that shows a stronger response to these non-linguistic human actions compared to ASL viewing. These regions correspond to the EBA and hMT regions described by Peelen et al. (2006). These data may indicate differential processing of the movement and visual properties of non-linguistic action, which may reflect active enhancement of regions sensitive to human body form and movement that serve to gate non-linguistic and linguistic actions. Alternatively, deaf signers may show a greater reliance upon top-down processing in the recognition of highly familiar, linguistically compositional human actions, leading to more automatic and efficient processing of highly familiar visual linguistic featural information with less reliance on early visual processing.
In a previous study, MacSweeney et al. (2004) also examined contrasts between sign (in this case British Sign Language) and non-linguistic gestures in deaf and hearing subjects. However, that study was designed to provide a close contrast between a natural signed language (BSL) and a similar gestural display. The authors chose a highly constrained set of stylized intransitive non-linguistic gestures used as a signaling code by racecourse bookmakers, known as Tic Tac. As described, these stimuli shared many of the gestural and rhythmic dynamic qualities of natural signed languages (MacSweeney et al., 2004). Only a small set of recurring Tic Tac tokens (n=16) was used in the non-linguistic condition, and the stimuli were filmed with facial expressions so as to match the linguistic expression used in BSL. The results of this study suggested highly similar neural activation for BSL sentences and Tic Tac sentences. This is not surprising given that the researchers were trying to optimize the physical similarities. Each contrast, set against a static low-level baseline, produced widespread bilateral posterior temporal-occipital, STS, and inferior temporal gyrus activation. Direct comparison between BSL and Tic Tac indicated a circumscribed left-lateralized posterior temporal lobe focus with extension into the supramarginal gyrus and inferior fusiform and middle temporal gyrus activity. Only a limited region of enhanced activation for Tic Tac in comparison to BSL was found in the signing group, and this was focused in right hemisphere posterior temporal/occipital regions. The locations reported (38, −56, −2; 24; 38, −64, −7; 24, −83, −7) are proximal, but somewhat inferior, to the regions we have found prominent in our comparisons.
Finally, we cannot rule out that our differences in activation may be partially attributable to sensory differences, rather than content differences, among these naturalistic stimuli. While the total duration of the passive viewing blocks was equivalent across runs (and a constant SOA was used), the naturalistic stimuli do differ in temporal duration. Thus, posterior visual regions may be driven to a greater extent by the longer non-linguistic stimuli relative to the shorter ASL stimuli. Figure 6 and Table 6 illustrate these same contrasts in the hearing subjects. As is evident from these figures (ASL activation shown in red, non-linguistic action activation shown in green), these stimulus contrasts do lead to prominent occipital activation for the non-linguistic gestures relative to ASL. However, this activation is more limited in the hearing subjects: it does not include extension into the anterior middle occipital BA 19 region and does not extend robustly into temporal-ventral and anterior temporal lobe regions. A conjunction analysis was used to compare common activations across deaf and hearing subjects in the contrasts comparing non-linguistic actions versus ASL. In this analysis, one large right posterior occipital cluster is observed, which includes the inferior occipital gyrus (BA 18) (x = 44, y = −80, z = −6) with extension into posterior middle occipital BA 19 (32, −92, 6). It is also noteworthy that in this contrast the ASL activation in the hearing subjects (shown in red) does not encompass frontal and perisylvian language regions, as was observed in the deaf subjects.
The statistical group comparisons described below (deaf versus hearing) provide further specificity and confirmation of the differences in the processing of linguistic and non-linguistic action in deaf signers and hearing non-signers.
Group Comparison: Non-linguistic Action versus Linguistic Action
A fixed-effects group comparison allowed us to examine the differences between hearing and deaf subjects in the processing of non-linguistic gestures and American Sign Language relative to baseline. This analysis provides a further quantification of the individual within-group patterns described above. The separate treatment of the ASL and the non-linguistic gestures (self-oriented and object-oriented actions) in these group analyses provides a further control for the stimulus differences inherent in these naturalistic stimuli.
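Conceptually, a fixed-effects group comparison pools scans within each group and contrasts the group means against the pooled within-scan variance, so inference extends to the scans acquired rather than to the population. Below is a minimal sketch with assumed array names and shapes, not the actual analysis code:

```python
import numpy as np

def fixed_effects_group_difference(con_deaf, var_deaf, con_hear, var_hear):
    """Voxelwise fixed-effects comparison of two groups'
    condition-versus-baseline effects.

    con_*: per-scan contrast estimates, shape (n_scans, n_voxels).
    var_*: matching within-scan variance estimates.
    """
    diff = con_deaf.mean(axis=0) - con_hear.mean(axis=0)
    # Variance of each group mean under fixed-effects pooling
    se = np.sqrt(var_deaf.sum(axis=0) / con_deaf.shape[0] ** 2
                 + var_hear.sum(axis=0) / con_hear.shape[0] ** 2)
    return diff / se   # voxelwise z statistic (deaf > hearing)
```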
Figure 7 illustrates the combined contrast maps reflecting significant differences for normally hearing (shown in green) and deaf (shown in red) subjects for non-linguistic gestures versus baseline. The figure indicates that hearing subjects activated (to a greater extent than the deaf) a left inferior frontal, superior temporal, and bilateral parietal network associated with human action processing (Table 7, left panels).
Figure 7.

Group Comparison of Non-linguistic Gestures > Baseline: Deaf (red) and Hearing (green).
Table 7.
Self-oriented and Object-oriented Actions > Baseline.
| | Hearing > Deaf | | | | | | | | | | Deaf > Hearing | | | | | | | | | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | Left Hemisphere | | | | | Right Hemisphere | | | | | Left Hemisphere | | | | | Right Hemisphere | | | | |
| Region | Area | x | y | z | Z score | Area | x | y | z | Z score | Area | x | y | z | Z score | Area | x | y | z | Z score |
| Frontal | ||||||||||||||||||||
| Superior frontal gyrus | 6 | −16 | 4 | 70 | 3.48 | 11 | 24 | 62 | −22 | 4.15 | ||||||||||
| Middle frontal gyrus | 46 | −44 | 22 | 20 | 3.76 | |||||||||||||||
| 10 | −36 | 60 | 2 | 3.64 | 10 | 36 | 60 | 14 | 3.97 | |||||||||||
| Inferior frontal gyrus | 47 | −28 | 22 | −12 | 3.8 | 45 | 54 | 28 | 4 | 3.4 | ||||||||||
| Precentral gyrus | 6 | −62 | −18 | 44 | 4.2 | 4 | 18 | −32 | 68 | 4.06 | ||||||||||
| Cingulate | 24 | 12 | 10 | 32 | 3.61 | |||||||||||||||
| Temporal | ||||||||||||||||||||
| Temporal pole | 38 | 60 | 12 | −12 | 3.8 | |||||||||||||||
| 44 | 6 | −16 | 3.47 | |||||||||||||||||
| Superior temporal lobe | 39 | −64 | −56 | 24 | 3.8 | 21 | 38 | 4 | −32 | 3.88 | ||||||||||
| 42 | 66 | −20 | 8 | 3.27 | ||||||||||||||||
| Middle temporal gyrus | 20 | −56 | −40 | −12 | 3.36 | |||||||||||||||
| Insula | 38 | −38 | 8 | 18 | 3.92 | |||||||||||||||
| Parahippocampal gyrus | 36 | −22 | −38 | −4 | 3.82 | −30 | −50 | −2 | 3.44 | |||||||||||
| Parietal | ||||||||||||||||||||
| Postcentral gyrus | 7 | −4 | −54 | 74 | 4.29 | |||||||||||||||
| Precuneus | 7 | −18 | −78 | 56 | 3.41 | |||||||||||||||
| Inferior parietal | 40 | −50 | −46 | 26 | 3.49 | 40 | 52 | −58 | 52 | 3.31 | 40 | −60 | −36 | 34 | 3.59 | |||||
| −46 | −42 | 60 | 3.25 | |||||||||||||||||
| Subcortical | ||||||||||||||||||||
| Caudate | −10 | −2 | 18 | 4.09 | 26 | −38 | 12 | 3.86 | ||||||||||||
| Occipital | −12 | 24 | 2 | 3.45 | ||||||||||||||||
| Middle occipital gyrus | 19 | −52 | −78 | −12 | 4.00 | |||||||||||||||
| Cuneus | 18 | −20 | −106 | 4 | 5.13 | 18 | 18 | −90 | 8 | 3.91 |
| Cerebellum | 16 | −82 | −30 | 3.37 | ||||||||||||||||
Deaf subjects show a very different pattern involving left middle occipital and inferior parietal regions and bilateral posterior occipital regions, including a prominent contribution by the cuneus (Table 7, right panel). Extensive right anterior temporal pole and inferior frontal gyrus activation was also found, as was activity in primary motor and left premotor cortex. Cuneus activation is associated with high-level visual processing in a variety of domains, including spatial navigation (Maguire et al., 1998), visual search (Gitelman et al., 2002), and complex object processing (Ishai et al., 2002), and it has further been reported in the perception of meaningless hand and arm gestures relative to meaningful movements (Decety et al., 1997).
Coupled with this posterior activation was the involvement of a left middle occipital-temporal area, a high-level visual association region. It is interesting that this particular left hemisphere region has been observed in studies of visual object processing, including decisions regarding face versus object properties (Gerlach et al., 2000; Levy et al., 2001; Hasson et al., 2001). In the context of the present study, these findings are consistent with engagement of a posterior lateral visual system involved in assessing the visual properties of action stimuli, which may be engaged for the differentiation of non-linguistic human action in deaf individuals.
The right temporal pole activity is another prominent feature of the deaf data. There is growing evidence to suggest that the right temporal pole may participate in the association between visual information about faces and person-related semantics (Tsukiura et al., 2006). Since our model is a deaf actor, we cannot rule out the recognition of this individual by our deaf subjects. However, the asymmetrical response we observe (RH > LH) appears different in kind from the more expected patterns of bilateral and/or strongly left temporal pole activation for person-name recognition (Leveroni et al., 2000; Gorno-Tempini & Price, 2001; Grabowski et al., 2001).
It is interesting to note that in a study of a voice-selective area in human auditory cortex, Belin et al. (2000) reported a right superior temporal pole area proximal to that observed in the present study (60, 12, −12), which responded to vocal sounds, both speech (words, connected speech) and non-speech (coughs, sighs, laughs), compared with energy-matched non-vocal sounds (natural sounds, animal cries, mechanical sounds). The present data suggest that this region in the deaf may contribute to the analysis of the communicative potential of human actions. If true, this is a particularly striking example of auditory association area plasticity.
Group differences for ASL processing relative to baseline are shown in Figure 8, Table 8, left and right panels. These data indicate that during the passive viewing of ASL, the deaf (shown in red), relative to hearing (shown in green), engaged bilateral motor and somatosensory regions in central sulcus regions. In contrast, hearing subjects showed relatively greater activation in visual association processing areas including the cuneus and posterior-superior parietal and inferior temporal parietal regions. Prominent right medial frontal lobe and frontal orbital activation is also seen.
Figure 8.

Group Comparison of American Sign Language > Baseline: Deaf (red) and Hearing (green).
Table 8.
American Sign Language > Baseline.
| | Hearing > Deaf | | | | | | | | | | Deaf > Hearing | | | | | | | | | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | Left Hemisphere | | | | | Right Hemisphere | | | | | Left Hemisphere | | | | | Right Hemisphere | | | | |
| Region | Area | x | y | z | Z score | Area | x | y | z | Z score | Area | x | y | z | Z score | Area | x | y | z | Z score |
| Frontal | ||||||||||||||||||||
| Superior frontal gyrus | 11 | 18 | 48 | −24 | 3.55 | |||||||||||||||
| Middle frontal gyrus | 10 | 36 | 64 | 10 | 3.81 | |||||||||||||||
| 6 | 46 | 0 | 40 | 3.56 | ||||||||||||||||
| Medial frontal gyrus | 10 | 2 | 64 | 12 | 3.22 | |||||||||||||||
| Central sulcus | 4 | −62 | −18 | 44 | 4.53 | 4 | 66 | −18 | 30 | 3.12 | ||||||||||
| 4 | −12 | −20 | 50 | 3.37 | 4 | 18 | −30 | 68 | 3.93 | |||||||||||
| Precentral gyrus | 4/6 | −24 | −36 | 68 | 3.69 | 4/6 | 26 | −20 | 66 | 3.05 | ||||||||||
| Cingulate | 16 | 18 | 36 | 3.33 | 31 | −10 | −38 | 34 | 3.34 | 31 | 4 | −26 | 46 | 4.1 | ||||||
| −14 | −58 | 6 | 3.9 | |||||||||||||||||
| Temporal | ||||||||||||||||||||
| Superior temporal lobe | 39 | −64 | −56 | 24 | 3.61 | |||||||||||||||
| Insula | 13 | −38 | 8 | 18 | 3.41 | |||||||||||||||
| Parietal | ||||||||||||||||||||
| Postcentral gyrus | 7 | −4 | −58 | 72 | 3.52 | |||||||||||||||
| Precuneus | 7 | −20 | −76 | 54 | 3.9 | 7 | 34 | −70 | 54 | |||||||||||
| Inferior parietal | 40 | −46 | −46 | 34 | 3.35 | 40 | 54 | −56 | 48 | 3.24 | ||||||||||
| Subcortical | ||||||||||||||||||||
| Caudate | −10 | −2 | 16 | 3.43 | 26 | −38 | 12 | 3.77 | ||||||||||||
| Occipital | ||||||||||||||||||||
| Cuneus | 19 | −4 | −88 | 26 | 3.38 |
| Middle occipital gyrus | 19 | −54 | −62 | −4 | 3.65 | |||||||||||||||
| Cerebellum | 26 | −70 | −30 | 3.84 | ||||||||||||||||
A notable feature of the group differences was the greater motor and somatosensory activation in response to ASL for the deaf relative to the hearing subjects: specifically, bilateral activation in a superior motor region and in a second, more lateral focus that maps to areas subserving hand and finger functions. The activation of mouth motor regions during the perception of speech is well attested (Wilson et al., 2004; Calvert & Campbell, 2003). The present results suggest homologous motor activations in the deaf, whose primary language is articulated with the hands and fingers. These data indicate that, in the case of the deaf signers, the linguistic encoding of signs engenders activation in the primary motor and somatosensory cortices, suggesting a linkage between the execution/sensation of human movement and the perception of signs.
In this ASL group contrast, there is a lack of significant differences in inferior frontal activations. This lack of differences suggests that the inferior frontal regions involved in non-linguistic action perception observed in the hearing subjects (for example, see Figure 7) may share considerable overlap with the inferior frontal regions coding linguistic action in the case of the deaf subjects (for example, see Figure 5). These data may be taken as evidence that the frontal neural system mediating linguistic processing in the deaf may be functionally related to a more general human action recognition system (for discussion, see Corina & Knapp, 2006). It remains unclear whether such linguistic recruitment in the deaf precludes development of these frontal regions for non-linguistic action processing (for example, compare Figure 7).
Discussion
Taken together these data lead us to three primary conclusions. First, in normally hearing subjects, the passive viewing of self-oriented actions, object-oriented transitive actions, and American Sign Language actions all engage a common neural network including superior parietal (BA 40/7), left ventral premotor (BA 6), and inferior frontal (BA 46) regions. The degree of anatomical overlap across these three classes of human actions is impressive. These results add to the growing evidence for a specialized system involved in human action recognition.
Second, deaf subjects who have had life-long experience with human actions in the context of linguistic motion perception (i.e. ASL) show a qualitatively different pattern of results. These individuals exhibited a marked difference between neural regions subserving linguistic (sign language) and non-linguistic human actions. As expected, signs engaged previously described left-hemisphere language areas in posterior-superior temporal, parietal, and inferior frontal areas. These data add to the growing evidence for specialized regions for linguistic processing, independent of the modality of expression (sign or speech).
Third, even for the classes of non-linguistic gestures, deaf and hearing groups showed qualitatively different patterns of activation. While hearing subjects recruited the human action system described above, deaf subjects showed engagement of posterior occipital temporal regions and the right temporal pole. In direct comparisons of non-linguistic gestures versus ASL, both groups show prominent activation of posterior occipital regions, likely reflecting duration differences in the naturalistic stimuli. However, beyond this shared activation, we do observe significant group differences as well. These group comparisons provided further evidence for frank differences in the processing of linguistic actions (ASL) and naturally occurring transitive and self-oriented actions. The occipital-temporal regions unique to the deaf correspond to well-described extrastriate association areas including extrastriate body area and hMT (Downing et al., 2001; Downing et al., 2005).
The results suggest that life-long experience with a visual language may significantly alter the brain regions involved in human action recognition. Signers must be attuned to the presence of non-linguistic actions produced by those around them, and they must be able to quickly detect the presence of linguistic movements produced by fellow signers. This entails being able to parse sign language motions from other kinds of human movements that co-occur in the visual environment, and extends to the mapping of these sign language actions onto linguistic movement patterns stored in memory. The differences in neural activity in the present studies suggest possible mechanisms for this active filtering. Top-down linguistic knowledge may engender fast, efficient mapping of signs onto cortical regions specialized for linguistic processing, and primary motor regions corresponding to hand and finger representations may factor significantly in this mapping. Signers may make use of extrastriate regions to quickly identify and filter non-linguistic gestures from linguistic ones for the further processing of lexical semantic content, in a fashion not required for non-signers. The active, on-going participation of these regions responsible for detecting cues to non-linguistic status may reflect a mode of human action processing that is different from the usual human action recognition system commonly observed in non-signing subjects. Taken together, these data suggest a human action recognition system that is modifiable in the face of competing demands for the categorization of ecologically-relevant human movements.
Experimental Procedure
Stimuli
Three classes of stimuli were constructed for the present experiment: self-oriented actions, transitive object-oriented actions, and common ASL signs. All stimulus items were videotaped, edited, and digitized using a digital editing system and were presented with an SOA of 3000 ms. A single professional male deaf actor served as the model for all stimuli.
Self-oriented actions
Self-oriented actions, which could also be termed “self-grooming” actions (Figure 9a), are highly frequent motions with little social importance for a viewer. Our stimuli were vignettes consisting of the hands manipulating or acting on a part of the body, especially the upper torso, face, and head. Representative actions included running one’s fingers through the hair, rubbing one’s eye, and stretching the intertwined hands forward. There were 35 actions in total, with a mean duration of 1650 ms.
Figure 9. Example stimuli: (a) self-oriented actions, (b) object-oriented actions, (c) ASL signs, (d) motion baseline.
Object-oriented actions
Object-oriented actions involved a model canonically manipulating a variety of common objects (Figure 9b). Examples of these stimuli include folding a shirt, biting an apple, and popping a balloon. The majority of these actions occurred near the upper torso. There were 32 unique stimuli with a mean duration of 1433 ms.
American Sign Language
ASL stimuli were two-handed common nouns (Figure 9c). These are meaningful communicative actions for users of signed language and are recognized as communicative even by non-signers. Signs are highly conventionalized and are composed of a limited set of recurring manual-gestural elements: a discrete selection of handshapes and movements produced on, toward, and about a constrained set of body locations. We avoided nouns that could be considered “pantomimic” of the object (for example, the sign for toothbrush, in which the index finger mimics the brushing of the teeth). There were 36 unique stimuli with a mean duration of 333 ms.
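The compositional structure just described can be made concrete with a small illustrative encoding: each sign draws its parameters from small closed inventories. The inventories and the example entry below are simplified placeholders, not an actual ASL phonological transcription.

```python
# Schematic encoding of sign-internal structure: discrete handshape, location,
# and movement inventories, from which each conventionalized sign is composed.
from dataclasses import dataclass

HANDSHAPES = {"B", "A", "5", "1", "C"}           # discrete handshape set
LOCATIONS = {"neutral_space", "chin", "chest"}   # constrained body locations
MOVEMENTS = {"contact", "circular", "straight"}  # recurring movement types

@dataclass
class Sign:
    gloss: str
    handshape: str
    location: str
    movement: str

    def __post_init__(self):
        # A well-formed sign draws each element from its closed inventory.
        assert self.handshape in HANDSHAPES
        assert self.location in LOCATIONS
        assert self.movement in MOVEMENTS

# A schematic (invented) two-handed noun entry:
example = Sign("HOUSE", "B", "neutral_space", "straight")
```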
Baseline
A fourth condition served as an active baseline to control for stimulus luminance and the low-level motion cues necessarily common to all human motions. These baseline stimuli were created by digitally superimposing an ASL stimulus on itself with a 90-degree shift in orientation, and then further processing the result with an Adobe After Effects diffuse filter (radius 16) (Figure 9d). The resulting stimuli are complex moving visual displays in which the content of the image is incoherent, but the visible movement shears are derived from human actions.
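A minimal per-frame sketch of this construction, assuming OpenCV in place of the original digital editing pipeline; the Gaussian blur merely stands in for the After Effects diffuse filter, which has no exact OpenCV equivalent, and the function name is ours.

```python
# Illustrative reconstruction of the baseline stimulus processing: superimpose
# a frame on a 90-degree-rotated copy of itself, then diffuse the blend.
import cv2

def baseline_frame(frame):
    rotated = cv2.rotate(frame, cv2.ROTATE_90_CLOCKWISE)
    # Rotation swaps width and height; resize back so the two layers align.
    rotated = cv2.resize(rotated, (frame.shape[1], frame.shape[0]))
    blended = cv2.addWeighted(frame, 0.5, rotated, 0.5, 0)  # superimpose
    return cv2.GaussianBlur(blended, (33, 33), 0)  # stand-in for diffusion

# Applying baseline_frame to every frame of an ASL clip yields incoherent
# imagery whose motion energy is nevertheless derived from human action.
```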
Subjects
Two groups of right-handed subjects were tested: hearing individuals with no knowledge of sign language (n = 10, 5 male, ages 21–27) and profoundly deaf (hearing loss > 80 dB) native signers with lifelong experience with a sign language (n = 10, 4 male, ages 20–29). The term native signer refers to an individual raised in a signing household whose first and primary language is a sign language.
Tasks
Subjects viewed videotapes of self-oriented actions, transitive object-oriented actions, two-handed ASL noun signs, and the human motion-derived baseline. Following each passive viewing condition, subjects completed a short recognition test of signs or actions that required them to identify two to four previously seen items from a set of six. All subjects performed near ceiling on this simple recognition measure. Presentations of the four classes of stimuli were counterbalanced within a larger series of conditions used to explore the linguistic and motor neural correlates of signing; the results of those studies are published in Corina et al. (2003) and San Jose-Robertson et al. (2004).
Scanning Methods
Each participant was scanned at the NIH PET department in Bethesda, Maryland, using a GE Advance 3D PET camera (GE Medical Systems, Waukesha, WI), which has an axial and in-plane resolution of 4.25 mm. Thirty-five planes, offset by 4.25 mm, were acquired simultaneously. A thermoplastic mask was applied to each subject to maintain head position between scans. An initial transmission scan was performed for each participant to allow correction for attenuation. Four 10 mCi injections of H₂¹⁵O were administered to each subject through a venous catheter in the left arm. PET scanning was initiated by the arrival of the ¹⁵O-water bolus in the brain (approximately 20 seconds after injection) and continued for one minute, with five-minute intervals between scans.
Statistical Analysis
PET scans were registered, spatially smoothed (12 mm³), and stereotaxically normalized into MNI coordinate space using SPM2 software (Wellcome Department of Cognitive Neurology, London, UK). PET images from all subjects were scaled to an overall cerebral blood flow (CBF) grand mean of 50 mL/100 g/min. Analysis of covariance, with global activity as a confounding covariate, was performed on a pixel-by-pixel basis. An SPM multiple-subject, multiple-condition factorial design was used to evaluate the within-group task conditions (Self-oriented Actions, Object-oriented Actions, ASL Signing, and Baseline). Due to time constraints, the baseline condition was not obtained for four deaf subjects; analyses of contrasts involving this baseline are therefore predicated on data from six deaf subjects, while direct comparisons between all other conditions utilize data from all 10 deaf subjects. An SPM multi-group, multiple-condition factorial analysis was used for fixed-effects between-group comparisons. Main effects and group contrasts were evaluated at p < .005 uncorrected, cluster extent > 20 voxels. Conjunction analysis (Nichols et al., 2004) was used to evaluate commonalities between multiple conditions; this analysis ensures that all the contrast effects evaluated under conjunction are non-null at the p < .002 level. Each group contrast was masked inclusively by the relevant within-group contrast of interest at the p < .05 level. For example, the group contrast evaluating whether deaf subjects showed activations during the ASL-versus-baseline comparison that differed significantly from hearing subjects' activations for ASL versus baseline was inclusively masked by the deaf subjects' ASL-versus-baseline contrast. This procedure ensures that the significant results reflect true increases from baseline.
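The inclusive-masking step can be illustrated schematically as follows. The NumPy arrays and names below stand in for SPM's statistical images and are not the actual SPM2 workflow; the thresholds, however, follow the text (group contrasts at p < .005, masks at p < .05).

```python
# Schematic sketch of inclusive masking: keep group-difference voxels only
# where the within-group contrast of interest is itself significant, so that
# reported group differences reflect true increases from baseline.
import numpy as np

def inclusive_mask(group_p, within_p, p_group=0.005, p_mask=0.05):
    """Boolean image of voxels surviving both the group contrast
    threshold and the within-group mask threshold."""
    return (group_p < p_group) & (within_p < p_mask)

# Toy example with random "p-value images" in place of SPM output:
rng = np.random.default_rng(0)
group_p = rng.uniform(size=(4, 4))
within_p = rng.uniform(size=(4, 4))
print(inclusive_mask(group_p, within_p))
```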
The figures shown represent renderings of the SPM output at the p < .005 threshold, displaying clusters of 20 voxels or greater. The tables, in contrast, give a more conservative account of the data, reporting significant activations of Z > 3.21, p < .001. In this approach, the figures provide an assessment of the global patterns of activity observed, while the tables provide a more conservative and economical account of the data.
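As a quick numerical check of these reporting thresholds (illustrative only, not part of the original analysis):

```python
# Relating the reported Z and p thresholds under the standard normal.
from scipy.stats import norm

print(norm.sf(3.21))    # one-tailed p for Z = 3.21: ~0.00066, i.e. p < .001
print(norm.isf(0.005))  # Z corresponding to the figure threshold p < .005: ~2.58
```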
Acknowledgments
This work was supported by NIH/NIDCD grant R01 DC003099 awarded to David Corina. We thank Gallaudet University and all the participants of this study. We thank Mary Mendoza for help in the preparation of the manuscript.
Appendix
| NOUNS | ACTION | SELF GROOMING |
|---|---|---|
| ANIMAL | TAKE PICTURE | PICK EAR |
| RAIN | LICK STAMP | SCRATCH BELLY |
| BOTTLE | BITE APPLE | PICK NAIL |
| HOUSE | WATCH TV | BRUSH HAIR |
| WIFE | KICK FOOTBALL | SCRATCH NECK |
| AUDIENCE | READ NEWSPAPER | RUB NOSE |
| RIVER | SWEEP WITH BROOM | WIPE MOUTH |
| DRAMA | FOLD SHIRT | CRACK KNUCKLES |
| STUDENT | DRAW WITH CRAYONS | RUB EYE |
| BLOOD | STAPLE PAPER | RUB TEETH |
| SOLDIER | COUNT MONEY | ADJUST SHOULDERS |
| TEMPERATURE | POP BALLOON | RUB JAW |
| BOX | SMELL FLOWER | YAWN |
| PARKING | HIDE GUN | BRUSH OFF SHOULDERS |
| FOREST | COMB HAIR | WIPE BROW |
| FIRE | PLAY WITH BLOCKS | RUB TEMPLES |
| HOSPITAL | SHAVE FACE | RUB FACE |
| HOTEL | BLOW OUT FLAME | BRUSH OFF CHEST |
| COLLEGE | BREAK COOKIE | CLEAN HANDS |
| DOCTOR | READ BOOK | POP SHOULDER |
| DOOR | THROW BASKETBALL | SCRATCH CHIN |
| FAMILY | BRUSH TEETH | LICK FINGERS |
| AUTUMN | CHEW GUM | LICK LIPS |
| SISTER | DRINK WATER | POP KNUCKLES |
| COUNTRY | KISS TOY FROG | WIGGLE JAW |
| DAY | CRUSH CAN | RUB EYE |
| OPPOSITE | SPILL WATER | BITE HAND |
| TEAM | STEAL WALLET | PAT STOMACH |
| NIGHT | SIT DOWN IN CHAIR | PULL UP SLEEVES |
| BEACH | TYPE ON TTY | PICK TEETH |
| BOOK | PLUG IN CORD | RUB CHEEK |
| BOAT | | MASSAGE SHOULDER |
| CHURCH | | BRUSH HAIR |
| PAPER | | PICK TEETH |
| EARTH | | RUB LEFT SHOULDER |
| CAR | | |
Literature References
- Arbib MA. Rana Computatrix to human language: towards a computational neuroethology of language evolution. Philos Transact A Math Phys Eng Sci. 2003;361:2345–2379. doi: 10.1098/rsta.2003.1248.
- Aziz-Zadeh L, Koski L, Zaidel E, Mazziotta J, Iacoboni M. Lateralization of the human mirror neuron system. J Neurosci. 2006;26(11):2964–2970. doi: 10.1523/JNEUROSCI.2921-05.2006.
- Bavelier D, Brozinsky C, Tomann A, Mitchell T, Neville H, Liu G. Impact of early deafness and early exposure to sign language on the cerebral organization for motion processing. J Neurosci. 2001;21:8931–8942. doi: 10.1523/JNEUROSCI.21-22-08931.2001.
- Bavelier D, Corina D, Jezzard P, Clark V, Karni A, Lalwani A, Rauschecker JP, Braun A, Turner R, Neville HJ. Hemispheric specialization for English and ASL: left invariance-right variability. Neuroreport. 1998;9(7):1537–1542. doi: 10.1097/00001756-199805110-00054.
- Beauchamp MS, Lee KE, Haxby JV, Martin A. fMRI responses to video and point-light displays of moving humans and manipulable objects. J Cogn Neurosci. 2003;15(7):991–1001. doi: 10.1162/089892903770007380.
- Belin P, Zatorre RJ, Lafaille P, Ahad P, Pike B. Voice-selective areas in human auditory cortex. Nature. 2000;403(6767):309–312. doi: 10.1038/35002078.
- Calvert GA, Campbell R. Reading speech from still and moving faces: the neural substrates of visible speech. J Cogn Neurosci. 2003;15(1):57–70. doi: 10.1162/089892903321107828.
- Carr L, Iacoboni M, Dubeau MC, Mazziotta JC, Lenzi GL. Neural mechanisms of empathy in humans: a relay from neural systems for imitation to limbic areas. Proc Natl Acad Sci U S A. 2003;100(9):5497–5502. doi: 10.1073/pnas.0935845100.
- Corina DP, Knapp H. Sign language processing and the mirror neuron system. Cortex. In press. doi: 10.1016/s0010-9452(08)70393-9.
- Corina DP, McBurney S. The neural representation of language in users of American Sign Language. J Commun Disord. 2001;34(6):455–471. doi: 10.1016/s0021-9924(01)00063-6.
- Corina DP, San Jose-Robertson L, Guillemin A, High J, Braun AR. Language lateralization in a bimanual language. J Cogn Neurosci. 2003;15:718–730. doi: 10.1162/089892903322307438.
- Corina DP. Aphasia in users of signed language. In: Coppens P, Lebrun Y, Basso A, editors. Aphasia in atypical populations. Lawrence Erlbaum; London: 1998. pp. 261–309.
- Decety J, Grezes J, Costes N, Perani D, Jeannerod M, Procyk E, Grassi F, Fazio F. Brain activity during observation of actions. Influence of action content and subject’s strategy. Brain. 1997;120:1763–1777. doi: 10.1093/brain/120.10.1763.
- Downing PE, Chan AW, Peelen MV, Dodds CM, Kanwisher N. Domain specificity in visual cortex. Cereb Cortex. 2005. In press. doi: 10.1093/cercor/bhj086.
- Downing PE, Jiang Y, Shuman M, Kanwisher N. A cortical area selective for visual processing of the human body. Science. 2001;293:2470–2473. doi: 10.1126/science.1063414.
- Emmorey K, Damasio H, McCullough S, Grabowski T, Ponto LL, Hichwa RD, Bellugi U. Neural systems underlying spatial language in American Sign Language. Neuroimage. 2002;17(2):812–824.
- Emmorey K. Language, cognition, and the brain: insights from sign language research. Mahwah, NJ: Lawrence Erlbaum Associates; 2002.
- Ferrari PF, Gallese V, Rizzolatti G, Fogassi L. Mirror neurons responding to the observation of ingestive and communicative mouth actions in the monkey ventral premotor cortex. Eur J Neurosci. 2003;17:1703–1714. doi: 10.1046/j.1460-9568.2003.02601.x.
- Fine I, Finney EM, Boynton GM, Dobkins KR. Comparing the effects of auditory deprivation and sign language within the auditory and visual cortex. J Cogn Neurosci. 2005;17(10):1621–1637. doi: 10.1162/089892905774597173.
- Fogassi L, Gallese V, Fadiga L, Rizzolatti G. Neurons responding to the sight of goal directed hand/arm actions in the parietal area PF (7b) of the macaque monkey. Abstr Soc Neurosci. 1998;24:257.5.
- Gallese V, Fadiga L, Fogassi L, Rizzolatti G. Action recognition in the premotor cortex. Brain. 1996;119:593–609. doi: 10.1093/brain/119.2.593.
- Gallese V, Fogassi L, Fadiga L, Rizzolatti G. Action representation and the inferior parietal lobule. In: Prinz W, Hommel B, editors. Attention and Performance. Oxford University Press; New York: 2002. pp. 247–266.
- Gerlach C, Law I, Gade A, Paulson OB. Categorization and category effects in normal object recognition: a PET study. Neuropsychologia. 2000;38(13):1693–1703. doi: 10.1016/s0028-3932(00)00082-8.
- Gitelman DR, Parrish TB, Friston KJ, Mesulam MM. Functional anatomy of visual search: regional segregations within the frontal eye fields and effective connectivity of the superior colliculus. Neuroimage. 2002;15(4):970–982. doi: 10.1006/nimg.2001.1006.
- Grezes J, Armony JL, Rowe J, Passingham RE. Activations related to “mirror” and “canonical” neurons in the human brain: an fMRI study. Neuroimage. 2003;18:928–937. doi: 10.1016/s1053-8119(03)00042-9.
- Grezes J, Costes N, Decety J. Top down effect of the strategy on the perception of human biological motion: a PET investigation. Cogn Neuropsychol. 1998;15:553–582. doi: 10.1080/026432998381023.
- Grezes J, Decety J. Functional anatomy of execution, mental simulation, observation, and verb generation of actions: a meta-analysis. Hum Brain Mapp. 2001;12:1–19. doi: 10.1002/1097-0193(200101)12:1<1::AID-HBM10>3.0.CO;2-V.
- Grossman ED, Blake R. Brain activity evoked by inverted and imagined biological motion. Vision Res. 2001;41:1475–1482. doi: 10.1016/s0042-6989(00)00317-5.
- Grossman ED, Blake R. Brain areas active during visual perception of biological motion. Neuron. 2002;35(6):1167–1175. doi: 10.1016/s0896-6273(02)00897-8.
- Grossman E, Donnelly M, Price R, Pickens D, Morgan V, Neighbor G, Blake R. Brain areas involved in perception of biological motion. J Cogn Neurosci. 2000;12:711–720. doi: 10.1162/089892900562417.
- Hasson U, Hendler T, Ben Bashat D, Malach R. Vase or face? A neural correlate of shape-selective grouping processes in the human brain. J Cogn Neurosci. 2001;13(6):744–753. doi: 10.1162/08989290152541412.
- Hickok G, Bellugi U, Klima ES. The basis of the neural organization for language: evidence from sign language aphasia. Rev Neurosci. 1997;8(3–4):205–222. doi: 10.1515/revneuro.1997.8.3-4.205.
- Hynes CA, Baird AA, Grafton ST. Differential role of the orbital frontal lobe in emotional versus cognitive perspective-taking. Neuropsychologia. 2006;44(3):374–383. doi: 10.1016/j.neuropsychologia.2005.06.011.
- Iacoboni M, Woods RP, Brass M, Bekkering H, Mazziotta JC, Rizzolatti G. Cortical mechanisms of human imitation. Science. 1999;286(5449):2526–2528. doi: 10.1126/science.286.5449.2526.
- Iacoboni M, Zaidel E. Interhemispheric visuo-motor integration in humans: the effect of redundant targets. Eur J Neurosci. 2003;17(9):1981–1986. doi: 10.1046/j.1460-9568.2003.02602.x.
- Iacoboni M. Neural mechanisms of imitation. Curr Opin Neurobiol. 2005;15(6):632–637. doi: 10.1016/j.conb.2005.10.010.
- Ishai A, Ungerleider LG, Martin A, Haxby JV. The representation of objects in the human occipital and temporal cortex. J Cogn Neurosci. 2000;12(Suppl 2):35–51. doi: 10.1162/089892900564055.
- Kanwisher N. Domain specificity in face perception. Nat Neurosci. 2000;3(8):759–763. doi: 10.1038/77664.
- Kanwisher N, McDermott J, Chun MM. The fusiform face area: a module in human extrastriate cortex specialized for face perception. J Neurosci. 1997;17(11):4302–4311. doi: 10.1523/JNEUROSCI.17-11-04302.1997.
- Kjaer T, Nowak M, Lou H. Reflective self-awareness and conscious states: PET evidence for a common midline parietofrontal core. Neuroimage. 2002;17(2):1080–1086.
- Lane RD, Reiman EM, Ahern GL, Schwartz GE, Davidson RJ. Neuroanatomical correlates of happiness, sadness, and disgust. Am J Psychiatry. 1997;154(7):926–933. doi: 10.1176/ajp.154.7.926.
- Levy I, Hasson U, Avidan G, Hendler T, Malach R. Center-periphery organization of human object areas. Nat Neurosci. 2001;4(5):533–539. doi: 10.1038/87490.
- MacSweeney M, Campbell R, Woll B, Giampietro V, David AS, McGuire PK, Calvert GA, Brammer MJ. Dissociating linguistic and nonlinguistic gestural communication in the brain. Neuroimage. 2004;22(4):1605–1618. doi: 10.1016/j.neuroimage.2004.03.015.
- MacSweeney M, Woll B, Campbell R, McGuire PK, David AS, Williams SC, Suckling J, Calvert GA, Brammer MJ. Neural systems underlying British Sign Language and audio-visual English processing in native users. Brain. 2002;125(Pt 7):1583–1593. doi: 10.1093/brain/awf153.
- Maguire EA, Burgess N, Donnett JG, Frackowiak RS, Frith CD, O’Keefe J. Knowing where and getting there: a human navigation network. Science. 1998;280(5365):921–924. doi: 10.1126/science.280.5365.921.
- Miall RC. Connecting mirror neurons and forward models. Neuroreport. 2003;14(17):2135–2137. doi: 10.1097/00001756-200312020-00001.
- Moll J, de Oliveira-Souza R, Eslinger PJ, Bramati IE, Mourao-Miranda J, Andreiuolo PA, Pessoa L. The neural correlates of moral sensitivity: a functional magnetic resonance imaging investigation of basic and moral emotions. J Neurosci. 2002;22(7):2730–2736. doi: 10.1523/JNEUROSCI.22-07-02730.2002.
- Morris JP, Pelphrey KA, McCarthy G. Occipitotemporal activation evoked by the perception of human bodies is modulated by the presence or absence of the face. Neuropsychologia. 2006;44(10):1919–1927. doi: 10.1016/j.neuropsychologia.2006.01.035.
- Neville HJ, Bavelier D, Corina D, Rauschecker J, Karni A, Lalwani A, Braun A, Clark V, Jezzard P, Turner R. Cerebral organization for language in deaf and hearing subjects: biological constraints and effects of experience. Proc Natl Acad Sci USA. 1998;95(3):922–929. doi: 10.1073/pnas.95.3.922.
- Newman AJ, Bavelier D, Corina D, Jezzard P, Neville HJ. A critical period for right hemisphere recruitment in American Sign Language processing. Nat Neurosci. 2002;5(1):76–80. doi: 10.1038/nn775.
- Nichols T, Brett M, Andersson J, Wager T, Poline JB. Valid conjunction inference with the minimum statistic. Neuroimage. 2004;25(3):653–660. doi: 10.1016/j.neuroimage.2004.12.005.
- Peelen MV, Wiggett AJ, Downing PE. Patterns of fMRI activity dissociate overlapping functional brain areas that respond to biological motion. Neuron. 2006;49(6):815–822. doi: 10.1016/j.neuron.2006.02.004.
- Petitto LA, Zatorre RJ, Gauna K, Nikelski EJ, Dostie D, Evans AC. Speech-like cerebral activity in profoundly deaf people processing signed languages: implications for the neural basis of human language. Proc Natl Acad Sci USA. 2000;97:13961–13966. doi: 10.1073/pnas.97.25.13961.
- Poizner H, Klima ES, Bellugi U. What the hands reveal about the brain. MIT Press; Cambridge: 1987.
- Puce A, Perrett D. Electrophysiology and brain imaging of biological motion. Philos Trans R Soc Lond B Biol Sci. 2003;358(1431):435–445. doi: 10.1098/rstb.2002.1221.
- Rizzolatti G, Arbib MA. Language within our grasp. Trends Neurosci. 1998;21(5):188–194. doi: 10.1016/s0166-2236(98)01260-0.
- Rizzolatti G, Camarda R, Fogassi L, Gentilucci M, Luppino G, Matelli M. Functional organization of inferior area 6 in the macaque monkey. II. Area F5 and the control of distal movements. Exp Brain Res. 1988;71:491–507. doi: 10.1007/BF00248742.
- Rizzolatti G, Craighero L. The mirror-neuron system. Annu Rev Neurosci. 2004;27:169–192. doi: 10.1146/annurev.neuro.27.070203.144230.
- Rizzolatti G, Fadiga L, Gallese V, Fogassi L. Premotor cortex and the recognition of motor actions. Brain Res Cogn Brain Res. 1996;3:131–141. doi: 10.1016/0926-6410(95)00038-0.
- Rizzolatti G, Fogassi L, Gallese V. Neurophysiological mechanisms underlying the understanding and imitation of action. Nat Rev Neurosci. 2001;2:661–670. doi: 10.1038/35090060.
- Robertson DA, Gernsbacher MA, Guidotti SJ, Robertson RR, Irwin W, Mock BJ, Campana ME. Functional neuroanatomy of the cognitive process of mapping during discourse comprehension. Psychol Sci. 2000;11(3):255–260. doi: 10.1111/1467-9280.00251.
- Rolls ET. The functions of the orbitofrontal cortex. Brain Cogn. 2004;55(1):11–29. doi: 10.1016/S0278-2626(03)00277-X.
- Sanfey AG, Rilling JK, Aronson JA, Nystrom LE, Cohen JD. The neural basis of economic decision-making in the ultimatum game. Science. 2003;300(5626):1755–1758. doi: 10.1126/science.1082976.
- San Jose-Robertson L, Corina DP, Ackerman D, Guillemin A, Braun AR. Neural systems for sign language production: mechanisms supporting lexical selection, phonological encoding, and articulation. Hum Brain Mapp. 2004;23(3):156–167. doi: 10.1002/hbm.20054.
- Schubotz RI, von Cramon DY. Functional-anatomical concepts of human premotor cortex: evidence from fMRI and PET studies. Neuroimage. 2003;20(Suppl 1):S120–S131. doi: 10.1016/j.neuroimage.2003.09.014.
- St George M, Kutas M, Martinez A, Sereno MI. Semantic integration in reading: engagement of the right hemisphere during discourse processing. Brain. 1999;122(7):1317–1325. doi: 10.1093/brain/122.7.1317.
- Wilson SM, Saygin AP, Sereno MI, Iacoboni M. Listening to speech activates motor areas involved in speech production. Nat Neurosci. 2004;7(7):701–702. doi: 10.1038/nn1263.
- Xu J, Kemeny S, Park G, Frattali C, Braun A. Language in context: emergent features of word, sentence, and narrative comprehension. Neuroimage. 2005;25(3):1002–1015. doi: 10.1016/j.neuroimage.2004.12.013.
