
Keywords: cerebral cortex, hand function, prehension
Abstract
Bimanual movements that require coordinated actions of the two hands may be mediated by synchronous bilateral activation of somatosensory and motor cortical areas in both hemispheres, by enhanced activation of individual neurons specialized for bimanual actions, or by both mechanisms. To investigate cortical neural mechanisms that mediate unimanual and bimanual prehension, we compared actions of the left and right hands in a reach to grasp-and-pull instructed-delay task. Spike trains were recorded with multiple electrode arrays placed in the hand area of primary motor (M1) and somatosensory (S1) cortex of the right hemisphere in macaques, allowing us to measure and compare the relative timing, amplitude, and synchronization of cortical activity in these areas as animals grasped and manipulated objects that differed in shape and location. We report that neurons in the right hemisphere show common task-related firing patterns for the two hands but that actions of the ipsilateral hand elicited weaker and shorter-duration responses than those of the contralateral hand. We report significant bimanual activation of neurons in M1 but not in S1 cortex when animals have free choice of hand use in prehension tasks. Population ensemble responses in M1 thus provide an accurate depiction of hand actions during skilled manual tasks. These studies also demonstrate that somatosensory cortical areas serve important cognitive and motor functions in skilled hand actions. Bilateral representation of hand actions may serve an important role in “motor equivalence” when the same movements are performed by either hand and in transfer of skill learning between the hands.
NEW & NOTEWORTHY Humans can manipulate small objects with the right or left hand but typically select the dominant hand to handle them. We trained monkeys to grasp and manipulate objects with either hand, while recording neural activity in primary motor (M1) and somatosensory (S1) cortex. Actions of both hands activate M1 neurons, but S1 neurons respond only to the contralateral hand. Bilateral sensitivity in M1 may aid skill transfer between hands after stroke or head injury.
INTRODUCTION
Prehension is an object-oriented behavior needed for skilled manual tasks. Object grasp by the hand is also the first step in haptic exploration needed to identify objects tactually, without vision (1–4). The hand shape during grasp provides a geometric mapping of an object’s form. For familiar objects, the hand is preshaped during reach, permitting efficient acquisition (5, 6). Tactile signals of salient object properties such as compliance, curvature, edges, and corners are used to recognize a grasped object and/or manipulate it as a tool (7–10). Grasping behaviors are performed easily by both the right and left hands. The hand used in skilled tasks is often determined by individual hand preference, by its proximity to the object, by previous experience, as well as by the task goals.
During earlier studies of prehension (11–21), we showed that cortical spike trains in primary somatosensory (S1) and posterior parietal (PPC) cortex usually signal specific hand actions during reaching and grasping, rather than specific object properties such as size, shape, and workspace location. Moreover, as first demonstrated by Mountcastle and colleagues (22–25), we found that hand manipulation neurons in PPC respond more vigorously to natural grasping actions than to passively applied tactile stimuli such as light touch, stroking the hand, or vibration.
The data reported in our earlier studies of prehension were obtained when macaques used the hand and arm contralateral to the cortical recording site. However, we also noted that some PPC neurons responded to actions of the ipsilateral hand during spontaneous reaching and grasping, such as when acquiring food from the experimenter or when food was freely available in the environment (20, 26–30).
Additionally, whereas the hand representation in areas 3b and 1 is acallosal, that is, it lacks anatomical connections to the homotypical body part in the opposite hemisphere, neurons in areas 2 and 5 display bilateral anatomical connections between the two hemispheres, especially in area 5 (reviewed in Ref. 31). Krubitzer and Disbrow (31) note that PPC area 5 may serve to integrate inputs from the two hands and aid the interhemispheric transfer of information necessary for interlimb coordination.
In this report, we directly compare neural responses in S1 and M1 cortex of the same animal during use of the left and right hands in a trained prehension task. Our goal was to measure the strength and timing of action representation for the two hands in sensorimotor cortical areas of the same animal. Here we report that neurons in both areas showed common task-related firing patterns and that actions of the ipsilateral hand often evoked weaker responses in M1 neurons than those of the contralateral hand, whereas S1 neurons responded almost uniformly to contralateral actions only. We propose that bilateral representation of hand actions may serve an instructional role in “motor equivalence” when the same movements are performed by either hand. Bilateral representation of hand actions facilitates transfer of skill learning from one hemisphere to the other, providing a neural mechanism for recovery of function after unilateral stroke or head injury (32). Our findings also indicate that sensorimotor cortical areas serve important cognitive and motor functions during skilled hand actions, providing feedback about accomplishment of task goals.
Preliminary results of this study have been published in abstract form (27, 28) and presented to the Society for Neuroscience.
METHODS
Experimental protocols used in this study were reviewed and approved by the New York University Langone Medical Center Institutional Animal Care and Use Committee (IACUC) and are in accordance with the guiding principles for the care and use of experimental animals approved by the Councils of the American Physiological Society, the National Research Council, and the Society for Neuroscience.
Prehension Task
Neurophysiological and behavioral data were obtained from monkey AR, an adult female Macaca nemestrina, seated in a primate chair and trained to perform an instructed-delay grasping task. The task is similar to that used in previous studies from our laboratory to assess prehension in macaques (11–21). The test objects were round, rectangular, and cylindrical knobs mounted on a panel or box attached to the chair (Fig. 1, top left). The animal was guided by five visual icons displayed on a computer screen that signaled the shape and location of the rewarded object (Fig. 1, top right). To obtain a reward, the animal was trained to reach toward, grasp, and pull the designated object, maintain hold for a brief interval, return the object to its start location, and release the grasp. The animal was free to select the actual grasp posture used when manipulating the objects, but the start locations and timing of responses were more constrained in this reach-to-pull task than in our earlier reach-to-lift studies. If the correct object was selected and held in place, the animal received a juice reward; if it chose the wrong object, or failed to respond, no reward was delivered on that trial.
Figure 1.
Instructed-delay bimanual prehension task. Top left: instrument panel used to assess actions of the left and right hands when grasping and pulling 1 of 5 objects arranged symmetrically in front of the animal. Knobs are numbered from left to right, with the rectangular knob placed at the midline, the cylinders aligned with the animal’s shoulders, and the spheres lateral to the left and right shoulders. Optical sensors detect placement of the left and right hands over the square LEDs, providing standardized start points for reach (“hand plates”). Top right: visual cues presented on a computer monitor indicate the target object(s) and task stages to the animal. A: trial start. The computer initiates a trial by displaying 5 white icons on its monitor and lighting the LEDs on the hand plates. B: knob cue. The rewarded object is signaled when 1 icon turns red after both hands rest on the hand plates; the other icons remain white. C: GO-cue. The animal is trained to wait until the cue color on the target object icon changes from red to green before beginning to reach. D: reward signal. Correct performance is signaled by changing the target icon color from green to yellow and is accompanied by a juice reward. Bottom: images of hand actions performed by monkey AR during the task. A: placement of the left and right hands on the “hand plates” during cue presentation. B: pull of knob 1 with the left hand. C: pull of knob 5 with the right hand.
To make the knob array equally accessible to both hands, we designed a new shape panel containing five knobs spaced 8 mm apart and arrayed symmetrically in front of the animal. The knob panel was positioned vertically on the front of the primate chair (Fig. 1, left), with the middle object (knob 3) centered at the animal’s midline. Objects of the same size and shape were placed at symmetrical positions on the panel. For purposes of reference, we numbered them from left to right. Knobs 1 and 5 were 2-cm-diameter spheres placed at the extreme left and right locations on the panel lateral to the two shoulders. Knobs 2 and 4 (1.5-cm-diameter × 2.5-cm-long cylinders) were aligned to the left and right shoulders, respectively. Knob 3 (a 3.8-cm × 2-cm × 2-cm rectangular block) was placed at the midline, equally accessible to the two hands. The knob shafts were spring loaded so that the animal needed to pull and hold them actively (Fig. 1, bottom center and right); switches inside the vertical panel detected knob pull and return to the rest position.
The shape panel also included a pair of “hand plates” at its base equipped with optical sensors to detect hand placement and removal. The hand plates were designated by square LEDs that were illuminated at the start of each trial and served as the common (waist level) start location from which reach was initiated. The animal rested its hands with the fingers extended over the square LEDs during the cue period without exerting force (Fig. 1, bottom, A). The LEDs were turned off once both hands had been placed successfully and had remained in place for at least 250 ms.
Each session was divided into blocks of 60–80 trials lasting 15–18 min. Individual objects were tested in pseudorandom order: each knob was presented 10–15 times per block. Each trial consisted of three major segments: 1) the trial instructions, 2) the movement epoch, and 3) the intertrial interval. The trial sequence was programmed in Superlab Pro on an iMac computer interfaced to the knob panel through a National Instruments USB digital I/O device (NI USB-6501) and custom electronics.
Sensors within the shape panel provided electronic signals of hand actions to the behavioral computer and to the neural recording systems. Optical sensors in each hand plate detected hand placement on the hand plate (stage 0) and hand movement off the hand plate at the start of reach (stage 1). Switches on each knob shaft detected knob pull (stage 4) and return (stage 6) during active manipulation. Logic circuits within the behavioral interface module signaled key task stages: 1) the hand plate AND-gate detected when both hands were in place before the task instructions were delivered, and 2) the knob OR-gate signaled when any knob was pulled (stage 4) and released (stage 8). The Superlab Pro software stored time stamps of key task events: the trial start and end times, the onset time of task cues, reward times, and the animal’s reaction times. Other task stage times were monitored with digital video as noted below.
Hand Kinematics during Specific Task Stages
The behavioral computer initiated each trial (stage 0, trial start) by displaying five white icons on a gray monitor screen (Fig. 1, top right, A, start) and by sounding a tone to get the subject’s attention. It also illuminated the square LEDs on the hand plates: the animal was trained to place both hands over the LEDs to indicate readiness to proceed and to obtain subsequent task cues.
During the ensuing “cue period,” the shape and location of the rewarded object were signaled by coloring one icon red (Fig. 1, top right, B, red cue). The animal was trained to maintain both hands on the hand plates during a further 800- to 1,000-ms “delay period” until the cue color changed from red to green (Fig. 1, top right, C, green cue) before reaching to the instructed target. This protocol is analogous to waiting at a traffic light. The delay period allowed us to distinguish neural activity related to motor planning from precontact activation of the arm and hand. If the animal lifted either hand from the hand plate before the green “Go-signal” occurred, the icon color turned white, and the trial start sequence was repeated.
The “movement period” began after the green Go-signal: the animal had to reach (stage 1) toward the cued knob within 5 s, touch or contact it (stage 2), grasp (stage 3) and pull (stage 4) it, and maintain the hold posture (stage 5) for a further 800 ms to obtain a reward. The animal could use either hand to pull the knob but typically used the hand closest to the target object, leaving the other hand resting on its hand plate. The animal was free to select the particular grasp posture used for each object that was natural, comfortable, and fluid.
The task computer signaled correct responses with a yellow icon (Fig. 1, top right, D) and a tone, plus several drops of juice. It signaled incorrect responses (wrong knob or short hold time) by flashing all five icons in black, sounding a minor chord, and providing no reward.
The trial ended when the knob was returned to the start position (stage 6), the grip was relaxed (stage 7), and the knob was released from the hand (stage 8). The monitor screen was blanked to gray during the subsequent 2-s intertrial interval (stage −1). The next trial start (stage 0) was signaled by the reappearance of the five white icons, the auditory tone, and illumination of the hand plate LEDs. Background firing rates were measured during the 500-ms interval (stage −1) preceding each successive trial start.
The task timeline is shown in Fig. 2; task stage numbers are similar to those used in our earlier reports (11–20), allowing the average firing rates in this report to be compared with those reported previously for S1, M1, and PPC in other animals that used only the contralateral hand. Task actions were self-paced, without additional time constraints on trial initiation or stage duration beyond the 2-s pause of the intertrial interval.
Figure 2.
Task timeline. Top trace (“computer”) is an analog signal of important task events: trial start (A), red knob cue (B), green GO-cue (C), and reward (D). Other traces show digital flags marking placement of the left and right hands on and off each hand plate, hand plate AND-gate marking when both hands are on their respective hand plates, knob OR-gate indicating pull and return of any knob, and knob ID switches that monitor pull and release on each knob. The trace marked “video kinematics” is a voltage signal marking events recorded on the digital video camera: premovement period (0; 250 ms prior to the start of reach), reach (1), contact (2), grasp (3), knob pull (4), knob hold (5), knob return (6), relax grip (7), and release knob (8).
Actions of the hand and arm during task stages 2, 3, and 5 were monitored off-line with digital video (DV) recordings of the animal’s behavior synchronized to neuronal spike trains, as described previously (11–20). DV camcorders provided lateral and/or frontal images of the animal and the workspace at 30 frames/s, allowing us to use the video time code to define the onset times of previously recorded task actions to within 33.3 ms. Viewing hand behaviors in frame-by-frame mode allowed us to compile event logs of the task stages on each trial, using the frame time code (h:min:s:frame) to mark events of interest and to visualize how hand postures evolved over time. This procedure is similar to the video replays used in professional sports (e.g., Major League Baseball) to review umpires’ calls of close plays. Eye movements and gaze are also visible in the video clips but were not quantified.
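The frame-rate arithmetic above can be sketched in a few lines of Python. This is our illustration only; the original event logs were compiled by hand from the video time code, and the function name is hypothetical:

```python
def timecode_to_seconds(timecode: str, fps: float = 30.0) -> float:
    """Convert an h:min:s:frame video time code to elapsed seconds.

    At 30 frames/s each frame spans ~33.3 ms, which bounds the timing
    accuracy of task events scored from the video in frame-by-frame mode.
    """
    h, m, s, f = (int(part) for part in timecode.split(":"))
    return h * 3600 + m * 60 + s + f / fps
```

For example, a frame stamped 00:01:05:15 maps to 65.5 s into the recording, and adjacent frames differ by 1/30 s, the stated 33.3-ms resolution.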
Recording Sites in the Cerebral Cortex
Extracellular recordings of neural spike trains were made with multielectrode arrays in the right hemisphere of monkey AR through a 25-mm-diameter, low-profile MRI-compatible plastic chamber (Fig. 3A) centered over the precentral gyrus (motor cortex) with techniques for chronic single-unit recordings detailed in Refs. 11–20. The chamber was permanently implanted on the skull over the cortical region of interest (3 mm rostral to the central sulcus and 20 mm lateral to the midline; large red dot, Fig. 3A) in an aseptic surgical procedure under general anesthesia. The dura was left intact to prevent infection and contain brain swelling.
Figure 3.
Cortical recording sites in monkey AR. A: photograph of the recording chamber implanted on the skull after craniotomy. Note the location of the central sulcus and middle cerebral blood vessels viewed through the intact dura; this region was avoided during recordings. B: MRI coronal section through the center of the recording chamber. C and D: reconstruction of the cortical surface topography with Brainsight software from MR slices in coronal, sagittal, and axial planes; yellow hexagon outlines the computed chamber perimeter. The recording chamber was centered over the hand area of primary motor (M1) cortex in the right hemisphere (red marker); recording tracks were run in both M1 and primary somatosensory (S1) cortex. The blue dot in C shows the projected recording sites from rostral M1 for data in Figs. 5–9. The blue dot in D shows the projected recording sites from area 2 of S1 for data in Figs. 11–15.
The implant site was determined after MR imaging of the whole brain to enable brain reconstruction with Brainsight software (Rogue Research; Fig. 3). The chamber was surgically implanted rostral to the central sulcus (red dot over brain MRI reconstruction in Fig. 3B), providing access to the hand area of M1 (caudal to the arcuate sulcus at the level of the arcuate spur) and area 2 of S1 cortex (blue dots in Fig. 3, C and D). The chamber was sealed with a transparent Lucite cap, except during recordings, when the cap was replaced with a sterile Silastic membrane held in a stainless steel ring. We avoided recording sites in regions close to the central sulcus or the middle cerebral artery.
Buprenex (buprenorphine hydrochloride, 0.01 mg/kg twice a day) was administered after surgery for a 4-day period, to alleviate postoperative pain. Solu-Medrol (methylprednisolone sodium succinate; 5 mg/kg im) was given immediately after surgery and on the following day, to reduce brain swelling; no behavioral or recording sessions were conducted in this period. Intraoperative antibiotics [Baytril (enrofloxacin) solution; 1 mg/kg] were supplemented with once-daily doses for 6–7 days postoperatively. Wound margins were cleansed with surgical sponges soaked in chlorhexidine solution and/or hydrogen peroxide. Topical antibiotics (gentamycin or Baytril) were applied as necessary to the tissue surrounding the implant site. The interior of the chamber was rinsed with sterile saline before and after each recording session.
Microelectrode Recording and Analysis Techniques
During recording sessions, up to four Epoxylite-insulated tungsten microelectrodes (FHC, model UEWLFELE2N1X; impedance = 2 MΩ) were advanced into the brain through the intact dura by a computer-controlled multiple-electrode positioning system (Alpha Omega EPS-MT). The system allowed simultaneous recordings from four independently mobile tungsten microelectrodes placed in 0.8-mm-diameter guide tubes spaced 1 mm apart within a larger cannula; none of the guide tubes penetrated the dura or the brain. Electrode 1 was placed most rostrally and electrode 3 most caudally; electrodes 2 and 4 flanked electrodes 1 and 3 medially and laterally during each session. We labeled each neuron studied with the session number, the electrode guide tube number (E1–E4), and the serial order of recording in the session. Sterile calibrated positioning guides placed within the chamber lumen specified the site of microelectrode insertion. Microelectrodes were removed from the brain after each recording session, and new sites were selected in subsequent sessions.
Spike trains recorded from each electrode were amplified and filtered (Cyberamp, Axon Instruments; band pass 100 Hz to 3 kHz), digitized at 48 kHz (16 bit) or 32 kHz (12 bit) by the DV camcorders, and stored as AIFF files together with QuickTime video records of the hand actions as described in Refs. 11–21. A list of spike time stamps obtained during continuous data sampling was compiled for each neuron directly from the AIFF files by using interactive clustering to distinguish neuronal action potentials from noise and to separate the spike waveforms of up to four different neurons recorded from each electrode into individual traces (11, 21).
Spike trains of individual neurons were linked to the task stages and quantified with custom software (21) written in IGOR Pro 5 (WaveMetrics, Inc.). We constructed continuous displays of spike trains, firing rates, and behavioral events within each video clip from video event tables compiled from behavioral flags of the eight task stages (trial start, hands on, approach, contact, etc.). Text files listing time stamps of each behavioral flag were stored as Excel spreadsheets, allowing us to align spikes for multiple trials in rasters and peristimulus time histograms (PSTHs) keyed to specific task actions as well as to multiple events within each trial.
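The event-alignment step can be illustrated with a minimal pure-Python sketch. The original analyses used custom IGOR Pro routines; the function names, window, and bin width below are our own illustrative assumptions:

```python
def align_spikes(spike_times, event_times, window=(-1.0, 2.0)):
    """Return one list of event-relative spike times per trial (raster rows)."""
    lo, hi = window
    return [[t - ev for t in spike_times if lo <= t - ev < hi]
            for ev in event_times]


def psth(raster_rows, window=(-1.0, 2.0), bin_width=0.05):
    """Peristimulus time histogram: mean firing rate (spikes/s) per time bin."""
    lo, hi = window
    n_bins = round((hi - lo) / bin_width)
    counts = [0] * n_bins
    for row in raster_rows:
        for t in row:
            b = int((t - lo) / bin_width)
            if 0 <= b < n_bins:
                counts[b] += 1
    edges = [lo + i * bin_width for i in range(n_bins)]
    # Divide pooled counts by (trials x bin width) to get spikes/s.
    rates = [c / (len(raster_rows) * bin_width) for c in counts]
    return edges, rates
```

The same raster rows can be realigned to any behavioral flag (knob cue, GO-cue, contact, pull) simply by passing a different list of event times, which is how the multi-event alignments described above are produced.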
Average firing rates per task stage (i.e., total spike counts between stage markers divided by the stage duration) were also compiled for each neuron and averaged across trials, to provide objective classifiers of firing patterns in each brain area studied (see Figs. 9 and 15).
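The per-stage rate and the normalization described in METHODS are simple arithmetic; the sketch below is our own illustration (hypothetical function names, time stamps assumed to be in seconds):

```python
def stage_firing_rate(spike_times, stage_start, stage_end):
    """Mean firing rate in one task stage: spike count / stage duration (s)."""
    n_spikes = sum(1 for t in spike_times if stage_start <= t < stage_end)
    return n_spikes / (stage_end - stage_start)


def normalize_rates(stage_rates):
    """Scale per-stage mean rates by the peak mean rate in the "best" stage."""
    peak = max(stage_rates)
    return [r / peak for r in stage_rates]
```

Averaging `stage_firing_rate` across trials for each of the task stages yields the stage-by-stage profiles plotted in the average firing rate figures, with the best stage normalized to 1.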
Figure 9.

A–I: mean firing rates per task stage (±SE) for 9 primary motor cortex (M1) neurons comparing actions of the left (LH; blue) and right (RH; red) hands during the 8 task stages and the pretrial interval (stage −1). Firing rates increased at the trial start (stage 0). Both excitation and inhibition (or response suppression) were stronger when the contralateral (left) hand was used. Most of these neurons responded at highest rates during stage 1 (reach to the target object) with the contralateral (left) hand (D, E, and G–I) but 3 of the 9 M1 neurons showed similar firing rates for both hands (A–C).
Figure 15.

Somatosensory cortex (S1) neurons in the right hemisphere respond preferentially to actions of the contralateral (left) hand (LH; blue), responding weakly or not at all to the ipsilateral (right) hand (RH; red). A–I: average firing rates per task stage (±SE) from 9 S1 neurons comparing actions of the left and right hands during the task stages and during the pretrial interval (stage −1) or at trial start (stage 0); same stage numbering code as in Fig. 5. All of these cells responded strongly to object acquisition and manipulation during stages 1–4 on left hand trials (blue) but not on trials testing the right hand (red). The strongest responses typically occurred at contact (stage 2), grasp (stage 3), and pull (stage 4), but only weakly during static hold (stage 5). Rasters and/or peristimulus time histograms (PSTHs) for these neurons are presented in Fig. 13 (A–D), Fig. 10 (E), Fig. 11 (F and G), and Fig. 12 (B and A).
Average firing rates per stage on each trial also served as inputs for statistical analyses. A repeated-measures analysis of variance model (StatView, SAS Institute) tested whether firing rates were modulated significantly between individual task stages and the pretrial interval (stage −1) on each trial (F test, P < 0.05) and whether knob identity (size, shape, and/or location) or the hand used had a significant effect. We normalized firing rates by dividing the mean firing rate per stage by the peak mean rate in the “best” stage.
Hand laterality effects on neural firing rates were also assessed with a handedness index (HI) that compared mean firing rates per task stage evoked by actions of the two hands:

HI = (R_contra − R_ipsi)/(R_contra + R_ipsi)

where R_contra and R_ipsi denote the mean firing rates per task stage evoked by actions of the contralateral and ipsilateral hands, respectively.
Positive HI values indicate neural preferences for the contralateral hand; negative HI values denote preferences for the ipsilateral hand. An HI = 0 indicates equal preference for the two hands, an HI = +1 indicates exclusive activation by the contralateral hand, and an HI = −1 indicates exclusive activation by the ipsilateral hand.
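Assuming the standard laterality-index form implied by the boundary values above (our reconstruction, not a formula quoted verbatim from the text), the index can be computed as:

```python
def handedness_index(rate_contra: float, rate_ipsi: float) -> float:
    """Handedness index HI = (C - I) / (C + I) on mean rates per task stage.

    HI = +1: exclusive contralateral activation; HI = -1: exclusive
    ipsilateral activation; HI = 0: equal preference for the two hands.
    Assumes at least one hand evokes a nonzero mean rate.
    """
    return (rate_contra - rate_ipsi) / (rate_contra + rate_ipsi)
```

This normalized-difference form is bounded between −1 and +1 regardless of a neuron's overall firing rate, which makes HI values comparable across neurons and task stages.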
RESULTS
To investigate cortical neural mechanisms that mediate prehension in macaques, we compared actions of the left and right hands in a trained grasping task. Spike trains of 29 neurons in M1 and 38 neurons in S1 were selected for quantification on the basis of near-ideal spike isolation during 52 recording sessions from the hand area of the right hemisphere. We measured the relative timing, amplitude, and synchronization of cortical spike trains in the right hemisphere as the animal used the left and/or right hands to manipulate objects that differed in shape and location in the workspace.
Although it has been widely reported that each hemisphere controls and/or monitors actions of the contralateral half of the body, we report here that sensorimotor areas in the right hemisphere represented actions of both hands and arms, although the contralateral hand dominated these responses. In our reach-to-pull prehension task, where the animal had free choice and could use either hand to pull and hold the objects, this subject usually grasped the cued object with the closest hand. Moreover, the neurons studied during task performance often responded similarly to the same actions performed by either hand.
Responses to the Bimanual Prehension Task in Primary Motor Cortex
As we reported previously for other animals studied with a reach-to-lift task (11–20), actions of the contralateral hand in the current reach-to-pull task evoked complex responses in primary motor cortex (M1) that spanned multiple task stages and were quite diverse. Figures 4 and 5 show exemplar recordings from an M1 site 3 mm anterior to the chamber center and 1 mm lateral to it (blue dot in Fig. 3C). The rasters and PSTH in Fig. 4A are aligned to the onset of the knob instruction cue (red markers) after the animal placed both hands on the hand plate start location. This neuron (AR50_E2U1) showed a slight increase in firing rate coincident with the red knob cue and maintained steady firing until the green GO-cue appeared (green markers). The firing rate increased at the start of reach (dark blue markers) when the left hand rose from the hand plate and reached toward the objects but then dropped suddenly below the pretrial rate before the knob was pulled (light blue markers). Firing stopped abruptly midway through the subsequent movement period after the hand contacted, grasped, and pulled the knob (Fig. 4B, light blue marker), then resumed at lower rates during static hold of the knob; firing rates increased again after the juice reward was delivered (magenta markers) and the knob was released from the hand.
Figure 4.
Spike trains of primary motor cortex (M1) neurons reflect reach and grasp actions of the contralateral hand. Rasters and peristimulus time histograms (PSTHs) for an M1 neuron (neuron AR50_E2U1) recorded at the site of the blue dot in Fig. 3C aligned to electronic behavioral control signals. A: responses aligned to the onset of the red knob instruction cue (red symbols) after the animal rested its hands on the hand plates. This neuron increased its firing rate coincident with the knob instruction cue (red symbols) and maintained steady firing until the GO-cue (green symbols) appeared. Reach began shortly thereafter, when the left hand rose from the hand plate. The firing rate increased as reach began (dark blue symbols). Firing stopped abruptly midway through the reach movement period as the hand contacted and pulled the object (light blue symbols). B: responses aligned to knob pull (light blue symbols). The neuron remained silent during pulling and at the start of static hold. The firing rate remained low during hold and rose after the juice reward was delivered (magenta symbols) and the trial ended.
Figure 5.

Primary motor cortex (M1) neuron excited by reach and inhibited by grasp for all knobs tested. The rasters (A–C) and peristimulus time histograms (PSTHs) (D) are aligned to digital video images of hand contact with the individual knobs. Same neuron as in Fig. 4. The firing rate increased sharply during the initial 300-ms pretrial period and dropped abruptly before the hand touched the knob (red marker); the firing rate returned to baseline when the object was released from grasp. Dark orange markers in each raster denote the start of reach; red, magenta, and blue markers show hand contact (red) with each object; grasp (magenta) began 1 video frame after contact. Knob pull (dark blue) and static hold (light blue) were each delayed by single video frames. The green markers near the trial end indicate knob return to the start position, relaxation of grasp (dark green), and knob release (cyan). Silencing of the spike train occurred ∼150 ms before contact. A: knob 1: large sphere. B: knob 2, medium cylinder. C: knob 3, rectangular block. Responses to all 3 objects and reach directions are similar in time course and amplitude. E: average firing rates (±SE) of this neuron for each knob and for each task stage are highest during approach (stage 1) and lowest during object manipulation (stages 2–4). Firing rates are similar for the 3 knobs. Symbols in the key show the color-coded markers denoting the individual task stages in rasters and PSTHs.
The rasters and PSTHs in Fig. 5 show the same data aligned to left hand contact with the knobs (red markers) from the digital video recordings of task performance and subdivided by specific objects. Firing rates increased sharply during reach when knobs 1, 2, and 3 were tested during the 750-ms approach period (stage 1, dark orange marker) and dropped abruptly 250 ms before the hand touched, grasped, and pulled the knob (stages 2–4; red, magenta, and blue markers); the firing rate returned to baseline when the object was released from grasp (stages 6–8, green markers). It is noteworthy that the highest firing rates occurred in anticipation of contact (in stage 1) rather than in the later stages, when the subject actually grasped and pulled the knob (Fig. 4B). Responses to all three objects and reach directions were similar in magnitude and time course, as quantified in the PSTHs and average firing rate graphs below. No data are shown here for knobs 4 and 5 because they were pulled with the right hand, whose actions did not excite this neuron (see Fig. 8A).
Figure 8.
Diverse responses to the task during 2 different recording sessions. The peristimulus time histograms (PSTHs) of 6 neurons recorded in primary motor cortex (M1) on tracks 50 (A–C) and 51 (D–F) in monkey AR during performance of the reach-to-pull task. Responses of each neuron were aligned to hand contact with the knobs as determined from digital video images. Responses are strongest for the contralateral (left) hand (LH; blue traces); responses to the ipsilateral (right) hand (RH; red traces) are weaker, delayed, or absent. Colored markers above the PSTHs denote mean onset times of the task stages; timings are similar for both hands. Although some neurons responded similarly to both hands, all fired at higher average rates when the contralateral hand was used. Inhibition was also more pronounced when the contralateral hand was used (A and F).
In contrast, the spike trains in Fig. 6 illustrate simultaneous recordings made 1 day later from a neighboring site in M1, where neurons responded similarly to actions of both the left and right hands but the contralateral (left) hand evoked greater task-related modulation of firing rates (Fig. 6, A and C). The neuron in Fig. 6, A and B, was recorded on electrode 1 (neuron AR51_E1U1). It increased firing at the trial start (light orange marker) and responded strongly at the start of reach (dark orange marker); firing rates peaked at contact (red marker) and then decreased as grasp was relaxed (green markers) and the object was released from the hand (cyan marker). Responses to actions of the ipsilateral (right) hand were similar in polarity but weaker and shorter in duration when knobs of the same shape and body-centered location were tested.
Figure 6.

Task-evoked spike rasters to actions of the right and left hands recorded simultaneously on 2 different electrodes in primary motor cortex (M1). Responses to actions of the contralateral (left) hand (A and C) are stronger than those evoked by the ipsilateral (right) hand (B and D), but the response polarity of each neuron is the same for both hands. Colored markers on the raster denote the mean onset times of the task stages on each trial (same color code as in Fig. 5). The contralateral (left) hand evoked greater and longer-duration task-related modulation of firing rates, and the response time course was similar regardless of which hand was used. Peristimulus time histograms (PSTHs) for these neurons are shown in Fig. 8, D and F. These neurons did not distinguish the individual knobs grasped during the task. A and B: neuron AR51_E1U1 increased its firing rate before reach began and showed peak responses just prior to contact. Responses to the right hand were weaker and shorter in duration. Task-evoked spike trains ended when the object was released, particularly from the left hand (green markers). C and D: neuron AR51_E2U2, recorded on a neighboring electrode, was inhibited at the start of reach and remained quiet until contact when firing rose, particularly for knob 1.
The neuron in Fig. 6, C and D, was recorded more caudally and medially on electrode 2 (neuron AR51_E2U2); it was silenced or inhibited at the start of reach by the left hand and remained quiet until the hand contacted the knob, when firing rates rose (Fig. 6C), particularly for knob 1. Responses returned to baseline at object release from the hand (green markers). Spike suppression during reach was also less effective when the animal used the right (ipsilateral) hand (Fig. 6D). The rebound excitation at contact was also less pronounced in trials of the ipsilateral hand. Note that although we did not directly instruct the animal on which hand to use on individual trials, the animal pulled the middle knob (knob 3) with the left hand on 9 trials and with the right hand on 10 trials for both neurons.
Object features (shape and workspace location) had minimal effect on task-evoked spike trains. For example, knobs 1 and 5 were matched in shape and size, but knob 1 was aligned to the left shoulder and knob 5 to the right shoulder. Nevertheless, the evoked spike trains were longer lasting and more intense when tested with the contralateral (left) hand (Fig. 6A) than with the ipsilateral (right) hand (Fig. 6B). Similarly, responses evoked by knob 3, placed at the midline equidistant from the two shoulders, were higher in rate and longer in duration when the animal used the contralateral (left) hand (Fig. 6A); even so, each of these neurons responded similarly to actions of both the left and right hands.
To better visualize the effects of task actions and handedness on M1 spike trains, we compared the spike trains of six neurons recorded simultaneously in session AR51 during repeated tests of the left and right cylinders (knobs 2 and 4) in Fig. 7. These knobs were similarly shaped and sized and were placed at similar body-centered workspace locations (in front of the left and right shoulders). The firing patterns were similar for both hands, but spike density was higher when the contralateral (left) hand was used to acquire knob 2 than when the ipsilateral (right) hand reached for and grasped knob 4. All but one of these neurons (E2U2) responded vigorously to reach and acquisition of knob 2 by the contralateral (left) hand and then fired at lower rates during holding by that hand. Object release from the hand during stages 6–8 was accompanied by weak spike bursts. These same task actions evoked lower firing rates but similar firing patterns when performed by the ipsilateral (right) hand during tests of knob 4. The common firing patterns evoked by actions of the two hands provide a neural substrate for signaling actions of one or both hands in M1.
Figure 7.
Multielectrode, simultaneously recorded spike rasters from 6 primary motor cortex (M1) neurons in session AR51 comparing actions of the right and left hands when testing the cylindrical knobs. Each set of rasters illustrates responses of the same group of neurons identified on the right. Firing rates of each of these neurons were lower when the ipsilateral (right) hand was used. Complete rasters for E1U1 and E2U2 are provided in Fig. 6.
During simultaneous recordings from groups of neighboring microelectrodes (Fig. 8), we found that M1 firing patterns differed considerably from neuron to neuron. Many of these cells responded 300 ms or more before contact and fired more vigorously to hand actions during reaching than to static postures during holding. Although some neurons were excited strongly during reach (Fig. 8A) and others responded mainly to grasp and knob pull (Fig. 8C), the majority were broadly tuned, responding at similar rates to reach and object manipulation (Fig. 8, B, D, and E). A few neurons were inhibited during one or more task stages (Fig. 8, A and F). Some also responded to similar actions by the ipsilateral hand but at longer latencies and lower rates (Fig. 8, D–F, red traces).
Rasters and PSTHs depict the inputs and outputs of each neuron, spike by spike, as a function of time during the task. Another way to evaluate task responses is to quantify and compare the magnitude and duration of responses of individual neurons for each task action or sensory input. Figure 9 shows the average firing rates per task stage (total spike count between stage markers divided by the stage duration). These graphs compare and quantify how M1 neurons alter their firing rates as functions of actions performed on task trials.
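The per-stage measure just defined (spike count between stage markers divided by stage duration) can be computed as follows. The helper name and the example spike times are hypothetical; stage boundaries are the per-trial marker times.

```python
import numpy as np

def stage_firing_rates(spike_times, stage_bounds):
    """Mean firing rate (spikes/s) in each task stage of one trial.

    spike_times  : 1-D array of spike times (s)
    stage_bounds : (n_stages + 1,) array of stage-onset times (s);
                   stage k spans [stage_bounds[k], stage_bounds[k + 1])
    """
    bounds = np.asarray(stage_bounds, dtype=float)
    counts, _ = np.histogram(np.asarray(spike_times), bins=bounds)
    durations = np.diff(bounds)           # stages may have unequal lengths
    return counts / durations

# Illustrative trial: three 1-s stages containing 5, 10, and 2 spikes
spikes = np.array([0.1, 0.3, 0.5, 0.7, 0.9,
                   1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.85, 1.9,
                   2.2, 2.8])
rates = stage_firing_rates(spikes, np.array([0.0, 1.0, 2.0, 3.0]))
```

Because the rate is normalized by each stage's duration, brief stages (e.g., contact, lasting a single video frame) and long stages (e.g., the 750-ms approach) can be compared on the same axis.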
The highest firing rates in M1 occurred during reaching with the left hand (stage 1, approach): four of the nine M1 neurons illustrated showed peak firing rates when approaching the knob panel with the contralateral (left) hand (Fig. 9, E and G–I). One cell (Fig. 9D) showed peak firing at the trial start (stage 0), when the animal reached to the hand plates from outside the workspace and obtained the task cues. The remaining M1 neurons responded most vigorously to hand-object interactions, when the object was first touched or grasped in stages 2–4 or held statically in either hand (stage 5). None showed maximum firing when the hand was withdrawn from the knobs.
Responses to the Bimanual Prehension Task in Somatosensory Cortex
The neurons we recorded in area 2 of primary somatosensory (S1) cortex exhibited more stereotyped task responses than M1 neurons, responding primarily to object manipulation by the contralateral (left) hand (Fig. 10, blue PSTH). There was little neural activity in S1 before hand contact with the knob. Firing rates of S1 neurons typically increased slightly from baseline during reaching, beginning 50–100 ms before knob contact by the left hand, and peaked at contact, when the contralateral (left) hand first touched or grasped the knob (stages 2 and 3, red and magenta markers). High firing rates were maintained during grasping and decayed rapidly after the knob was pulled (stage 4, dark blue marker). Firing rates were generally lower during holding (stage 5, light blue marker) but not silenced like some M1 neurons. S1 neurons often fired a second spike burst when the grip was relaxed and the object released from the hand (stages 6–8, green and cyan markers).
Figure 10.
Rasters and peristimulus time histograms (PSTHs) for a primary somatosensory cortex (S1) neuron in area 2 that responded to task actions of the contralateral (left) hand (LH) but not to those of the ipsilateral (right) hand (RH). Session AR40 was recorded 4.0 mm posterior and 1.0 mm lateral to the center of the recording chamber. This neuron fired at peak rates at contact (stage 2) and grasp (stage 3) but at lower rates during reach (stage 1) and knob pull and holding (stages 4 and 5).
Performance with the ipsilateral (right) hand evoked no task-related modulation (Fig. 10, red PSTH); this neuron, like other S1 cells, did not respond to similar actions performed by the ipsilateral hand.
The knob identity appeared to have little effect on S1 firing rates; spike trains in the rasters of Fig. 10 appear similar in tests of the left sphere (knob 1), left cylinder (knob 2), and center rectangle (knob 3), although these objects had different shapes (and surface curvature). These geometric features were rarely encoded by S1 average firing rates, perhaps because the objects had similar overall dimensions and were grasped with similar hand postures.
S1 neurons generally distinguished object shape by their duration of firing rather than by their firing rates. For example, the neuron shown in Fig. 11 fired longer-duration responses to grasp of knob 1 with the left hand than to grasp of knob 2 or 3 with that same hand. The superimposed color-coded PSTHs below the raster show that all three knobs evoked similar high firing rates during acquisition by the left hand but that the rounded shape of knob 1 evoked slightly longer-lasting spike bursts than the elongated forms of knobs 2 and 3. The long-duration spike bursts may reflect the relative comfort of holding the sphere that lacked edges or corners, as the animal was not prompted directly to return the knob or to release grasp.
Figure 11.
Neural responses in primary somatosensory cortex (S1) are strongest at contact and grasp. Rasters and peristimulus time histograms (PSTHs) aligned to hand contact with the knob and grouped by knob. Markers above the PSTH denote the mean onset times of the task stages. The neuron fired a strong burst at contact (stage 2) and then maintained this rate during grasp (stage 3) and knob pull (stage 4). Firing rates declined during hold (stage 5) and then returned to baseline when the knob was released (stage 8). Firing rates did not differ between knobs. Knob 1, left sphere; knob 2, left cylinder; knob 3, center rectangle; knob 4, right cylinder. Note that this neuron responded most intensely at initial contact with the knob(s), regardless of shape. It fired at lower rates during reach, pull, and hold and in the pretrial epoch and after release of grasp (stage 8).
Both rapidly adapting (RA) and slowly adapting (SA) responses occurred in S1. Figure 12 illustrates responses of a pair of simultaneously recorded S1 neurons in session AR41. Both neurons increased their firing rates during approach (stage 1) and fired at peak rates at contact (stage 2) when the animal used the left (contralateral) hand (blue PSTHs). The RA neuron increased its firing rate late during stage 1 (approach) and fired maximally at contact (Fig. 12A). It was silenced during holding (stage 5) with the left hand but fired a second late burst when grip was relaxed and the object released from the hand during stages 6–8 (green markers). The SA neuron (Fig. 12B) also began firing late in the approach stage and fired maximally at contact by the left hand. Moderate spike rates were sustained during knob pull and holding. The SA neuron maintained moderate firing rates when the object was held in the left (contralateral) hand; its activity returned to baseline after object release (stage 8).
Figure 12.
Rasters and peristimulus time histograms (PSTHs) for a pair of simultaneously recorded primary somatosensory cortex (S1) neurons on track 41 that responded to contact and release of objects with the contralateral (left) hand but showed little or no task-related activity when the ipsilateral (right) hand was used. Both neurons began firing at low rates during approach, and their firing rates peaked at contact (stage 2) when the contralateral (left) hand was used. A: the rapidly adapting neuron responded at contact (stage 2) and was silent during holding (stage 5) but fired brief bursts when the object was released from the hand (stages 6–8). B: the slowly adapting neuron maintained moderate firing when the object was touched and held in the hand (stages 2–5); activity returned to baseline after object release (stage 8). Neither of these neurons responded to actions of the ipsilateral (right) hand. Track 41 was recorded in area 2, 4.0 mm posterior to the center of the recording chamber and 0.0 mm lateral to the chamber center (blue dot in Fig. 3D).
The diversity of S1 responses during the prehension task is illustrated in Fig. 13 for five neurons recorded simultaneously on electrodes 1, 2, and 3 in session AR38. Although the firing patterns and rates of these neurons differ, all five of them fired maximally at contact, grasp, and pull (stages 2–4) and at lower rates during reach and hold (stages 1 and 5). High firing rates were sustained after knob pull (stage 4), and most of them fired less intensely or were silent during holding (stage 5). Tonic activity in the pretrial period was also low (stage 0, gold marker).
Figure 13.
Primary somatosensory cortex (S1) spike trains are similar for simultaneously recorded neurons. Top: spike rasters from 5 S1 neurons in session AR38 (4.0 mm posterior to the chamber center and 2.0 mm lateral) comparing responses to trials 1–6 of knob 1 (left sphere). Each raster set illustrates responses of the same neurons identified in the peristimulus time histogram (PSTH). Colored markers above the rasters denote the stage onset times on those trials (same color code as in Fig. 5). The lowest spike train in each set is from the same neuron as in Fig. 11. Bottom: PSTHs for all trials peak at contact for these 5 neurons and return to baseline at release (stage 8). All 5 cells increased their firing at or slightly before contact and maintained activity for similar periods. LH, left hand.
We also noted brief spike bursts during reaching as the left hand was lifted from the hand plates before contact (dark orange markers, Figs. 11–13), but this early activity was intermittent and not sustained. Other S1 neurons (Fig. 14) showed brief precontact spike bursts as the hand opened and was preshaped to facilitate grasping. These precontact spikes suggest that S1 neurons may function in an anticipatory manner because of efference copy of motor commands for reaching and grasping or may reflect proprioceptive responses of hand afferents to digit extension during hand shaping (hand opening) before contact or at knob release. This preparatory activity may be similar in origin to the off-responses seen when the hand opened at the trial end and was withdrawn from the knob (stages 6–8).
Figure 14.
Spike rasters for 2 simultaneously recorded precontact activated neurons in session AR41. The neuron recorded on electrode 4 (bottom) fired regular spike bursts during reach and a second burst when the hand opened prior to release of the knob after holding. The neuron recorded on electrode 1 (top) fired at high rates immediately prior to hand contact with the knob and again as the grip was relaxed and the object released by the hand.
The average firing rate graphs in Fig. 15 for nine S1 neurons quantify their firing rates during actions performed during stages of the prehension task. Task-activated neurons in S1 responded vigorously when the contralateral (left) hand contacted and/or grasped an object during stages 2 and 3 (Fig. 15, blue traces) but failed to respond or responded weakly to the same actions performed by the ipsilateral (right) hand (Fig. 15, red traces).
The highest firing rates in this population occurred at contact or grasp (stages 2 and 3), when touch began: eight of the nine neurons fired maximally when the left hand touched the knob(s). Seven of the nine neurons fired at higher rates than baseline during approach (stage 1), when the hand was preshaped before touching the object (Fig. 15, A–C, E, and G–I). Static holding with the contralateral hand in stage 5 was the least preferred action in this population (Fig. 15, A, B, and E–I). Firing rates did not differ between stages 2–4, probably because each of these stages lasted at most one video frame, too brief to resolve separate time epochs. Thus, neurons in S1 provide important signals of hand motion on and off objects during grasping tasks and fire at lower rates to static hand postures.
Integration of Hand Kinematics and Dynamics in M1 and S1 Cortex
To quantify and compare the relative magnitude and duration of handedness effects in M1 and S1 cortex of the right hemisphere, we normalized and averaged mean firing rates per task stage for each neuron by dividing the average firing rate per stage by the cell’s peak mean rate in the most preferred stage (Fig. 16, right).
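The normalization just described, dividing each neuron's mean rate per stage by its peak mean rate, can be sketched as below, assuming a hypothetical (neurons × stages) matrix of mean rates:

```python
import numpy as np

def normalize_by_peak(stage_rates):
    """Scale each neuron's per-stage mean rates by its peak stage rate.

    stage_rates : (n_neurons, n_stages) array of mean firing rates
    Returns an array in which every row peaks at 1.0, so neurons with
    different absolute rates can be averaged into one population profile.
    """
    rates = np.asarray(stage_rates, dtype=float)
    peak = rates.max(axis=1, keepdims=True)  # most preferred stage per neuron
    return rates / peak

# Two hypothetical neurons with peak rates of 40 and 25 spikes/s
rates = np.array([[10.0, 40.0, 20.0],
                  [25.0,  5.0, 10.0]])
norm = normalize_by_peak(rates)
```

Averaging the normalized rows across neurons (e.g., `norm.mean(axis=0)`) yields population curves like those in Fig. 16, right, in which no single high-rate neuron dominates the mean.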
Figure 16.

Different task actions are signaled in primary motor (M1) and somatosensory (S1) cortex. Left: bar graphs showing % and total number of neurons in M1 cortex (A) and S1 cortex (B) that showed significant excitation during the 8 task stages when performed by the left (contralateral, blue bars) and right (ipsilateral, red bars) hands. Significant bimanual activity was found only in M1 and not in area 2 of S1. Some neurons are specialized for reach and others for grasp, but the majority respond to both reach and grasp behaviors. Right: mean normalized firing rates per task stage (±SE) in M1 cortex (A) and S1 cortex (B) of the right hemisphere during the prehension task performed by the left (blue trace) and right (red trace) hands. M1 neurons are broadly tuned to reach, grasp, and pull (stages 1–4) and respond weakly in later stages. M1 neurons respond to the task cues, 300 ms or more before the onset of reaching (stage 0), and fire at higher rates to action than to static postures. S1 neurons in the right hemisphere respond primarily to actions of the contralateral hand during reach (hand shaping), contact, grasp, and pull (stages 1–4) but also to object release from the hand (stages 6–7) and do not respond to actions of the ipsilateral (right) hand. They provide tactile and proprioceptive information about object acquisition only for the contralateral (left) hand.
The normalized mean firing rates of the M1 and S1 populations shown in Fig. 16, right, mirror those of the individual neurons illustrated in Figs. 9 and 15. In both areas, mean normalized firing rates were higher when the left (contralateral) hand performed the task (blue trace) than during performance with the right (ipsilateral) hand (red trace). The mean M1 firing rates rose above the pretrial rate during the cue period at the trial start (stage 0). Maximum firing in M1 occurred during reach with the left hand (stage 1), as the subject sought to acquire the cued object. The firing rate decayed during object manipulation (stages 2–4) and holding (stage 5) and was lowest in the later release stages (stages 6–8). Although many M1 neurons showed similar response patterns for both hands, firing rates were uniformly higher and response duration longer when the contralateral hand was tested. Actions of the ipsilateral hand were signaled in M1 by a similarly shaped, but lower-amplitude, temporal profile.
S1 neurons fired at higher mean normalized rates than M1 neurons when the subject used the contralateral (left) hand (blue trace in Fig. 16, right). S1 firing rates rose slightly during reaching (stage 1) when nothing contacted the hand but the hand posture was preshaped for grasp. Firing rates in S1 were highest at contact (stage 2) and remained high during grasping and pulling (stages 3 and 4) with the contralateral hand (blue trace). S1 firing rates fell during static holding (stage 5) and then decayed from the peak rate more rapidly than those in M1. The S1 population also signaled relaxation of grasp and object release, as many neurons displayed a second peak in stages 6 and 7 when grip was relaxed, the hand opened, and the object released. Normalized firing rates during use of the ipsilateral (right) hand (red trace in Fig. 16, right) were low, and nearly flat, reflecting the weak responses of S1 neurons to any actions of the ipsilateral hand.
The bar graphs in Fig. 16, left, summarize the size of the M1 and S1 populations that showed altered firing rates during specific task stages compared to the pretrial rate (in stage −1), before the animal received the task cues. Blue bars indicate the totals for actions performed by the contralateral (left) hand and red bars denote the totals for actions by the ipsilateral (right) hand during each stage. Although we tested actions of both hands for each neuron, we found that only M1 neurons, not S1 cells, show elevated firing rates when the ipsilateral hand was used (red bars).
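A per-stage comparison of firing against the pretrial rate can be implemented in many ways; the sign-flip permutation test below is one illustrative choice, not necessarily the statistical test used in the study, and all names and example rates are hypothetical.

```python
import numpy as np

def stage_vs_pretrial_p(stage_rates, pretrial_rates, n_perm=10000, seed=0):
    """Paired permutation test: is the mean stage rate above the pretrial rate?

    stage_rates, pretrial_rates : per-trial firing rates (spikes/s)
    Returns a one-sided p-value for mean(stage - pretrial) > 0.
    """
    rng = np.random.default_rng(seed)
    diff = np.asarray(stage_rates, float) - np.asarray(pretrial_rates, float)
    observed = diff.mean()
    # Under the null hypothesis of no stage effect, the sign of each
    # paired difference is arbitrary, so flip signs at random.
    flips = rng.choice([-1.0, 1.0], size=(n_perm, diff.size))
    null = (flips * diff).mean(axis=1)
    return (np.sum(null >= observed) + 1) / (n_perm + 1)
```

A neuron would be counted in the blue or red bars of Fig. 16, left, for a given stage when its p-value falls below the chosen significance threshold; testing for reduced firing simply reverses the sign of the comparison.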
The highest firing rates in M1 occurred during reaching with the left hand (stage 1, approach): 86% of the M1 neurons tested (25/29) showed significantly elevated firing rates during approach with the contralateral (left) hand (blue bars in Fig. 16A, left), whereas only 31% of M1 neurons (9/29) showed significantly increased firing rates during approach with the ipsilateral (right) hand (red bars in Fig. 16, left). High firing rates were maintained by 69% of M1 neurons (20/29) when the animal used the left (contralateral) hand at contact (stage 2), whereas only 24% (7/29) showed significantly higher firing rates at contact with the right (ipsilateral) hand.
At the trial start (stage 0), when the animal was cued to obtain an object, 52% of the M1 neurons tested (15/29) showed a significant rise in firing rates over baseline; only 7% of M1 neurons (2/29) increased their firing rate after cues specifying reach with the ipsilateral (right) hand. Object manipulation by the contralateral hand (grasp, pull, and hold) modulated firing rates of half of the M1 neurons (14/29), but only ∼20% of M1 neurons (6/29) responded to these task actions performed by the ipsilateral hand. Actions performed during the late task stages (lower, relax, and release) were signaled by 7–14% of M1 neurons on trials using the contralateral hand (4/29, 4/29, 2/29) and by 3–7% on trials with the ipsilateral hand (2/29, 2/29, 1/29).
S1 neurons were activated throughout the period of hand-object interaction by the contralateral hand but not by the ipsilateral hand. Over 90% of the S1 neurons tested in the right hemisphere fired at significantly higher rates during the approach (35/38) and contact (34/38) stages using the left (contralateral) hand. Object manipulation with the left hand [grasping (30/38), pulling (28/38), and holding (30/38)] evoked significant activity in 74–79% of S1 neurons, two-thirds (25/38) showed significant changes in firing as the objects were returned to the start position, ∼40% signaled grip relaxation (16/38), and ∼20% signaled release from the hand (8/38). Only 1 of the 38 S1 neurons (∼3%) showed a significant increase in firing during actions of the ipsilateral hand (red bars in Fig. 16B, left), and this occurred during knob pull. None of the S1 neurons studied showed significant reduction in firing below baseline during actions of the left (contralateral) hand, and 2/38 S1 neurons showed significant inhibition during use of the ipsilateral hand at contact and during knob pull. Thus, neurons in S1 provide important signals of motion of the left hand on and off objects during grasping tasks and fire at lower rates to static postures.
We also used a handedness index (HI; Fig. 17) to calculate the relative magnitude of responses to similar actions of the two hands. An HI of 0 signifies that the two hands evoked equal responses, an HI of +1 indicates a strong preference for the contralateral (left) limb, and an HI of −1 indicates a strong bias toward the ipsilateral (right) limb. Neurons in M1 showed a modest bias toward the contralateral limb, with a mean HI = 0.36 during approach; the M1 HI ranged from 0.2 to 0.3 at the trial start (when the animal was instructed which knob to pull) and was lower during object manipulation, when the hand interacted with the objects (stages 2–8). In contrast, S1 firing rates were strongly biased toward the contralateral limb, with the greatest preference at contact (HI = 0.82), and remained above 0.6 during the remainder of the task. Our findings indicate that S1 neurons of the right hemisphere are tuned primarily to somatosensory inputs from the contralateral hand during hand shaping and hand-object interactions. Neurons in frontal motor areas show less lateralization of action representation but still display a moderate preference for actions of the contralateral limb.
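An index with these bounds is conventionally computed as a contrast ratio of contralateral and ipsilateral response magnitudes. The exact formula is not stated in the text, so the sketch below assumes the standard laterality form HI = (C − I)/(C + I), which matches the described limits of +1, 0, and −1; the function and example rates are hypothetical.

```python
import numpy as np

def handedness_index(contra_rate, ipsi_rate):
    """Laterality-style handedness index (assumed form, not from the text).

    HI = (C - I) / (C + I): +1 = contralateral-only response,
    -1 = ipsilateral-only response, 0 = equal responses to the two hands.
    """
    c = np.asarray(contra_rate, dtype=float)
    i = np.asarray(ipsi_rate, dtype=float)
    return (c - i) / (c + i)

# Hypothetical mean rates at contact (spikes/s):
hi_s1 = handedness_index(45.0, 4.5)   # strongly contralateral, S1-like
hi_m1 = handedness_index(30.0, 14.0)  # moderately contralateral, M1-like
```

Computed per task stage and averaged across neurons, such values would produce stage-by-stage HI curves like those in Fig. 17.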
Figure 17.
Mean handedness index (HI) ± SE for the 29 primary motor cortex (M1) neurons and 38 primary somatosensory (S1) neurons recorded in monkey AR. Same task stage designation as in Fig. 15. S1 neurons had higher HI values than M1 neurons because they did not respond to actions of the ipsilateral hand. The greatest HI values occurred in S1 at contact, when firing rates were highest, and during approach in M1, when firing rates were also highest.
DISCUSSION
Bimanual and Unimanual Cortical Representation of Hand Actions
To assess the degree of lateralization of hand function in sensorimotor areas of the cerebral cortex, we recorded spike trains of 67 single neurons individually, and as neural ensembles, with arrays of up to four microelectrodes as a macaque performed a trained prehension task. The task required training and practice in which the subject learned to associate visual icons displayed on a computer monitor with instructions for obtaining food rewards. The icons indicated not the size of the reward but the action to be performed and practiced: grasping an unfamiliar object that is not itself edible but is linked to an apparatus that dispenses juice to be consumed immediately. Control and prediction were the key elements to be trained: the animal needed to learn to link abstract symbols on a computer monitor with hand actions that were rewarded. The task cues did not explicitly require use of a particular hand.
The recordings spanned a large swath of cortex from frontal motor areas caudal to the arcuate sulcus, through the somatosensory regions of area 2 bordering the central sulcus. M1 neurons appear to play important roles in object acquisition, responding most vigorously during reaching, whereas S1 responses are focused on object manipulation such as touching, grasping, pulling, and lifting. Our previous studies of PPC (11–20, 29, 30) indicate that the region surrounding the intraparietal sulcus plans upcoming hand actions during prehension.
Contrary to expectations, we found that M1 neurons were activated earlier than S1 neurons in the population analyzed; spike trains in our studies of M1 began as early as ∼750 ms before the hand touched the object. M1 neurons typically increased their firing rates after the GO-cue, before the actual reach movement began, and showed peak firing during reach, weaker responses during object manipulation, and low rates of activity during static holding. Neurons in M1 were split between bimanually responsive types (10/29) and those strongly favoring the contralateral hand and arm (19/29). M1 peak responses to actions of the ipsilateral hand typically occurred during the same task stages as those performed by the contralateral hand. Ipsilateral representation was generally weaker and shorter in time course than that evoked by the contralateral hand. Responses to relaxation of grip and object release from the hand were infrequent and weak in M1. We found little effect of the knob shape and/or location in the workspace on firing rates during task performance.
We propose that the M1 responses to the ipsilateral hand documented here underlie the process of motor equivalence (Ref. 33, cited in Ref. 34) in which actions performed by the nondominant hand are similar in kinematics and function to those usually performed by the dominant hand. Bilateral representation in M1 thus provides a potential hardwired neural mechanism for recovery of function after unilateral stroke or head injury in which an intact nondominant hand may substitute for the paretic hand or instruct it if the two hands perform the same actions in concert or sequentially (32). Our present findings indicate that M1 networks in the animal studied signal actions performed by both the right and left hands, providing hardwired, potentially redundant circuits for skilled motor control in each hemisphere.
Motor equivalence may also provide a neural substrate for transfer of skills between the hands. For example, in previous studies using similar grasping tasks (16–18), we found that it took ∼2 wk to retrain an animal to use the other hand when the recording site was shifted to the other hemisphere, whereas training for the original studies took >1 yr.
Bilateral representation also provides a hardwired mechanism to transfer learning from one hand to the other, presumably by callosal connections between the two cortical hemispheres.
Neurons in primary somatosensory (S1) cortex area 2 showed the greatest lateralization of hand representation, with S1 cells responsive almost exclusively to actions of the contralateral hand and arm. S1 neurons in the right hemisphere signaled liftoff of the contralateral hand from the start position as reach began (Fig. 14) and/or increased their firing rates before the hand contacted the test objects. Most S1 neurons were insensitive to actions of the ipsilateral hand.
Neural activity in S1 peaked when the contralateral hand initially touched the object and/or secured it in the grasp. S1 neurons often fired a second burst of spikes as the grip was relaxed and the object released from the hand. Actions of the ipsilateral hand to the same or similarly shaped objects did not evoke significant changes in the neuron’s baseline firing rate. The weak representation of ipsilateral hand actions in S1 provides a mechanism for distinguishing active touch signals from passively applied tactile sensations.
Tactile and proprioceptive sensory feedback from the hand to S1 may play an instructional role enabling skilled object manipulation by either hand (1, 7–10, 35–42). The precontact activity suggests that copies of the motor command to reach and grasp are transmitted to S1, enabling S1 neurons to evaluate afferent tactile and proprioceptive information from the hand. The precontact spike trains may distinguish self-acquired stimuli from those originating from external sources. Alternatively, precontact activity may reflect proprioceptive activity of SA2 tactile afferents or muscle spindle stretch receptors that signal hand posture during hand shaping before object acquisition (38–42).
We recognize that the data presented in this report are derived from a small sample of neurons in M1 and S1. However, in previous studies of three macaques using a similar task (grasp to lift rather than grasp to pull), we found that the earliest task-related activity occurred in PPC neurons whose firing rates increased in conjunction with the animal’s reach to the start position and persisted at high rates until the cued object was acquired by the hand (12, 13, 17, 29, 30). These earlier studies included 76 neurons recorded in M1 and S1 cortex. We noted in those reports that S1 neurons fired at peak rates during contact and/or grasping as shown in the present report and that M1 neurons fired maximally during reaching. Therefore, we proposed that postcontact activation of specific hand muscles is paralleled by matching somatosensory feedback to S1 needed for efficient task performance. In this manner, S1 feedback to M1 may regulate levels of grip and load forces needed for successful performance of grasping tasks.
Active and Passive Touch
In an earlier study of tactile letter recognition, Vega-Bermudez, Johnson, and Hsiao (43) concluded from psychophysical measurements of human tactile discrimination that active and passive touch provide equivalent information to the conscious brain. Although they stressed similarities between vision and touch (36, 37, 43–45), particularly the properties of a two-dimensional receptive sheet, the analogy seems oversimplified.
Klatzky and Lederman (1–3) have contended that touch is primarily a three-dimensional modality designed to capture the volumetric, topographic, and elastic properties of objects, features relevant to the shape discrimination required by the objects tested in our experiments. They noted that haptic sensory qualities are best appreciated by active manipulation (grasping, rotation, and contour tracing by the fingers and palm).
In a 2008 review (35), Steven Hsiao expressed similar ideas about the three-dimensional character of object representation in the brain. In Figure 1 of that report, he showed an image of a monkey grasping a cylindrical disk and, in the same figure, an image of the empty hand configured in the same posture. He suggested that percepts of object size and shape are conveyed by both tactile and proprioceptive inputs from SA1 and SA2 afferents. SA1 fibers were posited to signal the texture and curvature applied to each finger, whereas SA2 fibers and/or muscle afferents signaled the overall hand posture during grasping (38–40). The interaction of proprioceptive and tactile information is necessary for coding object size, as the two modalities provide complementary information. Interestingly, both touch and proprioception are transduced by Piezo2 mechanosensitive ion channels (47, 48).
The interaction of touch and proprioception during object grasp poses an interesting paradox. If you are asked to demonstrate how you would hold a medium-sized sphere such as an orange or a baseball, you do not experience the sensation of a sphere but rather the clear proprioceptive sensation of your curved hand. If you are presented with that object and told to grasp it, you would form your hand into a similar posture during reaching, providing a simple mechanism for efficient grasp, called hand preshaping, described originally by Marc Jeannerod and colleagues (reviewed in Ref. 6). Interestingly, when your hand touches and grasps the orange, you feel the object, not your hand posture. It is as if touch sensations suppress proprioception, altering the sensation from proprioceptive to exteroceptive. So it is possible that some of the precontact responses we recorded in S1 (Fig. 14) reflect the proprioceptive signals of hand posture sensed by SA2 afferents during reaching and hand preshaping that are suppressed by tactile sensations when contact is made.
There are fundamental differences between touching (active) and being touched (passive), originally described by J. J. Gibson (4). Active touch is fundamentally a top-down process in which the subject has agency and controls what occurs. Subjects select the salient features of objects that determine active hand behaviors; we usually grasp objects along their narrowest dimensions. Gibson named these graspable features “object affordances.” We did not train our animals how to grasp the test objects used in this or earlier studies of prehension but let them determine where to place the hand to manipulate the objects efficiently.
In our experiments, the animal controlled how it used its hand, selected which object to grasp and the most efficient hand posture needed to acquire it, and decided how to manipulate it to achieve the task goals. These paradigms are based on feedforward models of motor control in which the subject used previous experience to acquire and manipulate objects, predicted the sensory consequences of touching, and modified behaviors accordingly (9, 10, 34).
Passive touch is basically a bottom-up process in which subjects react to external stimuli selected by the experimenter. The experimenter controls the timing, amplitude, duration, and spatial spread of stimuli delivered to the hand. Subsequent behaviors are sensory evoked and usually guided by the instructions provided for the paradigm. Subjects therefore need to analyze all of the transmitted somatosensory information and select specific features, guided in part by learning and the task instructions. Tactile stimuli are classified into experimenter-selected categories and/or rated along an intensive or hedonic scale. Subjects have no control over the task conditions other than their willingness to comply with the task instructions.
The information conveyed during active touch depicts the physical properties of objects, the motor actions of the subject’s own hand and arm, and their relation to the task goals. Intrinsic object properties, such as size, shape, weight, texture, and compliance, are sensed before contact by vision and after the object is touched by somatosensory receptors.
Visual and somatosensory properties of objects are integrated in higher brain areas with memories of previous encounters, allowing the subject to predict somatosensory properties such as weight, texture, and compliance before touching and grasping the object. Extrinsic properties of objects, such as their location and orientation in the workspace, are conveyed primarily by visual inputs, allowing subjects to draw on previous experience to plan an appropriate reach trajectory and grasp strategy for efficient acquisition and manipulation. Thus, visual and tactile sensations in each trial provide important information about what the hand touches and where it is located in the workspace.
Hand and arm movements and postures are sensed by muscle afferents and by Ruffini endings in the skin (38–41). Proprioceptive information is necessary for skill learning and for guiding complex motor patterns needed to manipulate objects. These proprioceptive sensations signal how the object is acquired and manipulated. Somatosensory receptors in the skin also convey information about the grip and load forces needed for skilled, efficient manipulation of objects such as tools and to prevent damage to the object or the hand from excessive force application (42). This information is used on a trial-by-trial basis for task performance.
In passive touch experiments, the hand is usually immobilized with the fingers fully extended and the skin somewhat stretched. The lack of hand mobility deprives the subject of useful proprioceptive information that can aid object recognition and feature detection (41). Furthermore, Hsiao and colleagues (46) showed that hand opening and closure during object manipulation by humans provides natural synergies that aid object identification; similar findings have been reported for grasping behaviors in monkeys (49, 50).
GRANTS
Research reported in this publication was supported by the National Institute of Neurological Disorders and Stroke (NINDS) of the National Institutes of Health under Award Number R01 NS011862, by Human Brain Project Research Grant R01 NS044820 funded jointly by NINDS, the National Institute of Mental Health (NIMH), and the National Institute on Aging (NIA), and by Research Grant AG064452 from NIA.
DISCLAIMERS
The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
DISCLOSURES
No conflicts of interest, financial or otherwise, are declared by the authors.
AUTHOR CONTRIBUTIONS
E.P.G. conceived and designed research; E.P.G., D.F.P., and J.C. performed experiments; E.P.G. and J.C. analyzed data; E.P.G. interpreted results of experiments; E.P.G. prepared figures; E.P.G. drafted manuscript; E.P.G. edited and revised manuscript; E.P.G., D.F.P., and J.C. approved final version of manuscript.
ACKNOWLEDGMENTS
We thank Gianluigi Campanile and Paola Fillipi, students at the University of Bologna Faculty of Pharmacy, Bologna, Italy, for skilled and dedicated participation in these studies. We also thank Alex Porras of the NYU School of Medicine, Department of Neuroscience and Physiology, for the design of the computer interface for the shape panel.
REFERENCES
- 1. Klatzky RL, Lederman SJ, Metzger V. Identifying objects by touch: an ‘expert system’. Percept Psychophys 37: 299–302, 1985. doi: 10.3758/bf03211351.
- 2. Lederman SJ, Klatzky RL. Hand movements: a window into haptic object recognition. Cogn Psychol 19: 342–368, 1987. doi: 10.1016/0010-0285(87)90008-9.
- 3. Klatzky RL, Lederman SJ. Haptic object perception: spatial dimensionality and relation to vision. Philos Trans R Soc Lond B Biol Sci 366: 3097–3105, 2011. doi: 10.1098/rstb.2011.0153.
- 4. Gibson JJ. Observations on active touch. Psychol Rev 69: 477–491, 1962. doi: 10.1037/h0046962.
- 5. Jeannerod M. The timing of natural prehension movements. J Mot Behav 16: 235–254, 1984. doi: 10.1080/00222895.1984.10735319.
- 6. Jeannerod M, Arbib MA, Rizzolatti G, Sakata H. Grasping objects: the cortical mechanisms of visuomotor transformation. Trends Neurosci 18: 314–320, 1995. doi: 10.1016/0166-2236(95)93921-J.
- 7. Jenmalm P, Johansson RS. Visual and somatosensory information about object shape control manipulative fingertip forces. J Neurosci 17: 4486–4499, 1997. doi: 10.1523/JNEUROSCI.17-11-04486.1997.
- 8. Jenmalm P, Dahlstedt S, Johansson RS. Visual and tactile information about object-curvature control fingertip forces and grasp kinematics in human dexterous manipulation. J Neurophysiol 84: 2984–2997, 2000. doi: 10.1152/jn.2000.84.6.2984.
- 9. Flanagan JR, Johansson RS. Action plans used in action observation. Nature 424: 769–771, 2003. doi: 10.1038/nature01861.
- 10. Johansson RS, Flanagan JR. Coding and use of tactile signals from the fingertips in object manipulation tasks. Nat Rev Neurosci 10: 345–359, 2009. doi: 10.1038/nrn2621.
- 11. Ro JY, Debowy D, Lu S, Ghosh S, Gardner EP. Digital video: a tool for correlating neuronal firing patterns with hand motor behavior. J Neurosci Methods 82: 215–231, 1998. doi: 10.1016/S0165-0270(98)00055-7.
- 12. Gardner EP, Ro JY, Debowy D, Ghosh S. Facilitation of neuronal firing patterns in somatosensory and posterior parietal cortex during prehension. Exp Brain Res 127: 329–354, 1999. doi: 10.1007/s002210050803.
- 13. Debowy DJ, Ghosh S, Ro JY, Gardner EP. Comparison of neuronal firing rates in somatosensory and posterior parietal cortex during prehension. Exp Brain Res 137: 269–291, 2001. doi: 10.1007/s002210000660.
- 14. Gardner EP, Debowy D, Ro JY, Ghosh S, Babu KS. Sensory monitoring of prehension in the parietal lobe: a study using digital video. Behav Brain Res 135: 213–224, 2002. doi: 10.1016/s0166-4328(02)00167-5.
- 15. Debowy DJ, Babu KS, Hu EH, Natiello M, Reitzen S, Chu M, Sakai J, Gardner EP. New applications of digital video technology for neurophysiological studies of hand function. In: The Somatosensory System: Deciphering the Brain’s Own Body Image, edited by Nelson R. Boca Raton, FL: CRC Press, 2002, p. 219–241.
- 16. Gardner EP, Babu KS, Reitzen SD, Ghosh S, Brown AS, Chen J, Hall AL, Herzlinger MD, Kohlenstein JB, Ro JY. Neurophysiology of prehension. I. Posterior parietal cortex and object-oriented hand behaviors. J Neurophysiol 97: 387–406, 2007. doi: 10.1152/jn.00558.2006.
- 17. Gardner EP, Ro JY, Babu KS, Ghosh S. Neurophysiology of prehension. II. Response diversity in primary somatosensory (S-I) and motor (M-I) cortices. J Neurophysiol 97: 1656–1670, 2007. doi: 10.1152/jn.01031.2006.
- 18. Gardner EP, Babu KS, Ghosh S, Sherwood A, Chen J. Neurophysiology of prehension. III. Representation of object features in posterior parietal cortex of the macaque monkey. J Neurophysiol 98: 3708–3730, 2007. doi: 10.1152/jn.00609.2007.
- 19. Chen J, Reitzen SD, Kohlenstein JB, Gardner EP. Neural representation of hand kinematics during prehension in posterior parietal cortex of the macaque monkey. J Neurophysiol 102: 3310–3328, 2009. doi: 10.1152/jn.90942.2008.
- 20. Gardner EP. Dorsal and ventral streams in the sense of touch. In: The Senses: a Comprehensive Reference, Vol 16, Somatosensation, edited by Gardner EP, Kaas JH. Oxford: Elsevier, 2008, p. 233–258.
- 21. Sherwood A, Lang EJ, Gardner EP. The neuroinformatics digital video toolkit (Abstract). Soc Neurosci Abstr 32: 6, 2006.
- 22. Mountcastle VB, Lynch JC, Georgopoulos A, Sakata H, Acuna C. Posterior parietal association cortex of the monkey: command functions for operations within extrapersonal space. J Neurophysiol 38: 871–908, 1975. doi: 10.1152/jn.1975.38.4.871.
- 23. Mountcastle VB. The parietal system and some higher brain functions. Cereb Cortex 5: 377–390, 1995. doi: 10.1093/cercor/5.5.377.
- 24. Mountcastle VB. Perceptual Neuroscience: the Cerebral Cortex. Cambridge, MA: Harvard University Press, 1998.
- 25. Mountcastle VB. The Sensory Hand. Neural Mechanisms of Somatic Sensation. Cambridge, MA: Harvard University Press, 2005.
- 26. Gardner EP, Chen J. Spike trains in posterior parietal cortex (PPC) encode trained and natural grasping behaviors (Abstract). Soc Neurosci Abstr 35: 732.1, 2012.
- 27. Putrino D, Chen J, Gardner EP. Representation in motor cortex (M1) of hand actions in a bimanual prehension task (Abstract). Soc Neurosci Abstr 34: 588.04, 2011.
- 28. Chen J, Putrino D, Gardner EP. Representation in somatosensory (S1) cortex of hand actions in prehension tasks (Abstract). Soc Neurosci Abstr 34: 588.07, 2011.
- 29. Chen J, Baker JL, Gardner EP. Unimanual and bimanual hand actions in a trained prehension task (Abstract). Soc Neurosci Abstr 36: 651.12, 2013.
- 30. Chen J, Baker JL, Gardner EP. Object grasp by the ipsilateral and contralateral hands evokes similar responses from posterior parietal cortex (PPC) neurons (Abstract). Soc Neurosci Abstr 37: 250.16, 2014.
- 31. Krubitzer L, Disbrow E. The evolution of parietal areas involved in hand use in primates. In: The Senses: a Comprehensive Reference, Vol 16, Somatosensation, edited by Gardner EP, Kaas JH. Oxford: Elsevier, 2008, p. 183–214.
- 32. Raghavan P, Krakauer JW, Gordon AM. Impaired anticipatory control of fingertip forces in patients with a pure motor or sensorimotor lacunar syndrome. Brain 129: 1415–1425, 2006. doi: 10.1093/brain/awl070.
- 33. Raibert MH. Motor Control and Learning by the State Space Model (PhD thesis). Cambridge, MA: MIT, 1977.
- 34. Wolpert DM, Bastian AS. Principles of sensorimotor control in motor systems. In: Principles of Neural Science (6th ed.), edited by Kandel ER, Koester JD, Mack SH, Siegelbaum SS. New York: McGraw-Hill, 2021, p. 713–736.
- 35. Hsiao SS. Central mechanisms of tactile shape perception. Curr Opin Neurobiol 18: 418–424, 2008. doi: 10.1016/j.conb.2008.09.001.
- 36. Johnson KO. The roles and functions of cutaneous mechanoreceptors. Curr Opin Neurobiol 11: 455–461, 2001. doi: 10.1016/s0959-4388(00)00234-8.
- 37. Johnson KO, Hsiao SS. Neural mechanisms of tactual form and texture perception. Annu Rev Neurosci 15: 227–250, 1992. doi: 10.1146/annurev.ne.15.030192.001303.
- 38. Dimitriou M, Edin BB. Discharges in human muscle receptor afferents during block grasping. J Neurosci 28: 12632–12642, 2008. doi: 10.1523/JNEUROSCI.3357-08.2008.
- 39. Edin BB. Quantitative analysis of dynamic strain sensitivity in human skin mechanoreceptors. J Neurophysiol 92: 3233–3243, 2004. doi: 10.1152/jn.00628.2004.
- 40. Edin BB, Abbs JH. Finger movement responses of cutaneous mechanoreceptors in the dorsal skin of the human hand. J Neurophysiol 65: 657–670, 1991. doi: 10.1152/jn.1991.65.3.657.
- 41. Kim SS, Gomez-Ramirez M, Thakur PH, Hsiao SS. Multimodal interactions between proprioceptive and cutaneous signals in primary somatosensory cortex. Neuron 86: 555–566, 2015. doi: 10.1016/j.neuron.2015.03.020.
- 42. Westling G, Johansson RS. Responses in glabrous skin mechanoreceptors during precision grip in humans. Exp Brain Res 66: 128–140, 1987. doi: 10.1007/BF00236209.
- 43. Vega-Bermudez F, Johnson KO, Hsiao SS. Human tactile pattern recognition: active versus passive touch, velocity effects, and patterns of confusion. J Neurophysiol 65: 531–546, 1991. doi: 10.1152/jn.1991.65.3.531.
- 44. Pei YC, Hsiao SS, Bensmaia SJ. The tactile integration of local motion cues is analogous to its visual counterpart. Proc Natl Acad Sci USA 105: 8130–8135, 2008. doi: 10.1073/pnas.0800028105.
- 45. Yau JM, Pasupathy A, Fitzgerald PJ, Hsiao SS, Connor CE. Analogous intermediate shape coding in vision and touch. Proc Natl Acad Sci USA 106: 16457–16462, 2009. doi: 10.1073/pnas.0904186106.
- 46. Thakur PH, Bastian AJ, Hsiao SS. Multidigit movement synergies of the human hand in an unconstrained haptic exploration task. J Neurosci 28: 1271–1281, 2008. doi: 10.1523/JNEUROSCI.4512-07.2008.
- 47. Ranade SS, Woo SH, Dubin AE, Moshourab RA, Wetzel C, Petrus M, Mathur J, Bégay V, Coste B, Mainquist J, Wilson AJ, Francisco AG, Reddy K, Qiu Z, Wood JN, Lewin GR, Patapoutian A. Piezo2 is the major transducer of mechanical forces for touch sensation in mice. Nature 516: 121–125, 2014. doi: 10.1038/nature13980.
- 48. Ranade SS, Syeda R, Patapoutian A. Mechanically activated ion channels. Neuron 87: 1162–1179, 2015 [Erratum in Neuron 88: 433, 2015]. doi: 10.1016/j.neuron.2015.08.032.
- 49. Mason CR, Gomez JE, Ebner TJ. Hand synergies during reach-to-grasp. J Neurophysiol 86: 2896–2910, 2001. doi: 10.1152/jn.2001.86.6.2896.
- 50. Santello M, Flanders M, Soechting JF. Patterns of hand motion during grasping and the influence of sensory guidance. J Neurosci 22: 1426–1435, 2002.