Author manuscript; available in PMC: 2022 Nov 18.
Published in final edited form as: Curr Dir Psychol Sci. 2022 Jul 25;31(5):411–418. doi: 10.1177/09637214221101877

Cross-Modal Interactions of the Tactile System

K. Sathian, Simon Lacey
PMCID: PMC9674209  NIHMSID: NIHMS1804253  PMID: 36408466

Abstract

The sensory systems responsible for perceptions of touch, vision, hearing, etc. have traditionally been regarded as mostly separate, only converging at late stages of processing. Contrary to this dogma, recent work has shown that interactions between the senses are robust and abundant. Touch and vision are both commonly used to obtain information about a number of object properties, and share perceptual and neural representations in many domains. Additionally, visuotactile interactions are implicated in the sense of body ownership, as revealed by powerful illusions that can be evoked by manipulating these interactions. Touch and hearing both rely in part on temporal frequency information, leading to a number of audiotactile interactions reflecting a good deal of perceptual and neural overlap. The focus in sensory neuroscience and psychophysics is now on characterizing the multisensory interactions that lead to our panoply of perceptual experiences.


An exciting development in the past few decades has been the growing realization that the seemingly distinct sensory systems, such as touch, vision, and hearing, are not strictly independent, as is often taught. Rather, these systems are intricately intertwined in the brain and in perception. Here we review work, mostly from the last 25 years or so with a particular focus on human research, establishing the close perceptual and neural interactions between touch and vision in a number of domains, including object orientation, shape, texture and motion, and the perceived ownership of the body and its parts. We then review some recent work examining perceptual and neural interactions between touch and hearing, with a focus on the domain of temporal frequency.

INTERACTIONS BETWEEN VISION AND TOUCH

Although vision is a distance sense while touch is a contact sense, the sensory systems supporting vision and touch have much in common. Both these senses are used to assess object properties such as orientation, shape and surface texture, spatial relations between objects or parts of objects, and object motion. Thus, it should not be surprising that the perceptual characteristics of the visual and tactile systems resemble each other, or that the corresponding neural representations overlap. For ease of exposition, we have chosen to organize the following sections by domain, referring in each case to perceptual interactions and then their neural substrates.

Orientation discrimination

We perceive the orientation of environmental objects visually, but also appreciate the orientation of small or graspable objects haptically (i.e. with the hands). Further, vision and touch interact in the perception of stimulus orientation, as exemplified in tactile modulation of a visual illusion called the tilt illusion: In vision, the perceived orientation of a central grating can be affected by the orientation of a surrounding grating – when the central and surrounding gratings differ in orientation by a few degrees, the central grating is perceived as tilted further away from the orientation of the surround than it actually is. This “repulsive” effect is enhanced by a simultaneously presented tactile surround grating (see Footnote 2) whose orientation is congruent to that of the visual surround (Pérez-Bellido et al., 2018a). Another example comes from the study of binocular rivalry, where two stimuli presented independently to each eye result in associated percepts that switch unpredictably back and forth. For instance, when the two stimuli are orthogonally oriented gratings, the perceptual experience oscillates between the two gratings. When a tactile grating is then added in an orientation matching that of one of the visual gratings, this visual grating tends to become dominant over its competitor in the rivalry (Lunghi & Alais, 2015). Such interactions suggest that the representation of object orientation is common to vision and touch. This idea fits with neuroimaging studies using positron emission tomography (PET), which showed that the same visual cortical region is active during both visual (Sergent et al., 1992) and tactile (Sathian et al., 1997) discrimination of grating orientation. This region appears to be the human counterpart of the monkey visual cortical area known as the sixth visual area, or V6 (Figure 1A; reviewed by Sathian, 2016). Transient disruption of this visual cortical area by focal transcranial magnetic stimulation (TMS, application of brief magnetic pulses to the head) actually interferes with tactile grating orientation discrimination (Zangaladze et al., 1999), demonstrating the functional role of this visual area in touch. These neuroimaging and neurostimulation findings, together with the psychophysical studies outlined above, converge on the conclusion that object orientation is encoded in a neuronal pool accessible to both vision and touch. Collectively, these observations illustrate the concept that so-called “visual” cortical areas subserve not only visual tasks but also the corresponding non-visual tasks, a point we return to repeatedly in this review. Accordingly, one could argue that the term “visual” for these areas is incorrect, but in the present review we refer to these areas as visual, not only for the sake of simplicity but also in recognition of the fact that these areas reside in the hierarchy of visual projections.

Figure 1.


(A-C) Locations of the human brain regions referenced in the text; the front (anterior) of the brain is on the left and the back (posterior) on the right in each illustration (adapted from Sathian, 2016). Brain regions in (A) & (C) are shown on sagittal slices (i.e. in planes oriented along the anterior-to-posterior axis of the head), x coordinates give the distance (mm) from the mid-sagittal plane (the midline dividing the head into equal right and left halves), with positive and negative numbers indicating the right and left cerebral hemisphere, respectively. Brain regions in (B) are shown on a partially “inflated” side view of the left hemisphere, in which the sulci (irregular grooves in (A) and (C)) appear in darker gray than the adjacent gyri (protuberances in (A) and (C)). (A) Visual area V6, the sixth in the hierarchy of visual areas originally described in macaque monkeys. (B) The prefrontal cortex (PFC) located anteriorly in the brain; the parietal operculum (Latin: little lid), forming part of the upper bank of the prominent lateral fissure; the central sulcus (CS) shown as a landmark – this sulcus demarcates the frontal lobe anteriorly from the parietal lobe posteriorly; the primary somatosensory cortex (S1) forms the most anterior part of the parietal lobe; the intraparietal sulcus (IPS) approximately bisects the parietal lobe; the visual area MT (so named for the homologous area in macaque monkeys, known as the middle temporal visual area); and the lateral occipital complex (LOC) located near where the occipital lobe (the most posterior lobe) meets the temporal lobe in the lower part of the cerebral hemisphere. (C) Medial (i.e. closer to the mid-sagittal plane) occipital cortex (MOC). (D) Conceptual model of haptic shape perception, in which object familiarity modulates recruitment of appropriate neural networks. For familiar objects, the LOC is driven top-down from the PFC, allowing facilitation by object imagery, whereas for unfamiliar objects the LOC is driven bottom-up from S1, aided by spatial imagery. The IPS integrates representations of an object from its component parts while the LOC houses modality-independent object representations. (Adapted from Lacey et al., 2014).

Shape perception

While an object’s shape is a defining visual property, shape is also often assessed haptically, especially during grasping. Recognition of unfamiliar objects is “view-dependent” in both visual and haptic modalities, since object rotation between study and test impairs recognition, although familiar objects can often be recognized despite being rotated, presumably due to acquisition of multiple representations from different viewpoints (see review by Peissig & Tarr, 2007). Interestingly, though, crossmodal object recognition (when an object is initially studied visually but recognition is tested haptically, or vice versa), although not quite as good as within-modal object recognition, is view-independent, i.e., it is unaffected by rotation of the object between study and test (Lacey et al., 2007a). We were able to induce within-modal view-independence through either visual or haptic perceptual learning, by repeated trials in which participants first studied objects and then were asked to recognize them in both rotated and unrotated orientations, and found that the view-independence acquired in the trained modality transferred crossmodally without further training (Lacey et al., 2009). Moreover, crossmodal training with unrotated and rotated objects, where the modality of object presentation was switched between study and test, by itself was sufficient to produce both visual and haptic view-independence (Lacey et al., 2009). These studies point to the existence of a multisensory representation of object shape that incorporates view-independence.

Haptic shape is encoded not only in somatosensory cortical areas but also in shape-selective visual cortex, in a region called the lateral occipital complex (LOC, Figure 1B), homologous to the inferotemporal cortex of monkeys (see review by Sathian, 2016). Even sound cues can activate the LOC during shape recognition in both sighted and blind people, leading to the supposition that the LOC computes geometric representations of shape, regardless of the input modality (reviewed by Sathian, 2016). Posterior parietal cortex, in and around the intraparietal sulcus (IPS, Figure 1B), is also involved in the perception of shape through both vision and touch; however, its function appears to be to reconstruct representations of objects from their component parts. To understand the involvement of areas such as the LOC and IPS in haptic compared to visual shape perception, we undertook a series of functional magnetic resonance imaging (fMRI) studies of activity and connectivity in multiple tasks. Based on these studies, we proposed a conceptual model of haptic shape perception (Figure 1D). An important feature of our model is the role of mental imagery, which appears to be central to haptic shape perception. Indeed, mental imagery is also valuable for visual perception, especially under suboptimal viewing conditions, where imagery is used to construct hypotheses against which incoming visual input is compared (Kosslyn, 1994). Visual imagery can be subdivided into object imagery, characterized by rich pictorial representations that integrate surface characteristics such as texture, and spatial imagery, emphasizing spatial relationships between parts of (or entire) objects; further, individuals vary in their relative preference for spatial or object imagery (Kozhevnikov et al., 2005). The object-spatial distinction in imagery also applies to haptics (Lacey et al., 2011), and the accuracy of crossmodal object recognition correlates with the ability for spatial, but not object, imagery (Lacey et al., 2007a). Our model (Lacey et al., 2014) tapped into differences between object and spatial imagery in relation to the neural networks underlying haptic shape perception. We proposed that visual object imagery is particularly relevant for haptic perception of the shapes of familiar objects, with the LOC being activated via top-down pathways from prefrontal cortex (PFC, Figure 1B) that presumably drive such imagery of previously encountered objects. In contrast, we argued that haptic perception of unfamiliar objects (for which object imagery is not readily available) is mediated instead by bottom-up drive from primary somatosensory cortex (S1, Figure 1B), facilitated by spatial imagery processes in the IPS that are thought to allow whole-from-part reconstruction of object representations that are critical for unfamiliar objects but may also be useful for familiar objects (Lacey et al., 2014).

The commonalities between visual and haptic shape perception resonate with the question famously posed by Irish philosopher William Molyneux to his British colleague John Locke, in which he asked whether restoration of sight to someone blind from birth would allow visual recognition of objects previously only experienced through touch (Locke, 1706/1997). It turns out that the empirical answer to this question is nuanced: Five congenitally blind individuals who underwent surgical treatment aimed at restoring their vision were unable to visually match a haptic sample object within 48 hours after surgery. However, three of the five participants who were tested days to weeks later showed substantial improvement on the crossmodal test (Held et al., 2011). Thus, it appears that the answer to Molyneux’s question is negative immediately after sight restoration, but turns positive after a short period of time, presumably reflecting the (rapid) effect of multimodal experience. Given the small sample size and the preliminary nature of this study, further work on this topic is desirable.

Texture perception

The texture of object surfaces is a property that is primarily sensed via touch, and touch is superior to vision in this domain (see review by Sathian, 2016). This is not surprising when one considers the multiple dimensions of texture, which include rough-smooth, hard-soft and sticky-slippery (reviewed by Bensmaia, 2009) – judgments for which we tend to rely on touch. The rough-smooth dimension has been extensively studied from psychophysical and neurophysiological perspectives, with the conclusion that spatial patterns are particularly important (as they are in vision) for coarser tactile textures, although temporal cues also contribute; temporal frequency (an important property of auditory as well as tactile stimuli) becomes increasingly important as tactile textures get finer (see review by Bensmaia, 2009). In keeping with the dominance of touch in texture perception, tactile textures bias judgments of simultaneously encountered visual textures, but not vice versa (Guest & Spence, 2003).

Parietal opercular cortex (a part of somatosensory cortex located in the upper bank of the lateral fissure, Figure 1B) is a key locus where haptic texture is represented. Interestingly, a multivariate classifier (see Footnote 3) could distinguish multivoxel spatial patterns of fMRI activity between visual presentations of glossy or rough objects, not only in visual cortical areas, but also in parietal opercular cortex (Sun et al., 2016), implying that the visual stimuli evoked corresponding haptic representations. Reciprocally, reiterating the theme of task-specific visual cortical recruitment during touch, haptic assessment of texture activates texture-selective visual cortical areas, especially in medial occipital cortex (MOC, Figure 1C, reviewed by Sathian, 2016). Remarkably, texture-selective parietal opercular cortex is also active when listening to sentences containing textural metaphors, such as “she had a rough day” (Lacey et al., 2012), suggesting that metaphorical roughness is understood by reference to its physical counterpart. This underscores the “grounding” of abstract concepts in relevant sensorimotor processes, an idea originally proposed by Aristotle, developed in modern cognitive psychology (see Barsalou, 2008, for a review), and applied to the subject of metaphors by Lakoff and Johnson (1980).

Motion perception

The motion after-effect is well known in vision – after exposure to visual motion in one direction for about 10 s, a static visual stimulus seems to move in the opposite direction, thought to be due to adaptation of the motion detectors. This after-effect also manifests crossmodally: adaptation to visual motion induces a tactile motion after-effect, and vice versa, indicating a shared visuotactile representation of motion (Konkle et al., 2009). Consistent with this, numerous neuroimaging studies have demonstrated that motion-selective visual cortical areas referred to as the MT complex (MT, Figure 1B) are also recruited by tactile or auditory motion, in both sighted and blind people, although some studies have failed to find crossmodal activation. TMS over the MT complex interferes with discriminating both the speed and direction of tactile motion (Amemiya et al., 2017; this article also reviews the previous neuroimaging and TMS studies), reinforcing the idea that this and other visual cortical areas underlie the performance of domain-specific tasks (motion, shape etc.) in a modality-independent manner.

Body ownership

Our sense of our own body – where it is in space and what belongs to our body and not someone else’s – should logically be one of our strongest percepts (reviewed in Ehrsson, 2020). However, the rubber hand illusion, an especially intriguing visuotactile interaction, challenges this idea: As originally described (Botvinick & Cohen, 1998), one arm of the participant is concealed and ‘replaced’ with a realistic fake arm. While participants look at the fake arm, the experimenter synchronously brushes the fake hand and the participant’s own (hidden) hand, which induces the illusory feeling that the touch is located on the seen, fake hand, the sense that the fake arm belongs to the participant’s body (incorporation into the body image), and the sense that the unseen, real arm is located in the same position as the fake arm (proprioceptive drift). The self-reported strength of the illusion scales with activity in ventral premotor cortex (reviewed by Ehrsson, 2020). The rubber hand illusion can be induced in a matter of seconds in those who are susceptible to it, and recent work suggests that the effects are relatively long-lasting; in particular, the sense that the fake arm belongs to one’s body may persist for several minutes (Abdulkarim et al., 2021).

In a similar but even more dramatic visuotactile illusion, synchronous stroking of a mannequin and a participant’s own body at the same body location, along with a virtual reality display that shows the mannequin in place of the participant’s body, induces the perception that the mannequin’s body is the participant’s own (reviewed by Ehrsson, 2020). These amazing observations indicate that constructing the perception of our body and its parts depends critically on multisensory integration. There are large individual differences in susceptibility to these illusions, the reasons for which have yet to be worked out; understanding such variability is vital because of the importance of incorporating prosthetics or tele-operated devices into the body schema for their efficient use (Cutts et al., 2019).

Visuohaptic object processing over the lifespan

Infants – even neonates – are capable of visuohaptic crossmodal matching and are therefore sensitive to object properties common to vision and touch (reviewed in Lewkowicz & Bremmer, 2020). However, the statistically optimal visuohaptic integration that adults demonstrate, with flexible weighting of input modalities to minimize the variance of perceptual estimates, takes some time to develop: up to about 8 years of age, integration is suboptimal, with haptics dominating size perception and vision dominating orientation discrimination, but by 8–10 years integration approaches the statistically optimal adult pattern, presumably reflecting calibration of the sensory systems by crossmodal comparison during development (Gori et al., 2008). That something important happens around 8–10 years is consistent with the observation that object recognition in a crossmodal priming paradigm is view-independent (as in adults; see the earlier section on shape perception) for children aged 9–10 years and older, but not for younger children (Jüttner et al., 2006). While proficiency at within-modal (visual or haptic) memory for objects is unaffected by age, this is not so for crossmodal memory: older adults exhibit a marked asymmetry such that performance is much worse when haptic encoding is followed by visual retrieval than in the reverse visual-haptic condition, in contrast to childhood and early adulthood, when crossmodal object recognition is unaffected by study/test modality (Lacey et al., 2007b; Norman et al., 2006).
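To make the notion of statistically optimal integration concrete, the following minimal sketch (in Python) implements the standard maximum-likelihood cue-combination rule, under which each modality’s estimate is weighted in proportion to its reliability (the inverse of its variance); the numerical values are illustrative assumptions, not data from Gori et al. (2008).

def integrate_ml(est_v, var_v, est_h, var_h):
    # Reliability-weighted (maximum-likelihood) combination: each single-modality
    # estimate is weighted by its inverse variance, with weights summing to 1.
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_h)
    w_h = 1.0 - w_v
    combined_estimate = w_v * est_v + w_h * est_h
    combined_variance = 1.0 / (1.0 / var_v + 1.0 / var_h)  # never exceeds either input variance
    return combined_estimate, combined_variance

# Illustrative (made-up) visual and haptic size estimates for one object, in mm:
size, variance = integrate_ml(est_v=52.0, var_v=4.0, est_h=48.0, var_h=16.0)
print(size, variance)  # 51.2, 3.2 -- pulled toward the more reliable (visual) estimate

Under this rule the combined estimate is drawn toward the more reliable cue and its variance is lower than that of either single-modality estimate, which is the sense in which adult observers are said to minimize the variance of their perceptual estimates.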

AUDIOTACTILE INTERACTIONS

Temporal frequency information is perhaps most important to audition, for example in speech and music perception (Pérez-Bellido et al., 2018b), but the ability to perceive temporal frequency by touch also contributes to tactile texture perception, as noted above (Bensmaia, 2009). Further, manipulating the frequency of the (unattended) sounds generated during touching textured surfaces influences tactile texture judgments, suggesting that auditory frequency is perceptually integrated with tactile texture (Guest et al., 2002). Thus, a reasonable question is whether audition and touch share a common representation of temporal frequency information and/or a common neural basis.

A number of psychophysical studies point to a shared representation of temporal frequency between audition and touch. For instance, in a two-interval forced-choice task where each of two successive intervals contained a stimulus with a specified temporal frequency, participants were asked to report which of the two intervals contained the higher frequency. Participants attended selectively to either auditory or tactile input while attempting to ignore distractors in the other modality. Symmetric audiotactile influences were found: the frequency perceived in the attended modality was biased toward the frequency presented in the unattended modality (Convento et al., 2019; see also earlier studies in a different frequency range). In the same way that visual training transfers to haptic perception and vice versa (Lacey et al., 2009), auditory adaptation effects transfer to the tactile domain: exposure to frequency-specific auditory noise improves subsequent discrimination of tactile frequency but not intensity (Crommett et al., 2017).

Neuroimaging and neurostimulation studies bear out the notion of convergent temporal frequency representations of tactile and auditory inputs. For instance, in an fMRI study, the left auditory cortex was found to respond to vibrotactile frequencies of 20 Hz and 100 Hz, both in the audible range, but not to a 3 Hz stimulus, which is below the audible range (Nordmark et al., 2012). An analogous fMRI study revealed that multiple regions of somatosensory cortex respond to auditory inputs in a frequency-specific manner, with the similarity of multivoxel spatial patterns of activity (see Footnote 4) in response to different frequencies correlating with the similarity of corresponding perceptual judgments, although the somatosensory cortical responses were less robust and noisier than their auditory cortical counterparts (Pérez-Bellido et al., 2018b). Moreover, TMS over primary somatosensory cortex impaired auditory frequency discrimination, but only when trials comprising unimodal auditory stimuli were interleaved with trials requiring (unimodal) tactile or (crossmodal) audiotactile frequency discrimination (Convento et al., 2018).

Conclusions

This brief review has provided some examples of multisensory interactions involving the tactile (haptic) system. Touch and vision represent object properties similarly in a variety of domains, and the neural representations of these properties converge in brain regions that should be considered as specialized for particular tasks rather than particular modalities of sensory input. Although work on audiotactile interactions is less well developed, a similar theme of perceptual and neural commonality emerges in the domain of temporal frequency. Tactile inputs can often be integrated with corresponding visual or auditory inputs, and dramatic illusions evoked by manipulating visuotactile interactions reveal that our very sense of body ownership depends critically on multisensory integration. Thus, we should no longer consider the various sensory systems as independent – rather, our goal should be to find out how they interact to produce the richness of our sensory experience.

Footnotes

1. Address correspondence to K. Sathian, Department of Neurology, Penn State Health Milton S. Hershey Medical Center, 30 Hope Drive, Mail Code EC037, Hershey, PA 17033, USA (ksathian@pennstatehealth.psu.edu).

2. Tactile gratings comprise surfaces with alternating ridges and grooves that are actively felt with the fingertip, or applied to the passive finger.

3. An example of a machine learning approach in which the fine-grained pattern of activity across multiple voxels [volumetric pixels] in a given area, as measured with fMRI, is compared between two or more experimental conditions and used to train a classifier that is then tested on data that were not used in training.
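As a schematic illustration of this type of analysis, the sketch below (in Python, using scikit-learn) trains a linear classifier to distinguish two conditions from simulated trial-by-voxel activity patterns and evaluates it with cross-validation; the data, dimensions and condition labels are invented for illustration and do not correspond to any study cited here.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Simulated multivoxel patterns: 40 trials x 200 voxels per condition, with a
# small mean shift between conditions (hypothetical "glossy" vs. "rough" trials).
n_trials, n_voxels = 40, 200
cond_a = rng.normal(loc=0.0, scale=1.0, size=(n_trials, n_voxels))
cond_b = rng.normal(loc=0.3, scale=1.0, size=(n_trials, n_voxels))

X = np.vstack([cond_a, cond_b])                # trial-by-voxel activity patterns
y = np.array([0] * n_trials + [1] * n_trials)  # condition labels

# Cross-validation ensures the classifier is always tested on trials it was not trained on.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print("mean decoding accuracy:", scores.mean())  # accuracy reliably above 0.5 (chance) would
                                                 # imply the patterns carry condition information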

4. An approach that has become popular is representational similarity analysis (RSA), in which (dis)similarity matrices based on the multivoxel spatial patterns of fMRI (or profiles of neurophysiologically recorded) activity can be compared to perceptual, physical or various model-derived (dis)similarity matrices to infer the relevant coding mechanisms.
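As a minimal sketch of this logic, the following Python fragment computes a neural and a perceptual dissimilarity matrix from simulated data and assesses their agreement with a rank correlation; the dissimilarity metrics, dimensions and variable names are illustrative assumptions only.

import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(1)

# Simulated inputs: one multivoxel pattern (100 voxels) per stimulus condition,
# plus a small set of perceptual ratings per condition (e.g. judged roughness).
n_conditions, n_voxels = 8, 100
neural_patterns = rng.normal(size=(n_conditions, n_voxels))
perceptual_ratings = rng.normal(size=(n_conditions, 3))

# Representational dissimilarity matrices, in condensed form (each unique pair of conditions once).
neural_rdm = pdist(neural_patterns, metric="correlation")
perceptual_rdm = pdist(perceptual_ratings, metric="euclidean")

# Rank correlation between the two RDMs: a reliably positive value would suggest that
# conditions with similar neural patterns are also perceived as similar.
rho, p_value = spearmanr(neural_rdm, perceptual_rdm)
print(rho, p_value)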

REFERENCES

1. Abdulkarim Z, Hayatou Z & Ehrsson HH (2021). Sustained rubber hand illusion after the end of visuotactile stimulation with a similar time course for the reduction of subjective ownership and proprioceptive drift. Experimental Brain Research, 239:3471–3486.
2. Amemiya T, Beck B, Walsh V et al. (2017). Visual area V5/hMT+ contributes to perception of tactile motion direction: a TMS study. Scientific Reports, 7:40937, doi: 10.1038/srep40937
3. Barsalou LW (2008). Grounded cognition. Annual Review of Psychology, 59:617–645.
4. Bensmaia S (2009). Texture from touch. Scholarpedia, 4(8):7956, doi: 10.4249/scholarpedia.7956
5. Botvinick M & Cohen J (1998). Rubber hands ‘feel’ touch that eyes see. Nature, 391:756.
6. Convento S, Rahman MS & Yau JM (2018). Selective attention gates the interactive crossmodal coupling between perceptual systems. Current Biology, 28:746–752.
7. Convento S, Wegner-Clemens KA & Yau JM (2019). Reciprocal interactions between audition and touch in flutter frequency perception. Multisensory Research, 32:67–85.
8. Crommett LE, Pérez-Bellido A & Yau JM (2017). Auditory adaptation improves tactile frequency perception. Journal of Neurophysiology, 117:1352–1362.
9. Cutts SA, Fragaszy DM & Mangalam M (2019). Consistent inter-individual differences in susceptibility to bodily illusions. Consciousness & Cognition, 76:102826.
10. Ehrsson HH (2020). Multisensory processes in body ownership. In Sathian K & Ramachandran VS (Eds.) Multisensory Perception: From Laboratory to Clinic, pp179–200. Academic Press: San Diego, CA, USA.
11. Gori M, Del Viva M, Sandini G & Burr D (2008). Young children do not integrate visual and haptic form information. Current Biology, 18:694–698.
12. Guest S, Catmur C, Lloyd D & Spence C (2002). Audiotactile interactions in roughness perception. Experimental Brain Research, 146:161–171.
13. Guest S & Spence C (2003). Tactile dominance in speeded discrimination of textures. Experimental Brain Research, 150:201–207.
14. Held R, Ostrovsky Y, de Gelder B et al. (2011). The newly sighted fail to match seen with felt. Nature Neuroscience, 14:551–553.
15. Jüttner M, Müller A & Rentschler I (2006). A developmental dissociation of view-dependent and view-invariant object recognition in adolescence. Behavioural Brain Research, 175:420–424.
16. Konkle T, Wang Q, Hayward V & Moore CI (2009). Motion aftereffects transfer between touch and vision. Current Biology, 19:745–750.
17. Kosslyn SM (1994). Image and Brain, Ch. 5. MIT Press, Cambridge, MA.
18. Kozhevnikov M, Kosslyn SM & Shephard J (2005). Spatial versus object visualisers: a new characterisation of cognitive style. Memory & Cognition, 33:710–726.
19. Lacey S, Campbell C & Sathian K (2007b). Vision and touch: multiple or multisensory representations of objects? Perception, 36:1513–1521.
20. Lacey S, Lin JB & Sathian K (2011). Object and spatial imagery dimensions in visuo-haptic representations. Experimental Brain Research, 213:267–273.
21. Lacey S, Pappas M, Kreps A, Lee K & Sathian K (2009). Perceptual learning of view-independence in visuo-haptic object representations. Experimental Brain Research, 198:329–337.
22. Lacey S, Peters A & Sathian K (2007a). Cross-modal object representation is viewpoint-independent. PLoS ONE, 2:e890, doi: 10.1371/journal.pone.0000890
23. Lacey S, Stilla R & Sathian K (2012). Metaphorically feeling: comprehending textural metaphors activates somatosensory cortex. Brain & Language, 120:416–421.
24. Lacey S, Stilla R, Sreenivasan K, Deshpande G & Sathian K (2014). Spatial imagery in haptic shape perception. Neuropsychologia, 60:144–158.
25. Lakoff G & Johnson M (1980). Metaphors We Live By. Chicago, IL, USA: The University of Chicago Press.
26. Lewkowicz DJ & Bremmer AJ (2020). The development of multisensory processes for perceiving the environment and the self. In Sathian K & Ramachandran VS (Eds.) Multisensory Perception: From Laboratory to Clinic, pp89–112. Academic Press: San Diego, CA, USA.
27. Locke J (1706/1997). An Essay Concerning Human Understanding. Penguin Classics: London, UK.
28. Lunghi C & Alais D (2015). Congruent tactile stimulation reduces the strength of visual suppression during binocular rivalry. Scientific Reports, 5:9413, doi: 10.1038/srep09413
29. Nordmark PF, Pruszynski JA & Johansson RS (2012). BOLD responses to tactile stimuli in visual and auditory cortex depend on the frequency content of stimulation. Journal of Cognitive Neuroscience, 24:2120–2134.
30. Norman JF, Crabtree CE, Norman HF, Moncrief BK et al. (2006). Aging and the visual, haptic, and cross-modal perception of natural object shape. Perception, 35:1383–1395.
31. Peissig JJ & Tarr MJ (2007). Visual object recognition: do we know more now than we did 20 years ago? Annual Review of Psychology, 58:75–96.
32. Pérez-Bellido A, Barnes KA, Crommett L & Yau JM (2018b). Auditory frequency representations in human somatosensory cortex. Cerebral Cortex, 28:3908–3921.
33. Pérez-Bellido A, Pappal RD & Yau JM (2018a). Touch engages visual spatial contextual processing. Scientific Reports, 8:16637, doi: 10.1038/s41598-018-34810-z
34. Sathian K (2016). Analysis of haptic information in the cerebral cortex. Journal of Neurophysiology, 116:1795–1806.
35. Sathian K, Zangaladze A, Hoffman JM & Grafton ST (1997). Feeling with the mind’s eye. NeuroReport, 8:3877–3881.
36. Sergent J, Ohta S & MacDonald B (1992). Functional neuroanatomy of face and object processing. A positron emission tomography study. Brain, 115:15–36.
37. Sun H-C, Welchman AE, Chang DHF & Di Luca M (2016). Look but don’t touch: Visual cues to surface structure drive somatosensory cortex. NeuroImage, 128:353–361.
38. Zangaladze A, Epstein CM, Grafton ST & Sathian K (1999). Involvement of visual cortex in tactile discrimination of orientation. Nature, 401:587–590.

RECOMMENDED READING

1. Ghazanfar AA & Schroeder CE (2006). Is neocortex essentially multisensory? Trends in Cognitive Sciences, 10:278–285. A provocative review challenging the traditional notion that sensory processing is segregated by system.
2. Lacey S & Sathian K (2020). Visuo-haptic object perception. In Sathian K & Ramachandran VS (Eds.) Multisensory Perception: From Laboratory to Clinic, pp179–200. Academic Press: San Diego, CA, USA. This recent book chapter offers an in-depth account of visuo-haptic interactions in object shape perception.
3. Cascio CJ, Simon DM, Bryant LK, DiCarlo G & Wallace MT (2020). Neurodevelopmental and neuropsychiatric disorders affecting multisensory processes. In Sathian K & Ramachandran VS (Eds.) Multisensory Perception: From Laboratory to Clinic, pp371–399. Academic Press: San Diego, CA, USA. This recent book chapter surveys a range of multisensory interactions in autism and schizophrenia.
4. Sathian K, Lacey S, Stilla R, Gibson GO, Deshpande G, Hu X, LaConte S & Glielmi C (2011). Dual pathways for haptic and visual perception of spatial and texture information. NeuroImage, 57:462–475. A report that the well-established dichotomy, in vision and hearing, of pathways for information about “what” an object is vs. “where” it is, is also found in touch.
5. Yau JM, Pasupathy A, Fitzgerald PL, Hsiao SS & Connor CE (2009). Analogous intermediate shape coding in vision and touch. Proceedings of the National Academy of Sciences, 106:16457–16462. A brief report demonstrating that single neurons intermediate in the sensory cortical hierarchies of both visual and tactile processing are tuned for curvature direction, suggesting a similarity of shape coding mechanisms in vision and touch.
