Abstract
An exciting possibility for compensating for loss of sensory function is to augment deficient senses by conveying missing information through an intact sense. Here we present an overview of techniques that have been developed for sensory substitution (SS) for the blind, through both touch and audition, with special emphasis on the importance of training for the use of such devices, while highlighting potential pitfalls in their design. One example of a pitfall is how conveying extra information about the environment risks sensory overload. Related to this, the limits of attentional capacity make it important to focus on key information and avoid redundancies. Also, differences in processing characteristics and bandwidth between sensory systems severely constrain the information that can be conveyed. Furthermore, perception is a continuous process and does not involve a snapshot of the environment. Design of sensory substitution devices therefore requires assessment of the nature of spatiotemporal continuity for the different senses. Basic psychophysical and neuroscientific research into representations of the environment and the most effective ways of conveying information should lead to better design of sensory substitution systems. Sensory substitution devices should emphasize usability, and should not interfere with other inter- or intramodal perceptual function. Devices should be task-focused since in many cases it may be impractical to convey too many aspects of the environment. Evidence for multisensory integration in the representation of the environment suggests that researchers should not limit themselves to a single modality in their design. Finally, we recommend active training on devices, especially since it allows for externalization, where proximal sensory stimulation is attributed to a distinct exterior object.
Keywords: Sensory substitution, neural plasticity, multisensory perception
1. Introduction
In his seminal epistemological work, De Anima, the Greek philosopher Aristotle made an important distinction between the individual senses and the common sense (koinē aisthēsis; see Ackrill, 1987). The individual senses (seeing, hearing, touch and smell) have their own domains, but many perceptual functions do not involve simple operations of individual senses; they involve a central holistic representation of the environment. Aristotle realized that perception of the environment is a multifaceted operation where all available senses come into play. His insight highlights not only that our representation of the world is holistic, but also that this representation is multimodal, since it involves information from all the senses. Borrowing Aristotle’s term, the common sense builds a representation of the environment that encompasses information from all senses. There are various examples of how information from one modality influences, or even alters, our perception of stimuli in another modality (McGurk & MacDonald, 1976; Shams, Kamitani, & Shimojo, 2000; Hötting & Röder, 2004), showing how important integration from different sensory modalities can be. Perhaps most importantly, this holistic representation of the environment is influenced by perceptual interpretation (Hoffman, 1998; Palmer, 1999; Rock, 1983). The stimulation from the external world (light, air pressure, tactile stimulation) is only part of the story in perception. From this stimulation, interpretive mechanisms of the nervous system create a useful representation of the external environment.
When individuals lose the function of one sense, such as through damage to sense organs, mechanisms concerned with perceptual interpretation are not necessarily affected. It should in principle be possible to find a way of conveying the missing information, if this information can still be processed. In the case of blindness that can be traced to damage to the retinae, there is still a lot of intact neural hardware devoted to visual function. As Bach-y-Rita and Kercel (2003) put it: “Persons who become blind do not lose the capacity to see” (p. 541). Information may be fed to these neural mechanisms via other channels, utilizing the fact that the main organ of perception is the nervous system. The brain is not just a sensory machine that responds separately to specific stimuli or sensory modalities, but rather a complex “task-machine” that can partially restore function with input from other senses (Maidenbaum et al., 2014a; Murray et al., 2016). This is the domain of sensory substitution (SS), where touch or audition, for example, convey information that is otherwise not available, such as vision. Sensory substitution devices (SSDs) have been available for a long time. The white cane for the blind translates environmental structure into haptic and proprioceptive feedback, sign language conveys language, normally carried by speech, through vision, and in Braille, “verbal” information is conveyed through haptic stimulation. Such devices have become increasingly sophisticated with advances in technology.
Here we present an overview of research on ways of conveying information about the external environment to the blind through other intact senses, with the aim of improving perceptual function and mobility. While we limit our focus to haptic and auditory feedback approaches for the blind, many issues mentioned here undoubtedly apply to other forms of sensory substitution. Our review provides a number of novel perspectives pertinent to sensory substitution that have not been addressed systematically in the literature. We discuss research on neural activity during sensory substitution. Additionally, we highlight the importance of training on sensory substitution devices, the best ways of implementing such training, and constraints stemming from inherent characteristics of sensory mechanisms. We also point out some pitfalls that any such enterprise must avoid.
2. Key general considerations for sensory substitution
The neural mechanisms of vision in humans are more complex than those of the other senses. The auditory nerve has around 30,000 fibers while the optic nerve contains over 1 million fibers (Wurtz & Kandel, 2000), and psychophysical measurements show that the information capacity of vision is considerably higher than that of audition (Ash, 1951). Brown, Simpson and Proulx (2014) found that there is an upper limit on the representation of auditory objects due to intrinsic limits of transmission in auditory cortical areas. Investigations of the capacity of the visual system reveal that it has four orders of magnitude greater bandwidth than haptic perception (Kokjer, 1987; Schmidt, 1981), while the information capacity of the human ear falls between these two estimates (Bialek et al., 1993; Dudel, 1986; Proulx et al., 2016; Wurtz & Kandel, 2000). Loomis, Klatzky and Giudice (2012) argue that, because of filtering of information in the cutaneous system, the bandwidth of touch is equivalent to that of blurred vision.
Vision loss leads to lessened mobility. Various attempts have been made at generating compensatory strategies where information about the environment is conveyed to the blind through audition and/or touch. The goal has been to implement this feedback in real time, to enable obstacle avoidance, scene analysis or object recognition through interactions with the environment. One of our central points is that any sensory substitution device (SSD) and related training must be based on a thorough understanding of the key aspects that need to be conveyed, and – no less importantly – a thorough understanding of the psychophysics of the perceptual channels involved. Notably, although the available evidence suggests that there is great potential in sensory substitution methods (see Bach-y-Rita & Kercel, 2003; Merabet & Pascual-Leone, 2010; Proulx et al., 2016; Shull & Damian, 2015; Nagel et al., 2005, for reviews), such devices are still not in widespread use (Elli, Benetti & Collignon 2014).
2.1. The need for basic research
Although many technologically sophisticated methods of sensory substitution have been developed, the basic research on cognition and perception necessary for the effective development of these technologies is often neglected (see e.g. Loomis et al., 2012). Important questions, such as how sounds and tactile stimulation are interpreted, and what sort of stimulation is most effective for conveying particular information about the environment to a particular sense, are not always considered in enough detail. We revisit this issue at various points throughout this review, since many of the points below relate to it. In the end, it boils down to the obvious point that information conveyed through sensory substitution must match the capabilities of the human nervous system to be useful. The most straightforward way of answering such questions is through psychophysical experimentation.
Devices have been developed that may exceed the capabilities of perceptual channels, such as haptic stimulation devices that exceed the two-point thresholds for touch of the stimulated part of the body (Weinstein, 1968; Gardner & Kandel, 2000). Note, however, that even if two-point thresholds are exceeded, it is possible that perceptual interpretation can compensate for this (see e.g. Novich & Eagleman, 2015; and discussion in Bach-y-Rita & Kercel, 2003). The pattern of stimulation may be detected, and perceptual interpretation can functionally result in better resolution than two-point thresholds dictate (Bach-y-Rita & Kercel, 2003). Findings on fundamental limits imposed by attentional capacity and potential sensory overload (section 2.5) demonstrate how basic perceptual and cognitive science can provide vital information for SSD design.
Loomis et al. (2012) argue that two main steps are needed to generate effective auditory or haptic substitution devices for the blind. The first is to determine what information is most critical for enabling a particular function. During design it is important to focus on the information that is most relevant for the task at hand, and this involves determining which aspects of the visual environment are functionally most important. If the aim is mobility and obstacle avoidance, the emphasis should be on identifying obstacles and conveying their location and size, rather than their color or shape. The second is to determine how to effectively convey this information to the haptic and auditory sensory systems. The basic processing properties of the perceptual channels in question need to be considered, and effective models for encoding the information into sound and haptics must be developed (see e.g. Brown et al., 2014), for instance by encoding obstacle direction through spatial audio (Blauert, 2013; Geronazzo et al., 2016) and/or appropriate haptic rendering techniques (van Erp et al., 2005). Neglect of this important point may have limited the usability of previous SSDs (see e.g. Elli et al., 2014).
2.2. Comfort and ease of use
Ease of use, comfort level, mobility and appearance need to be seriously considered if the blind are to be expected to use a particular device (Dakopolous & Bourbakis, 2010; Elli et al., 2014; Shull & Damian, 2015). Dakopolous and Bourbakis (2010) suggest the following principles for any sensory substitution system for vision:
i) Users’ hands should be free. Interesting substitution devices have been developed that are handheld (Hugues et al., 2015), but an optimal system would probably allow the hands to be free, or at least minimize their use for operation.
ii) Any auditory device should not interfere with the users’ ability to sense the environment directly (i.e. to hear environmental sounds). There are at least three reasons why this is important: general sensory overload, limited attentional resources (both discussed in more detail in section 2.5), and the fact that intended users are typically highly accustomed to using environmental sounds.
iii) The system needs to be easily wearable. Equipment that is too bulky or restricts mobility defeats its own purpose and will probably be rejected by intended users.
iv) The system should be simple and easy to use. Its operation and user interface should not be loaded with unnecessary features, but only the bare minimum required for successful operation and usability.
v) The equipment should not require extensive training. This last point made by Dakopolous and Bourbakis (2010) may be contentious, though: if equipment is developed that works but requires training, this should certainly not be held against it. Successful approaches that require considerable training have been developed (Kärcher et al., 2012). Notably, training may be especially important for successful externalization (see section 2.3), which is, in all likelihood, beneficial for successful sensory substitution.
2.3. Externalization
Consider the case of the widely used white cane. The cane provides auditory, tactile and proprioceptive information about the environment. However useful, the cane has many drawbacks, but it hammers home the important point that sensory substitution devices can become externalized (also called distal attribution; Auvray et al., 2005; Hartcher-O’Brien & Auvray, 2014): the haptic pressure picked up by mechanical receptors on the hand is experienced as occurring at the tip of the cane, akin to the “out-of-body” illusion (Maravita et al., 2003). Maravita et al. showed that through training, tools can become externalized so that visual-tactile interaction extends into far space (Maravita et al., 2001), and that representations can change following active tool use, which can extend the participants’ representation of peripersonal space (Maravita, Spence & Driver, 2003). Indeed, the cane can be experienced as an extension of the body (Serino et al., 2007). The ultimate aim of sensory substitution systems is to allow the user to build an effective mental representation of the environment, and externalized experience from tactile or auditory stimulation would help with this (see e.g. Loomis et al., 2012; Väljamäe & Kleiner, 2006).
Bach-y-Rita et al. (1969; see also White et al., 1970) developed a haptic stimulator array that conveyed information about the environment to blindfolded or blind observers. Following training with vibration applied to the observers’ backs, the sensation changed from being felt as vibration at the stimulation site to being felt as coming from the environment. White et al. (1970) referred to this as “seeing with the skin”, an example of externalization. Note, importantly, that this only occurred for observers who actively trained with the device, consistent with the findings of Maravita et al. (2001, 2003; see also discussion in Proulx et al., 2008; Ward & Meijer, 2010; Bach-y-Rita & Kercel, 2003; Stiles, Zheng & Shimojo, 2015). Similarly, participants in Bach-y-Rita (1972) reported externalized experiences following considerable training. It is only at this externalization stage that we can literally speak of sensory substitution. Ward and Wright (2014) argue that sensory substitution with good externalization might even be considered a form of acquired synesthesia (see also Farina, 2013).
Ward and Meijer (2010) reported that a user of the vOICe SSD exclaimed “It is sight!” about their experiences with auditory feedback. Note, however, that this participant (and others reporting similar experiences) previously had intact vision for a number of years, and the recruited cortical sites were likely previously used for vision. The same may therefore not necessarily hold for the congenitally blind. Another example of externalization comes from Maidenbaum et al. (2014b), where a blind participant described their experience of using the EyeCane: “I could feel the world stretching out before me. It was as if my hand could reach much further. As if the silent objects on the other side of the room were suddenly there” (p. 821). A further example comes from Nagel et al. (2005), who presented orientation information from a magnetic compass through a haptic belt. Following training, participants began to experience the magnetic feedback as an extra sense. Nagel et al. called this a sixth sense, since it involved qualitative changes in sensory experience.
2.4. Relation between sensory substitution and perceptual illusions
Studies of perceptual illusions have long given insight into the nature of perceptual processing (Gregory, 1998). This is no different in the context of sensory substitution. Hötting and Röder (2004) found that when a single tactile stimulus was accompanied by more than one auditory sound, this led to the perception of more than one touch. Interestingly, this was less pronounced for congenitally blind observers. This illusion is a haptic-auditory analogue of a visual-auditory illusion introduced by Shams, Kamitani and Shimojo (2000), who found that when a single visual flash was accompanied by a number of auditory beeps, observers incorrectly perceived the single flash as many flashes. This demonstrates the operation of interpretative mechanisms in perception (section 1). It is also of great interest that congenitally blind participants were not as prone to the auditory-tactile illusion as sighted participants, suggesting that multisensory integration may differ in the congenitally blind, perhaps because they have better auditory discrimination abilities than sighted people or because vision plays a role in crossmodal calibration (Gori et al., 2010; Occelli, Spence & Zampini, 2013). The McGurk effect is another example of a multimodal illusion, where lip movements and heard speech interact (McGurk & MacDonald, 1976). Interestingly for the current context, Fowler and Dekle (1991) found a tactile/auditory McGurk effect when auditory syllables were presented along with a haptic representation of the syllables.
The cutaneous rabbit illusion (Geldard & Sherrick, 1972) involves a rapid repeated stimulation sequence at several locations on the skin that causes the perception of stimulation at intervening spaces between the locations that are actually stimulated. The stimulation is felt as if a small animal (e.g. a rabbit) hopped along successive stimulation sites. Blankenburg et al. (2006), using fMRI, showed how somatotopic brain sites corresponding to the experienced illusory percept were activated although they were not directly stimulated. While such haptic perceptual interpretation raises challenges for vision-to-haptic substitution devices, it may also provide cues regarding the nature of haptic processing that can be utilized for sensory substitution. All this highlights the need for a deeper understanding of the psychophysical properties of the perceptual channels in question, and raises questions such as whether such illusions can be affected by training.
2.5. Attentional capacity and potential sensory overload
Humans can only process a limited amount of the information reaching their senses at a given time, since attentional capacity is severely limited (Broadbent, 1958; Treisman, 1960; Neisser, 1967; Most et al., 2005; Kristjánsson, 2006; Schneider & Shiffrin, 1977). A lot of information goes unnoticed because selective attention filters it out (Broadbent, 1958; see e.g. Driver, 2001 for review). Observers making eye movements across a visual scene miss large changes in the scene during the eye movements (Grimes, 1996; McConkie & Zola, 1979) and are surprisingly inept at noticing large changes to otherwise identical visual scenes if the changed scene is alternately presented with the original scene at a rapid rate with another visual event in between (O’Regan, Rensink, & Clark, 1999; Rensink, O’Regan, & Clark, 1997; see also Mack & Rock, 1998; Simons, 1996). Such attentional limitations also occur for haptic perception (Gallace, Tan & Spence, 2006) and audition (Vitevitch, 2003), suggesting that this is a general principle of attention and perception.
An informative example of how attentional focus can have profound implications for technological design comes from a study where information from flight instruments was projected onto the windshield of the cockpit of a passenger jet (Haines, 1991). The intention was that pilots would be able to simultaneously keep their eyes on the instruments and on the visual scene outside the cockpit. Nevertheless, during flight simulation test pilots frequently missed large obstacles as they came in for landing, such as jets right in the landing path on the runway. Their attention was so focused on the instruments that they could not attend to vital information right in front of their eyes. This result has clear implications for sensory substitution: information conveyed through headphones as a substitute for vision may be prioritized over other auditory information, so that little attention is paid to real-world sounds. Kärcher et al. (2012) argue that even though sensory substitution devices can be beneficial for navigation and orientation in the environment, they place a strain on attentional resources, which may detract from attention paid to other aspects of the environment. Note also that within-modality interference is probably larger than between-modality interference (e.g. Liu, 2001). This must be taken into account during the design of devices and training. Implementation that places minimal demands on attentional resources is essential. Processing of information may, however, become increasingly automatic with practice, so that normal attentional filtering may ensue.
Sensory overload must be avoided in the design of sensory substitution devices (White et al., 1970; see discussion in Elli et al., 2014). Auditory substitution must not interfere with the natural auditory environment (Härmä et al., 2004). To avoid sensory overload, only vital information should be conveyed, to guarantee that the SSD neither obstructs perception of the natural soundscape nor contradicts it (Collins, 1985). Loomis et al. (2012) go so far as to say: “The failure to recognize the processing constraints of perception and cognition has resulted in limited utility of sensory substitution devices for supporting real-world behaviors”. Again, this highlights the importance of basic research into sensory and attentional mechanisms for sensory substitution.
2.6. Issues regarding training
The literature does not provide clear and precise training guidelines for SSDs, although good evidence suggests that training is important for successful sensory substitution (Stronks et al., 2015). Externalization, for example, seems to be achieved only with considerable training. Sensory substitution studies typically involve training regimes, and the general finding is that performance improves with training (but see Stiles & Shimojo, 2015), although no studies have yet addressed this question systematically.
Increasing the capacity of the human mind has long been an aspiration, as exemplified by the recent trend for brain-training games (Green & Bavelier, 2008). Another example is the literature on video-game training, where it has been claimed that playing first-person action video-games can enhance attentional abilities and that this learning transfers readily to other tasks (Bavelier et al., 2012; Spence & Feng, 2010). Although promising, on closer inspection the video-game literature is fraught with confounds, ambiguities and overclaims regarding the causal influence of such training (Boot, Blakely, & Simons, 2011; Kristjánsson, 2013), and benefits from brain-training games appear to be task specific (Owen et al., 2010).
Solid evidence for enhanced brain capacity through training is therefore scarce. At the same time, however, the human nervous system shows impressive flexibility (Merabet & Pascual-Leone, 2010). Disabilities from brain damage can be lessened through training (Singh-Curry & Husain, 2008; Saevarsson, Halsband, & Kristjánsson, 2011). But in such cases deficits from the loss of neural function are overcome, while many brain- and video-game training approaches involve attempts to increase the functionality of intact neural mechanisms (see Shull & Damian, 2015, for discussion of SSDs for intact senses, and Wright, Margolis & Ward, 2015, for discussion of SS for low vision). So even if evidence for brain training is scarce in normal populations, the implications may not be as severe for situations involving dysfunctional senses. And there is indeed promising evidence regarding the use of virtual training environments such as virtual mazes or video-games in sensory substitution (Lahav, Schloerb, & Srinivasan, 2012; Merabet et al., 2012; Levy-Tzedek et al., 2016; Maidenbaum et al., 2014c; Maidenbaum & Amedi, 2015).
The brain is capable of considerable reorganization of neural function (Held et al., 2011; Rauschecker, 1995), and younger brains are typically more flexible (Hubel & Wiesel, 1970; Merabet & Pascual-Leone, 2010). Cortical areas responsive to audition expand in visually deprived cats (Rauschecker & Korte, 1993) and their neurons become more sharply tuned to location on the azimuth (Korte & Rauschecker, 1993; see also Rauschecker & Kniepert, 1994; King & Parsons, 1999). Extensive practice on particular dimensions can lead to structural changes in neural mechanisms (Pascual-Leone et al., 2005), as found for Braille reading (Hamilton & Pascual-Leone, 1998; Pascual-Leone & Torres, 1993) and music (Elbert et al., 1995), and cortical representations of musical tones are enlarged in experienced musicians (Pantev et al., 1998; Schlaug et al., 1997). Note that sensory substitution may not require training if crossmodal mappings are strategically utilized (Stiles & Shimojo, 2015). Intrinsic crossmodal mappings may play as important a role in sensory substitution pattern recognition as crossmodal plasticity, and training based on crossmodal mappings may improve performance and shorten training times.
3. Neural evidence regarding sensory substitution
Evidence from neuroimaging and transcranial magnetic stimulation (TMS) highlights the general point that the different senses do not operate in isolation, again reminding us of Aristotle’s considerations concerning multimodal representations of the environment. Furthermore, recruitment of unused visual cortex can help compensate for the loss of vision (Pasqualotto & Proulx, 2012; Merabet & Pascual-Leone, 2010). The main point that has emerged from this research is that brain regions often considered ‘visual’ or ‘auditory’ should not necessarily always be considered as such, but may serve supramodal functions (Ricciardi, Handjaras, & Pietrini, 2014; Ricciardi, Bonino, Pellegrini & Pietrini, 2014; see Proulx et al., 2016 for review).
Tactile stimulation can cause activity in neural mechanisms devoted to other senses. Poirier et al. (2007; see also Renier et al., 2005) showed how tactile stimulation can change activity in visual areas. Blind participants show activation of primary and secondary visual cortex from tactile stimulation during Braille reading (Sadato et al., 1996). Ortiz et al. (2011) argue that the magnitude of activity in visual cortex during tactile stimulation correlates with observers’ experiences of illusory flashing lights (phosphenes). Interestingly, Striem-Amit et al. (2015) show that even without visual input, the functional organization of visual cortex follows retinotopic organizational principles (see also Wang et al., 2015).
In Amedi et al. (2001, 2002) the lateral occipital complex (LOC), an area involved in visual object perception, was also activated during haptic object perception in sighted individuals. Amedi et al. (2010) found that tactile object identification activated visual object-recognition regions within LOC in blind individuals, similarly to sighted controls. Furthermore, Reich et al. (2011) found that the so-called visual word form area in the fusiform gyrus (VWFA; Cohen et al., 2000; Sigurdardottir et al., 2015) was activated during Braille reading. Activity has been found in the so-called Fusiform Face Area (FFA; Kanwisher, McDermott & Chun, 1997) during haptic face recognition in both blind (Goyal, Hansen & Blakemore, 2006) and sighted participants (Kilgour et al., 2004). The FFA has been thought to be primarily visual, but representations of the environment probably involve integration of information from other senses, and activity in ‘visual’ areas through stimulation of other modalities may therefore be expected. Occipital visual regions that are heavily involved in judging orientation are also activated by discrimination of tactile orientation applied to the finger (Sathian et al., 1997; de Volder et al., 1999; Kupers et al., 2010). Similarly, Stevens and Weaver (2009) found that congenitally blind participants showed alterations in auditory cortex consistent with more efficient processing of auditory stimuli. Also of interest is hippocampal activity during tactile maze solving in congenitally blind individuals (Gagnon et al., 2012), since such activity is typically associated with visual information processing. Furthermore, using visual-to-music sensory substitution, Abboud et al. (2015) showed that when congenitally blind participants manipulate symbols as if they were numbers, there is activity in the right inferior temporal gyrus, a region involved in visual number processing. Touch can even modulate visual imagery (Lacey et al., 2010).
Sound source discrimination, auditory motion discrimination, auditory change detection and sound localization all result in occipital activations in blind subjects (Kujala et al., 2005; Poirier et al., 2006; Voss et al., 2008; Weeks et al., 2000). This has been shown to occur early in development (by 4 years; Bedny, Richardson & Saxe, 2015). An intriguing possibility is that precise localization provided by retinotopic visual cortex may allow more accurate auditory (or haptic) localization if visual areas are recruited by audition (Striem-Amit et al., 2015; Wang et al., 2015).
Kupers et al. (2010) measured neuronal activity as blind participants used an SSD involving stimulation of the tongue, finding that the parahippocampus and visual cortex were highly activated. These areas play a fundamental role in visual spatial representation and topographical learning in sighted individuals performing comparable tasks. The visual and parietal cortices are also activated during Braille reading in the congenitally blind (Büchel et al., 1998), together with parietal association areas, the superior visual association cortex and the cerebellum. For late-blind individuals, unlike the congenitally blind, the primary visual cortex and the right medial occipital gyrus were activated. The results for both groups show plastic crossmodal responses in extrastriate cortex. Additionally, tactile motion discrimination activates dorsal visual stream areas in the congenitally blind (Ptito et al., 2009).
Ptito et al. (2005) also studied cross-modal plasticity in congenitally blind participants receiving information about the environment through tongue stimulation. They found that cross-modal plasticity developed rapidly and that the occipital cortex was involved in tactile discrimination. De Volder et al. (1999) conveyed feedback through an ultrasonic echolocation device, finding high metabolism rates in occipital visual areas. Echolocation has also been found to lead to activity in higher-order visual areas for various tasks, such as material processing (Milne et al., 2015) and motion processing (Thaler et al., 2014). Arno et al. (2001) found that visual-to-auditory encoding of camera-based images resulted in occipital activation in blind subjects. Calvert et al. (1999) also found that the auditory cortex of normal observers was activated as they attempted, in silence, to read speakers’ lips. Importantly, active interaction with auditory sensory substitution stimuli seems to cause more neural activity in the visual cortex than passive processing (Murphy et al., 2016).
Disrupting function with transcranial magnetic stimulation (TMS) can answer questions regarding the neural mechanisms involved in sensory substitution. Zangaladze et al. (1999) found that TMS over regions typically considered primarily visual disrupted tactile discrimination. Additionally, disruption of visual cortex with TMS interferes with Braille reading (Cohen et al., 1997; Kupers et al., 2007). Braille reading has also been found to be impaired following lesions to occipital areas (Hamilton et al., 2000). Moreover, TMS of occipital visual areas can induce tactile sensations in Braille readers, similar to visual phosphenes in the sighted (Cowey & Walsh, 2000; Penfield & Boldrey, 1937; Tehovnik et al., 2005).
Studies of sensory deprivation in animals are also of interest in this context. Visual deprivation in kittens from birth sharpens auditory tuning (Rauschecker, 1995). Such processes may underlie sensory substitution in humans, as unused cortex may be taken over by other functions (Blakemore & Cooper, 1970; Wiesel, 1982). Such evidence suggests that the loss of one perceptual channel can be compensated for by higher resolution activity in another (Gougoux et al., 2004; Stevens & Weaver, 2009). It is controversial, however, whether blind individuals have better tactile discrimination acuity than the sighted, and this may depend on the task. For example, Alary et al. (2009) found no difference in tactile grating orientation discrimination or vibration frequency judgment between blind and sighted participants, while blind participants outperformed the sighted on a texture discrimination task. In Alary et al. (2008), the blind outperformed the sighted on haptic angle discrimination. Alary et al. (2009) speculate that this may be due to practice (e.g. with Braille reading). Wan et al. (2010) tested congenitally, early- and late-blind participants, finding enhanced auditory capacities only for those who became blind early. Participants who became blind at 14 years or later showed no benefits over sighted participants. Finally, we note that certain “visual” functions are heavily influenced by when visual deprivation occurs and by the timing of sight restoration (Collignon et al., 2015; McKyton et al., 2015). Such findings have strong implications for sensory substitution.
4. Research on similarities and differences between perceptual processing in different modalities
In an elegant experiment, Gibson (1962) demonstrated the difference between passive haptic perception and haptic perception of a moving stimulus. When a cookie-cutter was pushed onto participants’ palms but remained otherwise stationary, identification rates for its pattern were just under 50%, while if the cutter was moved around on the observers’ palm, recognition rose to about 95% correct (see also Novich & Eagleman, 2015). This cleverly demonstrated how perception is a continuous process and does not involve a snapshot of the environment. Design of sensory substitution devices therefore requires assessment of the nature of spatiotemporal continuity for the different senses. There is no reason to think that this issue of continuity differs fundamentally between the senses (see e.g. Gallace & Spence, 2008), although particular details may differ. In fact, as argued above, our representation of the environment is multimodal (see e.g. Driver & Spence, 2000), raising the question of the relative roles of unimodal versus multimodal processing. The literature on multimodal interactions (see e.g. Deroy, Chen & Spence, 2014; Driver & Spence, 2000; Koelewijn, Bronkhorst & Theeuwes, 2010; Kristjánsson, Thorvaldsson, & Kristjánsson, 2014; Stein & Meredith, 1993) makes clear that a holistic picture of perceptual representation cannot be constructed without understanding multisensory interactions. Neural activity related to multisensory interactions can be seen surprisingly close to cortical areas typically considered unimodal (Driver & Noesselt, 2008). In fact, studies of multisensory interplay make the case that representations are inherently multimodal whenever they involve input from more than one sense in the first place. In the current context, this raises the very interesting question of the nature of such representations for someone who lacks a sense.
Humans use various forms of depth information for navigation. The sensory channels differ both in bandwidth and in what information they most effectively transmit. The conveyed information should therefore be chosen in light of the specific aims of a device. Taking the case of sensory substitution for the blind, it is probably infeasible to convey the rich detail of visual experience.
Can the resolution of, say, hearing as compared to vision be estimated? In a pioneering study, Kellogg (1962) found that blind participants could use echolocation to detect size differences (see also Ammons, Worchel & Dallenbach, 1953). These abilities vary with practice, however. Rice, Feinstein and Schusterman (1965) tested the abilities of blind observers to echolocate metal disks of various sizes at varied distances. Performance varied considerably, but the average detectable disk size was 4.6 cm at a distance of 67 cm. Teng and Whitney (2011) found that sighted novices quickly learned to discriminate size and position through echolocation with notable precision (see also Kolarik et al., 2014). In Lakatos (1991), participants were able to recognize letters traced out with specific auditory patterns in a speaker array, indicative of considerable localization abilities. Cronly-Dillon, Persaud and Blore (2000) trained observers to recognize visual patterns encoded as sound through a tonotopic representation (see Väljamäe & Kleiner, 2006), finding that observers were able to recognize objects such as buildings and cars. Other psychophysical research shows that humans can recover physical properties of sound-producing objects, such as their dimensions (Carello, Anderson, & Kunkler-Peck, 1998; Lakatos, McAdams, & Caussé, 1997; Houben, Kohlrausch, & Hermes, 2004), shape (Kunkler-Peck & Turvey, 2000; Grassi, 2005), or material (Klatzky, Pai, & Krotkov, 2000; Giordano & McAdams, 2006). These results show that, in principle, discriminations routinely performed through vision can be performed with audition up to a certain, undeniably limited, degree, particularly following training.
What does research tell us about how to convey visual information through touch? Apkarian-Stielau and Loomis (1975) presented alphabetic letters haptically and visually, finding that recognition across the modalities became similar only when the visual presentation was considerably blurred: tactile form perception resembles low-pass filtered vision. In another study, Ernst and Banks (2002) observed that while both haptics and vision conveyed accurate information about the height of a bar, the haptic estimates were much less reliable than the visual ones. Brown et al. (2014) make the case that there may be upper resolution limits for the sonification of visual information. But they also emphasize that effective resolution can be increased through patterns of stimulation (Gibson, 1962; Loomis et al., 2012; Novich & Eagleman, 2015).
Spatial localization accuracy is at least an order of magnitude poorer for haptics than for vision (Apkarian-Stielau & Loomis, 1975; Loomis & Apkarian-Stielau, 1976). Two-point thresholds (the smallest separation at which two taps are felt as distinct) vary across the body, from ∼3-4 mm on the fingers to over 40 mm on the thighs and calves (Weinstein, 1968). This may set an upper limit for SSDs that use haptics, and may furthermore only apply to low-noise implementations. Sampaio, Maris and Bach-y-Rita (2001) found that acuity, measured with a haptic rendering of a Snellen visual acuity chart on a tongue-based SSD, was 20/860 on average but doubled after 9 hours of training. Interestingly, their participants reported that they located the objects in space rather than at the stimulation site, an example of externalization.
In sum, these studies highlight that there are large differences in bandwidth between the senses. As it is only feasible to convey a limited amount of information, this needs to be taken into account during SSD design, and at least during initial design the focus should be on what is needed for the intended task (see also Elli et al., 2014). And again, the findings highlight the importance of psychophysical experimentation for insights regarding sensory substitution.
5. Sensory substitution devices for vision through haptics
Using haptics to convey information to the blind about the external environment beyond their reach is strongly constrained by the intricacies of haptic processing. Important considerations follow from any attempt to convey information that the system has not evolved to process. With such devices, information is applied through stimulation projected onto some surface of the body, often in an isomorphic way, such that a particular location on the body corresponds to a particular location in space, and neighboring stimulation sites correspond to neighboring locations.
In a pioneering study, Bach-y-Rita et al. (1969) developed a chair that conveyed tactile stimulation through a 20 by 20 stimulation grid applied to participants’ backs. The conversion algorithm translated pixel brightness into tactile stimulation. Following 5 to 15 hours of training, participants were able to recognize objects and experience them as if an image appeared in front of them (White et al., 1970). Some highly trained observers were able to report remarkable detail, such as whether people wore glasses, and even people’s identity. This was an important proof of concept, although its use in practical settings was limited since the device was immobile.
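The core of such a conversion can be made concrete with a short sketch. This is a minimal reconstruction in Python of the general scheme (block-averaging camera brightness down to one drive level per vibrator), not Bach-y-Rita’s actual hardware pipeline; only the 20 by 20 grid size is taken from the description above.

```python
import numpy as np

def image_to_tactor_grid(image, grid=(20, 20)):
    """Downsample a grayscale image (2D array, values in [0, 1]) to one
    drive intensity per vibrator: brighter regions vibrate more strongly."""
    rows, cols = image.shape
    gr, gc = grid
    # Trim so the image divides evenly into grid cells, then block-average.
    trimmed = image[:rows - rows % gr, :cols - cols % gc]
    cells = trimmed.reshape(gr, trimmed.shape[0] // gr,
                            gc, trimmed.shape[1] // gc)
    return cells.mean(axis=(1, 3))  # shape (20, 20): one value per tactor
```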
The acuity of the less acute sense places an upper limit on what can be conveyed in sensory substitution. Research on tactile sensitivity would, at first glance, seem to indicate that the hands, and especially the fingertips, would be ideal since their spatial acuity is high. But for usability the hands should be free for other purposes, and haptic stimulation should be applied to passive parts of the body. Koo et al. (2008) developed a vibration device for the fingertips (see also Shah et al., 2006), but such methods may fail the usability test (Dakopolous & Bourbakis, 2010). Handheld devices have been developed (Hugues et al., 2015), but an optimal system should free the hands for other uses, especially if increased mobility and use in daily functioning is the goal. There is a challenging tradeoff here, since it is logical to haptically stimulate the most sensitive parts of the body.
In other words, the way that information about the environment is encoded through haptics must be chosen with care, and stimulation sites that impede other activities the least should be preferred. One promising approach involves conveying vibration through a belt, based on information from video cameras (McDaniel et al., 2008). Van Erp et al. (2005) used a haptic belt, coding distance with vibration rhythm while direction was translated into vibration location. While cueing direction by location worked well, conveying distance in this way was not successful (for other implementations see Cosgun, Sisbot & Christensen, 2014; Johnson & Higgins, 2006; Segond et al., 2013).
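A rough sketch of this kind of belt encoding follows (direction to motor location, distance to pulse rhythm). The motor count, maximum range and pulse intervals are illustrative assumptions, not the parameters used by van Erp et al. (2005).

```python
def belt_command(bearing_deg, distance_m, n_motors=8, max_range_m=5.0):
    """Map an obstacle to a waist-belt vibration command: which motor to
    drive (direction) and the interval between pulses (distance)."""
    # Direction: drive the motor whose position best matches the bearing.
    motor = round((bearing_deg % 360) / (360 / n_motors)) % n_motors
    # Distance: nearer obstacles pulse faster (shorter inter-pulse interval).
    proximity = 1.0 - min(distance_m, max_range_m) / max_range_m
    interval_s = 1.2 - proximity  # 1.2 s at maximum range, 0.2 s up close
    return motor, interval_s

# An obstacle 45 degrees to the right at 1 m: belt_command(45, 1.0) -> (1, 0.4)
```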
Devices have also been developed where haptic stimulation is applied to the tongue, because of its high sensitivity. The Tongue Display Unit (TDU; Bach-y-Rita et al., 1998; see also Chebat et al., 2011; Tang & Beebe, 2006) is a vision-to-tactile SSD that translates images into tongue stimulation. Kupers et al. (2010; see also Nau et al., 2015) trained and tested the virtual navigational skills of ten congenitally blind and ten blindfolded sighted controls in active (route navigation) and passive (route recognition) tasks using the TDU. At the end of each training day, participants were asked to draw the routes in order to evaluate learning. Following training, participants showed considerable knowledge of the routes that they had navigated.
6. Sensory substitution devices for vision through audition
Many attempts have been made to convey information about the environment to the blind through auditory stimulation. While echolocation approaches are one option, yielding promising results (e.g. Ifukube, Sasaki, & Peng, 1991; Kolarik et al., 2014), the required sound emission makes this option in many ways impractical: the noise pollution would likely deter many blind people from using such a device. Camera-based auditory feedback devices may therefore be more practical. We note that promising GPS-based sensory substitution devices exist (Loomis et al., 1998, 2005; Marston et al., 2006, 2007; van Erp et al., 2005), but they fall beyond the current scope.
One important consideration for auditory sensory substitution is whether to convey information with sounds or speech. Loomis, Golledge and Klatzky (1998) showed that virtual sounds resulted in better navigation performance than verbal commands. Väljamäe and Kleiner (2006) suggest that one practical way of conveying information may be to generate sounds sequentially, with the sequence corresponding to another, spatial, dimension. Brightness could be denoted with loudness, size with pitch, and direction with timbre, and so on (a loud, high-pitched stimulus of a certain timbre would then indicate a large, bright object to the right, etc.). The vOICe sonic imaging system (Meijer, 1992; OIC: “Oh, I see”) transforms elements of a digital image of the environment into a sound pattern: the vertical position of a pixel is mapped to frequency and its brightness to sound intensity, while the horizontal dimension is conveyed by scanning the image from left to right over time. A recent study (Haigh et al., 2013) indicates that the accuracy of visual-to-auditory sensory substitution (with the vOICe) is promising.
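To make the scheme concrete, here is a minimal Python sketch of a vOICe-style column scan (row to frequency, brightness to amplitude, left to right in time). The frequency range, scan duration and sampling rate are illustrative assumptions, not Meijer’s published parameters.

```python
import numpy as np

def soundscape(image, sr=22050, scan_s=1.0, f_lo=500.0, f_hi=5000.0):
    """Convert a grayscale image (rows x cols, values in [0, 1]) into a
    left-to-right auditory scan: higher rows -> higher frequencies,
    brighter pixels -> louder sinusoids."""
    rows, cols = image.shape
    freqs = np.geomspace(f_hi, f_lo, rows)  # row 0 (top) gets the highest pitch
    n = int(sr * scan_s / cols)             # samples per image column
    t = np.arange(n) / sr
    tones = np.sin(2 * np.pi * freqs[:, None] * t)  # one sinusoid per image row
    # Each column becomes a short chunk: its pixels weight the row sinusoids.
    chunks = [(image[:, c:c + 1] * tones).sum(axis=0) for c in range(cols)]
    wave = np.concatenate(chunks)
    return wave / (np.abs(wave).max() + 1e-9)  # normalize to [-1, 1]

# A diagonal rising to the right, np.flipud(np.eye(64)), gives an ascending sweep.
```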
Striem-Amit et al. (2012a) designed a visual-to-auditory version of the Snellen E acuity chart, testing it on eight congenitally blind and one early-onset blind subject who received 73 hours of training with the vOICe system in weekly sessions. The training program involved two strategies: two-dimensional training, where participants were taught how to visualize static images through sound (geometrical shapes, digital numbers, objects, faces), and live-view training, aimed at providing visual depth perception through a mobile version of the SSD. During testing, observers were presented with E’s of different sizes and orientations. The sound was played until participants indicated the orientation of the E by keypress. Participants performed better than in experiments with tactile SSDs (Chebat et al., 2007; Sampaio et al., 2001) and retinal prostheses (Zrenner et al., 2010), and 55% showed acuity above the threshold for blindness (see also Levy-Tzedek, Riemer, & Amedi, 2014).
Ward and Meijer (2010) found that two blind users of the vOICe reported visual-like perception (externalization) acquired over several months of use. At first, both users reported that the visual details provided by the device were weak, flat and monochrome. Later, following months or even years of use, they experienced smooth movement and depth, although the SSD did not convey this type of information. One participant even reported seeing color, but this probably reflected the participant’s memory of color (the participant became blind at 21). Ward and Meijer argued that even though the vOICe system enabled visualization through training, users’ experiences depend on previous visual experience. This may be due to cross-modal plasticity and experience-driven multisensory associations, reflecting an interaction between the auditory mapping triggered by use of the SSD and previous sight experience.
Auvray, Hanneton and O’Regan (2007) found that following 15 hours of training with the vOICe, participants could identify objects on a table, including new objects (see Proulx et al., 2008 for similar conclusions). Kim and Zatorre (2008) found that their participants were able to associate objects and soundscapes without knowing the corresponding matching algorithm, suggesting that participants learned abstract matching rules.
Amedi et al. (2007) demonstrated that a congenitally blind and a late-blind subject could identify objects by recognizing their shape from visual-to-auditory encodings provided by the vOICe. Moreover, the lateral-occipital tactile-visual area (LOtv) was activated only for participants who received extensive training, not for those who simply learned associations between objects and soundscapes. Striem-Amit et al. (2012b) trained eight congenitally blind and seven sighted control observers on object identification and classification with the vOICe. The visual word form area (VWFA; Cohen et al., 2000; Sigurdardottir et al., 2015) was activated in the blind when they processed letter soundscapes following a 2-hour training session (see also Reich et al., 2011).
The PSVA sensory substitution device (Arno et al., 2001; Capelle et al., 1998) uses images captured by a miniature head-mounted TV camera that are simplified, with high resolution centrally and lower resolution peripherally (analogous to the human retina). Each pixel is associated with a particular sound frequency, from left to right and from top to bottom. Arno et al. (2001) studied activations of neural structures in early blind and sighted controls when the PSVA was used for pattern recognition. During training, participants learned to move a pen on a graphic tablet to scan patterns presented on a monitor. Both before and after training, the visual cortex of the early blind was more activated during pattern recognition than during two auditory control tasks: a detection task with noise stimuli and a detection task with familiar sounds.
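The retina-like simplification stage can also be sketched briefly. This is our rough reconstruction of the general idea, not the PSVA’s actual implementation; the fovea and block sizes are illustrative assumptions. Each cell of the output could then be assigned its own frequency, left to right and top to bottom, as described above.

```python
import numpy as np

def foveated_simplify(image, fovea=16, block=8):
    """Simplify a grayscale image retina-style: coarse block averages in
    the periphery, full resolution in a small central patch."""
    rows, cols = image.shape
    assert rows % block == 0 and cols % block == 0
    # Block-average the whole image to get the coarse peripheral version.
    blocks = image.reshape(rows // block, block,
                           cols // block, block).mean(axis=(1, 3))
    out = np.kron(blocks, np.ones((block, block)))  # back to the pixel grid
    # Paste the untouched high-resolution 'fovea' into the center.
    r0, c0 = (rows - fovea) // 2, (cols - fovea) // 2
    out[r0:r0 + fovea, c0:c0 + fovea] = image[r0:r0 + fovea, c0:c0 + fovea]
    return out
```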
In the image-to-sound conversion algorithm of EyeMusic (Abboud et al., 2014), higher musical notes represent pixels located higher on the y-axis, the delay of sound presentation corresponds to the location of a pixel on the x-axis (a larger delay for sounds from the right and a smaller one for sounds from the left), while color is encoded by the timbre of different musical instruments. In Levy-Tzedek et al. (2012), blindfolded participants were trained on the EyeMusic in order to improve their ability to perform fast and accurate movements towards targets. After a 25-minute familiarization session and brief training, participants were able to reach the targets (using the SSD) with high accuracy and efficiency, similar to that achieved during a visually-assisted task (see also Levy-Tzedek, Riemer & Amedi, 2014). Finally, we note an interesting new avenue that involves “zooming in” with the EyeMusic: even users with no visual experience can integrate information across successive zooms (Buchs et al., 2015).
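As a concrete illustration of this mapping, the sketch below turns a small color-indexed image into a list of timed note events (column to onset delay, row to pitch, color index standing in for instrument timbre). The note set and the abstract color indices are our assumptions; the actual EyeMusic scale and color-to-instrument assignments may differ.

```python
import numpy as np

# Illustrative note set (MIDI numbers, low to high); not EyeMusic's actual scale.
SCALE = [48, 50, 53, 55, 57, 60, 62, 65, 67, 69]

def eyemusic_style_events(image, scan_s=2.0):
    """image: 2D integer array of color indices (0 = silent background).
    Returns (onset_s, midi_note, color_index) events: left columns sound
    first, higher image rows map to higher notes, color selects the timbre."""
    rows, cols = image.shape
    events = []
    for r in range(rows):
        # Row 0 is the top of the image, so it gets the highest note.
        note = SCALE[(rows - 1 - r) * (len(SCALE) - 1) // max(rows - 1, 1)]
        for c in range(cols):
            if image[r, c]:
                events.append((scan_s * c / cols, note, int(image[r, c])))
    return sorted(events)

# Example: one color-1 pixel top-left, one color-2 pixel bottom-right:
# eyemusic_style_events(np.array([[1, 0], [0, 2]]))
```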
The EyeCane (Maidenbaum et al., 2014b) provides distance estimation and obstacle detection through narrow-beam infrared sensors. The user receives spatial information by sweeping the device across the scene. Distance is converted into pulsating sounds (delivered via headphones) and vibrations (felt in the palm of the hand). In Chebat, Maidenbaum and Amedi (2015), fifty-six participants were divided into four groups: early blind, low-vision and late blind, blindfolded sighted, and sighted. They performed navigation tasks in real and virtual environments. All participants improved their navigational and spatial perception skills, reflected in fewer errors and collisions in the maze as they learned to use the EyeCane.
7. The possibility of combining haptics and audition for sensory substitution
Both haptics and audition show promise for use in sensory substitution. If the aim is to convey the qualities that vision provides for sighted people, however, the effective resolution of both of these perceptual channels is a drawback (see discussion above). Although this presents many challenges, a feedback system that codes the environment through haptics and audition in conjunction may compensate for the drawbacks of each individual sense, if strengths are strategically utilized and weaknesses avoided (Shull & Damian, 2015). We may, for example, speculate that tactile stimulation may convey direction accurately (Van Erp et al., 2005), while audition may be more conducive to conveying information about the nature of the stimuli and their distance. A device that uses both channels could thus play to the strengths of each.
There is now good evidence that brain regions considered primarily devoted to a particular sense also respond to stimulation from other senses (Proulx et al., 2016; Ricciardi et al., 2014a; see section 3). The common sense (using Aristotle’s terminology) involves integrated input from all modalities, and this can be utilized in SSD design. While evidence of multimodal processing (Driver & Spence, 2000) may present challenges for sensory substitution, the converse might also be argued: such processing may bode well for any sensory substitution device, and may in many ways be a prerequisite for a successful device, as long as these principles are considered during design. For example, evidence of multisensory perceptual learning (Proulx et al., 2014) is encouraging for the idea of multisensory sensory substitution. But it is also important to note some caveats.
Multisensory integration can in some cases be worse in those with impaired senses. Gori et al. (2014) tested the ability of blind participants to judge the relative position of a sound source in a sequence of three spatially separated sounds. They found impaired auditory localization in congenitally blind participants (∼4 times the normal threshold). This may mean that visual experience is important for developing a representation of the environment even for another sense, consistent with the idea of crossmodal calibration (Gori et al., 2010; Occelli et al., 2013): crossmodal connections might strengthen unimodal representations. Also, some aspects of haptic perception may actually be worse in the blind, which has implications for the use of haptic feedback. Vision may be necessary for successful cross-calibration (Hötting & Röder, 2009; Occelli et al., 2013; Wallace et al., 2004). On the other hand, the early blind outperform the sighted in pitch discrimination (Gougoux et al., 2004) and also show better temporal resolution for auditory stimuli (Muchnick et al., 1991; Stevens & Weaver, 2009). Occelli et al. (2013) highlight that loss of vision may affect the establishment of peripersonal spatial coordinates, which then affects haptic and auditory processing. This raises the crucial question of how well we actually understand the representation of space in the blind, reminding us both of the need for basic psychophysical research aimed at understanding perceptual processing for SSD design, and that principles discovered in one scenario may not apply to others. The necessity of experience for calibration, and the evidence of considerable neural reorganization (such as following late-onset blindness), present important challenges for sensory substitution (see discussion in Pasqualotto & Proulx, 2012).
8. Conclusions
We highlight the following main points regarding the design of sensory substitution devices (SSDs):
1. Conveying unnecessary information about the environment risks sensory overload, and limited attentional capacity raises challenges for SSD design. In many cases only critical information should be conveyed.
2. Differences in bandwidth between sensory systems severely constrain the nature and amount of information that can be conveyed. SSDs should therefore be task-focused.
3. Basic psychophysical research into representations of the environment and the most effective ways of conveying information will lead to better design of SSDs.
4. SSDs must not interfere with other perceptual function.
5. Strong evidence for multisensory integration suggests that design should not necessarily be confined to one feedback modality.
6. Active training appears to be crucial, especially since following training, proximal stimulation is often attributed to exterior objects. Such externalization may be an important test of the efficacy of SSDs.
7. Perception is a continuous process and does not involve a snapshot of the environment. Design of sensory substitution devices therefore requires assessment of the nature of spatiotemporal continuity for different senses.
References
- Abboud S., Hanassy S., Levy-Tzedek S., Maidenbaum S., & Amedi A. (2014). EyeMusic: Introducing a “visual” colorful experience for the blind using auditory sensory substitution. Restorative Neurology & Neuroscience, 32(2), 247–257. [DOI] [PubMed] [Google Scholar]
- Ackrill J.L. (1987). A New Aristotle Reader. Oxford: Oxford University Press. [Google Scholar]
- Alary F., Duquette M., Goldstein R., Chapman C.E., Voss P., La Buissonnière-Ariza V., & Lepore F. (2009). Tactile acuity in the blind: A closer look reveals superiority over the sighted in some but not all cutaneous tasks. Neuropsychologia, 47(10), 2037–2043. [DOI] [PubMed] [Google Scholar]
- Alary F., Goldstein R., Duquette M., Chapman C.E., Voss P., & Lepore F. (2008). Tactile acuity in the blind: A psychophysical study using a two-dimensional angle discrimination task. Experimental Brain Research, 187(4), 587–594. [DOI] [PubMed] [Google Scholar]
- Amedi A., Jacobson G., Hendler T., Malach R., & Zohary E. (2002). Convergence of visual and tactile shape processing in the human lateral occipital complex. Cerebral Cortex, 12(11), 1202–1212. [DOI] [PubMed] [Google Scholar]
- Amedi A., Malach R., Hendler T., Peled S., & Zohary E. (2001). Visuo-haptic object-related activation in the ventral visual pathway. Nature Neuroscience, 4(3), 324–330. [DOI] [PubMed] [Google Scholar]
- Amedi A., Raz N., Azulay H., Malach R., & Zohary E. (2010). Cortical activity during tactile exploration of objects in blind and sighted humans. Restorative Neurology and Neuroscience, 28(2), 143–156. [DOI] [PubMed] [Google Scholar]
- Amedi A., Stern W.M., Camprodon J.A., Bermpohl F., Merabet L., Rotman S., Hemond C., Meijer P., & Pascual-Leone A. (2007). Shape conveyed by visual-to-auditory sensory substitution activates the lateral occipital complex. Nature Neuroscience, 10(6), 687–689. [DOI] [PubMed] [Google Scholar]
- Ammons C.H., Worchel P., & Dallenbach K.M. (1953). “Facial vision”: The perception of obstacles out of doors by blindfolded and blindfolded-deafened subjects. The American Journal of Psychology, 66(4), 519–553. [PubMed] [Google Scholar]
- Apkarian-Stielau P., & Loomis J.M. (1975). A comparison of tactile and blurred visual form perception. Perception & Psychophysics, 18(5), 362–368. [Google Scholar]
- Arno P., De Volder A.G. et al. (2001). Occipital activation by pattern recognition in the early blind using auditory substitution of vision. Neuroimage, 13(4), 632–645. [DOI] [PubMed] [Google Scholar]
- Ash P. (1951). The sensory capacities of infrahuman mammals: Vision, audition, gustation. Psychological Bulletin, 48(4), 289–325. [DOI] [PubMed] [Google Scholar]
- Auvray M., Hanneton S., & O’Regan J.K. (2007). Learning to perceive with a visuo-auditory substitution system: Localization and object recognition with ‘The vOICe’. Perception, 36(3), 416–430. [DOI] [PubMed] [Google Scholar]
- Auvray M., Hanneton S., Lenay C., & O’Regan K. (2005). There is something out there: Distal attribution in sensory substitution, twenty years later. Journal of Integrative Neuroscience, 4(4), 505–521. [DOI] [PubMed] [Google Scholar]
- Bach-y-Rita P. (1972). Brain Mechanisms in Sensory Substitution. New York: Academic Press. [Google Scholar]
- Bach-y-Rita P., Collins C.C., Saunders S.A., White B., & Scadden L. (1969). Vision substitution by tactile image projection. Nature, 221(5184), 963–964. [DOI] [PubMed] [Google Scholar]
- Bach-y-Rita P., Kaczmarek K.A., Tyler M.E., & Garcia-Lara J. (1998). Form perception with a 49-point electrotactile stimulus array on the tongue: A technical note. Journal of Rehabilitation Research and Development, 35(4), 427–430. [PubMed] [Google Scholar]
- Bach-y-Rita P., & Kercel S.W. (2003). Sensory substitution and the human-machine interface. Trends in Cognitive Science, 7(12), 541–546. [DOI] [PubMed] [Google Scholar]
- Bavelier D., Green C.S., Pouget A., & Schrater P. (2012). Brain plasticity through the life span: Learning to learn and action video games. Annual Review of Neuroscience, 35(1), 391–416. [DOI] [PubMed] [Google Scholar]
- Bedny M., Richardson H., & Saxe R. (2015). “Visual” cortex responds to spoken language in blind children. Journal of Neuroscience, 35(33), 11674–11681. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bialek W., DeWeese M., Rieke F., & Warland D. (1993). Bits and brains: Information flow in the nervous system. Physica A: Statistical Mechanics and its Applications, 200(1-4), 581–593. [Google Scholar]
- Blakemore C., & Cooper G.F. (1970). Development of the brain depends on the visual environment. Nature, 228(5270), 477–478. [DOI] [PubMed] [Google Scholar]
- Blankenburg F., Ruff C.C., Deichmann R., Rees G., & Driver J. (2006). The cutaneous rabbit illusion affects human primary sensory cortex somatotopically. PLoS Biology, 4(3), 459–466. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Blauert J. (Ed.) (2013). The Technology of Binaural Listening. Berlin-Heidelberg-New York: Springer. [Google Scholar]
- Boot W.R., Blakely D.P., & Simons D.J. (2011). Do action video games improve perception and cognition? Frontiers in Psychology, 2, 226. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Broadbent D.E. (1958). Perception and Communication. New York: Pergamon Press. [Google Scholar]
- Brown D.J., Simpson A.J., & Proulx M.J. (2014). Visual objects in the auditory system in sensory substitution: How much information do we need? Multisensory Research, 27(5-6), 337–357. [DOI] [PubMed] [Google Scholar]
- Büchel C., Price C., Frackowiak R.S., & Friston K. (1998). Different activation patterns in the visual cortex of late and congenitally blind subjects. Brain, 121(3), 409–419. [DOI] [PubMed] [Google Scholar]
- Buchs G., Maidenbaum S., Levy-Tzedek S., & Amedi A. (2015). Integration and binding in rehabilitative sensory substitution: Increasing resolution using a new Zooming-in approach. Restorative Neurology and Neuroscience, 34(1), 97–105. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Calvert G.A., Brammer M.J., Bullmore E.T., Campbell R., Iversen S.D., & David A.S. (1999). Response amplification in sensory-specific cortices during crossmodal binding. Neuroreport, 10(12), 2619–2623. [DOI] [PubMed] [Google Scholar]
- Capelle C., Trullemans C., Arno P., & Veraart C. (1998). A real time experimental prototype for enhancement of vision rehabilitation using auditory substitution. IEEE Transactions on Biomedical Engineering, 45(10), 1279–1293. [DOI] [PubMed] [Google Scholar]
- Carello C., Anderson K.L., & Kunkler-Peck A.J. (1998). Perception of object length by sound. Psychological Science, 9(3), 211–214. [Google Scholar]
- Chebat D.R., Maidenbaum S., & Amedi A. (2015). Navigation using sensory substitution in real and virtual mazes. PLoS One, 10(6), e0126307. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Chebat D.R., Rainville C., Kupers R., & Ptito M. (2007). Tactile-‘visual’ acuity of the tongue in early blind individuals. Neuroreport, 18(18), 1901–1904. [DOI] [PubMed] [Google Scholar]
- Chebat D.R., Schneider F.C., Kupers R., & Ptito M. (2011). Navigation with a sensory substitution device in congenitally blind individuals. Neuroreport, 22(7), 342–347. [DOI] [PubMed] [Google Scholar]
- Cohen L.G., Celnik P., Pascual-Leone A., Corwell B., Faiz L., Dambrosia J., Honda M., Sadato N., Gerloff C., Catala M.D., & Hallett M. (1997). Functional relevance of cross-modal plasticity in blind humans. Nature, 389(6647), 180–183. [DOI] [PubMed] [Google Scholar]
- Cohen L., Dehaene S., Naccache L., Lehéricy S., Dehaene-Lambertz G., Hénaff M.A., & Michel F. (2000). The visual word form area: Spatial and temporal characterization of an initial stage of reading in normal subjects and posterior split-brain patients. Brain, 123(2), 291–307. [DOI] [PubMed] [Google Scholar]
- Collignon O., Dormal G., de Heering A., Lepore F., Lewis T.L., & Maurer D. (2015). Long-lasting crossmodal cortical reorganization triggered by brief postnatal visual deprivation. Current Biology, 25(18), 2379–2383. [DOI] [PubMed] [Google Scholar]
- Collins C.C. (1985). On mobility aids for the blind. In Electronic Spatial Sensing for the Blind (pp. 35–64). Amsterdam: Springer. [Google Scholar]
- Cosgun A., Sisbot E.A., & Christensen H.I. (2014). Evaluation of rotational and directional vibration patterns on a tactile belt for guiding visually impaired people. In Haptics Symposium (HAPTICS IEEE 2014), 367–370. [Google Scholar]
- Cowey A., & Walsh V. (2000). Magnetically induced phosphenes in sighted, blind and blindsighted observers. Neuroreport, 11(14), 3269–3273. [DOI] [PubMed] [Google Scholar]
- Cronly-Dillon J., Persaud K.C., & Blore R. (2000). Blind subjects construct conscious mental images of visual scenes encoded in musical form. Proceedings of the Royal Society of London B: Biological Sciences, 267(1458), 2231–2238. [Google Scholar]
- Dakopoulos D., & Bourbakis N.G. (2010). Wearable obstacle avoidance electronic travel aids for blind: A survey. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 40(1), 25–35. [Google Scholar]
- De Volder A.G., Catalan-Ahumada M., Robert A., Bol A., Labar D., Coppens A., & Veraart C. (1999). Changes in occipital cortex activity in early blind humans using a sensory substitution device. Brain Research, 826(1), 128–134. [DOI] [PubMed] [Google Scholar]
- Deroy O., Chen Y.C., & Spence C. (2014). Multisensory constraints on awareness. Philosophical Transactions of the Royal Society of London B: Biological Sciences, 369(1641), 20130207. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Driver J. (2001). A selective review of selective attention research from the past century. British Journal of Psychology, 92(1), 53–78. [PubMed] [Google Scholar]
- Driver J., & Noesselt T. (2008). Multisensory interplay reveals crossmodal influences on ‘sensory-specific’ brain regions, neural responses, and judgments. Neuron, 57(1), 11–23. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Driver J., & Spence C. (2000). Multisensory perception: Beyond modularity and convergence. Current Biology, 10(20), R731–R735. [DOI] [PubMed] [Google Scholar]
- Dudel J. (1986). General sensory physiology, psychophysics. In Schmidt R.F. (Ed.), Fundamentals of Sensory Physiology (pp. 1–29). Berlin: Springer. [Google Scholar]
- Elbert T., Pantev C., Wienbruch C., Rockstroh B., & Taub E. (1995). Increased cortical representation of the fingers of the left hand in string players. Science, 270(5234), 305–307. [DOI] [PubMed] [Google Scholar]
- Elli G.V., Benetti S., & Collignon O. (2014). Is there a future for sensory substitution outside academic laboratories? Multisensory Research, 27(5-6), 271–291. [DOI] [PubMed] [Google Scholar]
- Ernst M.O., & Banks M.S. (2002). Humans integrate visual and haptic information in a statistically optimal fashion. Nature, 415(6870), 429–433. [DOI] [PubMed] [Google Scholar]
- Farina M. (2013). Neither touch nor vision: Sensory substitution as artificial synaesthesia? Biology & Philosophy, 28(4), 639–655. [Google Scholar]
- Fowler C.A., & Dekle D.J. (1991). Listening with eye and hand: Cross-modal contributions to speech perception. Journal of Experimental Psychology: Human Perception and Performance, 17(3), 816–828. [DOI] [PubMed] [Google Scholar]
- Gagnon L., Schneider F.C., Siebner H.R., Paulson O.B., Kupers R., & Ptito M. (2012). Activation of the hippocampal complex during tactile maze solving in congenitally blind subjects. Neuropsychologia, 50(7), 1663–1671. [DOI] [PubMed] [Google Scholar]
- Gallace A., Tan H.Z., & Spence C. (2006). The failure to detect tactile change: A tactile analogue of visual change blindness. Psychonomic Bulletin & Review, 13(2), 300–303. [DOI] [PubMed] [Google Scholar]
- Gallace A., & Spence C. (2008). The cognitive and neural correlates of “tactile consciousness”: A multisensory perspective. Consciousness and Cognition, 17(1), 370–407. [DOI] [PubMed] [Google Scholar]
- Gardner E.P., & Kandel E.R. (2000). Touch. In Kandel E.R., Schwartz J.H., & Jessell T.M. (Eds.), Principles of Neural Science (4th ed., pp. 451–471). New York: McGraw-Hill. [Google Scholar]
- Geldard F.A., & Sherrick C.E. (1972). The cutaneous “rabbit”: A perceptual illusion. Science, 178(4057), 178–179. [DOI] [PubMed] [Google Scholar]
- Geronazzo M., Bedin A., Brayda L., Campus C., & Avanzini F. (2016). Interactive spatial sonification for non-visual exploration of virtual maps. International Journal of Human-Computer Studies, 85, 4–15. [Google Scholar]
- Gibson J.J. (1962). Observations on active touch. Psychological Review, 69(6), 477–491. [DOI] [PubMed] [Google Scholar]
- Giordano B.L., & McAdams S. (2006). Material identification of real impact sounds: Effects of size variation in steel, glass, wood, and plexiglass plates. Journal of the Acoustical Society of America, 119(2), 1171–1181. [DOI] [PubMed] [Google Scholar]
- Gori M., Sandini G., Martinoli C., & Burr D. (2010). Poor haptic orientation discrimination in nonsighted children may reflect disruption of cross-sensory calibration. Current Biology, 20(3), 223–225. [DOI] [PubMed] [Google Scholar]
- Gori M., Sandini G., Martinoli C., & Burr D.C. (2014). Impairment of auditory spatial localization in congenitally blind human subjects. Brain, 137(1), 288–293. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Gougoux F., Lepore F., Lassonde M., Voss P., Zatorre R.J., & Belin P. (2004). Neuropsychology: Pitch discrimination in the early blind. Nature, 430(6997), 309. [DOI] [PubMed] [Google Scholar]
- Goyal M.S., Hansen P.J., & Blakemore C.B. (2006). Tactile perception recruits functionally related visual areas in the late-blind. Neuroreport, 17(13), 1381–1384. [DOI] [PubMed] [Google Scholar]
- Grassi M. (2005). Do we hear size or sound? Balls dropped on plates. Perception & Psychophysics, 67(2), 274–284. [DOI] [PubMed] [Google Scholar]
- Green C.S., & Bavelier D. (2008). Exercising your brain: A review of human brain plasticity and training-induced learning. Psychology and Aging, 23(4), 692. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Gregory R.L. (1998). Eye and Brain: The Psychology of Seeing (5th ed.). Oxford: Oxford University Press. [Google Scholar]
- Grimes J. (1996). On the failure to detect changes in scenes across saccades. In Akins K. (Ed.), Vancouver studies in cognitive science: 5. Perception (pp. 89–109). New York: Oxford University Press. [Google Scholar]
- Haigh A., Brown D.J., Meijer P., & Proulx M.J. (2013). How well do you see what you hear? The acuity of visual-to-auditory sensory substitution. Frontiers in Psychology, 4, 330. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Haines R.F. (1991). A breakdown in simultaneous information processing. In Obrecht G. & Stark L.W. (Eds.), Presbyopia Research (pp. 171–175). New York: Springer US. [Google Scholar]
- Hamilton R., Keenan J.P., Catala M., & Pascual-Leone A. (2000). Alexia for Braille following bilateral occipital stroke in an early blind woman. Neuroreport, 11(2), 237–240. [DOI] [PubMed] [Google Scholar]
- Hamilton R., & Pascual-Leone A. (1998). Cortical plasticity associated with Braille learning. Trends in Cognitive Sciences, 2(5), 168–174. [DOI] [PubMed] [Google Scholar]
- Härmä A., Jakka J., Tikander M., Karjalainen M., Lokki T., Hiipakka J., & Lorho G. (2004). Augmented reality audio for mobile and wearable appliances. Journal of the Audio Engineering Society, 52(6), 618–639. [Google Scholar]
- Hartcher-O’Brien J., & Auvray M. (2014). The process of distal attribution illuminated through studies of sensory substitution. Multisensory Research, 27(5-6), 421–441. [DOI] [PubMed] [Google Scholar]
- Held R., Ostrovsky Y., de Gelder B., Gandhi T., Ganesh S., Mathur U., & Sinha P. (2011). The newly sighted fail to match seen with felt. Nature Neuroscience, 14(5), 551–553. [DOI] [PubMed] [Google Scholar]
- Hoffmann D.D. (1998). Visual Intelligence: How we create what we see. New York: W.W Norton. [Google Scholar]
- Hötting K., & Röder B. (2004). Hearing cheats touch, but less in congenitally blind than in sighted individuals. Psychological Science, 15(1), 60–64. [DOI] [PubMed] [Google Scholar]
- Hötting K., & Röder B. (2009). Auditory and auditory-tactile processing in congenitally blind humans. Hearing Research, 258(1), 165–174. [DOI] [PubMed] [Google Scholar]
- Houben M.M.J., Kohlrausch A., & Hermes D.J. (2004). Perception of the size and speed of rolling balls by sound. Speech Communication, 43, 331–345. [Google Scholar]
- Hubel D., & Wiesel T.N. (1970). The period of susceptibility to the physiological effects of unilateral eye closure in kittens. Journal of Physiology, 206(2), 419. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hugues O., Fuchs P., Brunet L., & Megard C. (2015). Evaluation of a vibrotactile device for outdoor and public transport pedestrian navigation using virtual reality. Eighth International Conference on Advances in Computer-Human Interactions, Lisbon, Portugal. [Google Scholar]
- Ifukube T., Sasaki T., & Peng C. (1991). A blind mobility aid modelled after echolocation of bats. IEEE Transactions on Biomedical Engineering, 38(5), 461–465. [DOI] [PubMed] [Google Scholar]
- Johnson L., & Higgins C.M. (2006). A navigation aid for the blind using tactile-visual sensory substitution. Engineering in Medicine and Biology Society, 28th Annual International Conference of the IEEE, 6289–6292. [DOI] [PubMed] [Google Scholar]
- Kanwisher N., McDermott J., & Chun M.M. (1997). The fusiform face area: A module in human extrastriate cortex specialized for face perception. Journal of Neuroscience, 17(11), 4302–4311. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kärcher S.M., Fenzlaff S., Hartmann D., Nagel S.K., & König P. (2012). Sensory augmentation for the blind. Frontiers in Human Neuroscience, 6(37), 1–15. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kellogg W.N. (1962). Sonar system of the blind: New research measures their accuracy in detecting the texture, size, and distance of objects “by ear”. Science, 137(3528), 399–404. [DOI] [PubMed] [Google Scholar]
- Kilgour A.R., Servos P., James T.W., & Lederman S.J. (2004). Functional MRI of haptic face recognition. Brain and Cognition, 54(2), 159–161. [PubMed] [Google Scholar]
- Kim J.-K., & Zatorre R.J. (2008). Generalized learning of visual-to-auditory substitution in sighted individuals. Brain Research, 1242, 263–275. [DOI] [PubMed] [Google Scholar]
- King A.J., & Parsons C.H. (1999). Improved auditory spatial acuity in visually deprived ferrets. European Journal of Neuroscience, 11(11), 3945–3956. [DOI] [PubMed] [Google Scholar]
- Klatzky R.L., Pai D.K., & Krotkov E.P. (2000). Perception of material from contact sounds. Presence: Teleoperators and Virtual Environments, 9(4), 399–410. [Google Scholar]
- Koelewijn T., Bronkhorst A., & Theeuwes J. (2010). Attention and the multiple stages of multisensory integration: A review of audiovisual studies. Acta Psychologica, 134(3), 372–384. [DOI] [PubMed] [Google Scholar]
- Kokjer K.J. (1987). The information capacity of the human fingertip. IEEE Transactions on Systems, Man, & Cybernetics, 17(1), 100–102. [Google Scholar]
- Kolarik A.J., Timmis M.A., Cirstea S., & Pardhan S. (2014). Sensory substitution information informs locomotor adjustments when walking through apertures. Experimental Brain Research, 232(3), 975–984. [DOI] [PubMed] [Google Scholar]
- Koo I.M., Jung K., Koo J.C., Nam J., & Lee Y.K. (2008). Development of soft-actuator-based wearable tactile display. IEEE Transactions on Robotics, 24(3), 549–558. [Google Scholar]
- Korte M., & Rauschecker J.P. (1993). Auditory spatial tuning of cortical neurons is sharpened in cats with early blindness. Journal of Neurophysiology, 70(4), 1717–1721. [DOI] [PubMed] [Google Scholar]
- Kristjánsson Á. (2006). Rapid learning in attention shifts-A review. Visual Cognition, 13(3), 324–362. [Google Scholar]
- Kristjánsson Á. (2013). The case for causal influences of action videogame play upon vision and attention. Attention, Perception and Psychophysics, 75(4), 667–672. [DOI] [PubMed] [Google Scholar]
- Kristjánsson T., Thorvaldsson T.P., & Kristjánsson Á. (2014). Divided multimodal attention: sensory trace and context coding strategies in spatially congruent auditory and visual presentation. Multisensory Research, 27(2), 91–110. [DOI] [PubMed] [Google Scholar]
- Kunkler-Peck A.J., & Turvey M.T. (2000). Hearing shape. Journal of Experimental Psychology: Human Perception and Performance, 26(1), 279–294. [DOI] [PubMed] [Google Scholar]
- Kujala T., Palva M.J., Salonen O., Alku P., Huotilainen M., Järvinen A., & Näätänen R. (2005). The role of blind humans’ visual cortex in auditory change detection. Neuroscience Letters, 379(2), 127–131. [DOI] [PubMed] [Google Scholar]
- Kupers R., Chebat D.R., Madsen K.H., Paulson O.B., & Ptito M. (2010). Neural correlates of virtual route recognition in congenital blindness. Proceedings of the National Academy of Sciences, 107(28), 12716–12721. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kupers R. et al. (2007). rTMS of the occipital cortex abolishes Braille reading and repetition priming in blind subjects. Neurology, 68(9), 691–693. [DOI] [PubMed] [Google Scholar]
- Lacey S., Flueckiger P., Stilla R., Lava M., & Sathian K. (2010). Object familiarity modulates the relationship between visual object imagery and haptic shape perception. Neuroimage, 49(3), 1977–1990. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lahav O., Schloerb D.W., & Srinivasan M.A. (2012). Newly blind persons using virtual environment system in a traditional orientation and mobility rehabilitation program: A case study. Disability and Rehabilitation: Assistive Technology, 7(5), 420–435. [DOI] [PubMed] [Google Scholar]
- Lakatos S. (1991). Recognition of complex auditory-spatial patterns. CCRMA, Department of Music, Stanford University. [Google Scholar]
- Lakatos S., McAdams S., & Caussé R. (1997). The representation of auditory source characteristics: Simple geometric form. Perception & Psychophysics, 59(8), 1180–1190. [DOI] [PubMed] [Google Scholar]
- Levy-Tzedek S., Hanassy S., Abboud S., Maidenbaum S., & Amedi A. (2012). Fast, accurate reaching movements with a visual-to-auditory sensory substitution device. Restorative Neurology and Neuroscience, 30(4), 313–323. [DOI] [PubMed] [Google Scholar]
- Levy-Tzedek S., Maidenbaum S., Amedi A., & Lackner J. (2016). Aging and sensory substitution in a virtual navigation task. PLoS One, 11(3), e0151593. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Levy-Tzedek S., Riemer D., & Amedi A. (2014). Color improves “visual” acuity via sound. Frontiers in Neuroscience, 8, 358. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Liu Y.C. (2001). Comparative study of the effects of auditory, visual and multimodality displays on drivers’ performance in advanced traveller information systems. Ergonomics, 44(4), 425–442. [DOI] [PubMed] [Google Scholar]
- Loomis J.M., & Apkarian-Stielau P. (1976). A lateral masking effect in tactile and blurred visual letter recognition. Perception & Psychophysics, 20(4), 221–226. [Google Scholar]
- Loomis J.M., Golledge R.G., & Klatzky R.L. (1998). Navigation system for the blind: Auditory display modes and guidance. Presence: Teleoperators and Virtual Environments, 7(2), 193–203. [Google Scholar]
- Loomis J.M., Marston J.R., Golledge R.G., & Klatzky R.L. (2005). Personal guidance system for people with visual impairment: A comparison of spatial displays for route guidance. Journal of Visual Impairment & Blindness, 99(4), 219–232. [PMC free article] [PubMed] [Google Scholar]
- Loomis J.M., Klatzky R.L., & Giudice N.A. (2012). Sensory substitution of vision: Importance of perceptual and cognitive processing. In Manduchi R., & Kurniawan S. (Eds.), Assistive Technology for Blindness and Low Vision (pp. 162–191). Boca Raton, FL: CRC Press. [Google Scholar]
- Mack A., & Rock I. (1998). Inattentional Blindness. Cambridge, MA: MIT Press. [Google Scholar]
- Maidenbaum S., & Amedi A. (2015). Blind in a virtual world: Mobility-training virtual reality games for users who are blind. In Virtual Reality (VR), 2015 IEEE, 341–342. [Google Scholar]
- Maidenbaum S., Abboud S., & Amedi A. (2014a). Sensory substitution: Closing the gap between basic research and widespread practical visual rehabilitation. Neuroscience and Biobehavioral Reviews, 41, 3–15. [DOI] [PubMed] [Google Scholar]
- Maidenbaum S., Hanassy S., Abboud S., Buchs G., Chebat D.R., Levy-Tzedek S., & Amedi A. (2014b). The “EyeCane”, a new electronic travel aid for the blind: Technology, behavior and swift learning. Restorative Neurology and Neuroscience, 32(6), 813–824. [DOI] [PubMed] [Google Scholar]
- Maidenbaum S., Levy-Tzedek S., Chebat D.R., Namer-Furstenberg R., & Amedi A. (2014c). The effect of extended sensory range via the EyeCane sensory substitution device on the characteristics of visionless virtual navigation. Multisensory Research, 27(5-6), 379–397. [DOI] [PubMed] [Google Scholar]
- Maravita A., Husain M., Clarke K., & Driver J. (2001). Reaching with a tool extends visual–tactile interactions into far space: Evidence from cross-modal extinction. Neuropsychologia, 39(6), 580–585. [DOI] [PubMed] [Google Scholar]
- Maravita A., Spence C., & Driver J. (2003). Multisensory integration and the body schema: Close to hand and within reach. Current Biology, 13(13), R531–R539. [DOI] [PubMed] [Google Scholar]
- Marston J.R., Loomis J.M., Klatzky R.L., Golledge R.G., & Smith E.L. (2006). Evaluation of spatial displays for navigation without sight. ACM Transactions on Applied Perception, 3(2), 110–124. [Google Scholar]
- Marston J.R., Loomis J.M., Klatzky R.L., & Golledge R.G. (2007). Nonvisual route following with guidance from a simple haptic or auditory display. Journal of Visual Impairment & Blindness, 101(4), 203–211. [Google Scholar]
- McConkie G.W., & Zola D. (1979). Is visual information integrated across successive fixations in reading? Perception & Psychophysics, 25(3), 221–224. [DOI] [PubMed] [Google Scholar]
- McDaniel T., Krishna S., Balasubramanian V., Colbry D., & Panchanathan S. (2008). Using a haptic belt to convey non-verbal communication cues during social interactions to individuals who are blind. IEEE International Workshop on Haptic Audio visual Environments and Games, 13–18. [Google Scholar]
- McGurk H., & MacDonald J. (1976). Hearing lips and seeing voices. Nature, 264(5588), 746–748. [DOI] [PubMed] [Google Scholar]
- McKyton A., Ben-Zion I., Doron R., & Zohary E. (2015). The limits of shape recognition following late emergence from blindness. Current Biology, 25(18), 2373–2378. [DOI] [PubMed] [Google Scholar]
- Meijer P.B.L. (1992). An experimental system for auditory image representations. IEEE Transactions on Biomedical Engineering, 39(2), 112–121. [DOI] [PubMed] [Google Scholar]
- Merabet L.B., Connors E.C., Halko M.A., & Sánchez J. (2012). Teaching the blind to find their way by playing video games. PLoS One, 7, e44958. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Merabet L.B., & Pascual-Leone A. (2010). Neural reorganization following sensory loss: The opportunity of change. Nature Reviews Neuroscience, 11(1), 44–52. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Milne J.L., Arnott S.R., Kish D., Goodale M.A., & Thaler L. (2015). Parahippocampal cortex is involved in material processing via echoes in blind echolocation experts. Vision Research, 109(B), 139–148. [DOI] [PubMed] [Google Scholar]
- Most S.B., Scholl B.J., Clifford E.R., & Simons D.J. (2005). What you see is what you set: Sustained inattentional blindness and the capture of awareness. Psychological Review, 112(1), 217–242. [DOI] [PubMed] [Google Scholar]
- Muchnik et al. (1991). Central auditory skills in blind and sighted subjects. Scandinavian Audiology, 20(1), 19–23. [DOI] [PubMed] [Google Scholar]
- Murphy M.C., Nau A.C., Fisher C., Kim S.G., Schuman J.S., & Chan K.C. (2016). Top-down influence on the visual cortex of the blind during sensory substitution. NeuroImage, 125, 932–940. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Murray M.M., Thelen A., Thut G., Romei V., Martuzzi R., & Matusz P.J. (2016). The multisensory function of the human primary visual cortex. Neuropsychologia, 83, 161–169. [DOI] [PubMed] [Google Scholar]
- Nagel S.K., Carl C., Kringe T., Märtin R., & König P. (2005). Beyond sensory substitution—learning the sixth sense. Journal of Neural Engineering, 2(4), R13–R26. [DOI] [PubMed] [Google Scholar]
- Nau A.C., Pintar C., Arnoldussen A., & Fisher C. (2015). Acquisition of visual perception in blind adults using the BrainPort artificial vision device. American Journal of Occupational Therapy, 69(1), 6901290010p1–6901290010p8. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Neisser U. (1967). Cognitive Psychology. New York: Appleton Century-Crofts. [Google Scholar]
- Niechwiej-Szwedo E., Chin J., Wolfe P.J., Popovich C., & Staines W.R. (2016). Abnormal visual experience during development alters the early stages of visual-tactile integration. Behavioural Brain Research, 304, 111–119. [DOI] [PubMed] [Google Scholar]
- Novich S.D., & Eagleman D.M. (2015). Using space and time to encode vibrotactile information: Toward an estimate of the skin’s achievable throughput. Experimental Brain Research, 233(10), 2777–2788. [DOI] [PubMed] [Google Scholar]
- Occelli V., Spence C., & Zampini M. (2013). Auditory, tactile, and audiotactile information processing following visual deprivation. Psychological Bulletin, 139(1), 189–212. [DOI] [PubMed] [Google Scholar]
- O’Regan J.K., Rensink R.A., & Clark J.J. (1999). Change-blindness as a result of ‘mudsplashes’. Nature, 398(6722), 34. [DOI] [PubMed] [Google Scholar]
- Ortiz T., Poch J., Santos J.M., Requena C., Martínez A.M., Ortiz-Terán L., Turrero A., Barcia J., Nogales R., Calvo A., Martínez J.M., Córdoba J.L., & Pascual-Leone A. (2011). Recruitment of occipital cortex during sensory substitution training linked to subjective experience of seeing in people with blindness. PLoS One, 6(8), e23264. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Owen A.M., Hampshire A., Grahn J.A., Stenton R., Dajani S., Burns A.S., & Ballard C.G. (2010). Putting brain training to the test. Nature, 465(7299), 775–778. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Palmer S.E. (1999). Vision Science: Photons to Phenomenology. Cambridge, MA: MIT Press. [Google Scholar]
- Pantev C., Oostenveld R., Engelien A., Ross B., Roberts L.E., & Hoke M. (1998). Increased auditory cortical representation in musicians. Nature, 392(6678), 811–814. [DOI] [PubMed] [Google Scholar]
- Pascual-Leone A., & Torres F. (1993). Plasticity of the sensorimotor cortex representation of the reading finger in Braille readers. Brain, 116(1), 39–52. [DOI] [PubMed] [Google Scholar]
- Pascual-Leone A., Amedi A., Fregni F., & Merabet L.B. (2005). The plastic human brain cortex. Annual Review of Neuroscience, 28, 377–401. [DOI] [PubMed] [Google Scholar]
- Pasqualotto A., & Proulx M.J. (2012). The role of visual experience for the neural basis of spatial cognition. Neuroscience & Biobehavioral Reviews, 36(4), 1179–1187. [DOI] [PubMed] [Google Scholar]
- Penfield W., & Boldrey E. (1937). Somatic motor and sensory representation in the cerebral cortex of man as studied by electrical stimulation. Brain, 60, 389–443. [Google Scholar]
- Poirier C., Collignon O., Scheiber C., Renier L., Vanlierde A., Tranduy D., Veraart C., & De Volder A.G. (2006). Auditory motion perception activates visual motion areas in early blind subjects. Neuroimage, 31(1), 279–285. [DOI] [PubMed] [Google Scholar]
- Poirier C., De Volder A., Tranduy D., & Scheiber C. (2007). Pattern recognition using a device substituting audition for vision in blindfolded sighted subjects. Neuropsychologia, 45(5), 1108–1121. [DOI] [PubMed] [Google Scholar]
- Proulx M.J., Brown D.J., Pasqualotto A., & Meijer P. (2014). Multisensory perceptual learning and sensory substitution. Neuroscience & Biobehavioral Reviews, 41, 16–25. [DOI] [PubMed] [Google Scholar]
- Proulx M.J., Gwinnutt J., Dell’Erba S., Levy-Tzedek S., de Sousa A.A., & Brown D.J. (2016). Other ways of seeing: From behavior to neural mechanisms in the online “visual” control of action with sensory substitution. Restorative Neurology & Neuroscience, 34, 29–44. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Proulx M.J., Stoerig P., Ludowig E., & Knoll I. (2008). Seeing ‘where’ through the ears: Effects of learning by doing and long-term sensory deprivation on localization based on image-to-sound conversion. PLoS One, 3(3), e1840. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ptito M., Moesgaard S.M., Gjedde A., & Kupers R. (2005). Cross-modal plasticity revealed by electrotactile stimulation of the tongue in the congenitally blind. Brain, 128(3), 606–614. [DOI] [PubMed] [Google Scholar]
- Ptito M., Matteau I., Gjedde A., & Kupers R. (2009). Recruitment of the middle temporal area by tactile motion in congenital blindness. NeuroReport, 20(6), 543–547. [DOI] [PubMed] [Google Scholar]
- Rauschecker J.P. (1995). Compensatory plasticity and sensory substitution in the cerebral cortex. Trends in Neurosciences, 18(1), 36–43. [DOI] [PubMed] [Google Scholar]
- Rauschecker J.P., & Kniepert U. (1994). Auditory localization behaviour in visually deprived cats. European Journal of Neuroscience, 6(1), 149–160. [DOI] [PubMed] [Google Scholar]
- Rauschecker J.P., & Korte M. (1993). Auditory compensation for early blindness in cat cerebral cortex. Journal of Neuroscience, 13(10), 4538–4548. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Renier L., Collignon O., Poirier C., Tranduy D., Vanlierde A., Bol A., Veraart C., & De Volder A.G. (2005). Cross-modal activation of visual cortex during depth perception using auditory substitution of vision. NeuroImage, 26(2), 573–580. [DOI] [PubMed] [Google Scholar]
- Reich L., Szwed M., Cohen L., & Amedi A. (2011). A ventral visual stream reading center independent of visual experience. Current Biology, 21(5), 363–368. [DOI] [PubMed] [Google Scholar]
- Rensink R.A., O’Regan J.K., & Clark J.J. (1997). To see or not to see: The need for attention to perceive changes in scenes. Psychological Science, 8(5), 368–373. [Google Scholar]
- Ricciardi E., Bonino D., Pellegrini S., & Pietrini P. (2014a). Mind the blind brain to understand the sighted one! Is there a supramodal cortical functional architecture? Neuroscience & Biobehavioral Reviews, 41, 64–77. [DOI] [PubMed] [Google Scholar]
- Ricciardi E., Handjaras G., & Pietrini P. (2014b). The blind brain: How (lack of) vision shapes the morphological and functional architecture of the human brain. Experimental Biology and Medicine, 239(11), 1414–1420. [DOI] [PubMed] [Google Scholar]
- Rice C.E., Feinstein S.H., & Schusterman R.J. (1965). Echo-detection ability of the blind: Size and distance factors. Journal of Experimental Psychology, 70(3), 246–255. [DOI] [PubMed] [Google Scholar]
- Rock I. (1983). The Logic of Perception. Cambridge, MA: MIT Press. [Google Scholar]
- Sadato N., Pascual-Leone A., Grafman J., Ibañez V., Deiber M.P., Dold G., & Hallett M. (1996). Activation of the primary visual cortex by Braille reading in blind subjects. Nature, 380, 526–528. [DOI] [PubMed] [Google Scholar]
- Saevarsson S., Halsband U., & Kristjánsson Á. (2011). Designing rehabilitation programs for neglect: Could 2 be more than 1 + 1? Applied Neuropsychology, 18(2), 95–106. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Sampaio E., Maris S., & Bach-y-Rita P. (2001). Brain plasticity: ‘Visual’ acuity of blind persons via the tongue. Brain Research, 908(2), 204–207. [DOI] [PubMed] [Google Scholar]
- Sathian K., & Stilla R. (2010). Cross-modal plasticity of tactile perception in blindness. Restorative Neurology and Neuroscience, 28(2), 271–281. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Sathian K., Zangaladze A., Hoffman J.M., & Grafton S.T. (1997). Feeling with the mind’s eye. Neuroreport, 8(18), 3877–3881. [DOI] [PubMed] [Google Scholar]
- Schlaug G., Jancke L., Huang Y., & Steinmetz H. (1995). In vivo evidence of structural brain asymmetry in musicians. Science, 267(5198), 699–701. [DOI] [PubMed] [Google Scholar]
- Schmidt R.F. (1981). Somatovisceral sensibility. In Schmidt R.F. (Ed.), Fundamentals of Sensory Physiology (pp. 81–125). Berlin: Springer. [Google Scholar]
- Schneider W., & Shiffrin R.M. (1977). Controlled and automatic human information processing: I. Detection, search, and attention. Psychological Review, 84(1), 1–66. [Google Scholar]
- Segond H., Weiss D., Kawalec M., & Sampaio E. (2013). Perceiving space and optical cues via a visuo-tactile sensory substitution system: A methodological approach for training of blind subjects for navigation. Perception, 42(5), 508–528. [DOI] [PubMed] [Google Scholar]
- Serino A., Bassolino M., Farnè A., & Làdavas E. (2007). Extended multisensory space in blind cane users. Psychological Science, 18(7), 642–648. [DOI] [PubMed] [Google Scholar]
- Shah C., Bouzit M., Youssef M., & Vasquez L. (2006). Evaluation of RU-netra-tactile feedback navigation system for the visually impaired. International Workshop on Virtual Rehabilitation. IEEE, 72–77. [Google Scholar]
- Shams L., Kamitani Y., & Shimojo S. (2000). What you see is what you hear. Nature, 408, 788. [DOI] [PubMed] [Google Scholar]
- Shull P.B., & Damian D.D. (2015). Haptic wearables as sensory replacement, sensory augmentation and trainer: A review. Journal of Neuroengineering and Rehabilitation, 12(59), 1–13. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Sigurdardottir H.M., Ívarsson E., Kristinsdóttir K., & Kristjánsson Á. (2015). Impaired recognition of faces and objects in dyslexia: Evidence for ventral stream dysfunction? Neuropsychology, 29(5), 739–750. [DOI] [PubMed] [Google Scholar]
- Simons D.J. (1996). In sight, out of mind: When object representations fail. Psychological Science, 7(5), 301–305. [Google Scholar]
- Singh-Curry V., & Husain M. (2008). Rehabilitation of neglect. In Stuss D.T., Winocur G., & Robertson I.H. (Eds.), Cognitive Neurorehabilitation: Evidence and Application (2nd ed., pp. 449–463). Cambridge, England: Cambridge University Press. [Google Scholar]
- Spence I., & Feng J. (2010). Video games and spatial cognition. Review of General Psychology, 14(2), 92–104. [Google Scholar]
- Stein B.E., & Meredith M.A. (1993). The Merging of the Senses. Cambridge, MA: MIT Press. [Google Scholar]
- Stevens A.A., & Weaver K.E. (2009). Functional characteristics of auditory cortex in the blind. Behavioural Brain Research, 196(1), 134–138. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Stiles N.R., & Shimojo S. (2015). Auditory sensory substitution is intuitive and automatic with texture stimuli. Scientific Reports, 5, 15628. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Stiles N.R., Zheng Y., & Shimojo S. (2015). Length and orientation constancy learning in 2-dimensions with auditory sensory substitution: The importance of self-initiated movement. Frontiers in Psychology, 6, 1–13. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Striem-Amit E., Guendelman M., & Amedi A. (2012a). ‘Visual’ acuity of the congenitally blind using visual-to-auditory sensory substitution. PLoS ONE, 7(3), e33136. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Striem-Amit E., Cohen L., Dehaene S., & Amedi A. (2012b). Reading with sounds: Sensory substitution selectively activates the visual word form area in the blind. Neuron, 76(3), 640–652. [DOI] [PubMed] [Google Scholar]
- Striem-Amit E., Ovadia-Caro S., Caramazza A., Margulies D.S., Villringer A., & Amedi A. (2015). Functional connectivity of visual cortex in the blind follows retinotopic organization principles. Brain, 138(6), 1679–1695. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Stronks H.C., Nau A.C., Ibbotson M.R., & Barnes N. (2015). The role of visual deprivation and experience on the performance of sensory substitution devices. Brain Research, 1624, 140–152. [DOI] [PubMed] [Google Scholar]
- Tang H., & Beebe D.J. (2006). An oral tactile interface for blind navigation. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 14(1), 116–123. [DOI] [PubMed] [Google Scholar]
- Tehovnik E.J., Slocum W.M., Carvey C.E., & Schiller P.H. (2005). Phosphene induction and the generation of saccadic eye movements by striate cortex. Journal of Neurophysiology, 93(1), 1–19. [DOI] [PubMed] [Google Scholar]
- Teng S., & Whitney D. (2011). The acuity of echolocation: Spatial resolution in sighted persons compared to the performance of an expert who is blind. Journal of Visual Impairment and Blindness, 105(1), 20–32. [PMC free article] [PubMed] [Google Scholar]
- Thaler L., Milne J.L., Arnott S.R., Kish D., & Goodale M.A. (2014). Neural correlates of motion processing through echolocation, source hearing, and vision in blind echolocation experts and sighted echolocation novices. Journal of Neurophysiology, 111(1), 112–127. [DOI] [PubMed] [Google Scholar]
- Treisman A.M. (1960). Contextual cues in selective listening. Quarterly Journal of Experimental Psychology, 12(4), 242–248. [Google Scholar]
- Väljamäe A., & Kleiner M. (2006). Spatial sound in auditory vision substitution systems. In Audio Engineering Society Convention, 120, 6795. [Google Scholar]
- Van Erp J.B., Van Veen H.A., Jansen C., & Dobbins T. (2005). Waypoint navigation with a vibrotactile waist belt. ACM Transactions on Applied Perception (TAP), 2(2), 106–117. [Google Scholar]
- Vitevitch M.S. (2003). Change deafness: The inability to detect changes between two voices. Journal of Experimental Psychology: Human Perception and Performance, 29(2), 333. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Voss P., Gougoux F., Zatorre R.J., Lassonde M., & Lepore F. (2008). Differential occipital responses in early- and late-blind individuals during a sound-source discrimination task. Neuroimage, 40(2), 746–758. [DOI] [PubMed] [Google Scholar]
- Wallace M.T., Perrault T.J., Hairston W.D., & Stein B.E. (2004). Visual experience is necessary for the development of multisensory integration. Journal of Neuroscience, 24(43), 9580–9584. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Wan C.Y., Wood A.G., Reutens D.C., & Wilson S.J. (2010). Early but not late-blindness leads to enhanced auditory perception. Neuropsychologia, 48(1), 344–348. [DOI] [PubMed] [Google Scholar]
- Wang X., Peelen M.V., Han Z., He C., Caramazza A., & Bi Y. (2015). How visual is the visual cortex? Comparing connectional and functional fingerprints between congenitally blind and sighted individuals. Journal of Neuroscience, 35(36), 12545–12559. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ward J., & Meijer P. (2010). Visual experiences in the blind induced by an auditory sensory substitution device. Consciousness and Cognition, 19(1), 492–500. [DOI] [PubMed] [Google Scholar]
- Ward J., & Wright T. (2014). Sensory substitution as an artificially acquired synaesthesia. Neuroscience & Biobehavioral Reviews, 41, 26–35. [DOI] [PubMed] [Google Scholar]
- Weeks R., Horwitz B., Aziz-Sultan A., Tian B., Wessinger C.M., Cohen L.G., Hallett M., & Rauschecker J.P. (2000). A positron emission tomographic study of auditory localization in the congenitally blind. Journal of Neuroscience, 20(7), 2664–2672. [DOI] [PMC free article] [PubMed] [Google Scholar]
- White B.W., Saunders F.A., Scadden L., Bach-Y-Rita P., & Collins C.C. (1970). Seeing with the skin. Perception & Psychophysics, 7(1), 23–27. [Google Scholar]
- Weinstein S. (1968). Intensive and extensive aspects of tactile sensitivity as a function of body part, sex, and laterality. In Kenshalo D.R. (Ed.), The Skin Senses (pp. 195–218). Springfield, IL: Thomas. [Google Scholar]
- Wiesel T.N. (1982). The postnatal development of the visual cortex and the influence of environment. Bioscience Reports, 2(6), 351–377. [DOI] [PubMed] [Google Scholar]
- Wright T.D., Margolis A., & Ward J. (2015). Using an auditory sensory substitution device to augment vision: Evidence from eye movements. Experimental Brain Research, 233(3), 851–860. [DOI] [PubMed] [Google Scholar]
- Wurtz R.H., & Kandel E.R. (2000). Central visual pathways. In Kandel E.R., Schwartz J.H., & Jessell T.M. (Eds.), Principles of Neural Science (4th ed., pp. 523–547). New York: McGraw-Hill. [Google Scholar]
- Zangaladze A., Epstein C.M., Grafton S.T., & Sathian K. (1999). Involvement of visual cortex in tactile discrimination of orientation. Nature, 401(6753), 587–590. [DOI] [PubMed] [Google Scholar]
- Zrenner E., Bartz-Schmidt K.U., Benav H., Besch D., Bruckmann A., Gabel V., Gekeler F., Greppmaier U., Harscher A., Kibbel S., Koch J., Kusnyerik A., Peters T., Stingl K., Sachs H., Stett A., Szurman P., Wilhelm B., & Wilke R. (2010). Subretinal electronic chips allow blind patients to read letters and combine them to words. Proceedings of the Royal Society B: Biological Sciences, 278(1711), 1489–1497. [DOI] [PMC free article] [PubMed] [Google Scholar]