Abstract
Our sensory systems are remarkable in several respects. They are extremely sensitive, they each perform more than one function, and they interact in a complementary way, thereby providing a high degree of redundancy that is particularly helpful should one or more sensory systems be impaired. In this article, the problem of dual hearing and vision loss is addressed. A brief description is provided of the use of auditory cues in vision loss, the use of visual cues in hearing loss, and the additional difficulties encountered when both sensory systems are impaired. A major focus of this article is the use of sound localization by normal-hearing, hearing-impaired, and blind individuals and the special problem of sound localization in people with dual sensory loss.
Keywords: sound localization, low vision, interaural difference cues, bilateral hearing aids, directional hearing aids
Introduction
Our sensory systems are remarkable in several respects. They are extremely sensitive, they each perform multiple functions, and they interact in a complementary way, thereby providing a high degree of redundancy. This article focuses on age-related dual sensory loss and the problems that might exist for audiologists in the fitting of hearing aids when there are decrements in both vision and hearing.
We will also discuss the use of auditory cues in vision loss, the use of visual cues in hearing loss, and the special problem of dual hearing and vision loss. A major focus of this article is the use of sound localization by normal-hearing, hearing-impaired, and blind individuals, and the special problem of sound localization in people with dual sensory loss.
Hearing and vision complement each other in important ways. The visual system has much greater spatial resolution than the auditory system. In the example of localization of sounds in space, the location of a visible sound source (ie, a person speaking) can be identified visually with great accuracy. If the sound source is not visible, however, the auditory system can locate it but with less precision. Auditory localization is most accurate in the horizontal plane, whereas localization in the vertical plane and auditory distance estimation are considerably less accurate.
The auditory system has much greater temporal resolution than the visual system. Good temporal resolution is important for speech recognition because the acoustic speech signal consists of complex spectrotemporal cues that vary rapidly in time. The auditory system is well suited for processing signals of this type. The visual speech signal (ie, in speechreading) consists of facial and body movements that convey speech cues. Those articulators that are visible in face-to-face communication (ie, lips, mouth, jaw) convey important speech cues, but this information is limited. Speech gestures by articulators that are hidden from view (ie, the back of the tongue) cannot be processed visually. Very rapid articulator movements are also not processed visually because of the relatively low temporal resolution of the visual system.
Although the visual speech signal conveys a limited amount of speech information, it nevertheless complements the acoustic speech signal in important ways. For example, speech sounds that are most susceptible to masking by environmental noises are also the easiest to see visually. These visual speech cues are a very effective supplement to the auditory speech signal under adverse listening conditions.
The overlapping functions and concomitant redundancies of the visual and auditory systems are of great value in the case of sensory impairments. In some cases, the redundant cues are used without the need for artificial intervention. For example, visual speech cues are used in normal speech perception when auditory cues are lost as a result of a hearing impairment. In the case of vision loss, the auditory system provides valuable cues to assist in navigation and hazard avoidance.
Auditory and visual sensitivity normally decrease with age and it is quite common for older people to have both significant hearing and visual impairments, thereby placing a greater load on cognitive processing, which also often declines with age. To develop effective methods of intervention for older people with one or more sensory deficits, it is important to know how sensory modalities interact and the extent to which deficits in one modality can be compensated for by other modalities. It is also important to understand the possible consequences of fitting certain hearing aid processing strategies for the dually impaired. Hearing aids have been developed for listeners with normal vision and have been recommended on the premise that any limitations in sound localization ability will be negligible because the hearing aid user will have good visual abilities and will be able to compensate for the reduced or distorted localization cues. However, this is not the case for those with both vision and hearing impairments. These issues will be addressed in this article.
Vision and Hearing Impairment
A visual impairment can be defined as any chronic visual deficit that impairs everyday function and is not correctable by ordinary spectacles or contact lenses. Visual impairments include blindness and low vision. Low vision implies that the person can accomplish tasks with the use of compensatory visual strategies and environmental modifications.1
Legal blindness in the United States is typically defined as visual acuity with best correction in the better eye worse than or equal to 20/200 or a visual field extent of less than 20° in diameter. This definition represents an artificial distinction and has little value for rehabilitation, but it is significant in that it determines eligibility for certain disability benefits from the federal government.
Sensory integration becomes more difficult in people with certain eye diseases such as advanced age-related maculopathy (AMD), which often cause dense central scotomas (blind spots) in both eyes.2 These patients lose vision for fine detail, but most eventually learn to compensate for their vision loss by an alternative gaze strategy called “eccentric viewing.”3,4 They direct their gaze away from the object of interest, so that its image falls on a still intact region of the retina, called the “preferred retinal locus.”5 Depending on the severity of the disease, which can cause varying degrees of topographic damage to the retina, the enforced deviation can be as large as 20°.6 This enforced deviation of the eyes from a central position leads to a divergence between the retinal visual and the head-centered auditory coordinates. Hence, we have to expect that people with advanced maculopathies who use eccentric viewing will show deficits in their localization of objects in the world surrounding them.
Interestingly, in the most common types of visual impairment (glaucoma, cataract and media opacities, and AMD), all of which are age related,7,8 the loss of ability to detect low-contrast objects is a key factor. Just as significant, in the aging population in general, low-contrast perception has been shown to fall off many times faster with age than visual acuity.9,10
In contrast to AMD, which results in a loss of central vision, glaucoma, a condition resulting from an increase in fluid pressure inside the eye, leads to optic nerve damage and loss of vision in the periphery. Diabetic retinopathy, a leading cause of blindness in American adults, is caused by changes in the blood vessels of the retina and causes more widespread and varied scotomas or “blind spots.”
Disability resulting from hearing impairment is the third most prevalent chronic disability identified by people over the age of 65 years.11,12 The hearing-impaired population in the United States, not including residents of nursing homes or retirement homes, is more than 31 million13 or more than 18% of the adult population.14 More than one-third of the population over 65 years of age is hearing impaired.15,16 Furthermore, of those hearing-impaired individuals who could benefit from acoustic amplification, only 1 in 5 use hearing aids.17 More than 70% of hearing aid wearers are over 65.18
It is well established that hearing loss and other perceptual problems related to aging cause an increase in overall communication difficulties.19–22 Self-reports by the elderly identify communication disabilities as one of the greatest problems in their lives.23 Older people with even mild hearing impairments experience disproportionate difficulty in face-to-face and audiovisual media communication under adverse listening conditions, such as the presence of background noise. For this group there is greater reliance, therefore, on visual cues such as speechreading and interpreting facial expressions. However, in this same over-65 population group (the fastest growing segment of our society), the vision needed for this function is often impaired.
Of the 3 million Americans with low vision, almost 1 million are “legally blind,” and roughly 200 000 are more severely impaired.24 Because of their reliance on narrow definitions of visual impairment, these figures underestimate its prevalence. (See Brabyn et al10 in this issue for a discussion of the characteristics of blind and visually impaired populations.)
The prevalence of vision loss, as in the case of hearing loss, increases significantly with age. For example, approximately 10% of people over 75 have corrected visual acuity worse than 20/40.24 More than 1 million Americans who are 40 years and older are legally blind from eye disease and an additional 2.3 million are visually impaired.25 In addition, 17% of Americans 45 years and older report some type of visual impairment even when wearing eyeglasses or contact lenses. The percentage rises with age to 26% of people of age 75 years and older.7 It is estimated that 21% of older adults in the United States have dual sensory impairments by 70 years of age.26
Auditory and Visual Localization
An important feature of binaural hearing is sound localization. The ability to localize sound is important in everyday life for identifying the location of a target signal in the presence of competing signals, for allowing one to focus on a signal of interest, and for being alerted to potential hazards and their location. This ability is of value to people with normal vision and is of particular importance to people with vision loss.
Sound localization is a complex perceptual process that requires the integration of multiple acoustic cues.27 The two basic steps involved in learning to localize are the quantification of auditory cues and the association of cue values with appropriate locations in space.
The dual mechanism theory of binaural sound localization, sometimes referred to as the “duplex” theory,28 states that on the horizontal plane, low-frequency sounds are localized on the basis of interaural time differences (ITDs) and high-frequency sounds are localized on the basis of interaural intensity differences (IIDs). However, it has been shown that when wideband stimuli are produced with conflicting IID and ITD cues, listeners follow the direction of the ITD cue, as long as the stimuli include low frequencies.29 High-frequency sounds with low-frequency amplitude modulations can also be localized by means of the ITD of the modulation envelope. Localization in the vertical plane is dependent primarily on spectral shape cues.29 The reflection and diffraction of sound in and around the folds of the pinnae create perceptible high-frequency spectral differences by filtering the incoming sounds in a directionally dependent manner,30 a phenomenon that is primarily useful for localization on the vertical plane and for front–back discrimination.
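The magnitudes of the duplex-theory cues can be illustrated with an idealized spherical-head model. The sketch below uses the Woodworth approximation for ITD; the head radius and speed of sound are assumed typical values, not measurements from any of the studies cited here.

```python
import math

def woodworth_itd(azimuth_deg, head_radius_m=0.0875, speed_of_sound_ms=343.0):
    """Interaural time difference (seconds) for an idealized spherical head
    (Woodworth approximation); azimuth is measured from straight ahead."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound_ms) * (theta + math.sin(theta))

# A source directly to one side (90 degrees) produces an ITD of roughly
# 650 microseconds; a source straight ahead produces no ITD at all.
itd_side = woodworth_itd(90.0)
itd_front = woodworth_itd(0.0)
```

Against this maximum of roughly 650 µs, a just noticeable ITD of tens of microseconds corresponds to only a few degrees of azimuth near the midline, which is consistent with the high horizontal-plane accuracy described in this section.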
In conversational speech, it is important for the individual to localize and then separate the intended signal from competing sounds. A deficit in sound localization will reduce the listener's ability to locate the speaker and use audiovisual speech cues to improve speech perception.
It is well known that normal-hearing (NH) listeners can localize well in the horizontal plane and less well in the vertical plane. Depending on the stimuli and methodology, NH listeners have been reported to localize sounds in quiet (anechoic environments) on the order of 1° to 3° in the horizontal plane (H. J. Simon et al, unpublished data, 2006).30–32 Accuracy of localization is poorer for sources at the side of the head (as opposed to the median plane), and errors can be as large as 20° for sources in the rear and elevated above the horizontal plane.33,34 Similarly, accuracy of localization in the vertical plane is poorer than for the horizontal plane.35,36
The binaural auditory system is remarkably sensitive to interaural differences. The just noticeable difference for ITD is on the order of tens of microseconds37,38 and on the order of 1 dB for IID.38–40 This remarkable sensitivity accounts for the high accuracy of localization in the horizontal plane. Interaural differences are less prominent for sounds in the vertical plane and in front–back discrimination, which helps account for the poorer localization ability under these conditions.41
Durlach et al,42 in a review of binaural studies prior to 1981, showed that general horizontal localization and lateralization performance was not easily predicted on the basis of the audiogram, although it was degraded in subjects with sensorineural hearing loss, particularly those with presbyacusis, unilateral hearing loss, and bilateral asymmetry. More recently, Byrne, Noble, and colleagues36,43 and others44–46 found a moderate correlation between the severity of the hearing loss and horizontal localization difficulty and concluded that unaided localization by hearing-impaired subjects is affected by degree and type of hearing loss.36,47 Localization in the vertical plane by listeners with bilateral hearing impairment has been found to be at chance level, presumably because it requires access to high-frequency energy.35,36
The question of how well someone with a hearing loss can localize sound (with or without amplification) is still not fully resolved. Most studies of the localization abilities in sensorineural hearing loss listeners have been designed either for the purpose of describing binaural deficits36,38,48–51 or evaluating the effects of hearing aids and various amplification strategies,46,52–56 cochlear implants,57,58 ear-mold configurations,52–55,59 and ear protectors.60
Recent work from this laboratory61 studied unaided localization in the horizontal plane with NH listeners and a group of hearing-impaired listeners who had symmetric bilateral hearing loss and who used bilateral amplification (BIN). There were three important findings. First, localization error was small for the NH group and small, but significantly larger, for the BIN group of listeners. Second, precision of localization was high for both groups, but the BIN group showed larger errors in absolute judgments of direction, and these errors were less symmetric than those of the NH group. Third, the results for subjects with asymmetric losses or who had been aided monaurally were less clear: most of these subjects showed significantly poorer symmetry in their localization errors than either the NH or BIN groups.
A recent study by Van den Bogaert et al46 also tested bilaterally aided listeners with and without their hearing aids in a localization task. When tested with their hearing aids, subjects performed worse than did the NH subjects. However, in agreement with the study described above,61 more than one-half of the subjects reached NH performance levels when tested unaided. Thus, they concluded that independently operating hearing aids do not preserve localization cues.
An important advantage of binaural listening is that speech in a noisy environment is easier to understand. The improvement in speech intelligibility is due to two factors: head shadow (or head diffraction) effects and binaural auditory processing.62–64 The head shadow effect is purely acoustic and arises when the speech and interference or noise sources are in different locations. Binaural auditory processing is the ability of the binaural system to make use of the interaural difference cues (IID and ITD) in the received sounds, which enhances the separation of signal from interference (noise).
Despite the tremendous sensitivity of the binaural auditory system to acoustic localization cues, visual cues dominate auditory cues in sound localization. Although recent findings have shown that auditory information can change the percept of an unambiguous visual stimulus qualitatively (causing a strong visual illusion), it has long been considered that human beings are primarily visually oriented and that vision is the dominant modality in the multisensory perception of the world.65
The ability of a ventriloquist to make speech appear to come from a dummy is an example of visual localization cues taking precedence over competing auditory cues. A more striking example is that of the apparent source of sound in a drive-in theater. For those who can remember drive-in theaters, a loudspeaker placed on the car window delivers the soundtrack of the movie. The sound, however, appears to come from the visual image projected on the theater screen which is at some considerable distance from the car.
Conflicting visual and auditory cues can have a substantial impact on the perceived speech sound. McGurk,66 for example, showed that the perceived place of articulation of an auditory consonant (such as /ba/) can be influenced by the simultaneous presentation of a video signal of a talker saying a conflicting consonant such as /ga/. Usually, such a presentation is perceived by observers as “da” or “tha” (known as fusion responses). The reverse pairing (auditory /ga/ paired with a visual /ba/) results in “bga.” In fact, the McGurk effect is a compelling example of the effect of visual speech cues on speech perception.67
Vision is the primary sense for determining safe paths of travel, detecting obstacles, and finding landmarks. For people who are blind or who have low vision, spatial hearing, the localization or perception of space via auditory cues, is critical. “Echolocation,” which is one component of spatial hearing, may play a role in independent travel for blind people.68 Echolocation is a process for locating objects by means of sound waves produced by the emitter (eg, clicks, chirps, or cane taps) that are reflected back to the emitter from objects in the environment.69 Detection of variations in naturally occurring sounds in the ambient sound field due to reflections by objects such as walls,68,70 locating sound sources, and locating the direction and distance of moving sound sources are also very important to the visually impaired listener.71
A person with vision impairment but normal hearing is more dependent on auditory cues for navigation, locating a desired sound source and separating it from competing sounds, and for identifying alerting signals than a person with normal vision and hearing.72 For many years there has been discussion regarding auditory compensatory mechanisms in the blind predicated on the theory that the loss of the visual information channel(s) results in greater emphasis on the other sensory modalities. This implies increased requirements for auditory processing.73 Two models74,75 have been proposed for defining the role of visual experience in the development of spatial hearing in blind listeners. The deficit model74 holds that auditory space has to be calibrated by vision. This model assumes that other kinds of experience cannot be substituted for the visual experience in the development of spatial hearing.
Alternatively, the compensation model75,76 assumes that, although visual experience may normally play a role in the development of spatial hearing, other kinds of experiences are also important and that compensation occurs through multimodal use. Some proponents of this model predict that spatial hearing may actually be better in persons with visual disabilities because nonvisual areas of perception may become more highly developed than in sighted individuals.76 Both models are supported by evidence from animals (barn owl, cat, and ferret) and humans.77–79
Early studies regarding the ability of the blind to localize, a binaural task of vital significance to the blind population, reveal inconsistencies in the results (see Ashmead et al76 for a comprehensive review of spatial hearing in blind listeners). For example, Starlinger and Niemeyer73 and Muchnik et al80 found that blind subjects performed better than sighted listeners. Other studies of localization, however, found either no differences between the two groups or differences favoring the sighted groups.75,81,82 However, methodological problems and differences make comparisons between these studies difficult.75,83
A recent study from this laboratory83 suggests that blind listeners are fully able to use the cues for spatial hearing and that vision is not a mandatory prerequisite for the calibration of human spatial hearing. In this study, the effects of varying ITD and IID were measured in NH sighted and congenitally totally blind listeners. The blind listeners appeared at least comparable, and may be more sensitive, to IID and ITD than the sighted subjects, especially when attending to signals in the periphery. The findings thus support the compensation model.
However, data on sound localization in the frontal plane84 are less conclusive. In people with low vision (not blind) there may be some question about whether reduced visual acuity results in improved auditory sensitivity and vice versa. The results of Lessard et al85 and Röder et al86 suggest that when visual input is absent, early auditory spatial representations continue to develop. This is consistent with the compensation model and the cross-modal plasticity found in cats by Rauschecker and colleagues.78,87 After binocular deprivation, an increase in responsiveness to auditory and somatosensory stimuli in multimodal areas was found in the superior colliculus and the anterior ectosylvian sulcus. In addition, the number of visually responsive units did not decrease significantly, resulting in an increased proportion of multisensory neurons.78 These results are analogous to studies of the congenitally deaf by Neville and colleagues,88 where evidence of enhanced early processing of peripheral visual events was found. However, in many studies, the greater accuracy of the totally blind subjects was attributed to some form of auditory compensation not available to the groups with residual vision.
Distance Perception
Although the mechanisms that allow human listeners to determine the direction of a sound source are well known, relatively little is known about auditory distance perception. One reason for this disparity is that auditory distance cues generally rely, at least in part, on the listener's deductive knowledge about the source and/or the listening environment.89 Under real-life auditory-only conditions, listeners underestimate auditory source distance when the source is more than 1 meter away. This result differs from the visual-only condition.90,91 However, accuracy of source distance estimation is improved significantly and judgment variability is lowered by the addition of visual distance information.92
Auditory distance perception relies fundamentally on two cues: the intensity of the sound and the difference between the arrival of the direct sound and any subsequent reflections from nearby surfaces, such as the walls of a room.30 The first signal to arrive (the direct sound) is independent of the properties of the room. The subsequent signals (the acoustic reflections) provide complex acoustic cues to both the distance of the sound source and the acoustic characteristics of the room.
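These two distance cues behave in predictable ways as a source recedes, which can be sketched numerically. The sketch below is a simplified free-field model with assumed illustrative distances; it is not drawn from the studies cited in this section.

```python
import math

def direct_level_drop_db(d_near_m, d_far_m):
    """Inverse-square-law attenuation of the direct sound: about 6 dB
    of level drop for every doubling of source distance."""
    return 20.0 * math.log10(d_far_m / d_near_m)

# Moving a source from 1 m to 2 m attenuates the direct sound by ~6 dB.
# In a room, the reverberant level stays roughly constant, so the
# direct-to-reverberant ratio falls by about the same amount, providing
# the second distance cue described above.
drop_db = direct_level_drop_db(1.0, 2.0)
```

The overall-level cue is thus only informative if the listener knows (or assumes) the source's output level, which is one reason distance judgments depend on deductive knowledge about the source.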
It has been found that hearing-impaired listeners show deficits in the ability to use some of the cues for distance perception.93 Using a synthetic paradigm, psychometric functions were measured for the distance discrimination of spoken sentences in a virtual room under two conditions: (a) overall level and both direct and reflected sound cues available and (b) overall level cue eliminated. The hearing-impaired subjects performed as well as the NH controls when both cues were available but less well when the overall level cue was eliminated. In addition, measured performance correlated significantly with self-reported distance capabilities as assessed by the Speech, Spatial and Qualities of Hearing Scale.94 From this result it was suggested that hearing-impaired listeners might have a diminished capacity to discriminate distances using the relationship between the direct and reflected sounds (the reverberant sound field). In comparison, distance discrimination is much better when overall level is the primary cue.95,96
In another study of distance perception, Zahorik92 showed that the use of vision improved auditory distance judgment accuracy and lowered judgment variability compared to the condition in which only audition was used. However, the accuracy of the auditory-only condition was found to improve over the course of the experiment, which has implications for wayfinding in visually impaired listeners, and will be discussed below.
Visual Cues in Speech Perception
A sighted person with a hearing impairment is more dependent on visual cues for navigation, for locating a desired sound source in the presence of competing sounds (in the case of speech communication, facing the speaker provides valuable supplementary speechreading cues), and for identifying alerting signals.
Although speech is conveyed primarily by auditory cues, visual speech cues are particularly helpful when the auditory signal is impoverished, as in the case of background noise and/or reverberation, or as a result of a hearing loss. For example, acoustic cues conveying information on place of articulation (ie, the primary difference in articulation between /b/ and /d/) are typically not perceived in the case of a hearing loss. In contrast, visual cues signaling place of articulation are perceived with little difficulty. Acoustic cues conveying manner of articulation (ie, the primary difference in articulation between /d/ and /z/) are not affected quite as much by a hearing loss but are much harder to see visually. Acoustic cues conveying voicing (ie, the difference between /s/ and /z/) are least affected by a hearing loss; voicing cues are extremely difficult to perceive in the visual speech signal.97–102
Sumby and Pollack,103 Erber,104 MacLeod and Summerfield,105 and others have found that for listeners with normal hearing listening to speech in noise, the use of both auditory and visual cues resulted in a substantial improvement in speech intelligibility, equivalent to a reduction of the speech-to-noise ratio by 7 to 15 dB, or more, depending on the speech material (words, continuous speech) and the noise level. The contribution of visual speech cues to speech intelligibility increases with increasing impoverishment of the auditory signal; that is, with increasing background noise or reverberation or other distortions of the speech signal. Berryman et al106 also found that errors of voice, place of articulation, and manner of articulation were reduced by 70% or more when vision was added to hearing in a word-in-noise discrimination task.
In an analysis of the articulation index (AI) for auditory–visual consonant recognition, Grant and Walden100 found that voicing, manner, and place information were not independent and that one feature alters the expected performance of another. In addition, the recognition of voicing was solely determined by the auditory modality; there was no difference between the auditory and auditory–visual conditions.100 A more recent article by Schwartz et al,107 however, showed that visual speech information, in conjunction with acoustic information (a cross-modality timing cue), can be used to improve voicing recognition.
Significance of Dual Loss: Speech Recognition
As noted above, the importance of visual speech cues increases with increasing impoverishment of the auditory signal. Visual speech cues are thus of greater importance for more severe hearing losses. The mutual redundancy of auditory and visual speech cues is therefore of great value for people with hearing loss, but it is important to note that subtle vision losses can seriously reduce speechreading ability. The human face and lips are relatively low-contrast targets, and the ability to speechread is likely to be significantly affected by losses of visual contrast sensitivity even when acuity is preserved at near-normal levels. Loss of visual contrast sensitivity is common in aging. Functions such as the ability to see low-contrast objects (contrast sensitivity) and the ability to see in the presence of veiling glare decline much faster with age than does visual acuity. In the 90- to 95-year-old age group, for example, median visual acuity is only about 2 times worse than that of a young normal observer, whereas contrast sensitivity is about 6 times worse, and the ability to see a low-contrast target in glare is about 18 times worse.10
Significance of Dual Loss: Localization
As discussed earlier, it is well known that NH listeners can localize well in both the horizontal and vertical planes. Localization ability by people with symmetric hearing loss is almost as good as for NH listeners in the horizontal plane but is extremely poor in the vertical plane. Recently, the basic mechanisms underlying audiovisual interactions in sound localization have been investigated with a variety of techniques, including some form of induced visual impairment.
A study by Zwiers et al108 investigated sound and visual localization with induced visual deprivation. Subjects wore binocular 0.5× lenses that modified visual space for 2 to 3 days, before and after which auditory localization was tested. Unlike prisms, which induce a bias (a homogeneous lateral shift of the entire visual–spatial map), 0.5× lenses compress the spatial visual field by half. Sound localization remained “consistent and precise.” However, there was a compression of auditory localization, consistent with the compression of the visual field by the lenses, that was most pronounced in the horizontal plane and minimal in the vertical plane. In addition, there was no difference in performance whether subjects used eye movements in a target fixation task or peripheral vision in a central fixation task, which suggests that the major site of adaptation resides within the central auditory system. Thus, as in barn owls,109,110 the modified visual–auditory experience induced changes in auditory localization. However, unlike in studies of barn owls,111,112 the plasticity was not limited to early development.
Other studies have used prisms for a few hours for adaptation113 or used passive visual training under laboratory conditions.114,115 They showed that there are aspects of vision and audition that can be modified experimentally. For example, after prolonged exposure to an abnormal stimulus situation, localization as well as sensorimotor coordination113 showed evidence of compensation, with performance returning toward normal.
Despite the fact that the basic mechanisms underlying audiovisual interactions in sound localization have been investigated with a variety of techniques,116–119 there is no scientific report to date investigating these interactions in people for whom they should matter most, namely those with severe visual impairment or listeners with visual and hearing problems. Such studies would investigate any adaptive changes in adult human sound and visual localization that mirror the effects of the visual or auditory impairment as reported in the studies above.
Acoustic Amplification for Dual Sensory Loss
Hearing aids have been designed for sighted people with hearing loss. These hearing aids can also be of great benefit to people with both hearing and vision loss, but there are some special factors that need to be taken into account. As pointed out by Dillon,120 “The provision of bilateral hearing aids is extremely important for hearing-impaired people with severe visual problems. Even small improvements in localization are likely to be extremely important because of the increased importance to such people of auditory perception.” Simon61 has also shown, on the basis of recent research and clinical and theoretical arguments, that bilateral amplification should be the first choice for maximizing localization ability. There are, however, some special cases for which bilateral amplification is contraindicated.121
At present, a true binaural hearing aid has yet to be developed that can be programmed to maximize localization ability for a bilateral hearing loss by appropriate adjustments to the interaural amplitude and phase characteristics of the instrument. There is some movement in the industry towards the development of a true binaural hearing aid but, at present, it is common practice to use two similar hearing aids, one for each ear, for a symmetric bilateral hearing loss.
The underlying assumption with respect to sound localization using bilateral amplification is that interaural distortions or limitations introduced by bilateral amplification will be small and will have little effect on the localization of sound sources; in addition, visual localization cues will take precedence over auditory cues, and the auditory system will adapt to the amplified auditory localization cues. These assumptions are not unreasonable if two similar single-channel linear hearing aids are used and the amplified auditory localization cues are not very different from the true localization cues.
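For scale, the interaural time difference available to a listener can be approximated with the textbook Woodworth spherical-head model. The sketch below is illustrative only; the head radius and speed of sound are standard textbook values, not taken from this article:

```python
import math

SPEED_OF_SOUND = 343.0   # m/s at room temperature
HEAD_RADIUS = 0.0875     # m, a typical adult value assumed for illustration

def woodworth_itd(azimuth_deg):
    """Approximate interaural time difference (s) for a distant source at the
    given azimuth (0 deg = straight ahead), Woodworth spherical-head model:
    ITD = (a / c) * (theta + sin(theta))."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source directly to the side (90 deg) yields the maximum ITD,
# roughly 0.66 ms; a source straight ahead yields zero.
```

Any amplification scheme that shifts timing between the ears by even a few hundred microseconds is therefore operating on the same scale as the entire usable cue.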
The above assumptions, however, are questionable for more advanced nonlinear forms of signal processing. Wide dynamic range multichannel compression (WDRMCC), for example, can substantially distort the interaural cues used for sound localization. In this form of amplification, the gain at each ear is adjusted continuously depending on the signal intensity. The interaural intensity difference (IID) is an important cue for sound localization, and the ongoing gain adjustments in WDRMCC will distort this cue substantially. Interaural time differences (ITDs) will also be distorted, depending on the dynamics of the compression system at each ear. The problem is compounded by the additional interaural distortions introduced by multiband compression.
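The effect on the IID can be illustrated with a toy level-domain calculation. The compression threshold, ratio, and ear levels below are invented for illustration; a real WDRMCC aid operates on the running signal envelope in multiple bands:

```python
def compressor_gain_db(input_db, threshold_db=50.0, ratio=2.0):
    """Gain (dB) of a simple static compressor: input levels above the
    threshold are reduced toward it according to the compression ratio."""
    if input_db <= threshold_db:
        return 0.0
    return -(input_db - threshold_db) * (1.0 - 1.0 / ratio)

# Hypothetical source to the listener's right: 70 dB SPL at the near ear,
# 60 dB SPL at the far ear, i.e. a 10 dB interaural intensity difference.
near_in, far_in = 70.0, 60.0
near_out = near_in + compressor_gain_db(near_in)
far_out = far_in + compressor_gain_db(far_in)
iid_in = near_in - far_in      # 10 dB cue at the eardrums
iid_out = near_out - far_out   # 5 dB after independent 2:1 compression
```

Because each ear's compressor reduces the louder signal more, the 10 dB cue shrinks to 5 dB: independent compression at the two ears halves the IID at a 2:1 ratio.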
There is a limited body of research on the effect of interaural distortions of the type introduced by compression amplification on auditory localization. Bakke122 found that horizontal localization was not altered significantly with compression ratios up to 2:1 or with IID reductions up to 50%. Localization accuracy was degraded only when IID cues were attenuated completely in the absence of ITD cues. This result suggests that WDRMCC may not have a significant negative effect on localization provided ITD cues are not distorted. Bear in mind that this study was conducted with compression in a single band. This result may not necessarily hold for multiband compression systems.
It should be noted that WDRMCC is most useful in the high frequencies where the reduction in dynamic range of hearing is typically most severe. It should also be noted that ITD cues for sound localization are important in the low frequencies (ie, below approximately 1000 Hz). These observations suggest that a variation of WDRMCC is needed. Frequencies below 1000 Hz should be amplified linearly so that important low-frequency localization cues (ITDs in particular) are preserved while the advantages of WDRMCC are implemented in the high-frequency region where they are most needed.
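One way to realize such a variation is a two-band design with a linear low band and a compressed high band. The sketch below uses the 1000 Hz crossover suggested above; the gain, threshold, and ratio are arbitrary illustrative values, and the level-domain arithmetic stands in for actual filtering:

```python
CROSSOVER_HZ = 1000.0  # linear below, compressed above (per the text)

def aided_level_db(freq_hz, input_db, linear_gain_db=20.0,
                   threshold_db=50.0, ratio=2.0):
    """Output level (dB) of a hypothetical hybrid aid: linear gain below the
    crossover (preserving low-frequency ITD/IID cues), wide dynamic range
    compression above it."""
    if freq_hz < CROSSOVER_HZ or input_db <= threshold_db:
        return input_db + linear_gain_db                 # linear amplification
    # above crossover and threshold: compress toward the threshold
    return threshold_db + (input_db - threshold_db) / ratio + linear_gain_db

# A 10 dB IID at 500 Hz passes through intact; at 2000 Hz it is halved.
iid_low = aided_level_db(500, 70) - aided_level_db(500, 60)     # 10 dB
iid_high = aided_level_db(2000, 70) - aided_level_db(2000, 60)  # 5 dB
```

The design choice is simply to confine the IID distortion shown earlier to the frequency region where ITD and IID cues matter least and where dynamic-range reduction matters most.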
Another issue is that of distance perception in listeners with dual sensory loss. As discussed earlier, distance perception is affected by hearing loss,93 and people with hearing loss rely mainly on changes in overall level as their primary auditory cue for distance. Signal processing strategies that affect overall level (such as compression amplification) can thus provide misleading auditory cues for distance perception. The problem is likely to be more serious for people with dual sensory loss, who rely heavily on auditory cues for wayfinding.
There is currently great interest in the use of directional inputs (ie, a directional microphone or microphone array) for improving speech recognition in noise. People with a hearing loss or dual sensory loss can benefit from this form of signal processing when listening to speech in the presence of competing signals coming from different directions, as in a noisy restaurant or cocktail party. A directional input, however, will attenuate alerting signals and other important information coming from a direction outside the range of the directional input. For a person with dual sensory loss this can be a very serious limitation. Similarly, directional hearing aids can be detrimental to people with low vision or blindness who rely heavily on these cues for wayfinding, localization, and distance perception.
Ricketts123 has reported that some of the problems associated with highly directional hearing aids are “tunnel hearing” (not being aware of important sounds not directly in front of the listener) and limited low-frequency microphone sensitivity. Directional microphones typically have a 6 dB per octave reduction in low-frequency sensitivity. Although compensation for the associated low-frequency attenuation is necessary for audibility in listeners with low-frequency hearing losses greater than approximately 40 dB HL,124 gain compensation for listeners with little or no low-frequency hearing loss is not recommended because of the potential increase in audibility of the low-frequency microphone noise floor. Ricketts123 also notes that there is a directional disadvantage in some noisy environments, especially when the target speech, for example, is behind or at the side(s) of the listener.
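The 6 dB per octave figure follows from the physics of a first-order pressure-gradient microphone, whose output depends on the pressure difference across a small port spacing. A sketch of the idealized on-axis response (the 12 mm port spacing and the reference frequency are assumed values for illustration):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s
PORT_SPACING = 0.012    # m, assumed spacing between the two microphone ports

def gradient_sensitivity_db(freq_hz, ref_hz=2000.0):
    """On-axis sensitivity of an idealized first-order pressure-gradient
    (directional) microphone, in dB relative to its value at ref_hz.
    The response is proportional to |sin(k * d / 2)|, which for small k*d
    grows linearly with frequency -- a 6 dB per octave slope."""
    k = 2 * math.pi * freq_hz / SPEED_OF_SOUND
    k_ref = 2 * math.pi * ref_hz / SPEED_OF_SOUND
    resp = abs(math.sin(k * PORT_SPACING / 2))
    resp_ref = abs(math.sin(k_ref * PORT_SPACING / 2))
    return 20 * math.log10(resp / resp_ref)

# In the low-frequency region, halving the frequency costs about 6 dB
# of sensitivity -- the roll-off Ricketts describes.
```

This is why gain compensation for the low band trades audibility against the microphone noise floor: the roll-off is intrinsic to the gradient principle, not a design defect.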
Henry and Ricketts,125 comparing three listening conditions (unaided, and bilaterally fitted behind-the-ear hearing aids in omnidirectional and directional microphone modes), found that changes in IIDs from those provided by the unaided ear did not consistently result in decreases in auditory localization accuracy, as hypothesized. Localization accuracy was significantly poorer in the aided omnidirectional mode than in the unaided condition; localization in the unaided and directional conditions, however, did not differ significantly.
A blind person is critically dependent on acoustic cues in wayfinding. To develop hearing aids for blind people with hearing loss, it is important to know the nature of the acoustic cues used by blind people in wayfinding. Kuttruff126 investigated the ambient sound field in a room. He found that the sound field is not spatially uniform: interference patterns cause sound pressure to build up near the boundaries (walls, floors, ceilings). This buildup extends further from the boundaries for low-frequency (less than 500 Hz) than for higher-frequency sounds (3–6 feet vs 1 inch).
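The frequency dependence of this buildup follows from wavelength: the interference region near a boundary extends over a distance on the order of a fraction of a wavelength. A back-of-envelope check, using a quarter-wavelength scale as an illustrative simplification rather than Kuttruff's full model:

```python
SPEED_OF_SOUND = 343.0  # m/s

def quarter_wavelength_m(freq_hz):
    """Quarter wavelength in metres -- a rough scale for how far the
    near-boundary pressure buildup extends from a wall."""
    return SPEED_OF_SOUND / freq_hz / 4

# At 100 Hz, lambda/4 is about 0.86 m (roughly 2.8 ft); at 5 kHz it is
# about 17 mm (under an inch) -- the feet-versus-inches contrast above.
```

The order-of-magnitude agreement with the 3–6 feet versus 1 inch figures shows why the usable near-wall cues are concentrated in the low frequencies.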
In a related study, Ashmead and colleagues68,76 investigated the specific acoustic information available to travelers to detect a wall on their periphery. Their results were in agreement with Kuttruff's model of the sound field in a room and the importance of low frequency signals for wayfinding. They found that listeners were able to detect a spectral shift in the ambient sound field near a wall and not necessarily an overall increase in sound level. This spectral shift was in the low frequency region at an average distance of 47 cm from a wall. This finding was consistent with another result,127 which showed that the walking paths of blind children were straighter and more parallel in hallways that were narrower than 3 meters. These results emphasize the importance of low-frequency cues in wayfinding and the need for hearing aids that will amplify low-frequency acoustic cues with minimal distortion.
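The 47 cm figure is also consistent with a simplified wall-reflection model: for a listener a distance d from a wall, the reflected path is about 2d longer than the direct path, and the first interference notch falls near c/(4d). This idealization (real walls, source positions, and head effects differ) nonetheless places the spectral change squarely in the low frequencies:

```python
SPEED_OF_SOUND = 343.0  # m/s

def first_interference_hz(distance_m):
    """Frequency (Hz) of the first interference notch for a listener a given
    distance from a wall, assuming an extra reflected-path length of 2*d."""
    return SPEED_OF_SOUND / (4 * distance_m)

# At the 47 cm average detection distance reported above, the spectral
# change falls near 182 Hz -- well within the low-frequency region.
```

This again argues for amplification that passes low frequencies with minimal distortion.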
Cochlear implant (CI) processing is also of interest for the population with dual sensory loss. Recently, bilateral electrical stimulation has been used with CI users. Tyler et al128 stated that precise interaural timing of electrical stimulation to preserve ITD cues on a pulse-by-pulse basis has not been demonstrated with two CIs functioning independently, nor has the problem of coding level differences between the ears been solved. Without such processing, the interaural ITD and IID cues would be lost. These cues, as discussed, are basic to localization, and their loss would impair listeners' localization abilities.
Recent studies have nevertheless shown a benefit of bilateral CI stimulation for some bilateral implant users (children and adults). Benefits include improved localization, binaural unmasking, and good sensitivity to interaural level differences in lateralization.58,129–132 Interestingly, an attempt to preserve the fine-structure ITD cues did not show an improvement over the unprocessed cues.58
Conclusions
The auditory and visual modalities complement each other in important ways. This redundancy is of great value if one or the other modality is impaired. Visual speech cues, for example, compensate for many of the auditory speech cues that are lost as a result of a hearing loss, while auditory localization cues are of great value to people with vision loss. Much of this redundancy is lost, however, if both hearing and vision are impaired.
The focus of this article is on auditory localization and the effect of dual sensory loss on localization ability and, concomitantly, on wayfinding ability. Although there is a substantial body of literature documenting the great precision and accuracy of localization in people with normal hearing and a smaller but significant body of literature documenting the localization abilities of people with hearing loss, there is an almost complete lack of data on the localization abilities of people with both vision and hearing loss.
In summary, these investigations show that normal hearing individuals have excellent auditory localization ability (in the horizontal plane) and that the localization ability of sighted individuals with symmetric bilateral hearing impairment is almost as good in the unaided condition but not necessarily in the aided condition. These data indicate that in many, if not most, cases bilateral hearing aids distort the interaural cues for auditory localization. At present, there are no true binaural hearing aids that allow interaural time and intensity cues to be adjusted for accurate sound localization. Bilateral hearing aids are nevertheless widely recommended, based on the assumption that the hearing aid user has normal or near-normal vision (with eyeglasses) and that visual localization cues will take precedence over any misleading auditory localization cues introduced by imperfect bilateral amplification. This assumption, however, is questionable for people with dual sensory loss.
The problem of misleading sound localization cues in bilateral amplification is potentially worse with modern wide dynamic range compression hearing aids. These hearing aids alter the gain depending on the signal level and, consequently, they can introduce intensity differences that distort the interaural cues used for sound localization. If these hearing aids, in addition, distort ITD cues, accuracy of localization is likely to be reduced substantially. Hearing aids with directional inputs can also distort the interaural cues used for sound localization. There have been few experiments investigating the effect of wide dynamic range compression and/or directional inputs on localization ability, and a more substantial experimental effort is needed to determine the effects of these and related methods of signal processing on auditory localization.
Inferences based on data obtained for sighted people with hearing loss, but with no visual cues, indicate that methods of amplification for individuals with dual sensory loss should not distort low frequency cues, particularly for wayfinding applications, and that the advantages of wide dynamic range amplitude compression should be limited to high frequencies (ie, above 1000 Hz). It is also important to bear in mind that a directional input can improve speech-to-noise ratio (an important consideration for communication in a noisy restaurant or cocktail party), but it does so at the expense of attenuating alerting signals from directions outside the range of the directional input.
In short, if visual cues are not available to compensate for the loss of or distortions to auditory localization cues, as may occur with current forms of bilateral amplification, it is important that the method of amplification should not distort these auditory cues. In particular, the amplification system should not distort either ITD or IID cues in the low frequencies where these cues are especially important.
Acknowledgments
Parts of this paper were presented at the ISAC-07, May 17, 2007, Portland, Maine. Preparation of this paper was supported by the Smith-Kettlewell Eye Research Institute and NIDRR (HJS) and by the National Center for Rehabilitative Auditory Research (HL). It was also supported in part by a program grant from the Department of Veterans Affairs, Rehabilitation Research and Development Service, National Center for Rehabilitative Auditory Research (Program #C2659C).
References
- 1. Corn AL, Koenig AJ. Perspectives on Low Vision. New York: American Foundation for the Blind; 1996.
- 2. Bressler NM, Bressler SB, Fine SL. Age-related macular degeneration. Surv Ophthalmol. 1988;32:375–413.
- 3. Von Noorden GK, Mackensen G. Phenomenology of eccentric fixation. Am J Ophthalmol. 1962;53:642–660.
- 4. Nilsson UL, Frennesson C, Nilsson SE. Location and stability of a newly established eccentric retinal locus suitable for reading, achieved through training of patients with a dense central scotoma. Optom Vis Sci. 1998;75:873–878.
- 5. Timberlake GT, Mainster MA, Peli E, Augliere RA, Essock EA, Arend LE. Reading with a macular scotoma. I. Retinal location of scotoma and fixation area. Invest Ophthalmol Vis Sci. 1986;27:1137–1147.
- 6. Fletcher DC, Schuchard RA. Preferred retinal loci relationship to macular scotomas in a low-vision population. Ophthalmology. 1997;104:632–638.
- 7. Williams TF. Vision impairment in older adults. In: Silverstone B, Lang MA, Rosenthal BP, Faye E, eds. The Lighthouse Handbook on Visual Impairment and Vision Rehabilitation. New York, NY: Oxford University Press; 2000:1279–1286.
- 8. Watson GR. Low vision in the geriatric population: rehabilitation and management. J Am Geriatr Soc. 2001;49:317–330.
- 9. Haegerstrom-Portnoy G, Schneck ME, Brabyn J. Seeing into old age: vision function beyond acuity. Optom Vis Sci. 1999;76:141–158.
- 10. Brabyn J, Schneck ME, Haegerstrom-Portnoy G, Lott L. The Smith Kettlewell Institute (SKI) longitudinal study of vision function and its impact among the elderly: an overview. Optom Vis Sci. 2001;78:264–269.
- 11. National Center for Health Statistics (NCHS). Prevalence of Selected Impairments. Washington, DC: US Government Printing Office; 1977.
- 12. National Center for Health Statistics (NCHS). Current Estimates From the National Health Interviews Survey, U.S. 1985. Washington, DC: US Government Printing Office; 1986.
- 13. Kochkin S. MarkeTrak VII. Hearing loss population tops 31 million people. Hear Rev. 2005;12:16–29.
- 14. Blackwell DL, Coles R. Summary Health Statistics for U.S. Adults: National Health Interview Survey, 1997. Washington, DC: National Center for Health Statistics; 2002.
- 15. Ries PW. Prevalence and Characteristics of Persons With Hearing Trouble: United States, 1990–1991. Washington, DC: Public Health Service, US Government Printing Office; 1994.
- 16. Pleis JR, Coles R. Summary Health Statistics for U.S. Adults: National Health Interview Survey, 1999. Washington, DC: National Center for Health Statistics; 2003.
- 17. Kochkin S. MarkeTrak VII. Obstacles to adult non-user adoption of hearing aids. Hear J. 2007;60:24–50.
- 18. LaPlante MP, Hendershot GE, Moss AJ. Assistive Technology Devices and Home Accessibility Features: Prevalence, Payment, Need and Trends, Advance Data From Vital and Health Statistics. Hyattsville, MD: National Center for Health Statistics; 1992.
- 19. Dubno JR, Dirks DD, Morgan DE. Effects of age and mild hearing loss on speech recognition in noise. J Acoust Soc Am. 1984;76:87–96.
- 20. Gelfand SA, Piper N, Silman S. Consonant recognition in quiet as a function of aging among normal hearing subjects. J Acoust Soc Am. 1985;78:1198–1206.
- 21. Nabelek AK. Identification of vowels in quiet, noise, and reverberation: relationships with age and hearing loss. J Acoust Soc Am. 1988;84:476–484.
- 22. Gordon-Salant S, Fitzgibbons PJ. Profile of auditory temporal processing in older listeners. J Speech Lang Hear Res. 1999;42:300–313.
- 23. Jacobs-Condit L. Gerontology and Communication Disorders. Rockville, MD: American Speech-Language-Hearing Association; 1984.
- 24. Schneck ME, Haegerstrom-Portnoy G. Practical assessment of vision in the elderly. Ophthalmol Clin North Am. 2003;16:269–287.
- 25. Vision Problems in the U.S. Schaumburg, IL: Prevent Blindness America; 2002.
- 26. Berry P, Mascia J, Steinman BA. Vision and hearing loss in older adults: "double trouble." Care Manag J. 2004;5:35–40.
- 27. Wightman FL, Kistler DJ. Perceptual consequences of engineering compromises in synthesis of virtual auditory objects. J Acoust Soc Am. 1992;92:2332A.
- 28. Strutt JW (Lord Rayleigh). On our perception of sound direction. Phil Mag. 1907;13:214–232.
- 29. Wightman FL, Kistler DJ. The dominant role of low-frequency interaural time differences in sound localization. J Acoust Soc Am. 1992;91:1648–1661.
- 30. Blauert J. Spatial Hearing. Revised ed. Cambridge, MA: MIT Press; 1997.
- 31. Bronkhorst AW. Localization of real and virtual sound sources. J Acoust Soc Am. 1995;98:2542–2553.
- 32. Seeber B. A new method for localization studies. Acustica—Acta Acust. 2002;88:446–450.
- 33. Oldfield SR, Parker SP. Acuity of sound localisation: topography of auditory space. I. Normal hearing conditions. Perception. 1984;13:581–600.
- 34. Makous JC, Middlebrooks JC. Two-dimensional sound localization by human listeners. J Acoust Soc Am. 1990;87:2188–2200.
- 35. Butler RA. The effect of hearing impairment on locating sound in the vertical plane. Int Audiol. 1970;9:117–116.
- 36. Noble W, Byrne D, LePage B. Effects on sound localization of configuration and type of hearing impairment. J Acoust Soc Am. 1994;95:992–1005.
- 37. Yost WA. Discriminations of interaural phase differences. J Acoust Soc Am. 1974;55:1299–1303.
- 38. Gabriel KJ, Koehnke J, Colburn HS. Frequency dependence of binaural performance in listeners with impaired binaural hearing. J Acoust Soc Am. 1992;91:336–347.
- 39. Mills AW. Lateralization of high frequency tones. J Acoust Soc Am. 1960;32:132–134.
- 40. Yost WA, Dye RH Jr. Discrimination of interaural differences of levels as a function of frequency. J Acoust Soc Am. 1988;83:1846–1851.
- 41. Wightman FL, Kistler DJ. Sound localization. In: Yost WA, Popper AN, Fay RR, eds. Human Psychophysics. New York: Springer-Verlag; 1993:155–192.
- 42. Durlach NI, Thompson CL, Colburn HS. Binaural interaction of impaired listeners. A review of past research. Audiology. 1981;20:181–211.
- 43. Byrne D, Dirks D. Effects of acclimatization and deprivation on non-speech auditory abilities. Ear Hear. 1996;17(Suppl):29S–37S.
- 44. Hunig G, Berg M. Richtungshoren von patienten mit seitenungleichem horvermogen. Audiologische Akustik. 1990;3:86–97.
- 45. Proeschel ULJ, Doering WH. Richtungshoren in der horizontalebene bei storungen der auditiven selecktionsfahigkeit und bei seitengleicher innenohrschwerhorigkeit, Teil I, Teil II. Audiologische Akustik. 1990;3:98–107, 170–177.
- 46. Van den Bogaert T, Klasen TJ, Moonen M, Van Deun L, Wouters J. Horizontal localization with bilateral hearing aids: without is better than with. J Acoust Soc Am. 2006;119:515–526.
- 47. Byrne D, Noble W, Ter-Horst K. Effects of hearing aids on localization of sounds by people with sensorineural and conductive/mixed hearing losses. Aust J Audiol. 1995;17:79–86.
- 48. Gabriel KJ, Koehnke J, Colburn HS. Frequency dependence of binaural performance in listeners with impaired binaural hearing. J Acoust Soc Am. 1992;91:336–347.
- 49. Koehnke J, Culotta CP, Hawley ML, Colburn HS. Effects of reference interaural time and intensity differences on binaural performance in listeners with normal and impaired hearing. Ear Hear. 1995;16:331–353.
- 50. Simon HJ, Aleksandrovsky I. Perceived lateral position of narrow-band noise in hearing-impaired and normal-hearing listeners under conditions of equal sensation level and sound pressure level. J Acoust Soc Am. 1997;102:1821–1826.
- 51. Smoski W, Trahiotis C. Discrimination of interaural temporal disparities by normal-hearing listeners and listeners with high-frequency sensorineural hearing loss. J Acoust Soc Am. 1986;79:1541–1547.
- 52. Byrne D, Noble W, LePage B. Effects of long-term bilateral and unilateral fitting of different hearing aid types on the ability to locate sounds. J Am Acad Audiol. 1992;4:369–382.
- 53. Koehnke J, Zurek PM. Localization and binaural detection with monaural and binaural amplification. J Acoust Soc Am. 1990;88:S169.
- 54. Noble W, Byrne D. A comparison of different binaural hearing aid systems for sound localization in the horizontal and vertical planes. Br J Audiol. 1990;24:335–346.
- 55. Noble W, Sinclair S, Byrne D. Improvement in aided sound localization with open earmolds: observations in people with high-frequency hearing loss. J Am Acad Audiol. 1998;9:25–34.
- 56. Drennan WR, Gatehouse S, Howell P, Van Tasell D, Lund S. Localization and speech-identification ability of hearing-impaired listeners using phase-preserving amplification. Ear Hear. 2005;26:461–472.
- 57. Seeber B, Bauman U, Fastl H. Localization ability with bimodal hearing aids and bilateral cochlear implants. J Acoust Soc Am. 2004;116:1698–1709.
- 58. van Hoesel RJ, Tyler RS. Speech perception, localization, and lateralization with bilateral cochlear implants. J Acoust Soc Am. 2003;113:1617–1630.
- 59. Nabelek AK, Letowski T, Mason D. An influence of binaural hearing aids on positioning of sound images. J Speech Hear Res. 1980;23:670–687.
- 60. Abel SM, Hay VH. Sound localization: the interaction of aging, hearing loss and hearing protection. Scand Audiol. 1996;25:3–12.
- 61. Simon HJ. Bilateral amplification and sound localization: then and now. J Rehabil Res Dev. 2005;42:117–132.
- 62. Carhart R. Monaural and binaural discrimination against competing sentences. Int Audiol. 1965;1:5–10.
- 63. Dirks DD, Wilson RH. Binaural hearing of speech for aided and unaided conditions. J Speech Hear Res. 1969;12:650–664.
- 64. Dirks DD, Wilson RH. The effect of spatially separated sound sources on speech intelligibility. J Speech Hear Res. 1969;12:5–38.
- 65. Shams L, Kamitani Y, Shimojo S. Illusions. What you see is what you hear. Nature. 2000;408:788.
- 66. McGurk E. Susceptibility to visual illusions. J Psychol. 1965;61:127–143.
- 67. Massaro DW. Speech Perception by Ear and Eye: A Paradigm for Psychological Inquiry. Hillsdale, NJ: Lawrence Erlbaum Associates; 1987.
- 68. Ashmead DH, Wall RS. Auditory perception of walls via spectral variations in the ambient sound field. J Rehabil Res Dev. 1999;36:313–322.
- 69. Rice CE. Human echo perception. Science. 1967;155:656–664.
- 70. Strelow ER, Brabyn JA. Locomotion of the blind controlled by natural sound cues. Perception. 1982;11:635–640.
- 71. Blumsack JT. Audiological assessment, rehabilitation, and spatial hearing considerations associated with visual impairment in adults: an overview. Am J Audiol. 2003;12:76–83.
- 72. Wiener WR, Lawson GD. Audition for the traveler who is visually impaired. In: Blasch BB, Wiener WR, Welsh RL, eds. Foundations of Orientation and Mobility. New York, NY: AFB Press; 1999.
- 73. Starlinger I, Niemeyer W. Do the blind hear better? Investigations on auditory processing in congenital or early acquired blindness. I. Peripheral functions. Audiology. 1981;20:503–509.
- 74. Axelrod S. Effects of Early Blindness. New York, NY: American Foundation for the Blind; 1959.
- 75. Jones B. Spatial perception in the blind. Br J Psychol. 1975;66:461–472.
- 76. Ashmead D, Wall R, Eaton S, et al. Echolocation reconsidered: using spatial variations in the ambient sound field to guide locomotion. J Vis Impair Blind. 1998;9:615–632.
- 77. Knudsen EI. Early blindness results in a degraded auditory map of space in the optic tectum of the barn owl. Proc Natl Acad Sci USA. 1988;85:6211–6214.
- 78. Rauschecker JP, Harris LR. Auditory compensation of the effect of visual deprivation in cats' superior colliculus. Exp Brain Res. 1983;50:63–83.
- 79. Röder B, Stock O, Bien S, Neville H, Rosler F. Speech processing activates visual cortex in congenitally blind humans. Eur J Neurosci. 2002;16:930–936.
- 80. Muchnik C, Efrati M, Nemeth E, Malin M, Hildesheimer M. Central auditory skills in blind and sighted subjects. Scand Audiol. 1991;20:19–23.
- 81. Tonning FM. Ability of the blind to localize noise: practical consequences. Scand Audiol. 1975;4:183–186.
- 82. Wanet MC, Veraart C. Processing of auditory information by the blind in spatial localization tasks. Percept Psychophys. 1985;38:91–96.
- 83. Simon HJ, Divenyi PL, Lotze A. Lateralization of narrow band noise by blind and sighted listeners. Perception. 2002;31:855–873.
- 84. Zwiers MP, Van Opstal AJ, Cruysberg JR. A spatial hearing deficit in early-blind humans. J Neurosci. 2001;21(RC142):141–145.
- 85. Lessard N, Pare M, Lepore F, Lassonde M. Early-blind human subjects localize sound sources better than sighted subjects. Nature. 1998;395:278–280.
- 86. Röder B, Teder-Salejarvi W, Sterr A, Rosler F, Hillyard SA, Neville HJ. Improved auditory spatial tuning in blind humans. Nature. 1999;400:162–166.
- 87. Rauschecker JP. Compensatory plasticity and sensory substitution in the cerebral cortex. Trends Neurosci. 1995;18:36–43.
- 88. Neville HJ, Lawson D. Attention to central and peripheral visual space in a movement detection study: an event-related potential and behavioral study. II. Congenitally deaf adults. Brain Res. 1987;405:268–283.
- 89. Brungart DS, Scott KR. Auditory distance perception of speech: the influence of production level. AFRL Technol Horizons. 2000;1:27–28.
- 90. Loomis JM, Klatzky RL, Philbeck JW, Golledge RG. Assessing auditory distance perception using perceptually directed action. Percept Psychophys. 1998;60:966–980.
- 91. Bronkhorst A, Houtgast T. Auditory distance perception in rooms. Nature. 1999;397:517–520.
- 92. Zahorik P. Estimating sound source distance with and without vision. Optom Vis Sci. 2001;78:270–275.
- 93. Akeroyd MA, Gatehouse S, Blaschke J. The detection of differences in the cues to distance by elderly hearing-impaired listeners. J Acoust Soc Am. 2007;121:1077–1089.
- 94. Gatehouse S, Noble W. The speech, spatial and qualities of hearing scale (SSQ). Int J Audiol. 2004;43:85–99.
- 95. Ashmead DH, LeRoy D, Odom RD. Perception of the relative distances of nearby sound sources. Percept Psychophys. 1990;47:326–331.
- 96. Zahorik P. Direct-to-reverberant energy ratio sensitivity. J Acoust Soc Am. 2002;112:2110–2117.
- 97. Walden BE, Prosek RA, Worthington DW. Predicting audiovisual consonant recognition performance of hearing-impaired adults. J Speech Hear Res. 1974;17:270–278.
- 98. Montgomery AA, Walden BE, Schwartz DM, Prosek RA. Training auditory-visual speech reception in adults with moderate sensorineural hearing loss. Ear Hear. 1984;5:30–36.
- 99. Walden BE, Busacco DA, Montgomery AA. Benefit from visual cues in auditory-visual speech recognition by middle-aged and elderly persons. J Speech Hear Res. 1993;36:431–436.
- 100. Grant KW, Walden BE. Evaluating the articulation index for auditory-visual consonant recognition. J Acoust Soc Am. 1996;100:2415–2424.
- 101. Grant KW, Walden BE, Seitz PF. Auditory-visual speech recognition by hearing-impaired subjects: consonant recognition, sentence recognition, and auditory-visual integration. J Acoust Soc Am. 1998;103:2677–2690.
- 102. Grant KW, Seitz PF. The use of visible speech cues for improving auditory detection of spoken sentences. J Acoust Soc Am. 2000;108:1197–1208.
- 103. Sumby WH, Pollack I. Visual contributions to speech intelligibility in noise. J Acoust Soc Am. 1954;26:212–215.
- 104. Erber NP. Interaction of audition and vision in the recognition of oral speech stimuli. J Speech Hear Res. 1969;12:423–425.
- 105. MacLeod A, Summerfield Q. A procedure for measuring auditory and audio-visual speech-reception thresholds for sentences in noise: rationale, evaluation, and recommendations for use. Br J Audiol. 1990;24:29–43.
- 106. Berryman L, Dancer J, Burl N. An examination of error reductions in consonant recognition when vision and hearing are combined. Percept Motor Skills. 1996;82:558.
- 107. Schwartz JL, Berthommier F, Savariaux C. Seeing to hear better: evidence for early audio-visual interactions in speech identification. Cognition. 2004;93:B69–B78.
- 108. Zwiers MP, Van Opstal AJ, Paige GD. Plasticity in human sound localization induced by compressed spatial vision. Nat Neurosci. 2003;6:175–181.
- 109. Knudsen EI, Knudsen PF. Vision guides the adjustment of auditory localization in young barn owls. Science. 1985;230:545–548.
- 110. Brainard MS, Knudsen EI. Dynamics of visually guided auditory plasticity in the optic tectum of the barn owl. J Neurophysiol. 1995;73:595–614.
- 111. Knudsen EI, Knudsen PF. The sensitive period for auditory localization in barn owls is limited by age, not by experience. J Neurosci. 1986;6:1918–1924.
- 112. Knudsen EI. Capacity for plasticity in the adult owl auditory system expanded by juvenile experience. Science. 1998;279:1531–1533.
- 113. Lackner JR. Influence of abnormal postural and sensory conditions on human sensorimotor localization. Environ Biol Med. 1976;2:137–177.
- 114. Shinn-Cunningham B, Durlach N, Held R. Adapting to supernormal auditory localization cues II. Constraints on adaptation of mean response. J Acoust Soc Am. 1998;103:3667–3676.
- 115. Shinn-Cunningham B. Adapting to remapped auditory localization cues: a decision-theory model. Percept Psychophys. 2000;62:33–47.
- 116. Lewald J, Ehrenstein WH. The effect of eye position on auditory lateralization. Exp Brain Res. 1996;108:473–485.
- 117.Lewald J. The effect of gaze eccentricity on perceived sound direction and its relation to visual localization. Hear Res. 1998;115: 206–216 [DOI] [PubMed] [Google Scholar]
- 118.Ladavas E, Pavani F. Neuropsychological evidence of the functional integration of visual, auditory and proprioceptive spatial maps. Neuroreport. 1998;9: 1195–1200 [DOI] [PubMed] [Google Scholar]
- 119.Zimmer U, Lewald J, Erb M, Grodd W, Karnath HO. Is there a role of visual cortex in spatial hearing? Eur J Neurosci. 2004;20: 3148–3156 [DOI] [PubMed] [Google Scholar]
- 120. Dillon H. Hearing Aids. Turramurra, Australia: Boomerang Press; 2001
- 121. Martin JS, Jerger JF. Some effects of aging on central auditory processing. J Rehabil Res Dev. 2005;42: 25–44
- 122. Bakke MH. The Contribution of Interaural Intensity Differences to the Horizontal Auditory Localization of Narrow Bands of Noise. New York: The City University of New York Graduate Center; 1999
- 123. Ricketts TA. Directional hearing aids: then and now. J Rehabil Res Dev. 2005;42: 133–144
- 124. Ricketts T, Henry P. Low-frequency gain compensation in directional hearing aids. Am J Audiol. 2002;11: 29–41
- 125. Henry P, Ricketts T. The effects of changes in head angle on auditory and visual input for omnidirectional and directional microphone hearing aids. Am J Audiol. 2003;13: 41–51
- 126. Kuttruff H. Room Acoustics. New York: Halsted Press; 1973
- 127. Ashmead DH, Wall RS, Ebinger KA, Eaton SB, Snook-Hill MM, Yang X. Spatial hearing in children with visual disabilities. Perception. 1998;27: 105–122
- 128. Tyler RS, Dunn CC, Witt SA, Preece JP. Update on bilateral cochlear implantation. Curr Opin Otolaryngol Head Neck Surg. 2003;11: 388–393
- 129. Long CJ, Eddington DK, Colburn HS, Rabinowitz WM. Binaural sensitivity as a function of interaural electrode position with a bilateral cochlear implant user. J Acoust Soc Am. 2003;114: 1565–1574
- 130. Litovsky RY, Parkinson A, Arcaroli J, et al. Bilateral cochlear implants in adults and children. Arch Otolaryngol Head Neck Surg. 2004;130: 648–655
- 131. Litovsky RY, Johnstone PM, Godar S, et al. Bilateral cochlear implants in children: localization acuity measured with minimum audible angle. Ear Hear. 2006;27: 43–59
- 132. van Hoesel RJ. Exploring the benefits of bilateral cochlear implants. Audiol Neurootol. 2004;9: 234–246