William A. Yost – email@example.com, paper presenter
Xuan Zhong – firstname.lastname@example.org
Speech and Hearing Science
Arizona State University
P.O. Box 870102
Tempe, AZ 85287
Popular version of paper 4aPP2; related papers 1aPPa1, 1pPP7, 1pPP17, 3aPP4
Presented Monday morning, May 18, 2015
169th ASA Meeting, Pittsburgh
When an object (sound source) produces sound, that sound can be used to locate the spatial position of the sound source. Since sound has no physical attributes related to space and the auditory receptors do not respond according to where the sound comes from, the brain makes computations based on the sound’s interaction with the listener’s head. These computations provide information about sound source location. For instance, sound from a source opposite the right ear will reach that ear slightly before reaching the left ear since the source is closer to the right ear. This slight difference in arrival time produces an interaural (between the ears) time difference (ITD), which is computed in neural circuits in the auditory brainstem as one cue used for sound source localization (i.e., small ITDs indicate that the sound source is near the front and large ITDs that the sound source is off to one side).
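The ITD described above can be illustrated with a simple calculation. The sketch below uses the classic Woodworth spherical-head approximation, ITD ≈ (r/c)(sin θ + θ); the head radius and speed of sound are typical textbook values, not measurements from this study.

```python
import math

def itd_seconds(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Approximate interaural time difference (ITD) for a spherical head
    using the Woodworth formula: ITD = (r/c) * (sin(theta) + theta),
    where theta is the source azimuth in radians (0 = straight ahead).
    The default head radius and speed of sound are illustrative values."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (math.sin(theta) + theta)

# As the text notes, small ITDs indicate a source near the front and
# large ITDs a source off to one side:
for az in (0, 30, 60, 90):
    print(f"{az:3d} deg -> {itd_seconds(az) * 1e6:6.1f} microseconds")
```

For a typical adult head this yields an ITD of roughly zero straight ahead, growing to about 650 microseconds for a source directly opposite one ear.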
We are investigating sound source localization when the listener and/or the sound source moves. Figure 1 shows the laboratory: an echo-reduced room with 36 loudspeakers mounted on a sphere of 5-foot radius, along with a computer-controlled chair that rotates listeners while they listen to sounds presented from the loudspeakers.

Conditions in which sounds and listeners move present a challenge for the auditory system in processing auditory spatial cues for sound source localization. When either the listener or the source moves, the ITDs change. So when the listener moves, the ITD changes, signaling that the source moved even if it did not. To prevent this type of confusion about the location of sound sources, the brain needs another piece of information. We have shown that, in addition to computing auditory spatial cues like the ITD, the brain also needs information about the location of the listener. Without both types of information, our experiments indicate, major errors occur in locating sound sources. When vision is used to provide information about the location of the listener, sound source localization is accurate.

Thus, sound source localization requires not only auditory spatial cues such as the ITD, but also information from systems like vision that indicate the listener's spatial location. This has been an underappreciated aspect of sound source localization, and additional research will be needed to understand more fully how these two forms of essential information are combined and used to locate sound sources. Improving sound source localization accuracy when listeners and/or sources move has many practical applications, ranging from aiding people with hearing impairment to improving robots' abilities to use sound to locate objects (e.g., a person in a fire). [The research was supported by a grant from the Air Force Office of Scientific Research (AFOSR).]
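The idea that the brain must combine head-relative acoustic cues with knowledge of the listener's own orientation can be sketched in a few lines. This toy calculation is our illustration of the principle, not the paper's model: it assumes the listener's head orientation (e.g., supplied by vision) is simply added to the azimuth implied by cues like the ITD.

```python
def world_azimuth(head_relative_deg, head_orientation_deg):
    """Combine a head-relative direction estimate (from cues like the ITD)
    with the listener's head orientation to recover the source's
    world-centered azimuth. A purely illustrative sketch."""
    return (head_relative_deg + head_orientation_deg) % 360

# A source fixed at 40 degrees in the world; the listener rotates 25 degrees.
# The ITD now corresponds to only 15 degrees relative to the head...
head_relative = 40 - 25
# ...which, taken alone, would wrongly suggest the source had moved.
# Adding head orientation restores the correct, unmoved world position:
print(world_azimuth(head_relative, 25))  # -> 40
```

Without the head-orientation term, the listener's own rotation would be misread as motion of the source, which is exactly the confusion described above.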