Akio Honda - honda@yamanash-eiwa.ac.jp
Yamanashi-Eiwa College
888 Yokone-machi, Kofu
Yamanashi, Japan 400-8555

Popular version of paper 4aPP4, “Effects of listener’s whole-body rotation and sound duration on sound localization accuracy”
Presented Thursday morning, December 7, 2017, 9:45-10:00 AM, Studio 4
174th ASA Meeting, New Orleans

Sound localization is an important ability that helps keep daily life safe and rich. Head and body movements are known to facilitate sound localization because they create dynamic changes in the acoustic information arriving at each ear [1–4]. However, earlier reports have described that sound localization accuracy deteriorates during a listener’s head rotation [5–7]. Moreover, the facilitative effects of a listener’s movement differ depending on the features of the sound [3–4]. Therefore, the interaction between a listener’s movement and sound features remains unclear. In this study, we used a digitally controlled spinning chair to assess the effects of a listener’s whole-body rotation and sound duration on horizontal sound localization accuracy.

The listeners were 12 adults with normal hearing. Stimuli were 1/3-octave band noise bursts (center frequency = 1 kHz, SPL = 65 dB) with durations of 50, 200, and 1000 ms. Each stimulus was presented from one loudspeaker of a circular array (1.2 m radius) containing 25 loudspeakers separated by 2.5 deg. Listeners could not see the loudspeakers because an acoustically transparent curtain was placed between the listener and the array, with the space inside the curtain kept brighter than the space outside. Response numbers were assigned to azimuth angles at 1.25 deg intervals: number 0 corresponded to 31.25 deg to the left, number 25 to straight ahead, and number 50 to 31.25 deg to the right. These numbers were displayed on the curtain to facilitate responses. Listeners, seated on the spinning chair at the center of the circle, were asked to report the number corresponding to the perceived position of the presented stimulus (see Fig. 1).
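To make the geometry concrete, here is a minimal Python sketch (not the authors’ code; the function and variable names are hypothetical) of how a reported number maps onto an azimuth angle and how the 25 loudspeaker positions are laid out.

    # Hypothetical helpers illustrating the setup described above (not the authors' code).
    def number_to_azimuth(number):
        # Reported numbers run from 0 to 50 in 1.25-deg steps:
        # 0 -> -31.25 deg (left), 25 -> 0 deg (front), 50 -> +31.25 deg (right).
        return (number - 25) * 1.25

    # 25 loudspeakers at 2.5-deg spacing span -30 deg to +30 deg around the front.
    speaker_azimuths = [(i - 12) * 2.5 for i in range(25)]

For example, a report of “27” would correspond to an azimuth of +2.5 deg, i.e., slightly to the listener’s right.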

In the chair-still condition, the listener faced the front of the array (0 deg) and the stimulus was presented from one loudspeaker of the circular array. In the chair-rotation condition, the listener initially faced 15 deg to the left or to the right. The chair then rotated 30 deg, clockwise when starting from the left and counterclockwise when starting from the right. During this rotation, at the moment the listener faced the front (0 deg), the stimulus was presented from one of the loudspeakers in the circular array, as sketched below.
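The exact rotation speed is not the point here, but as a rough illustration of the timing, the following sketch assumes a constant (hypothetical) angular velocity and shows when the stimulus would be triggered relative to the start of the 30 deg rotation.

    # Hypothetical timing sketch; the rotation speed is an assumed value,
    # not a parameter reported in the paper.
    ROTATION_SPEED = 20.0   # deg/s, assumed constant angular velocity
    START_OFFSET = 15.0     # chair starts 15 deg to the left or right of front
    TOTAL_ROTATION = 30.0   # total rotation per trial

    trigger_time = START_OFFSET / ROTATION_SPEED   # moment the chair passes 0 deg (front)
    total_time = TOTAL_ROTATION / ROTATION_SPEED
    print(f"stimulus onset at {trigger_time:.2f} s of a {total_time:.2f} s rotation")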

We analyzed the angular errors in the horizontal plane. The angular error was calculated as the difference between the perceived (reported) position and the physical target position. Figure 2 depicts the mean horizontal sound localization performance.
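As a simple illustration of this step, the sketch below (not the authors’ analysis code; the names are hypothetical) converts a reported number to a perceived azimuth and computes the signed angular error relative to the physical target.

    # Hypothetical error computation for one trial (not the authors' code).
    def angular_error(reported_number, target_azimuth):
        # Convert the reported number (0-50, 1.25-deg grid) to degrees,
        # then subtract the physical target azimuth.
        perceived = (reported_number - 25) * 1.25
        return perceived - target_azimuth

    # Example: target loudspeaker at +5.0 deg, report of 27 (-> +2.5 deg)
    # gives a signed error of -2.5 deg (absolute error 2.5 deg).
    print(angular_error(27, 5.0))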

Our results demonstrated that sound localization accuracy was better in the chair-rotation condition than in the chair-still condition. A significant effect of sound duration was also observed: accuracy for the 200 ms stimuli was the poorest among the durations used. However, the interaction between test condition and sound duration was not significant.

These findings suggest that sound localization performance might improve when listeners can obtain dynamic auditory information from their own movement. Furthermore, the duration of the target sound was not crucially important for sound localization accuracy. Of course, other explanations are possible. For instance, listeners might be better able to localize shorter sounds (50 ms or less), whereas an intermediate duration such as 200 ms might not provide effective dynamic information to facilitate sound localization. Irrespective of the interpretation, our results provide valuable suggestions for future studies aimed at elucidating the interaction between a listener’s movement and sound duration.


Fig. 1 Outline of the loudspeaker array system.

Fig. 2 Results of angular error in the horizontal plane.

References:

1. H. Wallach, “On sound localization,” J. Acoust. Soc. Am., 10, 270–274 (1939).
2. A. Honda, H. Shibata, S. Hidaka, J. Gyoba, Y. Iwaya, and Y. Suzuki, “Effects of head movement and proprioceptive feedback in training of sound localization,” i-Perception, 4, 253–264 (2013).
3. Y. Iwaya, Y. Suzuki, and D. Kimura, “Effects of head movement on front-back error in sound localization,” Acoust. Sci. Technol., 24, 322–324 (2003).
4. S. Perrett and W. Noble, “The contribution of head motion cues to localization of low-pass noise,” Percept. Psychophys., 59, 1018–1026 (1997).
5. J. Cooper, S. Carlile, and D. Alais, “Distortions of auditory space during rapid head turns,” Exp. Brain Res., 191, 209–219 (2008).
6. J. Leung, D. Alais, and S. Carlile, “Compression of auditory space during rapid head turns,” Proc. Natl. Acad. Sci. U.S.A., 105, 6492–6497 (2008).
7. A. Honda, K. Ohba, Y. Iwaya, and Y. Suzuki, “Detection of sound image movement during horizontal head rotation,” i-Perception, 7, 2041669516669614 (2016).