Leslie Kay - kaysonic@voyager.co.nz
Spatial Sensing Laboratory
Bay Advanced Technologies Ltd.
PO Box 124,
Russell, New Zealand
Popular version of paper 2pABa1
Presented Tuesday Afternoon, December 5, 2000
ASA/NOISE-CON 2000 Meeting, Newport Beach, CA
Vision substitution has long been sought after as a means of rehabilitating blind persons more effectively, and of enabling young blind children to develop more naturally, like sighted children. Animals such as the bat have developed sonar methods for very effectively finding their way in the dark and catching food. The same is true of dolphins, who use their sonar system underwater. These skills have often been cited as good reason to think that an "air-sonar" device might greatly help the blind, but for many years no one thought of developing an air-sonar as a "vision substitute." The development of sonic eyeglasses in the early 1970s changed our thinking.
Over the next 20 years a sensor system was developed to further improve users' ability to perceive their surroundings. In the meantime, the sonic eyeglasses became widely used. The stage has now been reached where blind persons can walk about like sighted persons in a busy shopping area, going in and out of shops, and can recognize their location relative to the many landmarks along the way. We call the new process "sonocular perception" - seeing with sound. Blind persons appear to look at where they are going, and they can focus their attention on specific objects, much as sighted persons fixate on objects in order to recognize them better.
An example of the sonic eyeglasses is shown in figure 1. These were specially developed for a blind child. There are three sensing elements: one radiates ultrasonic waves, and the other two act as receivers. Figure 2 shows the later sensor (known as KASPA) fitted in a headband; it has, in addition, a central field of view modeling the function of the eye's fovea, the retinal area that produces focused vision and contains the photoreceptors enabling color vision. The sensor's spatial resolution is six times greater than that of the eyeglasses.
The vision substitution method transmits ultrasonic waves into a wide field of view, and each object in this field returns a multiplicity of echoes. An object like a bush will produce tiny echoes from its leaves and branches, which are then received by the sonar sensing elements. Each of these tiny echoes is converted into a tone whose pitch (or frequency) represents the distance to the leaf or branch that produced it.
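The paper does not describe the electronics behind this conversion, but one common way to obtain a pitch that grows with distance in sonar aids of this kind is a continuously swept transmission: the echo arrives delayed by the round trip, and mixing it with the current transmission yields an audible difference frequency proportional to range. The short Python sketch below illustrates only that proportionality; the sweep slope and the function name range_to_pitch are assumptions for illustration, not the actual parameters of the eyeglasses or the KASPA sensor.

```python
SPEED_OF_SOUND = 343.0           # metres per second in air
SWEEP_SLOPE_HZ_PER_S = 800_000   # assumed sweep rate of the transmitted tone

def range_to_pitch(range_m: float) -> float:
    """Audible tone frequency (Hz) for an echo from an object at range_m metres,
    assuming a linearly swept transmission mixed with its own delayed echo."""
    round_trip_delay = 2.0 * range_m / SPEED_OF_SOUND   # seconds
    return SWEEP_SLOPE_HZ_PER_S * round_trip_delay

# A leaf at 1.2 m would be heard as a higher tone than a branch at 0.5 m.
print(range_to_pitch(1.2), range_to_pitch(0.5))
```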
We call these multiple tones a 'tone complex', and the sound a user hears through miniature earphones is called a 'sound signature'. With the sensor mounted on the forehead and moved about in a looking action, a blind user senses, in stereophonic form, the multiple-object space that makes up the environment in which movement takes place.
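To make the idea of a 'tone complex' concrete, the Python sketch below builds a stereo sound signature from a handful of echo points. Everything quantitative here is an assumption for illustration (the pitch scale, the simple level panning between the two earphone channels, and the function and parameter names); the paper states only that pitch encodes distance and that the presentation is stereophonic.

```python
import numpy as np

SAMPLE_RATE = 44_100        # audio samples per second
PITCH_PER_METRE = 4_000.0   # assumed Hz of tone pitch per metre of range

def sound_signature(echoes, duration=0.1):
    """echoes: list of (range_m, azimuth) pairs, azimuth in -1 (left) .. +1 (right).
    Returns an (n_samples, 2) stereo array: one tone per echo, with pitch encoding
    range and a left/right level difference encoding direction."""
    t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
    left = np.zeros_like(t)
    right = np.zeros_like(t)
    for range_m, azimuth in echoes:
        tone = np.sin(2 * np.pi * PITCH_PER_METRE * range_m * t)
        left += (1.0 - azimuth) / 2.0 * tone    # louder in the left ear when azimuth < 0
        right += (1.0 + azimuth) / 2.0 * tone   # louder in the right ear when azimuth > 0
    stereo = np.stack([left, right], axis=1)
    return stereo / max(1.0, np.abs(stereo).max())   # keep the summed tone complex in [-1, 1]

# A bush slightly to the left at 1.5 m and a pole to the right at 0.8 m.
signature = sound_signature([(1.5, -0.3), (0.8, +0.6)])
```

Moving the assumed sensor "view" (that is, changing the ranges and azimuths fed to this function) changes every tone in the complex at once, which is the real-time flow of sound described next.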
The remarkable auditory experience one gets when first trying the vision substitute is the real-time change in sound as the head moves, caused by the changing view of the objects. The same thing happens in normal sight, but we pay no attention to the constant change. Indeed, the brain converts the optical images at the eye, as they change with viewpoint, into invariant objects seen from different angles. An experienced blind user of sonocular perception finds that the brain does something similar with the sound signatures: each object is located and recognized as a separate entity, though of course not with the clarity of optical vision.
Only those blind users who have developed the ability over time come to say that this is their experience. Sighted persons have not had a need to spend the time learning to see with sound. Those blind persons who have done so have developed some quite remarkable skills. For example, some blind persons have learned to cycle in slalom fashion between a row of poles spaced 2 meters apart, just as in snow skiing. A totally blind young man using a softball bat has shown that he can hit, with a good whack, a softball thrown at him. These feats were thought to be "not possible," especially hitting a ball, because the bio-acoustic vision substitution eyeglasses must provide information about the trajectory of the ball and the brain must interpret it in real time if contact is to be made at the appropriate moment. The bio-acoustic mechanism is not yet understood. We do know, however, that the acoustic flow generated by the echo from the ball is rich in spatial information.