Robots That Listen:
Mimicking Human Hearing to Help Robots Find Snipers
Socrates Deligeorges - sgd@bu.edu
Aleks Zosuls - azosuls@bu.edu
David Mountain - dcm@bu.edu
David Anderson - da@bu.edu
Allyn Hubbard - aeh@bu.edu
Department of Electrical and Computer Engineering,
Department of Biomedical Engineering,
and Boston University Hearing Research Center,
44 Cummington St.
Boston, MA 02215.
Popular version of paper 2aSP1
Presented Tuesday morning, June 6, 2006
151st ASA Meeting, Providence, RI
Robots may become a soldier's best friend when dealing with snipers
and other hazardous situations in urban environments. Small, portable
robots equipped with advanced sensors are being used for reconnaissance
and surveillance missions to detect and locate hidden gunmen before troops
expose themselves to hostile gunfire. A sniper's location can be revealed
once a weapon has been fired, if the robot is equipped with an acoustic
direction finding (ADF) capability and can quickly point its optical sensors
at the sound source. Accurately localizing and identifying sound
sources in urban environments is especially difficult, however, due to
background noise and reverberation.
Many of the problems associated with working in complex acoustic environments
have now been overcome using a new acoustic signal-processing approach
based on models of human hearing. This biomimetic (copying biology)
approach replicates the signal processing that takes place in the human
inner ear and brainstem. The system uses the same frequency and timing cues
that the auditory system exploits: time delays between microphones,
level differences between microphones, and frequency profiles.
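To give a feel for these cues, here is a minimal sketch (in Python, not the
authors' actual ADF code) of how a time delay and a level difference could be
estimated from a pair of microphone signals and turned into a bearing. The
microphone spacing, sample rate, and function names are illustrative
assumptions, not values from the system described here.

    # A minimal sketch (not the authors' implementation) of two of the cues
    # the article describes: the time delay and the level difference between
    # a pair of microphones.  SAMPLE_RATE_HZ and MIC_SPACING_M are assumed
    # values for illustration only.
    import numpy as np

    SAMPLE_RATE_HZ = 48000       # assumed sampling rate
    MIC_SPACING_M = 0.15         # assumed distance between the two microphones
    SPEED_OF_SOUND_M_S = 343.0

    def time_delay_and_level_difference(left, right):
        """Return (delay_seconds, level_difference_db) for two mic signals."""
        # Cross-correlate the channels; the lag of the correlation peak is
        # the difference in time of arrival between the microphones.
        corr = np.correlate(left, right, mode="full")
        lag_samples = np.argmax(corr) - (len(right) - 1)
        delay_s = lag_samples / SAMPLE_RATE_HZ

        # Level difference: ratio of RMS energies, expressed in decibels.
        rms_left = np.sqrt(np.mean(left ** 2))
        rms_right = np.sqrt(np.mean(right ** 2))
        level_db = 20.0 * np.log10(rms_left / rms_right)
        return delay_s, level_db

    def bearing_from_delay(delay_s):
        """Convert a time delay into a bearing angle (degrees off broadside)."""
        # Far-field assumption: sin(theta) = c * delay / mic spacing.
        sin_theta = np.clip(SPEED_OF_SOUND_M_S * delay_s / MIC_SPACING_M,
                            -1.0, 1.0)
        return np.degrees(np.arcsin(sin_theta))

A real system would combine such cues across many frequency channels and more
than two microphones, but the sketch shows the basic geometry that links a
measured delay to a direction.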
Photo: The black box on top of the robot head is the acoustic direction finding unit.
The initial system of direction-finding algorithms was designed and
tested using the EarLab simulation software (http://earlab.bu.edu).
The ADF electronics were designed and built in Boston University's VLSI
and Neural Net Systems Laboratory (http://www.bu.edu/vnns)
and mounted on an iRobot Packbot. The system was programmed to
detect and localize gunfire and orient the robot's camera towards the shooter.
Positioning errors in outdoor field tests with live fire were less than
2 degrees. The robot also performed well during indoor testing in
highly reverberant spaces such as the atrium of the Boston University Photonics
Center using simulated gunfire.
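As a rough illustration of the detection step, the sketch below flags a frame
as a possible gunshot when its short-term energy jumps well above the recent
background level. The frame length and threshold are arbitrary assumptions
for illustration, not parameters of the system described above.

    # A rough illustration (not the authors' algorithm) of flagging a
    # possible gunshot: compare each short frame's energy to a running
    # estimate of the background level.  FRAME_LEN and THRESHOLD_DB are
    # arbitrary assumptions.
    import numpy as np

    FRAME_LEN = 512        # samples per analysis frame (assumed)
    THRESHOLD_DB = 20.0    # rise above background needed to trigger (assumed)

    def detect_onsets(signal):
        """Return sample indices of frames whose energy jumps above background."""
        onsets = []
        background_db = None
        for i in range(0, len(signal) - FRAME_LEN, FRAME_LEN):
            frame = signal[i:i + FRAME_LEN]
            frame_db = 10.0 * np.log10(np.mean(frame ** 2) + 1e-12)
            if background_db is None:
                background_db = frame_db
            if frame_db > background_db + THRESHOLD_DB:
                onsets.append(i)   # impulsive event: candidate gunshot
            else:
                # Slowly track the background level when nothing loud happens.
                background_db = 0.95 * background_db + 0.05 * frame_db
        return onsets

In practice a detector would also use the spectral shape of the impulse to
distinguish gunfire from other loud transients, which is where the frequency
profiles mentioned earlier come in.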
Photo: The robot patrolling an urban environment.
The performance of the prototype platform demonstrates the potential
of the biomimetic approach and its applicability to practical acoustic-processing
problems in commercial, civilian, and military settings.