2pAB4 – Towards understanding how dolphins use sound to understand their environment
YeonJoon Cheong – yjcheong@umich.edu
K. Alex Shorter – kshorter@umich.edu
Bogdan-Ioan Popa – bipopa@umich.edu
University of Michigan, Ann Arbor
2350 Hayward St
Ann Arbor, MI 48109-2125
Popular version of 2pAB4 – Acoustic scene modeling for echolocation in bottlenose dolphin
Presented Tuesday Morning, November 30, 2021
181st ASA Meeting
Click here to read the abstract
Dolphins are excellent at using ultrasound to explore their surroundings and find hidden objects. In a process called echolocation, dolphins project outgoing ultrasound pulses called clicks and listen for the echoes returning from distant objects, which they convert into a model of their surroundings. Despite significant research on echolocation, how dolphins process echoes to find objects in cluttered environments, and how they adapt their search strategy based on the received echoes, are still open questions.
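To give a sense of the physics, the short sketch below (not code from the study) shows how the round-trip delay of an echo encodes target range. The 1500 m/s sound speed is a typical value for seawater, and the 20 ms delay is purely illustrative.

```python
# Minimal sketch: the round-trip delay between an outgoing click and its
# returning echo encodes the distance to the reflecting object.

SPEED_OF_SOUND = 1500.0  # m/s, a typical value for seawater (assumption)

def range_from_echo_delay(delay_s: float) -> float:
    """Distance to a reflector given the round-trip echo delay, in meters."""
    return SPEED_OF_SOUND * delay_s / 2.0  # the sound travels out and back

# An echo arriving 20 ms after the click implies a target about 15 m away.
print(range_from_echo_delay(0.020))  # -> 15.0
```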
Here we developed a framework that combines experimental measurements with physics-based models of the acoustic source and environment to provide new insight into echolocation. We conducted echolocation experiments at Dolphin Quest Oahu, Hawaii, in two stages. In the first stage, a dolphin was trained to search for a designated target using both vision and sound. In the second stage, the dolphin was asked to find the designated target, placed randomly in the environment among distractor objects, while “blindfolded” with suction cups (Fig. 1). After each trial, the dolphin was rewarded with a fish if it selected the correct target.
Target discrimination tasks have been used by many research groups to investigate echolocation, and interesting behavior has been observed during these tasks. For example, animals sometimes swim from object to object, carefully inspecting each one before making a decision. Other times they swim straight to the target without hesitation. These types of behavior are often characterized using measurements of animal acoustics and movement, but how clutter in the environment changes the difficulty of the discrimination task, and how much information the animals gather about the acoustic scene before selecting a target, are not fully understood.
Our approach assumes that the dolphins memorize target echoes from different locations in the environment during training. We hypothesize that in a cluttered environment the dolphin selects the object that best matches the learned target echo signature, even if it is not an exact match. Our framework computes a “likelihood parameter” that quantifies how well a received echo matches the learned echo. This parameter was used to build a map of the most likely target locations in the acoustic scene.
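The paper does not spell out the exact formula behind the likelihood parameter, but one minimal stand-in is the peak of the normalized cross-correlation between a received echo and the learned target template, evaluated at every scanned position. The sketch below assumes that reading; the function names and echo arrays are hypothetical.

```python
import numpy as np

def likelihood(received: np.ndarray, template: np.ndarray) -> float:
    """Peak normalized cross-correlation between a received echo and the
    learned target template; values near 1 indicate a close match."""
    received = received / (np.linalg.norm(received) + 1e-12)
    template = template / (np.linalg.norm(template) + 1e-12)
    return float(np.max(np.abs(np.correlate(received, template, mode="full"))))

def likelihood_map(echoes_by_position: dict, template: np.ndarray) -> dict:
    """Score the echo recorded at each scanned position against the template,
    producing a map of the most likely target locations."""
    return {pos: likelihood(echo, template)
            for pos, echo in echoes_by_position.items()}
```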
During the experiments, the dolphin swam to and investigated the positions in the environment with the highest predicted target likelihood, as estimated by our approach. When a cluttered scene produced several objects with high likelihood values, the animal moved towards and scanned those areas to collect information before making a decision. In other scenarios, the computed likelihood parameter was large at only one position, which explained why the animal swam to that position without hesitation. These results suggest that dolphins might build a similar “likelihood map” as they gather information before selecting a target.
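Continuing the hypothetical sketch above, thresholding the likelihood map separates the two behaviors described here: one strong candidate predicts a direct approach, while several comparable candidates predict extra scanning. The synthetic echoes and the 0.8 threshold are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
template = rng.standard_normal(128)  # learned target echo (synthetic)

# Synthetic scene: position "A" holds the target; "B" and "C" are clutter.
echoes_by_position = {
    "A": template + 0.1 * rng.standard_normal(128),
    "B": rng.standard_normal(128),
    "C": rng.standard_normal(128),
}

scores = likelihood_map(echoes_by_position, template)
candidates = [pos for pos, s in scores.items() if s > 0.8]  # threshold is illustrative

# One candidate -> the model predicts a direct, unhesitating approach;
# several candidates -> it predicts scanning each area before the decision.
print(scores, candidates)  # "A" scores near 1; the clutter falls well below
```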
The proposed approach provides important new insight into the acoustic scene formed by echolocating dolphins, and into how the animals use this evolving information to classify and locate targets. Our framework will lead to a more complete understanding of the complex perceptual process used by echolocating animals.