The Evolution of Bat Robots: A Spooky Tale of Echolocation

As Halloween approaches, it’s the perfect time to dive into the mysterious world of bat robots in this Acoustics Today article, “The Evolution of Bat Robots.” The ability of bats to navigate their environment using ultrasound has fascinated scientists for decades, and the mystery of how they process this information has drawn researchers from various fields. It’s no wonder that engineers have been lured into this world, attempting to replicate the biosonar capabilities of bats through a variety of “bat robots.”

Despite decades of research, the intricacies of bat biosonar remain mostly uncharted. Continuous advancements in recording and data analytics technologies promise to unlock more insights into the world of bat robots. These insights will likely drive further evolution in the field. Researchers are on the cusp of developing more integrated systems that combine encoding and extraction of sensory information. These mechanical marvels, inspired by the eerie elegance of bats, may hold the key to autonomous drones capable of navigating the dark forests, just like their natural counterparts.

The Acoustics Today article weaves together a captivating story of technological evolution, highlighting the challenges, breakthroughs, and intriguing possibilities that lie ahead. If you’re curious about how bats’ extraordinary biosonar abilities are inspiring cutting-edge drones and robotic systems, read the full article for free at AcousticsToday.org. It’s a journey that promises to leave you in awe of both nature and human ingenuity. Happy Halloween!

Acoustics Today Winter 2020 cover featuring bat robots

New Across Acoustics Episode: Do Shrews Echolocate?

We know that dolphins and bats echolocate, but less is known about the ultrasonic vocalizations of other animals, including the northern short-tailed shrew. Some suggest these shrews don’t make ultrasonic vocalizations at all but instead simply produce noise as they move. In this episode, Valerie Eddington and Laura Kloepper (both currently at the University of New Hampshire and previously at St. Mary’s College) discuss their research into the sounds made by these creatures.

(Like the episode? Don’t miss the article in JASA!)

4aAB4 – Analysis of bats’ gaze and flight control based on the estimation of their echolocated points with time-domain acoustic simulation

Taito Banda – dmq1001@mail4.doshisha.ac.jp
Miwa Sumiya – miwa1804@gmail.com
Yuya Yamamoto – dmq1050@mail4.doshisha.ac.jp
Yasufumi Yamada – yasufumi.yamada@gmail.com
Faculty of Life and Medical Sciences, Doshisha University, Kyotanabe, Kyoto, Japan

Yoshiki Nagatani – nagatani@ultrasonics.jp
Department of Electronics, Kobe City College of Technology, Kobe, Japan.

Hiroshi Araki – Araki.Hiroshi@ak.MitsubishiElectric.co.jp
Advanced Technology R&D Center, Mitsubishi Electric Corporation, Amagasaki, Japan

Kohta I. Kobayasi – kkobayas@mail.doshisha.ac.jp
Shizuko Hiryu – shiryu@mail.doshisha.ac.jp
Faculty of Life and Medical Sciences, Doshisha University, Kyotanabe, Kyoto, Japan

Popular version of paper 4aAB4 “Analysis of bats’ gaze and flight control based on the estimation of their echolocated points with time-domain acoustic simulation.”
Presented Thursday morning, December 7, 2017, 8:45-9:00 AM, Salon F/G/H
174th ASA Meeting, New Orleans

Bats broadcast ultrasound and listen to the returning echoes to gather information about their surroundings, a process called echolocation. By analyzing these echoes, for example their arrival times, bats can determine the position, shape, and texture of objects [1-3]. Whereas people rely mainly on vision, bats use sound to sense the world. How does the world perceived through sound differ from the one perceived through sight? Because the two senses are so different, it is hard for us to imagine how bats “see” the world.
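
As a rough illustration of the most basic of these acoustic cues (our sketch, not taken from the paper), the delay between an emitted pulse and its returning echo maps directly to target range. A minimal example in Python, assuming sound travels at roughly 343 m/s in air:

```python
# Minimal sketch (not from the paper): converting an echo delay to target range.
# Assumes a speed of sound of ~343 m/s in air; the sound travels out and back,
# so the one-way range is half the round-trip distance.

SPEED_OF_SOUND_M_PER_S = 343.0

def range_from_echo_delay(delay_s: float) -> float:
    """Return the one-way distance (m) to a target given the echo delay (s)."""
    return SPEED_OF_SOUND_M_PER_S * delay_s / 2.0

if __name__ == "__main__":
    # An echo arriving 10 ms after emission corresponds to a target ~1.7 m away.
    print(f"{range_from_echo_delay(0.010):.2f} m")
```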

To address this question, we simulated the echoes arriving at bats during obstacle-avoidance flight, based on measured behavioral data, so that we could investigate how the surrounding objects were represented acoustically.

First, we arranged a microphone array (24 microphones) and two high-speed cameras in an experimental flight chamber (Figure 1) [4]. The timing, positions, and directions of the emitted ultrasound, as well as the flight paths, were measured. A small telemetry microphone was attached to the back of the bat so that the intensity of the emitted ultrasound could be recorded accurately [5]. The bat was forced to follow an S-shaped flight pattern to avoid obstacle acrylic boards.

Based on these behavioral data, we simulated the propagation of each emitted sound with its measured strength and direction from the bat’s position in the experiment, and obtained the echoes reaching the left and right ears from the obstacles. Using the interaural time difference of the echoes, we could acoustically identify the echolocated points in space for every emission (square plots in Figure 2). We also investigated how the spatial and temporal distribution of the echolocated points changed as the bats became familiar with the space (top and bottom panels).
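
To illustrate the geometry behind this step (a simplified sketch, not the authors’ time-domain simulation code), the interaural time difference can be converted into an angle of arrival under a far-field approximation and then combined with the echo delay to place the echolocated point in space. The 14 mm ear separation below is a hypothetical value chosen purely for illustration:

```python
import math

# Simplified sketch (not the authors' simulation): estimating the direction and
# position of an echolocated point from the echo delay and the interaural time
# difference (ITD), using a far-field approximation.

SPEED_OF_SOUND = 343.0   # m/s in air
EAR_SEPARATION = 0.014   # m; hypothetical ear spacing, for illustration only

def bearing_from_itd(itd_s: float) -> float:
    """Angle of arrival (radians) from the midline, far-field approximation."""
    # ITD = d * sin(theta) / c  =>  theta = asin(ITD * c / d)
    s = max(-1.0, min(1.0, itd_s * SPEED_OF_SOUND / EAR_SEPARATION))
    return math.asin(s)

def echolocated_point(echo_delay_s: float, itd_s: float) -> tuple[float, float]:
    """(x, y) position of the reflecting point relative to the bat, facing +y."""
    r = SPEED_OF_SOUND * echo_delay_s / 2.0   # one-way range from the echo delay
    theta = bearing_from_itd(itd_s)
    return r * math.sin(theta), r * math.cos(theta)

if __name__ == "__main__":
    # An echo returning after 6 ms with the right ear leading by 20 microseconds.
    print(echolocated_point(0.006, 20e-6))
```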

Using this acoustic simulation, we analyzed changes in the echolocated points, which indicate which parts of the objects the bats intended to “gaze” at. Comparing flights before and after habituation to the same obstacle layout, we found differences in how widely the echolocated points were spread over the objects. After flying the same layout repeatedly, false detections of objects decreased and the bats’ echolocating fields became narrower.

It is natural for animals to direct their attention toward objects appropriately and to adapt flight and sensing control cooperatively as they become familiar with a space. These findings suggest that our approach, acoustic simulation based on behavioral experiments, is an effective way to visualize how groups of objects are acoustically structured and represented in space for bats echolocating in flight. We believe it may offer a clue to the question, “What is it like to see as a bat?”

Figure 1. Diagram of the bat flight experiment. Blue and red circles indicate microphones on the walls and on the acrylic boards, respectively. Two high-speed video cameras are mounted at two corners of the room. Three acrylic boards are arranged to make the bats follow an S-shaped flight path around the obstacles.

Figure 2. Comparison of echolocated points before and after habituation to the space. The measured positions where the bat emitted sounds are shown as circles, and the calculated echolocated points are shown as squares. The color variation from blue to red corresponds to the temporal sequence of the flight. The sizes of the circles and squares correspond to the strength of the emissions and of the echoes returning from the obstacles, respectively.

References:
[1] Griffin, D. R., Listening in the Dark, Yale University Press, New Haven, CT, 1958

[2] Simmons, J. A., Echolocation in bats: signal processing of echoes for target range, Science, vol. 171, pp. 925-928, 1971

[3] Kick, S. A., Target-detection by the echolocating bat, Eptesicus fuscus, J Comp Physiol A, vol. 145, pp. 431-435, 1982

[4] Matsuta, N., Hiryu, S., Fujioka, E., Yamada, Y., Riquimaroux, H., Watanabe, Y., Adaptive beam-width control of echolocation sounds by CF-FM bats, Rhinolophus ferrumequinum nippon, during prey-capture flight, J Exp Biol, vol. 206, pp. 1210-1218, 2013

[5] Hiryu, S., Shiori, Y., Hosokawa, T., Riquimaroux, H., Watanabe, Y., On-board telemetry of emitted sounds from free-flying bats: compensation for velocity and distance stabilizes echo frequency and amplitude, J Comp Physiol A, vol. 194, pp. 841-851, 2008

3aUW8 – A view askew: Bottlenose dolphins improve echolocation precision by aiming their sonar beam to graze the target

Laura N. Kloepper – lkloepper@saintmarys.edu
Saint Mary’s College
Notre Dame, IN 46556

Yang Liu – yang.liu@umassd.edu
John R. Buck – jbuck@umassd.edu
University of Massachusetts Dartmouth
285 Old Westport Road
Dartmouth, MA 02747

Paul E. Nachtigall – nachtiga@hawaii.edu
University of Hawaii at Manoa
PO Box 1346
Kaneohe, HI 96744

Popular version of paper 3aUW8, “Bottlenose dolphins direct sonar clicks off-axis of targets to maximize Fisher Information about target bearing”
Presented Wednesday morning, November 4, 2015, 10:25 AM in River Terrace 2
170th ASA Meeting, Jacksonville

Bottlenose dolphins are incredible echolocators. Using just sound, they can detect a ping-pong ball sized object from 100 m away, and discriminate between objects differing in thickness by less than 1 mm. Based on what we know about man-made sonar, however, the dolphins’ sonar abilities are an enigma–simply put, they shouldn’t be as good at echolocation as they actually are.

Typical man-made sonar devices achieve high levels of performance by using very narrow sonar beams. Creating narrow beams requires large and costly equipment. In contrast to these man-made sonars, bottlenose dolphins achieve the same levels of performance with a sonar beam that is many times wider – but how? Understanding their “sonar secret” can help lead to more sophisticated synthetic sonar devices.

Bottlenose dolphins’ echolocation signals contain a wide range of frequencies. The higher frequencies propagate away from the dolphin in a narrower beam than the low frequencies do, which means the dolphin’s emitted sonar beam is frequency-dependent. Objects directly in front of the animal echo back all of the frequencies. As we move out of the direct line in front of the animal, however, less and less high-frequency energy arrives, and when the target is far off to the side, only the lower frequencies reach it and bounce back. As shown below in Figure 1, an object 30 degrees off the sonar beam axis has lost most of the frequencies.


Figure 1. Beam pattern and normalized amplitude as a function of signal frequency and bearing angle. At 0 degrees, or on-axis, the beam contains an equal representation across all frequencies. As the bearing angle deviates from 0, however, the higher frequency components fall off rapidly.
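
To make this frequency dependence concrete, here is a rough sketch using a circular-piston beam model, a common textbook simplification for sonar beams. The 1 cm aperture radius and the exact numbers are illustrative assumptions, not measurements from this study:

```python
import numpy as np
from scipy.special import j1  # Bessel function of the first kind, order 1

# Rough sketch (not the study's measured beam): a circular-piston model of a
# frequency-dependent sonar beam. Higher frequencies (larger k = 2*pi*f/c)
# produce a narrower beam for the same aperture, so off-axis angles receive
# proportionally less high-frequency energy.

SPEED_OF_SOUND = 1500.0   # m/s in seawater
APERTURE_RADIUS = 0.01    # m; hypothetical aperture radius, for illustration only

def piston_beam(freq_hz: float, angle_deg: float) -> float:
    """Normalized beam amplitude (0..1) at a given frequency and off-axis angle."""
    k = 2.0 * np.pi * freq_hz / SPEED_OF_SOUND
    x = k * APERTURE_RADIUS * np.sin(np.radians(angle_deg))
    return 1.0 if np.isclose(x, 0.0) else abs(2.0 * j1(x) / x)

if __name__ == "__main__":
    # At 30 degrees off axis, the higher frequencies have fallen off the most.
    for f in (40e3, 80e3, 120e3):
        print(f"{f/1e3:.0f} kHz, 30 deg off-axis: {piston_beam(f, 30.0):.2f}")
```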

Consider an analogy to light shining through a prism.  White light entering the prism contains every frequency, but the light leaving the prism at different angles contains different colors.  If we moved a mirror to different angles along the light beam, it would change the color reflected as it moved through different regions of the transmitted beam.  If we were very good, we could locate the mirror precisely in angle based on the color reflected.  If the color changes more rapidly with angle in one region of the beam, we would be most sensitive to small changes in position at that angle, since small changes in position would create large changes in color.  In mathematical terms, this region of maximum change would have the largest gradient of frequency content with respect to angle.  The dolphin sonar appears to be exploiting a similar principle, only the different colors are different frequencies or pitch in the sound.

Prior studies on bottlenose dolphins assumed the animal pointed its beam directly at the target, but this assumption led to the conclusion that the animals shouldn’t be as “good” at echolocation as they actually are. What if, instead, they use a different strategy? We hypothesized that dolphins might aim their sonar so that the main axis of the beam passes just to the side of the target, placing the region of maximum gradient on the target. Our model predicts that placing the region of the beam most sensitive to change on the target gives the dolphin the greatest precision in locating the object.

To test our hypothesis, we trained a bottlenose dolphin to detect the presence or absence of an aluminum cylinder while we recorded the echolocation signals with a 16-element hydrophone array (Fig.2).


Figure 2: Experimental setup. The dolphin detected the presence or absence of cylinders at different distances while we recorded sonar beam aim with a hydrophone array.

We then measured where the dolphin directed its sonar beam in relation to the target and found the dolphin pointed its sonar beam 7.05 ± 2.88 degrees (n=1930) away from the target (Fig.3).


Figure 3: Optimality in directing the beam away from the target axis. The numbers on the emitted beam represent the attenuation in decibels relative to the sound emitted by the dolphin. The high-frequency beam (red) is narrower than the low-frequency beam (blue) and attenuates more rapidly with angle. The dolphin directs its sonar beam 7 degrees away from the target.

To determine whether certain regions of the sonar beam provide more theoretical “information” to the dolphin, which would improve its echolocation, we applied information theory to the dolphin sonar beam. Using the weighted frequencies present in the signal, we calculated the Fisher Information for the emitted beam of a bottlenose dolphin. From our calculations, 95% of the maximum Fisher Information lies between 6.0 and 8.5 degrees off center, with a peak at 7.2 degrees (Fig. 4).


Figure 4: The calculated Fisher Information as a function of bearing angle. The information remains within 95% of its maximum between 6.0 and 8.5 degrees off center and peaks at 7.2 degrees.
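
For readers curious about the underlying idea, the sketch below uses a toy model (Gaussian beams whose widths shrink with frequency, with equal-variance Gaussian noise) to show why the Fisher Information about bearing peaks off-axis: it is driven by how quickly the received spectrum changes with angle. This is our illustration only; it does not reproduce the authors’ calculation or their 7.2-degree result:

```python
import numpy as np

# Toy sketch (not the authors' computation): Fisher Information about target
# bearing for a multi-frequency beam. Each frequency component is modeled as a
# Gaussian beam whose width shrinks with frequency; with additive Gaussian
# noise of equal variance per component, the Fisher Information is proportional
# to the sum of squared derivatives of the received amplitudes with respect to
# bearing, so it peaks where the beam changes fastest with angle (off-axis).

FREQS_KHZ = np.linspace(40.0, 120.0, 9)     # hypothetical frequency components
BEAM_WIDTH_DEG = 800.0 / FREQS_KHZ          # beam width ~ 1/frequency (toy model)

def amplitudes(bearing_deg: float) -> np.ndarray:
    """Received amplitude of each frequency component at a given bearing."""
    return np.exp(-0.5 * (bearing_deg / BEAM_WIDTH_DEG) ** 2)

def fisher_information(bearing_deg: float, d: float = 1e-3) -> float:
    """Sum of squared numerical derivatives d(amplitude)/d(bearing)."""
    grad = (amplitudes(bearing_deg + d) - amplitudes(bearing_deg - d)) / (2 * d)
    return float(np.sum(grad ** 2))

if __name__ == "__main__":
    bearings = np.arange(0.0, 20.1, 0.5)
    fi = [fisher_information(b) for b in bearings]
    print(f"Peak information at {bearings[int(np.argmax(fi))]:.1f} deg off-axis")
```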

The result? The dolphin is using a strategy that is mathematically optimal! By directing its sonar beam slightly askew of the target (such as a fish), the dolphin places the target in the region of the beam with the highest frequency gradient, allowing it to locate the target more precisely.

2pABa9 – Energetically speaking, do all sounds that a dolphin makes cost the same?

Marla M. Holt – marla.holt@noaa.gov
Dawn P. Noren – dawn.noren@noaa.gov
Conservation Biology Division
NOAA NMFS Northwest Fisheries Science Center
2725 Montlake Blvd East
Seattle WA, 98112

Robin C. Dunkin – rdunkin@ucsc.edu
Terrie M. Williams – tmwillia@ucsc.edu
Department of Ecology and Evolutionary Biology
University of California, Santa Cruz
100 Shaffer Road
Santa Cruz, CA 95060

Popular version of paper 2pABa9, “The metabolic costs of producing clicks and social sounds differ in bottlenose dolphins (Tursiops truncatus).”
Presented Tuesday afternoon, November 3, 2015, 3:15, City Terrace room
170th ASA Meeting Jacksonville

Dolphins are known to be quite vocal, producing a variety of sounds described as whistles, squawks, barks, quacks, pops, buzzes and clicks. These sounds can be tonal (think whistle) or broadband (think buzz), short or long, loud or soft. Some sounds, such as whistles, are used in social contexts for communication. Other sounds, such as clicks and buzzes, are used for echolocation, a form of active biosonar that is important for hunting fish [1]. Regardless of the type of sound a dolphin makes in its diverse vocal repertoire, its sounds are generated in an anatomically unique way compared to other mammals. Most mammals, including humans, make sound in their throats, or technically, in the larynx. In contrast, dolphins make sound in their nasal cavity via two sets of structures called the “phonic lips” [2].

All sound production comes at an energetic cost to the signaler [3]. That is, when an animal produces sound, its metabolic rate increases a certain amount above the baseline or resting (metabolic) rate. Additionally, many vociferous animals, including dolphins and other marine mammals, modify their acoustic signals in noise; they call louder, longer, or more often in an attempt to be heard above the background din. Ocean noise levels are rising, particularly in areas with heavy shipping traffic and other anthropogenic activities, and this motivated a series of recent studies to understand the metabolic costs of sound production and vocal modification in dolphins.

We recently measured the energetic cost of both social sound and click production in dolphins and determined whether these costs increased when the animals increased the loudness or other parameters of their sounds [4,5]. Two bottlenose dolphins were trained to rest and vocalize under a specialized dome that allowed us to measure their metabolic rates while they made different kinds of sounds and while they rested (Figure 1). The dolphins also wore an underwater microphone (a hydrophone embedded in a suction cup) on their foreheads to track vocal performance during trials. The amount of metabolic energy the dolphins used increased as the total acoustic energy of the vocal bout increased, regardless of the type of sound the dolphin made. The results clearly demonstrate that higher vocal effort results in higher energetic cost to the signaler.


Figure 1 – A dolphin participating in a trial to measure metabolic rates during sound production.  Trials were conducted in Dr. Terrie Williams’ Mammalian Physiology lab at the University of California Santa Cruz.  All procedures were approved by the UC Santa Cruz Institutional Animal Care and Use Committee and conducted under US National Marine Fisheries Service permit No.13602.

These recent results allow us to compare the metabolic costs of producing different sound types. However, the average total energy content of the sounds produced per trial differed depending on the dolphin subject and on whether the dolphins were producing social sounds or clicks. Since metabolic cost depends on vocal effort, comparisons of metabolic cost across sound types need to be made at equal acoustic energy output.

The relationship between energetic cost and vocal effort for social sounds allowed us to predict metabolic costs of producing these sounds at the same sound energy as in click trials.  The results, shown in Figure 2, demonstrate that bottlenose dolphins produce clicks at a very small fraction of the metabolic cost of producing whistles of equal energy.  These findings are consistent with empirical observations demonstrating that considerably higher air pressure within the dolphin nasal passage is required to generate whistles compared to clicks [1].  This pressurized air is what powers sound production in dolphins and toothed whales [1] and mechanistically explains the observed difference in metabolic cost between the different sound types.


Figure 2 – Metabolic costs of producing social sounds and clicks of equal energy content within a dolphin subject.
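
To illustrate the equal-energy comparison described above, here is a hypothetical sketch: the numbers and the simple linear form are made up for illustration, and the study’s actual data and fitted relationships are reported in references [4] and [5]. The idea is to fit the cost-versus-energy relationship for social sounds and evaluate it at the acoustic energy measured in the click trials:

```python
import numpy as np

# Hypothetical sketch of the equal-energy comparison described above. The
# values below are made up for illustration; the study's actual data and
# fitted relationships are reported in references [4] and [5].

# Acoustic energy of each social-sound trial (arbitrary units) and the
# metabolic cost measured above resting for that trial (arbitrary units).
social_energy = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
social_cost = np.array([0.8, 1.5, 2.4, 3.1, 3.9])

# Fit a simple linear relationship: cost = slope * energy + intercept.
slope, intercept = np.polyfit(social_energy, social_cost, 1)

# Predict what social-sound production would cost at the acoustic energy
# actually measured in the click trials, so the two sound types can be
# compared at equal energy output.
click_trial_energy = 0.5
predicted_social_cost = slope * click_trial_energy + intercept
print(f"Predicted social-sound cost at click-trial energy: {predicted_social_cost:.2f}")
```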

Differences in metabolic costs of whistling versus clicking have implications for understanding the biological consequences of behavioral responses to ocean noise.  Across different sound types, metabolic costs depend on vocal effort.  Yet, overall costs of producing clicks are substantially lower than costs of producing whistles.  The results reported in this paper demonstrate that the biological consequences of vocal responses to noise can be quite different depending on the behavioral context of the animals affected, as well as the extent of the response.

 

  1. Au, W. W. L., The Sonar of Dolphins, New York: Springer-Verlag, 1993.
  2. Cranford, T. W., et al., Observation and analysis of sonar signal generation in the bottlenose dolphin (Tursiops truncatus): evidence for two sonar sources. Journal of Experimental Marine Biology and Ecology, 2011. 407: p. 81-96.
  3. Ophir, A. G., Schrader, S. B. and Gillooly, J. F., Energetic cost of calling: general constraints and species-specific differences. Journal of Evolutionary Biology, 2010. 23: p. 1564-1569.
  4. Noren, D. P., Holt, M. M., Dunkin, R. C. and Williams, T. M. The metabolic cost of communicative sound production in bottlenose dolphins (Tursiops truncatus). Journal of Experimental Biology, 2013. 216: 1624-1629.
  5. Holt, M. M., Noren, D. P., Dunkin, R. C. and Williams, T. M. Vocal performance affects metabolic rate in dolphins: implication for animals communicating in noisy environments. Journal of Experimental Biology, 2015. 218: 1647-1654.