To help small aerial robots navigate in the dark and other low-visibility environments, my colleagues and I developed an ultrasound-based perception system inspired by bat echolocation.
Current robots rely heavily on cameras, on light detection and ranging (known as lidar), or on both. But these sensors fail in visually challenging conditions, such as smoke, fog, dust, snow or complete darkness.
I’m a scientific engineer who develops bio-inspired microrobots. To solve this challenge, my research team looked to nature’s experts at navigating in poor visibility: bats. Weighing as little as two paper clips, they thrive in dark, damp and dusty caves and can detect obstacles as thin as a human hair using echolocation: they emit sound waves and listen for the weak echoes reflected from objects.
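The core of echolocation is time-of-flight ranging: the delay between emitting a pulse and hearing its echo, times the speed of sound, gives twice the distance to the reflector. A minimal sketch of that calculation (the function name and example numbers are illustrative, not from our system):

```python
SPEED_OF_SOUND = 343.0  # meters per second in air at about 20 °C

def echo_distance(round_trip_s: float) -> float:
    """Distance to a reflector from a pulse's round-trip time.

    The sound travels out and back, so the one-way distance is half
    the total path covered during the round trip.
    """
    return SPEED_OF_SOUND * round_trip_s / 2

# An echo arriving 5.8 milliseconds after the pulse implies an
# obstacle roughly 1 meter away.
print(round(echo_distance(0.0058), 2))
```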
However, enabling this sensing on aerial robots is extremely challenging because propellers generate a lot of noise. It is a bit like trying to listen to your friend while a jet engine is taking off next to you.
To overcome this issue, we developed two key ideas. First, a physical acoustic shield inspired by a bat’s ear cartilage reduces propeller noise around the acoustic sensors, which act like the robot’s ears. Second, a neural network called Saranga, inspired by how bats process sound, recovers weak echo signals from very noisy measurements by learning patterns over time.
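The intuition behind exploiting patterns over time can be shown with a much simpler stand-in than the Saranga network: averaging repeated recordings. The echo appears at the same place in every ping, so it adds up, while independent noise partially cancels. This toy sketch is not our method, just the underlying signal-processing principle:

```python
import random

def average_pings(pings):
    """Average repeated echo recordings sample by sample.

    The echo is aligned across recordings and reinforces itself;
    independent zero-mean noise shrinks roughly as 1/sqrt(N).
    """
    n = len(pings)
    length = len(pings[0])
    return [sum(p[i] for p in pings) / n for i in range(length)]

random.seed(0)
clean = [0.0] * 100
clean[60] = 1.0  # a weak echo buried at sample 60

# Each recording is the echo plus strong noise (a stand-in for propeller noise).
pings = [[c + random.gauss(0, 1.0) for c in clean] for _ in range(200)]

avg = average_pings(pings)
print(max(range(100), key=lambda i: avg[i]))  # the echo peak stands out after averaging
```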
Together, these enable the robot to estimate obstacle locations in 3D and navigate safely using milliwatt-level sensing power.
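The article does not detail how 3D positions are computed, but with echo distances measured at several known sensor positions, a standard approach is trilateration: subtracting one range equation from the others yields a linear system for the obstacle's coordinates. A generic sketch under that assumption (the sensor layout and target are hypothetical):

```python
import math

def locate_3d(receivers, dists):
    """Linearized trilateration from four receivers.

    Subtracting the first sphere equation from the rest gives
    2*(p_i - p_0) . x = |p_i|^2 - |p_0|^2 - (d_i^2 - d_0^2),
    a 3x3 linear system solved here by Cramer's rule.
    """
    p0, d0 = receivers[0], dists[0]
    A, b = [], []
    for p, d in zip(receivers[1:], dists[1:]):
        A.append([2 * (p[k] - p0[k]) for k in range(3)])
        b.append(sum(p[k] ** 2 - p0[k] ** 2 for k in range(3)) - (d ** 2 - d0 ** 2))

    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    D = det(A)
    def with_col(j):  # A with column j replaced by b
        return [[b[i] if k == j else A[i][k] for k in range(3)] for i in range(3)]
    return [det(with_col(j)) / D for j in range(3)]

sensors = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]  # hypothetical layout, meters
target = (0.5, 0.8, 1.2)
dists = [math.dist(s, target) for s in sensors]
print([round(c, 3) for c in locate_3d(sensors, dists)])  # → [0.5, 0.8, 1.2]
```

In practice the measured distances are noisy, so a least-squares fit over more than four receivers would replace the exact solve.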
The drone navigates around an obstacle in a test with simulated snowfall.
Nitin Sanket
Why it matters
These types of drones are very useful for search and rescue, especially in confined, dynamic and dangerous environments, because they are small and inexpensive. Search-and-rescue operations often happen in environments where visibility is very poor, such as forest fires, collapsed buildings, caves or dusty outdoor conditions. In these scenarios, traditional sensors like cameras and lidar often become unreliable.
Bats do not rely only on vision and instead use echolocation to perceive the world. Ultrasound sensing doesn’t depend on lighting conditions and works in smoke, dust and darkness.
Our work shows that it is possible to bring this capability to aerial robots despite strong onboard propeller noise. Sonar boosted by noise shielding and machine learning promises to enable a new class of small, low-cost robots that can operate in environments where current systems fail.
This research can enable highly functional, autonomous, tiny aerial robots for critical humanitarian applications, such as search and rescue, combating poaching and cave exploration. AI-enabled sonar navigation could lead to safer, faster and more cost-effective robots for time-sensitive operations where human or larger helicopter access is limited. This is a step toward being able to deploy swarms of aerial robots, much like groups of…


