Learning acoustics-based localization of a blind drone

Contact: Dümbgen, Frederike

Synopsis: Study and test a drone’s capability to localize itself from acoustic signals.

Level: BS, MS

Description:

Acoustic signals are inexpensive to create and record, and their physics are well understood. Using only relatively cheap and light hardware, drones could therefore be equipped with audio processing capabilities. However, very few drone localization systems use audio signals, and characteristics of drones such as their inherent noise have not been studied with regard to localization.

The first goal of this semester project is to characterize, in an anechoic room, how a drone’s sound depends on factors such as geometry and propeller speed. In particular, we want to create a model that predicts the drone’s frequency response from the movements it makes. Equivalently, we can use this model to infer the movements necessary to generate a desired sound. We will adapt the model’s complexity to the severity of the scattering effects caused by the drone’s body. In a second stage, we will use this knowledge to localize (and potentially control) a drone inside a room from an external microphone.
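To give a flavor of the characterization step, the sketch below estimates the drone’s power spectrum at a few propeller speeds and fits a simple speed-to-peak-frequency model. It is only an illustration: the file names, sample rate, rpm values, and the linear model are assumptions, not part of the project specification.

```python
# Illustrative sketch: relate propeller speed to the dominant spectral peak.
# Recording file names and rpm values below are hypothetical placeholders.
import numpy as np
from scipy.io import wavfile
from scipy.signal import welch

rpms = [8000, 10000, 12000]      # assumed propeller speeds used in the anechoic room
peak_freqs = []

for rpm in rpms:
    fs, x = wavfile.read(f"anechoic_{rpm}rpm.wav")  # placeholder mono recording
    x = x.astype(float)
    f, psd = welch(x, fs=fs, nperseg=4096)          # power spectral density estimate
    peak_freqs.append(f[np.argmax(psd)])            # frequency of the dominant harmonic

# A first-order (linear) model of peak frequency versus propeller speed,
# chosen here only for illustration; the project may require a richer model.
slope, intercept = np.polyfit(rpms, peak_freqs, deg=1)
print(f"Predicted peak at 11000 rpm: {slope * 11000 + intercept:.1f} Hz")
```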

We will use the Crazyflie drone (https://www.bitcraze.io/crazyflie-2-1/) and record measurements with an audio sound card.
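A minimal sketch of one sound-card measurement is shown below, assuming the `sounddevice` Python package and a microphone on the default input device; the library choice, sample rate, and output file are assumptions and not prescribed by the project.

```python
# Minimal sketch: record a short measurement from the sound card's default input.
import numpy as np
import sounddevice as sd

fs = 48000                       # assumed sound-card sample rate in Hz
duration = 2.0                   # seconds per measurement
x = sd.rec(int(duration * fs), samplerate=fs, channels=1)
sd.wait()                        # block until the recording is finished
np.save("measurement.npy", x)    # placeholder output file for later analysis
```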

Deliverables: A report and a working system with clear documentation.

Prerequisites: Interest in audio signal processing, preferably experience working with drones, and solid programming skills.

Type of Work: 50% data acquisition and analysis, 50% algorithm design and programming.