Vision-Based Sense and Avoid Systems for Unmanned Aerial Vehicles
This collaborative project aims to enable fully autonomous Unmanned Aerial Vehicles (UAVs) for civil applications to fly safely and share a common airspace without human supervision. More specifically, our work concerns the algorithms of the Sense and Avoid (SAA) system, which fuse sensor information to track other UAVs and use those tracks to avoid collisions when necessary. In this project, we focus on using a single camera as the main sensor.
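The tracking side of an SAA pipeline maintains a position and velocity estimate for each detected UAV and corrects it with incoming camera measurements. As a minimal sketch (not the project's actual filter; the function name and gains are illustrative), a single alpha-beta tracking step might look like:

```python
def alpha_beta_update(x, v, z, dt, alpha=0.85, beta=0.05):
    """One alpha-beta tracking step along one axis.

    Predicts the target position under a constant-velocity model,
    then corrects position and velocity using the measurement z.
    x, v: current position/velocity estimates; dt: time step.
    alpha, beta: smoothing gains (illustrative values).
    """
    x_pred = x + v * dt          # constant-velocity prediction
    r = z - x_pred               # innovation (measurement residual)
    x_new = x_pred + alpha * r   # corrected position
    v_new = v + (beta / dt) * r  # corrected velocity
    return x_new, v_new
```

A full tracker would run one such filter per axis (or a Kalman filter over the joint state) for every target, but the predict-then-correct structure is the same.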
Our approach is to adapt both the tracking and the avoidance algorithms to the limited field of view (FOV) of the camera. Because a target outside the sensor's FOV cannot be tracked reliably, the vehicle's motion is constrained to keep the target in view and guarantee collision-free trajectories.
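The FOV constraint boils down to a geometric test: the bearing from the camera to the target must stay within the camera's horizontal half-angle. A minimal planar sketch of such a check (names and the 2D simplification are our assumptions, not the project's implementation) could be:

```python
import math

def in_fov(cam_pos, cam_yaw, target_pos, half_fov):
    """Return True if the target lies within the camera's horizontal FOV.

    cam_pos, target_pos: (x, y) positions in a common world frame.
    cam_yaw: camera heading in radians; half_fov: half the horizontal
    FOV in radians. This planar check is illustrative only.
    """
    dx = target_pos[0] - cam_pos[0]
    dy = target_pos[1] - cam_pos[1]
    bearing = math.atan2(dy, dx)
    # Wrap the angular difference to [-pi, pi] before comparing.
    diff = (bearing - cam_yaw + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= half_fov
```

A motion planner can evaluate a predicate like this along candidate trajectories and discard any segment on which the target would leave the FOV, which is the essence of constraining the motion to keep the target observable.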
To validate our approach, we run experiments in an indoor facility equipped with a motion capture system that provides millimeter-level position accuracy. Our experiments use small quadrotors that carry on board the sensing and computation needed to perform SAA.