Can one find one's absolute GPS coordinates and view pose using only a single RGB camera, in less than a second? Can you tell exactly where, and from what angle, a photo or video was taken? In this project, the students will join a research team developing machine learning solutions to these challenges.
Can we teach our drones to ‘feel’ the aerodynamics under their wings? Is it possible to navigate and determine one’s position using only this sensation of aerodynamics? Can we draw inspiration from how nature has solved this challenge? In this project, the student will join an exciting research effort investigating the vehicle dynamic navigation concept, combining state-of-the-art wind tunnel testing, Computational Fluid Dynamics, and physics-guided machine learning.
Real-time navigation of small Uninhabited Aerial Vehicles (UAVs) is predominantly based on sensor fusion algorithms. These algorithms use an Extended Kalman Filter (EKF), in which state dynamics governed by dead reckoning are fused with GPS measurements to yield a navigation solution (position, velocity and attitude). However, during a GPS outage of 1-2 minutes, this solution can drift by 1-2 kilometres, adversely affecting the safety and reliability of the mission. Recently, a Vehicle Dynamic Model (VDM) based navigation system has shown significant improvement in positioning accuracy during a GPS outage. This is accomplished by incorporating aerodynamics into the sensor fusion architecture. However, the modelled aerodynamics are dated, and there is scope for improvement thanks to the availability of i) state-of-the-art quality flight data acquired in the vicinity of Lausanne and ii) wind tunnel experimentation at the Laboratory of Intelligent Systems. The goal of this student project is to combine the data from the real-flight campaign and the wind tunnel experiments to derive a new aerodynamic model of the in-house UAV and to integrate it into the sensor fusion framework.
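The predict/update cycle described above, and the drift that appears once the update step is lost, can be illustrated with a minimal one-dimensional Kalman filter. This is a didactic sketch only: the variable names and noise values are invented for illustration and do not come from the actual EKF implementation.

```python
# Minimal 1-D Kalman filter: dead-reckoned prediction fused with
# (possibly absent) GPS fixes. All noise values are illustrative.

def kf_step(x, P, u, z, q=0.5, r=4.0):
    """One predict/update cycle.

    x, P : position estimate and its variance
    u    : dead-reckoned displacement over the time step
    z    : GPS position fix, or None during an outage
    q, r : assumed process and measurement noise variances
    """
    # Predict: propagate the state with dead reckoning; uncertainty grows.
    x_pred, P_pred = x + u, P + q
    if z is None:
        # GPS outage: no correction, so the estimate drifts and P keeps growing.
        return x_pred, P_pred
    # Update: blend prediction and GPS fix via the Kalman gain.
    K = P_pred / (P_pred + r)
    return x_pred + K * (z - x_pred), (1.0 - K) * P_pred

# With GPS, the variance stays bounded; during an outage it grows every step.
x, P = 0.0, 1.0
for _ in range(10):
    x, P = kf_step(x, P, u=1.0, z=x + 1.0)   # GPS available
P_tracking = P
for _ in range(10):
    x, P = kf_step(x, P, u=1.0, z=None)      # GPS outage
print(P_tracking < P)  # True: uncertainty has grown during the outage
```

The VDM approach keeps this same filter structure but replaces the pure dead-reckoning prediction with a physically informed process model, which is what bounds the drift during an outage.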
The success of a drone mission is contingent on accurate real-time determination of its position, velocity, and attitude, collectively known as the navigation states. For small Unmanned Aerial Vehicles, these are conventionally determined by fusing an Inertial Navigation System (INS) with a Global Navigation Satellite System (GNSS). Very recently, a Vehicle Dynamic Model based navigation system – VDMNav (in post-processing) – has demonstrated improvements over traditional INS-based navigation in 1) attitude determination under normal flying conditions and 2) position determination during a GNSS outage. The student will work with the first real-time prototype in C++ and improve its performance and modularity via different work packages.
The use of redundant MEMS-grade IMUs (R-IMU) presents an economically and ergonomically viable way to improve navigation performance in classical sensor fusion. However, such a system has never been tested in an aerodynamically constrained sensor fusion framework. Thanks to the recent in-house development of an R-IMU board (dahu4Nav), there is hope of producing the first-ever sensor fusion architecture that combines inertial redundancy with the already incorporated aerodynamics. However, numerous challenges on the hardware, firmware and software fronts need to be resolved.
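One simple way to see the benefit of inertial redundancy is noise averaging: combining N simultaneous readings of the same axis reduces uncorrelated sensor noise by roughly a factor of sqrt(N). The sketch below assumes plain averaging of invented gyroscope readings; the real R-IMU fusion (and the dahu4Nav board) is more sophisticated than this.

```python
# Illustrative sketch: averaging four redundant gyro readings of the same
# axis reduces uncorrelated noise compared with a single sensor.
# The noise level (0.05 rad/s) and true rate are invented for illustration.
import random
import statistics

def fuse_redundant(readings):
    """Combine simultaneous readings of one axis from N IMUs by averaging."""
    return sum(readings) / len(readings)

random.seed(0)
true_rate = 0.2  # rad/s, assumed constant angular rate
single = [true_rate + random.gauss(0, 0.05) for _ in range(1000)]
fused = [fuse_redundant([true_rate + random.gauss(0, 0.05) for _ in range(4)])
         for _ in range(1000)]

# The fused stream is noticeably less noisy than any single sensor.
print(statistics.stdev(fused) < statistics.stdev(single))  # True
```

In practice, a skewed-redundant configuration also enables fault detection and isolation, which plain averaging cannot provide; that is part of what makes the hardware and firmware work non-trivial.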
The optimal integration of simultaneously acquired datasets of the 2D domain (imagery) and the 3D domain (lidar point cloud) is a direct task when geometric constraints are considered. However, in the absence of system calibration, the fusion of the two optical datasets becomes uncertain and often fails to meet user expectations. Deep Neural Networks have managed to establish a spatial relationship between the 2D and the 3D domain, but current architectures have not yet been evaluated on long-range (> 200 m) aerial datasets.
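The geometric constraint mentioned above is, at its core, the pinhole projection: with a known camera calibration, each lidar point expressed in the camera frame maps to a unique pixel. The intrinsics and the example point below are invented for illustration.

```python
# Hedged sketch of the 2D-3D geometric constraint: pinhole projection of a
# lidar point (camera frame, metres) into pixel coordinates. Intrinsic
# parameters (fx, fy, cx, cy) are assumed known from calibration.

def project(point_cam, fx, fy, cx, cy):
    """Project a 3-D point in the camera frame to pixel coordinates."""
    X, Y, Z = point_cam
    if Z <= 0:
        raise ValueError("point is behind the camera")
    return (fx * X / Z + cx, fy * Y / Z + cy)

# A point 50 m ahead, 2 m right and 1 m up (illustrative values).
u, v = project((2.0, -1.0, 50.0), fx=1000.0, fy=1000.0, cx=640.0, cy=480.0)
print(u, v)  # 680.0 460.0
```

When the calibration (or the lidar-to-camera extrinsics) is unknown, this mapping is no longer available, which is precisely the gap the learned 2D-3D architectures try to fill.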
VDM-based navigation is a novel approach to autonomous navigation that improves the estimation of the navigation states (position, velocity, attitude) under normal and GNSS-denied flight conditions without additional sensors. A custom delta-wing UAV has been constructed at TOPO with the objective of implementing real-time VDM-based navigation on an onboard computer.
The project’s goal is the development of C++ code implementing a Kalman filter that employs the platform’s aerodynamic model as the process model. This will require transferring information logged by the autopilot (a Pixhawk running PX4 firmware) to the companion computer over an RTPS/DDS bridge (the PX4-ROS2 interface). This interface allows reliable sharing of time-critical/real-time information between the flight controller and offboard components.
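To make "aerodynamic model as process model" concrete, the sketch below propagates a state from control input and aerodynamic forces instead of integrating IMU readings. This is a deliberately crude 1-D point-mass model with invented mass and drag parameters, not the project's actual VDM; the real model is a full rigid-body formulation identified from flight and wind tunnel data.

```python
# Illustrative VDM-style process model: propagate position and velocity
# from thrust and quadratic aerodynamic drag. All parameter values
# (mass, drag coefficient, reference area) are assumptions.
import math

def vdm_step(pos, vel, thrust, dt, mass=1.5, rho=1.225, cd=0.04, area=0.3):
    """One Euler integration step of a 1-D point-mass aerodynamic model."""
    drag = 0.5 * rho * cd * area * vel * abs(vel)   # quadratic drag [N]
    acc = (thrust - drag) / mass                    # Newton's second law
    vel_new = vel + acc * dt
    pos_new = pos + vel * dt + 0.5 * acc * dt * dt
    return pos_new, vel_new

# With constant thrust T the model approaches the terminal speed where
# thrust balances drag: v_t = sqrt(2*T / (rho*cd*area)).
pos, vel = 0.0, 0.0
for _ in range(5000):                # 50 s at dt = 0.01 s
    pos, vel = vdm_step(pos, vel, thrust=2.0, dt=0.01)
v_terminal = math.sqrt(2 * 2.0 / (1.225 * 0.04 * 0.3))
print(abs(vel - v_terminal) < 0.1)   # True: converges to terminal speed
```

In the EKF, a model like this replaces the dead-reckoning prediction step, so that between GNSS updates the state evolves according to physics rather than drifting with integrated IMU noise.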
Vegetation distribution in the Alps is directly related to geomorphic processes, water availability and plant dispersal modes (e.g. animals, wind), and indirectly to human activity, from agriculture to leisure and tourism. Nevertheless, recent warming trends have begun to affect the limits and spatial structure of vegetation colonies in the Alps, thereby threatening the ecosystems and species that currently exist in these zones. While these changes can be monitored locally, region-wide characterisations are needed to accurately model and forecast them. To address this need, broad-scale species distributions are required that accurately link ground-based observations with Earth observation (EO) data.
The goal of this project is to develop deep learning methods that use multi-modal remote sensing data (imagery, LiDAR) to automatically detect individual tree species across broad scales. Such a capacity will support future investigations that promise to reveal substantial insights into the evolution of tree colonisation patterns in the Alps and their relationship to the accelerated processes being observed as a result of climate change.