The optimal integration of simultaneously acquired datasets from the 2D domain (imagery) and the 3D domain (lidar point clouds) is a straightforward task when geometric constraints are considered. However, in the absence of system calibration, the fusion of the two optical datasets becomes uncertain and often fails to meet user expectations. Deep neural networks have managed to establish a spatial relationship between the 2D and the 3D domains, but current architectures have not yet been evaluated on long-range (> 200 m) aerial datasets.
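The geometric constraints mentioned above amount to a calibrated camera model: with known intrinsics and a known lidar-to-camera pose, every 3D point maps deterministically to a pixel. A minimal sketch of this projection under a pinhole model, with hypothetical toy values for the intrinsic matrix and pose:

```python
import numpy as np

def project_points(points_world, K, R, t):
    """Project Nx3 world points into pixel coordinates (pinhole model).

    points_world: (N, 3) lidar points in the world frame
    K: (3, 3) camera intrinsic matrix
    R, t: world-to-camera rotation (3, 3) and translation (3,)
    Returns (N, 2) pixel coordinates and a mask of points in front of the camera.
    """
    cam = points_world @ R.T + t          # world -> camera frame
    in_front = cam[:, 2] > 0              # keep points with positive depth
    uvw = cam @ K.T                       # apply intrinsics
    px = uvw[:, :2] / uvw[:, 2:3]         # perspective division
    return px, in_front

# Toy example: identity pose, 500 px focal length, principal point (320, 240)
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R, t = np.eye(3), np.zeros(3)
pts = np.array([[0.0, 0.0, 10.0],   # on the optical axis
                [1.0, 0.0, 10.0]])  # 1 m to the right, at 10 m depth
px, mask = project_points(pts, K, R, t)
# The on-axis point lands at the principal point (320, 240); the offset
# point shifts by f * X/Z = 500 * 0.1 = 50 px in u, i.e. (370, 240).
```

When calibration is absent or unreliable, K, R and t are unknown, and it is exactly this mapping that a learned 2D-3D matching architecture has to recover from data.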
Vegetation distribution in the Alps is directly related to geomorphic processes, water availability and plant dispersal modes (e.g. animals, wind), and indirectly to human activity ranging from agriculture to leisure and tourism. Nevertheless, recent warming trends have begun to affect the limits and spatial structure of vegetation colonies in the Alps, thereby threatening the ecosystems and species that currently exist in these zones. While these changes can be monitored locally, region-wide characterisations are needed to model and forecast them accurately. To address this need, broad-scale species distributions are required that accurately link ground-based observations with Earth observation (EO) data.
The goal of this project is to develop deep learning methods that use multi-modal remote sensing data (images, LiDAR) to automatically detect individual tree species across broad scales. Such capacity will support future investigations that promise to reveal substantial insights into the evolution of tree colonisation patterns in the Alps and their relationship to the accelerated processes observed as a result of climate change.
Room segmentation of point clouds is an important task in computer vision and robotics with numerous applications in assisted odometry, which in turn is crucial in fields such as autonomous navigation and virtual reality. The goal of this project is to develop a methodology for segmenting point clouds of building interiors into their constituent rooms. These point clouds are acquired with a portable high-end LiDAR device, the BLK2Go, which is carried by a human during the scan.
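A common building block in such pipelines is spatial clustering of the point cloud. As a toy illustration only (real room segmentation must also handle walls, doors and shared surfaces), the sketch below separates two well-separated point groups with a simple Euclidean (radius-connectivity) clustering in pure numpy; the radius and the synthetic "rooms" are hypothetical values chosen for the example:

```python
import numpy as np
from collections import deque

def euclidean_clusters(points, radius):
    """Label points so that two points share a cluster if they are connected
    by a chain of neighbours closer than `radius` (BFS flood fill)."""
    n = len(points)
    labels = np.full(n, -1)
    current = 0
    for seed in range(n):
        if labels[seed] != -1:
            continue                       # already assigned to a cluster
        labels[seed] = current
        queue = deque([seed])
        while queue:
            i = queue.popleft()
            d = np.linalg.norm(points - points[i], axis=1)
            for j in np.where((d < radius) & (labels == -1))[0]:
                labels[j] = current        # absorb unlabelled neighbours
                queue.append(j)
        current += 1
    return labels

# Two synthetic "rooms": point groups around (0, 0, 0) and (10, 0, 0)
rng = np.random.default_rng(0)
room_a = rng.normal([0.0, 0.0, 0.0], 0.5, size=(50, 3))
room_b = rng.normal([10.0, 0.0, 0.0], 0.5, size=(50, 3))
labels = euclidean_clusters(np.vstack([room_a, room_b]), radius=2.0)
# Each group collapses to one label, so two clusters emerge.
```

The brute-force neighbour search is O(n²); production pipelines would replace it with a k-d tree or voxel grid, and would segment rooms using structural cues (walls, openings) rather than raw point proximity.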
Image: Ochmann S., et al., 2019. Automatic reconstruction of fully volumetric 3D building models from oriented point clouds
Real-time navigation of small Uninhabited Aerial Vehicles (UAVs) is predominantly based on sensor-fusion algorithms. These algorithms make use of an Extended Kalman Filter (EKF), wherein state dynamics, governed by dead-reckoning, are fused with GPS measurements to yield a navigation solution (position, velocity and attitude). However, during a GPS outage of 1-2 minutes, this solution can drift by 1-2 kilometers, adversely affecting the safety and reliability of the mission. Recently, a Vehicle Dynamic Model (VDM) based navigation system has shown significant improvement in positioning accuracy during a GPS outage. This is accomplished by incorporating aerodynamics into the sensor-fusion architecture. However, the modeled aerodynamics is dated and there is scope for improvement, thanks to the availability of i) state-of-the-art quality flight data acquired in the vicinity of Lausanne and ii) wind-tunnel experimentation at the Laboratory of Intelligent Systems. The goal of this student project is to combine data from real-flight campaigns and wind-tunnel experiments to derive a new aerodynamic model of the in-house UAV and to integrate it into the sensor-fusion framework.
VDM-based navigation is a novel approach to autonomous navigation that improves the estimation of the navigation states (position, velocity, attitude) under both normal and GNSS-denied flight conditions without adding extra sensors. A custom delta-wing UAV has been constructed at TOPO with the objective of implementing real-time VDM-based navigation on an onboard computer.
The project’s goal is the development of C++ code based on Kalman filtering, with the platform’s aerodynamic model employed as the process model. This will require transferring information logged in the autopilot (a Pixhawk running PX4 firmware) to the companion computer using an RTPS/DDS bridge (the PX4-ROS2 interface). This interface allows reliable sharing of time-critical/real-time information between the flight controller and offboard components.