Airborne LiDAR and Imagery Aided Tree Species Detection via Semi-Supervised Deep Learning

Tree individuals detected from LiDAR are combined with imagery in an automatic tree detection and classification algorithm

Keywords

Remote sensing, Lidar, Imagery, 3D, Point-cloud, Deep Learning

Introduction

Vegetation distribution in the Alps is directly related to geomorphic processes, water availability and plant dispersal modes (e.g. animals, wind), and indirectly to human activity ranging from agriculture to leisure and tourism [4]. Nevertheless, recent warming trends have begun to affect the limits and spatial structure of vegetation colonies in the Alps [3,12], thereby threatening existing ecosystems at upper altitudes. While these changes can be monitored locally, a region-wide characterisation is needed to model and forecast them accurately. To address this need, broad-scale species distributions are required that accurately link in-situ observations with Earth observation (EO) data [10,14].

In particular, we aim to leverage in-situ observations of species characteristics and environmental parameters in combination with visible and structural features derived from airborne imagery and laser scanning data to automatically detect individual tree species over broad scales via semi-supervised deep learning approaches. The products created in this project will be applied to future research efforts, which will explore the potential of a new generation of process-based species evolution models that take advantage of the spatially explicit species distribution data produced in this project.

In short, this project aims to lay the foundation for the use of remote sensing technology to support future investigations that promise to reveal substantial insights into the evolution of tree colonisation patterns in the Alps and their relationship to the accelerated processes observed as a result of climate change.

Task Description and Methodology

In this project, we will investigate the possibility of automatically establishing species distributions from airborne sensor data in combination with deep learning models. Airborne imagery and laser scanning data were collected over the Swiss alpine valley of Val d’Arpette in the fall of 2021. These data, in combination with in-situ species observations, will be used to train and deploy an object detection model based on deep neural networks, currently the gold standard in remote sensing image analysis. The main tasks in this project are summarized as follows:

  • Review and identify relevant literature on forest monitoring and object detection.
  • Following [1], tree locations automatically detected by a LiDAR-based tree detection algorithm will be used to create ‘fuzzy’ approximate tree bounding boxes, which serve as initialization training data for the chosen object detection model (see the first sketch after this list).
  • Precise tree bounding boxes must be manually established using ground-truth in-situ observations, which include the individual tree species for each label.
  • The hyper-parameters of the trained model will then be tuned to enhance the performance for the given area of interest.
  • Predicted tree individuals will then be spatially overlapped with the LiDAR-based detections to obtain the height of each classified tree individual (see the second sketch after this list).
  • The results should be prepared for visualization and sharing via standard GIS systems.
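
As a minimal sketch of the initialization step above: assuming a canopy height model (CHM) raster derived from the LiDAR point cloud, local maxima can be taken as candidate tree tops and expanded into ‘fuzzy’ boxes with a simple height-to-crown-radius rule. The file name, thresholds and radius heuristic below are illustrative assumptions, not project specifications.

    # Sketch: derive 'fuzzy' tree bounding boxes from a CHM rasterized from the
    # LiDAR point cloud. All names and thresholds here are illustrative.
    import numpy as np
    import rasterio
    from scipy.ndimage import maximum_filter

    with rasterio.open("chm_arpette.tif") as src:      # hypothetical CHM raster
        chm = src.read(1)
        transform = src.transform

    # Local maxima of the CHM are treated as candidate tree tops; the window
    # size (in pixels) depends on the CHM resolution.
    local_max = (chm == maximum_filter(chm, size=21)) & (chm > 2.0)  # skip low vegetation
    rows, cols = np.nonzero(local_max)

    boxes = []
    for r, c in zip(rows, cols):
        x, y = rasterio.transform.xy(transform, r, c)  # map coordinates of the tree top
        radius = 0.5 * chm[r, c] ** 0.6                # crude height-to-crown-radius rule
        boxes.append((x - radius, y - radius, x + radius, y + radius))  # xmin, ymin, xmax, ymax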

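Continuing the sketch, the LiDAR-derived height of each predicted tree can be obtained by reading the CHM inside every predicted box, and the result written to a GIS-readable file. The species/score placeholders, the Swiss LV95 CRS and the output format are assumptions for illustration; in practice they come from the detection model and the project data.

    # Sketch: attach a LiDAR-derived height to each predicted tree and export
    # the result for standard GIS software. Placeholders stand in for the
    # detector's outputs.
    import geopandas as gpd
    import numpy as np
    import rasterio
    from rasterio.windows import from_bounds
    from shapely.geometry import box

    species = ["unknown"] * len(boxes)                 # placeholder: predicted species labels
    scores = [1.0] * len(boxes)                        # placeholder: detection confidences

    heights = []
    with rasterio.open("chm_arpette.tif") as src:      # same hypothetical CHM as above
        for xmin, ymin, xmax, ymax in boxes:
            win = from_bounds(xmin, ymin, xmax, ymax, transform=src.transform)
            patch = src.read(1, window=win, boundless=True, fill_value=0.0)
            heights.append(float(np.nanmax(patch)))    # tallest canopy height inside the box

    gdf = gpd.GeoDataFrame(
        {"species": species, "score": scores, "height_m": heights},
        geometry=[box(*b) for b in boxes],
        crs="EPSG:2056",                               # Swiss LV95, assumed here
    )
    gdf.to_file("tree_predictions.gpkg", driver="GPKG")  # readable by QGIS / ArcGIS
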
Inputs:

  • 10 cm airborne orthophoto and LiDAR point cloud (~20 points/m²)
  • ~500 in-situ localized tree species observations
  • PyTorch Lightning implementation provided by [1], or other identified state-of-the-art methods (a minimal fine-tuning sketch follows below).
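
For orientation, the following shows one possible shape of such a fine-tuning setup: a PyTorch Lightning module wrapping a pretrained torchvision detector whose head is resized to the tree species classes. This is a generic, hedged sketch, not the specific implementation of [1].

    # Sketch: generic LightningModule around a pretrained torchvision detector.
    import pytorch_lightning as pl
    import torch
    import torchvision
    from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

    class TreeDetector(pl.LightningModule):
        def __init__(self, num_classes: int, lr: float = 1e-4):
            super().__init__()
            self.lr = lr
            # Pretrained Faster R-CNN; the box predictor is replaced so the head
            # matches the number of tree species (num_classes includes background).
            self.model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
            in_features = self.model.roi_heads.box_predictor.cls_score.in_features
            self.model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)

        def training_step(self, batch, batch_idx):
            images, targets = batch                    # targets: dicts with 'boxes' and 'labels'
            loss_dict = self.model(images, targets)    # torchvision detectors return losses in train mode
            loss = sum(loss_dict.values())
            self.log("train_loss", loss)
            return loss

        def configure_optimizers(self):
            return torch.optim.AdamW(self.model.parameters(), lr=self.lr)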

Deliverables:

  • Report summarising findings of the investigation
  • Code implementation published to TOPO lab GitLab account
  • Output predictions prepared in a format readable by standard GIS software

Prerequisites

  • Experience with Python, deep learning concepts, common deep learning frameworks (PyTorch, TensorFlow, etc.) and computer vision.

Contact

Interested candidates are kindly asked to send us their CV / GitHub profile and a short motivation statement by email.

Jesse Lahaye, Jan Skaloud