iThCoM

Non-Invasive AI-powered Thermal Comfort Monitoring

Summary

Rapid urbanization and the fact that people spend almost 90% of their time indoors make indoor environmental quality in new and existing buildings a decisive factor for occupants' well-being. A human-centric approach to designing and operating buildings is therefore an important step towards improving the living and working conditions of individuals.

The transition towards human-centered indoor climate control requires knowledge of individuals' thermal state at the level of body parts. However, current state-of-the-art solutions rely on invasive wearable technologies that require physical access to people and cannot scale with the number of occupants (e.g., in public spaces with a continuous flow of people). This project aims to investigate non-invasive multi-modal sensing solutions, such as IR and RGB cameras, and to develop a Machine Learning-based human thermo-physiology model to effectively monitor occupants' thermal states in indoor spaces. We will show that it is possible to remotely measure individuals' thermal states given a new set of input attributes that are detectable with cameras, which could enable personalized, human-in-the-loop HVAC control. Advanced computer vision techniques, such as human body part detection and attribute recognition, will be used to extract these key attributes. The project combines the ergonomics of the indoor thermal environment with advanced computing methods such as Deep Learning and Computer Vision.
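
To make the intended pipeline concrete, the sketch below shows one possible way to go from an aligned RGB/IR frame pair to per-body-part skin temperatures and a predicted thermal sensation. It is a minimal illustration, not the project's implementation: the detect_body_parts stub, the linear IR-to-temperature calibration, the synthetic training data, and the RandomForestRegressor are placeholder assumptions standing in for the actual body-part detector and ML-based thermo-physiology model.

```python
# Illustrative sketch only: the detector, calibration constants and the
# regressor below are hypothetical placeholders, not the project's code.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

BODY_PARTS = ["head", "torso", "left_hand", "right_hand"]

def detect_body_parts(rgb_frame: np.ndarray) -> dict[str, tuple[int, int, int, int]]:
    """Placeholder body-part detector run on the RGB frame.

    A real system would use a pose-estimation or part-segmentation network;
    here fixed bounding boxes (x0, y0, x1, y1) keep the pipeline runnable.
    """
    h, w = rgb_frame.shape[:2]
    return {
        "head": (w // 3, 0, 2 * w // 3, h // 4),
        "torso": (w // 4, h // 4, 3 * w // 4, 3 * h // 4),
        "left_hand": (0, h // 2, w // 6, 3 * h // 4),
        "right_hand": (5 * w // 6, h // 2, w, 3 * h // 4),
    }

def raw_ir_to_celsius(ir_frame: np.ndarray) -> np.ndarray:
    """Assumed linear radiometric calibration: centi-Kelvin counts -> deg C."""
    return 0.01 * ir_frame - 273.15

def extract_features(rgb_frame: np.ndarray, ir_frame: np.ndarray) -> np.ndarray:
    """Mean skin temperature per detected body part (the camera-detectable attributes)."""
    temps = raw_ir_to_celsius(ir_frame)
    boxes = detect_body_parts(rgb_frame)
    feats = []
    for part in BODY_PARTS:
        x0, y0, x1, y1 = boxes[part]
        feats.append(float(temps[y0:y1, x0:x1].mean()))
    return np.array(feats)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic training data: per-part skin temperatures -> thermal sensation
    # on the 7-point ASHRAE scale (-3 cold ... +3 hot). Real labels would come
    # from occupant surveys paired with camera recordings.
    X_train = rng.normal(loc=33.0, scale=1.5, size=(200, len(BODY_PARTS)))
    y_train = np.clip(X_train.mean(axis=1) - 33.0, -3, 3)
    model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X_train, y_train)

    # One synthetic RGB frame and an aligned raw IR frame (centi-Kelvin counts).
    rgb = rng.integers(0, 255, size=(240, 320, 3), dtype=np.uint8)
    ir = rng.normal(loc=30600, scale=100, size=(240, 320))
    x = extract_features(rgb, ir)
    print("per-part skin temperatures [degC]:", np.round(x, 2))
    print("predicted thermal sensation:", round(float(model.predict([x])[0]), 2))
```

In a deployed system, the placeholder detector would be replaced by a trained body-part detection network, and the regressor by the project's thermo-physiology model trained on occupant feedback collected alongside the camera recordings.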

We believe that this interdisciplinary collaboration will pave the way for a more resilient, human-centered built environment and lead to new opportunities for designing and operating sustainable buildings.

Output

PechaKucha: ENAC Research Day 07.09.2023