Semester Projects and Master Theses

Various human-robot interaction systems have been designed to support children’s learning of language and literacy. These systems commonly foster learning by using social robots to engage learners in designed educational activities, which rely heavily on the human-robot social relationship. In this project, we will build the QT Cowriter, an intelligent conversational handwriting companion robot powered by ChatGPT that can talk with children like a friend and hold a pen to write on tablets. Building on this platform, we will exploit the social bond between children and robots and explore the model of Relational Norm Intervention (RNI) for regulating body posture during handwriting.

To this end, in this research project, we will work with QTrobot and Wacom tablets. We will implement innovative human-robot interaction systems, develop state-of-the-art algorithms, and conduct small-scale user studies. We will have weekly meetings to address questions, discuss progress, and think about future ideas.

We are looking for students with the following interests: HRI, ROS, and Large Language Models. Relevant IT skills include Python and ROS basics. If you are interested, do not hesitate to contact me.

Contact: [email protected]

VR workspaces are becoming increasingly prevalent, revolutionizing industries such as design, training, and collaboration. However, prolonged use of virtual desktops in VR can lead to discomfort and health issues due to sustained static postures. This project seeks to address this challenge by creating an unobtrusive system that subtly guides users into healthier postures without disrupting their immersive experience. In [1], the unobtrusive intervention was implemented by adjusting the content position at an imperceptibly low speed, but the motion strategies were pre-defined regardless of the user’s current posture. A second objective of this project is therefore to investigate a Reinforcement Learning agent that adaptively adjusts the content position and learns personalized intervention strategies, either helping the user keep a proper posture or occasionally stimulating body movement to prevent long static postures.
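To give a concrete flavour of the RL direction, here is a minimal tabular Q-learning sketch. The posture states, actions, and user model are all hypothetical toy stand-ins, not part of the actual system; the real agent would observe tracked posture in VR and act on content position.

```python
import random

random.seed(0)

# Toy sketch: states are a coarse discretization of posture, actions shift
# the virtual content. All names and the user model are hypothetical.
STATES = ["slouched", "neutral", "overextended"]
ACTIONS = ["shift_up", "hold", "shift_down"]

def simulate_posture(state, action):
    """Very simplified user model: shifting content up tends to correct a
    slouched posture, shifting it down corrects overextension."""
    if state == "slouched" and action == "shift_up":
        return "neutral"
    if state == "overextended" and action == "shift_down":
        return "neutral"
    if state == "neutral" and action == "hold":
        return "neutral"
    return random.choice(STATES)

def train(episodes=2000, alpha=0.1, gamma=0.9, eps=0.2):
    q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    for _ in range(episodes):
        s = random.choice(STATES)
        for _ in range(10):
            # epsilon-greedy action selection
            a = (random.choice(ACTIONS) if random.random() < eps
                 else max(ACTIONS, key=lambda act: q[(s, act)]))
            s2 = simulate_posture(s, a)
            r = 1.0 if s2 == "neutral" else -1.0  # reward healthy posture
            q[(s, a)] += alpha * (r + gamma * max(q[(s2, b)] for b in ACTIONS)
                                  - q[(s, a)])
            s = s2
    return q

q = train()
best = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in STATES}
```

In this toy setup the learned greedy policy raises the content for a slouched user and lowers it for an overextended one; a personalized agent would additionally condition on per-user state and penalize perceptible motion.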

To this end, in this project, we will work on VR/XR development with Unity and develop new RL algorithms for adaptive posture intervention. We will have weekly meetings to address questions, discuss progress, and think about future ideas. We aim to summarize the results in a scientific report.

We are looking for students with any of the following interests: VR/XR, Machine Learning, and Human-computer Interaction. Relevant IT skills include Python and knowledge of any one of the following object-oriented programming languages: C++ or C#. Experience with VR/XR development would be beneficial. If you are interested, do not hesitate to contact me.

Contact: [email protected]

Learning how to grip the pen properly is a fundamental part of handwriting training for children, which requires constant monitoring of their pen grip posture and timely intervention from teachers. Various sensing technologies have been explored to automate pen grip posture estimation, such as camera-based systems or EMG armbands. In the context of digital writing, i.e., writing on tablets, these solutions with additional sensors lack portability. In this project, we aim to tackle this challenge by exploiting the integrated sensors of touch screens and digital pens. One study showed that it is promising to reconstruct the 3D hand pose from the capacitive images provided by the touch screen. Together with the readily accessible pen tip location and orientation, which are strongly coupled with the hand pose, we postulate that the pen grip posture can be inferred in situ with a single commodity tablet and pen. This is a continuing project: a dataset and a first version of the pen grip posture analysis were developed in Phases 1 and 2. In this phase, the goal is to improve the algorithm and implement the baseline condition.
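The core idea, fusing the capacitive image with the pen pose into one feature vector and classifying the grip, can be sketched as follows. The grip classes, feature dimensions, and synthetic data generator below are hypothetical illustrations; the project itself uses the recorded dataset and deep-learning models rather than this nearest-centroid stand-in.

```python
import math
import random

random.seed(0)

# Hypothetical grip classes for illustration only.
GRIPS = ["tripod", "quadrupod", "fisted"]

def synth_sample(grip):
    """Stand-in for a real recording: an 8x8 capacitive frame plus pen
    (altitude, azimuth) angles, with grip-dependent statistics."""
    bias = {"tripod": 0.2, "quadrupod": 0.5, "fisted": 0.8}[grip]
    cap = [min(1.0, max(0.0, bias + random.gauss(0, 0.1))) for _ in range(64)]
    tilt = [bias * 90 + random.gauss(0, 5), bias * 360 + random.gauss(0, 10)]
    return cap + [t / 360 for t in tilt]  # one fused feature vector

def nearest_centroid_fit(samples):
    """Average the fused feature vectors per grip class."""
    cents = {}
    for g in GRIPS:
        rows = [x for label, x in samples if label == g]
        cents[g] = [sum(col) / len(rows) for col in zip(*rows)]
    return cents

def predict(cents, x):
    return min(cents, key=lambda g: math.dist(cents[g], x))

train = [(g, synth_sample(g)) for g in GRIPS for _ in range(50)]
cents = nearest_centroid_fit(train)
pred = predict(cents, synth_sample("tripod"))
```

A deep model would replace the centroid step with a CNN over the capacitive frames, but the input fusion shown here stays the same.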

To this end, in this research project, we will work with a Wacom pen tablet and develop new deep-learning algorithms for pen grip posture estimation and analysis. We will have weekly meetings to address questions, discuss progress, and think about future ideas. We aim to summarize the results in a scientific report.

We are looking for students with any of the following interests: Machine Learning, Human-computer Interaction, Computer Vision, and Mobile Computing. Relevant IT skills include Python and basic knowledge of any one of the following object-oriented programming languages: C++, Java or C#. If you are interested, do not hesitate to contact me.

Contact: [email protected]

This project will explore uses of diminished reality, a type of mixed reality where real-world objects are removed or occluded using computer vision before being passed into a user’s headset. You will use a Zed Mini camera to capture and process a video stream of the world; identify objects using YOLO (or a similar real-time object-detection algorithm); attempt to occlude, blur, or remove these objects; and then finally pass this modified video stream into the user’s headset. The end goal of this project is to develop a mixed reality environment that supports users in blocking out distractions to improve focus and concentration. More information on diminished reality can be found in this paper: https://dl.acm.org/doi/fullHtml/10.1145/3491102.3517452.
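The frame-processing step of that pipeline can be sketched independently of the hardware. In the snippet below, the Zed Mini frame and the YOLO detections are replaced by toy stand-ins (a small grayscale grid and a hand-written box) so only the occlusion logic is shown; the real system would run detection per frame before compositing into the headset view.

```python
# Sketch of the occlusion step only; frames and detections are stand-ins.
def mean_fill(frame, box):
    """Occlude a detected region by replacing it with its mean intensity,
    a crude stand-in for blurring or inpainting."""
    x0, y0, x1, y1 = box
    patch = [frame[y][x] for y in range(y0, y1) for x in range(x0, x1)]
    mean = sum(patch) // len(patch)
    for y in range(y0, y1):
        for x in range(x0, x1):
            frame[y][x] = mean
    return frame

def diminish(frame, detections, targets):
    """Remove only the object classes the user wants blocked out."""
    for label, box in detections:
        if label in targets:
            frame = mean_fill(frame, box)
    return frame

# Toy 6x6 grayscale frame with a bright "distractor" at rows/cols 1..2.
frame = [[200 if 1 <= y < 3 and 1 <= x < 3 else 10 for x in range(6)]
         for y in range(6)]
out = diminish(frame, [("phone", (1, 1, 4, 4))], targets={"phone"})
```

Swapping `mean_fill` for a Gaussian blur or a learned inpainting model, and feeding in real YOLO boxes, turns this into the actual diminished-reality loop.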

Required knowledge: Unity, Python, CV

Contact: Richard Davis at [email protected]

In this project, you will analyse a dataset of people being interviewed by a human and by a robot. The goal is to find patterns in the extracted metrics and to investigate potential correlations between the metrics and the type of interlocutor. The metrics are voice features and automatic speech recognition outputs from the system. Your work will involve: (1) video analysis to build the ground truth of the metrics, (2) implementing and fine-tuning algorithms to find patterns, and (3) implementing a machine learning algorithm on the robot that tries to identify, at run time, whether users are behaving as if they are “talking to a robot”, based on the analysed dataset.
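Step (3) could look like the following minimal sketch: a tiny logistic model over voice features that flags robot-directed speech at run time. The features, their values, and the assumption that robot-directed speech is slower and pause-heavier are all hypothetical here; the real model would be trained on the annotated dataset.

```python
import math
import random

random.seed(1)

def synth(label):
    """Synthetic stand-in for extracted voice features; label 1 means the
    speaker is (hypothetically) talking to a robot: slower, longer pauses."""
    rate = random.gauss(2.5 if label else 4.0, 0.4)   # syllables/second
    pause = random.gauss(0.8 if label else 0.3, 0.1)  # mean pause length, s
    return [rate, pause], label

def train(data, lr=0.1, epochs=200):
    """Plain logistic regression via stochastic gradient descent."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1 / (1 + math.exp(-z))
            g = p - y
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

data = [synth(random.random() < 0.5) for _ in range(200)]
w, b = train(data)

def is_robot_directed(x):
    """Run-time decision on one utterance's feature vector."""
    return sum(wi * xi for wi, xi in zip(w, x)) + b > 0
```

On the robot, the feature extraction would run on the live audio stream and feed each utterance through the trained decision function.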

Required knowledge: Python, Machine Learning, Basics of Statistics

Keywords: Data Analysis, Human-Robot Interaction, Conversational AI

Contact: [email protected] with tag in the subject: [Semester Project]

Dyslexia is a specific and long-lasting learning disorder characterised by reading performances well below those expected for a certain age. Eye-tracking technology has been successfully used to study the reading behaviour of children. This semester project aims to explore its application in more ecologically valid scenarios and expand its usage. We have started to develop an iOS application that uses the eye-tracking abilities of ARKit, Apple’s augmented reality framework, to study children’s gaze behaviour while they interact with the iPad. The goals of this semester project are: 1) to improve the eye-tracking and 2) to conduct experiments comparing the results obtained from the ARKit application with those from eye-tracking glasses, as well as across different iPad models.


Since the application runs on iPad, you will learn how to program in the Swift language. A background in linear algebra is useful. We seek students interested in iOS development and experiment design to join this project. This project is primarily designed as a master semester project, but can be adapted for motivated bachelor students.

Contact: [email protected]

Dyslexia is a specific and long-lasting learning disorder characterised by reading performances well below those expected for a certain age. A recent promising research trend focuses on abnormalities of the “internal clock” used to sample information as one of the main underlying deficits. We are developing several digital activities to explore that view.

In this project, you will develop a game that calls on the rhythmic abilities of the user.
Since the application runs on iPad, you will learn how to program in the Swift language. We seek students interested in iOS development and game design to join this project. This project is primarily designed as a master semester project, but can be adapted for motivated bachelor students.


Contact: [email protected]

Jupyter notebooks have become an essential tool in data science, scientific computing, and machine learning, in both industry and academia. Cloud-based Jupyter notebooks like Google Colab, Noto, and JupyterHub bring the power of Jupyter notebooks into the cloud and make them easier to share and collaborate on. At EPFL and other universities, these cloud-based Jupyter notebooks are used as interactive textbooks, as platforms for distributing and grading homework, and as simulation environments.

These notebooks produce rich logs of interaction data, but there is currently no easy way for teachers and students to view and make sense of this data. This data could provide a valuable source of feedback that both teachers and students could use to improve their teaching and learning. This way of using data is called learning analytics, and we have recently begun designing a software extension that will bring the power of learning analytics directly into cloud-based Jupyter notebooks.
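As a flavour of what "making sense of this data" could mean, the sketch below aggregates per-cell execution and error counts from an interaction log. The log schema shown is hypothetical, not the actual format emitted by Colab, Noto, or JupyterHub; the extension would read whatever events the hosting platform records.

```python
import json
from collections import Counter

# Hypothetical JSON-lines interaction log (schema invented for illustration).
raw_log = """
{"user": "s1", "event": "execute", "cell": 3, "status": "error"}
{"user": "s1", "event": "execute", "cell": 3, "status": "ok"}
{"user": "s2", "event": "execute", "cell": 1, "status": "ok"}
{"user": "s1", "event": "open_notebook"}
"""

def summarize(lines):
    """Per-cell execution and error counts: cells where many students hit
    errors are candidates for teacher attention."""
    execs, errors = Counter(), Counter()
    for line in lines.strip().splitlines():
        ev = json.loads(line)
        if ev.get("event") == "execute":
            execs[ev["cell"]] += 1
            if ev.get("status") == "error":
                errors[ev["cell"]] += 1
    return execs, errors

execs, errors = summarize(raw_log)
```

Visualizing such aggregates inside the notebook, for teachers and for students themselves, is exactly the kind of interface this project will design.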

We are looking for students to join the development of this learning analytics tool with any of the following interests: data visualization, full-stack web development, UX research, learning analytics, and education.

Contact: [email protected]