Alumni of IIG
Former staff members
Thibault Porssut worked on “Embodiment in Virtual Reality” as a PhD student at EPFL from 2016 to 2020 (EPFL PhD thesis N°8053). He then worked as a postdoc funded by the Hasler Foundation from December 2020 to May 2021. He received a Master’s degree in Mechanical Engineering and Industrial Engineering from the École des Arts et Métiers and a Research Master’s degree in Digital Mock-up and Virtual Immersion from the Institut Image in France. He did his Master’s project at EVL in Chicago, in partnership with Mechdyne, on collaborative tasks between the HTC Vive and the CAVE2.
Phil Lopes graduated with a PhD in Artificial Intelligence in Digital Games from the Institute of Digital Games of the University of Malta. His main interest is the development of new tools and algorithms for Procedural Content Generation (PCG) and digital game blending. He currently focuses on the interplay of audio, level design and visuals. His most developed system is Sonancia, a multi-faceted generator for horror, which is available for download. Phil also worked as a post-doc at the University of Geneva, continuing his research in music and games, in addition to developing tools and applications in the field of emotion recognition, specifically on the topic of e-sports. Phil has worked on Virtual Reality systems and their application in emotion recognition, specifically for MRI analysis, including games and applications that can be used effectively in this condition. He is very passionate about games (board games, digital and pen & paper) and music (sound design, recording and playing). He is now Assistant Professor in Artificial Intelligence for Digital Games at Universidade Lusófona de Humanidades e Tecnologias, Lisbon, Portugal.
Neal Hartman received a B.S.E. degree in Mechanical Engineering from UC Berkeley and an M.A. in Film Directing from the University of Westminster. He worked for 20 years as an engineer at CERN, the particle physics laboratory in Switzerland. In IIG he worked on “Embodiment in Virtual Reality”. He is now director of the Science Gallery in Venice.
Julien received a Master’s degree in Computer Science from EPFL. He did his Master’s project with IIG, on evaluating and reducing motion sickness in Virtual Reality. He then worked on this topic as a scientific assistant with a short project grant from the Hasler Foundation.
Sidney worked at EPFL-IIG from January to August 2017 on movement distortion and immersive embodied interactions, funded by the Hasler Foundation. He received both his B.Sc. and M.Sc. in Computer Science from EPFL between 2011 and 2017. He now works as a Software Engineer at Logitech.
Dr Wang worked as a postdoctoral research associate at EPFL-IIG from 2012 to 2015; he then moved to EPFL-LMAM to finalize the SUVA project. Nan Wang received his university degree from the Communication University of China (Beijing) in 2006 and his research Master’s degree in France, at the Department of Computer Science of INSA de Lyon, in 2007. After a further year at EURECOM as a research engineer working in Computer Vision and Image Processing, he joined the Centre de Robotique of École des Mines de Paris and received his Ph.D. in 2012 under Prof. Philippe Fuchs, on immersive 3D interaction and 3D perception. Nan Wang’s research interests are in Human-Computer Interaction, particularly 3D visualization, immersive interaction, and motion capture and analysis.
Valerie worked on the CTI project Walt-Mocap from December 2016 to May 2017. She received her Bachelor’s degree in Animation and her Master’s degree in Animated Film Direction from the Ceruleum Ecole d’Arts Visuels (Lausanne) in 2012. She worked as an animator and lead animator in renowned European companies producing 2D & 3D films and advertisements.
Dr Mahmudi worked at EPFL-IIG from May 2014 to May 2017 in the framework of two CTI projects: WALT and WALT-Mocap. His research, in association with Moka Studio, was on a sketching interface for designing human 3D poses for graphic artists. He received his BSc degree in EECS from Jacobs University in Germany. He then joined the Graphics Lab of the University of California, Merced for graduate work and received his PhD degree in 2013. His PhD dissertation was on developing data-driven, human-like motion planning algorithms suitable for virtual characters and humanoid robots. His research interests are in Computer Graphics, AI and Robotics.
Dr Harish worked at EPFL-IIG from June 2014 to May 2017 in the framework of two CTI projects: WALT and WALT-Mocap. His research, in association with Moka Studio, was on designing parallel inverse kinematics algorithms on the GPU. He received his B.Tech from UPTU, Lucknow, India, and joined the PhD program at IIIT Hyderabad, India in 2005. During his PhD, advised by Prof. P. J. Narayanan, he focused on parallelizing graph algorithms on the GPU and on computational displays. He completed his PhD in 2013 and joined the University of California, Irvine as a visiting scholar. He then joined Samsung Research India as a technical lead, where he worked on parallelizing graph-cut algorithms and optimizing camera and pose estimation for 2D animations. His interests include parallel algorithms, novel displays, HCI and computer graphics.
Mireille worked on two CTI projects, Walt and Walt-Mocap, between October 2014 and November 2016. A graphic and computer-graphics designer by training, she worked for 15 years as a 3D designer and animator at EPFL-VRlab on 3D virtual humans meeting the specific requirements of real-time animation. Tackling so many animation problems over the years has sharpened her keen eye for tiny details and nurtured her fascination for expressive motion and believability. Having worked in close collaboration with researchers from various fields (computer scientists, ophthalmologists, psychiatrists, artists, etc.), she favours teamwork for its ability to foster people’s creativity and to generate unexpected ideas.
Dr Galvan Debarba is now Assistant Professor at the IT University of Copenhagen, after working for the Artanim Foundation in Geneva. He graduated on February 17th, 2017 (EPFL PhD Thesis N°7180). He obtained a Master’s in Computer Science from the Federal University of Rio Grande do Sul and a Bachelor’s in Digital Technologies from the University of Caxias do Sul. He was a doctoral student at IIG from September 2012 to August 2016, where he conducted research on the perceptual experience and the bond between user and avatar in immersive virtual reality; more specifically, on how to take advantage of the malleability of virtual environments and address their lack of tactile feedback in embodied interaction.
Dr Molla is now working at Rockstar North in Edinburgh. He graduated on January 29th, 2016 (EPFL PhD Thesis N°6789). After obtaining a B.Sc. in Computer Engineering at METU, Eray Molla joined EPFL and completed his Master’s studies in Computer Science there, together with a minor in Management of Technology and Entrepreneurship. During his Master’s studies he participated in the research activities of CVLAB, working mainly with Dr. Vincent Lepetit on Augmented Reality. He also did a 6-month internship at Logitech focusing on Virtual Reality and Computer Vision. He was a PhD student at IIG from September 2011 to January 2016.
Dr Li, Visiting Professor. Now working at Tianjin University, China
Dr Llobera, Senior Researcher. Now CEO of Timepath
Dr Ahn, Senior Researcher. Now working at Technicolor, New York
Dr Gobron, Senior Researcher. Now Associate Professor at HES-SO Neuchâtel
Quentin Silvestre, Engineer. Now Unity 3D C# Engineer at MindMaze
Final Master Projects
- Nicolas Vial (internship in Autonomio) : Neuro-rehabilitation game connected to an exoskeleton, September 2023
- Jen Yi-Hsin (internship in Olympe) : Olympe DRAW UI rewrite, April 2023
- Antoine Brunner (internship in Swiss motion Technology SA) : 3D to 2D textile pattern generation, August 2022
- Lucas Strauss: Internal Traits, Sense of Embodiment and Full body Follower effect, July 2022
- Valentin Jacquat (internship in Artanim Foundation): A VR experience with novel character Animation Techniques, April 2022
- Peter Krcmar: TELL in motion: embodied reading-writing learning method (collaboration with UNIL), February 2022
- Qiu Huajian (in ETHZ): Perceptual Motion Evaluation and Pose Reconstruction, January 2022
- Romain Clément (University of Malta): Assisted game-level design with genetic algorithms, September 2021
- David Pittet: Haptic Interaction with the Dexmo, July 2021
- Jean Duquenne (Logitech): Multimodal Interactions on Oculus Quest (including Hands), March 2021
- Loïc Serafin (Logitech): Asymmetric virtual reality system, March 2021
- Arthur Parmentier (eM+): Sign language, sound synthesis, soundpainting, motion tracking, July 2020
- Tim Nguyen (Raptor-lab): Effects of complex full body force feedback and haptics on proprioception, presence and immersion in a virtual environment, August 2020
- Joseph Vavalà (ELCA): Maintenance and support through AR/MR, March 2020
- Boris Goullet (ELCA): VR for banking applications, March 2020
- Karine Perrard (Logitech Europe, Cork): AR for keyboard entry, March 2020
- Florian Poma (Lambda Health Systems): Serious Game design for rehabilitation, September 2019
- Fabien Zellweger (Logitech / Cork Ireland), Exploring wrist & finger wearables for interaction in AR/VR, August 2018
- Yannick Grimault (ProcSim Consulting Sàrl), Ergonomic evaluation in VR, August 2018
- Valentin de la Perraudière (Nextflow Software), Automation of build processes, March 2018
- Julien Marengo, Assessing motion sickness factors, July 2018
- Gaétan Séchaud (Lambda Health Systems), VR for leg rehabilitation, August 2017
- Dragic Ozrenko (Logitech Europe SA), Smart tags, August 2017
- Kristof Szabo (Logitech Europe SA), Text input in VR, April 2017
- Sidney Bovet (project supervisors: H Galvan & R Boulic), Ensuring self-Haptic Consistency for Immersive Amplified Embodiment, February 2017
- Arthur Giroux and Fabien Bourban (MindMaze SA), Building a framework for AR / VR applications with BCI capabilities, March 2016
- Gaël Grosch (MindMaze SA), Real-time human body-pose estimation, March 2015
- Julien Jacquemot (Univ. Paris Descartes, Hôpital Broca), Natural Human-Machine Interface for Elderly People: A Virtual Agent Platform, August 2014
- Matthieu Fond (project supervisor: G. Sapiro, Duke University), Estimating Body Orientation from low resolution Basketball footage, August 2014
- Gruner Samuel (project supervisor: H. Galvan), Intuitive hand-free navigation interface in cluttered virtual environment, January 2014
- Pisupati Phanindra Venkata Aditya (project supervisor: E. Molla), Study of Human Stepping Pattern with Multiple Video Projectors, January 2014
- Zhou Yunpeng (project supervisors: B. Herbelin & H. Galvan), Baumgartner avatar project, January 2014
- Tychinskaya Anastasiya (Imina Technology), Automatic Control of Micromanipulators, August 2013
- Arévalo Christian (project supervisor: E. Molla), Immersive scale-one a posteriori movement analysis in the CAVE, July 2013
- Jonathan Pittet (Lemopix), Picoprojector, Fiat Lux!, June 2012
- Gaétan Cretton (Logitech), 3D gestures for slides presentation, August 2012
- Pascal Bach (project supervisor: R. Boulic), Human stepping, an experimental study on holonomic behaviors, February 2011
- Johann Conus (project supervisor: S. Gobron), Real-Time Virtual Human Speech: Sentence Lips Movement Aspects, February 2011
Semester projects and Internships
- Julien di Tria (IN): Predictive control of omnidirectional treadmill to mitigate navigation discomfort
- Elif Kurtay and Dylan Vairoli (IN): Cybersickness assessment framework
- Mehdi Sellami and Baudouin Bosc (MT-RO): Machine learning to predict individual susceptibility to cybersickness using physiological signals (brain and gut)
- Cao Yiren and Pierre Amaury (IN): Brain-Gut interaction during cybersickness with active control
- Adriano Viegas Milani (IN): Towards a framework to detect cybersickness
- Camille Montemagni (IN): TELL in play: embodied learning of reading and writing with the interactive whiteboard (collaboration with UNIL)
- Khalil Achache & Ali Raed Ben Pustapha (IN): Using individual data and machine learning to predict cybersickness under certain manipulations
- Nolan Chappuis (IN): Game design and implementation project with Infinadeck
- Florian Marcos & Paul McIntyre (IN): Ready Player N°2
- Alexandre Riou (IN): Comparing traditional and immersive VR training methods for manipulations at human’s body scale
- Su Xingyu (IN): On potential of cybersickness reduction with 3D FOV restrictors
- Abel Vexina (IN): Mum Proof IV: Robustly interacting with fingers in a virtual environment
- Sruti Bhattacharjee (IN-Ech): Real-time emotion recognition in VR
- Sacha Coppey (Ms sem3 IN): Assessing Enfacement in VR
- Pani Vishal (Ms sem1 IN): Mum Proof (the third)
- Paul Oliver (Ms sem1 IN): Smoothing the control of the Infinadeck
- Jeremy Di Dio (Bs sem5 SC): Real-time emotion recognition in VR
- Xavier Theimer-Lienhard (Bs sem5 SC): Smoothing the control of the Infinadeck
- Alexandre Abbey: Presence experiment in MR
- Mathieu Marchand & Robin Plumey: Human skeleton calibration in AR
- Lucas Strauss: MoonVR
- Jonatan Bonjour: VR-based learning of COVID-19 transmission and enhancing self-protection awareness with life-simulation games
- Leo Dupont: Exploring the Impact of Audio on Heart-Rate in Digital Games.
- Eugenie Demeure: Are Physiological Recordings a Reliable Data Source for Digital Gameplay Analysis?
- Paul Olivier: Different Degrees of Interaction and its Impact on Emotional Intensity.
- Emilien Ordonneau: Developing an Adaptable and Dynamic Virtual Environment Testbed
- Karen Preitner: Using a glove with haptic feedback to interact with a virtual control panel
- Hugo Decroix: The effect of the dynamic reduction of transparency in the scene on cybersickness.
- Valentin Jacquat: The effect of the dynamic reduction of the optic flow in the scene on cybersickness
- Ratislav Kovac: Mum Proof: Interacting with fingers in a virtual environment
- David Resin: Developing an Adaptable and Dynamic Virtual Environment Testbed
- Camille Montemagni: Assessment of an Augmented Reality Skeleton Calibration System
- Rodrigo Soares Granja: Placing Sound in Procedurally Dynamic Generated Video Game Levels
- Régis Croset: PackAssist – Using Computer Vision and Deep Learning to Assist a Packaging Operator
- David Pittet: Interacting with Fingers in a Virtual Environment
- Ugo Decroix: Be a Giant: Virtual Body Scaling for Long Distance Navigation in Virtual Environments with Oculus Go
- Boris Zbinden: Haptic interaction in Cardiac Resuscitation Training
- Adrian Baudat: Implementation of a Virtual Reality Application to Evaluate a Break in Presence
- Florian Poma and Arnaud Hennig: Mocap Calibration tool for Virtual Reality
- Hugo Hueber: Passive haptic interaction
- Matthieu Devaux: Exploiting the KATVR device for active walking in VE
- V. du Bois de Dunilac and D. Pittet: Combining AR and VR for skeleton calibration
- L. Pellier: Automatic grasping in VR
- M. Shulga: Game interaction involving real-world elements from a ZED camera
- Ismael Imani: Evaluating eye tracking in VR
Spring & Summer 2016-17
- Dario Pavllo: Robust Real-Time Finger Tracking with Neural Networks
- Fabien Zellweger: Joint center marking tool
- Davor Ljubenkov: Scanning and rigging a 3D model of an individual for further use as an avatar in Unity3D
- Elisa Rosse (Summer internship): Full-body interaction with HTC-Vive trackers
- Philippe Martinet and Benjamin Predivoli: Test and evaluation of the FlexSim-VSM simulation module (supervision by GM and S. Farsah from the FlexSim company)
- Simon Narduzzi: Full-body movement tracking with Kinect and Neuron Mocap
- Loïc Habegger: Wireless interactive demonstration development
- Kristof Szabo: Immersive game play for two players
- Florian Junker: Perception of self-generated movements in a redirected virtual body
- Evci Utku: Stepping pattern strategies analysis and modeling
- Giroux Arthur: Assessment of hemispatial neglect through immersive virtual reality
- Séguy Louis Marie James: Bringing a Real Human into a VR Environment
- Khoury Jad-Nicolas Elie: “Was that me?” Human perception of guided interaction
- Bloch Aurélien François Gilbert: Carte blanche “Haptic station & Oculus”
- Milano Hadrien: Exploring and optimizing the motion capture volume of a Phasespace system with an Oculus HMD
- Imani Ismail: Character Animation in Unity Game Engine
- Gueniat Patrice: Physiological monitoring of immersive interaction
- Bovet Sidney: Leaving the body: transition between first and third person perspective in VR
- Junker Florian: Wireless Interactive Demonstration for IIG
- Perret Yann: Inattention measurement with an eye-tracking device
- Perrin Sami: Integrating peripheral visual perception with a curved perspective in an Oculus HMD
- Bourban Fabien: 3D sculpture using LeapMotion and hand gestures
- Schmitt Fabien: Conveying body pose errors and collisions to the sense of vision through the use of additional viewports in VR
- Shafique Muhammad: Calibration of human joint centers with the Phasespace system
- Jolidon Fabien & Farhan Elias Aran Paul: The influence of heart-beat synchronization on gaming experience (collab. with LNCO)
- Bourban Fabien: Turning in a virtual environment
- Good Xavier: Real-Time Physically-based interaction in the CAVE
- Sheng Cong: Automated Emotion Summary from Large Text Repository
Alumni of VRLAB supervised by Dr. R. Boulic before 2011
Advisor or co-advisor of EPFL PhD students
One selected publication is suggested for each PhD student, to highlight a key contribution. Other publications can be found on the publication page of the group.
Daniel Raunhardt, “Motion Synergies for Real-Time Postural Control in Virtual Environments”, FNRS grant, presented in April 2010. Daniel is now working as a Portfolio and Project Manager at the Cantonal Police of Zurich, Switzerland
- D. Raunhardt, R. Boulic, “Motion Constraint”, The Visual Computer (2009) 25:509-518
Schubert Ribeiro De Carvalho, “Data-Driven Constraint-Based Motion Editing”, presented in December 2009. Schubert is now working as a researcher at Instituto Tecnologico Vale, Brazil
- S. Carvalho, R. Boulic, D. Thalmann, “Interactive Low-Dimensional Human Motion Synthesis by Combining Motion Models and PIK”, Journal of Computer Animation and Virtual Worlds, Wiley, 2007
Benoît Le Callennec, “Interactive Techniques for Motion Deformation of Articulated Figures Using Prioritized Constraints”, FNRS grant, presented in January 2006. Benoît is co-owner of mokastudio.tv, a Swiss startup specializing in postproduction, special effects and motion capture processing. Our research group has established a strong collaboration with Moka Studio for licensing our IK know-how.
- B. Le Callennec, R. Boulic, “Interactive Motion Deformation with Prioritized Constraints”, Graphical Models, 68(2), 175-193, March 2006
Paolo Baerlocher, “Inverse Kinematics Techniques for the Interactive Posture Control of Articulated Figures”, FNRS grant, presented in April 2001. Paolo is Senior Software Development Engineer at Asmodee Digital, Paris, France
- P. Baerlocher, R. Boulic, “An Inverse Kinematic Architecture Enforcing an Arbitrary Number of Strict Priority Levels”, The Visual Computer, 20(6), pp. 402-417, Springer Verlag, 2004
Co-Advisor of PhD students in other universities
Manuel Peinado Gallego, research funded by a Spanish grant at the University of Alcalá, Spain, “Reachability in Complex Environments”, 2004 to 2010
Inmaculada Rodriguez Santiago, research funded by a Spanish grant at the University of Alcalá, Spain, “Simulation of Fatigue and Comfort Optimizing Human Postures”, presented in June 2004
Ramon Mas Sanso, research funded by a Spanish grant at the University of the Balearic Islands, Palma, Spain, “Balance Control of Multiple Supported Articulated Systems for Computer Animation”, presented in September 1996
Supervisor of EPFL PhD students (completed thesis)
Damien Maupu Virtual Human Control for Reaching Tasks (scaling study section), 2010
Ehsan Arbabi Contact modeling and collision detection in human joints, 2008
Pascal Glardon On-Line Locomotion Synthesis for Virtual Humans, 2005
Anderson Maciel A biomechanically-based articulation model for medical applications, 2005
Jean-Sébastien Monzani An Architecture for the Behavioural Animation of Virtual Humans (motion deformation section), 2002
Nathalie Farenc An Informed Environment for Inhabited City Simulation, 2001
Luc Emering Human Action Modeling and Recognition for Virtual Environments, 1999
Tom Molet Real-Time Motion Capture with magnetic systems, 1998
Pascal Becheiraz A behavioral and emotional model for the animation of virtual actors, 1998
Zhiyong Huang Motion Control for Human Animation, 1997
Diploma and Final Master Projects
Antoine Schmid, Hands in the Pocket: personalizing PCA-driven walking gait with Inverse Kinematics, 2006
Damien Sonney, Integration of virtual characters in the northern quarter of EPFL; tracking of real people, 2006
Sebastian Illan, Overlay of virtual characters on the Giacometti square, 2005
Martin Herren, Constraint-based motion editing using Inverse Kinematics, 2003
Markus Burki, Ms Uni Bern, Inverse kinematics plug-in for 3DS Max, 1999
Jean-Sébastien Monzani Motion conversion between different skeletons, 1998
Christophe Bordeux Modeling and Animation of Articulated Structures, 1997
Mahmoud El Husseini Evaluation of the human reachable space in balance, 1997
Selim Saffet Balcisoy Identification and Prediction of the Human Gestures, 1996
Paolo Baerlocher Editing creatures through artificial evolution, 1995
Daniel Balmer Kinematic study of the knee during walking along circular trajectories, 1994
Daniel Margairaz Simulation of articulated figures, 1991
Laurent Bezault 2D animation of a human walking model, 1990