The projects presented on this page are past projects and a legacy of skills from the LSRO (Laboratory of Robotic Systems), which closed in February 2019.

Associated PhD theses: Simon Gallo, Laura Santos, Solaiman Shokur, Ali Sengul, Evren Semur


Multi-Modal Haptics

We systematically investigate human tactile perception and related illusions, and propose promising technological solutions for improved multi-modal haptic displays that exploit these illusions. We identify situations in which humans strongly rely on their haptic senses (e.g. object manipulation in activities of daily living, and tactile exploration), and explore which haptic stimuli are most beneficial for this interaction with real or virtual environments, in collaboration with psychophysics and cognitive neuroscience.

This project is part of the NCCR Robotics research framework.

Visuo-Tactile Integration

Body representation is highly complex and involves the integration of a wide range of multisensory and motor signals. We are currently using a combined robotic and virtual reality approach to reveal the importance of multisensory mechanisms for bodily experience. We investigate interactions and conflicts between the visual and somatosensory modalities using cross-modal congruency effects.

We argue that our findings will not only shed light on the brain mechanisms of hand-object manipulations and robotically-mediated interactions, but may also guide further development of novel robotic platforms and be of relevance for the related field of motor rehabilitation.
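The cross-modal congruency effect mentioned above is typically quantified as the reaction-time cost of a visual distractor at a location incongruent with the tactile target, relative to a congruent one. A minimal sketch of that computation, using entirely hypothetical reaction-time data:

```python
# Cross-modal congruency effect (CCE): mean reaction time on incongruent
# trials minus mean reaction time on congruent trials. A positive CCE
# indicates interference from the mismatching visual distractor.
# All reaction times below are hypothetical, in milliseconds.

def mean(xs):
    return sum(xs) / len(xs)

def congruency_effect(congruent_rts, incongruent_rts):
    """CCE = mean RT (incongruent) - mean RT (congruent)."""
    return mean(incongruent_rts) - mean(congruent_rts)

# Hypothetical trials: tactile target with a visual distractor at the
# same (congruent) vs. a different (incongruent) location.
congruent = [412, 430, 405, 421]
incongruent = [468, 455, 472, 461]

print(congruency_effect(congruent, incongruent))  # prints 47.0
```

Larger CCE values are commonly read as stronger visuo-tactile binding, which is what makes the measure useful for probing conflicts between the two modalities.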

Neuroscience & Surgical Robots

Our aim is to analyze the factors that influence surgeons’ sense of presence in tele-robotic surgery, and to improve the control and usability of surgical robots. For this purpose, we use cognitive methods to study the usability of different surgical robots. These methods will also enable us to evaluate them more objectively.

Brain-Machine Interfaces

We aim to interface the central nervous system to control artificial devices such as a robotic arm or a virtual 3D body. In collaboration with Duke University Medical School (Durham, NC, USA), we have developed a virtual 3D avatar that is controlled directly by brain-derived signals. We are particularly interested in:

1) Assessing the ability of subjects to control a virtual 3D representation of themselves.

2) Evaluating how this virtual body is integrated into the subject's body schema.
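Mapping brain-derived signals to avatar movement is often approached with a linear decoder fitted from recorded neural activity to limb position. The sketch below illustrates that idea on synthetic data (the firing rates, the true mapping, and all dimensions are invented for illustration; this is not the decoder used in the collaboration):

```python
import numpy as np

# Hypothetical sketch: fit a linear decoder from neural firing rates to
# 2D avatar hand positions via least squares. All data are synthetic.

rng = np.random.default_rng(0)
n_samples, n_neurons = 200, 10

# Unknown "true" mapping from firing rates to (x, y) position.
W_true = rng.normal(size=(n_neurons, 2))

# Synthetic firing rates and the noisy positions they produce.
rates = rng.poisson(lam=5.0, size=(n_samples, n_neurons)).astype(float)
positions = rates @ W_true + rng.normal(scale=0.1, size=(n_samples, 2))

# Least-squares fit: W_hat = argmin ||rates @ W - positions||^2
W_hat, *_ = np.linalg.lstsq(rates, positions, rcond=None)

# Decode neural activity into predicted hand positions.
predicted = rates @ W_hat
error = np.mean(np.linalg.norm(predicted - positions, axis=1))
print(round(error, 3))  # small residual, since the mapping is linear
```

In a closed-loop setting, the decoded position would drive the avatar in real time, and the subject's visual feedback of the avatar closes the loop back to the brain.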