Human-Robot Interaction

Second Hands

We focus on scenarios where robots act as assistants, trainers, or collaborators in physical manipulation tasks. We use machine-learning algorithms to observe and model human behavior, develop robot implementations that replicate tasks in a human-like manner, and detect human intention.

We develop approaches and applications that:

  • learn a task from human demonstrations and facilitate the execution in changing contexts
  • assist in the execution of a task by learning, encoding and transferring knowledge to humans through training
  • collaborate with a human to execute a task, recognizing human intentions during handovers
  • enable human-in-the-loop control and adaptive grasping
  • enable humans and robots to work in a shared space by observing functional reaching motions
  • let robots communicate functional behavior when reacting to uncertainty and resolve conflicts over a shared resource
  • facilitate developments in haptic interaction
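As a concrete illustration of the first point above, learning a task from demonstrations and executing it in a changed context can be sketched with a goal-directed dynamical system. This is a hypothetical, numpy-only toy (all names and the linear model are assumptions, not the group's actual method): a linear attractor dx/dt = A (x - goal) is fit to demonstration trajectories by least squares, then rolled out from an unseen start point, so the motion still converges to the goal.

```python
import numpy as np

def demonstrate(x0, goal, A_true, dt=0.01, steps=200):
    """Generate one synthetic demonstration trajectory."""
    xs = [np.asarray(x0, float)]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x + dt * A_true @ (x - goal))
    return np.array(xs)

def fit_dynamics(trajs, goal, dt=0.01):
    """Least-squares fit of A in dx/dt = A (x - goal)."""
    X, dX = [], []
    for tr in trajs:
        X.append(tr[:-1] - goal)                 # states relative to goal
        dX.append((tr[1:] - tr[:-1]) / dt)       # finite-difference velocities
    X, dX = np.vstack(X), np.vstack(dX)
    # Solve X @ A.T ~= dX in the least-squares sense.
    A_T, *_ = np.linalg.lstsq(X, dX, rcond=None)
    return A_T.T

goal = np.array([1.0, 0.5])
A_true = np.array([[-2.0, 0.0], [0.0, -3.0]])    # stable attractor dynamics
trajs = [demonstrate(s, goal, A_true) for s in ([0, 0], [2, 2], [-1, 1])]
A = fit_dynamics(trajs, goal)

# Reproduce the task from an unseen start: roll out the learned system.
x = np.array([3.0, -1.0])
for _ in range(2000):
    x = x + 0.01 * A @ (x - goal)
print(np.round(x, 3))  # converges to the goal
```

Because the learned system is defined relative to the goal, moving the goal or the start point changes the executed trajectory without retraining, which is the sense in which execution adapts to a changing context.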

Shared Control Through EMG

The range of motion available to someone wearing a prosthetic device is constrained by the device's limited control and functionality. We are working to develop control interfaces that decode the user's intention from their muscular activity (EMG).

We develop approaches and applications that:

  • describe the dynamic behavior of the hand and the fingers during reaching-to-grasp motions
  • extract valuable information from the muscles to decode the intention of the user
  • combine dynamical systems with the user’s intention to provide a natural human-like behavior to the device
  • achieve results in a shared control scheme between the user and wearable robotic device
  • apply our results to prosthetic devices for the upper limbs as well as teleoperated robotic systems
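To make the intention-decoding step above concrete, here is a minimal, hypothetical sketch (numpy only; the synthetic signals, feature choices, and classifier are illustrative assumptions, not the group's pipeline). Windowed two-channel "EMG" is summarized by standard time-domain features (mean absolute value and waveform length) and a grasp intent is decoded with a nearest-centroid rule.

```python
import numpy as np

rng = np.random.default_rng(0)

def features(window):
    """Per-channel mean absolute value and waveform length."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    return np.concatenate([mav, wl])

def make_window(intent, n=200):
    """Synthetic EMG burst: each intent activates one channel more strongly."""
    gain = np.array([1.5, 0.3]) if intent == "open" else np.array([0.3, 1.5])
    return gain * rng.standard_normal((n, 2))

# "Training": average feature vector (centroid) per intent.
centroids = {label: np.mean([features(make_window(label)) for _ in range(30)], axis=0)
             for label in ("open", "close")}

def decode(window):
    """Return the intent whose centroid is nearest in feature space."""
    f = features(window)
    return min(centroids, key=lambda lbl: np.linalg.norm(f - centroids[lbl]))

print(decode(make_window("open")), decode(make_window("close")))
```

In a shared-control scheme, an output like this would not drive the device directly; it would bias or modulate an underlying dynamical system so the device's motion stays smooth and human-like while following the decoded intention.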
