(MS or Semester project) Social media and crowdsourcing for social good
Project Summary: The student will contribute to a multidisciplinary initiative for the use of social media and mobile crowdsourcing for social good. Several projects are available. Specific topics include:
* Social media analytics
* Visualization of social and crowdsourced data
* Smartphone apps for mobile crowdsourcing
Students will be working with social computing researchers studying European and developing cities.
Contact: Prof. Daniel Gatica-Perez [email protected]
(MS or Semester project) A human-centered approach to understand local news consumption
Project Summary: The goal of the project is the design and implementation of a framework to study the consumption of local news in the Swiss multicultural context. The project will include a combination of research methods for experimental design and data analysis, and will be done in the context of the AI4Media European project (A European Excellence Center for Media, Society, and Democracy). The main tasks of the project include: literature review; identification of local news sources; mixed-method experimental design; experiments and data analysis; and writing.
Contact: Prof. Daniel Gatica-Perez
(MS or Semester project) Swiss Alpine Lakes & Citizen Science
Project Summary: A new project in the context of the CLIMACT UNIL-EPFL initiative has the ambition of cataloging all Swiss Alpine Lakes (located above 2000 meters), including collection of water samples and in-depth analysis of their microbial diversity, with a citizen science approach to engage citizens in science and increase awareness of environmental conservation. The master’s project will design a framework that uses existing crowdsourced resources (e.g., Wikipedia) and computer vision, NLP, and visualization to build and validate a prototype of an interactive system that can be used as part of citizen science activities.
Contact: Prof. Daniel Gatica-Perez
(MS or Semester project) Privacy-preserving machine learning methods for diversity-aware mobile computing
Project Summary: The goal of the project is to study privacy-preserving machine learning methods in the context of mobile, diversity-aware computing systems that support the local needs of communities. The project will include work in machine learning and mobile data analysis, and will be done in the context of the multidisciplinary WeNet European project. The main tasks of the project include: literature review; algorithm design and implementation; experiments and data analysis; and writing.
Contact: Prof. Daniel Gatica-Perez
(MS or Semester project) A FATE framework for diversity-aware mobile computing
Project Summary: This project will study and propose a methodology to characterize and validate machine learning methods in the context of diversity-aware mobile computing from the FATE perspective (fairness, accountability, transparency, ethics). Recent approaches include Google’s model cards for model reporting and Microsoft’s Guidelines for Human-AI Interaction. The project will provide a set of best practices for this domain. The main tasks of the project include: literature review; method design and implementation; experiments and data analysis; and writing.
Contact: Prof. Daniel Gatica-Perez
(MS or Semester project) The European AI Act and its impact on European cities
Project Summary: The April 2021 proposal by the European Commission on AI regulation (the AI Act) will impact many sectors of the economy and have important societal implications. This project will study the proposal, analyze its possible effects on how European cities use AI as part of their mission, and make recommendations for the future. The project will be done in the context of the multidisciplinary ICARUS European project, which involves a number of actors in cities and non-governmental organizations. The main tasks of the project include: literature review; conceptual analysis; data collection; data analysis; and writing.
Contact: Prof. Daniel Gatica-Perez
(MS or Semester project) Robot Learning and Interaction
Project Summary: The Robot Learning and Interaction Group proposes various Master and Semester projects with topics related to robotics, machine learning, adaptive control and human-robot interaction.
The list of projects is available here.
Supervisor: Dr. Sylvain Calinon ([email protected])
Keywords: robotics, machine learning, adaptive control, human-robot interaction
(MS or Semester project) Human-Robot Interfaces for Interactive Robot Programming
Project Summary: For robots to be widely adopted across industries and beyond structured manufacturing environments, it is critical for them to be programmable by a wide range of users. Growing research on End User Programming (EUP) for robotics aims to address this problem with novel user interfaces, programming languages, and techniques to aid or fully automate robot programming.
In this project, you will design a Human-Robot Interface for Robot Programming, integrating an existing programming framework based on the iterative Linear Quadratic Regulator (iLQR) [1] on a robotic manipulator, the FRANKA EMIKA Panda. The interface will allow users to compose robot programs defined as sequences of components like goal poses or constraints (e.g., maintaining a gripper orientation), which, in turn, inform the generation of executable robot trajectories.
For more details, see the following link: https://www.idiap.ch/en/join-us/files/incremental-lfd-semester-project.pdf
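As background, the core of iLQR is repeatedly solving a time-varying LQR problem around the current trajectory. The sketch below is a minimal, self-contained illustration of that inner LQR step on a hypothetical 1D double integrator reaching a goal pose; it is not the project's actual framework, and all weights and dynamics are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: finite-horizon LQR via backward Riccati recursion
# for a 1D double integrator. iLQR iterates such LQR solves around
# linearizations of nonlinear robot dynamics.
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])   # state: [position, velocity]
B = np.array([[0.0], [dt]])
Q = np.diag([1.0, 0.1])                  # running state cost (assumed weights)
R = np.array([[0.01]])                   # control effort cost
Qf = np.diag([100.0, 10.0])              # terminal cost pulls the state to the goal
T = 50
goal = np.array([1.0, 0.0])              # hypothetical goal pose: position 1, at rest

# Backward pass: compute time-varying feedback gains K[t]
P = Qf.copy()
K = [None] * T
for t in reversed(range(T)):
    K[t] = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K[t])

# Forward rollout from the origin, regulating the error to the goal
x = np.zeros(2)
for t in range(T):
    u = -K[t] @ (x - goal)
    x = A @ x + B @ u

print(x)  # ends close to the goal state [1, 0]
```

In the interface described above, each user-composed component (a goal pose or a constraint) would contribute cost terms of this kind, and the solver turns them into an executable trajectory.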
Supervisors: Dr. Jean-Marc Odobez, Dr. Sylvain Calinon
Advisors and point of contact: Dr. Mattia Racca (mattia.racca at idiap.ch)
(MS or Semester project) 6D Pose Estimation for Robotic Manipulation Tasks
Project Summary: For robots to interact with their environment in a safe and efficient manner, they need robust ways of estimating the pose (position and orientation) of the objects to be manipulated. This is usually done with RGB-D cameras as input sensors, i.e., cameras that provide not only the color of each pixel (the RGB part) but also how far it is from the camera (the D for “depth”).
Most recent methods achieve this estimation with deep learning architectures, combining convolutional neural networks and fully connected layers.
In this semester project, you will use referenced methods (Wang et al., 2019; He et al., 2020 & 2021), testing their performance and usability on the available datasets. In particular, you will take advantage of the YCB Object Dataset, a popular robotics dataset used to evaluate 6D pose estimation techniques (Calli et al., 2015). This dataset, available in the lab, consists of real-life items and their corresponding 3D models (in the form of point clouds or textured meshes). You will then integrate the methods in a lab setup, where the input images come, in real time, from an RGB-D camera (Intel RealSense D415, Kinect Azure). Finally, you will showcase the robustness of the methods by having the estimated 6D pose of YCB items used as input for a robot grasping task. For more details about the tasks and goals, see the following link:
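As a primer on the geometry involved, the sketch below back-projects a single depth pixel into a 3D point in the camera frame using the pinhole model, the basic step that turns RGB-D frames into the point clouds such methods consume. The intrinsic values are hypothetical round numbers, not taken from the lab's cameras.

```python
import numpy as np

def deproject(u, v, depth, fx, fy, cx, cy):
    """Map pixel (u, v) with depth (meters) to a 3D point in the camera frame,
    using the pinhole camera model: x = (u - cx) * z / fx, y = (v - cy) * z / fy."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Hypothetical intrinsics, roughly the magnitude of a 640x480 depth camera
fx, fy, cx, cy = 600.0, 600.0, 320.0, 240.0

# A pixel right of and below the image center, 0.5 m away,
# maps to a point with positive x and y in the camera frame
point = deproject(400, 300, 0.5, fx, fy, cx, cy)
print(point)
```

Repeating this over every valid depth pixel yields the organized point cloud that a 6D pose estimator aligns against the object's 3D model.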
Supervisor: Dr. Jean-Marc Odobez
Advisors and point of contact: Anshul Gupta (research assistant, [email protected]), Dr. Mattia Racca
(MS or Semester project) Interaction Manager for Human-Robot Interactions
Project Summary: This project aims to provide an easy way to create dialogues for human-robot interactions: for example, a robot introducing itself when someone looks at it, telling a joke if the person asks for one, and then using the person’s laughter to learn whether the joke was a good one. In this project (semester or master), you will use RASA, a dialogue management system used in many chatbots on the web, to handle turn-taking conversations. You will interface this system with a simulated robot as well as a real Pepper robot to create scenarios allowing people to interact with Pepper in different ways, ensuring that Pepper’s responses are appropriate both verbally and non-verbally. For this, you will need to use Pepper’s sensors to make sense of the world and understand people’s speech, relay this information to the dialogue manager in RASA, and process the outputs of the dialogue manager to create real robot behaviors (speech and gestures).
For more details, see the following link: [link to pdf]
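To make the flow concrete, Rasa dialogues of this kind are typically declared as intents, bot actions, and stories in YAML. The fragment below is a minimal illustrative sketch of the joke scenario; the intent and response names are hypothetical, not part of the project.

```yaml
# stories.yml (illustrative): one turn-taking flow for the joke scenario
stories:
  - story: greet then tell a joke
    steps:
      - intent: greet          # e.g., triggered when someone looks at Pepper
      - action: utter_greet    # robot introduces itself
      - intent: ask_joke       # person asks for a joke
      - action: utter_joke     # detected laughter could later score the joke
```

The project's work then sits on either side of such a model: mapping Pepper's perception (gaze, speech) to intents, and mapping the chosen actions to speech and gestures on the robot.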
Supervisors: Dr. Jean-Marc Odobez, Dr. Emmanuel Senft
Point of contact: Dr. Emmanuel Senft ([email protected])