Visual Intelligence for Transportation (VITA)

Our research enables a future where self-driving vehicles and delivery or social robots are part of everyday life. Beyond embodied agents, we will also enable our living spaces – our homes, terminals, and cities – to become equipped with ambient intelligence that can sense and respond to human behavior.

Our laboratory pushes the limits of Artificial Intelligence (AI) in the context of transportation, mobility, and built environments. 

Self-driving vehicles and delivery robots need to navigate crowded social scenes in close proximity to humans. Hence, they must understand social conventions and ethics, and obey unwritten common-sense rules. Humans have an innate ability to “read” the behavior of others before making decisions. A machine should have the same capability in order to share the space with humans in a safe, efficient, and trustworthy manner.

To address this grand challenge of co-existence in “last-mile mobility”, we propose a new type of AI that we call socially-aware AI: perception and planning augmented with social intelligence. In other words, it is the ability to effectively perceive, navigate, and negotiate complex social interactions and environments.

Technically, our research brings together Computer Vision (Real-time Perception), Machine Learning (Deep Learning), and Robotics (Crowd-Robot Interaction) to understand human behavior at every scale (enabling Autonomous Moving Agents and Digital Twins).

We have open positions for Ph.D. students and Postdocs (with an expected background in Deep Learning).
Feel free to send your application to [email protected].

Have fun with our latest real-time demo of human pose estimation!
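
If you would like to tinker with the underlying idea yourself, the snippet below is a minimal, hypothetical sketch of multi-person 2D pose estimation on a single image. It is not our demo: it relies on an off-the-shelf Keypoint R-CNN from torchvision purely for illustration, and the file name street_scene.jpg and the 0.8 score threshold are placeholder assumptions.

```python
# Hypothetical sketch: detect people in one image and print their 2D keypoints.
# Uses torchvision's pretrained Keypoint R-CNN (COCO, 17 keypoints per person);
# torchvision >= 0.13 is assumed for the weights="DEFAULT" argument.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

model = torchvision.models.detection.keypointrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = Image.open("street_scene.jpg").convert("RGB")  # placeholder input image

with torch.no_grad():
    prediction = model([to_tensor(image)])[0]  # dict with boxes, scores, keypoints

# Keep confident detections; each person yields a (17, 3) tensor of (x, y, visibility).
for score, keypoints in zip(prediction["scores"], prediction["keypoints"]):
    if score > 0.8:  # illustrative confidence threshold
        print(keypoints[:, :2])  # pixel coordinates of the 17 body joints
```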

Our research is centered on understanding and predicting human social behavior from multi-modal visual data. Our work spans multiple aspects of socially-aware systems:

  1. Collecting multi-modal data at scale,
  2. Extracting coarse-to-fine-grained behaviors in real time,
  3. Designing deep learning methods that learn to predict human social behavior in a fully data-driven way (see the sketch after this list),
  4. Integrating the developed methods into real-world systems such as a vehicle or a socially-aware robot that navigates crowded social scenes.
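
To make the third step concrete, here is a minimal, hypothetical sketch of a fully data-driven predictor: a small PyTorch LSTM encoder-decoder that reads a pedestrian's observed positions and rolls out future positions. The class name, hyper-parameters, and the 8-step/12-step split are illustrative assumptions, not our actual models; a real socially-aware predictor would additionally encode interactions between the people sharing the scene.

```python
# Hypothetical sketch: predict a pedestrian's next positions from past positions.
import torch
import torch.nn as nn


class TrajectoryPredictor(nn.Module):
    def __init__(self, hidden_size: int = 64, pred_len: int = 12):
        super().__init__()
        self.pred_len = pred_len
        self.encoder = nn.LSTM(input_size=2, hidden_size=hidden_size, batch_first=True)
        self.decoder = nn.LSTMCell(input_size=2, hidden_size=hidden_size)
        self.head = nn.Linear(hidden_size, 2)  # maps hidden state to an (x, y) offset

    def forward(self, observed: torch.Tensor) -> torch.Tensor:
        # observed: (batch, obs_len, 2) past positions
        _, (h, c) = self.encoder(observed)
        h, c = h[0], c[0]
        position = observed[:, -1, :]  # start the rollout from the last observed point
        future = []
        for _ in range(self.pred_len):
            h, c = self.decoder(position, (h, c))
            position = position + self.head(h)  # predict a displacement, not an absolute point
            future.append(position)
        return torch.stack(future, dim=1)  # (batch, pred_len, 2)


# Example: 8 observed steps -> 12 predicted steps for a batch of 4 pedestrians.
model = TrajectoryPredictor()
past = torch.randn(4, 8, 2)
print(model(past).shape)  # torch.Size([4, 12, 2])
```

In practice such a model would be trained on recorded trajectories (step 1) extracted from raw sensor data (step 2), and then plugged into the planner of a vehicle or robot (step 4).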