Redefining functional assessment methods through the EPFL Smart Kitchen (ESK)

Performance site: Campus Biotech, Geneva

Background – Assessing the functional use of upper extremities (UEs) in real-life settings is crucial for optimizing rehabilitative interventions for individuals with UE impairments, such as transradial amputees, stroke survivors, and people with cervical spinal cord injuries. The distance between a patient’s home and clinical settings, however, can hinder accurate tracking of neurological and functional recovery. Consequently, it is vital to develop and validate outcome measures of UE function that align with the International Classification of Functioning, Disability, and Health’s performance domain, addressing how individuals with UE impairments perform activities of daily living (ADLs) and instrumental ADLs (iADLs) in their current environment.

To address this need, a “Smart Kitchen” has recently been developed at EPFL. This fully equipped, functional kitchen simulates a home environment, enabling researchers to conduct experiments with individuals with UE impairments in an ecologically valid yet controlled setting. The Smart Kitchen currently features 3D cameras, depth sensors (Microsoft Azure Kinect), sensorized objects and tools (cupboards, drawers, dishes, cutlery, etc.), and egocentric cameras (Microsoft HoloLens headsets with integrated gaze sensors). Wearable sensors (EMG arrays, IMUs, etc.) can also be incorporated during experiments.

Incorporating a multi-modal approach, the Smart Kitchen’s objective is to establish a behavioral fingerprint for each participant, derived from the digital readout of their whole-body movement. This behavioral fingerprint could serve as a potential biomarker, transforming the way we conduct functional assessments of UE movements.

Project description:

Using the advanced array of cameras installed in the ESK and cutting-edge computer vision algorithms, we can now estimate participants’ 3D joint positions. This enables the extraction of features associated with dexterity during task performance. These developments are pivotal to our ability to carry out functional assessments of prosthetic limb users, contributing to a better understanding of, and potential improvements in, prosthetic limb use.
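As an illustration of the kind of dexterity feature this pipeline could extract (this sketch is not part of the project materials; the helper name, the metric choice, and the synthetic data are assumptions), movement smoothness can be quantified from a 3D joint trajectory with a log-dimensionless-jerk measure, a common smoothness metric in motor control:

```python
import numpy as np

def log_dimensionless_jerk(traj, fs):
    """Smoothness of a 3D joint trajectory (T x 3 array sampled at fs Hz).

    Log dimensionless jerk: more negative values indicate jerkier,
    less smooth movement.
    """
    dt = 1.0 / fs
    vel = np.gradient(traj, dt, axis=0)                        # velocity (T x 3)
    jerk = np.gradient(np.gradient(vel, dt, axis=0), dt, axis=0)
    duration = traj.shape[0] * dt
    v_peak = np.linalg.norm(vel, axis=1).max()
    jerk_sq = np.sum(jerk ** 2) * dt                           # integral of squared jerk magnitude
    return -np.log(jerk_sq * duration ** 3 / v_peak ** 2)

# Toy example: a minimum-jerk-like reach vs. the same reach with 8 Hz tremor
fs = 120.0                                                     # hypothetical camera frame rate (Hz)
t = np.linspace(0.0, 1.0, int(fs))
smooth = np.column_stack([t ** 3 * (10 - 15 * t + 6 * t ** 2)] * 3)
tremor = smooth + 0.002 * np.sin(2 * np.pi * 8 * t)[:, None]
print(log_dimensionless_jerk(smooth, fs), log_dimensionless_jerk(tremor, fs))
```

In practice, such a metric would be applied per task segment to trajectories estimated by the ESK's computer vision pipeline, with one value per joint or end effector.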

Activities:

  • Refinement of the ESK pre-processing pipeline
  • Development of a pipeline to extract relevant features associated with dexterity
  • Analyses comparing healthy and amputee participants

Requirements:

The project is 80% analysis and 20% experimentation.

Proficiency in Python programming is a MUST.

Experience with data analysis and machine learning is highly valued.

Contact: [email protected]

Project description:

Leveraging the cutting-edge technology of Microsoft’s HoloLens 2, we have enhanced our capability to accurately identify participants’ visual focal points. Investigating this element of visual attention, we hypothesize that amputee participants may require a heightened degree of visual focus on their prosthetic limbs during task execution. This insight could lead to significant advances in our understanding and optimization of prosthetic limb use.
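One simple way to operationalize this hypothesis (a sketch, not the project's actual analysis; the function name, the per-frame bounding boxes, and the toy data are assumptions) is to compute the fraction of frames in which the gaze point falls on the prosthetic limb, given gaze coordinates projected into the egocentric image and a per-frame limb bounding box from an object detector:

```python
import numpy as np

def gaze_on_target_ratio(gaze_xy, boxes):
    """Fraction of frames where the gaze point lands inside the target box.

    gaze_xy: (T, 2) gaze coordinates in image space
    boxes:   (T, 4) per-frame target bounding boxes (x_min, y_min, x_max, y_max)
    """
    inside = (
        (gaze_xy[:, 0] >= boxes[:, 0]) & (gaze_xy[:, 0] <= boxes[:, 2]) &
        (gaze_xy[:, 1] >= boxes[:, 1]) & (gaze_xy[:, 1] <= boxes[:, 3])
    )
    return inside.mean()

# Toy example: gaze rests on the limb for the first half of the trial only
T = 100
boxes = np.tile([40.0, 40.0, 60.0, 60.0], (T, 1))   # static limb bounding box
gaze = np.full((T, 2), 50.0)                        # gaze on the limb...
gaze[T // 2:] = 200.0                               # ...then shifts away
print(gaze_on_target_ratio(gaze, boxes))            # → 0.5
```

Comparing this ratio between amputee and control participants, per task, would directly test whether prosthesis users allocate more visual attention to their limb.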

Activities:

  • Refinement of the ESK pre-processing pipeline
  • Development of a pipeline to extract relevant features associated with visual attention
  • Analyses comparing healthy and amputee participants

Requirements:

The project is 80% analysis and 20% experimentation.

Proficiency in Python programming is a MUST.

Experience with data analysis and machine learning is highly valued.

Contact: [email protected]

Contact

If none of the projects suits you but you are interested in applying computer vision to prosthetic limb research in general, please feel free to contact us to discuss potential opportunities.

Project PI: Solaiman Shokur ([email protected])

Franklin Leong ([email protected])