Assessing the functional use of the upper extremities (UEs) in real-life settings is crucial for optimizing rehabilitative interventions for individuals with UE impairments, such as transradial amputees, stroke survivors, and people with cervical spinal cord injuries. The distance between a patient’s home and the clinic, however, can hinder accurate tracking of neurological and functional recovery. It is therefore vital to develop and validate outcome measures of UE function that align with the performance domain of the International Classification of Functioning, Disability, and Health (ICF), i.e., how individuals with UE impairments perform activities of daily living (ADLs) and instrumental ADLs (iADLs) in their own environment.
To address this need, a “Smart Kitchen” has recently been developed at EPFL. This fully equipped, functional kitchen simulates a home environment, enabling researchers to conduct experiments with individuals with UE impairments in an ecologically valid yet controlled setting. The Smart Kitchen currently features 3D cameras, depth sensors (Microsoft Azure Kinect), sensorized objects and tools (cupboards, drawers, dishes, cutlery, etc.), and egocentric cameras (Microsoft HoloLens headsets with integrated gaze sensors). Wearable sensors (EMG arrays, IMUs, etc.) can also be incorporated during experiments.
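One practical prerequisite for combining these sensors is temporal alignment: the depth cameras, gaze sensors, and wearables all sample at different rates and on different clocks. The sketch below is a minimal, hypothetical illustration (not the Smart Kitchen's actual pipeline) of nearest-neighbor timestamp matching between a low-rate reference stream and a higher-rate stream, assuming all timestamps have already been mapped to a common time base; the 30 Hz and 100 Hz rates are illustrative placeholders.

```python
from bisect import bisect_left


def nearest_sample(timestamps, t):
    """Return the index of the sample whose timestamp is closest to t.

    Assumes `timestamps` is sorted ascending and expressed in the same
    time base as t (e.g., seconds since a shared sync event).
    """
    i = bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # Pick whichever neighbor is temporally closer.
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1


def align_streams(reference_ts, other_ts):
    """For each reference timestamp, pick the nearest sample in the other stream."""
    return [nearest_sample(other_ts, t) for t in reference_ts]


# Hypothetical clocks: a depth camera at 30 Hz and an IMU at 100 Hz,
# both recording for 3 seconds (timestamps in seconds).
depth_ts = [i / 30 for i in range(90)]
imu_ts = [i / 100 for i in range(300)]
pairs = align_streams(depth_ts, imu_ts)  # one IMU index per depth frame
```

In a real deployment, hardware triggering or cross-correlation of a shared event (e.g., a clap visible to all sensors) would typically establish the common time base before this matching step.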
By combining these modalities, the Smart Kitchen aims to establish a behavioral fingerprint for each participant, derived from a digital readout of their whole-body movement. This behavioral fingerprint could serve as a potential biomarker, transforming the way functional assessments of UE movements are conducted.
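To make the idea of a behavioral fingerprint concrete, the sketch below reduces a single 3D wrist trajectory to a small vector of kinematic features (path length, mean and peak speed, and a crude smoothness proxy). This is an illustrative assumption, not the project's actual feature set: a fingerprint as described above would aggregate many such features across modalities (gaze, EMG, object interactions) and tasks.

```python
import math


def fingerprint(trajectory, dt):
    """Summarize a 3D wrist trajectory into a small kinematic feature vector.

    trajectory: list of (x, y, z) positions in meters, sampled every dt seconds.
    Returns a dict of features; purely illustrative of the fingerprint concept.
    """
    speeds = []
    path_length = 0.0
    for p0, p1 in zip(trajectory, trajectory[1:]):
        d = math.dist(p0, p1)          # Euclidean step length
        path_length += d
        speeds.append(d / dt)          # instantaneous speed
    # Count local speed maxima: fewer peaks suggests a smoother movement.
    n_peaks = sum(
        1 for a, b, c in zip(speeds, speeds[1:], speeds[2:]) if b > a and b > c
    )
    return {
        "path_length_m": path_length,
        "mean_speed_mps": sum(speeds) / len(speeds),
        "peak_speed_mps": max(speeds),
        "n_speed_peaks": n_peaks,
    }


# Example: a straight reach of 0.2 m at constant speed, sampled at 10 Hz.
features = fingerprint([(0, 0, 0), (0.1, 0, 0), (0.2, 0, 0)], dt=0.1)
```

Comparing such feature vectors across sessions, or against a normative population, is one way a fingerprint could function as a longitudinal biomarker of recovery.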