Student Projects

Accurate movement trajectory decoding for high-performance brain-machine interfaces

Brain-machine interfaces (BMIs) hold the promise of establishing a direct communication channel between a paralyzed patient’s brain and the external environment. Implementing modern machine learning algorithms for continuous movement-intention decoding in a BMI is a key open problem. While the classification of discrete motor tasks, such as binary classification of hand opening/closing or detection of finger movements, has been demonstrated in several studies, accurate decoding of continuous movement trajectories (i.e., the position of hands/fingers) remains a challenge.

In this project, we aim to build a regressor that decodes movement trajectories from brain recordings in patients (ECoG and/or spiking activity). Students will study informative biomarkers and implement ML models (both feature-engineered and end-to-end) for the target regression tasks to achieve high decoding performance.
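
As a rough illustration of the feature-engineered route, the sketch below extracts per-channel band-power features from a synthetic multi-channel recording and fits a ridge regressor to a continuous trajectory; the sampling rate, window length, frequency bands, and choice of regressor are placeholder assumptions, not a prescribed pipeline.

```python
# Minimal sketch of a feature-based trajectory decoder (synthetic data standing in
# for real ECoG/spiking recordings; all settings here are illustrative assumptions).
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score

fs = 1000                                   # assumed sampling rate (Hz)
n_channels, n_seconds = 32, 120
rng = np.random.default_rng(0)
ecog = rng.standard_normal((n_channels, n_seconds * fs))   # placeholder recording
trajectory = np.cumsum(rng.standard_normal(n_seconds))      # placeholder 1-D hand position (1 sample/s)

win = fs                                     # one 1 s analysis window per trajectory sample
bands = [(8, 13), (13, 30), (70, 150)]       # mu, beta, high-gamma bands

def band_power_features(window):
    """Average spectral power per channel and frequency band for one window."""
    freqs, psd = welch(window, fs=fs, nperseg=256, axis=-1)
    return np.concatenate([psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1)
                           for lo, hi in bands])

X = np.stack([band_power_features(ecog[:, t * win:(t + 1) * win]) for t in range(n_seconds)])
y = trajectory

split = int(0.8 * n_seconds)                 # simple chronological train/test split
decoder = Ridge(alpha=1.0).fit(X[:split], y[:split])
print("held-out R^2:", r2_score(y[split:], decoder.predict(X[split:])))
```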

Objectives: 1. Understanding of the basic problems and research directions in neural engineering. 2. Applying machine learning algorithms to a real-world (and challenging) application. 3. Gaining experience in biomedical signal processing.

Required knowledge: 1. Programming skills (Python or Matlab). 2. Basic knowledge of ML and signal processing. Note: Neuroscience knowledge is not required.

Preferred knowledge: 1. Ability to implement state-of-the-art ML algorithms and neural features. 2. Some prior knowledge of brain-machine interfaces.

Resource-efficient machine learning algorithm design for on-implant neurological symptom detection

Machine learning (ML) techniques have recently been used for early detection of neurological symptoms. However, although modern ML tools generally achieve high accuracy in neural signal classification, their deployment on neural interface implants is severely limited by the stringent power, area, and latency constraints of these tiny devices. In this project, we aim to develop an efficient ML model to process neural data in real time, with low power consumption, a small on-chip area, and fast inference.

Students are expected to work with electrophysiological recordings such as scalp EEG, ECoG, and spikes (action potentials). Students will implement state-of-the-art ML models and develop ideas for their resource-efficient implementation, evaluate their performance on a given neural task, and estimate the hardware complexity. Students may focus on a particular metric (e.g., detection latency, power dissipation) or seek a reasonable trade-off between several performance metrics.
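
To give a flavor of the complexity bookkeeping involved, the sketch below estimates parameter count, multiply-accumulate (MAC) operations, and weight memory for a hypothetical small dense classifier at several bit-widths; the layer sizes are illustrative only, and the actual model is selected during the project.

```python
# Back-of-the-envelope complexity estimate for a candidate on-implant classifier.
# The layer sizes below are hypothetical, not the project's prescribed architecture.
layers = [(64, 32), (32, 16), (16, 2)]          # (inputs, outputs) of three dense layers

macs = sum(i * o for i, o in layers)            # multiply-accumulates per inference
params = sum(i * o + o for i, o in layers)      # weights + biases

for bits in (32, 8, 4):                         # full-precision vs. quantized weights
    kib = params * bits / 8 / 1024              # weight memory in KiB at this bit-width
    print(f"{bits:>2}-bit weights: {params} params, {macs} MACs/inference, {kib:.2f} KiB")
```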

Objectives: 1. Understanding of the basic problems and research directions in neural engineering. 2. Applying machine learning algorithms to a real-world application. 3. Gaining experience in biomedical signal processing. 4. Learning to optimize the designed algorithms for efficient hardware implementation.

Required knowledge: 1. Programming skills (Python or Matlab). 2. Basic knowledge of ML and signal processing. Note: Neuroscience and advanced hardware knowledge are not required.

Preferred knowledge: 1. Ability to implement state-of-the-art ML algorithms. 2. Familiarity with hardware design metrics. 3. Some prior knowledge of brain-machine interfaces.

Machine learning chip design for on-implant neural decoding

While modern machine learning shows great potential in neural signal classification, its integration into neural interface implants is challenging due to severe constraints on power consumption and implant size. In this project, we aim to design an efficient feature extraction and/or ML processor to classify/decode neural data in real time, with low power consumption, a small on-chip area, and fast inference.

Students are expected to design candidate ML algorithms in Cadence. Various methods for efficient implementation will be explored (e.g., mixed-signal design, approximate computing, …) and the corresponding hardware complexity will be assessed with Cadence simulations. Students may focus on a particular metric (e.g., power dissipation, area, detection latency) or seek a reasonable trade-off between these hardware metrics. The student(s) will work closely with the advisor, PhD students, and postdocs in our group to co-design the hardware and algorithm, evaluate it on real neural datasets, and design and test the prototype integrated circuits.
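
As a purely software-level toy example of one knob in this design space, the sketch below checks how reducing weight and input bit-width (one form of approximate computing) perturbs a dot-product output; the vector sizes and bit-widths are hypothetical, and any real assessment in this project relies on circuit-level Cadence simulations as described above.

```python
# Rough software check of how fixed-point bit-width affects a dot product, as a
# proxy for precision choices in an approximate-computing datapath. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
w = rng.standard_normal(64)              # example decoder weights (hypothetical)
x = rng.standard_normal(64)              # example feature vector (hypothetical)

def quantize(v, bits):
    """Uniform symmetric fixed-point quantization to the given bit-width."""
    scale = np.max(np.abs(v)) / (2 ** (bits - 1) - 1)
    return np.round(v / scale) * scale

exact = w @ x
for bits in (12, 8, 6, 4):
    approx = quantize(w, bits) @ quantize(x, bits)
    print(f"{bits:>2}-bit: relative output error {abs(approx - exact) / abs(exact):.2%}")
```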

Objectives: 1. Understanding of the basic problems and research directions in neural engineering. 2. Implementing machine learning algorithms on chip. 3. Learning to optimize the hardware for low-power and compact implementation.

Required knowledge: 1. Experience with analog/mixed-signal or digital CMOS integrated circuit design in Cadence. Experience with machine learning algorithms and PCB design is beneficial but not necessary. Note: Neuroscience knowledge is not required.

Preferred knowledge: 1. Ability to implement analog/digital circuits on an ASIC. 2. Familiarity with hardware design metrics. 3. Some prior knowledge of neural interfaces.

Detecting mental fatigue and symptoms of psychiatric disorders from continuous neuronal recordings

Early detection of mental fatigue and changes in vigilance could be used to initiate neurostimulation to treat patients suffering from brain injury and various mental disorders. The goal of this project is to explore modern machine learning tools for detecting the onset of mental fatigue and symptoms of neuropsychiatric disorders, using multi-channel electrocorticography (ECoG) signals and local field potentials (LFPs) recorded from primates (and possibly humans). The neural data is provided by our collaborating neurology/neurosurgery labs.

Students will study various correlated biomarkers (i.e., features), implement state-of-the-art ML models to achieve high classification performance, and optimize the chosen algorithms in terms of detection accuracy and latency.
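
A minimal sketch of such a feature-then-classify pipeline is shown below, with synthetic segments standing in for the real ECoG/LFP recordings; the frequency bands, window length, and logistic-regression classifier are placeholder assumptions.

```python
# Minimal sketch of a biomarker-based detector (synthetic ECoG/LFP segments and
# random labels; the real recordings come from the collaborating labs).
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

fs, n_channels, n_segments = 500, 16, 200
rng = np.random.default_rng(0)
segments = rng.standard_normal((n_segments, n_channels, 2 * fs))   # 2 s windows
labels = rng.integers(0, 2, n_segments)                            # placeholder fatigue labels

bands = [(1, 4), (4, 8), (8, 13), (13, 30), (30, 80)]              # delta .. gamma

def features(seg):
    """Per-channel band power for one multi-channel segment."""
    freqs, psd = welch(seg, fs=fs, nperseg=256, axis=-1)
    return np.concatenate([psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1)
                           for lo, hi in bands])

X = np.stack([features(s) for s in segments])
scores = cross_val_score(LogisticRegression(max_iter=1000), X, labels, cv=5)
print("cross-validated accuracy:", scores.mean())
```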

Objectives: 1. Understanding of the basic problems and research directions in neural engineering. 2. Applying machine learning algorithms to a real-world (and challenging) application. 3. Gaining experience in biomedical signal processing.

Required knowledge: 1. Programming skills (Python or Matlab). 2. Basic knowledge of ML and signal processing. Note: Neuroscience knowledge is not required.

Preferred knowledge: 1. Ability to implement and improve state-of-the-art ML algorithms. 2. Some knowledge of neural interfaces.

Attention models for temporal decoding of neural data            

Models used for sequence transduction have recently shifted from recurrent neural networks to attention mechanisms. Such attention-based models (i.e., Transformers) have achieved great success in the field of natural language processing. Neurological recordings (i.e., the electrical activity of the brain recorded by invasive or noninvasive electrodes) can be treated as multi-channel time series. While RNNs have shown promise in many tasks such as motor decoding, the attention mechanism remains largely unexplored in this context.

Students will apply attention-based models to process pre-recorded neural data. Students may need to adapt the model structure to better fit a specific neural task. Decoding performance will be compared with conventional approaches (e.g., LSTM). Limitations will be analyzed and possible solutions proposed.
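
A minimal PyTorch sketch of an attention-based decoder of this kind is shown below; the channel count, sequence length, number of classes, and the omission of positional encoding are simplifying assumptions for illustration only.

```python
# Minimal attention-based decoder sketch (hypothetical shapes; the actual neural
# task, channel count, and labels are defined by the chosen dataset).
import torch
import torch.nn as nn

class NeuralTransformer(nn.Module):
    def __init__(self, n_channels=32, d_model=64, n_heads=4, n_layers=2, n_classes=4):
        super().__init__()
        self.input_proj = nn.Linear(n_channels, d_model)     # embed channels at each time step
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, x):                      # x: (batch, time, channels)
        h = self.encoder(self.input_proj(x))   # self-attention across time steps
        return self.classifier(h.mean(dim=1))  # average-pool over time, then classify

model = NeuralTransformer()
dummy = torch.randn(8, 250, 32)                # 8 trials, 250 time steps, 32 channels
print(model(dummy).shape)                      # -> torch.Size([8, 4])
```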

Objectives: 1. Understanding of the basic problems and research directions in neural engineering. 2. Applying machine learning algorithms to a real-world (and challenging) application. 3. Gaining experience in biomedical signal processing.

Required knowledge: 1. Programming skills (Python or Matlab). 2. Basic knowledge of ML and signal processing. Note: Neuroscience knowledge is not required.

Preferred knowledge: 1. Ability to implement and improve state-of-the-art ML algorithms. 2. Some prior knowledge of brain-machine interfaces.