Assessing the emotional states evoked in users during multimedia consumption has received a great deal of attention with recent advances in multimedia content distribution technologies and increasing interest in personalized content delivery. Physiological signals, such as the electroencephalogram (EEG) and peripheral physiological signals, have received less consideration for emotion recognition than modalities such as facial expression and speech, although they hold potential as alternative or supplementary channels. This page presents our work on constructing a dataset of EEG and peripheral physiological signals acquired during the presentation of music video clips, which is made publicly available.

Our paper “Affect Recognition Based on Physiological Changes During the Watching of Music Videos”, published in ACM Transactions on Interactive Intelligent Systems and authored by Ashkan Yazdani, Jong-Seok Lee, Jean-Marc Vesin, and Touradj Ebrahimi, describes the dataset acquisition procedure in detail, including stimuli selection, signal acquisition, self-assessment, and signal processing. In particular, we propose a novel asymmetry index based on relative wavelet entropy for measuring the asymmetry in the energy distribution of EEG signals, which is used for EEG feature extraction. We then present classification systems based on EEG and peripheral physiological signals.
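To make the idea of a relative-wavelet-entropy asymmetry index more concrete, the sketch below computes a wavelet energy distribution for a left-hemisphere and a right-hemisphere EEG channel and compares them with a relative (KL-style) entropy. This is only an illustrative formulation, not the exact definition from the paper: the Haar wavelet, the number of decomposition levels, and the symmetrized divergence are all assumptions, and `asymmetry_index` is a hypothetical helper name.

```python
import numpy as np

def haar_dwt_energies(x, levels=4):
    """Multi-level Haar DWT: return the energy of the detail coefficients
    at each level, plus the energy of the final approximation.
    Assumes len(x) is divisible by 2**levels."""
    x = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
        detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
        energies.append(np.sum(detail ** 2))
        x = approx
    energies.append(np.sum(x ** 2))
    return np.array(energies)

def relative_wavelet_entropy(p, q, eps=1e-12):
    """KL-style divergence between two normalized wavelet-energy
    distributions (eps avoids log of zero)."""
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def asymmetry_index(left, right, levels=4):
    """Illustrative asymmetry index between a left- and a right-hemisphere
    channel: the symmetrized relative wavelet entropy (assumed form)."""
    p = haar_dwt_energies(left, levels)
    q = haar_dwt_energies(right, levels)
    return 0.5 * (relative_wavelet_entropy(p, q)
                  + relative_wavelet_entropy(q, p))

# Toy usage: identical channels give an index of zero; channels with
# different spectral content give a positive index.
t = np.arange(256)
left = np.sin(0.1 * t)
right = np.sin(0.3 * t)
print(asymmetry_index(left, left))   # close to 0
print(asymmetry_index(left, right))  # positive
```

The key property this captures is that the index vanishes when the two hemispheres distribute their signal energy identically across frequency bands and grows as the distributions diverge, which is what makes it usable as an EEG asymmetry feature.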

The dataset produced in this study is freely available for download:

  • List of music video clips.
  • EEG and peripheral physiological data, and subjects’ self-assessments.
  • Description of the dataset.

The data can be downloaded from the DEAP dataset. By downloading the data, you agree to the copyright notice below. If you use our data in your own publications, please reference this website and our paper.

Permission is hereby granted, without written agreement and without license or royalty fees, to use, copy, modify, and distribute the data provided and its documentation for research purposes only. The data provided may not be commercially distributed. In no event shall the Ecole Polytechnique Federale de Lausanne be liable to any party for direct, indirect, special, incidental, or consequential damages arising out of the use of the data and its documentation. The Ecole Polytechnique Federale de Lausanne specifically disclaims any warranties. The data provided hereunder is on an “as is” basis, and the Ecole Polytechnique Federale de Lausanne has no obligation to provide maintenance, support, updates, enhancements, or modifications.


The work presented here was partially supported by the European Network of Excellence PetaMedia (FP7/2007-2011) and the Swiss National Science Foundation in the framework of NCCR Interactive Multimodal Information Management (IM2). The authors would like to thank Christian Muehl, Mohammad Soleymani, and Sander Koelstra for their help and participation in data acquisition.