
Using Deep and Convolutional Neural Networks for
Accurate Emotion Classification on DEAP Dataset
Samarth Tripathi
Columbia University, New York
samarthtripathi@gmail.com
Shrinivas Acharya
Amazon, Hyderabad
acharys@amazon.com
Ranti Dev Sharma
University of California, San Diego
ranti.iitg@gmail.com
Sudhanshi Mittal
Oracle, Hyderabad
sudhanshumittal1992@gmail.com
Samit Bhattacharya
Indian Institute of Technology, Guwahati
samit@iitg.ernet.in
Abstract
Emotion recognition is an important field of research in Brain-Computer Interaction. As technology and the understanding of emotions advance, there are growing opportunities for automatic emotion recognition systems. Neural networks are a family of statistical learning models inspired by biological neural networks, used to estimate functions that can depend on a large number of inputs that are generally unknown. In this paper we seek to leverage this effectiveness of Neural Networks to classify user emotions using EEG signals from the DEAP dataset (Koelstra et al (2012)), which represents the benchmark for emotion classification research.
We explore two different Neural Models, a simple Deep Neural Network and a Convolutional Neural Network, for classification. Our models achieve state-of-the-art classification accuracy, obtaining 4.51 and 4.96 percentage point improvements over Rozgic et al. (2013) for the classification of Valence and Arousal into 2 classes (High and Low), and 13.39 and 6.58 percentage point improvements over Chung and Yoon (2012) for the classification of Valence and Arousal into 3 classes (High, Normal and Low). Moreover, our research demonstrates that Neural Networks can be robust classifiers for brain signals, even outperforming traditional learning techniques.
Introduction
Emotions play a very important role in human decision making, interaction, and cognitive processes (Sreeshakthy et al (2016)). As technology and the understanding of emotions advance, there are growing opportunities for automatic emotion recognition systems. There have been successful research breakthroughs in emotion recognition using text, speech, facial expressions, or gestures as stimuli. However, one of the new and exciting directions in which this research is heading is EEG-based technology for automatic emotion recognition, which is becoming less intrusive and more affordable, paving the way for pervasive adoption in healthcare applications. In this paper we focus on classifying user emotions from Electroencephalogram (EEG) signals, using various neural network models and advanced techniques. For our research we
particularly explore Deep Neural Networks and Convolutional Neural Networks, using advanced machine learning techniques like Dropout, for emotion classification. A neural network is a machine designed to model the way the brain performs a particular task, imitating the key characteristics of the brain as a complex, non-linear, and parallel computer (Haykin (2004)); such networks possess the ability to model and estimate complex functions that depend on a multitude of factors. Moreover, recent developments in machine learning have shown neural networks to provide leading accuracy in tasks as varied as text and sentiment analysis (Kim (2014)), image recognition (Krizhevsky et al (2012)), and speech analysis.
Recently, the affective EEG benchmark database DEAP (Koelstra et al (2012)) was published, which presents a multimodal data set for the analysis of human affective states.
The electroencephalogram (EEG) and peripheral physiolog-
ical signals of 32 participants were recorded as each watched
40 one-minute long excerpts of music videos. Participants
rated each video in terms of the levels of arousal, valence,
like/dislike, dominance, and familiarity. A 32-channel Biosemi ActiveTwo device was used to record the EEG signals while the subjects watched the videos. In addition to the EEG recordings, further channels captured peripheral physiological signals such as skin temperature and respiration. Methods and results were presented for single-trial classification of arousal, valence, and like/dislike ratings using the modalities of EEG, peripheral physiological signals, and multimedia content analysis. Automatic classification of human emotion using EEG signals has been researched in detail by various scholars. However, with the release of the DEAP data, the research community has a standardized dataset on which to effectively measure and compare the accuracy of various classification algorithms.
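To make the data handling concrete, the following is a minimal sketch of loading one participant from the preprocessed (Python) release of DEAP, in which each pickled file (s01.dat through s32.dat) holds a 'data' array of shape 40 trials x 40 channels x 8064 samples and a 'labels' array of 40 trials x 4 ratings (valence, arousal, dominance, liking). The file path and the rating thresholds used to form 2 or 3 classes below are illustrative assumptions, not necessarily the exact settings used in this work.

import pickle
import numpy as np

def load_deap_participant(path):
    # Each preprocessed DEAP file is a pickled dict with 'data' (40, 40, 8064)
    # and 'labels' (40, 4); the first 32 channels are EEG, the rest peripherals.
    with open(path, 'rb') as f:
        subject = pickle.load(f, encoding='latin1')  # files were pickled under Python 2
    eeg = subject['data'][:, :32, :]      # keep only the 32 EEG channels
    ratings = subject['labels'][:, :2]    # valence and arousal, rated on a 1-9 scale
    return eeg, ratings

def to_classes(ratings, n_classes=2):
    # Illustrative thresholds (assumption): >5 counts as High for 2 classes;
    # 3.5 and 6.5 cut-offs give Low/Normal/High for 3 classes.
    if n_classes == 2:
        return (ratings > 5).astype(int)
    return np.digitize(ratings, [3.5, 6.5])

eeg, ratings = load_deap_participant('data_preprocessed_python/s01.dat')  # hypothetical path
labels = to_classes(ratings, n_classes=2)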
We use two different Neural Models for classification, the first being a Deep Neural Network comprising four neural layers. The model contains an initial layer of 5000 nodes, followed by layers of 500 and 1000 neurons respectively, before an output layer of 2 or 3 nodes depending on the number of classification classes. All the layers are fully connected, with Softmax (Dunne and Campbell (1997)) acting as the activation, and use Dropout (Srivastava et al (2014))
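As a rough illustration, the fully connected architecture described above could be written in Keras as follows. The layer sizes and Softmax activations follow the description in the text; the input dimension, dropout rate, optimizer, and loss are illustrative assumptions rather than the settings reported here.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout

def build_dnn(input_dim, n_classes=2, dropout_rate=0.5):
    # Sketch of the 4-layer fully connected network: 5000 -> 500 -> 1000 -> n_classes,
    # with Softmax activations and Dropout between layers (the rate is an assumption).
    model = Sequential([
        Dense(5000, activation='softmax', input_shape=(input_dim,)),
        Dropout(dropout_rate),
        Dense(500, activation='softmax'),
        Dropout(dropout_rate),
        Dense(1000, activation='softmax'),
        Dropout(dropout_rate),
        Dense(n_classes, activation='softmax'),  # 2 or 3 output classes
    ])
    model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
    return model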