Paper:
Classification of Evoked Emotions Using an Artificial Neural
Network Based on Single, Short-Term Physiological Signals
Shanbin Zhang∗, Guangyuan Liu∗,†, and Xiangwei Lai∗∗

∗College of Electronic and Information Engineering, Southwest University
No.2 of Tiansheng Road, BeiBei District, Chongqing 400715, China
E-mail: {zhshbin, liugy}@swu.edu.cn
∗∗Computer and Information Science College, Southwest University
No.2 of Tiansheng Road, BeiBei District, Chongqing 400715, China
E-mail: laixw@126.com
†Corresponding author
[Received April 27, 2014; accepted October 24, 2014]
Most automated methods for analyzing human emotions from biosignals collect their data using multiple physiological signals, long-term physiological signals, or both. However, this restricts their ability to identify emotions efficiently. This study classifies evoked emotions based on two types of single, short-term physiological signals: electrocardiograms (ECGs) and galvanic skin responses (GSRs). Estimated recognition times are also recorded and analyzed. First, we perform experiments using film excerpts selected to elicit the target emotions of anger, grief, fear, happiness, and calmness; ECG and GSR signals are collected during these experiments. Next, a wavelet transform is applied to the truncated ECG data, and a Butterworth filter is applied to the truncated GSR signals, in order to extract the required features. Finally, the five emotion types are classified by employing an artificial neural network (ANN) based on the two signals. Average classification accuracy rates of 89.14% and 82.29% were achieved in the experiments using ECG data and GSR data, respectively. In addition, the total time required for feature extraction and emotional classification did not exceed 0.15 s for either ECG or GSR signals.
Keywords: emotional recognition, ANN, automatic
recognition, ECG, GSR
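
As a rough illustration of the pipeline summarized in the abstract, the following Python sketch shows how short ECG and GSR segments might be turned into features and classified with a small ANN. It is a minimal illustration, not the authors' implementation: the wavelet family, filter order and cutoff, sampling rate, feature statistics, and network size are all assumptions for demonstration, using the PyWavelets, SciPy, and scikit-learn libraries.

# Illustrative sketch of the pipeline summarized above. NOT the authors'
# implementation: the wavelet family ('db4'), Butterworth order/cutoff,
# sampling rate, feature statistics, and network size are all assumptions.
import numpy as np
import pywt                                   # PyWavelets
from scipy.signal import butter, filtfilt
from sklearn.neural_network import MLPClassifier

def ecg_features(ecg, wavelet="db4", level=4):
    # Discrete wavelet decomposition of a short ECG segment; simple
    # statistics of each sub-band serve as features.
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    return np.array([f(c) for c in coeffs for f in (np.mean, np.std)])

def gsr_features(gsr, fs=32.0, cutoff=1.0, order=4):
    # Low-pass Butterworth filtering of a short GSR segment, followed by
    # simple amplitude statistics as features.
    b, a = butter(order, cutoff / (fs / 2.0), btype="low")
    smooth = filtfilt(b, a, gsr)
    return np.array([smooth.mean(), smooth.std(), smooth.max() - smooth.min()])

# A small feed-forward ANN classifying the five target emotions
# (anger, grief, fear, happiness, calmness), labeled 0-4.
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
# Typical use: clf.fit(X_train, y_train); clf.predict(X_test), where each
# row of X is built with ecg_features() or gsr_features().

In a working system, the filter parameters and feature set would be tuned to the recording hardware and validated against labeled data.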
1. Introduction
Emotions are an important factor in human intelligence, rational decision making, social interaction, perception, memory, learning, creativity, and other facets of human existence [1]. Emotions and their interaction with cognition have attracted increasing attention in cognitive science. Affective computing, one of the foundations required to build harmonious human-machine environments, concerns emotion itself and the factors that generate and influence emotions. Its purpose is to give computers the ability to identify, understand, and adapt to our emotions. Emotional recognition is a critical step in affective computing [2].
For more than a decade, numerous researchers have studied human emotions using facial expressions, voice signals, body language, and other characteristics, and have reported promising results. Real-time automated systems for recognizing human emotional states were proposed by Anderson et al. [3], who employed facial expressions, and Zhang et al. [4], who employed both facial expressions and semantics-based topic detection. Although high classification accuracies have been reported, such as the 95% accuracy that Lucey et al. [5] achieved with a support vector machine (SVM) and the 92.35% accuracy that Whitehill et al. [6] achieved in real-time emotional recognition, these results do not always accurately correspond to actual emotions. Facial expressions, voice signals, and body language are influenced not only by a person's emotional state, but also by their will.
In general, physiological responses, which are induced by the autonomic nervous system (ANS), are difficult to control consciously. As a result, physiological responses and their corresponding signals have attracted increasing attention. Ekman et al. [7] and Winton et al. [8] were among the first to report that physiological signals are related to various emotional states. Subsequently, an increasing number of researchers have studied automated emotion recognition employing physiological signals. Four types of physiological signals were used by Picard et al. to detect eight emotional states; using their technique, they achieved 81% accuracy [9]. Lisetti and Nasoz [10] achieved 84.1% accuracy in detecting five emotional states based on three types of physiological signals. Haag et al. employed six types of physiological signals to detect both valence and arousal, achieving 96.0% and 89.9% accuracy, respectively [11]. All of these research efforts aimed to improve the detection of human emotions and made significant contributions to the domain. Moreover, their findings have been applied to various fields.
However, because many of these research efforts used a