Detecting Emotion from EEG Signals Using the Emotive Epoc Device
Rafael Ramirez and Z. Vamvakousis
1 Introduction
2 Background
The firing of neurons in the brain triggers voltage changes. The electrical activity measured by the electrodes in an EEG headset corresponds to the field potentials resulting from the combined activity of many individual neuronal cells in the brain cortex. However, the measured cortical activity is distorted by the tissue and skull between the electrodes and the neurons, which introduces noise and reduces the intensity of the recorded signals. Despite this, EEG measurements still provide important insight into the electrical activity of the cortex.
The frequency of EEG measurements ranges from 1 to 80Hz, with amplitudes of 10 to 100 microvolts [9]. Signal frequencies have been divided into different bands, since specific frequency waves are normally more prominent in particular states of mind. The two most important frequency bands are the alpha band (8-12Hz) and the beta band (12-30Hz). Alpha waves predominantly originate during wakeful, relaxed mental states, and are most visible over the parietal and occipital lobes. Intense alpha wave activity has also been correlated with brain inactivation. Beta wave activity, on the other hand, is related to an active state of mind, and is most prominent in the frontal cortex during intense, focused mental activity [9].
Alpha and beta wave activity may be used in different ways for detecting emotional (arousal and valence) states of mind in humans (more details later). Choppin [3] proposes to use EEG signals for classifying six emotions with neural networks, characterizing valence, arousal and dominance from the EEG. He characterizes positive emotions by high frontal coherence in alpha and high right-parietal beta power. Higher arousal (excitation) is characterized by higher beta power and coherence in the parietal lobe, plus lower alpha activity, while dominance (strength) of an emotion is characterized by an increase in the beta/alpha activity ratio in the frontal lobe, plus an increase in beta activity at the parietal lobe.
Oude Bos [14] describes an approach to recognizing emotion from EEG signals measured with the BraInquiry EEG PET device. He uses a limited number of electrodes and trains a linear classifier based on Fisher's discriminant analysis. He considers audio, visual and audiovisual stimuli, and trains classifiers for positive/negative, aroused/calm and audio/visual/audiovisual.
Takahashi [18] uses a headband of three dry electrodes to classify five emotions (joy, anger, sadness, fear, and relaxation) based on multiple bio-potential signals (EEG, pulse, and skin conductance). He trains classifiers using support vector machines and reports the resulting classification accuracy both using the whole set of bio-potential signals and using EEG signals alone.
Lin et al. [10] apply machine-learning techniques to categorize EEG signals according to subject self-reported emotional states during music listening. They propose a framework for systematically seeking emotion-specific EEG features and exploring the accuracy of the classifiers. In particular, they apply support vector machines to classify four emotional states: joy, anger, sadness, and pleasure.
3 Data Collection
EEG data in this study were collected from 6 healthy subjects (3 males and 3 females, average age 30.16) while they listened to emotion-annotated sounds.
For collecting the data we used the Emotiv EPOC headset, recently released by
the Emotiv Company [7]. This headset consists of 14 data-collecting electrodes
and 2 reference electrodes, located and labeled according to the international
10-20 system [12]. Following the international standard, the available locations
are: AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8 and AF4. Figure
2 shows the 14 Emotiv EPOC headset electrode positions. The EEG signals are
transmitted wirelessly to a laptop computer.
The purpose of the 10-second silent rests between stimuli is to set a neutral emotional state of mind.
4 Data Classification
4.1 Feature Extraction
As mentioned in the previous section, the alpha (8-12Hz) and beta (12-30Hz) bands of EEG signals are of particular interest in emotion research, for both valence and arousal [13]. EOG artifacts (eye movement/blinking) are most dominant below 4Hz, ECG (heart) artifacts appear around 1.2Hz, and EMG (muscle) artifacts above 30Hz. Non-physiological artifacts caused by power lines are normally present above 50Hz [6,5].
Thus, a fortunate byproduct of extracting the alpha and beta frequencies is that much of the noise present in EEG signals is considerably reduced. We apply bandpass filtering to extract the alpha and beta frequency bands: using Fourier frequency analysis, the original signal is decomposed into its frequency components, the unwanted frequencies are removed, and the signal is transformed back with only the frequencies of interest. For this research, we apply the bandpass filter implementation provided by the OpenViBE software [15].
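For illustration, an equivalent bandpass step can be sketched in a few lines of Python. This uses SciPy's Butterworth filter as a stand-in for the OpenViBE implementation, and assumes the EPOC's nominal 128 Hz sampling rate:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 128  # Hz; the Emotiv EPOC's nominal sampling rate

def bandpass(signal, low_hz, high_hz, fs=FS, order=4):
    """Zero-phase Butterworth bandpass filter."""
    nyquist = fs / 2.0
    b, a = butter(order, [low_hz / nyquist, high_hz / nyquist], btype="band")
    return filtfilt(b, a, signal)

# Example: isolate alpha (8-12Hz) and beta (12-30Hz) from one channel.
raw = np.random.randn(10 * FS)   # stand-in for 10 s of raw EEG
alpha = bandpass(raw, 8.0, 12.0)
beta = bandpass(raw, 12.0, 30.0)
```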
From the EEG signal of a person, we determine the level of arousal, i.e. how relaxed or excited the person is, by computing the ratio of the beta and alpha brainwaves as recorded by the EEG. We measure the EEG signal at four locations (i.e. electrodes) in the prefrontal cortex: AF3, AF4, F3 and F4 (see Figure 2). As mentioned before, beta waves are associated with an alert or excited state of mind, whereas alpha waves are more dominant in a relaxed state. Alpha activity has also been associated with brain inactivation. Thus, the beta/alpha ratio is a reasonable indicator of the arousal state of a person.
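A minimal sketch of this arousal estimate follows, assuming the `bandpass` helper above and taking mean squared amplitude as the band power (an illustrative choice; the exact power computation is not spelled out here):

```python
def band_power(signal):
    """Mean squared amplitude of a band-filtered signal."""
    return np.mean(signal ** 2)

def arousal(channels):
    """channels: dict mapping electrode name -> raw signal array.

    Arousal is estimated as the summed beta power over AF3, AF4, F3
    and F4, divided by the summed alpha power over the same sites.
    """
    sites = ["AF3", "AF4", "F3", "F4"]
    beta = sum(band_power(bandpass(channels[s], 12.0, 30.0)) for s in sites)
    alpha = sum(band_power(bandpass(channels[s], 8.0, 12.0)) for s in sites)
    return beta / alpha
```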
In order to determine the valence level, i.e. a negative or positive state of mind, we compare the activation levels of the two cortical hemispheres. This is motivated by psychophysiological research, which has shown the importance of the difference in activation between the cortical hemispheres: left frontal inactivation is an indicator of a withdrawal response, which is often linked to negative emotion, while right frontal inactivation may be associated with an approach response, or positive emotion.
As mentioned before, high alpha activity is an indication of low brain activity, and vice versa. Thus, an increase in alpha activity together with a decrease in beta waves may be associated with cortical inactivation [13]. F3 and F4 are the positions most commonly used for examining this alpha activity, as they are located in the prefrontal lobe, which plays a crucial role in emotion regulation and conscious experience.
Although previous research suggests that hemispherical differences are not an indication of affective valence (feeling a positive or negative emotion), it has been suggested that they are an indication of motivational direction (approach or withdrawal behavior towards the stimulus) [8]. In general, however, affective valence is related to motivational direction. Therefore, comparing hemispherical activation provides a reasonable indicator of valence.
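One straightforward way to operationalize this comparison is frontal alpha asymmetry between F4 and F3. The sketch below continues the helpers above; it is an illustrative formulation consistent with the description, not necessarily the exact computation used:

```python
def valence(channels):
    """Estimate valence from frontal alpha asymmetry.

    Higher alpha power means lower cortical activity, so relatively
    more alpha at F4 (right) than at F3 (left) indicates a more
    active left hemisphere, i.e. approach behavior/positive valence.
    """
    alpha_f3 = band_power(bandpass(channels["F3"], 8.0, 12.0))
    alpha_f4 = band_power(bandpass(channels["F4"], 8.0, 12.0))
    return alpha_f4 - alpha_f3  # > 0 suggests positive valence
```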
4.3 Algorithms
In this paper we evaluate two classifiers, Linear Discriminant Analysis (LDA) [11] and Support Vector Machines (SVM) [4], for classifying an emotion state for each EEG segment. Linear discriminant analysis and the related Fisher's linear discriminant are methods used in statistics, pattern recognition and machine learning to find a linear combination of features which characterizes or separates two or more classes of objects or events. The resulting combination may be used as a linear classifier. LDA is closely related to regression analysis, which also attempts to express one dependent variable as a linear combination of other features. In regression analysis, however, the dependent variable is a numerical quantity, while for LDA it is a categorical variable (i.e. the class label).
SVM, on the other hand, is one of the most popular supervised learning algorithms for solving classification problems. The basic idea of SVM is to project the input data onto a higher-dimensional feature space via a kernel function, in which the data are easier to separate than in the original feature space. Depending on the input data, the iterative learning process of SVM eventually converges to optimal hyperplanes with maximal margins between the classes. These hyperplanes are the decision boundaries for distinguishing different data clusters. Here, we use linear and radial basis function (RBF) kernels to map the data onto a higher-dimensional space. The results reported are obtained using the LDA and SVM implementations in the OpenViBE software [15].
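For readers who want to reproduce the setup outside OpenViBE, the same three classifiers can be sketched with scikit-learn; the feature matrix and labels here are hypothetical placeholders for the per-segment band-power features:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC

# Stand-in data: one row of band-power features per EEG segment,
# labels 0/1 for, e.g., low-versus-high arousal (hypothetical shapes).
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 8))   # 120 segments, 8 features
y = np.repeat([0, 1], 60)       # balanced classes

classifiers = {
    "LDA": LinearDiscriminantAnalysis(),
    "SVM (linear)": SVC(kernel="linear"),
    "SVM (RBF)": SVC(kernel="rbf"),
}
for name, clf in classifiers.items():
    clf.fit(X, y)
```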
We evaluated each induced classifier by performing standard 10-fold cross-validation, in which 10% of the training set is held out in turn as test data while the remaining 90% is used as training data. When performing the 10-fold cross-validation, we leave out the same number of examples per class. In the data sets the number of examples is the same for each class considered, so by leaving out the same number of examples per class we maintain a balanced training set.
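This per-class hold-out corresponds to stratified cross-validation; a minimal scikit-learn sketch, continuing the hypothetical `X`, `y` and `classifiers` above:

```python
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Stratified 10-fold CV keeps the class proportions equal in every
# fold, matching the balanced hold-out described in the text.
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=cv)
    print(f"{name}: {scores.mean():.2%} mean accuracy")
```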
4.4 Results
Given that we are dealing with 2-class classification tasks and that the number of instances in each class is the same, the expected classification accuracy of the default classifier (one which always chooses the most common class) is 50% (measured as the percentage of correctly classified instances). For the high-versus-low arousal and positive-versus-negative valence classifiers, the average accuracies obtained with the SVM radial basis function kernel classifier were 77.82% and 80.11%, respectively. For these classifiers the best per-subject accuracies were 83.35% and 86.33%, respectively. The percentage of correctly classified instances for each subject and each learning method is presented in Figures 4 and 5.
4.5 Discussion
The difference between the results obtained and the accuracy of a baseline classifier (i.e. a classifier guessing at random) confirms that the EEG data contain sufficient information to distinguish between high/low arousal and positive/negative valence states, and that machine learning methods are capable of learning the EEG patterns that distinguish these states. It is worth noting that both learning algorithms investigated (LDA and SVM) produced better-than-random classification accuracies. This supports our statement about the feasibility of training classifiers using the Emotiv EPOC for the tasks reported.
Fig. 4. Classifier accuracies (LDA, SVM with linear kernel, and SVM with radial basis function kernel) for high-versus-low arousal for all subjects
Fig. 5. Classifier accuracies (LDA, SVM with linear kernel, and SVM with radial basis function kernel) for positive-versus-negative valence for all subjects
The accuracy of the classifiers for the same task varies significantly across subjects, even with the same learning method. Subjects producing high accuracies with one learning method tend to produce high accuracies with the other learning methods. These uneven accuracies among subjects may be due to different degrees of emotional response between individuals, or to different amounts of noise in the recordings of different subjects. In any case, it has been reported that there exists considerable variation in EEG responses among different subjects.
It is worth mentioning that in all the experiments performed, the subjects provided no self-assessment information about their emotional states. This contrasts with other approaches (e.g. [10]) where EEG data are categorized according to subject self-reported emotional states. Incorporating self-assessment information would very likely improve the accuracies of the classifiers.
5 Conclusion
We have explored and compared two machine learning techniques for the problem of classifying the emotional state of a person based on EEG data obtained using the Emotiv EPOC headset. We considered two machine learning techniques: linear discriminant analysis and support vector machines. We presented the results of the induced classifiers, which are able to discriminate between high-versus-low arousal and positive-versus-negative valence. Our results indicate that EEG data obtained with the Emotiv EPOC device contain sufficient information to distinguish these emotional states, and that machine learning techniques are capable of learning the patterns that distinguish them. Furthermore, we showed that it is possible to train successful classifiers with no self-assessment information about the emotional states provided by the subjects. As future work, we are particularly interested in systematically exploring different feature extraction methods and learning methods in order to improve the accuracy of the induced classifiers.
References
1. Bradley, M.M., Lang, P.J.: International Affective Digitized Sounds (IADS): Stim-
uli, Instruction Manual and Affective Ratings. The Center for Research in Psy-
chophysiology, University of Florida, Gainesville, FL, USA (1999)
2. Chanel, G., Kronegg, J., Grandjean, D., Pun, T.: Emotion Assessment: Arousal
Evaluation Using EEG’s and Peripheral Physiological Signals. In: Gunsel, B., Jain,
A.K., Tekalp, A.M., Sankur, B. (eds.) MRCS 2006. LNCS, vol. 4105, pp. 530–537.
Springer, Heidelberg (2006)
3. Choppin, A.: EEG-based human interface for disabled individuals: Emotion expression with neural networks. Master's thesis, Tokyo Institute of Technology, Yokohama, Japan (2000)
4. Cristianini, N., Shawe-Taylor, J.: An Introduction to Support Vector Machines.
Cambridge University Press (2000)
5. Coburn, K., Moreno, M.: Facts and artifacts in brain electrical activity mapping.
Brain Topography 1(1), 37–45 (1988)
6. Fatourechi, M., Bashashati, A., Ward, R.K., Birch, G.E.: EMG and EOG artifacts in brain computer interface systems: A survey. Clinical Neurophysiology 118, 480–494 (2007)
7. Emotiv Systems Inc. Researchers, http://www.emotiv.com/researchers/
8. Harmon-Jones, E.: Clarifying the emotive functions of asymmetrical frontal cortical
activity. Psychophysiology 40(6), 838–848 (2003)
9. Kandel, E.R., Schwartz, J.H., Jessell, T.M.: Principles of Neural Science. McGraw-Hill (2000)
10. Lin, Y.-P., Wang, C.-H., Jung, T.-P., Wu, T.-L., Jeng, S.-K., Duann, J.-R., Chen,
J.-H.: EEG-Based Emotion Recognition in Music Listening. IEEE Transactions on
Biomedical Engineering 57(7) (2010)
11. Mika, S., et al.: Fisher Discriminant Analysis with Kernels. In: IEEE Conference
on Neural Networks for Signal Processing IX, pp. 41–48 (1999)
12. Niedermeyer, E., da Silva, F.L.: Electroencephalography, Basic Principles, Clinical
Applications, and Related Fields, p. 140. Lippincott Williams & Wilkins (2004)
13. Niemic, C.P.: Studies of emotion: A theoretical and empirical review of psychophys-
iological studies of emotion. Journal of Undergraduate Research 1, 15–18 (2002)
14. Bos, D.O.: EEG-based Emotion Recognition: The Influence of Visual and Auditory
Stimuli
15. Renard, Y., et al.: OpenViBE: An Open-Source Software Platform to Design, Test, and Use Brain-Computer Interfaces in Real and Virtual Environments. Presence: Teleoperators and Virtual Environments 19(1), 35–53 (2010)
16. Partala, T., Jokiniemi, M., Surakka, V.: Pupillary responses to emotionally
provocative stimuli. In: ETRA 2000: Proceedings of the 2000 Symposium on Eye
Tracking Research & Applications, pp. 123–129. ACM Press, New York (2000)
17. Picard, R.W., Klein, J.: Toward computers that recognize and respond to user
emotion: Theoretical and practical implications. Interacting with Computers 14(2),
141–169 (2002)
18. Takahashi, K.: Remarks on emotion recognition from bio-potential signals. In: 2nd
International Conference on Autonomous Robots and Agents, pp. 186–191 (2004)