PROJECT
ON
MUSIC EVOKED EMOTION
CLASSIFICATION OF EEG SIGNAL
USING MACHINE LEARNING
Submitted by
ACKNOWLEDGEMENTS

It is our greatest pleasure to thank everyone who was associated with our thesis project; it would not have been possible without their effort and encouragement. We would like to thank our supervisor, Mr. Vicky Suri, for helping and guiding us with the topic and for his constant support through any problems we faced while proceeding with the project. We are deeply grateful for all the encouragement and support during the course of this work.

We extend our heartfelt thanks to all the faculty members. We are grateful to the institute for providing us with the environment for research and development. Finally, before closing, we would like to thank our parents, who have helped us throughout the project and encouraged us to aim for the best.
DECLARATION

This is to certify that the project entitled "Music Evoked Emotion Classification of EEG Signal Using Machine Learning" by Vivek Majumdar, Kartik Nishad, Vivek Nigam and Vishal Chopra is a record of bonafide work carried out by us, in the Division of Instrumentation and Control Engineering, Netaji Subhas Institute of Technology, University of Delhi, New Delhi, in partial fulfillment of the requirements for the award of the degree of Bachelor of Engineering in Instrumentation and Control Engineering, University of Delhi.

The results presented in this thesis are original and have not been submitted to any other university or institute for the award of any degree.
CERTIFICATE

This is to certify that the project entitled "Music Evoked Emotion Classification of EEG Signal Using Machine Learning" by Vivek Majumdar, Kartik Nishad, Vivek Nigam and Vishal Chopra is a record of bonafide work carried out by them, in the Division of Instrumentation and Control Engineering, Netaji Subhas Institute of Technology, University of Delhi, New Delhi, under our supervision and guidance, in partial fulfillment of the requirements for the award of the degree of Bachelor of Engineering in Instrumentation and Control Engineering.

The results presented in this thesis are original and have not been submitted to any other university or institute for the award of any degree.
CERTIFICATE

This is to certify that the project entitled "Music Evoked Emotion Classification of EEG Signal Using Machine Learning" by Vivek Majumdar, Kartik Nishad, Vivek Nigam and Vishal Chopra is a record of bonafide work carried out by them, in the Division of Instrumentation and Control Engineering, Netaji Subhas Institute of Technology, University of Delhi, New Delhi, in partial fulfillment of the requirements for the award of the degree of Bachelor of Engineering in Instrumentation and Control Engineering, University of Delhi.
PLAGIARISM REPORT

Similarity index: 19%
Internet sources: 14%
Publications: 7%
Student papers: 11%

PRIMARY SOURCES

1. Submitted to Vel Tech University (student paper) - 2%
2. ctms.engin.umich.edu (internet source) - 1%
3. es.mathworks.com (internet source) - 1%
4. Submitted to Edith Cowan University (student paper) - 1%
5. www.circuitstoday.com (internet source) - 1%
6. www.mathworks.com (internet source) - 1%
7. en.wikibooks.org (internet source) - 1%
8. Submitted to Clemson University (student paper) - 1%
16. Submitted to RMIT University (student paper) - <1%
17. fachschaft.etec.uni-karlsruhe.de (internet source) - <1%
18. www.monografias.com (internet source) - <1%
19. www.dtic.mil (internet source) - <1%
20. www.coursehero.com (internet source) - <1%
21. www.cl.cam.ac.uk (internet source) - <1%
22. en.wikipedia.org (internet source) - <1%
25. www.slideshare.net (internet source) - <1%
26. www.scribd.com (internet source) - <1%
ABSTRACT

Emotions play an important role in the daily life of human beings, and the need for and importance of automatic emotion recognition has grown with the increasing role of Brain-Computer Interface (BCI) applications. Here, the emotions are evoked by musical stimuli. We use Independent Component Analysis (ICA) to remove artifacts from the signal, and logistic regression and other machine learning algorithms to classify the emotions from the recorded EEG.
LIST OF TABLES
Table 2.1: Nomenclature of electrodes in the 10-20 system
LIST OF FIGURES
LIST OF ABBREVIATIONS AND SYMBOLS
Abbreviations Definition
INDEX OF EQUATIONS
TABLE OF CONTENTS

ACKNOWLEDGEMENTS
DECLARATION
CERTIFICATE
CERTIFICATE
PLAGIARISM REPORT
ABSTRACT
LIST OF TABLES
LIST OF FIGURES
LIST OF ABBREVIATIONS AND SYMBOLS
INDEX OF EQUATIONS
TABLE OF CONTENTS

Chapter 1: Introduction
1.1 Motivation
1.2 Outline of the thesis

Chapter 2: Introduction to EEG and Emotions
2.1 Introduction
2.2 Types of electrode placement
2.2.1 International 10-20 system
2.2.2 Higher resolution system
2.3 Types of EEG wave patterns
2.4 Emotion classification

Chapter 3: Signal Processing of EEG
3.1 Introduction
3.2 Data acquisition
3.3 Signal processing in MATLAB
3.4 Spectral analysis of EEG using MATLAB

Chapter 4: Data Pre-Processing

Chapter 5: Introduction to Machine Learning
5.1 Introduction
5.1.1 Supervised learning
5.1.2 Unsupervised learning
5.1.3 Reinforcement learning
CHAPTER 1
INTRODUCTION
This chapter entails the motivation behind the study and a basic outline of the thesis. The literature review and recent advancements in this field of study have also been discussed.
1.1 Motivation
Stress is a serious problem in today's society, owing to the pace of modern life and the demands of the information society. Chronic stress has a bad influence on mind and body and causes various illnesses. Brain-computer interfaces (BCIs), which can help address such problems, are emerging interfaces that have a lot of potential as an application of signal processing and machine learning techniques. BCIs are not only used to control a computer or a device, but also for music therapy and recommendations [2].
The fast Fourier transform (FFT) is used to analyze the EEG, and different frequency bands of the EEG have been defined. In music-related research, the band powers are important parameters. The EEG is the signal measured by sensors attached to the head and then amplified; it is generated by the brain's electrical activity. The signal changes with external stimuli such as sound and with inner stimuli such as feelings; therefore, the EEG pattern can change with the subject's emotional state.
In this research, EEG data is processed with removal of the artifacts using ICA. Classification of the EEG signal is done using different machine learning algorithms to correlate with the manual assessment of the musical stimuli, and the accuracy of each algorithm is calculated.

1.2 Outline of the thesis

The rest of the thesis is organised as follows. Chapter 1 gives a basic introduction to the research, describing the motivation and purpose of the thesis. Chapter 2 introduces EEG and emotions. In Chapter 3, signal processing of EEG using MATLAB with the EEGLAB toolbox is studied. Chapter 4 describes how the data is pre-processed, that is, how feature extraction and selection are done using ICA. Chapter 5 introduces machine learning and applies classification algorithms to the EEG data.
CHAPTER 2
INTRODUCTION TO EEG AND EMOTIONS
2.1 Introduction
Electroencephalography (EEG) is a technique that records the electrical activity of the brain. During an EEG test, small electrodes of cup or disc type are placed on the scalp. They pick up the brain's electrical signals and send them to a machine called an electroencephalograph, which records them.

In 1875, Richard Caton presented his findings about electrical phenomena of the exposed cerebral hemispheres of rabbits and monkeys. In 1890, Adolf Beck published his investigations of the spontaneous electrical activity of the brains of rabbits and dogs. In 1924, Hans Berger recorded the first human EEG.

An EEG is mainly used when there is a need to diagnose and manage epilepsy. It can also be used to investigate other conditions such as encephalitis, dementia, head injuries, brain tumors and hemorrhage. An EEG can identify areas of the brain that are not working properly. EEGs are also used to determine the level of brain function in comatose patients.
2.2 Types of electrode placement

2.2.1 International 10-20 system

The international 10-20 system is a standardized method of placing scalp electrodes, developed to maintain standardized testing and to ensure that data from different subject studies could be compiled and compared. The "10" and "20" refer to the fact that the actual distances between adjacent electrodes are either 10% or 20% of the total front-back or right-left distance of the skull. Each electrode has a unique letter to locate the different areas of the brain, as shown in table 2.1.
Table 2.1 Nomenclature of Electrodes in the 10-20 System

No.  Lobe          Letter
1    Pre-frontal   Pf
2    Frontal       F
3    Temporal      T
4    Parietal      P
5    Occipital     O
6    Central       C

All the electrodes on the right-hand side of the head are referred to using even numbers, whereas all the left-hand electrodes use odd numbers. All the midline electrodes are referred to with the letter "z" (zero). Measurement is done separately for each of the electrodes F3, F4, P3 and P4: it is measured front to back and side to side, and then diagonally through the C3 and C4 points.
2.2.2 Higher resolution system

The 10-10 system extends the 10-20 system by using more electrodes. The spacing between all electrodes is done in 10% divisions, and the new electrodes are named according to the Modified Combinatorial Nomenclature (MCN). The MCN system uses the numbers 1, 3, 5, 7, 9 for the left hemisphere of the skull, representing 10%-50% of the inion-to-nasion distance respectively. The new MCN codes for intermediate electrodes are shown in table 2.2.
Table 2.2 New MCN Codes for Intermediate Electrodes

No.  Code  Between electrodes
1    AF    Fp and F
2    FC    F and C
3    FT    F and T
4    CP    C and P
5    TP    T and P
6    PO    P and O

New MCN codes after the introduction of new electrodes in the former electrode system:
T3 is now T7
T4 is now T8
T5 is now P7
T6 is now P8
2.3 Types of EEG wave patterns
DELTA WAVE
Frequency range 0.2-4 Hz
Deep sleep condition of adults
THETA WAVES
Frequency range 4-8 Hz
Relaxing, day dreaming etc.
ALPHA WAVES
Frequency range from 8-16 Hz
These waves represent non-arousal emotions.
BETA WAVES
Frequency range 16– 32 Hz
These waves represent arousal emotion.
GAMMA WAVES
Frequency range approximately 32–48 Hz
Perception that combines two different senses, such as sound and sight
Short-term memory matching of recognized objects, sounds, or tactile sensations
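The band definitions listed above translate directly into a band-power computation, which Chapter 3 later performs in MATLAB. As a minimal sketch only, the same idea using SciPy's Welch PSD estimate; the 256 Hz sampling rate matches the dataset described in Chapter 3, and the test signal here is synthetic:

```python
import numpy as np
from scipy.signal import welch

# EEG bands as defined above (Hz)
BANDS = {"delta": (0.2, 4), "theta": (4, 8), "alpha": (8, 16),
         "beta": (16, 32), "gamma": (32, 48)}

def band_powers(signal, fs=256):
    """Integrate the Welch PSD estimate over each EEG band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    return {name: np.trapz(psd[(freqs >= lo) & (freqs < hi)],
                           freqs[(freqs >= lo) & (freqs < hi)])
            for name, (lo, hi) in BANDS.items()}

# Synthetic check: a 10 Hz sine (alpha range) plus mild noise
rng = np.random.default_rng(0)
t = np.arange(0, 4, 1 / 256)
x = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
powers = band_powers(x)   # the alpha band dominates for this signal
```

Relative band powers of this kind are the features most often correlated with emotional state in music-related EEG studies.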
2.4 Emotion classification

Emotion classification is done on the basis of two factors: (1) arousal and (2) valence. We react to a stimulus and experience the associated emotion at the same time; we feel emotions and experience physiological reactions, such as sweating, trembling and muscle tension, simultaneously.

2.4.1 Valence

Valence tells whether an emotion is positive or negative, that is, how pleasant or unpleasant it is. Classification of emotion by valence is done according to how pleasant or unpleasant the emotion is: for example, happiness would be pleasant, whereas fear would be unpleasant.

2.4.2 Arousal

Arousal tells the intensity of activation of the body. Data from the EEG, heart rate and skin conductance can be combined to measure it.
Fig 2.4 Emotion Classification
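The arousal-valence plane of fig 2.4 can be pictured as four quadrants determined by the signs of the two scores. A toy lookup; the quadrant labels here are illustrative, not values taken from the thesis:

```python
def quadrant(valence, arousal):
    """Map a (valence, arousal) pair, each in [-1, 1], to a quadrant label.

    Positive valence means a pleasant emotion; positive arousal means a
    high-activation emotion. The label strings are illustrative only.
    """
    if valence >= 0:
        return "happy/excited" if arousal >= 0 else "calm/relaxed"
    return "angry/afraid" if arousal >= 0 else "sad/bored"
```

For example, `quadrant(0.8, 0.6)` falls in the pleasant, high-arousal quadrant, while `quadrant(-0.5, -0.5)` falls in the unpleasant, low-arousal one.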
CHAPTER 3

SIGNAL PROCESSING OF EEG
3.1 Introduction
This chapter includes the data acquisition of the EEG signal and the workings of the EEGLAB toolbox in MATLAB for signal processing. Musical signals are used as stimuli to evoke EEG potentials. The data is then filtered according to the EEG frequency range; after filtering, the data is ready for the removal of unwanted artefacts.

3.2 Data acquisition

The dataset was obtained from an online platform hosting the EEG datasets collected in a BCI competition. The data was collected while the subjects were placed in a noise-free room. There were three sessions; in each trial, another 15 seconds of evoked EEG was recorded. At the end of each trial, participants perform a self-evaluation of valence and arousal.
The EEG signals were continuously recorded by a 16-channel EEG system, as shown in fig 3.1, with a sampling rate of 256 Hz. The electrodes followed the international 10-20 placement system.
3.3 Signal processing in MATLAB

To process the given EEG dataset, we first have to install the EEGLAB toolbox in MATLAB. After installing EEGLAB, add the EEGLAB source file destination path to the MATLAB search path.
The next step after installing EEGLAB in MATLAB is to import the dataset into EEGLAB. To import a dataset, click on File→ Import data→ Using EEGLAB functions and plugins→ From ASCII/float file or MATLAB array. After that, a dialog box will open, in which you have to select the dataset file, sampling rate and channel locations, as shown in Fig 3.3.
Fig 3.4 After importing dataset into EEGLABv14.1.2
After importing the dataset into EEGLAB as shown in fig 3.4, the data is filtered using FIR filters. The EEG signal is spread over the frequency range of roughly 0.2-50 Hz, so to gather only the EEG signal we filter the data using a 0.2-50 Hz band-pass filter (BPF). To filter the data in the EEGLAB toolbox, click on Tools→ Filter the data→ Basic FIR filter (new, default). After that, a dialog box will appear in which the filter cut-off frequencies are entered.
Fig 3.5 FIR filter window
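The band-pass step performed in the dialog of fig 3.5 can also be reproduced outside the GUI. A sketch using SciPy's window-method FIR design, assuming the 256 Hz sampling rate of the dataset; the raw signal here is synthetic, and the zero-phase filtering mimics EEGLAB's default behaviour:

```python
import numpy as np
from scipy.signal import firwin, filtfilt

fs = 256
# 0.2-50 Hz band-pass FIR; an odd tap count gives a type-I linear-phase filter
taps = firwin(513, [0.2, 50], pass_zero=False, fs=fs)

# Synthetic raw signal: 10 Hz in-band component plus 60 Hz out-of-band noise
t = np.arange(0, 8, 1 / fs)
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)

# Zero-phase filtering (forward and backward pass)
clean = filtfilt(taps, 1.0, raw)
```

After filtering, the 60 Hz component is strongly attenuated while the 10 Hz component passes through essentially unchanged.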
After filling in all the fields, a plot of the frequency response pops up, showing the filtered data.

3.4 Spectral analysis of EEG using MATLAB

Spectral analysis of EEG is done to retrieve the power and energy of the signal from the EEG data. It is a kind of feature extraction in which we extract the power spectral density (PSD) using the FFT. To start the processing, we first import the dataset, then write a script to compute the PSD of the signal and analyze it.
After importing the data, we write code for the power spectral density (PSD), applying the fast Fourier transform (FFT) to obtain the values in the frequency domain, as shown below (N is the FFT length and Ns the number of samples).

%%%%%%%%%%%%%%%%
psd = abs(fft(xtest{:,1}, N)).^2 / Ns;
%%%%%%%%%%%%%%%%
Then after writing the code we will plot the power spectral density as shown in fig
3.8.
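For reference, the MATLAB PSD snippet above has a direct NumPy counterpart. In this sketch, `x` is a stand-in random channel, since the actual dataset is not reproduced here:

```python
import numpy as np

fs = 256                 # sampling rate (Hz), as in the dataset
N = 512                  # FFT length
x = np.random.default_rng(0).standard_normal(N)  # stand-in EEG channel

# One-sided periodogram estimate of the power spectral density
X = np.fft.rfft(x, N)
psd = np.abs(X) ** 2 / (fs * N)
freqs = np.fft.rfftfreq(N, d=1 / fs)  # 0 .. fs/2
```

`freqs` runs from 0 up to the Nyquist frequency fs/2 = 128 Hz, so the 0.2-50 Hz EEG range occupies the lower part of the plot.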
CHAPTER 4
DATA PRE-PROCESSING
4.1 Introduction
In this chapter, we will get to know how and why the data is pre-processed. Pre-processing consists of feature extraction and feature selection. Feature extraction includes ICA, PCA, LDA, etc., whereas feature selection includes the filter, wrapper and embedded strategies. For the removal of artefacts from the EEG signal, we apply ICA to the dataset; a simple way to apply ICA in MATLAB is to use the EEGLAB toolbox.
Standardization rescales the distribution of the data. It is done by subtracting the mean value from the individual values and then dividing by the standard deviation:

x' = (x - mean(x)) / std(x)

Standardized values have zero mean and unit standard deviation.

Min-max normalization rescales the data. It is done by subtracting the minimum value from the individual values and then dividing by the range:

x' = (x - min(x)) / (max(x) - min(x))

Normalized values lie between 0 and 1 (or any other chosen range).
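Both rescalings described above are one-liners in scikit-learn; the array here is illustrative only:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler, MinMaxScaler

x = np.array([[1.0], [2.0], [3.0], [4.0]])

standardized = StandardScaler().fit_transform(x)  # zero mean, unit variance
minmaxed = MinMaxScaler().fit_transform(x)        # rescaled into [0, 1]
```

A range other than [0, 1] can be requested with `MinMaxScaler(feature_range=(-1, 1))`, which is the form used in the logistic regression script of Chapter 5.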
4.4.1 Feature selection

In feature selection, we find a subset of the original attribute values. It has three strategies:

Filter strategy
Wrapper strategy
Embedded strategy (regularization)

In machine learning, for regression and classification of large data, the output is more accurate when the model is trained on a well-chosen subset of features.
4.4.2 Feature extraction

Principal component analysis decomposes the data matrix X, by singular value decomposition, into three matrices:

X = U S V^T,  with  U^T U = I  and  V^T V = I.

The rows of M = US are the principal component waveforms, which are linearly decorrelated and can be remixed to reconstruct the original data. The eigenvector matrix V is essentially a set of topographic scalp maps, similar to the component maps produced by ICA.
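The factorization above can be checked numerically with NumPy; the data matrix here is random and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 40))   # e.g. 6 channels x 40 time points

# Economy-size SVD: X = U S V^T
U, s, Vt = np.linalg.svd(X, full_matrices=False)

M = U * s            # principal component waveforms, M = US
X_rebuilt = M @ Vt   # remixing the components recovers the data
```

Truncating `M` and `Vt` to the leading components with the largest singular values gives the usual PCA dimensionality reduction.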
Bell and Sejnowski have proposed a simple neural network algorithm that blindly separates mixtures of independent sources. Maximizing the joint entropy, H(y), of the output of a neural processor minimizes the mutual information among the output components y = g(u), where g() is an invertible bounded nonlinearity and

u = Wx.

When separation succeeds, u is a version of the original sources, s, identical save for scaling and permutation; this implies that the distribution of the output approximates a uniform density. Independence is achieved through the nonlinear squashing function, which provides the necessary higher-order statistics through its Taylor series expansion. The learning rule can be derived by gradient ascent on the entropy; in its natural-gradient form (for the logistic nonlinearity) it reads

dW ∝ [I + (1 - 2y)u^T] W.

The natural-gradient form avoids matrix inversions and speeds convergence. The form of the nonlinearity g(u) plays an essential role in the success of the algorithm: the ideal form for g() is the cumulative density function (cdf) of the distributions of the independent sources.
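The thesis applies EEGLAB's Infomax implementation of this rule. Purely as an illustration of the same blind-separation idea, scikit-learn's FastICA (a different ICA algorithm) unmixing two synthetic sources:

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two synthetic sources: a smooth sine and a square wave
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * np.pi * t)
s2 = np.sign(np.sin(3 * np.pi * t))
S = np.c_[s1, s2]

# Mix the sources: observed signals x = As
A = np.array([[1.0, 0.5], [0.5, 1.0]])
X = S @ A.T

# Recover the source activations u = Wx
ica = FastICA(n_components=2, random_state=0, max_iter=1000)
U = ica.fit_transform(X)
```

Each recovered component correlates strongly with exactly one of the true sources, up to the scaling and permutation ambiguity noted above.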
4.5 Applying ICA to artefact correction

ICA is suitable for performing blind source separation on EEG data because it is plausible that EEG data recorded at multiple scalp sensors are linear sums of temporally independent components arising from spatially fixed, distinct sources, and because volume conduction does not involve significant time delays. For EEG analysis, the rows of the input matrix x are the EEG signals recorded at different electrodes, the rows of the output matrix u = Wx are the time courses of activation of the ICA components, and the columns of the inverse matrix W^-1 give the projection strengths of the respective components onto the scalp sensors. The scalp topographies of the components allow us to examine their physiological plausibility. Corrected EEG data is obtained by back-projecting the component activations with the artifactual components set to zero; the rank of the corrected EEG data is therefore less than that of the original data.
4.6 Artefact removal using MATLAB

To remove the eye-blink artefact using MATLAB, we use the EEGLAB toolbox for ICA analysis. After importing the dataset and processing the EEG data as described above, we apply ICA by selecting Tools→ Run ICA; a dialog box will then appear in which the ICA algorithm is chosen. After selecting the mode of the ICA, click OK to run the program. After running ICA, a confirmation box will appear as shown in fig 4.2.
Fig 4.2 window showing ICA weights is equal to YES
Now to visualize the EEG component map Goto the Plot→ Component Map→ In 2-D
Fig 4.3 Visualization of ICA component maps
Now select the component in which the intensity around the eyes is maximum, and mark it for rejection by selecting it and setting it to Reject, as shown in fig 4.4.
Fig 4.4 marking the channel component map manually
After selecting the components to be rejected, remove them manually by selecting the marked components for removal.
Fig 4.5 removing the channel component
Click OK to remove the channel component. To show the new EEG plot, select the corresponding option under Plot.

Fig 4.6 EEG plot after removal; the blue trace is the original EEG, the red trace shows the EEG with the artefact removed
Now, to visualize the EEG component maps again, go to Plot→ Component Map→ In 2-D. After removing all the artefacts, export the data from EEGLAB via File→ Export.
Fig 4.8 Export data dialog box
CHAPTER 5

INTRODUCTION TO MACHINE LEARNING
5.1 Introduction
In this chapter, we will learn about machine learning algorithms and approaches to classify the data, reducing the difference between the human assessment and the system's prediction. There are three types of learning:

Supervised learning
Unsupervised learning
Reinforcement learning
5.1.1 Supervised learning

In supervised learning, we train on a dataset providing all the features, an input variable X and an output variable Y, and build a hypothesis model, i.e. Y = f(X). The goal is to learn the mapping function f in such a way that, when we input new data new_X, we can predict the output variable for that data. Supervised learning includes:
Regression
Simple Linear Regression
Multiple Linear Regression
Support Vector Regression
Classification
LR
SVM
SVM – kernel
K - NN
5.1.2 Unsupervised learning

In unsupervised learning, we don't give the output variable to classify the data. The system learns by clustering the data and then classifying the clusters. Unsupervised learning differs from supervised learning in that there is no given correct answer for the data. It includes:

Clustering
Association

5.1.3 Reinforcement learning

In reinforcement learning, the system learns through its experience: it marks the actions it takes as good or bad according to the reward it receives.
5.2 Classification of the EEG data
After pre-processing the EEG dataset in chapter 4, we convert the dataset into a .csv file (an online converter can be used to convert the .mat file into .csv). After converting to .csv, open a Python IDE; PyCharm Community Edition was used for the scripts below.
_____________________________________________________________________
# importing libraries
import numpy as np
import pandas as pd

# importing the dataset
dataset = pd.read_csv('xtest')
dataset1 = pd.read_csv('ytest')
x = dataset.iloc[:, :].values
y = dataset1.iloc[:, 1].values

# Splitting the dataset into the Training Set and Test Set
from sklearn.model_selection import train_test_split
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.25, random_state=0)

# Feature Scaling
from sklearn.preprocessing import StandardScaler
sc_x = StandardScaler()
x_train = sc_x.fit_transform(x_train)
x_test = sc_x.transform(x_test)

# Fitting Logistic Regression to the Training Set
from sklearn.linear_model import LogisticRegression
classifier = LogisticRegression(random_state=0)
classifier.fit(x_train, y_train)

# Predicting the test set result
y_pred = classifier.predict(x_test)

# Making the confusion matrix
from sklearn.metrics import confusion_matrix
cm = confusion_matrix(y_test, y_pred)
_____________________________________________________________________
# importing libraries
import numpy as np
import pandas as pd

# importing the dataset
dataset = pd.read_csv('xtest')
dataset1 = pd.read_csv('ytest')
x = dataset.iloc[:, :].values
y = dataset1.iloc[:, 1].values

# Splitting the dataset into the Training Set and Test Set
from sklearn.model_selection import train_test_split
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.25, random_state=0)

# Feature Scaling
from sklearn.preprocessing import StandardScaler
sc_x = StandardScaler()
x_train = sc_x.fit_transform(x_train)
x_test = sc_x.transform(x_test)

# Fitting the classifier to the Training Set
# (the classifier used in this second script was not recoverable from the
# source; a kernel SVM is shown here, matching the SVM code that follows)
from sklearn.svm import SVC
classifier = SVC(kernel='rbf', random_state=0)
classifier.fit(x_train, y_train)

# Predicting the test set result
y_pred = classifier.predict(x_test)

# Making the confusion matrix
from sklearn.metrics import confusion_matrix
cm = confusion_matrix(y_test, y_pred)
_____________________________________________________________________
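Both scripts above end by building a confusion matrix `cm`; the overall accuracy is its trace divided by the total count. A tiny sketch with a made-up 2x2 matrix (not the thesis's actual counts):

```python
import numpy as np

# Rows: true class, columns: predicted class (illustrative counts only)
cm = np.array([[50, 10],
               [5, 35]])

# Accuracy = correctly classified / total = (50 + 35) / 100
accuracy = np.trace(cm) / cm.sum()
```

The off-diagonal entries split the errors into false positives and false negatives, which is why the confusion matrix is reported rather than the accuracy alone.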
_____________________________________________________________________
# Logistic regression implemented from scratch
import math
import numpy as np
import pandas as pd
from sklearn import preprocessing
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

min_max_scaler = preprocessing.MinMaxScaler(feature_range=(-1, 1))

df = pd.read_csv("data.csv", header=0)
df.columns = ["grade1", "grade2", "label"]

X = df[["grade1", "grade2"]]
X = np.array(X)
X = min_max_scaler.fit_transform(X)
Y = df["label"].map(lambda x: float(x.rstrip(';')))
Y = np.array(Y)

X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.33)

# scikit-learn model, used as a baseline for comparison
clf = LogisticRegression()
clf.fit(X_train, Y_train)

def Sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def Hypothesis(theta, x):
    z = 0
    for i in range(len(theta)):
        z += x[i] * theta[i]
    return Sigmoid(z)

def Cost_Function(X, Y, theta, m):
    sumOfErrors = 0
    for i in range(m):
        xi = X[i]
        hi = Hypothesis(theta, xi)
        if Y[i] == 1:
            error = Y[i] * math.log(hi)
        elif Y[i] == 0:
            error = (1 - Y[i]) * math.log(1 - hi)
        sumOfErrors += error
    const = -1 / m
    J = const * sumOfErrors
    return J

def Cost_Function_Derivative(X, Y, theta, j, m, alpha):
    sumErrors = 0
    for i in range(m):
        xi = X[i]
        xij = xi[j]
        hi = Hypothesis(theta, X[i])
        error = (hi - Y[i]) * xij
        sumErrors += error
    constant = float(alpha) / float(m)
    J = constant * sumErrors
    return J

def Gradient_Descent(X, Y, theta, m, alpha):
    new_theta = []
    for j in range(len(theta)):
        CFDerivative = Cost_Function_Derivative(X, Y, theta, j, m, alpha)
        new_theta_value = theta[j] - CFDerivative
        new_theta.append(new_theta_value)
    return new_theta

def Logistic_Regression(X, Y, alpha, theta, num_iters):
    m = len(Y)
    for x in range(num_iters):
        new_theta = Gradient_Descent(X, Y, theta, m, alpha)
        theta = new_theta
        if x % 100 == 0:
            # print the cost every 100 iterations to monitor convergence
            print(Cost_Function(X, Y, theta, m))
    return theta
_____________________________________________________________________
# Kernel SVM trained by solving the dual quadratic program with cvxopt
import numpy as np
import cvxopt.solvers

MIN_SUPPORT_VECTOR_MULTIPLIER = 1e-5


class SVMTrainer(object):
    def __init__(self, kernel, c):
        self._kernel = kernel
        self._c = c

    def train(self, X, y):
        lagrange_multipliers = self._compute_multipliers(X, y)
        return self._construct_predictor(X, y, lagrange_multipliers)

    def _gram_matrix(self, X):
        n_samples, n_features = X.shape
        K = np.zeros((n_samples, n_samples))
        for i, x_i in enumerate(X):
            for j, x_j in enumerate(X):
                K[i, j] = self._kernel(x_i, x_j)
        return K

    def _construct_predictor(self, X, y, lagrange_multipliers):
        support_vector_indices = \
            lagrange_multipliers > MIN_SUPPORT_VECTOR_MULTIPLIER
        support_multipliers = lagrange_multipliers[support_vector_indices]
        support_vectors = X[support_vector_indices]
        support_vector_labels = y[support_vector_indices]

        # http://www.cs.cmu.edu/~guestrin/Class/10701-S07/Slides/kernels.pdf
        # bias = y_k - sum_i z_i y_i K(x_k, x_i); predict with zero bias and
        # compute error.
        bias = np.mean(
            [y_k - SVMPredictor(
                kernel=self._kernel,
                bias=0.0,
                weights=support_multipliers,
                support_vectors=support_vectors,
                support_vector_labels=support_vector_labels).predict(x_k)
             for (y_k, x_k) in zip(support_vector_labels, support_vectors)])

        return SVMPredictor(
            kernel=self._kernel,
            bias=bias,
            weights=support_multipliers,
            support_vectors=support_vectors,
            support_vector_labels=support_vector_labels)

    def _compute_multipliers(self, X, y):
        # y must be a float array of +1/-1 labels
        n_samples, n_features = X.shape
        K = self._gram_matrix(X)
        P = cvxopt.matrix(np.outer(y, y) * K)
        q = cvxopt.matrix(-1 * np.ones(n_samples))
        # -a_i <= 0 and a_i <= c (soft-margin box constraints)
        G_std = cvxopt.matrix(np.diag(np.ones(n_samples) * -1))
        h_std = cvxopt.matrix(np.zeros(n_samples))
        G_slack = cvxopt.matrix(np.diag(np.ones(n_samples)))
        h_slack = cvxopt.matrix(np.ones(n_samples) * self._c)
        G = cvxopt.matrix(np.vstack((G_std, G_slack)))
        h = cvxopt.matrix(np.vstack((h_std, h_slack)))
        A = cvxopt.matrix(y.astype(float), (1, n_samples))
        b = cvxopt.matrix(0.0)
        solution = cvxopt.solvers.qp(P, q, G, h, A, b)
        return np.ravel(solution['x'])


class SVMPredictor(object):
    def __init__(self,
                 kernel,
                 bias,
                 weights,
                 support_vectors,
                 support_vector_labels):
        self._kernel = kernel
        self._bias = bias
        self._weights = weights
        self._support_vectors = support_vectors
        self._support_vector_labels = support_vector_labels
        assert len(support_vectors) == len(support_vector_labels)

    def predict(self, x):
        """Computes the SVM prediction on the given features x."""
        result = self._bias
        for z_i, x_i, y_i in zip(self._weights,
                                 self._support_vectors,
                                 self._support_vector_labels):
            result += z_i * y_i * self._kernel(x_i, x)
        return np.sign(result).item()
5.2.3 Visualization code to plot the decision line

_____________________________________________________________________
# Visualising the Training set results
import matplotlib.pyplot as plt
from matplotlib.colors import ListedColormap

X_set, y_set = x_train, y_train
X1, X2 = np.meshgrid(
    np.arange(start=X_set[:, 0].min() - 1, stop=X_set[:, 0].max() + 1, step=0.01),
    np.arange(start=X_set[:, 1].min() - 1, stop=X_set[:, 1].max() + 1, step=0.01))
plt.contourf(X1, X2,
             classifier.predict(np.array([X1.ravel(), X2.ravel()]).T).reshape(X1.shape),
             alpha=0.75, cmap=ListedColormap(('red', 'green')))
plt.xlim(X1.min(), X1.max())
plt.ylim(X2.min(), X2.max())
for i, j in enumerate(np.unique(y_set)):
    plt.scatter(X_set[y_set == j, 0], X_set[y_set == j, 1],
                c=ListedColormap(('red', 'green'))(i), label=j)
plt.title('Classifier (Training set)')
plt.xlabel('Age')
plt.ylabel('Estimated Salary')
plt.legend()
plt.show()

# Visualising the Test set results
X_set, y_set = x_test, y_test
X1, X2 = np.meshgrid(
    np.arange(start=X_set[:, 0].min() - 1, stop=X_set[:, 0].max() + 1, step=0.01),
    np.arange(start=X_set[:, 1].min() - 1, stop=X_set[:, 1].max() + 1, step=0.01))
plt.contourf(X1, X2,
             classifier.predict(np.array([X1.ravel(), X2.ravel()]).T).reshape(X1.shape),
             alpha=0.75, cmap=ListedColormap(('red', 'green')))
plt.xlim(X1.min(), X1.max())
plt.ylim(X2.min(), X2.max())
for i, j in enumerate(np.unique(y_set)):
    plt.scatter(X_set[y_set == j, 0], X_set[y_set == j, 1],
                c=ListedColormap(('red', 'green'))(i), label=j)
plt.title('Classifier (Test set)')
plt.xlabel('Age')
plt.ylabel('Estimated Salary')
plt.legend()
plt.show()
_____________________________________________________________________
REFERENCES

1. P. Chiu and A. Kumar, "Music Therapy: Loud Noise or Soothing Notes?," ...
2. ... Nasuto, "Affective brain-computer music interfacing," J. Neural Eng., vol. 13, no. 4, p. 046022, 2016.
3. ..., vol. 3, p. 40, 2012.
... , "...-Based Support Vector Machine for Emotion Recognition Through EEG," in Proc. ...
6. E. T. Berkman, D. K. Wong, M. P. Guimaraes, E. T. Uy, J. J. Gross, and P. Suppes, "Brain ...," pp. S71-S7...
BIODATA
1. Vivek Majumdar
572/IC/14
vivekmajumdar03@gmail.com
2. Kartik Nishad
471/IC/14
kartiknishad204@gmail.com
3. Vishal Chopra
568/IC/14
vishalchopra96500@gmail.com
4. Vivek Nigam
573/IC/14
viveknigam420@gmail.com