
Boston College

Computer Science Department


Non-Invasive BCI through EEG
An Exploration of the Utilization of Electroencephalography to Create Thought-Based
Brain-Computer Interfaces
Senior Honors Thesis
Daniel J. Szafir, 2009-10
Advised by Prof. Robert Signorile
Contents
List of Figures and Tables
1. Abstract 1
2. Introduction 2
   1. Electroencephalography 2
   2. Brain-Computer Interfaces 5
3. Previous EEG BCI Research 6
4. The Emotiv System 9
   1. Control Panel 11
   2. TestBench 14
   3. The Emotiv API 16
5. The Parallax Scribbler® Robot and the IPRE Fluke 17
6. Control Implementation 18
   1. Emotiv Connection 19
   2. Scribbler Connection 19
   3. Decoding and Handling EmoStates 20
   4. Modifications 23
7. Blink Detection 25
8. Conclusions 30
9. Acknowledgments 31

List of Figures and Tables
Table 1: EEG Bands and Frequencies 4
Table 2: Emotiv SDKs 11
Table 3: Predefined Cognitiv Action Enumerator Values 22
Figure 1: Electrode Placement according to the International 10-20 System 4
Figure 2: Brain-Computer Interface Design Pattern 7
Figure 3: Emotiv EPOC Headset 10
Figure 4: The 14 EPOC Headset Contacts 10
Figure 5: Expressiv Suite 12
Figure 6: Affectiv Suite 13
Figure 7: Cognitiv Suite 14
Figure 8: Real-time EEG Measurements in TestBench 15
Figure 9: FFT Measurements in TestBench 15
Figure 10: High-level View of the Utilization of the Emotiv API 16
Figure 11: Scribbler Robot with IPRE Fluke add-on Board 17
Figure 12: High-level View of the Control Scheme 18
Figure 13: A Blink Event Highlighted in the AF3 Channel 25
Figure 14: One 10-second Segment with Blinks and Noise 26
Figure 15: Initial Neural Net Results 27
Figure 16: Twenty 10-second Recordings of AF3 Data 27
Figure 17: 1.6 Second Recording Showing a Blink Pattern 28
Figure 18: Unsupervised K-Means Clustering Exposes Naturally Occurring Blink Patterns 28
Figure 19: Fully Processed and Normalized Input to the Neural Network 29
1. Abstract
It has long been known that as neurons fire within the brain they produce measurable electrical activity. Electroencephalography (EEG) is the measurement and recording of these electrical signals using sensors arrayed across the scalp. Though there is copious research in using EEG technology in the fields of neuroscience and cognitive psychology, it is only recently that the possibility of utilizing EEG measurements as inputs in the control of computers has emerged. The idea of Brain-Computer Interfaces (BCIs), which allow the control of devices using brain signals, has evolved from the realm of science fiction to simple devices that currently exist. BCIs naturally lend themselves to many extremely useful applications, including prosthetic devices, restoring or aiding communication and hearing, military applications, video gaming and virtual reality, and robotic control, and have the potential to significantly improve the quality of life of many disabled individuals. However, current BCIs suffer from many problems, including inaccuracies, delays between thought, detection, and action, exorbitant costs, and invasive surgeries. The purpose of this research is to examine the Emotiv EPOC System as a cost-effective gateway to non-invasive, portable EEG measurements and to use it to build a thought-based BCI to control the Parallax Scribbler® robot. This research furthers the analysis of the current pros and cons of EEG technology as it pertains to BCIs and offers a glimpse of the future potential capabilities of BCI systems.
2. Introduction
Who wouldn't love to control a computer with their mind? Interfaces between the brain and computers have long been a staple of science fiction, where they appear in an incredible variety of applications, from controlling powered exoskeletons, robots, and artificial limbs to creating art envisioned by the user to allowing for machine-assisted telepathy. This space-age fantasy is not quite real yet; however, simple BCIs do currently exist, and research and public interest in them only continue to grow. This research explores the process of creating a novel BCI that utilizes the Emotiv EPOC System to measure EEG waves and control the Parallax Scribbler robot.
2.1 Electroencephalography
EEG waves are created by the firing of neurons in the brain and were first measured by Vladimir Pravdich-Neminsky, who measured the electrical activity in the brains of dogs in 1912, although the term he used was "electrocerebrogram."[1] Ten years later Hans Berger became the first to measure EEG waves in humans and, in addition to giving them their modern name, began what would become intense research in utilizing these electrical measurements in the fields of neuroscience and psychology.[2]

[1] Swartz, B.E.; Goldensohn, E.S. "Timeline of the history of EEG and associated fields." Electroencephalography and Clinical Neurophysiology. Vol. 106, pp. 173-176. 1998.
[2] Millett, David. "Hans Berger: from psychic energy to the EEG." Perspectives in Biology and Medicine, Johns Hopkins University Press, 2001. 44.4, pp. 522-542. <http://muse.jhu.edu/journals/perspectives_in_biology_and_medicine/v044/44.4millett.html>.
as neurotransmitters are released. Each electrode has a standard sensitivity of 7 μV/mm and averages the potentials measured in the area near the sensor. These averages are amplified and combined to show rhythmic activity that is classified by frequency (Table 1).[3] Electrodes are usually placed along the scalp following the "10-20 International System of Electrode Placement" developed by Dr. Herbert Jasper in the 1950s, which allows for standard measurements of various parts of the brain (Figure 1).[4]

The primary research that utilizes EEG technology is based on the fact that this rhythmic activity is dependent upon mental state and can be influenced by level of alertness or various mental diseases. This research commonly involves comparing EEG waves in alert and asleep patients as well as looking for markers in abnormal EEG waves which can evidence diseases such as epilepsy or Alzheimer's. One of the historical downsides of EEG measurement has been the corruption of EEG data by artifacts: electrical signals that are picked up by the sensors but do not originate from cortical neurons. One of the most common causes of artifacts is eye movement and blinking; other causes can include the use of scalp, neck, or other muscles, or even poor contact between the scalp and the electrodes.[5] Many EEG systems attempt to reduce artifacts and general noise by utilizing reference electrodes placed in locations where there is little cortical activity and attempting to filter out correlated patterns.[6]
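As a rough illustration of the referencing idea described above (a generic sketch, not the Emotiv pipeline or the referenced paper's method), a common average reference subtracts the across-channel mean from every channel at each sample, so that signals appearing identically on all electrodes cancel out:

```python
def common_average_reference(channels):
    """Re-reference multi-channel data to the common average.

    channels: list of equal-length lists, one per electrode.
    Components shared by every channel (e.g. broad artifacts picked up
    everywhere) are removed; channel-specific activity remains.
    """
    n_samples = len(channels[0])
    referenced = [list(ch) for ch in channels]  # copy so input is untouched
    for t in range(n_samples):
        avg = sum(ch[t] for ch in channels) / len(channels)
        for ch in referenced:
            ch[t] -= avg
    return referenced
```

Real systems typically combine this with filtering, since artifacts are rarely perfectly correlated across sensors.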

[3] Nunez, P.L.; Srinivasan, R. Electric Fields of the Brain: The Neurophysics of EEG. Oxford University Press. 1981.
[4] Niedermeyer, Ernst and da Silva, Fernando Lopes. Electroencephalography: Basic Principles, Clinical Applications, and Related Fields, Fifth Edition. Lippincott Williams & Wilkins, 2005. pp. 140.
[5] Rowan, A. James. Primer of EEG. Elsevier Science, Philadelphia, PA. 2003.
[6] Ludwig, Kip A. et al. Employing a Common Average Reference to Improve Cortical Neuron Recordings from Microelectrode Arrays. Journal of Neurophysiology, September 3rd, 2008. <http://jn.physiology.org/cgi/reprint/90989.2008v1.pdf>.
Band      Frequency (Hz)
Delta     1-4
Theta     4-7
Alpha     7-13
Beta      13-30
Gamma     30+
Table 1: EEG Bands and Frequencies

Figure 1: Electrode Placement according to the International 10-20 System. Odd numbers on the right, even on the left. Letters correspond to lobes – F(rontal), T(emporal), P(arietal), and O(ccipital). C stands for Central (there is no central lobe).
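The band boundaries in Table 1 can be captured in a small lookup. This is an illustrative sketch using the table's cutoffs; band edges vary slightly between sources:

```python
# Band edges per Table 1: (name, inclusive lower bound, exclusive upper bound)
EEG_BANDS = [("Delta", 1, 4), ("Theta", 4, 7), ("Alpha", 7, 13), ("Beta", 13, 30)]

def band_for(freq_hz):
    """Return the EEG band name for a frequency in Hz, following Table 1."""
    for name, lo, hi in EEG_BANDS:
        if lo <= freq_hz < hi:
            return name
    return "Gamma" if freq_hz >= 30 else "sub-Delta"
```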
2.2 Brain-Computer Interfaces
The term "Brain-Computer Interface" first appeared in scientific literature in the 1970s, though the idea of hooking up the mind to computers was nothing new.[7] The ultimate goal of BCI research is to create a system that is not only an "open loop" system that responds to users' thoughts but a "closed loop" system that also gives feedback to the user. Researchers initially focused on the motor cortex of the brain, the area which controls muscle movements, and testing on animals quickly showed that the natural learning behaviors of the brain could easily adapt to new stimuli as well as control the firing of specific areas of the brain.[8] This research dealt primarily with invasive techniques, but slowly algorithms emerged which were able to decode the motor neuron responses in monkeys in real time and translate them into robotic activity.[9][10] Recently, a system developed by researchers at Carnegie Mellon University and the University of Pittsburgh allowed a monkey to feed itself via a prosthetic arm using only its thoughts.[11] This research is extremely promising for the disabled, and indeed by 2006 a system was developed for a tetraplegic that enabled him to use prosthetic devices, a mouse cursor, and a television via a 96-micro-electrode array implanted into his primary motor cortex.[12] Despite these achievements, research is beginning to veer away from invasive BCIs due to the costly and dangerous
[7] Vidal, J. "Toward Direct Brain-Computer Communication." Annual Review of Biophysics and Bioengineering. Vol. 2, 1973, pp. 157-180. <http://arjournals.annualreviews.org/doi/pdf/10.1146/annurev.bb.02.060173.001105>.
[8] Fetz, E.E. "Operant Conditioning of Cortical Unit Activity." Science. Volume 163, February 28, 1969, pp. 955-958. <http://www.sciencemag.org/cgi/rapidpdf/163/3870/955.pdf>.
[9] Kennedy, Philip R. et al. "Activity of single action potentials in monkey motor cortex during long-term task learning." Brain Research, Volume 760, Issue 1-2, June 1997, pp. 251-254.
[10] Wessberg, Johan et al. "Real-time prediction of hand trajectory by ensembles of cortical neurons in primates." Nature. Vol. 408, No. 6810, 2000. pp. 361-365. <http://medev.kaist.ac.kr/upload/paper/ij01.pdf>.
[11] Carey, Benedict. "Monkeys Think, Moving Artificial Arm as Own." The New York Times. May 29, 2008. <http://www.nytimes.com/2008/05/29/science/29brain.html>.
[12] Hochberg, Leigh R. et al. "Neuronal ensemble control of prosthetic devices by a human with tetraplegia." Nature. Vol. 442, July 13, 2006, pp. 164-171. <http://www.nature.com/nature/journal/v442/n7099/full/nature04970.html>.
nature of the surgeries required for such systems. Non-invasive alternatives for BCIs include EEG technology, Magnetoencephalography (MEG), and Magnetic Resonance Imaging (MRI), as well as the "partially invasive" Electrocorticography, where sensors are placed within the skull but outside the gray matter of the brain. Non-invasive methods are limited in that they are often susceptible to noise, have worse signal resolution due to distance from the brain, and have difficulty recording the inner workings of the brain. However, more sophisticated systems are constantly emerging to combat these difficulties, and non-invasive techniques have the advantages of lower cost, greater portability, and the fact that they do not require any special surgery.[13]
3. Previous EEG BCI Research
Though the idea of using EEG waves as input to BCIs has existed since the initial conception of BCIs, actual working BCIs based on EEG input have only recently appeared.[14] Most EEG-BCI systems follow a similar paradigm of reading in and analyzing EEG data, translating that data into device output, and giving some sort of feedback to the user (Figure 2); however, implementing this model can be extremely challenging.[15] The primary difficulty in creating an EEG-based BCI is the feature extraction and classification of EEG data, which must be done in real time if it is to have any use.

Feature extraction deals with separating useful EEG data from noise and simplifying that data so that classification, the problem of trying to decide what the extracted data represents, can occur. There is no best way of extracting features from EEG data, and modern BCIs often use several types of feature extraction, including Hjorth parameters (a way of describing the normalized slope of the data), wavelet transforms, Fourier transforms, and various other types of filters. The major features that

[13] Fabiani, Georg E. et al. Conversion of EEG activity into cursor movement by a brain-computer interface. <http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.128.5914>. 2004.
[14] Vidal, J. "Toward Direct Brain-Computer Communication."
[15] Omidvarnia, Amir H. et al. Kalman Filter Parameters as a New EEG Feature Vector for BCI Applications. <http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.132.1388>.
EEG-BCI systems rely on are event-related potentials (ERPs) and event-related changes in specific frequency bands. The "P300 wave" is one of the most often used ERPs in BCIs and is utilized because it is easily detectable and is only created in response to specific stimuli.[16][17]
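As a concrete instance of the feature extraction mentioned above, the three Hjorth parameters (activity, mobility, and complexity) can be computed from a signal's variance and the variances of its successive differences. This is a generic sketch of the standard definitions, not code from any cited system:

```python
import math

def hjorth(signal):
    """Return the Hjorth (activity, mobility, complexity) of a 1-D signal."""
    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / len(xs)

    d1 = [b - a for a, b in zip(signal, signal[1:])]  # first differences
    d2 = [b - a for a, b in zip(d1, d1[1:])]          # second differences
    activity = variance(signal)                        # signal power
    mobility = math.sqrt(variance(d1) / activity)      # mean frequency proxy
    complexity = math.sqrt(variance(d2) / variance(d1)) / mobility
    return activity, mobility, complexity
```

Because these reduce a whole window of samples to three numbers, they are cheap enough to compute in real time, which is exactly the constraint described above.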
Figure 2: Brain-Computer Interface Design Pattern

BCI systems are further complicated by the fact that there is no standard way of classifying the extracted data. Various means, including neural networks, threshold parameters, and various other types of pattern recognizers, are employed to try to match the input data to known categories of EEG archetypes.[18] Furthermore, researchers have also relied on unsupervised learning algorithms to find natural clusters of EEG segments that are indicative of certain kinds of mental activities, with varying

[16] Niedermeyer. Electroencephalography. pp. 1265-1266.
[17] Sellers, Eric W. et al. "A P300 event-related potential brain-computer interface (BCI): The effects of matrix size and inter stimulus interval on performance." Biological Psychology. Volume 73, Issue 3, October 2006. pp. 242-252.
[18] Adlakha, Amit. Single Trial EEG Classification. Swiss Federal Institute of Technology. July 12, 2002. <http://mmspl.epfl.ch/webdav/site/mmspl/shared/BCI/publications/adlakhareport.pdf>.
degrees of success.[19][20]
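The unsupervised-clustering approach cited above can be illustrated with a minimal k-means over one-dimensional features (for example, the peak amplitude of each EEG segment). This is a generic sketch assuming k >= 2, and makes no claim about the cited studies' actual implementations:

```python
def kmeans_1d(values, k, iterations=20):
    """Cluster scalar features into k groups by iterative mean refinement."""
    lo, hi = min(values), max(values)
    # Spread initial centroids evenly across the observed range (assumes k >= 2).
    centroids = [lo + (hi - lo) * i / (k - 1) for i in range(k)]
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for v in values:
            # Assign each value to its nearest centroid.
            nearest = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[nearest].append(v)
        # Move each centroid to the mean of its assigned values.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids
```

Segments whose features land in the same cluster can then be treated as candidates for the same mental activity, without any labeled training data.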
Feedback is essential in BCI systems, as it allows users to understand what brainwaves they just produced and to learn behavior that can be effectively classified and controlled. Feedback can be in the form of visual or auditory cues and even haptic sensations, and ongoing research is still attempting to figure out the optimal form feedback should take.[21]

EEG-BCIs can be classified as either synchronous or asynchronous. The computer drives synchronous systems by giving the user a cue to perform a certain mental action and then recording the user's EEG patterns in a fixed time-window. Asynchronous systems, on the other hand, are driven by the user and operate by passively and continuously monitoring the user's EEG data and attempting to classify it on the fly. Synchronous protocols are far easier to construct and have historically been the primary way of operating BCI systems.[22]

EEG-BCI systems have made incredible progress in recent years. By 2000, researchers had created a thought-translation device for completely paralyzed patients which allowed patients to select characters based on their thoughts, although the character selection process was time-consuming and not perfectly accurate.[23] By 2008, researchers collaborating from Switzerland, Belgium, and Spain had created a feasible asynchronous BCI that controlled a motorized wheelchair with a high degree of accuracy, though again the system was not perfect.[24] Today, the 2010 DARPA budget allocates $4

[19] Lu, Shijian et al. "Unsupervised Brain Computer Interface based on Inter-Subject Information." 30th Annual International IEEE EMBS Conference. Vancouver, British Columbia, Canada. August 20-24, 2008. <http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=04649233>.
[20] Niedermeyer. Electroencephalography. pp. 1240.
[21] Kauhanen, L.; Palomaki, T.; Jylanki, P.; Aloise, F.; Nuttin, M.; and Millan, J.R. "Haptic feedback compared with visual feedback for BCI." Proceedings of the 3rd International BCI Workshop and Training Course 2006. Graz, Austria. 21-25 Sept 2006, pp. 66-67. <www.lce.hut.fi/research/css/bci/Kauhanen_et_al_conf_2006.pdf>.
[22] Niedermeyer. Electroencephalography. pp. 1265.
[23] Birbaumer, N. et al. "The Thought Translation Device (TTD) for Completely Paralyzed Patients." IEEE Transactions on Rehabilitation Engineering. Volume 8, No. 2, June 2000. pp. 190-193. <www.cs.cmu.edu/~tanja/BCI/TTD2003.pdf>.
[24] Galán, F. et al. "A Brain-Actuated Wheelchair: Asynchronous and Non-invasive Brain-Computer Interfaces for Continuous Control of Robots." Clinical Neurophysiology. Volume 119, Issue 9, September 2008. pp. 2159-2169.

million to develop an EEG-based program called Silent Talk which will "allow user-to-user communication on the battlefield without the use of vocalized speech through analysis of neural signals."[25] State-of-the-art EEG-based BCIs are a cutting-edge emerging technology, and researchers are constantly developing newer and more accurate algorithms to aid in making BCIs that are simpler and more effective than ever before.
4. The Emotiv® System
The Emotiv System, whose tag-line is "you think, therefore you can," bills itself as a "revolutionary new personal interface for human computer interaction." It is based around the EPOC headset for recording EEG measurements and a software suite which processes and analyzes the data. Emotiv offers both a consumer headset for $299, which only works with approved applications, and a developer headset which supports product development and includes a software bundle for $500. Both headsets are wireless and utilize a proprietary USB dongle to communicate using the 2.4GHz band. This research used the developer EPOC headset, which contains a rechargeable 12 hour lithium battery, 14 EEG sensors (plus CMS/DRL references), and a gyroscope, and has an effective bandwidth of 0.16-43Hz (Figures 3, 4). Emotiv offers 6 different software development kits, which grant various control over the EPOC headset API and detection libraries and come with up to 4 different programs (Table 2). This research originally used the Development SDK and later upgraded to the Research Edition. The Research Edition includes the Emotiv Control Panel, EmoComposer (an emulator for simulating EEG signals), EmoKey (a tool for mapping various events detected by the headset into keystrokes), the TestBench, and an upgraded API which enables the capture of raw EEG data from each individual sensor.[26]

[25] Drummond, Katie. "Pentagon Preps Soldier Telepathy Push." Wired Magazine. May 14, 2009. <http://www.wired.com/dangerroom/2009/05/pentagon-preps-soldier-telepathy-push>.
[26] Emotiv Website. <http://www.emotiv.com>.

Figure 3: Emotiv EPOC Headset
Figure 4: The 14 EPOC Headset contacts. In addition there is a Common Mode Sense (CMS) electrode in the P3 location and a Driven Right Leg (DRL) electrode in the P4 location which form a feedback loop for referencing the other measurements.
SDK Edition      Cost                           Development License      Included Software
Lite             Free                           Individuals              Control Panel, EmoComposer, EmoKey
Developer        $500.00 (comes with            Individuals              Control Panel, EmoComposer, EmoKey, Basic API
                 developer headset)
Research         $750.00, or $250.00 for        Individuals              Control Panel, EmoComposer, EmoKey, TestBench, Raw EEG data API
                 upgrade from Developer SDK
Enterprise       $2,500.00                      Enterprises              Control Panel, EmoComposer, EmoKey, Basic API
Enterprise Plus  $7,500.00                      Enterprises              Control Panel, EmoComposer, EmoKey, TestBench, Raw EEG data API
Education        $2,500.00                      Educational Institutes   Control Panel, EmoComposer, EmoKey, TestBench, Raw EEG data API
Table 2: Emotiv SDKs
4.1 Control Panel
The Emotiv Control Panel is a graphical user interface which functions as a gateway to using the EPOC headset. It oversees connecting with the headset, preprocessing and classifying the input signals, and giving feedback to the user, and it allows the user to create a profile and train thoughts and actions. The Control Panel includes the Expressiv, Affectiv, and Cognitiv suites as well as a Mouse Emulator which allows the user to control the mouse by moving their head, utilizing the headset gyroscope. The Expressiv suite is designed to measure facial expressions based on reading EEG/EMG and is an innovative way of utilizing artifacts that are usually simply filtered out of EEG systems. The Expressiv suite can recognize 12 actions: blink, right/left wink, look right/left, raise brow, furrow brow, smile, clench, right/left smirk, and laugh. It gives feedback by matching the incoming signals to a simulated face avatar which mimics the user's expressions (Figure 5).

Figure 5: Expressiv Suite

The Affectiv suite monitors the user's emotional states. It can measure engagement/boredom, frustration, meditation, instantaneous excitement, and long-term excitement (Figure 6). The Cognitiv suite monitors and interprets the user's conscious thoughts. It can measure 13 active thoughts – push, pull, lift, drop, left, right, rotate left, rotate right, rotate clockwise, rotate counterclockwise, rotate forward, rotate backward, and disappear – as well as the passive neutral state. The Emotiv Software detects these thoughts using built-in proprietary software. This software works by running the input from the electrodes through a neural network and attempting to classify the signals as one of the 13 built-in "prototype" action thoughts. The data for these "prototype thoughts" was gathered prior to the release of the headset based on hundreds of test cases and serves as a base for classification. Action thoughts must be trained before use, and the user can tie different thoughts to the built-in actions (i.e. training the "push" command by thinking "blue"); however, doing this can lower the accuracy of recognizing the thoughts and increase the training time, since these thoughts will not match up to the prototype thoughts the software initially expects. The software gives feedback in the form of a floating cube that responds to the recognized thoughts (Figure 7).

Figure 6: Affectiv Suite

Currently, at any given time the Cognitiv suite can distinguish between four separate thoughts on the fly; however, additional thoughts can be recognized concurrently by having multiple Control Panels open simultaneously, each looking for different thoughts. The four-thought limit is currently in place due to usability concerns, as adding additional thoughts can greatly increase the difficulty of using the Cognitiv suite effectively. Despite this, Emotiv is currently considering raising the concurrent thought recognition limit beyond four.

Figure 7: Cognitiv Suite
4.2 TestBench
The TestBench program provides a real-time display of the Emotiv headset data stream. It allows the user to see the EEG contact quality and the actual data coming in from each sensor, as well as the gyroscope data, wireless packet information, and battery life (Figure 8). Furthermore, the program can display a Fast Fourier Transform (FFT) of any incoming channel and can display the Delta, Theta, Alpha, and Beta bands as well as a user-defined custom band (Figure 9). Finally, the TestBench can record, save, and play back data in European Data Format (.edf) and can convert saved EDF data to Comma-Separated Value format (.csv); it was used to record and save data for blink analysis.

Figure 8: Real-time EEG Measurements in TestBench
Figure 9: FFT Measurements in TestBench. TestBench also measures Delta, Theta, Alpha, Beta, and a user-defined custom band.
4.3 The Emotiv API
The Emotiv API is exposed as an ANSI C interface implemented in two Windows DLLs: edk.dll and edk_utils.dll. The core of the Emotiv SDK is the "EmoEngine," a logical abstraction that "communicates with the Emotiv headset, receives preprocessed EEG and gyroscope data, manages user-specific or application-specific settings, performs post-processing, and translates the Emotiv detection results into an easy-to-use structure called an EmoState." Every EmoState represents the current input from the headset, including "facial, emotional, and cognitive state," and, with the upgrade to the Research Edition, contains electrode measurements for each contact. Utilizing the Emotiv API consists of connecting to the EmoEngine, detecting and decoding new EmoStates, and calling code relevant to the new EmoState (Figure 10).[27]

Figure 10: High-level View of the Utilization of the Emotiv API

[27] Emotiv Software Development Kit: User Manual for Beta Release. pp. 36-37.
5. The Parallax Scribbler® Robot and the IPRE Fluke
The Parallax Scribbler robot is a fully assembled, reprogrammable robot built around the BASIC Stamp® 2 microcontroller. It contains built-in photovoltaic sensors, infrared sensors, line sensors, two independent DC motors to drive the two wheels, three indicator LED lights, a speaker, and a serial port. It can be programmed using the Scribbler Program Maker GUI or the BASIC Stamp Editor.[28]

The Institute for Personal Robots in Education (IPRE) Fluke is an add-on board created by Georgia Robotics that plugs into the Scribbler's serial port and adds color vision, IR range sensing, internal voltage sensing, an extra LED, and Bluetooth functionality (Figure 11). Furthermore, IPRE has created the Myro (short for My Robot) software package, which interacts with the Fluke via Bluetooth and grants the ability to reprogram the Scribbler using Python.[29]

Figure 11: Scribbler Robot with IPRE Fluke add-on Board

[28] "The Scribbler: A Reprogrammable Robot." <http://www.parallax.com/tabid/455/Default.aspx>. Copyright 2010 by Parallax Inc. Accessed 4/11/2010.
[29] "Usage Guides." <http://www.roboteducation.org/guides.html>. Copyright © 2007 Institute for Personal Robots in Education. Accessed 4/11/2010.
6. Control Implementation
The control implementation for this project was written in Microsoft Visual C++. I decided on C++ as it allowed me to easily call the ANSI C libraries exposed in the Emotiv API as well as make calls to the Python/C API so that I could control the Scribbler using the Python Myro interface. The code implementing this control scheme is divided into four basic parts: connecting to the Emotiv headset via the Emotiv API, connecting to the Scribbler through the Myro Python libraries, reading and decoding Emotiv events and sending the corresponding commands to the Scribbler, and closing the connections when the user is done (Figure 12).

Figure 12: High-level View of the Control Scheme
6.1 Emotiv Connection
Connecting to the Emotiv allows for two basic options for controlling the robot: using thoughts detected from the actual Emotiv EPOC headset or using mock signals from the EmoComposer emulator. The EmoComposer acts as an emulator that the Control Panel can connect to and allows the user to send pre-recorded, built-in data signals that match possible inputs from the real headset. I coded the program to allow access to the EmoComposer as it let me initially use clean signals from the EmoComposer to make a quick and testable prototype scheme before I had trained the headset; once the headset was trained, I switched to using actual headset signals as inputs. The User's Manual suggests using the EE_EngineConnect call to connect directly with the headset and using EE_EngineRemoteConnect otherwise; however, I found that EE_EngineConnect always returned true, even if the headset was off.[30] Instead, I decided to use EE_EngineRemoteConnect to connect through the Control Panel, which enables communication with the headset and querying its power and connection status. Thus my connection calls for utilizing the EmoComposer or the headset differ only in the port and IP:

EE_EngineRemoteConnect(EmoControlPanelIP.c_str(), headsetPort);
EE_EngineRemoteConnect(EmoComposerIP.c_str(), composerPort);
6.2 Scribbler Connection
Initializing the Scribbler robot consists of three steps: initializing Python, loading the Myro libraries, and connecting to the robot using Myro's initialize() command. The Python.h library allows C/C++ code to embed Python and make direct calls to the Python interpreter. Initializing the Python interpreter and loading the Python dictionary requires four lines of code:

Py_Initialize();
PySys_SetArgv(argc, argv);
main_module = PyImport_AddModule("__main__");
main_dict = PyModule_GetDict(main_module);

From here, any code can be sent to the Python interpreter directly using the PyRun_SimpleString function. Furthermore, C++ code can construct PyObject pointers to reference Python functions stored in the Python dictionary. These functions can be called via PyObject_CallFunction, which passes back another PyObject pointer holding the return value of the called function. Thus full embedding functionality is possible, complete with function calls, parameter passing, and return values, when embedding Python in C/C++.[31]

Loading the Myro libraries consists of only one line of code:

PyRun_SimpleString("from myro import *");

however, it is important to remember to update the C++ reference to the now-updated Python dictionary so that the Myro functions can be called. From there, the user inputs the Bluetooth COM port that the Scribbler is connected to, and the Myro initialize() command is called, which allows the Python interpreter to send commands directly to the Scribbler:

PyObject* initFunction = PyDict_GetItemString(main_dict, "initialize");
PyObject_CallFunction(initFunction, "s", port.c_str());

[30] User Manual. pp. 37-38.
5.$ -ecoding and Handling E"o,tates
There are four maPor steps in reading and decoding information from the E'(# headset?
creating the EmoEngine and Emo*tate handles, %uerying for the most recent Emo*tate, deciding if this
is a new Emo*tate, and decoding the Emo*tate. The EmoEngine handle allows for %ueries to get direct
input from the headset including contact %uality, raw electrode input, and the connection %uality. 1ew
Emo*tates are constantly created by the EmoEngine which represent recognied actions such as facial
e$pressions, changed emotional status, and detected thoughts and can be %ueried through the Emo*tate
handle. Hirst the EmoEngine handle to %uery for new Emo*tates and the Emo*tate handle used in
[31] van Rossum, Guido. Python/C API Reference Manual. Python Software Foundation, 21 February 2008. <http://docs.python.org/release/2.5.2/api/api.html>. Accessed 14 April 2010.
determining what was detected are allocated using the Emotiv API:
EmoEngineEventHandle eEvent = EE_EmoEngineEventCreate();
EmoStateHandle eState = EE_EmoStateCreate();
From here, the EmoEngine handle is queried to retrieve the most recent event using:
EE_EngineGetNextEvent(eEvent);
This should be polled 10-15 times per second to ensure real-time reactions to the user's thoughts and actions.
Next, the program determines the event type returned by the EmoEngine. There are three categories of event types: hardware-related events, new EmoState events, and suite-related events.[32] If the event
represents a new EmoState, the code retrieves the EmoState, the user's ID number, and the time-stamp of the event so that the event can be decoded:
EE_Event_t eventType = EE_EmoEngineEventGetType(eEvent);
...
if (eventType == EE_EmoStateUpdated) {
    EE_EmoEngineEventGetEmoState(eEvent, eState);
    const float timestamp = ES_GetTimeFromStart(eState);
    decodeState(userID, eState, timestamp);
}
In decoding the EmoState, I look for eight possible events. First, I check whether the headset was disconnected or reconnected. If the headset was disconnected, I suspend all activity until a reconnect EmoState is published. I originally had the program terminate if the headset disconnected; however, I decided to alter the scheme to allow the user to disconnect and reconnect without having to restart the program each time, since I often had trouble maintaining a lengthy connection with the headset. I used the ES_GetWirelessSignalStatus(eState) and ES_GetHeadsetOn(eState) calls to determine whether or not the headset was still connected. If the event was not related to the headset disconnecting/reconnecting, I check to see if the event was a blink using the ES_ExpressivIsBlink(eState) call. I maintain a global variable that keeps track of when the last blink was recorded, and whenever a new blink is detected I compare it with the time-stamp of the new
[32] User Manual, p. 38.
blink to determine whether the user double-blinked. If so, I have the Scribbler take a picture by calling the Myro Python commands:
PyRun_SimpleString("show(takePicture())");
PyRun_SimpleString("stop()");
It is necessary to call the stop() command so that the picture frame returned by the Python interpreter is refreshed.
If the event was neither a blink nor a signal that the headset had disconnected or reconnected, I look for three possible cognitive thoughts: Push, Turn Left, and Turn Right. To do this, I cast the EmoState as a CognitivAction, get the specific integer value of the action, and get the power of the thought by utilizing the Cognitiv Suite:
EE_CognitivAction_t actionType = ES_CognitivGetCurrentAction(eState);
float actionPower = ES_CognitivGetCurrentActionPower(eState);
int power = static_cast<int>(actionPower * 100.0f);
int action = static_cast<int>(actionType);
From here, the integer value of the action can be compared against the CognitivAction enumerator defined in the EmoStateDLL.h file (Table 3). Once I have decoded which thought sparked the EmoState, I send the appropriate call to the Scribbler (Push → Move Forward, Turn Left → Turn Left, Turn Right → Turn Right). If the thought is not a Push, Turn Left, or Turn Right thought, I ignore it. I initially experimented with using the power of the thought as an input to the power of the robotic action; however, I found that this control scheme was too difficult to use, and it ended up being far more intuitive to use fixed values for turning and moving forward no matter the thought-power. This allowed the user to concentrate on the thoughts alone and not additionally worry about how "hard" to think the thoughts.
Action                     Value
Neutral                    0x0001
Push                       0x0002
Pull                       0x0004
Lift                       0x0008
Drop                       0x0010
Left                       0x0020
Right                      0x0040
Rotate Left                0x0080
Rotate Right               0x0100
Rotate Clockwise           0x0200
Rotate Counterclockwise    0x0400
Rotate Forward             0x0800
Rotate Backward            0x1000
Disappear                  0x2000

Table 3: Predefined CognitivAction Enumerator Values
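Using these enumerator values, the fixed-speed dispatch described above can be sketched in Python; the command strings and speed/duration arguments are illustrative stand-ins for the actual Myro calls:

```python
# Map decoded CognitivAction values (Table 3) to fixed-speed Scribbler
# commands. Speeds/durations here are illustrative assumptions.
COG_PUSH, COG_LEFT, COG_RIGHT = 0x0002, 0x0020, 0x0040

def command_for_action(action):
    """Return the Myro command string for an action, or None to ignore."""
    return {
        COG_PUSH:  "forward(0.5, 1)",    # Push -> move forward
        COG_LEFT:  "turnLeft(0.5, 1)",   # Turn Left thought
        COG_RIGHT: "turnRight(0.5, 1)",  # Turn Right thought
    }.get(action)  # Neutral, Pull, Lift, etc. are ignored
```

The fixed arguments reflect the design choice above: the same speed is sent no matter the thought-power.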
6.4 Modifications
One of the first problems I recognized was the disparity between the rate of input EmoStates and the time it takes the Scribbler to perform actions. The internal sampling rate in the EPOC headset is 2048Hz. This is filtered to remove artifacts and alias frequencies and then down-sampled to approximately 128Hz.[33] Any given motion input to the Scribbler over Bluetooth takes approximately 2 seconds to execute, while picture-taking takes slightly longer as it has to capture and send data back. My initial tests of the control scheme failed to account for this time differential between input sampling and output execution and consequently failed to offer a usable interface. Instead of the Scribbler responding to my thoughts, EmoStates and corresponding actions for the Scribbler almost instantly queued up while the Scribbler was still executing the first command, and the
[33] Emotiv Research Plus SDK Headset Specifications. <http://www.emotiv.com/ResearchPlus.html>.
result was a robot that seemed to make random actions as it executed thoughts that had taken place seconds or even minutes earlier instead of responding to current thoughts. To solve this problem, I introduced a sampling variable so as to decode only one in every ten input EmoStates. The rationale behind this variable is that the 128Hz input rate is so fast that inputs like a Push thought or even an eye blink create numerous EmoStates. Using the sampling variable, I filter out the extra states that really correspond to a single event, choosing a sample rate small enough that it still captures events which send more than 10 input EmoStates while sending only one command to the Scribbler instead of queuing up 10. This system worked much better, and even had the added bonus of filtering out "noise" thoughts when the headset detected a push or turn thought for only a fraction of a second. Despite this, the sampling variable had the unintended consequence of making it possible to miss a headset disconnect or reconnect, as these events create only one EmoState each. To solve this problem, I moved the check of the sample variable: I settled on decoding every single EmoState, so that I could catch the disconnect/reconnect, while still filtering thought and blink EmoStates with the sample variable to ensure the Scribbler acts on current thoughts.
One other modification I made after this scheme was created was to add an additional operation mode. I decided to add this as it would often be useful to be able to send more than three commands to the Scribbler. However, if I wanted to take the same approach as before and hook up thoughts to robot actions, I had to contend with the added difficulty of recognizing an additional thought for every new action I wanted to add. To solve this problem, I created an additional mode which remaps the same three input thoughts to different outputs in the robot. This is hugely beneficial, as it does not increase the difficulty of recognizing thoughts and does not require the user to train additional thoughts, thus giving double the usability with only one additional input. This additional input is raising the eyebrows, which toggles between the original and the new mode. I decided on utilizing the raising of eyebrows as a toggle as it is very easily trained and accurately recognized by the headset.
The new mode maps Push to the Scribbler moving forward until it detects a wall, Turn Left to rotating approximately 90° to the left, and Turn Right to rotating approximately 90° to the right. Having the Scribbler move forward until it detects a wall is accomplished by sending a loop to the Python interpreter which has the Scribbler continually use the Fluke board to check whether there is an obstacle in front of it, moving forward only while no object is detected:
string command = "while(getObstacle(1) < 1050): forward(.5, 1)";
...
PyRun_SimpleString(command.c_str());
Upon seeing how successful this was, I also added a check to the initial move-forward command so that when the user commands the Scribbler to go forward into a wall, the robot instead gives feedback to the user explaining that there is an object in the way. The addition of more modes is certainly possible and is an excellent way of adding functionality without the cost of recognizing and learning new thoughts. In the end, it was completely feasible to control the Scribbler robot using the EPOC headset, proving the viability of EEG-based BCI technology.
7. Blink Detection
Having explored the Device/Application Control and the Feedback portions of the BCI Design Pattern by creating my Emotiv-Scribbler control scheme, I next decided to explore the Pre-Processing, Feature Extraction, and Classification of EEG data by analyzing eye blinks (Figure 2). The analysis of eye blinks is useful in BCI development for two reasons: eye blinks can be used as control inputs (as in my control scheme), and if they are not, they must be filtered out lest they corrupt the useful EEG data. I furthermore decided on eye blinks as they are immediately recognizable in the AF3 channel (Figure 13). Focusing on only one channel allowed me to immediately reduce the amount of input data by a factor of 14, since I could discount the other 13 input channels.
Figure 13: A Blink Event Highlighted in the AF3 Channel
Reducing the size of the data set is the primary focus of pre-processing and feature extraction, whose goal is to discard extraneous and noisy data while preserving the data that can best differentiate between classes. However, before I could do this I first needed a data set, and thus recorded twenty ten-second clips, ten of which I blinked during and ten of which I didn't. I then exported the clips to CSV format so that I could load them in MATLAB. These recordings produced a lot of data because, in addition to the 14 EEG channels capturing electrode data at 128Hz, the headset also records gyroscope data, battery data, packet information, etc.; each 10-second clip ended up having roughly 36,000 data points, and combined I recorded 720,000 data points (Figure 14).
Figure 14: One 10-second Segment with Blinks and Noise
The first step of my feature extraction was to use just the AF3 channel, where blinks are clearly visible. I was curious as to how simply extracting this channel would work for blink classification and decided to run the 20 test clips through a neural net to see how it would fare. The classification, using MATLAB's nprtool to create a two-layer feedforward neural network with backpropagation, obtained only a 65% accuracy (Figure 15). The reason for this became apparent when I looked at a plot of all the AF3 channel data (Figure 16). It is clear that, though there is a certain pattern to blinks, the neural net was thrown off because the blinks were not normalized with respect to time. The neural net treated time as an attribute, and thus might not classify together two samples that both contain blinks but where the blinks occur at different times. Time is correlated to blinks with respect to how long the blink takes, and thus how wide the blink spike will be; however, the time at which the blink occurs is not a usable attribute.
Figure 15: Initial Neural Net Results showing both a Confusion Matrix and Receiver Operating Characteristic (ROC) Plot
Figure 16: Twenty 10-second Recordings of AF3 Data
To solve this problem, I decided to further reduce the amount of data the neural net worked with, along with normalizing any blinks found. Recognizing that blinks correlate to spikes in EEG data, I scanned each 10-second clip looking for the largest spikes. I found that blinks typically were represented by a surge into the 4500 to 4800 μvolt range over approximately .59 seconds, followed by a characteristic dip of around 50 μvolts over approximately .20 seconds (Figure 17). This pattern was very clear and easily distinguishable from a non-blink state; I first noticed it when applying unsupervised K-Means clustering to detect naturally occurring patterns in the data (Figure 18).
Figure 17: 1.6 Second Recording Showing a Blink Pattern (blue) with Non-Blink Data (green) for Comparison
Figure 18: Unsupervised K-Means Clustering Exposes Naturally Occurring Blink Patterns (2 Clusters Left, 3 Clusters Right)
I used this information to further filter each of the 20 inputs down to 1.6-second segments, each of which highlighted the maximum spike of the original ten-second segment. This normalized the blinks by having each blink start at roughly the same time and additionally filtered out noise that was unrelated to the blinks, creating data that was much easier for a neural net to distinguish (Figure 19). Using these inputs, neural net accuracy improved to 100%; however, I wanted to see whether this system truly recognized blinks or was over-trained on the input data. Therefore, I recorded five more segments, 3 with blinks and 2 without, followed the same pre-processing and feature-extraction steps, and fed the data to the neural net. The neural net accurately predicted all of these new inputs even though it had not been trained upon them, showing that it truly was extendable and actually recognizing blink patterns. These are very promising results that support the feasibility of utilizing a neural net to classify blinks; however, it would be best to obtain a larger sample size to accurately test the classification performance of this scheme.
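The 1.6-second windowing step can be sketched as follows (128Hz sampling as before; centering the window on the peak is an illustrative simplification of aligning each blink's start):

```python
def blink_window(samples, rate=128, length_s=1.6):
    """Cut a fixed-length window around the largest spike in a clip,
    so every blink lands at roughly the same position in its segment.
    Centering on the peak is an illustrative choice; the thesis aligns
    windows so blinks start at roughly the same time.
    """
    n = int(length_s * rate)  # 204 samples at 128 Hz
    peak = max(range(len(samples)), key=lambda i: samples[i])
    start = max(0, min(peak - n // 2, len(samples) - n))
    return samples[start:start + n]
```

Every clip, blink or not, passes through the same windowing, so the neural net sees inputs of identical length with any spike at a consistent offset.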
Figure 19: Fully Processed and Normalized Input to the Neural Network
8. Conclusions
Having explored all phases of EEG-based BCI construction and implementation, I can safely say that it is a feasible system that will likely only improve and become more widespread in the future. The biggest flaw I encountered in the technology was interference with the wireless headset, which often had difficulty connecting and staying connected; however, Emotiv has informed me that this is likely a hardware problem with the headset and will be sending me a new one. I did my research using a Beta version of the EPOC headset, so it is possible that the Consumer version they will send me will address this problem.

The system I constructed was largely a success, as I was able to create a system whereby I could control a robot with my thoughts, and I further created accurate blink-recognizing software. Furthermore, my system showcases the amazing possibilities of BCIs in aiding the disabled. For instance, a person who could move only their head could certainly use my system to accurately control a motorized wheelchair using their thoughts. In addition, had they a computer built into the headset, they could easily switch modes by raising their eyebrows and then use their thoughts as an input to the computer, using the same thoughts that had moved the wheelchair to control the mouse and double-blinking to click. An alternative would be to keep the thoughts controlling the wheelchair while utilizing the gyroscope in the headset to control the mouse, enabling the user to have simultaneous control of the wheelchair and computer. Further research can certainly lead to direct implementation of such systems and can explore the recognition of thoughts beyond those included in the Emotiv API. In the end, this research demonstrates just the tip of the iceberg of what is truly the limitless potential of EEG-BCI systems.
9. Acknowledgments
I would like to thank the Computer Science Department at Boston College for purchasing the Emotiv Headset and Research SDK, without which my research would have been impossible. Furthermore, I extend my gratitude to the Emotiv Team, who have developed a great cost-effective product for anyone looking to get into EEG research and who have answered many of my questions on their closely monitored forums. Finally, I am indebted to Robert Signorile, who served as my adviser and consistently offered his insightful input and enthusiasm for the project.