Hardware Acceleration of EEG-Based Emotion Classification Systems: A Comprehensive Survey

IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS, VOL. 15, NO. 3, JUNE 2021
Abstract—Recent years have witnessed a growing interest in EEG-based wearable classifiers of emotions, which could enable the real-time monitoring of patients suffering from neurological disorders such as Amyotrophic Lateral Sclerosis (ALS), Autism Spectrum Disorder (ASD), or Alzheimer's. The hope is that such wearable emotion classifiers would facilitate the patients' social integration and lead to improved healthcare outcomes for them and their loved ones. Yet in spite of their direct relevance to neuro-medicine, the hardware platforms for emotion classification have yet to fill up some important gaps in their various approaches to emotion classification in a healthcare context. In this paper, we present the first hardware-focused critical review of EEG-based wearable classifiers of emotions and survey their implementation perspectives, their algorithmic foundations, and their feature extraction methodologies. We further provide a neuroscience-based analysis of current hardware accelerators of emotion classifiers and use it to map out several research opportunities, including multi-modal hardware platforms, accelerators with tightly-coupled cores operating robustly in the near/supra-threshold region, and pre-processing libraries for universal EEG-based datasets.

Index Terms—Emotion detection and classification, EEG, hardware acceleration, machine learning, monitoring of neurological disorders.

Manuscript received March 31, 2021; revised May 28, 2021; accepted June 6, 2021. Date of publication June 14, 2021; date of current version August 17, 2021. This work was partially supported by the Khalifa University Center for Cyber Physical Systems (C2PS) and by the German Research Foundation (DFG, Deutsche Forschungsgemeinschaft) as part of Germany's Excellence Strategy - EXC 2050/1 - Project ID 390696704 - Cluster of Excellence "Centre for Tactile Internet with Human-in-the-Loop" (CeTI) of Technische Universität Dresden. This paper was recommended by Associate Editor Prof. Chul Kim. (Corresponding author: Hector A. Gonzalez.)

Hector A. Gonzalez, Richard George, and Sebastian Höppner are with the Chair for Highly-Parallel VLSI-Systems and Neuromorphic Circuits, Technische Universität Dresden, 01062 Dresden, Germany (e-mail: hector.gonzalez@tu-dresden.de; richard_miru.george@tu-dresden.de; sebastian.hoeppner@tu-dresden.de).

Javier Acevedo is with the Deutsche Telekom Chair of Communication Networks, Technische Universität Dresden, 01187 Dresden, Germany (e-mail: javier.acevedo@tu-dresden.de).

Christian Mayr is with the Chair for Highly-Parallel VLSI-Systems and Neuromorphic Circuits, Technische Universität Dresden, 01062 Dresden, Germany, and also with the Centre for Tactile Internet (CeTI) with Human-in-the-Loop, Cluster of Excellence, Technische Universität Dresden, 01187 Dresden, Germany (e-mail: christian.mayr@tu-dresden.de).

Jerald Yoo is with the Department of Electrical and Computer Engineering, National University of Singapore, Singapore 117583, Singapore, and also with the N.1 Institute for Health, Singapore 117456, Singapore (e-mail: jyoo@nus.edu.sg).

Frank H.P. Fitzek is with the Deutsche Telekom Chair of Communication Networks, Technische Universität Dresden, 01187 Dresden, Germany, and also with the Centre for Tactile Internet (CeTI) with Human-in-the-Loop, Cluster of Excellence, Technische Universität Dresden, 01187 Dresden, Germany (e-mail: frank.fitzek@tu-dresden.de).

Shahzad Muzaffar and Ibrahim M. Elfadel are with the Department of Electrical Engineering and Computer Science, and the Center for Cyber Physical Systems (C2PS), Khalifa University, Abu Dhabi 127788, United Arab Emirates (e-mail: shahzad.muzaffar@ku.ac.ae; ibrahim.elfadel@ku.ac.ae).

Color versions of one or more figures in this article are available at https://doi.org/10.1109/TBCAS.2021.3089132.

Digital Object Identifier 10.1109/TBCAS.2021.3089132

I. INTRODUCTION

EMOTION classification using EEG features is a promising path to achieving social integration of patients suffering from neurological disorders that involve locked-in emotional states. In contrast to face-based emotion detectors whose accuracy depends on external muscular responses, EEG-based classifiers provide a non-intrusive window into the fluctuations of the internal organismic subsystems, which are measurable despite the presence of a paralytic condition. Recent years have witnessed sustained efforts to design EEG-based software and hardware classifiers for emotions. The focus of this paper is to critically survey the hardware classifiers as they are the linchpins of wearable, robust, real-time medical devices with small footprint and long battery life that aim at improving the healthcare outcomes for patients with neurological disorders. Inspired by the state-of-the-art software solutions, the paper will establish a common framework to analyze the emerging hardware ones and will use it to identify unexplored paths for novel hardware designs. The paper will further bring a Neuroscience perspective to bear on the existing and emerging engineering approaches to the hardware acceleration of emotion classifiers.

To the best of our knowledge, this is the first extensive hardware-focused review of EEG-based emotion classification systems. Most of the prior research related to EEG-based emotion detection has focused on analyzing multi-modal contributions [1], surveying generic EEG-based Deep Learning approaches [2], reviewing the available transfer learning mechanisms for general EEG classification tasks [3], and analyzing existing solutions [4]–[6], all from a software perspective, which unfortunately excludes the hard challenges of designing a robust, wearable hardware emotion classifier under severe constraints of area, power and stringent specifications on accuracy and reliability. A recent survey [7] with focus on hardware design for biomedical applications discusses the area of emotion detection in the context of "AI-based EEG processing algorithm design"
1932-4545 © 2021 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission.
See https://www.ieee.org/publications/rights/index.html for more information.
Authorized licensed use limited to: PSG COLLEGE OF TECHNOLOGY. Downloaded on July 04,2024 at 16:39:10 UTC from IEEE Xplore. Restrictions apply.
GONZALEZ et al.: HARDWARE ACCELERATION OF EEG-BASED EMOTION CLASSIFICATION SYSTEMS: A COMPREHENSIVE SURVEY 413
414 IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS, VOL. 15, NO. 3, JUNE 2021
TABLE I
FEATURE EXTRACTION METHODS IN SW AND HW
STFT: Short-time Fourier transform. HOC: Higher Order Crossings. DE: Differential Entropy. CSP: Common Spatial Pattern.
HFD: Higuchi Fractal Dimension. AR: Autoregressive model. FBCSP: Filter Bank Common Spatial Pattern. DCT: Discrete Cosine Transform.
FD: Fractal Dimension. GP: Grassberger-Procaccia algorithm. MFCC: Mel-frequency Cepstral Coefficients. PCA: Principal Component Analysis.
ARRC: Autoregression Reflection Coefficients. ASP: Asymmetric Spatial Pattern. BDGLS: Broad Dynamical Graph Learning System.
CS: Compressed Sensing. DTF: Directed Transfer Function. EMD: Empirical Mode Decomposition. GCB-Net: Graph Convolutional Broad Network.
HAF: Hybrid Adaptive filtering. HHS: Hilbert-Huang Spectrum. HOS: Higher Order Spectral analysis. LBP: Local Binary Patterns.
LFCC: Linear frequency Cepstral Coefficients. LPP: Late Positive Potential. MSCE: Magnitude Squared Coherence Estimate. D.† : Distance.
NEE: Narrow band energy event. NTSPP: Neural Time series Prediction Pre-processing. SPWV: Smoothed Pseudo-Wigner-Ville distribution.
with the standing assumptions being the availability of abundant hardware resources and off-line operation. The lack of an early hardware perspective may be easily inferred from the high computational complexity of the feature extraction methods and algorithms of the software approaches listed in Tables I and II. Since 2018, a growing trend in hardware-focused approaches has been noticed, as is clear from the black curve in Fig. 1. This trend has helped in highlighting the many challenges and the upside potential of wearable devices for the EEG-based emotion classification problem. Such devices have the potential to enable near real-time, robust, clinically relevant emotion classification under very small form factors and for extended periods of operation in a medical environment. The near real-time operation could improve healthcare outcomes and facilitate the social re-integration of patients suffering from neurological diseases such as Alzheimer's, Amyotrophic Lateral Sclerosis (ALS), and Autism Disorder (AD). It is one of the major aims of this paper to give a detailed account of the potential of the hardware approaches in contrast to the software ones, while identifying the opportunities and the unexplored paths that can guide future research and development in this important biomedical domain.

III. EEG-BASED HARDWARE SOLUTIONS

In this section, the hardware solutions to EEG-based emotion detection are compared in terms of wearability, pre-processing techniques, and feature extraction methods. In Section IV, they will be compared in terms of their classification algorithms, while in Section V, they will be compared based on hardware implementation and verification metrics, including power, area, and latency. The main criterion for inclusion in the hardware survey list is for the publication to report on an online inference implemented in either FPGA or ASIC, regardless of the classifier used. Two hardware contributions [9], [10], which are part of the
TABLE II
ALGORITHMS IN SW AND HW PLATFORMS
AI-VAE: Attribute Invariance - Variational Auto-encoder. BLS: Broad Learning System. CCA: Canonical Correlation Analysis. DA: Discriminative Analysis.
CNN: Convolutional Neural Networks. ConvLSTM: Convolutional Long Short-Term Memory (LSTM). DBN: Deep Belief Network. DT: Decision Trees.
FCM: Fuzzy Cognitive Map. FCN: Fully Connected Network. GAN: Generative Adversarial Network. GCNN: Graph Convolutional Neural Network.
GLCM: Grey Level Co-occurrence Matrices. SVM: Support Vector Machines. GRU: Gated Recurrent Unit. HMM: Hidden Markov Models.
KNN: K-nearest neighbors algorithm. MLP: Multi-layer Perceptron. NB: Naive Bayes. PNN: Probabilistic Neural Network. RF: Random Forest.
RNN: Recurrent Neural Network. SAE: Sparse Auto-encoder. GELM: Graph Regularized Extreme Learning Machine.
‡: [30], [41], [42], [50], [102], [126], [127], [129], [131], [149], [187]–[191].
¶: [44], [45], [53], [56], [58], [62], [64], [68], [70]–[72], [74], [76], [78], [79], [81], [89], [91]–[94], [96], [99], [103], [104], [109], [110], [114], [115], [118], [120], [122], [124], [133]–[136], [138], [140], [144], [148], [192]–[195].
Ψ: [60], [65], [69], [73], [77], [88], [95], [108], [123], [132], [196].
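At inference time, several of the classical algorithms listed in Table II (SVM, DA, single-layer perceptrons) reduce to a weighted sum of the feature vector followed by a threshold, which is part of what makes them attractive for low-footprint hardware. A minimal, illustrative sketch; the weights and features below are made-up values, not taken from any surveyed design:

```python
import numpy as np

def linear_svm_predict(features, weights, bias):
    """Binary decision of a trained linear classifier: sign(w . x + b).

    In hardware this is one dot product (a chain of MACs) plus a
    comparison, which explains the popularity of SVM-style engines
    in low-footprint designs.
    """
    score = float(np.dot(weights, features) + bias)
    return 1 if score >= 0.0 else 0  # 1 = positive class (e.g., high valence)

# Made-up 3-element feature vector (e.g., asymmetry indices) and weights.
w = np.array([0.8, -0.5, 0.3])
x = np.array([1.2, 0.4, -0.7])
label = linear_svm_predict(x, w, bias=-0.1)
```

The non-linear (kernel) SVM variants used by some software approaches add a kernel evaluation per support vector, which is why hardware implementations tend to prefer the linear form.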
A. Wearability Index

Fig. 2. High-level block diagram of a generic EEG-based hardware classifier.

The high-level block diagram of a complete EEG-based hardware emotion classifier is presented in Fig. 2. The violet color shows the offline processing flow used to train the classifier, whereas the purple color highlights the components most amenable to a hardware implementation (FPGA or ASIC) as a standalone online inference. Depending on the number of online blocks, a wearability index is assigned to each implementation. Whereas this index is based on the level of hardware integration, it is used as a wearability metric because hardware integration is an essential requirement for any standalone wearable System-on-Chip (SoC) solution. A maximum index of
TABLE III
FIGURE OF MERIT FOR THE EEG-BASED CLASSIFIERS OF EMOTIONS IN HARDWARE
VAM: Valence/Arousal Model. : Accuracy for 3 classes. LGE: Logic Gate Equivalent. | : Full CNN system. ∦ : Normalized to 180 nm.
HOC: Higher Order Crossings. SK: Skewness. : Normalized energy per Class during inference defined in equation. Nclasses : Number of classes.
four is given in case the pre-processing, feature extraction (FE), training, and inference blocks are all supported in hardware. A colored symbol representing each of the hardware implementations reported in the prior art is used to highlight the supported blocks. As per the survey selection criterion, all the hardware classifiers have in common the online inference block. The articles of [14] (ISCAS'19), [15] (CICC'20), [16] (ISCAS_b'20), and [17] (TBioCAS'20) are the only ones where the pre-processing and feature extraction blocks are implemented in hardware, whereas the remaining two studies outsource these components to the offline processing. This is clearly an advantage for [14]–[16] and [17], which brings them closer to a standalone, independent solution with an index of 3 out of 4. Although [11] (JETCAS'19) is missing an on-board feature extraction component, resulting in a wearability index of 2 out of 4, it is the only implementation that provides on-board training, which is essential for enabling on-the-fly tuning of the classifier. The implementation reported in [12] (ISCAS_a'20) is focused entirely on the proof of concept of a low-footprint classifier to the exclusion of other features, which resulted in an index of 1 out of 4. All the surveyed hardware platforms share a common supervised training approach for the offline processing blocks.

B. Pre-Processing

The signals recorded via the EEG electrodes are susceptible to external interference, which requires the implementation of filtering techniques to remove artefacts. EEG-based classifiers that do not remove such artefacts introduce biases into their classifications and should therefore implement methods to compensate for such biases.

The Pre-processing blocks in Fig. 2, whether operating on-board or offline, represent the signal processing used in the hardware approaches to condition the EEG data. Some datasets [18] provide EEG data where certain artefacts have been removed. While this may ease offline learning and verification, it is the actual hardware implementation of the pre-processing block that provides the more complete and realistic assessment of the hardware resources required to achieve a fully wearable solution for EEG-based emotion classification.

As indicated in Fig. 2, the HW platforms of [11] (JETCAS'19) and [12] (ISCAS_a'20) do not include on-board pre-processing. Both platforms rely on off-chip components to condition the EEG data, which unfortunately prevents a detailed assessment of the complete HW resources required to achieve a full HW solution. Additionally, [11] (JETCAS'19) uses a rather simple 5 to 45 Hz band-pass filter to remove artefacts, but such a filter does not remove all of them. As summarized in Fig. 3, the spectrum of the EEG signal spans the 0 to 100 Hz range and overlaps with the spectra of non-physiological [19] and physiological [20]–[22] artefacts that are mingled with the EEG signal. The non-physiological category of artefacts is composed of electrode pop (Ep), cable movement (Cable), incorrect placement of the reference electrode, and electrical interference (AC). The physiological category of artefacts corresponds to myogenic (EMG) signals
Fig. 4. Robustness of the classifier against Eye blinking ([15], Fig. 9).
Fig. 3. EEG artefacts and their frequency bands.
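As a software illustration of the kind of 5–45 Hz band-pass conditioning reported for [11], the sketch below applies an idealized frequency-domain mask. This is a behavioral stand-in, not the hardware filter of [11], and the sampling rate is an assumption:

```python
import numpy as np

def bandpass_fft(signal, fs, low_hz=5.0, high_hz=45.0):
    """Idealized band-pass: zero all FFT bins outside [low_hz, high_hz].

    Suppresses out-of-band artefacts (e.g., 50/60 Hz line interference)
    but, like any band-pass stage, leaves in-band artefacts untouched.
    """
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[(freqs < low_hz) | (freqs > high_hz)] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

fs = 128.0                       # assumed sampling rate
t = np.arange(0, 2.0, 1.0 / fs)
# 10 Hz in-band component plus a 50 Hz interference component
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)
y = bandpass_fft(x, fs)          # the 50 Hz component is removed
```

A hardware implementation would instead use an IIR or FIR structure with a finite transition band, so attenuation near the band edges is gradual rather than the ideal brick-wall behavior sketched here.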
Fig. 6. Scalp plots with frequency activity for different emotion labels in the DEAP dataset [18] ([11], Fig. 3).

Fig. 7. (a) Accuracy plot for time-based methods. (b) Area and power of the time-based methods. The legend in (a) applies to both (a) and (b). ([17], Table I).
whereas the second one is called Inter Hemispheric Power Asymmetry (IHPA), and it is given by

IHPA = |PSD_Left − PSD_Right|   (4)

The asymmetry index approaches of [14] (ISCAS'19) and [17] (TBioCAS'20) are from the same research group and use the same indices as in Eq. (3) and (4), but are implemented differently. The design in [14] (ISCAS'19) is based on a scaled version of IHPR with the required hardware division implemented using a customized LookUp Table (LUT) approach, which is more advantageous than using standard IP cores. On the other hand, the approach in [17] (TBioCAS'20) slightly changes the IHPR index into the following logarithmic expression

LIHPR = log2{PSD_Left} − log2{PSD_Right}   (5)

whose hardware implementation is more convenient. Indeed, the LIHPR index reduces the gate count and power consumption by factors of 4.7 and 1.5, respectively, when compared with IHPR. These indices then compose the feature vector passed to the Support Vector Machine classifier (see Section III).

As already mentioned, the five hardware designs applying the STFT rely extensively on asymmetry properties [9]–[11], [16], [117]. As shown by Fig. 8, their reconstruction of the EEG frames is based on combining asymmetrical pairs from the six-electrode configuration using the Differential Asymmetry expression

DASM = Spectrogram_left − Spectrogram_right   (6)

The six spectrograms are reduced to three DASM frames, which are then combined using

Reconstruction = W1·DASM_{F3−F4} + W2·DASM_{Fp1−Fp2} + W3·DASM_{F7−F8}   (7)

where W1, W2, and W3 are pre-defined weights. The Reconstruction in Eq. (7) corresponds to the EEG frame of Fig. 8 and is passed as input to the classifier with the asymmetry information already embedded in its composition.

Besides CSP and asymmetry indices, another set of spatial-based features that has been used exclusively by the software-based EEG emotion detection systems is the Connectivity features, in particular the Directed Transfer Function (DTF). DTF features are causality measures used in the determination of brain connectivity patterns [168], which characterize specific brain states. The DTF describes the causal influence of one EEG channel over another at a particular frequency. Although a hardware implementation of DTF is possible, it is not as simple as that of the asymmetry indices, as it entails the computation of the transfer matrix of a multivariate auto-regressive model, a square root, and a division.

5) Raw Signals: Lastly, the Raw Signals category of EEG features corresponds to a new paradigm that has recently emerged as a result of using Deep Neural Network (DNN) classifiers. Under this paradigm, the classifier is directly applied to the raw EEG signal, with the underlying assumption being that the classifier is strong enough to capture the variations embedded in the different emotional classes. There are currently no hardware approaches using this new paradigm, as it normally requires complex DNNs to compensate for the absence of specific features [127], [131]. As will be discussed in Section III, the hardware solutions to the EEG-based emotion detection problem tend to opt for shallow neural networks to achieve low-footprint implementations. These networks are unlikely to work directly on raw EEG data. Future research on new circuit paradigms, such as near/supra-threshold techniques [169] or adaptive body bias operation [170], [171] in Fully Depleted Silicon on Insulator (FDSOI) [172] technologies (e.g., 22FDX), may potentially enable the HW implementation of raw-EEG-signal DNNs in wearable classifiers.

A possibly more promising approach for the use of raw data may be based on dedicating the initial layers of a deep CNN to feature extraction from raw EEG signals. In [126], coincidence filtering along with Differential Entropy (DE) and PSD are used to find appropriate kernels that will guide the automated feature extraction from raw EEG data in the first layers of a CNN. Such an approach has the advantage of implicitly accounting for spectral and entropy information while processing the raw EEG data for learning and classification.

6) FE Score: As illustrated by Fig. 2, only four of the HW publications include the EFE engine in their on-board implementation. The remaining two use a software interface to provide the classifier with the features already extracted. Such a combined SW/HW platform serves well as a proof of concept but does not allow a realistic assessment of the resources required in a fully wearable solution. No additional score is assigned in the EFE case, as there is no preferred feature extraction method, the major requirement being that its combination with the classifier results in acceptable classification accuracy. Such a combination should also satisfy the HW design specifications on power, area, and latency. For these reasons, we only provide a binary EFE index, with a score of 1 assigned to the HW platforms with on-board feature extraction and a score of 0 assigned to those without. Other hardware-related aspects of feature extraction performance will be addressed in the next section, devoted to hardware-friendly classification methods.

IV. HARDWARE-FRIENDLY CLASSIFICATION

The classification block is the most important component of an EEG-based emotion detection system. It is in charge of transforming the extracted features into useful emotion labels. As presented in Fig. 2, the training of these inference engines for the emotion detection problem is typically based on supervised learning, involves large volumes of data, and is therefore performed offline. Supervised training requires that input-output pairs be passed to the classifier, which implies that the emotion classes for a given EEG input need to be known ahead of time, as is the case for any other supervised learning task. However, in the context of emotion detection, the output labels are obtained through psycho-physical experiments in which participants are asked to categorize their evoked response in a graphical representation of the Valence-Arousal scales [173]. Whereas the graphical representation is quite helpful in lowering the language barriers, there are still subjective and cultural
Fig. 11. Dense network implemented in [15] with time multiplexing ([15], Fig. 6).
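The time-multiplexing idea behind Fig. 11 can be mimicked behaviorally: rather than instantiating one multiplier per weight, a single multiply-accumulate unit is reused across all weights, trading latency for area. The sketch below is illustrative only; the layer sizes and weights are not those of [15]:

```python
import numpy as np

def dense_layer_time_multiplexed(x, W, b):
    """Evaluate y = W @ x + b with a single scalar MAC reused over time.

    Each inner iteration corresponds to one clock cycle of a shared
    multiply-accumulate unit; area is traded for latency, which the
    slow EEG frame rate can tolerate.
    """
    n_out, n_in = W.shape
    y = np.zeros(n_out)
    for i in range(n_out):           # one output neuron at a time
        acc = b[i]
        for j in range(n_in):        # one MAC per "cycle"
            acc += W[i, j] * x[j]    # the single shared multiplier
        y[i] = acc
    return y
```

The result is identical to a fully parallel evaluation (`W @ x + b`); only the schedule differs, which is exactly the area-versus-latency trade-off exploited by the surveyed designs.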
distinction between the NormalizedEnergy metric in (8) and the energy-per-classification metric provided by [15] is that the former considers as Nclasses the total number of classes the classification model supports whereas the latter accounts only for the energy spent during one feed-forward inference. The purpose of the metric in (8) is to provide a fair comparison between the HW implementations of the classification models, as "small" classifiers supporting only two classes will naturally consume less power than a "large" classifier supporting the four valence and arousal classes.

As shown in Table III, the best average power for the ASIC designs is achieved by the 2-channel architecture in [15] (168 μW), followed by the SVM design in [17] (12.7 mW). The next design is the complex CNN in [11], which consumes an average of 29.5 mW during inference. The training power is also reported in [11] but it is not included in the comparison, as this is the only design with on-board training. These power figures are not normalized, yet their power ratings are indicative of the energy efficiency of the designs themselves. Indeed, the largest power consumption is that of [11], which uses the smallest VDD and the smallest technology (28 nm). Its normalization would only widen the gap between the designs since the first two are already in 0.18 μm and are using the same VDD. Note that the power metric is only an indicator and has to be augmented with the number of classes and the latency, which has a direct impact on the energy consumption, and therefore battery life.

The differences in the normalized energy per class for the ASIC designs are also presented in Table III. The design with the smallest normalized energy per class is in fact the complex CNN in [11]. This is because of its extremely small latency. The following designs are the FCN of [15] and the SVM of [17]. In regard to the FPGA designs, the smallest average power (12.7 mW) is that of the SVM in [14], which also has the smallest normalized energy per class (2.54 μJ/class). The next design is the low-resources CNN in [12], [13], which consumes 150 mW and achieves a normalized energy per class of 34.84 μJ/class. The last design in this rating is the CNN of [16], which did not report any power figures.

One last disclaimer regarding the power comparisons is that not all designs in Table III have implemented all the stages of the EEG-based emotion classification system, which, as discussed earlier, will impact the power figures. However, the best power figures amongst the surveyed designs have been achieved by those that have implemented the most complete systems [11], [15], [17], [14].

B. Area

The previous paragraphs have already qualitatively described some of the area-related optimizations performed by the hardware accelerators of the EEG-based emotion classification systems. In this section, more quantitative assessments and comparisons are provided. The area figures have the same caveats as the power figures. For one, it is not possible to directly compare the areas of an ASIC and an FPGA design. One reasonable way to make a comparison is to fetch the area of a single ASIC cell (e.g., NOR gate) from the Library Exchange Format (LEF) files and use it to calculate the Logic Gate Equivalent (LGE), which in turn can be used in conjunction with the FPGA utilization reports to estimate the FPGA area in relation to that of the ASIC one. However, such estimation only leads to approximate area comparisons rather than exact ones, and in this particular case, the cell information from all contributions is unknown, which prevents that comparison.

The area comparisons amongst ASIC designs implemented in different technologies have a similar caveat. However, it can be addressed using normalized metrics. The normalized area is defined as [206]:

Normalized_area = Area / (Technology / 180 nm)^2   (9)

where the reference technology is 0.18 μm as it is the oldest amongst the ones used for EEG-based ASIC emotion classification platforms.

The smallest normalized area (5.4 mm2) among the ASIC designs is achieved by the SVM of [17]. The second-best design is the FCN in [15] with 16 mm2, followed by the complex CNN in [11]. Regarding the FPGA designs, the only one reporting utilization is the low-resource CNN in [12], [13]. Although its figure cannot be compared with those of the ASIC designs, an approximate comparison with the complex CNN [11] is partially possible, as the latter reports an LGE metric. A direct inspection of Table I shows the large area differences due to the logic cells used. This is expected as the low-resource CNN is even smaller than multiple state-of-the-art FPGA implementations of other CNN models [12], [13]. However, it is important to note that the complex CNN design of [11] provides a full system whereas that of [12], [13] is focused on compressing the CNN classifier. Additionally, the complex CNN [11] provides a more flexible and complete solution, including on-board training support.

An additional observation regarding the top designs is that all of them use time-division multiplexing to reduce the area, taking advantage of the relaxed timing constraints in EEG-based emotion detection. Another observation regarding the second-best normalized area of [15] is that despite using only 2 channels, the use of a dense 4-layer network has resulted in an area that is three times larger than that of the SVM design, which has achieved the smallest area.

C. Latency

The latency is the time taken by the system to produce a classification result once an EEG epoch is received. The EEG classification latency is particularly important in the context of a wearable EEG-based emotion classification device, where the objective is to track, within timing constraints of human interaction (150 ms), the patient's emotional state. This is in stark contrast with EEG-based seizure identification devices, in which longer detection time windows are more effective because the relevant episodes are sparsely distributed over time [207]. Emotional experiences are short and highly dynamic, and patients under emotional stimuli are unlikely to manifest a consistent response over large time intervals [61]. Tracking such dynamic emotional states requires a short classification latency.
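The technology normalization of Eq. (9) is straightforward to capture as a helper; the function name and argument units (area in mm², technology node in nm) are our illustrative choices:

```python
def normalized_area(area_mm2, tech_nm):
    """Eq. (9): scale an ASIC area to the 0.18 um reference node.

    Dividing by (tech_nm / 180)^2 undoes the quadratic shrink of
    feature size so that designs fabricated in different technology
    nodes can be compared on a common footing.
    """
    return area_mm2 / (tech_nm / 180.0) ** 2
```

For example, 1 mm² in a 90 nm process normalizes to 4 mm² at the 180 nm reference, since the linear shrink factor of 2 enters quadratically.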
Fig. 13. Training paradigms and the number of occurrences in the surveyed state of the art [9]–[17], [30], [31], [33], [39]–[42], [44]–[51], [53]–[58], [60]–[84], [88]–[106], [108]–[112], [114]–[119], [121]–[149], [187].

Fig. 16. Experimental setup for [17] (see Fig. 9 therein).

D. Training Paradigms

In contrast to the EEG-based seizure identification problem [207], the EEG-based emotion detection task has the potential of using large volumes of data to fully exploit the capabilities of Deep Neural Networks in generic classifiers. The evoked experiences in the emotion detection problem are short and can provide a sufficient amount of data to train state-of-the-art machine learning algorithms such as Deep Neural Networks. In this section, the hardware contributions are assessed in terms of the training paradigm used as well as the number of training paradigms supported.
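As an illustration of how training paradigms differ at the data-splitting level, the sketch below contrasts a subject-dependent split with a leave-one-subject-out (subject-independent) split, two paradigms commonly reported in the EEG emotion literature. The subject and trial identifiers are hypothetical, and the surveyed designs are not restricted to these two paradigms.

```python
# Sketch of two common EEG training paradigms (illustration only):
# subject-dependent splitting vs. leave-one-subject-out splitting.
from typing import Dict, List, Tuple

def subject_dependent_split(trials: Dict[str, List[int]], holdout: int
                            ) -> Tuple[List[int], List[int]]:
    """Hold out the last `holdout` trials of every subject for testing."""
    train, test = [], []
    for subject_trials in trials.values():
        train.extend(subject_trials[:-holdout])
        test.extend(subject_trials[-holdout:])
    return train, test

def leave_one_subject_out(trials: Dict[str, List[int]], left_out: str
                          ) -> Tuple[List[int], List[int]]:
    """Train on all subjects except `left_out`; test on that subject."""
    train = [t for s, ts in trials.items() if s != left_out for t in ts]
    return train, trials[left_out]

trials = {"s01": [0, 1, 2, 3], "s02": [4, 5, 6, 7]}
print(subject_dependent_split(trials, holdout=1))  # ([0, 1, 2, 4, 5, 6], [3, 7])
print(leave_one_subject_out(trials, "s02"))        # ([0, 1, 2, 3], [4, 5, 6, 7])
```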
GONZALEZ et al.: HARDWARE ACCELERATION OF EEG-BASED EMOTION CLASSIFICATION SYSTEMS: A COMPREHENSIVE SURVEY 427
lobes, with some minor utilization overlap in the posterior and occipital regions. Interestingly, the highest hardware accuracies for the DEAP dataset in Table III were reported by the 2-channel implementation of [15] (CICC'20), in which the two electrodes used were not in the frontal lobe. Rather, they were placed at the junction between the posterior and temporal regions (TP7 and TP8).

Aside from algorithmic differences between the various hardware designs, Table III shows that including a larger number of electrodes does not necessarily result in an accuracy improvement. As mentioned above, the highest accuracies were achieved by designs using 2 [15] (CICC'20) and 6 electrodes [14] (ISCAS'19), [17] (TBioCAS'20). This aspect of electrode count and placement is well documented in the EEG-based SW emotion classifiers [99] and has immediate relevance to the HW designs. Such designs can reduce processing and input connectivity overhead without sacrificing accuracy by using a minimal number of electrodes.

The utilization of a high number of electrodes has a far-reaching impact on hardware implementations, translated into a significant increase in the hardware metrics of area, power, and design complexity. Since these metrics will be accounted for as a matter of course when evaluating the HW designs, no particular score is assigned to the electrode engineering aspects of the EEG-based emotion classification systems. The main recommendation is to carefully select the number and placement of EEG channels, as they impact not only the hardware itself but also the accuracy of the classification system. One possible avenue of further development in EEG-based HW classifiers of emotions is the incorporation of newer, more wearable technologies for EEG signal acquisition [220]. Such technologies are particularly relevant to patients suffering from emotion lock-in disorders that are accompanied by difficulties in maintaining stable head postures.

F. Verification Methodology

In addition to the algorithmic and hardware criteria used to assess the EEG-based hardware designs of Fig. 2, this subsection presents an overview of the experimental settings that have been employed for testing and verification. Setting up the testing and verification environment for the EEG-based hardware emotion classifiers is indeed a challenging task that sets the HW approach apart from the SW approach.

Fig. 15 shows the experimental setup used in [11], which is the one design involving significant porting and interfacing effort. As shown in Fig. 15, a front-end circuit and a micro-controller are attached to the back of the 16-channel EEG cap to acquire and send the data via a Bluetooth module. The Bluetooth packets are received by a laptop, in which the feature extraction engine is implemented. The laptop then forwards the data to the Printed Circuit Board (PCB) hosting the 28-nm CNN chip. The PCB also includes a Bluetooth module, a data conversion stage implemented in an embedded Spartan-3E FPGA, and the possibility to enable the classification either in an on-board Virtex-7 FPGA or in the 28-nm CNN engine. The classification loop is then closed by sending the result back via Bluetooth to the laptop, where a graphical user interface (GUI) displays it. This complete setup, with its complex communication environment, is currently the state of the art in EEG-based hardware for emotion classification, but it is still a few steps away from a low-footprint, wearable device that tightly couples the pre-processing front-end to the EEG sensors on the one hand and to the classifier on the other. However, it is clear that this was not the original intent of the design. The aim was rather to achieve maximum reconfiguration flexibility to test multiple algorithms and explore various design options.

The second experimental setup is from the same research group [16] (ISCAS_b'20), involving an FPGA platform with a less complex but more compact implementation than the ASIC CNN classifier. The platform operates on multi-modal sensors such as EEG, electrocardiogram (ECG) and photoplethysmogram (PPG), with the various elements centralized around two evaluation boards closely connected via a serial peripheral interface (SPI). The primary board hosts a RISC-V core synthesized in a Kintex-7 FPGA, whereas the secondary board hosts the classifier implemented in a Spartan-6 FPGA. The sensory data is sent via Bluetooth from each of the three sensors and acquired on the primary board, where the pre-processing is performed in the RISC-V core. The pre-processed signals are then transmitted via SPI to the secondary board, where the classification takes place. Lastly, the output is transmitted via Bluetooth to an external laptop, which has a GUI to display the classification result. The setup provides an interesting proof of concept of a flexible combination of an embedded core acting as feature extractor with a machine learning (ML) accelerator.

In contrast to the previously described hardware designs, the design in [12] (ISCAS_a'20), which also employs an FPGA as the implementation platform, uses a simpler experimental setup that is dedicated to a proof of concept for a low-footprint CNN accelerator. The design of [12] does not address the issue of benchmarking a holistic HW system to classify emotions. It uses a laptop for the data acquisition, pre-processing, and feature extraction. Upon the extraction of the frequency-based features, the EEG frequency frames are transmitted to the FPGA using a universal asynchronous receiver-transmitter (UART) interface. The FPGA hosting the CNN accelerator generates the classification labels.

Toward achieving full wearability, the designs of [14], [17] present a System-on-Chip (SoC) involving all the stages required for EEG-based emotion detection. Their system acquires the EEG data directly from the electrodes, which entails a larger design effort due to the inclusion of a customized Analog Front-End (AFE). The designs in [14] (ISCAS'19) and [17] (TBioCAS'20) employ one AFE block for each of the 8 EEG channels. The AFE blocks in both designs include a low-noise, capacitively coupled instrumentation amplifier (CCIA) and a programmable gain amplifier (PGA). The fundamental AFE difference between the two designs lies in the number of Analog-to-Digital Converters (ADC) used. The design in [14] (ISCAS'19) uses one ADC per channel, whereas that of [17] (TBioCAS'20) shares one successive-approximation (SAR) ADC between all
TABLE IV
SUMMARY OF SCORES ACCORDING TO THE PROPOSED ASSESSMENT METRICS

channels, as shown in Fig. 16. After the digital data is acquired, both approaches use time division multiplexing for pre-processing and feature extraction to reduce on-chip resources. The features are then passed to the SVM classifier, with both designs using the same SVM algorithm. Lastly, the output of the classifier is sent to an external Bluetooth master, where the accuracy assessments take place.

In addition to the previous two SoC designs, another SoC design, with an FCN classifier and including all the processing stages on-chip, is presented in [15]. It is the one with the smallest footprint and the most efficient utilization of a compact two-electrode sensing interface. As such, it is representative of the state of the art in EEG-based HW emotion classifiers. Its customized AFE uses a Capacitively Coupled Low Noise Amplifier (C2LNA) followed by the second stage of a Continuous-Time, Digitally-Assisted Capacitively Coupled Instrumentation Amplifier (CTDAC2IA). The output of the IA is passed to a low-pass filter with a cut-off frequency of 2 kHz, then digitized using a SAR ADC with 10-bit resolution. One design concern of the two-electrode interface is to reduce the on-resistance of the front end. This is done through a carefully designed parallel-configuration multiplexer that is used to share one AFE between the two channels, thus resulting in more area savings. The acquired digital data is passed to the on-chip feature extraction engine for subsequent use by the fully connected network (FCN) classifier.

As is evident from the above review, the verification setup of EEG-based HW classifiers requires a significant amount of effort, involving the design of reliable communication links, complex mixed-signal front ends, and robust digital data paths. Given that a fully wearable, clinically approved, EEG-based solution is still a few years down the road, all the surveyed designs have received a score of 2 for their experimental setups, except for [12] (ISCAS_a'20), which received a score of 1 due to its streamlined environment with a mandatory external host to pre-process and extract features.

G. Scoring of Surveyed Hardware Designs

In our assessment of the various EEG-based hardware designs, we have considered a number of metrics, including: Wearability, Pre-processing, Feature Extraction, Classification, Online Training support, Normalized Energy, Area, Latency, and number of training types. Each design has received a score in each and every one of these metrics, with the convention that a high score implies a successful design in the particular category the metric is meant to measure. The scores for Area, Latency and Energy range from 1 to 6, except when a specific design has failed to report on one or another of these metrics, in which case the assigned score is zero. In order to ensure a balanced weight distribution between the assessed metrics, a normalized score calculated with the corresponding maximum value is also reported. The summary of all scores is presented in Table IV, in which the best two designs correspond to the CNN chip described in [11], [117], and the SoC implementing the SVM classifier in [17]. However, the normalized score assigns the top place to the SVM-based SoC [17] due to its consistently good performance in all metrics. A more detailed summary including the full technical specifications of the various designs was already shown in Table III.

VI. CHALLENGES AND OPPORTUNITIES

In this section, we highlight two main challenges facing the adoption of clinically-approved EEG-based emotion monitors. The first is the absence of standardized, multi-modal datasets for training, validating and benchmarking the HW emotion classification engines. The second is the interlock not just with the physio-psychological community but also with the neuro-scientific community, where functional studies of brain activities may provide a more solid foundation to the EEG-based platforms for emotion classification. We also highlight a couple of research opportunities. The first one is short-term and is related to leveraging the latest ultra-low-power design techniques for the design of medium-to-high complexity EEG-based HW platforms for emotion classification. The second is to use such platforms as tools in support of neuro-scientific research on neurological disorders, including Alzheimer's, Autism, and Amyotrophic Lateral Sclerosis (ALS).

A. Challenge: EEG Datasets

Comprehensive and high-quality EEG data is a key condition for the robustness of emotion recognition systems. In light of this condition, the available datasets are assessed to provide guidelines for closing the gap between existing EEG-based emotion classifiers and next-generation, clinically-tested, robust classifiers. The EEG-based datasets currently available for emotion classification include the following:
[223], [224], in which only the EEG data is collected without any reference sensor, and those including a multi-modal recording testbed composed of other sensors besides EEG, including ECG, facial video clips, peripheral physiological signals (e.g., EOG, galvanic skin response, etc.), and infrared spectroscopy [18], [113], [150], [222], [225]. Of course, from a modeling and machine-learning viewpoint, the higher the number of sensor modalities, the better, as more robust pre-processing and classification strategies can be explored.

B. Challenge: Neuroscience Interlock

The research we have surveyed so far on the EEG-based emotion classification problem is engineering-focused. Such research is essential if we are to achieve tangible solutions with a potential of being deployed in a clinical context. It is, however, biased toward existing technologies and may not be informed by the latest advances in the neurosciences. In this subsection, we approach the EEG-based emotion classification problem from a neuroscience perspective in the hope of identifying novel approaches toward its solution.

In the early days of affective neuroscience, basic emotions such as anger, fear and disgust were treated as basic affects that arise from the limbic system, comprising structures such as the thalamic nuclei, hippocampus, hypothalamus and the cingulate cortex [227]. This view has been particularly supported by the finding that there are shared facial expressions for these emotions independent of cultural context, as was already postulated by Charles Darwin [228]. Based on the criticism that, subjectively, emotional states seem to be at times diffuse and merge one into another, contemporary dimensional models break down the emotional spectrum into two, sometimes three, elemental components that correspond with well-understood signalling pathways within the brain-stem and cortex and whose combination gives rise to emotional phenomenologies [229].

Evidence from a variety of EEG and fMRI studies [230] points out that within these pathways, cortical structures are crucially involved in the processing of emotions, which opens the possibility of an at least partial electro-physiological investigation through non-invasive methods such as EEG and MEG. In turn, the fMRI evidence provides significant justification for the noninvasive, wearable EEG-based approaches to emotion classification. Indeed, such approaches primarily acquire cortical signals.

Different authors have used a variety of terms for these elemental emotion components, and in this paper, we have made use of the terminology proposed by [229], according to which the term valence describes the continuum between euphoria and dysphoria, thought to originate in the mesolimbic dopamine system and the serotonergic projections from the dorsal Nucleus Raphe to the Ventral Striatum.

An orthogonal component is arousal (also called Approach/Withdrawal or Behavioral Activation, among several other terminologies). Arousal plays an important role in the formation of declarative memory, where it seems to modulate the ability to commit a given situation to memory. As shown in [231] by the use of gustatory stimuli, amygdala activation directly correlated with stimulus intensity rather than its valence. Emotional stimuli activate the reticular formation and amygdala as well [232]. The amygdala activation can be directly correlated with skin conductance response measurements [233].

Of course, the involvement of deeper brain areas in the processing of emotions poses a stiff challenge for their direct, non-invasive electrophysiological observation, since only their cortical correlates can be acquired. However, areas of the limbic system control autonomic nervous system responses that can be quantified by measuring skin impedance, heart rate, and breathing frequency, among other vital signs [234]. In particular, these secondary markers have been used to quantify valence as early as 1993 [235]. Whereas such quantification techniques form a useful approach to the non-invasive monitoring of the limbic system, it has been noted that the secondary responses are significantly influenced by factors besides the subject's emotional state [236].

From an electrophysiological standpoint, it is often overlooked that autonomic responses can in fact be acquired as part of the EEG artefacts and are usually rejected as "noise" in the early feature extraction stages of the EEG signals. The EEG artefacts contain the following autonomic responses:

• Heart Rate: In the EEG signals, cardiac-related artefacts appear in amplitude and frequency ranges that are similar to those due to neuronal activity. They are, however, highly stereotypical and periodic, which facilitates their identification. In order to separate these signals from EEG, methods such as Independent Component Analysis can be employed effectively [237].

• Breathing Rate: It is well known that the respiratory frequency modulates the spectral power of the alpha EEG band [238], which may allow the inference of the breathing frequency and, from the latter, the arousal state of the subject.

• Eye Blinking Rate: Oculomotor activity can be observed on the frontal electrodes as muscle artefacts. Whereas some EEG databases tend to filter out this artefact, it is interesting to observe that the rate of eye blinks correlates with the sympathetic arousal of a subject [239], which may lead to a new secondary marker of emotions in EEG recordings.

• Skin Impedance: Many commercially available EEG setups include an electrode-skin impedance test in order to facilitate the correct placement of wet electrodes. Wearables for continuous skin impedance measurements have also been well researched by the affective computing community [234], [240]. In an EEG-based emotion classification system, they may provide important secondary information that will significantly improve the accuracy of the emotion inference engine.

One typical psycho-physiological paradigm for the investigation of autonomic responses is to "startle" the subject with an unexpected stimulus such as an acoustic burst of white noise. The intensity of the involuntary reaction to the sudden stimulus is modulated by the emotional state of the subject. Even EEG responses to startle stimuli could be considered a form of sensory evoked potential [241] and find use cases in clinical diagnoses of psychological disorders such as Alexithymia and
Schizophrenia [242], [243]. The EEG responses may also be used as a tool to estimate the emotional capacities of patients with neurodegenerative diseases such as Alzheimer's.

Another important paradigm of the neuro-anatomical basis of emotion classification is to augment biomarkers in the frequency-domain representations of EEG signals with the spatial information of cortical activity [244], [245]. The traditional spectral decomposition into alpha, beta, and gamma bands is further analyzed in light of the frontal alpha-band power asymmetry between hemispheres, and the results are used to measure the activated approach/withdrawal responses. Following the 10/20 system, the electrode pairs F3-F4, F6-F5, F8-F7 are investigated and their power spectral densities in the alpha band (8–13 Hz) compared according to the IHPA given in Eq. (5).

Whereas these approaches yield promising results [246], the neurological basis that motivates the selection of electrode pairs in IHPA may need to be investigated further. It is clear that the availability of a hardware emotion classifier that can collect real-time EEG data as a function of electrode placement and selection is bound to facilitate the investigation of such neuroscientific research questions.

In this context, EEG acquisition platforms should offer the capability to log raw signals while enabling reproducible electrode placement. This can be accomplished by using, for instance, optical markers on the electrodes to photographically document any deviation from the clinical 10–20 standard. More will be said on the use of the EEG hardware platform in neuroscience research in Subsection VI-D.

C. Opportunity: Ultra Low-Power Hardware Design

As already mentioned in Section III, robust System-on-Chip (SoC) solutions addressing the EEG-based emotion classification problem opt for simple classifiers to meet a low-power budget. The HW designs we have surveyed use traditional low-power and power-management circuit techniques, and the problem of trading off classifier complexity against more aggressive power reduction strategies is yet to be fully explored. The recent near/supra-threshold circuit design methodologies should enable such trade-off exploration under robust operating conditions. New hardware classifiers of emotions can now exploit near/supra-threshold designs to close the gap between a high-accuracy complex classifier and a stringent power envelope constraint. Although there are extensive research efforts to operate at sub-threshold voltages under process variations [247]–[259], the combination of low-voltage and robustness benefits is best achieved in the near/supra-threshold region.

New mechanisms, such as Adaptive Body Biasing (ABB) in FDSOI technology [172], enable a highly controlled operation in the near/supra-threshold region, as it prevents the worst cases from going deep into the sub-threshold regime. This ensures very reliable and robust designs with high yield at ultra-low voltages. These designs can still achieve reasonable performance while operating at ultra-low voltages. This is in fact the case for forward-biased designs (flipped wells) in near/supra-threshold operation [170], [171]. Alternatively, complex hardware classifiers whose objective is power consumption minimization can actively control performance and leakage by using 22FDX ABB designs in reverse-bias configurations (conventional well) [260]. The latter designs result in extremely efficient implementations with low to medium performance, as required for real-time bio-signal processing, and allow direct SRAM integration within the ultra-low-voltage domain. It is worth noting that the winning design (ZEN) of the 2021 German national competition on "Energy-efficient AI system" in the ASIC category was such a design, in which an ECG classifier based on gated recurrent units (GRU) has been implemented [202]. This ultra-low-voltage design paradigm is very much worth exploring for the classification of EEG signals as well. We believe that it represents a significant opportunity for achieving complex HW emotion classifiers operating in the ultra-low-power domain, with tightly-coupled cores to enable flexible pre-processing and SRAM blocks fully integrated in support of adaptive and subject-specific wearables.

D. Opportunity: Research Platform for EEG-Based Monitoring of Neurological Disorders

The challenge already mentioned in VI-B regarding the interlock with neuro-scientific research can be turned into a significant research opportunity, as the ultra-low-power, near/supra-threshold EEG platforms may be used to enable the social integration and improve the medical treatment of patients suffering from certain neurological disorders. This opportunity is presented in the context of three important disorders: Alzheimer's, Autism, and Amyotrophic Lateral Sclerosis.

1) Alzheimer's Disease: Despite the degenerative nature of the disease, there is enough evidence to support that Alzheimer's patients are still able to identify and process emotions. Multiple studies have been conducted to prove that Alzheimer subjects have a reasonable understanding of emotions [261]–[263]. As presented in [264], Alzheimer's patients in the moderate stage of the disease encounter difficulties only when processing second-order emotion inferences, as these require a cognitive process having a representation in working memory of embedded clauses. Otherwise, the amygdala of the Alzheimer's patient does not deteriorate, and as such, it remains an active agent in emotion processing throughout the disease, even after the patient loses the language capabilities [265].

One attempt to classify emotions using SVM for patients with Alzheimer's is documented in [266], where functional magnetic resonance imaging (fMRI) is used as the brain-computer interface. The classification accuracy for pleasant and unpleasant emotions was reported to be 71% on average, reaching a maximum of 83%, which are the typical accuracies displayed by healthy subjects in state-of-the-art emotion classification datasets. Despite the intrusive nature of fMRI, this research confirms the presence and relevance of brain-related physiological markers in Alzheimer's patients.

2) Amyotrophic Lateral Sclerosis (ALS): Functional magnetic resonance imaging (fMRI) has also been used to examine the brain activity of patients suffering from Amyotrophic Lateral Sclerosis (ALS) under controlled emotion elicitation experiments. An analysis is conducted in [267] on the fMRI data of volunteers diagnosed with ALS while they were being exposed
to visual stimuli from the International Affective Picture System Artefact mitigation strategies sometimes trigger additional
(IAPS) dataset. The two experimental sessions were recorded undesired effects such as the multipolar excursions caused by
with a time lapse of six months. An important conclusion of the application of high-pass filters [273] and that are commonly
the analysis is that the ALS subjects displayed lower arousal mistaken for neural activity [274], [275]. To address these prac-
levels after the six-month gap. This “history effect” is a good tical issues, the EEG-based hardware design must include low-
candidate for further EEG-based research as it could impact the cost but robust pre-processing techniques for “smart” artefact
development of reference data for the emotion classification of mitigation. As shown in Fig. 3, filter-based pre-processing does
ALS patients. not remove all artefacts present in the EEG data. Furthermore,
3) Autism Spectrum Disorder: The convergence of affective ICA-based techniques such as the one proposed in [12] are not
computing and autism has been recognized and explored in [268] entirely hardware friendly. Nor are they sufficient in themselves
and [269]. Early work on autism monitoring proposed multi- to deal with all artefacts. For instance, ICA is not effective in
modal affective sensing technologies to facilitate the socio- removing artefacts due to electrode-specific or sensor-specific
emotional integration of autistic patients [269]. The early de- sources as such sources cannot be suppressed linearly as in blind
tection of Autism Spectrum Disorder (ASD) in newborns is source separation or beam-forming techniques. [276], [277].
a key factor to avoid the irreversible consequences of ASD
during adulthood. However, the lack of an approved medical B. High-Resolution Feature Extraction
procedure has made the early diagnosis of ASD very difficult as
The concept of brainwave bands has been a useful model
the signs and symptoms are very much subject-dependent, and
for extracting frequency-domain features but is not entirely sup-
their severity varies from one patient to another. It is therefore
ported by solid neurophysiological evidence. The average power
important to develop subject-specific, custom neuro-feedback
in a single frequency band is normally assumed to be a represen-
systems to track socio-emotional responses to stimuli in real-
tative measure of the underlying activity. However, such measure
time. Although conventional EEG systems have been vastly used
has proven to be quite coarse, and future frequency-domain brain
for affect recognition in adults and children under controlled en-
activity models should be based on higher frequency resolutions
vironments, they are still short of providing day-to-day monitor-
and more fundamental, non-invasive approaches for capturing
ing because of their high cost, limited wearability, and subject-
brain activity. Besides the high frequency resolution, small fre-
independent training. In [15], a patient-centric, FPGA-based
quency windows may help better capture power excursions over
sensor is proposed that could be integrated into an inexpensive
and low-power EEG headset to furnish seamless signal recording that should facilitate the treatment of ASD patients. In [270], a deep-learning approach based on multi-modal sensing has been proposed to address patient-specific variations in autistic children.

VII. RECOMMENDATIONS FOR FUTURE DIRECTIONS

This article has provided a comprehensive, up-to-date survey of hardware-based classifiers of emotions using EEG signals. The more widely researched software-based classifiers have been used to identify the challenges and opportunities facing the emerging hardware classifiers. In this final section of our survey article, we give recommendations for promising, high-impact research directions in the area of EEG-based wearables for emotion classification. They are concerned with embedded pre-processing, high-resolution feature extraction, HW platforms with tightly-coupled cores, and multi-modal, cross-cultural EEG datasets.

A. Embedded Pre-Processing

As mentioned earlier, removing artefacts is highly desirable in single-mode EEG-based classifiers. Artefacts commonly mask the activity of interest [38]. For example, power-line noise obscures cortical activity around 50 Hz (or 60 Hz), whereas electrode drift interferes with slow cortical potentials. Ocular or myogenic activity can be misinterpreted as cortical activity [271], especially if it correlates with external stimulation [272], which is quite common in emotion-based experiments.

B. High-Resolution Feature Extraction

… the full frequency range. Furthermore, the inclusion of Cross-Frequency Coupling (CFC) features in the EEG-based hardware classifiers of emotions represents an important alternative path to wearable emotion detection, with the potential to improve robustness [155] and to reduce memory footprint by focusing on correlation graphs [157] rather than on the full raw EEG dataset.

Another improvement in the frequency-domain features is the systematic inclusion of spatial asymmetry, as is done in the 10/20 EEG system (see “Spatial Features” in Section IV), and its extension to other EEG electrode patterns, especially those with a minimal number of electrodes.

C. Hardware Platforms With Tightly-Coupled Cores

The hardware designs we have surveyed have all been the subject of several engineering trade-offs, the most important of which trades off inference-engine accuracy and complexity against low power and small footprint. The lightweight SVM classifier of [17] is an extreme example of such a trade-off, especially since its SVM inference engine has very limited margin to adapt to changes in the subject's emotional signaling over time. Such changes are common in patients suffering from neurological diseases whose emotional capabilities are likely to deteriorate in the late stages of the disease [267]. On the other hand, the designs of [16] (ISCAS_b'20) and [11] (JETCAS'19) have successfully shown that some degree of adaptation is achievable with complex classifiers. Despite its modest complexity, the approach in [16] (ISCAS_b'20) presents an interesting combination of an embedded core with an ML accelerator that can provide sufficient margins to implement multiple DNNs with flexible feature extraction.
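The kind of flexible pre-processing and feature extraction that such a programmable core would run can be sketched in floating-point Python before being ported to fixed-point firmware. The sketch below is illustrative only: the 128 Hz sampling rate, 50 Hz mains notch, 0.5 Hz drift high-pass, and band definitions are assumptions chosen for the example, not parameters of the surveyed designs.

```python
import numpy as np
from scipy.signal import butter, iirnotch, sosfiltfilt, tf2sos, welch

FS = 128  # Hz, assumed sampling rate (DEAP-style)

# Sub-bands commonly used for EEG emotion features (edges are illustrative)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def preprocess(x, fs=FS, mains=50.0):
    """Pre-processing a low-power core could run: notch out mains
    interference and high-pass to remove slow electrode drift."""
    sos_notch = tf2sos(*iirnotch(mains, Q=30.0, fs=fs))
    sos_hp = butter(4, 0.5, btype="highpass", fs=fs, output="sos")
    return sosfiltfilt(sos_hp, sosfiltfilt(sos_notch, x))

def band_power_features(x, fs=FS):
    """Compact per-band log-power vector, the kind of feature set that
    would be handed off to an inference accelerator."""
    f, pxx = welch(x, fs=fs, nperseg=2 * fs)
    feats = []
    for lo, hi in BANDS.values():
        m = (f >= lo) & (f < hi)
        feats.append(np.log(np.trapz(pxx[m], f[m])))
    return np.array(feats)

# Usage on synthetic data: 10 s of noise plus a strong 10 Hz (alpha) tone
rng = np.random.default_rng(0)
t = np.arange(10 * FS) / FS
raw = rng.standard_normal(t.size) + 5 * np.sin(2 * np.pi * 10 * t)
feats = band_power_features(preprocess(raw))
print(feats.shape)  # -> (4,)
```

Because only the compact feature vector crosses the core/accelerator boundary, the programmable side can swap filters or band edges per subject without touching the accelerator's DNN weights.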
434 IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS, VOL. 15, NO. 3, JUNE 2021
However, both power and footprint are increased as a result of this added flexibility. One promising approach to improve the trade-off between adaptation and energy efficiency is to compute the DNN inference within the accelerator, while keeping a tightly-coupled core within the low-power domain to achieve flexible pre-processing and feature extraction, as was done in [278]. The programmability of the low-power core should provide the flexibility needed to adapt the pre-processing to subject and environment variations, with the configurable accelerator ensuring short inference times without compromising wearability or power. Another promising direction is to combine such core-based platforms with network weight binarization [279].

D. Comprehensive EEG Datasets and Benchmarks

Unfortunately, the field of EEG-based emotion classification still lacks a comprehensive EEG dataset in which all stimulation modalities are included. Existing datasets use at best a very small number of multimodal signals, with the machine learning engines limited to training and inference with those signals only. Such restrictions on sensing modalities lead to sensor biases that are detrimental to network training. Future datasets for EEG-based emotion classification should have the following characteristics:
1) Multi-modality: The dataset should cover a broader range of multi-modal markers to minimize the impact of sensing biases.
2) Universality: The dataset should be universal in that it should include subjects from various regions, ethnic backgrounds, occupations, and health conditions, in order to account for inter-cultural differences and similarities in response to emotion stimuli.

The dataset that comes closest to the above two requirements is [18], which unfortunately still needs a well-established protocol to eliminate distortions during emotion elicitation. One such distortion is the impact of trial timing and duration on the collected data. Indeed, [150] has shown that over an extended period of trial time, the emotional state of a person may vary, introducing bias in the collected data. In the 60 seconds allocated to data collection, several emotions may arise in response to the same stimulus, thus introducing ambiguity as to the correlation between the stimulus and the collected data. Further recommendations for dataset improvements include:
1) Use of sudden stimuli: The use of sudden stimuli, such as white-noise bursts, can help reduce the bias of more common stimuli such as song snippets, film clips, and video games. The latter stimuli are known to induce emotions with a history effect, in that they depend on the subject's emotions prior to the initiation of the stimuli.
2) Consistent labeling: Methods and protocols should be developed and adopted so that EEG data is labeled consistently across subjects and throughout the EEG collection time frame, not just for the collection session itself but for the entire duration of dataset construction. This consistency is especially important around the intermediate regions of the Valence-Arousal plane, where the value of the data is the least. Consistency will increase the value of such data and make it more usable for training and validation.
3) Pre-processing library: The multi-modal data, which would include EEG, visual, aural, temperature, pulse rate, skin impedance, and breathing rate information, needs to be pre-processed to ensure consistency and reduce undesired interferences. So far, such pre-processing has been fragmented, with each research group employing its own algorithms to pre-process data and extract features from the EEG sub-band signals. Algorithmic and pre-processing differences result in variations in the training data. The multi-modal, universal datasets we are advocating should come with their own standardized libraries of pre-processing tools so as to ensure input uniformity across all classification platforms.
4) Standardized benchmarking: It is important that the community adopt a set of universal benchmarks for evaluating emotion classification engines. Such benchmarks will be essential for validating the clinical use of EEG-based wearable devices for emotion classification.

VIII. CONCLUSIONS

In this paper, we have presented the very first critical review of EEG-based wearable classifiers of human emotions. We have surveyed their algorithmic foundations, their feature extraction methodologies, and their implementation details. We have also assessed their performance based on a variety of metrics that we have condensed into a single score assigned to all the HW accelerators included in this survey. In addition, we have provided a neuroscience-based analysis of existing hardware accelerators of emotion classifiers and used it to map out several research opportunities, including multi-modal hardware platforms, accelerators with tightly-coupled cores, and pre-processing libraries for universal EEG-based datasets.

REFERENCES

[1] Z. He et al., “Advances in multimodal emotion recognition based on brain-computer interfaces,” Brain Sci., vol. 687, no. 10, pp. 1–29, 2020.
[2] A. Craik, Y. He, and J. L. Contreras-Vidal, “Deep learning for electroencephalogram (EEG) classification tasks: A review,” J. Neural Eng., vol. 16, no. 3, Apr. 2019, Art. no. 031001.
[3] K. Zhang et al., “Application of transfer learning in EEG decoding based on brain-computer interfaces: A review,” Sensors, vol. 6321, no. 21, pp. 1–25, Jul. 2020.
[4] S. M. Alarcão and M. J. Fonseca, “Emotions recognition using EEG signals: A survey,” IEEE Trans. Affect. Comput., vol. 10, no. 3, pp. 374–393, Jul.-Sep. 2019.
[5] Y. Zhao, W. Zhao, C. Jin, and Z. Chen, “A review on EEG based emotion classification,” in Proc. IEEE 4th Adv. Inf. Technol., Electron. Automat. Control Conf., vol. 1, 2019, pp. 1959–1963.
[6] F. Lotte et al., “A review of classification algorithms for EEG-based brain-computer interfaces: A 10 year update,” J. Neural Eng., vol. 15, no. 3, Feb. 2018, Art. no. 031005.
[7] Y. Wei et al., “A review of algorithm hardware design for AI-based biomedical applications,” IEEE Trans. Biomed. Circuits Syst., vol. 14, no. 2, pp. 145–163, Apr. 2020.
[8] A. M. Brouwer, T. O. Zander, J. B. van Erp, J. E. Korteling, and A. W. Bronkhorst, “Using neurophysiological signals that reflect cognitive or affective state: Six recommendations to avoid common pitfalls,” Front. Neurosci., vol. 9, no. 136, pp. 1–11, 2015.
GONZALEZ et al.: HARDWARE ACCELERATION OF EEG-BASED EMOTION CLASSIFICATION SYSTEMS: A COMPREHENSIVE SURVEY 435
[9] Y. Huang, K. Wang, Y. Ho, C. He, and W. Fang, “An edge AI system-on-chip design with customized convolutional-neural-network architecture for real-time EEG-based affective computing system,” in Proc. IEEE Biomed. Circuits Syst. Conf., 2019, pp. 1–4.
[10] K.-Y. Wang, Y.-D. Huang, Y.-L. Ho, and W.-C. Fang, “A customized convolutional neural network design using improved softmax layer for real-time human emotion recognition,” in Proc. IEEE Int. Conf. Artif. Intell. Circuits Syst., 2019, pp. 102–106.
[11] W. Fang, K. Wang, N. Fahier, Y. Ho, and Y. Huang, “Development and validation of an EEG-based real-time emotion recognition system using edge AI computing platform with convolutional neural network system-on-chip design,” IEEE Trans. Emerg. Sel. Topics Circuits Syst., vol. 9, no. 4, pp. 645–657, Dec. 2019.
[12] H. A. Gonzalez, S. Muzaffar, J. Yoo, and I. A. M. Elfadel, “An inference hardware accelerator for EEG-based emotion detection,” in Proc. IEEE Int. Symp. Circuits Syst. (ISCAS), 2020, pp. 1–5, doi: 10.1109/ISCAS45731.2020.9180728.
[13] H. A. Gonzalez, S. Muzaffar, J. Yoo, and I. M. Elfadel, “BioCNN: A hardware inference engine for EEG-based emotion detection,” IEEE Access, vol. 8, pp. 140896–140914, 2020, doi: 10.1109/ACCESS.2020.3012900.
[14] A. R. Aslam and M. A. B. Altaf, “An 8 channel patient specific neuromorphic processor for the early screening of autistic children through emotion detection,” in Proc. IEEE Int. Symp. Circuits Syst., 2019, pp. 1–5.
[15] A. R. Aslam, T. Iqbal, M. Aftab, W. Saadeh, and M. A. B. Altaf, “A 10.13uJ/classification 2-channel deep neural network-based SoC for emotion detection of autistic children,” in Proc. IEEE Custom Integr. Circuits Conf., 2020, pp. 1–4.
[16] C. J. Yang, N. Fahier, C. Y. He, W. C. Li, and W. C. Fang, “An AI-edge platform with multimodal wearable physiological signals monitoring sensors for affective computing applications,” in Proc. IEEE Int. Symp. Circuits Syst., 2020, pp. 1–5.
[17] A. R. Aslam and M. A. B. Altaf, “An on-chip processor for chronic neurological disorders assistance using negative affectivity classification,” IEEE Trans. Biomed. Circuits Syst., vol. 14, no. 4, pp. 838–851, Aug. 2020.
[18] S. Koelstra et al., “DEAP: A database for emotion analysis using physiological signals,” IEEE Trans. Affect. Comput., vol. 3, no. 1, pp. 18–31, Jan.–Mar. 2012.
[19] M. Sazgar and M. G. Young, “EEG artifacts,” in Absolute Epilepsy EEG Rotation Review. Berlin, Germany: Springer, 2019, pp. 149–162.
[20] J. W. Clark Jr., “The origin of biopotentials,” Med. Instrum.: Appl. Des., vol. 3, pp. 121–182, 1998.
[21] P. L. Nunez et al., Electric Fields of the Brain: The Neurophysics of EEG. London, U.K.: Oxford Univ. Press, 2006.
[22] L. Sörnmo and P. Laguna, Bioelectrical Signal Processing in Cardiac and Neurological Applications. New York, NY, USA: Academic Press, 2005, vol. 8.
[23] B. Grundlehner and V. Mihajlović, “Ambulatory EEG monitoring,” in Encyclopedia Biomedical Engineering, R. Narayan, Ed., Oxford, U.K.: Elsevier, 2019, pp. 223–239.
[24] D. B. Stone, G. Tamburro, P. Fiedler, J. Haueisen, and S. Comani, “Automatic removal of physiological artifacts in EEG: The optimized fingerprint method for sports science applications,” Front. Hum. Neurosci., vol. 12, no. 96, pp. 1–15, 2018.
[25] M. K. Islam, A. Rastegarnia, and Z. Yang, “Methods for artifact detection and removal from scalp EEG: A review,” Neurophysiologie Clinique/Clinical Neurophysiol., vol. 46, no. 4/5, pp. 287–305, 2016.
[26] C. Marque, C. Bisch, R. Dantas, S. Elayoubi, V. Brosse, and C. Perot, “Adaptive filtering for ECG rejection from surface EMG recordings,” J. Electromyogr. Kinesiol., vol. 15, no. 3, pp. 310–315, 2005.
[27] K. T. Sweeney, T. E. Ward, and S. F. McLoone, “Artifact removal in physiological signals-practices and possibilities,” IEEE Trans. Inf. Technol. Biomed., vol. 16, no. 3, pp. 488–500, May 2012.
[28] J. A. Urigüen and B. Garcia-Zapirain, “EEG artifact removal-state-of-the-art and guidelines,” J. Neural Eng., vol. 12, no. 3, 2015, Art. no. 031001.
[29] G. L. Wallstrom, R. E. Kass, A. Miller, J. F. Cohn, and N. A. Fox, “Automatic correction of ocular artifacts in the EEG: A comparison of regression-based and component-based methods,” Int. J. Psychophysiol., vol. 53, no. 2, pp. 105–119, 2004.
[30] D. Wu, X. Han, Z. Yang, and R. Wang, “Exploiting transfer learning for emotion recognition under cloud-edge-client collaborations,” IEEE J. Sel. Areas Commun., vol. 39, no. 2, pp. 479–490, Feb. 2021.
[31] X. Tao, Z. Chen, M. Xu, and J. Lu, “Rebuffering optimization for DASH via pricing and EEG-based QoE modeling,” IEEE J. Sel. Areas Commun., vol. 37, no. 7, pp. 1549–1565, Jul. 2019.
[32] T. Nguyen, T. Zhou, T. Potter, L. Zou, and Y. Zhang, “The cortical network of emotion regulation: Insights from advanced EEG-fMRI integration analysis,” IEEE Trans. Med. Imag., vol. 38, no. 10, pp. 2423–2433, Oct. 2019.
[33] P. Li et al., “EEG based emotion recognition by combining functional connectivity network and local activations,” IEEE Trans. Biomed. Eng., vol. 66, no. 10, pp. 2869–2881, Oct. 2019.
[34] R. M. Mehmood, H. J. Yang, and S. H. Kim, “Children emotion regulation: Development of neural marker by investigating human brain signals,” IEEE Trans. Instrum. Meas., vol. 70, 2021, Art. no. 4000411.
[35] Y. H. Chen, S. W. Chen, and M. X. Wei, “A VLSI implementation of independent component analysis for biomedical signal separation using CORDIC engine,” IEEE Trans. Biomed. Circuits Syst., vol. 14, no. 2, pp. 373–381, Apr. 2020.
[36] Y. H. Chen and S. P. Wang, “Low-cost implementation of independent component analysis for biomedical signal separation using very-large-scale integration,” IEEE Trans. Circuits Syst. II: Exp. Briefs, vol. 67, no. 12, pp. 3437–3441, Dec. 2020.
[37] D. Yao, “A method to standardize a reference of scalp EEG recordings to a point at infinity,” Physiol. Meas., vol. 22, no. 4, pp. 693–711, 2001.
[38] A. de Cheveigné and D. Arzounian, “Robust detrending, rereferencing, outlier detection, and inpainting for multichannel data,” NeuroImage, vol. 172, pp. 903–912, 2018.
[39] T. Song, W. Zheng, P. Song, and Z. Cui, “EEG emotion recognition using dynamical graph convolutional neural networks,” IEEE Trans. Affect. Comput., vol. 11, no. 3, pp. 532–541, Jul.-Sep. 2020.
[40] B. H. Kim and S. Jo, “Deep physiological affect network for the recognition of human emotions,” IEEE Trans. Affect. Comput., vol. 11, no. 2, pp. 230–243, Apr.-Jun. 2020.
[41] J. Chen, P. Zhang, Z. Mao, Y. Huang, D. Jiang, and Y. Zhang, “Accurate EEG-based emotion recognition on combined features using deep convolutional neural networks,” IEEE Access, vol. 7, pp. 44317–44328, Mar. 2019.
[42] H. A. Gonzalez, J. Yoo, and I. M. Elfadel, “EEG-based emotion detection using unsupervised transfer learning,” in Proc. 41st Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., 2019, pp. 694–697, doi: 10.1109/EMBC.2019.8857248.
[43] A. H. Setianingrum and B. S. Budhi, “Model prediction of psychoanalysis trend of radical emotional aggressiveness using EEG and GLCM-SVM method,” in Proc. 6th Int. Conf. Cyber IT Serv. Manage., 2018, pp. 1–7.
[44] M. Alsolamy and A. Fattouh, “Emotion estimation from EEG signals during listening to Quran using PSD features,” in Proc. 7th Int. Conf. Comput. Sci. Inf. Technol., 2016, pp. 1–5.
[45] S. Liu, J. Tong, M. Xu, J. Yang, H. Qi, and D. Ming, “Improve the generalization of emotional classifiers across time by using training samples from different days,” in Proc. 38th Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., 2016, pp. 841–844.
[46] M. Stikic, R. R. Johnson, V. Tan, and C. Berka, “EEG-based classification of positive and negative affective states,” Brain-Comput. Interfaces, vol. 1, no. 2, pp. 99–112, 2014.
[47] H. Ullah, M. Uzair, A. Mahmood, M. Ullah, S. D. Khan, and F. A. Cheikh, “Internal emotion classification using EEG signal with sparse discriminative ensemble,” IEEE Access, vol. 7, pp. 40144–40153, Mar. 2019.
[48] Z. Wang, Y. Tong, and X. Heng, “Phase-locking value based graph convolutional neural networks for emotion recognition,” IEEE Access, vol. 7, pp. 93711–93722, Jul. 2019.
[49] W. Zhao, Z. Zhao, and C. Li, “Discriminative-CCA promoted by EEG signals for physiological-based emotion recognition,” in Proc. 1st Asian Conf. Affect. Comput. Intell. Interaction, 2018, pp. 1–6.
[50] S. Moon, S. Jang, and J. Lee, “Convolutional neural network approach for EEG-based emotion recognition using brain connectivity and its spatial information,” in Proc. IEEE Int. Conf. Acoust., Speech Signal Process., 2018, pp. 2556–2560.
[51] Y. Li, J. Huang, H. Zhou, and N. Zhong, “Human emotion recognition with electroencephalographic multidimensional features by hybrid deep neural networks,” Appl. Sci., vol. 7, no. 10, Art. no. 1060, pp. 1–20, 2017.
[52] H. Xu and K. N. Plataniotis, “Affective states classification using EEG and semi-supervised deep learning approaches,” in Proc. IEEE 18th Int. Workshop Multimedia Signal Process., 2016, pp. 1–6.
[53] J. Atkinson and D. Campos, “Improving BCI-based emotion recognition by combining EEG feature selection and kernel classifiers,” Expert Syst. Appl., vol. 47, pp. 35–41, 2016.
[54] A. M. Bhatti, M. Majid, S. M. Anwar, and B. Khan, “Human emotion recognition and analysis in response to audio music using brain signals,” Comput. Hum. Behav., vol. 65, pp. 267–275, 2016.
[55] E. Kroupi, J.-M. Vesin, and T. Ebrahimi, “Subject-independent odor pleasantness classification using brain and peripheral signals,” IEEE Trans. Affect. Comput., vol. 7, no. 4, pp. 422–434, Oct.–Dec. 2016.
[56] R. M. Mehmood and H. J. Lee, “A novel feature extraction method based on late positive potential for emotion recognition in human brain signal patterns,” Comput. Elect. Eng., vol. 53, pp. 444–457, 2016.
[57] N. Thammasan, K.-i. Fukui, and M. Numao, “Application of deep belief networks in EEG-based dynamic music-emotion recognition,” in Proc. Int. Joint Conf. Neural Netw., 2016, pp. 881–888.
[58] Z. Lan, O. Sourina, L. Wang, and Y. Liu, “Real-time EEG-based emotion monitoring using stable features,” Vis. Comput., vol. 32, no. 3, pp. 347–358, 2016.
[59] S. Jirayucharoensak, S. Pan-Ngum, and P. Israsena, “EEG-based emotion recognition using deep learning network with principal component based covariate shift adaptation,” Sci. World J., vol. 2014, pp. 1–10, 2014.
[60] S. Hatamikia, K. Maghooli, and A. M. Nasrabadi, “The emotion recognition system based on autoregressive model and sequential forward feature selection of electroencephalogram signals,” J. Med. Signals Sensors, vol. 4, no. 3, pp. 194–201, 2014.
[61] Y.-Y. Lee and S. Hsieh, “Classifying different emotional states by means of EEG-based functional connectivity patterns,” PLoS One, vol. 9, no. 4, 2014, Art. no. e95415.
[62] Y.-P. Lin, Y.-H. Yang, and T.-P. Jung, “Fusion of electroencephalographic dynamics and musical contents for estimating emotional responses in music listening,” Front. Neurosci., vol. 8, no. 94, pp. 1–14, 2014.
[63] X.-W. Wang, D. Nie, and B.-L. Lu, “Emotional state classification from EEG data using machine learning approach,” Neurocomputing, vol. 129, pp. 94–106, 2014.
[64] Y.-H. Liu, C.-T. Wu, Y.-H. Kao, and Y.-T. Chen, “Single-trial EEG-based emotion recognition using kernel eigen-emotion pattern and adaptive support vector machine,” in Proc. 35th Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., 2013, pp. 4306–4309.
[65] M. Murugappan and S. Murugappan, “Human emotion recognition through short time electroencephalogram (EEG) signals using fast Fourier transform (FFT),” in Proc. IEEE 9th Int. Colloq. Signal Process. Appl., 2013, pp. 289–294.
[66] H. J. Yoon and S. Y. Chung, “EEG-based emotion estimation using Bayesian weighted-log-posterior function and perceptron convergence algorithm,” Comput. Biol. Med., vol. 43, no. 12, pp. 2230–2237, 2013.
[67] T. D. Pham and D. Tran, “Emotion recognition using the Emotiv EPOC device,” in Proc. Int. Conf. Neural Inf. Process., 2012, pp. 394–399.
[68] D. Nie, X.-W. Wang, L.-C. Shi, and B.-L. Lu, “EEG-based emotion recognition during watching movies,” in Proc. 5th Int. IEEE/EMBS Conf. Neural Eng., 2011, pp. 667–670.
[69] R. Khosrowabadi, H. C. Quek, A. Wahab, and K. K. Ang, “EEG-based emotion recognition using self-organizing map for boundary detection,” in Proc. 20th Int. Conf. Pattern Recognit., 2010, pp. 4242–4245.
[70] Y.-P. Lin et al., “EEG-based emotion recognition in music listening,” IEEE Trans. Biomed. Eng., vol. 57, no. 7, pp. 1798–1806, Jul. 2010.
[71] M. Mikhail, K. El-Ayat, J. A. Coan, and J. J. Allen, “Using minimal number of electrodes for emotion detection using brain signals produced from a new elicitation technique,” Int. J. Auton. Adaptive Commun. Syst., vol. 6, no. 1, pp. 80–97, 2013.
[72] M. Soleymani, J. Lichtenauer, T. Pun, and M. Pantic, “A multimodal database for affect recognition and implicit tagging,” IEEE Trans. Affect. Comput., vol. 3, no. 1, pp. 42–55, Jan.-Mar. 2012.
[73] T. F. Bastos-Filho, A. Ferreira, A. C. Atencio, S. Arjunan, and D. Kumar, “Evaluation of feature extraction techniques in emotional state recognition,” in Proc. 4th Int. Conf. Intell. Hum. Comput. Interaction, 2012, pp. 1–6.
[74] R.-N. Duan, X.-W. Wang, and B.-L. Lu, “EEG-based emotion recognition in listening music by using support vector machine and linear dynamic system,” in Proc. Int. Conf. Neural Inf. Process., 2012, pp. 468–475.
[75] S. Nasehi, H. Pourghassem, and I. Isfahan, “An optimal EEG-based emotion recognition algorithm using Gabor,” WSEAS Trans. Signal Process., vol. 3, no. 8, pp. 87–99, 2012.
[76] M. Soleymani, M. Pantic, and T. Pun, “Multimodal emotion recognition in response to videos (extended abstract),” in Proc. Int. Conf. Affect. Comput. Intell. Interaction, 2015, pp. 491–497.
[77] L. Brown, B. Grundlehner, and J. Penders, “Towards wireless emotional valence detection from EEG,” in Proc. Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., 2011, pp. 2188–2191.
[78] X.-W. Wang, D. Nie, and B.-L. Lu, “EEG-based emotion recognition using frequency domain features and support vector machines,” in Proc. Int. Conf. Neural Inf. Process., 2011, pp. 734–743.
[79] S. Koelstra et al., “Single trial classification of EEG and peripheral physiological signals for recognition of emotions induced by music videos,” in Proc. Int. Conf. Brain Inform., 2010, pp. 89–100.
[80] H. Hu, Z. Zhu, Z. Gao, and R. Zheng, “Analysis on biosignal characteristics to evaluate road rage of younger drivers: A driving simulator study,” in Proc. IEEE Intell. Veh. Symp., 2018, pp. 156–161.
[81] P. Arnau-González, S. Katsigiannis, M. Arevalillo-Herráez, and N. Ramzan, “Image-evoked affect and its impact on EEG-based biometrics,” in Proc. IEEE Int. Conf. Image Process., 2019, pp. 2591–2595.
[82] V. Govindarajan, K. Driggs-Campbell, and R. Bajcsy, “Affective driver state monitoring for personalized, adaptive ADAS,” in Proc. 21st Int. Conf. Intell. Transp. Syst., 2018, pp. 1017–1022.
[83] M. Alex, U. Tariq, F. Al-Shargie, H. S. Mir, and H. Al Nashash, “Discrimination of genuine and acted emotional expressions using EEG signal and machine learning,” IEEE Access, vol. 8, pp. 191080–191089, Oct. 2020.
[84] V. Gupta, M. D. Chopda, and R. B. Pachori, “Cross-subject emotion recognition using flexible analytic wavelet transform from EEG signals,” IEEE Sensors J., vol. 19, no. 6, pp. 2266–2274, Mar. 2019.
[85] K. Guo, H. Yu, R. Chai, H. Nguyen, and S. W. Su, “A hybrid physiological approach of emotional reaction detection using combined FCM and SVM classifier,” in Proc. 41st Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., 2019, pp. 7088–7091.
[86] M. Mohammadpour, S. M. R. Hashemi, and N. Houshmand, “Classification of EEG-based emotion for BCI applications,” in Proc. Artif. Intell. Robot., 2017, pp. 127–131.
[87] K. Guo, H. Candra, H. Yu, H. Li, H. T. Nguyen, and S. W. Su, “EEG-based emotion classification using innovative features and combined SVM and HMM classifier,” in Proc. 39th Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., 2017, pp. 489–492.
[88] Z. Mohammadi, J. Frounchi, and M. Amiri, “Wavelet-based emotion recognition system using EEG signal,” Neural Comput. Appl., vol. 28, no. 8, pp. 1985–1990, 2017.
[89] H. Candra, M. Yuwono, R. Chai, H. T. Nguyen, and S. Su, “EEG emotion recognition using reduced channel wavelet entropy and average wavelet coefficient features with normal mutual information method,” in Proc. 39th Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., 2017, pp. 463–466.
[90] X. Li, D. Song, P. Zhang, G. Yu, Y. Hou, and B. Hu, “Emotion recognition from multi-channel EEG data through convolutional recurrent neural network,” in Proc. IEEE Int. Conf. Bioinf. Biomed., 2016, pp. 352–359.
[91] M. Ali, A. H. Mosa, F. Al Machot, and K. Kyamakya, “EEG-based emotion recognition approach for e-healthcare applications,” in Proc. 8th Int. Conf. Ubiquitous Future Netw., 2016, pp. 946–950.
[92] N. Jatupaiboon, S. Pan-Ngum, and P. Israsena, “Subject-dependent and subject-independent emotion classification using unimodal and multimodal physiological signals,” J. Med. Imag. Health Inform., vol. 5, no. 5, pp. 1020–1027, 2015.
[93] N. Jatupaiboon, S. Pan-ngum, and P. Israsena, “Emotion classification using minimal EEG channels and frequency bands,” in Proc. 10th Int. Joint Conf. Comput. Sci. Softw. Eng., 2013, pp. 21–24.
[94] A. E. Vijayan, D. Sen, and A. Sudheer, “EEG-based emotion recognition using statistical measures and auto-regressive modeling,” in Proc. IEEE Int. Conf. Comput. Intell. Commun. Technol., 2015, pp. 587–591.
[95] H. Xu and K. N. Plataniotis, “Affect recognition using EEG signal,” in Proc. IEEE 14th Int. Workshop Multimedia Signal Process., 2012, pp. 299–304.
[96] S. A. Hosseini and M. B. Naghibi-Sistani, “Emotion recognition method using entropy analysis of EEG signals,” Int. J. Image, Graph. Signal Process., vol. 3, no. 5, pp. 30–36, 2011.
[97] M. Murugappan, M. R. B. M. Juhari, R. Nagarajan, and S. Yaacob, “An investigation on visual and audiovisual stimulus based emotion recognition using EEG,” Int. J. Med. Eng. Inform., vol. 1, no. 3, pp. 342–356, 2009.
[98] S. Issa, Q. Peng, and X. You, “Emotion classification using EEG brain signals and the broad learning system,” IEEE Trans. Syst., Man Cybern. Syst., to be published, doi: 10.1109/TSMC.2020.2969686.
[99] S. K. Khare, V. Bajaj, and G. Sinha, “Adaptive tunable Q wavelet transform-based emotion identification,” IEEE Trans. Instrum. Meas., vol. 69, no. 12, pp. 9609–9617, Dec. 2020.
[100] H.-C. Yang and C.-C. Lee, “An attribute-invariant variational learning for emotion recognition using physiology,” in Proc. ICASSP IEEE Int. Conf. Acoust., Speech Signal Process., 2019, pp. 1184–1188.
[101] H. Chao and L. Dong, “Emotion recognition using three-dimensional feature and convolutional neural network from multichannel EEG signals,” IEEE Sensors J., vol. 21, no. 2, pp. 2024–2034, Jan. 2021.
[102] Z. Gao, X. Wang, Y. Yang, Y. Li, K. Ma, and G. Chen, “A channel-fused dense convolutional network for EEG-based emotion recognition,” IEEE Trans. Cogn. Develop. Syst., to be published, doi: 10.1109/TCDS.2020.2976112.
[103] T. Matlovič, “Emotion detection using EPOC EEG device,” in Proc. Student Res. Conf. Inform. Inf. Technol., 2016, pp. 1–6.
[104] A. T. Sohaib, S. Qureshi, J. Hagelbäck, O. Hilborn, and P. Jerčić, “Evaluating classifiers for emotion recognition using EEG,” in Proc. Int. Conf. Augmented Cogn., 2013, pp. 492–501.
[105] G. Chanel, C. Rebetez, M. Bétrancourt, and T. Pun, “Emotion assessment from physiological signals for adaptation of game difficulty,” IEEE Trans. Syst., Man Cybern.-Part A: Syst. Humans, vol. 41, no. 6, pp. 1052–1063, Nov. 2011.
[106] A. Yazdani, J.-S. Lee, and T. Ebrahimi, “Implicit emotional tagging of multimedia using EEG signals and brain computer interface,” in Proc. 1st SIGMM Workshop Social Media, 2009, pp. 81–88.
[107] J. Huang, X. Xu, and T. Zhang, “Emotion classification using deep neural networks and emotional patches,” in Proc. IEEE Int. Conf. Bioinf. Biomed., 2017, pp. 958–962.
[108] J. Chen, B. Hu, P. Moore, X. Zhang, and X. Ma, “Electroencephalogram-based emotion assessment system using ontology and data mining techniques,” Appl. Soft Comput., vol. 30, pp. 663–674, 2015.
[109] Y. Liu and O. Sourina, “Real-time subject-dependent EEG-based emotion recognition algorithm,” in Proc. Trans. Comput. Sci. XXIII, 2014, pp. 199–223.
[110] Y. Liu and O. Sourina, “EEG-based valence level recognition for real-time applications,” in Proc. Int. Conf. Cyberworlds, 2012, pp. 53–60.
[111] T.-Y. Chai, S. Woo, M. Rizon, and C. Tan, “Classification of human emotions from EEG signals using statistical features and neural network,” Int. J. Integr. Eng., vol. 1, pp. 71–79, 2010.
[112] Z. Khalili and M. H. Moradi, “Emotion recognition system using brain and peripheral signals: Using correlation dimension to improve the results of EEG,” in Proc. Int. Joint Conf. Neural Netw., 2009, pp. 1571–1575.
[113] W. Zheng and B. Lu, “Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks,” IEEE Trans. Auton. Mental Develop., vol. 7, no. 3, pp. 162–175, Sep. 2015.
[114] R.-N. Duan, J.-Y. Zhu, and B.-L. Lu, “Differential entropy feature for EEG-based emotion classification,” in Proc. 6th Int. IEEE/EMBS Conf. Neural Eng., 2013, pp. 81–84.
[115] Y.-P. Lin, C.-H. Wang, T.-L. Wu, S.-K. Jeng, and J.-H. Chen, “EEG-based emotion recognition in music listening: A comparison of schemes for multiclass support vector machine,” in Proc. IEEE Int. Conf. Acoust., Speech Signal Process., 2009, pp. 489–492.
[116] A. D. Bigirimana, N. Siddique, and D. Coyle, “Emotion-inducing imagery versus motor imagery for a brain-computer interface,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 28, no. 4, pp. 850–859, Apr. 2020.
[117] K. Wang, Y. Ho, Y. Huang, and W. Fang, “Design of intelligent EEG system for human emotion recognition with convolutional neural network,” in Proc. IEEE Int. Conf. Artif. Intell. Circuits Syst., 2019, pp. 142–145.
[118] A. Jalilifard, E. B. Pizzolato, and M. K. Islam, “Emotion classification using single-channel scalp-EEG recording,” in Proc. 38th Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., 2016, pp. 845–849.
[126] Z. Gao, Y. Li, Y. Yang, N. Dong, X. Yang, and C. Grebogi, “A coincidence-filtering-based approach for CNNs in EEG-based recognition,” IEEE Trans. Ind. Inform., vol. 16, no. 11, pp. 7159–7167, Nov. 2020.
[127] Y. Wang, Z. Huang, B. McCane, and P. Neo, “EmotioNet: A 3-D convolutional neural network for EEG-based emotion recognition,” in Proc. Int. Joint Conf. Neural Netw., 2018, pp. 1–7.
[128] S. Alhagry, A. Aly, and R. El-Khoribi, “Emotion recognition based on EEG using LSTM recurrent neural network,” Int. J. Adv. Comput. Sci. Appl., vol. 8, no. 10, pp. 355–358, 2017.
[129] M. Yanagimoto and C. Sugimoto, “Recognition of persisting emotional valence from EEG using convolutional neural networks,” in Proc. IEEE 9th Int. Workshop Comput. Intell. Appl., 2016, pp. 27–32.
[130] Y. Gao, H. J. Lee, and R. M. Mehmood, “Deep learning of EEG signals for emotion recognition,” in Proc. IEEE Int. Conf. Multimedia Expo Workshops, 2015, pp. 1–5.
[131] E. S. Salama, R. A. El-Khoribi, M. E. Shoman, and M. A. W. Shalaby, “EEG-based emotion recognition using 3D convolutional neural networks,” Int. J. Adv. Comput. Sci. Appl., vol. 9, no. 8, pp. 329–337, 2018.
[132] N. Liu, Y. Fang, L. Li, L. Hou, F. Yang, and Y. Guo, “Multiple feature fusion for automatic emotion recognition using EEG signals,” in Proc. IEEE Int. Conf. Acoust., Speech Signal Process., 2018, pp. 896–900.
[133] N. Kumar, K. Khaund, and S. M. Hazarika, “Bispectral analysis of EEG for emotion recognition,” Procedia Comput. Sci., vol. 84, pp. 31–35, 2016.
[134] X. Jie, R. Cao, and L. Li, “Emotion recognition based on the sample entropy of EEG,” Bio-Med. Mater. Eng., vol. 24, no. 1, pp. 1185–1192, 2014.
[135] P. C. Petrantonakis and L. J. Hadjileontiadis, “Emotion recognition from EEG using higher order crossings,” IEEE Trans. Inf. Technol. Biomed., vol. 14, no. 2, pp. 186–197, Mar. 2010.
[136] P. C. Petrantonakis and L. J. Hadjileontiadis, “Emotion recognition from brain signals using hybrid adaptive filtering and higher order crossings analysis,” IEEE Trans. Affect. Comput., vol. 1, no. 2, pp. 81–97, Jul.–Dec. 2010.
[137] X. Wang, T. Zhang, X. Xu, L. Chen, X. Xing, and C. L. P. Chen, “EEG emotion recognition using dynamical graph convolutional neural networks and broad learning system,” in Proc. IEEE Int. Conf. Bioinf. Biomed., 2018, pp. 1240–1244.
[138] J. Pan, Y. Li, and J. Wang, “An EEG-based brain-computer interface for emotion recognition,” in Proc. Int. Joint Conf. Neural Netw., 2016, pp. 2063–2067.
[139] S. Makeig, G. Leslie, T. Mullen, D. Sarma, N. Bigdely-Shamlo, and C. Kothe, “First demonstration of a musical emotion BCI,” in Proc. Int. Conf. Affect. Comput. Intell. Interact., 2011, pp. 487–496.
[140] M. Li and B.-L. Lu, “Emotion classification based on gamma-band EEG,” in Proc. Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., 2009, pp. 1223–1226.
[141] D. Huang, C. Guan, K. K. Ang, H. Zhang, and Y. Pan, “Asymmetric spatial pattern for EEG-based emotion detection,” in Proc. Int. Joint Conf. Neural Netw., 2012, pp. 1–7.
[142] S. A. Hosseini and M. A. Khalilzadeh, “Emotional stress recognition system using EEG and psychophysiological signals: Using new labelling
[119] T. Zhang, W. Zheng, Z. Cui, Y. Zong, and Y. Li, “Spatial-temporal process of EEG signals in emotional stress state,” in Proc. Int. Conf.
recurrent neural network for emotion recognition,” IEEE Trans. Cybern., Biomed. Eng. Comput. Sci., 2010, pp. 1–6.
vol. 49, no. 3, pp. 839–847, Mar. 2019. [143] R. M. Mehmood, R. Du, and H. J. Lee, “Optimal feature selection and
[120] G. Zhao, Y. Ge, B. Shen, X. Wei, and H. Wang, “Emotion analysis for deep learning ensembles method for emotion recognition from human
personality inference from EEG signals,” IEEE Trans. Affect. Comput., brain EEG sensors,” IEEE Access, vol. 5, pp. 14797–14806, 2017.
vol. 9, no. 3, pp. 362–371, Jul.-Sep. 2018. [144] H. Shahabi and S. Moghimi, “Toward automatic detection of brain
[121] J. Teo, C. L. Hou, and J. Mountstephens, “Deep learning for EEG-based responses to emotional music through analysis of EEG effective con-
preference classification,” AIP Conf. Proc., vol. 1891, no. 1, 2017, nectivity,” Comput. Hum. Behav., vol. 58, pp. 231–239, 2016.
Art. no. 020141. [145] S. Koelstra and I. Patras, “Fusion of facial expressions and EEG for im-
[122] P. Ackermann, C. Kohlschein, J. Á. Bitsch, K. Wehrle, and S. Jeschke, plicit affective tagging,” Image Vis. Comput., vol. 31, no. 2, pp. 164–174,
“EEG-based automatic emotion recognition: Feature extraction, selection 2013.
and classification methods,” in Proc. IEEE 18th Int. Conf. Netw., Appl. [146] F. Abtahi, T. Ro, W. Li, and Z. Zhu, “Emotion analysis using audio/video,
Serv., 2016, pp. 1–6. EMG and EEG: A dataset and comparison study,” in Proc. IEEE Winter
[123] S. K. Hadjidimitriou and L. J. Hadjileontiadis, “Toward an EEG-based Conf. Appl. Comput. Vis., 2018, pp. 10–19.
recognition of music liking using time-frequency analysis,” IEEE Trans. [147] T. Zhang, X. Wang, X. Xu, and C. L. P. Chen, “GCB-Net:
Biomed. Eng., vol. 59, no. 12, pp. 3498–3510, Dec. 2012. Graph convolutional broad network and its application in emotion
[124] G. Chanel, J. J. Kierkels, M. Soleymani, and T. Pun, “Short-term emotion recognition,” IEEE Trans. Affect. Comput., to be published, doi:
assessment in a recall paradigm,” Int. J. Hum.-Comput. Stud., vol. 67, 10.1109/TAFFC.2019.2937768.
no. 8, pp. 607–627, 2009. [148] J. Jiang, Y. Zeng, L. Tong, C. Zhang, and B. Yan, “Single-trial
[125] H. Huang, Z. Hu, W. Wang, and M. Wu, “Multimodal emotion recognition ERP detecting for emotion recognition,” in Proc. 17th IEEE/ACIS Int.
based on ensemble convolutional neural network,” IEEE Access, vol. 8, Conf. Softw. Eng., Artif. Intell., Netw. Parallel/Distrib. Comput., 2016,
pp. 3265–3271, Dec. 2019. pp. 105–108.
Authorized licensed use limited to: PSG COLLEGE OF TECHNOLOGY. Downloaded on July 04,2024 at 16:39:10 UTC from IEEE Xplore. Restrictions apply.
438 IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS, VOL. 15, NO. 3, JUNE 2021
[149] S. K. Khare and V. Bajaj, “Time-frequency representation and convo- [174] X. Du et al., “An efficient LSTM network for emotion recognition from
lutional neural network-based emotion recognition,” IEEE Trans. Neu- multichannel EEG signals,” IEEE Trans. Affect. Comput., to be published,
ral Netw. Learn. Syst., vol. 32, no. 7, pp. 2901–2909, Jul. 2021, doi: doi: 10.1109/TAFFC.2020.3013711.
10.1109/TNNLS.2020.3008938. [175] L.-Y. Tao and B.-L. Lu, “Emotion recognition under sleep deprivation
[150] S. Katsigiannis and N. Ramzan, “DREAMER: A database for emo- using a multimodal residual LSTM network,” in Proc. Int. Joint Conf.
tion recognition through EEG and ECG signals from wireless low-cost Neural Netw., 2020, pp. 1–8.
off-the-shelf devices,” IEEE J. Biomed. Health Inform., vol. 22, no. 1, [176] Y. Yang, Q. J. Wu, W.-L. Zheng, and B.-L. Lu, “EEG-based emotion
pp. 98–107, Jan. 2018. recognition using hierarchical network with subnetwork nodes,” IEEE
[151] M. Soleymani, S. Asghari-Esfeden, Y. Fu, and M. Pantic, “Analysis of Trans. Cogn. Develop. Syst., vol. 10, no. 2, pp. 408–419, Jun. 2018.
EEG signals and facial expressions for continuous emotion detection,” [177] X. Li, B. Qian, J. Wei, A. Li, X. Liu, and Q. Zheng, “Classify EEG
IEEE Trans. Affect. Comput., vol. 7, no. 1, pp. 17–28, Jan.–Mar. 2016. and reveal latent graph structure with spatio-temporal graph convolu-
[152] S. Vanhatalo, J. M. Palva, M. Holmes, J. Miller, J. Voipio, and K. Kaila, tional neural network,” in Proc. IEEE Int. Conf. Data Mining, 2019,
“Infraslow oscillations modulate excitability and interictal epileptic ac- pp. 389–398.
tivity in the human cortex during sleep,” Proc. Nat. Acad. Sci., vol. 101, [178] S. Sheykhivand, Z. Mousavi, T. Y. Rezaii, and A. Farzamnia, “Recog-
no. 14, pp. 5053–5057, 2004. nizing emotions evoked by music using CNN-LSTM networks on EEG
[153] P. Fries, “A mechanism for cognitive dynamics: Neuronal communi- signals,” IEEE Access, vol. 8, pp. 139332–139345, 2020.
cation through neuronal coherence,” Trends Cogn. Sci., vol. 9, no. 10, [179] Y. Yang, Q. Wu, M. Qiu, Y. Wang, and X. Chen, “Emotion recognition
pp. 474–480, 2005. from multi-channel EEG through parallel convolutional recurrent neural
[154] G. Knyazev et al., “Cross-frequency coupling in developmental perspec- network,” in Proc. Int. Joint Conf. Neural Netw., 2018, pp. 1–7.
tive,” Front. Hum. Neurosci., vol. 13, no. 158, pp. 1–10, 2019. [180] W. Zheng, “Multichannel EEG-based emotion recognition via group
[155] A. D. Marimpis, S. I. Dimitriadis, and R. Goebel, “A multiplex connec- sparse canonical correlation analysis,” IEEE Trans. Cogn. Develop. Syst.,
tivity map of valence-arousal emotional model,” IEEE Access, vol. 8, vol. 9, no. 3, pp. 281–290, Sep. 2017.
pp. 170928–170938, 2020. [181] J. Cheng et al., “Emotion recognition from multi-channel EEG via deep
[156] H. A. Gonzalez et al., “Ultra-high compression of twiddle factor ROMs forest,” IEEE J. Biomed. Health Inform., vol. 25, no. 2, pp. 453–464, Feb.
in multi-core DSP for FMCW radars,” in Proc. IEEE Int. Symp. Circuits 2021.
Syst. (ISCAS), 2021, pp. 1–5, doi: 10.1109/ISCAS51556.2021.9401547. [182] Y. Luo and B.-L. Lu, “EEG data augmentation for emotion recognition
[157] P. Davis, C. D. Creusere, and W. Tang, “ASIC implementation of the using a conditional wasserstein GAN,” in Proc. 40th Annu. Int. Conf.
cross frequency coupling algorithm for EEG signal processing,” in Proc. IEEE Eng. Med. Biol. Soc., 2018, pp. 2535–2538.
Int. Symp. Integr. Circuits, 2014, pp. 248–251. [183] J. Chen, D. Jiang, and Y. Zhang, “A hierarchical bidirectional GRU model
[158] B. Hjorth, “EEG analysis based on time domain properties,” Electroen- with attention for EEG-based emotion classification,” IEEE Access,
cephalogr. Clin. Neuriophysiol., vol. 29, no. 3, pp. 306–310, 1970. vol. 7, pp. 118530–118540, 2019.
[159] J. S. Richman and J. R. Moorman, “Physiological time-series analysis [184] R. Al-Fahad and M. Yeasin, “Micro-states based dynamic brain con-
using approximate entropy and sample entropy,” Amer. J. Physiol.-Heart nectivity in understanding the commonality and differences in gender-
Circulatory Physiol., vol. 278, no. 6, pp. H2039–H2049, 2000. specific emotion processing,” in Proc. Int. Joint Conf. Neural Netw., 2019,
[160] Y. Hong and Y. Bao, “FPGA implementation for real-time empirical pp. 1–8.
mode decomposition,” IEEE Trans. Instrum. Meas., vol. 61, no. 12, [185] W. Huang, Y. Xue, L. Hu, and H. Liuli, “S-EEGNet: Electroen-
pp. 3175–3184, Dec. 2012. cephalogram signal classification based on a separable convolution
[161] M. K. Kıymık, İ. Güler, A. Dizibüyük, and M. Akın, “Comparison of neural network with bilinear interpolation,” IEEE Access, vol. 8,
STFT and wavelet transform methods in determining epileptic seizure pp. 131636–131646, 2020.
activity in EEG signals for real-time application,” Comput. Biol. Med., [186] Y. Luo et al., “EEG-based emotion classification using spiking neural
vol. 35, no. 7, pp. 603–616, 2005. networks,” IEEE Access, vol. 8, pp. 46007–46016, 2020.
[162] C. Yu and M. Yen, “Area-efficient 128- to 2048/1536-point pipeline FFT [187] S. Liu, X. Wang, L. Zhao, J. Zhao, Q. Xin, and S. Wang, “Subject-
processor for LTE and mobile WiMAX systems,” IEEE Trans. Very Large independent emotion recognition of EEG signals based on dynamic
Scale Integration (VLSI) Syst., vol. 23, no. 9, pp. 1793–1800, Sep. 2015. empirical convolutional neural network,” IEEE/ACM Trans. Comput.
[163] H. Kang, J. Lee, and J. Kim, “Low-complexity twiddle factor generation Biol. Bioinf., to be published, doi: 10.1109/TCBB.2020.3018137.
for FFT processor,” Electron. Lett., vol. 49, no. 23, pp. 1443–1445, 2013. [188] Y. Zhao, J. Yang, J. Lin, D. Yu, and X. Cao, “A 3D convolutional neural
[164] M. Hasan and T. Arslan, “Scheme for reducing size of coefficient memory network for emotion recognition based on EEG signals,” in Proc. Int.
in FFT processor,” Electron. Lett., vol. 38, no. 4, pp. 163–164, 2002. Joint Conf. Neural Netw., 2020, pp. 1–6.
[165] H. Lee and I. Park, “Balanced binary-tree decomposition for area-efficient [189] X. Jia et al., “Multi-channel EEG based emotion recognition using
pipelined FFT processing,” IEEE Trans. Circuits Syst. I: Regular Papers, temporal convolutional network and broad learning system,” in Proc.
vol. 54, no. 4, pp. 889–900, Apr. 2007. IEEE Int. Conf. Syst., Man, Cybern., 2020, pp. 2452–2457.
[166] H. Kang, B. Yang, and J. Lee, “Low complexity twiddle factor multipli- [190] L. Shen, W. Zhao, Y. Shi, T. Qin, and B. Liu, “Parallel sequence-channel
cation with ROM partitioning in FFT processor,” Electron. Lett., vol. 49, projection convolutional neural network for EEG-based emotion recog-
no. 9, pp. 589–591, 2013. nition,” IEEE Access, vol. 8, pp. 222966–222976, 2020.
[167] W. Tsai, S. Chen, and S. Huang, “A low-complexity mixed-radix FFT [191] K.-J. Wang and C. Y. Zheng, “Toward a wearable affective robot that de-
rotator architecture,” in Proc. IEEE Asia Pacific Conf. Circuits Syst., tects human emotions from brain signals by using deep multi-spectrogram
2018, pp. 183–186. convolutional neural networks (deep MS-CNN),” in Proc. 28th IEEE Int.
[168] M. J. Kaminski and K. J. Blinowska, “A new method of the description Conf. Robot Hum. Interactive Commun., 2019, pp. 1–8.
of the information flow in the brain structures,” Biol. Cybern., vol. 65, [192] B. Krisnandhika, A. Faqih, P. D. Pumamasari, and B. Kusumoputro,
no. 3, pp. 203–210, 1991. “Emotion recognition system based on EEG signals using relative wavelet
[169] K. Singh and J. P. de Gyvez, “Twenty years of near/sub-threshold design energy features and a modified radial basis function neural networks,” in
trends and enablement,” IEEE Trans. Circuits Syst. II: Exp. Briefs, vol. 68, Proc. Int. Conf. Consum. Electron. Devices, 2017, pp. 50–54.
no. 1, pp. 5–11, Jan. 2021. [193] G. Li, Z. Zhang, and G. Wang, “Emotion recognition based on low-
[170] S. Höppner et al., “Adaptive body bias aware implementation for ultra- cost in-ear EEG,” in Proc. IEEE Biomed. Circuits Syst. Conf., 2017,
low-voltage designs in 22FDX technology,” IEEE Trans. Circuits Syst. pp. 1–4.
II: Exp. Briefs, vol. 67, no. 10, pp. 2159–2163, Oct. 2020. [194] Y. Kumagai, M. Arvaneh, H. Okawa, T. Wada, and T. Tanaka, “Classi-
[171] S. Höppner et al., “How to achieve world-leading energy efficiency fication of familiarity based on cross-correlation features between EEG
using 22FDX with adaptive body biasing on an arm Cortex-M4 IoT and music,” in Proc. 39th Annu. Int. Conf. IEEE Eng. Med. Biol. Soc.,
SoC,” in Proc. ESSDERC 49th Eur. Solid-State Device Res. Conf., 2019, 2017, pp. 2879–2882.
pp. 66–69. [195] S. Liu et al., “EEG-based emotion estimation using adaptive tracking
[172] R. Carter et al., “22 nm FDSOI technology for emerging mobile, Internet- of discriminative frequency components,” in Proc. 39th Annu. Int. Conf.
of-Things, and RF applications,” in Proc. IEEE Int. Electron. Devices IEEE Eng. Med. Biol. Soc., 2017, pp. 2231–2234.
Meeting, 2016, pp. 2.2.1–2.2.4. [196] B. Hu, X. Li, S. Sun, and M. Ratcliffe, “Attention recognition
[173] M. M. Bradley and P. J. Lang, “Measuring emotion: The self-assessment in EEG-based affective learning research using CFS KNN algo-
manikin and the semantic differential,” J. Behav. Ther. Exp. Psychiatry, rithm,” IEEE/ACM Trans. Comput. Biol. Bioinf., vol. 15, no. 1,
vol. 25, no. 1, pp. 49–59, 1994. pp. 38–45, Jan./Feb. 2018.
Authorized licensed use limited to: PSG COLLEGE OF TECHNOLOGY. Downloaded on July 04,2024 at 16:39:10 UTC from IEEE Xplore. Restrictions apply.
GONZALEZ et al.: HARDWARE ACCELERATION OF EEG-BASED EMOTION CLASSIFICATION SYSTEMS: A COMPREHENSIVE SURVEY 439
[197] Cecbur, “CC-BY-SA-4.0,” Feb. 2019. [Online]. Available: https: [222] C. Y. Park et al., “K-EmoCon, a multimodal sensor dataset for continuous
//commons.wikimedia.org/wiki/File:3_filters_in_a_Convolutional_ emotion recognition in naturalistic conversations,” May 2020. [Online].
Neural_Network.gif Available: https://doi.org/10.5281/zenodo.3814370
[198] Y. Yan et al., “Low-power low-latency keyword spotting and adaptive [223] T. B. Alakus, M. Gonen, and I. Turkoglu, “Database for an emo-
control with a spinnaker 2 prototype and comparison with loihi,” 2020, tion recognition system based on EEG signals and various com-
arXiv:2009.08921. puter games - Gameemo,” Biomed. Signal Process. Control, vol. 60,
[199] Y. Yan et al., “Efficient reward-based structural plasticity on a SpiN- 2020, Art. no. 101951.
Naker 2 prototype,” IEEE Trans. Biomed. Circuits Syst., vol. 13, no. 3, [224] J. Onton, “Imagined emotion study,” Jul. 2020. [Online]. Available: https:
pp. 579–591, Jun. 2019. //openneuro.org/datasets/ds003004/versions/1.0.0
[200] C. Liu et al., “Memory-efficient deep learning on a SpiNNaker 2 proto- [225] J. A. M. Correa, M. K. Abadi, N. Sebe, and I. Patras, “AMIGOS: A dataset
type,” Front. Neurosci., vol. 12, no. 840, pp. 1–15, 2018. for affect, personality and mood research on individuals and groups,”
[201] C. Mayr, S. Hoeppner, and S. Furber, “SpiNNaker 2: A 10 million IEEE Trans. Affect. Comput., vol. 12, no. 2, pp. 479–493, Apr.–Jun. 2021.
core processor system for brain simulation and machine learning,” 2019, [226] S. B. Eysenck, H. J. Eysenck, and P. Barrett, “A revised version of the
arXiv:1911.02385. psychoticism scale,” Pers. Individual Differences, vol. 6, no. 1, pp. 21–29,
[202] M. Jobst et al., “Event-based neural network for ECG classification with 1985.
delta encoding and early stopping,” in Proc. 6th Int. Conf. Event-Based [227] J. W. Papez, “A proposed mechanism of emotion,” Arch. Neurol. Psychi-
Control, Commun., Signal Process. (EBCCSP), 2020, pp. 1–4, doi: atry, vol. 38, no. 4, pp. 725–743, Oct. 1937.
10.1109/EBCCSP51266.2020.9291357. [228] C. Darwin, The Expression of Emotions in Man and Animals. London,
[203] Fdeloche, “CC-BY-SA-4.0,” Jun. 2017. [Online]. Available: https:// U.K.: John Murray, 1872.
commons.wikimedia.org/wiki/File:Gated_Recurrent_Unit.svg [229] J. Posner, J. A. Russel, and B. S. Peterson, “The circumplex model
[204] Long Short-Term Memory, “CC-BY-SA-4.0,” Ixnay, Jun. 2017. [On- of affect: An integrative approach to affective neuroscience, cog-
line]. Available: https://commons.wikimedia.org/wiki/File:Long_Short- nitive development, and psychopathology,” Develop. Psychopathol.,
Term_Memory.svg vol. 17, no. 3, Sep. 2005. [Online]. Available: https://doi.org/10.1017/
[205] K. Cho et al., “Learning phrase representations using RNN encoder- s0954579405050340
decoder for statistical machine translation,” 2014, arXiv:1406.1078. [230] K. Phan, T. Wager, S. F. Taylor, and I. Liberzon, “Functional neu-
[206] B. M. Baas, “A low-power, high-performance, 1024-point FFT pro- roanatomy of emotion: A meta-analysis of emotion activation studies
cessor,” IEEE J. Solid-State Circuits, vol. 34, no. 3, pp. 380–387, in pet and fMRI,” NeuroImage, vol. 16, no. 2, pp. 331–348, 2002.
Mar. 1999. [231] J. Gläscher and R. Adolphs, “Processing of the arousal of subliminal and
[207] J. Yoo et al., “An 8-channel scalable EEG acquisition SoC with supraliminal emotional stimuli by the human amygdala,” J. Neurosci.,
fully integrated patient-specific seizure classification and recording vol. 23, no. 32, pp. 10 274–10 282, 2003.
processor,” in Proc. IEEE Int. Solid-State Circuits Conf., 2012, [232] D. M. Small, M. D. Gregory, Y. Mak, D. Gitelman, M. Mesulam, and
pp. 292–294. T. Parrish, “Dissociation of neural representation of intensity and affec-
[208] Y.-X. Wang, D. Ramanan, and M. Hebert, “Growing a brain: Fine-tuning tive valuation in human gustation,” Neuron, vol. 39, no. 4, pp. 701–711,
by increasing model capacity,” in Proc. IEEE Conf. Comput. Vis. Pattern Aug. 2003.
Recognit., 2017, pp. 3029–3038. [233] K. H. Wood, L. W. Ver Hoef, and D. C. Knight, “The amygdala mediates
[209] C.-S. Wei, T. Koike-Akino, and Y. Wang, “Spatial component-wise the emotional modulation of threat-elicited skin conductance response,”
convolutional network (SCCNet) for motor-imagery EEG classification,” Emotion, vol. 14, no. 4, pp. 693–700, 2014.
in Proc. 9th Int. IEEE/EMBS Conf. Neural Eng., 2019, pp. 328–331. [234] R. R. Fletcher et al., “iCalm: Wearable sensor and network archi-
[210] H. Wu et al., “A parallel multiscale filter bank convolutional neural tecture for wirelessly communicating and logging autonomic activ-
networks for motor imagery EEG classification,” Front. Neurosci., vol. ity,” IEEE Trans. Inf. Technol. Biomed., vol. 14, no. 2, pp. 215–223,
13, no. 1275, pp. 1–9, 2019. Mar. 2010.
[211] S. J. Pan and Q. Yang, “A survey on transfer learning,” IEEE Trans. [235] P. J. Lang, M. K. Greenwald, M. M. Bradley, and A. O. Hamm, “Looking
Knowl. Data Eng., vol. 22, no. 10, pp. 1345–1359, Oct. 2010. at pictures: Affective, facial, visceral, and behavioral reactions,” Psy-
[212] F. Zenke, B. Poole, and S. Ganguli, “Continual learning through synaptic chophysiology, vol. 30, no. 3, pp. 261–273, May 1993.
intelligence,” in Proc. 34th Int. Conf. Mach. Learn., Sydney, Australia, [236] I. B. Mauss and M. D. Robinson, “Measures of emotion: A review,” Cogn.
Aug. 2017, pp. 3987–3995. Emotion, vol. 23, no. 2, pp. 209–237, Feb. 2009.
[213] G. Zeng, Y. Chen, B. Cui, and S. Yu, “Continual learning of context- [237] G. Tamburro, D. B. Stone, and S. Comani, “Automatic removal of cardiac
dependent processing in neural networks,” Nat. Mach. Intell., vol. 1, interference (ARCI): A new approach for EEG data,” Front. Neurosci.,
no. 8, pp. 364–372, 2019. vol. 13, no. 441, pp. 1–17, 2019.
[214] G. I. Parisi, R. Kemker, J. L. Part, C. Kanan, and S. Wermter, “Continual [238] P. Bušek and D. Kemlink, “The influence of the respiratory cycle on the
lifelong learning with neural networks: A review,” Neural Netw., vol. 113, EEG,” Physiol. Res., vol. 54, pp. 327–33, 2005.
pp. 54–71, 2019. [239] P. J. D. Jong and H. Merckelbach, “Eyeblink frequency, rehearsal activity,
[215] J. Gideon, S. Khorram, Z. Aldeneh, D. Dimitriadis, and E. M. Provost, and sympathetic arousal,” Int. J. Neurosci., vol. 51, no. 1/2, pp. 89–94,
“Progressive neural networks for transfer learning in emotion recogni- Jan. 1990.
tion,” in Proc. INTERSPEECH, 2017, pp. 1098–11102. [240] M. Poh, N. C. Swenson, and R. W. Picard, “A wearable sensor for un-
[216] J. Kirkpatrick et al., “Overcoming catastrophic forgetting in neural obtrusive, long-term assessment of electrodermal activity,” IEEE Trans.
networks,” Proc Nat. Acad. Sci., vol. 114, no. 13, pp. 3521–3526, Biomed. Eng., vol. 57, no. 5, pp. 1243–1252, May 2010.
2017. [241] K. Arikan et al., “EEG correlates of startle reflex with reactivity to
[217] R. Kemker, M. McClure, A. Abitino, T. L. Hayes, and C. Kanan, “Mea- eye opening in psychiatric disorders: Preliminary results,” Clin. EEG
suring catastrophic forgetting in neural networks,” in Proc. AAAI Conf. Neurosci., vol. 37, no. 3, pp. 230–234, 2006.
Artif. Intell., 2018, pp. 3390–3398. [242] G. Panayiotou and E. Constantinou, “Emotion dysregulation in alex-
[218] R. R. Karn, P. Kudva, and I. M. Elfadel, “Criteria for learning without ithymia: Startle reactivity to fearful affective imagery and its relation to
forgetting in artificial neural networks,” in Proc. IEEE Int. Conf. Cogn. heart rate variability,” Psychophysiology, vol. 54, no. 9, pp. 1323–1334,
Comput., 2019, pp. 90–97. 2017.
[219] International 10-20 system for EEG-MCN, “CC0 1.0,” B. C. Oxley, [243] D. C. Gooding, R. J. Davidson, K. M. Putnam, and K. A. Tallent,
2017. [Online]. Available: https://commons.wikimedia.org/wiki/File: “Normative emotion-modulated startle response in individuals at risk for
International_10-20_system_for_EEG-MCN.svg schizophrenia-spectrum disorders,” Schizophrenia Res., vol. 57, no. 1,
[220] T. Tang et al., “An active concentric electrode for concurrent EEG pp. 109–120, 2002.
recording and body-coupled communication (BCC) data transmission,” [244] V. De Pascalis, G. Cozzuto, G. V. Caprara, and G. Alessandri, “Relations
IEEE Trans. Biomed. Circuits Syst., vol. 14, no. 6, pp. 1253–1262, Dec. among EEG-alpha asymmetry, BIS/BAS, and dispositional optimism,”
2020. Biol. Psychol., vol. 94, no. 1, pp. 198–209, 2013.
[221] K. Zhao and D. Xu, “Food image-induced discrete emotion recognition [245] J. A. Coan and J. J. Allen, “Frontal EEG asymmetry and the behavioral
using a single-channel scalp-EEG recording,” in Proc. 12th Int. Congr. activation and inhibition systems,” Psychophysiology, vol. 40, no. 1,
Image Signal Process., Biomed. Eng. Inform., 2019, pp. 1–6. pp. 106–114, 2003.
Authorized licensed use limited to: PSG COLLEGE OF TECHNOLOGY. Downloaded on July 04,2024 at 16:39:10 UTC from IEEE Xplore. Restrictions apply.
440 IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS, VOL. 15, NO. 3, JUNE 2021
[246] R. J. Davidson, “Affective neuroscience and psychophysiology: To- [266] G. Liberati et al., “Development of a binary fMRI-BCI for Alzheimer
ward a synthesis,” Psychophysiology, vol. 40, no. 5, pp. 655–665, patients: A semantic conditioning paradigm using affective uncondi-
Sep. 2003. tioned stimuli,” in Proc. Humaine Assoc. Conf. Affect. Comput. Intell.
[247] S. Clerc et al., “8.4a0.33v/ − 40◦ c process/temperature closed-loop Interaction, 2013, pp. 838–842.
compensation SoC embedding all-digital clock multiplier and dc-dc [267] D. Lulé et al., “Brain responses to emotional stimuli in patients with
converter exploiting FDSOI 28 nm back-gate biasing,” in Proc. IEEE amyotrophic lateral sclerosis (ALS),” J. Neurol., vol. 254, pp. 519–527,
Int. Solid-State Circuits Conf. Dig. Tech. Papers, 2015, pp. 1–3. 2007.
[248] D. Bol et al., “19.6 A 40-to-80 MHz sub-4 µW/MHz ULV Cortex-M0 [268] R. El Kaliouby, R. Picard, and S. Baron-Cohen, “Affective computing
MCU SoC in 28 nm FDSOI with dual-loop adaptive back-bias generator and autism,” Ann. New York Acad. Sci., vol. 1093, no. 1, pp. 228–248,
for 20 µs wake-up from deep fully retentive sleep mode,” in Proc. IEEE 2006.
Int. Solid- State Circuits Conf., 2019, pp. 322–324. [269] R. W. Picard, “Future affective technology for autism and emotion
[249] M. Pons et al., “A 0.5 V 2.5 µW/MHz microcontroller with analog- communication,” Philos. Trans. Roy. Soc. London. Ser. B., Biol. Sci.,
assisted adaptive body bias PVT compensation with 3.13nW/kB SRAM vol. 364, no. 1535, pp. 3575–3584, 12 2009.
retention in 55 nm deeply-depleted channel CMOS,” in Proc. IEEE [270] O. Rudovic, J. Lee, M. Dai, B. Schuller, and R. W. Picard, “Personalized
Custom Integr. Circuits Conf., 2019, pp. 1–4. machine learning for robot perception of affect and engagement in autism
[250] F. u. Rahman, R. Pamula, A. Boora, X. Sun, and V. Sathe, “19.1 therapy,” Sci. Robot., vol. 3, no. 19, pp. 1–11, 2018.
computationally enabled total energy minimization under performance [271] N. Yeung, R. Bogacz, C. B. Holroyd, and J. D. Cohen, “Detection of
requirements for a voltage-regulated 0.38-to-0.58V microprocessor in synchronized oscillations in the electroencephalogram: An evaluation of
65 nm CMOS,” in Proc. IEEE Int. Solid- State Circuits Conf., 2019, methods,” Psychophysiol., vol. 41, no. 6, pp. 822–832, 2004.
pp. 312–314. [272] S. Yuval-Greenberg, O. Tomer, A. S. Keren, I. Nelken, and L. Y. Deouell,
[251] J. Lee et al., “19.2 a 6.4pJ/Cycle self-tuning Cortex-M0 IoT processor “Transient induced gamma-band response in EEG as a manifestation of
based on leakage-ratio measurement for energy-optimal operation across miniature saccades,” Neuron, vol. 58, no. 3, pp. 429–441, 2008.
wide-range PVT variation,” in Proc. IEEE Int. Solid- State Circuits Conf., [273] D. J. Acunzo, G. MacKenzie, and M. C. van Rossum, “Systematic biases
2019, pp. 314–315. in early ERP and ERF components as a result of high-pass filtering,” J.
[252] Y. Pu, J. P. de Gyvez, H. Corporaal, and Y. Ha, “An ultra-low-energy Neurosci. Methods, vol. 209, no. 1, pp. 212–218, 2012.
multi-standard JPEG co-processor in 65 nm CMOS with sub/near [274] D. Tanner, K. Morgan-Short, and S. J. Luck, “How inappropriate high-
threshold supply voltage,” IEEE J. Solid-State Circuits, vol. 45, no. 3, pass filters can produce artifactual effects and incorrect conclusions in
pp. 668–680, Mar. 2010. ERP studies of language and cognition,” Psychophysiology, vol. 52, no. 8,
[253] H. Mostafa, M. Anis, and M. Elmasry, “A novel low area overhead pp. 997–1009, 2015.
direct adaptive body bias (D-ABB) circuit for die-to-die and within-die [275] A. Widmann, E. Schröger, and B. Maess, “Digital filter design for
variations compensation,” IEEE Trans. Very Large Scale Integr. (VLSI) electrophysiological data-a practical approach,” J. Neurosci. Methods,
Syst., vol. 19, no. 10, pp. 1848–1860, Oct. 2011. vol. 250, pp. 34–46, 2015.
[254] N. Kamae, A. K. M. M. Islam, A. Tsuchiya, and H. Onodera, “A body [276] L. C. Parra, C. D. Spence, A. D. Gerson, and P. Sajda, “Recipes for the
bias generator with wide supply-range down to threshold voltage for linear analysis of EEG,” Neuroimage, vol. 28, no. 2, pp. 326–341, 2005.
within-die variability compensation,” in Proc. IEEE Asian Solid-State [277] F. C. Viola, S. Debener, J. Thorne, and T. R. Schneider, “Using ICA
Circuits Conf., 2014, pp. 53–56. for the analysis of multi-channel EEG data,” Simultaneous EEG fMRI:
[255] M. Blagojevic, M. Cochet, B. Keller, P. Flatresse, A. Vladimirescu, and Recording, Anal., Appl., vol. 1, pp. 121–133, 2010.
B. Nikolic, “A fast, flexible, positive and negative adaptive body-bias [278] F. Kelber et al., “Mapping deep neural networks on SpiNNaker2,” in
generator in 28 nm FDSOI,” in Proc. IEEE Symp. VLSI Circuits, 2016, Proc. Neuro-Inspired Comput. Elements Workshop, ser. NICE ’20. New
pp. 1–2. York, NY, USA: Association for Computing Machinery, 2020. [Online].
[256] A. Quelen, G. Pillonnet, P. Flatresse, and E. Beigne, “A 2.5 µw Available: https://doi.org/10.1145/3381755.3381778
0.0067 mm2 automatic back-biasing compensation unit achieving 50% [279] Y. Umuroglu et al., “FINN: A framework for fast, scalable binarized
leakage reduction in FDSOI 28 nm over 0.35-to-1 V VDD range,” in neural network inference,” in Proc. ACM/SIGDA Int. Symp. Field-
Proc. IEEE Int. Solid - State Circuits Conf., 2018, pp. 304–306. Programmable Gate Arrays, ser. FPGA, 2017, pp. 65–74.
[257] M. Saligane et al., “An adaptive body-BiasIna SoC using in situ slack
monitoring for runtime replica calibration,” in Proc. IEEE Symp. VLSI
Circuits, 2018, pp. 63–64.
[258] X. Wang, M. Tehranipoor, S. George, D. Tran, and L. Winemberg, “De-
sign and analysis of a delay sensor applicable to process/environmental
variations and aging measurements,” IEEE Trans. Very Large Scale
Integration (VLSI) Syst., vol. 20, no. 8, pp. 1405–1418, Oct. 2012.
Hector A. Gonzalez (Student Member, IEEE) received the B.Sc. degree in electronics engineering from the National University of Colombia, Bogota, Colombia, in 2011, and the M.Sc. degree in microsystems engineering from Khalifa University (Masdar Institute Campus), in collaboration with MIT, Abu Dhabi, UAE, in 2017. He is currently a member of the scientific staff and working toward the Ph.D. degree with the Chair of Highly-Parallel VLSI-Systems and Neuromorphic Circuits, Technische Universität Dresden, Dresden, Germany. From 2011 to 2015, he held multiple industrial positions as a Senior Engineer of instrumentation and control electronics. His research interests include neuromorphic computing, hardware design for machine learning algorithms, biomedical applications, and digital signal processing for radar systems. His honors and awards include multiple Enrolments of Honor from the National University of Colombia, a Richard Newton Fellowship at the Design Automation Conference (2018), San Francisco, United States of America, an international scholarship for immersion training at the Institut Teknologi Petroleum Petronas and the Petronas Leadership Centre, Kuala Terengganu and Kuala Lumpur, Malaysia, and a full graduate studies scholarship in Abu Dhabi, United Arab Emirates.
Authorized licensed use limited to: PSG COLLEGE OF TECHNOLOGY. Downloaded on July 04,2024 at 16:39:10 UTC from IEEE Xplore. Restrictions apply.
GONZALEZ et al.: HARDWARE ACCELERATION OF EEG-BASED EMOTION CLASSIFICATION SYSTEMS: A COMPREHENSIVE SURVEY 441
Richard George (Senior Member, IEEE) received the M.Sc. degree in medical engineering from HTW and the University of Saarland, Saarbrücken, Germany, in 2013, with a specialization in Neural Engineering, and the Ph.D. degree in computational neurosciences from the Institute of Neuroinformatics of UZH and ETH Zürich, Zürich, Switzerland, in 2018, for his work on structural plasticity in neuromorphic systems. He is currently a Postdoctoral Fellow with the Chair of Highly-Parallel VLSI-Systems and Neuromorphic Circuits, Technische Universität Dresden, Dresden, Germany. His research aim is the creation of active and intelligent neuroprosthetic devices. His particular focus is the creation of energy-efficient computational architectures capable of processing electrophysiological signals and forming electrical response stimuli within biohybrid closed-loop systems.

Sebastian Höppner received the Dipl.-Ing. (M.Sc.) degree in electrical engineering and the Ph.D. degree from Technische Universität Dresden, Dresden, Germany, in 2008 and 2013, respectively. He is currently a Research Group Leader and Lecturer with the Chair of Highly-Parallel VLSI-Systems and Neuromorphic Circuits. He is the author or co-author of more than 56 publications and holds five issued and five pending patents in these fields. His research interests include circuits for low-power systems-on-chip in advanced technology nodes, with a special focus on clocking, data transmission, and power management. He has experience in designing full-custom circuits for multiprocessor systems-on-chip, such as ADPLLs, register files, and high-speed on-chip and off-chip links, in academic and industrial research projects. He has managed the full-custom circuit design and SoC integration for more than 12 MPSoC chips in 65-nm, 28-nm, and 22-nm CMOS technologies. He leads the chip design of the SpiNNaker2 neuromorphic computing system within the Human Brain Project. He was the recipient of the Barkhausen Award.
Frank H. P. Fitzek (Senior Member, IEEE) received the Diploma (Dipl.-Ing.) degree in electrical engineering from RWTH Aachen, Aachen, Germany, in 1997, and the Ph.D. (Dr.-Ing.) degree in electrical engineering from the Technical University of Berlin, Berlin, Germany, in 2002. In 2002, he became an Adjunct Professor with the University of Ferrara, Ferrara, Italy. In 2003, he joined Aalborg University, Aalborg, Denmark, as a Professor. He is currently a Professor and the Head of the Deutsche Telekom Chair of Communication Networks, Technische Universität Dresden, Dresden, Germany, and has been coordinating the 5G Lab Germany since 2014. Since 2019, he has been a Speaker of the German Research Foundation (DFG, Deutsche Forschungsgemeinschaft) Cluster of Excellence Centre for Tactile Internet with Human-in-the-Loop. He has visited various research institutes, including the Massachusetts Institute of Technology, Cambridge, MA, USA, VTT, Espoo, Finland, and Arizona State University, Tempe, AZ, USA. Since 1999, he has co-founded various start-up companies. He was the recipient of various awards, including the NOKIA Champion Award and the Nokia Achievement Award. In 2011, he was also the recipient of the SAPERE AUDE research grant from the Danish government, and in 2012, the Vodafone Innovation Prize. In 2015, he was awarded the honorary degree Doctor Honoris Causa by the Budapest University of Technology and Economics (BUTE).

Ibrahim (Abe) M. Elfadel (Senior Member, IEEE) received the Ph.D. degree from the Massachusetts Institute of Technology, Cambridge, MA, USA, in 1993. He is currently a Professor of electrical engineering and computer science with Khalifa University, Abu Dhabi, UAE. Prior to his current academic position, he was a Research Staff Member and then a Senior Scientist with the corporate CAD organizations at IBM Research and the IBM Systems and Technology Group, Yorktown Heights, NY, USA, where he was involved in the research, development, and deployment of CAD tools and methodologies for IBM's high-end microprocessors. He is the inventor or co-inventor of 50 issued U.S. patents, with several more pending. His current research interests include IoT platform prototyping, energy-efficient edge and cloud computing, secure IoT communications, embedded digital signal processing, and computer-aided design for VLSI, MEMS, and silicon photonics. He was the recipient of six Invention Achievement Awards, one Outstanding Technical Achievement Award, and one Research Division Award, all from IBM, for his contributions in the area of VLSI CAD, and in 2018 (with Prof. Mohammed Ismail), the SRC Board of Directors Special Award for pioneering semiconductor research in Abu Dhabi. In 2014, he was the co-recipient of the D. O. Pederson Best Paper Award from the IEEE TRANSACTIONS ON COMPUTER-AIDED DESIGN OF INTEGRATED CIRCUITS AND SYSTEMS, and in 2019, the Best Paper Award from the IEEE Conference on Cognitive Computing, Milan, Italy. He is an Associate Editor for the IEEE TRANSACTIONS ON VLSI SYSTEMS and has served on the Technical Program Committees of several leading conferences, including DAC, ICCAD, ASPDAC, DATE, ISCAS, VLSI-SoC, ICCD, ICECS, and MWSCAS. He was the General Co-Chair of the IFIP/IEEE 25th International Conference on Very Large Scale Integration (VLSI-SoC 2017), Abu Dhabi, UAE.