PRESENTATION SCRIPT


EMG CHARACTERIZATION THROUGH VISUAL BIOFEEDBACK

SLIDE 1
Hello and a very good morning to my supervisors and fellow classmates who are with me
here today. Today, I will be presenting my project, titled “EMG Characterization Through Visual
Biofeedback”.
SLIDE 2
As a general overview of my presentation, I will briefly introduce what EMG really is,
how it relates to visual biofeedback, and the purpose of characterizing EMG. We then dive into
the methods used in the project to obtain the results, which will then be discussed. And in my
future work, I will cover what I plan to implement and what those who succeed me can continue
from where I left off.
SLIDE 3
To start off, electromyography, or EMG for short, is the measurement of the electrical signals
produced by the skeletal muscles during contraction, called myoelectric signals. EMG is a technique
that detects, measures, and records these myoelectric signals. Observations from these EMG signals
tell us the muscles’ physiological condition and properties, which helps clinicians diagnose and treat
neuromuscular diseases and related issues.
SLIDE 4
Visual biofeedback, on the other hand, is a technique that relays information about a
patient’s physiological output visually. Both clinician and patient can then monitor and facilitate
muscle control, which proves crucial in both muscle rehabilitation and prosthetic control.
SLIDE 5
Why do we need to characterize our EMG? Well, EMG characterization is important in many
medical fields, from the diagnosis of neuromuscular disorders to biomechanics and prosthetic control.
This is mainly because characterization reveals the EMG patterns associated with muscle activation,
allowing classification of, and distinction between, various gestures and movements.
SLIDE 6
Going now to my methodology, the flow of my project consists of two cases: one where the
machine-learning algorithm is trained using a fixed dataset, and another where the trained
model is applied to real-time EMG characterization and visual biofeedback.
SLIDE 7
The project has two aspects. The technical aspect involves the circuit connection, electrode
placement, and data recording and collection. The programming aspect involves processing the fixed
dataset and extracting useful features from the data for further characterization.
SLIDE 8
This is the flow of my project design. First, EMG signals from my target muscles are sent to
the circuit for signal conditioning, filtering, and amplification, and then into the Arduino. The
Arduino data is then collected and stored as text files.
SLIDE 9
This is the circuit that was used for the signal conditioning: a 1st-order high-pass filter with a
cutoff frequency of 16 Hz, a non-inverting op-amp with a gain of 16, a 2nd-order low-pass filter with
a cutoff frequency of 482 Hz and a gain of 1, and finally a non-inverting op-amp with a gain of 11.
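As a sanity check, the stated corner frequencies and gains follow from the standard first-order RC cutoff formula and the non-inverting op-amp gain formula. The component values below are illustrative assumptions chosen to reproduce the stated targets, not the values actually used in the circuit:

```python
import math

# Hypothetical component values (assumptions) that hit the stated design targets.
R_hpf, C_hpf = 10e3, 1e-6       # 1st-order HPF: f_c = 1/(2*pi*R*C) ~ 15.9 Hz
f_hpf = 1 / (2 * math.pi * R_hpf * C_hpf)

Rf1, Ri1 = 150e3, 10e3          # non-inverting stage: gain = 1 + Rf/Ri = 16
gain1 = 1 + Rf1 / Ri1

R_lpf, C_lpf = 3.3e3, 100e-9    # 2nd-order LPF (equal R, C): f_c ~ 482 Hz
f_lpf = 1 / (2 * math.pi * R_lpf * C_lpf)

Rf2, Ri2 = 100e3, 10e3          # output stage: gain = 1 + Rf/Ri = 11
gain2 = 1 + Rf2 / Ri2

print(round(f_hpf, 1), gain1, round(f_lpf, 1), gain2)
```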
SLIDE 10
The biofeedback system that I planned as a base is the jumping-dinosaur game, with the
spacebar input interfaced through an Arduino Leonardo.
SLIDE 11
The electrodes were placed on my extensor carpi radialis (on the upper forearm). An app called
CoolTerm was used to interface with the Arduino, capture the raw EMG data, and store it as text files.
The text files were plotted in Excel to check for any abnormalities.
SLIDE 12
Five hand gestures involving flexion of the extensor carpi radialis were chosen: the
rock sign, a four-finger extension, a fist, a key grip, and rest.
SLIDE 13
This slide shows the steps taken to record the EMG data. The hand was at rest for 4 seconds;
when a button was pressed, the Arduino recorded data while the hand gesture was performed and
held for 1 second. The button was then released, data recording stopped, and the hand was
relaxed and at rest for another 4 seconds. The timing was monitored using the stopwatch and timer on
my phone. This process was repeated 300 times for each gesture.
SLIDE 14
Jupyter Notebook was used as the Python IDE. The raw EMG data was imported and pre-
processed with Python libraries. The preprocessing steps consist of zero-centering,
rectification, squaring, and moving-average smoothing.
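This preprocessing chain can be sketched as follows, assuming the raw recording has already been loaded into a NumPy array; the 50-sample smoothing window and the function name are illustrative choices, not necessarily those used in the project:

```python
import numpy as np

def preprocess(raw, window=50):
    """Pre-process a raw EMG recording: zero-center, rectify, square, smooth."""
    centered = raw - np.mean(raw)        # zero-centering: remove the DC offset
    rectified = np.abs(centered)         # full-wave rectification
    squared = rectified ** 2             # squaring emphasises bursts of activity
    kernel = np.ones(window) / window    # moving-average (smoothing) kernel
    smoothed = np.convolve(squared, kernel, mode="same")
    return centered, rectified, squared, smoothed
```

Each returned array corresponds to one of the four plots shown later in the results.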
SLIDE 15
The EMG data was then reduced to several features for differentiation and classification.
We have the integrated EMG, the area under the rectified signal; the mean absolute value, the
average of the absolute signal values; the simple square integral, the sum of the squared signal
values; the root mean square, the square root of the mean of the squared values; the variance, a
measure of the data spread; the waveform length, the cumulative length of the signal; and the
zero-crossing rate, the rate at which the signal crosses zero amplitude.
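These time-domain features can be sketched for a single EMG window as below; the function name and the zero-crossing noise threshold are illustrative assumptions:

```python
import numpy as np

def emg_features(x, threshold=0.0):
    """Compute the seven time-domain features of one EMG window x."""
    n = len(x)
    iemg = np.sum(np.abs(x))            # Integrated EMG: area under rectified signal
    mav = iemg / n                      # Mean Absolute Value
    ssi = np.sum(x ** 2)                # Simple Square Integral
    rms = np.sqrt(ssi / n)              # Root Mean Square
    var = np.var(x, ddof=1)             # Variance (sample variance)
    wl = np.sum(np.abs(np.diff(x)))     # Waveform Length
    # Zero crossings: sign changes whose amplitude step exceeds the threshold
    zc = np.sum((x[:-1] * x[1:] < 0) & (np.abs(x[:-1] - x[1:]) > threshold))
    return {"IEMG": iemg, "MAV": mav, "SSI": ssi, "RMS": rms,
            "VAR": var, "WL": wl, "ZC": zc}
```

Running this over every recorded window yields one feature row per gesture repetition.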
SLIDE 16
These are the EMG features that were planned to be extracted for each gesture.
SLIDE 17
These extracted features were stored in arrays and PCA was performed on them. PCA, or
principal component analysis, is a dimensionality-reduction technique that simplifies complex
datasets by transforming them into a lower-dimensional space. The PCA of the extracted features
gives a visualization of the variability across the different hand gestures.
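A minimal sketch of this step using scikit-learn's PCA, assuming the features have been assembled into a matrix with one row per gesture window; the random matrix here is only a placeholder for the real 7-feature array:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Placeholder feature matrix: one row per recorded window, one column per feature.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 7))

# Standardize first, since the features live on very different scales.
X_std = StandardScaler().fit_transform(X)

pca = PCA(n_components=2)
scores = pca.fit_transform(X_std)       # 2-D coordinates for the scatter plot
print(scores.shape, pca.explained_variance_ratio_)
```

`explained_variance_ratio_` is what the later variance slide refers to: the share of total variability captured by each principal-component axis.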
SLIDE 18
This is an example of the raw EMG data plotted in Excel.
SLIDE 19
These are the results of the Python pre-processing. The four graphs show the zero-centered
data, the rectified data, the squared data, and the moving-average (smoothed) signal.
SLIDE 20
Before feature extraction can be done, we need to know the number of samples per batch of
the 1-second data recording. It was determined that the 1-second window of muscle flexion contains
440 samples.
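Splitting a long recording into these fixed 440-sample batches can be sketched as below; the helper name is an assumption:

```python
import numpy as np

SAMPLES_PER_GESTURE = 440   # samples captured in each 1-second recording window

def split_into_windows(samples, n=SAMPLES_PER_GESTURE):
    """Split a recording into fixed-length windows, dropping any partial tail."""
    n_windows = len(samples) // n
    return np.asarray(samples[:n_windows * n]).reshape(n_windows, n)
```

Each resulting row is one gesture repetition, ready for batch feature extraction.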
SLIDE 21
This table shows the results for all samples from the beginning of the recording to the end.
There is too much data for the batch feature extraction to be shown in full.
SLIDE 22
This is the PCA plot of the extracted features, with one graph containing all gestures while the
other excludes REST. In a PCA plot, samples with similar patterns and similar relationships with
the data will be clustered and grouped together.
SLIDE 23
The PCA explained variance tells us which principal-component axes managed to capture most
of the data’s variability.
SLIDE 24
From the PCA plot, we can see that all my hand gestures have similar patterns and shapes.
SLIDE 25
There are some limitations and challenges that were faced in the project. Number one, the
EMG data was taken from only a single channel and a single subject, which does not offer much
variability of data for the machine to train with. It is like giving a student the same question and the
same answer in every exam, never letting the student experience different kinds of questions and
answers in order to grow. Next, the gestures chosen do not offer many degrees of freedom for
prosthetic control; more realistic and complex gestures are needed for variability.
SLIDE 26
As of where I am right now, I have only reached the end of the PCA training and have lightly
dabbled in the real-time case. So, the next step would be to test the real-time algorithm on a subject
with the trained PCA model and see whether the visual biofeedback can be controlled using the
pattern and shape of the EMG rather than its amplitude. Looking further ahead, there are several key
concepts and methods that can be considered for a more efficient and proficient workflow. A
multichannel EMG system can be built to allow for more data resolution, possibly including the
flexor carpi radialis (on the lower forearm). As explained before, for better accuracy and precision,
more complex movements imitating real hand gestures can be characterized to allow for more
degrees of freedom. A printed circuit board (PCB) can be designed, built, and produced in quantity
for ease of use and mobility. And finally, the EMG features can be configured to control the visual
biofeedback instead of the EMG amplitude, for more accurate and flexible control.
