
Wayne State University

College Of Engineering
ECE Department

ECE 7995
Medical Robotics and Imaging Surgery

EOG based Robot Controller

Dr. Abhilash Pandya

Presented by:

Hassan A.M. Muhsen


Charbel A. Habib
Harpreet Kaur Jassal

Winter 2008
Table of Contents
1. Abstract………………………………………………………………………………………... 3
2. Background and Significance…………………………………………………………………. 4
2.1 Types of Control……………………………………………………………………... 4
2.2 EOG signal…………………………………………………………………………… 5
2.3 EOG Based Applications ……………………………………………………………. 6
2.3.1 Computer Access ………………………………………………………... 6
2.3.2 Game Control …………………………………………………………….. 9
2.3.3 Biologically-driven Musical Instrument ………………………………. 10
3. Specific Aims ………………………………………………………………………………... 12
3.1 Software Engineering Plan ……………………………………………………….... 12
3.2 System Architecture………………………………………………………………… 13
4. Objective 1: Input Unit ……………………………………………………………………… 14
4.1 Design ……………………………………………………………………………… 14
4.2 Implementation …………………………………………………………………….. 15
4.3 Testing ……………………………………………………………………………… 16
5. Objective 2: Main Unit ……………………………………………………………………… 16
5.1 Process Unit ………………………………………………………………………... 17
5.1.1 Design…………………………………………………………………….. 17
5.1.2 Implementation…………………………………………………………….17
5.1.3 Testing…………………………………………………………………….. 18
5.2 Control Unit ………………………………………………………………………... 18
5.2.1 Design…………………………………………………………………….. 18
5.2.2 Implementation …………………………………………………………... 18
5.2.3 Testing …………………………………………………………………….. 18
6. Objective 3: Robot Unit 1: Neck Holder…………………………………………………..… 19
6.1 Design………………………………………………………………………………. 19
6.2 Implementation …………………………………………………………………….. 19
6.3 Testing …………………………………………………………………….… 19
7. Objective 4: Robot Unit 2: Aesop 3000 ………………………………………………………19
8. Timeline ……………………………………………………………………………………... 20
9. Budget ……………………………………………………………………………………….. 21
10. References ………………………………………………………………………………….. 22

1. Abstract

Surgeons and physicians often say: "I need this robot or device to be controlled in an easy and
effective way." At the same time, disabled people are waiting for new technologies that help them
overcome their obstacles and live their lives like anyone else. From these two demands, this
project was launched to present a new source of control that is robust and effective, with minimal
interference with the subject's activities and minimal discomfort. The project brings together two
areas of engineering, Computer Engineering and Biomedical Engineering, and results in an
exciting way of controlling robots: the "EOG based Robot Controller". It is based on recording
the electrical potential produced by eye movement, the electrooculogram (EOG), a signal that can
serve as a new source of input to any type of electronically controlled device. The project
embodies the spirit of design engineering, since it involves research, coding, and building an
actual robot; at the same time, it presents new possibilities in the biomedical field, especially for
surgeons and physicians. Two applications will be implemented using this controller. The first is
a neck holder that moves a disabled person's neck toward the direction of eye gaze. The second
uses the signals to move a camera in two dimensions as a first step, and will later be upgraded
for zooming and other motions. This project should attract the attention of surgeons who use
cameras during surgery (e.g., laparoscopic surgery), since this form of control will not be an
obstacle to their work.

2. Background and Significance

As an introduction, this section presents a historical background of robotic control systems and
existing applications that use EOG as a source of control.

2.1 Types of Control

By definition, a robot is "a machine that resembles a human and does mechanical, routine tasks
on command" [1]. In this definition, the term "command" is a major factor in the robotics
field [2-4]. For that reason, most robotics engineers have tried to develop a way of
communication between the user and the robot, one that is easy yet very efficient.
Since the invention of the first autonomous robot, Elmer, in 1948, until the present day, 2008,
many command types have been established. Among them, the following types of robots can be
distinguished: autonomous robots, master-slave robots (joysticks), voice-command robots
(Aesop), and bio-potential command robots (using EMG, EEG, and others). Figures 1, 2 and 3
show the different types of robots. An autonomous robot is one that can perform a task after an
initial setup, without any continuous human guidance (figure 1). A master-slave robot
(figure 2), on the other hand, is one that needs continuous guidance from the user (e.g., the
da Vinci).

Figure 1 - Autonomous Robot

Figure 2 - Master-Slave Robot

Figure 3 - EMG-Controlled Robot

The voice-controlled robots are activated via sound commands after a process of voice-recognition
training. The bio-potential guided robot is controlled by the electrical activity of the human
body, such as the electrical activity of the muscles (EMG) or of the brain (EEG) (figure 3).

The EEG signal is complex, since it reflects the overall electrical activity of the brain, so it
is hard to detect a signal specific to a given task. EMG signals, on the other hand, which measure
the electrical activity of the motor neurons that drive muscle activity, give a clean signal that
can be used to control some types of robots. In some cases, however, those signals are not
available due to a specific pathology. For example, a spinal cord injury leads to partial
paralysis, and sometimes complete paralysis, depending on the site of the injury (lower, middle,
or upper spine). Since our major topic is robotic control, we will not go through the anatomy of
the vertebrae and the spinal cord, but it is worth mentioning that in complete paralysis the
person is still able to move his or her eyes (except in cases of Bell's palsy, a paralysis of the
facial nerve resulting in inability to control the facial muscles on the affected side [5]).
Therefore, the electrical signals generated by eye movement can serve as a very efficient control
signal in such cases, and can be used in different projects to help disabled people perform
various tasks.

2.2 EOG Signal

The electrical potential of the eye is known as corneoretinal potential [6]. This field is not related
to the light simulation since it can be detected in both total darkness and in light. The presence of
this signal is recorded via electrooculogram which is used in some clinical application like the
measurement of nystagmus (small movement of the eye). The eye movements (figure 4) produce
a moving dipole source which will present the signals that will be recorded.

Figure 4 - The signals produced by eye movement

This source behaves as if it were a single dipole oriented from the retina to the cornea. With the
eye at rest, the electrodes are effectively at the same potential and no voltage is recorded.
Rotation of the eye to the right results in a potential difference, with the electrode in the
direction of movement becoming positive relative to the second electrode; the opposite effect
results from a rotation to the left. The disadvantage of the EOG is the instability of the
corneoretinal potential, which is affected by light, fatigue, and other factors, so frequent
calibration and recalibration are needed. On the other hand, the advantages of this technique
include recording with minimal interference with the subject's activities (due to the position of
the electrodes) and minimal discomfort.

2.3 EOG Based Applications

The basic idea behind each application is to acquire signals from the eyes [6-12], amplify and
filter them, and apply them to a control unit.

2.3.1 Computer Access

a. Hands-Free Computer Access for the Severely Disabled [9]

This system uses the Cyberlink Control System (CCS) as a hands-free input device to access
alternative and augmentative communication (AAC) software. There are individuals who, because of
the severity of their physical limitations, have been unable to access AAC technology through
either direct selection or scanning via a switch. These individuals often have disabilities
related to cerebral palsy (CP), amyotrophic lateral sclerosis (ALS), multiple sclerosis (MS),
muscular dystrophy (MD), or traumatic brain injury (TBI).
The CCS uses brain and body forehead bio-potentials in a novel way to generate multiple signals
for computer control inputs. It derives control signals in three parallel channels from the
forehead potentials. The lowest-frequency channel, defined here as the EOG (electro-oculographic)
signal, is responsive to bio-potentials resulting from eye motion; it is band-pass derived within
the frequency range of 0.2-3.0 Hz. The second channel is band-pass derived between 1 and 45 Hz,
falling within the accepted EEG (electroencephalographic) range. The following frequency centers
are used: three low-frequency bands (centered at 0.95, 2.75 and 4.40 Hz), three mid-frequency
bands (centered at 7.75, 9.50 and 11.45 Hz), and four high-frequency bands (centered at 13.25,
16.50, 21.20 and 25.00 Hz). The low-frequency bands are sensitive to eye movements, and the
high-frequency bands are sensitive to forehead muscle activity. The third channel,
envelope-detected between 70 and 3000 Hz, is defined as an EMG (electromyographic) signal. It
responds rapidly and accurately to subtle contractions of the masseter and frontalis muscles and
is well suited to discrete on/off switch closures, keyboard commands, and the functions of the
left and right mouse buttons for users with any significant residual facial muscle functionality.
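The three-channel split described above can be summarized by its band edges alone. The sketch below (written in Python rather than the CCS's own software, with band edges taken only from the description above) assigns a dominant frequency to the channel or channels whose pass band contains it:

```python
# Band edges (Hz) quoted in the CCS description above; the overlap of the
# EOG and EEG bands below 3 Hz is present in the source text.
CHANNELS = [
    ("EOG", 0.2, 3.0),      # eye motion
    ("EEG", 1.0, 45.0),     # brain activity
    ("EMG", 70.0, 3000.0),  # facial muscle activity (envelope-detected)
]

def channels_for(freq_hz):
    """Return the names of the channels whose pass band contains freq_hz."""
    return [name for name, lo, hi in CHANNELS if lo <= freq_hz <= hi]
```

For example, a 2 Hz component falls in both the EOG and EEG bands, while a 100 Hz component belongs only to the EMG channel.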

A CCS Windows 95/98 mouse driver fulfills all the cursor-positioning and left/right button
functions of a manual mouse and supports ASCII character and string entry. This allows
hands-free control of third-party Windows applications, including augmentative communication and
control software such as Words Plus EZ Keys, WiViK2, Clicker 4, Dynavox for Windows, Gus,
Maxhome, and X-Ten.

b. EagleEyes: An Eye Control System for Persons with Disabilities [10]

The electrodes are connected to two psychophysiological differential amplifiers (Grass
amplifiers), one for the horizontal EOG and one for the vertical EOG. The EOG signals are quite
small: typically about 20 microvolts per degree of eye angle. So if the eyes are 5 degrees to the
right of center, the difference across the horizontal electrodes will be about 100 microvolts, a
small but detectable signal. The amplifiers multiply the signal typically by a factor of 10,000,
so the 100-microvolt signal becomes 1 volt. The EagleEyes driver software reads the horizontal
and vertical values from the data acquisition board and converts them into screen coordinates for
the mouse pointer.
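The scaling above is simple to check. The following Python sketch reproduces the stated arithmetic (20 µV per degree, gain 10,000); the screen mapping is a hypothetical linear calibration, since EagleEyes' real calibration procedure is not described in this document:

```python
UV_PER_DEGREE = 20.0   # ~20 microvolts per degree of gaze angle (from the text)
AMP_GAIN = 10_000.0    # amplifier gain quoted above

def amplified_volts(angle_deg):
    """Amplifier output in volts for a given horizontal gaze angle."""
    return angle_deg * UV_PER_DEGREE * 1e-6 * AMP_GAIN

def to_screen_x(angle_deg, width_px=1024, max_angle_deg=30.0):
    """Hypothetical linear mapping from gaze angle to a pixel column.

    Angles beyond +/-max_angle_deg are clamped to the screen edges.
    """
    clamped = max(-max_angle_deg, min(max_angle_deg, angle_deg))
    frac = (clamped + max_angle_deg) / (2.0 * max_angle_deg)
    return int(round(frac * (width_px - 1)))
```

A 5-degree gaze gives exactly the 1 V output quoted in the text.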

Figure 5 - EagleEyes Control Panel

2.3.2 Game Control
Patients suffering from paraplegia have limited or no functionality below the neck; they are
locked in their own bodies with almost no communication with the outside world. In order to give
such patients functionality and a means of communication, attempts are being made to make the
best use of whatever resources remain available to the patient, such as brain activity or eye
movement. In this work, EOG (electro-oculography) signals were obtained from a subject for five
different eye movements [11] (up, down, left, right, and blink), and a model was designed to
identify these distinct movements. The signals were then used to control a computer game
(Quake II). The algorithm works in the time domain: first peak detection, then recognition of
each peak pattern as an actual eye movement, and finally translation into game-control commands.
After a signal was decoded and the appropriate action determined, the program used a TCP/IP
connection to send the command to the computer running Quake II. Prior to running the test, the
control keys for the game were remapped to "w", "z", "a", "s", and space, corresponding to the
actions of forward, backward, left, right, and fire, respectively. Each eye movement was decoded
into a game action: an up eye movement moved the game forward, a down movement moved it backward,
a left movement moved it left, a right movement moved it right, and a blink fired the gun. For
each decoded eye movement, five characters were sent to produce a larger game movement.

Figure 6 - Game Software Stack

Since the user needed to keep the computer screen within his or her visual field while playing,
controlling the game motion with eye movements posed an interesting problem. More specifically,
if a left eye movement is used to move the game left, the user's eyes are no longer looking at
the screen; and when the user moves the eyes back to the screen, a right eye movement is
detected. To overcome this problem, all decoded EOG signals were ignored for one second after an
initial command was decoded and sent to the game. This allows a user who wants to move left to
glance left and then back to the screen, with the whole gesture interpreted as a single command.
One second after a command is sent, the program is ready to send another. This scheme worked very
well, producing reliable control of left, right, forward, and backward game movements as well as
the fire command.
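The decode-and-lockout scheme above can be sketched in a few lines. This is a Python illustration of the logic only (the key map and five-character repetition come from the text; the TCP/IP transmission is omitted, and timestamps in seconds are an assumption):

```python
# Key map stated in the text: forward, backward, left, right, fire.
KEYS = {"up": "w", "down": "z", "left": "a", "right": "s", "blink": " "}

class QuakeEOGControl:
    """One-second lockout after each sent command, as described above."""
    LOCKOUT_S = 1.0

    def __init__(self):
        self._last_sent = None  # timestamp of the last command, in seconds

    def decode(self, movement, t):
        """Return the 5-character command, or None if inside the lockout."""
        if self._last_sent is not None and t - self._last_sent < self.LOCKOUT_S:
            return None  # ignore the glance back toward the screen
        self._last_sent = t
        return KEYS[movement] * 5  # five characters per movement, as in the paper
```

A left glance at t = 0 is sent, the return glance at t = 0.4 is suppressed, and a new command at t = 1.5 goes through.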

2.3.3 Biologically-driven Musical Instrument [12]

Advances in computer science, and specifically in Human-Computer Interaction (HCI), have enabled
musicians to use sensor-based computer instruments to perform music. Musicians can now use
positional, cardiac, muscle and other sensor data to control sound. Simultaneously, advances in
Brain-Computer Interface (BCI) research have shown that cerebral patterns can be used as a source
of control. Indeed, cerebral and conventional sensors can be used together with the goal of
producing a 'body-music' controlled according to the musician's imagination and proprioception.
Some research has already been done toward integrating BCI and sound synthesis, with two very
different approaches. The first approach aims to sonify data derived from physiological analysis
by transforming them into sound.

Figure 7 - Flow Chart of the Biologically-driven Musical Instrument

This process can be viewed as a translation of physiological signals into sound. The second
approach aims to build a musical instrument: the musician tries to use his or her physiological
signals to intentionally control the sound production. This is easy for the EMG or
electro-oculogram (EOG) but difficult for heart sounds or the electroencephalogram (EEG).

3. Specific Aims

In order to complete the project on time, the team will follow a specific strategy: the project
is divided into smaller units, the team members work on the units in parallel, and the units are
then integrated to accomplish the project goals.

3.1 Software Engineering Plan

For the software engineering plan of the project, the waterfall model [13] was chosen to help
complete the project and accomplish its objectives. The waterfall model is attractive because it
provides a clear, easy path that takes you all the way from the requirements phase to the testing
phase.

Figure 8 - Waterfall Model

The waterfall model provides a neat way of testing the system. Testing starts with the
implementation phase: after the requirements are set and the design is completed, unit testing
takes place in parallel with implementation to make sure the small modules work. Then the system
is integrated and tested as one unit.
The advantages of the waterfall model include:
• It organizes the work of the developers in a step-by-step approach: the developers must finish
one part before they can start the next.
• Good testing: at each stage of development, what has been done in that stage can be tested.
• Simplicity: the implementation of the product is straightforward.
The disadvantages of the waterfall model include:
• Team members do not see the whole picture until the final stage.
• Once the team reaches the later stages, it costs too much time and effort to go back to earlier
stages, e.g. to change something in the requirements after reaching the implementation stage.

3.2 System Architecture


As a result of planning, the main units of the project are:
1- Input Unit
2- Main Unit
3- Output Unit
Figure 9 shows the detailed system architecture of the project: how it is divided into units, and
how easily the waterfall plan can be applied to it.

Figure 9 - System Architecture (EOG sensors and amplifier in the Input Unit; filtering and
segmentation in the Process Unit; translation and communicator in the Control Unit; Process and
Control Units together form the Main Unit, running on a microcontroller or PC; the robot is the
Output Unit)

4. Objective 1: Input Unit


This unit is responsible for collecting the EOG signals from the subject and sending them, after
some amplification, to the main unit for processing. The electrooculogram (EOG) is the recording
of the changes in voltage that occur with eye position. This voltage is present because the eye
acts like a spherical battery, with the positive terminal at the front (the cornea) and the
negative terminal at the back (the retina). The potential between the back of the retina and the
cornea has been measured at between 0.4 and 1 mV. It is thus possible to measure eye movement
between -70 and +70 degrees, taking the forward gaze position as zero. This gives the ability to
determine the position of the eye, since the electrodes measure the changes in potential as the
cornea moves nearer to or farther from the recording electrodes: when the cornea is nearer to the
positive electrode a positive signal is received, and when it is nearer to the negative electrode
the signal is negative.
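The sign rule above can be written down directly. A minimal Python sketch follows; the convention that positive gaze angles turn the cornea toward the positive electrode is an assumption for illustration:

```python
def horizontal_eog_sign(gaze_angle_deg):
    """Sign of the horizontal EOG channel for a given gaze angle.

    Assumed convention: positive angles turn the cornea toward the
    positive electrode. At rest both electrodes see the same potential.
    """
    if not -70.0 <= gaze_angle_deg <= 70.0:
        raise ValueError("EOG is only usable within about +/-70 degrees")
    if gaze_angle_deg > 0:
        return +1  # cornea nearer the positive electrode
    if gaze_angle_deg < 0:
        return -1  # cornea nearer the negative electrode
    return 0       # eye at rest: no voltage recorded
```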

4.1 Design

This unit has two main parts: the sensors and the amplifier. For the sensors, 6 electrodes are
required, placed as shown in figure 10.

Figure 10 - Electrode Positions for Vertical and Horizontal Eye Movement

The electrodes used are disposable EOG electrodes, chosen for their ready availability. These
electrodes are the sensors that detect the bio-signal from the eye movements. The voltages from
the electrodes are in the microvolt range, so to obtain a strong, usable signal, an amplifier
with a gain of 1000 to 5000 was added to the system. This amplification gives a voltage between
0 and 2 V, a usable signal for the project's purposes. An amplifier is available in the BIOPAC
package, which was used for the preliminary measurements, but its output cannot be controlled,
which limits how its data can be used; the team therefore had to design its own amplifier. The
design uses a pre-amplifier with a gain of 10, a main amplifier with a gain of 100, and an
overall gain of 5 in the 2nd-order low-pass filters, for a total gain of 5000. A diode at the
output of the amplifier keeps the signal suitable for the unipolar ADC. In addition, because
improper contact of some electrodes builds up a DC voltage, an offset-compensation circuit
composed of potentiometers was added to the amplifier to reduce this significant hardware
problem. The circuit [14] was designed as shown in figure 11.
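The gain chain above multiplies out as follows. This Python sketch only makes the arithmetic explicit, using the stage gains stated in the design:

```python
# Gain chain from the design above: pre-amp x main amplifier x low-pass stage.
STAGE_GAINS = {"pre_amp": 10, "main_amp": 100, "low_pass_filters": 5}

def total_gain():
    """Product of all stage gains in the chain."""
    g = 1
    for stage_gain in STAGE_GAINS.values():
        g *= stage_gain
    return g

def output_volts(eog_microvolts):
    """Amplifier output for an EOG input given in microvolts."""
    return eog_microvolts * 1e-6 * total_gain()
```

A 200 µV deflection becomes 1 V, and a 400 µV swing (the low end of the corneoretinal potential) reaches 2 V, matching the 0-2 V output range stated above.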

4.2 Implementation
The sensor part does not require any implementation, since the electrodes are available on the
market; the amplifier, however, must be built, so its components will be purchased and assembled.

4.3 Testing

For testing, this unit will be connected to an oscilloscope and the received signal will be
compared with the BIOPAC amplifier's output to check the quality of the signal.

5. Objective 2: Main Unit


This unit provides the main functionality of the system. A proposed flow chart showing the
activity of this unit is given in figure 12.

Figure 12 - Main Unit Flow Diagram (in learning mode the unit calibrates; in running mode it
reads the input, processes it, and sends the output)

This unit is composed of two units: the Process Unit and the Control Unit.

5.1 Process Unit


The team ran an experiment to observe the shape and behavior of the EOG signal. As shown in
figure 13, the signals had some problems, requiring filtering and segmentation in order to make
sense of the data.

Figure 13 - EOG Signal Results Using the Biopac MP30 Amplifier (panels: right, left, upward, and
downward eye movements; blinking; eye movement during reading)

5.1.1 Design
The filter will ensure that only the signals from the electrodes are being measured. For example,
if the subject blinks while data are being recorded, that effect can be removed using a median
filter. Since the data obtained are not free from artifacts, they will be passed through a
band-pass or low-pass filter to extract the desired signal; to determine the appropriate
frequency range, an FFT will be performed.
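The blink-removal step above can be illustrated without any toolbox. The project plans to use Matlab for this unit; the following is a plain-Python sketch of a sliding-window median filter, which suppresses short spikes (such as blinks) while preserving the slower EOG steps:

```python
def median_filter(samples, window=5):
    """Sliding-window median; the window is clamped at the signal edges."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        neighborhood = sorted(samples[max(0, i - half):i + half + 1])
        out.append(neighborhood[len(neighborhood) // 2])
    return out
```

A lone spike of width one disappears entirely, while a sustained level change (like a real gaze shift) passes through.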
For the segmentation part, the horizontal movement will be divided into 5 regions (center, 2
left, and 2 right) and the vertical movement into 3 regions (center, up, and down).
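The region split above amounts to thresholding the filtered signal. A Python sketch follows; the threshold values are placeholders, since the real ones would come from the calibration step of the Main Unit:

```python
def segment_horizontal(v, t1=0.2, t2=0.6):
    """Classify a normalized horizontal sample into one of the 5 regions."""
    if v <= -t2:
        return "far left"
    if v <= -t1:
        return "left"
    if v >= t2:
        return "far right"
    if v >= t1:
        return "right"
    return "center"

def segment_vertical(v, t=0.2):
    """Classify a normalized vertical sample into one of the 3 regions."""
    if v >= t:
        return "up"
    if v <= -t:
        return "down"
    return "center"
```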
5.1.2 Implementation
This part will be implemented with the Matlab Signal Processing Toolbox, which provides the
filtering and segmentation functions needed to build the unit.
5.1.3 Testing
This unit will be tested with input from the BIOPAC amplifier to check the accuracy of the data,
and will then be integrated with the whole system.

5.2 Control Unit

It is one of the two main parts of the Main Unit. It is responsible for translating the data
after they have been processed and deciding how the parameters should be passed to the robot by
the Communicator.

5.2.1 Design

According to its tasks, this unit is divided into 2 main parts, as shown in figure 14.

Figure 14 - Design of the Control Unit (Translation and Communicator)

a) Translation: this part receives the processed data and maps them into real robotic-control
parameters, such as PWM values, which it passes to the Communicator.
b) Communicator: this part is responsible for sending the parameters to the robot over the chosen
means of communication, such as a socket connection or a serial cable.
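The Translation-then-Communicator pipeline might look like the Python sketch below. The servo pulse widths follow the common hobby-servo convention (1.5 ms center), not measured values from this project, and the ASCII wire format is an assumption:

```python
# Hobby-servo conventions: 1500 us pulse = center position (an assumption).
PWM_CENTER_US = 1500
PWM_STEP_US = 200
REGION_OFFSETS = {"far left": -2, "left": -1, "center": 0,
                  "right": 1, "far right": 2}

def translate(region):
    """Translation: map a decoded gaze region to a servo pulse width (us)."""
    return PWM_CENTER_US + REGION_OFFSETS[region] * PWM_STEP_US

def frame(region):
    """Communicator: build the ASCII frame that would be sent over the
    serial cable or socket (this wire format is hypothetical)."""
    return f"PWM {translate(region)}\n".encode("ascii")
```

A "far right" gaze yields a 1900 µs pulse; "left" produces the frame b"PWM 1300\n".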

5.2.2 Implementation

The control unit will be implemented on a microcontroller for the neck-holder application and on
a PC for the Aesop 3000 application.

5.2.3 Testing

The unit will be tested in 2 stages. The first is a stand-alone test, in which data are entered
manually and the results are monitored with no robot connected. The unit will then be integrated
with the whole system and tested as a whole.

6. Objective 3: Robot Unit 1: Neck Holder

After designing the input unit and the main unit, it is time to show the capabilities of the new
control system. One application is to help disabled people move their neck using the position of
their eyes.

6.1 Design

All this robot requires is 2 degrees of freedom (DOF): one for moving up and down, and one for
moving left and right.

6.2 Implementation

This robot will be implemented as a prototype using servo motors, which are available on the
market.

6.3 Testing

It will be tested on a head prototype, such as a toy head, with one of the team members providing
the input.

7. Objective 4: Robot Unit 2: Aesop 3000

This robot is already designed and implemented. For testing, the Matlab simulator will be used
first, and then the real robot.

8. Timeline

Table 1 shows the time plan for the project and the tasks accomplished so far.
Table 1 - The Project Timeline

Task                                        Deadline    Person in charge      State

Choosing Project                            30-Jan      All                   Done
Research and Data Collection                7-Feb       All                   Done
Specify Aims                                10-Feb      All                   Done
System Architecture                         15-Feb      Hassan                Done
Proposal Document and Presentation          28-Feb      All                   Done
Software Engineering Plan                   25-Feb      Hassan                Done
Objective 1: Input Unit                     7-Mar       Charbel               In progress
Objective 1: Design                         26-Feb      Charbel               Done
Objective 1: Implementation                 5-Mar       Charbel               Not started
Objective 1: Testing                        7-Mar       All                   Not started
Objective 2: Main Unit                      21-Mar      Hassan and Harpreet   In progress
Objective 2: Process Unit                   15-Mar      Harpreet              In progress
Objective 2: Process Unit: Design           28-Feb      Harpreet              Done
Objective 2: Process Unit: Implementation   12-Mar      Harpreet              Not started
Objective 2: Process Unit: Testing          15-Mar      Harpreet              Not started
Objective 2: Control Unit                   15-Mar      Hassan                In progress
Objective 2: Control Unit: Design           28-Feb      Hassan                Done
Objective 2: Control Unit: Implementation   12-Mar      Hassan                Not started
Objective 2: Control Unit: Testing          15-Mar      Hassan                Not started
Objective 2: Overall Testing                21-Mar      All                   Not started
Objective 3: Robot Unit 1: Neck Holder      1-Apr       All                   Not started
Objective 3: Design                         15-Mar      Charbel               Not started
Objective 3: Implementation                 25-Mar      Charbel and Hassan    Not started
Objective 3: Testing                        1-Apr       All                   Not started
Overall System Testing                      7-Apr       All                   Not started
Final Demo and Presentation                 17-Apr      All                   In progress
Final Report                                25-Apr      All                   Not started
Back-up Days                                14 days
Objective 4: Robot Unit 2: Aesop 3000       Future aim

9. Budget

Table 2 shows the budget required for this project.

Table 2 - Project Budget

No.   Item                              Cost     Available
1     Electrodes, wires                 $80      No
2     Amplifiers                        $150     No
3     PC                                $1000    Yes
4     Microcontroller                   $250     No
5     Matlab                            $150     Yes
6     Robot                             $150     No
7     Three graduate students' work     $9000    Yes

Total cost (items to be purchased): $630

10. References

1. Craig, J. J.: Introduction to Robotics. Pearson Prentice Hall, Upper Saddle River, NJ (2005).
2. Cleary, K., Nguyen, C.: State of the Art in Surgical Robotics: Clinical Applications and
   Technology Challenges. The Catholic University of America, Washington, DC.
3. Hashizume, M., Tsugawa, K.: Robotic Surgery and Cancer. Kyushu University, Fukuoka, Japan.
4. Robotic Surgery: A Current Perspective. Annals of Surgery, 239(1): 14-21 (2004).
5. Morris, A. M., et al.: Annualized Incidence and Spectrum of Illness from an Outbreak
   Investigation of Bell's Palsy. Neuroepidemiology, 21(5): 255-261 (2002).
6. Slavicek, R., Wassef, A.: EOG for REM Sleep Detection. Senior Design Project, University of
   Illinois at Urbana-Champaign, IL (2005).
7. Barea, R., Boquete, L., Mazo, M., Lopez, E., Bergasa, L. M.: EOG Guidance of a Wheelchair
   Using Eye Movements Codification. The International Journal of Robotics Research, Vol. 22.
8. Chang, N., Gupta, V.: PS/2 Mouse Control with EOG/EMG Signals. Senior Design Project (2004).
9. Junker, A. M., Wegner, J. R., Sudkamp, T.: Hands-Free Computer Access for the Severely
   Disabled. Technology and Persons with Disabilities Conference, California State University
   (2001).
10. Gips, J., Olivieri, P.: EagleEyes: An Eye Control System for Persons with Disabilities.
    Computer Science Department, Boston College.
11. Zaveri, T., Winters, J., Wankhede, M., Park, I.: A Fast and Accurate Method for
    Discriminating Five Choices with EOG.
12. Arslan, B., Brouse, A., Castet, J.: Biologically-Driven Musical Instrument. eNTERFACE Summer
    Workshop, Mons, Belgium (2005).
13. Royce, W. (1970): Managing the Development of Large Software Systems. Proceedings of IEEE
    WESCON, 26 (August): 1-9.
14. Vinodh, K., Naveen, R., Sriram, L.: Computer Mouse Controlled by Eye. Final Project Report,
    May 5, 2004.
