
SCHOOL OF ENGINEERING SCIENCE AND TECHNOLOGY

DEPARTMENT OF MECHATRONICS ENGINEERING

FINAL YEAR RESEARCH PROJECT


ACADEMIC YEAR 2022

TITLE:

DESIGN AND IMPLEMENTATION OF A BCI CONTROLLED BIONIC ARM


BY

PRINCE TAKUDZWA MUTIKANI

REGISTRATION NUMBER: C18133261V

SUPERVISORS: Eng. D. SIMANGO

DECEMBER 2022

DECLARATION

In presenting this project in partial fulfilment of the requirements for a Bachelor of Engineering Honors
Degree in Mechatronics Engineering at Chinhoyi University of Technology, I, Prince Takudzwa Mutikani
(C18133261V), declare that this is original work except where sources have been acknowledged. The work
has never been submitted to another university for the award of a degree, nor will it ever be, and has not
been copied.

I agree that permission for extensive copying of this project for scholarly purposes may be granted by the
Head of Department or by his or her representatives. It is understood that copying or publication of this
project for financial gain shall not be allowed without my written permission.

Student’s Signature: …………………. Date ………………………….


0773759161/0719759161

Supervisor’s Signature: ………………. Date ………………………….

ACKNOWLEDGEMENTS

First and foremost, I would like to thank the Lord Almighty for the gift of life, good health and the
determination to start and finish this research project. I extend my profound gratitude to my project
supervisor, Eng. D. Simango, for administering assistance, approving my work and providing relentless
support, encouragement and all the essential guidance needed for the completion of this work. Gratitude is
also extended to the University for providing the resources needed to complete the research project. The
entire Mechatronics Department is also thanked for the support and knowledge imparted from the
beginning of my journey on the degree programme to date.

I would like to thank the AUM Robotics and Robotics Research Centre for their assistance. I would also
like to thank the reviewers at OPEN BCI for their insightful comments and suggestions.

Author Contributions:
Tatenda Bako: neural network interfacing; AUM Robotics: prosthetic action GUI; OPEN BCI: datasets and
copyright permission for scholarly use.

Statement of Student Contribution

 I independently carried out the background research, design and simulation to complete a thesis for the
School of Mechatronics Engineering (BEMC). My supervisor was Engineer D. Simango.

 Starting from scratch, I designed and modelled a low-cost myoelectric bionic arm using computer
modelling software and Proteus software.

 Using my knowledge of mechanics, electronics and programming, I produced several functional
prototypes with design improvements between each iteration.

 I wrote an original Thesis

 The above represents an accurate summary of the student’s contribution.

Contents
DECLARATION.............................................................................................................................................................................2
ACKNOWLEDGEMENTS............................................................................................................................................................3
Author Contributions:....................................................................................................................................................................3
Statement of Student Contribution...............................................................................................................................................4
ABSTRACT.....................................................................................................................................................................................6
CHAPTER ONE.........................................................................................................................................................................7
1.0 INTRODUCTION.................................................................................................................................................................7
1.2 BACKGROUND...................................................................................................................................................................7
1.3 PROBLEM STATEMENT...................................................................................................................................................8
1.4 AIM.........................................................................................................................................................................................8
1.5 OBJECTIVES.......................................................................................................................................................................8
1.6 RESEARCH QUESTIONS..................................................................................................................................................8
1.7 JUSTIFICATION..................................................................................................................................................................8
1.8 SCOPE...................................................................................................................................................................................9
1.9 DESIGN METHODOLOGY...............................................................................................................................................9
The design methodology for this project takes a quantitative approach and will be conducted as follows:.......................................9
1.9.1 RESEARCH AND DATA COLLECTION......................................................................................................................9
1.9.2 DATA ANALYSIS..............................................................................................................................................................9
1.10 DISSERTATION STRUCTURE.......................................................................................................................................9
CONCLUSION..........................................................................................................................................................................10
CHAPTER TWO.......................................................................................................................................................................11
2.0 LITERATURE REVIEW...................................................................................................................................................11
2.1 Introduction.........................................................................................................................................................................11
2.1.2 The Human Hand.............................................................................................................................................................12
2.2 DEGREES OF FREEDOM................................................................................................................................................12
2.3 Capabilities of Prosthetic hands.........................................................................13
2.3.2 The Bebionic 3..................................................................................................................................................................14
2.3.3 I Limb digits.....................................................................................................................................................................16
2.3.4 22 Degree of Freedom APL Hand...................................................................................................................................17
2.4 InMoov Robot......................................................................................................................................................................19
2.4.1 Prosthetic actuation techniques......................................................................................................................................19
2.5 Connection to the Body.......................................................................................................................................................20
2.5.1 Socket Design....................................................................................................................................................................20
2.5.2 Osseointegration...............................................................................................20
2.6 User Control........................................................................................................................................................................21
2.6.1 Electromyography sensing..............................................................................................................................................21
2.6.2 Implanted myoelectric sensors........................................................................................................................................21
2.6.3 Brain Controlled Interface..............................................................................................................................................21
2.7 Sensory Feedback................................................................................................................................................................21
2.8 Controllers used in robotics and prostheses control........................................................................................................22
2.9 Artificial intelligence in medical field................................................................................................................................24
2.9.1 Brain computer interface in medicine.............................................................24
2.9.2 Brain Computer interface in prosthetics.......................................................................................................................24
2.9.3 Neural Decoding and Interfacing...................................................................................................................................25

2.9.4 Invasive BCIs....................................................................................................................................................................26
2.9.5 Non-invasive BCIs............................................................................................................................................................26
2.9.6 BCI control systems research and development of products closely related to the proposed system.........................27
2.9.6.1 Duke University's monkey-robot arm in 2001............................................27
2.9.6.2 BCIs to move wheelchairs and mobile robots:..........................................28
2.9.7 Evaluated consumer brain-computer interfaces / different EEG sensors.....28
2.9.7.1 Mind Flex:.......................................................................................................28
2.9.7.2 NeuroSky Mindset:........................................................................................29
2.9.8 Summary...........................................................................................................................................................................29
2.9.9 Research Gaps/How the study adds to the literature:..................................................................................................30
CHAPTER 3: METHODOLOGY...............................................................................................................................................31
3.0 INTRODUCTION...............................................................................................................................................................31
3.1 Design methodology............................................................................................................................................................32
3.1.2 USER REQUIREMENTS..................................................................................32
3.1.3 SYSTEM REQUIREMENTS.........................................................................................................................................32
3.1.3 CONTROL FLOW/LOGICAL SEQUENCE OF EVENTS........................................................................................34
3.2 CONCEPTUAL DESIGN...................................................................................................................................................34
3.2.1 DOMAIN SPECIFIC DESIGNS....................................................................................................................................34
3.2.3 DEVELOPMENT OF THE SYSTEM...........................................................................................................................35
3.2.4 Ergonomics.......................................................................................................................................................................35
3.3 MECHANICAL DOMAIN.................................................................................................................................................35
3.3.1 Hand end effector..............................................................................................35
3.3.2 FOREARM........................................................................................................35
3.3.4 Objectives of the Bionic design within the mechanical domain...................................................................................37
3.4.0 MECHANICAL DESIGN CALCULATIONS..............................................................................................................38
3.4.1 Forces in fingers................................................................................................38
3.4.2 Finger actuation speed......................................................................................40
3.4.3 ELECTRICAL DOMAIN...................................................................................................................................................40
3.4.3.1 ELECTRICAL DESIGN CALCULATIONS.............................................................................................................40
3.4.3.2 Battery.............................................................................................................42
3.4.4 ELECTRONICS DOMAIN................................................................................................................................................42
3.4.4.1 Controller selection.......................................................................................43
3.4.4.2 Raspberry Pi..................................................................................................43
3.4.4.2 MOTOR DRIVERS.....................................................................................44
H – Bridge (Transistors TIP 31)...............................................................................................................................................44
3.4.5 Sensor analysis..................................................................................................................................................................44
3.4.5.1 Pressure sensor..............................................................................................................................................................44
3.4.5.2 Temperature sensor.......................................................................................................................................................45
3.4.5.3 CURRENT SENSOR.....................................................................................45
3.4.5.4 Voltage regulators..........................................................................................46
3.5.0 Electronics and Control...................................................................................................................................................46
3.5 THE DESIGN OF THE BCI SYSTEM.............................................................................................................................47
3.5.1 SIGNAL DETECTION....................................................................................................................................................48
3.5.2 Emotiv EPOC headset.....................................................................................................................................................49
3.5.3 Signal Acquisition and feature extraction......................................................................................................................50
3.5.3.1 TRANSFORM ANALYSIS...........................................................................52

3.5.3.2 Zero-crossing Feature....................................................................................54
3.5.4 Classification.....................................................................................................................................................................55
3.5.4.1 Artificial Neural Network.............................................................................55
3.5.4.2 Support Vector Machine...............................................................................55
3.5.4.3 Decision Tree..................................................................................................55
3.5.5 NEURAL NETWORK CREATION...............................................................55
3.5.6 Network model training...................................................................................................................................................56
3.5.7 ZIGBEE MODULE INTERFACING.............................................................57
3.5.7.1 Mapping Signal.............................................................................................................................................................57
3.6 INTERFACING THE NEURAL NETWORK TO THE BIONIC ARM.......................................................................57
3.7 SENSOR FEEDBACK.......................................................................................................................................................58
3.7.1 HOW IT WORKS............................................................................................................................................................58
3.7.2 TEMPERATURE AND PRESSURE SENSOR INTERFACE....................................................................................60
3.8 SYSTEM INTEGRATION DESIGN.................................................................61
3.8.1 Printed Circuit Board Design..........................................................................61
3.9 Developed firmware for data acquisition and prosthesis control...................................................................................62
3.9.1 Programming.....................................................................................................62
3.9.2 The functions for the motor control and data acquisition are as follows.......................................................................63
3.9.3 Servo Signals.....................................................................................................................................................................64
3.9.4 Program Flow...................................................................................................................................................................65
3.8.5 DESIGN VALIDATION..................................................................................................................................................68
3.9 CONCLUSION........................................................................................................................................................................69
CHAPTER 4: RESULTS AND DISCUSSION...........................................................................................................................70
4.0 INTRODUCTION...................................................................................................................................................................70
4.1 HOW THE SYSTEM WAS TESTED................................................................70
4.2 RESULTS.............................................................................................................................................................................70
4.3 NEURAL NETWORK TRAINING RESULTS................................................70
4.4 Real-Time Implementation................................................................................................................................................72
4.5 ELECTRICAL SIMULATION RESULTS.......................................................................................................................73
4.6 BILL OF QUANTITIES.....................................................................................................................................................73
4.7 System Specifications..........................................................................................................................................................74
CHAPTER 5..............................................................................................................................................................................75
5.1 CONCLUSIONS AND RECOMMENDATIONS............................................................................................................75
5.2 Overall System Performance.............................................................................................................................................75
5.3 RECOMMENDATIONS.....................................................................................75

Figure 1 shows common bones on a human hand (Alam & He, 2014)..........................................................................15
Figure 2 Shows the degrees of freedom of a single point (Ahmadizadeh, Merhi, Pousett, Sangha, & Menon,
2017)............................................................................................................. 16
Figure 3 Depiction of the degrees of freedom of a human finger..................................16
Figure 4 Commercial finger images (top) kinematic models of finger joint coupling (University, 2015).....................17
Figure 5 Shows various grip patterns of the Bebionic 3 arm (Bionics, 2022)................................................................18
Figure 6 Opposed and Non-opposed thumb position of the be-bionic 3 (Bionics, 2022)..............................................19
Figure 7 Shows examples of amputations for the i limb digits (Bionics, 2022)............................................................20
Figure 8 Shows High dof finger module (Rsearch, 2014).............................................................................................20
Figure 9 DEKA MIND CONTROLLED ARM (Rsearch, 2014)...................................................................................21
Figure 10 the Smart Hand developed by the ARTS Lab at Scuola Superiore Sant’Anna, Pisa, Italy (Dhillon GS, 2009)
...................................................................................................................................................................................... 22
Figure 11 Shows results of a survey on myoelectric prosthetic arm users (Christian Pylatiuk, 2007)...........................25
Figure 12 The Arduino Uno board (electronics, 2022)..................................................................................................26
Figure 13 : Raspberry PI Model 4 board (electronics, 2022).........................................................................................26
Figure 14 The architecture of general BCI....................................................................................................................27
Figure 15 Basic components of a BCI system for robotic arm control. The user’s neural activity is captured by a sensor
that can be placed in or on the brain or distributed over the scalp. Recorded signals are sent to a decoder, which
translates this information into commands that are passed to a robotic arm, which in turn, performs the intended
action. Visual feedback allows the user to continuously intervene and correct the effector motion; recently
somatosensory feedback has as been incorporated as well............................................................................................28
Figure 16 Implanted sensors. Source: (Donoghue, 2007)................................................29
Figure 17 EEG recording cap (J.R. Wolpaw, 2002)......................................................................................................30
Figure 18 Monkey- Robot arm experiment (J. Wessberg, 2000)...................................................................................31
Figure 19 Mind Flex game console, headset and obstacle.............................................................................................32
Figure 20 V- model, VDI 2206 (Vasilije S. Vasić (2008), 2003)....................................................................................34
Figure 21 SHOWS THE CLOSED LOOP COMMUNICATION BETWEEN THE PATIENT AND THE ARM...............37
Figure 22 Shows the palm of the hand..........................................................................38
Figure 23 Exploded view of the forearm.......................................................................................................................39
Figure 24 full mechanical assembly of the arm.............................................................................................................39
Figure 25 exploded view of the full arm assembly........................................................................................................40
Figure 26 shows the forces on the elbow gears.............................................................................................................41
Figure 27 DEPICTION OF FORCES ACTING ON FINGERS...................................42
Figure 28 force distribution when gripping an object....................................................................................................43
Figure 29 shows the rotation speed and absorbed current as functions of exerted torque..............................................44
Figure 30 Figure shows the battery simulation run in MATLAB software....................................................................45
Figure 31 Arduino Pro mini (electronics, 2022)............................................................................................................46
Figure 32 Shows the raspberry pi zero controller (electronics, 2022)............................................................................47
Figure 33 H – Bridge (Transistors TIP 31) (electronics, 2022)......................................................................................47
Figure 34 pressure sensor (Bionics, 2022).....................................................................................................................48
Figure 35 LM 35 temperature sensor (electronics, 2022)..............................................................................................48
Figure 36 AC712 current sensor (interface, 2022)........................................................................................................49
Figure 37 Interface screen on the PC with implemented graphical user interface (GUI). courtesy of AUM ROBOTICS
(ROBOTICS, 2022)......................................................................................................................................................50
Figure 38 Block diagram of the BCI-controlled robotic arm.........................................................................................51
Figure 39 Diagram of the Emotiv Epoc Headset (Duvinage.m.Castmans, 2013)..........................................................53
Figure 40 brainwave streaming via an 8-channel headset in real time, ready for acquisition.......................54
Figure 41 shows the raw EEG data passing through the filter................................................55
Figure 42 shows the code snippet for data representation............................................56
Figure 43 shows the graphical representation of the EEG data sampled using the Fast Fourier transform....................56
Figure 44 The zero-crossing feature algorithm for sampling the raw EEG that will be passed to the CNN for
training..........................................................................................................57
Figure 45 shows extracted zero crossing feature from python graph representation......................................................57
Figure 46 shows the model training using different sampling rates and batches before testing.....................................59
Figure 47 shows the implementation of the neural network to the flask server.............................................................60
Figure 48 shows the process control of the arm with sensory feedback........................................................................62
Figure 49 shows the temperature and force sensor integration designed in Proteus.....................................................63
Figure 50 shows the electronic circuit of the components and their connections in Proteus..........................................64
Figure 51 shows the printed circuit board for the circuit designed................................................................................65

Figure 52 the figure shows the motor control functions used during the program creation...........................................69
Figure 53 shows the closed loop signal flow during the control and the data from sensors...........................................70
Figure 54 shows the program flow of the system..........................................................................................................71
Figure 55 Checklist table (Table 12)..............................................................................................................................72
Figure 56 shows the confusion matrix of the neural network, validation and the accuracy test matrix tested using
MATLAB software........................................................................................................................................................74
Figure 57 shows a histogram of the human gestures, expressed in %, plotted in MATLAB...........................................75
Figure 58 The battery simulations results simulated in MATLAB software..................................................................76

ABSTRACT

Through the use of a non-invasive BCI control method, this research project entailed the design and construction of a working initial prototype of a bionic human arm: an anthropomorphic robotic hand that can autonomously determine an object's shape and choose the best technique for gripping it. The BCI control system makes the arm mobile by converting brainwave patterns into movements, which are used to determine an object's shape, the position of each finger, the strength of the hold, and the quality of the grip.
Though the hand might be modified to work on various robot platforms as a versatile gripper, the
technology's intended usage is in the medical industry as a prosthetic.
As a prosthesis, this system has an advantage over comparable, commercially available products in that its autonomous functions enable the user to quickly and easily access a variety of functionality. The hand contains a battery and a microcontroller, and multiple alternatives for signal input and control methods are offered.
A prototype will serve as a platform for future programming efforts. The robot is designed according to the VDI 2206 guidelines for mechatronic systems design. Modelling and simulation were used to evaluate and validate the design of the bionic arm: SolidWorks software was used to carry out a stress simulation study to determine whether the mechanical frame could withstand the 2 kg maximum workload, giving confidence in the system's feasibility upon implementation.

Keywords: prosthetics, mechatronics, force sensors, simulation, biomedical.

Topic: Design of Brain-computer interface-controlled bionic arm

CHAPTER ONE
1.0 INTRODUCTION
Patients have difficulty being fully reintroduced to their workplaces in the post-amputation era, which has a catastrophic influence on their health and leads to psychological distress and financial loss (Desmond & MacLachlan, 2002). Unfortunately, individuals who are perceived as "other" in modern culture frequently experience the prosthesis as a foreign body (poor embodiment). As a result, new prosthetic solutions must be developed that are more effective and that patients can embody. Amputees do not want to stand out; they simply want to fit in, receive normal treatment, and be able to live normal, fulfilling lives (López Sullaez, 2009).
Hand injuries are of particular importance because the hand is an exceptional anatomic region of high value: it is used in nearly all professions and occupations, and any level of amputation results in a degree of disability that may permanently limit a person's ability to perform basic activities such as feeding and grooming. Even so, amputees are strong and capable people who make do with what they have and can overcome adversity (Camacho-Conchucos, 2010).

Nowadays many prostheses exist that resemble real hands and can achieve almost half of their functions; the newest can be driven by mental or nerve signals. However, most people on a basic salary cannot afford them (Bradford, 2015). This research is therefore about creating a prototype of an orthopaedic hand that can be made at a low price using 3D-printing technology; myoelectric prosthetic arms will also be explored. The aim is to design a device that mimics the functions of a healthy human arm as closely as possible and can be controlled, to some extent, by directly recognizing the brainwave patterns for a particular movement of the arm.

The design will consist of a mechanism of five individually actuated digits to allow the arm to perform all actions with ease, and a carbon mechanical frame with sensors, including force sensors for grasping-force and motor-torque detection and temperature sensors for measuring the environmental temperature to give feedback to the user. The mechatronics VDI 2206 methodology will be used to arrive at the final design solutions, and the control circuits will be designed using Proteus. The literature review will cover prosthetic arms, the existing myoelectric and orthopaedic robot designs and patents, their operational principles, and their relative advantages and disadvantages.

1.2 BACKGROUND

A whole upper-arm prosthesis was heavily considered at the outset of this project. One could contend that a person's physique is their most priceless possession; the difficult effort of replacing a missing human limb, particularly a hand, makes one properly appreciate the complexities of the human body. While many businesses and investors have been working for ages to develop full lower arms with articulated elbows, the shoulder has long been ignored. Although the market has largely reflected the continual advancements in prosthetic technology, there is currently no room for a full upper-arm prosthetic device, since the price would be so high that there would be no potential clients (Hand, 2014).

Until recently, prosthetic limbs progressed relatively slowly, and early innovations such as wooden legs and prosthetic hooks can be regarded as simple prosthetic devices. Recent times, however, have given way to enormous advancements in prosthetic devices. The focus is not only on the physical aspects of a device but also on its control and biofeedback systems. Slowly we are approaching an advanced trans-human integration between machine and body; perhaps sometime in the future prosthetic devices will be faster, stronger, and maybe even healthier than our biological limbs.

The American military has recently shown interest in modernizing prosthetic technologies to provide injured
soldiers with new limbs. IEDs and landmines on the battlefield resulted in the injuries of numerous
servicemen. These soldiers frequently suffered shrapnel wounds that necessitated amputations or immediate
limb loss. The military had the resources to aid injured troops who had sacrificed limbs while defending
their country. It was evident that lower-extremity prosthetics were useful and widely accessible, but that upper-extremity prosthetics had not changed since the Civil War. When it came to having an arm amputated, the majority of people merely made do with what they had.
However, some soldiers even lost both of their arms in these tragedies, and it was clear that something needed to be done for them (Adee, 2008).

The Defence Advanced Research Projects Agency, also known as DARPA, made the decision to invest
heavily in upper extremity prosthetic limbs. They decided on a two-pronged strategy and gave multiyear
contracts to Dean Kamen's DEKA research and the Johns Hopkins University Applied Physics Laboratory.
DARPA has essentially found a solution to the conundrum of creating the most sophisticated robotic
humanoid arms (Clay, 2011).

The arms could perform nearly all the tasks that human arms can; their only drawback was their price. Although the products haven't been built at scale, lab prototypes have been created for the $100 million APL project and the $20 million DEKA project. These arms weren't made with mass manufacturing and consumer prices in mind (Bradford, 2015).

1.3 PROBLEM STATEMENT


Every year there are over 6,000 people with upper-limb amputations and neuromuscular disorders in Zimbabwe alone, leaving thousands of individuals disabled and deformed. Aside from the discomfort caused by curious stares from onlookers, these injuries can greatly hinder an individual's ability to perform even basic daily tasks.

1.4 AIM

To develop a brain-computer interface (BCI)-controlled bionic arm that restores the natural behaviour of a healthy hand in patients with amputations and neuromuscular disorders.

1.5 OBJECTIVES

• To develop real-time control for sensory feedback of the arm.


• To develop pattern recognition of the user's intent for controlling the movement and grasping conditions of the hand.
• To develop an arm physical design that mimics the human arm in terms of dexterity.
• To integrate the bionic arm with the BCI system.

1.6 RESEARCH QUESTIONS


 How is the implementation of the arm going to affect the amputee's residual limb?
 What types of materials are going to be used on the arm that do not affect the user's skin
or well-being?
 Will the hand be safe to use and handle during operation?
 What are the neural effects imposed by the use of the arm?

1.7 JUSTIFICATION
To produce a functional prosthetic arm there are numerous design and manufacturing challenges to
overcome. The challenge in this thesis is to create an arm of reasonable complexity and quality which can be
further used for research in the field of prosthetics. The reader needs to note that this work encompasses
several different engineering fields. The appropriate discussion will be allocated to each field and we shall
aim to tie together all areas into a single, robust, functional system. Below are the major areas that will be
addressed throughout this thesis.

 The development of a highly versatile upper body prosthetic device that addresses the issues of cost
and usability of today’s commercially available prostheses.

 This project provides a proof-of-concept prototype of the bionic human arm: an anthropomorphic robotic hand capable of determining the suitable grip for different objects and of moving like a real hand, without any human intervention other than a brain-computer interface that controls the arm using neural impulses.

 Through the implementation of advanced sensing technologies in the device, this project aspires to develop a prosthetic hand that is as natural and easy to use as a person's organic extremity, without the need for invasive surgical procedures.

 The issue of usability will be primarily addressed by the implementation of a digital vision sub-
system and impulse electrodes embedded in the prosthetic that will allow the arm to determine the
shape of the object the user is reaching for.

 The end goal is that the user will only need to reach for an object and tell the device when to close.
The process of selecting and executing a particular grip pattern will be taken care of automatically,
much like what is done naturally in our subconscious.

 The issue regarding cost will be addressed through the use of inexpensive materials such as plastics,
and low-cost manufacturing techniques such as laser cutting and 3D printing.

1.8 SCOPE

The scope of this project is to design an autonomous robot using already existing engineering principles:

• The hand will consist of 6 DOF, 5 individually actuated fingers, plus a thumb roll
• Attachment locations will include currently used standard universal socket methods
• To equip the hand with light material that will accommodate the user’s weight.

1.9 DESIGN METHODOLOGY

The design methodology for this project will take more of a quantitative approach and will be conducted as
follows:
1.9.1 RESEARCH AND DATA COLLECTION
This part will involve the use of research methods and tools through a review of relevant literature, various
engineering journals, and engineering design books.
1.9.2 DATA ANALYSIS
Data analysis will involve quantitative, and qualitative analysis including the comparison of the existing
designs to the proposed one, relating the set operating conditions/standards, and the data collected from
various sources mentioned above.

1.10 DISSERTATION STRUCTURE


The research is divided into five chapters as shown below:
 Chapter 1: Introduction. This chapter introduces the concept of prosthetics and BCI control in the
biomedical field, outlining the background and statement of the problem that supports the need for
the design of the brain-computer-interfaced bionic arm.
 Chapter 2: Literature Review. This chapter focuses on the essential theory and critical assessment of
the related work needed for the development of the project.
 Chapter 3: Methodology. This chapter presents the materials and methods that were adopted to meet
the objectives of the study.
 Chapter 4: Results. This section presents results attained from experimental work and simulation
tests carried out in the development of the system.
 Chapter 5: Discussions. This section presents an analysis of the results, providing conclusive remarks
for the research and recommendations for future work.

CONCLUSION

This chapter introduced the background of the problem: the challenges faced by amputees and patients with neuromuscular disorders or paralysis, which affect one's day-to-day life because of disability. The project seeks to mitigate the high prosthetic costs that leave amputee patients in the biomedical industry without restoration, through the design of a BCI-controlled bionic arm.

CHAPTER TWO
2.0 LITERATURE REVIEW
It is our responsibility to study the literature and provide summaries of the significant developments in engineering, computing, and medicine that have contributed to the bionic hand prostheses currently on the market. Following an introduction to robotic applications in the biomedical field, a review of the literature is conducted.
2.1 Introduction
For the past decades, there have been rapid technological advancements in the development and use of medical robots in both developed and developing countries, owing to an increase in the elderly population (Masakatsu & Bo, 2020), and bionic prosthetic hands are rapidly evolving. An in-depth knowledge of this field of medicine is currently required by only a small number of individuals working in highly specialist units. However, with improving technology, the demand for and application of bionic hands will likely continue to increase and a wider understanding will be necessary (R G E Clement, 2011). Prostheses have been in our lives since ancient times and have made a significant impact on our living (Alam & He, 2014). They were primarily utilized for function and cosmetic appearance (Thurston, 2007). Today, owing to the advancement of technology, artificial hands can improve not only functional but also psychological well-being (Desmond & MacLachlan, 2002).

Although significant progress has been made in the field of artificial hand design, it remains an open problem with ongoing research; the development of new materials and new types of sensors and actuators is also promising for the next generation of robotic hands. Among the state-of-the-art solutions for robotic hand design, characterized by great functionality and dexterity, is the Robonaut hand designed by NASA (C. S. Lovchik, 5 May 1999).

Human hands are vital for performing sophisticated movements, enabling human beings to interact with their surroundings and carry out most daily activities such as feeding and washing. Therefore, it is possible to regain missing functions with a prosthetic limb to
enhance the quality of life of the amputee. Although prostheses can assist amputees in retrieving the
ability of hands, developing a bionic hand capable of mimicking a natural human hand remains a
challenge (Francesca Cordella, 12 May 2016). The need for sensory feedback in prostheses is debated, but several recent studies have found that it is something prosthesis users desire in their hand prostheses in addition to comfort, function, appearance, and durability. Even if performance in grasping tasks is already good, feedback could be beneficial for complex tasks and for situations when visual feedback is constrained. Regardless of the possible improvement in performance, the subjective experience of embodiment tends to increase when feedback is added. In addition, some studies have reported reduced phantom limb pain when sensory feedback is added to a prosthetic hand (Christian Pylatiuk, 2007).

In recent years researchers have tried to provide sensory feedback in hand prostheses in different ways. Research by (Schofield & Katherine R Evans, 2014) on applications of sensory feedback in motorized upper-extremity prostheses reviews how dexterous hand movement is possible due to closed-loop control dependent on efferent motor output and afferent sensory feedback. This control strategy is significantly altered in those with upper-limb amputation, as sensations of touch and movement are inherently lost. For upper-limb prosthetic users, the absence of sensory feedback impedes the efficient use of the prosthesis and is highlighted as a major factor contributing to user

rejection of myoelectric prostheses. Therefore, to bridge the technological gap between artificial and
biological skin, several efforts have been made in the generation of electronic skin for the prosthetic
hand among other things (Jun Chang Yang, 2019) .
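The closed-loop principle described above (motor output continuously corrected by sensed grip force) can be sketched as a simple proportional controller. The gain, force units, and toy plant model below are illustrative assumptions of my own, not parameters from the cited studies:

```python
def grip_force_step(target_n, measured_n, command, kp=0.05, cmd_max=1.0):
    """One closed-loop update: the error from the force sensor corrects the motor command."""
    error = target_n - measured_n          # afferent feedback compared to intent
    command = command + kp * error         # efferent correction
    return max(0.0, min(command, cmd_max)) # saturate the motor command

# Toy simulation: the measured grip force settles toward the 10 N target as the loop runs
command, force = 0.0, 0.0
for _ in range(200):
    command = grip_force_step(10.0, force, command)
    force += 0.5 * (20.0 * command - force)  # assumed first-order plant response
print(round(force, 1))
```

Without the feedback term (an open-loop prosthesis), the command would have to be chosen blindly, which is exactly the limitation the cited reviews describe.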
The main aim of this research is to design an artificial extremity that can perform the sensation and execution of sophisticated motor functions. The most common limitation that the majority of prostheses share is that they are open-loop systems lacking proprioceptive feedback, which prevents them from achieving the full functionality of the natural limb or organ (Wendy Yen Xian Peh, 25 September 2018). In past studies there has been limited success in attempts to create a bionic hand that works the same way as a healthy human hand.

2.1.2 The Human Hand


The human hand has at least 27 bones, more than 30 different muscles, and more than 100
identifiable ligaments, nerves, and arteries depending on the individual (Alam & He, 2014). People
who have lost an extremity can resume performing physical functions with the aid of a prosthesis.
The human hand's flexibility, dexterity, and fluidity are unsurpassed by any prosthetic at this time.

Figure 1 shows common bones on a human hand (Alam & He, 2014)

2.2 DEGREES OF FREEDOM

Figure 2 Figure shows the degrees of freedom of a single point (Ahmadizadeh, Merhi, Pousett,
Sangha, & Menon, 2017)
Consider a point in space, as in the image above. From this position it can translate (move) along three different axes: left/right, up/down, and forward/backward. It can also rotate around three separate axes at the same location. The human neck, for instance, has three rotational degrees of freedom, allowing us to look up, down, and sideways while also tilting our heads. Consequently, a single point can have up to 6 degrees of freedom in total (3 translational, 3 rotational). The human finger has four degrees of freedom in total (George ElKoura, 2003). The rotations of the three joints, DIP, PIP, and MCP, combine to regulate the flexion and extension of the finger; abduction and adduction (wiggling) are also possible at the knuckle (MCP joint). Fingers, like all joints in the human body, are actuated (moved) via the contraction of muscles and tendons.

Figure 3 Depiction of the degrees of freedom of a human finger: DIP (1st degree of freedom), PIP (2nd), MCP (3rd and 4th)
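The six degrees of freedom of a point can be made concrete as a pose: three translations plus three rotations composed into one homogeneous transform. A minimal pure-Python sketch (the roll-pitch-yaw convention and function name are my own choices, not from the cited sources):

```python
import math

def pose_matrix(tx, ty, tz, roll, pitch, yaw):
    """Build a 4x4 homogeneous transform from 3 translations and 3 rotations (radians)."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    # Rotation Rz(yaw) @ Ry(pitch) @ Rx(roll), with the translation in the last column
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr, tx],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr, ty],
        [-sp,     cp * sr,                cp * cr,                tz],
        [0.0,     0.0,                    0.0,                    1.0],
    ]

# All six parameters zero gives the identity pose
identity = pose_matrix(0, 0, 0, 0, 0, 0)
```

Each of the six arguments is one independent degree of freedom; fixing any of them removes a degree of freedom, which is exactly what mechanical joint constraints do.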

2.3 Capabilities of Prosthetic hands

The vast majority of commercial prosthetic fingers are actuated through a joint-linkage system powered by DC electric motors (Belter, 2013). Kinematic models for various prosthetic fingers are shown below.


Each finger incorporates its own mechanism to mechanically couple the joints together: rotating the metacarpal joint (knuckle) simultaneously rotates the higher phalange joint.

Figure 4 Commercial finger images (top) kinematic models of finger joint coupling (University,
2015)
The problem with this type of design is that there is no control over individual finger joints. All joints in the finger are controlled through a single actuator, which means the entire finger has only a single degree of freedom: these fingers can only open and close (Childress, 1985).
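The joint coupling described above can be sketched numerically: one actuator value drives the MCP joint, and the PIP joint follows at a fixed mechanical ratio, so the whole finger still has only one degree of freedom. The 0.7 ratio and the 90-degree limit below are illustrative assumptions, not values from any cited design:

```python
def coupled_finger(mcp_deg, coupling_ratio=0.7, mcp_max=90.0):
    """One actuator drives the MCP joint; the PIP joint follows mechanically.

    With joint coupling there is only one independent input, so the finger
    has a single degree of freedom regardless of how many joints bend.
    """
    mcp = max(0.0, min(mcp_deg, mcp_max))  # actuator travel limit
    pip = coupling_ratio * mcp             # mechanically coupled joint
    return {"MCP": mcp, "PIP": pip}

print(coupled_finger(60))  # PIP is fully determined by MCP
```

Independent PIP control would require a second actuator, which is precisely what these commercial linkage designs omit.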
Dexterity arises from the numerous degrees of freedom of the human hand. The fine motor control a
person has over their finger joints allows for a vast array of intricate tasks to be achieved. In contrast,
commercial prostheses are limited to simple tasks partially due to the lack of fine control in the
fingers. For example, trying to knit, sew or play a musical instrument like a guitar with a modern
commercial prosthetic device would be extremely difficult if not impossible. (Allin S, 2010)

2.3.2 The Bebionic 3


The Bebionic 3 is a world-leading commercial myoelectric arm. Like others of its kind, the Bebionic
3 uses a predefined grip system. A user can select from 14 different grip patterns using muscle
activity around their upper forearm (Francesca Cordella, 12 May 2016). The user does not have direct control of individual finger movements; rather, they select a grip pattern and then use muscle activity to activate the movements of that specific grip. Four of the fourteen grips of the Bebionic 3 are shown below.


Figure 5 Shows various grip patterns of the Bebionic 3 arm (Bionics, 2022)
The problem with a predefined grip system is that the user cannot finely control finger positions to grip a specific object or complete a task. Rather, the user must choose the grip pattern that best suits the job at hand and then actuate that grip pattern.

Furthermore, the user must cycle through several grip patterns before they get to their desired choice.
For example, unzipping a bag, picking up a heavy object, placing it in that bag, and then zipping the
bag up could require several grip changes. As a result, certain simple tasks like this could take quite
some time to complete and can become tedious and frustrating.
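The cost of this cycling can be made concrete with a toy model: with a single "next grip" trigger and N grips arranged in a ring, reaching an arbitrary grip takes up to N-1 activations. The grip names below are illustrative, not the Bebionic 3's actual list:

```python
GRIPS = ["power", "pinch", "tripod", "key", "point", "open"]  # illustrative grip ring

def switches_needed(current, target, grips=GRIPS):
    """Number of 'next grip' triggers needed to cycle from current to target."""
    i, j = grips.index(current), grips.index(target)
    return (j - i) % len(grips)

# Unzip bag -> grab heavy object -> zip bag: three grip changes in a row
task = ["pinch", "power", "pinch"]
total = sum(switches_needed(a, b) for a, b in zip(["open"] + task, task))
print(total)  # 8 trigger activations for one short errand
```

Eight deliberate muscle activations for a task a biological hand performs without thought illustrates why users find predefined grip systems tedious.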

The thumb accounts for arguably 40 percent of human hand use (Ahmadizadeh, Merhi, Pousett,
Sangha, & Menon, 2017). Thumb design is critical in all prosthetic hands and is more complex than
the other fingers.

The Bebionic 3 has an adjustable thumb that can be placed in an opposed or non-opposed position; the difference between these positions can be seen in the images below. The prosthesis cannot change the thumb's position by itself: to switch between opposed and non-opposed positions the user must apply an external force to "click" the thumb into place, e.g., use the other hand to change the prosthetic thumb position.


Figure 6 Opposed and Non-opposed thumb position of the be-bionic 3 (Bionics, 2022)

Suppose a user of the Bebionic 3 is using a computer mouse. If that user reaches to pick up a bottle of water, not only would they have to change the thumb position using their other hand, but they would also need to cycle to a new grip state. The user would be better off using their other biological hand to fetch the water bottle in the first place. In such a situation this prosthesis provides no practical benefit.

With all that being said, prosthetic devices are intended to provide more than functional practicality.
For many amputees, the loss of an extremity is also accompanied by a significant decrease in
confidence and self-esteem. A prosthetic arm can help alleviate these issues (Francesca Cordella, 12 May 2016).

Physical appearance is an important aspect of prosthetic arms. A survey of myoelectric prosthetic hand users found that the majority of adult users were dissatisfied with their device's cosmetic appearance (Christian Pylatiuk, 2007).
The Bebionic 3 is controlled through electromyography (EMG) electrodes placed on the surface of
the user’s skin. The placement of these electrodes depends on the level of amputation but is usually
around the upper forearm. The bebalance computer software can be used to adjust several settings to enhance the user's control of the device and tune the system to the user's myoelectric signals.

The level of EMG control is dependent on the specific amputation but is generally limited to only a
few different commands which cannot be executed simultaneously.
It is due to this limitation in control that a predefined grip system and a manually adjustable thumb were designed. If a more complex system were designed, the user would simply have no way of controlling it.

2.3.3 I-Limb digits


One major setback in developing prosthetic arms is that amputation can occur at any point along the arm and is unique in every case. The Bebionic 3 arm previously discussed incorporates electric motors into the palm to actuate the fingers. As a result, the Bebionic would be of no use to an

amputee who has lost several fingers but still has their palm intact.
The I-Limb digits developed by Touch Bionics incorporate electric motors directly into the prosthetic fingers (Bionics, 2022). This allows the palm area to fit into a socket connection attaching the prosthetic fingers to the hand. The image below shows possible amputations which would be suitable for use of the I-Limb digits.


Figure 7 Examples of amputations suitable for the I-Limb digits (Bionics, 2022)
A custom socket is designed to fit around the remaining area of the user's palm, and as many digits as necessary can be added to the system. The control for the I-Limb digits is the same as that for the Bebionic 3 arm. The disadvantage of this system is that relatively small motors have to be used to fit inside the fingers; this leads to slower digit movement and weaker grip than in other commercial prostheses.

2.3.4 22 Degree of Freedom APL Hand


One of the most advanced modern prosthetic arms is the 22-degrees-of-freedom Intrinsic Hand developed at the Johns Hopkins Applied Physics Laboratory (Rsearch, 2014). This hand has been developed through DARPA initiative and funding and has unmatched mechanical dexterity. To achieve such fine control, the designers incorporated a total of 15 miniature DC motors directly in the fingers, palm, and wrist.

Figure 8 Shows High dof finger module (Rsearch, 2014)

Furthermore, this device is designed to fit a 50th-percentile female arm making it truly exceptional in
terms of its complexity and size. The Intrinsic Hand can replicate almost every movement of the
biological human hand. Using standard EMG sensing techniques there is no way of obtaining enough

control for a user to practically use all the degrees of freedom of this device. However, DARPA is
further funding the development of a prosthesis/brain neural interface to connect the user’s nervous
system directly to inputs in the arm.

Figure 9 The DEKA mind-controlled arm (Rsearch, 2014)


Also, a 16-degrees-of-freedom (DOF) Smart Hand prosthesis was created by the ARTS Lab at Scuola Superiore Sant'Anna, Pisa, Italy. Its actuation, based on four DC motors and tendon transmission, helps the hand execute the core grasping tasks useful in everyday living activities (Figure 10). The artificial hand is fitted with a customized palm-embedded control architecture to move its actuators in multiple control modes and to share proprioceptive and exteroceptive sensory inputs from the external world.

The hand's fingers contain 32 location, force, and tactile sensors, all of which are read by the local controller and are capable of providing input to patients through suitable interfaces (Cipriani, Controzzi, & Carrozza, 2009).

Figure 10 The Smart Hand developed by the ARTS Lab at Scuola Superiore Sant’Anna, Pisa, Italy
(Dhillon GS, 2009)

2.4 InMoov Robot


Gael Langevin is a French model builder and artist who has no formal training in robotic engineering.
In January 2012, he began working on a personal project named InMoov to build a life-size
humanoid robot (J., 2015). The InMoov robot seen in Figure 5 can be manufactured on a 3D printer with a build volume of 12 × 12 × 12 cm, and it was created using free and open-source software. The 3D part files (.stl) are open-source and available for download from Gael's website (InMoov.blogspot.com). Gael had the concept to combine servos and an Arduino to create a
programmable robotic hand. The hand can be controlled by using a keyboard and can move at many
speeds. In 8 months, Gael's project went from a simple hand to a torso, arms, and head. This robot
has a moving capability just like a human being: the fingers can move, the hand can twist, and the elbow can flex and extend (Escribà Montagut, 2013).

2.4.1 Prosthetic actuation techniques


Actuators are defined as devices that convert chemical, electrical, and thermal energy into
mechanical energy or physical motion, either linear or rotary. The main types of actuation are pneumatic, hydraulic, and electric.

Electric actuators convert electrical energy into linear or rotary motion with better accuracy and repeatability than other actuators, and are commonly used in clean-room applications because they are well-suited for electronic control (Jenkins, 2005). In most cases, electrical actuators have wide industrial application due to the simplicity of controlling force, speed, and position. DC motors are widely used in industrial robot systems due to characteristics such as constant power output and rapid acceleration/deceleration. They spin when voltage is applied to the terminals, and both the no-load speed and the stall torque are directly proportional to the applied voltage. When incorporated with a gear system, a DC gear motor is formed, with reduced speed but increased output-shaft torque.
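The proportionality noted above gives the classic linear torque-speed characteristic: torque falls from the stall value to zero at the no-load speed, and both endpoints scale with voltage. A sketch with illustrative motor constants (not data for any particular motor):

```python
def torque_at_speed(voltage, speed_rpm, kv_rpm_per_v=100.0, stall_nm_per_v=0.02):
    """Linear DC motor model: torque falls linearly from stall to zero at no-load speed."""
    no_load_rpm = kv_rpm_per_v * voltage     # no-load speed is proportional to voltage
    stall_torque = stall_nm_per_v * voltage  # stall torque is proportional to voltage
    if no_load_rpm == 0:
        return 0.0
    return max(0.0, stall_torque * (1.0 - speed_rpm / no_load_rpm))

print(torque_at_speed(12.0, 0.0))     # stall torque at 12 V (≈ 0.24 N·m here)
print(torque_at_speed(12.0, 1200.0))  # zero torque at the no-load speed
```

Adding a gearbox of ratio G would divide the speed by G and multiply the output torque by G (ignoring losses), which is why geared DC motors suit finger actuation.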

Also, BLDC (Brushless DC) motors are those DC motors with no brushes but use electronic
commutation. There are a variety of speed control methods for these motors including proportional-

integral (PI), PWM (Pulse Width Modulation), and proportional-integral-derivative (PID) controllers.
PWM is a technique that can be used to precisely control DC motors by varying the duration of pulses from the controller board. This enables a constant torque output, avoiding the motor jerks posed by other control techniques such as linear speed controllers. (Nikhil & Deivanathan, 2018) modeled and analyzed the application of a DC motor as an actuator in two-finger robotic grippers for pick-and-place, employing the PWM technique. A Simulink model was developed to confirm the motor model's reliability and performance. The DC motor converted rotary motion into linear motion for the gripper rod responsible for opening and closing the grippers. Results with and without PWM were compared, and according to the authors, better control of the grippers was accomplished by varying the PWM duty cycle.
Motor speed, supply voltage, output force, and torque are some of the factors considered when
selecting DC motors.
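As a rough numerical illustration of the duty-cycle relationship described above, the sketch below models the average voltage a PWM-driven DC motor sees and its ideal no-load speed. The supply voltage and speed constant are hypothetical values, not measurements from this project:

```python
def pwm_average_voltage(v_supply, duty_cycle):
    """Average voltage applied to the motor for a PWM duty cycle in [0.0, 1.0]."""
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty cycle must be between 0 and 1")
    return v_supply * duty_cycle

def no_load_speed_rpm(v_avg, kv_rpm_per_volt):
    """Ideal no-load speed: directly proportional to the applied voltage."""
    return v_avg * kv_rpm_per_volt

# Hypothetical 12 V motor with Kv = 500 rpm/V driven at 60% duty cycle
v_avg = pwm_average_voltage(12.0, 0.60)      # 7.2 V average
speed = no_load_speed_rpm(v_avg, 500.0)      # 3600 rpm
```

Varying the duty cycle therefore scales the average voltage, and with it the no-load speed, without the dead-band behaviour of a linear controller at low commands.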

For real-time current compensation, faster control response, and increased accuracy, Shih-Hsiang,
Pei-Chong, Yuan-Chiu, and Chyi-Yeu (2019) proposed sensorless, low-gain control of BLDC
motors focused on improved safety and reduced cost. The system used Hall-effect and current
sensors for position and torque control, a dynamic compensator to avoid slow response, and
low-pass filters to avoid noise and under-sampling. A seven-axis robot was used for experiments,
and the results verified the system's applicability in service robots with the required response
and accuracy.
Servo motors are DC gear motors with a servo control circuit that rotates the shaft to a specific
angle in response to PWM signals from the MCU (Fatma, Sıtkı, & Fatihhan, 2022).
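The angle-to-PWM relationship for such servos can be sketched in a few lines. The sketch below assumes the common hobby-servo convention (a 1.0–2.0 ms pulse in a 20 ms frame mapping to 0–180°); actual endpoints vary between servo models:

```python
def angle_to_pulse_us(angle_deg, min_us=1000, max_us=2000):
    """Map a servo angle (0-180 degrees) to a PWM pulse width in microseconds.

    Assumes the common hobby-servo convention: a 1.0 ms pulse commands
    0 degrees and a 2.0 ms pulse commands 180 degrees, repeated in a
    20 ms (50 Hz) frame.
    """
    if not 0 <= angle_deg <= 180:
        raise ValueError("angle out of range")
    return min_us + (max_us - min_us) * angle_deg / 180.0

# Mid-travel (90 degrees) corresponds to a 1.5 ms pulse
pulse = angle_to_pulse_us(90)   # 1500.0 microseconds
```

The servo's internal control circuit then drives the motor until the shaft's feedback potentiometer matches the commanded pulse width.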

2.5 Connection to the Body


2.5.1 Socket Design
The first goal of prosthetic management is protection of the residual limb, since 90% of upper-limb
amputations arise from physical trauma (Lake, 2019). Prolonged pressure exerted on damaged
soft-tissue areas can significantly compromise the remaining appendage. Problems an amputee may
experience include pain, swelling, blisters, skin irritation, edema, and restricted blood flow.

The most common way of fitting a prosthesis is to create a custom socket that fits around the
amputee's stump. This socket can be self-suspending, suction-fitted, or secured to the user with
harnesses. Comfort and load distribution can be improved by adding padding such as a prosthetic
sock or inflatable air pockets, or by reducing the density and stiffness at a sensitive region
(Cipriani, Controzzi, & Carrozza, 2009).
Several low-cost prosthetic arms use a thermo-softening plastic to create a custom socket: a
plastic sheet is heated and formed around the amputee's stump, and a prosthetic sock can be worn
to create a snug fit between the user and the device.
2.5.2 Osseointegration
Osseointegration is the process of permanently integrating a non-biological component with human
bone. In prosthetic devices, a titanium stud is screwed into a long bone in the arm or leg at the
amputation site. Over time the titanium and bone fuse together to create a firm anchor point to
which the prosthesis can be attached. Osseointegration is not an overly common practice; however,
it offers several benefits:
• A strong, sturdy anchor point, with no need for soft tissue to bear the weight of the prosthesis
• No skin/blood-flow issues induced by a socket
• No fitting problems due to a gain or loss in weight

2.6 User Control


2.6.1 Electromyography sensing
Myoelectric signals are electrical pulses within the body produced by contracting muscles. Surface
electrodes on the user’s skin can detect these small signals and in the case of prosthetics be used to
control the device (Jun Chang Yang, 2019).
The problem with surface EMG techniques is that there is a lot of cross-talk between muscle signals.
Because muscle groups, especially in the arm, are physically close together, it is difficult to
distinguish exactly which muscle is generating the measured signal via the surface electrodes. One
way of alleviating this problem is through target muscle reinnervation (TMR). TMR is a surgical
procedure that takes residual nerve endings from an amputation site and spreads them across an
alternative intact muscle group (Ahmadizadeh, Merhi, Pousett, Sangha, & Menon, 2017). Because
the nerves and corresponding muscle contractions are now spread over a larger area it is easier to
decipher individual signals.
Surface EMG electrodes require a clean and secure connection to the user’s skin which makes
measurements susceptible to sweating and electrode displacement (Alam & He, 2014).

2.6.2 Implanted myoelectric sensors


Implanted myoelectric sensors (IMES) are inserted directly into targeted muscles through a surgical
procedure. Using IMES for EMG control allows for relatively cross-talk-free signals. This means
muscle signals are more distinguishable and deeper muscles can also be used for control (Birbaume,
1999). As a result, this method allows a user to have more control over their prosthetic device. Users
of IMES systems have been able to independently control their prosthesis’ thumb, fingers, and wrist
rotation simultaneously (Chen X, 2010).

2.6.3 Brain-Controlled Interface


The most cutting-edge form of control is a direct brain interface being researched and developed by
DARPA (Donoghue, 2007). Two sensor arrays placed on the surface of the brain through a surgical
procedure are wired to two pedestals embedded in the skull. A patient using this technology has been
able to control a robotic arm in three-dimensional space, as well as open and close the hand, all
through the power of her thoughts alone.
2.7 Sensory Feedback
One major problem of many prosthetic devices is that they lack feedback from the user. The sense of
touch is a natural feedback mechanism that allows a person to make physical adjustments both
consciously and subconsciously. With no form of feedback, a user must rely entirely on vision to
determine the position and force of their prostheses. Survey results show almost all users of
myoelectric prostheses want some form of modern sensory feedback. Modern prostheses can
provide feedback by stimulating senses in some areas of the body. Vibration motors and temperature
pads placed on the surface of the skin provide rudimentary sensory substitution. The majority of
myoelectric prosthesis users found these forms of feedback to be useful (George, et al., 2019).

Another emerging form of feedback is the electro-tactile array. By electrically stimulating arrays
of electrodes, a false sense of touch can be created (E, 2014). For example, an electro-tactile
array on a smartphone screen can create artificial surface textures of wood and stone.

Figure 11 shows the results of a survey on myoelectric prosthetic arm users (Christian Pylatiuk,
2007)
Electro-tactile pads could be fitted around the stump of an amputee generating artificial senses of
touch. The stimulating signals could be controlled via pressure sensors connected to fingertips.
Essentially this would shift the sense of fingertip touches to the stump.

2.8 Controllers used in robotics and prostheses control


Generally, microcontrollers (MCUs) are used to read input data from sensors, process it, make
decisions, and execute them on different actuators. In robotics, augmented MCUs are mostly used:
those in which all peripherals and functions are placed on a single chip. PIC (Peripheral
Interface Controller) 18F-series controllers by Microchip Technology are commonly used, as they
are an upgraded version with enhanced input/output ports, more memory, a bigger stack, and a larger
instruction set. The 8051 MCU, programmed in assembly language, has wide application in pick-and-
place and firefighting robots, offering, among other features, an 8-bit data bus, a 16-bit address
bus, 128 bytes of RAM, and a 16-bit program counter and data pointer.
AVR MCUs by Atmel are mostly used in development boards, e.g., the ATmega328P used in Arduino Uno
boards. These controllers offer ease of programming as well as rapid prototyping, making them
well suited for educational purposes. An Arduino Mega 2560 was used by (C, et al., 2021) to control
two servo motors, a Bluetooth module, and an ultrasonic sensor on a mobile robot. The program
executed a motion sequence on the actuators according to the user's input received via Bluetooth
communication, with a 0.5-second delay between steps.
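The stepped motion-sequence behaviour described above can be sketched as follows. This is a generic illustration, not the cited authors' code: the command names, angle values, and servo callback are all hypothetical:

```python
import time

# Hypothetical motion sequences: (left, right) servo angle pairs per command
SEQUENCES = {
    "forward": [(90, 90), (100, 100), (110, 110)],
    "stop":    [(90, 90)],
}

def run_sequence(command, set_servos, step_delay_s=0.5):
    """Step through the servo angles for a command, pausing between steps."""
    for left, right in SEQUENCES[command]:
        set_servos(left, right)       # drive the actuators to the next pose
        time.sleep(step_delay_s)      # fixed delay between motion steps

# Collect the commanded angles instead of driving real hardware
log = []
run_sequence("forward", lambda l, r: log.append((l, r)), step_delay_s=0.0)
```

On the real robot, `set_servos` would write PWM values to the servo pins, and the command string would arrive over the Bluetooth link.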

Figure 12 The Arduino Uno board (electronics, 2022)

Although originally geared toward educational use, the Raspberry Pi has found applications in most
industries. The Raspberry Pi is a single, compact controller featuring an ARM CPU, a GPU, and
wired and wireless communication. It comes in different models, but the Model 4 is best suited for
industrial use because it can handle complicated jobs with longer computation times. The industrial
Compute Module IO board, the VideoCore VI GPU, 1 GB to 4 GB of RAM, and a 1.5 GHz clock speed are
a few aspects that contribute to its strength. The ability to interface various sensors, including
vision sensors that are incompatible with a typical robot controller, as well as the incorporation
of artificial-intelligence algorithms, make this controller advantageous. It is also more
affordable than competing controllers.
Figure 13: Raspberry PI Model 4 board (electronics, 2022)

2.9 Artificial intelligence in the medical field
2.9.1 Brain-computer interfaces in medicine
Brain activity produces electrical signals that are detectable on the scalp or cortical surface or within
the brain. Brain-computer interfaces (BCIs) translate these signals from mere reflections of brain
activity into outputs that communicate the user’s intent without the participation of peripheral nerves
and muscles (Sellers, 2006).
BCIs can enable communication and control for persons with severe neuromuscular illnesses such as
amyotrophic lateral sclerosis, brainstem stroke, cerebral palsy, and spinal cord damage because they
don't rely on neuromuscular control. The goal of BCI research and development is to make it possible
for these people, who may not even be able to breathe or move their eyes, to communicate their
requests to carers, utilize word processors and other types of software, or operate a robotic arm or
neuroprosthesis (Velliste, 2008).

BCIs can be designed for communication or control applications. For control applications, the BCI
output drives a device that the user can move, such as a cursor, robotic arm, or wheelchair. BCIs
may be invasive or non-invasive. Current non-invasive BCIs employ scalp-recorded
electroencephalographic (EEG) activity to infer the user's intention and can give severely disabled
people basic communication and control (Birbaume, 1999). Current invasive BCIs derive the user's
intent from neuronal action potentials or local field potentials recorded within the cerebral
cortex or from its surface (E, 2014).

Figure 14 The architecture of general BCI

2.9.2 Brain-Computer interface in prosthetics


A brain-machine interface connects the mind with the outside world so that information can flow
between them and they can communicate through an external device. The basic idea underlying
current prosthetic BMIs is to collect motor-control signals from neural networks, translate those
signals into motor control of a device, and then fine-tune the device's control using various
feedback sources and computer algorithms (Lebedev & Nicolelis, 2006). Information can be encoded
and sent back into the brain via transcutaneous electrical nerve techniques, and information flow
is governed by the capacity to record signals from neurons and decode them so that they can be
converted into device control (Bensmaia & Miller, 2014).

2.9.3 Neural Decoding and Interfacing


Understanding brain impulses and transmitting them to the prosthetic device requires a great deal
of decoding. Decoders apply multiple mathematical operations to transform the range of brain
signals captured by the electrodes into signal patterns that can be identified and interpreted as
corresponding to particular neural activity (Lebedev & Nicolelis, 2006).
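A minimal, illustrative example of such a decoding operation is the classic population-vector decoder, in which each neuron's firing rate is weighted by its preferred movement direction. This is a generic textbook sketch, not the decoder of any specific study cited here:

```python
import math

def population_vector(rates, preferred_dirs):
    """Decode a 2-D movement vector from neuron firing rates.

    rates          -- baseline-subtracted firing rate per neuron
    preferred_dirs -- each neuron's preferred direction, in radians

    Each neuron 'votes' for its preferred direction in proportion to
    how strongly it fires; the votes are summed into one vector.
    """
    vx = sum(r * math.cos(d) for r, d in zip(rates, preferred_dirs))
    vy = sum(r * math.sin(d) for r, d in zip(rates, preferred_dirs))
    return vx, vy

# Two hypothetical neurons tuned to 0 rad (east) and pi/2 rad (north);
# only the 'east' neuron fires, so the decoded vector points east.
vx, vy = population_vector([1.0, 0.0], [0.0, math.pi / 2])
```

Practical BCI decoders are more elaborate (Kalman filters, regression, neural networks), but they share this structure of mapping many noisy rate signals to a small number of control variables.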

Figure 15 Basic components of a BCI system for robotic arm control. The user's neural activity is
captured by a sensor that can be placed in or on the brain or distributed over the scalp. Recorded
signals are sent to a decoder, which translates this information into commands that are passed to a
robotic arm, which in turn performs the intended action. Visual feedback allows the user to
continuously intervene and correct the effector motion; recently, somatosensory feedback has been
incorporated as well.

2.9.4 Invasive BCIs


Invasive BCIs are implanted directly into the brain during neurosurgery. Using chips implanted in
the brain that carry hundreds of electrodes, as shown in Figure 16, scientists can read the firing
of hundreds of neurons. Invasive BCI devices produce the best signals. Rats and monkeys were used
in laboratories to test this technology in the real world (interface, 2022).

Figure 16 Implanted sensors. Source: (Donoghue, 2007)


2.9.5 Non-invasive BCIs
Non-invasive BCIs read brain activity without direct neural contact. The first and best-studied
non-invasive interface is electroencephalography (EEG), which uses electrodes pressed against the
scalp to collect brain signals. The EEG-based brain-machine interface shown in Figure 17 offers
precise temporal resolution and is affordable, lightweight, and simple to operate. Because this
method does not contact individual neurons to pick up brain impulses, EEG readings have lower
spatial resolution and more noise: the EEG data reflect the electrical activity of millions of
neurons located beneath the scalp (J.R. Wolpaw, 2002).

Figure 17 EEG recording cap (J.R. Wolpaw, 2002)

2.9.6 BCI control systems research and development of products closely related to
the proposed system
Numerous research teams have invested heavily in neurology, robotics, and computer science to
improve BMIs and their applications (Pfurtscheller, 2004). A review of EEG-based BCI signal-
processing techniques was published by Bashashati in 2006 (Bashashati, 2007). Those publications
discuss the current state of BCI research, open research problems, and prospective applications.

2.9.6.1 Duke University's monkey-robot arm in 2001


One of the best-known research projects in invasive brain-machine interfaces is the monkey-robot
arm experiment carried out by Nicolelis's team at Duke University in 2001 (J. K. Chapin, 1999).
To obtain neural impulses to power a BCI, they implanted several electrodes dispersed throughout a
large portion of the monkey brain. By simultaneously recording and decoding the activity of vast
groups of neurons, they replicated monkey motions in robotic arms. Nicolelis's research with
rhesus monkeys succeeded in closing the feedback loop and reproducing the monkeys' reaching and
gripping actions using a robot arm. The monkeys were trained to reach and grasp objects on a
computer screen by manipulating a joystick while the corresponding movements of a robot arm were
hidden. The monkeys were later shown the robot directly and learned to control it by viewing its
movements. The BCI used velocity predictions to control reaching movements and simultaneously
predicted hand gripping force.

Figure 18 Monkey-Robot arm experiment (J. Wessberg, 2000)

2.9.6.2 BCIs to move wheelchairs and mobile robots:


By asynchronously translating high-level brain intention commands into a finite state automaton,
some researchers examined how to control robots (Millán, 2004). Using an EEG-based BMI that could
distinguish between three different mental states, one group demonstrated that two human
participants could steer a robot between rooms using only their minds. Another group uses a BCI
system built on the P300 EEG paradigm (Pires, 2009). Before classification, the system applies
temporal feature selection and feature extraction, which increases classification accuracy for BCI
communications with very low signal-to-noise ratios and limited spatial resolution.
2.9.7 Evaluated consumer brain-computer interfaces / different EEG sensors
2.9.7.1 Mind Flex:
Mind Flex is another innovative gadget that incorporates mental focus into a game. It comprises a
gaming console and a headset. Turning the control knob on the console moves the fan nozzle that
lifts the ball into the air. The headset tracks brain waves with the help of three electrodes: two
clips attached to the ear lobes and one on the forehead. The game starts when you put on the
headset, enabling you to use your focus to guide the ball around the console and through a
challenging obstacle course, as in Figure 19. The ball is raised and lowered by alternately
concentrating and relaxing, while a dial on the console keeps the ball aloft on a cushion of air
(Hochberg, 2006).

Figure 19 Mind Flex game console, headset, and obstacle

2.9.7.2 NeuroSky MindSet:
The NeuroSky MindSet headset features four electrodes: a forehead-touching sensor, contact and
reference sites on the ear pad, and an onboard chip that handles all of the data processing. The
onboard chip amplifies the raw brainwave signal, analyses it, and filters out background noise and
muscle movement. The MindSet connects to a PC through built-in Bluetooth and features a
rechargeable lithium-ion battery. It ships with software that categorizes mental states as
Attention (akin to concentration) or Meditation (similar to relaxation). Generally speaking,
visual focus can be used to raise Attention, while practicing meditation can aid relaxation. The
MindSet Development Tools (MDT) provide the developer with an API for headset communication. The
API enables Win32 applications to connect to the onboard device and query a user's brainwave data,
allowing scientists and programmers to study the behaviour of raw brainwave (EEG) data
(Farahani, 2006).

2.9.8 Summary
As evidenced by the research cited above, robotics is developing quickly and has a wide range of
potential applications in the medical profession. Prosthetic-hand manufacturing has advanced
significantly: there are now bionic hand prostheses available that are reliable, secure, and
convenient, with enhanced capability. These prostheses must still overcome significant obstacles
if they are to emulate, or even improve on, the intrinsic hand. Millions of amputees have seen a
glimmer of hope thanks to advances in biomedical engineering, robotics, and artificial intelligence
(AI), and it is feasible that commercially available artificial limbs combining AI with accurate
robotic motor coordination and sensation will soon exist. AI applications in the orthotics and
prosthetics fields are still in their early stages and not yet widely used. Techniques for
prosthesis regulation can be identified as the first neural-pathway adaptations. In cases where a
significant amount of information is lost in the affected area, imperfect recovery and shaky
cognitive knowledge must be supplemented by other technical means of recovering the information
lost in the injured neuromuscular system. Electroencephalography (EEG), electrocorticography
(ECoG), mechanomyography (MMG), electroneurography (ENG), and force myography (FMG) are among the
neural signals from the central/peripheral nervous system (CNS/PNS) that can be used to recover
missing data and enable hand gestures and dexterous movement of the prosthesis. Of the signals
mentioned above, however, EMG is considered the only biological signal currently accepted for
medical use, and further technological advances have brought to light the brain-computer interface,
whose ability to turn "thought into action" may be a big deal for patients with severe motor
disabilities.

2.9.9 Research Gaps/How the study adds to the literature:


According to the reviewed literature, the use of BCI control in robotic systems is an area that has
received less attention and requires more. This research therefore aims to add to the literature on
the use of the brain-computer interface in the biomedical field. A prosthetic controlled by a
machine-learning algorithm is designed and developed for patients with severe disabilities and
amputations, widening the areas of applicability of biomechatronic designs in the field. The
research shows how to take advantage of current developments in robotics and powerful technology,
specifically the brain-computer interface, and exploit these technologies in creating artificial
limbs that restore the full functionality of a healthy hand. The literature also describes how BCIs
are integrated with nerves or the brain to control and move a prosthetic arm. Finally, the study
shows how a BCI can be designed and used to communicate sensor data from the arm back to the brain.

CHAPTER 3: METHODOLOGY
3.0 INTRODUCTION
The design of a BCI-controlled bionic arm is expected to exhibit synergy and integration against
constraints such as high performance, speed, precision, efficiency, low cost, and healthy-hand
characteristics in terms of voluntary control and sensation.
The mechatronic system design process addresses these challenges through an interdisciplinary
design procedure, with evaluation, integration, and optimization of the system and all its
sub-systems and components as a whole, and with all design disciplines working in parallel and
collaboratively throughout the design and development process to produce an overall optimal design
(Mahfouz, 2014).
This chapter focuses on the steps followed in arriving at a working solution for the desired
BCI-controlled bionic arm. To meet the objectives, engineering analyses of the various candidate
solutions were carried out to ensure that the most suitable, cost-effective, and fitting solution
was chosen.

Figure 20 V- model, VDI 2206 (Vasilije S. Vasić (2008), 2003)


The whole design task boils down to a comprehensible and logical exercise that allows recovery from
inevitable errors. A solution formulated by this approach is said to have a higher chance of being
both economically and technically sound (Lyshevski, 2002). This research specifically aligns with
the VDI 2206 methodology because it aims to make it easier to find the optimal solution for the
given scenario by searching for it in a structured, systematic, and comprehensive manner
(Gaetani, 2020).

3.1 Design methodology
The overall design of this prosthetic hand was treated as a feasibility investigation. After
performing the necessary background research and studying the products available on the market, it
was determined that reconstructing an entire prosthetic arm from scratch was not necessary; a
design by InMoov was therefore taken as a starting point.
To build on current ideas, areas of potential innovation were reviewed, ranging from mechanical
ideas, including novel actuation methods such as hydraulics, pneumatics, and linear motors, central
transmissions with variable clutches, and spring-storage techniques, to mechanical-linkage
concepts from cables and pulleys to gears and four-bar mechanisms for the control of the system.
Ultimately, the BCI interface of the hand will be provided by an off-the-shelf EEG sensor headset
made by Emotiv Systems, the EPOC headset, which is open to researchers in the biomedical field.

3.1.2 USER REQUIREMENTS
The target users consulted were health professionals and students in the medical field. This group
provided a concrete set of requirements that the system will need to satisfy.
These specify the common requirements outlined by the user so that the design is deemed "user-
friendly." The user should be able to:
1. Change the batteries of the prosthetic arm without additional tools
2. Perform simple maintenance on the arm
3. Put the arm on the stump with little to no help
4. Connect the headset easily

In addition:
 The arm should be robust
 The arm should be cheap
 The arm should be reliable in terms of motor movements and control
 The hand should not be able to destroy itself: all actuators will have software and
mechanical limits to prevent unwanted motion at joint limits.

Signal quality was another key metric for evaluating the effectiveness of an EEG headset. EEG
signals are noise-prone due to physiological and non-physiological artifacts; a good headset
design should minimize these artifacts, and metrics for signal-quality assessment were evaluated.

3.1.3 SYSTEM REQUIREMENTS


The system requirements are noted and analyzed in terms of a description of each requirement. The
bionic arm should operate like a healthy human hand, in line with the scope of the project: it will
have movements that mimic the operation of the unaffected hand under voluntary control from the
BCI. The bionic arm will operate autonomously, carrying its own power supply during operation.
For the arm to satisfy the above-mentioned objectives, it had to incorporate the system
requirements tabled below.

Table 1 SYSTEM REQUIREMENTS OF THE BIONIC ARM

Requirements            Description

Mechanical frame        ˗ The arm needs a mechanical base frame to support the analog
                          sensors and all electronics

Sensory feedback        ˗ The arm is required to send the sensor data back to the user
                          through LEDs and vibrating motors

Accuracy and            ˗ The arm system is required to have high accuracy but moderate
repeatability             repeatability

Geared servo motors     ˗ To provide the mechanical rotary motion to the fingers and the
                          elbow

Power supply            ˗ To provide power to the controller, sensors, switching devices,
                          and geared motor units

Controller              ˗ To regulate the movement of the fingers and sensors as per the
                          speed of the hand
                        ˗ To process the EEG data and transform it into microprocessor-
                          understandable language
                        ˗ To monitor and control the speed of the arm
                        ˗ To store the operating code

Sensors                 ˗ Each finger will be equipped with a force sensor to measure the
                          force applied to the grasped object
                        ˗ Each finger will be equipped with a tactile sensor to measure
                          the feel of the object being grasped
                        ˗ Each finger will be equipped with a temperature sensor to
                          measure the temperature of the object

Carbon-fibre skin       ˗ To protect the prosthetic as well as embed the sensors

LCD screen              ˗ To give the arm user feedback on the battery level and the
                          object they are holding

EEG headset             ˗ The headset will contain multiple electrodes. To produce a
                          large number of EEG headsets while staying under budget, low
                          cost and easy manufacturability were required. Additionally,
                          the design had to be comfortable so that participants were not
                          disturbed by the headset during data collection

3.1.3 CONTROL FLOW/LOGICAL SEQUENCE OF EVENTS

Brainwaves are recorded by the headset through the EEG electrode sensors; the recorded raw waves
pass through a bandstop filter to remove noise from the data; the cleaned data are passed to a
neural network; a Zigbee transceiver transmits the resulting commands to the Arduino; and the
Arduino signals are mapped to the bionic arm as predefined movements. Feedback from the prosthetic
arm is sent back to the user by the sensors embedded in the fingers. To give a clear idea of the
control, the logical sequence of events is described with the flow chart in Figure 21 to fully
describe the functionality.
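The bandstop stage described above can be illustrated with a minimal biquad notch filter. The sketch below is a generic illustration (a standard RBJ-cookbook notch centred on 50 Hz mains interference, at a typical 256 Hz EEG sampling rate), not the exact filter used in this project:

```python
import math

def notch_coefficients(f0_hz, fs_hz, q=30.0):
    """Biquad notch (bandstop) coefficients, RBJ audio-EQ cookbook form."""
    w0 = 2.0 * math.pi * f0_hz / fs_hz
    alpha = math.sin(w0) / (2.0 * q)
    b = [1.0, -2.0 * math.cos(w0), 1.0]                   # numerator
    a = [1.0 + alpha, -2.0 * math.cos(w0), 1.0 - alpha]   # denominator
    # Normalize so the leading denominator coefficient is 1
    return [bi / a[0] for bi in b], [1.0, a[1] / a[0], a[2] / a[0]]

def apply_biquad(samples, b, a):
    """Direct-form-I filtering of a sample sequence."""
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = b[0] * x + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x2, x1 = x1, x
        y2, y1 = y1, y
        out.append(y)
    return out

# Remove a 50 Hz interference tone from a signal sampled at 256 Hz
fs = 256.0
b, a = notch_coefficients(50.0, fs)
mains = [math.sin(2 * math.pi * 50.0 * n / fs) for n in range(1024)]
cleaned = apply_biquad(mains, b, a)
```

A narrow notch like this suppresses the interference frequency while leaving the EEG bands of interest (roughly 1–40 Hz) essentially untouched.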

Brain waves → EEG sensors → Laptop / bandstop filter → Zigbee transceiver / HM11 BLE module →
Arduino Nano → Bionic arm

Figure 21 Closed-loop communication between the patient and the arm
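Once a discrete command has been decoded from the EEG, the final mapping stage only needs a lookup from command to servo targets. The sketch below is a hypothetical illustration of that stage: the gesture names, angle values, and byte-frame format are assumptions for illustration, not this project's actual protocol:

```python
# Hypothetical mapping from a decoded mental command to servo targets.
# Angles are in degrees for the [thumb, index, middle, ring, pinky] servos.
GESTURES = {
    "open":  [0, 0, 0, 0, 0],        # hand fully open
    "grasp": [70, 90, 90, 90, 90],   # power grip
    "pinch": [60, 80, 0, 0, 0],      # thumb-index pinch
}

def build_frame(command):
    """Encode a gesture as a byte frame: 0xAA header, 5 angles, checksum."""
    angles = GESTURES[command]
    payload = bytes(angles)
    checksum = sum(payload) & 0xFF   # simple additive checksum
    return bytes([0xAA]) + payload + bytes([checksum])

frame = build_frame("grasp")   # 7 bytes ready to send over the wireless link
```

On the receiving side, the Arduino would validate the header and checksum before writing each angle to its servo, so a corrupted radio packet cannot trigger an unintended motion.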
3.2 CONCEPTUAL DESIGN
The conceptual design process took a cross-method approach comprising divergent and convergent
steps: the divergent step generates multiple candidate solutions, which are then evaluated and
screened in the convergent step to produce the final design concept.

3.2.1 DOMAIN-SPECIFIC DESIGNS


The design process was subdivided into three subdomains:
 Electrical engineering domain
 Electronic engineering domain
 Mechanical engineering domain

3.2.3 DEVELOPMENT OF THE SYSTEM


VDI 2206 methodology is now used to allow for simultaneous development of the system as a whole
encompassing its mechanical domain, the electrical and electronics domain as well as the information
technology domain. Information from the previous chapters with the original objectives was also
used to come up with a principal solution.

3.2.4 Ergonomics
Ergonomics concerns the interaction between humans and machines. The field of prosthetics is
interesting as it deals with ergonomics between prosthetics and amputees, such as physical
attachment to the body and sensory feedback. Ergonomics must also be considered for the interaction
between a person's prosthesis and other people. An ideal prosthesis is physically comfortable for
the amputee to wear, easy and natural to control, provides useful sensory feedback, and interacts
well with its environment. The dimensions of a large male hand have been used for the design
proportions. A universal goal in prosthetic design is to achieve shapes and sizes that match an
average female physique, since it is much easier to scale a design up in size than to shrink it
down to fit a smaller person. Scalability has been kept in mind throughout the design process:
components can easily be rescaled in computer modeling software and printed relatively fast,
allowing various-size prototypes to be developed with ease.

3.3 MECHANICAL DOMAIN


In the mechanical domain, the hand part of the robotic arm is 3D-printed based on the design of the
InMoov robot (Joseph T. Belter, Segil, Aaron M. Dollar, & Richard F. Weir, 2013).

3.3.1Hand end effector


Using SolidWorks, the author remodelled the arm's end effector to show clearly how the servos and
drive system would be integrated into it and how best they can fit with maximum modularity.

Figure 22 shows the palm of the hand
3.3.2 FOREARM
Although the forearm section contains no moving components, its design is still somewhat
challenging, as this section needs to house five servo motors and a lithium-polymer (LiPo) battery
while allowing for assembly. After the complete forearm section was designed in SolidWorks, the
design had to be split into separate components that could then be assembled with screws, making
it easy to fit the components inside.

Figure 23 Exploded view of the forearm


The figures below show the assembly of the full arm, used to check the modularity and compatibility
of the frame and components. The final design consists of 35 individual 3D-printed components.

Figure 24 full mechanical assembly of the arm
The figure below shows the exploded view of the full assembly done in SolidWorks for the arm to
clearly show the integration of the drive motors and gears into the forearm casing and to check the
compatibility of the parts.

Figure 25 exploded view of the full arm assembly


3.3.4 Objectives of the Bionic design within the mechanical domain
 Material Selection
 Fingers and wrist kinematics
 Design force calculations.
 Finite Element Analysis.

Figure 26: Bionic arm

3.4.0 MECHANICAL DESIGN CALCULATIONS

Early measurements indicate the final weight of the arm will be around one kilogram. To simplify
calculations, assume that a 1 kg point load acts on the arm 13.5 cm from the elbow pivot, which
is the average forearm length of a Zimbabwean male.

The torque required at the elbow to lift the arm is therefore 1 kg × 13.5 cm = 13.5 kg·cm. The
Tower Pro servo motors have a stall torque of only 10 kg·cm, so the torque must be increased by at
least 35% (a ratio of 13.5/10 = 1.35) to lift the arm; moreover, it is not ideal to run the servos at
their maximum torque rating.
Therefore, the torque is increased through the gear ratio: the elbow gear is made as large as will
still fit in the mechanical elbow frame, while the small gear is made as small as possible whilst
retaining teeth strong enough for the torques involved.
The gear arrangement makes the force at the elbow twice the force provided by the servo motor:
F2 = 2 × F1
where F1 is the force due to the servo motor and F2 is the force at the elbow joint
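The torque and gear-ratio figures above can be checked with a short calculation; the 1 kg load and 13.5 cm arm are the stated assumptions:

```python
# Values stated in the text: 1 kg point load at 13.5 cm from the elbow,
# Tower Pro servos rated at 10 kg-cm.
load_kg = 1.0
arm_cm = 13.5
servo_rating_kgcm = 10.0

required_torque_kgcm = load_kg * arm_cm                    # 13.5 kg-cm at the elbow
min_gear_ratio = required_torque_kgcm / servo_rating_kgcm  # 1.35, i.e. a 35 % torque increase

print(required_torque_kgcm, round(min_gear_ratio, 2))  # 13.5 1.35
```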

3.4.1 Forces in fingers

To calculate the theoretical finger strength, first consider the situation in which the index finger is
fully extended and a force is applied near the tip of the finger.

Figure 27 DEPICTION OF FORCES ACTING ON FINGERS


In this case, the tendon creates a moment about each joint in the finger. The moment about the
knuckle joint will be the greatest since it is the furthest away from the applied force. Therefore, it is
the turning force at the knuckle that limits the load we can lift at the tip of the finger.
At the point where the maximum liftable load is applied, the moments M1 and M2 balance out.
To begin the calculations, the tensile force in the tendon must be determined.
The stall torque (maximum turning force) of the MG996R servos is 10 kg·cm (≈1 N·m).

So a force of 0.7 N can be applied at each fingertip when fully extended, or a 70 g mass can be lifted.
This may seem quite low, but it is important to note that this is not necessarily the maximum force
the finger can apply.

As the finger curls the perpendicular distance between the knuckle joint and the applied load
decreases – which results in a lesser moment about the knuckle joint. This means the fingertips
apply more force as they close further. Suppose the hand is curled around an object, then the applied
force to the index finger would be acting in an orientation similar to the depiction below

Figure 28 force distribution when gripping an object


Since the perpendicular distance from the applied force to the knuckle joint is now smaller,
the fingertip can exert a greater force.

In this case, the maximum force in each finger rises to about 150 g, which gives the entire hand a
lifting or holding capacity of 150 g × 4 = 600 g.
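To see how the 70 g and ~150 g figures relate, here is a hedged sketch: the knuckle moment capacity and the two perpendicular distances below are hypothetical values chosen to reproduce the quoted numbers, not measurements taken from the design.

```python
# Hypothetical values: a knuckle moment capacity of 0.07 N*m, with
# perpendicular distances of 10 cm (extended) and 4.7 cm (curled).
M_KNUCKLE = 0.07   # N*m, assumed moment the tendon can hold at the knuckle
G = 9.81           # m/s^2

def fingertip_load_g(perp_distance_m):
    """Equivalent liftable mass (grams) at the fingertip: F = M / d."""
    force_n = M_KNUCKLE / perp_distance_m
    return force_n / G * 1000

print(round(fingertip_load_g(0.10)))       # extended: ~71 g
print(round(fingertip_load_g(0.047)))      # curled:  ~152 g
```

With the text's rounded figure of 150 g per finger, the four-finger capacity is the quoted 600 g.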
3.4.2 Finger actuation speed
The servo motors used by the author have an operating speed of 0.15 sec/60 degrees, so a full wrist
rotation from palm to palm (about 180°) takes roughly 0.45 s.

From the mechanical design, the tendon moves a distance of 2 cm from fully extended to fully flexed;
using the arc-length relation s = rθ, where r = 7 mm is the radius of the servo motor horns and
s = 2 cm, the maximum rotation required of the servo motors is about 160°.
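The ~160° figure follows from the arc-length relation s = rθ:

```python
import math

tendon_travel_cm = 2.0  # fully extended to fully flexed (from the design)
horn_radius_cm = 0.7    # 7 mm servo horn radius

theta_rad = tendon_travel_cm / horn_radius_cm  # s = r * theta
print(round(math.degrees(theta_rad)))  # 164, consistent with the ~160 degree limit
```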

3.4.3 ELECTRICAL DOMAIN
To achieve the objective of control and movement of the arm set out in Chapter 1, the author focused
on the electrical domain of the arm, and the following aspects were considered:
 Battery calculations
 Power calculations
 Circuit design

3.4.3.1 ELECTRICAL DESIGN CALCULATIONS


The power of each motor is calculated from the required torque and the known speed of the
motor:
P = τ × ω_motor
The servo motors actuate hand prono-supination and perform wrist flexion/extension; they are
powered and driven from the control board by 6V DC supply voltage and two 50Hz PWM signals. A
PID regulator, utilizing the angular position detected through a rotary encoder, holds the angular
position of the motor shaft constant, thus leading to zero position error even in presence of applied
forces (within the motors' power and torque limits). The graphs below, compiled in MATLAB,
show the relationship between torque, speed, and absorbed current.

Figure 29 displays the relationship between the applied torque and the rotation speed.
The servo motor, used for finger movement, requires a supply voltage of 6V; the LT1086 linear
voltage regulator receives 7.2 V from a lithium-ion battery and provides the required voltage. To

obtain a stabilized voltage and reduce noise, high-value electrolytic capacitors were added to
avoid voltage drops during maximum current absorption. Considering the worst case (peak-torque
condition), the current absorbed by each servomotor is 1.07 A; since the control logic actuates the
motors one at a time, the maximum current to be guaranteed is 1.2 A, and the LT1086 satisfies this
requirement.
Since the input and output currents of a linear regulator are equal (Iin = Iout), the LT1086 efficiency is estimated as
η = Pload / Ptot = (Iout × Vout) / (Iin × Vin) = Vout / Vin = 6 / 7.2 ≈ 0.833 = 83.3%.
Since the difference between input and output voltages is low, the power dissipation is
Pdiss = Iout × (Vin − Vout) = 1.2 A × 1.2 V = 1.44 W
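These two regulator figures can be checked with a short calculation (values from the text):

```python
# LT1086 linear regulator: 7.2 V in, 6 V out, 1.2 A maximum current.
v_in, v_out, i_max = 7.2, 6.0, 1.2

efficiency = v_out / v_in        # I_in ~= I_out for a linear regulator
p_diss = i_max * (v_in - v_out)  # the dropped voltage is dissipated as heat

print(round(efficiency * 100, 1), round(p_diss, 2))  # 83.3 1.44
```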

3.4.3.2 Battery
For wearable devices such as the realized prosthesis, battery autonomy is crucial for patient comfort
and quality of life. The battery used was an Ottobock lithium-ion type 757B20 (Cutti, 2005) with a
capacity of 900 mAh (6.5 Wh), a 3.5 h recharge time, a 7.2 V supply voltage, a 65 g weight, and an
operational temperature range of 0 to 50 °C.
The power consumption of each powered device is as follows: 17.2 W for the DC motors,
180 mW for the Arduino board, 33 mW for the HM-11 BLE module, and 1.5 mW and 12.5 mW respectively
for all temperature and pressure sensors. The Raspberry Pi board consumes 4 W, the touch-screen
display 330 mW, the voltage regulator for the motor power supply 1.44 W, and the switching
regulator 0.87 W. A weighted average of power consumption (P_avg) is calculated by summing all
consumptions multiplied by their usage-time percentages, giving about 1.4 W; since the battery
energy is 6.5 Wh, its autonomy works out to about 4.6 hours.
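The autonomy figure follows directly from the stated battery energy and the weighted-average power (the individual duty cycles behind the 1.4 W figure are not reproduced here):

```python
# Battery autonomy = stored energy / weighted-average power draw.
battery_wh = 6.5   # Ottobock 757B20 energy (900 mAh at 7.2 V)
p_avg_w = 1.4      # weighted-average consumption derived in the text

print(round(battery_wh / p_avg_w, 1))  # 4.6 hours
```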
The battery circuitry was simulated using Simulink in MATLAB software to determine the battery
behavior when inserted in the bionic arm.

Figure 30 shows the battery simulation run in MATLAB software
3.4.4 ELECTRONICS DOMAIN
The electronic domain consists of
1. Controller Selection
2. Servo motor movement
3. Sensor selection for pressure sensing and temperature sensing
4. Electronic communication between the user and the robot
A decision matrix approach was used to assess the practicality, feasibility, and utility of each of the
three possible solutions. A scale of 0 – 4 as per the VDI Guideline 2225 was adopted.

3.4.4.1 Controller selection


The onboard processor is capable of handling all movements of the hand in addition to all sensor
inputs and user feedback. The author chose the Arduino Pro Mini: given the severe space
constraints, the application does not demand much processing power, but several analog-to-digital
inputs and general digital I/Os are required. One digital channel is needed for each degree of freedom as
an output to provide the PWM signal for the speed controllers, and one ADC analog input is needed to
read the potentiometer that measures that degree of freedom's present position. For the sole finger that
requires true force measurement, the index finger, a second analog input is also needed to measure a
current or force sensor. A second analog input is likewise beneficial when utilizing the hand as a
prototype rather than a final product.

Figure 31 Arduino Pro mini (electronics, 2022)
3.4.4.2 Raspberry Pi
With its built-in wireless connectivity, the Raspberry Pi is positioned as a low-cost hub for
Internet of Things devices, or as a flexible, low-cost basis for new types of connected gadgets
(Rundle, 2016). The bump to a 2.5 A power supply means it can power more complex USB devices
without the need for a second power cable. The author therefore chose the Pi for its multitasking
capability and high availability.

Figure 32 Shows the raspberry pi zero controllers (electronics, 2022)

3.4.4.3 MOTOR DRIVERS

H – Bridge (Transistors TIP 31)


Mainly used in robotics for forward and reverse bidirectional control of motors. The TIP31 is a
standard NPN bipolar junction transistor that is often used for medium power applications. A bipolar
junction transistor (BJT) is a three-terminal device that allows for amplification or switching
applications. Some transistors can amplify a small current which will then be powerful enough to
operate a lamp or other high-current devices. These devices are also able to detect a change in voltage
and act as a switch.

Figure 33 H – Bridge (Transistors TIP 31) (electronics, 2022)

To select the motor driver for the system, the author considered ease of speed control, directional
control capability, current rating, and ease of replacement.

3.4.5 Sensor analysis


3.4.5.1 Pressure sensor
The FSR402 is a thin, ultra-lightweight resistive pressure sensor. It converts pressure applied to the
FSR film sensing area into a change in resistance value; the resistance falls as the applied pressure
increases, over a working range of roughly 100 g to 10 kg. The sensor can be utilized in
manipulators with or without held objects, in robots taking their first steps on the ground, in
biological research using mammals as test subjects, and in a wide range of other applications.

Figure 34 pressure sensor (Bionics, 2022)

3.4.5.2 Temperature sensor


LM35 is a temperature sensor that outputs an analog signal which is proportional to the instantaneous
temperature. The output voltage can easily be interpreted to obtain a temperature reading in Celsius.

The advantage of the LM35 over a thermistor is that it does not require any external calibration.

Figure 35 LM 35 temperature sensor (electronics, 2022)

3.4.5.3 CURRENT SENSOR
ACS712 is a current sensor that can operate on both AC and DC. This sensor operates at 5V and
produces an analog voltage output proportional to the measured current. The device consists of a
precision Hall-effect sensor with a copper conduction path. This type of sensor was chosen for its
efficiency and compact size.

Figure 36 ACS712 current sensor (interface, 2022)

3.4.5.4 Voltage regulators

AP1117 and MC5205 -XBM5 Power regulators have been used to control the voltage and power
supplied to the servos and the microcontroller respectively. These regulators prevent a situation in
which a servo could be stalling and drawing a large amount of current from the battery. If such a case
happened to all six servos simultaneously it could easily damage the electronics or even cause a fire
in a worst-case scenario. Owing to availability, 5 V surface-mount regulators have been used to supply
the servos, and a 3.3 V regulator is used to power the microcontroller. The specific regulators are
rated to output a maximum current of 1 A. A fair assumption is that each servo will need access to at least
500mA of current to operate. Therefore, each 1A power regulator supplies two servos. These specific
regulators hold the output stable at 5V. This results in slightly slower and weaker servo performance
as the ideal servo power is about 6.5V.

3.5.0 Electronics and Control

A Raspberry Pi board equipped with an ARM Cortex M0+ processor, housed inside the bionic hand,
controls the device's actuators. The board is intended to be housed in robotic hands and is capable of
managing six motors at once. It weighs 15 g, and its compact dimensions (57 × 45 × 9 mm) enable it
to fit within the bionic hand.

A ZigBee transceiver was used to wirelessly collect every bit of EEG headset data, which was then
sent serially to a PC at a predetermined sampling rate of 125 Hz. Each transmitted serial datum is
matched to an action by comparing the signals against the trained gesture model. The identified
gesture is indicated on a graphical user interface (GUI) panel that the user can interact with, as
seen in Figure 37. Hand movements, such as closing one or more fingers of an open hand or
opening one or more fingers of a closed hand, were mapped to the gestures detected by the
headset sensors. Controlling the movement of the actuators enables these actions. To move the
actuators in the hand, the control signals are transmitted via the PI and Arduino board.

Figure 37 Interface screen on the PC with implemented graphical user interface (GUI). courtesy of
AUM ROBOTICS (ROBOTICS, 2022)

3.5 THE DESIGN OF THE BCI SYSTEM

The major goal of this project is to create a basic BCI-controlled robotic arm with seven degrees
of freedom driven by EEG signals, capable of actions such as extending the elbow and making and
releasing a fist, moving in mimicry of a healthy human hand. Completing this project requires a
device that gathers EEG data as the input and transforms the signals into mechanical output. The
designed system consists of five steps. The first phase involves using the EPOC EEG headset and
the OpenBCI software to collect the user's brain signal for a particular thought or activity. The arm
was designed to have seven degrees of freedom, so four different EEG signals generated using seven
different brain-function activities are used in this work. Since the EEG signals will come from
different patients, the system undergoes model training using EEGLAB in MATLAB and deep
machine learning in Python so that the operations can be executed successfully for any patient.
Figure 38 below shows the various stages undertaken to control the bionic arm using BCI control.

EEG signal collection → Signal filtering & feature identification → Interactive program →
Arduino microcontroller → Bionic arm movement

Figure 38 Block diagram of the BCI-controlled robotic arm.
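The block-diagram stages above can be sketched as a hypothetical control loop; every function name and return value here is illustrative only, not the project's actual code:

```python
# Hypothetical skeleton of the five-stage pipeline in Figure 38.
def acquire_eeg():
    """EPOC headset -> one frame of 16-channel raw samples (placeholder)."""
    return [0.0] * 16

def filter_and_extract(frame):
    """Band-limit the frame and compute features (placeholder)."""
    return frame

def classify(features):
    """Trained model maps features to a gesture label (placeholder)."""
    return "close_hand"

def send_to_arduino(gesture):
    """Format a serial command for the microcontroller (placeholder)."""
    return f"CMD:{gesture}"

command = send_to_arduino(classify(filter_and_extract(acquire_eeg())))
print(command)  # CMD:close_hand
```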

3.5.1 SIGNAL DETECTION

The detection of the EEG signal from the user's scalp constitutes the first stage of arm control.
Millions of neurons, each joined by dendrites and axons, make up the human brain. Our neurons are
active whenever we think, feel, sense, move, or recall something. Small electric signals travel from
neuron to neuron as quickly as 250 mph to complete that task. Variations in electric potential
conveyed by ions on the membrane of each neuron cause the messages to be sent. If these signals can
be recognized and their meaning understood, they can be used to control various devices.

EEG stands for electroencephalography, a method that entails securing electrodes or sensors to the
scalp in order to read and record the complete electrical activity of the cerebral cortex. The ionic
current in the brain's neurons causes voltage changes, which the EEG detects. Since the human brain
has millions of neurons, each of which generates a small electric voltage field, the EEG is a
superposition of numerous elementary signals. In a healthy adult, the amplitude of an EEG signal
normally ranges from roughly 1 µV to 100 µV (Dany Bright, 2016).

Table 2 shows Frequencies Generated by Different activities in the brain (J.R. Wolpaw, 2002)

Brainwave type   Frequency range   Mental states and conditions
Delta            0.1 Hz to 3 Hz    Deep, dreamless sleep, unconscious
Theta            4 Hz to 7 Hz      Intuitive, recall, imaginary, dream
Low Beta         13 Hz to 15 Hz    Formerly SMR; relaxed yet focused, integrated
Midrange Beta    16 Hz to 20 Hz    Thinking, aware of self & the surroundings
High Beta        21 Hz to 30 Hz    Alertness, agitation
Gamma            31 Hz & above     Cognition, information processing

3.5.2 Emotiv EPOC headset

Figure 39 Diagram of the Emotiv Epoc Headset (Duvinage.m.Castmans, 2013)


The ARM Cortex M4 processor controls the operation of the headset, and the BLE NRF51822 chip
transmits data to the HM-11 BLE module mounted on the bottom side of the prosthesis control unit.
The headset's two lithium batteries are recharged at 5 V through a micro-USB connector. The headset
contains a ThinkGear chip, a piece of equipment that converts brainwaves, an analog electrical
signal, into a digital signal by quantizing it; band-stop filters in the ThinkGear chip suppress noise
(Duvinage.m.Castmans, 2013).

3.5.3 Signal Acquisition and feature extraction


In this research, the Epoc Headset was used to collect the EEG data of the selected four gestures from
twenty-three participants (twelve males and eleven females with ages ranging from 18 to 45 years).
First, the headset was connected wirelessly to the computer, and several numerical algorithms were
used to transform the collected data from the official OpenBCI software to a matrix data format. This
procedure simplified the data collection process and allowed visualization of data while recording.

Only data used to train and test the offline classifiers were collected using numerical tools, while the
online implementation of this project was performed using Python code

Figure 40 brainwave streaming via an 8-channel headset in real time, ready for acquisition
Three main stages are involved: data collection, data processing and rectification, and feature
extraction. As part of the data collection procedure, participants were instructed to hold a particular
thought of hand movement during data collection. The dataset was collected in several sessions, and
each time the headset was worn by all participants in the same way. In the first phase, data associated
with four hand gestures were recorded: spread fingers, closed hand, wave-in, and wave-out. The
participants were instructed to move their hand from the resting position, perform one of the proposed
gestures, and then move back to the resting position over around four seconds. The participants
repeated this procedure more than 10 times for every single gesture, and the same procedure was
applied for all four gestures. As a result, a dataset of 2000 files was collected, where each file
contains the signals of several gestures.

The raw data was processed with the fast Fourier transform to move it into the frequency domain and
remove unwanted noise. The raw data is sampled at a rate of 25 frames/sec, and all frequency content
above 60 Hz is discarded, keeping only the 0-60 Hz band used for feature extraction.
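A sketch of this band-limiting step, assuming a 250 Hz sampling rate and a synthetic test signal (not the recorded dataset):

```python
import numpy as np

fs = 250                         # Hz, assumed sampling rate for this sketch
t = np.arange(fs) / fs           # one second of synthetic data
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 90 * t)

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
spectrum[freqs > 60] = 0         # discard everything above 60 Hz
cleaned = np.fft.irfft(spectrum, n=len(signal))
# 'cleaned' retains the 10 Hz component; the 90 Hz noise is removed
```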

Figure 41 shows the EEG raw data passing through the filter

3.5.3.1 TRANSFORM ANALYSIS

In this work, the Fast Fourier Transform (FFT) is used as an efficient algorithm to compute the DFT
and its inverse (the IFFT computes the IDFT). The computational complexity is O(N^2) for the
standard DFT and O(N log N) for the FFT. The FFT is based on a divide-and-conquer approach: it
divides a transform of size N into transforms of sizes N1 and N2. The FFT algorithm is applied
according to the size of the sample (Rejer I., 2012).

Each file in the participants' dataset was targeted to be 10 seconds long which, at 25 iterations/sec,
gives 250 frames per file (although not every file is guaranteed to be exactly 250 frames long). Each
frame holds 16 channels, each with 60 values covering frequencies up to 60 Hz. The graphical
representation of the data was done in Python, as shown below.

Figure 42 shows the code snippet for data representation

Figure 43 shows the graphical representation of the EEG data sampled using the Fast Fourier
transform

3.5.3.2 Zero-crossing Feature
The second phase was concluded by capturing the envelope of the filtered and rectified EEG signal,
as the obtained shape gives a better reflection of the recorded data. The zero-crossing feature counts
the number of times the signal crosses a pre-defined threshold amplitude. It is formulated as:
ZC = Σ_{i=1}^{N−1} [ sgn(x_i · x_{i+1}) ∩ |x_i − x_{i+1}| ≥ threshold ]
sgn(x) = 1 if x ≥ threshold, 0 otherwise
where x_i denotes the i-th EEG sample, N is the number of samples in each segment, and threshold
is the amplitude level.

Figure 44: the zero-crossing FEATURE ALGORITHM for sampling the raw EEG that will be passed to
the CNN for training
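One common reading of the thresholded zero-crossing count can be sketched in a few lines: count the sign changes between consecutive samples whose amplitude step also exceeds the threshold.

```python
def zero_crossings(x, threshold=0.0):
    """Count sign changes whose amplitude step also exceeds the threshold."""
    count = 0
    for a, b in zip(x, x[1:]):
        if a * b < 0 and abs(a - b) >= threshold:
            count += 1
    return count

print(zero_crossings([1, -1, 2, 3, -4], threshold=1.0))  # 3
```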

Figure 45 shows extracted zero-crossing feature from the python graph representation

3.5.4 Classification
In this part, a classifier or recognition algorithm was trained using the retrieved features and the
matching known outputs as input data. The classifier is trained to learn and recognize patterns in the
data and to react to inputs using a pre-selected optimization technique. After successful training, the
reliability of the classifier is tested with a different dataset.

Classifiers can be trained and tested to provide accurate classification models, which aid in
confirming the results. To determine which classifier is most appropriate for creating the bionic hand,
three classifiers were examined: the artificial neural network (ANN), support vector
machine (SVM), and decision tree (DT) methods.

3.5.4.1 Artificial Neural Network


Artificial neural networks (ANN), also known as multi-layer perceptrons (MLP), are one of the main
pattern recognition techniques; they comprise a large number of neurons, and these neurons are
connected in a layered manner. The training procedure of a neural network can be easily achieved by
optimizing the unknown weights to minimize a pre-selected fitness function. In this work, one of the
most recognized ANN algorithms, the convolutional neural network (CNN), was used as a supervised
classifier for gesture recognition. The neural network classifier is trained with a set of data (called
training data); the trained classifier is then tested with a different dataset. Finally, the resulting ANN
classifier is used to recognize online input data (Zhen Zhang, 2019).

3.5.4.2 Support Vector Machine


A support vector machine (SVM) is a multi-class classifier that has been successfully applied in
many disciplines. The SVM algorithm gained its success from its excellent empirical performance in
applications with relatively large numbers of features. In this algorithm, the learning task involves
selecting the weights and bias values based on given labelled training data. This can be achieved by
finding the weights and biases that maximize a quantity known as the margin. Generally, the SVM
algorithm was first designed for two-class classification. However, in this research, a multi-class
SVM classifier is trained, tested, and used to classify gestures based on online data (Dai, Zhou, Chen,
& Yang, 6–9 November 2017)

3.5.4.3 Decision Tree


Recently, decision tree (DT) algorithms have become very attractive in machine learning
applications due to their low computational cost. Furthermore, DT approaches are transparent and
easy to understand, since the classification process could be visualized as following a tree-like path
until a classification answer is obtained. The decision tree algorithm can be summarized as follows:
The classification is broken down into a set of choices, where each choice is about a specific feature.
In this project, a decision tree algorithm is used to train and test a gesture dataset, and the results are
compared with the SVM and ANN to select the best model for creating the bionic hand. (Zhang, et
al., November 2011)

3.5.5 NEURAL NETWORK CREATION
After selecting three different types of classifiers, the offline procedure was used to train and test
these classifiers to select the model that will be used for the online recognition procedure. The
artificial neural network was chosen in this research for its fast computation in sequential data and
real-time implementation that helps in achieving the goal of performing hand movement as close to a
healthy hand as possible.

3.5.6 Network model training

After the data is sampled and all noise removed, it is arranged into NumPy arrays. The main
objective is to train a neural network to detect thoughts of hand movements: finger flexion and
extension, supination, and forearm extension and flexion. From there the network is used to control
the arm as a feature-prediction algorithm. The data used is 16-channel FFT data covering 0-60 Hz,
sampled at a rate of about 25 frames/second. A series of tests was conducted to increase the accuracy
of the model; the figure below shows the Python program used for training the model for the various
movements.

The model has 5 hidden layers, with the number of neurons in each layer set to 64, 128, 256, 512,
and 3, respectively. The hyperbolic tangent function (tanh) is used as the activation function in the
hidden layers, with softmax at the output. The training procedure is achieved using the Adam
optimizer.
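As a hedged sketch (not the author's training code), the described architecture can be expressed as a plain NumPy forward pass; the weights here are random, and training with the Adam optimizer is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)
sizes = [16 * 60, 64, 128, 256, 512, 3]  # 16 channels x 60 FFT bins in
layers = [(rng.standard_normal((a, b)) * 0.01, np.zeros(b))
          for a, b in zip(sizes, sizes[1:])]

def forward(x):
    """tanh in the hidden layers, softmax over the 3 output classes."""
    for i, (w, b) in enumerate(layers):
        x = x @ w + b
        if i < len(layers) - 1:
            x = np.tanh(x)
    e = np.exp(x - x.max())
    return e / e.sum()

probs = forward(rng.standard_normal(16 * 60))
print(probs.shape)  # (3,), a probability distribution over the classes
```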

Figure 46 shows the model training using different sampling rates and batches before testing
The network was used to train and test the same dataset for a different set of parameters. The best
model for each was selected based on its performance. Next, a statistical study was used to compare

the testing results to select the best model, the tests were run in four trials, and the testing accuracy
for each model was stored in a table.

3.5.7 ZIGBEE MODULE INTERFACING


To simulate the Raspberry Pi program and map the movements to the prosthetic arm, the Python IDE
and the headset are linked to the Proteus simulation. These are integrated using the PySerial module
and virtual serial ports created with the Virtual Serial Port Emulator. ZigBee is simple to use and
compatible with every microcontroller type.

3.5.7.1 Mapping Signal


The signal received from the ZigBee transceiver has to be mapped to the prosthetic arm in the
microcontroller (i.e., an Arduino Uno). The received signal acts as a command signal to control the
prosthetic arm: the signal from the headset is converted into a binary hex file in the language
understood by the Arduino, which then performs the specified motion. ZigBee is a high-level wireless
communication protocol specification based on IEEE 802.15.4. Small, portable, compact digital
radios are used to create personal area networks (PANs). Because of their low power usage,
line-of-sight transmission distances are limited to 10 to 100 meters, and they are susceptible to
climatic conditions and power output. ZigBee has a 250 kbit/s maximum transmission rate, which is
best suited to intermittent data input from a sensor or input device.

3.6 INTERFACING THE NEURAL NETWORK TO THE BIONIC ARM

To map out the movements predefined by the neural network, a server was created to act as an
operating system for the arm's brain. The server was built using Flask; the network model is saved
there and called by the microcontroller during operation.

Figure 47 shows the implementation of the neural network to the flask server

3.7 SENSOR FEEDBACK

To amplify the sense of ownership of the arm and improve control, the bionic arm is embedded with
sensors that communicate back to the user of the prosthetic. An LM35 temperature sensor is installed
in each fingertip, and the measured values are reported on the display, providing feedback on a
grasped object's temperature; an over-limit temperature warning, given by vibration, alerts the user to
conditions that would not otherwise be detectable. In addition, a pressure-resistive sensor (FSR402
from Interlink Electronics) equips each fingertip to acquire the pressure that the prosthetic fingers
exert on the grasped object; by adding these data to the current absorbed by the servos, it is possible
to grip any type of object without damaging it.

3.7.1 HOW IT WORKS


Sensors (temperature and pressure) in the prosthetic hand measure the pressure applied to various
objects as the hand closes around them and record their temperature. The measurements are
converted into a language understandable by the controller and sent through the ZigBee transceiver
to electrodes on the headset. When the neural code reaches the electrodes, the signal is transmitted at
a predefined frequency and the brain interprets the signals as feeling, as if from a normal hand.

Figure 48 shows the process control of the arm with sensory feedback (Zhen Zhang, 2019)

3.7.2 TEMPERATURE AND PRESSURE SENSOR INTERFACE
Sensors are connected to the Arduino controller, which gathers their data as inputs; the data is
converted into a neural code by a predefined calculation in the trained neural network, then sent
wirelessly back to the headset, where the brain interprets it with a corresponding response
movement.

Figure 49 shows the temperature and force sensor integration designed in Proteus

3.8 SYSTEM INTEGRATION DESIGN
The following schematic outlines the circuitry of the system integration, designed using Proteus
software, highlighting how the system components are connected to satisfy the required movements
of the arm.

Figure 50 shows the electronic circuit of the components and their connections in Proteus

3.8.1 Printed Circuit Board Design


An illustration of the printed circuit board created for this artificial arm may be found below. The
circuit traces displayed could be printed with a PCB printer onto a range of surfaces; a PCB
schematic is also invaluable for creating prototype circuits on perforated Veroboard. The servo
output pins and the power supply are shown in the diagram below, with the voltage regulators,
microcontroller, supply input, and debugging pin port for programming ordered from the left. It
would be simple to send this schematic file to a board manufacturer to have it professionally built.

Figure 51 shows the printed circuit board for the circuit design
Note: The lower right section of the PCB design (Vcc, GND, etc.) caters to a wireless radio chip that
could be used to wirelessly send and receive data (the ZigBee transceiver). This was going to be used
to receive information from a motion-sensing glove a user could wear to control the arm. This feature
has been excluded from the thesis as it does not directly relate to the field of prosthetics.

3.9 Developed firmware for data acquisition and prosthesis control

3.9.1 Programming
An Arduino microcontroller understands only its specific assembly language, so to program it, the
software must ultimately be expressed in assembly. Fortunately, compiler software converts standard
C code into assembly automatically, and coding in standard C is generally much easier and faster. The
Arduino IDE is the development environment provided by Smart Projects; the microprocessor is
programmed via a USB connection. The microprocessor monitors input signals from the EEG sensors
through the serial link, makes calculations to determine the required action associated with that input,
and generates the corresponding signals for the motors.

3.9.2 The functions for the motor control and data acquisition are as follows
− handlePoseData() function: the headset uses data from the electrodes to identify five predefined
poses (spread fingers, make a fist, wrist flexion, wrist extension, and fingers touch) that control
how the prosthesis moves. The setPoseEventCallBack() function on the Arduino microcontroller
allows it to identify the completed pose when the headset detects a predefined pose. A second
structure, connected to the fingers-touching pose, resets the servomotors' positions to return the
hand to its rest position.

− handleEEGData() function: it is called when data from the EEG electrodes are received; the data
are then sent to the Raspberry Pi board for further processing.
The functions that process the analog signals from the LM35 temperature and pressure sensors are
described below. Since the Arduino ADC has 10-bit resolution, values returned by the analogRead()
function range from 0 to 1023; to map these values to the desired interval (e.g., 0-500 °C for
temperature), the Arduino library's map() function was used. Since temperature varies slowly, a 1 Hz
sampling rate was adopted, ensuring both low processing load and power consumption.

− updateTemperature() function: processes the LM35 sensor data and warns the user of high temperatures through vibrotactile feedback.

− updatePressure() function: processes data from the FSR400 resistive pressure sensors, sampling the analog signals at 50 Hz. The voltage values obtained through the map() function are converted into resistance using the voltage-divider formula; the pressure/resistance characteristic of the FSR400 then gives the force exerted by the bionic hand. These pressure values are used to control the gripping force and to detect any slipping of the grasped object.

− updateServo() function: controls wrist flexion/extension and is called when the handlePoseData() function recognizes a contraction of the wrist flexor/extensor muscles, corresponding to the Wave Left or Wave Right poses respectively. The function receives a target angular value, depending on the received pose, and calls a Servo library method that provides the PWM signal driving the servomotor. It also drives the second servomotor that actuates wrist rotation, controlled by the yaw angle obtained from the handleIMUData() function. To avoid motor vibration caused by the high magnetometer sensitivity, it is executed after a set time interval using an averaged yaw angle.

− resetMotors() function: called whenever the Fingers Touch pose is detected by the handlePoseData() function; the three motors return to their default positions, bringing the hand to the rest position. With this firmware, numerous objects of different shapes, dimensions, and stiffness were gripped correctly, while the temperature and pressure data acquired by the fingertip sensors were shown on the PC terminal.

Note: the code for the above-mentioned functions is attached in the appendix.

3.9.3 Servo Signals

A pulse-width-modulated (PWM) signal is used to control the servo motors. Every 20 ms, a pulse between 1 ms and 2 ms long is sent from the microcontroller to the servo's internal control circuitry.

[Timing diagram: 20 ms period; 1-2 ms pulse; high level 5 V, ground 0 V]

A 1.5ms pulse rotates the servo shaft to its central position. Different pulse widths correspond to
different motor shaft positions.
To generate such a PWM signal, a software timer could be used to control the timing and duration of each pulse, or the microcontroller's built-in PWM generator could be used. The problem with both options is that there are not enough timers to control each servo: six servos need to be controlled and only four timers are available. For this bionic arm, two 16-bit software timers were therefore used to control all six servos, generating six PWM signals on individual output pins.

Every 3.3 ms, a new pulse is started on the next output line. After six cycles, 20 ms have passed, a new pulse begins on the first signal line, and the cycle repeats. A software timer controls the 3.3 ms period; if, for example, only five servos were used, a new signal would start every 4 ms.
The diagram below outlines how the two timers generate all six PWM signals (drawn in MATLAB).

[Diagram: the 1st software timer starts a new signal pulse every 3.3 ms; the 2nd timer controls the width of each pulse; Signal 1 ... Signal 6]

A maximum of eight individual servos can be controlled with this method; beyond that, the time between the start of successive pulses becomes shorter than the pulse width itself, and another timer or method would have to be used.

3.9.4 Program Flow


The control program flow was created to highlight how the different functions of the prosthetic arm are interfaced, from sensor data through motor movements to poses.

The following block diagram shows the full program flow from the headset to the prosthetic arm and back again in the form of feedback. Custom firmware was developed for each task, organized into groups of functions, each with a specific purpose.

Figure 52: The motor control functions used during program creation (flow starting from the EPOC headset)

After the EEG signals are received, they are converted from analog to digital by a real-time digital control system. The sensor data are gathered and used to control the servos and finger actuation, and the resulting state is sent back to the user as vibrotactile sensory feedback so that the user can adjust. The control flow of the data is shown in the flowchart below to highlight the integration of the system.

Figure 53: Closed-loop signal flow during control, including the data from the sensors

Figure 54: Program flow of the system
Figure 54 highlights the program flow in the pose function. The current sensor reads the current absorbed by the motors, and the reading is compared with the previously mapped current. If the measured current, and hence the force, is greater, the system breaks out of the loop; if it is equal or less, the current is mapped to a voltage value (as shown in the earlier electrical calculations).
Once the voltage value is mapped, the force being applied can be calculated in real time and fed back to the user immediately, before the arm is damaged.

3.9.5 DESIGN VALIDATION


In this section, a checklist is used to verify whether the system design satisfies the requirements set out in the requirements section.

Requirement | Description | Check

Minimum cost | Cost is an essential factor in any project and determines its failure or success. The design should keep the implementation cost to a minimum while maintaining a high-quality product. | ✓

Simplicity of use | Many users dislike interfaces that require several steps, so the design should provide a friendly interface that both users and administrators can access with minimal complexity. | ✓

Accessibility and reliability | The design should work dependably, and the user should control the device with ease. | ✓

Weight | The arm should be light, with a weight under 1 kg. | ✓

Material | The material used should be waterproof, impact-resistant, and lightweight. | ✓

Safety | Operator safety; environmental safety. | ✓

Ergonomics | Visual displays; input command provision. | ✓

Maintenance | Mean time between failures of at least a month. | ✓

Table 12: Design validation checklist


3.10 CONCLUSION
This chapter outlined the various techniques the researcher used to arrive at the best solution to the problem at hand. Data collection methods were outlined, along with the system requirements, flowcharts, design architecture, and relevant software used. Validation and testing were also among the procedures undertaken in producing the design, and these are outlined in the next chapter.

CHAPTER 4: RESULTS AND DISCUSSION
4.0 INTRODUCTION
Simulations, experiments, and mathematical calculations are the three engineering techniques used in
acquiring the results of a design. Results are important in assessing whether the design meets its
objectives or system requirements to measure how effective the design is in solving the existing problem.
This chapter presents the results in assessing the fulfillment of the system requirements.
4.1 HOW THE SYSTEM WAS TESTED
To test the system, a combination of an Arduino and a Raspberry Pi 3 was used. The controllers were interfaced in the Proteus software, with the serial link provided by a COM-to-COM virtual port (null-modem emulator). Several subsystems were also used for the simulation tests. The BCI system was tested by integrating the bionic prosthetic arm with the EEG headset: the headset acquires the brainwave patterns as impulses through electrodes attached non-invasively to the patient's scalp, the recorded impulses are decoded on the Raspberry Pi in the headset, and feature extraction follows, made possible by a neural network trained on the intended movements.
4.2 RESULTS
The raw data from the EEG headset were evaluated and elaborated upon in the studies and simulations conducted. The electronic board described in Chapter 3 for gathering EEG signals from the headset, together with GUI software integrating a machine-learning algorithm that detects flexion/extension/rest motions of the user's fingers, was used to evaluate brain-computer-interface control of the prosthetic limb.

4.3 NEURAL NETWORK TRAINING RESULTS

The model had 5 hidden layers, with 64, 128, 256, 512, and 3 neurons in each layer respectively. The hyperbolic tangent (tanh) function was used as the activation function, with a SoftMax output layer, and training was performed with the Adam optimizer. The parameter values for the models were selected after a cross-validation process for each training run: each model was trained and tested on the same dataset with a different set of parameters, and the best model was selected based on its performance. A statistical comparison of the testing results was then used to select the best. For each model, four trials were conducted and the best result was selected.

The first trial provided a mean training accuracy of 51.21% with a standard deviation of 1.92%. The second run provided a testing accuracy of 73.93% with a standard deviation of 1.75%. The third run produced a training accuracy of 89.46% with a standard deviation of 4.87%. Finally, the fourth run provided a training accuracy of 94.78% with a standard deviation of 1.11%. The ANN accuracy on the testing procedure was 83.91% with a standard deviation of 2.3%. The results are presented in the table below.

TRIAL  TRAINING         TESTING

1      73.46% ± 4.87%   70.51% ± 2.51%

2      84.78% ± 4.11%   83.91% ± 2.30%

3      53.3% ± 1.16%    54.87% ± 1.5%

4      94.21% ± 1.92%   89.93% ± 1.75%

Table 3: Training and testing results of the model


The four gestures performed were close, open, wave-in, and wave-out, and the results are represented by the confusion matrix below. As observed, the accuracy for both the training and testing procedures was higher than 82%. Furthermore, the results indicate that misclassification between gestures is relatively low and occurs mostly between the open and close gestures.

Figure 56 shows the confusion matrix of the neural network, validation, and the accuracy test matrix
tested using MATLAB software

4.4 Real-Time Implementation
Different testing protocols were proposed to the user for testing the arm design and the EEG-signal control with the neural network enabled. The testing scenarios showed that the user could control the bionic hand accurately after the training phase. The bionic hand movements were optimized to let the user simulate different activities, for example turning the motors to positions for holding objects, grasping, drinking, and writing. In single-action testing, the simulations performed one action at a time and the motor movements were noted in the Proteus software. The single actions included making a fist, spreading the fingers, closing one finger, and closing two fingers.

The simulation of each action was repeated several consecutive times. The single-action tests showed a minimum detection rate of 85%. For combinations of two actions, opening and closing the hand had a success rate of 95%, opening and closing one finger 90%, and opening and closing two fingers 85%. The final test results were compiled into a graphical representation.

Figure 57: Histogram of the human gestures (success rates in %), plotted in MATLAB

4.5 ELECTRICAL SIMULATION RESULTS

The main objective for the electrical domain is to ensure that the bionic arm remains powered for four hours between recharges. A Simulink model was therefore used to test whether the designed power supply could achieve this; the modelled system uses a 7.2 V, 5 Ah lithium-ion battery, and the charging and discharging results were recorded.

The MATLAB battery simulation results are shown below. From the state-of-charge graph it can be seen that the battery takes an hour and a half to charge fully, and that discharging from full capacity took 4 hours to reach the lowest voltage that can still power the system, in this case 40% of 7.2 V, i.e. about 2.9 V. This shows that the designed power supply met its objective.

Figure 58 The battery simulations results simulated in MATLAB software

4.6 BILL OF QUANTITIES


Because of their limitations, the limbs currently on the market may disappoint some amputees. Here, an individualized design method has been used to create a tailored design that can be targeted at the needs of a specific amputee, especially in terms of low cost and light weight. The current price range of devices on the market is between 4,000 and 20,000 USD (Jorge Zuniga, 2015).

(McGimpsy & T., 2018) completed thorough market research on the price of prosthetic limbs. Modest cosmetic arms and hands might cost from 3,000 to 5,000 USD, while a functional prosthetic arm may range from 20,000 to 30,000 USD. Hence the main target was to manufacture a cheap, affordable bionic arm for amputees costing around 300 USD.

To reduce the cost of the design, the components would be procured through Netro Sales, a growing Zimbabwean electronics shop; delivery is free when components are bought in bulk. The cost of each component making up the system is summarized in the table below.

Table 4: Bill of materials for the bionic arm

Description                                                        Cost

700 g spool of ABS                                                 $40

6 x standard servo motors                                          $60

2-cell LiPo battery                                                $20

EPOC headset kit                                                   $100

Electronics (micro, regulators, Veroboard, solder, wires, etc.)    $20

Miscellaneous (braided fishing line, screws, acetone, etc.)        $10

Total                                                              $250

4.7 System Specifications


The thumb, index, and middle fingers all move separately, while the ring and little fingers move together. 180 degrees of rotation are possible at the wrist and 110 degrees of bending at the elbow. Through the EEG electrodes that the headset places on the scalp, the user controls the arm by the action of their thoughts. The current electrode set enables basic control of the arm's range of motion. The device has a battery life of more than 4.6 hours and is entirely portable.

CHAPTER 5
5.1 CONCLUSIONS AND RECOMMENDATIONS
In this work, the electronic system of a brain-computer-interface-controlled bionic prosthetic limb was realized. The arm gives the user feedback from the environment, such as the temperature of items and the force the fingers apply to grasped objects. It was created in a modular way so that it can be adapted to various types of amputation, thereby satisfying a wider user group. Control of the prosthesis was made simpler and lighter by using individual servo motors to move each digit. The orthopaedic specialist who oversees user therapy remotely monitors the EEG and IMU data using the Wi-Fi connectivity offered by the Raspberry Pi. Real-world tests were conducted using a machine-learning algorithm to identify hand motions: nine people were examined (one affected by congenital upper-limb amelia), courtesy of the OpenBCI company. The classification results show excellent accuracy and fast responses for amputee subjects as well, demonstrating that the algorithm is effective even for subjects with weak, untrained muscles.

5.2 Overall System Performance


The final system provided relatively good performance and characteristics for a prototype model. The device was fast and responsive to the user's EEG input, with only minimal latency introduced by the server used for the neural-network operations. Throughout testing the system proved reliable and required minimal maintenance when assembled.

5.3 RECOMMENDATIONS
Continuous improvement (the Kaizen philosophy) is a very helpful technique in the engineering pursuit of excellence. What could be improved or added to a project to make it more beneficial to the user, or more profitable for the client, is a constant concern for the engineer; as a result, an engineering design never truly ends but evolves from one era to the next. In light of this, the author applied these same questions to the manufactured project and arrived at the following suggestions.

 The use of intraneural electrodes is perhaps the most promising technology that may hold the
key to the successful integration of bionic limbs.

 Improving the bidirectional flow of information between the bionic limb and the patient is a daunting engineering challenge, but if achieved, this haptic technology could benefit more than just prosthesis users.

 Research in cognitive neuroscience must be pursued in more depth if this system is to be improved to its ideal state.

 To further enhance the bionic arm, additional features are required, such as a multi-degree-of-
freedom wrist joint connector. A spherical manipulator or two servo motors with brackets can
be used to accomplish this. Additionally, air-ducted adjustable sockets may make it simple for
the user to mount and detach the bionic arm.

References
Belter, J. T. (2013). Mechanical design and performance specifications of anthropomorphic prosthetic hands. Journal of Rehabilitation Research and Development, 50(5), 599-618.

Adee, S. (2008, February 01). IEEE Spectrum. Retrieved from Dean Kamen's "Luke Arm" Prosthesis Readies for
Clinical Trials: http://spectrum.ieee.org/biomedical/bionics/dean-kamens-luke-arm-prosthesis-readies-for-
clinical-trials

Ahmadizadeh, C., Merhi, L.-K., Pousett, B., Sangha, S., & Menon, C. (2017). . Toward Intuitive Prosthetic Control:
Solving Common IssuesUsing Force Myography, Surface Electromyography, and Pattern Recognition in a
Pilot Case Study. IEEE Robotics .and Automation. Mag (pp. 102–111). USA: IEEE.

Alam, M., & He, J. (2014). . Lower-Limb Neuroprostheses: Restoring Walking after Spinal Cord Injury. . In
Emerging Theory and Practice in Neuroprosthetics (pp. 153–180). PA, USA, Ganesh, R.N., Yina, G., Eds.;
IGI Global: Hershey.

Allin S, E. E. (2010). Recent Trends in the Development and Evaluation of Assistive Robotic Manipulation
Devices. Physical Medicine and Rehabilitation Clinics, 59-77.

Bashashati, ]. A. (2007). A survey of signal processing algorithms in brain-computer interfaces based on electrical
brain signals. Neural Engineering., pp. R32-R57.

Bionics, T. (2022, September 3). Impacted Prosthetics Product. Retrieved from i-Limb® Quantum:
https://www.ossur.com/en-us/prosthetics/arms/i-limb-quantum

Birbaumer, N. (1999). A spelling device for the paralyzed. Nature, 398(6725), 297-298.

Bradford, G. M. (2015). Limb Prosthetics Services and Devices Critical Unmet Need: Market Analysis. NewYork:
Bioengineering Institute Center for Neuroprosthetics.

C. S. Lovchik, M. A. (5 May 1999). The Robonaut Hand: A Dexterous Robot Hand For Space. ” Proc.
International Conference on Robotics and Automation (pp. p. 907-912). Detroit, Michigan: Defiler.

Camacho-Conchucos, H. (2010, may 6). Patients amputated by work accidents: characteristics and years
accumulated of potential productive life lost. Retrieved from scielo.org.pe:
http://www.scielo.org.pe/pdf/afm/v71n4/a11v71n4.pdf

Chen X, Z. Y. (2010). Sonomyography (SMG) control for powered prosthetic hand: a study with normal subjects. Ultrasound in Medicine and Biology, 293-5.

Childress, D. (1985). Historical aspects of powered limb prostheses. Clin. Prosthet. Author, 9, 2–13.

Christian Pylatiuk, S. S. (2007). Results of an Internet survey of myoelectric prosthetic hand users. Prosthet Orthot.
Int, 362.

Cipriani, C., Controzzi, M., & Carrozza, M. (2009). Objectives, criteria, and methods for the design of the
SmartHand transradial prosthetics. Robotica ,, 28, 919–927.

Clay, d. (2011, FEBRUARY 2). Retrieved from Darpas Brain Controlled Robotic Arm Could be available just four
years: http://www.popsci.com/science/article/2011-02/darpas-brain-controlled-robotic-arm-could-be-
available-just-four-years

Cutti, A. G. (2005). Performance evaluation of the New Otto Bock “DynamicArm” utilizing biomechanical
modeling. USA: Myoelectric Symposium.

Dai, Y., Zhou, Z., Chen, X., & Yang, Y. A. (6–9 November 2017). a novel method for simultaneous gesture
segmentation and recognition. International Symposium on Intelligent Signal Processing and
Communication Systems (PACS), (pp. pp. 684–688.). Xiamen, China.

Dany Bright, a. N. (2016). EEG-based brain-controlled prosthetic arm. Conference on Advances in Signal
Processing (CASP) (p. 1.11). Pune, India: Cummins College of Engineering for Women.

Desmond, D., & MacLachlan, M. (2002). Psychological issues in prosthetic and orthotic practice: A 25-year review
of psychology in. Prosthet. Orthot. Int., 182–188.

Dhillon GS, L. S. (2009). Residual function in peripheral nerve stumps of amputeesandimplications for neural
control of artificial limbs. J Hand Surg JM, 616-618.

Donoghue, J. P. (2007). Assistive technology and robotic control using motor cortex ensemble-based neural
interface systems in humans withtetraplegia. J. Physiol 579.3 , , pp 603–611.

Duvinage.m.Castmans, T. (2013). Performance of the Emotiv Epoc headset for P300-based applications.
Biomedical engineering online, 1-15.

E, B. S. (2014). Restoring sensorimotor function through intracortical interfaces . progress and looming challenges,
volume 15 313.

electronics, S. (2022, september 25). SparkFun Originals. Retrieved from SparkFun Originals.com:
www.google.com/search?q=https%3A%2F%2Fwww.sparkfun.com
%2F&rlz=1C1SQJL_enZW1020ZW1020&oq=https%3A%2F%2Fwww.sparkfun.com
%2F&aqs=chrome..69i58j69i57.5633j0j4&sourceid=chrome&ie=UTF-8

Escribà Montagut, G. R. (2013). Mechanical design and performance specifications of anthropomorphic. JRRD,
50(2)pp 22.

Farahani, S. (2006). zigbee wireless networks and transivers . usa: elsevier.

Fatma, K., Sıtkı, Ö., & Fatihhan, K. (2022). Image processing-based realization of servo motor control on a
Cartesian Robot with Rexroth PLC. Turkish Journal of Engineering, 320-326.

Francesca Cordella, L. Z. (12 MAy 2016). Literature Review on Needs of Upper Limb Prosthesis Users. Frontiers
in Neuroscience, 5.

Gaetani1, R. d. (2020). A prosthetic limb managed by sensors-based electronic system: Experimental results on
amputees. Bulletin of Electrical Engineering and Informatics Vol. 9, No. 2,, 514~524.

George ElKoura, K. S. (2003). Handrix: Animating the Human Hand. Department of Computer Science, University
of toronto ,canada Side Effects software ,inc, 12.

George, J., Kluger, D., Davis, T., Wendelken, S., Okorokova, E., He, Q., . . . Thumser, Z. (2019). Biomimetic
sensory feedback through peripheral nerve stimulation improves dexterous use of a bionic hand. Sci.
Robot., 34.

Hand, T. (2014, may 5). Life changing myoelectric hand packed with the latest technology. Retrieved from the hand
N.p., n.d. Web: <http://bebionic.com/the_hand>.

Hochberg, L. (2006). Neuronal ensemble control of prosthetic devices by a human with tetraplegia. Nature, 164-171.

Brain-computer interface. (2022, September 5). Retrieved from Wikipedia: http://en.wikipedia.org/wiki/Brain%E2%80%93c

J. K. Chapin, K. A. (1999). “Real-time control of a robot arm using simultaneously recorded neurons in the motor
cortex. ” Nature Neuroscience, vol. 2, pp. 664–670, .

J. Wessberg, C. R. (2000). Real-time prediction of hand trajectory by ensembles ofcortical neurons in primates. ”
Nature, vol. 408, , pp. 361–365.

J., M. (2015, june 25). Timeline of 3D printing design milestones. Beginning Design for 3D Printing, pp. 397-401.

J.R. Wolpaw, N. B. (2002). “Brain-Computer Interfaces for Communication and . Clinical Neurophysiology,, :767-
791.

Jenkins, H. (2005). Design of Robotic End Effectors. CRC Press LLC.

Jorge Zuniga, D. K. (2015). Cyborg beast: a low-cost 3d-printed prosthetic hand for children with upper-limb
differences. BMC RESEARCH NOTES Article number: 10, 8.

Joseph T. Belter, M. B., Segil, 1. J., 2 Aaron M. Dollar, P. S., & 1 Richard F. Weir, P. (2013). Mechanical design
and performance specifications of anthropomorphic. JRRD Volume 50, Number 5, 599-618.

Jun Chang Yang, J. M. (2019). Electronic Skin: Recent Progress and Future Prospects or Skin-Attachable Devices
for Health Monitoring. South Korea: Zhenan Bao.

Lake., C. (2019). Chapter 14: Partial Hand Amputation: Prosthetic Management. American Academy of
Orthopaedic. Chapter 14: Partial Hand Amputation: Prosthetic Management. American Academy of
Orthopaedic surgeons.

López Sullaez, L. E. (2009, march 6). Impact Occupational of Traumatic Amputation inFingers of the Hand by
Work Accident. Retrieved from scielo.isciii.es/pdf/mesetra:
http://scielo.isciii.es/pdf/mesetra/v55n217/original4.pdf

Lyshevski, S. (2002). Mechatronic curriculum – retrospect and prospect. Mechatronics, Vol. 12, No. 2, 195-205.

Lebedev, M. A., & Nicolelis, M. A. L. (2006). Brain-machine interfaces: past, present, and future. Trends in Neurosciences, vol. 29, pp. 536-546.

Mahfouz, F. A. (October - 2014). Mechatronics Design And Implementation Education-Oriented Methodology; A


Proposed Approach. Journal of Multidisciplinary Engineering Science and Technology (JMEST), 1-12.

Masakatsu, G. F., & Bo, Z. (2020). State-of-the-art of intelligent minimally invasive surgical robots. Frontiers of
Medicine, 404-416.

McGimpsy, G., & T., B. (2018). Limb Prosthetics Services and Devices Critical Unmet Need: Market Analysis (p.
12). MA, USA: Bioengineering Institute Center for Neuroprosthetics, Worcester Polytechnic Institution.

Millán...., J. d. (2004). "Noninvasive brain-actuated control of a mobile robot by human EEG. Trans Biomedical
Engineering ; (p. 51(6); 1026). sta: IEEE.

NIKHIL, S. S., & DEIVANATHAN, R. (2018). MODELLING AND ANALYSIS OF DC MOTOR ACTUATOR
FOR AN ELECTRIC GRIPPER. Journal of Engineering Science and Technology, 862-874.

Pfurtscheller, G. (2004). Brain-Computer Interfaces: State of the Art and Future Prospects”. Proceedings of the 12th
European Signal Processing Conference, ’04, (pp. pp. 509-510). EUROSIPCO.

Pires, G. (2009). Brain-Computer Interface Methodology Based on a Visual P300 Paradigm. n IROS, pp. 4199-4205
.

R G E Clement, K. E. (2011, July 13). national library of medicine. Retrieved from Bionic prosthetic hands: A
review of present technology and future aspirations:
https://pubmed.ncbi.nlm.nih.gov/22041647/#affiliation-1

Rejer I., P. M. (2012). Analiza statystyczna zbioru sygnałów EEG na potrzeby budowy interfejsu mózg-komputer.
zagadnienia informatyki medycznej, Stowarzyszenie Przyjaciół, 44-51.

ROBOTICS, A. (2022, OCTOBER 24). GUI INTERFACE. Retrieved from AUM ROBOTICS SOFTWARE.

Research, D. (2014, May 14). the Deka arm reaching out to the future. Retrieved from Davison Inventions:
davison.com/blog/the Deka-arm-reaching-t-the-future

Rundle. (2016, Feb 29). wired. Retrieved from wired UK: http://www.wired.co.uk

S. J. Bensmaia and L. E. Miller. (2014). Restoring sensorimotor function through intracortical interfaces: progress
and looming challenges. Nature Reviews Neuroscience, vol. 15, pp. 313–325.

Schofield, J. S., & Katherine R Evans, J. P. (2014). Applications of sensory feedback in motorized upper extremity
prosthesis. Expert Reviews. Med. Devices, 1--13.

Sellers, .. E. (2006). “A P300 Event-Related Potential BrainComputer Interface (BCI The Effects of Matrix Size
and enter Stimulus Interval on Performance. Biological Psychology, vol. 73, 242-252.

Shih-Hsiang, Y., Pei-Chong, T., Yuan-Chiu, L., & Chyi-Yeu, L. (2019). A Sensorless and Low-Gain Brushless DC
Motor Controller Using a Simplified Dynamic Force Compensator for Robot Arm Application. Sensors, 1-

20.

Thurston, A. (2007). Paré and prosthetics: The early history of artificial limbs. USA: ANZ J. Surg.

University, Y. (2015).

Vasilije S. Vasić (2008), M. P. (2003). Standard Industrial Guideline for Mechatronic Product Design. FME
Transactions, 104, vol. 36, No 3.

Velliste, .. M. (2008). “Cortical Control of a Prosthetic Arm for Self-feeding. ” Nature, 1098-1101.

Wendy Yen Xian Peh, M. N.-C. ( 25 September 2018). Closed-loop stimulation of the pelvic nerve for optimal
micturition. Journal of Neural Engineering, 15.

Zhang, X., Chen, X., Li, Y., Lantz, V., Wang, K., & Yan, J. ( November 2011). A Framework for Hand Gesture
Recognition Based on Accelerometer and EMG Sensors. IEEE Transactions on Systems, Man, and
Cybernetics - Part A: Systems and Humans ( Volume: 41, Issue: 6, 1064 - 1076.

Zhen Zhang, O. Y. (2019). Real-Time Surface EMG Pattern Recognition for Hand Gestures Based on an Artificial
Neural Network. Sensors 2019, 14, 3170, 14, 3170.

APPENDIX

A.2 Datasheets
 Voltage Regulators – AP1117 http://diodes.com/datasheets/AP1117.pdf
 Microcontroller – PIC18525k22 See Microchip Website (Arduino)
 Servos – TowerPro MG996R
https://www.hobbyking.com/hobbyking/store/__6221__Towerpro_MG996R_10kg_Servo_55g_10kg_20sec.html

A.3 Digital Appendix


The digital appendix can be found on the link provided.
 Solidworks Files
 Printable STL Files
 Software CODE
 Proteus design

https://drive.google.com/drive/folders/
1PwbEPRX6h9bASqaTFZ7iM_xuojM-ErNP?usp=sharing
