
CHAPTER I

Introduction

Can sign language cause communication gaps between deaf people and the general public? According to 2021 statistics posted by the World Health Organization, 466 million people across the world have disabling hearing loss (over 5% of the world's population), of whom 34 million are children. India (South East Asia) has long been an oral country; Indian Sign Language (ISL) has only recently gained some recognition. The majority of deaf schools use, or at least claim to use, an oral approach, almost banning sign language. Many deaf people also suspect that a key factor working against them is an intense distaste for disability, and that their demands and rights are neglected. Deaf and mute students of one school staged a protest and accused the Rajasthan government of ignoring their needs. All these cases reveal that they are oppressed because they cannot communicate effectively. A device that can convert sign language into speech or text can make their lives easier. Some prototypes have been built for this purpose, but they are not available in India and are very expensive, so a device that is available to everyone could decrease this communication gap. Amol Potgantwar and Pragati Bachchhav (2019) concluded in their study that there are many myths regarding sign language which are not true, and presented facts that reveal the truth about it. It is commonly seen that the impaired are treated differently in our society, and people think that the disabled cannot live like a normal human being. But the world is fair to all, and many technologies are emerging to help these people lead a normal life. Citing 2018 statistics on the deaf and mute in the Philippines, Legarda mentioned the study of Dr. Charlotte Chiong, Director of the Philippine National Ear Institute, which reported that "at least eight profoundly deaf babies are born every day in the Philippines, or one deaf baby born every three hours." The Census of Population and Housing of the National Statistics Office of the Philippines shows a total of 942,098 PWDs, of whom 13.91% are hard of hearing, partially deaf, or totally deaf. Of this number, 241,624 are deaf and 275,912 are partially deaf, while more than half a million have limited access to information as they are hard of hearing. Most of the deaf in the Philippines are found in Manila.

Republic Act No. 11106 is known as "An Act Declaring the Filipino Sign Language as the National Sign Language of the Filipino Deaf and the Official Sign Language of Government in All Transactions Involving the Deaf, and Mandating Its Use in Schools, Broadcast Media, and Workplaces." It provides:

“The State shall, in compliance with the United Nations Convention on the Rights of Persons with Disabilities, promote, protect, and ensure the full and equal enjoyment of all human rights and fundamental freedoms of persons with disabilities. Thus, national and local State agencies shall uphold respect for their inherent dignity, individual autonomy, and independence by guaranteeing accessibility and eliminating all forms of discrimination in all public interactions and transactions, thereby ensuring their full and effective participation and inclusion in society. The State shall also take all appropriate measures to ensure that the Filipino deaf can exercise the right to expression and opinion. Accordingly, the State recognizes and promotes the use of sign languages embodying the specific cultural and linguistic identity of the Filipino deaf.”

This provision states that all forms of discrimination in public interactions with Persons with Disabilities (PWDs), such as deaf people, must be eliminated, and that their linguistic identity, which is sign language, must be recognized.

Project Context

Filipinos are highly sociable. Most love having a good time sharing stories and chatting with family and friends. The experience of being deaf in the Philippines is certainly a unique one. The question is: how will deaf people express themselves if hearing people cannot understand them? Moreover, deaf students across the country are constantly underestimated and must work against stereotypes that cause them to lose confidence. Lack of confidence can be a root cause of the widening communication gap between deaf and hearing people. In light of this challenge, the developers created a project that helps a deaf person lessen the communication gap and be more confident when talking to other people, so that hearing people can understand them and they will not feel neglected or disregarded.

Purpose of the project

The main purpose of this project is to interpret hand signs into text. Interpreting Hand Sign into Text uses the following: a camera that detects the hand gestures of a student who knows sign language, and a monitor and speakers that present the interpreted sign language, helping hearing students understand the hand gestures through text or voice output. Additionally, the project comes with an Android application, used by the teacher, which features speech recognition; the recognized speech is transcribed in both the application and the monitor. The teacher can also view the monitor's screen through the application. Overall, the project is connected to and controlled by a Raspberry Pi 3, an integrated circuit device used for controlling other portions of an electronic system, which runs the programming language the developers used for the project, Python. In addition, the project developers follow American Sign Language (ASL) as the guide for interpreting the hand gestures.

Objectives

In general, the project developers aim to develop Interpreting Hand Sign into
Text.

Specifically, the project is designed to perform the following:


1. Provide a system that supports two-way communication between deaf and hearing people;
2. Read and identify the hand gestures of the user using the camera;
3. Display the recognized and interpreted hand gestures on the monitor and Android application; and
4. Transcribe the hand gesture into audio using the speaker.
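The two-way flow in the objectives above can be sketched as simple message routing: gestures from the deaf student go to the monitor, the application, and the speaker, while the teacher's transcribed speech goes back to the shared displays. This is a minimal illustrative sketch; all names here are assumptions, not the project's actual code.

```python
# Hypothetical routing sketch for the two-way communication objective.
# Output targets are represented as dictionary keys for illustration.

def from_student(gesture_text):
    """Route a recognized gesture to the displays and the speaker."""
    return {
        "monitor": gesture_text,
        "android_app": gesture_text,
        "speaker": gesture_text,  # spoken aloud for hearing listeners
    }

def from_teacher(transcribed_speech):
    """Route the teacher's transcribed speech to the shared displays."""
    return {
        "monitor": transcribed_speech,
        "android_app": transcribed_speech,
    }

print(from_student("HELLO")["speaker"])         # -> HELLO
print(from_teacher("Good morning")["monitor"])  # -> Good morning
```

The asymmetry matters: only the student-to-teacher direction needs the speaker, since the teacher's side is already spoken aloud.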

Scope and limitation


Scope

The study is conducted primarily to help the selected deaf and mute community. The focus of the project is to lessen the communication gap, so that a deaf person becomes more confident and does not feel neglected, and a hearing person can understand their thoughts. The project can interpret and accept the alphabet and common hand gestures of sign language. The Android application shows the hand gestures converted into text, accepts voice input and transcribes it into text displayed in both the application and the monitor, and lets the user see what is on the monitor's screen. Since the project reads and speaks a single letter or common gesture at a time, a student who wants the project to say a complete word should gesture a finger heart sign (the "oppa" heart sign).
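The single-letter behavior above amounts to a letter buffer with a special "speak" trigger: letters accumulate one at a time, and the finger heart gesture flushes them as a word. The sketch below is a hypothetical illustration of that logic; the gesture label and class names are assumptions, not the project's code.

```python
# Hypothetical letter-buffering logic for the finger-heart "speak" trigger.

SPEAK_GESTURE = "finger_heart"  # assumed label for the oppa heart sign

class WordBuffer:
    def __init__(self):
        self.letters = []

    def handle_gesture(self, label):
        """Buffer a recognized letter; on the speak gesture, return
        the completed word and reset the buffer."""
        if label == SPEAK_GESTURE:
            word = "".join(self.letters)
            self.letters.clear()
            return word  # caller sends this to the speaker/monitor
        self.letters.append(label)
        return None

buf = WordBuffer()
for gesture in ["h", "i", "finger_heart"]:
    word = buf.handle_gesture(gesture)
    if word:
        print(word)  # -> hi
```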

Limitations

However, the project has the following limitations: it cannot read both hands at the same time, so it recognizes only one hand per gesture. It does not allow users to train or add new hand gestures; only the programmer can do so. The Android application will not work without an internet connection. The project cannot read a hand gesture performed in a dark area, cannot translate a full sentence structure, and cannot read and translate every hand gesture.

Importance of the Study

This study aims to help the selected deaf students and benefit the following:

For Information Technology students, who may take this as an opportunity for their own research, since it could provide ideas that help them create a more systematic system using existing devices, languages, and sensors, with the hope of upgrading the current system;
For Information Technology instructors, who may develop a study ground for future developers using Android applications, helping them broaden their ideas and skill sets as they prepare for their own careers;
For the Institute of Computer Studies, which may create an environment in which both students and instructors can explore a more suitable training ground for their talents and skills; a better-suited community will greatly affect students' learning outcomes;
For the SPED students at San Jose Elementary School, who may adopt the system into their daily lives and studies, lessening the communication gap between them and hearing people; and
For future project developers, who may look at opportunities to improve the system's accessibility for its users and to extend it to other platforms. It is highly suggested that an upgraded system use devices that can detect more distinguishable human gestures.

Technical background

This section discusses the technology that the developers used and explains the technical choices made for the project. Interpreting Hand Sign into Text uses a microcontroller, the Raspberry Pi 3B+, an integrated circuit device used for controlling other portions of an electronic system. The project is built in the Python programming language, which supports the data inputting, building, control, testing, and management of the project. Moreover, the project is equipped with a camera for capturing hand gestures and movements, and speakers for voice output of the converted text. In addition, the project comes with an Android-based application, which accepts voice input and translates it into text displayed on the monitor and in the application.
The project will be placed in a classroom. Once it is running, it starts detecting hand gestures through the camera; each gesture is translated into text, and the translated text is shown on the monitor. A message area is included on the screen, connected to the Android application: all messages displayed there come from the application, which the teacher can access. Another feature is voice output: the project is not only capable of translating hand gestures to text, but also lets students hear what each hand gesture is supposed to say.
Since the project is developed together with an Android application, here are its features and how it works: first, the application must be connected to the internet; once connected, the teacher can access the application, which accepts voice input that is transcribed to text and shown on the monitor. Apart from that, with the application the teacher does not need to look at the monitor repeatedly just to check what the student wants to say, because the interpreted sign language can also be seen in the application.
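The classroom flow described above reduces to a capture-classify-output loop: read a frame from the camera, classify the gesture, then show and speak the result. The sketch below stubs out the camera, speaker, and trained model; all names are illustrative assumptions, not the project's actual code.

```python
# Hypothetical capture-classify-output loop for the classroom setup.

def classify_gesture(frame):
    # Stand-in for the trained hand-gesture model running on the Pi.
    fake_model = {"frame_a": "A", "frame_b": "B"}
    return fake_model.get(frame)  # None when no gesture is recognized

def show_on_monitor(text):
    print(f"monitor: {text}")  # stand-in for drawing on the screen

def speak(text):
    print(f"speaker: {text}")  # stand-in for text-to-speech output

# Stand-in for the live camera stream.
for frame in ["frame_a", "frame_x", "frame_b"]:
    letter = classify_gesture(frame)
    if letter:  # skip frames where no sign is recognized
        show_on_monitor(letter)
        speak(letter)
```

In the real device, the frame source would be the camera feed and the classifier would be the trained gesture model, but the control flow stays the same.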

CHAPTER II

REVIEW OF RELATED LITERATURE, STUDIES, AND SYSTEMS

In this chapter, various local and foreign literature, studies, and systems were reviewed to gain an understanding of existing research and other academic works relevant to the area of study, to present a broader knowledge to the academic community, and to help the researchers deepen their knowledge in the field.

The results of their study show that the SVM and Decision Tree models do recognize Filipino Sign Language movements, but their performance is greatly affected by an imbalanced data set (Sia et al., 2019). The relevance, connection, and relationship of this citation to the project is that the developers considered its objective of recognizing movement in Filipino Sign Language (FSL) using the Kinect V2 and machine learning classification models in RapidMiner.

The study found that Filipino Sign Language is the language of most deaf Filipinos, yet they are still using ASL and Signing Exact English (SEE) (Destreza and Froilan, 2012). The relevance, connection, and relationship of this citation lies in the shared objective of minimizing the gap between sign language users and hearing language users, from which the concept of a sign-language-to-voice translator emerged.

Their study covers Filipino Sign Language and uses the Kinect V2 (Oliva et al., 2018). The relevance, connection, and relationship of this citation to the project is its objective to solve the communication gap between deaf people and people who can hear.

The study is about Chinese sign language recognition based on an SHS descriptor and an encoder-decoder LSTM model (Xiao Xu Li et al., 2017). The relevance, connection, and relationship of this citation to the project is its objective to recognize isolated Chinese sign language. To better distinguish different hand shapes, a new Specific Hand Shape (SHS) descriptor is proposed; based on this descriptor, an encoder-decoder LSTM model is applied to achieve better sign recognition results.

The study is about sign language recognition with multi-modal features (Junfu Pu, 2016). The relevance, connection, and relationship of this citation is its aim to recognize sign language automatically using RGB videos and skeleton coordinates captured by Kinect, which is of great significance for communication between the deaf and hearing societies.

The study is about nearest neighbor classification of Indian sign language gestures using a Kinect camera (Zafar et al., 2016). The relevance, connection, and relationship of this citation to the project is its goal to help people with speech disabilities who communicate in sign language and therefore have trouble mingling with the able-bodied; there is a need for an interpretation system that could act as a bridge between them and those who do not know their sign language.

The study concluded that a dataset used to train and evaluate the system must have sufficient gesture variations to generalize each symbol (Vazquez and Lopez, 2017). The relevance, connection, and relationship of this citation to the project is its intention to introduce a system that takes advantage of Convolutional Neural Networks to recognize hand letter and number gestures from American Sign Language, based on depth images captured by the Kinect camera.

The study is about Sign-language Recognition through Gesture & Movement Analysis (SIGMA) (Ian Lim et al., 2015). The relevance, connection, and relationship of this citation to the system is its idea of combining a prototype data glove with computer vision in order to translate Filipino Sign Language for medical purposes.

The study presents Hand Talk, an interactive web-based sign language system (Mejia, 2002). The relevance, connection, and relationship of this citation to the project is its objective to introduce a system that serves not only as a tool for communication among the deaf and mute, but also as a means of instruction for users with little or no knowledge of sign language.

The study is about a sign-language-to-text converter using Leap Motion (Khan et al., 2016). The relevance, connection, and relationship of this citation to the developed system is its objective to introduce a prototype that can convert sign language into text. A Leap Motion controller was utilized as an interface for hand motion tracking without the need to wear any external instruments.

The system is developed for dynamic and uninterpreted communication between a normal user and an impaired one (Amol Potgantwar and Pragati Bachchhav, 2019). The relevance, connection, and relationship of this citation to the developed system is its objective to bridge the gap between the two. The proposed system acts as a mediator between impaired and normal people, and uses a Kinect motion sensor to capture the signs.
 
The study is about a sign language translator and gesture recognition (Elmahgiubi et al., 2015). The relevance, connection, and relationship of this citation to the project is its aim to develop a Data Acquisition and Control (DAC) system, called Sign Language Translator and Gesture Recognition, that translates sign language into text that can be read by anyone.
 
The system uses F-Xinulator: Filipino Sign Language Translator (Arayata et al., 2014). The relevance, connection, and relationship of this citation to the system that the project developers developed is its objective to teach and show basic sign language so that users (deaf and hearing people) can understand one another far more quickly than users of unrelated languages can. The idea was to make a mobile app that helps people learn basic sign language so they can communicate well with deaf people.
 

The study is about sign language number recognition (Sandjaja, 2008). The relevance, connection, and relationship of this citation to the system is its objective to learn Filipino Sign Language numbers in the training phase and recognize them in the testing phase by transcribing them into text.

The study is about the stages of faith development of deaf students at De La Salle University - College of St. Benilde (Manalapan, 2004). The relevance, connection, and relationship of this citation to the project is that its interviews were transcribed into written English by a sign language teacher and then checked and verified by two other sign language interpreters.

The system converts hand gestures to text and further to speech (Jhunjhunwala et al., 2017). The relevance, connection, and relationship of this citation to the system is its objective to recognize and convert sign language into text and then to speech. The sign language glove consists of simple hand gloves fitted with flex sensors used to monitor the amount of bend in the fingers.
 
Their paper aimed to minimize the major complexities in the system; as a further extension, the sensor comes with face recognition and voice recognition features (Aravind and Sivagami, 2015). The relevance, connection, and relationship of this citation to the system is its motive to convert human sign language to voice with human gesture understanding and motion capture, achieved with the help of Microsoft Kinect, a motion capture device from Microsoft.

The study is about real-time conversion of sign language to speech and prediction of gestures using artificial neural networks (Abraham et al., 2018). The relevance, connection, and relationship of this citation to the system is its aim to help people who are unable to speak communicate. Most people will not be able to understand universal sign language unless they have learned it, and due to this lack of knowledge it is very difficult for them to communicate with mute people. A device that helps bridge the gap between mute persons and other people forms the crux of that paper.

Synthesis of the Review

The related literature, studies, and systems share the same objective of helping the deaf community lessen the communication gap with hearing people. The devices and sensors they used serve as a guide for developing the project. Most of the reviewed systems used Kinect sensors (Kinect V2, Kinect camera, Microsoft Kinect), a Leap Motion sensor, F-Xinulator, artificial neural networks, or an Arduino to convert sign language to text. Some paired the microcontroller with a PIR (Passive Infrared) motion sensor, which detects the motion of a moving human body or object: whenever someone comes within the range of the PIR sensor, its output pin goes high. Converting sign language into text in this way helps an ordinary person understand deaf people. With the help of the cited reviews, the project proposal can be implemented well and will help deaf and mute students communicate with hearing people. Hence, the project must consider all the possible things that can happen when implementing it.

CHAPTER III

EVALUATION, DESIGN, AND FRAMEWORK

This chapter presents the technicality of the system, involving the discussion of the expected output and justification, the operational framework, the requirements specifications, system analysis, system design, system development and testing, the conceptual framework, and the definition of terms.

Discussion of Expected Output and Justification

The expected output of the project is to interpret hand gestures into text. The project translates sign language into text with the use of a camera and a microcontroller, helping an ordinary person understand deaf people and lessening the communication gap between hearing people and the selected deaf students. The project will benefit deaf people in communicating well in daily life.

The project consists of a Raspberry Pi microcontroller, an integrated circuit device used for controlling other portions of an electronic system. The system uses the Python programming language to support the data inputting, building, control, testing, and management of the project. A camera captures hand gestures and movements so they can be converted to text, a monitor displays the converted hand gesture, and speakers provide voice output of the converted text. The developers used an Android phone for displaying the output text and audio, and used Basic4Android, a rapid native application development tool, to build the Android application.

Operational Framework

Identification
The first step in developing Interpreting Hand Sign into Text is identification. The developers gathered and collected the project requirements. In this phase, the devices were identified and prepared according to their use: the Raspberry Pi, camera, speaker, monitor, Bluetooth module, Android phone, and other needed materials. The developers also collected ideas that may help in developing the system. This phase also includes analyzing the problem and identifying the initial design of the project; all the possible problems were addressed and all the solutions were discussed by the developers.

Design
The second phase is designing. In this phase, the initial design of Interpreting Hand Sign into Text was produced. The designing phase is the part of development where the functions of translating sign language into text, through a hand gesture process that produces text and sound, were laid out. The information gathered from planning is put into a design in this phase, and all of the capabilities of the project were defined. This is an important phase of development because it is where the developers turn all the ideas into a real-world project. In developing Interpreting Hand Sign into Text, the developers designed the project according to the data and information gathered from the planning phase.

Construct or Build
In building and prototyping, the developers start building the project according to the design created in the previous phase. Coding and building the parts of Interpreting Hand Sign into Text begin here; all the hardware components were combined in this phase, and all the initial functionalities of the project were completed. The developers also improved and changed some functions according to the demands of the client, which included editing code and changing hardware. This phase also includes maintenance, where the developers observe the project for problems that arise while it performs its functions; the developers continue to support and observe the project even after implementation, for any changes it may need.

Evaluation and Risk Analysis

Evaluation by the community partner is the next phase. Once the initial design of Interpreting Hand Sign into Text is done, it is tested by the community partner for possible errors or malfunctions, and the partner is asked to evaluate the prototype. If the community partner is not satisfied, the developers ask for suggestions and insights to improve the project; only once the partner is satisfied do the developers start developing the final design. In this phase, the developers also came up with a solution to make the project a two-way communication system that allows a deaf and a hearing person to hold a simple conversation.

[Figure 1 shows the four-phase cycle: 1. Identification, 2. Design, 3. Construct or Build, 4. Evaluation and Risk Analysis.]

Figure 1
An Operational Framework showing the Development of the
Interpreting Hand Sign into Text
Requirements Specifications

The hardware requirements in developing Interpreting Hand Sign into Text are the following: the Raspberry Pi 3B+ microcontroller, which controls the camera and the speaker and serves as the computing unit for coding and testing the other devices; a camera for capturing the hand gestures; a speaker for outputting the translated hand gesture as audio; a monitor for checking whether the hand gesture is correct; a power bank that powers the Raspberry Pi; an Android phone for viewing the translated hand gesture as text; and a Bluetooth module for connecting the Raspberry Pi to the Android phone. All of these devices are important and necessary; if one of them fails, the prototype will not function well.

The software requirements in developing Interpreting Hand Sign into Text are the following: the Raspbian operating system (OS), which runs the Raspberry Pi; Basic4Android, which is used in developing the Android application; the Python IDE, which is used in coding the functions of the system; and the Android operating system, which serves as the platform of the Android application, preferably Android Lollipop 5.0 or a higher version for smooth operation. All of this software plays a vital role in developing the project; if any of it is missing, the project is impossible to build.

The community partner that will benefit from this prototype is San Jose Elementary School. The community partner's desired output is a friendly design that is easy to understand and navigate for the users. The developers designed the prototype so that it can easily connect with people, since its main functionality is communication with the intended users.

System Analysis
The project developers developed this project to help deaf people lessen the communication gap between them and hearing people, and to build their confidence that being deaf is not a barrier to achieving their goals in life. In this phase, the developers gathered all the possible risks and needs they had identified. This is where the devices to be used, such as the Raspberry Pi, camera, speaker, Android phone, and other materials, were identified and prepared. The developers collected ideas that may help in developing Interpreting Hand Sign into Text, analyzed the problem, and identified the initial design of the project; all the possible problems were addressed and all the solutions were discussed. The developers came up with this project to help the deaf/mute improve communication in their studies, and reviewed all the needed equipment and requirements for the model of the project: a model that costs less and is affordable for the selected deaf/mute community.

System Design

The flowchart is the diagram on which the project is based; it shows the process, or flow, of the project. The developers also created a 3D design for the box in which all of the hardware components are placed, for safety and to prevent them from being easily damaged.

System Development and Testing

The developers start building the project according to the design created in the previous phase. Coding and building the parts of Interpreting Hand Sign into Text begin in this phase, where all the hardware components were combined and all the initial functionalities were completed. In coding, the developers wrote the set of code needed to make the project function; the project requires the Python IDE for coding and creating the functions of the prototype. They then started gathering hand gestures to save, and analyzed all the gathered gestures to check whether the model was trained well. Testing the project allows the developers to locate errors or limitations of the prototype. The project is tested directly with the community partner, and the client evaluates the acceptability of the project.

Conceptual Framework

This conceptual framework explains the input, process, and output of the system. Developing and building the project requires software and hardware components. First, as input, the developers select a computer operating system that is compatible with the needed software, especially Basic4Android, the platform used to design the Android application. After building the application, it must be installed on a mobile phone running operating system version 9.0 "Pie" or above. The project also requires the Python IDE, which is used for coding and creating the functions of the project.

The process covers the procedures of development: identification, designing, construct and build, and evaluation and risk analysis. It includes determining the objectives, the design, and the progress of the project, and the developers' testing of the project's functionality. An Android application is used for accepting voice input, which is transcribed to text and shown on the monitor.

The finished project, or output, should be able to show correct recognition of the gesture using the camera with the help of the Raspberry Pi, and transform the gesture into audio using the speaker. It captures the hand gesture and transforms it into text and sound with the help of an Android phone connected to the device's Bluetooth. The project should help the deaf community lessen and lower communication gaps.

[Figure 2 presents an Input-Process-Output diagram. INPUT: Interpreting Hand Sign into Text. PROCESS: Identification, Designing, Construct or Build, Evaluation and Risk Analysis. OUTPUT: 1. Acceptability norm; 2. Weighted mean score. OUTCOMES: Helps the deaf students of San Jose Elementary School to lessen the communication gap. FEEDBACK loops back to the input.]

Figure 2
A Conceptual Framework showing the developed
Interpreting Hand Sign into Text

Definition of Terms

APK – an Android application package file format.

Basic4Android – a tool used for developing native Android applications.

Bluetooth module – a device used to connect the prototype to the Android
application.

Deaf – lacking the power of hearing, or having impaired hearing.

Machine learning – a technology that uses a dataset to train a recognition model;
here, to train on gesture images.

MediaPipe – a framework used to create advanced machine learning pipelines such
as face detection, multi-hand tracking, and object detection and tracking.

Monitor – displays a screen divided into two parts: one half shows the signed
gesture, and the other half displays the text received from the Android application.

Python – a programming language with objects, modules, threads, exceptions, and
automatic memory management.

Raspberry Pi – a small single-board computer that serves as the brain of the project
and plugs into a computer monitor.

Raspbian operating system – the official operating system for the Raspberry Pi 3.

Speech recognition – used to transcribe the voice into text and transfer the text to
the monitor.

Thonny IDE – the IDE used for writing the project's Python code.

Transcribe – to put thoughts, speech, or data into written or printed form.

Webcam – an input device used for capturing images and detecting hand gestures.

Chapter IV

METHODOLOGY

Research Design

The project developers utilized a descriptive research design through a
systematic study of the project's design, development, and evaluation, ensuring that
it meets the needs of the community partner. The developers studied the project's
physical and logical design as well as its methods of development and assessment.
The descriptive research centers on the survey questionnaires, the study design,
and the machine learning at the core of the project, and it helped the project
developers learn more about the study.

Setting of the Study

The subjects of the developers' project are the SPED students at San Jose
Elementary School, located at Hernandez Building, 40C Rodriguez Hwy, Rodriguez,
1860 Rizal. In 1947, San Jose Elementary School started as an annex of La Loma
Elementary School, with Prudencio Quilinta as principal. The school is open for
Kinder, Grade 1 to Grade 6, Special Education, and the Alternative Learning
System. The developers found the school a suitable community partner for testing
the project: it has a special program for students who need special attention, such
as the deaf students who will benefit from this project.

From the developer's house at Mediterranean Heights Blk 7 Lot 3, a tricycle
or jeepney ride to San Jose Elementary School covers 1.38 kilometers and takes
less than 10 minutes. From the developer's house to Colegio de Montalban is 3.52
kilometers, and the whole trip takes less than 15 minutes. The vicinity map of the
study site is shown on the next page.

Figure 3
Vicinity Map showing the location of San Jose Elementary School

Subject of the Study

This study required validation of the developed project. Technology experts
provided specific information and recommendations to help the project developers
continuously refine and advance the device and its programs, improving the overall
output of the project. The experts' documented validation covered the project's
purpose and function, its use, and its durability and design.

Respondents of the Study


The study is about interpreting hand signs into text. The project developers
conducted a survey at San Jose Elementary School. Before the students were
surveyed, the SPED teacher first evaluated the project and confirmed that the
requirements were met. The respondents assessed the project by filling out
questionnaires through Google Forms to determine whether or not the project is
acceptable. The project developers used availability sampling, with a sample of 28
deaf students, as these are the students able to use the project. The developers
also talked with the classroom adviser to gather insights and suggestions.

Sources of Data

The following data sources guided the project developers in the
developmental phase of the system:

Primary

Interview. The project developers conducted an interview at San Jose
Elementary School, specifically with the SPED teacher. The interview is one source
of data for this study: the developers held a face-to-face conversation with the
teacher to gather information about the deaf students and the desired design of the
project, and to discuss how the prototype works.

Internet. The Internet is a primary source of information. All essential
information connected to the project that was gathered on the World Wide Web was
evaluated and used responsibly in strengthening the project.

Survey Questionnaire. The developers used surveys to collect data and
information from the intended users. The primary objective of the survey is to
address a specific issue of the project; it also allows the developers to better
understand the respondents' demands.

Secondary

Related Literature. The project developers cited local and foreign literature
related to the project. The developers gathered ideas from different libraries by
reading textbooks, magazines, newspapers, and other published theses related to
the project. The related literature helped the developers identify the devices needed
to develop the project and broadened their ideas on how to improve the prototype.

Procedure of the Study

The procedure of the study is divided into a series of steps in the
development of the project: planning, designing, building and prototyping, and
evaluation and risk analysis.

First is planning. The project developers gathered information to determine
the requirements of the intended users; this guided the developers on what output
should be produced, since the project is intended for those users. The developers
gathered all the materials and started to assemble the devices needed. They also
checked functionality in order to analyze all details and build a logical flow toward an
accurate solution, and verified that the chosen software was reliable for the
development of the system.

The second procedure is designing. The project was designed according to
what had been planned in the 3D model. The developers created a box in which all
of the hardware components are placed for safety and to prevent them from being
easily damaged. This procedure includes costing the devices needed, as well as the
materials for building the box.

The third is building and prototyping. The developers began writing the
program for Interpreting Hand Sign into Text. For the camera, the developers
installed software requirements such as OpenCV and MediaPipe, which are used to
detect hand gestures. These requirements are essential to the project: OpenCV is a
computer vision library widely used for image processing, and MediaPipe provides
hand detection and tracking. After the requirements were installed, the developers
enabled voice output through the speakers. Additionally, the project was built
together with an Android application, which can be used by the teacher; it features
speech recognition whose transcription appears in both the application and on the
monitor.
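MediaPipe reports 21 (x, y) landmarks per detected hand. Before classification, such coordinates are usually normalized so the gesture is independent of where the hand sits in the frame. The project's own pre_process_landmark helper is not reproduced in full, so the following is only a plausible sketch of that step, assuming wrist-relative, max-magnitude normalization:

```python
def normalize_landmarks(landmarks):
    """Translate landmarks so the wrist (landmark 0) becomes the origin,
    then scale so the largest coordinate magnitude is 1.0.

    This is an assumed preprocessing scheme, not the project's exact code.
    """
    base_x, base_y = landmarks[0]
    # Make every point relative to the wrist so absolute position is removed.
    relative = [(x - base_x, y - base_y) for x, y in landmarks]
    # Flatten to a single feature vector for the classifier.
    flat = [value for point in relative for value in point]
    # Scale so hand size / distance from the camera is removed.
    max_value = max(abs(value) for value in flat) or 1.0
    return [value / max_value for value in flat]

# Example with three toy landmarks (a real MediaPipe hand has 21).
print(normalize_landmarks([(0.5, 0.5), (0.6, 0.5), (0.5, 0.7)]))
```

The resulting vector is what a classifier would consume, one float pair per landmark.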

The fourth procedure is evaluation and risk analysis. After building the
project, the developers evaluated it and communicated with the community partner
to ask for suggestions and insights to improve the project. Analyzing possible risks
and problems lessens the project's errors.

Statistical Treatment

The statistical tools used by the project developers are:

Ranking: used to arrange the respondents' answers for each specific item.

Weighted mean: used to calculate the average of each category and to
measure the acceptability of the project.
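The two treatments can be illustrated with a short sketch; the responses and item names below are invented examples, not the study's data:

```python
def weighted_mean(scores):
    """Average of Likert responses for one acceptability item."""
    return sum(scores) / len(scores)

def rank_items(means):
    """Rank items from highest to lowest mean (rank 1 = highest)."""
    ordered = sorted(means, key=means.get, reverse=True)
    return {item: rank for rank, item in enumerate(ordered, start=1)}

# Hypothetical responses from four respondents on two items.
items = {"purpose is met": [4, 3, 4, 3], "easy to use": [4, 4, 4, 3]}
means = {item: weighted_mean(scores) for item, scores in items.items()}
print(means)             # {'purpose is met': 3.5, 'easy to use': 3.75}
print(rank_items(means)) # {'easy to use': 1, 'purpose is met': 2}
```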

Chapter V

RESULTS AND DISCUSSION

This chapter presents the results of validation as well as the corresponding
verbal interpretations with respect to each point of the five-point validation scale.

The project developers developed a project that can interpret hand signs into
text and audio output. Specifically, the project's purpose is to enable two-way
communication.

Table 1
Computed Weighted Mean on the Level of Acceptability of the Developed
Interpreting Hand Sign into Text with Respect to Purpose and Function

PURPOSE AND FUNCTION | x̄ | VERBAL INTERPRETATION | RANK
1. The project's purpose is met. | 3.61 | Highly Acceptable | 2
2. The project is built according to its intended design. | 3.46 | Highly Acceptable | 3
3. The project would help in improving the difficulty of everyday living. | 3.75 | Highly Acceptable | 1
4. The project can be marketed for local use. | 2.96 | Acceptable | 5
5. The project can be introduced in the country for its practical use. | 2.85 | Acceptable | 6
6. The project's intended use is highly evident and shows a higher level of integrity. | 3.28 | Acceptable | 4
WEIGHTED MEAN | 3.32 | Acceptable |

Table 1 presents the computed weighted mean of the level of validation of the
developed Interpreting Hand Sign into Text with respect to purpose and function.
Among the six (6) factors of purpose and function, the statement that the project
would help in improving the difficulty of everyday living ranked first with a mean of
3.75, verbally interpreted by the respondents as highly acceptable. The project's
purpose being met came next with a mean of 3.61, also interpreted as highly
acceptable. The project being built according to its intended design ranked third with
a mean of 3.46, interpreted as highly acceptable. The statement that the project's
intended use is highly evident and shows a higher level of integrity obtained a mean
of 3.28, interpreted as acceptable. Next, the statement that the project can be
marketed for local use received a mean of 2.96, interpreted as acceptable. Lastly,
the statement that the project can be introduced in the country for its practical use
obtained a mean of 2.85, interpreted as acceptable. Overall, the project's purpose
and function is interpreted as acceptable, with a weighted mean of 3.32.

Lopez (2017) concluded that a dataset used to train and evaluate such a
system must have sufficient gesture variation to generalize each symbol: "NUI is the
use of body and hand gestures, therefore Hand Gesture Recognition (HGR) is an
essential component of this kind of system." The developers used a dataset to train
the model to recognize letter and word gestures from American Sign Language
(ASL) based on images captured by the camera. The developed project can ease
the difficulty of everyday living.
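The appendix source trains a scikit-learn Random Forest on saved landmark arrays. As a dependency-free illustration of the underlying idea, classifying a normalized landmark vector by its similarity to labeled training vectors, a toy nearest-neighbour stand-in might look like this; the vectors and labels below are invented:

```python
import math

def nearest_label(sample, training_data):
    """Return the label of the training vector closest to `sample`
    (Euclidean distance) -- a toy stand-in for the project's Random Forest."""
    best_label, best_dist = None, math.inf
    for label, vector in training_data:
        dist = math.dist(sample, vector)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Invented 4-dimensional "landmark" vectors for two gestures.
training = [("A", [0.0, 0.1, 0.9, 1.0]), ("B", [1.0, 0.9, 0.1, 0.0])]
print(nearest_label([0.1, 0.2, 0.8, 0.9], training))  # A
```

A real classifier sees much longer vectors (21 landmarks, two coordinates each) and many training samples per gesture, which is why sufficient gesture variation in the dataset matters.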

Based on the results with respect to purpose and function, machine learning
made a good impression on the respondents. Machine learning is in high demand
today: leading companies such as Facebook and Google make it a central part of
their operations. It focuses on the development of computer programs that can
access data and use it to learn for themselves. This project was conducted to help
advance this field of computer technology.

The project Interpreting Hand Sign into Text will enlighten the selected deaf
students. The Android-based application accepts voice input, which is transcribed to
text and displayed on the monitor.

Table 2
Computed Weighted Mean on the Level of Acceptability of the Developed
Interpreting Hand Sign into Text with Respect to Use

USE | x̄ | VERBAL INTERPRETATION | RANK
1. The project is easy for the user to use and become familiar with. | 3.85 | Highly Acceptable | 2
2. The project guarantees ease for users to achieve its demonstrable operation. | 3.75 | Highly Acceptable | 3
3. The project's overall usability impression makes it stand out. | 3.89 | Highly Acceptable | 1
WEIGHTED MEAN | 3.83 | Highly Acceptable |

Table 2 presents the computed weighted mean of the level of validation of the
developed Interpreting Hand Sign into Text with respect to use. Among the three (3)
factors, the statement that the project's overall usability impression makes it stand
out ranked first with a mean of 3.89, verbally interpreted as highly acceptable. The
project being easy for the user to use and become familiar with ranked second with
a mean of 3.85, also interpreted as highly acceptable. Lastly, the statement that the
project guarantees ease for users to achieve its demonstrable operation obtained a
mean of 3.75, interpreted as highly acceptable. Overall, the project's use is
interpreted as highly acceptable, with a weighted mean of 3.83.

Digital systems can record every individual performance and provide an
accurately customized report of each learner's specific needs (Singh, 2020). With
classroom sizes increasing day by day, this kind of technological help will be a
breakthrough in education and will ease the burden on both teachers and students.
Since this project was made to help the students, the developers made it user-
friendly and easy to become familiar with.

One of the most difficult components in terms of usability is designing and
building the project, because aside from meeting the users' needs and demands,
the developers must also adjust to the users' level of familiarity with computer
technology. Developing a user-friendly program is the hardest part of any software
development, but the project developers met it with a high level of usability.

The project was designed according to the intended plan of the 3D model.
The project developers made some adjustments based on the client's feedback and
suggestions.

Table 3
Computed Weighted Mean on the Level of Acceptability of the Developed
Interpreting Hand Sign into Text with Respect to Durability and Design

DURABILITY AND DESIGN | x̄ | VERBAL INTERPRETATION | RANK
1. The project gives an impression of steadiness and firmness. | 3.17 | Acceptable | 6
2. The project assures probability that it will have a relatively long continuous useful life, without requiring an excessive degree of maintenance. | 3.25 | Acceptable | 5
3. The project does not show any cracks or fractures. | 3.28 | Acceptable | 4
4. The project looks well-designed. | 3.82 | Highly Acceptable | 1
5. The project is neat and free from disorder. | 3.46 | Highly Acceptable | 3
6. The project is worthy to be considered as a college output – one that is well-thought of. | 3.57 | Highly Acceptable | 2
WEIGHTED MEAN | 3.43 | Acceptable |

Table 3 presents the computed weighted mean of the level of validation of the
developed Interpreting Hand Sign into Text with respect to durability and design.
The highest-ranked factor, with a mean of 3.82, states that the project looks well-
designed, verbally interpreted as highly acceptable. The project being worthy to be
considered as a college output – one that is well-thought of – ranked second with a
mean of 3.57, interpreted as highly acceptable. The project being neat and free from
disorder ranked third with a mean of 3.46, interpreted as highly acceptable. The
statement that the project does not show any cracks or fractures obtained a mean of
3.28, interpreted as acceptable. The factor that the project assures a relatively long
continuous useful life without requiring an excessive degree of maintenance follows
with a mean of 3.25, interpreted as acceptable. Lastly, the statement that the project
gives an impression of steadiness and firmness obtained a mean of 3.17,
interpreted as acceptable. Overall, durability and design obtained a total weighted
mean of 3.43, an acceptable result.

In Sign Language Converter, Arsan and Ulgen (2015) aimed to design a
convenient system that is helpful for people who have hearing difficulties and, in
general, for users of a very simple and effective method: sign language. Their
system can convert sign language to voice and voice to sign language, using a
motion capture system for sign conversion and a voice recognition system for voice
conversion. It captures signs and dictates them on screen as writing, and it captures
voice and displays the sign language meaning on screen as a motion image or
video.

Durability and design together express the physical strengths of the project.
Design is such an important element of a project that without it there is no structure
for the project development process to stand on and make the project a success in
the future. Durability concerns the materials used in developing the project: the
developers consider that, in order to realize the project's maximum capability, it
must be durable with respect to the users' criteria and the environment in which the
project will be located.


The project developers developed a project that can interpret hand signs into
text. Specifically, the system is designed for two-way communication: it reads the
user's hand gesture using the camera, shows a correct recognition of the gesture
displayed on the monitor, and transforms the gesture into audio using the speaker.
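In the appendix source, a recognized letter is only committed after the same prediction has been returned for about 20 consecutive frames, which filters out flicker between similar hand shapes. Reduced to its core, that debouncing step can be sketched as follows (the class name and threshold are illustrative):

```python
class GestureDebouncer:
    """Commit a prediction only after it is seen `required` frames in a row,
    mirroring the frame-count check in the appendix source."""

    def __init__(self, required=20):
        self.required = required
        self.current = None   # prediction seen on the previous frame
        self.count = 0        # how many consecutive frames it has held

    def update(self, prediction):
        """Feed one per-frame prediction; return it once it stabilizes."""
        if prediction == self.current:
            self.count += 1
        else:
            self.current, self.count = prediction, 1
        if self.count == self.required:
            self.count = 0  # reset so a held gesture can repeat later
            return prediction
        return None

# Three consecutive matching frames commit a letter; a change resets the count.
deb = GestureDebouncer(required=3)
print([deb.update(p) for p in ["A", "A", "A", "B", "B", "B"]])
# [None, None, 'A', None, None, 'B']
```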

Table 4
Computed General Weighted Mean on the Level of Acceptability of the
Developed Interpreting Hand Sign into Text

FACTORS OF ACCEPTABILITY | MEAN | VERBAL INTERPRETATION | RANK
1. Purpose and Function | 3.32 | Acceptable | 3
2. Use | 3.83 | Highly Acceptable | 1
3. Durability and Design | 3.43 | Highly Acceptable | 2
GENERAL WEIGHTED MEAN | 3.53 | Highly Acceptable |

Table 4 presents the general computed weighted mean of the level of
validation of the developed Interpreting Hand Sign into Text with respect to the
factors of acceptability. Among the three (3) factors, use ranked first with a mean of
3.83, verbally interpreted as highly acceptable. Durability and design ranked second
with a weighted mean of 3.43, interpreted as highly acceptable. Lastly, purpose and
function obtained a mean of 3.32, interpreted as acceptable. Overall, the project is
interpreted as highly acceptable, with a general weighted mean of 3.53.

As research on signing avatars shows (Kacorri et al., 2017), acceptance
within the Deaf community is vital for the adoption of sign language generation
technologies. Bragg et al. (2019) believe that the Deaf user perspective has to be
properly analyzed and that enforcing technology on the Deaf community will not
work. However, avatar generation "faces a number of technical challenges in
creating avatars that are acceptable to deaf users (i.e., pleasing to view, easy to
understand, representative of the Deaf community, etc.)" (Bragg et al., 2019, 7).

After considering all of the validation criteria, the project achieved a high
degree of validation. However, from the viewpoint of the project developers,
something is still missing, particularly in terms of usefulness and purposefulness,
which are the two most significant criteria of project development. Although the
project has already been built, the project developers continue to enhance it in order
to get better outcomes.
CHAPTER VI

CONCLUSION AND RECOMMENDATION

This chapter presents the conclusion, recommendations, and implementation
plan of the developed Interpreting Hand Sign into Text.

Conclusion

The project developers therefore conclude that the general objectives have
been successfully met, which will lessen the communication gap between deaf and
hearing people. Interpreting Hand Sign into Text was developed for the selected
deaf students of San Jose Elementary School and was designed to show the
selected students that they are not forgotten or disregarded. It transcribes hand
gestures in real time into visual and audio output. The Android application, which is
connected to the internet, accepts voice input and transforms it into text, providing
two-way communication between deaf and hearing people. The implementation of
this project will considerably lessen the communication gap.

Recommendations

Based on the findings and conclusions formulated in this study, many
changes and improvements may be made to the system as technology advances,
opening more ways to meet the needs of the community partner and to improve the
system.

For Information Technology students: they may take this as an opportunity
for a research study with a similar purpose, as it could provide ideas to help them
create a more systematic system using existing devices, languages, and sensors,
with the hope of upgrading the current system.

For Information Technology instructors: they may make room for developing
a study ground for future developers using Android applications, which would help
students increase their ideas and skill sets as they prepare for their own careers.

For the Institute of Computer Studies: it may create an environment in which
both students and instructors can explore a more suitable training ground for their
talents and skills. A better-suited community will greatly affect students' learning
outcomes.

For the SPED students of San Jose Elementary School: they can adopt the
system into their daily lives and studies, lessening the communication gap between
them and hearing people.

For future project developers: they may look at opportunities to provide
greater accessibility to their clients and to integrate the system into a website. It is
highly suggested to upgrade the system with devices that can detect more
distinguished human gestures.

Implementation Plan

The implementation plan determines how long the project will be applied with
the selected deaf students. The project will be implemented at San Jose Elementary
School within a week, to the benefit of the deaf students. On the first day of
implementation, all the needed software and hardware requirements are gathered.
On the second day, all remaining requirements and the most needed components
are sourced. On the third and fourth days, the developers build a 3D model of the
project. On the fifth and sixth days, the developers analyze and study the problems
they may encounter and prepare back-up plans and solutions. On the seventh day,
the project developers incorporate the client's feedback and suggestions in
constructing and building the project. After building the project, the developers test it
before demonstrating it to the client; checking and troubleshooting take place at this
stage. After testing and checking, the project is presented to the client, and the
developers let the users control the project to familiarize them with how to navigate
it.

BIBLIOGRAPHY
World Health Organization. Deafness and Hearing Loss. Posted April 1, 2021.
Retrieved from
https://www.who.int/news-room/fact-sheets/detail/deafness-and-hearing-loss

Potgantwar, A. and Bachchhav, P. Sign Language Interpreter Using Kinect Motion.


Posted October 2019. Retrieved From
https://www.ijitee.org/wp-content/uploads/papers/v8i12/L26451081219.pdf

Advani, N. et al. A Survey on Communication Gap between Hearing and Speech
Impaired Persons and Normal Persons. Posted December 2013. Retrieved
from https://core.ac.uk/download/pdf/25725245.pdf

Statistics of deaf and mute in the Philippines 2018. Posted January 2021 Retrieved
From https://www.gamecritics.com/wp-content/uploads/daiem4k/258bfa-
statistics-of-deaf-and-mute-in-the-philippines-2018

Rashmi, et. al. Lack of proper communication system between Deaf/Mute people
and normal people.  Posted  October 2019. Retrieved From
https://tqb.li2.in/lack-of-proper-communication-system-between-deaf-mute-
people-and-normal-people/?
fbclid=IwAR0MrH7BMhk8GamJKfycLRHqFFTubMto8Cg_XHg5EluCpvJT0Nb
K1l0R0FQ

Pu, Junfu, et al. Sign Language Recognition with Multi-modal Features. Posted
September 2016. Retrieved from
https://www.researchgate.net/publication/310901309_Sign_Language_Recog
nition_with_Multi-modal_Features

Zafar, Ahmed & HARIT, GAURAV. (2016). Nearest neighbour classification of


Indian sign language gestures using kinect camera. Sadhana. 41.
10.1007/s12046-015-0405-3.

Vazquez Lopez. Hand Gesture Recognition for Sign Language Transcription.
Posted May 2017. Retrieved from
https://scholarworks.boisestate.edu/cgi/viewcontent.cgi?
article=2322&context=td

Lim, I. et. al. Sign-language Recognition through Gesture & Movement Analysis
(SIGMA). Posted March 2015. Retrieved from
https://www.dlsu.edu.ph/wpcontent/uploads/pdf/conferences/research-
congress-proceedings/2015/HCT/011-HCT_Ong_C.Y.pdf

Mejia, M. HandTalk : an interactive web-based sign language. Posted 2002.


Retrieved from https://libguides.dlsu.edu.ph/c.php?g=807200&p=5761312

Khan, Fazlur & Ong, Huey & Bahar, Nurhidayah. (2016). A Sign Language to Text
Converter Using Leap Motion. International Journal on Advanced Science,
Engineering and Information Technology. 6. 1089. 10.18517/ijaseit.6.6.1252.

Elmahgiubi, Mohammed & Ennajar, Mohamed & Drawil, Nabil & Elbuni, Mohamed.
(2015). Sign Language Translator and Gesture Recognition.
10.1109/GSCIT.2015.7353332.

Alibusa, Arayata, Estrella, Matias, and Ramirez. F-Xinulator: Filipino Sign Language
Translator. Posted November 2014. Retrieved from
https://www.scribd.com/document/254920293/Filipino-Sign-Language

Sandjaja, Iwan Njoto. Sign language number recognition. Posted June 2008.
Retrieved from https://libguides.dlsu.edu.ph/c.php?g=807200&p=5761312

Newall, Maria Cristina L. Manlapig. The stages of faith development of the deaf
students at the De La Salle University – College of St. Benilde. Posted 2004.
Retrieved from https://libguides.dlsu.edu.ph/c.php?g=807200&p=5761312

Jhunjhunwala, Yash & Shah, Pooja & Patil, Pradnya & Waykule, Jyoti. (2017). SIGN
LANGUAGE TO SPEECH CONVERSION USING ARDUINO.

Rajaganapathy. S, et. al. Conversation of Sign Language to Speech with Human


Gestures. (2015). Doi: 10.1016/j.procs.2015.04.004

Abey Abraham, Rohini V. Real time conversion of sign language to speech and
prediction of gestures using Artificial Neural Network (2018). Doi:
10.1016/j.procs.2018.10.435

Singh, Vijay. How Machine Learning Is Changing the World. Posted February 18,
2020. Retrieved from
https://www.datasciencecentral.com/how-machine-learning-is-changing-the-
world/

APPENDIX A
THE FIVE-POINT SCALE

The five-point scale used in validating the acceptability of the developed
Interpreting Hand Sign into Text

APPENDIX B
PROJECT FLOWCHART

Prepared by: Checked by:

ALBERT KYLE V. BUENAOBRA Ms. Marjorie Mae C. Bascones


Leader, Project Development Group Technical Adviser

APPENDIX C
ANDROID FLOWCHART

[Flowchart: START → BLUETOOTH connected? — yes → INPUT VOICE →
TRANSLATE → DISPLAY → END; no → back to BLUETOOTH]

Prepared by: Checked by:

ALBERT KYLE V. BUENAOBRA Ms. Marjorie Mae C. Bascones


Leader, Project Development Group Technical Adviser

APPENDIX D
SURVEY QUESTIONNAIRE

APPENDIX E
SOURCE CODE

# TechVidvan hand Gesture Recognizer
# -*- coding: utf-8 -*-
# import necessary packages

import cv2
import numpy as np
import mediapipe as mp

from Preprocessing_LandMark import *
from pyttsx3_module import *
import time
from threading import Thread
from HGR_RFC_Module import *
import frmMain
from serialport import HardwareSerial
from Parser import Parser
from aatk_module import *
import textwrap

# "/dev/ttyUSB0" for Raspi USB to TTL, "COM4" for Windows, "/dev/ttyS0" for UART Rpi
Serial1 = HardwareSerial("/dev/ttyUSB0")
Serial1.begin(9600)
MSGParser = Parser("MSG,", "?", 1, _sizeofdata=140)
wrapper = textwrap.TextWrapper(width=15)


class Model:
    def __init__(self):
        self.mpHands = mp.solutions.hands
        self.hands = self.mpHands.Hands(max_num_hands=1,
                                        min_detection_confidence=0.7)
        self.mpDraw = mp.solutions.drawing_utils
        # self.Train()

        self.unique = {26: "I love you", 28: "Ok!",
                       29: "Hello! We are BSIT 3G!", 27: "Thank You!"}
        self.text = ""
        self.frmMain = frmMain.Handler(self)
        self.FromAndroidData = []

    '''################# DATA EXTRACTION ####################'''
    def ImageProcessingHand(self, frame):
        x, y, c = frame.shape

        # Flip the frame vertically
        frame = cv2.flip(frame, 1)
        framergb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)

        # Get hand landmark prediction
        result = self.hands.process(framergb)
        # print(result)
        className = ''

        # post process the result
        landmarks = []
        if result.multi_hand_landmarks:
            for handslms in result.multi_hand_landmarks:
                for lm in handslms.landmark:
                    # print(id, lm)
                    lmx = int(lm.x * x)
                    lmy = int(lm.y * y)
                    landmarks.append([lm.x, lm.y])
                    # landmarks.append(lmy)

                # Drawing landmarks on frames
                self.mpDraw.draw_landmarks(frame, handslms,
                                           self.mpHands.HAND_CONNECTIONS)
        if len(landmarks):
            landmarks = pre_process_landmark(landmarks)
        return frame, landmarks

    def Train(self):
        x = np.load("data.npy")
        y = np.load("label.npy")

        '''################# Train-Test-Split ####################'''
        X_train, X_test, Y_train, Y_test = \
            train_test_split(x, y, test_size=0.40, random_state=0)

        # 87.8% accuracy score at 1090 iterations, August 31, 2021
        # 96.67% accuracy score at 148 iterations, September 1, 2021
        self.rf = RandomForestClassifier(random_state=148)
        self.rf.fit(X_train, Y_train)

    def Predict(self, landmark_list):
        Y_pred_rf = self.rf.predict_proba([landmark_list, ])
        if max(Y_pred_rf[0]) > 0.2:
            Y_pred_rf = np.argmax(Y_pred_rf)
            return labelconv[str(Y_pred_rf + 1)]
            # if Y_pred_rf <= 25:
            #     return chr(Y_pred_rf + 65)
            # else:
            #     return self.unique[Y_pred_rf]
        else:
            return ""

    def CamThread(self):
        # mode 0: Recognition (Initial Mode)
        # mode 1: Writing

        """Recognition Proper"""
        cap = cv2.VideoCapture(0)
        # cap.set(3, 1280)
        # cap.set(4, 720)
        prevText = ""
        mode = 0
        cnt = 0
        text = ""
        txt = ""
        SCH1 = Scheduler(500)
        while True:
            ret, frame = cap.read()

            frm = self.ImageProcessingHand(frame)

            if len(frm[1]):
                txt = self.Predict(frm[1])
                txt = txt.replace("-", " ")
                if txt == "ok":
                    txt = "I'm ok"

                if len(txt) == 1:
                    cnt += 1
                    # print(cnt)
                    if cnt >= 20:
                        text += txt
                        self.text = txt
                        cnt = 0
                        prevText = txt
                elif txt == 'koreanheart':  # self.unique[27]:
                    self.text = text
                    mode = 0
                    text = ""
                    txt = ""
                elif prevText != text and len(txt) == 1:
                    cnt = 0
                    txt = ""
                else:
                    self.text = txt
                    cnt = 0
                    # txt = ""
            else:
                txt = ""
            if SCH1.Event():
                mtxt = text + txt
                if len(mtxt):
                    Serial1.println(mtxt)
            cv2.putText(frm[0], text + txt, (10, 70), cv2.FONT_HERSHEY_SIMPLEX, 2,
                        (0, 0, 255), 3, cv2.LINE_AA)
            # cv2.imshow("Output", frm[0])
            # key = cv2.waitKey(100)
            self.frmMain.Image2.LoadPictureOCV = cv2.resize(
                frm[0],
                (int(self.frmMain.Image2.Width), int(self.frmMain.Image2.Height)))
            time.sleep(0.1)
