Hand Gesture To Speech Translation


HAND GESTURE TO SPEECH TRANSLATION AND DEVICE CONTROL 2017-2018

Chapter 1

Introduction

Human beings interact with each other to convey their ideas, thoughts and experiences to
the people around them. Deaf and mute people, however, cannot rely on speech; sign
language helps them to communicate without the means of acoustic sounds. The objective
behind this work is to develop a system for recognizing sign language, which provides
communication between people with speech impairment and others, thereby reducing the
communication gap between them.

1.1 Motivation

Communication involves the exchange of information, and this can only occur effectively
if all participants use a common language. Sign language is the language used by deaf
and mute people; it uses gestures instead of sound to convey the signer's thoughts. A
gesture in a sign language is a particular movement of the hands held in a specific
shape. The main objective of this project is to present a system that can efficiently
translate sign language gestures into both audible speech and text.

1.2 Objective of the project

The main objective of the project is to ease the lifestyle of deaf and mute people.
Throughout the world, mute people use sign language to communicate with others, but
this is possible only with those who have undergone special training. This project
presents a system that can effectively translate hand gestures to speech and text. The
project also aims at controlling devices using hand gestures.

1.3 Literature review

Deaf people have used interpreters to translate sign language into speech. However,
interpreters are costly and difficult to acquire on a moment's notice. Note writing has
been used by non-vocal individuals to communicate with someone who is
Dept. of E&C Engg., SIT, Tumakuru

seated nearby, but it is awkward while standing at a distance, while walking, and when
more than two people are involved in a conversation. As described in [1], sign language
recognition systems mainly follow two well-known approaches: an image processing
technique, and a microcontroller with a sensor-based data glove. These approaches are
also known as vision-based and sensor-based techniques. In the image processing
technique, a camera is used to capture images or video; static images are analysed and
recognized using algorithms that produce sentences on a display. In [2], the algorithms
used in vision-based sign language recognition systems are Hidden Markov Models (HMM),
Artificial Neural Networks (ANN) and Sum of Absolute Differences (SAD). The
disadvantage of the vision-based technique is the complexity of the data-processing
algorithms. Vision-based techniques use camera tracking, where the user wears a glove
with specific colours or markers indicating individual parts of the hands, especially
the fingers. The cameras record the changing image and position of the hand as the user
signs, and the pictures are then processed to retrieve the hand shape, position and
orientation. Other challenges in image and video processing include varying lighting
conditions, backgrounds, field-of-view constraints and occlusion.

Another approach, used in [3], employs tactile flex sensors to measure the hand gesture.
The sensor output signals are fed to a microcontroller, which processes them to
recognize the hand gesture and produce speech and text. Existing systems have used
text-to-speech conversion for voice output and display, as well as for device control,
as in [4].

1.4 Organization of the report


This mini project report is divided into 6 chapters. The report begins with the
introduction and literature review in Chapter 1. Chapter 2 describes the block diagram
of the conversion of hand gestures to speech. The hardware description is presented in
Chapter 3. In Chapter 4, the software description and flowchart are given. Results are
discussed with snapshots in Chapter 5. Chapter 6 presents the conclusion and future work.


Chapter 2
Block Diagram Description
This chapter discusses the block diagram and the different components used in the
process of converting hand gestures to speech and controlling devices.

2.1 Block diagram

The block diagram of the conversion of hand gestures to speech and device control is
shown in Figure 2.1. The major components used in the conversion process are the
PIC18 microcontroller, LCD display, Flex Sensors and Bluetooth module.

In the diagram, the flex sensors feed the ADC inputs of a PIC microcontroller, which
drives a 16x2 LCD display and, through its UART, a Bluetooth module paired with a
mobile phone for speech output. A second PIC microcontroller (device control mode)
receives commands through a Bluetooth receiver and switches Bulb 1 and Bulb 2 through
Relay 1 and Relay 2.
Figure 2.1: Block diagram of speech translation and device control system


2.2 Description of the block diagram

• PIC18: The microcontroller that controls all the operations carried out during the
conversion of hand gestures into speech and during the device control process.
• Flex Sensors: Variable resistors. The resistance of a flex sensor increases as the
body of the component bends.
• LCD Display: An electronic display module which uses liquid crystal to produce a
visible image. The 16×2 LCD is a very basic module commonly used in circuits; it
displays 16 characters per line on 2 such lines.
• Bluetooth Module: The HC-05 is a Serial Port Protocol (SPP) module designed for
transparent wireless serial connection setup.


Chapter 3

Hardware Description

This chapter describes the main hardware components used in the process of
conversion of hand gestures. The components are Flex Sensors, PIC18
microcontroller, Bluetooth module HC-05 and LCD display.

3.1 Flex Sensor

The flex sensor shown in Figure 3.1 is based on resistive carbon elements. As a variable
printed resistor, the flex sensor achieves a good form factor on a thin flexible
substrate. When the substrate is bent, the sensor produces a resistance output
correlated to the bend radius: the smaller the radius, the higher the resistance. The
flex sensor in this system is used in a voltage divider, which varies the voltage
according to the bend resistance as shown in Table 3.1. The resistance varies from
about 10 kΩ to 30 kΩ: an unflexed sensor has a resistance of 10 kΩ, which increases to
about 30 kΩ when bent to 90 degrees. The flex sensors placed on a glove are shown in
Figure 3.2. The graph in Figure 3.3 depicts the output of the flex sensors with
respect to deflection.
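The divider behaviour can be sketched numerically. The supply voltage and the fixed
resistor value below are assumptions for illustration (the report does not state them);
the trend, with the output voltage rising as the flex resistance rises, matches Table 3.1.

```c
#include <assert.h>

/* Hypothetical divider: Vout is taken across the flex sensor, with an
   assumed 10 kOhm fixed resistor to an assumed 5 V supply. */
#define VCC      5.0   /* supply voltage in volts (assumed)          */
#define R_FIXED 10.0   /* fixed divider resistor in kOhm (assumed)   */

/* Output voltage of the divider for a given flex resistance in kOhm */
double flex_vout(double r_flex_kohm)
{
    return VCC * r_flex_kohm / (R_FIXED + r_flex_kohm);
}
```

With these assumed values, an unbent 10 kΩ sensor gives 2.5 V and a fully bent 30 kΩ
sensor gives 3.75 V, so bending always raises the measured voltage.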

Figure 3.1: Flex Sensor


Figure 3.2: Flex Sensor placed on the glove

Table 3.1: Variation of voltage & resistance with respect to bend angle

Bend angle (degrees)   Resistance (kΩ)   Voltage (V)
0                      9.77              1.93
10                     10.85             2.06
20                     11                2.11
30                     12                2.15
50                     13.1              2.3
60                     13.7              2.4
90                     16                2.5
120                    18                2.9
150                    19.5              3
180                    20                3.1
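As a rough software model, the measured points of Table 3.1 can be linearly
interpolated to estimate the sensor voltage at intermediate bend angles. This is only
an illustrative sketch, not part of the project firmware.

```c
#include <assert.h>

/* Measured (angle, voltage) pairs taken from Table 3.1 */
static const double angles[]   = {0, 10, 20, 30, 50, 60, 90, 120, 150, 180};
static const double voltages[] = {1.93, 2.06, 2.11, 2.15, 2.3, 2.4, 2.5, 2.9, 3.0, 3.1};
#define N_POINTS 10

/* Linear interpolation between the two surrounding table points;
   angles outside the table range are clamped to the end values. */
double voltage_at(double angle)
{
    int k;
    if (angle <= angles[0])            return voltages[0];
    if (angle >= angles[N_POINTS - 1]) return voltages[N_POINTS - 1];
    for (k = 1; k < N_POINTS; k++)
    {
        if (angle <= angles[k])
        {
            double t = (angle - angles[k - 1]) / (angles[k] - angles[k - 1]);
            return voltages[k - 1] + t * (voltages[k] - voltages[k - 1]);
        }
    }
    return voltages[N_POINTS - 1];
}
```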


Figure 3.3: Voltage versus deflection graph

3.2 PIC Microcontroller

The outputs of the flex sensors are connected to the analog pins of the PIC18
microcontroller: RA1, RA2, RA3, RA5 and RB0.

Figure 3.4 shows the PIC18(L)F2X/4XK22 microcontroller. An analog-to-digital converter
(ADC) is required to digitize the analog signals from the sensors, and the
PIC18(L)F2X/4XK22 has an inbuilt ADC and multiplexer. The computation inside the
microcontroller is based on the ADC value of each flex sensor, and the respective
output is produced either as speech or by controlling the devices. The microcontroller
supports both serial and parallel communication. Some of its key features are:

• Analog-to-Digital Converter (ADC) module:
  • 10-bit resolution, up to 30 external channels
  • Auto-acquisition capability
  • Conversion available during sleep
  • Fixed Voltage Reference (FVR) channel
  • Independent input multiplexing
• Flash Program Memory: 32 Kbytes
• EEPROM Data Memory: 256 bytes
• SRAM Data Memory: 1536 bytes
• A/D Converter: 10-bit, thirteen channels
• Enhanced USART: Addressable, with RS-485, RS-232 and LIN support
• MSSP: SPI and I²C Master and Slave support
• External Oscillator: up to 40 MHz

Figure 3.4: PIC microcontroller
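For a 10-bit converter, the relationship between an input voltage and the resulting
digital code can be sketched as below. The 5 V reference is an assumption for
illustration (the report does not state it); thresholds such as 875 and 890 used later
in the appendix firmware are raw 10-bit codes of this kind.

```c
#include <assert.h>

#define VREF 5.0   /* ADC reference voltage in volts (assumed) */

/* Ideal 10-bit ADC transfer function: 0 V maps to code 0, VREF to 1023 */
unsigned int adc_code(double vin)
{
    if (vin <= 0.0)  return 0;
    if (vin >= VREF) return 1023;
    return (unsigned int)(vin / VREF * 1023.0 + 0.5);  /* round to nearest */
}
```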

3.3 LCD Display

A 16x2 LCD can display 16 characters per line on 2 such lines. In this LCD, each
character is displayed in a 5x7 pixel matrix. The LCD shown in Figure 3.5 has two
registers, Command and Data. The command register stores the command instructions
given to the LCD; a command is an instruction to perform a predefined task such as
initializing the display, clearing the screen, setting the cursor position or
controlling the display. The data register stores the data to be displayed on the LCD,
namely the ASCII value of each character to be displayed.
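The cursor-positioning commands used by the appendix firmware follow the standard
HD44780-style address map: the command byte is 0x80 plus the DDRAM address, where line
1 starts at address 0x00 and line 2 at 0x40. A minimal sketch of how such a command
byte is built:

```c
#include <assert.h>

/* Build the "set DDRAM address" command byte for a 16x2 LCD:
   line 1 -> 0x80 + column, line 2 -> 0xC0 + column */
unsigned char lcd_set_cursor_cmd(int line, int column)
{
    return (unsigned char)((line == 2 ? 0xC0 : 0x80) + column);
}
```

This matches the 0x80 and 0xC0 constants written to the command register in the
appendix code.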


Figure 3.5: LCD Display

3.4 Bluetooth Module

Bluetooth modules 1 and 2 are set up as the master and the slave respectively. Figure
3.6 depicts the HC-05, a Bluetooth SPP module used for transparent wireless serial
connection setup; here the HC-05 is used in the slave configuration. The module is
fully qualified for Bluetooth V2.0 + Enhanced Data Rate (EDR) with 3 Mbps modulation,
and integrates a complete 2.4 GHz radio transceiver and baseband. It uses the CSR
BlueCore04 single-chip Bluetooth system, fabricated in CMOS technology, with the
Adaptive Frequency Hopping (AFH) feature. Interfacing the Bluetooth module with the
PIC microcontroller is shown in Figure 3.7.

Figure 3.6: Bluetooth Module HC-05


Figure 3.7: Interfacing Bluetooth module with PIC microcontroller

3.5 Speech Module

Once the gesture made by a person is matched with the database, the output should be
produced as audible speech. To produce this voice output, an Android mobile phone is
paired with the Bluetooth module (master); an application developed with the
Massachusetts Institute of Technology (MIT) App Inventor converts the information sent
through the Bluetooth module into speech. Since a microcontroller is used, serial
communication is employed, so only two lines (TX and RX) are needed to transmit and
receive information through the Bluetooth module. The layout of MIT App Inventor is
shown in Figure 3.8.

Figure 3.8: MIT App Inventor to develop the code


3.6 Device Control Mode

When the system is used in device control mode, specific gestures are assigned to
control the devices. Bluetooth module 2 receives the information from the PIC
microcontroller, which in turn controls devices such as lights and fans. The
transmitted data is received by a Bluetooth receiver operating at the same frequency
as the transmitter. The received signals are given to the PIC microcontroller, which
compares the received and stored hand gestures, and the resulting signals are given to
the home appliances to control them. The controlled devices are fans and lights,
chosen for compactness of the device design.
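In the appendix firmware, the glove-side controller sends a single command character
over Bluetooth and the receiver maps it to a relay action ('A' = device 1 on, 'B' =
device 1 off, 'C' = device 2 on, 'D' = device 2 off). That mapping can be modelled as
a small decoder; the encoding of the return value is a hypothetical choice for this
sketch.

```c
#include <assert.h>

/* Decode a received command character.
   Returns 10*device + state for a valid command, or -1 if unknown. */
int decode_command(char cmd)
{
    switch (cmd)
    {
        case 'A': return 11;  /* device 1 ON  */
        case 'B': return 10;  /* device 1 OFF */
        case 'C': return 21;  /* device 2 ON  */
        case 'D': return 20;  /* device 2 OFF */
        default:  return -1;  /* not a recognized command */
    }
}
```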

3.7 Relay

A relay, shown in Figure 3.9, is used to control the switching on and off of
appliances. A relay is an electromechanical device triggered by an electrical current:
the current flowing in one circuit can switch ON and OFF the current in another
circuit. Relays are employed in this device to switch the home appliances ON and OFF.

Figure 3.9: Relay circuit


Chapter 4

Software Description

This chapter discusses the software involved in the project, which is needed to convert
hand gestures to speech and control the devices, following the flow shown in Figure 4.1.

The flow in Figure 4.1 can be summarized as the following steps:

1. Start.
2. Inputs are given by making hand gestures while wearing the glove.
3. The values from the flex sensors and accelerometer are fed to the ADC channels of
the PIC microcontroller.
4. The analog input signals are converted to hexadecimal values using the inbuilt ADCs.
5. The hexadecimal value from each ADC channel is compared with predefined values.
6. If no value matches the thresholds, "Normal" is displayed.
7. If a value matches and the gesture selects device control mode, the respective
device is controlled.
8. Otherwise, the corresponding signals are sent to the LCD display and the speech
module simultaneously, and the signal is converted into the speech corresponding to
the gesture.
9. On another gesture, the cycle restarts from the beginning; otherwise, it stops.
Figure 4.1: Flow chart for hand gesture to speech and device control
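The comparison step can be modelled on a PC as follows. The four thresholds (875, 890,
860, 850) are taken from the appendix firmware; each flex reading above its threshold
sets one bit, and the resulting 4-bit code selects the gesture. Only a few of the
firmware's words are shown here; the bitmask encoding itself is an illustrative
restatement of the firmware's if/else chain, not code from the report.

```c
#include <assert.h>
#include <string.h>

/* Thresholds used in the appendix firmware for ADC channels 1..4 */
static const unsigned int TH[4] = {875, 890, 860, 850};

/* Pack the four threshold comparisons into a 4-bit code:
   bit k is set when reading k exceeds its threshold. */
unsigned int gesture_code(const unsigned int adc[4])
{
    unsigned int code = 0, k;
    for (k = 0; k < 4; k++)
        if (adc[k] > TH[k])
            code |= 1u << k;
    return code;
}

/* A few of the firmware's code-to-word mappings (see the appendix) */
const char *gesture_word(unsigned int code)
{
    switch (code)
    {
        case 0x1: return "WATER";      /* only channel 1 bent     */
        case 0x2: return "BREAKFAST";  /* only channel 2 bent     */
        case 0x4: return "LUNCH";      /* only channel 3 bent     */
        case 0x8: return "DINNER";     /* only channel 4 bent     */
        case 0x0: return "NORMAL";     /* nothing above threshold */
        default:  return "OTHER";
    }
}
```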


4.1 MPLAB IDE

MPLAB X IDE is a software program that runs on a PC to develop applications for
Microchip microcontrollers and digital signal controllers. The code for this project is
written in the C language. The software is provided with built-in header files and
source files which give the necessary instructions for the PIC18F44K22 microcontroller
to work. Two programs have been implemented: one for the hand gesture to speech
translation and the other for the device control.

Syntax of Certain Keywords

• Function

type function_name(parameters)
{
    statements;
}

• If

if (CONDITION) // if true, execute the body
{
    do something;
}

• If … else

if (CONDITION) // if true, execute the body; otherwise the else part
{
    do thing A;
}
else
{
    do thing B;
}

• While

while (CONDITION) // repeat the body while the condition is true
{
    do something;
}


Chapter 5

Results

The hardware circuit used for the conversion of hand gestures to speech and device
control is shown in Figure 5.1. The different signs or gestures produce different
combinations of outputs in the form of an LCD display message and speech. The device
control mode is enabled by a special gesture; through this mode, devices such as lights
are controlled as shown in Figure 5.3.

The results of the project are explained in the following sections.

Figure 5.1: Circuit of Hand gesture to speech translation and device control system


The hand gestures used in the system are:

1. NORMAL
2. LUNCH
3. WATER
4. BREAKFAST
5. WASHROOM
6. GOING TO BED
7. BATH
8. BOOKS
9. MEDICINE
10. DINNER
11. COFFEE
12. CLOTHES
13. SAD
14. HAPPY
15. CHANGE MODE TO DEVICE CONTROL
16. DEVICE 1
17. DEVICE 2

Figure 5.2: Hand gestures used in the system

Figure 5.2 shows the hand gestures; the corresponding outputs for some of the gestures
are shown in Figure 5.3.


Figure 5.3: Device control demonstration


Chapter 6
Conclusion and Future Work

6.1 Conclusion
The project on hand gesture to speech translation and device control using a PIC
microcontroller has been implemented to narrow the communication gap between the
deaf-mute community and the rest of the world. The project also controls devices using
hand gestures. Thus, the project helps in meeting the basic needs of the deaf-mute and
bedridden community.

6.2 Future Work


The project can further be extended by implementing more combinations of gestures,
thereby covering more hand gesture operations. The device can also be used in
artificial intelligence to study human movement and present it in augmented reality to
improve the gaming experience, and it can also be used for medical purposes.


References
1. Available: http://www.worldwidescience.org .
2. Soumya Dutta and Bidyut B. Chaudhuri, “A Colour Edge Detection Algorithm in
RGB Colour Space”, IEEE Transactions on Image Processing, Vol. 8, No. 5, pp.
337-340, May 2013.
3. K. C. Shriharipriya and K. Arthy, “Flex Sensor Based Hand Gesture Recognition
System”, International Journal of Innovative Research and Studies (IJIRS),
Vol. 2, Issue 3, pp. 12-14, May 2013.
4. Ramya V and Palaniappan B, “Embedded Home Automation for Visually
Impaired”, International Journal of Computer Applications, Vol. 41, No. 18, pp.
32-38, March 2012.


Appendix

Code used for hand gesture to speech translation

#include <p18f44k22.h>
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#include "ProjectMain.h"
#include "serial_codes.h"
#include "ADC.H"
#include "PinAllocation.h"
#include "LCD.h"
#include "delay.h"
#include "timers.h"
#include "clock_setting.h"

unsigned char guchLCDLine1String[17], guchLCDLine2String[17];
unsigned char uchBuffDisplay[] = "SIGN TO SPEECH\n\r";

#pragma udata udata3
unsigned char uchRecieve150Ch[150], i = 0;

unsigned int ADC_Data_0 = 0;
unsigned int ADC_Data_1 = 0;
unsigned int ADC_Data_2 = 0;
unsigned int ADC_Data_3 = 0;
unsigned int ADC_Data_4 = 0;

void fnMain_Project()
{
    TRISA = SET_OUTPUT;
    TRISB = SET_OUTPUT;
    TRISC = SET_OUTPUT;
    TRISD = SET_OUTPUT;
    Dir_LEDSW = SET_OUTPUT;
    Dir_SW = SET_OUTPUT;

    while (1)
    {
        /* Read the flex-sensor channels through the inbuilt 10-bit ADC */
        ADC_Data_0 = unfnReadADCChannel(1, ADC_10BIT_MODE);
        //printf("\r\n%d\t", ADC_Data_0);
        ADC_Data_1 = unfnReadADCChannel(0, ADC_10BIT_MODE);
        Delay_in_ms(300);
        //printf("%d\t", ADC_Data_1);
        ADC_Data_2 = unfnReadADCChannel(2, ADC_10BIT_MODE);
        Delay_in_ms(300);
        //printf("%d\t", ADC_Data_2);
        ADC_Data_3 = unfnReadADCChannel(3, ADC_10BIT_MODE);
        Delay_in_ms(300);
        //printf("%d\t", ADC_Data_3);
        ADC_Data_4 = unfnReadADCChannel(4, ADC_10BIT_MODE);
        Delay_in_ms(300);
        //printf("%d\r\n", ADC_Data_4);

        /* Match the readings against per-finger thresholds: printf sends the
           word over the UART/Bluetooth link to the speech app, and the same
           word is written to line 1 of the LCD */
        if (ADC_Data_1 > 875 && ADC_Data_2 < 890 && ADC_Data_3 < 860 && ADC_Data_4 < 850)
        {
            printf(" Water");
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, " WATER ");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
        }
        else if (ADC_Data_1 < 875 && ADC_Data_2 > 890 && ADC_Data_3 < 860 && ADC_Data_4 < 850)
        {
            printf(" breakfast");
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, " BREAKFAST ");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
        }
        else if (ADC_Data_1 < 875 && ADC_Data_2 < 890 && ADC_Data_3 > 860 && ADC_Data_4 < 850)
        {
            printf(" lunch\r\n");
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, " LUNCH ");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
        }
        else if (ADC_Data_1 < 875 && ADC_Data_2 < 890 && ADC_Data_3 < 860 && ADC_Data_4 > 850)
        {
            printf(" dinner\r\n");
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, " DINNER ");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
        }
        else if (ADC_Data_1 > 875 && ADC_Data_2 > 890 && ADC_Data_3 < 860 && ADC_Data_4 < 850)
        {
            printf("D");    /* device-control command: device 2 OFF */
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, " DEVICE2 OFF ");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
        }
        else if (ADC_Data_1 < 875 && ADC_Data_2 < 890 && ADC_Data_3 > 860 && ADC_Data_4 > 850)
        {
            printf("C");    /* device-control command: device 2 ON */
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, " DEVICE2 ON ");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
        }
        else if (ADC_Data_1 > 875 && ADC_Data_2 < 890 && ADC_Data_3 < 860 && ADC_Data_4 > 850)
        {
            printf(" clothes");
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, " CLOTHES ");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
        }
        else if (ADC_Data_1 < 875 && ADC_Data_2 > 890 && ADC_Data_3 > 860 && ADC_Data_4 < 850)
        {
            printf(" happy");
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, " HAPPY ");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
        }
        else if (ADC_Data_1 < 875 && ADC_Data_2 > 890 && ADC_Data_3 < 860 && ADC_Data_4 > 850)
        {
            printf(" books");
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, " BOOKS ");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
        }
        else if (ADC_Data_1 < 875 && ADC_Data_2 > 890 && ADC_Data_3 > 860 && ADC_Data_4 > 850)
        {
            printf("A");    /* device-control command: device 1 ON */
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, " DEVICE1 ON ");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
        }
        else if (ADC_Data_1 > 875 && ADC_Data_2 > 890 && ADC_Data_3 > 860 && ADC_Data_4 > 850)
        {
            printf("B");    /* device-control command: device 1 OFF */
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, " DEVICE1 OFF ");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
        }
        else if (ADC_Data_1 > 875 && ADC_Data_2 < 890 && ADC_Data_3 > 860 && ADC_Data_4 > 850)
        {
            printf(" sad\r\n");
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, " SAD ");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
        }
        else if (ADC_Data_1 > 875 && ADC_Data_2 < 890 && ADC_Data_3 > 860 && ADC_Data_4 < 850)
        {
            printf(" going to bed\r\n");
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, " GNG TO BED ");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
        }
        else if (ADC_Data_1 > 875 && ADC_Data_2 < 890 && ADC_Data_3 < 860 && ADC_Data_4 > 850)
        {
            /* note: this condition duplicates the CLOTHES branch above,
               so this branch is never reached */
            printf(" bath\r\n");
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, " BATH ");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
        }
        else if (ADC_Data_1 > 875 && ADC_Data_2 > 890 && ADC_Data_3 < 860 && ADC_Data_4 > 850)
        {
            printf(" sorry");
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, " SORRY ");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
        }
        else
        {
            /* no gesture recognized */
            printf(" ");
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, " NORMAL ");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
        }
    }
}

void fn_clear_display(void)
{
    uch_Lcd_Data = 0x01;    /* clear-display command */
    lcd_write_command();
    Delay_in_ms(10);
}

void fn_lcd_select_line_and_location(char ch_line_no, char ch_location_no)
{
    if (ch_line_no == LCD_LINE_2)
        uch_Lcd_Data = 0xC0 + ch_location_no;   /* DDRAM address, line 2 */
    else
        uch_Lcd_Data = 0x80 + ch_location_no;   /* DDRAM address, line 1 */
    lcd_write_command();
    Delay_in_ms(10);
}

void fn_Display_String_LCD(unsigned char *generic_ptr)
{
    while (*generic_ptr)
    {
        uch_Lcd_Data = *generic_ptr;
        lcd_write_data();
        generic_ptr++;
    }
}

void delay_in_seconds(unsigned char uch_Count)
{
    unsigned int i;
    for (i = 0; i < uch_Count; i++)
        delay_in_half_seconds(2);
}

Code used for device control

#include <p18f44k22.h>
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#include "ProjectMain.h"
#include "serial_codes.h"
#include "ADC.H"
#include "PinAllocation.h"
#include "LCD.h"
#include "delay.h"
#include "timers.h"

unsigned char guchLCDLine1String[17], guchLCDLine2String[17];
unsigned char uchBuffDisplay[] = "DEVICE CONTROL\n\r";

#pragma udata udata3
unsigned char uchRecieve150Ch[150], i = 0, j = 0;
unsigned char dollar = 0;
unsigned char uchNumber1;

void fnMain_Project()
{
    /* RA0/RA1 are digital outputs driving the relays; the remaining
       port A pins are configured as digital inputs */
    ANSELAbits.ANSA0 = 0;
    ANSELAbits.ANSA1 = 0;
    ANSELAbits.ANSA5 = 0;
    ANSELAbits.ANSA2 = 0;
    ANSELAbits.ANSA3 = 0;
    TRISAbits.RA0 = 0;
    TRISAbits.RA1 = 0;
    TRISAbits.TRISA3 = 1;
    TRISAbits.TRISA4 = 1;
    TRISAbits.TRISA5 = 1;
    TRISAbits.TRISA6 = 1;
    TRISAbits.TRISA7 = 1;
    /* PORTAbits.RA0 = 1;
       PORTAbits.RA1 = 1; */
    PORTAbits.RA2 = 1;
    PORTAbits.RA3 = 1;

    while (1)
    {
        /* Wait for a command character ('A'..'D') from the gesture glove */
        dollar = uchfnReceive_Serial(UART1);
        //printf("%c", dollar);
        LATA = 0x00;

        if (dollar == 'A')
        {
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, "DEVICE1 ON ");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
            LATAbits.LATA0 = 1;     /* energize relay 1 */
        }
        if (dollar == 'B')
        {
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, "DEVICE1 OFF");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
            LATAbits.LATA0 = 0;     /* de-energize relay 1 */
        }
        if (dollar == 'C')
        {
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, "DEVICE2 ON ");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
            LATAbits.LATA1 = 1;     /* energize relay 2 */
        }
        if (dollar == 'D')
        {
            fn_clear_display();
            strcpypgm2ram(guchLCDLine1String, "DEVICE2 OFF");
            fn_lcd_select_line_and_location(LCD_LINE_1, LOCATION_0);
            fn_Display_String_LCD(guchLCDLine1String);
            delay_in_seconds(1);
            LATAbits.LATA1 = 0;     /* de-energize relay 2 */
        }
    }
}
