Virtual Mouse Control Using Hand Class Gesture: Bachelor of Engineering Electronics and Telecommunication


A PROJECT STAGE-II REPORT

ON

“Virtual Mouse Control Using Hand Class Gesture”


SUBMITTED TOWARDS THE PARTIAL FULFILLMENT
OF THE REQUIREMENTS OF DEGREE OF

BACHELOR OF ENGINEERING
IN
ELECTRONICS AND TELECOMMUNICATION
BY
Student Name Exam Seat No
Gaurav Munjewar 71804123C
Geetesh G Kongre 71726431K

UNDER THE GUIDANCE OF


PROF. S. P. DESHMUKH

DEPARTMENT OF ELECTRONICS AND TELECOMMUNICATION


STES’s NBN SINHGAD TECHNICAL INSTITUTES CAMPUS
NBN SINHGAD SCHOOL OF ENGINEERING
AMBEGAON (BK),
PUNE – 411041
2021-22
NBN SINHGAD TECHNICAL INSTITUTES CAMPUS
NBN SINHGAD SCHOOL OF ENGINEERING, PUNE
DEPARTMENT OF ELECTRONICS AND TELECOMMUNICATION ENGINEERING

CERTIFICATE
This is to certify that the Project Stage-II entitled

“Virtual Mouse Control Using Hand Class Gesture”

Submitted by,

Student Name Exam Seat No


Gaurav Munjewar 71804123C
Geetesh G Kongre 71726431K

towards the partial fulfilment of the degree of Bachelor of Engineering in Electronics and
Telecommunication as awarded by the Savitribai Phule Pune University, at NBN Sinhgad
School of Engineering.

Prof. S. P. Deshmukh Dr. Makarand M. Jadhav Dr. Shivprasad P. Patil


Guide Head of Department Principal

ACKNOWLEDGEMENT
I am thankful to Prof. Sunita P. Deshmukh, Project Guide, for her co-operation
and suggestions. She made valuable suggestions and gave guidance throughout
the completion of my project work. I am also thankful to Prof. S. Y. Tamboli,
Project Co-ordinator, for his help and support.
I wish to express deep heartfelt thanks to the Head of Department, Dr.
Makarand M. Jadhav, for giving valuable guidance and direction to my project
work. He patiently provided encouragement and motivation; his comments and
precious suggestions made the entire journey to the final accomplishment of
this work truly pleasant.
I am also thankful to Dr. Shivprasad P. Patil, Principal, NBN Sinhgad
School of Engineering, Pune for providing excellent facilities in the college
to carry out this project work.
I wish to thank all the faculty and staff members of department for their
sincere co-operation, kind blessings and warm well-wishes.

Gaurav Munjewar
Geetesh Kongre


ABSTRACT
Natural user interfaces have become increasingly important in today's world, owing to
advances in ubiquitous computing. The presence of computers and the use of human-
computer interaction tools in our society will undoubtedly have a positive impact on our
civilizations. Whether it was back in the day when technology was less advanced or
today when technology has advanced so much that we spend the majority of our time
communicating, playing, doing our jobs with machines, and so on, human beings have
used and continue to use a wide range of gestures to communicate and interact with one
another. Human gestures are a type of nonverbal communication that can be used to
convey information. These vision-based recognized gestures can be used to control
multimedia applications (such as Windows Media Player, Windows Picture Manager, or
VLC Player) running on a computer using different gestural commands.


CONTENTS

CH. NO  TITLE

        ACKNOWLEDGEMENT
        ABSTRACT
        CONTENTS

1  INTRODUCTION
   1.1 Background
   1.2 Objectives
   1.3 Problem Statement
2  LITERATURE SURVEY
   2.1 Review
   2.2 Summary of Review
3  PROPOSED METHODOLOGY
   3.1 Introduction
   3.2 Block Diagram
   3.3 Hardware Specifications
   3.4 Software Specifications
4  HARDWARE IMPLEMENTATION
   4.1 Introduction
   4.2 Working of Blocks
   4.3 Experimental Setup
5  SOFTWARE IMPLEMENTATION
   5.1 Introduction
   5.2 Algorithm
   5.3 Flowchart (Working of Modules)
6  RESULTS & DISCUSSION
7  CONCLUSION & FUTURE SCOPE
8  REFERENCES


CHAPTER 1

INTRODUCTION


1.1 Background

Hand gestures are the most effective and expressive form of human communication, and
they are a widely understood language. It is expressive enough for deaf and mute people
to comprehend. A real-time hand gesture system is proposed in this report. The
system's experimental setup includes a fixed-position low-cost web camera with high-
definition recording capability mounted on the top of a computer monitor or a fixed
camera on a laptop, which collects snapshots in the Red Green Blue [RGB] colour space
from a fixed distance. Image pre-processing, region extraction, feature extraction, and
feature matching are the four stages of this project. Recognition and interpretation of
sign language is one of the major challenges in communicating with deaf and mute
people. Based on pre-processing, background subtraction, and edge detection approaches,
an effective hand gesture segmentation technique has been proposed in this study. Pre-
processing is the process of preparing data for a subsequent process. The major goal of
the pre-processing step is to convert the data into a format that can be processed more
quickly and easily. In the proposed work, pre-processing strategies are built from
combinations of hand-motion image processing operations such as image capture, noise
removal, background subtraction, and edge detection, and these image processing
methods are addressed in turn.

Within the past few years, as computer technology continues to develop, people want
more compact electronic devices. Human-Computer Interaction (HCI), particularly
gesture recognition and object recognition, is becoming increasingly important. In our
project, we introduce a method for controlling the mouse system using a video device
(Mouse tasks). In today's world, most cell phones communicate with the user via touch
screen technology. However, this technology is still prohibitively expensive for use on
desktops and laptop computers. Generally, a gesture is a symbol of physical or
emotional behaviour. It consists of both body and hand gestures. Gestures can be used to
communicate between humans and computers. Human-computer interaction (HCI) began
in the early 1980s as a field of study and practice. The name "virtual mouse" conveys a
clear idea about our project. The virtual mouse establishes a virtual connection between
the user and the machine without the use of any hardware. This gesture recognition
system can capture and track the fingertips with a webcam, and the system detects the
hand's colour and movements and moves the cursor along with it.

1.2 Objectives
This Virtual Mouse hand-recognition application uses a simple colour cap on the finger
to control the cursor with basic motions, without the need for extra hardware. This is
done using vision-based hand gesture recognition with input from a webcam.
The purpose of this project is to develop a Virtual Mouse application that targets a few
aspects of significant development. First, this project aims to eliminate the need for a
physical mouse while still allowing interaction with the computer system through a
webcam, using various image processing techniques. In addition, this project aims to
develop a Virtual Mouse application that can operate on all kinds of surfaces and
environments.
The following describes the overall objectives of this project:
• To design the application to operate with the help of a webcam. The Virtual Mouse
application will be operational with the help of a webcam, as the webcam is responsible
for capturing images in real time. The application will not work if no webcam is detected.
• To design a virtual input that can operate on all surfaces. The Virtual Mouse application
will be operational on any surface and in indoor environments, as long as the user faces
the webcam while performing the motion gestures.
• To program the camera to continuously capture images, which are then analysed using
various image processing techniques. As stated above, the Virtual Mouse application
continuously captures images in real time, and each image undergoes a series of
processes, including HSV conversion, binary image conversion, salt-and-pepper noise
filtering, and more.


• To transform a hand gesture or motion into mouse input for a certain screen position.
The Virtual Mouse application will be programmed to detect the positions of the defined
colours, which are set as the positions of the mouse pointer. Furthermore, combinations
of different colours may trigger different types of mouse events, such as right/left clicks,
scroll up/down, and more.
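The last objective, turning a position detected in the camera frame into a screen position for the cursor, can be sketched in plain Python. This is an illustrative helper, not the project's actual code; the active-region margin and the linear interpolation are assumed design choices.

```python
def frame_to_screen(x, y, frame_w, frame_h, screen_w, screen_h, margin=100):
    """Map a point detected in the camera frame to screen coordinates.

    A margin (assumed value) is inset around the frame so the user can
    reach the screen edges without leaving the camera's field of view.
    """
    # Clamp the detected point into the active region.
    x = min(max(x, margin), frame_w - margin)
    y = min(max(y, margin), frame_h - margin)
    # Linearly interpolate the active region onto the full screen.
    sx = (x - margin) * screen_w / (frame_w - 2 * margin)
    sy = (y - margin) * screen_h / (frame_h - 2 * margin)
    return sx, sy

# A fingertip at the centre of a 640x480 frame lands at the centre
# of a 1920x1080 screen.
print(frame_to_screen(320, 240, 640, 480, 1920, 1080))  # (960.0, 540.0)
```

The clamping step means a hand near the edge of the webcam's view still maps to a valid screen coordinate instead of jumping off-screen.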

1.3 Problem Statement

The proposed AI virtual mouse system can be used to solve problems in the real world,
such as instances where there isn't enough space to use a physical mouse or for people
with hand problems who can't use a physical mouse. Also, amid the COVID-19
situation, it is not safe to use devices by touching them, because doing so may spread
the virus. The proposed AI virtual mouse can overcome these problems, since hand
gesture and hand-tip detection through a webcam or built-in camera is used to control
the PC mouse functions.


CHAPTER 2

LITERATURE SURVEY


2.1 Literature Review

The current system comprises a generic mouse and trackpad for monitor control, without
any hand gesture control system. Using a hand gesture to access the monitor screen from
a distance is not possible; even where implementation has been attempted, the scope in
the virtual mouse field remains limited.
The existing virtual mouse control system consists of simple mouse operations using a
hand recognition system, through which we can control the mouse pointer, perform left
clicks, and so on. Although there are a variety of systems for hand recognition, most use
static hand recognition, which simply recognises the shape made by the hand and defines
an action for each shape. This is limited to a few defined actions and causes a lot of
confusion. As technology advances, there are more and more alternatives to using a
mouse.
The following are some of the techniques that have been employed.
A) Head Control: A special sensor (or built-in webcam) can track head movement to
move the mouse pointer around on the screen. The dwell delay function of the
software is frequently employed in the absence of a mouse button. Clicking can
also be accomplished with a well-placed switch.
B) Eye Control: The cost of modern eye gaze systems is decreasing. These enable
users to move the pointer on the screen solely by moving their eyes. Instead of
mouse buttons, a dwell delay feature, blinks, or a switch are used. The Tobii
PCEye Go is a peripheral eye tracker that lets you use your eyes to control your
computer as if you were using a mouse.
C) Touch Screens: Touch screens, which were once seen as a niche technology used
primarily in special education schools, have now become mainstream. Following
the success of smartphones and tablets, touch-enabled Windows laptops and all-
in-one desktops are becoming more common. Although this is a welcome new
technology, the widespread use of touch screens has resulted in a new set of touch
accessibility issues. However, each of the methods above has its own set of
disadvantages. Regular use of the head or eyes to control the cursor can be
hazardous to one's health, and when using a touch screen the user must maintain
focus on the screen at all times, which can cause fatigue. By comparing these
techniques, we hope to create a new system that does not harm the user's health.

2.2 Summary of Review

We created our own dataset to implement a hand gesture recognition (HGR) system for
multimedia interaction. Each video stream, approximately 10 seconds long, was recorded
at 30 frames per second and at a resolution of 1280×720 using an 8-megapixel digital
camera. We performed the experiments in three different sessions, classified by the
distances and places at which the images (i.e. frames) were taken from the recorded
video streams. Each session consists of 300 images of 6 different classes, with each class
having 50 images. Samples of images from the different sessions are used to calculate
the hand gesture recognition accuracy. The whole system is designed using image
processing and computer vision techniques implemented in MATLAB 2013 under the
Windows 8 operating system. The hardware used for the hand gesture recognition
system was a 64-bit PC with a 2.40 GHz processor and 2 GB of RAM.


CHAPTER 3

PROPOSED
METHODOLOGY

3.1 Introduction

Hand gestures are the most effective and expressive form of human communication, and
they are a widely understood language. It is expressive enough for deaf and mute people
to comprehend. A real-time hand gesture system is proposed in this report. The
system's experimental setup includes a fixed-position low-cost web camera with high-
definition recording capability mounted on the top of a computer monitor or a fixed
camera on a laptop, which collects snapshots in the Red Green Blue [RGB] colour space
from a fixed distance. Image pre-processing, region extraction, feature extraction, and
feature matching are the four stages of this project. One of the primary challenges in
communicating with deaf and mute people is the recognition and interpretation of sign
language. Within the past few years, as computer technology continues to develop,
people want more compact electronic devices. Human-Computer Interaction (HCI),
particularly gesture recognition and object recognition, is becoming increasingly
important. In our project, we introduce a method for controlling the mouse system using a
video device (Mouse tasks). In today's world, most cell phones communicate with the
user via touch screen technology. However, this technology is still prohibitively
expensive for use on desktops and laptop computers. Generally, a gesture is a symbol for
physical or emotional behaviour. It consists of both body and hand gestures. Gestures can
be used to communicate between humans and computers. Human-computer interaction
(HCI) began in the early 1980s as a field of study and practice. The name "virtual mouse"
conveys a clear idea about our project. The virtual mouse establishes a virtual connection
between the user and the machine without the use of any hardware. This gesture
recognition system can capture and track the fingertips of a person wearing a colour cap
with a webcam, and the system detects the hand's colour and movements and moves the
cursor along with it.

In computing terms, a mouse is a pointing device that recognises two-dimensional
movement relative to a surface. This movement is translated into the movement of a
pointer on a screen, allowing the user to manage the Graphical User Interface (GUI) on a
computer platform. There are several other types of mice that have already existed in
modern technology, such as the mechanical mouse, which uses a hard rubber ball that
rolls around when the mouse is moved to identify movement. Years later, the optical
mouse was produced, which replaced the hard rubber ball with an LED sensor that senses
tabletop movement and delivers the data to the computer for processing. The laser mouse
was first introduced in the year 2004.

3.2 Block Diagram


Fig 3.2 (a): Block diagram of the proposed virtual mouse system.

3.3 Hardware Specifications

The following describes the hardware needed in order to execute and develop the Virtual
Mouse application:

• Computer Desktop or Laptop


The computer desktop or a laptop will be used to run the vision software and display
what the webcam has captured. A notebook, which is a small, lightweight and
inexpensive laptop computer, is proposed to increase mobility.

The system will use:

Processor: Core 2 Duo

Main Memory (RAM): 4 GB

Hard Disk: 320 GB

Display: 14" Monitor

• Webcam: used for image processing; it continuously captures images so the program
can process each image and find the pixel positions.

3.4 Software Specifications

The following describes the software needed in-order to develop the Virtual Mouse
application:

• PyCharm

Modules used in PyCharm:

• OpenCV Library:

OpenCV (Open Source Computer Vision) is a library of real-time computer vision
programming tools, and it was used in the development of this software. OpenCV
provides a utility for reading image pixel values, as well as the ability to perform
real-time tracking tasks such as eye tracking and blink detection.
The software environment: OS: Windows 7 Ultimate 64-bit; language: Python.

• NumPy Library:

NumPy, short for Numerical Python, is a library that provides multidimensional
array objects and a collection of functions for manipulating them. NumPy
allows you to perform logical and mathematical operations on arrays.


• MediaPipe Library:

As a prebuilt Python package, MediaPipe provides ready-to-use but
configurable Python solutions. The MediaPipe Python package is available on
PyPI for Linux, macOS and Windows.

• AutoPy Library:

AutoPy is a simple, cross-platform GUI automation library for Python. It
provides controls for the keyboard and mouse, as well as functions for finding
colours and bitmaps on the screen and displaying alerts. It is currently
supported on macOS and Windows.

• Time Library:

The Python time module offers a variety of ways to represent time in code,
including objects, integers, and strings. It also provides functionality beyond
representing time, such as waiting during code execution and measuring the
efficiency of your code.
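As a small illustration of the time module's role described above (for example, measuring how long a processing step takes), the following is a minimal stdlib-only sketch; the frame-processing function is a stand-in, not project code.

```python
import time

def process_frame():
    # Stand-in for the per-frame image processing work.
    time.sleep(0.01)

# time.perf_counter() gives a high-resolution clock suitable for
# measuring short durations such as one frame of processing.
start = time.perf_counter()
process_frame()
elapsed = time.perf_counter() - start
print(f"frame took {elapsed * 1000:.1f} ms")
```

A loop built this way can also cap the effective frame rate by sleeping for the remainder of a frame budget.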


CHAPTER 4

HARDWARE
IMPLEMENTATION

4.1 Introduction

In our project, we introduce a method for controlling the mouse system using a video
device (Mouse tasks). In today's world, most cell phones communicate with the user via
touch screen technology. However, this technology is still prohibitively expensive for use
on desktops and laptop computers. Generally, a gesture is a symbol of physical or
emotional behaviour. It consists of both body and hand gestures. Gestures can be used to
communicate between humans and computers. Human-computer interaction (HCI) began
in the early 1980s as a field of study and practice. The name "virtual mouse" conveys a
clear idea about our project. The virtual mouse establishes a virtual connection between
the user and the machine without the use of any hardware. This gesture recognition
system can capture and track the fingertips of a person wearing a colour cap with a
webcam, and the system detects the hand's colour and movements and moves the cursor
along with it.

4.2 Working of Blocks

1. Capturing real-time video using a web camera: The system needs a sensor to detect the
user's hand movements, and the computer's webcam serves as that sensor. The webcam
records real-time video at a fixed frame rate and resolution determined by the camera's
hardware; if necessary, the system's frame rate and resolution can be changed.

2. Converting the captured video into HSV format: The video is converted into HSV
(hue, saturation, value; also called HSB), an alternative representation of the RGB colour
model created by computer graphics researchers to better reflect how human vision
perceives colour attributes.

3. Each image frame is processed separately: After the video is captured, it goes through
a brief pre-processing stage before being processed one frame at a time.

4. Conversion of each frame to a greyscale image: A greyscale image has lower
computational complexity than a colour image, and it aids faster colour calibration
without external noise. All necessary operations are carried out after the image is
converted to greyscale.

21
“Virtual Mouse Control Using Hand Class Gesture”

5. Calibrating the colour ranges: After the above steps, the device enters calibration
mode, which assigns each colour its hue, saturation and value ranges according to the
HSV rule. Every colour has predetermined values, and the user can adjust the ranges for
accurate colour detection. The diagram below clearly shows the range of values used to
detect each colour cap.

6. Calculating the image's centroid by locating the detected region: To guide the mouse
pointer, a point must be chosen whose coordinates can be sent to the cursor. The device
monitors cursor movement using these coordinates, which change over time as the object
moves around the frame.
7. Tracking the mouse pointer: After the coordinates are determined, the mouse driver is
accessed and the coordinates are sent to the cursor. The cursor positions itself at the
required position using these coordinates, so the mouse moves proportionally across the
screen as the user moves a hand across the camera's field of view.

8. Simulating the mouse actions: In simulation mode, the user makes hand gestures to
create the control actions. Because colour pointers are used, the computation time is
reduced.
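Steps 2, 5 and 6 above (HSV conversion, colour-range thresholding and centroid computation) can be illustrated with a small stdlib-only sketch. In the real system the frames would come from OpenCV; here a tiny synthetic frame of RGB pixels stands in, and the hue range for the colour cap is an assumed example value.

```python
import colorsys

def find_cap_centroid(frame, hue_lo, hue_hi, sat_min=0.4):
    """Return the centroid (x, y) of pixels whose HSV hue is in range.

    frame: 2D list of (r, g, b) tuples with components in 0..255.
    """
    xs, ys, n = 0, 0, 0
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
            if hue_lo <= h <= hue_hi and s >= sat_min:   # step 5: colour range
                xs += x; ys += y; n += 1                 # step 6: accumulate region
    if n == 0:
        return None
    return xs / n, ys / n   # centroid of the detected colour-cap region

# 3x3 synthetic frame: a green "cap" region in the lower-right corner.
G, K = (0, 255, 0), (0, 0, 0)
frame = [[K, K, K],
         [K, G, G],
         [K, G, G]]
print(find_cap_centroid(frame, 0.25, 0.42))  # (1.5, 1.5)
```

The returned centroid is the point that step 7 would hand to the mouse driver after mapping it to screen coordinates.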

4.3 Experimental setup

For the implementation of the hand gesture recognition (HGR) system for multimedia
interaction, we built our own dataset. Each video stream, approximately 10 seconds long,
was recorded at 30 frames per second and at a resolution of 1280×720 using an
8-megapixel digital camera. The experiments were carried out in three different sessions,
classified by the distances and places at which the images (i.e. frames) were taken from
the recorded video streams. Each session consists of 300 images of 6 different classes,
with each class having 50 images. Samples of the images used in the different sessions
form the basis on which we calculate the hand gesture recognition accuracies. The whole
system is designed using image processing and computer vision techniques implemented
in MATLAB 2013 under the Windows 8 operating system. The hardware used for the
hand gesture recognition system was a 64-bit PC with a 2.40 GHz processor and 2 GB of
RAM.


CHAPTER 5

SOFTWARE
IMPLEMENTATION

5.1 Introduction


We conceived and designed a system for vision-based hand gesture identification
comprising several stages, which we explain using an algorithm. The flowchart of
hand gesture recognition (Figure 5.3 (a)) shows the functioning of the gesture
recognition system.

5.2 Algorithm
1. Extract a frame (i.e. a hand image) from the recorded video stream.
2. Transform the extracted frame from the RGB colour space to the YCbCr colour space
model [15], then detect the hand in the image using skin-colour-based detection
techniques [15].
3. After detecting the hand, convert the image to black and white, i.e. mark the skin
pixels as white and the non-skin (background) pixels as black, then apply pre-processing
techniques such as image filling and morphological erosion with a 15×15 structuring
element to increase the image quality and remove noise.
4. For feature extraction, find the centroid, equivalent diameter, area, perimeter and
orientation of the detected objects in the frame. With the help of the centroid and
diameter, a circle is drawn in the same colour as the background pixels.
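Step 2's skin detection can be sketched per pixel using the standard BT.601 RGB-to-YCbCr conversion. The Cb/Cr skin range below is a commonly cited illustrative threshold from the skin-segmentation literature, not necessarily the exact range used in this project.

```python
def rgb_to_ycbcr(r, g, b):
    """Standard ITU-R BT.601 full-range RGB -> YCbCr conversion."""
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b):
    # Chrominance range frequently used for skin; illustrative only.
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return 77 <= cb <= 127 and 133 <= cr <= 173

# Step 3 would mark skin pixels white and background pixels black:
print(is_skin(200, 140, 110))  # typical skin tone -> True
print(is_skin(0, 0, 255))      # pure blue -> False
```

Working in the Cb/Cr plane makes the classifier largely independent of brightness (Y), which is why the YCbCr space is favoured for skin detection over raw RGB.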

5.3 Flowchart (Working of Modules)

For the implementation of the hand gesture recognition (HGR) system for multimedia
interaction, we built our own dataset. Each video stream, approximately 10 seconds
long, was recorded at 30 frames per second and at a resolution of 1280×720 using an
8-megapixel digital camera. We performed the experiments in three different sessions,
classified based on the images (i.e. frames) extracted from the recorded video streams
at different distances and positions.


Figure 5.3 (a): Flowchart of hand gesture recognition.

Each session consists of 300 images of 6 different classes, with each class having 50
images. Samples of images from the different sessions form the basis on which we
calculate the hand gesture recognition accuracies. The whole system is designed using
image processing and computer vision techniques implemented in MATLAB 2013 under
the Windows 8 operating system. We used a 64-bit computer with a 2.40 GHz processor
and 2 GB of RAM as the hardware for the hand gesture recognition system.


CHAPTER 6

RESULTS AND
DISCUSSION


6.1 Results and Discussion

After repeated practical testing, we found that the system performs exceptionally well
under adequate illumination and against a plain background devoid of skin-like objects.
The method is not particularly robust, because it struggles to distinguish the hand against
a complicated background. The system's performance is also influenced by the threshold
value used to determine the radius under various feasible approximations. Nonetheless,
the system is somewhat more responsive than previously built systems, because it does
not require a training period
for gesture recognition. We executed the HGR system for each session and recorded its
accuracy. After performing the experiments, we can see that the overall accuracy of the
system is 95.44%, which is a very good result. The minimum accuracy was achieved by
the class 3 gesture in session 3, owing to the gesture's shape and position; in session 1
the results were also unsatisfactory for some classes for the same reason. The threshold
value we chose is critical for recognising motions. After working with the system on a
larger number of images, we can choose a reasonable threshold value to improve the
system's performance.

Fig 6.2{1}: Static hand gestures in session 1 (recorded at a distance of approx. 16 cm)

Fig 6.2 (a) Fig 6.2(b) Fig 6.2(c) Fig 6.2 (d)


Fig 6.2{2}: Static hand gestures in session 2 (recorded at a distance of approx. 50 cm)

Fig 6.2(a) Fig 6.2(b) Fig 6.2(c) Fig 6.2(d)

We compared our system to others and found that ours had a slightly higher recognition
rate. The good thing about our system is that it does not require any training or
classification, which are very time-consuming tasks. There were a few instances where
our system struggled to recognise particular movements.

We can see from the figures above how difficult it was for our algorithm to
recognise certain gestures. The system correctly identified the first three gestures,
but the number 4 gesture was not detected due to the bad lighting conditions.


CHAPTER 7

CONCLUSION &
FUTURE SCOPE


7.1 Conclusion and Future Scope

The AI virtual mouse system's main goal is to control mouse cursor functionalities with
hand gestures rather than utilising a hardware mouse. The proposed system can be
achieved by using a webcam or a built-in camera which detects the hand gestures and
hand tip and processes these frames to perform the particular mouse functions.

We can conclude from the model's results that the suggested AI virtual mouse system
worked extremely well and has a higher accuracy than existing models, and that the
model also solves most of the constraints of existing systems. The AI virtual mouse can
be utilised for real-world applications since the suggested model is more accurate, and it
can also be used to minimise the spread of COVID-19 because the proposed mouse
system can be operated virtually using hand movements rather than the standard physical
mouse. The model has certain flaws, such as a slight loss of accuracy when using the
right-click mouse function and some difficulty selecting text by clicking and dragging.
As a result, we'll go to work.

The proposed AI virtual mouse has various flaws, such as a slight loss of accuracy
when using the right-click mouse function, and the model has some difficulty selecting
text by clicking and dragging. These are some of the suggested AI virtual mouse system's
shortcomings, which will be addressed in future research.

In addition, the proposed system can be improved to handle virtual keyboard and
mouse functionality, which is another future use of Human-Computer Interaction (HCI).


REFERENCES


[1] S. S. Rautaray and A. Agrawal, Vision Based Hand Gesture Recognition for Human
Computer Interaction: A Survey, Springer Artificial Intelligence Review, pp. 1–54,
(2012).
[2] P. Payeur, C. Pasca, A. Cretu and E. M. Petriu, Intelligent Haptic Sensor System for
Robotic Manipulation, IEEE Transactions on Instrumentation and Measurement, vol.
54(4), pp. 1583–1592, (2005).
[3] S. Meena, A Study on Hand Gesture Recognition Technique, Master's Thesis,
Department of Electronics and Communication Engineering, National Institute of
Technology, India, (2011).
[4] M. M. Hasan and P. K. Mishra, Hand Gesture Modelling and Recognition using
Geometric Features: A Review, Canadian Journal on Image Processing and Computer
Vision, vol. 3(1), pp. 12–26, (2012).
[5] B. A. Myers, A Brief History of Human Computer Interaction Technology, ACM
Interactions, vol. 5(2), pp. 44–54, (1998).
[6] A. Malima, E. Özgür and M. Çetin, A Fast Algorithm for Vision-Based Hand
Gesture Recognition for Robot Control, IEEE Signal Processing and Communications
Applications, pp. 1–4, (2006).
[7] Z. Xu et al., Hand Gesture Recognition and Virtual Game Control Based on 3D
Accelerometer and EMG Sensors, In Proceedings of IUI'09, pp. 401–406, (2009).
[8] S. Mitra and T. Acharya, Gesture Recognition: A Survey, IEEE Transactions on
Systems, Man and Cybernetics, Part C: Applications and Reviews, vol. 37(3), pp.
311–324, (2007).
[9] N. X. Tran, Wireless Data Glove for Gesture-Based Robotic Control, 13th
International Conference on HCI, vol. 5611, pp. 271–280, (2009).
[10] J. M. Rehg and T. Kanade, Visual Tracking of High DOF Articulated Structures:
An Application to Human Hand Tracking, 3rd European Conference on Computer
Vision, pp. 35–46, (1994).
[11] Murthy and Jadon, Hand Gesture Recognition using Neural Networks, In 2nd IEEE
International Advance Computing Conference (IACC), pp. 134–138, (2010).
[12] S. K. Kang, M. Y. Nam and P. K. Rhee, Colour Based Hand and Finger Detection
Technology for User Interaction, IEEE International Conference on Convergence and
Hybrid Information Technology, pp. 229–236, (2008).
[13] N. H. Dardas and N. D. Georganas, Real-Time Hand Gesture Detection and
Recognition using Bag-of-Features and Support Vector Machine Techniques, IEEE
Transactions on Instrumentation and Measurement, vol. 60(11), pp. 3592–3607,
November (2011).
[14] S. S. Rautaray and A. Agrawal, A Novel Human Computer Interface Based on
Hand Gesture Recognition using Computer Vision Techniques, In Proceedings of ACM
IITM'10, pp. 292–296, (2010).
[15] N. A. Ibraheem, R. Z. Khan and M. M. Hasan, Comparative Study of Skin Colour
Based Segmentation Techniques, International Journal of Applied Information Systems
(IJAIS), vol. 5(10), pp. 24–38, (2013).

