Virtual Whiteboard Using Hand Gestures


VIRTUAL WHITEBOARD

USING HAND GESTURES

GUIDED BY:
MS SHINI K, ASSISTANT PROFESSOR, DEPARTMENT OF IT

GROUP MEMBERS:
AKSHAY K P (MES20IT002)
ANJANA K S (MES20IT004)
ASNA BINZY (MES20IT006)
MOHAMMED AFEEF (MES20IT015)
ZAYED MUHAMMED (MES20IT029)
CONTENTS
01 OBJECTIVES
02 INTRODUCTION
03 LITERATURE SURVEY
04 RESOURCES
05 PROJECT PLANNING
06 PROJECT SCHEDULE
07 TASK ALLOCATION
08 PROJECT PROGRESS
09 SYSTEM ARCHITECTURE
10 ACTIVITY DIAGRAM
11 IMPLEMENTATION AND RESULTS
12 CONCLUSION
13 REFERENCES

OBJECTIVES

▪ Implement a hand gesture recognition system.

▪ Seamlessly integrate hand gestures with mouse control.

▪ Provide a diverse set of annotation tools and functionalities to accommodate varied annotation needs.

▪ Enable mouse operations for selecting and manipulating objects.



INTRODUCTION

• In the fast-emerging era of new technologies, integrating advanced systems into existing ones can mitigate the drawbacks of conventional models.

Existing Methods
• The current method captures finger movements.
• Works as a virtual whiteboard that can be controlled using hand gestures.
• A black screen, created as a 'canvas', is where the results are displayed.

Disadvantages
• Real-time interaction is difficult.
• Depends on a separate canvas for output.

Proposed Method
• Incorporates hand gesture recognition into the conventional whiteboard model.
• The proposed model recognizes hand gestures in live images and identifies them using image processing techniques.
• Integrates the mouse pointer with the whiteboard.

LITERATURE SURVEY
1. Hand Gesture Recognition for Human-Computer Interaction through KNN Algorithm and MediaPipe [2]
AUTHORS: Shaik Sai Rohit, Raunak Kandoi, Sandeep Kumar (April 2023)

• Employs a new model to improve human-computer interaction with the help of a webcam.
• A menu is displayed to select the desired action.
• Sign language gestures are used to select a menu action.
• Uses Google's MediaPipe framework to detect the hand.
• Uses the KNN algorithm for gesture classification.

ADVANTAGES
• Accuracy of 95.7% on palm detection.
• Recognizes gestures without any extra equipment, against any sort of background.

DISADVANTAGES
• Sensitive to the tone of the fingers.
• The distance between the camera and the hand must stay within the same range.
• Very time-consuming.

2. Augmented Reality Based Gesture Controlled Virtual Board [3]
AUTHORS: N. Santhiyakumari, Jeevaa D, Gowri Shankar S, Mohit Rakesh Taparia, Neavil Porus (2022)

• Developed to provide a blended virtual board for a dynamic workspace and improved interaction among users.
• Uses the concepts of gesture recognition, augmented reality, and computer vision.
• Applies image analysis techniques such as real-time pre-processing, fingertip detection, and skin segmentation.

ADVANTAGES
• Works with 98% accuracy.
• Recognizes multiple sign gestures simultaneously without any delay.

DISADVANTAGES
• A distance greater than 20m leads to inaccuracy.
• Quick movements cannot be captured by the webcam.

3. Hand Gesture Controller (Virtual Mouse) and Voice Assistant Using OpenCV, ML, Python [4]
AUTHORS: Dr. Pratibha V. Waje, Ms. Shipranjali K. Gangurde, Ms. Snehal S. Sonawane, Ms. Pallavi S. Avhad, Mr. Shubham S. Raut (May 2023)

• Proposes a hand gesture controller (virtual mouse) and voice assistant that utilizes OpenCV.
• ML and computer vision algorithms are used to recognize hand gestures and voice commands.
• The model is implemented using a CNN and the MediaPipe framework.

ADVANTAGES
• The system is scalable and adaptable to different environments and devices.
• Offers an alternative interface to a hardware mouse.
• Has applications in hazardous environments.
• Enhances user experience and improves accessibility through HCI.

DISADVANTAGES
• Inaccurate in noisy environments.
• Quick movements can lead to inaccurate results.

4. Whiteboard [5]
AUTHORS: Mona S. Wanve, Karuna S. Palaskar, Pratiksha D. Naik, Arpita S. Wankhade, Vaishnavi V. Bohara, Dr. A. D. Raut (May 2022)

• A web application built to increase collaboration.
• A real-time collaborative drawing whiteboard.
• Implemented using HTML5, the DOM, JavaScript, CSS, and canvas.

ADVANTAGES
• Improves teacher presentations and demonstrations.
• Provides one of the easiest ways for teachers to teach a class from a single computer.
• Offers a visually enhancing and stimulating type of learning.

DISADVANTAGES
• Interactive whiteboards tend to be PC-friendly, not Mac-friendly.
• Problems with too many wires in the classroom.
• Students interfering with, and being attracted to playing with, visible wires.
• Takes a lot of time to pack away when needed.

RESOURCES

▪ Python

▪ OpenCV

▪ MediaPipe

▪ PyAutoGUI

▪ PC with webcam

▪ Tkinter

SYSTEM ARCHITECTURE

[Architecture diagram: the camera feeds frame capture; frames are pre-processed and passed to hand detection, whose features drive gesture recognition against a gesture model trained on a gesture dataset. The predicted gesture selects an operation, which is then executed either on the whiteboard (position calculation and printing of strokes) or as a mouse operation (mouse pointer prediction and execution).]

ACTIVITY DIAGRAM

[Activity diagram, three levels:
LEVEL 0: Start → Camera → Frame Capture → Whiteboard / Mouse → Stop.
LEVEL 1 (Whiteboard): hand gesture recognition using MediaPipe → fingertip recognition → position detection → compare against the placed menu → select operation → execution; repeat until finished, then stop.
LEVEL 2 (Mouse): hand recognition → initiate and assign gestures → gesture recognition → compare → select operation → identify mouse position → execution; repeat until finished, then stop.]

IMPLEMENTATION
AND RESULTS

1. Interface

• Developed using "Tkinter” Python library used for creating graphical user interfaces (GUIs)

• Camera Access Button, a pivotal component allows users to seamlessly initiate the camera

• Transparent Window Button, a versatile tool empowers users with the flexibility to adjust the visibility

of the application window

• Provides a user-friendly experience by presenting features such as different colour options, clear, and

eraser functionalities.
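The toolbar described above can be sketched with Tkinter; the button labels, colour values, callback names, and layout here are illustrative assumptions, not the project's actual code:

```python
import tkinter as tk

# Hypothetical hex values for the four colour options named on the slides.
COLOURS = {"blue": "#0000ff", "green": "#00ff00",
           "red": "#ff0000", "yellow": "#ffff00"}

def build_toolbar(root, on_camera, on_transparent, on_colour, on_clear):
    """Lay out camera, transparency, colour and clear buttons on a top bar."""
    bar = tk.Frame(root)
    tk.Button(bar, text="Camera", command=on_camera).pack(side=tk.LEFT)
    tk.Button(bar, text="Transparent", command=on_transparent).pack(side=tk.LEFT)
    for name, hexval in COLOURS.items():
        # Each colour button passes its own name to the callback.
        tk.Button(bar, text=name.title(), bg=hexval,
                  command=lambda n=name: on_colour(n)).pack(side=tk.LEFT)
    tk.Button(bar, text="Clear", command=on_clear).pack(side=tk.LEFT)
    bar.pack(side=tk.TOP, fill=tk.X)
    return bar
```

A caller would create a `tk.Tk()` root, pass in its own handlers, and start `root.mainloop()`.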

Fig a. Interface

Fig b. Transparent window



2. Hand Recognition and Drawing

2.1 Loading the Hand Tracking Model

• Import the mediapipe module, which provides access to the hand tracking model and other functionalities.

• Use the constructor to initialize the hand tracking model.

• This constructor loads the pre-trained model and prepares it for inference.

• A Hands class instance represents the loaded hand tracking model.

• Ready for inference.

Fig c. Palm Detection
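A minimal sketch of this loading step, assuming MediaPipe's Python solutions API; the threshold values are illustrative defaults, not the project's settings:

```python
def load_hand_tracker(max_hands=1, detection_conf=0.7, tracking_conf=0.5):
    """Initialise MediaPipe's pre-trained Hands model, ready for inference."""
    import mediapipe as mp  # deferred so the sketch stays importable without mediapipe
    return mp.solutions.hands.Hands(
        static_image_mode=False,            # treat input as a video stream
        max_num_hands=max_hands,
        min_detection_confidence=detection_conf,
        min_tracking_confidence=tracking_conf,
    )
```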



2.2 Processing Input Data

• Can process webcam streams.

• Use the process() method of the initialized hand tracking model to feed the input data for inference.

• This method accepts the input data and returns the results of hand detection and tracking.

• When processing a video frame, it returns the results for that specific frame.

• After processing, the model generates results containing information about the detected hands.

• These results typically include the coordinates of hand landmarks, palm bounding boxes, and confidence scores.
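One way to pull the detected hands out of the process() results; the helper name is hypothetical. (Note that OpenCV frames are BGR and must be converted, e.g. with cv2.cvtColor(frame, cv2.COLOR_BGR2RGB), before calling process().)

```python
def first_hand_landmarks(results):
    """Return the landmark list for the first detected hand, or None.

    `results` is the object returned by Hands.process(); its
    multi_hand_landmarks attribute is None when no hand is found.
    """
    hands = getattr(results, "multi_hand_landmarks", None)
    return hands[0] if hands else None
```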

2.3 Fingertip Detection

● The hand landmarks are detected and extracted from the results provided by the hand tracking model.

● The landmarks consist of several points representing different parts of the hand, including the fingertips.

● Once the fingertips are identified, we can visualize them by drawing points or circles on the input data at their respective coordinates.

Fig d. Fingertip Detection
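A sketch of the fingertip extraction, assuming landmarks have already been unpacked into normalised (x, y) tuples; MediaPipe numbers the 21 hand landmarks so that the five fingertips are indices 4, 8, 12, 16 and 20:

```python
# MediaPipe hand-landmark indices for the five fingertips.
FINGERTIP_IDS = [4, 8, 12, 16, 20]   # thumb, index, middle, ring, pinky

def fingertip_pixels(landmarks, frame_w, frame_h):
    """Convert normalised (0..1) landmark coordinates to pixel positions,
    keeping only the fingertip landmarks. The caller can then draw circles
    at these points, e.g. with cv2.circle."""
    return [(int(landmarks[i][0] * frame_w), int(landmarks[i][1] * frame_h))
            for i in FINGERTIP_IDS]
```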

2.4 Drawing

• Interprets the coordinates of the user's fingertips obtained from the fingertip detection stage.

• Strokes representing the drawing actions are generated as the user moves their fingertips.

• Ensures the smoothness and continuity of the drawn strokes, enhancing visual appeal and user experience.

• Supports user selection of different colours and line thicknesses.

Fig e. Drawing
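The stroke generation above can be sketched as a simple accumulator (helper name hypothetical); joining consecutive points in a stroke with short line segments, e.g. cv2.line, is what keeps the drawn curve smooth and continuous:

```python
def add_point(strokes, point, new_stroke):
    """Accumulate fingertip positions into strokes (lists of points).

    A new stroke begins when the pen comes down (new_stroke=True);
    otherwise the point extends the current stroke.
    """
    if new_stroke or not strokes:
        strokes.append([point])
    else:
        strokes[-1].append(point)
    return strokes
```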



2.5 Selecting Colours

● Fosters a more engaging and personalized experience on the virtual whiteboard.

● Allows users to effortlessly switch between different colours while drawing or interacting with objects on the virtual whiteboard.

● Users can select from a predefined set of colours, including blue, green, red, and yellow.

● Chosen colours are accurately represented using the RGB (Red, Green, Blue) colour space.
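A sketch of the colour selection, assuming a hypothetical menu bar of equal-width buttons across the top of the frame; note that OpenCV drawing calls expect the channels in BGR order even though the colours are specified in RGB terms:

```python
# Colour values in BGR channel order, as used by OpenCV drawing calls.
PALETTE = {"blue": (255, 0, 0), "green": (0, 255, 0),
           "red": (0, 0, 255), "yellow": (0, 255, 255)}

def pick_colour(x, frame_w, options=("blue", "green", "red", "yellow")):
    """Map a fingertip x-position inside the menu bar to a colour name
    by splitting the frame width into equal buttons (assumed layout)."""
    idx = min(int(x / frame_w * len(options)), len(options) - 1)
    return options[idx]
```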

2.6 Clear

● Facilitates a dynamic workspace environment on the virtual whiteboard.

● Enables users to swiftly reset the canvas, removing all drawings and annotations with a single action.

● Underscores the commitment to user-centric design and usability.

2.7 Erase

● A valuable tool for users to refine their drawings and annotations with precision by undoing the last drawn line.

● Provides users with a seamless way to backtrack and refine their work.

● Works in a way similar to an undo action.
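With strokes stored as a list of point lists, Clear and Erase reduce to two small operations (helper names hypothetical):

```python
def clear_canvas(strokes):
    """Clear: remove every drawn stroke in a single action."""
    strokes.clear()

def erase_last(strokes):
    """Erase: undo the most recently drawn line, like an undo action."""
    if strokes:
        strokes.pop()
```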



3. Mouse Pointer Controller

● Allows users to control a virtual mouse pointer on the whiteboard interface using hand gestures.

● Enables actions such as dragging objects by holding a specific gesture.

● Left-clicking with the middle-finger gesture.

● Right-clicking with the forefinger gesture.

● Enhances usability and productivity in navigating and interacting with the virtual whiteboard interface.
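The gesture-to-action mapping can be sketched as a lookup table; the gesture labels are hypothetical names for the gestures described above, and the caller would translate the returned action into the corresponding call, e.g. pyautogui.click() or pyautogui.rightClick():

```python
# Hypothetical labels: the deck assigns left click to a middle-finger
# gesture, right click to the forefinger gesture, and drag to a held gesture.
GESTURE_ACTIONS = {
    "middle_finger": "left_click",
    "forefinger": "right_click",
    "hold": "drag",
}

def action_for(gesture):
    """Look up the mouse action for a recognised gesture (None if unmapped)."""
    return GESTURE_ACTIONS.get(gesture)
```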

3.1 Dragging

● Allows users to move objects on the virtual whiteboard by holding down a specific gesture.

● Enhances user control and flexibility in arranging and organizing files.

● Works by continuously tracking the user's hand movements.

● Collision detection algorithms are employed to prevent objects from overlapping or intersecting during dragging.

● Involves accurate gesture recognition, real-time position tracking, and effective object manipulation.

Fig f. Dragging
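The drag behaviour can be sketched as a small state machine over the held-gesture flag (names hypothetical); the emitted events would map onto pyautogui.mouseDown(), pyautogui.moveTo() and pyautogui.mouseUp():

```python
def update_drag(dragging, gesture_held, position, events):
    """Track whether a drag is in progress from the held-gesture flag.

    Appends (event, position) pairs to `events` and returns the new
    dragging state, driven by the continuously tracked hand position.
    """
    if gesture_held and not dragging:
        events.append(("drag_start", position))   # gesture just closed
        return True
    if gesture_held and dragging:
        events.append(("drag_move", position))    # gesture still held
        return True
    if dragging:
        events.append(("drag_end", position))     # gesture released
    return False
```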



3.2 Right & Left Click

● Works by detecting specific hand gestures associated with each action.

● For left-clicking, a gesture such as extending the middle finger is recognized.

● For right-clicking, a gesture associated with the forefinger is detected.

● These gestures are mapped to corresponding mouse actions.

Fig g. Left Click
Fig h. Right Click



CONCLUSION

• Successfully developed a virtual whiteboard system with hand tracking and gesture recognition capabilities

• Prepared and configured the Hand Tracking model for inference on input data

• Improved user experience by enabling intuitive interaction with digital content using hand gestures

• Potential applications in education, design, presentations, and remote collaboration



REFERENCES
• [1] Faria Soroni, Sakik Al Sajid, Md. Nur Hossain Bhuiyan, Junaid Iqbal, Mohammad Monirujjaman Khan, "Hand Gesture Based Virtual Blackboard Using Webcam".
• [2] Shaik Sai Rohit, Raunak Kandoi, Sandeep Kumar, "Hand Gesture Recognition for Human Computer Interaction through KNN Algorithm and MediaPipe".
• [3] N. Santhiyakumari, Jeevaa D, Gowri Shankar S, Mohit Rakesh Taparia, Neavil Porus, "Augmented Reality based Gesture Controlled Virtual Board".
• [4] Dr. Pratibha V. Waje, Ms. Shipranjali K. Gangurde, Ms. Snehal S. Sonawane, Ms. Pallavi S. Avhad, Mr. Shubham S. Raut, "Hand Gesture Controller (Virtual Mouse) and Voice Assistant Using OpenCV, ML, Python".
• [5] Mona S. Wanve, Karuna S. Palaskar, Pratiksha D. Naik, Arpita S. Wankhade, Vaishnavi V. Bohara, Dr. A. D. Raut, "Whiteboard".
THANK YOU
