
Faculty of Technology,
The M. S. University,
Vadodara

PC Control
using
Object Extraction, Object Tracking
&
Gesture Imitation

Electrical Engineering Department
Prepared by:
Fedrik Macwan (ME-96)
ME III, IV Sem
Problem Statement
Develop an HCI for PC
control using camera
image as input
Topics Covered in Seminar
Topics up to this seminar:
Object Extraction and
Object Tracking
Hardware / Software used
Hardware / Software : Specifications
Processor : Pentium P6200 @ 2.13 GHz
RAM : 2 GB
Operating System : Windows 7 Home Premium 32-bit
Testing Platform :
MATLAB : MATLAB 2013a (8.0.1.604)
Visual C : Visual Studio Ultimate 2012
Objectives
Applications
Cam Canvas,
Minesweeper, Night
Testing, Counter-strike
Work Remaining
Reducing the runtime of the
current algorithm, and thesis
writing.
Object
Extraction
Objective 1
Object
Tracking
Objective 2
Gesture
Imitation
Objective 3
Problem Statement
Develop an HCI (Human Computer Interface) for the PC
without the use of additional hardware.
Goal :
The goal of this project is to take advantage of recent
hardware developments and advances in image
processing algorithms to build an HCI, using computer vision
to replace traditional input hardware.
Objectives
Object
Extraction
Objective 1
Object
Tracking
Objective 2
Gesture
Imitation
Objective 3
Objective : 1 Object Extraction
[Slide tabs: Object Extraction GUI | Algorithm]
[Figure: Blanked Background, Extracted Object, Approximated Background]
Select Area of Interest
Step 1 : Crop the selected region of the image
Object Processing
Step 2 : Edge detection and filtering operations are performed on the cropped image
Object Extraction
Step 3 : The object is subtracted from the image
Auto Fill Background
Step 4 : The background is filled by duplicating the corner pixel
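The four steps above can be sketched in Python (an illustrative sketch only; the project implementation is in MATLAB, and the edge threshold and corner-fill strategy here are assumptions):

```python
import numpy as np

def extract_object(image, top, left, height, width, edge_thresh=30.0):
    """Sketch of the four-step extraction: crop, edge-detect, subtract, corner fill.

    `image` is a 2-D grayscale array; `edge_thresh` is an assumed value,
    not the one used in the project.
    """
    # Step 1: crop the selected region of interest
    roi = image[top:top + height, left:left + width].astype(float)

    # Step 2: simple gradient-magnitude edge detection on the crop
    gy, gx = np.gradient(roi)
    edges = np.hypot(gx, gy) > edge_thresh

    # Step 3: subtract (blank) everything but the detected object
    extracted = np.where(edges, roi, 0.0)

    # Step 4: fill the blanked object area with the corner pixel value
    background = roi.copy()
    background[edges] = roi[0, 0]
    return extracted, background
```

The corner-pixel fill only approximates a uniform background, which matches the "Approximated Background" result shown on the slide.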
Objective : 1 Object Extraction
Objective : 1 Object Extraction
Objective : 2 Object Tracking
[Figure: object path traced from the starting position to the ending position]
Objective : 2 Object Tracking
Step 1
Start the video acquisition and grab a snapshot at a fixed interval
Step 2
Subtract the grayscale image from the red channel to isolate the red components of the image
Step 3
Apply a median filter to remove noise, then discard all connected regions smaller than 300 pixels
Step 4
Label the image and find the centroid of the detected red object
Step 5
Set the mouse cursor position to the centroid of the detected red object
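Steps 2-4 can be sketched in Python with NumPy and SciPy (illustrative; the project code runs in MATLAB, and the colour threshold of 40 is an assumption). The OS-specific cursor call of Step 5 is omitted:

```python
import numpy as np
from scipy import ndimage

def red_object_centroid(frame, min_area=300):
    """Sketch of steps 2-4: isolate red, denoise, drop small blobs, centroid.

    `frame` is an H x W x 3 RGB array. Returns (row, col) or None.
    """
    frame = frame.astype(float)
    gray = frame.mean(axis=2)

    # Step 2: red channel minus grayscale highlights strongly red pixels
    red_diff = frame[..., 0] - gray
    mask = red_diff > 40.0  # assumed threshold

    # Step 3: median-filter the mask, then remove blobs under min_area pixels
    mask = ndimage.median_filter(mask.astype(np.uint8), size=3) > 0
    labels, n = ndimage.label(mask)
    for i in range(1, n + 1):
        if (labels == i).sum() < min_area:
            mask[labels == i] = False

    # Step 4: centroid of the surviving red region
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()
```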
Objective : 2 Object Tracking
Objective : 2 Object Tracking
There are five basic mouse operations that need
to be simulated in order to control the PC
effectively:
1. Cursor Movement
2. Left Click Event
3. Right Click Event
4. Double Click Event
5. Scroll Event
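These operations can be modeled as a small lookup (a hypothetical Python sketch; the actual implementation issues native Windows mouse events). The blue-object count produced by the tracking algorithm selects the click type:

```python
from enum import Enum

class MouseOp(Enum):
    MOVE = "cursor movement"
    LEFT_CLICK = "left click"
    RIGHT_CLICK = "right click"
    DOUBLE_CLICK = "double click"
    SCROLL = "scroll"

# Blue-object count -> click event (1: left, 2: right, 3: double)
CLICK_BY_BLUE_COUNT = {
    1: MouseOp.LEFT_CLICK,
    2: MouseOp.RIGHT_CLICK,
    3: MouseOp.DOUBLE_CLICK,
}

def click_for(blue_count):
    """Return the click event for a blue-object count, or None."""
    return CLICK_BY_BLUE_COUNT.get(blue_count)
```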
Objective : 2 Object Tracking
Step 1
Acquire an image from the camera every 0.1 s. Set red, green and blue thresholds for the colour detection
Step 2
Subtract the red, green and blue components of the image from the original image
Step 3
Obtain centroids for the red, green and blue components
Step 4
Movement of the red object's centroid is mapped to mouse cursor movement
Step 5
Movement of the green object's centroid is mapped to mouse scroll movement
Step 6
The number of blue objects is counted: one for a left click, two for a right click and three for a double click
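Steps 4-6 can be sketched as a per-frame decision function (an illustrative Python sketch; the names and the vertical-delta scroll mapping are assumptions, not the project's exact interface):

```python
def frame_actions(red_c, green_c, prev_green_c, blue_count):
    """Translate one frame's detections into UI actions.

    red_c / green_c / prev_green_c are (row, col) centroids or None;
    blue_count is the number of detected blue objects.
    """
    actions = {}
    # Step 4: the red centroid drives the cursor position
    if red_c is not None:
        actions["cursor"] = red_c
    # Step 5: vertical movement of the green centroid drives scrolling
    if green_c is not None and prev_green_c is not None:
        actions["scroll"] = green_c[0] - prev_green_c[0]
    # Step 6: the blue-object count selects the click type
    clicks = {1: "left_click", 2: "right_click", 3: "double_click"}
    if blue_count in clicks:
        actions["click"] = clicks[blue_count]
    return actions
```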
Objective : 2 Object Tracking
Objective : 3 Gesture Imitation
[Figure: Control Points]
Objective : 3 Gesture Imitation
1. Static Gesture Identification (without movement)
2. Dynamic Gesture Identification (with movement)

Objective : 3 Gesture Imitation
Objective : 3 Gesture Imitation
SCALE INVARIANT FEATURE TRANSFORM (SIFT) :
To aid the extraction of features, the SIFT
algorithm applies a four-stage filtering
approach:
1. Scale-Space Extrema Detection
2. Keypoint Localization
3. Orientation Assignment
4. Keypoint Descriptor
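Stage 1 can be sketched by building a difference-of-Gaussians (DoG) stack for one octave (a minimal Python sketch with standard but assumed parameters; full SIFT then searches these maps for local extrema across position and scale):

```python
import numpy as np
from scipy import ndimage

def dog_pyramid(image, sigma=1.6, k=2 ** 0.5, levels=4):
    """Build one octave's difference-of-Gaussians stack.

    Each DoG map is the difference of two adjacent Gaussian blurs;
    its extrema become candidate keypoints for stage 2 (localization).
    """
    image = image.astype(float)
    blurred = [ndimage.gaussian_filter(image, sigma * k ** i)
               for i in range(levels + 1)]
    return [blurred[i + 1] - blurred[i] for i in range(levels)]
```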
Objective : 3 SGI System
[Figure: keypoints 1-3 matched between two images, with pairwise distances d11, d21, d31 and d12, d22, d32]
M denotes the number of matched points.
Objective : 3 Gesture Imitation
[Flowchart]
1. Construct the database.
2. Init rSIFT RoD = 0.65, MK RoD threshold = 0.035.
3. Run the rSIFT match algorithm.
4. Run the MK RoD and get the validity ratio.
5. More than one result?
   YES: increment rSIFT RoD by 0.05, decrement MK RoD by 0.005, and go to step 3.
   NO: print out the result.
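The ratio-of-distance (RoD) test used here corresponds to Lowe's ratio test: a descriptor match is accepted only when the nearest-neighbour distance is sufficiently smaller than the second-nearest. A minimal sketch, assuming Euclidean distances over descriptor arrays (not the project's exact rSIFT code):

```python
import numpy as np

def ratio_matches(desc_a, desc_b, rod=0.65):
    """Match descriptors in desc_a against desc_b with a ratio-of-distance test.

    desc_a: (n, d) array, desc_b: (m, d) array with m >= 2.
    Returns a list of (i, j) index pairs that pass the RoD test.
    """
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        j, j2 = np.argsort(dists)[:2]       # nearest and second-nearest
        if dists[j] < rod * dists[j2]:       # distinctly better than runner-up
            matches.append((i, j))
    return matches
```

Raising the RoD (as the flowchart's loop does) admits more matches; lowering it keeps only the most distinctive ones.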
Objective : 3 Result : SGI System
Input Type : Accuracy
Hand gesture images from the training set : 98%
Hand gesture images outside the training set : 89%
Objective : 3 DGI System
[Block diagram]
Image Data -> Preliminary Processing -> Binary Data ->
Computational Geometry Calculation (Contour, Convex Hull) ->
Fingertip Detection / Palm Position Detection -> Modification ->
Estimated Fingertip and Palm Location
Objective : 3 DGI System
Applications
Cam Canvas
Night Testing
Counter-strike
Minesweeper

Convex Hull
A set of points is convex if it contains the
line segments connecting each pair of its points.

The convex hull of a given set X may be defined as:

1. The (unique) minimal convex set containing X
2. The intersection of all convex sets containing X
3. The set of all convex combinations of points in X
4. The union of all simplices with vertices in X
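The hull itself can be computed with Andrew's monotone chain, a standard O(n log n) algorithm (shown here as a Python sketch; the project's implementation may differ):

```python
def convex_hull(points):
    """Andrew's monotone chain: returns the vertices of the minimal convex
    set containing `points` (definition 1 above), in counter-clockwise order.
    """
    points = sorted(set(points))
    if len(points) <= 2:
        return points

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in points:                       # build the lower hull
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(points):             # build the upper hull
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # Each half-hull's last point is the other's first; drop duplicates.
    return lower[:-1] + upper[:-1]
```

In the DGI block diagram, the hull of the hand contour is what exposes the fingertip candidates: fingertips sit on hull vertices, while the valleys between fingers lie inside the hull.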
