
PATTERN RECOGNITION

Dr. Dibakar Saha


Assistant Professor
Lecture-1
Department of Computer Applications
National Institute of Technology Raipur

❑ Pattern recognition is the automated recognition of patterns and regularities in data.
APPLICATIONS

❑ statistical data analysis,
❑ signal processing,
❑ image analysis,
❑ information retrieval,
❑ bioinformatics,
❑ data compression,
❑ computer graphics, and
❑ machine learning.

❑ Pattern recognition has its origins in statistics and engineering; some modern approaches
to pattern recognition include the use of machine learning, due to the increased
availability of big data and a new abundance of processing power.

❑ In machine learning, pattern recognition is the assignment of a label to a given input value.

❑ In statistics, discriminant analysis was introduced for this same purpose in 1936.
❑ An example of pattern recognition is classification, which attempts to assign each input value to one of a given set of classes (for example, determining whether a given email is "spam" or "non-spam").

❑ However, pattern recognition is a more general problem that encompasses other types of output as
well.
WHAT IS OUR TARGET

Our target is to train a machine so that it can identify or recognize a certain pattern, signal, or object.

[Images: Taj Mahal; Bibi Ka Maqbara, Aurangabad]
HOW TO TRAIN MYSELF

Profit from experience

Acquire knowledge and apply it


HOW TO TRAIN A MACHINE
❑ The concepts of abstract ideas are known to us a priori.

❑ We should adapt the learning process.

❑ It might be a continuous, adaptive learning process.

❑ To identify or recognize the underlying pattern or structure within the given data, we have to find what information is known a priori.

❑ We want a pattern recognition technique that imparts to the machine the power to recognize the pattern in the given input efficiently.
EXAMPLE

❑ Medical signal analysis → functioning heart → ECG

❑ Speech-to-word tool → captures voice → converts it to an electronic signal


FEATURE EXTRACTION

[Figures: the given object, the boundary of the object, and the region of the object]

❑ Boundary features → extracted from the boundary
❑ Region features → may be color, texture

FEATURE EXTRACTION
Boundary features

Treat each boundary point as a complex number: s(k) = u(k) + jv(k), where k = 0, 1, …, N−1 and N = total number of boundary points (u along the real axis, v along the imaginary axis).

Example: boundary features can be acquired from

• the discrete Fourier transform coefficient technique,
• the magnitudes of the N Fourier transform coefficients technique, etc.
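The Fourier-coefficient idea above can be sketched in a few lines of NumPy; the function name `fourier_descriptors` and the sample square boundary are my own illustration, not part of the lecture:

```python
import numpy as np

def fourier_descriptors(boundary):
    """Boundary features via the discrete Fourier transform.

    boundary: ordered list of (u, v) boundary points.
    Returns the magnitudes |S(k)| of the N DFT coefficients of
    s(k) = u(k) + j*v(k), k = 0, ..., N-1.
    """
    u, v = np.asarray(boundary, dtype=float).T
    s = u + 1j * v        # s(k) = u(k) + jv(k)
    S = np.fft.fft(s)     # the N Fourier transform coefficients
    return np.abs(S)      # magnitudes serve as numerical boundary features

# Example: a small square boundary (N = 4 points)
mags = fourier_descriptors([(0, 0), (1, 0), (1, 1), (0, 1)])
```

Using the magnitudes (rather than the complex coefficients) makes the feature values independent of where the boundary traversal starts.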
FEATURE EXTRACTION
Shape features

Example: shape features can be acquired from

• the moment about the principal axis (x-axis), and
• the moment about an axis orthogonal to the principal axis, passing through the centre of gravity of the object, etc.
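As a minimal sketch of such moment features, the standard second-order central moments (about axes through the object's centre of gravity) can be computed from a binary region; the function name `central_moments` is my own:

```python
import numpy as np

def central_moments(region):
    """Second-order central moments of a binary object region.

    region: 2-D array, nonzero where the object is present.
    Returns (mu20, mu02): moments about the vertical and horizontal
    axes passing through the object's centre of gravity.
    """
    ys, xs = np.nonzero(np.asarray(region))
    xbar, ybar = xs.mean(), ys.mean()    # centre of gravity
    mu20 = ((xs - xbar) ** 2).sum()      # moment about the vertical axis through the centroid
    mu02 = ((ys - ybar) ** 2).sum()      # moment about the horizontal axis through the centroid
    return mu20, mu02

# Example: a 2x2 solid object
mu20, mu02 = central_moments(np.ones((2, 2)))
```

Because the moments are measured about axes through the centroid, they do not change when the object is translated within the image.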
FEATURE EXTRACTION
❑ We need to find a set of properties from the boundary, color, or texture.

❑ Those extracted properties can be treated as numerical values.
FEATURE VECTOR

❑ If we keep those numerical values in a certain order, the resulting ordered list is called a feature vector.

❑ Every feature vector has M dimensions.
FEATURE SPACE
Let us say we have a feature vector with 3 dimensions, i.e., M = 3.

❑ We have a pattern P1 and another pattern P2

❑ P1 has feature vector: P1 ≈ <3, 5, 1>

❑ P2 has feature vector: P2 ≈ <4, 3, 0>

[Figure: P1 and P2 plotted in the (x, y, z) feature space]
Say, P1 and P2 belong to class 1 (ball) and class 2 (book), respectively, in the feature space.
FEATURE SPACE
❑ Now, for an unknown pattern P3, we need to recognize which class it belongs to.

❑ If we have the feature vector of P3 ≈ <x′, y′, z′>, we simply calculate the distances d(P1, P3) and d(P2, P3) between P1 and P3, and between P2 and P3, respectively.

❑ If d(P1, P3) > d(P2, P3), then P3 is closer to P2, so we can say P3 belongs to the class that P2 belongs to.

❑ That means P3 may be a book.

[Figure: P3 plotted near P2 (class 2) in the feature space]
MODEL

Feature Vectors → Pattern Recognizer (PR) → Class 1 / Class 2 / Class 3

Categorization of PR:
1. Supervised Learning
2. Unsupervised Learning
SUPERVISED LEARNING
❑ Patterns are known a priori

❑ Knowledge of similar patterns is available

❑ We have the feature vectors of the patterns

[Figure: labeled feature vectors grouped into Class 1 and Class 2]
UNSUPERVISED LEARNING
❑ Patterns are not known a priori

❑ Knowledge of similar patterns is not available

❑ We have only a fixed set of patterns

[Figure: unlabeled feature vectors to be grouped into Class 1 and Class 2]
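As a minimal sketch of the unsupervised case, unlabeled feature vectors can be grouped into two classes with a tiny 2-means loop; the function name `two_means`, the initialisation, and the sample points are my own illustration, not a method from the lecture:

```python
import numpy as np

def two_means(points, iters=10):
    """Group unlabeled feature vectors into two classes (a minimal 2-means sketch)."""
    pts = np.asarray(points, dtype=float)
    centres = pts[:2].copy()   # naive initialisation: the first two points
    labels = np.zeros(len(pts), dtype=int)
    for _ in range(iters):
        # assign each point to its nearest centre
        d = np.linalg.norm(pts[:, None, :] - centres[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centre to the mean of its assigned points
        for k in range(2):
            if (labels == k).any():
                centres[k] = pts[labels == k].mean(axis=0)
    return labels

# Two obvious groups of 2-D feature vectors, around (0, 0) and (10, 10):
labels = two_means([(0, 0), (0, 1), (10, 10), (10, 11)])
```

No class labels are supplied; the grouping emerges only from the distances between the feature vectors themselves.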
THAT’S ALL FOR TODAY..
IN THE NEXT CLASS WE WILL SEE SOME
FEATURE EXTRACTION TECHNIQUES
