
Machine Learning

Lecture 7
Pattern Recognition

• Patterns are recognized with the help of algorithms used in
Machine Learning. Pattern recognition is the process of
classifying data based on a model built from training data,
which then detects patterns and characteristics in new data.

Applications of Pattern Recognition

• Computer vision (from an engineering perspective, the field that seeks
to understand and automate tasks that the human visual system can do):
objects in images can be recognized with the help of pattern recognition,
which extracts characteristic patterns from an image or video; this is
used in face recognition, farming technology, etc.
• Civil administration: surveillance and traffic-analysis systems that
identify objects such as cars.
• Engineering: speech recognition is widely used in systems such as
Alexa, Siri, and Google Now.
• Geology: rock recognition helps geologists detect and classify rocks.
• Speech recognition: spoken words are treated as patterns and matched
by the recognition algorithm.
• Fingerprint scanning: pattern recognition is widely used to identify a
person from a fingerprint; one application is tracking attendance in
organizations.
https://www.edureka.co/blog/pattern-recognition/
Probability of Multiple Random Variables

• In machine learning, we are likely to work with many random variables.
• For example, given a table of data, such as an Excel spreadsheet, each
row represents a separate observation or event, and each column
represents a separate random variable.
• Variables may be either discrete, meaning they take on values from a
finite set, or continuous, meaning they take on values from a continuous
range of real numbers.
Red Wine Data Set

Contd..

• Probability across two or more random variables is complicated, as
there are many ways that random variables can interact, which, in turn,
affects their probabilities.

Contd..
We assume that the two variables are related or dependent in some way.
As such, there are three main types of probability we might want to
consider:
• Joint probability: the probability of two events occurring
simultaneously.
• Marginal probability: the probability of an event irrespective of the
outcome of another variable.
• Conditional probability: the probability of one event occurring in the
presence of a second event.
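The three types of probability can be illustrated with a small count table. The alcohol/quality split below is invented for illustration and is not taken from the Red Wine Data Set shown earlier.

```python
# Illustrating joint, marginal, and conditional probability with
# hypothetical counts of wine quality vs. alcohol level (assumed data).
high_alcohol = {"good": 30, "bad": 10}   # counts where alcohol is high
low_alcohol = {"good": 20, "bad": 40}    # counts where alcohol is low
total = sum(high_alcohol.values()) + sum(low_alcohol.values())  # 100

# Joint probability: P(high alcohol AND good quality)
p_joint = high_alcohol["good"] / total

# Marginal probability: P(good quality), irrespective of alcohol level
p_good = (high_alcohol["good"] + low_alcohol["good"]) / total

# Conditional probability: P(good quality | high alcohol)
p_cond = high_alcohol["good"] / sum(high_alcohol.values())

print(p_joint, p_good, p_cond)  # 0.3 0.5 0.75
```

Note how the conditional probability renormalizes by the "high alcohol" row total rather than by the whole table.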


Naïve Bayes Classifier
• A Naive Bayes classifier is a supervised machine-learning algorithm
that applies Bayes' Theorem under the "naive" assumption that the input
features are statistically independent of each other.
• What is Bayes' Theorem? Naive Bayes classifiers rely on Bayes'
Theorem, which is based on conditional probability: in simple terms, the
likelihood that an event (A) will happen given that another event (B)
has already happened.
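The independence assumption lets the classifier multiply per-feature likelihoods. A minimal sketch, with all priors and likelihoods invented for illustration:

```python
# Minimal Naive Bayes sketch (hypothetical numbers): two classes and two
# binary features assumed independent given the class, so
# P(Y|X) is proportional to P(Y) * P(X1|Y) * P(X2|Y).
priors = {"A": 0.6, "B": 0.4}   # P(Y), assumed values
likelihoods = {                  # P(Xi = 1 | Y), assumed values
    "A": [0.8, 0.3],
    "B": [0.2, 0.7],
}

def posterior_scores(x):
    """Unnormalized posterior score for each class, given binary features x."""
    scores = {}
    for y, prior in priors.items():
        score = prior
        for p, xi in zip(likelihoods[y], x):
            score *= p if xi == 1 else (1 - p)
        scores[y] = score
    return scores

scores = posterior_scores([1, 0])
prediction = max(scores, key=scores.get)  # pick the class with the larger score
```

Because we only compare the scores, the normalizing constant P(X) can be dropped.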

https://blog.easysol.net/machine-learning-algorithms-4/
Naïve Bayes Classifier

https://en.wikipedia.org/wiki/Bayes'_theorem
Bayes Classification

• Problem statement:
– Given features X1,X2,…,Xn
– Predict a label Y

Source: Material borrowed from Jonathan Huang and I. H. Witten’s and E. Frank’s “Data Mining” and
Jeremy Wyatt and others
Bayes Classification

• For a new data point X and a class Y, we find the probability of event
Y occurring given that another event X has already occurred, i.e., the
conditional probability P(Y|X).
Bayesian decision theory
Introduction
• Bayesian Decision Theory is a fundamental statistical approach to the
problem of pattern classification.
• It quantifies the tradeoffs between various classification decisions
using probability and the costs that accompany such decisions.
• Assumptions: the decision problem is posed in probabilistic terms, and
all relevant probability values are known.
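Under these assumptions, the Bayes decision rule chooses the action with the smallest expected loss (conditional risk), R(a|x) = Σ_j λ(a|c_j) P(c_j|x). A sketch with hypothetical posteriors and loss values:

```python
# Bayes decision rule: pick the action minimizing conditional risk
# R(a|x) = sum over classes c of loss(a|c) * P(c|x). Numbers are hypothetical.
posteriors = {"c1": 0.7, "c2": 0.3}   # P(c|x), assumed known per the theory
loss = {                               # loss(action | true class), assumed costs
    "choose_c1": {"c1": 0.0, "c2": 1.0},
    "choose_c2": {"c1": 2.0, "c2": 0.0},
}

risks = {
    a: sum(loss[a][c] * p for c, p in posteriors.items())
    for a in loss
}
best_action = min(risks, key=risks.get)  # action with minimum expected loss
```

With zero-one loss this rule reduces to picking the class with the largest posterior; asymmetric costs (as above) can shift the decision boundary.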

p. 34: Pattern Classification by Richard O. Duda, David G. Stork, Peter E. Hart


Bayes’ Theorem

There are four parts to Bayes’ Theorem:


• Prior,
• Evidence,
• Likelihood,
• and Posterior.
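The four parts fit together in the standard statement of Bayes' Theorem (as given in the Wikipedia reference cited above):

```latex
P(Y \mid X) = \frac{P(X \mid Y)\,P(Y)}{P(X)}
\qquad\text{i.e.}\qquad
\text{posterior} = \frac{\text{likelihood} \times \text{prior}}{\text{evidence}}
```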

https://towardsdatascience.com/introduction-to-bayesian-decision-theory-
Bayes Decision Theory: https://www.youtube.com/watch?v=y5KiOC85Huc
An Application

• Digit Recognition: a classifier maps an image of a handwritten digit
to a label.
• X1,…,Xn ∈ {0,1} (black vs. white pixels)
• Y ∈ {5,6} (predict whether a digit is a 5 or a 6)
Bayes Classifier

• To classify, we simply compute the two probabilities P(Y=5|X) and
P(Y=6|X) and predict the class whose probability is greater.
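This decision rule can be sketched end-to-end for the digit example. The 3-pixel "images" and labels below are toy data invented for illustration; log-probabilities and Laplace smoothing are standard practical choices, not part of the slides.

```python
import math

def train(images, labels):
    """Estimate P(Y) and P(Xi=1|Y) from binary images, with Laplace smoothing."""
    n = len(images[0])
    model = {}
    for y in set(labels):
        rows = [img for img, lab in zip(images, labels) if lab == y]
        prior = len(rows) / len(images)
        # Smoothed probability that each pixel is "on" (black) for class y
        pixel_on = [(sum(r[i] for r in rows) + 1) / (len(rows) + 2)
                    for i in range(n)]
        model[y] = (prior, pixel_on)
    return model

def predict(model, img):
    """Return the label whose (log) posterior score is greater."""
    best, best_score = None, float("-inf")
    for y, (prior, pixel_on) in model.items():
        score = math.log(prior)
        for p, xi in zip(pixel_on, img):
            score += math.log(p if xi else 1 - p)
        if score > best_score:
            best, best_score = y, score
    return best

# Toy 3-pixel "digits" (hypothetical training data)
images = [[1, 0, 1], [1, 1, 1], [0, 1, 0], [0, 0, 1]]
labels = [5, 5, 6, 6]
model = train(images, labels)
```

Summing log-probabilities avoids numerical underflow when the number of pixels n is large.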
Bayes Classifier: Example
