
COLLEGE OF ENGINEERING & MANAGEMENT, KOLAGHAT

TOPIC NAME – MINIMUM DISTANCE CLASSIFIER


NAME - SUBHAM PATTANAYAK
COLLEGE ROLL NO - CSE/21/057
UNIVERSITY ROLL NO - 10700121025
DEPARTMENT- COMPUTER SCIENCE & ENGINEERING
SEMESTER- 6TH
SUBJECT – PATTERN RECOGNITION
PAPER CODE – PEC-IT602D
 INTRODUCTION :-

• A minimum distance classifier is a fundamental concept in pattern recognition, a field of study
that involves the identification and classification of patterns within data.
• This classifier is commonly used to assign a given input pattern to one of several predefined
classes based on the minimum distance criterion.
• The underlying idea is to measure the similarity or dissimilarity between the input pattern
and each of the predefined classes, and assign the pattern to the class with the minimum
distance.
 KEY CONCEPTS :-
• Feature Space :-
Patterns are often represented in a feature space, where each pattern is described by a set of features or
attributes. These features could be measurements, characteristics, or any relevant information
associated with the pattern.
• Classes :-
The patterns are grouped into different classes, and the goal is to classify a given pattern into one of
these predefined classes.
• Distance Metric :-
The choice of a distance metric is crucial in the minimum distance classifier. Common distance metrics
include Euclidean distance, Manhattan distance, and Mahalanobis distance. The distance metric
determines how the similarity or dissimilarity between patterns is measured. A short code sketch of these
metrics is given below.
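
As a rough illustration of these metrics, here is a minimal Python sketch (the function names are chosen for this example and NumPy is assumed; it is a sketch, not a library implementation):

```python
import numpy as np

def euclidean(x, y):
    # Straight-line distance: square root of the sum of squared differences
    return np.sqrt(np.sum((x - y) ** 2))

def manhattan(x, y):
    # City-block distance: sum of absolute differences
    return np.sum(np.abs(x - y))

def mahalanobis(x, y, cov):
    # Distance weighted by the inverse covariance of the data, so directions
    # with large variance (or strong correlation) count for less
    diff = x - y
    return np.sqrt(diff @ np.linalg.inv(cov) @ diff)

x = np.array([4.0, 5.0])
y = np.array([2.0, 3.0])
print(euclidean(x, y))               # ~2.83
print(manhattan(x, y))               # 4.0
print(mahalanobis(x, y, np.eye(2)))  # equals the Euclidean distance for identity covariance
```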
 WORKING OF MINIMUM DISTANCE CLASSIFIER :-
• Training Phase :-
During the training phase, the minimum distance classifier learns the characteristics of each class
in the feature space. This involves computing the representative values (such as mean or
centroid) for each class based on the training data.
• Classification Phase :-
When a new pattern is presented for classification, the classifier calculates its distance from the
representative values of each class using the chosen distance metric.
• Decision Rule :-
The classifier assigns the input pattern to the class with the minimum distance. The decision rule
can be formulated as follows:

Class = argmin_i Distance(Pattern, Class_i)
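
To make the two phases and the decision rule concrete, here is a minimal Python sketch (a rough illustration assuming NumPy; the class and method names, such as MinimumDistanceClassifier, fit and predict, are chosen for this example, not taken from any library). It uses the class means as representatives and Euclidean distance:

```python
import numpy as np

class MinimumDistanceClassifier:
    def fit(self, X, y):
        # Training phase: store the mean (centroid) of each class
        self.classes_ = np.unique(y)
        self.means_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        # Classification phase: Euclidean distance from each sample to each class mean,
        # then the decision rule Class = argmin_i Distance(Pattern, Class_i)
        dists = np.linalg.norm(X[:, None, :] - self.means_[None, :, :], axis=2)
        return self.classes_[np.argmin(dists, axis=1)]

# Tiny usage example
X = np.array([[1, 2], [2, 3], [3, 4], [5, 6], [6, 7], [7, 8]], dtype=float)
y = np.array(["A", "A", "A", "B", "B", "B"])
clf = MinimumDistanceClassifier().fit(X, y)
print(clf.predict(np.array([[1.5, 2.5], [6.5, 7.5]])))  # ['A' 'B']
```

Note that fitting only stores one mean per class, which is why the training phase is so cheap.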


 EXAMPLE DATASET :-
• Suppose we have a dataset with two classes, "A" and "B", and each class has two
features, x1 and x2. The dataset looks like this:

Class A Class B
(1, 2) (5, 6)
(2, 3) (6, 7)
(3, 4) (7, 8)

• Now, let's say we have a new data point (4, 5) and we want to classify it using the
Minimum Distance Classifier.
 CALCULATIONS :-
First, we calculate the mean of each class:

Mean of Class A: ((1 + 2 + 3)/3, (2 + 3 + 4)/3) = (2, 3)

Mean of Class B: ((5 + 6 + 7)/3, (6 + 7 + 8)/3) = (6, 7)

Next, we calculate the distance between the new data point and the mean of each class using the Euclidean
distance formula:

Distance to Class A = sqrt((4-2)^2 + (5-3)^2) = sqrt(4 + 4) = sqrt(8) ≈ 2.83

Distance to Class B = sqrt((4-6)^2 + (5-7)^2) = sqrt(4 + 4) = sqrt(8) ≈ 2.83

The two distances are equal, so the new data point lies exactly on the decision boundary between the two class means. A tie-breaking rule is therefore needed; a common convention is to take the first class in the tie, which here assigns the point to Class A.

 So, in this example, the Minimum Distance Classifier finds the new data point (4, 5) equidistant from the two class means and, breaking the tie in favour of the first class, assigns it to Class A.
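
The calculation can be verified with a short Python snippet (a minimal sketch, assuming NumPy); it reproduces the class means and confirms that the point (4, 5) is equidistant from both of them:

```python
import numpy as np

class_a = np.array([[1, 2], [2, 3], [3, 4]], dtype=float)
class_b = np.array([[5, 6], [6, 7], [7, 8]], dtype=float)
query = np.array([4.0, 5.0])

mean_a = class_a.mean(axis=0)   # [2. 3.]
mean_b = class_b.mean(axis=0)   # [6. 7.]

dist_a = np.linalg.norm(query - mean_a)   # sqrt(8) ~ 2.83
dist_b = np.linalg.norm(query - mean_b)   # sqrt(8) ~ 2.83

print(mean_a, mean_b, dist_a, dist_b)
# Equal distances: (4, 5) lies on the decision boundary, so a
# tie-breaking rule (e.g. take the first class) decides the label.
```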
 ADVANTAGES :-

 The minimum distance classifier is a simple and intuitive approach in pattern recognition that assigns a
given input pattern to the class with the nearest prototype or centroid in the feature space. Here are
some advantages of using the minimum distance classifier in pattern recognition:

• Simplicity: One of the main advantages of the minimum distance classifier is its simplicity. The method is
easy to understand and implement, making it accessible for beginners in pattern recognition.
• Computational Efficiency: Minimum distance classification involves calculating distances between the
input pattern and prototype vectors, which is computationally efficient. This makes it suitable for real-
time applications and scenarios where computational resources are limited.
• Low Training Complexity: Unlike more complex classifiers that require extensive training processes, the
minimum distance classifier often requires minimal or no training. This can be advantageous when
dealing with datasets that are not large or when training time is a critical factor.
• Robustness to Noise: The minimum distance classifier can be robust to small amounts of noise in the
data, especially when the noise does not significantly affect the relative distances between prototypes.
 DISADVANTAGES :-
 The minimum distance classifier, also known as the nearest mean or nearest centroid classifier, is a simple and intuitive
method used in pattern recognition. However, it has several disadvantages that may limit its effectiveness in
certain situations:

• Loss of Within-Class Information: Each class is summarised by a single prototype (typically its mean), so the
classifier ignores the spread, covariance, and possible multimodality of the class. Classes with very different
variances or complex shapes are represented poorly.
• Rigid Decision Boundaries: With Euclidean distance, the decision boundaries are linear (the perpendicular
bisectors between class means) and may not align well with the underlying structure of the data. When classes
overlap or are not linearly separable, the classifier may struggle to make accurate predictions.
• Unequal Importance of Features: The classifier assumes that all features contribute equally to the similarity
measure. In some cases, certain features may be more important than others, and the classifier may not
effectively capture these distinctions.
• Choice of Distance Metric: The classifier's performance is highly dependent on the choice of distance metric.
Different metrics may lead to different results, and selecting an appropriate metric can be challenging,
especially when the characteristics of the data are not well understood.
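
To illustrate the last two points, here is a small Python sketch (reusing the two class means from the worked example above, with a new query point made up for this illustration): rescaling one feature, which is what a change of units or of feature weighting amounts to, can flip which class mean is nearest.

```python
import numpy as np

mean_a = np.array([2.0, 3.0])
mean_b = np.array([6.0, 7.0])
query  = np.array([3.0, 6.5])

def nearest(q, means):
    # Return "A" or "B" depending on which mean is closer (Euclidean distance)
    d = [np.linalg.norm(q - m) for m in means]
    return "AB"[int(np.argmin(d))]

print(nearest(query, [mean_a, mean_b]))                          # 'B'

# Rescale the first feature by a factor of 10 (e.g. a change of units):
scale = np.array([10.0, 1.0])
print(nearest(query * scale, [mean_a * scale, mean_b * scale]))  # 'A'
```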
 CONCLUSION :-

• The minimum distance classifier serves as a straightforward and intuitive method for
pattern recognition, relying on the concept of proximity to determine class assignments.
While it may lack the complexity of more advanced algorithms, its simplicity makes it
suitable for certain applications, especially when dealing with well-separated classes in
feature space. However, its performance can be limited in scenarios with overlapping class
distributions or non-linear decision boundaries.

