
Lab 8

Fisher Linear Discriminant Projection

Name: Aleena Sehar, Roll No: BSCS-2021-36
Name: Meesam Imran, Roll No: BSCS-2021-01
Name: Munim Naeem, Roll No: BSCS-2021-36-04

In the previous lab, we performed dimensionality reduction using PCA, also known as
Principal Component Analysis. In this lab we will calculate the linear discriminant
projection for 2 classes.
Fisher's Linear Discriminant projection is a technique used for dimensionality
reduction and feature extraction in the context of pattern classification. The Fisher Linear
Discriminant projection involves finding a linear combination of features that maximizes the
separation between different classes in a dataset.
The main goal is to find a projection direction (a linear combination of features) which
maximizes the ratio of the between-class variance to the within-class variance.
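
This ratio can be sketched as a small Python function evaluated on the lab's sample points; the helper name fisher_criterion and the variable names are my own, not part of the lab code:

```python
import numpy as np

def fisher_criterion(w, class1, class2):
    # Project both classes onto direction w (the ratio is scale-invariant in w)
    p1, p2 = class1 @ w, class2 @ w
    between = (p1.mean() - p2.mean()) ** 2   # between-class separation
    within = p1.var() + p2.var()             # within-class spread
    return between / within

# The same sample points used in the lab code below
c1 = np.array([[4, 2], [2, 4], [2, 3], [3, 6], [4, 4]], dtype=float)
c2 = np.array([[9, 10], [6, 8], [9, 5], [8, 7], [10, 8]], dtype=float)

# A direction along the class separation scores much higher than one across it
print(fisher_criterion(np.array([1.0, 1.0]), c1, c2))
print(fisher_criterion(np.array([1.0, -1.0]), c1, c2))
```

Maximizing this criterion over all directions w is exactly what the algorithm below does via the scatter matrices.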

Code:
The steps of the algorithm are also given in the code comments 😊 :
import numpy as np
import matplotlib.pyplot as plt

# Sample data set as in the Slides


class1_data = np.array([
[4,2],
[2,4],
[2,3],
[3,6],
[4,4],
])
class2_data = np.array([
[9,10],
[6,8],
[9,5],
[8,7],
[10,8]
])
# Combine data for both classes
data=np.vstack([class1_data,class2_data])
# We calculated the means for both classes
class1_means=np.mean(class1_data,axis=0)
class2_means=np.mean(class2_data,axis=0)
# Next ,We calculated the variances for both classes
class1_variances=np.var(class1_data,axis=0)
class2_variances=np.var(class2_data,axis=0)
# In this step, we calculate the between-class scatter matrix in two steps:
# 1 : mean difference
# 2 : outer product of the mean difference (between-class scatter matrix)
mean_difference=class1_means-class2_means
between_class_scatter_matrix=np.outer(mean_difference,mean_difference)
# Then, we calculated the Within-Class Scatter Matrix for both classes
within_class_scatter_matrix_class1=np.cov(class1_data,rowvar=False)
within_class_scatter_matrix_class2=np.cov(class2_data,rowvar=False)
within_class_scatter_matrix=within_class_scatter_matrix_class1+within_class_scatter_matrix_class2
# Calculate the Fisher Linear Discriminant matrix
fisher_linear_discriminant=np.linalg.inv(within_class_scatter_matrix).dot(between_class_scatter_matrix)
# Calculate eigenvalues and eigenvectors; np.linalg.eig does not guarantee any
# ordering, so sort the eigenvectors so the largest eigenvalue comes first
eigenvalues,eigenvectors=np.linalg.eig(fisher_linear_discriminant)
order=np.argsort(eigenvalues)[::-1]
eigenvectors=eigenvectors[:,order]
# Project all the original points onto the Fisher Linear Discriminant
num_components=1
projected_data=np.dot(data, eigenvectors[:,:num_components])
# Visualize the results for both classes
plt.figure(figsize=(12,6))
# Original scatter plot for Class 1 and Class 2
plt.subplot(1,2,1)
plt.scatter(class1_data[:,0], class1_data[:,1], c='blue',label='Class 1 Data')
plt.scatter(class2_data[:,0], class2_data[:,1], c='orange',label='Class 2 Data')
plt.title('Original Data for Both Classes')
plt.xlabel('X')
plt.ylabel('Y')
plt.legend()
plt.grid(True)
# Transformed scatter plot using Fisher Linear Discriminant
plt.subplot(1,2,2)
plt.scatter(projected_data, np.zeros_like(projected_data), c='green', label='Projected Data')
plt.title('Fisher Linear Discriminant Projection')
plt.xlabel('Fisher Linear Discriminant')
plt.legend()
plt.grid(True)
plt.tight_layout()
plt.show()
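
As a sanity check on the eigenvector result above: for exactly two classes, the Fisher direction also has a well-known closed form, w proportional to Sw^-1 (m1 - m2). A minimal sketch reusing the lab's sample data (variable names are my own):

```python
import numpy as np

class1_data = np.array([[4, 2], [2, 4], [2, 3], [3, 6], [4, 4]], dtype=float)
class2_data = np.array([[9, 10], [6, 8], [9, 5], [8, 7], [10, 8]], dtype=float)

# Within-class scatter: sum of per-class covariances, as in the lab code
Sw = np.cov(class1_data, rowvar=False) + np.cov(class2_data, rowvar=False)

# Closed-form two-class Fisher direction: w is proportional to Sw^-1 (m1 - m2)
mean_diff = class1_data.mean(axis=0) - class2_data.mean(axis=0)
w = np.linalg.solve(Sw, mean_diff)
w /= np.linalg.norm(w)

# The projected class means should be far apart along this unit direction
gap = abs(class1_data.mean(axis=0) @ w - class2_data.mean(axis=0) @ w)
print(w, gap)
```

Up to sign and scale, this w should match the leading eigenvector found by the eigendecomposition route.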
Output:

Original Data for Both Classes:

 The first subplot (left) shows the scatter plot of the original data points for
both classes.
 Each point represents a data sample with its coordinates on the X and Y axes.

Fisher Linear Discriminant Projection:

 The second subplot (right) shows the scatter plot of the data points after being
projected onto the Fisher Linear Discriminant.
 The projection is along a one-dimensional axis (the discriminant direction), which is the
direction that maximizes the separation between the two classes.
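
One way to quantify the separation seen in the projected plot: threshold the projected values at the midpoint of the two projected class means and check that all ten points land on the correct side. A small sketch, using the two-class closed-form direction on the lab's sample data (variable names are my own):

```python
import numpy as np

class1_data = np.array([[4, 2], [2, 4], [2, 3], [3, 6], [4, 4]], dtype=float)
class2_data = np.array([[9, 10], [6, 8], [9, 5], [8, 7], [10, 8]], dtype=float)

# Fisher direction via the two-class closed form: w proportional to Sw^-1 (m1 - m2)
Sw = np.cov(class1_data, rowvar=False) + np.cov(class2_data, rowvar=False)
w = np.linalg.solve(Sw, class1_data.mean(axis=0) - class2_data.mean(axis=0))

# 1-D projections of each class
p1 = class1_data @ w
p2 = class2_data @ w

# Midpoint of the projected class means as a simple decision threshold
threshold = (p1.mean() + p2.mean()) / 2
print(np.all(p1 > threshold) and np.all(p2 < threshold))
```

For this dataset the two projected clusters do not overlap, so the midpoint threshold classifies every point correctly.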

Contribution:
• Munim Naeem performed Data Preparation and Calculation of Scatter Matrices.
• Aleena Sehar performed Fisher Linear Discriminant Calculation and
Eigenvalues/Eigenvectors.
• Meesam Imran took care of Visualization and Interpretation of Results.
