
EXP-6

AIM:
To study and implement classification using the Decision Tree algorithm in MATLAB.

Description:

A decision tree is a tree-like model that classifies or predicts an outcome by applying a sequence of tests to a set of features. It is a fundamental technique in machine learning and data analysis.

Here's a breakdown of the key aspects of decision tree theory (a short MATLAB sketch after this list shows how they appear in a fitted model):

Components:

Features (Attributes): These are the independent variables that describe the data points
(e.g., Sepal Length, Sepal Width in the Iris example).

Target Variable (Outcome): This is the dependent variable you want to predict or classify
(e.g., Iris species).

Nodes: These are the decision points in the tree structure. They represent a question asked
about a specific feature.

Internal Nodes (Decision Nodes): These nodes contain questions or tests based on a feature
value (e.g., "Is Sepal Length greater than 5 cm?").

Leaf Nodes (Terminal Nodes): These nodes represent the final predictions or classifications
(e.g., "Species: Iris Setosa").

Branches: These represent the possible answers or outcomes of the questions at each node.
They connect nodes and guide the flow from the root to a leaf node.
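
To see how these components map onto an actual model, here is a minimal sketch (illustrative only; the variable name tree is not part of the experiment code in the Procedure). fitctree builds the tree from the features and the target variable, and the text view lists the internal-node tests, branches, and the class labels at the leaf nodes.

% Minimal sketch: fit a small tree on the Iris data and print its node structure
load fisheriris                          % features in meas, class labels in species
tree = fitctree(meas(:, 1:2), species);  % internal nodes test feature values
view(tree, 'Mode', 'text');              % lists decision nodes, branches, and leaf-node classes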
Procedure:
1. Open MATLAB
2. Load the fisheriris dataset
3. Write the following code

% Load the Fisher Iris dataset
load fisheriris.mat

% Create a scatter plot to visualize the data
figure;
gscatter(meas(:, 1), meas(:, 2), species, 'rgb', 'osd');

% Label the axes
xlabel('Sepal Length (cm)');
ylabel('Sepal Width (cm)');

% Create a decision tree model using the first two features (Sepal Length and Sepal Width)
f = fitctree(meas(:, 1:2), species, ...
    'PredictorNames', {'sepal_length', 'sepal_width'});

% Visualize the decision tree
view(f, 'mode', 'graph');

4. Go to the Workspace panel and click on meas to visualise the dataset (an optional command-line check is sketched below).
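
As an optional check that is not part of the original procedure, the fitted tree f from step 3 can be used to classify a new measurement and to estimate its training accuracy; the sample values below are made up for illustration.

% Optional sketch: classify a new flower from its sepal length and width
newFlower = [5.0, 3.5];                  % [sepal_length, sepal_width] in cm (illustrative values)
predictedSpecies = predict(f, newFlower) % returns the predicted class label

% Optional sketch: resubstitution accuracy of the tree on the training data
trainError = resubLoss(f);               % fraction of misclassified training samples
fprintf('Training accuracy: %.2f%%\n', 100 * (1 - trainError));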

Results:

Decision tree:
Classification:

Cluster:
