Heaven’s Light is Our Guide

Rajshahi University of Engineering & Technology

Department of Mechatronics Engineering (MTE)

Course No: CSE 4288


Course Name: Artificial Intelligence Sessional

LAB REPORT

Submitted By:
Dibya Joy Paul
Roll: 1708057
Date of Exp.: 15/03/23
Date of Sub.: 10/05/23

Submitted To:
Md. Mehedi Hasan
Lecturer
Dept. of Mechatronics Engineering, RUET

Md. Hafiz Ahamed
Lecturer
Dept. of Mechatronics Engineering, RUET
Experiment No: 3
Experiment Name: Study of Backpropagation Algorithm.
Objective:
1) To know about the Backpropagation Algorithm.
2) To know how to build a Backpropagation Algorithm in Python.
3) To know about the importance of the Backpropagation Algorithm in Deep Learning.
Theory: For artificial neural networks, backpropagation is a supervised learning algorithm that adjusts the network weights to improve the model's accuracy. The cost function, the mean squared error between the predicted and the actual outputs, is minimized using gradient descent. This procedure is typically used when training feed-forward neural networks on data whose classifications are already known to us. Backpropagation is also known as the backward propagation of errors, since the error is spread backward from the output layer to improve accuracy. Whenever the prediction we receive from a neural network model differs significantly from the actual output, the backpropagation procedure must be applied so that the model gains improved accuracy.
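In symbols (standard notation, not spelled out above), the cost for one training example and the gradient-descent weight update can be written as follows, where t_k is the desired output and o_k the actual output of output neuron k, and \eta is the learning rate:

E = \frac{1}{2}\sum_{k}(t_k - o_k)^2, \qquad w_{ij} \leftarrow w_{ij} - \eta\,\frac{\partial E}{\partial w_{ij}}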
There are mainly three kinds of layers in a backpropagation model, i.e., the input layer, the hidden layer, and the output layer. Following are the main steps of the algorithm (a minimal Python sketch of these steps is given after the list):
Step 1: The input layer receives the input.
Step 2: Each neuron computes a weighted sum of its inputs (the input is combined with the weights).
Step 3: Each hidden layer processes the result and the output layer produces the output. The "error" here is the difference between the actual output and the desired output.
Step 4: The algorithm moves back through the hidden layers to adjust the weights and reduce the error.
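The sketch below follows these four steps on a tiny 2-2-1 network trained on the XOR problem, assuming NumPy is available. The layer sizes, learning rate, random seed, and epoch count are illustrative choices, not values taken from this report.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_deriv(a):
    # Derivative of the sigmoid, expressed in terms of its output a.
    return a * (1.0 - a)

# XOR inputs and desired outputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 2))   # input -> hidden weights
b1 = np.zeros((1, 2))
W2 = rng.normal(size=(2, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))

lr = 0.5                       # learning rate (illustrative)
for epoch in range(10000):
    # Steps 1-2: receive the input and compute weighted sums,
    # then apply the sigmoid activation at each layer.
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Step 3: the error is the difference between actual and desired output.
    error = output - y

    # Step 4: propagate the error backward to get per-layer deltas.
    delta2 = error * sigmoid_deriv(output)            # output-layer delta
    delta1 = (delta2 @ W2.T) * sigmoid_deriv(hidden)  # hidden-layer delta

    # Gradient-descent updates on weights and biases.
    W2 -= lr * hidden.T @ delta2
    b2 -= lr * delta2.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ delta1
    b1 -= lr * delta1.sum(axis=0, keepdims=True)

print("Predictions:", output.round(3).ravel())

Running this sketch should print predictions approaching [0, 1, 1, 0], illustrating how repeatedly propagating the error backward and updating the weights reduces the mean squared error.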
