
BASICS OF NEURAL NETWORK AND DEEP LEARNING

PRESENTED BY:-

NAME - SUBHODEEP SEAL
COLLEGE ROLL NO. - 162116559
BRANCH - CSE 'B'
YEAR - 3rd, SEM - 6th
DATE - 4/5/2019
CONTENTS
• What is a Neural Network (NN)?
• Logistic regression cost function
• Logistic regression derivatives
• Vectorization
• Activation functions
• Forward and Backward propagation
• References
What is a Neural Network (NN)?

Each input will be connected to the hidden layer, and the NN will decide the connections.
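As a rough illustration of "every input connected to the hidden layer": the connections can be stored as a weight matrix with one row per hidden unit and one column per input. A minimal NumPy sketch, with the layer sizes (3 inputs, 4 hidden units) made up for the example:

import numpy as np

# Hypothetical sizes chosen only for illustration:
# 3 input features, 4 hidden units.
n_x, n_h = 3, 4

x = np.array([[0.5], [-1.2], [0.3]])      # one input example, shape (3, 1)
W1 = np.random.randn(n_h, n_x) * 0.01     # every hidden unit gets a weight for every input
b1 = np.zeros((n_h, 1))

# Each hidden unit combines all inputs; training adjusts W1, i.e. the NN
# "decides" how strong each input-to-hidden connection should be.
z1 = np.dot(W1, x) + b1
print(z1.shape)                           # (4, 1): one value per hidden unit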
Shallow NN is a NN with one or two layers.
Deep NN is a NN with three or more layers.

Logistic regression

Simple equation: y = wx + b
If x is a vector: y = w^T x + b
If we need y to be between 0 and 1 (a probability): y = sigmoid(w^T x + b)
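A minimal NumPy sketch of these three equations, assuming the standard sigmoid 1 / (1 + e^(-z)); the input and weight values are made up for illustration:

import numpy as np

def sigmoid(z):
    # Squashes any real value into (0, 1), so the output can be read as a probability.
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([1.0, 2.0, 0.5])   # input vector
w = np.array([0.1, -0.2, 0.4])  # one weight per input
b = 0.3

z = np.dot(w, x) + b            # w^T x + b
y_hat = sigmoid(z)              # prediction between 0 and 1
print(y_hat)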
GRADIENT DESCENT sigmoid(w(transpose)x + b)

Logistic regression cost function

(Figure: gradient descent on the cost function curve, converging to the best result, i.e. the best value of the cost function.)
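The cost formula itself is not reproduced in the extracted slide; assuming the standard binary cross-entropy cost J(w, b) = -(1/m) * sum(y * log(y_hat) + (1 - y) * log(1 - y_hat)) and its usual derivatives, a minimal sketch of gradient descent stepping toward that best value could look like this (the tiny dataset and learning rate are invented):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny made-up dataset: m = 4 examples, 2 features each (columns are examples).
X = np.array([[0.0, 1.0, 2.0, 3.0],
              [1.0, 0.0, 1.0, 0.0]])
Y = np.array([[0, 0, 1, 1]])
m = X.shape[1]

w = np.zeros((2, 1))
b = 0.0
learning_rate = 0.1

for i in range(1000):
    A = sigmoid(np.dot(w.T, X) + b)                            # predictions, shape (1, m)
    cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))   # cross-entropy cost
    dw = np.dot(X, (A - Y).T) / m                              # derivative of cost w.r.t. w
    db = np.sum(A - Y) / m                                     # derivative of cost w.r.t. b
    w -= learning_rate * dw                                    # step toward the best value
    b -= learning_rate * db

print(cost)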
VECTORIZATION

Deep learning shines when the datasets are big. However, for loops will make you wait a long time for a result. That is why we need vectorization, to get rid of some of our for loops. The NumPy dot() function uses vectorization by default. Whenever possible, avoid for loops; most of the NumPy library methods are vectorized versions.
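A small sketch of the difference between the two approaches; the array size and the exact timings are arbitrary, but on most machines np.dot finishes far sooner than the explicit for loop:

import time
import numpy as np

n = 1000000
a = np.random.rand(n)
b = np.random.rand(n)

# Explicit for loop: one multiply-add at a time in interpreted Python.
start = time.time()
total = 0.0
for i in range(n):
    total += a[i] * b[i]
loop_seconds = time.time() - start

# Vectorized: NumPy's dot() does the same work in optimized compiled code.
start = time.time()
total_vec = np.dot(a, b)
vec_seconds = time.time() - start

print("for loop:", loop_seconds, "s, result", total)
print("np.dot:  ", vec_seconds, "s, result", total_vec)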
Activation functions
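The slide's plots are not reproduced in the extracted text; as a sketch, the commonly used sigmoid, tanh, and ReLU activations can be written in a few lines and compared on the same inputs:

import numpy as np

def sigmoid(z):
    # Output in (0, 1); often used on the final layer for binary classification.
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # Output in (-1, 1); zero-centred, usually preferred over sigmoid for hidden layers.
    return np.tanh(z)

def relu(z):
    # max(0, z); cheap to compute and a common default for hidden layers.
    return np.maximum(0, z)

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sigmoid(z))
print(tanh(z))
print(relu(z))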
Forward propagation

The "cache" records values from the forward propagation units and passes them to the backward propagation units, because they are needed to compute the chain-rule derivatives.
Backward propagation
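A minimal sketch of one forward and backward pass for a single sigmoid unit, showing how the cached forward values are reused by the backward step to apply the chain rule; the batch and the sigmoid/cross-entropy pairing are assumptions made for the example:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(w, b, X):
    Z = np.dot(w.T, X) + b
    A = sigmoid(Z)
    cache = (X, A)              # values the backward pass will need
    return A, cache

def backward(cache, Y):
    X, A = cache                # read the cached forward values
    m = X.shape[1]
    dZ = A - Y                  # chain rule for sigmoid output with cross-entropy loss
    dw = np.dot(X, dZ.T) / m
    db = np.sum(dZ) / m
    return dw, db

# Tiny made-up batch: 2 features, 3 examples.
X = np.array([[1.0, 2.0, -1.0],
              [0.5, -0.5, 1.5]])
Y = np.array([[1, 0, 1]])
w = np.zeros((2, 1))
b = 0.0

A, cache = forward(w, b, X)
dw, db = backward(cache, Y)
print(dw, db)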
Applications

Comparison Between Artificial Neural Network (ANN) and Biological Neural Network (BNN)
References

• www.google.com
• www.coursera.org
• www.wikipedia.com
Thank You!
