Lecture 3 - Features, Neural Networks
NEURAL NETWORKS:
When we face complex situations that a single linear classifier cannot handle, we shift from linear classifiers to neural networks. Put simply, a neural network is a collection of linear classifiers stitched together with some non-linearity.
The picture above shows that a neural network tries to break the problem down into a set of subproblems (each one the output of a linear classifier), applies a non-linearity to those outputs, and then applies another linear classifier on top to get the final score. We use the sigmoid (logistic) function here, but any non-linear function would work (e.g. ReLU).
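As a rough illustration (a minimal sketch in NumPy, not code from the lecture), the two-layer scoring function score(x) = w · σ(Vx) described above can be written as follows; the shapes and values are made-up assumptions:

import numpy as np

def sigmoid(z):
    # Logistic non-linearity, applied elementwise.
    return 1.0 / (1.0 + np.exp(-z))

def score(x, V, w):
    # Each row of V is one "subproblem" linear classifier; sigma adds the non-linearity.
    h = sigmoid(V @ x)   # hidden activations h = sigma(V x)
    return w @ h         # final score = w . h

# Tiny usage example: 3 input features, 2 hidden units (illustrative only).
x = np.array([1.0, 0.5, -2.0])
V = np.random.randn(2, 3)
w = np.random.randn(2)
print(score(x, V, w))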
Now the training loss depends on both V and w:
TrainLoss(V, w) = 1/|Dtrain| ∑_{(x,y)∈Dtrain} Loss(x, y, w, V).
We need to minimise TrainLoss, as we did for linear classifiers, which means computing its gradient with respect to both w and V. In neural networks we have five basic building blocks: +, -, *, max and σ. By composing these five blocks over a number of layers we can build essentially any function.
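A minimal sketch of stochastic gradient descent on TrainLoss for this two-layer network is shown below. The squared loss Loss = (score - y)^2, the step size eta, and the hand-derived gradients are illustrative assumptions, not taken from the lecture:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(data, d_hidden=3, eta=0.1, epochs=100):
    # data: list of (x, y) pairs; squared loss (score - y)^2 is assumed here.
    d_in = len(data[0][0])
    V = 0.1 * np.random.randn(d_hidden, d_in)
    w = 0.1 * np.random.randn(d_hidden)
    for _ in range(epochs):
        for x, y in data:
            h = sigmoid(V @ x)                       # hidden activations
            score = w @ h                            # final score
            dscore = 2.0 * (score - y)               # dLoss/dscore
            grad_w = dscore * h                      # dLoss/dw
            dh = dscore * w                          # dLoss/dh
            grad_V = np.outer(dh * h * (1 - h), x)   # dLoss/dV, using sigma'(z) = h(1 - h)
            w -= eta * grad_w                        # gradient step on w
            V -= eta * grad_V                        # gradient step on V
    return V, w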
Backpropagation is an algorithm that allows us to compute gradients for any computation graph. For each node it computes two values: a forward value and a backward (gradient) value. Computing these gradients by hand becomes tedious once we have many layers, but nowadays libraries such as PyTorch and TensorFlow calculate them automatically.
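For example, here is a small PyTorch sketch (assuming the same squared loss and the V, w notation above) in which the backward pass is handled by autograd:

import torch

# The same two-layer score, with gradients computed automatically.
x = torch.tensor([1.0, 0.5, -2.0])
y = torch.tensor(3.0)
V = torch.randn(2, 3, requires_grad=True)
w = torch.randn(2, requires_grad=True)

score = w @ torch.sigmoid(V @ x)   # forward pass builds the computation graph
loss = (score - y) ** 2            # squared loss (illustrative choice)
loss.backward()                    # backward pass fills in V.grad and w.grad

print(V.grad)   # dLoss/dV
print(w.grad)   # dLoss/dw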