
What is a Recurrent Neural Network?

➢ A Recurrent Neural Network (RNN) is a class of Artificial Neural Network in which the
neurons can send feedback signals to each other.
➢ An RNN has a "memory", which remembers information about what has been
calculated so far.
➢ An ANN/CNN accepts a fixed-sized vector as input and produces a fixed-sized vector
as output.
➢ An RNN allows us to operate over sequences of vectors in the input, the output, or both.
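The idea above can be sketched in a few lines of NumPy. This is a minimal illustration of one recurrent step, not any specific library's API; the function and weight names (`rnn_step`, `W_xh`, `W_hh`) are made up for the example. The same weights are reused at every time step, and the hidden state `h` is the "memory" carried forward.

```python
import numpy as np

# One recurrent step: the new hidden state depends on the current input x
# AND the previous hidden state h -- this is the feedback / "memory".
def rnn_step(x, h, W_xh, W_hh, b_h):
    return np.tanh(x @ W_xh + h @ W_hh + b_h)

rng = np.random.default_rng(0)
input_size, hidden_size = 4, 3
W_xh = rng.normal(size=(input_size, hidden_size))  # input-to-hidden weights
W_hh = rng.normal(size=(hidden_size, hidden_size)) # hidden-to-hidden (recurrent) weights
b_h = np.zeros(hidden_size)

# Process a sequence of 5 input vectors, reusing the same weights each step.
h = np.zeros(hidden_size)
for x in rng.normal(size=(5, input_size)):
    h = rnn_step(x, h, W_xh, W_hh, b_h)

print(h.shape)  # (3,)
```

Note that the loop can run over a sequence of any length, which is exactly why an RNN is not limited to fixed-sized inputs the way a plain ANN/CNN is.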
What is the Vanishing Gradient Problem?
➢ When we do back-propagation, moving backward through the network and calculating
the gradient of the loss (error) with respect to the weights, the gradient tends to get
smaller and smaller as we keep moving backward through the network.
➢ That means neurons in the earlier layers learn very slowly compared to neurons
in the later layers; the earliest layers in the network are the slowest to train.
➢ This shrinking of the gradient as we move backward through the network is called
the Vanishing Gradient problem.
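A quick numeric sketch of why this happens: the derivative of the sigmoid is at most 0.25, and back-propagating through a chain of layers multiplies one such factor per layer. The pre-activation value 0.5 below is just an illustrative choice.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Back-propagating through 10 sigmoid layers multiplies 10 factors,
# each <= 0.25, so the gradient collapses toward zero.
grad = 1.0
for layer in range(10):
    z = 0.5                                 # example pre-activation value
    grad *= sigmoid(z) * (1 - sigmoid(z))   # derivative of sigmoid at z

print(grad)  # a tiny number, well below 1e-6
```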
Solution to the Vanishing Gradient problem:
➢ Avoid sigmoid and tanh as activation functions, since their small derivatives cause
the vanishing gradient problem.
➢ Instead, use ReLU as the activation function to overcome this problem.
What is Long Short-Term Memory (LSTM)?
➢ Long Short-Term Memory (LSTM) is a type of recurrent neural network (RNN) that
selectively remembers patterns for long durations of time.
➢ An LSTM unit is composed of a cell and three gates which control the flow of
information:
1.Forget gate
2.Input gate
3.Output gate
Forget gate:
➢ The forget gate decides what to discard from the previous memory (cell state).
Input gate:
➢ The input gate decides what new information to accept into the neuron, updating
the memory (cell state).
Output gate:
➢ The output gate decides what part of the long-term memory to expose as the output.
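The three gates above can be sketched as one LSTM step in NumPy. This is an illustrative sketch of the standard LSTM equations, not any library's implementation; the parameter names (`W`, `U`, `b` dicts keyed by gate) are assumptions made for readability, and real implementations usually pack the gate weights into a single matrix.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """x: input, h: previous hidden state, c: previous cell state (memory)."""
    f = sigmoid(x @ W["f"] + h @ U["f"] + b["f"])        # forget gate: what to drop from c
    i = sigmoid(x @ W["i"] + h @ U["i"] + b["i"])        # input gate: what to write to c
    o = sigmoid(x @ W["o"] + h @ U["o"] + b["o"])        # output gate: what to expose
    c_tilde = np.tanh(x @ W["c"] + h @ U["c"] + b["c"])  # candidate new memory
    c_new = f * c + i * c_tilde    # update long-term cell state
    h_new = o * np.tanh(c_new)     # gated output (hidden state)
    return h_new, c_new

rng = np.random.default_rng(0)
n_in, n_hid = 4, 3
W = {k: rng.normal(size=(n_in, n_hid)) for k in "fioc"}
U = {k: rng.normal(size=(n_hid, n_hid)) for k in "fioc"}
b = {k: np.zeros(n_hid) for k in "fioc"}

h = c = np.zeros(n_hid)
h, c = lstm_step(rng.normal(size=n_in), h, c, W, U, b)
print(h.shape, c.shape)  # (3,) (3,)
```

The key design point is the cell-state update `c_new = f * c + i * c_tilde`: the forget gate scales the old memory, and the input gate scales what gets added, which is what lets an LSTM keep information over long durations.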
What is a Gated Recurrent Unit (GRU)?
➢ There are two types of gates in a GRU:
1.Update Gate
2.Reset Gate
Update Gate:
➢ The update gate acts like the forget and input gates of an LSTM combined: it decides
what information to throw away and what information to add to the neuron.
Reset Gate:
➢ The reset gate decides how much of the past information to forget.
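The two GRU gates can be sketched in the same style as the LSTM example. Again the parameter names are illustrative assumptions, not a library API; this follows the standard GRU equations, where the update gate `z` blends the old state with a candidate state and the reset gate `r` controls how much past state feeds into that candidate.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_step(x, h, W, U, b):
    z = sigmoid(x @ W["z"] + h @ U["z"] + b["z"])              # update gate
    r = sigmoid(x @ W["r"] + h @ U["r"] + b["r"])              # reset gate
    h_tilde = np.tanh(x @ W["h"] + (r * h) @ U["h"] + b["h"])  # candidate state
    return (1 - z) * h + z * h_tilde   # blend old state with candidate

rng = np.random.default_rng(0)
n_in, n_hid = 4, 3
W = {k: rng.normal(size=(n_in, n_hid)) for k in "zrh"}
U = {k: rng.normal(size=(n_hid, n_hid)) for k in "zrh"}
b = {k: np.zeros(n_hid) for k in "zrh"}

h = np.zeros(n_hid)
h = gru_step(rng.normal(size=n_in), h, W, U, b)
print(h.shape)  # (3,)
```

Unlike the LSTM, there is no separate cell state: the GRU carries a single hidden state, which is why it needs only two gates.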
Why do we need RNNs?
➢ Because of their internal memory, RNNs can remember important things about the
input they received, which allows them to be very precise in predicting what's
coming next.
➢ This is why they're the preferred algorithm for sequential data like time series,
speech, text, financial data, audio, video, weather and much more.
➢ Recurrent neural networks can form a much deeper understanding of a sequence
and its context compared to other algorithms.
When do you need to use an RNN?
➢ Whenever there is a sequence of data and the temporal dynamics connecting the
data matter more than the spatial content of each individual frame.
➢ Since RNNs are being used in the software behind Siri and Google Translate,
recurrent neural networks show up a lot in everyday life.
