Backpropagation Through Time: Recurrent Neural Networks
The right side of the equation shows the effect of unrolling the recurrent relationship,
highlighting the repeated application of the same transformations and the resulting state
that combines information from past sequence elements with the current input, or context.
An alternative formulation connects the context vector to the first hidden state only; we will
outline additional options to modify the previously shown baseline architecture in the
following section.
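The unrolling described above can be sketched in code: the same weight matrices are applied at every time step, and each hidden state folds the previous state together with the current input. This is a minimal illustration, not the text's exact formulation; the names `W_hh`, `W_xh`, and `b` are assumed parameter names for a vanilla tanh RNN cell.

```python
import numpy as np

def unroll_rnn(x_seq, h0, W_hh, W_xh, b):
    """Unroll a vanilla RNN over a sequence, returning every hidden state.

    The same transformation (W_hh, W_xh, b) is applied at each step,
    so each state combines past sequence information with the current input.
    Parameter names are illustrative, not from the original text.
    """
    h = h0
    states = []
    for x_t in x_seq:  # repeated application of the same transformation
        h = np.tanh(W_hh @ h + W_xh @ x_t + b)
        states.append(h)
    return states

# Toy dimensions for demonstration.
rng = np.random.default_rng(0)
hidden, n_in, T = 4, 3, 5
W_hh = 0.1 * rng.normal(size=(hidden, hidden))
W_xh = 0.1 * rng.normal(size=(hidden, n_in))
b = np.zeros(hidden)
x_seq = [rng.normal(size=n_in) for _ in range(T)]

states = unroll_rnn(x_seq, np.zeros(hidden), W_hh, W_xh, b)
print(len(states), states[-1].shape)
```

Unrolling in this way makes the repeated structure explicit, which is exactly what backpropagation through time exploits: gradients flow backward through the same shared weights at every step.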
The recurrent connections between subsequent hidden states are critical: they render the RNN model universal, in the sense that it can compute any (discrete) function that a Turing machine can compute [4].