Multi Layer Perceptron
Figure: A multilayer perceptron with an input layer, three hidden layers, and an output layer.
Backpropagation learning algorithm
The algorithm is based on the gradient descent technique for solving an optimization problem: the minimization of the network cumulative error E_c,

E_c = \sum_{k=1}^{n} E(k), \qquad E(k) = \frac{1}{2} \left\| t(k) - o(k) \right\|^2

where E(k) is half the squared Euclidean norm of the vectorial difference between the k-th target output vector t(k) and the k-th actual output vector o(k) of the network (the factor 1/2 is conventional and cancels in the derivative), and n is the number of training patterns presented to the network for learning purposes.
The algorithm is designed to update the weights along the direction of steepest descent of the cumulative error, i.e., along the negative gradient with respect to the weight vector.
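As a minimal sketch of these two ideas (cumulative error and one gradient-descent step), the single sigmoid neuron, the helper names, and the learning rate below are illustrative assumptions, not the full network of the figure:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def cumulative_error(weights, patterns):
    # E_c = sum over the n patterns of 1/2 * ||t(k) - o(k)||^2
    # (a single neuron, so target and output are scalars here)
    total = 0.0
    for inputs, target in patterns:
        net = sum(w * x for w, x in zip(weights, inputs))
        o = sigmoid(net)
        total += 0.5 * (target - o) ** 2
    return total

def gradient_step(weights, patterns, eta=0.5):
    # Accumulate the error signal over all patterns, then move the
    # weights against the gradient of E_c (gradient descent).
    grads = [0.0] * len(weights)
    for inputs, target in patterns:
        net = sum(w * x for w, x in zip(weights, inputs))
        o = sigmoid(net)
        delta = (target - o) * o * (1.0 - o)  # -dE/dnet for the sigmoid
        for j, x in enumerate(inputs):
            grads[j] += delta * x
    return [w + eta * g for w, g in zip(weights, grads)]

patterns = [([1.0, 0.0], 1.0), ([0.0, 1.0], 0.0)]
w = [0.1, -0.1]
e_before = cumulative_error(w, patterns)
w = gradient_step(w, patterns)
e_after = cumulative_error(w, patterns)
```

A single step already lowers E_c on this toy data; repeating the step drives the error further down.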
Learning can be carried out off-line (all the training patterns are presented to the system at once and the weights are updated after a full pass) or on-line (training is made pattern by pattern, the weights being updated after each presentation).
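The two schedules differ only in when the update is applied; a schematic sketch, with a hypothetical grad(w, pattern) function standing in for the backpropagated gradient of one pattern's error:

```python
def train_offline(w, patterns, grad, eta, epochs):
    # Off-line (batch): sum the gradients over all patterns,
    # then update the weights once per epoch.
    for _ in range(epochs):
        g = [0.0] * len(w)
        for p in patterns:
            for j, gj in enumerate(grad(w, p)):
                g[j] += gj
        w = [wi - eta * gi for wi, gi in zip(w, g)]
    return w

def train_online(w, patterns, grad, eta, epochs):
    # On-line (pattern-by-pattern): update immediately after
    # each pattern's gradient is computed.
    for _ in range(epochs):
        for p in patterns:
            g = grad(w, p)
            w = [wi - eta * gi for wi, gi in zip(w, g)]
    return w
```

Both schedules descend the same error surface; on-line updates follow a noisier path but often converge with smaller memory cost.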
The update is computed with respect to the vector w(l) corresponding to all interconnection weights between layer (l) and the preceding layer (l − 1):

\Delta w_{ij}^{(l)} = -\eta \, \frac{\partial E}{\partial w_{ij}^{(l)}}

where η > 0 is the learning rate.
• For the case where layer (l) is the output layer (L), the above equation can be expressed as:

\Delta w_{ij}^{(L)} = \eta \, \delta_i^{(L)} \, o_j^{(L-1)}, \qquad \delta_i^{(L)} = \left( t_i - o_i \right) f'\!\left( net_i^{(L)} \right)

where net_i^{(L)} is the weighted input of output neuron i and f its activation function.
• Considering the case where f is the sigmoid function, f(x) = 1/(1 + e^{-x}), its derivative is f'(x) = f(x)(1 − f(x)).
• The error signal then becomes:

\delta_i^{(L)} = o_i \left( 1 - o_i \right) \left( t_i - o_i \right)
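The computational appeal of the sigmoid is that its derivative is available from the output alone, with no extra exponential. A quick numerical check against a central finite difference (the helper names are assumptions):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # f'(x) = f(x) * (1 - f(x)): reuses the already-computed output
    s = sigmoid(x)
    return s * (1.0 - s)

# Compare the closed form against (f(x+h) - f(x-h)) / (2h) at a few points
h = 1e-6
checks = [(x, abs(sigmoid_prime(x) - (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)))
          for x in (-2.0, 0.0, 1.5)]
```

The two values agree to within the finite-difference truncation error at every test point.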
• Propagating the error backward now, and for the case where (l) represents a hidden layer (l < L), the expression of Δw_ij^{(l)} is given as follows:

\Delta w_{ij}^{(l)} = \eta \, \delta_i^{(l)} \, o_j^{(l-1)}, \qquad \delta_i^{(l)} = f'\!\left( net_i^{(l)} \right) \sum_{m} \delta_m^{(l+1)} w_{mi}^{(l+1)}

i.e., the error signal of a hidden neuron is obtained by propagating the error signals of the following layer backward through its outgoing weights.
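Putting the two cases together (output-layer and hidden-layer error signals), a minimal pure-Python sketch for a network with one hidden layer follows; the layer sizes, variable names, and the omission of bias terms are simplifying assumptions, not part of the derivation above:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(W1, W2, x):
    # W1: hidden-layer weight rows, W2: output-layer weight rows (no biases)
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    o = [sigmoid(sum(w * hi for w, hi in zip(row, h))) for row in W2]
    return h, o

def backprop_deltas(W1, W2, x, t, eta):
    # Weight increments Delta w for both layers, for one training pattern.
    h, o = forward(W1, W2, x)
    # Output layer: delta_i = (t_i - o_i) * o_i * (1 - o_i)  (sigmoid case)
    d_out = [(ti - oi) * oi * (1.0 - oi) for ti, oi in zip(t, o)]
    # Hidden layer: delta_i = f'(net_i) * sum_m delta_m * w_mi
    d_hid = [hi * (1.0 - hi) * sum(d_out[m] * W2[m][i] for m in range(len(W2)))
             for i, hi in enumerate(h)]
    # Delta w_ij = eta * delta_i * (output of neuron j in the layer below)
    dW2 = [[eta * d_out[m] * h[i] for i in range(len(h))] for m in range(len(W2))]
    dW1 = [[eta * d_hid[i] * x[j] for j in range(len(x))] for i in range(len(W1))]
    return dW1, dW2
```

With eta = 1, each increment equals minus the partial derivative of E = ½‖t − o‖², which can be verified against a finite difference on any single weight.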