Contents:
Multilayer Perceptron (MLP)
Introduction to Back Propagation Algorithm
Derivation of Back Propagation of Error
Advantages of Back Propagation Algorithm
Disadvantages of Back Propagation Algorithm
Application of Back Propagation Algorithm
BACK PROPAGATION ALGORITHM IN MULTILAYER PERCEPTRON
Multilayer Perceptron (MLP):
A multilayer perceptron is a neural network that connects multiple layers of nodes in a directed graph, which means that the signal path through the nodes goes only one way. Each node, apart from the input nodes, has a nonlinear activation function. An MLP uses back propagation as a supervised learning technique.
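As an illustration, one forward pass through a small two-layer MLP can be sketched in Python with NumPy. The layer sizes, the sigmoid activation, and all values are arbitrary choices for this sketch, not part of the original text:

```python
import numpy as np

def sigmoid(v):
    # Nonlinear activation applied at every non-input node.
    return 1.0 / (1.0 + np.exp(-v))

def mlp_forward(x, W1, b1, W2, b2):
    """One forward pass: signals flow one way, input -> hidden -> output."""
    h = sigmoid(W1 @ x + b1)   # hidden layer
    y = sigmoid(W2 @ h + b2)   # output layer
    return y

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # 3 inputs -> 4 hidden nodes
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)   # 4 hidden -> 2 output nodes
y = mlp_forward(np.array([0.5, -1.0, 2.0]), W1, b1, W2, b2)
```

Because every node applies the sigmoid, each output lies strictly between 0 and 1.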
Introduction to Back Propagation Algorithm:
Back propagation is the essence of neural network training. It is the method of fine-tuning the weights of a neural net based on the error rate obtained in the previous epoch (i.e., iteration). Proper tuning of the weights reduces error rates and makes the model reliable by increasing its generalization.
Back propagation is short for "backward propagation of errors." It is a standard method of training artificial neural networks. This method calculates the gradient of a loss function with respect to all the weights in the network.
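To make "the gradient of a loss function with respect to the weights" concrete, the sketch below uses a single sigmoid neuron with squared error (all names and values are illustrative) and compares the analytic gradient that back propagation would compute with a numerical finite-difference estimate:

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

x = np.array([0.2, -0.4, 0.7])   # inputs to the neuron
w = np.array([0.1, 0.3, -0.2])   # weights
d = 1.0                          # desired response

def loss(w):
    e = d - sigmoid(w @ x)       # error signal e = d - y
    return 0.5 * e ** 2          # instantaneous error energy

# Analytic gradient via the chain rule: dE/dw = -e * phi'(v) * x
v = w @ x
y = sigmoid(v)
e = d - y
grad = -e * y * (1 - y) * x

# Numerical check by central differences.
eps = 1e-6
num = np.array([(loss(w + eps * np.eye(3)[i]) - loss(w - eps * np.eye(3)[i])) / (2 * eps)
                for i in range(3)])
```

The two gradients agree to numerical precision, which is the standard sanity check for a back-propagation implementation.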
Derivation of Back Propagation of Error:
The popularity of on-line learning for the supervised training of multilayer perceptrons has been further enhanced by the development of the back-propagation algorithm. To describe this algorithm, consider fig 4.3, which depicts neuron j being fed by a set of function signals produced by a layer of neurons to its left. The induced local field vj(n) produced at the input of the activation function associated with neuron j is therefore

vj(n) = Σi wji(n) yi(n)

where the sum runs over all inputs i applied to neuron j, with the bias treated as a fixed input y0 = +1 whose weight is wj0.
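In code, the induced local field is simply a dot product of the weight vector of neuron j with the incoming signal vector (values illustrative; the first component carries the bias):

```python
import numpy as np

y = np.array([1.0, 0.5, -0.3, 0.8])    # y0 = +1 carries the bias
w_j = np.array([0.2, 0.4, -0.1, 0.6])  # wj0 is the bias weight
v_j = w_j @ y                          # vj(n) = sum_i wji(n) * yi(n)
```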
In a manner similar to the Least Mean Square (LMS) algorithm, the back-propagation algorithm applies a correction ∆wji(n) to the synaptic weight wji(n) that is proportional to the gradient of the error energy:

∆wji(n) = −η ∂E(n)/∂wji(n)

where η is the learning-rate parameter; the minus sign accounts for gradient descent in weight space.
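A single delta-rule correction of this form might look like the following (learning rate and values are illustrative):

```python
import numpy as np

eta = 0.1                         # learning-rate parameter
w = np.array([0.5, -0.3])         # current synaptic weights wji(n)
grad = np.array([0.2, 0.4])       # dE(n)/dwji(n) from back propagation
w_new = w - eta * grad            # delta rule: move against the gradient
```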
From Eqs. (4.15) and (4.16), we note that a key factor involved in the calculation of the weight adjustment ∆wji(n) is the error signal ej(n) at the output of neuron j. In this context, we may identify two distinct cases, depending on where in the network neuron j is located. In case 1, neuron j is an output node: its desired response dj(n) is available, so ej(n) = dj(n) − yj(n) can be computed directly and the local gradient is δj(n) = ej(n) φ′j(vj(n)). In case 2, neuron j is a hidden node: no desired response is available for it, so its local gradient is determined recursively from the local gradients of the neurons to its right, δj(n) = φ′j(vj(n)) Σk δk(n) wkj(n).
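The two cases can be sketched as follows: an output neuron uses its directly available error signal, while a hidden neuron back-propagates the weighted local gradients of the layer to its right (sigmoid activation assumed, so φ′(v) = y(1 − y); all values illustrative):

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

# Case 1: neuron j is an output node -> ej(n) = dj(n) - yj(n) is available.
v_j, d_j = 0.4, 1.0
y_j = sigmoid(v_j)
delta_out = (d_j - y_j) * y_j * (1 - y_j)          # delta_j = ej * phi'(vj)

# Case 2: neuron j is a hidden node -> weight the deltas of the next layer.
delta_k = np.array([0.05, -0.02])                  # local gradients of neurons k
w_kj = np.array([0.7, -0.4])                       # weights from hidden j to each k
v_hidden = 0.1
y_h = sigmoid(v_hidden)
delta_hidden = y_h * (1 - y_h) * (delta_k @ w_kj)  # delta_j = phi'(vj) * sum_k delta_k * wkj
```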
Fig 4.4: Signal-flow graph highlighting the details of output neuron k connected to hidden neuron j.
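Putting the forward pass of Fig. 4.3 and the backward pass of Fig. 4.4 together, a complete on-line back-propagation loop can be sketched on the XOR problem. This is an illustrative sketch, not the text's reference implementation; the network size, learning rate, and epoch count are arbitrary choices:

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

# XOR: the classic problem a single-layer perceptron cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
D = np.array([0.0, 1.0, 1.0, 0.0])

rng = np.random.default_rng(1)
W1, b1 = rng.normal(0, 1, (4, 2)), np.zeros(4)   # 2 inputs -> 4 hidden
W2, b2 = rng.normal(0, 1, (1, 4)), np.zeros(1)   # 4 hidden -> 1 output
eta = 0.5

def mse(W1, b1, W2, b2):
    preds = sigmoid(W2 @ sigmoid(W1 @ X.T + b1[:, None]) + b2[:, None]).ravel()
    return float(np.mean((D - preds) ** 2))

before = mse(W1, b1, W2, b2)
for epoch in range(5000):
    for x, d in zip(X, D):
        # Forward pass.
        h = sigmoid(W1 @ x + b1)
        y = sigmoid(W2 @ h + b2)
        # Backward pass: local gradients (deltas) for the two cases.
        e = d - y
        delta_out = e * y * (1 - y)                   # output-node case
        delta_hid = h * (1 - h) * (W2.T @ delta_out)  # hidden-node case
        # Delta-rule corrections, applied pattern by pattern (on-line mode).
        W2 += eta * np.outer(delta_out, h); b2 += eta * delta_out
        W1 += eta * np.outer(delta_hid, x); b1 += eta * delta_hid
after = mse(W1, b1, W2, b2)
```

After training, the mean squared error over the four patterns has dropped from its initial value, and with these settings the outputs typically approach the XOR targets.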
Advantages of Back Propagation Algorithm:
It is fast, simple, and easy to program.
It has no parameters to tune (except for the number of inputs).
It represents a shift in mindset for the learning-system designer: instead of trying to hand-design a learning algorithm that is accurate over the entire input space, the designer lets the network learn the mapping from data.
It requires no prior knowledge about the function being learned, and so it is flexible.
Disadvantages of Back Propagation Algorithm:
The actual performance of back propagation on a particular problem clearly depends on the input data.
Back propagation can be sensitive to noisy data and outliers.
A fully matrix-based approach to back propagation over a mini-batch is needed for efficiency, which makes the implementation more involved.
Applications of Back Propagation Algorithm:
Mapping character strings into phonemes so they can be pronounced by a computer: a neural network is trained to pronounce each letter of a word in a sentence, given a window of the three letters before and the three letters after it.
In the field of Speech Recognition.
In the field of Character Recognition.
In the field of Face Recognition.
Thank You!