Homework 4 Neural Networks: Submitted To
NEURAL NETWORKS
SUBMITTED TO
By
Aasim Kamal
Raghul Vishnu
This homework explains the various conditions that can occur while training a neural
network. Three such conditions are discussed, namely:
Ideal training
Under training
Over training
For all the cases given above, the initial weight parameters and learning rate alpha are:
u0j = 0.5
uij = (-1)^(i+j)
v0k = -0.5
vjk = (-1)^(j+k)
alpha = 0.1
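For reference, the initialization above can be sketched in plain Python (the 2-input, 10-hidden-neuron, 4-output shape follows the MATLAB program later in this report; row 0 of each matrix holds the bias weights):

```python
# Sketch of the initial weight scheme: bias weights u0j = 0.5 and
# v0k = -0.5; the remaining weights alternate sign as (-1)^(i+j)
# and (-1)^(j+k), matching the MATLAB declarations below.

def init_weights(n_in, n_hidden, n_out):
    # u: (n_in + 1) x n_hidden input-to-hidden weights; row 0 = biases
    u = [[0.5] * n_hidden]
    u += [[(-1.0) ** (i + j) for j in range(1, n_hidden + 1)]
          for i in range(1, n_in + 1)]
    # v: (n_hidden + 1) x n_out hidden-to-output weights; row 0 = biases
    v = [[-0.5] * n_out]
    v += [[(-1.0) ** (j + k) for k in range(1, n_out + 1)]
          for j in range(1, n_hidden + 1)]
    return u, v

alpha = 0.1
u, v = init_weights(2, 10, 4)  # 2 inputs, 10 hidden neurons, 4 outputs
```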
Case 1:
Ideal Training: This is the case in which the error curves are flat and both lie below the
threshold. In this case no corrective action is needed, as the desired result has been achieved.
The output for this case is as follows:
Here the average errors of both the validation and training curves are less than the threshold
and the error curves are flat. Hence this condition is considered the ideal condition.
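The ideal-training check described above can be expressed as a small predicate (a sketch; the `slope_tol` flatness tolerance comparing the last two epochs is an assumed convention, not taken from this report):

```python
def is_ideal(train_err, val_err, threshold, slope_tol=1e-4):
    """Both error curves flat (negligible change over the last epoch)
    and both final averages below the threshold."""
    flat = (abs(train_err[-1] - train_err[-2]) < slope_tol and
            abs(val_err[-1] - val_err[-2]) < slope_tol)
    below = train_err[-1] < threshold and val_err[-1] < threshold
    return flat and below

# Example: both curves have leveled off below a threshold of 0.05
print(is_ideal([0.4, 0.1, 0.02, 0.02], [0.5, 0.12, 0.03, 0.03], 0.05))  # True
```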
Case 2:
Under Training: This is the case in which the validation error is greater than the threshold
error even though the error curves have become flat. When this happens, the problem can be
rectified by increasing the number of hidden neurons in the network.
The output of such a case is as follows:
In the above example the number of hidden neurons is 5. From the output it can be inferred
that the validation and training errors are higher than the threshold even though the error
curves have become flat.
This error can be rectified by increasing the number of hidden neurons; in this case the
number of hidden neurons has been increased from 5 to 10, thereby making the errors converge
below the threshold error limit.
In the above example the validation and training errors are less than the threshold error,
which was achieved by increasing the number of hidden neurons from 5 to 10. Thus the
undertraining of a neural network can be identified and rectified.
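Using the same flatness convention, the under-training condition can be flagged as follows (a sketch; the `slope_tol` tolerance is an assumption):

```python
def is_undertrained(train_err, val_err, threshold, slope_tol=1e-4):
    """Error curves flat but the final errors remain above the
    threshold. The remedy described above is to increase the number
    of hidden neurons (e.g. from 5 to 10) and retrain."""
    flat = (abs(train_err[-1] - train_err[-2]) < slope_tol and
            abs(val_err[-1] - val_err[-2]) < slope_tol)
    return flat and train_err[-1] > threshold and val_err[-1] > threshold
```

For example, with a threshold of 0.05, curves that level off at 0.20 and 0.25 would be flagged as under-trained.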
Case 3:
Over Training: This phenomenon occurs when the training error is less than the
threshold error but the validation error is greater than the threshold. When this occurs, it can
be corrected either by increasing the amount of data used for training the network or by
decreasing the number of hidden neurons.
The output of an overtraining case is as follows:
In the above example the number of hidden neurons is 20. It can be seen that the validation
error is greater than the threshold while the training error is smaller than the threshold.
This can be corrected either by increasing the amount of data used for training the system or
by decreasing the number of hidden neurons. In this case the number of hidden neurons has
been reduced from 20 to 10, which yields the following output:
In the above example the error curves are flat and both are below the threshold. This was
made possible by decreasing the number of hidden neurons from 20 to 10. Thus, by using this
method, over training can be prevented in a neural network.
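The over-training condition described above reduces to a simple comparison of the final average errors against the threshold (a sketch):

```python
def is_overtrained(train_err, val_err, threshold):
    """Training error below the threshold while the validation error
    stays above it: the network fits the training data but does not
    generalize. Remedies noted above: more training data, or fewer
    hidden neurons (e.g. 20 reduced to 10)."""
    return train_err[-1] < threshold and val_err[-1] > threshold

# Example with a threshold of 0.05: training converges to 0.02 but
# validation stalls at 0.12
print(is_overtrained([0.3, 0.1, 0.02], [0.35, 0.2, 0.12], 0.05))  # True
```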
MATLAB PROGRAM
% % % % % Program for 10 neurons % % % % %
clc;
clear all;
filename='miteredbendtrain.xlsx';
sheet = 1;
xlRange1 = 'A2:A101';
xlRange3 = 'B2:B101';
xlRange4 = 'C2:C101';
xlRange5 = 'D2:D101';
xlRange6 = 'E2:E101';
xlRange7 = 'F2:F101';
x=zeros(100,2);
d=zeros(100,4);
u=zeros(3,10);   % input-to-hidden weights: bias row + 2 inputs x 10 hidden neurons
% % % % % Declaration of the u and v weights
for i=1
for j=1:10
u(i,j)=0.5;
end
end
for i=2:3
for j=1:10
u(i,j)=(-1)^((i-1)+j);
end
end
v=zeros(11,4);   % hidden-to-output weights: bias row + 10 hidden neurons x 4 outputs
for i=1
for j=1:4
v(i,j)=-0.5;
end
end
for i=2:11
for j=1:4
v(i,j)=(-1)^((i-1)+j);
end
end
x1 = xlsread(filename,sheet,xlRange1);
x2 = xlsread(filename,sheet,xlRange3);
d1 = xlsread(filename,sheet,xlRange4);
d2 = xlsread(filename,sheet,xlRange5);
d3 = xlsread(filename,sheet,xlRange6);
d4 = xlsread(filename,sheet,xlRange7);
sheet = 1;
xlRange1 = 'A2:b101';
x = xlsread(filename,sheet,xlRange1);
% % % % % Variable declarations
gamma=zeros(100,10);    % hidden-layer inputs (training)
z=zeros(100,10);        % hidden-layer outputs (training)
y=zeros(100,4);         % network outputs (training)
gammaut=zeros(100,10);  % hidden-layer inputs (validation)
zut=zeros(100,10);      % hidden-layer outputs (validation)
yut=zeros(100,4);       % network outputs (validation)
error=zeros(100,1);     % per-pattern training error
errorval=zeros(200,1);  % per-pattern validation error
xlRange2 = 'C2:F101';
d = xlsread(filename,sheet,xlRange2);
aer=zeros(100,1);
ber=zeros(100,1);
cer=zeros(100,1);
der=zeros(100,1);
aval=zeros(200,1);
bval=zeros(200,1);
cval=zeros(200,1);      % used in the validation-error sum below
dval=zeros(200,1);
delu=zeros(3,10);       % weight updates for u (3x10, matching u)
delv=zeros(11,4);       % weight updates for v
delui=zeros(3,10);
delvi=zeros(11,4);
% NOTE: the forward-pass, back-propagation, and weight-update loops that
% should precede this point are missing from the listing; the fragment
% below closes the validation loop of each epoch.
errorval(k,1)=0.5*((aval(k,1)^2)+(bval(k,1)^2)+(cval(k,1)^2)+(dval(k,1)^2));
eavgut=eavgut+errorval(k);
end
eavgut=eavgut/200;
errut(epoch)=eavgut;
end
figure; hold on;
p1=plot(errut,'-b');   % validation error average per epoch
p2=plot(err,'-r');     % training error average per epoch
title('Error Average vs Epochs plots of Training & Validation Data');
xlabel('Epochs');
ylabel('Average Error');
legend('Validation','Training');
hold off;