Final PPT ANN
DEEP LEARNING
• Neural networks, also known as artificial neural networks
(ANNs) or simulated neural networks (SNNs), are a subset
of machine learning and are at the heart of deep
learning algorithms. Their name and structure are inspired by
the human brain, mimicking the way that biological neurons
signal to one another.
In this network, the information moves from the input nodes, through the
hidden nodes (if any) and to the output nodes.
The predicted value of the network is compared to the expected output, and an
error is calculated using a loss function. This error is then propagated back
through the whole network, one layer at a time, and the weights are updated in
proportion to how much they contributed to the error.
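The forward pass and the layer-by-layer error propagation described above can be sketched on a toy 1-input, 1-hidden-neuron, 1-output network. All weights, the learning rate, and the function names below are made up for illustration; they are not taken from the original presentation.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_step(x, target, w1, w2, lr=0.5):
    """One forward and one backward pass for a toy 1-1-1 network.
    (Illustrative only; real networks have many weights per layer.)"""
    # Forward: input node -> hidden node -> output node
    h = sigmoid(w1 * x)
    y = sigmoid(w2 * h)
    # Error calculated with a squared-error loss function
    err = 0.5 * (y - target) ** 2
    # Backward: propagate the error one layer at a time
    d_y = (y - target) * y * (1 - y)   # gradient at the output node
    d_h = d_y * w2 * h * (1 - h)       # gradient pushed back to the hidden node
    # Update each weight according to its contribution to the error
    w2 -= lr * d_y * h
    w1 -= lr * d_h * x
    return w1, w2, err

w1, w2 = 0.4, -0.3   # arbitrary starting weights
errs = []
for _ in range(200):
    w1, w2, e = train_step(1.0, 0.8, w1, w2)
    errs.append(e)
```

Repeating the step drives the error down, which is exactly the "compare, propagate, update" loop the slide describes.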
REPRESENTATION OF AN FFBP NETWORK CONSISTING OF TWO HIDDEN LAYERS WITH 15 AND 20 NEURONS EACH
What is an epoch?
An epoch is one complete pass of the entire training dataset through the network; training typically runs for many epochs.
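The distinction between an epoch (one full pass over the data) and a weight update (one per mini-batch) can be counted explicitly. The dataset size, batch size, and epoch count below are arbitrary illustrative numbers, not values from the presentation.

```python
data = list(range(10))            # 10 dummy training samples
batch_size = 5
epochs = 3

updates = 0
samples_seen = 0
for epoch in range(epochs):       # one epoch = one full pass over the data
    for i in range(0, len(data), batch_size):
        batch = data[i:i + batch_size]
        samples_seen += len(batch)
        updates += 1              # one weight update per mini-batch
```

Three epochs of two mini-batches each give six weight updates in total.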
The ideal way of splitting the data is a 70/30 split, with 70% of the data used
for training and the remaining 30% for testing, chosen randomly for better
training.
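A random 70/30 split as described above can be sketched as follows. The helper name and the fixed seed are our own choices for reproducibility, not part of the original workflow.

```python
import random

def split_70_30(samples, seed=0):
    """Shuffle randomly, then take 70% for training, 30% for testing.
    (Illustrative helper; not from any particular library.)"""
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    cut = int(0.7 * len(shuffled))
    return shuffled[:cut], shuffled[cut:]

train, test = split_70_30(list(range(100)))
```

Every sample lands in exactly one of the two sets, and the random shuffle keeps the training set representative of the whole dataset.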
Training and testing data variables are created and imported into the nntool
window. Test outputs are stored separately in an Excel sheet so they can be
compared with the simulation data to find the correlation coefficient (R).
The correlation coefficient (average R value) is then found for the network.
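The comparison of stored test outputs against simulated outputs boils down to computing the Pearson correlation coefficient R for each output, then averaging. The five measured/predicted values below are invented for illustration; they are not rows from the actual experiment.

```python
import math

def correlation_r(xs, ys):
    """Pearson correlation coefficient R between two equal-length series."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

measured  = [0.18, 0.21, 0.22, 0.25, 0.26]   # e.g. test outputs from the sheet
predicted = [0.17, 0.22, 0.21, 0.24, 0.27]   # e.g. network simulation outputs
r = correlation_r(measured, predicted)
```

An R value close to 1 indicates that the simulated outputs track the measured test outputs well; averaging R over all output variables gives the network-level figure the slide mentions.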
• Backpropagation (backward propagation) is an important mathematical tool for improving
the accuracy of predictions in data mining and machine learning. Essentially,
backpropagation is an algorithm for efficiently calculating the derivatives of the
error with respect to the network's weights.
• The physical layout of the Elman neural network is divided broadly into four layers: the
input layer, the hidden layer, the undertake layer (also known as the context layer), and
the output layer.
• Connections among the input layer, the hidden layer, and the output layer can be considered a feed-
forward network; this part is similar to a traditional multi-layer neural network.
• Besides the above three layers, there is another layer named the context layer. Its inputs come
from the outputs of the hidden layer: the context layer stores the hidden layer's output values
from the previous time step, which is why it is called the context layer. The purpose of the
context layer is to memorize the hidden layer's output. Because the network is trained by
backpropagation, the output of the hidden layer connects back to its own input via the delay
and memory of the context layer.
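The delay-and-memory role of the context layer can be sketched with a toy Elman cell: at each time step the hidden output is computed from the current input plus the previous hidden output held in the context layer. All weights and sizes below are arbitrary illustrative values, not from the original network.

```python
import math

def elman_step(x, context, w_in, w_ctx, w_out):
    """One time step of a toy Elman cell: 1 input, 2 hidden units, 1 output.
    The context layer feeds back the hidden output of the PREVIOUS step."""
    hidden = [
        math.tanh(w_in[i] * x +
                  sum(w_ctx[i][j] * context[j] for j in range(2)))
        for i in range(2)
    ]
    y = sum(w_out[i] * hidden[i] for i in range(2))
    return y, hidden        # this step's hidden output becomes the new context

w_in  = [0.5, -0.3]                   # input -> hidden weights (made up)
w_ctx = [[0.1, 0.2], [0.0, 0.4]]      # context -> hidden weights (made up)
w_out = [1.0, -1.0]                   # hidden -> output weights (made up)

context = [0.0, 0.0]                  # context layer starts empty
outputs = []
for x in [1.0, 0.5, -1.0]:
    y, context = elman_step(x, context, w_in, w_ctx, w_out)
    outputs.append(y)
```

Feeding the same input twice produces different outputs, because the context layer remembers the previous hidden state, which is exactly what distinguishes the Elman network from a plain feed-forward one.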
• nntool opens the Network/Data Manager window, which
allows you to import, create, use, and export neural
networks and data.
• Matlab for Cascade Forward Back-propagation
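The defining feature of a cascade-forward network (as opposed to a plain feed-forward one) is that later layers also receive direct connections from the input. A minimal Python sketch of that skip connection, with made-up weights and our own function name (the presentation itself used MATLAB's nntool):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def cascade_forward(x, w_ih, w_ho, w_io):
    """Toy cascade-forward pass: 1 input, 2 hidden units, 1 output.
    Unlike a plain feed-forward net, the output also receives a direct
    (cascaded) connection from the input, weighted by w_io."""
    hidden = [sigmoid(w * x) for w in w_ih]
    y = sum(w * h for w, h in zip(w_ho, hidden)) + w_io * x
    return y

y = cascade_forward(0.5, w_ih=[0.8, -0.4], w_ho=[0.6, 0.3], w_io=0.2)
```

Setting the input-to-output weight to zero recovers an ordinary feed-forward pass, which shows that the cascade connection is purely additive.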
LINEAR NORMALIZED DATA
S.No  Blend  Load  BTE  CO2  CO  SMOKE  HC  Pressure
(I/P: Blend and Load; O/P: BTE, CO2, CO, SMOKE, HC, and Pressure)
1 0.060302 0 0 0.181671 0.05914 0.063239 0.049219 0.178847
2 0.060302 0.08165 0.128755 0.215314 0.11828 0.112425 0.1125 0.194089
3 0.060302 0.163299 0.193133 0.215314 0.177421 0.140532 0.165234 0.207223
4 0.060302 0.244949 0.25751 0.248957 0.215055 0.196745 0.229687 0.215914
5 0.060302 0.326599 0.261534 0.255686 0.279572 0.274037 0.301172 0.222854
6 0.120605 0 0 0.174943 0.11828 0.105399 0.052734 0.17515
7 0.120605 0.08165 0.14485 0.195129 0.172044 0.126479 0.116016 0.191624
8 0.120605 0.163299 0.225322 0.201857 0.211471 0.161612 0.16875 0.205504
9 0.120605 0.244949 0.249463 0.222043 0.263443 0.196745 0.235547 0.215038
10 0.120605 0.326599 0.281652 0.228772 0.057348 0.295117 0.310547 0.216984
11 0.180907 0 0 0.168214 0.121865 0.098372 0.05625 0.174275
12 0.180907 0.08165 0.136802 0.174943 0.173836 0.147558 0.119531 0.189906
13 0.180907 0.163299 0.209227 0.201857 0.224016 0.175665 0.172265 0.204629
14 0.180907 0.244949 0.253084 0.215314 0.28674 0.203771 0.242578 0.212898
15 0.180907 0.326599 0.265558 0.222043 0.134409 0.302143 0.317578 0.215752
16 0.241209 0 0 0.161486 0.172044 0.119452 0.058594 0.173432
17 0.241209 0.08165 0.120708 0.154757 0.211471 0.133505 0.123047 0.18903
18 0.241209 0.163299 0.193133 0.1884 0.263443 0.168638 0.174609 0.203753
19 0.241209 0.244949 0.233369 0.208586 0.206095 0.238904 0.247265 0.209979
20 0.241209 0.326599 0.247532 0.222043 0.225808 0.316197 0.307031 0.213093
21 0.301511 0 0.112661 0.148029 0.147922 0.147558 0.062109 0.172556
22 0.301511 0.08165 0.177038 0.1413 0.245521 0.112425 0.127734 0.187182
23 0.301511 0.163299 0.249463 0.181671 0.188173 0.161612 0.180469 0.201613
24 0.301511 0.244949 0.227736 0.198762 0.116488 0.281064 0.251953 0.205958
25 0.301511 0.326599 0.27441 0.218679 0.318999 0.344303 0.3 0.208034
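Linear normalization of the kind shown in the table is typically a min-max scaling of each raw column. The exact scaling used for this dataset is not stated, so the sketch below is a generic min-max helper with invented raw values, mapping a column into [0, 1].

```python
def linear_normalize(values, lo=0.0, hi=1.0):
    """Min-max (linear) scaling of raw values into the range [lo, hi]."""
    vmin, vmax = min(values), max(values)
    span = vmax - vmin
    return [lo + (hi - lo) * (v - vmin) / span for v in values]

raw = [20.0, 35.0, 50.0, 65.0, 80.0]   # hypothetical raw load values
norm = linear_normalize(raw)
```

Scaling every input and output column to a common range keeps any one variable from dominating the network's weight updates during training.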