DECENTRALIZED LEARNING

(ASSIGNMENT # 03, SEMESTER FALL 2023)


Submission Date: 23 December, 2023
Submitted By:
Mazhar Abbas
20021519-054
Submitted To:
Dr. Zahid Iqbal
Course Code:
CS-411
Degree Program Title and Section:
BS-VII Computer Science (A)

Department of Computer Science

Question # 01: Perform Forward and Backward Propagation….
Forward Propagation:
Forward propagation is the process by which input data is passed through a neural network to
obtain predictions. Each layer computes a weighted sum of its inputs and applies an activation
function that introduces non-linearity. This continues layer by layer until the final output is
produced. In simple terms, forward propagation is the "thinking" phase of the neural network,
where it makes predictions from the given inputs.
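
As a minimal illustrative sketch (separate from the assignment code below), a single dense layer's forward pass is just a weighted sum plus bias pushed through an activation; the weights and inputs here are placeholder values, not the assignment's:

import numpy as np

def sigmoid(z):
    # Logistic activation: squashes any real value into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# Placeholder values: 2 inputs feeding a layer of 4 neurons
x = np.array([[5.0, 3.0]])   # input row vector, shape (1, 2)
W = np.full((2, 4), 0.5)     # illustrative weights, shape (2, 4)
b = np.ones(4)               # biases initialized to 1

# Weighted sum of the inputs followed by the non-linear activation
hidden = sigmoid(x @ W + b)  # layer output, shape (1, 4)
print(hidden)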

Backward Propagation:
Backward propagation is the learning phase of a neural network. It calculates the error between
the predicted output and the actual target, then propagates this error backward through the
network, adjusting the weights and biases so as to minimize it. Backward propagation uses the
chain rule from calculus to compute the gradients of the error with respect to the weights and
biases. In essence, backward propagation is the network's way of "learning from its mistakes" to
improve future predictions.
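
As a small illustrative sketch (a one-neuron example with made-up values, not the assignment's network), the chain rule turns the squared error into gradients for the weight and bias, which a gradient-descent step then uses:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One sigmoid neuron with made-up values: prediction = sigmoid(w*x + b)
x, w, b, target = 0.5, 0.8, 1.0, 0.0
prediction = sigmoid(w * x + b)

# Loss = 0.5 * (prediction - target)^2
# Chain rule: dLoss/dw = (prediction - target) * prediction * (1 - prediction) * x
delta = (prediction - target) * prediction * (1 - prediction)
grad_w = delta * x
grad_b = delta

# Gradient-descent update with a small learning rate
learning_rate = 0.01
w -= learning_rate * grad_w
b -= learning_rate * grad_b
print("updated w:", w, "updated b:", b)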

CODE

import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras import backend as K

# Set the learning rate
learning_rate = 0.01

# Input values
input_data = np.array([[5, 3]])

# Target value
target = np.array([[0]])

# Define the model
model = Sequential()

# Add the hidden layer with 4 neurons
model.add(Dense(4, input_dim=2, activation='sigmoid', use_bias=True,
                bias_initializer='ones'))

# Add the output layer with 1 neuron
model.add(Dense(1, activation='sigmoid', use_bias=True,
                bias_initializer='ones'))

# Set the initial weights (one row per hidden neuron)
hidden_layer_weights = np.array([[0.4, 0.1], [0.2, 0.6], [0.3, 0.5],
                                 [0.7, 0.8]])
hidden_layer_biases = np.array([1.0, 1.0, 1.0, 1.0])

output_layer_weights = np.array([[0.9], [0.5], [0.3], [0.4]])
output_layer_biases = np.array([1.0])

# Transpose the hidden weights to Keras's (input_dim, units) layout
model.layers[0].set_weights([hidden_layer_weights.T, hidden_layer_biases])

# Ensure the output layer weights have the shape Keras expects, (4, 1)
output_layer_weights_corrected = output_layer_weights.reshape((4, 1))
model.layers[1].set_weights([output_layer_weights_corrected,
                             output_layer_biases])

# Forward propagation
output = model.predict(input_data)
print(output)

# Calculate the loss using mean squared error (with the conventional 1/2 factor)
loss = 0.5 * np.mean((output - target) ** 2)

# Define a Keras function to get the output of the hidden layer
get_hidden_layer_output = K.function([model.input],
                                     [model.layers[0].output])
hidden_layer_output = get_hidden_layer_output([input_data])[0]

# Output-layer delta: error times the sigmoid derivative
delta_output = (output - target) * output * (1 - output)

# Transpose the hidden layer output to match the weight shape (4, 1)
transpose_hidden_output = hidden_layer_output.T

# Gradient-descent update of the output layer's weights and biases
model.layers[1].set_weights([output_layer_weights_corrected -
                             learning_rate * np.dot(transpose_hidden_output, delta_output),
                             output_layer_biases -
                             learning_rate * np.sum(delta_output, axis=0)])

# Hidden-layer delta via the chain rule (using the pre-update output weights,
# i.e. the weights that produced the forward-pass output)
delta_hidden = (np.dot(delta_output, output_layer_weights_corrected.T) *
                hidden_layer_output * (1 - hidden_layer_output))

# Gradient-descent update of the hidden layer's weights and biases
model.layers[0].set_weights([hidden_layer_weights.T -
                             learning_rate * np.dot(input_data.T, delta_hidden),
                             hidden_layer_biases -
                             learning_rate * np.sum(delta_hidden, axis=0)])

# Display updated weights and biases
for layer in model.layers:
    print(layer.get_weights())

# Display the output and loss computed before the weight update
print("Final Output:", output)
print("Final Loss:", loss)

# Forward propagation after backpropagation
updated_output = model.predict(input_data)

# Display updated output
print("Updated Output:", updated_output)

OUTPUT

1/1 [==============================] - 0s 55ms/step


[[0.9548363]]
[array([[0.39993647, 0.19997798, 0.2999891 , 0.69999915],
[0.09996188, 0.5999868 , 0.49999347, 0.7999995 ]], dtype=float32),
array([0.9999873, 0.9999956, 0.9999978, 0.9999998], dtype=float32)]
[array([[0.8996029 ],
[0.49959725],
[0.29959565],
[0.39958864]], dtype=float32), array([0.99958825], dtype=float32)]
Final Output: [[0.9548363]]
Final Loss: 0.4558561884504986
1/1 [==============================] - 0s 20ms/step
Updated Output: [[0.95474946]]
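
Note that a single gradient step with learning rate 0.01 changes the output only slightly (from 0.9548363 to 0.95474946), nudging the prediction toward the target of 0.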
