
ARTIFICIAL NEURAL NETWORK

LAB MID REPORT

Submitted to:
DR. TEHSEEN ZIA
Submitted by:
MOHSIN IDREES
SP21-BAI-014

BACHELOR OF ARTIFICIAL INTELLIGENCE

COMSATS UNIVERSITY ISLAMABAD


Neural Network
Initializing the Network:

- We start by defining a class called `NeuralNetwork` that represents our neural network.
- In this class, we set up the basic structure of our network, including the number of neurons in each layer and random initial weights and biases.
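The code images in the original report were not preserved in this copy, so below is a minimal sketch of what such an initializer might look like in NumPy; the 2-4-1 layer sizes and the 0.5 weight scale are illustrative assumptions, not values taken from the original code.

```python
import numpy as np

class NeuralNetwork:
    def __init__(self, input_size, hidden_size, output_size):
        # Small random weights and zero biases for each layer
        self.W1 = np.random.randn(input_size, hidden_size) * 0.5
        self.b1 = np.zeros((1, hidden_size))
        self.W2 = np.random.randn(hidden_size, output_size) * 0.5
        self.b2 = np.zeros((1, output_size))

# A 2-input, 4-hidden-neuron, 1-output network (sizes are assumed)
net = NeuralNetwork(2, 4, 1)
print(net.W1.shape, net.W2.shape)  # (2, 4) (4, 1)
```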

Activation Functions:

- We define two functions: `sigmoid` and `sigmoid_derivative`.
- The sigmoid function introduces non-linearity to our network, enabling it to learn complex patterns in the data; its derivative is used later during backpropagation.
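A sketch of these two helpers (the exact definitions in the original snippets may differ slightly, e.g. the derivative may take the raw input instead of the already-activated value):

```python
import numpy as np

def sigmoid(x):
    # Squashes any real number into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(s):
    # Expects s = sigmoid(x); the derivative is then s * (1 - s)
    return s * (1.0 - s)

print(sigmoid(0.0))             # 0.5
print(sigmoid_derivative(0.5))  # 0.25
```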

Forward Propagation:

- The `forward` method calculates the output of the neural network given an input.
- It does this by multiplying the input data with the weights, adding biases, and passing
the result through the sigmoid activation function.
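The forward pass described above might look like the following for a two-layer network; the function form and the all-zero demo weights are assumptions for illustration:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(X, W1, b1, W2, b2):
    # Hidden layer: weighted sum plus bias, passed through sigmoid
    hidden = sigmoid(X @ W1 + b1)
    # Output layer: same pattern applied to the hidden activations
    return sigmoid(hidden @ W2 + b2)

# Demo with all-zero parameters: every pre-activation is 0, so the
# final output is sigmoid(0) = 0.5
X = np.array([[0.0, 1.0]])
W1 = np.zeros((2, 4)); b1 = np.zeros((1, 4))
W2 = np.zeros((4, 1)); b2 = np.zeros((1, 1))
print(forward(X, W1, b1, W2, b2))  # [[0.5]]
```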

Backpropagation:

- The `backward` method updates the weights and biases of the network based on the
difference between the predicted output and the actual target.
- This process is called backpropagation and is crucial for the network to learn from its
mistakes and improve its predictions.
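A hedged sketch of such a `backward` step for a two-layer sigmoid network trained with mean-squared error; the learning rate, layer sizes, and random seed here are assumptions, not the report's actual values:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def backward(X, y, W1, b1, W2, b2, lr=0.1):
    # Forward pass, caching the activations needed for the gradients
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Output error scaled by the sigmoid derivative
    d_output = (output - y) * output * (1 - output)
    # Propagate the error back through W2 to the hidden layer
    d_hidden = (d_output @ W2.T) * hidden * (1 - hidden)

    # In-place gradient-descent updates
    W2 -= lr * hidden.T @ d_output
    b2 -= lr * d_output.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_hidden
    b1 -= lr * d_hidden.sum(axis=0, keepdims=True)

# Tiny sanity run on XOR-shaped data: the loss should shrink
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))

def mse():
    return np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2)

before = mse()
for _ in range(100):
    backward(X, y, W1, b1, W2, b2)
print(before, "->", mse())
```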

Training the Network:

- The `train` method iterates over the training data for a set number of epochs (full passes through the dataset).
- During each epoch, it feeds the input data forward through the network, calculates the
loss (difference between predicted and actual output), and adjusts the weights and
biases accordingly to minimize this loss.
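The training loop described above, sketched end to end on the XOR data implied by the report's test output; the hidden size (4), learning rate (0.5), and seed are assumed, so the printed losses will not match the report's numbers exactly:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))
lr, epochs = 0.5, 1000
losses = []

for epoch in range(1, epochs + 1):
    # Forward pass
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)
    losses.append(np.mean((output - y) ** 2))  # mean squared error

    # Backward pass and parameter updates
    d_out = (output - y) * output * (1 - output)
    d_hid = (d_out @ W2.T) * hidden * (1 - hidden)
    W2 -= lr * hidden.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_hid
    b1 -= lr * d_hid.sum(axis=0, keepdims=True)

    if epoch % 100 == 0:
        print(f"Epoch {epoch}/{epochs}, Loss: {losses[-1]}")
```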

Visualizing Training Progress:

- We keep track of the training loss over epochs and plot it using Matplotlib.
- This visualization helps us understand how well our network is learning and whether
it's improving over time.
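The plotting step might look like the following; here a synthetic decaying curve stands in for the real `losses` history collected during training, and the headless `Agg` backend plus the `training_loss.png` filename are assumptions for illustration:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display window needed
import matplotlib.pyplot as plt

# Assume `losses` was appended to once per epoch during training;
# a synthetic decaying curve stands in for the real history here.
losses = (0.25 * np.exp(-np.linspace(0.0, 3.0, 1000))).tolist()

plt.figure()
plt.plot(range(1, len(losses) + 1), losses)
plt.xlabel("Epoch")
plt.ylabel("Training loss")
plt.title("Training loss over epochs")
plt.savefig("training_loss.png")
```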

Testing the Trained Model:

- After training, we test the model by feeding it new input data.
- We observe the predictions made by the network and compare them with the actual target values to evaluate its performance.
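A sketch of this test phase: train briefly on XOR, then run each input forward and print the prediction next to its target. The architecture, learning rate, and seed are assumed, so the predicted values will differ from the report's output:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))

def predict(inputs):
    return sigmoid(sigmoid(inputs @ W1 + b1) @ W2 + b2)

loss_before = np.mean((predict(X) - y) ** 2)

for _ in range(1000):  # condensed version of the training loop
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)
    d_out = (output - y) * output * (1 - output)
    d_hid = (d_out @ W2.T) * hidden * (1 - hidden)
    W2 -= 0.5 * hidden.T @ d_out
    b2 -= 0.5 * d_out.sum(axis=0, keepdims=True)
    W1 -= 0.5 * X.T @ d_hid
    b1 -= 0.5 * d_hid.sum(axis=0, keepdims=True)

loss_after = np.mean((predict(X) - y) ** 2)

# Test phase: compare each prediction with its target
print("Predictions after training:")
for xi, ti in zip(X, y):
    print(f"Input: {xi.astype(int)}, Target: {ti.astype(int)}, "
          f"Predicted: {predict(xi.reshape(1, -1))}")
```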

CODE SNIPPETS:

OUTPUTS:
Epoch 100/1000, Loss: 0.24443114942704236
Epoch 200/1000, Loss: 0.23692622556266935
Epoch 300/1000, Loss: 0.22916462724112063
Epoch 400/1000, Loss: 0.22059898348784246
Epoch 500/1000, Loss: 0.21089334083613986
Epoch 600/1000, Loss: 0.1998681501311817
Epoch 700/1000, Loss: 0.1875072606150682
Epoch 800/1000, Loss: 0.17394554503806584
Epoch 900/1000, Loss: 0.15941322474404007
Epoch 1000/1000, Loss: 0.14419773597660632

Predictions after training:
Input: [0 0], Target: [0], Predicted: [[0.29555451]]
Input: [0 1], Target: [1], Predicted: [[0.5686631]]
Input: [1 0], Target: [1], Predicted: [[0.6739878]]
Input: [1 1], Target: [0], Predicted: [[0.44396284]]
