
Neural Network Assignment

Made by: Aditya Jain

B.Tech (C.S.E.), 7th Sem

Q.1 Mention the linear and nonlinear activation functions used in Artificial Neural Networks.
Ans: Linear Function
Equation: The linear function has the equation of a straight line, i.e. y = x.
No matter how many layers we have, if all of them are linear in nature, the final activation of the last layer is nothing but a linear function of the input to the first layer.
Range: -inf to +inf
Uses: The linear activation function is used in just one place, i.e. the output layer.
Issues: If we differentiate a linear function, the result no longer depends on the input "x" and becomes a constant, so it cannot introduce any non-linearity or new behaviour to our algorithm.
Graph: a straight line y = x through the origin.

Non-Linear Function
Non-linear activation functions are essential in artificial neural networks to introduce non-linearity into the model, allowing them to capture complex patterns and relationships in data. Here are some common non-linear activation functions used in artificial neural networks (a short code sketch of each follows the list):
Sigmoid Activation Function (Logistic Function):
Output Range: (0, 1)
Characteristics: S-shaped curve, squashes input to the range (0,1).
Historically used but less common in deep networks due to vanishing
gradient problems.
Hyperbolic Tangent (Tanh) Activation Function:
Output Range: (-1, 1)
Characteristics: Similar to the sigmoid but centred at 0. It squashes input to the range (-1, 1). Also suffers from vanishing gradient problems.
Rectified Linear Unit (ReLU):
Output Range: [0, ∞)
Characteristics: Outputs zero for negative inputs and passes positive inputs directly. Widely used due to its simplicity and effectiveness in training deep networks.
Leaky Rectified Linear Unit (Leaky ReLU):
Output Range: (-∞, ∞)
Characteristics: Similar to ReLU but allows a small, non-zero
gradient for negative inputs, preventing some neurons from
becoming "dead" during training.
Parametric Rectified Linear Unit (PReLU):
Output Range: (-∞, ∞)
Characteristics: Similar to Leaky ReLU, but the slope parameter α
is learned during training.
Exponential Linear Unit (ELU):
Output Range: (-∞, ∞)
Characteristics: Smooth for negative inputs, saturating towards a negative constant for very negative values; this helps with vanishing gradient issues and sometimes improves training.
Swish Activation Function:
Output Range: (-∞, ∞)
Characteristics: Combines elements of sigmoid and ReLU, showing
promising results in some applications.
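As referenced above, here is a short illustrative sketch of these functions in NumPy (not part of the original answer). The alpha values are assumed defaults chosen for demonstration; in PReLU, alpha would be learned during training.

import numpy as np

def linear(x):
    return x                                  # identity; range (-inf, inf)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))           # range (0, 1)

def tanh(x):
    return np.tanh(x)                         # range (-1, 1)

def relu(x):
    return np.maximum(0.0, x)                 # range [0, inf)

def leaky_relu(x, alpha=0.01):                # alpha: assumed small fixed slope
    return np.where(x >= 0, x, alpha * x)

def prelu(x, alpha):                          # alpha would be learned in training
    return np.where(x >= 0, x, alpha * x)

def elu(x, alpha=1.0):                        # smooth for x < 0
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

def swish(x):
    return x * sigmoid(x)                     # blends sigmoid gating with ReLU-like shape

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))        # [0.  0.  0.  0.5 2. ]
print(leaky_relu(x))  # [-0.02  -0.005  0.  0.5  2. ]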
Q.2 What is the role of activation functions in Neural Networks?
Ans: Activation functions play a crucial role in neural networks by introducing non-linearity into the model. They are applied to the output of each neuron (or node) in a neural network, and their primary purpose is to determine whether a neuron should be activated (i.e., "fire") or not based on the weighted sum of its inputs. Here's a more detailed explanation of their role:
1. Introducing Non-Linearity: Without activation functions, neural networks would behave as linear models, no matter how many layers they have. Linear models can only learn linear relationships between input and output. By applying non-linear activation functions, neural networks can model and learn complex, non-linear patterns in data, making them powerful tools for a wide range of tasks, including image recognition, natural language processing, and more. (A small sketch demonstrating this collapse of stacked linear layers follows this list.)
2. Learning Complex Features: Activation functions allow neurons to learn complex features and relationships in the data. They enable the network to capture and represent hierarchical features by stacking multiple layers of neurons with non-linear activations. Each layer learns increasingly abstract and complex features, which contributes to the network's ability to understand and generalize from the data.
3. Decision-Making: Activation functions determine the output of a neuron based on its input. Depending on the specific activation function used, a neuron can be more or less likely to activate (produce an output signal) in response to its inputs. This activation decision allows the network to make decisions and predictions based on the learned patterns in the data.
4. Gradient Computation: Activation functions also play a crucial role in backpropagation, which is the process by which neural networks learn from data. During training, gradients are computed with respect to the network's weights and biases to update them and improve the model's performance. Activation functions must be differentiable, as their derivatives are used to calculate these gradients efficiently.
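As referenced in point 1, here is a quick illustrative sketch (the weight shapes are assumed for the example) showing that two stacked linear layers with no activation in between reduce to a single linear map:

import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with no non-linear activation between them.
W1 = rng.normal(size=(4, 3))   # first layer weights
W2 = rng.normal(size=(2, 4))   # second layer weights
x = rng.normal(size=3)         # an arbitrary input vector

y_two_layers = W2 @ (W1 @ x)   # pass x through both linear layers

# The same result comes from one linear layer with weights W2 @ W1,
# so stacking linear layers adds no expressive power.
y_one_layer = (W2 @ W1) @ x

print(np.allclose(y_two_layers, y_one_layer))  # True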
Q.3 Discuss the activation functions and give a brief description of each.
Ans: An activation function defines the internal state of a neuron; it is used to map the input signal at a node of an ANN to an output signal. Basically, the weighted sum of the inputs becomes the input signal to the activation function (A/F), which gives one output signal.
Threshold Function (Binary Step Function):
It is used in single-layer networks to convert the net input into an output.
Range: {0, 1}
Activation:
f(x) = 1, if x >= 0
f(x) = 0, if x < 0 (0 is the threshold value)
Note: if the input is greater than or equal to the threshold, the output will be 1, or else 0.
Signum Function (Sign Function):
Range: {-1, 0, 1}
Example: If the input is 3, the output would be 1. If the input is -2, the output would be -1.
Activation:
f(x) = -1, if x < 0
f(x) = 0, if x = 0
f(x) = 1, if x > 0

Sigmoid Function:
This is used in backpropagation networks; its range is (0, 1).
Example: If the input is 0.5, the output would be approximately 0.622.
Formula:
f(x) = 1 / (1 + e^(-x))
Hyperbolic Tangent (Tanh) Function:
Optimization is easier here because the output is centred at zero. Range is (-1, 1).
Example: If the input is 0.3, the output would be approximately 0.291.
Formula:
f(x) = (e^x - e^(-x)) / (e^x + e^(-x))
Rectified Linear Unit (ReLU) Function:
Range: [0, infinity); "only positive results"
Example: If the input is 2.5, the output would be 2.5. If the input is -1.8, the output would be 0.
Activation:
f(x) = x, if x >= 0
f(x) = 0, if x < 0
Formula:
R(z) = max(0, z)

Step or Heaviside Function:
Range: {0, 1}
Example: If the input is 1.2, the output would be 1. If the input is -0.5, the output would be 0.
Activation:
f(x) = 1, if x > 0
f(x) = 0, if x <= 0
Identity Activation Function:
Example: If the input is 3.2, the output would be 3.2.
Activation:
f(x) = x, for all x
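As a small illustrative check (not part of the original answer), the worked examples above can be verified numerically; the helper names below are chosen only for this sketch:

import numpy as np

def binary_step(x):
    # threshold function: 1 for x >= 0, else 0
    return 1.0 if x >= 0 else 0.0

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

print(round(sigmoid(0.5), 3))   # 0.622  (sigmoid example)
print(round(np.tanh(0.3), 3))   # 0.291  (tanh example)
print(max(0.0, 2.5))            # 2.5    (ReLU, positive input)
print(max(0.0, -1.8))           # 0.0    (ReLU, negative input)
print(np.sign(-2))              # -1     (signum example)
print(binary_step(1.2))         # 1.0    (step example)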
Q4. Explain ANN architecture with its different types, with their diagrams.
Ans. ANN Architecture
Single Layer Feed Forward Network:
In this type of network, we have only two layers: the input layer and the output layer, but the input layer does not count because no computation is performed in it. The output layer is formed when different weights are applied to the input nodes and the cumulative effect per node is taken. After this, the neurons collectively compute the output signals.

Multilayer Feed Forward Network:
It additionally consists of one or more hidden layers, which allow for stronger computation (a minimal sketch of a forward pass appears at the end of this answer).
Multilayer Perceptron:
It is a fully connected network and uses non-linear activation functions. It consists of 3 or more layers and can separate data that is not linearly separable.
Feedback NN:
Here feedback is provided, and based on it the network adjusts its parameters. Part of the output is returned towards the first layer to update the parameters and correct the errors, i.e. "to minimize the error".

Fully Recurrent NN:
Here both the hidden and output layers are recurrent in nature. The computation is time-consuming, but it leads to a more accurate outcome using optimal weights.
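A minimal sketch of a multilayer feed-forward pass, as referenced above. The layer sizes (3 inputs, 4 hidden units, 2 outputs) and random weights are assumptions chosen purely for illustration:

import numpy as np

rng = np.random.default_rng(42)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Assumed layer sizes for illustration: 3 inputs -> 4 hidden -> 2 outputs.
W_hidden = rng.normal(size=(4, 3))
b_hidden = np.zeros(4)
W_out = rng.normal(size=(2, 4))
b_out = np.zeros(2)

def forward(x):
    # Hidden layer: weighted sum of inputs, then a non-linear activation.
    h = sigmoid(W_hidden @ x + b_hidden)
    # Output layer: weighted sum of hidden activations, then activation.
    return sigmoid(W_out @ h + b_out)

x = np.array([0.5, -1.0, 2.0])
print(forward(x))  # two output signals, each in (0, 1)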
Q5. Explain what an ANN is. Give the advantages and disadvantages of ANN.
Ans. The term "Artificial Neural Network" is derived from the biological neural networks that give the human brain its structure. Similar to the human brain, which has neurons interconnected to one another, artificial neural networks also have neurons that are interconnected to one another in the various layers of the network.

These neurons are known as nodes.

Artificial Neural Network primarily consists of three layers:

Input Layer:

As the name suggests, it accepts inputs in several different formats provided by the
programmer.
Hidden Layer:

The hidden layer sits in between the input and output layers. It performs all the calculations to find hidden features and patterns.

Output Layer:

The input goes through a series of transformations using the hidden layer, which
finally results in output that is conveyed using this layer.

Advantages:

Storing data on the entire network:

Unlike in traditional programming, where data is stored in a database, information in an ANN is stored across the whole network. The disappearance of a couple of pieces of data in one place doesn't prevent the network from working.

Capability to work with incomplete knowledge:

After training, an ANN may produce output even with inadequate data. The loss of performance here depends upon the significance of the missing data.

Having a memory distribution:

For an ANN to be able to adapt, it is important to select suitable examples and to train the network according to the desired output by demonstrating these examples to it. The success of the network is directly proportional to the chosen instances; if the problem cannot be presented to the network in all its aspects, the network can produce false output.

Having fault tolerance:

Corruption of one or more cells of an ANN does not prevent it from generating output, and this feature makes the network fault-tolerant.

Disadvantages

Assurance of proper network structure:

There is no particular guideline for determining the structure of artificial neural networks. An appropriate network structure is achieved through experience, trial, and error.

Unrecognized behavior of the network:

This is the most significant issue with ANNs. When an ANN produces a solution, it does not provide insight concerning why and how it arrived at it. This decreases trust in the network.

Hardware dependence:

By their structure, artificial neural networks require processors with parallel processing power. Their realization is therefore dependent on suitable hardware.

Difficulty of showing the issue to the network:

ANNs can work only with numerical data, so problems must be converted into numerical values before being introduced to the ANN. The representation mechanism chosen here will directly impact the performance of the network, and it relies on the user's abilities.
