All types of Numericals


written 3.1 years ago by teamques10

1 Answer
Q1. For the network shown in figure, calculate the net input to the neuron?

Solution: [x1, x2, x3] = [0.3, 0.5, 0.6]

[w1 , w2 , w3 ] = [0.2, 0.1, −0.3]

The net input can be calculated as,


yin = x1 w1 + x2 w2 + x3 w3

yin = 0.3 × 0.2 + 0.5 × 0.1 + 0.6 × (−0.3)

yin = 0.06 + 0.05 − 0.18 = −0.07
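The net-input calculation can be verified with a short script (a minimal sketch; the variable names are illustrative):

```python
# Net input to the neuron: y_in = x1*w1 + x2*w2 + x3*w3 (no bias).
x = [0.3, 0.5, 0.6]
w = [0.2, 0.1, -0.3]

y_in = sum(xi * wi for xi, wi in zip(x, w))
print(round(y_in, 2))  # -0.07
```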

Q2. Calculate the net input for the network shown in figure with bias included in the network?

Solution: [x1, x2, b] = [0.2, 0.6, 0.45]

[w1 , w2 ] = [0.3, 0.7]

The net input can be calculated as,

yin = b + x1 w1 + x2 w2

yin = 0.45 + 0.2 × 0.3 + 0.6 × 0.7

yin = 0.45 + 0.06 + 0.42 = 0.93
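The same check works when a bias term is included (again a minimal sketch with illustrative names):

```python
# Net input with bias: y_in = b + x1*w1 + x2*w2.
x = [0.2, 0.6]
w = [0.3, 0.7]
b = 0.45

y_in = b + sum(xi * wi for xi, wi in zip(x, w))
print(round(y_in, 2))  # 0.93
```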

Q3. Obtain the output of the neuron Y for the network shown in the figure using activation functions as

(i) Binary Sigmoidal and

(ii) Bipolar Sigmoidal.


Solution: The given network has three input neurons with bias and one output neuron. These form a
single-layer network.

The inputs are given as,

[x1 , x2 , x3 ] = [0.8, 0.6, 0.4]

The weights are,

[w1 , w2 , w3 ] = [0.1, 0.3, −0.2]

The net input can be calculated as,


yin = b + ∑_{i=1}^{n} (xi wi)

yin = 0.35 + 0.8 × 0.1 + 0.6 × 0.3 + 0.4 × (−0.2)

yin = 0.35 + 0.08 + 0.18 − 0.08 = 0.53

(i) For Binary Sigmoidal Function,


y = f(yin) = 1 / (1 + e^(−yin)) = 1 / (1 + e^(−0.53)) = 0.629

(ii) For Bipolar Sigmoidal activation function,


y = f(yin) = 2 / (1 + e^(−yin)) − 1 = (1 − e^(−yin)) / (1 + e^(−yin))

y = (1 − e^(−0.53)) / (1 + e^(−0.53)) = 0.259
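Both activation values can be verified numerically (a minimal sketch using the standard sigmoid formulas):

```python
import math

y_in = 0.53  # net input computed above

# Binary sigmoid: f(x) = 1 / (1 + e^(-x))
binary = 1 / (1 + math.exp(-y_in))

# Bipolar sigmoid: f(x) = 2 / (1 + e^(-x)) - 1
bipolar = 2 / (1 + math.exp(-y_in)) - 1

print(round(binary, 3), round(bipolar, 3))  # 0.629 0.259
```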

Q4. Implement AND function using McCulloch-Pitts Neuron (take binary data).

Solution: Truth table for AND is

In a McCulloch-Pitts neuron only analysis is performed; hence, assume the weights to be w1 = w2 = 1.

The network architecture is


With these assumed weights the net input is calculated for four inputs,

(i) (1, 1) − y in = x1 w1 + x2 w2 = 1 × 1 + 1 × 1 = 2

(ii) (1, 0) − y in = x1 w1 + x2 w2 = 1 × 1 + 0 × 1 = 1

(iii) (0, 1) − yin = x1 w1 + x2 w2 = 0 × 1 + 1 × 1 = 1

(iv) (0, 0) − y in = x1 w1 + x2 w2 = 0 × 1 + 0 × 1 = 0

For the AND function, the output is high only when both inputs are high; for that input pattern the net
input is 2.

Hence, based on this the threshold value is set, i.e., if the net input is greater than or equal to
2 then the neuron fires, else it does not fire.

So, the threshold value is set to 2 (θ = 2). This can be obtained by,

θ ≥ nw − p

Here, n = 2, w = 1 (excitatory) and p = 0 (inhibitory)

∴ θ ≥ 2 × 1 − 0

⇒ θ ≥ 2

The output of neuron Y can be written as,

1, if yin ≥ 2
y = f (yin ) = {
0, if yin < 2
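The AND neuron above can be checked with a small script (`mp_neuron` is an illustrative helper written for this sketch, not a library function):

```python
def mp_neuron(inputs, weights, theta):
    """McCulloch-Pitts neuron: outputs 1 iff the net input reaches the threshold."""
    y_in = sum(x * w for x, w in zip(inputs, weights))
    return 1 if y_in >= theta else 0

# AND: w1 = w2 = 1, theta = 2 -- fires only for (1, 1).
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print((x1, x2), mp_neuron([x1, x2], [1, 1], 2))
```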

Q5. Use McCulloch-Pitts Neuron to implement AND NOT function (take binary data representation).

Solution: Truth table for AND NOT is


The given function gives an output only when x1 = 1 and x2 = 0. The weights have to be decided
only after analysis.

The network architecture is

Case 1: Assume both weights to be excitatory, i.e., w1 = w2 = 1,

θ ≥ nw − p

θ ≥ 2 × 1 − 0 ⇒ θ ≥ 2

The net input,

(i) (1, 1) − y in = x1 w1 + x2 w2 = 1 × 1 + 1 × 1 = 2

(ii) (1, 0) − y in = x1 w1 + x2 w2 = 1 × 1 + 0 × 1 = 1

(iii) (0, 1) − yin = x1 w1 + x2 w2 = 0 × 1 + 1 × 1 = 1

(iv) (0, 0) − y in = x1 w1 + x2 w2 = 0 × 1 + 0 × 1 = 0

From the calculated net inputs, it is not possible to fire the neuron for input (1, 0) alone, since input (0, 1) produces the same net input.

Case 2:

Assume one weight as excitatory and another one as inhibitory,

i.e., w1 = 1, w2 = −1

The net input,

(i) (1, 1) − y in = x1 w1 + x2 w2 = 1 × 1 + 1 × (−1) = 0


(ii) (1, 0) − y in = x1 w1 + x2 w2 = 1 × 1 + 0 × (−1) = 1

(iii) (0, 1) − yin = x1 w1 + x2 w2 = 0 × 1 + 1 × (−1) = −1

(iv) (0, 0) − y in = x1 w1 + x2 w2 = 0 × 1 + 0 × (−1) = 0

From these net inputs it is possible to conclude that the neuron fires only for input (1, 0) when the
threshold is fixed as θ ≥ 1.

Thus, w1 = 1, w2 = −1; θ ≥ 1

The value of θ is calculated as,

θ ≥ nw − p

θ ≥ 2 × 1 − 1

θ ≥ 1

The output of the neuron Y can be written as,

1, if yin ≥ 1
y = f (yin ) = {
0, if yin < 1
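The AND NOT neuron derived above can be sketched as follows (an illustrative check, not a library routine):

```python
def and_not(x1, x2):
    # w1 = 1 (excitatory), w2 = -1 (inhibitory), threshold theta = 1
    y_in = x1 * 1 + x2 * (-1)
    return 1 if y_in >= 1 else 0

# Fires only for (1, 0), matching the AND NOT truth table.
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print((x1, x2), and_not(x1, x2))
```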

Q6. Implement XOR function using M-P neuron. (consider Binary Data)

Solution: The truth table for XOR function is computed as,

In this case, the output is "ON" only for an odd number of 1's; for the rest it is "OFF". The XOR function
cannot be represented by a single logic function; it is represented as

y = x1·x̄2 + x̄1·x2

y = z1 + z2

where z1 = x1·x̄2 is the first function, z2 = x̄1·x2 is the second function, and y = z1 + z2 is the third function.

A single-layer net is not sufficient to represent this function; an intermediate layer is required.


Neural Network for XOR function

First Function: z1 = x1·x̄2

Case 1: Assume both the weights are excitatory,

i.e., w11 = 1, w21 = 1

yin = x1 w11 + x2 w21

(1, 1) − yin = 1 × 1 + 1 × 1 = 2

(1, 0) − yin = 1 × 1 + 0 × 1 = 1

(0, 1) − yin = 0 × 1 + 1 × 1 = 1

(0, 0) − yin = 0 × 1 + 0 × 1 = 0

Truth table for z1 = x1·x̄2

Hence, it is not possible to obtain the function z1 using these weights.

Case 2: Consider one weight excitatory and another weight inhibitory

i.e., w11 = 1, w21 = −1

yin = x1 w11 + x2 w21

(1, 1) − yin = 1 × 1 + 1 × (−1) = 0

(1, 0) − yin = 1 × 1 + 0 × (−1) = 1

https://www.ques10.com/p/39252/all-types-of-numericals-1/ 7/11
3/23/22, 9:14 AM All types of Numericals

(0, 1) − yin = 0 × 1 + 1 × (−1) = −1

(0, 0) − yin = 0 × 1 + 0 × (−1) = 0

For this weight, it is possible to get the desired output. Hence,

w11 = 1, w21 = −1

∴ θ ≥ nw − p

⇒ θ ≥ 2 × 1 − 1

θ ≥ 1

This is for the z1 neuron.
Second Function: z2 = x̄1·x2

The truth table is as shown below,

Case 1: Assume both the weights are excitatory,

i.e., w12 = 1, w22 = 1

yin = x1 w12 + x2 w22

(1, 1) − yin = 1 × 1 + 1 × 1 = 2

(1, 0) − yin = 1 × 1 + 0 × 1 = 1

(0, 1) − yin = 0 × 1 + 1 × 1 = 1

(0, 0) − yin = 0 × 1 + 0 × 1 = 0

Hence, with these weights it is not possible to obtain the function z2.

Case 2: Consider one weight excitatory and another weight inhibitory

i.e., w12 = −1, w22 = 1

yin = x1 w12 + x2 w22

(1, 1) − yin = 1 × (−1) + 1 × 1 = 0

(1, 0) − yin = 1 × (−1) + 0 × 1 = −1


(0, 1) − yin = 0 × (−1) + 1 × 1 = 1

(0, 0) − yin = 0 × (−1) + 0 × 1 = 0

For this weight, it is possible to get the desired output. Hence,

w12 = −1, w22 = 1

∴ θ ≥ nw − p

⇒ θ ≥ 2 × 1 − 1

θ ≥ 1

This is for the z2 neuron.

Third Function: y = z1 OR z2

The truth table is

Here the net input is calculated as,

yin = z1 v1 + z2 v2

Case 1: Assume both the weights excitatory


i.e., v1 = v2 = 1

The net input,

(0, 0) − (z1, z2) = (0, 0), yin = 0 × 1 + 0 × 1 = 0

(0, 1) − (z1, z2) = (0, 1), yin = 0 × 1 + 1 × 1 = 1

(1, 0) − (z1, z2) = (1, 0), yin = 1 × 1 + 0 × 1 = 1

(1, 1) − (z1, z2) = (0, 0), yin = 0 × 1 + 0 × 1 = 0

Therefore, the neuron Y fires when yin ≥ 1, so the threshold is set as θ ≥ 1. Since the combination
(z1, z2) = (1, 1) never occurs for XOR inputs, this threshold gives the desired output.

Thus, the weights obtained for the XOR function are as follows:

w11 = w22 = 1 (excitatory)

w12 = w21 = −1 (inhibitory)

v1 = v2 = 1
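The full two-layer XOR net derived above can be sketched end to end (the `fire` helper and function names are illustrative):

```python
def fire(y_in, theta):
    """Threshold activation: 1 iff net input >= theta."""
    return 1 if y_in >= theta else 0

def xor(x1, x2):
    z1 = fire(x1 * 1 + x2 * (-1), 1)  # z1 = x1 AND NOT x2 (w11 = 1, w21 = -1)
    z2 = fire(x1 * (-1) + x2 * 1, 1)  # z2 = x2 AND NOT x1 (w12 = -1, w22 = 1)
    return fire(z1 * 1 + z2 * 1, 1)   # y = z1 OR z2 (v1 = v2 = 1, theta = 1)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print((x1, x2), xor(x1, x2))  # matches the XOR truth table
```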
