Neuro-Fuzzy Techniques

Dr. Poonam Sharma


Syllabus
Neural Networks: History, overview of the biological neuro-system, mathematical
models of neurons, ANN architecture, learning rules, learning paradigms
(supervised, unsupervised, and reinforcement learning), learning tasks, ANN
training algorithms (single-layer perceptron, multi-layer perceptron, self-
organizing map), applications of artificial neural networks.

Introduction to fuzzy sets, operations on fuzzy sets, fuzzy relations, fuzzy
implication, approximate reasoning, fuzzy rule-based systems, fuzzy reasoning
schemes, fuzzy logic controllers.

Implementing fuzzy IF-THEN rules by trainable neural nets, fuzzy neurons,
hybrid neural networks, neuro-fuzzy classifiers.

9/14/2021 NFT 2
List of Books
Neuro-Fuzzy and Soft Computing: A Computational Approach to Learning
and Machine Intelligence; J.-S. Roger Jang, Chuen-Tsai Sun, Eiji Mizutani; PHI.

Soft Computing and Its Applications; R.A. Aliev, R.R. Aliev.

Neural Networks: A Comprehensive Foundation; Simon Haykin; PHI.

Elements of Artificial Neural Networks; Kishan Mehrotra, S. Ranka;
Penram International Publishing (India).

Fuzzy Logic with Engineering Applications; Timothy Ross; McGraw-Hill.

Neural Networks and Fuzzy Systems; Bart Kosko; PHI.

Some real ANN usages
● Character recognition
● Image compression
● Classification of neurodegenerative diseases
● Sentiment analysis
● Forecasting
Artificial neural network

● Biologically inspired
● A network of simple processing elements
A simple neuron

● Takes the inputs.
● Calculates the summation of the inputs.
● Compares it with the threshold set during the learning stage.
The Neuron Diagram

[Figure: input values x1, x2, …, xm enter through weights w1, w2, …, wm and a
summing function; together with the bias b this gives the induced field v,
which is passed through an activation function to produce the output y.]
Neuron
● The neuron is the basic information processing unit of a NN. It consists of:
1. A set of links, describing the neuron inputs, with weights W1, W2, …, Wm.
2. An adder function (linear combiner) for computing the weighted sum of the
   inputs (real numbers):
   u = W1 x1 + W2 x2 + … + Wm xm
3. An activation function for limiting the amplitude of the neuron output.
   Here ‘b’ denotes bias.
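The three components above can be sketched in a few lines of Python. This is a minimal illustration, not code from the slides; the function names and the threshold activation are assumptions.

```python
# Minimal sketch of the neuron model described above (illustrative names).
def neuron(x, w, b, activation):
    # 1) weighted links, 2) adder (linear combiner), 3) activation function
    v = sum(xi * wi for xi, wi in zip(x, w)) + b   # induced field v = u + b
    return activation(v)

step = lambda v: 1 if v >= 0 else 0   # an assumed threshold activation

print(neuron([1, 1], [2, 2], -2, step))
print(neuron([0, 0], [2, 2], -2, step))
```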
Bias of a Neuron

● The bias b has the effect of applying a transformation to the weighted
  sum u:
  v = u + b
● The bias is an external parameter of the neuron. It can be modeled by
  adding an extra input.
● v is called the induced field of the neuron.
Activation Functions
• Controls whether the unit is “active” or “inactive”
• Threshold function: outputs 1 when the input is positive and 0 otherwise
• Sigmoid function: f(x) = 1 / (1 + e^(-x))
Neuron Models
● The choice of activation function determines the neuron model.
Examples:
● step function
● ramp function
● sigmoid function (with slope and offset parameters)
Step Function
[Figure: step activation function]

Ramp Function
[Figure: ramp activation function, linear between the points c and d]

Sigmoid function
[Figure: sigmoid activation function]
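The three activation functions named above can be written out directly. This is a sketch; the parameter names (`theta`, `c`, `d`) are assumed, not taken from the slides.

```python
import math

def step(x, theta=0.0):
    # outputs 1 at or above the threshold theta, else 0
    return 1.0 if x >= theta else 0.0

def ramp(x, c=0.0, d=1.0):
    # 0 below c, 1 above d, linear in between
    if x <= c:
        return 0.0
    if x >= d:
        return 1.0
    return (x - c) / (d - c)

def sigmoid(x):
    # f(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

print(step(0.3), ramp(0.3), sigmoid(0.0))  # 1.0 0.3 0.5
```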
McCulloch-Pitts Model

AND Model
[Figure: x1 and x2 each connect to Y with weight 1; Y = F(yin)]

OR Model
[Figure: x1 and x2 each connect to Y with weight 2]

NOT Model
[Figure: x1 connects to Y; the threshold realizes negation]
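The McCulloch-Pitts models above can be simulated with binary inputs, fixed weights, and a firing threshold. The threshold values below (and the inhibitory weight for NOT) are assumed settings consistent with the weights shown, not numbers from the slides.

```python
# McCulloch-Pitts neuron: fires (outputs 1) when the weighted input sum
# reaches the threshold.
def mp_neuron(inputs, weights, threshold):
    yin = sum(i * w for i, w in zip(inputs, weights))
    return 1 if yin >= threshold else 0

AND = lambda x1, x2: mp_neuron([x1, x2], [1, 1], threshold=2)
OR  = lambda x1, x2: mp_neuron([x1, x2], [2, 2], threshold=2)
NOT = lambda x1:     mp_neuron([x1], [-1], threshold=0)  # assumed inhibitory weight

assert [AND(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 0, 0, 1]
assert [OR(a, b)  for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 1, 1, 1]
assert [NOT(a) for a in (0, 1)] == [1, 0]
```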
[Figures: threshold derivations and truth tables for the McCulloch-Pitts AND,
OR, and NOT models.]

XOR Model
[Figures: decision regions in the input plane — AND and OR are linearly
separable by a single line, while XOR is not.]
XOR Model

[Figure: XOR realized with a hidden layer — inputs X1 and X2 feed hidden
units Z1 and Z2 with weights +1 and -1, and Z1, Z2 feed the output unit Y.]
Hebb Algorithm
Step 1: Initialize all weights and bias to zero.
Step 2: For each training vector and target output pair (s, t), perform
steps 3-6.
Step 3: Set activations for input units with the input vector: xi = si.
Step 4: Set activation for the output unit: y = t.
Step 5: Adjust the weights by applying the Hebb rule:
  wi(new) = wi(old) + xi y
Step 6: Adjust the bias:
  b(new) = b(old) + y

If the net input is +ve the output is +ve; if the net input is –ve the
output is –ve.
AND Model

X1  X2  b   y
 1   1  1   1
 1  -1  1  -1
-1   1  1  -1
-1  -1  1  -1

W1 = 0, W2 = 0, b = 0
X1  X2  b   y  ΔW1 ΔW2 Δb   W1  W2  b
                             0   0   0
 1   1  1   1    1   1   1   1   1   1
 1  -1  1  -1   -1   1  -1   0   2   0
-1   1  1  -1    1  -1  -1   1   1  -1
-1  -1  1  -1    1   1  -1   2   2  -2

[Figure: resulting network — x1 and x2 connect to y with weights 2 and 2
and bias -2.]
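The table above can be reproduced directly from the Hebb rule, w_i += x_i·y and b += y:

```python
# Hebb training on the bipolar AND data from the table above.
X = [(1, 1), (1, -1), (-1, 1), (-1, -1)]
T = [1, -1, -1, -1]

w, b = [0, 0], 0
for (x1, x2), y in zip(X, T):
    w[0] += x1 * y      # Hebb rule: w_i(new) = w_i(old) + x_i * y
    w[1] += x2 * y
    b += y              # b(new) = b(old) + y

print(w, b)  # [2, 2] -2, matching the last row of the table
```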
Example 2:

X1  X2  b   y
 1   1  1  -1
 1  -1  1   1
-1   1  1   1
-1  -1  1  -1

X1  X2  b   y  ΔW1 ΔW2 Δb   W1  W2  b
                             0   0   0
 1   1  1  -1   -1  -1  -1  -1  -1  -1
 1  -1  1   1    1  -1   1   0  -2   0
-1   1  1   1   -1   1   1  -1  -1   1
-1  -1  1  -1    1   1  -1   0   0   0
Example 3:

X1  X2  X3  X4  B   y
 1   1  -1   1  1   1
-1   1  -1   1  1  -1
 1  -1   1  -1  1   1
-1  -1   1   1  1  -1

X1  X2  X3  X4  B   y  ΔW1 ΔW2 ΔW3 ΔW4 Δb   W1  W2  W3  W4  b
                                              0   0   0   0  0
 1   1  -1   1  1   1    1   1  -1   1   1    1   1  -1   1  1
-1   1  -1   1  1  -1    1  -1   1  -1  -1    2   0   0   0  0
 1  -1   1  -1  1   1    1  -1   1  -1   1    3  -1   1  -1  1
-1  -1   1   1  1  -1    1   1  -1  -1  -1    4   0   0  -2  0
Example 4: classify two 3×3 pixel patterns (* = 1, blank = -1):

* * *      *
  *        *
* * *      * * *

X1 X2 X3 X4 X5 X6 X7 X8 X9   Y
 1  1  1 -1  1 -1  1  1  1   1
 1 -1 -1  1 -1 -1  1  1  1  -1

ΔW1 ΔW2 ΔW3 ΔW4 ΔW5 ΔW6 ΔW7 ΔW8 ΔW9  Δb
  1   1   1  -1   1  -1   1   1   1    1
 -1   1   1  -1   1   1  -1  -1  -1   -1

W1 W2 W3 W4 W5 W6 W7 W8 W9   b
 0  0  0  0  0  0  0  0  0   0
 1  1  1 -1  1 -1  1  1  1   1
 0  2  2 -2  2  0  0  0  0   0
[Figure: resulting network — inputs x1…x9 connect to y with weights
0, 2, 2, -2, 2, 0, 0, 0, 0.]
AND gate
W1 = 2, W2 = 2, b = -2
Perceptron Rule
Step 1: Initialize all weights and bias to zero. Set the learning rate α
between 0 and 1.
Step 2: While the stopping condition is false, do steps 3-7.
Step 3: For each training vector and target output pair (s, t), perform
steps 4-6.
Step 4: Set activations for input units with the input vector.
Step 5: Compute the output unit response:
  yin = b + Σ xi wi,  y = f(yin)

Step 6: The weights and bias are updated if the target is not equal to the
output response. If t ≠ y and xi is not zero:
  wi(new) = wi(old) + α t xi
Adjust the bias:
  b(new) = b(old) + α t

Step 7: Test for stopping condition.
AND MODEL

X1  X2  b  Yin   y   t  ΔW1 ΔW2 Δb   W1  W2  b
                                      0   0   0
 1   1  1    0   0   1    1   1   1   1   1   1
 1  -1  1    1   1  -1   -1   1  -1   0   2   0
-1   1  1    2   1  -1    1  -1  -1   1   1  -1
-1  -1  1   -3  -1  -1    0   0   0   1   1  -1

[Figure: network after the first epoch — weights 1, 1 and bias -1.]
AND MODEL (second epoch)

X1  X2  b  Yin   y   t  ΔW1 ΔW2 Δb   W1  W2  b
                                      1   1  -1
 1   1  1    1   1   1    0   0   0   1   1  -1
 1  -1  1   -1  -1  -1    0   0   0   1   1  -1
-1   1  1   -1  -1  -1    0   0   0   1   1  -1
-1  -1  1   -3  -1  -1    0   0   0   1   1  -1

No weight changes occur in this epoch, so training stops.

[Figure: final network — weights 1, 1 and bias -1.]
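The perceptron run in the tables above can be reproduced with a short loop (α = 1 and a sign activation with a zero band, as in the tables):

```python
# Perceptron training on bipolar AND with alpha = 1.
def f(yin):
    # sign activation with a zero band
    return 1 if yin > 0 else (-1 if yin < 0 else 0)

X = [(1, 1), (1, -1), (-1, 1), (-1, -1)]
T = [1, -1, -1, -1]
w, b = [0, 0], 0

changed = True
while changed:                      # repeat epochs until no update occurs
    changed = False
    for (x1, x2), t in zip(X, T):
        y = f(b + x1 * w[0] + x2 * w[1])
        if y != t:                  # update only on error
            w[0] += t * x1
            w[1] += t * x2
            b += t
            changed = True

print(w, b)  # [1, 1] -1
```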
OR MODEL

X1  X2  b  Yin   y   t  ΔW1 ΔW2 Δb   W1  W2  b
                                      0   0   0
 1   1  1    0   0   1    1   1   1   1   1   1
 1  -1  1    1   1   1    0   0   0   1   1   1
-1   1  1    1   1   1    0   0   0   1   1   1
-1  -1  1   -1  -1  -1    0   0   0   1   1   1

[Figure: resulting network — weights 1, 1 and bias 1.]
AND MODEL (binary inputs)

X1  X2  b  Yin   y   t  ΔW1 ΔW2 Δb   W1  W2  b
                                      0   0   0
 1   1  1    0   0   1    1   1   1   1   1   1
 1   0  1    2   1  -1    1   0  -1   2   1   0
 0   1  1    1   1  -1    0  -1  -1   2   0  -1
 0   0  1   -1  -1  -1    0   0   0   2   0  -1

[Figure: resulting network — weights 2, 0 and bias -1.]
Example 3:

X1  X2  X3  X4  B   t
 1   1   1   1  1   1
 1   1   1  -1  1  -1
-1   1  -1  -1  1   1
 1  -1  -1   1  1  -1

X1  X2  X3  X4  B  Yin   y   t  ΔW1 ΔW2 ΔW3 ΔW4 Δb   W1  W2  W3  W4  b
                                                       0   0   0   0  0
 1   1   1   1  1    0   0   1    1   1   1   1   1    1   1   1   1  1
 1   1   1  -1  1    3   1  -1   -1  -1  -1   1  -1    0   0   0   2  0
-1   1  -1  -1  1   -2  -1   1   -1   1  -1  -1   1   -1   1  -1   1  1
 1  -1  -1   1  1    1   1  -1   -1   1   1  -1  -1   -2   2   0   0  0
Adaline

Step 1: Initialize all weights and bias to any small value other than zero.
Set the learning rate α between 0 and 1.
Step 2: While the stopping condition is false, do steps 3-7.
Step 3: For each bipolar training vector and target output pair (s, t),
perform steps 4-6.
Step 4: Set activations for input units with the input vector.
Step 5: Compute the net input to the output unit:
  Yin = b + Σ xi wi

Step 6: Update the weights and bias:
  wi(new) = wi(old) + α (t − Yin) xi
Adjust the bias:
  b(new) = b(old) + α (t − Yin)

Step 7: Test for stopping condition.
X1  X2  b   Yin     t   t−Yin   ΔW1     ΔW2     Δb      W1      W2      b       (t−Yin)²
                                                         0.2     0.2     0.2
 1   1  1   0.6     1   0.4      0.08    0.08    0.08    0.28    0.28    0.28    0.16
 1  -1  1   0.28    1   0.72     0.144  -0.144   0.144   0.424   0.136   0.424   0.518
-1   1  1   0.136   1   0.864   -0.173   0.173   0.173   0.251   0.309   0.597   0.746
-1  -1  1   0.037   1   0.963   -0.193  -0.193   0.193   0.059   0.116   0.789   0.928
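One epoch of the Adaline (LMS) updates in the table above, with α = 0.2 and all weights initialized to 0.2 as shown:

```python
# One epoch of Adaline/LMS updates on the data in the table above.
alpha = 0.2
X = [(1, 1), (1, -1), (-1, 1), (-1, -1)]
T = [1, 1, 1, 1]               # targets as listed in the table
w, b = [0.2, 0.2], 0.2

for (x1, x2), t in zip(X, T):
    yin = b + x1 * w[0] + x2 * w[1]
    err = t - yin
    w[0] += alpha * err * x1   # w_i += alpha * (t - Yin) * x_i
    w[1] += alpha * err * x2
    b += alpha * err           # b += alpha * (t - Yin)

print([round(v, 3) for v in (w[0], w[1], b)])  # [0.059, 0.116, 0.789]
```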
Multiclass Discrimination
⚫ Often, our classification problems involve more than two classes.
⚫ For example, character recognition requires at least 26 different classes.
⚫ We can perform such tasks using layers of perceptrons or Adalines.
Multiclass Discrimination

[Figure: a four-node perceptron for a four-class problem in n-dimensional
input space — inputs i1…in fully connected to outputs o1…o4 with weights
w11…w4n.]
Multiclass Discrimination
⚫ Each perceptron learns to recognize one particular class, i.e., it outputs
1 if the input is in that class, and 0 otherwise.
⚫ The units can be trained separately and in parallel.
⚫ In production mode, the network decides that its current input is in the
k-th class if and only if ok = 1 and oj = 0 for all j ≠ k; otherwise the
input is misclassified.
⚫ For units with real-valued output, the neuron with maximal output can be
picked to indicate the class of the input.
⚫ This maximum should be significantly greater than all other outputs,
otherwise the input is misclassified.
Multilayer Networks
⚫ Although single-layer perceptron networks can distinguish between any
number of classes, they still require linear separability of inputs.
⚫ To overcome this serious limitation, we can use multiple layers of neurons.
⚫ Rosenblatt first suggested this idea in 1961, but he used perceptrons.
⚫ However, their non-differentiable output function led to an inefficient
and weak learning algorithm.
⚫ The idea that eventually led to a breakthrough was the use of continuous
output functions and gradient descent.
Terminology
⚫ Example: Network function f: R³ → R²

[Figure: input vector (x1, x2, x3) feeds the input layer; a hidden layer
follows; the output layer produces the output vector (o1, o2).]
Madaline

Step 1: Initialize all weights and bias to any small value other than zero.
Set the learning rate α between 0 and 1.
Step 2: While the stopping condition is false, do steps 3-8.
Step 3: For each bipolar training vector and target output pair (s, t),
perform steps 4-7.
Step 4: Set activations for input units with the input vector.
Step 5: Compute the net input to each hidden (Adaline) unit.
Step 6: Find the output of the net.

Step 7: Calculate the error and update the weights.
1. If t = y, no weight update is required.
2. If t ≠ y and t = +1, update the weights on the unit Zj whose net input
   is closest to 0 (zero).
3. If t ≠ y and t = -1, update the weights on all units Zk whose net input
   is positive.

Step 8: Test for the stopping condition.
[Figure: Madaline network for the XOR data below — inputs X1, X2 feed hidden
units Z1, Z2 (weights W11, W12, W21, W22 and biases b1, b2), which feed the
output Y (weights V1, V2 and bias b3).
Initial values: W11 = 0.05, W12 = 0.1, W21 = 0.2, W22 = 0.2, V1 = 0.5,
V2 = 0.5, b1 = 0.3, b2 = 0.15, b3 = 0.5.]

X1  X2  b   y
 1   1  1  -1
 1  -1  1   1
-1   1  1   1
-1  -1  1  -1
Hetero Associative Memory Neural Networks

Step 1: Initialize all weights using the Hebb or Delta rule.
Step 2: For each input vector, do steps 3-5.
Step 3: Set activations for input units with the input vector.
Step 4: Compute the net input to the output units:
  yin_j = Σ_i xi wij
Step 5: Apply the activation to obtain the outputs:
  yj = 1 if yin_j > 0;  yj = 0 if yin_j = 0;  yj = -1 if yin_j < 0
Example 1: A hetero-associative neural network is trained by the Hebb outer
product rule for input row vectors S=(x1,x2,x3,x4) and output row vectors
t=(t1,t2). Find the weight matrix.
S1=(1 1 0 0) t1=(1 0)
S2=(1 1 1 0) t2=(0 1)
S3=(0 0 1 1) t3=(1 0)
S4=(0 1 0 0) t4=(1 0)

Step 1: Initialize all weights to zero.

Step 2: Find the weight contribution SᵀT for each pair.

Step 3: The weight matrix for all four patterns is the sum of the weight
matrices for each stored pattern:
  W = [[1 1]
       [2 1]
       [1 1]
       [1 0]]
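The outer-product sum from Example 1 can be computed in a few lines (pure Python, no matrix library):

```python
# Hebb outer-product learning for Example 1: W = sum over pairs of s^T t.
S = [(1, 1, 0, 0), (1, 1, 1, 0), (0, 0, 1, 1), (0, 1, 0, 0)]
T = [(1, 0), (0, 1), (1, 0), (1, 0)]

W = [[0, 0] for _ in range(4)]        # 4 inputs x 2 outputs
for s, t in zip(S, T):
    for i in range(4):
        for j in range(2):
            W[i][j] += s[i] * t[j]    # outer product, accumulated

print(W)  # [[1, 1], [2, 1], [1, 1], [1, 0]]
```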
Example 2: A hetero-associative neural network is trained by the Hebb outer
product rule for input row vectors S=(x1,x2,x3,x4) and output row vectors
t=(t1,t2). Find the weight matrix.
S1=(1 1 0 0) t1=(1 0)
S2=(0 1 0 0) t2=(1 0)
S3=(0 0 1 1) t3=(0 1)
S4=(0 0 1 0) t4=(0 1)

Step 1: Initialize all weights to zero.

Step 2: Find the weight contribution SᵀT for each pair.

Step 3: The weight matrix for all four patterns is the sum of the weight
matrices for each stored pattern:
  W = [[1 0]
       [2 0]
       [0 2]
       [0 1]]
Test the weights using the training inputs:

S1=(1 1 0 0) t1=(1 0)
S2=(0 1 0 0) t2=(1 0)
S3=(0 0 1 1) t3=(0 1)
S4=(0 0 1 0) t4=(0 1)

Test the weights using a different input:

S=(1 1 1 0) t=?
Example 3: A hetero-associative neural network is trained by the Hebb outer
product rule for input row vectors S=(x1,x2,x3,x4) and output row vectors
t=(t1,t2). Find the weight matrix.
S1=(1 1 -1 -1) t1=(1 -1)
S2=(-1 1 -1 -1) t2=(1 -1)
S3=(-1 -1 1 1) t3=(-1 1)
S4=(-1 -1 1 -1) t4=(-1 1)

Step 1: Initialize all weights to zero.

Step 2: Find the weight contribution SᵀT for each pair.

Step 3: The weight matrix for all four patterns is the sum of the weight
matrices for each stored pattern:
  W = [[ 2 -2]
       [ 4 -4]
       [-4  4]
       [-2  2]]
Test the weights using the training inputs:

S1=(1 1 -1 -1) t1=(1 -1)
S2=(-1 1 -1 -1) t2=(1 -1)
S3=(-1 -1 1 1) t3=(-1 1)
S4=(-1 -1 1 -1) t4=(-1 1)

Test the weights using the binary versions of the inputs:

S1=(1 1 0 0) t1=(1 0)
S2=(0 1 0 0) t2=(1 0)
S3=(0 0 1 1) t3=(0 1)
S4=(0 0 1 0) t4=(0 1)

Test the weights using new inputs:

S=(1 0 0 0) t=?
S=(1 1 1 0) t=?
S=(1 1 1 1) t=?
S=(0 1 -1 0) t=?
Auto Associative Memory Neural Networks

In an auto-associative net the training input and target output vectors are
identical, so the net stores patterns and recalls them from noisy or partial
versions.

Mutually Orthogonal Pairs

Two vectors x and y are orthogonal if
  x · y = Σ_i xi yi = 0


Example: An auto-associative neural network is trained by the Hebb outer
product rule for the input row vector S=(1 1 -1 -1). Find the weight matrix.
S1=(1 1 -1 -1) t1=(1 1 -1 -1)

Step 1: Initialize all weights to zero.
Step 2: Find the weight matrix W = SᵀS:
  W = [[ 1  1 -1 -1]
       [ 1  1 -1 -1]
       [-1 -1  1  1]
       [-1 -1  1  1]]

Test the weights using the binary version of the input:
S1=(1 1 0 0)
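Storage and recall for the auto-associative example above can be checked directly; the sign activation used here follows the earlier slides:

```python
# Auto-associative Hebb storage and recall for S = (1, 1, -1, -1).
s = [1, 1, -1, -1]
n = len(s)
W = [[s[i] * s[j] for j in range(n)] for i in range(n)]   # W = S^T S

def recall(x):
    yin = [sum(x[i] * W[i][j] for i in range(n)) for j in range(n)]
    return [1 if v > 0 else -1 for v in yin]

assert recall(s) == s                 # the stored pattern is a fixed point
print(recall([1, 1, -1, 1]))          # a one-bit-noisy probe is corrected
```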


Different types of neural networks
❖ Variations on the classic neural network design allow various forms of
forward and backward propagation of information among tiers:
❖ 1. Feed-forward
❖ 2. Backpropagation


Feed-forward neural network
This type of artificial neural network passes information straight through
from input nodes to processing nodes to outputs. It may or may not have
hidden node layers, making its functioning more interpretable.


Back propagation neural network
⚫ Recurrent Neural Networks (Back-Propagating)
⚫ Information passes from the input layer to the output layer to produce a
result. The error in the result is then communicated back to the previous
layers, so nodes learn how much they contributed to the answer being wrong.
Weights are re-adjusted and the neural network improves — it learns. There
is a bi-directional flow of information.
Basic Neuron Model In A Feedforward Network
⚫ Inputs xi arrive through pre-synaptic connections
⚫ Synaptic efficacy is modeled using real weights wi
⚫ The response of the neuron is a nonlinear function f of its weighted inputs


Inputs To Neurons
⚫ Arise from other neurons or from outside the network
⚫ Nodes whose inputs arise outside the network are called input nodes and
simply copy values
⚫ An input may excite or inhibit the response of the neuron to which it is
applied, depending upon the weight of the connection


Weights
⚫ Represent synaptic efficacy and may be excitatory or inhibitory
⚫ Normally, positive weights are considered excitatory while negative
weights are thought of as inhibitory
⚫ Learning is the process of modifying the weights in order to produce a
network that performs some function


Output
⚫ The response function is normally nonlinear
⚫ Samples include:
  ⚫ Sigmoid
  ⚫ Piecewise linear


Backpropagation Preparation
⚫ Training Set — a collection of input-output patterns that are used to
train the network
⚫ Testing Set — a collection of input-output patterns that are used to
assess network performance
⚫ Learning Rate α — a scalar parameter, analogous to step size in numerical
integration, used to set the rate of adjustments


Network Error
⚫ Total-Sum-Squared-Error (TSSE):
  TSSE = ½ Σp Σj (Tpj − Opj)²
⚫ Root-Mean-Squared-Error (RMSE):
  RMSE = √( 2 · TSSE / (# patterns · # outputs) )


Feedforward

[Figure: feedforward network — inputs on the left, outputs on the right.]


By use of the chain rule we have:
  ∂E/∂Wji = (∂E/∂Oj) · (∂Oj/∂netj) · (∂netj/∂Wji)
[Figure: worked backpropagation example — a 3-3-1 network with
input x = [0.6  0.8  0], target T = [0.9],
input-to-hidden weights W = [2 1 0; 1 2 2; 0 3 1], hidden biases
Wo = [0 0 -1], hidden-to-output weights V = [-1 1 2], and output bias
Vo = [-1].]


Unsupervised Learning
⚫ We can include additional structure in the network so that the net is
forced to make a decision as to which one unit will respond.
⚫ The mechanism by which this is achieved is called competition.
⚫ It can be used in unsupervised learning.
⚫ A common use for unsupervised learning is clustering-based neural networks.


Unsupervised Learning
⚫ In a clustering net, there are as many input units as the input vector has
components.
⚫ Every output unit represents a cluster, and the number of output units
limits the number of clusters.
⚫ During training, the network finds the best matching output unit for the
input vector.
⚫ The weight vector of the winner is then updated according to the learning
algorithm.


Kohonen Learning
⚫ A variety of nets use Kohonen learning.
⚫ The new weight vector is a linear combination of the old weight vector and
the current input vector.
⚫ The weight update for cluster unit (output unit) j can be calculated as:
  wj(new) = wj(old) + α [x − wj(old)]
⚫ The learning rate α decreases as the learning process proceeds.


Kohonen SOM (Self Organizing Maps)
⚫ Since it is an unsupervised environment, the name is Self Organizing Maps.
⚫ Self-organizing NNs are also called Topology Preserving Maps, which leads
to the idea of a neighborhood of the clustering unit.
⚫ During the self-organizing process, the weight vectors of the winning unit
and its neighbors are updated.


Kohonen SOM (Self Organizing Maps)
⚫ Normally, the Euclidean distance measure is used to find the cluster unit
whose weight vector matches the input vector most closely.
⚫ For a linear array of cluster units, the neighborhood of radius R around
cluster unit J consists of all units j such that J − R ≤ j ≤ J + R.


Kohonen SOM (Self Organizing Maps)
⚫ Architecture of SOM
[Figure: SOM architecture]

⚫ Structure of Neighborhoods
[Figures: neighborhoods of decreasing radius around a winning unit on linear
and two-dimensional grids]

⚫ Neighborhoods do not wrap around from one side of the grid to the other
side, which means missing units are simply ignored.


Kohonen SOM (Self Organizing Maps)
⚫ Algorithm: initialize the weights, learning rate α, and radius R; for each
input vector, find the winning unit J (minimum Euclidean distance) and
update the weights of all units within radius R of J.
⚫ Radius and learning rate may be decreased after each epoch.
⚫ The learning rate decrease may be either linear or geometric.
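The SOM loop just described can be sketched as follows. This is an illustrative implementation for a 1-D array of cluster units; the function name, defaults, and the geometric learning-rate decay are assumptions:

```python
import random

# Sketch of SOM training: 1-D cluster array, linear neighborhood of radius R.
def train_som(data, n_clusters, alpha=0.5, radius=1, epochs=10, seed=0):
    rng = random.Random(seed)
    dim = len(data[0])
    w = [[rng.random() for _ in range(dim)] for _ in range(n_clusters)]
    for _ in range(epochs):
        for x in data:
            # winner J: smallest squared Euclidean distance to x
            J = min(range(n_clusters),
                    key=lambda j: sum((x[k] - w[j][k]) ** 2 for k in range(dim)))
            # update winner and its neighbors within radius R
            for j in range(max(0, J - radius), min(n_clusters, J + radius + 1)):
                for k in range(dim):
                    w[j][k] += alpha * (x[k] - w[j][k])
        alpha *= 0.5   # geometric decrease of the learning rate
    return w

w = train_som([[0, 0], [1, 1]], n_clusters=2, radius=0, epochs=20)
print([[round(v, 2) for v in row] for row in w])
```

With radius 0 this degenerates to plain competitive (Kohonen) learning, so the two cluster weights move toward the two data points.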


KOHONEN SELF ORGANIZING MAPS

Architecture

[Figure: Kohonen layer of neurons; neuron i has weight vector wi; the
winning neuron is the one whose weight vector is closest to the input
vector X.]

Input vector X = [x1, x2, …, xn] ∈ Rⁿ
wi = [wi1, wi2, …, win] ∈ Rⁿ


Kohonen SOM (Self Organizing Maps)
⚫ Example

[Figure: four inputs x1…x4 connected to two cluster units Z1 and Z2; the
weights to Z1 are (0.2, 0.6, 0.5, 0.9) and the weights to Z2 are
(0.8, 0.4, 0.7, 0.3).]


Kohonen SOM (Self Organizing Maps)
The weight vectors for the cluster units are (0.9, 0.7, 0.6) and
(0.4, 0.2, 0.1). Find the winning cluster for the input vector
(0.4, 0.2, 0.1). Use a learning rate of 0.2. Find the new weights for the
winning unit.

[Figure: three inputs x1, x2, x3 connected to cluster units C1 and C2.]
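The winner selection and weight update for the example above take only a few lines:

```python
# Worked example: weights w1 = (0.9, 0.7, 0.6), w2 = (0.4, 0.2, 0.1),
# input x = (0.4, 0.2, 0.1), learning rate alpha = 0.2.
w = [[0.9, 0.7, 0.6], [0.4, 0.2, 0.1]]
x = [0.4, 0.2, 0.1]
alpha = 0.2

d = [sum((xi - wi) ** 2 for xi, wi in zip(x, wj)) for wj in w]
J = d.index(min(d))                       # winning cluster: smallest distance
w[J] = [wj + alpha * (xi - wj) for xi, wj in zip(x, w[J])]

# cluster 2 wins (distance 0); its weights already equal x, so they are unchanged
print(J, w[J])
```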


Kohonen SOM (Self Organizing Maps)

[Figure: two inputs x1, x2 connected to four cluster units C1…C4;
input X = [0.3, 0.4], learning rate α = 0.3. The winning unit is the one
whose weight vector is closest to X.]


9/14/2021 NFT 151
If R = 1, the weights of the winning unit J and of its neighbors J − 1 and
J + 1 are updated:
  wj(new) = wj(old) + α [x − wj(old)]


Hopfield Network Algorithm

Step 1: Initialize the weights to store the patterns.
Step 2: For each input vector, do steps 3-7.
Step 3: Set activations for input units with the input vector.
Step 4: Compute the net input to each unit Yi:
  yin_i = xi + Σ_j yj wji
Step 5: Apply the activation to obtain the unit's output.
Step 6: Broadcast the value to all other units.
Step 7: Test for convergence.


Energy Function:
  E = −½ Σ_i Σ_{j≠i} wij yi yj − Σ_i xi yi

Hallucination is one of the main problems in the Discrete Hopfield Network:
sometimes the network output can be something that we haven't taught it.


Example: A Hopfield neural network is trained by the Hebb outer product rule
for the input row vector S=(1 1 -1 -1). Find the weight matrix.
S1=(1 1 -1 -1) t1=(1 1 -1 -1)

Step 1: Initialize all weights to zero.
Step 2: Find the weight matrix W = SᵀS with the diagonal set to zero:
  W = [[ 0  1 -1 -1]
       [ 1  0 -1 -1]
       [-1 -1  0  1]
       [-1 -1  1  0]]

Test the weights using the binary version of the input:
S1=(1 1 0 0)


Design a Hopfield network for 4 bit bipolar patterns. The training patterns
are:
S1=[1 1 -1 -1]
S2=[-1 1 -1 1]
S3=[-1 -1 -1 1]
Find the weight matrix and the energy for the inputs. Determine the
pattern to which the sample T=[-1 1 -1 -1] associates.

Design a Hopfield network for 5-bit bipolar patterns. The training patterns
are:
S1=[1 1 1 1 1]
S2=[1 -1 -1 1 -1]
S3=[-1 1 -1 -1 -1]
Find the weight matrix and the energy for the inputs. Determine the
patterns to which the samples T1=[1 1 1 -1 -1] and T2=[1 -1 -1 -1 -1]
associate.

9/14/2021 NFT 160
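A discrete Hopfield net for the first 4-bit exercise above can be sketched as follows. The zero-diagonal Hebb weights and the asynchronous update are standard choices; the update order and tie-handling here are assumptions, so the recovered state is what this particular schedule converges to:

```python
# Discrete Hopfield sketch: store the three 4-bit patterns, then update the
# probe T asynchronously until it is stable.
patterns = [[1, 1, -1, -1], [-1, 1, -1, 1], [-1, -1, -1, 1]]
n = 4

# Hebb outer-product weights with a zeroed diagonal
W = [[0 if i == j else sum(p[i] * p[j] for p in patterns)
      for j in range(n)] for i in range(n)]

def energy(y):
    return -0.5 * sum(W[i][j] * y[i] * y[j]
                      for i in range(n) for j in range(n))

y = [-1, 1, -1, -1]          # probe T
stable = False
while not stable:
    stable = True
    for i in range(n):       # asynchronous unit-by-unit update
        yin = sum(W[i][j] * y[j] for j in range(n))
        new = 1 if yin > 0 else (-1 if yin < 0 else y[i])
        if new != y[i]:
            y[i], stable = new, False

print(y, energy(y))
```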


LVQ

• Vector quantization is a technique that exploits the underlying structure
of input vectors for the purpose of data compression.
• An input space is divided into a number of distinct regions, and for each
region a reconstruction (representative) vector is defined.
• When the quantizer is presented with a new input vector, the region in
which the vector lies is first determined, and the vector is then
represented by the reproduction vector for this region.
• The collection of all possible reproduction vectors is called the code
book of the quantizer, and its members are called code words.


LVQ-1

• A vector quantizer with minimum encoding distortion is called a Voronoi or
nearest-neighbour quantizer, since the Voronoi cells about a set of points
in an input space correspond to a partition of that space according to the
nearest-neighbour rule based on the Euclidean metric.
• An example with an input space divided into four cells and their
associated Voronoi vectors is shown below:

[Figure: an input space partitioned into four Voronoi cells with their
associated Voronoi vectors.]


LVQ-2

• The SOM algorithm provides an approximate method for computing the Voronoi
vectors in an unsupervised manner, with the approximation being specified
by the weight vectors of the neurons in the feature map.

LVQ-3

• Computation of the feature map can be viewed as the first of two stages
for adaptively solving a pattern classification problem. The second stage
is provided by learning vector quantization, which provides a method for
the fine tuning of a feature map.


LVQ-4

• Learning vector quantization (LVQ) is a supervised learning technique that
uses class information to move the Voronoi vectors slightly, so as to
improve the quality of the classifier decision regions.
• An input vector x is picked at random from the input space. If the class
labels of the input vector and a Voronoi vector w agree, the Voronoi vector
is moved in the direction of the input vector x. If, on the other hand, the
class labels of the input vector and the Voronoi vector disagree, the
Voronoi vector w is moved away from the input vector x.
• Let {wj}, j = 1, …, l, denote the set of Voronoi vectors, and let {xi},
i = 1, …, N, be the set of input vectors. We assume that N >> l.

LVQ-5

• The LVQ algorithm proceeds as follows:
  i. Suppose that the Voronoi vector wc is the closest to the input vector
     xi. Let Cwc and Cxi denote the class labels associated with wc and xi
     respectively. Then the Voronoi vector wc is adjusted as follows:
     • If Cwc = Cxi then
         wc(n+1) = wc(n) + αn [xi − wc(n)]
       where 0 < αn < 1.


LVQ-6

     • If Cwc ≠ Cxi then
         wc(n+1) = wc(n) − αn [xi − wc(n)]
  ii. The other Voronoi vectors are not modified.
• It is desirable for the learning constant αn to decrease monotonically
with time n. For example, αn could initially be 0.1 and decrease linearly
with n.
• After several passes through the input data the Voronoi vectors typically
converge, at which point the training is complete.
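The two update cases above differ only in sign, so a single helper covers both. This is a sketch of one LVQ step (αn is held fixed here for simplicity):

```python
# One LVQ update step: attract the closest Voronoi vector when the class
# labels agree, repel it when they disagree.
def lvq_update(w, x, same_class, alpha):
    sign = 1 if same_class else -1
    return [wi + sign * alpha * (xi - wi) for wi, xi in zip(w, x)]

w = [0.0, 0.0]
w = lvq_update(w, [1.0, 1.0], True, 0.1)    # labels agree: move toward x
print(w)  # [0.1, 0.1]
w = lvq_update(w, [1.0, 1.0], False, 0.1)   # labels disagree: move away
print(w)
```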


9/14/2021 NFT 168
Construct and test an LVQ net with four vectors assigned to two classes.
Assume α = 0.1; perform iterations until α < 0.05.

Vector        Class
(1 0 1 0)     1
(0 0 1 1)     2
(1 1 0 0)     1
(1 0 0 1)     2

1) Initialize the weights: w1 = (1 0 1 0) and w2 = (0 0 1 1).
2) Begin training.
3) X = (1 1 0 0) with T = 1.
4) Calculate J, the winning (closest) unit.


A Pseudo-Code Algorithm
⚫ Randomly choose the initial weights
⚫ While error is too large
  ⚫ For each training pattern (presented in random order)
    ⚫ Apply the inputs to the network
    ⚫ Calculate the output for every neuron from the input layer, through
      the hidden layer(s), to the output layer
    ⚫ Calculate the error at the outputs
    ⚫ Use the output error to compute error signals for pre-output layers
    ⚫ Use the error signals to compute weight adjustments
    ⚫ Apply the weight adjustments
  ⚫ Periodically evaluate the network performance
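The pseudo-code above can be turned into a runnable sketch. This trains a small 2-2-1 sigmoid network on the AND function with the standard delta rules; the architecture, learning rate, epoch count, and seeding are all assumed for illustration:

```python
import math, random

# Backpropagation sketch: 2-2-1 sigmoid network trained on AND.
random.seed(1)
eta = 0.5
sig = lambda v: 1.0 / (1.0 + math.exp(-v))

# each hidden unit: [w_input1, w_input2, bias]; output unit likewise over h
w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_o = [random.uniform(-1, 1) for _ in range(3)]

data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]

def forward(x):
    h = [sig(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
    o = sig(w_o[0] * h[0] + w_o[1] * h[1] + w_o[2])
    return h, o

for _ in range(5000):
    for x, t in data:
        h, o = forward(x)
        d_o = (t - o) * o * (1 - o)                            # output error signal
        d_h = [h[j] * (1 - h[j]) * d_o * w_o[j] for j in range(2)]
        for j in range(2):
            w_o[j] += eta * d_o * h[j]                         # hidden -> output
            for k in range(2):
                w_h[j][k] += eta * d_h[j] * x[k]               # input -> hidden
            w_h[j][2] += eta * d_h[j]                          # hidden bias
        w_o[2] += eta * d_o                                    # output bias

print([round(forward(x)[1]) for x, _ in data])  # [0, 0, 0, 1]
```

The error-signal expressions used here match the delta-rule slides that follow.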


Possible Data Structures
⚫ Two-dimensional arrays
  ⚫ Weights (at least for input-to-hidden layer and hidden-to-output layer
    connections)
  ⚫ Weight changes (ΔWij)
⚫ One-dimensional arrays
  ⚫ Neuron layers
  ⚫ Cumulative current input
  ⚫ Current output
  ⚫ Error signal for each neuron
  ⚫ Bias weights


Apply Inputs From A Pattern
⚫ Apply the value of each input parameter to each input node
⚫ Input nodes compute only the identity function

[Figure: feedforward network — inputs on the left, outputs on the right.]


Calculate Outputs For Each Neuron Based On The Pattern
⚫ The output from neuron j for pattern p is Opj, where
  Opj = f(netpj)  and  netpj = Σk Opk Wjk
⚫ k ranges over the input indices and Wjk is the weight on the connection
from input k to neuron j


Calculate The Error Signal For Each Output Neuron
⚫ The output neuron error signal δpj is given by
  δpj = (Tpj − Opj) Opj (1 − Opj)
⚫ Tpj is the target value of output neuron j for pattern p
⚫ Opj is the actual output value of output neuron j for pattern p


Calculate The Error Signal For Each Hidden Neuron
⚫ The hidden neuron error signal δpj is given by
  δpj = Opj (1 − Opj) Σk δpk Wkj
where δpk is the error signal of a post-synaptic neuron k and Wkj is the
weight of the connection from hidden neuron j to the post-synaptic neuron k


Calculate And Apply Weight Adjustments
⚫ Compute weight adjustments ΔWji at time t by
  ΔWji(t) = η δpj Opi
⚫ Apply weight adjustments according to
  Wji(t+1) = Wji(t) + ΔWji(t)
⚫ Some add a momentum term α ΔWji(t−1)


Fuzzy Logic


WHAT IS FUZZY LOGIC?
Definition of fuzzy
⚫ Fuzzy – “not clear, distinct, or precise; blurred”

Definition of fuzzy logic
⚫ A form of knowledge representation suitable for notions that cannot be
defined precisely, but which depend upon their contexts.


TRADITIONAL REPRESENTATION OF LOGIC

Slow           Fast
Speed = 0      Speed = 1


FUZZY LOGIC REPRESENTATION
Every problem must be represented in terms of fuzzy sets.
What are fuzzy sets?

Slowest  [0.00 – 0.25]
Slow     [0.25 – 0.50]
Fast     [0.50 – 0.75]
Fastest  [0.75 – 1.00]


FUZZY LOGIC REPRESENTATION CONT.

[Figure: overlapping membership functions for Slowest, Slow, Fast, and
Fastest along the speed axis.]


ORIGINS OF FUZZY LOGIC
Traces back to Ancient Greece

Lotfi Asker Zadeh (1965)
⚫ First to publish ideas of fuzzy logic.

Professor Toshiro Terano (1972)
⚫ Organized the world's first working group on fuzzy systems.

F.L. Smidth & Co. (1980)
⚫ First to market fuzzy expert systems.


FUZZY LOGIC VS. NEURAL NETWORKS
How does a neural network work?

Both model the human brain:
⚫ Fuzzy Logic
⚫ Neural Networks

Both are used to create behavioral systems.


FUZZY LOGIC IN CONTROL SYSTEMS
Fuzzy logic provides a more efficient and resourceful way to solve control
systems.

Some Examples
⚫ Temperature Controller
⚫ Anti-Lock Braking System (ABS)


How the models work

Crisp data → Fuzzifier: inputs are converted to degrees of membership of
fuzzy sets (e.g. 90% hot, 10% cold).

Fuzzy rules are applied to get new sets of members:
IF 90% hot THEN 80% open
IF 10% cold THEN 20% closed

Fuzzy output set: 80% open, 20% closed.

Defuzzifier: these sets are then converted back to real numbers (crisp data).
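The fuzzifier → rules → defuzzifier pipeline above can be sketched with a toy valve controller. The membership shapes, the two rules, and the representative valve positions are all invented for illustration; only the pipeline structure comes from the slide:

```python
# Toy fuzzy controller: temperature in, valve opening out.
def fuzzify(temp_c):
    # degrees of membership in "cold" and "hot" (assumed linear shape)
    hot = min(max((temp_c - 10) / 20.0, 0.0), 1.0)
    return {"cold": 1.0 - hot, "hot": hot}

def apply_rules(m):
    # IF hot THEN valve open; IF cold THEN valve closed
    return {"open": m["hot"], "closed": m["cold"]}

def defuzzify(out):
    # weighted average of representative valve positions (centroid-like)
    centers = {"open": 1.0, "closed": 0.0}
    num = sum(out[k] * centers[k] for k in out)
    den = sum(out.values())
    return num / den

print(defuzzify(apply_rules(fuzzify(28))))  # 90% hot -> valve 90% open
```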
TEMPERATURE CONTROLLER
The problem
⚫ Change the speed of a heater fan based on the room temperature and
humidity.
A temperature control system has four settings:
⚫ Cold, Cool, Warm, and Hot
Humidity can be defined by:
⚫ Low, Medium, and High
Using these we can define the fuzzy sets.


Steps
Fuzzification: determine an input's % membership in overlapping sets.

Rules: determine outputs based on inputs and rules.

Combination/Defuzzification: combine all fuzzy actions into a single fuzzy
action and transform the single fuzzy action into a crisp, executable
system output. May use the centroid of weighted sets.


Fuzzy Logic Example



Example Rules
IF speed is TOO SLOW and acceleration is DECELERATING,
THEN INCREASE POWER GREATLY

IF speed is SLOW and acceleration is DECREASING,
THEN INCREASE POWER SLIGHTLY

IF distance is CLOSE,
THEN DECREASE POWER SLIGHTLY
...

Output Determination
Degree of membership in an output fuzzy set now represents each fuzzy
action.

Fuzzy actions are combined to form a system output.

Note there would be a total of 95 different rules for all combinations of
inputs taken 1, 2, or 3 at a time:
(5x3x3 + 5x3 + 5x3 + 3x3 + 5 + 3 + 3 = 95)
Crisp Set and Fuzzy Set



Information World

A crisp set has a unique membership function:
  μA(x) = 1 if x ∈ A
  μA(x) = 0 if x ∉ A
  μA(x) ∈ {0, 1}

A fuzzy set can have an infinite number of membership values:
  μA(x) ∈ [0, 1]
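The crisp/fuzzy distinction above is easy to see in code. The example fuzzy set ("a number is close to 5", from the next slide) uses an assumed triangular membership shape:

```python
# Crisp membership is binary; fuzzy membership is a degree in [0, 1].
def crisp_member(x, A):
    # mu_A(x) in {0, 1}
    return 1 if x in A else 0

def fuzzy_close_to_5(x):
    # mu(x) in [0, 1] for "a number is close to 5" (assumed triangular shape)
    return max(0.0, 1.0 - abs(x - 5) / 5.0)

assert crisp_member(3, {1, 2, 3}) == 1
assert crisp_member(9, {1, 2, 3}) == 0
print(fuzzy_close_to_5(5), fuzzy_close_to_5(7), fuzzy_close_to_5(10))
```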


Fuzziness

Examples:
⚫ A number is close to 5
⚫ He/she is tall


Classical Sets



CLASSICAL SETS
Define a universe of discourse, X, as a collection of objects all having the
same characteristics. The individual elements in the universe X will be
denoted as x. The features of the elements in X can be discrete, or
continuous valued quantities on the real line. Examples of elements of
various universes might be as follows:
● the clock speeds of computer CPUs;
● the operating currents of an electronic motor;
● the operating temperature of a heat pump;
● the integers 1 to 10.


Operations on Classical Sets

Union:
A ∪ B = {x | x ∈ A or x ∈ B}
Intersection:
A ∩ B = {x | x ∈ A and x ∈ B}
Complement:
A’ = {x | x ∉ A, x ∈ X}
X – Universal Set
Set Difference:
A | B = {x | x ∈ A and x ∉ B}
Set difference is also denoted by A - B

9/14/2021 NFT 200
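These classical-set operations map directly onto Python's built-in set type; the universe X and the sets A and B below are illustrative:

```python
# Universe of discourse X and two subsets (illustrative values)
X = {1, 2, 3, 4, 5, 6}
A = {1, 2, 3}
B = {3, 4, 5}

union        = A | B   # {x | x in A or x in B}
intersection = A & B   # {x | x in A and x in B}
complement_A = X - A   # A' = {x | x not in A, x in X}
difference   = A - B   # A|B = {x | x in A and x not in B}
```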


Operations on Classical Sets

Union of sets A and B (logical or).

Intersection of sets A and B.

9/14/2021 NFT 201


Operations on Classical Sets

Complement of set A.

Difference operation A|B.

9/14/2021 NFT 202


Properties of Classical Sets

A∪B=B∪A
A∩B=B∩A
A ∪ (B ∪ C) = (A ∪ B) ∪ C
A ∩ (B ∩ C) = (A ∩ B) ∩ C
A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C)
A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
A∪A=A
A∩A=A
A∪X=X
A∩X=A
A∪∅=A
A∩∅=∅
9/14/2021 NFT 203
Mapping of Classical Sets to Functions

Mapping is an important concept in relating set-theoretic forms to function-


theoretic representations of information. In its most general form it can be
used to map elements or subsets in one universe of discourse to elements or
sets in another universe.

9/14/2021 NFT 204


Example:
X = {1,2,3,4}
Find the cardinal number, power set and cardinality of the power set.
η(X) = 4
Power set P(X) = {φ,{1},{2},{3},{4},{1,2},{1,3},{1,4},{2,3},{2,4},{3,4},{1,2,3},
{2,3,4},{1,3,4},{1,2,4},{1,2,3,4}}
Cardinality of the power set = 2⁴ = 16

9/14/2021 NFT 205
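The counts in this example can be checked with a short sketch using the standard-library itertools construction of the power set:

```python
from itertools import chain, combinations

X = {1, 2, 3, 4}

# Power set: all subsets of X, from the empty set up to X itself
power_set = list(chain.from_iterable(combinations(sorted(X), r)
                                     for r in range(len(X) + 1)))

cardinality = len(X)                # eta(X) = 4
power_cardinality = len(power_set)  # 2**4 = 16
```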


Tall and Short

Membership
function

Height

9/14/2021 NFT 206


Operations on Classical Sets and Fuzzy Set
Union of sets A and B (logical or).

Intersection of sets A and B.

9/14/2021 NFT 207


Operations on Classical Sets
Complement of set A.

Difference operation A|B.

9/14/2021 NFT 208


Example:

Calculate A ∪B, A ∩B, Ã

9/14/2021 NFT 209


We want to compare two sensors based upon their detection levels and gain
settings
Gain settings Sensor 1 Sensor 2
0 0 0
20 0.5 0.35
40 0.65 0.5
60 0.85 0.75
80 1 0.9
100 1 1

Calculate union, intersection and complement of both sensors

9/14/2021 NFT 210
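One possible sketch of the requested calculation, treating each sensor column in the table above as a fuzzy set over the gain settings:

```python
gains   = [0, 20, 40, 60, 80, 100]
sensor1 = [0.0, 0.5, 0.65, 0.85, 1.0, 1.0]   # membership grades per gain setting
sensor2 = [0.0, 0.35, 0.5, 0.75, 0.9, 1.0]

union        = [max(a, b) for a, b in zip(sensor1, sensor2)]  # pointwise max
intersection = [min(a, b) for a, b in zip(sensor1, sensor2)]  # pointwise min
compl1       = [1 - a for a in sensor1]                       # 1 - membership
compl2       = [1 - b for b in sensor2]
```

Because sensor 1's grade dominates sensor 2's at every gain setting, the union equals sensor 1 and the intersection equals sensor 2.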


9/14/2021 NFT 211
Features of membership function
⚫ Core
⚫ Support
⚫ Boundary
(illustrated in the slide figure)
9/14/2021 NFT 212


Normal and Sub-normal Fuzzy Set
(membership plots shown in the slide figure)
9/14/2021 NFT 213
Convex and Non-convex Fuzzy Set
(membership plots shown in the slide figure)
9/14/2021 NFT 214


Law of Excluded Middle
A ∪ Ã = X

Law of Contradiction
A ∩ Ã = ø

De Morgan’s Law
(A ∩ B)’ = A’ ∪ B’, (A ∪ B)’ = A’ ∩ B’

9/14/2021 NFT 215


Consider two fuzzy set A and B. Find complement, union, Intersection, Difference
and De Morgan’s Law

9/14/2021 NFT 216


Fuzzy Relations:
AXB
A={0,1}
B={e,f,g}
AXB={(0,e),(0,f),(0,g),(1,e),(1,f),(1,g)}
BXA={(e,0),(e,1),(f,0),(f,1),(g,0),(g,1)}

9/14/2021 NFT 217


Crisp Relation
⚫ Definition (Product set):
Let A and B be two nonempty sets, the product
set or Cartesian product A × B is defined as
follows,
A × B = {(a, b) | a ∈ A, b ∈ B }

⚫ Extension to n sets
A1×A2×...×An =
{(a1, ... , an) | a1 ∈ A1, a2 ∈ A2, ... , an ∈ An }

9/14/2021 NFT 218


Crisp Relation
Example: A = {a1, a2, a3}, B = {b1, b2}
A × B = {(a1, b1), (a1, b2), (a2, b1), (a2, b2), (a3, b1),
(a3, b2)}

Product set A × B
9/14/2021 NFT 219
Crisp Relation
A × A = {(a1, a1), (a1, a2), (a1, a3), (a2, a1), (a2, a2), (a2,
a3), (a3, a1), (a3, a2), (a3, a3)}

Cartesian product A × A
9/14/2021 NFT 220
Crisp Relation
⚫ Definition (Binary Relation)

R = { (x,y) | x ∈ A, y ∈ B } ⊆ A × B

⚫ n-ary Relation
(x1, x2, x3, … , xn) ∈ R ,

R ⊆ A1 × A2 × A3 × … × An

9/14/2021 NFT 221


Cartesian Product
⚫ Example 3.1
⚫ Set A = { 0,1 }
⚫ Set B = { a, b, c }

A × B = {(0,a),(0,b),(0,c),(1,a),(1,b),(1,c)}
B × A = {(a,0),(a,1),(b,0),(b,1),(c,0),(c,1)}
A × A = A2 = {(0,0),(0,1),(1,0),(1,1)}
B × B = B2 ={(a,a),(a,b),(a,c),(b,a),(b,b),(b,c),(c,a),(c,b),(c,c)}

9/14/2021 NFT 222


Crisp Relations
⚫ Measured by the characteristic function χ
X × Y = {(x,y) │ x ∈ X, y ∈ Y}
⚫ Binary relation
⚫ χX×Y(x,y) = 1, (x,y) ∈ X × Y
             0, (x,y) ∉ X × Y

⚫ χR(x,y) = 1, (x,y) ∈ R
            0, (x,y) ∉ R

9/14/2021 NFT 223


9/14/2021 NFT 224
Crisp Relations
■ EX :
X = {1,2,3}  Y = {a,b,c}

⚪ Sagittal diagram  ⚪ Relation Matrix
R is a 3×3 relation matrix with rows indexed by 1, 2, 3 and columns by a, b, c
(entries shown in the slide figure).

9/14/2021 NFT 225


Crisp Relation
⚫ Representation of Relations
(1) Bipartite graph
representing the relation by drawing arcs or edges
(binary relation from A to B)
9/14/2021 NFT 226
Crisp Relation
(2) Matrix
MR = (mij), i = 1, 2, 3, …, m; j = 1, 2, 3, …, n
(4) Digraph (directed graph)

R   b1 b2 b3
a1   1  0  0
a2   0  1  0
a3   0  1  0
a4   0  0  1

9/14/2021 NFT 227
Crisp Relation
⚫ Operations on relations R, S ⊆ A × B
(1) Union T = R ∪ S
If (x, y) ∈ R or (x, y) ∈ S, then (x, y) ∈ T
(2) Intersection T = R ∩ S
If (x, y) ∈ R and (x, y) ∈ S, then (x, y) ∈ T.
(3) Complement
If (x, y) ∉ R, then (x, y) ∈ RC
(4) Inverse
R-1 = {(y, x) ∈ B × A | (x, y) ∈ R, x ∈ A, y ∈ B}
(5) Composition
R ⊆ A × B, S ⊆ B × C, T = S ∘ R ⊆ A × C
T = {(x, z) | x ∈ A, y ∈ B, z ∈ C, (x, y) ∈ R, (y, z) ∈ S}

9/14/2021 NFT 228
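Operations (4) and (5) above can be sketched in Python over small illustrative relations; the pairs below are invented for demonstration:

```python
# Illustrative relations: R over A x B, S over B x C
R = {("x1", "y1"), ("x2", "y2")}
S = {("y1", "z1"), ("y2", "z1")}

# (4) Inverse: swap each ordered pair
R_inv = {(y, x) for (x, y) in R}

# (5) Composition T = S . R over A x C:
# (x, z) is in T iff some y links (x, y) in R with (y, z) in S
T = {(x, z) for (x, y1) in R for (y2, z) in S if y1 == y2}
```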


Cardinality of Crisp Relations
⚫ X: n elements, Y: m elements
nX : the cardinality of X
nY : the cardinality of Y

⚫ Cardinality of the relation
nX×Y = nX * nY

⚫ Power set:
the cardinality of P(X × Y) is
nP(X×Y) = 2^(nX·nY)

9/14/2021 NFT 229


Composition
⚫ R = {(x1,y1),(x1,y3),(x2,y4)}
S = {(y1,z2),(y3,z2)}

⚫ Composition operation
⚫ Max-min composition
T = R ∘ S
⚫ Max-product composition
T = R ∘ S
9/14/2021 NFT 230
9/14/2021 NFT 231
Composition
⚫ Max-min composition
R is a relation matrix over X × Y (rows x1–x3, columns y1–y4), S is a relation
matrix over Y × Z (rows y1–y4, columns z1–z2), and T = R ∘ S is the resulting
matrix over X × Z (entries shown in the slide figure).

9/14/2021 NFT 232


Fuzzy Relations
⚫ Membership grades lie in the interval [0,1]
⚫ A fuzzy relation maps the Cartesian space X × Y into [0,1]

⚫ Cardinality of fuzzy relations:
when the universe is infinite, the cardinality is infinite too.

9/14/2021 NFT 233


Operations on Fuzzy Relations
⚫ Union

⚫ Intersection
⚫ Complement
⚫ Containment

9/14/2021 NFT 234


Fuzzy Cartesian Product
⚫ Cartesian product space
⚫ The fuzzy relation has the membership function
μA×B(x, y) = min(μA(x), μB(y))

⚫ Example 3.5

9/14/2021 NFT 235


Suppose we have two fuzzy sets, A defined on a universe of three discrete
temperatures, X = {x1,x2,x3}, and B defined on a universe of two discrete pressures,
Y = {y1,y2}, and we want to find the fuzzy Cartesian product between them. Fuzzy
set A could represent the “ambient” temperature and fuzzy set B the “near-
optimum” pressure for a certain heat exchanger, and the Cartesian product might
represent the conditions (temperature–pressure pairs) of the exchanger that are
associated with “efficient” operations.

9/14/2021 NFT 236
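A hedged sketch of this fuzzy Cartesian product using the usual min operator; the membership grades below are invented, since the slide's actual numbers appear only in a figure:

```python
# Assumed membership grades (the slide's actual numbers appear only in a figure)
A = {"x1": 0.2, "x2": 1.0, "x3": 0.5}   # "ambient" temperature
B = {"y1": 0.3, "y2": 0.9}              # "near-optimum" pressure

# Fuzzy Cartesian product: mu_AxB(x, y) = min(mu_A(x), mu_B(y))
R = {(x, y): min(ma, mb) for x, ma in A.items() for y, mb in B.items()}
```

Each entry of R pairs one temperature with one pressure, graded by the weaker of the two memberships.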


A certain type of virus attacks cells of the human body. The infected cells can be
visualized using a special microscope. The microscope generates digital images that
medical doctors can analyze and identify the infected cells. The virus causes the
infected cells to have a black spot, within a darker gray region. A digital image process
can be applied to the image. This processing generates two variables: the first variable,
P, is related to black spot quantity (black pixels) and the second variable, S, is related
to the shape of the black spot, that is, if they are circular or elliptic. In these images, it
is often difficult to actually count the number of black pixels, or to identify a perfect
circular cluster of pixels; hence, both these variables must be estimated in a linguistic
way. Suppose that we have two fuzzy sets: P that represents the number of black pixels
(e.g., none with black pixels, C1, a few with black pixels, C2, and a lot of black pixels,
C3) and S that represents the shape of the black pixel clusters (e.g., S1 is an ellipse and
S2 is a circle). So, we have

and we want to find the relationship between quantity of black pixels in the virus
and the shape of the black pixel clusters. U

9/14/2021 NFT 239


Fuzzy Composition
⚫ Fuzzy max-min composition
⚫ Fuzzy max-product composition
⚫ Crisp fuzzy composition
9/14/2021 NFT 243
Fuzzy Composition
⚫ X = {x1, x2}  Y = {y1, y2}  Z = {z1, z2, z3}

⚫ Max-min composition

⚫ Max-product composition

9/14/2021 NFT 244
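Both compositions can be sketched in Python over small relation matrices; the membership grades below are invented for demonstration:

```python
# Illustrative membership grades: R over X x Y, S over Y x Z
R = {"x1": {"y1": 0.7, "y2": 0.5},
     "x2": {"y1": 0.8, "y2": 0.4}}
S = {"y1": {"z1": 0.9, "z2": 0.6},
     "y2": {"z1": 0.1, "z2": 0.7}}

def max_min(R, S):
    """mu_T(x, z) = max over y of min(mu_R(x, y), mu_S(y, z))."""
    zs = next(iter(S.values())).keys()
    return {x: {z: max(min(R[x][y], S[y][z]) for y in S) for z in zs} for x in R}

def max_product(R, S):
    """mu_T(x, z) = max over y of mu_R(x, y) * mu_S(y, z)."""
    zs = next(iter(S.values())).keys()
    return {x: {z: max(R[x][y] * S[y][z] for y in S) for z in zs} for x in R}

T1 = max_min(R, S)      # e.g. T1["x1"]["z1"] = max(min(.7,.9), min(.5,.1)) = 0.7
T2 = max_product(R, S)  # e.g. T2["x1"]["z1"] = max(.7*.9, .5*.1) = 0.63
```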


Now, suppose another microscope image is taken and the number of black
pixels is slightly different; let the new black pixel quantity be represented by a
fuzzy set, P’

Using max–min composition with the relation R∼ will yield a new value for the
fuzzy set of pixel cluster shapes that are associated with the new black pixel
quantity:

9/14/2021 NFT 245


Equivalence and Tolerance Relation

9/14/2021 NFT 250


Reflexivity:
When a relation is reflexive every vertex in the graph originates a single loop.

Symmetric:
If a relation is symmetric, then in the graph for every edge pointing from vertex i to
vertex j (i,j = 1, 2, 3), there is an edge pointing in the opposite direction, that is,
from vertex j to vertex i.

Transitive:
When a relation is transitive, then for every pair of edges in the graph, one
pointing from vertex i to vertex j and the other from vertex j to vertex k (i,j,k = 1, 2,
3), there is an edge pointing from vertex i directly to vertex k.

9/14/2021 NFT 251
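The three properties above can be checked mechanically on a relation matrix. A minimal sketch, using an invented 3×3 matrix that happens to be a tolerance relation (reflexive and symmetric but not transitive):

```python
def is_reflexive(M):
    return all(M[i][i] == 1 for i in range(len(M)))

def is_symmetric(M):
    n = len(M)
    return all(M[i][j] == M[j][i] for i in range(n) for j in range(n))

def is_transitive(M):
    n = len(M)
    return all(M[i][k] == 1
               for i in range(n) for j in range(n) for k in range(n)
               if M[i][j] == 1 and M[j][k] == 1)

# Reflexive and symmetric but not transitive: a tolerance relation,
# not an equivalence relation (invented example)
M = [[1, 1, 0],
     [1, 1, 1],
     [0, 1, 1]]
```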


9/14/2021 NFT 252
Crisp Tolerance Relation:
The crisp tolerance relation exhibits only properties of reflexivity and symmetry.

9/14/2021 NFT 253


Suppose in an airline transportation system we have a universe composed of five
elements: the cities Omaha, Chicago, Rome, London, and Detroit. The airline is
studying locations of potential hubs in various countries and must consider air
mileage between cities and takeoff and landing policies in the various countries.
These cities can be enumerated as the elements of a set, that is, X = {x1,x2, x3,
x4,x5}={Omaha, Chicago, Rome, London, Detroit}.
Further, suppose we have a tolerance relation, R1, that expresses relationships among
these cities

9/14/2021 NFT 254


9/14/2021 NFT 255
Fuzzy Equivalence Relation

9/14/2021 NFT 256


Suppose, in a biotechnology experiment, five potentially new strains of bacteria
have been detected in the area around an anaerobic corrosion pit on a new
aluminum–lithium alloy used in the fuel tanks of a new experimental aircraft. In
order to propose methods to eliminate the biocorrosion caused by these bacteria,
the five strains must first be categorized. One way to categorize them is to
compare them to one another

9/14/2021 NFT 257


Fuzzy Propositions
Two-valued logic vs. Multi-valued logic:

The basic assumption upon which crisp logic is based - that every
proposition is either TRUE or FALSE.
The classical two-valued logic can be extended to multi-valued logic.
As an example, three-valued logic denotes true (1), false (0) and
indeterminacy (1/2).
9/14/2021 NFT 261


Three-valued logic
Fuzzy connectives defined for such a three-valued logic can be stated as follows:

Symbol  Connective   Usage                  Definition
¬       NOT          ¬P                     1 − T(P)
∨       OR           P ∨ Q                  max{T(P), T(Q)}
∧       AND          P ∧ Q                  min{T(P), T(Q)}
=⇒      IMPLICATION  (P =⇒ Q) or (¬P ∨ Q)   max{1 − T(P), T(Q)}
=       EQUALITY     (P = Q) or             1 − |T(P) − T(Q)|
                     [(P =⇒ Q) ∧ (Q =⇒ P)]

9/14/2021 NFT 262


Fuzzy proposition
Example 1: P : Ram is honest

1 T(P) = 0.0 : Absolutely false
2 T(P) = 0.2 : Partially false
3 T(P) = 0.4 : May be false or not false
4 T(P) = 0.6 : May be true or not true
5 T(P) = 0.8 : Partially true
6 T(P) = 1.0 : Absolutely true.

Example 2 :
P : Mary is efficient ; T(P) = 0.8;
Q : Ram is efficient ; T(Q) = 0.6

1 Mary is not efficient. T(¬P) = 1 − T(P) = 0.2
2 Mary is efficient and so is Ram. T(P∧Q) = min{T(P), T(Q)} = 0.6
3 Either Mary or Ram is efficient. T(P∨Q) = max{T(P), T(Q)} = 0.8
4 If Mary is efficient then so is Ram. T(P =⇒ Q) = max{1 − T(P), T(Q)} = 0.6

9/14/2021 NFT 263
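The connectives in the table can be written directly as Python functions and checked against Example 2 (Mary/Ram):

```python
def NOT(p):        return 1 - p          # 1 - T(P)
def AND(p, q):     return min(p, q)      # min{T(P), T(Q)}
def OR(p, q):      return max(p, q)      # max{T(P), T(Q)}
def IMPLIES(p, q): return max(1 - p, q)  # max{1 - T(P), T(Q)}

P = 0.8   # T(P): Mary is efficient
Q = 0.6   # T(Q): Ram is efficient
```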


Fuzzy Implications
Fuzzy rule
A fuzzy implication (also known as a fuzzy If-Then rule, fuzzy rule, or fuzzy
conditional statement) assumes the form:

If x is A then y is B

where A and B are two linguistic values defined by fuzzy sets A and B on the
universes of discourse X and Y, respectively.

Often, "x is A" is called the antecedent or premise, while "y is B" is called the
consequence or conclusion.

9/14/2021 NFT 264


Fuzzy implication :
Example 1
If pressure is High then temperature is Low

If mango is Yellow then mango is Sweet else mango is Sour

If road is Good then driving is Smooth else traffic is High


The fuzzy implication is denoted as R : A→B In essence, it represents a binary fuzzy
relation R on the (Cartesian) product of A×B

Example 2
Suppose P and T are two universes of discourse representing pressure and
temperature, respectively, as follows:
P = {1, 2, 3, 4} and
T = {10, 15, 20, 25, 30, 35, 40, 45, 50}
Let the linguistic variables High temperature and Low pressure be given as
T_HIGH = {(20,0.2),(25,0.4),(30,0.6),(35,0.6),(40,0.7),(45,0.8),(50,0.8)}
P_LOW = {(1,0.8),(2,0.8),(3,0.6),(4,0.4)}

9/14/2021 NFT 265


Example 2
Then the fuzzy implication
"If temperature is High then pressure is Low" can be defined as R : T_HIGH → P_LOW
where,
R=

Note : If temperature is 40 then what about low pressure?

9/14/2021 NFT 266


Example 3:
Zadeh’s Max-Min rule

"If x is A then y is B" with the implication of Zadeh’s max-min rule can be
written equivalently as:

R = (A × B) ∪ (¬A × Y), that is, μR(x, y) = max{min(μA(x), μB(y)), 1 − μA(x)}

Here, Y is the universe of discourse with membership values for all y ∈ Y equal to 1,
that is, µY(y) = 1 ∀ y ∈ Y.
Suppose
X = {a,b,c,d} and Y = {1,2,3,4} and
A = {(a,0.0),(b,0.8),(c,0.6),(d,1.0)}
B = {(1,0.2),(2,1.0),(3,0.8),(4,0.0)} are two fuzzy sets.
We are to determine R.

9/14/2021 NFT 267
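Assuming Zadeh's max-min implication μR(x, y) = max{min(μA(x), μB(y)), 1 − μA(x)}, the relation for the sets A and B of Example 3 can be computed as:

```python
A = {"a": 0.0, "b": 0.8, "c": 0.6, "d": 1.0}
B = {1: 0.2, 2: 1.0, 3: 0.8, 4: 0.0}

# Zadeh max-min implication: mu_R(x, y) = max(min(mu_A(x), mu_B(y)), 1 - mu_A(x))
R = {(x, y): max(min(ma, mb), 1 - ma)
     for x, ma in A.items() for y, mb in B.items()}
```

For instance, the whole row for x = a is 1.0 because μA(a) = 0, so the antecedent never fires and the implication is vacuously true.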


Example 3:
Zadeh’s max-min rule:
The computation of R is shown in the slide figure.

9/14/2021 NFT 268


IF x is A THEN y is B ELSE y is C.
The relation R is equivalent to R = (A × B) ∪ (¬A × C).

The membership function of R is given by
μR(x, y) = max{min(μA(x), μB(y)), min(1 − μA(x), μC(y))}

9/14/2021 NFT 269


Example 4:
X = {a,b,c,d}
Y = {1,2,3,4}
A = {(a,0.0),(b,0.8),(c,0.6),(d,1.0)}
B = {(1,0.2),(2,1.0),(3,0.8),(4,0.0)}
C = {(1,0),(2,0.4),(3,1.0),(4,0.8)}
Determine the implication relation : If x is A then y is B else y is C

9/14/2021 NFT 270


9/14/2021 NFT 271
Washing Machine – 2 rules
Rule 1: If Laundry quantity is LARGE and Laundry softness is HARD then wash cycle
is strong.

Rule 2: If Laundry quantity is MEDIUM and Laundry softness is NOT SO HARD


then wash cycle is normal.

9/14/2021 NFT 272


• Using If-Then type fuzzy rules converts the fuzzy input to the fuzzy output.

9/14/2021 NFT 273


Fuzzy Logic Example
Automotive Speed Controller
3 inputs:
speed (5 levels)
acceleration (3 levels)
distance to destination (3 levels)

1 output:
power (fuel flow to engine)

Set of rules to determine output based on input


values
9/14/2021 NFT 274
Fuzzy Logic Example

9/14/2021 NFT 275


Fuzzy Logic Example
Example Rules
IF speed is TOO SLOW and acceleration is DECELERATING,
THEN INCREASE POWER GREATLY

IF speed is SLOW and acceleration is DECREASING,


THEN INCREASE POWER SLIGHTLY

IF distance is CLOSE,
THEN DECREASE POWER SLIGHTLY

...

9/14/2021 NFT 276


Fuzzy Logic Example

Output Determination
Degree of membership in an output fuzzy set now represents each fuzzy
action.

Fuzzy actions are combined to form a system output.

9/14/2021 NFT 277


■ The classical example in fuzzy sets is tall
men. The elements of the fuzzy set “tall
men” are all men, but their degrees of
membership depend on their height.

9/14/2021 NFT 278


Crisp and fuzzy sets of “tall men”

9/14/2021 NFT 279


■ The x-axis represents the universe of
discourse – the range of all possible values
applicable to a chosen variable. In our case,
the variable is the man height. According to
this representation, the universe of men’s
heights consists of all tall men.
■ The y-axis represents the membership
value of the fuzzy set. In our case, the
fuzzy set of “tall men” maps height values
into corresponding membership values.

9/14/2021 NFT 280


How to represent a fuzzy set in a
computer?
■ First, we determine the membership
functions. In our “tall men” example, we
can obtain fuzzy sets of tall, short and
average men.
■ The universe of discourse – the men’s
heights – consists of three sets: short,
average and tall men. As you will see, a
man who is 184 cm tall is a member of the
average men set with a degree of
membership of 0.1, and at the same time,
he is also a member of the tall men set
with a degree of 0.4.
9/14/2021 NFT 281
Crisp and fuzzy sets of short, average and
tall men

9/14/2021 NFT 282


Typical Membership Functions

9/14/2021 NFT 283


Linguistic variables and hedges
■ At the root of fuzzy set theory lies the idea
of linguistic variables.
■ A linguistic variable is a fuzzy variable.
For example, the statement “John is tall”
implies that the linguistic variable John
takes the linguistic value tall.

9/14/2021 NFT 284


In fuzzy expert systems, linguistic variables
are used in fuzzy rules. For example:
IF wind is strong
THEN sailing is good

IF project_duration is long
THEN completion_risk is high

IF speed is slow
THEN stopping_distance is short

9/14/2021 NFT 285


■ The range of possible values of a linguistic
variable represents the universe of
discourse of that variable. For example, the
universe of discourse of the linguistic
variable speed might have the range
between 0 and 220 km/h and may include
such fuzzy subsets as very slow, slow,
medium, fast, and very fast.
■ A linguistic variable carries with it the
concept of fuzzy set qualifiers, called
hedges.
■ Hedges are terms that modify the shape
of fuzzy sets. They include adverbs such
as very, somewhat, quite, more or
less and slightly.
9/14/2021 NFT 286
Fuzzy sets with the hedge very

9/14/2021 NFT 287


Representation of hedges in fuzzy logic
Hedge      Mathematical Expression
A little   [μA(x)]^1.3
Slightly   [μA(x)]^1.7
Very       [μA(x)]^2
Extremely  [μA(x)]^3
(Each hedge also has a graphical representation, shown in the slide figure.)

9/14/2021 NFT 288
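Applying a hedge is just raising the membership grade to the hedge's exponent. A minimal sketch using the exponents from the table; the membership value 0.8 for "tall" is illustrative:

```python
HEDGES = {"a little": 1.3, "slightly": 1.7, "very": 2.0, "extremely": 3.0}

def hedge(name, mu):
    """Apply a hedge by raising the membership grade to the hedge's exponent."""
    return mu ** HEDGES[name]

mu_tall = 0.8                                  # illustrative grade for "tall"
very_tall = hedge("very", mu_tall)             # 0.8**2 = 0.64
extremely_tall = hedge("extremely", mu_tall)   # 0.8**3 = 0.512
```

Note how each stronger hedge shrinks the grade further, concentrating the fuzzy set.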


Representation of hedges in fuzzy logic
(continued)

9/14/2021 NFT 289


If Distance is NEAR and Angle is SMALL Then Turn Sharp Left

9/14/2021 NFT 290


Fuzzy Logic Example
In practice, a system won't require all the rules.

The system is tweaked by adding or changing rules and
by adjusting set boundaries.

System performance can be very good, but is not
usually optimized by traditional metrics (minimizing
RMS error).
Fuzzy Sets

9/14/2021 NFT 334


Fuzzy Sets

❑ A fuzzy set is a set containing elements that have varying
degrees of membership in the set.

❑ Elements in a fuzzy set, because their membership need not
be complete, can also be members of other fuzzy sets on the
same universe.

❑ Elements of a fuzzy set are mapped to a universe of
membership values using a function-theoretic form.

9/14/2021 NFT 335


Fuzzy Set Theory

● An object has a numeric “degree of membership”


● Normally, between 0 and 1 (inclusive)
● 0 membership means the object is not in the set

● 1 membership means the object is fully inside the set

● In between means the object is partially in the set

9/14/2021 NFT 336


If U is a collection of objects denoted generically by x,
then a fuzzy set A in U is defined as a set of ordered pairs:

A = {(x, μA(x)) | x ∈ U}

where μA(x) is the membership function and U is the universe of
discourse.

9/14/2021 NFT 337


Fuzzy Sets

Characteristic function X indicates the
belongingness of x to the set A:
X(x) = 1, x ∈ A
       0, x ∉ A
also called the membership function.
Hence,
A ∪ B → XA∪B(x)
= XA(x) ∪ XB(x)
= max(XA(x), XB(x))
Note: Some books use + for ∪, but still it is not ordinary
addition!
9/14/2021 NFT 338
Fuzzy Sets

A ∩ B → XA ∩ B(x)
= XA(x) ∩ XB(x)
= min(XA(x),XB(x))
A’ → XA’(x)
= 1 – XA(x)
A’’ = A

9/14/2021 NFT 339


Fuzzy Set Operations

μA ∪ B(x) = μA(x) ∪ μB(x)


= max(μA(x), μB(x))
μA ∩ B(x) = μA(x) ∩ μB(x)
= min(μA(x), μB(x))
μA’(x) = 1 - μA(x)

De Morgan’s Law also holds:
(A ∩ B)’ = A’ ∪ B’
(A ∪ B)’ = A’ ∩ B’
But, in general,
A ∪ A’ ≠ X
A ∩ A’ ≠ ∅
9/14/2021 NFT 340
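The failure of the excluded-middle and contradiction laws for fuzzy sets can be demonstrated with the grades used in the later worked example (A = {1.0, 0.20, 0.75}):

```python
A      = [1.0, 0.20, 0.75]     # membership grades of three elements
A_comp = [1 - a for a in A]    # complement A'

union_AC        = [max(a, c) for a, c in zip(A, A_comp)]  # A or A'
intersection_AC = [min(a, c) for a, c in zip(A, A_comp)]  # A and A'
# union_AC is not all 1s, and intersection_AC is not all 0s
```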
Fuzzy Set Operations

Union of fuzzy sets A∼ and B∼
(membership plot shown in the slide figure).

Intersection of fuzzy sets A∼ and B∼
(membership plot shown in the slide figure).

9/14/2021 NFT 341

Fuzzy Set Operations

Complement of fuzzy set A∼
(membership plot shown in the slide figure).

9/14/2021 NFT 342


Examples of Fuzzy Set Operations

⚫ Fuzzy union (∪): the union of two fuzzy sets is


the maximum (MAX) of each element from two
sets.
⚫ E.g.
⚫ A = {1.0, 0.20, 0.75}
⚫ B = {0.2, 0.45, 0.50}
⚫ A ∪ B = {MAX(1.0, 0.2), MAX(0.20, 0.45), MAX(0.75,
0.50)} = {1.0, 0.45, 0.75}

9/14/2021 NFT 343


Examples of Fuzzy Set Operations

⚫ Fuzzy intersection (∩): the intersection of two


fuzzy sets is just the MIN of each element from


the two sets.
E.g.
⚫ A ∩ B = {MIN(1.0, 0.2), MIN(0.20, 0.45), MIN(0.75,
0.50)} = {0.2, 0.20, 0.50}

9/14/2021 NFT 344


Examples of Fuzzy Set Operations

9/14/2021 NFT 345


Properties of Fuzzy Sets

A∪B=B∪A
A∩B=B∩A
A ∪ (B ∪ C) = (A ∪ B) ∪ C
A ∩ (B ∩ C) = (A ∩ B) ∩ C
A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C)
A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
A∪A=A A∩A=A
A∪X=X A∩X=A
A∪∅=A A∩∅=∅
If A ⊆ B ⊆ C, then A ⊆ C

A’’ = A
9/14/2021 NFT 346
Fuzzy Sets

Note μ(x) ∈ [0,1]


not {0,1} like Crisp set
A = {μA(x1) / x1 + μA(x2) / x2 + …}
= {∑ μA(xi) / xi}
Note: ‘+’ ≠ add
‘/ ’ ≠ divide
Only for representing element and its
membership.
Also some books use μ(x) for Crisp Sets too.

9/14/2021 NFT 347


Example (Discrete Universe)
A = "appropriate # courses a student may take in a semester",
defined over x = # courses taken
(the membership plot over x = 2, 4, 6, 8 is shown in the slide figure).
9/14/2021 NFT 348
Example (Discrete Universe)
A = "appropriate # courses a student may take in a semester"
Alternative Representation:
(shown in the slide figure)

9/14/2021 NFT 349


Example (Continuous Universe)
B = "about 50 years old", where
U : the set of positive real numbers (possible ages)

Alternative Representation:
(shown in the slide figure)

9/14/2021 NFT 350