NFT PPT1 w21
9/14/2021 NFT 2
List of Books
Neuro-Fuzzy and Soft Computing: A Computational Approach to Learning
& Machine Intelligence; Jyh-Shing Roger Jang, Chuen-Tsai Sun, Eiji Mizutani; PHI
Some real-world ANN applications
● Character recognition
● Image compression
● Classification of neurodegenerative diseases
● Sentiment analysis
● Forecasting
Artificial neural network
● Biologically inspired
● A network of simple processing elements
A simple neuron
The Neuron Diagram
[Diagram: input values x1 … xm, each scaled by weights w1 … wm, enter a summing function; together with the bias b this gives the induced field v, which passes through the activation function to produce the output y.]
Neuron
● The neuron is the basic information processing unit of
an NN. It consists of:
1. A set of links, describing the neuron inputs, with weights W1,
W2, …, Wm
2. An adder function (linear combiner) for computing the
weighted sum of the inputs:
u = W1x1 + W2x2 + … + Wmxm   (real numbers)
Bias of a Neuron
Activation Functions
• Controls when a unit is
“active” or “inactive”
• Threshold function:
outputs 1 when the input
is positive and 0 otherwise
• Sigmoid function:
f(x) = 1 / (1 + e^(-x))
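The two activation functions above can be sketched in Python (a minimal sketch; the function names are illustrative, not from the slides):

```python
import math

def threshold(x):
    """Threshold (step) activation: 1 when the input is positive, 0 otherwise."""
    return 1 if x > 0 else 0

def sigmoid(x):
    """Logistic sigmoid f(x) = 1 / (1 + e^(-x)): smooth output in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

print(threshold(0.5), threshold(-2.0))   # 1 0
print(round(sigmoid(0.0), 2))            # 0.5
```

The sigmoid is the differentiable counterpart of the hard threshold, which is what later makes gradient-based training possible.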
Neuron Models
● The choice of activation function determines the
neuron model.
Examples:
● step function:
● ramp function:
Step Function
Ramp Function
Sigmoid function
McCulloch-Pitts Model
AND Model: inputs x1 and x2, each with weight 1; output Y = F(yin)
OR Model: inputs x1 and x2, each with weight 2
NOT Model: a single input x1 feeding the output unit
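The McCulloch-Pitts units above can be sketched in Python. The slides give only the weights; the thresholds (theta = 2 for AND/OR, theta = 0 for NOT with an inhibitory weight of -1) are assumptions based on the standard construction:

```python
def mp_neuron(inputs, weights, theta):
    """McCulloch-Pitts unit: fires (1) when the weighted sum reaches threshold theta."""
    yin = sum(x * w for x, w in zip(inputs, weights))
    return 1 if yin >= theta else 0

AND = lambda x1, x2: mp_neuron([x1, x2], [1, 1], theta=2)   # slide weights 1, 1
OR  = lambda x1, x2: mp_neuron([x1, x2], [2, 2], theta=2)   # slide weights 2, 2
NOT = lambda x1: mp_neuron([x1], [-1], theta=0)             # inhibitory weight (assumed)

print(AND(1, 1), AND(1, 0), OR(1, 0), NOT(0), NOT(1))   # 1 0 1 1 0
```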
AND Model
OR Model
NOT Model
XOR Model
[Plots: the XOR input space shown in the X-Y plane and the transformed z-space; no single line separates the two XOR classes in the original input space.]
XOR Model
[Network: X1 and X2 feed hidden units Z1 and Z2; recovered weights: X1->Z1 = 1, X1->Z2 = -1, X2->Z1 = -1, X2->Z2 = 1, and Z1, Z2 each connect to the output with weight 1.]
Hebb Algorithm
Step 1: Initialize all weights and the bias to zero.
Step 2: For each training vector and target output pair (s, t), perform
Steps 3-6.
Step 3: Set activations for the input units with the input vector.
Step 4: Set the activation for the output unit to the target output.
Step 5: Adjust the weights by applying the Hebb rule: Wi(new) = Wi(old) + xi·y.
Step 6: Adjust the bias: b(new) = b(old) + y.
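The steps above can be sketched in Python (a minimal sketch; the function and variable names are illustrative, not from the slides):

```python
def hebb_train(samples):
    """Hebb rule over bipolar (input vector, target) pairs:
    weights get w_i += x_i * t and the bias gets b += t."""
    n = len(samples[0][0])
    w, b = [0] * n, 0                 # Step 1: weights and bias start at zero
    for x, t in samples:              # Step 2: one pass over all (s, t) pairs
        for i in range(n):
            w[i] += x[i] * t          # Step 5: Hebb weight update
        b += t                        # Step 6: bias update (a weight on a constant input of 1)
    return w, b

# Bipolar AND: target is +1 only when both inputs are +1
and_data = [([1, 1], 1), ([1, -1], -1), ([-1, 1], -1), ([-1, -1], -1)]
print(hebb_train(and_data))   # ([2, 2], -2)
```

The result ([2, 2], -2) matches the worked AND example in the deck.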
AND Model
X1   X2   B    y
 1    1   1    1
 1   -1   1   -1
-1    1   1   -1
-1   -1   1   -1
X1   X2   b    y   ΔW1  ΔW2  Δb   W1   W2   b
                                   0    0    0
 1    1   1    1    1    1    1    1    1    1
 1   -1   1   -1   -1    1   -1    0    2    0
-1    1   1   -1    1   -1   -1    1    1   -1
-1   -1   1   -1    1    1   -1    2    2   -2

Resulting network: x1 -> 2, x2 -> 2, bias = -2.
Example 2:
X1   X2   B    y
 1    1   1   -1
 1   -1   1    1
-1    1   1    1
-1   -1   1   -1

X1   X2   b    y   ΔW1  ΔW2  Δb   W1   W2   b
                                   0    0    0
 1    1   1   -1   -1   -1   -1   -1   -1   -1
 1   -1   1    1    1   -1    1    0   -2    0
-1    1   1    1   -1    1    1   -1   -1    1
-1   -1   1   -1    1    1   -1    0    0    0

(The final weights are all zero: the Hebb rule cannot learn this XOR-type pattern.)
Example 3:
X1   X2   X3   X4   B    y
 1    1   -1    1   1    1
-1    1   -1    1   1   -1
 1   -1    1   -1   1    1
-1   -1    1    1   1   -1

X1  X2  X3  X4  B   y  ΔW1 ΔW2 ΔW3 ΔW4 Δb  W1  W2  W3  W4  b
                                            0   0   0   0   0
 1   1  -1   1  1   1   1   1  -1   1   1   1   1  -1   1   1
-1   1  -1   1  1  -1   1  -1   1  -1  -1   2   0   0   0   0
 1  -1   1  -1  1   1   1  -1   1  -1   1   3  -1   1  -1   1
-1  -1   1   1  1  -1   1   1  -1  -1  -1   4   0   0  -2   0
Example 4: two 3x3 pixel patterns (marked with *) are presented as nine bipolar inputs.

X1  X2  X3  X4  X5  X6  X7  X8  X9   Y
 1   1   1  -1   1  -1   1   1   1   1
 1  -1  -1   1  -1  -1   1   1   1  -1

ΔW1 ΔW2 ΔW3 ΔW4 ΔW5 ΔW6 ΔW7 ΔW8 ΔW9  Δb
 1   1   1  -1   1  -1   1   1   1    1
-1   1   1  -1   1   1  -1  -1  -1   -1

W1  W2  W3  W4  W5  W6  W7  W8  W9   b
 0   0   0   0   0   0   0   0   0   0
 1   1   1  -1   1  -1   1   1   1   1
 0   2   2  -2   2   0   0   0   0   0
Resulting network: weights w = (0, 2, 2, -2, 2, 0, 0, 0, 0) from inputs x1 … x9, bias b = 0.
AND gate
W1   W2   B
 2    2   -2
Perceptron Rule
Step 1: Initialize all weights and the bias to zero. Set the learning rate
α (0 < α ≤ 1).
Step 2: While the stopping condition is false, do Steps 3-7.
Step 3: For each training vector and target output pair (s, t), perform
Steps 4-6.
Step 4: Set activations for the input units with the input vector.
Step 5: Compute the output unit response.
Step 6: The weights and bias are updated if the target is not equal to the output
response: if t ≠ y, then Wi(new) = Wi(old) + α·t·xi and b(new) = b(old) + α·t;
otherwise they remain unchanged.
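The perceptron rule can be sketched in Python (a minimal sketch; names are illustrative, and the zero-band step activation follows the worked tables where yin = 0 gives y = 0):

```python
def perceptron_train(samples, alpha=1.0, epochs=10):
    """Perceptron rule with bipolar targets: on error, w_i += alpha*t*x_i, b += alpha*t."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0                                  # Step 1
    for _ in range(epochs):                                # Step 2
        changed = False
        for x, t in samples:                               # Step 3
            yin = b + sum(wi * xi for wi, xi in zip(w, x)) # Step 5
            y = 1 if yin > 0 else (-1 if yin < 0 else 0)   # step activation with zero band
            if t != y:                                     # Step 6: update only on error
                for i in range(n):
                    w[i] += alpha * t * x[i]
                b += alpha * t
                changed = True
        if not changed:        # stopping condition: a full pass with no updates
            break
    return w, b

and_data = [([1, 1], 1), ([1, -1], -1), ([-1, 1], -1), ([-1, -1], -1)]
print(perceptron_train(and_data))   # ([1.0, 1.0], -1.0)
```

On bipolar AND this converges after one corrective epoch to w = (1, 1), b = -1.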
AND MODEL
X1   X2   b   Yin   y    t   ΔW1  ΔW2  Δb   W1   W2   b
                                             0    0    0
 1    1   1    0    0    1    1    1    1    1    1    1
 1   -1   1    1    1   -1   -1    1   -1    0    2    0
-1    1   1    2    1   -1    1   -1   -1    1    1   -1
-1   -1   1   -3   -1   -1    0    0    0    1    1   -1

Resulting network: x1 -> 1, x2 -> 1, bias = -1.
AND MODEL (second epoch: no further changes, so training stops)
X1   X2   b   Yin   y    t   ΔW1  ΔW2  Δb   W1   W2   b
                                             1    1   -1
 1    1   1    1    1    1    0    0    0    1    1   -1
 1   -1   1   -1   -1   -1    0    0    0    1    1   -1
-1    1   1   -1   -1   -1    0    0    0    1    1   -1
-1   -1   1   -3   -1   -1    0    0    0    1    1   -1
OR MODEL
X1   X2   b   Yin   y    t   ΔW1  ΔW2  Δb   W1   W2   b
                                             0    0    0
 1    1   1    0    0    1    1    1    1    1    1    1
 1   -1   1    1    1    1    0    0    0    1    1    1
-1    1   1    1    1    1    0    0    0    1    1    1
-1   -1   1   -1   -1   -1    0    0    0    1    1    1

Resulting network: x1 -> 1, x2 -> 1, bias = 1.
AND MODEL (binary inputs 0/1)
X1   X2   b   Yin   y    t   ΔW1  ΔW2  Δb   W1   W2   b
                                             0    0    0
 1    1   1    0    0    1    1    1    1    1    1    1
 1    0   1    2    1   -1   -1    0   -1    0    1    0
 0    1   1    1    1   -1    0   -1   -1    0    0   -1
 0    0   1   -1   -1   -1    0    0    0    0    0   -1

Weights after one epoch: w1 = 0, w2 = 0, b = -1 (further epochs are needed for convergence).
Example 3:
X1   X2   X3   X4   B    t
 1    1    1    1   1    1
 1    1    1   -1   1   -1
-1    1   -1   -1   1    1
 1   -1   -1    1   1   -1

X1  X2  X3  X4  B  Yin  y   t  ΔW1 ΔW2 ΔW3 ΔW4 Δb  W1  W2  W3  W4  b
                                                    0   0   0   0   0
 1   1   1   1  1   0   0   1   1   1   1   1   1   1   1   1   1   1
 1   1   1  -1  1   3   1  -1  -1  -1  -1   1  -1   0   0   0   2   0
-1   1  -1  -1  1  -2  -1   1  -1   1  -1  -1   1  -1   1  -1   1   1
 1  -1  -1   1  1   1   1  -1  -1   1   1  -1  -1  -2   2   0   0   0
Adaline
Step 1: Initialize all weights and the bias to small nonzero values.
Set the learning rate α (0 < α ≤ 1).
Step 2: While the stopping condition is false, do Steps 3-7.
Step 3: For each bipolar training vector and target output pair (s, t),
perform Steps 4-6.
Step 4: Set activations for the input units with the input vector.
Step 5: Compute the net input to the output unit: Yin = b + Σ xi·Wi.
Step 6: The weights and bias are updated by the delta (LMS) rule:
Wi(new) = Wi(old) + α(t - Yin)·xi,  b(new) = b(old) + α(t - Yin).
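One epoch of the Adaline delta (LMS) rule can be sketched in Python (a minimal sketch; names are illustrative, and the data/starting values below follow the worked example with α = 0.2):

```python
def adaline_epoch(samples, w, b, alpha=0.2):
    """One epoch of the Adaline delta (LMS) rule:
    w_i += alpha*(t - Yin)*x_i, b += alpha*(t - Yin); Yin is the raw linear output."""
    sse = 0.0
    for x, t in samples:
        yin = b + sum(wi * xi for wi, xi in zip(w, x))   # Step 5: net input, no thresholding
        err = t - yin
        for i in range(len(w)):
            w[i] += alpha * err * x[i]                   # Step 6: delta-rule update
        b += alpha * err
        sse += err ** 2                                  # squared error, used for stopping
    return w, b, sse

# Data and starting values as in the worked example (all targets +1, w = b = 0.2)
data = [([1, 1], 1), ([1, -1], 1), ([-1, 1], 1), ([-1, -1], 1)]
w, b, sse = adaline_epoch(data, [0.2, 0.2], 0.2)
print([round(v, 3) for v in w], round(b, 3))   # [0.059, 0.116] 0.789
```

Unlike the perceptron, Adaline updates on every pattern because it minimizes the squared error of the linear output, not the thresholded one.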
X1   X2   b   Yin     t   t-Yin   ΔW1     ΔW2     Δb      W1      W2      b       (t-Yin)²
                                                          0.2     0.2     0.2
 1    1   1   0.6     1   0.4      0.08    0.08   0.08    0.28    0.28    0.28    0.16
 1   -1   1   0.28    1   0.72     0.144  -0.144  0.144   0.424   0.136   0.424   0.518
-1    1   1   0.136   1   0.864   -0.173   0.173  0.173   0.251   0.309   0.597   0.746
-1   -1   1   0.037   1   0.963   -0.193  -0.193  0.193   0.058   0.116   0.790   0.927
Multiclass Discrimination
⚫ Often, our classification problems involve more than two classes.
⚫ For example, character recognition requires at least 26 different classes.
⚫ We can perform such tasks using layers of perceptrons or Adalines.
Multiclass Discrimination
[Diagram: a four-node perceptron for a four-class problem in n-dimensional input space; inputs i1 … in connect to outputs o1 … o4 through weights w11 … w4n.]
Multiclass Discrimination
⚫ Each perceptron learns to recognize one particular class, i.e., output 1 if the input is in that class, and 0 otherwise.
⚫ In production mode, the network decides that its input is in the k-th class if and only if ok = 1 and, for all j ≠ k, oj = 0; otherwise the input is misclassified.
⚫ For continuous outputs, the input is assigned to the class with the maximum output. This maximum should be significantly greater than all other outputs; otherwise the input is misclassified.
Multilayer Networks
⚫ Although single-layer perceptron networks can distinguish between any number of classes, they still require linear separability of inputs.
⚫ To overcome this serious limitation, we can use multiple layers of neurons.
⚫ Rosenblatt first suggested this idea in 1961, but he used perceptrons.
⚫ However, their non-differentiable output function led to an inefficient and weak learning algorithm.
⚫ The idea that eventually led to a breakthrough was the use of continuous output functions and gradient descent.
Terminology
⚫ Example: network function f: R³ -> R²
[Diagram: input vector (x1, x2, x3) -> input layer -> hidden layer -> output layer -> output vector (o1, o2).]
Madaline
Step 1: Initialize all weights and biases to small nonzero values.
Set the learning rate α (0 < α ≤ 1).
Step 2: While the stopping condition is false, do Steps 3-7.
Step 3: For each bipolar training vector and target output pair (s, t),
perform Steps 4-7.
Step 4: Set activations for the input units with the input vector.
Step 5: Compute the output unit response.
Step 7: Calculate the error and update the weights:
1. If t = y, no weight update is required.
2. If t ≠ y and t = +1, update the weights on the unit Zj whose
net input is closest to 0 (zero).
3. If t ≠ y and t = -1, update the weights on all units Zj that have
positive net input.
[Network: inputs X1, X2 feed hidden Adaline units Z1, Z2, which feed the output unit Y.
Initial parameters: W11 = 0.05, W12 = 0.1, W21 = 0.2, W22 = 0.2,
V1 = 0.5, V2 = 0.5, b1 = 0.3, b2 = 0.15, b3 = 0.5.]
X1   X2   B    y
 1    1   1   -1
 1   -1   1    1
-1    1   1    1
-1   -1   1   -1
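A forward pass through the Madaline above, before any MRI training, can be sketched in Python. The weight layout W[i][j] = weight from input i+1 to hidden unit j+1 is an assumption about the diagram:

```python
def bipolar_step(v):
    """Bipolar activation: +1 when the net input is non-negative, else -1."""
    return 1 if v >= 0 else -1

def madaline_forward(x1, x2, W, b_hidden, V, b_out):
    """Forward pass: two Adaline hidden units Z1, Z2 feed one output unit Y."""
    z1 = bipolar_step(b_hidden[0] + x1 * W[0][0] + x2 * W[1][0])
    z2 = bipolar_step(b_hidden[1] + x1 * W[0][1] + x2 * W[1][1])
    return bipolar_step(b_out + V[0] * z1 + V[1] * z2)

# Initial parameters from the slide (before MRI training adjusts them)
W = [[0.05, 0.1], [0.2, 0.2]]   # W[i][j]: input i+1 -> hidden unit j+1 (assumed layout)
print(madaline_forward(1, 1, W, [0.3, 0.15], [0.5, 0.5], 0.5))   # 1
```

With these initial values the network outputs +1 for input (1, 1), while the XOR target is -1: training is still required.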
Hetero Associative Memory Neural Networks
Example 1: A heteroassociative neural network is trained by the Hebb outer product rule
for input row vectors S = (x1, x2, x3, x4) and output row vectors t = (t1, t2). Find the
weight matrix.
S1 = (1 1 0 0)   t1 = (1 0)
S2 = (1 1 1 0)   t2 = (0 1)
S3 = (0 0 1 1)   t3 = (1 0)
S4 = (0 1 0 0)   t4 = (1 0)
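The Hebb outer product rule for Example 1 can be sketched in Python (a minimal sketch; names are illustrative):

```python
def hebb_outer(pairs):
    """Weight matrix W = sum over stored pairs of the outer product s^T t."""
    n, m = len(pairs[0][0]), len(pairs[0][1])
    W = [[0] * m for _ in range(n)]
    for s, t in pairs:
        for i in range(n):
            for j in range(m):
                W[i][j] += s[i] * t[j]    # outer-product contribution of this pair
    return W

# The four training pairs of Example 1
pairs = [([1, 1, 0, 0], [1, 0]),
         ([1, 1, 1, 0], [0, 1]),
         ([0, 0, 1, 1], [1, 0]),
         ([0, 1, 0, 0], [1, 0])]
print(hebb_outer(pairs))   # [[1, 1], [2, 1], [1, 1], [1, 0]]
```

Each stored pair contributes one rank-1 matrix; the memory is simply their sum, which is why Step 3 below adds the per-pattern matrices.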
Step 2: Find the outer-product weight matrix for each input pair, e.g.
S2 = (1 1 1 0)   t2 = (0 1)
S4 = (0 1 0 0)   t4 = (1 0)
Step 3: The weight matrix for all four patterns is the sum of the weight matrices
for the individual stored patterns.
Example 2: A heteroassociative neural network is trained by the Hebb outer product rule
for input row vectors S = (x1, x2, x3, x4) and output row vectors t = (t1, t2). Find the
weight matrix.
S1 = (1 1 0 0)   t1 = (1 0)
S2 = (0 1 0 0)   t2 = (1 0)
S3 = (0 0 1 1)   t3 = (0 1)
S4 = (0 0 1 0)   t4 = (0 1)
Step 2: Find the outer-product weight matrix for each input pair, e.g.
S2 = (0 1 0 0)   t2 = (1 0)
S4 = (0 0 1 0)   t4 = (0 1)
Step 3: The weight matrix for all four patterns is the sum of the weight matrices
for the individual stored patterns.
Test the weights using the input set:
S1 = (1 1 0 0)   t1 = (1 0)
S2 = (0 1 0 0)   t2 = (1 0)
S3 = (0 0 1 1)   t3 = (0 1)
S4 = (0 0 1 0)   t4 = (0 1)
Test the weights on a different input:
S3 = (1 1 1 0)   t3 = ?
Example 3: A heteroassociative neural network is trained by the Hebb outer product rule
for input row vectors S = (x1, x2, x3, x4) and output row vectors t = (t1, t2). Find the
weight matrix.
S1 = (1 1 -1 -1)   t1 = (1 -1)
S2 = (-1 1 -1 -1)  t2 = (1 -1)
S3 = (-1 -1 1 1)   t3 = (-1 1)
S4 = (-1 -1 1 -1)  t4 = (-1 1)
Step 2: Find the outer-product weight matrix for each input pair, e.g.
S2 = (-1 1 -1 -1)  t2 = (1 -1)
S4 = (-1 -1 1 -1)  t4 = (-1 1)
Step 3: The weight matrix for all four patterns is the sum of the weight matrices
for the individual stored patterns.
Test the weights using the input set:
S3 = (-1 -1 1 1)   t3 = (-1 1)
Test the weights on new inputs:
S3 = (1 0 0 0)   t3 = ?
S4 = (1 1 1 0)   t4 = ?
S5 = (1 1 1 1)    t5 = ?
S6 = (0 1 -1 0)   t6 = ?
Auto Associative Memory Neural Networks
⚫ Synaptic efficacy of the pre-synaptic connections is modeled using real weights wi
⚫ The response of the neuron is a nonlinear function f of its weighted inputs
⚫ Units that simply copy values forward are called input nodes
⚫ An input may excite or inhibit the response of the neuron to which it is applied, depending upon the weight of the connection
⚫ Normally, positive weights are considered as excitatory while negative weights are thought of as inhibitory
⚫ Learning is the process of modifying the weights in order to produce a network that performs some function
⚫ Piecewise linear
⚫ Training Set: a collection of input-output patterns that are used to train the network
⚫ Testing Set: a collection of input-output patterns that are used to assess network performance
⚫ Learning Rate α: a scalar parameter, analogous to step size in numerical integration, used to set the rate of adjustments
⚫ Root-Mean-Squared-Error (RMSE)
[Backpropagation worked example: input x = [0.6, 0.8, 0] feeds hidden units Z1-Z3
through weights W = [2 1 0; 1 2 2; 0 3 1] with biases W0 = [0, 0, -1]; the hidden
outputs feed Y through V = [-1, 1, 2] with bias V0 = -1; target T = 0.9.]
Architecture
[Kohonen layer: the input connects to every neuron i through its weight vector wi; the winning neuron is the one whose weight vector best matches the input.]
[Worked examples (weights recovered from the figures):
- Inputs x1-x4 to cluster units Z1, Z2 with weights 0.2, 0.8, 0.6, 0.4, 0.5, 0.7, 0.3, 0.9.
- Inputs x1-x3 to cluster units C1, C2 with weights 0.9, 0.7, 0.6, 0.4, 0.3, 0.5.
- Input X = [0.3, 0.4], α = 0.3, cluster units C1-C4 with weights 0.6, 0.5, 0.4, 0.7, 0.9, 0.6, 0.2, 0.8.]
N >> l.
• The LVQ algorithm proceeds as follows:
i. Suppose that the Voronoi vector wc is the closest to the input
vector xi. Let Cwc and Cxi denote the class labels associated
with wc and xi, respectively. Then the Voronoi vector wc is
adjusted as follows:
• If Cwc = Cxi, then wc(n+1) = wc(n) + αn [xi - wc(n)],
where 0 < αn < 1
• If Cwc ≠ Cxi, then wc(n+1) = wc(n) - αn [xi - wc(n)]
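The LVQ1 update above can be sketched in Python (a minimal sketch; names are illustrative):

```python
def lvq_update(wc, xi, same_class, alpha):
    """LVQ1 update of the winning Voronoi vector: move toward xi when the
    class labels match, away from xi when they differ."""
    sign = 1.0 if same_class else -1.0
    return [w + sign * alpha * (x - w) for w, x in zip(wc, xi)]

# Matching labels: wc moves 30% of the way toward the input
updated = lvq_update([0.5, 0.5], [1.0, 0.0], True, 0.3)
print([round(v, 2) for v in updated])   # [0.65, 0.35]
```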
⚫ Neuron layers
⚫ Cumulative current input
⚫ Current output
⚫ Error signal for each neuron
⚫ Bias weights
⚫ Input nodes compute only the identity function
k ranges over the input indices and Wjk
is the weight on the connection
from input k to neuron j
⚫ The error signal for output neuron j and pattern p is δpj = (Tpj - Opj) Opj (1 - Opj)
⚫ Tpj is the target value of output neuron j for pattern p
⚫ Opj is the actual output value of output neuron j for pattern p
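The output-layer error signal can be sketched in Python (a minimal sketch; the name is illustrative):

```python
def output_delta(T, O):
    """Backprop error signal for a sigmoid output neuron:
    delta = (T - O) * O * (1 - O), i.e. the error times the
    sigmoid derivative O * (1 - O)."""
    return (T - O) * O * (1 - O)

print(round(output_delta(0.9, 0.5), 3))   # 0.1
```

The O(1 - O) factor is largest near O = 0.5, so neurons that are "undecided" receive the strongest corrective signal.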
[Membership functions over normalized speed: Slow at Speed = 0, Fast at Speed = 1;
Fast covers [0.50 - 0.75], Fastest covers [0.75 - 1.00].]
Some Examples
⚫ Temperature Controller
IF distance is CLOSE,
THEN DECREASE POWER SLIGHTLY
...
Output Determination
Degree of membership in an output fuzzy set now represents each fuzzy
action.
Examples:
A number is close to 5
Examples:
He/she is tall
⚫ For example, a universe of discourse could be defined as any of the following:
⚫ the clock speeds of computer CPUs;
⚫ the operating currents of an electronic motor;
⚫ the operating temperature of a heat pump;
⚫ the integers 1 to 10.
Union:
A ∪ B = {x | x ∈ A or x ∈ B}
Intersection:
A ∩ B = {x | x ∈ A and x ∈ B}
Complement:
A’ = {x | x ∉ A, x ∈ X}
X – Universal Set
Set Difference:
A | B = {x | x ∈ A and x ∉ B}
Set difference is also denoted by A - B
Complement of set A.
A∪B=B∪A
A∩B=B∩A
A ∪ (B ∪ C) = (A ∪ B) ∪ C
A ∩ (B ∩ C) = (A ∩ B) ∩ C
A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C)
A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
A∪A=A
A∩A=A
A∪X=X
A∩X=A
A∪∅=A
A∩∅=∅
Mapping of Classical Sets to Functions
[Diagram: a membership function with its height, core, support, and boundary regions labeled.]
Law of excluded middle
A ∪ Ã = X
Law of contradiction
A ∩ Ã = ø
De Morgan's Law
⚫ Extension to n sets
A1×A2×...×An =
{(a1, ... , an) | a1 ∈ A1, a2 ∈ A2, ... , an ∈ An }
Product set A × B
Crisp Relation
A × A = {(a1, a1), (a1, a2), (a1, a3), (a2, a1), (a2, a2), (a2,
a3), (a3, a1), (a3, a2), (a3, a3)}
Cartesian product A × A
Crisp Relation
⚫ Definition: Binary Relation
R = { (x,y) | x ∈ A, y ∈ B } ⊆ A x B
⚫ n-ary Relation
(x1, x2, x3, … , xn) ∈ R ,
R ⊆ A1 × A2 × A3 × … × An
A × B = {(0,a),(0,b),(0,c),(1,a),(1,b),(1,c)}
B × A = {(a,0),(a,1),(b,0),(b,1),(c,0),(c,1)}
A × A = A2 = {(0,0),(0,1),(1,0),(1,1)}
B × B = B2 ={(a,a),(a,b),(a,c),(b,a),(b,b),(b,c),(c,a),(c,b),(c,c)}
⚫ χR(x, y) = 1 if (x, y) ∈ R, and 0 if (x, y) ∉ R, for (x, y) ∈ X × Y

R    b1   b2   b3
a1   1    0    0
a2   0    1    0
a3   0    1    0
a4   0    0    1

The relation can be shown as a matrix or as a directed graph.
Crisp Relation
⚫ Operations on relations R, S ⊆ A × B
(1) Union T = R ∪ S
If (x, y) ∈ R or (x, y) ∈ S, then (x, y) ∈ T
(2) Intersection T = R ∩ S
If (x, y) ∈ R and (x, y) ∈ S, then (x, y) ∈ T.
(3) Complement
If (x, y) ∉ R, then (x, y) ∈ RC
(4) Inverse
R-1 = {(y, x) ∈ B × A | (x, y) ∈ R, x ∈ A, y ∈ B}
(5) Composition T
R ⊆ A × B, S ⊆ B × C , T = S ∙ R ⊆ A × C
T = {(x, z) | x ∈ A, y ∈ B, z ∈ C, (x, y) ∈ R, (y, z) ∈ S}
⚫ The cardinality of X × Y: n(X × Y) = nX * nY
⚫ The cardinality of the power set P(X × Y): n(P(X × Y)) = 2^(nX * nY)
S={(Y1,Z2),(Y3,Z2)}
⚫ Composition operation
⚫ Max-min composition: T = R ∘ S
⚫ Max-product composition: T = R ∘ S
Composition
⚫ Max-min composition
[Matrices: R relates x1-x3 to y1-y4, S relates y1-y4 to z1-z2, and T = R ∘ S relates x1-x3 to z1-z2.]
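Max-min composition can be sketched in Python. The membership values below are illustrative placeholders, not the matrices from the slide:

```python
def max_min_composition(R, S):
    """T = R ∘ S with T[i][k] = max over j of min(R[i][j], S[j][k])."""
    return [[max(min(R[i][j], S[j][k]) for j in range(len(S)))
             for k in range(len(S[0]))] for i in range(len(R))]

# Illustrative membership values (assumed, not from the slide)
R = [[0.7, 0.5], [0.8, 0.4]]
S = [[0.9, 0.6], [0.1, 0.7]]
print(max_min_composition(R, S))   # [[0.7, 0.6], [0.8, 0.6]]
```

It is matrix multiplication with min in place of product and max in place of sum; max-product composition simply keeps the ordinary product inside.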
⚫ Example 3.5: we want to find the relationship between the quantity of black pixels
in the virus image and the shape of the black pixel clusters.
⚫ Max-min composition
⚫ Max-product composition
Using max–min composition with the relation R∼ will yield a new value for the
fuzzy set of pixel cluster shapes that are associated with the new black pixel
quantity:
Symmetric:
If a relation is symmetric, then in the graph for every edge pointing from vertex i to
vertex j (i,j = 1, 2, 3), there is an edge pointing in the opposite direction, that is,
from vertex j to vertex i.
Transitive:
When a relation is transitive, then for every pair of edges in the graph, one
pointing from vertex i to vertex j and the other from vertex j to vertex k (i,j,k = 1, 2,
3), there is an edge pointing from vertex i directly to vertex k.
The basic assumption upon which crisp logic is based - that every
proposition is either TRUE or FALSE.
The classical two-valued logic can be extended to multi-valued logic.
As an example, three-valued logic denotes true (1), false (0), and
indeterminacy (1/2).
¬   NOT          ¬P                                  1 - T(P)
∨   OR           P ∨ Q                               max{T(P), T(Q)}
∧   AND          P ∧ Q                               min{T(P), T(Q)}
⇒   IMPLICATION  (P ⇒ Q) or (¬P ∨ Q)                 max{(1 - T(P)), T(Q)}
=   EQUALITY     (P = Q) or [(P ⇒ Q) ∧ (Q ⇒ P)]      1 - |T(P) - T(Q)|
Example 2 :
P : Mary is efficient ; T(P) = 0.8;
Q : Ram is efficient ; T(Q) = 0.6
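The connectives in the table above, applied to Example 2's truth values, can be sketched in Python (a minimal sketch; names are illustrative):

```python
def f_not(p):        return 1 - p
def f_or(p, q):      return max(p, q)
def f_and(p, q):     return min(p, q)
def f_implies(p, q): return max(1 - p, q)      # P => Q read as (not P) or Q
def f_equal(p, q):   return 1 - abs(p - q)

T_P, T_Q = 0.8, 0.6   # truth values from Example 2
print(round(f_not(T_P), 2), f_or(T_P, T_Q), f_and(T_P, T_Q),
      f_implies(T_P, T_Q), round(f_equal(T_P, T_Q), 2))   # 0.2 0.8 0.6 0.6 0.8
```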
If x is A then y is B
where, A and B are two linguistic variables defined by fuzzy sets A and B on the
universe of discourses X and Y, respectively.
Example 2
Suppose, P and T are two universes of discourses representing pressure and
temperature, respectively as follows.
P ={1,2,3,4}and
T ={10, 15, 20, 25, 30, 35, 40, 45, 50}
Let the linguistic variables High temperature and Low pressure be given as
THIGH = {(20,0.2),(25,0.4),(30,0.6),(35,0.6),(40,0.7),(45,0.8),(50,0.8)}
PLOW = {(1,0.8),(2,0.8),(3,0.6),(4,0.4)}
And
1 output:
power (fuel flow to engine)
IF distance is CLOSE,
THEN DECREASE POWER SLIGHTLY
...
Output Determination
Degree of membership in an output fuzzy set now represents each fuzzy
action.
IF project_duration is long
THEN completion_risk is high
IF speed is slow
THEN stopping_distance is short
Slightly: [μA(x)]^1.7
Extremely: [μA(x)]^3
Fuzzy Logic Example
membership function
U: universe of discourse.
A ∩ B → XA ∩ B(x)
= XA(x) ∩ XB(x)
= min(XA(x),XB(x))
A’ → XA’(x)
= 1 – XA(x)
A’’ = A
⚫ The intersection is the minimum of the membership grades in the two sets. E.g.
⚫ A ∩ B = {MIN(1.0, 0.2), MIN(0.20, 0.45), MIN(0.75, 0.50)} = {0.2, 0.20, 0.50}
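The pointwise fuzzy set operations can be sketched in Python. The membership grades come from the example above; the universe element labels x1-x3 are illustrative:

```python
def fuzzy_intersection(A, B):
    """Pointwise minimum of membership grades over a shared universe."""
    return {x: min(A[x], B[x]) for x in A}

def fuzzy_union(A, B):
    """Pointwise maximum of membership grades."""
    return {x: max(A[x], B[x]) for x in A}

def fuzzy_complement(A):
    """Membership grade of the complement: 1 - μA(x)."""
    return {x: 1 - A[x] for x in A}

# Grades from the slide's example, attached to illustrative elements x1-x3
A = {'x1': 1.0, 'x2': 0.20, 'x3': 0.75}
B = {'x1': 0.2, 'x2': 0.45, 'x3': 0.50}
print(fuzzy_intersection(A, B))   # {'x1': 0.2, 'x2': 0.2, 'x3': 0.5}
```

Because min/max generalize crisp AND/OR, the identity laws listed below hold for these operations exactly as for classical sets.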
A∪B=B∪A
A∩B=B∩A
A ∪ (B ∪ C) = (A ∪ B) ∪ C
A ∩ (B ∩ C) = (A ∩ B) ∩ C
A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C)
A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
A∪A=A A∩A=A
A∪X=X A∩X=A
A∪∅=A A∩∅=∅
If A ⊆ B ⊆ C, then A ⊆ C
A’’ = A
Fuzzy Sets
[Plot: membership grade (0 to 1, with 0.5 marked) of “appropriate # courses taken” versus x = # courses taken (2, 4, 6, 8).]
Example (Discrete Universe)
X = # courses a student may take in a semester; fuzzy set: “appropriate # courses taken”
Alternative Representation:
[A second example on a continuous universe (x in years), with its alternative representation.]