[CPEN6144]
Module #2
Introduction to Artificial Neural Networks
[Figure: General model of a processing element (PE): inputs x1 … xm with weights wi1 … wim enter the net input function f(.), followed by the activation function a(.), producing the output yi.]
Fundamental Concepts
A linear threshold unit (LTU) is a PE with a linear integration function and a hard-limiter activation function.
M-P Neuron

[Figure: M-P neuron: inputs x1 … xm with weights wi1 … wim and bias b feed the net input function f(.) and activation function a(.), producing output yi.]

y_i(t+1) = a( Σ_{j=1}^{m} w_ij · x_j(t) − b ),   where a(f) = +1 if f ≥ 0; −1 if f < 0

b : bias (threshold)
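The rule above can be sketched in Python (an illustration added here, not part of the original module; the bipolar hard limiter follows the activation a(f) defined above):

```python
def mp_neuron(x, w, b):
    """McCulloch-Pitts neuron: weighted sum of inputs minus bias,
    passed through a bipolar hard-limiter activation."""
    f = sum(wj * xj for wj, xj in zip(w, x)) - b  # net input f
    return 1 if f >= 0 else -1                    # a(f): +1 if f >= 0, else -1

# Example: a 2-input neuron with weights [1, 1] and bias 1.5
# fires (+1) only when both inputs are 1 (logical AND).
print(mp_neuron([1, 1], [1, 1], 1.5))  # → 1
print(mp_neuron([1, 0], [1, 1], 1.5))  # → -1
```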
b=1;
w1=[-1/3;-1/2];
W=[w1'];

dx=[1 1 1.5];    % data 1, x1 coordinates
dy=[1 1.2 0.8];  % data 1, x2 coordinates
d1x=[1 1 2 2.2 2.6 3 3.1];      % data 2, x1 coordinates
d1y=[2.1 3 2.8 2.1 1.5 3 1.8];  % data 2, x2 coordinates

figure(1)
hold on
plot(dx,dy,'og','MarkerFaceColor','g');   % plot data 1
plot(d1x,d1y,'ob','MarkerFaceColor','b'); % plot data 2
xlabel('P1');   % label P1 axis
ylabel('P2');   % label P2 axis
axis([0 3.5 0 3.5]);
plot([0 3],[2 0]);   % decision boundary

Single Node Neural Network

PV=[dx;dy];
PA=[d1x;d1y];
n=W*PV+b;
figure(2)
hold on
plot([0 3],[2 0]);   % decision boundary
xlabel('P1');
ylabel('P2');
axis([0 3.5 0 3.5]);
while (b==1)
    p1=input('dx=');
    p2=input('dy=');
    P=[p1;p2];
    a=hardlim(W*P+b)
    if (a==1)
        disp('class 1')
        plot(p1,p2,'og','MarkerFaceColor','g')
    else
        disp('class 2')
        plot(p1,p2,'ob','MarkerFaceColor','b')
    end
end
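For readers without MATLAB, the same single-node decision can be sketched in Python (a hypothetical port, not part of the original module; hardlim(n) mimics MATLAB's hardlim, returning 1 when n >= 0 and 0 otherwise):

```python
# Hypothetical Python port of the single-node classifier above.
W = [-1/3, -1/2]   # weights, same as w1' in the MATLAB code
b = 1              # bias

def hardlim(n):
    """MATLAB-style hard limit: 1 if n >= 0, else 0."""
    return 1 if n >= 0 else 0

def classify(p1, p2):
    n = W[0] * p1 + W[1] * p2 + b   # n = W*P + b
    return 'class 1' if hardlim(n) == 1 else 'class 2'

print(classify(1, 1))  # a point from data 1 → class 1
print(classify(3, 3))  # a point from data 2 → class 2
```

Note that the plotted boundary from (0, 2) to (3, 0) is exactly the line -p1/3 - p2/2 + 1 = 0, i.e. W·P + b = 0, so points on one side give n >= 0 (class 1) and the others n < 0 (class 2).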
Questions
1. Run a test with 5 random data points; what is the accuracy?
2. If the weights are changed, what happens, and what is the effect on the accuracy?
3. Write a program for other data, using a function and an LGU activation with the bipolar sigmoid (tansig).
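As a starting point for question 3, the bipolar sigmoid used by MATLAB's tansig can be written in Python; it is mathematically identical to the hyperbolic tangent (this sketch is an added hint, not a full solution):

```python
import math

def tansig(n):
    """Bipolar sigmoid (MATLAB tansig): output in (-1, 1)."""
    return 2.0 / (1.0 + math.exp(-2.0 * n)) - 1.0

# tansig(n) equals tanh(n) for all n:
print(abs(tansig(0.5) - math.tanh(0.5)) < 1e-12)  # → True
print(tansig(0))  # → 0.0
```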
Basic Models of ANNs
3. Learning/Training Rule
-Supervised Learning-
Correct inputs and their desired outputs are provided; the weight adjustment is performed based on the error of the computed output.

[Figure: Supervised learning: the input x enters the ANN (weights W) and produces the actual output y; an error-signal generator compares y with the desired output d and feeds the error back to adjust W.]
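The error-driven adjustment described above can be sketched as a perceptron-style update (illustrative Python added here, with an assumed learning rate eta; this is one common instance of the rule, not the module's exact algorithm):

```python
def train_step(w, b, x, d, eta=0.1):
    """One supervised update: compute the actual output y,
    then adjust weights and bias by the error (d - y)."""
    f = sum(wj * xj for wj, xj in zip(w, x)) + b
    y = 1 if f >= 0 else -1                  # bipolar hard limiter
    err = d - y                              # error signal
    w = [wj + eta * err * xj for wj, xj in zip(w, x)]
    b = b + eta * err
    return w, b

# If the actual output already matches the target, nothing changes.
w, b = train_step([0.5, -0.2], 0.0, [1, 1], d=1)
print(w, b)  # → [0.5, -0.2] 0.0
```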
Perceptron

y_i^(k) = a( w_i^T x^(k) ) = a( Σ_{j=1}^{m} w_ij · x_j^(k) ),   Target: d_i^(k)
Perceptron

[Figure: A simple perceptron. The decision boundary is the hyperplane w_i^T x^(k) = 0.]
Perceptron
Linearly Separable Problems
A single-layer perceptron can classify data into only two classes, and only for linearly separable problems.
A linearly separable problem is one in which a decision plane (in two dimensions, a line) can be found in the input space that separates the input patterns with desired output = +1 from those with desired output = -1.
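For example, logical AND over bipolar inputs is linearly separable, while XOR is not. A quick check with one fixed line (the weights here are assumed for illustration):

```python
def sign(f):
    """Bipolar hard limiter: +1 if f >= 0, else -1."""
    return 1 if f >= 0 else -1

# AND over bipolar inputs: the line x1 + x2 - 1 = 0 puts (+1, +1)
# on one side and the three -1 patterns on the other.
AND = {(-1, -1): -1, (-1, 1): -1, (1, -1): -1, (1, 1): 1}
ok = all(sign(x1 + x2 - 1) == d for (x1, x2), d in AND.items())
print(ok)  # → True

# XOR admits no such line, so no single-layer perceptron can learn it.
```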
(The threshold can be carried as the weight on an extra fixed input x_m = -1.)