
AI for Mechanical Engineers

Dr. Arsalan Arif

Artificial Intelligence: A Modern Approach
Stuart J. Russell and Peter Norvig

Spring 2023
Gradient and Non-Gradient Optimization

A minimum of f(x) occurs where the derivative vanishes:

df(x)/dx = 0
Example a: f(x) = 2x^2 + 20x − 22

clear
clc
x=-20:0.1:20;                 % sample the domain
y=2.*x.^2+20.*x-22;           % evaluate f(x) at every sample
plot(x,y)
grid
% Walk from the left end of the domain while y keeps decreasing;
% the walk stops at the first sample where y starts to rise.
i=1;
while(y(i)>y(i+1))
    i=i+1;
end
x(i)                          % x at the minimum
y(i)                          % minimum value of f(x)
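As a check, dy/dx = 4x + 20 = 0 gives x = −5, and y(−5) = 2(25) − 100 − 22 = −72, so the walk should stop at (−5, −72).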
Matlab
Example a (using the derivative): f(x) = 2x^2 + 20x − 22

clear
clc

x=-20:0.1:20;
y=2.*x.^2+20.*x-22;           % f(x)

dy=4.*x+20;                   % analytical derivative f'(x)
plot(x,y)
hold on
plot(x,dy,'r')                % derivative in red; it crosses zero at the minimum
grid
point=min(y)                  % smallest sampled value of f(x)
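The red derivative line crosses zero at x = −5, exactly where min(y) = −72 occurs. Note that min(y) returns only the minimum value; to recover its location, use the two-output form, e.g. [point,k] = min(y); x(k).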
Matlab
Example b: f(x) = x^3 − 4x

x=-3:0.1:3;
y=x.^3-4.*x;                  % f(x) = x^3 - 4x
plot(x,y)
grid on
% Same descent walk as Example a: advance while y keeps decreasing.
i=1;
while (y(i)>y(i+1))
    i=i+1;
end
x(i)
y(i)
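A caveat: y = x^3 − 4x is increasing at the left end of the interval (y(−3) = −15 but y(−2.9) ≈ −12.8), so the loop condition fails immediately and the walk reports the endpoint (−3, −15) rather than the local minimum at x = 2/√3 ≈ 1.15, where y ≈ −3.08. The descent walk only works when the function initially decreases from the starting point.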
Example c: Matlab
Example d:
Gradient-Based Optimization
A gradient-based solver uses derivatives to find the optimal value of a function.
Analogy: to reach a camp at the bottom of a mountain, we can either follow a map (the gradient), or sometimes the camp can be seen directly from the mountain.

3 Steps
• Search direction
• Step size
• Convergence check

Search Direction
• Slope: the derivative in one dimension, the gradient in two or more
• In this step the algorithm chooses which direction to go

Convergence Check and Step Size

• The step size tells how far to go in the chosen direction.
• Without a sensible step size we may end up on another mountain.
• Once the step size is decided, the solver moves in the chosen direction.
• It keeps checking whether the bottom has been reached.
• If not, the solver uses the slope and step size again until the goal is achieved (it tries to reach the bottom).
• These iterations continue until the minimum (the bottom) is reached, as in the sketch below.
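A minimal MATLAB sketch of this loop for f(x) = 2x^2 + 20x − 22 from Example a; the starting point, the step size of 0.05, and the tolerance of 1e-6 are assumed for illustration, not taken from the slides:

clear
clc
x=20;                        % assumed starting point
step=0.05;                   % step size (learning rate), assumed
for k=1:1000
    slope=4*x+20;            % search direction from the derivative
    x=x-step*slope;          % move downhill
    if abs(slope)<1e-6       % convergence check: flat slope means bottom
        break
    end
end
disp(x)                      % approaches -5
disp(2*x^2+20*x-22)          % approaches -72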

Convergence
Gradient-based optimization finds the minimum of the cost function.

The cost function tells us how far our predicted values are from the actual values, so we want to minimize it.

House Example
Predicting a house's cost from its location and size.
As we go downhill, we reduce the cost function.
The step size is also called the learning rate; a sketch follows below.
[Figure: cost function f(x) versus x]
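A minimal sketch of this idea: fit price ≈ theta0 + theta1·size by gradient descent on a mean-squared-error cost. The data values, learning rate, and iteration count below are invented for illustration:

clear
clc
sz=[1.0 1.5 2.0 2.5 3.0];        % house sizes (hypothetical data)
price=[2.1 2.9 4.2 4.8 6.1];     % observed costs (hypothetical data)
theta0=0; theta1=0;              % initial estimates
alpha=0.1;                       % learning rate (step size), assumed
for k=1:2000
    err=theta0+theta1*sz-price;  % predicted minus actual values
    theta0=theta0-alpha*mean(err);     % gradient step for theta0
    theta1=theta1-alpha*mean(err.*sz); % gradient step for theta1
end
disp([theta0 theta1])            % fitted intercept and slope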
Determine the minimum of f(x, y) = x^2 + 2x + y^2 + 4 (Equation a), using (2, 1) (theta0 and theta1) as the initial estimate.

∂f/∂x = 2x + 2 = 2(2) + 2 = 6
∂f/∂y = 2y = 2(1) = 2

∇f = 6i + 2j

p(i+1) = p(i) + S∇f, i.e. x(i+1) = x(i) + S ∂f/∂x and y(i+1) = y(i) + S ∂f/∂y

p(i+1) = [2; 1] + S[6; 2] = [2 + 6S; 1 + 2S]
Put x = 2 + 6S and y = 1 + 2S in Equation (a):

f(i+1) = G(S) = (2 + 6S)^2 + 2(2 + 6S) + (1 + 2S)^2 + 4 = 40S^2 + 40S + 13

The derivative of G(S) must equal zero to get the minimum along this direction:

G′(S) = 80S + 40 = 0  →  S = −0.5 (step size)

p(i+1) = [2 + 6(−0.5); 1 + 2(−0.5)] = [2 − 3; 1 − 1] = [−1; 0]
Determine the minimum of f(x, y) = x^2 + 2x + y^2 + 4, using (2, 1) as the initial estimate.

For the initial values (2, 1): f(x, y) = 13

After the 1st iteration (−1, 0): f(x, y) = 3


Second Iteration

We calculate the gradient at the new point (−1, 0).


∂f/∂x = 2x + 2 = 2(−1) + 2 = 0
∂f/∂y = 2y = 2(0) = 0

∇f = 0i + 0j

At the point (−1, 0) the gradient is 0, so we have the optimal solution: f_min = 3 at (−1, 0). A sketch of this procedure follows below.
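A minimal MATLAB sketch reproducing this procedure (steepest descent with the exact line search used above); the iteration cap and tolerance are assumed:

clear
clc
p=[2;1];                         % initial estimate (theta0, theta1)
for k=1:100
    g=[2*p(1)+2; 2*p(2)];        % gradient of f(x,y)=x^2+2x+y^2+4
    if norm(g)<1e-12             % zero gradient: optimum reached
        break
    end
    % For this quadratic, G'(S)=0 always gives S=-0.5 (as derived above).
    p=p-0.5*g;                   % take the step p(i+1)=p(i)+S*grad(f)
end
disp(p)                          % (-1, 0)
disp(p(1)^2+2*p(1)+p(2)^2+4)     % f_min = 3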


Matlab
Example GA: f(x1, x2) = 5x1^2 + x2^2 + 4x1x2 − 14x1 − 6x2 + 2, initial guess (−10, −10)

∂f/∂x1 = 10x1 + 4x2 − 14
∂f/∂x2 = 2x2 + 4x1 − 6

At (−10, −10), f(x1, x2) = 1202


Matlab
Example GA (continued):

clear
clc
x1=-10;                          % initial guess
x2=-10;
fg=5*x1^2+x2^2+4*x1*x2-14*x1-6*x2+2;
disp(fg)                         % f at the initial guess: 1202
hold on
for h=1:100000
    x1=x1-0.05*(10*x1+4*x2-14);  % step down the x1 gradient
    x2=x2-0.05*(2*x2+4*x1-6);    % step down the x2 gradient
end
disp(x1);
disp(x2);
fg=5*x1^2+x2^2+4*x1*x2-14*x1-6*x2+2;  % recompute f at the final point
disp(fg)
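Since the gradient vanishes where 10x1 + 4x2 = 14 and 4x1 + 2x2 = 6, the iterates converge to x1 = 1, x2 = 1, where f = −8; fg is recomputed after the loop so the final disp(fg) reports this value rather than the initial 1202.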
Matlab
Example GA (visualized): initial guess (−10, −10)

∂f/∂x1 = 10x1 + 4x2 − 14
∂f/∂x2 = 2x2 + 4x1 − 6

clear
clc
syms x1 x2;
fg=5*x1^2+x2^2+4*x1*x2-14*x1-6*x2+20;  % symbolic f, used only to draw the surface
fsurf(fg,[-10 10 -10 10]);             % plot the cost surface over the search box
pause(8);
hold on;
x1=-10;                                % numeric initial guess replaces the symbols
x2=-10;

for h=1:50000
    x1=x1-0.001*(10*x1+4*x2-14);       % small learning rate: many short steps
    x2=x2-0.001*(2*x2+4*x1-6);
    fg=5*x1^2+x2^2+4*x1*x2-14*x1-6*x2+20;  % f at the current iterate
    plot3(x1,x2,fg,'mx','linewidth',3)     % mark the iterate on the surface
    pause(0.02)
end
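As the animation runs, a magenta cross marks each iterate, tracing the descent path down the surface toward the minimizer near x1 = 1, x2 = 1. (The constant term here is 20 rather than the 2 used above; it shifts the surface height but not the location of the minimum.)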
Practice Problem
Determine the minimum of f(x, y):
