Nonlinear Curve Fitting: "Why Fit in When You Were Born To Stand Out?" - Dr. Seuss
Objectives
• Master polynomial regression.
• Become competent in implementing multiple linear regression.
• Comprehend the formulation of the nonlinear least-squares model.
• Learn to solve different curve fits using MATLAB.
• Learn to apply nonlinear regression with optimization techniques.
Polynomial Regression
Beyond 1st order
• Nonlinear fits are often required.
• Fig.: when a 1st-order polynomial (linear fit) does not fit a data set well, a 2nd-order (quadratic) polynomial fit is needed → polynomial regression.
• The least-squares procedure can be extended to higher order polynomials.
• The 2nd-order (quadratic) polynomial:
y = a0 + a1 x + a2 x² + e, where e is the error.
• The sum of the squares of the residuals:
Sr = Σᵢ₌₁ⁿ (yi − a0 − a1 xi − a2 xi²)²
• The partial derivatives:
∂Sr/∂a0 = −2 Σ (yi − a0 − a1 xi − a2 xi²)
∂Sr/∂a1 = −2 Σ xi (yi − a0 − a1 xi − a2 xi²)
∂Sr/∂a2 = −2 Σ xi² (yi − a0 − a1 xi − a2 xi²)
• Setting the derivatives to zero gives the normal equations:
n a0 + (Σ xi) a1 + (Σ xi²) a2 = Σ yi
(Σ xi) a0 + (Σ xi²) a1 + (Σ xi³) a2 = Σ xi yi
(Σ xi²) a0 + (Σ xi³) a1 + (Σ xi⁴) a2 = Σ xi² yi
• 2nd–order polynomial
→ 3 linear equations with 3 unknowns, a0, a1, a2.
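The same construction works for any polynomial order. The following NumPy sketch (a translation, not the slides' MATLAB code) builds the (m+1)×(m+1) sum-of-powers matrix and moment vector derived above and solves for the coefficients:

```python
import numpy as np

def polyfit_normal(x, y, m):
    """Fit an m-th order polynomial by solving the (m+1)x(m+1)
    normal equations derived above."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # M[j][k] = sum(x_i^(j+k)), rhs[j] = sum(x_i^j * y_i)
    M = np.array([[np.sum(x**(j + k)) for k in range(m + 1)]
                  for j in range(m + 1)])
    rhs = np.array([np.sum(x**j * y) for j in range(m + 1)])
    return np.linalg.solve(M, rhs)   # a0, a1, ..., am

# Data generated exactly by y = 1 + 2x + x^2 should be recovered exactly
a = polyfit_normal([0, 1, 2, 3], [1, 4, 9, 16], 2)
```

Note that this explicit normal-equation solve is fine for low orders; for high-order polynomials the sum-of-powers matrix becomes badly conditioned.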
• General case of an mth-order polynomial:
y = a0 + a1 x + a2 x² + … + am x^m + e
→ solve m+1 linear equations for the m+1 coefficients a0, a1, …, am.
• The standard error:
Sy/x = √( Sr / (n − (m+1)) )
It is divided by n − (m+1) because (m+1) data-derived coefficients, a0, a1, …, am, were used to compute Sr.
Goodness of Fit
• For a 2nd-order polynomial, the best fit minimizes
Sr = Σᵢ₌₁ⁿ ei² = Σᵢ₌₁ⁿ (yi − a0 − a1 xi − a2 xi²)²
• In general, for an mth-order polynomial, the sum of the squares of the residuals is
Sr = Σᵢ₌₁ⁿ ei² = Σᵢ₌₁ⁿ (yi − a0 − a1 xi − a2 xi² − … − am xi^m)²
• The standard error for fitting an mth-order polynomial to n data points is
Sy/x = √( Sr / (n − (m+1)) )
because the mth-order polynomial has (m+1) coefficients.
• The coefficient of determination is
r² = (St − Sr) / St, where St = Σᵢ₌₁ⁿ (yi − ȳ)²
Example 7.1 2nd–order Polynomial Fit
• Given: data (xi, yi) for n = 6
xi, 0 1 2 3 4 5
yi, 2.1 7.7 13.6 27.2 40.9 61.1
• Find: 2nd-order polynomial fit
• Solution:
• We need to tally the following sums for a 2nd-order polynomial:
Σ xi, Σ xi², Σ xi³, Σ xi⁴
Σ yi, Σ xi yi, Σ xi² yi
• For a 2nd-order polynomial, minimizing the sum of squared errors gives
n a0 + (Σ xi) a1 + (Σ xi²) a2 = Σ yi
(Σ xi) a0 + (Σ xi²) a1 + (Σ xi³) a2 = Σ xi yi
(Σ xi²) a0 + (Σ xi³) a1 + (Σ xi⁴) a2 = Σ xi² yi
where
n = 6, Σ xi = 15, Σ xi² = 55, Σ xi³ = 225, Σ xi⁴ = 979
Σ yi = 152.6, Σ xi yi = 585.6, Σ xi² yi = 2488.8
x̄ = 2.5, ȳ = 25.433
• The simultaneous linear equations are
[  6  15  55 ] [a0]   [  152.6 ]
[ 15  55 225 ] [a1] = [  585.6 ]
[ 55 225 979 ] [a2]   [ 2488.8 ]
• Use MATLAB to solve for the coefficients (note that r must be a column vector for the backslash solve):
>> N = [6 15 55; 15 55 225; 55 225 979];
>> r = [152.6; 585.6; 2488.8];
>> a = N\r
a =
    2.4786
    2.3593
    1.8607
• The least-squares quadratic equation:
y = 2.4786 + 2.3593x + 1.8607x²
• The standard error: Sy/x = √( Sr / (n − (m+1)) ) = √( 3.74657 / (6 − 3) ) = 1.1175
• The coefficient of determination:
r² = (2513.39 − 3.74657)/2513.39 = 0.99851
• The correlation coefficient: r = 0.99925
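Example 7.1 can be reproduced outside MATLAB as well. This NumPy sketch (an added illustration, not part of the slides) builds the same 3×3 system from the data and recomputes the goodness-of-fit statistics:

```python
import numpy as np

# Data from Example 7.1
x = np.array([0, 1, 2, 3, 4, 5], dtype=float)
y = np.array([2.1, 7.7, 13.6, 27.2, 40.9, 61.1])

# Normal equations for the quadratic fit (same 3x3 system as above)
N = np.array([[len(x),       x.sum(),       (x**2).sum()],
              [x.sum(),      (x**2).sum(),  (x**3).sum()],
              [(x**2).sum(), (x**3).sum(),  (x**4).sum()]])
r = np.array([y.sum(), (x*y).sum(), (x**2 * y).sum()])
a = np.linalg.solve(N, r)                # [a0, a1, a2]

# Goodness-of-fit statistics
yhat = a[0] + a[1]*x + a[2]*x**2
Sr  = np.sum((y - yhat)**2)              # residual sum of squares
St  = np.sum((y - y.mean())**2)          # total sum of squares
syx = np.sqrt(Sr / (len(x) - 3))         # standard error, n - (m+1)
r2  = (St - Sr) / St                     # coefficient of determination
```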
Quadratic Curve-Fitting [Ansari & Dichone, 2018]
• Consider a quadratic function,
f(x) = C0 + C1 x + C2 x²
where C0, C1 & C2 are coefficients to be determined.
• Note that Ansari & Dichone [2018] used a special case where C1 is zero.
This is not a good practice because it is not the most general form.
• For C1 = 0, the sum of the squares of the deviations is
S = [y1 − (C0 + C2 x1²)]² + [y2 − (C0 + C2 x2²)]² + … + [yn − (C0 + C2 xn²)]².
• For minimum S, ∂S/∂C0 = ∂S/∂C2 = 0, i.e.,
∂S/∂C0 = −2 Σᵢ₌₁ⁿ (yi − C0 − C2 xi²) = 0 → n C0 + (Σ xi²) C2 = Σ yi
∂S/∂C2 = −2 Σᵢ₌₁ⁿ (yi − C0 − C2 xi²) xi² = 0 → (Σ xi²) C0 + (Σ xi⁴) C2 = Σ xi² yi
• In matrix form,
[ n      Σ xi² ] [C0]   [ Σ yi     ]
[ Σ xi²  Σ xi⁴ ] [C2] = [ Σ xi² yi ]
Example 7.2 Another 2nd–order Polynomial Fit
• Given:
x 0.5 1 1.5 2 2.5 3 3.5 4 4.5 5
y 0.51 2.35 7.54 13.23 17.65 24.21 28.94 37.63 58.32 63.21
• Find: f(x) = C0 + C2 x²
• Solution: solve [ n Σxi²; Σxi² Σxi⁴ ][C0; C2] = [ Σyi; Σxi² yi ] in MATLAB:
x = 0.5:0.5:5.0;
y = [0.51 2.35 7.54 13.23 17.65 24.21 28.94 37.63 58.32 63.21];
n = size(x,2);
M = [n, sum(x.^2); sum(x.^2), sum(x.^4)];
A = [sum(y); sum(x.^2 .* y)];
C = M\A
f = C(1) + C(2)*x.^2;
p = plot(x, y, 'ro', x, f, 'b-');
title('Comparison: Fitted Function with Data')
xlabel('x')
ylabel('y')
legend('Given data', 'Fitted function')
p(1).LineWidth = 2;
p(2).LineWidth = 2;
M =
   1.0e+03 *
    0.0100    0.0963
    0.0963    1.5833
A =
   1.0e+03 *
    0.2536
    4.1184
C =
    0.7790
    2.5538
• The quadratic function,
f(x) = 0.7790 + 2.5538 x²
Linear Combination of Functions
Combining Functions
• For a given data set, (x1, y1), (x2, y2), …, (xn, yn), we can find a linear
combination of functions,
f(x) = C1 f1(x) + C2 f2(x) + … + Cm fm(x).
• Note that n = total number of data points & m = total number of functions.
• The function value at xi is
f(xi) = Σⱼ₌₁ᵐ Cj fj(xi)
• The sum of the squares of the deviations from the true (measured) values yi is
S = [y1 − f(x1)]² + [y2 − f(x2)]² + … + [yn − f(xn)]².
• For minimum S,
∂S/∂C1 = ∂S/∂C2 = … = ∂S/∂Cm = 0
with ∂²S/∂C1² > 0, ∂²S/∂C2² > 0, …, ∂²S/∂Cm² > 0.
• In matrix form,
M C = L
where
C = [C1; C2; …; Cm],
L = [ Σᵢ₌₁ⁿ f1(xi) yi ; Σᵢ₌₁ⁿ f2(xi) yi ; … ; Σᵢ₌₁ⁿ fm(xi) yi ]
and M is the m×m matrix whose (j, k) entry is
Mjk = Σᵢ₌₁ⁿ fj(xi) fk(xi)
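The M and L construction above is mechanical for any set of basis functions. A NumPy sketch (an added illustration; the slides work in MATLAB) of the general recipe:

```python
import numpy as np

def lincomb_fit(funcs, x, y):
    """Least-squares coefficients for f(x) = sum_j C_j f_j(x).
    Builds M[j][k] = sum_i f_j(x_i) f_k(x_i) and
    L[j] = sum_i f_j(x_i) y_i, then solves M C = L."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    V = np.array([f(x) for f in funcs])   # m x n matrix of f_j(x_i)
    M = V @ V.T
    L = V @ y
    return np.linalg.solve(M, L)

# Basis {1, x^3}; data generated exactly by y = 2 + 0.5 x^3 is recovered
C = lincomb_fit([lambda t: np.ones_like(t), lambda t: t**3],
                [1, 2, 3, 4], [2.5, 6.0, 15.5, 34.0])
```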
Example 7.3 Linear Combination of C1 + C2 x3
• Given: x 1 2 3 4 5
y 5.75 10.75 12.65 29.95 49.35
• Find: y(x) = C1 f1(x) + C2 f2(x) = C1 + C2 x³
• Solution:
• We see that n = 5 and m = 2; hence M C = L becomes
[ Σ f1(xi)f1(xi)   Σ f1(xi)f2(xi) ] [C1]   [ Σ f1(xi) yi ]
[ Σ f2(xi)f1(xi)   Σ f2(xi)f2(xi) ] [C2] = [ Σ f2(xi) yi ]
x = 1:1:5;
y = [5.75 10.75 12.65 29.95 49.35];
n = size(x,2);
f1 = ones(1,n);
f2 = x.^3;
M = [sum(f1.^2), sum(f1.*f2); sum(f2.*f1), sum(f2.^2)];
L = [sum(f1.*y); sum(f2.*y)];
C = M\L;
C1 = C(1)
C2 = C(2)
x1 = 1: 0.1: 5;
f = C1 + C2*x1.^3;
M =
       5     225
     225   20515
L =
   1.0e+03 *
    0.1084
    8.5189
C=
5.9309
0.3502
C1 =
5.9309
C2 =
0.3502
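As a check on Example 7.3, the same 2×2 system can be rebuilt and solved in NumPy (an added illustration, not from the slides); note that M(1,1) = Σ f1² = n = 5:

```python
import numpy as np

# Data and basis from Example 7.3: f1 = 1, f2 = x^3
x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([5.75, 10.75, 12.65, 29.95, 49.35])
f1 = np.ones_like(x)
f2 = x**3

M = np.array([[np.sum(f1*f1), np.sum(f1*f2)],
              [np.sum(f2*f1), np.sum(f2*f2)]])   # [[5, 225], [225, 20515]]
L = np.array([np.sum(f1*y), np.sum(f2*y)])       # [108.45, 8518.85]
C = np.linalg.solve(M, L)
```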
Multiple Linear Regression
• Another useful extension of linear regression is the case where y is a linear function of 2 or more independent variables:
y = a0 + a1 x1 + a2 x2 + … + am xm
• The best fit is obtained by minimizing the sum of the squares of the estimate residuals:
Sr = Σᵢ₌₁ⁿ ei² = Σᵢ₌₁ⁿ (yi − a0 − a1 x1,i − a2 x2,i − … − am xm,i)²
• Setting the partial derivatives to zero (for m = 2), we have
[ n         Σ x1,i       Σ x2,i      ] [a0]   [ Σ yi      ]
[ Σ x1,i    Σ x1,i²      Σ x1,i x2,i ] [a1] = [ Σ x1,i yi ]
[ Σ x2,i    Σ x1,i x2,i  Σ x2,i²     ] [a2]   [ Σ x2,i yi ]
Example 7.4 Multiple Linear Regression
• Given: n = 6 data created from y = 5 + 4 x1 – 3 x2
x1 0 2 2.5 1 4 7
x2 0 1 2 3 6 2
y 5 10 9 0 3 27
• Find: linear regression
• Solution:
∂Sr/∂a0 = −2 Σ (yi − a0 − a1 x1,i − a2 x2,i)
∂Sr/∂a1 = −2 Σ x1,i (yi − a0 − a1 x1,i − a2 x2,i)
∂Sr/∂a2 = −2 Σ x2,i (yi − a0 − a1 x1,i − a2 x2,i)
• Minimizing the sum of the errors by setting the partial derivatives to zero:
[ n         Σ x1,i       Σ x2,i      ] [a0]   [ Σ yi      ]
[ Σ x1,i    Σ x1,i²      Σ x1,i x2,i ] [a1] = [ Σ x1,i yi ]
[ Σ x2,i    Σ x1,i x2,i  Σ x2,i²     ] [a2]   [ Σ x2,i yi ]
• The simultaneous linear equations are
[  6    16.5   14 ] [a0]   [  54   ]
[ 16.5  76.25  48 ] [a1] = [ 243.5 ]
[ 14    48     54 ] [a2]   [ 100   ]
• a0 = 5, a1 = 4, a2 = −3, recovering the generating function exactly.
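Because the data were generated exactly from y = 5 + 4x1 − 3x2, any least-squares solver must recover those coefficients. A NumPy sketch (an added illustration) using a design matrix instead of the explicit normal equations:

```python
import numpy as np

# Data from Example 7.4, generated from y = 5 + 4*x1 - 3*x2
x1 = np.array([0, 2, 2.5, 1, 4, 7])
x2 = np.array([0, 1, 2, 3, 6, 2])
y  = np.array([5, 10, 9, 0, 3, 27], dtype=float)

# Design matrix with columns [1, x1, x2]; lstsq solves the
# same normal equations as the 3x3 system above
X = np.column_stack([np.ones_like(y), x1, x2])
a, *_ = np.linalg.lstsq(X, y, rcond=None)    # [a0, a1, a2]
```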
Polynomial Curve Fitting
Curve-fitting with polynomials
• If an mth-order polynomial looks like a good fit, we can still use a linear combination of known functions,
f1(x) = 1, f2(x) = x, f3(x) = x², …, fm+1(x) = x^m
with
f(x) = C1 f1(x) + C2 f2(x) + … + Cm+1 fm+1(x).
• In matrix form,
M C = L
where
C = [C1; C2; …; Cm+1],
L = [ Σᵢ₌₁ⁿ yi ; Σᵢ₌₁ⁿ xi yi ; … ; Σᵢ₌₁ⁿ xi^m yi ]
and
    [ n       Σ xi    …  Σ xi^m     ]
M = [ Σ xi    Σ xi²   …  Σ xi^(m+1) ]
    [ :       :          :          ]
    [ Σ xi^m  …       …  Σ xi^(2m)  ]
Example 7.5 2nd–order Polynomial
• Given: x 0 1 2 3 4 5
y 0 8.47 17.48 19.57 14.69 11.23
• Find: y(x) = C1 + C2 x + C3 x²
• Solution:
• We see that n = 6 and m = 2; hence M C = L is a 3×3 system for C1, C2, C3.
x = 0:1:5;
y = [0 8.47 17.48 19.57 14.69 11.23];
n = size(x,2);
M = [n sum(x) sum(x.^2); sum(x) sum(x.^2) sum(x.^3); sum(x.^2) sum(x.^3) sum(x.^4)];
L = [sum(y); sum(x.*y); sum(x.^2.*y)];
C = M\L; C1 = C(1), C2 = C(2), C3 = C(3)
x1 = 0: 0.1: 5;
f = C1 + C2*x1 + C3*x1.^2;
M =
     6    15    55
    15    55   225
    55   225   979
L =
   71.4400
  217.0500
  770.3100
C =
   -0.4439
   12.4838
   -2.0573
C1 =
   -0.4439
C2 =
   12.4838
C3 =
   -2.0573
MATLAB Non-Linear Fit Functions
MATLAB polyfit
Example 7.6 MATLAB 2nd–order Polynomial
• Given: Example 7.2
• Find: polynomial curve-fit using MATLAB function
• Solution:
x = 0.5: 0.5: 5.0;
y = [0.51 2.35 7.54 13.23 17.65 24.21 28.94 37.63 58.32 63.21];
C = polyfit(x, y, 2)
yfit = polyval(C, x);        % MATLAB fit evaluated at the data points
f = 0.7790 + 2.5538*x.^2;    % fit from Example 7.2, for comparison
p = plot(x, y, 'ro', x, yfit, 'b-', x, f, 'g--');
title('Comparing Second Order Polynomial Curve Fit')
xlabel('x')
ylabel('y')
legend('Given data', 'Fit from E.g. 7.2', 'MATLAB Fit')
p(1).LineWidth = 2;
p(2).LineWidth = 2;
p(3).LineWidth = 2;
C =
    2.4930    0.3517    0.3963
• That is, since polyfit returns the highest power first,
y(x) = 2.4930x² + 0.3517x + 0.3963
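NumPy's polyfit follows the same convention as MATLAB's (highest power first), so Example 7.6 can be cross-checked in one call (an added illustration, not from the slides):

```python
import numpy as np

# Same data as Example 7.6 (from Example 7.2)
x = np.arange(0.5, 5.01, 0.5)
y = np.array([0.51, 2.35, 7.54, 13.23, 17.65, 24.21,
              28.94, 37.63, 58.32, 63.21])

# np.polyfit, like MATLAB polyfit, returns [a2, a1, a0]
C = np.polyfit(x, y, 2)
```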
MATLAB's lsqcurvefit Function
• MATLAB has a built-in function for deducing the coefficients C1, C2, … in a model F(C, x), for example a linear combination of known functions f1(x), f2(x), …, in the form C1 f1(x) + C2 f2(x) + ….
• The function is
lsqcurvefit(F, x0, x, y)
where:
F is the anonymous function (the model)
x0 is the vector of initial points (guesses) for C1, C2, … (e.g., zeros)
x is the vector containing the x values
y is the vector containing the y values
• The output is a vector containing C1, C2, …
Example 7.7 MATLAB lsqcurvefit
• Given: x 1 2 3 4 5
y 5.75 10.75 12.65 29.95 49.35
y(x) = C1 f1(x) + C2 f2(x), where f1(x) = 1 and f2(x) = x3
• Find: coefficients using MATLAB function lsqcurvefit
• Solution:
x = 1: 1: 5;
y = [5.75 10.75 12.65 29.95 49.35];
n = size (x,2);
% The curve-fitting basis functions from E.g. 7.3
f1 = ones(1, n);
f2 = x.^3;
M = [sum(f1.^2), sum(f1.*f2); sum(f2.*f1), sum(f2.^2)];
L = [sum(f1.*y); sum(f2.*y)];
C = M\L;
C1 = C(1)
C2 = C(2)
x1 = 1:0.1:5;
f = C1 + C2*x1.^3;
% Using MATLAB’s nonlinear regression
F = @(c,x) c(1) + c(2)*x.^3;
x0 = [0, 0];
c = lsqcurvefit(F, x0, x, y);
f1 = F(c, x1);
• Plot the results
p = plot(x, y, 'ro', x1, f, 'b-', x1, f1, 'g--');
title('Comparing Results')
xlabel('x')
ylabel('y')
legend('Given data', 'Fit from E.g. 7.3', 'MATLAB Fit')
p(1).LineWidth = 2;
p(2).LineWidth = 2;
p(3).LineWidth = 2;
c =
    5.9309    0.3502
Example 7.8 MATLAB Least Squares Exponential Fit
u = lsqcurvefit(F, x0, x, y)
x1 = 0.3:0.1:4;
f = F(u, x1);
• In short,
f(x) = exp(2.5654 − 0.7882x + 0.0364x²)
Example 7.9 MATLAB Least Squares ln Fit
• Given: (0, 4), (1, 5), (2, 5.5), (3, 5.8), (4, 6.2), (5, 6.5)
• Find: best fit of the form a ln(x+b) + c with initial guess [2,1,5]
• Solution:
x = 0:1:5;
y = [4 5 5.5 5.8 6.2 6.5];
F = @(c,x) c(1)*log(x + c(2)) + c(3);
x0 = [2, 1, 5];
c = lsqcurvefit(F, x0, x, y)
c =
    1.3879    1.0435    3.9538
• The function is
f(x) = 1.3879 ln(x + 1.0435) + 3.9538
Example 7.10 Least Squares ln Fit via MATLAB
• Given: x 1 2 3 4 5 6
y 5 6.5 7.25 7.80 8.25 8.63
• Find: fit of the form a ln(x) + b
• Solution:
x = 1:1:6;
y = [5 6.5 7.25 7.80 8.25 8.63];
F = @(c,x) c(1)*log(x) + c(2);
x0 = [1, 1];    % initial guess (not specified on the original slide)
c = lsqcurvefit(F, x0, x, y)
c =
    2.0048    5.0400
• The function is
f(x) = 2.0048 ln(x) + 5.0400
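Unlike Example 7.9, the model a ln(x) + b is linear in its parameters, so a nonlinear solver is not strictly needed; ordinary linear least squares gives the same answer. A NumPy check (an added illustration):

```python
import numpy as np

# Data from Example 7.10; a*ln(x) + b is linear in (a, b)
x = np.arange(1, 7, dtype=float)
y = np.array([5, 6.5, 7.25, 7.80, 8.25, 8.63])

# Design matrix with columns [ln(x), 1]
X = np.column_stack([np.log(x), np.ones_like(x)])
(a, b), *_ = np.linalg.lstsq(X, y, rcond=None)
```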
Example 7.11 MATLAB Least Squares axb+c Fit
• Given:
x 0 1 2 3 4 5 6 7
y 5 9 50 192 517 1123 2121 3635
• Find: fit of the form axb + c using [3, 4, 10] as the initial guess
• Solution:
x = 0:1:7;
y = [5 9 50 192 517 1123 2121 3635];
F = @(c,x) c(1)*x.^c(2) + c(3);
x0 = [3, 4, 10];
c = lsqcurvefit(F, x0, x, y)
x1 = 0:.1:7;
f = F(c,x1);
c =
    3.9977    3.5003    4.9612
• The function is
f(x) = 3.9977 x^3.5003 + 4.9612
Example 7.12 MATLAB Least Squares aebx+c Fit
• Given:
x 0 1 2 3 4 5 6 7 8
y 21 24.892 31.31 41.89 59.334 88.095 135.513 213.693 342.5889
• Find: fit of the form a exp(bx) + c using [3, 2, 10] as the initial guess
• Solution:
x = 0:1:8;
y = [21 24.892 31.31 41.89 59.334 88.095 135.513 213.693 342.5889];
F = @(c,x) c(1)*exp(c(2)*x) + c(3);
x0 = [3, 2, 10];
c = lsqcurvefit(F, x0, x, y)
x1 = 0:.1:8;
f = F(c, x1);
c =
    6.0000    0.5000   14.9999
• The function is
f(x) = 6.0000 exp(0.5000x) + 14.9999
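The data in Example 7.12 appear to have been generated from the fitted function itself, which can be verified directly (an added NumPy check, not from the slides):

```python
import numpy as np

# Data from Example 7.12; check that the reported fit
# 6*exp(0.5x) + 15 reproduces the given y values
x = np.arange(0, 9, dtype=float)
y = np.array([21, 24.892, 31.31, 41.89, 59.334, 88.095,
              135.513, 213.693, 342.5889])

f = 6.0*np.exp(0.5*x) + 15.0
resid = np.max(np.abs(f - y))   # worst-case mismatch with the data
```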
Tutorial Problems
Students are encouraged to complete or redo all the examples. The following problems/examples are
of particular interest and importance.
• Problem 7.1. Apply 1st-order and 2nd-order polynomial fits to the digits of your student ID, where 1 … 9 are the digits in increasing order.
x 1 2 3 4 5 6 7 8 9
y 1 2 3 4 5 6 7 8 9
• Problem 7.2. Wind load on wind turbine fatigue. The number (millions) of cycles that a wind turbine
can withstand, y, is a function of the wind speed, x [m/s].
x 13.28 13.30 13.33 13.37 13.50 14.00 15.00 16.67 18.33 19.00 20.33 21.67 23.33 26.67
y 13.0 12.2 11.3 10.1 8.1 6.2 4.2 2.3 1.1 0.8 0.5 0.4 0.3 0.2
Find the appropriate curve fit using MATLAB.
• Problem 7.3. Brinell hardness number, BHN, for steel as a function of temperature is:
T [K] 480 590 700 810 920
BHN 541 503 447 376 309
Use a 2nd-order polynomial to fit the data and estimate BHN at 650 K.
Do it by hand and by MATLAB.
• “When you rise in the morning, give thanks for the light, for your life,
for your strength. Give thanks for your food and for the joy of living. If
you see no reason to give thanks, the fault lies in yourself.”
– Chief Tecumseh.