Physics 114: Lecture 17 Least Squares Fit To Polynomial: Dale E. Gary
For a straight-line fit $y(x) = a + bx$, the chi-square to be minimized is

$$\chi^2 = \sum_i \frac{1}{\sigma_i^2}\,(y_i - a - b x_i)^2.$$
To minimize any function, you know that you should take the derivative and set it to zero. But take the derivative with respect to what? We want to find the constants $a$ and $b$ that minimize $\chi^2$, so we will form two equations:
$$\frac{\partial \chi^2}{\partial a} = \frac{\partial}{\partial a}\sum_i \frac{1}{\sigma_i^2}(y_i - a - b x_i)^2 = -2\sum_i \frac{1}{\sigma_i^2}(y_i - a - b x_i) = 0,$$

$$\frac{\partial \chi^2}{\partial b} = \frac{\partial}{\partial b}\sum_i \frac{1}{\sigma_i^2}(y_i - a - b x_i)^2 = -2\sum_i \frac{x_i}{\sigma_i^2}(y_i - a - b x_i) = 0.$$
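Solving these two normal equations simultaneously gives closed-form expressions for $a$ and $b$. Here is a minimal sketch in Python (rather than the lecture's MATLAB; the function name `weighted_line_fit` is illustrative):

```python
def weighted_line_fit(x, y, sig):
    """Solve the two normal equations above for y(x) = a + b*x.

    Each point is weighted by w_i = 1/sigma_i^2. Returns (a, b).
    """
    w = [1.0 / s**2 for s in sig]
    S   = sum(w)                                    # sum of 1/sigma^2
    Sx  = sum(wi * xi for wi, xi in zip(w, x))      # sum of x/sigma^2
    Sy  = sum(wi * yi for wi, yi in zip(w, y))      # sum of y/sigma^2
    Sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    Sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    delta = S * Sxx - Sx * Sx                       # the 2x2 determinant
    a = (Sxx * Sy - Sx * Sxy) / delta
    b = (S * Sxy - Sx * Sy) / delta
    return a, b

# Noise-free data on y = 1 + 2x recovers a = 1, b = 2:
a, b = weighted_line_fit([0, 1, 2, 3], [1, 3, 5, 7], [1, 1, 1, 1])
```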
The same approach extends to a second-degree polynomial $y(x) = a + b x + c x^2$, whose chi-square is

$$\chi^2 = \sum_i \frac{1}{\sigma_i^2}\,(y_i - a - b x_i - c x_i^2)^2.$$

Setting the three partial derivatives to zero gives

$$\frac{\partial \chi^2}{\partial a} = -2\sum_i \frac{1}{\sigma_i^2}(y_i - a - b x_i - c x_i^2) = 0,$$

$$\frac{\partial \chi^2}{\partial b} = -2\sum_i \frac{x_i}{\sigma_i^2}(y_i - a - b x_i - c x_i^2) = 0,$$

$$\frac{\partial \chi^2}{\partial c} = -2\sum_i \frac{x_i^2}{\sigma_i^2}(y_i - a - b x_i - c x_i^2) = 0.$$
Second-Degree Polynomial
These three equations, linear in $a$, $b$, and $c$, can be solved by the method of determinants (Cramer's rule):

$$a = \frac{1}{\Delta}
\begin{vmatrix}
\sum \frac{y_i}{\sigma_i^2} & \sum \frac{x_i}{\sigma_i^2} & \sum \frac{x_i^2}{\sigma_i^2} \\
\sum \frac{x_i y_i}{\sigma_i^2} & \sum \frac{x_i^2}{\sigma_i^2} & \sum \frac{x_i^3}{\sigma_i^2} \\
\sum \frac{x_i^2 y_i}{\sigma_i^2} & \sum \frac{x_i^3}{\sigma_i^2} & \sum \frac{x_i^4}{\sigma_i^2}
\end{vmatrix}, \qquad
b = \frac{1}{\Delta}
\begin{vmatrix}
\sum \frac{1}{\sigma_i^2} & \sum \frac{y_i}{\sigma_i^2} & \sum \frac{x_i^2}{\sigma_i^2} \\
\sum \frac{x_i}{\sigma_i^2} & \sum \frac{x_i y_i}{\sigma_i^2} & \sum \frac{x_i^3}{\sigma_i^2} \\
\sum \frac{x_i^2}{\sigma_i^2} & \sum \frac{x_i^2 y_i}{\sigma_i^2} & \sum \frac{x_i^4}{\sigma_i^2}
\end{vmatrix},$$

$$c = \frac{1}{\Delta}
\begin{vmatrix}
\sum \frac{1}{\sigma_i^2} & \sum \frac{x_i}{\sigma_i^2} & \sum \frac{y_i}{\sigma_i^2} \\
\sum \frac{x_i}{\sigma_i^2} & \sum \frac{x_i^2}{\sigma_i^2} & \sum \frac{x_i y_i}{\sigma_i^2} \\
\sum \frac{x_i^2}{\sigma_i^2} & \sum \frac{x_i^3}{\sigma_i^2} & \sum \frac{x_i^2 y_i}{\sigma_i^2}
\end{vmatrix},
\quad\text{where}\quad
\Delta =
\begin{vmatrix}
\sum \frac{1}{\sigma_i^2} & \sum \frac{x_i}{\sigma_i^2} & \sum \frac{x_i^2}{\sigma_i^2} \\
\sum \frac{x_i}{\sigma_i^2} & \sum \frac{x_i^2}{\sigma_i^2} & \sum \frac{x_i^3}{\sigma_i^2} \\
\sum \frac{x_i^2}{\sigma_i^2} & \sum \frac{x_i^3}{\sigma_i^2} & \sum \frac{x_i^4}{\sigma_i^2}
\end{vmatrix}.$$
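The determinant ratios above can be evaluated directly. Here is a hedged sketch in Python (rather than MATLAB; `det3` and `quadratic_fit` are illustrative names, not library functions) that builds the weighted moment sums and applies Cramer's rule:

```python
def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def quadratic_fit(x, y, sig):
    """Cramer's-rule solution for y = a + b*x + c*x^2. Returns (a, b, c)."""
    w = [1.0 / s**2 for s in sig]
    # Moment sums S[k] = sum(w * x^k) and T[k] = sum(w * x^k * y)
    S = [sum(wi * xi**k for wi, xi in zip(w, x)) for k in range(5)]
    T = [sum(wi * xi**k * yi for wi, xi, yi in zip(w, x, y)) for k in range(3)]
    D = [[S[0], S[1], S[2]],
         [S[1], S[2], S[3]],
         [S[2], S[3], S[4]]]
    delta = det3(D)
    coeffs = []
    for col in range(3):
        M = [row[:] for row in D]
        for r in range(3):
            M[r][col] = T[r]          # replace one column with the y-moments
        coeffs.append(det3(M) / delta)
    return tuple(coeffs)

# Noise-free parabola y = -2 + 3x + 1.5x^2 is recovered exactly:
a, b, c = quadratic_fit([-2, -1, 0, 1, 2], [-2.0, -3.5, -2.0, 2.5, 10.0], [1] * 5)
```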
MATLAB Example: 2nd-Degree Polynomial Fit
First, create a set of points that follow a second-degree polynomial, with some random errors, and plot them:

x = -3:0.1:3;
y = randn(1,61)*2 - 2 + 3*x + 1.5*x.^2;
plot(x,y,'.')
p = polyfit(x,y,2)   % prints p = 1.5174  3.0145  -2.5130
hold on
plot(x,polyval(p,x),'r')

[Figure: data points with the polyfit curve y(x) overlaid; the fitted polynomial is close to the true y = 1.5x^2 + 3x - 2.]
The residuals appear flat and random, which is good. Check the standard deviation of the residuals:

resid = y - polyval(p,x);
figure
plot(x,resid,'.')
std(resid)   % prints ans = 1.9475

[Figure: residuals vs. x]
Now fit the same data with a third-degree polynomial, and look at the standard deviation of the new residuals:

p2 = polyfit(x,y,3);
hold off
plot(x,polyval(p2,x),'.')
resid2 = y - polyval(p2,x);
std(resid2)   % prints ans = 1.9312

The fit looks the same, but there is a subtle difference due to the use of an additional parameter. Is this a better fit? The residuals are slightly smaller, BUT check the chi-square:

chisq1 = sum((resid/std(resid)).^2)     % prints 60.00
chisq2 = sum((resid2/std(resid2)).^2)   % prints 60.00

They look identical, but now consider the reduced chi-square:

sum((resid/std(resid)).^2)/58.    % prints 1.0345
sum((resid2/std(resid2)).^2)/57.  % prints 1.0526

=> the 2nd-order fit is preferred.
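To make the comparison concrete, here is a small Python sketch (`reduced_chisq` is an illustrative name) that mirrors the MATLAB computation: the residuals are scaled by their own standard deviation, and the chi-square is divided by the degrees of freedom, N - (n + 1) for an nth-degree polynomial:

```python
def reduced_chisq(resid, n_fit_params):
    """Reduced chi-square of a fit: chi^2 / (N - number of fitted parameters).

    Mirrors the MATLAB snippet above: residuals are scaled by their own
    sample standard deviation (N-1 normalization, as MATLAB's std uses),
    so chi^2 itself comes out near N - 1.
    """
    n = len(resid)
    mean = sum(resid) / n
    var = sum((r - mean) ** 2 for r in resid) / (n - 1)
    chisq = sum(r * r / var for r in resid)
    return chisq / (n - n_fit_params)
```

With the lecture's 61 points, chi-square is about 60 for both fits, and dividing by 58 (quadratic, 3 parameters) versus 57 (cubic, 4 parameters) reproduces the 1.0345 vs. 1.0526 comparison.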
Some problems that are nonlinear in the parameters can be linearized. Consider the exponential function

$$y(x) = a e^{bx},$$

where $a$ and $b$ are the unknown parameters. Rather than consider $a$ and $b$ directly, we can take the natural logarithm of both sides and consider instead the function

$$\ln y = \ln a + b x.$$
This is linear in the parameters $\ln a$ and $b$, so the chi-square to minimize is

$$\chi^2 = \sum_i \frac{1}{\sigma_i'^2}\,\big(\ln y_i - \ln a - b x_i\big)^2,$$

where the uncertainty in $\ln y_i$ follows from the usual error propagation equation:

$$\sigma_i' = \frac{\partial(\ln y_i)}{\partial y_i}\,\sigma_i = \frac{\sigma_i}{y_i}.$$
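Putting the linearization and the error scaling together, here is a minimal Python sketch (`fit_exponential` is an illustrative name; it reuses the closed-form weighted straight-line solution derived earlier):

```python
import math

def fit_exponential(x, y, sig):
    """Fit y = a*exp(b*x) by linearizing: ln y = ln a + b*x.

    Error propagation gives sigma'_i = sig_i / y_i for the log-data.
    Assumes all y_i > 0. Returns (a, b).
    """
    ly = [math.log(yi) for yi in y]
    lsig = [s / yi for s, yi in zip(sig, y)]       # propagated errors
    w = [1.0 / s**2 for s in lsig]                 # weights 1/sigma'^2
    S   = sum(w)
    Sx  = sum(wi * xi for wi, xi in zip(w, x))
    Sy  = sum(wi * li for wi, li in zip(w, ly))
    Sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    Sxy = sum(wi * xi * li for wi, xi, li in zip(w, x, ly))
    delta = S * Sxx - Sx * Sx
    ln_a = (Sxx * Sy - Sx * Sxy) / delta
    b = (S * Sxy - Sx * Sy) / delta
    return math.exp(ln_a), b

# Noise-free data recovers the lecture's true values a = 0.5, b = -0.75:
x = list(range(1, 11))
y = [0.5 * math.exp(-0.75 * xi) for xi in x]
sig = [0.03 * math.sqrt(yi) for yi in y]
a, b = fit_exponential(x, y, sig)
```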
MATLAB Example: Linearizing an Exponential
First, create a set of points that follow the exponential, with some random errors, and plot them:

x = 1:10;
y = 0.5*exp(-0.75*x);
sig = 0.03*sqrt(y);   % errors proportional to sqrt(y)
dev = sig.*randn(1,10);
errorbar(x,y+dev,sig)

[Figure: y vs. x with error bars]

Next, take the logarithm of the data, propagate the errors, and do a weighted straight-line fit:

logy = log(y+dev);
plot(x,logy,'.')
logsig = sig./y;
errorbar(x, logy, logsig)
p = glmfit(x,logy,'normal','weights',logsig);
p = circshift(p,1);   % swap order of parameters
hold on
plot(x,polyval(p,x),'r')

[Figure: ln(y) vs. x with error bars and the fitted line]

Finally, overplot the fit on the original data:

hold off
errorbar(x,y+dev,sig)
hold on
plot(x,exp(polyval(p,x)),'r')
Summary
Use polyfit() for polynomial fitting, with third parameter giving the
degree of the polynomial. Remember that higher-degree
polynomials use up more degrees of freedom (an nth degree
polynomial takes away n + 1 DOF).
A polynomial fit is still considered linear least-squares fitting, despite
its dependence on powers of the independent variable, because it is
linear in the coefficients (parameters).
For some problems, such as exponentials, $y(x) = a e^{bx}$, one can linearize the problem. Another type that can be linearized is a power-law expression, $y(x) = a x^b$.
When linearizing, the measurement errors must be converted using the usual error propagation equation, e.g.

$$\sigma_i' = \frac{\partial(\ln y_i)}{\partial y_i}\,\sigma_i = \frac{\sigma_i}{y_i}.$$
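As a sketch of the power-law case (`fit_power_law` is an illustrative name; unweighted for simplicity), taking logarithms of both $x$ and $y$ turns $y = a x^b$ into the straight line $\ln y = \ln a + b \ln x$:

```python
import math

def fit_power_law(x, y):
    """Fit y = a * x**b by linearizing: ln y = ln a + b * ln x.

    Unweighted ordinary least squares; assumes all x_i, y_i > 0.
    Returns (a, b).
    """
    lx = [math.log(xi) for xi in x]
    ly = [math.log(yi) for yi in y]
    n = len(x)
    mx = sum(lx) / n
    my = sum(ly) / n
    # Slope and intercept of the least-squares line in log-log space
    b = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    ln_a = my - b * mx
    return math.exp(ln_a), b

# Exact power-law data recovers a = 2, b = 1.5:
xs = [1, 2, 4, 8]
a, b = fit_power_law(xs, [2 * xi**1.5 for xi in xs])
```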
Apr 12, 2010