Gauss and the Method of Least Squares
Outline
Johann Carl Friedrich Gauss
Facts about Gauss
Attended Brunswick College in 1792, where he discovered many important theorems before even reaching them in his studies.
Found a square root in two different ways to fifty decimal places by ingenious expansions and interpolations.
Constructed a regular 17-sided polygon, the first advance in this matter in two millennia. He was only 18 when he made the discovery.
Ideas of Gauss
Intellectual Personality and Controversy
Those who knew Gauss best found him to be cold and uncommunicative.
He published only half of his ideas and found no one with whom to share his most valued thoughts.
Formal Arrival of Least Squares
• Gauss
• Published ‘The Theory of the Motion of Heavenly Bodies’ in 1809. He gave a probabilistic justification of the method, which was based on the assumption of a normal distribution of errors. Gauss himself later abandoned the use of the normal error function.
• Published ‘Theory of the Combination of Observations Least Subject to Errors’ in the 1820s. He substituted the root mean square error for Laplace’s mean absolute error.
Treatment of Errors
Random error
Error Assumptions
All errors within these limits are possible, but not necessarily
with equal likelihood
Density Function
We define the function $\varphi(x)$ with the same meaning as a density function, with the following properties:
– Positive and negative errors of the same magnitude are equally likely: $\varphi(x) = \varphi(-x)$.
Mean and Variance
Define $k = \int x\,\varphi(x)\,dx$. In many cases we assume $k = 0$.
Define the mean square error as
$m^2 = \int x^2\,\varphi(x)\,dx$
If $k = 0$ then the variance will equal $m^2$.
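As a quick numerical check of these definitions (an illustration, not part of the original slides), the sketch below computes $k$ and $m^2$ for a hypothetical error density, the uniform density on $[-1, 1]$:

```python
import numpy as np

# Numerical check of k = ∫ x φ(x) dx and m² = ∫ x² φ(x) dx for an
# illustrative error density: the uniform density φ(x) = 1/2 on [-1, 1].
# This density is symmetric, so k should be 0, and the variance then
# equals m² (here 1/3).
x = np.linspace(-1.0, 1.0, 200001)
phi = np.full_like(x, 0.5)       # uniform density on [-1, 1]

k = np.trapz(x * phi, x)         # mean error
m2 = np.trapz(x**2 * phi, x)     # mean square error

print(k)    # ≈ 0
print(m2)   # ≈ 1/3
```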
Reasons for $m^2$
$m^2$ is always positive and is simple.
More on Variance
If $k \neq 0$ then the variance equals $m^2 - k^2$.
Suppose we have independent random variables $\{e, e', e'', \ldots\}$ with standard deviation 1 and expected value 0.
The linear function of the total errors is given by
$E = \lambda e + \lambda' e' + \lambda'' e'' + \cdots = \sum_{i=1}^{k} \lambda_i e_i$
Now the variance of $E$ is given as
$\operatorname{Var}(E) = \sum_{i=1}^{k} \lambda_i^2$
This holds because every error has unit standard deviation.
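The identity $\operatorname{Var}(E) = \sum_i \lambda_i^2$ can be checked by simulation; the weights below are made-up illustrative values:

```python
import numpy as np

# Monte Carlo check that Var(λe + λ'e' + λ''e'') = Σ λ_i² when the e_i
# are independent with mean 0 and standard deviation 1.
# The weights lam are arbitrary values chosen for illustration.
rng = np.random.default_rng(0)
lam = np.array([0.5, -1.0, 2.0])

e = rng.standard_normal((1_000_000, lam.size))  # unit-variance errors
E = e @ lam                                     # one draw of E per row

print(E.var())         # ≈ 0.25 + 1 + 4 = 5.25
print((lam**2).sum())  # 5.25
```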
Gauss’ Derivation of the Method of Least Squares
Problem:
We want to estimate $V, V', V'', \ldots$ by taking independent observations $L, L', L'', \ldots$, where $V, V', V'', \ldots$ are functions of unknowns $x, y, z, \ldots$:
$V = f_1(x, y, z, \ldots)$
$V' = f_2(x, y, z, \ldots)$
$V'' = f_3(x, y, z, \ldots)$
Let the errors in the observations be
$v := \dfrac{V - L}{p}, \quad v' := \dfrac{V' - L'}{p'}, \ldots$
where the $p$'s are the weights of the ‘mean errors of the observations’.
(Note: we scaled the errors so they have the same variance.)
Solve an optimization problem:
$\min\; \alpha^2 + \alpha'^2 + \alpha''^2 + \cdots$
where $\alpha, \alpha', \alpha'', \ldots$ are the coefficients of $v, v', v'', \ldots$
s.t. $\alpha v + \alpha' v' + \alpha'' v'' + \text{etc.} = x + k$
for some constant $k$ independent of $x, y, z, \ldots$.
We can state the problem as:
We are looking for a linear mapping $G(v, v', v'', \ldots)$ from $\mathbb{R}^n$ to $\mathbb{R}^p$ such that:
1. $G \circ F$ is the identity on $\mathbb{R}^p$.
2. $G$ satisfies an optimality condition, described below:
Suppose $x = g(v, v', v'', \ldots)$ is the first component of $G$. Then
$x = g(v, v', v'', \ldots) = \alpha v + \alpha' v' + \alpha'' v'' + \cdots + k$.
It can be proved that the minimizing coefficients are exactly those produced by the method of least squares.
Gauss’ derivation by modern matrix notation:
Assume that observable quantities $V_1, V_2, \ldots, V_n$ are linear functions of parameters $x_1, x_2, \ldots, x_p$ such that
$V_i = b_{i1} x_1 + \cdots + b_{ip} x_p + c_i, \quad b_{ij}, c_i \in \mathbb{R}$
We know the values of all the $b_{ij}$ and $c_i$.
We measure the $V_i$ in an attempt to infer the values of the $x_i$.
Assume $L_i$ is an observation of $V_i$.
Switch to a new coordinate system by setting
$v_i = (V_i - L_i)/p_i$
The system becomes
$v = Ax + l$
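Minimizing $\sum_i v_i^2 = \lVert Ax + l \rVert^2$ is an ordinary least-squares problem; a minimal sketch, with made-up numbers for $A$ and $l$ (not values from the slides):

```python
import numpy as np

# Sketch: solve the linearized system v = A x + l in the least-squares
# sense, i.e. minimize ||A x + l||², which is the same as A x ≈ -l.
# A and l are made-up numbers for illustration only.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
l = np.array([-1.1, -1.9, -3.2, -3.8])

x_hat, *_ = np.linalg.lstsq(A, -l, rcond=None)
print(x_hat)  # least-squares estimates of the two unknowns
```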
In a linear model
$y = A\beta + \varepsilon$
where $A$ is an $n \times p$ matrix with rank $p$, $\beta$ is an unknown vector, and $\varepsilon$ is the error vector. If $E(\varepsilon) = 0$ and $\operatorname{Var}(\varepsilon) = \sigma^2 I$, then for any linear unbiased estimator $\tilde{\theta}$ of $\theta = C^T \beta$, we have $E(C^T \hat{\beta}_{LS}) = \theta$ and $\operatorname{Var}(C^T \hat{\beta}_{LS}) \le \operatorname{Var}(\tilde{\theta})$.
In other words, when the errors have the same variance and are uncorrelated, the least-squares estimator is the best linear unbiased estimator, the one with the smallest variance.
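A small simulation illustrating this conclusion; the constant-mean model and the alternative weighting below are made-up examples, not from the slides:

```python
import numpy as np

# Gauss–Markov illustration: in y_i = β + ε_i with uncorrelated,
# unit-variance errors, the least-squares estimate of β is the sample
# mean. Any other set of weights summing to 1 also gives an unbiased
# estimator, but with larger variance. β, n, and the weights are
# arbitrary illustrative choices.
rng = np.random.default_rng(1)
beta, n, reps = 3.0, 5, 200_000

y = beta + rng.standard_normal((reps, n))

ls = y.mean(axis=1)                        # least-squares estimator
w = np.array([0.5, 0.2, 0.1, 0.1, 0.1])    # alternative unbiased weights
alt = y @ w

print(ls.var())   # ≈ 1/5 = 0.20
print(alt.var())  # ≈ Σ w_i² = 0.32
```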
Limitation of the Method of Least Squares
Nothing is perfect.
References
Gauss, Carl Friedrich, translated by G. W. Stewart. 1995. Theory of the Combination of Observations Least Subject to Errors: Part One, Part Two, Supplement. Philadelphia: Society for Industrial and Applied Mathematics.
Plackett, R. L. 1949. A Historical Note on the Method of Least Squares. Biometrika 36:458–460.
Stigler, Stephen M. 1981. Gauss and the Invention of Least Squares. The Annals of Statistics 9(3):465–474.
Plackett, Robin L. 1972. The Discovery of the Method of Least Squares.
Brand, Belinda B. 2003. Gauss’ Method of Least Squares: A Historically-Based Introduction.
http://www.infoplease.com/ce6/people/A0820346.html
http://www.stetson.edu/~efriedma/periodictable/html/Ga.html