$$
\begin{aligned}
D(\beta^{*}) &= B\,D(y)\,B' \\
&= \sigma^{2}\bigl\{(X'X)^{-1}X' + G\bigr\}\bigl\{X(X'X)^{-1} + G'\bigr\} \\
&= \sigma^{2}(X'X)^{-1} + \sigma^{2}GG' \\
&= D(\hat{\beta}) + \sigma^{2}GG',
\end{aligned}
\tag{1}
$$

where the cross-product terms vanish because the unbiasedness condition implies $GX = 0$.
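Since $GG'$ is nonnegative definite, the decomposition in (1) implies that the dispersion of any unbiased linear estimator $\beta^{*} = By$ exceeds that of $\hat{\beta}$ by a nonnegative definite matrix. As an illustrative numerical check of (1), the following NumPy snippet builds a matrix $G$ satisfying $GX = 0$ and verifies the identity; it is a minimal sketch in which the dimensions, the seed, and the construction of $G$ are assumptions made for the example, not taken from the text.

```python
import numpy as np

# Numerical check of (1): with B = (X'X)^{-1}X' + G and GX = 0,
# D(beta*) = sigma^2 B B' equals D(beta-hat) + sigma^2 G G'.
# Dimensions, seed, and the choice of G are arbitrary illustrations.
rng = np.random.default_rng(0)
n, k, sigma2 = 50, 3, 2.0

X = rng.standard_normal((n, k))
XtX_inv = np.linalg.inv(X.T @ X)
P = X @ XtX_inv @ X.T                              # projector onto the column space of X
G = rng.standard_normal((k, n)) @ (np.eye(n) - P)  # annihilates X, so GX = 0

B = XtX_inv @ X.T + G                              # weights of an arbitrary unbiased estimator
D_star = sigma2 * B @ B.T                          # D(beta*) = B D(y) B', with D(y) = sigma^2 I
D_hat = sigma2 * XtX_inv                           # D(beta-hat)

assert np.allclose(G @ X, 0)
assert np.allclose(D_star, D_hat + sigma2 * G @ G.T)
print(np.linalg.eigvalsh(D_star - D_hat).min())    # nonnegative, up to rounding error
```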
The tactic of taking arbitrary linear combinations of the elements of $\hat{\beta}$ serves to avoid the difficulty inherent in the fact that $\hat{\beta}$ is a vector quantity, for which there is no uniquely defined measure of dispersion. An alternative approach, which is not much favoured, is to use the generalised variance that is provided by the determinant of the dispersion matrix $D(\hat{\beta})$. A version of the Gauss–Markov Theorem that uses this measure can also be proved.
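In support of that claim, here is a one-step argument, standard determinant algebra rather than a quotation from the text, showing that the generalised variance of $\beta^{*}$ can never fall below that of $\hat{\beta}$. Writing $A = D(\hat{\beta})$, which is positive definite when $X$ has full column rank, and $C = \sigma^{2}GG'$, the decomposition (1) gives

$$
\det D(\beta^{*}) = \det(A + C) = \det A \cdot \det\bigl(I + A^{-1/2}CA^{-1/2}\bigr) \ge \det A = \det D(\hat{\beta}),
$$

since $A^{-1/2}CA^{-1/2}$ is nonnegative definite, so that every eigenvalue of $I + A^{-1/2}CA^{-1/2}$ is at least unity.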
It is worthwhile to consider an alternative statement and proof of the theorem, which likewise concerns the variance of an arbitrary linear combination of the elements of $\hat{\beta}$.
The Gauss–Markov Theorem. (An Alternative Statement). Let $q'y$ be a linear estimator of the scalar function $p'\beta$ of the regression parameters in the model $(y; X\beta, \sigma^{2}I)$. Then $q'y$ is an unbiased estimator, such that $E(q'y) = q'E(y) = q'X\beta = p'\beta$ for all $\beta$, if and only if $q'X = p'$. Moreover, $q'y$ has the minimum variance in the class of all unbiased linear estimators if and only if $q'y = p'\hat{\beta}$, where $\hat{\beta} = (X'X)^{-1}X'y$ is the ordinary least-squares estimator.
To prove the second assertion, consider the problem:

$$
\text{Minimise}\quad V(q'y) = q'D(y)q = \sigma^{2}q'q
\quad\text{subject to}\quad E(q'y) = p'\beta \;\text{or, equivalently,}\; X'q = p.
\tag{4}
$$

This constrained minimisation can be accomplished via the Lagrangean function

$$
L(q, \lambda) = \sigma^{2}q'q - 2\lambda'(X'q - p).
\tag{5}
$$
Differentiating with respect to $q$ and setting the result to zero gives, after some rearrangement, the condition

$$
q' = \lambda'X'.
\tag{6}
$$
Postmultiplying (6) by $X(X'X)^{-1}X'$ shows that

$$
q'X(X'X)^{-1}X' = \lambda'X'X(X'X)^{-1}X' = \lambda'X' = q'.
\tag{7}
$$

Together with the unbiasedness condition $q'X = p'$, this gives

$$
q'y = q'X(X'X)^{-1}X'y = p'(X'X)^{-1}X'y = p'\hat{\beta},
\tag{8}
$$

which is the desired result.
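To see the minimisation at work numerically, the brief sketch below, in which the dimensions, the seed, and the perturbation scheme are assumptions made for the example, confirms that $q = X(X'X)^{-1}p$ satisfies the constraint $X'q = p$ and that no other weight vector satisfying it achieves a smaller value of $q'q$:

```python
import numpy as np

# Check of (4)-(8): q = X(X'X)^{-1} p minimises q'q subject to X'q = p.
# Dimensions, seed, and the perturbations are arbitrary illustrations.
rng = np.random.default_rng(1)
n, k = 30, 4
X = rng.standard_normal((n, k))
p = rng.standard_normal(k)

q_opt = X @ np.linalg.solve(X.T @ X, p)        # optimal weights, so that q'y = p'beta-hat
assert np.allclose(X.T @ q_opt, p)             # the unbiasedness constraint holds

# Every q with X'q = p is q_opt plus a vector in the null space of X'.
P = X @ np.linalg.inv(X.T @ X) @ X.T
for _ in range(5):
    q = q_opt + (np.eye(n) - P) @ rng.standard_normal(n)
    assert np.allclose(X.T @ q, p)             # still unbiased
    assert q @ q >= q_opt @ q_opt - 1e-12      # variance sigma^2 q'q is never smaller
```

The perturbations exhaust the admissible directions, since the constraint set is the affine space $q_{\mathrm{opt}} + \mathrm{null}(X')$, which is why the comparison suffices.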