Cramer-Rao Lower Bound: 4.1 Estimator Accuracy
• As the curvature increases, 𝜎² decreases. Since we already know that the estimator 𝐴̂ has variance 𝜎², then, for this example,

$$\operatorname{var}(\hat{A}) = \frac{1}{-\dfrac{\partial^2 \ln p(x[0]; A)}{\partial A^2}}$$
• In general the second derivative depends on the data 𝑥[𝑛] itself. Thus, a more appropriate measure of curvature is the *average* curvature of the log-likelihood:

$$\text{Measure of curvature} \;\sim\; -E\left[\frac{\partial^2 \ln p(x[n]; \theta)}{\partial \theta^2}\right]$$
Theorem (CRLB, scalar parameter): Assuming the pdf 𝑝(𝐱; 𝜃) satisfies the usual regularity conditions, the variance of any unbiased estimator 𝜃̂ must satisfy

$$\operatorname{var}(\hat{\theta}) \;\ge\; \frac{1}{\underbrace{-E\left[\dfrac{\partial^2 \ln p(\mathbf{x};\theta)}{\partial \theta^2}\right]}_{I(\theta)}}$$
where the derivative is evaluated at the true value of 𝜃 and the expectation is taken with respect to 𝑝(𝐱; 𝜃). Furthermore, an unbiased estimator that attains the bound for all 𝜃 may be found if and only if

$$\frac{\partial \ln p(\mathbf{x};\theta)}{\partial \theta} = I(\theta)\bigl(g(\mathbf{x})-\theta\bigr)$$

for some functions 𝐼 and 𝑔. That estimator, which is the MVU estimator, is 𝜃̂ = 𝑔(𝐱), and the minimum variance is 1/𝐼(𝜃), where 𝐼(𝜃) is the Fisher information,

$$I(\theta) = -E\left[\frac{\partial^2 \ln p(\mathbf{x};\theta)}{\partial \theta^2}\right].$$
Example: Consider the previous example, where 𝑥[0] = 𝐴 + 𝑤[0] with 𝑤[0] ∼ 𝒩(0, 𝜎²). Then

$$\operatorname{var}(\hat{A}) \ge \sigma^2,$$

so there cannot exist any unbiased estimator whose variance is lower than 𝜎². With 𝜃 = 𝐴,

$$\frac{\partial \ln p(x[0]; A)}{\partial A} = \frac{1}{\sigma^2}\bigl(x[0]-A\bigr),$$

which has exactly the form 𝐼(𝜃)(𝑔(𝐱) − 𝜃) with

$$I(\theta) = \frac{1}{\sigma^2}, \qquad g(x[0]) = x[0].$$
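These identities can be spot-checked numerically: the variance of the score ∂ ln 𝑝/∂𝐴 should approach 𝐼(𝐴) = 1/𝜎², and the estimator 𝐴̂ = 𝑥[0] should attain the bound var(𝐴̂) = 𝜎². A minimal Monte Carlo sketch (the values of 𝐴, 𝜎, and the trial count are arbitrary choices, not from the text):

```python
import random
import statistics

random.seed(0)
A_true, sigma = 2.0, 1.5        # arbitrary true value and noise std
trials = 200_000

scores, estimates = [], []
for _ in range(trials):
    x0 = A_true + random.gauss(0.0, sigma)     # x[0] = A + w[0]
    scores.append((x0 - A_true) / sigma**2)    # score: d ln p(x[0]; A) / dA
    estimates.append(x0)                       # the estimator A_hat = g(x[0]) = x[0]

I_A = statistics.variance(scores)              # should approach I(A) = 1/sigma^2
var_A_hat = statistics.variance(estimates)     # should approach sigma^2 (the CRLB)
print(f"I(A): {I_A:.4f} (theory {1/sigma**2:.4f}); "
      f"var(A_hat): {var_A_hat:.4f} (theory {sigma**2:.4f})")
```

The score has zero mean, so its sample variance estimates 𝐸[(∂ ln 𝑝/∂𝐴)²], which equals the Fisher information here.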
Example: Now consider 𝑁 observations

$$x[n] = A + w[n], \qquad n = 0, \ldots, N-1,$$

where 𝑤[𝑛] is WGN with variance 𝜎². The pdf is

$$p(\mathbf{x}; A) = \frac{1}{(2\pi\sigma^2)^{N/2}} \exp\left[-\frac{1}{2\sigma^2}\sum_{n=0}^{N-1}\bigl(x[n]-A\bigr)^2\right]$$

Therefore,

$$\ln p(\mathbf{x}; A) = -\frac{N}{2}\ln(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{n=0}^{N-1}\bigl(x[n]-A\bigr)^2$$
Checking the attainment condition

$$\frac{\partial \ln p(\mathbf{x};\theta)}{\partial \theta} = I(\theta)\bigl(g(\mathbf{x})-\theta\bigr):$$

$$\frac{\partial \ln p(\mathbf{x}; A)}{\partial A} = \frac{1}{\sigma^2}\sum_{n=0}^{N-1}\bigl(x[n]-A\bigr) = \frac{N}{\sigma^2}\Biggl(\underbrace{\frac{1}{N}\sum_{n=0}^{N-1} x[n]}_{\bar{x}} - A\Biggr) = \frac{N}{\sigma^2}(\bar{x}-A)$$
where 𝑥̅ is the sample mean. Differentiating again yields,
$$\frac{\partial^2 \ln p(\mathbf{x}; A)}{\partial A^2} = -\frac{N}{\sigma^2}$$

The second derivative is a constant, so taking the expectation leaves it unchanged, and

$$\operatorname{var}(\hat{A}) \ge \frac{1}{-E\left[\dfrac{\partial^2 \ln p(\mathbf{x}; A)}{\partial A^2}\right]} = \frac{\sigma^2}{N}$$

Since ∂ ln 𝑝(𝐱; 𝐴)/∂𝐴 = (𝑁/𝜎²)(𝑥̅ − 𝐴) has the required form 𝐼(𝐴)(𝑔(𝐱) − 𝐴), the sample mean 𝐴̂ = 𝑥̅ attains this bound and is the MVU estimator.
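The sample mean therefore attains the CRLB exactly. A short simulation (with arbitrarily chosen 𝐴, 𝜎, and 𝑁) illustrates that var(𝑥̅) matches 𝜎²/𝑁:

```python
import random
import statistics

random.seed(1)
A_true, sigma, N = 1.0, 2.0, 25     # arbitrary parameter choices
trials = 50_000

sample_means = []
for _ in range(trials):
    x = [A_true + random.gauss(0.0, sigma) for _ in range(N)]  # x[n] = A + w[n]
    sample_means.append(sum(x) / N)                            # A_hat = x_bar

crlb = sigma**2 / N
var_mean = statistics.variance(sample_means)
print(f"var(x_bar) = {var_mean:.4f},  CRLB sigma^2/N = {crlb:.4f}")
```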
Example: Consider estimating the phase 𝜙 of a sinusoid in WGN,

$$x[n] = A\cos(2\pi f_0 n + \phi) + w[n], \qquad n = 0, \ldots, N-1.$$

The amplitude 𝐴 and the frequency 𝑓₀ are assumed to be known. The pdf is

$$p(\mathbf{x}; \phi) = \frac{1}{(2\pi\sigma^2)^{N/2}} \exp\left\{-\frac{1}{2\sigma^2}\sum_{n=0}^{N-1}\bigl[x[n]-A\cos(2\pi f_0 n + \phi)\bigr]^2\right\}$$
The log-likelihood function is

$$\ln p(\mathbf{x}; \phi) = -\frac{N}{2}\ln(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{n=0}^{N-1}\bigl[x[n]-A\cos(2\pi f_0 n + \phi)\bigr]^2$$

Differentiating with respect to 𝜙,

$$\frac{\partial \ln p(\mathbf{x}; \phi)}{\partial \phi} = -\frac{A}{\sigma^2}\sum_{n=0}^{N-1}\left[x[n]\sin(2\pi f_0 n + \phi) - \frac{A}{2}\sin(4\pi f_0 n + 2\phi)\right]$$
Differentiating again and taking the negative expectation (using 𝐸[𝑥[𝑛]] = 𝐴 cos(2𝜋𝑓₀𝑛 + 𝜙)) gives

$$I(\phi) = \frac{A^2}{\sigma^2}\sum_{n=0}^{N-1}\sin^2(2\pi f_0 n + \phi) \approx \frac{NA^2}{2\sigma^2},$$

where the approximation holds because the double-frequency terms sum to approximately zero when 𝑓₀ is not near 0 or 1/2. Therefore,

$$\operatorname{var}(\hat{\phi}) \ge \frac{2\sigma^2}{NA^2} = \frac{2}{N\,\mathrm{SNR}},$$

where SNR = 𝐴²/𝜎².
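The bound rests on approximating the exact Fisher information, 𝐼(𝜙) = (𝐴²/𝜎²) Σ sin²(2𝜋𝑓₀𝑛 + 𝜙), by 𝑁𝐴²/(2𝜎²), which is valid when 𝑓₀ is not near 0 or 1/2. A quick check of how close the two are (all parameter values here are arbitrary examples):

```python
import math

# Arbitrary example values: amplitude, noise std, record length, frequency, phase
A, sigma, N, f0, phi = 1.0, 1.0, 100, 0.123, 0.3

# Exact Fisher information for the phase
I_exact = (A**2 / sigma**2) * sum(
    math.sin(2 * math.pi * f0 * n + phi) ** 2 for n in range(N)
)

# Large-N approximation: I(phi) ~ N A^2 / (2 sigma^2)
I_approx = N * A**2 / (2 * sigma**2)

crlb_exact = 1 / I_exact
crlb_approx = 2 * sigma**2 / (N * A**2)
print(f"exact CRLB: {crlb_exact:.6f}   approximate: {crlb_approx:.6f}")
```

For this 𝑓₀ and 𝑁 the two agree to within a couple of percent.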
An equivalent form of the CRLB follows from the identity −𝐸[∂² ln 𝑝/∂𝜃²] = 𝐸[(∂ ln 𝑝/∂𝜃)²], so that

$$\operatorname{var}(\hat{\theta}) \ge \frac{1}{E\left[\left(\dfrac{\partial \ln p(\mathbf{x};\theta)}{\partial \theta}\right)^2\right]} = \frac{1}{-E\left[\dfrac{\partial^2 \ln p(\mathbf{x};\theta)}{\partial \theta^2}\right]}$$
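This equivalence can be spot-checked by Monte Carlo for the Gaussian single-sample example, where the score is (𝑥 − 𝐴)/𝜎² and the second derivative is the constant −1/𝜎² (parameter values below are arbitrary):

```python
import random
import statistics

random.seed(2)
A, sigma = 0.5, 1.2             # arbitrary example values
trials = 200_000

# For x ~ N(A, sigma^2): score = (x - A)/sigma^2
sq_scores = []
for _ in range(trials):
    x = A + random.gauss(0.0, sigma)
    sq_scores.append(((x - A) / sigma**2) ** 2)

E_score_sq = statistics.fmean(sq_scores)   # estimates E[(d ln p / dA)^2]
neg_E_second = 1 / sigma**2                # -E[d^2 ln p / dA^2], constant here
print(f"E[score^2] = {E_score_sq:.4f},  -E[2nd deriv] = {neg_E_second:.4f}")
```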
The Fisher information 𝐼(𝜃) is
1. Nonnegative
2. Additive for independent observations → the CRLB for 𝑁 iid observations is 1/𝑁 times that for one observation.
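A tiny numeric illustration of the 1/𝑁 scaling, using the per-sample Fisher information from the DC-level example (the value of 𝜎² is an arbitrary choice):

```python
sigma2 = 2.0                 # arbitrary noise variance
I_one = 1 / sigma2           # Fisher information of a single observation (DC-level example)

# Information adds across independent observations, so the bound shrinks as 1/N
crlbs = {N: 1 / (N * I_one) for N in (1, 10, 100)}
for N, bound in crlbs.items():
    print(f"N={N:4d}  CRLB={bound:.4f}")
```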
For a deterministic signal with an unknown parameter observed in WGN, 𝑥[𝑛] = 𝑠[𝑛; 𝜃] + 𝑤[𝑛], the bound becomes

$$\operatorname{var}(\hat{\theta}) \ge \frac{\sigma^2}{\displaystyle\sum_{n=0}^{N-1}\left(\frac{\partial s[n;\theta]}{\partial \theta}\right)^2} = \mathrm{CRLB}$$
• Signals that change rapidly as the unknown parameter changes result in more
accurate estimators.
For example, for the frequency of the sinusoid 𝑠[𝑛; 𝑓₀] = 𝐴 cos(2𝜋𝑓₀𝑛 + 𝜙),

$$\operatorname{var}(\hat{f}_0) \ge \frac{\sigma^2}{A^2 \displaystyle\sum_{n=0}^{N-1}\bigl(2\pi n \sin(2\pi f_0 n + \phi)\bigr)^2}$$
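Comparing this with the phase bound 2𝜎²/(𝑁𝐴²) at the same parameters shows the effect of the 2𝜋𝑛 factor: because ∂𝑠/∂𝑓₀ grows with 𝑛, the frequency bound is orders of magnitude smaller for moderate 𝑁 (parameter values below are arbitrary examples):

```python
import math

A, sigma, N, f0, phi = 1.0, 1.0, 100, 0.123, 0.3   # arbitrary example values

# Frequency CRLB: sigma^2 / (A^2 * sum_n (2*pi*n*sin(2*pi*f0*n + phi))^2)
denom = A**2 * sum(
    (2 * math.pi * n * math.sin(2 * math.pi * f0 * n + phi)) ** 2 for n in range(N)
)
crlb_f0 = sigma**2 / denom

# Phase CRLB from the earlier example: 2*sigma^2 / (N*A^2)
crlb_phi = 2 * sigma**2 / (N * A**2)
print(f"CRLB(f0) = {crlb_f0:.3e},  CRLB(phi) = {crlb_phi:.3e}")
```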