
ECE 531 - Detection and Estimation Theory

Homework 3

4.6 (Correction Shu Wang) In this problem we only have a single component, so $\theta = [a_k, b_k]^T$.
According to Example 4.2, we have
$$C = \begin{bmatrix} \frac{2\sigma^2}{N} & 0 \\ 0 & \frac{2\sigma^2}{N} \end{bmatrix}$$

So $\hat{a}_k \sim \mathcal{N}\!\left(a_k, \frac{2\sigma^2}{N}\right)$ and $\hat{b}_k \sim \mathcal{N}\!\left(b_k, \frac{2\sigma^2}{N}\right)$. Also, $\hat{a}_k$ and $\hat{b}_k$ are independent.

Suppose $\hat{P} = \frac{\hat{a}_k^2 + \hat{b}_k^2}{2}$ and let $P = \frac{a_k^2 + b_k^2}{2}$. Then

$$
\begin{aligned}
E[\hat{P}] &= E\!\left[\frac{\hat{a}_k^2 + \hat{b}_k^2}{2}\right] = \frac{E[\hat{a}_k^2]}{2} + \frac{E[\hat{b}_k^2]}{2} \\
&= \frac{1}{2}\left(\mathrm{Var}(\hat{a}_k) + E^2[\hat{a}_k] + \mathrm{Var}(\hat{b}_k) + E^2[\hat{b}_k]\right) \\
&= \frac{1}{2}\left[\frac{2\sigma^2}{N} + a_k^2 + \frac{2\sigma^2}{N} + b_k^2\right] \\
&= \frac{2\sigma^2}{N} + \frac{a_k^2 + b_k^2}{2}
\end{aligned}
$$

So $E[\hat{P}] = \frac{2\sigma^2}{N} + P$, and hence $E^2[\hat{P}] = \left(\frac{2\sigma^2}{N} + P\right)^2$.

$$\mathrm{Var}(\hat{P}) = \mathrm{Var}\!\left(\frac{\hat{a}_k^2 + \hat{b}_k^2}{2}\right) = \frac{1}{4}\left[\mathrm{Var}(\hat{a}_k^2) + \mathrm{Var}(\hat{b}_k^2)\right]$$
According to the textbook (p. 38, Eq. 3.19), if $\xi \sim \mathcal{N}(\mu, \sigma^2)$, then
$$E[\xi^2] = \mu^2 + \sigma^2, \qquad E[\xi^4] = \mu^4 + 6\mu^2\sigma^2 + 3\sigma^4, \qquad \mathrm{Var}(\xi^2) = 4\mu^2\sigma^2 + 2\sigma^4$$
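
These identities are easy to spot-check numerically. The Python sketch below is only an illustrative check, with arbitrary values $\mu = 1.5$, $\sigma = 2$; it compares Monte Carlo sample moments against Eq. 3.19.

    import numpy as np

    # Monte Carlo check of the Gaussian moment identities in Eq. 3.19
    rng = np.random.default_rng(0)
    mu, sigma, M = 1.5, 2.0, 2_000_000
    xi = rng.normal(mu, sigma, M)

    print(np.mean(xi**2), mu**2 + sigma**2)                        # E[xi^2]
    print(np.mean(xi**4), mu**4 + 6*mu**2*sigma**2 + 3*sigma**4)   # E[xi^4]
    print(np.var(xi**2),  4*mu**2*sigma**2 + 2*sigma**4)           # Var(xi^2)
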
So $\mathrm{Var}(\hat{a}_k^2) = 4a_k^2\frac{2\sigma^2}{N} + 2\left(\frac{2\sigma^2}{N}\right)^2$ and $\mathrm{Var}(\hat{b}_k^2) = 4b_k^2\frac{2\sigma^2}{N} + 2\left(\frac{2\sigma^2}{N}\right)^2$. Then we have
$$\mathrm{Var}(\hat{P}) = (a_k^2 + b_k^2)\frac{2\sigma^2}{N} + \left(\frac{2\sigma^2}{N}\right)^2 = \frac{2\sigma^2}{N}\left[2P + \frac{2\sigma^2}{N}\right]$$

So
$$\frac{E^2[\hat{P}]}{\mathrm{Var}(\hat{P})} = \frac{\left(\frac{2\sigma^2}{N} + P\right)^2}{\frac{2\sigma^2}{N}\left[2P + \frac{2\sigma^2}{N}\right]} = 1 + \frac{P^2}{\frac{2\sigma^2}{N}\left[2P + \frac{2\sigma^2}{N}\right]}$$
If $a_k = b_k = 0$, then $P = 0$ and $\frac{E^2[\hat{P}]}{\mathrm{Var}(\hat{P})} = 1$. But if $P \gg \frac{2\sigma^2}{N}$, then $\frac{E^2[\hat{P}]}{\mathrm{Var}(\hat{P})} \approx \frac{P}{4\sigma^2/N} = \frac{PN}{4\sigma^2} \gg 1$, and the signal will be easily detected.
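
As a numerical sanity check, the Python sketch below draws $\hat{a}_k$ and $\hat{b}_k$ directly from the distributions stated above (the values $a_k = 1$, $b_k = 0.5$, $N = 64$, $\sigma^2 = 1$ are arbitrary choices) and compares the sample mean and variance of $\hat{P}$ with $\frac{2\sigma^2}{N} + P$ and $\frac{2\sigma^2}{N}\left[2P + \frac{2\sigma^2}{N}\right]$.

    import numpy as np

    # Monte Carlo check of E[P_hat] and Var(P_hat) for arbitrary a_k, b_k, N, sigma^2
    rng = np.random.default_rng(1)
    ak, bk, N, sigma2, M = 1.0, 0.5, 64, 1.0, 1_000_000
    var = 2 * sigma2 / N                        # variance of a_k_hat and of b_k_hat
    P = (ak**2 + bk**2) / 2                     # true power

    ak_hat = rng.normal(ak, np.sqrt(var), M)    # a_k_hat ~ N(a_k, 2*sigma^2/N)
    bk_hat = rng.normal(bk, np.sqrt(var), M)    # b_k_hat ~ N(b_k, 2*sigma^2/N), independent
    P_hat = (ak_hat**2 + bk_hat**2) / 2

    print(P_hat.mean(), var + P)                # should match 2*sigma^2/N + P
    print(P_hat.var(), var * (2 * P + var))     # should match (2*sigma^2/N)[2P + 2*sigma^2/N]
    print(P_hat.mean()**2 / P_hat.var())        # the ratio E^2[P_hat]/Var(P_hat)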

4.13 (Shu Wang) In practice we sometimes encounter the linear model $x = H\theta + w$, but with $H$ composed of random variables. Suppose we ignore this difference and use our usual estimator
$$\hat{\theta} = (H^T H)^{-1} H^T x$$
where we assume that the particular realization of $H$ is known to us. Show that if $H$ and $w$ are independent, the mean and covariance of $\hat{\theta}$ are
$$E(\hat{\theta}) = \theta, \qquad C_{\hat{\theta}} = \sigma^2 E_H\!\left[(H^T H)^{-1}\right]$$
where $E_H$ denotes the expectation with respect to the PDF of $H$. What happens if the independence assumption is not made?

$$
\begin{aligned}
E[\hat{\theta}] &= E[(H^T H)^{-1} H^T x] \\
&= E[(H^T H)^{-1} H^T (H\theta + w)] \\
&= E[(H^T H)^{-1} H^T H\theta] + E[(H^T H)^{-1} H^T w]
\end{aligned}
$$
Since $H$ and $w$ are independent and $w$ has zero mean, we have
$$E[\hat{\theta}] = E[\theta] + E[(H^T H)^{-1} H^T]\,E[w] = E[\theta] = \theta$$

$$
\begin{aligned}
C_{\hat{\theta}} &= E[(\hat{\theta} - \theta)(\hat{\theta} - \theta)^T] \\
&= E[((H^T H)^{-1} H^T x - \theta)((H^T H)^{-1} H^T x - \theta)^T] \\
&= E[((H^T H)^{-1} H^T x - (H^T H)^{-1} H^T H\theta)((H^T H)^{-1} H^T x - (H^T H)^{-1} H^T H\theta)^T] \\
&= E[((H^T H)^{-1} H^T (x - H\theta))((H^T H)^{-1} H^T (x - H\theta))^T] \\
&= E[((H^T H)^{-1} H^T w)((H^T H)^{-1} H^T w)^T] \\
&= E_{H,w}[(H^T H)^{-1} H^T w w^T H (H^T H)^{-1}] \\
&= E_H\!\left[(H^T H)^{-1} H^T E_w[w w^T] H (H^T H)^{-1}\right] \\
&= E_H[(H^T H)^{-1} H^T \sigma^2 I H (H^T H)^{-1}] \\
&= E_H[\sigma^2 (H^T H)^{-1}] \\
&= \sigma^2 E_H[(H^T H)^{-1}]
\end{aligned}
$$
where the inner expectation over $w$ may be taken separately because $H$ and $w$ are independent.
If $H$ and $w$ are not independent, then $E[\hat{\theta}]$ may not equal $\theta$, so $\hat{\theta}$ may be biased.
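
The result can be illustrated with a small simulation: drawing a fresh $H$ (independent of $w$) for each trial, the sample mean of $\hat{\theta}$ stays at $\theta$ and its sample covariance approaches $\sigma^2 E_H[(H^T H)^{-1}]$. The dimensions, $\sigma^2$, and $\theta$ below are arbitrary choices for this sketch.

    import numpy as np

    # Monte Carlo illustration of the random-H linear model result
    rng = np.random.default_rng(2)
    N, p, sigma2, M = 20, 2, 0.5, 50_000
    theta = np.array([1.0, -2.0])

    est = np.empty((M, p))
    EH_inv = np.zeros((p, p))                       # running average of (H^T H)^{-1}
    for m in range(M):
        H = rng.normal(size=(N, p))                 # realization of random H
        w = rng.normal(0.0, np.sqrt(sigma2), N)     # noise drawn independently of H
        x = H @ theta + w
        est[m] = np.linalg.solve(H.T @ H, H.T @ x)  # usual LS estimator for this realization
        EH_inv += np.linalg.inv(H.T @ H) / M

    print(est.mean(axis=0), theta)                  # sample mean -> theta (unbiased)
    print(np.cov(est.T))                            # sample covariance of theta_hat
    print(sigma2 * EH_inv)                          # sigma^2 * E_H[(H^T H)^{-1}]
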
5.3 (Luke Vercimak) The IID observations $x[n]$ for $n = 0, 1, \ldots, N-1$ have the exponential PDF
$$p(x[n]; \lambda) = \begin{cases} \lambda \exp(-\lambda x[n]) & x[n] > 0 \\ 0 & x[n] < 0 \end{cases}$$
Find a sufficient statistic for $\lambda$.
Since the observations are IID, the joint distribution is
$$
\begin{aligned}
p(x; \lambda) &= \lambda^N \exp\!\left[-\lambda \sum_{n=0}^{N-1} x[n]\right] \\
&= \left(\lambda^N \exp\!\left[-\lambda \sum_{n=0}^{N-1} x[n]\right]\right)(1) \\
&= \left(\lambda^N \exp\left[-\lambda T(x)\right]\right)(1) \\
&= g(T(x), \lambda)\, h(x)
\end{aligned}
$$
By the Neyman-Fisher Factorization theorem,
$$T(x) = \sum_{n=0}^{N-1} x[n]$$
is a sufficient statistic for $\lambda$.
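
Sufficiency can be illustrated numerically: because $h(x) = 1$ here, any two data sets with the same value of $T(x)$ have identical likelihood functions in $\lambda$. The short Python sketch below shows this for two arbitrary four-sample data sets with the same sum.

    import numpy as np

    lam = np.linspace(0.1, 5.0, 50)                 # grid of candidate lambda values

    def loglike(x, lam):
        # log of lambda^N * exp(-lambda * sum(x))
        return len(x) * np.log(lam) - lam * np.sum(x)

    x1 = np.array([0.2, 1.3, 0.5, 2.0])             # T(x1) = 4.0
    x2 = np.array([1.0, 1.0, 1.0, 1.0])             # T(x2) = 4.0
    print(np.allclose(loglike(x1, lam), loglike(x2, lam)))   # True: same likelihood curve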


5.9 (Luke Vercimak) Assume that $x[n]$ is the result of a Bernoulli trial (a coin toss) with
$$\Pr\{x[n] = 1\} = \theta, \qquad \Pr\{x[n] = 0\} = 1 - \theta$$
and that $N$ IID observations have been made. Assuming the Neyman-Fisher factorization theorem holds for discrete random variables, find a sufficient statistic for $\theta$. Then, assuming completeness, find the MVU estimator of $\theta$.

Let $p$ be the number of times $x[n] = 1$, i.e. $p = \sum_{n=0}^{N-1} x[n]$. Since each observation is IID,
$$
\begin{aligned}
\Pr[x] &= \prod_{n=0}^{N-1} \Pr[x[n]] \\
&= \theta^p (1-\theta)^{N-p} \\
&= \left(\frac{\theta}{1-\theta}\right)^p (1-\theta)^N \\
&= \left[\left(\frac{\theta}{1-\theta}\right)^{T(x)} (1-\theta)^N\right][1] \\
&= g(T(x), \theta)\, h(x)
\end{aligned}
$$


By the Neyman-Fisher Factorization theorem,
$$T(x) = p = \sum_{n=0}^{N-1} x[n]$$
is a sufficient statistic for $\theta$.


To get the MVU estimator, the RBLS theorem says that we need to show:
1. T (x) is complete. This is given in the problem statement.
2. T (x) is unbiased:

$$
\begin{aligned}
E[T(x)] &= E\!\left[\sum_{n=0}^{N-1} x[n]\right] = \sum_{n=0}^{N-1} E[x[n]] \\
&= \sum_{n=0}^{N-1} \left[\Pr(x[n]=1)\cdot 1 + \Pr(x[n]=0)\cdot 0\right] = \sum_{n=0}^{N-1} \left[\theta(1) + (1-\theta)(0)\right] \\
&= N\theta
\end{aligned}
$$
Therefore an unbiased estimator of $\theta$ is
$$\hat{\theta} = \frac{T(x)}{N} = \frac{1}{N}\sum_{n=0}^{N-1} x[n]$$

By the RBLS theorem, this is also the MVUE.
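
A quick Monte Carlo check (with arbitrary $\theta = 0.3$ and $N = 50$) confirms that this sample mean is unbiased and that its variance equals $\theta(1-\theta)/N$.

    import numpy as np

    # Monte Carlo check of the Bernoulli MVU estimator (sample mean)
    rng = np.random.default_rng(3)
    theta, N, M = 0.3, 50, 200_000
    x = rng.random((M, N)) < theta              # M realizations of N Bernoulli(theta) trials
    theta_hat = x.mean(axis=1)                  # the estimator (1/N) * sum x[n]

    print(theta_hat.mean(), theta)                      # unbiased: sample mean -> theta
    print(theta_hat.var(), theta * (1 - theta) / N)     # variance equals theta(1-theta)/N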
