
Statistical Inference

Test Set 1

1. Let X ~ P(λ), the Poisson distribution with mean λ. Find unbiased estimators of (i) λ³, (ii) e^{−λ} cos λ, (iii) sin λ. (iv) Show that there do not exist unbiased estimators of 1/λ and exp{−1/λ}.
2. Let X_1, X_2, …, X_n be a random sample from a N(µ, σ²) population. Find unbiased and consistent estimators of the signal-to-noise ratio µ/σ and of the quantile µ + bσ, where b is any given real number.
3. Let X_1, X_2, …, X_n be a random sample from a U(−θ, 2θ) population. Find an unbiased and consistent estimator of θ.
4. Let X_1, X_2 be a random sample from an exponential population with mean 1/λ. Let T_1 = (X_1 + X_2)/2 and T_2 = √(X_1 X_2). Show that T_1 is unbiased and T_2 is biased. Further, prove that MSE(T_2) ≤ Var(T_1).
5. Let T_1 and T_2 be unbiased estimators of θ with respective variances σ_1² and σ_2², and let cov(T_1, T_2) = σ_12 (all assumed to be known). Consider T = αT_1 + (1 − α)T_2, 0 ≤ α ≤ 1. Show that T is unbiased and find the value of α for which Var(T) is minimized.
6. Let X_1, X_2, …, X_n be a random sample from an Exp(µ, σ) population. Find the method of moments estimators (MMEs) of µ and σ.
7. Let X_1, X_2, …, X_n be a random sample from a Pareto population with density f_X(x) = βα^β/x^{β+1}, x > α, α > 0, β > 2. Find the method of moments estimators of α and β.
8. Let X_1, X_2, …, X_n be a random sample from a U(−θ, θ) population. Find the MME of θ.
9. Let X_1, X_2, …, X_n be a random sample from a lognormal population with density f_X(x) = (1/(σx√(2π))) exp{−(log_e x − µ)²/(2σ²)}, x > 0. Find the MMEs of µ and σ².
10. Let X_1, X_2, …, X_n be a random sample from a double exponential (µ, σ) population. Find the MMEs of µ and σ.
Hints and Solutions

1. (i) E{X(X − 1)(X − 2)} = λ³, so T(X) = X(X − 1)(X − 2) is unbiased for λ³.


(ii) For this we solve the estimating equation. Let T(X) be unbiased for e^{−λ} cos λ. Then
E T(X) = e^{−λ} cos λ for all λ > 0
⇒ Σ_{x=0}^∞ T(x) e^{−λ} λ^x/x! = e^{−λ} cos λ for all λ > 0
⇒ Σ_{x=0}^∞ T(x) λ^x/x! = 1 − λ²/2! + λ⁴/4! − ⋯ for all λ > 0.
As the two power series are identical on an open interval, equating coefficients of powers of λ on both sides gives
T(x) = 0 if x = 2m + 1,
T(x) = 1 if x = 4m,
T(x) = −1 if x = 4m + 2, m = 0, 1, 2, …
(iii) For this we again have to solve the estimating equation; here we use Euler's identity. Let U(X) be unbiased for sin λ. Then
Σ_{x=0}^∞ U(x) e^{−λ} λ^x/x! = sin λ for all λ > 0
⇒ Σ_{x=0}^∞ U(x) λ^x/x! = e^λ sin λ = e^λ (e^{iλ} − e^{−iλ})/(2i)
= (e^{(1+i)λ} − e^{(1−i)λ})/(2i)
= (1/(2i)) [Σ_{k=0}^∞ λ^k (1 + i)^k/k! − Σ_{k=0}^∞ λ^k (1 − i)^k/k!] for all λ > 0.
Applying De Moivre's theorem to the two terms inside the brackets, with 1 ± i = √2 (cos(±π/4) + i sin(±π/4)), we get
Σ_{x=0}^∞ U(x) λ^x/x! = (1/(2i)) Σ_{k=0}^∞ ((√2)^k λ^k/k!) [(cos(kπ/4) + i sin(kπ/4)) − (cos(−kπ/4) + i sin(−kπ/4))]
= Σ_{k=0}^∞ ((√2)^k λ^k/k!) sin(kπ/4) for all λ > 0.
Equating the coefficients of powers of λ on both sides gives

U(x) = (√2)^x sin(πx/4), x = 0, 1, 2, …
In Part (iv), we can show in a similar way that the estimating equations for 1/λ and exp{−1/λ} do not have any solutions.
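As a quick numerical sanity check (not part of the original solution), the three estimators above can be compared against their targets by simulation; the function names below are mine, and the Poisson sampler is a simple Knuth-style sketch:

```python
import math
import random

# Monte Carlo check of the unbiased estimators above (function names are mine).
def t_cube(x):      # unbiased for lambda^3
    return x * (x - 1) * (x - 2)

def t_cos(x):       # unbiased for e^{-lambda} * cos(lambda)
    if x % 2 == 1:
        return 0.0
    return 1.0 if x % 4 == 0 else -1.0

def u_sin(x):       # unbiased for sin(lambda)
    return math.sqrt(2.0) ** x * math.sin(math.pi * x / 4.0)

def poisson(lam, rng):
    # Knuth-style Poisson sampler; adequate for small lambda
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

rng = random.Random(0)
lam = 1.5
xs = [poisson(lam, rng) for _ in range(200_000)]
m_cube = sum(t_cube(x) for x in xs) / len(xs)
m_cos = sum(t_cos(x) for x in xs) / len(xs)
m_sin = sum(u_sin(x) for x in xs) / len(xs)
print(m_cube, lam ** 3)                     # target lambda^3 = 3.375
print(m_cos, math.exp(-lam) * math.cos(lam))
print(m_sin, math.sin(lam))
```

Each sample mean should land close to its target, illustrating unbiasedness despite the oscillating signs of T(x) and the growing factor (√2)^x in U(x).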
2. Let X̄ = (1/n) Σ_{i=1}^n X_i and S² = (1/(n − 1)) Σ_{i=1}^n (X_i − X̄)².
Then X̄ ~ N(µ, σ²/n) and W = (n − 1)S²/σ² ~ χ²_{n−1}. It can be seen that
E(W^{1/2}) = √2 Γ(n/2)/Γ((n − 1)/2) and E(W^{−1/2}) = Γ((n − 2)/2)/(√2 Γ((n − 1)/2)).
Using these, we get unbiased estimators of σ and 1/σ as
T_1 = √((n − 1)/2) · (Γ((n − 1)/2)/Γ(n/2)) · S and T_2 = √(2/(n − 1)) · (Γ((n − 1)/2)/Γ((n − 2)/2)) · (1/S),
respectively. As X̄ and S² are independently distributed, U_1 = X̄ T_2 is unbiased for µ/σ, and U_2 = X̄ + b T_1 is unbiased for µ + bσ. As X̄ and S² are consistent for µ and σ² respectively, U_1 and U_2 are also consistent for µ/σ and µ + bσ respectively.
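A simulation sketch of these estimators (the helper names c1, c2, u1, u2 are mine; the Gamma-function constants are evaluated via math.lgamma, assuming W = (n − 1)S²/σ² ~ χ²_{n−1} as above):

```python
import math
import random

# Simulation check that U1 estimates mu/sigma and U2 estimates mu + b*sigma.
def c1(n):   # T1 = c1(n) * S is unbiased for sigma
    return math.sqrt((n - 1) / 2.0) * math.exp(
        math.lgamma((n - 1) / 2.0) - math.lgamma(n / 2.0))

def c2(n):   # T2 = c2(n) / S is unbiased for 1/sigma
    return math.sqrt(2.0 / (n - 1)) * math.exp(
        math.lgamma((n - 1) / 2.0) - math.lgamma((n - 2) / 2.0))

rng = random.Random(1)
mu, sigma, b, n = 2.0, 3.0, 1.7, 10   # illustrative values
reps = 50_000
u1_sum = u2_sum = 0.0
for _ in range(reps):
    xs = [rng.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    s = math.sqrt(sum((x - xbar) ** 2 for x in xs) / (n - 1))
    u1_sum += xbar * c2(n) / s        # U1 = Xbar * T2
    u2_sum += xbar + b * c1(n) * s    # U2 = Xbar + b * T1
u1, u2 = u1_sum / reps, u2_sum / reps
print(u1, mu / sigma)
print(u2, mu + b * sigma)
```

Using log-Gamma differences rather than Γ ratios directly avoids overflow for larger n.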

3. As µ_1′ = 3θ/2, T = 2X̄/3 is unbiased for θ. T is also consistent for θ.

4. As E(X_i) = 1/λ, T_1 is unbiased. Also, X_1 and X_2 are independent. So
E(T_2) = E(√(X_1 X_2)) = {E(√X_1)}² = (√π/(2√λ))² = π/(4λ) ≠ 1/λ,
so T_2 is biased. Var(T_1) = 1/(2λ²). Further,
MSE(T_2) = E(√(X_1 X_2) − 1/λ)² = E(X_1 X_2) − (2/λ) E(√(X_1 X_2)) + 1/λ²
= 1/λ² − π/(2λ²) + 1/λ² = (2/λ²)(1 − π/4) ≤ 1/(2λ²) = Var(T_1).
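The bias of T_2 and the inequality MSE(T_2) ≤ Var(T_1) can be checked numerically; the following is a minimal sketch with an arbitrary λ:

```python
import math
import random

# Check that E(T2) = pi/(4*lambda) (so T2 is biased) and MSE(T2) <= Var(T1).
lam = 2.0                      # illustrative rate
rng = random.Random(2)
reps = 200_000
t2_sum = 0.0
for _ in range(reps):
    x1, x2 = rng.expovariate(lam), rng.expovariate(lam)
    t2_sum += math.sqrt(x1 * x2)
t2_mean = t2_sum / reps
mse_t2 = (2.0 / lam ** 2) * (1.0 - math.pi / 4.0)   # closed form above
var_t1 = 1.0 / (2.0 * lam ** 2)
print(t2_mean, math.pi / (4.0 * lam))   # mean of T2 is pi/(4*lam), not 1/lam
print(mse_t2, var_t1)
```

Since 2(1 − π/4) ≈ 0.429 < 0.5, the biased estimator T_2 in fact has smaller mean squared error than the unbiased T_1.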

5. The minimizing choice of α is obtained as α = (σ_2² − σ_12)/(σ_1² + σ_2² − 2σ_12).
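A small sketch comparing this closed form with a brute-force grid search over Var(T), for illustrative (made-up) values of σ_1², σ_2², σ_12:

```python
# Compare the closed-form minimizer alpha* with a grid search over Var(T).
# The variances/covariance values below are made up for illustration.
sig1sq, sig2sq, sig12 = 4.0, 9.0, 1.5

def var_t(a):
    # Var(alpha*T1 + (1-alpha)*T2)
    return a * a * sig1sq + (1 - a) ** 2 * sig2sq + 2 * a * (1 - a) * sig12

alpha_star = (sig2sq - sig12) / (sig1sq + sig2sq - 2 * sig12)
grid_best = min((i / 10_000 for i in range(10_001)), key=var_t)
print(alpha_star, grid_best)
```

For these values the closed form gives α* = 0.75, which the grid search reproduces.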

6. f(x) = (1/σ) exp{−(x − µ)/σ}, x > µ, σ > 0. µ_1′ = µ + σ, µ_2′ = (µ + σ)² + σ².
So µ = µ_1′ − √(µ_2′ − µ_1′²) and σ = √(µ_2′ − µ_1′²). The method of moments estimators of µ and σ are therefore given by
µ̂_MM = X̄ − √((1/n) Σ_{i=1}^n (X_i − X̄)²) and σ̂_MM = √((1/n) Σ_{i=1}^n (X_i − X̄)²).
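Trying these MMEs on a simulated shifted-exponential sample (the parameter values below are illustrative):

```python
import math
import random

# MMEs for the shifted exponential Exp(mu, sigma) on a simulated sample.
rng = random.Random(3)
mu, sigma, n = 5.0, 2.0, 100_000
xs = [mu + rng.expovariate(1.0 / sigma) for _ in range(n)]
xbar = sum(xs) / n
m2_central = sum((x - xbar) ** 2 for x in xs) / n   # (1/n) * sum (Xi - Xbar)^2
sigma_hat = math.sqrt(m2_central)
mu_hat = xbar - sigma_hat
print(mu_hat, sigma_hat)
```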
7. µ_1′ = βα/(β − 1), µ_2′ = βα²/(β − 2). So β = 1 + √(µ_2′/(µ_2′ − µ_1′²)) and α = µ_1′(β − 1)/β = µ_1′ √µ_2′/(√(µ_2′ − µ_1′²) + √µ_2′). The method of moments estimators of α and β are therefore given by
α̂_MM = X̄ √(Σ_{i=1}^n X_i²) / (√(Σ_{i=1}^n (X_i − X̄)²) + √(Σ_{i=1}^n X_i²)) and β̂_MM = 1 + √(Σ_{i=1}^n X_i² / Σ_{i=1}^n (X_i − X̄)²).
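A sketch applying these MMEs to a simulated Pareto sample, using inverse-CDF sampling X = α/U^{1/β} (the parameter values are mine):

```python
import math
import random

# Pareto MMEs on a simulated sample; X = alpha / U^{1/beta} by inverse CDF.
rng = random.Random(4)
alpha, beta, n = 2.0, 5.0, 200_000    # beta > 2 so both moments exist
xs = [alpha / (1.0 - rng.random()) ** (1.0 / beta) for _ in range(n)]
m1 = sum(xs) / n                      # sample first raw moment
m2 = sum(x * x for x in xs) / n       # sample second raw moment
beta_hat = 1.0 + math.sqrt(m2 / (m2 - m1 * m1))
alpha_hat = m1 * (beta_hat - 1.0) / beta_hat
print(alpha_hat, beta_hat)
```

Note the 1/n factors cancel in both ratios, so the sum forms in the displayed estimators and the sample-moment forms here agree.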

8. Since µ_1′ = 0, we consider µ_2′ = θ²/3. So θ̂_MM = √((3/n) Σ_{i=1}^n X_i²).
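A quick simulation check of θ̂_MM (the values below are illustrative):

```python
import math
import random

# MME for U(-theta, theta) via the second moment.
rng = random.Random(5)
theta, n = 3.0, 100_000
xs = [rng.uniform(-theta, theta) for _ in range(n)]
theta_hat = math.sqrt(3.0 * sum(x * x for x in xs) / n)
print(theta_hat)
```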
9. µ_1′ = e^{µ + σ²/2}, µ_2′ = e^{2µ + 2σ²}. So µ = log(µ_1′²/√µ_2′) and σ² = log(µ_2′/µ_1′²), and the method of moments estimators of µ and σ² are therefore given by
µ̂_MM = log(X̄² / √((1/n) Σ_{i=1}^n X_i²)) and σ̂²_MM = log(((1/n) Σ_{i=1}^n X_i²) / X̄²).
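Trying the lognormal MMEs on a simulated sample (the parameter values are mine):

```python
import math
import random

# Lognormal MMEs on a simulated sample: X = exp(Z), Z ~ N(mu, sigma2).
rng = random.Random(6)
mu, sigma2, n = 0.5, 0.25, 200_000
xs = [math.exp(rng.gauss(mu, math.sqrt(sigma2))) for _ in range(n)]
m1 = sum(xs) / n                  # sample first raw moment
m2 = sum(x * x for x in xs) / n   # sample second raw moment
mu_hat = math.log(m1 * m1 / math.sqrt(m2))
sigma2_hat = math.log(m2 / (m1 * m1))
print(mu_hat, sigma2_hat)
```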
10. f(x) = (1/(2σ)) exp{−|x − µ|/σ}, x ∈ ℝ, µ ∈ ℝ, σ > 0. µ_1′ = µ, µ_2′ = µ² + 2σ².
So µ = µ_1′ and σ = √((µ_2′ − µ_1′²)/2). The method of moments estimators of µ and σ are therefore given by
µ̂_MM = X̄ and σ̂_MM = √((1/(2n)) Σ_{i=1}^n (X_i − X̄)²).
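A sketch of the double exponential (Laplace) MMEs on a simulated sample, generating the variates as a scaled difference of two independent Exp(1) variables (the parameter values are illustrative):

```python
import math
import random

# Laplace (double exponential) MMEs; a difference of two independent
# Exp(1) variables scaled by sigma is Laplace with scale sigma.
rng = random.Random(7)
mu, sigma, n = 1.0, 2.0, 100_000
xs = [mu + sigma * (rng.expovariate(1.0) - rng.expovariate(1.0))
      for _ in range(n)]
xbar = sum(xs) / n
sigma_hat = math.sqrt(sum((x - xbar) ** 2 for x in xs) / (2 * n))
print(xbar, sigma_hat)
```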
