
20137 Advanced Statistics for Economic and Social Sciences (ESS-MS)

First partial exam


October 28th, 2010
(Time: 2 hours)

Problem 1 (13 points). Let $(X, Y)$ be a random vector having the joint density function
\[ f^{(X,Y)}(x, y) = K(\theta)\, I_{[\theta/2,\, 3\theta/2]}(x)\, I_{[1/2,\, 3/2]}(y), \qquad \theta > 1. \]

(1.1) Find the value of $K(\theta)$.

Solution: Setting $\iint f^{(X,Y)}(x, y)\, dx\, dy = 1$, one has
\[ 1 = \int_{1/2}^{3/2} dy \int_{\theta/2}^{3\theta/2} K(\theta)\, dx = K(\theta)\,\theta; \]
therefore, $K(\theta) = 1/\theta$.
(1.2) Obtain the marginal density function of $Z = XY$ (use $V = X$ as auxiliary variable).
Solution: Let us consider the transformation
\[ \begin{cases} Z = XY \\ V = X \end{cases} \qquad\Longleftrightarrow\qquad \begin{cases} X = V \\ Y = Z/V; \end{cases} \]
one can easily check that $|\det J| = 1/v$; then the joint density of $(Z, V)$ is
\[ f^{(Z,V)}(z, v) = f^{(X,Y)}\!\left(v, \frac{z}{v}\right) \frac{1}{v} = \frac{1}{\theta v}\, I_{[\theta/2,\, 3\theta/2]}(v)\, I_{[1/2,\, 3/2]}\!\left(\frac{z}{v}\right), \]
that is,
\[ f^{(Z,V)}(z, v) = \frac{1}{\theta v}\, I_{[\theta/2,\, 3\theta/2]}(v)\, I_{[v/2,\, 3v/2]}(z). \]

The set
\[ T = \left\{ (z, v) : \frac{\theta}{2} \le v \le \frac{3\theta}{2},\ \frac{v}{2} \le z \le \frac{3v}{2} \right\} \]
can also be rewritten in the form
\[ T = \left\{ (z, v) : \frac{\theta}{4} \le z \le \frac{3\theta}{4},\ \frac{\theta}{2} \le v \le 2z \right\} \cup \left\{ (z, v) : \frac{3\theta}{4} < z \le \frac{9\theta}{4},\ \frac{2z}{3} \le v \le \frac{3\theta}{2} \right\}; \]
we can now find the marginal density of $Z$: if $\theta/4 \le z \le 3\theta/4$ we have
\[ f_Z(z) = \int_{\theta/2}^{2z} \frac{1}{\theta v}\, dv = \frac{1}{\theta}\, \ln v \,\Big|_{\theta/2}^{2z} = \frac{1}{\theta} \ln \frac{4z}{\theta}; \]
otherwise, if $3\theta/4 < z \le 9\theta/4$, we have
\[ f_Z(z) = \int_{2z/3}^{3\theta/2} \frac{1}{\theta v}\, dv = \frac{1}{\theta}\, \ln v \,\Big|_{2z/3}^{3\theta/2} = \frac{1}{\theta} \ln \frac{9\theta}{4z}; \]
thus,
\[ f_Z(z) = \begin{cases} \dfrac{1}{\theta} \ln \dfrac{4z}{\theta} & \dfrac{\theta}{4} \le z \le \dfrac{3\theta}{4}, \\[2ex] \dfrac{1}{\theta} \ln \dfrac{9\theta}{4z} & \dfrac{3\theta}{4} < z \le \dfrac{9\theta}{4}, \\[2ex] 0 & z < \dfrac{\theta}{4} \text{ or } z > \dfrac{9\theta}{4}. \end{cases} \]
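As an illustrative sanity check (not part of the exam), the derived density of $Z$ can be compared against a Monte Carlo simulation. The value $\theta = 2$ is an arbitrary choice, and the reference probability $P(Z \le 3\theta/4) = (3\ln 3 - 2)/4$ comes from integrating the first branch of $f_Z$; a minimal sketch using only the Python standard library:

```python
import math
import random

random.seed(0)
theta = 2.0
n = 200_000

# Draw X ~ Unif(theta/2, 3*theta/2), Y ~ Unif(1/2, 3/2), set Z = X*Y.
zs = [random.uniform(theta / 2, 3 * theta / 2) * random.uniform(0.5, 1.5)
      for _ in range(n)]

# Integrating the first branch of f_Z over [theta/4, 3*theta/4] gives
# P(Z <= 3*theta/4) = (3*ln 3 - 2)/4, independently of theta.
p_exact = (3 * math.log(3) - 2) / 4
p_mc = sum(z <= 3 * theta / 4 for z in zs) / n
print(p_exact, p_mc)
```

The empirical frequency should agree with the analytic value to about two decimal places at this sample size.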
(1.3) Assume that $(X_1, \dots, X_n)$ is a simple random sample of i.i.d. random variables, each with density function equal to that of $X$, and $(Y_1, \dots, Y_n)$ a simple random sample of i.i.d. random variables, each with density function equal to that of $Y$. Obtain the joint distribution of $(X_{(n)}, Y_{(1)})$.
Solution: First of all, let us observe that
\[ f^{(X,Y)}(x, y) = \underbrace{\frac{1}{\theta}\, I_{[\theta/2,\, 3\theta/2]}(x)}_{f_X(x)}\ \underbrace{I_{[1/2,\, 3/2]}(y)}_{f_Y(y)}, \]
so $X$ and $Y$ are independent. Therefore, $X_{(n)}$ and $Y_{(1)}$ will also be independent. Using the usual formulas one obtains
\[ f_{X_{(n)}}(x) = \frac{n}{\theta^n}\left(x - \frac{\theta}{2}\right)^{n-1} I_{[\theta/2,\, 3\theta/2]}(x), \qquad f_{Y_{(1)}}(y) = n\left(\frac{3}{2} - y\right)^{n-1} I_{[1/2,\, 3/2]}(y); \]
finally,
\[ f_{X_{(n)}, Y_{(1)}}(x, y) = \frac{n^2}{\theta^n}\left(x - \frac{\theta}{2}\right)^{n-1}\left(\frac{3}{2} - y\right)^{n-1} I_{[\theta/2,\, 3\theta/2]}(x)\, I_{[1/2,\, 3/2]}(y). \]
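These order-statistic densities can be checked numerically through the means they imply, $E[X_{(n)}] = \theta/2 + \theta n/(n+1)$ and $E[Y_{(1)}] = 1/2 + 1/(n+1)$ (standard facts for uniform order statistics, not derived in the exam; the values $\theta = 2$, $n = 5$ are illustrative):

```python
import random

random.seed(1)
theta, n, reps = 2.0, 5, 100_000

xn_vals, y1_vals = [], []
for _ in range(reps):
    # Maximum of n uniforms on [theta/2, 3*theta/2], minimum of n on [1/2, 3/2].
    xn_vals.append(max(random.uniform(theta / 2, 3 * theta / 2) for _ in range(n)))
    y1_vals.append(min(random.uniform(0.5, 1.5) for _ in range(n)))

# Means implied by the order-statistic densities above.
print(sum(xn_vals) / reps, theta / 2 + theta * n / (n + 1))
print(sum(y1_vals) / reps, 0.5 + 1 / (n + 1))
```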

(1.4) Given the sample $(X_1, \dots, X_n)$:

(a) Find the sufficient statistic for $\theta$;
Solution: Let us recall that
\[ f_X(x) = \frac{1}{\theta}\, I_{[\theta/2,\, 3\theta/2]}(x); \]
we can write the joint density of the sample as
\[ f_{X_1, \dots, X_n}(x_1, \dots, x_n) = \frac{1}{\theta^n}\, I_{[\theta/2,\, 3\theta/2]}(x_{(1)})\, I_{[x_{(1)},\, 3\theta/2]}(x_{(n)}), \]
that is, $T = (X_{(1)}, X_{(n)})$ is sufficient.
(b) Suggest an ancillary statistic for $\theta$;
Solution: Observe that
\[ \frac{\theta}{2} \le x \le \frac{3\theta}{2} \iff \frac{1}{2} \le \frac{x}{\theta} \le \frac{3}{2}, \]
and
\[ X \sim \mathrm{Unif}\!\left(\frac{\theta}{2}, \frac{3\theta}{2}\right) \iff X = \theta Y, \quad Y \sim \mathrm{Unif}\!\left(\frac{1}{2}, \frac{3}{2}\right); \]
therefore $\theta$ is a scale parameter. An example of an ancillary statistic is, for instance,
\[ U = \frac{X_1}{X_2} = \frac{\theta Y_1}{\theta Y_2} = \frac{Y_1}{Y_2}. \]
(c) Show that the sufficient statistic found in (a) is not complete (note that no calculations are necessary);
Solution: Setting $V = X_{(1)}/X_{(n)}$, $V$ is a function of the sufficient statistic $T$; moreover, since $X_{(i)} = \theta Y_{(i)}$,
\[ E_\theta[V] = E_\theta\!\left[\frac{X_{(1)}}{X_{(n)}}\right] = E\!\left[\frac{Y_{(1)}}{Y_{(n)}}\right] = c, \]
a constant that does not depend on $\theta$. We can then choose the function $g(T)$ as follows:
\[ g(T) = \frac{X_{(1)}}{X_{(n)}} - c; \]
clearly $E_\theta[g(T)] = 0$ for every $\theta$, but $g(t) \ne 0$ for a.e. $t$; hence $T$ is not complete.
(d) Find the method of moments estimator and the maximum likelihood estimator for $\theta$.
Solution: As $E_\theta[X] = \theta$, then $\hat\theta_M = \bar{X}$. For the maximum likelihood estimator, let us observe that
\[ L(\theta \mid x) = \frac{1}{\theta^n}\, I_{[\theta/2,\, 3\theta/2]}(x_{(1)})\, I_{[x_{(1)},\, 3\theta/2]}(x_{(n)}) = \frac{1}{\theta^n}\, I_{\left[\frac{2}{3} x_{(n)},\, 2 x_{(1)}\right]}(\theta); \]
as $L$ is decreasing in $\theta$, its maximum is attained at the left endpoint, so $\hat\theta_L = \frac{2}{3}\, x_{(n)}$.
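Both estimators of $\theta$ are easy to check by simulation (an illustrative sketch, not part of the exam; $\theta = 3$ and $n = 500$ are arbitrary choices):

```python
import random

random.seed(2)
theta, n = 3.0, 500

# Sample from Unif(theta/2, 3*theta/2).
xs = [random.uniform(theta / 2, 3 * theta / 2) for _ in range(n)]

theta_mom = sum(xs) / n        # method of moments: sample mean, since E[X] = theta
theta_mle = (2 / 3) * max(xs)  # maximum likelihood: (2/3) * X_(n)
print(theta_mom, theta_mle)
```

Note that $\hat\theta_L \le \theta$ always, since $X_{(n)} \le 3\theta/2$, while $\hat\theta_M$ fluctuates around $\theta$.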

Problem 2 (13 points). Consider the simple random sample $X_1, \dots, X_n$ extracted from the population described by the following density function:
\[ f(x; \alpha, \lambda) = [\Gamma(\alpha)\, \lambda^\alpha]^{-1}\, x^{-(\alpha+1)}\, e^{-\frac{1}{\lambda x}}, \qquad x > 0,\ \alpha > 1,\ \lambda > 0. \]

(2.1) Find the jointly minimal sufficient statistic $T$ for $\vartheta = (\alpha, \lambda)$.

Solution: Writing the density as
\[ f(x; \alpha, \lambda) = [\Gamma(\alpha)\, \lambda^\alpha]^{-1}\, e^{-(\alpha+1) \ln x - \frac{1}{\lambda}\frac{1}{x}}, \]
it can be seen that $f$ belongs to the exponential family; then $T = \left(\sum \ln X_i,\ \sum \frac{1}{X_i}\right)$ is the minimal sufficient statistic.

(2.2) Is the statistic $T$ also complete?

Solution: As the parameters $\alpha$ and $\lambda$ belong to a rectangle $\{\alpha > 1,\ \lambda > 0\} \subset \mathbb{R}^2$, the regularity conditions hold, and the statistic $T$ is also complete.

(2.3) Verify that the expected value of the population is $E[X] = [\lambda(\alpha - 1)]^{-1}$. (Hint: consider the transformation $Y = 1/X$.)

Solution: We have
\[ E[X] = \int_0^{+\infty} x\, [\Gamma(\alpha)\lambda^\alpha]^{-1}\, x^{-(\alpha+1)}\, e^{-\frac{1}{\lambda x}}\, dx = \frac{1}{\Gamma(\alpha)\lambda^\alpha} \int_0^{+\infty} x^{-\alpha}\, e^{-\frac{1}{\lambda x}}\, dx \]
\[ \left(\text{letting } \frac{1}{x} = y \iff x = \frac{1}{y},\ dx = -\frac{1}{y^2}\, dy\right) \]
\[ = \frac{1}{\Gamma(\alpha)\lambda^\alpha} \int_0^{+\infty} y^{\alpha - 2}\, e^{-\frac{y}{\lambda}}\, dy \]
\[ \left(y^{\alpha-2}\, e^{-\frac{y}{\lambda}} \text{ is the kernel of a } \Gamma(\alpha - 1, 1/\lambda)\right) \]
\[ = \frac{1}{\Gamma(\alpha)\lambda^\alpha}\, \Gamma(\alpha - 1)\, \lambda^{\alpha - 1} = \frac{1}{\lambda(\alpha - 1)}. \]

On the other hand, one could also see that, letting $Y = \frac{1}{X}$, the density of $Y$ is ($y > 0$)
\[ f_Y(y) = f\!\left(\frac{1}{y}\right) \frac{1}{y^2} = \frac{1}{\Gamma(\alpha)\lambda^\alpha}\, y^{\alpha+1}\, e^{-\frac{y}{\lambda}}\, \frac{1}{y^2} = \frac{1}{\Gamma(\alpha)\lambda^\alpha}\, y^{\alpha-1}\, e^{-\frac{y}{\lambda}}; \]
that is, $Y \sim \Gamma(\alpha, 1/\lambda)$; then a straightforward calculation leads to $E[X] = E\!\left[\frac{1}{Y}\right] = [\lambda(\alpha - 1)]^{-1}$.
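The identity $E[X] = [\lambda(\alpha - 1)]^{-1}$ can be checked by simulating $Y \sim \Gamma(\alpha, 1/\lambda)$ and taking $X = 1/Y$. A minimal sketch using the standard library's `random.gammavariate(shape, scale)` (note it takes the scale $\lambda$, not the rate; $\alpha = 3$, $\lambda = 0.5$ are illustrative values):

```python
import random

random.seed(3)
alpha, lam, n = 3.0, 0.5, 200_000

# Y ~ Gamma with shape alpha and scale lam has density
# y^(alpha-1) e^(-y/lam) / (Gamma(alpha) lam^alpha), so X = 1/Y has
# the Problem 2 density.
xs = [1 / random.gammavariate(alpha, lam) for _ in range(n)]

m_mc = sum(xs) / n
m_exact = 1 / (lam * (alpha - 1))
print(m_mc, m_exact)
```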


(2.4) Assuming that $\lambda$ is known, obtain the method of moments estimator of $\alpha$.
Solution: As $E[X] = \frac{1}{\lambda(\alpha - 1)}$, that is, $\alpha = 1 + \frac{1}{\lambda\, E[X]}$, the method of moments estimator of $\alpha$ is
\[ \hat\alpha_M = 1 + \frac{1}{\lambda \bar{X}}. \]
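A quick simulation check of the method of moments estimator (an illustrative sketch; $\alpha = 3$, $\lambda = 0.5$ are arbitrary, and the sample is generated via $X = 1/Y$ with $Y$ Gamma-distributed as in (2.3)):

```python
import random

random.seed(4)
alpha, lam, n = 3.0, 0.5, 100_000

# Simulate from the Problem 2 density via X = 1/Y, Y ~ Gamma(shape=alpha, scale=lam).
xs = [1 / random.gammavariate(alpha, lam) for _ in range(n)]

xbar = sum(xs) / n
alpha_mom = 1 + 1 / (lam * xbar)   # method of moments estimator, lambda known
print(alpha_mom)
```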
(2.5) Assuming that $\alpha$ is known, obtain the maximum likelihood estimator of $\lambda$.
Solution: The likelihood function is
\[ L(\lambda; x) = K(\alpha; x)\, \lambda^{-n\alpha}\, e^{-\frac{1}{\lambda} \sum \frac{1}{x_i}}; \]
the derivative with respect to $\lambda$ of $\ln L$ is
\[ \frac{d}{d\lambda} \ln L(\lambda; x) = \frac{d}{d\lambda}\left[\ln K(\alpha; x) - n\alpha \ln \lambda - \frac{1}{\lambda} \sum \frac{1}{x_i}\right] = -\frac{n\alpha}{\lambda} + \frac{1}{\lambda^2} \sum \frac{1}{x_i}, \]
and it is equal to $0$ iff $\lambda = \frac{1}{n\alpha} \sum \frac{1}{x_i}$, which is clearly a maximum, since $\frac{d}{d\lambda} \ln L(\lambda; x) \gtrless 0$ if $\lambda \lessgtr \frac{1}{n\alpha} \sum \frac{1}{x_i}$. Hence, $\hat\lambda_L = \frac{1}{n\alpha} \sum \frac{1}{X_i}$.
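The MLE is also straightforward to verify numerically: since $1/X_i$ has mean $\alpha\lambda$, the estimator $\frac{1}{n\alpha}\sum 1/X_i$ should concentrate around $\lambda$ (an illustrative sketch with arbitrary $\alpha = 3$, $\lambda = 0.5$):

```python
import random

random.seed(5)
alpha, lam, n = 3.0, 0.5, 100_000

# X = 1/Y with Y ~ Gamma(shape=alpha, scale=lam), as in (2.3).
xs = [1 / random.gammavariate(alpha, lam) for _ in range(n)]

lam_mle = sum(1 / x for x in xs) / (n * alpha)  # MLE of lambda, alpha known
print(lam_mle)
```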

(2.6) Again assuming that $\alpha$ is known, compute Fisher's Information of the sufficient statistic.
Solution: Since $\alpha$ is known, the sufficient statistic is $S = \sum \frac{1}{X_i}$. We have already noticed that $\frac{1}{X} \sim \Gamma(\alpha, 1/\lambda)$, so that $E\!\left[\frac{1}{X}\right] = \alpha\lambda$, and
\[ I_S(\lambda) = n\, I_{X_1}(\lambda) = -n\, E\!\left[\frac{\partial^2}{\partial\lambda^2} \ln f(X; \lambda)\right] \]
\[ = -n\, E\!\left[\frac{\partial^2}{\partial\lambda^2}\left(-\ln \Gamma(\alpha) - \alpha \ln \lambda - (\alpha + 1) \ln X - \frac{1}{\lambda}\frac{1}{X}\right)\right] \]
\[ = -n\, E\!\left[\frac{\alpha}{\lambda^2} - \frac{2}{\lambda^3}\frac{1}{X}\right] = -n\left(\frac{\alpha}{\lambda^2} - \frac{2}{\lambda^3}\, E\!\left[\frac{1}{X}\right]\right) = \frac{n\alpha}{\lambda^2}. \]
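Since the per-observation Fisher information equals the variance of the score, the result $I_{X_1}(\lambda) = \alpha/\lambda^2$ can be checked by simulating the score $-\alpha/\lambda + (1/\lambda^2)(1/X)$ (an illustrative sketch; $\alpha = 3$, $\lambda = 0.5$ are arbitrary):

```python
import random

random.seed(6)
alpha, lam, n = 3.0, 0.5, 200_000

# 1/X ~ Gamma(shape=alpha, scale=lam); per-observation score in lambda:
# d/d(lambda) ln f(X) = -alpha/lam + (1/lam^2) * (1/X).
ys = [random.gammavariate(alpha, lam) for _ in range(n)]   # plays the role of 1/X
scores = [-alpha / lam + y / lam ** 2 for y in ys]

mean_s = sum(scores) / n                             # should be close to 0
var_s = sum((s - mean_s) ** 2 for s in scores) / n   # estimates I_{X_1}(lambda)
print(var_s, alpha / lam ** 2)
```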

Problem 3 (6 points). Let $X_1, \dots, X_n$ be an i.i.d. sample extracted from the population described by the random variable
\[ X \sim f_X(x; \theta) = \theta\, e^{-\theta x}, \qquad x \ge 0,\ \theta > 0. \]
(3.1) Find the complete sufficient statistic $T$ for $\theta$.
Solution: As $f$ belongs to the exponential family, $T = \sum X_i$ is the complete sufficient statistic (observe that the regularity conditions hold, as $\Theta = \mathbb{R}^+$).
(3.2) Show that the statistic
\[ U = \frac{\min\{X_1, \dots, X_n\}}{\bar{X}}, \]
with $\bar{X} = (1/n) \sum_{i=1}^n X_i$, is ancillary.
Solution: Observe that $Y = \theta X \sim \mathrm{Exp}(1)$; indeed, $x = \frac{1}{\theta}\, y$, and
\[ f_Y(y) = \frac{1}{\theta}\, f_X\!\left(\frac{1}{\theta}\, y\right) = e^{-y}\, I(y > 0). \]
Thus $\frac{1}{\theta}$ is a scale parameter, and
\[ U = \frac{\min\{X_1, \dots, X_n\}}{\bar{X}} = \frac{\frac{1}{\theta}\min\{Y_1, \dots, Y_n\}}{\frac{1}{\theta}\bar{Y}} = \frac{\min\{Y_1, \dots, Y_n\}}{\bar{Y}}, \]
that is, $U$ is ancillary.
(3.3) Compute the expected value of the random variable $U$.

Solution: First of all, let us note that $\min\{X_1, \dots, X_n\} \sim \mathrm{Exp}(n\theta)$; furthermore, as $\bar{X}$ is a function of the complete sufficient statistic and $U$ is ancillary, we know that $U$ and $\bar{X}$ are independent by Basu's Theorem. As $U \bar{X} = \min\{X_1, \dots, X_n\}$, we have
\[ E\!\left[U \bar{X}\right] = E\left[\min\{X_1, \dots, X_n\}\right]; \]
using the independence,
\[ E[U]\, E\!\left[\bar{X}\right] = E\left[\min\{X_1, \dots, X_n\}\right], \]
that is,
\[ E[U] = \frac{E\left[\min\{X_1, \dots, X_n\}\right]}{E\!\left[\bar{X}\right]} = \frac{\frac{1}{n\theta}}{\frac{1}{\theta}} = \frac{1}{n}. \]
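The result $E[U] = 1/n$, which holds for every $\theta$, is easy to confirm by simulation (an illustrative sketch; $\theta = 2$ and $n = 8$ are arbitrary choices, and `random.expovariate` takes the rate $\theta$):

```python
import random

random.seed(7)
theta, n, reps = 2.0, 8, 100_000

u_vals = []
for _ in range(reps):
    xs = [random.expovariate(theta) for _ in range(n)]  # Exp(theta) sample
    u_vals.append(min(xs) / (sum(xs) / n))              # U = min / sample mean

print(sum(u_vals) / reps, 1 / n)   # E[U] = 1/n, independently of theta
```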
