
MATH 330  Prob and Stat

James Helmreich

Problem Set 5
1  Chapter 6 Problems

2.
> pt(3,5)
[1] 0.9849504
> pt(3,5)-pt(2,5)
[1] 0.03592012
> qt(.05,5)
[1] -2.015048
3.
-5 = -n/2 ==> n=10
> pchisq(15.99,10)
[1] 0.900081

Clearly chi-squared, from the form of the mgf.
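Spelling the identification out (this assumes, consistent with the -5 = -n/2 step above, that the mgf given in the problem is (1 - 2t)^(-5)):

```latex
% Matching the given mgf (assumed form) against the chi-squared family:
\[
  M_X(t) = (1-2t)^{-5}, \qquad
  M_{\chi^2_n}(t) = (1-2t)^{-n/2} \quad (t < 1/2).
\]
% Equating exponents gives $-5 = -n/2$, so $n = 10$ and $X \sim \chi^2_{10}$;
% then $P(X \le 15.99)$ is \texttt{pchisq(15.99, 10)} $\approx 0.9$.
```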

6.
> pf(1,2,5)
[1] 0.5687988
> qf(.5,2,5)
[1] 0.7987698
9.
a, b, and d are statistics.

c involves a parameter and thus is not a statistic.

25.
> set.seed(36)
> a <- rnorm(1000)
> b <- rchisq(1000, 5)
> ab <- a/sqrt(.2*b)
> x <- seq(-4, 4, by = .01)  # grid for the t density overlay
> par(mfrow = c(1, 3))
> hist(ab, freq = F)
> lines(x, dt(x, 5), lwd = 2, col = 2)
> hist(ab, breaks = 20, freq = F)
> lines(x, dt(x, 5), lwd = 2, col = 2)
> hist(ab, breaks = 40, freq = F)
> lines(x, dt(x, 5), lwd = 2, col = 2)
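As a check beyond the visual overlay (not part of the assignment): a standard normal divided by the square root of an independent chi-squared(5) over 5 is exactly t with 5 df, so a Kolmogorov-Smirnov test should not flag the simulated ratio.

```r
# Sketch of a numerical check of problem 25's construction.
set.seed(36)
a <- rnorm(1000)
b <- rchisq(1000, 5)
ab <- a / sqrt(b / 5)       # same as a/sqrt(.2*b) above
ks.test(ab, "pt", df = 5)   # should typically give a non-small p-value
```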

Fall 2014


Figure 1: Randomly generated histograms (the number of breaks is the only difference) of Z divided by (χ₅²/5)^(1/2), with a t₅ density superimposed.
2  Chapter 7 Problems

(4) V(T1) = V(X)/n² = npq/n² = pq/n
V(T2) = V(X + 1)/(n + 2)² = V(X)/(n + 2)² = npq/(n + 2)²

E(T1) = E(X)/n = np/n = p, so (E(T1) - p)² = 0
E(T2) = E(X + 1)/(n + 2) = (E(X) + 1)/(n + 2) = (np + 1)/(n + 2)
(E(T2) - p)² = ((1 - 2p)/(n + 2))²

So...

MSE(T1) = pq/n
MSE(T2) = npq/(n + 2)² + ((1 - 2p)/(n + 2))² = (1 + (n - 4)pq)/(n + 2)²

(b) With n = 100 and p = .4: MSE(T1) = .0024 and MSE(T2) = 24.04/102² ≈ .00231, so MSE(T2) is smaller.
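The arithmetic in part (b) can be checked directly in R, using the MSE formulas derived above:

```r
# MSE comparison for T1 = X/n and T2 = (X+1)/(n+2) at n = 100, p = .4
n <- 100; p <- 0.4; q <- 1 - p
mse1 <- p * q / n                                  # pq/n
mse2 <- (n * p * q + (1 - 2 * p)^2) / (n + 2)^2    # variance + bias^2
c(mse1 = mse1, mse2 = mse2)
mse2 < mse1                                        # TRUE
```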
(7) (a) E(T1) = (1/6) Σᵢ₌₁⁶ E(Xᵢ) = (1/6)(6μ) = μ.
E(T2) = (1/5) Σᵢ₌₂⁶ E(Xᵢ) = (1/5)(5μ) = μ.
(b) Var(T1) = σ²/6; Var(T2) = σ²/5.
So:
MSE(T3) = α² Var(T1) + (1 - α)² Var(T2) = α²(σ²/6) + (1 - α)²(σ²/5)
MSE′(T3) = 2α(σ²/6) - 2(1 - α)(σ²/5)

Setting this equal to zero, we get

Fall 2014

Page 2 of 3

MATH 330

PS 5

Helmreich

α/6 = (1 - α)/5, so α = 6/11. You should check that this minimizes MSE(T3) (the second derivative is a positive constant).
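A quick grid search confirms the minimizer; σ² factors out of the comparison, so it is set to 1 here:

```r
# MSE(T3) = alpha^2 * sigma^2/6 + (1-alpha)^2 * sigma^2/5, with sigma^2 = 1
alpha <- seq(0, 1, by = 1e-4)
mse3  <- alpha^2 / 6 + (1 - alpha)^2 / 5
alpha[which.min(mse3)]    # approximately 6/11
```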

(10) L(θ = 1/2 | HTT) = (1/2)(1/2)(1/2) = 1/8
L(θ = 1/3 | HTT) = (2/3)(1/3)(1/3) = 2/27
L(θ = 2/3 | HTT) = (1/3)(2/3)(2/3) = 4/27
So the MLE of θ is 2/3. (These values treat θ as the probability of tails, which is the reading consistent with the factors above.)
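Tabulating L(θ) = (1 - θ)θ², the likelihood of HTT when θ is taken to be the probability of tails, at the three candidate values:

```r
# Likelihood of one head then two tails, with theta = P(tails)
theta <- c(1/3, 1/2, 2/3)
L <- (1 - theta) * theta^2
rbind(theta = theta, L = L)   # L = 2/27, 1/8, 4/27
theta[which.max(L)]           # 2/3
```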

(13) (a) We discussed the likelihood in class. Taking the log-likelihood, you should get something like

ln(L) = -7λ + (Σxᵢ) ln(λ) - 20λ + (Σyᵢ) ln(2λ) + (constants involving the xᵢ! and yᵢ!)

Taking the derivative and setting it equal to zero, we get that the MLE estimator of λ is

λ̂ = (Σxᵢ + Σyᵢ)/27

(b) This makes the MLE estimate of λ for the given data 80/27. Then we have

Var(λ̂) = Var((Σxᵢ + Σyᵢ)/27) = (Σ Var(xᵢ) + Σ Var(yᵢ))/27² = (7λ + 10·2λ)/27² = λ/27,

so in this instance the (estimated) variance of the estimator is (80/27)/27 = 80/27².
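Only the total Σxᵢ + Σyᵢ = 80 enters the score equation, so any split with that total reproduces the MLE; the 30/50 split below is made up purely for illustration.

```r
# Numerical maximization of the log-likelihood (constants dropped).
# sx + sy = 80 comes from the problem; the individual values are hypothetical.
sx <- 30; sy <- 50
loglik <- function(l) -7 * l + sx * log(l) - 20 * l + sy * log(2 * l)
optimize(loglik, interval = c(0.01, 10), maximum = TRUE)$maximum  # ~ 80/27
```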
