Homework 2
Solution
Saad Mneimneh
Visiting Professor
Hunter College of CUNY
Problem 0: Readings
Read pages 12 to 22 and pages 34 to 42 in the Bayesian statistics book. Also
read the notes posted on the course website for lectures 4, 5 and 6.
Problem 1: Pólya's urn
Pólya's urn represents, to some extent, a generalization of a Binomial random variable. Consider the following scheme: An urn contains $b$ black and $r$ red balls. At each step a ball is drawn at random; the ball drawn is always replaced, and, in addition, $c$ balls of the color drawn are added to the urn.
Note that if $c = 0$, the drawings are equivalent to independent Bernoulli trials with $p = b/(b+r)$. However, with $c \neq 0$, the trials are dependent, each with a parameter that depends on the sequence of previous drawings.

For instance, if the first ball is black, the (conditional) probability of a black ball at the second drawing is $(b+c)/(b+c+r)$. The probability of the sequence black, black is, therefore,

$$\frac{b}{b+r} \cdot \frac{b+c}{b+r+c}$$
Let $B_i$ be the event that a black ball is drawn at the $i$th drawing. Define $R_i$ similarly. For instance, $B_1 R_2 B_3$ is the event that a black ball is drawn first, then a red one, then a black one.

(a) Show that the probability of a sequence of $B$s and $R$s remains the same if we permute the $B$s and the $R$s in any way. Derive an expression for the probability of $k$ black balls in $n \geq k$ drawings.
Solution: The probability that the $n$th ball is black, conditioned on $k-1$ black balls in the first $n-1$ rounds ($k \leq n$), is

$$\frac{b + (k-1)c}{b + r + (n-1)c}$$

Therefore, if the $i$th $B$ appears in the $j$th position in a sequence, it contributes a probability

$$\frac{b + (i-1)c}{b + r + (j-1)c}$$

The same is true for red balls with $b$ and $r$ switched. Therefore, a sequence with $k$ black balls contributes the terms $b, b+c, \ldots, b+(k-1)c$ and $r, r+c, \ldots, r+(n-k-1)c$ in the numerator, and the terms $b+r, b+r+c, \ldots, b+r+(n-1)c$ in the denominator.
This makes the probability of $k$ black balls in $n$ rounds equal to the following expression (regardless of the order in which they were drawn):

$$\binom{n}{k} \frac{\prod_{i=1}^{k} [b + (i-1)c] \; \prod_{i=1}^{n-k} [r + (i-1)c]}{\prod_{i=1}^{n} [b + r + (i-1)c]}$$

Dividing every factor by $c$, this equals

$$\binom{n}{k} \frac{\prod_{i=1}^{k} [\tfrac{b}{c} + i - 1] \; \prod_{i=1}^{n-k} [\tfrac{r}{c} + i - 1]}{\prod_{i=1}^{n} [\tfrac{b+r}{c} + i - 1]} = \binom{n}{k} \frac{\Gamma(\tfrac{b}{c} + k) \, \Gamma(\tfrac{r}{c} + n - k)}{\Gamma(\tfrac{b}{c}) \, \Gamma(\tfrac{r}{c})} \cdot \frac{\Gamma(\tfrac{b}{c} + \tfrac{r}{c})}{\Gamma(\tfrac{b}{c} + \tfrac{r}{c} + n)}$$
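The closed form is easy to sanity-check against a direct simulation of the urn. Below is a minimal Python sketch; the values $b = 2$, $r = 3$, $c = 1$, $n = 5$ are arbitrary illustrative choices, not part of the problem.

```python
import math
import random

def polya_prob(b, r, c, n, k):
    """Closed-form probability of k black balls in n drawings."""
    prob = math.comb(n, k)
    for i in range(1, k + 1):
        prob *= b + (i - 1) * c
    for i in range(1, n - k + 1):
        prob *= r + (i - 1) * c
    for i in range(1, n + 1):
        prob /= b + r + (i - 1) * c
    return prob

def simulate(b, r, c, n, trials=200_000):
    """Empirical distribution of the number of black balls in n drawings."""
    counts = [0] * (n + 1)
    for _ in range(trials):
        nb, nr, k = b, r, 0
        for _ in range(n):
            if random.random() < nb / (nb + nr):
                nb += c          # black drawn: replace it and add c black
                k += 1
            else:
                nr += c          # red drawn: replace it and add c red
        counts[k] += 1
    return [v / trials for v in counts]

b, r, c, n = 2, 3, 1, 5
emp = simulate(b, r, c, n)
for k in range(n + 1):
    print(k, round(polya_prob(b, r, c, n, k), 4), round(emp[k], 4))
```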
(b) Show that $P(B_i \mid B_j) = P(B_j \mid B_i)$, $P(R_i \mid R_j) = P(R_j \mid R_i)$, and $P(B_i \mid R_j) = P(B_j \mid R_i)$.

Solution: Since $P(B_i) = P(B_j) = b/(b+r)$ and similarly $P(R_i) = P(R_j) = r/(b+r)$, the first two equalities follow immediately. For the third,

$$P(B_i \mid R_j) = \frac{P(B_i, R_j)}{P(R_j)} = \frac{P(B_j, R_i)}{P(R_i)} = P(B_j \mid R_i)$$

The first equality follows from the definition of conditioning, the second follows from the fact that we can permute the sequence (part (a)) and that $P(R_i) = P(R_j)$, and the last follows again from the definition of conditioning.

The remaining case, $P(R_i \mid B_j) = P(R_j \mid B_i)$, is symmetric. The proof of the first two equalities is actually identical to this one, since $P(X \mid Y)P(Y)$ in Bayes' rule is nothing but $P(X, Y)$.
Problem 2: Splitting a Poisson
Consider a Poisson random variable $Z$ with

$$P(Z = k) = \frac{\lambda^k e^{-\lambda}}{k!}$$

Consider also the following two random variables $X$ and $Y = Z - X$:

$$P(X = k \mid Z = n) = \binom{n}{k} p^k (1-p)^{n-k}$$

$$P(Y = k \mid Z = n) = \binom{n}{k} (1-p)^k p^{n-k}$$

(a) Show that $X$ and $Y$ are both Poisson random variables with parameters $p\lambda$ and $(1-p)\lambda$ respectively.
Solution: We need to find $P(X = k)$.

$$P(X = k) = \sum_{n=0}^{\infty} P(X = k \mid Z = n) P(Z = n) = \sum_{n=k}^{\infty} \binom{n}{k} p^k (1-p)^{n-k} \frac{\lambda^n e^{-\lambda}}{n!}$$

$$= \sum_{n=k}^{\infty} \frac{n!}{k!(n-k)!} \, p^k (1-p)^{n-k} \frac{\lambda^n e^{-\lambda}}{n!} = \sum_{n=k}^{\infty} \frac{p^k (1-p)^{n-k} \lambda^n}{k!(n-k)!} e^{-\lambda}$$

$$= \frac{(\lambda p)^k e^{-\lambda p}}{k!} \left[ \frac{[\lambda(1-p)]^0 e^{-\lambda(1-p)}}{0!} + \frac{[\lambda(1-p)]^1 e^{-\lambda(1-p)}}{1!} + \ldots \right]$$

The bracketed sum is the sum of all Poisson probabilities with parameter $\lambda(1-p)$. Therefore, the sum evaluates to 1. We end up with:

$$P(X = k) = \frac{(\lambda p)^k e^{-\lambda p}}{k!}$$

so $X$ is Poisson with parameter $p\lambda$. The computation for $Y$ is identical with $p$ and $1-p$ switched, giving $Y$ Poisson with parameter $(1-p)\lambda$.
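This thinning property can be checked empirically: draw $Z$ from a Poisson, keep each of the $Z$ items independently with probability $p$, and compare the distribution of the kept count with Poisson($p\lambda$). A minimal sketch, with $\lambda = 4$ and $p = 0.3$ as arbitrary illustrative values:

```python
import random
from collections import Counter
from math import exp, factorial

def poisson_sample(lam):
    """Inverse-transform sample from Poisson(lam)."""
    u, k, term = random.random(), 0, exp(-lam)
    cdf = term
    while u > cdf:
        k += 1
        term *= lam / k
        cdf += term
    return k

lam, p, trials = 4.0, 0.3, 100_000
counts = Counter()
for _ in range(trials):
    z = poisson_sample(lam)
    x = sum(random.random() < p for _ in range(z))  # X | Z=z ~ Binomial(z, p)
    counts[x] += 1

for k in range(8):
    exact = (lam * p) ** k * exp(-lam * p) / factorial(k)  # Poisson(lam*p) pmf
    print(k, round(counts[k] / trials, 4), round(exact, 4))
```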
(b) Show that $X$ and $Y$ are independent.

Solution:

$$P(X = k_1, Y = k_2 \mid Z = k_1 + k_2) = P(X = k_1, Z - X = k_2 \mid Z = k_1 + k_2)$$

$$= P(X = k_1 \mid Z = k_1 + k_2) = \binom{k_1 + k_2}{k_1} p^{k_1} (1-p)^{k_2}$$

Now by Bayes' rule:

$$P(X = k_1, Y = k_2 \mid Z = k_1 + k_2) = \frac{P(Z = k_1 + k_2 \mid X = k_1, Y = k_2) \, P(X = k_1, Y = k_2)}{P(Z = k_1 + k_2)} = \frac{1 \cdot P(X = k_1, Y = k_2)}{P(Z = k_1 + k_2)}$$

Therefore,

$$P(X = k_1, Y = k_2) = \binom{k_1 + k_2}{k_1} p^{k_1} (1-p)^{k_2} \, \frac{\lambda^{k_1 + k_2} e^{-\lambda}}{(k_1 + k_2)!} = \frac{(\lambda p)^{k_1} e^{-\lambda p}}{k_1!} \cdot \frac{[\lambda(1-p)]^{k_2} e^{-\lambda(1-p)}}{k_2!} = P(X = k_1) P(Y = k_2)$$

which shows that $X$ and $Y$ are independent.
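The final factorization can also be verified numerically for a few values of $k_1$ and $k_2$ ($\lambda = 4$ and $p = 0.3$ are again arbitrary illustrative values):

```python
from math import comb, exp, factorial

def poisson_pmf(lam, k):
    return lam ** k * exp(-lam) / factorial(k)

lam, p = 4.0, 0.3
for k1, k2 in [(0, 0), (1, 2), (3, 5), (2, 7)]:
    joint = comb(k1 + k2, k1) * p ** k1 * (1 - p) ** k2 * poisson_pmf(lam, k1 + k2)
    product = poisson_pmf(lam * p, k1) * poisson_pmf(lam * (1 - p), k2)
    print(k1, k2, abs(joint - product) < 1e-12)  # True for all cases
```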
Now let $X$ and $Y$ be independent Poisson random variables with parameters $\lambda$ and $\mu$, and let $Z = X + Y$. Show that $Z$ is Poisson with parameter $\lambda + \mu$. Recall the binomial theorem:

$$(a + b)^n = \sum_{k=0}^{n} \binom{n}{k} a^k b^{n-k}$$
Solution: $P(Z = n) = P(X = 0, Y = n) + P(X = 1, Y = n-1) + \ldots + P(X = n, Y = 0)$, and since $X$ and $Y$ are independent:

$$P(Z = n) = \sum_{k=0}^{n} P(X = k) P(Y = n - k) = \sum_{k=0}^{n} \frac{\lambda^k e^{-\lambda}}{k!} \cdot \frac{\mu^{n-k} e^{-\mu}}{(n-k)!}$$

$$= e^{-(\lambda + \mu)} \frac{1}{n!} \sum_{k=0}^{n} \frac{n!}{k!(n-k)!} \lambda^k \mu^{n-k} = e^{-(\lambda + \mu)} \frac{1}{n!} \sum_{k=0}^{n} \binom{n}{k} \lambda^k \mu^{n-k} = \frac{e^{-(\lambda + \mu)}}{n!} (\lambda + \mu)^n$$

This shows that $Z$ is Poisson with parameter $\lambda + \mu$.
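The convolution identity is easy to confirm term by term; a short sketch with $\lambda = 2$ and $\mu = 3.5$ chosen arbitrarily:

```python
from math import exp, factorial

def poisson_pmf(lam, k):
    return lam ** k * exp(-lam) / factorial(k)

lam, mu = 2.0, 3.5
for n in range(10):
    # Convolution of the two pmfs at n versus the Poisson(lam + mu) pmf at n
    conv = sum(poisson_pmf(lam, k) * poisson_pmf(mu, n - k) for k in range(n + 1))
    print(n, abs(conv - poisson_pmf(lam + mu, n)) < 1e-12)  # True for all n
```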
Next, we find the conditional distribution of $X$ given $Z = n$:

$$P(X = k \mid Z = n) = \frac{P(Z = n \mid X = k) P(X = k)}{P(Z = n)} = \frac{P(X + Y = n \mid X = k) P(X = k)}{P(Z = n)}$$

$$= \frac{P(Y = n - k) P(X = k)}{P(Z = n)} = \frac{\mu^{n-k} e^{-\mu}}{(n-k)!} \cdot \frac{\lambda^k e^{-\lambda}}{k!} \cdot \frac{n!}{(\lambda + \mu)^n e^{-(\lambda + \mu)}}$$

$$= \binom{n}{k} \left( \frac{\lambda}{\lambda + \mu} \right)^k \left( \frac{\mu}{\lambda + \mu} \right)^{n-k}$$

That is, given $Z = n$, $X$ is Binomial with parameters $n$ and $\lambda/(\lambda + \mu)$.
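Likewise, the Bayes-rule expression and the binomial form agree numerically (same illustrative parameters, with $n = 6$):

```python
from math import comb, exp, factorial

def poisson_pmf(lam, k):
    return lam ** k * exp(-lam) / factorial(k)

lam, mu, n = 2.0, 3.5, 6
q = lam / (lam + mu)
for k in range(n + 1):
    bayes = poisson_pmf(mu, n - k) * poisson_pmf(lam, k) / poisson_pmf(lam + mu, n)
    binom = comb(n, k) * q ** k * (1 - q) ** (n - k)
    print(k, abs(bayes - binom) < 1e-12)  # True for all k
```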
We now show that the sample variance

$$S_n^2 = \frac{1}{n-1} \sum_{i=1}^{n} (x_i - \bar{x})^2$$

is an unbiased estimator of the variance $\sigma_x^2$, where $x_1, \ldots, x_n$ are independent and identically distributed and $\bar{x} = \frac{1}{n}(x_1 + \ldots + x_n)$.

$$E[S_n^2] = \frac{1}{n-1} \sum_{i=1}^{n} E[(x_i - \bar{x})^2] = \frac{n}{n-1} E[(x - \bar{x})^2]$$

where $x$ stands for any one of the (identically distributed) $x_i$. First,

$$E[\bar{x}^2] = \frac{1}{n^2} E[(x_1 + \ldots + x_n)(x_1 + \ldots + x_n)]$$

If we expand, we have $n$ terms of the form $x_i x_i$ and $n(n-1)$ terms of the form $x_i x_j$ ($i \neq j$). Since $x_i$ and $x_j$ are independent, $E[x_i x_j] = E[x_i]E[x_j]$. And since all $x_i$ are identically distributed, we end up with:

$$E[\bar{x}^2] = \frac{1}{n} E[x^2] + \frac{n-1}{n} E[x]^2$$

The same expression is obtained for $E[x\bar{x}]$. Therefore,

$$E[(x - \bar{x})^2] = E[x^2] - 2E[x\bar{x}] + E[\bar{x}^2] = E[x^2] + \frac{1}{n} E[x^2] + \frac{n-1}{n} E[x]^2 - \frac{2}{n} E[x^2] - \frac{2(n-1)}{n} E[x]^2$$

$$= \frac{n-1}{n} E[x^2] - \frac{n-1}{n} E[x]^2 = \frac{n-1}{n} \left( E[x^2] - E[x]^2 \right) = \frac{n-1}{n} \sigma_x^2$$

Therefore,

$$E[S_n^2] = \sigma_x^2$$
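The role of the $n-1$ denominator can be seen in simulation: averaging the sample variance of many small samples recovers the true variance. A minimal sketch using $U(0,1)$, whose variance is $1/12$:

```python
import random
from statistics import variance  # divides by n - 1

# Average many sample variances of small samples from U(0, 1);
# the average should approach the true variance 1/12.
n, trials = 5, 200_000
total = 0.0
for _ in range(trials):
    total += variance([random.random() for _ in range(n)])
print(total / trials, 1 / 12)
```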
Problem 5: Law of large numbers
Consider flipping a fair coin $2n$ times and let $S_{2n}$ be the number of heads. We can prove the following result ($0 \leq t \leq n$):

$$P(n - t \leq S_{2n} \leq n + t) \geq 1 - e^{-t^2/(n+t)}$$

(a) How large must $n$ be to guarantee that the fraction of heads is within 0.01 of 1/2 with probability at least 0.99?
Solution: Given the above inequality, we can rewrite it as follows:

$$P\left( \left| \frac{S_{2n}}{2n} - \frac{1}{2} \right| \leq \frac{t}{2n} \right) \geq 1 - e^{-t^2/(n+t)}$$

We want $t/(2n) = 0.01$, i.e., $t = 0.02n$, and we need $e^{-t^2/(n+t)} \leq 0.01$. This means $t^2/(n+t) = 0.0004n/1.02 \geq \ln 100$, and therefore $n \geq 1.02 \ln(100)/0.0004 \approx 11744$.

(b) Answer the same question using Chebyshev's inequality.

Solution: By Chebyshev's inequality,

$$P\left( \left| \frac{S_{2n}}{2n} - \frac{1}{2} \right| \geq \epsilon \right) \leq \frac{\sigma^2}{2n\epsilon^2}$$

where $\sigma^2$ is the variance of a single flip. We want $\epsilon = 0.01$ and $\sigma^2/(2n \cdot 0.01^2) \leq 0.01$. Since $\sigma^2 = 1/4$, this gives $n \geq 1/(8 \cdot 0.01^3) = 125000$.
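The arithmetic of the Chebyshev bound, as a two-line check:

```python
# Smallest n meeting the Chebyshev guarantee sigma^2 / (2 n eps^2) <= delta
sigma2, eps, delta = 0.25, 0.01, 0.01
print(sigma2 / (2 * eps ** 2 * delta))  # 125000.0
```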
(c) Answer the same question using the Central Limit Theorem.

Solution:

$$P\left( \left| \frac{S_{2n}}{2n} - \frac{1}{2} \right| \leq 0.01 \right) = P\left( \left| \frac{S_{2n} - n}{2n} \right| \leq 0.01 \right) = P\left( \left| \frac{S_{2n} - n}{\sigma\sqrt{2n}} \right| \leq \frac{2n \cdot 0.01}{\sigma\sqrt{2n}} \right)$$

By the central limit theorem, $\frac{S_{2n} - n}{\sigma\sqrt{2n}} \sim N(0, 1)$ for large $n$. Therefore, we need

$$P\left( \left| \frac{S_{2n} - n}{\sigma\sqrt{2n}} \right| \geq \frac{2n \cdot 0.01}{\sigma\sqrt{2n}} \right) \approx 2\left( 1 - \Phi\left( \frac{2n \cdot 0.01}{\sigma\sqrt{2n}} \right) \right) \leq 0.01$$

With $\sigma = 1/2$, this means $\Phi(0.02\sqrt{2n}) \geq 0.995$, i.e., $0.02\sqrt{2n} \geq 2.576$, which gives $n \geq 8294$ approximately.
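The corresponding $n$ can be computed from the standard normal quantile; `statistics.NormalDist` is the standard-library normal distribution:

```python
from statistics import NormalDist

# Solve 2 * (1 - Phi(0.02 * sqrt(2n))) <= 0.01 for n via the 0.995 quantile.
z = NormalDist().inv_cdf(0.995)  # ~2.5758
two_n = (z / 0.02) ** 2          # need 0.02 * sqrt(2n) >= z
print(two_n / 2)                 # ~8293.6, so n >= 8294
```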
Consider cutting the interval $[0, 2]$ at a point $X$ chosen uniformly at random, and then cutting again at a point $Y$ chosen uniformly at random in $[X, 2]$. Find the following:

(a) $f(x)$

Solution:

$$f(x) = \frac{1}{2}, \qquad 0 \leq x \leq 2$$
(b) $f(y \mid x)$

Solution:

$$f(y \mid x) = \frac{1}{2 - x}, \qquad x \leq y \leq 2$$
(c) $f(x, y)$

Solution:

$$f(x, y) = f(y \mid x) f(x) = \frac{1}{2(2 - x)}, \qquad 0 \leq x \leq y \leq 2$$
(d) $f(y)$

Solution:

$$f(y) = \int_0^2 f(y \mid x) f(x) \, dx = \int_0^y f(x, y) \, dx = \int_0^y \frac{1}{2(2 - x)} \, dx = -\frac{1}{2} \log(2 - x) \Big|_0^y$$

$$= \frac{1}{2} \log(2) - \frac{1}{2} \log(2 - y) = \frac{1}{2} \log\left( \frac{2}{2 - y} \right), \qquad 0 \leq y \leq 2$$
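As a check that $f(y)$ is a proper density, it can be integrated numerically over $[0, 2)$; the midpoint rule below avoids the logarithmic singularity at $y = 2$:

```python
from math import log

# Midpoint-rule check that f(y) = (1/2) log(2 / (2 - y)) integrates to 1
N = 100_000
h = 2 / N
total = sum(0.5 * log(2 / (2 - (i + 0.5) * h)) for i in range(N)) * h
print(total)  # ~1.0
```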
(e) $f(x \mid y)$

Solution:

$$f(x \mid y) = \frac{f(y \mid x) f(x)}{f(y)} = \frac{f(x, y)}{f(y)} = \frac{1}{(2 - x) \log\left( \frac{2}{2 - y} \right)}, \qquad 0 \leq x \leq y$$
(f) the expected cut

Solution: $E[X] = 1$ (uniform on $[0, 2]$), and

$$E[Y] = E_X[E[Y \mid X = x]] = E_X\left[ \frac{2 + x}{2} \right] = \frac{2 + E[X]}{2}$$

since, given $X = x$, $Y$ is uniform on $[x, 2]$ with mean $(2 + x)/2$. Therefore, $E[Y] = 3/2$. The expected cut is given by $[1, 3/2]$.
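A direct simulation of the two-stage cut confirms the expected values:

```python
import random

# Two-stage cut: X ~ U(0, 2), then Y ~ U(X, 2); estimate E[X] and E[Y].
trials = 500_000
sx = sy = 0.0
for _ in range(trials):
    x = random.uniform(0, 2)
    y = random.uniform(x, 2)
    sx += x
    sy += y
print(sx / trials, sy / trials)  # ~1.0 and ~1.5
```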
= min s
= min ps (1 p)f
i pi (1 pi )f
i
maxi psi (1 pi )f