Hw5 Solutions


INTERMEDIATE MATHEMATICAL STATISTICS I

HOMEWORK 5

Assigned September 9, 2022, due October 6, 2022


This homework pertains to materials covered in Lectures 8, 10, 11, and 12.
The assignment can be typed or handwritten, with your name on the document,
and with properly labeled computer output for those problems that require
computer simulations. To obtain full credit, please write clearly and show
your reasoning. If you choose to collaborate, the write-up should be your
own. Please show your work! Upload the file to the Week 6 Assignment on
Canvas or hand it in before class.
Problem 1 (10 points). Problem 4.5 of Casella & Berger.
Solution.
(a) It proceeds as follows:
\[
\iint_{\{(x,y)\in[0,1]^2:\,x>\sqrt{y}\}} (x+y)\,dx\,dy
= \int_0^1\!\left(\int_{\sqrt{y}}^1 (x+y)\,dx\right)dy
= \int_0^1\left(\frac{1-y}{2}+y(1-\sqrt{y})\right)dy
= \left[\frac{y}{2}+\frac{y^2}{4}-\frac{2}{5}y^{5/2}\right]_0^1
= \frac{7}{20}.
\]

(b) Similarly, we calculate
\[
\iint_{\{(x,y)\in[0,1]^2:\,x^2<y<x\}} 2x\,dx\,dy
= \int_0^1\!\left(\int_{x^2}^{x} 2x\,dy\right)dx
= \int_0^1 2x(x-x^2)\,dx
= \left[\frac{2x^3}{3}-\frac{2x^4}{4}\right]_0^1
= \frac{1}{6}.
\]
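Although no simulation is required here, both integrals are easy to double-check numerically. A minimal NumPy sketch (the grid size n is an arbitrary choice):

```python
import numpy as np

# Midpoint-rule double integrals on a fine grid of the unit square.
n = 2000
x = (np.arange(n) + 0.5) / n
y = (np.arange(n) + 0.5) / n
X, Y = np.meshgrid(x, y, indexing="ij")
cell = 1.0 / n**2                      # area of one grid cell

# (a) integral of x + y over {x > sqrt(y)}; exact value 7/20 = 0.35
print(np.sum((X + Y) * (X > np.sqrt(Y))) * cell)
# (b) integral of 2x over {x^2 < y < x}; exact value 1/6 ≈ 0.1667
print(np.sum(2 * X * ((X**2 < Y) & (Y < X))) * cell)
```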

Problem 2 (16 points). The random variables X and Y have a joint PDF given by
\[
f_{XY}(x,y) =
\begin{cases}
c\,e^{-2(x+y)}, & \text{if } x \ge 0 \text{ and } y \ge 0,\\
0, & \text{otherwise}.
\end{cases}
\]
(1) Determine c using the property \(\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{XY}(x,y)\,dx\,dy = 1\).
(2) Determine the joint CDF of X and Y.
(3) Determine the marginal PDFs f_X and f_Y.
(4) Calculate P(X > 1, Y > 1) and P(X < Y).
Solution.

(1) Note that
\[
1 = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{XY}(x,y)\,dx\,dy
= c\int_0^{\infty}\!\int_0^{\infty} e^{-2(x+y)}\,dx\,dy
= c\left(\int_0^{\infty} e^{-2x}\,dx\right)\left(\int_0^{\infty} e^{-2y}\,dy\right)
= c/4.
\]
Thus c = 4.
(2) By the definition of the CDF, if x ≥ 0 and y ≥ 0,
\[
F_{XY}(x,y) = \int_0^{x}\!\int_0^{y} 4e^{-2(s+t)}\,dt\,ds
= \left(\int_0^{x} 2e^{-2s}\,ds\right)\left(\int_0^{y} 2e^{-2t}\,dt\right)
= (1-e^{-2x})(1-e^{-2y}).
\]
If either x or y is negative, F_{XY}(x, y) = 0.
(3) Note that f_{XY} can be factorized into a function of x times a function of y. Thus X and Y are independent and
\[
f_X(x) = c_1 e^{-2x}\,1(x > 0), \qquad f_Y(y) = c_1 e^{-2y}\,1(y > 0),
\]
in which c_1 is a constant such that \(\int_0^{\infty} f_X(x)\,dx = 1\), i.e., c_1 = 2.
(4) Since X and Y are independent,
\[
P(X > 1, Y > 1) = P(X > 1)\,P(Y > 1)
= \left[-e^{-2x}\right]_1^{\infty}\left[-e^{-2y}\right]_1^{\infty}
= e^{-2}\cdot e^{-2} = e^{-4}.
\]
On the other hand,
\[
P(X < Y) = \int_0^{\infty}\!\int_x^{\infty} 4e^{-2(x+y)}\,dy\,dx
= \int_0^{\infty} 2e^{-2x}\,e^{-2x}\,dx = \frac{1}{2}.
\]
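As a quick sanity check (not required), one can simulate X and Y as independent Exponential(rate 2) variables, as established in part (3); the sample size and seed below are arbitrary.

```python
import numpy as np

# X and Y are independent Exponential(rate 2); NumPy parameterizes by the
# scale, which is 1/2 here.
rng = np.random.default_rng(0)
N = 10**6
x = rng.exponential(scale=0.5, size=N)
y = rng.exponential(scale=0.5, size=N)

print(np.mean((x > 1) & (y > 1)), np.exp(-4))  # both ≈ 0.018
print(np.mean(x < y))                          # ≈ 0.5
```
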
Problem 3 (8 points). The random variables X and Y have a joint PDF given by
\[
f_{XY}(x,y) =
\begin{cases}
8xy, & \text{if } 0 < x < y < 1,\\
0, & \text{otherwise}.
\end{cases}
\]
(1) Sketch the support of X and Y, i.e., the set \(\{(x,y)\in\mathbb{R}^2 : f_{XY}(x,y) > 0\}\).
(2) Determine the marginal PDFs f_X and f_Y.
(Refer to the example from Lecture 8 for ideas.)
Solution.
(1) See Figure 1 for an illustration of the support.
(2) The marginal PDF of X is
\[
f_X(x) = \int_x^1 8xy\,dy = \left[4xy^2\right]_x^1 = 4x(1-x^2), \qquad 0 < x < 1.
\]
The marginal PDF of Y is
\[
f_Y(y) = \int_0^y 8xy\,dx = \left[4yx^2\right]_0^y = 4y^3, \qquad 0 < y < 1.
\]

Figure 1. The support of X and Y .
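A short numerical check of the part (2) marginals (not required); the grid size is arbitrary.

```python
import numpy as np

# Integrate the joint PDF numerically over one coordinate on a midpoint grid
# and compare with the closed-form marginals.
n = 2000
t = (np.arange(n) + 0.5) / n
X, Y = np.meshgrid(t, t, indexing="ij")
f = 8 * X * Y * (X < Y)                 # joint PDF, zero off the support

fx_num = f.sum(axis=1) / n              # numerical f_X on the grid
fy_num = f.sum(axis=0) / n              # numerical f_Y on the grid
print(np.max(np.abs(fx_num - 4 * t * (1 - t**2))))  # small, of order 1/n
print(np.max(np.abs(fy_num - 4 * t**3)))            # small, of order 1/n
```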

Problem 4 (8 points). We shoot at a practice target centered at the origin with radius 1. Every shot hits the target, and the point where the target is hit is a pair of random variables (X, Y) uniformly distributed on the disc \(\{(x,y)\in\mathbb{R}^2 : x^2 + y^2 \le 1\}\). Equivalently, the joint PDF is
\[
f_{XY}(x,y) =
\begin{cases}
c, & \text{if } x^2 + y^2 \le 1,\\
0, & \text{otherwise}.
\end{cases}
\]
(1) Determine c.
(2) Determine the marginal PDFs f_X and f_Y.
Solution.
(1) By the property of density functions,
\[
\iint_{\{x^2+y^2\le 1\}} c\,dx\,dy = c\,\pi\cdot 1^2 = 1,
\]
which means c = 1/π.
(2) Since X and Y are exchangeable w.r.t. f_{XY}, the marginal PDFs of X and Y should be identical. Also,
\[
f_X(x) = \int_{-\sqrt{1-x^2}}^{\sqrt{1-x^2}} \frac{1}{\pi}\,dy = \frac{2}{\pi}\sqrt{1-x^2}, \qquad -1 < x < 1.
\]
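A quick Monte Carlo check of the marginal (not required); the sample size, seed, and the threshold 0.5 are arbitrary choices.

```python
import numpy as np

# Sample uniformly on the unit disc by rejection from the square, then compare
# P(|X| <= 0.5) from the sample with the same probability computed from the
# derived marginal (2/pi) * sqrt(1 - x^2).
rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(10**6, 2))
pts = pts[(pts**2).sum(axis=1) <= 1]          # keep points inside the disc

m = 10**5
xg = -0.5 + (np.arange(m) + 0.5) / m          # midpoint grid on (-0.5, 0.5)
p_marginal = np.sum(2 / np.pi * np.sqrt(1 - xg**2)) / m
print(np.mean(np.abs(pts[:, 0]) <= 0.5), p_marginal)   # both ≈ 0.61
```
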
Problem 5 (5 points). Problem 4.22 of Casella & Berger.
Proof. The inverse of the transformation is
\[
\begin{cases}
x = (u-b)/a,\\
y = (v-d)/c,
\end{cases}
\]
and the Jacobian determinant of the transformation is
\[
J = \begin{vmatrix} 1/a & 0 \\ 0 & 1/c \end{vmatrix} = \frac{1}{ac}.
\]
Note that a > 0 and c > 0. Thus the joint PDF of (U, V) is
\[
f_{UV}(u,v) = \frac{1}{ac}\, f\!\left(\frac{u-b}{a}, \frac{v-d}{c}\right).
\]
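Since this statement holds for an arbitrary joint density f, a numerical illustration has to fix one. The sketch below assumes f is the standard bivariate normal density and picks a = 2, b = 1, c = 3, d = -1, all purely for illustration.

```python
import numpy as np

# Compare a Monte Carlo estimate of P(0 <= U <= 2, -3 <= V <= 1) with the
# integral of the claimed density f((u-b)/a, (v-d)/c) / (ac) over that box.
rng = np.random.default_rng(0)
a, b, c, d = 2.0, 1.0, 3.0, -1.0
x, y = rng.standard_normal((2, 10**6))
u, v = a * x + b, c * y + d
mc = np.mean((u >= 0) & (u <= 2) & (v >= -3) & (v <= 1))

n = 400
ug = 0 + (np.arange(n) + 0.5) * 2 / n          # midpoint grid on (0, 2)
vg = -3 + (np.arange(n) + 0.5) * 4 / n         # midpoint grid on (-3, 1)
U, V = np.meshgrid(ug, vg, indexing="ij")
f_val = np.exp(-(((U - b) / a) ** 2 + ((V - d) / c) ** 2) / 2) / (2 * np.pi)
quad = np.sum(f_val / (a * c)) * (2 / n) * (4 / n)
print(mc, quad)   # both ≈ 0.19
```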

Problem 6 (10 points). The random variables X and Y have an absolutely continuous joint distribution with density function
\[
f_{XY}(x,y) = \exp\{-(x+y)\}\,1\{x > 0,\ y > 0\}.
\]
Let \(U = \sqrt{XY}\) and \(V = \sqrt{X/Y}\). Find the joint PDF of (U, V).
Solution. The inverse of the transformation is
\[
\begin{cases}
x = uv,\\
y = u/v,
\end{cases}
\]
and the Jacobian determinant is
\[
J = \begin{vmatrix} v & u \\ 1/v & -u/v^2 \end{vmatrix} = -\frac{2u}{v}.
\]
Thus the joint PDF of (U, V) is
\[
f_{UV}(u,v) = f_{XY}(uv, u/v)\,|J|\,1(u > 0, v > 0) = \frac{2u}{v}\, e^{-(uv + u/v)}\,1(u > 0, v > 0).
\]
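A quick simulation check of the derived density (not required); sample size, seed, and grid size are arbitrary.

```python
import numpy as np

# Simulate X, Y iid Exponential(1), form U = sqrt(XY) and V = sqrt(X/Y), and
# compare P(U <= 1, V <= 1) with the integral of the derived density over
# (0,1] x (0,1].
rng = np.random.default_rng(0)
x, y = rng.exponential(size=(2, 10**6))
u, v = np.sqrt(x * y), np.sqrt(x / y)
mc = np.mean((u <= 1) & (v <= 1))

n = 2000
g = (np.arange(n) + 0.5) / n                   # midpoint grid on (0, 1)
U, V = np.meshgrid(g, g, indexing="ij")
quad = np.sum(2 * U / V * np.exp(-(U * V + U / V))) / n**2
print(mc, quad)   # the two values should agree up to simulation/grid error
```
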
Problem 7 (10 points). Problem 4.11 of Casella & Berger.
Solution. Let us examine the joint PMF of (U, V). If u and v are positive integers such that u ≥ v, then P(U = u, V = v) = 0, because getting two heads always requires strictly more trials than getting the first head.
If u < v, then
\[
P(U = u, V = v) = P(\text{the first } u-1 \text{ trials are tails, the } u\text{th trial is a head, trials } u+1,\dots,v-1 \text{ are tails, and the } v\text{th trial is a head}) = \frac{1}{2^v}.
\]
In summary, the joint PMF of (U, V) is \(f_{UV}(u,v) = 2^{-v}\,1(u < v,\ u \ge 1,\ v \ge 2)\). Its support \(\{(u,v) : 1 \le u < v\}\) is not a Cartesian product, so the joint PMF cannot be factorized into a function of u times a function of v. Thus, U and V are not independent.
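A short simulation check of the joint PMF and of the dependence (not required); it uses the fact that the waiting time between the first and second head is again Geometric(1/2) and independent of U.

```python
import numpy as np

# U = trials to the first head; V = U plus an independent Geometric(1/2).
rng = np.random.default_rng(0)
N = 10**6
u = rng.geometric(0.5, size=N)
v = u + rng.geometric(0.5, size=N)

print(np.mean((u == 1) & (v == 3)), 0.5**3)    # both ≈ 0.125
# Dependence: P(U=2, V=3) = 1/8, but P(U=2) * P(V=3) = (1/4)(1/4) = 1/16.
print(np.mean((u == 2) & (v == 3)), np.mean(u == 2) * np.mean(v == 3))
```
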
Problem 8 (10 points). Let X and Y denote independent random variables, each with PDF
\[
f(x) = \frac{1}{2}\exp\{-|x|\}, \qquad x \in \mathbb{R}.
\]
Find the PDF of U = X + Y.

Solution. Define the transformation U = X + Y and V = Y. The inverse transformation is
\[
\begin{cases}
x = u - v,\\
y = v,
\end{cases}
\]
and the Jacobian determinant is
\[
J = \begin{vmatrix} 1 & -1 \\ 0 & 1 \end{vmatrix} = 1.
\]
Thus the joint PDF of (U, V) is
\[
f_{UV}(u,v) = \frac{1}{4}\exp\{-|u-v| - |v|\}, \qquad u \in \mathbb{R},\ v \in \mathbb{R}.
\]
If u < 0, the marginal PDF of U can be derived as follows:
\[
\begin{aligned}
f_U(u) &= \int_{-\infty}^{\infty} f_{UV}(u,v)\,dv\\
&= \int_{-\infty}^{u} \frac{1}{4}\exp\{-(u-v)+v\}\,dv + \int_{u}^{0} \frac{1}{4}\exp\{(u-v)+v\}\,dv + \int_{0}^{\infty} \frac{1}{4}\exp\{(u-v)-v\}\,dv\\
&= \int_{-\infty}^{u} \frac{1}{4}\exp\{-u+2v\}\,dv + \int_{u}^{0} \frac{1}{4}\exp\{u\}\,dv + \int_{0}^{\infty} \frac{1}{4}\exp\{u-2v\}\,dv\\
&= \frac{1}{8}\exp(u) - \frac{u}{4}\exp(u) + \frac{1}{8}\exp(u) = \frac{\exp(u) - u\exp(u)}{4}.
\end{aligned}
\]
If u ≥ 0, the marginal PDF of U is
\[
\begin{aligned}
f_U(u) &= \int_{-\infty}^{\infty} f_{UV}(u,v)\,dv\\
&= \int_{-\infty}^{0} \frac{1}{4}\exp\{-(u-v)+v\}\,dv + \int_{0}^{u} \frac{1}{4}\exp\{-(u-v)-v\}\,dv + \int_{u}^{\infty} \frac{1}{4}\exp\{(u-v)-v\}\,dv\\
&= \int_{-\infty}^{0} \frac{1}{4}\exp\{-u+2v\}\,dv + \int_{0}^{u} \frac{1}{4}\exp\{-u\}\,dv + \int_{u}^{\infty} \frac{1}{4}\exp\{u-2v\}\,dv\\
&= \frac{1}{8}\exp(-u) + \frac{u}{4}\exp(-u) + \frac{1}{8}\exp(-u) = \frac{\exp(-u) + u\exp(-u)}{4}.
\end{aligned}
\]
In summary, the PDF of U = X + Y is
\[
f_U(u) = \frac{\exp(-|u|) + |u|\exp(-|u|)}{4}, \qquad u \in \mathbb{R}.
\]
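A quick Monte Carlo check of the derived density (not required); NumPy's Laplace sampler with unit scale has exactly the density exp(-|x|)/2, and the sample size, seed, and integration range are arbitrary.

```python
import numpy as np

# Compare P(X + Y <= 1) from simulation with the same probability obtained by
# numerically integrating the derived density (1 + |u|) exp(-|u|) / 4.
rng = np.random.default_rng(0)
u = rng.laplace(size=10**6) + rng.laplace(size=10**6)
mc = np.mean(u <= 1)

m = 10**6
g = -50 + (np.arange(m) + 0.5) * 51 / m        # midpoint grid on (-50, 1)
quad = np.sum((1 + np.abs(g)) * np.exp(-np.abs(g)) / 4) * 51 / m
print(mc, quad)   # both ≈ 0.72
```
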
Problem 9 (15 points). Problem 4.20 of Casella & Berger.
Solution.
(a) We first focus on the subregion where x_2 ≥ 0, so that the mapping from (x_1, x_2) to (y_1, y_2) is one-to-one. The inverse transformation over this subregion is
\[
\begin{cases}
x_1 = \sqrt{y_1}\, y_2,\\
x_2 = \sqrt{y_1 - y_1 y_2^2},
\end{cases}
\]

and the Jacobian determinant is
\[
J = \begin{vmatrix}
\frac{y_2}{2\sqrt{y_1}} & \sqrt{y_1} \\[1mm]
\frac{1 - y_2^2}{2\sqrt{y_1 - y_1 y_2^2}} & \frac{-y_1 y_2}{\sqrt{y_1 - y_1 y_2^2}}
\end{vmatrix}
= -\frac{y_2^2}{2\sqrt{1-y_2^2}} - \frac{\sqrt{1-y_2^2}}{2}
= -\frac{1}{2\sqrt{1-y_2^2}}.
\]
Thus the joint PDF of (Y_1, Y_2) contributed by this subregion is
\[
f_1(y_1, y_2) = \frac{1}{2\pi\sigma^2}\exp\left\{-\frac{y_1 y_2^2 + y_1 - y_1 y_2^2}{2\sigma^2}\right\}|J|
= \frac{1}{2\pi\sigma^2}\exp\left\{-\frac{y_1}{2\sigma^2}\right\}\frac{1}{2\sqrt{1-y_2^2}}, \qquad y_1 \ge 0,\ -1 \le y_2 \le 1.
\]
We can follow the same procedure and show that the joint PDF of (Y_1, Y_2) mapped from the subregion where x_2 < 0 is identical to the above form. Thus, the joint PDF of (Y_1, Y_2) is
\[
f(y_1, y_2) = 2 f_1(y_1, y_2) = \frac{1}{2\pi\sigma^2}\exp\left\{-\frac{y_1}{2\sigma^2}\right\}\frac{1}{\sqrt{1-y_2^2}}\, 1(y_1 \ge 0,\ -1 \le y_2 \le 1).
\]
Since f(y_1, y_2) can be factorized into a function of y_1 times a function of y_2, we know Y_1 and Y_2 are independent.
(b) We see that Y_1 is the squared distance from (X_1, X_2) to the origin, while Y_2 is the cosine of the angle between the positive x_1-axis and the line from the origin to (X_1, X_2). Part (a) therefore says that, when the coordinates are independent N(0, σ²) variables, the distance of the point from the origin and the direction in which it lies vary independently of each other.
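A short simulation check of the independence claim (not required), with σ = 1 and the two thresholds chosen arbitrarily.

```python
import numpy as np

# If Y1 and Y2 are independent, joint probabilities of product events factor
# into products of marginal probabilities.
rng = np.random.default_rng(0)
x1, x2 = rng.standard_normal((2, 10**6))
y1 = x1**2 + x2**2
y2 = x1 / np.sqrt(y1)

p_joint = np.mean((y1 <= 1.0) & (y2 <= 0.5))
p_prod = np.mean(y1 <= 1.0) * np.mean(y2 <= 0.5)
print(p_joint, p_prod)   # should agree up to Monte Carlo error
```
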
Problem 10 (8 points). Problem 4.24 of Casella & Berger.
Solution. The inverse transformation is
\[
\begin{cases}
x = z_1 z_2,\\
y = z_1 - z_1 z_2,
\end{cases}
\]
and the Jacobian determinant is
\[
J = \begin{vmatrix} z_2 & z_1 \\ 1 - z_2 & -z_1 \end{vmatrix} = -z_1.
\]
Thus the joint PDF of (Z_1, Z_2) is
\[
\begin{aligned}
f(z_1, z_2) &= \frac{(z_1 z_2)^{r-1}\exp(-z_1 z_2)}{\Gamma(r)} \cdot \frac{(z_1 - z_1 z_2)^{s-1}\exp(-z_1 + z_1 z_2)}{\Gamma(s)}\,|J|\\
&= \frac{z_1^{r+s-2}\exp(-z_1)\, z_2^{r-1}(1-z_2)^{s-1}}{\Gamma(r)\Gamma(s)}\, z_1\\
&= \frac{z_1^{r+s-1}\exp(-z_1)}{\Gamma(r+s)} \cdot \frac{\Gamma(r+s)}{\Gamma(r)\Gamma(s)}\, z_2^{r-1}(1-z_2)^{s-1}.
\end{aligned}
\]
Denote \(g(z_1) = \frac{z_1^{r+s-1}\exp(-z_1)}{\Gamma(r+s)}\), which is the PDF of a Gamma(r + s, 1) distribution, and \(h(z_2) = \frac{\Gamma(r+s)}{\Gamma(r)\Gamma(s)}\, z_2^{r-1}(1-z_2)^{s-1}\), which is the PDF of a Beta(r, s) distribution. Since f(z_1, z_2) = g(z_1) h(z_2), we know that Z_1 and Z_2 are independent.
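A quick simulation check (not required), with the shape parameters r = 2 and s = 3 chosen arbitrarily.

```python
import numpy as np

# Z1 = X + Y should behave like Gamma(r+s, 1), Z2 = X/(X+Y) like Beta(r, s),
# and the two should be independent.
rng = np.random.default_rng(0)
r, s = 2.0, 3.0
x = rng.gamma(shape=r, size=10**6)
y = rng.gamma(shape=s, size=10**6)
z1, z2 = x + y, x / (x + y)

print(np.mean(z1), r + s)                      # Gamma(r+s, 1) mean is r + s
print(np.mean(z2), r / (r + s))                # Beta(r, s) mean is r/(r+s)
p_joint = np.mean((z1 <= 5) & (z2 <= 0.4))
print(p_joint, np.mean(z1 <= 5) * np.mean(z2 <= 0.4))   # ≈ equal if independent
```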

Problem 11 (Bonus 10 points). Let X and Y denote independent random variables, each with a uniform distribution on the interval (0, 1). Let
\[
U = \sqrt{-2\log X}\,\cos(2\pi Y)
\]
and
\[
V = \sqrt{-2\log X}\,\sin(2\pi Y).
\]
Find the joint PDF of (U, V).
Solution. The inverse of the transformation is
\[
\begin{cases}
x = \exp\{-(u^2+v^2)/2\},\\
y = \arctan(v/u)/(2\pi),
\end{cases}
\]
and the Jacobian determinant is
\[
J = \begin{vmatrix}
-u\exp\{-(u^2+v^2)/2\} & -v\exp\{-(u^2+v^2)/2\} \\[1mm]
-\frac{v}{2\pi u^2}\cdot\frac{1}{1+(v/u)^2} & \frac{1}{2\pi u}\cdot\frac{1}{1+(v/u)^2}
\end{vmatrix}
= -\frac{1}{2\pi}\exp\{-(u^2+v^2)/2\}.
\]
Thus the joint density of (U, V) is
\[
f_{UV}(u,v) = f_{XY}(x(u,v), y(u,v))\,|J| = \frac{1}{2\pi}\exp\{-(u^2+v^2)/2\}, \qquad u \in \mathbb{R},\ v \in \mathbb{R}.
\]
That is, U and V are independent standard Normal variables.
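A quick simulation check of this result (not required); sample size and seed are arbitrary.

```python
import numpy as np

# Generate (U, V) from uniforms and verify that they look like independent
# standard normal variables.
rng = np.random.default_rng(0)
x, y = rng.random((2, 10**6))
u = np.sqrt(-2 * np.log(x)) * np.cos(2 * np.pi * y)
v = np.sqrt(-2 * np.log(x)) * np.sin(2 * np.pi * y)

print(np.mean(u), np.var(u), np.mean(v), np.var(v))    # ≈ 0, 1, 0, 1
print(np.corrcoef(u, v)[0, 1])                         # ≈ 0
p_joint = np.mean((u <= 1) & (v <= -0.5))
print(p_joint, np.mean(u <= 1) * np.mean(v <= -0.5))   # ≈ equal if independent
```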
