
University of California, Los Angeles

Department of Statistics

Statistics 100B Instructor: Nicolas Christou

Exam 1
27 January 2021
This is a 2-hour exam, due by 8:15 pm on Thursday, 28 January

Name:

Problem 1 (25 points)


Answer the following questions:
a. Consider the following two independent sets of random variables: let X1 , . . . , Xn be i.i.d. random variables with
Xi ∼ N (µ1 , σ) and let Y1 , . . . , Ym be i.i.d. random variables with Yi ∼ N (µ2 , σ). Express the vector

(X̄, Ȳ, X1 − X̄, . . . , Xn − X̄, Y1 − Ȳ, . . . , Ym − Ȳ)′

in the form A (X′ , Y′ )′ , where X = (X1 , . . . , Xn )′ and Y = (Y1 , . . . , Ym )′ , and show some typical elements
of var[A (X′ , Y′ )′ ].
b. Let Y ∼ Nn (µ, Σ). Show that the moment generating function of Q = (Y − µ)′ Σ^{−1} (Y − µ) is (1 − 2t)^{−n/2} and
therefore Q ∼ Γ(n/2, 2). Note: integrate using the definition of moment generating functions.
c. Suppose Z1 , . . . , Zn are i.i.d. Bernoulli random variables with probability of success p. Let W be a symmetric
n × n matrix of constants such that wii = 0, i = 1, . . . , n. For example, the entries of W can be 0 or 1, where 1
denotes connectivity between neighboring counties and 0 no connectivity. Find E[Z′ WZ].
d. Let X ∼ Poisson(λ1 ) and Y ∼ Poisson(λ2 ). We showed in class using moment generating functions that if X
and Y are independent then X + Y ∼ Poisson(λ1 + λ2 ). Show the same result by expanding P (X + Y = m).
Here you will need to show that this probability is computed using the Poisson probability mass function with
parameter λ1 + λ2 and therefore X + Y ∼ Poisson(λ1 + λ2 ).
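The identity in part (d) can be verified numerically before proving it. The sketch below (a check, not a proof; the values of λ1, λ2, and m are illustrative, not from the exam) computes P(X + Y = m) by the convolution sum and compares it with the Poisson(λ1 + λ2) pmf:

```python
import math

def pois_pmf(k, lam):
    # Poisson pmf: e^{-lam} lam^k / k!
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam1, lam2, m = 2.0, 3.0, 4  # illustrative values

# Convolution: P(X + Y = m) = sum_{k=0}^{m} P(X = k) P(Y = m - k)
conv = sum(pois_pmf(k, lam1) * pois_pmf(m - k, lam2) for k in range(m + 1))
direct = pois_pmf(m, lam1 + lam2)  # Poisson(lam1 + lam2) pmf at m

print(abs(conv - direct) < 1e-12)  # True: the two agree
```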
e. Consider the multinomial distribution X ∼ Mr (n, p). Find the correlation between X1 and X2 .
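A closed-form answer to part (e) can be sanity-checked by simulation. The sketch below (Python with NumPy assumed; n, p, and the number of replications are illustrative) estimates corr(X1, X2) from multinomial draws:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, [0.2, 0.3, 0.5]  # illustrative multinomial parameters (r = 3)
draws = rng.multinomial(n, p, size=200_000)  # each row is one draw (X1, X2, X3)

# Empirical correlation between the first two counts
corr = np.corrcoef(draws[:, 0], draws[:, 1])[0, 1]
print(corr)  # compare with the closed-form answer at these p's
```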

Problem 2 (25 points)
Answer the following questions:
a. Suppose ln[Yi ] = µ + εi , where εi ∼ N (0, σ). Find the mean and variance of Yi .
b. Suppose X and Y are independent exponential random variables with parameters λ1 and λ2 respectively. Find
the mean and variance of X/Y .
c. Refer to homework 3, exercise 4. Find the distance such that the probability of exceeding it is e^{−1} . This is the
same as finding the (1 − e^{−1})th percentile of R.
d. Refer to homework 3, exercise 4. Suppose m points are randomly selected. Find the pdf of Σ_{i=1}^{m} R_i^3 ,
where R_i is the distance from point i to the nearest particle. Find E[ (Σ_{i=1}^{m} R_i^3)^{1/3} ].
e. We showed that if Y ∼ Nn (µ, Σ) then a′ Y ∼ N (a′ µ, √(a′ Σa)). Show that if a′ Y ∼ N (a′ µ, √(a′ Σa)) for every
vector a, then Y ∼ Nn (µ, Σ).

Problem 3 (25 points)
Answer the following questions:
a. Suppose Y ∼ Nn (µ, Σ). Consider the n × 1 vectors of constants a and b. Show that cov(a′ Y, b′ Y) = a′ Σb.
b. Without integration, determine the value of c that makes f (x) a pdf.
1. f (x) = c x^3 (1 − x)^2 , with 0 ≤ x ≤ 1.
2. f (x) = c x^2 e^{−x/3} , with x > 0.
3. f (x) = c e^{−x^2 /2} , with −∞ < x < ∞.
c. Suppose X is a random variable whose mth moment is E[X^m ] = (m + 1)! 2^m for m = 1, 2, 3, . . .. What are
the moment generating function and the distribution of X?
d. Suppose Y ∼ Nn (µ, Σ). Consider partitioning the vector Y as given in handout #12 on page 5. Use the joint
moment generating function method to show that Q1 ∼ Np (µ1 , Σ11 ).
e. Suppose Y ∼ Nn (µ, Σ). Consider a linear combination of Y1 , Y2 , . . . , Yn given by c′ Y, where c is a vector of
constants. Use the method of moment generating functions to show that c′ Y follows a normal distribution
with mean c′ µ and variance c′ Σc.
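The distributional fact in part (e) lends itself to a Monte Carlo check. The sketch below (Python with NumPy assumed; µ, Σ, and c are illustrative, not from the exam) compares the sample mean and variance of c′Y with c′µ and c′Σc:

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative parameters: any mean vector and positive definite Sigma work
mu = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
c = np.array([2.0, -1.0, 3.0])

Y = rng.multivariate_normal(mu, Sigma, size=200_000)  # draws of Y ~ N_3(mu, Sigma)
lin = Y @ c                                           # c'Y for each draw

print(lin.mean(), c @ mu)        # sample mean vs c'mu
print(lin.var(), c @ Sigma @ c)  # sample variance vs c'Sigma c
```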

Problem 4 (25 points)
Answer the following questions:
a. Suppose X ∼ Γ(6, 4). Find P (X > 12). Please use the table below and provide all the steps used to obtain
the solution.
Note: The table gives P (Y ≤ y) for Y ∼ Γ(α, 1). The value of α is given in the first row of the table.
y \ α    1      2      3      4      5      6      7      8      9      10
1 0.632 0.264 0.080 0.019 0.004 0.001 0.000 0.000 0.000 0.000
2 0.865 0.594 0.323 0.143 0.053 0.017 0.005 0.001 0.000 0.000
3 0.950 0.801 0.577 0.353 0.185 0.084 0.034 0.012 0.004 0.001
4 0.982 0.908 0.762 0.567 0.371 0.215 0.111 0.051 0.021 0.008
5 0.993 0.960 0.875 0.735 0.560 0.384 0.238 0.133 0.068 0.032
6 0.998 0.983 0.938 0.849 0.715 0.554 0.394 0.256 0.153 0.084
7 0.999 0.993 0.970 0.918 0.827 0.699 0.550 0.401 0.271 0.170
8 1.000 0.997 0.986 0.958 0.900 0.809 0.687 0.547 0.407 0.283
9 1.000 0.999 0.994 0.979 0.945 0.884 0.793 0.676 0.544 0.413
10 1.000 1.000 0.997 0.990 0.971 0.933 0.870 0.780 0.667 0.542
11 1.000 1.000 0.999 0.995 0.985 0.962 0.921 0.857 0.768 0.659
12 1.000 1.000 0.999 0.998 0.992 0.980 0.954 0.910 0.845 0.758
13 1.000 1.000 1.000 0.999 0.996 0.989 0.974 0.946 0.900 0.834
14 1.000 1.000 1.000 1.000 0.998 0.994 0.986 0.968 0.938 0.891
15 1.000 1.000 1.000 1.000 0.999 0.997 0.992 0.982 0.963 0.930
⋮        ⋮      ⋮      ⋮      ⋮      ⋮      ⋮      ⋮      ⋮      ⋮      ⋮
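For integer α, the entries of this table come from the closed (Erlang) form P(Y ≤ y) = 1 − e^{−y} Σ_{k=0}^{α−1} y^k/k!, so they can be reproduced directly; a brief sketch (Python assumed, entries chosen for illustration):

```python
import math

def gamma_cdf(y, alpha):
    # CDF of Gamma(alpha, 1) for integer alpha (Erlang form):
    # P(Y <= y) = 1 - e^{-y} * sum_{k=0}^{alpha-1} y^k / k!
    return 1.0 - math.exp(-y) * sum(y ** k / math.factorial(k) for k in range(alpha))

# Reproduce two entries of the table above
print(round(gamma_cdf(1, 1), 3))  # 0.632 (alpha = 1, y = 1)
print(round(gamma_cdf(2, 2), 3))  # 0.594 (alpha = 2, y = 2)
```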

b. Suppose 3 random variables follow jointly a multivariate normal distribution as follows: (Y1 , Y2 , Y3 )′ ∼ N3 (µ, Σ), where

   µ = (170, 68, 40)′   and   Σ = ( 400   64  128
                                     64   16    0
                                    128    0  256 ).

Find P (150 < Y1 + Y2 − Y3 < 215).
c. Let X = (X1 , X2 , X3 )′ have a multivariate normal distribution with µ1 = 6, µ2 = 4, µ3 = 2 and σ1^2 = 16,
σ2^2 = 25, σ3^2 = 64, σ12 = 6, σ13 = σ23 = 0. Let Y1 = X1 − 2X2 + 3X3 − 3 and Y2 = 3X1 − 2X3 + 3. Find the following:
1. The distribution of Y = (Y1 , Y2 )′ .
2. The correlation between Y1 and Y2 .
3. The joint moment generating function of Y.
d. Suppose X and Y are independent with X ∼ N (0, 1) and Y ∼ N (0, 1). Let U = 2X + 3Y and V = X − Y .
Find the joint pdf of U and V .
e. Is it always true that a sum of normal random variables follows a normal distribution? Please provide as
many details as you can.
