
QF314800: Mathematical Statistics I

Assignment 13: Limiting Behaviors


Instructor: Chung-Han Hsieh (ch.hsieh@mx.nthu.edu.tw)
Teaching Assistant: Yi-Shan Wong (pudding70207@gmail.com)

Problem 13.1. Let {Xn : n ≥ 1} be a random sequence with mean µ ∈ R and variance σ²/nᵏ, where k > 0 and σ² > 0 are fixed constants. Show that Xn → µ in probability as n → ∞.
Hint: Apply the Markov inequality.
Problem 13.2 (Convergence of Binomial Random Sequence). Let Xn be random variables with Xn ∼ binomial(n, p) for n ≥ 1 and p ∈ [0, 1].
(i) Show that Xn/n → p in probability as n → ∞.
(ii) Show that 1 − Xn/n → 1 − p in probability as n → ∞.
(iii) Show that (Xn/n)(1 − Xn/n) → p(1 − p) in probability as n → ∞.
Problem 13.3 (Generalized Continuous Mapping Theorem). Suppose that X1, X2, . . . is a random sequence such that Xn → X in probability as n → ∞, where X is some random variable, and let g be a continuous function. Show that g(Xn) → g(X) in probability as n → ∞.
Problem 13.4 (Almost Sure Convergence). Let the sample space be S := [0, 1] with the uniform probability distribution. For each s ∈ S, define random variables Xn(s) := sⁿ and X(s) := 0. Show that Xn → X almost surely.
Hint: Check P({s ∈ S : limn→∞ Xn(s) ≠ X(s)}) = 0 first. Then use this to conclude that P({s ∈ S : limn→∞ Xn(s) = X(s)}) = 1.
Preliminaries for Problem 13.5. Let {Xn : n ≥ 1} be a random sequence with cdfs FXn(x). If each Xn has an mgf corresponding to the cdf FXn(x), then one can use these mgfs to determine the limiting cdf. The following Theorem 13.1 bridges this gap.
Theorem 13.1 (MGF Technique for Determining the Limiting Distribution). Let {Xn : n ≥ 1} be a sequence of random variables with mgfs MXn(t) that exist for t ∈ (−h, h) for all n. Let X be a random variable with mgf MX(t), which exists for |t| ≤ h1 ≤ h for some h1 > 0. If limn→∞ MXn(t) = MX(t) for |t| ≤ h1, then Xn → X in distribution.
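As a quick illustration of how Theorem 13.1 is typically applied (a standard example, not part of the assignment): if Xn ∼ binomial(n, λ/n) for a fixed λ > 0, then MXn(t) = (1 + λ(eᵗ − 1)/n)ⁿ → exp(λ(eᵗ − 1)) for every t as n → ∞, which is the mgf of a Poisson(λ) random variable; hence Xn → Poisson(λ) in distribution.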

Problem 13.5 (Convergence in Distribution and Mgf). Let {Xn : n ≥ 1} be a random sequence with iid components Xn ∼ N(µ, σ²). Let X̄n := (1/n) ∑ᵢ₌₁ⁿ Xi be the sample mean.
(i) Find the mgf MX̄n(t) of X̄n.
(ii) Show that X̄n → µ in distribution as n → ∞. That is, X̄n converges in distribution to a degenerate random variable at µ.
(iii) Show that X̄n → µ in probability as n → ∞.
Problem 13.6 (Sensing the Law of Large Numbers). Generate a sequence of iid random variables X1, X2, . . . , Xn from a lognormal distribution with mean 0 and variance 1/2, with n = 100.
(i) Plot the random samples as n dots on a figure, with the x-axis being n and the y-axis being the sampled values Xi = xi for i = 1, 2, . . . , n.
(ii) Indicate the population mean on the same figure as in part (i).
(iii) With the sampled values x1, x2, . . . , xn obtained in part (i), consider the mapping f : N → R with f(n) = (1/n) ∑ᵢ₌₁ⁿ xi. Plot f(n) for n = 1, 2, . . . , 100 on the same figure you generated in part (i).
(iv) Comment on your findings. Does your sample mean converge to the population mean (or not)? A sketch of one possible simulation setup is given below.
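The following is a minimal Python sketch of one way to set up parts (i)–(iii) of Problem 13.6, assuming numpy and matplotlib are available and interpreting "mean 0 and variance 1/2" as the parameters of the underlying normal distribution (an assumption; adjust it to your own reading of the problem).

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n = 100
sigma = np.sqrt(0.5)                                 # std. dev. of the underlying normal (assumed interpretation)
x = rng.lognormal(mean=0.0, sigma=sigma, size=n)     # iid lognormal samples x_1, ..., x_n

pop_mean = np.exp(0.0 + sigma**2 / 2)                # E[X] = exp(mu + sigma^2/2) for lognormal(mu, sigma^2)
running_mean = np.cumsum(x) / np.arange(1, n + 1)    # f(n) = (1/n) * (x_1 + ... + x_n)

idx = np.arange(1, n + 1)
plt.scatter(idx, x, s=10, label="samples x_i")                                 # part (i)
plt.axhline(pop_mean, color="red", linestyle="--", label="population mean")   # part (ii)
plt.plot(idx, running_mean, color="black", label="f(n)")                      # part (iii)
plt.xlabel("n")
plt.ylabel("value")
plt.legend()
plt.show()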

Problem 13.7 (Preview of the CLT). This problem serves as a preview of the Central Limit Theorem (CLT). The striking feature of the CLT is that, for any underlying distribution with finite variance, the suitably standardized sum of iid samples leads to normality. In this problem, we experiment with this.
(i) Choose an arbitrary distribution F that you like for the underlying observations Xi. Report the distribution F you use and the associated mean µ, variance σ², and standard deviation σ.
(ii) Generate independent draws of Zn := √n (X̄n − µ)/σ for n = 100. Repeat this procedure k = 1000 times. Then use these draws to plot the empirical distribution via a histogram.
(iii) Compare your histogram with the standard normal curve.
(iv) Attach your code.
Hint for Python users: you may find the "scipy.stats" package useful; e.g., to import the uniform distribution, try from scipy.stats import uniform. A sketch under one possible choice of F is given below.
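A minimal Python sketch of parts (ii)–(iii) of Problem 13.7, assuming F = uniform(0, 1) as the underlying distribution (so µ = 1/2 and σ² = 1/12); this choice of F is an assumption and should be replaced by whatever distribution you report in part (i).

import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm, uniform

n, k = 100, 1000
mu, sigma = 0.5, np.sqrt(1.0 / 12.0)                 # mean and std. dev. of uniform(0, 1) (assumed choice of F)

samples = uniform.rvs(size=(k, n), random_state=0)   # k replications, each with n iid draws from F
xbar = samples.mean(axis=1)                          # sample mean of each replication
z = np.sqrt(n) * (xbar - mu) / sigma                 # Z_n = sqrt(n) (Xbar_n - mu) / sigma

plt.hist(z, bins=30, density=True, alpha=0.6, label="histogram of Z_n")   # part (ii)
grid = np.linspace(-4, 4, 200)
plt.plot(grid, norm.pdf(grid), color="red", label="N(0, 1) density")      # part (iii)
plt.xlabel("z")
plt.legend()
plt.show()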
