Probability 2.2 EdX
Dave Goldsman
H. Milton Stewart School of Industrial and Systems Engineering
Georgia Institute of Technology
3/2/20
ISYE 6739
1 Introduction
2 Marginal Distributions
3 Conditional Distributions
4 Independent Random Variables
5 Consequences of Independence
6 Random Samples
7 Conditional Expectation
8 Double Expectation
9 Honors Class: First-Step Analysis
10 Honors Class: Random Sums of Random Variables
11 Honors Class: Standard Conditioning Argument
12 Covariance and Correlation
13 Correlation and Causation
14 A Couple of Worked Correlation Examples
15 Some Useful Covariance / Correlation Theorems
16 Moment Generating Functions, Revisited
17 Honors Bivariate Functions of Random Variables
Introduction
In this module, we’ll look at what happens when you consider two random
variables simultaneously.
Discrete Case
Properties:
0 ≤ f(x, y) ≤ 1.
Σ_x Σ_y f(x, y) = 1.
A ⊆ R² ⇒ P((X, Y) ∈ A) = Σ_{(x,y)∈A} f(x, y).
In addition,
P(X ≥ 2, Y ≥ 2) = Σ_{x≥2} Σ_{y≥2} f(x, y)
= f(2, 2) + f(2, 3) + f(3, 2) + f(3, 3)
= 0 + 1/6 + 1/6 + 0 = 1/3.  □
Continuous Case
Think of f(x, y) dx dy ≈ P(x ≤ X ≤ x + dx, y ≤ Y ≤ y + dy) for tiny dx and dy.
It's easy to see how this generalizes the 1-dimensional pdf, f(x).
Application: Toss n darts randomly into the unit square. The probability
that any individual dart will land in the circle is π/4. It stands to reason that
the proportion of darts, p̂n , that land in the circle will be approximately π/4.
So you can use 4p̂n to estimate π!
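A quick sketch of this estimator in Python (the function name, seed, and sample size are our choices):

```python
import random

def estimate_pi(n, seed=0):
    """Toss n darts uniformly into the unit square [0,1]^2; the fraction
    landing inside the quarter circle x^2 + y^2 <= 1 estimates pi/4,
    so 4 times that fraction estimates pi."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1)
    return 4 * hits / n

print(estimate_pi(200_000))  # close to pi = 3.14159...
```

The standard error of 4·p̂n shrinks like 1/√n, so larger n gives a tighter estimate.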
Bivariate cdf's
2 dimensions: f(x, y) = ∂²/(∂x ∂y) F(x, y) = ∂²/(∂x ∂y) ∫_{−∞}^x ∫_{−∞}^y f(s, t) dt ds.
Properties: F(x, y) is nondecreasing in each argument, F(−∞, y) = F(x, −∞) = 0, and F(∞, ∞) = 1.
Example: Suppose
F(x, y) = 1 − e^{−x} − e^{−y} + e^{−(x+y)} if x ≥ 0, y ≥ 0, and F(x, y) = 0 if x < 0 or y < 0.
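As a sanity check, the mixed partial ∂²F/∂x∂y can be approximated numerically and compared against e^{−(x+y)}; here is a minimal Python sketch (step size and test point are our choices):

```python
import math

def F(x, y):
    """The joint cdf from the example (0 when x < 0 or y < 0)."""
    if x < 0 or y < 0:
        return 0.0
    return 1 - math.exp(-x) - math.exp(-y) + math.exp(-(x + y))

def mixed_partial(x, y, h=1e-4):
    # central mixed second difference approximating d^2 F / (dx dy)
    return (F(x + h, y + h) - F(x + h, y - h)
            - F(x - h, y + h) + F(x - h, y - h)) / (4 * h * h)

# the mixed partial recovers the joint pdf f(x, y) = e^{-(x+y)}
print(mixed_partial(1.0, 2.0), math.exp(-3.0))  # both approx 0.0498
```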
Marginal Distributions
and
fY(y) = P(Y = y) = Σ_x f(x, y).
By total probability,
P(X = 1) = P(X = 1, Y = anything) = 0.3.  □
Remark: Hmmm... Compared to the last example, this has the same marginals but a different joint distribution! That's because the joint distribution contains much more information than just the marginals.
Example: Suppose
f(x, y) = e^{−(x+y)} if x ≥ 0, y ≥ 0, and f(x, y) = 0 otherwise.
Then the marginal pdf of X is
fX(x) = ∫_R f(x, y) dy = ∫_0^∞ e^{−(x+y)} dy = e^{−x}, if x ≥ 0.  □
Example: Suppose
f(x, y) = (21/4) x²y if x² ≤ y ≤ 1, and f(x, y) = 0 otherwise.
Note the funny limits where the pdf is positive, i.e., x² ≤ y ≤ 1.
fX(x) = ∫_R f(x, y) dy = ∫_{x²}^1 (21/4) x²y dy = (21/8) x²(1 − x⁴), −1 ≤ x ≤ 1.
fY(y) = ∫_R f(x, y) dx = ∫_{−√y}^{√y} (21/4) x²y dx = (7/2) y^{5/2}, 0 ≤ y ≤ 1.  □
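A quick numerical check that both marginals integrate to 1 (the midpoint-rule helper is our own):

```python
def f_X(x):
    """Marginal of X from the example: (21/8) x^2 (1 - x^4) on [-1, 1]."""
    return 21 / 8 * x ** 2 * (1 - x ** 4)

def f_Y(y):
    """Marginal of Y from the example: (7/2) y^(5/2) on [0, 1]."""
    return 7 / 2 * y ** 2.5

def midpoint(g, a, b, n=100_000):
    # simple midpoint-rule numerical integration of g over [a, b]
    h = (b - a) / n
    return sum(g(a + (k + 0.5) * h) for k in range(n)) * h

print(midpoint(f_X, -1, 1))  # approx 1
print(midpoint(f_Y, 0, 1))   # approx 1
```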
Conditional Distributions
P(Y = y | X = x) = P(X = x ∩ Y = y) / P(X = x) = f(x, y) / fX(x).
P(Y = y | X = 2) defines the probabilities on Y given that X = 2.
Remark: Of course, fX|Y(x|y) = f(x|y) = f(x, y) / fY(y).
f(y | 1/2) = f(1/2, y) / fX(1/2) = [(21/4)(1/2)² y] / [(21/8)(1/2)²(1 − 1/16)] = (32/15) y, if 1/4 ≤ y ≤ 1.  □
Note that 2/(1 − x⁴) is a constant with respect to y, and we can check to see that f(y|x) is a legit conditional pdf:
∫_R f(y|x) dy = ∫_{x²}^1 [2y/(1 − x⁴)] dy = 1.  □
Example: Suppose fX (x) = 2x, for 0 < x < 1. Given X = x, suppose that
Y |x ∼ Unif(0, x). Now find fY (y).
Solution: Y|x ∼ Unif(0, x) implies that f(y|x) = 1/x, for 0 < y < x. So the joint pdf is f(x, y) = f(y|x) fX(x) = (1/x)(2x) = 2, for 0 < y < x < 1.
Thus,
fY(y) = ∫_R f(x, y) dx = ∫_y^1 2 dx = 2(1 − y), 0 < y < 1.  □
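A small simulation sketch of this two-stage experiment (the sampler name, seed, and sample size are ours); the sample mean of Y should be near E[Y] = ∫ y · 2(1 − y) dy = 1/3:

```python
import random

def sample_Y(rng):
    """Draw X with pdf f_X(x) = 2x on (0, 1) via inverse transform
    (X = sqrt(U)), then draw Y | X = x ~ Unif(0, x)."""
    x = rng.random() ** 0.5
    return rng.uniform(0, x)

rng = random.Random(0)
n = 200_000
mean_Y = sum(sample_Y(rng) for _ in range(n)) / n
# f_Y(y) = 2(1 - y) gives E[Y] = 1/3
print(mean_Y)  # approx 0.333
```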
Independent Random Variables
Then
P(A|B) = P(A ∩ B) / P(B) = P(A)P(B) / P(B) = P(A).
And similarly, P(B|A) = P(B).
Now we want to define independence for random variables, i.e., the outcome
of X doesn’t influence the outcome of Y (and vice versa).
Equivalent definitions:
Proof:
f(y|x) = f(x, y) / fX(x) = fX(x) fY(y) / fX(x) = fY(y).  □
Theorem: X and Y are independent iff f (x, y) = a(x)b(y), ∀x, y, for some
functions a(x) and b(y) (not necessarily pdf’s).
But if there are funny limits, this messes up the factorization, so in that case,
X and Y will be dependent — watch out!
ISYE 6739
Independent Random Variables
Example: f(x, y) = (21/4) x²y, for x² ≤ y ≤ 1.
Because of the funny limits, f(x, y) can't be factored into a(x)b(y) holding for all x and y (the indicator of the region x² ≤ y ≤ 1 doesn't factor). Thus, X and Y are not independent.  □
Now that we can figure out if X and Y are independent, what can we do with
that knowledge?
Consequences of Independence
Theorem: E[X + Y] = E[X] + E[Y] (whether or not X and Y are independent).  □
One can generalize this result to more than two random variables.
Proof: Induction. 2
Theorem: If X and Y are independent, then E[XY] = E[X]E[Y].  □
Theorem: If X and Y are independent, then Var(X + Y) = Var(X) + Var(Y).
Can generalize. . .
Proof: Induction. 2
Random Samples
So the mean of X̄ is the same as the mean of Xi , but the variance decreases!
This makes X̄ a great estimator for µ (which is usually unknown in practice);
the result is referred to as the Law of Large Numbers. Stay tuned.
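A quick simulation sketch of this effect, using Unif(0,1) observations (our illustrative choice), for which µ = 1/2 and σ² = 1/12:

```python
import random
import statistics

def xbar(n, rng):
    """Sample mean of n iid Unif(0, 1) draws (mu = 1/2, sigma^2 = 1/12)."""
    return sum(rng.random() for _ in range(n)) / n

rng = random.Random(1)
means = [xbar(25, rng) for _ in range(20_000)]
print(statistics.mean(means))      # approx mu = 0.5
print(statistics.variance(means))  # approx sigma^2/n = (1/12)/25 = 0.00333
```

The spread of X̄ around µ shrinks as n grows, exactly as Var(X̄) = σ²/n predicts.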
Conditional Expectation
Consider the usual definition of expectation. (E.g., what's the average weight of a male?)
E[Y] = Σ_y y f(y) (discrete) or ∫_R y f(y) dy (continuous).
Now suppose we're interested in the average weight of a 6′ tall male.
Double Expectation
The expected value (averaged over all X’s) of the conditional expected value
(of Y |X) is the plain old expected value (of Y ).
Old Example: Suppose f(x, y) = (21/4) x²y, if x² ≤ y ≤ 1. Then
E[Y|x] = (2/3) · (1 − x⁶)/(1 − x⁴).
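A numerical sketch of double expectation for this example: averaging E[Y|x] over the marginal of X should reproduce E[Y] (the midpoint-rule helper and grid size are ours):

```python
def midpoint(g, a, b, n=50_000):
    # midpoint-rule numerical integration of g over [a, b]
    h = (b - a) / n
    return sum(g(a + (k + 0.5) * h) for k in range(n)) * h

# E[Y] directly from the marginal f_Y(y) = (7/2) y^(5/2) on [0, 1] ...
EY = midpoint(lambda y: y * 3.5 * y ** 2.5, 0, 1)

# ... and via double expectation: average E[Y|x] = (2/3)(1 - x^6)/(1 - x^4)
# over the marginal f_X(x) = (21/8) x^2 (1 - x^4) on [-1, 1]
EEY = midpoint(lambda x: (2 / 3) * (1 - x ** 6) / (1 - x ** 4)
               * (21 / 8) * x ** 2 * (1 - x ** 4), -1, 1)
print(EY, EEY)  # both approx 7/9 = 0.7778
```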
Honors Class: First-Step Analysis
Furthermore, consider the first step of the coin flip process, and let X = H or
T denote the outcome of the first toss. Based on the result X of this first step,
we have
In any case, it’s obvious that A and B are iid Geom(p = 1/2), so by the
previous example, E[Y ] = E[A] + E[B] = (1/p) + (1/p) = 4. 2
This example didn’t involve first-step analysis (besides using the expected
value of a geometric RV). But the next related example will. . . .
Honors Class: Random Sums of Random Variables
Suppose that X1 , X2 , . . . are independent RVs, all with the same mean.
Remark: You have to be very careful here. In particular, note that E[Σ_{i=1}^N Xi] ≠ N E[X1], since the LHS is a number and the RHS is random.
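A small simulation of a random sum, with illustrative distributions of our choosing (N a fair die roll, the Xi iid Exp(1)); the sample mean of S should be near E[N]·E[X] = 3.5:

```python
import math
import random

rng = random.Random(2)

def random_sum(rng):
    """S = X_1 + ... + X_N with N a fair die roll (E[N] = 3.5),
    independent of the X_i, which are iid Exp(1) (E[X] = 1).
    Then E[S] = E[N] * E[X] = 3.5."""
    n_terms = rng.randint(1, 6)
    return sum(-math.log(rng.random()) for _ in range(n_terms))

n_reps = 100_000
mean_S = sum(random_sum(rng) for _ in range(n_reps)) / n_reps
print(mean_S)  # approx 3.5
```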
Honors Class: Standard Conditioning Argument
Let A be some event, and define the RV Y as the following indicator function:
Y = 1_A ≡ 1 if A occurs, and 0 otherwise.
Then
E[Y] = Σ_y y fY(y) = P(Y = 1) = P(A).
Proof:
P(A) = E[Y]  (where we take Y = 1_A)
= E[E(Y|X)]  (double expectation)
= ∫_R E[Y|x] fX(x) dx  (LOTUS)
= ∫_R P(A|X = x) fX(x) dx.  □
Proof: (Actually, there are many proofs.) Let the event A = {Y ≤ X}. Then
P(Y ≤ X) = ∫_R P(Y ≤ X | X = x) fX(x) dx
= ∫_R P(Y ≤ x | X = x) fX(x) dx
= ∫_R P(Y ≤ x) fX(x) dx  (X, Y are independent).  □
Remark: Think of X as the time until the next male driver shows up at a
parking lot (at rate α / hour) and Y as the time for the next female driver (at
rate β / hour). Then P (Y ≤ X) = β/(α + β) is the intuitively reasonable
probability that the next driver to arrive will be female. 2
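A quick simulation sketch of this race, with assumed illustrative rates α = 1 and β = 3, so β/(α + β) = 3/4:

```python
import math
import random

alpha, beta = 1.0, 3.0  # illustrative arrival rates (per hour)
rng = random.Random(3)

def expo(rate):
    # inverse-transform sample from an Exp(rate) distribution
    return -math.log(rng.random()) / rate

n = 200_000
# X ~ Exp(alpha): next male driver; Y ~ Exp(beta): next female driver
hits = sum(1 for _ in range(n) if expo(beta) <= expo(alpha))
print(hits / n, beta / (alpha + beta))  # both approx 0.75
```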
Proof:
P(Z ≤ z) = ∫_R P(X + Y ≤ z | X = x) fX(x) dx
= ∫_R P(Y ≤ z − x | X = x) fX(x) dx
= ∫_R P(Y ≤ z − x) fX(x) dx  (X, Y are indep).  □
Example: Suppose X, Y are iid Exp(λ), and let Z = X + Y. Then
P(Z ≤ z) = ∫_R FY(z − x) fX(x) dx
= ∫_0^z (1 − e^{−λ(z−x)}) λe^{−λx} dx  (must have x ≥ 0 and z − x ≥ 0)
= 1 − e^{−λz} − λz e^{−λz}, if z ≥ 0.
Thus, fZ(z) = (d/dz) P(Z ≤ z) = λ²z e^{−λz}, z ≥ 0.
This turns out to mean that Z ∼ Gamma(2, λ), aka Erlang₂(λ).  □
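A simulation sketch checking this cdf (λ = 2 and z = 1 are our illustrative choices):

```python
import math
import random

lam = 2.0
rng = random.Random(4)

def expo(rng):
    # inverse-transform sample from Exp(lam)
    return -math.log(rng.random()) / lam

z = 1.0
n = 200_000
empirical = sum(1 for _ in range(n) if expo(rng) + expo(rng) <= z) / n
exact = 1 - math.exp(-lam * z) - lam * z * math.exp(-lam * z)
print(empirical, exact)  # both approx 0.594
```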
You can do similar kinds of convolutions with discrete RVs. We state the following result without proof (which is straightforward): if X and Y are independent integer-valued RVs and Z = X + Y, then
fZ(z) = P(Z = z) = Σ_{x=−∞}^{∞} fX(x) fY(z − x).
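A minimal Python sketch of this discrete convolution (the dict-based pmf representation and the dice example are ours):

```python
from collections import defaultdict

def convolve_pmf(f_X, f_Y):
    """pmf of Z = X + Y for independent discrete X and Y:
    f_Z(z) = sum over x of f_X(x) * f_Y(z - x)."""
    f_Z = defaultdict(float)
    for x, px in f_X.items():
        for y, py in f_Y.items():
            f_Z[x + y] += px * py
    return dict(f_Z)

# two independent fair dice: Z = X + Y has the familiar triangular pmf
die = {k: 1 / 6 for k in range(1, 7)}
pmf = convolve_pmf(die, die)
print(pmf[7])  # 6/36 = 0.1666...
```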
Covariance and Correlation
Remark: If X and Y have positive covariance, then X and Y move “in the
same direction.” Think height and weight.
But
E[X] = ∫_{−1}^1 x · (1/2) dx = 0 and
E[XY] = E[X³] = ∫_{−1}^1 x³ · (1/2) dx = 0,
so
Cov(X, Y) = E[XY] − E[X]E[Y] = 0.
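A simulation sketch of this example, taking X ∼ Unif(−1, 1) and Y = X² (consistent with the E[XY] = E[X³] calculation above); the sample covariance should be near 0 even though Y is a deterministic function of X:

```python
import random

rng = random.Random(5)
n = 200_000
xs = [rng.uniform(-1, 1) for _ in range(n)]
ys = [x * x for x in xs]  # Y = X^2 is completely determined by X

mean_x = sum(xs) / n
mean_y = sum(ys) / n
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / n
print(cov)  # approx 0, despite the strong dependence
```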
ρ ≈ 1 is “high” correlation.
ρ ≈ 0 is “low” correlation.
ρ ≈ −1 is “high” negative correlation.
Correlation and Causation
The three examples above seem to give conflicting guidance with respect to
the relationship between correlation and causality. How can we interpret these
findings in a meaningful way? Here are the takeaways:
If the correlation between X and Y is (significantly) nonzero, there is
some type of relationship between the two items, which may or may not
be causal; but this should raise our curiosity.
If the correlation between X and Y is 0, we are not quite out of the woods with respect to dependence and causality. To definitively rule out a relationship between X and Y, it is highly recommended that you at least:
Plot data from X and Y against each other to see if there is a nonlinear relationship, as in the uncorrelated-yet-dependent example.
Consult with appropriate experts.
A Couple of Worked Correlation Examples
We’ll spare the details, but here are the relevant calculations. . .
E[X] = Σ_x x fX(x) = 2.7,
E[X²] = Σ_x x² fX(x) = 7.9, and
Var(X) = E[X²] − (E[X])² = 0.61.
Similarly,
fY(y) = ∫_y^1 10x²y dx = (10/3) y (1 − y³), 0 ≤ y ≤ 1,
E[XY] = ∫_0^1 ∫_0^x 10x³y² dy dx = 10/21, and
ρ = Cov(X, Y) / √(Var(X) Var(Y)) = 0.4265.  □
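These numbers can be checked by numerical integration of the joint pdf over its triangular support; a sketch (the grid size is our choice):

```python
def E(g, n=500):
    """Midpoint-rule integral of g(x, y) * f(x, y) over the support
    0 <= y <= x <= 1, where f(x, y) = 10 x^2 y is the joint pdf."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        hy = x / n  # y-grid spacing for the strip 0 <= y <= x
        for j in range(n):
            y = (j + 0.5) * hy
            total += g(x, y) * 10 * x * x * y * hy
    return total * h

EX = E(lambda x, y: x)
EY = E(lambda x, y: y)
EXY = E(lambda x, y: x * y)
VX = E(lambda x, y: x * x) - EX ** 2
VY = E(lambda x, y: y * y) - EY ** 2
rho = (EXY - EX * EY) / (VX * VY) ** 0.5
print(round(rho, 4))  # approx 0.4265
```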
Some Useful Covariance / Correlation Theorems
Theorem:
Var(Σ_{i=1}^n Xi) = Σ_{i=1}^n Var(Xi) + 2 Σ_{i<j} Cov(Xi, Xj).
Proof: Induction.  □
Theorem:
Var(Σ_{i=1}^n ai Xi + c) = Σ_{i=1}^n ai² Var(Xi) + 2 Σ_{i<j} ai aj Cov(Xi, Xj).
Example: Suppose Var(X) = Var(Y) = Var(Z) = 10, Cov(X, Y) = 3, Cov(X, Z) = −2, and Cov(Y, Z) = 0. Then
Var(X − 2Y + 3Z) = Var(X) + 4Var(Y) + 9Var(Z) − 4Cov(X, Y) + 6Cov(X, Z) − 12Cov(Y, Z)
= 14(10) − 4(3) + 6(−2) − 12(0) = 116.  □
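The same number falls out of the quadratic form aᵀΣa; a sketch using the variances and covariances the example assumes:

```python
# Var(a . (X, Y, Z)) = a^T Sigma a for a = (1, -2, 3), with the values the
# example assumes: Var(X) = Var(Y) = Var(Z) = 10, Cov(X, Y) = 3,
# Cov(X, Z) = -2, Cov(Y, Z) = 0.
a = [1, -2, 3]
Sigma = [[10, 3, -2],
         [3, 10, 0],
         [-2, 0, 10]]
var = sum(a[i] * Sigma[i][j] * a[j] for i in range(3) for j in range(3))
print(var)  # 116
```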
Moment Generating Functions, Revisited
Old Theorem (why it's called the mgf): Under certain technical conditions,
E[X^k] = (d^k/dt^k) MX(t) |_{t=0} , k = 1, 2, . . . .
Proof:
MY(t) = E[e^{tY}]
= E[e^{t Σ_i Xi}]
= E[Π_{i=1}^n e^{tXi}]
= Π_{i=1}^n E[e^{tXi}]  (Xi's independent)
= Π_{i=1}^n MXi(t).  □
Corollary: If X1, . . . , Xn are iid and Y = Σ_{i=1}^n Xi, then MY(t) = [MX1(t)]^n.
So what use is a result like this? We can use results such as this with our old friend. . . .
By the previous example and uniqueness, all we need to show is that the mgf of Z ∼ Bin(n, p) matches MY(t) = (pe^t + q)^n. To this end, we have
MZ(t) = E[e^{tZ}]
= Σ_z e^{tz} P(Z = z)
= Σ_{z=0}^n e^{tz} C(n, z) p^z q^{n−z}  (C(n, z) the binomial coefficient)
= Σ_{z=0}^n C(n, z) (pe^t)^z q^{n−z}
= (pe^t + q)^n  (by the Binomial Theorem).  □
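A quick numerical check of this identity for illustrative parameter values of our choosing:

```python
import math

n, p, t = 10, 0.3, 0.7  # illustrative parameter values
q = 1 - p

# E[e^{tZ}] computed directly from the Bin(n, p) pmf ...
direct = sum(math.exp(t * z) * math.comb(n, z) * p ** z * q ** (n - z)
             for z in range(n + 1))
# ... agrees with the closed form (p e^t + q)^n
closed = (p * math.exp(t) + q) ** n
print(direct, closed)
```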
Old Theorem (mgf of a linear function of X): Suppose X has mgf MX(t) and let Y = aX + b. Then MY(t) = e^{tb} MX(at).
Example:
MY(t) = e^{−2t} ((3/4) e^{3t} + (1/4))^{15} = e^{bt} (pe^{at} + q)^n = e^{bt} MX(at),
which corresponds to Y = aX + b with a = 3 and b = −2, where X ∼ Bin(15, 3/4).
Proof:
MY(t) = Π_{i=1}^k MXi(t)  (mgf of independent sum)
= Π_{i=1}^k (pe^t + q)^{ni}  (Bin(ni, p) mgf)
= (pe^t + q)^{Σ_{i=1}^k ni}.
Honors Bivariate Functions of Random Variables
Goal: Now let’s give a general result on the distribution of functions of two
random variables, the proof of which is beyond the scope of our class.
Honors Theorem: Suppose X and Y are continuous RVs with joint pdf
f (x, y), and V = h1 (X, Y ) and W = h2 (X, Y ) are functions of X and Y ,
and
X = k1 (V, W ) and Y = k2 (V, W ),
for suitably chosen inverse functions k1 and k2 .
You can use this method to find all sorts of cool stuff, e.g., the distribution of
X + Y , X/Y , etc., as well as the joint pdf of any functions of X and Y .
Remark: Although the notation is nasty, the application isn’t really so bad.
This yields
∂x/∂v = 0, ∂x/∂w = 1, ∂y/∂v = 1, and ∂y/∂w = −1,
so that
|J| = |(∂x/∂v)(∂y/∂w) − (∂y/∂v)(∂x/∂w)| = |0(−1) − 1(1)| = 1.
And, finally, we obtain the desired pdf of the sum V (after carefully noting the
region of integration),
gV(v) = ∫_R g(v, w) dw = ∫_0^v λ²e^{−λv} dw = λ²v e^{−λv}, for v > 0.
This is the Gamma(2, λ) pdf, which matches our answer from earlier in the
current module. 2
Honors Example: Suppose X and Y are iid Unif(0,1). Find the joint pdf of
V = X + Y and W = X/Y .
The inverse transformation is x = vw/(w + 1) and y = v/(w + 1), so
∂x/∂v = w/(w + 1), ∂x/∂w = v/(w + 1)², ∂y/∂v = 1/(w + 1), ∂y/∂w = −v/(w + 1)²,
and
|J| = |(∂x/∂v)(∂y/∂w) − (∂y/∂v)(∂x/∂w)| = v/(w + 1)².
Note that you have to be careful about the limits of v and w, but this thing
really does double integrate to 1! 2
We can also get the marginal pdf's. First of all, for the ratio of the uniforms, we get
gW(w) = ∫_R g(v, w) dv
= ∫_0^{1+min{1/w, w}} v/(w + 1)² dv
= (1 + min{1/w, w})² / (2(w + 1)²)
= 1/2, if 0 < w ≤ 1;  and 1/(2w²), if w > 1.
Things will get easier from now on! Happy days are here again!