18.440: Lecture 26
Conditional expectation
Scott Sheffield
MIT
18.440 Lecture 26
Outline
Conditional expectation
Recall: conditional probability distributions
- It all starts with the definition of conditional probability:
  P(A|B) = P(AB)/P(B).
- If X and Y are jointly discrete random variables, we can use
  this to define a probability mass function for X given Y = y.
- That is, we write p_{X|Y}(x|y) = P{X = x | Y = y} = p(x,y)/p_Y(y).
- In words: first restrict the sample space to pairs (x, y) with the given
  y value. Then divide the original mass function by p_Y(y) to
  obtain a probability mass function on the restricted space.
- We do something similar when X and Y are continuous random
  variables. In that case we write f_{X|Y}(x|y) = f(x,y)/f_Y(y).
- It is often useful to think of sampling (X, Y) as a two-stage
  process: first sample Y from its marginal distribution, obtaining
  Y = y for some particular y; then sample X from its conditional
  distribution given Y = y.
- The marginal law of X is the weighted average of the conditional laws.
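As a quick illustration of these formulas, here is a minimal Python sketch. The 2x2 joint pmf and the function names are hypothetical (not from the lecture); the sketch computes p_{X|Y}(x|y) = p(x,y)/p_Y(y) and checks that the marginal law of X is the weighted average of the conditional laws.

```python
# Hypothetical joint pmf p(x, y), stored as a dict keyed by (x, y).
joint = {
    (0, 0): 0.1, (0, 1): 0.3,
    (1, 0): 0.2, (1, 1): 0.4,
}

def p_Y(y):
    """Marginal mass of Y: sum p(x, y) over all x."""
    return sum(p for (x, yy), p in joint.items() if yy == y)

def p_X_given_Y(x, y):
    """Conditional mass p_{X|Y}(x|y) = p(x, y) / p_Y(y)."""
    return joint[(x, y)] / p_Y(y)

def p_X(x):
    """Marginal law of X as the weighted average of conditional laws:
    p_X(x) = sum_y p_{X|Y}(x|y) * p_Y(y)."""
    ys = {y for (_, y) in joint}
    return sum(p_X_given_Y(x, y) * p_Y(y) for y in ys)
```

Summing the conditional pmf against the marginal of Y just recovers the row sums of the joint table, which is exactly the "weighted average" statement above.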
Example
Conditional expectation
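The body of this slide is not present in the text. As a hedged sketch of the standard discrete definition, E[X | Y = y] = sum over x of x * p_{X|Y}(x|y); the joint pmf below is a hypothetical example, not taken from the lecture.

```python
# Hypothetical joint pmf p(x, y) for illustration.
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

def cond_exp_X_given(y):
    """E[X | Y = y] = sum_x x * p(x, y) / p_Y(y)."""
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)
    return sum(x * p / p_y for (x, yy), p in joint.items() if yy == y)
```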
Example
Conditional expectation as a random variable
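The slide body is not in the text. As a sketch of the standard construction: if g(y) = E[X | Y = y], then E[X | Y] denotes the random variable g(Y), and the tower property says E[E[X | Y]] = E[X]. The joint pmf below is hypothetical; the code checks the tower property numerically.

```python
# Hypothetical joint pmf p(x, y) for illustration.
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

def g(y):
    """g(y) = E[X | Y = y]."""
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)
    return sum(x * p / p_y for (x, yy), p in joint.items() if yy == y)

ys = {y for (_, y) in joint}
# E[E[X|Y]] = sum_y g(y) * p_Y(y) ...
lhs = sum(g(y) * sum(p for (x, yy), p in joint.items() if yy == y) for y in ys)
# ... should equal E[X] = sum_{x,y} x * p(x, y).
rhs = sum(x * p for (x, y), p in joint.items())
```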
Interpretation
Examples
MIT OpenCourseWare
http://ocw.mit.edu
For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.