
Université Paris 13, Institut Galilée
Année universitaire 2022–2023
MACS 2 / M1 Math – Probabilités

Sheet 2 – Conditional expectation and distribution

Basic properties
Exercise 1. Let X, Y be nonnegative (resp. integrable) random variables defined on a probability space
(Ω,F,P). Let G ⊂ F be a σ-field.
1. Recall the definition of the conditional expectation of X given G.
2. Prove the following properties:
a) E[E[X | G]] = E[X];
b) If X is independent of G, then E[X | G] = E[X]. Deduce the value of E[c | G], where c ∈ R is a constant;
c) If Y is G-measurable and nonnegative (resp. bounded), then E[XY | G] = Y E[X | G];
d) If Y ≥ X a.s., then E[Y | G] ≥ E[X | G] a.s.;
e) E[X + Y | G] = E[X | G] + E[Y | G].
3. Suppose that X and Y are two real-valued independent random variables and that f : R×R → R+ is a measurable
function. Prove that
E[f (X,Y ) | X] = g(X)
with g : x ↦ E[f(x, Y)].
In other words, when X and Y are independent, one computes E[f(X, Y) | X] by treating X as a constant
and taking the expectation with respect to Y only. This is similar to Fubini's theorem.
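The identity of question 3 can be checked exactly on a small discrete example. The distributions and the function f below are arbitrary hypothetical choices, not part of the exercise:

```python
from fractions import Fraction

# Hypothetical example: X uniform on {0,1,2}, Y uniform on {0,1}, independent,
# and f(x, y) = (x + y)^2.
xs, ys = [0, 1, 2], [0, 1]
px = {x: Fraction(1, 3) for x in xs}
py = {y: Fraction(1, 2) for y in ys}
f = lambda x, y: (x + y) ** 2

# g(x) = E[f(x, Y)]: treat x as a constant and average over Y only.
g = {x: sum(py[y] * f(x, y) for y in ys) for x in xs}

# E[f(X, Y) | X = x] computed from the joint law (by independence, the joint
# probability is the product px * py); it must coincide with g(x).
for x in xs:
    cond = sum(px[x] * py[y] * f(x, y) for y in ys) / px[x]
    assert cond == g[x]
print(g)  # g(0) = 1/2, g(1) = 5/2, g(2) = 13/2
```

Exact arithmetic with `Fraction` avoids any floating-point doubt in the comparison.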
Exercise 2. Let X, Y be two independent random variables, distributed as N(0,1).
Compute E[XY² | X], E[(X + Y)² | X], E[e^{−XY} | X], E[e^{−XY} | X, Y].
Recall that E[e^{λY}] = e^{λ²/2} (how do you get this formula?).
Exercise 3. Let X1, X2, . . . be i.i.d. integrable random variables. Let m = E[Xi] and define, for n ≥ 0,

Sn = X1 + · · · + Xn .

Let n ≥ 1. What are the values of E[Sn | X1] and E[Sn+1 | Sn]? Justify that E[X1 | Sn] = E[X2 | Sn] = · · · =
E[Xn | Sn] and deduce E[X1 | Sn].
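A hedged Monte Carlo sanity check (not a proof) of the symmetry argument: conditionally on the value of Sn, the average of X1 should not depend on which coordinate we look at. The X_i below are hypothetical fair-die throws with n = 3; these choices are illustrative only.

```python
import random

random.seed(0)
n, N = 3, 200_000
sums = {}  # s -> (number of samples with Sn = s, running total of X1 over them)
for _ in range(N):
    xs = [random.randint(1, 6) for _ in range(n)]
    s = sum(xs)
    c, t = sums.get(s, (0, 0))
    sums[s] = (c + 1, t + xs[0])

# Compare the empirical E[X1 | Sn = s] with s / n for a few values of s.
for s in (6, 9, 12):
    c, t = sums[s]
    print(s, t / c, s / n)  # the last two columns should be close
```

Binning on the value of Sn like this is a generic way to estimate a conditional expectation given a discrete random variable.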

Discrete distributions
Exercise 4. Let X, Y be independent random variables having Poisson distributions of parameters λ and µ,
respectively.
1. Compute the distribution of X + Y. What is the distribution of (X, X + Y)?
2. Compute the conditional law of X given Z = X + Y. What is its name?
3. Compute E[X | X + Y].
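Your answers can be sanity-checked numerically. The sketch below uses Knuth's classical product-of-uniforms Poisson sampler, with the hypothetical parameter values λ = 2 and µ = 3:

```python
import math
import random

def poisson(lam):
    # Knuth's algorithm: multiply uniforms until the product drops below e^{-lam}.
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

random.seed(1)
lam, mu, N = 2.0, 3.0, 100_000
pairs = [(poisson(lam), poisson(mu)) for _ in range(N)]

# Empirical mean of X + Y; compare with the mean of your answer to question 1.
print(sum(x + y for x, y in pairs) / N)

# Empirical E[X | X + Y = 5]; compare with your answer to question 3 at Z = 5.
sel = [x for x, y in pairs if x + y == 5]
print(sum(sel) / len(sel))
```

The same conditioning-by-selection idea works for any value of the sum, as long as enough samples land on it.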
Exercise 5. Let X1, . . . , Xn be independent random variables having a Bernoulli distribution of parameter
p ∈ (0,1). Define Sn = X1 + · · · + Xn.
1. Compute E[Sn | X1 ].
2. Compute the conditional law of (X1 , . . . ,Xn ) given Sn .
3. Compute the conditional law of X1 given Sn .
4. Compute E[X1 | Sn ].
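For question 2, an exact finite check is possible: enumerate every Bernoulli sequence, weight it by its probability, and inspect the conditional probabilities given Sn = k. The small case n = 3, p = 1/3 below is an arbitrary hypothetical choice:

```python
from fractions import Fraction
from itertools import product

n, p = 3, Fraction(1, 3)

def weight(seq):
    # Probability of an individual 0/1 sequence under i.i.d. Bernoulli(p).
    k = sum(seq)
    return p ** k * (1 - p) ** (n - k)

cond_prob = {}
for k in range(n + 1):
    seqs = [s for s in product((0, 1), repeat=n) if sum(s) == k]
    total = sum(weight(s) for s in seqs)       # = P(Sn = k)
    cond = [weight(s) / total for s in seqs]   # conditional probabilities given Sn = k
    # Every sequence with the same sum has the same conditional probability:
    assert all(c == cond[0] for c in cond)
    cond_prob[k] = cond[0]
print(cond_prob)  # each value is 1 / (number of sequences with sum k)
```

This suggests what the conditional law of (X1, . . . , Xn) given Sn looks like, and in particular that it no longer depends on p.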

Continuous distributions
Exercise 6. Let (X, Y) be a random vector having density (with respect to the Lebesgue measure on R²)

f(x, y) = λ x^{−1} e^{−λx} 1_{0<y<x}.

1. Compute the conditional law of Y given X.


2. Compute E[Y² | X].
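A hedged numerical check of your answer to question 2, with the hypothetical value λ = 1. It assumes, which is worth verifying against your answer to question 1, that the pair can be sampled as X exponential of parameter λ followed by Y uniform on (0, X):

```python
import math
import random

random.seed(2)
lam, N, x0, h = 1.0, 400_000, 2.0, 0.1  # x0, h: center and half-width of the X-bin
vals = []
for _ in range(N):
    x = -math.log(random.random()) / lam  # Exp(lam) via inverse CDF (assumed sampling scheme)
    y = x * random.random()               # uniform on (0, x) (assumed sampling scheme)
    if abs(x - x0) < h:
        vals.append(y * y)

# Empirical E[Y^2 | X ≈ x0]; compare with your closed-form answer at x = x0.
print(len(vals), sum(vals) / len(vals))
```

Shrinking the window h reduces the binning bias at the cost of fewer retained samples.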
Exercise 7. Let X, Y be two independent random variables, distributed as N(0,1).
1. Check that E[Xϕ(X²)] = 0 for every bounded measurable function ϕ (use the symmetry x ↦ −x). Deduce
the value of E[X | X²]. What is the value of E[X³ | X²]?
2. Compute E[X | sgn(X)], where sgn(x) = 1 if x > 0, sgn(x) = 0 if x = 0, and sgn(x) = −1 if x < 0.

3. Define Z = X + Y .
3.a) Compute the density function of the random vector (X,Z).
3.b) Compute the conditional distribution of X given Z = z for all z ∈ R.
3.c) Deduce E[X | Z].
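The answer to 3.c) can be sanity-checked by the same binning trick as before: simulate many pairs, keep those whose sum lands near a chosen value z0 (z0 = 1 is an arbitrary hypothetical choice), and average X over them:

```python
import random

random.seed(3)
N, z0, h = 400_000, 1.0, 0.1  # z0, h: center and half-width of the Z-bin
vals = []
for _ in range(N):
    x, y = random.gauss(0, 1), random.gauss(0, 1)
    if abs(x + y - z0) < h:
        vals.append(x)

# Empirical E[X | Z ≈ z0]; compare with your formula from 3.c) at z = z0.
print(len(vals), sum(vals) / len(vals))
```

Repeating this for several values of z0 traces out the whole function z ↦ E[X | Z = z].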
Exercise 8. Let X be a random variable having the exponential distribution of parameter 1. Compute
1. E(X | X > 1);
2. E(X | 1_{X>1});
3. E(X | min(X, 1)).
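A hedged Monte Carlo check for part 1: draw exponential samples and average only those in the event {X > 1}:

```python
import math
import random

random.seed(4)
N = 200_000
samples = [-math.log(random.random()) for _ in range(N)]  # Exp(1) via inverse CDF
tail = [x for x in samples if x > 1]

# Empirical E[X | X > 1]; memorylessness suggests what this value should be.
print(sum(tail) / len(tail))
```

Note that parts 2 and 3 ask for conditional expectations given a random variable, not given an event, so they require more than a single number.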
Exercise 9. Let Y be a real-valued random variable having density

f(y) = (1/√(πy)) e^{−y} 1_{y>0}.

Suppose that the conditional distribution of X given Y is a Gaussian distribution N (0,1/(2Y )).
1. Compute the distribution of the vector (X,Y ).
2. Compute the conditional law of Y given X.
3. Compute E[X²Y].
