CHAPTER 2

Independent random variables


2.1. Product measures
Definition 2.1. Let $\mu_i$ be measures on $(\Omega_i, \mathcal{F}_i)$, $1 \le i \le n$. Let $\mathcal{F} = \mathcal{F}_1 \otimes \cdots \otimes \mathcal{F}_n$ be the sigma algebra of subsets of $\Omega := \Omega_1 \times \cdots \times \Omega_n$ generated by all rectangles $A_1 \times \cdots \times A_n$ with $A_i \in \mathcal{F}_i$. Then, the measure $\mu$ on $(\Omega, \mathcal{F})$ such that $\mu(A_1 \times \cdots \times A_n) = \prod_{i=1}^{n} \mu_i(A_i)$ whenever $A_i \in \mathcal{F}_i$ is called a product measure and denoted $\mu = \mu_1 \otimes \cdots \otimes \mu_n$.
The existence of product measures follows along the lines of the Carathéodory construction, starting with the $\pi$-system of rectangles. We skip the details; in the cases that we shall ever use, we show existence by a much neater method in Proposition 2.8. Uniqueness of the product measure follows from the $\pi$-$\lambda$ theorem, because rectangles form a $\pi$-system that generates the $\sigma$-algebra $\mathcal{F}_1 \otimes \cdots \otimes \mathcal{F}_n$.
Example 2.2. Let $\mathcal{B}_d$, $m_d$ denote the Borel sigma algebra and Lebesgue measure on $\mathbb{R}^d$. Then, $\mathcal{B}_d = \mathcal{B}_1 \otimes \cdots \otimes \mathcal{B}_1$ and $m_d = m_1 \otimes \cdots \otimes m_1$. The first statement is clear (in fact $\mathcal{B}_{d+d'} = \mathcal{B}_d \otimes \mathcal{B}_{d'}$). Regarding $m_d$: by definition, it is the unique measure for which $m_d(A_1 \times \cdots \times A_d)$ equals $\prod_{i=1}^{d} m_1(A_i)$ for all intervals $A_i$. To show that it is the $d$-fold product of $m_1$, we must show that the same holds for arbitrary Borel sets $A_i$. Fix intervals $A_2, \ldots, A_d$ and let $S := \{A_1 \in \mathcal{B}_1 : m_d(A_1 \times \cdots \times A_d) = \prod_{i=1}^{d} m_1(A_i)\}$. Then, $S$ contains all intervals (in particular the $\pi$-system of semi-closed intervals), and by properties of measures it is easy to check that $S$ is a $\lambda$-system. By the $\pi$-$\lambda$ theorem, we get $S = \mathcal{B}_1$, and thus $m_d(A_1 \times \cdots \times A_d) = \prod_{i=1}^{d} m_1(A_i)$ for all $A_1 \in \mathcal{B}_1$ and any intervals $A_2, \ldots, A_d$. Continuing the same argument coordinate by coordinate, we get that $m_d(A_1 \times \cdots \times A_d) = \prod_{i=1}^{d} m_1(A_i)$ for all $A_i \in \mathcal{B}_1$.
The product measure property is defined in terms of sets. As always, it may be rewritten in terms of measurable functions, and we then get the following theorem.
Theorem 2.3 (Fubini's theorem). Let $\mu = \mu_1 \otimes \mu_2$ be a product measure on $\Omega_1 \times \Omega_2$ with the product $\sigma$-algebra. If $f : \Omega_1 \times \Omega_2 \to \mathbb{R}$ is either a non-negative r.v. or integrable w.r.t. $\mu$, then,

(1) For every $x \in \Omega_1$, the function $y \mapsto f(x,y)$ is $\mathcal{F}_2$-measurable, and the function $x \mapsto \int f(x,y)\, d\mu_2(y)$ is $\mathcal{F}_1$-measurable. The same holds with $x$ and $y$ interchanged.

(2) $$\int f(z)\, d\mu(z) = \int_{\Omega_1} \left( \int_{\Omega_2} f(x,y)\, d\mu_2(y) \right) d\mu_1(x) = \int_{\Omega_2} \left( \int_{\Omega_1} f(x,y)\, d\mu_1(x) \right) d\mu_2(y).$$
PROOF. Skipped. Attend measure theory class.
Needless to say (self: then why am I saying this?), all of this goes through for finite products of $\sigma$-finite measures.
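As a quick sanity check, here is a minimal numerical illustration of the theorem for Lebesgue measure on $[0,1]^2$; the integrand $f(x,y) = x e^{xy}$ and the use of scipy.integrate are choices of this sketch, not anything the notes rely on.

```python
# Minimal numerical illustration of Fubini's theorem on [0,1]^2 with
# Lebesgue measure: the double integral of f(x, y) = x * exp(x*y) and the
# two iterated integrals all agree (each equals e - 2).
import numpy as np
from scipy import integrate

f = lambda x, y: x * np.exp(x * y)

# double integral over the square; dblquad's integrand takes (y, x)
dbl, _ = integrate.dblquad(lambda y, x: f(x, y), 0, 1, 0, 1)

# iterated integrals: first dy then dx, and first dx then dy
dy_dx, _ = integrate.quad(lambda x: integrate.quad(lambda y: f(x, y), 0, 1)[0], 0, 1)
dx_dy, _ = integrate.quad(lambda y: integrate.quad(lambda x: f(x, y), 0, 1)[0], 0, 1)

print(dbl, dy_dx, dx_dy, np.e - 2)   # all four numbers agree
```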
Infinite product measures: Given $(\Omega_i, \mathcal{F}_i, \mu_i)$, $i = 1, 2, \ldots$, let $\Omega := \Omega_1 \times \Omega_2 \times \cdots$ and let $\mathcal{F}$ be the sigma algebra generated by all finite dimensional cylinders $A_1 \times \cdots \times A_n \times \Omega_{n+1} \times \Omega_{n+2} \times \cdots$ with $A_i \in \mathcal{F}_i$. Does there exist a product measure on $\mathcal{F}$?

For concreteness, take all $(\Omega_i, \mathcal{F}_i, \mu_i) = (\mathbb{R}, \mathcal{B}, \mu)$. What measure should the product measure give to the set $A \times \mathbb{R} \times \mathbb{R} \times \cdots$? If $\mu(\mathbb{R}) > 1$, it is only reasonable to set the measure of $A \times \mathbb{R} \times \mathbb{R} \times \cdots$ to infinity, and if $\mu(\mathbb{R}) < 1$, it is reasonable to set it to $0$. But then all cylinders would have zero measure or infinite measure! If $\mu(\mathbb{R}) = 1$, at least this problem does not arise, and we shall show that it is indeed possible to make sense of infinite products of probability measures. Thus, the only case in which we can talk reasonably about infinite products of measures is that of probability measures.
2.2. Independence
Definition 2.4. Let $(\Omega, \mathcal{F}, \mathbf{P})$ be a probability space. Let $\mathcal{G}_1, \ldots, \mathcal{G}_k$ be sub-sigma algebras of $\mathcal{F}$. We say that $\mathcal{G}_1, \ldots, \mathcal{G}_k$ are independent if for every $A_1 \in \mathcal{G}_1, \ldots, A_k \in \mathcal{G}_k$, we have $\mathbf{P}(A_1 \cap A_2 \cap \cdots \cap A_k) = \mathbf{P}(A_1) \cdots \mathbf{P}(A_k)$.
Random variables $X_1, \ldots, X_k$ on $(\Omega, \mathcal{F}, \mathbf{P})$ are said to be independent if $\sigma(X_1), \ldots, \sigma(X_k)$ are independent. This is equivalent to saying that $\mathbf{P}(X_i \in A_i \text{ for all } i \le k) = \prod_{i=1}^{k} \mathbf{P}(X_i \in A_i)$ for any $A_i \in \mathcal{B}(\mathbb{R})$.
Events $A_1, \ldots, A_k$ are said to be independent if $\mathbf{1}_{A_1}, \ldots, \mathbf{1}_{A_k}$ are independent. This is equivalent to saying that $\mathbf{P}(A_{j_1} \cap \cdots \cap A_{j_\ell}) = \mathbf{P}(A_{j_1}) \cdots \mathbf{P}(A_{j_\ell})$ for any $1 \le j_1 < j_2 < \cdots < j_\ell \le k$.
In all these cases, an infinite collection of objects (sigma algebras or random variables or events) is said to be independent if every finite subcollection of them is independent.
Some remarks are in order.
(1) As usual, to check independence, it would be convenient if we needed to verify the condition in the definition only for a sufficiently large class of sets. However, if $\mathcal{G}_i = \sigma(S_i)$ and we have $\mathbf{P}(A_1 \cap A_2 \cap \cdots \cap A_k) = \mathbf{P}(A_1) \cdots \mathbf{P}(A_k)$ for every $A_1 \in S_1, \ldots, A_k \in S_k$, we cannot in general conclude that the $\mathcal{G}_i$ are independent! If the $S_i$ are $\pi$-systems, the conclusion does hold (see Lemma 2.5 below).
(2) Checking pairwise independence is insufficient to guarantee independence. For example, suppose $X_1, X_2, X_3$ are independent and $\mathbf{P}(X_i = +1) = \mathbf{P}(X_i = -1) = 1/2$. Let $Y_1 = X_2 X_3$, $Y_2 = X_1 X_3$ and $Y_3 = X_1 X_2$. Then, the $Y_i$ are pairwise independent but not independent.
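Since every probability here is a count over the eight equally likely sign patterns of $(X_1, X_2, X_3)$, the claim can be checked by brute-force enumeration. A minimal sketch (the enumeration is ours, not from the notes):

```python
# Exhaustively verify that Y1 = X2*X3, Y2 = X1*X3, Y3 = X1*X2 are pairwise
# independent but not jointly independent, when X1, X2, X3 are independent
# uniform signs: enumerate all 8 equally likely sign patterns.
from itertools import product
from fractions import Fraction

outcomes = list(product([-1, 1], repeat=3))            # each pattern has prob. 1/8
ys = [(x2 * x3, x1 * x3, x1 * x2) for (x1, x2, x3) in outcomes]

def prob(event):
    """Exact probability of an event over the eight sign patterns."""
    return Fraction(sum(1 for y in ys if event(y)), len(ys))

for i, j in [(0, 1), (0, 2), (1, 2)]:                  # pairwise: joint factorizes
    assert prob(lambda y: y[i] == 1 and y[j] == 1) \
        == prob(lambda y: y[i] == 1) * prob(lambda y: y[j] == 1)

# jointly: P(Y1 = Y2 = Y3 = 1) = 1/4, not (1/2)^3 = 1/8
print(prob(lambda y: y == (1, 1, 1)))                  # -> 1/4
```

Indeed $Y_1 Y_2 Y_3 = (X_1 X_2 X_3)^2 = 1$ always, so any two of the $Y_i$ determine the third.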
Lemma 2.5. If $S_i$ are $\pi$-systems with $\mathcal{G}_i = \sigma(S_i)$, and for every $A_1 \in S_1, \ldots, A_k \in S_k$ we have $\mathbf{P}(A_1 \cap A_2 \cap \cdots \cap A_k) = \mathbf{P}(A_1) \cdots \mathbf{P}(A_k)$, then $\mathcal{G}_1, \ldots, \mathcal{G}_k$ are independent.
PROOF. Fix $A_2 \in S_2, \ldots, A_k \in S_k$ and set $\mathcal{F}_1 := \{B \in \mathcal{G}_1 : \mathbf{P}(B \cap A_2 \cap \cdots \cap A_k) = \mathbf{P}(B)\,\mathbf{P}(A_2) \cdots \mathbf{P}(A_k)\}$. Then $\mathcal{F}_1 \supseteq S_1$ by assumption, and it is easy to check that $\mathcal{F}_1$ is a $\lambda$-system. By the $\pi$-$\lambda$ theorem, it follows that $\mathcal{F}_1 = \mathcal{G}_1$, and we get the assumptions of the lemma for $\mathcal{G}_1, S_2, \ldots, S_k$. Repeating the argument for $S_2, S_3$, etc., we get independence of $\mathcal{G}_1, \ldots, \mathcal{G}_k$.
Corollary 2.6. (1) Random variables $X_1, \ldots, X_k$ are independent if and only if $\mathbf{P}(X_1 \le t_1, \ldots, X_k \le t_k) = \prod_{j=1}^{k} \mathbf{P}(X_j \le t_j)$ for all $t_1, \ldots, t_k \in \mathbb{R}$.

(2) Suppose $\mathcal{G}_\alpha$, $\alpha \in I$, are independent. Let $I_1, \ldots, I_k$ be pairwise disjoint subsets of $I$. Then, the $\sigma$-algebras $\mathcal{F}_j = \sigma\big(\bigcup_{\alpha \in I_j} \mathcal{G}_\alpha\big)$ are independent.
(3) If $X_{i,j}$, $i \le n$, $j \le n_i$, are independent, then for any Borel measurable $f_i : \mathbb{R}^{n_i} \to \mathbb{R}$, the r.v.s $f_i(X_{i,1}, \ldots, X_{i,n_i})$ are also independent.
PROOF. (1) The sets $(-\infty, t]$ form a $\pi$-system that generates $\mathcal{B}(\mathbb{R})$. (2) For $j \le k$, let $S_j$ be the collection of finite intersections of sets $A_\alpha \in \mathcal{G}_\alpha$ with $\alpha \in I_j$. Then the $S_j$ are $\pi$-systems and $\sigma(S_j) = \mathcal{F}_j$. (3) Follows from (2) by considering $\mathcal{G}_{i,j} := \sigma(X_{i,j})$ and observing that $f_i(X_{i,1}, \ldots, X_{i,n_i})$ is measurable w.r.t. $\sigma(\mathcal{G}_{i,1} \cup \cdots \cup \mathcal{G}_{i,n_i})$.
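Part (1) is also easy to see empirically: for independent samples, the joint distribution function factorizes up to Monte Carlo error. A small sketch (the distributions and test points are our choices, not from the notes):

```python
# Monte Carlo illustration of Corollary 2.6(1): for independent X, Y the joint
# CDF factorizes, P(X <= s, Y <= t) ~ P(X <= s) * P(Y <= t).
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
x = rng.standard_normal(n)        # X ~ N(0,1)
y = rng.exponential(size=n)       # Y ~ Exp(1), drawn independently of X

for s, t in [(0.0, 1.0), (1.5, 0.5), (-1.0, 2.0)]:
    joint = np.mean((x <= s) & (y <= t))
    prod = np.mean(x <= s) * np.mean(y <= t)
    print(f"s={s:+.1f} t={t:+.1f}: joint={joint:.4f} product={prod:.4f}")
```

The two columns agree to about $1/\sqrt{n}$, as they should.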
So far, we have stated conditions for independence in terms of probabilities of events. As usual, they generalize to conditions in terms of expectations of random variables.
Lemma 2.7. (1) Sigma algebras $\mathcal{G}_1, \ldots, \mathcal{G}_k$ are independent if and only if for all bounded $\mathcal{G}_i$-measurable random variables $X_i$, $1 \le i \le k$, we have $\mathbf{E}[X_1 \cdots X_k] = \prod_{i=1}^{k} \mathbf{E}[X_i]$.

(2) In particular, random variables $Z_1, \ldots, Z_k$ (where $Z_i$ is an $n_i$-dimensional random vector) are independent if and only if $\mathbf{E}\big[\prod_{i=1}^{k} f_i(Z_i)\big] = \prod_{i=1}^{k} \mathbf{E}[f_i(Z_i)]$ for all bounded Borel measurable functions $f_i : \mathbb{R}^{n_i} \to \mathbb{R}$.
We say bounded measurable just to ensure that the expectations exist. The proof goes inductively, by fixing $X_2, \ldots, X_k$ and letting $X_1$ be first a simple r.v., then a non-negative r.v., and finally a general bounded measurable r.v.
PROOF. (1) Suppose the $\mathcal{G}_i$ are independent. If $X_i$ is $\mathcal{G}_i$-measurable for each $i$, then it is clear that the $X_i$ are independent and hence $\mathbf{P} \circ (X_1, \ldots, X_k)^{-1} = (\mathbf{P} \circ X_1^{-1}) \otimes \cdots \otimes (\mathbf{P} \circ X_k^{-1})$. Denote $\nu_i := \mathbf{P} \circ X_i^{-1}$ and apply Fubini's theorem (and the change of variables formula) to get

$$\mathbf{E}[X_1 \cdots X_k] \overset{\text{c.o.v.}}{=} \int_{\mathbb{R}^k} \prod_{i=1}^{k} x_i \; d(\nu_1 \otimes \cdots \otimes \nu_k)(x_1, \ldots, x_k) \overset{\text{Fub.}}{=} \int_{\mathbb{R}} \cdots \int_{\mathbb{R}} \prod_{i=1}^{k} x_i \; d\nu_1(x_1) \cdots d\nu_k(x_k) = \prod_{i=1}^{k} \int_{\mathbb{R}} u \; d\nu_i(u) \overset{\text{c.o.v.}}{=} \prod_{i=1}^{k} \mathbf{E}[X_i].$$
Conversely, if $\mathbf{E}[X_1 \cdots X_k] = \prod_{i=1}^{k} \mathbf{E}[X_i]$ for all bounded $\mathcal{G}_i$-measurable $X_i$, then applying this to indicators $X_i = \mathbf{1}_{A_i}$ of events $A_i \in \mathcal{G}_i$, we see the independence of the $\sigma$-algebras $\mathcal{G}_i$.
(2) The second claim follows from the first by setting $\mathcal{G}_i := \sigma(Z_i)$ and observing that a random variable $X_i$ is $\sigma(Z_i)$-measurable if and only if $X_i = f \circ Z_i$ for some Borel measurable $f : \mathbb{R}^{n_i} \to \mathbb{R}$. ∎
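The product rule for expectations is again easy to see numerically. A minimal Monte Carlo sketch, with one bounded function of each variable (the distributions and test functions are our choices):

```python
# Monte Carlo illustration of Lemma 2.7: for independent X, Y and bounded
# measurable f, g we have E[f(X) g(Y)] = E[f(X)] E[g(Y)].
import numpy as np

rng = np.random.default_rng(1)
n = 10**6
x = rng.standard_normal(n)              # X ~ N(0,1)
y = rng.uniform(size=n)                 # Y ~ U[0,1], independent of X

f = lambda u: np.cos(u)                 # bounded measurable test functions
g = lambda u: (u > 0.3).astype(float)

print(np.mean(f(x) * g(y)), np.mean(f(x)) * np.mean(g(y)))  # agree up to MC error

# counterpoint: replace Y by X itself (fully dependent) and the identity fails
print(np.mean(f(x) * g(x)), np.mean(f(x)) * np.mean(g(x)))  # visibly different
```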
2.3. Independent sequences of random variables


First we make the observation that product measures and independence are closely related concepts.

An observation: The independence of random variables $X_1, \ldots, X_k$ is precisely the same as saying that $\mathbf{P} \circ X^{-1}$ is the product measure $(\mathbf{P} \circ X_1^{-1}) \otimes \cdots \otimes (\mathbf{P} \circ X_k^{-1})$, where $X = (X_1, \ldots, X_k)$.
Consider the following questions. Henceforth, we write $\mathbb{R}^{\infty}$ for the countable product space $\mathbb{R} \times \mathbb{R} \times \cdots$ and $\mathcal{B}(\mathbb{R}^{\infty})$ for the cylinder $\sigma$-algebra generated by all finite dimensional cylinders $A_1 \times \cdots \times A_n \times \mathbb{R} \times \mathbb{R} \times \cdots$ with $A_i \in \mathcal{B}(\mathbb{R})$. This notation is justified, because the cylinder $\sigma$-algebra is also the Borel $\sigma$-algebra on $\mathbb{R}^{\infty}$ with the product topology.
Question 1: Given $\mu_i \in \mathcal{P}(\mathbb{R})$, $i \ge 1$, does there exist a probability space with independent random variables $X_i$ having distributions $\mu_i$?
Question 2: Given $\mu_i \in \mathcal{P}(\mathbb{R})$, $i \ge 1$, does there exist a p.m. $\mu$ on $(\mathbb{R}^{\infty}, \mathcal{B}(\mathbb{R}^{\infty}))$ such that $\mu(A_1 \times \cdots \times A_n \times \mathbb{R} \times \mathbb{R} \times \cdots) = \prod_{i=1}^{n} \mu_i(A_i)$?
Observation: The above two questions are equivalent. For, suppose we answer the first question by finding an $(\Omega, \mathcal{F}, \mathbf{P})$ with independent random variables $X_i : \Omega \to \mathbb{R}$ such that $X_i \sim \mu_i$ for all $i$. Then, $X : \Omega \to \mathbb{R}^{\infty}$ defined by $X(\omega) = (X_1(\omega), X_2(\omega), \ldots)$ is measurable w.r.t. the relevant $\sigma$-algebras (why?). Then, let $\mu := \mathbf{P} \circ X^{-1}$ be the pushforward p.m. on $\mathbb{R}^{\infty}$. Clearly

$$\mu(A_1 \times \cdots \times A_n \times \mathbb{R} \times \mathbb{R} \times \cdots) = \mathbf{P}(X_1 \in A_1, \ldots, X_n \in A_n) = \prod_{i=1}^{n} \mathbf{P}(X_i \in A_i) = \prod_{i=1}^{n} \mu_i(A_i).$$

Thus $\mu$ is the product measure required by the second question.
Conversely, if we could construct the product measure $\mu$ on $(\mathbb{R}^{\infty}, \mathcal{B}(\mathbb{R}^{\infty}))$, then we could take $\Omega = \mathbb{R}^{\infty}$, $\mathcal{F} = \mathcal{B}(\mathbb{R}^{\infty})$, $\mathbf{P} = \mu$, and $X_i$ to be the $i$-th co-ordinate random variable. Then you may check that these satisfy the requirements of the first question.
The two questions are thus equivalent, but what is the answer?! It is yes, of course, or we would not make such heavy weather about it.
Proposition 2.8 (Daniell). Let $\mu_i \in \mathcal{P}(\mathbb{R})$, $i \ge 1$, be Borel p.m.s on $\mathbb{R}$. Then, there exists a probability space with independent random variables $X_1, X_2, \ldots$ such that $X_i \sim \mu_i$.
PROOF. We arrive at the construction in three stages.

(1) Independent Bernoullis: Consider $([0,1], \mathcal{B}, m)$ and the random variables $X_k : [0,1] \to \mathbb{R}$, where $X_k(\omega)$ is defined to be the $k$-th digit in the binary expansion of $\omega$. For definiteness, we may always take the infinite binary expansion. Then, by an earlier homework exercise, $X_1, X_2, \ldots$ are independent Bernoulli(1/2) random variables.

(2) Independent uniforms: Note that, as a consequence, on any probability space, if $Y_i$ are i.i.d. Ber(1/2) variables, then $U := \sum_{n=1}^{\infty} 2^{-n} Y_n$ has the uniform distribution on $[0,1]$. Consider again the canonical probability space and the r.v.s $X_i$, and split the digit indices into pairwise disjoint infinite subsequences: set $U_1 := X_1/2 + X_3/2^2 + X_5/2^3 + \cdots$, $U_2 := X_2/2 + X_6/2^2 + \cdots$, etc. Clearly, the $U_i$ are i.i.d. U[0,1].

(3) Arbitrary distributions: For a p.m. $\mu$, recall the left-continuous inverse $G_\mu$, which has the property that $G_\mu(U) \sim \mu$ if $U \sim U[0,1]$. Suppose we are given p.m.s $\mu_1, \mu_2, \ldots$. On the canonical probability space, let $U_i$ be i.i.d. uniforms constructed as before. Define $X_i := G_{\mu_i}(U_i)$. Then, the $X_i$ are independent and $X_i \sim \mu_i$. Thus we have constructed an independent sequence of random variables having the specified distributions. ∎
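The whole construction can be imitated in code, truncating each binary expansion at finitely many digits. A finite-precision sketch (the round-robin splitting of digit positions and the specific inverse CDFs are our illustrative choices, not the notes'):

```python
# Finite-precision sketch of Daniell's construction: one uniform omega yields
# i.i.d. Bernoulli(1/2) binary digits; disjoint subsequences of digits are
# reassembled into independent uniforms U_i; then X_i = G_i(U_i) has law mu_i.
import numpy as np

rng = np.random.default_rng(2)

def daniell_sample(inverse_cdfs, depth=30):
    """One draw of independent X_i = G_i(U_i), all built from a single uniform."""
    omega = rng.uniform()                    # one point of ([0,1], B, m)
    k = len(inverse_cdfs)
    digits = []                              # binary digits X_1(omega), X_2(omega), ...
    for _ in range(k * depth):
        omega *= 2
        digits.append(int(omega))
        omega -= int(omega)
    xs = []
    for i, g in enumerate(inverse_cdfs):
        # U_i is assembled from digit positions i, i+k, i+2k, ... (disjoint sets)
        u = sum(digits[i + k * n] * 2.0 ** -(n + 1) for n in range(depth))
        xs.append(g(u))
    return xs

# illustrative inverse CDFs: Exp(1) and a fair +/-1 coin
g_exp = lambda u: -np.log(1.0 - u)
g_sign = lambda u: -1.0 if u < 0.5 else 1.0
print(daniell_sample([g_exp, g_sign]))
```

In exact arithmetic and with depth tending to infinity this is precisely the proof above; in floating point, one double carries only about 53 random bits, so the sketch is an illustration rather than a substitute for the measure-theoretic argument.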
Sometimes in books one finds the construction of uncountable product measures too; it has no use for us. A very natural question at this point, however, is to go beyond independence. We just state the following theorem, which generalizes the previous proposition.
Theorem 2.9 (Kolmogorov's existence theorem). For each $n \ge 1$ and each $1 \le i_1 < i_2 < \cdots < i_n$, let $\mu_{i_1, \ldots, i_n}$ be a Borel p.m. on $\mathbb{R}^n$. Then there exists a unique probability measure $\mu$ on $(\mathbb{R}^{\infty}, \mathcal{B}(\mathbb{R}^{\infty}))$ such that

$$\mu\big(\{x \in \mathbb{R}^{\infty} : x_{i_1} \in A_1, \ldots, x_{i_n} \in A_n\}\big) = \mu_{i_1, \ldots, i_n}(A_1 \times \cdots \times A_n) \quad \text{for all } n \ge 1 \text{ and all } A_i \in \mathcal{B}(\mathbb{R}),$$

if and only if the given family of probability measures satisfies the consistency condition

$$\mu_{i_1, \ldots, i_n}(A_1 \times \cdots \times A_{n-1} \times \mathbb{R}) = \mu_{i_1, \ldots, i_{n-1}}(A_1 \times \cdots \times A_{n-1})$$

for any $A_k \in \mathcal{B}(\mathbb{R})$, any $1 \le i_1 < i_2 < \cdots < i_n$, and any $n \ge 1$.