
Chernoff Bound

Kirill Levchenko
We will prove a fairly general form of the Chernoff bound. This proof was given by Van Vu at the
University of California, San Diego.
Theorem 1. Let $X_1, \ldots, X_n$ be discrete, independent random variables such that $E[X_i] = 0$ and $|X_i| \le 1$ for all $i$. Let $X = \sum_{i=1}^{n} X_i$ and let $\sigma^2$ be the variance of $X$. Then
$$\Pr[|X| \ge \lambda\sigma] \le 2e^{-\lambda^2/4}$$
for any $0 \le \lambda \le 2\sigma$.
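As a concrete illustration (not part of the original note), suppose each $X_i$ is uniform on $\{-1, +1\}$. Then $E[X_i] = 0$, $|X_i| \le 1$, and $\sigma^2 = \operatorname{Var}[X] = n$, so the theorem gives
$$\Pr\bigl[|X| \ge \lambda\sqrt{n}\bigr] \le 2e^{-\lambda^2/4} \quad \text{for } 0 \le \lambda \le 2\sqrt{n}.$$
For example, with $n = 100$ and $\lambda = 4$, the chance that the sum of $100$ fair $\pm 1$ coin flips deviates from $0$ by $40$ or more is at most $2e^{-4} \approx 0.037$.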
Proof. We will prove
$$\Pr[X \ge \lambda\sigma] \le e^{-\lambda^2/4}.$$
The argument is symmetric for $\Pr[X \le -\lambda\sigma]$. Let $t$ be a real number between 0 and 1, to be determined later. Since $e^x$ is increasing, the events $\{tX \ge t\lambda\sigma\}$ and $\{e^{tX} \ge e^{t\lambda\sigma}\}$ coincide, so
$$\Pr[X \ge \lambda\sigma] = \Pr[tX \ge t\lambda\sigma] = \Pr\bigl[e^{tX} \ge e^{t\lambda\sigma}\bigr] \le \frac{E[e^{tX}]}{e^{t\lambda\sigma}},$$
the last step by the Markov inequality ($\Pr[Y \ge a] \le E[Y]/a$ for a nonnegative random variable $Y$ and $a > 0$).
Before going any further, we establish a bound on $E[e^{tZ}]$, where $-1 \le Z \le 1$ and $E[Z] = 0$. Additionally, let $t \le 1$. Write $z_1, \ldots, z_m$ for the values $Z$ takes and $p_1, \ldots, p_m$ for their probabilities. By the definition of expectation,
$$E[e^{tZ}] = \sum_{j=1}^{m} p_j e^{t z_j} = \sum_{j=1}^{m} p_j \left( 1 + t z_j + \frac{1}{2!}(t z_j)^2 + \frac{1}{3!}(t z_j)^3 + \cdots \right)$$
$$= \underbrace{\sum_{j=1}^{m} p_j}_{A} \;+\; t \underbrace{\sum_{j=1}^{m} p_j z_j}_{B} \;+\; \underbrace{\sum_{j=1}^{m} p_j \left( \frac{1}{2!}(t z_j)^2 + \frac{1}{3!}(t z_j)^3 + \cdots \right)}_{C}.$$
Summation $A$ is the sum of all probabilities, so $A = 1$. Summation $B$ is the expectation of $Z$, so $B = 0$. Since $|t z_j| \le 1$, each power $|t z_j|^k$ with $k \ge 2$ is at most $(t z_j)^2$, so we can upper-bound $C$ by
$$\sum_{j=1}^{m} p_j (t z_j)^2 \left( \frac{1}{2!} + \frac{1}{3!} + \cdots \right) \le t^2 \sum_{j=1}^{m} p_j z_j^2,$$
where the last inequality uses $\frac{1}{2!} + \frac{1}{3!} + \cdots = e - 2 < 1$.
But the above summation is just the variance of $Z$, giving
$$E[e^{tZ}] \le 1 + t^2 \operatorname{Var}[Z].$$
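As a quick check of this bound (an illustration not in the original note), take $Z$ to be uniform on $\{-1, +1\}$, so $\operatorname{Var}[Z] = 1$. Then for $0 \le t \le 1$,
$$E[e^{tZ}] = \tfrac{1}{2}e^{t} + \tfrac{1}{2}e^{-t} = 1 + \frac{t^2}{2!} + \frac{t^4}{4!} + \cdots \le 1 + t^2,$$
in agreement with $E[e^{tZ}] \le 1 + t^2 \operatorname{Var}[Z]$.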
Returning to our claim,
$$E[e^{tX}] = E\bigl[e^{t(X_1 + X_2 + \cdots + X_n)}\bigr] = E\left[\prod_{i=1}^{n} e^{tX_i}\right] = \prod_{i=1}^{n} E[e^{tX_i}] \quad \text{by the independence of the } X_i$$
$$\le \prod_{i=1}^{n} \bigl(1 + t^2 \operatorname{Var}[X_i]\bigr) \le \prod_{i=1}^{n} e^{t^2 \operatorname{Var}[X_i]} \quad \text{since } 1 + \alpha \le e^{\alpha} \text{ for } \alpha \ge 0$$
$$= e^{t^2 \sigma^2} \quad \text{since } \textstyle\sum_{i=1}^{n} \operatorname{Var}[X_i] = \operatorname{Var}[X] = \sigma^2 \text{ by the independence of the } X_i.$$
Thus,
$$\Pr[X \ge \lambda\sigma] \le \frac{e^{t^2\sigma^2}}{e^{t\lambda\sigma}} = e^{-t\sigma(\lambda - t\sigma)}.$$
Optimizing over $t$ (the exponent $t^2\sigma^2 - t\lambda\sigma$ is minimized at the zero of its derivative $2t\sigma^2 - \lambda\sigma$) we get $t = \lambda/(2\sigma)$, which gives
$$\Pr[X \ge \lambda\sigma] \le e^{-\lambda^2/4}.$$
Note that $t = \lambda/(2\sigma)$ satisfies the requirement $t \le 1$ precisely because the theorem assumes $\lambda \le 2\sigma$.
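Finally, a short numerical sanity check (my own sketch, not part of the original note; n, trials, and the variable names are arbitrary choices) comparing the empirical tail of a sum of independent uniform $\pm 1$ variables, for which $\sigma^2 = n$, against the bound $2e^{-\lambda^2/4}$ of Theorem 1:

import math
import random

# X is a sum of n independent +/-1 variables: E[X_i] = 0, |X_i| <= 1,
# and Var[X] = sigma^2 = n, so Theorem 1 applies.
n = 100
sigma = math.sqrt(n)
trials = 100_000

random.seed(0)
sums = [sum(random.choice((-1, 1)) for _ in range(n)) for _ in range(trials)]

for lam in (1.0, 2.0, 3.0, 4.0):   # each satisfies 0 <= lam <= 2*sigma = 20
    empirical = sum(abs(s) >= lam * sigma for s in sums) / trials
    bound = 2 * math.exp(-lam ** 2 / 4)
    print(f"lambda = {lam}: empirical tail {empirical:.5f} <= bound {bound:.5f}")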