
EE376A: Homework #2

Due on Thursday, January 28, 2016


You can hand in the homework after class, or deposit it before 5 PM in the EE376A
drawer of the class file cabinet on the second floor of the Packard Building.
Some definitions that may be useful:
Definition 1: A sequence of random variables X1, X2, . . . , Xn, . . . converges to a random
variable X in probability if for any ε > 0,

lim_{n→∞} P{|Xn − X| < ε} = 1.   (1)
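Definition 1 can be illustrated numerically (a sketch, with a Bernoulli(1/2) source, tolerance ε = 0.05, and the function name `prob_within_eps` all chosen by me for illustration): the sample mean of n fair bits converges in probability to 1/2, so the probability of landing within ε of 1/2 approaches 1 as n grows.

```python
import random

def prob_within_eps(n, eps=0.05, trials=2000, seed=0):
    """Monte Carlo estimate of P{|sample mean of n fair bits - 1/2| < eps}."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() < 0.5 for _ in range(n)) / n
        if abs(mean - 0.5) < eps:
            hits += 1
    return hits / trials

# The estimated probability should climb toward 1 as n increases.
for n in (10, 100, 1000):
    print(n, prob_within_eps(n))
```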

1. Entropy and pairwise independence.


Let X, Y, Z be three binary Bernoulli(1/2) random variables that are pairwise independent; that is, I(X; Y) = I(X; Z) = I(Y; Z) = 0.
(a) Under this constraint, what is the minimum value for H(X, Y, Z) (under all possible joint distributions of (X, Y, Z) that are consistent with the above)?
(b) Give an example achieving this minimum.
(c) Now suppose that X, Y, Z are three random variables each uniformly distributed
over the alphabet {1, 2, . . . , m}. Again, they are pairwise independent. What is
the minimum value for H(X, Y, Z)?
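Candidate joint distributions for this problem can be tested numerically. Below is a minimal sketch (the helper names `H`, `marginal`, and `I` are mine) that computes joint entropies and pairwise mutual informations from an explicit joint pmf over tuples; the three-independent-fair-bits example shown is just a sanity check, not the minimizer.

```python
from collections import defaultdict
from math import log2

def H(pmf):
    """Entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marginal(joint, axes):
    """Marginal pmf of the coordinates in `axes` from a joint pmf over tuples."""
    out = defaultdict(float)
    for outcome, p in joint.items():
        out[tuple(outcome[i] for i in axes)] += p
    return dict(out)

def I(joint, a, b):
    """Mutual information I(X_a; X_b) in bits."""
    return H(marginal(joint, [a])) + H(marginal(joint, [b])) - H(marginal(joint, [a, b]))

# Sanity check: three independent fair bits are pairwise independent
# and have joint entropy 3 bits.
joint = {(x, y, z): 1 / 8 for x in (0, 1) for y in (0, 1) for z in (0, 1)}
print(H(joint))        # → 3.0
print(I(joint, 0, 1))  # → 0.0
```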
2. Conditional entropy. Let X1 , X2 , X3 be i.i.d. discrete random variables.
Which is larger: H(X1 |X1 + X2 + X3 ) or H(X1 + X2 |X1 + X2 + X3 )?

3. Inequalities.
Let X, Y and Z be jointly distributed random variables. Prove the following inequalities and find conditions for equality.
(a) H(X, Y | Z) ≥ H(X | Z).
(b) I(X, Y; Z) ≥ I(X; Z).
(c) H(X, Y, Z) − H(X, Y) ≤ H(X, Z) − H(X).
(d) I(X; Z | Y) ≥ I(Z; Y | X) − I(Z; Y) + I(X; Z).

4. Conditional mutual information.


Consider a sequence of n binary random variables X1, X2, . . . , Xn. Each n-sequence
with an even number of 1s has probability 2^{−(n−1)} and each n-sequence with an odd
number of 1s has probability 0. Find the mutual informations

I(X1; X2), I(X2; X3 | X1), . . . , I(Xn−1; Xn | X1, . . . , Xn−2).

5. The value of a question.


Let X ~ p(x), x = 1, 2, . . . , m. We are given a set S ⊆ {1, 2, . . . , m}. We ask whether
X ∈ S and receive the answer

Y = 1, if X ∈ S;
Y = 0, if X ∉ S.

Suppose P{X ∈ S} = α. Find the decrease in uncertainty H(X) − H(X|Y).
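The quantity asked for here can be computed directly from the definitions for any concrete p and S. Below is a minimal sketch (the pmf, the choice of S, and the helper names `h` and `info_gain` are arbitrary illustrations of mine, not part of the problem):

```python
from math import log2

def h(probs):
    """Entropy in bits of a list of probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

def info_gain(p, S):
    """H(X) - H(X|Y), where Y = 1 if X is in S and 0 otherwise, for pmf p = {x: prob}."""
    alpha = sum(q for x, q in p.items() if x in S)
    # H(X|Y) = alpha * H(X | X in S) + (1 - alpha) * H(X | X not in S)
    in_S = [q / alpha for x, q in p.items() if x in S] if alpha > 0 else []
    out_S = [q / (1 - alpha) for x, q in p.items() if x not in S] if alpha < 1 else []
    return h(list(p.values())) - (alpha * h(in_S) + (1 - alpha) * h(out_S))

p = {1: 0.5, 2: 0.25, 3: 0.125, 4: 0.125}  # example dyadic pmf
print(info_gain(p, {1}))  # → 1.0
```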

6. AEP.
Let Xi be i.i.d. ~ p(x), x ∈ {1, 2, . . . , m}. For ε > 0, let μ = EX and
H = −Σ_x p(x) log p(x). Let A^n = {x^n ∈ X^n : |−(1/n) log p(x^n) − H| ≤ ε} and
B^n = {x^n ∈ X^n : |(1/n) Σ_{i=1}^n x_i − μ| ≤ ε}.
(a) Does P{X^n ∈ A^n} → 1?
(b) Does P{X^n ∈ A^n ∩ B^n} → 1?
(c) Show |A^n ∩ B^n| ≤ 2^{n(H+ε)}, for all n.
(d) Show |A^n ∩ B^n| ≥ (1/2) 2^{n(H−ε)}, for n sufficiently large.
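Intuition for parts (a) and (b) can be built by simulation. A sketch (the three-letter pmf, ε = 0.1, and the function names are arbitrary choices of mine) estimating how often a random n-sequence lands in A^n and in A^n ∩ B^n:

```python
import random
from math import log2

p = {1: 0.5, 2: 0.25, 3: 0.25}             # example pmf
H = -sum(q * log2(q) for q in p.values())  # entropy of p
mu = sum(x * q for x, q in p.items())      # mean of p

def membership(xs, eps):
    """Return (in A^n, in B^n) for the sequence xs."""
    n = len(xs)
    logp = sum(log2(p[x]) for x in xs)
    in_A = abs(-logp / n - H) <= eps
    in_B = abs(sum(xs) / n - mu) <= eps
    return in_A, in_B

def estimate(n, eps=0.1, trials=2000, seed=1):
    """Monte Carlo estimates of P{X^n in A^n} and P{X^n in A^n and B^n}."""
    rng = random.Random(seed)
    outcomes, weights = list(p), list(p.values())
    hits_A = hits_AB = 0
    for _ in range(trials):
        xs = rng.choices(outcomes, weights=weights, k=n)
        a, b = membership(xs, eps)
        hits_A += a
        hits_AB += a and b
    return hits_A / trials, hits_AB / trials

print(estimate(1000))  # both frequencies should be close to 1
```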


7. Let X1, X2, . . . be i.i.d. ~ X, where X takes values in the alphabet X = {1, 2, . . . , K} and
p(x) > 0 for all x ∈ X.
(a) Find the limit (in probability) of
(p(X1, X2, . . . , Xn))^{r/n}
as n → ∞. (Here r is fixed.)
(b) Find
E[1 / p(X1, X2, . . . , Xn)].

8. An AEP-like limit and the AEP.


(a) Let X1, X2, . . . be i.i.d. drawn according to probability mass function p(x). Find
the limit in probability as n → ∞ of
p(X1, X2, . . . , Xn)^{1/n}.
(b) Let X1, X2, . . . be drawn i.i.d. according to the following distribution:

Xi = 2, with probability 2/5,
Xi = 3, with probability 2/5,
Xi = 4, with probability 1/5.

Find the limit (in probability) of the product
(X1 X2 · · · Xn)^{1/n}.
(c) Evaluate the limit of p(X1, X2, . . . , Xn)^{1/n} assuming the distribution in part (b).
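The limit in part (b) can be checked empirically before proving it. A minimal simulation sketch (the function name `geo_mean_sample` is mine; the product is computed via logs to avoid overflow):

```python
import random
from math import log2

def geo_mean_sample(n, seed=0):
    """Simulate (X1 * X2 * ... * Xn)^(1/n) for the pmf in part (b)."""
    rng = random.Random(seed)
    vals, wts = (2, 3, 4), (2 / 5, 2 / 5, 1 / 5)
    xs = rng.choices(vals, weights=wts, k=n)
    # product^(1/n) = 2^((1/n) * sum of log2 Xi), computed stably
    return 2 ** (sum(log2(x) for x in xs) / n)

# As n grows, the samples should concentrate around a single value.
for n in (100, 10000):
    print(n, geo_mean_sample(n))
```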

9. Prefix and uniquely decodable codes.
Consider the following code:

u   Codeword
a   10
b   00
c   11
d   110

(a) Is this a prefix code?
(b) Prove that this code is uniquely decodable.
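The prefix condition in part (a) can be checked mechanically; a sketch follows (the function name `is_prefix_free` is mine, and a brute-force scan like this does not replace the proof asked for in part (b)):

```python
def is_prefix_free(codewords):
    """True iff no codeword is a prefix of another, distinct codeword."""
    words = list(codewords)
    for u in words:
        for v in words:
            if u != v and v.startswith(u):
                return False
    return True

code = {"a": "10", "b": "00", "c": "11", "d": "110"}
print(is_prefix_free(code.values()))
```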
