HW6

EE5581 - Information Theory and Coding, UMN, Fall 2018

Homework 6
Instructor: Soheil Mohajer
Due on: Dec. 10, 2:30 PM

Problem 1
Let X ∼ N(0, σ²) be a zero-mean Gaussian random variable.

(a) Find the criterion for a sequence (x1, . . . , xn) drawn i.i.d. according to X to be typical.


(b) Describe the shape of the typical set A^(n) in the n-dimensional Euclidean space.

(c) Approximate the volume of the shape you found in (b).

(d) Compare the result of (c) to the result of the AEP theorem.

Hint: You may use the fact that Γ(z + 1) = zΓ(z) and the Stirling approximation Γ(z + 1) ∼ √(2πz) (z/e)^z.
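
As a numerical sanity check for parts (c) and (d), here is a minimal Python sketch. It assumes the typical set is approximately a ball of radius √(nσ²) and compares the per-symbol log-volume of that ball with the differential entropy h(X); the value sigma2 = 2.0 is an arbitrary illustrative choice.

    import math

    def log2_ball_volume(n, r):
        # log2 of the volume of an n-ball of radius r:
        #   V = pi^(n/2) * r^n / Gamma(n/2 + 1)
        return ((n / 2) * math.log2(math.pi) + n * math.log2(r)
                - math.lgamma(n / 2 + 1) / math.log(2))

    sigma2 = 2.0  # assumed variance, purely illustrative
    hx = 0.5 * math.log2(2 * math.pi * math.e * sigma2)  # h(X) for X ~ N(0, sigma2)
    for n in (10, 100, 1000, 10000):
        vol = log2_ball_volume(n, math.sqrt(n * sigma2))
        print(f"n={n:6d}: log-volume/n = {vol / n:.4f}, h(X) = {hx:.4f}")

The two per-symbol quantities approach each other as n grows, which is the agreement part (d) asks you to explain.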

Problem 2

Consider the channel Yi = Xi + Zi, where Xi is the transmitted signal with average power constraint P, Zi is independent additive noise, and Yi is the received signal. Let

    Zi = 0    with probability 1/10,
    Zi = Z∗   with probability 9/10,

where Z∗ ∼ N(0, N). Thus Zi has a mixture distribution: a Gaussian component and a degenerate component with a point mass at 0.

(a) What is the capacity of this channel?

(b) How would you encode and transmit over this channel to achieve capacity?
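
To build intuition for this noise model, the short Python sketch below simulates the noise sequence; the variance N = 4.0 and the sample size are illustrative assumptions, not part of the problem.

    import numpy as np

    rng = np.random.default_rng(0)
    n, N = 100_000, 4.0                      # assumed sample size and noise variance
    noiseless = rng.random(n) < 0.1          # Z_i = 0 with probability 1/10
    z = np.where(noiseless, 0.0, rng.normal(0.0, np.sqrt(N), size=n))
    print("empirical P(Z_i = 0):", np.mean(z == 0.0))

Roughly one channel use in ten is exactly noise-free; thinking about what the receiver can do on those uses is a natural entry point for part (a).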

Problem 3
Consider the zero-mean jointly Gaussian random variables X and Y with covariance matrix
    E[ (X, Y)ᵀ (X, Y) ] = [ σx²      ρσxσy ]
                          [ ρσxσy    σy²   ],

i.e.,

    (X, Y) ∼ p(x, y) = 1 / ( 2πσxσy √(1 − ρ²) ) · exp( − 1/(2(1 − ρ²)) [ x²/σx² + y²/σy² − 2ρxy/(σxσy) ] ).

(a) Find h(X), h(Y), and h(X, Y).

(b) Find f(x|y).
(c) Using (b), find h(X|Y), and compare it with h(X, Y) − h(Y) from (a).
(d) Interpret h(X|Y) for ρ = 0 and ρ = 1.
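
All quantities in this problem have closed forms, so a quick numerical cross-check is possible. The Python sketch below uses the standard Gaussian differential-entropy formula h = ½ log₂((2πe)^k det K); the values of σx, σy, ρ are illustrative assumptions.

    import numpy as np

    sx, sy, rho = 1.0, 2.0, 0.5  # assumed illustrative values
    K = np.array([[sx**2, rho * sx * sy],
                  [rho * sx * sy, sy**2]])

    def h_gauss_bits(cov):
        # differential entropy of a jointly Gaussian vector, in bits:
        #   h = 0.5 * log2((2*pi*e)^k * det(K))
        cov = np.atleast_2d(cov)
        k = cov.shape[0]
        return 0.5 * np.log2((2 * np.pi * np.e) ** k * np.linalg.det(cov))

    hX, hY, hXY = h_gauss_bits(sx**2), h_gauss_bits(sy**2), h_gauss_bits(K)
    print("h(X,Y) - h(Y)          =", hXY - hY)
    # X given Y = y is Gaussian with variance sx^2 * (1 - rho^2)
    print("h(X|Y) from cond. var. =", h_gauss_bits(sx**2 * (1 - rho**2)))

The two printed values agree, which is the consistency check part (c) asks for.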

Problem 4
Consider a continuous channel with additive noise

    Y = X + Z,

where Z is independent of X and has variance E[Z²] = σ² (note that the noise is not necessarily Gaussian). Assume we use a Gaussian input X ∼ N(0, P) to communicate over this channel.

(a) Assume α is an arbitrary constant. Prove each of the marked (in)equalities below:

    I(X; Y) = h(X) − h(X|Y)
            = h(X) − h(X − αY | Y)              (i)
            ≥ h(X) − h(X − αY)                  (ii)
            ≥ (1/2) log( P / E[(X − αY)²] ).    (iii)    (1)

(b) Write E[(X − αY)²] in terms of α, σ², and P, and evaluate the lower bound in (1).
(c) Find the best α to obtain the strongest lower bound on I(X; Y ). Compute the corresponding bound on the
mutual information.
(d) Compare the result of part (c) with the capacity of the same channel when the noise is Gaussian with the same variance. What do you conclude?
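
The bound in (1) can also be explored numerically before deriving it. The Python sketch below estimates E[(X − αY)²] by Monte Carlo, scans α, and compares the best bound with the Gaussian-channel capacity ½ log₂(1 + P/σ²); the values of P and σ² are illustrative, and the noise is drawn Gaussian here only to generate samples (the bound itself does not require it).

    import numpy as np

    rng = np.random.default_rng(1)
    P, sigma2, n = 4.0, 1.0, 1_000_000       # assumed values, purely illustrative
    x = rng.normal(0.0, np.sqrt(P), n)
    z = rng.normal(0.0, np.sqrt(sigma2), n)  # Gaussian only for the simulation
    y = x + z

    alphas = np.linspace(0.0, 1.0, 201)
    bounds = [0.5 * np.log2(P / np.mean((x - a * y) ** 2)) for a in alphas]
    i = int(np.argmax(bounds))
    print("best alpha ~", alphas[i], " best bound ~", bounds[i])
    print("Gaussian capacity:", 0.5 * np.log2(1 + P / sigma2))

The maximizing α and the value of the best bound are exactly what parts (c) and (d) ask you to obtain analytically.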

Problem 5
Consider a Gaussian channel with two parallel links,
    Y1 = X1 + Z1,
    Y2 = X2 + Z2,

with a total power constraint P, and noise covariance matrix

    E[ (Z1, Z2)ᵀ (Z1, Z2) ] = [ 7  2 ]
                              [ 2  4 ].

(a) Let P = 4. Find the capacity of this channel, and determine the optimum power allocation.
(b) Now assume P = 10. Find the channel capacity.
(c) What is the best input for this channel for P = 10? How should X1 and X2 be related?
(d) Let P = 10, and assume Alice chooses X1 and Bob chooses X2, where Bob has no knowledge of Alice's choice. In this case, how would you distribute the power between Alice and Bob?
(e) How much do we lose in terms of capacity compared to part (b)?
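
As a sanity check for parts (a) and (b), the standard water-filling computation can be scripted. Since Z1 and Z2 are correlated, the sketch below allocates power over the eigenvalues of the noise covariance (i.e., in the noise eigenbasis), which is the usual reduction for correlated parallel Gaussian channels; treat the numbers only as a check on your analytical answers.

    import numpy as np

    def waterfill(noise_levels, P, iters=100):
        # Bisection for the water level nu with sum(max(nu - N, 0)) = P.
        lo, hi = min(noise_levels), max(noise_levels) + P
        for _ in range(iters):
            nu = 0.5 * (lo + hi)
            if sum(max(nu - N, 0.0) for N in noise_levels) > P:
                hi = nu
            else:
                lo = nu
        return [max(nu - N, 0.0) for N in noise_levels]

    Kz = np.array([[7.0, 2.0], [2.0, 4.0]])
    levels = np.linalg.eigvalsh(Kz)          # effective independent noise levels
    for P in (4.0, 10.0):
        powers = [float(p) for p in waterfill(list(levels), P)]
        C = sum(0.5 * np.log2(1 + p / N) for p, N in zip(powers, levels))
        print(f"P={P}: powers={powers}, C={C:.4f} bits per channel use")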
