
UNIVERSITY OF OULU
Department of Electrical and Information Engineering / Telecommunication Laboratory

521315A Basics of Information Theory
Exam 18 November 2008
Leave a margin of two columns at the right-hand side of each page. Mark clearly where the
solution to a problem ends and whether it continues on a following page or sheet.
1. The joint probability mass function p(x, y) is given in Table 1 below. Calculate
a) the entropy of Y
b) the joint entropy of X and Y
c) the conditional entropy of X when the realisation of Y is known
d) the mutual information between X and Y.

Table 1. Joint distribution p(x, y).

             X = 1    X = 2    X = 3    X = 4
    Y = 1     1/8     1/16     1/32     1/32
    Y = 2    1/16      1/8     1/32     1/32
    Y = 3    1/16     1/16     1/16     1/16
    Y = 4     1/4        0        0        0
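
A minimal Python sketch for checking the answers, assuming Table 1 as reconstructed above (entropies in bits; the helper names are illustrative only, not part of the exam):

    import numpy as np

    # Joint distribution p(x, y): rows index Y = 1..4, columns index X = 1..4.
    P = np.array([
        [1/8,  1/16, 1/32, 1/32],
        [1/16, 1/8,  1/32, 1/32],
        [1/16, 1/16, 1/16, 1/16],
        [1/4,  0,    0,    0],
    ])

    def H(p):
        """Shannon entropy in bits, ignoring zero-probability entries."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    H_Y  = H(P.sum(axis=1))           # a) entropy of the marginal of Y
    H_XY = H(P.flatten())             # b) joint entropy H(X, Y)
    H_X_given_Y = H_XY - H_Y          # c) chain rule: H(X | Y) = H(X, Y) - H(Y)
    I_XY = H(P.sum(axis=0)) - H_X_given_Y   # d) I(X; Y) = H(X) - H(X | Y)

    print(f"H(Y) = {H_Y}, H(X,Y) = {H_XY}, H(X|Y) = {H_X_given_Y}, I(X;Y) = {I_XY}")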

2. Consider a Bernoulli source X ~ Bernoulli(1/2). Define a distortion measure as


d(x, x̂) = 0,  if x̂ = x
d(x, x̂) = 2,  if x = 1, x̂ = 0
d(x, x̂) = ∞,  if x = 0, x̂ = 1.

Derive the rate-distortion function R(D) for this source and this distortion measure.
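
As a hint, the derivation starts from the standard definition of the rate-distortion function as a constrained minimization of mutual information (in LaTeX notation):

    % Rate-distortion function: minimize I(X; \hat X) over all test channels
    % p(\hat x | x) whose expected distortion does not exceed D.
    R(D) = \min_{p(\hat{x} \mid x)\,:\; \mathbb{E}[d(X, \hat{X})] \le D} I(X; \hat{X})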


3. Consider the discrete memoryless channel shown in Figure 1. The random variable Z has the
distribution P(Z = 0) = α, P(Z = a) = 1 − α, where 0 < α < 1. The parameter a is a real
number. The random variable X ∈ {0, 1}, and X and Z are independent. Derive the
information capacity of the channel by writing out the mutual information and maximizing it
with respect to the distribution of X.

Figure 1. A discrete memoryless channel with output Y.
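
Since the figure is not reproduced here, the following Python sketch assumes the usual additive model Y = X + Z for this kind of channel; the function names and the example values a = 2, α = 0.3 are illustrative only. It evaluates I(X; Y) and maximizes it over p = P(X = 1) by grid search:

    import numpy as np

    def entropy(probs):
        """Shannon entropy in bits of a probability vector, ignoring zeros."""
        probs = np.asarray([q for q in probs if q > 0])
        return -np.sum(probs * np.log2(probs))

    def mutual_information(p, alpha, a):
        """I(X; Y) = H(Y) - H(Y | X), assuming Y = X + Z.

        If a is not in {-1, 0, 1}, the four outputs 0, 1, a, 1 + a are
        distinct, so Y determines X exactly; otherwise some outputs
        coincide and the capacity drops.
        """
        # Collect P(Y = y) by adding the point masses of X + Z.
        py = {}
        for x, px in ((0, 1 - p), (1, p)):
            for z, pz in ((0.0, alpha), (a, 1 - alpha)):
                py[x + z] = py.get(x + z, 0.0) + px * pz
        h_y = entropy(py.values())
        h_y_given_x = entropy([alpha, 1 - alpha])  # H(Y|X) = H(Z), as X and Z are independent
        return h_y - h_y_given_x

    # Maximize I(X; Y) over p by grid search, e.g. for a = 2, alpha = 0.3.
    alpha, a = 0.3, 2.0
    ps = np.linspace(0.001, 0.999, 999)
    vals = [mutual_information(p, alpha, a) for p in ps]
    print("C ≈", max(vals), "bits/use at p ≈", ps[int(np.argmax(vals))])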

4. Let us consider lossless source coding.


a) What are non-singular, uniquely decodable, and prefix-free source codes? What is the
relationship between these code classes? In other words, do some codes belong to
multiple classes?
b) What is Kraft's inequality, and what is its importance in analysing source coding?
(A small numerical illustration follows below.)
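
A small numerical illustration of part b) in Python (the helper name kraft_sum is illustrative): for a D-ary code with codeword lengths l_i, a uniquely decodable (in particular, prefix-free) code with those lengths exists if and only if the Kraft sum is at most one.

    def kraft_sum(lengths, D=2):
        """Sum of D**(-l) over the codeword lengths l (the Kraft sum)."""
        return sum(D ** -l for l in lengths)

    print(kraft_sum([1, 2, 3, 3]))  # lengths of e.g. {0, 10, 110, 111}: 1.0, prefix-free code exists
    print(kraft_sum([1, 1, 2]))     # 1.25 > 1: no uniquely decodable binary code has these lengths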
5. Consider a discrete source producing independent, identically distributed symbols A, B, C, D,
one every 1 ms, each with equal probability 1/4. The source is transmitted through
a radio channel whose one-sided bandwidth is W = 1 kHz. The channel attenuates the signal
by 10 dB. Additive white Gaussian noise with a one-sided power spectral density of
N0 = 1 W/Hz is added to the signal at the receiver.
a) Explain under what conditions the source can be reliably transmitted through the
channel.
b) What is the minimum transmit power required for reliable communication?
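
A Python sketch of the power calculation in part b), assuming the standard band-limited AWGN capacity formula C = W log2(1 + P / (N0 W)) and the values given above (variable names are illustrative):

    import math

    W = 1e3                    # one-sided bandwidth in Hz
    N0 = 1.0                   # one-sided noise PSD in W/Hz, as given
    R = math.log2(4) / 1e-3    # 2 bits per symbol, one symbol per 1 ms -> 2000 bit/s
    atten_dB = 10              # channel attenuation

    # Reliable transmission requires R <= C = W * log2(1 + P_rx / (N0 * W)).
    P_rx_min = (2 ** (R / W) - 1) * N0 * W        # minimum received power
    P_tx_min = P_rx_min * 10 ** (atten_dB / 10)   # undo the 10 dB attenuation
    print(P_rx_min, P_tx_min)                     # 3000 W received, 30000 W transmitted with these numbers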
