
TIT3441 Information Theory Tutorial 1

Part A

1. A source emits one of four possible symbols during each signalling interval. The
symbols occur with the probabilities:

p0 = 0.4
p1 = 0.3
p2 = 0.2
p3 = 0.1

Find the amount of information gained by observing the source emitting each of these
symbols.
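As a quick numerical check for Question 1 (a sketch, not the worked solution), the self-information of a symbol with probability p is I = -log2 p bits:

```python
import math

# Probabilities of the four symbols from Question 1
probs = {"s0": 0.4, "s1": 0.3, "s2": 0.2, "s3": 0.1}

# Self-information I(s) = -log2 p(s), in bits
info = {s: -math.log2(p) for s, p in probs.items()}

for s, i in info.items():
    print(f"I({s}) = {i:.4f} bits")
```

Note that the less probable the symbol, the more information its occurrence conveys.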

2. Assume an equal number of each of the grades A, B, C, D and F are given in a certain
course. How much information in bits have you received when the instructor tells you
that your grade is not F? How much information do you need to determine your grade?

3. Given a DMS, S = {s0, s1, s2, s3, s4}, where the probabilities of occurrence of the first
four symbols are p0 = 1/3, p1 = 1/4, p2 = 1/5, and p3 = 1/6. The source emits one of the
five symbols once every millisecond. Find the information rate of this source.
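The arithmetic in Question 3 can be checked with a short sketch (the missing probability p4 follows from normalisation, and the symbol rate of one symbol per millisecond is 1000 symbols/s):

```python
import math
from fractions import Fraction

# First four probabilities from Question 3; p4 follows from normalisation
probs = [Fraction(1, 3), Fraction(1, 4), Fraction(1, 5), Fraction(1, 6)]
probs.append(1 - sum(probs))  # remaining probability mass for s4

# Entropy H = -sum p log2 p, in bits/symbol
H = -sum(float(p) * math.log2(float(p)) for p in probs)

rate = H * 1000  # one symbol per millisecond => 1000 symbols/s
print(f"H = {H:.4f} bits/symbol, R = {rate:.1f} bits/s")
```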

4. A source emits one of four symbols s0, s1, s2, and s3 with probabilities 1/3, 1/6, 1/4, and
1/4, respectively. The successive symbols emitted by the source are statistically
independent. Calculate the entropy of the source.

5. Let X represent the outcome of a single roll of a fair die. What is the entropy of X?

6. Consider a discrete memoryless source (DMS) with source alphabet S = {s0, s1, s2} and
source statistics {0.7, 0.15, 0.15}.
(a) Calculate the entropy of the source.
(b) Calculate the entropy of the second-order extension of the source.
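As a way of verifying a hand calculation for Question 6 (a sketch, assuming the standard result that for a DMS the n-th extension satisfies H(S^n) = n H(S)):

```python
import math
from itertools import product

# Source statistics from Question 6
probs = [0.7, 0.15, 0.15]

def entropy(ps):
    # H = -sum p log2 p, in bits; zero-probability terms contribute nothing
    return -sum(p * math.log2(p) for p in ps if p > 0)

H1 = entropy(probs)

# Second-order extension: probabilities of all ordered symbol pairs,
# which are products because successive symbols are independent
pair_probs = [p * q for p, q in product(probs, repeat=2)]
H2 = entropy(pair_probs)

print(f"H(S) = {H1:.4f} bits, H(S^2) = {H2:.4f} bits")
```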

Part B

7. Consider the four codes listed below:

Symbol    Code I    Code II    Code III    Code IV
s0        0         0          0           00
s1        10        01         01          01
s2        110       001        011         10
s3        1110      0010       110         110
s4        1111      0011       111         111

(a) Two of these four codes are prefix codes. Identify them, and construct their
individual decision trees.

Trimester 1 Session 2010/11 NMA



(b) Apply the Kraft-McMillan inequality to codes I, II, III, and IV. Discuss your results
in light of those obtained in part (a).
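Part (b) of Question 7 can be checked numerically; the Kraft-McMillan sum for a binary code is the sum of 2^(-l) over all code word lengths l (a sketch using the code words from the table above):

```python
# Code words for Codes I-IV from the table in Question 7
codes = {
    "I":   ["0", "10", "110", "1110", "1111"],
    "II":  ["0", "01", "001", "0010", "0011"],
    "III": ["0", "01", "011", "110", "111"],
    "IV":  ["00", "01", "10", "110", "111"],
}

# Kraft-McMillan inequality: sum of 2^(-len(w)) over code words w must be <= 1
# for any uniquely decodable code
kraft = {name: sum(2 ** -len(w) for w in cw) for name, cw in codes.items()}

for name, k in kraft.items():
    print(f"Code {name}: Kraft sum = {k}")
```

Bear in mind that satisfying the inequality is necessary for unique decodability but does not by itself make a code a prefix code.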

8. A DMS has an alphabet of seven symbols whose probabilities of occurrence are as
described here:

Symbol        s0      s1      s2       s3       s4       s5       s6
Probability   0.25    0.25    0.125    0.125    0.125    0.0625   0.0625

Compute the Huffman code of this source, moving a “combined” symbol as high as
possible. Explain why the computed source code has an efficiency of 100 percent.
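A compact way to check the resulting average code word length for Question 8 is the heap-based construction below. This sketch does not implement the "combined symbol as high as possible" tie-break asked for in the question, but because all the probabilities here are powers of 1/2, every Huffman tree for this source yields the same code word lengths:

```python
import heapq
from itertools import count

# Symbol probabilities from Question 8
probs = {"s0": 0.25, "s1": 0.25, "s2": 0.125, "s3": 0.125,
         "s4": 0.125, "s5": 0.0625, "s6": 0.0625}

def huffman(probs):
    """Build a binary Huffman code; returns {symbol: code word}."""
    tiebreak = count()  # unique counter keeps heap entries comparable
    heap = [(p, next(tiebreak), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        # repeatedly merge the two least probable subtrees
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
    return heap[0][2]

code = huffman(probs)
avg_len = sum(probs[s] * len(w) for s, w in code.items())
print(code)
print(f"average code word length = {avg_len} bits/symbol")
```

Because each probability is of the form 2^(-l), the average length equals the source entropy exactly, which is why the efficiency is 100 percent.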

9. Figure 1 shows a Huffman tree. What is the code word for each of the symbols A, B,
C, D, E, F, and G represented by this Huffman tree? What are their individual code
word lengths?

Figure 1

10. Consider the following binary sequence:

11101001100010110100…
Use the Lempel-Ziv algorithm to encode this sequence. Assume that the binary symbols
0 and 1 are already in the codebook.
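The exact parse depends on which Lempel-Ziv variant is used. The sketch below follows a simple LZ78-style rule, consistent with the question's assumption that 0 and 1 are already stored: at each step, take the longest phrase already in the codebook, extend it by one more bit, and add the extended phrase to the codebook.

```python
def lz_parse(bits):
    """LZ78-style parse of a binary string, with '0' and '1' pre-stored."""
    book = ["0", "1"]  # initial codebook, per the question
    phrases = []
    i = 0
    while i < len(bits):
        # grow j while bits[i:j] remains a phrase already in the codebook
        j = i + 1
        while j < len(bits) and bits[i:j + 1] in book:
            j += 1
        # new phrase = longest known phrase plus one extra bit (if bits remain)
        phrase = bits[i:j + 1] if j < len(bits) else bits[i:j]
        if phrase not in book:
            book.append(phrase)
        phrases.append(phrase)
        i += len(phrase)
    return phrases, book

phrases, book = lz_parse("11101001100010110100")
print(phrases)
```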

