Tutorial 1
Part A
1. A source emits one of four possible symbols during each signalling interval. The
symbols occur with the probabilities:
p0 = 0.4
p1 = 0.3
p2 = 0.2
p3 = 0.1
Find the amount of information gained by observing the source emitting each of these
symbols.
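The self-information of each symbol can be checked numerically with a short Python sketch, using I(s_k) = -log2(p_k):

```python
import math

# Self-information I(s_k) = -log2(p_k), measured in bits.
probs = [0.4, 0.3, 0.2, 0.1]
info = [-math.log2(p) for p in probs]
for p, i in zip(probs, info):
    print(f"p = {p}: I = {i:.3f} bits")
```

The rarer the symbol, the more information its observation conveys: p3 = 0.1 yields about 3.32 bits, while p0 = 0.4 yields only about 1.32 bits.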
2. Assume that an equal number of each of the grades A, B, C, D, and F is given in a
certain course. How much information in bits have you received when the instructor
tells you that your grade is not F? How much information do you need to determine
your grade?
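Since the five grades are equiprobable, the statement "not F" has probability 4/5, and the arithmetic can be checked with a short sketch:

```python
import math

# Five equiprobable grades, each with probability 1/5.
p_not_f = 4 / 5
info_not_f = -math.log2(p_not_f)   # information in "your grade is not F"
info_remaining = math.log2(4)      # four grades remain, still equiprobable
print(f"I(not F) = {info_not_f:.4f} bits")
print(f"bits still needed = {info_remaining:.1f}")
# The two quantities together account for log2(5) bits,
# the full uncertainty of the grade.
```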
3. Given a DMS, S = {s0, s1, s2, s3, s4}, where the probabilities of occurrence of the first
four symbols are p0 = 1/3, p1 = 1/4, p2 = 1/5, and p3 = 1/6. The source emits one of the
five symbols once every millisecond. Find the information rate of this source.
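The fifth probability follows from normalization, and the information rate R = H × (symbol rate) can be checked numerically:

```python
from fractions import Fraction
import math

# First four symbol probabilities; p4 follows from normalization.
probs = [Fraction(1, 3), Fraction(1, 4), Fraction(1, 5), Fraction(1, 6)]
probs.append(1 - sum(probs))  # p4 = 1/20

# Source entropy in bits per symbol.
entropy = sum(-float(p) * math.log2(p) for p in probs)

symbol_rate = 1000  # one symbol per millisecond = 1000 symbols/s
print(f"H = {entropy:.4f} bits/symbol")
print(f"R = {entropy * symbol_rate:.1f} bits/s")
```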
4. A source emits one of four symbols s0, s1, s2, and s3 with probabilities 1/3, 1/6, 1/4, and
1/4, respectively. The successive symbols emitted by the source are statistically
independent. Calculate the entropy of the source.
5. Let X represent the outcome of a single roll of a fair die. What is the entropy of X?
6. Consider a discrete memoryless source (DMS) with source alphabet S = {s0, s1, s2} and
source statistics {0.7, 0.15, 0.15}.
(a) Calculate the entropy of the source.
(b) Calculate the entropy of the second-order extension of the source.
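Both parts can be verified with a short sketch; because the source is memoryless, the second-order extension treats each ordered pair of symbols as one extended symbol with probability p·q, and its entropy should come out to exactly 2 H(S):

```python
import math
from itertools import product

probs = [0.7, 0.15, 0.15]

def entropy(ps):
    """Entropy in bits of a discrete distribution."""
    return sum(-p * math.log2(p) for p in ps if p > 0)

h1 = entropy(probs)
# Second-order extension: all ordered pairs of independent symbols.
h2 = entropy([p * q for p, q in product(probs, repeat=2)])
print(f"H(S)   = {h1:.4f} bits")
print(f"H(S^2) = {h2:.4f} bits")
```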
Part B
7. (a) Two of these four codes are prefix codes. Identify them, and construct their
individual decision trees.
(b) Apply the Kraft-McMillan inequality to codes I, II, III, and IV. Discuss your results
in light of those obtained in part (a).
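The Kraft-McMillan inequality states that a uniquely decodable code with codeword lengths l_k must satisfy Σ 2^(-l_k) ≤ 1. Since the table of codes I-IV is not reproduced here, the sketch below uses hypothetical length sets; substitute the actual codeword lengths from each code:

```python
# Kraft-McMillan check: sum of 2^(-l) over all codeword lengths must be
# <= 1 for any uniquely decodable (hence also any prefix) code.
def kraft_sum(lengths):
    return sum(2 ** -l for l in lengths)

# Hypothetical length sets (placeholders for codes I-IV):
for lengths in ([1, 2, 3, 3], [2, 2, 2, 2], [1, 1, 2, 3]):
    s = kraft_sum(lengths)
    verdict = "satisfies" if s <= 1 else "violates"
    print(f"lengths {lengths}: sum = {s} -> {verdict} the inequality")
```

Note that satisfying the inequality is necessary but not sufficient for a given code to be a prefix code, which is the point of comparing with part (a).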
8. Consider a source with the following symbol probabilities:
Symbol       s0     s1     s2      s3      s4      s5       s6
Probability  0.25   0.25   0.125   0.125   0.125   0.0625   0.0625
Compute the Huffman code of this source, moving a “combined” symbol as high as
possible. Explain why the computed source code has an efficiency of 100 percent.
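Because every probability here is a negative power of 2, the Huffman code lengths equal the self-informations exactly, so the average codeword length equals the entropy. A standard heap-based Huffman construction can confirm this; the tie-breaking below is by insertion order rather than the “as high as possible” rule, so individual codewords may differ from the hand construction, but the lengths and the average length are the same:

```python
import heapq
import math

probs = {"s0": 0.25, "s1": 0.25, "s2": 0.125, "s3": 0.125,
         "s4": 0.125, "s5": 0.0625, "s6": 0.0625}

# Heap entries: (probability, tie-break counter, symbols in this subtree).
heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
codes = {s: "" for s in probs}
counter = len(heap)
while len(heap) > 1:
    # Merge the two least probable subtrees, extending their codewords.
    p1, _, syms1 = heapq.heappop(heap)
    p2, _, syms2 = heapq.heappop(heap)
    for s in syms1:
        codes[s] = "0" + codes[s]
    for s in syms2:
        codes[s] = "1" + codes[s]
    heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
    counter += 1

avg_len = sum(probs[s] * len(codes[s]) for s in probs)
entropy = sum(-p * math.log2(p) for p in probs.values())
print(codes)
print(f"L = {avg_len}, H = {entropy}, efficiency = {entropy / avg_len:.0%}")
```

With all probabilities dyadic, L = H = 2.625 bits/symbol and the efficiency H/L is exactly 100 percent.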
9. Figure 1 shows a Huffman tree. What is the code word for each of the symbols A, B,
C, D, E, F, and G represented by this Huffman tree? What are their individual code
word lengths?
Figure 1