Answer Key ESE - Apr - 2018
1. a)
i. Identify the equations of source efficiency and redundancy.
Source efficiency η = H(S)/Hmax, where Hmax = log2(q) for a q-symbol source;
source redundancy γ = 1 − η.
1M + 1M
ii. Consider a source emitting one of three symbols A, B and C with respective
probabilities 0.7, 0.15, and 0.15. Find the self-information conveyed by each symbol
and compare. Also calculate its efficiency and redundancy.
Given: P(A) = 0.7, P(B) = P(C) = 0.15
Self-information I(s) = log2(1/P(s)) bits:
I(A) = log2(1/0.7) = 0.5146 bits
I(B) = log2(1/0.15) = 2.7370 bits
I(C) = log2(1/0.15) = 2.7370 bits
Comparison: the most probable symbol A conveys the least information; the rarer
symbols B and C each convey more than five times as much.
H(S) = 0.7(0.5146) + 0.15(2.7370) + 0.15(2.7370) = 1.1813 bits/symbol
Efficiency η = H(S)/log2(3) = 1.1813/1.5850 = 74.53 %
Redundancy γ = 1 − η = 25.47 %
8M
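As a quick numerical check (a verification sketch, not part of the original key; symbol names follow the question), the self-information, entropy, and efficiency figures can be recomputed in Python:

```python
import math

def self_info(p):
    """Self-information I(s) = log2(1/p) in bits."""
    return -math.log2(p)

probs = {"A": 0.7, "B": 0.15, "C": 0.15}
info = {s: self_info(p) for s, p in probs.items()}

# Entropy, maximum entropy for a 3-symbol source, efficiency, redundancy
H = sum(p * self_info(p) for p in probs.values())
H_max = math.log2(len(probs))
efficiency = H / H_max
redundancy = 1 - efficiency
```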
b)
i. A discrete memoryless source emits five symbols every 2 ms. The symbol probabilities
are {0.5, 0.25, 0.125, 0.0625, 0.0625}. Find the average information rate of the source.
Solution:
H(S) = Σ p_i log2(1/p_i) = 0.5(1) + 0.25(2) + 0.125(3) + 0.0625(4) + 0.0625(4)
     = 1.875 bits/symbol
Symbol rate r = 5/(2 × 10^-3) = 2500 symbols/s
Average information rate R = r · H(S) = 2500 × 1.875 = 4687.5 bits/s
10M
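The rate calculation can be verified with a short sketch (assuming, as the question states, five symbols every 2 ms):

```python
import math

probs = [0.5, 0.25, 0.125, 0.0625, 0.0625]

# Entropy in bits/symbol; every probability here is a power of two
H = sum(-p * math.log2(p) for p in probs)

symbol_rate = 5 / 2e-3          # 5 symbols every 2 ms -> 2500 symbols/s
R = symbol_rate * H             # average information rate in bits/s
```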
ii. A discrete source emits one of five symbols every 1 s. The symbol probabilities are
{1/8, 1/16, 3/16, 1/4, 3/8}. Find the average information rate of the source, and the
average information content of the source in nats/symbol and Hartleys/symbol.

Symbol       M1    M2    M3    M4   M5
Probability  1/8   1/16  3/16  1/4  3/8

Code lengths l = (2, 2, 3, 3, 4) in descending order of probability:

Symbol  Probability  Code  Length
M5      3/8          00    2
M4      1/4          01    2
M3      3/16         101   3
M1      1/8          110   3
M2      1/16         1111  4

H(S) = Σ p_i log2(1/p_i)
     = (3/8)log2(8/3) + (1/4)log2(4) + (3/16)log2(16/3) + (1/8)log2(8) + (1/16)log2(16)
     = 2.1085 bits/symbol
     = 2.1085 × 0.6931 = 1.4615 nats/symbol
     = 2.1085 × 0.3010 = 0.6347 Hartleys/symbol
With one symbol per second, R = r · H(S) = 1 × 2.1085 = 2.1085 bits/s
Average code length L = Σ p_i l_i = 2.4375 bits/symbol
Code efficiency η = H(S)/L = 2.1085/2.4375 = 86.50 %
Code redundancy γ = 1 − η = 13.50 %
10M
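A sketch to recompute the average code length, the entropy in all three units, and the code efficiency for this table (the `Fraction` representation is only for exactness; the values are those tabulated above):

```python
import math
from fractions import Fraction

# (probability, code length) pairs in the order M5, M4, M3, M1, M2
table = [(Fraction(3, 8), 2), (Fraction(1, 4), 2), (Fraction(3, 16), 3),
         (Fraction(1, 8), 3), (Fraction(1, 16), 4)]

L = float(sum(p * l for p, l in table))         # average code length, bits/symbol
H = sum(-float(p) * math.log2(float(p))
        for p, _ in table)                      # entropy, bits/symbol
H_nats = H * math.log(2)                        # nats/symbol
H_hartley = H * math.log10(2)                   # Hartleys/symbol
efficiency = H / L
```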
b)
i. Construct a Shannon-Fano ternary code for the following ensemble and find the code
efficiency and redundancy.

Symbol       S1   S2   S3    S4    S5    S6    S7
Probability  0.3  0.3  0.12  0.12  0.06  0.06  0.04

Symbol  Probability  Code  Length
S1      0.3          2     1
S2      0.3          1     1
S3      0.12         02    2
S4      0.12         01    2
S5      0.06         002   3
S6      0.06         001   3
S7      0.04         000   3

Average code length L = Σ p_i l_i = 2(0.3)(1) + 2(0.12)(2) + (0.06 + 0.06 + 0.04)(3)
                      = 1.56 ternary digits/symbol
H(S) = 2.4491 bits/symbol = 2.4491/log2(3) = 1.5452 ternary units/symbol
Code efficiency η = H(S)/L = 1.5452/1.56 = 99.05 %
Code redundancy γ = 1 − η = 0.95 %
10M
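The ternary figures can be checked the same way; the only twist is taking logarithms to base 3 (a verification sketch using the probabilities and code lengths above):

```python
import math

# (probability, code length in ternary digits) for S1..S7
table = [(0.3, 1), (0.3, 1), (0.12, 2), (0.12, 2),
         (0.06, 3), (0.06, 3), (0.04, 3)]

L = sum(p * l for p, l in table)                  # average length, trits/symbol
H3 = sum(-p * math.log(p, 3) for p, _ in table)   # entropy in ternary units
efficiency = H3 / L
```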
ii. Given the messages s1, s2, s3 and s4 with respective probabilities 0.4, 0.3, 0.2 and 0.1,
construct a binary code by applying the Huffman encoding procedure. Determine the
efficiency and redundancy of the code so formed.
10M
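One way to carry out the Huffman merging and check the resulting efficiency (a sketch; the `huffman_lengths` helper is illustrative and returns only the code lengths, which are all that L and η need):

```python
import heapq
import math

def huffman_lengths(probs):
    """Binary Huffman: repeatedly merge the two least probable subtrees;
    every symbol inside a merged subtree gains one code bit."""
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)            # tie-breaker for equal probabilities
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.4, 0.3, 0.2, 0.1]        # s1..s4
lengths = huffman_lengths(probs)
L = sum(p * l for p, l in zip(probs, lengths))
H = sum(-p * math.log2(p) for p in probs)
efficiency = H / L
```

This yields code lengths (1, 2, 3, 3), L = 1.9 bits/symbol and η ≈ 97.2 %, i.e. a redundancy of about 2.8 %.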
3. a) A transmitter has an alphabet consisting of five letters {a1, a2, a3, a4, a5} and the receiver
has an alphabet of four letters {b1, b2, b3, b4}. The joint probability matrix (JPM) of the system is given below.
[ ]
[ ]
10M
b) Consider a discrete memoryless source with S = {C, M, O, E} and respective probabilities
P={0.4, 0.1, 0.2, 0.3}. Develop the code word for the message “COME” using arithmetic
coding.
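A sketch of the interval-narrowing step of arithmetic coding for “COME” (the cumulative ordering C, M, O, E is an assumption; any fixed ordering agreed between encoder and decoder works):

```python
probs = {"C": 0.4, "M": 0.1, "O": 0.2, "E": 0.3}

def encode_interval(message, probs):
    """Narrow [low, high) once per symbol; any number in the final
    interval identifies the message."""
    cum, run = {}, 0.0
    for s, p in probs.items():      # cumulative lower bound per symbol
        cum[s] = run
        run += p
    low, high = 0.0, 1.0
    for s in message:
        width = high - low
        low, high = (low + width * cum[s],
                     low + width * (cum[s] + probs[s]))
    return low, high

low, high = encode_interval("COME", probs)
tag = (low + high) / 2              # a single number that decodes to "COME"
```

With this ordering the message narrows to the interval [0.2376, 0.24), and any tag inside it represents “COME”.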
4. a) List the detailed steps involved in the development of an adaptive Huffman code tree and
discuss the sibling property.
b) Discuss
i. Perceptual coding 3M
ii. MPEG audio Layers I, II, III 4M
iii. Psychoacoustic model 3M
5. c) Compare the following lossless compression methods 10M
i. RLE
ii. Huffman Coding
iii. LZW Coding
[ ] [ ] Draw the encoder circuit and obtain all possible code vectors.
b) i. Explain the types of errors seen in a digital communication system with examples
ii. List out the differences between block codes and convolutional codes
8. a) the parity check bits of a (7,4) block code are generated by
c5=d1+d3+d4
c6=d1+d2+d3
c7=d2+d3+d4
i. Obtain the generator matrix and parity check matrix for this code
ii. Show that GH^T = 0
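The matrices follow directly from the stated parity equations, assuming the systematic codeword ordering (d1 d2 d3 d4 c5 c6 c7); a sketch that builds G = [I4 | P] and H = [P^T | I3] and verifies G·H^T = 0 over GF(2):

```python
# Parity sub-matrix P: row i lists which of (c5, c6, c7) data bit d(i+1) feeds
P = [
    [1, 1, 0],   # d1 -> c5, c6
    [0, 1, 1],   # d2 -> c6, c7
    [1, 1, 1],   # d3 -> c5, c6, c7
    [1, 0, 1],   # d4 -> c5, c7
]
I4 = [[int(i == j) for j in range(4)] for i in range(4)]
I3 = [[int(i == j) for j in range(3)] for i in range(3)]

G = [I4[i] + P[i] for i in range(4)]                         # G = [I4 | P]
H = [[P[j][i] for j in range(4)] + I3[i] for i in range(3)]  # H = [P^T | I3]

# G * H^T over GF(2): every entry must be 0
GHt = [[sum(g_row[k] * h_row[k] for k in range(7)) % 2
        for h_row in H] for g_row in G]
```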
b) Construct the standard array to perform syndrome decoding for (6,3) code having a
parity matrix
[ ] [ ]
9. a) Consider the (3,1,2) convolutional code with g(1)=(110), g(2)=(101), g(3)=(111).
i. Find the constraint length
ii. Find the rate
iii. Draw the encoder block diagram
iv. Find the codeword for the message sequence (11101) using the time-domain approach
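For the codeword, the time-domain approach is the mod-2 convolution of the message with each generator, the three output streams interleaved per time step; a sketch:

```python
generators = [(1, 1, 0), (1, 0, 1), (1, 1, 1)]   # g(1), g(2), g(3)
message = [1, 1, 1, 0, 1]

def conv_encode(u, gens):
    """v_j[t] = sum_k g_j[k] * u[t-k] (mod 2); outputs interleaved per t."""
    n_out = len(u) + len(gens[0]) - 1
    out = []
    for t in range(n_out):
        for gj in gens:
            bit = sum(gj[k] * u[t - k]
                      for k in range(len(gj)) if 0 <= t - k < len(u)) % 2
            out.append(bit)
    return out

codeword = conv_encode(message, generators)
```

This encodes (11101) into 21 output bits, 7 time steps of 3 bits each: 111 010 001 110 100 101 011.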
b) i. Write the convolutional encoder circuit for K = 3, r = 1/3.
ii. Draw the state table for the encoder circuit obtained in (i)
iii. Draw the trellis diagram and encode the message “110101”.
Assumptions: 2M
Drawing State machine: 3M
Decoding and final output: 5M