
UNIT- I

INFORMATION THEORY
This unit deals with various aspects of digital communication by considering
discrete memoryless sources and channels. It builds a clear understanding of
information theory, on the basis of which the various coding techniques and
their applications can be understood.

Introduction
1. Describe digital communication. List the elements of a basic communication system. U
2. What are the major advantages and disadvantages of a digital communication system? U
3. Show how the performance of a digital communication system is measured. A
Discrete Memoryless Source- Information, Entropy, Mutual
Information
4. Write about Discrete memoryless Source. R
5. Outline Information. Summarize the Properties of Information. R
6. Define Information Theory and mention the basic setups in R
Information Theory.
7. Comment on the Scope of Information Theory. U
8. Deliberate the primary assumptions that are made about the R
source.
9. Let p denote the probability of some event. Plot the amount of A
information gained by the occurrence of this event for 0 ≤ p ≤ 1.
10. A source emits one of four symbols during each signaling interval. The symbols occur with the probabilities p0 = 0.4, p1 = 0.3, p2 = 0.2, and p3 = 0.1. Find the amount of information gained by observing the source emitting each of these symbols. A
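For questions 9 and 10, the amount of information gained by observing a symbol of probability p is I = log2(1/p) bits. A minimal Python sketch (the function name is illustrative; the probabilities are those of question 10):

    import math

    def information_bits(p):
        # amount of information gained by a symbol of probability p, in bits
        return math.log2(1.0 / p)

    probs = {"s0": 0.4, "s1": 0.3, "s2": 0.2, "s3": 0.1}   # question 10
    for sym, p in probs.items():
        print(sym, information_bits(p), "bits")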
11. Define the entropy of a discrete memoryless source. Give its properties. R
12. Let X represent the outcome of a single roll of a fair die. What is the entropy of X? A
13. Consider a discrete memoryless source with source alphabet S = {s0, s1, s2} and probabilities p(s0) = p0 = 1/4, p(s1) = p1 = 1/4, and p(s2) = p2 = 1/2. Hence, using a suitable equation, find the entropy of the source. A
14. A source emits one of four symbols s0, s1, s2 and s3 with probabilities 1/3, 1/6, 1/4, and 1/4 respectively. The successive symbols emitted by the source are statistically independent. Calculate the entropy of the source. A
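The entropy calculations in questions 12-14 all follow from H(S) = Σ pk log2(1/pk) bits per symbol. A short sketch, assuming the probabilities of questions 12 and 13:

    import math

    def entropy(probs):
        # H(S) = sum of p*log2(1/p) over all symbols with nonzero probability
        return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

    print(entropy([0.25, 0.25, 0.5]))   # question 13: 1.5 bits/symbol
    print(entropy([1/6] * 6))           # question 12: fair die, log2(6) ≈ 2.585 bits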
15. Define Extension Coding and Outline conditional Entropy. R
16. Define Mutual Information. Mention the properties of Mutual R
Information.
17. Show that the mutual information of a channel is Symmetric. A
18. Express Information Capacity. Mention the stages considered for R
the evaluation of information capacity.
Discrete Memoryless Channels
19. Write about discrete memoryless channels. R
20. Write about Joint probability distribution and marginal probability R
distribution.
21. Define average probability of symbol error. R
Binary Symmetric Channel
22. Write about Binary Symmetric Channel. R
23. Consider the transition probability diagram of a binary A
symmetric channel shown in figure below. The input binary
symbols 0 and 1 occur with equal probability. Find the
probabilities of the binary symbols 0 and 1 appearing at the
channel output.
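For question 23, the output probabilities of a binary symmetric channel follow from the total probability theorem: p(y = 0) = p(x = 0)(1 - p) + p(x = 1)p, and similarly for y = 1. A small sketch (the transition probability p is an assumed value, since the figure is not reproduced here):

    def bsc_output_probs(p0, p):
        # p0: probability of input symbol 0; p: transition (crossover) probability
        p_y0 = p0 * (1 - p) + (1 - p0) * p
        return p_y0, 1 - p_y0

    # with equiprobable inputs the outputs are also equiprobable for any p
    print(bsc_output_probs(0.5, 0.1))   # (0.5, 0.5)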

Channel Capacity
24. Define channel capacity. Write about the channel capacity of an U
AWGN Channel.
25. Describe Additive White Gaussian Noise (AWGN). Mention its R
Properties.
26. Show how many decoding spheres can be packed inside the A
larger sphere of received vectors?
27. A voice-grade channel of the telephone network has a bandwidth of 3.4 kHz. A
a) Calculate the channel capacity of the telephone channel for a signal-to-noise ratio of 30 dB.
b) Calculate the minimum signal-to-noise ratio required to support information transmission through the telephone channel at the rate of 4800 bits per second.
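Questions 27-29 all rest on the Shannon-Hartley formula C = B log2(1 + SNR), with the SNR converted from dB to a linear ratio. A hedged sketch for the two parts of question 27 (function names are illustrative):

    import math

    def capacity_bps(bandwidth_hz, snr_db):
        snr = 10 ** (snr_db / 10)                  # dB -> linear power ratio
        return bandwidth_hz * math.log2(1 + snr)   # Shannon-Hartley law

    def min_snr_db(bandwidth_hz, rate_bps):
        snr = 2 ** (rate_bps / bandwidth_hz) - 1   # invert C = B*log2(1 + SNR)
        return 10 * math.log10(snr)

    print(capacity_bps(3400, 30))    # part (a): roughly 33.9 kb/s
    print(min_snr_db(3400, 4800))    # part (b): roughly 2.2 dB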
28. Alphanumeric data are entered into a computer from a remote terminal through a voice-grade telephone channel. The channel has a bandwidth of 3.4 kHz and an output signal-to-noise ratio of 20 dB. The terminal has a total of 128 symbols. Assume that the symbols are equiprobable and that successive transmissions are statistically independent. A
a) Calculate the channel capacity
b) Calculate the maximum symbol rate for which error-free
transmission over the channel is possible.
29. A black-and-white television picture may be viewed as consisting of approximately 3 × 10^5 elements, each of which may occupy one of 10 distinct brightness levels with equal probability. Assume that (a) the rate of transmission is 30 picture frames per second, and (b) the signal-to-noise ratio is 30 dB. Using the channel capacity theorem, calculate the minimum bandwidth required to support the transmission of the resultant video signal. A
Hartley – Shannon Law
30. State Hartley-Shannon Law. R
31. State Shannon’s information capacity Theorem. R
32. Write about the Shannon limit for an AWGN channel. U
33. A binary symbol occurs with a probability of 0.75. Determine the information associated with the symbol in bits, nats and Hartleys. A
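For question 33, the same quantity of information can be expressed with different logarithm bases: base 2 gives bits, base e gives nats, base 10 gives Hartleys. A minimal sketch:

    import math

    p = 0.75
    print(math.log2(1 / p), "bits")       # ≈ 0.415 bits
    print(math.log(1 / p), "nats")        # ≈ 0.288 nats
    print(math.log10(1 / p), "Hartleys")  # ≈ 0.125 Hartleys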
34. Mention Shannon's fundamental bound for perfect security. R
35. Write about Confusion and diffusion in Shannon model of R
cryptography.
36. Give the Applications of the Shannon-Hartley theorem to data A
streams and sparse recovery.
Source Coding Theorem
37. Comment on source encoder. List the functional requirements of R
an efficient source encoder.
38. Write about variable length code. R
39. Define Source Coding Theorem. R
40. Consider the four codes listed below. Two of these four codes are prefix codes. Identify them, and construct their individual decision trees. A
Symbols  Code I  Code II  Code III  Code IV
S0       0       0        0         00
S1       10      01       01        01
S2       110     001      011       10
S3       1110    0010     110       110
S4       1111    0011     111       111
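For question 40, a code satisfies the prefix condition when no complete code word is a prefix of any other code word. A small check in Python over the code words of the table above:

    def is_prefix_code(words):
        # True if no code word is a prefix of another (instantaneous code)
        return not any(w != v and v.startswith(w) for w in words for v in words)

    codes = {
        "Code I":   ["0", "10", "110", "1110", "1111"],
        "Code II":  ["0", "01", "001", "0010", "0011"],
        "Code III": ["0", "01", "011", "110", "111"],
        "Code IV":  ["00", "01", "10", "110", "111"],
    }
    for name, words in codes.items():
        print(name, is_prefix_code(words))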
41. Write about Rate Distortion Theory. R
42. Define Prefix Code and Prefix condition. R
43. Describe Kraft-McMillan inequality. U
44. Describe uniquely decipherable codes. U
45. A source with the probability distribution shown in the table below is to be encoded. Four candidate codes for this source are also given. Find the efficiency of each code. Also determine whether or not each of these codes is uniquely decipherable. A
Symbols  Prob.  Code I  Code II  Code III  Code IV
A        1/2    00      0        0         0
B        1/4    01      01       10        1
C        1/8    10      011      110       10
D        1/8    11      0111     111       11
L (average length)  2.0   1.875   1.75   1.25
K (Kraft sum)       1.0   0.9375  1.0    1.5
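For question 45, the average length is L = Σ pk·lk, the Kraft-McMillan sum is K = Σ 2^(-lk), and the coding efficiency is η = H(S)/L; the L and K rows of the table can be reproduced directly. A sketch under the stated probabilities:

    import math

    probs = [1/2, 1/4, 1/8, 1/8]
    codes = {
        "Code I":   ["00", "01", "10", "11"],
        "Code II":  ["0", "01", "011", "0111"],
        "Code III": ["0", "10", "110", "111"],
        "Code IV":  ["0", "1", "10", "11"],
    }
    H = sum(p * math.log2(1 / p) for p in probs)           # source entropy = 1.75 bits
    for name, words in codes.items():
        L = sum(p * len(w) for p, w in zip(probs, words))  # average code-word length
        K = sum(2 ** -len(w) for w in words)               # Kraft-McMillan sum
        print(name, round(L, 4), round(K, 4), round(H / L, 4))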
46. Write about Compact Codes. R
47. Describe the source information rate. U
48. An inventor claims to have developed a device for storing A
arbitrary English text in computer memory; the device requires 1
bit of memory for every 10 Latin letters. The entropy of English
has been estimated by researchers to be in the range 1-2 bits per
letter. The estimate takes account of the fact that some English
letters are more likely than others and that successive letters in
English text are highly dependent. Given this estimate of the
entropy of English, is the claim made by the inventor correct?
Justify your answer.
Shannon - Fano & Huffman Codes
49. Outline Huffman Code. U
50. Write about Huffman encoding algorithm. U
51. Define Efficiency of the code and Code word Length. R
52. Define the Lempel-Ziv algorithm. Mention the various steps in the Lempel-Ziv algorithm. R
53. Define Universal source coding algorithm. R
54. Write the algorithm of Shannon-Fano code. R
55. Mention the initial conditions of Fano algorithm. U
56. The source distribution is shown in Table below. Design a A
Shannon-Fano code for this source.
Symbols  Prob.  Coding steps (1 2 3 4 5 6)  Code word
m1       0.50   0                           0
m2       0.15   1 0 0                       100
m3       0.15   1 0 1                       101
m4       0.08   1 1 0                       110
m5       0.08   1 1 1 0                     1110
m6       0.02   1 1 1 1 0                   11110
m7       0.01   1 1 1 1 1 0                 111110
m8       0.01   1 1 1 1 1 1                 111111
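The construction behind the table of question 56 can be reproduced programmatically: keep the symbols sorted by decreasing probability, split the list where the two halves have the most nearly equal total probability, assign 0 to one half and 1 to the other, and recurse. A sketch, assuming the probabilities listed above:

    def shannon_fano(symbols):
        # symbols: list of (name, probability) sorted by decreasing probability
        if len(symbols) <= 1:
            return {symbols[0][0]: ""} if symbols else {}
        total, running, split, best = sum(p for _, p in symbols), 0.0, 1, float("inf")
        for i in range(1, len(symbols)):
            running += symbols[i - 1][1]
            if abs(2 * running - total) < best:
                best, split = abs(2 * running - total), i
        code = {s: "0" + w for s, w in shannon_fano(symbols[:split]).items()}
        code.update({s: "1" + w for s, w in shannon_fano(symbols[split:]).items()})
        return code

    source = [("m1", 0.50), ("m2", 0.15), ("m3", 0.15), ("m4", 0.08),
              ("m5", 0.08), ("m6", 0.02), ("m7", 0.01), ("m8", 0.01)]
    print(shannon_fano(source))   # reproduces the code words of the table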
57. A memoryless source emits symbols m1 and m2 with probabilities A
0.8 and 0.2 respectively. Find the binary Huffman code for this
source. Compare the performance of this code with its second
binary extension.
58. A memoryless source emits six symbols with the probabilities shown in the table below. Find the 4-ary Huffman code and determine its average word length, efficiency and redundancy. A
         Original Source             Reduced Source
Symbols  Code  Probability           S1
m1       0     0.30                  0.30
m2       2     0.25                  0.30
m3       3     0.15                  0.25
m4       10    0.12                  0.15
m5       11    0.10
m6       12    0.08
m7       13    0.00
59. Summarize the various applications and advantages of Shannon-Fano coding and Huffman coding. U
60. Design a binary Huffman code for a discrete source of three independent symbols A, B and C with probabilities 0.9, 0.08 and 0.02, respectively. Determine the efficiency of the code. Now, to improve the efficiency, a binary second-order extension code is used for these symbols. Determine the efficiency of this new code. A
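Questions 57-60 all build Huffman codes by repeatedly merging the two least probable nodes. A minimal binary Huffman sketch (the probabilities shown are those of question 60; the 4-ary code of question 58 would need the merge step generalized to four nodes at a time, after padding with a zero-probability dummy symbol):

    import heapq, itertools, math

    def huffman_code(probs):
        # probs: dict symbol -> probability; returns dict symbol -> binary code word
        tie = itertools.count()                     # tie-breaker for equal probabilities
        heap = [[p, next(tie), {s: ""}] for s, p in probs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)         # two least probable nodes
            p2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c1.items()}
            merged.update({s: "1" + w for s, w in c2.items()})
            heapq.heappush(heap, [p1 + p2, next(tie), merged])
        return heap[0][2]

    probs = {"A": 0.9, "B": 0.08, "C": 0.02}                # question 60
    code = huffman_code(probs)
    L = sum(probs[s] * len(w) for s, w in code.items())     # average code-word length
    H = sum(p * math.log2(1 / p) for p in probs.values())   # source entropy
    print(code, H / L)                                      # efficiency ≈ 0.49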
PART B
1. a) Discuss in detail the relationship among uncertainty, information and entropy, and explain how it fits into information theory. 8 R
b) Discuss the entropy of a discrete memoryless source and prove the properties of entropy. 8 U
2. (a) Describe the discrete memoryless channels in detail. 8 U
b) Explain Binary Symmetric Channels in detail. 8 U
3. a) A signal has a Gaussian amplitude pdf with zero mean and variance σ². The signal is sampled at 500 samples per second. The samples are quantized according to the following table: 8 A
Quantizer input Quantizer Output
-∞ <xi < -2σ m0
-2σ < xi < -σ m1
-σ < xi < 0 m2
0< xi < σ m3
σ < xi < 2σ m4
2σ < xi < ∞ m5
Determine the entropy at the quantizer output and the information rate of the quantizer output in bits per second.
b) Let X1, X2, ..., Xn denote the elements of a Gaussian vector X. 8 A
The Xi are independent with means μi and variances σi², i = 1, 2, ..., n. Show that the differential entropy of the vector X equals h(X) = (n/2) log2[2πe(σ1²σ2²...σn²)^(1/n)]. What does h(X) reduce to if the variances are equal?
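For question 3(a), the probability of each quantizer output level is the area under the Gaussian pdf over the corresponding interval; since the thresholds are multiples of σ, the standard normal CDF can be used directly. A sketch using only the Python standard library:

    import math
    from statistics import NormalDist

    Phi = NormalDist().cdf                           # standard normal CDF
    edges = [-math.inf, -2, -1, 0, 1, 2, math.inf]   # thresholds in units of sigma
    p = [Phi(b) - Phi(a) for a, b in zip(edges[:-1], edges[1:])]
    H = sum(pi * math.log2(1 / pi) for pi in p)      # entropy of the quantizer output
    print(H, "bits/sample;", 500 * H, "bits/s")      # rate = 500 samples/s times H

For part 3(b), the differential entropies of independent Gaussians add, and rewriting the sum of logarithms as the logarithm of a geometric mean gives the stated formula; a quick numerical check with assumed variances:

    import math

    sigma2 = [1.0, 2.0, 0.5, 4.0]                    # assumed example variances
    n = len(sigma2)
    h_sum = sum(0.5 * math.log2(2 * math.pi * math.e * s2) for s2 in sigma2)
    h_formula = (n / 2) * math.log2(2 * math.pi * math.e * math.prod(sigma2) ** (1 / n))
    print(h_sum, h_formula)                          # the two expressions agree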
4. Consider the binary symmetric channel shown in the figure. Let p0 denote the probability of choosing binary symbol x0 = 0, and let p1 = 1 - p0 denote the probability of choosing binary symbol x1 = 1. Let p denote the transition probability of the channel. 16 A

a) Show that the average mutual information between the channel input and channel output is given by I(X;Y) = H(z) - H(p), where
H(z) = z log2(1/z) + (1 - z) log2(1/(1 - z)), z = p0 p + (1 - p0)(1 - p),
and H(p) = p log2(1/p) + (1 - p) log2(1/(1 - p)).
(b) Show that the value of p0 that maximizes I(X;Y) is equal to 1/2.
(c) Show that the channel capacity equals C = 1 - H(p).
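For question 4, I(X;Y) = H(Y) - H(Y|X), which for the binary symmetric channel reduces to the H(z) - H(p) form given in part (a). A small numerical check of parts (b) and (c), with an assumed transition probability:

    import math

    def Hb(x):
        # binary entropy function in bits
        if x in (0.0, 1.0):
            return 0.0
        return x * math.log2(1 / x) + (1 - x) * math.log2(1 / (1 - x))

    def bsc_mutual_information(p0, p):
        z = p0 * p + (1 - p0) * (1 - p)   # z as defined in part (a)
        return Hb(z) - Hb(p)              # I(X;Y) = H(z) - H(p)

    p = 0.1
    print(bsc_mutual_information(0.5, p))   # maximized at p0 = 1/2 ...
    print(1 - Hb(p))                         # ... where it equals C = 1 - H(p)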
5. a) Consider a discrete memoryless source with source alphabet S = {s0, s1, ..., sK-1} and source statistics {p0, p1, ..., pK-1}. The nth extension of this source is another discrete memoryless source with source alphabet S^n = {σ0, σ1, ..., σM-1}, where M = K^n. Let σi correspond to the sequence {si1, si2, ..., sin}, and let P(σi) denote the probability of σi. 8 A
M 1

 P( )  1
i
a) Show that i 0 which is to be expected.
M 1
 1 
 P( ) log
i 2    H ( S ), k  1, 2,...n,
b) Show that i 0  pik  where pik is
the probability of symbol sik, and entropy of the original
source.
M 1
1
H ( S )   P( i ) log 2  nH ( S )
c) Hence show that i 0 P ( i )
b) A discrete memoryless source has an alphabet of five symbols with the probabilities for its output as given here: 8 A
Symbol       S0    S1    S2    S3    S4
Probability  0.55  0.15  0.15  0.10  0.05
Compute two different Huffman codes for this source. Hence, for each of the two codes, find
(a) the average code-word length,
(b) the variance of the average code-word length over the ensemble of source symbols, and
(c) the efficiency.
6. a) Discuss the mutual information in detail and explain the properties of mutual information. 9 U
b) Consider a binary symmetric channel characterized by the transition probability p. Plot the mutual information of the channel as a function of p1, the a priori probability of symbol 1 at the channel input, for the transition probabilities p = 0, 0.1, 0.2, 0.3, 0.5, ... 7 S
7. a) Explain differential entropy and mutual information for continuous ensembles. 8 U
b) The sample function of a Gaussian process of zero mean and 8 S
unit variance is uniformly sampled and then applied to a
uniform quantizer having the input-output amplitude
characteristic shown in figure. Calculate the entropy of the
quantizer output.

8. (i) Describe the Channel Capacity Theorem in detail. 8 R
(ii) Explain the Source Coding Theorem in detail. 8 U
9. a) Describe the Hartley-Shannon law in detail. 8 R
b) Explain the Shannon-Fano algorithm and the Huffman algorithm in detail. 8 U
10. i) A discrete memoryless source has an alphabet of seven 8 A
symbols with probabilities for its output, as described here:
Symbol       S0    S1    S2     S3     S4     S5      S6
Probability  0.25  0.25  0.125  0.125  0.125  0.0625  0.0625
Compute the Huffman code for this source, moving a
‘combined’ symbol as high as possible. Explain why the
computed source code has an efficiency of 100 percent.
ii) Consider a discrete memoryless source whose alphabet consists of K equiprobable symbols. 8 A
a) Explain why the use of a fixed-length code for the
representation of such a source is about as efficient as any code
can be.
b) What conditions have to be satisfied by K and the code
length for the coding efficiency to be 100 percent?
