Information Theory and Coding

Time: 3 hrs.                                                Max. Marks: 80

Note: Answer any FIVE full questions, choosing ONE full question from each module.

Module-1
1 a. Derive an expression for the average information content of symbols in long independent sequences. (03 Marks)
b. For the Markov source shown below, find: i) the stationary distribution ii) the state entropies iii) the source entropy iv) G1, G2, and show that G1 >= G2 >= H(S). (10 Marks)

   [State transition diagram of the Markov source; the transition probabilities are illegible in the scan]
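The state diagram above did not survive the scan, so the transition probabilities are unknown. The sketch below shows the computation 1(b) asks for on a hypothetical two-state source with P(1→2) = P(2→1) = 0.4; the same steps apply to the actual diagram.

```python
import math

def entropy(probs):
    """Shannon entropy in bits; zero-probability terms contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical transition matrix (the actual diagram is illegible):
# P = [[1-a, a], [b, 1-b]] with a = b = 0.4
a, b = 0.4, 0.4

# Stationary distribution: solve pi.P = pi with pi1 + pi2 = 1
pi = (b / (a + b), a / (a + b))

# Entropy of each state's outgoing transition probabilities
state_H = [entropy([1 - a, a]), entropy([b, 1 - b])]

# Source entropy = stationary average of the state entropies
H_source = sum(p * h for p, h in zip(pi, state_H))
print(pi, state_H, H_source)
```

G1 and G2 are computed the same way from the one- and two-step symbol probabilities, and decrease monotonically toward H(S).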
c. Define self-information, entropy and information rate. (03 Marks)
OR
2 a. Mention the different properties of entropy and prove the extremal property. (07 Marks)
b. A source emits one of four symbols s1, s2, s3 and s4 with probabilities 7/16, 5/16, 1/8 and 1/8 respectively. Show that H(S^2) = 2H(S).
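The identity in 2(b) can be checked numerically: for a memoryless source the second extension S^2 assigns each pair of symbols the product of their probabilities, and its entropy comes out exactly 2H(S). A short sketch:

```python
import math
from itertools import product

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

p = [7/16, 5/16, 1/8, 1/8]                     # symbol probabilities from the question
p2 = [a * b for a, b in product(p, repeat=2)]  # pair probabilities of S^2
H1, H2 = entropy(p), entropy(p2)
print(H1, H2)   # H2 equals 2*H1 because successive symbols are independent
```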
c. A source emits symbols s0, s1, s2, s3, s4 with the probabilities below. Compute the Huffman code by moving the composite symbol as high as possible, and again by placing it as low as possible.

   Symbol        s0     s1     s2     s3     s4
   Probability   0.55   0.15   0.15   0.10   0.05
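A minimal Huffman construction for the table in 2(c). The probability 0.10 for s3 is inferred from the row summing to 1, and the tie-breaking here may differ from a hand construction, but every Huffman tree for these probabilities has the same (optimal) average length.

```python
import heapq

def huffman(probs):
    """Binary Huffman code: repeatedly merge the two least-probable nodes."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tie = len(heap)                      # unique key so dicts are never compared
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, tie, merged))
        tie += 1
    return heap[0][2]

probs = {"s0": 0.55, "s1": 0.15, "s2": 0.15, "s3": 0.10, "s4": 0.05}
code = huffman(probs)
avg_len = sum(probs[s] * len(w) for s, w in code.items())
print(code, avg_len)   # average length 1.90 bits/symbol
```

Against the source entropy of about 1.844 bits/symbol this gives an efficiency of roughly 97%.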
Module-2

[Questions 3 and 4 are largely illegible in the scan; a recoverable fragment lists messages with probabilities:]

   Message       M1    M2     M3     M4    M5
   Probability   1/8   1/16   3/16   1/4   3/8

1 of 2
15EC54
Module-3

5 a. Define mutual information. List the properties of mutual information and prove that I(X, Y) = H(X) + H(Y) - H(X, Y) bits/symbol. (08 Marks)
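The identity in 5(a) can be verified numerically against the defining double sum of mutual information. The joint distribution below is a made-up example, not taken from the paper:

```python
import math

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

joint = [[0.30, 0.10],          # hypothetical joint distribution p(x, y)
         [0.20, 0.40]]
px = [sum(row) for row in joint]            # marginal p(x)
py = [sum(col) for col in zip(*joint)]      # marginal p(y)

# I(X, Y) = H(X) + H(Y) - H(X, Y)
I_identity = entropy(px) + entropy(py) - entropy([p for r in joint for p in r])

# Cross-check against the definition: sum p(x,y) log2 p(x,y)/(p(x)p(y))
I_direct = sum(p * math.log2(p / (px[i] * py[j]))
               for i, row in enumerate(joint)
               for j, p in enumerate(row) if p > 0)
print(I_identity, I_direct)
```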
c. A channel has the following characteristics:

   P(Y/X) = | 1/3  1/3  1/6  1/6 |
            | 1/6  1/6  1/3  1/3 |

   Find the channel capacity if rs = 1000 symbols/sec. (06 Marks)
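As reconstructed from the scan, the rows of this noise matrix are permutations of one another and every column sums to 1/2, so the channel is weakly symmetric and its capacity has the closed form C = log2(number of outputs) - H(any row). A sketch, assuming that reading of the smudged matrix:

```python
import math

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

P = [[1/3, 1/3, 1/6, 1/6],      # noise matrix as read from the scan
     [1/6, 1/6, 1/3, 1/3]]

# Weakly symmetric channel: capacity per symbol = log2(#outputs) - H(row)
C = math.log2(len(P[0])) - entropy(P[0])
rs = 1000                        # symbols/sec, from the question
print(C, C * rs)                 # ~0.0817 bits/symbol, ~81.7 bits/sec
```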
OR

6 a. A binary symmetric channel has the following noise matrix with source probabilities P(x1) = 3/4 and P(x2) = 1/4:

   P(Y/X) = | 3/4  1/4 |
            | 1/4  3/4 |

   i) Determine H(X), H(Y), H(X, Y), H(Y/X), H(X/Y) and I(X, Y).
   ii) Find the channel capacity.
   iii) Find the channel efficiency and redundancy. (08 Marks)
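Every quantity 6(a) asks for follows mechanically from the joint distribution p(x, y) = p(x)·p(y/x). A sketch with the crossover probability 1/4 read from the matrix:

```python
import math

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

px = [3/4, 1/4]                          # source probabilities
P = [[3/4, 1/4], [1/4, 3/4]]             # BSC noise matrix, crossover p = 1/4
joint = [[px[i] * P[i][j] for j in (0, 1)] for i in (0, 1)]
py = [sum(col) for col in zip(*joint)]

Hx, Hy = entropy(px), entropy(py)
Hxy = entropy([p for row in joint for p in row])
Hy_x = Hxy - Hx                          # H(Y/X) = H(X, Y) - H(X)
Hx_y = Hxy - Hy                          # H(X/Y) = H(X, Y) - H(Y)
I = Hy - Hy_x                            # I(X, Y)
C = 1 - entropy([1/4, 3/4])              # BSC capacity: 1 - H(p)
print(I, C, I / C, 1 - I / C)            # mutual info, capacity, efficiency, redundancy
```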
b. Derive an expression for channel efficiency for a Binary Erasure Channel. (05 Marks)
c. Write a note on Differential Entropy. (03 Marks)
Module-4

7 a. For a (6, 3) linear block code, the parity bits are given by C4 = d1 ⊕ d3, C5 = d2 ⊕ d3, C6 = d1 ⊕ d2.
   i) Find all possible code vectors.
   ii) Draw the encoder circuit and the syndrome circuit.
   iii) Detect and correct the code word if the received code word is 110010.
   iv) Find the Hamming weight of all code vectors, the minimum Hamming distance, and the error detecting and correcting capability. (14 Marks)
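The parity equations in 7(a) are partly illegible in the scan; the sketch below uses one consistent reading (C4 = d1 ⊕ d3, C5 = d2 ⊕ d3, C6 = d1 ⊕ d2) and enumerates the codebook, weights, and minimum distance. The procedure is identical for whichever equations the paper intends.

```python
from itertools import product

def encode(d1, d2, d3):
    """Systematic (6,3) codeword under the assumed parity equations."""
    return (d1, d2, d3, d1 ^ d3, d2 ^ d3, d1 ^ d2)

codebook = [encode(*d) for d in product((0, 1), repeat=3)]
weights = [sum(c) for c in codebook]
# For a linear code, d_min equals the minimum nonzero Hamming weight
d_min = min(w for w in weights if w > 0)
print(codebook, weights)
# d_min errors detected: d_min - 1; errors corrected: (d_min - 1) // 2
print(d_min, d_min - 1, (d_min - 1) // 2)
```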
b. Define the following: i) Block code and convolutional code. ii) Systematic and non-systematic code. (02 Marks)
OR

8 a. [Question text illegible in the scan] (03 Marks)
b. The generator polynomial of a (15, 7) cyclic code is g(x) = 1 + x^4 + x^6 + x^7 + x^8.
   i) Find the code vector for the message D(x) = x^2 + x^3 + x^4 using a cyclic encoder circuit.
   ii) Draw the syndrome calculation circuit and find the syndrome of the received polynomial Z(x) = 1 + x + x^6 + x^8 + x^9 + x^11 + x^14. (13 Marks)
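Both parts of 8(b) reduce to a remainder of GF(2) polynomial division by g(x): systematic encoding appends the remainder of x^(n-k)·D(x), and the syndrome is the remainder of Z(x). A sketch, with D(x) and Z(x) as read from the smudged print:

```python
def gf2_mod(dividend, divisor):
    """Remainder of polynomial division over GF(2).
    Polynomials are coefficient lists: index i holds the coefficient of x^i."""
    r = dividend[:]
    deg_g = len(divisor) - 1
    for i in range(len(r) - 1, deg_g - 1, -1):
        if r[i]:
            for j, gj in enumerate(divisor):
                r[i - deg_g + j] ^= gj
    return r[:deg_g]

g = [1, 0, 0, 0, 1, 0, 1, 1, 1]        # g(x) = 1 + x^4 + x^6 + x^7 + x^8
n, k = 15, 7

# Systematic encoding of D(x) = x^2 + x^3 + x^4:
D = [0, 0, 1, 1, 1, 0, 0]
shifted = [0] * (n - k) + D            # x^(n-k) * D(x)
codeword = gf2_mod(shifted, g) + D     # parity bits followed by message bits

# Syndrome of Z(x) = 1 + x + x^6 + x^8 + x^9 + x^11 + x^14:
Z = [1, 1, 0, 0, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1]
syndrome = gf2_mod(Z, g)
print(codeword, syndrome)              # nonzero syndrome => errors detected
```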
Module-5
9 a. Consider the (3, 1, 2) convolutional code with g1 = 110, g2 = 101, g3 = 111. (12 Marks)
   i) Draw the encoder block diagram.
   ii) Find the generator matrix.
   iii) Find the code word corresponding to the information sequence 11101 using the time domain or transform domain approach.
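Part iii) in the time domain is a mod-2 convolution of the message with each generator, interleaving the three output streams. A sketch (the code parameters are garbled in the scan; the generators imply a rate-1/3, constraint-length-3 code):

```python
def conv_encode(msg, generators):
    """Rate-1/n convolutional encoding: v_i(t) = sum_j g_i[j]*m[t-j] (mod 2),
    with the register flushed by K-1 trailing zeros; outputs are interleaved."""
    K = max(len(g) for g in generators)
    out = []
    for t in range(len(msg) + K - 1):
        for g in generators:
            bit = 0
            for j, gj in enumerate(g):
                if gj and 0 <= t - j < len(msg):
                    bit ^= msg[t - j]
            out.append(bit)
    return out

g1, g2, g3 = [1, 1, 0], [1, 0, 1], [1, 1, 1]
msg = [1, 1, 1, 0, 1]                  # information sequence 11101
code = conv_encode(msg, [g1, g2, g3])
print(code)
```

The transform-domain approach multiplies the message polynomial by each generator polynomial over GF(2) and interleaves the results, giving the same sequence.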
OR
10.  For a (2, 1, 3) convolutional encoder with g1 = 1011, g2 = 1101. (16 Marks)
     [Sub-questions illegible in the scan]
*****
2 of 2