
For More Question Papers Visit - www.pediawikiblog.com

USN                                                                     15EC54

Fifth Semester B.E. Degree Examination, Dec.2017/Jan.2018

Information Theory and Coding

Time: 3 hrs.                                                    Max. Marks: 80

Note: Answer any FIVE full questions, choosing one full question from each module.

Module-1

1  a. Derive an expression for the average information content of long independent
      symbol sequences.                                                  (03 Marks)
   b. For the Markov source shown below, find: i) The stationary distribution
      ii) State entropies iii) Source entropy iv) G1, G2, and show that
      G1 ≥ G2 ≥ H(S).                                                    (10 Marks)
      [Figure: state transition diagram of the Markov source]
   c. Define Self Information, Entropy and Information rate.             (03 Marks)
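The three quantities in 1(c) are easy to sanity-check numerically. A minimal sketch (the symbol probabilities and the 1000 symbols/sec symbol rate are illustrative values, not taken from the paper):

```python
import math

def self_information(p):
    # I(s) = log2(1/p) bits: rarer symbols carry more information.
    return -math.log2(p)

def entropy(probs):
    # H(S) = sum of p * log2(1/p), in bits/symbol.
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Information rate R = r_s * H(S), with r_s the symbol rate in symbols/sec
# (both values below are illustrative, not from the question).
H = entropy([0.5, 0.25, 0.125, 0.125])
R = 1000 * H  # bits/sec
```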

OR
2  a. Mention the different properties of entropy and prove the extremal property. (07 Marks)
   b. A source emits one of four symbols S1, S2, S3 and S4 with probabilities
      7/16, 5/16, 1/8 and 1/8 respectively. Show that H(S^2) = 2H(S).    (04 Marks)
   c. In a facsimile transmission of a picture there are about 2.25 x 10^6 pixels/frame.
      For a good reproduction at the receiver, 12 brightness levels are necessary.
      Assume all these levels are equally likely to occur. Find the rate of information
      if one picture is to be transmitted every 3 min. Also compute the source
      efficiency.                                                        (05 Marks)
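The arithmetic in 2(c) is short enough to script. With 12 equally likely levels each pixel carries log2(12) bits; the efficiency comparison against the 4 binary digits needed to represent 12 levels is the usual textbook convention, an assumption not stated in the question itself:

```python
import math

pixels = 2.25e6              # pixels per frame
H_pixel = math.log2(12)      # bits per pixel, 12 equally likely levels
seconds = 3 * 60             # one picture every 3 minutes

rate = pixels * H_pixel / seconds   # information rate, bits/sec
efficiency = H_pixel / 4            # vs. 4 binary digits per pixel (assumption)
```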
Module-2
3  a. A discrete memoryless source has an alphabet of five symbols with their
      probabilities as given below:                                      (10 Marks)

         Symbol       S0     S1     S2     S3     S4
         Probability  0.55   0.15   0.15   0.10   0.05

      Compute the Huffman code by moving the composite symbol as high as possible and
      by placing the composite symbol as low as possible. Also find: i) The average
      codeword length ii) The variance of the codeword length for both the cases.
   b. Using Shannon-Fano coding, find code words for the probability distribution
      P = {1/2, 1/4, 1/8, 1/8}. Find the average code word length and efficiency. (06 Marks)
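For 3(a), a heap-based sketch of Huffman coding. Tie-breaking between equal-probability nodes is exactly what distinguishes the "composite symbol high" and "composite symbol low" variants the question asks for; this sketch fixes one arbitrary tie-break and so produces one valid code, but every Huffman code for a given source attains the same average length:

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    # Returns codeword lengths per symbol; the counter breaks ties between
    # equal probabilities (a fixed, arbitrary placement of composite symbols).
    tie = count()
    heap = [(p, next(tie), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, a = heapq.heappop(heap)
        p2, _, b = heapq.heappop(heap)
        for i in a + b:          # every symbol under the merge gets one bit deeper
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, next(tie), a + b))
    return lengths

probs = [0.55, 0.15, 0.15, 0.10, 0.05]
L = huffman_lengths(probs)
avg = sum(p * l for p, l in zip(probs, L))
var = sum(p * (l - avg) ** 2 for p, l in zip(probs, L))
```

The two placements required by the question give the same average length but different variances, which is the point of part ii).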
OR
4  a. Write a short note on the Lempel-Ziv algorithm.                    (05 Marks)
   b. Derive the source coding theorem.                                  (05 Marks)
   c. Apply Shannon's encoding algorithm and generate binary codes for the set of
      messages given below. Also find the variance, code efficiency and
      redundancy.                                                        (06 Marks)

         Message      M1     M2     M3     M4     M5
         Probability  1/8    1/16   3/16   1/4    3/8
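One common formulation of Shannon's encoding algorithm for 4(c): sort the probabilities in descending order, take l_i = ceil(log2(1/p_i)), and read each codeword off the first l_i bits of the binary expansion of the cumulative probability:

```python
import math

def shannon_code(probs):
    # Returns (probability, codeword) pairs, highest probability first.
    probs = sorted(probs, reverse=True)
    codes, F = [], 0.0
    for p in probs:
        l = math.ceil(-math.log2(p))     # codeword length
        f, bits = F, ""
        for _ in range(l):               # binary expansion of F to l places
            f *= 2
            bits += "1" if f >= 1 else "0"
            f -= int(f)
        codes.append((p, bits))
        F += p
    return codes

table = shannon_code([1/8, 1/16, 3/16, 1/4, 3/8])
avg = sum(p * len(c) for p, c in table)
```

The resulting code is prefix-free; efficiency then follows as H(S)/avg and redundancy as its complement.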

Module-3
5  a. Find the capacity of the discrete channel whose noise matrix is    (04 Marks)

                 [0.8  0.2   0 ]
      P(Y/X) =   [0.1  0.8  0.1]
                 [ 0   0.2  0.8]

   b. Define Mutual Information, list the properties of Mutual Information, and prove
      that I(X; Y) = H(X) + H(Y) - H(X, Y) bits/symbol.                  (06 Marks)
   c. A channel has the following characteristics:

                 [1/6  1/6  1/3  1/3]
      P(Y/X) =   [1/3  1/3  1/6  1/6]    with P(x1) = P(x2) = 1/2.

      Find H(X), H(Y), H(X, Y) and the channel capacity if r = 1000 symbols/sec. (06 Marks)
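The exam expects 5(a) to be solved analytically (e.g. by Muroga's method), but the answer can be cross-checked numerically with the Blahut-Arimoto iteration, sketched here under the assumption of a row-stochastic transition matrix P[x][y]:

```python
import math

def blahut_arimoto(P, iters=500):
    # Capacity (bits/symbol) of a DMC with transition matrix P[x][y],
    # via the Blahut-Arimoto iteration starting from a uniform input.
    nx, ny = len(P), len(P[0])
    p = [1.0 / nx] * nx
    for _ in range(iters):
        # Output distribution induced by the current input distribution.
        q = [sum(p[x] * P[x][y] for x in range(nx)) for y in range(ny)]
        # c[x] = exp(D(P[x][.] || q)), the per-input divergence weight.
        c = []
        for x in range(nx):
            d = sum(P[x][y] * math.log(P[x][y] / q[y])
                    for y in range(ny) if P[x][y] > 0)
            c.append(math.exp(d))
        z = sum(p[x] * c[x] for x in range(nx))
        p = [p[x] * c[x] / z for x in range(nx)]
    return math.log2(z)   # at the fixed point, log z is the capacity

# Channel of Q5(a)
C = blahut_arimoto([[0.8, 0.2, 0.0],
                    [0.1, 0.8, 0.1],
                    [0.0, 0.2, 0.8]])
```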

OR
6  a. A binary symmetric channel has the following noise matrix, with source
      probabilities P(x1) = 2/3 and P(x2) = 1/3:                         (08 Marks)

                 [3/4  1/4]
      P(Y/X) =   [1/4  3/4]

      i) Determine H(X), H(Y), H(X, Y), H(Y/X), H(X/Y) and I(X; Y).
      ii) Find the channel capacity.
      iii) Find the channel efficiency and redundancy.
   b. Derive an expression for channel efficiency for a Binary Erasure channel. (05 Marks)
   c. Write a note on Differential Entropy.                              (03 Marks)

Module-4
7  a. For a systematic (6, 3) linear block code generated by
      C4 = d1 ⊕ d3,  C5 = d2 ⊕ d3,  C6 = d1 ⊕ d2:
      i) Find all possible code vectors.
      ii) Draw the encoder circuit and syndrome circuit.
      iii) Detect and correct the code word if the received code word is 110010.
      iv) Find the Hamming weight of all code vectors, the minimum Hamming distance,
          and the error detecting and correcting capability.             (14 Marks)
   b. Define the following: i) Block code and Convolutional code
      ii) Systematic and non-systematic code.                            (02 Marks)
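The enumeration in 7(a) is small enough to brute-force. The parity equations below are as reconstructed from the garbled scan (C5 = d2 ⊕ d3 is the least legible of the three, so treat it as an assumption):

```python
from itertools import product

def encode(d1, d2, d3):
    # Systematic (6,3) code: message bits followed by the three parity bits.
    return (d1, d2, d3, d1 ^ d3, d2 ^ d3, d1 ^ d2)

codewords = [encode(*m) for m in product((0, 1), repeat=3)]
weights = [sum(c) for c in codewords]
dmin = min(w for w in weights if w > 0)   # min distance = min nonzero weight

def correct(r):
    # Nearest-codeword (minimum Hamming distance) decoding.
    return min(codewords, key=lambda c: sum(a != b for a, b in zip(c, r)))

received = (1, 1, 0, 0, 1, 0)             # the 110010 of part iii)
fixed = correct(received)
```

With these equations dmin = 3, so the code detects two errors and corrects one, which is why the single-bit error in 110010 is correctable.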
OR
8  a. A linear Hamming code for (7, 4) is described by the generator polynomial
      g(x) = 1 + x + x^3. Determine the Generator matrix and Parity check
      matrix.                                                            (03 Marks)
   b. The generator polynomial of a (15, 7) cyclic code is
      g(x) = 1 + x^4 + x^6 + x^7 + x^8.
      i) Find the code vector for the message D(x) = x^2 + x + x^4 using a cyclic
         encoder circuit.
      ii) Draw the syndrome calculation circuit and find the syndrome of the received
          polynomial Z(x) = 1 + x^4 + x^6 + x^8 + x^9 + x^11 + x^14.     (13 Marks)
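The division circuits of 8(b) can be mirrored in software with GF(2) polynomial division (polynomials packed into integers, bit i = coefficient of x^i). The x^4 term in Z(x) is as reconstructed from the garbled scan, so treat that coefficient as an assumption:

```python
def gf2_divmod(num, den):
    # Long division of GF(2) polynomials packed into integers.
    q = 0
    while num.bit_length() >= den.bit_length():
        shift = num.bit_length() - den.bit_length()
        q ^= 1 << shift
        num ^= den << shift   # subtract (= XOR) the shifted divisor
    return q, num             # quotient, remainder

g = 0b111010001           # g(x) = 1 + x^4 + x^6 + x^7 + x^8
d = 0b10110               # D(x) = x^2 + x + x^4

# Systematic encoding: c(x) = x^8*D(x) + [x^8*D(x) mod g(x)]
_, parity = gf2_divmod(d << 8, g)
c = (d << 8) | parity

# Syndrome of a received polynomial = its remainder mod g(x).
z = 0b100101101010001     # Z(x) = 1 + x^4 + x^6 + x^8 + x^9 + x^11 + x^14
_, syndrome = gf2_divmod(z, g)
```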
Module-5
9  a. Consider the (3, 1, 2) convolutional code with g1 = 110, g2 = 101,
      g3 = 111.                                                          (12 Marks)
      i) Draw the encoder block diagram.
      ii) Find the generator matrix.
      iii) Find the code word corresponding to the information sequence 11101 using
           the time domain or transform domain approach.
   b. Write a short note on BCH codes.                                   (04 Marks)


OR
10 For the (2, 1, 3) convolutional encoder with g1 = 1011, g2 = 1101:    (16 Marks)
   a. Draw the state diagram.
   b. Draw the code tree.
   c. Draw the trellis diagram and find the code word for the message 11101.
   d. Using the Viterbi decoding algorithm, decode the obtained code word if the
      first bit is erroneous.
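A time-domain sketch of the Q10 encoder (taps read MSB-first, so g1 = 1011 means v1 = u ⊕ s2 ⊕ s3). Exam solutions usually append m = 3 zero tail bits to flush the encoder; this sketch encodes only the five message bits:

```python
def conv_encode(bits, gens):
    # Time-domain convolutional encoding over GF(2).
    taps = [[int(b) for b in g] for g in gens]   # e.g. "1011" -> [1,0,1,1]
    m = len(taps[0]) - 1                          # encoder memory
    state = [0] * m
    out = []
    for u in bits:
        window = [u] + state                      # current input + shift register
        for t in taps:
            out.append(sum(a & b for a, b in zip(t, window)) & 1)  # parity
        state = [u] + state[:-1]                  # shift
    return out

cw = conv_encode([1, 1, 1, 0, 1], ["1011", "1101"])
```

Feeding the same message through the trellis of part c) should reproduce this output sequence.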

*****
