TEC 320, Assignment, Submission Date 22 May


ASSIGNMENT

TEC-320, Information theory & Coding

Submission Last Date: 22.05.23(till evening 5 PM)

Maximum Marks: 10

Note: Attempt all questions. Show every step where the question requires it.
Submit the assignment in hard copy by 22 May.

Q1. Let X and Y represent random variables with associated probability distributions
p(x) and p(y), respectively. They are not independent. Their conditional probability
distributions are p(x|y) and p(y|x), and their joint probability distribution is p(x, y).

(a) What do you understand by information and entropy? What are the marginal
entropy H(X) and the information of the variable X?
(b) In terms of the probability distributions, what are the conditional entropies
H(X|Y) and H(Y|X)?
(c) What is the joint entropy H(X, Y), and what would it be if the random
variables X and Y were independent?
(d) Give an alternative expression for H(Y) − H(Y|X) in terms of the joint entropy
and both marginal entropies.
(e) What is the condition to maximize entropy?
Q2. Calculate the probability that if somebody is “tall” (meaning taller than 6 ft,
say), that person is male. Assume that the probability of being male is
p(M) = 0.5, and likewise for being female, p(F) = 0.5. Suppose that 20% of males
are tall: p(T|M) = 0.2, and that 6% of females are tall: p(T|F) = 0.06.

(a) Calculate p(M|T).


(b) If you know that somebody is male, how much information do you gain
(in bits) by learning that he is also tall?
(c) How much do you gain by learning that a female is tall?
(d) How much information do you gain from learning that a tall person is
female?
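
As a quick numerical check for part (a), the required probability follows from Bayes' rule; a minimal sketch in Python (variable names are illustrative):

```python
# Bayes' rule for Q2: p(M|T) = p(T|M) p(M) / p(T),
# where p(T) = p(T|M) p(M) + p(T|F) p(F)  (total probability).
p_M, p_F = 0.5, 0.5
p_T_given_M, p_T_given_F = 0.2, 0.06

p_T = p_T_given_M * p_M + p_T_given_F * p_F   # probability of being tall
p_M_given_T = p_T_given_M * p_M / p_T         # Bayes' rule

print(round(p_M_given_T, 4))  # 0.7692
```

The remaining parts are surprisals of the corresponding conditional probabilities, e.g. −log2 p(T|M) for part (b).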

Q3. Consider a noiseless analog communication channel whose bandwidth is 10,000
Hertz. A signal of duration 1 second is received over such a channel. We wish to
represent this continuous signal exactly, at all points in its one-second duration,
using just a finite list of real numbers obtained by sampling the values of the signal
at discrete, periodic points in time. What is the length of the shortest list of such
discrete samples required in order to guarantee that we capture all of the information
in the signal and can recover it exactly from this list of samples?
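
The sampling theorem gives the count directly: a signal band-limited to B Hz observed for T seconds is determined by 2BT uniformly spaced samples. A one-line check with the numbers from the question:

```python
# Nyquist rate: 2 samples per hertz of bandwidth, per second of duration.
bandwidth_hz = 10_000
duration_s = 1
num_samples = 2 * bandwidth_hz * duration_s
print(num_samples)  # 20000
```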

Q4. (i) What is a Markov source? (ii) Derive the formula for the entropy of a Markov
source. (iii) Draw the state diagram. (iv) Find the source entropy.
The Markov source is described by the following conditional probabilities:
P(0|00) = P(1|11) = 0.8
P(1|00) = P(0|11) = 0.2
P(0|01) = P(1|01) = P(0|10) = P(1|10) = 0.5
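
For part (iv), the answer can be checked numerically: find the stationary distribution over the four states {00, 01, 10, 11} and average the per-state transition entropies. A sketch with NumPy (the state ordering is an assumption made for illustration):

```python
import numpy as np

# States ordered 00, 01, 10, 11. Emitting bit b from state (x, y) moves
# the source to state (y, b), so each conditional probability above
# becomes one entry of the state-transition matrix P.
P = np.zeros((4, 4))
P[0, 0], P[0, 1] = 0.8, 0.2   # from 00: emit 0 -> 00, emit 1 -> 01
P[1, 2], P[1, 3] = 0.5, 0.5   # from 01: emit 0 -> 10, emit 1 -> 11
P[2, 0], P[2, 1] = 0.5, 0.5   # from 10: emit 0 -> 00, emit 1 -> 01
P[3, 2], P[3, 3] = 0.2, 0.8   # from 11: emit 0 -> 10, emit 1 -> 11

# Stationary distribution: iterate the chain to convergence.
pi = np.full(4, 0.25)
for _ in range(10_000):
    pi = pi @ P

def h(row):
    """Entropy (bits) of one transition row, skipping zero entries."""
    q = row[row > 0]
    return -(q * np.log2(q)).sum()

# Source entropy: H = sum over states s of pi(s) * H(next bit | s).
H = sum(pi[s] * h(P[s]) for s in range(4))
print(round(H, 4))  # ~0.8014 bits/symbol
```

The stationary probabilities come out as pi(00) = pi(11) = 5/14 and pi(01) = pi(10) = 1/7, which the derivation in part (ii) should reproduce.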
Q5. Derive the channel capacity of each of the following channels:
(i) binary symmetric channel (BSC); (ii) binary erasure channel (BEC).
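
The closed forms the derivations should arrive at are C_BSC = 1 − H(p) and C_BEC = 1 − p, with H the binary entropy function; a quick numerical sanity check:

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):   # p = crossover probability
    return 1 - h2(p)

def bec_capacity(p):   # p = erasure probability
    return 1 - p

print(bsc_capacity(0.5))   # 0.0 -- a fully noisy BSC carries nothing
print(bec_capacity(0.25))  # 0.75
```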

Q6. Assume a BSC with p = 1/4 and q = 3/4, and p(0) = p(1) = 0.5. Calculate the
improvement in the rate of transmission obtained by 2 and 3 repetitions of the input.
Q7. How will you determine the size of a block code? Decode the data 1011001
using a Hamming code.
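
For the (7,4) Hamming code with parity bits in positions 1, 2 and 4, the syndrome directly names the position of a single-bit error. A sketch of decoding 1011001 under that convention (the bit-ordering convention is an assumption; the question does not fix one):

```python
def hamming74_decode(bits):
    """Decode a 7-bit Hamming codeword (parity bits at positions 1, 2, 4).

    Returns (corrected codeword, data bits taken from positions 3, 5, 6, 7).
    """
    b = [0] + [int(c) for c in bits]   # 1-indexed for readability
    # Each syndrome bit is the XOR over the positions that parity covers.
    s1 = b[1] ^ b[3] ^ b[5] ^ b[7]
    s2 = b[2] ^ b[3] ^ b[6] ^ b[7]
    s4 = b[4] ^ b[5] ^ b[6] ^ b[7]
    error_pos = s4 * 4 + s2 * 2 + s1   # 0 means no error detected
    if error_pos:
        b[error_pos] ^= 1              # flip the erroneous bit
    corrected = "".join(map(str, b[1:]))
    data = "".join(str(b[i]) for i in (3, 5, 6, 7))
    return corrected, data

corrected, data = hamming74_decode("1011001")
print(corrected, data)  # 0011001 1001
```

Here the syndrome is 001, so bit 1 is flipped before the data bits are read out.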
Q8. MCQ
(i) A discrete memoryless source emits a symbol U that takes 6 different
values {u1, u2, u3, u4, u5, u6} with
probabilities {1/4, 1/4, 1/8, 1/8, 1/8, 1/8} respectively. A binary Shannon-Fano
code consists of codewords of which of the following lengths?
(a) {3, 3, 3, 3, 3, 3}
(b) {2, 2, 2, 2, 2, 2}
(c) {2, 2, 3, 3, 3, 3}
(d) {3, 3, 4, 4, 4, 4}
(ii) Let X be a discrete random variable with entropy H(X). Which of the
following statements is true?
(a) H(5X) = H(X)
(b) H(5X) = 5H(X)
(c) H(5X) = H(X)/5
(d) H(5X) = H(X) + log 5
(iii) For two random variables X and Y, H2(X) = 5, He(Y) = 4 and H10(XY) = 3.
What is I2(X; Y) equal to (answer rounded to one decimal place)?
(a) 0.2
(b) 0.8
(c) 6
(d) 12
(iv) Consider the following codes:
Which of the following statements is true?
(a) Only (III) is uniquely decodable.
(b) Only (I) and (II) are uniquely decodable.
(c) Only (III) and (IV) are uniquely decodable
(d) All the above codes are uniquely decodable.

Q9. Apply Huffman coding for the following table:

S S1 S2 S3 S4 S5 S6 S7 S8 S9 S10

P(S) 0.20 0.18 0.12 0.10 0.10 0.08 0.06 0.06 0.06 0.04
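
A Huffman construction for this table can be checked mechanically: the average codeword length equals the sum of the probabilities of all merged nodes. A sketch with `heapq` that computes only the average length, not the individual codewords:

```python
import heapq

probs = [0.20, 0.18, 0.12, 0.10, 0.10, 0.08, 0.06, 0.06, 0.06, 0.04]

# Repeatedly merge the two least-probable nodes. Each merge adds one bit
# to every symbol inside the merged subtree, so the average code length
# is the sum of all merge weights.
heap = list(probs)
heapq.heapify(heap)
avg_len = 0.0
while len(heap) > 1:
    a = heapq.heappop(heap)
    b = heapq.heappop(heap)
    avg_len += a + b
    heapq.heappush(heap, a + b)

print(round(avg_len, 2))  # 3.2 bits/symbol on average
```

This sits just above the source entropy of about 3.15 bits, as the noiseless coding theorem requires.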

Q10. Prove the following:

(i) H(X, Y) = H(X|Y) + H(Y)

(ii) I(X; Y) = H(Y) − H(Y|X)

Q11. (a) What is a linear block code? What is the difference between a block code
and a cyclic code? Define the G and H matrices and show that G · H^T = 0.

(b) Consider a (7,4) linear block code and construct an appropriate generator matrix.

(i) Find the error-correcting capability of the designed matrix.

(ii) Make the code dictionary.
Q12. A (7, 4) cyclic code has a generator polynomial g(X) = X^3 + X + 1.

(i) Draw the block diagram of the encoder in nonsystematic form.

(ii) Find the encoded data for the transmitted data 1111 in nonsystematic form.

Q13. Write an explanatory note on convolutional codes. Make a comparison between
block codes and convolutional codes. Draw the general block diagram of a
convolutional encoder and explain its working.
Q14. For a rate-1/2 convolutional code given by:
g1 = (1 1 1), g2 = (1 0 1)
(a) Design the encoder.
(b) Make the state transition table.
(c) Draw the code tree.
(d) Draw the trellis diagram.
(e) Encode the data 11010 using the trellis diagram (draw the path neatly).
(f) Decode the codeword 1101010010.
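
Part (e) can be verified with a direct bit-level encoder for g1 = (1 1 1), g2 = (1 0 1), constraint length 3, starting from the all-zero state:

```python
def conv_encode(bits):
    """Rate-1/2 convolutional encoder with g1 = (1,1,1), g2 = (1,0,1).

    The state holds the two previous input bits; the register starts at zero.
    """
    m1 = m2 = 0                      # previous and second-previous inputs
    out = []
    for m in bits:
        v1 = m ^ m1 ^ m2             # g1 = 111: current + both memory bits
        v2 = m ^ m2                  # g2 = 101: current + oldest memory bit
        out += [v1, v2]
        m1, m2 = m, m1               # shift the register
    return "".join(map(str, out))

encoded = conv_encode([1, 1, 0, 1, 0])
print(encoded)  # 1101010010
```

The output can be compared pair by pair against the trellis path drawn in part (e).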
Q15. Design the encoder for the (7,4) binary systematic cyclic code generated by the
generator polynomial g(x) = 1 + x^2 + x^3, and obtain the output for the message
bits 0101. Show all the timing sequences in tabular form.

Q16. Calculate the capacity of the channel whose transition matrix is given below,
and show how the capacity C varies with p.

    [ 1      0      0   ]
    [ 0      p    (1-p) ]
    [ 0    (1-p)    p   ]
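
How C varies with p can be explored numerically with the Blahut-Arimoto iteration; a compact sketch (the iteration count is an arbitrary choice, and the function names are illustrative):

```python
import numpy as np

def capacity(W, iters=2000):
    """Blahut-Arimoto: capacity in bits of channel matrix W (rows = inputs)."""
    n = W.shape[0]
    p = np.full(n, 1.0 / n)                # input distribution, start uniform
    for _ in range(iters):
        q = p @ W                          # induced output distribution
        # d[x] = D(W_x || q) in bits, skipping zero entries of W
        with np.errstate(divide="ignore", invalid="ignore"):
            log_ratio = np.where(W > 0, np.log2(W / q), 0.0)
        d = (W * log_ratio).sum(axis=1)
        p = p * np.exp2(d)                 # multiplicative update
        p /= p.sum()
    q = p @ W
    with np.errstate(divide="ignore", invalid="ignore"):
        log_ratio = np.where(W > 0, np.log2(W / q), 0.0)
    return float((p * (W * log_ratio).sum(axis=1)).sum())

def channel(p):
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, p, 1.0 - p],
                     [0.0, 1.0 - p, p]])

print(round(capacity(channel(0.5)), 4))  # 1.0: rows 2 and 3 coincide at p = 0.5
```

At p = 0 or p = 1 the channel is noiseless on three symbols, so the curve should rise to log2(3) at both ends.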

Q17. What is the significance of negative information in the case of a continuous-signal
probability density function? Prove that the mutual information I(X; Y) is always
non-negative.

Q18. Explain how data compression occurs in LZ codes.

Q19. In the game of Mastermind, player A chooses an ordered sequence of four
pieces which is concealed from player B. The pieces are of the same shape and may
be of different colors. Six colors are available, so the chosen sequence may
consist of one, two, three or four colors. Player B has to guess the sequence by
submitting ordered sequences of four pieces. After considering the combination put
forth by B, player A tells player B the number of pieces in the correct position and
the number of pieces in the wrong position, but without indicating which pieces or
positions are correct.

(i) What is the average amount of uncertainty in the sequence chosen by player A?

(ii) The first sequence submitted by player B consists of four pieces of the
same color. What is the average amount of uncertainty in the unknown
sequence (the one chosen by player A) resolved by the answer given by player
A to the first submitted sequence?
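
For part (i): six colors may appear independently in each of the four positions, so there are 6^4 equally likely sequences and the uncertainty is log2(6^4) = 4 log2 6. A one-liner check:

```python
import math

num_sequences = 6 ** 4                    # 6 colors, 4 ordered positions
uncertainty_bits = math.log2(num_sequences)
print(num_sequences, round(uncertainty_bits, 2))  # 1296 10.34
```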

Q20. Let S be a DMS taking on 8 equally likely values. Its symbol rate is 1,000
symbols per second. The outcomes of S are to be transmitted over a binary
symmetric channel of crossover probability equal to 0.001. The maximum channel
symbol rate is 3,000 bits per second. Is it possible to transmit the outcomes of S with
an arbitrarily low probability of error?
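
The question reduces to comparing the source's information rate with the channel's capacity, both in bits per second; a sketch of that comparison (the final argument is left to the channel coding theorem):

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# 8 equally likely values carry log2(8) = 3 bits per symbol.
source_rate = 1000 * math.log2(8)           # bits/s produced by the source
# BSC capacity per use is 1 - H(0.001), scaled by 3000 uses per second.
channel_capacity = 3000 * (1 - h2(0.001))   # bits/s through the channel

print(source_rate, round(channel_capacity, 1))
```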

******************************************************
