TEC 320, Assignment, Submission Date 22 May
Maximum Marks: 10
Note: Attempt all questions. Show all working steps where the question requires them.
Submit the assignment in hard copy by 22 May.
Q1. Let X and Y represent random variables with associated probability distributions
p(x) and p(y), respectively. They are not independent. Their conditional probability
distributions are p(x|y) and p(y|x), and their joint probability distribution is p(x, y).
(a) What do you understand by information and entropy? What are the marginal
entropy H(X) and the information of the variable X?
(b) In terms of the probability distributions, what are the conditional entropies
H(X|Y) and H(Y|X)?
(c) What is the joint entropy H(X, Y ), and what would it be if the random
variables X and Y were independent?
(d) Give an alternative expression for H(Y ) − H(Y |X) in terms of the joint entropy
and both marginal entropies.
(e) What is the condition to maximize entropy?
Q2. Calculate the probability that if somebody is "tall" (meaning taller than 6 ft,
say), that person is male. Assume that the probability of being male is
p(M) = 0.5, and likewise for being female, p(F) = 0.5. Suppose that 20% of males
are tall: p(T|M) = 0.2, and that 6% of females are tall: p(T|F) = 0.06.
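The computation in Q2 is a direct application of Bayes' rule; a minimal numeric check using only the figures given in the question:

```python
# Bayes' rule check for Q2, using the numbers stated in the question.
p_M, p_F = 0.5, 0.5          # prior probabilities of male / female
p_T_given_M = 0.20           # P(tall | male)
p_T_given_F = 0.06           # P(tall | female)

# Total probability of being tall.
p_T = p_T_given_M * p_M + p_T_given_F * p_F   # = 0.13

# Bayes' rule: P(male | tall) = P(tall | male) P(male) / P(tall).
p_M_given_T = p_T_given_M * p_M / p_T
print(round(p_M_given_T, 4))  # 0.7692
```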
Q4. (i) What is a Markov source? (ii) Derive the formula for the entropy of a
Markov source. (iii) Draw the state diagram. (iv) Find the source entropy.
The Markov source is described by the following conditional probabilities:
P(0|00) = P(1|11) = 0.8
P(1|00) = P(0|11) = 0.2
P(0|01) = P(1|01) = P(0|10) = P(1|10) = 0.5
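A numeric sketch of part (iv): treating the state as the last two emitted bits, the stationary distribution pi = pi P solved by hand gives pi(00) = pi(11) = 5/14 and pi(01) = pi(10) = 2/14; the entropy rate is then the pi-weighted average of the per-state emission entropies. This is only a sanity check on the hand derivation the question asks for.

```python
from math import log2

def h2(p):
    """Binary entropy in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Stationary probabilities solved by hand from pi = pi P.
pi = {"00": 5/14, "01": 2/14, "10": 2/14, "11": 5/14}

# States 00 and 11 emit with bias 0.8; states 01 and 10 emit
# 0/1 with probability 0.5 each (entropy exactly 1 bit).
H = pi["00"] * h2(0.8) + pi["11"] * h2(0.8) + pi["01"] * 1.0 + pi["10"] * 1.0
print(round(H, 4))  # 0.8014 bits/symbol
```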
Q5. Derive the channel capacity of each of the following channels:
(i) Binary Symmetric Channel (ii) Binary Erasure Channel
Q6. Assume a BSC with p = 1/4 and q = 3/4, and p(0) = p(1) = 0.5. Calculate the
improvement in the rate of transmission obtained by 2 and 3 repetitions of the input.
Q7. How will you determine the size of a block code? Decode the received word
1011001 using the Hamming code.
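A decoding sketch for Q7, assuming the common (7,4) Hamming convention with parity bits at positions 1, 2, 4 (1-indexed) and data bits at positions 3, 5, 6, 7; a different bit-ordering convention would give a different answer.

```python
# Syndrome decoding of the received word from Q7 under the assumed convention.
r = [int(b) for b in "1011001"]   # received word; r[0] is position 1

# Each syndrome bit checks the positions whose binary index has that bit set.
s1 = r[0] ^ r[2] ^ r[4] ^ r[6]    # positions 1, 3, 5, 7
s2 = r[1] ^ r[2] ^ r[5] ^ r[6]    # positions 2, 3, 6, 7
s4 = r[3] ^ r[4] ^ r[5] ^ r[6]    # positions 4, 5, 6, 7

err = 4 * s4 + 2 * s2 + s1        # 0 means no single-bit error detected
if err:
    r[err - 1] ^= 1               # flip the erroneous bit

data = [r[2], r[4], r[5], r[6]]   # data bits at positions 3, 5, 6, 7
print(err, "".join(map(str, data)))  # 1 1001 : error at position 1, data 1001
```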
Q8.MCQ
(i) A discrete memoryless source emits a symbol U that takes 6 different
values {u1, u2, u3, u4, u5, u6} with
probabilities {1/4, 1/4, 1/8, 1/8, 1/8, 1/8} respectively. A binary Shannon-Fano
code consists of code words of which of the following lengths?
(a) {3, 3, 3, 3, 3, 3}
(b) {2, 2, 2, 2, 2, 2}
(c) {2, 2, 3, 3, 3, 3}
(d) {3, 3, 4, 4, 4, 4}
(ii) Let X be a discrete random variable with entropy H(X). Which of the
following statements is true?
(a) H(5X) = H(X)
(b) H(5X) = 5H(X)
(c) H(5X) = H(X)/5
(d) H(5X) = H(X) + log 5
(iii) For two random variables X and Y, H2(X) = 5, He(Y) = 4 and H10(X, Y) = 3.
I2(X; Y) is equal to (answer rounded to one decimal place)?
(a) 0.2
(b) 0.8
(c) 6
(d) 12
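For MCQ (iii), reading the subscripts as logarithm bases (H2 in bits, He in nats, H10 in dits/Hartleys), the check is a unit conversion followed by I(X;Y) = H(X) + H(Y) - H(X,Y); a minimal sketch under that reading:

```python
from math import log, log10

# Convert all three entropies to bits, then apply
# I(X;Y) = H(X) + H(Y) - H(X,Y).
H_X  = 5.0               # H_2(X): already in bits
H_Y  = 4.0 / log(2)      # H_e(Y): 4 nats -> bits
H_XY = 3.0 / log10(2)    # H_10(X,Y): 3 dits -> bits

I = H_X + H_Y - H_XY
print(round(I, 1))  # 0.8 -> option (b)
```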
(iv) Consider the following codes:
Which of the following statements is true?
(a) Only (III) is uniquely decodable.
(b) Only (I) and (II) are uniquely decodable.
(c) Only (III) and (IV) are uniquely decodable
(d) All the above codes are uniquely decodable.
S     S1    S2    S3    S4    S5    S6    S7    S8    S9    S10
P(S)  0.20  0.18  0.12  0.10  0.10  0.08  0.06  0.06  0.06  0.04
Q11. (a) What is a linear block code? What is the difference between a block code
and a cyclic code? Define the G and H matrices and show that G . H^T = 0.
(b) Consider a (7,4) linear block code. (i) Construct an appropriate generator matrix.
(ii) Find the encoded data for the transmitted data 1111 in nonsystematic form.
Q16. Calculate the capacity of the following channel and show how the capacity C
varies with p. The channel transition matrix is:
[ 1      0      0    ]
[ 0      p    (1-p)  ]
[ 0    (1-p)    p    ]
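A numeric sketch for Q16, assuming the rows of the matrix are the transition probabilities p(y|x). Input 1 is noiseless, while inputs 2 and 3 see a BSC with crossover probability (1-p), so the channel is a sum of a one-symbol noiseless channel (capacity 0) and a BSC, giving C = log2(2^0 + 2^C_BSC); this decomposition is a standard result but should be verified against the derivation the question asks for.

```python
from math import log2

def h2(q):
    """Binary entropy in bits; 0 at the endpoints."""
    return -q * log2(q) - (1 - q) * log2(1 - q) if 0 < q < 1 else 0.0

def capacity(p):
    """Sum-channel capacity: noiseless symbol + BSC(1 - p)."""
    c_bsc = 1 - h2(1 - p)
    return log2(1 + 2 ** c_bsc)

for p in (0.0, 0.5, 1.0):
    print(p, round(capacity(p), 4))
# C falls from log2(3) at p = 0 or p = 1 to 1 bit at p = 1/2
```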
Q20. Let S be a DMS taking on 8 equally likely values. Its symbol rate is 1,000
symbols per second. The outcomes of S are to be transmitted over a binary
symmetric channel of crossover probability equal to 0.001. The maximum channel
symbol rate is 3,000 bits per second. Is it possible to transmit the outcomes of S with
an arbitrarily low probability of error?
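Q20 reduces to comparing the source information rate against the channel capacity, per the channel coding theorem; a minimal numeric check:

```python
from math import log2

def h2(p):
    """Binary entropy in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

# 8 equiprobable symbols carry log2(8) = 3 bits each.
source_rate = 1000 * log2(8)            # bits/s produced by the source
channel_cap = 3000 * (1 - h2(0.001))    # bits/s through the BSC

print(round(source_rate), round(channel_cap, 1), source_rate <= channel_cap)
# 3000 needed vs about 2965.8 available -> arbitrarily low error is NOT possible
```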
******************************************************