
An Introduction to Information Theory

Assignment-1

Q 1: Which of the following statements is correct?


(a) H(Y, Z/X) ≤ H(Y/X) + H(Z/X)
(b) H(Y, Z/X) ≤ H(Z/X) + H(Y/X)
(c) H(Y, Z/X) = H(Y/X) + H(Z/X, Y)

(d) All of the above


Q 2: A ternary source X = {x1, x2, x3} has probabilities of occurrence of x1, x2, x3 given by 0.25, 0.25 and 0.5,
respectively. What is the information content of X (in bits)?

(a) 1.5
(b) 0.45
(c) 1.04
(d) 0.9
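
For Q 2, a minimal numerical check of the entropy formula H(X) = − Σi pi log2 pi, written as a Python sketch; the helper name `entropy` is illustrative only, and base-2 logarithms are assumed since the question asks for bits:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: -sum(p * log2(p)) over the nonzero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Source distribution from Q 2: P(x1) = P(x2) = 0.25, P(x3) = 0.5
print(entropy([0.25, 0.25, 0.5]))  # -> 1.5 bits
```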
Q 3: Given H(Y/X) = 0, which of the following statements is correct?
(a) X = g(Y), where g is an arbitrary function

(b) Y = g(X), where g is an arbitrary function

(c) I(X; Y) = H(X)


(d) None of the above
Q 4: A discrete random variable X takes K different values x1, x2, . . . , xK with probabilities p1, p2, . . . , pK,
respectively. Define a new random variable Y = g(X), where g is an arbitrary function. Which of the following
statements is incorrect?
(a) H(X) = − Σi pi log pi

(b) H(X) ≤ H(g(X))

(c) H(g(X)) ≤ H(X)


(d) None of the above
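
For Q 4, a small Python sketch illustrating the idea behind options (b) and (c): applying a non-injective function g merges symbols and can only reduce entropy. The four-symbol distribution and the mapping g below are hypothetical, chosen only for illustration:

```python
import math
from collections import defaultdict

def entropy(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical four-symbol source (not from the question; chosen for illustration).
p_x = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}

# A non-injective g: symbols 2 and 3 are mapped to the same output value.
g = {0: 'a', 1: 'b', 2: 'c', 3: 'c'}

# Distribution of Y = g(X): add up the probabilities of symbols that g merges.
p_y = defaultdict(float)
for x, p in p_x.items():
    p_y[g[x]] += p

print(entropy(p_x.values()), entropy(p_y.values()))  # H(X) >= H(g(X))
```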
Q 5: Two coins are given. One is an unbiased coin with equal probability of heads and tails, and the other is a
two-headed coin. A coin is selected at random and tossed twice, and the number of tails is recorded. Let X denote
the random variable that takes the value 0 or 1 depending on whether the unbiased or the two-headed coin is chosen.
Let Y denote the number of tails obtained. (Use the same statement for Q 5 to Q 7.)
What is H(X)?
(a) 0 bit
(b) 0.5 bits

(c) 1 bit

(d) 2 bits
Q 6: What is H(Y )?
(a) 0.4 bits

(b) 1.3 bits


(c) 0.9 bits

(d) 1 bit
Q 7: What is I(X; Y)?
(a) 0.38 bits

(b) 0.55 bits


(c) 0.17 bits

(d) None of the above
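
For Q 5 to Q 7, a short Python sketch of the two-coin experiment described above; the variable names are illustrative, and all entropies are in bits:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# X: 0 = unbiased coin, 1 = two-headed coin, each chosen with probability 1/2.
p_x = [0.5, 0.5]

# Y | X: number of tails in two tosses of the chosen coin.
p_y_given_x = [
    [0.25, 0.50, 0.25],  # unbiased coin: Binomial(2, 1/2) over Y = 0, 1, 2
    [1.00, 0.00, 0.00],  # two-headed coin: never shows a tail
]

# Marginal P(Y = y) = sum over x of P(X = x) * P(Y = y | X = x)
p_y = [sum(p_x[x] * p_y_given_x[x][y] for x in (0, 1)) for y in range(3)]

h_x = entropy(p_x)                                                # 1 bit      (Q 5)
h_y = entropy(p_y)                                                # ~1.3 bits  (Q 6)
h_y_given_x = sum(p_x[x] * entropy(p_y_given_x[x]) for x in (0, 1))
i_xy = h_y - h_y_given_x                                          # ~0.55 bits (Q 7)
print(h_x, h_y, i_xy)
```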


Q 8: Which of the following questions does information theory not answer?
(a) How to quantify information?
(b) What is the maximum data compression possible for a source?

(c) What is the best coding scheme for error-free transmission over a communication channel?

(d) What is the maximum transmission rate possible over a channel for arbitrarily small error probability?
Q 9: For any two L-ary random variables X and Y, let p = Pr(X ≠ Y). Which of the following statements is
incorrect?

(a) H(p) + p log L ≤ H(X/Y)

(b) H(p) + p log(L − 1) ≥ H(X/Y)

(c) 1 + p log L ≥ H(X/Y)

(d) None of the above
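
For Q 9, a quick numerical check of the Fano-type bound in statement (b) on a randomly generated joint distribution; the alphabet size L = 4 and the distribution are hypothetical, used only to exercise the inequality:

```python
import math
import random

def entropy(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

L = 4  # alphabet size (any L >= 2 works for this check)

# Hypothetical random joint distribution P(X = x, Y = y) over an L x L alphabet.
joint = [[random.random() for _ in range(L)] for _ in range(L)]
total = sum(sum(row) for row in joint)
joint = [[v / total for v in row] for row in joint]

p_err = sum(joint[x][y] for x in range(L) for y in range(L) if x != y)  # Pr(X != Y)
p_y = [sum(joint[x][y] for x in range(L)) for y in range(L)]

# H(X/Y) = sum over y of P(Y = y) * H(X | Y = y)
h_x_given_y = sum(
    p_y[y] * entropy([joint[x][y] / p_y[y] for x in range(L)])
    for y in range(L) if p_y[y] > 0
)

# Fano-type bound, statement (b): H(X/Y) <= H(p) + p log(L - 1).
# Since H(p) <= 1 and log(L - 1) <= log L, statement (c) follows as well.
bound = entropy([p_err, 1 - p_err]) + p_err * math.log2(L - 1)
print(h_x_given_y, bound, h_x_given_y <= bound + 1e-9)
```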


Q 10: Which of the following is a concave function?
(a) 1 − log X, X ∈ [0, ∞)

(b) I(X; Y) as a function of p(X) for a fixed p(Y/X)

(c) I(X; Y) as a function of p(Y/X) for a fixed p(X)


(d) D(p||q) as a function of (p, q)
