Unit 2 - Week 1: Assignment 1
Assignment 1
The due date for submitting this assignment has passed.
Due on 2018-08-15, 23:59 IST.
As per our records you have not submitted this assignment.
Quiz: Assignment 1

H(X, Y) = H(X) + H(Y|X)
H(X, Y) = H(Y) - H(X|Y)
H(X, Y) = H(X) + H(X|Y)
None
No, the answer is incorrect.
Score: 0
Accepted Answers:
H(X, Y) = H(X) + H(Y|X)
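A hedged aside, not part of the original quiz: the accepted identity H(X, Y) = H(X) + H(Y|X) (the chain rule for entropy) can be verified numerically on any toy joint distribution; the joint pmf below is an arbitrary example chosen only for illustration.

# Numerical check of the chain rule H(X, Y) = H(X) + H(Y|X), in bits.
# The joint pmf values are arbitrary illustration data.
from math import log2

p_xy = {('x1', 'y1'): 0.30, ('x1', 'y2'): 0.20,
        ('x2', 'y1'): 0.15, ('x2', 'y2'): 0.35}

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

# Marginal pmf of X.
p_x = {}
for (x, _), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p

H_XY = entropy(p_xy.values())
H_X = entropy(p_x.values())
# Conditional entropy H(Y|X) = -sum over (x, y) of p(x, y) * log2 p(y|x).
H_Y_given_X = -sum(p * log2(p / p_x[x]) for (x, _), p in p_xy.items())

print(H_XY, H_X + H_Y_given_X)   # the two values agree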
No, never
Yes, depends on the choice of the pdf
When X ⊥ Y
5) A source, X, has an infinitely large set of outputs with probability of occurrence given by P(x_i) = 2^(-i), i = 1, 2, 3, ... What is the average self-information, H(X), of this source? (0 points)
H(p^2) bits
pH(p) bits
H(p) bits
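For reference, and not part of the original question: the average self-information of this source follows directly from the definition H(X) = sum over i of P(x_i) log2(1/P(x_i)), with P(x_i) = 2^(-i). A minimal numerical sketch, truncating the infinite sum:

# Average self-information of the source with P(x_i) = 2^(-i), i = 1, 2, 3, ...
# The infinite sum is truncated; the partial sums converge quickly.
from math import log2

H_X = sum(2.0**-i * log2(2.0**i) for i in range(1, 60))  # log2(1/P(x_i)) = i
print(H_X)  # converges towards 2 bits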
6) Jensen's inequality is given by __________________, where E[·] is the expectation operator. (1 point)
E[f(X)] ≥ f[E(X)]
E[f(X)] ≤ f[E(X)]
None
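A hedged note, not part of the quiz text: the direction of Jensen's inequality depends on whether the function is convex or concave, which the sketch below illustrates numerically with f(x) = x^2 (convex) and g(x) = log2(x) (concave) on an arbitrary toy distribution.

# Jensen's inequality illustration: E[f(X)] >= f(E[X]) for convex f,
# and the reverse inequality for concave f. Toy pmf chosen arbitrarily.
from math import log2

xs = [1.0, 2.0, 5.0]
ps = [0.2, 0.5, 0.3]

E_X = sum(p * x for p, x in zip(ps, xs))
E_fX = sum(p * x**2 for p, x in zip(ps, xs))        # f(x) = x^2, convex
print(E_fX >= E_X**2)                               # True

E_gX = sum(p * log2(x) for p, x in zip(ps, xs))     # g(x) = log2(x), concave
print(E_gX <= log2(E_X))                            # True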
7) Let X denote the number of tosses of a coin required until the first tail appears. If the probability of getting a tail is p, the entropy H(X) is given by (1 point)
H(p)
p H(p)
1/p H(p)
p^2 H(p)
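A hedged side note, not part of the quiz text: X here is geometrically distributed, P(X = k) = (1 - p)^(k - 1) p for k = 1, 2, 3, ..., so its entropy can be computed numerically for any particular p and compared against the four candidate expressions; the value p = 0.3 below is an arbitrary example.

# Entropy of X, the number of tosses until the first tail, with
# P(X = k) = (1 - p)**(k - 1) * p; the infinite sum is truncated at k = 500.
from math import log2

def Hb(p):                       # binary entropy function H(p)
    return -p * log2(p) - (1 - p) * log2(1 - p)

p = 0.3                          # arbitrary example value
H_X = -sum((1 - p)**(k - 1) * p * log2((1 - p)**(k - 1) * p)
           for k in range(1, 500))

print(H_X)                                          # entropy of X in bits
print(Hb(p), p * Hb(p), Hb(p) / p, p**2 * Hb(p))    # the four candidates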
8) Suppose a source produces independent symbols from the alphabet {a1, a2, a3}, with probabilities p1 = 0.4999999, p2 = 0.4999999, and p3 = 0.0000002. The entropy, H(X), of this source is (1 point)
Less than 1
More than 1
Equal to 1
None
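A hedged numerical check, not part of the original question: the entropy of this three-symbol source follows directly from H(X) = -sum of p_i log2(p_i).

# Entropy of the source with p1 = 0.4999999, p2 = 0.4999999, p3 = 0.0000002.
from math import log2

probs = [0.4999999, 0.4999999, 0.0000002]
H_X = -sum(p * log2(p) for p in probs)
print(H_X)  # just above 1 bit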
10) Which of the following gives the differential entropy, h(X), for the uniformly distributed random variable X with the pdf, (1 point)
log2(2a)
log2(a^2)
log2(a)
log2(a^(-1))
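The pdf referred to in this question is not reproduced in the text above, so the following is only a hedged sketch: for a uniform density over an interval of width w, the differential entropy is h(X) = log2(w) bits, and the assumed support [0, a] with a = 4 below is an illustration rather than the question's actual pdf.

# Differential entropy h(X) = -integral of f(x) * log2 f(x) dx for a uniform
# density; the support [0, a] with a = 4 is an assumption for illustration.
from math import log2

a = 4.0
f = 1.0 / a                      # uniform density value on [0, a]
n = 100_000
dx = a / n
h = sum(-f * log2(f) * dx for _ in range(n))   # Riemann sum of -f log2 f
print(h, log2(a))                # both are approximately log2(a) = 2 bits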