Lect3 - 2021 IT
INFORMATION THEORY
Recap of the Measure of Information (MoI)
• When the base of the logarithm is 2, the units of I(x) are bits
• When the base is e, the units of I(x) are nats (natural units)
Example 1
• Consider a binary source A which tosses a fair coin
• It produces an output equal to 1 if a head appears and a 0 if a
tail appears
• What is the information content of each output?
Solution
• For the source, P(1) = P(0) = 0.5
• The information content of each output from the source is
I(x) = −log2 P(x) = −log2 (0.5) = 1 bit (see the sketch below)
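A minimal Python sketch of this computation (the function name is an illustrative choice, not from the lecture):

```python
import math

def self_information(p, base=2):
    """Self-information I(x) = -log_base P(x)."""
    return -math.log(p, base)

# Fair coin: P(1) = P(0) = 0.5, so each output carries exactly 1 bit
print(self_information(0.5))  # 1.0
```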
Example 1 contd
• Suppose the successive outputs from this binary source are
statistically independent, i.e. the source is memoryless
• Consider a block of m binary digits
• There are 2^m possible m-bit blocks, each of which is equally
probable with probability 2^−m
• The self-information of an m-bit block is
I = −log2 (2^−m) = m bits (verified in the sketch below)
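A short sketch confirming the result; the block length m = 8 is an arbitrary choice for illustration:

```python
import math

m = 8                       # block length (arbitrary illustration)
p_block = 2 ** -m           # each of the 2^m blocks is equally probable
print(-math.log2(p_block))  # 8.0 -- the self-information equals m bits
```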
• An identical, independent binary source B yields the same result:
the self-information of an m-bit block from B is also m bits
Example 2
• Consider a discrete memoryless source C that generates two
bits at a time
• This source consists of two binary sources (A and B), each
contributing one bit
• The two binary sources within source C are independent
• What is the information content of the aggregate source C?
Solution
• Intuitively, the information content of the aggregate source C
should be the sum of the information contained in the outputs
of the two independent sources that constitute this source C
• Since A and B are independent
P(C) = P(A)P(B) = 0.5 × 0.5 = 0.25
I(C) = −log2 P(C) = −log2 (0.25) = 2 bits
• The answer is again consistent with intuition (see the sketch below)
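A quick numerical check of the additivity argument, using the probabilities above:

```python
import math

def info_bits(p):
    """Self-information in bits."""
    return -math.log2(p)

p_A, p_B = 0.5, 0.5
p_C = p_A * p_B                         # independence: P(C) = P(A)P(B) = 0.25
print(info_bits(p_C))                   # 2.0 bits
print(info_bits(p_A) + info_bits(p_B))  # 2.0 bits -- information adds for independent sources
```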
Example
• Given a bag containing 3 green, 4 red and 2 yellow balls, what is
the average surprise associated with choosing a ball at random
from the bag?
– What is the information gained by choosing a green ball?
– Differentiate between the two types of information
obtained (a worked sketch follows)
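One possible way to work this exercise, sketched in Python (the dictionary layout is an illustrative choice). The average surprise is the entropy over all outcomes, while the self-information concerns the single outcome "green":

```python
import math

counts = {"green": 3, "red": 4, "yellow": 2}
total = sum(counts.values())  # 9 balls in the bag

# Average surprise (entropy) of one random draw
H = -sum((n / total) * math.log2(n / total) for n in counts.values())
print(round(H, 3))            # ~1.53 bits

# Self-information of the specific outcome "green"
print(round(-math.log2(counts["green"] / total), 3))  # ~1.585 bits
```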
Example
• Calculate the entropy of a fair coin
– What is the entropy of a biased coin that comes up heads 75%
of the time?
– What is the entropy of the coin if it is somehow incapable
of landing tails? (a sketch follows)
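A sketch of the three cases using the binary entropy function, with the usual convention 0 · log 0 = 0:

```python
import math

def binary_entropy(p):
    """Entropy of a coin with P(heads) = p, in bits; 0*log(0) is taken as 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))   # 1.0    -- fair coin
print(binary_entropy(0.75))  # ~0.811 -- biased coin, 75% heads
print(binary_entropy(1.0))   # 0.0    -- never lands tails: no uncertainty
```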
Mutual Information
• Consider two discrete random variables X and Y with possible
outcomes xi where i = 1, 2, ..., n and yj where j = 1, 2, ..., m
respectively
• Suppose we observe some outcome Y = yj and we want to
determine the amount of information this event provides about
the event X = xi ∀i = 1, 2, ..., n
• That is, we want to mathematically represent the mutual
information
Mutual Information
• The mutual information between the events X = xi and Y = yj
is defined as
I(xi ; yj ) = log2 [P(xi |yj ) / P(xi )]
Note the following
• If X and Y are independent events, the occurrence of Y = yj
provides no information about X = xi , since P(xi |yj ) = P(xi )
and log2 1 = 0
• Observe that
P(xi |yj )/P(xi ) = P(xi |yj )P(yj )/[P(xi )P(yj )] = P(xi , yj )/[P(xi )P(yj )] = P(yj |xi )/P(yj )
• Hence mutual information is symmetric: I(xi ; yj ) = I(yj ; xi )
(illustrated in the sketch below)
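To illustrate the symmetry numerically, here is a small Python sketch; the joint distribution values are made up for illustration:

```python
import math

# Illustrative joint distribution P(X, Y) over binary X and Y (assumed values)
P = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def Px(x):
    return sum(P[(x, y)] for y in (0, 1))

def Py(y):
    return sum(P[(x, y)] for x in (0, 1))

x, y = 0, 0
# The three equivalent forms from the identity above -- all print the same value
print(math.log2((P[(x, y)] / Py(y)) / Px(x)))  # P(x|y)/P(x)
print(math.log2(P[(x, y)] / (Px(x) * Py(y))))  # P(x,y)/[P(x)P(y)]
print(math.log2((P[(x, y)] / Px(x)) / Py(y)))  # P(y|x)/P(y)
```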
Measure of Information
• Consider a binary symmetric channel with crossover probability
p and equally likely inputs, so that P(Y=0) = 0.5. Then
I(x0 ; y0 ) = I(0; 0) = log2 [P(Y=0|X=0)/P(Y=0)] = log2 [(1 − p)/0.5] = log2 2(1 − p)
I(x1 ; y0 ) = I(1; 0) = log2 [P(Y=0|X=1)/P(Y=0)] = log2 [p/0.5] = log2 2p
• In the noiseless case (p = 0)
I(x0 ; y0 ) = I(0; 0) = log2 2(1 − p) = log2 2 = 1 bit
• Hence, having observed the output with certainty, we can
determine what was transmitted
• Recall that the self-information of the event X = x0 was 1 bit
(see the sketch below)
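A short sketch evaluating these quantities; the crossover probability p = 0.1 is an assumed value for illustration:

```python
import math

p = 0.1  # crossover probability of the BSC (illustrative value)

# Equally likely inputs give P(Y=0) = 0.5
I_00 = math.log2(2 * (1 - p))  # I(0; 0) = log2 2(1-p) ~  0.848 bits
I_10 = math.log2(2 * p)        # I(1; 0) = log2 2p    ~ -2.322 bits
print(I_00, I_10)

# Noiseless channel (p = 0): observing Y = 0 identifies X = 0 exactly
print(math.log2(2 * (1 - 0.0)))  # 1.0 bit
```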
Measure of Information
• More generally, for a binary channel with error probabilities
p0 = P(Y=1|X=0) and p1 = P(Y=0|X=1) and equally likely inputs,
P(Y=0) = (1 − p0 + p1 )/2, so
I(x1 ; y0 ) = I(1; 0) = log2 [P(Y=0|X=1)/P(Y=0)] = log2 [2p1 /(1 − p0 + p1 )]
(a sketch follows)
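A sketch of this asymmetric case, under the definitions of p0 and p1 assumed above (the numeric values are illustrative):

```python
import math

# Assumed error probabilities: p0 = P(Y=1|X=0), p1 = P(Y=0|X=1)
p0, p1 = 0.1, 0.2

# Equally likely inputs: P(Y=0) = 0.5*(1 - p0) + 0.5*p1 = (1 - p0 + p1)/2
P_y0 = 0.5 * (1 - p0) + 0.5 * p1
print(math.log2(p1 / P_y0))  # I(1; 0) = log2 2*p1/(1 - p0 + p1) ~ -1.459 bits
```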
ADIOS