UNIT-1
1. Define entropy.
Entropy of a source is the average information content per source symbol. Source codes try to
reduce the redundancy present in the source and represent the source with fewer bits that
carry more information.
H = \sum_{k=1}^{M} p_k \log_2 \frac{1}{p_k}

where M is the number of source symbols and p_k is the probability of the k-th symbol; with base-2 logarithms, H is measured in bits per symbol.
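A minimal Python sketch of this formula (the function name `entropy` and the example distribution are illustrative assumptions, not from the source):

```python
import math

def entropy(probs):
    """Entropy H = sum over k of p_k * log2(1/p_k), in bits per symbol.
    Terms with p_k = 0 are skipped, since p*log2(1/p) -> 0 as p -> 0."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Example: a four-symbol source with unequal probabilities
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits/symbol
```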
2. Give two properties of information.
Information is non-negative: I(s_k) ≥ 0.
The less probable a symbol, the more information it carries: if I(s_k) > I(s_i), then p(s_k) < p(s_i).
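A short sketch of both properties, assuming the standard self-information definition I(s_k) = log2(1/p_k) (the function name `information` and the example probabilities are illustrative):

```python
import math

def information(p):
    """Self-information I = log2(1/p) in bits; non-negative since 0 < p <= 1."""
    return math.log2(1.0 / p)

print(information(0.5))    # 1.0 bit  (more probable symbol, less information)
print(information(0.125))  # 3.0 bits (less probable symbol, more information)
```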
3. Give two properties of entropy.
Entropy is zero if the event is certain or impossible: H = 0 if p_k = 0 or p_k = 1.
When p_k = 1/M for all M symbols, the symbols are equally likely and the entropy reaches its maximum value, H = log_2 M.
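A brief Python sketch checking both extremes of the entropy formula above (a self-contained illustration; the four-symbol distributions are assumed examples):

```python
import math

def entropy(probs):
    """H = sum over k of p_k * log2(1/p_k), skipping zero-probability terms."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

print(entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 -> a sure event gives zero entropy
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 -> equally likely, H = log2(M) = log2(4)
```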