
Information Theory and Coding

UNIT-1
1. Define entropy?
Entropy of a source is a measure of the average information content per source symbol. Source codes try to
reduce the redundancy present in the source and represent the source with fewer bits that
carry more information.
H = Σ pk log2(1/pk), where the sum runs over the M source symbols (k = 1 to M).
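As a quick illustration (not part of the original notes), the entropy formula can be evaluated in Python; the function name and the example distributions below are assumed for illustration only.

import math

def entropy(probs):
    """Entropy H = sum of pk * log2(1/pk), in bits/symbol; terms with pk = 0 contribute 0."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# M = 4 equally likely symbols give the maximum entropy H = log2(4) = 2 bits/symbol.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75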
2. Give two properties of information.
Information must be non-negative, i.e. I(sk) >= 0.
The lower the probability of a symbol, the higher its information content, and vice versa:
if I(sk) > I(si), then p(sk) < p(si).
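For example, a symbol with probability 1/8 carries I = log2 8 = 3 bits of information, while a more probable symbol with probability 1/2 carries only I = log2 2 = 1 bit.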
3. Give two properties of entropy?
Entropy is zero if the event is sure or impossible:
H = 0 if pk = 0 or pk = 1.
When pk = 1/M for all M symbols, the symbols are equally likely, and for such a source the entropy is maximum: H = log2 M.
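For example, a fair coin is a source with M = 2 equally likely symbols, so H = log2 2 = 1 bit/symbol.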


4. What is a memoryless channel?
A channel is said to be memoryless when the current output symbol depends only on the
current input symbol and not on any of the previous inputs.
5. Define mutual information?
Mutual information of the channel is the average amount of information gained about the
transmitted symbol when the received symbol is known.
6. Give two properties of mutual information?
Mutual information is always non-negative, i.e. I(X;Y) >= 0.
The mutual information of the channel is symmetric, i.e. I(X;Y) = I(Y;X).
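Both properties can be checked numerically. The sketch below is an added illustration (the joint probability matrix is assumed, not from the original notes); it computes I(X;Y) from a joint distribution and shows the symmetry.

import math

def mutual_information(joint):
    """I(X;Y) = sum over (x, y) of p(x,y) * log2( p(x,y) / (p(x)*p(y)) ), in bits."""
    px = [sum(row) for row in joint]        # marginal distribution of X (row sums)
    py = [sum(col) for col in zip(*joint)]  # marginal distribution of Y (column sums)
    return sum(pxy * math.log2(pxy / (px[i] * py[j]))
               for i, row in enumerate(joint)
               for j, pxy in enumerate(row) if pxy > 0)

# Assumed joint probability matrix p(x, y); all entries sum to 1
# (the joint probability property from question 8 below).
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(mutual_information(joint))  # about 0.278 bits, always non-negative
transposed = [list(col) for col in zip(*joint)]
print(mutual_information(transposed))  # same value: I(X;Y) = I(Y;X)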
7. What are the two important points to be considered for a code word?
The code words produced by the source encoder are in binary form.
The source code is uniquely decodable.

8. What is the important property of the joint probability matrix?
The sum of all the elements in the joint probability matrix is equal to 1.
9. What is meant by Source Coding?
Source coding refers to the conversion of the symbols of a source into
binary data suitable for transmission. The objective of source coding is to minimize the
average bit rate required to represent the source. Code length and efficiency are
terms related to source coding.
10. Define Entropy Coding.
The design of a variable length code such that its average code word
length approaches the entropy of the DMS is often referred to as entropy coding. There are two
types:
1. Shannon-Fano Coding
2. Huffman Coding
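A minimal Huffman coding sketch in Python is given below as an added illustration (the symbols and probabilities are assumed, not from the original notes); it repeatedly merges the two least probable nodes, which is the core of the Huffman procedure.

import heapq

def huffman_code(probs):
    """Build a Huffman code for {symbol: probability}; returns {symbol: codeword}."""
    # Each heap entry: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # two least probable nodes
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}      # prefix one branch with 0
        merged.update({s: "1" + w for s, w in c2.items()})  # and the other with 1
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

# Assumed example source; more probable symbols receive shorter codewords.
print(huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
# e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'} (up to 0/1 relabeling)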
11. Define Code Efficiency.
The code efficiency is denoted by η and is defined as η = Lmin / L, where Lmin is the
minimum possible value of L. When η approaches unity, the code is said to be efficient
(i.e. η = 1).
12. Define Fixed Length Codes and Variable Length Codes.
A fixed length code is one whose code word length is fixed. Code 1 and code 2 of the
given binary table are fixed length codes with length 2.
A variable length code is one whose code word length is not fixed. All codes of the
given binary table except code 1 and code 2 are variable length codes.
13. Define Source Coding Theorem.
The source coding theorem states that for a DMS X with entropy H(X), the average code
word length L per symbol is bounded as L >= H(X), and further, L can be made as close to
H(X) as desired for some suitably chosen code. Thus, with Lmin = H(X), the code efficiency can
be rewritten as η = H(X) / L.
14. Define the average length of the code word.

The average length of the code word is L(bar) = Σ Pi Li, where Li is the length of the
i-th code word and should be minimum.
L(bar) should approach the value H(X)/log D, with the condition L(bar) >= H(X)/log D. Since
Σ Pi Li is to be minimized, the code words with larger Li should have smaller Pi,
i.e. Li ∝ log(1/Pi).
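The average length, code efficiency, and redundancy can be computed together, as in the added Python sketch below (the probabilities and code word lengths are assumed, matching the Huffman example above).

import math

# Assumed source probabilities and the code word lengths of the Huffman code above.
probs   = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]

L = sum(p * l for p, l in zip(probs, lengths))  # average length L(bar) = sum of Pi*Li
H = sum(p * math.log2(1.0 / p) for p in probs)  # source entropy H(X)
efficiency = H / L                              # eta = H(X)/L, with L >= H(X)
redundancy = 1 - efficiency                     # redundancy = 1 - eta

print(L, H, efficiency, redundancy)  # 1.75 1.75 1.0 0.0 (this code is optimal)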
15. Define information.
Consider a communication system which transmits messages m1, m2, m3, ... with
probabilities of occurrence p1, p2, p3, ... The amount of information carried by the
message mk with probability pk is given as,
Amount of information: Ik = log2(1/pk)
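For example, a message with probability pk = 1/4 carries Ik = log2 4 = 2 bits of information.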
16. What is code redundancy?
It is a measure of the redundancy of bits in the encoded message sequence. It is given as,
Redundancy = 1 - code efficiency = 1 - η
Redundancy should be as low as possible.
17. What is information rate?
The information rate is represented by R and is given as R = rH, where r is the rate at
which the source emits symbols (symbols per second) and H is the entropy in bits per
symbol, so R is measured in bits per second.
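For example, a source emitting r = 1000 symbols per second with entropy H = 1.75 bits/symbol has an information rate R = rH = 1750 bits per second.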