Cha 02
Example: a binary memoryless source emits

abaabaababbbaabbabab…

with symbol probabilities
P(a) = 0.5
P(b) = 0.5
The average number of bits to represent a symbol is therefore

H = −∑ᵢ pᵢ log₂ pᵢ  (sum over i = 0, 1)  = 1 bit
Self Information
So, let’s look at it the way Shannon did.
Assume a memoryless source with
◼ alphabet A = (a1, …, an)
◼ symbol probabilities (p1, …, pn).
How much information do we get when finding out that the
next symbol is ai?
According to Shannon, the self-information of ai is
I(ai) = −log pi
Why?
Assume two independent events A and B, with
probabilities P(A) = pA and P(B) = pB.
Observing both should give the sum of the individual
informations, I(A, B) = I(A) + I(B), while
P(A, B) = pA · pB — and the logarithm is exactly the
function that turns products into sums:
−log(pA · pB) = −log pA − log pB.
Which logarithm? Pick the one you like! If you pick the natural log,
you’ll measure in nats; if you pick the base-10 log, you’ll get Hartleys;
if you pick the base-2 log (like everyone else), you’ll get bits.
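A quick numeric check of the three units (a Python sketch; the probability 1/8 is just an illustrative value):

```python
import math

# Self-information I(a_i) = -log p_i of one symbol, in the three units.
p = 0.125

bits = -math.log2(p)       # base-2 log  -> bits
nats = -math.log(p)        # natural log -> nats
hartleys = -math.log10(p)  # base-10 log -> Hartleys

print(bits)  # 3.0 -- a probability-1/8 event carries 3 bits
print(round(nats, 3))
print(round(hartleys, 3))
```

The three values differ only by the constant factors log 2 and log₁₀ 2.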
Entropy
Example: Binary Memoryless Source
A BMS emits binary sequences, e.g. 01101000…
Let P(1) = p and P(0) = 1 − p.
Then
H = −p log₂ p − (1 − p) log₂(1 − p)
[Figure: the binary entropy function H(p), rising from 0 at p = 0 to 1 bit at p = 0.5 and back to 0 at p = 1]
The uncertainty (information) is greatest when p = 0.5.
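The binary entropy function is easy to tabulate directly (a minimal sketch; the function name is my own):

```python
import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1-p) log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Entropy peaks at p = 0.5 (1 bit) and vanishes at the endpoints.
for p in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(p, round(binary_entropy(p), 3))
```

Note the symmetry H(p) = H(1 − p): relabeling 0s and 1s cannot change the uncertainty.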
Example
Three symbols a, b, c with corresponding probabilities:
What is H(P)?
What is H(Q)?
Important parameters:
Average length: L = ∑ᵢ pᵢ lᵢ
Coding efficiency: E = H / L
where lᵢ is the length of codeword i (in binary digits).
Source Coding Methods
▪ Fano-Shannon method
▪ Huffman’s Method
Fano-Shannon Coding
Write the symbols in a table in descending order
of probability;
Insert dividing lines to successively divide the
probabilities into halves, quarters, etc. (or as
near as possible);
Add a ‘0’ or a ‘1’ to the code at each division;
The final code for each symbol is obtained by
reading the digits from left to right down to that
symbol.
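The steps above can be sketched recursively (a minimal Python sketch; the symbol representation and the tie-breaking rule at each division are my own choices, so individual codes may differ from a hand-worked table even when the average length agrees):

```python
def shannon_fano(probs):
    """probs: dict symbol -> probability. Returns dict symbol -> code string."""
    symbols = sorted(probs, key=probs.get, reverse=True)  # step 1: sort
    codes = {s: "" for s in symbols}

    def split(group):
        if len(group) <= 1:
            return
        total = sum(probs[s] for s in group)
        # Step 2: find the dividing line that splits the probability
        # mass as close to half as possible.
        running, best_i, best_diff = 0.0, 1, float("inf")
        for i, s in enumerate(group[:-1], start=1):
            running += probs[s]
            diff = abs(2 * running - total)
            if diff < best_diff:
                best_diff, best_i = diff, i
        # Step 3: append '0' to the upper half, '1' to the lower half.
        for s in group[:best_i]:
            codes[s] += "0"
        for s in group[best_i:]:
            codes[s] += "1"
        split(group[:best_i])
        split(group[best_i:])

    split(symbols)
    return codes

print(shannon_fano({"s1": 0.5, "s2": 0.2, "s3": 0.1, "s4": 0.1, "s5": 0.1}))
```

Because every code is built by appending bits only within one side of a division, the result is always prefix-free.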
Example

Symbol  Prob.  Code
s1      0.5    0
s2      0.2    100
s3      0.1    101
s4      0.1    110
s5      0.1    111

L = 0.5×1 + 0.2×3 + 3×0.1×3 = 2.0
H ≈ 1.96
E = H/L ≈ 0.98
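The example’s numbers can be verified directly from the table:

```python
import math

p = [0.5, 0.2, 0.1, 0.1, 0.1]   # symbol probabilities from the table
l = [1, 3, 3, 3, 3]             # codeword lengths: 0, 100, 101, 110, 111

L = sum(pi * li for pi, li in zip(p, l))    # average length
H = -sum(pi * math.log2(pi) for pi in p)    # source entropy
E = H / L                                   # coding efficiency

print(round(L, 2))   # 2.0
print(round(H, 2))   # 1.96
print(round(E, 2))   # 0.98
```

Note that H ≤ L always holds, so E ≤ 1; E = 0.98 means the code is within 2% of the entropy limit.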
Huffman’s Method
1. Write the symbols in a table in descending
order of probability;
2. Add the probabilities in pairs from the bottom
and reorder; repeat until a single probability remains.
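Step 2’s repeated “merge the two smallest” is exactly what a priority queue does; a minimal sketch (heapq-based; the function and variable names are my own):

```python
import heapq
import itertools

def huffman(probs):
    """probs: dict symbol -> probability. Returns dict symbol -> code string."""
    tie = itertools.count()  # tie-breaker so the heap never compares dicts
    heap = [(p, next(tie), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        # Take the two least probable entries (step 2, repeated) ...
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        # ... prepend a branch bit to every code in each group, and merge.
        for s in c1:
            c1[s] = "0" + c1[s]
        for s in c2:
            c2[s] = "1" + c2[s]
        heapq.heappush(heap, (p1 + p2, next(tie), {**c1, **c2}))
    return heap[0][2]

codes = huffman({"s1": 0.5, "s2": 0.2, "s3": 0.1, "s4": 0.1, "s5": 0.1})
print(codes)
```

For these probabilities Huffman’s method reaches the same average length (2.0) as the Fano-Shannon code; in general Huffman is optimal, Fano-Shannon only near-optimal.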
[Block diagram: source (100 symbol/s) → coder → channel (binary alphabet 0, 1) → decoder]
How?
By mapping the incoming data sequence into a channel input
sequence and inverse mapping the channel output sequence
into an output data sequence in such a way that the overall
effect of channel noise on the system is minimized.
Linear Block Codes…
Modulo-2 operations
The encoding and decoding functions involve the binary
arithmetic operation of modulo-2
Rules for modulo-2 operations are:
Modulo-2 addition:
0 + 0 = 0
1 + 0 = 1
0 + 1 = 1
1 + 1 = 0
Modulo-2 multiplication:
0 × 0 = 0
1 × 0 = 0
0 × 1 = 0
1 × 1 = 1
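The two tables are exactly the XOR and AND truth tables, which is why modulo-2 encoders and decoders reduce to simple logic gates; a quick check:

```python
# Modulo-2 addition is bitwise XOR; modulo-2 multiplication is bitwise AND.
for a in (0, 1):
    for b in (0, 1):
        print(f"{a} + {b} = {a ^ b}    {a} x {b} = {a & b}")
```

In particular, 1 + 1 = 0 means every element is its own additive inverse, so modulo-2 subtraction is the same operation as addition.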