
Digital Image Processing

Huffman Coding Example


 Suppose X is a source producing symbols from the alphabet A = {a1, a2, a3, a4, a5}.
 Suppose the probabilities of the symbols are {0.4, 0.2, 0.2, 0.15, 0.05}.
 Form the Huffman tree by repeatedly merging the two least probable nodes:

[Huffman tree figure: a4 (0.15) and a5 (0.05) merge into a node of probability 0.2; that node merges with a3 (0.2) into 0.4; with a2 (0.2) into 0.6; and with a1 (0.4) into the root, 1.0. At each merge, one branch is labeled 0 and the other 1.]

Symbol | Probability | Codeword
a1     | 0.4         | 0
a2     | 0.2         | 10
a3     | 0.2         | 110
a4     | 0.15        | 1110
a5     | 0.05        | 1111

Average codeword length = 0.4*1 + 0.2*2 + 0.2*3 + 0.15*4 + 0.05*4 = 2.2 bits per symbol

Entropy = H(x) = Σ_i P_i(x) log2(1/P_i(x)) = 2.08 bits per symbol

Bahadir K. Gunturk
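The construction above can be sketched in Python. This is a minimal illustration, not from the slides; the heap-based merge order resolves the ties among the 0.2 nodes differently from the figure, so the exact bit patterns may differ, but the result is always a valid prefix-free code with average length 2.2 bits per symbol.

```python
import heapq
from itertools import count

def huffman_codes(probs):
    """Repeatedly merge the two least probable nodes (illustrative sketch)."""
    tick = count()  # tie-breaker so the heap never compares dicts
    heap = [(p, next(tick), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p_lo, _, lo = heapq.heappop(heap)
        p_hi, _, hi = heapq.heappop(heap)
        # prepend one bit to every codeword in each merged subtree
        merged = {s: "0" + c for s, c in lo.items()}
        merged.update({s: "1" + c for s, c in hi.items()})
        heapq.heappush(heap, (p_lo + p_hi, next(tick), merged))
    return heap[0][2]

probs = {"a1": 0.4, "a2": 0.2, "a3": 0.2, "a4": 0.15, "a5": 0.05}
codes = huffman_codes(probs)
avg_len = sum(probs[s] * len(codes[s]) for s in probs)
print(codes)              # a prefix-free code; ties may break differently from the slide
print(round(avg_len, 2))  # 2.2
```

Representing each node as a dict of partial codewords avoids building an explicit tree; the counter is only there because Python's heapq cannot compare dicts when probabilities tie.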
Huffman Coding Example
 Another possible tree for the same source is:

[Huffman tree figure: a4 (0.15) and a5 (0.05) merge into a node of probability 0.2; a2 (0.2) and a3 (0.2) merge into 0.4; those two nodes merge into 0.6; the root, 1.0, joins the 0.6 node with a1 (0.4). At each merge, one branch is labeled 0 and the other 1.]

Symbol | Probability | Codeword
a1     | 0.4         | 0
a2     | 0.2         | 100
a3     | 0.2         | 101
a4     | 0.15        | 110
a5     | 0.05        | 111

Average codeword length = 0.4*1 + 0.2*3 + 0.2*3 + 0.15*3 + 0.05*3 = 2.2 bits per symbol
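Both trees reach the same average length, 2.2 bits per symbol, slightly above the entropy of 2.08 bits per symbol from the first slide; no uniquely decodable code can average fewer bits than the entropy. A quick numeric check of that bound (a sketch, not part of the slides):

```python
import math

probs = [0.4, 0.2, 0.2, 0.15, 0.05]

# Entropy H(X) = sum_i P_i * log2(1 / P_i)
entropy = sum(p * math.log2(1 / p) for p in probs)

avg_len = 2.2  # average codeword length of either Huffman code above

print(f"H(X) = {entropy:.2f} bits/symbol")      # H(X) = 2.08 bits/symbol
print(f"efficiency = {entropy / avg_len:.1%}")  # efficiency = 94.7%
```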
