Individual Assignment
b) Calculate the mutual information 𝐼(𝑋; 𝑌) when the channel input probabilities are 𝑝(𝑥1) = 𝑝(𝑥2) = 0.5 and 𝑃 = 0.5, and comment on the result
c) Find the channel capacity of the BSC
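As a check on parts b) and c), the following sketch computes 𝐼(𝑋; 𝑌) and the capacity numerically. It assumes 𝑃 is the crossover probability of the BSC (the function names `h2`, `bsc_mutual_info`, and `bsc_capacity` are illustrative, not from the assignment):

```python
import math

def h2(p):
    # Binary entropy function in bits; h2(0) = h2(1) = 0 by convention.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_info(px1, eps):
    # I(X;Y) = H(Y) - H(Y|X) for a BSC with crossover probability eps.
    # P(Y = y1) = p(x1)(1 - eps) + p(x2) * eps; H(Y|X) = h2(eps) for any input.
    py1 = px1 * (1 - eps) + (1 - px1) * eps
    return h2(py1) - h2(eps)

def bsc_capacity(eps):
    # Capacity is achieved by a uniform input: C = 1 - h2(eps) bits/use.
    return 1.0 - h2(eps)

# Part b): uniform input and crossover probability 0.5.
print(bsc_mutual_info(0.5, 0.5))  # prints 0.0
```

With 𝑃 = 0.5 the output is statistically independent of the input, so 𝐼(𝑋; 𝑌) = 0 and no information gets through; the capacity 1 − 𝐻(𝑃) is likewise zero at 𝑃 = 0.5 and reaches 1 bit/use only when 𝑃 = 0 or 𝑃 = 1.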
4. A source has an alphabet {𝑎1, 𝑎2, 𝑎3, 𝑎4} with corresponding probabilities {0.1, 0.2, 0.3, 0.4}.
b) What is the minimum required average code word length to represent this source for error-free
reconstruction?
c) Design a Huffman code for the source and comment on the average length of the Huffman code
d) Design a Huffman code for the second order extension of the source. What is the average code word
length?
e) Which is the more efficient coding scheme: Huffman coding of the original source, or Huffman coding of
the second-order extension of the source?
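A numerical sketch for parts b) through e): the minimum average codeword length for error-free reconstruction is the source entropy 𝐻 (part b), and a Huffman code can be built with a min-heap of subtree probabilities. The helper `huffman_lengths` is an illustrative name, not part of the assignment:

```python
import heapq
import math
from itertools import product

def huffman_lengths(probs):
    # Return optimal (Huffman) codeword lengths for the given probabilities.
    # Heap entries: (probability, unique tiebreak id, symbol indices in subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tiebreak = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1  # each merge adds one bit to every member's codeword
        heapq.heappush(heap, (p1 + p2, tiebreak, s1 + s2))
        tiebreak += 1
    return lengths

probs = [0.1, 0.2, 0.3, 0.4]

# Part b): entropy = lower bound on average codeword length.
H = -sum(p * math.log2(p) for p in probs)  # about 1.846 bits/symbol

# Part c): average length of the Huffman code for the original source.
L1 = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))  # 1.9 bits/symbol

# Part d): second-order extension -- 16 pairs with product probabilities.
probs2 = [p * q for p, q in product(probs, probs)]
L2 = sum(p * l for p, l in zip(probs2, huffman_lengths(probs2))) / 2  # per original symbol
```

The single-symbol Huffman code assigns lengths {3, 3, 2, 1}, giving 1.9 bits/symbol against an entropy of about 1.846 bits/symbol; the second-order extension should come out to roughly 1.865 bits per original symbol, closer to the entropy, which is the expected answer to part e): coding the extension is more efficient, at the cost of a larger codebook.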