Conditions of Occurrence of Events: Entropy
Entropy
When we observe the possible occurrences of an event and ask how surprising or uncertain it would be, we are really asking about the average information content delivered by the source of that event.
Entropy can be defined as a measure of the average information content per source symbol. Claude Shannon, the "father of information theory", gave the formula

H(X) = \sum_{j=0}^{J-1} p(x_j) \log_2 \frac{1}{p(x_j)}

where p(x_j) is the probability of the source emitting symbol x_j, and H is measured in bits per symbol.
Mutual Information
Let us consider a channel whose output is Y and input is X.

Let the prior uncertainty about the input be the entropy H(X). (This is assumed before the input is applied.)

To know the uncertainty about the input that remains after the output is observed, consider the conditional entropy, given that Y = y_k:

H(X \mid Y = y_k) = \sum_{j=0}^{J-1} p(x_j \mid y_k) \log_2 \frac{1}{p(x_j \mid y_k)}

This is a random variable taking the values H(X \mid Y = y_0), \ldots, H(X \mid Y = y_{K-1}) with probabilities p(y_0), \ldots, p(y_{K-1}) respectively.

Averaging over the output alphabet gives the conditional entropy

H(X \mid Y) = \sum_{k=0}^{K-1} H(X \mid Y = y_k)\, p(y_k)
            = \sum_{k=0}^{K-1} \sum_{j=0}^{J-1} p(x_j \mid y_k)\, p(y_k) \log_2 \frac{1}{p(x_j \mid y_k)}
            = \sum_{k=0}^{K-1} \sum_{j=0}^{J-1} p(x_j, y_k) \log_2 \frac{1}{p(x_j \mid y_k)}

Taking both uncertainty conditions (before and after applying the input), the difference H(X) - H(X \mid Y) represents the uncertainty about the channel input that is resolved by observing the channel output. This is called the mutual information of the channel.
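A minimal Python sketch of these definitions, assuming a made-up joint distribution p(x_j, y_k); the numbers happen to describe a binary symmetric channel with crossover probability 0.2 and equiprobable inputs.

```python
import math

def entropy(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Hypothetical joint distribution p(x_j, y_k); rows index x, columns index y.
joint = [[0.4, 0.1],
         [0.1, 0.4]]

p_x = [sum(row) for row in joint]            # marginal p(x_j)
p_y = [sum(col) for col in zip(*joint)]      # marginal p(y_k)

# H(X|Y) = sum_k sum_j p(x_j, y_k) * log2(1 / p(x_j | y_k)),
# where p(x_j | y_k) = p(x_j, y_k) / p(y_k).
h_x_given_y = sum(
    joint[j][k] * math.log2(p_y[k] / joint[j][k])
    for j in range(2) for k in range(2) if joint[j][k] > 0
)

i_xy = entropy(p_x) - h_x_given_y            # I(X;Y) = H(X) - H(X|Y)
print(f"H(X) = {entropy(p_x):.3f}, H(X|Y) = {h_x_given_y:.3f}, "
      f"I(X;Y) = {i_xy:.3f} bits")           # I(X;Y) is about 0.278 bits
```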
If \bar{L} is the average code-word length and L_{\min} is the minimum possible value of \bar{L}, then the coding efficiency can be defined as

\eta = \frac{L_{\min}}{\bar{L}}

For this, the value L_{\min} has to be determined. By Shannon's source-coding theorem, the average code-word length of any source encoding is bounded below by the source entropy, so L_{\min} = H(X) bits per symbol for a binary code.
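A small Python sketch of this definition, assuming a hypothetical four-symbol source and a prefix code of the kind a Huffman construction would produce:

```python
import math

# Hypothetical source with an assumed prefix code: codewords 0, 10, 110, 111.
probs   = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]

L_bar = sum(p * l for p, l in zip(probs, lengths))      # average length
H     = sum(p * math.log2(1.0 / p) for p in probs)      # entropy = Lmin
eta   = H / L_bar                                       # coding efficiency

print(L_bar, H, eta)   # 1.75 1.75 1.0
```

Because this source's probabilities are all powers of 1/2, the prefix code matches the entropy exactly and the efficiency reaches \eta = 1.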
For example, the Morse code for the word QUEUE is --.- ..- . ..- .
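A tiny Python sketch of the same idea; only the Morse codewords for Q, U, and E are included here. Frequent letters such as E get short codewords while rare ones such as Q get long ones, which lowers the average code-word length.

```python
# Morse as a variable-length source code (partial table for this example).
MORSE = {"Q": "--.-", "U": "..-", "E": "."}

def morse_encode(word: str) -> str:
    return " ".join(MORSE[ch] for ch in word.upper())

print(morse_encode("QUEUE"))   # --.- ..- . ..- .
```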
BCH Codes
BCH codes are named after their inventors Bose, Chaudhuri, and Hocquenghem. A BCH code is designed with direct control over the number of symbol errors it can correct, so multiple-bit error correction is possible. BCH codes are a powerful class of error-correcting codes.
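As an illustration, here is a minimal Python sketch of systematic encoding for the binary (15, 7) BCH code with t = 2; its generator polynomial g(x) = x^8 + x^7 + x^6 + x^4 + 1 is the standard textbook value. Decoding is omitted.

```python
# Systematic encoding for the binary (15, 7) BCH code with t = 2.
N, K = 15, 7
GEN = 0b111010001   # g(x) = x^8 + x^7 + x^6 + x^4 + 1, degree N - K = 8

def bch_encode(msg: int) -> int:
    """Encode a 7-bit message into a 15-bit codeword.

    The message polynomial is multiplied by x^(N-K); the remainder of its
    division by g(x) over GF(2) becomes the parity bits.
    """
    assert 0 <= msg < (1 << K)
    rem = msg << (N - K)
    for shift in range(N - 1, N - K - 1, -1):   # clear bits 14 down to 8
        if rem & (1 << shift):
            rem ^= GEN << (shift - (N - K))
    return (msg << (N - K)) | rem               # message bits + parity bits

cw = bch_encode(0b1011001)
print(f"{cw:015b}")   # every valid codeword is divisible by g(x)
```

A full decoder would compute syndromes over GF(2^4) and run an algorithm such as Berlekamp-Massey to locate up to two flipped bits; that machinery is beyond this sketch.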