
Information

Prof. Budhaditya Bhattacharyya, SENSE, VIT University

A source is supposed to produce symbols.

The symbols form the message.
The message contains information (which may be relative to the user).
Information is quantified only on the basis of its probability of occurrence.
Information in general has four distinct properties:

Property 1 : Information (I) should always be non-negative, that is I ≥ 0.

This property is obvious; otherwise the source producing the symbol would not be called a source. Hence a source should be such that there is no loss of information.


Property 2 : For a symbol with probability approaching its highest value 1, the amount of information in it should approach its lowest value 0.

If we are absolutely certain about the outcome, even before the event occurs, no information is conveyed.
Property 3 : For two different symbols xi and xj with respective probabilities Pi and Pj, the one with the lower probability should contain more information, i.e. for Pi < Pj we must have Ii > Ij.

Property 4 : The total information conveyed by two independent symbols should be the sum of their respective information contents:

Iij = Ii + Ij


From the above properties it is obvious that

I is a function of P, and the function must be an inverse relation between I and P.

The only way to represent this relationship mathematically, consistent with all four properties, is the self-information: I = log (1/P)

base 2 (binary alphabet) : bits or binits
base e (natural logarithm) : nats
base 10 (decimal alphabet) : decits or hartleys
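As a quick numerical illustration (a minimal Python sketch, not from the original slides; the probabilities are made up), the self-information can be computed in any base and the four properties checked directly:

import math

def self_information(p, base=2):
    # Self-information I = log_base(1/p) of a symbol with probability p
    assert 0 < p <= 1, "probability must lie in (0, 1]"
    return math.log(1.0 / p, base)

# Property 1: I >= 0 for every valid probability
assert self_information(0.25) >= 0

# Property 2: P -> 1 implies I -> 0
print(self_information(1.0))              # 0.0

# Property 3: the rarer symbol carries more information
assert self_information(0.1) > self_information(0.9)

# The unit depends only on the logarithm base
print(self_information(0.5, 2))           # 1.0 bit
print(self_information(0.5, math.e))      # ~0.693 nats
print(self_information(0.5, 10))          # ~0.301 hartleys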


Let there be one event E with two statistically independent sub-events e1 and e2, having probabilities of occurrence P(e1) and P(e2) respectively.

I (E) = log (1/P (E))
      = log (1/P (e1, e2))
      = log (1/(P (e1) P (e2)))    [since e1 and e2 are independent, P (e1, e2) = P (e1) P (e2)]
      = log (1/P (e1)) + log (1/P (e2))
      = I (e1) + I (e2)
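A short numerical check of this additivity (an illustrative Python sketch; the two probabilities are hypothetical):

import math

p_e1, p_e2 = 0.5, 0.25                   # hypothetical independent sub-events
i_e1 = math.log2(1 / p_e1)               # 1.0 bit
i_e2 = math.log2(1 / p_e2)               # 2.0 bits
i_E  = math.log2(1 / (p_e1 * p_e2))      # joint probability factorizes
assert math.isclose(i_E, i_e1 + i_e2)    # 3.0 = 1.0 + 2.0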


Information cont.

All discrete sources emit outputs which are sequences of symbols drawn from a finite set called an alphabet. Just as the English language has an alphabet of 26 letters, a binary source has the alphabet {0, 1}.

Discrete Memoryless Source (DMS) : A source whose outputs are statistically independent; each output letter is statistically independent of all past and future outputs.
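Such a source can be modelled by drawing each symbol independently from a fixed distribution (a minimal Python sketch; the binary alphabet and its probabilities are illustrative):

import random

alphabet = ["0", "1"]                    # binary source alphabet
probs    = [0.8, 0.2]                    # fixed, hypothetical symbol probabilities

# Memorylessness: every symbol is drawn independently of all the others
sequence = random.choices(alphabet, weights=probs, k=20)
print("".join(sequence))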

The physical quantity that expresses the average information content of a DMS, while capturing the probabilistic behaviour of the source, is termed its ENTROPY.
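Entropy is developed in later slides; as a forward-looking sketch, the standard formula for a DMS is H = Σ Pi log (1/Pi), the average self-information per symbol:

import math

def entropy(probs):
    # H = sum(p * log2(1/p)) in bits per symbol: the average self-information
    assert math.isclose(sum(probs), 1.0)
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))               # 1.0 bit/symbol (maximum for binary)
print(entropy([0.8, 0.2]))               # ~0.722 bits/symbol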
