
Communication Systems

Dr. Fahim Aziz Umrani


Department of Telecommunication, Room # 215
Institute of Information & Communication Technologies (IICT),
Mehran UET, Jamshoro
https://sites.google.com/a/faculty.muet.edu.pk/fau/cs

Information theory and entropy

• Information theory tries to solve the problem of communicating as much data as possible over a noisy channel.
• The measure of data is entropy.
• Claude Shannon first demonstrated that reliable communication over a noisy channel is possible, jump-starting the digital age.


A brief introduction to Information Theory

• Information theory is a branch of science that deals with the analysis of a communications system.

[Block diagram: Source of Message → Encoder → Channel (+ Noise) → Decoder → Destination of Message]

• Claude Shannon published a landmark paper in 1948 that was the beginning of the branch of information theory.
• Information theory answers two fundamental questions in communication theory:
  • What is the ultimate data compression? (entropy)
  • What is the ultimate transmission rate of a communication system? (channel capacity)

Horse race example

• Consider eight horses in a race.
• Every day there is a race between them.
• The probability that the 1st horse wins is 1/2 (code 000)
• The probability that the 2nd horse wins is 1/4 (code 001)
• The probability that the 3rd horse wins is 1/8 (code 010)
• The probability that the 4th horse wins is 1/16 (code 011)
• The probability that the 5th horse wins is 1/64 (code 100)
• The probability that the 6th horse wins is 1/64 (code 101)
• The probability that the 7th horse wins is 1/64 (code 110)
• The probability that the 8th horse wins is 1/64 (code 111)
• To convey the result of one race each day we need three bits, since there are eight horses involved.

Horse race example

• Let's say you want to transmit 365 results (one per day); how many bits on average will you need?

Horses:            #1     #2     #3     #4     #5     #6     #7     #8
Prob. of winning:  1/2    1/4    1/8    1/16   1/64   1/64   1/64   1/64
Bits assigned:     000    001    010    011    100    101    110    111

• So the average number of bits required is:
  1/2 x 3 + 1/4 x 3 + 1/8 x 3 + 1/16 x 3 + 1/64 x 3 + 1/64 x 3 + 1/64 x 3 + 1/64 x 3 = 3 bits
• But can we transmit fewer than three bits on average to convey the same amount of information? The answer is yes! (The sketch below verifies the arithmetic.)
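As a quick sanity check, here is a minimal Python sketch (ours, not part of the slides; the variable names are illustrative) that reproduces the 3-bit average:

```python
from fractions import Fraction

# Win probabilities for the eight horses, as given on the slide.
probs = [Fraction(1, 2), Fraction(1, 4), Fraction(1, 8), Fraction(1, 16),
         Fraction(1, 64), Fraction(1, 64), Fraction(1, 64), Fraction(1, 64)]

# With the fixed-length code, every result costs 3 bits regardless of probability.
avg_fixed = sum(p * 3 for p in probs)
print(avg_fixed)  # 3 bits per race result
```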


Horse race example

• Let us say we assign the following codewords instead:
  The probability that the 1st horse wins is 1/2 (code 0)
  The probability that the 2nd horse wins is 1/4 (code 10)
  The probability that the 3rd horse wins is 1/8 (code 110)
  The probability that the 4th horse wins is 1/16 (code 1110)
  The probability that the 5th horse wins is 1/64 (code 111100)
  The probability that the 6th horse wins is 1/64 (code 111101)
  The probability that the 7th horse wins is 1/64 (code 111110)
  The probability that the 8th horse wins is 1/64 (code 111111)
• How do we decide how many bits to assign per result? The codeword length is based on log2(1/Prob. of winning), as the sketch below illustrates.
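A minimal Python sketch (ours, not from the slides) that derives those codeword lengths from the probabilities:

```python
import math
from fractions import Fraction

# Win probabilities for the eight horses.
probs = [Fraction(1, 2), Fraction(1, 4), Fraction(1, 8), Fraction(1, 16),
         Fraction(1, 64), Fraction(1, 64), Fraction(1, 64), Fraction(1, 64)]

# Codeword length for each horse: log2(1 / p) bits.
# (Exact integers here because every probability is a power of 1/2.)
for i, p in enumerate(probs, start=1):
    bits = math.log2(1 / p)
    print(f"horse #{i}: p = {p}, codeword length = {int(bits)} bits")
```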


Horse race example

• Let's say you want to transmit 365 results (one per day); how many bits on average will you need now?

Horses:            #1     #2     #3     #4     #5      #6      #7      #8
Prob. of winning:  1/2    1/4    1/8    1/16   1/64    1/64    1/64    1/64
Bits assigned:     0      10     110    1110   111100  111101  111110  111111

• So the average number of bits required is:
  1/2 x 1 + 1/4 x 2 + 1/8 x 3 + 1/16 x 4 + 1/64 x 6 + 1/64 x 6 + 1/64 x 6 + 1/64 x 6 = 2 bits
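A quick Python check (ours) that this variable-length code really averages 2 bits per result:

```python
from fractions import Fraction

probs = [Fraction(1, 2), Fraction(1, 4), Fraction(1, 8), Fraction(1, 16),
         Fraction(1, 64), Fraction(1, 64), Fraction(1, 64), Fraction(1, 64)]
codes = ["0", "10", "110", "1110", "111100", "111101", "111110", "111111"]

# Expected codeword length: sum over horses of p_i * len(code_i).
avg_bits = sum(p * len(c) for p, c in zip(probs, codes))
print(avg_bits)  # 2 bits per race result
```

Over 365 results this saves about 365 bits compared with the fixed 3-bit code, with no loss of information.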


Conclusions
• We should use fewer bits for frequent events!
• Frequent events carry less information.
• Rare events carry more information.


Information theory
• In our case, the messages will be a sequence of binary digits.
• One detail that makes communicating difficult is noise:
  • noise introduces uncertainty.
• Suppose I wish to transmit one bit of information; what are all of the possibilities?
  • tx 0, rx 0: good
  • tx 0, rx 1: error
  • tx 1, rx 0: error
  • tx 1, rx 1: good
• Two of the cases above have errors; this is where probability fits into the picture.

Measure of “Information”
• Suppose we have an event X, where xi represents a particular outcome of the event.
• Consider flipping a fair coin; there are two equiprobable outcomes, say:
  • x0 = heads, P0 = 1/2,
  • x1 = tails, P1 = 1/2.
• The amount of information for any single result is:

  I(X = xi) = I(xi) = log2(1/P(xi)) = -log2(P(xi)) bits

• In the coin-tossing example, the information is 1 bit.
• In other words, the number of bits required to communicate the result of the event is 1 bit, as the sketch below confirms.
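A one-line self-information helper in Python (our sketch, not from the slides):

```python
import math

def self_information(p: float) -> float:
    """Information content, in bits, of an outcome with probability p."""
    return -math.log2(p)

print(self_information(0.5))   # 1.0 bit for a fair-coin outcome
print(self_information(0.99))  # ~0.014 bits: near-certain outcomes carry little information
```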


Definition of “Information”
• When outcomes are equally likely, there is a lot of information in the result.
• The higher the likelihood of a particular outcome, the less information that outcome conveys.
• However, if the coin is biased so that it lands heads up 99% of the time, there is not much information conveyed when we flip the coin and it lands on heads.


Example
• An event X randomly generates 1 or 0 with equal probability, P(X=0) = P(X=1) = 0.5;
  then I(X) = -log2(0.5) = 1,
  i.e., 1 bit of information each time X occurs.
• If X is always 1, then P(X=0) = 0 and P(X=1) = 1;
  then I(X=0) = -log2(0) = ∞
  and I(X=1) = -log2(1) = 0.


Discussion
• I(X=1) = -log2(1) = 0
  means no information is delivered by X, which is consistent with X = 1 all the time.
• I(X=0) = -log2(0) = ∞
  means that if X = 0 occurred, a huge amount of information would arrive; however, since P(X=0) = 0, this never happens.


Entropy
• Entropy is the average information delivered per outcome, i.e. the expected value of the self-information over all L possible outcomes (see the sketch below):

  H(X) = Σ(i=1 to L) P(xi) I(xi) = -Σ(i=1 to L) P(xi) log2(P(xi))


Example
• Consider an event with two outcomes, 1 and 0, occurring with probabilities p and 1 - p respectively.
• Then H(X) = p*log2(1/p) + (1 - p)*log2(1/(1 - p)).
  For p = 0 and p = 1, H(X) = 0.
  For p = 0.5, H(X) = 1.
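Evaluating this binary entropy function at a few points (our sketch, built only from the formula above):

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) = p*log2(1/p) + (1-p)*log2(1/(1-p)), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

for p in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"p = {p}: H = {binary_entropy(p):.3f} bits")  # maximum of 1 bit at p = 0.5
```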


Channel capacity
Definition:
• Channel capacity is the maximum rate at which information can be reliably transmitted over a communications channel.

Shannon channel capacity:
• In the early 1940s, it was thought that increasing the transmission rate of information over a communication channel increased the probability of error.
• Shannon surprised the communication theory community by proving that this was not true as long as the communication rate was below the channel capacity:

  C = BW x log2(1 + SNR) bits per second

  where BW is the channel bandwidth in Hz and SNR is the signal-to-noise ratio expressed as a linear ratio (not in dB).
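A small sketch (ours; the example numbers are illustrative, not from the slides) evaluating this formula:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Channel capacity in bits/s: C = BW * log2(1 + SNR), with SNR supplied in dB."""
    snr_linear = 10 ** (snr_db / 10)  # convert dB to a linear power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# E.g. a telephone-grade channel: 3 kHz bandwidth, 30 dB SNR.
print(f"{shannon_capacity(3_000, 30):,.0f} bits/s")  # roughly 29,900 bits/s
```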


Fundamental Constraint
• Shannon’s capacity upper bound: the achievable data rate is fundamentally limited by bandwidth and signal-to-noise ratio.


Fundamental Constraints
• Fundamental constraints for high data rate communications