TC-502 INFORMATION THEORY

INFORMATION CHANNELS - Channel Capacity
PROBLEM
Calculate the mutual information of the BSC shown below.
PROBLEM
Calculate the mutual information of the BEC shown below.
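Since the original channel diagrams are not reproduced here, the following is a minimal sketch of how these problems can be worked numerically, assuming example parameter values (crossover probability p = 0.1 for the BSC, erasure probability α = 0.2 for the BEC) and equiprobable inputs:

```python
# Mutual information of a discrete channel from its transition matrix.
# The parameter values below are assumed, as the original figures are missing.
import numpy as np

def mutual_information(p_x, P_y_given_x):
    """I(X;Y) in bits for input pmf p_x and row-stochastic transition matrix."""
    p_x = np.asarray(p_x, dtype=float)
    P = np.asarray(P_y_given_x, dtype=float)
    p_xy = p_x[:, None] * P            # joint pmf p(x, y)
    p_y = p_xy.sum(axis=0)             # output pmf p(y)
    mask = p_xy > 0                    # skip zero terms (0 log 0 := 0)
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y)[mask])).sum())

# BSC with assumed crossover probability p = 0.1
p = 0.1
bsc = [[1 - p, p],
       [p, 1 - p]]
print("I(X;Y) for BSC:", mutual_information([0.5, 0.5], bsc))  # ~0.531 bits

# BEC with assumed erasure probability a = 0.2 (outputs: 0, erasure, 1)
a = 0.2
bec = [[1 - a, a, 0],
       [0, a, 1 - a]]
print("I(X;Y) for BEC:", mutual_information([0.5, 0.5], bec))  # = 1 - a = 0.8 bits
```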
Channel Capacity

Consider a discrete memoryless channel (DMC) with m inputs and n outputs.


The maximum entropy of a random variable occurs when all of its possible symbols are equiprobable.
Maximum entropy of the input random variable X: $H(X)_{\max} = \log_2 m$
Maximum entropy of the output random variable Y: $H(Y)_{\max} = \log_2 n$
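A small numeric check of this bound, assuming an alphabet of m = 4 symbols:

```python
# Entropy of a few pmfs over an assumed 4-symbol alphabet; the uniform pmf
# attains the bound log2(4) = 2 bits.
import numpy as np

def entropy(pmf):
    """H(X) in bits; zero-probability symbols contribute 0 (0 log 0 := 0)."""
    p = np.asarray(pmf, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits (the maximum, log2 m)
print(entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.357 bits
print(entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 bits
```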
Channel Capacity
The capacity of a channel is the maximum of the mutual information $I(X;Y)$ over all possible input probability distributions:
$C = \max_{p(x)} I(X;Y)$ bits per channel use.
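Because capacity is defined by a maximization over input distributions, it can be computed numerically. The sketch below uses the Blahut-Arimoto algorithm, which is not covered on these slides but is the standard iterative method for this maximization; the crossover probability is an assumed example value.

```python
# Blahut-Arimoto sketch: numerical capacity of a DMC from its transition matrix.
import numpy as np

def blahut_arimoto(P, iters=300):
    """Capacity in bits of the DMC with row-stochastic transition matrix P."""
    P = np.asarray(P, dtype=float)
    p = np.full(P.shape[0], 1.0 / P.shape[0])    # start from equiprobable inputs
    for _ in range(iters):
        q = p @ P                                # output pmf induced by p
        # D[x] = sum_y P(y|x) log2(P(y|x)/q(y)), with 0 log 0 := 0
        ratio = np.divide(P, q, out=np.ones_like(P), where=P > 0)
        D = (P * np.log2(ratio)).sum(axis=1)
        p *= np.exp2(D)                          # multiplicative update
        p /= p.sum()
    # I(X;Y) at the (near-)optimal input distribution
    q = p @ P
    ratio = np.divide(P, q, out=np.ones_like(P), where=P > 0)
    return float((p[:, None] * P * np.log2(ratio)).sum())

p_err = 0.1                                      # assumed crossover probability
print(blahut_arimoto([[1 - p_err, p_err],
                      [p_err, 1 - p_err]]))      # ~0.531 bits = 1 - H(0.1)
```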
Capacity of Special Channels
For a BSC with crossover probability p, the capacity is $C = 1 - H(p)$, where $H(p) = -p\log_2 p - (1-p)\log_2(1-p)$ is the binary entropy function. For a BEC with erasure probability $\alpha$, the capacity is $C = 1 - \alpha$.
PROBLEM
Find the capacity of the BSC and BEC shown below.
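A closed-form check for this problem, again with assumed parameter values (p = 0.1, α = 0.2) since the original figures are not reproduced:

```python
# Capacity of the BSC and BEC from the closed-form expressions above.
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p, a = 0.1, 0.2                            # assumed example values
print("C_BSC = 1 - H(p) =", 1 - h2(p))     # ~0.531 bits per channel use
print("C_BEC = 1 - a    =", 1 - a)         # 0.8 bits per channel use
```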
Continuous Channels and Gaussian Channels
We extend our analysis of information channels to the case of
continuous-valued input and output alphabets, and to the most
important class of continuous channel: the Gaussian channel.
In digital communication systems, noise analysis at the most
basic level requires consideration of continuous-valued random
variables rather than discrete quantities. The Gaussian channel
is therefore the most fundamental model of a communication
channel, and it is used to provide meaningful insights and
theoretical results on the information-carrying capacity of
channels. The BSC and BEC models, on the other hand, can be
considered high-level descriptions of the practical
implementations and operations observed in most digital
communication systems.
When considering the continuous case, our discrete-valued
symbols and discrete probability assignments are replaced by
continuous-valued random variables, X, with associated
probability density functions, $f_X(x)$.
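As a concrete illustration of working with densities instead of pmfs, the sketch below computes the differential entropy $h(X) = -\int f_X(x)\log_2 f_X(x)\,dx$ of a Gaussian random variable by numerical integration and checks it against the closed form $\frac{1}{2}\log_2(2\pi e \sigma^2)$; the value σ = 1.5 is an assumed example.

```python
# Differential entropy of X ~ N(0, sigma^2): numerical integration of
# -f(x) log2 f(x) versus the closed form (1/2) log2(2*pi*e*sigma^2).
import numpy as np

sigma = 1.5                                          # assumed example value
x = np.linspace(-10 * sigma, 10 * sigma, 200001)     # fine grid over +/- 10 sigma
dx = x[1] - x[0]
f = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
h_numeric = float((-f * np.log2(f)).sum() * dx)      # Riemann-sum approximation
h_closed = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)
print(h_numeric, h_closed)                           # both ~2.633 bits
```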
Additive White Gaussian Noise (AWGN) Channels
In the AWGN channel the output is $Y = X + N$, where the noise $N$ is a zero-mean Gaussian random variable independent of the input $X$. For an average input power constraint $P$ and noise variance $\sigma^2$, the capacity is $C = \frac{1}{2}\log_2\left(1 + \frac{P}{\sigma^2}\right)$ bits per channel use.
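A minimal sketch evaluating this capacity formula over a few assumed SNR values:

```python
# AWGN capacity C = (1/2) log2(1 + SNR) in bits per channel use,
# where SNR = P / sigma^2; the dB values below are assumed examples.
import math

def awgn_capacity(snr_linear):
    """Capacity in bits per channel use of the AWGN channel at the given SNR."""
    return 0.5 * math.log2(1 + snr_linear)

for snr_db in (0, 5, 10, 20):
    snr = 10 ** (snr_db / 10)                  # convert dB to linear scale
    print(f"SNR = {snr_db:2d} dB -> C = {awgn_capacity(snr):.3f} bits/channel use")
```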
