
UNIT-2

Channel capacity
Channel capacity in AWGN channels
Channel Coding Theorem

• Source alphabet X, source entropy H(X) bits/source symbol

• Source emits a symbol every Ts seconds

• Average information rate = H(X)/Ts bits/sec

• Channel capacity = C bits per channel use

• Channel is capable of being used once every Tc seconds

• Channel capacity per unit time = maximum rate of information transfer through the channel = C/Tc bits/sec
Channel Coding Theorem

• Let a discrete memoryless source with an alphabet X have entropy H(X) bits/source symbol and produce symbols once every Ts seconds. Let a discrete memoryless channel have a capacity of C bits per channel use and be used once every Tc seconds. Then if

H(X)/Ts ≤ C/Tc

there exists a coding scheme for which the source output can be transmitted over the channel and reconstructed with an arbitrarily small probability of error. The parameter C/Tc is called the critical rate. When the equality sign is satisfied, the system is said to be signaling at the critical rate.
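A minimal numeric sketch of this condition; the entropy, symbol period, and channel parameters below are made-up values for illustration only:

def source_info_rate(H_bits_per_symbol, Ts):
    """Average information rate H(X)/Ts in bits/sec."""
    return H_bits_per_symbol / Ts

def channel_rate(C_bits_per_use, Tc):
    """Channel capacity per unit time C/Tc in bits/sec."""
    return C_bits_per_use / Tc

# Made-up numbers: H(X) = 2 bits/symbol, one symbol every 1 ms;
# C = 0.5 bits/use, one channel use every 0.2 ms.
R_source = source_info_rate(2.0, 1e-3)    # 2000 bits/sec
R_channel = channel_rate(0.5, 0.2e-3)     # 2500 bits/sec

# Channel coding theorem: reliable transmission (arbitrarily small
# error probability) is achievable only if H(X)/Ts <= C/Tc.
print("Reliable transmission achievable:", R_source <= R_channel)  # True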
Channel Coding Theorem

• Conversely, if

H(X)/Ts > C/Tc

it is not possible to transmit information over the channel and reconstruct it with an arbitrarily small probability of error.
Information Capacity Theorem

• Consider band-limited, power-limited Gaussian channels.

• Consider a zero-mean stationary process X(t) that is band-limited to B hertz, sampled at the Nyquist rate of 2B samples per second, and transmitted in T seconds over a noisy channel, also band-limited to B hertz.

• Number of samples: K = 2BT

• Let Xk, k = 1, 2, ..., K, denote samples of the transmitted signal.

• Let the channel output be perturbed by additive white Gaussian noise (AWGN) of zero mean and power spectral density N0/2. The noise is band-limited to B hertz.

• Let the continuous random variables Yk, k = 1, 2, ..., K, denote samples of the received signal:

Yk = Xk + Nk
Information Capacity Theorem
• The noise sample Nk is Gaussian with zero mean and variance σ² = N0B.

• Let the transmitter power be limited: E[Xk²] = P, the average transmitted power.

• Let I(Xk; Yk) denote the mutual information between Xk and Yk. The information capacity of the channel is then defined as

C = max { I(Xk; Yk) : E[Xk²] = P }

where the maximum is taken over all input distributions fX(x).

I(Xk; Yk) = h(Yk) − h(Yk | Xk) = h(Yk) − h(Nk)

since the noise is independent of Xk; the capacity is found by maximizing h(Yk).
Information Capacity Theorem

C = max { I(Xk; Yk) : E[Xk²] = P }

where the maximum is over all input distributions fX(x). The maximum is attained for Xk Gaussian, so

C = I(Xk; Yk) with Xk Gaussian and E[Xk²] = P
Information Capacity Theorem

1. The variance of the received signal sample Yk equals P + σ². Hence, the differential entropy of Yk is

h(Yk) = ½ log2[2πe(P + σ²)]

2. The variance of the noise sample Nk equals σ². Hence, the differential entropy of Nk is

h(Nk) = ½ log2[2πeσ²]
Information Capacity Theorem

C = I(Xk; Yk) with Xk Gaussian and E[Xk²] = P

I(Xk; Yk) = h(Yk) − h(Nk)

C = ½ log2(1 + P/σ²) bits per transmission

Capacity per unit time = (K/T) · C bits/sec. With K/T = 2B samples per second and σ² = N0B,

C = B log2(1 + P/(N0B)) bits per second
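A small numeric sketch of these two forms of the capacity formula; the bandwidth, noise density, and power values are illustrative, not taken from the slides:

import math

def capacity_per_use(P, sigma2):
    """C = (1/2) log2(1 + P/sigma^2), bits per channel use."""
    return 0.5 * math.log2(1.0 + P / sigma2)

def capacity_per_second(B, P, N0):
    """C = B log2(1 + P/(N0 B)), bits per second, for the AWGN channel."""
    return B * math.log2(1.0 + P / (N0 * B))

# Made-up parameters: B = 3 kHz, N0 = 1e-8 W/Hz, P = 1 mW.
B, N0, P = 3e3, 1e-8, 1e-3
sigma2 = N0 * B                       # noise variance per sample
C_use = capacity_per_use(P, sigma2)   # bits per transmission
C_sec = capacity_per_second(B, P, N0)

# Sampling at the Nyquist rate gives 2B transmissions per second,
# so the two expressions are consistent: 2B * C_use == C_sec.
assert math.isclose(2 * B * C_use, C_sec)
print(f"{C_use:.2f} bits/use, {C_sec:.0f} bits/sec")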
Shannon's Information Capacity Theorem

The information capacity of a continuous channel of bandwidth B hertz, perturbed by additive white Gaussian noise of power spectral density N0/2 and limited in bandwidth to B, is given by

C = B log2(1 + P/(N0B)) bits per second

where P is the average transmitted power.

Defines the fundamental limit on the rate of error-free transmission for a power-limited, band-limited Gaussian channel.
Shannon limit …

• Shannon's theorem puts a limit on the transmission data rate, not on the error probability:

– It is theoretically possible to transmit information at any rate Rb ≤ C with an arbitrarily small error probability by using a sufficiently complicated coding scheme.

– For an information rate Rb > C, it is not possible to find a code that can achieve an arbitrarily small error probability.
Shannon limit …

C = B log2(1 + P/(N0B))

Signaling at Rb = C with P = Eb·Rb = Eb·C gives

C = B log2(1 + Eb·C/(N0B))

C/B = log2[1 + (C/B)(Eb/N0)]

Eb/N0 = [2^(C/B) − 1] / (C/B)

As B → ∞, Eb/N0 → ln 2 = 0.693 ≈ −1.6 dB

– There exists a limiting value of Eb/N0 below which there can be no error-free communication at any information rate.

– By increasing the bandwidth alone, the capacity cannot be increased to any desired value.
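A minimal sketch verifying the limit numerically; the spectral efficiency values swept below are chosen for illustration:

import math

def ebno_required_db(eta):
    """Minimum Eb/N0 in dB on the capacity boundary: (2^(C/B) - 1) / (C/B),
    where eta = C/B is the spectral efficiency in bits/s/Hz."""
    return 10 * math.log10((2**eta - 1) / eta)

# As B grows (eta = C/B -> 0), the required Eb/N0 approaches ln 2, about -1.6 dB.
for eta in (4, 2, 1, 0.1, 0.01):
    print(f"C/B = {eta:5}: Eb/N0 >= {ebno_required_db(eta):6.2f} dB")
print(f"limit: 10*log10(ln 2) = {10 * math.log10(math.log(2)):.2f} dB")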
Shannon limit …

[Figure: Rb/B (bits/s/Hz) versus Eb/N0 (dB), showing the practical region below the capacity boundary and the unattainable region above it.]

Shannon limit …

[Figure: B/Rb (Hz/bits/s) versus Eb/N0 (dB), showing the practical and unattainable regions; the Shannon limit is at Eb/N0 = −1.6 dB.]


Bandwidth efficiency plane
[Figure: bandwidth efficiency plane, Rb/B (bits/s/Hz) versus Eb/N0 (dB). The R = C curve (Shannon limit) separates the unattainable region (R > C) from the practical region (R < C). Operating points for MPSK and MQAM (M = 2, 4, 8, 16, 64, 256) lie toward the bandwidth-limited part of the plane, and MFSK points lie toward the power-limited part, all at PB = 10⁻⁵.]
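As a sketch, the R = C boundary of this plane can be regenerated from the relation Eb/N0 = (2^(Rb/B) − 1)/(Rb/B) derived above; the axis ranges and plotting choices below are my own, not taken from the slide:

import numpy as np
import matplotlib.pyplot as plt

# On the R = C boundary, the required Eb/N0 depends only on the
# spectral efficiency eta = Rb/B:  Eb/N0 = (2**eta - 1) / eta.
eta = np.logspace(-2, 4, 500, base=2)          # Rb/B from 0.25 to 16 bits/s/Hz
ebno_db = 10 * np.log10((2**eta - 1) / eta)    # boundary Eb/N0 in dB

plt.semilogy(ebno_db, eta)                                # R = C curve
plt.axvline(10 * np.log10(np.log(2)), linestyle="--")     # Shannon limit, about -1.6 dB
plt.xlabel("Eb/N0 [dB]")
plt.ylabel("Rb/B [bits/s/Hz]")
plt.title("R = C boundary on the bandwidth efficiency plane")
plt.show()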
Why use error correction coding?

– Error performance vs. bandwidth

– Power vs. bandwidth

– Data rate vs. bandwidth

– Capacity vs. bandwidth

[Figure: PB versus Eb/N0 (dB), showing the coded and uncoded error-probability curves with operating points A–F.]

Coding gain: for a given bit-error probability, the reduction in Eb/N0 that can be realized through the use of a code:

G [dB] = (Eb/N0)u [dB] − (Eb/N0)c [dB]
• Example 4.1: Consider a wireless channel where power falloff with distance follows the formula Pr(d) = Pt(d0/d)³ for d0 = 10 m. Assume the channel has bandwidth B = 30 kHz and AWGN with noise power spectral density N0 = 10⁻⁹ W/Hz. For a transmit power of 1 W, find the capacity of this channel for a transmit-receive distance of 100 m and 1 km.

• The received SNR is γ = Pr(d)/(N0B) = (0.1)³/(10⁻⁹ × 30 × 10³) = 33 ≈ 15 dB for d = 100 m, and γ = (0.01)³/(10⁻⁹ × 30 × 10³) = 0.033 ≈ −15 dB for d = 1000 m.

• The corresponding capacities are C = B log2(1 + γ) = 30000 log2(1 + 33) = 152.6 kbps for d = 100 m and C = 30000 log2(1 + 0.033) = 1.4 kbps for d = 1000 m. Note the significant decrease in capacity at farther distances, due to the path loss exponent of 3, which greatly reduces received power as distance increases.
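A short script that reproduces Example 4.1 (the function and variable names are my own):

import math

def awgn_capacity_bps(B_hz, snr_linear):
    """Shannon capacity C = B log2(1 + SNR) in bits per second."""
    return B_hz * math.log2(1.0 + snr_linear)

Pt, d0, alpha = 1.0, 10.0, 3    # transmit power (W), reference distance (m), path-loss exponent
B, N0 = 30e3, 1e-9              # bandwidth (Hz), noise PSD (W/Hz)

for d in (100.0, 1000.0):
    Pr = Pt * (d0 / d) ** alpha    # received power Pr(d) = Pt (d0/d)^3
    snr = Pr / (N0 * B)            # received SNR
    C = awgn_capacity_bps(B, snr)
    print(f"d = {d:.0f} m: SNR = {10 * math.log10(snr):.1f} dB, C = {C / 1e3:.1f} kbps")

# Prints roughly 15.2 dB / 153 kbps at 100 m and -14.8 dB / 1.4 kbps at
# 1000 m; the slide's 152.6 kbps figure uses the SNR rounded down to 33.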
