
Channel Capacity in Wireless Communications

Lie-Liang Yang
Communications Research Group
School of Electronics and Computer Science,
University of Southampton, SO17 1BJ, UK.
Tel: +44 23 8059 3364, Fax: +44 23 8059 4508
Email: lly@ecs.soton.ac.uk
http://www-mobile.ecs.soton.ac.uk/lly

Summary

Gaussian channels with power constraint;
Band-limited Gaussian channels;
Parallel Gaussian channels with average power constraint (water-filling);
Parallel coloured Gaussian channels with average power constraint;
Capacity of time-varying fading channels with channel state side-information;
Capacity of Rayleigh fading channels with diversity and channel state side-information;
Capacity of multiple-input multiple-output channels.

Gaussian Channels: Channel Model

The Gaussian channel is one of the most important channels;
It is a time-discrete, state-continuous channel;
Let X_i, i = 1, \ldots, N be the inputs and Y_i, i = 1, \ldots, N the corresponding outputs. Then Y_i can be expressed as

    Y_i = X_i + N_i,  i = 1, 2, \ldots, N    (1)

Figure 1: The Gaussian channel model (input X_i plus noise N_i gives output Y_i).

where N_i is Gaussian noise that is independently distributed with mean zero and a variance of \sigma^2;
The PDF of N_i is given by

    f_{N_i}(y) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{y^2}{2\sigma^2}\right),  i = 1, 2, \ldots, N    (2)

Gaussian Channels with Power Constraint

In wireless communications the most common constraint on the input signals is the energy or power constraint;
For the Gaussian channel with an average power constraint, let x_1, x_2, \ldots, x_N be the input signal sequence. The average power constraint can be expressed as

    \frac{1}{N} \sum_{n=1}^{N} x_n^2 \le P    or    E[X^2] \le P    (3)

where P represents the average transmission power.
The capacity of the channel is defined as the maximum of the mutual information between the input and the output over all distributions on the input that satisfy the power constraint;
Hence, the capacity of the Gaussian channel of (1) with the average power constraint of (3) can be described by the optimization problem

    C = \max_{p(x): E[X^2] \le P} I(X; Y)    (4)

It can be shown that [1]

    I(X; Y) = h(Y) - h(Y|X) = h(Y) - h(X + N|X) = h(Y) - h(N|X) = h(Y) - h(N)    (5)

where h(X) represents the differential entropy of a continuous random variable X.
In (5), since N is Gaussian distributed, its differential entropy is given by [1]

    h(N) = \frac{1}{2} \log 2\pi e \sigma^2    (6)

where \log \equiv \log_2 is assumed.
Since X and N are independent, we have E[Y^2] = E[X^2] + E[N^2] = P + \sigma^2; hence, the differential entropy of Y is bounded by

    h(Y) \le \frac{1}{2} \log 2\pi e (P + \sigma^2)    (7)

Therefore, the mutual information I(X; Y) satisfies

    I(X; Y) = h(Y) - h(N) \le \frac{1}{2} \log 2\pi e (P + \sigma^2) - \frac{1}{2} \log 2\pi e \sigma^2 = \frac{1}{2} \log\left(1 + \frac{P}{\sigma^2}\right)    (8)

Hence, the capacity of the Gaussian channel with average power constraint is given by

    C = \max_{p(x): E[X^2] \le P} I(X; Y) = \frac{1}{2} \log\left(1 + \frac{P}{\sigma^2}\right) = \frac{1}{2} \log(1 + \gamma)    (9)

where \gamma = P/\sigma^2 represents the signal-to-noise ratio (SNR);
Observations: the capacity of the Gaussian channel with average power constraint is achieved when the input signal X is a Gaussian signal distributed with mean zero and a variance of P.
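As a quick sanity check of (9), the following minimal Python sketch (my own illustration; the helper name awgn_capacity is not from the slides) evaluates the capacity in bits per transmission for a few SNR values:

```python
import numpy as np

def awgn_capacity(snr_linear):
    """Eq. (9): capacity of the power-constrained Gaussian channel,
    in bits per transmission (log base 2)."""
    return 0.5 * np.log2(1.0 + snr_linear)

for snr_db in [0, 10, 20, 30]:
    gamma = 10.0 ** (snr_db / 10.0)        # gamma = P / sigma^2
    print(f"SNR = {snr_db:2d} dB -> C = {awgn_capacity(gamma):.3f} bits/transmission")
```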

Band-Limited Gaussian Channels

Band-limited Gaussian channels (also power-limited) can be represented as

    Y(t) = [X(t) + N(t)] \ast h(t)    (10)

where h(t) denotes the impulse response of an ideal bandpass filter, which cuts off all the frequencies higher than B;
X(t) = {x(t)}: input random process;
Y(t) = {y(t)}: output random process;
N(t) = {n(t)}: white Gaussian noise process with a PSD of N_0/2;
X(t) and N(t) are independent.

Figure 2: The band-limited Gaussian channel model (X(t) plus N(t) passed through the band-pass filter h(t) to give Y(t)).

Since the Gaussian channel is band-limited within [-B, B], according to the Nyquist theorem, the band-limited Gaussian channel of (10) can be equivalently represented by M = 2BT_s independent subchannels, where T_s is the observation time:

    Y_m = X_m + N_m,  m = 1, 2, \ldots, M    (11)

Let

    \boldsymbol{Y} = [Y_1, Y_2, \ldots, Y_M]^T,  \boldsymbol{X} = [X_1, X_2, \ldots, X_M]^T,  \boldsymbol{N} = [N_1, N_2, \ldots, N_M]^T    (12)

Hence, we have

    I(\boldsymbol{X}, \boldsymbol{Y}) \le \sum_{m=1}^{M} I(X_m, Y_m)    (13)

with the equality achieved when X_m for m = 1, 2, \ldots, M are independent random variables.
When X_m for m = 1, 2, \ldots, M are independent random variables, the capacity of the band-limited Gaussian channel can be expressed as

    C = I(\boldsymbol{X}, \boldsymbol{Y}) = M C_0 = 2BT_s C_0    (14)

where C_0 denotes the capacity of a component subchannel, which can be expressed as

    C_0 = I(X_m, Y_m) = \frac{1}{2} \log\left(1 + \frac{P_{X_m}}{\sigma^2}\right)  bits/transmission    (15)

which is achieved when the X_m's are independent Gaussian random variables with mean zero and a common variance P_{X_m}.
In (15), P_{X_m} represents the power per sample, which is given by P_{X_m} = P T_s / (2BT_s) = P/(2B); \sigma^2 denotes the noise variance per sample, which is given by \sigma^2 = \frac{N_0}{2} \cdot 2BT_s / (2BT_s) = N_0/2.
Hence, C_0 in (15) is given by

    C_0 = \frac{1}{2} \log\left(1 + \frac{P/(2B)}{N_0/2}\right) = \frac{1}{2} \log\left(1 + \frac{P}{N_0 B}\right)  bits/transmission    (16)

Upon substituting (16) into (14), we obtain the capacity of the band-limited Gaussian channel, which is given by

    C = M C_0 = B T_s \log\left(1 + \frac{P}{N_0 B}\right)  bits per observation interval of T_s seconds    (17)

Alternatively, the capacity of the band-limited Gaussian channel can be expressed as

    C = M C_0 / T_s = B \log\left(1 + \frac{P}{N_0 B}\right)  bits/second    (18)

Observations:
the Gaussian channel limited within the band [-B, B] is equivalent to M = 2BT_s independent additive Gaussian component channels;
the capacity of the band-limited Gaussian channel is achieved when each of the component channels achieves its capacity.

Band-Limited Gaussian Channels: Properties

When B \to \infty, we have

    \lim_{B \to \infty} C = \lim_{B \to \infty} B \log\left(1 + \frac{P}{N_0 B}\right) = \lim_{B \to \infty} \log\left[\left(1 + \frac{P}{N_0 B}\right)^{\frac{N_0 B}{P}}\right]^{\frac{P}{N_0}} = \frac{P}{N_0} \log e = 1.4427 \frac{P}{N_0}  (bits/second)    (19)

When the bandwidth is very high, the channel capacity increases linearly with the SNR;
1 bit/second is achieved when P/N_0 = 1/1.4427 = 0.6931, which is about -1.6 dB.

Figure 3: Capacity of the band-limited Gaussian channel, normalised by P/N_0, versus bandwidth B (computed from (18)).
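As a quick numerical check of (18) and (19), the short Python sketch below (my own illustration, not from the slides) evaluates the normalised capacity C/(P/N_0) for increasing bandwidth and compares it with the wideband limit log_2 e \approx 1.4427:

```python
import numpy as np

def band_limited_capacity(p_over_n0, bandwidth):
    """Eq. (18): C = B * log2(1 + P/(N0*B)), in bits/second."""
    return bandwidth * np.log2(1.0 + p_over_n0 / bandwidth)

p_over_n0 = 1.0                       # normalise so the limit is log2(e) ~ 1.4427
for B in [0.1, 1.0, 10.0, 100.0, 1000.0]:
    print(f"B = {B:7.1f}  C/(P/N0) = {band_limited_capacity(p_over_n0, B):.4f}")
print(f"wideband limit = {np.log2(np.e):.4f}")
```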

Band-Limited Gaussian Channels

Let us assume that the data rate is R_b = 1/T_b bits/s. Then (18) can be modified as

    \frac{C}{B} = \log\left(1 + \frac{P T_b R_b}{N_0 B}\right) = \log\left(1 + \gamma_b \frac{R_b}{B}\right)  bits/s/Hz    (20)

where \gamma_b = P T_b/N_0 = E_b/N_0 denotes the SNR per bit.
Since R_b = C when the channel capacity is achieved, we have

    \frac{C}{B} = \log\left(1 + \gamma_b \frac{C}{B}\right)  bits/s/Hz    (21)

which shows that the ratio of the channel capacity C to the available bandwidth B is fully determined by the SNR per bit.
Note that, when the bandwidth of a system is given, (21) can be simply expressed as

    C = \log(1 + \gamma)  bits/s/Hz    (22)

where \gamma represents the equivalent SNR.
In this case, C can be referred to as the number of bits per symbol.
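Equation (21) determines the minimum E_b/N_0 needed for a given spectral efficiency: rearranging gives \gamma_b = (2^{C/B} - 1)/(C/B). The small Python sketch below (my own illustration) tabulates this requirement and recovers the -1.59 dB limit as C/B \to 0, matching Figure 4:

```python
import numpy as np

def required_ebn0_db(eta):
    """From Eq. (21), C/B = log2(1 + (Eb/N0)*(C/B)):
    minimum Eb/N0 (in dB) supporting spectral efficiency eta = C/B."""
    return 10.0 * np.log10((2.0 ** eta - 1.0) / eta)

for eta in [0.01, 0.5, 1.0, 2.0, 4.0, 8.0]:
    print(f"C/B = {eta:5.2f} bits/s/Hz -> Eb/N0 >= {required_ebn0_db(eta):6.2f} dB")
# As eta -> 0 the requirement approaches 10*log10(ln 2) ~ -1.59 dB.
```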

Capacity of Band-Limited Gaussian Channel

Figure 4: Capacity of the band-limited Gaussian channel computed from (21), plotted as C/B (bits/s/Hz) versus the SNR per bit (dB); the curve starts from the limiting value of -1.6 dB.

Parallel Gaussian Channels with Average Power-Constraint - Channel Model [1, 2]

M number of independent Gaussian channels in parallel. The input-output relation of the mth channel is denoted as

    Y_m = X_m + N_m,  m = 1, 2, \ldots, M    (23)

where the noise N_m for m = 1, 2, \ldots, M is an independent Gaussian random variable distributed following \mathcal{N}(0, \sigma_m^2);
It is assumed that the total average input power is constrained by

    \sum_{m=1}^{M} E\left[X_m^2\right] \le P    (24)

Parallel Gaussian Channels with Average Power-Constraint

The capacity of the concerned channel can be obtained by solving the optimization problem

    C = \max_{f(x_1, x_2, \ldots, x_M): \sum_{m=1}^{M} E[X_m^2] \le P} I(X_1, X_2, \ldots, X_M; Y_1, Y_2, \ldots, Y_M)    (25)

where f(x_1, x_2, \ldots, x_M) represents the joint PDF of X_1, X_2, \ldots, X_M;
Let us for the moment assume that E[X_m^2] = P_{X_m}, m = 1, \ldots, M. Then we have

    I(X_1, \ldots, X_M; Y_1, \ldots, Y_M) = h(Y_1, \ldots, Y_M) - h(Y_1, \ldots, Y_M | X_1, \ldots, X_M)
                                          = h(Y_1, \ldots, Y_M) - h(N_1, \ldots, N_M)
                                          = h(Y_1, \ldots, Y_M) - \sum_{m=1}^{M} h(N_m)
                                          \le \sum_{m=1}^{M} [h(Y_m) - h(N_m)]
                                          \le \sum_{m=1}^{M} \frac{1}{2} \log\left(1 + \frac{P_{X_m}}{\sigma_m^2}\right)    (26)

Therefore, the capacity of the parallel Gaussian channels with a given set of transmission power arrangements {P_{X_m}} is given by

    C = \sum_{m=1}^{M} \frac{1}{2} \log\left(1 + \frac{P_{X_m}}{\sigma_m^2}\right)    (27)

The above capacity is achieved when X_m is an independent Gaussian random variable distributed according to \mathcal{N}(0, P_{X_m});
Up to this stage, the optimization problem seen in (25) is reduced to a power-allocation problem, which distributes the transmission power among X_1, X_2, \ldots, X_M so that the mutual information subject to the power constraint of (24) is maximized.
The optimization problem can be formed as

    C = \max_{P_{X_1}, P_{X_2}, \ldots, P_{X_M}: \sum_{m=1}^{M} P_{X_m} \le P} \left\{ \sum_{m=1}^{M} \frac{1}{2} \log\left(1 + \frac{P_{X_m}}{\sigma_m^2}\right) \right\}    (28)

This optimization problem can be solved with the aid of the Lagrange multiplier, which can be expressed as

    J(P_{X_1}, P_{X_2}, \ldots, P_{X_M}) = \sum_{m=1}^{M} \frac{1}{2} \log\left(1 + \frac{P_{X_m}}{\sigma_m^2}\right) + \lambda \left( \sum_{m=1}^{M} P_{X_m} - P \right)    (29)

Differentiating (29) with respect to P_{X_m} and setting the result to zero gives

    \frac{1}{2 \ln 2} \cdot \frac{1}{P_{X_m} + \sigma_m^2} + \lambda = 0,  m = 1, 2, \ldots, M    (30)

which can be equivalently expressed as

    P_{X_m} = v - \sigma_m^2,  m = 1, 2, \ldots, M    (31)

With the aid of (24), we can obtain

    \sum_{m=1}^{M} P_{X_m} + \sum_{m=1}^{M} \sigma_m^2 = P + \sum_{m=1}^{M} \sigma_m^2 = Mv,  i.e.,  v = \frac{P + \sum_{m=1}^{M} \sigma_m^2}{M}    (32)

Therefore, the optimum power-allocation scheme is

    P_{X_m} = \frac{P + \sum_{i=1}^{M} \sigma_i^2}{M} - \sigma_m^2,  m = 1, 2, \ldots, M    (33)

and the channel capacity is given by

    C = \sum_{m=1}^{M} \frac{1}{2} \log\left(1 + \frac{P_{X_m}}{\sigma_m^2}\right) = \frac{1}{2} \sum_{m=1}^{M} \log\left(\frac{P + \sum_{i=1}^{M} \sigma_i^2}{M \sigma_m^2}\right)    (34)

From (34), we know that, when the noise samples are iid Gaussian random variables with a common variance \sigma^2, the channel capacity is given by

    C = \frac{M}{2} \log\left(1 + \frac{P/M}{\sigma^2}\right)  (bits/transmission)    (35)

which shows that the capacity is achieved when {X_m} are iid Gaussian random variables with a common variance P/M.
Notice that, when a component channel has a high noise power, the corresponding transmission power allocated according to (33) may be negative, which is impractical;
Instead, the water-filling principle can be used, which allocates the transmission power according to

    P_{X_m} = \left(v - \sigma_m^2\right)^+,  m = 1, 2, \ldots, M    (36)

where (x)^+ = x if x \ge 0 and (x)^+ = 0 if x < 0.
In (36), v is chosen to satisfy

    \sum_{m=1}^{M} \left(v - \sigma_m^2\right)^+ = P    (37)
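The water-filling rule of (36)-(37) is easy to compute numerically. Below is a minimal Python sketch (my own illustration; the water level v is found by bisection, and the function and variable names are not from the slides):

```python
import numpy as np

def water_filling(noise_vars, total_power, tol=1e-10):
    """Allocate P_Xm = (v - sigma_m^2)^+ so that the powers sum to P, Eqs. (36)-(37)."""
    noise_vars = np.asarray(noise_vars, dtype=float)
    lo, hi = noise_vars.min(), noise_vars.max() + total_power   # v lies in this interval
    while hi - lo > tol:                                        # bisection on the water level
        v = 0.5 * (lo + hi)
        if np.maximum(v - noise_vars, 0.0).sum() > total_power:
            hi = v
        else:
            lo = v
    return np.maximum(0.5 * (lo + hi) - noise_vars, 0.0)

noise_vars = [0.5, 1.0, 2.0, 8.0]      # sigma_m^2 of the component channels (assumed values)
P = 4.0
powers = water_filling(noise_vars, P)
C = 0.5 * np.log2(1.0 + powers / np.array(noise_vars)).sum()   # Eq. (27)
print("powers:", np.round(powers, 3), " capacity:", round(C, 3), "bits/transmission")
```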

Parallel Gaussian Channels with Average Power-Constraint: Summary

The capacity of the parallel Gaussian channels with average power-constraint is achieved when the input signals are independent Gaussian signals distributed with mean zero and variances determined by the optimum power-allocation;
The optimum power-allocation can be achieved through the water-filling principle;
The transmitter/receiver require the knowledge of the noise variances in order to achieve the channel capacity.

Parallel Coloured Gaussian Channels with Average Power-Constraint - Channel Model [1]

M number of dependent Gaussian channels in parallel, or in serial with M uses. The input-output relation of the mth channel is denoted as

    Y_m = X_m + N_m,  m = 1, 2, \ldots, M    (38)

It is assumed that the noise vector \boldsymbol{N} = [N_1, \ldots, N_M]^T is Gaussian distributed with mean zero and a covariance matrix \boldsymbol{K}_N;
\boldsymbol{K}_X is the covariance matrix of the input signals \boldsymbol{X} = [X_1, \ldots, X_M]^T;
The power constraint on the input signals can be expressed as

    \sum_{m=1}^{M} E\left[X_m^2\right] \le MP    or    \mathrm{Trace}(\boldsymbol{K}_X) \le MP    (39)

Parallel Coloured Gaussian Channels with Average Power-Constraint

The channel capacity can be obtained by solving the optimization problem:

    C = \max_{p(\boldsymbol{X}): \mathrm{Trace}(\boldsymbol{K}_X) \le MP} I(\boldsymbol{X}; \boldsymbol{Y}) = \max_{p(\boldsymbol{X}): \mathrm{Trace}(\boldsymbol{K}_X) \le MP} \{h(\boldsymbol{Y}) - h(\boldsymbol{Y} | \boldsymbol{X})\} = \max_{p(\boldsymbol{X}): \mathrm{Trace}(\boldsymbol{K}_X) \le MP} \{h(\boldsymbol{Y})\} - h(\boldsymbol{N})    (40)

where \boldsymbol{Y} = [Y_1, \ldots, Y_M]^T;
As shown in (40), in order to maximize the mutual information I(\boldsymbol{X}; \boldsymbol{Y}), we only need to maximize h(\boldsymbol{Y});
In (40), h(\boldsymbol{N}) is given by [1]

    h(\boldsymbol{N}) = \frac{1}{2} \log\left[(2\pi e)^M \det(\boldsymbol{K}_N)\right]    (41)

where \det(\boldsymbol{A}) represents the determinant of the square matrix \boldsymbol{A}.
The entropy h(\boldsymbol{Y}) is maximized provided that \boldsymbol{Y} is normally distributed, which can be achieved when the input \boldsymbol{X} is normally distributed;
When \boldsymbol{X} is normally distributed and remembering that \boldsymbol{X} and \boldsymbol{N} are independent, the covariance matrix of \boldsymbol{Y} is hence given by \boldsymbol{K}_Y = \boldsymbol{K}_X + \boldsymbol{K}_N;
Therefore, the entropy h(\boldsymbol{Y}) is given by

    h(\boldsymbol{Y}) = \frac{1}{2} \log\left[(2\pi e)^M \det(\boldsymbol{K}_X + \boldsymbol{K}_N)\right]    (42)

Now the optimization problem of (40) is equivalent to

    I(\boldsymbol{X}; \boldsymbol{Y}) \propto \max_{p(\boldsymbol{X}): \mathrm{Trace}(\boldsymbol{K}_X) \le MP} \{h(\boldsymbol{Y})\} \propto \max_{p(\boldsymbol{X}): \mathrm{Trace}(\boldsymbol{K}_X) \le MP} \{\det(\boldsymbol{K}_X + \boldsymbol{K}_N)\}    (43)

Let the noise covariance matrix \boldsymbol{K}_N be decomposed as

    \boldsymbol{K}_N = \boldsymbol{U} \boldsymbol{\Lambda} \boldsymbol{U}^T    (44)

where \boldsymbol{\Lambda} = \mathrm{diag}\{\lambda_1, \ldots, \lambda_M\} is a diagonal matrix, while \boldsymbol{U} is an orthonormal matrix satisfying \boldsymbol{U} \boldsymbol{U}^T = \boldsymbol{I};
Then, upon applying (44), we have

    \det(\boldsymbol{K}_X + \boldsymbol{K}_N) = \det\left(\boldsymbol{K}_X + \boldsymbol{U} \boldsymbol{\Lambda} \boldsymbol{U}^T\right)
                                              = \det\left[\boldsymbol{U} \left(\boldsymbol{U}^T \boldsymbol{K}_X \boldsymbol{U} + \boldsymbol{\Lambda}\right) \boldsymbol{U}^T\right]
                                              = \det\left(\boldsymbol{U}^T \boldsymbol{K}_X \boldsymbol{U} + \boldsymbol{\Lambda}\right)
                                              = \det(\boldsymbol{A} + \boldsymbol{\Lambda})    (45)

where \boldsymbol{A} = \boldsymbol{U}^T \boldsymbol{K}_X \boldsymbol{U};
It can be shown that

    \mathrm{Trace}(\boldsymbol{A}) = \mathrm{Trace}(\boldsymbol{K}_X) \le MP    (46)

Upon applying Hadamard's inequality, we obtain

    \det(\boldsymbol{K}_X + \boldsymbol{K}_N) = \det(\boldsymbol{A} + \boldsymbol{\Lambda}) \le \prod_{m=1}^{M} (A_{mm} + \lambda_m)    (47)

with equality iff \boldsymbol{A} is diagonal;
Since \boldsymbol{A} is subject to the constraint shown in (46), the maximum of (47) is achieved when A_{mm}, m = 1, \ldots, M is chosen according to the water-filling principle as

    A_{mm} = (v - \lambda_m)^+,  m = 1, 2, \ldots, M    (48)

where the constant v is chosen to satisfy the constraint of (46).
Finally, when substituting (41), (47) and (48) into (40), we can obtain the channel capacity, which can be expressed as

    C = \frac{1}{2} \log\left[(2\pi e)^M \prod_{m=1}^{M} \left((v - \lambda_m)^+ + \lambda_m\right)\right] - \frac{1}{2} \log\left[(2\pi e)^M \det(\boldsymbol{K}_N)\right]
      = \frac{1}{2} \log\left[\prod_{m=1}^{M} \left((v - \lambda_m)^+ + \lambda_m\right)\right] - \frac{1}{2} \log\left[\prod_{m=1}^{M} \lambda_m\right]
      = \frac{1}{2} \sum_{m=1}^{M} \log\left(\frac{(v - \lambda_m)^+}{\lambda_m} + 1\right)    (49)
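A compact numerical illustration of (44)-(49) follows (my own Python sketch, reusing the bisection idea from the previous example; the noise covariance matrix below is an arbitrary assumption chosen only for the demonstration):

```python
import numpy as np

def coloured_waterfill_capacity(K_N, total_power, tol=1e-10):
    """Capacity of parallel coloured Gaussian channels, Eqs. (44)-(49):
    eigen-decompose K_N and water-fill over its eigenvalues lambda_m."""
    lam, U = np.linalg.eigh(K_N)                      # K_N = U diag(lam) U^T
    lo, hi = lam.min(), lam.max() + total_power
    while hi - lo > tol:                              # bisection on the water level v
        v = 0.5 * (lo + hi)
        if np.maximum(v - lam, 0.0).sum() > total_power:
            hi = v
        else:
            lo = v
    A_mm = np.maximum(0.5 * (lo + hi) - lam, 0.0)     # Eq. (48)
    return 0.5 * np.log2(A_mm / lam + 1.0).sum()      # Eq. (49), log base 2

# Correlated noise across M = 3 channel uses; power budget Trace(K_X) <= MP = 3.
K_N = np.array([[1.0, 0.6, 0.2],
                [0.6, 1.0, 0.6],
                [0.2, 0.6, 1.0]])
print(round(coloured_waterfill_capacity(K_N, 3.0), 3), "bits")
```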

Parallel Coloured Gaussian Channels with Average Power-Constraint: Summary

The capacity of the parallel coloured Gaussian channels with average power-constraint is achieved when
the input signals are normally distributed signals associated with optimum power-allocations,
\boldsymbol{A} = \boldsymbol{U}^T \boldsymbol{K}_X \boldsymbol{U} is a diagonal matrix, where \boldsymbol{U} is an orthonormal matrix satisfying \boldsymbol{K}_N = \boldsymbol{U} \boldsymbol{\Lambda} \boldsymbol{U}^T,
and the optimum power-allocations can be achieved with the aid of the water-filling principle.

Capacity of Fading Channels with Channel Side Information

Figure 5: Gaussian fading channel with side-information. The transmitter consists of an encoder (rate adaptation) and power adaptation; the channel applies the fading h(t) and adds noise; the receiver consists of a decoder and channel estimation, with a feedback channel returning the channel estimates to the transmitter.

Capacity of Fading Channels with Channel Side Information [3]

The channel impulse response (CIR) of the fading channel can be expressed as

    h_i = \alpha_i \exp(j\theta_i)    (50)

where the index i is associated with the transmitted symbols;
h_i, i = 0, 1, 2, \ldots is time-varying and is known to the receiver or to both the transmitter and receiver;
The input-output relation is given by

    Y_i = h_i X_i + N_i,  i = 0, 1, \ldots    (51)

N_i is the Gaussian noise with a density of N_0.

Capacity of Fading Channels with Channel Side Information - Assumptions

h_i, X_i and N_i are independent of each other;
E[|h_i|^2] = 1 and E[|X_i|^2] = S;
The instantaneous received signal-to-noise ratio (SNR) is given by

    \gamma = \frac{S \alpha_i^2}{N_0 B}    (52)

where B denotes the bandwidth of the received signals.
The average SNR is given by

    \bar{\gamma} = \frac{S}{N_0 B}    (53)

Capacity of Fading Channels with Channel Side Information

From our previous slides (see Eq. (22)), it can be shown that, for a given SNR value \gamma, the capacity of the band-limited Gaussian channel can be expressed as

    C(\gamma) = \log(1 + \gamma)  bits/s/Hz    (54)

Let f(\gamma) be the probability density function (PDF) of the instantaneous SNR \gamma. Then, the capacity of the fading channel with transmitter and receiver side information (and constant transmission power) can be evaluated as

    C = \int_0^{\infty} C(\gamma) f(\gamma) d\gamma = \int_0^{\infty} \log(1 + \gamma) f(\gamma) d\gamma    (55)

Since the transmitter employs the knowledge about the CIR, or the knowledge about the instantaneous SNR \gamma, the transmission power can hence be adjusted according to the time-varying channel;
Let S(\gamma) denote the resultant transmission power in correspondence with an SNR value \gamma. Then, as shown in [3], the channel capacity can be obtained by solving the optimization problem:

    C = \max_{S(\gamma)} \int_0^{\infty} \log\left(1 + \frac{S(\gamma)}{S} \gamma\right) f(\gamma) d\gamma    (56)

    subject to  \int_0^{\infty} S(\gamma) f(\gamma) d\gamma \le S    (57)

With the aid of the Lagrange multiplier, it can be readily shown that the optimum transmission power adaptation scheme satisfies

    \frac{S(\gamma)}{S} = \begin{cases} \frac{1}{\gamma_0} - \frac{1}{\gamma}, & \text{if } \gamma \ge \gamma_0 \\ 0, & \text{if } \gamma < \gamma_0 \end{cases}    (58)

where \gamma_0 is a cutoff value, which can be obtained by substituting (58) into (57), yielding

    \int_{\gamma_0}^{\infty} \left(\frac{1}{\gamma_0} - \frac{1}{\gamma}\right) f(\gamma) d\gamma = 1    (59)

Correspondingly, the capacity is given by

    C = \int_{\gamma_0}^{\infty} \log\left(\frac{\gamma}{\gamma_0}\right) f(\gamma) d\gamma  (bits/s/Hz)    (60)

Conclusions:
When the channel is time-varying and both the transmitter and receiver have the channel knowledge, the channel capacity is achieved when the transmission power, transmission data rate and coding scheme are all adapted according to the time-varying channel;
If the channel instantaneous SNR is below the cutoff value \gamma_0, then no data is transmitted over the time-varying channel, in order to save the transmission power.
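For Rayleigh fading, where the instantaneous SNR has the exponential PDF quoted later in (68), the cutoff \gamma_0 of (59) and the capacity (60) can be found numerically. A minimal Python sketch (my own; uses scipy for the integral and the root search, and the function names are not from the slides):

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

def rayleigh_pdf(g, g_avg):
    """Exponential PDF of the instantaneous SNR under Rayleigh fading."""
    return np.exp(-g / g_avg) / g_avg

def cutoff(g_avg):
    """Solve Eq. (59): int_{g0}^inf (1/g0 - 1/g) f(g) dg = 1 for g0 (always 0 < g0 <= 1)."""
    def constraint(g0):
        val, _ = quad(lambda g: (1.0 / g0 - 1.0 / g) * rayleigh_pdf(g, g_avg), g0, np.inf)
        return val - 1.0
    return brentq(constraint, 1e-9, 1.0)

def opra_capacity(g_avg):
    """Eq. (60): C = int_{g0}^inf log2(g/g0) f(g) dg, in bits/s/Hz."""
    g0 = cutoff(g_avg)
    val, _ = quad(lambda g: np.log2(g / g0) * rayleigh_pdf(g, g_avg), g0, np.inf)
    return g0, val

for snr_db in [0, 10, 20]:
    g0, C = opra_capacity(10.0 ** (snr_db / 10.0))
    print(f"average SNR {snr_db:2d} dB: gamma0 = {g0:.3f}, C = {C:.3f} bits/s/Hz")
```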

Capacity of Fading Channels with Channel Side Information

When the channel experiences iid fading and when the CIR is only known to the receiver, the channel capacity is given by

    C = \int_0^{\infty} \log(1 + \gamma) f(\gamma) d\gamma  (bits/s/Hz)    (61)

In this case,
the average capacity is achieved by making use of the time-diversity;
side information at the transmitter side has no capacity benefit.
When the channel experiences correlated fading and the transmitter uses no side information, the achievable channel capacity is then upper-bounded by (61).

Capacity of Fading Channels with Power-Control

Power-control can be implemented in two ways:
Channel inversion, which adjusts the transmission power according to

    \frac{S(\gamma)}{S} = \frac{\sigma}{\gamma}    (62)

where \sigma denotes the constant received SNR;
Truncated channel inversion, which adjusts the transmission power according to

    \frac{S(\gamma)}{S} = \begin{cases} \frac{\sigma}{\gamma}, & \gamma \ge \gamma_0 \\ 0, & \gamma < \gamma_0 \end{cases}    (63)

With power-control, the received SNR is the constant \sigma. Hence, the channel capacity is

    C = \log(1 + \sigma)  (bits/s/Hz)    (64)

where \sigma is given by:
For channel inversion,

    \sigma = \left[\int_0^{\infty} \frac{f(\gamma)}{\gamma} d\gamma\right]^{-1} = \frac{1}{E[1/\gamma]}    (65)

For truncated channel inversion,

    \sigma = \left[\int_{\gamma_0}^{\infty} \frac{f(\gamma)}{\gamma} d\gamma\right]^{-1}    (66)

Observations:
When channel-inversion-assisted power-control is employed, the transmitter encoder and receiver decoder are very simple to implement;
However, there may be a large capacity penalty when the fading is severe. For example, in Rayleigh fading channels the capacity is zero, since in this case deep fades may demand an extremely high transmission power for keeping the received power constant;
When the power-control is based on the truncated channel inversion, the receiver must know when \gamma < \gamma_0. In this case, \gamma_0 in (66) can be chosen to maximize the capacity, i.e., \gamma_0 is chosen according to

    C = \max_{\gamma_0} \{\log(1 + \sigma) P(\gamma > \gamma_0)\}    (67)

where P(\gamma > \gamma_0) is the probability of the event that \gamma > \gamma_0.
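A small Python sketch of (66)-(67) for Rayleigh fading without diversity: here \sigma = \gamma_c/E_1(\gamma_0/\gamma_c) and P(\gamma > \gamma_0) = \exp(-\gamma_0/\gamma_c), which matches the closed form (83) given later. The function names are my own:

```python
import numpy as np
from scipy.special import exp1
from scipy.optimize import minimize_scalar

def truncated_inversion_capacity(g_avg):
    """Maximise Eq. (67) over the cutoff gamma0, with sigma from Eq. (66):
    for an exponential SNR PDF, sigma = g_avg / E1(gamma0 / g_avg)."""
    def neg_rate(g0):
        sigma = g_avg / exp1(g0 / g_avg)
        return -np.log2(1.0 + sigma) * np.exp(-g0 / g_avg)
    res = minimize_scalar(neg_rate, bounds=(1e-6, 10.0 * g_avg), method="bounded")
    return res.x, -res.fun

for snr_db in [0, 10, 20]:
    g0, C = truncated_inversion_capacity(10.0 ** (snr_db / 10.0))
    print(f"average SNR {snr_db:2d} dB: gamma0 = {g0:.3f}, C = {C:.3f} bits/s/Hz")
```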

Capacity of Rayleigh Fading Channels with Diversity [4] - Summary

Three types of adaptive transmission schemes:
Optimum power and rate adaptation;
Optimum rate adaptation only;
Channel inversion only.
Two types of diversity combining schemes:
Maximal ratio combining (MRC);
Selection combining (SC).

Capacity of Rayleigh Fading Channels with Diversity - Assumptions

Correlated Rayleigh fading channels or slow Rayleigh fading channels;
Both the transmitter and receiver employ the ideal channel knowledge;
For each component Rayleigh fading channel, the instantaneous SNR \gamma_l obeys the exponential distribution with the PDF given by

    f(\gamma_l) = \frac{1}{\gamma_c} \exp\left(-\frac{\gamma_l}{\gamma_c}\right),  \gamma_l \ge 0,  l = 1, 2, \ldots, L    (68)

where \gamma_c = E[\gamma_l], which is assumed to be the same for any l.

Capacity of Rayleigh Fading Channels with Diversity

It can be shown that:
When the MRC-assisted diversity combining scheme is employed, the received SNR \gamma obeys the \chi^2 distribution with 2L degrees of freedom. The PDF is given by

    f_M(\gamma) = \frac{\gamma^{L-1}}{(L-1)!\,\gamma_c^L} \exp\left(-\frac{\gamma}{\gamma_c}\right),  \gamma \ge 0    (69)

By contrast, when the SC-assisted diversity combining scheme is employed, the PDF of the received SNR can be expressed as

    f_S(\gamma) = \frac{L}{\gamma_c} \exp\left(-\frac{\gamma}{\gamma_c}\right) \left[1 - \exp\left(-\frac{\gamma}{\gamma_c}\right)\right]^{L-1}    (70)

                = \frac{L}{\gamma_c} \sum_{k=0}^{L-1} (-1)^k \binom{L-1}{k} \exp\left(-\frac{(k+1)\gamma}{\gamma_c}\right),  \gamma \ge 0    (71)

Capacity of Rayleigh Fading Channels with Diversity - Optimum Power/Rate Adaptation

When optimum power and rate adaptation is implemented, the capacity is given by (60) associated with the cutoff \gamma_0 satisfying (59).
When applying the corresponding PDFs in (68) - (71) to (59) and (60), we can obtain the following results;
No diversity [4]:

    C = \log e \, E_1\left(\frac{\gamma_0}{\gamma_c}\right)  (bits/s/Hz),  with  \frac{\gamma_c}{\gamma_0} \exp\left(-\frac{\gamma_0}{\gamma_c}\right) - E_1\left(\frac{\gamma_0}{\gamma_c}\right) = \gamma_c    (72)

where E_n(x) is the exponential integral of order n defined as

    E_n(x) = \int_1^{\infty} t^{-n} e^{-xt} dt,  x \ge 0    (73)

Capacity of Rayleigh Fading Channels with Diversity - Optimum Power/Rate Adaptation

MRC [4]:

    C = \log e \left[ E_1\left(\frac{\gamma_0}{\gamma_c}\right) + \sum_{k=1}^{L-1} \frac{1}{k} P_k\left(\frac{\gamma_0}{\gamma_c}\right) \right]  (bits/s/Hz),  with  \frac{\gamma_c}{\gamma_0} \Gamma\left(L, \frac{\gamma_0}{\gamma_c}\right) - \Gamma\left(L-1, \frac{\gamma_0}{\gamma_c}\right) = (L-1)!\,\gamma_c    (74)

In (74), \Gamma(\alpha, x) is the complementary incomplete gamma function, and P_k(x) is the Poisson distribution, which are defined respectively as

    \Gamma(\alpha, x) = \int_x^{\infty} t^{\alpha-1} e^{-t} dt    (75)

    P_k(x) = e^{-x} \sum_{j=0}^{k-1} \frac{x^j}{j!}    (76)

SC [4]:

    C = L \log e \sum_{k=0}^{L-1} \frac{(-1)^k}{k+1} \binom{L-1}{k} E_1\left(\frac{(k+1)\gamma_0}{\gamma_c}\right)  (bits/s/Hz),  with  L \sum_{k=0}^{L-1} (-1)^k \binom{L-1}{k} \left[ \frac{\gamma_c}{(k+1)\gamma_0} e^{-(k+1)\gamma_0/\gamma_c} - E_1\left(\frac{(k+1)\gamma_0}{\gamma_c}\right) \right] = \gamma_c    (77)
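The closed forms (72) and (74) are straightforward to evaluate numerically. The Python sketch below (my own illustration, using scipy) solves the MRC cutoff relation quoted with (74) by bisection and then evaluates the capacity; L = 1 reproduces the no-diversity result (72):

```python
import numpy as np
from scipy.special import exp1, gammaincc, gamma, factorial
from scipy.optimize import brentq

def upper_gamma(a, x):
    """Complementary incomplete gamma function of Eq. (75); Gamma(0, x) = E1(x)."""
    return exp1(x) if a == 0 else gamma(a) * gammaincc(a, x)

def poisson_sum(k, x):
    """P_k(x) of Eq. (76): exp(-x) * sum_{j=0}^{k-1} x^j / j!."""
    j = np.arange(k)
    return np.exp(-x) * np.sum(x ** j / factorial(j))

def mrc_opra_capacity(L, g_c):
    """Optimum power/rate adaptation with L-branch MRC, Eqs. (74)-(76)."""
    def cutoff_eq(g0):
        x = g0 / g_c
        return (g_c / g0) * upper_gamma(L, x) - upper_gamma(L - 1, x) - factorial(L - 1) * g_c
    g0 = brentq(cutoff_eq, 1e-9, 1.0)        # the optimum cutoff satisfies 0 < gamma0 <= 1
    x = g0 / g_c
    return np.log2(np.e) * (exp1(x) + sum(poisson_sum(k, x) / k for k in range(1, L)))

for L in [1, 2, 4]:
    print(f"L = {L}: C = {mrc_opra_capacity(L, 10.0):.3f} bits/s/Hz (average SNR 10 dB)")
```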

Capacity of Rayleigh Fading Channels with Diversity - Optimum Rate Adaptation Only

When the transmission power is constant and the optimum rate adaptation is employed by the transmitter, the channel capacity can be evaluated by [3]

    C = \int_0^{\infty} \log(1 + \gamma) f(\gamma) d\gamma  (bits/s/Hz)    (78)

When applying the corresponding PDFs in (68) - (71) to (78), we can obtain the following results.
No diversity:

    C = \log e \, \exp\left(\frac{1}{\gamma_c}\right) E_1\left(\frac{1}{\gamma_c}\right)    (79)

MRC:

    C = \log e \, \exp\left(\frac{1}{\gamma_c}\right) \sum_{k=0}^{L-1} \frac{\Gamma(-k, 1/\gamma_c)}{\gamma_c^k}    (80)

SC:

    C = L \log e \sum_{k=0}^{L-1} \frac{(-1)^k}{k+1} \binom{L-1}{k} \exp\left(\frac{k+1}{\gamma_c}\right) E_1\left(\frac{k+1}{\gamma_c}\right)    (81)

Capacity of Rayleigh Fading Channels with Diversity - Truncated Channel Inversion Only

When the power control is based on the truncated channel inversion, the channel capacity has been given by (67) associated with \sigma given by (66). When substituting (66) into (67), we obtain

    C = \max_{\gamma_0} \left\{ \log\left(1 + \left[\int_{\gamma_0}^{\infty} \frac{f(\gamma)}{\gamma} d\gamma\right]^{-1}\right) P(\gamma > \gamma_0) \right\}  (bits/s/Hz)    (82)

No diversity:

    C = \max_{\gamma_0} \log\left(1 + \frac{\gamma_c}{E_1(\gamma_0/\gamma_c)}\right) \exp\left(-\frac{\gamma_0}{\gamma_c}\right)    (83)

MRC:

    C = \max_{\gamma_0} \log\left(1 + \frac{(L-1)!\,\gamma_c}{\Gamma(L-1, \gamma_0/\gamma_c)}\right) \frac{\Gamma(L, \gamma_0/\gamma_c)}{(L-1)!}    (84)

SC:

    C = \max_{\gamma_0} \log\left(1 + \frac{\gamma_c}{L \sum_{k=0}^{L-1} (-1)^k \binom{L-1}{k} E_1\left(\frac{(k+1)\gamma_0}{\gamma_c}\right)}\right) P(\gamma > \gamma_0)    (85)

Capacity of Rayleigh Fading Channels with Diversity - Main Conclusions

In order to achieve the highest channel capacity as shown in (72), (74) and (77) when communicating over time-varying Rayleigh fading channels,
the channel fading information must be known to both the transmitter and the receiver;
the transmitter adapts its transmission power and rate according to the channel state: it allocates higher transmission power and rate when the channel condition is good, reduces the power level and rate when the channel condition becomes worse, and transmits no data when the channel SNR falls below a certain value;
There exists an optimum cutoff value \gamma_0, which results in the highest channel capacity. The optimum cutoff value depends on the average SNR \gamma_c.

The capacity of a Rayleigh fading channel with diversity is always lower than the capacity of a single Gaussian channel. However, as the diversity order increases, the capacity of the Rayleigh fading channel becomes closer to the capacity of a single Gaussian channel;
Optimum power/rate adaptation yields a small increase in capacity over rate adaptation only. This small increase in capacity diminishes as the diversity order increases;
The optimum power/rate adaptation, rate adaptation only and channel inversion only policies tend to reach a common channel capacity when the diversity order increases;
When the diversity order is sufficiently high, adaptive transmission power/rate becomes less important. Constant transmission power and constant transmission rate may be sufficient for reaching the channel capacity.

Capacity of Multiple-Input Multiple-Output Channels - Summary

Capacity expressions;
Typical characteristics;
Conclusions.

MIMO Communications Systems

Figure 6: Schematic diagram of a MIMO wireless system. The data input is fed to a TX processor driving M transmit antennas; the MIMO channel H with gains {h_{11}, h_{21}, \ldots, h_{M1}}, \ldots, {h_{1N}, h_{2N}, \ldots, h_{MN}} connects them to N receive antennas, whose outputs are combined by an RX processor to produce the data output.

MIMO: Received Signal Representation

Let the MIMO channel be represented by an N \times M matrix \boldsymbol{H}. The N \times 1 received signal \boldsymbol{y} can be expressed as

    \boldsymbol{y} = \boldsymbol{H} \boldsymbol{x} + \boldsymbol{n}    (86)

where
\boldsymbol{x}: the M \times 1 transmitted vector, distributed with mean zero and a covariance matrix \boldsymbol{Q};
\boldsymbol{n}: the N \times 1 additive white complex Gaussian noise vector, which is circularly symmetrically distributed with zero mean and covariance matrix \sigma^2 \boldsymbol{I}_N, where \boldsymbol{I}_N is an N \times N identity matrix.

Capacity - Single-Input-Single-Output (SISO) System

For a memoryless SISO system the capacity is given by

    C = \log(1 + \gamma_c |h|^2)  bits/s/Hz    (87)

where
\gamma_c denotes the SNR at the receive antenna;
h is the normalized complex gain of the wireless channel.

Capacity of MIMO Channels

The capacity of a MIMO channel with transmission power constraint can be evaluated as

    C = E_{\boldsymbol{H}} \left[ \max_{\mathrm{Trace}(\boldsymbol{Q}) \le P} I(\boldsymbol{x}; \boldsymbol{y}) \right]    (88)

where E_{\boldsymbol{H}} denotes the expectation with respect to the random channel \boldsymbol{H}.
In (88) the mutual information is given by

    I(\boldsymbol{x}; \boldsymbol{y}) = h(\boldsymbol{y}) - h(\boldsymbol{y} | \boldsymbol{x}) = h(\boldsymbol{y}) - h(\boldsymbol{H}\boldsymbol{x} + \boldsymbol{n} | \boldsymbol{x}) = h(\boldsymbol{y}) - h(\boldsymbol{n} | \boldsymbol{x}) = h(\boldsymbol{y}) - h(\boldsymbol{n})    (89)

Therefore, maximizing I(\boldsymbol{x}; \boldsymbol{y}) in (88) is equivalent to maximizing h(\boldsymbol{y}), since \boldsymbol{n} is Gaussian.
As shown in [1], the differential entropy h(\boldsymbol{y}) is maximized iff \boldsymbol{y} is a circularly symmetric complex Gaussian vector;
Let \boldsymbol{R}_y be the covariance matrix of \boldsymbol{y}. \boldsymbol{R}_y is then expressed as

    \boldsymbol{R}_y = E\left[\boldsymbol{y}\boldsymbol{y}^H\right] = \boldsymbol{H} E\left[\boldsymbol{x}\boldsymbol{x}^H\right] \boldsymbol{H}^H + \sigma^2 \boldsymbol{I}_N = \boldsymbol{H}\boldsymbol{Q}\boldsymbol{H}^H + \sigma^2 \boldsymbol{I}_N    (90)

Therefore, we have

    h(\boldsymbol{y}) = \log\left[(\pi e)^N \det(\boldsymbol{R}_y)\right] = \log\left[(\pi e)^N \det\left(\boldsymbol{H}\boldsymbol{Q}\boldsymbol{H}^H + \sigma^2 \boldsymbol{I}_N\right)\right]    (91)

    h(\boldsymbol{n}) = \log\left[(\pi e)^N \det\left(\sigma^2 \boldsymbol{I}_N\right)\right] = \log\left(\pi e \sigma^2\right)^N    (92)

Upon substituting (91) and (92) into (89), we obtain

    I(\boldsymbol{x}; \boldsymbol{y}) = \log\left[(\pi e)^N \det\left(\boldsymbol{H}\boldsymbol{Q}\boldsymbol{H}^H + \sigma^2 \boldsymbol{I}_N\right)\right] - \log\left(\pi e \sigma^2\right)^N
                                      = \log \det\left(\boldsymbol{I}_N + \frac{1}{\sigma^2} \boldsymbol{H}\boldsymbol{Q}\boldsymbol{H}^H\right)  (bits/s/Hz)    (93)
                                      = \log \det\left(\boldsymbol{I}_M + \frac{1}{\sigma^2} \boldsymbol{H}^H\boldsymbol{H}\boldsymbol{Q}\right)  (bits/s/Hz)    (94)
                                      = \log \det\left(\boldsymbol{I}_M + \frac{1}{\sigma^2} \boldsymbol{Q}\boldsymbol{H}^H\boldsymbol{H}\right)  (bits/s/Hz)    (95)

Capacity - Single-Input-Multiple-Output (SIMO) System

For a memoryless SIMO system, we can set M = 1 in either (94) or (95), yielding the capacity of the SIMO channel, which is given by

    C = \log\left(1 + \gamma_c \sum_{n=1}^{N} |h_n|^2\right)  bits/s/Hz    (96)

where
\gamma_c denotes the average SNR at any receive antenna;
h_n is the normalized complex gain of the nth receive antenna;
N is the number of receive antennas.
The capacity of (96) is achieved when the channel knowledge is known to the receiver.

Capacity - Multiple-Input-Single-Output (MISO) System

For a MISO system, when the transmitter does not have the channel knowledge, the capacity is achieved by equally dividing the transmitted power among all the transmit antennas.
Thus from (93) the capacity is given by

    C = \log\left(1 + \frac{\gamma_c}{M} \sum_{m=1}^{M} |h_m|^2\right)  bits/s/Hz    (97)

where
h_m is the normalized complex gain of the mth transmit antenna;
M is the number of transmit antennas.

Capacity of MIMO Channels

Based on the mutual information expressions of (93) - (95), the capacity (88) of the MIMO channel with power constraint can be achieved by water-filling;
Specifically, let the eigen-decomposition of \boldsymbol{H}^H\boldsymbol{H} be

    \boldsymbol{H}^H\boldsymbol{H} = \boldsymbol{U} \boldsymbol{\Lambda} \boldsymbol{U}^H    (98)

where
the rank of \boldsymbol{H}^H\boldsymbol{H} is r, with r \le \min(M, N); in (98), \boldsymbol{\Lambda} = \mathrm{diag}\{\lambda_1, \lambda_2, \ldots, \lambda_r\} is an (r \times r) diagonal matrix containing the non-zero eigenvalues of \boldsymbol{H}^H\boldsymbol{H};
\boldsymbol{U} is an (M \times r) unitary matrix.

When substituting (98) into (95), the mutual information can be denoted as

    I(\boldsymbol{x}; \boldsymbol{y}) = \log \det\left(\boldsymbol{I}_M + \frac{1}{\sigma^2} \boldsymbol{Q}\boldsymbol{U}\boldsymbol{\Lambda}\boldsymbol{U}^H\right)
                                      = \log \det\left(\boldsymbol{I} + \frac{1}{\sigma^2} \boldsymbol{U}^H\boldsymbol{Q}\boldsymbol{U}\boldsymbol{\Lambda}\right)
                                      = \log \det\left(\boldsymbol{I} + \frac{1}{\sigma^2} \tilde{\boldsymbol{Q}}\boldsymbol{\Lambda}\right)  (bits/s/Hz)    (99)

where by definition \tilde{\boldsymbol{Q}} = \boldsymbol{U}^H\boldsymbol{Q}\boldsymbol{U} = [\tilde{q}_{ij}];
It can be shown that \mathrm{Trace}(\tilde{\boldsymbol{Q}}) = \mathrm{Trace}(\boldsymbol{Q});
The optimization problem in (88) can now be described as

    C = E_{\boldsymbol{H}} \left\{ \max_{\mathrm{Trace}(\tilde{\boldsymbol{Q}}) \le P} \log \det\left(\boldsymbol{I} + \frac{1}{\sigma^2} \tilde{\boldsymbol{Q}}\boldsymbol{\Lambda}\right) \right\}    (100)

When invoking Hadamard's inequality, the mutual information of (99) satisfies

    I(\boldsymbol{x}; \boldsymbol{y}) \le \log\left[\prod_{i=1}^{r} \left(1 + \lambda_i \tilde{q}_{ii}/\sigma^2\right)\right] = \sum_{i=1}^{r} \log\left(1 + \lambda_i \tilde{q}_{ii}/\sigma^2\right)  (bits/s/Hz)    (101)

The equality in (101) is achieved when \tilde{\boldsymbol{Q}} is a diagonal matrix;
Therefore, the capacity of the MIMO channel is given by

    C = \sum_{i=1}^{r} \log\left(1 + \lambda_i \tilde{q}_{ii}/\sigma^2\right)  (bits/s/Hz)    (102)

which is achieved when the channel knowledge is known to both the transmitter and receiver.

Capacity of MIMO Channels

In (102), \tilde{q}_{ii} for i = 1, 2, \ldots, r can be obtained via the water-filling principle, i.e.,

    \tilde{q}_{ii} = \left(v - \sigma^2/\lambda_i\right)^+,  i = 1, 2, \ldots, r    (103)

where v is chosen to satisfy the power constraint, so that

    \sum_{i=1}^{r} \tilde{q}_{ii} = \sum_{i=1}^{r} \left(v - \frac{\sigma^2}{\lambda_i}\right)^+ \le P    (104)

Finally, when substituting (103) into (102), the capacity of the MIMO channel can be expressed as

    C = \sum_{i=1}^{r} \left[\log\left(\frac{v \lambda_i}{\sigma^2}\right)\right]^+  (bits/s/Hz)    (105)

where the (\cdot)^+ operator keeps only the eigenmodes that are allocated non-zero power.

Capacity of MIMO Channels - Receiver Channel Knowledge Only

When the channel knowledge is only known to the receiver, the capacity is achieved by equally dividing the transmitted power among all the transmit antennas;
The capacity of a deterministic MIMO channel \boldsymbol{H} is given by

    C = \log \det\left[\boldsymbol{I}_N + \frac{\gamma}{M} \boldsymbol{H}\boldsymbol{H}^H\right]  bits/s/Hz    (106)

By contrast, the capacity of a random MIMO channel \boldsymbol{H} is given by

    C = E_{\boldsymbol{H}} \left\{ \log \det\left[\boldsymbol{I}_N + \frac{\gamma}{M} \boldsymbol{H}\boldsymbol{H}^H\right] \right\}  bits/s/Hz    (107)

Capacity of MIMO Channels - Properties

Let M = N and \boldsymbol{H}\boldsymbol{H}^H = \boldsymbol{I}_N in (106). Then, (106) can be simplified to

    C = M \log\left(1 + \frac{\gamma}{M}\right)  bits/s/Hz    (108)

and

    \lim_{M \to \infty} M \log\left(1 + \frac{\gamma}{M}\right) = \gamma \log e = \gamma / \ln 2  bits/s/Hz    (109)

which shows that the capacity increases linearly with increasing the SNR \gamma.

Capacity of MIMO Channels - Properties

In (107), for a fixed value of N, when increasing the value of M, we have

    \lim_{M \to \infty} \frac{1}{M} \boldsymbol{H}\boldsymbol{H}^H = \boldsymbol{I}_N    (110)

Hence, we have

    \lim_{M \to \infty} C = \lim_{M \to \infty} E_{\boldsymbol{H}} \left\{ \log \det\left(\boldsymbol{I}_N + \frac{\gamma}{M} \boldsymbol{H}\boldsymbol{H}^H\right) \right\} = \log\left[\det\left(\boldsymbol{I}_N + \gamma \boldsymbol{I}_N\right)\right] = N \log(1 + \gamma)  bits/s/Hz    (111)

which shows that the capacity increases linearly with increasing the number of receive antennas.
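The ergodic capacity (107) has no simple closed form in general, but it is easy to estimate by Monte Carlo averaging over i.i.d. Rayleigh channel matrices, which also illustrates the linear growth with min(M, N). A short Python sketch (my own; assumes unit-variance complex Gaussian entries for H):

```python
import numpy as np

def ergodic_mimo_capacity(M, N, snr, trials=2000, seed=0):
    """Monte Carlo estimate of Eq. (107): E_H[ log2 det(I_N + (snr/M) H H^H) ]."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(trials):
        H = (rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M))) / np.sqrt(2)
        A = np.eye(N) + (snr / M) * (H @ H.conj().T)
        _, logdet = np.linalg.slogdet(A)
        total += logdet / np.log(2.0)            # convert natural log to log base 2
    return total / trials

snr = 10.0 ** (10.0 / 10.0)                       # average SNR of 10 dB
for M, N in [(1, 1), (2, 2), (4, 4), (8, 8)]:
    print(f"M = N = {M}: C ~ {ergodic_mimo_capacity(M, N, snr):.2f} bits/s/Hz")
```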

Capacity of MIMO Channels - Conclusions

If M and N simultaneously become large, the capacity of the MIMO system is seen to grow linearly with min(M, N);
The linearly growing capacity is achieved when communicating over a rich scattering environment providing independent transmission paths from each transmit antenna to each receive antenna;
This characteristic of linearly growing capacity is retained provided that the receiver employs the channel state information, while the transmitter employs either the channel state information or its distribution information;
If only the channel distribution information is available at both the transmitter and the receiver, and if we assume that the channel matrix components remain constant for a coherence interval of T symbol periods (block fading model), then
for a fixed number of antennas, as the length of the coherence interval increases, the capacity approaches the capacity obtained as if the receiver employed the channel state information;
however, the capacity does not increase at all when the number of transmit antennas is increased beyond the length of the coherence interval T.

References

[1] T. M. Cover and J. A. Thomas, Elements of Information Theory, John Wiley & Sons, Inc., 1991.
[2] D. Jiang and Y. Qian, Information Theory and Coding, University of Science and Technology of China, 1992.
[3] A. J. Goldsmith and P. P. Varaiya, "Capacity of fading channels with channel side information," IEEE Trans. on Inform. Theory, vol. 43, no. 6, pp. 1986-1992, Nov. 1997.
[4] M.-S. Alouini and A. J. Goldsmith, "Capacity of Rayleigh fading channels under different adaptive transmission and diversity-combining techniques," IEEE Trans. on Veh. Tech., vol. 48, no. 4, pp. 1165-1181, July 1999.
[5] M. H. M. Costa, "Writing on dirty paper," IEEE Trans. on Information Theory, vol. IT-29, no. 3, pp. 439-441, May 1983.
[6] I. Telatar, "Capacity of multi-antenna Gaussian channels," European Transactions on Telecommunications, vol. 10, no. 6, pp. 585-595, Nov./Dec. 1999.

Thank you!
