Maximum Likelihood Decoding


BASICS

THE EXAMPLE OF AN AWGN RECEIVER

Any set of M real energy signals S = {s_1(t), ..., s_M(t)} defined on [0,T) can be represented as a linear combination of N <= M real orthonormal basis functions {\varphi_1(t), ..., \varphi_N(t)}. We say that these basis functions span the set S. Thus, we can write each s_i(t) in terms of its basis function representation as

s_i(t) = \sum_{j=1}^{N} s_{ij} \varphi_j(t), \quad 0 <= t < T.

We denote the coefficients as a vector \vec{s}_i = (s_{i1}, ..., s_{iN}), which is called the signal constellation point corresponding to the signal s_i(t). The signal constellation consists of all M constellation points \vec{s}_1, ..., \vec{s}_M.
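As a quick numerical illustration (not part of the original text), the sketch below projects a signal onto two assumed orthonormal basis functions on [0, T) to recover its constellation point; the basis choice and the coefficients 3 and -2 are hypothetical.

```python
import numpy as np

T = 1.0
t = np.linspace(0.0, T, 1000, endpoint=False)
dt = t[1] - t[0]

# Two orthonormal basis functions on [0, T): integral of phi_i * phi_j dt = delta_ij
phi1 = np.sqrt(2.0 / T) * np.cos(2.0 * np.pi * t / T)
phi2 = np.sqrt(2.0 / T) * np.sin(2.0 * np.pi * t / T)

# Hypothetical signal s(t) = 3*phi1(t) - 2*phi2(t)
s = 3.0 * phi1 - 2.0 * phi2

# Coefficients s_j = integral s(t) phi_j(t) dt give the constellation point
s_vec = np.array([np.sum(s * phi1) * dt, np.sum(s * phi2) * dt])
print(np.round(s_vec, 3))   # ≈ [ 3. -2.]
```

The projection integral recovers each coefficient because the cross-terms vanish by orthonormality.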

Given the channel output r(t) = s_i(t) + n(t), we now investigate the receiver structure to determine which constellation point s_i or, equivalently, which message m_i, was sent over the time interval [0,T].

We convert the received signal r(t) over each time interval into a vector \vec{r} = (r_1, ..., r_N), with r_j = \int_0^T r(t) \varphi_j(t) \, dt, as this allows us to work in a finite-dimensional vector space to estimate the transmitted signal.
Since n(t) is a Gaussian random process, if we condition on the transmitted signal s_i(t) then the channel output r(t) = s_i(t) + n(t) is also a Gaussian random process, and \vec{r} = (r_1, ..., r_N) is a Gaussian random vector. Writing r_j = s_{ij} + n_j for each component,

E(r_j) = E(s_{ij}) + E(n_j).

E(n_j) = 0 ..... AWGN

E(constant) = the constant itself; therefore E(s_{ij}) = s_{ij}.

This implies E(r_j) = s_{ij}.

Therefore the conditional mean of \vec{r} is the constellation point \vec{s}_i itself.

Also, the noise components n_j are independent.

Moreover, Var(n_j) = E(n_j^2) = N_0/2.
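These two moments can be checked empirically. A minimal sketch, assuming N_0 = 2 and a hypothetical transmitted coefficient s_ij = 1.5, draws many realizations of r_j = s_ij + n_j with n_j ~ N(0, N_0/2):

```python
import numpy as np

rng = np.random.default_rng(0)
N0 = 2.0        # assumed noise spectral density, so Var(n_j) = N0/2 = 1
s_ij = 1.5      # hypothetical transmitted coefficient

# Many realizations of r_j = s_ij + n_j with n_j ~ N(0, N0/2)
n_j = rng.normal(0.0, np.sqrt(N0 / 2.0), size=1_000_000)
r_j = s_ij + n_j

print(round(r_j.mean(), 2), round(r_j.var(), 2))   # ≈ 1.5 and 1.0
```

The sample mean approaches s_ij and the sample variance approaches N_0/2, as the derivation states.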

Therefore the joint probability density function is given by

f(\vec{r} \mid \vec{s}_i) = (\pi N_0)^{-N/2} \exp\left[ -\sum_{j=1}^{N} \frac{(r_j - s_{ij})^2}{N_0} \right].

This function is computed for each value of i = 1, ..., M. It gives the probability of occurrence of \vec{r} when \vec{s}_i is sent; therefore the maximum value identifies the index i for which this probability (i.e. that \vec{r} is received) is largest.

Therefore, if a transmitter sends one of M signals s_i(t), for i = 1, 2, ..., M, the M signals form a constellation in the signaling space.

An ML receiver selects the \vec{s}_j that maximizes the likelihood f(\vec{x} \mid \vec{s}_j), where \vec{x} denotes the received vector:


f(\vec{x} \mid \vec{s}_j) = (\pi N_0)^{-N/2} \exp\left[ -\sum_{i=1}^{N} \frac{(x_i - s_{ji})^2}{N_0} \right]
i=1

Maximizing the likelihood function is equivalent to minimizing the quantity


\hat{\vec{s}}_j = \arg\min_{\vec{s}_j} d^2(\vec{x}, \vec{s}_j) = \arg\min_{\vec{s}_j} \sum_{i=1}^{N} (x_i - s_{ji})^2

In other words, the maximum likelihood receiver picks the signal that is closest to the received signal in the signal space. This distance is called the Euclidean distance.
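This equivalence is easy to verify numerically. A minimal sketch, assuming a toy 4-point constellation and a hypothetical received vector (both invented for illustration), shows that the index maximizing the likelihood equals the index minimizing the squared Euclidean distance:

```python
import numpy as np

N0 = 0.5
# Toy 4-point constellation in a 2-dimensional signal space (assumed values)
S = np.array([[ 1.0,  1.0],
              [ 1.0, -1.0],
              [-1.0,  1.0],
              [-1.0, -1.0]])
x = np.array([0.8, -1.1])        # hypothetical received vector

# Squared Euclidean distances and the corresponding likelihoods
d2 = np.sum((x - S) ** 2, axis=1)
lik = (np.pi * N0) ** (-S.shape[1] / 2) * np.exp(-d2 / N0)

# The argmax of the likelihood equals the argmin of the squared distance
print(int(np.argmax(lik)), int(np.argmin(d2)))   # → 1 1
```

Since the exponential is monotonically decreasing in d^2 and the prefactor does not depend on j, the two decision rules always agree.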

[Figure: signal constellation, with d_min denoting the minimum distance between constellation points]

EXTENDING THIS PROCEDURE TO MIDDLETON BIVARIATE CLASS A NOISE AND BIVARIATE GAUSSIAN STATISTICS
Here we are considering the case of a 2x2 MIMO system; therefore there will be two transmission matrices, X1 and X2.

DERIVATION FOR BIVARIATE MIDDLETON CLASS A:

Y = HX + N ... MIMO system model.

E(Y) = E(HX) + E(N).

As E(HX) = HX and E(N) = 0, this implies E(Y) = HX.

That is, the conditional mean of Y is HX.

This implies Y - E(Y) = Y - HX.

If we replace X with S, where S is the codebook, i.e. all the possible symbols that may have been transmitted, then we take n = Y - HS.

The maximum likelihood decision rule is given by the joint spatial distribution of the noise, evaluated at n = Y - HS, where S = [x(1,...,N), x(1,...,N)].

We have used 4-QAM; therefore the codebook is of size 16 x 2.


The value of the index at which this function is maximized gives the decoded symbol.
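A sketch of this decoder is below. It is not the original implementation: the 2x2 channel matrix H, the Class A parameters (A = 0.1, Gamma = 0.1, sigma^2 = 1), the truncation point of the Class A series, the assumption of independent noise across real and imaginary components, and the noiseless received vector are all assumptions made for illustration.

```python
import numpy as np
from math import factorial

def class_a_pdf(x, A=0.1, Gamma=0.1, sigma2=1.0, M=20):
    # Middleton Class A pdf: a Poisson-weighted mixture of zero-mean Gaussians,
    # with the infinite series truncated at M terms (assumed parameters).
    p = 0.0
    for m in range(M + 1):
        sig2_m = sigma2 * (m / A + Gamma) / (1.0 + Gamma)
        p = p + (np.exp(-A) * A**m / factorial(m)) \
            / np.sqrt(2.0 * np.pi * sig2_m) * np.exp(-np.asarray(x)**2 / (2.0 * sig2_m))
    return p

# 4-QAM symbols per antenna -> 16 x 2 codebook for a 2x2 MIMO system
symbols = np.array([1+1j, 1-1j, -1+1j, -1-1j])
codebook = np.array([[a, b] for a in symbols for b in symbols])   # shape (16, 2)

H = np.array([[0.9+0.1j, 0.2-0.3j],
              [0.1+0.2j, 1.1-0.1j]])    # hypothetical channel matrix

tx_index = 6
Y = H @ codebook[tx_index]              # noiseless reception for this sketch

def log_lik(Y, H, s):
    # Residual n = Y - H s; Class A log-likelihood, treating the real and
    # imaginary parts of each receive antenna as independent (a simplification)
    n = Y - H @ s
    comps = np.concatenate([n.real, n.imag])
    return float(np.sum(np.log(class_a_pdf(comps))))

decoded = max(range(len(codebook)), key=lambda i: log_lik(Y, H, codebook[i]))
print(decoded)   # → 6
```

Because the mixture pdf is strictly decreasing in the magnitude of the residual, the zero-residual (correct) codebook entry maximizes the likelihood here; with noise present, the exhaustive search over all 16 entries remains the same.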

Why not distance computation?

The computational complexity of the ML receiver for Middleton Class A noise is higher than that of its Gaussian counterpart, since the likelihood function given by this equation can no longer be expressed as just a function of the minimum distance.

For multivariate Gaussian noise, on the other hand, we can make use of Mahalanobis distance minimization:

D = \min_{S} \left( (Y - HS)^T K^{-1} (Y - HS) \right)^{1/2}

where N = Y - HS, S is the codebook, and K is the noise covariance matrix.
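A minimal sketch of Mahalanobis distance minimization, assuming a hypothetical noise covariance K, a hypothetical real 2x2 channel H, a +/-1-per-antenna codebook, and noiseless reception for clarity:

```python
import numpy as np

K = np.array([[1.0, 0.4],
              [0.4, 0.8]])      # hypothetical noise covariance matrix
K_inv = np.linalg.inv(K)

H = np.array([[0.9, 0.2],
              [0.1, 1.1]])      # hypothetical 2x2 real channel matrix

# Real +/-1 symbols per antenna -> 4 x 2 codebook
codebook = np.array([[a, b] for a in (-1.0, 1.0) for b in (-1.0, 1.0)])

Y = H @ codebook[2]             # noiseless received vector for this sketch

def mahalanobis(Y, H, s):
    # D = sqrt((Y - HS)^T K^{-1} (Y - HS)) for one codebook entry
    n = Y - H @ s
    return float(np.sqrt(n @ K_inv @ n))

decoded = min(range(len(codebook)), key=lambda i: mahalanobis(Y, H, codebook[i]))
print(decoded)   # → 2
```

When K is a multiple of the identity this reduces to the Euclidean distance rule of the AWGN case; the covariance weighting is what accounts for correlated noise across the receive antennas.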

You might also like