
ADVANCED SIGNAL

PROCESSING TECHNIQUES
FOR WIRELESS
COMMUNICATIONS
Erdal Panayırcı
Electronics Engineering Department
Işık University
OUTLINE
Introduction
Knowledge Gaps in General
The essentials of the EM algorithm
The SAGE algorithm
Some Application Areas
Sequential Monte Carlo Method
(SMC)
Knowledge Gaps in SMC

INTRODUCTION
Future-generation wireless communication
systems are confronted with new
challenges, mainly due to
Hostile channel characteristics
Limited bandwidth
Very high data rates
Advanced Signal Processing techniques
such as
The Expectation-Maximization
algorithm
The SAGE algorithm
The Baum-Welch algorithm
Sequential Monte Carlo Techniques
Kalman filters and their extensions
Hidden Markov modeling
Stochastic approximation algorithms

In combination with
Inexpensive and
Rapid
computational power, these techniques
provide powerful tools to overcome the
limitations of current technologies.

Applications of advanced signal
processing algorithms include, but
are not limited to
Joint/Blind/Adaptive
Sequence (data) detection
Frequency, phase, and timing
synchronization
Equalization
Channel Estimation techniques.

These techniques are employed in
advanced wireless communication
systems such as
OFDM/OFDMA
CDMA
MIMO, Space-time-frequency Coding
Multi-User detection

In particular, the development of
suitable algorithms for wireless
multiple-access systems in
Non-stationary
Interference-rich
environments presents major
challenges.
Optimal solutions to these
problems mostly cannot be
implemented in practice, mainly due
to
high computational complexity
The advanced signal processing tools
mentioned before have provided a
promising route for the design of
low-complexity algorithms with
performance approaching the
theoretical optimum for
Fast, and
Reliable
communication in highly severe and
dynamic wireless environments.
Over the past decade, such methods
have been successfully applied in
several communication problems.
But many technical challenges
remain in emerging applications
whose solutions will provide the
bridge between the theoretical
potential of such techniques and
their practical utility.
The Key Knowledge Gaps
Theoretical performance and
convergence analysis of these
algorithms
Some new efficient algorithms need to
be worked out and developed for some
of the problems mentioned above
The computational complexity of these
algorithms must be handled when they are
applied in on-line implementations
running in digital receivers.
Implementation of these algorithms
based on batch processing or
sequential (adaptive) processing,
depending on how the data are
processed and how the inference is made,
has not been completely worked out for
some of the techniques mentioned
above.

Some classes of algorithms require efficient
generation of random samples from an
arbitrary target probability distribution, known
up to a normalizing constant. So far, two basic
types of algorithms, the
Metropolis algorithm and the Gibbs sampler, have
been widely used in diverse fields. But it is
known that they are substantially complex and
difficult to apply in on-line applications like
wireless communications.
There are gaps in devising new types of more
efficient algorithms that can be effectively
employed in wireless applications.
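As a concrete illustration of sampling from a target known only up to a normalizing constant, here is a minimal random-walk Metropolis sketch; the bimodal target and all numerical settings are toy choices of mine, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(5)

# Target density known only up to a normalizing constant:
# an equal mixture of N(-2, 1) and N(+2, 1) (a toy choice)
p_unnorm = lambda x: np.exp(-0.5 * (x - 2) ** 2) + np.exp(-0.5 * (x + 2) ** 2)

# Random-walk Metropolis: propose a local move, accept it with
# probability min(1, p(x') / p(x)); the unknown constant cancels
x, samples = 0.0, []
for _ in range(20000):
    x_prop = x + rng.normal(0, 2.0)
    if rng.random() < p_unnorm(x_prop) / p_unnorm(x):
        x = x_prop
    samples.append(x)

samples = np.array(samples[2000:])  # discard burn-in
print(samples.mean(), samples.std())  # mean ≈ 0, std ≈ sqrt(5) ≈ 2.24
```

Note that the acceptance ratio uses only the unnormalized density, which is exactly the setting described above; it is this chain-based, iterative structure that makes such samplers hard to use on-line.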

THE EM ALGORITHM
The EM algorithm was popularized in 1977
An iterative algorithm for obtaining ML
parameter estimates
Not really an algorithm, but a procedure
The same problem can have different EM
formulations
Based on definition of complete and
incomplete data

Main References
L. E. Baum, T. Petrie, G. Soules and N. Weiss, A
Maximization Technique Occurring in the Statistical
Analysis of Probabilistic Functions of Markov Chains,
Annals of Mathematical Statistics, Vol. 41, pp. 164-171, 1970.
A. P. Dempster, N. M. Laird and D. B. Rubin,
Maximum Likelihood from Incomplete Data via the
EM Algorithm, Journal of the Royal Statistical Society,
Series B, Vol. 39, pp. 1-38, 1977.
C. F. J. Wu, On the Convergence Properties of the EM
Algorithm, Annals of Statistics, Vol. 11, pp. 95-103,
1983.
The Essential EM
Algorithm
Consider estimating the parameter vector s from the data
y (incomplete data):

  y = F(s, z) + n

where s holds the parameters to be estimated and z the
random parameters. Then, the ML estimate of s is:

  ŝ_ML = arg max_{s ∈ C} p(y | s) = arg max_{s ∈ C} E_z[ p(y | s, z) ]

Thus, obtaining ML estimates
may require:
An Expectation
Often analytically
intractable
A Maximization
Computationally
intensive
The EM Iteration
Define the complete data x with a many-to-one
mapping x → y(x), having conditional density f(x | s).
The EM iteration at the i-th step:
E-step:

  Q(s | s^(i)) = E[ log f(x | s) | y, s^(i) ]

M-step:

  s^(i+1) = arg max_{s ∈ C} Q(s | s^(i))
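To make the E- and M-steps concrete, here is a minimal, self-contained sketch (my own toy example, not from the lecture) that runs EM on a two-component Gaussian mixture, where the hidden component labels play the role of the missing part of the complete data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic incomplete data y: equal mixture of N(-2, 1) and N(+3, 1)
y = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 1, 500)])

mu = np.array([-1.0, 1.0])   # initial guesses for the two means
for i in range(50):
    # E-step: posterior probability (responsibility) of each component,
    # i.e. the conditional expectation of the labels given y and mu^(i),
    # with equal priors and unit variances
    d = np.exp(-0.5 * (y[:, None] - mu[None, :]) ** 2)
    r = d / d.sum(axis=1, keepdims=True)
    # M-step: maximize Q(mu | mu^(i)) -> weighted sample means
    mu = (r * y[:, None]).sum(axis=0) / r.sum(axis=0)

print(mu)  # close to the true means (-2, 3)
```

Each pass through the loop is one full E-step/M-step iteration; the estimated means converge toward the values used to generate the data.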
Convergence Properties
At each iteration the likelihood function
is monotonically non-decreasing
If the likelihood function is bounded,
then the algorithm converges
Under some conditions, the limit point
coincides with the ML estimate

EM Algorithm Extensions
J. A. Fessler and A. O. Hero, Complete-data spaces
and generalized EM algorithms, 1993 IEEE
International Conference on Acoustics, Speech, and
Signal Processing (ICASSP-93), Vol. 4, pp. 1-4, 1993.
J. A. Fessler and A. O. Hero, Space-alternating
generalized EM algorithm, IEEE Transactions on
Signal Processing, October 1994.
The SAGE Algorithm
The SAGE algorithm is an extension of
the EM algorithm
It provides much faster convergence
than EM
The algorithm alternates among several hidden
data spaces rather than using just one
complete data space, and
Updates only a subset of the parameter
elements in each iteration
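The update pattern, though not the full algorithm with its separate hidden-data spaces, can be illustrated with a toy Gaussian-mixture fit in which only one parameter subset (here, a single component mean) is re-maximized per iteration; this is a hedged sketch of my own, not the Fessler-Hero construction:

```python
import numpy as np

rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 1, 500)])

mu = np.array([-1.0, 1.0])
for i in range(50):
    # E-step as in plain EM: responsibilities under the current parameters
    d = np.exp(-0.5 * (y[:, None] - mu[None, :]) ** 2)
    r = d / d.sum(axis=1, keepdims=True)
    # SAGE-style M-step: update only ONE parameter subset per iteration
    # (a single component mean), alternating between the two subsets
    k = i % 2
    mu[k] = (r[:, k] * y).sum() / r[:, k].sum()

print(mu)  # again close to the true means (-2, 3)
```

Because each partial update re-maximizes against freshly computed responsibilities, the alternating scheme can make progress faster per unit of work than a full joint M-step.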
Some Application Areas
Positron-Emission-Tomography (PET)
Genetics
Neural Networks
Radar Imaging
Image / Speech processing
Communications
Channel Estimation / Equalization
Multiuser detection
Sequence estimation
Interference rejection
SEQUENTIAL MONTE
CARLO TECHNIQUE (SMC)

Emerged in the field of statistics,
J. S. Liu and R. Chen, Sequential Monte
Carlo Methods for Dynamic Systems, J.
American Stat. Assoc., Vol. 93, pp. 1032-
1044, 1998.

Recently, SMC has been successfully
applied to several problems in
wireless communications, such as,
Blind equalization
Detection/decoding in fading channels
It is based on approximating
the expectation operation by means of
sequentially generated Monte Carlo
samples of either the unknown state
variables or the system parameters.
Main Advantages
SMC is self adaptive and no
training/pilot symbols or decision
feedback are needed
Tracking of the fading channel and
estimation of the data sequence are
naturally integrated
Channel noise can be either Gaussian
or non-Gaussian
It is suitable for MAP receiver design
If the system employs channel coding,
the coded signal structure can be easily
exploited to improve the accuracy of
both channel and data estimation
SMC is suitable for high-speed parallel
implementation using VLSI
Does not require iterations as in the EM
algorithm
Updating with new data can be done
more efficiently
SMC Method
Let θ denote the parameter vector of interest.
Let X_t = (x_1, x_2, ..., x_t) denote the complete data,
so that p(θ | X_t) is assumed to be simple.
X_t is partially observed.
It can be partitioned as X_t = (Y_t, S_t), where

  Y_t = (y_1, y_2, ..., y_t)

denotes the observed part, and

  S_t = (s_1, s_2, ..., s_t)

denotes the incomplete data, i.e. the
unobservable or missing data.
Example
1. Fading channel

  y_t = Σ_{i=1}^{L} h_i s_{t-i} + n_t,   t = 1, 2, ...

Problem: Jointly estimate the data signal
s_t, t = 1, 2, ..., and the unknown channel
parameters {h_i, i = 1, 2, ..., L}.
2. Joint Phase Offset and SNR Estimation

  y_t = s_t e^{jφ} + n_t,   t = 1, 2, ...

where φ is the unknown phase offset and σ² is the
unknown noise variance, and

  S_t = (s_1, s_2, ..., s_t)

is the data to be transmitted.
Problem: Estimate θ = (φ, σ²), based on the
complete data X_t = (Y_t, S_t), where
Y_t = (y_1, y_2, ..., y_t) is the observed part and
S_t = (s_1, s_2, ..., s_t) the incomplete data.
MAP SOLUTION USING
SMC METHOD
The MAP solution for the unknown parameter
vector θ is

  θ̂_MAP = E[θ | Y_t] = ∫ θ p(θ | Y_t) dθ

where p(θ | Y_t) can be computed by
means of the incomplete-data sequence as

  p(θ | Y_t) = ∫_{S_t} p(θ | Y_t, S_t) p(S_t | Y_t) dS_t

Substituting this in the above, we have

  θ̂_MAP = ∫_{S_t} E[θ | Y_t, S_t] p(S_t | Y_t) dS_t
To implement SMC, we need to draw m
independent samples (Monte Carlo samples)
{S_t^(j)}_{j=1}^m from the conditional distribution

  p(S_t | Y_t) = p(s_1, s_2, ..., s_t | y_1, y_2, ..., y_t)

Usually, directly drawing samples from this
distribution is difficult.
But drawing samples from some trial
distribution q(S_t | Y_t) is easy.
In this case, we can use the idea of
importance sampling as follows:
Suppose a set of samples S_t^(j), j = 1, 2, ..., m, is drawn
from the trial distribution q(S_t | Y_t). By associating the
weight

  w_t^(j) = p(S_t^(j) | Y_t) / q(S_t^(j) | Y_t)

with the sample S_t^(j), we can estimate θ̂_MAP as follows:

  θ̂_MAP = E[θ | Y_t] ≈ (1/W_t) Σ_{j=1}^m E[θ | Y_t, S_t^(j)] w_t^(j)

where

  W_t = Σ_{j=1}^m w_t^(j)

The pair (S_t^(j), w_t^(j)), j = 1, 2, ..., m, is called
a properly weighted sample with respect to the
distribution p(S_t | Y_t).
By properly choosing the trial distribution q(·), the
weighted samples (S_t^(j), w_t^(j))
can be generated sequentially. That is, suppose
a set of properly weighted samples

  {(S_{t-1}^(j), w_{t-1}^(j))}_{j=1}^m

at time t-1 is given. Then the SMC algorithm generates
from this set a new one,

  {(S_t^(j), w_t^(j))}_{j=1}^m

at time t.
In summary, the SMC algorithm is given as
follows, for j = 1, 2, ..., m:
1. Draw a sample s_t^(j) from the trial distribution
q(·) and let S_t^(j) = (S_{t-1}^(j), s_t^(j)).
2. Compute the importance weight w_t^(j) from
w_{t-1}^(j) sequentially.
3. Compute the MAP estimate

  θ̂_MAP ≈ (1/W_t) Σ_{j=1}^m E[θ | Y_t, S_t^(j)] w_t^(j)
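The three steps can be sketched as a sequential importance sampler for a toy scalar state-space model (my own illustrative model, with the state-transition prior as the trial distribution q, so the weight update reduces to a likelihood factor):

```python
import numpy as np

rng = np.random.default_rng(3)
m, T = 500, 10  # number of particles, number of time steps

# Toy state-space model (my own choice, not from the lecture):
# s_t = 0.9 s_{t-1} + u_t,  y_t = s_t + n_t,  u_t, n_t ~ N(0, 0.5^2)
s_true = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    s_true[t] = 0.9 * s_true[t - 1] + rng.normal(0, 0.5)
    y[t] = s_true[t] + rng.normal(0, 0.5)

s = np.zeros(m)          # current particle values s_t^(j)
w = np.ones(m) / m       # importance weights w_t^(j)
for t in range(1, T):
    # Step 1: draw s_t^(j) from the trial distribution q(.)
    # (here q = the state-transition prior)
    s = 0.9 * s + rng.normal(0, 0.5, m)
    # Step 2: update w_t^(j) from w_{t-1}^(j) sequentially;
    # with the prior as trial, the factor is the likelihood p(y_t | s_t)
    w *= np.exp(-0.5 * ((y[t] - s) / 0.5) ** 2)
    w /= w.sum()

# Step 3: weighted estimate of the current state (here W_t = 1
# because the weights were normalized at each step)
print((w * s).sum(), s_true[-1])
```

Here the estimated quantity is the state itself rather than a static parameter vector θ, but the draw/reweight/estimate cycle is the same.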
KNOWLEDGE GAPS IN
SMC
Choosing the effective sample size m
(empirically, usually 20 < m < 100; m ≈ 50).
The sampling weights {w_t^(j)} measure the
quality of the corresponding drawn data
sequences S_t^(j)
Small weights imply that these samples do
not really represent the distribution from
which they are drawn and contribute little
to the final estimate
A resampling procedure was developed for this. It
needs to be improved for different
applications
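To illustrate the degeneracy measure and the resampling step, here is a small sketch; the weight values are made-up toy numbers, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(4)

# A deliberately degenerate weight set: two samples carry
# nearly all of the probability mass
w = np.array([0.46, 0.46, 0.02, 0.02, 0.02, 0.02])
w /= w.sum()

# Effective sample size 1 / sum_j (w^(j))^2, a common degeneracy measure
ess = 1.0 / np.sum(w ** 2)
print(ess)  # ≈ 2.35: only ~2 of the 6 samples really contribute

# Multinomial resampling: draw indices in proportion to the weights
# (duplicating heavy samples, dropping light ones), then reset the
# weights to uniform
idx = rng.choice(len(w), size=len(w), p=w)
w_new = np.full(len(w), 1.0 / len(w))
print(idx, w_new)
```

Resampling rejuvenates the sample set but adds Monte Carlo noise of its own, which is one reason the procedure still needs improvement for different applications.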
Delay Estimation Problem:
Since the fading process is highly
correlated, the future received signals
contain information about the current
data and channel state.
A delayed estimate is therefore more
efficient and promising than the
concurrent estimate summarized above.
In delay estimation:
Instead of making the inference on (S_t, θ)
with the posterior density p(θ, S_t | Y_t), we
delay this inference to a later time t + Δ,
using the distribution p(θ, S_t | Y_{t+Δ}).
Note: Such a delayed estimation method
does not increase the computational cost, but it
requires some extra memory.
Knowledge Gap: Develop computationally
efficient delayed-sample estimation
techniques, which will find applications in
channels with strong memory (ISI channels).

Turbo Coding Applications
Because SMC is soft-input, soft-output
in nature, the resulting algorithms are capable
of exchanging extrinsic information with the
outer MAP channel decoder and
successively improving the overall receiver
performance. Therefore, a blind MAP decoder
for turbo receivers can be worked out.
