Advanced Digital Signal Processing: Linear Prediction and Optimum Linear Filters
Lecture 6
Linear Prediction and Optimum Linear Filters
Stationary Processes
• A random process is stationary if its statistics do not vary with time; for a
wide-sense stationary process the mean is constant and the autocorrelation
depends only on the lag k:
• E[x(m)] = μx
• E[x(m)x(m + k)] = rxx(k)
Non-Stationary Processes
• A random process is non-stationary if its distributions or statistics
vary with time.
• Most stochastic processes such as video signals, audio signals,
financial data, meteorological data, biomedical signals, etc., are
nonstationary, because they are generated by systems whose
environments and parameters vary over time.
Adaptive Filters
Adaptive filters work on the principle of minimizing the mean squared
difference (or error) between the filter output and a target (or desired)
signal.
Adaptive filters are used in applications that involve a combination of
three broad signal processing problems:
1. De-noising and channel equalization – filtering a time-varying noisy
signal to remove the effect of noise and channel distortions;
2. Trajectory estimation – tracking and prediction of the trajectory of a
nonstationary signal or parameter observed in noise;
3. System identification – adaptive estimation of the parameters of a
time-varying system from a related observation.
Adaptive linear filters work on the principle that the desired signal or
parameters can be extracted from the input through a filtering or
estimation operation. The adaptation of the filter parameters is based on
minimizing the mean squared error between the filter output and a target
(or desired) signal. The use of the LSE criterion is equivalent to the
principle of orthogonality, in which at any discrete time m the estimator is
expected to use all the available information such that any estimation error
at time m is orthogonal to all the information available up to time m.
An adaptive filter can be a combination of the following types of filters:
single-input or multi-input filters;
linear or nonlinear filters;
finite impulse response (FIR) or infinite impulse response (IIR) filters.
Applications of adaptive filters include
Multichannel noise reduction,
Radar/sonar signal processing,
Channel equalization for cellular mobile phones,
Echo cancellation and low-delay speech coding.
Adaptive Filters
• For stationary inputs, the resulting solution is commonly known as the Wiener
filter, which is said to be optimum in the mean-square sense.
• A plot of the mean-square value of the error signal vs. the adjustable parameters
of a linear filter is referred to as the error-performance surface.
• The minimum point of this surface represents the Wiener solution.
• The Wiener filter is inadequate for dealing with situations in which non-
stationarity of the signal and/or noise is intrinsic to the problem.
• A highly successful solution to this more difficult problem is found in the
Kalman filter, a powerful device with a wide variety of engineering
applications.
Filter structures: transversal filter, lattice predictor
Least-squares algorithms:
• Least-squares estimation
• Recursive least-squares (RLS) estimation
• Standard RLS algorithm
• Square-root RLS algorithms
• Fast RLS algorithms
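The standard exponentially weighted RLS update can be sketched as follows (a minimal illustration, not the square-root or fast variants; the filter length, forgetting factor and initialisation constant are arbitrary choices for the example):

```python
import numpy as np

def rls(y, d, M=4, lam=0.99, delta=100.0):
    """Standard exponentially weighted RLS.

    y: input signal, d: desired (target) signal, M: filter length,
    lam: forgetting factor, delta: scale of the initial inverse
    correlation matrix estimate P(0) = delta * I.
    """
    w = np.zeros(M)                      # filter coefficients
    P = delta * np.eye(M)                # inverse correlation matrix estimate
    e = np.zeros(len(y))
    for n in range(M - 1, len(y)):
        u = y[n - M + 1:n + 1][::-1]     # regressor: y[n], y[n-1], ..., y[n-M+1]
        k = P @ u / (lam + u @ P @ u)    # gain vector
        e[n] = d[n] - w @ u              # a priori estimation error
        w = w + k * e[n]                 # coefficient update
        P = (P - np.outer(k, u @ P)) / lam
    return w, e
```

As a sanity check, identifying a short FIR system from noiseless data should recover its impulse response almost exactly after a few hundred samples.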
Wiener Filters
• The error signal e(m), over N samples of the input signal y(m) and the
desired signal x(m), is the difference between the desired signal and the
filter output:
e(m) = x(m) - w^T y(m),   m = 0, 1, ..., N-1
where y(m) is the vector of the most recent input samples and w is the
filter coefficient vector.
• The Wiener filter coefficients are obtained by minimising the mean
square error function E[e²(m)] with respect to the filter coefficient
vector w.
• The gradient of the mean square error function with respect to the filter
coefficient vector is given by
∂E[e²(m)]/∂w = -2 r_yx + 2 R_yy w
• The minimum mean square error Wiener filter is obtained by setting this
gradient to zero:
w = R_yy^-1 r_yx
• The calculation of the Wiener filter coefficients therefore
requires the autocorrelation matrix R_yy of the input signal
and the cross-correlation vector r_yx of the input and the
desired signals.
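The closed-form solution can be illustrated by estimating the autocorrelation matrix and cross-correlation vector from data and solving the resulting linear system (a minimal sketch; the AR(1) test signal, filter length and noise level are arbitrary choices for the example):

```python
import numpy as np

def wiener_coeffs(y, x, M):
    """Estimate the M-tap Wiener filter w = Ryy^-1 ryx from data.

    y: input (e.g. noisy) signal, x: desired signal.
    """
    N = len(y)
    # biased autocorrelation estimates r_yy(k) -> Toeplitz matrix Ryy
    r_yy = np.array([np.dot(y[:N - k], y[k:]) / N for k in range(M)])
    Ryy = np.array([[r_yy[abs(i - j)] for j in range(M)] for i in range(M)])
    # cross-correlation r_yx(k) = E[x(m) y(m-k)]
    r_yx = np.array([np.dot(x[k:], y[:N - k]) / N for k in range(M)])
    return np.linalg.solve(Ryy, r_yx)

# Denoising example: x is an AR(1) signal, y a noisy observation of it.
rng = np.random.default_rng(1)
N, M = 20_000, 8
x = np.zeros(N)
for n in range(1, N):
    x[n] = 0.9 * x[n - 1] + rng.standard_normal()
y = x + 0.5 * rng.standard_normal(N)

w = wiener_coeffs(y, x, M)
e = x - np.convolve(y, w)[:N]    # estimation error
# orthogonality principle: the error should be (near) uncorrelated with the input
print(max(abs(np.dot(e[k:], y[:N - k]) / N) for k in range(M)))
```

The printed value is the largest sample cross-correlation between the error and the filter input over the filter's lags; it shrinks towards zero as N grows, illustrating the orthogonality principle stated earlier.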
• A recursive time update of the filter coefficients, with an adaptation
leak factor α, has the form
w(m + 1) = α w(m) + μ[y(m)e(m)]
• Note that this feedback equation for the time update of the filter
coefficients is essentially a recursive (infinite impulse response)
system with input μ[y(m)e(m)] and its poles at α.
The LMS Filter
• The LMS filter replaces the gradient of the mean square error with an
instantaneous estimate based on the current sample. Substituting this
estimate into the recursive update equation of the filter parameters
yields the LMS adaptation equation
w(m + 1) = w(m) + μ[y(m)e(m)]
The LMS Filter
• When the parameter α<1, the effect is to introduce more stability and
accelerate the filter adaptation to the changes in input signal
characteristics.
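The adaptation equation, including the leak factor α, can be sketched as follows (a minimal illustration; α = 1 gives the standard LMS update, and the system-identification set-up, filter length and step size are arbitrary choices for the example):

```python
import numpy as np

def lms(y, x, M=3, mu=0.01, alpha=1.0):
    """(Leaky) LMS: w(m+1) = alpha*w(m) + mu*y(m)*e(m); alpha = 1 is standard LMS."""
    w = np.zeros(M)
    e = np.zeros(len(y))
    for m in range(M - 1, len(y)):
        u = y[m - M + 1:m + 1][::-1]    # y(m), y(m-1), ..., y(m-M+1)
        e[m] = x[m] - w @ u             # error between target and filter output
        w = alpha * w + mu * e[m] * u   # adaptation step
    return w, e

# System identification: adapt towards an unknown FIR response h.
rng = np.random.default_rng(0)
y = rng.standard_normal(5000)
h = np.array([1.0, -0.5, 0.25])
x = np.convolve(y, h)[:len(y)]
w, e = lms(y, x)
```

With noiseless data and a matched filter length, the coefficient vector converges to h and the error decays towards zero.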
Stability of LMS
• The LMS algorithm is convergent in the mean square if and
only if the step-size parameter satisfies
0 < μ < 2/λmax
• Here λmax is the largest eigenvalue of the correlation matrix of
the input data.
• A more practical test for stability is
0 < μ < 2/(input signal power)
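Interpreting "input signal power" as the total power across the M filter taps, the practical bound follows from λmax ≤ trace(R) = M·r(0), so it is always the more conservative of the two. A numerical sketch (the AR(1) signal model and filter length are arbitrary choices for the example):

```python
import numpy as np

# Compare the exact mean-square stability bound 2/lambda_max with the
# conservative trace-based bound 2/(M * r(0)).
rng = np.random.default_rng(0)
N, M = 100_000, 8
y = np.zeros(N)
for n in range(1, N):
    y[n] = 0.8 * y[n - 1] + rng.standard_normal()   # correlated (AR(1)) input

r = np.array([np.dot(y[:N - k], y[k:]) / N for k in range(M)])
R = np.array([[r[abs(i - j)] for j in range(M)] for i in range(M)])  # correlation matrix
lam_max = np.linalg.eigvalsh(R).max()

mu_exact = 2.0 / lam_max          # 0 < mu < 2/lambda_max
mu_practical = 2.0 / (M * r[0])   # lambda_max <= trace(R) = M * r(0)
print(mu_practical, mu_exact)     # the practical bound is the smaller of the two
```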
Applications of Adaptive Filters: System Identification
• Used to provide a linear model of an unknown plant
• Applications:
• System identification
• Layered earth modeling
Applications of Adaptive Filters: Inverse Modeling
• Used to provide an inverse model of an unknown plant
• Applications:
Predictive deconvolution
Adaptive equalization
Blind equalization
Applications of Adaptive Filters: Prediction
• Used to provide a prediction of the present value of a random signal
• Applications:
Linear predictive coding
Adaptive differential pulse-code modulation
Autoregressive spectrum analysis
Signal detection
Applications of Adaptive Filters: Interference Cancellation
• Applications:
Adaptive noise cancellation (echo / noise cancellation: e.g. hands-free car
phones, aircraft headphones)
Echo cancellation
Adaptive beamforming
Example:
Acoustic Echo Cancellation
Linear prediction
Linear prediction (LP) model predicts the future value of a signal
from a linear combination of its past values.
LP models are used in diverse applications, such as:
Data forecasting,
Speech coding,
Video coding,
Speech recognition,
Model-based spectral analysis,
Model-based signal interpolation,
Signal restoration,
Noise reduction,
Impulse detection and change detection.
In terms of usefulness and application as a signal processing tool, linear
prediction models rank almost alongside the Fourier transform. Indeed,
modern digital mobile phones employ voice coders based on linear
prediction modelling of speech for efficient coding.
There are two main motivations for the use of predictors in signal
processing applications:
1. To predict the trajectory of a signal;
2. to remove the predictable part of a signal in order to avoid
retransmitting parts of a signal that can be predicted at the receiver
and thereby save storage, bandwidth, time and power.
The accuracy with which a signal can be predicted from its past samples
depends on:
1. The autocorrelation function, or
2. Equivalently, the bandwidth and the power spectrum of the signal.
Linear prediction models are extensively used in speech processing applications such as
in low bit-rate speech coders, speech enhancement and speech recognition. Speech is
generated by inhaling air and then exhaling it through the glottis and the vocal tract. The
noise-like air flow from the lungs is modulated and shaped by the vibrations of the glottal
cords and the resonance of the vocal tract. Figure 2 illustrates a source-filter model of
speech. The source models the lungs and emits a random input excitation signal that is
filtered by a pitch filter. The pitch filter models the vibrations of the glottal cords and
generates a sequence of quasi-periodic excitation pulses for voiced sounds.
A Prediction Problem
Forward & Backward Prediction
• The one-step forward predictor estimates x[n] from its M past samples,
x̂[n] = Σ_{k=1}^{M} w[k] x[n-k],
and the forward prediction error is e_f[n] = x[n] - x̂[n], with cost
function J = E{e_f²[n]}.
• Differentiating the cost with respect to each coefficient gives
∂J/∂w_i = -2E{x[n]x[n-i]} + 2Σ_{k=1}^{M} w[k] E{x[n-k]x[n-i]}
• On substituting the corresponding correlation
sequences we have
∂J/∂w_i = -2r_xx[i] + 2Σ_{k=1}^{M} w[k] r_xx[i-k]
• Setting the derivative to zero yields
Σ_{k=1}^{M} w[k] r_xx[i-k] = r_xx[i],   i = 1, 2, 3, ..., M
• These are the Normal Equations, or Wiener-Hopf,
or Yule-Walker equations structured for the one-
step forward predictor.
• In terms of the prediction-error filter coefficients a_M[0] = 1,
a_M[k] = -w[k], the augmented normal equations are
Σ_{m=0}^{M} a_M[m] r_xx[m-k] = { E_M,  k = 0
                                { 0,    k = 1, 2, ..., M
where E_M is the minimum forward prediction error power.
• The corresponding forward prediction-error filter is
A_f(z) = 1 + a_M[1]z^-1 + a_M[2]z^-2 + ... + a_M[M]z^-M
Professor A G Constantinides©
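The forward-predictor normal equations can be solved directly; for an autoregressive input the predictor coefficients should recover the AR parameters (a minimal sketch; the AR(2) model, its coefficients and the orders are arbitrary choices for the example):

```python
import numpy as np

def forward_predictor(x, M):
    """Solve sum_k w[k] r_xx[i-k] = r_xx[i], i = 1..M (Yule-Walker)."""
    N = len(x)
    r = np.array([np.dot(x[:N - k], x[k:]) / N for k in range(M + 1)])
    R = np.array([[r[abs(i - j)] for j in range(M)] for i in range(M)])
    w = np.linalg.solve(R, r[1:M + 1])
    e_min = r[0] - w @ r[1:M + 1]    # minimum forward prediction error power
    return w, e_min

# AR(2) test signal: x[n] = 1.2 x[n-1] - 0.6 x[n-2] + unit-variance white noise
rng = np.random.default_rng(3)
N = 50_000
x = np.zeros(N)
for n in range(2, N):
    x[n] = 1.2 * x[n - 1] - 0.6 * x[n - 2] + rng.standard_normal()

w, e_min = forward_predictor(x, M=2)
print(w, e_min)   # w close to [1.2, -0.6]; e_min close to the noise variance 1
```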
Backward Prediction Problem
• In a similar manner for the backward prediction case we
write
e_b[n] = x[n-M] - x̂[n-M]
• And
x̂[n-M] = Σ_{k=1}^{M} w̃[k] x[n-k+1]
• Hence the normal equations for the backward case can be
written as
Σ_{m=1}^{M} w̃[m] r_xx[m-k] = r_xx[M-k+1],   k = 1, 2, 3, ..., M
• This can be slightly adjusted as
Σ_{m=1}^{M} w̃[M-m+1] r_xx[k-m] = r_xx[k],   k = 1, 2, 3, ..., M
m1