
Linear Systems and Stochastic Processes

The autocorrelation of a complex process $\{x[k]\}$ is

$$r_x(m) = E\{x[k](x[k+m])^*\} = [r_{x_r}(m) + r_{x_i}(m)] + j[-r_{x_r x_i}(m) + r_{x_i x_r}(m)]$$

where $x_r$ and $x_i$ denote the real and imaginary parts of $x$.

The autocovariance of a complex process $\{x[k]\}$ is

$$c_x(m) = E\{(x[k] - \bar{x})(x[k+m] - \bar{x})^*\} = r_x(m) - \bar{x}\,\bar{x}^* = r_x(m) - |\bar{x}|^2$$

The cross-correlation of complex $\{x[k]\}$ and $\{y[k]\}$ is

$$r_{xy}(m) = E\{x[k](y[k+m])^*\}$$

and the cross-covariance is

$$c_{xy}(m) = E\{(x[k] - \bar{x})(y[k+m] - \bar{y})^*\} = r_{xy}(m) - \bar{x}\,\bar{y}^*$$

where:
$m$ is the lag and $\bar{x}$, $\bar{y}$ are the (constant) process means.
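As a quick numerical check of these definitions, the following sketch estimates $r_x(m)$ and $c_x(m)$ from a single realisation. It assumes numpy; the noisy complex exponential is a hypothetical test signal, not one used in the text.

```python
import numpy as np

def autocorr(x, max_lag):
    """Biased estimate of r_x(m) = E{x[k] (x[k+m])*} for m = 0..max_lag.

    Note the convention above conjugates the *later* sample, x[k+m];
    some libraries conjugate the earlier one, which flips the sign of m.
    """
    x = np.asarray(x)
    N = len(x)
    return np.array([np.sum(x[:N - m] * np.conj(x[m:])) / N
                     for m in range(max_lag + 1)])

def autocov(x, max_lag):
    """Estimate of c_x(m) = r_x(m) - |mean|^2 for a WSS process."""
    x = np.asarray(x)
    return autocorr(x, max_lag) - np.abs(x.mean()) ** 2

# Hypothetical test signal: complex exponential plus complex white noise.
rng = np.random.default_rng(0)
k = np.arange(10_000)
x = np.exp(1j * 0.3 * k) + 0.5 * (rng.standard_normal(k.size)
                                  + 1j * rng.standard_normal(k.size))
print(autocorr(x, 2))   # r_x(0) ≈ 1.5 (signal power 1 + noise power 0.5)
print(autocov(x, 2))    # ≈ autocorr here, since the mean is ≈ 0
```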

The autocorrelation function of a complex wide-sense stationary process is used extensively, so it deserves closer attention. The autocorrelation between the random variables $x[k]$ and $x[i]$ depends only on the difference, $k - i$, separating the two random variables in time, i.e.,

$$r_x(k, i) = r_x(k - i, 0) \equiv r_x(k - i)$$

The difference, $m = k - i$, is called the lag. The autocorrelation sequence of a wide-sense stationary process is a conjugate symmetric function of the lag, $r_x(m) = r_x^*(-m)$. This property follows directly from the definition, as Equation 2.89 shows,

$$r_x(m) = E\{x[k]\,x^*[k+m]\} = \big(E\{x[k+m]\,x^*[k]\}\big)^* = r_x^*(-m) \qquad (2.89)$$

where:
m is the lag.

2.11.9 Autocovariance and Autocorrelation Matrices


The autocovariance and autocorrelation sequences are important second-order moments of discrete-time random processes that are often represented in matrix form. For a $(p+1)$-dimensional vector $\mathbf{x} = [x[0], x[1], \ldots, x[p]]^T$ drawn from a wide-sense stationary process $\{x[n]\}$, the outer product is a $(p+1) \times (p+1)$ matrix defined by Equation 2.90.

$$\mathbf{x}\mathbf{x}^H = \begin{bmatrix} x[0]x^*[0] & x[0]x^*[1] & \cdots & x[0]x^*[p] \\ x[1]x^*[0] & x[1]x^*[1] & \cdots & x[1]x^*[p] \\ \vdots & \vdots & \ddots & \vdots \\ x[p]x^*[0] & x[p]x^*[1] & \cdots & x[p]x^*[p] \end{bmatrix} \qquad (2.90)$$

The expectation of this matrix is the $(p+1) \times (p+1)$ autocorrelation matrix, $\mathbf{R}_x$, as defined by Equation 2.91.

$$\mathbf{R}_x = E\{\mathbf{x}\mathbf{x}^H\} = \begin{bmatrix} r_x(0) & r_x^*(1) & \cdots & r_x^*(p) \\ r_x(1) & r_x(0) & \cdots & r_x^*(p-1) \\ \vdots & \vdots & \ddots & \vdots \\ r_x(p) & r_x(p-1) & \cdots & r_x(0) \end{bmatrix} \qquad (2.91)$$

where:
$r_x(m) = r_x^*(-m)$, according to Hermitian (conjugate) symmetry.
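Because $\mathbf{R}_x$ is fully determined by the values $r_x(0), \ldots, r_x(p)$, it can be built directly as a Hermitian Toeplitz matrix. The sketch below assumes numpy and scipy; the autocorrelation values are hypothetical.

```python
import numpy as np
from scipy.linalg import toeplitz

# Hypothetical autocorrelation values r_x(0), r_x(1), r_x(2) of a complex
# WSS process (r_x(0) must be real).
r = np.array([1.5, 1.0 - 0.3j, 0.4 - 0.4j])

# Equation 2.91: first column is r_x(0..p), first row is r_x*(0..p).
R = toeplitz(r, np.conj(r))

print(np.allclose(R, R.conj().T))    # True: Hermitian
print(np.allclose(np.diag(R), r[0])) # True: diagonal real and equal
```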

Similarly, the expectation of the outer product of the vector $\mathbf{x}$ minus the mean vector of the process, i.e., $(\mathbf{x} - \bar{\mathbf{x}})$, produces the autocovariance matrix, $\mathbf{C}_x$, which for a zero mean process is equal to the autocorrelation matrix. The autocovariance matrix is defined by Equation 2.92.

$$\mathbf{C}_x = E\{(\mathbf{x} - \bar{\mathbf{x}})(\mathbf{x} - \bar{\mathbf{x}})^H\} = \mathbf{R}_x - \bar{\mathbf{x}}\bar{\mathbf{x}}^H \qquad (2.92)$$

The autocorrelation matrix of a wide-sense stationary process is a Hermitian matrix with all the diagonal values real and equal. For a real-valued random process it is a symmetric Toeplitz matrix.
A wide-sense stationary Gaussian process with covariance $c_x(m)$ is referred to as autocorrelation ergodic if

$$\lim_{N \to \infty} \frac{1}{N} \sum_{m=0}^{N-1} c_x^2(m) = 0$$

In most applications it is not practical to determine whether a given process is ergodic. Therefore, time averages are often simply used to estimate the ensemble averages, and the validity of the assumption is tested by the performance of the algorithm that requires the information.
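A minimal illustration of this practice, assuming numpy and a hypothetical process whose ensemble mean is known by construction:

```python
import numpy as np

# Hypothetical WSS process: constant mean 2.0 plus white Gaussian noise.
rng = np.random.default_rng(1)
x = 2.0 + rng.standard_normal(100_000)   # one long realisation

# Under the ergodicity assumption the time average stands in for the
# (unavailable) ensemble mean E{x[n]} = 2.0.
print(x.mean())                          # ≈ 2.0
```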

2.11.10 Spectrum of a Random Process


The power Spectrum of a discrete-time wide-sense stationary random process,
{x[n]}, is the Fourier transform of its autocorrelation sequence rx (k ) as defined by
Equation 2.93.
∞ ∞
Px (e jθ ) = ¦ rx (k )e − jkθ Px ( z ) = ¦ rx (k ) z −k (2.93)
k = −∞ k = −∞

where:
$$\theta = \omega T = \frac{2\pi f}{F_s}$$

The autocorrelation sequence may be computed by taking the inverse Fourier transform of $P_x(e^{j\theta})$, as defined by Equation 2.94.

$$r_x(k) = \frac{1}{2\pi}\int_{-\pi}^{+\pi} P_x(e^{j\theta})\,e^{jk\theta}\,d\theta \qquad (2.94)$$

The power spectrum of a wide-sense stationary process $x[n]$ is nonnegative and real-valued, i.e., $P_x(e^{j\theta}) = P_x^*(e^{j\theta})$, and $P_x(z)$ satisfies the symmetry condition $P_x(z) = P_x^*(1/z^*)$. If $x[n]$ is real then the power spectrum is also even, i.e., $P_x(e^{j\theta}) = P_x(e^{-j\theta})$, which implies that $P_x(z) = P_x^*(z^*)$. The total power in a zero mean wide-sense stationary process is proportional to the area under the power spectral curve, as defined by Equation 2.95.

$$E\{|x[n]|^2\} = \frac{1}{2\pi}\int_{-\pi}^{+\pi} P_x(e^{j\theta})\,d\theta \qquad (2.95)$$
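The sketch below evaluates Equations 2.93 and 2.95 numerically for a hypothetical finite autocorrelation sequence, $r_x(0) = 1.25$, $r_x(\pm 1) = 0.5$ (that of an MA(1) process with coefficient 0.5), assuming numpy.

```python
import numpy as np

# Hypothetical autocorrelation sequence: r_x(0)=1.25, r_x(±1)=0.5.
lags = np.array([-1, 0, 1])
r = np.array([0.5, 1.25, 0.5])

theta = np.linspace(-np.pi, np.pi, 4096, endpoint=False)
# Equation 2.93: P_x(e^{jθ}) = Σ_k r_x(k) e^{-jkθ}  (here 1.25 + cos θ)
P = np.real(np.sum(r[:, None] * np.exp(-1j * lags[:, None] * theta), axis=0))

print(P.min() >= 0)     # True: nonnegative and real-valued
# Equation 2.95: (1/2π) ∫ P_x dθ equals r_x(0); on a uniform grid over one
# full period this integral is just the average of the samples.
print(P.mean())         # ≈ 1.25 = r_x(0)
```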

The eigenvalues of the $n \times n$ autocorrelation matrix of a zero mean wide-sense stationary random process are upper and lower bounded by the maximum and minimum values of the power spectrum, as defined by Equation 2.96.

$$\min_\theta P_x(e^{j\theta}) \le \lambda_i \le \max_\theta P_x(e^{j\theta}) \qquad (2.96)$$
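A numerical check of these bounds for the same hypothetical MA(1) autocorrelation sequence, assuming numpy and scipy:

```python
import numpy as np
from scipy.linalg import toeplitz

# Autocorrelation matrix for r_x(0)=1.25, r_x(1)=0.5, r_x(k)=0 otherwise.
p = 6
col = np.zeros(p + 1)
col[0], col[1] = 1.25, 0.5
R = toeplitz(col)                        # real symmetric Toeplitz R_x

lam = np.linalg.eigvalsh(R)
theta = np.linspace(-np.pi, np.pi, 4096)
P = 1.25 + np.cos(theta)                 # P_x(e^{jθ}) for this sequence

# Equation 2.96: min P ≤ λ_i ≤ max P  (here 0.25 ≤ λ_i ≤ 2.25)
print(P.min() <= lam.min(), lam.max() <= P.max())   # True True
```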

The power spectrum can also be seen as the expected value of the squared Fourier magnitude, $P_N(e^{j\theta})$, in the limit as $N \to \infty$, computed over $2N+1$ samples of a given realisation of the random process, as in Equation 2.97.

$$P_x(e^{j\theta}) = \lim_{N \to \infty} E\{P_N(e^{j\theta})\} = \lim_{N \to \infty} E\left\{\frac{1}{2N+1}\left|\sum_{n=-N}^{N} x(n)\,e^{-jn\theta}\right|^2\right\} = \sum_{k=-\infty}^{\infty} r_x(k)\,e^{-jk\theta} \qquad (2.97)$$

where:
$$P_N(e^{j\theta}) = \frac{1}{2N+1}\left|\sum_{n=-N}^{N} x(n)\,e^{-jn\theta}\right|^2$$
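In practice the expectation in Equation 2.97 can be approximated by averaging periodograms over many realisations. The sketch below does this for a hypothetical MA(1) process $x[n] = w[n] + 0.5\,w[n-1]$ with unit-variance white $w[n]$, whose true spectrum is $1.25 + \cos\theta$; it assumes numpy and uses a standard $N$-point periodogram rather than the symmetric $2N+1$-sample form above.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 512
theta = 2 * np.pi * np.fft.fftfreq(N)        # DFT bin frequencies

periodograms = []
for _ in range(500):
    w = rng.standard_normal(N + 1)
    x = w[1:] + 0.5 * w[:-1]                 # one realisation of the process
    X = np.fft.fft(x)
    periodograms.append(np.abs(X) ** 2 / N)  # P_N(e^{jθ}) at the DFT bins

P_est = np.mean(periodograms, axis=0)
err = np.abs(P_est - (1.25 + np.cos(theta)))
print(err.max())    # small, and shrinks as more realisations are averaged
```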

If the power spectrum $P_x(e^{j\theta})$ of a wide-sense stationary process is a continuous function of $\theta$, then $P_x(z)$ may be factored into a product of a form known as the "spectral factorisation" of $P_x(z)$, as defined by Equation 2.98.

$$P_x(z) = \sigma_0^2\,Q(z)\,Q^*\!\left(\frac{1}{z^*}\right) \qquad (2.98)$$

where:
$$\sigma_0^2 = \exp\left\{\frac{1}{2\pi}\int_{-\pi}^{\pi} \ln P_x(e^{j\theta})\,d\theta\right\}$$

For a real-valued process the spectral factorisation is defined by Equation 2.99.

$$P_x(z) = \sigma_0^2\,Q(z)\,Q(z^{-1}) \qquad (2.99)$$

Any process that can be factored in this way is called a regular process and has the
following properties,

1. The process x[n] can be realised as the output of a causal and stable filter $H(z)$ that is driven by white noise having a variance of $\sigma_0^2$.

2. If the process x[n] is filtered with the inverse filter $\frac{1}{H(z)}$, the output $v[n]$ is white noise having a variance of $\sigma_0^2$. In this case $\frac{1}{H(z)}$ is known as a whitening filter (see the sketch after this list).

3. Since v[n] and x[n] are related by an invertible transformation, they both contain the same information and may be derived from each other.
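A minimal sketch of properties 1 and 2, assuming numpy and scipy, with a hypothetical stable shaping filter $H(z) = 1/(1 - 0.9z^{-1})$:

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(3)
v = rng.standard_normal(200_000)          # white noise, variance σ0² = 1

x = lfilter([1.0], [1.0, -0.9], v)        # property 1: x[n] = H(z) driving v
v_hat = lfilter([1.0, -0.9], [1.0], x)    # property 2: whitening with 1/H(z)

# v_hat should again be white with variance σ0².
r0 = np.mean(v_hat * v_hat)               # lag-0 autocorrelation ≈ 1.0
r1 = np.mean(v_hat[:-1] * v_hat[1:])      # lag-1 autocorrelation ≈ 0.0
print(r0, r1)
```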

For the special case where $P_x(z) = \frac{N(z)}{D(z)}$ is a rational function, the spectral factorisation of $P_x(z)$ takes the form

$$P_x(z) = \sigma_0^2\,Q(z)\,Q^*\!\left(\frac{1}{z^*}\right) = \sigma_0^2\left[\frac{B(z)}{A(z)}\right]\left[\frac{B^*(1/z^*)}{A^*(1/z^*)}\right]$$

where:
$\sigma_0^2$ is a constant.
$B(z) = 1 + b[1]z^{-1} + \ldots + b[q]z^{-q}$ is a monic polynomial having all its roots inside the unit circle.
$A(z) = 1 + a[1]z^{-1} + \ldots + a[p]z^{-p}$ is a monic polynomial having all its roots inside the unit circle.
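The sketch below carries out this factorisation for the hypothetical MA(1) spectrum used earlier, $P_x(z) = 0.5z + 1.25 + 0.5z^{-1}$ (so $A(z) = 1$), assuming numpy. Multiplying by $z$ gives an ordinary polynomial whose roots come in reciprocal pairs; the minimum-phase factor $B(z)$ keeps the root inside the unit circle.

```python
import numpy as np

coeffs = [0.5, 1.25, 0.5]            # z * P_x(z) = 0.5 z^2 + 1.25 z + 0.5
roots = np.roots(coeffs)             # -0.5 and -2.0, a reciprocal pair
inside = roots[np.abs(roots) < 1]    # keep the root inside the unit circle
b = np.poly(inside)                  # monic B(z) = 1 + 0.5 z^{-1}

# Match the z^{+q} coefficient: r_x(q) = σ0² b[q], since B(z) is monic.
sigma0_sq = coeffs[0] / b[-1]        # 0.5 / 0.5 = 1.0
print(b, sigma0_sq)                  # [1.  0.5] and 1.0
# Check: σ0² B(z)B(z^{-1}) gives r_x(0) = σ0²(1 + 0.25) = 1.25, as required.
```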

2.11.11 Filtering of Random Processes


The output y[n] of a stable LSI filter, h[n], driven by a wide-sense stationary process x[n], is defined by Equation 2.100.

$$y[n] = x[n] * h[n] = \sum_{k=-\infty}^{\infty} h[k]\,x[n-k] \qquad (2.100)$$

The mean of y[n] is defined by Equation 2.101.

$$E\{y[n]\} = E\left\{\sum_{k=-\infty}^{\infty} h[k]\,x[n-k]\right\} = \sum_{k=-\infty}^{\infty} h[k]\,E\{x[n-k]\} = \bar{x}\sum_{k=-\infty}^{\infty} h[k] = \bar{x}\,H(e^{j0}) \qquad (2.101)$$

The cross-correlation between y[n] and x[n], $r_{yx}(n+k, n)$, depends only on the difference between $n+k$ and $n$, and is defined by Equation 2.102.

$$r_{yx}(k) = E\{y[n+k]\,x^*[n]\} = r_x(k) * h[k] \qquad (2.102)$$

where:
$k$ is the difference (lag) between $n+k$ and $n$.

The autocorrelation of y[n] is defined by Equation 2.103.

$$r_y(k) = r_{yx}(k) * h^*[-k] = \sum_{l=-\infty}^{\infty}\sum_{m=-\infty}^{\infty} h[l]\,r_x(m-l+k)\,h^*[m] = r_x(k) * h[k] * h^*[-k] \qquad (2.103)$$

Defining the filter's deterministic autocorrelation as

$$r_h(k) = h[k] * h^*[-k] = \sum_{n=-\infty}^{\infty} h[n]\,h^*[n-k]$$

it therefore follows that

$$r_y(k) = r_x(k) * r_h(k)$$

The variance of the output process is defined by Equation 2.104.

$$E\{|y[n]|^2\} = \sigma_y^2 = r_y(0) = \sum_{l=-\infty}^{\infty}\sum_{m=-\infty}^{\infty} h[l]\,r_x(m-l)\,h^*[m] \qquad (2.104)$$

In the special case when h[n] is finite with a length of N, the variance, or power, of y[n] may be expressed in terms of the autocorrelation matrix $\mathbf{R}_x$ of x[n] and the vector of filter coefficients $\mathbf{h}$, as defined by Equation 2.105.

$$\sigma_y^2 = E\{|y[n]|^2\} = \mathbf{h}^H \mathbf{R}_x \mathbf{h} \qquad (2.105)$$
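A quick numerical check of Equation 2.105, assuming numpy and scipy, with a hypothetical length-3 FIR filter and unit-variance white input (for which $\mathbf{R}_x = \mathbf{I}$):

```python
import numpy as np
from scipy.signal import lfilter

h = np.array([1.0, -0.6, 0.2])       # hypothetical FIR coefficients
Rx = np.eye(3)                       # white input: r_x(0)=1, r_x(k)=0 else

var_theory = h @ Rx @ h              # h^H R_x h = 1 + 0.36 + 0.04 = 1.4
print(var_theory)

rng = np.random.default_rng(4)
x = rng.standard_normal(500_000)
y = lfilter(h, [1.0], x)
print(y.var())                       # ≈ 1.4, agreeing with Equation 2.105
```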

The power spectra of x[n] and y[n] are related by taking the Fourier transform of Equation 2.103, giving

$$P_y(e^{j\theta}) = P_x(e^{j\theta})\,\big|H(e^{j\theta})\big|^2$$
