Mean Square Calculus


CONTINUITY AND DIFFERENTIATION OF RANDOM PROCESSES

 We know that the dynamic behavior of a system is described by differential
equations involving the input to and output of the system. For example, the
behavior of an RC network is described by a first-order linear differential
equation with the source voltage as the input and the capacitor voltage as the
output. What happens if the input voltage to the network is a random process?

 Each realization of the random process is a deterministic function, and the
concepts of differentiation and integration can be applied to it. Our aim is to
extend these concepts to the ensemble of realizations and thus apply calculus to
random processes.

We discussed the convergence and the limit of a sequence of random variables. The
continuity of a random process can be defined with the help of the convergence and
limits of sequences of random variables arising from the process. We can define
almost-sure continuity, mean-square continuity, continuity in probability, etc.,
according to the mode of convergence considered. Here, we shall discuss the concept
of mean-square continuity and introduce the elementary concepts of the
corresponding mean-square calculus.

Mean-square continuity of a random process

Recall that a sequence of random variables $\{X_n\}$ converges to a random variable $X$ in the mean-square sense if

$$\lim_{n \to \infty} E(X_n - X)^2 = 0$$

and we write

$$\operatorname{l.i.m.}_{n \to \infty} X_n = X$$
We also note the following properties:

(a) $\operatorname{l.i.m.}_{n \to \infty} c = c$, where $c$ is a constant.

(b) $\operatorname{l.i.m.}_{n \to \infty} (c_1 X_n + c_2 Y_n) = c_1 \operatorname{l.i.m.}_{n \to \infty} X_n + c_2 \operatorname{l.i.m.}_{n \to \infty} Y_n$

(c) $\lim_{n \to \infty} E X_n = E \operatorname{l.i.m.}_{n \to \infty} X_n$

(d) The following Cauchy criterion gives the condition for m.s. convergence of a
random sequence without actually finding the limit: the sequence $\{X_n\}$
converges in m.s. if and only if, for every $\epsilon > 0$, there exists a
positive integer $N$ such that

$$E|X_m - X_n|^2 < \epsilon \quad \text{whenever } m, n > N,$$

i.e. $E|X_m - X_n|^2 \to 0$ as $m, n \to \infty$.
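As a quick numerical illustration of m.s. convergence (a sketch, not part of the original development): take $X_n$ to be the sample mean of $n$ i.i.d. $U(0,1)$ variables, so that $X_n$ converges in m.s. to $X = 1/2$ with $E(X_n - X)^2 = 1/(12n)$.

```python
import numpy as np

rng = np.random.default_rng(0)

def ms_error(n, trials=20_000):
    """Monte-Carlo estimate of E(X_n - X)^2, where X_n is the sample mean
    of n iid U(0,1) variables and X = 1/2 is its m.s. limit."""
    samples = rng.random((trials, n)).mean(axis=1)
    return np.mean((samples - 0.5) ** 2)

errors = [ms_error(n) for n in (10, 100, 1000)]
# Theory: E(X_n - 1/2)^2 = 1/(12 n), so the errors shrink ~10x per step.
```

The estimated errors track $1/(12n)$ closely, which is exactly the statement $\lim_{n\to\infty} E(X_n - X)^2 = 0$.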

Proof of (b)

Suppose $\operatorname{l.i.m.}_{n \to \infty} X_n = X$ and $\operatorname{l.i.m.}_{n \to \infty} Y_n = Y$. Then,

$$E(c_1 X_n + c_2 Y_n - c_1 X - c_2 Y)^2 = E(c_1 (X_n - X) + c_2 (Y_n - Y))^2$$
$$= c_1^2 E(X_n - X)^2 + c_2^2 E(Y_n - Y)^2 + 2 c_1 c_2 E(X_n - X)(Y_n - Y)$$

so that

$$\lim_{n \to \infty} E(c_1 X_n + c_2 Y_n - c_1 X - c_2 Y)^2 = \lim_{n \to \infty} c_1^2 E(X_n - X)^2 + \lim_{n \to \infty} c_2^2 E(Y_n - Y)^2 + 2 c_1 c_2 \lim_{n \to \infty} E(X_n - X)(Y_n - Y)$$

The first two limits are 0 by assumption, and by the Cauchy–Schwarz inequality

$$\left| 2 c_1 c_2 \lim_{n \to \infty} E(X_n - X)(Y_n - Y) \right| \le 2 |c_1 c_2| \lim_{n \to \infty} \sqrt{E(X_n - X)^2 \, E(Y_n - Y)^2} = 0$$

Hence the limit is 0, proving (b).


The concept of the limit of a deterministic function can be extended to a random
process to define limits in the mean-square sense, in probability, etc. The limit
of a continuous-time random process $\{X(t)\}$ at a point $t = t_0$ can be defined
using a sequence of random variables in the neighbourhood of $t_0$. The limit in
the m.s. sense is defined as follows:

Definition: Suppose $\{X(t),\, t \in \Gamma\}$ is a random process and $Y$ a random
variable defined on the same probability space $(S, \mathcal{F}, P)$. $Y$ is called
the m.s. limit of $\{X(t)\}$ at the point $t = t_0$ if for any $\epsilon > 0$ there
exists $\delta > 0$ such that $E(X(t) - Y)^2 < \epsilon$ whenever
$0 < |t - t_0| < \delta$. We write

$$\operatorname{l.i.m.}_{t \to t_0} X(t) = Y$$

We also note that

$$\lim_{t \to t_0} E X(t) = E \operatorname{l.i.m.}_{t \to t_0} X(t)$$

With the above definition of the m.s. limit, we can define the mean-square
continuity of a random process.

Definition: A random process $\{X(t),\, t \in \Gamma\}$ defined on a probability
space $(S, \mathcal{F}, P)$ is said to be continuous at a point $t = t_0$ in the
mean-square sense if

$$\operatorname{l.i.m.}_{t \to t_0} X(t) = X(t_0)$$

or, equivalently,

$$\lim_{t \to t_0} E(X(t) - X(t_0))^2 = 0$$

The random process $\{X(t),\, t \in \Gamma\}$ is called m.s. continuous if it is
m.s. continuous at each $t \in \Gamma$.
Mean-square continuity and autocorrelation function

The m.s. continuity of a random process $\{X(t),\, t \in \Gamma\}$ at a point
$t = t_0$ depends on the two-dimensional continuity of the autocorrelation
function $R_X(t_1, t_2)$, as stated in the following theorem.

Theorem 1: A random process $\{X(t),\, t \in \Gamma\}$ is m.s. continuous at $t_0$
if and only if its autocorrelation function $R_X(t_1, t_2)$ is continuous at
$(t_0, t_0)$.

Proof: We first prove the necessary part. We have

$$|R_X(t_1, t_2) - R_X(t_0, t_0)| = |E X(t_1) X(t_2) - E X(t_0) X(t_0)|$$
$$= |E X(t_1)(X(t_2) - X(t_0)) + E X(t_0)(X(t_1) - X(t_0))|$$
$$\le |E X(t_1)(X(t_2) - X(t_0))| + |E X(t_0)(X(t_1) - X(t_0))|$$
$$\le \sqrt{E X^2(t_1)\, E(X(t_2) - X(t_0))^2} + \sqrt{E X^2(t_0)\, E(X(t_1) - X(t_0))^2}$$

(using the Cauchy–Schwarz inequality). Now, by the m.s. continuity at $t_0$,
$\lim_{t_2 \to t_0} E(X(t_2) - X(t_0))^2 = 0$ and
$\lim_{t_1 \to t_0} E(X(t_1) - X(t_0))^2 = 0$.

$$\therefore \lim_{t_1 \to t_0,\, t_2 \to t_0} |R_X(t_1, t_2) - R_X(t_0, t_0)| = 0$$

Thus $R_X(t_1, t_2)$ is continuous at $(t_0, t_0)$.


To prove the sufficient part, suppose $R_X(t_1, t_2)$ is continuous at $(t_0, t_0)$. Then,

$$E(X(t) - X(t_0))^2 = E(X^2(t) - 2 X(t) X(t_0) + X^2(t_0))$$
$$= R_X(t, t) - 2 R_X(t, t_0) + R_X(t_0, t_0)$$

$$\therefore \lim_{t \to t_0} E(X(t) - X(t_0))^2 = \lim_{t \to t_0} \left( R_X(t, t) - 2 R_X(t, t_0) + R_X(t_0, t_0) \right)$$
$$= R_X(t_0, t_0) - 2 R_X(t_0, t_0) + R_X(t_0, t_0) = 0$$

Thus $\{X(t),\, t \in \Gamma\}$ is m.s. continuous at $t_0$.
Theorem 2: If $\{X(t)\}$ is m.s. continuous at $t_0$, then its mean is continuous at $t_0$.

This follows from the fact that

$$(E X(t) - E X(t_0))^2 = (E(X(t) - X(t_0)))^2 \le E(X(t) - X(t_0))^2$$

$$\therefore \lim_{t \to t_0} (E X(t) - E X(t_0))^2 \le \lim_{t \to t_0} E(X(t) - X(t_0))^2 = 0$$

$$\therefore E X(t) \text{ is continuous at } t_0$$

The converse of this theorem is not true. For example, a white noise process
$\{X(t)\}$ has mean $E X(t) = 0$, which is continuous at every $t$, but its
autocorrelation function is discontinuous at the points $(t, t)$.
Remark

(1) If the autocorrelation function $R_X(t_1, t_2)$ is continuous at all points of
the form $(t, t)$, $t \in \Gamma$, then $\{X(t),\, t \in \Gamma\}$ is m.s.
continuous at all points of $\Gamma$.

(2) If $\{X(t)\}$ is a WSS random process, then $E X(t_1) X(t_2) = R_X(\tau)$,
where $\tau = t_1 - t_2$. Theorem 1 then simplifies as follows:

A WSS random process $\{X(t),\, t \in \Gamma\}$ is m.s. continuous at all points
of $\Gamma$ if and only if its autocorrelation function $R_X(\tau)$ is continuous
at $\tau = 0$. This is because

$$E(X(t_0 + \tau) - X(t_0))^2 = R_X(0) - 2 R_X(\tau) + R_X(0) = 2(R_X(0) - R_X(\tau))$$

If $\lim_{\tau \to 0} R_X(\tau) = R_X(0)$, then from the above
$\lim_{\tau \to 0} E(X(t_0 + \tau) - X(t_0))^2 = 0$. This proves the sufficient
condition; the necessary condition can be proved similarly.
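The identity $E(X(t_0+\tau) - X(t_0))^2 = 2(R_X(0) - R_X(\tau))$ can be checked numerically. The sketch below (an illustration, not part of the original notes) uses a random-phase sinusoid $X(t) = \cos(t + \phi)$, $\phi \sim U[0, 2\pi]$, whose autocorrelation is $R_X(\tau) = \tfrac{1}{2}\cos\tau$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Random-phase sinusoid X(t) = cos(t + phi), phi ~ U[0, 2*pi];
# its autocorrelation is R_X(tau) = cos(tau)/2, continuous at tau = 0.
phi = rng.uniform(0, 2 * np.pi, size=500_000)

def ms_increment(t0, tau):
    """Monte-Carlo estimate of E(X(t0 + tau) - X(t0))^2."""
    return np.mean((np.cos(t0 + tau + phi) - np.cos(t0 + phi)) ** 2)

# Compare with the WSS identity 2*(R_X(0) - R_X(tau)) = 1 - cos(tau),
# which shrinks to 0 as tau -> 0 (m.s. continuity).
```

The estimate matches $1 - \cos\tau$ and vanishes as $\tau \to 0$, exactly as the remark predicts for an autocorrelation continuous at $\tau = 0$.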

Example 1

Consider the random binary wave $\{X(t)\}$. A typical realization of the process
is shown in Figure 1: the waveform takes the values $+1$ and $-1$ over successive
intervals of length $T_p$, and the realization is a discontinuous function.

Figure 1: A realization of the random binary wave $X(t)$

The process has the autocorrelation function given by

$$R_X(\tau) = \begin{cases} 1 - \dfrac{|\tau|}{T_p} & |\tau| \le T_p \\ 0 & \text{otherwise} \end{cases}$$

We observe that $R_X(\tau)$ is continuous at $\tau = 0$. Therefore, $\{X(t)\}$ is
m.s. continuous at every $t$, even though its individual realizations are
discontinuous.
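The triangular autocorrelation can be verified by simulation. The sketch below is illustrative only; the i.i.d. $\pm 1$ levels on slots of length $T_p$ and the uniform random delay are the standard random binary wave construction, and the slot length $T_p = 1$ is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(2)
Tp = 1.0   # slot length of the binary wave (arbitrary choice)

def est_autocorr(tau, trials=200_000):
    """Monte-Carlo estimate of R_X(tau) = E X(t) X(t + tau) for the random
    binary wave: iid +/-1 levels on slots of length Tp, delay ~ U[0, Tp)."""
    delay = rng.uniform(0, Tp, size=trials)
    bits = rng.choice([-1.0, 1.0], size=(trials, 8))   # enough slots to cover
    t = 3.0                                            # any fixed time
    i1 = np.floor((t - delay) / Tp).astype(int)        # slot index of t
    i2 = np.floor((t + tau - delay) / Tp).astype(int)  # slot index of t + tau
    rows = np.arange(trials)
    return np.mean(bits[rows, i1] * bits[rows, i2])

# est_autocorr(tau) should match 1 - |tau|/Tp for |tau| <= Tp and 0 beyond.
```

For $\tau = 0$ both samples fall in the same slot and the product is always 1; for $|\tau| > T_p$ the slots are independent and the estimate is near 0.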

Example 2

For a Wiener process $\{X(t)\}$,

$$R_X(t_1, t_2) = \alpha \min(t_1, t_2)$$

where $\alpha$ is a constant.

$$\therefore R_X(t, t) = \alpha \min(t, t) = \alpha t$$

Thus the autocorrelation function of a Wiener process is continuous everywhere,
implying that a Wiener process is m.s. continuous everywhere. We can similarly
show that the Poisson process is m.s. continuous everywhere.
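The m.s.-continuity computation for the Wiener process can be written out directly from $R_X(t_1, t_2) = \alpha \min(t_1, t_2)$. The short sketch below (with an arbitrary illustrative $\alpha$) evaluates $E(X(t+\Delta t) - X(t))^2 = R_X(t+\Delta t, t+\Delta t) - 2 R_X(t, t+\Delta t) + R_X(t, t)$, which equals $\alpha\,\Delta t$.

```python
alpha = 2.0   # arbitrary illustrative constant

def R(t1, t2):
    """Autocorrelation of a Wiener process: R_X(t1, t2) = alpha * min(t1, t2)."""
    return alpha * min(t1, t2)

def ms_increment(t, dt):
    """E(X(t + dt) - X(t))^2 expanded via the autocorrelation function."""
    return R(t + dt, t + dt) - 2 * R(t, t + dt) + R(t, t)

# ms_increment(t, dt) equals alpha * dt, which -> 0 as dt -> 0:
# the Wiener process is m.s. continuous at every t.
```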

Mean-square differentiability

Definition: A random process $\{X(t)\}$ on a probability space $(S, \mathcal{F}, P)$
is said to have the mean-square derivative $X'(t)$ at a point $t \in \Gamma$
provided $\dfrac{X(t + \Delta t) - X(t)}{\Delta t}$ approaches $X'(t)$ in the
mean-square sense as $\Delta t \to 0$. In other words, the random process
$\{X(t)\}$ has an m.s. derivative $X'(t)$ if

$$\lim_{\Delta t \to 0} E\left( \frac{X(t + \Delta t) - X(t)}{\Delta t} - X'(t) \right)^2 = 0$$

Remark

 If all the sample functions of a random process $\{X(t)\}$ are differentiable,
then the above condition is satisfied and the m.s. derivative exists.

Example 3

Consider the random-phase sinusoid $\{X(t)\}$ given by
$X(t) = A \cos(\omega_0 t + \phi)$, where $A$ and $\omega_0$ are constants and
$\phi \sim U[0, 2\pi]$. Then, for each $\phi$, $X(t)$ is differentiable.
Therefore, the m.s. derivative is

$$X'(t) = -A \omega_0 \sin(\omega_0 t + \phi)$$
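We can check this m.s. derivative numerically: the mean-square error between the difference quotient and $-A\omega_0\sin(\omega_0 t + \phi)$ should vanish as $\Delta t \to 0$. A sketch, with arbitrary illustrative values of $A$, $\omega_0$, and $t$:

```python
import numpy as np

rng = np.random.default_rng(4)
A, w0 = 2.0, 3.0                       # illustrative constants
phi = rng.uniform(0, 2 * np.pi, size=300_000)
t = 0.7                                # any fixed time

def ms_deriv_error(dt):
    """E[((X(t+dt) - X(t))/dt - X'(t))^2] for X(t) = A cos(w0 t + phi),
    with the claimed m.s. derivative X'(t) = -A w0 sin(w0 t + phi)."""
    quotient = (A * np.cos(w0 * (t + dt) + phi) - A * np.cos(w0 * t + phi)) / dt
    deriv = -A * w0 * np.sin(w0 * t + phi)
    return np.mean((quotient - deriv) ** 2)

# The error shrinks like O(dt^2), confirming convergence in mean square.
```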


M.S. Derivative and Autocorrelation functions

Theorem: The random process $\{X(t)\}$ is m.s. differentiable at $t \in \Gamma$
if and only if

$$\frac{\partial^2 R_X(t_1, t_2)}{\partial t_1 \partial t_2}$$

exists at $(t, t) \in \Gamma \times \Gamma$.

To prove the theorem, we shall take the help of the Cauchy criterion for m.s.
convergence: the sequence $\{X_n\}$ converges in m.s. if and only if

$$\lim_{n \to \infty,\, m \to \infty} E|X_n - X_m|^2 = 0$$
n  ,m 

Applying the Cauchy criterion, the condition for the existence of the m.s.
derivative is

$$\lim_{\Delta t_1, \Delta t_2 \to 0} E\left( \frac{X(t + \Delta t_1) - X(t)}{\Delta t_1} - \frac{X(t + \Delta t_2) - X(t)}{\Delta t_2} \right)^2 = 0$$

Expanding the square and taking expectations,

$$E\left( \frac{X(t + \Delta t_1) - X(t)}{\Delta t_1} - \frac{X(t + \Delta t_2) - X(t)}{\Delta t_2} \right)^2$$
$$= \frac{R_X(t + \Delta t_1, t + \Delta t_1) - 2 R_X(t + \Delta t_1, t) + R_X(t, t)}{\Delta t_1^2} + \frac{R_X(t + \Delta t_2, t + \Delta t_2) - 2 R_X(t + \Delta t_2, t) + R_X(t, t)}{\Delta t_2^2} - 2\, \frac{R_X(t + \Delta t_1, t + \Delta t_2) - R_X(t + \Delta t_1, t) - R_X(t, t + \Delta t_2) + R_X(t, t)}{\Delta t_1 \Delta t_2}$$

Each of the three terms above converges to
$\left. \dfrac{\partial^2 R_X(t_1, t_2)}{\partial t_1 \partial t_2} \right|_{t_1 = t_2 = t}$
if this second partial derivative exists, so that

$$\lim_{\Delta t_1, \Delta t_2 \to 0} E\left( \frac{X(t + \Delta t_1) - X(t)}{\Delta t_1} - \frac{X(t + \Delta t_2) - X(t)}{\Delta t_2} \right)^2 = \left[ \frac{\partial^2 R_X}{\partial t_1 \partial t_2} + \frac{\partial^2 R_X}{\partial t_1 \partial t_2} - 2\, \frac{\partial^2 R_X}{\partial t_1 \partial t_2} \right]_{t_1 = t_2 = t} = 0$$

Thus, $\{X(t)\}$ is m.s. differentiable at $t \in \Gamma$ if
$\dfrac{\partial^2 R_X(t_1, t_2)}{\partial t_1 \partial t_2}$ exists at
$(t, t) \in \Gamma \times \Gamma$.

In particular, if $\{X(t)\}$ is WSS,

$$R_X(t_1, t_2) = R_X(t_1 - t_2)$$

Substituting $\tau = t_1 - t_2$, we get

$$\frac{\partial^2 R_X(t_1, t_2)}{\partial t_1 \partial t_2} = \frac{\partial}{\partial t_1} \left( \frac{d R_X(\tau)}{d\tau} \cdot \frac{\partial \tau}{\partial t_2} \right) = -\frac{\partial}{\partial t_1} \frac{d R_X(\tau)}{d\tau} = -\frac{d^2 R_X(\tau)}{d\tau^2} \cdot \frac{\partial \tau}{\partial t_1} = -\frac{d^2 R_X(\tau)}{d\tau^2}$$

Therefore, a WSS process $\{X(t)\}$ is m.s. differentiable if $R_X(\tau)$ has a
second derivative at $\tau = 0$.

Example 4

Consider a WSS process $\{X(t)\}$ with autocorrelation function

$$R_X(\tau) = \exp(-a|\tau|)$$

$R_X(\tau)$ does not have first and second derivatives at $\tau = 0$, because of
the corner of $|\tau|$ at the origin. Therefore, $\{X(t)\}$ is not mean-square
differentiable.

Example 5

The random binary wave $\{X(t)\}$ has the autocorrelation function

$$R_X(\tau) = \begin{cases} 1 - \dfrac{|\tau|}{T_p} & |\tau| \le T_p \\ 0 & \text{otherwise} \end{cases}$$

$R_X(\tau)$ does not have first and second derivatives at $\tau = 0$. Therefore,
$\{X(t)\}$ is not mean-square differentiable.

Example 6

For a Wiener process $\{X(t)\}$,

$$R_X(t_1, t_2) = \alpha \min(t_1, t_2)$$

where $\alpha$ is a constant. Consider the partial derivative with respect to
$t_1$ for a fixed $t_2$:

$$\frac{\partial R_X(t_1, t_2)}{\partial t_1} = \begin{cases} \alpha & t_1 < t_2 \\ 0 & t_1 > t_2 \end{cases}$$

and the derivative does not exist at $t_1 = t_2$. Consequently,

$$\frac{\partial^2 R_X(t_1, t_2)}{\partial t_1 \partial t_2}$$

does not exist at any point of the form $(t, t)$. Thus a Wiener process is m.s.
differentiable nowhere.
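The same difference-quotient computation makes the divergence explicit: for the Wiener process, $E\left(\frac{X(t+h)-X(t)}{h}\right)^2 = \frac{\alpha h}{h^2} = \frac{\alpha}{h}$, which blows up as $h \to 0$. A short sketch with an illustrative $\alpha$:

```python
alpha = 2.0   # illustrative constant

def R(t1, t2):
    """Autocorrelation of a Wiener process: alpha * min(t1, t2)."""
    return alpha * min(t1, t2)

def dq_second_moment(t, h):
    """E[((X(t+h) - X(t))/h)^2] expanded via the autocorrelation function."""
    return (R(t + h, t + h) - 2 * R(t, t + h) + R(t, t)) / h ** 2

# dq_second_moment(t, h) == alpha / h at every t: the second moment of the
# difference quotient diverges as h -> 0, so the m.s. derivative exists nowhere.
```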

Mean and Autocorrelation of the Derivative process

We have

$$E X'(t) = E \operatorname{l.i.m.}_{\Delta t \to 0} \frac{X(t + \Delta t) - X(t)}{\Delta t} = \lim_{\Delta t \to 0} \frac{E X(t + \Delta t) - E X(t)}{\Delta t} = \lim_{\Delta t \to 0} \frac{\mu_X(t + \Delta t) - \mu_X(t)}{\Delta t} = \mu_X'(t)$$

For a WSS process, $E X'(t) = \mu_X'(t) = 0$, since $\mu_X(t)$ is constant.

$$R_{XX'}(t_1, t_2) = E X(t_1) X'(t_2) = E X(t_1) \operatorname{l.i.m.}_{\Delta t_2 \to 0} \frac{X(t_2 + \Delta t_2) - X(t_2)}{\Delta t_2}$$
$$= \lim_{\Delta t_2 \to 0} \frac{E(X(t_1) X(t_2 + \Delta t_2)) - E(X(t_1) X(t_2))}{\Delta t_2} = \lim_{\Delta t_2 \to 0} \frac{R_X(t_1, t_2 + \Delta t_2) - R_X(t_1, t_2)}{\Delta t_2} = \frac{\partial R_X(t_1, t_2)}{\partial t_2}$$

$$\therefore R_{XX'}(t_1, t_2) = \frac{\partial R_X(t_1, t_2)}{\partial t_2}$$

Similarly, we can show that

$$R_{X'X}(t_1, t_2) = E X'(t_1) X(t_2) = \frac{\partial R_X(t_1, t_2)}{\partial t_1}$$

Using the above result,

$$R_{X'}(t_1, t_2) = E X'(t_1) X'(t_2) = \frac{\partial R_{XX'}(t_1, t_2)}{\partial t_1} = \frac{\partial^2 R_X(t_1, t_2)}{\partial t_1 \partial t_2}$$

For a WSS process,

$$\mu_{X'}(t) = E X'(t) = 0$$

$$E X(t_1) X'(t_2) = \frac{\partial R_X(t_1 - t_2)}{\partial t_2} = -\frac{d R_X(\tau)}{d\tau}$$

and

$$E X'(t_1) X'(t_2) = \frac{\partial^2 R_X(t_1 - t_2)}{\partial t_1 \partial t_2} = -\frac{d^2 R_X(\tau)}{d\tau^2}$$

$$\therefore \operatorname{var}(X'(t)) = \left. -\frac{d^2 R_X(\tau)}{d\tau^2} \right|_{\tau = 0}$$

Mean Square Integral


Recall that the Riemann integral of a function $x(t)$ over the interval $[t_0, t]$
is defined as the limiting sum

$$\int_{t_0}^{t} x(\tau)\, d\tau = \lim_{n \to \infty} \sum_{k=0}^{n-1} x(\tau_k) \Delta_k$$

where $t_0 < t_1 < \dots < t_{n-1} < t_n = t$ is a partition of the interval
$[t_0, t]$, $\Delta_k = t_{k+1} - t_k$ and $\tau_k \in [t_k, t_{k+1}]$. For a
random process $\{X(t)\}$, the m.s. integral can be similarly defined.

Definition: Suppose $t_0 < t_1 < \dots < t_{n-1} < t_n = t$ is a partition of the
interval $[t_0, t]$, $\Delta_k = t_{k+1} - t_k$ and $\tau_k \in [t_k, t_{k+1}]$.
The random process $\{X(t)\}$ on a probability space $(S, \mathcal{F}, P)$ is said
to be m.s. integrable over $[t_0, t]$ if
$\operatorname{l.i.m.}_{n \to \infty} \sum_{k=0}^{n-1} X(\tau_k) \Delta_k$ exists.
This limit is called the mean-square integral of $\{X(t)\}$ and is denoted by

$$Y(t) = \int_{t_0}^{t} X(\tau)\, d\tau$$

Thus

$$Y(t) = \int_{t_0}^{t} X(\tau)\, d\tau = \operatorname{l.i.m.}_{n \to \infty} \sum_{k=0}^{n-1} X(\tau_k) \Delta_k$$

Note that the mean-square integral $Y(t)$ is itself a random process, since the
upper limit $t$ varies.

Existence of M.S. Integral


 It can be shown that the m.s. integral $\int_{t_0}^{t} X(\tau)\, d\tau$ exists
if and only if the double integral
$\int_{t_0}^{t} \int_{t_0}^{t} R_X(\tau_1, \tau_2)\, d\tau_1\, d\tau_2$ exists; in
other words, the random process $\{X(t)\}$ is m.s. integrable over $[t_0, t]$ if
and only if $R_X(t_1, t_2)$ is Riemann integrable over $[t_0, t] \times [t_0, t]$.

 If $\{X(t)\}$ is m.s. continuous, then the above condition is satisfied and the
process is m.s. integrable.
Mean and Autocorrelation of the Integral of a WSS process

We have

$$E Y(t) = E \int_{t_0}^{t} X(\tau)\, d\tau = \int_{t_0}^{t} E X(\tau)\, d\tau = \int_{t_0}^{t} \mu_X\, d\tau = \mu_X (t - t_0)$$

Therefore, if $\mu_X \ne 0$, $Y(t)$ is necessarily non-stationary.

$$R_Y(t_1, t_2) = E Y(t_1) Y(t_2) = E \int_{t_0}^{t_1} \int_{t_0}^{t_2} X(\tau_1) X(\tau_2)\, d\tau_1\, d\tau_2$$
$$= \int_{t_0}^{t_1} \int_{t_0}^{t_2} E X(\tau_1) X(\tau_2)\, d\tau_1\, d\tau_2 = \int_{t_0}^{t_1} \int_{t_0}^{t_2} R_X(\tau_1 - \tau_2)\, d\tau_1\, d\tau_2$$

which is a function of $t_1$ and $t_2$ separately, not of $t_1 - t_2$ alone.

Thus the integral of a WSS process is always non-stationary.

Figure 2: (a) Realization of a WSS process $X(t)$; (b) the corresponding integral $Y(t)$

Remark
 The non-stationarity of the m.s. integral of a random process has physical
importance: the output of an integrator driven by stationary noise grows
unboundedly.
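The unbounded growth is easy to see on a grid: approximating $Y(t)$ by a cumulative (rectangle-rule) sum of zero-mean stationary noise samples, the variance of $Y(t)$ grows roughly linearly in $t$. A discrete sketch under illustrative parameters (grid step, horizon, and noise variance are all arbitrary):

```python
import numpy as np

rng = np.random.default_rng(6)

dt, n_steps, trials = 0.01, 1000, 5_000
# Zero-mean stationary "noise" sampled on a grid of step dt
X = rng.normal(0.0, 1.0, size=(trials, n_steps))
# Rectangle-rule approximation of Y(t) = integral of X over [0, t]
Y = np.cumsum(X * dt, axis=1)

var_early = np.var(Y[:, 99])    # Var Y at t ~ 1.0
var_late = np.var(Y[:, 999])    # Var Y at t ~ 10.0
# var_late / var_early ~ 10: the variance keeps growing with t,
# so the integrated process Y(t) is nonstationary.
```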

Example 7

The random binary wave $\{X(t)\}$ has the autocorrelation function

$$R_X(\tau) = \begin{cases} 1 - \dfrac{|\tau|}{T_p} & |\tau| \le T_p \\ 0 & \text{otherwise} \end{cases}$$

$R_X(\tau)$ is continuous at $\tau = 0$, implying that $\{X(t)\}$ is m.s.
continuous. Therefore, $\{X(t)\}$ is mean-square integrable.
