
Random/Stochastic Process - Introduction

• Stochastic processes are processes that proceed randomly in time.

• Rather than considering fixed random variables X, Y, etc., or even sequences of i.i.d. random variables, we consider sequences X0, X1, X2, …, where Xt represents some random quantity at time t.

• In general, the value Xt might depend on the quantity Xt−1 at time t−1, or even on the values Xs for other times s < t.

• Example: the simple random walk; a short simulation sketch follows.
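A minimal Python sketch (illustrative; the step count and seed are arbitrary choices, not from the slides) of the simple random walk: X0 = 0 and Xt = Xt−1 + Zt, where the Zt are i.i.d. steps of ±1 with equal probability.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

n_steps = 100
steps = rng.choice([-1, 1], size=n_steps)       # i.i.d. increments Z_1, ..., Z_n
walk = np.concatenate(([0], np.cumsum(steps)))  # X_0, X_1, ..., X_n

print(walk[:10])  # the first few values of one realization
```

Each run produces a different realization unless the seed is fixed, which is exactly the sense in which the process "proceeds randomly in time."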

Stochastic Process

• Innocent Ndoh Mbue, PhD (Professor in Ecoinformatics)
• E-mail: dndoh2009@gmail.com
• Tel: 653754070/677540384

Stochastic Process - Definition
• A stochastic process is a family of time-indexed random variables Xt, where t belongs to an index set. Formal notation: {Xt : t ∈ I}, where I is an index set that is a subset of ℝ.
• Examples of index sets:
1) I = (−∞, ∞) or I = [0, ∞). In this case Xt is a continuous-time stochastic process.
2) I = {0, ±1, ±2, …} or I = {0, 1, 2, …}. In this case Xt is a discrete-time stochastic process.
• We use uppercase letters, {Xt}, to describe the process. A time series, {xt}, is a realization or sample function from a certain process.
• We use information from a time series to estimate parameters and
properties of process {Xt }.
PDFs and CDFs

Example
Consider the random process {Xn, n = 0, 1, 2, …}, in which the Xn's are i.i.d. standard normal random variables.

1. Write down fXn(x) for n = 0, 1, 2, …
2. Write down fXmXn(x1, x2) for m ≠ n.

Solution

Since each Xn is standard normal,
$$f_{X_n}(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}, \quad \text{for all } x \in \mathbb{R},\ n = 0, 1, 2, \dots$$
Since Xm and Xn are independent for m ≠ n, the joint PDF factors into the product of the marginals:
$$f_{X_m X_n}(x_1, x_2) = f_{X_m}(x_1)\, f_{X_n}(x_2) = \frac{1}{2\pi}\, e^{-(x_1^2 + x_2^2)/2}.$$
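A small SciPy check of the solution above (illustrative; the evaluation points x1, x2 are arbitrary): the marginal PDF of each Xn is the standard normal density, and by independence the joint PDF of (Xm, Xn) is the product of the marginals.

```python
from scipy.stats import norm

x1, x2 = 0.3, -1.2
marginal = norm.pdf(x1)              # f_{X_n}(x1): standard normal density
joint = norm.pdf(x1) * norm.pdf(x2)  # f_{X_m X_n}(x1, x2) for m != n
print(marginal, joint)
```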
Probability Distribution of a Process
• For any stochastic process with index set I, its probability distribution function is uniquely determined by its finite-dimensional distributions.

• The k-dimensional distribution function of a process is defined by


$$F_{X_{t_1},\dots,X_{t_k}}(x_1,\dots,x_k) = P\big(X_{t_1} \le x_1,\ \dots,\ X_{t_k} \le x_k\big)$$
for any $t_1, \dots, t_k \in I$ and any real numbers $x_1, \dots, x_k$.

• The distribution function tells us everything we need to know about the process {Xt}.

Moments of Stochastic Process
• We can describe a stochastic process via its moments, i.e., $E[X_t]$, $E[X_t^2]$, $E[X_t X_s]$, etc. We often use the first two moments.

• The mean function of the process is $E[X_t] = \mu_t$.

• The variance function of the process is $\mathrm{Var}(X_t) = \sigma_t^2$.

• The covariance function between Xt and Xs is
$$\mathrm{Cov}(X_t, X_s) = E\big[(X_t - \mu_t)(X_s - \mu_s)\big].$$

• The correlation function between Xt and Xs is
$$\rho(X_t, X_s) = \frac{\mathrm{Cov}(X_t, X_s)}{\sqrt{\sigma_t^2 \sigma_s^2}}.$$

• These moments are often functions of time.


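To make the moment functions concrete, here is an illustrative sketch (an assumed example, not from the slides) that estimates the mean and variance functions of the simple random walk by averaging over many simulated paths; for i.i.d. ±1 steps, E[Xt] = 0 and Var(Xt) = t, so the estimated variance function should grow linearly in t.

```python
import numpy as np

rng = np.random.default_rng(4)
n_paths, n_steps = 10_000, 50
steps = rng.choice([-1, 1], size=(n_paths, n_steps))
paths = np.cumsum(steps, axis=1)     # row i holds X_1, ..., X_n of path i

for t in (10, 25, 50):
    col = paths[:, t - 1]            # samples of X_t across realizations
    print(t, round(col.mean(), 2), round(col.var(), 1))  # mean ~ 0, var ~ t
```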
Sample Autocovariance Function
• Given a single realization {xt} of a stationary process {Xt}, the
sample autocovariance function given by
1 nk
ˆ k    xt  x xt  k  x 
n t 1

is an estimate of the autocivariance function.

Sample Autocorrelation Function
• For a given time series {xt}, the sample autocorrelation
function is given by
nk

 x t  x xt  k  x 
ˆ k 
ˆ k   t 1
 .
n
ˆ 0

 tx  x  2

t 1

• The sample autocorrelation function is non-negative definite.

• The sample autocovariance and autocorrelation functions have the same properties as the autocovariance and autocorrelation functions of the entire process.

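A minimal NumPy sketch of the two estimators above (the AR(1) demo series is an illustrative assumption, not data from the slides):

```python
import numpy as np

def sample_acovf(x, k):
    """gamma_hat(k): average of (x_t - xbar)(x_{t+k} - xbar) over t = 1..n-k."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar = x.mean()
    return np.sum((x[: n - k] - xbar) * (x[k:] - xbar)) / n

def sample_acf(x, k):
    """rho_hat(k) = gamma_hat(k) / gamma_hat(0)."""
    return sample_acovf(x, k) / sample_acovf(x, 0)

# Demo on a simulated AR(1) series x_t = 0.6 x_{t-1} + e_t.
rng = np.random.default_rng(1)
e = rng.normal(size=500)
x = np.empty(500)
x[0] = e[0]
for t in range(1, 500):
    x[t] = 0.6 * x[t - 1] + e[t]

print([round(sample_acf(x, k), 3) for k in range(5)])  # decays roughly like 0.6**k
```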
The Autocovariance and Autocorrelation Functions

• For a stationary process {Xt}, with constant mean μ and constant variance σ², the covariance between Xt and Xt+s is
$$\gamma(s) = \mathrm{Cov}(X_t, X_{t+s}) = E\big[(X_t - \mu)(X_{t+s} - \mu)\big].$$
• The correlation between Xt and Xt+s is
$$\rho(s) = \frac{\mathrm{Cov}(X_t, X_{t+s})}{\sqrt{\mathrm{Var}(X_t)\,\mathrm{Var}(X_{t+s})}} = \frac{\gamma(s)}{\gamma(0)},$$
where $\mathrm{Var}(X_t) = \mathrm{Var}(X_{t+s}) = \gamma(0)$.


• As functions of s, γ(s) is called the autocovariance function and ρ(s) is called the autocorrelation function (ACF). They represent the covariance and correlation between Xt and Xt+s from the same process, separated only by s time lags.

Properties of γ(s) and ρ(s)
• For a stationary process, the autocovariance function γ(s) and the
autocorrelation function ρ(s) have the following properties:
• $\gamma(0) = \mathrm{Var}(X_t)$; $\rho(0) = 1$.
• $-1 \le \rho(s) \le 1$.
• $\gamma(s) = \gamma(-s)$ and $\rho(s) = \rho(-s)$.
• The autocovariance function γ(s) and the autocorrelation function ρ(s) are positive semidefinite in the sense that
$$\sum_{i=1}^{n}\sum_{j=1}^{n} \alpha_i \alpha_j\, \gamma(t_i - t_j) \ge 0 \quad\text{and}\quad \sum_{i=1}^{n}\sum_{j=1}^{n} \alpha_i \alpha_j\, \rho(t_i - t_j) \ge 0$$
for any time points $t_1, \dots, t_n$ and any real numbers $\alpha_1, \alpha_2, \dots, \alpha_n$.

Partial Autocorrelation Function

• Often we want to investigate the dependency/association between Xt and Xt+k adjusting for their dependency on Xt+1, Xt+2, …, Xt+k−1.

• The conditional correlation Corr(Xt, Xt+k | Xt+1, Xt+2, …, Xt+k−1) is usually referred to as the partial correlation in time series analysis.

• Partial autocorrelation is usually useful for identifying autoregressive models; see the sketch below.
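A hypothetical sketch of how the lag-k partial autocorrelation can be computed via the Yule-Walker equations: solve the order-k system built from the sample autocorrelations and take the last coefficient (statsmodels ships a ready-made pacf; this hand-rolled version is only to show the idea).

```python
import numpy as np

def pacf_yule_walker(x, max_lag):
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # Sample autocorrelations rho(0), ..., rho(max_lag).
    rho = np.array([np.sum(x[: n - k] * x[k:]) / np.sum(x * x)
                    for k in range(max_lag + 1)])
    pacf_vals = [1.0]
    for k in range(1, max_lag + 1):
        # Yule-Walker system R * phi = rho[1..k]; the last AR(k)
        # coefficient phi_kk is the partial autocorrelation at lag k.
        R = np.array([[rho[abs(i - j)] for j in range(k)] for i in range(k)])
        phi = np.linalg.solve(R, rho[1 : k + 1])
        pacf_vals.append(phi[-1])
    return np.array(pacf_vals)

# For an AR(1) series the PACF should cut off after lag 1.
rng = np.random.default_rng(8)
e = rng.normal(size=1000)
x = np.empty(1000)
x[0] = e[0]
for t in range(1, 1000):
    x[t] = 0.7 * x[t - 1] + e[t]
print(np.round(pacf_yule_walker(x, 4), 2))  # roughly [1, 0.7, 0, 0, 0]
```

This cut-off behavior is what makes the PACF useful for choosing the order of an autoregressive model.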

Multiple Random Processes

To get an idea about these concepts, suppose that X(t) is the price of oil (per gallon) and Y(t) is the price of gasoline (per gallon) at time t.

Since gasoline is produced from oil, as oil prices increase, gasoline prices tend to increase too. Thus, we conclude that X(t) and Y(t) should be positively correlated (at least for the same t, i.e., CXY(t, t) > 0), where CXY(t1, t2) = Cov(X(t1), Y(t2)) denotes the cross-covariance function.
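A rough numerical illustration (an assumed model, not from the slides): let Y(t) track X(t) plus independent noise, mimicking the oil/gasoline relationship; treating each index as an independent draw of the pair (X(t), Y(t)), the sample covariance estimates CXY(t, t) and should come out positive.

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(loc=3.0, scale=0.5, size=10_000)   # "oil price" draws
Y = 0.8 * X + rng.normal(scale=0.2, size=10_000)  # "gasoline" tracks oil

print(np.cov(X, Y)[0, 1])  # approx. 0.8 * 0.25 = 0.2 > 0
```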
Independent Random Processes

Two random processes X(t) and Y(t) are independent if, for all time points t1, …, tm and t′1, …, t′n, the random vectors (X(t1), …, X(tm)) and (Y(t′1), …, Y(t′n)) are independent.
Stationary Processes

A continuous-time random process {X(t), t ∈ ℝ} is (strict-sense) stationary if, for all choices of time points t1, …, tr and every shift Δ, the joint CDF of (X(t1), …, X(tr)) is the same as the joint CDF of (X(t1+Δ), …, X(tr+Δ)). We can provide a similar definition for discrete-time processes.
Example

Consider the discrete-time random process {X(n), n ∈ ℤ}, in which the X(n)'s are i.i.d. with CDF FX(n)(x) = F(x). Show that this is a (strict-sense) stationary process.

Solution
Intuitively, since X(n)'s are i.i.d., we expect that as time evolves the probabilistic
behavior of the process does not change. Therefore, this must be a stationary process.
To show this rigorously, we can argue as follows. For all real numbers x1, x2, …, xr, all distinct integers n1, n2, …, nr, and any integer shift Δ, we have, by independence and the common CDF,
$$F_{X(n_1),\dots,X(n_r)}(x_1,\dots,x_r) = F(x_1)\cdots F(x_r) = F_{X(n_1+\Delta),\dots,X(n_r+\Delta)}(x_1,\dots,x_r),$$
so the joint distributions are invariant to time shifts.
Weak-Sense Stationary Processes

A continuous-time random process {X(t), t ∈ ℝ} is weak-sense stationary (WSS) if its mean function is constant, E[X(t)] = μX for all t, and its correlation function RX(t1, t2) = E[X(t1)X(t2)] depends only on the difference t1 − t2. A similar definition applies to discrete-time WSS processes.
Example
Consider the random process {X(t),t∈R} defined as X(t)=cos(t+U), where
U∼Uniform(0,2π). Show that X(t) is a WSS process.

Solution

The mean is constant:
$$E[X(t)] = \frac{1}{2\pi}\int_0^{2\pi} \cos(t+u)\, du = 0.$$
For the correlation function, using $\cos A \cos B = \tfrac{1}{2}[\cos(A-B) + \cos(A+B)]$,
$$R_X(t_1, t_2) = E[\cos(t_1+U)\cos(t_2+U)] = \frac{1}{2}\cos(t_1 - t_2) + \frac{1}{2}E[\cos(t_1 + t_2 + 2U)] = \frac{1}{2}\cos(t_1 - t_2),$$
which depends only on t1 − t2. Hence X(t) is WSS.
… Solution

Since for WSS random processes RX(t1, t2) = RX(t1 − t2), we usually denote the correlation function by RX(τ), where τ = t1 − t2. Thus, for a WSS process, we can write

RX(τ) = E[X(t)X(t−τ)] = E[X(t+τ)X(t)].
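A quick Monte Carlo check of the example above (an illustrative verification, not part of the slides): for X(t) = cos(t + U) with U ∼ Uniform(0, 2π), the sample mean should be near 0 and the sample value of E[X(t1)X(t2)] near ½cos(t1 − t2), regardless of the absolute times.

```python
import numpy as np

rng = np.random.default_rng(2)
U = rng.uniform(0, 2 * np.pi, size=200_000)

for t1, t2 in [(0.0, 1.0), (5.0, 6.0)]:        # same lag tau = -1 in both cases
    x1, x2 = np.cos(t1 + U), np.cos(t2 + U)
    print(round(x1.mean(), 4),                 # ~ 0
          round((x1 * x2).mean(), 4),          # ~ 0.5 * cos(-1)
          round(0.5 * np.cos(t1 - t2), 4))
```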
Let {X(t), t ∈ ℝ} be a WSS process with correlation function RX(τ). Then, we can write

RX(0) = E[X(t)²].

The quantity E[X(t)²] is called the expected (average) power in X(t) at time t. For a WSS process, the expected power is not a function of time.

Since X(t)² ≥ 0, we conclude that RX(0) ≥ 0.

Next, let's consider RX(−τ). We have

RX(−τ) = E[X(t)X(t+τ)] (by definition) = E[X(t+τ)X(t)] = RX(τ),

so RX(τ) is an even function.

Finally, we would like to show that RX(τ) takes its maximum value at τ = 0. That is, X(t) and X(t+τ) have the highest correlation when τ = 0. By the Cauchy-Schwarz inequality, |E[X(t)X(t+τ)]| ≤ √(E[X(t)²] E[X(t+τ)²]) = RX(0), so

|RX(τ)| ≤ RX(0), for all τ ∈ ℝ.

Jointly Wide-Sense Stationary Processes
We often work with multiple random processes, so we extend the concept of wide-
sense stationarity to more than one process. More specifically, we can talk about
jointly wide-sense stationary processes.

Example

Let X(t) and Y(t) be two jointly WSS random processes. Consider the random
process Z(t) defined as:
Z(t)=X(t)+Y(t).
Show that Z(t) is WSS.
Solution

The mean of Z(t) is constant: E[Z(t)] = E[X(t)] + E[Y(t)] = μX + μY. For the correlation function,
$$R_Z(t_1, t_2) = E\big[(X(t_1)+Y(t_1))(X(t_2)+Y(t_2))\big] = R_X(\tau) + R_{XY}(\tau) + R_{YX}(\tau) + R_Y(\tau),$$
where τ = t1 − t2. Since X(t) and Y(t) are jointly WSS, each term depends only on τ, so Z(t) is WSS.
Cyclostationary Processes

A random process X(t) is cyclostationary if its statistical behavior is periodic in time, i.e., X(t) and X(t+T) have the same joint distributions for some period T. For example, consider the random process {X(t), t ∈ ℝ} defined as …
Derivatives and Integrals of Random Processes

Mean-square continuous
Let X(t) be a continuous-time random process. We say that X(t) is mean-square continuous at time t if
$$\lim_{\delta \to 0} E\big[(X(t+\delta) - X(t))^2\big] = 0.$$
Note that mean-square continuity does not mean that every possible realization of
X(t) is a continuous function. It roughly means that the difference X(t+δ)−X(t) is
small on average.
Example
If X(t) is a Poisson process with intensity λ, then for all t > s ≥ 0, we have
X(t)−X(s)∼Poisson(λ(t−s)).
Show that X(t) is mean-square continuous at any time t≥0.

Solution

Since X(t+δ) − X(t) ∼ Poisson(λδ) for δ > 0,
$$E\big[(X(t+\delta) - X(t))^2\big] = \mathrm{Var}\big(X(t+\delta) - X(t)\big) + \big(E[X(t+\delta) - X(t)]\big)^2 = \lambda\delta + (\lambda\delta)^2,$$
which goes to 0 as δ → 0 (the case δ < 0 is handled similarly). Hence X(t) is mean-square continuous.
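A quick numeric check of the solution above (illustrative): the increment X(t+δ) − X(t) is Poisson(λδ), so its second moment λδ + (λδ)² shrinks to 0 with δ.

```python
import numpy as np

rng = np.random.default_rng(6)
lam = 2.0
for delta in (1.0, 0.1, 0.01):
    inc = rng.poisson(lam * delta, size=100_000).astype(float)
    # Empirical E[(X(t+delta) - X(t))^2] vs. lam*delta + (lam*delta)^2.
    print(delta, round(np.mean(inc ** 2), 4), lam * delta + (lam * delta) ** 2)
```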
Gaussian process

A random process {X(t), t ∈ J} is a Gaussian (normal) random process if, for all t1, t2, …, tn ∈ J, the random variables X(t1), X(t2), …, X(tn) are jointly normal.
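A minimal sketch of sampling a Gaussian process on a grid (the squared-exponential covariance used here is an illustrative choice, not from the slides): since any finite set of points is jointly normal, one realization is a draw from the corresponding multivariate normal.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(0.0, 1.0, 100)
K = np.exp(-0.5 * ((t[:, None] - t[None, :]) / 0.1) ** 2)  # covariance matrix
K += 1e-9 * np.eye(len(t))      # small jitter for numerical stability
x = rng.multivariate_normal(np.zeros(len(t)), K)  # one sample path X(t)
print(x[:5])
```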
Similarly, we can define jointly Gaussian random processes.

Note that from the properties of jointly normal random variables, we can conclude that if two jointly Gaussian random processes X(t) and Y(t) are uncorrelated, i.e.,

CXY(t1, t2) = 0, for all t1, t2,

then X(t) and Y(t) are two independent random processes.


White Noise Processes
• A process {Xt} is called a white noise process if it is a sequence of uncorrelated random variables from a fixed distribution with constant mean μ (usually assumed to be 0) and constant variance σ².

• A white noise process is stationary with autocovariance and autocorrelation functions given by
$$\gamma(k) = \begin{cases} \sigma^2, & k = 0 \\ 0, & k \neq 0, \end{cases} \qquad \rho(k) = \begin{cases} 1, & k = 0 \\ 0, & k \neq 0. \end{cases}$$

• A white noise process is Gaussian if its joint distribution is normal.
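A minimal sketch (illustrative) that simulates Gaussian white noise and checks that the sample autocorrelation is near 1 at lag 0 and near 0 at every other lag, matching the functions above:

```python
import numpy as np

rng = np.random.default_rng(3)
w = rng.normal(loc=0.0, scale=1.0, size=2000)  # mean 0, variance sigma^2 = 1

wbar = w.mean()
gamma0 = np.mean((w - wbar) ** 2)
for k in range(4):
    gamma_k = np.sum((w[: len(w) - k] - wbar) * (w[k:] - wbar)) / len(w)
    print(k, round(gamma_k / gamma0, 3))       # approx. 1, then ~ 0
```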

Tutorial exercises

Problem 6
(Weakly stationary process) Consider a real-valued random variable …

Problem 7
Show that any stochastic process with independent increments
is a Markov process.
