Stochastic Processes - Lecture 2024
• In general, the value X_t might depend on the value X_{t−1} at time t−1, or even on the values X_s at earlier times s < t.
week 2 1
Stochastic Process
• E-mail: dndoh2009@gmail.com
• Tel: 653754070/677540384
Stochastic Process - Definition
• A stochastic process is a family of time-indexed random variables X_t, where t belongs to an index set. In formal notation, {X_t : t ∈ I}, where I is an index set that is a subset of ℝ.
• Examples of index sets:
1) I = (−∞, ∞) or I = [0, ∞). In this case, X_t is a continuous-time stochastic process.
2) I = {0, ±1, ±2, …} or I = {0, 1, 2, …}. In this case, X_t is a discrete-time stochastic process.
• We use the uppercase notation {X_t} to describe the process. A time series {x_t} is a realization, or sample function, of a particular process.
• We use information from a time series to estimate the parameters and properties of the process {X_t}.
PDFs and CDFs
Example
Consider the random process {X_n, n = 0, 1, 2, …}, in which the X_n's are i.i.d. standard normal random variables. Find the PDF and CDF of X_n.
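For the i.i.d. standard normal process above, a quick numerical sketch (Python, standard library only; illustrative, not part of the original slides) compares the empirical CDF of one realization against Φ(x):

```python
import math
import random

random.seed(0)

def phi(x):
    """Standard normal CDF, written via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# One realization of the process at times n = 0, ..., 9999:
# the X_n are i.i.d. N(0, 1).
xs = [random.gauss(0.0, 1.0) for _ in range(10_000)]

# Empirical CDF at a few points versus Phi(x).
for x in (-1.0, 0.0, 1.0):
    ecdf = sum(1 for v in xs if v <= x) / len(xs)
    print(f"x = {x:+.1f}   empirical = {ecdf:.3f}   Phi = {phi(x):.3f}")
```

The empirical values settle near Φ(−1) ≈ 0.159, Φ(0) = 0.5, and Φ(1) ≈ 0.841 as the sample grows.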
Solution
Since the X_n's are i.i.d. standard normal, for every n the CDF is F_{X_n}(x) = Φ(x) and the PDF is f_{X_n}(x) = (1/√(2π)) e^{−x²/2}.
Probability Distribution of a Process
• For any stochastic process with index set I, its probability distribution is uniquely determined by its finite-dimensional distributions, i.e., the joint distributions of (X_{t_1}, …, X_{t_n}) over all finite sets of times t_1, …, t_n in I.
Mean Function of a Random Process
• The mean function of a random process {X(t)} is defined as μ_X(t) = E[X(t)], a deterministic function of time t.
Moments of Stochastic Process
• We can describe a stochastic process via its moments, i.e., E[X_t], E[X_t²], E[X_t X_s], etc. We often use the first two moments.
Sample Autocorrelation Function
• For a given time series {x_t}, the sample autocorrelation function is given by

  ρ̂_k = γ̂_k / γ̂_0 = [ Σ_{t=1}^{n−k} (x_t − x̄)(x_{t+k} − x̄) ] / [ Σ_{t=1}^{n} (x_t − x̄)² ],  k = 0, 1, 2, …
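The formula above translates directly into code; the sketch below (pure Python, illustrative only) computes ρ̂_k for a simulated white-noise series:

```python
import random

def sample_acf(x, max_lag):
    """Sample autocorrelations rho_hat(0..max_lag) of a time series x."""
    n = len(x)
    xbar = sum(x) / n
    # gamma_hat(k) = sum_{t=1}^{n-k} (x_t - xbar)(x_{t+k} - xbar)
    def gamma_hat(k):
        return sum((x[t] - xbar) * (x[t + k] - xbar) for t in range(n - k))
    g0 = gamma_hat(0)  # n times the sample variance
    return [gamma_hat(k) / g0 for k in range(max_lag + 1)]

random.seed(1)
white_noise = [random.gauss(0.0, 1.0) for _ in range(500)]
acf = sample_acf(white_noise, 5)
print(acf[0])   # exactly 1.0 by construction
print(acf[1:])  # small values for white noise
```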
The Autocovariance and Autocorrelation Functions
Properties of γ(s) and ρ(s)
• For a stationary process, the autocovariance function γ(s) and the
autocorrelation function ρ(s) have the following properties:
  γ(0) = Var(X_t); ρ(0) = 1.
  −1 ≤ ρ(s) ≤ 1.
  γ(−s) = γ(s) and ρ(−s) = ρ(s).
• The autocovariance function γ(s) and the autocorrelation function ρ(s) are positive semidefinite, in the sense that for any real numbers a_1, …, a_n and any time points t_1, …, t_n,

  Σ_{i=1}^{n} Σ_{j=1}^{n} a_i a_j γ(t_i − t_j) ≥ 0  and  Σ_{i=1}^{n} Σ_{j=1}^{n} a_i a_j ρ(t_i − t_j) ≥ 0.
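As a numerical sanity check (not a proof) of the positive-semidefiniteness property, take the valid autocovariance γ(s) = φ^{|s|} of an AR(1)-type process and evaluate the quadratic form for many random coefficient vectors:

```python
import random

random.seed(2)

phi = 0.7
def gamma(s):
    """AR(1)-type autocovariance with unit variance: gamma(s) = phi**|s|."""
    return phi ** abs(s)

times = list(range(8))
worst = float("inf")
for _ in range(100):
    a = [random.uniform(-1.0, 1.0) for _ in times]
    q = sum(a[i] * a[j] * gamma(times[i] - times[j])
            for i in range(len(times)) for j in range(len(times)))
    worst = min(worst, q)

print(worst)  # smallest quadratic form seen; should be >= 0 up to rounding
```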
Partial Autocorrelation Function
• The partial autocorrelation at lag k is the correlation between X_t and X_{t+k} after removing the linear dependence of both on the intervening values X_{t+1}, …, X_{t+k−1}.
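As an illustrative Python sketch, one standard way to obtain the PACF from the autocorrelations is the Durbin-Levinson recursion, where the PACF at lag k is the last regression coefficient φ_{k,k}:

```python
def pacf_from_acf(rho, max_lag):
    """Durbin-Levinson recursion: rho[k] is the autocorrelation at lag k,
    with rho[0] == 1; returns the PACF at lags 0..max_lag."""
    pacf = [1.0]   # lag 0, by convention
    prev = []      # phi_{k-1, 1..k-1} from the previous step
    for k in range(1, max_lag + 1):
        num = rho[k] - sum(prev[j] * rho[k - 1 - j] for j in range(k - 1))
        den = 1.0 - sum(prev[j] * rho[j + 1] for j in range(k - 1))
        phi_kk = num / den
        cur = [prev[j] - phi_kk * prev[k - 2 - j] for j in range(k - 1)]
        cur.append(phi_kk)
        pacf.append(phi_kk)
        prev = cur
    return pacf

# For an AR(1) process with rho(k) = 0.6**k, the theoretical PACF is 0.6
# at lag 1 and 0 at every higher lag.
rho = [0.6 ** k for k in range(6)]
print(pacf_from_acf(rho, 5))
```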
Multiple Random Processes
To get an idea about these concepts, suppose that X(t) is the price of oil (per gallon) and Y(t) is the price of gasoline (per gallon) at time t.
Since gasoline is produced from oil, as oil prices increase, gasoline prices tend to increase too. We therefore conclude that X(t) and Y(t) should be positively correlated, at least at the same time t, i.e., C_XY(t, t) > 0, where C_XY(t_1, t_2) = Cov(X(t_1), Y(t_2)) is the cross-covariance function.
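The oil/gasoline intuition can be mimicked with a toy simulation (all numbers made up for illustration): let X(t) follow a random walk and let Y(t) track it plus noise; the sample correlation at equal times then comes out strongly positive:

```python
import random

random.seed(3)

T = 2000
x = [50.0]                      # toy "oil price" random walk
for _ in range(T - 1):
    x.append(x[-1] + random.gauss(0.0, 0.5))
# Toy "gasoline price": follows oil, plus a fixed margin and noise.
y = [0.5 * xt + 10.0 + random.gauss(0.0, 0.3) for xt in x]

def corr(a, b):
    """Sample correlation coefficient of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / n
    va = sum((ai - ma) ** 2 for ai in a) / n
    vb = sum((bi - mb) ** 2 for bi in b) / n
    return cov / (va * vb) ** 0.5

print(corr(x, y))  # strongly positive, as the argument above predicts
```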
Independent Random Processes
• Two random processes X(t) and Y(t) are independent if, for all choices of time points, the random vectors (X(t_1), …, X(t_m)) and (Y(t'_1), …, Y(t'_n)) are independent.
Stationary Processes
• A process {X(t)} is (strict-sense) stationary if its finite-dimensional distributions are invariant under time shifts: for all times t_1, …, t_r and every shift Δ, the vector (X(t_1 + Δ), …, X(t_r + Δ)) has the same joint distribution as (X(t_1), …, X(t_r)).
Example
Consider the discrete-time random process {X(n), n ∈ ℤ}, in which the X(n)'s are i.i.d. with CDF F_{X(n)}(x) = F(x). Show that this is a (strict-sense) stationary process.
Solution
Intuitively, since X(n)'s are i.i.d., we expect that as time evolves the probabilistic
behavior of the process does not change. Therefore, this must be a stationary process.
To show this rigorously, we can argue as follows. For all real numbers x_1, x_2, …, x_r, all distinct integers n_1, n_2, …, n_r, and any integer shift k, we have
P(X(n_1 + k) ≤ x_1, …, X(n_r + k) ≤ x_r) = F(x_1) F(x_2) ⋯ F(x_r) = P(X(n_1) ≤ x_1, …, X(n_r) ≤ x_r),
so the finite-dimensional distributions are invariant under time shifts.
Weak-Sense Stationary Processes
• A random process {X(t)} is weak-sense stationary (WSS) if its mean function is constant, E[X(t)] = μ_X for all t, and its autocorrelation function R_X(t_1, t_2) = E[X(t_1) X(t_2)] depends only on the lag τ = t_1 − t_2, written R_X(τ).
Example
Consider the random process {X(t),t∈R} defined as X(t)=cos(t+U), where
U∼Uniform(0,2π). Show that X(t) is a WSS process.
Solution
First, E[X(t)] = E[cos(t + U)] = ∫₀^{2π} cos(t + u) · (1/2π) du = 0, which does not depend on t.
… Solution
R_X(τ) = E[X(t)X(t − τ)] = E[X(t + τ)X(t)] = E[cos(t + τ + U) cos(t + U)].
Using cos A cos B = ½[cos(A − B) + cos(A + B)] and E[cos(2t + τ + 2U)] = 0, this gives R_X(τ) = ½ cos(τ), which does not depend on t. Hence X(t) is WSS.
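The computation for this example can be confirmed by Monte Carlo (Python sketch): the sample mean of cos(t + U) should be near 0 for any t, and the sample estimate of E[X(t + τ)X(t)] should be near cos(τ)/2 regardless of t:

```python
import math
import random

random.seed(4)

N = 100_000
us = [random.uniform(0.0, 2.0 * math.pi) for _ in range(N)]

def mean_at(t):
    """Monte Carlo estimate of E[cos(t + U)]."""
    return sum(math.cos(t + u) for u in us) / N

def R(t, tau):
    """Monte Carlo estimate of E[X(t + tau) X(t)]."""
    return sum(math.cos(t + tau + u) * math.cos(t + u) for u in us) / N

print(mean_at(1.3))                # near 0
print(R(0.0, 1.0), R(5.0, 1.0))    # both near cos(1)/2, independent of t
print(math.cos(1.0) / 2.0)
```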
Let {X(t),t∈R} be a WSS process with correlation function RX(τ). Then, we can write
R_X(0) = E[X(t)²].
The quantity E[X(t)²] is called the expected (average) power of X(t) at time t.
For a WSS process, the expected power is not a function of time.
Finally, we would like to show that R_X(τ) takes its maximum value at τ = 0; that is, X(t) and X(t + τ) have the highest correlation when τ = 0. This follows from the Cauchy-Schwarz inequality:
|R_X(τ)| = |E[X(t + τ) X(t)]| ≤ √(E[X(t + τ)²] E[X(t)²]) = R_X(0).
Jointly Wide-Sense Stationary Processes
We often work with multiple random processes, so we extend the concept of wide-
sense stationarity to more than one process. More specifically, we can talk about
jointly wide-sense stationary processes.
Example
Let X(t) and Y(t) be two jointly WSS random processes. Consider the random
process Z(t) defined as:
Z(t)=X(t)+Y(t).
Show that Z(t) is WSS.
Solution
E[Z(t)] = E[X(t)] + E[Y(t)] = μ_X + μ_Y, which does not depend on t. Also,
R_Z(t + τ, t) = E[(X(t + τ) + Y(t + τ))(X(t) + Y(t))] = R_X(τ) + R_{XY}(τ) + R_{YX}(τ) + R_Y(τ),
which depends only on τ, since X(t) and Y(t) are jointly WSS. Hence Z(t) is WSS.
Cyclostationary Processes
• A random process is cyclostationary if its statistical properties vary periodically with time: there is a period T such that shifting time by T leaves the finite-dimensional distributions unchanged (for wide-sense cyclostationarity, only the mean and autocorrelation need to be periodic with period T).
Derivatives and Integrals of Random Processes
Mean-square continuous
Let X(t) be a continuous-time random process. We say that X(t) is mean-square continuous at time t if
lim_{δ→0} E[(X(t + δ) − X(t))²] = 0.
Note that mean-square continuity does not mean that every possible realization of
X(t) is a continuous function. It roughly means that the difference X(t+δ)−X(t) is
small on average.
Example
If X(t) is a Poisson process with intensity λ, then for all t > s ≥ 0, we have
X(t)−X(s)∼Poisson(λ(t−s)).
Show that X(t) is mean-square continuous at any time t≥0.
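For the solution, note that the increment X(t + δ) − X(t) is Poisson(λδ), so E[(X(t + δ) − X(t))²] = λδ + (λδ)², which tends to 0 as δ → 0; hence X(t) is mean-square continuous. A small simulation (Python sketch, using Knuth's Poisson sampler) confirms the second-moment formula:

```python
import math
import random

random.seed(5)

def poisson_sample(mean):
    """Knuth's multiplication method; adequate for small means."""
    limit = math.exp(-mean)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

lam = 2.0
for d in (1.0, 0.1, 0.01):
    m = lam * d
    mc = sum(poisson_sample(m) ** 2 for _ in range(50_000)) / 50_000
    exact = m + m * m  # E[K^2] for K ~ Poisson(m)
    print(f"delta = {d}: Monte Carlo {mc:.4f} vs lam*d + (lam*d)^2 = {exact:.4f}")
```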
Gaussian Processes
• A random process {X(t)} is a Gaussian process if, for all times t_1, …, t_n, the random vector (X(t_1), …, X(t_n)) is jointly normal. A Gaussian process is therefore completely specified by its mean function μ_X(t) and its covariance function C_X(t_1, t_2).
Similarly, we can define jointly Gaussian random processes.
Note that, from the properties of jointly normal random variables, we can conclude that if two jointly Gaussian random processes X(t) and Y(t) are uncorrelated, i.e., C_XY(t_1, t_2) = 0 for all t_1 and t_2, then X(t) and Y(t) are independent.
Tutorial Exercises
Problem 6
(Weakly stationary process) Consider a real-valued random variable ,
Problem 7
Show that any stochastic process with independent increments
is a Markov process.