
PROBABILITY AND STOCHASTIC PROCESS 15

2. Stochastic processes and their classes


The theory of stochastic processes turns out to be a useful tool for solving problems in
various fields such as engineering, genetics, statistics, economics, and finance. The word
stochastic means random or chance. A stochastic process can be thought of as a collection
of random variables indexed by some parameter.
Definition 2.1 (Stochastic process). A real stochastic process is a collection of random
variables {Xt : t ∈ T} defined on a common probability space (Ω, F, P) with values in R.
• T is called the index set of the process, which is usually a subset of R.
• The set of values that the random variables Xt can take is called the state space,
denoted by S.
Example 2.1. Consider a random event occurring in time, such as the number of telephone
calls received at a switchboard. Suppose that Xt is the random variable representing
the number of incoming calls in an interval (0, t) of duration t units. The number of calls
within a fixed interval of specified duration, say one unit of time, is a random variable X1,
and the family {Xt ; t ∈ T} constitutes a stochastic process.
Stochastic processes can be classified, in general, into the following four types of processes:
a) Discrete time, discrete state space: the number of individuals in a population
at the end of year t can be modeled as a stochastic process {Xt ; t ∈ T}, where T =
{0, 1, 2, . . .} and the state space S = {0, 1, 2, . . .}.

b) Discrete time, continuous state space: the share price for an asset at the close of
trading on day t with T = {0, 1, 2, . . .} and S = {x : 0 ≤ x < ∞}.

c) Continuous time, discrete state space: the number of incoming calls Xt in an
interval [0, t]. The stochastic process {Xt ; t ∈ T} has T = {t : 0 ≤ t < ∞} and the state
space S = {0, 1, 2, . . .}.

d) Continuous time, continuous state space: the value of the Dow Jones index (a stock
market index) at time t. The stochastic process {Xt ; t ∈ T} has T = {t : 0 ≤ t <
∞} and the state space S = {x : 0 ≤ x < ∞}.
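The four classes above can be illustrated with short simulations. The following is a minimal sketch (the dynamics and parameter values are invented for illustration, not taken from the text): a birth-death walk stands in for the population model of (a), and a multiplicative random walk for the share price of (b).

```python
import random

random.seed(0)

# (a) Discrete time, discrete state space: a population size at the end
#     of each year, sketched as a birth-death walk on S = {0, 1, 2, ...}.
pop = [10]
for _ in range(5):
    pop.append(max(0, pop[-1] + random.choice([-1, 0, 1])))

# (b) Discrete time, continuous state space: a share price at the close
#     of each trading day, sketched as a multiplicative random walk on [0, oo).
price = [100.0]
for _ in range(5):
    price.append(price[-1] * (1 + random.gauss(0, 0.01)))

print(pop)    # integer states at integer times
print(price)  # real-valued states at integer times
```

The continuous-time classes (c) and (d) would be simulated on a fine time grid in the same spirit.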
Definition 2.2 (Independent Increments). Let {Xt : t ∈ T} be a stochastic process.
We say that it has independent increments, if for all t0 < t1 < t2 < . . . < tn , the random
variables Xt0 , Xt1 − Xt0 , . . . , Xtn − Xtn−1 are independent.
Definition 2.3 (Stationary Increments). A stochastic process {Xt : t ∈ T} is said to
have stationary increments if Xt2 +τ − Xt1 +τ has the same distribution as Xt2 − Xt1 for all
choices of t1 , t2 and τ > 0.
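A simple symmetric random walk is a standard example of a process with both independent and stationary increments. The following sketch (sample sizes and time points are arbitrary choices) compares increments of equal length taken at two different starting times; by stationarity of increments, their empirical mean and variance should agree.

```python
import random

random.seed(1)

def walk(n):
    """Simple symmetric random walk X_0 = 0, X_k = X_{k-1} +/- 1:
    its increments are independent and stationary."""
    x, path = 0, [0]
    for _ in range(n):
        x += random.choice([-1, 1])
        path.append(x)
    return path

# Stationary increments: X_{t2+tau} - X_{t1+tau} has the same distribution
# as X_{t2} - X_{t1}.  Compare increments over [0, 10] with increments
# over [15, 25] (i.e. tau = 15).
samples_a, samples_b = [], []
for _ in range(2000):
    p = walk(30)
    samples_a.append(p[10] - p[0])
    samples_b.append(p[25] - p[15])

mean_a = sum(samples_a) / len(samples_a)
mean_b = sum(samples_b) / len(samples_b)
var_a = sum(v * v for v in samples_a) / len(samples_a) - mean_a ** 2
var_b = sum(v * v for v in samples_b) / len(samples_b) - mean_b ** 2
print(round(mean_a, 2), round(mean_b, 2))  # both near 0
print(round(var_a, 1), round(var_b, 1))    # both near 10 (ten +/-1 steps)
```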
Definition 2.4 (Second order process). A stochastic process {Xt : t ∈ T} is called a
second order process if E[Xt2 ] < ∞ for all t ∈ T.
Example 2.2. Let Z1 , Z2 be independent normally distributed random variables with mean
0 and variance σ 2 . For λ ∈ R, define
Xt = Z1 cos(λt) + Z2 sin(λt) , t ∈ R+ .
16 A. K. MAJEE

Then Xt is a second-order stochastic process. Indeed, using the independence of Z1 and
Z2 (so that E[Z1 Z2] = E[Z1]E[Z2] = 0), we have

E[|Xt|²] = E[Z1² cos²(λt) + Z2² sin²(λt) + 2 Z1 Z2 cos(λt) sin(λt)]
         = E[Z1²] cos²(λt) + E[Z2²] sin²(λt) + sin(2λt) E[Z1 Z2]
         = σ²(cos²(λt) + sin²(λt)) = σ²  ∀ t ∈ R+.
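The computation above can be checked numerically. Below is a minimal Monte Carlo sketch (the values of σ, λ and the sample size are arbitrary choices for illustration): for several values of t, the sample mean of Xt² should be close to σ², independently of t.

```python
import math
import random

random.seed(2)
sigma, lam = 1.5, 0.7   # illustrative values of sigma and lambda
N = 20000

# Monte Carlo check that E[X_t^2] = sigma^2 for every t, where
# X_t = Z1 cos(lambda t) + Z2 sin(lambda t), Z1, Z2 ~ N(0, sigma^2) independent.
estimates = {}
for t in (0.0, 1.0, 2.5):
    total = 0.0
    for _ in range(N):
        z1 = random.gauss(0, sigma)
        z2 = random.gauss(0, sigma)
        x = z1 * math.cos(lam * t) + z2 * math.sin(lam * t)
        total += x * x
    estimates[t] = total / N

print({t: round(v, 2) for t, v in estimates.items()})  # each value near sigma**2 = 2.25
```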
Definition 2.5 (Covariance Stationary Process). A second order stochastic process
{Xt : t ∈ T} is called covariance stationary process if
i) m(t) = E[Xt ] is independent of t.
ii) Cov(Xt , Xs ) depends only on the difference |t − s| for all s, t ∈ T.
Example 2.3. Let {Xn : n ≥ 1} be uncorrelated random variables with mean 0 and
variance 1. Then
Cov(Xm, Xn) = E[Xm Xn] = { 0, m ≠ n;  1, m = n }.
Thus, {Xn : n ≥ 1} is a covariance stationary process.
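This can be verified by simulation. The sketch below draws i.i.d. N(0, 1) variables, which are in particular uncorrelated with mean 0 and variance 1, and estimates the covariances (the sample size is an arbitrary choice):

```python
import random

random.seed(5)
N = 40000

# Draws of (X_1, ..., X_5), taken i.i.d. N(0, 1): in particular
# uncorrelated with mean 0 and variance 1, as in the example.
data = [[random.gauss(0, 1) for _ in range(5)] for _ in range(N)]

def est_cov(i, j):
    """Sample estimate of Cov(X_i, X_j) = E[X_i X_j] (the means are 0)."""
    return sum(row[i] * row[j] for row in data) / N

# Variance 1 on the diagonal, 0 off the diagonal: the covariance depends
# only on whether m = n, hence the process is covariance stationary.
print(round(est_cov(1, 1), 2), round(est_cov(1, 3), 2))  # near 1, near 0
```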
Example 2.4. Let A be a random variable with mean m and variance σ 2 . For fixed
η ∈ R, consider the stochastic process
Xt := A sin(ηt + Φ), t≥0
where Φ is uniformly distributed over (0, 2π) and independent of A. Then {Xt : t ≥ 0} is
a covariance stationary process. To check this, we first compute the mean.
E[Xt ] = E[A sin(ηt + Φ)] = E[A]E[sin(ηt + Φ)] = mE[sin(ηt + Φ)]
= (m/2π) ∫₀^{2π} sin(ηt + θ) dθ = 0.
Next we calculate the covariance of Xt and Xs .
Cov(Xt, Xs) = E[Xt Xs] = (1/2) E[A² · 2 sin(ηt + Φ) sin(ηs + Φ)]
= (1/2) E[A²] E[cos(η(t − s)) − cos(η(t + s) + 2Φ)]   (∵ 2 sin(a) sin(b) = cos(a − b) − cos(a + b))
= (1/2) E[A²] ( cos(η(t − s)) − (1/2π) ∫₀^{2π} cos(η(t + s) + 2θ) dθ )
= (1/2) E[A²] cos(η(t − s)) = (1/2)(σ² + m²) cos(η(t − s)).
Thus, Cov(Xt, Xs) depends only on the difference |t − s|. Next we show that it is a second
order process. Indeed,

E[Xt²] = E[A²] E[sin²(ηt + Φ)] = (σ² + m²) (1/2π) ∫₀^{2π} sin²(ηt + θ) dθ
= (σ² + m²) (1/4π) ∫₀^{2π} (1 − cos(2ηt + 2θ)) dθ   (∵ 2 sin²(a) = 1 − cos(2a))
= (σ² + m²)/2 < +∞  ∀ t.
Thus, {Xt : t ≥ 0} is a covariance stationary process.
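Example 2.4 can also be checked by simulation. The sketch below (with arbitrary illustrative values of m, σ and η) estimates Cov(Xt, Xs) for two pairs (t, s) with the same lag t − s and compares the results with the closed form ½(σ² + m²) cos(η(t − s)) derived above:

```python
import math
import random

random.seed(3)
m, sigma, eta = 1.0, 0.5, 2.0   # illustrative parameter values
N = 50000

def sample_pair(t, s):
    """One draw of (X_t, X_s) for X_t = A sin(eta*t + Phi)."""
    a = random.gauss(m, sigma)
    phi = random.uniform(0, 2 * math.pi)
    return a * math.sin(eta * t + phi), a * math.sin(eta * s + phi)

def est_cov(t, s):
    """Sample covariance of X_t and X_s."""
    pairs = [sample_pair(t, s) for _ in range(N)]
    mx = sum(x for x, _ in pairs) / N
    my = sum(y for _, y in pairs) / N
    return sum((x - mx) * (y - my) for x, y in pairs) / N

def theory(t, s):
    """Closed form (1/2)(sigma^2 + m^2) cos(eta (t - s)) from the example."""
    return 0.5 * (sigma ** 2 + m ** 2) * math.cos(eta * (t - s))

# Same lag t - s = 0.5 at two different absolute times: the two estimates
# should agree with each other and with the closed form.
e1, e2 = est_cov(1.0, 0.5), est_cov(3.0, 2.5)
print(round(e1, 2), round(e2, 2), round(theory(1.0, 0.5), 2))
```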

Definition 2.6 (Markov Process). Let {Xt : t ≥ 0} be a stochastic process defined
over a probability space (Ω, F, P) with state space (R, B). We say that {Xt : t ≥ 0} is a
Markov process if for any 0 ≤ t1 < t2 < . . . < tn and for any B ∈ B,

P[Xtn ∈ B | Xt1 , . . . , Xtn−1] = P[Xtn ∈ B | Xtn−1].

Roughly, a Markov process is a process such that, given the value of Xs, the distribution
of Xt for t > s does not depend on the values of Xu for u < s.
Remark 2.1. Any stochastic process which has independent increments is a Markov
process. To show this, we proceed as follows: for 0 ≤ t0 < t1 < . . . < tn < t < t + s, n ≥ 1,
it suffices to show that

P[Xt+s ≤ y | Xt = x, Xtk = xk, k = 0, 1, . . . , n] = P[Xt+s ≤ y | Xt = x].

Now, by the independent-increments property, we have

LHS = P[Xt+s − Xt ≤ y − x | Xt − Xtn = x − xn, Xtn − Xtn−1 = xn − xn−1, . . . ,
      Xt1 − Xt0 = x1 − x0, Xt0 = x0]
    = P[Xt+s − Xt ≤ y − x] = P[Xt+s ≤ y | Xt = x].
Hence it is a Markov process.
3. Markov Chain
Definition 3.1 (Markov Chain). A sequence of random variables (Xn)n∈N with discrete
state space S (countable or finite) is called a discrete-time Markov chain (DTMC) if it
satisfies the following condition:

P[Xn+1 = j | Xn = i, Xn−1 = in−1, . . . , X0 = i0] = P[Xn+1 = j | Xn = i]   (3.1)

for all n ∈ N and for all i0, i1, . . . , in−1, i, j ∈ S (the state space) with P[X0 = i0, . . . , Xn =
in] > 0. In other words, condition (3.1) says the following: if we know the present
state "Xn = i", the knowledge of the past history "Xn−1, . . . , X0" has no influence on the
probability structure of the future state Xn+1.
Example 3.1. Suppose that a coin is tossed repeatedly, and let Xn denote the number of
heads in the first n tosses. Then the number of heads obtained in the first (n + 1) tosses
depends only on the number of heads obtained in the first n tosses together with the
outcome of toss n + 1, which is independent of the past. Thus, (3.1) is satisfied, and
(Xn) is a Markov chain.
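The Markov property in Example 3.1 can be tested empirically: conditional on the present state, say X3 = 2, the probability that the next toss increases the count should not depend on the earlier history. A minimal sketch (fair coin; the sample size is an arbitrary choice):

```python
import random

random.seed(4)
p = 0.5  # probability of heads (fair coin)

def heads_counts(n):
    """X_k = number of heads in the first k tosses, k = 0, ..., n."""
    x, path = 0, [0]
    for _ in range(n):
        x += 1 if random.random() < p else 0
        path.append(x)
    return path

# Estimate P(X_4 = 3 | X_3 = 2) separately for two different histories
# leading to X_3 = 2 (first toss heads vs. first toss tails); by the
# Markov property (3.1), both estimates should be near p = 0.5.
up = {"first toss H": [0, 0], "first toss T": [0, 0]}
for _ in range(20000):
    path = heads_counts(4)
    if path[3] == 2:
        key = "first toss H" if path[1] == 1 else "first toss T"
        up[key][0] += 1 if path[4] == 3 else 0
        up[key][1] += 1

estimates = {k: s / n for k, (s, n) in up.items()}
print({k: round(v, 2) for k, v in estimates.items()})  # both near 0.5
```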
