Markov Processes and Birth-Death Processes
J. M. Akinpelu
Exponential Distribution
Definition. A continuous random variable X has an exponential distribution with parameter \lambda > 0 if its probability density function is given by

f(x) = \begin{cases} \lambda e^{-\lambda x}, & x \ge 0 \\ 0, & x < 0. \end{cases}
The exponential distribution is memoryless:

P\{X > s + t \mid X > t\} = P\{X > s\} \qquad (1)

or equivalently,

P\{X > s + t\} = P\{X > s\}\, P\{X > t\}.
Writing G(t) = P\{X > t\}, the functional equation above gives G'(h) = -f(0)\,G(h), so

\int_0^t \frac{dG(h)}{G(h)} = -\int_0^t f(0)\, dh

\log G(h) \Big|_0^t = -f(0)\, h \Big|_0^t

\log G(t) = -f(0)\, t

G(t) = e^{-f(0)\, t}.
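The memoryless property (1) can be checked by simulation. A minimal Monte Carlo sketch in Python; the rate \lambda = 1.5 and the values of s and t are illustrative choices:

```python
import random

random.seed(0)
lam = 1.5            # illustrative rate
s, t = 0.5, 1.0
n = 200_000

samples = [random.expovariate(lam) for _ in range(n)]

p_s = sum(x > s for x in samples) / n                   # estimate of P{X > s}
tail_t = [x for x in samples if x > t]                  # condition on X > t
p_cond = sum(x > s + t for x in tail_t) / len(tail_t)   # P{X > s+t | X > t}

print(p_s, p_cond)   # the two estimates agree (both near e^{-lam*s})
```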
Exponential Distribution
Theorem 2. A random variable X is exponentially distributed with parameter \lambda if and only if, as h \downarrow 0,

P\{X \le h\} = \lambda h + o(h)

or equivalently

P\{X > h\} = 1 - \lambda h + o(h).
Exponential Distribution
Proof: Let X be exponentially distributed. Then for h > 0,

P\{X \le h\} = 1 - e^{-\lambda h}
= 1 - \left(1 + \sum_{n=1}^{\infty} \frac{(-\lambda h)^n}{n!}\right)
= \lambda h - \sum_{n=2}^{\infty} \frac{(-\lambda h)^n}{n!}
= \lambda h + o(h).
The converse is left as an exercise.
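Theorem 2 can also be seen numerically: the ratio P\{X \le h\}/h approaches \lambda as h shrinks. A small sketch (the rate \lambda = 2 is an arbitrary choice):

```python
import math

lam = 2.0  # illustrative rate
for h in (0.1, 0.01, 0.001):
    ratio = (1 - math.exp(-lam * h)) / h  # P{X <= h} / h
    print(h, ratio)                        # tends to lam as h -> 0
```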
Exponential Distribution
[Figure: plot of the exponential CDF F(x) = 1 - e^{-\lambda x} for 0 \le x \le 6; the curve rises from 0 toward 1, with slope (rate) \approx \lambda at the origin.]
Markov Process
A continuous-time stochastic process \{X_t,\ t \ge 0\} with state space E is called a Markov process provided that

P\{X_{s+t} = j \mid X_s = i,\ X_u = x_u,\ 0 \le u < s\} = P\{X_{s+t} = j \mid X_s = i\}.

[Timeline: the history of the process on [0, s] is known; the distribution at time s + t depends only on the state at time s.]
Markov Process
We restrict ourselves to Markov processes for which the state space is E = \{0, 1, 2, \ldots\} and the conditional probabilities

P_{ij}(t) = P\{X_{s+t} = j \mid X_s = i\}

do not depend on s (time homogeneity).

Example. Let

P(t) = \begin{pmatrix} r_0(t) & r_1(t) & r_2(t) & \cdots \\ 0 & r_0(t) & r_1(t) & \cdots \\ 0 & 0 & r_0(t) & \cdots \\ & & & \ddots \end{pmatrix}

where

P_{ij}(t) = r_{j-i}(t) = \frac{e^{-\lambda t} (\lambda t)^{j-i}}{(j-i)!}, \quad 0 \le i \le j,

for some \lambda > 0. Then X is a Poisson process.
Chapman-Kolmogorov Equations
Theorem 3. For i, j \in E and t, s \ge 0,

P_{ij}(s + t) = \sum_{k \in E} P_{ik}(s)\, P_{kj}(t).
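For the Poisson process above, the Chapman-Kolmogorov identity can be verified directly, since only intermediate states i \le k \le j contribute to the sum. A small sketch (the values of \lambda, s, t, i, j are arbitrary choices):

```python
import math

lam = 1.0  # illustrative rate

def P(i, j, t):
    """Poisson-process transition probability P_ij(t) = e^{-lt}(lt)^{j-i}/(j-i)!."""
    if j < i:
        return 0.0
    return math.exp(-lam * t) * (lam * t) ** (j - i) / math.factorial(j - i)

s, t = 0.4, 0.7
i, j = 0, 3
lhs = P(i, j, s + t)
rhs = sum(P(i, k, s) * P(k, j, t) for k in range(i, j + 1))
print(lhs, rhs)  # the two sides agree
```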
Realization of a Markov Process
[Figure: a sample path X_t(\omega), a step function that holds each successive state S_0, S_1, \ldots, S_5 on the intervals between the transition times T_0 < T_1 < \cdots < T_5.]
Time Spent in a State
Theorem 4. Let t \ge 0, let n satisfy T_n \le t < T_{n+1}, and let W_t = T_{n+1} - t. Let i \in E, u \ge 0, and define

G(u) = P\{W_t > u \mid X_t = i\}.

Then

G(u + v) = G(u)\, G(v).

Note: This implies that the distribution of the time remaining in a state is exponential, regardless of the time already spent in that state.

[Timeline: T_n \le t < t + u \le T_{n+1}; W_t is the time remaining until the next transition.]
Time Spent in a State
Proof: We first note that, due to the time homogeneity of X, G(u) is independent of t. Fixing i, we have

G(u + v) = P\{W_t > u + v \mid X_t = i\}
= P\{W_t > u,\ W_t > u + v \mid X_t = i\}
= P\{W_t > u \mid X_t = i\}\, P\{W_t > u + v \mid W_t > u,\ X_t = i\}
= P\{W_t > u \mid X_t = i\}\, P\{W_{t+u} > v \mid X_{t+u} = i\}
= G(u)\, G(v).
An Alternative Characterization of a Markov Process
Theorem 5. Let X = \{X_t,\ t \ge 0\} be a Markov process. Let T_0, T_1, \ldots be the successive state transition times and let S_0, S_1, \ldots be the successive states visited by X. There exists some number \lambda_i such that for any non-negative integer n, any j \in E, and t > 0,

P\{S_{n+1} = j,\ T_{n+1} - T_n \le t \mid S_0, \ldots, S_{n-1}, S_n = i;\ T_0, \ldots, T_n\} = Q(i, j)\,(1 - e^{-\lambda_i t}),

where

Q_{ij} \ge 0, \quad Q_{ii} = 0, \quad \sum_{j \in E} Q_{ij} = 1.
An Alternative Characterization of a Markov Process
This implies that the successive states visited by
a Markov process form a Markov chain with
transition matrix Q.
Hence

\frac{P_{ij}(t + h) - P_{ij}(t)}{h} = \sum_{k \ne j} \lambda_k Q_{kj} P_{ik}(t) - \lambda_j P_{ij}(t) + \frac{o(h)}{h}.

Letting h \to 0 and then t \to \infty, assuming the limiting probabilities P_j = \lim_{t \to \infty} P_{ij}(t) exist, the left-hand side vanishes and

\lambda_j P_j = \sum_{k \ne j} \lambda_k Q_{kj} P_k

for all j. These are referred to as "balance equations". Together with the condition

\sum_j P_j = 1,

they uniquely determine the limiting distribution.
Birth-Death Processes
Definition. A birth-death process {X(t), t 0} is a Markov
process such that, if the process is in state j, then the only
transitions allowed are to state j + 1 or to state j – 1 (if j >
0).
P\{X_{t+h} = j \mid X_t = j - 1\} = \lambda_{j-1} h + o(h)

P\{X_{t+h} = j \mid X_t = j + 1\} = \mu_{j+1} h + o(h)

P\{X_{t+h} = j \mid X_t = j\} = 1 - \lambda_j h - \mu_j h + o(h)

P\{X_{t+h} = j \mid X_t = i\} = o(h) \quad \text{if } |j - i| > 1.
Birth and Death Rates
[State diagram: \cdots \rightleftarrows (j-1) \rightleftarrows j \rightleftarrows (j+1) \rightleftarrows \cdots, with birth rates \lambda_{j-1}, \lambda_j on the rightward arrows and death rates \mu_j, \mu_{j+1} on the leftward arrows.]
Note:
1. The expected time in state j before entering state j+1 is 1/\lambda_j; the expected time in state j before entering state j-1 is 1/\mu_j.
2. The rate corresponding to state j is \nu_j = \lambda_j + \mu_j.
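These rates give a direct way to simulate a birth-death process: hold in state j for an Exp(\nu_j) time, then jump up with probability \lambda_j/\nu_j, else down. A minimal sketch, assuming constant rates \lambda and \mu (an M/M/1-type queue); the function name and parameter values are illustrative:

```python
import random

random.seed(1)

def simulate_birth_death(lam_rate, mu_rate, t_end, x0=0):
    """Simulate a birth-death process with constant birth rate lam_rate and
    death rate mu_rate up to time t_end.  In state j the holding time is
    Exp(nu_j) with nu_j = lam_j + mu_j, and the next state is j+1 with
    probability lam_j / nu_j, else j-1."""
    t, j = 0.0, x0
    path = [(t, j)]
    while True:
        lam_j = lam_rate
        mu_j = mu_rate if j > 0 else 0.0   # no deaths from state 0
        nu_j = lam_j + mu_j
        t += random.expovariate(nu_j)      # exponential holding time
        if t >= t_end:
            break
        j = j + 1 if random.random() < lam_j / nu_j else j - 1
        path.append((t, j))
    return path

path = simulate_birth_death(0.8, 1.0, t_end=50.0)
print(len(path), path[-1])
```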
Differential-Difference Equations for a Birth-Death Process
It follows that, if P_j(t) = P\{X(t) = j\}, then

\frac{d}{dt} P_j(t) = \lambda_{j-1} P_{j-1}(t) + \mu_{j+1} P_{j+1}(t) - (\lambda_j + \mu_j) P_j(t), \quad j > 0,

\frac{d}{dt} P_0(t) = \mu_1 P_1(t) - \lambda_0 P_0(t).

If the process starts in state 0, the initial condition is

P_j(0) = \begin{cases} 1 & \text{if } j = 0 \\ 0 & \text{if } j \ne 0; \end{cases}

if it starts in state N,

P_j(0) = \begin{cases} 1 & \text{if } j = N \\ 0 & \text{otherwise.} \end{cases}
Then solving the difference-differential equations for this process (the pure death process with \lambda_j = 0 and \mu_j = j\mu, starting in state N) gives

P_j(t) = \binom{N}{j} (e^{-\mu t})^j (1 - e^{-\mu t})^{N - j}, \quad 0 \le j \le N.
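This binomial form can be checked against a direct Euler integration of the differential-difference equations with \lambda_j = 0 and \mu_j = j\mu; a sketch, where N, \mu, T and the step size are arbitrary choices:

```python
import math

N, mu, T = 5, 0.7, 1.0   # illustrative parameters
dt = 1e-5
P = [0.0] * (N + 1)
P[N] = 1.0               # start in state N

for _ in range(int(T / dt)):          # forward Euler on dP_j/dt
    newP = P[:]
    for j in range(N + 1):
        inflow = (j + 1) * mu * P[j + 1] if j < N else 0.0
        outflow = j * mu * P[j]
        newP[j] += dt * (inflow - outflow)
    P = newP

# closed form: the state at time T is Binomial(N, e^{-mu T})
p = math.exp(-mu * T)
exact = [math.comb(N, j) * p**j * (1 - p)**(N - j) for j in range(N + 1)]
err = max(abs(a - b) for a, b in zip(P, exact))
print(err)  # small Euler discretization error
```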
Limiting Probabilities
Now assume that limiting probabilities P_j exist. They must satisfy

0 = \lambda_{j-1} P_{j-1} + \mu_{j+1} P_{j+1} - (\lambda_j + \mu_j) P_j, \quad j > 0,
0 = \mu_1 P_1 - \lambda_0 P_0,

or

(\lambda_j + \mu_j) P_j = \lambda_{j-1} P_{j-1} + \mu_{j+1} P_{j+1}, \quad j > 0, \qquad (*)
\lambda_0 P_0 = \mu_1 P_1.
Limiting Probabilities
These are the balance equations for a birth-death process. Together with the condition

\sum_{j \ge 0} P_j = 1,

they yield

P_j = P_0 \prod_{i=0}^{j-1} \frac{\lambda_i}{\mu_{i+1}}, \quad j = 0, 1, 2, \ldots.
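For example, with constant rates \lambda_j = \lambda and \mu_j = \mu (the M/M/1 queue), the product is \rho^j with \rho = \lambda/\mu, and normalization gives the geometric distribution P_j = (1 - \rho)\rho^j. A sketch; the rates and the truncation level are illustrative choices:

```python
lam, mu = 0.6, 1.0   # illustrative M/M/1 rates, rho < 1
rho = lam / mu
J = 50               # truncation level; the geometric tail beyond J is negligible

w = [rho**j for j in range(J + 1)]   # prod_{i<j} lam_i/mu_{i+1} = rho^j
P0 = 1 / sum(w)                      # normalization sum_j P_j = 1
P = [P0 * wj for wj in w]

print(P[0], 1 - rho)                 # P0 ≈ 1 - rho
print(P[3], (1 - rho) * rho**3)      # P_j ≈ (1 - rho) rho^j
```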
When Do Limiting Probabilities Exist?
Define

S = 1 + \sum_{j=1}^{\infty} \prod_{i=0}^{j-1} \frac{\lambda_i}{\mu_{i+1}}.

It is easy to show that

P_0 = S^{-1}

if S < \infty. (This is equivalent to the condition P_0 > 0.) Furthermore, all of the states are then positive recurrent, i.e., ergodic. If S = \infty, then either all of the states are null recurrent or all of the states are transient, and limiting probabilities do not exist.
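The dichotomy can be illustrated numerically with constant M/M/1 rates; a sketch, where the rate values and the number of partial-sum terms are arbitrary choices:

```python
def S_partial(lam, mu, J):
    """Partial sums of S = 1 + sum_{j>=1} prod_{i=0}^{j-1} lam_i/mu_{i+1},
    here with constant rates lam_i = lam and mu_i = mu (M/M/1)."""
    total, prod = 1.0, 1.0
    for _ in range(J):
        prod *= lam / mu
        total += prod
    return total

s_stable = S_partial(0.5, 1.0, 100)    # rho = 0.5: S converges to 1/(1-rho) = 2
s_critical = S_partial(1.0, 1.0, 100)  # rho = 1: partial sum is 1 + J, diverges
print(s_stable, s_critical)
```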
Flow Balance Method
Draw a closed boundary around state j:

[State diagram: \cdots \rightleftarrows (j-1) \rightleftarrows j \rightleftarrows (j+1) \rightleftarrows \cdots, with birth rates \lambda_{j-1}, \lambda_j and death rates \mu_j, \mu_{j+1}; a closed boundary is drawn around state j. In equilibrium, the probability flow into the boundary equals the flow out.]

For the machine repair model with m machines and a single repairman, with birth (failure) rates \lambda_j = (m - j)\lambda and constant death (repair) rate \mu_j = \mu, flow balance gives

(m - j + 1)\lambda P_{j-1} + \mu P_{j+1} = [(m - j)\lambda + \mu] P_j, \quad 0 < j < m,

m\lambda P_0 = \mu P_1,

\lambda P_{m-1} = \mu P_m.
Example
[State diagram as above: birth rates \lambda_j = (m - j)\lambda, death rates \mu_j = \mu.]

P_j = P_0 \prod_{i=0}^{j-1} \frac{\lambda_i}{\mu_{i+1}} = P_0 \prod_{i=0}^{j-1} \frac{(m - i)\lambda}{\mu} = P_0\, m(m-1)\cdots(m-j+1) \left(\frac{\lambda}{\mu}\right)^j.
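The limiting distribution for this single-repairman example follows by normalizing the weights above. A sketch; the values of m, \lambda, \mu are illustrative (`math.perm(m, j)` computes m(m-1)\cdots(m-j+1)):

```python
import math

m = 4                # machines, single repairman
lam, mu = 0.2, 1.0   # failure and repair rates (illustrative)

# unnormalized weights m(m-1)...(m-j+1) (lam/mu)^j for j = 0, ..., m
w = [math.perm(m, j) * (lam / mu) ** j for j in range(m + 1)]
P0 = 1 / sum(w)      # normalization sum_j P_j = 1
P = [P0 * wj for wj in w]

print([round(p, 4) for p in P])
```

The balance equations provide a check: for instance m\lambda P_0 = \mu P_1 and \lambda P_{m-1} = \mu P_m hold by construction.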
Example
How would this example change if there were m
(or more) repairmen?
Homework
No homework this week due to test next week.