Extinction Probability For Queues and Martingales
• Slow Traffic ( ρ ≤ 1 ): Steady-state solutions exist and the probability of extinction
equals 1. (Busy periods are bound to terminate with probability 1;
follows from Sec. 15.6, Theorem 15-9.)
Fig 20.1: A branching tree starting from a single member, X0 = 1. Here Yi(n) denotes the
offspring count of the ith member of the nth generation; in the tree shown, X1 = 3,
X2 = 9, and X3 = 26.
PILLAI
Queues and Population Models

• Population models
  Xn : size of the nth generation
  Yi(n) : number of offspring for the ith member of the nth generation.
  From Eq. (15-287), Text,

      X_{n+1} = Σ_{k=1}^{Xn} Yk(n) .

Let

      ak ≜ P{ Yi(n) = k } ,      P(z) = Σ_{k=0}^{∞} ak z^k ,

and let Pn(z) = E{ z^{Xn} }. Then

      P_{n+1}(z) = E{ z^{X_{n+1}} } = E( E{ z^{X_{n+1}} | Xn = j } )
                 = E( E{ z^{ Σ_{i=1}^{Xn} Yi } | Xn = j } ) = E{ [P(z)]^{Xn} }
                 = Σ_j [P(z)]^j P{ Xn = j } = Pn( P(z) ) ,                 (20-2)

so that

      P_{n+1}(z) = Pn( P(z) ) = P( Pn(z) ) .                               (20-3)
      P{ Xn = k } = ?

      Extinction probability  π0 = lim_{n→∞} P{ Xn = 0 } = ?
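The recursion (20-3) gives a direct way to compute this limit numerically: since P_{n+1}(0) = P(Pn(0)), the extinction probability is the limit of the fixed-point iteration q ← P(q) started at q = 0. A minimal Python sketch (the offspring distribution `a` below is an arbitrary illustration, not taken from the text):

```python
def extinction_probability(P, tol=1e-12, max_iter=10_000):
    """lim P_n(0): smallest non-negative root of P(z) = z."""
    q = 0.0
    for _ in range(max_iter):
        q_next = P(q)                 # P_{n+1}(0) = P(P_n(0))
        if abs(q_next - q) < tol:
            return q_next
        q = q_next
    return q

# Hypothetical offspring pmf a_k: a0 = 0.25, a1 = 0.25, a2 = 0.5
a = [0.25, 0.25, 0.5]
P = lambda z: sum(ak * z**k for k, ak in enumerate(a))

rho = sum(k * ak for k, ak in enumerate(a))   # P'(1) = 1.25 > 1 (supercritical)
pi0 = extinction_probability(P)               # root of P(z) = z; here 0.5
```

For this pmf, P(z) = z reduces to 0.5 z² − 0.75 z + 0.25 = 0, whose roots are 0.5 and 1, so the iteration converges to π0 = 0.5.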
Fig 20.3: P(z) versus z together with the line y = z, with intercept P(0) = a0:
(a) ρ ≤ 1, where the only intersection in [0, 1] is at z = 1; (b) ρ > 1, where there is
an additional intersection at z = π0 < 1.

      ρ ≤ 1 ⇒ π0 = 1
      ρ > 1 ⇒ π0 is the unique solution of P(z) = z ,  a0 < π0 < 1 .       (20-7)
• Left to themselves, in the long run, populations either die out completely
with probability π0, or explode with probability 1 − π0. (Both are unpleasant
conclusions.)
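The die-out-or-explode dichotomy shows up clearly in simulation: started from a single member, a supercritical population either hits zero early or grows without bound. A hedged sketch (the offspring pmf, cap, and trial counts are illustrative choices, not from the text):

```python
import random

def branch_once(x, a, rng):
    """X_{n+1}: sum of X_n i.i.d. offspring counts drawn from pmf a."""
    return sum(rng.choices(range(len(a)), weights=a, k=x)) if x > 0 else 0

def fraction_extinct(a, trials=2000, n_gen=60, cap=500, seed=1):
    """Fraction of sample paths, started from X_0 = 1, that die out."""
    rng = random.Random(seed)
    extinct = 0
    for _ in range(trials):
        x = 1
        for _ in range(n_gen):
            x = branch_once(x, a, rng)
            if x == 0:
                extinct += 1
                break
            if x > cap:   # effectively exploded; later extinction is negligible
                break
    return extinct / trials

# Supercritical pmf with mean offspring 1.25; per (20-7) the extinction
# probability is the root of P(z) = z below 1, which for this pmf is 0.5.
frac = fraction_extinct([0.25, 0.25, 0.5])
```

With these parameters roughly half the paths go extinct and the rest blow past the cap, matching π0 = 0.5 for this pmf.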
Queues : Customers that arrive during the first service time

Fig 20.4: A busy period along the time axis t. The first customer (X0) is served
during s1; X1 customers (Y1, Y2, Y3 in the figure) arrive during s1; X2 customers
arrive during the service times s2, s3, s4 of those X1 customers, and so on. As before,

      X_{n+1} = Σ_{i=1}^{Xn} Yi ,      P(z) = Σ_{k=0}^{∞} ak z^k ,

and the busy period terminates when Xn = 0, with probability π0.

• ρ = P′(1) = Σ_{k=1}^{∞} k ak : Traffic Intensity
      ρ ≤ 1 ⇒ steady state
      ρ > 1 ⇒ heavy traffic
• Termination of busy periods corresponds to extinction of queues. From the
analogy with population models, the extinction probability π0 is the unique
root of the equation
      P(z) = z .
• Slow Traffic : ρ ≤ 1 ⇒ π0 = 1
  Heavy Traffic : ρ > 1 ⇒ 0 < π0 < 1
i.e., unstable queues ( ρ > 1 ) either terminate their busy periods with
probability π0 < 1, or they continue to be busy with probability 1 − π0.
Interestingly, there is a finite probability of busy-period termination even
for unstable queues.
      π0 : a measure of stability for unstable queues.
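For a concrete queue, the root of P(z) = z can be located by plain bisection on [0, 1): P(0) − 0 = a0 > 0, while P(z) − z is negative just below z = 1 whenever ρ > 1. A sketch (the generating function used is a stand-in for illustration, not a specific queue from the text):

```python
def termination_probability(P, tol=1e-12):
    """Smallest root of P(z) = z in [0, 1); returns (nearly) 1 when rho <= 1."""
    lo, hi = 0.0, 1.0 - 1e-9          # P(lo) - lo = a0 > 0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if P(mid) - mid > 0:
            lo = mid                  # still left of the crossing
        else:
            hi = mid
    return (lo + hi) / 2

# Stand-in offspring generating function with rho = P'(1) = 1.25 > 1:
P = lambda z: 0.25 + 0.25 * z + 0.5 * z * z
pi0 = termination_probability(P)      # root of P(z) = z is 0.5 for this P
```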
Example 20.1 : M/M/1 queue
From (15-221), Text, we have

      ak = ( μ/(λ+μ) ) ( λ/(λ+μ) )^k = ( 1/(1+ρ) ) ( ρ/(1+ρ) )^k ,   k = 0, 1, 2, …   (20-8)

i.e., the number of arrivals P(λ) between any two departures P(μ) follows a
geometric random variable. Hence

      P(z) = 1 / [ 1 + ρ(1 − z) ] ,    ρ = λ/μ                                        (20-9)

      P(z) = z ⇒ ρz² − (ρ+1)z + 1 = (z − 1)(ρz − 1) = 0

so that (Fig 20.5 shows π0 versus ρ)

      π0 = { 1 ,    ρ ≤ 1
           { 1/ρ ,  ρ > 1                                                              (20-10)
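Eq. (20-10) wraps up into a one-line helper; the cross-check below confirms that for ρ > 1 the value 1/ρ indeed satisfies the quadratic ρz² − (ρ+1)z + 1 = 0 (the sample ρ is arbitrary):

```python
def mm1_pi0(rho):
    """Busy-period termination probability of the M/M/1 queue, Eq. (20-10)."""
    return 1.0 if rho <= 1 else 1.0 / rho

rho = 2.5                                      # arbitrary heavy-traffic example
z = mm1_pi0(rho)                               # 1/rho = 0.4
residual = rho * z * z - (rho + 1) * z + 1.0   # should vanish at the root
```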
Example 20.2 : Bulk Arrivals M[x]/M/1 queue
Compound Poisson Arrivals : Departures are exponential random variables as in
a Poisson process with parameter μ. Similarly, arrivals are Poisson with
parameter λ; however, each arrival can contain multiple jobs.

Fig 20.6: Poisson arrival instants t1, t2, … P(λ) on the time axis t, each carrying
a batch of A1, A2, … jobs, with Poisson departures P(μ).

Let Ai denote the number of jobs in the ith arrival, with ck = P{ Ai = k } and
batch generating function C(z) = Σ_{k=0}^{∞} ck z^k.
Inter-departure Statistics of Arrivals
Let Y denote the number of jobs that arrive during an inter-departure (service)
interval, whose density is fs(t) = μ e^{−μt}. Then

      P(z) = Σ_{k=0}^{∞} P{ Y = k } z^k = E{ z^Y }
           = Σ_{n=0}^{∞} ∫_0^∞ E{ z^{A1+A2+⋯+An} | n arrivals in (0,t) } P{ n arrivals in (0,t) } fs(t) dt
           = Σ_{n=0}^{∞} ∫_0^∞ [ E{ z^{Ai} } ]^n e^{−λt} ( (λt)^n / n! ) μ e^{−μt} dt ,   ρ = λ/μ
           = Σ_{n=0}^{∞} μ ∫_0^∞ ( [λt C(z)]^n / n! ) e^{−(λ+μ)t} dt
           = 1 / ( 1 + ρ{1 − C(z)} ) ,                                                 (20-11)

i.e.,  P(z) = [ 1 + ρ{1 − C(z)} ]^{−1} .

Let ck = (1 − α) α^k , k = 0, 1, 2, …  Then

      C(z) = (1 − α)/(1 − αz) ,     P(z) = (1 − αz) / ( 1 + αρ − α(1+ρ)z )             (20-12)

      Traffic Rate = P′(1) = αρ/(1 − α) ,  which exceeds 1 (heavy traffic) when ρ > (1 − α)/α .

      P(z) = z ⇒ (z − 1)[ α(1+ρ)z − 1 ] = 0 ⇒ π0 = 1/( α(1+ρ) )
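The geometric-batch result can be verified mechanically: π0 = 1/(α(1+ρ)) should be a fixed point of the P(z) in (20-12), and P′(1) = αρ/(1−α) gives the traffic rate. A sketch with arbitrary sample parameters:

```python
def bulk_P(z, alpha, rho):
    """P(z) = (1 - alpha z) / (1 + alpha rho - alpha (1+rho) z), Eq. (20-12)."""
    return (1 - alpha * z) / (1 + alpha * rho - alpha * (1 + rho) * z)

alpha, rho = 2 / 3, 2.0                  # sample values (traffic rate 4 > 1)
pi0 = 1 / (alpha * (1 + rho))            # root of P(z) = z other than z = 1
traffic = alpha * rho / (1 - alpha)      # P'(1)
fixed_point_error = abs(bulk_P(pi0, alpha, rho) - pi0)
```

For α = 2/3 and ρ = 2 this gives π0 = 0.5, consistent with (20-13).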
Bulk Arrivals (contd)
• Compound Poisson arrivals with geometric batch size:

      π0 = 1 / ( α(1+ρ) ) ,    ρ > (1 − α)/α .               (20-13)

  For α = 2/3, we obtain

      π0 = 3 / ( 2(1+ρ) ) ,    ρ > 1/2 .                      (20-14)

Fig 20.7: π0 versus ρ: (a) the M[x]/M/1 queue, with threshold ρ = (1 − α)/α,
compared with M/M/1; (b) the case α = 2/3, with threshold ρ = 1/2.

• Doubly Poisson arrivals ⇒ C(z) = e^{−μ(1−z)} gives

      P(z) = 1 / ( 1 + ρ[1 − C(z)] ) = z ,                    (20-15)

  whose root π0 must be found numerically.
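Eq. (20-15) has no closed-form root, but the same fixed-point iteration that drives Pn(0) → π0 handles it. A sketch (the parameter values are arbitrary; the batch parameter written μ in (20-15) appears as `m` below):

```python
import math

def doubly_poisson_pi0(rho, m, iters=5000):
    """Fixed-point iteration z <- P(z) for Eq. (20-15), with
    P(z) = 1 / (1 + rho * (1 - exp(-m * (1 - z)))).
    Starting at z = 0 converges to the smallest root, i.e. pi0."""
    z = 0.0
    for _ in range(iters):
        z = 1.0 / (1.0 + rho * (1.0 - math.exp(-m * (1.0 - z))))
    return z

# Sample heavy-traffic values: mean jobs per service = P'(1) = rho * m = 1.6 > 1
pi0 = doubly_poisson_pi0(0.8, 2.0)
```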
Example 20.3 : M/En/1 queue (n-phase exponential service)
From (16-213),

      P(z) = [ 1 + ρ(1 − z)/n ]^{−n}                          (20-16)

      P(z) = z ⇒ x^n + x^{n−1} + ⋯ + x − n/ρ = 0 ,   x = z^{1/n} .

For n = 2,

      π0 = [ ( −1 + √(1 + 8/ρ) ) / 2 ]² ,   ρ > 1 ,

and for large n,

      π0 ≈ ( 1 + ρ/n )^{−n} ,   n ≫ 1 .                      (20-17)

For ρ = 2:  M/M/1 ( n = 1 ): π0 = 0.5;   M/E2/1 ( n = 2 ): π0 = 0.38.
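The polynomial in x is monotone on (0, 1), so its root, and hence π0 = x^n, can be found by bisection. A sketch that reproduces the quoted numbers (ρ = 2 is the text's example; the solver itself is an illustration):

```python
def men1_pi0(n, rho, tol=1e-12):
    """M/E_n/1 extinction probability: pi0 = x**n, where x in (0,1) solves
    x^n + x^(n-1) + ... + x - n/rho = 0 (valid for rho > 1)."""
    g = lambda x: sum(x**k for k in range(1, n + 1)) - n / rho
    lo, hi = 0.0, 1.0             # g(0) < 0 and g(1) = n(1 - 1/rho) > 0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if g(mid) < 0:
            lo = mid
        else:
            hi = mid
    x = (lo + hi) / 2
    return x**n

pi0_n1 = men1_pi0(1, 2.0)   # n = 1 reduces to M/M/1: pi0 = 1/rho = 0.5
pi0_n2 = men1_pi0(2, 2.0)   # ~0.382, i.e. the 0.38 quoted for M/E2/1
```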
      = E{ (q/p)^{Sn + Z_{n+1}} | Sn } ,

since {Sn} generates a Markov chain. Thus

      E{ Y_{n+1} | Yn , Y_{n−1} , …, Y0 } = (q/p)^{Sn} ( p ⋅ (q/p) + q ⋅ (q/p)^{−1} )
                                          = (q/p)^{Sn} ( q + p ) = (q/p)^{Sn} = Yn ,   (20-27)

i.e., Yn = (q/p)^{Sn} is a martingale.
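The martingale property (20-27) implies E{Yn} = Y0 = 1 for every n (taking S0 = 0), which is easy to sanity-check by simulation; a sketch with an arbitrary choice of p:

```python
import random

def walk_martingale_mean(p, n_steps, trials=100_000, seed=7):
    """Estimate E{(q/p)^{S_n}} for the random walk S_n = Z_1 + ... + Z_n,
    where Z_i = +1 with probability p and -1 with probability q = 1 - p."""
    rng = random.Random(seed)
    q = 1.0 - p
    total = 0.0
    for _ in range(trials):
        s = sum(1 if rng.random() < p else -1 for _ in range(n_steps))
        total += (q / p) ** s
    return total / trials

# With S_0 = 0 we have Y_0 = 1, so both estimates should be close to 1.
m5 = walk_martingale_mean(0.6, 5)
m20 = walk_martingale_mean(0.6, 20)
```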
      ≤ E{ e^{θ( X_{n−1} − X0 )} } e^{θ² σn² / 2} ≤ e^{θ² Σ_{i=1}^{n} σi² / 2} .       (20-38)
Substituting (20-38) into (20-37) we get

      P{ Xn − X0 ≥ x } ≤ e^{ −( θx − θ² Σ_{i=1}^{n} σi² / 2 ) } .                      (20-39)

(iii) Observe that the exponent on the right side of (20-39) is minimized for
θ = x / Σ_{i=1}^{n} σi², and hence it reduces to

      P{ Xn − X0 ≥ x } ≤ e^{ −x² / ( 2 Σ_{i=1}^{n} σi² ) } ,   x > 0 .                 (20-40)

The same result holds when Xn − X0 is replaced by X0 − Xn, and adding the two
bounds we get (20-30), the Hoeffding inequality.
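The bound (20-40) can be checked against a simple martingale with bounded increments: the symmetric ±1 random walk, where each σi = 1 plays the role of the increment bound. A Monte Carlo sketch with arbitrary n and x:

```python
import math
import random

def hoeffding_check(n=100, x=20, trials=50_000, seed=3):
    """Empirical P{X_n - X_0 >= x} for a fair +/-1 walk versus the
    Hoeffding bound exp(-x^2 / (2 n)) from Eq. (20-40) with sigma_i = 1."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(1 if rng.random() < 0.5 else -1 for _ in range(n))
        if s >= x:
            hits += 1
    return hits / trials, math.exp(-x * x / (2.0 * n))

empirical, bound = hoeffding_check()
```

As expected, the empirical tail probability (≈0.03 here) sits well below the bound e^{−2} ≈ 0.135; Hoeffding-type bounds are loose but hold uniformly.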
      = Pa + (q/p)^{a+b} ( 1 − Pa ) .                          (20-44)

Equating (20-43)-(20-44) and simplifying we get

      Pa = ( 1 − (p/q)^b ) / ( 1 − (p/q)^{a+b} ) ,             (20-45)

which agrees with (3-47), Text. Eq. (20-45) can be used to derive
other useful probabilities and advantageous plays as well [see
Examples 3-16 and 3-17, Text].
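Eq. (20-45) is straightforward to evaluate; the sketch below also includes the fair-game limit p = q = 1/2, where Pa → b/(a + b) (the sample stakes are arbitrary):

```python
def gambler_Pa(p, a, b):
    """P_a of Eq. (20-45): (1 - (p/q)^b) / (1 - (p/q)^(a+b)), q = 1 - p;
    in the fair case p = q the expression tends to b / (a + b)."""
    q = 1.0 - p
    if abs(p - q) < 1e-12:         # fair game: take the limit
        return b / (a + b)
    r = p / q
    return (1 - r**b) / (1 - r**(a + b))

fair = gambler_Pa(0.5, 3, 7)       # b/(a+b) = 0.7
unfav = gambler_Pa(0.47, 10, 10)   # p < 1/2 with equal stakes pushes P_a above 1/2
```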
Whatever the advantage, it is worth quoting the master
Gerolamo Cardano (1501-1576) on this: “The greatest
advantage in gambling comes from not playing at all.”