Problem Sheet 5 Solutions: Poisson and Birth Processes

Question 1
(a) Let $W_i$ be the time taken for atom $i$ to decay. Then $X_1 = \min\{W_1, W_2, \ldots, W_{10^{26}}\}$. Since $W_i \sim \mathrm{Exp}(10^{-26})$ and the $W_i$'s are independent, $X_1 \sim \mathrm{Exp}(1)$. To see this, note that
$$
\mathbb{P}(X_1 > x) = \mathbb{P}(\min\{W_1, \ldots, W_{10^{26}}\} > x) = \mathbb{P}(W_1 > x, \ldots, W_{10^{26}} > x) = \prod_{i=1}^{10^{26}} \mathbb{P}(W_i > x) = \left(\mathbb{P}(W_1 > x)\right)^{10^{26}} = \left(e^{-10^{-26} x}\right)^{10^{26}} = e^{-x}.
$$

After the first atom decays there are $10^{26} - 1$ atoms left. By the lack-of-memory property, each decays after a further time which is $\mathrm{Exp}(10^{-26})$. Thus
$$
X_2 \sim \mathrm{Exp}\big((10^{26} - 1)\,10^{-26}\big) = \mathrm{Exp}(1 - 10^{-26}).
$$
More formally, compute
$$
\mathbb{P}(X_2 > x) = \mathbb{P}(\min\{W_1, \ldots, W_{10^{26}-1}\} > x) = \left(e^{-10^{-26} x}\right)^{10^{26}-1},
$$
hence $X_2 \sim \mathrm{Exp}\big((10^{26} - 1)\,10^{-26}\big) = \mathrm{Exp}(1 - 10^{-26})$.
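The minimum-of-exponentials fact can be checked by a scaled-down simulation ($10^{26}$ atoms is infeasible, so the sketch below uses made-up values $n = 50$ atoms at rate $\mu = 0.02$, keeping $n\mu = 1$ as in the question):

```python
import random
import statistics

random.seed(0)

# Scaled-down check of part (a): with n atoms decaying independently at
# rate mu, the first decay time min(W_1, ..., W_n) should be Exp(n * mu).
# Here n * mu = 1, mirroring the question's 10^26 atoms at rate 10^-26.
n, mu, reps = 50, 0.02, 20_000

first_decays = [
    min(random.expovariate(mu) for _ in range(n)) for _ in range(reps)
]

mean_x1 = statistics.mean(first_decays)  # theory: 1 / (n * mu) = 1
```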

(b) Let $p = \mathbb{P}(W_i < 60) = 1 - \exp\{-60 \times 10^{-26}\}$, so that $N \sim \mathrm{Bin}(10^{26}, p)$. Recall that the Poisson distribution with parameter $\lambda = np$ approximates the binomial distribution $\mathrm{Bin}(n, p)$ if $n$ is sufficiently large and $p$ is sufficiently small. In our case, since $10^{26}$ is enormous and $p$ very small, the distribution of $N$ is approximately Poisson with mean $\lambda = 10^{26} p$. But $p = 1 - \left(1 - 60 \times 10^{-26} + \tfrac{1}{2}(60 \times 10^{-26})^2 - \cdots\right) \approx 60 \times 10^{-26}$. Thus $\lambda \approx 60$.
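The quality of this Poisson approximation can be illustrated numerically on a smaller, hypothetical scale (exact binomial arithmetic with $n = 10^{26}$ is out of reach, so the sketch below uses $n = 10^6$ with $np = 60$ as in the question, working in log space to avoid underflow):

```python
import math

# Compare Bin(n, p) with Pois(lam), lam = n*p = 60, via total variation
# distance, computing both pmfs in log space for numerical stability.
n = 10**6
lam = 60.0
p = lam / n

def binom_pmf(k):
    return math.exp(
        math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
        + k * math.log(p) + (n - k) * math.log1p(-p)
    )

def pois_pmf(k):
    return math.exp(-lam + k * math.log(lam) - math.lgamma(k + 1))

# Sum over a range holding essentially all the mass of both distributions.
tv = 0.5 * sum(abs(binom_pmf(k) - pois_pmf(k)) for k in range(301))
```

Le Cam's inequality bounds this total variation distance by $np^2 = 3.6 \times 10^{-3}$, so the two distributions are very close even at this modest $n$.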

Question 2
(a) Suppose that
$$
W_i = \begin{cases} 1 & \text{if the } i\text{th decay is recorded,} \\ 0 & \text{if the } i\text{th decay isn't recorded.} \end{cases}
$$
Then $N_t = \sum_{i=1}^{D_t} W_i$. Since the $W_i$'s are independent and $D_t$ is independent of the $W_i$'s, we have
$$
\begin{aligned}
G_{N_t}(s) = \mathbb{E}(s^{N_t}) &= \mathbb{E}\left(s^{\sum_{i=1}^{D_t} W_i}\right) = \mathbb{E}\left(\mathbb{E}\left(s^{\sum_{i=1}^{D_t} W_i} \,\middle|\, D_t\right)\right) \\
&= \mathbb{E}\left(\mathbb{E}\left(\prod_{i=1}^{D_t} s^{W_i} \,\middle|\, D_t\right)\right) = \mathbb{E}\left(\prod_{i=1}^{D_t} \mathbb{E}\left(s^{W_i}\right)\right) = \mathbb{E}\left(\prod_{i=1}^{D_t} G_W(s)\right) \\
&= \mathbb{E}\left((G_W(s))^{D_t}\right) = G_{D_t}(G_W(s)).
\end{aligned}
$$
Since $D_t \sim \mathrm{Pois}(\mu t)$, $G_{D_t}(s) = e^{\mu t(s-1)}$ and $G_W(s) = (1-p) + ps$, it follows that
$$
G_{N_t}(s) = e^{\mu p t(s-1)},
$$
i.e. $N_t \sim \mathrm{Pois}(\mu p t)$.
Alternatively, you could argue that this is nothing other than the thinning of a Poisson process!
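The thinning interpretation lends itself to a quick simulation sketch (the rate $\mu$, recording probability $p$, and horizon $t$ below are made-up values): the recorded count should be Poisson with mean $\mu p t$, and therefore have matching mean and variance.

```python
import random

random.seed(1)

# Thinning sketch: decays form a Poisson process of rate mu on [0, t];
# each decay is recorded independently with probability p. The number of
# recorded decays should then be Pois(mu * p * t).
mu, p, t = 2.0, 0.3, 5.0   # illustrative values
reps = 100_000

counts = []
for _ in range(reps):
    recorded = 0
    s = random.expovariate(mu)        # first arrival time
    while s <= t:
        if random.random() < p:       # thin: keep with probability p
            recorded += 1
        s += random.expovariate(mu)   # next inter-arrival gap
    counts.append(recorded)

mean_n = sum(counts) / reps
var_n = sum((c - mean_n) ** 2 for c in counts) / reps
# theory: mean_n and var_n should both be close to mu * p * t = 3.0
```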
(b) $\mathbb{P}(N_1 = 1) = 3e^{-3}$. Now
$$
\begin{aligned}
\mathbb{P}(T_1 > 0.75 \mid N_1 = 1) &= \frac{\mathbb{P}(T_1 > 0.75,\, N_1 = 1)}{\mathbb{P}(N_1 = 1)} \\
&= \frac{\mathbb{P}(\text{no event in } [0, 0.75],\ \text{one event in } [0.75, 1])}{\mathbb{P}(N_1 = 1)} \\
&= \frac{\mathbb{P}(\text{no event in } [0, 0.75])\,\mathbb{P}(\text{one event in } [0.75, 1])}{\mathbb{P}(N_1 = 1)} \\
&= \frac{\mathbb{P}(N_{3/4} = 0)\,\mathbb{P}(N_1 - N_{3/4} = 1)}{\mathbb{P}(N_1 = 1)},
\end{aligned}
$$
where we have used independent increments. Substituting the relevant probability mass functions (using $N_t \sim \mathrm{Pois}(3t)$) completes the exercise.
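Carrying out that substitution numerically (a quick sketch; the `pois_pmf` helper is just illustrative):

```python
import math

# Finish part (b): with N_t ~ Pois(3t),
#   P(T1 > 0.75 | N_1 = 1)
#     = P(N_{3/4} = 0) * P(N_1 - N_{3/4} = 1) / P(N_1 = 1).
def pois_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

numerator = pois_pmf(0, 3 * 0.75) * pois_pmf(1, 3 * 0.25)
denominator = pois_pmf(1, 3.0)
answer = numerator / denominator   # e^{-9/4} * (3/4)e^{-3/4} / (3e^{-3}) = 1/4
```

The exact value $1/4$ is consistent with the general fact that, conditional on $N_1 = 1$, the single arrival time is uniform on $[0, 1]$.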

Question 3
Applying the forward equations, it is clear that, for $n \geq 1$,
$$
p_n(t + \delta) = (1 - \lambda_n \delta) p_n(t) + \lambda_{n-1} \delta\, p_{n-1}(t) + o(\delta);
$$
using the usual arguments it follows that
$$
p_n'(t) = -\lambda_n p_n(t) + \lambda_{n-1} p_{n-1}(t).
$$
For $n = 0$ it is clear that
$$
p_0'(t) = -\lambda_0 p_0(t),
$$
i.e. $p_0(t) = \exp\{-\lambda_0 t\}$, where we have used the boundary condition $p_0(0) = 1$.
To complete the exercise, we need to verify that $p_n(t)$, as given, is a solution of the forward equations. The case $n = 0$ is clear, so consider $n \geq 1$. Now
$$
p_n'(t) = -\frac{1}{\lambda_n} \sum_{i=0}^{n} \lambda_i^2 e^{-\lambda_i t} \prod_{j=0,\, j \neq i}^{n} \frac{\lambda_j}{\lambda_j - \lambda_i}. \tag{1}
$$

In addition,
$$
-\lambda_n p_n(t) = -\sum_{i=0}^{n} \lambda_i e^{-\lambda_i t} \prod_{j=0,\, j \neq i}^{n} \frac{\lambda_j}{\lambda_j - \lambda_i}, \tag{2}
$$
$$
\lambda_{n-1} p_{n-1}(t) = \sum_{i=0}^{n-1} \lambda_i e^{-\lambda_i t} \prod_{j=0,\, j \neq i}^{n-1} \frac{\lambda_j}{\lambda_j - \lambda_i}. \tag{3}
$$
Adding together (2) and (3) we have
$$
-\lambda_n e^{-\lambda_n t} \prod_{j=0,\, j \neq n}^{n} \frac{\lambda_j}{\lambda_j - \lambda_n}
+ \sum_{i=0}^{n-1} \lambda_i e^{-\lambda_i t} \left(\prod_{j=0,\, j \neq i}^{n-1} \frac{\lambda_j}{\lambda_j - \lambda_i}\right) \left(1 - \frac{\lambda_n}{\lambda_n - \lambda_i}\right).
$$

The summation is equal to
$$
-\sum_{i=0}^{n-1} \frac{\lambda_i^2}{\lambda_n - \lambda_i}\, e^{-\lambda_i t} \prod_{j=0,\, j \neq i}^{n-1} \frac{\lambda_j}{\lambda_j - \lambda_i}
= -\frac{1}{\lambda_n} \sum_{i=0}^{n-1} \lambda_i^2 e^{-\lambda_i t} \prod_{j=0,\, j \neq i}^{n} \frac{\lambda_j}{\lambda_j - \lambda_i},
$$
hence (2) + (3) is
$$
-\lambda_n e^{-\lambda_n t} \prod_{j=0,\, j \neq n}^{n} \frac{\lambda_j}{\lambda_j - \lambda_n}
- \frac{1}{\lambda_n} \sum_{i=0}^{n-1} \lambda_i^2 e^{-\lambda_i t} \prod_{j=0,\, j \neq i}^{n} \frac{\lambda_j}{\lambda_j - \lambda_i}
= -\frac{1}{\lambda_n} \sum_{i=0}^{n} \lambda_i^2 e^{-\lambda_i t} \prod_{j=0,\, j \neq i}^{n} \frac{\lambda_j}{\lambda_j - \lambda_i},
$$
which is exactly (1); this completes the exercise.
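The algebra can also be sanity-checked numerically for a small, made-up set of distinct rates. Consistently with (1) and (2), the solution being verified takes the form $p_n(t) = \frac{1}{\lambda_n} \sum_{i=0}^{n} \lambda_i e^{-\lambda_i t} \prod_{j \neq i} \frac{\lambda_j}{\lambda_j - \lambda_i}$; the sketch below (rates are hypothetical) checks the forward equation by central differences:

```python
import math

# Numerical sanity check with made-up distinct rates: the solution
#   p_n(t) = (1/lam[n]) * sum_i lam[i]*exp(-lam[i]*t)
#                        * prod_{j != i} lam[j]/(lam[j]-lam[i])
# should satisfy p_n'(t) = -lam[n]*p_n(t) + lam[n-1]*p_{n-1}(t).
lam = [1.0, 2.5, 4.0, 7.0]   # hypothetical birth rates lam_0..lam_3

def p(n, t):
    total = 0.0
    for i in range(n + 1):
        prod = math.prod(
            lam[j] / (lam[j] - lam[i]) for j in range(n + 1) if j != i
        )
        total += lam[i] * math.exp(-lam[i] * t) * prod
    return total / lam[n]

n, t, h = 3, 0.8, 1e-6
deriv = (p(n, t + h) - p(n, t - h)) / (2 * h)   # central-difference p_n'(t)
forward = -lam[n] * p(n, t) + lam[n - 1] * p(n - 1, t)
# deriv and forward should agree to high accuracy
```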

Question 4
The forward equations can be employed, as in lectures, to obtain
$$
p_1'(t) = -\lambda p_1(t),
$$
$$
p_n'(t) = -n\lambda p_n(t) + (n-1)\lambda p_{n-1}(t), \qquad n \geq 2.
$$
The boundary conditions are $p_1(0) = 1$ and $p_n(0) = 0$ for $n \geq 2$.


This problem can be solved in various ways. Here we carry out a proof by induction. We want to prove that
$$
p_n(t) = (1 - e^{-\lambda t})^{n-1} e^{-\lambda t}
$$
holds for all $n \geq 1$.


Clearly, by solving the forward equations we get
$$
p_1(t) = e^{-\lambda t} = (1 - e^{-\lambda t})^{1-1} e^{-\lambda t}.
$$
Now we write down the forward equation for $n + 1$:
$$
p_{n+1}'(t) = -(n+1)\lambda p_{n+1}(t) + n\lambda p_n(t).
$$
Using the induction hypothesis, we get
$$
p_{n+1}'(t) + (n+1)\lambda p_{n+1}(t) = n\lambda p_n(t) = n\lambda (1 - e^{-\lambda t})^{n-1} e^{-\lambda t}.
$$
Using the integrating factor approach, we get
$$
M(x) = \exp\left(\int_0^x \lambda(n+1)\, ds\right) = \exp(\lambda(n+1)x),
$$
and, using the boundary condition $p_{n+1}(0) = 0$,
$$
p_{n+1}(t) = (M(t))^{-1} \int_0^t n\lambda p_n(u) M(u)\, du
= e^{-\lambda(n+1)t} \int_0^t \lambda n (1 - e^{-\lambda u})^{n-1} e^{-\lambda u} e^{\lambda(n+1)u}\, du.
$$

Note that
$$
\begin{aligned}
\int_0^t \lambda n (1 - e^{-\lambda u})^{n-1} e^{-\lambda u} e^{\lambda(n+1)u}\, du
&= \int_0^t \lambda n (1 - e^{-\lambda u})^{n-1} e^{\lambda n u}\, du \\
&= \lambda n \int_0^t \sum_{k=0}^{n-1} \binom{n-1}{k} (-1)^k (e^{-\lambda u})^k e^{\lambda n u}\, du \\
&= \lambda n \int_0^t \sum_{k=0}^{n-1} \binom{n-1}{k} (-1)^k e^{\lambda(n-k)u}\, du \\
&= \sum_{k=0}^{n-1} \binom{n-1}{k} (-1)^k n \int_0^t \lambda e^{\lambda(n-k)u}\, du \\
&= \sum_{k=0}^{n-1} \binom{n-1}{k} \frac{n}{n-k} (-1)^k \left(e^{\lambda n t} e^{-\lambda k t} - 1\right).
\end{aligned}
$$
By rearranging terms we find that
$$
\int_0^t \lambda n (1 - e^{-\lambda u})^{n-1} e^{-\lambda u} e^{\lambda(n+1)u}\, du = e^{\lambda n t}(1 - e^{-\lambda t})^n.
$$
Thus
$$
p_{n+1}(t) = e^{\lambda n t}(1 - e^{-\lambda t})^n e^{-\lambda(n+1)t} = (1 - e^{-\lambda t})^n e^{-\lambda t},
$$
which concludes the proof.
Alternative proof:
Alternatively, you could show that $p_n$ satisfies the forward equations. For $n = 1$ we have
$$
p_1(t) = (1 - e^{-\lambda t})^{1-1} e^{-\lambda t} = e^{-\lambda t}.
$$
Then
$$
p_1'(t) = -\lambda e^{-\lambda t} = -\lambda p_1(t),
$$
hence $p_1$ satisfies the forward equation. For $n \geq 2$ we need to check whether
$$
p_n'(t) = -n\lambda p_n(t) + (n-1)\lambda p_{n-1}(t)
$$
is satisfied. Note that for
$$
p_n(t) = (1 - e^{-\lambda t})^{n-1} e^{-\lambda t},
$$
we have
$$
p_n'(t) = (n-1)(1 - e^{-\lambda t})^{n-2} \lambda e^{-\lambda t} e^{-\lambda t} + (1 - e^{-\lambda t})^{n-1} (-\lambda) e^{-\lambda t}
= (n-1)\lambda p_{n-1}(t) e^{-\lambda t} - \lambda p_n(t).
$$
Let us check whether this is equal to the expression coming from the forward equations, i.e. $p_n'(t) = -n\lambda p_n(t) + (n-1)\lambda p_{n-1}(t)$. Hence we check whether
$$
\begin{aligned}
(n-1)\lambda p_{n-1}(t) e^{-\lambda t} - \lambda p_n(t) &= -n\lambda p_n(t) + (n-1)\lambda p_{n-1}(t) \\
(n-1)\lambda p_{n-1}(t)\left(e^{-\lambda t} - 1\right) &= -(n-1)\lambda p_n(t) \\
p_n(t) &= p_{n-1}(t)\left(1 - e^{-\lambda t}\right),
\end{aligned}
$$
which is true. Compare this expression with the definition of $p_n$!
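Both proofs say that the population size at time $t$ is geometric with parameter $e^{-\lambda t}$; simulating the underlying simple birth (Yule) process gives a further sanity check (the values of $\lambda$ and $t$ below are illustrative):

```python
import math
import random

random.seed(2)

# Simulate a Yule (simple birth) process started from one individual:
# in state n the holding time is Exp(n * lam). The size at time t should
# satisfy P(X_t = n) = (1 - e^{-lam*t})**(n-1) * e^{-lam*t}.
lam, t, reps = 1.0, 1.0, 100_000

def size_at_t(lam, t):
    n, clock = 1, 0.0
    while True:
        clock += random.expovariate(n * lam)  # holding time in state n
        if clock > t:
            return n
        n += 1

sizes = [size_at_t(lam, t) for _ in range(reps)]

emp = {n: sizes.count(n) / reps for n in (1, 2, 3)}
theory = {n: (1 - math.exp(-lam * t)) ** (n - 1) * math.exp(-lam * t)
          for n in (1, 2, 3)}
# emp[n] should be close to theory[n] for each n
```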

Question 5
(a) If there are $i$ infectives there are $a - i$ susceptibles, and hence the birth rate is given by $\lambda_i = (a-i)i\eta$ if $i = 1, \ldots, a$, and $0$ otherwise.

(b) Let $X_i$ be the time spent in state $i$; then the time to complete the epidemic is
$$
T = X_1 + \cdots + X_{a-1},
$$
with $X_i \sim \mathrm{Exp}(\lambda_i)$, mutually independent. The Laplace transform of an exponential is $\lambda_i/(\lambda_i + s)$, thus
$$
\mathbb{E}[e^{-sT}] = \mathbb{E}\left[e^{-s \sum_{i=1}^{a-1} X_i}\right] = \prod_{i=1}^{a-1} \mathbb{E}[e^{-sX_i}]
= \prod_{i=1}^{a-1} \frac{\lambda_i}{\lambda_i + s} = \prod_{i=1}^{a-1} \frac{(a-i)i\eta}{s + (a-i)i\eta}.
$$
To compute the expectation, we take the logarithm of the Laplace transform and use the fact that
$$
\mathbb{E}[T] = -\left.\frac{\partial}{\partial s}\left(\log \mathbb{E}[e^{-sT}]\right)\right|_{s=0}.
$$

The result then follows:
$$
\mathbb{E}[T] = \sum_{i=1}^{a-1} \frac{1}{(a-i)i\eta}.
$$
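As a final check, the sketch below simulates the epidemic for small made-up values of $a$ and $\eta$ and compares the empirical mean duration with the formula:

```python
import random

random.seed(3)

# Simulate the simple epidemic: from state i (i infectives, a - i
# susceptibles) the process waits Exp((a-i)*i*eta) before the next
# infection; the epidemic is complete on reaching state a.
a, eta, reps = 10, 0.5, 50_000   # illustrative values

expected = sum(1 / ((a - i) * i * eta) for i in range(1, a))

total = 0.0
for _ in range(reps):
    for i in range(1, a):
        total += random.expovariate((a - i) * i * eta)
sim_mean = total / reps
# sim_mean should be close to expected (about 1.13 here)
```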
