
MATH3029/MATH6109 LECTURE 14

So far we have seen a lot of measure theory. On the one hand that’s good,
because measure theory lies at the foundation of probability theory, and it is very
useful for some things, like deriving highly nontrivial statements about tail events
such as Kolmogorov’s zero-one law. But it is far from the only aspect of probability
theory. And now that you have some knowledge of these foundations, we can use
these tools to study other parts of probability theory.
The plan for today is as follows:
• stochastic processes;
• Poisson process;
• the jump process associated with a Poisson process.

1. Stochastic processes
Today we will discuss a specific stochastic process, but before doing so it is useful to briefly consider general stochastic processes. Let (Ω, F) be an event space. A continuous-time stochastic process on Ω is a collection (Nt)t≥0 of random variables Nt : Ω → Ω′ for t ≥ 0, where (Ω′, F′) is an event space. A discrete-time stochastic process on Ω is a sequence (Tk)k≥0 of random variables Tk : Ω → Ω′, k ∈ Z+.
Throughout today’s lecture we only consider (Ω′, F′) = (R, B) or (Ω′, F′) = (Z, P(Z)), as these are the most common choices. Note that continuous time and
discrete time do not mean that the random variables which are involved are con-
tinuous or discrete. In fact, today the continuous-time stochastic process consists
of discrete random variables, and the discrete-time stochastic process consists of
continuous random variables.
A random variable is often used to model a measurement of something random.
A stochastic process can be used to model measurements of something random
which also changes over time.
Let (Nt)t≥0 be a continuous-time stochastic process with N0 = 0 (i.e. N0 is the “random” variable that is constantly equal to 0). Then, for each 0 ≤ s < t, we call the random variable Nt − Ns an increment of the process (Nt)t≥0. We say that (Nt)t≥0 has independent increments if, for all n ∈ N and all choices of points 0 = t0 < t1 < . . . < tn, the sequence (N_{t_i} − N_{t_{i−1}})_{i=1}^{n} of increments is independent.
Note that it is more reasonable to expect the increments to be independent than
the random variables Nt and Ns themselves, since in most applications the future
will depend in some nontrivial way on the past.
Similarly, consider a discrete-time process (Tk)k≥0 with T0 = 0. Then the random variable Tl − Tk, for k < l, is an increment of the process. One says that the process has independent increments if for all n ∈ N and all 0 = k0 < k1 < . . . < kn, the sequence (T_{k_i} − T_{k_{i−1}})_{i=1}^{n} is independent.
Usually one considers stochastic processes with more interesting properties, like
the Poisson process which we will discuss now.

2. Poisson process
The Poisson process arises as a natural model for situations where a whole bunch
of events happen independently at random times, and you are interested in the
number of events in a given time interval. We already studied such a setting a few
weeks ago, but back then we fixed a time interval (0, t] and studied the distribution
of the random number of claims in that interval. Here we let the time interval vary,
and study the properties of the resulting stochastic process.
We derived the gamma and exponential distributions as the waiting time dis-
tributions associated with a Poisson process. In this lecture we go the other way
and derive the Poisson process starting from the exponential distribution. It is not
that important to know in what ways one can derive the Poisson process (we will
be more interested in its properties), but it’s nonetheless good to see that it arises
naturally in various different ways.
Let (Li)i≥1 be a sequence of i.i.d. random variables that are exponentially distributed with parameter α > 0. In other words, each Li : Ω → (0, ∞) is a random variable on a probability space (Ω, F, P) with distribution P_{Li} = Eα on (0, ∞). We
have already seen that a probability space (Ω, F, P) and an independent sequence
(Li )i∈N like this exist (by Theorem 3.26 in the book, one can for example take
Ω = (0, ∞)N ). Each Li is exponentially distributed, and when we derived the ex-
ponential distribution we viewed such a random variable as modeling the random
time you have to wait until the first event. Since the Li are independent of each
other, it also makes sense to view Li as the random time between the (i − 1)-th
event and the i-th event.
Now set Tk := Σ_{i=1}^{k} Li, and T0 = 0. Each Tk is a random variable, as a sum
of random variables. By the interpretation that we have just given the Li , Tk is
simply the random time that you have to wait for the k-th event to occur. We will
have more to say about these random variables later.
Next, for t > 0 we let Nt : Ω → Z+ be given by

(2.1)    Nt(ω) := max{k ∈ Z+ | Tk(ω) ≤ t} = Σ_{k=1}^{∞} 1_{[0,t]}(Tk(ω))    (ω ∈ Ω).

Also, set N0 := 0. Each 1_{[0,t]}(Tk) is a random variable, as the composition of the random variable Tk : Ω → (0, ∞) with the measurable function 1_{[0,t]} : (0, ∞) → [0, 1]. Sums of random
variables are random variables, and so is the supremum of countably many random
variables, so we see that Nt itself is a random variable. Given our interpretation
of Tk as the time until the k-th event, Nt is simply the random number of events
in (0, t]. Note that technically one should replace the max in (2.1) by a sup, since
one could have Nt (ω) = ∞, but the probability P({Nt = ∞}) of this happening is
zero. So when you’re only interested in the distribution of Nt then you can cheat a
bit and ignore the point at infinity, which we will do from now on. Now (Nt )t≥0 is
a stochastic process.
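
To make this construction concrete, here is a minimal simulation sketch in Python (the use of NumPy and all variable names are my own choices, not from the book): it draws i.i.d. Exp(α) interarrival times Li, forms the partial sums Tk, and counts how many of the Tk fall in (0, t].

import numpy as np

rng = np.random.default_rng(0)
alpha = 2.0        # intensity alpha > 0
n_events = 1000    # how many interarrival times L_i to draw

# L_i ~ E_alpha: NumPy parametrizes the exponential by its mean 1/alpha
L = rng.exponential(scale=1.0 / alpha, size=n_events)

# T_k = L_1 + ... + L_k: the time of the k-th event
T = np.cumsum(L)

def N(t):
    # N_t = max{k | T_k <= t} = number of events in (0, t]
    return int(np.searchsorted(T, t, side="right"))

print(N(1.0), N(5.0), N(10.0))  # increasing counts; E[N_t] = alpha * t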
Now that we have introduced our stochastic process (Nt )t≥0 , we are ready to
describe its main properties.
Theorem 2.1. (Nt )t≥0 is a continuous-time stochastic process of random variables
Nt : Ω → Z+ , for t ≥ 0, with N0 = 0. The process has independent increments,
and each increment Nt − Ns has the Poisson distribution with parameter α(t − s),
for 0 ≤ s < t.

I won’t discuss the proof in the lecture, since we don’t have that much time, and
the proof is somewhat technical and not particularly insightful, but you can have a
look at it in the book if you want.
We call any process (Nt )t≥0 as in Theorem 2.1 a Poisson process with intensity
α > 0.
There are a few things you should note:
• Technically the Poisson process depends on the underlying space (Ω, F, P),
but this is irrelevant and we only care about the distributions of the random
variables.
• If you let s = 0, then it follows that Nt − N0 = Nt is Poisson distributed
with parameter αt. So each of the random variables Nt, t > 0, is Pois-
son distributed with parameter αt. However, a Poisson process has more
structure than this: if you interpret the increments as counting the numbers
of events in disjoint time intervals, then these counts are independent and
each is Poisson distributed as well.
• The distribution of Nt − Ns depends only on the length t − s of the interval; one says that the increments are stationary.
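
These properties are easy to check empirically. The following sketch (the same hypothetical setup as above) estimates the mean and variance of increments over two disjoint intervals; for a Poisson distribution both should be close to the parameter α(t − s), and the correlation between the two increments should be close to 0.

import numpy as np

rng = np.random.default_rng(1)
alpha, n_paths, n_events = 2.0, 20_000, 200

# one row of event times T_1 < T_2 < ... per simulated path
T = np.cumsum(rng.exponential(1.0 / alpha, size=(n_paths, n_events)), axis=1)

def counts(t):
    # N_t for every path: the number of T_k with T_k <= t
    return (T <= t).sum(axis=1)

inc1 = counts(2.0) - counts(1.0)  # increment over (1, 2]
inc2 = counts(5.0) - counts(3.0)  # increment over (3, 5]

print(inc1.mean(), inc1.var())         # both approx alpha * 1 = 2
print(inc2.mean(), inc2.var())         # both approx alpha * 2 = 4
print(np.corrcoef(inc1, inc2)[0, 1])   # approx 0 for disjoint intervals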

3. The jump process


There is one more way to look at the Poisson process. Since Tk(ω) ≤ t if and
only if Nt(ω) ≥ k, you can invert (2.1) and write
Tk(ω) = inf{t > 0 | Nt(ω) ≥ k}    (ω ∈ Ω)
for k ∈ N. Also set T0 := 0. Then Tk is the random time at which the map t ↦
Nt(ω) makes its k-th jump of size 1. This is why (Tk)k≥0 is called the jump process
for the Poisson process with intensity α. It is a discrete-time stochastic process, and
it has several nice properties.
it has several nice properties.
Theorem 3.1. (Tk)k≥0 is a discrete-time stochastic process with independent incre-
ments, such that each increment Tl − Tk has the gamma distribution Γα,l−k for
0 ≤ k < l.
Proof. We have already seen that each Tk : Ω → (0, ∞) is a random variable.
Moreover, for k < l,

Tl − Tk = Σ_{i=1}^{l} Li − Σ_{i=1}^{k} Li = Σ_{i=k+1}^{l} Li,

and we saw last week that a sum of l − k independent random variables with
distribution Eα has distribution Γα,l−k. Also, for n ∈ N and 0 = k0 < k1 < . . . < kn, one
has

(T_{k_i} − T_{k_{i−1}})_{i=1}^{n} = (Σ_{j=k_{i−1}+1}^{k_i} Lj)_{i=1}^{n},

and the latter sequence is independent because (Li)i≥1 is independent and the sums
involve disjoint collections of the Li (recall the theorem about combining disjoint
collections of independent random variables). □
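
As a sanity check on Theorem 3.1, the sketch below (hypothetical names again) compares the sample mean and variance of Tl − Tk with those of Γα,l−k, which are (l − k)/α and (l − k)/α² respectively.

import numpy as np

rng = np.random.default_rng(2)
alpha, k, l, n_paths = 2.0, 3, 7, 100_000

# event times T_1, ..., T_l for each simulated path
T = np.cumsum(rng.exponential(1.0 / alpha, size=(n_paths, l)), axis=1)

inc = T[:, l - 1] - T[:, k - 1]  # T_l - T_k (columns are 0-indexed)

print(inc.mean(), (l - k) / alpha)     # both approx 2.0
print(inc.var(), (l - k) / alpha**2)   # both approx 1.0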
We now briefly discuss a concrete example to give you an idea of when you can
expect to encounter Poisson processes.

Example 3.2. Suppose that people arrive for lunch at a restaurant in a university
food precinct according to a Poisson process with intensity 1/2 (per minute). As we
have seen today, this assumption is reasonable if the arrival times are independent,
and if the times between arrivals have the distribution E1/2 . Suppose you arrive ten
minutes after the shop opens. What is the probability that there are five people
waiting when you arrive?
Let Nt be the number of people waiting at time t ≥ 0. Then we’re interested
in P(N10 = 5). By Theorem 2.1, N10 has the Poisson distribution P5 with parameter (1/2) · 10 = 5, so

P(N10 = 5) = P5({5}) = (5^5 / 5!) e^{−5} ≈ 17.547%.
Now, suppose that you arrive in the first ten minutes, and that there are indeed
five people waiting in line when you arrive. What is the probability that for two
minutes no one arrives after you? Let (Tk)k≥1 be the arrival times of the customers.
The question is to determine P(T7 − T6 ≥ 2 | T6 ≤ 10). Since T6 and T7 − T6 are
independent increments, Theorem 3.1 yields

P(T7 − T6 ≥ 2 | T6 ≤ 10) = P(T7 − T6 ≥ 2) = E1/2([2, ∞)) = ∫_2^∞ (1/2) e^{−x/2} dx = e^{−1} ≈ 36.788%.
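
Both numbers are easy to confirm numerically; here is a small sketch using SciPy’s distributions (my choice of library, not the book’s).

from scipy.stats import expon, poisson

# N_10 is Poisson distributed with parameter alpha * t = (1/2) * 10 = 5
print(poisson.pmf(5, 5))       # approx 0.17547

# T_7 - T_6 has distribution E_{1/2}; SciPy uses scale = 1/rate = 2
print(expon.sf(2, scale=2))    # P(T_7 - T_6 >= 2) = exp(-1) approx 0.36788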
You can read more about the Poisson process in the book. In particular, the
book discusses more complicated Poisson processes in the examples.
