
Lévy processes

Abhinav Pradeep

March 19, 2024


Table of Contents

Lévy process

Algorithm to simulate a Lévy process

Recap: Brownian motion

Recap: Poisson random measure

Recap: Poisson process and compound Poisson process

Lévy–Itô decomposition

Characterising a Lévy process

Gamma process

Simulation
Lévy process

A Lévy process is a stochastic process {Xt}R+ with Xt ∼ Dist(t), satisfying

1. Continuity conditions: sample paths are right continuous with left limits (càdlàg).

2. Independent increments: ∀ t1 < t2 < t3 < t4 ∈ R+,

   Xt4 − Xt3 is independent of Xt2 − Xt1

3. Stationary increments: ∀ t, n ∈ R+,

   (Xt+n − Xt) ∼ Dist(n)

The Lévy process hence generalises the properties of the Wiener and
compound Poisson processes. An interesting property is that it can
be fully characterised by a Poisson random measure (PRM) and a Brownian motion.
Algorithm to simulate a Lévy process

This algorithm exploits the independence and stationarity of the increments.

Assume the marginal distribution Xt ∼ Dist(t) is known for all t.

A realisation over an interval [0, tn], on a grid 0 = t0 < t1 < · · · < tn, can be generated as follows (a MATLAB sketch is given after the list):

1. Set X0 := 0. Take k = 1

2. Draw A ∼ Dist(tk − tk−1 )

3. Set Xk = Xk−1 + A

4. While k ̸= n, set k := k + 1 and repeat steps 2 and 3.
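
Below is a minimal MATLAB sketch of these steps. The function handle incrSampler is a placeholder of my own (not from the slides) standing in for a sampler of Dist(dt); here it draws Wiener increments as an example.

% Generic increment-based simulation, assuming a sampler for Dist(dt)
% is available as a function handle incrSampler(dt).
n = 100;                                  % number of steps
t = linspace(0, 1, n+1);                  % grid 0 = t_0 < t_1 < ... < t_n
incrSampler = @(dt) sqrt(dt) * randn();   % example: Wiener increments, N(0, dt)

X = zeros(1, n+1);                        % X(1) holds X_0 = 0
for k = 2:n+1
    A = incrSampler(t(k) - t(k-1));       % step 2: draw A ~ Dist(t_k - t_{k-1})
    X(k) = X(k-1) + A;                    % step 3: add the increment
end
plot(t, X);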


Recap: Brownian motion

A Wiener process Wt is a Gaussian process with continuous sample


paths on R+ s.t. µ({Wt }R+ ) = 0 and Cov(Ws , Wt ) = s ∧ t.

A Brownian motion is obtained by applying an affine


transformation to the Wiener process:

Bt>0 := B0 + at + cWt
Where a ∈ R is called the drift coefficient and c ∈ R is called the
diffusion coefficient. Moreover, B0 is Gaussian and independent of
{Wt }R+ .

Effectively, it is a scaled Wiener process that drifts in some
direction, started at a random point, where the starting point says
nothing about future movement.
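
A minimal MATLAB sketch of this construction; the particular values a = 0.5, c = 1 and the choice B0 ∼ N(0, 1) are illustrative assumptions, not from the slides.

% Brownian motion with drift: B_t = B_0 + a*t + c*W_t
n = 1000; T = 1;
t = linspace(0, T, n+1);
a = 0.5; c = 1;                    % drift and diffusion coefficients (illustrative)
dW = sqrt(T/n) * randn(1, n);      % independent N(0, dt) Wiener increments
W = [0, cumsum(dW)];               % W_0 = 0
B0 = randn();                      % Gaussian start, independent of W
B = B0 + a*t + c*W;
plot(t, B);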
Recap: Poisson random measure

Consider measurable spaces (Ω, H) and (E, E). A random measure
is a mapping N : Ω × E → R+ such that, for every A ∈ E,
NA : ω ↦ N(ω, A) is a random variable and, for every ω ∈ Ω,
Nω : A ↦ N(ω, A) is a measure on (E, E).

Let E ⊆ Rn and E := B(E). A Poisson random measure is a
random counting measure on (E, E) such that:

N(ω, A) ∼ Poi(µ(A))

where µ is called the mean measure, and for any disjoint sequence
of sets An ∈ E the random variables NAn are independent.
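
For intuition, a PRM with a finite mean measure can be simulated by drawing a Poisson number of points and scattering them according to the normalised mean measure. A minimal MATLAB sketch on E = [0, 1]^2 with µ = c · Leb, where c = 50 is an illustrative choice:

% Poisson random measure on E = [0,1]^2 with mean measure mu = c * Leb
c = 50;                           % mu(E) = c * Leb([0,1]^2) = c (illustrative)
K = poissrnd(c);                  % total count N(E) ~ Poi(mu(E))
P = rand(K, 2);                   % given K, points are i.i.d. ~ mu / mu(E)
% N(omega, A) is the number of points falling in A, e.g. A = [0,0.5]^2:
countA = sum(P(:,1) <= 0.5 & P(:,2) <= 0.5);   % ~ Poi(c/4)
plot(P(:,1), P(:,2), '.');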
Recap: Poisson process and compound Poisson process

Setting E := R+ (and E correspondingly), the Poisson process Nt is
defined as:

Nt := N([0, t])

Now set E := R+ × Rn and suppose the mean measure is the product
measure Leb ⊗ ν, with ν(Rn) = c < ∞. That is, the number of points
scattered along Rn per unit time is finite and on average equal to c.
The process

Kt := N([0, t] × Rn)

with mean measure Leb ⊗ ν is a Poisson process. A compound
Poisson process is the stochastic integral below, taken w.r.t. the
measure defined by N:
Xt := ∫_{[0,t]×Rn} x N(ds, dx), t ≥ 0
Recap: Compound Poisson process

Xt := ∫_{[0,t]×Rn} x N(ds, dx), t ≥ 0

The above integral effectively adds up the values of the
scattered points across Rn for each arrival in [0, t]. That is, the
process can be thought of as follows: given the number of arrivals
Kt ∼ Poi(ct) on [0, t], we add up a batch of jumps drawn from ν/c:

Xt = Σ_{i=1}^{Kt} Yi, Yi i.i.d. ∼ ν/c

where Y1, Y2, Y3, . . . are drawn independently of the Poisson counting
process Kt, which has rate c. A MATLAB sketch of this representation
is given below.
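
A minimal sketch of this representation at a fixed horizon, assuming (purely for illustration) that ν/c is a standard normal jump distribution and that c = 5:

% Compound Poisson process: X_t = sum_{i=1}^{K_t} Y_i
c = 5; t = 1;                       % jump rate and horizon (illustrative)
Kt = poissrnd(c * t);               % number of arrivals on [0, t]
Y = randn(1, Kt);                   % i.i.d. jumps Y_i ~ nu / c (here N(0,1))
Xt = sum(Y);                        % value of the process at time t
% For a whole path, attach uniform arrival times to the jumps:
S = sort(rand(1, Kt)) * t;          % arrival times given K_t
stairs([0, S, t], [0, cumsum(Y), Xt]);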
Lévy–Itô decomposition

Recall the condition (call it 1.0) for a compound Poisson process:

ν(Rn) := c < ∞

This can be relaxed to (call it 2.0):

∫_{Rn} min(∥x∥, 1) ν(dx) < ∞

and further relaxed to (call it 3.0):

∫_{Rn} min(∥x∥², 1) ν(dx) < ∞
Lévy–Itô decomposition
Theorem: A process taking values in Rn is Lévy ⇐⇒ it can be
written as the sum:

Xt = bt + cWt + ∫_{[0,t]×{∥x∥>1}} x N(ds, dx) + limδ↓0 ∫_{[0,t]×{δ≤∥x∥≤1}} x [N(ds, dx) − ds ν(dx)]

The first part is a Brownian motion started at 0 with drift
coefficient b ∈ Rn and diffusion coefficient c ∈ Mn×m(R), where Wt
has state space Rm. The second component is a compound Poisson
process built from a PRM N over R+ × Rn whose mean measure
ds ν(dx) has ν satisfying Condition 3.0; it collects the jumps of norm
greater than 1. The final term is a limit of compensated compound
Poisson processes, designed to handle the singularities encountered
when extending the previous term to include the unit ball.
Lévy–Itô decomposition

Consider specifically the compensated Poisson process. Let:

Xtδ := ∫_{[0,t]×{δ≤∥x∥≤1}} x [N(ds, dx) − ds ν(dx)]

The claim is that

limδ↓0 Xtδ = Xt

almost surely, where Xt is some Lévy process. Almost sure
convergence is analogous to µ-almost-everywhere convergence of
µ-measurable functions over some measure space (E, E, µ).
Specifically, the condition can be stated as:

P({ω | limδ↓0 Xtδ(ω) = Xt(ω)}) = 1
Lévy–Itô decomposition

To get an intuition for this, consider that:

Xtδ = ∫_{[0,t]×{δ≤∥x∥≤1}} x N(ds, dx) − ∫_{[0,t]×{δ≤∥x∥≤1}} x ds ν(dx)

and, by Fubini's theorem,

Xtδ = ∫_{[0,t]×{δ≤∥x∥≤1}} x N(ds, dx) − (∫_{δ≤∥x∥≤1} x ν(dx)) t

Denote

Ytδ := ∫_{[0,t]×{δ≤∥x∥≤1}} x N(ds, dx)

aδ := ∫_{δ≤∥x∥≤1} x ν(dx)
Lévy–Itô decomposition

Hence we write:

Xtδ = Ytδ − aδ t

Now consider two weakened cases:

Condition 2.0 holds without 1.0: Then clearly limδ↓0 Ytδ = Yt exists
and, by linearity, can be added back to the prior term. This just
leaves the drift term: limδ↓0 aδ = a < ∞ by 2.0, which adds −at to
the process. Hence Xt is just a Poisson process with drift given by
a ∈ Rn.

Only Condition 3.0 holds: Ytδ → ∞ and aδ → ∞ as δ ↓ 0.
However, the two singularities cancel out, leaving some Xt which has
infinite variation over every interval (s, t) (Cinlar, 2011).
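
A small numerical illustration of conditions 1.0 and 2.0, using for concreteness the measure ν(dx) = e^{−x}/x dx on (0, 1] (the α = λ = 1 case of the gamma Lévy measure introduced later): the mass ν({δ ≤ x ≤ 1}) blows up as δ ↓ 0, while aδ stays bounded.

% nu(dx) = exp(-x)/x dx: nu([delta,1]) diverges as delta -> 0,
% but a^delta = int_delta^1 x nu(dx) converges (Condition 2.0).
g = @(x) exp(-x) ./ x;                              % Levy density
for delta = [1e-1, 1e-2, 1e-4, 1e-8]
    mass = integral(g, delta, 1);                   % nu({delta <= x <= 1})
    a_delta = integral(@(x) x .* g(x), delta, 1);   % a^delta
    fprintf('delta = %g: nu = %8.2f, a^delta = %.4f\n', delta, mass, a_delta);
end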
Characterising a Lévy process

A corollary of the Lévy–Itô decomposition is that the characteristic
function of a Lévy process Xt,

ψ(r) = E e^{i⟨r,Xt⟩},

has exponent

ln ψ(r) = t ( i⟨r, b⟩ − (1/2)⟨r, Σr⟩ + ∫_{∥x∥>1} (e^{i⟨r,x⟩} − 1) ν(dx) + ∫_{∥x∥≤1} (e^{i⟨r,x⟩} − 1 − i⟨r, x⟩) ν(dx) )

Hence the triplet (b, Σ, ν) completely characterises Xt, with
Σ = c cᵀ.
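
As a sanity check of this formula, one can compare the exponent with the empirical characteristic function of a simulated process. The sketch below is a 1-D illustration under assumed parameters (b = 0.3, Σ = 0.4, ν = 2 · N(0, 1), t = 1, r = 1.5); since this jump measure is finite and symmetric, the large/small-jump split collapses to a single ∫ (e^{irx} − 1) ν(dx) term.

% Compare the Levy-Khintchine exponent with log of the empirical
% characteristic function for a 1-D example (illustrative parameters).
b = 0.3; Sigma = 0.4; c_rate = 2; t = 1; r = 1.5;
% For nu = c_rate * N(0,1): int (e^{irx} - 1) nu(dx) = c_rate*(exp(-r^2/2) - 1)
exponent = t * (1i*r*b - 0.5*Sigma*r^2 + c_rate*(exp(-r^2/2) - 1));
% Simulate X_t = b*t + sqrt(Sigma)*W_t + compound Poisson jumps:
M = 5e4;                                          % number of samples
K = poissrnd(c_rate * t, M, 1);                   % jump counts on [0, t]
J = arrayfun(@(k) sum(randn(k, 1)), K);           % summed N(0,1) jumps
X = b*t + sqrt(Sigma*t)*randn(M, 1) + J;
empirical = log(mean(exp(1i*r*X)));
disp([exponent, empirical]);                      % should be close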
Gamma process
A Gamma process with state-space R+ is a Lévy process such that
it has no Wiener process component, has drift a = ∫_0^1 x ν(dx), and
its Poisson random measure N(ds, dx) has mean measure

ds ν(dx) := ds g(x) dx

with (for some α, λ ∈ R+):

g(x) := α e^{−λx} / x

Hence, it has characteristic triplet (a, 0, ν). Clearly ν does not
satisfy Condition 1.0. That is:

ν(R+) = ∫_{R+} 1 ν(dx)

= ∫_{R+} g(x) dx
Gamma process

= ∫_{R+} (α e^{−λx} / x) dx = ∞
However, it does satisfy Condition 2.0:

∫_{x≤1} x · (α e^{−λx} / x) dx = ∫_0^1 α e^{−λx} dx = α (1 − e^{−λ}) / λ < ∞

and

∫_{x>1} (α e^{−λx} / x) dx = ∫_1^∞ (α e^{−λx} / x) dx < ∞

Hence the process can be written as:

Xt := (∫_0^1 x ν(dx)) t + [ ∫_{[0,t]×R+} x N(ds, dx) − (∫_0^1 x ν(dx)) t ]
Gamma process
Clearly, the drift terms cancel out, leaving the pure jump Lévy
process:

Xt := ∫_{[0,t]×R+} x N(ds, dx)

To determine the marginal distributions, consider the exponent of
the characteristic function:

= t ∫_0^∞ (e^{irx} − 1) ν(dx)

= t ∫_0^∞ (e^{irx} − 1) (α e^{−λx} / x) dx

= t ln ( (λ / (λ − ir))^α ) = ln ( (λ / (λ − ir))^{αt} )

Hence, the marginal distributions are Xt ∼ Gamma(αt, λ), with λ a rate parameter.
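
The integral step above can be checked numerically; the values α = 2, λ = 1, t = 0.7, r = 1.3 below are arbitrary illustrative choices.

% Check: t * int_0^inf (e^{irx} - 1) * alpha*exp(-lambda*x)/x dx
%        = alpha * t * log(lambda / (lambda - i*r))
alpha = 2; lambda = 1; t = 0.7; r = 1.3;          % illustrative values
integrand = @(x) (exp(1i*r*x) - 1) .* alpha .* exp(-lambda*x) ./ x;
lhs = t * integral(integrand, 0, Inf);
rhs = alpha * t * log(lambda / (lambda - 1i*r));
disp([lhs, rhs]);                                  % should agree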
Gamma process
Moreover, the distribution of the increments can be determined by
observing the characteristic exponent of the difference Xt+δ − Xt:

= (t + δ) ∫_0^∞ (e^{irx} − 1) ν(dx) − t ∫_0^∞ (e^{irx} − 1) ν(dx)

= δ ∫_0^∞ (e^{irx} − 1) ν(dx)

And by the prior derivations,

= ln ( (λ / (λ − ir))^{αδ} )

Hence, Xt+δ − Xt ∼ Gamma(αδ, λ) and the distributions of the gamma
process are fully characterised.
MATLAB code for the simulation
n = 100; % Number of steps
t = linspace(0, 1, n+1); % Time vector from 0 to 1, divided into n intervals
% Define parameters of gamma distribution
alpha = 2;
lambda = 1;

% Initialize array tracking the realization


X = zeros(1, n+1);
% Set X_0 to 0. MATLAB arrays are 1-indexed, so X(1) holds X_0.
X(1) = 0;

% Loop as was previously described


for k = 2:n+1
delta_t = t(k) - t(k-1);
% Generate increment ~ Gamma(alpha*delta_t, lambda), where lambda is a
% rate; gamrnd takes shape and SCALE, so the scale is 1/lambda
A = gamrnd(alpha*delta_t, 1/lambda);
% Add increment
X(k) = X(k-1) + A;
end

% Plot
plot(t, X);
xlabel('Time');
ylabel('X(t)');
title('Realization of a Gamma Process');
Example realisation
Citations

Course textbook

Cinlar, E. (2011) Probability and stochastics. New York: Springer.
