Monte Carlo Sampling For Random Differential Equations: Master INVESTMAT 2017-2018 Unit 4
Random Differential Equations and Applications Monte Carlo sampling and R.D.E.’s 1
Aims of Monte Carlo method
Monte Carlo sampling (MCS) is a statistical sampling method that was popularized by
physicists from Los Alamos National Laboratory in the USA in the 1940s.
In the following we will explain how this method works when dealing with random
differential equations (r.d.e.’s), although it can be applied in many different contexts,
such as:
Approximating definite integrals.
Numerical optimization.
MCS relies on simulation of both random variables (r.v.’s) and stochastic processes
(s.p.’s). Hence, to better understand MCS we first need to introduce some
preliminaries about random number generation, since what MCS basically requires is
generating values of r.v.’s that follow specific probability distributions.
Part I
Pseudorandom number generation: sampling r.v.’s
The easiest method to generate random numbers with a fixed probability distribution
is based on the following idea:
Assume that X is a continuous r.v. with d.f. FX (x) which is continuous and strictly
increasing. Then X = FX⁻¹(U), U ∼ Un([0, 1]), has d.f. FX, since for each fixed x,
−∞ < x < +∞, one gets P[FX⁻¹(U) ≤ x] = P[U ≤ FX (x)] = FX (x).

Example 1: Sampling an exponential r.v. X ∼ Exp(λ)
Since FX (x) = 1 − exp(−λx) for x > 0, inverting u = FX (x) yields

X = −(1/λ) log(1 − U) ≡ X = −log(U)/λ, U ∼ Un([0, 1]).
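As a sketch of the inverse transformation method, the exponential case can be coded as follows (Python is used here for illustration, although the course exercises use Mathematica):

```python
import math
import random

def sample_exponential(lam, u=None):
    """Sample X ~ Exp(lam) by inverting F_X(x) = 1 - exp(-lam*x).

    Solving u = F_X(x) gives x = -log(1 - u)/lam; since 1 - U and U
    are both Un([0, 1]), x = -log(u)/lam is an equivalent sampler.
    """
    if u is None:
        u = random.random()
    return -math.log(1.0 - u) / lam

# Empirical check: the sample mean should approach E[X] = 1/lam.
random.seed(0)
mean = sum(sample_exponential(2.0) for _ in range(100_000)) / 100_000
```

The same pattern (draw U uniform, apply FX⁻¹) works for any continuous, strictly increasing d.f.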
Example 2: Sampling a standard Gaussian r.v. X ∼ N(0; 1)
In this case:

FX (x) = (1/√(2π)) ∫_{−∞}^{x} exp(−y²/2) dy ⇒ FX⁻¹(u) is very difficult to compute!

A rational approximation can be used instead:

FX⁻¹(u) ≈ −( y + (p0 + p1 y + p2 y² + p3 y³ + p4 y⁴) / (q0 + q1 y + q2 y² + q3 y³ + q4 y⁴) ), y = √(−2 log(u)), 0 < u ≤ 0.5.
The case 0.5 < u < 1 is handled by symmetry. The coefficients pi and qi , 0 ≤ i ≤ 4 are
given by
p0 = −0.322232431088, q0 = 0.099348462606,
p1 = −1, q1 = 0.588581570495,
p2 = −0.342242088547, q2 = 0.531103462366,
p3 = −0.0204231210245, q3 = 0.10353775285,
p4 = −0.0000453642210148, q4 = 0.0038560700634.
Exercise 2: Testing the quality of the inverse distribution method for sampling a
standard Gaussian r.v.
Test the method described above with Mathematica.
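A minimal Python sketch of this rational approximation, using the coefficients tabulated above; the exact d.f. (via the error function) is used only to check the quality of the approximation:

```python
import math

# Coefficients p0..p4 and q0..q4 from the table above.
P = (-0.322232431088, -1.0, -0.342242088547,
     -0.0204231210245, -0.0000453642210148)
Q = (0.099348462606, 0.588581570495, 0.531103462366,
     0.10353775285, 0.0038560700634)

def gauss_inverse_df(u):
    """Rational approximation of F_X^{-1}(u) for X ~ N(0; 1), 0 < u < 1."""
    if u > 0.5:                      # symmetry: F^{-1}(u) = -F^{-1}(1 - u)
        return -gauss_inverse_df(1.0 - u)
    y = math.sqrt(-2.0 * math.log(u))
    num = sum(p * y ** i for i, p in enumerate(P))
    den = sum(q * y ** i for i, q in enumerate(Q))
    return -(y + num / den)

def std_normal_df(x):
    """Exact standard Gaussian d.f., used to test the approximation."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
```

Composing the exact d.f. with the approximate inverse should return u up to the accuracy of the rational approximation.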
Exercise 3: Generating a Poisson r.v.
The following procedure describes the generation of values, {Yn}, sampled from a
Poisson distribution of parameter γ > 0, Yn ∼ Po(γ) (hence E[Yn] = γ). For that goal,
first remember that, for y > 0,

FY (y) = P[Y ≤ y] = exp(−γ) ∑_{k=0}^{m} γ^k / k!, m ≤ y < m + 1.
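Since FY is a step function, the inverse transformation idea amounts to returning the smallest m with FY (m) > u. A Python sketch of this procedure (the slide leaves the step-by-step description implicit, so the loop below is an assumption consistent with the d.f. above):

```python
import math
import random

def sample_poisson(gamma, u=None):
    """Sample Y ~ Po(gamma): return the smallest m with F_Y(m) > u,
    accumulating the terms exp(-gamma) * gamma^k / k! recursively."""
    if u is None:
        u = random.random()
    m = 0
    term = math.exp(-gamma)   # P[Y = 0]
    cdf = term
    while u >= cdf:
        m += 1
        term *= gamma / m     # P[Y = m] = P[Y = m-1] * gamma / m
        cdf += term
    return m

# Empirical check: the sample mean should approach E[Y] = gamma.
random.seed(1)
mean = sum(sample_poisson(3.0) for _ in range(50_000)) / 50_000
```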
Box-Muller

U1, U2 ∼ U(0, 1) independent ⇒

X1 = √(−2 ln(U1)) cos(2πU2),
X2 = √(−2 ln(U1)) sin(2πU2),

⇒ X1, X2 ∼ N(0; 1) independent.
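A direct Python transcription of the Box-Muller map:

```python
import math
import random

def box_muller(u1, u2):
    """Map two independent Un(0, 1) draws to two independent N(0; 1) draws."""
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2.0 * math.pi * u2), r * math.sin(2.0 * math.pi * u2)

# Empirical check of the standard Gaussian moments.
random.seed(2)
draws = []
for _ in range(50_000):
    x1, x2 = box_muller(random.random(), random.random())
    draws.extend((x1, x2))
mean = sum(draws) / len(draws)
var = sum(x * x for x in draws) / len(draws)
```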
Exercise 4 (continuation): Two additional methods for generating standard Gaussian
r.v.’s: Box-Muller and Polar-Marsaglia methods
Polar-Marsaglia

It avoids trigonometric evaluations:

U1, U2 ∼ U(0, 1) independent ⇒ V1 = 2U1 − 1, V2 = 2U2 − 1 ⇒ V1, V2 ∼ U(−1, 1) independent ⇒

⇒ Z = √((V1)² + (V2)²) ⇒

if Z > 1 ⇒ U1, U2, V1, V2 are recomputed,

if 0 < Z < 1 ⇒ X1 = (V1/Z) √(−4 ln(Z)), X2 = (V2/Z) √(−4 ln(Z)) ⇒ X1, X2 ∼ N(0; 1) independent

⇒ µ + σ Xi ∼ N(µ; σ²), i = 1, 2, independent.
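The rejection loop above can be transcribed as follows:

```python
import math
import random

def polar_marsaglia(rng=random):
    """Two independent N(0; 1) draws without trigonometric evaluations."""
    while True:
        v1 = 2.0 * rng.random() - 1.0    # V1, V2 ~ Un(-1, 1) independent
        v2 = 2.0 * rng.random() - 1.0
        z = math.sqrt(v1 * v1 + v2 * v2)
        if 0.0 < z < 1.0:                # if Z > 1, U1, U2, V1, V2 are recomputed
            factor = math.sqrt(-4.0 * math.log(z)) / z
            return v1 * factor, v2 * factor

# Empirical check of the standard Gaussian moments.
random.seed(3)
draws = []
for _ in range(25_000):
    draws.extend(polar_marsaglia())
mean = sum(draws) / len(draws)
var = sum(x * x for x in draws) / len(draws)
```

Note that −4 ln(Z) = −2 ln(Z²), so this is the usual polar method written in terms of the radius Z rather than the squared radius.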
Exercise 5: The linear congruential generator method
All the above generator methods (Inverse Transformation; Poisson; Box-Muller;
Polar-Marsaglia, etc.) are based on generating numbers uniformly distributed on
[0, 1]. In this exercise we propose to study a popular method to generate numerical
values of a r.v. uniformly distributed on [0, 1]. The method is based on the following
congruence

Xn+1 = (a Xn + c) mod (m), 0 ≤ Xn+1 ≤ m − 1,

Un = Xn / m for n = 0, 1, 2, . . . , 0 ≤ Un ≤ 1, ∀n ≥ 0.

For certain values of the parameters a, c and m the sequence Un may possess the
statistical properties of numbers that are uniformly distributed on [0, 1].
Exercise 5 (continuation): The linear congruential generator method
Linear congruential generators eventually repeat. If Xi+p = Xi, then the smallest such
value of p is called the cycle length or period of the generator. For a linear
congruential generator, p ≤ m.
The next result is a useful criterion to determine the cycle length of certain linear
congruential generators when c ≠ 0.
The period of a linear congruential generator is m if and only if the following three
conditions hold:
g.c.d.(c, m) = 1.
Every prime factor of m divides a − 1.
If 4 divides m, then 4 divides a − 1.
If c = 0 and m is a prime number, the longest possible period is m − 1. This is
achieved when a is a primitive root modulo m.
Exercise 5 (continuation): The linear congruential generator method
Using Mathematica, construct the following linear congruential generators for a
uniform r.v. on [0, 1] and compare the corresponding histograms with the ones
provided by a direct command of Mathematica:
1 For the case c = 0, take a = 7^5 = 16 807 and m = 2^31 − 1 = 2 147 483 647 (this is a
Mersenne prime), then one gets the congruential generator
Xn+1 = 16 807 Xn mod (2 147 483 647), Un = Xn / m.
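A Python sketch of this generator (the course does it in Mathematica); the value X10000 = 1 043 618 065, starting from X0 = 1, is the classical check value for this "minimal standard" generator:

```python
def lcg(seed, a=16807, c=0, m=2**31 - 1):
    """Yield U_n = X_n / m with X_{n+1} = (a*X_n + c) mod m."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

# Lewis-Goodman-Miller "minimal standard" generator: a = 7^5, c = 0, m = 2^31 - 1.
gen = lcg(seed=1)
us = [next(gen) for _ in range(10_000)]
mean = sum(us) / len(us)
```

Plotting a histogram of `us` should look close to the flat density of Un([0, 1]).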
Part II
Our initial pattern example
In our exposition we will consider the following initial value problem (i.v.p.):

Ẋ(t) = −αX(t), X(0) = β  ⇔  Ẋ(t, ω) = −α(ω)X(t, ω), X(0, ω) = β(ω), ω ∈ Ω,

where α = α(ω) and β = β(ω) are r.v.’s defined on a common probability space
(Ω, F, P).
Notice that α and β can be independent or dependent r.v.’s. Their statistical
dependence structure is given by their joint d.f.: Fα,β(a, b). If both are statistically
independent, this d.f. factorizes as Fα,β(a, b) = Fα(a)Fβ(b), where Fα(a) and Fβ(b)
are their respective individual (marginal) d.f.’s.
Goals
Solving a differential equation in the random context has a wider meaning than in the
deterministic scenario. In the deterministic case one seeks x(t); in the random case
one seeks the solution s.p. X(t, ω) = X(t) together with its main statistical functions:
expectation function: E[X(t)]
variance (or standard deviation) function: V[X(t)]
higher moments: E[(X(t))^n], n ≥ 3
covariance: C[X(t), X(s)]
1-p.d.f. and 1-d.f.
MCS procedure
1 Generate independent and identically distributed samples Z^(i) = (α^(i), β^(i)),
i = 1, 2, . . . , M, according to the joint distribution of α and β: Fα,β(a, b). Notice
that the dependence structure of α and β is required to be known.
2 For each i = 1, 2, . . . , M, solve the governing deterministic equation

Ẋ(t) = −α^(i) X(t), X(0) = β^(i),

and obtain:

X^(i)(t) := X(t, Z^(i)).

3 Estimate the required solution statistics. For example, the expectation and
variance of the solution s.p. can be estimated, respectively, by:

X̄M(t) = (1/M) ∑_{i=1}^{M} X(t, Z^(i)) ≈ E[X(t)],

σ²[XM(t)] = (1/M) ∑_{i=1}^{M} ( X(t, Z^(i)) − X̄M(t) )² ≈ V[X(t)].
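The three steps can be sketched in Python for the pattern problem Ẋ = −αX, X(0) = β, whose realizations are X(t, Z^(i)) = β^(i) exp(−α^(i) t). The choice α ∼ Exp(2) and β ∼ Un([0, 1]), independent, is an illustrative assumption, not fixed by the slides:

```python
import math
import random

def mcs_pattern_problem(M, t, seed=0):
    """Steps 1-3 of MCS for X'(t) = -alpha*X(t), X(0) = beta."""
    rng = random.Random(seed)
    realizations = []
    for _ in range(M):
        alpha = rng.expovariate(2.0)     # Step 1: sample Z^(i) = (alpha_i, beta_i)
        beta = rng.random()
        # Step 2: deterministic solve -> X(t, Z^(i)) = beta_i * exp(-alpha_i * t)
        realizations.append(beta * math.exp(-alpha * t))
    mean = sum(realizations) / M         # Step 3: estimate E[X(t)] ...
    var = sum((x - mean) ** 2 for x in realizations) / M   # ... and V[X(t)]
    return mean, var

# Exact expectation at t = 1: E[beta] * E[exp(-alpha)] = 0.5 * 2/(2 + 1) = 1/3.
mean, var = mcs_pattern_problem(M=100_000, t=1.0)
```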
Remarks
Other solution statistics can be estimated via proper schemes from the solution
ensemble {X^(i)}. It is obvious that Steps 1 and 3 are preprocessing and
postprocessing steps, respectively. Only Step 2 requires solution of the original
problem, and it involves repetitive simulations of the deterministic counterpart of the
problem.
MCS in practice: assuming that only the i.c. is random
Let us illustrate the solution of our particular i.v.p. via MCS with:
Expectation of the solution stochastic process (s.p.)
MCS expectation approximations against the exact expectation
Standard deviation of the solution s.p.
MCS standard deviation approximations against the exact one
MCS in practice: assuming that only the coefficient is random
α ∼ Exp(2), β = 1, M = 20, 0 ≤ t ≤ 3
About the quality of the approximations
Recall first the Central Limit Theorem setting: given i.i.d. r.v.’s X1, . . . , XM with
mean µ and standard deviation σ,

X̄M = (1/M) ∑_{i=1}^{M} Xi,   UM = √M (X̄M − µ)/σ.
About the quality of the approximations
Since {X(t, Z^(i))} are i.i.d. r.v.’s for each t, one gets:

X̄M(t) = (1/M) ∑_{i=1}^{M} X(t, Z^(i)) = µ + (σ/√M) UM,

where, by the Central Limit Theorem, UM converges in distribution to a standard
Gaussian as M → ∞; hence, for M large, X̄M(t) ∼ N(E[X(t)]; V[X(t)]/M),
approximately.
This leads to the widely adopted statement that the error convergence rate of MCS is
inversely proportional to the square root of the number of simulations, M.
About the quality of the approximations
Drawbacks of MCS
The O(1/√M) convergence rate of MCS is relatively slow. Roughly speaking:
if a one-digit increase in the accuracy of the solution statistics is required, one needs
to run roughly 100 times more simulations, and thus the computational burden grows
by a factor of 100.
For large and complex systems where the solution of a single deterministic realization
is time-consuming, this entails a tremendous numerical challenge.
About the quality of the approximations
Advantages of MCS
The O(1/√M) convergence rate is independent of the total number of input
r.v.’s. This turns out to be an extremely useful property that virtually no other
method possesses.
Implementation of MCS is simple.
MCS relies on deterministic methods to compute the (approximate) solutions of
every trajectory. This is good news, since a large number of powerful
deterministic methods to deal with differential equations are available.
Part III
Applications
Now we will present several illustrative examples where randomness enters into the
r.d.e. through different terms (i.c.’s, coefficients and/or source term) and considering
different probability distributions. This includes independent and dependent
structures among the involved random inputs. In addition, we will also consider
different ways to introduce statistical dependence, including the copula method.
Computations will be carried out with Mathematica.
We will consider the following initial value problem (i.v.p.):
Ẋ (t) = αX (t) + β ,
X (0) = γ,
The examples are:
1 γ is a r.v.: γ ∼ Un([0, 1]).
2 α is a r.v.: α ∼ Exp(λ = 2).
3 γ and α are independent r.v.’s: γ ∼ Un([0, 1]) and α ∼ N(−1; 1).
4 α and β are dependent r.v.’s: θ = (α, β) ∼ N2(µθ; Σθ), where

µθ = (−1, 1)ᵀ,   Σθ = [1 1; 1 4].
5 α and β are dependent r.v.’s generated by a kernel distribution from a sample.
6 α and β are dependent r.v.’s whose dependence structure is generated by a
copula (by Farlie–Gumbel–Morgenstern).
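As a sketch of how these examples are run, case 1 (only γ random) admits the closed-form realization X(t) = (γ + β/α) e^{αt} − β/α for α ≠ 0. The deterministic values α = −1 and β = 1 below are assumptions made for illustration, since the slides do not fix them:

```python
import math
import random

def solve_deterministic(alpha, beta, gamma, t):
    """Exact solution of X'(t) = alpha*X(t) + beta, X(0) = gamma (alpha != 0)."""
    return (gamma + beta / alpha) * math.exp(alpha * t) - beta / alpha

def mcs_example1(M, t, seed=0):
    """MCS for case 1: gamma ~ Un([0, 1]); alpha = -1, beta = 1 assumed."""
    rng = random.Random(seed)
    xs = [solve_deterministic(-1.0, 1.0, rng.random(), t) for _ in range(M)]
    mean = sum(xs) / M
    var = sum((x - mean) ** 2 for x in xs) / M
    return mean, var

# With these values X(t) = 1 + (gamma - 1) e^{-t}, so E[X(1)] = 1 - 0.5/e.
mean, var = mcs_example1(M=100_000, t=1.0)
```

The remaining cases only change Step 1: the samples (α^(i), β^(i), γ^(i)) are drawn from the corresponding joint distribution (bivariate Gaussian, kernel estimate, or copula).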