Week 3 Handout



wi3425TU—Monte Carlo methods

L.E. Meester

Week 3


Week 3—Program for this week

1 Covariance and correlation in MC
2 Simulating derivatives: a good and a bad method
    Finite differences
    Method 1
    Method 2 (Common random numbers)
    Comparison
3 Points of attention for MC simulation (part)
4 Spotting errors and bias
    Spotting errors
    Bias

Don’t forget: do the weekly practice material and upload your
solutions, and we’ll provide you with feedback!

Covariance and correlation

Covariances and correlations play an important role in:

  Estimating derivatives (§15.4).
  Antithetic variates (Ch 21).
  Control variates (Ch 22).

Key rule:

  Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y).

The properties we use (details in MIPS Chapter 10):


Rules on variance and covariance

Covariance
  Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X] E[Y].

Correlation
  ρ(X, Y) = Cov(X, Y) / √( Var(X) Var(Y) ).

  Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y).
  Cov(rX + s, tY + u) = r t Cov(X, Y).
  Cov(X, Y) = ρ(X, Y) σ_X σ_Y.
  −1 ≤ ρ(X, Y) ≤ 1.
  ρ(rX + s, tY + u) = sign(rt) ρ(X, Y).
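
Aside (not in the original handout): these rules are easy to check by simulation.
A minimal MATLAB sketch, with an arbitrarily chosen correlation of about 0.8:

% Sketch: check Var(X+Y) = Var(X) + Var(Y) + 2 Cov(X,Y) on simulated data.
M  = 1e6;
Z1 = randn(M,1);  Z2 = randn(M,1);
X  = Z1;                              % X ~ N(0,1)
Y  = 0.8*Z1 + 0.6*Z2;                 % Y ~ N(0,1), correlated with X
C  = cov(X,Y);                        % 2x2 sample covariance matrix
rho = C(1,2) / sqrt(C(1,1)*C(2,2));   % sample correlation (cf. corrcoef)
lhs = var(X + Y);
rhs = var(X) + var(Y) + 2*C(1,2);
fprintf('rho = %.3f,  Var(X+Y) = %.3f,  rule gives %.3f\n', rho, lhs, rhs)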



MC for “Greeks” or: simulating derivatives

European call option: given expiry T, strike E and S(0) = S, the price is

  V(S) = E[ e^(−rT) Λ( S · e^((r − σ²/2)T + σ√T·Z) ) ],

where Λ(s) = max(s − E, 0) and Z ∼ N(0, 1).

For the execution of hedging strategies we need the delta:

  ∆ = ∂V/∂S.
(Background in Higham Chapter 8.)
Here we can just differentiate the Black-Scholes formula, but:
How can we determine ∆ if there is no pricing formula?
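
Aside: when a pricing formula is available, as here, ∆ follows by differentiation;
for the European call the Black-Scholes delta is Φ(d1) with
d1 = (ln(S/E) + (r + σ²/2)T) / (σ√T). A minimal MATLAB sketch (the parameter
values below are purely illustrative, not the ones behind the numerical results
later in this handout):

% Sketch: exact Black-Scholes delta of a European call (illustrative parameters).
S = 10; E = 9; r = 0.05; sigma = 0.2; T = 1;   % hypothetical values
d1    = (log(S/E) + (r + 0.5*sigma^2)*T) / (sigma*sqrt(T));
delta = 0.5*erfc(-d1/sqrt(2));                 % Phi(d1), without needing normcdf
fprintf('exact delta = %.5f\n', delta)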


Finite difference approximation

We know

  ∂V/∂S = lim_{h→0} [V(S + h) − V(S)] / h.

Second order Taylor expansion for V(S + h):

  V(S + h) = V(S) + h ∂V/∂S + ½ h² ∂²V/∂S² (S*),

where S* is between S and S + h. Requirement: V has continuous
second derivative.
So

  [V(S + h) − V(S)] / h = ∂V/∂S + ½ h ∂²V/∂S² (S*).


So:

  [V(S + h) − V(S)] / h = ∂V/∂S + R(h),

with

  R(h) = ½ h V''(S) + ½ h [ V''(S*) − V''(S) ].

Note here:
  the first term is linear in h;
  the second term goes to zero faster than h,
  because S* → S as h → 0 and so lim_{h→0} V''(S*) = V''(S) (as V'' is continuous).

This may be summarized as:

  “R(h) = O(h)” or “the error R(h) is big O of h.”


The two order symbols:

O(h): big O of h
A function f is big O of h at zero, denoted f(h) = O(h), if

  lim_{h→0} f(h)/h exists and is finite.

This means a constant c exists, so that for small h:

  f(h) = c · h + r(h),

where the error term r(h) goes to zero faster than h, i.e., is o(h).

o(h): small o of h
A function r is small o of h at zero, denoted r(h) = o(h), if

  lim_{h→0} r(h)/h = 0.

Summary: we have obtained

  ∂V/∂S = [V(S + h) − V(S)] / h + O(h), for h → 0,

a Taylor expansion for ∂V/∂S. This expansion is the basis for the
finite difference approximation

  ∂V/∂S ≈ ∆V/∆S = [V(S + h) − V(S)] / h,

where the error is of “order h,” roughly proportional to h.
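
Aside (not in the handout): the O(h) behaviour is easy to see numerically on a
test function whose derivative is known. A minimal MATLAB sketch:

% Sketch: the forward-difference error is roughly proportional to h.
f = @(x) exp(x);  x0 = 1;  exact = exp(x0);    % f'(x0) is known exactly
for h = [1e-1 1e-2 1e-3 1e-4]
    fd = (f(x0 + h) - f(x0)) / h;              % forward difference
    fprintf('h = %.0e   error = %10.3e   error/h = %.3f\n', h, fd-exact, (fd-exact)/h)
end
% error/h settles near f''(x0)/2, i.e., the error is about proportional to h.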



Delta by simulation (Method 1)

The finite difference approximation ∆ ≈ [V(S + h) − V(S)] / h also
works if V(S) and V(S + h) are estimated by simulation.
In addition to M we then also have to choose h.
Example: we want to estimate ∂V/∂S for

  V(S) = E[ W(S, Z) ],

where

  W(S, Z) = e^(−rT) Λ( S · e^((r − σ²/2)T + σ√T·Z) ).

Focus on the structure: W(S, Z) is a function of
a parameter S (= S0) and a standard normal rv Z.


The simulation plan:

Write

  V0 = E[ W(S, Z) ]  and  Vh = E[ W(S + h, Z) ].

The plan:
1 Choose h and M.
2 Estimate V0 from M replications of W(S, Z).
3 Estimate Vh from M replications of W(S + h, Z).
4 Calculate the estimate

  ∆̂ = (V̂h − V̂0) / h.
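
Aside: a minimal MATLAB sketch of Method 1 (illustration only, not the course
file DeltaCall1.m; the parameter values are hypothetical):

% Sketch of Method 1: V0 and Vh estimated from independent samples.
S = 10; E = 9; r = 0.05; sigma = 0.2; T = 1;   % hypothetical parameters
h = 0.01;  M = 1e6;
W  = @(s,z) exp(-r*T) * max(s.*exp((r - 0.5*sigma^2)*T + sigma*sqrt(T).*z) - E, 0);
W0 = W(S,   randn(M,1));                       % M replications of W(S,Z)
Wh = W(S+h, randn(M,1));                       % M independent replications of W(S+h,Z)
DeltaHat = (mean(Wh) - mean(W0)) / h;
se = sqrt( (var(W0) + var(Wh)) / M ) / h;      % s.e. of the estimate
fprintf('Method 1: DeltaHat = %.4f  (s.e. %.4f)\n', DeltaHat, se)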


Delta by simulation (1)—variance analysis

Since V̂0 and V̂h are independently simulated:

  Var(∆̂) = Var( (V̂h − V̂0)/h ) = (1/h²) [ Var(V̂0) + Var(V̂h) ].

Because h is small, W = W(S, Z) and W(S + h, Z) will not
differ much in distribution, so

  Var(V̂h) ≈ Var(V̂0) = Var(W)/M,

whence

  Var(∆̂) ≈ 2 Var(W) / (h² M).


  Var(∆̂) ≈ 2 Var(W) / (h² M) = (2/h²) Var(V̂0).

Conclusion:

  s.e.(∆̂) ≈ (1.4/h) · s.e.(V̂0)      (1.4 ≈ √2).

Implementation: DeltaCall1.m,
where, as usual, Var(W) is estimated from the simulation.
We are free to choose h and M:
  systematic error (bias) is smaller for small h;
  random error is smaller for larger M.

Question: Suppose we reduce h by a factor 10 in order to reduce
the systematic error; how should we adjust M in order to maintain
the random error at the same level?
We cannot make both errors small at the same time.


Delta by simulation (Method 2)

The approximation ∆ ≈ (Vh − V0) / h can also be written as

  ∆ ≈ E[ (Wh − W) / h ]   with Wh = W(S + h, Z), W = W(S, Z),

where W and Wh are determined by the same Z.

Note: W and Wh were simulated independently in Method 1.
New plan:
1 Choose h and M.
2 Estimate ∆ from M replications of (Wh − W)/h.
Name of this technique: common random numbers.
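
Aside: a minimal MATLAB sketch of Method 2 (illustration only, not the course
file DeltaCall2.m; same hypothetical parameters as in the Method 1 sketch). The
only change is that the same Z is reused for both evaluations, which is what
drives ρ(Wh, W) close to 1:

% Sketch of Method 2: common random numbers.
S = 10; E = 9; r = 0.05; sigma = 0.2; T = 1;   % hypothetical parameters
h = 0.01;  M = 1e6;
W  = @(s,z) exp(-r*T) * max(s.*exp((r - 0.5*sigma^2)*T + sigma*sqrt(T).*z) - E, 0);
Z  = randn(M,1);                               % ONE sample, used for both evaluations
Wh = W(S+h, Z);   W0 = W(S, Z);
D  = (Wh - W0) / h;                            % M replications of (Wh - W)/h
DeltaHat = mean(D);
se = std(D) / sqrt(M);
C  = corrcoef(Wh, W0);                         % rho(Wh, W) is typically close to 1
fprintf('Method 2: DeltaHat = %.4f  (s.e. %.4f),  rho = %.4f\n', DeltaHat, se, C(1,2))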


Delta by simulation (2)—variance analysis

 
  Var(∆̂₂) = (1/M) Var( (Wh − W)/h ) = Var(Wh − W) / (M h²).

Now,

  Var(Wh − W) = Var(Wh) + Var(W) − 2 Cov(Wh, W)
              ≈ 2 [ Var(W) − Cov(Wh, W) ].

Using

  Cov(Wh, W) = ρ(Wh, W) · σ_Wh · σ_W ≈ ρ(Wh, W) · Var(W)

we obtain

  Var(Wh − W) ≈ 2 Var(W) [ 1 − ρ(Wh, W) ].


Summarizing:

  Var(∆̂₂) ≈ 2 Var(W) [ 1 − ρ(Wh, W) ] / (h² M).

For method 1 we found:

  Var(∆̂₁) ≈ 2 Var(W) / (h² M).

Reduction: by a factor 1 − ρ(Wh, W).
For small h the correlation ρ(Wh, W) is close to 1, so
method 2 must be (a lot) better. DeltaCall2.m
Question: How many fewer replications needed if ρ = 0.95?
Answer: around 1/(1 − ρ) = 20 times fewer.



Some simulation results I

Output of DeltaCall2.m for various h; the variable confint holds the confidence
intervals, first for method 1, then for method 2:

M = 1000000
h = 0.1000
confint =
0.9312 0.9853
0.9600 0.9609
speedgain = 3.4954e+03

h = 0.0100
confint =
0.6668 1.2034
0.9560 0.9570
speedgain = 3.1376e+05


Some simulation results II

h = 1.0000e-03
confint =
-1.9381 3.4240
0.9556 0.9566
speedgain = 3.1038e+07

h = 1.0000e-04
confint =
-27.9828 25.6337
0.9556 0.9565
speedgain = 3.1002e+09


Delta—comparing method 1 and 2

Clear: method 1 is the wrong way, method 2 the right way.
The “wrong way” (method 1) and “right way” (method 2) columns
show the estimate and 95% confidence interval. Results for M = 10^6.

h        wrong way (1)         right way (2)
0.1       0.95823 ±  0.02704   0.96044 ± 0.00046
0.01      0.93514 ±  0.26831   0.95649 ± 0.00048
0.001     0.74295 ±  2.68104   0.95608 ± 0.00048
0.0001   -1.17454 ± 26.80824   0.95604 ± 0.00048

Presentation of results: this table is also a partial illustration of
PoAMC 2 and 3: numbers 2 and 3 of the list Points of attention for
Monte Carlo simulations (see Brightspace).



Points of attention for Monte Carlo simulations

The first 4 from the list. The others will come later.
1 Analyse your problem. Do this before you start
programming. Are there alternative ways to formulate it?
This may lead to alternative/better solutions and/or
simulation possibilities. You may not have to simulate at all.
2 Accuracy. Estimates should (where possible) always be
accompanied by an indication of their accuracy: standard
errors or confidence intervals (make sure the confidence level
is clear). The notation 3.12 ± 0.14 (s.e.) indicates an estimate
of 3.12 with a standard error of 0.14. If it is understood which
it is, 3.12 ± 0.14 or 3.12 (0.14) would suffice.


3 Significant digits. How many of the digits are significant
depends on the accuracy. In case of an estimate of
3.1237920388 with a standard error of 0.121298375 the 95%
confidence interval is about 3.1237920388 ± 0.2377448150.
This says that the estimate is accurate to about 0.24.
Everything from the third digit after the decimal point carries
no additional information and is therefore better omitted: the
answer 3.12 ± 0.24 (95%CI) contains all the useful
information.
A sensible rule of thumb: round the standard error (or
the ± part of the confidence interval) to two significant digits;
then state your estimate with the same number of digits as
the standard error. So if your estimate is 9823.34 and your
standard error 327.89 you would write: 9820 ± 330 (s.e.).
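
Aside: the rounding rule is easily automated. A minimal MATLAB sketch
(illustration only), using the example values above:

% Sketch: round the s.e. to two significant digits, report the estimate to match.
est = 9823.34;  se = 327.89;            % example values from the text
d = floor(log10(se)) - 1;               % decimal position of the 2nd significant digit
se_r  = round(se  / 10^d) * 10^d;       % -> 330
est_r = round(est / 10^d) * 10^d;       % -> 9820
fprintf('%g +/- %g (s.e.)\n', est_r, se_r)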


4 Report relevant parameters with your results. Always do
this if there are parameters whose values can be set at your
discretion, like the number of replications, the step size, or the
control variate parameter θ; one often experiments with them
while simulating.
Even if the first line of the code is M=1e3, this does not mean that
this was the value used for the reported results. The same holds for
the other parameters.



Hypothesis tests to uncover errors. Bias.

We want to check whether our delta simulation method produces
unbiased estimates. What prompted this worry?
For the case “wrong way” and h = 0.1 we find:
  ∆̂ = 0.95823 with (estimated) s.e. 0.01380.
For the case “right way” and h = 0.1 we find:
  ∆̂ = 0.96044 with (estimated) s.e. 0.00023.
In this case, we know what the answer should be: ∆ = 0.95577.

The t-test allows us to check whether it is plausible that the
simulated data have the claimed/desired expectation.
(We return to this in a few slides.)


Spotting simulation errors (with help of the CLT)

Suppose: I simulate 1000 realizations from a distribution and
I know it has expectation 2.
Assuming the CLT holds, the sample average X̄n
approximately has a N(2, σ²/n) distribution, σ² = Var(X).
Suppose: average = 1.930, (estimated) standard error = 0.013.
This means my produced outcome is

  (1.930 − 2)/0.013 = −0.070/0.013 = −5.4,

so, 5.4 s.e.’s below its expected value.
Analysis/Conclusions?!


Simulated average is 5.4 s.e.’s below its expected value.


It may have happened by chance, but this seems unlikely; the
p-value P(Z ≤ −5.4) measures how unlikely (Z ∼ N (0, 1)).
The alternative explanation: there is an error somewhere.
This can also be used to compare two independent simulation
results, using s.e.(difference) = √( [1st s.e.]² + [2nd s.e.]² ).
Rule of thumb:
  deviations of 1 or 2 s.e.’s are “normal;”
  3 is a bit suspect;
  4 highly suspect;
  5 or more: a definite sign something is wrong.
Before you go digging for errors:
run again with a different seed, perhaps with bigger M.
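
Aside: a minimal MATLAB sketch of this check (illustration only), using the
numbers from the example above:

% Sketch: how many s.e.'s is the simulated average from the known expectation?
xbar = 1.930;  se = 0.013;  mu0 = 2;    % values from the example
z = (xbar - mu0) / se;                  % here about -5.4
p = 0.5*erfc(abs(z)/sqrt(2));           % P(Z <= -|z|) for Z ~ N(0,1)
fprintf('z = %.1f,  one-sided p-value = %.1e\n', z, p)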



Bias: a difficult problem

Results for M = 10^6, exact value: ∆ = 0.95577.
Estimates and 95% confidence intervals:

h        wrong way (1)         right way (2)       “right” − exact
0.1       0.95823 ±  0.02704   0.96044 ± 0.00046   0.00466
0.01      0.93514 ±  0.26831   0.95649 ± 0.00048   0.00072
0.001     0.74295 ±  2.68104   0.95608 ± 0.00048   0.00031
0.0001   -1.17454 ± 26.80824   0.95604 ± 0.00048   0.00026
Bias: at h = 0.1 it is clearly visible for “right way” because the
difference between the estimate and the exact value is about 20
standard errors. At smaller h this is not so visible.
If the exact value is known, bias can be “proved” using a statistical test.


What to do if there is bias?

The analysis on the previous slide is only possible when we
know the exact answer. In most situations you will not know
more than “there is bias” (plus sometimes the sign).
However, we can obtain a bias “estimate”:
the change in the estimate, for example when going from h = 0.1 to
h = 0.01, corresponds in the previous example roughly to
the bias for the bigger h-value,
at least when this difference is a fair amount bigger than the
standard error (a worked illustration follows below).
Bias is very difficult to determine; we will return later to this
issue; see also PoAMC 9 and 10.
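
Aside, a worked illustration of this bias “estimate” using the table on the
previous slide: going from h = 0.1 to h = 0.01 the “right way” estimate changes
by 0.96044 − 0.95649 = 0.00395, which is indeed close to the actual bias at
h = 0.1 (about 0.0047 according to the table), and both are much larger than
the standard error of roughly 0.00024.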

