
QUANTITATIVE RISK MANAGEMENT 

INDEX 
1. Returns and log-returns  
2. Definition of losses 
3. Definition of Value-at-Risk (VaR) 
4. VaR formula for normally distributed losses 
5. VaR formula for normally distributed profits 
6. Definition of Expected Shortfall (ES or CVaR) 
7. Expected shortfall formula for normally distributed losses 
8. Simple linear regression formula 
9. Multiple linear regression formula 
10. Formula for OLS estimator of β 
11. Partial t-test and overall F-test testing problems (hypothesis) 
12. Interpretation of R2 and adjusted R2 
13. Interpretation of the information criteria in linear regression estimation 
14. Definition of a stationary process, white noise process 
15. Autocovariance and autocorrelation function, properties 
16. AR(1) formula, stationarity of the model 
17. Random walk model 
18. AR(p), MA(1), MA(q), ARMA(1,1), ARMA(p,q) 
19. Stationarity of ARMA(p,q) 
20. Yule-Walker equation for AR(p) model (matrix form) 
21. Yule-Walker equation for σ2 
22. Interpretation of the information criteria in ARMA model estimation 
23. Minimum squared error predictor of stationary process 
24. Formula for ARCH(1), ARCH(q), GARCH(p,q) and stationarity condition 
The sequence of random variables {Xt, t ∈ Z} is called a random process, while its respective realizations {xt, t ∈ Z} are called a time series.
When we talk about random processes, two topics acquire a specific importance:
- The issue of the stationary process. A random process is considered (weakly) stationary if the following conditions are respected:
1. The expected value of Xt is equal to m for all t ∈ Z. In other words, the mean does not depend on t.
2. The expected value of the squared absolute value of Xt is finite for all t ∈ Z:
E|Xt|² < +∞.
3. The autocovariance function depends only on the lag between the two time points, not on their position in time:
γX(r, s) = γX(r + t, s + t) for all r, s, t ∈ Z
Under these conditions the covariance of Xt+h and Xt depends only on h, so we can write
Cov(Xt+h, Xt) = γh,
and in particular the variance Var(Xt) = γ0 is the same constant σ² for every t.
The autocovariance function γh and the autocorrelation function ρh = γh/γ0 (our ACF) satisfy the following properties:
o γ0 needs to be higher than (or at least equal to) 0;
o γh needs to be equal to γ−h => the function is SYMMETRIC;
o the absolute value of γh needs to be lower than (or at most equal to) γ0;
o the absolute value of ρh is lower than or equal to 1. Why? Because in the formula of our AUTOCORRELATION FUNCTION the numerator is, in absolute value, at most equal to the denominator (by the Cauchy-Schwarz inequality).

γ0 ≥ 0  γh = γ−h  |γh| ≤ γ0  |ρh| ≤ 1
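As a quick sanity check, these properties can be verified numerically on sample estimates. The sketch below (plain Python; the helper names are my own, not from the notes) computes the biased sample autocovariance γ̂h, which is symmetric in h by construction, and confirms that γ̂0 ≥ 0 and |ρ̂h| ≤ 1 on simulated data:

```python
import random

def sample_autocovariance(x, h):
    """Biased sample autocovariance: (1/n) * sum of (x_{t+h} - m)(x_t - m)."""
    n = len(x)
    h = abs(h)  # impose gamma(h) = gamma(-h), mirroring the symmetry property
    m = sum(x) / n
    return sum((x[t + h] - m) * (x[t] - m) for t in range(n - h)) / n

def sample_acf(x, h):
    """Sample autocorrelation: rho_hat(h) = gamma_hat(h) / gamma_hat(0)."""
    return sample_autocovariance(x, h) / sample_autocovariance(x, 0)

random.seed(0)
x = [random.gauss(0.0, 1.0) for _ in range(1000)]  # any sample series works here

gamma0 = sample_autocovariance(x, 0)
assert gamma0 >= 0                                   # gamma_0 >= 0 (it is a variance)
assert sample_autocovariance(x, 3) == sample_autocovariance(x, -3)  # symmetry
assert all(abs(sample_acf(x, h)) <= 1 for h in range(1, 20))        # |rho_h| <= 1
```

The divisor n (rather than n - h) makes the estimator biased but guarantees |γ̂h| ≤ γ̂0, so the bound |ρ̂h| ≤ 1 holds exactly in the sample, just as it does for the population quantities.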

- The issue of the white noise process. First of all, we need to say that white noise is denoted εt because it is another name for the errors/residuals of a model.
The process {εt, t ∈ Z} is said to be white noise with mean 0 and variance σ², written {εt} ∼ WN(0, σ²), if the following properties are respected:
o The expected value of εt needs to be equal to 0 and its variance equal to σ²;
o The covariance of any two elements εs and εt is equal to 0 whenever s ≠ t. This rule is valid whatever s and t we take.
E(εt) = 0  Var(εt) = σ²  Cov(εs, εt) = 0 for all s ≠ t
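A minimal simulation illustrates these two properties (assuming Gaussian draws here for convenience, although white noise is not required to be Gaussian): the sample mean is close to 0, the sample variance is close to σ², and the lagged sample covariances are close to 0 at every lag h ≠ 0.

```python
import random

random.seed(42)
n = 10_000
sigma = 1.0
eps = [random.gauss(0.0, sigma) for _ in range(n)]  # draws meant to mimic WN(0, sigma^2)

mean = sum(eps) / n
var = sum((e - mean) ** 2 for e in eps) / n

def lagged_cov(x, h):
    """Sample covariance between x_{t+h} and x_t; ~0 for white noise when h != 0."""
    m = sum(x) / len(x)
    return sum((x[t + h] - m) * (x[t] - m) for t in range(len(x) - h)) / len(x)

print(f"mean ~ {mean:.3f}, var ~ {var:.3f}")        # near 0 and sigma^2
for h in (1, 5, 10):
    print(f"lag {h}: cov ~ {lagged_cov(eps, h):.3f}")  # all near 0
```

With n = 10,000 draws the sampling noise in each estimate is on the order of 1/√n, so the printed values sit visibly close to their theoretical targets of 0, σ², and 0.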
