
LS 01: Introduction to time series analysis

Use of time series analysis in economics:
Empirical basis for economic theories / economic studies (GDP forecast, seasonally adjusted unemployment rate, business cycle analysis) / estimation of parameters for models of financial mathematics (returns, correlations) / forecasts of prices or returns (stocks, foreign exchange) / application in risk management / estimation of interest-rate term structures

What is time series analysis?
Ad hoc definition of a time series: time series are ordered sequences of observations (the values of economic time series are very likely to be interdependent) / Descriptive modeling: modeling of time series with components (trend, cycle, season) / Stochastic modeling: modeling the dependency structure of time series by stochastic processes / forecasting future values from the observations of a time series

3 components:
Trend: long-term development / business cycle: cyclical variation / season: seasonal fluctuations + residual component (random fluctuation): one-off and random influences whose timing and strength are difficult or impossible to estimate (weather, strikes, FX, machine failures etc.)
Simple stochastic model - random walk with drift

Yt = δ + Yt-1 + ut / ∆Yt = Yt - Yt-1 = δ + ut
Values of Y at time t depend on the values at time t-1. There is also a constant δ and a random fluctuation ut. The differences ∆Yt fluctuate randomly around the value δ (the expected value of ∆Yt). Repeated insertion into the model equation delivers Yt = Y0 + δ * t + (ut + ut-1 + ... + u1).
The ut are independent and identically distributed random variables with expected value 0 and variance σ². A variable with these properties is called white noise.

The illustration shows why the process is called a random walk with drift. The process consists of two components:
◦ Deterministic growth path (drift): Y0 + δ * t
◦ Accumulated disturbance variables (random walk): Σui
If δ ≠ 0, the expected value grows with time: E(Yt) = δ * t (= drift, for Y0 = 0). Drift -> the mean increases with time. (Simulation example in Excel.)
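Minimal simulation sketch (own addition, not from the course; Python/numpy instead of Excel or EViews) of the two components of a random walk with drift; δ, σ and T are assumed example values:

```python
# Simulate Yt = delta + Yt-1 + ut and split it into drift path + random walk.
import numpy as np

rng = np.random.default_rng(0)
T, delta, sigma, y0 = 200, 0.5, 1.0, 0.0

u = rng.normal(0.0, sigma, size=T)             # white noise: iid, E(ut) = 0, Var(ut) = sigma^2
drift_path = y0 + delta * np.arange(1, T + 1)  # deterministic growth path Y0 + delta*t
random_walk = np.cumsum(u)                     # accumulated disturbances: sum of the ui
y = drift_path + random_walk                   # Yt = Y0 + delta*t + sum(ui)

print(y[:5])
```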
Goals of the time series analysis: description of time series
Goal: to make internal relationships visible
Descriptive methods: graphics of the time series ("time series plot") / decomposition of the time series into components / lagged scatter plot (LS03) / correlogram (LS03)

Time series plot: provides information about trends, seasonal effects, variability and "outliers". Usual representation: X: time and Y: measured variable.

Properties of the time series plot
Misinterpretations depending on: choice of scales (transformed? units?) / choice of section (zero point? start? end?)
A time series plot is a unique realization of a random process: the next realization can reject hypotheses / only under certain conditions (LS03) can one-off realizations be considered representative.
Conclusion: time series plots are important analytical tools, but they should be used with caution.
The "ai" are called weights. The smoothed series is
identified as a trend. The deviations from the
Modeling of time series - goal: understanding &
stochastic residual term. Disadvantage: Trend is not
defining model relationships
described by the functional equation.
(Given a time series y1, y2, y3, ...)
Linear filtering: simple moving average
Methods: Descriptive modeling (classic, "older" time
series analysis) / Stochastic modeling: specification of
If the additive, linear filter applies ...
a stochastic process (Yt) t = 1,2,3, ...
p = q (symmetry) and sum of the "ai" = 1
Process: Theoretical considerations, analysis of
(standardization)
empirical data to uncover the laws, definition of the
Sum of "ai" = sum of all weights
model type, estimation of the model parameters,
Same weights: ai = 1 / (2p + 1) = 1 / (2q + 1)
testing of the model quality
... it is simply the same
average
Forecast with time series - goal: forecast of
unknown values with associated forecast intervals
The basis is an estimated and validated model
Procedure of an analysis The remaining time series "Residual" only contains
seasonality and remaining term. The number of weights is odd n = 2p + 1 = 2q + 1
The number of weights "n" means length or Order of
the filter is lower.
At the edges p = q moving average values cannot be
calculated
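Hedged sketch (own addition) of such a polynomial trend regression with monthly dummy variables, using statsmodels OLS on simulated data; all numbers and the seasonal shape are made up:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(1)
n = 120                                        # 10 years of monthly data
t = np.arange(1, n + 1, dtype=float)
month = pd.Series((np.arange(n) % 12) + 1)     # month index 1..12

# simulated series: cubic trend + monthly seasonality + noise
y = (300 + 0.08 * t + 1e-4 * t**2 - 5e-8 * t**3
     + 5 * np.sin(2 * np.pi * (month - 1) / 12) + rng.normal(0, 1, n))

X = pd.DataFrame({"t": t, "t2": t**2, "t3": t**3})
dummies = pd.get_dummies(month, prefix="m", drop_first=True).astype(float)  # 11 dummies
X = sm.add_constant(pd.concat([X, dummies], axis=1))

res = sm.OLS(y, X).fit()
print(res.params[["const", "t", "t2", "t3"]])      # b0, b1, b2, b3
print("Durbin-Watson:", durbin_watson(res.resid))  # should be close to 2
```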

Trend determination through filtering / smoothing

Filtering a time series (moving average)
A trend can be estimated by filtering the time series. Prerequisite: the seasonal fluctuation has been adjusted for or does not exist. For filtering, a new time series is generated with an additive, linear filter. The "ai" are called weights. The smoothed series is identified as the trend; the deviations from it form the stochastic residual term. Disadvantage: the trend is not described by a functional equation.

Linear filtering: simple moving average
If the additive, linear filter satisfies p = q (symmetry) and sum of the "ai" = 1 (standardization; sum of the "ai" = sum of all weights) with equal weights ai = 1/(2p + 1) = 1/(2q + 1), it is simply the ordinary moving average. The number of weights is odd: n = 2p + 1 = 2q + 1. The number of weights "n" is called the length or order of the filter. At the edges, the first and last p = q moving-average values cannot be calculated.

Example: real rate of return with moving average
3 different filters: p = 1 => n = 3 / p = 5 => n = 11 / p = 11 => n = 23
n = "length", depends on p; the larger, the smoother.

General moving average
The position of the weights can be asymmetrical. In addition, the weights can differ from one another, with the standardization remaining (p not equal to q = asymmetry, and "sum ai" = 1).
Warning: simple moving averages are distorted by seasonal fluctuations. In order to eliminate seasonal fluctuations, the length of the moving average must match the cycle length, e.g. quarterly data = 4-element moving average.

Moving average with even order / centered filtering
A moving average for quarterly data has even order -> the average cannot be assigned to a single point in time. Remedy: use an odd number of time series values for the calculation; the two outer time series values are weighted with the factor ½. The weights at the edges center the window at the observed point in time t. Example with quarterly data.

Simple exponential filtering
α denotes the smoothing parameter. Rule of thumb: α is often between 0.1 and 0.3. Depending on the choice of α, past observations are weighted differently: the smaller α, the less weight the "more recent" points in time receive (but they still receive the most); the larger α, the less weight the "less current" points in time receive (the most recent point always receives the most).

Example: exponential filtering of a real rate of return with EViews and α = 0.2
EViews "Smoothing method": trend and seasonality can be included in exponential smoothing (Holt-Winters method). At the beginning you don't know whether it is additive or multiplicative: test both and select the variant where the "Root Mean Squared Error (RMSE)" is lower.

The remaining time series ("Residual") only contains the seasonality and the remainder term. (A code sketch of these filters follows below.)
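Sketch (own addition, pandas instead of EViews) of the filters above: an odd-order centered moving average, a centered 2x4 filter for quarterly data (outer weights ½), and simple exponential smoothing with α = 0.2; the series is simulated:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
y = pd.Series(np.cumsum(rng.normal(0, 1, 60)))   # some quarterly toy series

# simple moving average, p = 1 => n = 3 equal weights 1/3; centered,
# the p outermost values at each edge stay undefined (NaN)
ma3 = y.rolling(window=3, center=True).mean()

# quarterly data: order 4 is even, so center by averaging two adjacent
# 4-term means -> weights (1/8, 1/4, 1/4, 1/4, 1/8), centered on t
ma2x4 = y.rolling(window=4).mean().rolling(window=2).mean().shift(-2)

# simple exponential smoothing: S_t = alpha*y_t + (1 - alpha)*S_{t-1}
alpha = 0.2
ses = y.ewm(alpha=alpha, adjust=False).mean()

print(pd.DataFrame({"y": y, "ma3": ma3, "ma2x4": ma2x4, "ses": ses}).head(8))
```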

LS 03: Introduction to stochastic modeling

Static vs. dynamic models - example: demand for radios
Demanded quantity Q of a product depending on price P and income Y.
Static model: current price and current income determine demand: Qt = β1 + β2Pt + β3Yt + ut
Dynamic model: current price and income from the previous period determine demand: Qt = β1 + β2Pt + β3Yt-1 + ut
Dynamic, autoregressive model: current price and demand from the previous period determine demand: Qt = β1 + β2Pt + β3Qt-1 + ut

Comparison static - dynamic: arguments for dynamic models
Independent variables in static models have an immediate effect: the adjustment of the dependent variable to the independent ones is completed within the current period (the process always seems to be in equilibrium). Static models are often unsuitable because: 1) activities are determined by the past, 2) actors in economic processes react with a delay.

Important properties of dynamic models
Lag structure: independent variables enter delayed ("lagged"). This shows the delayed effect of the regressors on the dependent variable. Stochastic models are usually dynamic models.

Stochastic processes - definition
Stochastic process = infinite, temporally ordered sequence of random variables: (Yt, t = -∞, ..., +∞). (Simplified, mostly started at time t = 0.) The process (Yt, t = 0, 1, 2, 3, ...) is also called the data-generating process of the time series y0, y1, y2, ...
Process in discrete time: data is collected at equally spaced times / process in continuous time: continuous measurement.

Time series as a stochastic model - definition
Time series = sequence of n observations yt. The observations are the realization of the random variables Y1, Y2, ..., Yn. The sequence is a section of the underlying (infinite) stochastic process. Measuring times can be discrete or continuous (here: discrete). The measured values can be discrete or continuous (here: continuous). Example of a continuous-valued time series in discrete time: daily maximum of share prices.

Comparison of random samples
Cross-sectional data: independence of the observations is required; then the parameter estimates are BLUE.
Time series: the random variables Yi are dependent. Parameter estimation can be distorted (only one observation per point in time). So that the time series values can be used for parameter estimation, the time series must be stationary.

Description of time series with moments (3), if stationary:
1) Expected value = mean of the time series: μt = E(Yt)
2) Spread of the time series: σt² = Var(Yt) = Cov(Yt, Yt)
3) Correlation of delayed values: ρk = Cov(Yt, Yt-k) / Var(Yt)

Error term as a stochastic process
Error term = description of the uncertainty in time series. Reasons: 1) modeling of the effects of unobserved / unobservable variables, 2) measurement errors, 3) genuine randomness
-> Often modeled as white noise

White noise - the simplest stochastic process

Strong white noise: the error term is independently and identically distributed (expected value = 0, "mean value horizontal", and variance = σε², constant). Warning: "normally distributed" white noise is a special case! The distribution need not be Gaussian (a Gaussian curve instead of, e.g., a flat one).
-> Illustrative check for stationarity: mean values and variances constant over time.

Weak white noise (relevant for us)
Present if the εt are serially uncorrelated. They may have dependencies in third or higher moments and need not be completely independent.

Special stochastic process - random walk with drift

Yt = δ + Yt-1 + ut (δ a constant and ut white noise)
Determination of expected value and variance (with constant δ = 0 and Y0 = 0): substitution by repeated insertion gives Yt = ut + ut-1 + ... + u1 (δ drops out); then apply the calculation rules for E and Var.

OLS regression with EViews:
Problem: everything is OK except the Durbin-Watson statistic (0.56), which indicates a problem. Reason: trend. If time series are not stationary, the usual test procedures provide distorted estimators; this leads to spurious regression.
Stationarity is an important prerequisite for the analysis; if a series is not stationary -> make it stationary! (Remove the trend.)

Weakly stationary process

Weakly stationary (also covariance-stationary) if:
- μt = E(Yt) is constant (strictly: for all t, independent of time)
- the autocovariance depends only on k (the distance); it follows that the variance is constant
In short: weakly stationary if the first and second moments do not depend on time. (E.g. strong white noise is also weak white noise.)
Types of instationarities:
Trends (not mean-stationary), heteroscedasticity (not variance-stationary), or both; long periodicities (business cycles).

How can stationary time series be recognized?
Consideration of the time series plots (danger of deception) / autocorrelation and partial autocorrelation function of the time series / tests for stationarity (LS05) / window technique (are the variabilities of different "cut-out" windows significantly different?)

Lagged scatter plot (e.g. logarithmic lynx data)
Is a linear dependence between Yt and Yt-1 recognizable? Y axis: original data / X axis: the data shifted by one time unit. In a year with few lynxes, there are also few in the following year.

Autocorrelation function (AC function)
Measures the dependence between Yt and Yt-k (EViews provides the value).
Rules of thumb for the AC function of a stationary process:
at least n = 30 data points / only lags with k < n/4 (purists) or k < n/2 (pragmatists) / only lags k < 10 * log10(n). Equal distances and no gaps! (Holidays are a problem.)

Partial autocorrelation function (PAC)
Measures the linear relationship between Yt and Yt-k after eliminating the effects of the intermediate variables Yt-1 to Yt-k+1. The coefficient φkk measures the partial effect after eliminating the information contained in the other delayed variables.

Correlogram pattern for white noise:
Goal: within the band -> not significantly different from zero (95%).
Q-Stat & Prob (H0: ρ1 = ... = ρk = 0, i.e. white noise)
Prob > 0.05 -> retain H0 (example: lag 8 to lag 12; white noise (not significantly different from zero)).
---> Prob everywhere 0.000 -> no white noise.
Attention: identical correlograms can come from different time series! Be careful when interpreting correlograms.

Patterns for periodicity and non-stationary time series: (see figures)

For the n-1 possible empirical autocorrelations of a finite series, the following sum applies (without proof): Σ rk = -1/2.
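Sketch (own addition): empirical AC/PAC values and the Ljung-Box Q-statistic, i.e. the same quantities an EViews correlogram shows; the statsmodels calls are real, the white-noise data is simulated:

```python
import numpy as np
from statsmodels.tsa.stattools import acf, pacf
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(3)
y = rng.normal(size=200)              # white noise for comparison

k = int(10 * np.log10(len(y)))        # rule of thumb: only lags k < 10*log10(n)
print(acf(y, nlags=k))                # autocorrelations r0..rk
print(pacf(y, nlags=k))               # partial autocorrelations
# Q-Stat: H0: rho1 = ... = rhok = 0 (white noise); Prob > 0.05 -> retain H0
print(acorr_ljungbox(y, lags=[k]))
```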
Practice LS03

Calculation rules: expected value
E(a + b * X + c * Y) = a + b * E(X) + c * E(Y), with E(X) = μx and E(Y) = μy

Calculation rules: variance
Var(X) = E((X - E(X))²)
Var(X) = E(X²) - (E(X))² (shift theorem for variances)
Var(X) = E(X²) if E(X) = 0
Var(X) ≥ 0
Var(X ± Y) = Var(X) + Var(Y) (X and Y uncorrelated)
Var(X ± Y) = Var(X) + Var(Y) ± 2 * Cov(X, Y) (X & Y correlated)
Var(a * X ± b * Y) = a² * Var(X) + b² * Var(Y) ± 2 * a * b * Cov(X, Y)

Calculation rules: covariance
Cov(X, Y) = E((X - E(X)) * (Y - E(Y)))
Cov(X, Y) = 0 (X & Y uncorrelated)
Cov(X, X) = Var(X)
Cov(a + b * X, c + d * Y) = b * d * Cov(X, Y)
Cov(X + Y, Z) = Cov(X, Z) + Cov(Y, Z)

Calculation of expected value, variance and autocorrelation
Given: random walk with drift, Yt = δ + Yt-1 + ut (with Y0 = 0)

Calculation of the expected value:
Y1 = δ + Y0 + u1 -> Y2 = δ + Y1 + u2 = δ + (δ + Y0 + u1) + u2 ---> ... ---> Yt = t * δ + ut + ut-1 + ... + u1
E(Yt) = E(t * δ + ut + ut-1 + ... + u1) = E(t * δ) + E(ut) + E(ut-1) + ... + E(u1) = t * δ + 0 + ... + 0 = t * δ

Calculation of the variance:
Var(Yt) = Var(t * δ + ut + ut-1 + ... + u1) = Var(t * δ) + t * Var(u) = 0 + t * σu² -> Var(Yt) = t * σu²

Calculation of the autocovariance:
Cov(Yt, Yt-k) = E((Yt - E(Yt)) * (Yt-k - E(Yt-k)))
= E((t * δ + ut + ... + u1 - t * δ) * ((t-k) * δ + ut-k + ... + u1 - (t-k) * δ))
= E((ut + ut-1 + ... + u1) * (ut-k + ut-k-1 + ... + u1))
Because Cov(ut, ut-j) = σu² for j = 0 and 0 for j ≠ 0:
Cov(Yt, Yt-k) = (t - k) * σu²

Interpretation of the lagged scatterplot
Between times t and t+s, LOGLYNX and LOGLYNX_1 move in the same direction (they fall together); therefore there is a positive correlation between the two variables. LOGLYNX_5 increases between the same times; therefore there is a negative correlation between LOGLYNX and LOGLYNX_5.
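Quick numerical check (own sketch) of the derived moments E(Yt) = t * δ and Var(Yt) = t * σu²; the parameter values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
n_paths, t, delta, sigma_u = 100_000, 50, 0.2, 1.0

u = rng.normal(0.0, sigma_u, size=(n_paths, t))
Y_t = t * delta + u.sum(axis=1)       # Yt = t*delta + ut + ... + u1 (Y0 = 0)

print(Y_t.mean(), t * delta)          # ~10.0 vs. 10.0
print(Y_t.var(), t * sigma_u**2)      # ~50.0 vs. 50.0
```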
LS04: Regression between time series / ARMA models

Regression model for time series
The time series Yt is influenced by q further time series Xt(1), Xt(2), ..., Xt(q). Regression model:
Yt = β0 + β1Xt(1) + β2Xt(2) + ... + βqXt(q) + Et

Autocorrelated residuals
In contrast to models without a time series character, time series models often (almost always) have autocorrelated residuals. Reasons: the lag structure of the variables is not correctly specified / a variable is not taken into account - if successive values of the missing variable are correlated, the residuals of the model that ignores this variable are correlated too.

Impact of autocorrelated residuals: ordinary OLS estimators remain unbiased, but they no longer have minimal variance (there are more precise estimators) / the standard errors of the coefficients βi are estimated with bias / this means that t-tests and confidence intervals are inaccurate (t depends on the standard deviation, and this is distorted).

Detection of autocorrelations
Graphical residual analysis (scatter plot) / analysis of AC or PAC / Durbin-Watson (lag 1) / Breusch-Godfrey (lag p) / Box-Pierce and Ljung-Box (lag p) - also known as Portmanteau tests.
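Sketch (own addition) of this detection toolkit applied to an OLS fit with simulated AR(1) errors; durbin_watson, acorr_breusch_godfrey and acorr_ljungbox are actual statsmodels functions, the data and the choice p = 2 are assumptions:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson
from statsmodels.stats.diagnostic import acorr_breusch_godfrey, acorr_ljungbox

rng = np.random.default_rng(5)
n = 300
x = rng.normal(size=n)
e = np.zeros(n)
for i in range(1, n):                      # AR(1) errors: E_t = 0.7*E_{t-1} + u_t
    e[i] = 0.7 * e[i - 1] + rng.normal()
y = 1.0 + 2.0 * x + e

res = sm.OLS(y, sm.add_constant(x)).fit()
print("DW (lag 1, ~2 = uncorrelated):", durbin_watson(res.resid))
lm, lm_pval, fval, f_pval = acorr_breusch_godfrey(res, nlags=2)   # lag p = 2
print("Breusch-Godfrey p-value:", lm_pval)
print(acorr_ljungbox(res.resid, lags=[10]))                       # Portmanteau test
```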
Activities against autocorrelated residuals:
1) Adding additional explanatory variables (substantive remedy)
2) Consideration of the lag structure of the explanatory variables (substantive remedy)
3) Generalized least squares estimator (GLS) -> correction of the standard errors (treats only the symptoms), e.g. Prais-Winsten estimator
4) Modeling and incorporating the correlated errors as an AR(p) model (treats only the symptoms), e.g. Cochrane-Orcutt method
Only use GLS if 1) & 2) are fulfilled.

Tests for autocorrelation of the error terms

Durbin-Watson test: values near 2 indicate that the error terms are uncorrelated. Problem: Durbin-Watson only tests the first autocorrelation; if the error terms follow an AR(p) process with p > 1, the test fails.
Application of the Durbin-Watson test using an example:
Durbin-Watson without correction: 3.3 / Durbin-Watson with correction (Cochrane-Orcutt): approx. 2

Breusch-Godfrey test
More general than the Durbin-Watson test. The value of p must be determined before the test (correlogram; you have to know it beforehand -> experiment a bit).
Starting point: Et = φ1 * Et-1 + ... + φp * Et-p + ut
(AR(p) process of the error terms <-> autocorrelation)
Null hypothesis H0: φi = 0, i = 1, ..., p
Alternative hypothesis HA: φi ≠ 0 for at least one i
Here: with p = 2, H0 is rejected because the test is significant: 1st-order autocorrelation.
Example: Breusch-Godfrey test with p = 4 ---> the residuals are autocorrelated (2nd order, lag 2). Correction (symptom treatment): Cochrane-Orcutt model with lag 2.

Example with forced autocorrelations:
The coefficient β is well estimated and remains unbiased. The standard error is distorted.
Estimation without / with consideration of the correlated errors (Cochrane-Orcutt): the coefficients are comparable and the estimators unbiased; the standard errors differ clearly - taking the correlated errors into account (Cochrane-Orcutt) provides smaller "s.e." (approx. 1/2).
Cochrane-Orcutt method for autocorrelated error terms (special case: AR(1) model for the error)

Model for the errors: Et = φ * Et-1 + ut (1st-order autocorrelation)
Regression model of a time series, for example: Yt = β0 + β1Xt(1) + β2Xt(2) + Et
Forming the difference: use Y*t = Yt - φYt-1 and insert the model for Yt and Yt-1:
Y*t = β0 + β1Xt(1) + β2Xt(2) + Et - φ(β0 + β1Xt-1(1) + β2Xt-1(2) + Et-1)
= β0(1 - φ) + β1(Xt(1) - φXt-1(1)) + β2(Xt(2) - φXt-1(2)) + Et - φEt-1
where Et - φEt-1 = ut
= β*0 + β1X*t(1) + β2X*t(2) + ut (new model; the autocorrelation is gone, only white noise remains)
The following applies: β*0 = β0(1 - φ), X*t(1) = Xt(1) - φXt-1(1), X*t(2) = Xt(2) - φXt-1(2)
The transformed model is free from autocorrelation and meets the requirements of the general regression model. Disadvantage: φ must be known in order to calculate it (estimated in EViews).
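Hedged sketch (own addition): statsmodels' GLSAR with iterative_fit performs a Cochrane-Orcutt-type iteration (estimate φ from the residuals, transform the data, re-fit); the data is simulated:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 300
x = rng.normal(size=n)
e = np.zeros(n)
for i in range(1, n):
    e[i] = 0.8 * e[i - 1] + rng.normal()   # AR(1) error: E_t = phi*E_{t-1} + u_t
y = 1.0 + 2.0 * x + e

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()
gls = sm.GLSAR(y, X, rho=1).iterative_fit(maxiter=10)   # rho=1: one AR lag

print("estimated phi:", gls.model.rho)
print("OLS s.e.:", ols.bse)       # distorted under autocorrelated errors
print("GLSAR s.e.:", gls.bse)     # corrected standard errors
```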

AR(p) process - definition
Autoregressive process -> the variable is regressed on itself; a model for the dependence of Yt on its own past.
AR(1) process: Yt = α + φYt-1 + ut (φ plays the role of the regression coefficient β)
AR(p) process: Yt = α + φ1Yt-1 + φ2Yt-2 + ... + φpYt-p + ut
The error term is white noise. A negative sign of φ leads to stronger swings. φ = 1 -> no longer stationary (stationary for |φ| < 1; φ = 1 is the limit).

Lag operator and characteristic polynomial, AR(p)
A lag operator is often used to simplify the notation of an AR(p) process. The lag operator L shifts the index of a variable by 1 into the past: LYt = Yt-1.
Multiple application of L corresponds to a multiple shift: LsYt = Yt-s (special case identity I: L0 = I -> L0Yt = Yt)
AR(p) model in the notation with the lag operator:
Yt = α + φ1Yt-1 + φ2Yt-2 + ... + φpYt-p + ut
Yt = α + φ1LYt + φ2L²Yt + ... + φpL^p Yt + ut
Yt = α + (φ1L + φ2L² + ... + φpL^p)Yt + ut
(1 - φ1L - φ2L² - ... - φpL^p)Yt = α + ut
Characteristic polynomial: Φ(L) = 1 - φ1L - φ2L² - ... - φpL^p, so Φ(L)Yt = α + ut

MA(∞) representation and moments of AR(1) processes
By recursive insertion, the AR(1) representation Yt = α + φYt-1 + ut results in the MA(∞) representation (algebraic conversion):
Yt = α / (1 - φ) + Σ φ^i * ut-i (sum of error terms)
-> decaying influence of past disturbance terms ut-i
Moments (both constant; |φ| < 1 is the prerequisite for stationarity): E(Yt) = α / (1 - φ) and Var(Yt) = σ² / (1 - φ²)

Autocorrelations of an AR(1) process
Can be expressed by the model parameters: ρk = φ^k
(For positive φ, all theoretical autocorrelations are > 0.) Although all theoretical autocorrelations are positive, some estimates will be negative: be careful when interpreting correlograms.

MA(q) process - definition (moving average)
Yt is a weighted sum of the error terms.
MA(1) process: Yt = α + ut + θut-1
MA(q) process: Yt = α + ut + θ1ut-1 + θ2ut-2 + ... + θqut-q
The error term is white noise.

Example MA(1) process / properties
The time series is shifted upwards on average (by α). A negative sign of θ -> stronger swings. The time series is also stationary for θ = +/- 1.

AR(∞) representation (inversion) and moments of an MA(1) process
By recursive insertion, the MA(1) representation Yt = α + ut + θut-1 results in the AR(∞) representation, which shows the decaying influence of the past values Yt-i.
Moments: E(Yt) = α and Var(Yt) = σ² * (1 + θ²) -> the variance remains finite.

Autocorrelation of an MA(1) process:
Autocovariances and correlations can be expressed through the model parameters:
γ0 = Var(Yt) = σ² * (1 + θ²)
γ1 = Cov(Yt, Yt-1) = σ² * θ
γk = Cov(Yt, Yt-k) = 0 for k > 1 (all theoretical autocovariances are 0)
From this follows for the autocorrelations: ρ1 = θ / (1 + θ²); for k > 1, all autocorrelations are 0.

Lag operator and characteristic polynomial, MA(q)
A lag operator is often used to simplify the notation of an MA(q) process.
MA(q) model in the notation with the lag operator:
Yt = α + ut + θ1ut-1 + θ2ut-2 + ... + θqut-q
Yt = α + ut + θ1Lut + θ2L²ut + ... + θqL^q ut
Yt = α + (1 + θ1L + θ2L² + ... + θqL^q)ut
Characteristic polynomial: Θ(L) = 1 + θ1L + θ2L² + ... + θqL^q, so Yt = α + Θ(L)ut

ARMA(p, q) process
Union of AR(p) and MA(q): an MA(q) process for the error term of an AR(p).
ARMA(p, q) process:
Yt = α + φ1Yt-1 + φ2Yt-2 + ... + φpYt-p + ut + θ1ut-1 + θ2ut-2 + ... + θqut-q
-> The error term (ut) is white noise.
Notation with the characteristic polynomials: Φ(L)Yt = α + Θ(L)ut

Properties of ARMA processes
Every stationary AR(p) process can be written as an MA(∞) process.
Every invertible MA(q) process can be written as an AR(∞) process.
Every stationary process can be approximated with an ARMA process (!)
ARMA(p, q) representations are usually not unique: the same process can be represented with different combinations of p and q. Basically:
- Modeling with small p & q (0 - 4) simplifies the analysis
- AR representations are more suitable for the estimation of parameters, since the OLS assumptions are fulfilled
- MA representations are more suitable for the calculation of variances and covariances

Properties of the ACF and PACF of AR(p) and MA(q)
ACF of AR(p): does not cut off; damped exponential and/or sine waves. PACF of AR(p): cuts off after lag p.
ACF of MA(q): cuts off after lag q. PACF of MA(q): does not cut off; damped, exponential or sine function. A real root of the characteristic polynomial gives an exponential function; a complex root gives a sine function.

Theoretical ACF and PACF of different ARMA(1,1) processes (figures):
AR φ1 = 0.8 and MA θ1 = 0.3 (1st picture ACF, 2nd picture PACF)
AR φ1 = 0.3 and MA θ1 = 0.8 (1st picture ACF, 2nd picture PACF)
AR φ1 = -0.8 and MA θ1 = 0.3 (1st picture ACF, 2nd picture PACF)

Examples of EViews estimates:
Time series is not stationary due to a trend: Durbin-Watson statistic conspicuous (far from 2), parameters estimated with errors, model is not OK (trend).
Time series is stationary: model OK, parameter estimates OK (no trend).
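Sketch (own addition): simulate an ARMA(1,1) process and re-estimate the parameters with statsmodels; φ = 0.8 and θ = 0.3 are taken from the first figure caption above, everything else is assumed:

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.arima.model import ARIMA

phi, theta = 0.8, 0.3
ar = np.r_[1, -phi]          # AR polynomial Phi(L) = 1 - phi*L
ma = np.r_[1, theta]         # MA polynomial Theta(L) = 1 + theta*L
y = ArmaProcess(ar, ma).generate_sample(nsample=500)

res = ARIMA(y, order=(1, 0, 1)).fit()   # ARMA(1,1) = ARIMA with d = 0
print(res.params)                       # const, ar.L1 ~ 0.8, ma.L1 ~ 0.3, sigma2
```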
LS05 - Trends and Unit Root Test / ARIMA Models

Deterministic vs. stochastic trend
Fundamental differences:
Deterministic trend: the regressor "time" determines the increase over time. The uncertainty in the system is described by a stationary disturbance. (Yt = α + β * t + εt)
Stochastic trend: the shocks εt have persistent effects. The accumulated shocks (sum of the εt) lead to the trend (random walk with drift).

Deterministic trend (D):
Yt = f(t) + ut, ut i.i.d.
f(t): function of time t. Frequently a linear trend:
Yt = β0 + β1 * t + ut; β0 and β1 determine the growth path
Moments of a linear trend:
E(Yt) = E(β0 + β1 * t + ut) = β0 + β1 * t
Var(Yt) = Var(β0 + β1 * t + ut) = σu²
A time series with a linear trend is non-stationary (non-constant mean). It is variance-stationary and trend-stationary (stationary after removal of the deterministic trend).

Stochastic trend: random walk with drift (C)
Yt = δ + Yt-1 + ut, ut i.i.d.
Representation as an MA(∞) process: Yt = Y0 + δ * t + Σui
Y0 and δ determine the growth path (drift)
Moments:
E(Yt) = Y0 + δ * t
Var(Yt) = t * σu²
The random walk with drift is not stationary (not constant mean, not constant variance); neither variance- nor trend-stationary.

Stochastic trend: random walk (B)
Yt = Yt-1 + ut
Representation as an MA(∞) process: Yt = Y0 + Σui
Moments:
E(Yt) = Y0 (mean-stationary, i.e. constant on average)
Var(Yt) = t * σu²
The random walk is not stationary (the variance is not constant).

Borderline case AR(1) process (A)
Yt = α + φYt-1 + ut
Representation as an MA(∞) process with decreasing weights; moments constant. The AR(1) process is stationary if |φ| < 1; if φ goes to 1, it becomes unstable.

Comparison: A: AR(1), B: random walk, C: random walk with drift, D: linear trend

Other features of a random walk (RW)
Autocorrelation: for fixed k, Yt and Yt-k are the more strongly correlated, the larger t is. For increasing k, ρk converges to zero; as t increases, this convergence slows down. (Figure: Y axis: autocorrelation / X axis: t = time; k = distance between two time series elements. The bigger k, the "worse" the elements know each other; the autocorrelation increases with increasing time -> converges to 1 for all distances.)
A random walk has a long memory: a RW drifts long in one direction without returning to the mean of the time series -> trend.

Comparison of the MA(∞) representations:
RW (with drift): constant weights: Yt = Y0 + δ * t + Σui
AR(1): decreasing weights: Yt = α / (1 - φ) + Σ φ^i * ut-i

Elimination of a trend - types of stationarity
Goal: a non-stationary time series with trend can be transformed into a stationary time series by eliminating the trend. 2 options: subtraction of a deterministic trend, or differencing (formation of differences).
A non-stationary time series is called:
Trend-stationary, if removing the deterministic trend (e.g. linear trend Yt = β0 + β1 * t + ut) ...
Difference-stationary, if differencing (example: random walk Yt = δ + Yt-1 + ut) ...
... transforms it into a stationary model.

Example of a trending time series: traffic counting
EViews: Method Least Squares. Linear trend: Ft = β0 + β1 * t + ut -> Ft = 207'143.3 + 1133.867 * t
The remaining time series ("Residual") only contains the seasonality and the remainder term; it is used further, for example to model the seasonality.

Test for stationarity: unit root test
Is a time series stationary? If a zero of the characteristic polynomial (for AR(1): Φ(z) = 1 - φz = 0 <=> z = 1/φ) has the value 1, the process is not stationary. The solution of the zero of a polynomial is commonly referred to as a "root"; if a zero has the value 1, it is called a "unit root". The unit root test helps in deciding which model choice is correct.

Example of a differenced time series: random walk with drift
∆Yt = δ + ut is white noise around δ -> the differenced time series is stationary (white noise is stationary); the original time series is difference-stationary.

Example of a differenced time series: GDP growth
Original time series = difference-stationary. New time series = growth rate (log difference) compared to the same quarter of the previous year. EViews: gdp_log = log(gdp) - log(gdp(-4)) // Yt = (1 - L⁴) log(BIPt) = log(BIPt) - log(BIPt-4) (BIP = GDP). The new time series is stationary.

Integrated time series
A stochastic process Yt is called integrated of order d if it has to be differenced d times to become stationary. Notation: Yt ~ I(d) ("the process is integrated of order d").
Yt ~ I(1): unit root process (integrated of order 1), stationary after a single difference formation. Example: RW with drift: Yt = δ + Yt-1 + ut; ∆Yt = Yt - Yt-1 = δ + ut -> stationary.
Yt ~ I(0): the process is already stationary. Example AR(1): Yt = α + φYt-1 + ut with |φ| < 1 is already stationary.
The formation of differences is written with the difference operator ∆: ∆Yt = Yt - Yt-1. Multiple application of ∆ corresponds to multiple differencing. Example of the second difference:
∆²Yt = ∆(∆Yt) = ∆(Yt - Yt-1) = ∆Yt - ∆Yt-1 = Yt - 2Yt-1 + Yt-2
Relationship with the lag operator L: ∆^d Yt = (1 - L)^d Yt

The consequences of spurious regression are more dramatic than those of modeling differences of a trend-stationary process. Note: the analysis of differences describes (short-term) changes; the information about the behavior of the process in equilibrium is not captured. Outlook: cointegration makes it possible to avoid this disadvantage. (* HAC: Heteroscedasticity and Autocorrelation Consistent)

ARIMA model
Extension of the ARMA(p, q) model: an ARIMA(p, d, q) process is a process whose d-th difference follows an ARMA(p, q).
Implementation in EViews in two ways (e.g. ARIMA(2,1,1)): form the first difference of the original time series, or introduce the dependent variable directly into the model as a difference (D(y,1)).
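Sketch (own addition): the augmented Dickey-Fuller (ADF) test is a standard unit root test; here applied to a simulated random walk with drift before and after differencing:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(8)
y = np.cumsum(0.1 + rng.normal(size=300))   # Yt = 0.1 + Yt-1 + ut  ->  I(1)

# level (trending series -> include a trend term); H0: unit root
print("ADF p-value, level:", adfuller(y, regression="ct")[1])   # typically large
# first difference: Delta Yt = delta + ut, stationary -> y ~ I(1)
print("ADF p-value, 1st difference:", adfuller(np.diff(y))[1])  # typically ~0
```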
LS06: Time series with stochastic volatility

Introduction - a simple model for equity returns

Return on a share - discrete return ("return"):
rt = (St - St-1) / St-1, where St = current share price
For statistical analysis it is assumed that rt is normally distributed. This leads to two problems:
1) A normally distributed random variable takes values between minus infinity and plus infinity, but rt is capped at -1 to plus infinity (-1 if St = 0 and St-1 ≠ 0).
2) Multi-period returns are not normally distributed, even if the single-period returns are normally distributed (follows from the probability calculus).
-> The problems can largely be solved with continuous returns, since these are closer to normally distributed.

Continuous ("log return") versus discrete returns
Continuous return on a share ("log return"), assumed normally distributed: rt = ln(St / St-1)
For small price changes, the differences between discrete and continuous returns are small. If the discrete return lies in the interval (-0.10; +0.10), the relative deviation between continuous and discrete return is at most approx. +/- 5%.

Stochastic model for continuous returns
Prerequisite: the share price follows a random walk or a random walk with drift. Consequences for the modeling of continuous returns:
-> The continuous return is a constant μ plus an error term.
-> The error term is white noise.
Model for the continuous return on a share:
ln(St / St-1) = rt = μ + εt
μ = 0 -> random walk
μ = const. ≠ 0 -> random walk with drift
εt white noise with E(εt) = 0 and Var(εt) = σε²
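Small sketch (own addition) comparing discrete and continuous returns on assumed prices:

```python
import numpy as np

S = np.array([100.0, 105.0, 99.0, 101.0])   # assumed share prices St
r_disc = S[1:] / S[:-1] - 1                 # rt = (St - St-1) / St-1
r_log = np.log(S[1:] / S[:-1])              # rt = ln(St / St-1)
print(r_disc)   # 0.0500  -0.0571  0.0202
print(r_log)    # 0.0488  -0.0588  0.0200   (close for small changes)
```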
3 stylized facts of financial time series - 1. Vola clusters
Volatility is subject to fluctuations: temporal accumulations of large price swings. Example: UBS share (2003 - 2010).

3 stylized facts of financial time series - 2. Leverage effect
Negative relationship between return and change in volatility. Reason: falling share prices lead to a higher level of corporate leverage, which leads to increased uncertainty and thus to higher volatility. Real time series show asymmetrical reactions: negative news influences vola more than positive news. (Figure: distance from the 1-year moving average in the DJ since 1900, in percent.)

3 stylized facts of financial time series - 3. Leptokurtosis
Leptokurtosis - narrow peak and broad flanks ("fat tails"): small and large returns are overrepresented.
Example UBS share (2003 to 2010): kurtosis 23.7, min -10.7, max 10.1
Comparison with white noise: kurtosis -0.033, min -3.3, max 3.4
Conclusion for the UBS example:
Accumulation of small returns around zero (kurtosis 23.7 vs. -0.033 for white noise).
More extreme swings (min -10.7, max 10.1 vs. -3.3, 3.4 for white noise).

Conclusion on the stylized facts:
1. & 2. can be handled by the models below; 3. remains unsolved (it can only be measured).
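Sketch (own addition): excess kurtosis of a fat-tailed return series vs. Gaussian white noise; the t-distributed series is only a stand-in for real returns:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(9)
fat = pd.Series(rng.standard_t(df=3, size=2000))   # fat-tailed stand-in "returns"
wn = pd.Series(rng.normal(size=2000))              # white noise comparison

# pandas .kurt() reports excess kurtosis: 0 for a normal distribution
print("excess kurtosis, fat-tailed:", round(fat.kurt(), 1))   # clearly > 0
print("excess kurtosis, white noise:", round(wn.kurt(), 2))   # approx. 0
```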

Consequences for the choice of model

The model must be able to represent stochastic vola (= variance)
-> Model with a separate stochastic process for the variance
-> ARCH model for discrete-time series (ARCH = autoregressive conditional heteroscedasticity)

Further developments:
GARCH: Generalized ARCH: extension to an ARCH(∞) process
EGARCH: Exponential GARCH, for the leverage effect
TGARCH: Threshold GARCH: asymmetrical effects
(alphabet soup)
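Hedged sketch using the third-party Python package "arch" (pip install arch; arch_model is its actual API); a GARCH(1,1) fit on simulated placeholder returns - with real data, percentage returns are usually passed in:

```python
import numpy as np
from arch import arch_model

rng = np.random.default_rng(10)
returns = rng.normal(0, 1, 1000)     # placeholder return series

model = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1)
res = model.fit(disp="off")
print(res.params)                    # mu, omega, alpha[1], beta[1]
```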

Conditional and unconditional variance - ARCH model

ARCH models explicitly distinguish conditional & unconditional variance.
Example AR(1) process:
Yt = α + φYt-1 + εt (εt i.i.d. with E(εt) = 0 and Var(εt) = σε²)
Unconditional variance: variance of the process "in the long run", without conditioning on realized information: Var(Yt) = σε² / (1 - φ²)
Conditional variance: variance given the information available at time t, i.e. when Yt-1, ..., Y1 have been realized: Var(Yt | pt-1) = Var(εt) = σε², where pt-1 = joint distribution of Yt-1, ..., Y1.
In the AR(1) process, the conditional variance at time t is driven by the disturbance εt alone. In general, the conditional variance is a random variable; with ARCH models, the conditional variance is modeled as a function of time.

Simulated AR(1) process: Yt = 0.3 + 0.6 * Yt-1 + εt
Error term white noise: εt i.i.d. with E(εt) = 0 and Var(εt) = σε² = 1/3 = 0.33
Unconditional expected value: Eu(Yt) = α / (1 - φ) = 0.3 / (1 - 0.6) = 0.75
Unconditional variance: Varu(Yt) = σε² / (1 - φ²) = 0.33 / (1 - 0.6²) = 0.52

Example of conditional and unconditional variance:
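Numerical check (own sketch) of the unconditional moments of the simulated AR(1) example above:

```python
import numpy as np

rng = np.random.default_rng(11)
alpha, phi, n = 0.3, 0.6, 200_000
eps = rng.normal(0.0, np.sqrt(1 / 3), size=n)   # Var(eps) = 1/3

y = np.zeros(n)
for t in range(1, n):
    y[t] = alpha + phi * y[t - 1] + eps[t]

burn = y[1000:]                                 # drop the burn-in
print(burn.mean())   # ~ 0.75 = alpha / (1 - phi)
print(burn.var())    # ~ 0.52 = (1/3) / (1 - phi^2)
```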
