
An Introduction to Time Series

Ginger Davis

VIGRE Computational Finance Seminar
Rice University

November 26, 2003
What is a Time Series?
Time Series
Collection of observations indexed by the date of each observation: $(y_1, y_2, \ldots, y_T)$
Lag Operator
Represented by the symbol L: $L x_t = x_{t-1}$
Mean of $Y_t$: $E(Y_t) = \mu_t$
White Noise Process
Basic building block for time series processes: $\{\varepsilon_t\}$
$E(\varepsilon_t) = 0$
$E(\varepsilon_t^2) = \sigma^2$
$E(\varepsilon_t \varepsilon_\tau) = 0$ for $t \neq \tau$
White Noise Processes, cont.
Independent White Noise Process
Slightly stronger condition: $\varepsilon_t$ and $\varepsilon_\tau$ are independent for $t \neq \tau$
Gaussian White Noise Process
$\varepsilon_t \sim N(0, \sigma^2)$
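As a quick numerical illustration (not part of the original slides; the seed and sigma below are arbitrary), a Gaussian white noise series can be simulated and the three white-noise moment conditions checked empirically:

```python
import numpy as np

# Simulate Gaussian white noise eps_t ~ N(0, sigma^2) and estimate the
# defining moments (sigma chosen arbitrarily for illustration).
rng = np.random.default_rng(0)
sigma = 2.0
eps = rng.normal(0.0, sigma, size=100_000)

sample_mean = eps.mean()                    # estimates E(eps_t) = 0
sample_var = eps.var()                      # estimates E(eps_t^2) = sigma^2
lag1_cov = np.mean(eps[1:] * eps[:-1])      # estimates E(eps_t eps_{t-1}) = 0
```

With 100,000 draws the sample moments land very close to the population values 0, $\sigma^2$, and 0.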
Autocovariance
Covariance of $Y_t$ with its own lagged value:
$\gamma_{jt} = E(Y_t - \mu_t)(Y_{t-j} - \mu_{t-j})$
Example: Calculate autocovariances for $Y_t = \mu + \varepsilon_t$:
$\gamma_{jt} = E(Y_t - \mu)(Y_{t-j} - \mu) = E(\varepsilon_t \varepsilon_{t-j})$, which equals $\sigma^2$ for $j = 0$ and 0 otherwise
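The population autocovariance has a natural sample analogue. The helper below is an illustrative sketch (not from the slides), applied to the example $Y_t = \mu + \varepsilon_t$:

```python
import numpy as np

# Sample analogue of the autocovariance gamma_j = E(Y_t - mu)(Y_{t-j} - mu):
# gamma_hat_j = (1/T) * sum_t (y_t - ybar)(y_{t-j} - ybar).
def autocov(y, j):
    y = np.asarray(y, dtype=float)
    d = y - y.mean()
    T = len(y)
    return np.sum(d[j:] * d[:T - j]) / T

# For Y_t = mu + eps_t with sigma = 1, gamma_0 should be near 1
# and gamma_j near 0 for j >= 1.
rng = np.random.default_rng(0)
y = 5.0 + rng.normal(0.0, 1.0, size=100_000)
g0 = autocov(y, 0)
g1 = autocov(y, 1)
```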
Stationarity
Covariance-stationary or weakly stationary process
Neither the mean nor the autocovariances depend on the date t:
$E(Y_t) = \mu$ for all $t$
$E(Y_t - \mu)(Y_{t-j} - \mu) = \gamma_j$ for all $t$ and any $j$
Stationarity, cont.
2 processes:
1 covariance stationary, 1 not covariance stationary
$Y_t = \mu + \varepsilon_t$ (covariance stationary)
$Y_t = \beta t + \varepsilon_t$ (not covariance stationary: its mean depends on t)
Stationarity, cont.
Covariance stationary processes
Covariance between $Y_t$ and $Y_{t-j}$ depends only on j (length of time separating the observations) and not on t (date of the observation):
$\gamma_{jt} = \gamma_j$
Stationarity, cont.
Strict stationarity
For any values of $j_1, j_2, \ldots, j_n$, the joint distribution of $(Y_t, Y_{t+j_1}, Y_{t+j_2}, \ldots, Y_{t+j_n})$ depends only on the intervals separating the dates and not on the date itself
Gaussian Processes
Gaussian process $\{Y_t\}$
Joint density $f_{Y_t, Y_{t+j_1}, \ldots, Y_{t+j_n}}(y_t, y_{t+j_1}, \ldots, y_{t+j_n})$ is Gaussian for any $j_1, j_2, \ldots, j_n$
What can be said about a covariance stationary Gaussian process? (It is also strictly stationary, since a multivariate normal distribution is completely determined by its means and covariances.)
Ergodicity
A covariance-stationary process is said to be ergodic for the mean if
$\bar{y} = \frac{1}{T} \sum_{t=1}^{T} y_t$
converges in probability to $E(Y_t)$ as $T \to \infty$
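A minimal numerical sketch of ergodicity for the mean (illustrative values, not from the slides): for the stationary process $Y_t = \mu + \varepsilon_t$, the time average gets close to $E(Y_t) = \mu$ as $T$ grows.

```python
import numpy as np

# For Y_t = mu + eps_t, compute |ybar_T - mu| for increasing sample sizes.
rng = np.random.default_rng(1)
mu = 3.0
errs = []
for T in (100, 10_000, 1_000_000):
    y = mu + rng.normal(0.0, 1.0, size=T)
    errs.append(abs(y.mean() - mu))   # time average vs. ensemble mean
```

At $T = 1{,}000{,}000$ the discrepancy is tiny, consistent with convergence in probability.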
Describing the dynamics
of a Time Series
Moving Average (MA) processes
Autoregressive (AR) processes
Autoregressive / Moving Average (ARMA)
processes
Autoregressive conditional heteroscedastic
(ARCH) processes

Moving Average Processes
MA(1): First Order MA process
$Y_t = \mu + \varepsilon_t + \theta \varepsilon_{t-1}$
"moving average": $Y_t$ is constructed from a weighted sum of the two most recent values of $\varepsilon$.
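An MA(1) path is straightforward to simulate directly from the definition; the parameter values below are illustrative:

```python
import numpy as np

# Simulate Y_t = mu + eps_t + theta*eps_{t-1} (illustrative parameters).
rng = np.random.default_rng(2)
mu, theta, sigma, T = 1.0, 0.8, 1.0, 50_000
eps = rng.normal(0.0, sigma, size=T + 1)   # one extra draw supplies eps_{t-1} at t = 1
y = mu + eps[1:] + theta * eps[:-1]

# Sample mean and variance should be near mu and (1 + theta^2)*sigma^2.
sample_mean = y.mean()
sample_var = y.var()
```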
Properties of MA(1)
$E(Y_t) = E(\mu + \varepsilon_t + \theta \varepsilon_{t-1}) = \mu$
$E(Y_t - \mu)^2 = E(\varepsilon_t + \theta \varepsilon_{t-1})^2 = E(\varepsilon_t^2 + 2\theta \varepsilon_t \varepsilon_{t-1} + \theta^2 \varepsilon_{t-1}^2) = \sigma^2 + \theta^2 \sigma^2 = (1 + \theta^2)\sigma^2$
$E(Y_t - \mu)(Y_{t-1} - \mu) = E\big[(\varepsilon_t + \theta \varepsilon_{t-1})(\varepsilon_{t-1} + \theta \varepsilon_{t-2})\big] = E(\varepsilon_t \varepsilon_{t-1} + \theta \varepsilon_{t-1}^2 + \theta \varepsilon_t \varepsilon_{t-2} + \theta^2 \varepsilon_{t-1} \varepsilon_{t-2}) = \theta \sigma^2$
$E(Y_t - \mu)(Y_{t-j} - \mu) = 0$ for $j > 1$
MA(1)
Covariance stationary
Mean and autocovariances are not functions of time
Autocorrelation of a covariance-stationary process:
$\rho_j = \frac{\gamma_j}{\gamma_0}$
MA(1):
$\rho_1 = \frac{\theta \sigma^2}{(1 + \theta^2)\sigma^2} = \frac{\theta}{1 + \theta^2}$
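The closed form $\rho_1 = \theta / (1 + \theta^2)$ is easy to check by simulation (illustrative $\theta$, $\sigma = 1$):

```python
import numpy as np

# Compare the sample lag-1 autocorrelation of an MA(1) with the
# theoretical value theta/(1 + theta^2).
rng = np.random.default_rng(3)
theta, T = 0.8, 200_000
eps = rng.normal(size=T + 1)
y = eps[1:] + theta * eps[:-1]

d = y - y.mean()
rho1_hat = np.sum(d[1:] * d[:-1]) / np.sum(d * d)   # sample lag-1 autocorrelation
rho1_theory = theta / (1.0 + theta**2)              # = 0.8/1.64, about 0.488
```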
[Figure: Autocorrelation function for white noise, $Y_t = \varepsilon_t$ — a spike of 1 at lag 0 and approximately zero at lags 1–20.]
[Figure: Autocorrelation function for MA(1), $Y_t = \varepsilon_t + 0.8\varepsilon_{t-1}$ — 1 at lag 0, $\rho_1 = 0.8/1.64 \approx 0.49$ at lag 1, approximately zero at lags 2–20.]
Moving Average Processes
of higher order
MA(q): $q$th order moving average process
$Y_t = \mu + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + \cdots + \theta_q \varepsilon_{t-q}$
Properties of MA(q):
$\gamma_0 = (1 + \theta_1^2 + \theta_2^2 + \cdots + \theta_q^2)\sigma^2$
$\gamma_j = (\theta_j + \theta_{j+1}\theta_1 + \theta_{j+2}\theta_2 + \cdots + \theta_q \theta_{q-j})\sigma^2$ for $j = 1, 2, \ldots, q$
$\gamma_j = 0$ for $|j| > q$
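The MA(q) autocovariance formulas can be verified by simulation; the sketch below uses $q = 2$ with illustrative $\theta$ values and $\sigma = 1$:

```python
import numpy as np

# Simulate an MA(2) and compare sample autocovariances with the formulas.
rng = np.random.default_rng(7)
t1, t2, T = 0.5, 0.3, 300_000
eps = rng.normal(size=T + 2)
y = eps[2:] + t1 * eps[1:-1] + t2 * eps[:-2]

d = y - y.mean()
g = [np.mean(d[j:] * d[:T - j]) for j in range(4)]   # gamma_hat_0 .. gamma_hat_3
# Theory: gamma_0 = 1 + t1^2 + t2^2 = 1.34
#         gamma_1 = t1 + t2*t1     = 0.65
#         gamma_2 = t2             = 0.30
#         gamma_3 = 0              (beyond lag q = 2)
```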
Autoregressive Processes
AR(1): First order autoregression
$Y_t = c + \phi Y_{t-1} + \varepsilon_t$
Stationarity: We will assume $|\phi| < 1$
Can represent as an MA($\infty$):
$Y_t = (c + \phi c + \phi^2 c + \cdots) + (\varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + \cdots) = \frac{c}{1 - \phi} + \sum_{j=0}^{\infty} \phi^j \varepsilon_{t-j}$
Properties of AR(1)
$E(Y_t) = \frac{c}{1 - \phi} = \mu$
$\gamma_0 = E(Y_t - \mu)^2 = E(\varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + \cdots)^2 = (1 + \phi^2 + \phi^4 + \cdots)\sigma^2 = \frac{\sigma^2}{1 - \phi^2}$
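Both the stationary mean $c/(1-\phi)$ and variance $\sigma^2/(1-\phi^2)$ can be checked by simulating the AR(1) recursion directly (illustrative parameters):

```python
import numpy as np

# Simulate Y_t = c + phi*Y_{t-1} + eps_t and compare sample mean/variance
# with the stationary values (parameters chosen for illustration).
rng = np.random.default_rng(4)
c, phi, sigma, T = 1.0, 0.8, 1.0, 400_000
eps = rng.normal(0.0, sigma, size=T)
mu = c / (1.0 - phi)                  # stationary mean = 5.0
gamma0 = sigma**2 / (1.0 - phi**2)    # stationary variance, about 2.778

y = np.empty(T)
y[0] = mu                             # start at the stationary mean
for t in range(1, T):
    y[t] = c + phi * y[t - 1] + eps[t]
```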
Properties of AR(1), cont.
$\gamma_j = E(Y_t - \mu)(Y_{t-j} - \mu) = E\big[(\varepsilon_t + \phi \varepsilon_{t-1} + \cdots + \phi^j \varepsilon_{t-j} + \phi^{j+1} \varepsilon_{t-j-1} + \cdots)(\varepsilon_{t-j} + \phi \varepsilon_{t-j-1} + \phi^2 \varepsilon_{t-j-2} + \cdots)\big]$
$= (\phi^j + \phi^{j+2} + \phi^{j+4} + \cdots)\sigma^2 = \frac{\phi^j}{1 - \phi^2}\sigma^2 = \phi^j \gamma_0$
so $\rho_j = \phi^j$
[Figure: Autocorrelation function for AR(1), $Y_t = 0.8 Y_{t-1} + \varepsilon_t$ — positive and decaying geometrically ($\rho_j = 0.8^j$) over lags 0–20.]
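The geometric decay $\rho_j = \phi^j$ of the AR(1) autocorrelation function can be confirmed numerically (illustrative $\phi = 0.8$, $\sigma = 1$):

```python
import numpy as np

# Simulate a long AR(1) path and compare sample autocorrelations at a few
# lags with the theoretical values phi^j.
rng = np.random.default_rng(5)
phi, T = 0.8, 500_000
eps = rng.normal(size=T)
y = np.empty(T)
y[0] = 0.0                       # burn-in effect is negligible at this length
for t in range(1, T):
    y[t] = phi * y[t - 1] + eps[t]

d = y - y.mean()
gamma0 = np.mean(d * d)
rho = [np.mean(d[j:] * d[:T - j]) / gamma0 for j in range(1, 4)]
# Theory: [0.8, 0.64, 0.512]
```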
[Figure: Autocorrelation function for AR(1), $Y_t = -0.8 Y_{t-1} + \varepsilon_t$ — alternating in sign and decaying geometrically ($\rho_j = (-0.8)^j$) over lags 0–20.]
[Figure: Gaussian White Noise — simulated sample path, 100 observations.]
[Figure: AR(1), $\phi = 0.5$ — simulated sample path, 100 observations.]
[Figure: AR(1), $\phi = 0.9$ — simulated sample path, 100 observations.]
[Figure: AR(1), $\phi = 0.9$ — simulated sample path, 100 observations.]
Autoregressive Processes
of higher order
$p$th order autoregression: AR(p)
$Y_t = c + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \cdots + \phi_p Y_{t-p} + \varepsilon_t$
Stationarity: We will assume that the roots of the following all lie outside the unit circle:
$1 - \phi_1 z - \phi_2 z^2 - \cdots - \phi_p z^p = 0$
Properties of AR(p)
$\mu = \frac{c}{1 - \phi_1 - \phi_2 - \cdots - \phi_p}$
Can solve for autocovariances / autocorrelations using the Yule-Walker equations
Mixed Autoregressive Moving
Average Processes
ARMA(p,q) includes both autoregressive and moving average terms:
$Y_t = c + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \cdots + \phi_p Y_{t-p} + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + \cdots + \theta_q \varepsilon_{t-q}$
Time Series Models
for Financial Data
A Motivating Example
Federal Funds rate
We are interested in forecasting not only the
level of the series, but also its variance.
Variance is not constant over time

[Figure: U.S. Federal Funds Rate, 1955–1975 — the rate (roughly 2% to 12%) plotted against time.]
Modeling the Variance
AR(p): $y_t = c + \phi_1 y_{t-1} + \phi_2 y_{t-2} + \cdots + \phi_p y_{t-p} + u_t$
ARCH(m)
Autoregressive conditional heteroscedastic process of order m
Square of $u_t$ follows an AR(m) process:
$u_t^2 = \zeta + \alpha_1 u_{t-1}^2 + \alpha_2 u_{t-2}^2 + \cdots + \alpha_m u_{t-m}^2 + w_t$
$w_t$ is a new white noise process
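An ARCH process is typically simulated by writing $u_t = \sqrt{\zeta + \alpha_1 u_{t-1}^2 + \cdots}\, v_t$ with $v_t$ standard Gaussian white noise, which makes $u_t^2$ follow the AR-type recursion above. The sketch below uses $m = 1$ and illustrative parameter values:

```python
import numpy as np

# Simulate ARCH(1): u_t = sqrt(zeta + alpha*u_{t-1}^2) * v_t, v_t ~ N(0,1).
# With alpha < 1 the unconditional variance is zeta / (1 - alpha).
rng = np.random.default_rng(6)
zeta, alpha, T = 1.0, 0.5, 200_000
v = rng.normal(size=T)
u = np.zeros(T)
for t in range(1, T):
    u[t] = np.sqrt(zeta + alpha * u[t - 1] ** 2) * v[t]

uncond_var = zeta / (1.0 - alpha)   # = 2.0 for these parameters
```

The series itself is serially uncorrelated (mean near zero), but its squared values are autocorrelated, which is exactly the volatility clustering the Federal Funds example motivates.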
References
Investopedia.com
Economagic.com
Hamilton, J. D. (1994), Time Series Analysis, Princeton, New Jersey: Princeton University Press.