Introduction to Time Series Analysis. Lecture 7.
Peter Bartlett
Last lecture:
1. ARMA(p,q) models: stationarity, causality, invertibility
2. The linear process representation of ARMA processes: ψ.
3. Autocovariance of an ARMA process.
4. Homogeneous linear difference equations.
Linear process representation:

φ(B) X_t = θ(B) W_t,   X_t = ψ(B) W_t,
θ(B) = φ(B) ψ(B),

so

1 = ψ_0,
θ_1 = ψ_1 − φ_1 ψ_0,
θ_2 = ψ_2 − φ_1 ψ_1 − φ_2 ψ_0,
⋮
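(Aside, not from the lecture: the coefficient-matching recursion above is easy to turn into code. A minimal sketch, assuming the conventions φ(B) = 1 − φ_1 B − ⋯ − φ_p B^p and θ(B) = 1 + θ_1 B + ⋯ + θ_q B^q; the helper name psi_coefficients is made up.)

```python
import numpy as np

def psi_coefficients(phi, theta, n):
    """First n psi-weights, by matching coefficients in theta(B) = phi(B) psi(B).

    phi   -- [phi_1, ..., phi_p]    (AR side, phi(B) = 1 - phi_1 B - ... - phi_p B^p)
    theta -- [theta_1, ..., theta_q] (MA side, theta_0 = 1)
    """
    psi = np.zeros(n)
    psi[0] = 1.0                                  # 1 = psi_0
    for j in range(1, n):
        theta_j = theta[j - 1] if j <= len(theta) else 0.0
        # theta_j = psi_j - phi_1 psi_{j-1} - ... - phi_p psi_{j-p}
        psi[j] = theta_j + sum(phi[k - 1] * psi[j - k]
                               for k in range(1, min(j, len(phi)) + 1))
    return psi

print(psi_coefficients([0.5], [0.4], 5))   # ARMA(1,1): [1, 0.9, 0.45, 0.225, 0.1125]
```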
Autocovariance:

φ(B) X_t = θ(B) W_t,   X_t = ψ(B) W_t,

γ(h) − φ_1 γ(h−1) − ⋯ − φ_p γ(h−p) = σ_w² Σ_{j=h}^{q} θ_j ψ_{j−h},   h = 0, …, q.
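(Aside, not from the lecture: a quick numeric check of this relation for a hypothetical ARMA(1,1) with φ_1 = 0.5, θ_1 = 0.4, σ_w² = 1, computing γ(h) from the truncated linear process formula γ(h) = σ_w² Σ_j ψ_j ψ_{j+h} of last lecture, and using γ(−1) = γ(1).)

```python
import numpy as np

phi1, theta1 = 0.5, 0.4                  # hypothetical ARMA(1,1), sigma_w^2 = 1
N = 200                                  # truncation length for the linear process

psi = np.empty(N)                        # psi_0 = 1, psi_j = (phi_1 + theta_1) phi_1^{j-1}
psi[0] = 1.0
psi[1:] = (phi1 + theta1) * phi1 ** np.arange(N - 1)

def gamma(h):
    # gamma(h) = sigma_w^2 * sum_j psi_j psi_{j+h}   (truncated at N terms)
    return np.dot(psi[:N - h], psi[h:])

# h = 0:  gamma(0) - phi_1 gamma(-1) = sigma_w^2 (theta_0 psi_0 + theta_1 psi_1)
assert np.isclose(gamma(0) - phi1 * gamma(1), 1.0 + theta1 * psi[1])
# h = 1:  gamma(1) - phi_1 gamma(0) = sigma_w^2 theta_1 psi_0
assert np.isclose(gamma(1) - phi1 * gamma(0), theta1)
```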
Homogeneous linear difference equations:

a_0 x_t + a_1 x_{t−1} + ⋯ + a_k x_{t−k} = 0,

i.e.,

(a_0 + a_1 B + ⋯ + a_k B^k) x_t = 0,   that is,   a(B) x_t = 0.

Auxiliary equation:

a_0 + a_1 z + ⋯ + a_k z^k = 0   ⟺   (z − z_1)(z − z_2) ⋯ (z − z_k) = 0,

where z_1, z_2, …, z_k are the roots of the auxiliary polynomial. Since

a(B) x_t = 0   ⟺   (B − z_1)(B − z_2) ⋯ (B − z_k) x_t = 0,

x_t is a linear combination of solutions to

(B − z_1) x_t = 0,  (B − z_2) x_t = 0,  …,  (B − z_k) x_t = 0,

and the solution of (B − z_i) x_t = 0 is x_t = c z_i^{−t}. Thus, for distinct roots, x_t = c_1 z_1^{−t} + ⋯ + c_k z_k^{−t}.
[Figure: solutions x_t = c_1 z_1^{−t} + c_2 z_2^{−t} for two distinct real roots, with (c_1, c_2) = (1, 0), (0, 1), and (0.8, 0.2).]
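(Aside, not from the lecture: a sketch of this recipe in code, using a made-up auxiliary polynomial a(z) = 1 − 0.9z + 0.2z², whose roots are z_1 = 2 and z_2 = 2.5.)

```python
import numpy as np

# Made-up difference equation: x_t - 0.9 x_{t-1} + 0.2 x_{t-2} = 0,
# with auxiliary polynomial a(z) = 1 - 0.9 z + 0.2 z^2.
a = [1.0, -0.9, 0.2]                     # a_0, a_1, a_2

roots = np.roots(a[::-1])                # np.roots wants the highest degree first
print(roots)                             # approximately [2.5, 2.0]

# General solution for distinct roots: x_t = c_1 z_1^{-t} + c_2 z_2^{-t}
c = [0.8, 0.2]
t = np.arange(20)
x = c[0] * roots[0] ** (-t) + c[1] * roots[1] ** (-t)

# Check that x satisfies a_0 x_t + a_1 x_{t-1} + a_2 x_{t-2} = 0:
resid = a[0] * x[2:] + a[1] * x[1:-1] + a[2] * x[:-2]
assert np.allclose(resid, 0)
```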
As before, for a complex conjugate pair of roots z_1, z̄_1, the real solutions are x_t = c z_1^{−t} + c̄ z̄_1^{−t} for a complex constant c.

[Figures: oscillating solutions for complex conjugate roots, with c = 1.0 + 0.0i, c = 0.0 + 1.0i, and c = 0.8 − 0.2i.]
(B − z_1)(B − z_2) ⋯ (B − z_k) x_t = 0.

If z_1 = z_2, then (B − z_1)(B − z_2) x_t = 0 becomes

(B − z_1)² x_t = 0,

and both z_1^{−t} and t z_1^{−t} are solutions, so x_t = (c_1 + c_2 t) z_1^{−t}.
[Figure: solutions x_t = (c_1 + c_2 t) z_1^{−t} for a repeated root, with (c_1, c_2) = (1, 0), (0, 2), and (0.2, 0.8).]
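(Aside, not from the lecture: a numeric check that z_1^{−t} and t·z_1^{−t} both satisfy (B − z_1)² x_t = 0, for a made-up root z_1 = 2.)

```python
import numpy as np

z1 = 2.0                                 # made-up repeated root
t = np.arange(30, dtype=float)

for x in (z1 ** (-t), t * z1 ** (-t)):   # the two basic solutions
    # (B - z1)^2 x_t = x_{t-2} - 2 z1 x_{t-1} + z1^2 x_t
    resid = x[:-2] - 2 * z1 * x[1:-1] + z1 ** 2 * x[2:]
    assert np.allclose(resid, 0)
```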
a_0 x_t + a_1 x_{t−1} + ⋯ + a_k x_{t−k} = 0,
with initial conditions x_1, …, x_k.

Auxiliary equation in z ∈ ℂ:

a_0 + a_1 z + ⋯ + a_k z^k = 0.
Example: ARMA(2,1): X_t = −(1/4) X_{t−2} + W_t + (1/5) W_{t−1}, i.e. φ(B) = 1 + B²/4 and θ(B) = 1 + B/5. Then

X_t = ψ(B) W_t,   ψ_j = 1, 1/5, −1/4, −1/20, 1/16, 1/80, −1/64, −1/320, … .
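(Aside, not from the lecture: these ψ_j values can be reproduced with the coefficient-matching recursion from earlier, with φ_1 = 0, φ_2 = −1/4, θ_1 = 1/5.)

```python
import numpy as np

phi, theta = [0.0, -0.25], [0.2]         # X_t = -0.25 X_{t-2} + W_t + 0.2 W_{t-1}
psi = np.zeros(8)
psi[0] = 1.0
for j in range(1, 8):
    th = theta[j - 1] if j <= len(theta) else 0.0
    psi[j] = th + sum(phi[k - 1] * psi[j - k] for k in range(1, min(j, len(phi)) + 1))
print(psi)   # [1, 0.2, -0.25, -0.05, 0.0625, 0.0125, -0.015625, -0.003125]
```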
γ(h) − φ_1 γ(h−1) − φ_2 γ(h−2) = σ_w² Σ_{j=0}^{q−h} θ_{h+j} ψ_j ,

so

γ(h) + 0.25 γ(h−2) =
  σ_w² (ψ_0 + 0.2 ψ_1)   if h = 0,
  0.2 σ_w²               if h = 1,
  0                      otherwise.
We plug γ(−1) = γ(1) into the h = 1 equation,

γ(1) + 0.25 γ(−1) = 0.2 σ_w²,

and get 1.25 γ(1) = σ_w²/5, i.e. γ(1) = 4σ_w²/25.
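(Aside, not from the lecture: the equations for h = 0, 1, 2 can also be solved numerically. A sketch with σ_w² = 1.)

```python
import numpy as np

# gamma(h) + 0.25 gamma(h-2) for h = 0, 1, 2, with sigma_w^2 = 1 and
# gamma(-h) = gamma(h):
#   h = 0:  gamma(0) + 0.25 gamma(2) = 1 + 0.2 * 0.2 = 1.04
#   h = 1:  gamma(1) + 0.25 gamma(1) = 0.2
#   h = 2:  gamma(2) + 0.25 gamma(0) = 0
A = np.array([[1.00, 0.00, 0.25],
              [0.00, 1.25, 0.00],
              [0.25, 0.00, 1.00]])
b = np.array([1.04, 0.2, 0.0])
g0, g1, g2 = np.linalg.solve(A, b)
print(g0, g1, g2)                        # ~1.1093, 0.16, -0.2773
```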
[Figure: autocovariance function γ(h) of the ARMA(2,1) example.]
Linear prediction

Given X_1, X_2, …, X_n, the best linear predictor of X_{n+m} is the linear combination

X^n_{n+m} = β_0 + Σ_{i=1}^{n} β_i X_i

that minimizes the mean squared error E(X_{n+m} − X^n_{n+m})².
Projection Theorem

If H is a Hilbert space, M is a closed linear subspace of H, and y ∈ H, then there is a point Py ∈ M (the projection of y on M) satisfying

1. ‖Py − y‖ ≤ ‖w − y‖ for all w ∈ M,
2. ‖Py − y‖ < ‖w − y‖ for w ∈ M, w ≠ Py,
3. ⟨y − Py, w⟩ = 0 for all w ∈ M.

[Figure: y, its projection Py ∈ M, and the orthogonal residual y − Py.]
Hilbert spaces

Hilbert space = complete inner product space.

Inner product space: a vector space with an inner product ⟨a, b⟩ satisfying

⟨a, b⟩ = ⟨b, a⟩,
⟨α_1 a_1 + α_2 a_2, b⟩ = α_1 ⟨a_1, b⟩ + α_2 ⟨a_2, b⟩,
⟨a, a⟩ = 0 ⟺ a = 0.

Norm: ‖a‖² = ⟨a, a⟩.
Complete: limits of Cauchy sequences are in the space.

Examples:
1. ℝⁿ, with the Euclidean inner product ⟨x, y⟩ = Σ_i x_i y_i.
2. {random variables X : EX² < ∞}, with inner product ⟨X, Y⟩ = E(XY).
(Strictly, equivalence classes of a.s. equal random variables.)
Projection theorem: example (linear regression)

Given y = (y_1, y_2, …, y_n)′ ∈ ℝⁿ and Z = (z_1, …, z_q) ∈ ℝ^{n×q},
choose β = (β_1, …, β_q)′ ∈ ℝ^q to minimize ‖y − Zβ‖².
The projection theorem gives ⟨y − Zβ, z_i⟩ = 0 for i = 1, …, q, that is,

Z′Z β = Z′y   ⟹   β = (Z′Z)^{−1} Z′y.

These are the normal equations.
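(Aside, not from the lecture: a minimal sketch of the normal equations on made-up data; the orthogonality assertion at the end is exactly property 3 of the projection theorem.)

```python
import numpy as np

rng = np.random.default_rng(0)
n, q = 50, 3
Z = rng.normal(size=(n, q))              # made-up design matrix
y = Z @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=n)

beta = np.linalg.solve(Z.T @ Z, Z.T @ y) # beta = (Z'Z)^{-1} Z'y

# The residual y - Z beta is orthogonal to every column of Z
# (property 3 of the projection theorem).
assert np.allclose(Z.T @ (y - Z @ beta), 0)
```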
Projection theorem: example (linear prediction)

Given random variables 1, X_1, X_2, …, X_n with EX² < ∞,
choose β_0, β_1, …, β_n ∈ ℝ so that Z = β_0 + Σ_{i=1}^{n} β_i X_i minimizes E(X_{n+m} − Z)².

Here, ⟨X, Y⟩ = E(XY),
M = {Z = β_0 + Σ_{i=1}^{n} β_i X_i : β_i ∈ ℝ} = s̄p{1, X_1, …, X_n} (the closed linear span), and
y = X_{n+m}.
By the projection theorem, the best linear predictor X^n_{n+m} = P X_{n+m} satisfies

⟨X_{n+m} − X^n_{n+m}, Z⟩ = 0   for all Z ∈ M,

that is, for all Z ∈ s̄p{1, X_1, …, X_n},

E(X_{n+m} − X^n_{n+m}) = 0   and   E[(X_{n+m} − X^n_{n+m}) X_i] = 0,   i = 1, …, n.

That is, the prediction errors (X_{n+m} − X^n_{n+m}) are uncorrelated with the prediction variables (1, X_1, …, X_n).
Linear prediction

The error (X_{n+m} − X^n_{n+m}) is uncorrelated with the prediction variable 1:

E(X_{n+m} − X^n_{n+m}) = 0
⟹ E(β_0 + Σ_i β_i X_i − X_{n+m}) = 0
⟹ β_0 + μ Σ_i β_i − μ = 0
⟹ β_0 = μ (1 − Σ_i β_i),

where μ = EX_t.
Substituting for β_0 in X^n_{n+m} = β_0 + Σ_i β_i X_i, we get

X^n_{n+m} = μ + Σ_i β_i (X_i − μ).
For one-step-ahead prediction (taking μ = 0), write

X^n_{n+1} = φ_{n1} X_n + φ_{n2} X_{n−1} + ⋯ + φ_{nn} X_1.

Prediction equations:

E[(X_{n+1} − X^n_{n+1}) X_i] = 0,   for i = 1, …, n,

that is,

Σ_{j=1}^{n} φ_{nj} E(X_{n+1−j} X_i) = E(X_{n+1} X_i),
Σ_{j=1}^{n} φ_{nj} γ(i − j) = γ(i).
In matrix form:

Γ_n φ_n = γ_n,

where

Γ_n = [ γ(0)     γ(1)     ⋯  γ(n−1) ]
      [ γ(1)     γ(0)     ⋯  γ(n−2) ]
      [   ⋮        ⋮      ⋱     ⋮   ]
      [ γ(n−1)   γ(n−2)   ⋯   γ(0)  ],

φ_n = (φ_{n1}, φ_{n2}, …, φ_{nn})′, and γ_n = (γ(1), γ(2), …, γ(n))′.
The mean squared error of one-step-ahead prediction is

E(X_{n+1} − X^n_{n+1})² = E[(X_{n+1} − X^n_{n+1})(X_{n+1} − X^n_{n+1})]
                        = E[(X_{n+1} − X^n_{n+1}) X_{n+1}]   (the error is orthogonal to X^n_{n+1})
                        = γ(0) − γ_n′ Γ_n^{−1} γ_n.
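(Aside, not from the lecture: since Γ_n is Toeplitz, φ_n and the mean squared error are cheap to compute. A sketch for a hypothetical stationary process with γ(h) = 0.8^|h|, using scipy.linalg.solve_toeplitz.)

```python
import numpy as np
from scipy.linalg import solve_toeplitz

# Hypothetical stationary process with gamma(h) = 0.8^|h|.
n = 5
gamma = 0.8 ** np.arange(n + 1)          # gamma(0), ..., gamma(n)

# Solve Gamma_n phi_n = gamma_n: Gamma_n is Toeplitz with first column
# (gamma(0), ..., gamma(n-1)); gamma_n = (gamma(1), ..., gamma(n))'.
phi_n = solve_toeplitz(gamma[:n], gamma[1:])
print(phi_n)                             # [0.8, 0, 0, 0, 0]: only X_n matters

# Mean squared error: gamma(0) - gamma_n' Gamma_n^{-1} gamma_n
mse = gamma[0] - gamma[1:] @ phi_n
print(mse)                               # 0.36 = 1 - 0.8^2
```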