Queen Mary, University of London
Liudas Giraitis
Time Series Analysis
Solutions: Problem Set 4
Problem 1. Consider the AR(2) model
    Y_t = 0.1 + 0.2 Y_{t-1} − 0.1 Y_{t-2} + ε_t,
where ε_t is a white noise sequence with zero mean and variance 1.
(1) Find the 1-step ahead forecast of Y_{t+1}.
(2) Find the 2-step ahead forecast of Y_{t+2}.
(3) Find the variance of the 1-step ahead forecast error and the variance of the 2-step ahead forecast error. Compare these variances.
(4) Find the 3-step ahead forecast.
Solution 1. We have the AR(2) model
    Y_t = φ_0 + φ_1 Y_{t-1} + φ_2 Y_{t-2} + ε_t.
The k-step ahead forecast is defined by the formula
    Ŷ_t(k) = E[Y_{t+k} | F_t].
(1) To compute Ŷ_t(1), first write
    Y_{t+1} = φ_0 + φ_1 Y_t + φ_2 Y_{t-1} + ε_{t+1}.
Then
    Ŷ_t(1) = E[Y_{t+1} | F_t] = E[φ_0 + φ_1 Y_t + φ_2 Y_{t-1} + ε_{t+1} | F_t]
           = φ_0 + φ_1 E[Y_t | F_t] + φ_2 E[Y_{t-1} | F_t] + E[ε_{t+1} | F_t]
           = φ_0 + φ_1 Y_t + φ_2 Y_{t-1}.
Here we used the facts
    E[Y_t | F_t] = Y_t,   E[Y_{t-1} | F_t] = Y_{t-1},
which hold because Y_t and Y_{t-1} are known once we know F_t, and
    E[ε_{t+1} | F_t] = 0,
which is valid because ε_{t+1} is independent of the history F_t.
Therefore, since φ_0 = 0.1, φ_1 = 0.2, φ_2 = −0.1, the 1-step ahead forecast is
    Ŷ_t(1) = 0.1 + 0.2 Y_t − 0.1 Y_{t-1}.
(2) To compute the 2-step ahead forecast, write
    Y_{t+2} = φ_0 + φ_1 Y_{t+1} + φ_2 Y_t + ε_{t+2}.
Then
    Ŷ_t(2) = E[Y_{t+2} | F_t] = E[φ_0 + φ_1 Y_{t+1} + φ_2 Y_t + ε_{t+2} | F_t]
           = φ_0 + φ_1 E[Y_{t+1} | F_t] + φ_2 E[Y_t | F_t]
           = φ_0 + φ_1 Ŷ_t(1) + φ_2 Y_t.
We found that
    Ŷ_t(1) = 0.1 + 0.2 Y_t − 0.1 Y_{t-1}.
So, keeping in mind that φ_0 = 0.1, φ_1 = 0.2 and φ_2 = −0.1,
    Ŷ_t(2) = 0.1 + 0.2(0.1 + 0.2 Y_t − 0.1 Y_{t-1}) + (−0.1) Y_t
           = 0.12 − 0.06 Y_t − 0.02 Y_{t-1}.
(3) The 1-step ahead forecast error is
    e_t(1) = Y_{t+1} − Ŷ_t(1)
           = φ_0 + φ_1 Y_t + φ_2 Y_{t-1} + ε_{t+1} − (φ_0 + φ_1 Y_t + φ_2 Y_{t-1})
           = ε_{t+1}.
The variance of this error is
    Var(e_t(1)) = Var(ε_{t+1}) = σ² = 1.
The 2-step ahead forecast error is
    e_t(2) = Y_{t+2} − Ŷ_t(2)
           = φ_0 + φ_1 Y_{t+1} + φ_2 Y_t + ε_{t+2} − (φ_0 + φ_1 Ŷ_t(1) + φ_2 Y_t)
           = φ_1 (Y_{t+1} − Ŷ_t(1)) + ε_{t+2}
           = φ_1 ε_{t+1} + ε_{t+2}.
Since ε_{t+1} and ε_{t+2} are uncorrelated, the variance of this error is
    Var(e_t(2)) = Var(φ_1 ε_{t+1} + ε_{t+2}) = Var(φ_1 ε_{t+1}) + Var(ε_{t+2})
                = φ_1² σ² + σ² = (0.2)² + 1 = 1.04.
We conclude that
    Var(e_t(2)) > Var(e_t(1)),
so forecast uncertainty grows with the horizon.
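As a sanity check, the two error variances can be reproduced by simulation. The sketch below (Python with numpy; the sample size and seed are arbitrary illustrative choices) simulates the AR(2) and compares the empirical 1- and 2-step forecast error variances with σ² = 1 and φ_1²σ² + σ² = 1.04.

```python
import numpy as np

# Illustrative sketch: simulate the AR(2) of Problem 1 and check the
# forecast error variances empirically. Seed and n are arbitrary choices.
rng = np.random.default_rng(0)
phi0, phi1, phi2 = 0.1, 0.2, -0.1
n = 200_000
eps = rng.standard_normal(n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = phi0 + phi1 * y[t - 1] + phi2 * y[t - 2] + eps[t]

# 1- and 2-step forecasts made at each time t, and their errors
f1 = phi0 + phi1 * y[2:-2] + phi2 * y[1:-3]   # forecasts of y[t+1]
f2 = phi0 + phi1 * f1 + phi2 * y[2:-2]        # forecasts of y[t+2]
e1 = y[3:-1] - f1
e2 = y[4:] - f2
print(e1.var().round(2), e2.var().round(2))   # ≈ 1.0 and ≈ 1.04
```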
(4) To compute the 3-step ahead forecast, write
    Y_{t+3} = φ_0 + φ_1 Y_{t+2} + φ_2 Y_{t+1} + ε_{t+3}.
Then
    Ŷ_t(3) = E[Y_{t+3} | F_t] = E[φ_0 + φ_1 Y_{t+2} + φ_2 Y_{t+1} + ε_{t+3} | F_t]
           = φ_0 + φ_1 E[Y_{t+2} | F_t] + φ_2 E[Y_{t+1} | F_t]
           = φ_0 + φ_1 Ŷ_t(2) + φ_2 Ŷ_t(1).
We found that
    Ŷ_t(1) = 0.1 + 0.2 Y_t − 0.1 Y_{t-1},   Ŷ_t(2) = 0.12 − 0.06 Y_t − 0.02 Y_{t-1}.
So, keeping in mind that φ_0 = 0.1, φ_1 = 0.2 and φ_2 = −0.1,
    Ŷ_t(3) = 0.1 + 0.2(0.12 − 0.06 Y_t − 0.02 Y_{t-1}) + (−0.1)(0.1 + 0.2 Y_t − 0.1 Y_{t-1})
           = 0.1 + 0.024 − 0.012 Y_t − 0.004 Y_{t-1} − 0.01 − 0.02 Y_t + 0.01 Y_{t-1}
           = 0.114 − 0.032 Y_t + 0.006 Y_{t-1}.
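The closed-form forecasts above can be double-checked with a small recursion: replace future shocks by their mean 0 and future values by their own forecasts. A minimal Python sketch (the starting values Y_t = 1, Y_{t-1} = 2 are arbitrary illustrative choices):

```python
# Sketch: k-step ahead AR(2) forecasts by recursion.
phi0, phi1, phi2 = 0.1, 0.2, -0.1

def forecasts(y_t, y_tm1, k):
    """Return [Yhat(1), ..., Yhat(k)]: future eps are replaced by their
    mean 0, future Y's by their own forecasts."""
    hist = [y_tm1, y_t]          # Y_{t-1}, Y_t
    out = []
    for _ in range(k):
        f = phi0 + phi1 * hist[-1] + phi2 * hist[-2]
        hist.append(f)
        out.append(f)
    return out

# Arbitrary starting values Y_t = 1.0, Y_{t-1} = 2.0:
f1, f2, f3 = forecasts(1.0, 2.0, 3)
# Compare with the closed forms derived above:
assert abs(f1 - (0.1 + 0.2 * 1.0 - 0.1 * 2.0)) < 1e-12
assert abs(f2 - (0.12 - 0.06 * 1.0 - 0.02 * 2.0)) < 1e-12
assert abs(f3 - (0.114 - 0.032 * 1.0 + 0.006 * 2.0)) < 1e-12
print(f1, f2, f3)  # 0.1, 0.02, 0.094 (up to float rounding)
```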
Problem 2. Obtain the mean, variance and lag-1 autocorrelation of the MA(1) series
    Y_t = ε_t + θ ε_{t-1},
where ε_t is an i.i.d. (0, σ²) sequence. Show that the higher order autocorrelations ρ_k, k ≥ 2, are equal to zero.

Solution 2.
(a) First we compute the mean:
    E[Y_t] = E[ε_t + θ ε_{t-1}] = E[ε_t] + θ E[ε_{t-1}] = 0 + θ(0) = 0.
Next the variance:
    Var(Y_t) = E[(Y_t − E[Y_t])²] = E[(ε_t + θ ε_{t-1})²]
             = E[ε_t² + 2θ ε_t ε_{t-1} + θ² ε_{t-1}²]
             = E[ε_t²] + 2θ E[ε_t ε_{t-1}] + θ² E[ε_{t-1}²]
             = σ² + 2θ(0) + θ² σ²
             = σ²(1 + θ²).
(b) To find the autocovariance at lag k, note that by definition, for k ≥ 1,
    γ_k = Cov(Y_t, Y_{t-k}) = E[(Y_t − E[Y_t])(Y_{t-k} − E[Y_{t-k}])] = E[Y_t Y_{t-k}]
        = E[(ε_t + θ ε_{t-1})(ε_{t-k} + θ ε_{t-k-1})]
        = E[ε_t ε_{t-k} + θ ε_{t-1} ε_{t-k} + θ ε_t ε_{t-k-1} + θ² ε_{t-1} ε_{t-k-1}]
        = E[ε_t ε_{t-k}] + θ E[ε_{t-1} ε_{t-k}] + θ E[ε_t ε_{t-k-1}] + θ² E[ε_{t-1} ε_{t-k-1}].
Therefore the lag-1 autocovariance is
    γ_1 = E[ε_t ε_{t-1}] + θ E[ε_{t-1} ε_{t-1}] + θ E[ε_t ε_{t-2}] + θ² E[ε_{t-1} ε_{t-2}]
        = 0 + θσ² + 0 + 0 = θσ².
The autocorrelation at lag 1 is
    ρ_1 = γ_1 / γ_0 = Cov(Y_t, Y_{t-1}) / Var(Y_t) = θσ² / (σ²(1 + θ²)) = θ / (1 + θ²).
(c) If k ≥ 2, then
    γ_k = E[ε_t ε_{t-k}] + θ E[ε_{t-1} ε_{t-k}] + θ E[ε_t ε_{t-k-1}] + θ² E[ε_{t-1} ε_{t-k-1}] = 0,
because ε_t is a white noise, and therefore E[ε_t ε_s] = 0 if t ≠ s. Then the autocorrelation
    ρ_k = γ_k / γ_0 = 0.
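The formulas ρ_1 = θ/(1 + θ²) and ρ_k = 0 for k ≥ 2 can be checked by simulating an MA(1). A sketch in Python/numpy (θ = 0.6, the sample size and seed are arbitrary illustrative choices):

```python
import numpy as np

# Sketch: empirical ACF of a simulated MA(1); theta = 0.6 is an
# arbitrary illustrative value.
rng = np.random.default_rng(1)
theta, n = 0.6, 500_000
e = rng.standard_normal(n + 1)
y = e[1:] + theta * e[:-1]                 # Y_t = eps_t + theta*eps_{t-1}

rho1 = np.corrcoef(y[1:], y[:-1])[0, 1]    # should be near theta/(1+theta^2)
rho2 = np.corrcoef(y[2:], y[:-2])[0, 1]    # should be near 0
print(round(rho1, 3), theta / (1 + theta ** 2))  # both ≈ 0.441
print(round(rho2, 3))                            # ≈ 0.0
```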
Problem 3. Consider the MA(1) time series
    Y_t = ε_t − 0.8 ε_{t-1},
where ε_t is a white noise sequence with zero mean and variance 1.
(a) Find E[Y_1], E[Y_2], Var(Y_1) and Var(Y_2).
(b) Find the covariance of Y_1 and Y_2, and the covariance of Y_2 and Y_3.
(c) What are the mean and the variance of Y_100?
Solution 3. Write
    Y_t = ε_t + θ ε_{t-1},
where θ = −0.8, σ² = 1.
(a) Y_t is a stationary MA(1) time series. We have
    E[Y_t] = E[ε_t − 0.8 ε_{t-1}] = E[ε_t] − 0.8 E[ε_{t-1}] = 0.
Therefore
    E[Y_1] = E[Y_2] = 0.
In Problem 2(a) we showed that
    Var(Y_t) = σ²(1 + θ²) = 1·(1 + (−0.8)²) = 1 + 0.64 = 1.64.
Therefore
    Var(Y_1) = Var(Y_2) = 1.64.
(b) In Problem 2(b) we showed that
    γ_1 = Cov(Y_t, Y_{t-1}) = θσ² = (−0.8)·1 = −0.8,
and in Problem 2(c) that
    γ_k = Cov(Y_t, Y_{t-k}) = 0 for k = 2, 3, ....
Because of the stationarity of Y_t,
    Cov(Y_1, Y_2) = Cov(Y_2, Y_3) = γ_1 = −0.8.
(c) Since Y_t is a stationary time series, the mean E[Y_t] and the variance Var(Y_t) are constant in t. Therefore
    E[Y_100] = 0,
    Var(Y_100) = 1.64.
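For θ = −0.8 and σ² = 1 the numbers above follow directly from the Problem 2 formulas; a one-line check in Python:

```python
# Check of the Problem 3 numbers via the MA(1) moment formulas of Problem 2.
theta, sigma2 = -0.8, 1.0
var_y = sigma2 * (1 + theta ** 2)   # Var(Y_t) = sigma^2 (1 + theta^2)
gamma1 = theta * sigma2             # Cov(Y_t, Y_{t-1}) = theta * sigma^2
print(round(var_y, 2), gamma1)      # 1.64 -0.8
```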
Problem 4. Work out the autocorrelation ρ_3 for each of the following models:
(a) Y_t = ε_t − 0.5 ε_{t-1},
(b) Y_t − 0.8 Y_{t-1} = ε_t,
where ε_t is a white noise sequence with zero mean and variance σ².
Solution 4. (a) We have to find
    γ_3 = Cov(Y_t, Y_{t-3}).
By definition,
    γ_3 = E[(Y_t − E[Y_t])(Y_{t-3} − E[Y_{t-3}])].
Since
    Y_t = ε_t − 0.5 ε_{t-1},
then
    E[Y_t] = E[ε_t] − 0.5 E[ε_{t-1}] = 0
because E[ε_t] = E[ε_{t-1}] = 0. Therefore
    γ_3 = E[Y_t Y_{t-3}] = E[(ε_t − 0.5 ε_{t-1})(ε_{t-3} − 0.5 ε_{t-4})]
        = E[ε_t ε_{t-3} − 0.5 ε_{t-1} ε_{t-3} − 0.5 ε_t ε_{t-4} + 0.25 ε_{t-1} ε_{t-4}]
        = E[ε_t ε_{t-3}] − 0.5 E[ε_{t-1} ε_{t-3}] − 0.5 E[ε_t ε_{t-4}] + (0.5)² E[ε_{t-1} ε_{t-4}] = 0,
since ε_t is a white noise and therefore
    E[ε_t ε_s] = 0 if t ≠ s.
Thus the autocorrelation
    ρ_3 = γ_3 / γ_0 = 0.
(b) We have that
    Y_t − 0.8 Y_{t-1} = ε_t.
First note that this model can be written as an AR(1) model
    Y_t = φ Y_{t-1} + ε_t
with φ = 0.8. Since |φ| < 1, the model is stationary, so the mean E[Y_t] = μ is constant in t. We show that
    E[Y_t] = 0.
That follows by taking the expectation of Y_t − 0.8 Y_{t-1} = ε_t, which implies that
    E[Y_t] − 0.8 E[Y_{t-1}] = E[ε_t].
Since E[ε_t] = 0 and E[Y_t] = E[Y_{t-1}] = μ, we see that
    μ − 0.8 μ = 0,
and therefore 0.2 μ = 0, so that μ = 0.
Therefore the autocovariance is
    γ_k = Cov(Y_t, Y_{t-k}) = E[(Y_t − E[Y_t])(Y_{t-k} − E[Y_{t-k}])] = E[Y_t Y_{t-k}].
To find γ_k, we multiply both sides of the equation Y_t − 0.8 Y_{t-1} = ε_t by Y_{t-k}:
    Y_t Y_{t-k} − 0.8 Y_{t-1} Y_{t-k} = ε_t Y_{t-k}.
Then
    Y_t Y_{t-k} = 0.8 Y_{t-1} Y_{t-k} + ε_t Y_{t-k}.
Taking the expectation of both sides we obtain
    E[Y_t Y_{t-k}] = 0.8 E[Y_{t-1} Y_{t-k}] + E[ε_t Y_{t-k}].
Since for k ≥ 1 we have E[ε_t Y_{t-k}] = 0, and γ_k = E[Y_t Y_{t-k}], we see that
    γ_k = 0.8 γ_{k-1}.
So,
    γ_3 = 0.8 γ_2 = (0.8)² γ_1 = (0.8)³ γ_0.
By definition,
    ρ_3 = Corr(Y_t, Y_{t-3}) = γ_3 / γ_0 = (0.8)³ = 0.512.

Problem 5. Show that the two MA(1) processes
    X_t = ε_t + θ ε_{t-1},
    Y_t = η_t + (1/θ) η_{t-1},
have the same autocorrelation function. Here (ε_t) ~ WN(0, σ_ε²) and (η_t) ~ WN(0, σ_η²) are white noise sequences.
Solution 5. (a) Consider the MA(1) time series X_t = ε_t + θ ε_{t-1}.
Solving Problem 2(a) we showed that
    γ_0 = Var(X_t) = σ_ε²(1 + θ²).
In Problem 2(b) we showed that
    γ_1 = Cov(X_t, X_{t-1}) = σ_ε² θ,
and therefore
    ρ_X(1) = γ_1 / γ_0 = σ_ε² θ / (σ_ε²(1 + θ²)) = θ / (1 + θ²).
In Problem 2(c) we showed that
    γ_k = Cov(X_t, X_{t-k}) = 0 for k = 2, 3, ...,
and therefore
    ρ_X(k) = 0 for k = 2, 3, ....
(b) Consider now the MA(1) time series Y_t = η_t + (1/θ) η_{t-1}.
Note that (Y_t) can be written in the same form as X_t:
    Y_t = η_t + θ* η_{t-1}
with parameter θ* = 1/θ. Therefore for Y_t we can apply the results of part (a):
    ρ_Y(1) = θ* / (1 + θ*²) = (1/θ) / (1 + 1/θ²) = θ / (θ² + 1) = ρ_X(1).
Part (a) also implies that
    ρ_Y(k) = 0 = ρ_X(k) for k = 2, 3, ....
Therefore
    ρ_Y(k) = ρ_X(k) for k = 0, 1, 2, 3, ....
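The equality of the lag-1 autocorrelations at θ and 1/θ is easy to verify numerically (θ = 0.4 is an arbitrary illustrative value):

```python
# Sketch: rho_1 = theta/(1 + theta^2) takes the same value at theta and 1/theta.
def rho1(th):
    return th / (1 + th ** 2)

theta = 0.4                                   # arbitrary illustrative value
print(rho1(theta), rho1(1 / theta))           # identical up to float rounding
assert abs(rho1(theta) - rho1(1 / theta)) < 1e-12
```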
Problem 6. Using the following EVIEWS output, determine the order q of an MA(q) model you would fit to the data.
[EVIEWS correlogram: columns AC, PAC, Q-Stat, Prob for the first 12 lags; the AC and PAC entries are only partially legible in the scan. The Q-Stat rises from 3.04 (Prob 0.057) at the first lag to about 31.1 (Prob 0.001) at lag 10.]
Solution 6. To select the order of the MA(q) model, we test the hypothesis
    H_0: ρ_k = 0 against the alternative H_1: ρ_k ≠ 0
at lags k = 1, 2, ... at the 5% significance level, where ρ_k is the ACF.
The sample ACF ρ̂_k at lag k is significant if |ρ̂_k| > 2/√N, where N is the number of observations.
If |ρ̂_k| ≤ 2/√N, then the ACF at lag k is not significantly different from 0.
We select for q the largest lag at which the ACF is significant. This rule can be used because the ACF of an MA(q) model becomes 0 for lags k > q.
We have 2/√N = 2/√755 ≈ 0.0728. The ACF is significant only at lags 2 and 3. Hence we would fit an MA(3) model.
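The significance bound used in the rule |ρ̂_k| > 2/√N can be computed directly (N = 755 as in the output):

```python
# The approximate 5% significance bound for the sample ACF with N = 755.
from math import sqrt

N = 755
bound = 2 / sqrt(N)
print(round(bound, 4))  # 0.0728
```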
[Figure: sample ACF of the series with the ±2/√N significance bounds.]