Regression Models With Time Series Errors
RUEY S. TSAY*

The time series regression models in which the errors of regression equations follow stationary or nonstationary autoregressive moving average models are considered. Convergence properties of the sample autocorrelation function of the observed series and of the least squares estimates of the linear regression parameters are shown. Based upon these results, a procedure for specifying the tentative order of the mixed ARMA errors is proposed. Two examples are given.

KEY WORDS: Autoregressive moving average model; Extended sample autocorrelation function; Intervention; Regression; Time series.

1. INTRODUCTION

A useful and extensive class of models for describing economic, business, and environmental data is the collection of time series regression models. It consists of models in the form

Yt = f(Xt, β) + Zt,    (1.1)

where Yt is the series of interest, Zt is a time series process that is unobservable, Xt denotes a set of input (or exogenous) variables, β is a vector of parameters, and f(Xt, β) represents the effect of the input Xt on the output Yt. Obviously, the classical regression (linear or nonlinear) and time series models are special cases of (1.1). The well-known intervention models, proposed by Box and Tiao (1975), and time series models with calendar effects, considered by Liu (1980) and Cleveland and Devlin (1982), can also be imbedded in model (1.1). When Xt contains the time variable t, (1.1) is used by economists to study the trend of Yt. Furthermore, when some components of Xt are indicator variables, (1.1) can be employed to handle certain outlier problems in time series analysis.

Considerable attention has been devoted in the literature to model (1.1), especially to the parameter estimation under the assumption that the time series component Zt is stationary. Anderson (1954) gave a review of earlier work on regression analysis when autocorrelation exists. Hannan (1971) proved consistency properties of the weighted least squares (LS) estimates of the parameter β when f(Xt, β) is nonlinear and Zt has a continuous spectrum function. Pierce (1971a) considered the LS estimation when the model is linear and Zt follows an autoregressive moving average (ARMA) model. Assuming that Zt is an autoregressive process with finite order, Durbin (1960) proposed a two-stage procedure that yields asymptotically efficient estimates in linear models, and Gallant and Goebel (1976) provided a procedure for estimating the unknown parameter β in nonlinear regression settings. Fuller (1976) discussed some properties of model (1.1) in Chapter 9. More recently, using the Kalman filter techniques, Harvey and Phillips (1979) considered the maximum likelihood estimation under the same conditions as those of Pierce (1971a).

In this article we also consider the model (1.1) but allow the time series process Zt to be nonstationary. Such nonstationary regression models are of interest because, first of all, they provide a simple method for removing the effects of certain outlying observations in modeling nonstationary time series data; for example, see Chang (1982) and Hillmer, Bell, and Tiao (1982). Second, these models are useful in studying the so-called "trading day" effects in analyzing monthly economic data. For instance, the model

Yt = Σ_{i=1}^{7} βi Xit + Zt,

with the Xit's denoting, respectively, the number of Mondays, Tuesdays, and so on, in month t, has been used in the literature to study the effect of the number of times of each day of the week on economic series; for example, see Pfefferman and Fisher (1980) and Bell and Hillmer (1983).

This article proposes a procedure to specify the order of an ARMA model for the unobservable process Zt. This order specification problem is of practical importance because a tentative model of (1.1) for Yt can be built once the model of Zt is specified. Yet, to the best of the author's knowledge, this problem has not been rigorously investigated in the literature. In what follows, we prove some asymptotic properties of the observed series Yt when Zt is nonstationary. Section 2 states the model used, the assumptions imposed, and the main results. The corresponding proofs are given in the Appendix. Based upon these results, Section 3 describes the proposed order selection procedure for the linear regression time series models, and Section 4 applies the procedure to two examples. Finally, Section 5 addresses some nonlinear situations, and Section 6 gives a brief discussion.
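[Editor's illustration, not part of the original article.] As a concrete rendering of the class of models in (1.1), the following short simulation sketches the linear special case with a stationary ARMA(1,1) error; it is a Python sketch (numpy assumed) with arbitrarily chosen coefficients.

    import numpy as np

    # A linear instance of (1.1): Y_t = beta0 + beta1*X_t + Z_t, where Z_t follows
    # the stationary ARMA(1,1) model (1 - 0.6B) Z_t = (1 + 0.4B) a_t (illustrative values).
    rng = np.random.default_rng(0)
    n = 300
    x = rng.normal(5.0, 4.0, size=n)        # one exogenous input X_t
    a = rng.standard_normal(n)              # white noise a_t
    z = np.zeros(n)
    for t in range(1, n):
        z[t] = 0.6 * z[t - 1] + a[t] + 0.4 * a[t - 1]
    y = 5.0 + 2.0 * x + z                   # observed series Y_t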

* Ruey S. Tsay is Assistant Professor, Department of Statistics, Carnegie-Mellon University, Pittsburgh, PA 15213. The author wishes to thank the Associate Editor and a referee for useful comments and suggestions that led to an improved presentation.

© Journal of the American Statistical Association, March 1984, Volume 79, Number 385, Theory and Methods Section

2. THE MODEL, ASSUMPTIONS, AND SOME RESULTS

We assume that the unobservable time series process Zt of (1.1) follows a nonstationary or stationary ARMA model, that is,

Φ(B) Zt = θ(B) at,    (2.1)

with Φ(B) = U(B)φ(B), where Φ(B) = 1 − Φ1 B − ... − ΦP B^P, U(B) = 1 − U1 B − ... − Ud B^d, φ(B) = 1 − φ1 B − ... − φ_{P−d} B^{P−d}, and θ(B) = 1 − θ1 B − ... − θq B^q are polynomials in B, B is the backshift operator such that B Zt = Z_{t−1}, and {at} is a sequence of white noise disturbances. We require that the model (2.1) satisfy the following conditions:

1. All the zeros of U(B) are on, those of φ(B) are outside, and those of θ(B) are on or outside the unit circle, with no single root common to the polynomials Φ(B) and θ(B);

2. The at's are independent and identically distributed continuous random variables with mean 0, variance σ², and finite fourth moment E(at^4) = κ4 + 3σ^4; and

3. The process Zt (hence, Yt) starts at a finite time point t0 if it is nonstationary, that is, if U(B) ≠ 1.

Note that the well-known autoregressive integrated moving average (ARIMA) models of Box and Jenkins (1976) for describing nonstationary time series are special cases of model (2.1), that is, when U(B) = (1 − B)^d.

The functional form of the regression structure f(Xt, β) and the input variables Xt = (X1t, ..., Xkt)' of model (1.1) are assumed to be known in this article. Furthermore, we assume that the Xit's satisfy the following condition:

4. For any fixed i (1 ≤ i ≤ k), there exists a nonnegative real number δi such that

δi = min{δ ≥ 0 : lim_{n→∞} n^{−δ} Σ_{t=1}^{n} Xit² is finite}.    (2.2)

For convenience, we refer to δi as the order of Xit.

If f(Xt, β) = Σ_{i=1}^{k} βi Xit, the model (1.1) is linear and the full model becomes

Yt = Σ_{i=1}^{k} βi Xit + [Φ(B)]^{−1} θ(B) at.    (2.3)

In this case, the LS techniques are useful and we further assume the following:

5. The matrix lim_{n→∞} {n^{−δ} Σ at at'} is positive definite, where δ is the order of some Xit in Xt and at = (a1t, ..., alt)' is a subvector of Xt, each component of which shares the same order δ.

On the other hand, if f(Xt, β) is nonlinear, such as

f(Xt, β) = β1 + β2 cos(tβ4) + β3 sin(tβ4),  |β4| ≤ π,    (2.4)

or

f(Xt, β) = β1 exp(tβ2),    (2.5)

then nonlinear regression techniques are needed and Assumption 5 must be modified. The condition then depends not only on the orders of the Xit's but also on the true structure of f(Xt, β). The modeling procedures thus differ from model to model. For instance, different methods are needed in order to specify any nonstationary factors of the time series components associated with models (2.4) and (2.5). More detailed discussions about the nonlinear models are given later.

Now consider the nonstationary linear models; that is, Yt follows model (2.3) with Φ(B) ≠ φ(B). We state here some convergence properties of the sample autocorrelation function (SACF) of Yt and of the LS estimates of the unknown parameter β in a linear regression fitting to Yt. The results depend on the multiplicities of the nonstationary roots of Φ(B), that is, zeros of the polynomial U(B) in (2.1), and on the orders δi of the input variables Xit in (2.2). When the time series component Zt dominates the input variables, the SACF of Yt, as expected, can be employed to obtain factors of the nonstationary AR part of Zt. On the other hand, linear regressions must be fitted to remove the effects of the dominating input variables whenever they exist.

Given n observations Y1, Y2, ..., Yn of a process Yt, we define as usual the SACF at lag l as

rY(l) = CY(l)/CY(0),    (2.6)

where CY(j) = n^{−1} Σ_{t=j+1}^{n} (Y_{t−j} − Ȳ)(Yt − Ȳ), with Ȳ = n^{−1} Σ Yt being the sample mean of Yt.
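[Editor's illustration, not part of the original article.] For readers who want to compute (2.6) directly, the following function is a literal transcription; it is a Python sketch (numpy assumed) rather than any particular library's acf routine.

    import numpy as np

    def sacf(y, nlags):
        # Sample autocorrelation r_Y(l) = C_Y(l) / C_Y(0), following (2.6).
        y = np.asarray(y, dtype=float)
        n = len(y)
        d = y - y.mean()
        c0 = np.sum(d * d) / n                      # C_Y(0)
        r = np.empty(nlags + 1)
        for lag in range(nlags + 1):
            # C_Y(l) = n^{-1} * sum_{t=l+1}^{n} (Y_{t-l} - Ybar)(Y_t - Ybar)
            r[lag] = np.sum(d[: n - lag] * d[lag:]) / (n * c0)
        return r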

We then adopt a similar approach as that of Tiao and Tsay (1983) to establish some results for model (2.3). For a nonstationary ARMA process, that is, d > 0 in (2.1), we may factor the model into

[Π_{i=1}^{m} Ui(B)] φ(B) Zt = θ(B) at,    (2.7)

where Ui(B) = 1 − U1^{(i)} B − ... − U_{di}^{(i)} B^{di} are polynomials in B of degree di such that (a) Σ_{i=1}^{m} di = d; (b) Π_{i=1}^{m} Ui(B) = U(B); (c) Ui(B) is a factor of U_{i−1}(B); and (d) the multiplicity of any root of Ui(B) is 1. Clearly, m is the highest multiplicity of the roots of U(B) in (2.7).

Under the model Assumptions 1-3 above, Tiao and Tsay (1983) have established the following results for the ARMA models: (a) Σ Zt² = Op(n^{2m}); (b) [Σ Zt²]^{−1} = Op(n^{−2m}); (c) Σ Zt W_{t+h} = Op(n^{m+m'}), where h is a fixed integer, Wt = U*(B)Zt with U*(B) being a factor of U(B), and m' is the highest multiplicity of the roots of [U*(B)]^{−1} U(B); and (d) the SACF of Zt satisfies asymptotically the homogeneous difference equation U1(B) rZ(l) = 0 for any finite integer l. The same techniques employed there can in fact be used to prove that, for the model (2.7) with m ≥ 1,

(a) Σ_{t=1}^{n} (Zt − Z̄)² = Op(n^{2m});    (2.8)

(b) [Σ_{t=1}^{n} (Zt − Z̄)²]^{−1} = Op(n^{−2m});    (2.9)

(c) Σ_{t=1}^{n} (Zt − Z̄)(W_{t+h} − W̄) = Op(n^{m+m'}),    (2.10)

where Z̄ and W̄ are the sample means of Zt and Wt, respectively.

Next let

δ1 > δ2 > ... > δv    (2.11)

be the sequence of v distinct values of the δi (i = 1, ..., k) defined in (2.2). From (2.8), (2.9), and (2.10) and the Cauchy-Schwarz inequality,

|Σ ct dt| ≤ [Σ ct²]^{.5} [Σ dt²]^{.5},    (2.12)

one can prove the following properties of Yt.

Theorem 2.1. Suppose that Yt follows a linear regression time series model of (2.3) and satisfies the Assumptions 1-5 above. Let m be the highest multiplicity of the roots of U(B) and δ1, defined in (2.11), the highest order of the input variables Xit's. If 2m > δ1, then the SACF of Yt satisfies asymptotically the homogeneous difference equation U1(B) rY(l) = 0 for any fixed finite integer l, where U1(B) is defined in (2.7).

Corollary 2.1. If Yt satisfies all the conditions of Theorem 2.1, then the least squares estimates of the AR(d1) regression,

Yt = α1 Y_{t−1} + ... + α_{d1} Y_{t−d1} + et,    (2.13)

are consistent for the coefficients of U1(B). More specifically,

α̂j = Uj^{(1)} + Op(n^{−m+δ/2}),  j = 1, 2, ..., d1,

where δ = max{δ1, 2(m − 1), 1}.

Proof. The normal equations of (2.13) can be rewritten as

(r(1), r(2), ..., r(d1))' = R (α̂1, α̂2, ..., α̂_{d1})' + op(n^{−m+δ/2}),

where R is the d1 × d1 matrix whose (i, j) element is r(|i − j|) and r(j) denotes the SACF rY(j). The corollary then follows immediately.

The above corollary can be generalized to obtain consistent LS estimates for the nonstationary coefficients Ui's in U(B) if δ1 ≤ 1. This can also be proved by the same methods as those in Tiao and Tsay (1983) and can be used to spot the nonstationary factor U(B) of Zt in practical modeling.

Next consider the situation that some δj in (2.11) is greater than the leading order 2m of Zt; that is, consider the case where some Xit is dominating. In this case, the linear regression

Yt = Σ_{i=1}^{k} βi Xit + et    (2.14)

would play an important role in the process of modeling. In fact, some LS estimates β̂i of (2.14) are consistent for the true βi but the others are not, depending on the order of Xit. The consistent estimates can then be employed to remove the corresponding input effects.

Theorem 2.2. Suppose that the linear regression time series process Yt of (2.3) satisfies the Assumptions 1-5. Let m and δj be defined in (2.7) and (2.11), respectively. Furthermore, suppose that 2m < δj for some j (1 ≤ j ≤ k). Then (a) the LS estimate β̂i of (2.14) is consistent for the true coefficient βi of Xit if δi > 2m; (b) β̂i is inconsistent if δi < 2m; and (c) the consistency of β̂i is undetermined if δi = 2m, where δi is the order of Xit.
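[Editor's illustration, not part of the original article.] As an informal numerical check of Corollary 2.1, one can fit the AR(d1) regression (2.13) by least squares and compare the fitted coefficients with those of U1(B). The helper below is a Python sketch (numpy assumed).

    import numpy as np

    def ls_ar_fit(y, d1):
        # OLS fit of Y_t = a1*Y_{t-1} + ... + a_{d1}*Y_{t-d1} + e_t, as in (2.13),
        # with no intercept; returns the vector of fitted coefficients.
        y = np.asarray(y, dtype=float)
        n = len(y)
        resp = y[d1:]                                        # Y_t for t = d1+1, ..., n
        lags = np.column_stack([y[d1 - j : n - j] for j in range(1, d1 + 1)])
        alpha, *_ = np.linalg.lstsq(lags, resp, rcond=None)
        return alpha

    # For a series dominated by a single unit root, ls_ar_fit(y, 1) should be
    # close to 1, the coefficient of the nonstationary factor U1(B) = 1 - B.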

3. A MODELING PROCEDURE FOR LINEAR REGRESSION TIME SERIES MODELS

Once a tentative model is specified for the time series component Zt, the unknown parameters β, Φi's, and θj's of model (2.3) can be estimated by those methods mentioned in Section 1. The adequacy of the fitted model can also be examined by some model-checking procedures, such as plotting the residuals to spot any potential outlying observations and computing the SACF of the residuals to test the independence assumption. The latter approach was discussed by Pierce (1971b). Therefore, we shall concentrate here on the model specification. For general nonstationary or stationary ARMA time series, a useful model identification method, called the extended sample autocorrelation function (ESACF) approach, was proposed by Tsay and Tiao (1984). This method is based on the consideration that if consistent estimates Φ̂1, ..., Φ̂P of the AR coefficients Φ1, ..., ΦP in Φ(B) of (2.1) can be found, then the transformed series Wt = (1 − Φ̂1 B − ... − Φ̂P B^P)Zt will approximately follow the MA(q) model

Wt = (1 − θ1 B − ... − θq B^q)at.

The SACF of Wt can therefore be used to specify the order of the moving average (MA) polynomial θ(B). To this end, Tsay and Tiao (1984) have proposed an iterative regression procedure that provides consistent estimates for the AR parameters in an ARMA model. Roughly speaking, the iterated regressions employ certain lagged residuals of previous regressions as newly added regressors to accommodate the effects of the MA part on the AR estimates. As the iteration advances, the MA effects will reduce and eventually vanish when the number of iterations exceeds the true MA order q. A recursive algorithm for computing the iterated estimates of the AR parameters has also been developed.

A useful property of the ESACF for model identification is that the pth ESACF of an ARMA(p, q) model, stationary or not, has exactly the same asymptotic "cutting-off" property as that of the ordinary SACF of an MA(q) model. In applications, one can simply search the ESACF table for this cutting-off feature to specify the order (p, q) of a time series model. The readers are referred to Tsay and Tiao (1984) for details and examples. Here we employ this ESACF approach to identify the time series model after the effects of the input variables (i.e., f(Xt, β)) have been approximately removed.

The asymptotic properties stated in Section 2 provide some clues for accessing the nonstationary factors of Zt and the effects of Xt. For example, one can employ the SACF of Yt or the LS estimates of stepwise AR fittings to specify the nonstationary factors of Zt if Σ Xit² = O(n) for all i. On the other hand, one would fit a linear regression to remove the input effects when information indicates that Zt is stationary. In general, if the stationarity of Zt is uncertain, we recommend the following procedure for the model specification (a schematic code sketch of Step 1 is given at the end of this section).

Step 1. Compute the SACF of Yt and that of the estimated residuals êt of the regression (2.14). (a) If the SACF of êt decays rapidly, go to Step 2 and treat êt as Ẑt, an estimate of the time series component Zt. (b) If both SACF's fail to die out quickly but satisfy the same difference equation, transform Yt and the Xit's according to the common difference equation and return to Step 1 with these transformed series as observed values. (c) If both SACF's fail to die out quickly and satisfy two difference equations, remove the effects of those input variables Xit that possess the highest order, and go to Step 1 with the adjusted Yt and the remaining input variables.

Step 2. Compute the ESACF table of Ẑt to identify the order (p, q) of a time series model.

The motivation behind this procedure is as follows: (i) under the condition 1(b), the true time series Zt is nonstationary with the highest multiplicity of nonstationary roots being greater than the orders of the Xit, and by Theorem 2.1 the SACF would provide consistent estimates for some nonstationary factors of Zt; and (ii) under the situation 1(c), the time series component is no longer dominating and removing the effects of Xt becomes necessary.
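[Editor's illustration, not part of the original article.] The decision logic of Step 1 can be rendered schematically in code. The fragment below is only a sketch: it reuses the sacf helper defined in the Section 2 sketch, uses an ad hoc 2n^{-1/2} band and a crude "fails to die out" rule, and leaves the ESACF computation of Step 2 as a pointer, since the choices in cases 1(b) and 1(c) ultimately require judgment about the difference equations involved.

    import numpy as np

    def step1_diagnostics(y, X, nlags=24):
        # Schematic Step 1: compare the SACF of Y_t with the SACF of the residuals
        # of the linear regression (2.14) and report which case appears to apply.
        # Requires the sacf() helper defined earlier in this document.
        X = np.column_stack([np.ones(len(y)), X])       # intercept plus inputs
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta                            # e_t of (2.14)

        band = 2.0 / np.sqrt(len(y))                    # rough 2*n^(-1/2) band
        def persistent(r):                              # crude "fails to die out" rule
            return np.mean(np.abs(r[1:]) > band) > 0.5

        r_y, r_e = sacf(y, nlags), sacf(resid, nlags)
        if not persistent(r_e):
            return ("Case 1(a): residual SACF decays quickly; go to Step 2 and "
                    "identify an ARMA order for Z_t (e.g., from an ESACF table).")
        if persistent(r_y):
            return ("Case 1(b)/1(c): both SACF's persist; difference Y_t and the "
                    "X_it's if they satisfy a common difference equation, otherwise "
                    "remove the inputs of highest order, then repeat Step 1.")
        return "Residual SACF persists but the SACF of Y_t does not; re-examine the fit."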
4. EXAMPLES

Two examples are presented here for the proposed modeling procedure and for those properties shown in Section 2.

Example 1. Consider a linear regression model with nonstationary time series errors. Two hundred observations were generated from the model

Yt = 5.0 + 2.0 Xt + Zt,    (4.1a)

(1 − B)(1 − .5B)Zt = (1 + .5B)at,    (4.1b)

where Xt and at are iid N(5, 16) and N(0, 1), respectively. The data are plotted in Figure 1, and the fitted linear regression equation, assuming that the Zt's of (4.1a) were independent, is

Yt = −.104 + 2.43 Xt + êt,    (4.2)
     (1.54)   (.24)

where the values in parentheses are the corresponding standard deviations. The biases of the estimates are clearly shown in (4.2).

The SACF's of the original Yt and of the residuals êt of (4.2) are given, respectively, in (a) and (b) of Table 1, which clearly indicates that a nonstationary factor (1 − B) exists in the time series process. Following the proposed procedure, we take the first difference of both Yt and Xt and iterate the above step. Table 1(c) gives the SACF of the residuals εt obtained from the regression

∇Yt = 1.99 ∇Xt + εt,  ∇ = (1 − B),    (4.3)
       (.02)

and this time the SACF fails to support any nonstationary behavior of the differenced series.

Table 1. The SACF's of Yt, êt, and εt of Example 1

a. The SACF of Yt
Lags 1-12:   .74  .74  .77  .72  .66  .66  .63  .59  .61  .54  .51  .49
Lags 13-24:  .47  .44  .42  .39  .37  .35  .32  .29  .26  .23  .18  .18

b. The SACF of êt
Lags 1-12:   .97  .95  .92  .89  .86  .83  .79  .76  .73  .68  .66  .62
Lags 13-24:  .59  .56  .53  .50  .46  .43  .40  .37  .35  .32  .29  .26

c. The SACF of εt
Lags 1-12:   .73  .40  .22  .14  .09  .06  .04  .03  .02 -.04 -.08 -.05
Lags 13-24: -.07 -.05 -.00  .02  .01 -.03 -.05 -.05 -.05 -.07 -.08 -.08

The ESACF table and its simplified version for εt in (4.3) are then computed and presented in Table 2. An ARMA(1, 1) model is clearly suggested for the time series component from this table. Thus, as expected, the generating model (4.1) is correctly specified by the proposed procedure.

Table 2. The ESACF Table of εt of Example 1

a. The ESACF Table

AR\MA    0     1     2     3     4     5     6
 0      .73   .40   .22   .14   .09   .06   .04
 1      .36  -.02  -.06  -.01   .00   .00  -.02
 2      .39  -.15  -.07   .03   .01  -.01  -.01
 3      .16   .01  -.09  -.03   .01  -.01  -.01
 4      .10   .02  -.11  -.01  -.01  -.01  -.02
 5     -.23   .43  -.08  -.02   .00  -.01  -.03
 6      .46  -.37   .07   .02   .01   .00  -.03

b. The Indicator Symbols*

AR\MA   0  1  2  3  4  5  6
 0      X  X  X  0  0  0  0
 1      X  0  0  0  0  0  0
 2      X  X  0  0  0  0  0
 3      X  0  0  0  0  0  0
 4      0  0  0  0  0  0  0
 5      X  X  0  0  0  0  0
 6      X  X  0  0  0  0  0

* X denotes that the absolute value of the corresponding entry is greater than or equal to 2n^{-1/2}, and 0 otherwise.

[Figure 1. Data Plot of Example 1.]

For this example, a brief discussion is in order. In Table 1, the SACF of êt seems to outperform that of Yt in pinpointing the nonstationary factor (1 − B). However, it is incorrect in this situation to identify a model for the time series component basing upon the differenced process (1 − B)êt, a procedure likely to be employed in practice. The reason is as follows. The input variable Xt in this case is of order Op(n); that is, Σ_{t=1}^{n} Xt² = Op(n). On the other hand, the nonstationarity of the original process indicates that Σt Zt² = Op(n^{2+δ}) with δ ≥ 0. Therefore, the time series component Zt is dominating, and the nonstationary factor of it must be removed before accessing the effect of the input variable Xt. In fact, the SACF of (1 − B)êt given in Table 3 would misidentify an MA(1) model for the differenced time series.

Table 3. The SACF of (1 − B)êt of Example 1
Lags 1-12:  -.15  .07  .10  .06 -.04  .08  .02 -.09  .13 -.10 -.02 -.01
Lags 13-24: -.00 -.00 -.01  .02 -.02 -.01 -.07 -.01  .02  .02 -.06  .04
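[Editor's illustration, not part of the original article.] To reproduce the flavor of Example 1, one can simulate model (4.1) and repeat the two regressions. The following Python sketch (numpy assumed) is a fresh random draw, so it will not match the exact estimates in (4.2) and (4.3), but the differenced slope should be close to the true value 2.0.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 200
    x = rng.normal(5.0, 4.0, size=n)          # X_t ~ N(5, 16)
    a = rng.standard_normal(n)                # a_t ~ N(0, 1)

    # (1 - B)(1 - .5B) Z_t = (1 + .5B) a_t, i.e.
    # Z_t = 1.5 Z_{t-1} - .5 Z_{t-2} + a_t + .5 a_{t-1}, as in (4.1b)
    z = np.zeros(n)
    for t in range(2, n):
        z[t] = 1.5 * z[t - 1] - 0.5 * z[t - 2] + a[t] + 0.5 * a[t - 1]
    y = 5.0 + 2.0 * x + z                     # model (4.1a)

    # Naive regression ignoring the serial dependence in Z_t, as in (4.2)
    b_naive, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), x]), y, rcond=None)

    # Regression in first differences, as in (4.3), without an intercept
    dy, dx = np.diff(y), np.diff(x)
    b_diff = np.sum(dx * dy) / np.sum(dx * dx)
    print(b_naive, b_diff)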

Example 2. The Taiwan Highway Bureau data, consisting of 168 observations of the monthly highway traffic volume in Taiwan from 1963 to 1976, were employed by Liu (1980) to illustrate the effects of the Chinese New Year on time series modeling. The model proposed by Liu is

Yt = β1 X1t + β2 X2t + Zt,

(1 − B)(1 − B^{12})Zt = (1 − θ1 B^{12})at,    (4.4)

where Yt is the traffic volume, X1t = et, and X2t = et (T − 1962), with T representing the year and et the proportion of the new year period in the tth month. In the process of arriving at this model, Liu noticed that the first difference (1 − B) of Zt is clearly suggested by the SACF of Yt, but the seasonal difference (1 − B^{12}) is not by the SACF of (1 − B)Yt. Moreover, the need of a seasonal difference is indicated by the SACF of (1 − B)Vt, where Vt is an adjusted series of Yt obtained by removing the calendar effects, that is, the Chinese New Year effects.

Assuming that model (4.4) is adequate in describing the data, we now give some theoretical explanations for these phenomena observed by Liu. Since 1 is a double root of the nonstationary AR part of Zt, by a result in Tiao and Tsay (1983), Σ Zt² = Op(n^4). It is also clear from the definitions and (2.12) that Σ X1t² = O(n) and Σ X2t² = O(n³). By Theorem 2.1, the SACF of Yt satisfies asymptotically (1 − B)rY(l) = 0, because 1 is the only root with multiplicity 2. Next, taking the difference (1 − B) reduces the multiplicity of the nonstationary roots to 1, which in turn implies that Σ Wt² = Op(n²), where Wt = (1 − B)Zt. On the other hand, ∇X2t = X2t − X_{2,t−1} = (et − e_{t−1})(T − 1962) + e_{t−1}, which gives Σ (∇X2t)² = O(n³). Consequently, the input variable ∇X2t becomes the dominating factor in the differenced series. The puzzling behavior of the SACF of (1 − B)Yt observed by Liu is then understood.

5. NONLINEAR REGRESSION MODELS

We consider some nonlinear situations; that is, f(Xt, β) is nonlinear. In these cases, Assumption 5 of Section 2 is no longer valid and must be replaced by one that assures the existence of a unique solution for the problem encountered. Moreover, unlike that of the linear cases, the impacts of the exogenous variables now also depend on the true structure of f(Xt, β). For example, the order of the input variable t itself in (2.4) is O(n³), but that of cos(tβ4) and sin(tβ4) is only O(n). The SACF of Yt would then satisfy a homogeneous difference equation if the time series component Zt is nonstationary and the number of observations is large, because any nonstationary root would give Zt order Op(n²), which suffices to override the effects of the exogenous variables. On the other hand, the effect of the input variable t in (2.5) always upsets that of any nonstationary factors in Zt if β2 > 0, mainly because n^v/(Σ exp(2β2 t)) → 0 as n → ∞ for any finite v if β2 > 0. Thus for (2.5), a nonlinear regression must be fitted to remove the input effects before attempting to specify the order of Zt. In general, results similar to Theorems 2.1 and 2.2 of the linear cases can be established whenever the orders of the effects of the input variables are accessible. The modeling procedure of Section 3 can then be extended to the nonlinear situation. In practice, if the orders of the input effects are unavailable, for instance when the sign of β2 in (2.5) is unknown a priori, we recommend that one fit the nonlinear regression first and then choose a proper procedure based upon the fitted results.
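[Editor's illustration, not part of the original article.] To make the removal of the input effects in a model like (2.4) concrete, one could fit the harmonic regression by nonlinear least squares and then examine the residual series. The sketch below assumes scipy and numpy are available, uses simulated data, and ignores the serial correlation in Zt at this first stage.

    import numpy as np
    from scipy.optimize import curve_fit

    def harmonic(t, b1, b2, b3, b4):
        # f(X_t, beta) of (2.4): b1 + b2*cos(t*b4) + b3*sin(t*b4)
        return b1 + b2 * np.cos(t * b4) + b3 * np.sin(t * b4)

    # Simulated data for illustration only: a harmonic signal plus AR(1) noise.
    rng = np.random.default_rng(2)
    n = 240
    t = np.arange(1, n + 1, dtype=float)
    z = np.zeros(n)
    for i in range(1, n):
        z[i] = 0.7 * z[i - 1] + rng.standard_normal()
    y = harmonic(t, 10.0, 2.0, 1.0, 2 * np.pi / 12.0) + z

    # Nonlinear LS fit of the regression part (serial correlation ignored here).
    est, _ = curve_fit(harmonic, t, y, p0=[y.mean(), 1.0, 1.0, 2 * np.pi / 12.0])
    z_hat = y - harmonic(t, *est)   # estimated time series component, to be
                                    # examined (SACF/ESACF) for an ARMA order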
6. CONCLUDING REMARKS

In this article, we have proved some properties of regression models with nonstationary time series errors. These results provide information on when and how to remove the effects of exogenous variables in the process of model building. An order selection procedure was then proposed. We conclude by discussing some related and important issues.

1. Simultaneous Estimation. After a model is specified for the time series component Zt, all the parameters in model (2.3) should be estimated simultaneously. This can be done by using either the nonlinear least squares or the maximum likelihood methods (a code sketch follows at the end of this section). In the case of stationary ARMA errors, a useful state space procedure for evaluating the likelihood function was proposed by Harvey and Phillips (1979). Two problems, however, remain open when Zt is nonstationary. First, the initial value κ used by Harvey and Phillips might become critical because nonstationary series tend to have long memory. It is then necessary to have large sample sizes to reduce the effect of initial values. Second, distributional properties such as the asymptotic normality of the estimates become unknown. For instance, it has been shown that the distribution of the LS estimate (asymptotically equivalent to the MLE) of the AR parameter in a nonstationary AR(1) model is different from that in a stationary AR(1) model; see, for example, Dickey and Fuller (1979) and Ahtola (1983).

2. Regression Specification. In this article, we assume that the functional form of the regression equation is known. The question of selecting the best regression model, however, is likely to arise in applications. One possible solution is imbedded in the diagnostic checkings. More specifically, one incorporates at the beginning stage of a modeling process all the available input variables that are known or suspected to have effects on the response series Yt, and then one deletes those unimportant variables, judging in terms of the corresponding standard deviations, towards the end. Of course, some approaches that can identify models for both the regression and the time series components at the model specification stage of an analysis need to be developed.

3. Differencing. Finally, it may be worthwhile mentioning that differencing the observed nonstationary series Yt to achieve stationarity in the time series component Zt may also reduce the effects of exogenous variables. For instance, the first difference (1 − B) removes the constant term from f(Xt, β) and transforms the linear trend, if any, into a constant.
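[Editor's illustration, not part of the original article.] Relating to the simultaneous-estimation remark above: a common present-day route is Gaussian maximum likelihood for the regression and ARMA parameters jointly through a state space form. The sketch below uses the SARIMAX class of statsmodels as one such implementation on simulated data; it is not the procedure of Harvey and Phillips (1979) itself.

    import numpy as np
    import statsmodels.api as sm

    # Regression with ARMA(1,1) errors, all parameters estimated jointly by
    # (state space) maximum likelihood: Y_t = c + beta*X_t + Z_t with
    # (1 - phi*B) Z_t = (1 + theta*B) a_t  (illustrative coefficients).
    rng = np.random.default_rng(3)
    n = 300
    x = rng.normal(5.0, 4.0, size=n)
    a = rng.standard_normal(n)
    z = np.zeros(n)
    for t in range(1, n):
        z[t] = 0.6 * z[t - 1] + a[t] + 0.4 * a[t - 1]
    y = 5.0 + 2.0 * x + z

    fit = sm.tsa.SARIMAX(y, exog=x[:, None], order=(1, 0, 1), trend="c").fit(disp=False)
    print(fit.params)   # intercept, regression coefficient, AR, MA, and error variance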

APPENDIX

Proof of Theorem 2.1

Since Yt = Σ βi Xit + Zt, we have Yt − Ȳ = Σ_{i=1}^{k} βi (Xit − X̄i) + (Zt − Z̄). By this relationship, the assumption 2m > δ1, the Cauchy-Schwarz inequality (2.12), and properties (2.8) and (2.9), it is readily shown that

Σ_{t=1}^{n} (Yt − Ȳ)² = Op(n^{2m})    (A.1)

and

[Σ_{t=1}^{n} (Yt − Ȳ)²]^{−1} = Op(n^{−2m}).    (A.2)

Letting Vt = U1(B)Yt and Wt = U1(B)Zt, we have that

Vt = Σ_{i=1}^{k} βi [U1(B)Xit] + Wt    (A.3)

and

U1(B)rY(l) ≈ [Σ_{t=1}^{n} (Yt − Ȳ)²]^{−1} [Σ_{t=l+d1+1}^{n} (Y_{t−l} − Ȳ)(Vt − V̄)],    (A.4)

where ≈ denotes asymptotic equivalence. Since the highest multiplicity of the nonstationary AR roots of Wt is m − 1 and 2m > δ1, it is clear from (2.8) and (A.3) that

Σ_{t=l+d1+1}^{n} (Vt − V̄)² = Op(n^δ),    (A.5)

Models," unpublished Ph.D. thesis, University of Wisconsin at Mad-


where 8 = max{ N, 2(m - 1), 1} < 2m. Finally, by (2.12),
ison.
(A.1), (A.2), (A.4), and (A.5),
ANDERSON, R.L. (1954), "The Problem of Autocorrelation in Regres-
sion Analysis," Journal of the American Statistical Association, 49,
Ui(B)ry(l) = Op(n- +8/2 113-129.
BELL, W.R., and HILLMER, S.C. (1983), "Modeling Time Series
and the result follows.
With Calendar Variation," Technical Report, Bureau of the Census.
BOX, G.E.P., and JENKINS, G.M. (1976), "Time Series Analysis:
Proof of Theorem 2.2 Forecasting and Control," San Francisco: Holden-Day.
BOX, G.E.P., and TIAO, G.C. (1975), "Intervention Analysis With
For simplicity in presentation, we assume that there Applications to Environmental and Economic Problems," Journal of
the American Statistical Association, 70, 70-79.
are only two distinct values in bj but it will be clear that
CHANG, I. (1982), "Outliers in Time Series," unpublished Ph.D. the-
the techniques employed can be readily extended to the sis, University of Wisconsin at Madison.
general situation. Denote these values by 81 > 2m ' 82. CLEVELAND, W.S., and DEVLIN, S.J. (1982), "Calendar Effects in
Monthly Time Series: Modeling and Adjustment," Journal of the
Rearranging the input variables Xi, if necessary, we may
American Statistical Association, 77, 520-528.
assume that the first b components share the same order DICKEY, D.A., and FULLER, W.A. (1979), "Distribution of the Es-
51 and the remaining c = k - b variables are of order timators for Autoregressive Time Series With a Unit Root," Journal
of the American Statistical Association, 74, 427-431.
82. The estimation errors of LS estimates in (2.14) are
DURBIN, J. (1960), "Estimation of Parameters in Time-Series Regres-
n )- (n ) sion Models," Journal of the Royal Statistical Society, Ser. B, 22,
- = (~xtxt'{'( x1zt) (A.6) 139-153.
FULLER, W.A. (1976), Introduction to Statistical Time Series, New
York: John Wiley.
Let G = diag{GI, G2}with G = n bIb andG2A.R.,
GALLANT, = nf82Ic,
and GOEBEL, J.J. (1976), "Nonliner Regression
With Autocorrelated Errors," Journal of the American Statistical
where Ij denotes thej x j identity matrix. Then (A.6) can
Association, 71, 961-967.
be rewritten as HANNAN, E.J. (1971), "Nonlinear Time Series Regression," Journal
of Applied Probability, 8, 767-780.
HARVEY, A.C., and PHILLIPS, G.D.A. (1979), "Maximum Likeli-
- p= (G- xtx,'> (G-' XtZ,). (A.7) hood Estimation of Regression Models With Autoregressive-Moving
Average Disturbances," Biometrika, 66, 49-58.
HILLMER, S.C., BELL, W.R., and TIAO, G.C. (1982), "Modeling
By assumptions and the Cauchy-Schwarz inequality Considerations in Seasonal Adjustment of Economic Time Series,"
(2.12), the orders of (A.7) can be written as in Proceedings of the Conference on Applied Time Series Analysis,
ed. A. Zellner, U.S. Department of Commerce, Bureau of the Census.
0_ 0(1) 0(nw)1--FOP(nA) 1 LIU, L.M. (1980), "Analysis of Time Series With Calendar Effects,"
Management Science, 26, 106-112.
L O(nw) 0(1) J LOp(n -)J PFEFFERMAN, D., and FISHER, J. (1980), "Festival and Working
Days Prior Adjustments in Economic Time Series," in Time Series,
where w = (8, - 82)/2, K = (85 - 2m)/2 and y = (82 ed. O.D. Anderson, New York: North-Holland.
- 2m)/2, and the partition is according to the dimensions PIERCE, D.A. (1971a), "Least Squares Estimation in the Regression
b and c. Since - w - y = - (8 I - 2m)/2 < 0 and - X < Model With Autoregressive-Moving Average Errors," Biometrika,
58, 299-312.
0, the result (a) holds. On the other hand, because w -
(1971b), "Distribution of Residual Autocorrelations in the
A = - y = (2m - 82) > 0 the results (b) and (c) follow Regression Model With Autoregressive-Moving Average Errors,"
immediately. Journal of the Royal Statistical Society, Ser. B, 33, 140-146.
TIAO, G.C., and TSAY, R.S. (1983), "Consistency Properties of Least
[Received December 1982. Revised July 1983.] Squares Estimates of Autoregressive Parameters in ARMA Models,"
Annals of Statistics, 11, 856-871.
REFERENCES TSAY, R.S., and TIAO, G.C. (1984), "Consistent Estimates of Auto-
regressive Parameters and Extended Sample Autocorrelation Func-
AHTOLA, J.A. (1983), "Studies in Asymptotic and Finite Sample In- tion for Stationary and Nonstationary ARMA Models," Journal of
ference of Nonstationary and Nearly Nonstationary Autoregressive the American Statistical Association, 79, 84-96.
