Application of the ARIMA Model for Research



ARIMA Model
Application of the ARIMA Model for Research

Jindamas Sutthichaimethee*

*Lecturer, Plan and Policy Analyst, Ministry of Science and Technology



Abstract

This article describes how to build the best-fitting ARIMA model and how to extend it with structural variables to obtain an ARIMAX model. It sets out the steps of this statistical modelling process and shows how the resulting model can be used for forecasting with maximum efficiency. Most economic data are non-stationary, so the researcher must first transform the series to make them stationary; if the series are cointegrated, an Error Correction Mechanism must also be incorporated into the model. Building the best model, one that is correctly specified and appropriate for the type of data, yields low forecast errors and forecasts that can be used with confidence.

Keywords: Structure Variable / Time Series Data / The Best Model


1. Introduction

A model is a simplified representation of a system that is used to explain its behaviour and to forecast its future values. One of the most widely used approaches for building time series forecasting models is the Box-Jenkins methodology, developed by George E.P. Box and Gwilym M. Jenkins in 1970 (with a revised edition in 1994), on which the ARIMA model is based. The Box-Jenkins approach treats time series data as the realisation of a stochastic process, which may be either a stationary or a nonstationary time series. Because the method requires stationary data, a nonstationary series must first be transformed into a stationary one; only then will the fitted model produce forecasts that lie close to the actual values.

2. Concepts underlying the ARIMA model

Before an ARIMA model can be built with the Box-Jenkins approach, three concepts must be understood: (1) stationarity, (2) cointegration, and (3) the Error Correction Mechanism (ECM).

1. Stationarity. Let Y_t be a stochastic time series variable. Y_t is stationary if it satisfies three conditions:

Mean:        E(Y_t) = E(Y_{t-k}) = μ
Variance:    Var(Y_t) = E(Y_t - μ)² = E(Y_{t-k} - μ)² = σ²
Covariance:  γ_k = E[(Y_t - μ)(Y_{t-k} - μ)]

A process with these properties is called a stationary stochastic process: its mean (expected value), variance and covariance are constant over time, and the covariance γ_k between Y_t and Y_{t-k} depends only on the distance or lag k between the two observations, not on the point in time t.

Figure 1: A stationary time series

A nonstationary series, in contrast, violates at least one of these conditions, so that its mean, variance or covariance changes with time t:

Mean:        E(Y_t) = E(Y_{t-k}) = μ_t
Variance:    Var(Y_t) = E(Y_t - μ)² = E(Y_{t-k} - μ)² = σ²_t
Covariance:  γ_{k,t} = E[(Y_t - μ)(Y_{t-k} - μ)]

A process of this kind is a nonstationary stochastic process: its moments depend on the time t at which they are measured. A typical example of a nonstationary series is a random walk.

Figure 2: A nonstationary time series
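To make the distinction in Figures 1 and 2 concrete, the following is a minimal simulation sketch in Python (the coefficient 0.5, the sample size and the seed are illustrative values, not taken from the original study): a stationary AR(1) process keeps returning to its mean, while a random walk, which contains a unit root, drifts away from it.

```python
# Minimal simulation sketch: stationary AR(1) versus nonstationary random walk.
import numpy as np

rng = np.random.default_rng(0)
n = 200
eps = rng.normal(0.0, 1.0, n)        # white-noise errors

ar1 = np.zeros(n)                    # stationary: Y_t = 0.5*Y_{t-1} + e_t, |phi| < 1
walk = np.zeros(n)                   # nonstationary: Y_t = Y_{t-1} + e_t (unit root)
for t in range(1, n):
    ar1[t] = 0.5 * ar1[t - 1] + eps[t]
    walk[t] = walk[t - 1] + eps[t]

# The AR(1) series fluctuates around a constant mean with constant variance;
# the random walk's variance grows with t, so its level wanders over time.
print(ar1.var(), walk.var())
```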


Whether a series is stationary is tested with the Dickey-Fuller (DF) and Augmented Dickey-Fuller (ADF) tests. A nonstationary series is said to contain a unit root. If unit-root (nonstationary) series are used directly in a regression model estimated by Ordinary Least Squares (OLS), the results may appear statistically significant even though no true relationship exists; this is the problem of spurious regression ( , 2549). Stationarity itself comes in two forms: weak stationarity requires only that the first and second moments (mean, variance and autocovariances) be constant over time, whereas strict stationarity requires that all moments of the distribution be time-invariant. In practice it is the first two moments that are tested.

The distinction between stationary and nonstationary data matters for forecasting. In a stationary series the effect of a shock dies out over time and the observations keep returning to a constant long-run mean level. In a nonstationary series the effect of a shock persists, the series drifts away from any fixed mean, and a model estimated on such data gives unreliable forecasts ( , 2544).

The Dickey-Fuller test, also known as the unit root test, examines whether a series is nonstationary by estimating a first-order autoregressive process in one of three forms of the difference regression:

a. ΔY_t = δY_{t-1} + ε_t                     (pure random walk process)
b. ΔY_t = α + δY_{t-1} + ε_t                 (random walk with drift, i.e. with an intercept)
c. ΔY_t = α + βT + δY_{t-1} + ε_t            (random walk with drift and a linear time trend, where α is the drift term and T is the time trend)

where ΔY_t = Y_t - Y_{t-1} is the first difference of the series, δ is the coefficient of the lagged level Y_{t-1} (δ = ρ - 1, with ρ the coefficient in the level equation Y_t = ρY_{t-1} + ε_t), and ε_t is an error term with mean 0 and variance σ². The hypotheses of the unit root test are

H0: δ = 0, the series is nonstationary (contains a unit root)
H1: δ < 0 (equivalently ρ < 1), the series is stationary

If H0 is accepted (δ = 0), Y_t is nonstationary; if δ > 0 (ρ > 1) the series is explosive, growing exponentially over time. The three forms of the test equation are therefore written as:

a. ΔY_t = δY_{t-1} + ε_t                                             (1)

Equation (1) contains no constant, so the series fluctuates around a zero mean with no drift term.

b. ΔY_t = β1 + δY_{t-1} + ε_t                                        (2)

where β1 is the drift term. Equation (2) is used in the unit root test to distinguish a trend stationary (TS) series from a difference stationary (DS) series.

c. ΔY_t = β1 + β2T + δY_{t-1} + ε_t                                  (3)

where T is a time trend, β2 is its coefficient, and the error term is assumed to satisfy ε_t ~ IID(0, σ²).

A time series that becomes stationary only after taking the first difference, ΔY_t = Y_t - Y_{t-1}, is called difference stationary. The test equation estimated at level is

ΔY_t = β1 + β2T + δY_{t-1} + ε_t                                     (4)

If, at level, H0 cannot be rejected (δ = 0), the series is nonstationary. The decision is based on the tau statistic: when the absolute value of the tau statistic is smaller than the absolute value of the DF critical value, H0 is accepted. The DF test assumes that the error term ε_t is white noise; if the errors are autocorrelated, the Augmented Dickey-Fuller (ADF) test is used instead. The ADF test improves the goodness of fit of the Dickey-Fuller regression by adding lagged values of the dependent variable (lagged first differences) so as to remove the autocorrelation. The hypotheses of the unit root test are the same:

H0: δ = 0, the series is nonstationary
H1: δ < 0 (ρ < 1), the series is stationary

If, at level, H0 is rejected and H1 accepted, the series is stationary; this is the case when the absolute value of the tau statistic exceeds the absolute value of the ADF critical value and the error term is white noise. A series Y_t that becomes stationary only after being differenced d times is said to be integrated of order d, written Y_t ~ I(d).

The ADF test augments each form of the DF regression with p lagged first differences of the variable:

ΔY_t = δY_{t-1} + Σ α_i ΔY_{t-i} + ε_t                               (5)

ΔY_t = β1 + δY_{t-1} + Σ α_i ΔY_{t-i} + ε_t                          (6)

ΔY_t = β1 + β2T + δY_{t-1} + Σ α_i ΔY_{t-i} + ε_t                    (7)

where p is the number of lagged values of the first difference of the variable and the summation runs from i = 1 to p. Equations (5), (6) and (7) are the Augmented Dickey-Fuller counterparts of the three DF equations; the drift form (6) is the one most commonly used. The only difference between the DF and ADF tests is that the ADF test adds the lagged first differences so that the error term becomes white noise, i.e. has zero mean, constant variance and no autocorrelation.
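In practice the ADF test is carried out with standard software. The following is a minimal sketch in Python using statsmodels, assuming y is a pandas Series holding one of the series under test; adfuller returns the tau statistic, the number of lagged differences used, and the MacKinnon critical values.

```python
# Minimal ADF unit root test sketch (assumes `y` is a pandas Series).
import pandas as pd
from statsmodels.tsa.stattools import adfuller

def adf_report(y: pd.Series, regression: str = "ct") -> None:
    """Augmented Dickey-Fuller test.

    regression: "n" (no constant), "c" (drift) or "ct" (drift and trend),
    corresponding to equations (5), (6) and (7).
    """
    stat, pvalue, usedlag, nobs, crit, _ = adfuller(
        y.dropna(), regression=regression, autolag="AIC"
    )
    print(f"tau statistic: {stat:.4f}  (p-value {pvalue:.4f}, {usedlag} lags)")
    for level, cv in crit.items():   # MacKinnon critical values
        print(f"critical value {level}: {cv:.4f}")
    # The series is judged stationary when |tau| exceeds |critical value|,
    # i.e. when H0 (unit root) is rejected.
```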

2. Cointegration

Engle and Granger introduced the concept of cointegration: two or more time series are cointegrated when they share a long-run equilibrium (steady state) relationship. Each series may itself be nonstationary, but if a linear combination of them is stationary, the series move together in the long run. Engle and Granger propose testing this by examining the error (residual) from the cointegrating regression with the Dickey-Fuller (DF) or Augmented Dickey-Fuller (ADF) test; if the hypothesis of a unit root in the residual is rejected, the linear combination is stationary and the series are cointegrated. The test proceeds in three steps.

Step 1. Determine the order of integration of the dependent variable (Y_t) and the independent variable (X_t) with unit root tests. Cointegration requires the variables to be integrated of the same order; if they are not, there is no cointegration.

Step 2. Estimate the cointegrating regression by Ordinary Least Squares (OLS) and obtain the error term:

u_t = Y_t - β1 - β2X_t                                               (8)

Step 3. Test whether u_t is stationary, i.e. whether the linear combination behaves as white noise, using the Augmented Dickey-Fuller (ADF) test, which corrects for autocorrelation in the residuals. If H0 is rejected and H1 accepted, that is, if the absolute value of the tau test statistic exceeds the absolute value of the MacKinnon tau critical value, then u_t is stationary (has no unit root) and Y_t and X_t are cointegrated. If instead H0 is accepted and H1 rejected, u_t is nonstationary (has a unit root) and Y_t and X_t are not cointegrated.
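The Engle-Granger procedure can be sketched in Python as follows, assuming y and x are pandas Series that have already been found to be integrated of the same order; statsmodels also provides coint(), which runs the residual-based test with the appropriate critical values in one call.

```python
# Engle-Granger two-step cointegration sketch (assumes pandas Series `y`, `x`).
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller, coint

# Step 2: cointegrating regression Y_t = b1 + b2*X_t + u_t, estimated by OLS.
coint_reg = sm.OLS(y, sm.add_constant(x)).fit()
u = coint_reg.resid                          # estimated u_t from equation (8)

# Step 3: unit root test on the residuals (no constant in the test equation).
tau = adfuller(u, regression="n")[0]
print("tau statistic for u_t:", round(tau, 4))

# One-call alternative that applies the Engle-Granger critical values:
t_stat, p_value, crit_values = coint(y, x, trend="c")
print(t_stat, p_value, crit_values)
```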
3. Error Correction Mechanism (ECM)

Even when variables are cointegrated, i.e. tied together by a long-run equilibrium relationship, they can deviate from that equilibrium in the short run. The Error Correction Mechanism describes this short-run dynamic adjustment back towards the long-run relationship, and is widely used in economic models, particularly macro models. An ECM term should be included only when the variables are cointegrated; if the series are stationary or not cointegrated, the ECM term is omitted. The ECM model takes the form

ΔY_t = α + Σ β_i ΔX_{t-i} + γu_{t-1} + ε_t                           (9)

In equation (9) the ECM model includes the lagged error term (u_{t-1}) obtained from the cointegrating regression between Y_t and X_t. The coefficient γ of this error-correction term measures the speed of adjustment: it shows how much of the previous period's deviation of Y_t from its long-run equilibrium with X_t is corrected in the current period, pulling Y_t back towards that equilibrium.
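A minimal sketch of equation (9) in Python follows, assuming y, x and the cointegrating residual u from the previous step (all names are illustrative):

```python
# ECM sketch for equation (9): regress dY_t on dX_t and the lagged residual u_{t-1}.
import pandas as pd
import statsmodels.api as sm

ecm_data = pd.DataFrame({
    "dY": y.diff(),        # first difference of the dependent variable
    "dX": x.diff(),        # first difference of the explanatory variable
    "u_lag": u.shift(1),   # error-correction term u_{t-1}
}).dropna()

ecm_fit = sm.OLS(ecm_data["dY"],
                 sm.add_constant(ecm_data[["dX", "u_lag"]])).fit()
print(ecm_fit.params)      # the coefficient on u_lag is the speed of adjustment
```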

3. Building the ARIMA model

Building an ARIMA model with the Box-Jenkins approach involves four steps: (1) identification, (2) parameter estimation, (3) diagnostic checking, and (4) forecasting.

1. Identification

In the identification step a tentative form of the model is chosen. The Box-Jenkins method requires the series to be stationary and the model to be invertible. The candidate forms are the following.
a. Autoregressive model of order p, AR(p):

Y_t = φ0 + φ1Y_{t-1} + φ2Y_{t-2} + ... + φpY_{t-p} + ε_t             (10)

In equation (10) the current value Y_t depends on its own past values. For the AR(1) model,

Y_t = φ0 + φ1Y_{t-1} + ε_t                                           (11)

the stationarity condition is |φ1| < 1. For the AR(2) model,

Y_t = φ0 + φ1Y_{t-1} + φ2Y_{t-2} + ε_t                               (12)

the stationarity conditions are φ1 + φ2 < 1, φ2 - φ1 < 1 and |φ2| < 1.

b. Moving average model of order q, MA(q):

Y_t = μ + ε_t - θ1ε_{t-1} - θ2ε_{t-2} - ... - θqε_{t-q}              (13)

In equation (13) the current value Y_t depends on the current and past error terms. For the MA(1) model,

Y_t = μ + ε_t - θ1ε_{t-1}                                            (14)

the condition for the model to be invertible (stationary) is |θ1| < 1. For the MA(2) model,

Y_t = μ + ε_t - θ1ε_{t-1} - θ2ε_{t-2}                                (15)

the invertibility conditions are θ1 + θ2 < 1, θ2 - θ1 < 1 and |θ2| < 1.

c. Mixed autoregressive and moving average model of order p and q, ARMA(p, q):

Y_t = φ0 + φ1Y_{t-1} + φ2Y_{t-2} + ... + φpY_{t-p} + ε_t - θ1ε_{t-1} - θ2ε_{t-2} - ... - θqε_{t-q}

For the ARMA(1, 1) model,

Y_t = φ0 + φ1Y_{t-1} + ε_t - θ1ε_{t-1}                               (16)

the stationarity condition is |φ1| < 1 and the invertibility condition is |θ1| < 1.

d. Autoregressive integrated moving average model, ARIMA(p, d, q), where d is the order of differencing (the difference term).

ARIMA(0,1,1), also written IMA(1,1):

Y_t - Y_{t-1} = μ + ε_t - θ1ε_{t-1}                                  (17)

with invertibility condition |θ1| < 1.

ARIMA(1,1,0), also written ARI(1,1):

Y_t - Y_{t-1} - φ1(Y_{t-1} - Y_{t-2}) = μ + ε_t                      (18)

with stationarity condition |φ1| < 1.

ARIMA(1,1,1):

Y_t - Y_{t-1} - φ1(Y_{t-1} - Y_{t-2}) = μ + ε_t - θ1ε_{t-1}          (19)

with |φ1| < 1 for stationarity and |θ1| < 1 for invertibility.

ARIMA(0,1,0):

Y_t - Y_{t-1} = μ + ε_t                                              (20)

e. Seasonal autoregressive integrated moving average model, SARIMA(p, d, q)L, where d is the order of differencing and L is the length of the seasonal cycle. For a 12-period season, for example, a seasonal moving average model is

Y_t - Y_{t-12} = ε_t - θ*ε_{t-12}

with |θ*| < 1, where Y_t - Y_{t-12} is the seasonal difference over 12 periods and θ* is the parameter of the seasonal moving average model.
(Seasonal Moving Average Model)
2. Parameter estimation

In the parameter estimation step, the coefficients of the model identified in step 1 are estimated from the data, for example by Ordinary Least Squares (OLS).
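A sketch of the estimation step with statsmodels is shown below, assuming a tentative ARIMA(1, 1, 1) specification for y; statsmodels estimates the coefficients by maximum likelihood rather than OLS, but the output serves the same purpose.

```python
# Estimation sketch for a tentative ARIMA(1, 1, 1) model of y.
from statsmodels.tsa.arima.model import ARIMA

arima_fit = ARIMA(y, order=(1, 1, 1)).fit()   # order = (p, d, q)
print(arima_fit.summary())                    # coefficient estimates and test statistics
```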


3. Diagnostic checking

After estimation, the adequacy of the model is checked in two ways.

a. Testing the significance of the parameters. Each estimated coefficient is tested against zero with a t-statistic, with hypotheses H0: φ = 0 and H1: φ ≠ 0, computed as

t = φ̂ / S(φ̂)                                                        (21)

where φ̂ is the estimated coefficient and S(φ̂) is its standard error.

b. Testing whether the residuals are white noise with the Box-Pierce chi-square test (the Q statistic). The hypothesis is

H0: ρ1(e_t) = ... = ρk(e_t) = 0

that is, the autocorrelations of the residuals e_t, t = 1, 2, ..., n, are jointly zero. The Box-Pierce Q statistic is

Q = (n - d) Σ r_j²(e_t)                                              (22)

where n is the number of observations, d is the number of times the series was differenced to achieve stationarity, r_j²(e_t) is the squared autocorrelation of the residuals at lag j, and the summation runs from j = 1 to k. The statistic Q in (22) is compared with the critical value of the chi-square distribution with k - n_p degrees of freedom, where n_p is the number of estimated parameters. If Q is smaller than the critical value, the residuals are white noise and the model is adequate; if Q exceeds the critical value, the model must be re-specified.
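A sketch of this diagnostic step in Python, assuming arima_fit is the fitted model from the estimation sketch above: acorr_ljungbox in statsmodels reports the Box-Pierce Q of equation (22) together with the Ljung-Box refinement.

```python
# Diagnostic sketch: are the residuals e_t white noise?
from statsmodels.stats.diagnostic import acorr_ljungbox

resid = arima_fit.resid                              # residuals of the fitted model
checks = acorr_ljungbox(resid, lags=[12], boxpierce=True)
print(checks)                                        # Q statistics and p-values
# Large p-values: the residual autocorrelations are jointly zero, the model is adequate.
```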

4. Forecasting

The final step uses the model that has passed the diagnostic checks to produce forecasts, both point forecasts and interval forecasts; if the forecasts track the actual values closely, the model is adopted.
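A sketch of the forecasting step, again assuming arima_fit from the estimation sketch: get_forecast produces the point forecasts and conf_int the interval forecasts.

```python
# Forecasting sketch: point and interval forecasts for the next four periods.
pred = arima_fit.get_forecast(steps=4)
print(pred.predicted_mean)            # point forecasts
print(pred.conf_int(alpha=0.05))      # 95% interval forecasts
```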






4. Application of the ARIMA model

The ARIMA model by itself is a purely statistical model: it forecasts a variable from its own past values and past errors, without using any economic structure. Its forecasting performance can be improved by adding structural variables as explanatory regressors, which turns it into an ARIMAX (Autoregressive Integrated Moving Average with exogenous variables) model. The ARIMAX model therefore combines the time-series behaviour captured by the ARIMA model with the structural relationships among the variables.

As an illustration, data for B.E. 2538-2547 (1995-2004) were used to estimate the models, and forecasts were then produced for quarters 1-4 of B.E. 2548 (2005). Building the ARIMAX model involves three stages: testing each series for stationarity and determining its order of integration, testing for cointegration, and estimating the ARIMAX model itself, with an error correction term where the variables are cointegrated. The variables of the three models, (x), (y) and (z), are defined as follows.

Models (x), (y) and (z)

In each model the dependent variable at time t is explained by its own lagged values (the subscript t - i denotes a lag of i periods), by lagged values of the structural explanatory variables, among them gross domestic product (GDP) at lag t - i, by an error correction term (ECM) from the cointegrating regression, and by moving average terms MA(i) of order i; all variables enter the models in first-difference form.

Each of the nine variables used in the ARIMAX models, among them Et, gross domestic product (GDP) and It, was tested for stationarity with the unit root test, using the Augmented Dickey-Fuller (ADF) test. A series found to be nonstationary, i.e. containing a unit root, was transformed by differencing until it became difference stationary. Table 1 reports the unit root tests at level.
Table 1: Unit root test at level

Variable   ADF test statistic   MacKinnon critical values (1% / 5% / 10%)   Status
1          -3.2138              -4.2191 / -3.5331 / -3.1983                 I(0)
2          -2.4634              -4.2191 / -3.5331 / -3.1983                 I(0)
3          -1.3101              -4.2191 / -3.5331 / -3.1983                 I(0)
4          -3.2676              -4.2191 / -3.5331 / -3.1983                 I(0)
5          -2.6385              -4.2191 / -3.5331 / -3.1983                 I(0)
6          -1.4694              -4.2191 / -3.5331 / -3.1983                 I(0)
7          -1.7578              -4.2191 / -3.5331 / -3.1983                 I(0)
8          -1.9339              -4.2191 / -3.5331 / -3.1983                 I(0)
9          -8.9689              -4.2191 / -3.5331 / -3.1983                 I(0)

Note: all variables are in logarithms.


Only It is trend stationary; the other series are nonstationary at level and must be differenced to become stationary. If a series is still not stationary after first differencing, it contains two unit roots and must be differenced again; here one round of first differencing removes the unit root. From Table 1, the ADF test statistics at level are smaller in absolute value than the ADF critical values at the 1% and 5% significance levels, so the series are nonstationary and cannot yet be used in the Box-Jenkins procedure.
Table 2: Unit root test at first difference

Variable   ADF test statistic   MacKinnon critical values (1% / 5% / 10%)   Status
1          -6.2169              -4.2268 / -3.5366 / -3.2003                 I(1)
2          -6.0058              -4.2268 / -3.5366 / -3.2003                 I(1)
3          -4.3705              -4.2268 / -3.5366 / -3.2003                 I(1)
4          -5.2999              -4.2268 / -3.5366 / -3.2003                 I(1)
5          -6.6846              -4.2268 / -3.5366 / -3.2003                 I(1)
6          -4.8247              -4.2268 / -3.5366 / -3.2003                 I(1)
7          -4.6358              -4.2268 / -3.5366 / -3.2003                 I(1)
8          -3.4325              -4.2268 / -3.5366 / -3.2003                 I(1)

Note: all variables are in logarithms.


Table 2 shows the unit root tests at first difference: the ADF t-statistics now exceed the MacKinnon critical values in absolute terms at the 1%, 5% and 10% significance levels, so after one round of differencing the series are stationary and can be used to build the ARIMAX model.

The next step is the cointegration test, which checks whether the series, once made stationary by differencing, share a long-run equilibrium relationship. Because the variables are integrated of the same order, I(d), cointegration can be tested by comparing the ADF statistic of the residuals from each cointegrating regression with the MacKinnon critical values. Table 3 shows that the ADF statistics exceed the critical values in absolute terms at the 1%, 5% and 10% levels; the residuals are therefore stationary, the variables are cointegrated, and an Error Correction Mechanism can be included in the models.

Table 3: Engle-Granger cointegration test

Residual     ADF test statistic   MacKinnon critical values (1% / 5% / 10%)   Status
Residual x   -3.3441              -2.6272 / -1.9499 / -1.6115                 I(0)
Residual y   -3.5094              -2.6272 / -1.9499 / -1.6115                 I(0)
Residual z   -8.2431              -2.6272 / -1.9499 / -1.6115                 I(0)

Note: Residual x, Residual y and Residual z are the residuals from the cointegrating regressions of models (x), (y) and (z), respectively.


3. Estimation results of the ARIMAX models (x), (y) and (z)

Model (x): estimated coefficients 0.86181, 0.32296, 0.95722, 1.52574, 0.60955, 0.12175, 0.93264, 0.78845, 0.68147 and 0.000134, with t-statistics (23.12164)***, (5.08672)***, (42.50558)***, (23.76910)***, (8.60323)***, (2.57628)**, (3.88292)***, (2.73838)***, (0.68148)*** and (9.10267)***.
R² = 0.811650, Adjusted R² = 0.741019, LM statistic = 7.53944, ARCH test = 0.123382, Ramsey RESET test = 0.001763, Jarque-Bera = 0.341645.

Note: figures in parentheses are t-statistics; ***, ** and * denote significance at the 1%, 5% and 10% levels, respectively.

Model (y): estimated coefficients 0.34076, 0.99002, 1.28113, 1.04543, 1.30037, 0.86857 and 0.00003, with t-statistics (1.47138), (595634.7)***, (3.65073)***, (3.57268)***, (1.79248)*, (3.48704)*** and (2.91406)***.
R² = 0.342569, Adjusted R² = 0.236531, LM statistic = 5.346580, ARCH test = 0.551461, Ramsey RESET test = 0.444123, Jarque-Bera = 0.379525.

Model (z): estimated coefficients 0.78347, 0.88538, 1.03606, 1.21828, 0.30323 and 0.00009, with t-statistics (9.25121)***, (20.2410)***, (2.10172)**, (2.12692)**, (3.33231)*** and (4.75533)***.
R² = 0.810356, Adjusted R² = 0.776491, LM statistic = 2.363755, ARCH test = 0.709357, Ramsey RESET test = 0.171932, Jarque-Bera = 0.747478.
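Models of this kind can be estimated with the SARIMAX class in statsmodels, which accepts exogenous (structural) regressors alongside the ARIMA terms. The sketch below is illustrative only: exog is assumed to be a DataFrame holding the structural variables (for example lagged GDP and the error-correction term), and the order (1, 1, 1) is a placeholder rather than the specification used in the article.

```python
# ARIMAX estimation sketch: ARIMA terms plus exogenous structural regressors.
from statsmodels.tsa.statespace.sarimax import SARIMAX

arimax_fit = SARIMAX(y, exog=exog, order=(1, 1, 1)).fit(disp=False)
print(arimax_fit.summary())           # coefficients, test statistics, diagnostics
```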

To select the best model, the estimated models were used to forecast quarters 1-4 of B.E. 2548 (2005), and the forecasts were compared with the actual values using the Root Mean Square Forecast Error, as shown in Table 4.


Table 4: Comparison of actual and forecast values for quarters 1-4 of B.E. 2548 (2005), used to select the best model

Quarter (B.E. 2548)               Series 1            Series 2            Series 3
Quarter 1                         52,011 / 52,000     55,413 / 54,955     48,040 / 49,000
Quarter 2                         58,324 / 59,945     53,981 / 54,080     39,281 / 40,141
Quarter 3                         53,215 / 55,084     45,008 / 44,978     25,423 / 23,504
Quarter 4                         50,453 / 49,897     47,121 / 46,015     35,441 / 32,421
Root Mean Square Forecast Error   0.05                0.02                0.01

From Table 4, the model whose forecasts lie closest to the actual values, i.e. the model with the lowest Root Mean Square Forecast Error, is selected as the best model. If none of the estimated models is adequate, the procedure returns to step 1 and the ARIMA model is re-identified from the correlogram.
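The comparison in Table 4 can be reproduced by computing the Root Mean Square Forecast Error of each candidate model over the hold-out quarters. A minimal sketch follows; actual and forecast are assumed to be the four actual and four forecast values for one series (illustrative names).

```python
# Model selection sketch: Root Mean Square Forecast Error over the hold-out quarters.
import numpy as np

def rmsfe(actual, forecast):
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return np.sqrt(np.mean((actual - forecast) ** 2))

# The candidate model with the smallest RMSFE is selected as the best model.
```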
5. Conclusion

Building the best model for forecasting requires, first, that the data be prepared so that they satisfy the requirements of the Box-Jenkins approach and, second, that the fitted model be checked against the actual values. Extending the ARIMA model with structural variables yields the ARIMAX model, which captures both the time-series behaviour and the structural relationships in the data; the specification whose forecasts lie closest to the actual values, i.e. the one with the lowest forecast error, is chosen as the best model and can then be used for accurate forecasting.


References

[Thai-language reference], 2549.
[Thai-language reference], 1st printing, 2553.
[Thai-language reference], 1st printing, 2553.
[Thai-language reference], 1st printing, 2553.
[Thai-language reference], 2553.

Anderson, O.D. Time Series Analysis and Forecasting: The Box-Jenkins Approach. Butterworths, London, 1975.
Box, George and D. Pierce. "Distribution of Residual Autocorrelations in Autoregressive-Integrated Moving Average Time Series Models." Journal of the American Statistical Association 65 (1970), 1509-26.
Dickey, D.A. "Likelihood Ratio Statistics for Autoregressive Time Series with a Unit Root." Econometrica, 1981, 251-76.
Dickey, D.A. and W.A. Fuller (1979). "Distribution of the Estimators for Autoregressive Time Series with a Unit Root." Journal of the American Statistical Association, 74, pp. 427-431.
Draper, N.R. and Smith, H. Applied Regression Analysis, 2nd Edition. John Wiley & Sons, New York, 1981.
Granger, Clive and P. Newbold. "Spurious Regressions in Econometrics." Journal of Econometrics 2 (1974), 111-20.
Gardner, E.S., Jr. and McKenzie, E. "Forecasting Trends in Time Series." Management Science, Vol. 31, No. 10 (October 1985): 1237-46.
Johansen, S. and K. Juselius. "Maximum Likelihood Estimation and Inference on Co-integration: With Applications to the Demand for Money." Oxford Bulletin of Economics and Statistics 52 (February 1990).
Kolb, R.A. and Stekler, H.O. "Are Economic Forecasts Significantly Better Than Naive Predictions? An Appropriate Test." International Journal of Forecasting, Vol. 9, 1993, pp. 117-120.
Makridakis, S. "The Accuracy of Major Extrapolation (Time Series) Methods." Journal of Forecasting, 1, 1982, 111-153.
Montgomery, D.C., Johnson, L.A. and Gardiner, J.S. Forecasting and Time Series Analysis, 2nd Edition. McGraw-Hill Inc., New York, 1990.
Nelson, C.R. Applied Time Series Analysis for Managerial Forecasting. Holden-Day, San Francisco, 1973.
Newbold, P. and Granger, C.W.J. "Experience with Forecasting Univariate Time Series and the Combination of Forecasts." Journal of the Royal Statistical Society A, Vol. 137, 1974, pp. 131-146.
William W.S. Wei. Time Series Analysis: Univariate and Multivariate Methods. New York, USA: Addison-Wesley Publishing Company, 1990.
