Short-Run Electricity Demand Forecast in Maharashtra
To cite this article: Sajal Ghosh & Anjana Das (2002) Short–run electricity demand forecasts in
Maharashtra, Applied Economics, 34:8, 1055-1059, DOI: 10.1080/00036840110064656
Download by: [Orta Dogu Teknik Universitesi] Date: 26 February 2016, At: 12:16
Applied Economics, 2002, 34, 1055–1059
This paper has tried to forecast the monthly maximum electricity demand for the state of Maharashtra.
* Corresponding author: Energy Division, CII, Gate No. 31, North Block, J. N. Stadium, New Delhi 110 003, India. E-mail: sajal.ghosh@ciionline.org
1 In this case it is assumed that the trend includes the cyclical component. So, trend and seasonal components are the permanent components, whereas the random component captures all the idiosyncratic features of the series.
Applied Economics ISSN 0003-6846 print/ISSN 1466-4283 online © 2002 Taylor & Francis Ltd
http://www.tandf.co.uk/journals
DOI: 10.1080/00036840110064656
…process has become covariance-stationary. If the original series $X_t$ is homogeneous of degree $d$, then

$$\Delta^d X_t = (1 - L)^d X_t = Z_t, \qquad t = 1, 2, 3, \ldots, T \qquad (1)$$

is covariance-stationary. Here, $L$ is the backward shift operator. An integrated process $X_t$ is designated as an ARIMA $(p, d, q)$ if, by taking differences of order $d$, a stationary process $Z_t$ of the type ARMA $(p, q)$ is obtained. The ARIMA $(p, d, q)$ model is expressed by the function

$$Z_t = \phi_1 Z_{t-1} + \phi_2 Z_{t-2} + \cdots + \phi_p Z_{t-p} + u_t - \theta_1 u_{t-1} - \theta_2 u_{t-2} - \cdots - \theta_q u_{t-q}$$

or

$$\phi(L)(1 - L)^d X_t = \theta(L) u_t \qquad (2)$$
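Equation (1) can be illustrated with a short sketch (ours, not the authors' code): repeated application of $(1 - L)$ reduces a polynomial trend of degree $d$ to a constant, i.e. a stationary level. The function and series names below are our own.

```python
def difference(x, d=1):
    """Apply the differencing operator (1 - L)^d: difference the series d times."""
    for _ in range(d):
        x = [x[i] - x[i - 1] for i in range(1, len(x))]
    return x

# A quadratic trend is homogeneous of degree 2: two differences make it constant.
quadratic = [t ** 2 for t in range(10)]
print(difference(quadratic, d=1))  # [1, 3, 5, 7, 9, 11, 13, 15, 17]
print(difference(quadratic, d=2))  # [2, 2, 2, 2, 2, 2, 2, 2]
```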
Non-stationary homogeneous models with seasonal variations, ARIMA $(P, D, Q)_s$: In most monthly electricity time-series data, seasonal variation is one of the main sources of non-stationarity. To remove the seasonal non-stationarity of such a series, where the seasonality is yearly, one can proceed with seasonal differencing at period $s = 12$. The seasonal models ARIMA $(P, D, Q)_s$, which are not stationary but homogeneous of degree $D$, can be expressed as

$$Z_t = \Phi_1 Z_{t-s} + \Phi_2 Z_{t-2s} + \cdots + \Phi_P Z_{t-Ps} + d + u_t - \Theta_1 u_{t-s} - \Theta_2 u_{t-2s} - \cdots$$

or

$$\Phi_P(L^s)(1 - L^s)^D X_t = d + \Theta_Q(L^s) u_t \qquad (3)$$

where $\Phi$ and $\Theta$ are fixed seasonal autoregressive (AR) and moving average (MA) parameters.
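The seasonal differencing $(1 - L^s)$ just described can be sketched as follows (illustrative code with made-up numbers, not from the paper): subtracting the value twelve months earlier removes a repeating yearly pattern, leaving only the annual trend step.

```python
def seasonal_difference(x, s=12):
    """Apply (1 - L^s): subtract the observation s periods earlier."""
    return [x[t] - x[t - s] for t in range(s, len(x))]

# Monthly series: linear trend plus a repeating yearly pattern.
pattern = [5, 3, 0, -2, -4, -5, -5, -3, 0, 2, 4, 5]
x = [0.5 * t + pattern[t % 12] for t in range(36)]

# One seasonal difference removes the yearly pattern entirely;
# only the constant yearly trend step 0.5 * 12 = 6.0 remains.
print(seasonal_difference(x, s=12))  # [6.0, 6.0, ..., 6.0]  (24 values)
```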
General multiplicative seasonal models, ARIMA $(p, d, q)(P, D, Q)_s$²: These models take into account the effect of trend and seasonal fluctuations of a time series and are expressed as:

$$\Phi_P(L^s)\,\phi_p(L)\,(1 - L^s)^D (1 - L)^d X_t = \Theta_Q(L^s)\,\theta_q(L)\, u_t \qquad (4)$$
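The differencing part $(1 - L^s)^D (1 - L)^d$ of Equation (4) can be illustrated with a small sketch of our own (made-up series, not the paper's data): a series with both trend and yearly seasonality is reduced to zeros by one seasonal and one regular difference.

```python
def diff(x, lag=1):
    """Apply (1 - L^lag) to a series."""
    return [x[t] - x[t - lag] for t in range(lag, len(x))]

# Trend plus yearly seasonality, the deterministic structure the
# multiplicative differencing is meant to remove.
pattern = [4, 2, 0, -2, -4, -4, -2, 0, 2, 4, 4, 2]
x = [2.0 * t + pattern[t % 12] for t in range(40)]

# (1 - L^12) removes the seasonal pattern; (1 - L) then removes the trend.
y = diff(diff(x, lag=12), lag=1)
print(y)  # all zeros: the deterministic structure is gone
```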
Root mean square error (RMSE) criterion: To evaluate the performance of the model one can consider the RMSE criterion, which is defined as:

$$\mathrm{RMSE} = \left[ \frac{1}{T} \sum_t \left( \hat{X}_t - X_t \right)^2 \right]^{1/2} \qquad (5)$$

The post-sample RMSE measures the forecast accuracy of the estimated model: assuming the estimated model is representative during the forecast period, it is a guide to assess which model better explains the forecasted time series.
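Equation (5) can be computed directly; the demand figures below are made up for illustration and are not from the paper.

```python
def rmse(forecast, actual):
    """Post-sample root mean square error: [ (1/T) * sum (X_hat - X)^2 ]^(1/2)."""
    T = len(actual)
    return (sum((f - a) ** 2 for f, a in zip(forecast, actual)) / T) ** 0.5

# Hypothetical monthly maximum demand (MW) and one-step forecasts.
actual = [4100.0, 4250.0, 4400.0, 4300.0]
forecast = [4150.0, 4200.0, 4450.0, 4250.0]
print(rmse(forecast, actual))  # 50.0
```

A smaller post-sample RMSE indicates the better-forecasting model.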
ARIMA model building

For a given time series, it is important to know which ARIMA model is capable of generating the underlying series; in other words, which model adequately represents the behaviour of the time series concerned, so that forecasts of the series under study can be made precisely. Box–Jenkins consider model building as an iterative process which can be divided into four stages: identification, estimation, diagnostic checking and forecasting.

Identification: This stage tries to identify an appropriate ARIMA model for the underlying stationary time series on the basis of the sample autocorrelation function (ACF) and partial autocorrelation function (PACF). If the series is non-stationary, it is first transformed to covariance-stationarity; one can then identify the possible values of the regular part of the model, i.e. the autoregressive order p and the moving average order q in a univariate ARMA model, along with the seasonal part.

Estimation: In the estimation stage, point estimates of the coefficients can be obtained by the method of maximum likelihood. Associated standard errors are also provided, suggesting which coefficients could be dropped.

Diagnostic checking: In this stage, additional autoregressive and moving average variables can be added and their statistical significance examined. One should also examine whether the residuals of the model appear to be a white-noise process. After the model has been respecified, it is re-estimated and the diagnostic checks applied again, until the coefficients are reasonably statistically significant and the residuals are random.

Forecasting: After diagnostic checking comes the fundamental aim of the methodology, i.e. forecasting the future values of the time series.
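The sample ACF inspected in the identification stage can be sketched as follows (our own illustration; the software used by the authors is not stated). A strongly alternating series shows the large low-lag autocorrelations that would point to an AR or MA term.

```python
def acf(x, max_lag):
    """Sample autocorrelation function r_k = c_k / c_0 for k = 1..max_lag."""
    n = len(x)
    mean = sum(x) / n
    c0 = sum((v - mean) ** 2 for v in x) / n
    out = []
    for k in range(1, max_lag + 1):
        ck = sum((x[t] - mean) * (x[t - k] - mean) for t in range(k, n)) / n
        out.append(ck / c0)
    return out

# Alternating series: strong negative lag-1 autocorrelation, positive at lag 2.
x = [1.0, -1.0] * 50
print([round(r, 2) for r in acf(x, 3)])  # [-0.99, 0.98, -0.97]
```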
Lag   Chi-sq (χ²)   dof   Probability [Table value (χ²) > Observed (χ²)]
 6        3.60        2      0.16
12        6.18        8      0.62
18        9.12       14      0.82
24       10.51       20      0.95
30       15.33       26      0.95
36       21.37       32      0.92
42       27.88       38      0.88
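A minimal sketch of the residual Q-statistic behind diagnostics tables of this kind, assuming the Ljung-Box form $Q = n(n+2)\sum_{k=1}^{m} r_k^2/(n-k)$ (the paper does not state its software, so this is our own illustration, not a reproduction of the table above):

```python
def ljung_box_q(residuals, m):
    """Ljung-Box Q-statistic: Q = n(n+2) * sum_{k=1}^{m} r_k^2 / (n - k),
    where r_k is the lag-k sample autocorrelation of the residuals."""
    n = len(residuals)
    mean = sum(residuals) / n
    c0 = sum((v - mean) ** 2 for v in residuals) / n
    q = 0.0
    for k in range(1, m + 1):
        ck = sum((residuals[t] - mean) * (residuals[t - k] - mean)
                 for t in range(k, n)) / n
        q += (ck / c0) ** 2 / (n - k)
    return n * (n + 2) * q

# Strongly autocorrelated "residuals" produce a large Q, rejecting whiteness;
# residuals from an adequate model would give a small Q at every lag shown.
print(round(ljung_box_q([1.0, -1.0] * 50, m=6), 1))  # 590.6
```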
[Figure: original and forecast monthly maximum demand (MW), months 6/98 to 12/2000; vertical axis 0–10 000 MW, horizontal axis Jan-98 to Apr-01]

$$(1 - L)(1 - L^{12})(X_t - 1.86)$$