
Skill Matrix

Model performance for seasonal prediction is verified with three skill scores: the mean squared skill score (MSSS), the anomaly correlation coefficient (ACC), and the root mean square error (RMSE). Brief descriptions of the scores are given in the Appendix. The MSSS, ACC, and RMSE for each model and each multi-model analysis method are listed in Tables 1, 2, and 3.

Table 1: Mean squared skill score (MSSS) of each model and each multi-model
analysis method (global mean of the June-July-August average of surface precipitation
and 850-hPa temperature). Bold indicates the best-performing method.

JJA Hindcast: Global Mean of MSSS
MODEL Precipitation T850
CWB -0.477 -0.002
GCPS -0.429 -0.011
GDAPS_F -0.355 0.008
GDAPS_O -0.057 -0.013
HMC -1.558 0.115
IRI -0.399 0.077
IRIF -0.399 0.077
JMA -0.693 -0.067
METRI -0.158 -0.167
MGO -0.406 0.116
NCC -0.318 -0.036
NCEP -0.332 -0.011
CPPM 0.209 0.116
MME 0.180 0.148
MR 0.199 0.113
SE 0.062 -0.013

Table 2: Anomaly correlation coefficient (ACC) of each model and each multi-model
analysis method (global mean of the June-July-August average of surface precipitation
and 850-hPa temperature). Bold indicates the best-performing method.

JJA Hindcast (1983-2003): Global Mean of ACC
MODEL Precipitation T850
CWB 0.231 0.301
GCPS 0.255 0.216
GDAPS_F 0.131 0.143
GDAPS_O -0.007 0.041
HMC 0.119 0.357
IRI 0.374 0.313
IRIF 0.374 0.313
JMA 0.151 0.123
METRI 0.210 0.153
MGO 0.267 0.352
NCC -0.029 0.068
NCEP 0.327 0.216
CPPM 0.448 0.357
MME 0.404 0.388
MR 0.408 0.324
SE 0.349 0.325

Table 3: Root mean square error (RMSE) of each model and each multi-model
analysis method (global mean of the June-July-August average of surface precipitation
and 850-hPa temperature). Bold indicates the best-performing method.

JJA Hindcast (1983-2003): Global Mean of RMSE
MODEL Precipitation T850
CWB 0.973 0.848
GCPS 0.956 0.850
GDAPS_F 0.930 0.844
GDAPS_O 0.821 0.850
HMC 1.280 0.797
IRI 0.945 0.814
IRIF 0.945 0.814
JMA 1.222 1.042
METRI 0.863 0.913
MGO 0.942 0.796
NCC 0.918 0.862
NCEP 0.923 0.848
CPPM 0.710 0.792
MME 0.726 0.782
MR 0.632 0.999
SE 0.650 0.980

Appendix
A-1. Root Mean Square Error (RMSE)
RMSE measures the accuracy of the forecast (f) against the observation (o). It is defined as

RMSE = \sqrt{\frac{1}{W} \sum_i w_i (f_i - o_i)^2},

where w_i is the latitude weight, W is the sum of the weights, and the subscript i denotes a grid point.

RMSE gives the total amount of difference between the forecast and observation maps. The score is always greater than or equal to 0.0, and equals 0.0 for a perfect forecast. Sample results of RMSE are shown in Fig. 1 and Fig. 2 in the Appendix.
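As an illustration, a minimal NumPy sketch of a latitude-weighted RMSE over a lat-lon grid; the function and variable names are illustrative, not from the source:

```python
import numpy as np

def latitude_weighted_rmse(forecast, observation, lats_deg):
    """Latitude-weighted RMSE between two (nlat, nlon) fields.

    lats_deg is a 1-D array of latitudes in degrees; the weight at
    each grid point is the cosine of its latitude.
    """
    w = np.cos(np.deg2rad(lats_deg))[:, None]        # latitude weight w_i
    w = np.broadcast_to(w, forecast.shape)
    W = w.sum()                                      # W = sum of weights
    return np.sqrt((w * (forecast - observation) ** 2).sum() / W)
```

A perfect forecast returns 0.0, and a uniform offset of one unit returns an RMSE of 1.0 regardless of the weights.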

A-2. Anomaly Correlation Coefficient (ACC)


ACC is the pattern correlation between predicted and analyzed anomalies, defined as

ACC = \frac{\sum_i w_i (f_i - \bar{f}_i)(o_i - \bar{o}_i)}{\left[\sum_i w_i (f_i - \bar{f}_i)^2 \sum_i w_i (o_i - \bar{o}_i)^2\right]^{1/2}},

where the overbar denotes a time average, w_i is the latitude weight, and the subscript i denotes a grid point.

ACC measures the spatial similarity between the forecast and observation maps. The score ranges from -1.0 to 1.0, and equals 1.0 for a perfect forecast. Sample results of ACC are shown in Fig. 3 and Fig. 4 in the Appendix.
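A minimal NumPy sketch of a latitude-weighted anomaly correlation, assuming the anomalies (departures from the time-mean climatology) have already been computed; names are illustrative:

```python
import numpy as np

def anomaly_correlation(f_anom, o_anom, lats_deg):
    """Latitude-weighted pattern correlation between forecast and
    observed anomaly maps of shape (nlat, nlon)."""
    w = np.cos(np.deg2rad(lats_deg))[:, None]        # latitude weight w_i
    num = (w * f_anom * o_anom).sum()
    den = np.sqrt((w * f_anom ** 2).sum() * (w * o_anom ** 2).sum())
    return num / den
```

Identical anomaly maps give an ACC of 1.0; sign-reversed maps give -1.0.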

A-3. Mean Squared Skill Score (MSSS)


A detailed description of the mean squared skill score (MSSS) is provided by WMO (2002), so only a brief description is presented here. Let o_{ij} and f_{ij} (i = 1, ..., n) denote the time series of observations and continuous deterministic forecasts, respectively, for a grid point or station j over the period of verification (POV). Their averages over the POV, \bar{o}_j and \bar{f}_j, and their sample variances, s_{oj}^2 and s_{fj}^2, are given by

\bar{o}_j = \frac{1}{n} \sum_{i=1}^{n} o_{ij}, \qquad \bar{f}_j = \frac{1}{n} \sum_{i=1}^{n} f_{ij},

s_{oj}^2 = \frac{1}{n} \sum_{i=1}^{n} (o_{ij} - \bar{o}_j)^2, \qquad s_{fj}^2 = \frac{1}{n} \sum_{i=1}^{n} (f_{ij} - \bar{f}_j)^2.

The mean squared error of the forecasts is

MSE_j = \frac{1}{n} \sum_{i=1}^{n} (f_{ij} - o_{ij})^2.
For the case of cross-validated POV climatology forecasts, where the forecast/observation pairs are reasonably temporally independent of each other (so that only one year at a time is withheld), the mean squared error of 'climatology' forecasts (Murphy 1988) is

MSE_{cj} = \left(\frac{n}{n-1}\right)^2 s_{oj}^2.

The mean squared skill score (MSSS) for j is defined as one minus the ratio of the mean squared error of the forecasts to the mean squared error of the 'climatology' forecasts:

MSSS_j = 1 - \frac{MSE_j}{MSE_{cj}}.
An overall MSSS is computed as

MSSS = 1 - \frac{\sum_j w_j \, MSE_j}{\sum_j w_j \, MSE_{cj}},

where w_j is unity for verifications at stations and is equal to cos(θ_j), θ_j being the latitude at grid point j, on latitude-longitude grids. Sample results of MSSS for the JJA mean precipitation hindcast are shown in Fig. 5 in the Appendix.
For either MSSS_j or the overall MSSS, a corresponding root mean squared skill score (RMSSS) can be obtained easily from

RMSSS = 1 - (1 - MSSS)^{1/2}.

MSSS_j for fully cross-validated forecasts (with one year at a time withheld) can be expanded (Murphy 1988) as

MSSS_j = \frac{2 \left(\frac{s_{fj}}{s_{oj}}\right) r_{foj} - \left(\frac{s_{fj}}{s_{oj}}\right)^2 - \left(\frac{\bar{f}_j - \bar{o}_j}{s_{oj}}\right)^2 + \frac{2n-1}{(n-1)^2}}{1 + \frac{2n-1}{(n-1)^2}},

where r_{foj} is the product-moment correlation of the forecasts and observations at point or station j.

The first three terms of the decomposition of MSSS_j are related, respectively, to the phase errors (through the correlation), the amplitude errors (through the ratio of forecast to observed variance), and the overall bias error of the forecasts. These terms give those wishing to use the forecasts as input to regional and local forecasts the opportunity to adjust or weight the forecasts as they deem appropriate. The last term accounts for the fact that the 'climatology' forecasts are cross-validated as well. Spatial distributions of MSSS and of the phase, amplitude, and overall bias errors are shown in Fig. 6 in the Appendix.
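A minimal NumPy sketch of the per-point MSSS with the cross-validated climatology reference of Murphy (1988), one year withheld at a time; the function name is illustrative:

```python
import numpy as np

def msss_point(f, o):
    """MSSS at a single grid point or station.

    f, o: 1-D time series of length n over the period of verification.
    The reference forecast is the cross-validated POV climatology,
    whose mean squared error is (n/(n-1))^2 times the observed
    sample variance (computed with 1/n, matching the text).
    """
    n = f.size
    mse = np.mean((f - o) ** 2)            # MSE_j of the forecasts
    s_o2 = np.mean((o - o.mean()) ** 2)    # sample variance of observations
    mse_c = (n / (n - 1.0)) ** 2 * s_o2    # MSE_cj of 'climatology' forecasts
    return 1.0 - mse / mse_c
```

For a perfect forecast this returns 1.0, and it agrees term by term with the Murphy (1988) expansion in correlation, variance ratio, and bias given above.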

References
Murphy, A. H., 1988: Skill scores based on the mean square error and their relationships to the correlation coefficient. Mon. Wea. Rev., 116, 2417-2424.
WMO, 2002: Standardised Verification System (SVS) for Long-Range Forecasts (LRF). Manual on the GDPS (WMO-No. 485), Vol. 1.
