
## 12.5 Examples of application

In order to see how ADAM can be applied to high frequency data, we will use the taylor series from the forecast package. This is the half-hourly electricity demand in England and Wales from Monday 5 June 2000 to Sunday 27 August 2000, used in Taylor (2003b).

library(zoo)
y <- zoo(forecast::taylor,
         order.by=as.POSIXct("2000/06/05")+
             (c(1:length(forecast::taylor))-1)*60*30)
plot(y)

Note that when your data contains DST switches or leap years, adam() will automatically correct the seasonal lags, as long as the data contains proper time stamps (as zoo objects do, for example). The series above does not exhibit an obvious trend, but it has two seasonal cycles: half-hour of day and day of week. The seasonality seems to be multiplicative. We will try several different models and see how they compare. In all the cases below, we will use backcasting for the initialisation of the model. We will use the last 336 observations ($$48 \times 7$$) as the holdout, just to check whether the models perform adequately.

First, we fit the ADAM ETS(M,N,M) model with lags=c(1,48,336), covering the half-hourly ($$48$$) and weekly ($$7 \times 48 = 336$$) cycles:

adamModelETSMNM <- adam(y, "MNM", lags=c(1,48,336),
                        h=336, holdout=TRUE,
                        initial="back")
adamModelETSMNM
## Time elapsed: 0.47 seconds
## Model estimated using adam() function: ETS(MNM)[48, 336]
## Distribution assumed in the model: Gamma
## Loss function type: likelihood; Loss function value: 25682.88
## Persistence vector g:
##  alpha gamma1 gamma2
## 0.1357 0.2813 0.2335
##
## Sample size: 3696
## Number of estimated parameters: 4
## Number of degrees of freedom: 3692
## Information criteria:
##      AIC     AICc      BIC     BICc
## 51373.76 51373.77 51398.62 51398.66
##
## Forecast errors:
## ME: 625.221; MAE: 716.941; RMSE: 817.796
## sCE: 709.966%; Asymmetry: 90.4%; sMAE: 2.423%; sMSE: 0.076%
## MASE: 1.103; RMSSE: 0.867; rMAE: 0.107; rRMSE: 0.1
plot(adamModelETSMNM,7)

As you might notice, the model was constructed in 0.47 seconds, and while it might not be the most accurate model for this data, it fits the data well and produces reasonable forecasts. So, it is a good starting point. If we want to improve upon it, we can try one of the multistep estimators, for example, GTMSE:

adamModelETSMNMGTMSE <- adam(y, "MNM", lags=c(1,48,336),
                             h=336, holdout=TRUE,
                             initial="back", loss="GTMSE")

This time the estimation takes much longer (on my computer, around 23 seconds), but it will hopefully produce more accurate forecasts due to the shrinkage of the smoothing parameters:

adamModelETSMNMGTMSE
## Time elapsed: 23.06 seconds
## Model estimated using adam() function: ETS(MNM)[48, 336]
## Distribution assumed in the model: Normal
## Loss function type: GTMSE; Loss function value: -2648.698
## Persistence vector g:
##  alpha gamma1 gamma2
## 0.0314 0.2604 0.1414
##
## Sample size: 3696
## Number of estimated parameters: 3
## Number of degrees of freedom: 3693
## Information criteria are unavailable for the chosen loss & distribution.
##
## Forecast errors:
## ME: 216.71; MAE: 376.291; RMSE: 505.375
## sCE: 246.084%; Asymmetry: 63%; sMAE: 1.272%; sMSE: 0.029%
## MASE: 0.579; RMSSE: 0.535; rMAE: 0.056; rRMSE: 0.062

Comparing, for example, the RMSSE of the two models, we can conclude that the one estimated with GTMSE was more accurate than the one estimated using the conventional likelihood.
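The comparison can also be done programmatically. Here is a minimal sketch, assuming that, as is typical for smooth models estimated with holdout=TRUE, the holdout error measures are stored in the $accuracy element of the fitted objects:

```r
# Extract the holdout RMSSE from both fitted models
# (assumes both were estimated with holdout=TRUE, so that
# the error measures are available in the $accuracy element):
adamModelETSMNM$accuracy["RMSSE"]
adamModelETSMNMGTMSE$accuracy["RMSSE"]
```

The lower of the two values points at the model that was more accurate on the holdout.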

Another potential way of improving the model is the inclusion of an AR(1) term, as done, for example, by Taylor (2010). This will take more time, but might lead to some improvement in accuracy:

adamModelETSMNMAR <- adam(y, "MNM", lags=c(1,48,336),
                          initial="back", orders=c(1,0,0),
                          h=336, holdout=TRUE, maxeval=1000)

Note that estimating ETS+ARIMA models is a complicated task, and by default the number of iterations of the optimiser is restricted to 160, which might not be enough to reach the minimum of the loss. This is why I increased the number of iterations to 1000 via maxeval in the code above. If you want to get more feedback on how the optimisation has been carried out, you can ask the function to print the details via print_level=41.
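For illustration, here is how the same call would look with the optimiser's feedback switched on (a sketch, not run here; the printed trace is verbose):

```r
# Same ETS(M,N,M)+AR(1) model, but with the optimiser
# printing its progress (print_level=41 is passed on to nloptr):
adam(y, "MNM", lags=c(1,48,336),
     initial="back", orders=c(1,0,0),
     h=336, holdout=TRUE, maxeval=1000,
     print_level=41)
```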

adamModelETSMNMAR
## Time elapsed: 2.07 seconds
## Model estimated using adam() function: ETS(MNM)[48, 336]+ARIMA(1,0,0)
## Distribution assumed in the model: Gamma
## Loss function type: likelihood; Loss function value: 24108.2
## Persistence vector g:
##  alpha gamma1 gamma2
## 0.1129 0.2342 0.3180
##
## ARMA parameters of the model:
## AR:
## phi1[1]
##  0.6923
##
## Sample size: 3696
## Number of estimated parameters: 5
## Number of degrees of freedom: 3691
## Information criteria:
##      AIC     AICc      BIC     BICc
## 48226.39 48226.41 48257.47 48257.54
##
## Forecast errors:
## ME: 257.38; MAE: 435.476; RMSE: 561.237
## sCE: 292.266%; Asymmetry: 67.2%; sMAE: 1.472%; sMSE: 0.036%
## MASE: 0.67; RMSSE: 0.595; rMAE: 0.065; rRMSE: 0.069

In this specific example, we see that the ADAM ETS(M,N,M)+AR(1) leads to a small improvement in accuracy.

Another option for dealing with multiple seasonalities, as discussed above, is the ETSX model. We start with a static model, capturing the days of week and the half-hours of day via dummy explanatory variables. We will use the temporaldummy() function from the greybox package to create them. This function works much better when the data contains proper time stamps, e.g. when it is of class zoo or xts:

x1 <- temporaldummy(y,type="day",of="week",factors=TRUE)
x2 <- temporaldummy(y,type="hour",of="day",factors=TRUE)
taylorData <- data.frame(y=y,x1=x1,x2=x2)

This function is especially useful when dealing with DST and leap years (see Section 12.4), because it encodes the dummy variables based on dates, allowing us to sidestep the issue of changing frequency in the data. We can now fit the ADAM ETSX(M,N,N) model with the dummy variables capturing both seasonal cycles:

adamModelETSXMNN <- adam(taylorData, "MNN", h=336, holdout=TRUE,
                         initial="back")

In the code above, we use initialisation via backcasting, because otherwise the calculation would take much more time. Here is what we get as a result:

adamModelETSXMNN
## Time elapsed: 0.61 seconds
## Model estimated using adam() function: ETSX(MNN)
## Distribution assumed in the model: Gamma
## Loss function type: likelihood; Loss function value: 30155.54
## Persistence vector g (excluding xreg):
##  alpha
## 0.6182
##
## Sample size: 3696
## Number of estimated parameters: 2
## Number of degrees of freedom: 3694
## Information criteria:
##      AIC     AICc      BIC     BICc
## 60315.08 60315.09 60327.51 60327.53
##
## Forecast errors:
## ME: -1664.294; MAE: 1781.472; RMSE: 2070.321
## sCE: -1889.878%; Asymmetry: -92.3%; sMAE: 6.021%; sMSE: 0.49%
## MASE: 2.74; RMSSE: 2.194; rMAE: 0.266; rRMSE: 0.253

The resulting model produces biased forecasts (they are consistently higher than needed). This is mainly because the smoothing parameter $$\alpha$$ is too high and the model changes the level too frequently. We can see that in the plot of the level state:

plot(adamModelETSXMNN$states[,1], ylab="Level")

As we see, the level component not only contains the level, but has also absorbed the seasonality, which harms the forecasting accuracy. This did not happen due to randomness: this is what the model does when seasonality is fixed and not allowed to evolve over time. In order to reduce the sensitivity of the model, we can shrink the smoothing parameter using a multistep estimator (discussed in Section 11.3). Note, however, that these estimators are typically slower than the conventional ones, so they might take more computational time:

adamModelETSXMNNGTMSE <- adam(taylorData, "MNN",
                              h=336, holdout=TRUE,
                              initial="back", loss="GTMSE")
adamModelETSXMNNGTMSE
## Time elapsed: 39.26 seconds
## Model estimated using adam() function: ETSX(MNN)
## Distribution assumed in the model: Normal
## Loss function type: GTMSE; Loss function value: -2044.705
## Persistence vector g (excluding xreg):
##  alpha
## 0.0153
##
## Sample size: 3696
## Number of estimated parameters: 1
## Number of degrees of freedom: 3695
## Information criteria are unavailable for the chosen loss & distribution.
##
## Forecast errors:
## ME: 105.462; MAE: 921.897; RMSE: 1204.967
## sCE: 119.757%; Asymmetry: 18%; sMAE: 3.116%; sMSE: 0.166%
## MASE: 1.418; RMSSE: 1.277; rMAE: 0.138; rRMSE: 0.147

While the performance of the model with GTMSE has improved due to the shrinkage of $$\alpha$$ towards zero, the seasonal states are still deterministic and do not adapt to the changes in the data. We could make them adaptive via regressors="adapt", but then we would effectively be constructing the ETS(M,N,M)[48,336] model in a less efficient way. Alternatively, we could assume that one of the seasonal states is deterministic and, for example, construct the ETSX(M,N,M) model:

adamModelETSXMNMGTMSE <- adam(taylorData, "MNM", lags=48,
                              h=336, holdout=TRUE,
                              initial="back", loss="GTMSE",
                              formula=y~x1)
plot(adamModelETSXMNMGTMSE,7)
## Time elapsed: 33.56 seconds
## Model estimated using adam() function: ETSX(MNM)
## Distribution assumed in the model: Normal
## Loss function type: GTMSE; Loss function value: -2082.17
## Persistence vector g (excluding xreg):
##  alpha  gamma
## 0.0135 0.0769
##
## Sample size: 3696
## Number of estimated parameters: 2
## Number of degrees of freedom: 3694
## Information criteria are unavailable for the chosen loss & distribution.
##
## Forecast errors:
## ME: 146.436; MAE: 830.332; RMSE: 1055.372
## sCE: 166.284%; Asymmetry: 27.1%; sMAE: 2.806%; sMSE: 0.127%
## MASE: 1.277; RMSSE: 1.118; rMAE: 0.124; rRMSE: 0.129

We can see an improvement in comparison with the previous model, so the seasonal states do change over time, which means that deterministic seasonality is not appropriate in our example. However, in other cases it might be more suitable, producing more accurate forecasts than models assuming stochastic seasonality (i.e. multiple seasonal ETS / ARIMA models).
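For completeness, the adaptive regressors route mentioned above could be sketched as follows (not run here, because, as discussed, it effectively reconstructs ETS(M,N,M)[48,336] in a less efficient way; the variable name is arbitrary):

```r
# Let the coefficients of the seasonal dummy variables
# evolve over time via regressors="adapt":
adamModelETSXMNNAdapt <- adam(taylorData, "MNN",
                              h=336, holdout=TRUE,
                              initial="back",
                              regressors="adapt")
```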

Another model we can try on this data is ARIMA. We have not yet discussed the order selection mechanism for ARIMA, so I will construct a model based on my judgment. Keeping in mind that ETS(A,N,N) is equivalent to ARIMA(0,1,1) and that in the ARIMA context changing seasonality can be modelled with seasonal differences, I will construct SARIMA(0,1,1)(0,1,1)$$_{336}$$, skipping the half-hour of day frequency. Hopefully, this will be enough to model: (a) the changing level of the data; (b) the changing seasonal amplitude. Here is how we can construct this model using adam():

adamModelARIMA <- adam(y, "NNN", lags=c(1,336), initial="back",
                       orders=list(i=c(1,1),ma=c(1,1)),
                       h=336, holdout=TRUE)
adamModelARIMA
## Time elapsed: 0.31 seconds
## Model estimated using adam() function: SARIMA(0,1,1)[1](0,1,1)[336]
## Distribution assumed in the model: Normal
## Loss function type: likelihood; Loss function value: 26098.76
## ARMA parameters of the model:
## MA:
##   theta1[1] theta1[336]
##      0.5086     -0.1977
##
## Sample size: 3696
## Number of estimated parameters: 3
## Number of degrees of freedom: 3693
## Information criteria:
##      AIC     AICc      BIC     BICc
## 52203.53 52203.53 52222.17 52222.20
##
## Forecast errors:
## ME: 49.339; MAE: 373.387; RMSE: 499.661
## sCE: 56.027%; Asymmetry: 18.6%; sMAE: 1.262%; sMSE: 0.029%
## MASE: 0.574; RMSSE: 0.529; rMAE: 0.056; rRMSE: 0.061
plot(adamModelARIMA,7)

This model is directly comparable with ADAM ETS via information criteria, and as we can see, it is worse than both ADAM ETS(M,N,M)+AR(1) and the multiple seasonal ETS(M,N,M) in terms of AICc. However, it is better in terms of RMSSE, producing more accurate forecasts. We could analyse the residuals of this model and iteratively test whether the addition of AR terms and of the half-hour of day seasonality improves the accuracy of the model. We could also try ARIMA models with different distributions, compare them, and select the most appropriate one. The reader is encouraged to do this task on their own.
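As a starting point for the latter exercise, here is a sketch of how an alternative distribution could be tried and the candidates compared via information criteria (the Inverse Gaussian is used purely as an example):

```r
# Same SARIMA, but assuming the Inverse Gaussian distribution:
adamModelARIMAIG <- adam(y, "NNN", lags=c(1,336), initial="back",
                         orders=list(i=c(1,1),ma=c(1,1)),
                         h=336, holdout=TRUE,
                         distribution="dinvgauss")
# Compare the two models via the corrected AIC:
c(AICc(adamModelARIMA), AICc(adamModelARIMAIG))
```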

### References

• Taylor, J.W., 2010. Triple seasonal methods for short-term electricity demand forecasting. European Journal of Operational Research. 204, 139–152. https://doi.org/10.1016/j.ejor.2009.10.003
• Taylor, J.W., 2003b. Short-term electricity demand forecasting using double seasonal exponential smoothing. Journal of the Operational Research Society. 54, 799–805. https://doi.org/10.1057/palgrave.jors.2601589