
## 13.4 Examples of application

### 13.4.1 ADAM ETS

In order to see how ADAM can be applied to high frequency data, we will use the `taylor` series from the `forecast` package. This is the half-hourly electricity demand in England and Wales from Monday 5 June 2000 to Sunday 27 August 2000, used in Taylor (2003).

The series does not exhibit an obvious trend, but has two seasonal cycles: half-hour of day and day of week. Seasonality seems to be multiplicative. We will try several different models and see how they compare. In all the cases below, we will use backcasting as initialisation of the model. We will use the last 336 observations (\(48 \times 7\)) as the holdout, just to see whether models perform adequately or not.
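Before fitting anything, it is worth being explicit about this setup. The sketch below uses base R with a simulated stand-in for the `taylor` series (the amplitudes and noise level are invented for illustration); it shows the structure we rely on throughout: two nested seasonal cycles and a 336-observation holdout.

```r
# A rough sketch of the setup with a simulated stand-in for the real
# `taylor` data: two seasonal patterns (48 half-hours per day,
# 336 half-hours per week), multiplicative in character, no trend.
set.seed(41)
n <- 4032                                      # 12 weeks of half-hourly data
halfHourOfDay <- 1 + 0.3 * sin(2 * pi * (1:n) / 48)
dayOfWeek <- 1 + 0.1 * sin(2 * pi * (1:n) / 336)
demandSim <- 30000 * halfHourOfDay * dayOfWeek * exp(rnorm(n, 0, 0.02))
# The last 336 observations (48 x 7) form the holdout
h <- 336
demandTrain <- demandSim[1:(n - h)]
demandHoldout <- demandSim[(n - h + 1):n]
```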

First, we estimate ADAM ETS(M,N,M) with `lags=c(1,48,336)` (48 half-hours per day, \(48 \times 7 = 336\) per week):

```
adamModelETSMNM <- adam(y, "MNM", lags=c(1,48,336), initial="back",
                        h=336, holdout=TRUE)
adamModelETSMNM
```

```
## Time elapsed: 0.28 seconds
## Model estimated using adam() function: ETS(MNM)[48, 336]
## Distribution assumed in the model: Inverse Gaussian
## Loss function type: likelihood; Loss function value: 25801.04
## Persistence vector g:
## alpha gamma1 gamma2
## 0.8874 0.1125 0.1125
##
## Sample size: 3696
## Number of estimated parameters: 4
## Number of degrees of freedom: 3692
## Information criteria:
## AIC AICc BIC BICc
## 51610.08 51610.09 51634.94 51634.98
##
## Forecast errors:
## ME: -424.363; MAE: 758.61; RMSE: 1048.348
## sCE: -481.882%; sMAE: 2.564%; sMSE: 0.126%
## MASE: 1.167; RMSSE: 1.111; rMAE: 0.113; rRMSE: 0.128
```
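The MASE and RMSSE reported above are scaled error measures. A minimal base-R sketch of the idea (the toy numbers are illustrative; I assume the usual scaling by the in-sample mean absolute and mean squared first difference):

```r
# Hedged sketch: MASE and RMSSE scale the holdout errors by the
# in-sample one-step naive errors (first differences of the training data).
maseRmsse <- function(actual, forecast, insample) {
  e <- actual - forecast
  scaleAbs <- mean(abs(diff(insample)))   # naive in-sample MAE
  scaleSq  <- mean(diff(insample)^2)      # naive in-sample MSE
  c(MASE = mean(abs(e)) / scaleAbs,
    RMSSE = sqrt(mean(e^2) / scaleSq))
}
# Toy example
insample <- c(10, 12, 11, 13, 12)
maseRmsse(actual = c(13, 14), forecast = c(12, 12), insample)
```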

As you might notice, the model was estimated in 0.28 seconds, and while it might not be the most accurate model for this data, it fits the data well and produces reasonable forecasts, so it is a good starting point. If we want to improve upon it, we can try one of the multistep estimators, for example, GTMSE:

```
adamModelETSMNMGTMSE <- adam(y, "MNM", lags=c(1,48,336), initial="back",
                             h=336, holdout=TRUE, loss="GTMSE")
```

This time the function will take much longer (on my computer, around 1.5 minutes), but hopefully will produce more accurate forecasts due to the shrinkage of the smoothing parameters:

```
## Time elapsed: 27.25 seconds
## Model estimated using adam() function: ETS(MNM)[48, 336]
## Distribution assumed in the model: Normal
## Loss function type: GTMSE; Loss function value: -2306.988
## Persistence vector g:
## alpha gamma1 gamma2
## 0.0411 0.1394 0.1570
##
## Sample size: 3696
## Number of estimated parameters: 3
## Number of degrees of freedom: 3693
## Information criteria are unavailable for the chosen loss & distribution.
##
## Forecast errors:
## ME: 246.191; MAE: 401.624; RMSE: 534.885
## sCE: 279.56%; sMAE: 1.357%; sMSE: 0.033%
## MASE: 0.618; RMSSE: 0.567; rMAE: 0.06; rRMSE: 0.065
```

Comparing, for example, the RMSSE of the two models, we can conclude that the one estimated with GTMSE was more accurate than the one estimated using the conventional likelihood.
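The shrinkage mentioned above comes from the form of the loss. A rough base-R sketch of a Geometric Trace MSE, summing the logarithms of the in-sample multi-step MSEs (the error matrix here is toy data; in reality the errors come from the fitted model, and the exact internals of `adam()` may differ):

```r
# Hedged sketch of a GTMSE-style loss: sum of logs of in-sample MSEs
# for each forecast horizon 1..h. Taking logs means a small one-step
# variance cannot be traded against a large multi-step one, which
# pushes the smoothing parameters towards smaller values.
gtmse <- function(errorMatrix) {
  sum(log(colMeans(errorMatrix^2)))
}
# Toy multi-step errors for h = 3 over 4 forecast origins
E <- matrix(c(1, -1, 2, -2,
              0.5, 1.5, -1, 1,
              2, -2, 1, -1), ncol = 3)
gtmse(E)
```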

Another potential way of improving the model is the inclusion of an AR(1) term, as done, for example, by Taylor (2010). This will take more time, but might lead to some improvement in accuracy:

```
adamModelETSMNMAR <- adam(y, "MNM", lags=c(1,48,336), initial="back",
                          orders=c(1,0,0), h=336, holdout=TRUE,
                          maxeval=1000)
```

Note that estimating ETS+ARIMA models is a complicated task; by default, the number of iterations of the optimiser is restricted to 160, which might not be enough to reach the minimum of the loss. This is why I increased the number of iterations to 1000 in the code above. If you want more feedback on how the optimisation has been carried out, you can ask the function to print the details via `print_level=41`.

```
## Time elapsed: 8.67 seconds
## Model estimated using adam() function: ETS(MNM)[48, 336]+ARIMA(1,0,0)
## Distribution assumed in the model: Inverse Gaussian
## Loss function type: likelihood; Loss function value: 24107.8
## Persistence vector g:
## alpha gamma1 gamma2
## 0.1153 0.2334 0.3179
##
## ARMA parameters of the model:
## AR:
## phi1[1]
## 0.6909
##
## Sample size: 3696
## Number of estimated parameters: 5
## Number of degrees of freedom: 3691
## Information criteria:
## AIC AICc BIC BICc
## 48225.61 48225.62 48256.68 48256.75
##
## Forecast errors:
## ME: 262.434; MAE: 438.687; RMSE: 564.35
## sCE: 298.005%; sMAE: 1.483%; sMSE: 0.036%
## MASE: 0.675; RMSSE: 0.598; rMAE: 0.066; rRMSE: 0.069
```

In this specific example, we see that ADAM ETS(M,N,M)+AR(1) leads to a sizeable improvement in accuracy over the likelihood-estimated ETS(M,N,M), although it is still slightly behind the GTMSE-based one.
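The intuition for the AR(1) term can be illustrated in base R: if the one-step residuals of a model are autocorrelated, an AR(1) can pick up structure that the ETS part missed. The sketch below uses simulated autocorrelated "residuals" and `stats::arima`, not the `adam()` internals:

```r
# Hedged sketch: autocorrelated "residuals" and an AR(1) fitted to them.
# In adam() the AR(1) is estimated jointly with the ETS part; here we
# only show that leftover autocorrelation is recoverable by an AR term.
set.seed(42)
residSim <- as.numeric(arima.sim(model = list(ar = 0.7), n = 1000))
arFit <- arima(residSim, order = c(1, 0, 0), include.mean = FALSE)
coef(arFit)["ar1"]   # close to the true value of 0.7
```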

### 13.4.2 ADAM ARIMA

Another model we can try on this data is ARIMA. We have not yet discussed the order selection mechanism for ARIMA, so I will construct a model based on my judgment. Keeping in mind that ETS(A,N,N) is equivalent to ARIMA(0,1,1) and that changing seasonality in the ARIMA context can be modelled with seasonal differences, I will construct SARIMA(0,1,1)(0,1,1)\(_{336}\), skipping the half-hour of day frequency. Hopefully, this will be enough to model: (a) the changing level of the data; (b) the changing seasonal amplitude. Here is how we can construct this model using `adam()`:

```
adamModelARIMA <- adam(y, "NNN", lags=c(1,336), initial="back",
                       orders=list(i=c(1,1), ma=c(1,1)),
                       h=336, holdout=TRUE)
adamModelARIMA
```

```
## Time elapsed: 0.53 seconds
## Model estimated using adam() function: SARIMA(0,1,1)[1](0,1,1)[336]
## Distribution assumed in the model: Normal
## Loss function type: likelihood; Loss function value: 24209.97
## ARMA parameters of the model:
## MA:
## theta1[1] theta1[336]
## 0.186 -0.395
##
## Sample size: 3696
## Number of estimated parameters: 3
## Number of degrees of freedom: 3693
## Information criteria:
## AIC AICc BIC BICc
## 48425.94 48425.94 48444.58 48444.61
##
## Forecast errors:
## ME: 191.662; MAE: 429.832; RMSE: 560.955
## sCE: 217.641%; sMAE: 1.453%; sMSE: 0.036%
## MASE: 0.661; RMSSE: 0.594; rMAE: 0.064; rRMSE: 0.068
```

This model is directly comparable with ADAM ETS via information criteria, and as we can see, it is slightly worse than ADAM ETS(M,N,M)+AR(1) but better than the multiple seasonal ETS(M,N,M) in terms of AICc. In fact, it is better even in terms of RMSSE, producing more accurate forecasts. We could analyse the residuals of this model and iteratively test whether the addition of AR terms and of the half-hour of day seasonality improves the accuracy of the model. We could also try ARIMA models with different distributions, compare them, and select the most appropriate one. The reader is encouraged to do this task on their own.
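The residual analysis suggested above typically starts with a check for remaining autocorrelation. A minimal base-R sketch (white noise stands in for the model residuals, and the conventional 0.05 threshold is just that, a convention):

```r
# Hedged sketch of a residual diagnostic: a Ljung-Box test on the
# one-step residuals. A large p-value would suggest there is little
# left for extra AR terms or another seasonal component to capture.
set.seed(7)
modelResiduals <- rnorm(500)     # stand-in for the model's residuals
lbTest <- Box.test(modelResiduals, lag = 48, type = "Ljung-Box")
lbTest$p.value
```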

### References

Taylor, James W. 2003. “Short-term electricity demand forecasting using double seasonal exponential smoothing.” *Journal of the Operational Research Society* 54 (8): 799–805. https://doi.org/10.1057/palgrave.jors.2601589.

Taylor, James W. 2010. “Triple seasonal methods for short-term electricity demand forecasting.” *European Journal of Operational Research* 204 (1): 139–52. https://doi.org/10.1016/j.ejor.2009.10.003.