Chapter 15 Model selection and combinations in ADAM
When it comes to analysing and forecasting a specific time series, there are several ways to decide which model to use, and there are several dimensions in which a decision needs to be made:
- Which of the models to use: ETS / ARIMA / ETS+ARIMA / Regression / ETSX / ARIMAX / ETSX+ARIMA?
- What components of the ETS model to select?
- What order of ARIMA model to select?
- Which of the explanatory variables to use?
- What distribution to use?
- Should we select a model or combine forecasts from different ones?
- Do we need all models in the pool?
- How should we do all the above?
In this chapter, we discuss all aspects related to model selection and combination in ADAM. We start the discussion with principles based on information criteria, then move to more complicated topics related to pooling, and finish with selection and combination based on rolling origin evaluation.
Before we do that, we need to recall the distributional assumptions in ADAM, which play an important role if the model is estimated via maximisation of the likelihood function. In that case, an information criterion (IC) can be calculated and used to select the most appropriate model. Based on this, we can fit several ADAM models with different distributions and then select the one that leads to the lowest IC. Here is the list of distributions supported in ADAM:
- Normal;
- Laplace;
- S;
- Generalised Normal;
- Log-Normal;
- Inverse Gaussian;
- Gamma.
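To make the IC-based selection mechanics concrete, here is a sketch in base R. The log-likelihood values and the helper function `ICSelect()` are hypothetical illustrations (they are not part of the `smooth` package); the formulae for AIC, AICc, and BIC are the standard ones, applied to a model with k estimated parameters on a sample of size T:

```r
# Sketch: selecting among candidate distributions by AICc (hypothetical values).
# AIC  = -2 logLik + 2k
# AICc = AIC + 2k(k+1)/(T-k-1)
# BIC  = -2 logLik + k log(T)
ICSelect <- function(logLik, k, T) {
  AIC <- -2 * logLik + 2 * k
  AICc <- AIC + 2 * k * (k + 1) / (T - k - 1)
  BIC <- -2 * logLik + k * log(T)
  c(AIC = AIC, AICc = AICc, BIC = BIC)
}

# Hypothetical log-likelihoods of the same model under three distributions
candidates <- data.frame(distribution = c("dnorm", "dlnorm", "dgamma"),
                         logLik = c(-250.1, -245.4, -246.0))
ICs <- t(sapply(candidates$logLik, ICSelect, k = 5, T = 140))

# Pick the distribution with the lowest AICc
best <- candidates$distribution[which.min(ICs[, "AICc"])]
print(best)
```

Because all candidates share the same k and T here, the ranking by AICc coincides with the ranking by log-likelihood; the correction terms matter when models with different numbers of parameters are compared.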
auto.adam() implements this automatic selection of distribution based on IC for the vector of distributions provided by the user. This selection procedure can be combined with other selection techniques for different elements of the ADAM model discussed in the following sections of this textbook.
Here is an example of distribution selection for a specific model, ETS(M,M,N), on the Box-Jenkins sales data using auto.adam():
```
## Evaluating models with different distributions...
## dnorm, dlaplace, ds, dgnorm, dlnorm, dinvgauss, dgamma, Done!
## Time elapsed: 0.28 seconds
## Model estimated using auto.adam() function: ETS(MMN)
## Distribution assumed in the model: Log Normal
## Loss function type: likelihood; Loss function value: 245.3736
## Persistence vector g:
##  alpha   beta
## 0.9994 0.2429
##
## Sample size: 140
## Number of estimated parameters: 5
## Number of degrees of freedom: 135
## Information criteria:
##      AIC     AICc      BIC     BICc
## 500.7471 501.1949 515.4553 516.5617
##
## Forecast errors:
## ME: 3.211; MAE: 3.325; RMSE: 3.778
## sCE: 14.1%; sMAE: 1.46%; sMSE: 0.028%
## MASE: 2.813; RMSSE: 2.478; rMAE: 0.924; rRMSE: 0.92
```
In this case, the function has applied one and the same model with different distributions, estimated each via likelihood, and selected the one with the lowest AICc value.
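A call that could produce output like the above can be sketched as follows. This is an assumption rather than the exact call from the chapter: it uses the `smooth` package, the `BJsales` series from the built-in `datasets` package (150 observations, so a holdout of 10 leaves the sample size of 140 shown in the output), and the variable name `adamModel` is hypothetical:

```r
# Sketch: automatic distribution selection for ETS(M,M,N) on Box-Jenkins data.
# Assumptions: smooth package installed; exact arguments in the chapter may differ.
library(smooth)

adamModel <- auto.adam(BJsales, model = "MMN",
                       distribution = c("dnorm", "dlaplace", "ds", "dgnorm",
                                        "dlnorm", "dinvgauss", "dgamma"),
                       h = 10, holdout = TRUE)
summary(adamModel)
```

The `distribution` vector lists the candidates to evaluate; `holdout = TRUE` withholds the last `h` observations so that the forecast error measures in the output can be computed.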