
What is ADAM?

ADAM, as stated earlier, stands for “Augmented Dynamic Adaptive Model.” It is “Augmented” because it is not just ETS or ARIMA: it is a combination of the two with additional features. It is “Dynamic” because it includes ETS and ARIMA components (dynamic models). It is “Adaptive” because it has a mechanism for updating the parameters of explanatory variables over time. It is a single framework for constructing ETS / ARIMA / Regression models, built on more advanced statistical instruments. For example, classical ARIMA relies on the assumption of normality of the error term, but ADAM lifts this assumption and allows using other distributions as well (e.g. Generalised Normal, Inverse Gaussian, etc.). As another example, conventional models are estimated either via maximisation of the likelihood function or using basic losses such as MSE or MAE, whereas ADAM supports a wider spectrum of losses and allows using custom ones. There is much more, and different aspects of ADAM will be discussed in detail later in this textbook. For now, here is a brief list of features available in ADAM:

  1. ETS;
  2. ARIMA;
  3. Regression;
  4. TVP regression;
  5. Combination of (1), (2) and either (3) or (4);
  6. Automatic selection / combination of states for ETS;
  7. Automatic orders selection for ARIMA;
  8. Variable selection for the regression part;
  9. Normal and non-normal distributions;
  10. Automatic selection of the most suitable distribution;
  11. Advanced and custom loss functions;
  12. Multiple seasonality;
  13. Occurrence part of the model to handle zeroes in data (intermittent demand);
  14. Model diagnostics using plot() and other methods;
  15. Confidence intervals for parameters of models;
  16. Automatic outliers detection;
  17. Handling missing data;
  18. Fine tuning of the persistence vector (smoothing parameters);
  19. Fine tuning of the initial values of the state vector (e.g. level / trend / seasonality);
  20. Two initialisation options (optimal / backcasting);
  21. Provided ARMA parameters;
  22. Fine tuning of the optimiser (selection of algorithm and convergence criteria).
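Many of the features above map onto arguments of the adam() function from the smooth package for R, which implements the framework. The sketch below is purely illustrative: the dataset (AirPassengers) and the specific model, orders, distribution, and loss are assumptions chosen for demonstration, not recommendations:

```r
# install.packages("smooth")  # if not yet installed
library(smooth)

# An illustrative ADAM: ETS(M,M,N) combined with an MA(1) term,
# Generalised Normal distribution, estimated via likelihood maximisation
fit <- adam(AirPassengers, model = "MMN",
            orders = list(ar = 0, i = 0, ma = 1),
            distribution = "dgnorm", loss = "likelihood",
            h = 12, holdout = TRUE)

# Diagnostics and forecasting, as mentioned in the list above
plot(fit)
forecast(fit, h = 12)
```

Letting model = "ZXZ" instead would trigger the automatic ETS component selection from the list above, while orders = list(..., select = TRUE) would switch on automatic ARIMA order selection.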

In this textbook, we will discuss the model underlying ADAM and how its different features are implemented in the function.