
12.3 Using explanatory variables for multiple seasonalities

The conventional way of introducing several seasonal components into the model has a number of issues:

  1. It only works for data with fixed periodicity (a problem sometimes referred to as “fractional frequency”): if \(m_i\) is not fixed and changes from period to period, the model becomes misaligned. An example of such a problem is fitting ETS on daily data with \(m=365\), while leap years contain 366 days;
  2. If the model is fitted to high frequency data, the problem of parameter estimation becomes non-trivial. Indeed, on daily data with \(m=365\), we need to estimate 364 initial seasonal indices together with the other parameters;
  3. Different seasonal indices would “compete” with each other for each observation, thus making the model overfit the data. An example is daily data with \(m_1=7\) and \(m_2=365\), where both seasonal components are updated on each observation based on the same error, but with different smoothing parameters.

The situation becomes even more complicated when the model has more than two seasonal components. But there are at least two ways of resolving these issues in the ADAM framework.

The first is based on the idea of De Livera (2010) and the dynamic ETSX. In this case we need to generate Fourier series and use them as explanatory variables in the model, switching on the adaptation mechanism. For example, for the pure additive model we will have: \[\begin{equation} \begin{aligned} & {y}_{t} = \check{y}_t + \sum_{i=1}^p a_{i,t-1} x_{i,t} + \epsilon_t \\ & \vdots \\ & a_{i,t} = a_{i,t-1} + \delta_i \frac{\epsilon_t}{x_{i,t}} \text{ for each } i \in \{1, \dots, p\} \end{aligned}, \tag{12.5} \end{equation}\] where \(p\) is the number of Fourier harmonics. In this case, we can introduce the conventional seasonal part of the model for the fixed periodicity (e.g. day of week) in \(\check{y}_t\) and use the updated harmonics for the non-fixed one. This approach is not the same as the one in De Livera (2010), but might lead to similar results. The only issue here is in the selection of the number of harmonics, which can be done via the variable selection mechanism, but would inevitably increase computational time.
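To make this more concrete, here is a minimal sketch in R along the lines of (12.5), using the adam() function from the smooth package with regressors="adapt" to switch on the adaptation of the regression parameters. The simulated daily series, the number of harmonics \(K=3\), and the column names are illustrative assumptions, not part of the chapter:

``` r
# Sketch of approach (12.5): Fourier harmonics as adaptive regressors.
library(smooth)

set.seed(41)
obs <- 3 * 365
t <- 1:obs
# Illustrative daily series with a weekly pattern and an annual cycle
y <- 500 + 30 * sin(2 * pi * t / 7) + 50 * sin(2 * pi * t / 365.25) +
  rnorm(obs, 0, 20)

# K pairs of Fourier harmonics for the annual (non-fixed) periodicity
K <- 3
harmonics <- do.call(cbind, lapply(1:K, function(k) {
  cbind(sin(2 * pi * k * t / 365.25), cos(2 * pi * k * t / 365.25))
}))
colnames(harmonics) <- paste0(rep(c("sin", "cos"), K), rep(1:K, each = 2))

# ETS(M,N,M) with lags=c(1,7) captures the fixed weekly seasonality;
# regressors="adapt" makes the harmonics' parameters change over time
adamFourier <- adam(cbind(y = y, as.data.frame(harmonics)), "MNM",
                    lags = c(1, 7), regressors = "adapt",
                    h = 14, holdout = TRUE)
summary(adamFourier)
```

Increasing K makes the annual shape more flexible but adds parameters, which is exactly the harmonic-selection dilemma mentioned above.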

The second option is based on the idea of a dynamic model with categorical variables. In this case, instead of trying to fix the problem with days of the year, we first introduce a categorical variable for days of the week and then one for weeks of the year (or months of the year, if the monthly effects are deemed more appropriate). After that we can introduce both categorical variables in the model, using an adaptation mechanism similar to (12.5). In fact, if some of the variables have fixed periodicity, we can substitute them with conventional seasonal components. So, for example, in this case ETSX(M,N,M)[7]{D} could be written as: \[\begin{equation} \begin{aligned} & {y}_{t} = l_{t-1} s_{t-7} \times \prod_{i=1}^q \exp(a_{i,t-1} x_{i,t}) (1 + \epsilon_t) \\ & l_t = l_{t-1} (1 + \alpha\epsilon_t) \\ & s_t = s_{t-7} (1 + \gamma\epsilon_t) \\ & a_{i,t} = a_{i,t-1} + \left \lbrace \begin{aligned} &\delta \log(1+\epsilon_t) \text{ for each } i \in \{1, \dots, q\}, \text{ if } x_{i,t} = 1 \\ &0 \text{ otherwise } \end{aligned} \right. \end{aligned}, \tag{12.6} \end{equation}\] where \(q\) is the number of levels in the categorical variable (for weeks of year, this should be 53). The number of parameters to estimate in this case might be greater than the number of harmonics in the first case, but this type of model resolves all three issues and does not involve the dilemma of selecting the number of harmonics.
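A minimal sketch of this second approach follows, reusing the simulated series from the previous example. The weekly pattern is handled by the conventional seasonal component (lags=7), while a week-of-year factor is passed as an adaptive regressor; the way the week index is computed here is a rough illustration (in practice it would be derived from the actual dates), and it is assumed that adam() expands the factor into dummy variables:

``` r
# Sketch of approach (12.6): ETSX(M,N,M)[7]{D} with a week-of-year factor.
library(smooth)

set.seed(41)
obs <- 3 * 365
t <- 1:obs
y <- 500 + 30 * sin(2 * pi * t / 7) + 50 * sin(2 * pi * t / 365.25) +
  rnorm(obs, 0, 20)

# Categorical variable with up to 53 levels marking the week of year
weekOfYear <- factor(((t - 1) %/% 7) %% 53 + 1, levels = 1:53)

# Day-of-week effect comes from the seasonal component (lags=7);
# the week-of-year dummies adapt over time as in (12.6)
adamCategorical <- adam(data.frame(y = y, week = weekOfYear), "MNM",
                        lags = c(1, 7), regressors = "adapt",
                        h = 14, holdout = TRUE)
summary(adamCategorical)
```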

References

• De Livera, A.M., 2010. Exponentially weighted methods for multiple seasonal time series. International Journal of Forecasting. 26, 655–657. https://doi.org/10.1016/j.ijforecast.2010.05.010