
7.5 Normalisation of seasonal indices in ETS models

One of the ideas arising from time series decomposition (Section 3.2) and inherited by the conventional ETS is the renormalisation of seasonal indices. It boils down to one of two principles, depending on the type of seasonality:

  1. If the model has additive seasonality, then the seasonal indices should add up to zero over a specific period of time, e.g. monthly indices should add up to zero over a year;
  2. If the model has multiplicative seasonality, then the geometric mean of seasonal indices over a specific period should be equal to one.

Condition (2) in the conventional ETS is substituted by “the arithmetic mean of multiplicative indices should be equal to one,” which does not have good grounds behind it: if we deal with a multiplicative effect, then the geometric mean should be used, not the arithmetic one; otherwise, by multiplying components by indices, we introduce a bias in the model. While the normalisation is a natural element of time series decomposition and works fine for the initial seasonal indices, renormalising the seasonal indices over time might not be natural for the ETS.
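To see why the arithmetic mean is problematic here, consider a minimal numerical sketch (in Python, with hypothetical index values): dividing multiplicative indices by their arithmetic mean does not, in general, make their geometric mean equal to one, so the seasonal effects do not cancel out over the period, while dividing by the geometric mean fixes this by construction.

```python
import numpy as np

# Hypothetical multiplicative seasonal indices for a quarterly pattern
s = np.array([0.70, 0.90, 1.10, 1.30])

# Normalisation via the arithmetic mean (the conventional ETS principle)
s_arith = s / np.mean(s)
# Normalisation via the geometric mean (the natural choice for multiplicative effects)
s_geom = s / np.exp(np.mean(np.log(s)))

# The geometric mean of the arithmetically normalised indices is not one,
# so multiplying the components by these indices introduces a bias
print(np.exp(np.mean(np.log(s_arith))))  # approximately 0.974
print(np.exp(np.mean(np.log(s_geom))))   # 1 (up to rounding)
```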

Hyndman et al. (2008) discuss different mechanisms for the renormalisation of seasonal indices, which, as the authors claim, are needed in order to make principles (1) and (2) hold from period to period in the data. However, I argue that this is an unnatural idea for the ETS models, that the indices should only be normalised during the initialisation of the model (at the moment \(t=0\)) and that they should vary independently for the rest of the sample. The rationale for this comes from the model itself. To illustrate it, I will use ETS(A,N,A), but the idea can be easily applied to any other ETS model with any types of components and any number of seasonal frequencies. As a reminder, this model is formulated as: \[\begin{equation} \begin{aligned} y_t = &l_{t-1} + s_{t-m} + \epsilon_t \\ {l}_{t} = &l_{t-1} + \alpha\epsilon_t \\ s_t = &s_{t-m} + \gamma\epsilon_t \end{aligned}. \tag{7.4} \end{equation}\] Let’s assume that this is the true model (as discussed in Section 1.4) for whatever data we have, for whatever reason. In this case, the set of equations (7.4) tells us that the seasonal indices change over time, depending on the value of the smoothing parameter \(\gamma\) and the specific values of \(\epsilon_t\), which is assumed to be i.i.d. All seasonal indices \(s_{t+i}\) in a specific period (e.g. monthly indices in a year) can be written down explicitly based on (7.4): \[\begin{equation} \begin{aligned} s_{t+1} = &s_{t+1-m} + \gamma\epsilon_{t+1} \\ s_{t+2} = &s_{t+2-m} + \gamma\epsilon_{t+2} \\ \vdots \\ s_{t+m} = &s_{t} + \gamma\epsilon_{t+m} \end{aligned}. \tag{7.5} \end{equation}\] If this is how the data is “generated” and the seasonality evolves over time, then there is only one way for the indices \(s_{t+1}, s_{t+2}, \dots, s_{t+m}\) to add up to zero: \[\begin{equation} s_{t+1}+ s_{t+2}+ \dots+ s_{t+m} = 0 \tag{7.6} \end{equation}\] or, after substituting (7.5) into it: \[\begin{equation} s_{t+1-m}+ s_{t+2-m}+ \dots+ s_{t} + \gamma \left(\epsilon_{t+1}+ \epsilon_{t+2}+ \dots+ \epsilon_{t+m}\right) = 0 \tag{7.7} \end{equation}\] meaning that:

  a. the previous indices \(s_{t+1-m}, s_{t+2-m}, \dots, s_{t}\) add up to zero, and
  b. either \(\gamma=0\),
  c. or the sum of the error terms \(\epsilon_{t+1}, \epsilon_{t+2}, \dots, \epsilon_{t+m}\) is zero.

Note that we do not consider the situation \(s_{t+1-m}+ \dots+ s_{t} = - \gamma \left(\epsilon_{t+1}+ \dots+ \epsilon_{t+m}\right)\), as it does not make sense: the seasonal indices of the previous period would then have to depend on the future values of the error term. Condition (a) can be considered reasonable if the previous indices are normalised. (b) means that the seasonal indices do not evolve over time. However, (c) implies that the error term is not independent, because \(\epsilon_{t+m} = -\epsilon_{t+1}- \epsilon_{t+2}- \dots- \epsilon_{t+m-1}\), which violates one of the basic assumptions of the model from Section 1.4.1, meaning that (7.4) cannot be considered as the “true” model any more, as it omits some important elements. Thus, renormalisation is unnatural for ETS from the “true” model point of view.
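The role of \(\gamma\) in this argument can also be seen in a small simulation. Below is a sketch in Python (the parameter values are arbitrary and only serve the illustration): generating seasonal indices according to (7.4) with a non-zero \(\gamma\) and i.i.d. errors, the sum of the \(m\) most recent indices does not stay at zero, but drifts like a scaled random walk.

```python
import numpy as np

rng = np.random.default_rng(42)
m, n_obs = 12, 240           # monthly seasonality, 20 years of data (arbitrary)
gamma, sigma = 0.3, 1.0      # arbitrary smoothing parameter and error standard deviation

# Initial seasonal indices, normalised to add up to zero at t = 0
s = rng.normal(0, 5, m)
s -= s.mean()

sums = []
for t in range(n_obs):
    eps = rng.normal(0, sigma)
    # Seasonal equation from (7.4): s_t = s_{t-m} + gamma * epsilon_t
    s[t % m] = s[t % m] + gamma * eps
    # Sum of the m most recent seasonal indices
    sums.append(s.sum())

# The sum drifts away from zero over time; it would stay at zero only if gamma = 0
print(round(sums[0], 3), round(sums[n_obs // 2], 3), round(sums[-1], 3))
```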

Alternatively, each seasonal index could be updated at each observation \(t\) (to make sure that the indices are renormalised at every step). In this situation we have: \[\begin{equation*} \begin{aligned} &s_t = s_{t-m} + \gamma\epsilon_t \\ &s_{t-m+1}+ s_{t-m+2}+ \dots+ s_{t-1} + s_{t} = 0 \end{aligned}, \end{equation*}\] which can be rewritten as \(s_{t-m} + \gamma\epsilon_t = -s_{t-m+1}- s_{t-m+2}- \dots- s_{t-1}\), meaning that: \[\begin{equation*} \begin{aligned} s_{t-m}+ s_{t-m+1}+ s_{t-m+2}+ \dots+ s_{t-1} = -\gamma\epsilon_t \end{aligned}, \end{equation*}\] but due to the renormalisation at the previous step, the sum on the left-hand side should be equal to zero, implying that either \(\gamma=0\) or \(\epsilon_t=0\). While the former might hold in some cases (deterministic seasonality), the latter cannot hold for all \(t=1,\dots,T\) and violates the assumptions of the model. The renormalisation is thus impossible without changing the structure of the model. Hyndman et al. (2008) acknowledge that and propose in Chapter 8 of their monograph some modifications for the seasonal ETS models (i.e. introduce new models), which we do not aim to discuss in this Chapter.
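To make the conflict explicit, here is a brief sketch (in Python, with arbitrary values): after the usual update of the current index, one natural way to force the \(m\) indices to add up to zero again is to spread the correction \(\gamma\epsilon_t / m\) equally across all of them, so every seasonal index changes at every observation, which illustrates why a renormalised model is a structurally different model.

```python
import numpy as np

m, gamma = 4, 0.3                      # arbitrary quarterly example
s = np.array([-1.0, 0.5, -0.5, 1.0])   # hypothetical indices adding up to zero
eps = 0.8                              # an arbitrary realisation of the error term

s_new = s.copy()
s_new[0] = s[0] + gamma * eps          # the usual update from (7.4) for the current season
print(round(s_new.sum(), 3))           # 0.24 = gamma * eps, no longer zero

# Forcing the sum back to zero requires adjusting every index, not just the current one
s_renorm = s_new - gamma * eps / m
print(round(s_renorm.sum(), 3))        # 0.0, but this is not the recursion in (7.4)
```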

The discussion in this section demonstrates that the renormalisation of seasonal indices is unnatural for the ETS model and should not be used. Having said that, this does not imply that the initial seasonal indices (those that correspond to the observation \(t=0\)) cannot be normalised. On the contrary, this is a desirable approach, as it reduces the number of parameters to estimate in the model.
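For completeness, here is what normalising the initial indices amounts to in practice, as a sketch in Python with hypothetical starting values: for additive seasonality the mean is subtracted, for multiplicative seasonality the indices are divided by their geometric mean, and in both cases one of the \(m\) initial indices becomes a function of the other \(m-1\), which is how the number of estimated parameters is reduced.

```python
import numpy as np

# Hypothetical initial seasonal indices at t = 0 for a quarterly model
s0_add = np.array([10.0, -3.0, -2.0, -1.0])
s0_mult = np.array([1.20, 0.85, 0.90, 1.10])

# Additive seasonality: make the indices add up to zero
s0_add = s0_add - s0_add.mean()

# Multiplicative seasonality: make the geometric mean equal to one
s0_mult = s0_mult / np.exp(np.mean(np.log(s0_mult)))

# One index in each vector is now implied by the others,
# so only m - 1 initial indices need to be estimated
print(s0_add, s0_mult)
```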

References

• Hyndman, R.J., Koehler, A.B., Ord, J.K., Snyder, R.D., 2008. Forecasting with Exponential Smoothing. Springer Berlin Heidelberg.