This book is in Open Review. I want your feedback to make the book better for you and other readers.

## 4.6 Parameters' bounds

While it is accepted by many practitioners and academics that the smoothing parameters of ETS models should lie between zero and one, this is not entirely true for these models. There are, in fact, several possible sets of restrictions on smoothing parameters, and it is worth discussing them separately:

1. Classical or conventional bounds are $$\alpha, \beta, \gamma \in (0,1)$$. The idea behind them originates from the simple exponential smoothing method (Section 4.1), where it is logical to restrict the parameters to this region, because they then regulate what weight the actual value $$y_t$$ will have and what weight will be assigned to the predicted one $$\hat{y}_t$$. Hyndman et al. (2008) showed that this condition is sometimes too loose and in other cases too restrictive for some ETS models. Brenner et al. (1968) were among the first to show that the bounds are wider than this region for many exponential smoothing methods. Still, the conventional restriction is the one most often used in practice, simply because it is convenient to work with.
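To illustrate this interpretation, here is a minimal Python sketch (the book itself uses R; the data here is made up) showing that with $$\alpha \in (0,1)$$ the simple exponential smoothing forecast is a weighted average of past actuals, with weights $$\alpha (1-\alpha)^j$$ declining exponentially in the lag $$j$$:

```python
# Illustration only: SES forecast as a weighted average of past actuals.
alpha = 0.3
y = [100.0, 102.0, 101.0, 105.0, 107.0]

# Recursive form: y_hat_{t+1} = alpha * y_t + (1 - alpha) * y_hat_t
y_hat = y[0]  # initialise the level with the first observation
for t in range(1, len(y)):
    y_hat = alpha * y[t] + (1 - alpha) * y_hat

# Equivalent direct form: exponentially declining weights on past actuals
weights = [alpha * (1 - alpha) ** j for j in range(len(y) - 1)]
direct = sum(w * y[len(y) - 1 - j] for j, w in enumerate(weights))
direct += (1 - alpha) ** (len(y) - 1) * y[0]  # remaining weight on the initial level

print(round(y_hat, 6), round(direct, 6))  # the two forms coincide
```

The two forms giving the same number is exactly why the conventional bounds are intuitive: the forecast is a convex combination of the past.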

2. Usual or traditional bounds are those that satisfy the following set of restrictions: \begin{equation} \begin{aligned} &\alpha \in [0, 1)\\ &\beta \in [0, \alpha) \\ &\gamma \in [0, 1-\alpha) \end{aligned} \tag{4.30} \end{equation} This set of restrictions guarantees that the weights decline over time exponentially (see Section 4.1.2) and that the ETS models have the property of "averaging" the values over time. On the lower boundary, the components of the model become deterministic, and they can be seen as global averages of the values over time.
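The restrictions (4.30) are straightforward to encode. A small Python sketch (the function name is illustrative, not from any package) that checks a triplet of smoothing parameters against them:

```python
# Sketch: check smoothing parameters against the "usual" bounds (4.30).
def usual_bounds_ok(alpha, beta=0.0, gamma=0.0):
    return (0 <= alpha < 1
            and 0 <= beta < alpha
            and 0 <= gamma < 1 - alpha)

print(usual_bounds_ok(0.3, 0.1, 0.2))   # True: all three restrictions hold
print(usual_bounds_ok(0.3, 0.4, 0.2))   # False: beta must be below alpha
print(usual_bounds_ok(0.9, 0.1, 0.2))   # False: gamma must be below 1 - alpha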

3. Admissible bounds, satisfying the stability condition. The idea here is that the most recent observation should have a higher weight than the older ones, which is regulated via the smoothing parameters. However, in this case we do not impose the restriction of exponential decay of weights on the models, so the weights can oscillate or decay harmonically, as long as their absolute values decrease over time. This condition is mathematically more complicated than the previous two and will be discussed later in the textbook for the pure additive models (see Section 5.1), but here are several examples of bounds satisfying this condition (from Chapter 10 of Hyndman et al., 2008):

• ETS(A,N,N): $$\alpha \in (0, 2)$$;
• ETS(A,A,N): $$\alpha \in (0, 2); \beta \in (0, 4-2\alpha)$$;
• ETS(A,N,A): $$\alpha \in \left(\frac{-2}{m-1}, 2-\gamma\right); \gamma \in (\max(-m\alpha, 0), 2-\alpha)$$;
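To see what "oscillate but decrease in absolute value" means, consider ETS(A,N,N) with $$\alpha = 1.5$$, which lies inside the admissible region $$(0, 2)$$ but outside the conventional one. A short Python sketch (illustration only) of the resulting weights $$\alpha (1-\alpha)^j$$:

```python
# Sketch: with alpha = 1.5 the SES weights alternate in sign,
# but their absolute values still decline geometrically, since
# |1 - alpha| = 0.5 < 1 -- the model remains stable.
alpha = 1.5
weights = [alpha * (1 - alpha) ** j for j in range(6)]
print([round(w, 4) for w in weights])

abs_decline = all(abs(weights[j + 1]) < abs(weights[j]) for j in range(5))
print(abs_decline)  # the absolute values shrink at every step
```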

As you can see, the admissible bounds are much wider than the conventional and usual ones. In fact, the smoothing parameters can become either negative or greater than one for some models, which is hard to interpret but might indicate that the data is difficult to predict. Furthermore, the admissible bounds correspond to the parameter restrictions of the ARIMA models underlying some of the pure additive ETS models. In a way, they are more natural for the ETS models than the other two, because they follow from the model formulation and arise naturally. However, their usage in practice has been met with mixed success, with only a handful of papers using them instead of (1) or (2) (e.g. Gardner and Diaz-Saiz, 2008, mention that they appear in some cases, and Snyder et al., 2017, use them in their model).

In the R code, the admissible bounds are calculated based on the discount matrix, which will be discussed in the context of pure additive ADAM ETS models in Section 5.1.
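As a preview of that discussion, here is a hedged Python sketch of the idea (the notation assumes the standard pure additive state space form with transition matrix $$\mathbf{F}$$, measurement vector $$\mathbf{w}$$ and persistence vector $$\mathbf{g}$$; the function name is illustrative). The discount matrix is $$\mathbf{D} = \mathbf{F} - \mathbf{g} \mathbf{w}'$$, and the model is stable when all eigenvalues of $$\mathbf{D}$$ lie inside the unit circle:

```python
# Sketch (assumed notation): stability check via the discount matrix
# D = F - g w' for a pure additive state space model.
import numpy as np

def is_stable(F, w, g):
    D = F - np.outer(g, w)
    return bool(np.all(np.abs(np.linalg.eigvals(D)) < 1))

# ETS(A,A,N): state vector contains level and trend
F = np.array([[1.0, 1.0],
              [0.0, 1.0]])
w = np.array([1.0, 1.0])

# alpha = 0.5, beta = 0.1: inside the admissible region beta < 4 - 2*alpha
print(is_stable(F, w, g=np.array([0.5, 0.1])))
# alpha = 0.5, beta = 3.5: violates beta < 4 - 2*alpha = 3
print(is_stable(F, w, g=np.array([0.5, 3.5])))
```

The two checks agree with the ETS(A,A,N) admissible region listed above: the first persistence vector is stable, the second is not.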

### References

• Brenner, J.L., D'Esopo, D.A., Fowler, A.G., 1968. Difference Equations in Forecasting Formulas. Management Science. 15, 141–159. https://doi.org/10.1287/mnsc.15.3.141
• Gardner, E.S., Diaz-Saiz, J., 2008. Exponential smoothing in the telecommunications data. International Journal of Forecasting. 24, 170–174. https://doi.org/10.1016/j.ijforecast.2007.05.002
• Hyndman, R.J., Koehler, A.B., Ord, J.K., Snyder, R.D., 2008. Forecasting with Exponential Smoothing. Springer Berlin Heidelberg.
• Snyder, R.D., Ord, J.K., Koehler, A.B., McLaren, K.R., Beaumont, A.N., 2017. Forecasting compositional time series: A state space approach. International Journal of Forecasting. 33, 502–512. https://doi.org/10.1016/j.ijforecast.2016.11.008