## 6.3 Moments and quantiles of pure multiplicative ETS

The recursion (6.7) obtained in the previous section shows how the information available on observation \(t\) influences the logarithms of the states. While under some conditions it is possible to calculate the expectation of the logarithm of \(y_{t+h}\) based on that information, in general this does not allow deriving the expectation of the variable in the original scale. This is because the expectations of the terms \(\log(\mathbf{1}_k + \mathbf{g}_{m_i} \epsilon_{t+j})\) for different \(j\) and \(i\) are not known and are difficult, if at all possible, to derive analytically. The situation is no simpler for the conditional variance.

The only way to derive the conditional expectation and variance for the pure multiplicative models is to use the formulae from Tables 4.1 and **??** in Section 4.2 and derive the values in the original scale manually. This works well only for the ETS(M,N,N) model, for which taking the conditional expectation and variance of the recursion (6.9) yields:
\[\begin{equation}
\begin{aligned}
\mu_{y,t+h} = \mathrm{E}(y_{t+h}|t) = & l_{t} \\
\mathrm{V}(y_{t+h}|t) = & l_{t}^2 \left( \left(1+ \alpha^2 \sigma^2 \right)^{h-1} (1 + \sigma^2) -1 \right),
\end{aligned}
\tag{6.10}
\end{equation}\]
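The moments in (6.10) only require the error term to be independent over time with zero mean and variance \(\sigma^2\), so they can be checked by simulation. The following is a minimal sketch of such a check; the Normal distribution for the errors and all parameter values are illustrative choices, not part of the model definition:

```python
import numpy as np

# Monte Carlo check of the ETS(M,N,N) moments in (6.10).
# Model: l_{t+j} = l_{t+j-1} * (1 + alpha * eps_{t+j}),
#        y_{t+h} = l_{t+h-1} * (1 + eps_{t+h}).
# l_t, alpha, sigma and h below are illustrative values.
l_t, alpha, sigma, h = 100.0, 0.3, 0.05, 5

# Closed-form conditional moments from (6.10)
mean_th = l_t
var_th = l_t**2 * ((1 + alpha**2 * sigma**2)**(h - 1) * (1 + sigma**2) - 1)

# Simulate the recursion with zero-mean Normal errors (one possible choice)
rng = np.random.default_rng(42)
n = 200_000
eps = rng.normal(0.0, sigma, size=(n, h))
level = np.full(n, l_t)
for j in range(h - 1):
    level *= 1 + alpha * eps[:, j]
y = level * (1 + eps[:, h - 1])

print(mean_th, y.mean())  # the simulated mean should be close to l_t
print(var_th, y.var())    # the simulated variance should be close to (6.10)
```

Note that the conditional expectation does not depend on \(h\), while the variance grows with the horizon.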
where \(\sigma^2\) is the variance of the error term. For the other models, the conditional moments do not have general closed forms because of the products of the terms \((1+\alpha\epsilon_t)\), \((1+\beta\epsilon_t)\), and \((1+\gamma\epsilon_t)\), which share the same error. It is still possible to derive the moments for special cases of \(h\), but this is a tedious process. To see why, consider how the recursion looks for the ETS(M,Md,M) model:
\[\begin{equation}
\begin{aligned}
& y_{t+h} = l_{t+h-1} b_{t+h-1}^\phi s_{t+h-m} \left(1 + \epsilon_{t+h} \right) = \\
& l_{t} b_{t}^{\sum_{j=1}^h{\phi^j}} s_{t+h-m\lceil\frac{h}{m}\rceil} \prod_{j=1}^{h-1} \left( (1 + \alpha \epsilon_{t+j}) \prod_{i=1}^{j} (1 + \beta \epsilon_{t+i})^{\phi^{j-i}} \right) \prod_{j=1}^{\lceil\frac{h}{m}\rceil-1} \left(1 + \gamma \epsilon_{t+h-mj}\right) \left(1 + \epsilon_{t+h} \right) ,
\end{aligned}
\tag{6.11}
\end{equation}\]
where the seasonal product is empty whenever \(h \leq m\). The conditional expectation of the recursion (6.11) does not have a simple form because of the difficulty of calculating the expectation of products such as \((1 + \alpha \epsilon_{t+j})(1 + \beta \epsilon_{t+i})^{\phi^{j-i}}(1 + \gamma \epsilon_{t+j})\), which share the same error terms. In the simple example of \(h=2\) and \(m>h\), the conditional expectation based on (6.11) simplifies to:
\[\begin{equation}
\mu_{y,t+2} = l_{t} b_{t}^{\phi+\phi^2} \left(1 + \alpha \beta \sigma^2 \right),
\tag{6.12}
\end{equation}\]
which introduces the second moment of the error term, its variance \(\sigma^2\). The case of \(h=3\) brings in the third moment, \(h=4\) the fourth, and so on. This is why there are no closed forms for the conditional moments of the pure multiplicative ETS models with trend and/or seasonality. In some special cases, when the smoothing parameters and the variance of the error term are all low, approximate formulae can be used for some of the multiplicative models; these are discussed in Chapter 6 of Hyndman et al. (2008). In the special case when all smoothing parameters are equal to zero, or when \(h=1\), the conditional expectation coincides with the point forecast from Tables 4.1 and **??** in Section 4.2. But in general, the best that can be done is to simulate possible future paths (using the formulae from the tables mentioned above) and then calculate the mean and variance based on them.
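The cross-term behind (6.12) can be verified directly, since for independent zero-mean errors \(\mathrm{E}\left[(1+\alpha\epsilon)(1+\beta\epsilon)\right] = 1 + \alpha\beta\sigma^2\), while the final error enters with expectation one. A minimal sketch of this check follows; the Normal errors and the deliberately large \(\sigma\) (chosen only to make the effect visible) are assumptions of the example:

```python
import numpy as np

# Cross-term behind (6.12): for a zero-mean error eps,
# E[(1 + alpha*eps)(1 + beta*eps)]
#   = 1 + (alpha + beta)*E[eps] + alpha*beta*E[eps^2]
#   = 1 + alpha*beta*sigma^2,
# while the independent final error (1 + eps_{t+2}) has expectation 1.
alpha, beta, sigma = 0.3, 0.2, 0.3   # illustrative values; sigma is large on purpose

rng = rng = np.random.default_rng(0)
n = 1_000_000
eps1 = rng.normal(0.0, sigma, n)   # eps_{t+1}: hits both the level and the trend
eps2 = rng.normal(0.0, sigma, n)   # eps_{t+2}: the final, independent error
product = (1 + alpha * eps1) * (1 + beta * eps1) * (1 + eps2)

expected = 1 + alpha * beta * sigma**2
print(expected, product.mean())   # the two values should be close
```

For \(h=3\) the analogous product would contain \(\epsilon^3\), pulling in the third moment of the error term, which is why the closed forms stop at such special cases.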

Furthermore, it can be shown for pure multiplicative models that: \[\begin{equation} \hat{y}_{t+h} \leq \check{\mu}_{t+h} \leq \mu_{y,t+h} , \tag{6.13} \end{equation}\] where \(\mu_{y,t+h}\) is the conditional \(h\) steps ahead expectation, \(\check{\mu}_{t+h}\) is the conditional \(h\) steps ahead geometric expectation (the expectation in logarithms), and \(\hat{y}_{t+h}\) is the point forecast (Svetunkov and Boylan, 2023b). This means that the point forecasts from pure multiplicative ETS models never exceed the geometric and arithmetic expectations. If the variance of the error term is close to zero, the three elements in (6.13) will be close to each other; a similar effect is achieved when all smoothing parameters are close to zero. Moreover, the three elements coincide for \(h=1\) (Svetunkov and Boylan, 2023b).

Finally, when it comes to conditional quantiles, the same term \(\log(\mathbf{1}_k + \mathbf{g}_{m_i} \epsilon_{t+j})\) causes a different set of problems, introducing convolutions of products of random variables. To better understand this issue, consider the persistence part of the equation for the ETS(M,N,N) model: \[\begin{equation} \log(1+\alpha\epsilon_t) = \log(1-\alpha + \alpha(1+\epsilon_t)). \tag{6.14} \end{equation}\] Whatever we assume about the distribution of the variable \((1+\epsilon_t)\), the distribution of (6.14) will be more complicated. For example, if we assume that \((1+\epsilon_t)\sim\mathrm{log}\mathcal{N}(0,\sigma^2)\), then (6.14) follows something resembling an exp-three-parameter Log-Normal distribution (Sangal and Biswas, 1970). The convolution of (6.14) for different \(t\) does not follow any known distribution, so it is not possible to calculate the conditional quantiles based on (6.7). Similar issues arise under any other distributional assumption. The problem worsens in the case of multiplicative trend and/or multiplicative seasonality models, because the recursion (6.7) then contains several error terms on the same observation (e.g. \(\log(1+\alpha\epsilon_t)\) and \(\log(1+\beta\epsilon_t)\)), introducing products of random variables.
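The non-Normality of (6.14) can be illustrated numerically. Under the Log-Normal assumption above, \(\eta_t = \log(1+\epsilon_t) \sim \mathcal{N}(0,\sigma^2)\), so \(\log(1+\alpha\epsilon_t) = \log(1-\alpha+\alpha e^{\eta_t})\) is bounded below by \(\log(1-\alpha)\) but unbounded above, making it right-skewed rather than Normal. A minimal sketch estimating its skewness follows; the values of \(\alpha\) and \(\sigma\) are illustrative, with \(\sigma\) set large to make the shape visible:

```python
import numpy as np

# If (1 + eps) is Log-Normal(0, sigma^2), then eta = log(1 + eps) ~ N(0, sigma^2),
# and log(1 + alpha*eps) = log(1 - alpha + alpha*exp(eta)).
# This quantity is bounded below by log(1 - alpha) but unbounded above, so its
# distribution is right-skewed, not Normal -- which is why the convolution of
# such terms over t has no known closed form.
alpha, sigma = 0.3, 0.5   # illustrative; sigma is large to exaggerate the shape

rng = np.random.default_rng(7)
eta = rng.normal(0.0, sigma, 1_000_000)
x = np.log(1 - alpha + alpha * np.exp(eta))

# Sample skewness: zero for a Normal distribution, positive here
centred = x - x.mean()
skewness = (centred**3).mean() / (centred**2).mean()**1.5
print(skewness)
```

Only in the boundary case \(\alpha=1\) does the term reduce to \(\eta_t\) itself and stay Normal; for any \(\alpha<1\) the asymmetry appears.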

All of this means that, in general, to obtain adequate estimates of moments or quantiles for a pure multiplicative ETS model, we need to resort to simulations (discussed in Section 18.1).
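The simulation approach can be sketched as follows for ETS(M,Md,N), i.e. a multiplicative damped trend model with the seasonal component dropped for brevity. The Normal error distribution and all parameter values are illustrative assumptions; in practice the fitted model's estimates and assumed distribution would be used instead:

```python
import numpy as np

# Simulation-based moments and quantiles for ETS(M,Md,N):
#   y_t = l_{t-1} * b_{t-1}^phi * (1 + eps_t)
#   l_t = l_{t-1} * b_{t-1}^phi * (1 + alpha*eps_t)
#   b_t = b_{t-1}^phi * (1 + beta*eps_t)
l_t, b_t = 100.0, 1.02               # last in-sample level and trend (illustrative)
alpha, beta, phi = 0.3, 0.1, 0.95
sigma, h, n_paths = 0.05, 10, 100_000

rng = np.random.default_rng(123)
eps = rng.normal(0.0, sigma, size=(n_paths, h))  # one possible error distribution

level = np.full(n_paths, l_t)
trend = np.full(n_paths, b_t)
y = np.empty((n_paths, h))
for j in range(h):
    y[:, j] = level * trend**phi * (1 + eps[:, j])
    level = level * trend**phi * (1 + alpha * eps[:, j])
    trend = trend**phi * (1 + beta * eps[:, j])

# Conditional moments and quantiles for each horizon, estimated from the paths
mean_h = y.mean(axis=0)
var_h = y.var(axis=0)
lower, upper = np.quantile(y, [0.025, 0.975], axis=0)
print(mean_h[-1], var_h[-1], lower[-1], upper[-1])
```

The same loop extends to seasonal models by carrying a vector of seasonal indices, and any quantile level can be read off the simulated paths directly, sidestepping the analytical problems discussed above.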