
The ETS model implemented in the ADAM framework is built upon the conventional one but has several important differences. First, it is formulated using lags of components rather than their transition over time, so the original model (5.21) is written in the following way: \begin{aligned} {y}_{t} = &w(\mathbf{v}_{t-\mathbf{l}}) + r(\mathbf{v}_{t-\mathbf{l}}) \epsilon_t \\ \mathbf{v}_{t} = &f(\mathbf{v}_{t-\mathbf{l}}) + g(\mathbf{v}_{t-\mathbf{l}}) \epsilon_t \end{aligned}, \tag{6.1}
where $$\mathbf{v}_{t-\mathbf{l}}$$ is the vector of lagged components, $$\mathbf{l}$$ is the vector of lags, and all the other functions correspond to the ones used in (5.21). So, for example, for the ETS(A,A,A) model the lags will be $$\mathbf{l}'=\begin{pmatrix}1 & 1 & m\end{pmatrix}$$, where $$m$$ is the seasonal periodicity of the data, leading to $$\mathbf{v}_{t-\mathbf{l}}'=\begin{pmatrix} l_{t-1} & b_{t-1} & s_{t-m}\end{pmatrix}$$. The model (6.1) updates the states in exactly the same way as (5.21) and produces exactly the same values. The main benefit of this formulation is that the transition matrix becomes smaller: $$3\times 3$$ in the case of ETS(A,A,A) instead of $$(2+m)\times (2+m)$$ for the conventional model. The main disadvantage of this approach lies in the complications arising in the derivation of the conditional expectation and variance, which still have closed forms but are more cumbersome. They are discussed later in this chapter for the example of the pure additive ETS.
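To make the lagged formulation concrete, here is a minimal sketch of an ETS(A,A,A) filter written directly in the form of (6.1): each component is updated from its own lag, $$\mathbf{l}'=\begin{pmatrix}1 & 1 & m\end{pmatrix}$$, so only three states are carried per step. This is an illustrative function (the name `ets_aaa_lagged` and its defaults are assumptions for this example, not the ADAM implementation):

```python
import numpy as np

def ets_aaa_lagged(y, alpha=0.1, beta=0.05, gamma=0.01, m=4,
                   level0=None, trend0=0.0, seasonal0=None):
    """Illustrative ETS(A,A,A) filter using the lagged-components
    formulation (6.1): y_t = l_{t-1} + b_{t-1} + s_{t-m} + eps_t,
    with each state updated from its own lag (lags 1, 1 and m).
    Hypothetical sketch, not the adam() implementation."""
    n = len(y)
    level = np.empty(n + 1)            # level[t] holds l_t
    trend = np.empty(n + 1)            # trend[t] holds b_t
    seasonal = np.empty(n + m)         # seasonal[t + m] holds s_{t+1}
    level[0] = y[0] if level0 is None else level0
    trend[0] = trend0
    seasonal[:m] = 0.0 if seasonal0 is None else seasonal0
    fitted = np.empty(n)
    for t in range(n):
        # measurement: w(v_{t-l}) = l_{t-1} + b_{t-1} + s_{t-m}
        fitted[t] = level[t] + trend[t] + seasonal[t]
        eps = y[t] - fitted[t]
        # transition: f(v_{t-l}) + g(v_{t-l}) eps_t, component by component
        level[t + 1] = level[t] + trend[t] + alpha * eps
        trend[t + 1] = trend[t] + beta * eps
        seasonal[t + m] = seasonal[t] + gamma * eps
    return fitted, level, trend, seasonal
```

Note that the seasonal state is stored with an offset of $$m$$, so the loop always reads $$s_{t-m}$$ from index `t`; this is exactly the "lag vector" bookkeeping that lets the transition stay $$3\times 3$$ instead of $$(2+m)\times(2+m)$$.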
Furthermore, ADAM ETS introduces more flexibility, allowing the error term $$\epsilon_t$$ to follow non-normal distributions. This impacts the likelihood function and prediction intervals, but does not change the mechanism of updating the states.