4.1 Time series components
The main idea behind many forecasting techniques is that any time series can contain several unobservable components, such as:
- Level of the series - the average value for a specific period of time,
- Growth of the series - the average increase or decrease of the value over a period of time,
- Seasonality - a pattern that repeats from year to year (e.g. growth in sales of lager beer in summer),
- Error - unexplainable white noise.
Based on these components, two pure models can be constructed. The pure additive one: \[\begin{equation} y_t = l_{t-1} + b_{t-1} + s_{t-m} + \epsilon_t , \tag{4.1} \end{equation}\] and the pure multiplicative one: \[\begin{equation} y_t = l_{t-1} b_{t-1} s_{t-m} \varepsilon_t , \tag{4.2} \end{equation}\] where \(\epsilon_t\) is the additive error term with zero mean and \(\varepsilon_t\) is the multiplicative error term with mean of one. The interpretation of the model (4.1) is that the components add up to each other, so that, for example, the sales in January typically increase by the amount \(s_{t-m}\), and there is still some randomness that the model does not take into account. The pure additive model can be applied to data that takes positive, negative, and zero values. In the case of the model (4.2), the interpretation is similar, but the sales change by \((s_{t-m}-1)\times 100\%\) from the baseline. The pure multiplicative model only works with strictly positive data; in principle it should also work on purely negative data, but this is rarely met in practice.
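For illustration, here is a minimal Python sketch (not taken from the chapter) that generates data from the pure additive model (4.1) and the pure multiplicative model (4.2). The function names and parameter values are assumptions made for the example, and the level and trend are treated as a deterministic line (or exponential curve) to keep the code short.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_pure_additive(n=60, level=100.0, growth=0.5,
                           seasonal=(10.0, -5.0, 0.0, -5.0), sigma=2.0):
    """Pure additive model (4.1): y_t = l_{t-1} + b_{t-1} + s_{t-m} + e_t.

    The level and trend are a deterministic straight line here,
    and e_t is Gaussian with zero mean.
    """
    m = len(seasonal)
    y = np.empty(n)
    for t in range(n):
        baseline = level + growth * t          # level plus additive growth
        y[t] = baseline + seasonal[t % m] + rng.normal(0.0, sigma)
    return y

def simulate_pure_multiplicative(n=60, level=100.0, growth=1.005,
                                 seasonal=(1.10, 0.95, 1.00, 0.95), sigma=0.02):
    """Pure multiplicative model (4.2): y_t = l_{t-1} b_{t-1} s_{t-m} e_t.

    The level and trend form a deterministic exponential curve here, and
    e_t = exp(N(0, sigma^2)) is a positive error with mean close to one.
    """
    m = len(seasonal)
    y = np.empty(n)
    for t in range(n):
        baseline = level * growth ** t         # level times multiplicative growth
        y[t] = baseline * seasonal[t % m] * np.exp(rng.normal(0.0, sigma))
    return y

additive_series = simulate_pure_additive()
multiplicative_series = simulate_pure_multiplicative()
```

Note how the additive seasonal indices are amounts added to the baseline, while the multiplicative ones are factors around one, matching the two interpretations above.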
It is also possible to define mixed models, for example, with an additive trend but multiplicative seasonality and error: \[\begin{equation} y_t = (l_{t-1} + b_{t-1}) s_{t-m} \varepsilon_t . \tag{4.3} \end{equation}\] Such models work well in practice when the data has high values, far from zero, but in other cases they might produce contradictory results, e.g. generate negative values on positive data. This is why the conventional decomposition techniques only consider the pure models.
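A similar sketch (again, an illustrative assumption rather than anything from the chapter) shows the mixed model (4.3) and why it can misbehave near zero: with an additive downward trend, the term \(l_{t-1} + b_{t-1}\) eventually becomes negative, so the model generates negative values even though the observed data were strictly positive.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_mixed(n=40, level=20.0, growth=-1.0,
                   seasonal=(1.10, 0.95, 1.00, 0.95), sigma=0.02):
    """Mixed model (4.3): y_t = (l_{t-1} + b_{t-1}) s_{t-m} e_t.

    The additive level/trend part is a deterministic straight line here and
    e_t = exp(N(0, sigma^2)) is a multiplicative error with mean close to one.
    """
    m = len(seasonal)
    y = np.empty(n)
    for t in range(n):
        baseline = level + growth * t          # additive trend in the level
        y[t] = baseline * seasonal[t % m] * np.exp(rng.normal(0.0, sigma))
    return y

y = simulate_mixed()
print(round(y[0], 2), round(y[-1], 2))
# With level=20 and growth=-1, the baseline crosses zero around t=20,
# so the last values are negative even though the series started positive.
```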