Christmas and the New Year are upon us, and I wanted to publish a celebratory post before taking a break. Instead of writing something educational, I decided to simply recommend a paper for you to read over the holidays – something you might have overlooked in the past couple of years.
Here it is, or here.
This paper, written by Xiaoqian Wang, Rob Hyndman, Feng Li, and Yanfei Kang, is a review of 50 years of literature on forecast combinations. The authors begin with point forecasts, covering combination methods such as linear and non-linear approaches, learning-based combinations, pooling, and more. They then shift their focus to probabilistic forecast combinations, exploring what it means to combine quantiles and how to make the resulting forecasts better calibrated. As expected, the paper ends with conclusions, but the authors go further, summarising some of the gaps in the literature – a helpful starting point for anyone interested in forecasting research.
I admit that this paper has nothing to do with Christmas, but I feel it’s a fitting way to say goodbye to 2024. While we’ve seen remarkable developments in machine learning over the past year, it seems to me that some people are starting to lose sight of basic forecasting principles. This paper discusses one of the most important ones: combinations often produce more robust forecasts than individual models – a principle the authors explain in great detail.
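To make the principle concrete, here is a minimal sketch of the simplest combination scheme the literature starts from: an equal-weight linear average of point forecasts. The model names and forecast values below are entirely hypothetical, chosen just for illustration.

```python
def combine_forecasts(forecasts, weights=None):
    """Linearly combine point forecasts; equal weights by default."""
    n = len(forecasts)
    if weights is None:
        weights = [1.0 / n] * n  # the simple average, a surprisingly strong baseline
    assert abs(sum(weights) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(w * f for w, f in zip(weights, forecasts))

# Three hypothetical one-step-ahead forecasts of the same series
# (e.g. from ETS, ARIMA, and a naive method):
ets, arima, naive = 102.0, 98.0, 100.0
combined = combine_forecasts([ets, arima, naive])  # equal-weight average, ~100.0
```

Even this naive average often beats picking a single "best" model, because the individual models' errors partially cancel out – which is precisely the robustness argument the paper develops in depth.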
Merry Christmas, Happy New Year, and see you in 2025!