N-BEATS-MOE: N-BEATS with a Mixture-of-Experts Layer for Heterogeneous Time Series Forecasting
By: Ricardo Matos, Luis Roque, Vitor Cerqueira
Potential Business Impact:
Predicts future events more accurately by learning from past patterns.
Deep learning approaches are increasingly relevant for time series forecasting. Methods such as N-BEATS, which is built on stacks of multilayer perceptron (MLP) blocks, have achieved state-of-the-art results on benchmark datasets and competitions. N-BEATS is also more interpretable than other deep learning approaches, as it decomposes forecasts into time series components such as trend and seasonality. In this work, we present N-BEATS-MOE, an extension of N-BEATS with a Mixture-of-Experts (MoE) layer. N-BEATS-MOE weights its blocks dynamically through a gating network, which allows the model to adapt to the characteristics of each time series. We also hypothesize that the gating mechanism adds interpretability by identifying which expert is most relevant for each series. We evaluate our method against several approaches on 12 benchmark datasets, achieving consistent improvements on several of them, especially on datasets composed of heterogeneous time series.
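To make the idea concrete, below is a minimal PyTorch sketch of the architecture the abstract describes: N-BEATS-style MLP blocks treated as experts, with a gating network producing per-series softmax weights over their forecasts. The class names, layer sizes, and the exact placement of the gate are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumptions noted above), not the paper's reference code.
import torch
import torch.nn as nn


class Block(nn.Module):
    """One N-BEATS-style block: an MLP emitting a backcast and a forecast."""

    def __init__(self, input_size: int, horizon: int, hidden: int = 256):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(input_size, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.backcast_head = nn.Linear(hidden, input_size)
        self.forecast_head = nn.Linear(hidden, horizon)

    def forward(self, x):
        h = self.mlp(x)
        return self.backcast_head(h), self.forecast_head(h)


class NBeatsMoESketch(nn.Module):
    """Blocks act as experts; a gating network weights their forecasts."""

    def __init__(self, input_size: int, horizon: int, n_blocks: int = 4):
        super().__init__()
        self.blocks = nn.ModuleList(
            Block(input_size, horizon) for _ in range(n_blocks)
        )
        # Gating network: maps the input window to one weight per block.
        self.gate = nn.Sequential(
            nn.Linear(input_size, 64), nn.ReLU(),
            nn.Linear(64, n_blocks),
        )

    def forward(self, x):
        weights = torch.softmax(self.gate(x), dim=-1)  # (batch, n_blocks)
        residual, forecast = x, 0.0
        for i, block in enumerate(self.blocks):
            backcast, block_forecast = block(residual)
            residual = residual - backcast  # doubly residual stacking
            # Scale each block's forecast by its dynamic gate weight.
            forecast = forecast + weights[:, i : i + 1] * block_forecast
        return forecast, weights


# Usage: 8 series with a lookback window of 24 steps, 12-step horizon.
model = NBeatsMoESketch(input_size=24, horizon=12)
y_hat, gate_weights = model(torch.randn(8, 24))
print(y_hat.shape, gate_weights.shape)  # [8, 12] and [8, 4]
```

Returning the gate weights alongside the forecast reflects the interpretability claim: inspecting which block receives the largest weight for a given series indicates which expert the model considers most relevant for it.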
Similar Papers
Unsupervised Anomaly Prediction with N-BEATS and Graph Neural Network in Multi-variate Semiconductor Process Time Series
Machine Learning (CS)
Finds factory problems before they happen.
Wavelet Mixture of Experts for Time Series Forecasting
Machine Learning (CS)
Predicts future events more accurately with less data.
GateTS: Versatile and Efficient Forecasting via Attention-Inspired routed Mixture-of-Experts
Machine Learning (CS)
Makes predictions better and faster.