Wavelet Mixture of Experts for Time Series Forecasting
By: Zheng Zhou, Yu-Jie Xiong, Jia-Chen Zhang, and more
Potential Business Impact:
Predicts future events more accurately with less data.
The field of time series forecasting is rapidly advancing, with recent large-scale Transformers and lightweight Multilayer Perceptron (MLP) models showing strong predictive performance. However, conventional Transformer models are often hindered by their large parameter counts and by attention's tendency to smooth away non-stationary features in the data, while MLP models struggle to model multi-channel dependencies effectively. To address these limitations, we propose WaveTS-B, a novel lightweight time series prediction model that combines wavelet transforms with an MLP to capture both the periodic and the non-stationary characteristics of data in the wavelet domain. Building on this foundation, we introduce a channel clustering strategy within a Mixture of Experts (MoE) framework, using a gating mechanism and expert networks to handle multi-channel dependencies efficiently; the resulting model, WaveTS-M, is tailored for multi-channel time series prediction. Empirical evaluation across eight real-world time series datasets demonstrates that the WaveTS series achieves state-of-the-art (SOTA) performance with significantly fewer parameters. Notably, WaveTS-M shows substantial improvements on multi-channel datasets, highlighting its effectiveness.
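The abstract does not give implementation details, but the described pipeline (transform the series into the wavelet domain, apply expert MLPs, and mix expert outputs per channel via a learned gate) can be illustrated with a minimal NumPy sketch. Everything below is an assumption for illustration: the one-level Haar transform, the linear "experts", the gate parameterization, and all names (`haar_dwt`, `ExpertMLP`, `moe_forecast`) are hypothetical and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_dwt(x):
    # One-level Haar wavelet transform along the last axis (even length assumed).
    approx = (x[..., 0::2] + x[..., 1::2]) / np.sqrt(2)
    detail = (x[..., 0::2] - x[..., 1::2]) / np.sqrt(2)
    return approx, detail

class ExpertMLP:
    """A tiny linear 'expert' mapping wavelet coefficients to a forecast.
    (Stand-in for the paper's expert networks; architecture is assumed.)"""
    def __init__(self, in_len, out_len):
        self.W = rng.normal(scale=0.1, size=(in_len, out_len))
        self.b = np.zeros(out_len)

    def __call__(self, x):
        return x @ self.W + self.b

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def moe_forecast(series, experts, gate_W):
    """series: (channels, length). A gate scores each channel's wavelet
    coefficients, and expert outputs are mixed with those gate weights."""
    approx, detail = haar_dwt(series)                       # wavelet domain
    coeffs = np.concatenate([approx, detail], axis=-1)      # (C, L)
    gates = softmax(coeffs @ gate_W)                        # (C, E)
    outs = np.stack([e(coeffs) for e in experts], axis=1)   # (C, E, H)
    return (gates[..., None] * outs).sum(axis=1)            # (C, H)

C, L, H, E = 4, 16, 8, 3          # channels, lookback, horizon, experts
experts = [ExpertMLP(L, H) for _ in range(E)]
gate_W = rng.normal(scale=0.1, size=(L, E))
x = rng.normal(size=(C, L))
y = moe_forecast(x, experts, gate_W)
print(y.shape)  # (4, 8): one horizon-length forecast per channel
```

In a trained model, `gate_W` and the expert weights would be learned jointly; the gate's per-channel soft assignment is what lets channels with similar behavior share experts, in the spirit of the channel clustering strategy the abstract describes.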
Similar Papers
EMTSF: Extraordinary Mixture of SOTA Models for Time Series Forecasting
Computation and Language
Predicts future events more accurately than before.
GateTS: Versatile and Efficient Forecasting via Attention-Inspired routed Mixture-of-Experts
Machine Learning (CS)
Makes predictions better and faster.
MoWE : A Mixture of Weather Experts
Machine Learning (CS)
Combines weather forecasts for better predictions.