Ada-MoGE: Adaptive Mixture of Gaussian Expert Model for Time Series Forecasting
By: Zhenliang Ni, Xiaowen Ma, Zhenkai Wu, and more
Potential Business Impact:
Helps computers forecast future values more accurately as the patterns in the data shift.
Multivariate time series forecasting is widely used in industrial, transportation, and financial applications. However, the dominant frequencies in a time series may shift as the spectral distribution of the data evolves. Traditional Mixture of Experts (MoE) models, which employ a fixed number of experts, struggle to adapt to these changes, resulting in a frequency coverage imbalance: too few experts can overlook critical information, while too many can introduce noise. To this end, we propose Ada-MoGE, an adaptive Gaussian Mixture of Experts model. Ada-MoGE integrates spectral intensity and frequency response to adaptively determine the number of experts, ensuring alignment with the input data's frequency distribution. This prevents both information loss from too few experts and noise contamination from too many. Additionally, to avoid the noise introduced by hard band truncation, we employ Gaussian band-pass filtering to smoothly decompose the frequency-domain features, further refining the feature representation. Experimental results show that our model achieves state-of-the-art performance on six public benchmarks with only 0.2 million parameters.
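To make the two mechanisms in the abstract concrete, here is a minimal NumPy sketch of (a) choosing the number of experts from the input's spectral energy and (b) decomposing a series with smooth Gaussian band-pass filters instead of hard band truncation. The abstract does not give the exact spectral-intensity/frequency-response rule, so the energy-coverage heuristic, uniform band centers, and filter widths below are illustrative assumptions, not the paper's actual formulation.

```python
# Illustrative sketch only: thresholds, band centers, and filter widths are
# assumptions; Ada-MoGE's exact expert-selection rule is not specified here.
import numpy as np

def gaussian_bandpass_decompose(x, num_bands, sigma_ratio=0.5):
    """Split a 1-D series into frequency-band components using smooth
    Gaussian band-pass filters rather than hard band truncation."""
    n = x.shape[-1]
    spec = np.fft.rfft(x)                       # one-sided spectrum
    freqs = np.fft.rfftfreq(n)                  # normalized frequencies in [0, 0.5]
    centers = np.linspace(0.0, 0.5, num_bands)  # assumed uniform band centers
    sigma = sigma_ratio * (0.5 / max(num_bands - 1, 1))
    components = []
    for c in centers:
        gain = np.exp(-0.5 * ((freqs - c) / sigma) ** 2)   # Gaussian band-pass gain
        components.append(np.fft.irfft(spec * gain, n=n))  # back to time domain
    return components                            # one component per expert

def adaptive_num_experts(x, max_experts=8, energy_threshold=0.9):
    """Pick the number of experts from the spectral energy distribution:
    use just enough bands to cover `energy_threshold` of the total energy
    (a stand-in for the paper's spectral-intensity criterion)."""
    spec_power = np.abs(np.fft.rfft(x)) ** 2
    bands = np.array_split(spec_power, max_experts)
    band_energy = np.array([b.sum() for b in bands])
    order = np.argsort(band_energy)[::-1]        # strongest bands first
    cum = np.cumsum(band_energy[order]) / band_energy.sum()
    return int(np.searchsorted(cum, energy_threshold) + 1)

if __name__ == "__main__":
    t = np.arange(512)
    x = np.sin(0.05 * t) + 0.3 * np.sin(0.4 * t) + 0.1 * np.random.randn(512)
    k = adaptive_num_experts(x)
    parts = gaussian_bandpass_decompose(x, num_bands=k)
    print(f"selected {k} experts; decomposition has {len(parts)} band components")
```

In this sketch, an input whose energy is concentrated in a few frequency bands activates fewer experts, while a spectrally richer input activates more, which mirrors the adaptive behavior the abstract describes.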
Similar Papers
M$^2$FMoE: Multi-Resolution Multi-View Frequency Mixture-of-Experts for Extreme-Adaptive Time Series Forecasting
Machine Learning (CS)
Predicts floods and droughts better, even when rare.
FreqMoE: Enhancing Time Series Forecasting through Frequency Decomposition Mixture of Experts
Machine Learning (CS)
Improves weather and financial forecasts by splitting data into its frequency patterns.
MoFE-Time: Mixture of Frequency Domain Experts for Time-Series Forecasting Models
Machine Learning (CS)
Predicts future events better by learning patterns.