Foundation Models and Fine-Tuning: Toward a New Generation of Models for Time Series Forecasting
By: Morad Laglil, Emilie Devijver, Eric Gaussier, and more
Potential Business Impact:
Enables forecasting on new datasets without building or tuning task-specific models.
Inspired by recent advances in large language models, foundation models have been developed for zero-shot time series forecasting, enabling prediction on datasets unseen during pretraining. These large-scale models, trained on vast collections of time series, learn generalizable representations for both point and probabilistic forecasting, reducing the need for task-specific architectures and manual tuning. In this work, we review the main architectures, pretraining strategies, and optimization methods used in such models, and study the effect of fine-tuning after pretraining to enhance their performance on specific datasets. Our empirical results show that fine-tuning generally improves zero-shot forecasting capabilities, especially for long-term horizons.
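To make the fine-tuning step concrete, here is a minimal PyTorch sketch of adapting a pretrained forecasting backbone to a target dataset. The `PretrainedForecaster` class, the `fine_tune` helper, and all shapes and hyperparameters are illustrative assumptions, not the architectures or training setup studied in the paper; real foundation models ship with their own loading and adaptation APIs.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a pretrained time-series foundation model.
# In practice the weights would be loaded from a pretraining checkpoint.
class PretrainedForecaster(nn.Module):
    def __init__(self, context_len=512, horizon=96, d_model=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(context_len, d_model), nn.GELU(),
            nn.Linear(d_model, d_model), nn.GELU(),
        )
        self.head = nn.Linear(d_model, horizon)  # point-forecast head

    def forward(self, x):                    # x: (batch, context_len)
        return self.head(self.encoder(x))    # -> (batch, horizon)

def fine_tune(model, loader, epochs=5, lr=1e-4):
    """Full fine-tuning: update all pretrained weights on target data."""
    opt = torch.optim.AdamW(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    model.train()
    for _ in range(epochs):
        for context, target in loader:       # target: (batch, horizon)
            opt.zero_grad()
            loss = loss_fn(model(context), target)
            loss.backward()
            opt.step()
    return model

# Toy usage: random tensors stand in for a real target dataset.
if __name__ == "__main__":
    model = PretrainedForecaster()
    data = torch.utils.data.TensorDataset(
        torch.randn(256, 512), torch.randn(256, 96))
    loader = torch.utils.data.DataLoader(data, batch_size=32, shuffle=True)
    fine_tune(model, loader)
```

This sketch updates every pretrained weight; a lighter variant would freeze the encoder and train only the forecast head, trading adaptation capacity for lower compute and less risk of overfitting small target datasets.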
Similar Papers
Re(Visiting) Time Series Foundation Models in Finance
Computational Finance
Teaches computers to predict stock prices better.
Pre-trained Forecasting Models: Strong Zero-Shot Feature Extractors for Time Series Classification
Machine Learning (CS)
Teaches computers to understand patterns in data.
Foundation Models for Time Series: A Survey
Machine Learning (CS)
Helps computers understand patterns in data over time.