Foundation Models and Fine-Tuning: Towards a New Generation of Models for Time Series Forecasting

Published: November 27, 2025 | arXiv ID: 2511.22674v1

By: Morad Laglil, Emilie Devijver, Eric Gaussier, and more

Potential Business Impact:

Enables zero-shot forecasting on previously unseen time series, with fine-tuning to further improve accuracy on specific datasets.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Inspired by recent advances in large language models, foundation models have been developed for zero-shot time series forecasting, enabling prediction on datasets unseen during pretraining. These large-scale models, trained on vast collections of time series, learn generalizable representations for both point and probabilistic forecasting, reducing the need for task-specific architectures and manual tuning. In this work, we review the main architectures, pretraining strategies, and optimization methods used in such models, and study the effect of fine-tuning after pretraining to enhance their performance on specific datasets. Our empirical results show that fine-tuning generally improves zero-shot forecasting capabilities, especially for long-term horizons.
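To make the zero-shot versus fine-tuned setup concrete, here is a minimal sketch in PyTorch. It is not the paper's method: `TinyForecaster`, the context/horizon sizes, and the single synthetic series are all hypothetical stand-ins for a pretrained foundation model and a target dataset; only the overall workflow (forecast with frozen pretrained weights, then continue training on the target data with a small learning rate) reflects the fine-tuning idea described above.

```python
# Minimal sketch (assumption: a tiny PyTorch forecaster stands in for a
# pretrained time series foundation model; the fine-tuning loop is illustrative).
import torch
import torch.nn as nn

CONTEXT, HORIZON = 96, 24  # look-back window length and forecast horizon


class TinyForecaster(nn.Module):
    """Hypothetical stand-in for a pretrained forecasting foundation model."""

    def __init__(self, context=CONTEXT, horizon=HORIZON, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(context, hidden), nn.ReLU(), nn.Linear(hidden, horizon)
        )

    def forward(self, x):       # x: (batch, context)
        return self.net(x)      # point forecast: (batch, horizon)


# Pretend these weights came from large-scale pretraining on many series.
model = TinyForecaster()

# --- Zero-shot: forecast an unseen series with frozen pretrained weights ---
series = torch.sin(torch.linspace(0, 20, CONTEXT + HORIZON)).unsqueeze(0)
context, target = series[:, :CONTEXT], series[:, CONTEXT:]
with torch.no_grad():
    zero_shot_pred = model(context)

# --- Fine-tuning: adapt the pretrained weights to the target dataset ---
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)  # small LR to stay near pretraining
loss_fn = nn.MSELoss()
for step in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(context), target)
    loss.backward()
    optimizer.step()

with torch.no_grad():
    fine_tuned_pred = model(context)

print(f"zero-shot MSE:  {loss_fn(zero_shot_pred, target):.4f}")
print(f"fine-tuned MSE: {loss_fn(fine_tuned_pred, target):.4f}")
```

In practice the same pattern applies with a real foundation model checkpoint in place of `TinyForecaster`, and with a proper training/validation split of the target dataset rather than a single synthetic window.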

Country of Origin
🇫🇷 France

Page Count
15 pages

Category
Computer Science:
Machine Learning (CS)