A Comparative Study on How Data Normalization Affects Zero-Shot Generalization in Time Series Foundation Models
By: Ihab Ahmed, Denis Krompaß, Cheng Feng, and more
Potential Business Impact:
Makes AI better at understanding changing data.
We investigate input normalization methods for Time-Series Foundation Models (TSFMs). While normalization is well studied in dataset-specific time-series models, it remains overlooked in TSFMs, where generalization is critical. Unlike text or images, time-series data exhibits significant scale variation across domains and channels; coupled with non-stationarity, this variation can undermine TSFM performance regardless of architectural complexity. Through systematic evaluation across four architecturally diverse TSFMs, we empirically establish RevIN (reversible instance normalization) as the most efficient approach: it reduces zero-shot MASE by 89% relative to an unnormalized baseline and by 44% versus other normalization methods, while matching the best in-domain accuracy (0.84 MASE) without any dataset-level preprocessing, yielding the best accuracy-efficiency trade-off. Yet its effectiveness depends on architectural design choices and the optimization objective, particularly the scale sensitivity of the training loss and the model type (probabilistic, point-forecast, or LLM-based).
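For readers unfamiliar with the two key ingredients named in the abstract, the sketch below illustrates reversible instance normalization and the MASE metric in PyTorch. It is a minimal illustration following common open-source RevIN implementations (Kim et al., 2022), not the paper's own code; the class layout and names (`RevIN`, `mode`, `num_channels`, `mase`) are assumptions for the example.

```python
import torch
import torch.nn as nn


class RevIN(nn.Module):
    """Reversible instance normalization (sketch, after Kim et al., 2022).

    Normalizes each input instance per channel using statistics computed
    over the time dimension, then restores the original scale on the
    model's output, making the forecaster robust to scale shifts.
    """

    def __init__(self, num_channels: int, eps: float = 1e-5, affine: bool = True):
        super().__init__()
        self.eps = eps
        self.affine = affine
        if affine:
            # Learnable per-channel rescaling applied after normalization.
            self.gamma = nn.Parameter(torch.ones(num_channels))
            self.beta = nn.Parameter(torch.zeros(num_channels))

    def forward(self, x: torch.Tensor, mode: str) -> torch.Tensor:
        # x: (batch, time, channels)
        if mode == "norm":
            # Instance statistics over the time axis, kept for denormalization.
            self.mean = x.mean(dim=1, keepdim=True).detach()
            self.std = torch.sqrt(
                x.var(dim=1, keepdim=True, unbiased=False) + self.eps
            ).detach()
            x = (x - self.mean) / self.std
            if self.affine:
                x = x * self.gamma + self.beta
            return x
        if mode == "denorm":
            # Invert the affine map, then restore the instance scale.
            if self.affine:
                x = (x - self.beta) / (self.gamma + self.eps)
            return x * self.std + self.mean
        raise ValueError(f"unknown mode: {mode}")


def mase(y_true: torch.Tensor, y_pred: torch.Tensor,
         y_train: torch.Tensor, m: int = 1) -> torch.Tensor:
    """Mean Absolute Scaled Error: forecast MAE divided by the in-sample
    MAE of a seasonal-naive forecast with period m (MASE < 1 beats naive)."""
    naive_mae = torch.mean(torch.abs(y_train[m:] - y_train[:-m]))
    return torch.mean(torch.abs(y_true - y_pred)) / naive_mae
```

In a typical usage pattern, the model calls `revin(x, "norm")` on its input window and `revin(y_hat, "denorm")` on its prediction, so downstream metrics like MASE are computed in the original scale.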
Similar Papers
Re(Visiting) Time Series Foundation Models in Finance
Computational Finance
Teaches computers to predict stock prices better.
Generalisation Bounds of Zero-Shot Economic Forecasting using Time Series Foundation Models
Machine Learning (CS)
Predicts the economy without needing old data.
Time Series Foundation Models: Benchmarking Challenges and Requirements
Machine Learning (CS)
Tests if future predictions are truly new.