A Comparative Study on How Data Normalization Affects Zero-Shot Generalization in Time Series Foundation Models

Published: December 2, 2025 | arXiv ID: 2512.02833v1

By: Ihab Ahmed, Denis Krompaß, Cheng Feng, and more

BigTech Affiliations: Siemens

Potential Business Impact:

Improves how time-series forecasting models generalize to unseen data with varying scales and shifting statistics.

Business Areas:
A/B Testing, Data and Analytics

We investigate input normalization methods for Time-Series Foundation Models (TSFMs). While normalization is well studied in dataset-specific time-series models, it remains overlooked in TSFMs, where generalization is critical. Unlike text or images, time-series data exhibits significant scale variation across domains and channels; coupled with non-stationarity, this can undermine TSFM performance regardless of architectural complexity. Through systematic evaluation across four architecturally diverse TSFMs, we empirically establish REVIN as the most efficient approach, reducing zero-shot MASE by 89% relative to an un-normalized baseline and by 44% versus other normalization methods, while matching the best in-domain accuracy (0.84 MASE) without any dataset-level preprocessing -- yielding the best accuracy-efficiency trade-off. Yet its effectiveness depends on architectural design choices and the optimization objective, particularly the training loss's sensitivity to scale and the model type (probabilistic, point-forecast, or LLM-based).
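The abstract does not detail how REVIN works; the sketch below illustrates the general idea of reversible instance normalization as commonly described in the time-series literature: each input window is normalized by its own mean and standard deviation before the model sees it, and the forecast is mapped back to the original scale afterward. The function names and shapes here are illustrative assumptions, not the paper's implementation (a learnable affine transform, present in some variants, is omitted for brevity).

```python
import numpy as np

def instance_normalize(x, eps=1e-5):
    """Normalize each series window by its own statistics.

    x: array of shape (batch, time, channels).
    Returns the normalized windows plus the (mean, std) needed to invert.
    """
    mean = x.mean(axis=1, keepdims=True)   # per-instance, per-channel mean
    std = x.std(axis=1, keepdims=True)     # per-instance, per-channel std
    return (x - mean) / (std + eps), (mean, std)

def instance_denormalize(y, stats, eps=1e-5):
    """Map model outputs (e.g. forecasts) back to the original scale."""
    mean, std = stats
    return y * (std + eps) + mean

# Illustrative usage: windows from different domains can differ in scale
# by orders of magnitude; after normalization they look alike to the model.
x = np.random.randn(2, 24, 3) * np.array([1.0, 100.0, 1e4]) + 50.0
x_norm, stats = instance_normalize(x)
x_back = instance_denormalize(x_norm, stats)
```

Because the statistics are computed per window rather than per dataset, no dataset-level preprocessing is required, which is what makes this style of normalization attractive in the zero-shot setting the paper evaluates.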

Country of Origin
🇩🇪 Germany

Page Count
5 pages

Category
Computer Science:
Machine Learning (CS)