There is No "apple" in Timeseries: Rethinking TSFM through the Lens of Invariance

Published: October 23, 2025 | arXiv ID: 2510.20119v1

By: Arian Prabowo, Flora D. Salim

Potential Business Impact:

Improves the accuracy of machine-learning predictions made from time-series data.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Timeseries foundation models (TSFMs) have multiplied, yet lightweight supervised baselines and even classical models often match them. We argue this gap stems from the naive importation of NLP or CV pipelines. In language and vision, large web-scale corpora densely capture human concepts: there are countless images and texts of apples. In contrast, timeseries data is built to complement the image and text modalities; there is no timeseries dataset that contains the concept "apple". As a result, the scrape-everything-online paradigm fails for TS. We posit that progress demands a shift from opportunistic aggregation to principled design: constructing datasets that systematically span the space of invariances that preserve temporal semantics. To this end, we suggest that the ontology of timeseries invariances should be built from first principles. Only by ensuring representational completeness through invariance coverage can TSFMs achieve the aligned structure necessary for generalisation, reasoning, and truly emergent behaviour.
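To make the notion of invariances that preserve temporal semantics concrete, here is a minimal sketch. The specific transformations chosen (cyclic time shift, amplitude scaling, z-score normalisation) are illustrative assumptions, not the ontology proposed by the paper:

```python
# Illustrative sketch: example transformations under which the semantics
# of a timeseries might be preserved. The choice of invariances here is
# an assumption for illustration, not the paper's proposed ontology.
import math

def time_shift(series, k):
    """Cyclically shift a series by k steps (shift invariance)."""
    k %= len(series)
    return series[k:] + series[:k]

def amplitude_scale(series, c):
    """Scale amplitude by a constant c (scale invariance)."""
    return [c * x for x in series]

def zscore(series):
    """Normalise to zero mean, unit variance (offset/scale invariance)."""
    mu = sum(series) / len(series)
    sd = math.sqrt(sum((x - mu) ** 2 for x in series) / len(series))
    return [(x - mu) / sd for x in series]

# Two periods of a sine wave as a toy series.
series = [math.sin(2 * math.pi * t / 16) for t in range(32)]
shifted = time_shift(series, 4)
scaled = amplitude_scale(series, 3.0)

# z-scoring removes the amplitude difference: a model trained on
# normalised inputs is invariant to the scaling applied above.
assert all(abs(a - b) < 1e-9 for a, b in zip(zscore(series), zscore(scaled)))
```

A dataset designed under this lens would enumerate such transformations systematically and ensure each is covered, rather than relying on whatever variation happens to appear in scraped data.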

Country of Origin
🇦🇺 Australia

Page Count
7 pages

Category
Computer Science:
Machine Learning (CS)