AI Foundation Model for Time Series with Innovations Representation

Published: October 2, 2025 | arXiv ID: 2510.01560v1

By: Lang Tong, Xinyi Wang

Potential Business Impact:

Enables probabilistic forecasting of engineering time series, such as real-time electricity prices, to support real-time monitoring and control.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

This paper introduces an Artificial Intelligence (AI) foundation model for time series in engineering applications, where causal operations are required for real-time monitoring and control. Since engineering time series are governed by physical, rather than linguistic, laws, large-language-model-based AI foundation models may be ineffective or inefficient. Building on the classical innovations representation theory of Wiener, Kallianpur, and Rosenblatt, we propose Time Series GPT (TS-GPT) -- an innovations-representation-based Generative Pre-trained Transformer for engineering monitoring and control. As an example of foundation model adaptation, we consider Probabilistic Generative Forecasting, which produces future time series samples from conditional probability distributions given past realizations. We demonstrate the effectiveness of TS-GPT in forecasting real-time locational marginal prices using historical data from U.S. independent system operators.
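To make the notion of probabilistic generative forecasting concrete, below is a minimal sketch of the underlying idea of an innovations representation: a causal model maps past observations to i.i.d. prediction errors ("innovations"), and forecasting draws fresh innovations and runs the causal recursion forward to sample future paths from the conditional distribution given the past. The sketch assumes a linear Gaussian AR(1) process as a stand-in; TS-GPT itself is a transformer-based model, and the names (phi_hat, sample_paths) and the synthetic data are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Minimal sketch (not the paper's TS-GPT): probabilistic generative forecasting
# for a linear Gaussian AR(1) process, whose innovations representation is known
# in closed form. One-step prediction errors ("innovations") are i.i.d.; sampling
# fresh innovations and running the causal recursion forward yields Monte Carlo
# sample paths from the conditional distribution of the future given the past.

rng = np.random.default_rng(0)

# --- Hypothetical historical series (stand-in for real price data) ---
phi_true, sigma_true, T = 0.9, 1.0, 500
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi_true * x[t - 1] + sigma_true * rng.standard_normal()

# --- "Pre-training": fit the causal model and extract innovations ---
phi_hat = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
innovations = x[1:] - phi_hat * x[:-1]          # i.i.d. under the fitted model
sigma_hat = innovations.std(ddof=1)

# --- Generative forecasting: sample future paths given the observed past ---
def sample_paths(x_last, horizon=24, n_paths=200):
    """Draw sample paths of the next `horizon` steps conditioned on the
    last observation by feeding fresh innovations through the recursion."""
    paths = np.empty((n_paths, horizon))
    state = np.full(n_paths, x_last)
    for h in range(horizon):
        state = phi_hat * state + sigma_hat * rng.standard_normal(n_paths)
        paths[:, h] = state
    return paths

paths = sample_paths(x[-1])
print("median forecast :", np.round(np.median(paths, axis=0)[:5], 3))
print("90% band width  :", np.round(
    (np.quantile(paths, 0.95, axis=0) - np.quantile(paths, 0.05, axis=0))[:5], 3))
```

In the paper's setting, the role of the AR(1) recursion is played by a learned causal transformer, but the adaptation pattern is the same: condition on past realizations, sample innovations, and decode them into future trajectories whose spread quantifies forecast uncertainty.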

Page Count
14 pages

Category
Statistics: Machine Learning (stat.ML)