AI Foundation Model for Time Series with Innovations Representation
By: Lang Tong, Xinyi Wang
Potential Business Impact:
Helps engineers forecast time series, such as real-time electricity prices, for monitoring and control.
This paper introduces an Artificial Intelligence (AI) foundation model for time series in engineering applications, where causal operations are required for real-time monitoring and control. Since engineering time series are governed by physical, rather than linguistic, laws, large-language-model-based AI foundation models may be ineffective or inefficient. Building on the classical innovations representation theory of Wiener, Kallianpur, and Rosenblatt, we propose Time Series GPT (TS-GPT) -- an innovations-representation-based Generative Pre-trained Transformer for engineering monitoring and control. As an example of foundation model adaptation, we consider Probabilistic Generative Forecasting, which produces future time series samples from conditional probability distributions given past realizations. We demonstrate the effectiveness of TS-GPT in forecasting real-time locational marginal prices using historical data from U.S. independent system operators.
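To make the forecasting adaptation concrete, below is a minimal Python sketch of probabilistic generative forecasting through an innovations representation. It substitutes a toy linear-Gaussian AR(1) process for TS-GPT's learned transformer encoder/decoder, so the coefficient phi, the scale sigma, and the encode/generative_forecast helpers are illustrative assumptions rather than the paper's implementation; only the pipeline (causally encode the past into i.i.d. innovations, sample fresh i.i.d. innovations, decode them causally) mirrors the idea described above.

    # Toy sketch, NOT the paper's code: an AR(1) stands in for TS-GPT's
    # learned causal encoder/decoder. phi, sigma, and all names below are
    # illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    phi, sigma = 0.8, 0.5  # assumed AR(1) coefficient and innovation scale

    # Simulate a "historical" series x_t = phi * x_{t-1} + v_t, v_t i.i.d. Gaussian.
    T = 200
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)

    # Causal encoder: map observations to innovations. For AR(1) the innovation
    # is the one-step prediction residual; TS-GPT would learn this map instead.
    def encode(series):
        return series[1:] - phi * series[:-1]

    innovations = encode(x)
    print("innovation std ~", innovations.std())  # should be close to sigma

    # Probabilistic generative forecasting: draw i.i.d. future innovations and
    # decode them causally, yielding samples from p(x_future | x_past).
    def generative_forecast(x_past, horizon, n_samples):
        paths = np.empty((n_samples, horizon))
        for i in range(n_samples):
            state = x_past[-1]
            for h in range(horizon):
                state = phi * state + rng.normal(0.0, sigma)  # one causal decode step
                paths[i, h] = state
        return paths

    samples = generative_forecast(x, horizon=24, n_samples=1000)
    mean = samples.mean(axis=0)                          # Monte Carlo conditional mean
    lo, hi = np.quantile(samples, [0.05, 0.95], axis=0)  # 90% predictive band
    print("h=1 mean ~", mean[0], "exact:", phi * x[-1])

In TS-GPT itself the encoder and decoder would be learned networks trained on historical data such as locational marginal prices; the toy model only illustrates why reducing a series to i.i.d. innovations makes sampling from the conditional distribution of the future straightforward.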
Similar Papers
MarketGPT: Developing a Pre-trained transformer (GPT) for Modeling Financial Time Series
Trading & Market Microstructure
Makes computer markets act like real ones.
From Prediction to Understanding: Will AI Foundation Models Transform Brain Science?
Neurons and Cognition
Helps AI understand how brains work.
Building a Foundation Model for Trajectory from Scratch
Artificial Intelligence
Teaches computers to predict where things will go.