Less Is More: Generating Time Series with LLaMA-Style Autoregression in Simple Factorized Latent Spaces
By: Siyuan Li, Yifan Sun, Lei Cheng, and more
Potential Business Impact:
Quickly makes fake data that looks real.
Generative models for multivariate time series are essential for data augmentation, simulation, and privacy preservation, yet current state-of-the-art diffusion-based approaches are slow and limited to fixed-length windows. We propose FAR-TS, a simple yet effective framework that combines disentangled factorization with an autoregressive Transformer over a discrete, quantized latent space to generate time series. Each time series is decomposed into a data-adaptive basis that captures static cross-channel correlations and temporal coefficients that are vector-quantized into discrete tokens. A LLaMA-style autoregressive Transformer then models these token sequences, enabling fast and controllable generation of sequences of arbitrary length. Owing to its streamlined design, FAR-TS achieves orders-of-magnitude faster generation than Diffusion-TS while preserving cross-channel correlations and an interpretable latent space, yielding high-quality and flexible time series synthesis.
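As a concrete illustration, here is a minimal sketch of the pipeline the abstract describes: factorize each series into a static cross-channel basis and temporal coefficients, vector-quantize the coefficients into discrete tokens, and model the token sequence with a causal Transformer. Everything here is assumed for illustration: the class names, the use of SVD as the data-adaptive basis, and a generic causal Transformer standing in for the LLaMA-style model; the paper's actual architecture and training losses (e.g., VQ commitment terms) are not shown.

```python
# Minimal sketch of a FAR-TS-style pipeline (PyTorch). All names, shapes, and
# the SVD-based factorization are illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn


def factorize(x: torch.Tensor, k: int):
    """Split a series x of shape (channels, T) into a static basis (channels, k)
    capturing cross-channel correlations and temporal coefficients (k, T).
    SVD is a stand-in for the paper's data-adaptive basis."""
    u, s, vh = torch.linalg.svd(x, full_matrices=False)
    basis = u[:, :k]                     # static cross-channel structure
    coeffs = s[:k, None] * vh[:k]        # temporal coefficients
    return basis, coeffs


class VectorQuantizer(nn.Module):
    """Nearest-neighbour quantization of coefficients into discrete tokens.
    Training details (straight-through gradients, commitment loss) omitted."""
    def __init__(self, num_codes: int, dim: int):
        super().__init__()
        self.codebook = nn.Embedding(num_codes, dim)

    def forward(self, z: torch.Tensor):                  # z: (batch, T, dim)
        dists = ((z.unsqueeze(-2) - self.codebook.weight) ** 2).sum(-1)
        tokens = dists.argmin(dim=-1)                    # (batch, T) token ids
        return tokens, self.codebook(tokens)             # ids, quantized coeffs


class TokenAR(nn.Module):
    """Causal Transformer over token sequences; a generic decoder standing in
    for the paper's LLaMA-style autoregressive model."""
    def __init__(self, num_codes: int = 512, dim: int = 128,
                 layers: int = 4, heads: int = 4):
        super().__init__()
        self.embed = nn.Embedding(num_codes, dim)
        block = nn.TransformerEncoderLayer(dim, heads, 4 * dim, batch_first=True)
        self.encoder = nn.TransformerEncoder(block, layers)
        self.head = nn.Linear(dim, num_codes)

    def forward(self, tokens: torch.Tensor):             # tokens: (batch, T)
        h = self.embed(tokens)
        mask = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        return self.head(self.encoder(h, mask=mask))     # next-token logits


# Encode: factorize -> quantize -> model tokens; decode via basis @ coefficients.
x = torch.randn(12, 256)                    # 12 channels, 256 timesteps
basis, coeffs = factorize(x, k=8)
vq = VectorQuantizer(num_codes=512, dim=8)
tokens, z_q = vq(coeffs.T.unsqueeze(0))     # (1, 256) ids, (1, 256, 8) coeffs
logits = TokenAR()(tokens)                  # (1, 256, 512) next-token logits
recon = basis @ z_q.squeeze(0).T            # approximate (12, 256) reconstruction
```

Because generation proceeds token by token, sequence length is not fixed at training time, which is what allows arbitrary-length synthesis and avoids a diffusion sampler's many denoising passes.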
Similar Papers
Fast Autoregressive Models for Continuous Latent Generation
CV and Pattern Recognition
Makes computers draw realistic pictures much faster.
L-GTA: Latent Generative Modeling for Time Series Augmentation
Machine Learning (CS)
Makes computer predictions better by creating fake data.
TARFVAE: Efficient One-Step Generative Time Series Forecasting via TARFLOW-based VAE
Machine Learning (CS)
Predicts future events faster than other methods.