Less Is More: Generating Time Series with LLaMA-Style Autoregression in Simple Factorized Latent Spaces

Published: November 7, 2025 | arXiv ID: 2511.04973v1

By: Siyuan Li, Yifan Sun, Lei Cheng, et al.

Potential Business Impact:

Generates realistic synthetic time-series data quickly, supporting data augmentation, simulation, and privacy-preserving workflows.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Generative models for multivariate time series are essential for data augmentation, simulation, and privacy preservation, yet current state-of-the-art diffusion-based approaches are slow and limited to fixed-length windows. We propose FAR-TS, a simple yet effective framework that combines disentangled factorization with an autoregressive Transformer over a discrete, quantized latent space to generate time series. Each time series is decomposed into a data-adaptive basis that captures static cross-channel correlations and temporal coefficients that are vector-quantized into discrete tokens. A LLaMA-style autoregressive Transformer then models these token sequences, enabling fast and controllable generation of sequences with arbitrary length. Owing to its streamlined design, FAR-TS achieves orders-of-magnitude faster generation than Diffusion-TS while preserving cross-channel correlations and an interpretable latent space, enabling high-quality and flexible time series synthesis.
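The pipeline described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the SVD factorization, random codebook, and bigram token model below are stand-ins I chose for the data-adaptive basis, the learned vector-quantizer, and the LLaMA-style Transformer, respectively; all array sizes (`C`, `T`, `K`, `V`) are assumed toy values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multivariate series: C channels x T time steps (synthetic random walk).
C, T, K = 4, 64, 2          # K: number of basis components (assumed)
X = rng.standard_normal((C, T)).cumsum(axis=1)

# 1) Factorize into a data-adaptive basis (static cross-channel structure)
#    and temporal coefficients. SVD is one simple choice; the paper's exact
#    factorization may differ.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
basis = U[:, :K]                       # C x K: captures channel correlations
coeffs = np.diag(s[:K]) @ Vt[:K]       # K x T: temporal coefficients

# 2) Vector-quantize each time step's coefficient vector against a small
#    codebook (randomly initialized here; FAR-TS learns its codebook).
V = 16                                  # codebook size (assumed)
codebook = rng.standard_normal((V, K))
dists = ((coeffs.T[:, None, :] - codebook[None]) ** 2).sum(-1)  # T x V
tokens = dists.argmin(axis=1)           # discrete token sequence, length T

# 3) Model the token sequence autoregressively. A smoothed bigram table
#    stands in for the LLaMA-style Transformer.
counts = np.ones((V, V))                # Laplace smoothing
for a, b in zip(tokens[:-1], tokens[1:]):
    counts[a, b] += 1
probs = counts / counts.sum(axis=1, keepdims=True)

# 4) Generate an arbitrary-length token sequence and decode with the basis;
#    note gen_len exceeds the training window, unlike fixed-window diffusion.
gen_len = 128
seq = [int(tokens[0])]
for _ in range(gen_len - 1):
    seq.append(int(rng.choice(V, p=probs[seq[-1]])))
X_gen = basis @ codebook[np.array(seq)].T   # C x gen_len synthetic series
```

Because generation is a single left-to-right pass over discrete tokens plus one matrix multiply to decode, its cost grows linearly with sequence length, which is the source of the speedup over iterative diffusion sampling.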

Country of Origin
🇨🇳 China

Page Count
19 pages

Category
Computer Science:
Machine Learning (CS)