Score: 1

L-GTA: Latent Generative Modeling for Time Series Augmentation

Published: July 31, 2025 | arXiv ID: 2507.23615v1

By: Luis Roque, Carlos Soares, Vitor Cerqueira, and more

Potential Business Impact:

Improves predictive models by generating realistic synthetic time series data.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Data augmentation is gaining importance across various aspects of time series analysis, from forecasting to classification and anomaly detection tasks. We introduce the Latent Generative Transformer Augmentation (L-GTA) model, a generative approach using a transformer-based variational recurrent autoencoder. The model applies controlled transformations within its latent space to generate new time series that preserve the intrinsic properties of the original dataset. L-GTA supports diverse transformations, ranging from simple jittering to magnitude warping, and allows these basic transformations to be combined into more complex synthetic time series datasets. Our evaluation on several real-world datasets demonstrates the ability of L-GTA to produce more reliable, consistent, and controllable augmented data, which translates into significant improvements in predictive accuracy and similarity measures compared to direct transformation methods.
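The core workflow described in the abstract is: encode a series, perturb its latent representation (e.g., jittering, magnitude warping, or their composition), then decode to obtain a synthetic series. The sketch below illustrates that idea only; the encoder/decoder modules and transformation parameters are hypothetical stand-ins, not the authors' transformer-based variational recurrent autoencoder.

```python
# Hypothetical sketch of latent-space augmentation (not the L-GTA implementation).
import torch
import torch.nn as nn

class ToyEncoder(nn.Module):
    """Stand-in for the paper's transformer-based variational recurrent encoder."""
    def __init__(self, latent_dim=16):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=latent_dim, batch_first=True)

    def forward(self, x):               # x: (batch, time, 1)
        z, _ = self.rnn(x)              # per-step latent codes: (batch, time, latent_dim)
        return z

class ToyDecoder(nn.Module):
    """Stand-in decoder mapping latent codes back to the observation space."""
    def __init__(self, latent_dim=16):
        super().__init__()
        self.proj = nn.Linear(latent_dim, 1)

    def forward(self, z):
        return self.proj(z)             # (batch, time, 1)

def jitter(z, sigma=0.05):
    """Add small Gaussian noise to the latent codes."""
    return z + sigma * torch.randn_like(z)

def magnitude_warp(z, sigma=0.1):
    """Rescale latent codes with a smooth random per-step factor around 1.0."""
    t = z.shape[1]
    curve = 1.0 + sigma * torch.randn(z.shape[0], t, 1).cumsum(dim=1) / (t ** 0.5)
    return z * curve

# Usage: encode, transform in latent space, decode to a synthetic series.
encoder, decoder = ToyEncoder(), ToyDecoder()
x = torch.randn(4, 50, 1)               # 4 toy series of length 50
z = encoder(x)
z_aug = magnitude_warp(jitter(z))       # compose basic transformations
x_aug = decoder(z_aug)
print(x_aug.shape)                      # torch.Size([4, 50, 1])
```

Because the perturbations act on the learned latent representation rather than on the raw series, the decoded output stays consistent with the structure the model has captured, which is the property the paper contrasts with direct transformation methods.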

Repos / Data Links

Page Count
9 pages

Category
Computer Science:
Machine Learning (CS)