Generative Modeling of Networked Time-Series via Transformer Architectures

Published: June 8, 2025 | arXiv ID: 2506.07312v1

By: Yusuf Elnady

Potential Business Impact:

Generates synthetic training data that can improve the performance of machine-learning models.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Many security and network applications require large datasets to train machine learning models, and limited data access is a well-known problem in the security domain. Recent studies have shown the potential of Transformer models to enlarge datasets by synthesizing new samples, but the synthesized samples do not improve models beyond what real data achieves. To address this issue, we design an efficient transformer-based generative framework for time-series data that can be used to boost the performance of existing and new ML workflows. Our model achieves state-of-the-art results, is designed to generalize across different datasets, and produces high-quality samples.
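The abstract does not spell out the architecture, but a transformer-based generative framework for time-series typically works autoregressively: the model attends over past timesteps and predicts the next one, then feeds its own output back in to synthesize new traces. The sketch below illustrates that general pattern in PyTorch; every class name, hyperparameter, and the generation loop are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal, hypothetical sketch of an autoregressive transformer for
# time-series generation. Names and hyperparameters are assumptions;
# the paper's actual model may differ substantially.
import torch
import torch.nn as nn


class TimeSeriesTransformer(nn.Module):
    def __init__(self, n_features: int, d_model: int = 64,
                 n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.output_proj = nn.Linear(d_model, n_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features). A causal mask keeps each
        # position from attending to the future, so position t predicts
        # the value at t+1.
        seq_len = x.size(1)
        mask = torch.triu(
            torch.full((seq_len, seq_len), float("-inf")), diagonal=1)
        h = self.encoder(self.input_proj(x), mask=mask)
        return self.output_proj(h)


@torch.no_grad()
def generate(model: nn.Module, seed: torch.Tensor, steps: int) -> torch.Tensor:
    # Roll the model forward: feed the sequence so far, append the
    # prediction for the last position, and repeat.
    seq = seed
    for _ in range(steps):
        next_step = model(seq)[:, -1:, :]
        seq = torch.cat([seq, next_step], dim=1)
    return seq


model = TimeSeriesTransformer(n_features=3)
synthetic = generate(model, seed=torch.randn(1, 8, 3), steps=32)
print(synthetic.shape)  # torch.Size([1, 40, 3])
```

Training such a model would typically minimize a next-step prediction loss (e.g., MSE) on real traces; the paper's actual objective, conditioning scheme, and evaluation are described in the full text.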

Page Count
10 pages

Category
Computer Science:
Machine Learning (CS)