Generative Modeling of Networked Time-Series via Transformer Architectures
By: Yusuf Elnady
Potential Business Impact:
Creates more data to make computer programs smarter.
Many security and network applications require large datasets to train machine learning models. Limited data access is a well-known problem in the security domain. Recent studies have shown the potential of Transformer models to enlarge datasets by synthesizing new samples, but the synthesized samples do not improve the models beyond what the real data already provides. To address this issue, we design an efficient transformer-based generative framework for time-series data that can be used to boost the performance of existing and new ML workflows. Our new transformer model achieves state-of-the-art results. We design our model to be generalizable, to work across different datasets, and to produce high-quality samples.
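To make the idea of a transformer-based generative framework for time-series concrete, below is a minimal, hypothetical sketch in PyTorch of an autoregressive Transformer that is trained to predict the next timestep of a multivariate trace and then rolled forward to synthesize new samples. The class name, layer sizes, feature count, and training objective are illustrative assumptions, not the architecture described in the thesis.

```python
# Minimal sketch (assumed PyTorch): an autoregressive Transformer that models
# multivariate time-series and generates synthetic traces for data augmentation.
import torch
import torch.nn as nn


class TimeSeriesTransformer(nn.Module):
    def __init__(self, n_features: int = 8, d_model: int = 64,
                 n_heads: int = 4, n_layers: int = 3, max_len: int = 256):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)   # embed each timestep
        self.pos_emb = nn.Embedding(max_len, d_model)       # learned positional embeddings
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_features)           # predict the next timestep

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features); a causal mask keeps generation autoregressive
        seq_len = x.size(1)
        pos = torch.arange(seq_len, device=x.device)
        h = self.input_proj(x) + self.pos_emb(pos)
        mask = nn.Transformer.generate_square_subsequent_mask(seq_len).to(x.device)
        h = self.encoder(h, mask=mask)
        return self.head(h)                                  # next-step predictions

    @torch.no_grad()
    def generate(self, seed: torch.Tensor, steps: int) -> torch.Tensor:
        # Roll the model forward, feeding each prediction back in as the next input.
        seq = seed
        for _ in range(steps):
            nxt = self.forward(seq)[:, -1:, :]
            seq = torch.cat([seq, nxt], dim=1)
        return seq


# Example usage: fit on real traces, then synthesize new samples to augment a dataset.
model = TimeSeriesTransformer()
real = torch.randn(32, 100, 8)                               # stand-in for real network traces
loss = nn.MSELoss()(model(real[:, :-1]), real[:, 1:])        # next-step prediction objective
synthetic = model.generate(real[:, :10], steps=90)           # (32, 100, 8) synthetic traces
```

The synthesized traces would then be mixed with the real data when training downstream security or networking classifiers, which is the augmentation use case the abstract describes.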
Similar Papers
A Time-Series Data Augmentation Model through Diffusion and Transformer Integration
Machine Learning (CS)
Makes AI learn better from less data.
Minimal Time Series Transformer
Machine Learning (CS)
Predicts future numbers using past patterns.
Time Series Based Network Intrusion Detection using MTF-Aided Transformer
Networking and Internet Architecture
Helps computers understand network problems faster.