A Time-Series Data Augmentation Model through Diffusion and Transformer Integration
By: Yuren Zhang, Zhongnan Pu, Lei Jing
Potential Business Impact:
Makes AI learn better from less data.
With the development of artificial intelligence, many real-world tasks are now solved with deep learning. To achieve optimal performance, deep neural networks typically require large volumes of training data. Although advances in data augmentation have made large datasets easier to obtain, most of that work concentrates on domains such as images and speech; augmenting time-series data has received comparatively little attention. To address this gap and generate a substantial amount of time-series data, we propose a simple and effective method that combines diffusion and Transformer models. An adjusted denoising diffusion model generates a large volume of initial time-step action data, a Transformer model then predicts the subsequent actions, and a weighted loss function is incorporated to achieve convergence. Using the performance improvement of a model trained on the augmented data as the benchmark, and comparing against training without augmentation and with traditional augmentation methods, we show that this approach produces high-quality augmented data.
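The two-stage pipeline described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: `forward_diffuse` uses a standard DDPM-style noising schedule, `transformer_stub` is a naive extrapolation standing in for the actual Transformer predictor, and the geometric `decay` weighting in `weighted_mse` is an assumed form of the weighted loss, since the abstract does not specify one.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_diffuse(x0, t, betas):
    """Forward diffusion: mix the clean signal with Gaussian noise
    according to the cumulative noise schedule up to step t."""
    alpha_bar = np.prod(1.0 - betas[: t + 1])
    noise = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * noise

def transformer_stub(prefix, horizon):
    """Placeholder for the Transformer predictor: a last-difference
    extrapolation stands in for learned attention (assumption)."""
    step = prefix[-1] - prefix[-2]
    future = [prefix[-1] + (k + 1) * step for k in range(horizon)]
    return np.stack(future)

def weighted_mse(pred, target, decay=0.9):
    """Weighted loss over the predicted horizon: earlier predicted
    steps are weighted more heavily (the decay form is an assumption)."""
    w = decay ** np.arange(len(pred))
    w /= w.sum()
    return float(np.sum(w * np.mean((pred - target) ** 2, axis=-1)))

# Toy usage: noise a seed segment via diffusion, then extend it.
betas = np.linspace(1e-4, 0.02, 100)
seed = np.sin(np.linspace(0, np.pi, 8))[:, None]   # (time, features)
noisy_seed = forward_diffuse(seed, t=10, betas=betas)
generated = transformer_stub(noisy_seed, horizon=4)
loss = weighted_mse(generated, np.zeros_like(generated))
```

In a real system, the stub would be replaced by a trained Transformer, and the weighted loss would drive its training so that early-horizon errors, which compound during autoregressive generation, are penalized most.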
Similar Papers
Generative Modeling of Networked Time-Series via Transformer Architectures
Machine Learning (CS)
Creates more data to make computer programs smarter.
Diffusion Transformers for Tabular Data Time Series Generation
Machine Learning (CS)
Creates realistic fake data for time-based charts.
Hyperspectral data augmentation with transformer-based diffusion models
CV and Pattern Recognition
Makes satellite pictures identify more forest types.