Rapid Augmentations for Time Series (RATS): A High-Performance Library for Time Series Augmentation
By: Wadie Skaf, Felix Kern, Aryamaan Basu Roy, and more
Potential Business Impact:
Makes machine learning faster and reduces memory use.
Time series augmentation is critical for training robust deep learning models, particularly in domains where labelled data is scarce and expensive to obtain. However, existing augmentation libraries for time series, mainly written in Python, suffer from performance bottlenecks, with running time growing exponentially as dataset sizes increase -- limiting their applicability in large-scale, production-grade systems. We introduce RATS (Rapid Augmentations for Time Series), a high-performance library for time series augmentation written in Rust with Python bindings (RATSpy). RATS implements multiple augmentation methods spanning basic transformations, frequency-domain operations, and time warping techniques, all accessible through a unified pipeline interface with built-in parallelisation. Comprehensive benchmarking of RATSpy versus a commonly used library (tsaug) on 143 datasets demonstrates that RATSpy achieves an average speedup of 74.5% over tsaug (up to 94.8% on large datasets), with up to 47.9% less peak memory usage.
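To make the two families of techniques named in the abstract concrete, here is a minimal NumPy sketch of a basic transformation (jittering with Gaussian noise) and a time warping augmentation (resampling along a smoothly distorted time grid). This is illustrative only and is not the RATS/RATSpy API; the function names and parameters are assumptions for the example.

```python
import numpy as np

def jitter(x, sigma=0.05, rng=None):
    """Basic transformation: add Gaussian noise to each sample (assumed helper)."""
    if rng is None:
        rng = np.random.default_rng()
    return x + rng.normal(0.0, sigma, size=x.shape)

def time_warp(x, n_knots=4, strength=0.2, rng=None):
    """Time warping: perturb a few anchor points on the time axis,
    then resample the series along the distorted grid (assumed helper)."""
    if rng is None:
        rng = np.random.default_rng()
    n = len(x)
    knots = np.linspace(0, n - 1, n_knots + 2)
    offsets = rng.normal(0.0, strength * n / (n_knots + 1), size=knots.shape)
    offsets[0] = offsets[-1] = 0.0  # pin endpoints so the series keeps its span
    warped_knots = np.sort(np.clip(knots + offsets, 0, n - 1))
    # Map every original time index onto the warped grid, then interpolate.
    grid = np.interp(np.arange(n), knots, warped_knots)
    return np.interp(grid, np.arange(n), x)

# Chain the two augmentations on a toy signal, as a pipeline would.
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 4 * np.pi, 200))
x_aug = time_warp(jitter(x, rng=rng), rng=rng)
```

A library like RATS applies such transforms over many series at once; the speedups reported above come from doing this inner loop in Rust with parallelisation rather than in interpreted Python.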
Similar Papers
Time-RA: Towards Time Series Reasoning for Anomaly with LLM Feedback
Machine Learning (CS)
Helps computers explain why data is strange.
ABEX-RAT: Synergizing Abstractive Augmentation and Adversarial Training for Classification of Occupational Accident Reports
Machine Learning (CS)
Helps find dangerous job accidents faster.