Time Series Based Network Intrusion Detection using MTF-Aided Transformer
By: Poorvi Joshi, Mohan Gurusamy
Potential Business Impact:
Helps networks spot attacks faster, even with little data.
This paper introduces a novel approach to time series classification using a Markov Transition Field (MTF)-aided Transformer model, specifically designed for Software-Defined Networks (SDNs). The proposed model integrates the temporal dependency modeling strengths of MTFs with the pattern recognition capabilities of Transformer architectures. We evaluate the model's performance on the InSDN dataset, demonstrating that it outperforms baseline classification models, particularly in the data-constrained settings commonly encountered in SDN applications. We also examine the interplay between the MTF and Transformer components, showing that it drives the improved performance even with limited data. Furthermore, our approach achieves competitive training and inference times, making it an efficient solution for real-world SDN deployments. These findings establish the potential of MTF-aided Transformers to address the challenges of time series classification in SDNs, offering a promising path toward reliable and scalable analysis in scenarios with sparse data.
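To make the MTF-plus-Transformer pipeline concrete, below is a minimal sketch: a Markov Transition Field is computed from a univariate series by quantile-binning the values, estimating the first-order transition matrix between bins, and mapping each pair of time steps to the transition probability between their bins; the resulting T x T image is then fed, row by row, as a token sequence into a small Transformer encoder for classification. This is an illustrative assumption, not the paper's implementation: the function and class names, the tokenization of MTF rows, and all hyperparameters (bin count, model width, layer count) are hypothetical, and the random input stands in for InSDN flow features.

```python
import numpy as np
import torch
import torch.nn as nn


def markov_transition_field(x, n_bins=8):
    """Compute a T x T Markov Transition Field for a 1-D series x.

    Steps: quantile-bin the series, estimate the first-order Markov
    transition matrix between bins, then set MTF[i, j] to the transition
    probability between the bins of time steps i and j.
    """
    # Internal quantile edges so digitize yields labels 0..n_bins-1
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    bins = np.digitize(x, edges)                       # bin label per time step
    # First-order transition counts between consecutive bins
    W = np.zeros((n_bins, n_bins))
    for a, b in zip(bins[:-1], bins[1:]):
        W[a, b] += 1
    W = W / np.maximum(W.sum(axis=1, keepdims=True), 1)  # row-normalize
    return W[bins[:, None], bins[None, :]]             # shape (T, T)


class MTFTransformerClassifier(nn.Module):
    """Treat each MTF row as one token and classify with a small
    Transformer encoder (hypothetical architecture and sizes)."""

    def __init__(self, seq_len, n_classes, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.proj = nn.Linear(seq_len, d_model)        # MTF row -> token embedding
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, mtf):                            # mtf: (batch, T, T)
        tokens = self.proj(mtf)                        # (batch, T, d_model)
        encoded = self.encoder(tokens)                 # (batch, T, d_model)
        return self.head(encoded.mean(dim=1))          # pool over time -> logits


# Toy usage: random data stands in for a window of SDN flow statistics
series = np.random.rand(64)
mtf = torch.tensor(markov_transition_field(series), dtype=torch.float32)
model = MTFTransformerClassifier(seq_len=64, n_classes=2)
logits = model(mtf.unsqueeze(0))                       # shape (1, 2)
```

One design note on this sketch: using MTF rows as tokens keeps the sequence length equal to the window size T, so the encoder's self-attention can relate any two time steps through their shared transition statistics; the paper's actual tokenization and pooling strategy may differ.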
Similar Papers
Generative Modeling of Networked Time-Series via Transformer Architectures
Machine Learning (CS)
Creates more data to make computer programs smarter.
MSTN: Fast and Efficient Multivariate Time Series Model
Machine Learning (CS)
Learns patterns in fast and slow changing data.
Self-Supervised Transformer-based Contrastive Learning for Intrusion Detection Systems
Cryptography and Security
Finds computer attacks that are new and hidden.