KD-GAT: Combining Knowledge Distillation and Graph Attention Transformer for a Controller Area Network Intrusion Detection System
By: Robert Frenken, Sidra Ghayour Bhatti, Hanqin Zhang, and more
Potential Business Impact:
Protects cars from hackers by spotting bad messages.
The Controller Area Network (CAN) protocol is widely adopted for in-vehicle communication but lacks inherent security mechanisms, making it vulnerable to cyberattacks. This paper introduces KD-GAT, an intrusion detection framework that combines Graph Attention Networks (GATs) with knowledge distillation (KD) to enhance detection accuracy while reducing computational complexity. In our approach, CAN traffic is represented as graphs using a sliding window to capture temporal and relational patterns. A multi-layer GAT with jumping knowledge aggregation serves as the teacher model, while a compact student GAT, only 6.32% the size of the teacher, is trained via a two-phase process of supervised pretraining followed by knowledge distillation with both soft and hard label supervision. Experiments on three benchmark datasets (Car-Hacking, Car-Survival, and can-train-and-test) demonstrate that both teacher and student models achieve strong results, with the student model attaining 99.97% and 99.31% accuracy on Car-Hacking and Car-Survival, respectively. However, significant class imbalance in can-train-and-test reduced performance for both models on that dataset. Addressing this imbalance remains an important direction for future work.
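To make the sliding-window graph representation concrete, here is a minimal sketch in Python with PyTorch Geometric. The paper's exact node and edge construction is not spelled out in the abstract, so this assumes one common convention: nodes are the distinct CAN IDs observed in a window, node features are the last payload seen per ID, and directed edges link the IDs of consecutive frames. The function name `window_to_graph` and the window size are illustrative, not taken from the paper.

```python
# Sketch: turn one sliding window of CAN frames into a graph (assumed convention).
import torch
from torch_geometric.data import Data

def window_to_graph(frames, window_size=100):
    """frames: list of (can_id, payload_bytes) tuples for one sliding window."""
    frames = frames[:window_size]
    ids = sorted({can_id for can_id, _ in frames})
    idx = {can_id: i for i, can_id in enumerate(ids)}

    # Node features: the 8-byte payload of the most recent frame per CAN ID (assumption).
    x = torch.zeros(len(ids), 8)
    for can_id, payload in frames:
        vals = list(payload[:8]) + [0] * (8 - len(payload[:8]))  # pad short payloads
        x[idx[can_id]] = torch.tensor(vals, dtype=torch.float)

    # Edges: directed link between the IDs of consecutive frames in the window.
    src = [idx[frames[i][0]] for i in range(len(frames) - 1)]
    dst = [idx[frames[i + 1][0]] for i in range(len(frames) - 1)]
    edge_index = torch.tensor([src, dst], dtype=torch.long)

    return Data(x=x, edge_index=edge_index)
```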
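The teacher is described as a multi-layer GAT with jumping knowledge aggregation. The sketch below shows one way to assemble such a model with PyTorch Geometric's `GATConv` and `JumpingKnowledge` layers; the layer count, hidden width, head count, and readout are assumptions for illustration, not the paper's reported configuration.

```python
# Sketch: multi-layer GAT teacher with jumping-knowledge aggregation (sizes assumed).
import torch
from torch_geometric.nn import GATConv, JumpingKnowledge, global_mean_pool

class TeacherGAT(torch.nn.Module):
    def __init__(self, in_dim=8, hidden=64, heads=4, num_layers=3, num_classes=2):
        super().__init__()
        self.convs = torch.nn.ModuleList()
        dims = [in_dim] + [hidden * heads] * (num_layers - 1)
        for d in dims:
            self.convs.append(GATConv(d, hidden, heads=heads))
        self.jk = JumpingKnowledge(mode="cat")  # concatenate per-layer node embeddings
        self.classifier = torch.nn.Linear(hidden * heads * num_layers, num_classes)

    def forward(self, x, edge_index, batch):
        outs = []
        for conv in self.convs:
            x = torch.relu(conv(x, edge_index))
            outs.append(x)
        x = self.jk(outs)                # jumping-knowledge aggregation over all layers
        x = global_mean_pool(x, batch)   # graph-level readout
        return self.classifier(x)
```

The student would follow the same pattern with far fewer parameters (the paper reports the student at 6.32% of the teacher's size).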
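The second training phase combines soft-label supervision from the teacher with hard-label supervision from the ground truth. A minimal sketch of such a two-term objective is shown below; the temperature `T` and mixing weight `alpha` are illustrative defaults, not values from the paper.

```python
# Sketch: distillation loss with soft (teacher) and hard (ground-truth) supervision.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # KL divergence between temperature-softened student and teacher distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitude is comparable across temperatures
    # Ordinary cross-entropy against the hard labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```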
Similar Papers
Multi-Stage Knowledge-Distilled VGAE and GAT for Robust Controller-Area-Network Intrusion Detection
Machine Learning (CS)
Stops car hackers from taking control.
Graph Collaborative Attention Network for Link Prediction in Knowledge Graphs
Machine Learning (CS)
Helps computers understand connections between facts better.
Efficient Knowledge Tracing Leveraging Higher-Order Information in Integrated Graphs
Machine Learning (CS)
Makes online learning faster and cheaper.