Advancing Vulnerability Classification with BERT: A Multi-Objective Learning Model
By: Himanshu Tiwari
Potential Business Impact:
Sorts computer security problems by danger.
The rapid increase in cybersecurity vulnerabilities necessitates automated tools for analyzing and classifying vulnerability reports. This paper presents a novel Vulnerability Report Classifier that leverages the BERT (Bidirectional Encoder Representations from Transformers) model to perform multi-label classification of Common Vulnerabilities and Exposures (CVE) reports from the National Vulnerability Database (NVD). The classifier predicts both the severity (Low, Medium, High, Critical) and vulnerability types (e.g., Buffer Overflow, XSS) from textual descriptions. We introduce a custom training pipeline using a combined loss function (Cross-Entropy for severity and Binary Cross-Entropy with Logits for vulnerability types) integrated into a Hugging Face Trainer subclass. Experiments on recent NVD data demonstrate promising results, with decreasing evaluation loss across epochs. The system is deployed via a REST API and a Streamlit UI, enabling real-time vulnerability analysis. This work contributes a scalable, open-source solution for cybersecurity practitioners to automate vulnerability triage.
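To make the multi-objective setup concrete, the sketch below shows one plausible way to combine the two losses inside a Hugging Face Trainer subclass, as the abstract describes. This is not the authors' released code: the model class, head sizes (NUM_SEVERITIES, NUM_TYPES), and input field names (severity_label, type_labels) are illustrative assumptions, and only the general pattern (BERT encoder, two heads, Cross-Entropy plus BCEWithLogits summed in compute_loss) follows the paper's description.

import torch
import torch.nn as nn
from transformers import BertModel, Trainer

NUM_SEVERITIES = 4   # Low, Medium, High, Critical
NUM_TYPES = 10       # e.g., Buffer Overflow, XSS, ... (assumed count)

class CveClassifier(nn.Module):
    """Illustrative BERT encoder with a single-label severity head and a multi-label type head."""
    def __init__(self, model_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        self.severity_head = nn.Linear(hidden, NUM_SEVERITIES)  # single-label classification
        self.type_head = nn.Linear(hidden, NUM_TYPES)           # multi-label classification

    def forward(self, input_ids, attention_mask=None, **kwargs):
        pooled = self.bert(input_ids, attention_mask=attention_mask).pooler_output
        return {"severity_logits": self.severity_head(pooled),
                "type_logits": self.type_head(pooled)}

class MultiTaskTrainer(Trainer):
    """Trainer subclass that sums Cross-Entropy (severity) and BCEWithLogits (types)."""
    def compute_loss(self, model, inputs, return_outputs=False, **kwargs):
        severity = inputs.pop("severity_label")   # shape (batch,), class indices
        types = inputs.pop("type_labels")         # shape (batch, NUM_TYPES), 0/1 floats
        outputs = model(**inputs)
        loss = (nn.CrossEntropyLoss()(outputs["severity_logits"], severity)
                + nn.BCEWithLogitsLoss()(outputs["type_logits"], types))
        return (loss, outputs) if return_outputs else loss

Summing the two terms lets a single backward pass update the shared BERT encoder for both objectives; weighting the terms differently is a natural variation the abstract does not specify.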
Similar Papers
The Application of Transformer-Based Models for Predicting Consequences of Cyber Attacks
Machine Learning (CS)
Predicts cyberattack damage to stop them.
A Multi-Dataset Evaluation of Models for Automated Vulnerability Repair
Software Engineering
Fixes computer security holes automatically.
Cross-Domain Evaluation of Transformer-Based Vulnerability Detection on Open & Industry Data
Software Engineering
Finds computer bugs automatically before they cause problems.