Mitigating Long-Tailed Anomaly Score Distributions with Importance-Weighted Loss

Published: January 5, 2026 | arXiv ID: 2601.02440v1

By: Jungi Lee, Jungkwon Kim, Chi Zhang, and more

Potential Business Impact:

Improves detection of rare, hidden faults in industrial machinery.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Anomaly detection is crucial in industrial applications for identifying rare and unseen patterns to ensure system reliability. Traditional models, trained on a single class of normal data, struggle with real-world settings where normal data exhibit diverse patterns, leading to class imbalance and long-tailed anomaly score distributions (LTD). This imbalance skews model training and degrades detection performance, especially for minority instances. To address this issue, we propose a novel importance-weighted loss designed specifically for anomaly detection. Unlike previous methods that address LTD in classification, our method does not require prior knowledge of the classes within the normal data. Instead, we introduce a weighted loss function that uses importance sampling to align the distribution of anomaly scores with a target Gaussian, ensuring a balanced representation of normal data. Extensive experiments on three benchmark image datasets and three real-world hyperspectral imaging datasets demonstrate the robustness of our approach in mitigating LTD-induced bias. Our method improves anomaly detection performance by 0.043, highlighting its effectiveness in real-world applications.
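The core idea, reweighting each sample's loss by the ratio of a target Gaussian density to the empirical anomaly-score density so that rare tail scores are upweighted, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names (`importance_weights`, `weighted_loss`), the histogram-based density estimate, and the squared-score stand-in loss are all assumptions made for the example.

```python
import numpy as np

def importance_weights(scores, target_mean=0.0, target_std=1.0, bins=50, eps=1e-8):
    """Weight each sample by q(s)/p(s): a target Gaussian density q over the
    empirical score density p, so samples in the long tail get larger weights.
    (Illustrative sketch; the paper's estimator may differ.)"""
    # Empirical density p(s) via a normalized histogram.
    hist, edges = np.histogram(scores, bins=bins, density=True)
    idx = np.clip(np.digitize(scores, edges[1:-1]), 0, bins - 1)
    p = hist[idx] + eps
    # Target Gaussian density q(s).
    q = np.exp(-0.5 * ((scores - target_mean) / target_std) ** 2) / (
        target_std * np.sqrt(2.0 * np.pi))
    w = q / p
    return w / w.mean()  # normalize so the average weight is 1

def weighted_loss(per_sample_losses, scores):
    """Importance-weighted mean of per-sample losses."""
    return float(np.mean(importance_weights(scores) * per_sample_losses))

# Long-tailed anomaly scores: most mass near 0, a heavy right tail.
rng = np.random.default_rng(0)
scores = rng.exponential(scale=1.0, size=10_000)
losses = scores ** 2  # hypothetical per-sample loss for the demo
print(weighted_loss(losses, scores))
```

Normalizing the weights to mean 1 keeps the overall loss scale comparable to the unweighted case, so only the relative emphasis across samples changes.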

Page Count
8 pages

Category
Statistics:
Machine Learning (Stat)