Redundancy-Aware Test-Time Graph Out-of-Distribution Detection
By: Yue Hou, He Zhu, Ruomei Liu, and more
Potential Business Impact:
Helps computers spot data in graphs that doesn't belong.
Distributional discrepancy between training and test data can lead models to make inaccurate predictions when encountering out-of-distribution (OOD) samples in real-world applications. Although existing graph OOD detection methods leverage data-centric techniques to extract effective representations, their performance remains compromised by structural redundancy that induces semantic shifts. To address this dilemma, we propose RedOUT, an unsupervised framework that integrates structural entropy into test-time OOD detection for graph classification. Concretely, we introduce the Redundancy-aware Graph Information Bottleneck (ReGIB) and decompose the objective into essential information and irrelevant redundancy. By minimizing structural entropy, the decoupled redundancy is reduced, and theoretically grounded upper and lower bounds are proposed for optimization. Extensive experiments on real-world datasets demonstrate the superior performance of RedOUT on OOD detection. Specifically, our method achieves an average improvement of 6.7%, significantly surpassing the best competitor by 17.3% on the ClinTox/LIPO dataset pair.
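The abstract states that RedOUT reduces decoupled redundancy by minimizing structural entropy, but it does not spell out the objective. As a minimal sketch only, the snippet below computes one-dimensional structural entropy, the basic quantity the structural-entropy literature builds on; the function name, the use of networkx, and the example graph are illustrative assumptions, and this is not the authors' ReGIB implementation, which operates on coding trees with the bounds described in the paper.

```python
# Hypothetical sketch (not the authors' code): one-dimensional structural
# entropy of a graph, H1(G) = -sum_i (d_i / 2m) * log2(d_i / 2m), where
# d_i is the degree of node i and m is the number of edges.
import math
import networkx as nx


def one_dim_structural_entropy(G: nx.Graph) -> float:
    """Return the one-dimensional structural entropy of G in bits."""
    two_m = 2 * G.number_of_edges()
    if two_m == 0:
        return 0.0
    h = 0.0
    for _, d in G.degree():
        if d > 0:
            p = d / two_m          # stationary probability of node i
            h -= p * math.log2(p)  # entropy contribution of node i
    return h


if __name__ == "__main__":
    # Example on a standard benchmark graph shipped with networkx.
    G = nx.karate_club_graph()
    print(f"H1(G) = {one_dim_structural_entropy(G):.4f} bits")
```

In RedOUT-style methods, lower structural entropy corresponds to less structural redundancy, which is the quantity the paper's objective seeks to reduce before scoring test graphs as in- or out-of-distribution.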
Similar Papers
Learning Invariant Graph Representations Through Redundant Information
Machine Learning (CS)
Helps computers learn from new data without mistakes.
Structural Entropy Guided Unsupervised Graph Out-Of-Distribution Detection
Machine Learning (CS)
Helps computers spot weird data in networks.
Graph Out-of-Distribution Detection via Test-Time Calibration with Dual Dynamic Dictionaries
Machine Learning (CS)
Finds weird computer data that doesn't fit.