Uncertainty Quantification for Incomplete Multi-View Data Using Divergence Measures
By: Zhipeng Xue, Yan Zhang, Ming Li, and more
Potential Business Impact:
Makes computer learning more accurate with messy data.
Existing multi-view classification and clustering methods typically improve task accuracy by leveraging and fusing information from different views. However, ensuring the reliability of the multi-view integration and of the final decision is crucial, particularly when dealing with noisy or corrupted data. Current methods often rely on Kullback-Leibler (KL) divergence to estimate the uncertainty of network predictions, ignoring the domain gaps between different modalities. To address this issue, KPHD-Net, based on Hölder divergence, is proposed for multi-view classification and clustering tasks. Specifically, KPHD-Net employs a variational Dirichlet distribution to represent class probability distributions, models the evidence from each view, and then integrates it with Dempster-Shafer evidence theory (DST) to improve uncertainty estimation. Our theoretical analysis demonstrates that the Proper Hölder divergence offers a more effective measure of distribution discrepancies, ensuring enhanced performance in multi-view learning. Moreover, Dempster-Shafer evidence theory, recognized for its superior performance in multi-view fusion tasks, is combined with a Kalman filter to provide future state estimates, further enhancing the reliability of the final fusion results. Extensive experiments show that the proposed KPHD-Net outperforms current state-of-the-art methods in both classification and clustering tasks in terms of accuracy, robustness, and reliability, with theoretical guarantees.
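To make the two ingredients in the abstract concrete, the sketch below shows the standard evidential recipe the paper builds on: each view's Dirichlet parameters are turned into per-class belief masses plus an uncertainty mass, and the views are merged with a reduced Dempster-Shafer combination rule; a basic Hölder pseudo-divergence induced by Hölder's inequality is included for comparison. This is a minimal illustration under the common evidential-learning conventions (evidence e_k = alpha_k − 1, uncertainty u = K/S), not the authors' code, and the paper's Proper Hölder divergence is a refined variant of the simple quantity computed here.

```python
import numpy as np

def dirichlet_to_opinion(alpha):
    """Turn Dirichlet parameters alpha (length K) into belief masses and uncertainty.

    Common evidential-learning convention: evidence e_k = alpha_k - 1,
    belief b_k = e_k / S, uncertainty u = K / S, where S = sum_k alpha_k.
    """
    alpha = np.asarray(alpha, dtype=float)
    strength = alpha.sum()
    belief = (alpha - 1.0) / strength
    uncertainty = alpha.size / strength
    return belief, uncertainty

def ds_combine(b1, u1, b2, u2):
    """Reduced Dempster-Shafer combination of two views' opinions.

    The conflict C = sum_{i != j} b1_i * b2_j discounts contradictory evidence;
    dividing by (1 - C) keeps sum_k b_k + u = 1 after fusion.
    """
    conflict = np.sum(np.outer(b1, b2)) - np.sum(b1 * b2)
    scale = 1.0 - conflict
    belief = (b1 * b2 + b1 * u2 + b2 * u1) / scale
    uncertainty = (u1 * u2) / scale
    return belief, uncertainty

def holder_pseudo_divergence(p, q, a=2.0):
    """Basic Hölder pseudo-divergence between two discrete distributions.

    -log( <p, q> / (||p||_a * ||q||_b) ) with 1/a + 1/b = 1; Hölder's inequality
    guarantees non-negativity. (The paper's Proper Hölder divergence is a refined
    variant; this is only the inequality-induced baseline quantity.)
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    b = a / (a - 1.0)
    return -np.log(np.sum(p * q) /
                   (np.sum(p ** a) ** (1 / a) * np.sum(q ** b) ** (1 / b)))

# Two views produce Dirichlet evidence over K = 3 classes (alpha = evidence + 1).
alpha_view1 = np.array([9.0, 2.0, 1.5])   # view 1: confident about class 0
alpha_view2 = np.array([4.0, 3.5, 1.0])   # view 2: weaker, partly agreeing evidence

b1, u1 = dirichlet_to_opinion(alpha_view1)
b2, u2 = dirichlet_to_opinion(alpha_view2)
b, u = ds_combine(b1, u1, b2, u2)
print("fused beliefs:", np.round(b, 3), "fused uncertainty:", round(u, 3))
print("Hölder gap between the views' mean predictions:",
      round(holder_pseudo_divergence(alpha_view1 / alpha_view1.sum(),
                                     alpha_view2 / alpha_view2.sum()), 4))
```

In this toy run the fused opinion keeps most of the belief on class 0 while the fused uncertainty drops below either view's own uncertainty, which is the behaviour the abstract appeals to when arguing that evidential fusion yields more reliable final decisions.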
Similar Papers
A Neural Network Algorithm for KL Divergence Estimation with Quantitative Error Bounds
Machine Learning (CS)
Helps computers measure how different data is.
Head-Tail-Aware KL Divergence in Knowledge Distillation for Spiking Neural Networks
Artificial Intelligence
Teaches computer brains to learn better, faster.
Quantum-Inspired Fidelity-based Divergence
Information Theory
Makes computer learning better by fixing math problems.