Normalized Conditional Mutual Information Surrogate Loss for Deep Neural Classifiers
By: Linfeng Ye, Zhixiang Chi, Konstantinos N. Plataniotis, and more
Potential Business Impact:
Makes computers better at recognizing images accurately.
In this paper, we propose a novel information-theoretic surrogate loss, normalized conditional mutual information (NCMI), as a drop-in alternative to the de facto cross-entropy (CE) loss for training deep neural network (DNN) based classifiers. We first observe that a model's NCMI is inversely proportional to its accuracy. Building on this insight, we introduce an alternating algorithm to minimize the NCMI efficiently. Across image recognition and whole-slide imaging (WSI) subtyping benchmarks, NCMI-trained models surpass state-of-the-art losses by substantial margins at a computational cost comparable to that of CE. Notably, on ImageNet, NCMI yields a 2.77% top-1 accuracy improvement with ResNet-50 compared to CE; on CAMELYON-17, replacing CE with NCMI improves macro-F1 by 8.6% over the strongest baseline. Gains are consistent across architectures and batch sizes, suggesting that NCMI is a practical and competitive alternative to CE.
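The abstract does not spell out the NCMI formula or the alternating minimization step, so the sketch below is only a hedged illustration. It estimates a conditional-mutual-information-style concentration term I(X; Ŷ | Y) from a mini-batch of softmax outputs grouped by label, using the identity that this quantity equals the average KL divergence between each sample's output distribution and its class-conditional mean. The function name `cmi_concentration`, the weight `lam`, and the batch-centroid estimator are illustrative assumptions, not the authors' exact loss or its normalization.

```python
# Minimal, illustrative sketch (PyTorch) of a conditional-mutual-information-style
# term computed from a batch of classifier outputs. This is an assumption-based
# example, NOT the paper's exact NCMI loss: the normalization and the alternating
# minimization algorithm described in the abstract are not reproduced here.

import torch
import torch.nn.functional as F

def cmi_concentration(logits: torch.Tensor, labels: torch.Tensor,
                      eps: float = 1e-12) -> torch.Tensor:
    """Average KL divergence between each sample's output distribution and the
    mean output distribution of its class -- a standard mini-batch estimate of
    I(X; Yhat | Y) when Yhat is produced by the network from X."""
    probs = F.softmax(logits, dim=1)                 # P(Yhat | X = x) per sample
    total = torch.zeros((), device=logits.device)
    count = 0
    for c in labels.unique():
        p_c = probs[labels == c]                     # outputs of samples labeled c
        centroid = p_c.mean(dim=0, keepdim=True)     # estimate of P(Yhat | Y = c)
        kl = (p_c * (torch.log(p_c + eps) - torch.log(centroid + eps))).sum(dim=1)
        total = total + kl.sum()
        count += p_c.shape[0]
    return total / max(count, 1)

# Hypothetical usage during training (lam is an illustrative trade-off weight):
#   logits = model(images)
#   loss = F.cross_entropy(logits, labels) + lam * cmi_concentration(logits, labels)
```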
Similar Papers
A Neural Difference-of-Entropies Estimator for Mutual Information
Machine Learning (Stat)
Helps computers understand how things are connected.
Contrastive Predictive Coding Done Right for Mutual Information Estimation
Machine Learning (CS)
Improves how computers estimate shared information.
Leveraging Conditional Mutual Information to Improve Large Language Model Fine-Tuning For Classification
Computation and Language
Makes AI smarter by teaching it to focus better.