Mixture of Balanced Information Bottlenecks for Long-Tailed Visual Recognition
By: Yifan Lan, Xin Cai, Jun Cheng, et al.
Potential Business Impact:
Helps computers recognize rare things as well as common ones.
Deep neural networks (DNNs) have achieved significant success in various applications with large-scale, balanced data. However, real-world visual recognition data are usually long-tailed, posing challenges to the efficient training and deployment of DNNs. The information bottleneck (IB) is an elegant approach to representation learning. In this paper, we propose a balanced information bottleneck (BIB) approach, in which loss re-balancing and self-distillation techniques are integrated into the original IB network. BIB is thus capable of learning a sufficient representation in which essential label-related information is fully preserved for long-tailed visual recognition. To further enhance representation learning, we also propose a novel mixture of balanced information bottlenecks (MBIB), in which different BIBs are responsible for combining knowledge from different network layers. MBIB facilitates an end-to-end learning strategy that trains representation and classification simultaneously from an information-theoretic perspective. We conduct experiments on commonly used long-tailed datasets, including CIFAR100-LT, ImageNet-LT, and iNaturalist 2018. Both BIB and MBIB achieve state-of-the-art performance in long-tailed visual recognition.
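To make the composition concrete, below is a minimal sketch of how a BIB head and an MBIB mixture could be wired up, assuming a variational IB formulation, inverse-frequency loss re-balancing, and KL-based self-distillation. The names (`BIBHead`, `MBIB`, `bib_loss`) and the weighting parameters (`beta`, `tau`, `distill_w`) are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of BIB/MBIB under the assumptions stated above.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BIBHead(nn.Module):
    """One bottleneck head: feature -> stochastic code z -> class logits."""
    def __init__(self, feat_dim, code_dim, num_classes):
        super().__init__()
        self.mu = nn.Linear(feat_dim, code_dim)      # mean of q(z|x)
        self.logvar = nn.Linear(feat_dim, code_dim)  # log-variance of q(z|x)
        self.cls = nn.Linear(code_dim, num_classes)

    def forward(self, feat):
        mu, logvar = self.mu(feat), self.logvar(feat)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        # KL(q(z|x) || N(0, I)): the I(X; Z) compression term of the IB.
        kl = 0.5 * (mu.pow(2) + logvar.exp() - logvar - 1).sum(dim=1).mean()
        return self.cls(z), kl

class MBIB(nn.Module):
    """Mixture of BIB heads attached to features from different depths."""
    def __init__(self, feat_dims, code_dim, num_classes):
        super().__init__()
        self.heads = nn.ModuleList(
            BIBHead(d, code_dim, num_classes) for d in feat_dims)

    def forward(self, feats):  # feats: one tensor per tapped backbone stage
        outs = [head(f) for head, f in zip(self.heads, feats)]
        logits = torch.stack([o[0] for o in outs]).mean(0)  # fuse predictions
        kl = torch.stack([o[1] for o in outs]).mean()
        return logits, kl

def bib_loss(logits, kl, targets, class_counts, teacher_logits=None,
             beta=1e-3, tau=2.0, distill_w=0.5):
    # Re-balanced cross-entropy: weight each class by inverse frequency
    # (one common choice; the paper may use a different re-balancing scheme).
    w = class_counts.sum() / (len(class_counts) * class_counts.float())
    loss = F.cross_entropy(logits, targets, weight=w) + beta * kl
    if teacher_logits is not None:
        # Self-distillation: match the (detached) teacher's softened outputs.
        loss = loss + distill_w * tau**2 * F.kl_div(
            F.log_softmax(logits / tau, dim=1),
            F.softmax(teacher_logits.detach() / tau, dim=1),
            reduction="batchmean")
    return loss
```

In this sketch each tapped backbone stage gets its own bottleneck, and the heads' predictions and compression terms are simply averaged; the paper's actual fusion scheme for combining knowledge across layers may differ.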
Similar Papers
A Generalized Information Bottleneck Theory of Deep Learning
Machine Learning (CS)
Helps computers learn better by understanding feature connections.
Learning Fair Graph Representations with Multi-view Information Bottleneck
Machine Learning (CS)
Makes AI fairer by fixing biased data.