Generalizing Supervised Contrastive Learning: A Projection Perspective
By: Minoh Jeong, Alfred Hero
Potential Business Impact:
Teaches computers to learn better from labeled examples.
Self-supervised contrastive learning (SSCL) has emerged as a powerful paradigm for representation learning and has been studied from multiple perspectives, including mutual information and geometric viewpoints. However, supervised contrastive (SupCon) approaches have received comparatively little attention in this context: for instance, while the InfoNCE loss used in SSCL is known to be a lower bound on mutual information (MI), the relationship between SupCon and MI remains unexplored. To address this gap, we introduce ProjNCE, a generalization of the InfoNCE loss that unifies supervised and self-supervised contrastive objectives by incorporating projection functions and an adjustment term for negative pairs. We prove that ProjNCE constitutes a valid MI bound and affords greater flexibility in selecting projection strategies for class embeddings. Building on this flexibility, we further study the centroid-based class embeddings used in SupCon by evaluating a variety of projection methods. Extensive experiments on multiple datasets and settings demonstrate that ProjNCE consistently outperforms both SupCon and standard cross-entropy training. Our work thus refines SupCon along two complementary perspectives, mutual information interpretation and projection design, and offers broadly applicable improvements whenever SupCon serves as the foundational contrastive objective.
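The paper's exact formulation is not reproduced on this page, so the sketch below is only a hypothetical illustration of the ingredients the abstract names: an InfoNCE-style objective computed against projected, centroid-based class embeddings, with a placeholder adjustment term for negative pairs. The function names, the unit-sphere projection choice, and the scalar neg_adjustment are assumptions for illustration, not the authors' definitions.

```python
# Hypothetical sketch of a ProjNCE-style loss (NOT the authors' exact
# formulation). It combines (1) centroid-based class embeddings,
# (2) a projection function applied to them, and (3) an InfoNCE-like
# contrastive objective with an assumed adjustment term on negatives.
import torch
import torch.nn.functional as F

def centroid_class_embeddings(features, labels, num_classes):
    """Mean feature vector per class (the centroid-based class
    embedding the abstract refers to)."""
    dim = features.size(1)
    centroids = torch.zeros(num_classes, dim, device=features.device)
    counts = torch.zeros(num_classes, device=features.device)
    centroids.index_add_(0, labels, features)
    counts.index_add_(0, labels, torch.ones_like(labels, dtype=torch.float))
    return centroids / counts.clamp(min=1).unsqueeze(1)

def proj_nce_loss(features, labels, num_classes, temperature=0.1,
                  projection=lambda z: F.normalize(z, dim=1),
                  neg_adjustment=0.0):
    """InfoNCE-style loss against projected class embeddings.

    `projection` (unit-sphere normalization here) and `neg_adjustment`
    (a scalar subtracted from negative-pair logits) are placeholders
    standing in for the paper's projection functions and adjustment term.
    """
    z = F.normalize(features, dim=1)
    class_emb = projection(
        centroid_class_embeddings(features, labels, num_classes))
    logits = z @ class_emb.t() / temperature   # (batch, num_classes)
    # Shift only the negative-pair logits by the assumed adjustment term.
    neg_mask = torch.ones_like(logits)
    neg_mask[torch.arange(z.size(0)), labels] = 0.0
    logits = logits - neg_adjustment * neg_mask
    # Positive pair = (sample, its own class embedding), as in SupCon.
    return F.cross_entropy(logits, labels)

# Usage on random data:
feats = torch.randn(32, 128)
labels = torch.randint(0, 10, (32,))
loss = proj_nce_loss(feats, labels, num_classes=10)
```

With neg_adjustment=0 and identity projection, this reduces to a plain InfoNCE loss over class centroids; varying the projection is where the flexibility described in the abstract would enter.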
Similar Papers
Self-Supervised Contrastive Learning is Approximately Supervised Contrastive Learning
Machine Learning (CS)
Teaches computers to learn from unlabeled pictures.
On the Alignment Between Supervised and Self-Supervised Contrastive Learning
Machine Learning (CS)
Makes computer learning more like human learning.
A Theoretical Framework for Preventing Class Collapse in Supervised Contrastive Learning
Machine Learning (CS)
Teaches computers to tell similar things apart better.