Improving Semi-Supervised Contrastive Learning via Entropy-Weighted Confidence Integration of Anchor-Positive Pairs

Published: January 8, 2026 | arXiv ID: 2601.04555v1

By: Shogo Nakayama, Masahiro Okuda

Potential Business Impact:

Teaches computers to learn more accurately from fewer labeled examples.

Business Areas:
Semantic Search, Internet Services

Conventional semi-supervised contrastive learning methods assign pseudo-labels only to samples whose highest predicted class probability exceeds a predefined threshold, and then perform supervised contrastive learning using those selected samples. In this study, we propose a novel loss function that estimates the confidence of each sample based on the entropy of its predicted probability distribution and applies confidence-based adaptive weighting. This approach enables pseudo-label assignment even to samples that were previously excluded from training and facilitates contrastive learning that accounts for the confidence of both anchor and positive samples in a more principled manner. Experimental results demonstrate that the proposed method improves classification accuracy and achieves more stable learning performance even under low-label conditions.
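The core idea in the abstract (replacing a hard confidence threshold with an entropy-based weight on each anchor-positive pair) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the normalized-entropy confidence measure, and the product weighting of anchor and positive confidences are assumptions chosen to match the description above.

```python
import math

def entropy_confidence(probs):
    """Confidence in [0, 1] from the predicted class distribution:
    1 minus normalized entropy (uniform -> 0, one-hot -> 1)."""
    num_classes = len(probs)
    entropy = -sum(p * math.log(p) for p in probs if p > 0.0)
    return 1.0 - entropy / math.log(num_classes)

def weighted_pair_loss(anchor_probs, positive_probs, pair_loss):
    """Hypothetical weighting scheme: scale one anchor-positive
    contrastive loss term by the product of both samples'
    entropy-based confidences, so low-confidence samples are
    down-weighted rather than discarded by a hard threshold."""
    weight = entropy_confidence(anchor_probs) * entropy_confidence(positive_probs)
    return weight * pair_loss

# A sharply peaked prediction keeps most of its loss contribution;
# a near-uniform one is suppressed toward zero but never excluded.
sharp = [0.97, 0.01, 0.01, 0.01]
flat = [0.25, 0.25, 0.25, 0.25]
print(weighted_pair_loss(sharp, sharp, 1.0))
print(weighted_pair_loss(flat, flat, 1.0))
```

In a real training loop, `pair_loss` would be one term of a supervised contrastive loss (e.g. per anchor-positive pair in SupCon), with the weights computed from the classifier's softmax outputs for the two samples.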

Page Count
4 pages

Category
Computer Science:
Machine Learning (CS)