Score: 1

Uncertainty-driven Embedding Convolution

Published: July 28, 2025 | arXiv ID: 2507.20718v1

By: Sungjun Lim, Kangjun Noh, Youngjun Choi, and more

Potential Business Impact:

Makes language-understanding systems built on text embeddings (e.g., semantic search and classification) more reliable by combining multiple embedding models and accounting for each model's uncertainty.

Business Areas:
Semantic Search, Internet Services

Text embeddings are essential components in modern NLP pipelines. While numerous embedding models have been proposed, their performance varies across domains, and no single model consistently excels across all tasks. This variability motivates the use of ensemble techniques to combine complementary strengths. However, most existing ensemble methods operate on deterministic embeddings and fail to account for model-specific uncertainty, limiting their robustness and reliability in downstream applications. To address these limitations, we propose Uncertainty-driven Embedding Convolution (UEC). UEC first transforms deterministic embeddings into probabilistic ones in a post-hoc manner. It then computes adaptive ensemble weights based on embedding uncertainty, grounded in a Bayes-optimal solution under a surrogate loss. Additionally, UEC introduces an uncertainty-aware similarity function that directly incorporates uncertainty into similarity scoring. Extensive experiments on retrieval, classification, and semantic similarity benchmarks demonstrate that UEC consistently improves both performance and robustness by leveraging principled uncertainty modeling.
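To make the abstract's pipeline concrete, here is a minimal sketch of the general idea it describes: deterministic embeddings are wrapped post-hoc as probabilistic (mean, variance) pairs, fused with uncertainty-adaptive weights, and compared with a similarity score that is discounted by uncertainty. The specific choices below (inverse-variance weighting under a diagonal-Gaussian assumption, a cosine score scaled by an uncertainty penalty) and all function names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def to_probabilistic(embedding, variance_estimate):
    """Wrap a deterministic embedding as a diagonal Gaussian (mean, variance).

    The variance could come from any post-hoc estimator; here it is simply
    passed in (hypothetical interface, not the paper's estimator)."""
    return np.asarray(embedding, float), np.asarray(variance_estimate, float)

def fuse_embeddings(means, variances, eps=1e-8):
    """Combine probabilistic embeddings with inverse-variance (precision) weights.

    Under a diagonal-Gaussian assumption, precision weighting is the
    Bayes-optimal fusion for a squared-error surrogate loss: dimensions
    where a model is uncertain contribute less to the ensemble."""
    means = np.stack(means)          # (n_models, dim)
    variances = np.stack(variances)  # (n_models, dim)
    precisions = 1.0 / (variances + eps)
    weights = precisions / precisions.sum(axis=0, keepdims=True)
    fused_mean = (weights * means).sum(axis=0)
    fused_var = 1.0 / precisions.sum(axis=0)
    return fused_mean, fused_var

def uncertainty_aware_similarity(mu_q, var_q, mu_d, var_d):
    """Cosine similarity of the means, discounted by total uncertainty.

    One simple way to let uncertainty lower the score; the paper's exact
    scoring function may differ."""
    cos = mu_q @ mu_d / (np.linalg.norm(mu_q) * np.linalg.norm(mu_d) + 1e-12)
    penalty = 1.0 / (1.0 + var_q.mean() + var_d.mean())
    return cos * penalty

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim = 8
    # Two hypothetical embedding models for the same query, with per-dimension variances.
    mu_a, var_a = to_probabilistic(rng.normal(size=dim), np.full(dim, 0.05))
    mu_b, var_b = to_probabilistic(rng.normal(size=dim), np.full(dim, 0.50))  # less confident model
    fused_mu, fused_var = fuse_embeddings([mu_a, mu_b], [var_a, var_b])
    doc_mu, doc_var = to_probabilistic(rng.normal(size=dim), np.full(dim, 0.10))
    print(uncertainty_aware_similarity(fused_mu, fused_var, doc_mu, doc_var))
```

In this toy setup the less confident model (larger variances) is automatically down-weighted during fusion, and high uncertainty on either side of a query-document pair pulls the similarity score toward zero, which is the behavior the abstract attributes to UEC.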

Country of Origin
🇰🇷 Korea, Republic of

Repos / Data Links

Page Count
23 pages

Category
Computer Science:
Machine Learning (CS)