Semantics at an Angle: When Cosine Similarity Works Until It Doesn't

Published: April 22, 2025 | arXiv ID: 2504.16318v2

By: Kisung You

Potential Business Impact:

Improves how machine-learning systems measure semantic similarity between embeddings, which can sharpen search, retrieval, and recommendation quality.

Business Areas:
Semantic Search, Internet Services

Cosine similarity has become a standard metric for comparing embeddings in modern machine learning. Its scale-invariance and alignment with model training objectives have contributed to its widespread adoption. However, recent studies have revealed important limitations, particularly when embedding norms carry meaningful semantic information. This informal article offers a reflective and selective examination of the evolution, strengths, and limitations of cosine similarity. We highlight why it performs well in many settings, where it tends to break down, and how emerging alternatives are beginning to address its blind spots. We hope to offer a mix of conceptual clarity and practical perspective, especially for quantitative scientists who think about embeddings not just as vectors, but as geometric and philosophical objects.
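The two properties the abstract highlights can be seen directly in code: cosine similarity is invariant to rescaling either vector, which also means it discards any information the norm might carry. The sketch below is illustrative only (the function name and toy vectors are our own, not from the paper):

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

u = np.array([1.0, 2.0, 3.0])
v = np.array([2.0, 1.0, 0.5])

# Scale-invariance: rescaling either vector leaves the score unchanged,
# which is why cosine similarity aligns well with angle-based objectives.
s1 = cosine_similarity(u, v)
s2 = cosine_similarity(10.0 * u, 0.1 * v)
assert np.isclose(s1, s2)

# The blind spot: a vector and a 100x-scaled copy score (near) 1.0,
# so any meaning carried by the norm (e.g. frequency or confidence
# effects in learned embeddings) is invisible to this metric.
w = 100.0 * u
assert np.isclose(cosine_similarity(u, w), 1.0)
```

The second assertion is the crux of the paper's concern: when embedding norms encode semantics, cosine similarity silently throws that signal away.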

Country of Origin
🇺🇸 United States

Page Count
9 pages

Category
Computer Science:
Machine Learning (CS)