One Sentence, Two Embeddings: Contrastive Learning of Explicit and Implicit Semantic Representations
By: Kohei Oda, Po-Min Chuang, Kiyoaki Shirai, and more
Potential Business Impact:
Helps computers understand hidden meanings in sentences.
Sentence embedding methods have made remarkable progress, yet they still struggle to capture the implicit semantics within sentences. This can be attributed to an inherent limitation of conventional sentence embedding methods: they assign only a single vector per sentence. To overcome this limitation, we propose DualCSE, a sentence embedding method that assigns two embeddings to each sentence: one representing the explicit semantics and the other representing the implicit semantics. These embeddings coexist in a shared space, enabling the selection of the desired semantics for specific purposes such as information retrieval and text classification. Experimental results demonstrate that DualCSE can effectively encode both explicit and implicit meanings and improve performance on downstream tasks.
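To make the core idea concrete, here is a minimal toy sketch of the "two embeddings per sentence in a shared space" setup the abstract describes. Everything here is an assumption for illustration: the paper's actual architecture and training objective are not shown, and the two linear heads (`W_explicit`, `W_implicit`), the sentence vector, and the query are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: a shared sentence vector is projected by two
# separate linear heads into the SAME output space, yielding one
# "explicit" and one "implicit" embedding per sentence.
DIM_IN, DIM_OUT = 8, 4
W_explicit = rng.normal(size=(DIM_OUT, DIM_IN))
W_implicit = rng.normal(size=(DIM_OUT, DIM_IN))

def normalize(v):
    """Scale a vector to unit length."""
    return v / np.linalg.norm(v)

def dual_encode(sentence_vec):
    """Return (explicit, implicit) unit embeddings for one sentence vector."""
    return normalize(W_explicit @ sentence_vec), normalize(W_implicit @ sentence_vec)

def cosine(a, b):
    """Cosine similarity of two unit vectors."""
    return float(a @ b)

sent = rng.normal(size=DIM_IN)          # stand-in for a sentence encoder output
explicit, implicit = dual_encode(sent)

# Because both embeddings live in the same space, a downstream task
# (e.g. retrieval) can score a query against whichever view it needs.
query = normalize(rng.normal(size=DIM_OUT))
score_explicit = cosine(query, explicit)
score_implicit = cosine(query, implicit)
```

The point of the sketch is only the shared-space property: one sentence yields two comparable vectors, and a caller picks the explicit or the implicit view per task.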
Similar Papers
SemCSE: Semantic Contrastive Sentence Embeddings Using LLM-Generated Summaries For Scientific Abstracts
Computation and Language
Helps computers understand science papers better.
2-Tier SimCSE: Elevating BERT for Robust Sentence Embeddings
Computation and Language
Helps computers understand sentences better.
Differential syntactic and semantic encoding in LLMs
Computation and Language
Teaches computers how words fit together.