Gemini Embedding: Generalizable Embeddings from Gemini
By: Jinhyuk Lee, Feiyang Chen, Sahil Dua, and more
Potential Business Impact:
Enables multilingual and code-aware search, classification, and ranking from a single precomputed embedding model.
In this report, we introduce Gemini Embedding, a state-of-the-art embedding model leveraging the power of Gemini, Google's most capable large language model. Capitalizing on Gemini's inherent multilingual and code understanding capabilities, Gemini Embedding produces highly generalizable embeddings for text spanning numerous languages and textual modalities. The representations generated by Gemini Embedding can be precomputed and applied to a variety of downstream tasks, including classification, similarity, clustering, ranking, and retrieval. Evaluated on the Massive Multilingual Text Embedding Benchmark (MMTEB), which includes over one hundred tasks across 250+ languages, Gemini Embedding substantially outperforms prior state-of-the-art models, with considerable improvements in embedding quality. Achieving state-of-the-art performance across MMTEB's multilingual, English, and code benchmarks, our unified model shows strong capabilities across a broad selection of tasks and surpasses specialized domain-specific models.
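Because the embeddings can be precomputed, downstream tasks like retrieval reduce to simple vector operations over stored vectors. Below is a minimal sketch of the precompute-then-retrieve pattern using cosine similarity; the `embed` function here is a hypothetical stand-in for whatever embedding endpoint you call (for Gemini Embedding, that would be the Gemini API's embedding model), and the corpus and dimensionality are illustrative, not taken from the paper.

```python
import numpy as np

def embed(texts):
    # Hypothetical stand-in for an embedding API call.
    # We fake unit-norm 768-d vectors so the sketch runs offline;
    # in practice, replace this body with your embedding endpoint.
    rng = np.random.default_rng(0)
    vecs = rng.normal(size=(len(texts), 768))
    return vecs / np.linalg.norm(vecs, axis=1, keepdims=True)

# Precompute corpus embeddings once; reuse them across many queries.
corpus = [
    "def binary_search(a, x): ...",
    "La tour Eiffel est à Paris.",
    "Gradient descent minimizes a loss function.",
]
corpus_emb = embed(corpus)

# Retrieval: embed the query, rank documents by cosine similarity.
# With unit-norm vectors, cosine similarity is just a dot product.
query_emb = embed(["Where is the Eiffel Tower?"])[0]
scores = corpus_emb @ query_emb
for idx in np.argsort(scores)[::-1]:
    print(f"{scores[idx]:+.3f}  {corpus[idx]}")
```

The same precomputed vectors serve classification, clustering, and similarity tasks as well; only the operation applied on top of them changes (a classifier head, k-means, or a pairwise dot product).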
Similar Papers
EmbeddingGemma: Powerful and Lightweight Text Representations
Computation and Language
Helps computers understand words better, faster, and more cheaply.
GEM: Empowering LLM for both Embedding Generation and Language Understanding
Computation and Language
Lets one model both generate embeddings and understand language.
GigaEmbeddings: Efficient Russian Language Embedding Model
Computation and Language
Helps computers understand Russian text better.