LELA: an LLM-based Entity Linking Approach with Zero-Shot Domain Adaptation
By: Samy Haffoudhi, Fabian M. Suchanek, Nils Holzenberger
Potential Business Impact:
Links ambiguous names in text to the real-world entities they refer to.
Entity linking (mapping ambiguous mentions in text to entities in a knowledge base) is a foundational step in tasks such as knowledge graph construction, question-answering, and information extraction. Our method, LELA, is a modular coarse-to-fine approach that leverages the capabilities of large language models (LLMs), and works with different target domains, knowledge bases and LLMs, without any fine-tuning phase. Our experiments across various entity linking settings show that LELA is highly competitive with fine-tuned approaches, and substantially outperforms the non-fine-tuned ones.
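The coarse-to-fine idea can be illustrated with a minimal sketch: a cheap coarse step retrieves candidate entities by alias matching, and a fine step disambiguates among them using context. This is an illustrative assumption, not the authors' implementation; the toy knowledge base, the function names, and the word-overlap scorer (standing in for an LLM prompt over candidate descriptions) are all hypothetical.

```python
# Illustrative coarse-to-fine entity linking sketch (hypothetical names and
# data; a real system like LELA would query an LLM in the fine step).

# Toy knowledge base: entity id -> (canonical name, aliases, description)
KB = {
    "Q90":     ("Paris", {"paris"}, "capital city of France"),
    "Q167646": ("Paris Hilton", {"paris", "paris hilton"},
                "American media personality"),
}

def coarse_candidates(mention):
    """Coarse step: retrieve KB entities whose aliases match the mention."""
    m = mention.lower()
    return [eid for eid, (_, aliases, _) in KB.items() if m in aliases]

def fine_disambiguate(mention, context, candidates):
    """Fine step: choose the best candidate given the context.
    A real pipeline would prompt an LLM with the context and each
    candidate's description; a simple word-overlap score stands in here."""
    ctx = set(context.lower().split())
    def score(eid):
        return len(ctx & set(KB[eid][2].lower().split()))
    return max(candidates, key=score) if candidates else None

def link(mention, context):
    """Full pipeline: coarse candidate retrieval, then fine disambiguation."""
    return fine_disambiguate(mention, context, coarse_candidates(mention))

print(link("Paris", "She flew to the capital of France"))  # -> Q90
```

Swapping the word-overlap scorer for an LLM call is what makes such a pipeline adaptable to new domains and knowledge bases without fine-tuning.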
Similar Papers
Harnessing Deep LLM Participation for Robust Entity Linking
Computation and Language
Helps computers understand names in text better.
An Entity Linking Agent for Question Answering
Computation and Language
Helps computers find answers in short questions.