Label-Guided In-Context Learning for Named Entity Recognition
By: Fan Bai, Hamid Hassanzadeh, Ardavan Saeedi, and more
Potential Business Impact:
Helps computers find names of people, places, and organizations in text by learning from a few examples.
In-context learning (ICL) enables large language models (LLMs) to perform new tasks using only a few demonstrations. In Named Entity Recognition (NER), demonstrations are typically selected based on semantic similarity to the test instance, ignoring training labels and resulting in suboptimal performance. We introduce DEER, a new method that leverages training labels through token-level statistics to improve ICL performance. DEER first enhances example selection with a label-guided, token-based retriever that prioritizes tokens most informative for entity recognition. It then prompts the LLM to revisit error-prone tokens, which are also identified using label statistics, and make targeted corrections. Evaluated on five NER datasets using four different LLMs, DEER consistently outperforms existing ICL methods and approaches the performance of supervised fine-tuning. Further analysis shows its effectiveness on both seen and unseen entities and its robustness in low-resource settings.
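To make the label-guided retrieval idea concrete, here is a minimal sketch of how token-level label statistics from BIO-tagged training data could weight demonstration selection. The function names (compute_token_entity_stats, label_guided_score, select_demonstrations), the lexical-overlap scoring, and the default weight are illustrative assumptions; the abstract describes DEER's retriever only at a high level, so this is not the paper's exact method.

```python
from collections import Counter

def compute_token_entity_stats(train_examples):
    """Estimate, for each token, how often it appears inside an entity span.

    train_examples: list of (tokens, bio_tags) pairs, e.g.
        (["Barack", "Obama", "visited", "Paris"], ["B-PER", "I-PER", "O", "B-LOC"])
    Returns a dict mapping token -> estimated P(token is part of an entity).
    """
    entity_counts, total_counts = Counter(), Counter()
    for tokens, tags in train_examples:
        for tok, tag in zip(tokens, tags):
            tok = tok.lower()
            total_counts[tok] += 1
            if tag != "O":
                entity_counts[tok] += 1
    return {tok: entity_counts[tok] / total_counts[tok] for tok in total_counts}


def label_guided_score(test_tokens, candidate_tokens, entity_stats, default=0.1):
    """Score a candidate demonstration by overlap on entity-informative tokens.

    Tokens with a higher estimated entity probability contribute more, so
    demonstrations sharing likely-entity tokens with the test instance are
    preferred over ones that merely share frequent background words.
    """
    candidate_set = {t.lower() for t in candidate_tokens}
    return sum(
        entity_stats.get(tok.lower(), default)
        for tok in test_tokens
        if tok.lower() in candidate_set
    )


def select_demonstrations(test_tokens, train_examples, entity_stats, k=4):
    """Pick the top-k training examples under the label-guided token score."""
    scored = [
        (label_guided_score(test_tokens, tokens, entity_stats), (tokens, tags))
        for tokens, tags in train_examples
    ]
    scored.sort(key=lambda x: x[0], reverse=True)
    return [example for _, example in scored[:k]]


if __name__ == "__main__":
    train = [
        (["Barack", "Obama", "visited", "Paris"], ["B-PER", "I-PER", "O", "B-LOC"]),
        (["The", "weather", "in", "Paris", "is", "mild"], ["O", "O", "O", "B-LOC", "O", "O"]),
        (["Apple", "released", "a", "new", "phone"], ["B-ORG", "O", "O", "O", "O"]),
    ]
    stats = compute_token_entity_stats(train)
    demos = select_demonstrations(["Obama", "met", "leaders", "in", "Paris"], train, stats, k=2)
    for tokens, tags in demos:
        print(" ".join(tokens), "->", " ".join(tags))
```

In this sketch, the same token-level statistics could also flag error-prone tokens (e.g., tokens whose entity probability is close to 0.5) for the correction pass the abstract mentions, though that step is not implemented here.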
Similar Papers
EL4NER: Ensemble Learning for Named Entity Recognition via Multiple Small-Parameter Large Language Models
Computation and Language
Lets smaller AI models work together to find important names in text.
Learning to Select In-Context Demonstration Preferred by Large Language Model
Machine Learning (CS)
Helps AI learn better by picking good examples.
Leveraging In-Context Learning for Language Model Agents
Computation and Language
Helps AI agents learn by watching examples.