Score: 3

Enhancing Coreference Resolution with Pretrained Language Models: Bridging the Gap Between Syntax and Semantics

Published: April 8, 2025 | arXiv ID: 2504.05855v1

By: Xingzu Liu, Songhang Deng, Mingbang Wang, and more

BigTech Affiliations: Amazon

Potential Business Impact:

Helps computers understand who or what "they" refers to.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Large language models have made significant advancements in various natural language processing tasks, including coreference resolution. However, traditional methods often fall short in effectively distinguishing referential relationships due to a lack of integration between syntactic and semantic information. This study introduces an innovative framework aimed at enhancing coreference resolution by utilizing pretrained language models. Our approach combines syntax parsing with semantic role labeling to accurately capture finer distinctions in referential relationships. By employing state-of-the-art pretrained models to gather contextual embeddings and applying an attention mechanism for fine-tuning, we improve the performance of coreference tasks. Experimental results across diverse datasets show that our method surpasses conventional coreference resolution systems, achieving notable accuracy in disambiguating references. This development not only improves coreference resolution outcomes but also positively impacts other natural language processing tasks that depend on precise referential understanding.
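To make the described architecture concrete, below is a minimal, hypothetical sketch (not the authors' code) of a mention-pair coreference scorer that fuses pretrained contextual embeddings with syntactic-parse and semantic-role features and refines them with an attention layer before scoring candidate pairs. The class name PairScorer, the feature dimensions, and the multi-head attention design are illustrative assumptions; the paper's actual model may differ.

```python
# Hypothetical sketch: fuse pretrained embeddings with syntax/SRL features,
# refine with attention, then score mention pairs for coreference.
import torch
import torch.nn as nn

class PairScorer(nn.Module):
    def __init__(self, emb_dim=768, syn_dim=32, srl_dim=32, hidden=256):
        super().__init__()
        fused = emb_dim + syn_dim + srl_dim
        # Attention over fused mention representations (assumed design choice).
        self.attn = nn.MultiheadAttention(fused, num_heads=4, batch_first=True)
        self.ffn = nn.Sequential(
            nn.Linear(2 * fused, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, emb, syn, srl):
        # emb: (batch, mentions, emb_dim) contextual embeddings from a
        # pretrained encoder; syn / srl: parse and semantic-role features.
        x = torch.cat([emb, syn, srl], dim=-1)
        x, _ = self.attn(x, x, x)  # contextual refinement across mentions
        a = x.unsqueeze(2).expand(-1, -1, x.size(1), -1)
        b = x.unsqueeze(1).expand(-1, x.size(1), -1, -1)
        # Score every mention pair (i, j).
        return self.ffn(torch.cat([a, b], dim=-1)).squeeze(-1)

# Toy usage: random tensors stand in for real embeddings and features.
scores = PairScorer()(torch.randn(1, 5, 768), torch.randn(1, 5, 32), torch.randn(1, 5, 32))
print(scores.shape)  # torch.Size([1, 5, 5])
```

In a full pipeline, the random tensors would be replaced by encoder outputs and by features extracted from a syntactic parser and a semantic role labeler, and the pair scores would feed a clustering step to form coreference chains.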

Country of Origin
πŸ‡¨πŸ‡³ πŸ‡ΊπŸ‡Έ China, United States

Page Count
11 pages

Category
Computer Science:
Computation and Language