QDER: Query-Specific Document and Entity Representations for Multi-Vector Document Re-Ranking
By: Shubham Chatterjee, Jeff Dalton
Potential Business Impact:
Finds better search results by understanding both the words in a query and the concepts (entities) behind them.
Neural IR has advanced along two distinct paths: entity-oriented approaches that leverage knowledge graphs and multi-vector models that capture fine-grained semantics. We introduce QDER, a neural re-ranking model that unifies these approaches by integrating knowledge graph semantics into a multi-vector model. QDER's key innovation lies in how it models query-document relationships: rather than computing similarity scores on aggregated embeddings, we maintain individual token and entity representations throughout the ranking process and perform aggregation only at the final scoring stage, an approach we call "late aggregation." These fine-grained representations are first transformed through learned attention patterns and then combined through carefully chosen mathematical operations for precise matching. Experiments on five standard benchmarks show that QDER achieves significant performance gains, including a 36% improvement in nDCG@20 over the strongest baseline on TREC Robust 2004 and similar improvements on the other datasets. QDER particularly excels on difficult queries, achieving an nDCG@20 of 0.70 where traditional approaches fail completely (nDCG@20 = 0.0), setting a foundation for future work in entity-aware retrieval.
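To make the "late aggregation" idea concrete, the sketch below is a minimal illustration, not the authors' implementation: per-token and per-entity query-document interaction scores are kept separate throughout and collapsed into a single relevance score only at the very end. The class name, dimensions, the shared multi-head attention, and the bilinear interaction are all assumptions made for the example.

```python
import torch
import torch.nn as nn

class LateAggregationScorer(nn.Module):
    """Illustrative sketch of late aggregation: keep fine-grained token and
    entity interaction scores separate and sum them only at final scoring.
    Hypothetical architecture choices; not the published QDER model."""

    def __init__(self, dim: int = 768):
        super().__init__()
        # Learned attention used to re-weight fine-grained document representations
        self.query_attn = nn.MultiheadAttention(dim, num_heads=8, batch_first=True)
        # Separate learned interactions for token-level and entity-level evidence
        self.token_interaction = nn.Bilinear(dim, dim, 1)
        self.entity_interaction = nn.Bilinear(dim, dim, 1)

    def forward(self, q_tok, d_tok, q_ent, d_ent):
        # q_tok: (B, Lq, D)  d_tok: (B, Ld, D)  q_ent: (B, Eq, D)  d_ent: (B, Ed, D)
        # 1) Transform document representations with query-conditioned attention.
        d_tok_ctx, _ = self.query_attn(q_tok, d_tok, d_tok)   # (B, Lq, D)
        d_ent_ctx, _ = self.query_attn(q_ent, d_ent, d_ent)   # (B, Eq, D)

        # 2) Score every query token / entity against its attended document view.
        tok_scores = self.token_interaction(q_tok, d_tok_ctx).squeeze(-1)   # (B, Lq)
        ent_scores = self.entity_interaction(q_ent, d_ent_ctx).squeeze(-1)  # (B, Eq)

        # 3) Late aggregation: collapse to one relevance score only at this point.
        return tok_scores.sum(dim=-1) + ent_scores.sum(dim=-1)              # (B,)

# Example usage with random embeddings (shapes chosen only for illustration):
scorer = LateAggregationScorer()
score = scorer(torch.randn(2, 8, 768), torch.randn(2, 200, 768),
               torch.randn(2, 3, 768), torch.randn(2, 20, 768))
```

The contrast with early aggregation is that a single-vector model would pool the token and entity embeddings into one query vector and one document vector before computing similarity; here every individual match contributes its own score until the final sum.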
Similar Papers
CDER: Collaborative Evidence Retrieval for Document-level Relation Extraction
Computation and Language
Helps computers find clues for understanding stories.
QUESTER: Query Specification for Generative Retrieval
Information Retrieval
Finds information faster using smart AI.
QuatE-D: A Distance-Based Quaternion Model for Knowledge Graph Embedding
Machine Learning (CS)
Makes computers understand relationships better.