Counting Still Counts: Understanding Neural Complex Query Answering Through Query Relaxation
By: Yannick Brunink, Daniel Daza, Yunjie He, and more
Potential Business Impact:
Finds answers by trying many paths.
Neural methods for Complex Query Answering (CQA) over knowledge graphs (KGs) are widely believed to learn patterns that generalize beyond explicit graph structure, allowing them to infer answers that are unreachable through symbolic query processing. In this work, we critically examine this assumption through a systematic analysis comparing neural CQA models with an alternative, training-free query relaxation strategy that retrieves possible answers by relaxing query constraints and counting resulting paths. Across multiple datasets and query structures, we find several cases where neural and relaxation-based approaches perform similarly, with no neural model consistently outperforming the latter. Moreover, a similarity analysis reveals that their retrieved answers exhibit little overlap, and that combining their outputs consistently improves performance. These results call for a re-evaluation of progress in neural query answering: despite their complexity, current models fail to subsume the reasoning patterns captured by query relaxation. Our findings highlight the importance of stronger non-neural baselines and suggest that future neural approaches could benefit from incorporating principles of query relaxation.
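The abstract describes the relaxation baseline only at a high level, so here is a minimal sketch of what "relaxing query constraints and counting resulting paths" could look like, assuming a toy triple store and simple relation-path queries. The `relaxed_answers` and `fused_rank` functions and all entity and relation names are illustrative assumptions, not the paper's implementation; the actual CQA benchmarks involve richer query structures with conjunctions and existential variables.

```python
from collections import defaultdict

# Toy KG as (head, relation, tail) triples; all names here are illustrative.
triples = [
    ("ada", "works_at", "mit"),
    ("ada", "studied_at", "mit"),
    ("ada", "works_at", "acme"),
    ("mit", "located_in", "cambridge"),
    ("acme", "based_in", "springfield"),
]

by_edge = defaultdict(set)   # (head, relation) -> set of tails
by_head = defaultdict(set)   # head -> set of outgoing (relation, tail) pairs
for h, r, t in triples:
    by_edge[(h, r)].add(t)
    by_head[h].add((r, t))

def relaxed_answers(anchor, relations):
    """Answer a relation-path query anchor -r1-> ? -r2-> ... by relaxing one
    relation constraint at a time and counting, per candidate entity, how many
    paths survive; more paths mean stronger evidence for that answer."""
    scores = defaultdict(int)
    for relaxed_hop in range(len(relations)):
        frontier = {anchor: 1}                      # entity -> #paths reaching it
        for hop, rel in enumerate(relations):
            nxt = defaultdict(int)
            for node, n_paths in frontier.items():
                if hop == relaxed_hop:              # relaxed hop: any relation allowed
                    succs = [t for _, t in by_head[node]]
                else:                               # strict hop: follow the query relation
                    succs = by_edge[(node, rel)]
                for t in succs:
                    nxt[t] += n_paths
            frontier = nxt
        for entity, n_paths in frontier.items():
            scores[entity] += n_paths
    return dict(scores)

def fused_rank(scores_a, scores_b, k=60):
    """Reciprocal-rank fusion: one standard (illustrative, not necessarily the
    paper's) way to combine a neural model's scores with relaxation scores."""
    def ranks(scores):
        ordered = sorted(scores, key=scores.get, reverse=True)
        return {e: i + 1 for i, e in enumerate(ordered)}
    ra, rb = ranks(scores_a), ranks(scores_b)
    entities = set(ra) | set(rb)
    return sorted(entities,
                  key=lambda e: -(1 / (k + ra.get(e, 10**6)) + 1 / (k + rb.get(e, 10**6))))

# Strict processing of "ada -works_at-> ? -located_in-> ?" returns only
# cambridge; relaxation also surfaces springfield, at a lower path count.
print(relaxed_answers("ada", ["works_at", "located_in"]))
# {'cambridge': 3, 'springfield': 1}
```

On this toy graph, strict symbolic processing of the two-hop query returns only cambridge, while relaxation additionally surfaces springfield at a lower score. The `fused_rank` helper shows one common way relaxation scores could be combined with a neural model's scores, echoing the paper's finding that combining the two outputs improves performance, though the paper does not commit to this particular fusion rule.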
Similar Papers
Efficient and Scalable Neural Symbolic Search for Knowledge Graph Complex Query Answering
Artificial Intelligence
Answers tough questions with smart computer brains, faster.
Plan Then Retrieve: Reinforcement Learning-Guided Complex Reasoning over Knowledge Graphs
Artificial Intelligence
Helps computers answer questions using more information.
Interactive Query Answering on Knowledge Graphs with Soft Entity Constraints
Artificial Intelligence
Helps computers find better answers with vague rules.