Transformers for Complex Query Answering over Knowledge Hypergraphs
By: Hong Ting Tsang, Zihao Wang, Yangqiu Song
Potential Business Impact:
Answers tricky questions using smart computer knowledge.
Complex Query Answering (CQA) has been extensively studied in recent years. To model data that is closer to the real-world distribution, knowledge graphs with different modalities have been introduced. Triple KGs, the classic KGs composed of entities and binary (arity-2) relations, offer only a limited representation of real-world facts, which are often more sophisticated. While hyper-relational graphs have been introduced, they remain limited in representing relations of varying arity whose participating entities contribute equally. To address this gap, we sample new CQA datasets, JF17k-HCQA and M-FB15k-HCQA, each containing various query types built from logical operations such as projection, negation, conjunction, and disjunction. To answer existential first-order queries over knowledge hypergraphs (KHGs), we propose a two-stage transformer model, the Logical Knowledge Hypergraph Transformer (LKHGT), which consists of a Projection Encoder for atomic projections and a Logical Encoder for complex logical operations. Both encoders are equipped with Type Aware Bias (TAB) to capture token interactions. Experimental results on our CQA datasets show that LKHGT is a state-of-the-art CQA method over KHGs and generalizes to out-of-distribution query types.
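The abstract only names the components of LKHGT, so the following PyTorch sketch is an assumption-laden illustration of how such a two-stage model could be organized: a Projection Encoder that answers one atomic hyperedge projection at a time, a Logical Encoder that combines intermediate answer embeddings under logical operators, and a Type Aware Bias realized here as a learned additive attention bias indexed by pairs of token types. The token-type vocabulary, dimensions, and all helper names other than LKHGT itself are hypothetical and not taken from the paper.

```python
# Minimal sketch of a two-stage LKHGT-style model (assumed details, not the authors' code).
import torch
import torch.nn as nn


class TypeAwareBiasAttention(nn.Module):
    """Self-attention block whose scores receive a learned per-(token-type pair) bias."""

    def __init__(self, dim, num_heads, num_token_types):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.num_heads = num_heads
        # One learnable scalar per head and (query-token type, key-token type) pair.
        self.type_bias = nn.Parameter(torch.zeros(num_heads, num_token_types, num_token_types))
        self.norm = nn.LayerNorm(dim)

    def forward(self, x, token_types):
        # x: (B, S, D); token_types: (B, S) integer type ids.
        b, s, _ = x.shape
        bias = self.type_bias[:, token_types.unsqueeze(-1), token_types.unsqueeze(1)]  # (H, B, S, S)
        bias = bias.permute(1, 0, 2, 3).reshape(b * self.num_heads, s, s)
        out, _ = self.attn(x, x, x, attn_mask=bias, need_weights=False)
        return self.norm(x + out)


class Encoder(nn.Module):
    """A stack of type-aware attention blocks; the first token acts as the answer slot."""

    def __init__(self, dim, num_heads, num_layers, num_token_types):
        super().__init__()
        self.blocks = nn.ModuleList(
            TypeAwareBiasAttention(dim, num_heads, num_token_types) for _ in range(num_layers)
        )
        self.readout = nn.Linear(dim, dim)

    def forward(self, tokens, token_types):
        for block in self.blocks:
            tokens = block(tokens, token_types)
        return self.readout(tokens[:, 0])  # embedding of the answer-slot token


class LKHGT(nn.Module):
    """Projection Encoder for atomic hyperedge projections; Logical Encoder for
    conjunction / disjunction / negation over intermediate embeddings (assumed layout)."""

    # Assumed token types: 0 = answer slot, 1 = relation, 2 = entity/intermediate, 3 = operator.
    def __init__(self, num_entities, num_relations, dim=256, heads=4, layers=2):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)
        self.ops = nn.Embedding(3, dim)  # 0 = AND, 1 = OR, 2 = NOT
        self.slot = nn.Parameter(torch.randn(dim))
        self.projection_encoder = Encoder(dim, heads, layers, num_token_types=4)
        self.logical_encoder = Encoder(dim, heads, layers, num_token_types=4)

    def project(self, rel_ids, arg_embs):
        """Atomic projection r(?, e_1, ..., e_k): encode [slot, relation, known arguments]."""
        b, k, d = arg_embs.shape
        tokens = torch.cat([self.slot.expand(b, 1, d), self.rel(rel_ids).unsqueeze(1), arg_embs], dim=1)
        types = torch.cat([torch.tensor([[0, 1]]).expand(b, 2), torch.full((b, k), 2)], dim=1)
        return self.projection_encoder(tokens, types.to(rel_ids.device))

    def combine(self, op_id, operand_embs):
        """Apply a logical operator token to intermediate answer embeddings."""
        b, k, d = operand_embs.shape
        op = self.ops(torch.full((b, 1), op_id, device=operand_embs.device))
        tokens = torch.cat([self.slot.expand(b, 1, d), op, operand_embs], dim=1)
        types = torch.cat([torch.tensor([[0, 3]]).expand(b, 2), torch.full((b, k), 2)], dim=1)
        return self.logical_encoder(tokens, types.to(operand_embs.device))
```

Under this sketch, a two-hop query would call `project` twice, feeding the first answer embedding back in as an argument of the second projection, while an intersection query would pass two projection outputs to `combine` with the AND operator; answers would then be ranked by similarity between the final embedding and the entity embeddings.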
Similar Papers
Efficient and Scalable Neural Symbolic Search for Knowledge Graph Complex Query Answering
Artificial Intelligence
Answers tough questions from smart computer brains faster.
A Method for Multi-Hop Question Answering on Persian Knowledge Graph
Information Retrieval
Helps computers answer hard questions in Persian.
The benefits of query-based KGQA systems for complex and temporal questions in LLM era
Computation and Language
Helps computers answer tricky questions by finding facts.