Meta-Semantics Augmented Few-Shot Relational Learning
By: Han Wu, Jie Yin
Potential Business Impact:
Teaches computers to learn new facts with few examples.
Few-shot relational learning on knowledge graphs (KGs) aims to perform reasoning over relations with only a few training examples. While current methods have focused primarily on leveraging specific relational information, the rich semantics inherent in KGs have been largely overlooked. To bridge this gap, we propose PromptMeta, a novel prompted meta-learning framework that seamlessly integrates meta-semantics with relational information for few-shot relational learning. PromptMeta introduces two core innovations: (1) a Meta-Semantic Prompt (MSP) pool that learns and consolidates high-level meta-semantics shared across tasks, enabling effective knowledge transfer and adaptation to newly emerging relations; and (2) a learnable fusion mechanism that dynamically combines meta-semantics with task-specific relational information tailored to different few-shot tasks. Both components are optimized jointly with model parameters within a meta-learning framework. Extensive experiments and analyses on two real-world KG benchmarks validate the effectiveness of PromptMeta in adapting to new relations with limited supervision.
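The two components described above can be illustrated with a minimal sketch. This is not the authors' implementation: the pool size, embedding dimension, attention-style retrieval over the MSP pool, and the sigmoid fusion gate are all hypothetical choices for illustration, and the "learnable" parameters are shown as fixed random arrays rather than weights trained by meta-learning.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, POOL_SIZE = 8, 4  # hypothetical embedding size and MSP pool size

# MSP pool: prompt vectors meant to capture meta-semantics shared across tasks
# (in PromptMeta these would be learned jointly with model parameters)
msp_pool = rng.normal(size=(POOL_SIZE, DIM))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def retrieve_meta_semantics(task_rel_emb):
    """Attend over the MSP pool with a task's relation embedding."""
    scores = msp_pool @ task_rel_emb      # similarity to each prompt vector
    weights = softmax(scores)             # soft selection over the pool
    return weights @ msp_pool             # aggregated meta-semantic vector

# Learnable fusion gate (random init here; trained in the real framework)
w_gate = rng.normal(size=2 * DIM)

def fuse(task_rel_emb):
    """Dynamically mix meta-semantics with task-specific relational info."""
    meta = retrieve_meta_semantics(task_rel_emb)
    gate = 1.0 / (1.0 + np.exp(-(w_gate @ np.concatenate([meta, task_rel_emb]))))
    return gate * meta + (1.0 - gate) * task_rel_emb

# Relation embedding, e.g. derived from a few support triples of a new relation
rel_emb = rng.normal(size=DIM)
fused = fuse(rel_emb)
print(fused.shape)  # (8,)
```

The gate makes the mixing task-adaptive: tasks whose relation embedding aligns poorly with the pooled meta-semantics lean more on their own relational information, and vice versa.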
Similar Papers
TransNet: Transfer Knowledge for Few-shot Knowledge Graph Completion
Artificial Intelligence
Helps computers learn new facts with few examples.
Knowledge-Decoupled Synergetic Learning: An MLLM based Collaborative Approach to Few-shot Multimodal Dialogue Intention Recognition
Computation and Language
Helps online shoppers get better help from chatbots.
Multi-modal Knowledge Graph Generation with Semantics-enriched Prompts
Artificial Intelligence
Creates pictures that help computers store knowledge.