Score: 1

Causal-Symbolic Meta-Learning (CSML): Inducing Causal World Models for Few-Shot Generalization

Published: September 15, 2025 | arXiv ID: 2509.12387v1

By: Mohamed Zayaan S

Potential Business Impact:

Teaches computers to learn like humans from few examples.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Modern deep learning models excel at pattern recognition but remain fundamentally limited by their reliance on spurious correlations, leading to poor generalization and a demand for massive datasets. We argue that a key ingredient of human-like intelligence, robust and sample-efficient learning, stems from an understanding of causal mechanisms. In this work, we introduce Causal-Symbolic Meta-Learning (CSML), a novel framework that learns to infer the latent causal structure of a task distribution. CSML comprises three key modules: a perception module that maps raw inputs to disentangled symbolic representations; a differentiable causal induction module that discovers the underlying causal graph governing these symbols; and a graph-based reasoning module that leverages this graph to make predictions. By meta-learning a shared causal world model across a distribution of tasks, CSML can rapidly adapt to novel tasks, including those requiring reasoning about interventions and counterfactuals, from only a handful of examples. We introduce CausalWorld, a new physics-based benchmark designed to test these capabilities. Our experiments show that CSML dramatically outperforms state-of-the-art meta-learning and neuro-symbolic baselines, particularly on tasks demanding true causal inference.
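To make the three-module pipeline concrete, the sketch below wires a perception encoder, a differentiable causal-induction step, and a graph-based reasoning head together in PyTorch. This is a minimal illustration of the architecture described in the abstract, not the authors' implementation: all class names, dimensions, and the soft-adjacency parameterization of the causal graph are assumptions made for this example.

```python
# Illustrative sketch of the CSML pipeline from the abstract.
# Module names, dimensions, and the soft adjacency matrix are assumed details.
import torch
import torch.nn as nn


class PerceptionModule(nn.Module):
    """Maps raw inputs to a set of disentangled symbolic representations."""
    def __init__(self, input_dim, num_symbols, symbol_dim):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 256), nn.ReLU(),
            nn.Linear(256, num_symbols * symbol_dim),
        )
        self.num_symbols, self.symbol_dim = num_symbols, symbol_dim

    def forward(self, x):                           # x: (batch, input_dim)
        z = self.encoder(x)
        return z.view(-1, self.num_symbols, self.symbol_dim)


class CausalInductionModule(nn.Module):
    """Learns a soft adjacency matrix over symbols: a differentiable causal graph."""
    def __init__(self, num_symbols):
        super().__init__()
        self.edge_logits = nn.Parameter(torch.zeros(num_symbols, num_symbols))

    def forward(self):
        adj = torch.sigmoid(self.edge_logits)       # edge probabilities
        return adj * (1 - torch.eye(adj.size(0)))   # mask self-loops


class GraphReasoningModule(nn.Module):
    """One round of message passing over the induced graph, then a prediction head."""
    def __init__(self, symbol_dim, output_dim):
        super().__init__()
        self.message = nn.Linear(symbol_dim, symbol_dim)
        self.head = nn.Linear(symbol_dim, output_dim)

    def forward(self, symbols, adj):                # symbols: (b, n, d), adj: (n, n)
        messages = torch.einsum("ij,bjd->bid", adj, self.message(symbols))
        updated = torch.relu(symbols + messages)
        return self.head(updated.mean(dim=1))       # pool symbols, predict


class CSML(nn.Module):
    def __init__(self, input_dim=64, num_symbols=8, symbol_dim=32, output_dim=1):
        super().__init__()
        self.perception = PerceptionModule(input_dim, num_symbols, symbol_dim)
        self.induction = CausalInductionModule(num_symbols)
        self.reasoning = GraphReasoningModule(symbol_dim, output_dim)

    def forward(self, x):
        symbols = self.perception(x)
        adj = self.induction()                      # shared causal world model
        return self.reasoning(symbols, adj)


if __name__ == "__main__":
    model = CSML()
    pred = model(torch.randn(4, 64))
    print(pred.shape)                               # torch.Size([4, 1])
```

In a meta-learning setup, the induced graph would be shared across tasks while the prediction head adapts from a handful of support examples; the outer loop and adaptation rule are left out here since the abstract does not specify them.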

Country of Origin
🇮🇳 India

Page Count
10 pages

Category
Computer Science:
Machine Learning (CS)