Robust Causal Discovery under Imperfect Structural Constraints
By: Zidong Wang, Xi Lin, Chuchao He, and more
Potential Business Impact:
Finds true causes even with bad clues.
Robust causal discovery from observational data under imperfect prior knowledge remains a significant and largely unresolved challenge. Existing methods typically presuppose perfect priors or handle only specific, pre-identified error types, and their performance degrades substantially when confronted with flawed constraints of unknown location and type. This decline arises because most of them rely on inflexible, biased thresholding strategies that may conflict with the data distribution. To overcome these limitations, we propose to harmonize knowledge and data through prior alignment and conflict resolution. First, we assess the credibility of imperfect structural constraints with a surrogate model, which then guides a sparse penalization term measuring the discrepancy between the learned and constrained adjacency matrices. We theoretically prove that, under an ideal assumption, the knowledge-driven objective aligns with the data-driven objective. Furthermore, to resolve conflicts when this assumption is violated, we introduce a multi-task learning framework optimized via multi-gradient descent, jointly minimizing both objectives. Our method is robust in both linear and nonlinear settings. Extensive experiments under diverse noise conditions and structural equation model types demonstrate its effectiveness and efficiency under imperfect structural constraints.
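To make the two ingredients of the abstract concrete, here is a minimal sketch of (a) a credibility-weighted sparse penalty between a learned adjacency matrix and a prior-constraint matrix, and (b) a two-task min-norm gradient combination in the spirit of multi-gradient descent. This is not the authors' implementation; the function names, the least-squares data-fit term, and the entrywise credibility weights are illustrative assumptions.

```python
import numpy as np

def data_fit_loss(W, X):
    """Illustrative data-driven objective: least-squares fit of a
    linear SEM, X ~ X @ W (one of many possible score functions)."""
    n = X.shape[0]
    resid = X - X @ W
    return 0.5 / n * np.sum(resid ** 2)

def prior_alignment_loss(W, W_prior, credibility):
    """Knowledge-driven objective: sparse (L1) discrepancy between the
    learned adjacency W and the constrained adjacency W_prior, weighted
    entrywise by credibility scores (assumed to come from a surrogate
    model that rates each structural constraint)."""
    return np.sum(credibility * np.abs(W - W_prior))

def mgda_weights(g1, g2):
    """Two-task multi-gradient descent step: choose alpha in [0, 1]
    minimizing ||alpha * g1 + (1 - alpha) * g2||^2, whose closed form is
    alpha = clip(<g2 - g1, g2> / ||g1 - g2||^2, 0, 1)."""
    diff = (g1 - g2).ravel()
    denom = diff @ diff + 1e-12
    alpha = np.clip((g2 - g1).ravel() @ g2.ravel() / denom, 0.0, 1.0)
    return alpha, 1.0 - alpha
```

A joint update would then descend along `alpha * grad(data_fit_loss) + (1 - alpha) * grad(prior_alignment_loss)`, so that when the ideal alignment assumption holds the two gradients agree, and when it is violated the min-norm weights trade them off instead of letting a fixed threshold override the data.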
Similar Papers
The Robustness of Differentiable Causal Discovery in Misspecified Scenarios
Machine Learning (CS)
Makes computers understand cause and effect better.
Linear Causal Discovery with Interventional Constraints
Machine Learning (CS)
Teaches computers to understand cause and effect.
Differentiable Constraint-Based Causal Discovery
Machine Learning (CS)
Finds causes even with little information.