Meta-Dependence in Conditional Independence Testing
By: Bijan Mazaheri, Jiaqi Zhang, Caroline Uhler
Potential Business Impact:
Makes tools that find hidden causes more reliable by flagging when several of their statistical checks can fail together.
Constraint-based causal discovery algorithms use many statistical tests for conditional independence to uncover networks of causal dependencies. These approaches rely on an assumed correspondence between the graphical properties of a causal structure and the conditional independence properties of the observed variables, known as the causal Markov condition and faithfulness. Finite data yields only an empirical distribution that is "close" to the actual distribution. Across these many possible empirical distributions, the correspondence to the graphical properties can break down for different conditional independencies, and multiple violations can occur at the same time. We study this "meta-dependence" between conditional independence properties using the following geometric intuition: each conditional independence property constrains the space of possible joint distributions to a manifold. The "meta-dependence" between conditional independencies is informed by the position of these manifolds relative to the true probability distribution. We provide a simple-to-compute measure of this meta-dependence using information projections and consolidate our findings empirically on both synthetic and real-world data.
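To make the geometric picture concrete, here is a minimal sketch (not the paper's implementation) for discrete variables: the manifold of joint distributions satisfying X ⊥ Y | Z contains the factorized distribution Q(x, y, z) = P(z) P(x | z) P(y | z), and the KL divergence D(P ‖ Q) to this point equals the conditional mutual information I(X; Y | Z), giving a simple "distance to the conditional-independence manifold." The function names and the toy distribution below are illustrative assumptions; the paper's precise projection direction and measure may differ.

import numpy as np

def project_onto_ci_manifold(p_xyz):
    """Factorized distribution Q(x, y, z) = P(z) P(x|z) P(y|z), which satisfies
    X independent of Y given Z and minimizes D(P || Q) over that manifold."""
    p_z = p_xyz.sum(axis=(0, 1))          # P(z), shape (Z,)
    p_xz = p_xyz.sum(axis=1)              # P(x, z), shape (X, Z)
    p_yz = p_xyz.sum(axis=0)              # P(y, z), shape (Y, Z)
    # Q(x, y, z) = P(x, z) P(y, z) / P(z); guard against zero-probability z.
    with np.errstate(divide="ignore", invalid="ignore"):
        q = p_xz[:, None, :] * p_yz[None, :, :] / p_z[None, None, :]
    return np.nan_to_num(q)

def kl_divergence(p, q):
    """D(P || Q) in nats, restricted to cells where P > 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Toy example: a random strictly positive joint pmf over binary X, Y, Z.
rng = np.random.default_rng(0)
p = rng.random((2, 2, 2))
p /= p.sum()

q = project_onto_ci_manifold(p)
print("Distance to the {X ⊥ Y | Z} manifold (= I(X; Y | Z)):", kl_divergence(p, q))

Because this distance equals a conditional mutual information, how far an empirical distribution sits from one conditional-independence manifold is tied to its position relative to the others, which is the geometric source of the meta-dependence the paper studies.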
Similar Papers
Conditional independence testing with a single realization of a multivariate nonstationary nonlinear time series
Methodology
Finds hidden patterns in changing data.
On the Hardness of Conditional Independence Testing In Practice
Machine Learning (Stat)
Finds why computer tests for fairness sometimes fail.
Association and Independence Test for Random Objects
Methodology
Finds hidden connections in complex data.