Entropic Causal Inference: Graph Identifiability
By: Spencer Compton, Kristjan Greenewald, Dmitriy Katz, and more
Potential Business Impact:
Finds cause and effect from data.
Entropic causal inference is a recent framework for learning the causal graph between two variables from observational data by finding the information-theoretically simplest structural explanation of the data, i.e., the model with the smallest entropy. In our work, we first extend the causal graph identifiability result in the two-variable setting to hold under relaxed assumptions. We then show the first identifiability result using the entropic approach for learning causal graphs with more than two nodes. Our approach utilizes the property that ancestrality between a source node and its descendants can be determined using bivariate entropic tests. We provide a sound sequential peeling algorithm for general graphs that relies on this property. We also propose a heuristic algorithm for small graphs that shows strong empirical performance. We rigorously evaluate the performance of our algorithms on synthetic data generated from a variety of models, observing improvement over prior work. Finally, we test our algorithms on real-world datasets.
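To make the idea concrete, here is a minimal, illustrative Python sketch (not the authors' implementation) of a bivariate entropic test and a toy peeling loop built on it. It assumes discrete data encoded as nonnegative integers and approximates the minimum-entropy exogenous variable with a simple greedy coupling heuristic; the function names (greedy_min_entropy_coupling, entropic_score, entropic_direction, peel_order) are hypothetical.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def greedy_min_entropy_coupling(conditionals):
    """Greedy approximation of a low-entropy exogenous variable E that can
    generate all of the conditional distributions p(Y|X=x) simultaneously.
    Returns the (approximate) distribution of E."""
    P = [p.astype(float).copy() for p in conditionals]
    masses, remaining = [], 1.0
    while remaining > 1e-12:
        # The smallest of the per-conditional maxima can be served by one new state of E.
        r = min(p.max() for p in P)
        if r <= 0:
            break
        masses.append(r)
        for p in P:
            p[p.argmax()] -= r
        remaining -= r
    return np.array(masses)

def entropic_score(x, y):
    """Score of the model x -> y: H(X) + H(E), with E approximated greedily."""
    xs = np.unique(x)
    px = np.array([(x == v).mean() for v in xs])
    cond = []
    for v in xs:
        ys, counts = np.unique(y[x == v], return_counts=True)
        full = np.zeros(int(y.max()) + 1)
        full[ys] = counts / counts.sum()
        cond.append(full)
    return entropy(px) + entropy(greedy_min_entropy_coupling(cond))

def entropic_direction(x, y):
    """Return +1 if the lower-entropy (simpler) model is x -> y, else -1."""
    return 1 if entropic_score(x, y) < entropic_score(y, x) else -1

def peel_order(data):
    """Toy sequential peeling: repeatedly pick a node that the bivariate test
    declares an ancestor of every remaining node, then remove (peel) it."""
    remaining = list(range(data.shape[1]))
    order = []
    while len(remaining) > 1:
        for i in remaining:
            if all(entropic_direction(data[:, i], data[:, j]) == 1
                   for j in remaining if j != i):
                order.append(i)
                remaining.remove(i)
                break
        else:
            break  # no clear source; a sound implementation handles this case explicitly
    order.extend(remaining)
    return order
```

On data with a clear source variable, peel_order (called on an array of shape n_samples x n_nodes) should tend to place that variable first; the paper's sound peeling procedure additionally handles ties and the conditions under which the bivariate test is guaranteed to be correct, which this sketch omits.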
Similar Papers
Causal Identification in Time Series Models
Machine Learning (CS)
Finds hidden causes in changing data.
Causal representation learning from network data
Machine Learning (CS)
Finds hidden causes in complex systems.
Meta-Dependence in Conditional Independence Testing
Machine Learning (CS)
Finds hidden causes by checking how things relate.