Sample Complexity of Nonparametric Closeness Testing for Continuous Distributions and Its Application to Causal Discovery with Hidden Confounding
By: Fateme Jamshidi, Sina Akbari, Negar Kiyavash
Potential Business Impact:
Identifies cause-and-effect relationships in complex data, even when hidden factors influence both variables.
We study the problem of closeness testing for continuous distributions and its implications for causal discovery. Specifically, we analyze the sample complexity of distinguishing whether two multidimensional continuous distributions are identical or differ by at least $\epsilon$ in Kullback-Leibler (KL) divergence under nonparametric assumptions. To this end, we propose a KL divergence estimator based on the von Mises expansion; the resulting closeness test attains optimal parametric rates under smoothness assumptions. Using this test as a building block, we develop a causal discovery algorithm that identifies the causal structure between two multidimensional random variables, and we establish sample complexity guarantees for it. To the best of our knowledge, this is the first work to provide sample complexity guarantees for distinguishing cause and effect in multidimensional nonlinear models with non-Gaussian continuous variables in the presence of unobserved confounding.
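For readers unfamiliar with the technique, the following is a minimal sketch of the standard first-order von Mises (functional Taylor) expansion of the KL divergence around pilot density estimates $\hat{p}$ and $\hat{q}$; it illustrates the general construction behind such estimators and is not claimed to reproduce the paper's exact estimator or correction terms.

% First-order von Mises (functional Taylor) expansion of
%   T(p, q) = KL(p || q) = \int p \log(p/q)\, dx
% around pilot density estimates (\hat{p}, \hat{q}). The remainder R_2 is
% second order in the errors (p - \hat{p}) and (q - \hat{q}).
\[
\mathrm{KL}(p \,\|\, q)
  = \mathrm{KL}(\hat{p} \,\|\, \hat{q})
  + \int \Bigl( \log \frac{\hat{p}(x)}{\hat{q}(x)} + 1 \Bigr)
      \bigl( p(x) - \hat{p}(x) \bigr)\, dx
  - \int \frac{\hat{p}(x)}{\hat{q}(x)}
      \bigl( q(x) - \hat{q}(x) \bigr)\, dx
  + R_2 .
\]
% The "+1" inside the first integral contributes nothing, since p and
% \hat{p} both integrate to one. The two linear terms are expectations
% under p and q, so they can be estimated at rate n^{-1/2} by sample
% means over held-out data.

Correcting the plug-in value $\mathrm{KL}(\hat{p}\,\|\,\hat{q})$ with sample-mean estimates of these linear terms is the usual route by which such estimators reach parametric rates when the densities are sufficiently smooth.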
Similar Papers
Distribution Testing in the Presence of Arbitrarily Dominant Noise with Verification Queries
Data Structures and Algorithms
Finds hidden patterns in messy data faster.
Efficient and Stable Multi-Dimensional Kolmogorov-Smirnov Distance
Computation
Determines whether two sets of data are different.
Replicable Distribution Testing
Machine Learning (CS)
Tests if data groups are truly different.