Sample Complexity of Nonparametric Closeness Testing for Continuous Distributions and Its Application to Causal Discovery with Hidden Confounding

Published: March 10, 2025 | arXiv ID: 2503.07475v1

By: Fateme Jamshidi, Sina Akbari, Negar Kiyavash

Potential Business Impact:

Finds cause and effect in complex data.

Business Areas:
A/B Testing Data and Analytics

We study the problem of closeness testing for continuous distributions and its implications for causal discovery. Specifically, we analyze the sample complexity of distinguishing whether two multidimensional continuous distributions are identical or differ by at least $\epsilon$ in terms of Kullback-Leibler (KL) divergence under non-parametric assumptions. To this end, we propose an estimator of KL divergence based on the von Mises expansion. Our closeness test attains optimal parametric rates under smoothness assumptions. This test serves as the building block of our causal discovery algorithm for identifying the causal structure between two multidimensional random variables, and we establish sample complexity guarantees for the resulting method. To the best of our knowledge, this is the first work that provides sample complexity guarantees for distinguishing cause and effect in multidimensional non-linear models with non-Gaussian continuous variables in the presence of unobserved confounding.
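To make the closeness-testing setup concrete, here is a minimal sketch of the general idea: estimate KL divergence nonparametrically from samples and declare the distributions different when the estimate exceeds a threshold tied to $\epsilon$. This uses a generic k-nearest-neighbor plug-in estimator, not the paper's von Mises expansion estimator, and the threshold `eps / 2` is a hypothetical choice, not the paper's calibrated test.

```python
import numpy as np

def knn_kl_estimate(x, y, k=5):
    """Generic k-NN estimate of KL(P || Q) from samples x ~ P, y ~ Q.
    Illustrative only -- NOT the paper's von Mises expansion estimator."""
    n, d = x.shape
    m = y.shape[0]

    def kth_dist(points, queries, kk):
        # Brute-force distance from each query to its kk-th nearest point.
        d2 = ((queries[:, None, :] - points[None, :, :]) ** 2).sum(-1)
        return np.sqrt(np.sort(d2, axis=1)[:, kk - 1])

    rho = kth_dist(x, x, k + 1)  # k+1 to skip each point's zero self-distance
    nu = kth_dist(y, x, k)       # distance from x-points into the y-sample
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

def closeness_test(x, y, eps, k=5):
    """Reject 'P == Q' when the estimated KL exceeds eps / 2
    (a hypothetical threshold for illustration)."""
    return knn_kl_estimate(x, y, k) > eps / 2

# Usage: same distribution vs. a shifted one.
rng = np.random.default_rng(0)
p1 = rng.normal(0.0, 1.0, size=(200, 1))
p2 = rng.normal(0.0, 1.0, size=(200, 1))
q = rng.normal(2.0, 1.0, size=(200, 1))  # true KL(N(0,1) || N(2,1)) = 2
same_est = knn_kl_estimate(p1, p2)
diff_est = knn_kl_estimate(p1, q)
```

The estimate for two samples from the same distribution concentrates near zero, while the shifted pair yields an estimate near the true KL of 2, so the thresholded test separates the two cases.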

Country of Origin
🇨🇭 Switzerland

Page Count
22 pages

Category
Computer Science:
Machine Learning (CS)