Non-Asymptotic Error Bounds for Causally Conditioned Directed Information Rates of Gaussian Sequences

Published: December 6, 2025 | arXiv ID: 2512.06238v1

By: Yuping Zheng, Andrew Lamperski

Potential Business Impact:

Quantifies how strongly one data stream causally influences another, with explicit guarantees on estimation accuracy from a finite amount of noisy data.

Business Areas:
A/B Testing; Data and Analytics

Directed information and its causally conditioned variations are often used to measure causal influences between random processes. In practice, these quantities must be estimated from data. Non-asymptotic error bounds for such estimates are known for sequences over finite alphabets, but less is known for real-valued data. This paper examines the case in which the data are sequences of Gaussian vectors. We provide an explicit formula for the causally conditioned directed information rate based on optimal prediction and define an estimator based on this formula. We show that our estimator achieves an error of order $O\left(N^{-1/2}\log(N)\right)$ with high probability, where $N$ is the total sample size.
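To illustrate the general idea behind prediction-based estimation (not the paper's exact estimator), the sketch below uses the standard fact that for jointly Gaussian stationary processes the directed information rate can be expressed through one-step prediction error variances: comparing how well $Y$ is predicted from its own past versus from its own past plus the past of $X$. All process parameters, the AR(1) coupling, and the helper `residual_var` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two coupled scalar Gaussian AR(1) processes where X drives Y
# (coefficients chosen arbitrarily for illustration).
N = 20000
x = np.zeros(N)
y = np.zeros(N)
for t in range(1, N):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.4 * y[t - 1] + 0.6 * x[t - 1] + rng.normal()

def residual_var(target, regressors):
    """Least-squares residual variance of target given regressor columns."""
    A = np.column_stack(regressors)
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    resid = target - A @ coef
    return np.mean(resid ** 2)

# One-step prediction error variance of Y from its own past only,
# and from its own past plus the past of X.
v_self = residual_var(y[1:], [y[:-1], np.ones(N - 1)])
v_full = residual_var(y[1:], [y[:-1], x[:-1], np.ones(N - 1)])

# Plug-in estimate of the directed information rate I(X -> Y), in nats
# per step: half the log-ratio of the two prediction error variances.
di_rate = 0.5 * np.log(v_self / v_full)
print(f"estimated directed information rate: {di_rate:.3f} nats/step")
```

Because $X$ genuinely drives $Y$ here, the estimate is strictly positive; reversing the roles of `x` and `y` would drive it toward zero. The paper's result says that, for Gaussian sequences, such plug-in estimates concentrate around the true rate at speed $O(N^{-1/2}\log N)$.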

Country of Origin
🇺🇸 United States

Page Count
8 pages

Category
Computer Science:
Information Theory