Non-Asymptotic Error Bounds for Causally Conditioned Directed Information Rates of Gaussian Sequences
By: Yuping Zheng, Andrew Lamperski
Potential Business Impact:
Measures how one random process causally influences another, even from noisy, real-valued data.
Directed information and its causally conditioned variants are often used to measure causal influences between random processes. In practice, these quantities must be estimated from data. Non-asymptotic error bounds for such estimates are known for sequences over finite alphabets, but far less is known for real-valued data. This paper examines the case in which the data are sequences of Gaussian vectors. We provide an explicit formula for the causally conditioned directed information rate based on optimal prediction and define an estimator based on this formula. We show that our estimator achieves an error of order $O\left(N^{-1/2}\log(N)\right)$ with high probability, where $N$ is the total sample size.
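For jointly Gaussian processes, conditional mutual information reduces to a log-ratio of optimal prediction-error covariance determinants, which is what makes a prediction-based estimator natural. The sketch below illustrates that idea with finite-order linear least-squares predictors. It is a minimal sketch only: the helper names (build_lagged, residual_cov, cc_directed_info_rate), the fixed lag order p, and the toy example are illustrative assumptions, not the paper's exact construction, which is built from optimal prediction.

```python
import numpy as np

def build_lagged(seq, p, t):
    """Stack the p samples preceding time t into one regressor vector."""
    return np.concatenate([seq[t - k] for k in range(1, p + 1)])

def residual_cov(targets, regressors):
    """Sample covariance of least-squares prediction residuals."""
    coef, *_ = np.linalg.lstsq(regressors, targets, rcond=None)
    resid = targets - regressors @ coef
    return resid.T @ resid / len(targets)

def cc_directed_info_rate(X, Y, Z, p=5):
    """Plug-in estimate of the causally conditioned directed information
    rate I(X -> Y || Z) for jointly Gaussian vector sequences, using
    finite-order linear predictors (a sketch, not the paper's estimator)."""
    N = len(Y)
    ts = range(p, N)
    targets = np.stack([Y[t] for t in ts])
    # Regressors conditioning on past Y and causally on Z (past and present).
    R0 = np.stack([np.concatenate([build_lagged(Y, p, t),
                                   build_lagged(Z, p, t), Z[t]])
                   for t in ts])
    # Same regressors, augmented with past and present X.
    R1 = np.stack([np.concatenate([R0[i], build_lagged(X, p, t), X[t]])
                   for i, t in enumerate(ts)])
    S0 = residual_cov(targets, R0)
    S1 = residual_cov(targets, R1)
    # Gaussian identity: the rate is half the log-ratio of the
    # prediction-error covariance determinants.
    return 0.5 * (np.linalg.slogdet(S0)[1] - np.linalg.slogdet(S1)[1])

# Toy check: X drives Y with a one-step lag, Z is an extra observed input.
rng = np.random.default_rng(0)
N = 5000
X = rng.normal(size=(N, 1))
Z = rng.normal(size=(N, 1))
Y = np.zeros((N, 1))
for t in range(1, N):
    Y[t] = 0.8 * X[t - 1] + 0.3 * Z[t - 1] + 0.1 * rng.normal(size=1)
print(cc_directed_info_rate(X, Y, Z))  # positive: X causally influences Y
```

In this toy run the estimate should be clearly positive, since removing X from the predictor visibly inflates the one-step prediction error of Y.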
Similar Papers
Convergence Rates for Realizations of Gaussian Random Variables
Statistics Theory
Helps computers learn from less data.
Time-series Random Process Complexity Ranking Using a Bound on Conditional Differential Entropy
Signal Processing
Ranks how complex data changes over time.