Transfer entropy for finite data
By: Alec Kirkley
Potential Business Impact:
Detects which data streams drive others, even with limited data.
Transfer entropy is a widely used measure for quantifying directed information flows in complex systems. While the challenges of estimating transfer entropy for continuous data are well known, it has two major shortcomings that persist even for data of finite cardinality: it exhibits a substantial positive bias for sparse bin counts, and it has no clear means to assess statistical significance. By more precisely accounting for information content in finite data streams, we derive a transfer entropy measure which is asymptotically equivalent to the standard plug-in estimator but remedies these issues for time series of small size and/or high cardinality, permitting a fully nonparametric assessment of statistical significance without simulation. We show that this correction for finite data has a substantial impact on results in both real and synthetic time series datasets.
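To make the comparison in the abstract concrete, below is a minimal sketch of the standard "plug-in" transfer entropy estimator for discrete time series, the baseline to which the paper's corrected measure is asymptotically equivalent. This is not the paper's estimator: the function name plugin_transfer_entropy, the history length of 1, and the toy data are illustrative assumptions. The small positive value on independent series in the usage example is the finite-data bias the abstract describes.

```python
# Sketch of the standard plug-in transfer entropy for discrete series.
# Names and the history length of 1 are illustrative, not from the paper.
import numpy as np
from collections import Counter

def plugin_transfer_entropy(x, y, base=2):
    """Plug-in estimate of T_{Y->X} with history length 1:
    T = sum_{x_{t+1}, x_t, y_t} p(x_{t+1}, x_t, y_t)
        * log[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ].
    """
    x, y = np.asarray(x), np.asarray(y)
    n = len(x) - 1  # number of (x_{t+1}, x_t, y_t) triples

    # Empirical bin counts of the joint and marginal configurations.
    triple = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_{t+1}, x_t, y_t)
    pair_xy = Counter(zip(x[:-1], y[:-1]))         # (x_t, y_t)
    pair_xx = Counter(zip(x[1:], x[:-1]))          # (x_{t+1}, x_t)
    single = Counter(x[:-1])                       # x_t

    te = 0.0
    for (x1, x0, y0), c in triple.items():
        p_joint = c / n
        # Conditional probabilities formed directly from raw counts
        # ("plugging in" the empirical distribution).
        p_cond_xy = c / pair_xy[(x0, y0)]
        p_cond_x = pair_xx[(x1, x0)] / single[x0]
        te += p_joint * np.log(p_cond_xy / p_cond_x)
    return te / np.log(base)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, size=2000)
    x = np.roll(y, 1)                 # x copies y with a one-step lag
    x[0] = rng.integers(0, 2)
    print(plugin_transfer_entropy(x, y))  # ~1 bit: strong Y -> X flow
    print(plugin_transfer_entropy(y, x))  # near 0, plus a small positive finite-size bias
```

Shrinking the sample size or increasing the number of symbols inflates the second value, which illustrates the sparse-bin bias that the paper's finite-data correction is designed to remove.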
Similar Papers
TENDE: Transfer Entropy Neural Diffusion Estimation
Machine Learning (CS)
Finds how information moves between things.
Entropic transfer operators for stochastic systems
Dynamical Systems
Helps understand complex moving things from data.
Information entropy of complex probability
Information Theory
Makes math better for tricky, uncertain problems.