Structural Dimension Reduction in Bayesian Networks
By: Pei Heng, Yi Sun, Jianhua Guo
Potential Business Impact:
Makes smart computer guesses faster and smaller.
This work introduces a novel technique, named structural dimension reduction, to collapse a Bayesian network onto a minimal, localized one while ensuring that probabilistic inferences in the original and reduced networks remain consistent. To this end, we propose a new combinatorial structure in directed acyclic graphs called the directed convex hull, which turns out to be equivalent to the minimal localized Bayesian network. An efficient polynomial-time algorithm is devised to identify these reduced networks by determining the unique directed convex hull containing the variables of interest in the original network. Experiments demonstrate that the proposed technique achieves high dimension reduction on real networks, and that the efficiency of probabilistic inference based on directed convex hulls is significantly improved over traditional methods such as the variable elimination and belief propagation algorithms. The code for this study is available at \href{https://github.com/Balance-H/Algorithms}{https://github.com/Balance-H/Algorithms}, and the proofs of the results in the main body are deferred to the appendix.
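The paper's directed convex hull construction is not reproduced here, but the general idea of localizing inference to a query-relevant subnetwork can be sketched with classical ancestral pruning, which drops "barren" nodes that cannot affect a query. This is a well-known simplification step, not the authors' algorithm; the function name and the DAG-as-parent-dict representation are illustrative choices:

```python
from collections import deque

def ancestral_prune(parents, query, evidence=()):
    """Restrict a DAG (node -> list of parent nodes) to the ancestral
    set of the query and evidence variables.  Nodes outside this set
    are barren: they cannot influence P(query | evidence)."""
    keep = set(query) | set(evidence)
    frontier = deque(keep)
    while frontier:
        node = frontier.popleft()
        for parent in parents.get(node, []):
            if parent not in keep:
                keep.add(parent)
                frontier.append(parent)
    # Induced subgraph on the retained nodes.
    return {node: list(parents.get(node, [])) for node in keep}

# Toy network: A -> B -> C and A -> D; D is barren for a query on C.
dag = {"A": [], "B": ["A"], "C": ["B"], "D": ["A"]}
reduced = ancestral_prune(dag, query=["C"])
print(sorted(reduced))  # prints ['A', 'B', 'C'] -- D is dropped
```

The paper's reduction goes further than this, producing a provably minimal localized network, but the sketch conveys why shrinking the graph before running variable elimination or belief propagation speeds up inference.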
Similar Papers
How to Marginalize in Causal Structure Learning?
Machine Learning (CS)
Finds hidden patterns in data faster.
A general framework for adaptive nonparametric dimensionality reduction
Machine Learning (Stat)
Finds best way to show complex data simply.
A Convex-Inspired Neural Construction for Structured and Generalizable Nonlinear Model Reduction
Graphics
Makes computer simulations of wobbly things stable.