Graphon-Level Bayesian Predictive Synthesis for Random Networks
By: Marios Papamichalis, Regina Ruane
Bayesian predictive synthesis provides a coherent Bayesian framework for combining multiple predictive distributions, or agents, into a single updated prediction, extending Bayesian model averaging to allow general pooling of full predictive densities. This paper develops a static, graphon-level version of Bayesian predictive synthesis for random networks. At the graphon level, we show that Bayesian predictive synthesis corresponds to the integrated-squared-error projection of the true graphon onto the linear span of the agent graphons. We derive nonasymptotic oracle inequalities and prove that least-squares graphon-BPS, based on a finite number of edge observations, achieves the minimax L^2 rate over this agent span. Moreover, we show that any estimator that selects a single agent graphon is uniformly inconsistent on a nontrivial subset of the convex hull of the agents, whereas graphon-level Bayesian predictive synthesis remains minimax-rate optimal, formalizing a "combination beats components" phenomenon. Structural properties of the underlying random graphs are controlled through explicit Lipschitz bounds that transfer graphon error into error for edge density, degree distributions, subgraph densities, clustering coefficients, and giant-component phase transitions. Finally, we develop a heavy-tail theory for Bayesian predictive synthesis, showing how mixtures and entropic tilts preserve regularly varying degree distributions and how exponential random graph model agents remain within their family under log-linear tilting with Kullback-Leibler-optimal moment calibration.
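The abstract's central object, least-squares graphon-BPS, can be illustrated with a minimal sketch: regress observed edge indicators on the agent graphons' values to recover the synthesis weights. The agent graphons below are illustrative assumptions (not from the paper), and, for simplicity only, the latent positions are treated as observed; the paper itself works from edge observations alone.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical agent graphons on [0,1]^2 (illustrative choices, not the paper's)
agents = [
    lambda u, v: 0.5 * np.ones_like(u),  # constant (Erdos-Renyi-type) agent
    lambda u, v: u * v,                  # multiplicative agent
    lambda u, v: np.abs(u - v),          # distance-based agent
]

# True graphon lies in the convex hull of the agents, so synthesis can match it
true_w = np.array([0.2, 0.5, 0.3])

# Sample one graph: latent positions, edge probabilities, symmetric adjacency
n = 400
u = rng.uniform(size=n)
U, V = np.meshgrid(u, u, indexing="ij")
P = sum(w * f(U, V) for w, f in zip(true_w, agents))
A = np.triu((rng.uniform(size=(n, n)) < P).astype(float), 1)
A = A + A.T  # undirected, no self-loops

# Least-squares graphon-BPS: project observed edges onto the agent span,
# assuming (for this sketch) the latent positions are known
iu = np.triu_indices(n, 1)
X = np.column_stack([f(U, V)[iu] for f in agents])
y = A[iu]
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
# w_hat estimates true_w; the abstract's oracle inequalities bound this error
```

Because the true graphon sits in the agent span, the recovered weights concentrate around `true_w` as the number of observed edges grows, whereas any rule forced to pick a single agent cannot match a genuine mixture.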
Similar Papers
Generalized Bayesian Inference for Dynamic Random Dot Product Graphs
Methodology
Predicts future connections in changing groups.
A Neuro-Symbolic Approach for Probabilistic Reasoning on Graph Data
Artificial Intelligence
Lets computers learn and reason about connected data.
PAC-Bayesian Generalization Bounds for Graph Convolutional Networks on Inductive Node Classification
Machine Learning (CS)
Helps computers learn from changing online connections.