PAC-Bayesian risk bounds for fully connected deep neural networks with Gaussian priors
By: The Tien Mai
Potential Business Impact:
Shows that standard, fully connected Bayesian deep networks come with provable accuracy guarantees, so practitioners can rely on simpler architectures without imposing sparsity.
Deep neural networks (DNNs) have emerged as a powerful methodology with significant practical successes in fields such as computer vision and natural language processing. Recent work has demonstrated that sparsely connected DNNs with carefully designed architectures can achieve minimax estimation rates under classical smoothness assumptions. However, subsequent studies revealed that simple fully connected DNNs can achieve comparable convergence rates, challenging the necessity of sparsity. Theoretical advances in Bayesian neural networks (BNNs) have been more fragmented: much of that work has concentrated on sparse networks, leaving the theoretical properties of fully connected BNNs underexplored. In this paper, we address this gap by investigating fully connected Bayesian DNNs with Gaussian priors using PAC-Bayes bounds. We establish upper bounds on the prediction risk of a probabilistic deep neural network method, showing that these bounds match, up to logarithmic factors, the minimax-optimal rates in Besov spaces for both nonparametric regression and binary classification with logistic loss. Importantly, our results hold for a broad class of practical activation functions that are Lipschitz continuous.
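To make the flavor of such results concrete, the following is a generic McAllester-type PAC-Bayes bound for a loss taking values in [0, 1]. It is illustrative only, not the paper's exact bound, which is sharper and matched to minimax rates in Besov spaces. Here R is the true risk, R̂_n the empirical risk on n i.i.d. samples, π the (Gaussian) prior, and ρ any posterior:

```latex
% Generic McAllester-type PAC-Bayes bound (illustrative; the paper's own
% bounds are sharper and tuned to Besov-space rates).
% For a loss in [0,1] and any \delta \in (0,1), with probability at least
% 1 - \delta over the sample, simultaneously for all posteriors \rho:
\[
  \mathbb{E}_{\theta \sim \rho}\, R(\theta)
  \;\le\;
  \mathbb{E}_{\theta \sim \rho}\, \widehat{R}_n(\theta)
  + \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln\!\left(2\sqrt{n}/\delta\right)}{2n}} .
\]
```

Below is a minimal NumPy sketch of the objects this bound involves, assuming a hypothetical fully connected architecture, an isotropic Gaussian prior, and a mean-field Gaussian posterior (a common practical choice; the paper's posterior construction may differ):

```python
import numpy as np

rng = np.random.default_rng(0)

# Fully connected network: hypothetical layer widths, Gaussian prior
# N(0, sigma_p^2) on every weight, mean-field Gaussian posterior
# N(mu_i, sigma_i^2) per weight (an illustrative choice, not the paper's).
widths = [4, 16, 16, 1]
sigma_p = 1.0                     # prior scale (a tuning choice in practice)

# Variational parameters: one (mu, log-std) pair per weight matrix and bias.
params = []
for d_in, d_out in zip(widths[:-1], widths[1:]):
    params.append({
        "mu_W": 0.01 * rng.standard_normal((d_in, d_out)),
        "ls_W": np.full((d_in, d_out), -3.0),
        "mu_b": np.zeros(d_out),
        "ls_b": np.full(d_out, -3.0),
    })

def kl_to_prior(params, sigma_p):
    """Closed-form KL(q || pi) between the mean-field Gaussian posterior and
    the isotropic Gaussian prior, summed over all weights and biases."""
    kl = 0.0
    for layer in params:
        for mu, ls in ((layer["mu_W"], layer["ls_W"]),
                       (layer["mu_b"], layer["ls_b"])):
            var_q = np.exp(2.0 * ls)
            kl += 0.5 * np.sum(var_q / sigma_p**2 + mu**2 / sigma_p**2
                               - 1.0 - 2.0 * ls + 2.0 * np.log(sigma_p))
    return kl

def forward(x, params, rng):
    """One stochastic forward pass: sample weights from the posterior, then
    apply the fully connected layers with a Lipschitz activation (ReLU)."""
    h = x
    for i, layer in enumerate(params):
        W = layer["mu_W"] + np.exp(layer["ls_W"]) * rng.standard_normal(layer["mu_W"].shape)
        b = layer["mu_b"] + np.exp(layer["ls_b"]) * rng.standard_normal(layer["mu_b"].shape)
        h = h @ W + b
        if i < len(params) - 1:
            h = np.maximum(h, 0.0)   # ReLU is 1-Lipschitz, as the theory allows
    return h

def mcallester_bound(emp_risk, kl, n, delta=0.05):
    """Generic McAllester-type PAC-Bayes bound for a [0,1]-valued loss
    (illustrative only; not the paper's rate-optimal bound)."""
    return emp_risk + np.sqrt((kl + np.log(2.0 * np.sqrt(n) / delta)) / (2.0 * n))

# Example with a hypothetical empirical risk on n = 10_000 samples.
kl = kl_to_prior(params, sigma_p)
print(mcallester_bound(emp_risk=0.12, kl=kl, n=10_000))
```

The KL term is available in closed form precisely because both prior and posterior are Gaussian, which is one practical appeal of Gaussian priors in this setting.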
Similar Papers
Some theoretical improvements on the tightness of PAC-Bayes risk certificates for neural networks
Machine Learning (CS)
Tightens the mathematical certificates that guarantee how well neural networks will perform on unseen data.
PAC-Bayesian Generalization Bounds for Graph Convolutional Networks on Inductive Node Classification
Machine Learning (CS)
Gives generalization guarantees for graph neural networks classifying nodes in graphs not seen during training.
Non-vacuous Generalization Bounds for Deep Neural Networks without any modification to the trained models
Machine Learning (CS)
Provides meaningful performance guarantees for trained deep networks without retraining or modifying them.