Empirical Bayes for Data Integration
By: Paul Rognon-Vael, David Rossell
Potential Business Impact:
Improves learning from earlier studies when only incomplete data, such as summaries or lists of relevant features, are available.
We discuss the use of empirical Bayes for data integration, in the sense of transfer learning. Our main interest is in settings where one wishes to learn structure (e.g., feature selection) but only has access to incomplete data from previous studies, such as summaries, estimates, or lists of relevant features. We outline the differences between full Bayes and empirical Bayes, and develop a computational framework for the latter. We also discuss how empirical Bayes attains consistent variable selection under weaker conditions (sparsity and beta-min assumptions) than full Bayes and other standard criteria do, and how it attains faster convergence rates. Our high-dimensional regression examples show that fully Bayesian inference enjoys excellent properties, and that data integration with empirical Bayes can offer moderate yet meaningful improvements in practice.
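To make the empirical Bayes step concrete, here is a minimal toy sketch, not the authors' computational framework: in a spike-and-slab normal-means model, the prior inclusion probability theta is estimated by maximizing the marginal likelihood of the observed effects, and features are then selected via posterior inclusion probabilities. The model choice, the fixed slab variance tau2, the unit noise scale, and all variable names are illustrative assumptions.

```python
# Toy empirical Bayes variable selection in a normal-means model
# (illustrative assumptions, not the paper's framework):
#   beta_i ~ theta * N(0, tau2) + (1 - theta) * delta_0,   y_i ~ N(beta_i, 1)
# The empirical Bayes step plugs in the marginal-likelihood maximizer of theta.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(0)
p, s, tau2 = 1000, 20, 9.0             # dimension, true signals, slab variance
beta = np.zeros(p)
beta[:s] = rng.normal(0.0, np.sqrt(tau2), s)
y = beta + rng.normal(size=p)          # observed effect estimates, unit noise

def neg_marginal_loglik(theta):
    # Marginal of y_i: mixture of N(0, 1 + tau2) (slab) and N(0, 1) (spike).
    slab = norm.pdf(y, scale=np.sqrt(1.0 + tau2))
    spike = norm.pdf(y, scale=1.0)
    return -np.sum(np.log(theta * slab + (1.0 - theta) * spike))

# Empirical Bayes: estimate the prior inclusion probability from the data.
theta_hat = minimize_scalar(neg_marginal_loglik, bounds=(1e-6, 1 - 1e-6),
                            method="bounded").x

# Posterior inclusion probabilities under the fitted prior; select at 0.5.
slab = norm.pdf(y, scale=np.sqrt(1.0 + tau2))
spike = norm.pdf(y, scale=1.0)
pip = theta_hat * slab / (theta_hat * slab + (1.0 - theta_hat) * spike)
selected = pip > 0.5

print(f"estimated theta: {theta_hat:.4f} (true signal fraction {s / p})")
print(f"selected {selected.sum()} features; "
      f"{selected[:s].sum()} of {s} true signals recovered")
```

Under a sparse truth, theta_hat typically lands near the true signal fraction, which is the sense in which the data themselves calibrate the prior; in the paper's setting, summaries or feature lists from previous studies would enter this calibration step as well.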
Similar Papers
The Bayesian Way: Uncertainty, Learning, and Statistical Reasoning
Methodology
Teaches computers to learn from past information.
On the Hierarchical Bayes justification of Empirical Bayes Confidence Intervals
Statistics Theory
Improves how computers guess numbers from data.
Expectation-propagation for Bayesian empirical likelihood inference
Methodology
Makes computer guesses more accurate without needing exact rules.