Bayesian Elastic Net Regression with Structured Prior Dependence
By: Christopher M. Hans, Ningyi Liu
Many regularization priors for Bayesian regression assume the regression coefficients are a priori independent. In particular, this is the case for standard Bayesian treatments of the lasso and the elastic net. While independence may be reasonable in some data-analytic settings, incorporating dependence in these prior distributions provides greater modeling flexibility. This paper introduces the orthant normal distribution in its general form and shows how it can be used to structure prior dependence in the Bayesian elastic net regression model. An L1-regularized version of Zellner's g prior is introduced as a special case, creating a new link between the literature on penalized optimization and an important class of regression priors. Computation is challenging due to an intractable normalizing constant in the prior. We avoid this issue by slightly modifying a standard prior of convenience for the hyperparameters so as to enable simple and fast Gibbs sampling of the posterior distribution. The benefit of including structured prior dependence in the Bayesian elastic net regression model is demonstrated through simulation and a near-infrared spectroscopy data example.
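The paper's sampler targets the orthant normal prior with structured dependence; the Gibbs-sampling idea it builds on can be illustrated with the standard independence-prior Bayesian elastic net, via its familiar normal scale-mixture representation. The sketch below is illustrative only, not the paper's method: it fixes the error variance `sigma2` and treats the penalty parameters `lam1`, `lam2` as known, both simplifying assumptions.

```python
import numpy as np

def elastic_net_gibbs(X, y, lam1=1.0, lam2=1.0, sigma2=1.0, n_iter=2000, seed=0):
    """Gibbs sampler for an independence-prior Bayesian elastic net
    (scale-mixture form); sigma2 fixed for simplicity of the sketch."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    inv_tau2 = np.ones(p)            # latent local scales 1/tau_j^2
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma2 * A^{-1}),
        #   A = X'X + lam2*I + diag(1/tau_j^2)
        A = XtX + np.diag(inv_tau2 + lam2)
        L = np.linalg.cholesky(A)
        mean = np.linalg.solve(A, Xty)
        # draw from N(mean, sigma2 * A^{-1}) via the Cholesky factor of A
        z = rng.standard_normal(p)
        beta = mean + np.sqrt(sigma2) * np.linalg.solve(L.T, z)
        # 1/tau_j^2 | rest ~ Inverse-Gaussian(sqrt(lam1^2 sigma2 / beta_j^2), lam1^2)
        mu = np.sqrt(lam1**2 * sigma2 / np.maximum(beta**2, 1e-12))
        inv_tau2 = rng.wald(mu, lam1**2)
        draws[t] = beta
    return draws
```

The ridge-like `lam2` term enters the conditional precision of `beta` directly, while the lasso-like `lam1` term enters through the latent inverse-Gaussian scales; the paper's contribution is to replace the resulting independent prior on the coefficients with a dependent orthant normal prior while retaining conjugate-style conditional updates.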
Similar Papers
Heteroscedastic Double Bayesian Elastic Net
Methodology
Extends Bayesian elastic net regression to settings where the error variance is not constant across observations.

Dependency-Aware Shrinkage Priors for High Dimensional Regression
Methodology
Develops shrinkage priors that account for dependence among coefficients in high-dimensional regression.

Bayesian Global-Local Regularization
Methodology
Studies regularization through global-local shrinkage priors in the Bayesian framework.