Bayesian Model Averaging for Linear Regression Models With Heavy-Tailed Errors

Joyee Ghosh, Co-Author
The University of Iowa

Shamriddha De, First Author and Presenting Author
The University of Iowa
 
Wednesday, Aug 6: 10:05 AM - 10:20 AM
0886 
Contributed Papers 
Music City Center 
We aim to develop a Bayesian model averaging technique for linear regression models that accommodates error densities with heavier tails than the normal distribution. Motivated by the use of the Huber loss function in the presence of outliers, the Bayesian Huberized lasso with hyperbolic errors has been proposed and recently implemented in the literature. Since the Huberized lasso cannot set regression coefficients exactly to zero, we propose a fully Bayesian variable selection approach with spike and slab priors to address sparsity more effectively. Furthermore, the hyperbolic distribution has heavier tails than a normal distribution but thinner tails than a Cauchy distribution. Thus, we propose a novel regression model with an error distribution that encompasses both the hyperbolic and Student-t distributions. Our model aims to retain the benefits of the Huber loss while adapting to heavier tails and unknown levels of sparsity, as dictated by the data. We develop an efficient Gibbs sampler with Metropolis-Hastings steps for posterior computation. Through simulation studies and analyses of real datasets, we observe superior performance of our method over various state-of-the-art methods.
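For readers unfamiliar with the distributions named above, the following LaTeX sketch records a standard symmetric hyperbolic error density and a generic spike-and-slab prior; the parametrization is illustrative and need not match the authors' exact specification.

\[
f(\epsilon_i \mid \alpha, \delta) \;\propto\; \exp\!\left\{-\alpha\sqrt{\delta^2 + \epsilon_i^2}\right\},
\]
so that \(-\log f(\epsilon_i)\) is, up to constants, the pseudo-Huber loss: approximately quadratic near zero and linear in the tails. A generic spike-and-slab prior on the regression coefficients takes the form
\[
\beta_j \mid \gamma_j, \tau^2 \;\sim\; (1-\gamma_j)\,\delta_0 + \gamma_j\,\mathrm{N}(0, \tau^2),
\qquad \gamma_j \sim \mathrm{Bernoulli}(\pi),
\]
where \(\delta_0\) denotes a point mass at zero (distinct from the hyperbolic scale parameter \(\delta\)), which allows coefficients to be exactly zero. Student-t errors can be represented as a scale mixture of normals with an inverse-gamma mixing distribution on the error variance, a standard device for Gibbs sampling.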

Keywords

Bayesian Huberized lasso

Gibbs sampler

Hyperbolic distribution

Spike and slab priors

Student-t distribution 

Main Sponsor

Section on Bayesian Statistical Science