Bayesian Compressed Mixed-Effects Models
Sunday, Aug 3: 2:35 PM - 2:50 PM
2197
Contributed Papers
Music City Center
Linear mixed-effects models are fundamental in statistical methodology for analyzing repeated-measures and longitudinal data. While regularized maximum likelihood or maximum a posteriori estimation methods are commonly employed, sampling-based Bayesian inference remains relatively unexplored, mainly because of the computational bottleneck posed by the covariance matrix of the random effects in high-dimensional settings. We propose compressed mixed-effects (CME) models for efficient prediction and fixed-effects selection in high dimensions. These models project a subset of the parameters into a low-dimensional space using random projection matrices, yielding a quasi-likelihood. This allows us to bypass prior specification on the high-dimensional covariance matrix by compressing its Cholesky factor with random projections, and to devise a computationally efficient collapsed Gibbs sampler with shrinkage priors that enables posterior uncertainty quantification. The CME models achieve better predictive accuracy, coverage, and selection guarantees than their competitors across diverse simulation settings and a repeated-measures data analysis.
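The abstract does not give the model equations, so the short Python/NumPy sketch below only illustrates the general random-projection idea it describes: compressing a high-dimensional random-effects design so that the covariance being modeled is low-dimensional, then evaluating a Gaussian quasi-likelihood under that compressed working model. All dimensions, the Gaussian projection matrix Phi, and the names Z_c, L, sigma2 are illustrative assumptions, not the authors' construction.

import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (illustrative only): n observations, p fixed-effect covariates,
# q original random-effect covariates, and a compressed dimension k << q.
n, p, q, k = 50, 200, 100, 5

X = rng.normal(size=(n, p))   # fixed-effects design
Z = rng.normal(size=(n, q))   # high-dimensional random-effects design
y = rng.normal(size=n)        # response (placeholder data)

# Random projection matrix; a scaled Gaussian projection is one common choice.
Phi = rng.normal(scale=1.0 / np.sqrt(k), size=(q, k))

# Compressed random-effects design: work with Z @ Phi (n x k) instead of Z (n x q),
# so the random-effects covariance to be modeled is k x k rather than q x q.
Z_c = Z @ Phi

# Working model: compressed random effects b ~ N(0, L L') with L a k x k
# lower-triangular Cholesky factor, plus Gaussian noise with variance sigma2.
L = np.tril(rng.normal(size=(k, k)) * 0.1)
np.fill_diagonal(L, np.abs(np.diag(L)) + 0.5)
sigma2 = 1.0
V = Z_c @ L @ L.T @ Z_c.T + sigma2 * np.eye(n)

# Gaussian quasi-log-likelihood for a candidate beta (sparse placeholder here).
beta = np.zeros(p)
resid = y - X @ beta
sign, logdet = np.linalg.slogdet(V)
quasi_loglik = -0.5 * (logdet + resid @ np.linalg.solve(V, resid) + n * np.log(2 * np.pi))
print(round(quasi_loglik, 2))

In an actual sampler, beta, L, and sigma2 would be updated from their conditional posteriors (e.g., under shrinkage priors on beta); the point of the sketch is only that every covariance operation involves k x k rather than q x q objects.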
Gibbs sampling
Parameter expansion
Quasi-likelihood
Random projection
Uncertainty quantification
Main Sponsor
Section on Bayesian Statistical Science