Advances in Bayesian Computation for Modern Data Science

Chair: Quan Zhou, Texas A&M University

Organizer: Quan Zhou, Texas A&M University

Monday, Aug 4: 10:30 AM - 12:20 PM
Session 0290: Invited Paper Session
Music City Center
Room: CC-101A

Applied: Yes

Main Sponsor

Section on Statistical Computing

Co-Sponsors

International Society for Bayesian Analysis (ISBA)
Section on Bayesian Statistical Science

Presentations

MCMC for Directed Acyclic Graphs via Birth-and-Death Processes

Inferring a directed acyclic graph (DAG) from data is computationally challenging. Current state-of-the-art MCMC methods for graph inference scan the space efficiently by first considering a restricted search space and iteratively expanding it until a stopping criterion is met. We estimate the error that restricting the search space introduces relative to the full space, and we develop a novel MCMC method that reduces this error. Our method is an adaptive algorithm that allows the search space to either expand or contract at any iteration, with both moves governed by a birth-and-death process. Extensive simulations demonstrate the efficiency of the new algorithm and compare its performance with existing methods, and we consider applications in imaging proteomics. 
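
For readers unfamiliar with the mechanism, the sketch below illustrates a generic birth-and-death move over a restricted search space of candidate edges. It is only a toy illustration of the general idea in the abstract: the function name birth_death_step, the log_score function, the edge weights, and the simple Metropolis rule are all hypothetical placeholders, not the authors' algorithm.

```python
# Toy illustration only: a generic birth-and-death move over a restricted
# search space of candidate edges. The log_score function and the simple
# Metropolis rule (no Hastings correction for unequal birth/death
# probabilities) are placeholder assumptions, not the authors' algorithm.
import math
import random

def birth_death_step(active, inactive, log_score, rng=random):
    """Propose expanding (birth) or contracting (death) the search space."""
    if (rng.random() < 0.5 and inactive) or not active:
        # Birth: move a randomly chosen excluded edge into the search space.
        edge = rng.choice(sorted(inactive))
        new_active, new_inactive = active | {edge}, inactive - {edge}
    else:
        # Death: drop a randomly chosen edge from the search space.
        edge = rng.choice(sorted(active))
        new_active, new_inactive = active - {edge}, inactive | {edge}
    # Accept or reject the move based on the (placeholder) space score.
    log_alpha = log_score(new_active) - log_score(active)
    if rng.random() < math.exp(min(0.0, log_alpha)):
        return new_active, new_inactive
    return active, inactive

# Toy usage: edges scored independently, with a penalty on the space size.
weights = {("X1", "Y"): 2.0, ("X2", "Y"): 0.1, ("X3", "Y"): 1.5}
log_score = lambda edges: sum(weights[e] for e in edges) - 0.5 * len(edges)
active, inactive = set(), set(weights)
for _ in range(200):
    active, inactive = birth_death_step(active, inactive, log_score)
print(sorted(active))
```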

Keywords

directed acyclic graph

Markov chain Monte Carlo

birth and death process

skeleton graph 

Co-Author(s)

Morris Greenberg
Kieran Campbell
Radu Craiu, University of Toronto

Speaker

Radu Craiu, University of Toronto

Bayesian Computation via Auxiliary-Try Metropolis and Parallel Annealed Chains

The Multiple-Try Metropolis (MTM) algorithm extends the traditional Metropolis-Hastings method by generating multiple candidate proposals per iteration, improving the exploration of the state space. However, MTM can struggle with navigating complex topographies, such as plateaus and local maxima, when relying solely on local proposals. In this work, we introduce the Auxiliary-Try Metropolis (ATM) algorithm, an extension of MTM that utilizes an auxiliary variable to guide the generation of candidate proposals. This auxiliary mechanism enables more effective traversal of challenging regions in the state space, enhancing the chain's ability to escape local optima and explore more broadly. We rigorously establish the validity of ATM as a Markov chain Monte Carlo method and demonstrate its superior performance compared to existing MTM approaches. Furthermore, we propose a novel Monte Carlo method that leverages the ATM algorithm as the foundation for constructing parallel annealed chains, offering a powerful tool for Bayesian computation in high-dimensional and complex models. 
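
As context for the proposed extension, the sketch below shows one standard Multiple-Try Metropolis step, the baseline that ATM builds on, using a symmetric Gaussian proposal and one common weight choice (candidate weights proportional to the target density). The function names mtm_step and log_pi and the toy bimodal target are illustrative assumptions; this is not the ATM algorithm itself.

```python
# Textbook Multiple-Try Metropolis (MTM) step with a symmetric Gaussian
# proposal and candidate weights proportional to the target density. This is
# the baseline that ATM extends, not the ATM algorithm itself.
import numpy as np

def mtm_step(x, log_pi, k=5, step=0.5, rng=np.random.default_rng()):
    """One MTM update for a 1-d target with unnormalized log-density log_pi."""
    # 1) Draw k candidate proposals around the current state.
    ys = x + step * rng.standard_normal(k)
    log_wy = log_pi(ys)
    c = log_wy.max()                          # constant for numerical stability
    wy = np.exp(log_wy - c)
    # 2) Select one candidate with probability proportional to its weight.
    y = rng.choice(ys, p=wy / wy.sum())
    # 3) Draw k-1 reference points around the selected candidate; the k-th
    #    reference point is the current state x.
    xs = np.append(y + step * rng.standard_normal(k - 1), x)
    wx = np.exp(log_pi(xs) - c)
    # 4) Generalized Metropolis-Hastings acceptance probability.
    alpha = min(1.0, wy.sum() / wx.sum())
    return y if rng.random() < alpha else x

# Toy usage: a bimodal mixture of two well-separated Gaussians.
log_pi = lambda z: np.logaddexp(-0.5 * (z + 3.0) ** 2, -0.5 * (z - 3.0) ** 2)
x, chain = 0.0, []
for _ in range(5000):
    x = mtm_step(x, log_pi)
    chain.append(x)
print(np.mean(np.array(chain) > 0))   # roughly 0.5 if both modes are visited
```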

Keywords

Markov chain Monte Carlo

Multiple-Try Metropolis

guided proposals

parallel Markov chains 

Speaker

Liangliang Wang, Simon Fraser University

A geometric approach to informed MCMC sampling

A Riemannian geometric framework for Markov chain Monte Carlo (MCMC) is developed in which informed proposal densities for Metropolis-Hastings (MH) algorithms are constructed using the Fisher-Rao metric on the manifold of probability density functions (pdfs). We exploit the square-root representation of pdfs, under which the Fisher-Rao metric reduces to the standard L2 metric, resulting in a straightforward implementation of the proposed geometric MCMC methodology. Unlike the random walk MH algorithm, which blindly proposes a candidate state using no information about the target, the geometric MH algorithms effectively move an uninformed base density (e.g., a random walk proposal density) towards different global/local approximations of the target density. The superiority of the geometric MH algorithm over other MCMC schemes is demonstrated using various multimodal, nonlinear, and high-dimensional examples. A publicly available R package, geommc, implements the proposed MCMC algorithms. 
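
The sketch below illustrates the underlying geometric idea on a one-dimensional grid: under the square-root map the Fisher-Rao metric becomes the L2 metric, so a base density can be pulled toward an approximation of the target along the resulting geodesic (a spherical interpolation of square roots) and used as an informed independence proposal. The function name geodesic_density, the choice of base and approximating densities, the grid discretization, and the interpolation point t are illustrative assumptions; this is not the implementation in the geommc package.

```python
# Grid-based 1-d illustration of the square-root geometry: the Fisher-Rao
# geodesic between two densities becomes a spherical interpolation of their
# square roots, and the interpolated density is used as an informed
# independence proposal. All densities and the interpolation point t are
# illustrative assumptions; this is not the geommc implementation.
import numpy as np

grid = np.linspace(-10, 10, 2001)
dx = grid[1] - grid[0]
normalize = lambda p: p / (p.sum() * dx)

# Bimodal target, a diffuse "uninformed" base density, and a crude
# single-Gaussian approximation of the target.
target = normalize(np.exp(-0.5 * (grid + 3) ** 2) + np.exp(-0.5 * (grid - 3) ** 2))
base   = normalize(np.exp(-0.5 * (grid / 5.0) ** 2))
approx = normalize(np.exp(-0.5 * (grid - 3) ** 2))

def geodesic_density(p0, p1, t):
    """Fisher-Rao geodesic between densities p0 and p1 via square roots."""
    psi0, psi1 = np.sqrt(p0 * dx), np.sqrt(p1 * dx)      # unit vectors in L2
    theta = np.arccos(np.clip(psi0 @ psi1, -1.0, 1.0))
    psi_t = (np.sin((1 - t) * theta) * psi0 + np.sin(t * theta) * psi1) / np.sin(theta)
    return normalize(psi_t ** 2 / dx)

# Informed proposal: the base density pulled most of the way toward the
# target approximation along the geodesic.
proposal = geodesic_density(base, approx, t=0.8)

# Independence Metropolis-Hastings on the grid using the informed proposal.
rng = np.random.default_rng(0)
probs = proposal / proposal.sum()
idx, samples = rng.integers(len(grid)), []
for _ in range(20000):
    new = rng.choice(len(grid), p=probs)
    log_alpha = (np.log(target[new]) - np.log(target[idx])
                 + np.log(proposal[idx]) - np.log(proposal[new]))
    if rng.random() < np.exp(min(0.0, log_alpha)):
        idx = new
    samples.append(grid[idx])
print(np.mean(np.array(samples) > 0))   # both target modes should be visited
```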

Keywords

Markov chain Monte Carlo

Bayesian models

Speaker

Vivekananda Roy, Iowa State University

A generalized-likelihood-based approach for joint regression/covariance selection

In this talk, we address joint sparsity selection in the regression coefficient matrix and the error precision (inverse covariance) matrix for high-dimensional multivariate regression models in the Bayesian paradigm. The selected sparsity patterns are crucial for understanding the network of relationships between the predictor and response variables, as well as the conditional relationships among the latter. While Bayesian methods have the advantage of providing natural uncertainty quantification through posterior inclusion probabilities and credible intervals, current Bayesian approaches are either restricted to specific sub-classes of sparsity patterns or not scalable to settings with hundreds of responses and predictors. Bayesian approaches that focus only on estimating the posterior mode are scalable, but do not generate samples from the posterior distribution for uncertainty quantification. Using a bi-convex, regression-based generalized likelihood and spike-and-slab priors, we develop an algorithm called Joint Regression Network Selector (JRNS) for joint regression and covariance selection which (a) can accommodate general sparsity patterns, (b) provides posterior samples for uncertainty quantification, and (c) is scalable and orders of magnitude faster than the state-of-the-art Bayesian approaches providing uncertainty quantification. We demonstrate the statistical and computational efficacy of the proposed approach on synthetic data and through the analysis of selected cancer data sets. We also provide high-dimensional posterior consistency results for one of the developed algorithms. 
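
As background for how posterior inclusion probabilities arise in this kind of model, the sketch below shows a standard spike-and-slab Gibbs update for a single regression coefficient under a point-mass spike and a Gaussian slab. The function name update_coefficient, the prior settings (q, tau2, sigma2), and the toy data are illustrative assumptions; this is a generic textbook step, not the JRNS sampler.

```python
# Generic spike-and-slab Gibbs update for one regression coefficient, shown
# only to illustrate how posterior inclusion probabilities arise. It is a
# standard textbook update, not the JRNS sampler, and the prior settings
# (q, tau2, sigma2) are illustrative assumptions.
import numpy as np

def update_coefficient(j, beta, gamma, X, y, sigma2=1.0, tau2=1.0, q=0.2,
                       rng=np.random.default_rng()):
    """One Gibbs update of (beta_j, gamma_j) under a point-mass spike and a
    Gaussian N(0, tau2) slab, conditional on the other coefficients."""
    xj = X[:, j]
    r = y - X @ beta + xj * beta[j]               # partial residual excluding j
    prec = xj @ xj / sigma2 + 1.0 / tau2          # slab posterior precision
    v, m = 1.0 / prec, (xj @ r / sigma2) / prec   # slab posterior variance/mean
    log_bf = 0.5 * np.log(v / tau2) + 0.5 * m**2 / v   # inclusion Bayes factor
    p_incl = 1.0 / (1.0 + (1 - q) / q * np.exp(-log_bf))
    gamma[j] = rng.random() < p_incl
    beta[j] = rng.normal(m, np.sqrt(v)) if gamma[j] else 0.0
    return p_incl

# Toy usage: only the first two of five predictors truly matter.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.standard_normal(200)
beta, gamma = np.zeros(5), np.zeros(5, dtype=bool)
incl = np.zeros(5)
for sweep in range(500):
    for j in range(5):
        incl[j] += update_coefficient(j, beta, gamma, X, y, rng=rng)
print(np.round(incl / 500, 2))   # estimated posterior inclusion probabilities
```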

Co-Author(s)

Srijata Samanta
George Michailidis, University of Florida

Speaker

Kshitij Khare, University of Florida