MCMC-CE: Efficient Bayes Factor Estimation for Bayesian Hypothesis Testing with Non-conjugate Priors via the Cross-Entropy Method

Devin Lundy Co-Author
Augusta University
 
Vy Ong Co-Author
Wayne State University
 
Yin Wan Co-Author
Wayne State University
 
Yang Shi First Author
Wayne State University
 
Yang Shi Presenting Author
Wayne State University
 
Tuesday, Aug 5: 10:05 AM - 10:20 AM
2548 
Contributed Papers 
Music City Center 
The accurate and efficient estimation of Bayes factors is critical for Bayesian model comparison, particularly when evaluating competing hypotheses in complex statistical models. Traditional computational approaches often suffer from inefficiency, instability, and poor scalability, especially when dealing with non-conjugate priors. In this work, we propose MCMC-CE, an advanced method that extends the cross-entropy (CE) technique—originally developed for rare-event probability estimation—to improve the computation of marginal likelihoods in Bayesian hypothesis testing and linear regression models. Our approach integrates the CE method within a Markov chain Monte Carlo (MCMC) framework to optimize proposal distributions and efficiently approximate the marginal likelihood. We apply MCMC-CE to both hypothesis testing via Bayes factors and Bayesian model averaging. Extensive simulation studies and real-world data applications demonstrate that MCMC-CE significantly outperforms existing methods in terms of computational speed, numerical stability, and estimation accuracy. These results suggest that MCMC-CE provides a powerful and scalable solution for Bayesian inference in challenging modeling scenarios.
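To make the computational idea concrete, below is a minimal sketch (in Python, using NumPy and SciPy) of cross-entropy adaptive importance sampling for a marginal likelihood in a toy one-parameter model with a non-conjugate Laplace prior. The data, the prior, and the Gaussian proposal family are assumptions chosen for illustration only; this is not the authors' MCMC-CE implementation, which draws its samples via MCMC rather than directly from the parametric proposal.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy data: normal observations with unknown mean theta, known sd = 1,
# and a non-conjugate Laplace(0, 1) prior on theta (illustrative only).
y = rng.normal(loc=0.8, scale=1.0, size=30)

def log_joint(theta):
    """Unnormalized log posterior: log p(y | theta) + log p(theta)."""
    loglik = stats.norm.logpdf(y[:, None], loc=theta, scale=1.0).sum(axis=0)
    logprior = stats.laplace.logpdf(theta, loc=0.0, scale=1.0)
    return loglik + logprior

# Cross-entropy adaptation of a Gaussian proposal q(theta) = N(mu, sigma^2):
# sample from q, compute importance weights against the joint, and refit
# (mu, sigma) by weighted moments (the CE update for a Gaussian family).
mu, sigma = 0.0, 2.0
n_draws, n_iters = 2000, 5
for _ in range(n_iters):
    theta = rng.normal(mu, sigma, size=n_draws)
    log_w = log_joint(theta) - stats.norm.logpdf(theta, mu, sigma)
    w = np.exp(log_w - log_w.max())          # numerically stabilized weights
    w /= w.sum()
    mu = np.sum(w * theta)
    sigma = np.sqrt(np.sum(w * (theta - mu) ** 2))

# Final importance-sampling estimate of the log marginal likelihood
theta = rng.normal(mu, sigma, size=20 * n_draws)
log_w = log_joint(theta) - stats.norm.logpdf(theta, mu, sigma)
log_ml = np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()
print(f"log marginal likelihood (CE importance sampling) ~ {log_ml:.3f}")

The Bayes factor comparing two hypotheses is the ratio of their marginal likelihoods, so running the same routine under each hypothesis yields the quantity used for model comparison.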

Keywords

Marginal likelihood

Cross-entropy method

Markov chain Monte Carlo

Bayes factor

Bayesian model averaging

Bayesian linear regression 

Main Sponsor

Section on Statistical Computing