A variational neural Bayes framework for inference on intractable posterior distributions

Emily Hector (Co-Author)
North Carolina State University

Amanda Lenzi (Co-Author)

Brian Reich (Co-Author)
North Carolina State University

Elliot Maceda (Speaker)
Sunday, Aug 3: 4:05 PM - 4:25 PM
Topic-Contributed Paper Session 
Music City Center 
Classical Bayesian inference with complex models is frequently infeasible because the likelihood is intractable. Simulation-based inference methods, such as Approximate Bayesian Computation (ABC), estimate posteriors without evaluating a likelihood function by leveraging the fact that data can be quickly simulated from the model, but they converge slowly and/or poorly in high-dimensional settings. In this paper, we propose a framework for Bayesian posterior estimation that maps data to posteriors of parameters using a machine learning model trained on data simulated from the complex model. Posterior distributions of model parameters are obtained efficiently by assuming a parametric form for the posterior, parameterized by the machine learning model, which is trained with simulated data sets as inputs and the associated parameters as outputs. We show theoretically that our posteriors converge to the true posteriors in Kullback-Leibler divergence when the correct parametric family of the posterior is identified. We also provide diagnostics for assessing whether the parametric assumption is close to the true posterior, and modeling options when it is not. Comprehensive simulation studies highlight the method's robustness and accuracy.
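For intuition, here is a minimal sketch of the kind of training loop this framework describes, under strong simplifying assumptions: a toy Gaussian-mean simulator, a Gaussian parametric family for the posterior, and a small feed-forward network (PyTorch). All names and settings below are illustrative, not taken from the paper.

```python
import torch
import torch.nn as nn

N_OBS = 50  # size of each simulated data set (illustrative choice)

def simulate(n_draws):
    """Draw parameters from a N(0, 1) prior and simulate Gaussian-mean data."""
    theta = torch.randn(n_draws, 1)
    y = theta + torch.randn(n_draws, N_OBS)  # y_i ~ N(theta, 1), i = 1..N_OBS
    return y, theta

# Network mapping a data set to the (mean, log-sd) of a Gaussian
# approximation to the posterior of theta.
net = nn.Sequential(nn.Linear(N_OBS, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    y, theta = simulate(256)
    mu, log_sd = net(y).chunk(2, dim=1)
    # Negative Gaussian log-density of the true theta under the network's
    # output; averaging over simulated pairs trains the net to emit
    # calibrated posterior parameters as a function of the data.
    loss = (log_sd + 0.5 * ((theta - mu) / log_sd.exp()) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Amortized inference: one forward pass maps observed data to its
# approximate posterior, with no per-data-set sampling or optimization.
y_obs, _ = simulate(1)
mu, log_sd = net(y_obs).chunk(2, dim=1)
print(f"approximate posterior: N({mu.item():.2f}, {log_sd.exp().item():.2f}^2)")
```

Minimizing the average negative log-density of the true parameters under the network's output distribution is, up to an additive constant, equivalent to minimizing the expected Kullback-Leibler divergence between the true posterior and the parametric approximation, which is the sense in which the convergence result above is stated.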

Keywords

Simulation-based Inference

Emulator

Spatial Epidemiology

Spatial Extremes Models

Variational Inference

Approximate Bayesian Computation