The approximation accuracy of Gaussian variational inference and related posterior approximations in high-dimensional Bayesian inference

Philippe Rigollet, Co-Author
Massachusetts Institute of Technology

Anya Katsevich, Speaker and Co-Author
Massachusetts Institute of Technology
 
Monday, Aug 4: 3:20 PM - 3:45 PM
Invited Paper Session 
Music City Center 
The main computational challenge in Bayesian inference is to compute integrals against a high-dimensional posterior distribution. Over the past few decades, variational inference (VI) has emerged as a tractable approximation to these integrals and a viable alternative to the more established paradigm of Markov chain Monte Carlo. However, little is known about the approximation accuracy of VI. We present new bounds on the total variation (TV) error and the mean and covariance approximation errors of Gaussian VI in terms of dimension and sample size. Our proof technique is part of a general framework that allows us to precisely analyze the accuracy of asymptotic approximations to integrals against high-dimensional posteriors in the regime of posterior concentration. Based on the same framework, we also present new sharp bounds on the accuracy of the Laplace approximation, a related Gaussian posterior approximation method. Finally, we compare and contrast the two Gaussian approximations, VI and Laplace.
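
For readers unfamiliar with the two methods, the following is a standard textbook formulation of them, not a statement of the paper's specific results. Writing \pi for the d-dimensional posterior, Gaussian VI selects the Gaussian closest to \pi in Kullback-Leibler divergence, while the Laplace approximation is a Gaussian centered at the posterior mode with covariance given by the inverse negative Hessian of \log\pi at that mode:

% Standard definitions of the two Gaussian posterior approximations (assumed notation: \pi is the posterior on R^d)
\[
\hat{\pi}_{\mathrm{VI}} = \mathcal{N}(\hat m, \hat S),
\qquad
(\hat m, \hat S) \in \operatorname*{arg\,min}_{m \in \mathbb{R}^d,\ S \succ 0}
\mathrm{KL}\bigl( \mathcal{N}(m, S) \,\|\, \pi \bigr),
\]
\[
\hat{\pi}_{\mathrm{LAP}} = \mathcal{N}\bigl( \hat\theta,\ \bigl( -\nabla^2 \log \pi(\hat\theta) \bigr)^{-1} \bigr),
\qquad
\hat\theta = \operatorname*{arg\,max}_{\theta \in \mathbb{R}^d} \log \pi(\theta).
\]

The accuracy questions addressed in the talk concern how well these Gaussians, and their means and covariances, approximate \pi as the dimension d and the sample size grow.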