ESPs: cost-efficient sampling of computationally expensive posterior distributions
Thursday, Aug 7: 9:25 AM - 9:50 AM
Invited Paper Session
Music City Center
Computationally expensive posterior distributions arise in a myriad of modern scientific settings. One such setting is Bayesian inverse problems with computer simulators, where each evaluation of the (unnormalized) posterior distribution $p$ requires a forward run of the expensive simulator, which can take hundreds of CPU hours. This evaluation cost poses a challenge for many existing posterior samplers, where each sample taken requires at least one evaluation of the posterior $p$. Given a computational budget, a "cost-efficient" sampler is desired for effective posterior exploration with limited posterior evaluations. We thus propose a new sampling method called cost-Efficient Stein Points (ESPs). ESPs extend the Stein points of Chen et al. (2018; ICML), which employ the sequential minimization of the kernel Stein discrepancy to generate a sequence of posterior samples $x_1, x_2, \ldots$. To reduce the posterior evaluations needed to optimize a sample $x_i$, we leverage a Gaussian process surrogate on the kernel discrepancy that guides the selection of evaluation points. This "Bayesian optimization" strategy is then applied sequentially to optimize $x_1, x_2, \ldots$ in a cost-efficient manner, where all previous posterior evaluations can be reused for optimizing the current sample. We prove that, under mild conditions, ESPs converge in distribution to the desired posterior distribution. Finally, we demonstrate the cost-efficiency of ESPs over the state-of-the-art in a suite of numerical experiments and an application to Bayesian calibration of a biological oscillator process.
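To make the underlying Stein points idea concrete, the sketch below greedily minimizes the kernel Stein discrepancy for a toy one-dimensional target. It is a minimal illustration, not the ESP method itself: it uses a simple RBF base kernel, a grid search in place of Bayesian optimization with a GP surrogate, and a standard normal target whose score function $\nabla \log p(x) = -x$ stands in for an expensive simulator-based posterior; all function names and parameter choices here are illustrative.

```python
import numpy as np

def stein_kernel(x, y, score, h=1.0):
    """1-D Langevin Stein kernel k_p(x, y) built from an RBF base kernel.

    k_p(x,y) = d2k/dxdy + (dk/dx) s(y) + (dk/dy) s(x) + k s(x) s(y),
    where s = grad log p is the score of the target.
    """
    d = x - y
    k = np.exp(-d**2 / (2 * h**2))
    dkx = -d / h**2 * k                   # dk/dx
    dky = d / h**2 * k                    # dk/dy
    dkxy = (1 / h**2 - d**2 / h**4) * k   # d2k/dxdy
    return dkxy + dkx * score(y) + dky * score(x) + k * score(x) * score(y)

def greedy_stein_points(score, n_points, grid):
    """Greedily choose points that minimize the kernel Stein discrepancy.

    Each new point x_n minimizes k_p(x, x) + 2 * sum_i k_p(x_i, x),
    the terms of the squared KSD that depend on x.
    """
    pts = []
    for _ in range(n_points):
        obj = np.array([
            stein_kernel(x, x, score)
            + 2 * sum(stein_kernel(xi, x, score) for xi in pts)
            for x in grid
        ])
        pts.append(grid[np.argmin(obj)])
    return np.array(pts)

# Toy target: standard normal, so the score is grad log p(x) = -x.
score = lambda x: -x
grid = np.linspace(-4.0, 4.0, 201)
points = greedy_stein_points(score, 10, grid)
```

In the actual ESP setting, each `score` (or posterior) evaluation would cost a full simulator run, which is why a grid search per point is infeasible and a GP surrogate over the discrepancy, reusing all past evaluations, is used to pick evaluation points instead.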
Bayesian methods, scientific computing, uncertainty quantification