Non-Euclidean Bayesian Constraint Relaxation via Divergence-to-Set Priors
Monday, Aug 5: 9:45 AM - 9:50 AM
3277
Contributed Speed
Oregon Convention Center
Constraints on parameter spaces promote various structures in Bayesian inference. Simultaneously, they present methodological challenges, such as efficiently sampling from the posterior. While recent work has tackled this important problem through various approaches to constraint relaxation, much of the underlying machinery assumes the parameter space is Euclidean, an assumption that does not hold in many settings. Building on the recently proposed class of distance-to-set priors (Presman and Xu, 2023), this talk explores extensions of constraint relaxation to non-Euclidean spaces. We propose a natural extension of these priors, which we call (Bregman) divergence-to-set priors, exemplify many settings where they can be leveraged, and demonstrate how techniques originating from the optimization algorithm known as mirror descent can be utilized for non-Euclidean Bayesian constraint relaxation.
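The abstract does not give the functional form of the proposed prior; the sketch below is only an illustration of the general idea, assuming a relaxation of the form log π(θ) = log π₀(θ) − ρ·D_φ(θ, C), where D_φ(θ, C) is the Bregman divergence from θ to a constraint set C (here the probability simplex, with φ the negative entropy). The names kl_bregman, divergence_to_simplex, log_relaxed_prior, and the parameter rho are hypothetical and not taken from the authors' work.

```python
import numpy as np

def kl_bregman(p, q):
    """Generalized KL divergence D_phi(p, q) for phi(x) = sum x log x - x.
    This is the Bregman divergence generated by negative entropy,
    defined for strictly positive vectors."""
    return np.sum(p * np.log(p / q) - p + q)

def bregman_projection_to_simplex(theta):
    """Bregman (KL) projection of a positive vector onto the probability
    simplex: argmin over c in the simplex of D_phi(c, theta), which
    reduces to simple normalization."""
    return theta / np.sum(theta)

def divergence_to_simplex(theta):
    """Divergence-to-set: Bregman divergence from theta to the simplex,
    evaluated at the Bregman projection."""
    proj = bregman_projection_to_simplex(theta)
    return kl_bregman(proj, theta)

def log_relaxed_prior(theta, log_base_prior, rho=10.0):
    """Hypothetical divergence-to-set relaxation of a constrained prior:
    log pi(theta) = log pi_0(theta) - rho * D_phi(theta, C).
    Larger rho concentrates prior mass more tightly near the constraint set."""
    return log_base_prior(theta) - rho * divergence_to_simplex(theta)

# Toy usage with a flat base prior on the positive orthant.
flat = lambda theta: 0.0
on_simplex = np.array([0.2, 0.3, 0.5])    # satisfies the constraint (sums to 1)
off_simplex = np.array([0.4, 0.6, 1.0])   # positive but sums to 2
print(log_relaxed_prior(on_simplex, flat))   # ~0: no penalty on the constraint set
print(log_relaxed_prior(off_simplex, flat))  # negative: penalized away from the set
```

In this sketch the unnormalized log prior is smooth on the open positive orthant, so it could in principle be passed to a gradient-based sampler such as Hamiltonian Monte Carlo; how the talk's mirror-descent machinery enters the sampler is not specified in the abstract.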
Constraint relaxation
Hamiltonian Monte Carlo
Bregman divergence
MCMC sampler
Main Sponsor
Section on Bayesian Statistical Science