Non-Euclidean Bayesian Constraint Relaxation via Divergence-to-Set Priors

Abstract Number:

3277 

Submission Type:

Contributed Abstract 

Contributed Abstract Type:

Speed 

Participants:

Rick Presman (1), Jason Xu (2)

Institutions:

(1) Duke University, N/A, (2) N/A, N/A

Co-Author:

Jason Xu  
N/A

First Author:

Rick Presman  
Duke University

Presenting Author:

Rick Presman  
Duke University

Abstract Text:

Constraints on parameter spaces promote various structures in Bayesian inference. Simultaneously, they present methodological challenges, such as efficiently sampling from the posterior. While recent work has tackled this important problem through various approaches to constraint relaxation, much of the underlying machinery assumes the parameter space is Euclidean, an assumption that does not hold in many settings. Building on the recently proposed class of distance-to-set priors (Presman and Xu, 2023), this talk explores extensions of constraint relaxation in non-Euclidean spaces. We propose a natural extension of these priors, which we call (Bregman) divergence-to-set priors, exemplify many settings where they can be leveraged, and demonstrate how techniques originating from an optimization algorithm known as mirror descent can be utilized for non-Euclidean Bayesian constraint relaxation.
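
As a concrete illustration (a minimal numerical sketch, not the authors' implementation), the snippet below evaluates a toy Bregman divergence-to-set log-prior on the positive orthant. It uses the negative-entropy generator, so the Bregman divergence is the generalized Kullback-Leibler divergence, and it relaxes a probability-simplex constraint. The scale parameter rho, the choice of generator, the orientation of the divergence, and the use of scipy.optimize.minimize to approximate the infimum over the constraint set are all illustrative assumptions; with the Euclidean generator phi(v) = ||v||^2 / 2, the same construction reduces to a squared distance-to-set penalty.

# Hedged sketch: a toy "divergence-to-set" log-prior on the positive orthant.
# The generator phi(v) = sum(v*log(v) - v) gives the generalized KL divergence.
# Constraint set, rho, and numerical minimization are illustrative choices.
import numpy as np
from scipy.optimize import minimize

def bregman_kl(y, x):
    """Bregman divergence D_phi(y, x) for the negative-entropy generator,
    i.e., the generalized KL divergence on the positive orthant."""
    return float(np.sum(y * np.log(y / x) - y + x))

def divergence_to_set(x, constraint, bounds):
    """Approximate inf over y in C of D_phi(y, x) by numerical minimization,
    where C is described by the supplied constraints and bounds."""
    res = minimize(lambda y: bregman_kl(y, x),
                   x0=np.clip(x, 1e-6, None),
                   bounds=bounds,
                   constraints=constraint)
    return res.fun

def log_prior(x, rho, constraint, bounds):
    """Relaxed-constraint log-prior (up to a constant): -rho * D_phi(C, x).
    As rho grows, prior mass concentrates on the constraint set C."""
    return -rho * divergence_to_set(x, constraint, bounds)

# Example: relax the probability-simplex constraint {y >= 0, sum(y) = 1}.
simplex = {"type": "eq", "fun": lambda y: np.sum(y) - 1.0}
bounds = [(1e-8, None)] * 3
theta = np.array([0.5, 0.4, 0.3])   # violates the sum-to-one constraint
print(log_prior(theta, rho=10.0, constraint=simplex, bounds=bounds))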

Keywords:

Constraint relaxation|Hamiltonian Monte Carlo|Bregman divergence|MCMC sampler

Sponsors:

Section on Bayesian Statistical Science

Tracks:

Bayesian Computation

Can this be considered for alternate subtype?

Yes

Are you interested in volunteering to serve as a session chair?

No

I have read and understand that JSM participants must abide by the Participant Guidelines.

Yes

I understand that JSM participants must register and pay the appropriate registration fee by June 1, 2024. The registration fee is non-refundable.

I understand