A Stein Gradient Descent Approach for Doubly Intractable Distributions

Heesang Lee (Co-Author), Yonsei University
Songhee Kim (Co-Author, Speaker), Yonsei University
Bokgyeong Kang (Co-Author), Duke University
Jaewoo Park (Co-Author), Yonsei University, Department of Applied Statistics
Tuesday, Aug 5: 2:05 PM - 2:25 PM
Topic-Contributed Paper Session 
Music City Center 
Bayesian inference for doubly intractable distributions is challenging because their likelihoods contain an intractable normalizing term that is a function of the parameters of interest. Although several posterior sampling algorithms have been developed for such models, they are computationally intensive due to repeated auxiliary variable simulations. We propose a novel Monte Carlo Stein variational gradient descent (MC-SVGD) approach for inference on doubly intractable distributions. Through an efficient gradient approximation, our MC-SVGD approach rapidly transforms an arbitrary reference distribution to approximate the posterior distribution of interest, without requiring any predefined variational distribution class for the posterior. Such a transport map is obtained by minimizing the Kullback-Leibler divergence between the transformed and posterior distributions in a reproducing kernel Hilbert space (RKHS). We also investigate the convergence rate of the proposed method. We illustrate the application of the method to challenging examples, including a Potts model, an exponential random graph model, and a Conway-Maxwell-Poisson regression model. The proposed method achieves substantial computational gains over existing algorithms, while providing comparable inferential performance for the posterior distributions.
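To make the abstract's update concrete, below is a minimal Python sketch of one MC-SVGD iteration. It assumes the standard SVGD transport map with an RBF kernel (median-heuristic bandwidth) and a plain Monte Carlo estimate of the intractable score term; the functions log_prior_grad, unnorm_loglik_grad, and simulate_aux are hypothetical model-specific placeholders, not the authors' implementation.

import numpy as np

def rbf_kernel(theta):
    # Pairwise squared distances between the n particles, shape (n, n).
    diff = theta[:, None, :] - theta[None, :, :]
    sq_dists = np.sum(diff ** 2, axis=-1)
    # Median heuristic for the kernel bandwidth (a common SVGD default).
    h = np.median(sq_dists) / np.log(theta.shape[0] + 1.0) + 1e-8
    K = np.exp(-sq_dists / h)
    # grad_K[i, j, :] = gradient of K(theta_i, theta_j) w.r.t. theta_i.
    grad_K = (-2.0 / h) * K[:, :, None] * diff
    return K, grad_K

def mc_scores(theta, y, log_prior_grad, unnorm_loglik_grad, simulate_aux, m=50):
    # The intractable piece of the score,
    #   grad log Z(theta) = E_{y' ~ p(.|theta)}[ grad_theta log q(y'|theta) ],
    # is replaced by an average over m auxiliary simulations per particle.
    scores = np.empty_like(theta)
    for i, th in enumerate(theta):
        aux_draws = [simulate_aux(th) for _ in range(m)]
        grad_logZ = np.mean(
            [unnorm_loglik_grad(y_aux, th) for y_aux in aux_draws], axis=0
        )
        scores[i] = log_prior_grad(th) + unnorm_loglik_grad(y, th) - grad_logZ
    return scores

def svgd_step(theta, scores, step=1e-2):
    # Kernelized transport map evaluated at each particle:
    # phi(theta_i) = (1/n) sum_j [ K(theta_j, theta_i) score_j
    #                              + grad_{theta_j} K(theta_j, theta_i) ].
    K, grad_K = rbf_kernel(theta)
    phi = (K @ scores + grad_K.sum(axis=0)) / theta.shape[0]
    return theta + step * phi

Starting from particles drawn from any reference distribution (e.g., the prior), iterating svgd_step with fresh mc_scores transports the particle cloud toward the posterior; the per-iteration cost is dominated by the n x m auxiliary simulations, which is where the paper's efficient gradient approximation yields its savings.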

Keywords

doubly intractable distributions

variational inference

Markov chain Monte Carlo

kernel Stein discrepancy

importance sampling