Fisher-Rao Gradient Flow: Geodesic Convexity and Functional Inequality

Speaker: Jiaoyang Huang
University of Pennsylvania
 
Thursday, Aug 8: 11:15 AM - 11:35 AM
Topic-Contributed Paper Session 
Oregon Convention Center 
Sampling from a probability distribution with an unknown normalization constant is a fundamental problem in computational science and engineering. This task may be cast as an optimization problem over the space of probability measures, in which an initial distribution evolves dynamically towards the desired minimizer (the target distribution) via a gradient flow. Different choices of metric for the gradient flow give rise to different algorithms with different convergence properties.
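As a schematic of this formulation (a sketch; the notation below is illustrative and not taken from the talk): for a target density \(\pi \propto e^{-V}\) with unknown normalization constant \(Z\), one minimizes an energy functional such as the KL divergence over probability measures and follows its gradient flow in a chosen metric \(g\):

\[
\pi(x) = \frac{e^{-V(x)}}{Z}, \qquad
\mathcal{E}(\rho) = \mathrm{KL}(\rho \,\|\, \pi) = \int \rho \log\frac{\rho}{\pi}\,dx, \qquad
\partial_t \rho_t = -\nabla_{g}\,\mathcal{E}(\rho_t).
\]

Since the unknown constant \(\log Z\) enters the first variation \(\delta\mathcal{E}/\delta\rho\) only as an additive constant, it drops out of the resulting dynamics, which is what makes the gradient-flow viewpoint practical for sampling.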

In this talk, I will focus on the Fisher-Rao metric, which is known to be the unique metric (up to scaling) on the space of probability measures that is invariant under diffeomorphisms of the underlying space.
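Concretely (a standard formulation, stated here as background rather than quoted from the talk): in the Fisher-Rao geometry, the gradient flow of an energy \(\mathcal{E}\) is the reaction equation

\[
\partial_t \rho_t = -\rho_t \left( \frac{\delta \mathcal{E}}{\delta \rho}(\rho_t) - \int \frac{\delta \mathcal{E}}{\delta \rho}(\rho_t)\,\rho_t\,dx \right),
\]

and for \(\mathcal{E}(\rho) = \mathrm{KL}(\rho \,\|\, \pi)\) this becomes

\[
\partial_t \rho_t = -\rho_t \left( \log\frac{\rho_t}{\pi} - \int \rho_t \log\frac{\rho_t}{\pi}\,dx \right).
\]

The subtracted mean keeps the total mass equal to one, and the unknown normalization constant of \(\pi\) cancels in the difference.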
In contrast with the Wasserstein metric, a significant challenge arises from the absence of geodesic convexity under the Fisher-Rao metric for common energy functionals such as the Kullback-Leibler (KL) divergence. I will present a novel functional inequality for the Fisher-Rao gradient flow, which yields a uniform exponential rate of convergence for the flow associated with the KL divergence, as well as for large families of f-divergences.
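Schematically (with a generic constant \(\lambda > 0\); the precise inequality and rate are those of the talk and are not reproduced here): a functional inequality of Polyak-Łojasiewicz type, bounding the energy by the squared Fisher-Rao gradient norm, converts the energy dissipation identity along the flow into exponential decay via Grönwall's inequality:

\[
\lambda\,\mathcal{E}(\rho) \le \big\| \nabla_{\mathrm{FR}}\,\mathcal{E}(\rho) \big\|_{\rho}^{2}
\quad\Longrightarrow\quad
\frac{d}{dt}\,\mathcal{E}(\rho_t) = -\big\| \nabla_{\mathrm{FR}}\,\mathcal{E}(\rho_t) \big\|_{\rho_t}^{2} \le -\lambda\,\mathcal{E}(\rho_t)
\quad\Longrightarrow\quad
\mathcal{E}(\rho_t) \le e^{-\lambda t}\,\mathcal{E}(\rho_0).
\]

Here a natural reading of "uniform" is that the rate \(\lambda\) does not degrade with the particular target or initialization, unlike rates derived from geodesic convexity assumptions.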