A Fully Bayesian Framework for Built-in Input Dimension Reduction and Gaussian Process Modeling

Eric Herrison Gyamfi (First Author and Presenting Author), University of Cincinnati

Emily Kang (Co-Author), University of Cincinnati

Bledar Konomi (Co-Author), University of Cincinnati
Sunday, Aug 3: 2:05 PM - 2:20 PM
2447 
Contributed Papers 
Music City Center 
Abstract

Gaussian process (GP) modeling is widely used in computational science and engineering, but fitting a GP to high-dimensional inputs remains challenging due to the curse of dimensionality. Various methods have been proposed to reduce input dimensionality, yet they typically follow a two-stage approach, performing dimension reduction and GP fitting separately. We introduce a fully Bayesian framework that integrates dimension reduction directly into GP modeling and inference. Built on a hierarchical Bayesian model with priors on the Stiefel manifold, our approach enforces orthonormality of the projection matrix and enables posterior inference via Hamiltonian Monte Carlo with geodesic flow. We further extend this framework by incorporating Deep Gaussian Processes (DGPs) with built-in dimension reduction, providing a more flexible and powerful tool for complex datasets. Extensive numerical studies demonstrate that, although the proposed Bayesian method incurs higher computational cost, it improves predictive performance and uncertainty quantification, offering a principled and robust alternative to existing methods.
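The abstract names two computational ingredients: a GP whose kernel acts on the projected inputs XW, with the projection matrix W constrained to have orthonormal columns (a point on the Stiefel manifold), and Hamiltonian Monte Carlo updates that follow geodesic flow so the constraint is preserved exactly. The Python sketch below is purely illustrative and not taken from the paper; the function names, the squared-exponential kernel, and the particular closed-form Stiefel geodesic (the expression used in geodesic Monte Carlo on embedded manifolds, Byrne and Girolami, 2013) are assumptions made for this example.

import numpy as np
from scipy.linalg import expm

def rbf_kernel(Z1, Z2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel evaluated on the projected, low-dimensional inputs.
    sq_dists = ((Z1[:, None, :] - Z2[None, :, :]) ** 2).sum(axis=-1)
    return variance * np.exp(-0.5 * sq_dists / lengthscale ** 2)

def gp_log_marginal(X, y, W, lengthscale=1.0, variance=1.0, noise=1e-2):
    # GP log marginal likelihood with inputs projected as Z = X @ W, where
    # W (p x d, d << p) has orthonormal columns, i.e. lies on the Stiefel manifold.
    Z = X @ W
    n = len(y)
    K = rbf_kernel(Z, Z, lengthscale, variance) + noise * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.log(np.diag(L)).sum() - 0.5 * n * np.log(2.0 * np.pi)

def stiefel_geodesic(W, V, t):
    # Advance position W and velocity V along a Stiefel geodesic for time t, so
    # that W retains exactly orthonormal columns (closed form assumed from
    # geodesic Monte Carlo on embedded manifolds, not from the paper itself).
    d = W.shape[1]
    A = W.T @ V                      # skew-symmetric when V is tangent at W
    S = V.T @ V
    E = expm(t * np.block([[A, -S], [np.eye(d), A]]))
    rot = expm(-t * A)
    WV = np.hstack([W, V]) @ E
    return WV[:, :d] @ rot, WV[:, d:] @ rot

# Quick check that a geodesic step stays on the manifold.
rng = np.random.default_rng(0)
p, d = 20, 2
W, _ = np.linalg.qr(rng.standard_normal((p, d)))   # start with orthonormal columns
V = rng.standard_normal((p, d))
V -= W @ (W.T @ V + V.T @ W) / 2                    # project onto the tangent space at W
W_new, V_new = stiefel_geodesic(W, V, t=0.1)
print(np.allclose(W_new.T @ W_new, np.eye(d)))      # True: orthonormality preserved

Inside a full sampler of the kind the abstract describes, the velocity would also receive gradient updates of gp_log_marginal (plus the Stiefel prior) between geodesic steps; that part, and the Deep GP extension, are omitted from this sketch.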

Keywords

Bayesian Inference

Dimension Reduction

Gaussian Processes

Hamiltonian Monte Carlo

Stiefel Manifold

Uncertainty Quantification 

Main Sponsor

Section on Bayesian Statistical Science