Generative Calibration for Valid Inference: Bridging Inferential Models and Simulation

Hyeong Jin Hyun, Co-Author
Halin Shin, Co-Author
Xiao Wang, Co-Author
Purdue University
Haoyun Yin, First Author and Presenting Author

Monday, Aug 4: 2:50 PM - 3:05 PM
2674 
Contributed Papers 
Music City Center 
Modern simulation-based inference methods face challenges in achieving finite-sample validity, particularly in high-dimensional settings. Inferential models (IMs) offer a prior-free framework for statistically reliable inference, merging Bayesian-like reasoning with frequentist calibration guarantees. However, practical deployment of IMs is hindered by their possibilistic uncertainty quantification, which resists approximation by conventional Monte Carlo tools. We introduce a generative calibration framework that trains a generative model to sample parameters, which are then used to generate synthetic datasets and to evaluate a discrepancy function between observed and simulated data. By calibrating the generative model so that this discrepancy follows a uniform distribution, we obtain exact frequentist confidence regions. A loss function that penalizes deviations from uniformity iteratively refines the generative model, without requiring priors or asymptotic assumptions. Experiments confirm the method's effectiveness in high-dimensional regression and real-world applications, delivering nominal coverage and outperforming calibrated bootstrap and Bayesian methods in finite samples.
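The core calibration idea can be sketched in a few lines. The toy example below is an illustrative assumption, not the authors' implementation: it uses a Gaussian forward model, a mean-difference discrepancy, a fixed reference distribution for the "true" parameter, and a crude grid search over the generative model's scale. It only demonstrates the central check that discrepancy ranks should be Uniform(0, 1) when the generative pipeline is well calibrated, with a Kolmogorov-Smirnov statistic serving as the uniformity loss.

```python
# Hypothetical sketch of a generative-calibration loop (not the authors' code).
import numpy as np

rng = np.random.default_rng(0)

def simulate_data(theta, n_obs=30):
    """Assumed forward model: n_obs i.i.d. draws from N(theta, 1)."""
    return rng.normal(theta, 1.0, size=n_obs)

def discrepancy(y, center):
    """Illustrative discrepancy: distance of the sample mean from the
    generative model's center (a stand-in for the paper's discrepancy)."""
    return abs(y.mean() - center)

def calibration_ranks(mu_g, sigma_g, n_rep=300, n_sim=400, n_obs=30):
    """Rank of the observed discrepancy within discrepancies produced by the
    generative pipeline theta ~ N(mu_g, sigma_g), y_sim ~ N(theta, 1).
    If the pipeline matches the data-generating process, ranks ~ Uniform(0, 1)."""
    thetas = rng.normal(mu_g, sigma_g, size=n_sim)
    d_sim = np.array([discrepancy(simulate_data(t, n_obs), mu_g) for t in thetas])
    ranks = []
    for _ in range(n_rep):
        theta_true = rng.normal(0.0, 1.0)          # assumed reference truth
        y_obs = simulate_data(theta_true, n_obs)
        ranks.append((d_sim <= discrepancy(y_obs, mu_g)).mean())
    return np.array(ranks)

def uniformity_loss(ranks):
    """Kolmogorov-Smirnov-style distance between the ranks and Uniform(0, 1)."""
    ranks = np.sort(ranks)
    grid = np.arange(1, ranks.size + 1) / ranks.size
    return np.max(np.abs(ranks - grid))

# Crude refinement step: keep the generative scale whose ranks look most uniform.
losses = {s: uniformity_loss(calibration_ranks(0.0, s)) for s in (0.3, 1.0, 3.0)}
best = min(losses, key=losses.get)
print(f"best sigma = {best:.1f}, KS loss = {losses[best]:.3f}")
```

In this toy setup, mu_g = 0 and sigma_g = 1 make the generative pipeline match the assumed data-generating process, so the uniformity loss is smallest there; the paper's framework replaces the grid search with iterative refinement of a richer generative model.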

Keywords

Inferential models

simulation-based inference

uncertainty quantification

generative modeling

frequentist calibration 

Main Sponsor

IMS