Post-selection Inference for Conformal Prediction: Trading off Coverage for Precision

Arun Kumar Kuchibhotla Co-Author
Carnegie Mellon University
 
Siddhaarth Sarkar First Author
Carnegie Mellon University
 
Siddhaarth Sarkar Presenting Author
Carnegie Mellon University
 
Thursday, Aug 7: 9:50 AM - 10:05 AM
2616 
Contributed Papers 
Music City Center 
Conformal inference has played a pivotal role in providing uncertainty quantification with finite-sample guarantees for black-box ML prediction algorithms. Traditionally, conformal prediction requires a data-independent specification of the miscoverage level. In practice, however, one may wish to update the miscoverage level after computing the prediction set. Constructing prediction sets that guarantee coverage at a data-dependent miscoverage level can be cast as a post-selection inference problem. In this work, we develop simultaneous conformal inference to account for data-dependent miscoverage levels. Assuming independent and identically distributed observations, our proposed methods enjoy a finite-sample simultaneous guarantee over all miscoverage levels. We also propose methods with the same guarantee over a user-specified set of miscoverage levels. This allows practitioners to trade coverage probability for the quality of the prediction set, by any criterion of their choice (say, the size of the prediction set), while maintaining finite-sample guarantees akin to traditional conformal inference.
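To make the idea concrete, here is a minimal sketch, not the authors' method, of how a simultaneous guarantee over miscoverage levels might be obtained: standard split conformal computes an empirical quantile of calibration scores at a fixed level, while a simultaneous version can inflate that level by a uniform CDF-band width (here taken from the Dvoretzky–Kiefer–Wolfowitz inequality, an assumption for illustration) so the resulting quantile is valid across all data-dependent choices of the miscoverage level alpha.

```python
import numpy as np

def split_conformal_quantile(scores, alpha):
    # Standard split conformal cutoff: the ceil((n+1)(1-alpha))-th
    # order statistic of the calibration scores.
    n = len(scores)
    k = min(int(np.ceil((n + 1) * (1 - alpha))), n)
    return np.sort(scores)[k - 1]

def dkw_adjusted_quantile(scores, alpha, delta=0.05):
    # Illustrative simultaneous adjustment (not the paper's construction):
    # inflate the quantile level by the DKW band half-width
    # eps = sqrt(log(2/delta) / (2n)), which makes the empirical CDF
    # uniformly accurate, so the cutoff is valid for every alpha at once.
    n = len(scores)
    eps = np.sqrt(np.log(2 / delta) / (2 * n))
    level = min(1.0, 1 - alpha + eps)
    k = min(max(int(np.ceil(n * level)), 1), n)
    return np.sort(scores)[k - 1]

# Toy example: absolute-residual scores from some fitted predictor.
rng = np.random.default_rng(0)
cal_scores = np.abs(rng.normal(size=500))
for a in (0.2, 0.1, 0.05):
    q_plain = split_conformal_quantile(cal_scores, a)
    q_simul = dkw_adjusted_quantile(cal_scores, a)
    # The simultaneous cutoff is never smaller: this is the price
    # (a wider prediction set) paid for post-selection validity.
    assert q_simul >= q_plain
```

The gap between the two cutoffs shrinks at rate 1/sqrt(n), illustrating the coverage-for-precision trade-off in the title: choosing alpha after looking at the data costs a uniform widening of the prediction set.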

Keywords

post-selection inference

conformal prediction

distribution-free

CDF confidence bands

black-box methods 

Main Sponsor

Section on Statistical Learning and Data Science