Conditional Independence Testing and Conformal Inference with Adaptively Collected Data

Yash Nair (Co-Author), Stanford University
Lucas Janson (Speaker)

Wednesday, Aug 7: 11:45 AM - 12:15 PM
Invited Paper Session 
Oregon Convention Center 
Randomization testing is a fundamental method in statistics, enabling inferential tasks such as testing for (conditional) independence of random variables, constructing confidence intervals in semiparametric location models, and constructing (by inverting a permutation test) model-free prediction intervals via conformal inference. Randomization tests are exactly valid for any sample size, but their use is generally confined to exchangeable data. Yet in many applications, data is routinely collected adaptively via, e.g., (contextual) bandit and reinforcement learning algorithms or adaptive experimental designs. In this paper we present a general framework for randomization testing on adaptively collected data (despite its non-exchangeability) that uses a weighted randomization test, for which we also present computationally tractable resampling algorithms for various popular adaptive assignment algorithms, data-generating environments, and types of inferential tasks. Finally, we demonstrate via a range of simulations the efficacy of our framework for both testing and confidence/prediction interval construction. This is joint work with Yash Nair, and the relevant paper is https://arxiv.org/abs/2301.05365.
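To make the idea of a weighted randomization test concrete, below is a minimal generic sketch. It is not the paper's method: the helper names (`weighted_randomization_pvalue`, `permutation_test_independence`) are hypothetical, and the weights are placeholders that, in the adaptive-data setting of the paper, would instead be derived from the assignment mechanism via the resampling algorithms it develops. The sketch only shows the general form of a weighted randomization p-value and checks that uniform weights recover the usual permutation test of independence.

```python
import numpy as np


def weighted_randomization_pvalue(T_obs, T_resamples, w_obs, w_resamples):
    """Generic weighted randomization p-value.

    The observed statistic T_obs is compared against resampled statistics
    T_resamples; each draw (including the observed one) carries a
    nonnegative weight. With uniform weights this reduces to the usual
    permutation p-value, which counts the observed draw itself.
    """
    T_resamples = np.asarray(T_resamples, dtype=float)
    w_resamples = np.asarray(w_resamples, dtype=float)
    numer = w_obs + np.sum(w_resamples[T_resamples >= T_obs])
    denom = w_obs + np.sum(w_resamples)
    return numer / denom


def permutation_test_independence(x, y, n_resamples=2000, rng=None):
    """Ordinary (unweighted) permutation test of independence between x and y,
    using the absolute sample correlation as the test statistic."""
    rng = np.random.default_rng(rng)
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    T_obs = abs(np.corrcoef(x, y)[0, 1])
    T_res = np.array([
        abs(np.corrcoef(rng.permutation(x), y)[0, 1])
        for _ in range(n_resamples)
    ])
    # Uniform weights recover the standard (exchangeable-data) p-value.
    return weighted_randomization_pvalue(T_obs, T_res, 1.0, np.ones(n_resamples))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    y = 0.3 * x + rng.normal(size=200)  # dependent pair: small p-value expected
    print(permutation_test_independence(x, y, rng=rng))
```

The point of the weighted form is that when the data are collected adaptively and hence not exchangeable, the resamples no longer enter symmetrically; assigning each resample a weight reflecting the data-collection mechanism (as the paper makes precise and computationally tractable) is what restores exact validity.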