A Variational Non-Asymptotic Testing Procedure for Causal Discovery

Paolo Borello, First Author and Presenting Author

Yixin Wang, Co-Author

Yian Ma, Co-Author
University of California San Diego
 
Tuesday, Aug 5: 10:30 AM - 12:20 PM
1567 
Contributed Posters 
Music City Center 
Causal discovery in finite-sample settings poses significant challenges for feasibility and precision, yet most existing work assumes asymptotic conditions or discrete support of the causal variables. This paper examines the fundamental limits of causal discovery under finite data constraints, focusing on function complexity and statistical guarantees in continuous settings. We introduce a novel framework for identifying approximate causal relationships, using KL-divergence minimization to estimate causal effects. Our approach adapts inherently to the finite-sample regime, offering robustness guarantees and capturing non-linear dependencies through extensions to reproducing kernel Hilbert spaces. Additionally, we develop a testing procedure to discern the direction of causality, enhancing the practical applicability of our framework in data-limited contexts. These contributions clarify the feasibility of causal inference when data are scarce and establish theoretical bounds for the estimation of complex functional relationships.
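The abstract names a procedure for testing the direction of causality but does not spell it out. As a rough, self-contained illustration of the general idea behind regression-based direction tests (an additive-noise heuristic with a crude moment-based dependence score, not the authors' KL-divergence procedure; the synthetic data and all function names below are hypothetical choices for this sketch):

```python
import numpy as np

# Sketch of an additive-noise direction heuristic (NOT the paper's method):
# under Y = f(X) + noise with noise independent of X, the residual of the
# forward regression should be independent of the input, while the residual
# of the backward regression typically is not.

rng = np.random.default_rng(0)
n = 3000
x = rng.uniform(-1.0, 1.0, n)            # hypothetical cause
y = x**3 + 0.05 * rng.normal(size=n)     # effect: nonlinear mechanism + noise

def dependence_score(inp, res):
    """Crude dependence measure: |corr(inp^2, res^2)|.
    Close to zero when the residual is independent of the input."""
    return abs(np.corrcoef(inp**2, res**2)[0, 1])

def fit_residual(a, b, deg=5):
    """Residual of a degree-`deg` polynomial regression of b on a."""
    coeffs = np.polyfit(a, b, deg)
    return b - np.polyval(coeffs, a)

forward_score = dependence_score(x, fit_residual(x, y))   # test X -> Y
backward_score = dependence_score(y, fit_residual(y, x))  # test Y -> X

# The smaller score marks the heuristically preferred causal direction.
print("X->Y score:", forward_score)
print("Y->X score:", backward_score)
```

On this synthetic example the forward (true) direction yields a near-zero score, while the backward regression leaves heteroscedastic, input-dependent residuals; the paper's non-asymptotic, KL-based test replaces this ad hoc score with a statistic that carries finite-sample guarantees.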

Keywords

Causal Discovery

KL Divergence

Finite Samples

Main Sponsor

Section on Statistical Learning and Data Science