Optimal Ridge Regularization for Out-of-Distribution Prediction
Abstract Number:
3340
Submission Type:
Contributed Abstract
Contributed Abstract Type:
Paper
Participants:
Pratik Patil (1), Jin-Hong Du (2), Ryan Tibshirani (3)
Institutions:
(1) University of California, Berkeley, N/A, (2) Carnegie Mellon University, United States, (3) Carnegie Mellon University, N/A
Abstract Text:
We study the behavior of optimal ridge regularization and optimal ridge risk for out-of-distribution prediction, where the test distribution may deviate arbitrarily from the train distribution. We establish general conditions that determine the sign of the optimal regularization level under covariate and regression shifts. These conditions capture the alignment between the covariance and signal structures of the train and test data, and they reveal stark differences from the in-distribution setting (where the test and train distributions agree); for example, a negative regularization level can be optimal under covariate shift, even when the training features are isotropic. Furthermore, we prove that the optimally tuned risk is monotonic in the data aspect ratio, even in the out-of-distribution setting. Our results make no modeling assumptions on the train or test distributions beyond moment bounds, and they allow for arbitrary shifts and the widest possible range of (negative) regularization levels.
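The setting described above can be illustrated with a small numerical sketch (not from the paper): train ridge regression on isotropic Gaussian features, evaluate the prediction risk under a hypothetical anisotropic test covariance, and scan the regularization level over its widest admissible range, including negative values. All data-generating choices here (dimensions, signal, noise scale, test covariance) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 300, 100  # illustrative sample size and dimension

# Training data: isotropic features (Sigma_train = I), linear signal plus noise.
beta = rng.normal(size=p) / np.sqrt(p)
X = rng.normal(size=(n, p))
y = X @ beta + rng.normal(scale=0.5, size=n)

# Hypothetical covariate shift: anisotropic test covariance (assumed for illustration).
Sigma_test = np.diag(np.linspace(0.1, 3.0, p))

Shat = X.T @ X / n                       # sample covariance of the training features
lam_min = np.linalg.eigvalsh(Shat).min() # smallest eigenvalue; bounds admissible lambda

def ood_risk(lam):
    """Out-of-distribution excess prediction risk of the ridge estimator,
    (beta_hat - beta)^T Sigma_test (beta_hat - beta), for this training sample.
    The irreducible test noise is constant in lambda, so it is omitted."""
    bhat = np.linalg.solve(Shat + lam * np.eye(p), X.T @ y / n)
    d = bhat - beta
    return float(d @ Sigma_test @ d)

# Scan lambda over a grid that includes negative values; the ridge system
# stays positive definite only for lambda > -lambda_min(Shat).
grid = np.linspace(-0.9 * lam_min, 2.0, 400)
risks = [ood_risk(l) for l in grid]
lam_star = grid[int(np.argmin(risks))]
print(f"optimal lambda on grid: {lam_star:.3f}")
```

Depending on the draw and the assumed test covariance, the risk-minimizing level on the grid can land below zero, consistent with the abstract's point that negative regularization can be optimal under covariate shift even with isotropic training features.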
Keywords:
Ridge regression|Optimal regularization|Distribution shift|Covariate shift|Regression shift|Risk monotonicity
Sponsors:
IMS
Tracks:
Statistical Theory
Can this be considered for alternate subtype?
Yes
Are you interested in volunteering to serve as a session chair?
Yes
I have read and understand that JSM participants must abide by the Participant Guidelines.
Yes
I understand that JSM participants must register and pay the appropriate registration fee by June 1, 2024. The registration fee is non-refundable.
I understand