Deterministic Equivalents and Scaling Laws for Random Feature Regression
Monday, Aug 4: 11:05 AM - 11:35 AM
Invited Paper Session
Music City Center
In this talk, we revisit random feature ridge regression (RFRR), a model that has recently gained renewed interest for investigating puzzling phenomena in deep learning, such as double descent, benign overfitting, and scaling laws. Our main contribution is a general deterministic equivalent for the test error of RFRR. Specifically, under a certain concentration property, we show that the test error is well approximated by a closed-form expression that depends only on the feature map eigenvalues. Notably, our approximation guarantee is non-asymptotic, multiplicative, and independent of the feature map dimension, allowing for infinite-dimensional features.
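For intuition, here is a minimal sketch of what a deterministic equivalent of this flavor looks like, written for plain ridge regression rather than the talk's exact RFRR statement (the RFRR version involves an additional fixed point coupling the number of features). With sample size $n$, ridge penalty $\lambda$, eigenvalues $\lambda_i$, target coefficients $\beta_i^*$, and noise level $\sigma^2$, an effective regularization $\lambda_*$ solves

\[ \lambda = \lambda_* \Big( 1 - \frac{1}{n} \sum_i \frac{\lambda_i}{\lambda_i + \lambda_*} \Big), \]

and the test error is approximated by the closed form

\[ \mathcal{R} \;\approx\; \frac{\lambda_*^2 \sum_i \frac{\lambda_i (\beta_i^*)^2}{(\lambda_i + \lambda_*)^2} \;+\; \frac{\sigma^2}{n} \sum_i \frac{\lambda_i^2}{(\lambda_i + \lambda_*)^2}}{1 - \frac{1}{n} \sum_i \frac{\lambda_i^2}{(\lambda_i + \lambda_*)^2}}, \]

which indeed depends on the problem only through the eigenvalues and the target's spectral decomposition.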
This deterministic equivalent can be used to precisely capture the above phenomenology in RFRR. As an example, we derive sharp excess error rates under standard power-law assumptions on the spectrum and target decay. In particular, we tightly characterize the optimal parametrization achieving the minimax rate.
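To make such rate statements concrete, here is a hypothetical numerical companion in Python. It uses the simplified ridge-regression equivalent sketched above (not the talk's RFRR formulas), assumes a power-law spectrum lambda_i = i^(-alpha) with target energies lambda_i (beta_i*)^2 = i^(-beta), tunes the ridge on a crude grid, and fits the resulting scaling-law exponent in n; all exponents and grid choices are illustrative assumptions.

import numpy as np

# Hypothetical sketch: the simplified ridge-regression deterministic
# equivalent above, under assumed power-law decays (not the talk's
# exact RFRR formulas or exponents).
alpha, beta, sigma2 = 1.5, 2.0, 0.1       # assumed decay exponents, noise level
idx = np.arange(1, 200_001, dtype=float)  # truncated spectrum, illustrative
lam = idx ** (-alpha)                     # eigenvalues lambda_i = i^(-alpha)
tgt = idx ** (-beta)                      # target energies lambda_i * (beta_i*)^2

def effective_reg(n, ridge):
    # Solve ridge = nu * (1 - (1/n) * sum_i lam_i / (lam_i + nu)).
    # The start value upper-bounds the fixed point, so the monotone
    # iteration decreases to the solution.
    nu = ridge + lam.sum() / n
    for _ in range(200):
        nu_next = ridge + (nu / n) * np.sum(lam / (lam + nu))
        if abs(nu_next - nu) < 1e-12 * nu:
            break
        nu = nu_next
    return nu

def det_risk(n, ridge):
    # Closed-form approximation of the test error at this (n, ridge).
    nu = effective_reg(n, ridge)
    D = np.sum(lam ** 2 / (lam + nu) ** 2) / n
    bias = nu ** 2 * np.sum(tgt / (lam + nu) ** 2)
    return (bias + sigma2 * D) / (1.0 - D)

# Tune the ridge on a crude grid at each n, then fit the log-log slope.
ns = [250, 500, 1000, 2000, 4000]
risks = [min(det_risk(n, r) for r in np.logspace(-7, 0, 29)) for n in ns]
slope = np.polyfit(np.log(ns), np.log(risks), 1)[0]
print(f"fitted excess-error exponent: {slope:.2f}")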
This is based on joint work with Basil Saeed (Stanford), Leonardo Defilippis (ENS), and Bruno Loureiro (ENS).
Random feature regression
Random matrix theory
Scaling laws
Benign overfitting