Online Model Selection by Weighted Rolling Validation
Jing Lei
Speaker
Carnegie Mellon University
Tuesday, Aug 5: 9:50 AM - 10:15 AM
Invited Paper Session
Music City Center
Online nonparametric estimators are gaining popularity due to their efficient computation and competitive generalization ability. Important examples include variants of stochastic gradient descent. These algorithms often take one sample point at a time and incrementally update the parameter estimate of interest. In this work, we consider model selection and hyperparameter tuning for such online algorithms. We propose a weighted rolling validation (wRV) procedure, an online variant of leave-one-out cross-validation, that incurs minimal extra computation for many typical stochastic gradient descent estimators and preserves their online nature. We study the model selection behavior of wRV under a general stability framework and reveal some unexpected advantages of wRV over its batch counterpart.
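The abstract's core idea, rolling validation, can be illustrated with a short sketch: each candidate estimator predicts an incoming point *before* updating on it, so validation error accrues as a byproduct of training. The polynomial weight `t**xi`, the linear SGD estimators, and the learning-rate grid below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def weighted_rolling_validation(stream, learning_rates, xi=0.1, dim=3):
    """Illustrative sketch of weighted rolling validation (wRV).

    Each candidate is an SGD estimator for linear regression with its own
    learning rate (the hyperparameter being tuned). Before an estimator is
    updated with a new point, its prediction error on that point is recorded,
    so validation adds almost no extra computation and the procedure stays
    online. Errors are weighted by t**xi (a hypothetical polynomial weight
    that emphasizes recent performance), and the candidate with the smallest
    weighted cumulative error is selected.
    """
    thetas = [np.zeros(dim) for _ in learning_rates]
    scores = [0.0 for _ in learning_rates]
    for t, (x, y) in enumerate(stream, start=1):
        w = t ** xi  # weight later (more informative) errors more heavily
        for k, lr in enumerate(learning_rates):
            pred = thetas[k] @ x                # predict BEFORE updating
            scores[k] += w * (pred - y) ** 2    # rolling validation error
            thetas[k] += lr * (y - pred) * x    # one SGD step on squared loss
    best = int(np.argmin(scores))
    return best, thetas[best]

# Usage on a synthetic stream: y = theta_true @ x + noise.
rng = np.random.default_rng(0)
theta_true = np.array([1.0, -2.0, 0.5])
stream = []
for _ in range(500):
    x = rng.standard_normal(3)
    y = theta_true @ x + 0.1 * rng.standard_normal()
    stream.append((x, y))

best, theta_hat = weighted_rolling_validation(stream, learning_rates=[0.0005, 0.05])
```

Here wRV should favor the larger learning rate, whose estimator tracks `theta_true` far better over 500 points, while the tiny rate barely moves from zero.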
model selection
cross-validation
online learning
nonparametric regression