14: Ranked Sparsity in Penalized Regression: Fun with Penalty Factors
Ryan Peterson
Speaker
University of Colorado - Anschutz Medical Campus
Sunday, Aug 3: 8:30 PM - 9:25 PM
Invited Posters
Music City Center
The ranked sparsity framework is critical in widespread settings where predictor variables can be expected to contribute differing levels of information when describing or characterizing the outcome (i.e., "mixed signals"). We motivate ranked sparsity via the Bayesian interpretation of the lasso, challenging the presumption that all covariates are equally worthy of entering into a model. Specifically, we illustrate the utility of ranked sparsity in the following settings: 1) for evaluating covariates belonging to groups of varying sizes or qualities, 2) for evaluating covariates representing derived variables (e.g., interactions), 3) for fitting time series models with complex seasonality and/or exogenous features, 4) for facilitating hypothesis testing for time-based interventions on complex time series data, and 5) for performing incomplete principal components regression. We highlight specific examples of each application, and we also present a large-scale predictive-model bake-off, showing how sparsity-ranked penalized regression can produce highly interpretable, transparent models with competitive prediction accuracy.
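The core mechanism behind sparsity-ranked penalized regression is assigning covariate-specific penalty factors so that "less worthy" terms (e.g., derived variables such as interactions) must overcome a heavier penalty to enter the model. A minimal sketch of that idea follows, using the standard equivalence between a weighted lasso and an ordinary lasso fit on rescaled columns; the specific penalty factors and data below are illustrative assumptions, not the author's method or software.

```python
# Sketch: weighted lasso via column rescaling. Minimizing
#   (1/2n)||y - Xb||^2 + alpha * sum_j w_j |b_j|
# is equivalent to fitting an ordinary lasso on X_j / w_j and then
# dividing the fitted coefficients by w_j.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n = 200
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
y = 2.0 * x1 + 0.5 * x2 + rng.standard_normal(n)  # no true interaction

# Design matrix: two main effects plus their (derived) interaction
X = np.column_stack([x1, x2, x1 * x2])

# Illustrative penalty factors: the derived interaction column is
# penalized more heavily than the raw main effects
w = np.array([1.0, 1.0, 2.0])

model = Lasso(alpha=0.1, fit_intercept=True)
model.fit(X / w, y)        # rescale each column by 1 / w_j
beta = model.coef_ / w     # map coefficients back to the original scale
print(beta)
```

With these factors, the interaction term faces twice the effective penalty of the main effects, so it enters the selected model only if the data strongly support it; this is the "ranking" in ranked sparsity.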