Sparse-Input Neural Network using Group Concave Regularization

Bin Luo, First Author and Presenting Author
Kennesaw State University

Susan Halabi, Co-Author
Duke University
 
Monday, Aug 4: 12:10 PM - 12:15 PM
2652
Contributed Speed
Music City Center
Simultaneous feature selection and non-linear function estimation are challenging, especially in high-dimensional settings where the number of variables exceeds the available sample size. We investigate feature selection in neural networks and address the limitations of group LASSO, which tends to select unimportant variables due to over-shrinkage. To overcome this, we propose a sparse-input neural network framework using group concave regularization for feature selection in both low- and high-dimensional settings. The key idea is to apply a concave penalty to the $\ell_2$ norm of the weights on all outgoing connections of each input node, yielding a neural network that uses only a small subset of variables. We also develop an efficient algorithm based on backward path-wise optimization to produce stable solution paths and tackle complex optimization landscapes. Extensive simulations and real data examples demonstrate the proposed estimator's strong performance in feature selection and prediction for continuous, binary, and time-to-event outcomes.
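
To make the penalty concrete in our own notation: with $\mathbf{w}_j$ denoting the vector of weights on all outgoing connections of input node $j$, the estimator described above minimizes an empirical loss plus $\sum_{j=1}^{p} \rho_\lambda(\|\mathbf{w}_j\|_2)$, where $\rho_\lambda$ is a concave penalty (MCP and SCAD are standard examples). Below is a minimal PyTorch sketch of this idea, assuming MCP as the concave penalty and a warm-started loop over an increasing $\lambda$ grid as a stand-in for the backward path-wise optimization; the plain gradient treatment of the penalty (the paper's algorithm may use a different update) and the names SparseInputNet and fit_path are illustrative assumptions, not details from the abstract.

```python
import torch
import torch.nn as nn

def mcp(t, lam, gamma=3.0):
    # Minimax concave penalty (MCP), applied elementwise to the
    # nonnegative group norms t: linear near zero, flat past gamma*lam.
    return torch.where(
        t <= gamma * lam,
        lam * t - t ** 2 / (2 * gamma),
        torch.full_like(t, 0.5 * gamma * lam ** 2),
    )

class SparseInputNet(nn.Module):  # hypothetical name, for illustration
    def __init__(self, p, hidden=16):
        super().__init__()
        self.first = nn.Linear(p, hidden)
        self.rest = nn.Sequential(nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, x):
        return self.rest(self.first(x))

    def group_penalty(self, lam, gamma=3.0):
        # Column j of the first-layer weight matrix holds all outgoing
        # connections of input node j; penalize each column's l2 norm.
        norms = self.first.weight.norm(dim=0)  # shape (p,)
        return mcp(norms, lam, gamma).sum()

def fit_path(X, y, lams, epochs=200, lr=1e-2):
    # Backward path-wise idea (as we read it): start from the dense,
    # lightly penalized fit and warm-start along an increasing lambda grid,
    # reusing the previous weights to initialize each new fit.
    model = SparseInputNet(X.shape[1])
    path = []
    for lam in sorted(lams):
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        for _ in range(epochs):
            opt.zero_grad()
            loss = nn.functional.mse_loss(model(X).squeeze(-1), y)
            (loss + model.group_penalty(lam)).backward()
            opt.step()
        path.append(model.first.weight.norm(dim=0).detach())  # per-feature norms
    return path
```

Along such a path, features whose column norms are driven to zero are screened out; because MCP flattens for large norms, strong signals are shrunk less than under group LASSO, which is the over-shrinkage the abstract targets.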

Keywords

Neural networks

Feature selection

High dimensionality

LASSO

Nonconvex penalty

Main Sponsor

Section on Statistical Learning and Data Science