Sparse Deep P-Spline Regression with Applications

Noah Hung (Co-Author)
Georgia State University

Li-Hsiang Lin (First Author, Presenting Author)
Georgia State University

Tuesday, Aug 5: 9:50 AM - 10:05 AM
0894 
Contributed Papers 
Music City Center 
Deep neural networks (DNNs) have been widely used for real-world regression tasks, but applying them to high-dimensional, low-sample-size data presents unique challenges. Existing approaches often prioritize sparse linear relationships before extending to the full DNN structure, which can overlook important nonlinear associations. The problem becomes even more complex when selecting the network architecture, such as determining the optimal number of layers and neurons. This study addresses these challenges by linking neuron selection in DNNs to knot placement in basis-expansion techniques and additive modeling, and by introducing a sparsity-inducing difference penalty. This penalty automates knot selection and promotes parsimony in neuron activations, yielding an efficient and scalable fitting method that also optimizes the architecture selection. The proposed method, named Sparse Deep P-Spline, is validated through numerical studies that demonstrate its ability to efficiently detect sparse nonlinear relationships. Applications to the analysis of computer experiments are also presented.
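
As background, a minimal sketch of the penalty idea, assuming the standard P-spline formulation of Eilers and Marx rather than details specific to this abstract: for a B-spline basis matrix B with coefficient vector \theta, the classical P-spline fit solves

    \min_{\theta} \; \| y - B\theta \|_2^2 + \lambda \| D_d \theta \|_2^2,

where D_d is the d-th order difference matrix and \lambda controls smoothness. A sparsity-inducing variant replaces the squared norm with an L1 penalty on the differences,

    \min_{\theta} \; \| y - B\theta \|_2^2 + \lambda \| D_d \theta \|_1,

which can shrink entire difference terms exactly to zero and thereby deactivate the corresponding knots; the penalty described in this abstract presumably acts in this spirit to prune neuron activations.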

Keywords

Deep Smoothing Regression

Additive Models

Feature Selection

Fast Tuning Algorithm

Main Sponsor

Section on Statistical Computing