Deep P-Spline: Fast Tuning, Theory, and Application
Monday, Aug 4: 2:25 PM - 2:45 PM
Topic-Contributed Paper Session
Music City Center
Surrogate modeling is essential for analyzing expensive computer experiments but often suffers from the curse of dimensionality, limiting its scalability and interpretability. To address these challenges, we propose a novel composite framework that combines the uncertainty quantification of Gaussian process models with the scalability of deep neural networks (DNNs). A key innovation of our approach is the integration of basis expansion with a difference penalty, providing an efficient and interpretable solution for network structure selection. By drawing an analogy between neuron selection in DNNs and knot selection in splines, we extend statistical frameworks to deep learning, enabling automated and computationally efficient structure tuning through an Expectation Conditional Maximization (ECM)-based algorithm. Our penalized framework offers unique flexibility by decoupling penalty order from spline basis degree, enhancing model design. Theoretical analyses reveal that the proposed method achieves superior convergence rates and effectively circumvents the curse of dimensionality, making it suitable for nonlinear regression with high-dimensional inputs. Applications to surrogate modeling demonstrate the framework's versatility and robustness. This work establishes a principled foundation for scalable, interpretable, and theoretically grounded DNN-based surrogate modeling techniques.
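To make the penalty construction concrete, the following is a minimal numpy sketch of a classical one-dimensional P-spline fit, not the deep composite framework described above. It illustrates the key property the abstract highlights: the difference-penalty order (here 2) is chosen independently of the B-spline basis degree (here 3). The basis construction, knot layout, and smoothing parameter `lam` are illustrative choices, not details from the paper.

```python
import numpy as np

def bspline_basis(x, knots, degree):
    """Evaluate all B-spline basis functions at points x via Cox-de Boor recursion."""
    # Degree-0 basis: indicator functions of the knot intervals.
    B = np.zeros((len(x), len(knots) - 1))
    for j in range(len(knots) - 1):
        B[:, j] = (knots[j] <= x) & (x < knots[j + 1])
    # Raise the degree one step at a time.
    for d in range(1, degree + 1):
        Bn = np.zeros((len(x), len(knots) - d - 1))
        for j in range(len(knots) - d - 1):
            left = knots[j + d] - knots[j]
            right = knots[j + d + 1] - knots[j + 1]
            if left > 0:
                Bn[:, j] += (x - knots[j]) / left * B[:, j]
            if right > 0:
                Bn[:, j] += (knots[j + d + 1] - x) / right * B[:, j + 1]
        B = Bn
    return B

rng = np.random.default_rng(0)
x = np.linspace(0.0, 0.999, 200)
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(len(x))

# Cubic basis on 20 equal segments of [0, 1], with knots extended past the boundary.
degree, nseg = 3, 20
h = 1.0 / nseg
knots = np.arange(-degree, nseg + degree + 1) * h
B = bspline_basis(x, knots, degree)          # shape (200, 23)

# Second-order difference penalty: order 2 is independent of the cubic degree.
K = B.shape[1]
D = np.diff(np.eye(K), n=2, axis=0)
lam = 0.1

# Penalized least squares: (B'B + lam * D'D) theta = B'y.
theta = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)
fit = B @ theta
```

Changing `n=2` in `np.diff` to another order swaps the penalty without touching the basis, which is the decoupling the abstract refers to; the ECM-based tuning in the paper replaces the fixed `lam` used here.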