Structure-Preserving Nonlinear Sufficient Dimension Reduction for Tensor Regression
Tuesday, Aug 5: 11:50 AM - 12:05 PM
Topic-Contributed Paper Session
Music City Center
We present a novel approach to nonlinear sufficient dimension reduction (SDR) for scalar-on-tensor regression and classification problems. Our method defines a tensor product space from several reproducing kernel Hilbert spaces (RKHS) and introduces two kinds of dimension-folding subspaces alongside the conventional SDR subspace within the tensor product space. We demonstrate that, under mild conditions, the range of the regression operator in the tensor product space resides within the conventional SDR subspace. Furthermore, we propose Tucker and CP tensor envelope frameworks, designed to preserve the intrinsic multidimensional structure of tensor-valued predictors while achieving effective dimension reduction. These frameworks bridge the subspaces, enabling us to establish that the tensor envelope of the regression operator is also contained within the dimension-folding subspace. Leveraging population-level and sample-level estimation, and drawing inspiration from these two common tensor decomposition methods, we develop two algorithms to optimize the operator's objective function. We evaluate the performance of our proposed estimators through comprehensive simulations and real-world applications.
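The talk's own estimation algorithms are not spelled out in the abstract, but the Tucker decomposition it draws on is standard. As a point of reference, here is a minimal NumPy sketch of a Tucker decomposition computed via higher-order SVD (HOSVD); the function name, rank choices, and reconstruction helper are illustrative, not the authors' implementation.

```python
import numpy as np

def tucker_hosvd(X, ranks):
    """Tucker decomposition via HOSVD: mode-wise SVDs give the factor
    matrices, then the core is X multiplied by each factor transpose."""
    factors = []
    for mode, r in enumerate(ranks):
        # Unfold X along `mode` and keep the leading r left singular vectors.
        Xm = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)
        U, _, _ = np.linalg.svd(Xm, full_matrices=False)
        factors.append(U[:, :r])
    # Core tensor: G = X x_1 U1^T x_2 U2^T x_3 U3^T (mode products).
    G = X
    for mode, U in enumerate(factors):
        G = np.moveaxis(np.tensordot(U.T, np.moveaxis(G, mode, 0), axes=1),
                        0, mode)
    return G, factors

def tucker_reconstruct(G, factors):
    """Rebuild the tensor from its core and factor matrices."""
    Xhat = G
    for mode, U in enumerate(factors):
        Xhat = np.moveaxis(np.tensordot(U, np.moveaxis(Xhat, mode, 0), axes=1),
                           0, mode)
    return Xhat

# With full ranks the reconstruction is exact; lower ranks give a
# structure-preserving compression of the tensor predictor.
X = np.random.default_rng(0).standard_normal((4, 5, 6))
G, Us = tucker_hosvd(X, (4, 5, 6))
err = np.linalg.norm(X - tucker_reconstruct(G, Us))
```

The CP decomposition mentioned in the abstract is the special case where the core is (super-)diagonal, so each mode contributes one rank-1 direction per component.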
Nonlinear Sufficient Dimension Reduction
Dimension Folding
Tensor Decomposition
Reproducing Kernel Hilbert Space
Tensor Envelope and Tensor Product Space
Coordinate Mapping