Revisit CP Tensor Decomposition: Statistical Optimality and Fast Convergence

Julien Chhor, Co-Author
Olga Klopp, Co-Author
Anru Zhang, Co-Author, Duke University
Runshi Tang, First Author and Presenting Author
 
Sunday, Aug 3: 5:35 PM - 5:50 PM
1902 
Contributed Papers 
Music City Center 
We introduce a statistical and computational framework for tensor Canonical Polyadic (CP) decomposition, with a focus on statistical theory, convergence, and algorithmic improvements. First, we show that the Alternating Least Squares (ALS) algorithm achieves the desired error rate within three iterations when $R = 1$. Second, for the more general case where $R > 1$, we derive statistical bounds for ALS, showing that the estimation error exhibits an initial phase of quadratic convergence followed by linear convergence until reaching the desired accuracy. Third, we propose a novel warm-start procedure for ALS in the $R > 1$ setting, which integrates tensor Tucker decomposition with simultaneous diagonalization (Jennrich's algorithm) to significantly enhance performance over existing benchmark methods. Numerical experiments support our theoretical findings, demonstrating the practical advantages of our approach.
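The ALS procedure described above alternates closed-form least-squares updates over the CP factors. For the rank-one ($R = 1$) case highlighted in the abstract, a minimal NumPy sketch (our own illustration, not the authors' implementation; variable names and the three-iteration default are assumptions) looks like:

```python
import numpy as np

def als_rank1(T, n_iter=3, seed=0):
    """Rank-1 CP fit T ~ a o b o c via alternating least squares.

    Illustrative sketch only: the three-iteration default mirrors the
    abstract's R = 1 result, but this is not the authors' code.
    """
    rng = np.random.default_rng(seed)
    _, J, K = T.shape
    b = rng.standard_normal(J)
    c = rng.standard_normal(K)
    for _ in range(n_iter):
        # Each step is the closed-form least-squares solution for one
        # factor with the other two held fixed.
        a = np.einsum('ijk,j,k->i', T, b, c) / ((b @ b) * (c @ c))
        b = np.einsum('ijk,i,k->j', T, a, c) / ((a @ a) * (c @ c))
        c = np.einsum('ijk,i,j->k', T, a, b) / ((a @ a) * (b @ b))
    return a, b, c

# Usage: recover a planted noiseless rank-1 tensor.
a0, b0, c0 = np.ones(4), np.arange(1.0, 4.0), np.array([2.0, -1.0])
T = np.einsum('i,j,k->ijk', a0, b0, c0)
a, b, c = als_rank1(T)
print(np.allclose(T, np.einsum('i,j,k->ijk', a, b, c)))  # True
```

In the noiseless rank-one setting a single sweep already recovers the factors up to scaling, consistent with the abstract's claim that only a few iterations are needed when $R = 1$.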

Keywords

tensor

CP decomposition

alternating least squares

statistical bounds

Main Sponsor

Section on Statistical Learning and Data Science