Generalized Tensor Completion for Noisy Data with Non-Random Missingness
Sunday, Aug 3: 3:05 PM - 3:25 PM
Invited Paper Session
Music City Center
Tensor completion plays a crucial role in a wide range of applications, including recommender systems and medical imaging, where observed data are often highly incomplete. While extensive prior work has addressed tensor completion under missing data, most existing methods assume that entries are missing at random. However, real-world data often exhibit missing-not-at-random patterns, in which the missingness depends on the underlying tensor values. This paper introduces a generalized tensor completion framework for noisy data with non-random missingness, in which the missing probability is modeled as a function of the underlying tensor values. Our formulation is flexible and accommodates various tensor data types, including continuous, binary, and count data. For model estimation, we develop a computationally efficient alternating gradient descent algorithm and derive non-asymptotic error bounds for the estimator at each iteration. Additionally, we propose a statistical inference procedure to test whether the missing probabilities depend on the tensor values, offering a formal assessment of the missing-at-random assumption within our modeling framework. The utility and efficacy of our approach are demonstrated through comparative simulation studies and analyses of two real-world datasets.
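To make the setting concrete, the general idea can be sketched in a minimal, hypothetical form: rank-r CP completion of a 3-way tensor by gradient descent on the factor matrices, where each entry's observation probability is assumed to follow a logistic model in the entry's value and the residuals are inverse-probability weighted. This is an illustrative toy, not the paper's estimator; the function name, the specific logistic missingness model `pi(x) = sigmoid(a + b*x)`, and all parameter choices are assumptions for illustration only.

```python
import numpy as np

def complete_tensor(Y, mask, rank=2, a=0.0, b=0.5, lr=0.01, iters=500):
    """Illustrative sketch (not the paper's algorithm): complete a 3-way
    tensor Y (assumed finite everywhere; unobserved entries may hold any
    placeholder values) from observed entries indicated by mask (1 = seen).

    The missing mechanism is assumed logistic in the entry value,
    pi(x) = sigmoid(a + b*x), so larger entries may be more (or less)
    likely to be observed; residuals are weighted by 1/pi to compensate.
    """
    dims = Y.shape
    rng = np.random.default_rng(0)
    # Factor matrices of a rank-`rank` CP decomposition, small random init.
    U, V, W = (rng.normal(scale=0.1, size=(d, rank)) for d in dims)
    for _ in range(iters):
        X = np.einsum('ir,jr,kr->ijk', U, V, W)   # current reconstruction
        pi = 1.0 / (1.0 + np.exp(-(a + b * X)))   # assumed observation prob.
        # Inverse-probability-weighted residual on observed entries
        # (weights treated as fixed within each gradient step).
        R = mask * (X - Y) / pi
        # Gradients of 0.5 * sum(R * (X - Y)) with respect to each factor.
        gU = np.einsum('ijk,jr,kr->ir', R, V, W)
        gV = np.einsum('ijk,ir,kr->jr', R, U, W)
        gW = np.einsum('ijk,ir,jr->kr', R, U, V)
        U -= lr * gU
        V -= lr * gV
        W -= lr * gW
    return np.einsum('ir,jr,kr->ijk', U, V, W)
```

Setting `b=0` recovers a constant observation probability (a missing-completely-at-random baseline), which is one way to see what the paper's proposed test is probing: whether the data are consistent with `b = 0`.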
Keywords: graphical model with covariates; multi-task learning; debiased inference