Low-rank tensor completion: a Riemannian manifold preconditioning approach
H. Kasai and B. Mishra
We propose a novel Riemannian manifold preconditioning approach for the tensor completion problem with rank constraint. A novel Riemannian metric or inner product is proposed that exploits the least-squares structure of the cost function and takes into account the structured symmetry that exists in Tucker decomposition. The specific metric allows us to use the versatile framework of Riemannian optimization on quotient manifolds to develop preconditioned nonlinear conjugate gradient and stochastic gradient descent algorithms for batch and online setups, respectively. Concrete matrix representations of various optimization-related ingredients are listed. Numerical comparisons suggest that our proposed algorithms robustly outperform state-of-the-art algorithms across different synthetic and real-world datasets.
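The cost function in question is the least-squares fit of a Tucker-decomposed tensor to the observed entries. A minimal sketch of that cost (in Python/numpy, not the paper's Matlab code; all names here are illustrative, and the preconditioned Riemannian machinery of the paper is omitted):

```python
import numpy as np

def tucker_reconstruct(G, U1, U2, U3):
    """Multilinear product G x_1 U1 x_2 U2 x_3 U3 (Tucker reconstruction)."""
    return np.einsum('abc,ia,jb,kc->ijk', G, U1, U2, U3)

def completion_cost(G, U1, U2, U3, X, mask):
    """Least-squares cost 0.5 * ||P_Omega(reconstruction - X)||_F^2,
    where mask encodes the sampling set Omega."""
    residual = mask * (tucker_reconstruct(G, U1, U2, U3) - X)
    return 0.5 * np.sum(residual ** 2)

# Tiny example: a rank-(2,2,2) tensor observed on roughly half its entries.
rng = np.random.default_rng(0)
n, r = (5, 6, 7), (2, 2, 2)
G = rng.standard_normal(r)                                   # core tensor
U1, U2, U3 = (rng.standard_normal((n[i], r[i])) for i in range(3))
X = tucker_reconstruct(G, U1, U2, U3)                        # ground truth
mask = rng.random(n) < 0.5                                   # sampling set
print(completion_cost(G, U1, U2, U3, X, mask))               # → 0.0
```

The paper's algorithms minimize exactly this kind of cost over the factors and core, with the proposed metric acting as a preconditioner on the quotient manifold.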
- Status: ICML 2016. A shorter version was earlier accepted to the 8th NIPS Workshop on Optimization for Machine Learning (OPT2015), held in Montreal, Canada on December 11, 2015.
- Paper: [Publisher’s copy] [Supplementary material] [arXiv:1605.08257].
- Old version: [arXiv:1506.02159]. The newer version additionally discusses an online algorithm, and the title was changed subsequently.
- Matlab code: Rprecon_for_tensor_completion_1July2016.zip
- The file low_rank_tensor_completion_KM.m shows both first- and second-order implementations with Manopt, including the Euclidean and Riemannian Hessian computations, which can be involved to derive.
- Poster: Riemannian preconditioning for tensor completion in Low-rank Optimization and Applications workshop, Hausdorff Center for Mathematics, Bonn, 2015.