
Alexander J.M. Howse
Nonlinearly Preconditioned Optimization on Grassmann Manifolds for Tucker Tensor Approximations

University of Waterloo
Department of Applied Mathematics
200 University Ave W
Waterloo
ON N2L 3G1
Canada
ajmhowse@gmail.com
Hans De Sterck

Two accelerated optimization algorithms are presented for computing approximate Tucker tensor decompositions (ATDs). The first is a nonlinearly preconditioned conjugate gradient (NPCG) algorithm, in which a nonlinear preconditioner generates a direction that replaces the gradient in the nonlinear conjugate gradient (NCG) iteration. The second is a nonlinear GMRES (N-GMRES) algorithm, in which a linear combination of iterates generated by a nonlinear preconditioner is minimized to produce an improved search direction. The Euclidean versions of these methods are extended to the manifold setting, where optimization on Grassmann manifolds is used to handle orthonormality constraints and to allow isolated minimizers. The higher order orthogonal iteration (HOOI), the workhorse algorithm for computing ATDs, is used as the nonlinear preconditioner in both NPCG and N-GMRES. Four options are provided for the update parameter $\beta$ in NPCG, and two strategies for approximating the Hessian operator applied to a vector are provided for N-GMRES. NPCG and N-GMRES are compared to HOOI, NCG, limited-memory BFGS, and a manifold trust-region algorithm using synthetic data and real-life tensor data arising from handwritten digit recognition. Numerical results show that all four NPCG variants and N-GMRES with a difference-of-gradients Hessian approximation accelerate HOOI significantly for large tensors, noisy data, and problems requiring high accuracy. For these problems, the proposed methods converge faster and more robustly than HOOI and the state-of-the-art methods considered.
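To make the role of the nonlinear preconditioner concrete, the following is a minimal Python/NumPy sketch of the standard HOOI iteration for a Tucker approximation of a dense tensor. The function names (hooi, unfold, mode_multiply), the HOSVD-style initialization, and the fixed iteration count are illustrative assumptions, not details taken from the paper, which applies HOOI as a preconditioner within NPCG and N-GMRES rather than as a stand-alone solver.

    import numpy as np

    def unfold(T, mode):
        # Mode-n unfolding: move axis `mode` to the front and flatten the rest.
        return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

    def mode_multiply(T, M, mode):
        # Mode-n product of tensor T with matrix M.
        out = M @ unfold(T, mode)
        other_dims = [T.shape[i] for i in range(T.ndim) if i != mode]
        return np.moveaxis(out.reshape([M.shape[0]] + other_dims), 0, mode)

    def hooi(X, ranks, n_iter=50):
        # Sketch of the higher order orthogonal iteration for a Tucker
        # approximation of X with multilinear ranks given by `ranks`.
        N = X.ndim
        # Initialize factor matrices from the leading left singular
        # vectors of each mode unfolding (HOSVD initialization).
        U = []
        for n in range(N):
            u, _, _ = np.linalg.svd(unfold(X, n), full_matrices=False)
            U.append(u[:, :ranks[n]])
        for _ in range(n_iter):
            for n in range(N):
                # Project X onto the current subspaces of all modes except n ...
                Y = X
                for m in range(N):
                    if m != n:
                        Y = mode_multiply(Y, U[m].T, m)
                # ... and update U[n] from the leading left singular vectors.
                u, _, _ = np.linalg.svd(unfold(Y, n), full_matrices=False)
                U[n] = u[:, :ranks[n]]
        # Core tensor of the Tucker approximation.
        G = X
        for n in range(N):
            G = mode_multiply(G, U[n].T, n)
        return G, U

In the proposed methods, a sweep of this kind of iteration plays the role of the nonlinear preconditioner: it supplies the direction used in place of the gradient in NPCG, and the iterates whose linear combination is minimized in N-GMRES.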



