A Riemannian BFGS Method without Differentiated Retraction for Nonconvex Optimization Problems


Wen Huang, P.-A. Absil, K. A. Gallivan


In this paper, a Riemannian BFGS method for minimizing a smooth function on a Riemannian manifold is defined, based on a Riemannian generalization of a cautious update and a weak line search condition. It is proven that the Riemannian BFGS method converges (i) globally to stationary points without assuming that the objective function is convex and (ii) superlinearly to a nondegenerate minimizer. Using the weak line search condition makes it possible to completely avoid information from the differentiated retraction. The joint matrix diagonalization problem is chosen to demonstrate the performance of the algorithms with various parameters, line search conditions, and pairs of retraction and vector transport. A preliminary version can be found in [HAG16].
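To illustrate the cautious-update safeguard mentioned above, here is a minimal Euclidean sketch in the style of the Li–Fukushima cautious BFGS: the Hessian approximation is updated only when the measured curvature is sufficiently positive relative to the gradient norm. This is not the paper's Riemannian algorithm (which replaces the step `x + s` with a retraction and transports `s` and the gradients with a vector transport); all names, tolerances, and the backtracking line search below are illustrative assumptions.

```python
import numpy as np

def cautious_bfgs(f, grad, x0, eps=1e-6, alpha=1.0, tol=1e-8, max_iter=200):
    """Euclidean sketch of BFGS with a cautious update.

    The Riemannian method in the paper would use a retraction in place of
    x + s and a vector transport for s and the gradients; this flat-space
    analogue only illustrates the cautious-update rule."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                     # inverse Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                    # quasi-Newton search direction
        # Simple backtracking Armijo line search (a stand-in for the
        # paper's weak line search condition).
        t, c = 1.0, 1e-4
        while f(x + t * p) > f(x) + c * t * (g @ p):
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        # Cautious update: skip the BFGS update unless the curvature
        # y^T s is sufficiently positive relative to ||g||^alpha.
        if y @ s >= eps * np.linalg.norm(g) ** alpha * (s @ s):
            rho = 1.0 / (y @ s)
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function from a standard starting point.
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: np.array(
    [-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
     200 * (x[1] - x[0]**2)])
x_star = cautious_bfgs(rosen, rosen_grad, np.array([-1.2, 1.0]))
```

Skipping the update when curvature is poor keeps the inverse Hessian approximation positive definite, which is what enables the global convergence analysis without a convexity assumption.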

Key words

Riemannian optimization; manifold optimization; quasi-Newton methods; BFGS method; differentiated retraction




BibTeX entry