A Riemannian BFGS Method without Differentiated Retraction for Nonconvex Optimization Problems

Authors

Wen Huang, P.-A. Absil, K. A. Gallivan

Abstract

In this paper, a Riemannian BFGS method for minimizing a smooth function on a Riemannian manifold is defined, based on a Riemannian generalization of a cautious update and a weak line search condition. It is proven that the Riemannian BFGS method converges (i) globally to stationary points without assuming that the objective function is convex and (ii) superlinearly to a nondegenerate minimizer. Using the weak line search condition removes the need for any information from the differentiated retraction. The joint matrix diagonalization problem is used to demonstrate the performance of the algorithms with various parameters, line search conditions, and pairs of retraction and vector transport. A preliminary version can be found in [HAG16].
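To give a concrete picture of the cautious update described above, here is a minimal Python sketch of one step on the unit sphere, with retraction by normalization and vector transport by orthogonal projection. The function names, the threshold rule theta(t) = theta * t, and the operator transport by conjugation with the projection are illustrative assumptions for this toy setting, not the paper's algorithm; the step size alpha is assumed to come from a line search supplied externally.

```python
import numpy as np

def retract(x, eta):
    """Retraction on the unit sphere: normalize x + eta."""
    v = x + eta
    return v / np.linalg.norm(v)

def project(y, xi):
    """Projection vector transport: project xi onto the tangent space at y."""
    return xi - y * (y @ xi)

def cautious_bfgs_step(x, H, f_grad, alpha, theta=1e-4):
    """One cautious Riemannian BFGS step on the sphere (illustrative sketch).

    x      : current point, a unit vector in R^n
    H      : inverse Hessian approximation, an n x n matrix acting on
             tangent vectors represented in ambient coordinates
    f_grad : callable returning the Riemannian gradient at a point
    alpha  : step size, assumed to satisfy a suitable line search condition
    theta  : threshold in the cautious rule (hypothetical default)
    """
    g = f_grad(x)
    eta = -H @ g                         # search direction in the tangent space at x
    x_new = retract(x, alpha * eta)

    # Transport the step and the old gradient to the new tangent space,
    # then form the usual BFGS pair (s_k, y_k).
    s = project(x_new, alpha * eta)
    y = f_grad(x_new) - project(x_new, g)

    # Transport the operator H by conjugation with the projection
    # (a simple, non-isometric choice made here for illustration).
    P = np.eye(len(x)) - np.outer(x_new, x_new)
    H_tilde = P @ H @ P

    # Cautious rule: update only if g(y, s)/g(s, s) >= theta * ||grad f(x)||;
    # otherwise keep the transported operator unchanged.
    ys = y @ s
    if ys / (s @ s) >= theta * np.linalg.norm(g):
        rho = 1.0 / ys
        V = np.eye(len(x)) - rho * np.outer(s, y)
        H_tilde = V @ H_tilde @ V.T + rho * np.outer(s, s)
    return x_new, H_tilde
```

The cautious rule is what allows the global convergence analysis to dispense with convexity: when the curvature pair (s_k, y_k) is uninformative, the update is simply skipped rather than allowed to corrupt the approximation.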

Key words

Riemannian optimization; manifold optimization; quasi-Newton methods; BFGS method; differentiated retraction

Status

SIAM Journal on Optimization, 28:1, pp. 470–495, 2018

Download

Errata

This erratum concerns the publisher's version. The typo is corrected in the technical report available above.
Page 489, equation (50): replace $$\mathcal{H}_{k + 1} = \tilde{\mathcal{H}}_k - \frac{\tilde{\mathcal{H}}_k y_k (\tilde{\mathcal{H}}_k^* y_k)^{\flat}}{(\tilde{\mathcal{H}}_k^* y_k)^{\flat} y_k} + \frac{s_k s_k^{\flat}}{s_k^{\flat} y_k}$$ by $$\mathcal{H}_{k + 1} = \left(\mathrm{id} - \frac{s_k y_k^{\flat}}{g(y_k, s_k)}\right) \tilde{\mathcal{H}}_k \left(\mathrm{id} - \frac{y_k s_k^{\flat}}{g(y_k, s_k)}\right) + \frac{s_k s_k^{\flat}}{g(y_k, s_k)}$$ (i.e., replace the Riemannian DFP update with the Riemannian BFGS update formula)
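As a quick sanity check (not part of the published erratum): since $s_k^{\flat} y_k = g(y_k, s_k)$, the rightmost factor of the corrected formula annihilates $y_k$, so $\mathcal{H}_{k+1}$ satisfies the secant equation, as an inverse Hessian approximation should:
$$\mathcal{H}_{k + 1} y_k = \left(\mathrm{id} - \frac{s_k y_k^{\flat}}{g(y_k, s_k)}\right) \tilde{\mathcal{H}}_k \left(y_k - y_k \frac{s_k^{\flat} y_k}{g(y_k, s_k)}\right) + s_k \frac{s_k^{\flat} y_k}{g(y_k, s_k)} = s_k.$$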

BibTeX entry