Riemannian Proximal Gradient Methods

Authors

Wen Huang, Ke Wei

Abstract

In the Euclidean setting, the proximal gradient method and its accelerated variants are a class of efficient algorithms for optimization problems with a decomposable objective. However, due to the lack of a linear structure on a generic manifold, studies of such methods for similar problems constrained to a manifold remain limited. In this paper we develop and analyze a generalization of the proximal gradient methods, with and without acceleration, for problems on Riemannian manifolds. Global convergence of the Riemannian proximal gradient method is established under mild assumptions. Convergence rates of $O(1/k)$ and $O(1/k^2)$ are also derived for the method and its accelerated variant, respectively, under additional assumptions. To the best of our knowledge, this is the first attempt to establish convergence rates for Riemannian proximal gradient methods on nonsmooth Riemannian optimization problems. Empirical performance comparisons show that the proposed Riemannian proximal gradient methods are competitive with existing ones.
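For context, the Euclidean proximal gradient iteration that the paper generalizes alternates a gradient step on the smooth part $f$ with the proximal operator of the nonsmooth part $g$: $x_{k+1} = \mathrm{prox}_{\alpha g}(x_k - \alpha \nabla f(x_k))$. The sketch below illustrates this classical scheme on a LASSO-type problem ($g = \lambda\|\cdot\|_1$, whose prox is soft-thresholding); it is a minimal Euclidean illustration only, not the Riemannian algorithm of the paper, and all names and the test problem are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (assumed nonsmooth term g).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def proximal_gradient(grad_f, prox_g, x0, step, n_iter=500):
    # Classical (Euclidean) proximal gradient iteration:
    #   x_{k+1} = prox_{step * g}( x_k - step * grad_f(x_k) )
    x = x0
    for _ in range(n_iter):
        x = prox_g(x - step * grad_f(x), step)
    return x

# Illustrative problem: minimize 0.5*||A x - b||^2 + lam*||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])  # sparse ground truth
b = A @ x_true
lam = 0.1

grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda z, t: soft_threshold(z, lam * t)
step = 1.0 / np.linalg.norm(A.T @ A, 2)  # 1/L, L = Lipschitz constant of grad f

x_hat = proximal_gradient(grad_f, prox_g, np.zeros(5), step)
```

On a Riemannian manifold the subtraction $x_k - \alpha \nabla f(x_k)$ and the prox step are no longer defined in this linear form, which is precisely the difficulty the paper addresses via Riemannian gradients and retractions.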

Key words

Riemannian proximal gradient; Riemannian optimization; Stiefel manifold; oblique manifold; sparse PCA

Status

Published in Mathematical Programming.
