Proximal gradient method for nonconvex and nonsmooth optimization on Hadamard manifolds

Authors

Shuailing Feng, Wen Huang, Lele Song, Shihui Ying, Tieyong Zeng

Abstract

In this paper, we address the problem of minimizing nonconvex and nonsmooth functions on Hadamard manifolds and develop an improved proximal gradient method. First, by exploiting the geometric structure of manifolds with non-positive curvature, we propose a monotone proximal gradient algorithm with a fixed step size on Hadamard manifolds. We then establish a convergence theorem for the proposed method under a suitable definition of the proximal gradient mapping on manifolds. If the function further satisfies the Riemannian Kurdyka-Łojasiewicz (KL) property with an exponent, a local convergence rate is given. Finally, numerical experiments on a particular Hadamard manifold, the manifold of symmetric positive definite matrices, show the advantages of the proposed method.
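To give a flavor of the kind of iteration the abstract describes, below is a minimal, hypothetical sketch of one proximal gradient step on the manifold of symmetric positive definite (SPD) matrices with the affine-invariant metric. This is not the paper's exact algorithm: the paper defines the proximal gradient mapping intrinsically on the manifold, whereas this sketch uses a common tangent-space simplification (gradient step on the smooth part f, prox of the nonsmooth part g applied to the tangent vector, then the exponential map). All function names and the fixed step size are illustrative assumptions.

```python
import numpy as np

def _sym_fun(X, f):
    """Apply a scalar function to a symmetric matrix via its eigendecomposition."""
    w, Q = np.linalg.eigh(X)
    return Q @ np.diag(f(w)) @ Q.T

def spd_exp(X, V):
    """Exponential map at X on the SPD manifold (affine-invariant metric):
    Exp_X(V) = X^{1/2} expm(X^{-1/2} V X^{-1/2}) X^{1/2}."""
    Xh = _sym_fun(X, np.sqrt)                      # X^{1/2}
    Xih = _sym_fun(X, lambda w: 1.0 / np.sqrt(w))  # X^{-1/2}
    return Xh @ _sym_fun(Xih @ V @ Xih, np.exp) @ Xh

def prox_grad_step(X, egrad_f, prox_g, t):
    """One tangent-space proximal gradient step (illustrative, not the
    paper's intrinsic proximal mapping): Riemannian gradient step on f,
    prox of g on the tangent vector, then map back with Exp."""
    G = egrad_f(X)
    rgrad = X @ (0.5 * (G + G.T)) @ X   # Euclidean-to-Riemannian gradient on SPD
    V = prox_g(-t * rgrad, t)           # prox applied in the tangent space
    return spd_exp(X, V)

# Toy usage: f(X) = 0.5 * ||X - A||_F^2 (so egrad_f(X) = X - A), g = 0,
# whose prox is the identity; the iterates approach the SPD target A.
A = np.diag([2.0, 3.0])
X = np.eye(2)
for _ in range(100):
    X = prox_grad_step(X, lambda X: X - A, lambda V, t: V, t=0.1)
```

With g = 0 the step reduces to fixed-step Riemannian gradient descent; a nonzero g (e.g. a sparsity-inducing term) would supply its own `prox_g`.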

Key words

Proximal gradient method; Hadamard manifolds; Manifold optimization; Convergence analysis

Status

Optimization Letters, 16, 2277-2297, 2022

Download

BibTeX entry