A Riemannian Accelerated Proximal Gradient Method

Authors

Shuailing Feng, Yuhang Jiang, Wen Huang, Shihui Ying

Abstract

Riemannian accelerated gradient methods have been well studied for smooth optimization, typically treating the geodesically convex and geodesically strongly convex cases separately. However, extending them to nonsmooth composite problems on manifolds while retaining provable acceleration remains unclear. To address this, we propose a unified Riemannian accelerated proximal gradient method for problems of the form $F(x) = f(x) + h(x)$ on manifolds, where $f$ is geodesically convex or geodesically strongly convex and $h$ is $\rho$-retraction-convex. We rigorously establish accelerated convergence rates under appropriate conditions. Additionally, we introduce a safeguard mechanism that guarantees global convergence in nonconvex settings. Numerical experiments demonstrate the theoretical acceleration of the proposed method.
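For context, a retraction-based Riemannian proximal gradient iteration for $F = f + h$ typically takes the following form. This is a generic sketch from the Riemannian proximal gradient literature, not necessarily the exact update analyzed in this paper: $R_{x_k}$ denotes a retraction at the iterate $x_k$, $T_{x_k}\mathcal{M}$ the tangent space, $\operatorname{grad} f$ the Riemannian gradient, and $L$ a Lipschitz-type stepsize constant:

$$
\eta_k = \operatorname*{arg\,min}_{\eta \in T_{x_k}\mathcal{M}} \; \langle \operatorname{grad} f(x_k), \eta \rangle_{x_k} + \frac{L}{2}\,\|\eta\|_{x_k}^2 + h\big(R_{x_k}(\eta)\big), \qquad x_{k+1} = R_{x_k}(\eta_k).
$$

An accelerated variant additionally maintains an extrapolation sequence in the spirit of Nesterov's method, with tangent vectors transported between the tangent spaces of successive iterates.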

Key words

Riemannian optimization; Riemannian proximal gradient; Riemannian acceleration
