
Mathematics Colloquium


Mert Gurbuzbalaban
Rutgers University

Title: Momentum Acceleration Under Random Gradient Noise
Date: Friday, September 27, 2019
Place and Time: Room 102, Love Building, 1:25-2:15 pm
Refreshments: Room 204, Love Building, 3:00 pm

Abstract. For many large-scale optimization and machine learning problems, first-order methods and their momentum-based accelerated variants have been a leading approach for computing low-to-medium accuracy solutions because of their cheap iterations and mild dependence on the problem dimension and data size. Although the momentum-based accelerated gradient (AG) methods proposed by Nesterov for convex optimization converge provably faster than gradient descent (GD) in the absence of noise, the comparison is no longer clear in the presence of gradient noise. In this talk, we focus on GD and AG methods for minimizing strongly convex functions when the gradient has random errors in the form of additive i.i.d. noise. We study the trade-offs between convergence rate and robustness to gradient errors in designing a first-order algorithm. Our results show that AG can achieve acceleration while being more robust to random gradient errors. Our framework also leads to practically optimal algorithms that can perform better than other state-of-the-art methods in the presence of random gradient noise. This is joint work with Serhat Aybat, Alireza Fallah and Asu Ozdaglar.
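
The setting described in the abstract can be illustrated with a short numerical sketch (not the speaker's code or algorithm): GD and Nesterov's AG applied to a strongly convex quadratic whose gradient is corrupted by additive i.i.d. Gaussian noise. The quadratic, the step size 1/L, the constant momentum parameter, and the noise level sigma are illustrative assumptions chosen for the example.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative strongly convex quadratic f(x) = 0.5 * x^T A x (assumed, not from the talk)
d = 50
eigs = np.linspace(1.0, 100.0, d)        # eigenvalues of A: mu = 1, L = 100
A = np.diag(eigs)
mu, L = eigs[0], eigs[-1]
sigma = 0.1                              # std of the additive i.i.d. gradient noise (assumed)

def noisy_grad(x):
    # true gradient A x plus additive Gaussian noise
    return A @ x + sigma * rng.standard_normal(d)

def f(x):
    return 0.5 * x @ (A @ x)

x0 = np.ones(d)
iters = 500

# Gradient descent with step size 1/L
x = x0.copy()
gd_vals = []
for _ in range(iters):
    x = x - (1.0 / L) * noisy_grad(x)
    gd_vals.append(f(x))

# Nesterov AG for strongly convex functions with constant momentum
x, x_prev = x0.copy(), x0.copy()
beta = (np.sqrt(L / mu) - 1.0) / (np.sqrt(L / mu) + 1.0)
ag_vals = []
for _ in range(iters):
    y = x + beta * (x - x_prev)          # momentum (extrapolation) step
    x_prev = x
    x = y - (1.0 / L) * noisy_grad(y)    # gradient step at the extrapolated point
    ag_vals.append(f(x))

print("final suboptimality  GD: %.3e   AG: %.3e" % (gd_vals[-1], ag_vals[-1]))

Running the script and varying sigma gives a rough sense of the trade-off the talk studies: with sigma = 0 the AG iterates converge noticeably faster, while for larger noise levels the two methods' behavior under the accumulated gradient errors can be compared directly.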