TITLE: Nonlinear Acceleration of Stochastic Algorithms.
AUTHORS: Damien Scieur, Alexandre d'Aspremont, Francis Bach
ABSTRACT: Extrapolation methods use the last few iterates of an optimization algorithm to produce a better estimate of the optimum. They were shown to achieve optimal convergence rates in a deterministic setting using simple gradient iterates. Here, we study extrapolation methods in a stochastic setting, where the iterates are produced by either a simple or an accelerated stochastic gradient algorithm. We first derive convergence bounds for arbitrary, potentially biased perturbations, then produce asymptotic bounds using the ratio between the variance of the noise and the accuracy of the current point. Finally, we apply this acceleration technique to stochastic algorithms such as SGD, SAGA, SVRG and Katyusha in different settings, and show significant performance gains.
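To illustrate the kind of extrapolation the abstract refers to, below is a minimal sketch of regularized nonlinear acceleration applied to plain gradient iterates on a quadratic. The function name `rna`, the regularization constant, and the test problem are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def rna(xs, lam=1e-10):
    """Regularized nonlinear acceleration (illustrative sketch).

    xs: list of k+1 iterates x_0, ..., x_k from an optimization algorithm.
    Returns a weighted combination of the iterates intended as a better
    estimate of the optimum.
    """
    X = np.stack(xs, axis=1)              # d x (k+1) matrix of iterates
    R = X[:, 1:] - X[:, :-1]              # d x k matrix of residuals
    RR = R.T @ R
    RR /= np.linalg.norm(RR)              # normalize before regularizing
    k = R.shape[1]
    # Solve the regularized least-squares system for the weights.
    z = np.linalg.solve(RR + lam * np.eye(k), np.ones(k))
    c = z / z.sum()                       # coefficients summing to one
    return X[:, :-1] @ c                  # extrapolated point

# Demo: accelerate gradient descent on a strongly convex quadratic.
rng = np.random.default_rng(0)
d = 20
A = rng.standard_normal((d, d))
H = A.T @ A / d + 0.1 * np.eye(d)         # Hessian of the quadratic
b = rng.standard_normal(d)
x_star = np.linalg.solve(H, b)            # exact optimum

step = 1.0 / np.linalg.eigvalsh(H).max()
x = np.zeros(d)
iterates = [x]
for _ in range(10):
    x = x - step * (H @ x - b)            # plain gradient step
    iterates.append(x)

plain = np.linalg.norm(iterates[-1] - x_star)
accel = np.linalg.norm(rna(iterates) - x_star)
```

On this noiseless problem the extrapolated point is typically far closer to the optimum than the last gradient iterate; the paper's contribution is analyzing how such schemes behave when the iterates are perturbed by stochastic noise, as in SGD or variance-reduced methods.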
STATUS: Preprint.
ARXIV PREPRINT: 1706.07270
PAPER: Nonlinear Acceleration of Stochastic Algorithms (PDF).
