TITLE: Nonlinear Acceleration of Momentum and Primal-Dual Algorithms.
AUTHORS: Raghu Bollapragada, Damien Scieur, Alexandre d'Aspremont
ABSTRACT: We describe a convergence acceleration scheme for multistep optimization algorithms. The extrapolated solution is written as a nonlinear average of the iterates produced by the original optimization algorithm. Our scheme does not need the underlying fixed-point operator to be symmetric, hence handles e.g. algorithms with momentum terms such as Nesterov's accelerated method, or primal-dual methods. The weights are computed via a simple linear system, and we analyze performance in both online and offline modes. We use Crouzeix's conjecture to show that acceleration performance is controlled by the solution of a Chebyshev problem on the numerical range of a nonsymmetric operator modelling the behavior of iterates near the optimum. Numerical experiments are detailed on image processing problems, logistic regression and neural network training for CIFAR10 and ImageNet.
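The abstract states that the extrapolated solution is a nonlinear average of the iterates, with weights obtained from a simple linear system. Below is a minimal, hedged sketch of this kind of regularized nonlinear (Anderson-type) extrapolation; the function name `rna_extrapolate`, the regularization parameter `lam`, and the normalization step are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def rna_extrapolate(xs, lam=1e-10):
    """Extrapolate from iterates x_0..x_k via a weighted (nonlinear) average.

    xs  : list of 1-D arrays, the iterates of the base algorithm
    lam : regularization parameter (an assumed default, for conditioning)
    """
    X = np.stack(xs, axis=1)             # d x (k+1) matrix of iterates
    R = X[:, 1:] - X[:, :-1]             # d x k matrix of step residuals
    RR = R.T @ R
    RR = RR / np.linalg.norm(RR)         # normalize before regularizing
    k = RR.shape[0]
    # Weights solve a small regularized linear system, then are
    # rescaled so they sum to one (the weights may be negative,
    # which is what makes the average "nonlinear").
    z = np.linalg.solve(RR + lam * np.eye(k), np.ones(k))
    c = z / z.sum()
    return X[:, :-1] @ c                 # extrapolated point
```

On a linear fixed-point iteration x_{k+1} = A x_k + b with spectral radius below one, the extrapolated point is typically far closer to the fixed point than the last raw iterate, once enough iterates are collected.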
STATUS: Preprint.
ArXiv PREPRINT: 1810.04539
PAPER: Nonlinear Acceleration of Momentum and Primal-Dual Algorithms (PDF)
