Preprint, working paper. Year: 2017

Nonlinear Acceleration of Stochastic Algorithms


Abstract

Extrapolation methods use the last few iterates of an optimization algorithm to produce a better estimate of the optimum. They were shown to achieve optimal convergence rates in a deterministic setting using simple gradient iterates. Here, we study extrapolation methods in a stochastic setting, where the iterates are produced by either a simple or an accelerated stochastic gradient algorithm. We first derive convergence bounds for arbitrary, potentially biased perturbations, then produce asymptotic bounds using the ratio between the variance of the noise and the accuracy of the current point. Finally, we apply this acceleration technique to stochastic algorithms such as SGD, SAGA, SVRG and Katyusha in different settings, and show significant performance gains.
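For intuition, the following is a minimal sketch of one way such an extrapolation step can be implemented, in the spirit of regularized nonlinear (Anderson-type) acceleration: it combines the last few iterates of an optimizer into a single weighted estimate. The NumPy implementation, the function name `extrapolate`, and the regularization weight `lam` are illustrative assumptions, not the paper's exact procedure.

```python
# Sketch of a regularized nonlinear (Anderson-type) extrapolation step.
# Assumption: plain NumPy and the regularization weight `lam` are chosen
# for illustration only; this is not the authors' exact algorithm.
import numpy as np

def extrapolate(iterates, lam=1e-8):
    """Combine iterates x_0, ..., x_k into a single extrapolated estimate.

    iterates: list of 1-D NumPy arrays produced by an optimization algorithm.
    lam: Tikhonov regularization weight for the coefficient system (assumed).
    """
    X = np.stack(iterates, axis=1)       # d x (k+1) matrix of iterates
    R = X[:, 1:] - X[:, :-1]             # residuals r_i = x_{i+1} - x_i
    k = R.shape[1]
    # Regularized least-squares system for combination weights c,
    # normalized so that the weights sum to one.
    RtR = R.T @ R
    RtR /= np.linalg.norm(RtR, 2)        # scale-free regularization
    z = np.linalg.solve(RtR + lam * np.eye(k), np.ones(k))
    c = z / z.sum()
    return X[:, :k] @ c                  # weighted combination of iterates
```

In a stochastic setting, the regularization weight trades off fitting the (noisy) residuals against the stability of the extrapolated point; the paper's bounds relate this trade-off to the ratio between the noise variance and the accuracy of the current iterate.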
Main file: 1706.07270.pdf (1.98 MB). Origin: files produced by the author(s).

Dates and versions

hal-01618379, version 1 (17-10-2017)

Identifiers

  • HAL Id: hal-01618379, version 1

Cite

Damien Scieur, Alexandre d'Aspremont, Francis Bach. Nonlinear Acceleration of Stochastic Algorithms. 2017. ⟨hal-01618379⟩
261 views
82 downloads
