Journal article in The Annals of Statistics, 2020

Bridging the Gap between Constant Step Size Stochastic Gradient Descent and Markov Chains

Abstract

We consider the minimization of an objective function given access to unbiased estimates of its gradient, through stochastic gradient descent (SGD) with a constant step size. While a detailed analysis was previously performed only for quadratic functions, we provide an explicit asymptotic expansion of the moments of the averaged SGD iterates that outlines the dependence on initial conditions, the effect of the noise and the step size, as well as the lack of convergence in the general (non-quadratic) case. For this analysis, we bring tools from Markov chain theory into the analysis of stochastic gradient descent. We then show that Richardson-Romberg extrapolation can be used to get closer to the global optimum, and we demonstrate empirical improvements of the new extrapolation scheme.
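To illustrate the extrapolation scheme described in the abstract, here is a minimal Python sketch: averaged constant-step-size SGD is run with step sizes gamma and 2*gamma, and the two averages are combined so that the O(gamma) bias terms cancel. All names (averaged_sgd, richardson_romberg, stoch_grad), the toy logistic-regression objective, and the choice of sharing the random seed between the two runs are illustrative assumptions, not the paper's code.

import numpy as np

def averaged_sgd(grad, theta0, gamma, n_iter, rng):
    """Constant step-size SGD with Polyak-Ruppert averaging.

    grad(theta, rng) returns an unbiased estimate of the gradient.
    The average of the iterates converges, in general, to a point at
    distance O(gamma) from the optimum rather than to the optimum itself.
    """
    theta = theta0.copy()
    avg = np.zeros_like(theta0)
    for k in range(n_iter):
        theta = theta - gamma * grad(theta, rng)
        avg += (theta - avg) / (k + 1)  # running mean of the iterates
    return avg

def richardson_romberg(grad, theta0, gamma, n_iter, seed=0):
    """Richardson-Romberg extrapolation (sketch): since the averaged
    iterate behaves like theta* + gamma * Delta + O(gamma^2), the
    combination 2 * avg(gamma) - avg(2 * gamma) cancels the first-order
    bias term in gamma."""
    avg_g = averaged_sgd(grad, theta0, gamma, n_iter,
                         np.random.default_rng(seed))
    avg_2g = averaged_sgd(grad, theta0, 2 * gamma, n_iter,
                          np.random.default_rng(seed))
    return 2 * avg_g - avg_2g

# Hypothetical toy problem: logistic regression with Gaussian design.
d = 5
w_star = np.ones(d)

def stoch_grad(theta, rng):
    x = rng.normal(size=d)
    y = 1.0 if rng.random() < 1.0 / (1.0 + np.exp(-x @ w_star)) else 0.0
    return (1.0 / (1.0 + np.exp(-x @ theta)) - y) * x  # one-sample gradient

theta_rr = richardson_romberg(stoch_grad, np.zeros(d),
                              gamma=0.05, n_iter=200_000)

Reusing the same seed for both runs is one possible coupling choice, intended to correlate the noise of the two trajectories and reduce the variance of the extrapolated estimate; the two runs can equally well use independent samples.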
Main file: main_arxiv.pdf (557.74 KB)
Origin: files produced by the author(s)

Dates and versions

hal-01565514, version 1 (19-07-2017)
hal-01565514, version 2 (10-04-2018)

Identifiers

HAL Id: hal-01565514
DOI: 10.1214/19-AOS1850

Cite

Aymeric Dieuleveut, Alain Durmus, Francis Bach. Bridging the Gap between Constant Step Size Stochastic Gradient Descent and Markov Chains. The Annals of Statistics, 2020, 48 (3), ⟨10.1214/19-AOS1850⟩. ⟨hal-01565514v2⟩
684 views
2283 downloads

