Comparative Analysis of Accelerated Gradient Algorithms for Convex Optimization: High and Super Resolution ODE Approach

Preprint, Working Paper. Year: 2023

Abstract

We investigate convex differentiable optimization and explore the temporal discretization of damped inertial dynamics driven by the gradient of the objective function. This leads to three accelerated gradient algorithms: Nesterov Accelerated Gradient (NAG), Ravine Accelerated Gradient (RAG), and (IGAHD). Attouch, Chbani, Fadili, and Riahi introduced (IGAHD) by discretizing inertial dynamics with Hessian-driven damping to attenuate the oscillations inherent in inertial methods. By analyzing the high-resolution ODEs of order p = 0, 1, 2 for these algorithms, we gain insights into their similarities and differences. All three algorithms share the same low-resolution ODE of order 0, namely the dynamical system proposed by Su, Boyd, and Candès as a continuous-time surrogate for (NAG). To differentiate the Nesterov method from the Ravine method, we refine the comparison and show that they have distinct high-resolution ODEs of order 2 in the step size h (termed super-resolution). The corresponding Taylor expansions in h reveal matching terms of order 1 but differing terms of order 2. To the best of our knowledge, this result is completely new and emphasizes the need to avoid confusion between the Ravine and Nesterov methods in the literature. We present numerical experiments to illustrate our theoretical results. Performance profiles measuring the number of iterations indicate that (IGAHD) outperforms both (NAG) and (RAG). (RAG) exhibits a slight advantage over (NAG) in terms of the average number of iterations. When considering CPU time, both (RAG) and (NAG) outperform (IGAHD). All three algorithms exhibit similar behavior when evaluated in terms of gradient norms.
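For readers less familiar with these objects, here is a minimal sketch, assuming the standard forms from the cited literature (the paper's exact parameterization may differ): the low-resolution dynamic of Su, Boyd, and Candès shared by all three algorithms, the (NAG) iteration it surrogates with step size $s > 0$, and the inertial dynamic with Hessian-driven damping (parameters $\alpha, \beta > 0$) whose discretization yields (IGAHD).

\[
\ddot{x}(t) + \frac{3}{t}\,\dot{x}(t) + \nabla f(x(t)) = 0 \qquad \text{(low-resolution ODE, order 0)}
\]
\[
y_k = x_k + \frac{k-1}{k+2}\,(x_k - x_{k-1}), \qquad x_{k+1} = y_k - s\,\nabla f(y_k) \qquad \text{(NAG)}
\]
\[
\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \beta\,\nabla^2 f(x(t))\,\dot{x}(t) + \nabla f(x(t)) = 0 \qquad \text{(Hessian-driven damping)}
\]

The high-resolution approach retains higher-order terms in the step size h that the order-0 ODE discards, which is what allows the three algorithms to be distinguished.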
Main file: Adly_Attouch_Fadili_Submitted_Version_HAL_CNRS_May-2023.pdf (7.33 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04101919, version 1 (21-05-2023)

Identifiers

  • HAL Id: hal-04101919, version 1

Cite

Samir Adly, Hedy Attouch, Jalal M. Fadili. Comparative Analysis of Accelerated Gradient Algorithms for Convex Optimization: High and Super Resolution ODE Approach. 2023. ⟨hal-04101919v1⟩
148 Views
212 Downloads
