Journal article, AIMS Mathematics, Year: 2023

Sequential stochastic blackbox optimization with zeroth-order gradient estimators

Abstract

This work considers stochastic optimization problems in which the objective function values can only be computed by a blackbox corrupted by random noise following an unknown distribution. The proposed method is based on sequential stochastic optimization (SSO): the original problem is decomposed into a sequence of subproblems, each solved with increasingly fine precision by a zeroth-order version of a sign stochastic gradient descent with momentum algorithm (ZO-signum). This decomposition enables thorough exploration of the search space while preserving the algorithm's efficiency once it approaches the solution. Under a Lipschitz continuity assumption on the blackbox, a convergence rate in mean is derived for the ZO-signum algorithm. Moreover, if the blackbox is smooth and convex, or locally convex around its minima, a convergence rate to an $ \epsilon $-optimal point of the problem is obtained for the SSO algorithm. Numerical experiments compare the SSO algorithm with other state-of-the-art algorithms and demonstrate its competitiveness.
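
To make the approach in the abstract concrete, here is a minimal Python sketch of a ZO-signum-style loop embedded in a sequential scheme. It assumes a standard two-point Gaussian-smoothing gradient estimator; the function names (zo_gradient_estimate, zo_signum, sso), the hyperparameters, and the geometric precision schedule are illustrative assumptions, not the paper's exact specification.

import numpy as np

def zo_gradient_estimate(f, x, mu=1e-3, n_dirs=10, rng=None):
    # Two-point zeroth-order estimator: average finite differences of f
    # along random Gaussian directions; mu is the smoothing radius.
    # (Standard construction; an assumption about the paper's estimator.)
    if rng is None:
        rng = np.random.default_rng()
    g = np.zeros_like(x)
    for _ in range(n_dirs):
        u = rng.standard_normal(x.shape)
        g += (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u
    return g / n_dirs

def zo_signum(f, x0, lr=0.01, beta=0.9, steps=200, mu=1e-3, n_dirs=10, rng=None):
    # ZO-signum-style loop: keep an exponential moving average of the
    # noisy gradient estimates and step in the direction of its sign.
    if rng is None:
        rng = np.random.default_rng(0)
    x = np.asarray(x0, dtype=float).copy()
    m = np.zeros_like(x)
    for _ in range(steps):
        g = zo_gradient_estimate(f, x, mu=mu, n_dirs=n_dirs, rng=rng)
        m = beta * m + (1.0 - beta) * g   # momentum on the estimate
        x -= lr * np.sign(m)              # sign step
    return x

def sso(f, x0, n_subproblems=4, steps0=50, lr0=0.1, mu0=1e-1):
    # Sequential scheme sketch: solve a sequence of subproblems with
    # increasingly fine precision (illustrative halving schedule).
    x = np.asarray(x0, dtype=float)
    for k in range(n_subproblems):
        shrink = 0.5 ** k
        x = zo_signum(f, x, lr=lr0 * shrink, mu=mu0 * shrink,
                      steps=steps0 * (k + 1), rng=np.random.default_rng(k))
    return x

if __name__ == "__main__":
    # Noisy quadratic blackbox: only function values are observable.
    f = lambda x: np.sum((x - 1.0) ** 2) + 0.01 * np.random.randn()
    print(sso(f, np.zeros(5)))

The sign step makes each update scale-free, so progress is governed only by the step size and the momentum average of the noisy estimates; shrinking lr and mu from one stage to the next mimics the "increasingly fine precision" of the subproblems described in the abstract.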

Main file
2305.19450v2.pdf (693.62 KB)
Origin: Publication funded by an institution

Dates and versions

hal-04292899, version 1 (09-01-2025)

Identifiers

Cite

Charles Audet, Jean Bigeon, Romain Couderc, Michael Kokkolaras. Sequential stochastic blackbox optimization with zeroth-order gradient estimators. AIMS Mathematics, 2023, 8 (11), pp.25922-25956. ⟨10.3934/math.20231321⟩. ⟨hal-04292899⟩