Complexity of Adagrad and other first-order methods for nonconvex optimization problems with bounds constraints

Research output: Working paper


Abstract

A parametric class of trust-region algorithms for constrained nonconvex optimization is analyzed, where the objective function is never computed. By defining appropriate first-order stationarity criteria, we are able to extend the Adagrad method to the newly considered problem and retrieve the standard complexity rate of the projected gradient method that uses both the gradient and objective function values. Furthermore, we propose an additional iteration-dependent scaling with slightly inferior theoretical guarantees. In both cases, the bounds are essentially sharp, and curvature information can be used to compute the stepsize. Initial experimental results for noisy bound-constrained instances illustrate the benefits of the objective-free approach.
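
Since the paper studies objective-function-free first-order methods, a minimal sketch of an Adagrad-style projected-gradient iteration for box constraints may help fix ideas. This is an illustrative reconstruction, not the paper's algorithm: the function name `adagrad_box` and the parameters `eta` and `eps` are assumptions, and only gradients are evaluated, never the objective value.

```python
import numpy as np

def adagrad_box(grad, x0, lower, upper, eta=1.0, eps=1e-8, max_iter=1000):
    """Adagrad-style projected gradient on the box [lower, upper].

    Illustrative sketch only: the objective value is never computed,
    matching the objective-free setting described in the abstract.
    Parameter names and defaults are assumptions, not the paper's.
    """
    x = np.clip(np.asarray(x0, dtype=float), lower, upper)
    v = np.zeros_like(x)  # running sum of squared gradient entries
    for _ in range(max_iter):
        g = grad(x)
        v += g * g                              # Adagrad accumulator
        step = eta * g / (np.sqrt(v) + eps)     # coordinate-wise scaled step
        x = np.clip(x - step, lower, upper)     # project back onto the bounds
    return x

# Example: a simple nonconvex problem on the box [-2, 2]^2,
# f(x, y) = x^4 - 2x^2 + y^2, supplied through its gradient only.
f_grad = lambda x: np.array([4 * x[0]**3 - 4 * x[0], 2 * x[1]])
print(adagrad_box(f_grad, x0=[1.5, 1.0], lower=-2.0, upper=2.0))
```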
Original language: English
Publisher: Arxiv
Volume: 2406.15793
Publication status: Published - 10 Jul 2024
