Complexity of a Class of First-Order Objective-Function-Free Optimization Algorithms

Serge Gratton, Sadok Jerad, Philippe Toint

Research output: Working paper



A parametric class of trust-region algorithms for unconstrained nonconvex
optimization is considered where the value of the objective function is never
computed. The class contains a deterministic version of the first-order
Adagrad method typically used for the minimization of noisy functions, but also
allows the use of (possibly approximate) second-order information when available.
The rate of convergence of methods in the class is analyzed and is shown to be
identical to that known for first-order optimization methods using both
function and gradient values, recovering existing results for
purely first-order variants and improving the explicit dependence on problem
dimension. This rate is shown to be essentially sharp.
A new class of methods is also presented, for which a slightly worse and
essentially sharp complexity result holds. Limited numerical experiments
show that the new methods' performance may be comparable to that of standard
steepest descent, despite using significantly less information, and that this
performance is relatively insensitive to noise.
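To make the objective-function-free idea concrete, below is a minimal sketch of a deterministic Adagrad-style iteration of the kind the abstract refers to: each step uses only gradient values, scaled per coordinate by the square root of the accumulated squared gradients, and the objective f(x) is never evaluated. The function name, step size `eta`, and regularizer `eps` are illustrative choices, not the paper's exact parametric class.

```python
import numpy as np

def adagrad_offo_minimize(grad, x0, eta=1.0, eps=1e-8, n_iters=500):
    """Deterministic Adagrad-style iteration (illustrative sketch).

    Only gradient values are used; the objective function itself is
    never computed, which is the defining feature of OFFO methods.
    """
    x = np.asarray(x0, dtype=float)
    acc = np.zeros_like(x)          # running sum of squared gradients
    for _ in range(n_iters):
        g = grad(x)
        acc += g * g                # per-coordinate accumulation
        x = x - eta * g / np.sqrt(eps + acc)  # scaled gradient step
    return x

# Minimize f(x) = ||x||^2 using only its gradient 2x; f is never called.
x_star = adagrad_offo_minimize(lambda x: 2.0 * x, np.array([3.0, -2.0]))
```

Note that the effective step size along each coordinate shrinks as gradients accumulate, which is what makes the scheme work without the function-value-based acceptance test of a classical trust-region method.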
Original language: English
Number of pages: 30
Publication status: Published - 6 Jun 2023
