Abstract
A class of algorithms for unconstrained nonconvex optimization is considered
where the value of the objective function is never computed. The class
contains a deterministic version of the first-order Adagrad method typically
used for the minimization of noisy functions, but also allows the use of
second-order information when available.
The rate of convergence of methods in the class is analyzed and is shown to be
identical to that known for first-order optimization methods using both
function and gradient values. The result is essentially sharp and improves on
previously known complexity bounds (in the stochastic context) by Défossez
et al. (2020) and Gratton et al. (2022). A new class of
methods is designed, for which a slightly worse and essentially sharp
complexity result holds.
Limited numerical experiments show that the new methods' performance may be
comparable to that of standard steepest descent, despite using significantly less
information, and that this performance is relatively insensitive to noise.
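For reference, the rate known for first-order methods that use both function and gradient values is O(epsilon^{-2}) iterations to reach a point with gradient norm below epsilon (equivalently, the smallest gradient norm after k iterations decreases like O(1/sqrt(k))). To make the objective-function-free (OFFO) setting concrete, below is a minimal Python sketch of a deterministic Adagrad-type iteration that evaluates gradients only and never the objective value. It is an illustrative assumption rather than the paper's exact algorithm: the step-size constant `alpha`, the regularizer `varsigma`, and the Rosenbrock test gradient are hypothetical choices.

```python
import numpy as np

def adagrad_offo(grad, x0, n_iter=1000, alpha=1.0, varsigma=1e-2):
    """Sketch of a deterministic Adagrad-type OFFO iteration.

    Only gradient values are ever computed; the objective value f(x)
    is never evaluated, so no line search or sufficient-decrease test
    is performed.
    """
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)                # running sum of squared gradient entries
    for _ in range(n_iter):
        g = grad(x)                     # the only information the method uses
        v += g * g                      # per-coordinate accumulation
        x = x - alpha * g / np.sqrt(varsigma + v)   # Adagrad-scaled step
    return x

# Hypothetical usage on the Rosenbrock function (gradient only).
def rosen_grad(x):
    return np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
        200.0 * (x[1] - x[0] ** 2),
    ])

x_final = adagrad_offo(rosen_grad, [-1.2, 1.0], n_iter=5000)
```

The per-coordinate scaling by the accumulated squared gradients is what allows a fixed `alpha` to be used without ever testing decrease in the objective, which is the defining feature of the objective-function-free class discussed above.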
| Original language | English |
|---|---|
| Publisher | Arxiv |
| Volume | 2203.01757 |
| Publication status | Published - 7 Mar 2022 |
Fingerprint
Explore the research topics of "First-Order Objective-Function-Free Optimization Algorithms and Their Complexity". Together they form a unique fingerprint.
- Complexity of Adagrad and other first-order methods for nonconvex optimization problems with bounds constraints
  Toint, P., 10 Jul 2024, Arxiv. Research output: Working paper
- Convergence properties of an Objective-Function-Free Optimization regularization algorithm, including an O(epsilon^{-3/2}) complexity bound
  Gratton, S., Jerad, S. & Toint, P., Feb 2023, In: SIAM Journal on Optimization. 33, 3, p. 1621-1646. Research output: Contribution to journal › Article › Peer-reviewed
- Multilevel Objective-Function-Free Optimization with an Application to Neural Networks Training
  Gratton, S., Kopanicakova, A. & Toint, P., 15 Feb 2023, In: SIAM Journal on Optimization. 33, 4, p. 2772-2800, 29 p. Research output: Contribution to journal › Article › Peer-reviewed
Projects
- Complexity in nonlinear optimization
  Toint, P. (Co-investigator), Gould, N. I. M. (Co-investigator) & Cartis, C. (Co-investigator)
  1/11/08 → …
  Project: Research
- ADALGOPT: ADALGOPT - Advanced algorithms in nonlinear optimization
  Sartenaer, A. (Co-investigator) & Toint, P. (Co-investigator)
  1/01/87 → …
  Project: Research axis
Activities
- Institut National Polytechnique de Toulouse
  Toint, P. (Visiting researcher)
  6 Mar 2022 → 12 Mar 2022
  Activity: Visiting an external organisation › Research/Teaching in an external institution