Multilevel Objective-Function-Free Optimization with an Application to Neural Networks Training

Serge Gratton, Alena Kopanicakova, Philippe Toint

Research output: Contribution to journal › Article › Peer-reviewed


Abstract

A class of multilevel algorithms for unconstrained nonlinear optimization is presented which does not require the evaluation of the objective function. The class contains the momentum-less AdaGrad method as a particular (single-level) instance. The choice of avoiding the evaluation of the objective function is intended to make the algorithms of the class less sensitive to noise, while the multilevel feature aims at reducing their computational cost. The evaluation complexity of these algorithms is analyzed and their behavior in the presence of noise is then illustrated in the context of training deep neural networks for supervised learning applications.
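The abstract notes that the momentum-less AdaGrad method arises as a single-level instance of the proposed class. A minimal sketch of that baseline update, which uses only gradient evaluations and never the objective value, is given below; this is a generic illustration of standard AdaGrad without momentum, not the authors' multilevel algorithm, and all names and parameter values are illustrative:

```python
import numpy as np

def adagrad(grad, x0, lr=0.5, eps=1e-8, n_iters=500):
    """Momentum-less AdaGrad: an objective-function-free method.

    Only gradients are evaluated; the objective value is never computed,
    which is the noise-robustness property highlighted in the abstract.
    """
    x = np.asarray(x0, dtype=float).copy()
    g2_sum = np.zeros_like(x)          # running sum of squared gradients
    for _ in range(n_iters):
        g = grad(x)
        g2_sum += g * g
        # per-coordinate step scaled by the accumulated gradient history
        x -= lr * g / (np.sqrt(g2_sum) + eps)
    return x

# Illustrative use: minimize f(x) = ||x||^2, whose gradient is 2x.
x_star = adagrad(lambda x: 2.0 * x, np.array([1.0, -2.0]))
```

On this simple quadratic the iterates contract geometrically once the accumulated squared gradients stabilize, so `x_star` ends up very close to the minimizer at the origin.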

Original language: English
Pages (from-to): 2772-2800
Number of pages: 29
Journal: SIAM Journal on Optimization
Volume: 33
Issue number: 4
DOIs
Publication status: Published - 15 Feb. 2023
