Multilevel Objective-Function-Free Optimization with an Application to Neural Networks Training

Serge Gratton, Alena Kopaničáková, Philippe Toint

Research output: Contribution to journal › Article › peer-review


Abstract

A class of multilevel algorithms for unconstrained nonlinear optimization is presented that does not require evaluation of the objective function. The class contains the momentum-less AdaGrad method as a particular (single-level) instance. Avoiding objective-function evaluations is intended to make the algorithms of the class less sensitive to noise, while the multilevel feature aims at reducing their computational cost. The evaluation complexity of these algorithms is analyzed, and their behavior in the presence of noise is then illustrated in the context of training deep neural networks for supervised learning applications.
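
The abstract singles out the momentum-less AdaGrad method as the single-level instance of the proposed class. As a point of reference, here is a minimal NumPy sketch of that update rule; the step size `alpha`, the safeguard `eps`, and the toy quadratic are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np

def adagrad_step(x, grad, accum, alpha=0.5, eps=1e-8):
    """One momentum-less AdaGrad update.

    Only gradient information is used; the objective f(x) itself is never
    evaluated, which is the defining "objective-function-free" property.
    """
    accum = accum + grad ** 2                    # running sum of squared gradients
    x_new = x - alpha * grad / np.sqrt(accum + eps)
    return x_new, accum

# Toy usage on f(x) = 0.5 * ||x||^2, whose gradient is x itself.
x = np.array([3.0, -2.0])
accum = np.zeros_like(x)
for _ in range(500):
    x, accum = adagrad_step(x, grad=x, accum=accum)
print(x)  # shrinks toward the minimizer at the origin
```

Note that the loop above never computes f(x); the per-coordinate scaling by the accumulated squared gradients is what makes the step sizes adaptive, and it is this gradient-only structure that the paper extends to a multilevel setting.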

Original language: English
Pages (from-to): 2772-2800
Number of pages: 29
Journal: SIAM Journal on Optimization
Volume: 33
Issue number: 4
DOIs:
Publication status: Published - 15 Feb 2023

Keywords

  • complexity
  • deep learning
  • multilevel methods
  • neural networks
  • nonlinear optimization
  • objective-function-free optimization (OFFO)

