Parametric complexity analysis for a class of first-order Adagrad-like algorithms

Serge Gratton, Sadok Jerad, Philippe Toint

Research output: Working paper


Abstract

A class of algorithms for optimization in the presence of noise is presented
that does not require evaluations of the objective function. This class
generalizes the well-known Adagrad method. The complexity of this class is
then analyzed as a function of its parameters, and it is shown that
some methods of the class enjoy a better asymptotic convergence rate than
previously known. A new class of algorithms with similar characteristics is
then derived. Initial numerical experiments suggest that it may have some
merit in practice.
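
For context, the following is a minimal sketch of the kind of Adagrad-like update the abstract refers to, in which only gradients (never objective values) are used. The offset zeta and exponent mu form an illustrative parametrization and need not match the paper's notation; mu = 1/2 recovers classical (diagonal) Adagrad.

    # Sketch of an Adagrad-like first-order step; parameters are illustrative.
    import numpy as np

    def adagrad_like_step(x, grad, v, alpha=1.0, zeta=1e-8, mu=0.5):
        """One iteration of an Adagrad-like method.

        Only gradient information is used: the objective value is never
        evaluated, matching the objective-function-free setting of the
        abstract. `zeta` and `mu` are assumed names for the offset and
        exponent parameters; mu = 0.5 gives classical Adagrad scaling.
        """
        v = v + grad**2                        # accumulate squared gradients per coordinate
        step = -alpha * grad / (zeta + v)**mu  # gradient scaled by accumulated magnitudes
        return x + step, v

    # Illustrative run on the toy quadratic f(x) = 0.5 * ||x||^2 (gradient = x)
    x = np.array([2.0, -1.5])
    v = np.zeros_like(x)
    for _ in range(200):
        g = x                                  # gradient of the toy objective
        x, v = adagrad_like_step(x, g, v)
    print(x)                                   # iterates approach the minimizer at the origin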
Original language: English
Publisher: arXiv
Volume: 2203.01647
Publication status: Published - Jan 2022
