An adaptive regularization method in Banach spaces

Research output: Contribution to journal › Article › peer-review


Abstract

This paper considers the optimization of nonconvex functionals in smooth infinite-dimensional spaces. It is first proved that functionals in a class containing multivariate polynomials augmented with a sufficiently smooth regularization can be minimized by a simple linesearch-based algorithm. Sufficient smoothness depends on the gradients satisfying a novel two-term generalized Lipschitz condition. A first-order adaptive regularization method applicable to functionals with β-Hölder continuous derivatives is then proposed, which uses the linesearch approach to compute a suitable trial step. It is shown to find an ϵ-approximate first-order point in at most O(ϵ^{-(β+1)/β}) evaluations of the functional and its first p derivatives.
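As a rough illustration of the adaptive-regularization mechanism the abstract refers to, the following is a minimal finite-dimensional sketch of a generic first-order (AR1-type) loop in Python. The step rule s = −g/σ, the parameters σ, η, γ and the acceptance test are standard textbook choices and illustrative assumptions; this is not the paper's Banach-space algorithm, which computes the trial step via a linesearch.

```python
# Minimal sketch of a generic first-order adaptive-regularization (AR1) loop in
# finite dimensions. Parameter names and the test problem are illustrative
# assumptions, not the algorithm analysed in the paper.
import numpy as np

def ar1_minimize(f, grad, x0, eps=1e-6, sigma0=1.0,
                 eta=0.1, gamma_inc=2.0, gamma_dec=0.5, max_iter=1000):
    """Seek x with ||grad(x)|| <= eps using a regularized first-order model."""
    x, sigma = np.asarray(x0, dtype=float), sigma0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= eps:           # eps-approximate first-order point
            return x
        step = -g / sigma                      # minimizer of g^T s + (sigma/2)||s||^2
        model_decrease = np.dot(g, g) / (2 * sigma)
        actual_decrease = f(x) - f(x + step)
        if actual_decrease >= eta * model_decrease:
            x = x + step                       # successful iteration: accept the step
            sigma = max(gamma_dec * sigma, 1e-12)
        else:
            sigma = gamma_inc * sigma          # unsuccessful: increase regularization
    return x

# Usage example: a simple nonconvex function on R^2.
f = lambda x: x[0]**4 - 2 * x[0]**2 + x[1]**2
grad = lambda x: np.array([4 * x[0]**3 - 4 * x[0], 2 * x[1]])
print(ar1_minimize(f, grad, np.array([2.0, 1.0])))
```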

Original language: English
Pages (from-to): 1163-1179
Number of pages: 17
Journal: Optimization Methods and Software
Volume: 38
Issue number: 6
Publication status: Published - 24 Nov 2023

Funding

Work partially supported by the 3IA Artificial and Natural Intelligence Toulouse Institute, French 'Investing for the Future - PIA3' program, under grant agreement ANR-19-PI3A-0004.

Funder: 3IA Artificial and Natural Intelligence Toulouse Institute, French 'Investing for the Future'
Funder number: ANR-19-PI3A-0004

Keywords

• adaptive regularization
• evaluation complexity
• Hölder gradients
• infinite-dimensional problems
• nonlinear optimization
