Abstract
A regularization algorithm using inexact function values and inexact derivatives is proposed and its evaluation complexity analyzed. This algorithm is applicable to unconstrained problems and to problems with inexpensive constraints (that is, constraints whose evaluation and enforcement have negligible cost) under the assumption that the derivative of highest degree is β-Hölder continuous. It features a very flexible adaptive mechanism for determining the inexactness which is allowed, at each iteration, when computing objective function values and derivatives. The complexity analysis covers arbitrary optimality order and arbitrary degree of available approximate derivatives. It extends results of Cartis, Gould, and Toint [SIAM J. Optim., to appear] on the evaluation complexity to the inexact case: if a qth-order minimizer is sought using approximations to the first p derivatives, it is proved that a suitable approximate minimizer within ε is computed by the proposed algorithm in at most O[Formula presented] iterations and at most O[Formula presented] approximate evaluations. An algorithmic variant, although more rigid in practice, can be proved to find such an approximate minimizer in O[Formula presented] evaluations. While the proposed framework remains so far conceptual for high degrees and orders, it is shown to yield simple and computationally realistic inexact methods when specialized to the unconstrained and bound-constrained first- and second-order cases. The deterministic complexity results are finally extended to the stochastic context, yielding adaptive sample-size rules for subsampling methods typical of machine learning.
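The adaptive-accuracy idea in the abstract can be illustrated by a much-simplified, first-order sketch. This is not the paper's algorithm: the function name `ar1_inexact`, the static accuracy request `omega_k = kappa * eps`, the acceptance threshold, and the simulated noisy oracle are all illustrative assumptions. It only shows the general pattern of requesting derivatives to a prescribed accuracy, taking a regularized step, and adapting the regularization parameter on success or failure.

```python
import numpy as np

def inexact_grad(grad, x, omega, rng):
    """Simulated inexact oracle: the exact gradient perturbed by noise
    whose Euclidean norm is at most the requested accuracy omega."""
    g = grad(x)
    noise = rng.standard_normal(g.shape)
    noise *= omega * rng.uniform() / max(np.linalg.norm(noise), 1e-16)
    return g + noise

def ar1_inexact(f, grad, x0, eps=1e-3, sigma0=1.0, kappa=0.5,
                max_iter=500, seed=0):
    """First-order adaptive regularization with inexact gradients (sketch).

    Each iteration requests the gradient with absolute accuracy
    omega_k = kappa * eps, then minimizes the regularized model
    m(s) = f(x) + g^T s + (sigma/2)||s||^2, whose minimizer is s = -g/sigma.
    Stopping when ||g|| <= (1 - kappa) * eps guarantees that the *true*
    gradient norm is at most eps, since the oracle error is below kappa*eps.
    """
    rng = np.random.default_rng(seed)
    x, sigma = np.asarray(x0, dtype=float), sigma0
    for k in range(max_iter):
        g = inexact_grad(grad, x, kappa * eps, rng)
        gnorm = np.linalg.norm(g)
        if gnorm <= (1.0 - kappa) * eps:       # noise-aware stopping test
            return x, k
        s = -g / sigma
        predicted = 0.5 * gnorm**2 / sigma     # model decrease, always > 0
        rho = (f(x) - f(x + s)) / predicted    # actual vs. predicted decrease
        if rho >= 0.1:                         # successful: accept, relax sigma
            x, sigma = x + s, max(0.5 * sigma, 1e-8)
        else:                                  # unsuccessful: tighten sigma
            sigma *= 2.0
    return x, max_iter
```

For example, minimizing the convex quadratic f(x) = ||x||²/2 with `ar1_inexact(lambda z: 0.5 * float(z @ z), lambda z: z, [2.0, -3.0])` drives the iterate to within roughly `eps` of the origin despite the noisy gradients. The paper's actual mechanism chooses the accuracy adaptively per iteration and covers arbitrary model degree p and optimality order q.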
| Original language | English |
|---|---|
| Pages (from-to) | 2881-2915 |
| Number of pages | 35 |
| Journal | SIAM Journal on Optimization |
| Volume | 29 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - 2 Jan 2020 |
Keywords
- evaluation complexity
- regularization methods
- inexact functions and derivatives
- subsampling methods
- machine learning
Related research outputs
-
The Impact of Noise on Evaluation Complexity: The Deterministic Trust-Region Case
Bellavia, S., Gurioli, G., Morini, B. & Toint, P., Feb 2023, In: Journal of Optimization Theory and Applications. 196, 2, p. 700-729, 30 p. Research output: Contribution to journal › Article › peer-review
Open Access
-
Evaluation complexity of algorithms for nonconvex optimization
Cartis, C., Gould, N. I. M. & Toint, P., Jul 2022, SIAM. 600 p. (SIAM-MOS Series on Optimization). Research output: Book/Report/Journal › Book
-
An algorithm for the minimization of nonsmooth nonconvex functions using inexact evaluations and its worst-case complexity
Gratton, S., Simon, E. & Toint, P., 21 Jan 2020, In: Mathematical Programming. 187, 1-2, p. 1-24, 19 p. Research output: Contribution to journal › Article › peer-review
Open Access
Projects
-
Complexity in nonlinear optimization
Toint, P. (CoI), Gould, N. I. M. (CoI) & Cartis, C. (CoI)
1/11/08 → …
Project: Research
-
ADALGOPT: ADALGOPT - Advanced algorithms in nonlinear optimization
Sartenaer, A. (CoI) & Toint, P. (CoI)
1/01/87 → …
Project: Research Axis
Activities
-
Recent results in worst-case evaluation complexity for smooth and non-smooth, exact and inexact, nonconvex optimization
Toint, P. (Speaker)
3 Jun 2021. Activity: Talk or presentation types › Invited talk
-
Recent results in worst-case evaluation complexity for smooth and non-smooth, exact and inexact, nonconvex optimization
Toint, P. (Speaker)
8 May 2020. Activity: Talk or presentation types › Invited talk
-
5th Conference on Numerical Analysis and Optimization
Toint, P. (Contributor)
6 Jan 2020 → 9 Jan 2020. Activity: Participating in or organising an event types › Participation in workshop, seminar, course