Yet another fast variant of Newton's method for nonconvex optimization

Serge Gratton, Sadok Jerad, Philippe TOINT

Research output: Working paper


Abstract

A second-order algorithm is proposed for minimizing smooth nonconvex
functions that alternates between regularized Newton and negative
curvature steps. In most cases, the Hessian matrix is regularized with
the square root of the current gradient norm and an additional term taking
moderate negative curvature into account, a negative curvature step
being taken only exceptionally. As a consequence, the proposed method
only requires the solution of a single linear system at nearly all
iterations. We establish that at most O(|log(epsilon)| epsilon^{-3/2})
evaluations of the problem's objective function and derivatives are
needed for this algorithm to obtain an epsilon-approximate first-order
minimizer, and at most O(|log(epsilon)| epsilon^{-3}) evaluations to obtain a
second-order one. Initial numerical experiments with two variants of
the new method are finally presented.
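
The step alternation described above can be made concrete with a short numerical sketch. The Python code below is an illustrative rendition, not the authors' algorithm: the name regularized_newton_step, the negative-curvature threshold, the step scaling and the toy objective are assumptions chosen for readability, and the adaptive parameters, safeguards and stopping tests analysed in the paper are omitted.

```python
import numpy as np

def regularized_newton_step(grad, hess, curvature_tol=1e-8):
    """One step of a gradient-regularized Newton scheme (illustrative sketch).

    The Hessian is shifted by the square root of the gradient norm plus a
    term compensating moderate negative curvature, so only one linear
    system is solved; a pure negative-curvature direction is returned only
    when the leftmost eigenvalue is strongly negative.  Thresholds and
    scalings here are hypothetical, not those of the paper.
    """
    g_norm = np.linalg.norm(grad)
    lam_min = np.linalg.eigvalsh(hess)[0]          # leftmost Hessian eigenvalue

    # Exceptional case: strong negative curvature (hypothetical threshold),
    # follow an eigenvector associated with the leftmost eigenvalue.
    if lam_min < -np.sqrt(g_norm) and lam_min < -curvature_tol:
        _, vecs = np.linalg.eigh(hess)
        d = vecs[:, 0]
        if grad @ d > 0:                           # orient the direction downhill
            d = -d
        return -lam_min * d                        # scale by the curvature magnitude

    # Usual case: regularize with sqrt(||g||) plus the moderate negative
    # curvature, then solve a single linear system.
    sigma = np.sqrt(g_norm) + max(0.0, -lam_min)
    H_reg = hess + sigma * np.eye(hess.shape[0])
    return np.linalg.solve(H_reg, -grad)


# Tiny usage example on a smooth nonconvex toy objective.
def f(x):   return x[0] ** 4 - x[0] ** 2 + x[1] ** 2
def df(x):  return np.array([4 * x[0] ** 3 - 2 * x[0], 2 * x[1]])
def d2f(x): return np.array([[12 * x[0] ** 2 - 2, 0.0], [0.0, 2.0]])

x = np.array([0.1, 1.0])
for _ in range(30):
    x = x + regularized_newton_step(df(x), d2f(x))
print(x, f(x))
```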
Original language: English
Publisher: Arxiv
Number of pages: 26
Volume: 2203.10065
Publication status: Published - 21 Feb. 2023
