Yet another fast variant of Newton's method for nonconvex optimization

Serge Gratton, Sadok Jerad, Philippe Toint

Research output: Working paper


Abstract

A second-order algorithm is proposed for minimizing smooth nonconvex
functions that alternates between regularized Newton and negative
curvature steps. In most cases, the Hessian matrix is regularized with
the square root of the current gradient norm and an additional term
taking moderate negative curvature into account, a negative curvature
step being taken only exceptionally. As a consequence, the proposed
method only requires the solution of a single linear system at nearly
all iterations. We establish that at most $O(|\log(\epsilon)|\,\epsilon^{-3/2})$
evaluations of the problem's objective function and derivatives are
needed for this algorithm to obtain an $\epsilon$-approximate first-order
minimizer, and at most $O(|\log(\epsilon)|\,\epsilon^{-3})$ to obtain a
second-order one. Initial numerical experiments with two variants of
the new method are presented.
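
The abstract describes the iteration only in words; the following minimal Python sketch is an illustration of the kind of step it outlines, not the authors' implementation. It shifts the Hessian by the square root of the gradient norm plus a term absorbing moderate negative curvature, and follows a negative curvature direction only when the curvature is strongly negative. The function name regularized_newton_step, the threshold theta, and the step scaling are all illustrative assumptions.

import numpy as np

def regularized_newton_step(grad, hess, theta=1.0):
    """One trial step for minimizing a smooth nonconvex function.

    grad : gradient at the current iterate (1-D array)
    hess : Hessian at the current iterate (2-D symmetric array)
    theta: assumed threshold separating 'moderate' from 'strong'
           negative curvature (illustrative, not from the paper)
    """
    lam_min = np.linalg.eigvalsh(hess)[0]          # smallest eigenvalue
    g_norm = np.linalg.norm(grad)

    if lam_min >= -theta * np.sqrt(g_norm):
        # Usual case: regularize with the square root of the gradient
        # norm plus a term absorbing moderate negative curvature, then
        # solve the single linear system needed at such iterations.
        sigma = np.sqrt(g_norm) + max(0.0, -lam_min)
        return np.linalg.solve(hess + sigma * np.eye(len(grad)), -grad)

    # Exceptional case: strong negative curvature.  Follow a scaled
    # eigenvector associated with the most negative eigenvalue,
    # oriented so that it is a descent direction.
    _, vecs = np.linalg.eigh(hess)
    d = vecs[:, 0]
    if grad @ d > 0:
        d = -d
    return abs(lam_min) * d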
Original language: English
Publisher: arXiv
Number of pages: 26
Volume: 2203.10065
Publication status: Published - 21 Feb 2023

Keywords

  • Newton's method
  • nonconvex optimization
  • negative curvature
  • adaptive regularization methods
  • evaluation complexity

