An example of slow convergence for Newton's method on a function with globally Lipschitz continuous Hessian

Coralia Cartis, N. I. M. Gould, Ph Toint

Research output: Working paper



An example is presented in which Newton's method for unconstrained minimization is applied to find an $\epsilon$-approximate first-order critical point of a smooth function and takes a multiple of $\epsilon^{-2}$ iterations and function evaluations to terminate, which is as many as the steepest-descent method requires in the worst case. The novel feature of the proposed example is that the objective function has a globally Lipschitz-continuous Hessian, whereas a previous example published by the same authors only ensured this critical property along the path of iterates, which is impossible to verify \emph{a priori}.
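The setting of the abstract can be made concrete with a minimal sketch of pure Newton's method for one-dimensional unconstrained minimization, terminating at an $\epsilon$-approximate first-order critical point ($|f'(x)| \le \epsilon$). The function used below is an illustrative smooth function with a globally Lipschitz-continuous Hessian, not the paper's worst-case example; the function and helper names are assumptions for this sketch.

```python
import math

def newton_min_1d(grad, hess, x0, eps=1e-8, max_iter=100):
    """Pure Newton iteration for 1-D unconstrained minimization.

    Stops at an eps-approximate first-order critical point,
    i.e. as soon as |grad(x)| <= eps.
    """
    x = x0
    for k in range(max_iter):
        g = grad(x)
        if abs(g) <= eps:
            return x, k          # eps-approximate critical point found
        x = x - g / hess(x)      # Newton step (assumes hess(x) != 0)
    return x, max_iter

# Illustrative smooth objective (NOT the paper's example):
# f(x) = sqrt(1 + x^2), whose Hessian (1 + x^2)^(-3/2) is
# globally Lipschitz continuous on the whole real line.
grad = lambda x: x / math.sqrt(1 + x * x)
hess = lambda x: (1 + x * x) ** -1.5

x_star, iters = newton_min_1d(grad, hess, x0=0.5, eps=1e-8)
```

For this well-behaved starting point Newton's method terminates in a handful of iterations; the paper's contribution is a carefully constructed objective on which the iteration count instead grows like $\epsilon^{-2}$.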
Original language: English
Publisher: Namur center for complex systems
Number of pages: 9
Publication status: Published - 5 May 2013



  • Complexity theory, nonlinear optimization, Newton method
