On the complexity of steepest descent, Newton's and regularized Newton's methods for nonconvex unconstrained optimization problems

C. Cartis, Nick Gould, Philippe Toint

Research output: Contribution to journal › Article


Abstract

It is shown that, under standard assumptions, the steepest-descent and Newton's methods for unconstrained nonconvex optimization may both require a number of iterations and function evaluations arbitrarily close to O(ε^{-2}) to drive the norm of the gradient below ε. This shows that the known upper bound of O(ε^{-2}) evaluations for steepest descent is tight, and that Newton's method may be as slow as the steepest-descent method in the worst case. The improved evaluation-complexity bound of O(ε^{-3/2}) evaluations known for cubically regularized Newton's methods is also shown to be tight. © 2010 Society for Industrial and Applied Mathematics.
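As a toy illustration (not taken from the paper), the ε-termination rule analysed in the abstract can be sketched as a fixed-step steepest-descent loop that stops once the gradient norm falls below ε. The objective, step size, and starting point below are illustrative assumptions, not the paper's worst-case examples.

```python
import math

def steepest_descent(grad, x0, eps, alpha=0.1, max_iter=100_000):
    # Fixed-step steepest descent: stop once ||grad f(x)|| <= eps,
    # mirroring the gradient-norm termination test discussed above.
    # Returns the final iterate and the number of iterations used.
    x = list(x0)
    for k in range(max_iter):
        g = grad(x)
        if math.sqrt(sum(gi * gi for gi in g)) <= eps:
            return x, k
        x = [xi - alpha * gi for xi, gi in zip(x, g)]
    return x, max_iter

# Illustrative convex quadratic f(x) = 0.5 * ||x||^2, so grad f(x) = x.
x_final, iters = steepest_descent(lambda x: x, [1.0, 1.0], eps=1e-3)
```

On this easy convex example the iteration count is modest; the paper's point is that on carefully constructed nonconvex instances the count can grow arbitrarily close to O(ε^{-2}).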
Original language: English
Pages (from-to): 2833-2852
Number of pages: 20
Journal: SIAM Journal on Optimization
Volume: 20
Issue number: 6
Publication status: Published - 1 Jan. 2010

