TY - JOUR
T1 - Adaptive cubic regularisation methods for unconstrained optimization. Part I
T2 - Motivation, convergence and numerical results
AU - Cartis, C.
AU - Gould, N.I.M.
AU - Toint, P.L.
N1 - Copyright 2011 Elsevier B.V., All rights reserved.
PY - 2011/4/1
Y1 - 2011/4/1
N2 - An Adaptive Regularisation algorithm using Cubics (ARC) is proposed for unconstrained optimization, generalizing at the same time an unpublished method due to Griewank (Technical Report NA/12, 1981, DAMTP, University of Cambridge), an algorithm by Nesterov and Polyak (Math Program 108(1):177-205, 2006) and a proposal by Weiser et al. (Optim Methods Softw 22(3):413-431, 2007). At each iteration of our approach, an approximate global minimizer of a local cubic regularisation of the objective function is determined, and this ensures a significant improvement in the objective so long as the Hessian of the objective is locally Lipschitz continuous. The new method uses an adaptive estimation of the local Lipschitz constant and approximations to the global model-minimizer which remain computationally viable even for large-scale problems. We show that the excellent global and local convergence properties obtained by Nesterov and Polyak are retained, and sometimes extended to a wider class of problems, by our ARC approach. Preliminary numerical experiments with small-scale test problems from the CUTEr set show encouraging performance of the ARC algorithm when compared to a basic trust-region implementation.
AB - An Adaptive Regularisation algorithm using Cubics (ARC) is proposed for unconstrained optimization, generalizing at the same time an unpublished method due to Griewank (Technical Report NA/12, 1981, DAMTP, University of Cambridge), an algorithm by Nesterov and Polyak (Math Program 108(1):177-205, 2006) and a proposal by Weiser et al. (Optim Methods Softw 22(3):413-431, 2007). At each iteration of our approach, an approximate global minimizer of a local cubic regularisation of the objective function is determined, and this ensures a significant improvement in the objective so long as the Hessian of the objective is locally Lipschitz continuous. The new method uses an adaptive estimation of the local Lipschitz constant and approximations to the global model-minimizer which remain computationally viable even for large-scale problems. We show that the excellent global and local convergence properties obtained by Nesterov and Polyak are retained, and sometimes extended to a wider class of problems, by our ARC approach. Preliminary numerical experiments with small-scale test problems from the CUTEr set show encouraging performance of the ARC algorithm when compared to a basic trust-region implementation.
UR - http://www.scopus.com/inward/record.url?scp=79952763936&partnerID=8YFLogxK
U2 - 10.1007/s10107-009-0286-5
DO - 10.1007/s10107-009-0286-5
M3 - Article
AN - SCOPUS:79952763936
SN - 0025-5610
VL - 127
SP - 245
EP - 295
JO - Mathematical Programming
JF - Mathematical Programming
IS - 2
ER -
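
The abstract in the record above describes the ARC iteration only in outline: approximately minimize a local cubic regularisation of the objective, then adapt the regularisation weight that stands in for the local Lipschitz estimate. The following Python sketch illustrates that outline under stated assumptions; it is not the paper's algorithm. The parameter values (eta1, eta2, gamma1, gamma2), the quadratic-plus-quartic test function, and the use of scipy's general-purpose minimize to solve the cubic subproblem are all illustrative choices introduced here, not taken from the source.

import numpy as np
from scipy.optimize import minimize

def cubic_model(s, f0, g, B, sigma):
    # Local cubic model m(s) = f0 + g's + 0.5 s'Bs + (sigma/3)||s||^3,
    # where sigma is the adaptive regularisation weight.
    return f0 + g @ s + 0.5 * s @ B @ s + (sigma / 3.0) * np.linalg.norm(s) ** 3

def arc_style_step(f, grad, hess, x, sigma,
                   eta1=0.1, eta2=0.9, gamma1=0.5, gamma2=2.0):
    # One ARC-style iteration (illustrative parameters): minimize the cubic
    # model approximately, then accept or reject the step based on the ratio
    # of actual to predicted decrease and adjust sigma accordingly.
    f0, g, B = f(x), grad(x), hess(x)
    res = minimize(lambda s: cubic_model(s, f0, g, B, sigma), np.zeros_like(x))
    s = res.x
    predicted = f0 - cubic_model(s, f0, g, B, sigma)
    actual = f0 - f(x + s)
    rho = actual / predicted if predicted > 0 else -np.inf
    if rho >= eta1:              # successful: accept the trial point
        x = x + s
        if rho >= eta2:          # very successful: relax the regularisation
            sigma = gamma1 * sigma
    else:                        # unsuccessful: keep x, regularise more strongly
        sigma = gamma2 * sigma
    return x, sigma

# Usage on a simple smooth test function (illustrative, not from the paper).
f = lambda x: x @ x + 0.25 * np.sum(x ** 4)
grad = lambda x: 2 * x + x ** 3
hess = lambda x: np.diag(2 + 3 * x ** 2)
x, sigma = np.array([1.0, -2.0]), 1.0
for _ in range(10):
    x, sigma = arc_style_step(f, grad, hess, x, sigma)
print(x)  # iterates should approach the minimizer at the origin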