Using approximate secant equations in limited memory methods for multilevel unconstrained optimization

Research output: Contribution to journal › Article

30 Downloads (Pure)

Abstract

The properties of multilevel optimization problems defined on a hierarchy of discretization grids can be used to define approximate secant equations, which describe the second-order behavior of the objective function. Following earlier work by Gratton and Toint (2009) we introduce a quasi-Newton method (with a linesearch) and a nonlinear conjugate gradient method that both take advantage of this new second-order information. We then present numerical experiments with these methods and formulate recommendations for their practical use. © Springer Science+Business Media, LLC 2011.
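A secant equation asks the Hessian approximation to map the step s_k = x_{k+1} - x_k onto the gradient difference y_k = g_{k+1} - g_k; limited-memory quasi-Newton methods store a few such pairs and apply the resulting inverse-Hessian approximation implicitly. As a minimal, generic illustration (the standard L-BFGS two-loop recursion, not the multilevel variant studied in this paper), a search direction can be computed from stored secant pairs like this:

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: apply the limited-memory inverse Hessian
    approximation, built from secant pairs (s_k, y_k), to the gradient
    and return the resulting quasi-Newton search direction."""
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # Backward pass: newest pair first.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    # Initial scaling H0 = (s'y / y'y) I, a common heuristic choice.
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= np.dot(s, y) / np.dot(y, y)
    # Forward pass: oldest pair first.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * np.dot(y, q)
        q += (a - b) * s
    return -q  # descent direction for a linesearch
```

With a single stored pair, the update satisfies the secant equation exactly, so applying the recursion to y_k returns -s_k; the approximate secant pairs of the multilevel setting would simply replace the exact (s, y) pairs stored here.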
Original language: English
Pages (from-to): 967-979
Number of pages: 13
Journal: Computational Optimization and Applications
Volume: 51
Issue number: 3
DOIs: 10.1007/s10589-011-9393-3
State: Published - 1 Apr 2012

Fingerprint

Limited Memory Method
Conjugate gradient method
Unconstrained Optimization
Newton-Raphson method
Chord or secant line
Data storage equipment
Quasi-Newton Method
Line Search
Recommendations
Industry
Objective function
Discretization
Experiments
Numerical Experiment
Optimization Problem
Grid

Cite this

@article{9b6209cc259248218f4210fef2b6a227,
title = "Using approximate secant equations in limited memory methods for multilevel unconstrained optimization",
abstract = "The properties of multilevel optimization problems defined on a hierarchy of discretization grids can be used to define approximate secant equations, which describe the second-order behavior of the objective function. Following earlier work by Gratton and Toint (2009) we introduce a quasi-Newton method (with a linesearch) and a nonlinear conjugate gradient method that both take advantage of this new second-order information. We then present numerical experiments with these methods and formulate recommendations for their practical use. {\circledC} Springer Science+Business Media, LLC 2011.",
keywords = "nonlinear conjugate gradient methods, nonlinear optimization, quasi-Newton methods, multilevel problems, limited-memory algorithms",
author = "Serge Gratton and Vincent Malmedy and Philippe Toint",
note = "Copyright 2012 Elsevier B.V., All rights reserved.",
year = "2012",
month = "4",
day = "1",
doi = "10.1007/s10589-011-9393-3",
language = "English",
volume = "51",
pages = "967--979",
journal = "Computational Optimization and Applications",
issn = "0926-6003",
publisher = "Springer Netherlands",
number = "3",

}

Using approximate secant equations in limited memory methods for multilevel unconstrained optimization. / Gratton, Serge; Malmedy, Vincent; Toint, Philippe.

In: Computational Optimization and Applications, Vol. 51, No. 3, 01.04.2012, p. 967-979.


TY - JOUR

T1 - Using approximate secant equations in limited memory methods for multilevel unconstrained optimization

AU - Gratton, Serge

AU - Malmedy, Vincent

AU - Toint, Philippe

N1 - Copyright 2012 Elsevier B.V., All rights reserved.

PY - 2012/4/1

Y1 - 2012/4/1

N2 - The properties of multilevel optimization problems defined on a hierarchy of discretization grids can be used to define approximate secant equations, which describe the second-order behavior of the objective function. Following earlier work by Gratton and Toint (2009) we introduce a quasi-Newton method (with a linesearch) and a nonlinear conjugate gradient method that both take advantage of this new second-order information. We then present numerical experiments with these methods and formulate recommendations for their practical use. © Springer Science+Business Media, LLC 2011.

AB - The properties of multilevel optimization problems defined on a hierarchy of discretization grids can be used to define approximate secant equations, which describe the second-order behavior of the objective function. Following earlier work by Gratton and Toint (2009) we introduce a quasi-Newton method (with a linesearch) and a nonlinear conjugate gradient method that both take advantage of this new second-order information. We then present numerical experiments with these methods and formulate recommendations for their practical use. © Springer Science+Business Media, LLC 2011.

KW - nonlinear conjugate gradient methods

KW - nonlinear optimization

KW - quasi-Newton methods

KW - multilevel problems

KW - limited-memory algorithms

UR - http://www.scopus.com/inward/record.url?scp=84861999218&partnerID=8YFLogxK

U2 - 10.1007/s10589-011-9393-3

DO - 10.1007/s10589-011-9393-3

M3 - Article

VL - 51

SP - 967

EP - 979

JO - Computational Optimization and Applications

JF - Computational Optimization and Applications

SN - 0926-6003

IS - 3

ER -