On iterated-subspace minimization methods for nonlinear optimization: Proceedings on Linear and Nonlinear Conjugate Gradient-Related Methods

Andy Conn, Nick Gould, Annick Sartenaer, Philippe Toint

    Research output: Other contribution


    Abstract

    We consider a class of Iterated-Subspace Minimization (ISM) methods for solving large-scale unconstrained minimization problems. At each major iteration of such a method, a low-dimensional manifold, the iterated subspace, is constructed and an approximate minimizer of the objective function in this manifold is then determined. The iterated subspace is chosen to contain vectors which ensure global convergence of the overall scheme and may also contain vectors which encourage fast asymptotic convergence. We demonstrate that this approach can sometimes be very advantageous and indicate the general performance on a collection of large problems. Moreover, comparisons with a limited-memory approach and LANCELOT are made.
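    The abstract's per-iteration recipe — build a low-dimensional subspace containing a globally convergent direction (e.g. steepest descent) and a direction that encourages fast convergence (e.g. the previous step), then minimize the objective over that subspace — can be sketched as follows. This is a minimal illustration on a convex quadratic test problem, not the paper's algorithm: the subspace choice, the quadratic objective, and the exact 2x2 subspace solve are all simplifying assumptions made here.

    ```python
    # Hypothetical ISM-style sketch on f(x) = 0.5 x'Ax - b'x with A symmetric
    # positive definite. Each major iteration minimizes f exactly over
    # x + span{-gradient, previous step}; for a quadratic this subspace solve
    # reduces to a tiny linear system (V'AV) c = -V'g.

    def matvec(A, x):
        return [sum(a * xj for a, xj in zip(row, x)) for row in A]

    def dot(u, v):
        return sum(ui * vi for ui, vi in zip(u, v))

    def solve_small(G, r):
        # Gaussian elimination with partial pivoting for the 1x1 or 2x2
        # subspace system.
        n = len(G)
        M = [row[:] + [ri] for row, ri in zip(G, r)]
        for i in range(n):
            p = max(range(i, n), key=lambda k: abs(M[k][i]))
            M[i], M[p] = M[p], M[i]
            for k in range(i + 1, n):
                f = M[k][i] / M[i][i]
                M[k] = [mk - f * mi for mk, mi in zip(M[k], M[i])]
        c = [0.0] * n
        for i in range(n - 1, -1, -1):
            c[i] = (M[i][n] - sum(M[i][j] * c[j]
                                  for j in range(i + 1, n))) / M[i][i]
        return c

    def ism_quadratic(A, b, x0, iters=50, tol=1e-10):
        x = list(x0)
        s_prev = None
        for _ in range(iters):
            g = [gi - bi for gi, bi in zip(matvec(A, x), b)]  # gradient Ax - b
            if dot(g, g) ** 0.5 < tol:
                break
            # Iterated subspace: steepest-descent direction, plus the
            # previous step when one exists.
            V = [[-gi for gi in g]]
            if s_prev is not None and dot(s_prev, s_prev) > 0:
                V.append(s_prev)
            # Exact minimizer of f over x + span(V): solve (V'AV) c = -V'g.
            AV = [matvec(A, v) for v in V]
            G = [[dot(vi, avj) for avj in AV] for vi in V]
            r = [-dot(v, g) for v in V]
            c = solve_small(G, r)
            s = [sum(cj * vj[i] for cj, vj in zip(c, V))
                 for i in range(len(x))]
            x = [xi + si for xi, si in zip(x, s)]
            s_prev = s
        return x

    if __name__ == "__main__":
        A = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]
        b = [1.0, 2.0, 3.0]
        print(ism_quadratic(A, b, [0.0, 0.0, 0.0]))
    ```

    On a quadratic, exact minimization over span of the current negative gradient and the previous step reproduces conjugate-gradient behaviour, which is why the paper's comparisons against conjugate gradient-related and limited-memory methods are natural; the interest of ISM lies in richer subspaces and general nonlinear objectives.
    
    
    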
    Original language: English
    Publication status: Published - 1996

