We consider a class of Iterated-Subspace Minimization (ISM) methods for solving large-scale unconstrained minimization problems. At each major iteration of such a method, a low-dimensional manifold, the iterated subspace, is constructed, and an approximate minimizer of the objective function in this manifold is then determined. The iterated subspace is chosen to contain vectors that ensure global convergence of the overall scheme and may also contain vectors that encourage fast asymptotic convergence. We demonstrate that this approach can sometimes be very advantageous and report its general performance on a collection of large problems. Comparisons with a limited-memory approach and with LANCELOT are also made.
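As a rough illustration of the idea (not the paper's own algorithm), the following sketch applies an ISM-style loop to the convex quadratic model problem f(x) = ½xᵀAx − bᵀx. Each major iteration builds a two-dimensional subspace spanned by the steepest-descent direction, included for global convergence, and the previous step, included to encourage fast asymptotic convergence, then minimizes f exactly over that subspace by solving a tiny reduced system. The names `A`, `b`, and `ism` are illustrative assumptions.

```python
import numpy as np

def ism(A, b, x0, tol=1e-10, max_iter=200):
    """Illustrative iterated-subspace minimization for f(x) = 0.5 x'Ax - b'x.

    A sketch under stated assumptions, not the method from the paper:
    the subspace here is span{-gradient, previous step}.
    """
    x = x0.copy()
    prev_step = None
    for _ in range(max_iter):
        g = A @ x - b                      # gradient of f at x
        if np.linalg.norm(g) < tol:
            break
        # Columns of S span the current iterated subspace.
        cols = [-g] if prev_step is None else [-g, prev_step]
        S = np.column_stack(cols)
        # Reduced subproblem: minimize f(x + S c) over c. For a quadratic
        # objective this means solving (S' A S) c = -S' g; lstsq guards
        # against a nearly rank-deficient subspace basis.
        c = np.linalg.lstsq(S.T @ A @ S, -S.T @ g, rcond=None)[0]
        step = S @ c
        x = x + step
        prev_step = step
    return x
```

For a general nonlinear objective, the exact reduced solve above would be replaced by an inner iterative minimization over the subspace; for quadratics, this particular two-vector choice reproduces conjugate-gradient-like behaviour.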
Publication status: Published - 1996