Project Details
Description
It is well known that the norm of the gradient may be unreliable
as a stopping test in unconstrained optimization, and that it often
exhibits oscillations in the course of the optimization. In this project
we have studied the properties of the gradient norm for the
steepest descent method applied to quadratic objective functions.
We have also made some general observations that apply to nonlinear
problems, relating the gradient norm, the objective function value,
and the path generated by the iterates.
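
The behavior studied here can be reproduced with a small numerical experiment. The sketch below (not code from the project itself) runs steepest descent with exact line search on a strictly convex quadratic f(x) = ½ xᵀAx − bᵀx and records the gradient norm at each iterate; the matrix A, vector b, and starting point x0 are illustrative choices, picked so that the gradient norm rises and falls while the function value decreases monotonically.

```python
# Minimal sketch, assuming a quadratic f(x) = 0.5 x^T A x - b^T x with
# symmetric positive definite A. Illustrates how the gradient norm can
# oscillate under steepest descent with exact line search, even though
# the objective value decreases at every iteration.
import numpy as np

def steepest_descent_quadratic(A, b, x0, tol=1e-8, max_iter=50):
    """Steepest descent with exact line search on a convex quadratic.

    Returns the final iterate and the sequence of gradient norms so their
    (possibly non-monotone) behavior can be inspected.
    """
    x = x0.astype(float)
    grad_norms = []
    for _ in range(max_iter):
        g = A @ x - b                     # gradient of the quadratic
        gnorm = np.linalg.norm(g)
        grad_norms.append(gnorm)
        if gnorm <= tol:                  # the gradient-norm stopping test
            break
        alpha = (g @ g) / (g @ (A @ g))   # exact minimizing step length
        x = x - alpha * g
    return x, grad_norms

if __name__ == "__main__":
    # Illustrative ill-conditioned 2x2 example: eigenvalues 1 and 100.
    A = np.diag([1.0, 100.0])
    b = np.zeros(2)
    x0 = np.array([10.0, 0.01])           # chosen so ||grad|| oscillates
    _, norms = steepest_descent_quadratic(A, b, x0)
    for k, n in enumerate(norms):
        print(f"iter {k:2d}  ||grad|| = {n:.3e}")
```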
| Status | Finished |
| --- | --- |
| Effective start/end date | 1/05/95 → 30/04/02 |
Keywords
- nonlinear optimization
- steepest descent method
- unconstrained optimization
- gradient norm