## Abstract

In this paper, two simple examples of a twice continuously differentiable strictly convex function *f* are presented for which Newton’s method with line search converges to a point where the gradient of *f* is not zero. The first example uses a line search based on the Wolfe conditions. For the second example, a strictly convex function *f* is defined, together with a sequence of descent directions along which exact line searches do not converge to the minimizer of *f*. Then *f* is perturbed so that these search directions coincide with the Newton directions for the perturbed function, while the exact line searches are left invariant.
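For context, the algorithm whose failure the paper analyzes is Newton's method combined with a line search enforcing the (weak) Wolfe conditions. The following is a minimal generic sketch of that scheme, not the paper's construction; the bisection-style line search, the tolerances, and the strictly convex test function (a sum of hyperbolic cosines) are illustrative choices.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Find a step t along descent direction d satisfying the weak
    Wolfe conditions (Armijo sufficient decrease + curvature)."""
    lo, hi, t = 0.0, np.inf, 1.0
    fx, slope = f(x), grad(x) @ d          # slope = directional derivative, < 0
    for _ in range(max_iter):
        if f(x + t * d) > fx + c1 * t * slope:
            hi = t                          # Armijo fails: step too long, shrink
            t = 0.5 * (lo + hi)
        elif grad(x + t * d) @ d < c2 * slope:
            lo = t                          # curvature fails: step too short, grow
            t = 2.0 * t if hi == np.inf else 0.5 * (lo + hi)
        else:
            return t                        # both Wolfe conditions hold
    return t

def newton_with_wolfe(f, grad, hess, x0, tol=1e-10, max_iter=100):
    """Newton's method with a Wolfe-condition line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess(x), -g)    # Newton direction
        t = wolfe_line_search(f, grad, x, d)
        x = x + t * d
    return x

# Illustrative strictly convex, C^2 test function: f(x) = sum_i cosh(x_i)
f    = lambda x: np.sum(np.cosh(x))
grad = lambda x: np.sinh(x)
hess = lambda x: np.diag(np.cosh(x))

x_star = newton_with_wolfe(f, grad, hess, [1.0, -2.0])
```

On well-behaved strictly convex functions like this one the iteration converges to the unique minimizer; the paper's point is that convergence to a stationary point is *not* guaranteed in general, even under strict convexity, so the success of this sketch on a benign example does not contradict the counterexamples.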

| Original language | English |
|---|---|
| Pages (from-to) | 23-34 |
| Journal | Mathematical Programming |
| Volume | 158 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - Jun 2016 |

## Keywords

- Convex minimization
- Line search
- Newton’s method
- Wolfe conditions