### Abstract

In this paper two simple examples of a twice continuously differentiable strictly convex function *f* are presented for which Newton’s method with line search converges to a point where the gradient of *f* is not zero. The first example uses a line search based on the Wolfe conditions. For the second example, some strictly convex function *f* is defined as well as a sequence of descent directions for which exact line searches do not converge to the minimizer of *f*. Then *f* is perturbed such that these search directions coincide with the Newton directions for the perturbed function while leaving the exact line search invariant.
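The abstract refers to Newton’s method combined with a line search satisfying the Wolfe conditions. As a hedged illustration only (this is not the paper’s counterexample construction, and all names are hypothetical), the following minimal Python sketch shows damped Newton iterations with a backtracking line search that enforces the sufficient-decrease (first Wolfe/Armijo) condition, applied to a strictly convex quadratic where the method does converge:

```python
import numpy as np

def newton_with_line_search(f, grad, hess, x0, c1=1e-4, tol=1e-10, max_iter=50):
    """Damped Newton's method; the step length is found by backtracking
    until the sufficient-decrease (Armijo) condition holds.
    Note: this simple sketch does not check the curvature (second Wolfe)
    condition."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess(x), -g)   # Newton direction
        t = 1.0
        # Backtrack: halve t until f(x + t d) <= f(x) + c1 * t * g^T d
        while f(x + t * d) > f(x) + c1 * t * g.dot(d):
            t *= 0.5
        x = x + t * d
    return x

# Example: strictly convex quadratic f(x) = (1/2) x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
hess = lambda x: A
x_star = newton_with_line_search(f, grad, hess, np.array([5.0, -3.0]))
```

On a quadratic the full Newton step is accepted and the minimizer is reached in one iteration; the paper’s point is that for more general strictly convex functions this scheme can converge to a non-stationary point.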

| Original language | English |
|---|---|
| Pages (from-to) | 23-34 |
| Journal | Mathematical Programming |
| Volume | 158 |
| Issue number | 1 |
| DOIs | https://doi.org/10.1007/s10107-015-0913-2 |
| Publication status | Published - Jun 2016 |

### Keywords

- Convex minimization
- Line search
- Newton’s method
- Wolfe conditions

### Cite this

Jarre, F., & Toint, P. L. (2016). Simple examples for the failure of Newton’s method with line search for strictly convex minimization. *Mathematical Programming*, *158*(1), 23-34. https://doi.org/10.1007/s10107-015-0913-2

Research output: Contribution to journal › Article

TY - JOUR

T1 - Simple examples for the failure of Newton’s method with line search for strictly convex minimization

AU - Jarre, Florian

AU - Toint, Philippe L.

PY - 2016/6

Y1 - 2016/6

N2 - In this paper two simple examples of a twice continuously differentiable strictly convex function f are presented for which Newton’s method with line search converges to a point where the gradient of f is not zero. The first example uses a line search based on the Wolfe conditions. For the second example, some strictly convex function f is defined as well as a sequence of descent directions for which exact line searches do not converge to the minimizer of f. Then f is perturbed such that these search directions coincide with the Newton directions for the perturbed function while leaving the exact line search invariant.

AB - In this paper two simple examples of a twice continuously differentiable strictly convex function f are presented for which Newton’s method with line search converges to a point where the gradient of f is not zero. The first example uses a line search based on the Wolfe conditions. For the second example, some strictly convex function f is defined as well as a sequence of descent directions for which exact line searches do not converge to the minimizer of f. Then f is perturbed such that these search directions coincide with the Newton directions for the perturbed function while leaving the exact line search invariant.

KW - Convex minimization

KW - Line search

KW - Newton’s method

KW - Wolfe conditions

UR - http://www.scopus.com/inward/record.url?scp=84929629741&partnerID=8YFLogxK

U2 - 10.1007/s10107-015-0913-2

DO - 10.1007/s10107-015-0913-2

M3 - Article

VL - 158

SP - 23

EP - 34

JO - Mathematical Programming

JF - Mathematical Programming

SN - 0025-5610

IS - 1

ER -