Minimizing convex quadratics with variable precision Krylov methods

Activity: Talk or presentation (Invited talk)


Iterative algorithms for the solution of convex quadratic optimization problems that exploit inaccurate matrix-vector products are investigated. Theoretical bounds on the performance of a Conjugate Gradient method are derived, the quantities occurring in these bounds are estimated, and a new practical algorithm is obtained. Numerical experiments suggest that the new method has significant potential, including in the increasingly important context of multi-precision computations.
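The central idea, Conjugate Gradients driven by inexact matrix-vector products, can be illustrated with a minimal sketch. The snippet below is not the talk's algorithm; it simply mimics inexactness by forming each product in reduced (single) precision while the rest of the iteration runs in double precision. The function name `cg_inexact` and all parameter choices are illustrative assumptions.

```python
import numpy as np

def cg_inexact(A, b, matvec_dtype=np.float32, tol=1e-6, maxiter=500):
    """Conjugate Gradients for SPD A where each matrix-vector product
    is formed in reduced precision (a crude stand-in for the variable
    precision products analysed in the talk)."""
    x = np.zeros_like(b)
    r = b.copy()                 # residual b - A x for x = 0
    p = r.copy()
    rs_old = r @ r
    for _ in range(maxiter):
        # Inexact product: cast operands down, multiply, cast back up.
        Ap = (A.astype(matvec_dtype) @ p.astype(matvec_dtype)).astype(b.dtype)
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * np.linalg.norm(b):
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x
```

Note that with single-precision products the attainable residual stalls near single-precision unit roundoff times the problem scale, which is precisely why a principled accuracy-versus-cost analysis of the products is needed.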
Period: 28 Nov 2019
Held at: University of Oxford, United Kingdom
Degree of recognition: Regional