Preconditioning and globalizing conjugate gradients in dual space for quadratically penalized nonlinear-least squares problems

Research output: Contribution to journal › Article


Abstract

When solving nonlinear least-squares problems, it is often useful to regularize the problem using a quadratic term, a practice which is especially common in applications arising in inverse calculations. A solution method derived from a trust-region Gauss-Newton algorithm is analyzed for such applications, where, contrary to the standard algorithm, the least-squares subproblem solved at each iteration of the method is rewritten as a quadratic minimization subject to linear equality constraints. This allows the exploitation of duality properties of the associated linearized problems. This paper considers a recent conjugate-gradient-like method which performs the quadratic minimization in the dual space and produces, in exact arithmetic, the same iterates as those produced by a standard conjugate-gradients method in the primal space. This dual algorithm is computationally interesting whenever the dimension of the dual space is significantly smaller than that of the primal space, yielding gains in terms of both memory usage and computational cost. The relation between this dual space solver and PSAS (Physical-space Statistical Analysis System), another well-known dual space technique used in data assimilation problems, is explained. The use of an effective preconditioning technique is proposed and refined convergence bounds derived, which results in a practical solution method. Finally, stopping rules adequate for a trust-region solver are proposed in the dual space, providing iterates that are equivalent to those obtained with a Steihaug-Toint truncated conjugate-gradient method in the primal space. © 2012 Springer Science+Business Media, LLC.
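The abstract's central observation, that a conjugate-gradient iteration in the (smaller) dual space reproduces the primal iterates, rests at the solution level on the Sherman–Morrison–Woodbury identity linking the primal normal equations to the PSAS-style dual system. Below is a minimal NumPy sketch of that equivalence using direct solves rather than the paper's CG iteration; the matrices `H`, `B`, `R` and their dimensions are illustrative, following common data-assimilation notation, and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 10                    # primal (state) dim n >> dual (observation) dim m
H = rng.standard_normal((m, n))   # linearized observation operator
B = np.eye(n)                     # background (prior) covariance
R = np.eye(m)                     # observation-error covariance
d = rng.standard_normal(m)        # innovation (misfit) vector

# Primal: minimize 0.5*||H x - d||^2_{R^{-1}} + 0.5*||x||^2_{B^{-1}};
# the normal equations form an n-by-n system in state space.
A_primal = H.T @ np.linalg.solve(R, H) + np.linalg.inv(B)
x_primal = np.linalg.solve(A_primal, H.T @ np.linalg.solve(R, d))

# Dual (PSAS form): solve an m-by-m system in observation space,
# then map back to the primal space via x = B H^T lambda.
A_dual = H @ B @ H.T + R
lam = np.linalg.solve(A_dual, d)
x_dual = B @ H.T @ lam

print(np.allclose(x_primal, x_dual))  # prints True
```

In this toy setting the dual system is 10 × 10 versus 200 × 200 for the primal one, which is exactly the regime (observation dimension much smaller than state dimension) in which the dual-space approach yields the memory and cost gains the abstract describes.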

Original language: English
Pages (from-to): 1-25
Number of pages: 25
Journal: Computational Optimization and Applications
Volume: 54
Issue number: 1
DOI: 10.1007/s10589-012-9478-7
Publication status: Published - 2013


Keywords

  • Conjugate-gradient methods
  • Data assimilation
  • Dual-space minimization
  • Globalization
  • Preconditioning
  • Trust-region methods

Cite this

@article{eaa16b18d0424df99ec7c8e128de6bf6,
title = "Preconditioning and globalizing conjugate gradients in dual space for quadratically penalized nonlinear-least squares problems",
abstract = "When solving nonlinear least-squares problems, it is often useful to regularize the problem using a quadratic term, a practice which is especially common in applications arising in inverse calculations. A solution method derived from a trust-region Gauss-Newton algorithm is analyzed for such applications, where, contrary to the standard algorithm, the least-squares subproblem solved at each iteration of the method is rewritten as a quadratic minimization subject to linear equality constraints. This allows the exploitation of duality properties of the associated linearized problems. This paper considers a recent conjugate-gradient-like method which performs the quadratic minimization in the dual space and produces, in exact arithmetic, the same iterates as those produced by a standard conjugate-gradients method in the primal space. This dual algorithm is computationally interesting whenever the dimension of the dual space is significantly smaller than that of the primal space, yielding gains in terms of both memory usage and computational cost. The relation between this dual space solver and PSAS (Physical-space Statistical Analysis System), another well-known dual space technique used in data assimilation problems, is explained. The use of an effective preconditioning technique is proposed and refined convergence bounds derived, which results in a practical solution method. Finally, stopping rules adequate for a trust-region solver are proposed in the dual space, providing iterates that are equivalent to those obtained with a Steihaug-Toint truncated conjugate-gradient method in the primal space. {\copyright} 2012 Springer Science+Business Media, LLC.",
keywords = "Conjugate-gradient methods, Data assimilation, Dual-space minimization, Globalization, Preconditioning, Trust-region methods",
author = "Serge Gratton and Selime G{\"u}rol and Philippe Toint",
note = "Publication code : FP SB092/2010/10 ; SB04977/2010/10",
year = "2013",
doi = "10.1007/s10589-012-9478-7",
language = "English",
volume = "54",
pages = "1--25",
journal = "Computational Optimization and Applications",
issn = "0926-6003",
publisher = "Springer Netherlands",
number = "1",
}
