Nonlinear Least Squares for Inverse Problems

by
Format: Hardcover
Pub. Date: 2009-10-01
Publisher(s): Springer Verlag
List Price: $149.78

Rent Textbook


Digital

Rent Digital Options
  • 30 days online access, 30-day download: $39.24
  • 60 days online access, 60-day download: $52.32
  • 90 days online access, 90-day download: $65.40
  • 120 days online access, 120-day download: $78.48
  • 180 days online access, 180-day download: $85.02
  • 1825 days online access, lifetime download: $130.80
*To support the delivery of the digital material to you, a non-refundable digital delivery fee of $3.99 will be charged on each digital item.

New Textbook

Sold Out

Used Textbook

Sold Out

How Marketplace Works:

  • This item is offered by an independent seller and is not shipped from our warehouse.
  • Item details like edition and cover design may differ from our description; see the seller's comments before ordering.
  • Sellers must confirm and ship within two business days; otherwise, the order will be cancelled and refunded.
  • Marketplace purchases cannot be returned to eCampus.com. Contact the seller directly for inquiries; if no response within two days, contact customer service.
  • Additional shipping costs apply to Marketplace purchases. Review shipping costs at checkout.

Summary

This book provides an introduction to the least squares resolution of nonlinear inverse problems. Its first goal is to develop a geometrical theory for analyzing nonlinear least squares (NLS) problems with respect to their quadratic wellposedness, i.e. both wellposedness and optimizability. Using these results, the applicability of various regularization techniques can be checked.

The second objective of the book is to present the practical issues that frequently arise when solving NLS problems. Application-oriented readers will find a detailed analysis of the reduction to finite dimensions, the computation of derivatives (sensitivity functions versus the adjoint method), the determination of the number of retrievable parameters, the choice of parametrization (multiscale, adaptive) and of the optimization step, and the general organization of the inversion code. Special attention is paid to parasitic local minima, which can stop the optimizer far from the global minimum: multiscale parametrization is shown to be an efficient remedy in many cases, and a new condition is given that checks both wellposedness and the absence of parasitic local minima.

For readers interested in the projection onto non-convex sets, Part II of the book presents the geometric theory of quasi-convex and strictly quasi-convex (s.q.c.) sets. S.q.c. sets can be recognized by their finite curvature and limited deflection, and they possess a neighborhood on which the projection is well behaved. Throughout the book, each chapter starts with an overview of the concepts and results it presents.
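The NLS setting the summary describes can be illustrated with a minimal Gauss-Newton sketch on a toy inverse problem. Everything here is an illustrative assumption, not material from the book: the exponential model, the data, and the function names are invented, and the Jacobian is formed explicitly from sensitivity functions (one of the two derivative approaches the summary mentions), which is only practical for small parameter counts.

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, tol=1e-10, max_iter=50):
    """Minimize 0.5 * ||residual(x)||^2 by the Gauss-Newton iteration."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        J = jacobian(x)
        # Linearized least-squares subproblem: find dx with J dx ~ -r.
        dx, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Toy inverse problem: recover (a, b) in y = a * exp(-b * t)
# from noise-free synthetic observations.
t = np.linspace(0.0, 2.0, 20)
a_true, b_true = 2.0, 1.5
y_obs = a_true * np.exp(-b_true * t)

def residual(x):
    a, b = x
    return a * np.exp(-b * t) - y_obs

def jacobian(x):
    # Columns are the sensitivity functions d(model)/da and d(model)/db.
    a, b = x
    e = np.exp(-b * t)
    return np.column_stack([e, -a * t * e])

x_hat = gauss_newton(residual, jacobian, x0=[1.0, 1.0])
```

For large distributed parameters, forming the Jacobian column by column becomes prohibitive; this is where the adjoint approach discussed in the book pays off, since it yields the gradient of the objective at the cost of one extra (adjoint) solve instead of one solve per parameter. Parasitic local minima are exactly the failure mode where such a descent iteration stalls far from the global minimum.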

Table of Contents

Preface
Nonlinear Least Squares
Nonlinear Inverse Problems: Examples and Difficulties
Example 1: Inversion of Knott-Zoeppritz Equations
An Abstract NLS Inverse Problem
Analysis of NLS Problems
Wellposedness
Optimizability
Output Least Squares Identifiability and Quadratically Wellposed Problems
Regularization
Derivation
Example 2: 1D Elliptic Parameter Estimation Problem
Example 3: 2D Elliptic Nonlinear Source Estimation Problem
Example 4: 2D Elliptic Parameter Estimation Problem
Computing Derivatives
Setting the Scene
The Sensitivity Functions Approach
The Adjoint Approach
Implementation of the Adjoint Approach
Example 1: The Adjoint Knott-Zoeppritz Equations
Examples 3 and 4: Discrete Adjoint Equations
Discretization Step 1: Choice of a Discretized Forward Map
Discretization Step 2: Choice of a Discretized Objective Function
Derivation Step 0: Forward Map and Objective Function
Derivation Step 1: State-Space Decomposition
Derivation Step 2: Lagrangian
Derivation Step 3: Adjoint Equation
Derivation Step 4: Gradient Equation
Examples 3 and 4: Continuous Adjoint Equations
Example 5: Differential Equations, Discretized Versus Discrete Gradient
Implementing the Discretized Gradient
Implementing the Discrete Gradient
Example 6: Discrete Marching Problems
Choosing a Parameterization
Calibration
On the Parameter Side
On the Data Side
Conclusion
How Many Parameters Can be Retrieved from the Data?
Simulation Versus Optimization Parameters
Parameterization by a Closed Form Formula
Decomposition on the Singular Basis
Multiscale Parameterization
Simulation Parameters for a Distributed Parameter
Optimization Parameters at Scale k
Scale-By-Scale Optimization
Examples of Multiscale Bases
Summary for Multiscale Parameterization
Adaptive Parameterization: Refinement Indicators
Definition of Refinement Indicators
Multiscale Refinement Indicators
Application to Image Segmentation
Coarsening Indicators
A Refinement/Coarsening Indicators Algorithm
Implementation of the Inversion
Constraints and Optimization Parameters
Gradient with Respect to Optimization Parameters
Maximum Projected Curvature: A Descent Step for Nonlinear Least Squares
Descent Algorithms
Maximum Projected Curvature (MPC) Step
Convergence Properties for the Theoretical MPC Step
Implementation of the MPC Step
Performance of the MPC Step
Output Least Squares Identifiability and Quadratically Wellposed NLS Problems
The Linear Case
Finite Curvature/Limited Deflection Problems
Identifiability and Stability of the Linearized Problems
A Sufficient Condition for OLS-Identifiability
The Case of Finite Dimensional Parameters
Four Questions to Q-Wellposedness
Case of Finite Dimensional Parameters
Case of Infinite Dimensional Parameters
Answering the Four Questions
Application to Example 2: 1D Parameter Estimation with H1 Observation
Linear Stability
Deflection Estimate
Curvature Estimate
Conclusion: OLS-Identifiability
Application to Example 4: 2D Parameter Estimation, with H1 Observation
Regularization of Nonlinear Least Squares Problems
Levenberg-Marquardt-Tychonov (LMT) Regularization
Linear Problems
Finite Curvature/Limited Deflection (FC/LD) Problems
General Nonlinear Problems
Application to the Nonlinear 2D Source Problem
State-Space Regularization
Dense Observation: Geometric Approach
Incomplete Observation: Soft Analysis
Adapted Regularization for Example 4: 2D Parameter Estimation with H1 Observation
Which Part of a is Constrained by the Data?
How to Control the Unconstrained Part?
The Adapted-Regularized Problem
Infinite Dimensional Linear Stability and Deflection Estimates
Finite Curvature Estimate
OLS-Identifiability for the Adapted Regularized Problem
A Generalization of Convex Sets
Quasi-Convex Sets
Equipping the Set D with Paths
Definition and Main Properties of q.c. Sets
Strictly Quasi-Convex Sets
Definition and Main Properties of s.q.c. Sets
Characterization by the Global Radius of Curvature
Formula for the Global Radius of Curvature
Deflection Conditions for the Strict Quasi-convexity of Sets
The General Case: D ⊂ F
The Case of an Attainable Set D = φ(C)
Bibliography
Index
Table of Contents provided by Ingram. All Rights Reserved.

An electronic version of this book is available through VitalSource.

This book is viewable on PC, Mac, iPhone, iPad, iPod Touch, and most smartphones.

By purchasing, you will be able to view this book online, as well as download it, for the chosen number of days.

Digital License

You are licensing a digital product for a set duration. Durations are set forth in the product description, with "Lifetime" typically meaning five (5) years of online access and permanent download to a supported device. All licenses are non-transferable.


A downloadable version of this book is available through the eCampus Reader or compatible Adobe readers.

Applications are available on iOS, Android, PC, Mac, and Windows Mobile platforms.

Please view the compatibility matrix prior to purchase.