
Applications in Numerical Variational Analysis

  • Asen L. Dontchev
  • R. Tyrrell Rockafellar
Chapter
Part of the Springer Monographs in Mathematics book series (SMM)

Abstract

The classical implicit function theorem finds a wide range of applications in numerical analysis. For instance, it helps in deriving error estimates for approximations to differential equations and is often relied on in establishing the convergence of algorithms. Can the generalizations of the classical theory to which we have devoted so much of this book have comparable applications in the numerical treatment of nonclassical problems for generalized equations and beyond? In this chapter we provide positive answers in several directions.

We begin with a topic at the core of numerical work, the “conditioning” of a problem and how it extends to concepts like metric regularity. We also explain how the conditioning of a feasibility problem, like solving a system of inequalities, can be understood. Next we take up a general iterative scheme for solving generalized equations under metric regularity, obtaining convergence by means of our earlier basic results. As particular cases, we get various modes of convergence of the age-old procedure known as Newton’s method in several guises, and of the much more recently introduced proximal point algorithm. We go a step further with Newton’s method by showing that the mapping which assigns to an instance of a parameter the set of all sequences generated by the method obeys, in a Banach space of sequences, the implicit function theorem paradigm in the same pattern as the solution mapping for the underlying generalized equation. Approximations of quadratic optimization problems in Hilbert spaces are then studied. Finally, we apply our methodology to discrete approximations in optimal control.
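For orientation only (the notation below is illustrative and not quoted from the chapter): for a generalized equation $f(x) + F(x) \ni 0$, with $f$ smooth and $F$ a set-valued mapping, the Newton-type iteration mentioned above solves a partially linearized inclusion at each step, while the proximal point algorithm for a maximal monotone mapping $T$ regularizes the inclusion $0 \in T(x)$:

\[
f(x_k) + Df(x_k)(x_{k+1} - x_k) + F(x_{k+1}) \ni 0, \qquad k = 0, 1, 2, \ldots,
\]

\[
0 \in c_k (x_{k+1} - x_k) + T(x_{k+1}), \qquad c_k > 0.
\]

When $F \equiv \{0\}$ the first scheme reduces to the classical Newton method for the equation $f(x) = 0$; metric regularity of the mapping $f + F$ is the property under which the chapter establishes convergence of such iterations.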

Keywords

Variational Inequality · Implicit Function Theorem · Sequential Quadratic Programming · Proximal Point Algorithm · Lebesgue Point

Copyright information

© Springer Science+Business Media, LLC 2009

Authors and Affiliations

  1. Mathematical Reviews, American Mathematical Society, Ann Arbor, USA
  2. Department of Mathematics, University of Washington, Seattle, USA
