Optimization Methods in Banach Spaces
In this chapter we present a selection of important algorithms for optimization problems with partial differential equations. The development and analysis of these methods are carried out in a Banach space setting. We begin by introducing a general framework for achieving global convergence. Then, several variants of generalized Newton methods are derived and analyzed. In particular, necessary and sufficient conditions for fast local convergence are established. Based on this, the concept of semismooth Newton methods for operator equations is introduced. It is shown how complementarity conditions, variational inequalities, and optimality systems can be reformulated as semismooth operator equations. Applications to constrained optimal control problems are discussed, in particular for elliptic partial differential equations and for flow control problems governed by the incompressible instationary Navier-Stokes equations. As a further important concept, the formulation of optimality systems as generalized equations is addressed. We introduce and analyze the Josephy-Newton method for generalized equations. This provides an elegant basis for the motivation and analysis of sequential quadratic programming (SQP) algorithms. The chapter concludes with a short outline of recent algorithmic advances for state-constrained problems and a brief discussion of several further aspects.
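The reformulation of complementarity conditions as a semismooth equation, followed by a Newton iteration on that equation, can be illustrated in finite dimensions. The sketch below (a minimal illustration, not the chapter's algorithm; all function and variable names are illustrative) solves the linear complementarity problem x >= 0, Ax - f >= 0, x^T(Ax - f) = 0 by applying a semismooth Newton method to the componentwise reformulation min(x, Ax - f) = 0, using an element of the generalized Jacobian in each step:

```python
import numpy as np

def semismooth_newton(A, f, x0, tol=1e-10, maxit=50):
    """Semismooth Newton method for the complementarity problem
    x >= 0, A x - f >= 0, x^T (A x - f) = 0,
    reformulated componentwise as Phi(x) = min(x, A x - f) = 0."""
    x = x0.astype(float).copy()
    n = len(x)
    for k in range(maxit):
        r = A @ x - f
        phi = np.minimum(x, r)          # residual of the semismooth equation
        if np.linalg.norm(phi) < tol:
            return x, k
        # One element of the generalized Jacobian of Phi:
        # identity rows where x_i <= r_i (the min picks x_i),
        # rows of A where r_i < x_i (the min picks (A x - f)_i).
        J = np.where((x <= r)[:, None], np.eye(n), A)
        x = x + np.linalg.solve(J, -phi)
    return x, maxit
```

For a piecewise linear residual such as this one, the iteration typically identifies the correct active set after finitely many steps, which reflects the fast local convergence discussed in the chapter.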
Keywords: Banach space · Optimal control problem · Sequential quadratic programming · Descent direction · Sequential quadratic programming method