The αBB Approach for General Constrained Twice-Differentiable NLPs: Theory
In this chapter, we present the theory of the αBB global optimization approach for general constrained nonlinear optimization problems that are twice differentiable. Section 12.1 presents the formulation and an outline of the basic principles behind the αBB approach. Section 12.2 discusses (i) the underestimation strategies employed for special structure terms such as bilinear, trilinear, fractional, fractional trilinear and univariate concave terms, (ii) the novel convex underestimation functions for general nonconvex terms, as well as (iii) key theoretical results that relate the convex envelope to the maximum separation distance for bilinear terms, and the convex envelope to the maximum separation distance for the novel underestimators of general structure terms. Section 12.3 presents the overall convex underestimating functions, discusses the treatment of the equality constraints, and provides the convex lower bounding formulation. Section 12.4 focuses on novel and rigorous methods for calculating values of the α parameters that ensure convergence to the global minimum. Section 12.5 presents an example that illustrates the performance of different methods for the rigorous calculation of the α parameters. Section 12.6 presents the algorithmic steps of the αBB global optimization approach for general constrained twice-differentiable NLPs. Finally, Section 12.7 discusses the geometrical interpretation of the αBB approach through a highly nonlinear example. The material presented in this chapter is based on the work of Androulakis et al. (1995), Adjiman and Floudas (1996), and Adjiman et al. (1998a).
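To fix ideas before the formal development, the classical αBB underestimator augments a twice-differentiable function f with a separable quadratic term, L(x) = f(x) + Σᵢ αᵢ(xᵢᴸ − xᵢ)(xᵢᵁ − xᵢ), which is convex whenever the αᵢ dominate the negative curvature of f over the box. The one-dimensional Python sketch below is purely illustrative (the function sin x, the interval, and the α value are chosen here for the example, not taken from the chapter); in one dimension, α ≥ max(0, −½ min f″) suffices for convexity.

```python
import math

def alpha_bb_underestimator(f, alpha, xL, xU):
    """Classical aBB underestimator L(x) = f(x) + alpha*(xL - x)*(xU - x).

    The added quadratic is nonpositive on [xL, xU] and vanishes at the
    endpoints, so L underestimates f and matches it at the bounds. L is
    convex if alpha >= max(0, -0.5 * min f'') over the interval.
    """
    return lambda x: f(x) + alpha * (xL - x) * (xU - x)

# Illustrative instance: f(x) = sin(x) on [0, 2*pi].
# Here f''(x) = -sin(x) >= -1, so alpha = 0.5 makes L'' = f'' + 2*alpha >= 0.
f = math.sin
xL, xU = 0.0, 2.0 * math.pi
L = alpha_bb_underestimator(f, 0.5, xL, xU)

# L lies below f everywhere on the box and coincides with f at the endpoints.
xs = [xL + i * (xU - xL) / 100 for i in range(101)]
assert all(L(x) <= f(x) + 1e-12 for x in xs)
assert L(xL) == f(xL) and L(xU) == f(xU)
```

The maximum separation between f and L occurs where the quadratic term is most negative; choosing the smallest valid α keeps this gap, and hence the lower-bound slack in a branch-and-bound tree, as tight as possible.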
Keywords: Hessian Matrix · Interval Arithmetic · Minimum Eigenvalue · Maximum Separation · Convex Envelope