Abstract
In this chapter, we present the theory of the αBB global optimization approach for general constrained nonlinear optimization problems that are twice differentiable. Section 12.1 presents the formulation and an outline of the basic principles behind the αBB approach. Section 12.2 discusses (i) the underestimation strategies employed for special structure terms such as bilinear, trilinear, fractional, fractional trilinear, and univariate concave terms, (ii) the novel convex underestimation functions for general nonconvex terms, as well as (iii) key theoretical results that relate the convex envelope to the maximum separation distance for bilinear terms, and the convex envelope to the maximum separation distance of the novel underestimators for general structure terms. Section 12.3 presents the overall convex underestimating functions, discusses the treatment of the equality constraints, and provides the convex lower bounding formulation. Section 12.4 focuses on novel and rigorous methods for calculating values of the α parameters that ensure convergence to the global minimum. Section 12.5 presents an example that illustrates the performance of different methods for the rigorous calculation of the α parameters. Section 12.6 presents the algorithmic steps of the αBB global optimization approach for general constrained twice-differentiable NLPs. Finally, Section 12.7 discusses the geometrical interpretation of the αBB approach through a highly nonlinear example. The material presented in this chapter is based on the work of Androulakis et al. (1995), Adjiman and Floudas (1996), and Adjiman et al. (1998a).
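The central construction behind the αBB approach is the convex underestimator L(x) = f(x) + Σᵢ αᵢ (xᵢᴸ − xᵢ)(xᵢᵁ − xᵢ), which is convex on the box [xᴸ, xᵁ] whenever each αᵢ dominates half the most negative eigenvalue of the Hessian of f over the box. The following sketch (our own minimal illustration, not code from the chapter; the function names and the sin example are assumptions for demonstration) shows the idea for a simple univariate case:

```python
import numpy as np

def abb_underestimator(f, alpha, xL, xU):
    """Build the αBB convex underestimator
        L(x) = f(x) + sum_i alpha_i * (xL_i - x_i) * (xU_i - x_i).
    The added quadratic term is non-positive on [xL, xU], so L <= f there,
    and it coincides with f at the corner points of the box."""
    xL = np.asarray(xL, dtype=float)
    xU = np.asarray(xU, dtype=float)

    def L(x):
        x = np.asarray(x, dtype=float)
        return f(x) + np.sum(alpha * (xL - x) * (xU - x))

    return L

# Example (hypothetical): f(x) = sin(x) on [0, 2*pi].
# Since f''(x) = -sin(x) >= -1 on the box, choosing
# alpha = 0.5 >= -0.5 * min f'' makes L convex there.
f = lambda x: np.sin(x[0])
L = abb_underestimator(f, alpha=0.5, xL=[0.0], xU=[2 * np.pi])
```

Here L(x) = sin(x) + 0.5·x·(x − 2π) has second derivative 1 − sin(x) ≥ 0, so it is a convex lower bound on f over the interval; minimizing L gives a valid lower bound for the branch-and-bound scheme the chapter develops.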
Copyright information
© 2000 Springer Science+Business Media Dordrecht
About this chapter
Cite this chapter
Floudas, C.A. (2000). The αBB Approach for General Constrained Twice-Differentiable NLPs : Theory. In: Deterministic Global Optimization. Nonconvex Optimization and Its Applications, vol 37. Springer, Boston, MA. https://doi.org/10.1007/978-1-4757-4949-6_12
DOI: https://doi.org/10.1007/978-1-4757-4949-6_12
Publisher Name: Springer, Boston, MA
Print ISBN: 978-1-4419-4820-5
Online ISBN: 978-1-4757-4949-6