Abstract
Quasidifferentiable and codifferentiable optimization algorithms are based on gradient-like, descent, iterative techniques in which gradient information is replaced by the set-valued quasidifferential or codifferential. The steepest-descent direction-finding subproblems are then replaced by quadratic programming subproblems built on a polyhedral approximation of the aforementioned set-valued quantities. Since supergradients (resp. hyperdifferentials) introduce a combinatorial problem into the descent-direction-finding subproblem, which after the polyhedral approximation can be treated either by repeatedly solving a number of similar subproblems or by solving just one of them (the supergradient-like technique), the basic methods used are those of hypodifferential optimization. These techniques are described in the sequel (for more details we refer to [3], [9], [5]). It should be mentioned here that first-order quasidifferential and codifferential optimization schemes treat correctly, and more effectively, vertical branches of laws and boundary conditions in mechanical problems or, equivalently, the nonsmoothness of the respective potentials. If the problem is essentially smooth in a neighbourhood of the solution, i.e. the solution lies far from any point of nondifferentiability, classical methods of nonlinear computational mechanics (e.g. Newton-type methods and their derivatives) can be used to refine the accuracy and to speed up the rate of convergence. Nevertheless, if multiple points of nondifferentiability (cusps) have to be passed along a given loading path, the general methods presented here must be used (see also Chapter 8).
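In the hypodifferential (convex-like) case, the descent-direction-finding subproblem mentioned above reduces to a quadratic program: find the minimum-norm element of the convex hull of a finite (polyhedral) collection of subgradients; the steepest-descent direction is its negative, and a (near-)zero result signals approximate stationarity. The following is a minimal sketch, assuming such a finite collection is already available and using a plain Frank-Wolfe iteration in place of a dedicated QP solver (an illustrative choice, not the algorithm of the text):

```python
import numpy as np

def min_norm_direction(subgradients, iters=500):
    """Approximate the minimum-norm point of conv{v_1, ..., v_k}
    by Frank-Wolfe on the unit simplex and return its negative,
    i.e. an (approximate) steepest-descent direction.
    A (near-)zero return vector signals approximate stationarity."""
    G = np.asarray(subgradients, dtype=float)  # k x n: one subgradient per row
    k = G.shape[0]
    lam = np.full(k, 1.0 / k)                  # start at the barycentre
    for t in range(iters):
        w = G.T @ lam                          # current convex combination
        grad = 2.0 * (G @ w)                   # gradient of ||G^T lam||^2 w.r.t. lam
        i = int(np.argmin(grad))               # best simplex vertex (linear subproblem)
        step = 2.0 / (t + 2.0)                 # standard Frank-Wolfe step size
        lam *= (1.0 - step)
        lam[i] += step
    return -(G.T @ lam)
```

For instance, for f(x) = max(x1, x2) at a point with x1 = x2, the polyhedral approximation of the subdifferential consists of the two gradients (1, 0) and (0, 1); the routine returns approximately (-0.5, -0.5), the minimum-norm-based steepest-descent direction.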
References
Auchmuty G. (1989), Duality algorithms for nonconvex variational principles. Numerical Functional Analysis and Optimization, 10, 211–264.
Bertsekas D.P. (1982), Constrained optimization and Lagrange multiplier methods, Academic Press, New York.
Demyanov V.F. and Vasiliev L.N. (1985), Nondifferentiable Optimization, Optimization Software, New York.
Demyanov V.F. and Rubinov A.M. (1990), Foundations of Nonsmooth Analysis. Quasidifferential Calculus, (in Russian), Nauka, Moscow, 431 p.
Demyanov V.F. and Rubinov A.M. (1995), Introduction to Constructive Nonsmooth Analysis, Peter Lang Verlag, Frankfurt a.M., Bern, New York, 414 p.
Hiriart-Urruty J.-B. (1985), Generalized differentiability, duality and optimization for problems dealing with differences of convex functions. In: Convexity and duality in optimization, Ed. J. Ponstein, 37–50, Lect. Notes in Economics and Mathematical Systems Vol. 256, Springer.
Di Pillo G. and Facchinei F. (1992), Regularity conditions and exact penalty functions in Lipschitz programming problems, In: Nonsmooth optimization methods and applications, Ed. F. Giannessi, 107–120, Gordon and Breach, Amsterdam.
Polyakova L.N. (1981), Necessary conditions for an extremum of quasidifferentiable functions, Vestnik Leningrad Univ. Math. 13, 241–249.
Polyakova L.N. (1986), On minimizing the sum of a convex function and a concave function, Mathematical Programming Study 29, 69–73.
Rockafellar R.T. (1970), Convex Analysis, Princeton University Press, Princeton.
Copyright information
© 1996 Springer Science+Business Media Dordrecht
Cite this chapter
Dem’yanov, V.F., Stavroulakis, G.E., Polyakova, L.N., Panagiotopoulos, P.D. (1996). Nonsmooth Optimization Algorithms. In: Quasidifferentiability and Nonsmooth Modelling in Mechanics, Engineering and Economics. Nonconvex Optimization and Its Applications, vol 10. Springer, Boston, MA. https://doi.org/10.1007/978-1-4615-4113-4_6
Print ISBN: 978-1-4613-6844-1
Online ISBN: 978-1-4615-4113-4