Second-order sequence-based necessary optimality conditions in constrained nonsmooth vector optimization and applications
Several notions of sequential directional derivatives and sequential local approximations are introduced. Under (first-order) Hadamard differentiability assumptions on the data at the point of study, these concepts are used to establish second-order necessary optimality conditions, which depend on given sequences, for local weak solutions of nonsmooth vector optimization problems with constraints. Some applications to minimax programming problems are also derived.
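For orientation, the following are the standard definitions underlying the concepts named in the abstract; the paper's own sequential variants may differ in detail, so this is only a sketch of the classical notions. A map $f$ is Hadamard directionally differentiable at $x$ in direction $d$ if the limit

```latex
f'_H(x; d) \;=\; \lim_{\substack{t \downarrow 0 \\ d' \to d}} \frac{f(x + t d') - f(x)}{t}
```

exists, and the classical second-order tangent set to a set $S$ at $x$ in direction $d$ collects the second-order corrections $w$ attainable along feasible curves,

```latex
T^2(S; x, d) \;=\; \Bigl\{ w \;:\; \exists\, t_n \downarrow 0,\ w_n \to w \text{ with } x + t_n d + \tfrac{1}{2} t_n^2 w_n \in S \Bigr\}.
```

The "sequence-based" conditions of the paper fix particular sequences $(t_n)$ rather than quantifying over all of them, which is what distinguishes sequential tangent sets and sequential directional derivatives from these classical objects.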
Keywords: Nonsmooth vector optimization · Sequence-based necessary optimality condition · Weak solution · Second-order sequential tangent set · Second-order sequential directional derivative
Mathematics Subject Classification: 90C29 · 90C46 · 49K27 · 26B05
The author would like to thank the editor and an anonymous referee for their valuable remarks and suggestions, which have helped him to improve the paper.