Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/85629
Title: Optimality conditions via exact penalty functions
Authors: Meng, Kaiwen
Degree: Ph.D.
Issue Date: 2011
Abstract: The purpose of this thesis is to study optimality conditions for constrained optimization problems in finite-dimensional spaces from the viewpoint of exact penalty functions. The tools we use are drawn mainly from modern variational analysis as popularized by Rockafellar and Wets' classical book. The problem models we focus on are nonlinear programming and mathematical programs with complementarity constraints. We aim to develop a unified framework and to provide a detailed exposition of optimality conditions derived from the exactness of penalty functions. In this connection, we intend to answer the questions of when penalty functions are exact and how optimality conditions for the original constrained problems can be inherited from those of exact penalty functions. We study sufficient conditions for penalty terms to possess local error bounds, which guarantee the exactness of penalty functions. We give characterizations of a stronger version of the local error bound property in terms of strong slopes, subderivatives, and regular subgradients at points outside the referenced set. In particular, we give full characterizations of the local error bound property for the elementary max function of a finite collection of smooth functions. With the aid of these characterizations, we show that the quasinormality constraint qualification implies the existence of a local error bound. We also study sufficient and necessary conditions for the existence of local error bounds by means of various limits defined on the boundary of the referenced set. We study first- and second-order necessary and sufficient conditions for penalty functions to be exact. These conditions are expressed in terms of subderivatives, second-order subderivatives, and parabolic subderivatives, notions that have been used to formulate tight optimality conditions for optimization problems.
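To fix notation, the setting described above can be sketched as follows, using standard definitions from variational analysis; the symbols here are illustrative and not taken verbatim from the thesis.

```latex
% Constrained problem over a closed set C in R^n:
\min_{x \in \mathbb{R}^n} \; f(x) \quad \text{subject to} \quad x \in C.

% Penalty function with parameter rho > 0, where the penalty term
% phi >= 0 vanishes exactly on C (phi(x) = 0 iff x in C):
P_\rho(x) \;:=\; f(x) \;+\; \rho\,\varphi(x).

% Local error bound for phi at a point xbar in C: there exist
% kappa > 0 and a neighborhood U of xbar such that
\operatorname{dist}(x, C) \;\le\; \kappa\,\varphi(x)
\qquad \text{for all } x \in U.
```

When such a local error bound holds and f is locally Lipschitz, a local minimizer of the constrained problem is a local minimizer of the unconstrained function P_ρ for all sufficiently large ρ; this is the sense of "exactness" studied in the thesis.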
In our investigation, the kernels of these derivatives, representing the directions at which the derivatives vanish, play a key role. In particular, we show an interesting auxiliary result which asserts that the polar cone of the subderivative kernel of an extended real-valued function at a local minimum coincides with the positive hull of its regular subgradients at the same point. We show how the Karush-Kuhn-Tucker conditions and second-order necessary conditions in nonlinear programming, and strong and Mordukhovich stationarity conditions in mathematical programs with complementarity constraints, can be derived from the exactness of penalty functions under some additional conditions on the constraint functions. In presenting these additional conditions, it turns out that the kernels of (parabolic) subderivatives of penalty terms are crucial. By virtue of these kernels and a variational description of regular subgradients, we show the necessity and sufficiency of these additional conditions. We also present conditions in terms of the original data by applying (generalized) Taylor expansions to compute these kernels.
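For reference, the Karush-Kuhn-Tucker conditions mentioned above take the following standard form in nonlinear programming; the smooth-data formulation below is the textbook statement, not a quotation from the thesis.

```latex
% Nonlinear program with smooth data f, g_i, h_j : R^n -> R:
\min_{x \in \mathbb{R}^n} \; f(x)
\quad \text{s.t.} \quad g_i(x) \le 0 \;\; (i = 1,\dots,m), \qquad
h_j(x) = 0 \;\; (j = 1,\dots,p).

% KKT conditions at a candidate point xbar: there exist multipliers
% lambda in R^m and mu in R^p such that
\nabla f(\bar{x})
  + \sum_{i=1}^{m} \lambda_i \nabla g_i(\bar{x})
  + \sum_{j=1}^{p} \mu_j \nabla h_j(\bar{x}) = 0,
\qquad
\lambda_i \ge 0, \quad \lambda_i\, g_i(\bar{x}) = 0 \;\; (i = 1,\dots,m).
```

Under a constraint qualification, exactness of the penalty function at a local minimizer yields these conditions: the unconstrained minimality of P_ρ forces the gradient inclusion above, with the multipliers bounded by ρ.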
Subjects: Mathematical optimization.
Nonlinear programming.
Algorithms.
Hong Kong Polytechnic University -- Dissertations
Pages: x, 135 p. : ill. ; 30 cm.
Appears in Collections: Thesis
