Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/30029
Title: Convergence analysis of a class of nonlinear penalization methods for constrained optimization via first-order necessary optimality conditions
Authors: Huang, XX
Yang, XQ 
Keywords: Differentiability
Locally Lipschitz functions
Necessary optimality conditions
Nonlinear penalization
Smooth approximate variational principle
Issue Date: 2003
Publisher: Springer
Source: Journal of Optimization Theory and Applications, 2003, v. 116, no. 2, p. 311-332
Journal: Journal of Optimization Theory and Applications
Abstract: We propose a scheme to solve constrained optimization problems by combining a nonlinear penalty method and a descent method. A sequence of nonlinear penalty optimization problems is solved to generate a sequence of stationary points, i.e., each point satisfies a first-order necessary optimality condition of a nonlinear penalty problem. Under some conditions, we show that any limit point of the sequence satisfies the first-order necessary condition of the original constrained optimization problem.
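The scheme described in the abstract, generating a stationary point of each penalty subproblem and driving the penalty parameter upward, can be sketched in Python. The paper studies a class of nonlinear penalty functions; the sketch below substitutes a classical quadratic penalty and Armijo-backtracking gradient descent as a simplified stand-in, so all function names, parameters, and the toy problem are illustrative assumptions, not the authors' method.

```python
import numpy as np

def armijo_descent(f, grad, x0, tol=1e-6, max_iter=5000):
    """Gradient descent with Armijo backtracking; returns an approximate
    stationary point (||grad f|| < tol) of the smooth function f."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        t, fx, gg = 1.0, f(x), np.dot(g, g)
        # Backtrack until the sufficient-decrease condition holds.
        while f(x - t * g) > fx - 0.5 * t * gg and t > 1e-12:
            t *= 0.5
        x = x - t * g
    return x

def penalty_method(f, f_grad, c, c_grad, x0, rho=1.0, growth=10.0, outer=8):
    """min f(x) s.t. c(x) = 0 via a sequence of penalized subproblems.

    Each outer step finds a stationary point of
        P_rho(x) = f(x) + rho * c(x)**2
    and warm-starts the next subproblem with a larger rho; under suitable
    assumptions, limit points of the stationary-point sequence satisfy the
    first-order necessary condition of the constrained problem."""
    x = x0
    for _ in range(outer):
        P = lambda x, r=rho: f(x) + r * c(x) ** 2
        gP = lambda x, r=rho: f_grad(x) + 2.0 * r * c(x) * c_grad(x)
        x = armijo_descent(P, gP, x)
        rho *= growth
    return x

# Toy problem (illustrative): min x1^2 + x2^2  s.t.  x1 + x2 = 1,
# whose solution is (0.5, 0.5).
f = lambda x: float(np.dot(x, x))
f_grad = lambda x: 2.0 * x
c = lambda x: x[0] + x[1] - 1.0
c_grad = lambda x: np.ones(2)
x_star = penalty_method(f, f_grad, c, c_grad, np.zeros(2))
# x_star is close to [0.5, 0.5]
```

The quadratic penalty makes the subproblems increasingly ill-conditioned as rho grows, which is why the step size is chosen by backtracking rather than fixed; the lower-order nonlinear penalties analyzed in the paper are motivated in part by avoiding that growth.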
URI: http://hdl.handle.net/10397/30029
ISSN: 0022-3239
EISSN: 1573-2878
DOI: 10.1023/A:1022503820909
Appears in Collections:Journal/Magazine Article


SCOPUS™ citations: 6 (last week: 0, last month: 1, as of Aug 13, 2017)
Web of Science™ citations: 6 (last week: 0, last month: 0, as of Jul 30, 2017)
Page views: 25 (last week: 1, checked on Aug 13, 2017)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.