Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/90722
Title: Convergence rate analysis of a sequential convex programming method with line search for a class of constrained difference-of-convex optimization problems
Authors: Yu, P 
Pong, TK 
Lu, Z
Issue Date: 2021
Source: SIAM Journal on Optimization, 2021, v. 31, no. 3, p. 1605-2170
Abstract: In this paper, we study the sequential convex programming method with monotone line search (SCPls) in [46] for a class of difference-of-convex (DC) optimization problems with multiple smooth inequality constraints. SCPls is a representative variant of moving-ball-approximation-type algorithms [6, 10, 13, 54] for constrained optimization problems. We analyze the convergence rate of the sequence generated by SCPls in both nonconvex and convex settings by imposing suitable Kurdyka-Łojasiewicz (KL) assumptions. Specifically, in the nonconvex settings, we assume that a special potential function related to the objective and the constraints is a KL function, while in the convex settings we impose KL assumptions directly on the extended objective function (i.e., the sum of the objective and the indicator function of the constraint set). A relationship between these two different KL assumptions is established in the convex settings under additional differentiability assumptions. We also discuss how to deduce the KL exponent of the extended objective function from its Lagrangian in the convex settings, under additional assumptions on the constraint functions. Thanks to this result, the extended objectives of some constrained optimization models, such as minimizing ℓ1 subject to logistic/Poisson loss, are found to be KL functions with exponent 1/2 under mild assumptions. To illustrate how our results can be applied, we consider SCPls for minimizing ℓ1-2 [60] subject to residual error measured by the ℓ2 norm/Lorentzian norm [21]. We first discuss how the various conditions required in our analysis can be verified, and then perform numerical experiments to illustrate the convergence behaviors of SCPls.
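As an illustrative sketch (not quoted from the paper) of the constrained DC model class described above, consider the ℓ1-2 problem with an ℓ2 residual constraint; the data A, b and the tolerance σ > 0 are assumptions introduced here, and the residual constraint is written in squared form so that it is smooth:

\[
  \min_{x \in \mathbb{R}^n} \; \|x\|_1 - \|x\|_2
  \quad \text{subject to} \quad \tfrac{1}{2}\|Ax - b\|_2^2 \le \tfrac{1}{2}\sigma^2.
\]

The objective is a difference of the two convex functions \|x\|_1 and \|x\|_2, and the constraint function is smooth. Roughly speaking, moving-ball-approximation-type methods such as SCPls handle such problems by linearizing the concave part -\|x\|_2 and replacing the smooth constraint with a quadratic (ball) majorant at each iteration, combined here with a monotone line search.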
Keywords: Constrained optimization
Difference-of-convex optimization
Kurdyka-Łojasiewicz property
Linear convergence
Publisher: Society for Industrial and Applied Mathematics
Journal: SIAM Journal on Optimization
ISSN: 1052-6234
EISSN: 1095-7189
DOI: 10.1137/20M1314057
Rights: © 2021 Society for Industrial and Applied Mathematics
First Published in SIAM Journal on Optimization in Volume 31, Issue 3, published by the Society for Industrial and Applied Mathematics (SIAM). Copyright © by SIAM. Unauthorized reproduction of this article is prohibited.
Appears in Collections: Journal/Magazine Article

Files in This Item:
File: 2439_MBA_rate_rere6_repository_typo_fixed.pdf
Description: Pre-Published version
Size: 968.8 kB
Format: Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript
