Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/99490
DC Field | Value | Language
dc.contributor | Department of Applied Mathematics | en_US
dc.creator | Zhang, Y | en_US
dc.creator | Li, G | en_US
dc.creator | Pong, TK | en_US
dc.creator | Xu, S | en_US
dc.date.accessioned | 2023-07-11T02:47:15Z | -
dc.date.available | 2023-07-11T02:47:15Z | -
dc.identifier.uri | http://hdl.handle.net/10397/99490 | -
dc.language.iso | en | en_US
dc.publisher | Springer | en_US
dc.rights | © The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2023 | en_US
dc.rights | This version of the article has been accepted for publication, after peer review (when applicable), and is subject to Springer Nature's AM terms of use (https://www.springernature.com/gp/open-research/policies/accepted-manuscript-terms), but is not the Version of Record and does not reflect post-acceptance improvements or any corrections. The Version of Record is available online at: http://dx.doi.org/10.1007/s10444-022-10002-2. | en_US
dc.subject | First-order feasible methods | en_US
dc.subject | Retraction | en_US
dc.subject | Difference-of-convex optimization | en_US
dc.subject | Kurdyka-Łojasiewicz exponents | en_US
dc.title | Retraction-based first-order feasible methods for difference-of-convex programs with smooth inequality and simple geometric constraints | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 49 | en_US
dc.identifier.doi | 10.1007/s10444-022-10002-2 | en_US
dcterms.abstract | In this paper, we propose first-order feasible methods for difference-of-convex (DC) programs with smooth inequality and simple geometric constraints. Our strategy for maintaining feasibility of the iterates is based on a "retraction" idea adapted from the manifold optimization literature. When the constraints are convex, we establish the global subsequential convergence of the sequence generated by our algorithm under a strict feasibility condition, and, when the objective is additionally convex, we analyze its convergence rate in terms of the Kurdyka-Łojasiewicz (KL) exponent of the extended objective (i.e., the sum of the objective and the indicator function of the constraint set). We also show that the extended objective of a large class of Euclidean norm (and, more generally, group LASSO penalty) regularized convex optimization problems is a KL function with exponent 1/2; consequently, our algorithm is locally linearly convergent when applied to these problems. We then extend our method to solve DC programs with a single specially structured nonconvex constraint. Finally, we discuss how our algorithms can be applied to two concrete optimization problems, namely group-structured compressed sensing with Gaussian measurement noise and compressed sensing with Cauchy measurement noise, and illustrate their empirical performance. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Advances in computational mathematics, 2023, v. 49, 8 | en_US
dcterms.isPartOf | Advances in computational mathematics | en_US
dcterms.issued | 2023 | -
dc.identifier.eissn | 1019-7168 | en_US
dc.identifier.artn | 8 | en_US
dc.description.validate | 202307 bcww | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | a2246 | -
dc.identifier.SubFormID | 47203 | -
dc.description.fundingSource | RGC | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | Green (AAM) | en_US
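The abstract describes a first-order DC method whose iterates are kept feasible via a retraction. As context only (this is not the authors' algorithm, whose retraction and constraint handling are more general), the sketch below illustrates the classical DCA template on a hypothetical toy problem: minimize g(x) - h(x) with g(x) = ||x - c||^2 and h(x) = ||x||^2 / 2 over the unit ball, linearizing the concave part -h at each step and restoring feasibility by Euclidean projection, the simplest retraction onto a ball. The data `c` and all function choices are illustrative assumptions.

```python
# Toy DCA sketch (NOT the paper's method): minimize g(x) - h(x) over the
# unit ball, with g(x) = ||x - c||^2 and h(x) = ||x||^2 / 2 (both convex).
# Feasibility is maintained by Euclidean projection onto the ball, the
# simplest "retraction" for this constraint set.
import math

def project_unit_ball(x):
    """Project x onto {x : ||x|| <= 1}."""
    nrm = math.sqrt(sum(v * v for v in x))
    return [v / nrm for v in x] if nrm > 1.0 else list(x)

def dca_step(x, c):
    """Linearize h at x (grad h(x) = x) and minimize the resulting convex
    model g(y) - <x, y>; its unconstrained minimizer is y = c + x/2,
    which is then projected back onto the feasible set."""
    y = [ci + xi / 2.0 for ci, xi in zip(c, x)]
    return project_unit_ball(y)

c = [0.3, 0.4]          # illustrative data; the minimizer is 2c = (0.6, 0.8)
x = [0.0, 0.0]          # feasible starting point
for _ in range(60):
    x = dca_step(x, c)
print(x)  # converges to approximately [0.6, 0.8]
```

Here every iterate stays feasible by construction, mirroring (in the simplest possible setting) the feasible-method property the abstract emphasizes; the paper's retraction handles smooth inequality constraints where no closed-form projection is available.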
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Zhang_Retraction-based_First-order_Feasible.pdf | Pre-Published version | 634.77 kB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.