Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/89218
DC Field: Value (Language)
dc.contributor: Department of Applied Mathematics (en_US)
dc.creator: Yu, P (en_US)
dc.creator: Pong, TK (en_US)
dc.date.accessioned: 2021-02-18T09:15:27Z
dc.date.available: 2021-02-18T09:15:27Z
dc.identifier.issn: 0926-6003 (en_US)
dc.identifier.uri: http://hdl.handle.net/10397/89218
dc.language.iso: en (en_US)
dc.publisher: Springer (en_US)
dc.rights: © Springer Science+Business Media, LLC, part of Springer Nature 2019 (en_US)
dc.rights: This is a post-peer-review, pre-copyedit version of an article published in Computational Optimization and Applications. The final authenticated version is available online at: http://dx.doi.org/10.1007/s10589-019-00081-1 (en_US)
dc.subject: Extrapolation (en_US)
dc.subject: Iteratively reweighted ℓ1 algorithm (en_US)
dc.subject: Kurdyka–Łojasiewicz property (en_US)
dc.title: Iteratively reweighted ℓ1 algorithms with extrapolation (en_US)
dc.type: Journal/Magazine Article (en_US)
dc.identifier.spage: 353 (en_US)
dc.identifier.epage: 386 (en_US)
dc.identifier.volume: 73 (en_US)
dc.identifier.issue: 2 (en_US)
dc.identifier.doi: 10.1007/s10589-019-00081-1 (en_US)
dcterms.abstract: The iteratively reweighted ℓ1 algorithm is a popular method for solving a large class of optimization problems whose objective is the sum of a Lipschitz differentiable loss function and a possibly nonconvex sparsity-inducing regularizer. In this paper, motivated by the success of extrapolation techniques in accelerating first-order methods, we study how widely used extrapolation techniques, such as those in Auslender and Teboulle (SIAM J Optim 16:697–725, 2006), Beck and Teboulle (SIAM J Imaging Sci 2:183–202, 2009), Lan et al. (Math Program 126:1–29, 2011) and Nesterov (Math Program 140:125–161, 2013), can be incorporated to possibly accelerate the iteratively reweighted ℓ1 algorithm. We consider three versions of such algorithms. For each version, we exhibit an explicitly checkable condition on the extrapolation parameters under which the generated sequence provably clusters at a stationary point of the optimization problem. We also investigate global convergence under additional Kurdyka–Łojasiewicz assumptions on certain potential functions. Our numerical experiments show that our algorithms usually outperform the general iterative shrinkage and thresholding algorithm in Gong et al. (Proc Int Conf Mach Learn 28:37–45, 2013) and an adaptation of the iteratively reweighted ℓ1 algorithm in Lu (Math Program 147:277–307, 2014, Algorithm 7) with nonmonotone line search, in terms of both CPU time and solution quality, for solving random instances of log-penalty regularized least squares problems. (en_US)
dcterms.accessRights: open access (en_US)
dcterms.bibliographicCitation: Computational optimization and applications, Jun 2019, v. 73, no. 2, p. 353-386 (en_US)
dcterms.isPartOf: Computational optimization and applications (en_US)
dcterms.issued: 2019-06
dc.identifier.scopus: 2-s2.0-85062144858
dc.identifier.eissn: 1573-2894 (en_US)
dc.description.validate: 202102 bcwh (en_US)
dc.description.oa: Accepted Manuscript (en_US)
dc.identifier.FolderNumber: a0585-n07
dc.identifier.SubFormID: 286
dc.description.fundingSource: RGC (en_US)
dc.description.fundingText: 15308516 (en_US)
dc.description.pubStatus: Published (en_US)
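The abstract above describes incorporating extrapolation into the iteratively reweighted ℓ1 algorithm for log-penalty regularized least squares. As a rough illustration of the idea only (not the paper's exact algorithms, extrapolation schedules, or stepsize rules), a minimal sketch with a fixed illustrative extrapolation weight `beta` and log-penalty reweighting might look like:

```python
def matvec(M, v):
    """Multiply matrix M (a list of rows) by vector v."""
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in M]

def irl1_extrapolation(A, b, lam=0.05, eps=0.5, beta=0.3, max_iter=2000):
    """Sketch of an iteratively reweighted l1 scheme with extrapolation for
        min_x 0.5*||A x - b||^2 + lam * sum_i log(1 + |x_i|/eps).
    Parameter values and the fixed weight `beta` are illustrative choices."""
    m, n = len(A), len(A[0])
    # Upper bound on the Lipschitz constant of the loss gradient:
    # trace(A^T A) = squared Frobenius norm >= spectral norm of A^T A.
    L = sum(a * a for row in A for a in row)
    x, x_prev = [0.0] * n, [0.0] * n
    for _ in range(max_iter):
        # Extrapolation step: y^k = x^k + beta * (x^k - x^{k-1}).
        y = [xi + beta * (xi - xpi) for xi, xpi in zip(x, x_prev)]
        # Gradient of 0.5*||Ax - b||^2 at y, i.e. A^T (A y - b).
        r = [ri - bi for ri, bi in zip(matvec(A, y), b)]
        grad = [sum(A[k][i] * r[k] for k in range(m)) for i in range(n)]
        # Reweighting: derivative of the log penalty at the current iterate.
        w = [lam / (eps + abs(xi)) for xi in x]
        z = [yi - gi / L for yi, gi in zip(y, grad)]
        x_prev = x
        # Weighted soft-thresholding (prox of the weighted l1 norm).
        x = [(1.0 if zi > 0 else -1.0) * max(abs(zi) - wi / L, 0.0)
             for zi, wi in zip(z, w)]
    return x
```

The reweighting step is what distinguishes this from a plain proximal gradient method: the nonconvex log penalty is majorized at each iteration by a weighted ℓ1 norm, whose proximal step is a cheap componentwise soft-threshold.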
Appears in Collections: Journal/Magazine Article

Files in This Item:
a0585-n07_IRL1e_re3.pdf (Pre-Published version, 1.4 MB, Adobe PDF)

Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 72 (last week: 1; as of Apr 21, 2024)
Downloads: 37 (as of Apr 21, 2024)
SCOPUS™ citations: 11 (as of Apr 19, 2024)
Web of Science™ citations: 8 (as of Apr 18, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.