Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/98623
DC Field | Value | Language
dc.contributor | Department of Applied Mathematics | en_US
dc.creator | Li, X | en_US
dc.creator | Sun, D | en_US
dc.creator | Toh, KC | en_US
dc.date.accessioned | 2023-05-10T02:00:43Z | -
dc.date.available | 2023-05-10T02:00:43Z | -
dc.identifier.issn | 1052-6234 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/98623 | -
dc.language.iso | en | en_US
dc.publisher | Society for Industrial and Applied Mathematics | en_US
dc.rights | © 2018 Society for Industrial and Applied Mathematics | en_US
dc.rights | The following publication Li, X., Sun, D., & Toh, K. C. (2018). A highly efficient semismooth Newton augmented Lagrangian method for solving Lasso problems. SIAM Journal on Optimization, 28(1), 433-458 is available at https://doi.org/10.1137/16M1097572. | en_US
dc.subject | Lasso | en_US
dc.subject | Sparse optimization | en_US
dc.subject | Augmented Lagrangian | en_US
dc.subject | Metric subregularity | en_US
dc.subject | Semismoothness | en_US
dc.subject | Newton’s method | en_US
dc.title | A highly efficient semismooth Newton augmented Lagrangian method for solving Lasso problems | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 433 | en_US
dc.identifier.epage | 458 | en_US
dc.identifier.volume | 28 | en_US
dc.identifier.issue | 1 | en_US
dc.identifier.doi | 10.1137/16M1097572 | en_US
dcterms.abstract | We develop a fast and robust algorithm for solving large-scale convex composite optimization models with an emphasis on the ℓ1-regularized least squares regression (lasso) problems. Despite the fact that there exist a large number of solvers in the literature for the lasso problems, we found that no solver can efficiently handle difficult large-scale regression problems with real data. By leveraging available error bound results to realize the asymptotic superlinear convergence property of the augmented Lagrangian algorithm, and by exploiting the second-order sparsity of the problem through the semismooth Newton method, we are able to propose an algorithm, called Ssnal, to efficiently solve the aforementioned difficult problems. Under very mild conditions, which hold automatically for lasso problems, both the primal and the dual iteration sequences generated by Ssnal possess a fast linear convergence rate, which can even be superlinear asymptotically. Numerical comparisons between our approach and a number of state-of-the-art solvers, on real data sets, are presented to demonstrate the high efficiency and robustness of our proposed algorithm in solving difficult large-scale lasso problems. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | SIAM journal on optimization, 2018, v. 28, no. 1, p. 433-458 | en_US
dcterms.isPartOf | SIAM journal on optimization | en_US
dcterms.issued | 2018 | -
dc.identifier.scopus | 2-s2.0-85049681478 | -
dc.identifier.eissn | 1095-7189 | en_US
dc.description.validate | 202305 bcch | en_US
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | AMA-0402 | -
dc.description.fundingSource | Self-funded | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 20280409 | -
dc.description.oaCategory | VoR allowed | en_US
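For context on the record above: the abstract concerns the lasso problem, min over x of 0.5‖Ax − b‖² + λ‖x‖₁. The sketch below illustrates that problem with a plain proximal-gradient (ISTA) baseline built on the soft-thresholding operator; it is NOT the paper's Ssnal (semismooth Newton augmented Lagrangian) algorithm, and all names, sizes, and the λ value are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista_lasso(A, b, lam, n_iter=500):
    # Proximal gradient (ISTA) for: min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    # Step size 1/L, where L = ||A||_2^2 bounds the gradient's Lipschitz constant.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Synthetic example: sparse ground truth, noiseless observations.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
x_hat = ista_lasso(A, b, lam=0.1)
```

ISTA only converges linearly at best; the paper's point is that an augmented Lagrangian method with semismooth Newton inner solves can be far faster on difficult large-scale instances.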
Appears in Collections:Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
16m1097572.pdf | | 473.92 kB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record

Page views: 114 (30 in the last week), as of Nov 10, 2025
Downloads: 123, as of Nov 10, 2025
Scopus citations: 126, as of Dec 19, 2025
Web of Science citations: 127, as of Dec 18, 2025

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.