Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/98587
DC Field | Value | Language
dc.contributor | Department of Applied Mathematics | en_US
dc.creator | Lin, M | en_US
dc.creator | Liu, YJ | en_US
dc.creator | Sun, D | en_US
dc.creator | Toh, KC | en_US
dc.date.accessioned | 2023-05-10T02:00:30Z | -
dc.date.available | 2023-05-10T02:00:30Z | -
dc.identifier.issn | 1052-6234 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/98587 | -
dc.language.iso | en | en_US
dc.publisher | Society for Industrial and Applied Mathematics | en_US
dc.rights | © 2019 Society for Industrial and Applied Mathematics | en_US
dc.rights | The following publication Lin, M., Liu, Y. J., Sun, D., & Toh, K. C. (2019). Efficient sparse semismooth Newton methods for the clustered Lasso problem. SIAM Journal on Optimization, 29(3), 2026-2052 is available at https://doi.org/10.1137/18M1207752. | en_US
dc.subject | Clustered Lasso | en_US
dc.subject | Augmented Lagrangian method | en_US
dc.subject | Semismooth Newton method | en_US
dc.subject | Convex minimization | en_US
dc.title | Efficient sparse semismooth Newton methods for the clustered Lasso problem | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 2026 | en_US
dc.identifier.epage | 2052 | en_US
dc.identifier.volume | 29 | en_US
dc.identifier.issue | 3 | en_US
dc.identifier.doi | 10.1137/18M1207752 | en_US
dcterms.abstract | We focus on solving the clustered Lasso problem, which is a least squares problem with the $\ell_1$-type penalties imposed on both the coefficients and their pairwise differences to learn the group structure of the regression parameters. Here we first reformulate the clustered Lasso regularizer as a weighted ordered-Lasso regularizer, which is essential in reducing the computational cost from $O(n^2)$ to $O(n\log(n))$. We then propose an inexact semismooth Newton augmented Lagrangian (Ssnal) algorithm to solve the clustered Lasso problem or its dual via this equivalent formulation, depending on whether the sample size is larger than the dimension of the features. An essential component of the Ssnal algorithm is the computation of the generalized Jacobian of the proximal mapping of the clustered Lasso regularizer. Based on the new formulation, we derive an efficient procedure for its computation. Comprehensive results on the global convergence and local linear convergence of the Ssnal algorithm are established. For the purpose of exposition and comparison, we also summarize/design several first-order methods that can be used to solve the problem under consideration, but with the key improvement from the new formulation of the clustered Lasso regularizer. As a demonstration of the applicability of our algorithms, numerical experiments on the clustered Lasso problem are performed. The experiments show that the Ssnal algorithm substantially outperforms the best alternative algorithm for the clustered Lasso problem. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | SIAM journal on optimization, 2019, v. 29, no. 3, p. 2026-2052 | en_US
dcterms.isPartOf | SIAM journal on optimization | en_US
dcterms.issued | 2019 | -
dc.identifier.scopus | 2-s2.0-85073701649 | -
dc.identifier.eissn | 1095-7189 | en_US
dc.description.validate | 202305 bcch | en_US
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | AMA-0272 | -
dc.description.fundingSource | Others | en_US
dc.description.fundingText | PolyU | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 20279970 | -
dc.description.oaCategory | VoR allowed | en_US
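
A note on the abstract above: the claimed reduction from $O(n^2)$ to $O(n\log(n))$ rests on the fact that the pairwise-difference term in the clustered Lasso regularizer collapses to a weighted sum of the sorted coefficients. The Python sketch below illustrates only this sorting identity; the function names and the penalty weights rho1, rho2 are our own illustrative choices, and this is not the authors' implementation of the weighted ordered-Lasso reformulation or the Ssnal algorithm.

```python
import numpy as np

def clustered_lasso_penalty_naive(beta, rho1, rho2):
    """Clustered Lasso regularizer via all O(n^2) pairwise differences."""
    n = beta.size
    pairwise = sum(abs(beta[i] - beta[j])
                   for i in range(n) for j in range(i + 1, n))
    return rho1 * np.abs(beta).sum() + rho2 * pairwise

def clustered_lasso_penalty_sorted(beta, rho1, rho2):
    """Same value in O(n log n): with beta sorted ascending,
    sum_{i<j} |beta_i - beta_j| = sum_{k=1}^{n} (2k - n - 1) * beta_(k)."""
    n = beta.size
    s = np.sort(beta)                              # ascending sort, O(n log n)
    weights = 2.0 * np.arange(1, n + 1) - n - 1    # 2k - n - 1 for k = 1..n
    return rho1 * np.abs(beta).sum() + rho2 * float(weights @ s)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    beta = rng.standard_normal(500)
    rho1, rho2 = 0.1, 0.05                         # illustrative weights only
    assert np.isclose(clustered_lasso_penalty_naive(beta, rho1, rho2),
                      clustered_lasso_penalty_sorted(beta, rho1, rho2))
```

The same sorted, weighted structure is what the paper exploits to evaluate the regularizer and its proximal mapping efficiently; see the full text for the authors' actual derivation.
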
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
18m1207752.pdf |  | 721.28 kB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record
Access: View full-text via PolyU eLinks SFX Query
Page views: 132 (as of Nov 10, 2025)
Downloads: 114 (as of Nov 10, 2025)
SCOPUS™ citations: 24 (as of Dec 19, 2025)
Web of Science™ citations: 27 (as of Dec 18, 2025)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.