Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/98568
DC Field | Value | Language
dc.contributor | Department of Applied Mathematics | en_US
dc.creator | Zhang, Y | en_US
dc.creator | Zhang, N | en_US
dc.creator | Sun, D | en_US
dc.creator | Toh, KC | en_US
dc.date.accessioned | 2023-05-10T02:00:22Z | -
dc.date.available | 2023-05-10T02:00:22Z | -
dc.identifier.issn | 0025-5610 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/98568 | -
dc.language.iso | en | en_US
dc.publisher | Springer | en_US
dc.rights | © Springer-Verlag GmbH Germany, part of Springer Nature and Mathematical Optimization Society 2018 | en_US
dc.rights | This version of the article has been accepted for publication, after peer review (when applicable) and is subject to Springer Nature's AM terms of use (https://www.springernature.com/gp/open-research/policies/accepted-manuscript-terms), but is not the Version of Record and does not reflect post-acceptance improvements, or any corrections. The Version of Record is available online at: http://dx.doi.org/10.1007/s10107-018-1329-6. | en_US
dc.subject | Sparse group Lasso | en_US
dc.subject | Generalized Jacobian | en_US
dc.subject | Augmented Lagrangian method | en_US
dc.subject | Semismooth Newton method | en_US
dc.title | An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 223 | en_US
dc.identifier.epage | 263 | en_US
dc.identifier.volume | 179 | en_US
dc.identifier.issue | 1-2 | en_US
dc.identifier.doi | 10.1007/s10107-018-1329-6 | en_US
dcterms.abstract | The sparse group Lasso is a widely used statistical model which encourages sparsity both at the group level and within each group. In this paper, we develop an efficient augmented Lagrangian method for large-scale non-overlapping sparse group Lasso problems, with each subproblem solved by a superlinearly convergent inexact semismooth Newton method. Theoretically, we prove that, if the penalty parameter is chosen sufficiently large, the augmented Lagrangian method converges globally at an arbitrarily fast linear rate for the primal iterative sequence, the dual infeasibility, and the duality gap of the primal and dual objective functions. Computationally, we derive explicitly the generalized Jacobian of the proximal mapping associated with the sparse group Lasso regularizer and fully exploit the underlying second order sparsity through the semismooth Newton method. The efficiency and robustness of the proposed algorithm are demonstrated by numerical experiments on both synthetic and real data sets. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Mathematical programming, Jan. 2020, v. 179, no. 1-2, p. 223-263 | en_US
dcterms.isPartOf | Mathematical programming | en_US
dcterms.issued | 2020-01 | -
dc.identifier.scopus | 2-s2.0-85053538234 | -
dc.identifier.eissn | 1436-4646 | en_US
dc.description.validate | 202305 bcch | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | AMA-0223 | -
dc.description.fundingSource | RGC | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 20279775 | -
dc.description.oaCategory | Green (AAM) | en_US
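As background to the abstract above: the algorithm it describes works with the proximal mapping of the sparse group Lasso regularizer lam1*||x||_1 + lam2*sum_g ||x_g||_2, which (for non-overlapping groups) is known to decompose into elementwise soft-thresholding followed by groupwise block soft-thresholding. A minimal sketch, with an illustrative function name and group encoding not taken from the paper:

```python
import numpy as np

def prox_sparse_group_lasso(v, groups, lam1, lam2):
    # Proximal mapping of  lam1 * ||x||_1 + lam2 * sum_g ||x_g||_2
    # for non-overlapping groups. The prox decomposes: first apply
    # elementwise soft-thresholding, then block soft-threshold each group.
    u = np.sign(v) * np.maximum(np.abs(v) - lam1, 0.0)  # soft-threshold
    x = np.zeros_like(u)
    for g in groups:  # each g is a list of indices forming one group
        norm_g = np.linalg.norm(u[g])
        if norm_g > lam2:
            x[g] = (1.0 - lam2 / norm_g) * u[g]  # shrink the whole block
        # else: the entire group is set to zero (group-level sparsity)
    return x
```

The paper's contribution goes well beyond this mapping itself: it derives the generalized Jacobian of this (nonsmooth) prox, which is what the semismooth Newton subproblem solver needs.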
Appears in Collections: Journal/Magazine Article

Files in This Item:
File | Description | Size | Format
Zhang_Efficient_Hessian_Based.pdf | Pre-Published version | 4.35 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 89 (as of Apr 14, 2025)
Downloads: 96 (as of Apr 14, 2025)
SCOPUS citations: 56 (as of Dec 19, 2025)
Web of Science citations: 41 (as of Oct 10, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.