Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/98518
DC Field | Value | Language
dc.contributor | Department of Applied Mathematics | en_US
dc.creator | Tang, P | en_US
dc.creator | Wang, C | en_US
dc.creator | Sun, D | en_US
dc.creator | Toh, KC | en_US
dc.date.accessioned | 2023-05-10T02:00:01Z | -
dc.date.available | 2023-05-10T02:00:01Z | -
dc.identifier.issn | 1532-4435 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/98518 | -
dc.language.iso | en | en_US
dc.publisher | Journal of Machine Learning Research | en_US
dc.rights | © 2020 Peipei Tang, Chengjing Wang, Defeng Sun and Kim-Chuan Toh. | en_US
dc.rights | License: CC-BY 4.0, see https://creativecommons.org/licenses/by/4.0/. Attribution requirements are provided at http://jmlr.org/papers/v21/19-247.html. | en_US
dc.rights | The following publication Tang, P., Wang, C., Sun, D., & Toh, K. C. (2020). A sparse semismooth Newton based proximal majorization-minimization algorithm for nonconvex square-root-loss regression problem. The Journal of Machine Learning Research, 21, 226. is available at https://www.jmlr.org/papers/v21/19-247.html. | en_US
dc.subject | Nonconvex square-root regression problems | en_US
dc.subject | Proximal majorization-minimization | en_US
dc.subject | Semismooth Newton method | en_US
dc.title | A sparse semismooth Newton based proximal majorization-minimization algorithm for nonconvex square-root-loss regression problems | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 1 | en_US
dc.identifier.epage | 38 | en_US
dc.identifier.volume | 21 | en_US
dcterms.abstract | In this paper, we consider high-dimensional nonconvex square-root-loss regression problems and introduce a proximal majorization-minimization (PMM) algorithm for solving these problems. Our key idea for making the proposed PMM efficient is to develop a sparse semismooth Newton method to solve the corresponding subproblems. By using the Kurdyka-Łojasiewicz property exhibited by the underlying problems, we prove that the PMM algorithm converges to a d-stationary point. We also analyze the oracle property of the initial subproblem used in our algorithm. Extensive numerical experiments are presented to demonstrate the high efficiency of the proposed PMM algorithm. (See the notational sketch below the record.) | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Journal of machine learning research, 2020, v. 21, 226, p. 1-38 | en_US
dcterms.isPartOf | Journal of machine learning research | en_US
dcterms.issued | 2020 | -
dc.identifier.eissn | 1533-7928 | en_US
dc.identifier.artn | 226 | en_US
dc.description.validate | 202305 bcch | en_US
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | AMA-0108 | -
dc.description.fundingSource | RGC | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 54170771 | -
dc.description.oaCategory | CC | en_US
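
The abstract above describes the model and the PMM scheme only in words. As a reading aid, here is a minimal notational sketch, in LaTeX, of a square-root-loss regression problem with a nonconvex regularizer and of one PMM step. The symbols A, b, \lambda, p, q, x^k and \sigma_k are illustrative assumptions (a generic difference-of-convex setup with smooth q), not notation taken from the paper itself.

% Assumed generic form: square-root loss plus a nonconvex regularizer
% written as a difference of convex functions, p - q, with q smooth.
\[
  \min_{x \in \mathbb{R}^{n}} \; \|Ax - b\|_{2} + \lambda \bigl( p(x) - q(x) \bigr).
\]
% One PMM step: majorize the objective by linearizing the concave part -q at
% the current iterate x^{k} (valid because q is convex) and adding a proximal
% term with parameter \sigma_{k} > 0; the resulting convex subproblem is then
% solved, in the paper's approach, by a semismooth Newton method.
\[
  x^{k+1} \in \operatorname*{arg\,min}_{x \in \mathbb{R}^{n}}
  \; \|Ax - b\|_{2} + \lambda p(x)
  - \lambda \bigl\langle \nabla q(x^{k}),\, x - x^{k} \bigr\rangle
  + \frac{\sigma_{k}}{2} \, \|x - x^{k}\|_{2}^{2}.
\]

Each subproblem in this sketch is convex, and in high-dimensional settings its second-order (generalized Hessian) information is typically very sparse, which is presumably what makes the "sparse" semismooth Newton solver mentioned in the abstract efficient.
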
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
19-247.pdf | | 1.09 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record