Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/74472
DC Field: Value (Language)
dc.contributor: Department of Applied Mathematics (en_US)
dc.creator: Shi, Y (en_US)
dc.creator: Feng, Z (en_US)
dc.creator: Yiu, KFC (en_US)
dc.date.accessioned: 2018-03-29T07:16:54Z
dc.date.available: 2018-03-29T07:16:54Z
dc.identifier.issn: 1862-4472 (en_US)
dc.identifier.uri: http://hdl.handle.net/10397/74472
dc.language.iso: en (en_US)
dc.publisher: Springer (en_US)
dc.rights: © Springer-Verlag GmbH Germany 2017 (en_US)
dc.rights: This version of the article has been accepted for publication, after peer review (when applicable) and is subject to Springer Nature’s AM terms of use (https://www.springernature.com/gp/open-research/policies/accepted-manuscript-terms), but is not the Version of Record and does not reflect post-acceptance improvements, or any corrections. The Version of Record is available online at: http://dx.doi.org/10.1007/s11590-017-1157-2 (en_US)
dc.subject: Descent method (en_US)
dc.subject: LASSO (en_US)
dc.subject: Least absolute deviation (en_US)
dc.subject: Nonsmooth optimization (en_US)
dc.title: A descent method for least absolute deviation lasso problems (en_US)
dc.type: Journal/Magazine Article (en_US)
dc.identifier.spage: 543 (en_US)
dc.identifier.epage: 559 (en_US)
dc.identifier.volume: 13 (en_US)
dc.identifier.issue: 3 (en_US)
dc.identifier.doi: 10.1007/s11590-017-1157-2 (en_US)
dcterms.abstract: Variable selection is an important method for analyzing large quantities of data and extracting useful information. Although least squares regression is the most widely used scheme because of its flexibility in obtaining explicit solutions, least absolute deviation (LAD) regression combined with the lasso penalty, denoted LAD-LASSO, has become popular for its resistance to heavy-tailed errors in the response variable. In this paper, we consider the LAD-LASSO problem for variable selection. Based on a dynamic optimality condition for nonsmooth optimization problems, we develop a descent method to solve the nonsmooth optimization problem. Numerical experiments confirm that the proposed method is more efficient than existing methods. (en_US)
dcterms.accessRights: open access (en_US)
dcterms.bibliographicCitation: Optimization letters, Apr. 2019, v. 13, no. 3, p. 543-559 (en_US)
dcterms.isPartOf: Optimization letters (en_US)
dcterms.issued: 2019-04
dc.identifier.scopus: 2-s2.0-85021768357
dc.identifier.eissn: 1862-4480 (en_US)
dc.description.validate: 201802 bcrc (en_US)
dc.description.oa: Accepted Manuscript (en_US)
dc.identifier.FolderNumber: AMA-0297
dc.description.fundingSource: RGC (en_US)
dc.description.fundingSource: Others (en_US)
dc.description.fundingText: PolyU (en_US)
dc.description.pubStatus: Published (en_US)
dc.identifier.OPUS: 14561589
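
For context on the abstract above: the LAD-LASSO estimator it refers to is usually written as the nonsmooth optimization problem (notation mine, not necessarily the paper's)

    \min_{\beta \in \mathbb{R}^p} \; \lVert y - X\beta \rVert_1 + \lambda \lVert \beta \rVert_1,

where y \in \mathbb{R}^n is the response, X \in \mathbb{R}^{n \times p} is the design matrix, and \lambda \ge 0 is the lasso tuning parameter. Both the loss and the penalty are nondifferentiable, which is why the abstract speaks of a descent method for nonsmooth optimization.

The paper's own descent method (built on a dynamic optimality condition) is not reproduced here. As a minimal, hedged sketch of what solving this objective involves, the Python snippet below applies a plain subgradient method, a common baseline rather than the authors' algorithm; all names, data shapes, and step-size choices are illustrative assumptions.

import numpy as np

def lad_lasso_objective(beta, X, y, lam):
    # Nonsmooth LAD-LASSO objective: L1 residual loss plus L1 penalty.
    return np.sum(np.abs(y - X @ beta)) + lam * np.sum(np.abs(beta))

def lad_lasso_subgradient(beta, X, y, lam):
    # One valid subgradient at beta (np.sign(0) = 0 picks a particular element).
    residual = y - X @ beta
    return -X.T @ np.sign(residual) + lam * np.sign(beta)

def subgradient_method(X, y, lam, n_iter=5000, step0=1e-2):
    # Diminishing-step subgradient iteration; keeps the best iterate seen,
    # because individual subgradient steps need not decrease the objective.
    beta = np.zeros(X.shape[1])
    best_beta, best_val = beta.copy(), lad_lasso_objective(beta, X, y, lam)
    for k in range(1, n_iter + 1):
        g = lad_lasso_subgradient(beta, X, y, lam)
        beta = beta - (step0 / np.sqrt(k)) * g
        val = lad_lasso_objective(beta, X, y, lam)
        if val < best_val:
            best_beta, best_val = beta.copy(), val
    return best_beta, best_val

# Illustrative use on synthetic data with heavy-tailed noise (the setting in
# which the abstract says LAD-LASSO is preferable to least-squares lasso):
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
beta_true = np.zeros(10)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + rng.standard_t(df=3, size=100)
beta_hat, obj_val = subgradient_method(X, y, lam=5.0)
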
Appears in Collections: Journal/Magazine Article
Files in This Item:
File: Shi_Descent_Method_Least.pdf
Description: Pre-Published version
Size: 887.83 kB
Format: Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

