Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/74472
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Applied Mathematics | en_US |
dc.creator | Shi, Y | en_US |
dc.creator | Feng, Z | en_US |
dc.creator | Yiu, KFC | en_US |
dc.date.accessioned | 2018-03-29T07:16:54Z | - |
dc.date.available | 2018-03-29T07:16:54Z | - |
dc.identifier.issn | 1862-4472 | en_US |
dc.identifier.uri | http://hdl.handle.net/10397/74472 | - |
dc.language.iso | en | en_US |
dc.publisher | Springer | en_US |
dc.rights | © Springer-Verlag GmbH Germany 2017 | en_US |
dc.rights | This version of the article has been accepted for publication, after peer review (when applicable) and is subject to Springer Nature’s AM terms of use (https://www.springernature.com/gp/open-research/policies/accepted-manuscript-terms), but is not the Version of Record and does not reflect post-acceptance improvements, or any corrections. The Version of Record is available online at: http://dx.doi.org/10.1007/s11590-017-1157-2 | en_US |
dc.subject | Descent method | en_US |
dc.subject | LASSO | en_US |
dc.subject | Least absolute deviation | en_US |
dc.subject | Nonsmooth optimization | en_US |
dc.title | A descent method for least absolute deviation lasso problems | en_US |
dc.type | Journal/Magazine Article | en_US |
dc.identifier.spage | 543 | en_US |
dc.identifier.epage | 559 | en_US |
dc.identifier.volume | 13 | en_US |
dc.identifier.issue | 3 | en_US |
dc.identifier.doi | 10.1007/s11590-017-1157-2 | en_US |
dcterms.abstract | Variable selection is an important method for analyzing large quantities of data and extracting useful information. Although least squares regression is the most widely used scheme owing to its flexibility in obtaining explicit solutions, least absolute deviation (LAD) regression combined with the lasso penalty, denoted LAD-LASSO, has become popular for its resistance to heavy-tailed errors in the response variable. In this paper, we consider the LAD-LASSO problem for variable selection. Based on a dynamic optimality condition for nonsmooth optimization problems, we develop a descent method to solve the nonsmooth optimization problem. Numerical experiments confirm that the proposed method is more efficient than existing methods. | en_US |
dcterms.accessRights | open access | en_US |
dcterms.bibliographicCitation | Optimization letters, Apr. 2019, v. 13, no. 3, p. 543-559 | en_US |
dcterms.isPartOf | Optimization letters | en_US |
dcterms.issued | 2019-04 | - |
dc.identifier.scopus | 2-s2.0-85021768357 | - |
dc.identifier.eissn | 1862-4480 | en_US |
dc.description.validate | 201802 bcrc | en_US |
dc.description.oa | Accepted Manuscript | en_US |
dc.identifier.FolderNumber | AMA-0297 | - |
dc.description.fundingSource | RGC | en_US |
dc.description.fundingSource | Others | en_US |
dc.description.fundingText | PolyU | en_US |
dc.description.pubStatus | Published | en_US |
dc.identifier.OPUS | 14561589 | - |
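The abstract describes the LAD-LASSO objective, min_beta ||y - X beta||_1 + lambda ||beta||_1, which is nonsmooth in both terms. The sketch below is not the paper's descent method; it is a minimal baseline using normalized subgradient descent with numpy, with illustrative problem dimensions and a diminishing step size assumed for demonstration.

```python
import numpy as np

def lad_lasso_objective(X, y, beta, lam):
    """LAD-LASSO objective: ||y - X beta||_1 + lam * ||beta||_1."""
    return np.abs(y - X @ beta).sum() + lam * np.abs(beta).sum()

def subgradient_descent(X, y, lam, n_iter=5000):
    """Plain normalized subgradient descent on the LAD-LASSO objective.

    A generic baseline for the nonsmooth problem, not the authors' method.
    Tracks the best iterate, since subgradient steps are not monotone.
    """
    beta = np.zeros(X.shape[1])
    best, best_val = beta.copy(), lad_lasso_objective(X, y, beta, lam)
    for t in range(1, n_iter + 1):
        r = y - X @ beta
        # A subgradient: -X^T sign(r) for the LAD term, lam*sign(beta) for the penalty
        g = -X.T @ np.sign(r) + lam * np.sign(beta)
        # Diminishing step along the normalized subgradient direction
        beta = beta - (1.0 / np.sqrt(t)) * g / (np.linalg.norm(g) + 1e-12)
        val = lad_lasso_objective(X, y, beta, lam)
        if val < best_val:
            best, best_val = beta.copy(), val
    return best, best_val
```

Tracking the best iterate is the standard convergence device for subgradient methods, since individual steps can increase the objective; the paper's contribution is a descent method that avoids this non-monotonicity.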
Appears in Collections: | Journal/Magazine Article |
Files in This Item:
File | Description | Size | Format
---|---|---|---
Shi_Descent_Method_Least.pdf | Pre-Published version | 887.83 kB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.