Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/87579
DC Field | Value | Language
dc.contributor | Department of Applied Mathematics | -
dc.creator | Huang, J | -
dc.creator | Jiao, Y | -
dc.creator | Liu, Y | -
dc.creator | Lu, X | -
dc.date.accessioned | 2020-07-16T03:59:04Z | -
dc.date.available | 2020-07-16T03:59:04Z | -
dc.identifier.issn | 1532-4435 | -
dc.identifier.uri | http://hdl.handle.net/10397/87579 | -
dc.language.iso | en | en_US
dc.publisher | MIT Press | en_US
dc.rights | © 2018 Jian Huang, Yuling Jiao, Yanyan Liu, and Xiliang Lu. | en_US
dc.rights | License: CC-BY 4.0, see https://creativecommons.org/licenses/by/4.0/. Attribution requirements are provided at http://jmlr.org/papers/v19/17-194.html. | en_US
dc.rights | The following publication Huang, J., Jiao, Y., Liu, Y., & Lu, X. (2018). A constructive approach to L0 penalized regression. Journal of Machine Learning Research, 19, 1-37 is available at http://www.jmlr.org/papers/v19/17-194.html | en_US
dc.subject | Geometrical convergence | en_US
dc.subject | KKT conditions | en_US
dc.subject | Nonasymptotic error bounds | en_US
dc.title | A constructive approach to L0 penalized regression | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 1 | -
dc.identifier.epage | 37 | -
dc.identifier.volume | 19 | -
dcterms.abstract | We propose a constructive approach to estimating sparse, high-dimensional linear regression models. The approach is a computational algorithm motivated by the KKT conditions for the ℓ0-penalized least squares solutions. It generates a sequence of solutions iteratively, based on support detection using primal and dual information and root finding. We refer to the algorithm as SDAR for brevity. Under a sparse Riesz condition on the design matrix and certain other conditions, we show that with high probability, the ℓ2 estimation error of the solution sequence decays exponentially to the minimax error bound in O(log(R√J)) iterations, where J is the number of important predictors and R is the relative magnitude of the nonzero target coefficients; and under a mutual coherence condition and certain other conditions, the ℓ1 estimation error decays to the optimal error bound in O(log(R)) iterations. Moreover, the SDAR solution recovers the oracle least squares estimator within a finite number of iterations with high probability if the sparsity level is known. Computational complexity analysis shows that the cost of SDAR is O(np) per iteration. We also consider an adaptive version of SDAR for use in practical applications where the true sparsity level is unknown. Simulation studies demonstrate that SDAR outperforms Lasso, MCP and two greedy methods in accuracy and efficiency. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Journal of machine learning research, Aug. 2018, v. 19, p. 1-37 | -
dcterms.isPartOf | Journal of machine learning research | -
dcterms.issued | 2018 | -
dc.identifier.eissn | 1533-7928 | -
dc.identifier.rosgroupid | 2018003908 | -
dc.description.ros | 2018-2019 > Academic research: refereed > Publication in refereed journal | -
dc.description.validate | 202007 bcrc | -
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | OA_Others (ROS1819) | en_US
dc.description.pubStatus | Published | en_US
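
The abstract above outlines the SDAR iteration: support detection from primal and dual information, followed by root finding via least squares on the detected support. Below is a minimal NumPy sketch of that scheme as described in the abstract; the function name `sdar`, the stopping rule, and the parameters `T` and `max_iter` are illustrative choices of ours, not taken from the paper, so consult the published article for the exact algorithm and its theory.

```python
import numpy as np

def sdar(X, y, T, max_iter=50):
    """Sketch of an SDAR-style iteration for a given sparsity level T.

    Each step alternates support detection (the T largest entries of
    |beta + d|, where d = X'(y - X beta)/n is the dual variable) with
    root finding (exact least squares restricted to that support).
    The dominant per-iteration cost is the O(np) product X'r, in line
    with the abstract's complexity claim.
    """
    n, p = X.shape
    beta = np.zeros(p)
    d = X.T @ y / n                        # dual information at beta = 0
    active = np.array([], dtype=int)
    for _ in range(max_iter):
        # Support detection: T largest entries of |beta + d|.
        new_active = np.sort(np.argsort(np.abs(beta + d))[-T:])
        if np.array_equal(new_active, active):
            break                          # active set has stabilized
        active = new_active
        # Root finding: least squares on the active columns only.
        beta = np.zeros(p)
        beta[active] = np.linalg.lstsq(X[:, active], y, rcond=None)[0]
        d = X.T @ (y - X @ beta) / n       # refresh the dual variable
    return beta, active


# Illustrative usage on synthetic data with 5 important predictors.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 500))
beta_true = np.zeros(500)
beta_true[:5] = 3.0
y = X @ beta_true + 0.5 * rng.standard_normal(100)
beta_hat, support = sdar(X, y, T=5)
```

For the adaptive setting the abstract mentions, where the true sparsity level is unknown, one would run the iteration over a grid of `T` values and select a fit by a data-driven criterion (e.g. cross-validation); the tuning rule here is a stand-in, not the paper's prescription.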
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Huang_L0_Penalized_Regression.pdf | - | 543.08 kB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.