Title: On the lasso regression and asymmetric Laplace distribution with applications
Authors: Shi, Yue
Advisors: Yiu, Ka-fai (AMA)
Keywords: Mathematical optimization
Issue Date: 2018
Publisher: The Hong Kong Polytechnic University
Abstract: In this thesis, we consider four classes of optimization models. The first class is LAD Generalized Lasso models: we develop a descent algorithm for LAD-Lasso and a new active zero set descent algorithm for LAD Generalized Lasso under nonsmooth optimality conditions. The second class is constrained LAD-Lasso models: we extend the descent algorithm to handle the constraints as well, and study an application to Mean Absolute Deviation (MAD) Lasso portfolio selection. The third class is the selection of the penalty parameter for compressive sensing, for which we carry out tests using several selection criteria. The fourth class is optimization under Asymmetric Laplace Distributions, namely a robust mixture linear regression model and portfolio selection.

We first consider LAD Generalized Lasso models. Under dynamic nonsmooth optimality conditions, we develop a descent algorithm that selects the fastest descent directions for LAD-Lasso regression. We then derive a new active zero set descent algorithm for LAD Generalized Lasso regression; the algorithm updates the zero set and the basis search directions recursively until the optimality conditions are satisfied, and we show that it converges in finitely many steps.

We then consider constrained LAD-Lasso models. We develop a descent algorithm that updates descent directions selected from a basis directional set for the nonsmooth optimization problems arising in the MAD-Lasso portfolio selection strategy, and provide extensive real-data analysis to evaluate out-of-sample performance.

We next consider selection of the penalty parameter. For a compressive-sensing-based signal recovery model, we apply regularized least squares for sparse reconstruction, since it can reconstruct a speech signal from a noisy observation, and propose a two-level optimization strategy that incorporates the quality design attributes into the sparse solution for compressive speech enhancement by hyper-parameterizing the tuning parameter. The first level involves the compression of the big data, and the second level optimizes the tuning parameter using different criteria, such as the Gini index, the Akaike Information Criterion (AIC), and the Bayesian Information Criterion (BIC). The set of solutions can then be measured against the desired design attributes to achieve the best trade-off between suppression and distortion.

Finally, we study two models under Asymmetric Laplace Distributions. We first present an efficient two-level latent EM algorithm for parameter estimation of mixture linear regression models, with the group label as the first-level latent variable and the Laplace intermediate variable as the second-level latent variable. Explicit updating formulas for each iteration are derived, which significantly reduces the computational complexity. We then consider a robust portfolio selection model, derive the Expectation-Maximization (EM) algorithm for parameter estimation of the Asymmetric Laplace distribution, and provide efficient-frontier analysis to evaluate performance.
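The LAD-Lasso objective described above minimizes the sum of absolute residuals plus an L1 penalty, min_beta ||y - X beta||_1 + lambda ||beta||_1. As a point of reference, the sketch below solves this same objective through a standard linear-programming reformulation (splitting absolute values into bounded auxiliary variables). This is a generic baseline, not the thesis's descent or active zero set algorithm, and the function name and arguments are illustrative:

```python
import numpy as np
from scipy.optimize import linprog

def lad_lasso(X, y, lam):
    """Solve min_beta ||y - X beta||_1 + lam * ||beta||_1 via an LP.

    Variables: beta (free), u >= |y - X beta| elementwise, v >= |beta|.
    Minimizing sum(u) + lam * sum(v) makes the bounds tight at optimum.
    """
    n, p = X.shape
    Z = np.zeros
    I_n, I_p = np.eye(n), np.eye(p)
    # objective: 0 * beta + 1 * u + lam * v
    c = np.concatenate([np.zeros(p), np.ones(n), lam * np.ones(p)])
    A_ub = np.block([
        [ X,   -I_n,      Z((n, p))],  #  X beta - u <=  y
        [-X,   -I_n,      Z((n, p))],  # -X beta - u <= -y
        [ I_p,  Z((p, n)), -I_p    ],  #  beta - v <= 0
        [-I_p,  Z((p, n)), -I_p    ],  # -beta - v <= 0
    ])
    b_ub = np.concatenate([y, -y, np.zeros(2 * p)])
    bounds = [(None, None)] * p + [(0, None)] * (n + p)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p]
```

Any LP solver will do here; specialized descent methods such as those developed in the thesis avoid forming the (2n + 2p)-row constraint matrix, which is what makes them attractive at scale.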
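For criterion-based selection of the penalty parameter, a common recipe is to fit the model over a grid of penalty values and keep the one minimizing BIC = n log(RSS/n) + df log(n), where df counts the nonzero coefficients. The sketch below applies this to an ordinary least-squares lasso solved by cyclic coordinate descent; it is a generic illustration of the idea of tuning by an information criterion, not the two-level strategy or the compressive speech-enhancement pipeline of the thesis, and all names are illustrative:

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding operator, the proximal map of the L1 penalty."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for (1/2)||y - X b||^2 + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    r = y - X @ beta
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]                      # partial residual
            beta[j] = soft(X[:, j] @ r, lam) / col_sq[j]
            r -= X[:, j] * beta[j]
    return beta

def bic_select(X, y, lams):
    """Pick the penalty on the grid `lams` that minimizes BIC."""
    n = len(y)
    best = None
    for lam in lams:
        b = lasso_cd(X, y, lam)
        rss = ((y - X @ b) ** 2).sum()
        df = int((np.abs(b) > 1e-8).sum())
        bic = n * np.log(rss / n + 1e-12) + df * np.log(n)
        if best is None or bic < best[0]:
            best = (bic, lam, b)
    return best[1], best[2]
```

AIC differs only in replacing log(n) by 2 in the penalty term, so the same loop covers both criteria; a sparsity measure such as the Gini index would instead score `b` directly.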
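The first-level latent structure of the mixture regression model above (the unobserved group label) can be illustrated with a standard EM algorithm for a mixture of linear regressions. The sketch below uses Gaussian errors for simplicity, so it omits the second-level Laplace latent variable and the Asymmetric Laplace error distribution that the thesis actually uses; the function name and defaults are illustrative, and `X` is assumed to contain an explicit intercept column:

```python
import numpy as np

def em_mixreg(X, y, K=2, n_iter=100, seed=0):
    """EM for a K-component mixture of linear regressions, Gaussian errors.

    E-step: responsibilities r[i, k] proportional to
            pi_k * N(y_i; x_i' beta_k, sigma2_k).
    M-step: responsibility-weighted least squares per component.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = rng.normal(size=(K, p))          # random starting coefficients
    sigma2 = np.ones(K)
    pi = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # E-step: log-density of each point under each component
        res = y[:, None] - X @ beta.T                       # (n, K)
        logp = (np.log(pi) - 0.5 * np.log(2 * np.pi * sigma2)
                - 0.5 * res ** 2 / sigma2)
        logp -= logp.max(axis=1, keepdims=True)             # stabilize
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted least squares and moment updates per component
        for k in range(K):
            w = r[:, k]
            Xw = X * w[:, None]
            beta[k] = np.linalg.solve(Xw.T @ X + 1e-8 * np.eye(p), Xw.T @ y)
            sigma2[k] = (w * (y - X @ beta[k]) ** 2).sum() / w.sum()
        pi = r.mean(axis=0)
    return beta, sigma2, pi
```

In the thesis's two-level version, the E-step additionally conditions on a Laplace intermediate variable, which is what yields explicit updating formulas under Asymmetric Laplace errors.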
Description: xxii, 175 pages : color illustrations
PolyU Library Call No.: [THS] LG51 .H577P AMA 2018 Shi
URI: http://hdl.handle.net/10397/79592
Rights: All rights reserved.
Appears in Collections: Thesis


