Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/77968
Title: Generalized conjugate gradient methods for ℓ1 regularized convex quadratic programming with finite convergence
Authors: Lu, Z
Chen, X 
Issue Date: Feb-2018
Source: Mathematics of operations research, Feb. 2018, v. 43, no. 1, p. 275-303
Abstract: The conjugate gradient (CG) method is an efficient iterative method for solving large-scale strongly convex quadratic programming (QP). In this paper, we propose some generalized CG (GCG) methods for solving the ℓ1-regularized (possibly not strongly) convex QP that terminate at an optimal solution in a finite number of iterations. At each iteration, our methods first identify a face of an orthant and then either perform an exact line search along the direction of the negative projected minimum-norm subgradient of the objective function or execute a CG subroutine that conducts a sequence of CG iterations until a CG iterate crosses the boundary of this face or an approximate minimizer of the objective function over this face or a subface is found. We determine which type of step should be taken by comparing the magnitude of some components of the minimum-norm subgradient of the objective function to that of its remaining components. Our analysis of the finite convergence of these methods makes use of an error bound result and some key properties of the aforementioned exact line search and the CG subroutine. We also show that the proposed methods are capable of finding an approximate solution of the problem by allowing some inexactness in the execution of the CG subroutine. The overall arithmetic operation cost of our GCG methods for finding an ϵ-optimal solution depends on ϵ as O(log(1/ϵ)), which is superior to the accelerated proximal gradient method (Beck and Teboulle [Beck A, Teboulle M (2009) A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J. Imaging Sci. 2(1):183-202], Nesterov [Nesterov Yu (2013) Gradient methods for minimizing composite functions. Math. Program. 140(1):125-161]), whose cost depends on ϵ as O(1/√ϵ). In addition, our GCG methods can be extended straightforwardly to solve box-constrained convex QP with finite convergence. Numerical results demonstrate that our methods are very favorable for solving ill-conditioned problems.
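As context for the abstract, the two building blocks it mentions — the minimum-norm subgradient of an ℓ1-regularized quadratic objective and the classical CG iteration for strongly convex QP — can be sketched as below. This is an illustrative sketch only, not the authors' GCG algorithm: the function names and interface are my own, and the face-identification and finite-termination logic of the paper are omitted.

```python
import numpy as np

def min_norm_subgradient(g, x, lam):
    """Minimum-norm element of the subdifferential of
    F(x) = f(x) + lam * ||x||_1, given g = grad f(x).
    Where x_i != 0, the subgradient is unique: g_i + lam*sign(x_i).
    Where x_i == 0, the subdifferential is g_i + [-lam, lam], whose
    minimum-norm element is the soft-thresholded gradient component."""
    return np.where(x != 0,
                    g + lam * np.sign(x),
                    np.sign(g) * np.maximum(np.abs(g) - lam, 0.0))

def conjugate_gradient(Q, b, tol=1e-10, max_iter=None):
    """Classical CG: minimize 0.5*x^T Q x - b^T x (equivalently,
    solve Q x = b) for symmetric positive definite Q."""
    n = len(b)
    max_iter = n if max_iter is None else max_iter
    x = np.zeros(n)
    r = b - Q @ x          # residual = negative gradient
    p = r.copy()           # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        if np.sqrt(rs) < tol:
            break
        Qp = Q @ p
        alpha = rs / (p @ Qp)        # exact line search step length
        x += alpha * p
        r -= alpha * Qp
        rs_new = r @ r
        p = r + (rs_new / rs) * p    # Q-conjugate direction update
        rs = rs_new
    return x
```

In exact arithmetic CG terminates in at most n iterations, which is the finite-convergence property the paper extends to the ℓ1-regularized (possibly not strongly convex) setting.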
Keywords: Conjugate gradient method
Convex quadratic programming
Finite convergence
Sparse optimization
ℓ1-regularization
Publisher: Institute for Operations Research and the Management Sciences
Journal: Mathematics of operations research 
ISSN: 0364-765X
EISSN: 1526-5471
DOI: 10.1287/moor.2017.0865
Rights: Copyright: ©2017 INFORMS
This is the accepted manuscript of the following article: Lu, Z., & Chen, X. (2018). Generalized conjugate gradient methods for ℓ 1 regularized convex quadratic programming with finite convergence. Mathematics of Operations Research, 43(1), 275-303, which has been published in final form at https://doi.org/10.1287/moor.2017.0865
Appears in Collections:Journal/Magazine Article

Files in This Item:
File: Chen_Generalized_Conjugate_Gradient.pdf
Description: Pre-Published version
Size: 1.25 MB
Format: Adobe PDF
Open Access Information
Status open access
File Version Final Accepted Manuscript
Access

Page views: 69 (as of Oct 2, 2022)
Downloads: 2 (as of Oct 2, 2022)
SCOPUS citations: 6 (as of Oct 6, 2022)
Web of Science citations: 6 (as of Oct 6, 2022)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.