Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/90720
Title: A refined convergence analysis of pDCAe with applications to simultaneous sparse recovery and outlier detection
Authors: Liu, T
Pong, TK 
Takeda, A
Issue Date: 15-May-2019
Source: Computational optimization and applications, 15 May 2019, v. 73, no. 1, p. 69-100
Abstract: We consider the problem of minimizing a difference-of-convex (DC) function, which can be written as the sum of a smooth convex function with Lipschitz gradient, a proper closed convex function and a continuous, possibly nonsmooth, concave function. We refine the convergence analysis in Wen et al. (Comput Optim Appl 69:297–324, 2018) for the proximal DC algorithm with extrapolation (pDCAe) and show that the whole sequence generated by the algorithm is convergent without imposing differentiability assumptions on the concave part. Our analysis is based on a new potential function, which we assume to be a Kurdyka–Łojasiewicz (KL) function. We also establish a relationship between our KL assumption and the one used in Wen et al. (2018). Finally, we demonstrate how the pDCAe can be applied to a class of simultaneous sparse recovery and outlier detection problems arising from robust compressed sensing in signal processing and least trimmed squares regression in statistics. Specifically, we show that the objectives of these problems can be written as level-bounded DC functions whose concave parts are typically nonsmooth. Moreover, for a large class of loss functions and regularizers, the KL exponent of the corresponding potential function is shown to be 1/2, which implies that the pDCAe is locally linearly convergent when applied to these problems. Our numerical experiments show that the pDCAe usually outperforms the proximal DC algorithm with nonmonotone linesearch (Liu et al., Math Program, 2018, https://doi.org/10.1007/s10107-018-1327-8, Appendix A) in both CPU time and solution quality for this particular application.
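To illustrate the algorithm class the abstract refers to, the following is a minimal sketch of a pDCAe-style iteration applied to the l1-minus-l2 regularized least-squares problem, min 0.5*||Ax - b||^2 + lam*(||x||_1 - ||x||_2), a simple DC model of sparse recovery. This is an assumed illustrative instance, not the paper's exact experimental setup; the extrapolation schedule (capped FISTA-type weights) and all parameter values here are choices made for the sketch.

```python
import numpy as np

def pdca_e(A, b, lam=0.05, max_iter=500, tol=1e-8):
    """Proximal DC algorithm with extrapolation (sketch) for
    min 0.5*||Ax - b||^2 + lam*(||x||_1 - ||x||_2).
    Smooth part: 0.5*||Ax - b||^2; convex nonsmooth part: lam*||x||_1;
    concave part: -lam*||x||_2 (nonsmooth at the origin)."""
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x_prev = np.zeros(n)
    x = np.zeros(n)
    theta_prev, theta = 1.0, 1.0
    for _ in range(max_iter):
        # FISTA-type extrapolation weight, capped below 1 (a simplification
        # of the restarting schemes used in the literature)
        beta = min((theta_prev - 1.0) / theta, 0.98)
        y = x + beta * (x - x_prev)
        # a subgradient of the concave part's negation, lam*||.||_2, at x
        nx = np.linalg.norm(x)
        xi = lam * x / nx if nx > 0 else np.zeros(n)
        grad = A.T @ (A @ y - b)           # gradient of the smooth part at y
        u = y - (grad - xi) / L
        # proximal step for lam*||.||_1: componentwise soft-thresholding
        x_new = np.sign(u) * np.maximum(np.abs(u) - lam / L, 0.0)
        done = np.linalg.norm(x_new - x) < tol * max(1.0, np.linalg.norm(x))
        x_prev, x = x, x_new
        if done:
            break
        theta_prev, theta = theta, (1.0 + np.sqrt(1.0 + 4.0 * theta ** 2)) / 2.0
    return x
```

On a randomly generated sparse-recovery instance, the returned iterate drives the objective well below its value at the origin; the subgradient step on the concave part is what distinguishes this from a plain proximal gradient method.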
Keywords: Difference-of-convex optimization
Kurdyka–Łojasiewicz property
Outlier detection
Sparse recovery
Publisher: Springer
Journal: Computational optimization and applications 
ISSN: 0926-6003
EISSN: 1573-2894
DOI: 10.1007/s10589-019-00067-z
Rights: © Springer Science+Business Media, LLC, part of Springer Nature 2019
This is a post-peer-review, pre-copyedit version of an article published in Computational Optimization and Applications. The final authenticated version is available online at: http://dx.doi.org/10.1007/s10589-019-00067-z.
Appears in Collections:Journal/Magazine Article

Files in This Item:
File: 2437_SROD_rev2.pdf
Description: Pre-Published version
Size: 727.02 kB
Format: Adobe PDF
Open Access Information
Status open access
File Version Final Accepted Manuscript

Page views: 23 (as of May 22, 2022)
Downloads: 7 (as of May 22, 2022)
SCOPUS citations: 14 (as of May 20, 2022)
Web of Science citations: 14 (as of May 26, 2022)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.