Title: Proximal algorithms with extrapolation for nonconvex nonsmooth optimization problems
Authors: Wen, Bo
Degree: Ph.D.
Issue Date: 2017
Abstract: In this thesis, we consider proximal algorithms with extrapolation for solving nonconvex nonsmooth optimization problems. This class of problems arises in many areas of engineering, computer science, and economics; see [18, 20, 23, 27, 40, 54, 64]. Due to the importance of these problems, many algorithms have been proposed for solving them. This thesis mainly studies two classes of proximal algorithms with extrapolation for solving differently structured nonconvex nonsmooth optimization problems. The details are as follows: 1. We first consider the proximal gradient algorithm with extrapolation for minimizing the sum of a Lipschitz differentiable function and a proper closed convex function. Using the error bound condition studied in [38] for analyzing the convergence properties of the proximal gradient algorithm, we show that there exists a threshold such that if the extrapolation coefficients are chosen below this threshold, then the sequence generated converges R-linearly to a stationary point of the problem. Moreover, the corresponding sequence of objective values is also R-linearly convergent. In addition, the threshold reduces to 1 for convex problems and, as a consequence, we obtain the R-linear convergence of the sequence generated by FISTA with fixed restart. Again for convex problems, we show that the successive changes of the iterates vanish for many choices of sequences of extrapolation coefficients that approach the threshold. In particular, we prove that this conclusion also holds for the sequence generated by FISTA. Finally, we present some numerical experiments to illustrate our results. 2. Difference-of-convex (DC) optimization problems have attracted much attention in recent years, and many numerical algorithms have been proposed for solving them. Among these algorithms, the difference-of-convex algorithm (DCA) is a fundamental and classical one.
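The proximal gradient scheme with extrapolation discussed above can be illustrated with a minimal sketch. The iteration is y_k = x_k + beta_k (x_k - x_{k-1}), x_{k+1} = prox_{g/L}(y_k - grad f(y_k)/L), with the FISTA choice of beta_k. The LASSO instance, the function names, and all problem data below are illustrative assumptions, not from the thesis:

```python
import numpy as np

def prox_grad_extrapolated(grad_f, prox_g, L, x0, betas, tol=1e-9, max_iter=5000):
    """Proximal gradient with extrapolation (sketch):
    y_k = x_k + beta_k (x_k - x_{k-1});  x_{k+1} = prox_{g/L}(y_k - grad_f(y_k)/L)."""
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(max_iter):
        y = x + betas(k) * (x - x_prev)          # extrapolation step
        x_next = prox_g(y - grad_f(y) / L, 1.0 / L)
        if np.linalg.norm(x_next - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_next
        x_prev, x = x, x_next
    return x

def fista_betas():
    """FISTA coefficients beta_k = (theta_k - 1)/theta_{k+1},
    theta_0 = 1, theta_{k+1} = (1 + sqrt(1 + 4 theta_k^2))/2."""
    thetas = [1.0]
    def beta(k):
        while len(thetas) <= k + 1:
            t = thetas[-1]
            thetas.append((1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0)
        return (thetas[k] - 1.0) / thetas[k + 1]
    return beta

# Illustrative LASSO instance:  min 0.5||Ax - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
lam = 0.1
L = np.linalg.norm(A, 2) ** 2                     # Lipschitz constant of grad f
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)  # l1 prox
x_hat = prox_grad_extrapolated(grad_f, prox_g, L, np.zeros(100), fista_betas())
```

Note that this sketch uses the plain FISTA coefficients; the fixed-restart variant analyzed in the thesis would periodically reset the theta sequence to 1.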
We consider a class of DC optimization problems whose objective is level-bounded and is the sum of a smooth convex function with Lipschitz gradient, a proper closed convex function and a continuous concave function. This kind of DC problem can be solved by the aforementioned DCA; however, a direct application of DCA may lead to difficult subproblems. To overcome this difficulty, the proximal DCA has been proposed. While the subproblems involved in the proximal DCA are simpler, the proximal DCA may be slow in practice, because it reduces to the proximal gradient algorithm when the concave part of the objective is void. In this thesis, motivated by the extrapolation techniques for accelerating the proximal gradient algorithm in the convex setting, we consider a proximal difference-of-convex algorithm with extrapolation to possibly accelerate the proximal DCA. We show that any accumulation point of the sequence generated by our algorithm is a stationary point of the DC optimization problem for a fairly general choice of extrapolation parameters: in particular, the parameters can be chosen as in FISTA with fixed restart [26]. Moreover, by assuming the Kurdyka-Lojasiewicz property of an auxiliary function and the differentiability of the concave part, we establish global convergence of the sequence generated by our algorithm and analyze its convergence rate. Our numerical experiments on two difference-of-convex regularized least squares models show that the proximal difference-of-convex algorithm with extrapolation usually outperforms the proximal DCA with nonmonotone line search.
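The proximal DCA with extrapolation described above can be sketched as follows. Each iteration extrapolates, linearizes the concave part at the current iterate via a subgradient, and takes a proximal gradient step on the remaining convex part. The l1-2 regularized least squares model (a common DC regularizer), the function name `pdca_e`, and the problem data are illustrative assumptions; the restart logic of the thesis's parameter choice is omitted for brevity:

```python
import numpy as np

def pdca_e(A, b, lam, max_iter=5000, tol=1e-9):
    """Proximal DCA with extrapolation (sketch) for the l1-2 regularized
    least squares model:  min 0.5||Ax - b||^2 + lam*(||x||_1 - ||x||_2).
    Here f(x) = 0.5||Ax - b||^2 (smooth convex), g(x) = lam*||x||_1
    (proper closed convex), and the concave part is -lam*||x||_2."""
    L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of grad f
    n = A.shape[1]
    x_prev = np.zeros(n)
    x = np.zeros(n)
    theta_prev, theta = 1.0, 1.0
    for _ in range(max_iter):
        beta = (theta_prev - 1.0) / theta         # FISTA-style extrapolation coefficient
        y = x + beta * (x - x_prev)
        nx = np.linalg.norm(x)
        xi = lam * x / nx if nx > 0 else np.zeros(n)  # subgradient of lam*||.||_2 at x_k
        v = y - (A.T @ (A @ y - b) - xi) / L      # gradient step on f minus linearized concave part
        x_next = np.sign(v) * np.maximum(np.abs(v) - lam / L, 0.0)  # prox of (lam/L)*||.||_1
        theta_prev, theta = theta, (1.0 + np.sqrt(1.0 + 4.0 * theta * theta)) / 2.0
        if np.linalg.norm(x_next - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_next
        x_prev, x = x, x_next
    return x

# Illustrative sparse recovery instance
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 80))
x_true = np.zeros(80)
x_true[:4] = 2.0
b = A @ x_true
x_hat = pdca_e(A, b, 0.1)
```

Setting beta = 0 throughout recovers the plain proximal DCA, which makes the role of extrapolation as an acceleration device explicit.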
Subjects: Hong Kong Polytechnic University -- Dissertations
Mathematical optimization
Nonsmooth optimization
Pages: xvi, 91 pages : color illustrations
Appears in Collections: Thesis
