Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/27197
Title: Fast gradient vector flow computation based on augmented Lagrangian method
Authors: Ren, D
Zuo, W
Zhao, X
Lin, Z
Zhang, D 
Keywords: Augmented Lagrangian method
Convex optimization
Fast Fourier transform
Gradient vector flow
Multiresolution method
Issue Date: 2013
Source: Pattern Recognition Letters, 2013, v. 34, no. 2, p. 219-225
Journal: Pattern Recognition Letters 
Abstract: Gradient vector flow (GVF) and generalized GVF (GGVF) have been widely applied in many image processing applications. The high cost of GVF/GGVF computation, however, has restricted their application to large images. Motivated by progress in fast image restoration algorithms, we reformulate GVF/GGVF computation as a convex optimization problem with an equality constraint, and solve it using the inexact augmented Lagrangian method (IALM). With the fast Fourier transform (FFT), we provide two novel, simple, and efficient algorithms for GVF and GGVF computation, respectively. To further improve computational efficiency, a multiresolution approach is adopted to perform the GVF/GGVF computation in a coarse-to-fine manner. Experimental results show that the proposed methods improve the computational speed of the original GVF/GGVF by one to two orders of magnitude, and are more efficient than state-of-the-art methods for GVF/GGVF computation.
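Only the abstract is available in this record, but the two ingredients it names can be sketched. Below is a minimal NumPy illustration, not the authors' code: `gvf_explicit` is the classical Xu-Prince explicit-iteration GVF that serves as the slow baseline, and `screened_poisson_fft` shows the generic FFT building block such methods rely on, i.e. solving a screened-Poisson system (αI − μΔ)u = b in one forward/inverse FFT pair under periodic boundary conditions. All function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def laplacian(a):
    """5-point Laplacian with replicated borders."""
    p = np.pad(a, 1, mode="edge")
    return p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4 * a

def gvf_explicit(f, mu=0.2, dt=0.5, n_iter=200):
    """Classical GVF by explicit gradient descent on the edge map f
    (Xu & Prince) -- the slow baseline that fast methods accelerate."""
    fy, fx = np.gradient(f.astype(float))
    mag2 = fx**2 + fy**2          # squared edge-gradient magnitude
    u, v = fx.copy(), fy.copy()   # initialize field with the gradient
    for _ in range(n_iter):
        u += dt * (mu * laplacian(u) - mag2 * (u - fx))
        v += dt * (mu * laplacian(v) - mag2 * (v - fy))
    return u, v

def screened_poisson_fft(b, mu, alpha):
    """Solve (alpha*I - mu*Laplacian) u = b with periodic boundaries.
    The discrete Laplacian is diagonal in the Fourier basis, so the
    solve costs one forward and one inverse 2-D FFT."""
    m, n = b.shape
    ky = np.fft.fftfreq(m)[:, None]
    kx = np.fft.fftfreq(n)[None, :]
    # eigenvalues of the periodic 5-point Laplacian (all <= 0)
    lap_eig = 2 * (np.cos(2 * np.pi * ky) - 1) + 2 * (np.cos(2 * np.pi * kx) - 1)
    return np.real(np.fft.ifft2(np.fft.fft2(b) / (alpha - mu * lap_eig)))
```

The FFT solve is exact for periodic boundaries, which is what makes FFT-based subproblem solvers attractive inside iterative schemes such as IALM: each inner linear system costs O(N log N) rather than many relaxation sweeps.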
URI: http://hdl.handle.net/10397/27197
ISSN: 0167-8655
DOI: 10.1016/j.patrec.2012.09.017
Appears in Collections:Journal/Magazine Article


Scopus citations: 8 (as of May 22, 2017)
Web of Science citations: 3 (as of May 21, 2017)

Page view(s): 39 (checked on May 21, 2017)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.