Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/37948
Title: Improving the Quickprop algorithm
Authors: Cheung, CC
Ng, SC
Lui, AK
Keywords: Backpropagation
Convergence of numerical methods
Feedforward neural nets
Issue Date: 2012
Source: Proceedings of the International Joint Conference on Neural Networks (IJCNN'2012), Brisbane, Australia, 10-15 June 2012, pp. 1-6
Abstract: The backpropagation (BP) algorithm is the most popular supervised learning algorithm for training feed-forward neural networks. Many BP modifications have been proposed to increase the convergence rate of the standard BP algorithm, and Quickprop is one of the most popular fast learning algorithms. The convergence rate of Quickprop is very fast; however, it is easily trapped in local minima and thus may fail to converge to the global minimum. This paper proposes a new fast learning algorithm modified from Quickprop. By addressing the drawbacks of the Quickprop algorithm, the new algorithm takes a systematic approach to improving both the convergence rate and the global convergence capability of Quickprop. Our performance investigation shows that the proposed algorithm always converges faster than Quickprop. The improvement in global convergence capability is especially large: in one learning application, the global convergence capability increased from 4% to 100%.
URI: http://hdl.handle.net/10397/37948
ISBN: 978-1-4673-1488-6
978-1-4673-1489-3 (E-ISBN)
DOI: 10.1109/IJCNN.2012.6252546
Appears in Collections: Conference Paper
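
Note: the abstract refers to Quickprop's update rule without stating it. For orientation, below is a minimal sketch of the classic Quickprop step (Fahlman, 1988) that the paper builds on — this is the baseline algorithm, not the modification proposed in the paper, and the function name, epsilon guard, and gradient-descent fallback are illustrative assumptions.

```python
import numpy as np

def quickprop_step(grad, prev_grad, prev_step, lr=0.01, mu=1.75):
    """One classic Quickprop weight update (per-weight, vectorized).

    Quickprop fits a parabola to the error curve along each weight from
    the current and previous gradients, then jumps toward the parabola's
    minimum:  step = grad / (prev_grad - grad) * prev_step.
    """
    denom = prev_grad - grad
    # Guard against division by zero when the gradient barely changed
    # (the epsilon value here is an illustrative choice).
    safe = np.where(np.abs(denom) > 1e-12, denom, 1e-12)
    step = grad / safe * prev_step
    # The "maximum growth factor" mu (typically 1.75) caps how much the
    # step may grow relative to the previous step, for stability.
    limit = mu * np.abs(prev_step)
    step = np.clip(step, -limit, limit)
    # With no previous step (e.g., on the first iteration), fall back to
    # a plain gradient-descent move.
    return np.where(prev_step == 0.0, -lr * grad, step)
```

Each weight carries two pieces of state between iterations, its previous gradient and its previous step, which is what lets the parabolic jump be computed without evaluating a second derivative.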
