Title: Enhanced Two-Phase Method in fast learning algorithms
Authors: Cheung, CC
Ng, SC
Lui, AK
Xu, S
Keywords: Backpropagation
Multilayer perceptrons
Problem solving
Recurrent neural nets
Issue Date: 2010
Publisher: IEEE
Source: The 2010 International Joint Conference on Neural Networks (IJCNN), 18-23 July 2010, Barcelona, p. 1-7
Abstract: The backpropagation (BP) learning algorithm is the most widely used supervised learning technique for training multi-layer feed-forward neural networks. Many modifications of BP have been proposed to speed up learning, but their performance is still not promising because of the local minimum problem and the error overshooting problem. This paper proposes an Enhanced Two-Phase method that addresses these two problems to improve the performance of existing fast learning algorithms. The proposed method effectively detects the occurrence of the above problems and applies appropriate fast learning algorithms to resolve them. In our investigation, the proposed method significantly improves the performance of different fast learning algorithms in terms of convergence rate and global convergence capability across different problems. The convergence rate can be increased by up to 100 times compared with the existing fast learning algorithms.
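The abstract does not detail the Enhanced Two-Phase method itself, but the baseline it builds on, standard BP training of a feed-forward network by gradient descent, can be sketched as follows. This is a minimal illustrative example (a hypothetical 2-2-1 sigmoid network on XOR, a task known to exhibit local minima), not the authors' implementation:

```python
# Minimal sketch of plain backpropagation (the baseline the paper improves on).
# Hypothetical setup: a 2-2-1 sigmoid network trained on XOR by per-pattern
# gradient descent. The Enhanced Two-Phase method's phase switching is NOT
# shown here; this only illustrates the BP update that fast learning
# algorithms modify.
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)
# Weights: two hidden units, each with [w_x0, w_x1, bias];
# one output unit with [w_h0, w_h1, bias].
w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_o = [random.uniform(-1, 1) for _ in range(3)]

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR
lr = 0.5  # learning rate

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
    o = sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + w_o[2])
    return h, o

def total_error():
    return sum((t - forward(x)[1]) ** 2 for x, t in data)

err_before = total_error()
for _ in range(2000):
    for x, t in data:
        h, o = forward(x)
        # Output delta (sigmoid derivative is o * (1 - o)).
        d_o = (t - o) * o * (1 - o)
        # Hidden deltas, backpropagated through the output weights.
        d_h = [d_o * w_o[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Gradient-descent weight updates.
        for j in range(2):
            w_o[j] += lr * d_o * h[j]
        w_o[2] += lr * d_o
        for j in range(2):
            w_h[j][0] += lr * d_h[j] * x[0]
            w_h[j][1] += lr * d_h[j] * x[1]
            w_h[j][2] += lr * d_h[j]
err_after = total_error()
```

Depending on the random initialisation, a run like this can stall in a local minimum or overshoot with a large learning rate, which are exactly the two failure modes the Enhanced Two-Phase method is designed to detect and escape.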
ISBN: 978-1-4244-6916-1
ISSN: 1098-7576
DOI: 10.1109/IJCNN.2010.5596519
Appears in Collections:Conference Paper

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.