Title: A new fast learning algorithm with promising global convergence capability for feed-forward neural networks
Authors: Cheung, CC
Ng, SC
Lui, AK
Xu, SS
Keywords: Backpropagation; Convergence; Feedforward neural nets
Issue Date: 2013
Source: The 2013 International Joint Conference on Neural Networks (IJCNN), 4-9 Aug. 2013, Dallas, TX, p. 1-6
Abstract: The backpropagation (BP) learning algorithm is the most widely used supervised learning technique for training multi-layer feed-forward neural networks. Although many modifications of BP have been proposed to speed up learning, they seldom address the local minimum and flat-spot problems. This paper proposes a new algorithm, the Local-minimum and Flat-spot Problem Solver (LFPS), to solve these two problems. It uses a systematic approach to check whether a learning process is trapped in a local minimum or a flat-spot area and then to escape from it, so that the learning process can keep finding an appropriate path toward the global minimum. The performance investigation shows that the proposed algorithm converges in all of the learning problems (applications) tested, whereas other popular fast learning algorithms sometimes exhibit very poor global convergence capability.
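The trap-and-escape idea summarized in the abstract can be illustrated in a few lines of code. The sketch below is not the authors' LFPS (the paper's actual detection criterion and escape rule are not reproduced here); it pairs plain batch BP on the XOR benchmark with a generic trap check (error still high while the gradient norm is near zero, the usual signature of a flat spot or local minimum) and escapes by randomly perturbing the weights. The network size, learning rate, and all thresholds are illustrative assumptions.

    # Minimal sketch only: plain batch BP with a generic trap check and a
    # random-perturbation escape. Not the authors' LFPS algorithm.
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # XOR training set: a classic benchmark in local-minimum studies.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    # 2-2-1 network with random initial weights.
    W1 = rng.normal(0, 1, (2, 2)); b1 = np.zeros(2)
    W2 = rng.normal(0, 1, (2, 1)); b2 = np.zeros(1)

    lr = 0.5          # learning rate (assumed, not from the paper)
    grad_eps = 1e-3   # "gradient is tiny" threshold (assumed)
    err_tol = 0.01    # target mean squared error (assumed)

    for epoch in range(20000):
        # Forward pass.
        H = sigmoid(X @ W1 + b1)
        Y = sigmoid(H @ W2 + b2)
        E = Y - T
        mse = np.mean(E ** 2)
        if mse < err_tol:
            print(f"converged at epoch {epoch}, mse={mse:.4f}")
            break

        # Backward pass (standard BP deltas; the sigmoid derivative
        # y(1 - y) is the term that vanishes at flat spots when the
        # outputs saturate near 0 or 1).
        dY = E * Y * (1 - Y)
        dH = (dY @ W2.T) * H * (1 - H)
        gW2 = H.T @ dY; gb2 = dY.sum(0)
        gW1 = X.T @ dH; gb1 = dH.sum(0)

        gnorm = np.sqrt(sum(np.sum(g ** 2) for g in (gW1, gb1, gW2, gb2)))
        if gnorm < grad_eps and mse > err_tol:
            # Trapped: error still high but almost no gradient. Escape by
            # perturbing the weights (a generic heuristic standing in for
            # whatever escape rule LFPS actually applies).
            W1 += rng.normal(0, 0.5, W1.shape)
            W2 += rng.normal(0, 0.5, W2.shape)
            continue

        # Plain gradient-descent update.
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2

With ordinary BP alone, a run that saturates the hidden units can stall indefinitely at high error; the point of the trap check is that such a run is detected and restarted from a nearby random point rather than left to wander.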
URI: http://hdl.handle.net/10397/37747
ISBN: 978-1-4673-6128-6
ISSN: 2161-4393
DOI: 10.1109/IJCNN.2013.6707006
Appears in Collections: Conference Paper
