Title: Addressing the local minima problem by output monitoring and modification algorithms
Authors: Ng, SC
Cheung, CC 
Lui, AKF
Tse, HT
Keywords: Back-propagation
Local minimum problem
Issue Date: 2012
Publisher: Springer
Source: Lecture notes in computer science (including subseries Lecture notes in artificial intelligence and lecture notes in bioinformatics), 2012, v. 7367 LNCS, no. PART 1, p. 206-216
Journal: Lecture notes in computer science (including subseries Lecture notes in artificial intelligence and lecture notes in bioinformatics) 
Abstract: This paper proposes a new approach called output monitoring and modification (OMM) to address the local minimum problem for existing gradient-descent algorithms (such as BP, Rprop, and Quickprop) in training feed-forward neural networks. OMM monitors the learning process; when the process becomes trapped in a local minimum, OMM changes some incorrect output values so that training can escape from it. This modification can be repeated with different parameter settings until the learning process converges to the global optimum. Simulation experiments show that a gradient-descent learning algorithm with OMM has much better global convergence capability than the same algorithm without OMM, while their convergence rates remain similar. In one benchmark application, the global convergence capability increased from 1% to 100%.
Description: 9th International Symposium on Neural Networks, ISNN 2012, Shenyang, 11-14 July 2012
ISBN: 9783642313455
ISSN: 0302-9743
EISSN: 1611-3349
DOI: 10.1007/978-3-642-31346-2_24
Appears in Collections: Conference Paper
