Title: A systematic algorithm to escape from local minima in training feed-forward neural networks
Authors: Cheung, CC
Xu, SS
Ng, SC
Issue Date: 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Source: Proceedings of the International Joint Conference on Neural Networks, 2016, v. 2016-October, 7727226, p. 396-402
Abstract: The learning process is easily trapped in a local minimum when training multi-layer feed-forward neural networks. An algorithm called Wrong Output Modification (WOM) was proposed to help the learning process escape from local minima, but it still cannot completely solve the local minimum problem. Moreover, no performance analysis has shown that learning with this algorithm has a higher probability of converging to a global solution. Additionally, the generalization performance of this algorithm was not investigated when training with early stopping. To address these limitations of WOM, we propose a new algorithm that ensures the learning process can escape from local minima, and we analyze its performance. We also evaluate the generalization performance of the new algorithm when training with early stopping.
Description: 2016 International Joint Conference on Neural Networks, IJCNN 2016, Vancouver, Canada, 24-29 July 2016
URI: http://hdl.handle.net/10397/66230
ISBN: 9781509006199
DOI: 10.1109/IJCNN.2016.7727226
Appears in Collections: Conference Paper
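
Neither WOM nor the paper's new escape algorithm is detailed in this record, so the following is only a generic, minimal sketch of the underlying idea the abstract refers to: train a small feed-forward network, watch for a prolonged stall in the training error (a possible local minimum), and apply a random weight perturbation to resume the search. The XOR task, network size, learning rate, stall window, and perturbation scale are all assumed for illustration and are not taken from the paper.

    # Hypothetical illustration only: NOT the WOM algorithm or the paper's
    # proposed method. It shows the general idea of detecting a stalled
    # training run and perturbing the weights to try to escape.
    import numpy as np

    rng = np.random.default_rng(0)

    # XOR data: a classic task where small networks often stall.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # 2-2-1 feed-forward network, randomly initialised.
    W1 = rng.normal(scale=0.5, size=(2, 2)); b1 = np.zeros(2)
    W2 = rng.normal(scale=0.5, size=(2, 1)); b2 = np.zeros(1)

    lr = 0.5            # learning rate (assumed value)
    stall_window = 500  # epochs without improvement before declaring a stall
    tol = 1e-5          # minimum error decrease counted as progress
    best_err, stalled_for = np.inf, 0

    for epoch in range(20000):
        # Forward pass.
        H = sigmoid(X @ W1 + b1)
        Y = sigmoid(H @ W2 + b2)
        err = float(np.mean((Y - T) ** 2))

        # Backward pass (standard back-propagation for squared error).
        dY = (Y - T) * Y * (1 - Y)
        dH = (dY @ W2.T) * H * (1 - H)
        W2 -= lr * H.T @ dY;  b2 -= lr * dY.sum(axis=0)
        W1 -= lr * X.T @ dH;  b1 -= lr * dH.sum(axis=0)

        # Stall detection: error has not improved for `stall_window` epochs.
        if err < best_err - tol:
            best_err, stalled_for = err, 0
        else:
            stalled_for += 1

        if stalled_for >= stall_window:
            # Escape attempt: add small random noise to every weight.
            for W in (W1, W2):
                W += rng.normal(scale=0.3, size=W.shape)
            stalled_for = 0
            print(f"epoch {epoch}: stalled at MSE {err:.4f}, perturbing weights")

    print(f"final MSE: {err:.4f}")

The perturbation step here is a plain random restart of sorts; the paper's contribution, per the abstract, is a systematic escape procedure with an accompanying performance analysis and an evaluation under early stopping, which this sketch does not reproduce.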

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.