Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/26707
Title: Parameter by parameter algorithm for multilayer perceptrons
Authors: Li, Y
Zhang, D 
Wang, K
Keywords: BP algorithm with momentum
Layer by layer algorithm
Multilayer perceptrons
Parameter by parameter algorithm
Training algorithm
Issue Date: 2006
Source: Neural Processing Letters, 2006, v. 23, no. 2, p. 229-242
Journal: Neural Processing Letters 
Abstract: This paper presents a parameter by parameter (PBP) algorithm for speeding up the training of multilayer perceptrons (MLPs). The new algorithm uses an approach similar to that of the layer by layer (LBL) algorithm, taking into account the input errors of the output layer and the hidden layer. The proposed PBP algorithm, however, does not need to calculate the gradient of the error function. In each iteration step, the weights or thresholds can be optimized directly, one by one, with the other variables fixed. Four classes of solution equations for the network parameters are deduced. The effectiveness of the PBP algorithm is demonstrated using two benchmarks. In comparisons with the BP algorithm with momentum (BPM) and the conventional LBL algorithms, PBP achieves faster convergence and better simulation performance.
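Note: the abstract's central idea, optimizing one weight or threshold at a time while all other variables are held fixed, can be illustrated with a small sketch. The Python code below is not the paper's PBP algorithm (which derives closed-form solution equations for four classes of parameters); it is a minimal coordinate-wise illustration under assumed conditions: a toy 2-2-1 sigmoid MLP on the XOR problem and a crude 1-D grid search for each parameter.

# Hedged sketch: coordinate-wise ("parameter by parameter") training of a tiny MLP.
# This only illustrates the shared idea of updating one parameter at a time with
# all others fixed; it does not reproduce the paper's solution equations.
import numpy as np

rng = np.random.default_rng(0)

# XOR toy problem (assumed here; the paper's benchmarks may differ).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Flat parameter vector: hidden weights (2x2), hidden thresholds (2),
# output weights (2x1), output threshold (1).
theta = rng.normal(scale=0.5, size=2 * 2 + 2 + 2 * 1 + 1)

def forward(theta, X):
    W1 = theta[0:4].reshape(2, 2)
    b1 = theta[4:6]
    W2 = theta[6:8].reshape(2, 1)
    b2 = theta[8:9]
    H = sigmoid(X @ W1 + b1)        # hidden-layer outputs
    return sigmoid(H @ W2 + b2)     # network outputs

def sse(theta):
    # Sum-of-squared-errors objective over the whole training set.
    Y = forward(theta, X)
    return float(np.sum((Y - T) ** 2))

def optimize_one_parameter(theta, i, span=2.0, steps=41):
    # Crude 1-D grid search over parameter i with every other parameter fixed.
    best_v, best_e = theta[i], sse(theta)
    for v in np.linspace(theta[i] - span, theta[i] + span, steps):
        trial = theta.copy()
        trial[i] = v
        e = sse(trial)
        if e < best_e:
            best_v, best_e = v, e
    theta[i] = best_v
    return best_e

for sweep in range(200):              # full passes over all parameters
    for i in range(theta.size):       # optimize the parameters one by one
        err = optimize_one_parameter(theta, i)
    if err < 1e-3:
        break

print("final SSE:", sse(theta))
print("outputs:", forward(theta, X).ravel().round(3))

In this sketch each sweep visits every weight and threshold once; the paper instead solves for each parameter directly from its derived equations, which is what avoids gradient computation.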
URI: http://hdl.handle.net/10397/26707
ISSN: 1370-4621
DOI: 10.1007/s11063-006-0003-9
Appears in Collections: Journal/Magazine Article

Scopus citations: 2 (as of Sep 11, 2017)
Web of Science citations: 1 (as of Sep 22, 2017)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.