Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/43356
Title: Feedforward kernel neural networks, generalized least learning machine, and its deep learning with application to image classification
Authors: Wang, S
Jiang, Y
Chung, FL 
Qian, P
Keywords: Deep architecture and learning
Feedforward kernel neural networks
Hidden-layer-tuning-free learning
Kernel principal component analysis (KPCA)
Least learning machine
Issue Date: 2015
Publisher: Elsevier
Source: Applied Soft Computing, 2015, v. 37, p. 125-141
Journal: Applied Soft Computing
Abstract: In this paper, the architecture of feedforward kernel neural networks (FKNN) is proposed, which encompasses a considerably large family of existing feedforward neural networks and hence can meet most practical requirements. Departing from the common understanding of learning, it is revealed that when the number of hidden nodes in every hidden layer and the type of kernel-based activation functions are fixed in advance, a special kernel principal component analysis (KPCA) is always implicitly executed; consequently, none of the hidden layers of such networks needs to be tuned, and their parameters can be assigned randomly and may even be independent of the training data. The least learning machine (LLM) is therefore extended into a generalized version that admits a much wider range of error functions rather than the mean squared error (MSE) function only. As an additional merit, it is also shown that the rigorous Mercer kernel condition is not required in FKNN networks. When the proposed FKNN architecture is constructed layer by layer, i.e., the number of hidden nodes in every hidden layer is determined from the principal components extracted by an explicit KPCA, a deep FKNN architecture can be developed whose deep learning framework (DLF) has a strong theoretical guarantee. Experimental results on image classification show that the proposed deep FKNN architecture and its DLF-based learning indeed enhance classification performance.
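Note: the abstract describes two mechanisms, namely hidden layers whose widths are set by an explicit KPCA and whose parameters are never tuned, and a closed-form readout in the spirit of the generalized least learning machine. The sketch below is a minimal, hypothetical illustration of that pipeline, not the authors' implementation; the function names (deep_kpca_features, least_squares_readout), the RBF kernel, the fixed layer widths, and the plain MSE/least-squares readout are all illustrative assumptions.

# Hypothetical sketch (not the paper's code): layer-by-layer KPCA hidden
# layers with a closed-form least-squares readout, standing in for the
# generalized least learning machine described in the abstract.
import numpy as np
from sklearn.decomposition import KernelPCA

def deep_kpca_features(X, layer_widths=(64, 32, 16), gamma=0.1):
    # Build hidden representations layer by layer with an explicit KPCA.
    # In the paper each layer's width comes from the number of retained
    # principal components; here the widths are simply passed in.
    H = X
    for k in layer_widths:
        kpca = KernelPCA(n_components=k, kernel="rbf", gamma=gamma)
        H = kpca.fit_transform(H)  # hidden layer = kernel PCA projection
    return H

def least_squares_readout(H, Y):
    # Closed-form output weights under the MSE criterion; the generalized
    # LLM admits other error functions, which this sketch does not cover.
    Hb = np.hstack([H, np.ones((H.shape[0], 1))])  # append a bias column
    W, *_ = np.linalg.lstsq(Hb, Y, rcond=None)
    return W

# Toy usage on random data with one-hot labels.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 32))
Y = np.eye(3)[rng.integers(0, 3, 200)]
H = deep_kpca_features(X)
W = least_squares_readout(H, Y)
pred = np.argmax(np.hstack([H, np.ones((H.shape[0], 1))]) @ W, axis=1)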
URI: http://hdl.handle.net/10397/43356
ISSN: 1568-4946
DOI: 10.1016/j.asoc.2015.07.040
Appears in Collections: Journal/Magazine Article

SCOPUS™ citations: 6 (as of Mar 24, 2017)

Web of Science™ citations: 2 (as of Mar 19, 2017)

Page view(s): 8 (checked on Mar 19, 2017)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.