Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/28341
Title: A class of competitive learning models which avoids neuron underutilization problem
Authors: Choy, CST
Siu, WC 
Keywords: Multiplicatively biased competitive learning
Neuron underutilization problem
Vector quantization
Issue Date: 1998
Publisher: Institute of Electrical and Electronics Engineers
Source: IEEE transactions on neural networks, 1998, v. 9, no. 6, p. 1258-1269
Journal: IEEE transactions on neural networks 
Abstract: In this paper, we study a qualitative property of a class of competitive learning (CL) models, which is called the multiplicatively biased competitive learning (MBCL) model, namely that it avoids neuron underutilization with probability one as time goes to infinity. In the MBCL, the competition among neurons is biased by a multiplicative term, while only one weight vector is updated per learning step. This is of practical interest since its instances have computational complexities among the lowest in existing CL models. In addition, in applications like classification, vector quantizer design and probability density function estimation, a necessary condition for optimal performance is to avoid neuron underutilization. Hence, it is possible to define instances of MBCL to achieve optimal performance in these applications.
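The abstract describes the MBCL scheme at a high level: the distance competition among neurons is biased by a multiplicative term, and only the winning weight vector is updated per learning step. A minimal sketch of one well-known instance of this idea (frequency-sensitive competitive learning, where the bias is the neuron's win count) is shown below; all names and parameters are illustrative and not taken from the paper itself:

```python
import random

def mbcl_train(data, n_neurons=4, lr=0.05, epochs=50, seed=0):
    """Sketch of multiplicatively biased competitive learning (MBCL).

    Illustrative only: the paper defines a general class of bias terms;
    here the bias is the neuron's win count, as in frequency-sensitive
    competitive learning. Only the winner is updated per step.
    """
    rng = random.Random(seed)
    dim = len(data[0])
    # initialize weight vectors from random training samples
    weights = [list(rng.choice(data)) for _ in range(n_neurons)]
    wins = [1] * n_neurons  # multiplicative bias terms (win counts)

    for _ in range(epochs):
        for x in data:
            # biased competition: squared distance scaled by win count,
            # so frequently winning neurons are handicapped
            def score(i):
                d2 = sum((xj - wj) ** 2 for xj, wj in zip(x, weights[i]))
                return wins[i] * d2

            winner = min(range(n_neurons), key=score)
            # update only the winning weight vector
            for j in range(dim):
                weights[winner][j] += lr * (x[j] - weights[winner][j])
            wins[winner] += 1
    return weights, wins
```

Because a neuron that never wins keeps a small bias while its rivals' biases grow, every neuron eventually wins some inputs, which is the underutilization-avoidance property the paper establishes for this class of models.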
URI: http://hdl.handle.net/10397/28341
ISSN: 1045-9227
DOI: 10.1109/72.728374
Appears in Collections:Journal/Magazine Article

SCOPUS™ citations: 6 (as of Sep 22, 2017)
Web of Science™ citations: 4 (as of Sep 22, 2017)
Page view(s): 36 (checked on Sep 18, 2017)
