Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/11291
Title: Using a neuro-fuzzy technique to improve the clustering based on similarity
Authors: Yeung, DS
Wang, XZ
Keywords: Fuzzy neural nets
Pattern clustering
Issue Date: 2000
Publisher: IEEE
Source: 2000 IEEE International Conference on Systems, Man, and Cybernetics, October 2000, Nashville, TN, v. 5, p. 3693-3698
Abstract: Although there have been many approaches to fuzzy clustering, clustering based on a similarity matrix remains a popular technique; it works by transforming the similarity matrix into its transitive closure. The clustering performance depends strongly on the similarity matrix, whose elements are, in many situations, determined by a distance metric. For a given case library in which diverse similarity measures can be defined, different similarity matrices lead to different clustering results. This paper introduces the concept of feature weight and incorporates it into the computation of similarity between two cases, so that the similarity matrix depends on these feature weights. The purpose of this paper is to improve the clustering performance by adjusting these weights by means of a neuro-fuzzy technique. To learn the feature weights, a neural network is designed to minimize an objective function; the gradient-descent technique is used to train this network toward a local minimum. Several indexes for measuring the quality of a clustering result are defined in this paper to compare the clustering performance before and after the weight learning.
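
The abstract outlines three computational steps: a feature-weighted similarity between cases, the max-min transitive closure of the resulting similarity matrix (whose lambda-cuts yield the clusters), and gradient descent on the feature weights. The Python/NumPy sketch below illustrates these steps under stated assumptions; the 1/(1 + distance) similarity, the finite-difference gradient, and all function names are illustrative choices, not the paper's own definitions.

import numpy as np

def weighted_similarity(X, w):
    """Pairwise similarity from a weighted Euclidean distance.
    X: (n_cases, n_features) case library; w: (n_features,) weights.
    Returns an (n, n) matrix with values in (0, 1], 1 on the diagonal.
    The 1/(1 + d) form is one common choice, assumed here."""
    diff = X[:, None, :] - X[None, :, :]            # (n, n, d) case differences
    dist = np.sqrt((w * diff ** 2).sum(axis=-1))    # weighted distances
    return 1.0 / (1.0 + dist)

def transitive_closure(R, max_iter=100):
    """Max-min transitive closure of a fuzzy similarity matrix R:
    compose R with itself, (R∘R)[i, j] = max_k min(R[i, k], R[k, j]),
    until a fixed point is reached."""
    for _ in range(max_iter):
        R2 = np.max(np.minimum(R[:, None, :], R.T[None, :, :]), axis=2)
        if np.allclose(R2, R):
            return R2
        R = R2
    return R

def lambda_cut_clusters(R_star, lam):
    """Cluster by the lambda-cut of the closure: cases i and j share a
    cluster iff R*[i, j] >= lam. The closure is a fuzzy equivalence
    relation, so each cut partitions the case library."""
    n = R_star.shape[0]
    labels = -np.ones(n, dtype=int)
    cluster = 0
    for i in range(n):
        if labels[i] == -1:
            labels[R_star[i] >= lam] = cluster
            cluster += 1
    return labels

def learn_weights(X, w0, objective, lr=0.01, steps=200, eps=1e-5):
    """Gradient descent on the feature weights for a user-supplied
    objective J(w). The paper derives an analytic gradient for its own
    objective; a finite-difference gradient is used here purely for
    illustration."""
    w = w0.copy()
    for _ in range(steps):
        grad = np.array([
            (objective(w + eps * e) - objective(w - eps * e)) / (2 * eps)
            for e in np.eye(len(w))
        ])
        w = np.clip(w - lr * grad, 0.0, None)  # keep weights non-negative
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(8, 3))    # toy case library, hypothetical data
    w = np.ones(3)                 # uniform initial feature weights
    R_star = transitive_closure(weighted_similarity(X, w))
    print(lambda_cut_clusters(R_star, lam=0.5))

Raising the lambda threshold splits the data into more, tighter clusters; lowering it merges them, so a single closure computation supports a whole hierarchy of clusterings.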
URI: http://hdl.handle.net/10397/11291
ISBN: 0-7803-6583-6
ISSN: 1062-922X
DOI: 10.1109/ICSMC.2000.886584
Appears in Collections: Conference Paper
