Title: Feature selection for Chinese character recognition based on inductive learning
Authors: Qian, G
Yeung, D
Tsang, ECC
Shu, W
Keywords: Chinese character recognition
Cost of feature extraction
Extension matrix
Feature selection
Inductive learning
Information entropy
Issue Date: 2004
Publisher: World Scientific
Source: International Journal of Pattern Recognition and Artificial Intelligence, 2004, v. 18, no. 8, p. 1453-1471
Journal: International Journal of Pattern Recognition and Artificial Intelligence
Abstract: Feature selection is a difficult but important issue in the field of machine learning and pattern recognition. In this paper, features for Chinese character recognition are selected using inductive learning algorithms. The existing inductive learning method based on the extension matrix requires precise consistency between the positive and negative example sets, which is very difficult to maintain in most practical cases. The traditional decision tree algorithm ID3 considers only discriminating power when selecting features; in practice, however, the associated cost of feature extraction may be a significant concern. To address these problems, we propose a modified extension matrix approach that selects a feature subset from a noisy training example set. A decision tree algorithm based on both information gain and cost evaluation is also proposed to take extraction cost into account. Comparative experiments show that the proposed algorithms outperform the existing inductive learning algorithms to a certain extent.
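To illustrate the kind of criterion the abstract describes, the sketch below computes ID3-style information gain and then weights it by feature-extraction cost. This is a minimal illustration under assumptions, not the paper's exact formulation: the dataset, the feature costs, and the choice to divide gain by cost are all hypothetical.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a sequence of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def info_gain(examples, labels, feature):
    """ID3 information gain from splitting on one feature index."""
    base = entropy(labels)
    total = len(examples)
    remainder = 0.0
    for v in set(x[feature] for x in examples):
        subset = [lab for x, lab in zip(examples, labels) if x[feature] == v]
        remainder += len(subset) / total * entropy(subset)
    return base - remainder

def cost_weighted_gain(examples, labels, feature, costs):
    """Hypothetical cost-aware criterion: gain divided by extraction cost,
    so a cheap feature is preferred over an equally informative expensive one."""
    return info_gain(examples, labels, feature) / costs[feature]

# Toy data: feature 0 perfectly predicts the label, feature 1 is uninformative.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 1, 1]
print(info_gain(X, y, 0))                          # gain of the informative feature
print(cost_weighted_gain(X, y, 0, [2.0, 0.5]))     # same gain, discounted by cost
```

With pure information gain, feature 0 scores 1.0 and feature 1 scores 0.0; the cost weighting then scales each score by how expensive the feature is to extract, which is the trade-off the proposed decision tree criterion is meant to capture.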
ISSN: 0218-0014
EISSN: 1793-6381
DOI: 10.1142/S0218001404003836
Appears in Collections: Journal/Magazine Article

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.