Title: Accurate indexing and classification for fabric weave patterns using entropy-based approach
Authors: Zheng, D
Baciu, G 
Hu, J 
Keywords: Content-based; Weave pattern
Issue Date: 2009
Publisher: IEEE
Source: 8th IEEE International Conference on Cognitive Informatics, 2009 : ICCI '09, 15-17 June 2009, Kowloon, Hong Kong, p. 357-364
Abstract: In current textile design, indexing and searching fabric weave patterns require extensive manual operations. Manual weave pattern classification is time-consuming and does not yield accurate, precise results, and no prior research has specifically addressed indexing and searching of weave patterns. In this paper we propose a method to index and search weave patterns, combining pattern clusters, transitions, entropy, and fast Fourier transform (FFT) directionality in a hybrid approach for the cognitive comparison and classification of weave patterns. Three weave patterns are common in textile design: plain weave, twill weave, and satin weave. First, we classify weave patterns into these three categories according to the weave pattern definition and the weave point distribution characteristics (weave pattern smoothness and connectivity). Second, we use the FFT to describe the weave point distribution. Finally, we use an entropy method to map the weave point distribution to a significant index value. Our approach avoids the problem of pattern duplication in the database. In our experiments, we select and test commonly used weave patterns with the proposed approach, and the results show that it achieves substantially accurate classification.
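The pipeline the abstract outlines (a binary weave matrix, an FFT-based description of the weave point distribution, and an entropy-based index value) can be sketched roughly as below. The function names, the 2/2 twill example, and the particular entropy and FFT formulations are illustrative assumptions for this sketch, not the paper's exact method:

```python
import numpy as np

def weave_entropy(weave):
    """Shannon entropy of the warp-up / warp-down point distribution --
    a simplified stand-in for the paper's entropy index value."""
    weave = np.asarray(weave, dtype=float)
    p_up = weave.mean()                 # fraction of warp-up (1) points
    probs = np.array([p_up, 1.0 - p_up])
    probs = probs[probs > 0]            # avoid log(0)
    return float(-(probs * np.log2(probs)).sum())

def fft_direction_strength(weave):
    """Peak magnitude of the centered 2D FFT spectrum with the DC
    component removed -- a crude proxy for FFT directionality."""
    spec = np.fft.fftshift(np.abs(np.fft.fft2(np.asarray(weave, float))))
    h, w = spec.shape
    spec[h // 2, w // 2] = 0.0          # suppress the DC term
    return float(spec.max())

# 2/2 twill repeat: each row shifts the up/down run by one position,
# producing the diagonal structure the FFT picks up as directionality.
twill = np.array([[1, 1, 0, 0],
                  [0, 1, 1, 0],
                  [0, 0, 1, 1],
                  [1, 0, 0, 1]])

print(weave_entropy(twill))             # balanced twill: entropy is 1.0
print(fft_direction_strength(twill))
```

A balanced twill has equal numbers of warp-up and warp-down points, so its two-outcome entropy is exactly 1 bit; patterns with skewed weave point distributions, such as satin, score lower on this measure.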
ISBN: 978-1-4244-4642-1
DOI: 10.1109/COGINF.2009.5250712
Appears in Collections:Conference Paper


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.