Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/21556
Title: A hybrid model using genetic algorithm and neural network for classifying garment defects
Authors: Yuen, CWM
Wong, WK 
Qian, SQ
Chan, LK
Fung, EHK
Keywords: Garment inspection
Genetic algorithms
Image segmentation
Morphological filter
Neural network
Issue Date: 2009
Publisher: Pergamon Press
Source: Expert systems with applications, 2009, v. 36, no. 2 part 1, p. 2037-2047
Journal: Expert systems with applications 
Abstract: The inspection of semi-finished and finished garments is very important for quality control in the clothing industry. Unfortunately, garment inspection still relies on manual operation, and studies on automatic garment inspection are limited. In this paper, a novel hybrid model integrating a genetic algorithm (GA) and a neural network is proposed to classify the types of garment defects. To process the garment sample images, a morphological filter is presented in which a GA-based method finds an optimal structuring element. A segmented window technique is developed to segment images of the monochrome single-loop ribwork of knitted garments into several classes. Four characteristic variables are collected and input into a back-propagation (BP) neural network to classify the sample images. The experimental results show that the proposed method achieves a very high recognition accuracy and thus provides decision support in defect classification.
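The morphological-filtering step the abstract describes can be illustrated with a minimal sketch. This is not the authors' code: it implements a plain binary opening with a candidate structuring element (SE), the operation whose SE the paper's GA would optimize. The 3x3 cross SE, the toy image, and the fitness remark are all illustrative assumptions.

```python
# Hedged sketch (not the paper's implementation): binary morphological opening
# with a candidate structuring element (SE). In the paper's scheme a GA evolves
# the SE to suppress fabric texture while preserving defect pixels; here the SE
# is fixed to a 3x3 cross purely for illustration.

def erode(img, se):
    """Binary erosion: a pixel survives only if the SE fits entirely in the foreground."""
    h, w, k = len(img), len(img[0]), len(se) // 2
    out = [[0] * w for _ in range(h)]
    for y in range(k, h - k):
        for x in range(k, w - k):
            out[y][x] = int(all(
                img[y + dy][x + dx] == 1
                for dy in range(-k, k + 1) for dx in range(-k, k + 1)
                if se[k + dy][k + dx]
            ))
    return out

def dilate(img, se):
    """Binary dilation: a pixel is set if any SE-covered neighbour is foreground."""
    h, w, k = len(img), len(img[0]), len(se) // 2
    out = [[0] * w for _ in range(h)]
    for y in range(k, h - k):
        for x in range(k, w - k):
            out[y][x] = int(any(
                img[y + dy][x + dx] == 1
                for dy in range(-k, k + 1) for dx in range(-k, k + 1)
                if se[k + dy][k + dx]
            ))
    return out

def opening(img, se):
    """Opening (erosion then dilation) removes speckle smaller than the SE."""
    return dilate(erode(img, se), se)

# 3x3 cross SE; a GA would instead evolve this 0/1 mask against a fitness
# such as agreement with a hand-labelled defect map (assumption, for illustration).
se = [[0, 1, 0],
      [1, 1, 1],
      [0, 1, 0]]

# 7x7 toy image: a 3x3 "defect" blob plus one isolated noise pixel at (5, 1).
img = [[0] * 7 for _ in range(7)]
for y in range(2, 5):
    for x in range(2, 5):
        img[y][x] = 1
img[5][1] = 1

opened = opening(img, se)
print(opened[5][1], opened[3][3])  # noise pixel removed, blob core kept
```

Opening with an SE smaller than the defect removes isolated texture noise while retaining the defect region, which is why the choice of SE shape is worth optimizing.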
URI: http://hdl.handle.net/10397/21556
ISSN: 0957-4174
EISSN: 1873-6793
DOI: 10.1016/j.eswa.2007.12.009
Appears in Collections: Journal/Magazine Article


Citations: Scopus 42 (as of Jun 11, 2018); Web of Science 35 (as of Jun 23, 2018)
Page views: 42 (as of Jun 25, 2018)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.