|Title:||Computational and digital analysis of yarn fabrication, structure and surface appearance|
|Authors:||Li, Shengyan|
|Degree:||Ph.D.|
|Issue Date:||2016|
|Abstract:||Spun yarns are used worldwide for making a broad range of textiles and apparel. With an output of 32.9 million tons in 2009, spun yarns satisfy more than half the needs of the global textiles and clothing industry. The present study is devoted to developing computational and digital methods and systems for the accurate and efficient analysis of three predominant aspects of spun yarns: yarn formation, internal structure and surface appearance. In this study, a new generalized theoretical model has been developed using the Finite Element Method (FEM) for the theoretical and numerical analysis of yarn formation in the spinning triangle. In the proposed model, the initial conditions are formulated together with an algorithm for fiber buckling. Compared with earlier models, several previously ignored parameters, such as the inclination angle of the spinning tension, the frictional contact of fibers with the bottom roller and fiber torsional strains, are considered. Numerical simulations were then carried out to explore the quantitative relationships between the mechanical performance of the ring spinning triangle and various spinning parameters. Comparisons showed that the fiber tensions predicted by the proposed model are in good agreement with earlier models, while the calculated yarn torque is generally closer to experimental measurements. Moreover, a dynamic model of the spinning triangle has been further developed in this study, based on the above static FEM model, which accounts for fiber inertia and damping for a more complete and accurate description. With this model, the dynamic behavior of fibers in the spinning triangle, including the natural frequencies, mode shapes, resonant response, harmonic response and response under a time-varying tension, has been studied for the first time.
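To picture the kind of modal analysis the dynamic model performs, consider a toy sketch: a single fiber idealized as a fixed-free chain of spring-mass elements, whose natural frequencies come from the generalized eigenproblem K x = ω²M x. This is purely illustrative, not the thesis's FEM; the element count, stiffness `k` and lumped mass `m` below are arbitrary assumptions.

```python
import numpy as np

def fiber_chain_frequencies(n=10, k=50.0, m=1e-3):
    """Natural frequencies (Hz) of a fixed-free chain of n
    spring-mass elements -- a toy stand-in for one fiber in a
    spinning-triangle dynamic model."""
    # Stiffness matrix of the chain (one end fixed, n free nodes)
    K = np.zeros((n, n))
    for i in range(n):
        K[i, i] = 2 * k if i < n - 1 else k
        if i + 1 < n:
            K[i, i + 1] = K[i + 1, i] = -k
    # With lumped mass M = m*I, K x = w^2 M x reduces to eig(K/m)
    w2 = np.linalg.eigvalsh(K / m)      # eigenvalues in ascending order
    return np.sqrt(w2) / (2 * np.pi)    # convert rad/s to Hz

freqs = fiber_chain_frequencies()
```

In a real model, the lowest of these frequencies would be compared against the spindle's excitation frequencies to anticipate resonant response.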
The results showed that the dynamic parameters have a great influence on the amplitude and attenuation of the response of the constituent fibers in the spinning triangle. Tracer fiber measurement has been widely used to analyze yarn internal structure by tracing the fiber path in a transparent liquid. Currently, the image mosaic and segmentation of tracer fiber images largely involve manual operation, which is extremely time-consuming. Therefore, in this study, an intelligent computer method has been developed for the automatic mosaic and segmentation of tracer fiber images. In this method, an extended QRS complex detection method is developed for tracer fiber detection. A decision function, integrating several matching functions extracted from the tracer fiber and gradient images, is proposed for image mosaic. An objective method is then proposed to evaluate the mosaic and segmentation quality of the proposed method. Fifty series of tracer fiber images (872 images in total) covering five yarn counts (10Ne~60Ne) were prepared and used for a full evaluation of the proposed image processing method against the conventional manual method. The evaluation showed that the proposed method performs well in the mosaic and segmentation of tracer fiber images and is far more efficient than the conventional method.
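The matching step of an image mosaic can be pictured with a single similarity score: slide two adjacent strips over each other and pick the overlap width that maximizes normalized cross-correlation of the shared columns. This is a one-score sketch only; the thesis fuses several matching functions from the tracer fiber and gradient images into a decision function, and the function name and parameters below are illustrative assumptions.

```python
import numpy as np

def best_overlap(left, right, max_overlap=20):
    """Estimate the horizontal overlap (in pixels) between two
    adjacent image strips by maximizing the normalized
    cross-correlation of the overlapping columns."""
    def ncc(a, b):
        a = a - a.mean()
        b = b - b.mean()
        d = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a.ravel() @ b.ravel()) / d if d else 0.0
    # Compare the last w columns of `left` with the first w of `right`
    return max(range(1, max_overlap + 1),
               key=lambda w: ncc(left[:, -w:], right[:, :w]))
```

A mosaic pipeline would apply this to each consecutive pair of microscope frames, then stitch the strips at the estimated offsets before segmenting the tracer fiber path.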
For yarn surface evaluation, an intelligent computer method has been developed to analyze yarn blackboard images and objectively evaluate yarn quality using computer vision and artificial intelligence. The evaluation method for yarn surface quality currently in use is mainly based on manual inspection. Although some work has been done on digital yarn analysis to overcome the limitations of human visual inspection, none of it can fully imitate human behavior in inspecting yarn quality on a blackboard according to the ASTM D2255 standard. In the proposed method, a multi-scale attention model is developed that fully imitates human attention at different observation distances to capture both the whole and the detailed yarn information. A Spectral Residual method is then extended to establish a general benchmark for comparing yarns of different grades. In the developed system, the Fourier transform is employed to separate yarn diameter and hairs, and image conspicuity is obtained by the multi-scale attention model. In total, sixteen features, obtained from yarn diameter, hairiness and image conspicuity, are extracted to represent the yarn surface characteristics and then used to classify and grade yarn surface quality using a Probabilistic Neural Network (PNN). Two kinds of PNNs, i.e. global and individual PNNs, and two types of classification, i.e. eight-grade and five-grade classification, are designed and developed for various yarn quality classification purposes. For the evaluation of the proposed method, a database was constructed with 296 yarn board images, covering eight yarn counts (7Ne~80Ne) and different grades. Experimental results showed that the accuracies of the eight- and five-grade global PNNs are 92.23% and 93.58%, respectively, demonstrating the good classification performance of the proposed digital method in yarn surface grading.
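The benchmark step builds on the Spectral Residual saliency method of Hou and Zhang, which the thesis extends for yarn-grade comparison. A minimal NumPy sketch of the original method on a grayscale image is shown below; the box-filter size for averaging the log spectrum is an assumed parameter, and the circular (wrap-around) smoothing is a simplification.

```python
import numpy as np

def spectral_residual_saliency(img, avg_size=3):
    """Saliency map of a 2-D grayscale array via the Spectral
    Residual method: subtract the locally averaged log amplitude
    spectrum from the log spectrum, recombine with the original
    phase, and invert the FFT."""
    f = np.fft.fft2(img)
    log_amp = np.log(np.abs(f) + 1e-8)
    phase = np.angle(f)
    # Local average of the log spectrum via a circular box filter
    h = avg_size // 2
    avg = sum(np.roll(np.roll(log_amp, dy, axis=0), dx, axis=1)
              for dy in range(-h, h + 1)
              for dx in range(-h, h + 1)) / (avg_size ** 2)
    residual = log_amp - avg          # the "spectral residual"
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return sal / sal.max()            # normalize to [0, 1]
```

On a yarn board image, high values in such a map would mark conspicuous regions (e.g. thick places or dense hairiness) that attract a human inspector's attention.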
Finally, with the above computer methods, two intelligent digital systems have been developed for the computerized analysis and evaluation of yarn internal structure and surface appearance, respectively. The systems were designed to control and visualize the whole process of yarn measurement and analysis, covering image acquisition, image processing and data analysis, through an interactive, user-friendly interface. Both digital systems have potential applications in textile laboratories and spinning mills for yarn structure and surface analysis.
Hong Kong Polytechnic University -- Dissertations
|Pages:||xxviii, 370 pages : color illustrations|
|Appears in Collections:||Thesis|
View full-text via https://theses.lib.polyu.edu.hk/handle/200/8441