Title: Image colour visualization for fashion e-commerce based on multispectral colour imaging technology
Authors: Tang, Wing Shan
Degree: Ph.D.
Issue Date: 2017
Abstract: E-commerce is now firmly established in the fashion and textile industry, and the clicks-and-mortar business model has been adopted by almost all international brands. The empirical study conducted in this thesis on selected online stores confirms that accurate image colour visualization is of prime importance to the reliable and effective operation of online fashion stores, which present their products primarily through images. However, numerous factors influence image colour accuracy throughout the capture-to-display process, chiefly because colour communication between input and output devices is device-dependent, so the displayed colours deviate from the real ones. If there is any discrepancy between the colour displayed online and the physical colour of the goods, customers are likely to reject the goods, since colour is one of the most influential factors in purchase decisions; this harms brand credibility and incurs substantial additional costs. To develop the online fashion sales channel sustainably, colour communication should use device-independent colour information throughout colour input, colour reproduction and colour output, and multispectral imaging technology is one way to achieve this. As the experimental results show, such a system can perform with an accuracy of 0.25 CMC(2:1) unit. Cross-media metamerism is relatively small when a paper standard is used for characterization and fabrics for testing: in this study, the mean colour differences for nylon, cotton and polyester test samples are 0.46, 0.52 and 0.77 CMC(2:1) units respectively when compared with spectrophotometric measurements.
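The CMC(2:1) unit quoted throughout the abstract is the CMC(l:c) colour-difference formula with a lightness-to-chroma weighting of 2:1, computed from CIELAB coordinates. A minimal sketch of that standard formula follows; it is not taken from the thesis itself, only the textbook BS 6923 form:

```python
import math

def delta_e_cmc(lab_std, lab_sample, l=2.0, c=1.0):
    """CMC(l:c) colour difference between a standard and a sample,
    both given as CIELAB (L*, a*, b*) triples.  With l=2, c=1 this
    is the CMC(2:1) tolerance metric used in textile practice."""
    L1, a1, b1 = lab_std
    L2, a2, b2 = lab_sample

    C1 = math.hypot(a1, b1)
    C2 = math.hypot(a2, b2)
    dL, dC = L1 - L2, C1 - C2
    da, db = a1 - a2, b1 - b2
    # Hue difference as the residual of the Euclidean Lab distance
    dH2 = max(da * da + db * db - dC * dC, 0.0)

    h1 = math.degrees(math.atan2(b1, a1)) % 360.0

    # Weighting functions, all computed from the standard's coordinates
    S_L = 0.511 if L1 < 16 else 0.040975 * L1 / (1 + 0.01765 * L1)
    S_C = 0.0638 * C1 / (1 + 0.0131 * C1) + 0.638
    F = math.sqrt(C1 ** 4 / (C1 ** 4 + 1900))
    if 164 <= h1 <= 345:
        T = 0.56 + abs(0.2 * math.cos(math.radians(h1 + 168)))
    else:
        T = 0.36 + abs(0.4 * math.cos(math.radians(h1 + 35)))
    S_H = S_C * (F * T + 1 - F)

    return math.sqrt((dL / (l * S_L)) ** 2
                     + (dC / (c * S_C)) ** 2
                     + dH2 / (S_H ** 2))
```

Note that the formula is asymmetric: the standard (reference) colour determines the weighting functions, so standard and sample are not interchangeable.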
A complete closed-loop colour reproduction based directly on yarns is then demonstrated, covering all the necessary procedures, with satisfactory calibration and with back-prediction and forward-prediction dyeing accuracies of 0.45 and 0.57 CMC(2:1) unit on average, and good correlation with the spectrophotometer (r2 = 0.80). Because this is achieved with simple yarn twists rather than a wound yarn card, it marks a breakthrough over the time-, effort- and cost-consuming conventional industry practice. The system's adequacy for colour measurement is further verified by its correct sensing of the averaged small to large colour differences among solid and two-colour yarn-dyed samples differentiated by yarn count, fabric density and yarn proportion, in close relation to cover-factor and colour-percentage changes, with a data correlation of around r2 = 0.90. The main parametric effects arising during image capture and image display are quantified to identify the sources of the intrinsic colour differences imposed in multi-media applications. A series of experiments revealed four key parameters: image capture distance, sensitivity and uniformity, together with image display uniformity. Variation in these factors degrades image visualization: 6.24 CMC(2:1) units on average between the normal 60 cm and close 36 cm measurement distances; as much as 6.85 CMC(2:1) units when a sample is measured in a curved rather than flat configuration; 2.04 and 6.91 CMC(2:1) units on average when sample corners rather than the centre are measured on flat and curved surfaces respectively; and 0.34, 0.55 and 2.14 CMC(2:1) units on average when image corners rather than the centre are measured on the EIZO, HP and iPad displays respectively.
Through standardized experiments, weave-textured samples are corrected to their core colour differences, removing extra influences from non-calibrated monitors, measurement-environment differences in illuminant and media, and disagreements between colour measuring tools. In hands-on imaging practice, initial monitor characterization reduces the mean colour differences between the three monitors and the imaging colour measurement (ICM) system from 7.04 to 6.91 CMC(2:1) units on the EIZO, 7.43 to 6.11 on the HP and 8.63 to 8.07 on the iPad. After normalization to each display's own white point under self-luminance, the mean colour differences are further reduced to 3.17, 3.28 and 5.09 CMC(2:1) units. With the R-model approach correlating the physical light booth to the ICM system, differences can be corrected from 5.59 to 2.14 CMC(2:1) units on average. Finally, colourimetric monitor characterization with ICM input of surface-textured samples is proposed. Its effectiveness is confirmed both instrumentally and visually: under self-luminance conditions the mean colour differences improve from 6.47 to 4.60 CMC(2:1) units on the EIZO, 4.92 to 3.60 on the HP and 6.28 to 4.36 on the iPad, and in the psychophysical experiments the visual differences improve from 3.73 to 2.56 on the EIZO and 3.17 to 2.25 on the iPad in general. Given the positive progress from these step-by-step corrective measures, it can be concluded that the multispectral imaging system performs well across colour measurement, reproduction, communication and management, in particular narrowing inter-instrument colour disagreements and improving image colour visualization, which is in great demand in today's fashion e-commerce businesses.
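The self-luminance normalization described above amounts to evaluating each display's colours relative to that display's own measured white point rather than a shared reference illuminant. In CIELAB terms, this simply means using the display white as the (Xn, Yn, Zn) reference in the XYZ-to-Lab conversion. A hedged sketch of that conversion is below; the thesis's exact normalization procedure may differ:

```python
import math

def xyz_to_lab(xyz, white):
    """CIE XYZ -> CIELAB relative to the given white point.
    Passing each display's measured white as `white` (instead of a
    common illuminant such as D65) is one way to realise a
    self-luminance normalization: colours are then expressed
    relative to what that display itself renders as white."""
    def f(t):
        # Standard CIELAB companding with the linear segment near black
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29

    X, Y, Z = xyz
    Xn, Yn, Zn = white
    fx, fy, fz = f(X / Xn), f(Y / Yn), f(Z / Zn)
    L = 116 * fy - 16
    a = 500 * (fx - fy)
    b = 200 * (fy - fz)
    return L, a, b
```

By construction, a display's own white maps to L* = 100, a* = 0, b* = 0 under this normalization, so residual colour differences reflect rendering errors rather than white-point mismatch between devices.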
Subjects: Hong Kong Polytechnic University -- Dissertations
Image processing -- Digital techniques
Multispectral imaging
Fashion merchandising
Color vision
Pages: xx, 353 pages : color illustrations
Appears in Collections: Thesis
