Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/102209
DC Field | Value | Language
dc.contributor | Mainland Development Office | -
dc.contributor | School of Fashion and Textiles | -
dc.creator | Zhou, W | en_US
dc.creator | Mok, PY | en_US
dc.creator | Zhou, Y | en_US
dc.creator | Zhou, Y | en_US
dc.creator | Shen, J | en_US
dc.creator | Qu, Q | en_US
dc.creator | Chau, KP | en_US
dc.date.accessioned | 2023-10-12T02:21:50Z | -
dc.date.available | 2023-10-12T02:21:50Z | -
dc.identifier.issn | 1047-3203 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/102209 | -
dc.language.iso | en | en_US
dc.publisher | Academic Press | en_US
dc.rights | © 2019 Elsevier Inc. All rights reserved. | en_US
dc.rights | © 2019. This manuscript version is made available under the CC-BY-NC-ND 4.0 license https://creativecommons.org/licenses/by-nc-nd/4.0/ | en_US
dc.rights | The following publication Zhou, W., Mok, P. Y., Zhou, Y., Zhou, Y., Shen, J., Qu, Q., & Chau, K. P. (2019). Fashion recommendations through cross-media information retrieval. Journal of Visual Communication and Image Representation, 61, pp. 112–120 is available at https://doi.org/10.1016/j.jvcir.2019.03.003. | en_US
dc.subject | Fashion recommendations | en_US
dc.subject | Human parsing | en_US
dc.subject | Image features | en_US
dc.subject | Image retrieval | en_US
dc.title | Fashion recommendations through cross-media information retrieval | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 112 | en_US
dc.identifier.epage | 120 | en_US
dc.identifier.volume | 61 | en_US
dc.identifier.doi | 10.1016/j.jvcir.2019.03.003 | en_US
dcterms.abstract | Fashion recommendation has attracted much attention given its ready applications to e-commerce. Traditional methods usually recommend clothing products to users on the basis of their textual descriptions. Product images, although carrying a rich source of information, are often ignored in the recommendation process. In this study, we propose a novel fashion product recommendation method based on both text and image mining techniques. Our model facilitates two kinds of fashion recommendation, namely, similar-product and mix-and-match, by leveraging text-based product attributes and image features. To suggest similar products, we construct a new similarity measure that compares image colour and texture descriptors. For mix-and-match recommendation, we first adopt a convolutional neural network (CNN) to classify fine-grained clothing categories and attributes from product images. An algorithm is then developed to make mix-and-match recommendations by integrating the image-extracted category and attribute information with the text-based product attributes. Our comprehensive experimental work on a real-life online dataset has demonstrated the effectiveness of the proposed method. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Journal of visual communication and image representation, May 2019, v. 61, p. 112-120 | en_US
dcterms.isPartOf | Journal of visual communication and image representation | en_US
dcterms.issued | 2019-05 | -
dc.identifier.scopus | 2-s2.0-85063349380 | -
dc.description.validate | 202310 bckw | -
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | ITC-0395 | -
dc.description.fundingSource | RGC | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | ITF, Guangdong Provincial Department of Science and Technology; Shenzhen Science and Technology Innovation Commission; National Natural Science Foundation of China | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 13246040 | -
dc.description.oaCategory | Green (AAM) | en_US
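
The abstract above mentions a new similarity measure over image colour and texture descriptors for similar-product recommendation, but the record does not spell it out. As a minimal illustrative sketch only, assuming OpenCV, an HSV colour histogram, a Sobel gradient-magnitude texture proxy, and placeholder weights (none of which are confirmed by the published paper), one such colour-plus-texture similarity could look like this:

    # Illustrative sketch only (not the measure defined in the paper): assumes an
    # HSV colour histogram and a gradient-magnitude texture proxy, combined by a
    # weighted sum. Descriptor choices, weights and function names are hypothetical.
    import cv2          # OpenCV, assumed available
    import numpy as np

    def colour_histogram(image_bgr, bins=(8, 8, 8)):
        # Normalised HSV colour histogram (assumed colour descriptor).
        hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
        hist = cv2.calcHist([hsv], [0, 1, 2], None, list(bins), [0, 180, 0, 256, 0, 256])
        return cv2.normalize(hist, hist).flatten()

    def texture_descriptor(image_bgr, bins=16):
        # Rough texture proxy: histogram of Sobel gradient magnitudes (assumed descriptor).
        grey = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        gx = cv2.Sobel(grey, cv2.CV_32F, 1, 0)
        gy = cv2.Sobel(grey, cv2.CV_32F, 0, 1)
        mag = cv2.magnitude(gx, gy)
        hist, _ = np.histogram(mag, bins=bins, range=(0.0, float(mag.max()) + 1e-6))
        return hist / (hist.sum() + 1e-6)

    def similarity(img_a, img_b, w_colour=0.5, w_texture=0.5):
        # Weighted combination of colour and texture similarity (weights are placeholders).
        colour_sim = cv2.compareHist(colour_histogram(img_a), colour_histogram(img_b),
                                     cv2.HISTCMP_CORREL)
        texture_dist = np.abs(texture_descriptor(img_a) - texture_descriptor(img_b)).sum()
        texture_sim = 1.0 - 0.5 * texture_dist   # L1 distance of normalised histograms -> [0, 1]
        return w_colour * colour_sim + w_texture * texture_sim

For the mix-and-match case, the abstract indicates that a CNN classifier first predicts fine-grained clothing categories and attributes from the product image, and those predictions are then merged with the text-based product attributes before recommendations are generated; the details of that algorithm are given in the paper itself.
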
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Mok_Fashion_Recommendations_Information.pdf | Pre-Published version | 1.09 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 112 (as of Apr 14, 2025)
Downloads: 96 (as of Apr 14, 2025)
Scopus™ citations: 36 (as of Dec 19, 2025)
Web of Science™ citations: 25 (as of Sep 5, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.