Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/115459
DC Field | Value | Language
dc.contributor | Department of Industrial and Systems Engineering | en_US
dc.creator | Shu, J | en_US
dc.creator | Lee, LH | en_US
dc.creator | Sun, Y | en_US
dc.creator | Pu, P | en_US
dc.creator | Hui, P | en_US
dc.date.accessioned | 2025-09-29T02:58:10Z | -
dc.date.available | 2025-09-29T02:58:10Z | -
dc.identifier.issn | 0141-9382 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/115459 | -
dc.language.iso | en | en_US
dc.publisher | Elsevier | en_US
dc.subject | Chinese foods | en_US
dc.subject | Computational gastronomy | en_US
dc.subject | Human–Food Interaction (HFI) | en_US
dc.subject | Machine learning | en_US
dc.title | The art of dish : what makes cooked food visually appealing? | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 90 | en_US
dc.identifier.doi | 10.1016/j.displa.2025.103138 | en_US
dcterms.abstract | People's liking for cooked food is affected by a number of factors, including appearance, taste, smell, and eating habits. Among these factors, appearance plays a vital role, especially in situations where only the appearance of food is available, such as on mobile displays. However, previous research on the effects of appearance on people's liking for cooked food is limited in dimension and scale. In this paper, we investigate the relationship between three major visual aspects of cooked food and its visual appeal. We propose and extract several visual features in terms of color, texture, and layout, based on images collected from a large online food community. We also train classifiers on the proposed visual features to predict the visual appeal of cooked foods. The results show that we can achieve about 77% prediction accuracy, and we find that people prefer cooked food with bright, warm colors and a smooth surface. | en_US
dcterms.accessRights | embargoed access | en_US
dcterms.bibliographicCitation | Displays, Dec. 2025, v. 90, 103138 | en_US
dcterms.isPartOf | Displays | en_US
dcterms.issued | 2025-12 | -
dc.identifier.scopus | 2-s2.0-105010564522 | -
dc.identifier.eissn | 1872-7387 | en_US
dc.identifier.artn | 103138 | en_US
dc.description.validate | 202509 bchy | en_US
dc.description.oa | Not applicable | en_US
dc.identifier.SubFormID | G000166/2025-08 | -
dc.description.fundingSource | Others | en_US
dc.description.fundingText | This research was supported by the Hong Kong Polytechnic University's Starting Grant for New Recruits, RIAM Impact Fund, Departmental GRF by ISE (Project ID: P0046056, P0056354, P0056767), the National Key Research and Development Program of China under Grant 2024YFC3307602, and the Guangdong Provincial Talent Program under Grant 2023JC10X009. | en_US
dc.description.pubStatus | Published | en_US
dc.date.embargo | 2027-12-31 | en_US
dc.description.oaCategory | Green (AAM) | en_US
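The abstract above describes a pipeline of this shape: extract color, texture, and layout features from food images, then train a classifier to predict visual appeal. The sketch below is a minimal toy illustration of that kind of pipeline, not the authors' code: the brightness and warmth features, the small logistic-regression trainer, and the synthetic image data are all illustrative assumptions; only the reported preference for bright, warm colors is taken from the abstract.

```python
# Toy sketch of a "color features -> appeal classifier" pipeline.
# All feature definitions and data here are illustrative assumptions.
import math
import random

def color_features(pixels):
    """pixels: list of (r, g, b) tuples in [0, 1]. Returns [brightness, warmth]."""
    n = len(pixels)
    brightness = sum(r + g + b for r, g, b in pixels) / (3 * n)
    warmth = sum(r - b for r, g, b in pixels) / n  # crude warm-vs-cool proxy
    return [brightness, warmth]

def train_logistic(X, y, lr=0.5, epochs=500):
    """Plain batch gradient descent on 2-feature logistic regression."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        gw = [0.0, 0.0]
        gb = 0.0
        for xi, yi in zip(X, y):
            p = 1 / (1 + math.exp(-(w[0] * xi[0] + w[1] * xi[1] + b)))
            e = p - yi  # derivative of log-loss w.r.t. the logit
            gw[0] += e * xi[0]
            gw[1] += e * xi[1]
            gb += e
        n = len(y)
        w[0] -= lr * gw[0] / n
        w[1] -= lr * gw[1] / n
        b -= lr * gb / n
    return w, b

def predict(w, b, x):
    return 1 if 1 / (1 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b))) > 0.5 else 0

# Synthetic 8x8 "images": appealing ones are brighter and warmer,
# mimicking the abstract's finding about bright, warm colors.
rng = random.Random(0)

def toy_image(appealing):
    base = 0.7 if appealing else 0.3
    red_boost = 0.2 if appealing else 0.0
    return [
        (min(1.0, base + rng.uniform(0, 0.2) + red_boost),  # r
         base + rng.uniform(0, 0.2),                         # g
         base + rng.uniform(0, 0.2))                         # b
        for _ in range(64)
    ]

labels = [i % 2 for i in range(40)]
X = [color_features(toy_image(lab)) for lab in labels]
w, b = train_logistic(X, labels)
acc = sum(predict(w, b, x) == lab for x, lab in zip(X, labels)) / len(labels)
```

On this cleanly separated toy data the classifier reaches high training accuracy; the paper's reported ~77% is on real, noisy images and far richer features (texture and layout as well as color).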
Appears in Collections:Journal/Magazine Article
Open Access Information
Status: embargoed access
Embargo End Date: 2027-12-31

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.