Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/105453
View/Download Full Text
DC Field | Value | Language
dc.contributor | Department of Computing | -
dc.creator | Jin, Y | -
dc.creator | Sheng, B | -
dc.creator | Li, P | -
dc.creator | Chen, CLP | -
dc.date.accessioned | 2024-04-15T07:34:28Z | -
dc.date.available | 2024-04-15T07:34:28Z | -
dc.identifier.issn | 2162-237X | -
dc.identifier.uri | http://hdl.handle.net/10397/105453 | -
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers | en_US
dc.rights | © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_US
dc.rights | The following publication Y. Jin, B. Sheng, P. Li and C. L. P. Chen, "Broad Colorization," in IEEE Transactions on Neural Networks and Learning Systems, vol. 32, no. 6, pp. 2330-2343, June 2021 is available at https://doi.org/10.1109/TNNLS.2020.3004634. | en_US
dc.subject | Colorization | en_US
dc.subject | Global broad learning system (GBLS) | en_US
dc.subject | Global features | en_US
dc.subject | Local broad learning system (LBLS) | en_US
dc.subject | Local features | en_US
dc.title | Broad colorization | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 2330 | -
dc.identifier.epage | 2343 | -
dc.identifier.volume | 32 | -
dc.identifier.issue | 6 | -
dc.identifier.doi | 10.1109/TNNLS.2020.3004634 | -
dcterms.abstract | Scribble- and example-based colorization methods impose fastidious requirements on users, and training deep neural networks for colorization is quite time-consuming. We instead propose an automatic colorization approach that requires neither user input nor a long training time, combining local and global features of the input gray-scale image. Low-, mid-, and high-level features are united as local features that represent cues present in the gray-scale image, while the global feature is treated as a data prior that guides the colorization process. A local broad learning system is trained to predict the chrominance value of each pixel from its local features; these values are assembled into a chrominance map according to pixel positions. A global broad learning system is then trained to refine the chrominance map. Our approach places no demands on users, and training our framework is an order of magnitude faster than training traditional methods based on deep neural networks. To give users more subjective control, our system also allows training data to be added without retraining the system. Extensive experimental results show that our approach outperforms state-of-the-art methods. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | IEEE transactions on neural networks and learning systems, June 2021, v. 32, no. 6, p. 2330-2343 | -
dcterms.isPartOf | IEEE transactions on neural networks and learning systems | -
dcterms.issued | 2021-06 | -
dc.identifier.scopus | 2-s2.0-85107498131 | -
dc.identifier.pmid | 32614774 | -
dc.identifier.eissn | 2162-2388 | -
dc.description.validate | 202402 bcch | -
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | COMP-0033 | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | National Natural Science Foundation of China; The Hong Kong Polytechnic University | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 52669528 | en_US
dc.description.oaCategory | Green (AAM) | en_US
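
The abstract above describes a two-stage pipeline built from broad learning systems (BLSs): a local BLS maps per-pixel local features to chrominance values, which are assembled into a chrominance map, and a global BLS then refines that map under a global-feature prior. Below is a minimal sketch of a single BLS regression stage in NumPy, assuming random feature-mapping and enhancement nodes with a ridge-regression output solve; the function names, node counts, and regularization constant are illustrative assumptions, not the authors' LBLS/GBLS implementation.

import numpy as np

def train_bls(X, Y, n_feature_nodes=20, n_enhance_nodes=100, reg=1e-3, seed=0):
    """Fit a broad-learning-system regressor (sketch): random feature-mapping
    nodes, random enhancement nodes, and a ridge-regression output layer."""
    rng = np.random.default_rng(seed)
    Wf = rng.standard_normal((X.shape[1], n_feature_nodes))
    Z = np.tanh(X @ Wf)                       # mapped feature nodes
    We = rng.standard_normal((n_feature_nodes, n_enhance_nodes))
    H = np.tanh(Z @ We)                       # enhancement nodes
    A = np.hstack([Z, H])                     # broad expansion layer
    # Output weights: W = (A^T A + reg * I)^-1 A^T Y (ridge / pseudoinverse solve)
    W = np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), A.T @ Y)
    return Wf, We, W

def predict_bls(model, X):
    Wf, We, W = model
    Z = np.tanh(X @ Wf)
    H = np.tanh(Z @ We)
    return np.hstack([Z, H]) @ W

# Toy usage (hypothetical data): per-pixel local features -> chrominance (a, b).
X = np.random.rand(500, 9)    # e.g., flattened 3x3 gray-scale patches as local features
Y = np.random.rand(500, 2)    # target chrominance channels per pixel
model = train_bls(X, Y)
print(predict_bls(model, X[:5]).shape)        # (5, 2) predicted chrominance values

In this reading, the global stage would be a second BLS trained on the assembled chrominance map together with a global feature vector, and the incremental-learning property of BLS (updating the output weights when new samples or nodes are added, instead of retraining from scratch) is what allows users to add training data without retraining the system, as stated in the abstract.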
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Li_Broad_Colorization.pdf | Pre-Published version | 45.95 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 15 (as of Jul 7, 2024)
Downloads: 4 (as of Jul 7, 2024)
SCOPUS™ citations: 14 (as of Jul 4, 2024)
Web of Science™ citations: 15 (as of Jul 4, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.