Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/105558
DC Field | Value | Language
dc.contributor | Department of Computing | -
dc.creator | Zhang, H | -
dc.creator | Ali, R | -
dc.creator | Sheng, B | -
dc.creator | Li, P | -
dc.creator | Kim, J | -
dc.creator | Wang, J | -
dc.date.accessioned | 2024-04-15T07:35:01Z | -
dc.date.available | 2024-04-15T07:35:01Z | -
dc.identifier.isbn | 978-3-030-61863-6 | -
dc.identifier.isbn | 978-3-030-61864-3 (eBook) | -
dc.identifier.issn | 0302-9743 | -
dc.identifier.uri | http://hdl.handle.net/10397/105558 | -
dc.description | 37th Computer Graphics International Conference, CGI 2020, Geneva, Switzerland, October 20–23, 2020 | en_US
dc.language.iso | en | en_US
dc.publisher | Springer | en_US
dc.rights | © Springer Nature Switzerland AG 2020 | en_US
dc.rights | This version of the proceeding paper has been accepted for publication, after peer review (when applicable) and is subject to Springer Nature’s AM terms of use (https://www.springernature.com/gp/open-research/policies/accepted-manuscript-terms), but is not the Version of Record and does not reflect post-acceptance improvements or any corrections. The Version of Record is available online at: http://dx.doi.org/10.1007/978-3-030-61864-3_34. | en_US
dc.subject | Adaptive SLIC | en_US
dc.subject | Temporal consistency | en_US
dc.subject | Video processing | en_US
dc.title | Preserving temporal consistency in videos through adaptive SLIC | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 405 | -
dc.identifier.epage | 410 | -
dc.identifier.volume | 12221 | -
dc.identifier.doi | 10.1007/978-3-030-61864-3_34 | -
dcterms.abstract | Applying image processing techniques to individual video frames often introduces temporal inconsistency. Conventional approaches to preserving temporal consistency in videos are limited because each is designed for a single task. We present a general-purpose method for preserving temporal consistency that uses an adaptive simple linear iterative clustering (SLIC) algorithm. First, we locate inter-frame corresponding pixels via SIFT Flow and use them to identify the corresponding regions. We then apply a multi-frame matching statistical method to obtain spatially or temporally corresponding frames. Finally, we devise a least-squares, energy-based flicker-removal objective function that jointly accounts for inter-frame temporal consistency and inter-region spatial consistency. The obtained results demonstrate the potential of the proposed method. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Lecture notes in computer science (including subseries Lecture notes in artificial intelligence and lecture notes in bioinformatics), 2020, v. 12221, p. 405-410 | -
dcterms.isPartOf | Lecture notes in computer science (including subseries Lecture notes in artificial intelligence and lecture notes in bioinformatics) | -
dcterms.issued | 2020 | -
dc.identifier.scopus | 2-s2.0-85096524427 | -
dc.relation.conference | Computer Graphics International Conference [CGI] | -
dc.identifier.eissn | 1611-3349 | -
dc.description.validate | 202402 bcch | -
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | COMP-0448 | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | National Key Research and Development Program of China; National Natural Science Foundation of China; Science and Technology Commission of Shanghai Municipality | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 43001184 | en_US
dc.description.oaCategory | Green (AAM) | en_US
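The abstract's pipeline builds on SLIC superpixel clustering. As a rough illustration of that building block only (not the paper's adaptive variant, whose details are not given here), the sketch below runs one assign/update pass of SLIC-style clustering on a grayscale image; the function name `slic_step`, the compactness weight `m`, and the grid interval `S` are illustrative assumptions.

```python
import numpy as np

def slic_step(img, centers, m=10.0, S=4):
    """One assign/update iteration of SLIC-style clustering (sketch).

    img     : (H, W) grayscale image
    centers : (K, 3) array of [intensity, y, x] cluster centers
    m       : compactness weight balancing color vs. spatial distance
    S       : grid interval normalizing the spatial term
    """
    H, W = img.shape
    ys, xs = np.mgrid[0:H, 0:W]
    # Each pixel becomes a feature [intensity, y, x].
    feats = np.stack([img, ys, xs], axis=-1).reshape(-1, 3).astype(float)

    # Combined SLIC distance: color difference plus (m/S)-scaled spatial distance.
    d_color = np.abs(feats[:, None, 0] - centers[None, :, 0])
    d_space = np.linalg.norm(feats[:, None, 1:] - centers[None, :, 1:], axis=-1)
    labels = np.argmin(d_color + (m / S) * d_space, axis=1)

    # Move each center to the mean feature of its assigned pixels.
    new_centers = np.array([feats[labels == k].mean(axis=0)
                            if np.any(labels == k) else centers[k]
                            for k in range(len(centers))])
    return labels.reshape(H, W), new_centers
```

In practice this step is iterated until the centers stabilize; the full SLIC algorithm also restricts the search to a 2S×2S window around each center for efficiency.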
Appears in Collections:Conference Paper
Files in This Item:
File | Description | Size | Format
Li_Preserving_Temporal_Consistency.pdf | Pre-Published version | 1.4 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.