Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/106925
DC Field | Value | Language
dc.contributor | Department of Electrical and Electronic Engineering | -
dc.creator | Jian, M | -
dc.creator | Qi, Q | -
dc.creator | Yu, H | -
dc.creator | Dong, J | -
dc.creator | Cui, C | -
dc.creator | Nie, X | -
dc.creator | Zhang, H | -
dc.creator | Yin, Y | -
dc.creator | Lam, KM | -
dc.date.accessioned | 2024-06-07T00:58:55Z | -
dc.date.available | 2024-06-07T00:58:55Z | -
dc.identifier.issn | 1568-4946 | -
dc.identifier.uri | http://hdl.handle.net/10397/106925 | -
dc.language.iso | en | en_US
dc.publisher | Elsevier BV | en_US
dc.rights | © 2019 Elsevier B.V. All rights reserved. | en_US
dc.rights | © 2019. This manuscript version is made available under the CC-BY-NC-ND 4.0 license https://creativecommons.org/licenses/by-nc-nd/4.0/ | en_US
dc.rights | The following publication Jian, M., Qi, Q., Yu, H., Dong, J., Cui, C., Nie, X., ... & Lam, K. M. (2019). The extended marine underwater environment database and baseline evaluations. Applied Soft Computing, 80, 425-437 is available at https://doi.org/10.1016/j.asoc.2019.04.025. | en_US
dc.subject | Benchmark | en_US
dc.subject | Saliency detection | en_US
dc.subject | Underwater image database | en_US
dc.subject | Underwater vision | en_US
dc.title | The extended marine underwater environment database and baseline evaluations | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 425 | -
dc.identifier.epage | 437 | -
dc.identifier.volume | 80 | -
dc.identifier.doi | 10.1016/j.asoc.2019.04.025 | -
dcterms.abstract | Images captured in underwater environments usually exhibit complex illumination, severe turbidity of the water, and objects with large variations in pose and spatial location, all of which pose challenges to underwater vision research. In this paper, an extended underwater image database for salient-object detection, or saliency detection, is introduced. This database, called the Marine Underwater Environment Database (MUED), contains 8600 underwater images of 430 individual groups of conspicuous objects with complex backgrounds, multiple salient objects, and complicated variations in pose, spatial location, illumination, and turbidity of the water. The publicly available MUED provides researchers in relevant industrial and academic fields with underwater images under different types of variations. Manually labeled ground-truth information is also included in the database, so as to facilitate research on more applicable and robust methods for both underwater image processing and underwater computer vision. The scale, accuracy, diversity, and background structure of MUED can not only be widely used to assess and evaluate the performance of state-of-the-art salient-object detection and saliency-detection algorithms for general images, but also particularly benefit the development of underwater vision technology and offer unparalleled opportunities to researchers in the underwater vision community and beyond. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Applied soft computing, July 2019, v. 80, p. 425-437 | -
dcterms.isPartOf | Applied soft computing | -
dcterms.issued | 2019-07 | -
dc.identifier.scopus | 2-s2.0-85064737254 | -
dc.identifier.eissn | 1872-9681 | -
dc.description.validate | 202405 bcch | -
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | EIE-0360 | en_US
dc.description.fundingSource | Self-funded | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 20082996 | en_US
dc.description.oaCategory | Green (AAM) | en_US
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Lam_Extended_Marine_Underwater.pdf | Pre-Published version | 1.47 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 6 (as of Jun 30, 2024)
Downloads: 3 (as of Jun 30, 2024)
SCOPUS™ citations: 55 (as of Jun 21, 2024)
Web of Science™ citations: 46 (as of Jun 27, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.