Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/74722
Title: Evaluation of segmentation quality via adaptive composition of reference segmentations
Authors: Peng, B
Zhang, L 
Mou, X
Yang, MH
Keywords: Image segmentation dataset
Image segmentation evaluation
Segmentation quality
Issue Date: 2017
Publisher: IEEE Computer Society
Source: IEEE Transactions on Pattern Analysis and Machine Intelligence, 2017, v. 39, no. 10, 7723880, p. 1929-1941
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Abstract: Evaluating image segmentation quality is a critical step for generating desirable segmented output and for comparing the performance of algorithms, among other uses. However, automatic evaluation of segmentation results is inherently challenging because image segmentation is an ill-posed problem. This paper presents a framework for evaluating segmentation quality using multiple labeled segmentations that serve as references. For a segmentation to be evaluated, we adaptively compose a reference segmentation from the multiple labeled segmentations, one that locally matches the input segments while preserving structural consistency. The quality of the given segmentation is then measured by its distance to the composed reference. A new dataset of 200 images, each with 6 to 15 labeled segmentations, is developed for performance evaluation of image segmentation. Furthermore, to quantitatively compare the proposed segmentation evaluation algorithm with state-of-the-art methods, a benchmark segmentation evaluation dataset is proposed. Extensive experiments are carried out to validate the proposed segmentation evaluation framework.
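The evaluation idea sketched in the abstract — compose a reference that locally matches the test segmentation from several human labelings, then score the test segmentation by its distance to that composition — can be illustrated with a deliberately simplified Python sketch. Everything below is a hypothetical stand-in, not the paper's actual algorithm: the greedy best-overlap composition ignores the structural-consistency constraint the paper enforces, the sampled pair-agreement score is only a Rand-index-style proxy for the distance measure, and all function names are invented for illustration.

```python
import numpy as np

def segment_match_score(test_seg, ref_seg, seg_id):
    """Crude local-match proxy: Jaccard overlap between one test segment
    and the best-overlapping region in a single reference labeling."""
    mask = test_seg == seg_id
    ref_labels, counts = np.unique(ref_seg[mask], return_counts=True)
    best = ref_labels[np.argmax(counts)]           # ref region covering most of the segment
    inter = counts.max()
    union = mask.sum() + (ref_seg == best).sum() - inter
    return inter / union

def compose_reference(test_seg, references):
    """Greedy stand-in for adaptive composition: each test segment adopts the
    labeling of whichever reference matches it best locally. No structural
    consistency is enforced here, unlike the paper's method."""
    composed = np.zeros_like(test_seg)
    for k, seg_id in enumerate(np.unique(test_seg), start=1):
        scores = [segment_match_score(test_seg, r, seg_id) for r in references]
        best_ref = references[int(np.argmax(scores))]
        mask = test_seg == seg_id
        # Offset labels per segment so regions taken from different references
        # stay distinct (assumes reference labels are < 1000 for this sketch).
        composed[mask] = best_ref[mask] + k * 1000
    return composed

def quality(test_seg, references):
    """Score = fraction of sampled pixel pairs on which the test segmentation
    and the composed reference agree about 'same segment vs. different
    segment' (a Rand-index-style agreement, here as a distance proxy)."""
    composed = compose_reference(test_seg, references)
    t, c = test_seg.ravel(), composed.ravel()
    idx = np.random.default_rng(0).integers(0, t.size, size=(10000, 2))
    same_t = t[idx[:, 0]] == t[idx[:, 1]]
    same_c = c[idx[:, 0]] == c[idx[:, 1]]
    return (same_t == same_c).mean()
```

On a toy image split into two halves, a test segmentation identical to the single reference scores 1.0, while one with a misplaced boundary scores lower — the composed reference inherits the correct boundary from the labeling, and the pair-agreement score penalizes the mismatch.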
URI: http://hdl.handle.net/10397/74722
ISSN: 0162-8828
EISSN: 1939-3539
DOI: 10.1109/TPAMI.2016.2622703
Appears in Collections: Journal/Magazine Article


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.