Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/19681
Title: Scale and orientation adaptive mean shift tracking
Authors: Ning, J
Zhang, L 
Zhang, D 
Wu, C
Issue Date: 2012
Source: IET Computer Vision, 2012, v. 6, no. 1, p. 52-61
Journal: IET Computer Vision 
Abstract: A scale and orientation adaptive mean shift tracking (SOAMST) algorithm is proposed in this study to address the problem of how to estimate the scale and orientation changes of the target under the mean shift tracking framework. In the original mean shift tracking algorithm, the position of the target can be well estimated, whereas the scale and orientation changes cannot be adaptively estimated. Considering that the weight image derived from the target model and the candidate model can represent the probability that a pixel belongs to the target, the authors show that the original mean shift tracking algorithm can be derived using the zeroth- and first-order moments of the weight image. With the zeroth-order moment and the Bhattacharyya coefficient between the target model and the candidate model, a simple and effective method is proposed to estimate the scale of the target. An approach that utilises the estimated area and the second-order central moments is then proposed to adaptively estimate the width, height and orientation changes of the target. Extensive experiments are performed to evaluate the proposed method and validate its robustness to scale and orientation changes of the target.
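The abstract outlines a moment-based estimation of the target's position, scale and orientation from the weight image. The Python sketch below illustrates that idea under simplifying assumptions: the weight image is taken as given (its construction from the target and candidate colour histograms is omitted), and the estimated target area is passed in directly rather than derived from the zeroth-order moment and the Bhattacharyya coefficient as in the paper. The function name estimate_target_state and its interface are illustrative, not the authors' published code.

```python
import numpy as np

def estimate_target_state(weight, area):
    """Estimate the target centre, ellipse axes and orientation from a
    weight image via its image moments (a sketch of the idea in the
    abstract, not the authors' published implementation).

    weight : 2-D array; weight[i, j] approximates the probability that
             pixel (i, j) belongs to the target.  Its construction from
             the target and candidate histograms is omitted here.
    area   : estimated target area.  In SOAMST this is obtained from the
             zeroth-order moment and the Bhattacharyya coefficient; in
             this sketch it is simply supplied by the caller.
    """
    ys, xs = np.mgrid[0:weight.shape[0], 0:weight.shape[1]]

    # Zeroth- and first-order moments give the (mean-shift) centre.
    m00 = weight.sum()
    xc = (xs * weight).sum() / m00
    yc = (ys * weight).sum() / m00

    # Second-order central moments describe the spread of the weights,
    # from which an orientation and an aspect ratio can be read off.
    mu20 = ((xs - xc) ** 2 * weight).sum() / m00
    mu02 = ((ys - yc) ** 2 * weight).sum() / m00
    mu11 = ((xs - xc) * (ys - yc) * weight).sum() / m00

    cov = np.array([[mu20, mu11], [mu11, mu02]])
    eigvals, eigvecs = np.linalg.eigh(cov)            # ascending order
    angle = np.arctan2(eigvecs[1, 1], eigvecs[0, 1])  # major-axis angle

    # Rescale the ellipse axes so that pi * a * b equals the estimated
    # area while keeping the moment-based aspect ratio.
    ratio = np.sqrt(eigvals[1] / max(eigvals[0], 1e-12))
    b = np.sqrt(area / (np.pi * ratio))   # minor semi-axis
    a = ratio * b                         # major semi-axis
    return (xc, yc), (a, b), angle
```

As a quick sanity check of the moment formulas, feeding this function a binary mask of an axis-aligned ellipse together with its pixel count returns approximately that ellipse's centre, semi-axes and zero orientation.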
URI: http://hdl.handle.net/10397/19681
ISSN: 1751-9632
DOI: 10.1049/iet-cvi.2010.0112
Appears in Collections: Journal/Magazine Article

Scopus citations: 93 (as of Sep 15, 2017; 2 in the last week, 4 in the last month)
Web of Science citations: 54 (as of Sep 22, 2017; 1 in the last week, 0 in the last month)
