dc.contributor: Department of Computing
dc.creator: Zhang, Kaihua
dc.title: Real-time and robust visual tracking
dcterms.abstract: Visual tracking has been extensively studied because of its importance in practical applications such as visual surveillance, human-computer interaction, and traffic monitoring, to name a few. Despite extensive research on this topic with demonstrated success, it remains very challenging to build a robust and efficient tracking system that copes with the various appearance changes caused by pose variation, illumination changes, shape deformation, and abrupt motion. In this thesis, we address these challenging factors by building several robust appearance models for visual tracking. To effectively select informative features for an appearance model, we first present an online boosting feature selection approach that optimizes the Fisher information criterion. Recently, the multiple instance learning (MIL) method has been introduced into tracking to solve the sample ambiguity problem. The MIL tracker puts the positive and negative samples into bags and then selects features with an online boosting method by maximizing the bag likelihood function. However, the features selected by the MIL tracker are less informative for telling the target from the background. To solve this problem, motivated by active learning, we propose an active feature selection approach that selects more informative features than the MIL tracker by using the Fisher information criterion to measure the uncertainty of the classification model, thereby yielding more robust and efficient real-time object tracking. We further show that it is unnecessary to use bag likelihood loss functions for feature selection as proposed in the MIL tracker. Instead, we can select features directly at the instance level with a supervised learning method that is more efficient and robust than the MIL tracker.
In the MIL tracker, the important prior information of the instance labels and of the most important positive instance (i.e., the tracking result in the current frame) is not exploited. We show that integrating such prior information into a supervised learning algorithm handles visual drift more effectively and efficiently than the MIL tracker does. We present an online discriminative feature selection algorithm that directly couples the classifier score with the importance of the samples, leading to a more robust and efficient tracker. Unlike the above methods, which design appearance models by selecting features with online boosting, we then propose an appearance model built from features extracted from a multiscale image feature space with random projections. A very sparse measurement matrix is constructed to extract the features efficiently. The tracking task is then formulated as a binary classification problem, solved with a naive Bayes classifier that is updated online in the compressed domain. Finally, we present a simple yet very fast and robust algorithm that exploits the spatio-temporal context for visual tracking. Our approach formulates the spatio-temporal relationship between the object of interest and its local context in a Bayesian framework, modeling the spatio-temporal statistical correlation between the low-level features (i.e., image intensity and position) of the target and its surrounding regions. The tracking problem is then reduced to computing a confidence map and obtaining the best target location by maximizing an object location likelihood function. The fast Fourier transform (FFT) is adopted for extremely fast learning and detection. Implemented in MATLAB, the proposed tracker runs at 350 frames per second on an i7 machine.
dcterms.accessRights: open access
dcterms.extent: xviii, 149 p. : ill. ; 30 cm.
dcterms.LCSH: Computer vision
dcterms.LCSH: Hong Kong Polytechnic University -- Dissertations
Appears in Collections: Thesis
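The abstract's feature selection methods rank candidate features by how well they separate target from background under a Fisher-style criterion. As a toy illustration only, the sketch below computes the classical per-feature Fisher score over labeled positive/negative samples in Python; this is not the thesis's bag-level or online boosting objective, and all names and sample data here are illustrative assumptions.

```python
import numpy as np

def fisher_scores(pos, neg, eps=1e-12):
    """Classical Fisher criterion per feature: (mu+ - mu-)^2 / (var+ + var-).

    pos, neg: arrays of shape (n_samples, n_features) holding feature
    responses on positive (target) and negative (background) samples.
    A higher score means the feature separates the two classes better.
    """
    mu_p, mu_n = pos.mean(axis=0), neg.mean(axis=0)
    var_p, var_n = pos.var(axis=0), neg.var(axis=0)
    return (mu_p - mu_n) ** 2 / (var_p + var_n + eps)

# Synthetic example: feature 3 is made far more discriminative than the rest.
rng = np.random.default_rng(0)
pos = rng.normal(2.0, 1.0, size=(100, 8))
pos[:, 3] += 5.0
neg = rng.normal(0.0, 1.0, size=(100, 8))
best = np.argsort(fisher_scores(pos, neg))[::-1][:3]  # top-3 feature indices
```

An online tracker would maintain these statistics incrementally per frame rather than recomputing them from stored samples.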
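The compressed-domain appearance model in the abstract hinges on a very sparse random measurement matrix. A minimal Python sketch of one standard construction (an Achlioptas-style ternary matrix) is below; the function name, the sparsity parameter s = 3, and the dimensions are my assumptions for illustration, not the thesis's exact design.

```python
import numpy as np

def sparse_measurement_matrix(n_compressed, n_raw, s=3, seed=0):
    """Very sparse random projection matrix R of shape (n_compressed, n_raw).

    Entries are +sqrt(s) with probability 1/(2s), -sqrt(s) with
    probability 1/(2s), and 0 otherwise, so most entries are zero and
    the projection R @ x is cheap to compute and store.
    """
    rng = np.random.default_rng(seed)
    return rng.choice(
        [np.sqrt(s), 0.0, -np.sqrt(s)],
        size=(n_compressed, n_raw),
        p=[1 / (2 * s), 1 - 1 / s, 1 / (2 * s)],
    )

# Project a high-dimensional multiscale feature vector down to 50 dims;
# the compressed vector v would feed the online naive Bayes classifier.
x = np.random.default_rng(1).random(10_000)
R = sparse_measurement_matrix(50, 10_000)
v = R @ x
```

With s = 3 roughly two thirds of the entries are zero, so in practice only the nonzero positions are stored and the matrix is generated once and kept fixed during tracking.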
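The spatio-temporal context tracker gets its speed from doing both learning and detection in the Fourier domain. The Python sketch below shows the frequency-domain learn/detect idea under stated assumptions: the desired confidence map is taken as a Gaussian peaked at the target, the regularizing epsilon and all names are mine, and the thesis's MATLAB implementation includes context weighting and online model updates omitted here.

```python
import numpy as np

def learn_context_model(intensity, prior, sigma=5.0):
    """Solve for the context model h in the Fourier domain.

    The confidence map c satisfies c = ifft2(fft2(h) * fft2(I * w)),
    so given a desired confidence map peaked at the target center we
    recover h's spectrum by element-wise division of spectra.
    """
    rows, cols = intensity.shape
    yy, xx = np.mgrid[0:rows, 0:cols]
    cy, cx = rows // 2, cols // 2
    desired = np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
    ctx_spec = np.fft.fft2(intensity * prior)
    return np.fft.fft2(desired) / (ctx_spec + 1e-6)  # epsilon avoids divide-by-zero

def detect(intensity, prior, model_spec):
    """Confidence map via FFT; the best target location is its argmax."""
    conf = np.real(np.fft.ifft2(model_spec * np.fft.fft2(intensity * prior)))
    return np.unravel_index(np.argmax(conf), conf.shape)

# Learning on a frame and detecting on the same frame recovers the center.
rng = np.random.default_rng(0)
frame = rng.random((64, 64)) + 0.5
yy, xx = np.mgrid[0:64, 0:64]
prior = np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / (2 * 20.0 ** 2))
model = learn_context_model(frame, prior)
loc = detect(frame, prior, model)
```

Because every heavy operation is an FFT over the local context window, each frame costs O(n log n), which is consistent with the very high frame rates the abstract reports.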



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.