Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/237
Title: An efficient search strategy for block motion estimation using image features
Authors: Chan, YL 
Siu, WC 
Issue Date: Aug-2001
Source: IEEE Transactions on Image Processing, Aug. 2001, v. 10, no. 8, p. 1223-1238
Abstract: Block motion estimation using the exhaustive full search is computationally intensive. Fast search algorithms proposed in the past reduce the amount of computation by limiting the number of locations to be searched. Nearly all of these algorithms rely on the assumption that the mean absolute difference (MAD) distortion function increases monotonically as the search location moves away from the global minimum. Essentially, this assumption requires the MAD error surface to be unimodal over the search window. Unfortunately, this is usually not true for real-world video signals. However, it is reasonable to assume that the surface is monotonic in a small neighborhood around the global minimum. Consequently, a simple but perhaps the most efficient and reliable strategy is to place the initial checking point as close as possible to the global minimum. In this paper, image features are suggested for locating the initial search points; the guided scheme is based on the locations of certain feature points. After applying a feature-detection process to each frame to extract a set of feature points as matching primitives, we studied the statistical behavior of these primitives extensively and found that they are highly correlated with the MAD error surface of real-world motion vectors. These correlation characteristics are extremely useful for fast search algorithms. The results are robust and the implementation can be very efficient.
An appealing feature of our approach is that the proposed search algorithm can work together with other block motion estimation algorithms. Experimental results from applying the present approach to the block-based gradient descent search algorithm (BBGDS), the diamond search algorithm (DS), and our previously proposed edge-oriented block motion estimation show that the proposed search strategy strengthens these searching algorithms. Compared with the conventional approach, the new algorithm, through the extraction of image features, is more robust, produces smaller motion compensation errors, and has low computational complexity.
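To illustrate the MAD distortion function and the exhaustive full search that the abstract contrasts with fast search algorithms, a minimal Python sketch follows. It is not the authors' feature-guided method; the block size of 16, the search range of ±7, and the use of grayscale frame arrays are illustrative assumptions.

```python
import numpy as np

def mad(cur_block, ref_block):
    """Mean absolute difference (MAD) between two equally sized blocks."""
    return np.mean(np.abs(cur_block.astype(np.int32) - ref_block.astype(np.int32)))

def full_search(cur_frame, ref_frame, top, left, block_size=16, search_range=7):
    """Exhaustive full search: evaluate MAD at every displacement in a
    (2*search_range + 1)^2 window and return the best motion vector.
    Fast algorithms such as BBGDS or DS visit only a subset of these points."""
    h, w = cur_frame.shape
    block = cur_frame[top:top + block_size, left:left + block_size]
    best_mv, best_cost = (0, 0), float("inf")
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + block_size > h or x + block_size > w:
                continue  # skip candidates that fall outside the reference frame
            cost = mad(block, ref_frame[y:y + block_size, x:x + block_size])
            if cost < best_cost:
                best_cost, best_mv = cost, (dy, dx)
    return best_mv, best_cost
```

The paper's contribution is to choose a good initial checking point from image features so that a fast search started there converges to the global minimum; the exhaustive loop above serves only as the baseline being avoided.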
Keywords: Block matching algorithm
Image feature extraction
Motion estimation
Motion vector
Publisher: Institute of Electrical and Electronics Engineers
Journal: IEEE Transactions on Image Processing
ISSN: 1057-7149
EISSN: 1941-0042
DOI: 10.1109/83.935038
Rights: © 2001 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.
This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.
Appears in Collections: Journal/Magazine Article

Files in This Item:
File: block-motion_01.pdf | Size: 384.51 kB | Format: Adobe PDF
Open Access Information
Status: Open access
File Version: Version of Record

Page views: 91 (last week: 1), as of Mar 24, 2024
Downloads: 167, as of Mar 24, 2024

Scopus citations: 58 (last week: 0, last month: 0), as of Mar 28, 2024
Web of Science citations: 39 (last week: 0, last month: 0), as of Mar 28, 2024


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.