Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/21869
Title: Entropy-based motion extraction for motion capture animation
Authors: So, CKF
Baciu, G 
Keywords: Animation
Entropy
Motion capture
Motion database
Mutual information
Issue Date: 2005
Publisher: John Wiley & Sons
Source: Computer animation and virtual worlds, 2005, v. 16, no. 3-4, p. 225-235
Journal: Computer animation and virtual worlds 
Abstract: In this paper, we present a new segmentation solution for extracting motion patterns from motion capture data by searching for critical keyposes in the motion sequence. A rank is established for critical keyposes that identifies the significance of the directional change in the motion data. The method is based on entropy metrics, specifically the mutual information measure. Displacement histograms between frames are evaluated and the mutual information metric is employed to calculate the inter-frame dependency. The most significant keypose identifies the largest directional change in the motion data and has the lowest mutual information level among all candidate keyposes. Less significant keyposes are then listed with higher mutual information levels. The results show that the method has higher sensitivity to directional changes than methods based on the magnitude of the velocity alone. The method is intended to provide a summary of a motion clip through ranked keyposes, which is highly useful in motion browsing and motion retrieval database systems.
URI: http://hdl.handle.net/10397/21869
ISSN: 1546-4261 (print)
DOI: 10.1002/cav.107
Appears in Collections: Journal/Magazine Article