Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/39805
Title: Bidirectional visible neighborhood preserving embedding
Authors: Liu, Y
Liu, Y 
Chan, KCC 
Keywords: Bidirectional visible neighborhood preserving embedding
Dimensionality reduction
Manifold learning
Issue Date: 2009
Source: Proceedings of ACM International Conference on Internet Multimedia Computing and Service, China, Nov. 23 - Nov. 25, 2009, p. 169-174
Abstract: In this paper, we propose a series of dimensionality reduction algorithms based on a novel neighborhood graph construction method. The paper begins by presenting a new manifold learning algorithm called bidirectional visible neighborhood preserving embedding (BVNPE). Like existing manifold learning techniques, BVNPE first links every data point with its k nearest neighbors (NNs). Second, we construct a reliable neighborhood graph by checking two criteria: bidirectional linkage and visible neighborhood preserving. Third, we assign a weight to each edge in this reliable graph based on the pairwise distance between data points. Finally, we compute the low-dimensional embedding, aiming to preserve the manifold structure of the input dataset by mapping nearby points on the manifold to nearby points in the low-dimensional space. Moreover, this paper also proposes a linear variant, BVNPE/L, for the straightforward embedding of new data, and a multilinear variant, BVNPE/M, which better represents the tensor structure of image and video data. Experiments on various datasets validate the effectiveness of the proposed algorithms.
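
Note: The pipeline summarized in the abstract can be illustrated with a short, hypothetical Python sketch. This is not the authors' implementation: the "visible neighborhood preserving" check from step 2 is omitted, the heat-kernel weighting and the Laplacian-eigenmaps-style spectral step are common stand-ins assumed for steps 3 and 4, and the function and parameter names (bvnpe_sketch, k, d, sigma) are illustrative only.

    # Hypothetical sketch of a BVNPE-style embedding (not the authors' code).
    # Steps follow the abstract: (1) kNN linkage, (2) keep only bidirectional
    # (mutual) edges -- the paper's additional "visible neighborhood preserving"
    # check is omitted here, (3) distance-based edge weights, (4) a spectral
    # embedding that maps nearby points on the graph to nearby low-dimensional points.
    import numpy as np
    from scipy.spatial.distance import cdist
    from scipy.linalg import eigh

    def bvnpe_sketch(X, k=10, d=2, sigma=1.0):
        n = X.shape[0]
        D = cdist(X, X)                          # pairwise Euclidean distances
        knn = np.argsort(D, axis=1)[:, 1:k + 1]  # k nearest neighbors (skip self)

        # Steps 1-2: keep an edge (i, j) only if the linkage is bidirectional,
        # i.e. j is among i's k NNs and i is among j's k NNs.
        A = np.zeros((n, n), dtype=bool)
        for i in range(n):
            A[i, knn[i]] = True
        A = A & A.T

        # Step 3: heat-kernel weights on the surviving edges (one common choice;
        # the paper's exact weighting scheme may differ).
        W = np.where(A, np.exp(-D**2 / (2.0 * sigma**2)), 0.0)

        # Step 4: spectral embedding preserving the weighted graph structure.
        # Minimizing sum_ij W_ij * ||y_i - y_j||^2 leads to the generalized
        # eigenproblem L y = lambda * Deg y with L = Deg - W.
        Deg = np.diag(W.sum(axis=1))
        L = Deg - W
        vals, vecs = eigh(L, Deg + 1e-9 * np.eye(n))
        return vecs[:, 1:d + 1]                  # drop the trivial constant eigenvector

    # Usage example: embed a noisy 3-D Swiss-roll-like cloud into 2-D.
    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        t = 3 * np.pi * (1 + 2 * rng.random(500)) / 2
        X = np.column_stack([t * np.cos(t), 20 * rng.random(500), t * np.sin(t)])
        Y = bvnpe_sketch(X, k=12, d=2)
        print(Y.shape)   # (500, 2)
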
URI: http://hdl.handle.net/10397/39805
ISBN: 978-1-60558-840-7
DOI: 10.1145/1734605.1734642
Appears in Collections: Conference Paper