Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/28202
Title: Structured large margin machine ensemble
Authors: Chan, PKP
Wang, D
Tsang, ECC
Yeung, DS
Keywords: Covariance analysis
Data structures
Learning (artificial intelligence)
Pattern classification
Support vector machines
Issue Date: 2006
Publisher: IEEE
Source: IEEE International Conference on Systems, Man and Cybernetics, 2006 (SMC '06), 8-11 October 2006, Taipei, p. 840-844
Abstract: Large margin classifiers have been widely applied to supervised learning problems. One representative model in large margin learning is the support vector machine (SVM). SVM is an unstructured classifier: the data structure information is underutilized, and the decision hyperplane depends exclusively on the support vectors. To incorporate the data covariance information into large margin learning, the structured large margin machine (SLMM) was recently proposed and shows better performance than the classical SVM in some applications. Instead of utilizing the data structure directly as SLMM does, the SVM ensemble (SVMe) improves the generalization ability of SVM in another way, by combining the outputs of a series of SVMs. Inspired by SVMe, we explore the ensemble counterpart of SLMM, i.e., SLMMe, and validate the effectiveness of the multiple-SLMM system. Experimental results on benchmark datasets demonstrate that SLMMe improves on SLMM by reducing its variance, and that SLMMe outperforms SVMe in most cases in terms of both classification accuracy and variance.
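Note: The SVM-ensemble (SVMe) baseline referenced in the abstract combines the outputs of several independently trained SVMs. The sketch below illustrates that general idea only, using scikit-learn's SVC as the base learner with bootstrap resampling and majority voting; it is not the SLMMe method proposed in the paper, and the dataset, ensemble size, and hyperparameters are arbitrary choices for illustration.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.RandomState(0)
# Synthetic binary classification data standing in for a benchmark dataset.
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

n_members = 11  # number of base SVMs in the ensemble (arbitrary odd number)
members = []
for _ in range(n_members):
    # Draw a bootstrap sample and fit one base SVM on it.
    idx = rng.randint(0, len(X_tr), size=len(X_tr))
    members.append(SVC(kernel="linear", C=1.0).fit(X_tr[idx], y_tr[idx]))

# Combine the members' outputs by unweighted majority vote on the test set.
votes = np.stack([clf.predict(X_te) for clf in members])   # shape: (n_members, n_test)
y_pred = (votes.mean(axis=0) >= 0.5).astype(int)           # majority for {0, 1} labels
print("ensemble accuracy:", (y_pred == y_te).mean())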
URI: http://hdl.handle.net/10397/28202
ISBN: 1-4244-0099-6
1-4244-0100-3 (E-ISBN)
DOI: 10.1109/ICSMC.2006.384493
Appears in Collections: Conference Paper