Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/15949
Title: Structured large margin learning
Authors: Wang, DF
Yeung, DS
Ng, WWY
Tsang, ECC
Wang, XZ
Keywords: Generalisation (artificial intelligence)
Learning (artificial intelligence)
Pattern classification
Issue Date: 2005
Publisher: IEEE
Source: Proceedings of the 2005 International Conference on Machine Learning and Cybernetics, Guangzhou, China, 18-21 August 2005, v. 7, p. 4242-4248
Abstract: This paper presents a new large margin learning approach, namely the structured large margin machine (SLMM), which combines the merits of "structured" learning models with the advantages of large margin learning schemes. The promising features of this model, such as enhanced generalization ability, scalability, extensibility, and noise tolerance, are demonstrated theoretically and empirically. SLMM is of theoretical importance because it generalizes learning models such as SVM, MPM, LDA, and M4. Moreover, it provides novel insight into the study of learning methods and forms a foundation for conceiving other "structured" classifiers.
URI: http://hdl.handle.net/10397/15949
ISBN: 0-7803-9091-1
DOI: 10.1109/ICMLC.2005.1527682
Appears in Collections:Conference Paper
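Background note: the abstract positions SLMM as a generalization of large margin classifiers, with SVM as the best-known special case. The sketch below is not the SLMM method of this paper; it is only a minimal illustration of the generic large-margin (SVM-style) family the abstract refers to, using scikit-learn's LinearSVC. The toy dataset, regularization value, and variable names are illustrative assumptions.

# Minimal sketch of large margin (SVM-style) classification.
# NOT the SLMM of Wang et al.; only the generic large-margin family
# (SVM) that the abstract cites as a special case. Toy data assumed.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
# Two Gaussian classes in 2-D (illustrative data).
X = np.vstack([rng.normal(-2.0, 1.0, size=(50, 2)),
               rng.normal(+2.0, 1.0, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

clf = LinearSVC(C=1.0)   # C trades margin width against training error
clf.fit(X, y)

# Signed distance to the separating hyperplane; large-margin learning
# seeks a hyperplane that keeps these distances large on both classes.
scores = clf.decision_function(X)
print("training accuracy:", clf.score(X, y))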