Title: Structured large margin learning
Authors: Wang, DF
Yeung, DS
Tsang, ECC
Wang, XZ
Keywords: Generalisation (artificial intelligence)
Learning (artificial intelligence)
Pattern classification
Issue Date: 2005
Publisher: IEEE
Source: Proceedings of the 2005 International Conference on Machine Learning and Cybernetics, 18-21 August 2005, Guangzhou, China, v. 7, p. 4242-4248
Abstract: This paper presents a new large margin learning approach, namely the structured large margin machine (SLMM), which incorporates both the merits of "structured" learning models and the advantages of large margin learning schemes. The promising features of this model, such as enhanced generalization ability, scalability, extensibility, and noise tolerance, are demonstrated theoretically and empirically. SLMM is of theoretical importance because it is a generalization of learning models such as SVM, MPM, LDA, and M4. Moreover, it provides a novel insight into the study of learning methods and forms a foundation for conceiving other "structured" classifiers.
ISBN: 0-7803-9091-1
DOI: 10.1109/ICMLC.2005.1527682
Appears in Collections: Conference Paper
