Title: Totally-corrective boosting using continuous-valued weak learners
Authors: Sun, C
Zhao, S
Hu, J
Lam, KM 
Keywords: Boosting
Column generation
Totally corrective
Issue Date: 2012
Publisher: IEEE
Source: 2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 25-30 March 2012, Kyoto, pp. 2049-2052
Abstract: The Boosting algorithm has two main variants: gradient Boosting and totally-corrective column-generation Boosting. The latter has recently received increasing attention because it exhibits better convergence and thus yields more efficient strong learners. In this work, we point out that totally-corrective column-generation Boosting is equivalent to the gradient-descent method of gradient Boosting in its weak-learner selection criterion, but additionally applies totally-corrective updates to the weak-learner weights. Consequently, other gradient-Boosting techniques that produce continuous-valued weak learners, e.g. step-wise direct minimization and Newton's method, can also be combined with the totally-corrective procedure. Taking the well-known AdaBoost algorithm as an example, we show that employing continuous-valued weak learners improves performance when combined with the totally-corrective weak-learner weight update.
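The two-stage scheme the abstract describes can be sketched as follows: pick a weak learner by the gradient criterion (maximum weighted edge under the current exponential-loss weights), then make the step "totally corrective" by re-optimizing the weights of all selected learners. This is a minimal illustrative sketch, not the paper's algorithm: the tanh stumps (as stand-ins for continuous-valued weak learners), the percentile thresholds, and the coordinate-wise damped Newton updates are all assumptions chosen for brevity.

```python
import numpy as np

def stump_predict(X, feat, thresh):
    # Continuous-valued weak learner: a smooth signed margin
    # instead of a hard +/-1 decision stump.
    return np.tanh(X[:, feat] - thresh)

def totally_corrective_boost(X, y, n_rounds=3, n_cd=50):
    """Toy totally-corrective boosting sketch on the exponential loss.

    Each round: (1) select the weak learner with the largest weighted
    edge (the gradient criterion); (2) totally-corrective step:
    re-optimize ALL learner weights by coordinate-wise damped Newton
    descent on the exponential loss.
    """
    n, d = X.shape
    learners, alphas = [], []
    for _ in range(n_rounds):
        # Sample weights induced by the current strong learner F.
        F = np.zeros(n)
        for (f, t), a in zip(learners, alphas):
            F += a * stump_predict(X, f, t)
        w = np.exp(-y * F)
        w /= w.sum()
        # (1) Weak-learner selection: maximize the weighted edge
        # over a small candidate pool of percentile thresholds.
        best, best_edge = None, -np.inf
        for f in range(d):
            for t in np.percentile(X[:, f], [25, 50, 75]):
                edge = np.abs(np.sum(w * y * stump_predict(X, f, t)))
                if edge > best_edge:
                    best_edge, best = edge, (f, t)
        learners.append(best)
        alphas.append(0.0)
        # (2) Totally-corrective update: coordinate descent over all
        # weights, each coordinate taking a damped Newton step.
        H = np.column_stack([stump_predict(X, f, t) for f, t in learners])
        a = np.array(alphas)
        for _ in range(n_cd):
            for j in range(len(a)):
                margin = y * (H @ a)
                g = -np.sum(np.exp(-margin) * y * H[:, j])   # gradient
                h = np.sum(np.exp(-margin) * H[:, j] ** 2) + 1e-12  # Hessian
                a[j] -= 0.5 * g / h  # damped Newton step for stability
        alphas = list(a)
    return learners, np.array(alphas)
```

In the stage-wise variant, only the newest weight would be optimized; here every past weight is revisited each round, which is what gives the totally-corrective family its faster convergence.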
ISBN: 978-1-4673-0045-2
978-1-4673-0044-5 (E-ISBN)
ISSN: 1520-6149
DOI: 10.1109/ICASSP.2012.6288312
Appears in Collections: Conference Paper


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.