Abstract
Boosting algorithms come in two main variants: gradient Boosting and totally-corrective column-generation Boosting. Recently, the latter has received increasing attention since it exhibits better convergence properties and thus yields more efficient strong learners. In this work, we point out that totally-corrective column-generation Boosting uses the same weak-learner selection criterion as the gradient-descent method for gradient Boosting, but adds totally-corrective updates of the weak-learner weights. Therefore, other techniques for gradient Boosting that produce continuous-valued weak learners, e.g., step-wise direct minimization and Newton's method, may also be combined with the totally-corrective procedure. We take the well-known AdaBoost algorithm as an example and show that employing continuous-valued weak learners improves performance when combined with the totally-corrective weak-learner weight update.
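The sketch below is a minimal, hypothetical illustration (not the authors' implementation) of the structure described in the abstract: each round selects a weak learner using the standard AdaBoost/gradient criterion, and a totally-corrective step then re-optimizes the weights of all selected learners under the exponential loss. Discrete decision stumps are used here for brevity; the continuous-valued weak learners discussed in the paper would replace the stump outputs. All function names and parameters are illustrative assumptions.

```python
# Hypothetical sketch of gradient-style weak-learner selection followed by a
# totally-corrective weight update (exponential loss). Not the paper's code.
import numpy as np
from scipy.optimize import minimize


def fit_stump(X, y, sample_weights):
    """Pick the feature/threshold/sign stump with the smallest weighted error."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1.0, -1.0):
                pred = np.where(X[:, j] <= thr, sign, -sign)
                err = np.sum(sample_weights * (pred != y))
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
    _, j, thr, sign = best
    return lambda Z: np.where(Z[:, j] <= thr, sign, -sign)


def exp_loss(alphas, H, y):
    """Exponential loss of the combined classifier sum_t alpha_t * h_t(x)."""
    margins = y * (H @ alphas)
    return np.mean(np.exp(-margins))


def totally_corrective_boost(X, y, n_rounds=10):
    learners, alphas = [], np.empty(0)
    H = np.empty((X.shape[0], 0))
    for _ in range(n_rounds):
        # Gradient-style selection: sample weights proportional to exp(-y*F(x)),
        # i.e. the negative-gradient direction of the exponential loss.
        w = np.exp(-y * (H @ alphas))
        w /= w.sum()
        h = fit_stump(X, y, w)
        learners.append(h)
        H = np.column_stack([H, h(X)])
        # Totally-corrective step: re-optimize ALL weak-learner weights jointly,
        # keeping them non-negative as in column-generation Boosting.
        x0 = np.append(alphas, 0.1)
        res = minimize(exp_loss, x0, args=(H, y),
                       bounds=[(0, None)] * len(x0), method="L-BFGS-B")
        alphas = res.x
    return learners, alphas


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
    learners, alphas = totally_corrective_boost(X, y, n_rounds=5)
    F = sum(a * h(X) for a, h in zip(alphas, learners))
    print("training accuracy:", np.mean(np.sign(F) == y))
```

Replacing the joint re-optimization with a single line search over the newly added weight would recover the usual stage-wise AdaBoost update, which is the comparison the abstract draws.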
Original language | English |
---|---|
Title of host publication | 2012 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2012 - Proceedings |
Pages | 2049-2052 |
Number of pages | 4 |
DOIs | |
Publication status | Published - 23 Oct 2012 |
Event | 2012 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2012 - Kyoto, Japan |
Duration | 25 Mar 2012 → 30 Mar 2012 |
Conference
Conference | 2012 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2012 |
---|---|
Country/Territory | Japan |
City | Kyoto |
Period | 25/03/12 → 30/03/12 |
Keywords
- Boosting
- column generation
- gradient
- totally corrective
ASJC Scopus subject areas
- Software
- Signal Processing
- Electrical and Electronic Engineering