Efficient Multiclass Boosting Classification with Active Learning

Jian Huang, Seyda Ertekin, Yang Song, Hongyuan Zha, and C. Lee Giles


We propose a novel multiclass classification algorithm, Gentle Adaptive Multiclass Boosting Learning (GAMBLE). The algorithm naturally extends the two-class Gentle AdaBoost algorithm to multiclass classification by using the multiclass exponential loss and a multiclass response encoding scheme. Unlike other multiclass algorithms, which reduce the K-class classification task to K binary classifications, GAMBLE handles the task directly and symmetrically, with only one committee classifier. We formally derive the GAMBLE algorithm with the quasi-Newton method, and prove the structural equivalence of the two regression trees in each boosting step. To scale up to large datasets, we utilize the generalized Query By Committee (QBC) active learning framework to focus learning on the most informative samples. Our empirical results show that with QBC-style active sample selection, we can achieve faster training time and potentially higher classification accuracy. GAMBLE's numerical superiority, structural elegance, and low computational complexity make it highly competitive with state-of-the-art multiclass classification algorithms.
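The two ingredients named in the abstract can be illustrated concretely. Below is a minimal sketch (not the authors' implementation): `response_encoding` builds the standard multiclass response coding used with the multiclass exponential loss (true class mapped to +1, all others to -1/(K-1), so each code vector sums to zero), and `qbc_select` shows one common QBC-style criterion, vote entropy over a committee's predictions, for picking the most informative samples. The function names and the choice of vote entropy as the disagreement measure are illustrative assumptions.

```python
import numpy as np

def response_encoding(labels, K):
    # Multiclass response coding for the multiclass exponential loss:
    # the true class gets +1, every other class gets -1/(K-1),
    # so each row (code vector) sums to zero.
    Y = np.full((len(labels), K), -1.0 / (K - 1))
    Y[np.arange(len(labels)), labels] = 1.0
    return Y

def qbc_select(committee_scores, n_query):
    # QBC-style active selection (illustrative): query the samples on
    # which the committee disagrees most, measured by the entropy of
    # the committee members' predicted-class votes.
    # committee_scores has shape (n_members, n_samples, n_classes).
    votes = np.argmax(committee_scores, axis=2)   # (n_members, n_samples)
    n_samples = votes.shape[1]
    entropy = np.zeros(n_samples)
    for c in range(committee_scores.shape[2]):
        p = (votes == c).mean(axis=0)             # vote fraction per sample
        nz = p > 0
        entropy[nz] -= p[nz] * np.log(p[nz])
    # Highest-entropy samples are the most informative to label next.
    return np.argsort(-entropy)[:n_query]
```

For example, with K=3 a label of class 0 is encoded as (1, -1/2, -1/2); samples where committee members split their votes evenly get the highest entropy and are queried first.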


Publication type: Inproceedings
Published in: Seventh SIAM International Conference on Data Mining (SDM 2007)
Publisher: Society for Industrial and Applied Mathematics