Gradient Boosting Learning of Hidden Markov Models

In this paper, we present a new training algorithm, gradient boosting learning, for Gaussian mixture density (GMD) based acoustic models. This algorithm is based on a function approximation scheme from the perspective of optimization in function space rather than parameter space, i.e., stage-wise additive expansions of GMDs are used to search for optimal models instead of gradient descent optimization of model parameters. In the proposed approach, the GMD starts from a single Gaussian and is built up by sequentially adding new components. Each new component is globally selected to produce the optimal gain in the objective function. MLE and MMI are unified under the H-criterion, which is optimized by the extended Baum-Welch (EBW) algorithm. A partial extended EM algorithm is developed for stage-wise optimization of new components. Experimental results on the WSJ task demonstrate that the new algorithm leads to improved model quality and recognition performance.
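The stage-wise component addition described in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm: the paper optimizes the H-criterion with EBW and a partial extended EM step, whereas the sketch below simply scores candidate components by their gain in total log-likelihood and adds the best one with a fixed mixing weight. The function names (`greedy_gmm`, `gauss_pdf`), the candidate pool, and the variance grid are all illustrative assumptions.

```python
import numpy as np

def gauss_pdf(x, mu, var):
    """Density of a 1-D Gaussian with mean mu and variance var."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def greedy_gmm(x, n_components=3, n_candidates=20, rng=None):
    """Grow a 1-D Gaussian mixture one component per stage (illustrative).

    Starts from the single maximum-likelihood Gaussian; at each stage a pool
    of candidate components (centered on randomly chosen data points, over a
    small grid of variances) is scored, and the candidate giving the largest
    gain in total log-likelihood is added with a fixed mixing weight.
    The paper instead selects components by gain in the H-criterion.
    """
    rng = np.random.default_rng(rng)
    mus, vars_, weights = [x.mean()], [x.var()], [1.0]
    for _ in range(n_components - 1):
        base = sum(w * gauss_pdf(x, m, v) for w, m, v in zip(weights, mus, vars_))
        alpha = 1.0 / (len(mus) + 1)  # fixed weight for the new component
        best_gain, best = -np.inf, None
        for mu_c in rng.choice(x, size=n_candidates, replace=False):
            for var_c in (x.var(), x.var() / 4.0, x.var() / 16.0):
                mix = (1.0 - alpha) * base + alpha * gauss_pdf(x, mu_c, var_c)
                gain = np.sum(np.log(mix)) - np.sum(np.log(base))
                if gain > best_gain:
                    best_gain, best = gain, (mu_c, var_c)
        mu_c, var_c = best
        weights = [w * (1.0 - alpha) for w in weights] + [alpha]
        mus.append(mu_c)
        vars_.append(var_c)
    return np.array(weights), np.array(mus), np.array(vars_)
```

On clearly bimodal data, one stage of this greedy growth already raises the training log-likelihood over the single-Gaussian start, which mirrors the paper's point that each globally selected component should yield a positive gain in the objective.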


In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP'06)

Publisher: Institute of Electrical and Electronics Engineers, Inc.
© 2007 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

Details

Type: Inproceedings
Address: Toulouse, France