KL-Divergence Regularized Deep Neural Network Adaptation For Improved Large Vocabulary Speech Recognition

Dong Yu, Kaisheng Yao, Hang Su, Gang Li, and Frank Seide

Abstract

We propose a novel regularized adaptation technique for context-dependent deep neural network hidden Markov models (CD-DNN-HMMs). The CD-DNN-HMM has a large output layer and many large hidden layers, each with thousands of neurons. The huge number of parameters in the CD-DNN-HMM makes adaptation a challenging task, especially when the adaptation set is small. The technique developed in this paper adapts the model conservatively by forcing the senone distribution estimated from the adapted model to be close to that from the unadapted model. This constraint is realized by adding Kullback–Leibler divergence (KLD) regularization to the adaptation criterion. We show that applying this regularization is equivalent to changing the target distribution in the conventional backpropagation algorithm. Experiments on Xbox voice search, short message dictation, and Switchboard and lecture speech transcription tasks demonstrate that the proposed adaptation technique can provide 2%-30% relative error reduction against the already very strong speaker-independent CD-DNN-HMM systems using different adaptation sets under both supervised and unsupervised adaptation setups.
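To make the "equivalent to changing the target distribution" statement concrete, here is a minimal sketch (not the authors' implementation; the function name `kld_regularized_targets`, the variable names, and the use of NumPy are illustrative assumptions). The key point is that the KLD-regularized criterion can be realized by replacing the usual backpropagation target with an interpolation between the adaptation label and the senone posterior produced by the frozen, unadapted (speaker-independent) model.

```python
# Sketch only, assuming senone posteriors are available as NumPy arrays.
import numpy as np

def kld_regularized_targets(hard_targets: np.ndarray,
                            si_posteriors: np.ndarray,
                            rho: float) -> np.ndarray:
    """Interpolated backprop target: (1 - rho) * label + rho * SI posterior.

    hard_targets  -- one-hot (or soft) senone labels from the adaptation alignment
    si_posteriors -- senone posteriors from the unadapted speaker-independent DNN
    rho           -- regularization weight in [0, 1]; rho = 0 recovers ordinary
                     fine-tuning, while rho = 1 keeps the adapted model at the
                     unadapted model's distribution.
    """
    return (1.0 - rho) * hard_targets + rho * si_posteriors
```

In a hypothetical adaptation loop, these interpolated targets would simply be fed to the standard cross-entropy loss in place of the original labels, so no change to the backpropagation machinery itself is needed; the KLD term only reshapes the targets.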

Details

Publication type: Inproceedings
Published in: ICASSP 2013