A New Minimum Divergence Approach to Discriminative Training

Jun Du, Peng Liu, Hui Jiang, Frank Soong, and Ren-Hua Wang

Abstract

We propose Minimum Divergence (MD), in which the acoustic similarity between HMMs is characterized by the Kullback-Leibler divergence (KLD), as a criterion for discriminative training. The MD objective function is defined as a posterior-weighted divergence measured over the whole training set. Unlike our earlier work, where the KLD-based acoustic similarity was pre-computed from the initial models and held fixed throughout optimization, here we propose to optimize the full MD objective jointly, since it is itself a function of the HMM parameters being adjusted. An Extended Baum-Welch (EBW) method is derived to minimize the whole MD objective function. The new MD formulation is evaluated on the TIDIGITS and Switchboard databases. Experimental results show that it yields relative word error rate reductions of 62.1% on TIDIGITS and 8.8% on Switchboard compared with the best ML-trained systems. It is also shown that the new MD criterion consistently outperforms other discriminative training criteria, such as Minimum Phone Error (MPE).
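The abstract states the criterion only in words. As a minimal sketch, assuming the usual notation of lattice-based discriminative training (none of these symbols appear in the original: $O_r$ is the $r$-th training utterance, $W_r$ its reference transcription, $W$ a competing hypothesis with posterior $P(W \mid O_r; \Lambda)$, and $\Lambda$ the set of HMM parameters), a posterior-weighted divergence objective of the kind described would take the form

\[
\mathcal{F}_{\mathrm{MD}}(\Lambda) \;=\; \sum_{r=1}^{R} \sum_{W} P\bigl(W \mid O_r; \Lambda\bigr)\, D\bigl(W_r \,\|\, W\bigr),
\]

where $D(W_r \,\|\, W)$ denotes the KLD-based acoustic dissimilarity between the HMMs of the reference and the hypothesis. For Gaussian output densities, a standard closed-form building block for such state-level terms is the KLD between two $d$-dimensional Gaussians:

\[
D_{\mathrm{KL}}\bigl(\mathcal{N}(\mu_1,\Sigma_1)\,\|\,\mathcal{N}(\mu_2,\Sigma_2)\bigr)
= \frac{1}{2}\left[\operatorname{tr}\!\left(\Sigma_2^{-1}\Sigma_1\right)
+ (\mu_2-\mu_1)^{\top}\Sigma_2^{-1}(\mu_2-\mu_1)
- d + \ln\frac{|\Sigma_2|}{|\Sigma_1|}\right].
\]

Because $\Lambda$ enters both the posterior weights and the divergence terms, EBW updates of $\Lambda$ optimize the whole objective jointly, which is the contrast the abstract draws with the earlier fixed-divergence formulation.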

Details

Publication type: Proceedings
Published in: Proc. of ICASSP 2007
Publisher: IEEE