Zhijie Yan, Peng Liu, Jun Du, Frank Soong, and Ren-Hua Wang
We propose to train Hidden Markov Models (HMMs) by allocating Gaussian kernels non-uniformly across states so as to optimize a selected discriminative training criterion. The optimal kernel allocation problem is first formulated under a non-discriminative, Maximum Likelihood (ML) criterion and then generalized to incorporate discriminative ones. An effective kernel exchange algorithm is derived and tested on TIDIGITS, a speaker-independent (man, woman, boy, and girl) connected-digit recognition database. Relative word error rate reductions of 46-51% are obtained compared with the conventional, uniformly allocated ML baseline. The recognition performance of discriminative kernel allocation is also consistently better than that of the non-discriminative, ML-based non-uniform kernel allocation.
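The abstract does not detail the kernel exchange algorithm itself, so the following is only a hedged illustration of the underlying idea: distributing a fixed total budget of Gaussian kernels non-uniformly across HMM states according to a per-state gain criterion. The sketch below uses a simple greedy allocation with an assumed diminishing-returns gain function as a stand-in for the paper's ML or discriminative criterion; the `gain` function, the `complexity` values, and the greedy scheme are all illustrative assumptions, not the authors' method.

```python
def allocate_kernels(gain, total_kernels, n_states, min_kernels=1):
    """Greedily allocate `total_kernels` Gaussian kernels across `n_states`
    HMM states.

    `gain(state, k)` is an assumed stand-in for the training-criterion gain
    of giving `state` its k-th kernel; gains are assumed to diminish in k.
    Every state keeps at least `min_kernels` kernels.
    """
    alloc = [min_kernels] * n_states
    remaining = total_kernels - min_kernels * n_states
    for _ in range(remaining):
        # Give the next kernel to the state with the largest marginal gain.
        best = max(range(n_states), key=lambda s: gain(s, alloc[s] + 1))
        alloc[best] += 1
    return alloc

# Toy example: states differ in "complexity"; the marginal gain of a
# state's k-th kernel is modeled as complexity / k (diminishing returns).
complexity = [4.0, 1.0, 2.0, 1.0]
alloc = allocate_kernels(lambda s, k: complexity[s] / k, 16, 4)
print(alloc)  # more kernels go to the more complex states
```

Under this toy gain model, a state with high "complexity" receives several more kernels than the simpler states, in contrast to a uniform baseline that would assign four kernels to each.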
Published in: Proc. of ISCSLP 2006
Publisher: Chinese and Oriental Language Information Processing Society
Copyright © 2007 by Chinese and Oriental Language Information Processing Society