Sharing for Continuous Hidden Markov Models

As one of the most powerful smoothing techniques, deleted interpolation has been widely used in both discrete and semi-continuous hidden Markov model (HMM) based speech recognition systems. For continuous HMMs, most smoothing techniques operate on the parameters themselves, such as the Gaussian mean or covariance parameters. In this paper, we propose to smooth the probability density values instead of the parameters of continuous HMMs. This allows us to use most of the existing smoothing techniques for both discrete and continuous HMMs. We also point out that our deleted interpolation can be regarded as a parameter sharing technique. We further generalize this sharing to the probability density function (PDF) level, in which each PDF becomes a basic unit that can be freely shared across any Markov state. For a wide range of dictation experiments, deleted interpolation reduced the word error rate by 11% to 23% relative to simple parameter smoothing techniques such as flooring. Generic PDF sharing further reduced the error rate by 3%.
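The core idea of smoothing density values rather than parameters can be sketched as follows. This is a minimal, hypothetical illustration (not the paper's implementation): a sparsely trained detailed Gaussian PDF is interpolated at the density level with a more robustly trained generic PDF, using a weight that deleted interpolation would estimate from held-out data. All parameter values below are invented for the example.

```python
import math

def gaussian_pdf(x, mean, var):
    """Evaluate a one-dimensional Gaussian density at x."""
    return math.exp(-(x - mean) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def interpolated_density(x, detailed, generic, lam):
    """Density-level smoothing: a convex combination of the detailed
    model's density and a generic model's density. In deleted
    interpolation, lam would be estimated on deleted (held-out) data;
    here it is simply given."""
    return lam * gaussian_pdf(x, *detailed) + (1.0 - lam) * gaussian_pdf(x, *generic)

# Hypothetical PDFs: a detailed (e.g. context-dependent) Gaussian with
# little training data, and a generic (e.g. context-independent) one.
detailed = (1.0, 0.2)   # (mean, variance)
generic = (0.8, 1.0)
p = interpolated_density(1.1, detailed, generic, lam=0.7)
```

Because the combination happens on density values, the same scheme applies uniformly to discrete output probabilities and to continuous densities, which is what lets existing discrete-HMM smoothing machinery carry over.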


Details

Type: Inproceedings