A Semantically Structured Language Model

  • Alex Acero,
  • Ye-Yi Wang,
  • Kuansan Wang

Special Workshop in Maui

In this paper we propose a semantically structured language model (SSLM) that significantly reduces the authoring effort required, relative to a traditional manually derived grammar, when developing a spoken language system. At the same time, the SSLM yields an understanding error rate roughly half that of the manually authored grammar. The proposed model combines the advantages of both statistical word n-grams and context-free grammars. When the SSLM acts directly as the recognizer's language model, the understanding error rate is significantly lower than when it is applied only to the output of a recognizer driven by a word n-gram language model.
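As a rough illustration of the combination the abstract describes (a simplified sketch, not the paper's actual model; the toy grammar, class names, and probabilities below are invented for illustration), one can score a sentence with an n-gram whose vocabulary mixes ordinary words with semantic-class nonterminals, where each nonterminal's word span is scored by its own internal rule probabilities:

```python
import math

# Hypothetical toy "grammar": each semantic class maps phrases to
# intra-class probabilities (a stand-in for CFG rule probabilities).
SEMANTIC_CLASSES = {
    "<City>": {("new", "york"): 0.6, ("boston",): 0.4},
}

# Hypothetical bigram probabilities over a mixed vocabulary of words
# and class nonterminals; "<s>"/"</s>" mark sentence boundaries.
BIGRAM = {
    ("<s>", "fly"): 0.5,
    ("fly", "to"): 0.8,
    ("to", "<City>"): 0.7,
    ("<City>", "</s>"): 0.9,
}

def segmentations(words):
    """Yield token sequences in which spans matching a semantic class
    are replaced by (nonterminal, intra-class probability) pairs."""
    if not words:
        yield []
        return
    # Option 1: keep the next word as a plain token.
    for rest in segmentations(words[1:]):
        yield [(words[0], 1.0)] + rest
    # Option 2: cover a matching span with a class nonterminal.
    for nt, phrases in SEMANTIC_CLASSES.items():
        for phrase, p in phrases.items():
            n = len(phrase)
            if tuple(words[:n]) == phrase:
                for rest in segmentations(words[n:]):
                    yield [(nt, p)] + rest

def log_prob(words):
    """Score a sentence as its best mixed word/nonterminal segmentation:
    bigram probability over tokens times the intra-class probabilities."""
    best = float("-inf")
    for seg in segmentations(words):
        tokens = ["<s>"] + [t for t, _ in seg] + ["</s>"]
        lp = sum(math.log(p) for _, p in seg)  # class-internal scores
        ok = True
        for prev, cur in zip(tokens, tokens[1:]):
            p = BIGRAM.get((prev, cur))
            if p is None:          # unseen bigram: prune this path
                ok = False
                break
            lp += math.log(p)
        if ok:
            best = max(best, lp)
    return best

print(log_prob(["fly", "to", "new", "york"]))  # scores "fly to <City>"
```

The sketch omits smoothing and integrated decoding, but the composition principle is the one the abstract points to: n-gram transitions over a sequence in which grammar nonterminals stand in for the word spans they cover, so the semantic structure is available directly from the recognized path.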