A Semantically Structured Language Model

Alex Acero, Ye-Yi Wang, and Kuansan Wang

Abstract

In this paper we propose a semantically structured language model (SSLM) that significantly reduces the authoring effort required to develop a spoken language system, compared with a traditional manually derived grammar. At the same time, the SSLM yields an understanding error rate roughly half that of the manually authored grammar. The proposed model combines the advantages of statistical word n-grams and context-free grammars. When the SSLM acts directly as the recognizer's language model, there is a significant reduction in understanding error rate over the case where it is applied only to the output of a recognizer driven by a word n-gram language model.
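To make the combination of word n-grams and context-free rules concrete, here is a minimal toy sketch in Python. It is not the paper's actual model; the vocabulary, probabilities, and the pre-segmented parse input are all illustrative assumptions. It shows one common way such hybrids are scored: a top-level n-gram generates a mix of words and semantic-class tokens, and each class token is expanded by a small CFG-like rule set.

```python
# Toy sketch of an n-gram whose vocabulary includes semantic-class tokens,
# each expanded by CFG-like rules. All names and numbers are illustrative.
from math import log, exp

# Top-level bigram over words and semantic-class tokens such as <CITY>.
BIGRAM = {
    ("<s>", "fly"): 0.6,
    ("fly", "to"): 0.9,
    ("to", "<CITY>"): 0.8,
    ("<CITY>", "</s>"): 0.7,
}

# CFG-style expansions for each semantic class, with rule probabilities.
CFG = {
    "<CITY>": {("maui",): 0.5, ("new", "york"): 0.5},
}

def score(parse):
    """Log-probability of a pre-segmented parse: a list of
    (top_level_token, expansion_words) pairs."""
    logp = 0.0
    prev = "<s>"
    for token, words in parse:
        logp += log(BIGRAM[(prev, token)])     # top-level n-gram step
        if token in CFG:                       # class expands via a CFG rule
            logp += log(CFG[token][tuple(words)])
        prev = token
    logp += log(BIGRAM[(prev, "</s>")])
    return logp

parse = [("fly", ()), ("to", ()), ("<CITY>", ("maui",))]
print(exp(score(parse)))  # probability of "fly to maui" under this parse
```

In this toy setting the class token both constrains recognition and labels the semantic slot, which is the intuition behind using such a model directly as the recognizer's language model rather than as a post-recognition parser.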

Details

Publication type: Inproceedings
Published in: Special Workshop in Maui
Address: Maui, Hawaii