Latent Semantic Modeling for Slot Filling in Conversational Understanding

Gokhan Tur, Asli Celikyilmaz, and Dilek Hakkani-Tur


In this paper, we propose a new framework for semantic template filling in a conversational understanding (CU) system. Our method decomposes the task into two steps: latent n-gram clustering using a semi-supervised latent Dirichlet allocation (LDA) model, and sequence tagging for learning semantic structures in a CU system. Latent semantic modeling has been investigated to improve many natural language processing tasks, such as syntactic parsing or topic tracking. However, due to complexity problems caused by issues such as utterance length and dialog corpus size, it has not been applied directly to semantic parsing tasks. In this paper, we tackle these complexities by first extending LDA with prior knowledge obtained from semantic knowledge bases. We then use the topic posteriors obtained from the new LDA model as additional constraints in a sequence learning model for the semantic template filling task. Our experimental results show significant performance gains on semantic slot filling models when features from the latent semantic models are used in a conditional random field (CRF).
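The two-step idea described above can be illustrated with a minimal sketch: per-token topic posteriors (here hand-written placeholder values, not output from the paper's actual LDA model) are appended to ordinary lexical features before handing them to a CRF tagger. All function and variable names below are hypothetical, chosen only for illustration.

```python
# Hypothetical sketch: combining lexical features with LDA topic
# posteriors as additional CRF features for slot filling.
# The posterior values are illustrative, not from the paper.

def token_features(tokens, topic_posteriors, i, n_topics=3):
    """Build a CRF feature dict for token i: standard lexical
    context features plus that token's latent-topic posterior."""
    feats = {
        "word": tokens[i].lower(),
        "prev_word": tokens[i - 1].lower() if i > 0 else "<s>",
        "next_word": tokens[i + 1].lower() if i < len(tokens) - 1 else "</s>",
    }
    # Each topic's posterior probability becomes a real-valued feature,
    # mirroring the idea of constraining the sequence model with
    # latent semantic evidence.
    for k in range(n_topics):
        feats[f"topic_{k}"] = topic_posteriors[i][k]
    # A coarse discrete feature: the most probable topic for this token.
    feats["top_topic"] = max(range(n_topics),
                             key=lambda k: topic_posteriors[i][k])
    return feats

tokens = ["show", "flights", "to", "boston"]
# Illustrative posteriors over 3 latent clusters (each row sums to 1).
posteriors = [
    [0.70, 0.20, 0.10],
    [0.10, 0.80, 0.10],
    [0.34, 0.33, 0.33],
    [0.05, 0.15, 0.80],
]
features = [token_features(tokens, posteriors, i)
            for i in range(len(tokens))]
```

In a full system, `posteriors` would come from inference under the knowledge-enriched LDA model, and the feature dictionaries would be fed to a CRF toolkit for training the slot tagger.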


Publication type: Inproceedings
Publisher: IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP)