Convolutional Neural Network Based Semantic Tagging with Entity Embeddings

  • Asli Celikyilmaz,
  • Dilek Hakkani-Tür

NIPS Workshop on Machine Learning for SLU & Interaction


Unsupervised word embeddings provide rich linguistic and conceptual information about words. However, they may provide only weak information about domain-specific semantic relations for tasks such as semantic parsing of natural language queries, where such information about words or phrases can be valuable. To encode prior knowledge about semantic word relations, we extend the neural network based lexical word embedding objective function by incorporating information about relationships between entities that we extract from knowledge bases [1]. In this paper, we focus on semantic tagging of conversational utterances as our end task and investigate two ways of using these embeddings: as additional features for a linear sequence learning method, Conditional Random Fields (CRF), and as initial embeddings for a convolutional neural network based CRF model (CNN-CRF) with shared feature layers and globally normalized sequence modeling components. While we obtain an average 2% improvement in F-score over previous baselines when the enriched embeddings are used as additional features for CRF models, we obtain slightly higher gains when the embeddings are used as initial word representations for the CNN-CRF model.
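One way to incorporate knowledge-base relations into an embedding objective, as the abstract describes, is to treat related entity pairs as extra co-occurrence pairs alongside the usual corpus windows. The sketch below illustrates this idea; the function names, the toy utterance, and the `located_in` triples are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch: augment skip-gram style (target, context) training
# pairs with entity-relation pairs extracted from a knowledge base, so that
# KB-related entities are drawn toward similar embeddings during training.

def corpus_pairs(tokens, window=2):
    """Standard skip-gram (target, context) pairs from a token list."""
    pairs = []
    for i, target in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

def kb_pairs(triples):
    """Treat each KB triple (head, relation, tail) as an extra
    (head, tail) co-occurrence pair, in both directions."""
    pairs = []
    for head, _relation, tail in triples:
        pairs.append((head, tail))
        pairs.append((tail, head))
    return pairs

# Toy conversational utterance and illustrative KB triples (assumptions).
tokens = "show me flights from boston to seattle".split()
triples = [("boston", "located_in", "massachusetts"),
           ("seattle", "located_in", "washington")]

# The combined pair list would feed the (negative-sampling) embedding trainer.
training_pairs = corpus_pairs(tokens) + kb_pairs(triples)
```

In practice the KB-derived pairs could be weighted or sampled at a different rate than corpus pairs; the abstract does not specify how the two sources are balanced.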