New Transfer Learning Techniques For Disparate Label Sets

  • Young-Bum Kim,
  • Karl Stratos,
  • Ruhi Sarikaya,
  • Minwoo Jeong

Association for Computational Linguistics (ACL)

In natural language understanding (NLU), a user utterance can be labeled differently depending on the domain or application (e.g., weather vs. calendar). Standard domain adaptation techniques are not directly applicable to take advantage of the existing annotations because they assume that the label set is invariant. We propose a solution based on label embeddings induced from canonical correlation analysis (CCA) that reduces the problem to a standard domain adaptation task and allows use of a number of transfer learning techniques. We also introduce a new transfer learning technique based on pretraining of hidden-unit CRFs (HUCRFs). We perform extensive experiments on slot tagging on eight personal digital assistant domains and demonstrate that the proposed methods are superior to strong baselines.
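The core idea of CCA-based label embeddings is to treat each label and its surrounding context as two correlated views, so that labels from different tag sets land in a shared low-dimensional space where standard domain adaptation applies. As a rough illustration (not the paper's exact induction procedure), the sketch below computes rank-k CCA projections from one-hot label indicators and context features via whitening and SVD; the toy data, feature dimensions, and regularizer `reg` are all illustrative assumptions:

```python
import numpy as np

def cca_projections(X, Y, k, reg=1e-3):
    """Rank-k CCA projections for two views.

    X: (n, d1) view, e.g. one-hot label indicators.
    Y: (n, d2) view, e.g. bag-of-words context features.
    Returns (A, B) so that X @ A and Y @ B lie in a shared k-dim space.
    """
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n = X.shape[0]
    # regularized covariance matrices of and between the two views
    Cxx = Xc.T @ Xc / n + reg * np.eye(X.shape[1])
    Cyy = Yc.T @ Yc / n + reg * np.eye(Y.shape[1])
    Cxy = Xc.T @ Yc / n
    # whiten each view with a Cholesky factor, then SVD the
    # cross-covariance to get the canonical directions
    Lx = np.linalg.cholesky(Cxx)          # Cxx = Lx @ Lx.T
    Ly = np.linalg.cholesky(Cyy)
    M = np.linalg.solve(Lx, Cxy) @ np.linalg.inv(Ly).T
    U, s, Vt = np.linalg.svd(M)
    A = np.linalg.solve(Lx.T, U[:, :k])   # (d1, k) label-side projection
    B = np.linalg.solve(Ly.T, Vt[:k].T)   # (d2, k) context-side projection
    return A, B

# Toy usage: 3 labels in one tag set, 5 context features.
rng = np.random.default_rng(0)
labels = rng.integers(0, 3, size=200)
X = np.eye(3)[labels]                                        # one-hot labels
Y = X @ rng.normal(size=(3, 5)) + 0.1 * rng.normal(size=(200, 5))
A, B = cca_projections(X, Y, k=2)
label_embeddings = A      # row i = 2-dim embedding of label i
print(label_embeddings.shape)
```

Because row i of `A` is the projection of label i's one-hot vector, it serves directly as that label's embedding; labels from a different domain's tag set, embedded the same way against shared context features, become comparable in the same space.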