Improved Large Margin Dependency Parsing via Local Constraints and Laplacian Regularization

  • Qin Iris Wang,
  • Colin Cherry,
  • Dan Lizotte,
  • Dale Schuurmans

Proceedings of the Tenth Conference on Computational Natural Language Learning (CoNLL-X)

Published by the Association for Computational Linguistics

We present an improved approach for learning dependency parsers from treebank data. Our technique is based on two ideas for improving large margin training in the context of dependency parsing. First, we incorporate local constraints that enforce the correctness of each individual link, rather than just scoring the global parse tree. Second, to cope with sparse data, we smooth the lexical parameters according to their underlying word similarities using Laplacian regularization. To demonstrate the benefits of our approach, we consider the problem of parsing Chinese treebank data using only lexical features, that is, without part-of-speech tags or grammatical categories. We achieve state-of-the-art performance, improving upon current large margin approaches.
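
The two ideas in the abstract lend themselves to a short illustration. The sketch below is not the authors' implementation: the four-word lexicon, the similarity matrix `S`, the per-word-pair weight matrix `W`, and the plain subgradient loop are all toy assumptions. It shows (1) a hinge constraint applied to each individual head-modifier link, so every gold link must outscore every competing head by a fixed margin, and (2) a Laplacian penalty, here `tr(Wᵀ L W)` with `L = D − S`, which pulls the weight rows of distributionally similar words toward one another.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 4-word lexicon with a symmetric, nonnegative similarity
# matrix S. In the paper the similarities come from corpus statistics;
# these values are placeholders.
V = 4
S = np.array([[0.0, 0.9, 0.1, 0.0],
              [0.9, 0.0, 0.2, 0.1],
              [0.1, 0.2, 0.0, 0.7],
              [0.0, 0.1, 0.7, 0.0]])
L = np.diag(S.sum(axis=1)) - S          # graph Laplacian: L = D - S

# One lexical weight per (head word, modifier word) pair -- a deliberately
# tiny stand-in for a real lexical feature set.
W = rng.normal(scale=0.01, size=(V, V))

# A one-sentence toy "treebank": word ids plus the gold head position of
# each modifier position.
words = [0, 1, 2, 3]
gold_head = {1: 0, 2: 1, 3: 2}          # modifier position -> head position

lam, eta = 0.1, 0.05                    # regularization strength, step size
for _ in range(200):
    # Gradient of lam * tr(W^T L W); L is symmetric, so this is 2*lam*L@W.
    # It smooths the rows of W across similar words.
    grad = 2.0 * lam * (L @ W)
    for m, h_star in gold_head.items():
        for h in range(len(words)):     # every competing head for modifier m
            if h in (m, h_star):
                continue
            # Local constraint: the gold link must beat this alternative
            # link by a margin of 1, independently of the rest of the tree.
            margin = W[words[h_star], words[m]] - W[words[h], words[m]]
            if margin < 1.0:            # violated -> hinge subgradient
                grad[words[h_star], words[m]] -= 1.0
                grad[words[h], words[m]] += 1.0
    W -= eta * grad

# After training, each gold link should outscore its alternatives.
print(W[0, 1] > W[2, 1], W[1, 2] > W[0, 2])
```

Because the Laplacian penalty equals a weighted sum of squared differences between the weight rows of similar words, rare words effectively borrow strength from similar frequent words, which is the smoothing effect the abstract describes for sparse lexical data.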