Competitive Generative Models with Structure Learning for NLP Classification Tasks

Kristina Toutanova

Abstract

In this paper we show that generative models are competitive with, and sometimes superior to, discriminative models when both kinds of models are allowed to learn structures that are optimal for discrimination. In particular, we compare Bayesian networks and conditional loglinear models on two NLP tasks. We observe that when the structure of the generative model encodes very strong independence assumptions (à la Naive Bayes), a discriminative model is superior, but when the generative model is allowed to weaken these independence assumptions by learning a more complex structure, it can achieve performance very similar to, or better than, that of a corresponding discriminative model. In addition, since structure learning is far more efficient for generative models, they may be preferable for some tasks.
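
The contrast the abstract describes can be made concrete with a small sketch. The Python example below (a minimal illustration using scikit-learn and synthetic data, not the paper's implementation) compares a Naive Bayes classifier against a logistic regression model on binary features that violate the class-conditional independence assumption, then relaxes that assumption by modeling the two dependent features jointly, a hand-picked stand-in for the edge a learned Bayesian network structure would add.

```python
# A minimal sketch, not the paper's implementation.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import CategoricalNB

rng = np.random.default_rng(0)

# Synthetic binary data: x1 is a noisy copy of x0, so x0 and x1 are
# strongly dependent given the class, violating the Naive Bayes assumption.
n = 4000
y = rng.integers(0, 2, size=n)
x0 = (rng.random(n) < np.where(y == 1, 0.8, 0.3)).astype(int)
x1 = np.where(rng.random(n) < 0.9, x0, 1 - x0)
x2 = (rng.random(n) < np.where(y == 1, 0.6, 0.4)).astype(int)
X = np.column_stack([x0, x1, x2])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Generative model with strong independence assumptions (Naive Bayes).
nb = CategoricalNB().fit(X_tr, y_tr)

# Discriminative model (a conditional loglinear model).
lr = LogisticRegression().fit(X_tr, y_tr)

# Weakened independence assumptions: model the two dependent features
# jointly as one 4-valued variable, standing in for adding the edge
# x0 -> x1 to the Bayesian network. (Here the dependency is hand-picked
# for illustration; the paper learns the structure.)
def merge(X):
    joint = X[:, 0] * 2 + X[:, 1]
    return np.column_stack([joint, X[:, 2]])

nb_struct = CategoricalNB().fit(merge(X_tr), y_tr)

print("Naive Bayes (independence assumed):", nb.score(X_te, y_te))
print("Logistic regression               :", lr.score(X_te, y_te))
print("Generative, relaxed structure     :", nb_struct.score(merge(X_te), y_te))
```

On data like this, Naive Bayes double-counts the evidence carried by the correlated pair, while both the discriminative model and the generative model with the relaxed structure avoid that error, mirroring the pattern the abstract reports.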

Details

Publication type: Inproceedings
Published in: Proceedings of EMNLP
URL: http://www.aclweb.org/
Publisher: Association for Computational Linguistics