Minimizing Expected Losses in Perturbation Models with Multidimensional Parametric Min-cuts

  • Adrian Kim,
  • Kyomin Jung,
  • Yongsub Lim,
  • Daniel Tarlow,
  • Pushmeet Kohli

Proceedings of the Thirty-First Conference on Uncertainty in Artificial Intelligence (UAI), 2015

Published by AUAI Press

We consider the problem of learning perturbation-based probabilistic models by computing and differentiating expected losses. This is a challenging computational problem that has traditionally been tackled using Monte Carlo-based methods. In this work, we show how a generalization of parametric min-cuts can be used to address the same problem, achieving higher accuracy and faster runtimes than a sampling-based baseline. Using our proposed Skeleton Method, we show how to learn the perturbation model so as to directly minimize expected losses. Experimental results show that this approach offers promise as a new way of training structured prediction models under complex loss functions.
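For intuition, the sketch below illustrates the Monte Carlo baseline the abstract refers to: a perturb-and-MAP model whose expected loss is estimated by sampling perturbations and averaging the loss of the resulting MAP labelings. The toy chain model, variable names, and brute-force MAP step are illustrative assumptions and not from the paper; in the paper the MAP step is computed by (parametric) min-cut and the Skeleton Method replaces the sampling.

```python
import itertools
import numpy as np

# Hypothetical toy setup: a binary chain MRF with n variables, unary
# parameters theta, and an attractive (submodular) pairwise coupling.
# MAP is brute-forced here for a tiny n; in practice it would be a min-cut.

rng = np.random.default_rng(0)
n = 6                                   # number of binary variables (toy size)
theta = rng.normal(size=n)              # unary parameters to be learned
coupling = 0.5                          # attractive pairwise strength
y_true = np.array([1, 1, 0, 0, 1, 0])   # hypothetical ground-truth labeling

def score(y, unary):
    """Score of a labeling: unary terms plus agreement bonus on a chain."""
    pair = coupling * np.sum(y[:-1] == y[1:])
    return unary @ y + pair

def map_assignment(unary):
    """Exact MAP by enumeration (stand-in for a min-cut solver)."""
    best = max(itertools.product([0, 1], repeat=n),
               key=lambda y: score(np.array(y), unary))
    return np.array(best)

def hamming_loss(y, y_ref):
    return np.mean(y != y_ref)

# Monte Carlo estimate of the expected loss under Gumbel perturbations
num_samples = 200
losses = []
for _ in range(num_samples):
    gamma = rng.gumbel(size=n)             # random perturbation of the unaries
    y_hat = map_assignment(theta + gamma)  # perturb-and-MAP sample
    losses.append(hamming_loss(y_hat, y_true))

print("Estimated expected Hamming loss:", np.mean(losses))
```

In this view, learning amounts to adjusting theta to reduce the estimated expected loss; the paper's contribution is to compute and differentiate that expectation via a generalization of parametric min-cuts rather than by sampling.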