Efficient gradient computation for conditional Gaussian models

Bo Thiesson and Christopher Meek

Abstract

We introduce Recursive Exponential Mixed Models (REMMs) and derive the gradient of the incomplete-data likelihood with respect to the parameters. We demonstrate how one can use probabilistic inference in Conditional Gaussian (CG) graphical models, a special case of REMMs, to compute the gradient for a CG model. We also demonstrate that this approach can yield simple and effective algorithms for computing the gradient for models with tied parameters, and illustrate it on stochastic ARMA models.
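
The abstract does not spell this out, but the link between probabilistic inference and gradient computation typically rests on the standard (Fisher) identity for incomplete data, sketched here with $y$ the observed data, $x$ the latent variables, and $\theta$ the parameters:

\[
\nabla_\theta \log p(y \mid \theta)
  = \mathbb{E}_{x \sim p(x \mid y,\, \theta)}\bigl[ \nabla_\theta \log p(x, y \mid \theta) \bigr].
\]

Under this identity, the expected complete-data statistics produced by inference in the graphical model are exactly the quantities the gradient requires, and for tied parameters the per-site gradient contributions simply add.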

Details

Publication type: Inproceedings
Published in: Proceedings of the Tenth International Workshop on Artificial Intelligence and Statistics
URL: http://www.vuse.vanderbilt.edu/~dfisher/ai-stats/society.html
Publisher: The Society for Artificial Intelligence and Statistics