Efficient gradient computation for conditional Gaussian models

We introduce Recursive Exponential Mixed Models (REMMs) and derive the gradient of the incomplete-data likelihood with respect to the parameters. We demonstrate how one can use probabilistic inference in Conditional Gaussian (CG) graphical models, a special case of REMMs, to compute the gradient for a CG model. We also demonstrate that this approach can yield simple and effective algorithms for computing the gradient for models with tied parameters and illustrate this approach on stochastic ARMA models.
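As general background (not the paper's specific derivation), the gradient of an incomplete-data likelihood is standardly obtained via Fisher's identity: with observed data y, latent variables z, and an exponential-family complete-data model, the incomplete-data gradient reduces to a difference of expected sufficient statistics, and the conditional expectation can be computed by probabilistic inference in the graphical model. A minimal sketch of that identity:

\[
\nabla_\theta \log p(y \mid \theta)
= \mathbb{E}_{p(z \mid y, \theta)}\!\left[ \nabla_\theta \log p(y, z \mid \theta) \right],
\]

and, for a complete-data model in exponential-family form \(p(y, z \mid \theta) = h(y,z)\exp\!\big(\theta^\top s(y,z) - A(\theta)\big)\),

\[
\nabla_\theta \log p(y \mid \theta)
= \mathbb{E}\!\left[ s(y,z) \mid y, \theta \right] - \mathbb{E}\!\left[ s(y,z) \mid \theta \right].
\]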

PDF: cg-gradient3.pdf

In  Proceedings of the Tenth International Workshop on Artificial Intelligence and Statistics

Publisher  The Society for Artificial Intelligence and Statistics
Copyright © 2005 by The Society for Artificial Intelligence and Statistics.

Details

Type  Inproceedings
URL  http://www.vuse.vanderbilt.edu/~dfisher/ai-stats/society.html