Learning mixtures of DAG models

Bo Thiesson, Christopher Meek, David Maxwell Chickering, and David Heckerman

Abstract

We describe computationally efficient methods for learning mixtures in which each component is a directed acyclic graphical model (mixtures of DAGs or MDAGs). We argue that simple search-and-score algorithms are infeasible for a variety of problems, and introduce a feasible approach in which parameter and structure search is interleaved and expected data is treated as real data. Our approach can be viewed as a combination of (1) the Cheeseman-Stutz asymptotic approximation for model posterior probability and (2) the Expectation-Maximization algorithm. We evaluate our procedure for selecting among MDAGs on synthetic and real examples.
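As a rough sketch of the scoring idea mentioned in the abstract (the symbols used here are assumptions not defined on this page: D denotes the observed incomplete data, D' the expected complete data produced by EM at the MAP parameters \hat{\theta}, and M a candidate model), the Cheeseman-Stutz approximation to the marginal likelihood of a model typically takes the form

\log p(D \mid M) \;\approx\; \log p(D' \mid M) + \log p(D \mid \hat{\theta}, M) - \log p(D' \mid \hat{\theta}, M),

where the complete-data term \log p(D' \mid M) can be evaluated efficiently because the expected data is treated as if it were real data. This is an illustration of the general technique named in the abstract, not the paper's exact criterion.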

Details

Publication type: Inproceedings
Published in: Proceedings of the Fourteenth Conference on Uncertainty in Artificial Intelligence
Pages: 504-513
Publisher: Morgan Kaufmann Publishers