Variational Message Passing and its Applications

John Winn

Abstract

This thesis is concerned with the development of Variational Message Passing (VMP), an algorithm for automatically performing variational inference in a probabilistic graphical model. VMP allows learning and reasoning about a system to proceed directly from a given probabilistic model of that system. The utility of VMP has been demonstrated by solving problems in the domains of machine vision and bioinformatics. VMP dramatically simplifies the construction and testing of new variational models and readily allows a range of alternative models to be tested on a given problem.

In Chapter 1, a probabilistic approach to automatic learning and reasoning is introduced. Belief propagation, an existing exact inference algorithm that uses message passing in a graphical model, is outlined, along with its limitations. These limitations lead to the need for approximate inference methods, including sampling methods and variational inference. The latter, variational inference, which provides an analytical approximation to the posterior distribution, is described in detail.

Chapter 2 presents a novel framework for performing automatic variational inference in a wide range of probabilistic models. The core of the framework is the Variational Message Passing algorithm, an analogue of belief propagation that uses message passing within a graphical model to optimise an approximate variational distribution. A software package, called VIBES (Variational Inference in BayESian networks), is presented as an implementation of the VMP framework, together with a tutorial demonstrating its application to a small data set.

In Chapter 3, the framework is applied to the problem of modelling non-linear image manifolds, such as those of face images and digit images. In Chapter 4, the problems of DNA microarray image analysis and gene expression modelling are addressed, again using the VMP framework.

Chapter 5 extends Variational Message Passing by allowing variational distributions which retain part of the dependency structure of the original model. The resulting Structured VMP algorithm is shown to improve the quality of the approximate inference and hence widen the applicability of the framework. Conclusions and suggestions for future research directions are presented in Chapter 6.
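For context on the variational approximation that VMP optimises, the following is a minimal sketch of the standard mean-field result underlying the algorithm; the notation is illustrative rather than taken from the thesis. Assuming observed data $\mathbf{D}$, latent variables $\mathbf{X}$, and a fully factorised variational distribution $q(\mathbf{X}) = \prod_j q_j(\mathbf{X}_j)$, the factor that maximises the variational lower bound satisfies

\[
\ln q_j^{*}(\mathbf{X}_j) \;=\; \mathbb{E}_{q(\mathbf{X}_{\setminus j})}\!\left[\, \ln p(\mathbf{X}, \mathbf{D}) \,\right] \;+\; \text{const},
\]

where the expectation is taken with respect to all factors other than $q_j$. In conjugate-exponential models this expectation involves only variables in the Markov blanket of $\mathbf{X}_j$, which is what allows each update to be computed by passing local messages in the graphical model.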

Details

Publication type: PhD thesis