Exact Sampling with Coupled Markov Chains and Applications to Statistical Mechanics
For many applications it is useful to sample from a finite set of
objects in accordance with some particular distribution. One approach
is to run an ergodic (i.e., irreducible aperiodic) Markov chain whose
stationary distribution is the desired distribution on this set; after
the Markov chain has run for M steps, with M sufficiently large,
the distribution governing the state of the chain approximates the
desired distribution. Unfortunately it can be difficult to determine
how large M needs to be. We describe a simple variant of this
method that determines on its own when to stop, and that outputs
samples in exact accordance with the desired distribution. The method
uses couplings, which have also played a role in other sampling
schemes; however, rather than running the coupled chains from the
present into the future, one runs from a distant point in the past up
until the present, where the distance into the past that one needs to
go is determined during the running of the algorithm itself. If the
state space has a partial order that is preserved under the moves of
the Markov chain, then the coupling is often particularly efficient.
Using our approach one can sample from the Gibbs distributions
associated with various statistical mechanics models (including Ising,
random-cluster, ice, and dimer) or choose uniformly at random from the
elements of a finite distributive lattice.
Random Structures and Algorithms, volume 9, numbers 1–2, pp. 223–252, 1996. Copyright © 1996 by John Wiley & Sons.