Perfectly Random Sampling with Markov Chains
If you would like to learn CFTP and/or Fill's algorithm, there are
several expositions to choose from. Many articles which develop new
coupling techniques for use in CFTP or Fill's algorithm also contain
brief explanations of these techniques.
Introduction and Scope

Random sampling has found numerous applications in physics, statistics, and computer science. Perhaps the most versatile method of generating random samples from a probability space is to run a Markov chain. But for how many steps? In most cases one simply does not know how many Markov chain steps are needed to get a sufficiently random state. There is a large literature of heuristic algorithms for inferring when enough steps have been taken, but they are non-rigorous, and one never knows for sure that an adequate number of steps has been taken. These heuristic algorithms are beyond the scope of this bibliography, but the interested reader is referred to some lecture notes by Sokal and the MCMC Preprint Service.

In the past decade there has been much research on obtaining rigorous bounds on how many Markov chain steps are needed to generate a random sample. Sometimes these bounds are tight, sometimes they are unduly pessimistic. The interested reader is referred to a survey by Diaconis and Saloff-Coste and a survey by Jerrum and Sinclair; the size of this literature makes it beyond the scope of this bibliography.

In recent years a large number of algorithms have been developed for sampling from the steady-state distribution of suitably well-structured Markov chains, requiring no a priori knowledge of how long the Markov chain takes to get mixed. These algorithms determine on their own, during run time, how many steps to run the Markov chain, and it is these algorithms that are the focus of this bibliography. Most of them return samples that are distributed exactly according to the stationary distribution of the Markov chain, but a few return samples that have some bias ε that the user can make as small as desired. Since the focus of this bibliography is on working computer algorithms, a symbol is placed next to those articles that contain simulation results or give sample outputs.
Each annotated entry contains links relevant to the paper, giving the article's abstract (click on the title) and authors' homepages when available, as well as links to online preprints. Several of the annotations were contributed by people other than the maintainer.

It has been pointed out that ``stationary stopping times'' may also be considered to be algorithms for sampling from a Markov chain's stationary distribution. However, these Markov chains typically have state spaces such as the symmetric group or the hypercube, for which one already knows how to effectively generate a random sample on a computer. Rather, the point of studying these stopping times is to understand interesting mathematical processes, such as shuffling a deck of cards. (A notable exception is Fill's algorithm.) Since the literature on these stopping times is sizable, only those articles with an algorithmic theme are included. The reader interested in stopping times is referred to the articles by Aldous and Diaconis and by Diaconis and Fill, which contain many references.

Also relevant to the present bibliography is a literature on backwards compositions of random maps. The backwards composition of random maps can be used to define a ``stochastically recursive sequence'' of points from the state space. The articles in this literature study when this sequence of points converges almost surely, since convergence implies the existence of a stationary distribution for the Markov chain. (Existence can be nontrivial for infinite state spaces.) If, in addition, 1) the convergence occurs after a finite number of steps, and 2) one can determine when this convergence has occurred, then one may use the technique of ``coupling from the past'', which is used in several of the articles listed below, to generate random samples from the state space. (There are numerous examples in which conditions 1 or 2 do not hold.)
Diaconis and Freedman give a survey of the literature on stochastically recursive sequences; a few representative articles are listed below.
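The ``coupling from the past'' technique described above can be sketched in a few lines of code. The following is a minimal illustration only, not the implementation of any particular paper: the four-state reflecting walk and the `update` rule are hypothetical stand-ins for a chain of interest. The same innovation u drives every state at a given time step (a grand coupling), and, crucially, the innovations are reused as the starting time is pushed further into the past. Conditions 1 and 2 above hold here because the composed map eventually becomes constant and constancy is easy to check.

```python
import random

# Hypothetical 4-state chain for illustration: a reflecting walk on {0,1,2,3}.
# update(state, u) is the random map; the same u is applied to every state.
def update(state, u):
    if u < 0.5:
        return min(state + 1, 3)   # step up, reflecting at the top
    return max(state - 1, 0)       # step down, reflecting at the bottom

def cftp(seed=0):
    """Coupling from the past: run every state forward from time -T to 0,
    doubling T (and reusing the old innovations) until the composed random
    map is constant; the common value is an exact stationary sample."""
    rng = random.Random(seed)
    us = []        # us[k] is the innovation for time -(k+1); reused as T grows
    T = 1
    while True:
        while len(us) < T:
            us.append(rng.random())
        states = list(range(4))
        for k in reversed(range(T)):          # apply maps at times -T, ..., -1
            states = [update(x, us[k]) for x in states]
        if len(set(states)) == 1:             # coalescence: the map is constant
            return states[0]
        T *= 2
```

Note that running forward in time until coalescence and returning the merged state would not give an exact sample; composing from the past with reused randomness is what makes the output exactly stationary.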
Annotated Bibliographic Entries
Andrei Broder. Generating random spanning trees. In 30th Annual Symposium on Foundations of Computer Science, pages 442-447, 1989.

David J. Aldous. A random walk construction of uniform spanning trees and uniform labelled trees. SIAM Journal on Discrete Mathematics, 3(4):450-465, 1990.

These two articles give the same independently discovered random-walk-based algorithm for generating random spanning trees of a graph. The algorithm uses a Markov chain on the set of spanning trees of an undirected graph to return a (perfectly) random spanning tree. Broder uses the algorithm to analyze the random walk on a ring. Aldous uses the algorithm to determine the properties of random trees and to compute some nontrivial probabilities pertaining to the random walk in the plane. (There is another random tree algorithm based on computing determinants.)

Søren Asmussen, Peter W. Glynn, and Hermann Thorisson. Stationary detection in the initial transient problem. ACM Transactions on Modeling and Computer Simulation, 2(2):130-157, 1992.

This paper explores what is possible and what is not, and was the first paper to show that it is possible to obtain unbiased samples from the steady-state distribution of a finite Markov chain by observing it, provided it is irreducible and one knows how many states it has. Equivalently, there is a universal randomized stationary stopping time that works for all (irreducible) Markov chains with a given (finite) number of states.

David J. Aldous. On simulating a Markov chain stationary distribution when transition probabilities are unknown. In D. J. Aldous, P. Diaconis, J. Spencer, and J. M. Steele, editors, Discrete Probability and Algorithms, volume 72 of IMA Volumes in Mathematics and its Applications, pages 1-9. Springer-Verlag, 1995.
Abstract: We present an algorithm which, given an n-state Markov chain whose steps can be simulated, outputs a random state whose distribution is within ε of the stationary distribution, using O(n) space and O(ε^{-2} τ) time, where τ is a certain ``average hitting time'' parameter of the chain.

Remarks: While not exact, this algorithm was much more efficient than the previous one, and directly stimulated the development of two subsequent exact sampling algorithms. This paper gives the only nontrivial lower bound on the running time of an algorithm for exact sampling from generic Markov chains.

Dana Randall and Alistair Sinclair. Testable algorithms for self-avoiding walks. In Proceedings of the Fifth Annual ACM-SIAM Symposium on Discrete Algorithms, pages 593-602, 1994. (postscript)

Abstract: We present a polynomial time Monte Carlo algorithm for almost uniformly generating and approximately counting self-avoiding walks in rectangular lattices Z^{d}. These are classical problems that arise, for example, in the study of long polymer chains. While there are a number of Monte Carlo algorithms used to solve these problems in practice, these are heuristic and their correctness relies on unproven conjectures. In contrast, our algorithm depends on a single, widely believed conjecture that is weaker than preceding assumptions, and, more importantly, is one which the algorithm itself can test. Thus our algorithm is reliable, in the sense that it either outputs answers that are guaranteed, with high probability, to be correct, or finds a counterexample to the conjecture.

László Lovász and Peter Winkler. Exact mixing in an unknown Markov chain. Electronic Journal of Combinatorics, 2, 1995. Paper #R15.

Abstract: We give a simple stopping rule which will stop an unknown, irreducible $n$-state Markov chain at a state whose probability distribution is exactly the stationary distribution of the chain.
The expected stopping time of the rule is bounded by a polynomial in the maximum mean hitting time of the chain. Our stopping rule can be made deterministic unless the chain itself has no random transitions.

Remarks: This paper gives the first (universal) exact sampling algorithm that runs in time that is polynomial in certain parameters associated with the Markov chain. Gives a deterministic stationary stopping time that works when the Markov chain itself is not deterministic. The paper also contains a pretty lemma on random trees that is of independent interest.

James G. Propp and David B. Wilson. Exact sampling with coupled Markov chains and applications to statistical mechanics. Random Structures and Algorithms, 9(1&2):223-252, 1996.

Abstract: For many applications it is useful to sample from a finite set of objects in accordance with some particular distribution. One approach is to run an ergodic (i.e., irreducible aperiodic) Markov chain whose stationary distribution is the desired distribution on this set; after the Markov chain has run for M steps, with M sufficiently large, the distribution governing the state of the chain approximates the desired distribution. Unfortunately it can be difficult to determine how large M needs to be. We describe a simple variant of this method that determines on its own when to stop, and that outputs samples in exact accordance with the desired distribution. The method uses couplings, which have also played a role in other sampling schemes; however, rather than running the coupled chains from the present into the future, one runs from a distant point in the past up until the present, where the distance into the past that one needs to go is determined during the running of the algorithm itself. If the state space has a partial order that is preserved under the moves of the Markov chain, then the coupling is often particularly efficient.
Using our approach one can sample from the Gibbs distributions associated with various statistical mechanics models (including Ising, random-cluster, ice, and dimer) or choose uniformly at random from the elements of a finite distributive lattice.

Remarks: This paper gives an algorithm, monotone coupling from the past, for exact sampling with Markov chains on huge state spaces. Since monotone-CFTP relies on the Markov chain having special structure, it is not ``universal'' as several of the above algorithms are, but it is practical for a surprising number of previously studied Markov chains. Includes simulation results for the random-cluster and dimer models.

Valen E. Johnson. Studying convergence of Markov chain Monte Carlo algorithms using coupled sample paths. Journal of the American Statistical Association, 91(433):154-166, 1996. (Postscript.)

Abstract: I describe a simple procedure for investigating the convergence properties of Markov Chain Monte Carlo sampling schemes. The procedure employs multiple runs from a sampler, using the same random deviates for each run. When the sample paths from all sequences converge, it is argued that approximate equilibrium conditions hold. The procedure also provides a simple diagnostic for detecting modes in multimodal posteriors. Several examples of the procedure are provided. In Ising models, the relation between the correlation parameter and the convergence rate of rudimentary Gibbs samplers is investigated. In another example, the effects of multiple modes on the convergence of coupled paths are explored using mixtures of bivariate normal distributions. The technique is also used to evaluate the convergence properties of a Gibbs sampling scheme applied to a model for rat growth rates (Gelfand et al. 1990).

Remarks: While technically not a paper on exact sampling, this paper investigates how the mixing time of a Markov chain may be inferred by running a large number of coupled simulations until they coalesce.
The initial states of the Markov chains are chosen at random, and if the probability of rejection in rejection sampling is known, then rigorous estimates of the mixing time are given. Includes the (independently made) observation that for monotone Markov chains, only two coupled states need to be simulated. Includes simulation results. Additional articles that take a similar approach are available from Johnson's homepage.

Michael Luby, Dana Randall, and Alistair Sinclair. Markov chain algorithms for planar lattice structures (extended abstract). In 36th Annual Symposium on Foundations of Computer Science, pages 150-159, 1995.

Abstract: Consider the following Markov chain, whose states are all domino tilings of a 2n X 2n chessboard: starting from some arbitrary tiling, pick a 2 X 2 window uniformly at random. If the four squares appearing in this window are covered by two parallel dominoes, rotate the dominoes in place. Repeat many times. This process is used in practice to generate a random tiling, and is a key tool in the study of the combinatorics of tilings and the behavior of dimer systems in statistical physics. Analogous Markov chains are used to randomly generate other structures on various two-dimensional lattices. This paper presents techniques which prove for the first time that, in many interesting cases, a small number of random moves suffice to obtain a uniform distribution.

Remarks: This paper gives three new Markov chains for sampling certain dimer and ice systems. The focus of this paper is provable running time bounds. Monotone-CFTP may be applied to each of their Markov chains to get an exact algorithm; when this is done, their proofs may be interpreted as a priori bounds on the running time of CFTP, though in practice the exact algorithm runs much more quickly than the bounds suggest.
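The monotone-CFTP idea invoked in the remarks above (and in the Johnson observation that only two coupled states need be simulated) can be sketched briefly. The chain below is a hypothetical reflecting walk on {0, ..., N} chosen for illustration, not the domino-tiling chain itself; the point is that when the update rule preserves the order on the state space, it suffices to track the bottom and top trajectories.

```python
import random

N = 10  # hypothetical chain on {0, ..., N}: a reflecting walk for illustration

def update(state, u):
    """Monotone update rule: if x <= y, then update(x, u) <= update(y, u)."""
    if u < 0.5:
        return min(state + 1, N)
    return max(state - 1, 0)

def monotone_cftp(seed=0):
    """Monotone CFTP: track only the trajectories started at the bottom (0)
    and top (N) states; monotonicity squeezes every other trajectory between
    them, so their meeting implies coalescence of the whole state space."""
    rng = random.Random(seed)
    us = []        # reused innovations; us[k] drives the step at time -(k+1)
    T = 1
    while True:
        while len(us) < T:
            us.append(rng.random())
        lo, hi = 0, N
        for k in reversed(range(T)):          # times -T, ..., -1
            lo, hi = update(lo, us[k]), update(hi, us[k])
        if lo == hi:                          # top met bottom: all coalesced
            return lo
        T *= 2
```

Only two trajectories are simulated per restart regardless of the size of the state space, which is what makes CFTP practical on huge partially ordered state spaces such as the set of tilings.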
Remarks: For the particular case of the 2n X 2n chessboard, it is much faster to generate a random spanning tree, and then use a Temperley-like bijection to convert it to a (perfectly) random domino tiling. Their methods apply to more general regions for which such a Temperleyan bijection does not exist.

James G. Propp and David B. Wilson. How to get a perfectly random sample from a generic Markov chain and generate a random spanning tree of a directed graph. Journal of Algorithms, 27:170-217, 1998. This article combines two conference papers, appearing in Proceedings of the Seventh Annual ACM-SIAM Symposium on Discrete Algorithms and Proceedings of the Twenty-Eighth Annual ACM Symposium on the Theory of Computing.

Abstract: A general problem in computational probability theory is that of generating a random sample from the state space of a Markov chain in accordance with the steady-state probability law of the chain. Another problem is that of generating a random spanning tree of a graph or spanning arborescence of a directed graph in accordance with the uniform distribution, or more generally in accordance with a distribution given by weights on the edges of the graph or digraph. This paper gives algorithms for both of these problems, improving on earlier results and exploiting the duality between the two problems. Each of the new algorithms hinges on the recently introduced technique of coupling from the past or on the linked notions of loop-erased random walk and ``cycle-popping''.

Remarks: This article gives what are so far the best algorithms for exact sampling on generic Markov chains, as well as the first application of CFTP to a huge non-monotone state space. It gives a universal randomized stationary stopping time that is within a constant factor of optimal, and another algorithm that is faster when the Markov chain can be simulated starting from any state rather than just observed in action.
Surprisingly, both algorithms are intimately related to the generation of random spanning trees of a weighted directed graph, and this paper gives tree algorithms that are both faster and more general than the Broder/Aldous algorithm. The algorithms use various versions of CFTP (voter-CFTP, coalescing-CFTP, cover-CFTP, and tree-CFTP), and another technique called cycle popping. (An upcoming book by Lyons and Peres makes use of the cycle-popping algorithm when analyzing essential spanning forests of infinite graphs.) Includes a sample output from the tree algorithm.

David Eppstein. Representing all minimum spanning trees with applications to counting and generation. Technical Report 95-50, U.C. Irvine, 1995.

Shows how to use the algorithms for random spanning tree generation to solve other random sampling problems. Per Eppstein's description: Shows how to find for any edge-weighted graph G an equivalent graph EG such that the minimum spanning trees of G correspond one-for-one with the spanning trees of EG. The equivalent graph can be constructed in time O(m+n log n) given a single minimum spanning tree of G. As a consequence one can find fast algorithms for counting, listing, and randomly generating MSTs. Also discusses similar equivalent graph constructions for shortest paths, minimum cost flows, and bipartite matching.

Stefan Felsner and Lorenz Wernisch. Markov chains for linear extensions, the two-dimensional case. In Proceedings of the Eighth Annual ACM-SIAM Symposium on Discrete Algorithms, pages 239-247, 1997. (full version)

Abstract: We study the generation of uniformly distributed linear extensions using Markov chains. In particular we show that monotone coupling from the past can be applied in the case of linear extensions of two-dimensional orders. For width two orders a mixing rate of O(n^3 log n) is proved. We conjecture that this is the mixing rate in the general case and support the conjecture by empirical data.
Along the way we obtain several nice structural results concerning Bruhat order and weak Bruhat order of permutations.

Remarks: Subsequent work confirmed their conjecture.

Wilfrid S. Kendall. Perfect simulation for the area-interaction point process. In L. Accardi and C. C. Heyde, editors, Probability Towards 2000, pages 218-234. Springer, 1998. Proceedings of the International Symposium, ``Probability Towards 2000'', held in 1995. (Postscript.)

Abstract: The ideas of Propp and Wilson ("Exact sampling with coupled Markov chains and applications to statistical mechanics", Random Structures and Algorithms, 1996, 9:223-252) for exact simulation of Markov chains are extended to deal with perfect simulation of attractive area-interaction point processes in bounded windows. A simple modification of the basic algorithm is described which provides perfect simulation of the non-monotonic case of the repulsive area-interaction point process. Results from simulations using a C computer program are reported; these confirm the practicality of this approach in both attractive and repulsive cases. The paper concludes by mentioning other point processes which can be simulated perfectly in this way, and by speculating on useful future directions of research.

Remarks: Shows how to do perfect simulations of the area-interaction point process, though the techniques extend to more general point processes. A unique feature of this application is that the Markov chain used is not uniformly ergodic, i.e., its mixing time is infinite. (The state space is an infinite partially ordered set with no top state, and one can find states from which the Markov chain takes arbitrarily long to equilibrate.) To deal with this problem, Kendall modified monotone-CFTP, replacing references to the (nonexistent) top state with references to a stochastically dominating process. Kendall also shows how to apply CFTP to repulsive point processes, which are antimonotone. Includes simulation results.

Wilfrid S. Kendall.
On some weighted Boolean models. In D. Jeulin, editor, Advances in Theory and Applications of Random Sets, pages 105-120. World Scientific Publishing Company, 1997. (postscript)

Following up on the previous article, shows how to apply CFTP to attractive birth-death processes and exclusion processes.

James A. Fill. An interruptible algorithm for perfect sampling via Markov chains. The Annals of Applied Probability, 8(1):131-162, 1998. (Postscript.) An extended abstract appeared in the Proceedings of the Twenty-Ninth Annual ACM Symposium on Theory of Computing.

Abstract: For a large class of examples arising in statistical physics known as attractive spin systems (e.g., the Ising model), one seeks to sample from a probability distribution π on an enormously large state space, but elementary sampling is ruled out by the infeasibility of calculating an appropriate normalizing constant. The same difficulty arises in computer science problems where one seeks to sample randomly from a large finite distributive lattice whose precise size cannot be ascertained in any reasonable amount of time. The Markov chain Monte Carlo (MCMC) approximate sampling approach to such a problem is to construct and run ``for a long time'' a Markov chain with long-run distribution π. But determining how long is long enough to get a good approximation can be both analytically and empirically difficult. Recently, Jim Propp and David Wilson have devised an ingenious and efficient algorithm to use the same Markov chains to produce perfect (i.e., exact) samples from π. However, the running time of their algorithm is an unbounded random variable whose order of magnitude is typically unknown a priori and which is not independent of the state sampled, so a naive user with limited patience who aborts a long run of the algorithm will introduce bias.
We present a new algorithm which (1) again uses the same Markov chains to produce perfect samples from π, but is based on a different idea (namely, acceptance/rejection sampling); and (2) eliminates user-impatience bias. Like the Propp-Wilson algorithm, the new algorithm applies to a general class of suitably monotone chains, and also (with modification) to ``antimonotone'' chains. When the chain is reversible, naive implementation of the algorithm uses fewer transitions but more space than Propp-Wilson. When fine-tuned and applied with the aid of a typical pseudorandom number generator to an attractive spin system on n sites using a random site updating Gibbs sampler whose mixing time tau is polynomial in n, the algorithm runs in time of the same order (bound) as Propp-Wilson [expectation O(tau log n)] and uses only logarithmically more space [expectation O(n log n), vs. O(n) for Propp-Wilson].

Peter W. Glynn and Philip Heidelberger. Bias properties of budget constrained simulations. Operations Research, 38(5):801-814, 1990.

Among other things, this article shows that so long as the user insists on waiting long enough to get at least one random sample, a deadline will not introduce bias. This holds for any sampling procedure, whether the deadline is in terms of Markov chain steps, real time, or any other measure.

James A. Fill. The move-to-front rule: a case study for two exact sampling algorithms. Probability in the Engineering and Informational Sciences, 12:283-302, 1998.

Abstract: The elementary problem of exhaustively sampling a finite population without replacement is used as a nonreversible test case for comparing two recently proposed MCMC algorithms for perfect sampling, one based on backward coupling and the other on strong stationary duality. The backward coupling algorithm runs faster in this case, but the duality-based algorithm is unbiased for user impatience.
An interesting byproduct of the analysis is a new and simple stochastic interpretation of a mixing-time result for the move-to-front rule. Remark: See also this paper.

O. Häggström, M. N. M. van Lieshout, and J. Møller. Characterisation results and Markov chain Monte Carlo algorithms including exact simulation for some spatial point processes. Bernoulli, 5(4):641-658, 1999. (Technical Report R-96-2040)

Abstract: The area-interaction process and the continuum random-cluster model are characterised in terms of certain functional forms of their respective conditional intensities. In certain cases, these two point process models can be derived from a bivariate point process model which in many respects is simpler to analyse and simulate. Using this correspondence we devise a two-component Gibbs sampler, which can be used for fast and exact simulation by extending the recent ideas of Propp and Wilson. We further introduce a Swendsen-Wang type algorithm. The relevance of the results within spatial statistics as well as statistical physics is discussed.

Remarks: Shows how to apply monotone-CFTP to sample from the Widom-Rowlinson point process. Here there is no top or bottom state, but (in contrast to Kendall's chain) such states can be adjoined to the state space. (The authors instead use two other states, already in the state space, that are just as effective at testing coalescence.) The Widom-Rowlinson model can be marginalized to obtain the attractive area-interaction point process and the continuum random-cluster model (with positive integral q). Sampling from the area-interaction process in this way turns out to be faster than (though not as generalizable as) Kendall's approach. The paper also describes a Swendsen-Wang type algorithm, and includes simulation results.

Jesper Møller. Markov chain Monte Carlo and spatial point processes. In W. S. Kendall, O. E. Barndorff-Nielsen, and M. N. M.
van Lieshout, editors, Stochastic Geometry: Likelihood and Computation, Monographs on Statistics and Applied Probability #80, pages 141-172. Chapman and Hall / CRC Press, 1998.

Surveys spatial point processes, including recent work on exact sampling, and includes simulation results.

James Propp. Generating random elements of a finite distributive lattice. Electronic Journal of Combinatorics, 4(2), 1997. Paper #R15. (Postscript.) arXiv:math.CO/9801066

Abstract: This survey article describes a method for choosing uniformly at random from any finite set whose objects can be viewed as constituting a distributive lattice. The method is based on ideas of the author and David Wilson for using ``coupling from the past'' to remove initialization bias from Monte Carlo randomization. The article describes several applications to specific kinds of combinatorial objects such as tilings, constrained lattice paths, and alternating-sign matrices.

Olle Häggström and Karin Nelander. Exact sampling from antimonotone systems. Statistica Neerlandica, 52:360-380, 1998. (Postscript.)

Abstract: A new approach to Markov chain Monte Carlo simulation was recently proposed by Propp and Wilson. This approach, unlike traditional ones, yields samples which have exactly the desired distribution. The Propp-Wilson algorithm requires this distribution to have a certain structure called monotonicity. In this paper an idea of Kendall is applied to show how the algorithm can be extended to the case where monotonicity is replaced by antimonotonicity. As illustrating examples, simulations of the hard-core model and the random-cluster model are presented.

Michael Luby and Eric Vigoda. Approximately counting up to four (extended abstract). In Proceedings of the Twenty-Ninth Annual ACM Symposium on Theory of Computing, pages 682-687, 1997. (Postscript.)

Abstract: We present a fully polynomial scheme to approximate the number of independent sets in graphs with maximum degree four.
In general, for graphs with maximum degree Δ ≥ 4, the scheme approximates a weighted sum of independent sets. The weight of each independent set is expressed in terms of a positive parameter λ ≤ 1/(Δ-3), where the weight of independent set S is λ^{|S|}. We also prove complementary hardness of approximation results, which show that it is hard to approximate the weighted sum for values of λ > c/Δ for some constant c > 0.

Remarks: Gives a new Markov chain for generating random independent sets with activity λ. When the maximum degree Δ ≥ 4 and λ ≤ 1/(Δ-3), they show that this Markov chain is rapidly mixing. CFTP may be efficiently applied to their Markov chain; when this is done, their proofs (but not the theorem itself) may be interpreted as a priori bounds on the running time of CFTP. An article by Dyer and Greenhill extends and improves on this one, though it is itself not connected to CFTP.

D. J. Murdoch and P. J. Green. Exact sampling from a continuous state space. Scandinavian Journal of Statistics, 25(3):483-502, 1998. (Postscript)

Abstract: Propp and Wilson (1995) described a protocol, called coupling from the past, for exact sampling from a finite distribution using a coupled Markov chain Monte Carlo algorithm. In this paper we extend coupling from the past to various MCMC samplers on a continuous state space; rather than following the monotone sampling device of Propp and Wilson, our approach uses methods related to gamma-coupling and rejection sampling to simulate the chain, and direct accounting of sample paths to check whether they have coupled.

Remarks: Describes a variety of scenarios for which CFTP may be applied to a (non-monotone) continuous state space, giving a sequence of algorithms that start out simple, but become increasingly sophisticated. The methods used are related to gamma-coupling and rejection sampling, and appear to be applicable to Bayesian parameter estimation.
Includes a case study comparing the performance of each technique when used to sample from the posterior distribution of a set of pump reliability parameters.

Wilfrid S. Kendall and Jesper Møller. Perfect simulation using dominating processes on ordered spaces, with application to locally stable point processes. Advances in Applied Probability, 32(3):844-865, 2000. (Postscript)

Abstract: In this paper we investigate the application of perfect simulation, in particular Coupling from The Past (CFTP), to the simulation of random point processes. We give a general formulation of the method of dominated CFTP and apply it to the problem of perfect simulation of general locally stable point processes as equilibrium distributions of spatial birth-and-death processes. We then investigate discrete-time Metropolis-Hastings samplers for point processes, and show how a variant which samples systematically from cells can be converted into a perfect version. An application is given to the Strauss point process. See also the document describing the implementation.

Haiyan Cai. A note on an exact sampling algorithm and Metropolis Markov chains. Technical report, University of Missouri, St. Louis, 1997.

Shows how to take a generic probability distribution on a space, together with a reference distribution from which it is already known how to sample, and construct a certain particular Metropolis-type monotone Markov chain. Coalescence occurs precisely when the rejection sampler with the same reference measure would accept.

S. G. Foss and R. L. Tweedie. Perfect simulation and backward coupling. Stochastic Models, 14(1-2):187-203, 1998. (Postscript.)

Abstract: Algorithms for perfect or exact simulation of random samples from the invariant measure of a Markov chain have received considerable recent attention following the introduction of the ``coupling-from-the-past'' (CFTP) technique of Propp and Wilson.
Here, we place such algorithms in the context of backward coupling of stochastically recursive sequences. We show that although general backward couplings can be constructed for chains with finite mean forward coupling times, and can even be thought of as extending the classical "Loynes schemes" from queuing theory, successful "vertical" CFTP algorithms such as those of Propp and Wilson can be constructed if and only if the chain is uniformly geometrically ergodic. We also relate the convergence moments for backward coupling methods to those of forward coupling times: the former typically lose at most one moment compared to the latter.

Remarks: Relates CFTP to more general stochastically recursive sequences, for which one has a sequence of random variables that with probability one converge to a particular value with the right distribution, but for which one isn't necessarily able to determine when this convergence has happened. Studies the moments of the convergence time of stochastically recursive sequences, some of which may be finite while others are infinite.

Robert B. Lund and David B. Wilson. Exact sampling algorithms for storage systems. In preparation.

Shows how to apply monotone-CFTP to sample from the steady-state distribution of certain storage systems. Rather than the linear dataflow used in traditional monotone-CFTP, the flow of data through the algorithm is two-dimensional.

Jesper Møller. Perfect simulation of conditionally specified models. Journal of the Royal Statistical Society, Ser. B, 61(1):251-264, 1999.

Abstract: We discuss how the ideas of producing perfect simulations based on coupling from the past for finite state space models naturally extend to multivariate distributions with infinite or uncountable state spaces such as auto-gamma, auto-Poisson and auto-negative-binomial models, using Gibbs sampling in combination with sandwiching methods originally introduced for perfect simulation of point processes.
Remarks: Shows how to apply CFTP to sample from a class of Markov random fields, in which the individual variables (from a finite, countable, or continuous space) depend upon one another in a monotone or anti-monotone way. For the continuous distributions, the user specifies an ε, which controls not the bias (which is 0), but instead the numerical precision of the output. The application to the auto-gamma distribution, which includes the pump-reliability application above, appears to be faster than the approach of Murdoch and Green. For the continuous distributions there is an even faster method, which also takes the numerical error ε to zero. The aforementioned sandwiching methods are the ones introduced for anti-monotone CFTP by Kendall and Häggström-Nelander. W. S. Kendall. Perfect Simulation for Spatial Point Processes. In Bulletin of the International Statistical Institute 51st Session, Istanbul (August 1997), volume 3, pages 163-166, 1997. Abstract: A survey is given of the ideas of perfect simulation involving Coupling From The Past (CFTP) as introduced by Propp and Wilson (1996), and its application to the perfect simulation of spatial point processes. A sketch is given of an extension to the perfect simulation of point processes defined over all of R^2. Sommaire: An exposition is given of the ideas of perfect simulation by means of Coupling From The Past (CFTP) following Propp and Wilson (1996), and of its application to the simulation of spatial point processes. An indication is given of how to carry out perfect simulation of point processes extended over all of R^2. S. G. Foss, R. L. Tweedie, and J. N. Corcoran. Simulating the invariant measures of Markov chains using backward coupling at regeneration times. Probability in the Engineering and Informational Sciences, 12:303-320, 1998. (Postscript.)
Abstract: We develop an algorithm for simulating approximate random samples from the invariant measure of a Markov chain using backward coupling of embedded regeneration times. Related methods have been used effectively for finite chains and for stochastically monotone chains: here we propose a method of implementation which avoids these restrictions by using a ``cycle-length'' truncation. We show that the coupling times have good theoretical properties and describe benefits and difficulties of implementing the methods in practice. Remarks: Gives a method for approximately sampling from the steady-state distribution of a Markov chain when the state of the chain takes on a certain particular value on a semi-regular basis. The Markov chain of interest is lifted to one that keeps track of (1) the original chain's state, and (2) the number of steps since the original chain's state took on that special value. The lifted chain is then truncated to force a return to the special state whenever the second coordinate gets too big. Using an independent set of coins for each value of time and each value of the second coordinate of the modified lifted chain, the authors show how to effectively test for coalescence. Includes a discussion of how to choose the truncation parameter, though this is not mechanized to the extent where the user could specify a desired maximum bias. Includes a case study on a two-server re-entrant queueing network. Elke Thönnes. Perfect simulation of some point processes for the impatient user. Advances in Applied Probability, Stochastic Geometry and Statistical Applications, 31(1):69-87, 1999. (Postscript.) Abstract: Recently Propp and Wilson have proposed an algorithm, called Coupling from the Past (CFTP), which allows not only approximate but perfect (i.e. exact) simulation of the stationary distribution of certain finite state space Markov chains.
Perfect sampling using CFTP has been successfully extended to the context of point processes by, amongst other authors, Häggström et al. In Häggström et al., Gibbs sampling is applied to a bivariate point process, the penetrable spheres mixture model. Fill also introduced an exact sampling algorithm which, in contrast to CFTP, is unbiased for user impatience. Fill's algorithm is a form of rejection sampling and, similar to CFTP, requires sufficient monotonicity properties of the transition kernel used. We show how Fill's version of rejection sampling can be extended to produce perfect samples of the penetrable spheres mixture process and related models. Following Häggström et al., we use Gibbs sampling and make use of the partial order of the mixture model state space. Thus we construct an algorithm which protects against bias caused by user impatience and which delivers samples not only of the mixture model but also of the attractive area-interaction and the continuum random-cluster process. James Propp and David Wilson. Coupling from the past: a user's guide. In D. Aldous and J. Propp, editors, Microsurveys in Discrete Probability, volume 41 of DIMACS Series in Discrete Mathematics and Theoretical Computer Science, pages 181-192. American Mathematical Society, 1998. (Postscript) An expository paper describing a variety of very different situations for which coupling from the past may be applied, and giving advice on recognizing additional applications. Includes some caveats for the practitioner. Olle Häggström and Karin Nelander. On exact simulation of Markov random fields using coupling from the past. Scandinavian Journal of Statistics, 26(3):395-411, 1999. Abstract: A general framework for exact simulation of Markov random fields using the Propp-Wilson coupling-from-the-past approach is proposed. Our emphasis is on situations lacking the monotonicity properties that have been exploited in previous studies.
A critical aspect is the convergence time of the algorithm; this we study both theoretically and experimentally. Our main theoretical result in this direction says, roughly, that if interactions are sufficiently weak, then the expected running time of a carefully designed implementation is O(N log N), where N is the number of interacting components of the system. Computer experiments are carried out for random q-colourings and for the Widom-Rowlinson lattice gas model. Some of the techniques in this paper were independently developed by Huber. J. van den Berg and J. E. Steif. On the existence and nonexistence of finitary codings for a class of random fields. The Annals of Probability 27(3):1501-1522, 1999. (Postscript.) Abstract: We study the existence of finitary codings (also called finitary homomorphisms or finitary factor maps) from a finite-valued i.i.d. process to certain random fields. For Markov random fields we show, using ideas of Marton and Shields, that the presence of a phase transition is an obstruction for the existence of the above coding: this yields a large class of Bernoulli shifts for which no such coding exists. Conversely, we show that for the stationary distribution of a monotone exponentially ergodic probabilistic cellular automaton such a coding does exist. The construction of the coding is partially inspired by the Propp-Wilson algorithm for exact simulation. In particular, combining our results with a theorem of Martinelli and Olivieri, we obtain the fact that for the plus state of the ferromagnetic Ising model on Z^{d}, d ≥ 2, there is (not) such a coding when the interaction parameter is below (above) its critical value. J. N. Corcoran and R. L. Tweedie. Perfect sampling of ergodic Harris chains. Annals of Applied Probability 11(2):438-451, 2001. (Postscript.) Abstract: We develop an algorithm for simulating ``perfect'' random samples from the invariant measure of a Harris recurrent Markov chain.
The method uses backward coupling of embedded regeneration times, and works most effectively for finite chains, or on continuous spaces for stochastically monotone chains, where paths may be sandwiched between ``upper'' and ``lower'' processes. Examples show that more naive approaches to constructing such bounding processes may be considerably biased, but that the algorithm can be simplified in certain cases to make it easier to run. We give explicit analytic bounds on the backward coupling times in the stochastically monotone case. An application of the algorithm to bounded storage models is given, and we also develop a random ``upper'' process for analyzing unbounded storage models. Remarks: For the given storage model application there are more effective algorithms. David B. Wilson. Annotated bibliography of perfectly random sampling with Markov chains. In D. Aldous and J. Propp, editors, Microsurveys in Discrete Probability, volume 41 of DIMACS Series in Discrete Mathematics and Theoretical Computer Science, pages 209-220. American Mathematical Society, 1998. Updated versions to appear at http://dimacs.rutgers.edu/~dbwilson/exact/. (Postscript.)
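Many of the entries above assume familiarity with the basic monotone CFTP protocol of Propp and Wilson. As a concrete point of reference, here is a minimal illustrative sketch (the function name and the toy chain are our own choices, not code from any of the cited papers): a reflecting nearest-neighbour walk on {0, ..., n}, whose update rule is monotone, sampled perfectly by running coupled chains from the minimal and maximal states ever further back into the past.

```python
import random

def monotone_cftp(n=10, seed=0):
    """Perfect sample from the stationary distribution (uniform on
    {0, ..., n}) of a reflecting nearest-neighbour random walk, via
    Propp and Wilson's monotone coupling from the past."""
    def step(x, u):
        # monotone update rule: the same coin moves every state the same way
        y = x - 1 if u < 0.5 else x + 1
        return max(0, min(n, y))

    rng = random.Random(seed)
    coins = []                   # coins[t] drives the step taken at time -(t+1)
    T = 1
    while True:
        while len(coins) < T:    # extend the shared randomness into the past
            coins.append(rng.random())
        lo, hi = 0, n            # chains from the minimal and maximal states
        for t in range(T - 1, -1, -1):   # run from time -T up to time 0
            lo = step(lo, coins[t])
            hi = step(hi, coins[t])
        if lo == hi:             # monotonicity: all chains in between coalesced too
            return lo
        T *= 2                   # not coalesced: go twice as far back
```

The essential points are that the same coins are reused at the same time steps on every attempt, and that the starting time is pushed back (here doubled) until the extremal chains coalesce.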
Morten Fismen. Exact simulation using Markov chains. Technical Report 6/98, Institutt for Matematiske Fag, 1998. Diploma thesis. (Postscript.) Abstract: This report gives a review of the new exact simulation algorithms using Markov chains. The first part covers the discrete case. We consider two different algorithms, Propp and Wilson's ``coupling from the past'' (CFTP) technique and Fill's rejection sampler. The algorithms are tested on the Ising model, with and without an external field. The second part covers continuous state spaces. We present several algorithms developed by Murdoch and Green, all based on ``coupling from the past''. We discuss the applicability of these methods to a Bayesian analysis problem of surgical failure rates. Peter J. Green and Duncan J. Murdoch. Exact sampling for Bayesian inference: towards general purpose algorithms (with discussion). In J. M. Bernardo, J. O. Berger, A. P. Dawid, and A. F. M. Smith, editors, Bayesian Statistics 6, pages 301-321. Oxford University Press, 1999. Presented as an invited paper at the 6th Valencia International Meeting on Bayesian Statistics, Alcossebre, Spain, June 1998. Preprint. Abstract: Propp and Wilson (1996) described a protocol, called coupling from the past, for exact sampling from a target distribution using a coupled Markov chain Monte Carlo algorithm. In this paper we discuss the implementation of coupling from the past for samplers on a continuous state space; our ultimate objective is Bayesian MCMC with guaranteed convergence. We make some progress towards this objective, but our methods are still not automatic or generally applicable. Per author's description: Describes several coupling techniques aimed at routine application in the context of Bayesian inference using random walk Metropolis: the ``bisection coupler'', calculation of automatic cell bounds, and the adaptation of the Kendall (1996)/Häggström et al. (1996) idea of bounded CFTP.
Includes simulations of the posterior in two 2-parameter blood type datasets with a Dirichlet prior, a 3-dimensional Dirichlet distribution, and an N(0,1) distribution. W. S. Kendall and Elke Thönnes. Perfect Simulation in Stochastic Geometry. Pattern Recognition 32(9):1569-1586, 1999. Special issue on random sets. (Postscript.) Abstract: Simulation plays an important rôle in stochastic geometry and related fields, because all but the simplest random set models tend to be intractable to analysis. Many simulation algorithms deliver (approximate) samples of such random set models, for example by simulating the equilibrium distribution of a Markov chain such as a spatial birth-and-death process. The samples usually fail to be exact because the algorithm simulates the Markov chain for a long but finite time, and thus convergence to equilibrium is only approximate. The seminal work by Propp and Wilson made an important contribution to simulation by proposing a coupling method, Coupling from the Past (CFTP), which delivers perfect, that is to say exact, simulations of Markov chains. In this paper we introduce this new idea of perfect simulation and illustrate it using two common models in stochastic geometry: the dead leaves model and a Boolean model conditioned to cover a finite set of points. J. N. Corcoran and R. L. Tweedie. Perfect sampling from independent Metropolis-Hastings chains. Journal of Statistical Planning and Inference 104(2):297-314, 2002. (Postscript) Abstract: ``Perfect sampling'' enables exact draws from the invariant measure of a Markov chain. We show that the independent Metropolis-Hastings chain has certain stochastic monotonicity properties that enable a perfect sampling algorithm to be implemented, at least when the candidate is overdispersed with respect to the target distribution. We prove that the algorithm has an optimal geometric convergence rate, and applications show that it may converge extremely rapidly. Persi Diaconis and David Freedman.
Iterated random functions. SIAM Review, 41(1):45-76, 1999. (Postscript, PDF.) Abstract: Iterated random functions are used to draw pictures or simulate large Ising models, among other applications. They offer a method for studying the steady state distribution of a Markov chain, and give useful bounds on rates of convergence in a variety of examples. The present paper surveys the field and presents some new examples. There is a simple unifying idea: the iterates of random Lipschitz functions converge if the functions are contracting on the average. The Editors. Graphical illustration of some examples related to the article ``Iterated random functions'' by Diaconis and Freedman. SIAM Review, 41(1):77-82, 1999. Abstract: Figures are presented to illustrate some examples of iterated random functions of the kind described in the article in this issue by Diaconis and Freedman. Jesper Møller and Katja Schladitz. Extensions of Fill's algorithm for perfect simulation. Journal of the Royal Statistical Society, Ser. B, 61(4):955-969, 1999. (Postscript.) Abstract: Fill's algorithm for perfect simulation for attractive finite state space models, unbiased for user impatience, is presented in terms of stochastic recursive sequences and extended in two ways. Repulsive discrete Markov random fields with two coding sets, like the auto-Poisson distribution on a lattice with 4-neighbourhood, can be treated as monotone systems if a particular partial ordering and quasi-maximal and quasi-minimal states are used. Fill's algorithm then applies directly. Combining Fill's rejection sampling with sandwiching leads to a version of the algorithm which works for general discrete conditionally specified repulsive models. Extensions to other types of models are briefly discussed. Mark Huber. Exact Sampling and Approximate Counting Techniques. In Proceedings of the 30th ACM Symposium on the Theory of Computing, pages 31-40, 1998.
(Postscript) (PDF) Per author's description: This paper introduces the idea of a bounding chain for determining when complete coupling has occurred, so that Coupling From the Past may be run on the chain to obtain exact samples. A bounding chain for the antiferromagnetic Potts model is given which is shown to couple completely in polynomial time given either a large number of colors or a sufficiently high temperature. The same analytical techniques also show that the single-site update models of Propp and Wilson (ferromagnetic Ising) and of Häggström and Nelander (antiferromagnetic Ising) run in polynomial time given high enough temperature. Some of the techniques in this paper were independently developed by Häggström and Nelander. Karin Nelander. A Markov chain Monte Carlo study of the beach model. Markov Processes and Related Fields 6(3):345-369, 2000. Abstract: We study the beach model introduced by Burton and Steif. The model is an example of a strongly irreducible subshift of finite type which, when the parameter of the model exceeds a critical value, has more than one measure of maximal entropy. We are interested in the critical value and how it depends on dimension. By way of simulations, we find that the critical value seems to be decreasing as a function of the dimension. We also present a conjecture for the whereabouts of the critical value in 2 dimensions. Our simulations are made with Markov chain Monte Carlo methods. For some parameter values and sizes of systems we are able to use the Propp-Wilson algorithm for exact simulation. For other combinations of parameter value and size we use a variant of the Swendsen-Wang algorithm. Krzysztof Burdzy and Wilfrid S. Kendall. Efficient Markovian couplings: examples and counterexamples. The Annals of Applied Probability 10(2):362-409, 2000. (Postscript) Abstract: In this paper we study the notion of an efficient coupling of Markov processes.
Informally, an efficient coupling is one which couples at the maximum possible exponential rate, as given by the spectral gap. This notion is of interest not only for its own sake, but is also of growing importance owing to the very recent advent of methods of perfect simulation: it helps to establish the ``price of perfection'' for such methods. In general one can always achieve efficient coupling if the coupling is allowed to cheat (if each component's behaviour is affected by future behaviour of the other component), but the situation is more interesting if the coupling is required to be co-adapted. We present an informal heuristic for the existence of an efficient coupling, and justify the heuristic by proving rigorous results and examples in the contexts of finite reversible Markov chains and of reflecting Brownian motion in planar domains. Mark Huber. A bounding chain for Swendsen-Wang. Random Structures and Algorithms 22(1):43-59, 2003. (PDF) A two-page version appeared in the Tenth Annual ACM-SIAM Symposium on Discrete Algorithms (1999). Abstract: The greatest drawback of Monte Carlo Markov chain methods is lack of knowledge of the mixing time of the chain. The use of bounding chains solves this difficulty for some chains by giving theoretical and experimental upper bounds on the mixing time. Moreover, when used with methodologies such as coupling from the past, bounding chains allow the user to take samples drawn exactly from the stationary distribution without knowledge of the mixing time. Here we present a bounding chain for the Swendsen-Wang process. The Swendsen-Wang bounding chain allows us to efficiently obtain exact samples from the ferromagnetic Q-state Potts model for certain classes of graphs. Also, by analyzing this bounding chain, we will show that Swendsen-Wang is rapidly mixing over a slightly larger range of parameters than was known previously. Duncan J. Murdoch and Jeffrey S. Rosenthal. Efficient use of exact samples.
Statistics and Computing 10(3):237-243, 2000. Presented at the 6th Valencia International Meeting on Bayesian Statistics, Alcossebre, Spain, June 1998. (Postscript) (Poster.) (Applet.) Abstract: Propp and Wilson (1996, 1998) described a protocol called coupling from the past (CFTP) for exact sampling from the steady-state distribution of a Markov chain Monte Carlo (MCMC) process. In it a past time is identified from which the paths of coupled Markov chains starting at every possible state would have coalesced into a single value by the present time; this value is then a sample from the steady-state distribution. Unfortunately, producing an exact sample typically requires a large computational effort. We consider the question of how to make efficient use of the sample values that are generated. In particular, we make use of regeneration events (cf. Mykland et al., 1995) to aid in the analysis of MCMC runs. In a regeneration event, the chain is in a fixed reference distribution; this allows the chain to be broken up into a series of tours which are independent, or nearly so (though they do not represent draws from the true stationary distribution). In this paper we consider using the CFTP and related algorithms to create tours. In some cases their elements are exactly in the stationary distribution; their length may be fixed or random. This allows us to combine the precision of exact sampling with the efficiency of using entire tours. Several algorithms and estimators are proposed and analysed. James Allen Fill, Motoya Machida, Duncan J. Murdoch, and Jeffrey S. Rosenthal. Extension of Fill's perfect rejection sampling algorithm to general chains. Extended abstract to appear in Neil Madras, editor, Monte Carlo Methods, volume 26 of Fields Institute Communications, pages 37-52. American Mathematical Society, 2000. Full version appears in a special double issue of Random Structures and Algorithms 17(3&4):290-316, 2000.
(Extended abstract) (Full paper) Abstract: By developing and applying a broad framework for rejection sampling using auxiliary randomness, we provide an extension of the perfect sampling algorithm of Fill (1998) to general chains on quite general state spaces, and describe how use of bounding processes can ease computational burden. Along the way, we unearth a simple connection between the Coupling From The Past (CFTP) algorithm originated by Propp and Wilson (1996) and our extension of Fill's algorithm.
Remarks:
The two versions of the four-authored paper supersede the 1998 preprint.
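The contrast between CFTP and Fill's rejection-style algorithm, which runs through the last few entries, can be made concrete on a toy chain. The following sketch is a hypothetical illustration of the monotone, reversible case only, with names of our own choosing (it is not code from the cited papers): Fill's algorithm for the reflecting random walk on {0, ..., n}, whose uniform stationary distribution makes the chain its own time reversal.

```python
import random

def fills_algorithm(n=10, seed=0):
    """Perfect sample from the stationary distribution (uniform on
    {0, ..., n}) of a reflecting nearest-neighbour walk, via Fill's
    perfect rejection sampling algorithm (monotone reversible case)."""
    rng = random.Random(seed)

    def step(x, u):
        # monotone update: coin below 1/2 moves down, otherwise up
        y = x - 1 if u < 0.5 else x + 1
        return max(0, min(n, y))

    T = 1
    while True:
        # 1. From the maximal state, run T steps of the reversed chain
        #    (this chain is reversible, so its reversal is itself);
        #    the endpoint is the candidate sample.
        path = [n]
        for _ in range(T):
            path.append(step(path[-1], rng.random()))
        candidate = path[-1]
        # 2. Impute coins consistent with the forward transitions
        #    candidate -> ... -> n, and drive the minimal chain with them.
        x = 0
        for k in range(T, 0, -1):
            a, b = path[k], path[k - 1]        # forward transition a -> b
            if b < a or a == b == 0:
                u = rng.random() / 2           # coin lands in [0, 0.5)
            else:
                u = 0.5 + rng.random() / 2     # coin lands in [0.5, 1)
            x = step(x, u)
        # 3. Accept the candidate iff the imputed coins drive the minimal
        #    chain all the way up to the maximal state; else retry bigger.
        if x == n:
            return candidate
        T *= 2
```

Rejection leaves no trace in the output: aborting during a long attempt introduces no bias, which is the interruptibility property contrasted with CFTP in the remarks above.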
Olle Häggström and Jeffrey E. Steif. Propp-Wilson algorithms and finitary codings for high noise Markov random fields. Combinatorics, Probability and Computing 9(5):425-439, 2000. (Postscript.)
Abstract:
In this paper, we combine two previous works, the first being by
the first author and K. Nelander, and the second by
J. van den Berg and the second author, to show (1) that one can carry out
a Propp-Wilson exact simulation for all Markov random fields
on James P. Hobert, Christian P. Robert, and D. M. Titterington. On perfect simulation for some mixtures of distributions. Statistics and Computing 9(4):287-298, 1999. (Postscript.) Abstract: This paper studies the implementation of the coupling from the past (CFTP) method of Propp and Wilson (1996) in the setup of two- and three-component mixtures with known components. We show that monotonicity structures can be exhibited in both cases, but that CFTP can still be costly for three-component mixtures. We conclude with a simulation experiment exhibiting an almost perfect sampling scheme where we only consider a subset of the exhaustive set of starting values. C. C. Holmes and B. K. Mallick. Perfect simulation for orthogonal model mixing, 1998. Preprint. Abstract: In this article we demonstrate how to generate independent and identically distributed samples from the model space of the Bayes linear model with orthogonal predictors. We use the method of coupled Markov chains from the past as introduced by Propp and Wilson (1996). This procedure alleviates any concerns over convergence and sample mixing. We present a number of examples, including a perfect simulation of Bayesian wavelet selection in a 1024-dimensional model space, a knot selection problem for spline smoothers, and a standard linear regression variable selection problem.
Updated to: A. Mira, J. Møller, and G. O. Roberts. Perfect slice samplers. Journal of the Royal Statistical Society, Ser. B, 63(3):593-606, 2001. (Postscript.) Abstract: Perfect sampling allows exact simulation of random variables from the stationary measure of a Markov chain. By exploiting monotonicity properties of the slice sampler we show that a perfect version of the algorithm can be easily implemented, at least when the target distribution is bounded. Various extensions, including perfect product slice samplers, and examples of applications are discussed. Christian P. Robert, editor. Discretization and MCMC Convergence Assessment. Lecture Notes in Statistics #135. Springer, 1998. Chapter 1, ``Markov chain Monte Carlo methods'' by Christian P. Robert and Sylvia Richardson, has a section 4 on ``Perfect sampling.'' James A. Fill and Motoya Machida. Stochastic monotonicity and realizable monotonicity. Technical Report 573, Department of Mathematical Sciences, The Johns Hopkins University, 1998. (Postscript.) Abstract: We explore and relate two notions of monotonicity, stochastic and realizable, for a system of probability measures on a common finite partially ordered set (poset) S when the measures are indexed by another poset A. We give counterexamples to show that the two notions are not always equivalent, but for various large classes of S we also present conditions on the poset A that are necessary and sufficient for equivalence. When A = S, the condition that the cover graph of S have no cycles is necessary and sufficient for equivalence. This case arises in comparing the applicability of the perfect sampling algorithms of Propp and Wilson and of the first author of the present paper. Mark Huber. Exact random sampling from independent sets, 1998. Preprint.
Abstract: We consider the problem of randomly sampling from the set of independent sets of a graph, where the probability of selecting a particular independent set is proportional to λ raised to the size of the independent set. This distribution is the hard core gas model in statistical physics. If Δ is the maximum degree of the graph, Dyer and Greenhill gave a method for efficiently approximately sampling from this distribution when λ ≤ 2/(Δ-2) by utilizing a rapidly mixing Markov chain. We show here how to use their chain to exactly sample from this distribution. This method is shown to run slightly faster than the method of Dyer and Greenhill, and as a corollary gives a better bound on the mixing time of their chain. Mark Huber. Interruptible exact sampling and construction of strong stationary times for Markov chains, 1998. Preprint. Abstract: We consider the problem of exactly sampling from the stationary distribution of a Markov chain, a problem with numerous Monte Carlo applications. The earliest method for accomplishing this was the construction of a strong stationary time, a stopping time that, when reached, leaves the chain in the stationary distribution. Unfortunately, for many chains of interest these stopping times were very difficult to compute. Here we show how an algorithm of Murdoch and Rosenthal leads to the construction of strong stationary times for many chains of practical interest. In addition, we analyze the running time of their algorithm, and compare it to the running time of another successful algorithm for exact sampling, coupling from the past. We show how to test whether these stopping times have been reached for several chains for distributions of interest: the hard core gas model, proper k-colorings of a graph, and the continuous Widom-Rowlinson mixture model. Armin S. A. Rährl. Fast, portable, parallel and scalable exact simulation using Markov chains, 1999. Preprint.
Abstract: Markov chain Monte Carlo samplers have become popular tools for Bayesian computation. However, these techniques are computationally expensive. In order to speed them up, this paper presents portable, parallel and linearly scalable computer implementations in the Bulk Synchronous Parallel model and in Java. Optimality in scalability and asymptotic optimality in speed are proved. It is shown how this parallel approach can be used for exact sampling using Propp and Wilson's algorithm. In 40 seconds it performs more than one million iterations using an 8192 × 8192 transition matrix of double precision floating point numbers on an SGI with 32 R10000 processors. For time-critical computational applications, efficient use is made of the sample values generated via a repeated coupling-from-the-past algorithm. Therefore many more samples per unit time are obtained, though these samples are not drawn from the true stationary distribution. How to develop and analyze a parallel solution in two rather different models of computation is explained. Reference to the complete code with documentation is provided in [20]. Wilfrid S. Kendall and Jesper Møller. Perfect implementation of a Metropolis-Hastings simulation of Markov point processes, 1999. Preprint. Abstract: This document describes a C implementation of a perfect algorithm for a Metropolis-Hastings simulation of a point process, based on
A full Nuweb source for this program (in gzipped tarfile form, together with figures) is available at my [Kendall's] [homepage of available perfect simulation programs].
Motoya Machida. Stochastic Monotonicity and Realizable Monotonicity. PhD thesis, The Johns Hopkins University, Department of Mathematical Sciences, 1999.
Abstract:
Given particular partial orderings, however, these two notions of monotonicity can be equivalent. For various large classes of partially ordered sets (posets) S, we present various conditions on the poset A that are necessary and sufficient for equivalence. Furthermore, for a certain broad class of acyclic posets S, we develop inverse probability transforms from [0,1) into S and show how to utilize them to explicitly construct systems (X_{a}: a in A) with the monotonicity property from a single uniform random variable U. If A = S and both are equipped with the same partial ordering, the condition that the poset is acyclic is necessary and sufficient for monotonicity equivalence. This case arises in comparing applicability of two perfect sampling algorithms, one via coupling from the past introduced by Propp and Wilson, and the other based on rejection sampling introduced by Fill.
Jesper Møller and G. K. Nicholls. Perfect simulation for sample-based inference. Statistics and Computing, to appear. (PDF) Abstract: Perfect simulation algorithms based on Propp and Wilson (1996) have so far been of limited use for sampling problems of interest in statistics. We specify a new family of perfect sampling algorithms obtained by combining MCMC tempering algorithms with dominated coupling from the past, and demonstrate that our algorithms will be useful for sample-based inference. Perfect tempering algorithms are less efficient than the MCMC algorithms on which they typically depend. However, samples returned by perfect tempering are distributed according to the intended distribution, so that these new sampling algorithms do not suffer from the convergence problems of MCMC. Perfect tempering is related to rejection sampling. When rejection sampling has been tried but has proved impractical, it may be possible to convert the rejection algorithm into a perfect tempering algorithm, with a significant gain in algorithm efficiency. Mark Huber. The swap move: a tool for building faster Markov chains, 1999. Preprint. Abstract: In Monte Carlo Markov chain algorithms, a random sample is generated by running a Markov chain ``for a long time''. This technique has applications in many areas, including statistical physics and approximation algorithms. Two questions about this procedure arise: first, how long does the chain need to be run before the sample obtained is close to being in the desired distribution, and secondly, how should chains be designed to make the running time as low as possible? Here we present a method for increasing the speed of Markov chains that we call the swap move. First applied by Broder to approximate the permanent, here we apply it to several models from statistical physics. In each case we are able to create improved Markov chains for the problem.
More importantly, we show how the concept of bounding chains may be applied to the swap move, thus allowing the use of perfect sampling algorithms such as coupling from the past. These algorithms eliminate the need to know the mixing time of the Markov chain, making these new chains always faster to use.
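The bounding-chain idea used in several of Huber's entries above can be illustrated on the hard core gas model mentioned earlier. The sketch below is our own toy implementation with hypothetical names, in the spirit of the Häggström-Nelander and Huber constructions (not code from the cited papers): CFTP for single-site heat-bath dynamics on the independent sets of a small graph, tracking for each vertex a value 0, 1, or '?' (unknown); when no '?' remains, every trajectory, whatever its starting state, has reached the same configuration.

```python
import random

def bounding_chain_cftp(adj, lam=1.0, seed=0):
    """Perfect sample from the hard core model (independent sets weighted
    by lam^|set|) on a graph, via CFTP with a {0, 1, '?'} bounding chain
    for the single-site heat-bath dynamics."""
    rng = random.Random(seed)
    n = len(adj)
    p = lam / (1.0 + lam)        # probability of proposing "occupied"
    moves = []                    # shared randomness: (vertex, coin) pairs
    T = 1
    while True:
        while len(moves) < T:
            moves.append((rng.randrange(n), rng.random()))
        state = ['?'] * n         # bounding state: 0, 1, or unknown
        for t in range(T - 1, -1, -1):       # from time -T up to time 0
            v, u = moves[t]
            if u >= p:
                state[v] = 0      # "empty" proposal succeeds in every chain
            elif all(state[w] == 0 for w in adj[v]):
                state[v] = 1      # no neighbour can be occupied: accept
            elif any(state[w] == 1 for w in adj[v]):
                state[v] = 0      # some neighbour certainly occupied: reject
            else:
                state[v] = '?'    # outcome depends on the unknown true state
        if '?' not in state:
            return state          # all trajectories have coalesced
        T *= 2
```

Here `adj` is an adjacency list and `lam` is the activity λ; for λ = 1 the sampler draws uniformly from the independent sets of the graph.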
Mark Huber. Perfect sampling without a lifetime commitment, 1999. Preprint. Abstract: Generating perfect samples from distributions using Markov chains has a wide range of applications, from statistical physics to approximation algorithms. In perfect sampling algorithms, a sample is drawn exactly from the stationary distribution of a chain, as opposed to methods that run the chain ``for a long time'' and create samples drawn from a distribution that is close to the stationary distribution. One of the primary methods for creating perfect samples, the coupling from the past protocol of Propp and Wilson, suffers from the fact that the user must be willing to commit to running the algorithm in its entirety in order to obtain unbiased, perfect samples. By using another method, the acceptance/rejection method of Fill, Murdoch, and Rosenthal (FMR), the user may abort the procedure in the middle of a run without introducing bias. In this paper, we give the first general analysis of the running time of FMR compared to the running time of CFTP. Furthermore, we show that FMR can be used to generate a strong stationary stopping time of a Markov chain. We show how to use FMR to generate perfect sampling algorithms for two real-world examples, the hard core gas model and the Widom-Rowlinson mixture model. These two examples illustrate the techniques needed to apply FMR to a wide range of discrete and infinite state space chains. Remarks: Two references for the general form of Fill's algorithm are (1) ``Fill'' (his original article) and (2) ``Fill, Machida, Murdoch, and Rosenthal'' (which supersedes the preprint by Murdoch and Rosenthal).
David B. Wilson. How to couple from the past using a read-once source of randomness. Random Structures and Algorithms 16(1):85–113, 2000. arXiv:math.PR/9910050 Abstract: We give a new method for generating perfectly random samples from the stationary distribution of a Markov chain. The method is related to coupling from the past (CFTP), but only runs the Markov chain forwards in time, and never restarts it at previous times in the past. The method is also related to an idea known as PASTA (Poisson arrivals see time averages) in the operations research literature. Because the new algorithm can be run using a read-once stream of randomness, we call it read-once CFTP. The memory and time requirements of read-once CFTP are on par with the requirements of the usual form of CFTP, and for a variety of applications the requirements may be noticeably less. Some perfect sampling algorithms for point processes are based on an extension of CFTP known as coupling into and from the past; for completeness, we give a read-once version of coupling into and from the past, but it remains impractical. For these point process applications, we give an alternative coupling method with which read-once CFTP may be efficiently used. Xiao-Li Meng. Towards a more general Propp-Wilson algorithm: multistage backward coupling. In Neil Madras, editor, Monte Carlo Methods, volume 26 of Fields Institute Communications. American Mathematical Society, 2000. To appear. Abstract: A multistage implementation of Propp and Wilson's backward coupling scheme is proposed via cluster sampling, which can help reduce the time to coalescence. A random walk example is used to illustrate the multistage generalization and to compare it with Propp and Wilson's single-stage scheme. The use of backward cluster sampling is also proposed for stratified forward Markov chain Monte Carlo, taking advantage of the greater flexibility of the multistage framework.
These proposals were inspired by a general principle in sample surveys: multistage sampling is typically more effective than single-stage sampling. George Casella, Kerrie L. Mengersen, Christian P. Robert and D. M. Titterington. Perfect slice samplers for mixtures of distributions. Journal of the Royal Statistical Society, Ser. B 64(4):777–790, 2002. (Postscript.) Abstract: We propose a perfect sampler for mixtures of distributions, in the spirit of Mira and Roberts (1999), building on Hobert, Robert and Titterington (1999). The method relies on a marginalisation akin to Rao-Blackwellisation which illustrates the Duality Principle of Diebolt and Robert (1994) and utilises an envelope argument which embeds the finite support distribution on the latent variables within a continuous support distribution, easier to simulate by slice sampling. We also provide a number of illustrations in the cases of normal and exponential mixtures which show that the technique does not suffer from severe slowdown when the number of observations or the number of components increases. We thus obtain a general iid sampling method for mixture posterior distributions and illustrate convincingly that perfect sampling can be achieved for realistic statistical models and not only for toy problems. Remarks: The authors have written a corrigendum, in which they recharacterize their results as ``approximate exact sampling''. Hans-Otto Georgii. Phase transition and percolation in Gibbsian particle models, 1999. arXiv:math.PR/9910005 Abstract: We discuss the interrelation between phase transitions in interacting lattice or continuum models, and the existence of infinite clusters in suitable random-graph models. In particular, we describe a random-geometric approach to the phase transition in the continuum Ising model of two species of particles with soft or hard interspecies repulsion. We comment also on the related area-interaction process and on perfect simulation. D. J. Murdoch.
Exact sampling for Bayesian inference: unbounded state spaces. In Neil Madras, editor, Monte Carlo Methods, volume 26 of Fields Institute Communications. American Mathematical Society, 2000. To appear. (Postscript)
Abstract:
Propp and Wilson [10, 11] described a protocol called
coupling from the past
(CFTP) for exact sampling from the steady-state distribution of a Markov chain Monte Carlo (MCMC) process. In it a past time is identified from which the paths of coupled Markov chains starting at every possible state would have coalesced into a single value by the present time; this value is then a sample from the steady-state distribution.
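The protocol just described is easiest to see for a monotone chain, where it suffices to track only the trajectories started from the minimal and maximal states. The sketch below is our own toy example (not from the cited paper): it applies CFTP to a reflecting random walk on {0, ..., m}, whose stationary distribution is uniform because the transition matrix is doubly stochastic.

```python
import random

def monotone_cftp(m, seed):
    """Coupling from the past for a monotone chain on {0, ..., m}.

    The shared update moves up when u < 1/2 and down otherwise, reflecting
    at the boundaries.  It preserves the order x <= y, so tracking only the
    chains started from state 0 and state m suffices: when they meet, every
    trajectory from time -T has coalesced, and the common value at time 0
    is an exact draw from the stationary distribution.
    """
    rng = random.Random(seed)
    us = {}                         # time -> shared uniform, reused on restart
    T = 1
    while True:
        lo, hi = 0, m               # minimal and maximal states at time -T
        for t in range(-T, 0):
            if t not in us:
                us[t] = rng.random()
            step = 1 if us[t] < 0.5 else -1
            lo = min(max(lo + step, 0), m)
            hi = min(max(hi + step, 0), m)
        if lo == hi:                # sandwiched trajectories have coalesced
            return lo
        T *= 2                      # restart further back, reusing randomness

# The reflecting walk's stationary law is uniform on {0, ..., 4}.
counts = [0] * 5
for s in range(3000):
    counts[monotone_cftp(4, seed=s)] += 1
```

The crucial detail, as in the protocol above, is that the randomness for each past time step is fixed once and reused when the starting time is pushed further back; redrawing it would bias the sample.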
David B. Wilson. Layered multishift coupling for use in perfect sampling algorithms (with a primer on CFTP). In Neil Madras, editor, Monte Carlo Methods, volume 26 of Fields Institute Communications, pages 141–176. American Mathematical Society, 2000. arXiv:math.PR/9912225
Abstract:
In this article we describe a new coupling technique which is useful in a variety of perfect sampling algorithms. A multishift coupler generates a random function f(·) so that for each real x, f(x)-x is governed by the same fixed probability distribution, such as a normal distribution. We develop the class of layered multishift couplers, which are simple and have several useful properties. For the standard normal distribution, for instance, the layered multishift coupler generates an f(·) which (surprisingly) maps an interval of length l to fewer than 2+l/2.35 points, which is useful in applications that perform computations on each such image point. The layered multishift coupler improves and simplifies algorithms for generating perfectly random samples from several distributions, including the autogamma distribution, posterior distributions for Bayesian inference, and the steady-state distribution for certain storage systems. We also use the layered multishift coupler to develop a Markov-chain-based perfect sampling algorithm for the autonormal distribution.
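As a concrete illustration of the definition above, the basic multishift coupler for a Uniform[0,1) shift distribution fits in a few lines; the layered construction in the article extends the same idea to normal and other shift distributions. The sketch and its names are our own, not code from the paper.

```python
import math
import random

def multishift_uniform(u):
    """Return a random map f with f(x) - x ~ Uniform[0,1) for every real x.

    Basic (single-layer) multishift coupler for uniform shifts:
    f(x) = u + ceil(x - u), with u ~ Uniform[0,1).  For fixed x,
    f(x) - x = ceil(x - u) - (x - u) is uniform on [0,1), yet every image
    f(x) lies on the shifted integer grid u + Z, so an interval of length l
    is mapped to only about l + 1 common points.  f is also monotone in x,
    which is what makes such couplers useful inside CFTP.
    """
    return lambda x: u + math.ceil(x - u)

rng = random.Random(0)

# For a fixed point x = 3.7, the shift f(x) - x behaves like Uniform[0,1) ...
shifts = []
for _ in range(5000):
    f = multishift_uniform(rng.random())
    shifts.append(f(3.7) - 3.7)

# ... while a whole interval collapses onto a handful of image points:
f = multishift_uniform(rng.random())
images = {f(x / 100.0) for x in range(0, 250)}   # the interval [0, 2.5)
```

Because all states in an interval share the same small image set, a perfect sampling algorithm can track that finite set of points instead of a continuum, exactly as the abstract describes.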
Haiyan Cai. Exact sampling using auxiliary variables, 1999. To appear in the Statistical Computation Section of the ASA Proceedings. (Postscript.) Abstract: We discuss two simple algorithms for generating exact samples from the equilibrium distribution of a Markov chain. These algorithms are based on a single run of the Markov chain in an ordinary way. One can pick out certain samples from the run as samples from the equilibrium distribution. This is done by introducing auxiliary variables as indicators during the run. Alessandra Guglielmi, Chris C. Holmes, and Stephen G. Walker. Perfect simulation involving functionals of a Dirichlet process. Journal of Computational and Graphical Statistics 11(2):306–310, 2002. (Postscript) Abstract: In this paper, we show how to perform perfect simulation of a functional of a Dirichlet process, when the values of the functional are bounded. We also give a procedure for ``approximate'' simulation in the case of unbounded functional values. Xeni K. Dimakos. A guide to exact simulation. International Statistical Review 69(1):27–48, 2001. (Postscript) Abstract: Markov Chain Monte Carlo (MCMC) methods are used to sample from complicated multivariate distributions with normalizing constants that may not be computable and from which direct sampling is not feasible. A fundamental problem is to determine convergence of the chains. Propp & Wilson (1996) devised a Markov chain algorithm called Coupling From The Past (CFTP) that solves this problem, as it produces exact samples from the target distribution and determines automatically how long it needs to run. Exact sampling by CFTP and other methods is currently a thriving research topic. This paper gives a review of some of these ideas, with emphasis on the CFTP algorithm. The concepts of coupling and monotone CFTP are introduced, and results on the running time of the algorithm presented.
The interruptible method of Fill (1998) and the method of Murdoch & Green (1998) for exact sampling from continuous distributions are presented. Novel simulation experiments are reported for exact sampling from the Ising model in the setting of Bayesian image restoration, and the results are compared to standard MCMC. The results show that CFTP works at least as well as standard MCMC, with convergence monitored by the method of Raftery & Lewis (1992, 1996). M. A. Novotny. Introduction to the Propp-Wilson method of exact sampling for the Ising model. In D. P. Landau, S. P. Lewis, and H.-B. Schüttler, editors, Computer Simulation Studies in Condensed Matter Physics XII, volume 85 of Springer Proceedings in Physics, pages 179–184. Springer-Verlag, 2000. arXiv:cond-mat/9905195. Abstract: An introduction to the Propp-Wilson method of coupling-from-the-past for the Ising model is presented. It enables one to obtain exact samples from the equilibrium spin distribution for ferromagnetic interactions. Both uniform and random quenched magnetic fields are included. The time to couple from the past and its standard deviation are shown as functions of system size, temperature, and field strength. The time should be an estimate of the ergodic time for the Ising system. Elke Thönnes. A primer on perfect simulation. In Klaus R. Mecke and D. Stoyan, editors, Statistical Physics and Spatial Statistics, volume 554 of Springer Lecture Notes in Physics, pages 349–378, 2000. Abstract: Markov chain Monte Carlo has long been a very useful, established tool in statistical physics and spatial statistics. Recent years have seen the development of a new, exciting generation of Markov chain Monte Carlo methods: perfect simulation algorithms. In contrast to conventional Markov chain Monte Carlo, perfect simulation produces samples which are guaranteed to have the exact equilibrium distribution.
In the following we provide an example-based introduction to perfect simulation, focussed on the method called Coupling from the Past. Michael Harvey. Monte Carlo inference for belief networks using coupling from the past. Master's thesis, Department of Computer Science, University of Toronto, 1999. (Postscript, PDF.) Abstract: A common method of inference for belief networks is Gibbs sampling, in which a Markov chain converging to the desired distribution is simulated. In practice, however, the distribution obtained with Gibbs sampling differs from the desired distribution by an unknown error, since the simulation time is finite. Coupling from the past selects states from exactly the desired distribution by starting chains in every state at a time far enough back in the past that they reach the same state at time t = 0. To track every chain is an intractable procedure for large state spaces. The method proposed in this thesis uses a summary chain to approximate the set of chains. Transitions of the summary chain are efficient for noisy-or belief networks, provided that sibling variables of the network are not directly connected, but often require more simulation time steps than would be needed if chains were tracked exactly. Testing shows that the method is a potential alternative to ordinary Gibbs sampling, especially for networks that have poor Gibbs sampling convergence, and when the user has a low error tolerance. Tine Møller Sørensen. A study of Propp & Wilson's method for exact Markov chain Monte Carlo sampling, 1997. Master's thesis in computational statistics, University of Bath.
Abstract:
The aim of this project is to study the Propp & Wilson method (Propp & Wilson, 1996) for exact simulation via Markov chain Monte Carlo.
Roberto Fernández, Pablo A. Ferrari, and Nancy L. Garcia. Perfect simulation for interacting point processes, loss networks and Ising models, 1999. math.PR/9911162 Abstract: We present a perfect simulation algorithm for measures that are absolutely continuous with respect to some Poisson process and can be obtained as invariant measures of birth-and-death processes. Examples include area- and perimeter-interacting point processes (with stochastic grains), invariant measures of loss networks, and the Ising contour and random cluster models. The algorithm does not involve couplings of the process with different initial conditions and it is not tied to monotonicity requirements. Furthermore, it directly provides perfect samples of finite windows of the infinite-volume measure, subject to time and space ``user-impatience bias''. The algorithm is based on a two-step procedure: (i) a perfect-simulation scheme for a (finite and random) relevant portion of a (space-time) marked Poisson process (free birth-and-death process, free loss networks), and (ii) a ``cleaning'' algorithm that trims out this process according to the interaction rules of the target process. The first step involves the perfect generation of ``ancestors'' of a given object, that is, of predecessors that may have an influence on the birth rate under the target process. The second step, and hence the whole procedure, is feasible if these ``ancestors'' form a finite set with probability one. We present a sufficient criterion for this condition, based on the absence of infinite clusters for an associated (backwards) oriented percolation model. Bas Straatman. Exact sampling and applications to a mite dispersal model, 1998. Master's thesis, Utrecht.
This thesis does not itself come with an abstract, but the abstract for a talk given by Straatman on the topic follows.
Jens Lund and Elke Thönnes. Perfect simulation and inference for point processes given noisy observations. Computational Statistics 19(2):317–336, 2004. (Postscript) Abstract: The paper is concerned with the Bayesian analysis of point processes which are observed with noise. It is shown how to produce exact samples from the posterior distribution of the unobserved true point process given a noisy observation. The algorithm is a perfect simulation method which applies dominated coupling from the past to a spatial birth-and-death process. Dominated coupling from the past is made amenable by augmenting the state space of the target chain. We discuss how to optimize this perfect simulation algorithm and how to use it in a statistical inference problem of practical importance. We further describe the merits and the limitations of the method. Jens Lund. Statistical inference and perfect simulation for point processes observed with noise. PhD thesis, The Royal Veterinary and Agricultural University, Copenhagen, Department of Statistics, 1999.
Abstract:
The main themes of this thesis are spatial statistics and simulation algorithms. The thesis is split into five papers that may be read independently. All five papers deal with spatial models. Lund & Rudemo (1999), Lund et al. (1999), and Lund & Thönnes (1999b) deal with the same new model for point processes observed with noise, and Lund et al. (1999), Lund & Thönnes (1999b), and Lund & Thönnes (1999a) have a simulation aspect.
Alison Gibbs. Bounding the convergence time of the Gibbs sampler in Bayesian image restoration. Biometrika 87(4):749–766, 2000. (Postscript). Abstract: This paper shows how coupling methodology can be used to give precise, a priori bounds on the convergence time of Markov chain Monte Carlo algorithms for which a partial order exists on the state space which is preserved by the Markov chain transitions. This methodology is applied to give a bound on the convergence time of the random scan Gibbs sampler used in the Bayesian restoration of an image of N pixels. For our algorithm, in which only one pixel is updated at each iteration, the bound is a constant times N^{2}. The proportionality constant is given and is easily calculated. These bounds also give an indication of the running time of coupling from the past algorithms. Alison Gibbs. Convergence of Markov Chain Monte Carlo Algorithms with Applications to Image Restoration. PhD thesis, University of Toronto, Department of Statistics, 1999. (Postscript, PDF.)
Abstract:
Markov chain Monte Carlo algorithms, such as the Gibbs sampler and Metropolis-Hastings algorithm, are widely used in statistics, computer science, chemistry and physics for exploring complicated probability distributions. A critical issue for users of these algorithms is the determination of the number of iterations required so that the result will be approximately a sample from the distribution of interest.
Paul Fearnhead. Perfect simulation from population genetic models with selection. Theoretical Population Biology, 59:263–279, 2001. (Postscript) Software available here. Abstract: We consider using the ancestral selection graph to simulate samples from genetic models with selection. Currently the use of the ancestral selection graph to simulate such samples is limited. This is because the computational requirement for simulating such samples increases exponentially with the selection rate, and also due to the need to be able to simulate a sample of size 1 from the population at equilibrium. For the only case where the distribution of a sample of size one is known, that of parent-independent mutations, more efficient simulation algorithms exist. We will show that by applying the idea of coupling from the past to the ancestral selection graph, samples can be simulated from a general K-allele model without knowledge of the distribution of a sample of size 1. Further, the computation involved in generating such samples appears to be less than that of simulating the ancestral selection graph until its ultimate ancestor. In particular, in the case of genic selection with parent-independent mutations, the computational requirement increases only quadratically with the selection rate. The algorithm is then used to simulate samples at microsatellite loci, and to consider the effect of selection on linked neutral sites. Jesper Møller. Aspects of spatial statistics, stochastic geometry and Markov chain Monte Carlo methods. Thesis for doctoral degree in Natural Sciences, Aalborg University.
From the preface:
This doctoral thesis is concerned with certain aspects of spatial statistics, stochastic geometry, and Markov chain Monte Carlo methods. It is based on a monograph [L] and twenty selected papers [AK, MV] produced over about the last decade; see the following two pages. The papers [A, G, H, I] and [95, 140] constituted my Ph.D. thesis from 1988 entitled Stochastic Geometry and Markov Models; [A, H, I] in revised form are included in the present thesis.
Remark: Section 3.2 deals with perfect simulation. Song-Chun Zhu and David Mumford. Learning Generic Prior Models for Visual Computation, 1997. To appear in IEEE Transactions on Pattern Analysis and Machine Intelligence. (Postscript) Abstract: Many generic prior models have been widely used in computer vision, ranging from image and surface reconstruction to motion analysis, and these models presume that surfaces of objects are smooth and that adjacent pixels in images have similar intensity values. However, there is little rigorous theory to guide the construction and selection of prior models for a given application. Furthermore, images are often observed at arbitrary scales, but none of the existing prior models are scale-invariant. Motivated by these problems, this article chooses general natural images as a domain of application, and proposes a theory for learning prior models from a set of observed natural images. Our theory is based on a maximum entropy principle, and the learned prior models are Gibbs distributions. A novel information criterion is proposed for model selection by minimizing a Kullback-Leibler information distance. We also investigate scale invariance in the statistics of natural images and study a prior model which has a scale-invariant property. In this paper, in contrast with all existing prior models, negative potentials in the Gibbs distribution are first reported. The learned prior models are verified in two ways. Firstly, images are sampled from the prior distributions to demonstrate what typical images they stand for. Secondly, they are compared with existing prior models in experiments of image restoration. Remarks: This paper is about image processing rather than perfect sampling per se, but the end of section 3 includes discussion of their use of the ``Propp-Wilson criterion'' for deciding how long to run their Markov chains. Andrew M. Childs, Ryan B. Patterson, and David J. C. MacKay.
Exact sampling from non-attractive distributions using summary states. Physical Review E 63:036113, 2001. arXiv:cond-mat/0005132 Abstract: Propp and Wilson's method of coupling from the past allows one to efficiently generate exact samples from attractive statistical distributions (e.g., the ferromagnetic Ising model). This method may be generalized to non-attractive distributions by the use of summary states, as first described by Huber. Using this method, we present exact samples from a frustrated antiferromagnetic triangular Ising model and the antiferromagnetic q = 3 Potts model. We discuss the advantages and limitations of the method of summary states for practical sampling, paying particular attention to the slowing down of the algorithm at low temperature. In particular, we show that such a slowing down can occur in the absence of a physical phase transition. Luc Devroye, James A. Fill, and Ralph Neininger. Perfect Simulation from the Quicksort Limit Distribution, 2000. Preprint. Abstract: The weak limit of the normalized number of comparisons needed by the Quicksort algorithm to sort n randomly permuted items is known to be determined implicitly by a distributional fixed-point equation. We give an algorithm for perfect random variate generation from this distribution. George Casella, Michael Lavine, and Christian P. Robert. Explaining the perfect sampler. The American Statistician 55(4):299–305, 2001. (Postscript). Abstract: In 1996, Propp and Wilson introduced Coupling from the Past (CFTP), an algorithm for generating a sample from the exact stationary distribution of a Markov chain. In 1998, Fill proposed another so-called perfect sampling algorithm. These algorithms have enormous potential in Markov Chain Monte Carlo (MCMC) problems because they eliminate the need to monitor convergence and mixing of the chain. This article provides a brief introduction to the algorithms, with an emphasis on understanding rather than technical detail. Olle Häggström.
Finite Markov Chains and Algorithmic Applications. Cambridge University Press, 2000. Contents: These lecture notes give an elementary introduction to Markov chains, and applications to algorithms like MCMC and simulated annealing. Chapters are entitled:
Erik van Zwet. Perfect stochastic EM. In M. C. M. de Gunst, C. A. J. Klaassen, and A. W. van der Vaart, editors, State of the Art in Probability and Statistics, IMS Lecture Notes #36, pages 607–616, 2001. (Postscript.) Abstract: In a missing data problem we observe the result of a (known) many-to-one mapping of an unobservable `complete' dataset. The aim is to estimate some parameter of the distribution of the complete data. In this situation, the stochastic version of the EM algorithm is sometimes a viable option. It is an iterative algorithm that produces an ergodic Markov chain on the parameter space. The stochastic EM (StEM) estimator is then a sample from the equilibrium distribution of this chain. Recently, a method called `coupling from the past' was invented to generate a Markov chain in equilibrium. We investigate when this method can be used for a StEM chain and give examples where this is indeed possible. W. L. Cooper and R. L. Tweedie. Perfect simulation of an inventory model for perishable products. Stochastic Models, 18(2):229–243, 2002. (Postscript.) Abstract: We study an inventory model for perishable products with a critical-number ordering policy under the assumption that demand for the product forms an i.i.d. sequence, so that the state of the system forms a Markov chain. Explicit calculation of the stationary distribution has proved impractical in cases where items have reasonably long lifetimes and for systems with large ``order-up-to'' levels. Using the recently developed coupling-from-the-past method, we introduce a technique to estimate the stationary distribution of the Markov chain via ``perfect simulation.'' The Markov chain that results from the use of a critical-number policy is particularly amenable to these simulation techniques, despite not being ordered in its initial state, since the recursive equations satisfied by the Markov chain enable us to identify specific demand patterns where the backward coupling occurs. Jesper Møller.
A review of perfect simulation in stochastic geometry. In I. V. Basawa, C. C. Heyde, and R. L. Taylor, editors, Selected Proceedings of the Symposium on Inference for Stochastic Processes, volume 37 of IMS Lecture Notes & Monographs Series, pages 333–355, 2001. (Postscript.) Abstract: We provide a review and a unified exposition of the Propp-Wilson algorithm and various other algorithms for making perfect simulations, with a view to applications in stochastic geometry. Most examples of applications are for spatial point processes. Francis Comets, Roberto Fernández, and Pablo A. Ferrari. Processes with Long Memory: Regenerative Construction and Perfect Simulation, 2000. arXiv:math.PR/0009204
Abstract:
We present a perfect simulation algorithm for stationary processes indexed by [...]. James A. Fill and Mark Huber. The randomness recycler: a new technique for perfect sampling. In 41st Annual Symposium on Foundations of Computer Science, pages 503–511, 2000. arXiv:math.PR/0009242 Abstract: For many probability distributions of interest, it is quite difficult to obtain samples efficiently. Often, Markov chains are employed to obtain approximately random samples from these distributions. The primary drawback to traditional Markov chain methods is that the mixing time of the chain is usually unknown, which makes it impossible to determine how close the output samples are to having the target distribution. Here we present a new protocol, the randomness recycler (RR), that overcomes this difficulty. Unlike classical Markov chain approaches, an RR-based algorithm creates samples drawn exactly from the desired distribution. Other perfect sampling methods such as coupling from the past use existing Markov chains, but RR does not use the traditional Markov chain at all. While by no means universally useful, RR does apply to a wide variety of problems. In restricted instances of certain problems, it gives the first expected linear time algorithms for generating samples. Here we apply RR to self-organizing lists, the Ising model, random independent sets, random colorings, and the random cluster model. James A. Fill and Motoya Machida. Realizable Monotonicity and Inverse Probability Transform, 2000. arXiv:math.PR/0010026 Abstract: A system (P_{a}: a in A) of probability measures on a common state space S indexed by another index set A can be ``realized'' by a system (X_{a}: a in A) of S-valued random variables on some probability space in such a way that each X_{a} is distributed as P_{a}.
Assuming that A and S are both partially ordered, we may ask when the system (P_{a}: a in A) can be realized by a system (X_{a}: a in A) with the monotonicity property that X_{a} <= X_{b} almost surely whenever a <= b. When such a realization is possible, we call the system (P_{a}: a in A) ``realizably monotone.'' Such a system necessarily is stochastically monotone, that is, satisfies P_{a} <= P_{b} in stochastic ordering whenever a <= b. In general, stochastic monotonicity is not sufficient for realizable monotonicity. However, for some particular choices of partial orderings in a finite state setting, these two notions of monotonicity are equivalent. We develop an inverse probability transform for a certain broad class of posets S, and use it to explicitly construct a system (X_{a}: a in A) realizing the monotonicity of a stochastically monotone system when the two notions of monotonicity are equivalent. Marc A. Loizeaux and Ian W. McKeague. Bayesian inference for spatial point processes via perfect sampling, 2000. Preprint. Abstract: We study a Bayesian cluster model for spatial point processes in which observations are assumed to cluster around a finite collection of underlying landmarks. Perfect sampling from the posterior distribution of landmarks is investigated. In the case of Neyman-Scott cluster models, perfect sampling from the posterior is shown to be computationally feasible via a coupling-from-the-past-type algorithm of Kendall and Møller. An application to data on leukemia incidence in upstate New York is developed. M. N. M. van Lieshout and Erik van Zwet. Maximum likelihood estimation for the bombing model, 2000. Preprint. Abstract: Perhaps the best known example of a random set is the Boolean model. It is the union of `grains' such as discs, squares or triangles which are placed at the points of a Poisson point process. The Poisson points are called the `germs'.
We are interested in estimating the intensity, say lambda, of the Poisson process from a sample of a Boolean model of discs (the bombing model). A natural estimate is the number of germs in the observation region divided by the area of that region. Unfortunately, we do not observe the presence of a given germ when its associated disc is completely covered by other discs. On the other hand, we observe the exact location of a germ when we observe any part of its associated disc's boundary. We demonstrate how to apply Coupling From The Past to sample from the conditional distribution, given the data, of the unobserved germs. Such samples allow us to approximate the maximum likelihood estimator of the intensity. We discuss and compare two methods to do so. The first method is based on a Monte Carlo approximation of the likelihood function. The second is a stochastic version of the EM algorithm. James P. Hobert and Christian P. Robert. Moralizing perfect sampling, 2000. Preprint.
Abstract:
Let Φ be an irreducible, aperiodic, Harris recurrent Markov chain with invariant probability measure π. We show that if a minorization condition can be established for Φ, then π can be represented as an infinite mixture from which it may be possible to sample. Making exact draws from this mixture (and hence from π) requires the ability to simulate two different random variables, T^{*} and N_{t}. When the small set in the minorization condition is the entire state space (and hence Φ is uniformly ergodic), simulating these random variables is straightforward and the resulting algorithm turns out to be equivalent to Murdoch and Green's (1998) multigamma coupler. Thus, we are able to achieve perfect sampling in the uniformly ergodic case without any appeal to coupling or backward simulation. When the small set is not the state space, simulating T^{*} is more problematic. Fortunately, T^{*} is univariate and has support [...]. Michael K. Schneider. Exact and Approximate Sampling from the Stationary Distribution of a Markov Chain, 1998. Area exam report. (postscript, compressed postscript, pdf) Abstract: When simulating a physical system with discrete states, one often would like to generate a sample from the stationary distribution of a Markov chain. This report focuses on three sampling methodologies which do not rely on explicitly computing the stationary distribution. Two of these lead to algorithms which can generate an exact sample in finite time. The third yields a sample whose distribution approximates, but is arbitrarily close to, the stationary distribution from which one desires a sample. The approximate and one of the exact methodologies are illustrated with examples from statistical mechanics. Krishna B. Athreya and Örjan Stenflo. Perfect Sampling for Doeblin Chains. Sankhya: The Indian Journal of Statistics 65(4):763–777, 2003. (PDF).
Abstract:
For Markov chains that can be generated by iteration of i.i.d. random maps from the state space X into itself (this holds if X is Polish) it is shown that the Doeblin minorization condition is necessary and sufficient for the method of Propp and Wilson for ``perfect'' sampling from the stationary distribution π to be successful. Joakim Linde, Cristopher Moore, and Mats G. Nordahl. An n-dimensional generalization of the rhombus tiling, 2001. Preprint.
Abstract:
Several classic tilings, including rhombuses and dominoes, possess height functions which allow us to 1) prove ergodicity and polynomial mixing times for Markov chains based on local moves, 2) use coupling from the past to sample perfectly random tilings, 3) map the statistics of random tilings at large scales to physical models of random surfaces, and 4) relate them to the ``arctic circle'' phenomenon. However, few examples are known for which this approach works in three or more dimensions. Here we show that the rhombus tiling can be generalized to n-dimensional tiles for any n≥3. For each n, we show that a certain local move is ergodic, and conjecture that it has a mixing time of O(L ... Kasper K. Berthelsen and Jesper Møller. Spatial jump processes and perfect simulation, 2001. Preprint. Abstract: Spatial birth-and-death processes, spatial birth-and-catastrophe processes, and more general types of spatial jump processes are studied in detail. In particular, various kinds of coupling constructions are considered, leading to some known and some new perfect simulation procedures for different types of spatial jump processes in equilibrium. The considered equilibrium distributions include many classical Gibbs point process models and a new class of models for spatial point processes introduced in the text. Jesper Møller and Rasmus P. Waagepetersen. Simulation-based inference for spatial point processes, 2001. Preprint. Kasper K. Berthelsen and Jesper Møller. A primer on perfect simulation for spatial point processes, 2001. Preprint. Abstract: This primer provides a self-contained exposition of the case where spatial birth-and-death processes are used for perfect simulation of locally stable point processes. In particular, a simple dominating coupling from the past (CFTP) algorithm and the CFTP algorithms introduced in Kendall (1998), Kendall & Møller (2000), and Fernández, Ferrari & Garcia (2000) are studied. Some empirical results for the algorithms are discussed.
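Many of the entries in this bibliography take the basic coupling-from-the-past protocol for granted. As a reference point, here is a minimal sketch of monotone CFTP, illustrative only and not code from any of the papers listed, for a toy chain: simple symmetric random walk on {0, ..., N} with holding boundaries, whose stationary distribution is uniform. All names in the sketch are ours.

```python
import random

N = 4  # state space {0, 1, ..., N}

def update(x, u):
    # One step of simple symmetric random walk with holding boundaries:
    # move down on u < 1/2, up otherwise, holding at 0 and N.
    return max(x - 1, 0) if u < 0.5 else min(x + 1, N)

def monotone_cftp(rng):
    # Monotone CFTP: run the chain from every state simultaneously by
    # tracking only the bottom (0) and top (N) trajectories, reusing the
    # SAME randomness each time we restart further in the past.  When the
    # two trajectories have coalesced by time 0, the common value is an
    # exact draw from the stationary distribution (uniform here).
    us = {}  # randomness indexed by negative time, reused across restarts
    T = 1
    while True:
        for t in range(-T, 0):
            if t not in us:
                us[t] = rng.random()
        lo, hi = 0, N
        for t in range(-T, 0):
            lo, hi = update(lo, us[t]), update(hi, us[t])
        if lo == hi:
            return lo
        T *= 2  # not yet coalesced: go twice as far back in time

rng = random.Random(0)
samples = [monotone_cftp(rng) for _ in range(2000)]
```

Because both branches of `update` are non-decreasing in x, the two extreme trajectories sandwich all the others; this monotonicity is exactly the structural assumption that the monotone-CFTP entries in this bibliography rely on.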
Julian Besag. Likelihood analysis of binary data in space and time. Manuscript, 2001. Keywords: Autologistic; Conditional probability; Coupling from the past; Lattice data; Markov chain Monte Carlo; Markov random fields; Maximum likelihood; Maximum pseudolikelihood; Perfect simulation; Space and time; Spread of disease. David B. Wilson. Mixing times of lozenge tiling and card shuffling Markov chains, 2001. arXiv:math.PR/0102193 Abstract: We show how to combine Fourier analysis with coupling arguments to bound the mixing times of a variety of Markov chains. The mixing time is the number of steps a Markov chain takes to approach its equilibrium distribution. One application is to a class of Markov chains introduced by Luby, Randall, and Sinclair to generate random tilings of regions by lozenges. For an L × L region we bound the mixing time by O(L^{4} log L), which improves on the previous bound of O(L^{7}), and we show the new bound to be essentially tight. In another application we resolve a few questions raised by Diaconis and Saloff-Coste, by lower bounding the mixing time of various card-shuffling Markov chains. Our lower bounds are within a constant factor of their upper bounds. When we use our methods to modify a path-coupling analysis of Bubley and Dyer, we obtain an O(n^{3} log n) upper bound on the mixing time of the Karzanov-Khachiyan Markov chain for linear extensions. Remarks: Among other things this article analyses the running time of CFTP when applied to lozenge tilings, and contains experiments pertinent to Fill's algorithm. Kasper K. Berthelsen and Jesper Møller. Perfect simulation and inference for spatial point processes. Manuscript, 2002. Abstract: The advantages and limitations of using perfect simulation for simulation-based inference for pairwise interaction point processes are discussed. Various aspects of likelihood inference for the Strauss process with unknown range of interaction are studied.
A large part of the paper concerns nonparametric Bayesian inference for the interaction function. Markov chain Monte Carlo methods, particularly path sampling, play an important role. Several empirical results and various datasets are considered. Duncan J. Murdoch and Xiao-Li Meng. Towards perfect sampling for Bayesian mixture priors. Bayesian Methods with Applications to Science, Policy, and Official Statistics. Selected papers from ISBA 2000: the sixth world meeting of the International Society for Bayesian Analysis. Abstract: Perfect sampling using the coupling from the past (CFTP) algorithm was introduced by Propp and Wilson in 1996. In much the same way that rejection sampling allows one to convert samplers from one distribution into samplers from another, CFTP allows one to convert Markov chain Monte Carlo algorithms from approximate samplers of the steady-state distribution into perfect ones. Since 1996 CFTP has been applied to many different Markov chains. However, its use in routine Bayesian computation is still in the early stages of development. This paper provides a couple of building blocks for its potentially routine application in Bayesian mixture priors, including a t mixture coupler, and demonstrates the types of difficulties that currently prevent CFTP from being applied routinely in Bayesian computation. Radu V. Craiu and Xiao-Li Meng. Antithetic coupling for perfect sampling. Bayesian Methods with Applications to Science, Policy, and Official Statistics. Selected papers from ISBA 2000: the sixth world meeting of the International Society for Bayesian Analysis. Abstract: This paper reports some initial investigations of the use of antithetic variates in perfect sampling. A simple random walk example is presented to illustrate the key ingredients of antithetic coupling for perfect sampling as well as its potential benefit.
A key step in implementing antithetic coupling is to generate k≥2 random variates that are negatively associated, a stronger condition than negative correlation as it requires that the variates remain non-positively correlated after any (componentwise) monotone transformations have been applied. For k=2, this step is typically trivial (e.g., by taking U and 1-U where U ~ U(0,1)) and it constitutes much of the common use of antithetic variates in Monte Carlo simulation. Our emphasis is on k>2 because we have observed some general gains in going beyond the commonly used pair of antithetic variates. We discuss several ways of generating negatively associated random variates for arbitrary k, and our comparison generally favors Iterative Latin Hypercube Sampling. Nancy L. Garcia. Perfect simulation of spatial processes. Resenhas - IME-USP, 4(3):281-324, 2000. Abstract: This work presents a review of some of the schemes used to perfect sample from spatial processes. As examples, the area-interaction point process, Strauss process, penetrable sphere model, Peierls contours of the Ising model and continuous loss networks are studied under the Coupling From The Past Algorithm (CFTP), the Acceptance and Rejection Algorithm (ARA) and the Backward-Forward Algorithm (BFA). Sungwoo Cho and Ashish Goel. Exact sampling in machine scheduling problems. Lecture Notes in Computer Science 2129 (Approximation, Randomization, and Combinatorial Optimization: Algorithms and Techniques), Michel X. Goemans, Klaus Jansen, José D. P. Rolim, Luca Trevisan, Eds., pages 202-210. Abstract: Analysis of machine scheduling problems can get very complicated even for simple scheduling policies and simple arrival processes. The problem becomes even harder if the scheduler and the arrival process are complicated, or worse still, given to us as a black box.
In such cases it is useful to obtain a typical state of the system which can then be used to deduce information about the performance of the system or to tune the parameters for either the scheduling rule or the arrival process. We consider two general scheduling problems and present an algorithm for extracting an exact sample from the stationary distribution of the system when the system forms an ergodic Markov chain. We assume no knowledge of the internals of the arrival process or the scheduler. Our algorithm assumes that the scheduler has a natural monotonic property, and that the job service times are geometric/exponential. We use the Coupling From The Past paradigm due to Propp and Wilson to obtain our result. In order to apply their general framework to our problems, we perform a careful coupling of the different states of the Markov chain. Ashish Goel and Michael Mitzenmacher. Exact sampling of TCP window states. To appear in IEEE Infocom, 2002. Abstract: We demonstrate how to apply Coupling from the Past, a simulation technique for exact sampling, to Markov chains based on TCP variants. This approach provides a new, statistically sound paradigm for network simulations: instead of simulating a protocol over long times, or explicitly finding the stationary distribution of a Markov chain, use Coupling from the Past to quickly obtain samples from the stationary distribution. Coupling from the Past is most efficient when the underlying state space satisfies a partial order and certain monotonicity conditions. To efficiently apply this general paradigm to TCP, we demonstrate that the states of a simple TCP model possess a monotonic partial order; this order appears interesting in its own right. Preliminary simulation results indicate that this approach is quite efficient, and produces results which are similar to those obtained by simulating a TCP-Tahoe connection. Keith Crank and James A. Fill. Interruptible exact sampling in the passive case, 2002.
arXiv:math.PR/0202136 Abstract: We establish, for various scenarios, whether or not interruptible exact stationary sampling is possible when a finite-state Markov chain can only be viewed passively. In particular, we prove that such sampling is not possible using a single copy of the chain. Such sampling is possible when enough copies of the chain are available, and we provide an algorithm that terminates with probability one. Robert P. Dobrow and James A. Fill. Speeding up the FMMR perfect sampling algorithm: A case study revisited. Random Structures and Algorithms 23(4):434-452, 2003. arXiv:math.PR/0205120 Abstract: In a previous paper by the second author, two Markov chain Monte Carlo perfect sampling algorithms, one called coupling from the past (CFTP) and the other (FMMR) based on rejection sampling, are compared using as a case study the move-to-front (MTF) self-organizing list chain. Here we revisit that case study and, in particular, exploit the dependence of FMMR on the user-chosen initial state. We give a stochastic monotonicity result for the running time of FMMR applied to MTF and thus identify the initial state that gives the stochastically smallest running time; by contrast, the initial state used in the previous study gives the stochastically largest running time. By changing from worst choice to best choice of initial state we achieve remarkable speedup of FMMR for MTF; for example, we reduce the running time (as measured in Markov chain steps) from exponential in the length n of the list nearly down to n when the items in the list are requested according to a geometric distribution. For this same example, the running time for CFTP grows exponentially in n. Anne Philippe and Christian P. Robert. Perfect simulation of positive Gaussian distributions. Statistics and Computing 13(2):179-186, 2003. (Postscript.)
Abstract: We provide an exact simulation algorithm that produces variables from truncated Gaussian distributions on R_{+}^{p} via a perfect sampling scheme, based on stochastic ordering and slice sampling, since accept-reject algorithms like those of Geweke (1991) and Robert (1995) are difficult to extend to higher dimensions. Gareth O. Roberts and Jeffrey S. Rosenthal. Combinatorial identities associated with CFTP, 2002.
Abstract:
We explore a method of obtaining combinatorial identities
by analysing partially completed runs of the Coupling from the Past
(CFTP) algorithm. In particular, using CFTP for simple symmetric
random walk (ssrw) with holding boundaries, we derive an identity
involving linear combinations of ... L. A. Breyer and G. O. Roberts. A new method for coupling random fields. LMS Journal of Computation and Mathematics 5:77-94, 2002. Abstract: Given a Markov chain, a stochastic flow which simultaneously constructs sample paths started at each possible initial value can be constructed as a composition of random fields. In this paper, we describe a method for coupling flows by modifying an arbitrary field (consistent with the Markov chain of interest) by an independence Metropolis-Hastings iteration. We show that the resulting stochastic flow has many desirable coalescence properties, regardless of the form of the original flow. L. A. Breyer and G. O. Roberts. Catalytic perfect simulation. Methodology and Computing in Applied Probability 3(2):161-177, 2001. Abstract: We introduce a methodology for perfect simulation using so-called catalysts to modify random fields. Our methodology builds on a number of ideas previously introduced by Breyer and Roberts (1999), by Murdoch (1999), and by Wilson (2000). We illustrate our techniques by simulating two examples of Bayesian posterior distributions. David J. C. MacKay. Information Theory, Inference, and Learning Algorithms. Cambridge University Press (2003).
Textbook (viewable online), 640 pages.
Chapters include ...
Graeme K. Ambler. Dominated Coupling From The Past and Some Extensions of the Area-Interaction Process. PhD thesis, University of Bristol, Department of Mathematics, 2002. Advisor: Professor B. W. Silverman. Abstract: Dominated coupling from the past, an example of perfect simulation, is a method for sampling from the equilibrium distribution of a Markov chain. A particular model which may be sampled in this way is the area-interaction point process of Baddeley and van Lieshout (1995). This thesis reviews perfect simulation and some aspects of point process theory. It also reviews some methods for simulating spatial point processes. It then introduces an extension of the area-interaction process which incorporates both attractive and repulsive components at different scales. The properties of this model are investigated using some standard test functions, and the model is fitted to a standard data set. Some discrete analogues of these models are then discussed along with methods of sampling from these models using dominated coupling from the past. While discussing these sampling methods, we highlight some of the pitfalls that exist when one attempts to use dominated coupling from the past. We set out and investigate the use of the discrete analogue of the area-interaction process as a prior distribution on wavelet coefficients in Bayesian nonparametric regression. Finally, we discuss some of the issues that arose while implementing the algorithms, both for simulating from the attractive-repulsive model and for the posterior distribution of the Bayesian wavelet thresholding model. Jordi Artés. Simulación exacta en procesos puntuales espaciales: método CFTP / Exact simulation for spatial point processes: CFTP method. Master's thesis, Universitat Jaume I, Departamento de Matemáticas, 2004. Advisor: Jorge Mateu. Faheem Mitha. Perfect Sampling on Continuous State Spaces. PhD thesis, University of North Carolina at Chapel Hill, Department of Statistics, 2003.
Advisor: Chuanshu Ji. Abstract: We consider algorithms that sample exactly from probability distributions on continuous state spaces. For the most part we focus our attention on multivariate distributions. Examples and computer implementations are included for the algorithms considered. Algorithms discussed include Murdoch and Green's multigamma and rejection couplers and Fill and Huber's randomness recycler. We prove convergence in finite time for the multigamma and rejection couplers as described by Murdoch and Green, and adapt these to a multivariate context. Applications of the Fill/Huber randomness recycler to multivariate distributions on continuous state spaces are also considered. Other algorithms such as Wilson's multishift coupler are also discussed. Can Mert. Bayesian inference in Schizophrenia research. Master's thesis, Royal Institute of Technology, Computer Science, 2002. Advisor: Stefan Arnborg. Abstract: In this Master's thesis, Bayesian data mining techniques are used to analyze a database containing information from schizophrenia-affected and healthy persons. The aim is to find dependencies between the diagnosis and various attributes. The database is assembled in the HUBIN project at the Karolinska Institutet. Two different techniques are employed, classification and graphical modeling. We also propose an MCMC extension to the graphical model problem. The results in this thesis indicate that the data mining approach is very promising in this kind of research. Several known facts about schizophrenia were reproducible in this study. Tina Dam, Jan Brønnum Sørensen, and Michael Hedegaard Thomsen. Master's thesis, Aalborg University, Department of Mathematical Sciences, 1999. Advisor: Jesper Møller. Abstract: This thesis presents the theory of spatial point processes. It introduces the general setting and characterization results for point processes, followed by definitions of the binomial and the Poisson point processes.
The Poisson point process is important for the theory of point processes as it models complete spatial randomness. Various summary statistics are presented, and estimation and examples of the use of such are given. Emphasis is placed on Ripley's K-function and statistics related to it. Different classes of models are defined: Cox, cluster, hard-core, Markov, nearest-neighbour Markov, and Gibbs processes, and simulation of each of these models is explained. Markov chain Monte Carlo simulation is covered and here the Metropolis-Hastings algorithm is an important tool. We then present spatial birth-and-death processes that provide an alternative to the Metropolis-Hastings algorithm. Perfect simulation is treated in order to make it possible to simulate from an exact equilibrium distribution. Parametric inference is the final topic covered. Here pseudolikelihood estimation, contrast methods, Markov chain Monte Carlo maximum likelihood estimation, and stochastic gradient estimation are treated along with parametric bootstrapping, likelihood ratio and Fisher-Wald tests. Finally, six different point patterns are analysed, mainly to exemplify the use of the presented theory. Ulrike Schneider. Advances and Applications in Perfect Sampling. PhD thesis, University of Colorado, Department of Applied Mathematics, 2003. Advisor: Jem Corcoran. Abstract: Perfect sampling algorithms are Markov Chain Monte Carlo (MCMC) methods without statistical error. The latter are used when one needs to get samples from certain (nonstandard) distributions. This can be accomplished by creating a Markov chain that has the desired distribution as its stationary distribution, and by running sample paths "for a long time", i.e. until the chain is believed to be in equilibrium. The question "how long is long enough?" is generally hard to answer and the assessment of convergence is a major concern when applying MCMC schemes.
This issue completely vanishes with the use of perfect sampling algorithms which, if applicable, enable exact simulation from the stationary distribution of a Markov chain. In this thesis, we give an introduction to the general idea of MCMC methods and perfect sampling. We develop advances in this area and highlight applications of these advances to two relevant problems. As advances, we discuss and devise several variants of the well-known Metropolis-Hastings algorithm which address accuracy, applicability, and computational cost of this method. We also describe and extend the idea of slice coupling, a technique which enables one to couple continuous sample paths of Markov chains for use in perfect sampling algorithms. As a first application, we consider Bayesian variable selection. The problem of variable selection arises when one wants to model the relationship between a variable of interest and a subset of explanatory variables. In the Bayesian approach one needs to sample from the posterior distribution of the model and this simulation is usually carried out using regular MCMC methods. We significantly expand the use of perfect sampling algorithms within this problem using ideas developed in this thesis. As a second application, we depict the use of these methods for the interacting fermion problem. We employ perfect sampling for computing self-energy through Feynman diagrams using Monte Carlo integration techniques. We conclude by stating two open questions for future research. Mariana Rodrigues Motta. Perfect Simulation of the Bidimensional Silo Model. Master's thesis, 2001. Advisor: Nancy Lopes Garcia. Murali Haran. Efficient Perfect and MCMC Sampling Methods for Bayesian Spatial and Components of Variance Models. PhD thesis, University of Minnesota, School of Statistics, 2003. Advisors: Luke Tierney and Brad Carlin. Author's description: I worked on Bayesian computation for spatial models and some hierarchical models.
In particular, I investigated sampling techniques for hierarchical models that use Gaussian Markov random fields (GMRFs). A major portion of my thesis contains work on perfect sampling approaches for Bayesian disease mapping models. Graeme K. Ambler and Bernard W. Silverman. Perfect simulation for Bayesian wavelet thresholding with correlated coefficients, 2004. (PDF) Abstract: We introduce a new method of Bayesian wavelet shrinkage for reconstructing a signal when we observe a noisy version. Rather than making the usual assumption that the wavelet coefficients of the signal are independent, we allow for the possibility that they are locally correlated in both location (time) and scale (frequency). This leads us to a prior structure which is, unfortunately, analytically intractable. Nevertheless, it is possible to draw independent samples from a close approximation to the posterior distribution by an approach based on Coupling From The Past, making it possible to use a simulation-based approach to fit the model. Graeme K. Ambler and Bernard W. Silverman. Perfect simulation of spatial point processes using dominated coupling from the past with application to a multiscale area-interaction point process, 2004. (PDF) Abstract: We consider perfect simulation algorithms for locally stable point processes based on dominated coupling from the past. A new version of the algorithm is developed which is feasible for processes which are neither purely attractive nor purely repulsive. Such processes include multiscale area-interaction processes, which are capable of modelling point patterns whose clustering structure varies across scales. We prove correctness of the algorithm and existence of these processes. An application to the redwood seedlings data is discussed. M. N. M. van Lieshout and R. S. Stoica. Perfect Simulation for Marked Point Processes. PNA-R0306, ISSN 1386-3711, 2003.
(PDF) Abstract: This paper extends some recently proposed exact simulation algorithms for point processes to marked patterns and reports on a simulation study into the relative efficiency for a range of Markov marked point processes. Elke Thönnes. Generic CFTP in Stochastic Geometry and beyond. Revista de Estatistica, Special Issue, Proceedings of the 23rd European Meeting of Statisticians in Funchal, 81-84, 2001. Mark Huber. Perfect sampling using bounding chains. Annals of Applied Probability 14(2):734-753, 2004. Abstract: Bounding chains are a technique that offers three benefits to Markov chain practitioners: a theoretical bound on the mixing time of the chain under restricted conditions, experimental bounds on the mixing time of the chain that are provably accurate, and construction of perfect sampling algorithms when used in conjunction with protocols such as coupling from the past. Perfect sampling algorithms generate variates exactly from the target distribution without the need to know the mixing time of a Markov chain at all. We present here the basic theory and use of bounding chains for several chains from the literature, analyzing the running time when possible. We present bounding chains for the transposition chain on permutations, the hard-core gas model, proper colorings of a graph, the antiferromagnetic Potts model, and sink-free orientations of a graph. Wilfrid S. Kendall. Notes on Perfect Simulation, 2004. Abstract: Perfect simulation refers to the art of converting suitable Markov Chain Monte Carlo algorithms into algorithms which return exact draws from the target distribution, instead of long-time approximations. The theoretical concepts underlying perfect simulation have a long history, but they were first drawn together to form a practical simulation technique in the groundbreaking paper of Propp and Wilson [75], which showed (for example) how to obtain exact draws from the critical Ising model.
These lecture notes are organized around four main themes of perfect simulation: the original or classic Coupling From The Past algorithm (CFTP); variations which exploit regeneration ideas such as small-set or split-chain constructions from Markov chain theory (small-set CFTP); generalizations of CFTP which deal with non-monotonic and non-uniformly ergodic examples (dominated CFTP); and finally some theoretical complements. Mark Huber. A bounding chain for Swendsen-Wang. Random Structures and Algorithms 22(1):43-59, 2002. Abstract: The greatest drawback of Monte Carlo Markov chain methods is lack of knowledge of the mixing time of the chain. The use of bounding chains solves this difficulty for some chains by giving theoretical and experimental upper bounds on the mixing time. Moreover, when used with methodologies such as coupling from the past, bounding chains allow the user to take samples drawn exactly from the stationary distribution without knowledge of the mixing time. Here we present a bounding chain for the Swendsen-Wang process. The Swendsen-Wang bounding chain allows us to efficiently obtain exact samples from the ferromagnetic Q-state Potts model for certain classes of graphs. Also, by analyzing this bounding chain, we will show that Swendsen-Wang is rapidly mixing over a slightly larger range of parameters than was known previously. Nancy L. Garcia and Nevena Maric. Perfect simulation for a continuous one-dimensional loss network, 2002. (Postscript)
Abstract:
Sufficient conditions for ergodicity of one-dimensional loss networks on ... Yuzhi Cai and Wilfrid S. Kendall. Perfect simulation for correlated Poisson random variables conditioned to be positive. Statistics and Computing 12(3):229-243, 2002. Abstract: In this paper we present a perfect simulation method for obtaining perfect samples from collections of correlated Poisson random variables conditioned to be positive. We show how to use this method to produce a perfect sample from a Boolean model conditioned to cover a set of points: in W. S. Kendall and E. Thönnes (Pattern Recognition 32(9):1569-1586, 1999), this special case was treated in a more complicated way. The method is applied to several simple examples where exact calculations can be made, so as to check correctness of the program using χ² tests, and some small-scale experiments are carried out to explore the behaviour of the conditioned Boolean model. Ian W. McKeague and Marc Loizeaux. Perfect Sampling for Point Process Cluster Modelling. Chapter 5 of Spatial Cluster Modelling (A. Lawson and D. Denison, eds.), Chapman and Hall (2002), pages 87-107. (Postscript) Marc A. Loizeaux and Ian W. McKeague. Perfect Sampling for Posterior Landmark Distributions with an Application to the Detection of Disease Clusters. IMS Lecture Notes - Monograph Series, Vol. 37, 321-331 (2001). (Postscript) Abstract: We study perfect sampling for the posterior distribution in a class of spatial point process models introduced by Baddeley and van Lieshout (1993). For Neyman-Scott cluster models, perfect sampling from the posterior is shown to be computationally feasible via a coupling-from-the-past type algorithm of Kendall and Møller. An application to data on leukemia incidence in upstate New York is presented. M. N. M. van Lieshout and Erik van Zwet. Exact sampling from conditional Boolean models with applications to maximum likelihood inference. Advances in Applied Probability 33(2):339-353, 2001.
Abstract: We are interested in estimating the intensity parameter of a Boolean model of discs (the bombing model) from a single realization. To do so, we derive the conditional distribution of the points (germs) of the underlying Poisson process. We demonstrate how to apply coupling from the past to generate samples from this distribution, and use the samples thus obtained to approximate the maximum likelihood estimator of the intensity. We discuss and compare two methods: one based on a Monte Carlo approximation of the likelihood function, the other a stochastic version of the EM algorithm. George Casella, Christian P. Robert, and Martin T. Wells. Rao-Blackwellization of Generalized Accept-Reject Schemes, 2000. (Postscript) Abstract: This paper extends the accept-reject algorithm to allow the proposal distribution to change at each iteration. We first establish a necessary and sufficient condition for this generalized accept-reject algorithm to be valid, and then show how the Rao-Blackwellization of Casella and Robert (1996) can be extended to this setting. An important application of these results is to the perfect sampling technique of Fill (1998), which is a generalized accept-reject algorithm in disguise. Jesper Møller and Jakob G. Rasmussen. Perfect Simulation of Hawkes Processes. Research Report R-2004-18, Department of Mathematical Sciences, Aalborg University, 2004. Abstract: This article concerns a perfect simulation algorithm for unmarked and marked Hawkes processes. The usual straightforward simulation algorithm suffers from edge effects, whereas our perfect simulation algorithm does not. By viewing Hawkes processes as Poisson cluster processes and using their branching and conditional independence structure, useful approximations of the distribution function for the length of a cluster are derived. This is used to construct upper and lower processes for the perfect simulation algorithm. Examples of applications and empirical results are presented. Wilfrid S.
Kendall. Geometric Ergodicity and Perfect Simulation, 2004. (PDF) Abstract: This note extends the work of Foss and Tweedie (1998), who showed that availability of the classic Coupling from The Past algorithm of Propp and Wilson (1996) is essentially equivalent to uniform ergodicity for a Markov chain (see also Hobert and Robert 2004). In this note we show that all geometrically ergodic chains possess dominated Coupling from The Past algorithms (not necessarily practical!) which are rather closely connected to Foster-Lyapunov criteria. Chris Holmes and David G. T. Denison. Perfect sampling for wavelet reconstruction of signals. IEEE Transactions on Signal Processing 50(2):337-344, 2002. (PDF) Abstract: The coupling from the past (CFTP) procedure is a protocol for finite-state Markov chain Monte Carlo (MCMC) methods whereby the algorithm itself can determine the necessary runtime to convergence. In this paper, we demonstrate how this protocol can be applied to the problem of signal reconstruction using Bayesian wavelet analysis where the dimensionality of the wavelet basis set is unknown, and the observations are distorted by Gaussian white noise of unknown variance. MCMC simulation is used to account for model uncertainty by drawing samples of wavelet bases for approximating integrals (or summations) on the model space that are either too complex or too computationally demanding to perform analytically. We extend the CFTP protocol by making use of the central limit theorem to show how the algorithm can also monitor its own approximation error induced by MCMC. In this way, we can assess the number of MCMC samples needed to approximate the integral to within a user-specified tolerance level. Hence, the method automatically ensures convergence and determines the necessary number of iterations needed to meet the error criteria. Radu V. Craiu and Xiao-Li Meng. Multiprocess parallel antithetic coupling for backward and forward Markov chain Monte Carlo.
Preprint, 2003. Abstract: Antithetic coupling is a general stratification strategy for reducing Monte Carlo variance without increasing the simulation size. The use of the antithetic principle in the Monte Carlo literature typically employs two strata via antithetic quantile coupling. We demonstrate here that further stratification, obtained by using k > 2 (e.g., k = 3-10) antithetically coupled variates, can offer substantial additional gain in Monte Carlo efficiency, in terms of both variance and bias. The reason for reduced bias is that antithetically coupled chains can provide a more dispersed search of the state space than multiple independent chains. The emerging area of perfect simulation provides a perfect setting for implementing the k-process parallel antithetic coupling for MCMC because, without antithetic coupling, this class of methods delivers genuine independent draws. Furthermore, antithetic backward coupling provides a very convenient theoretical tool for investigating antithetic forward coupling. However, the generation of k > 2 antithetic variates that are negatively associated, i.e., they preserve negative correlation under monotone transformations, and extremely antithetic, i.e., they are as negatively correlated as possible, is more complicated compared to the case with k = 2. In this paper, we establish a theoretical framework for investigating such issues. Among the generating methods that we compare, Latin hypercube sampling and its iterative extension appear to be general-purpose choices, making another direct link between Monte Carlo and quasi-Monte Carlo. J. N. Corcoran and U. Schneider. Pseudo-perfect and adaptive variants of the Metropolis-Hastings algorithm with an independent candidate density, 2004. To appear in Journal of Statistical Computation and Simulation.
(PDF) Abstract: We describe and examine an imperfect variant of a perfect sampling algorithm based on the Metropolis-Hastings algorithm that appears to perform better than a more traditional approach in terms of speed and accuracy. We then describe and examine an ``adaptive'' Metropolis-Hastings algorithm which generates and updates a self-target candidate density in such a way that there is no ``wrong choice'' for an initial candidate density. Simulation examples are provided. M. S. Carvalho and J. N. Corcoran. Solving stochastic difference equations originated from continuous-time threshold AR(1) models through perfect simulation, 2003. Preprint. Abstract: In this paper we look at a discretization of a continuous-time threshold AR(1) process (CTAR(1)) and show how to apply perfect simulation to find the stationary distribution of the resulting stochastic difference equation. Such a method leads to perfect solutions of stochastic difference equations of this kind. Depending on how sensitive the original CTAR(1) process is to the discretization, the method may give an approximate solution to the original continuous-time process. Also, we introduce the phenomenon of decoupling in a perfect simulation algorithm. Jesper Møller, A. N. Pettitt, K. K. Berthelsen, and R. W. Reeves. An efficient Markov chain Monte Carlo method for distributions with intractable normalising constants. (PDF) Abstract: We present new methodology for drawing samples from a posterior distribution when (i) the likelihood function or (ii) a part of the prior distribution is only specified up to a normalising constant. In case (i), the novelty lies in the introduction of an auxiliary variable in a Metropolis-Hastings algorithm and the choice of proposal distribution so that the algorithm does not depend upon the unknown normalising constant. In case (ii), similar ideas apply and the situation is even simpler as no auxiliary variable is required.
Our method is ``online'' as compared with alternative approaches to the problem which require ``offline'' computations. Since it is necessary to simulate from the ``unknown distribution'', e.g. the likelihood function in case (i), perfect simulation such as the Propp-Wilson algorithm becomes useful. We illustrate the method in case (i) by producing posterior samples when the likelihood is given by an Ising model and by a Strauss point process. Shane G. Henderson and Richard L. Tweedie. Perfect simulation for Markov chains arising from discrete-event simulation. Proceedings of the 38th Annual Allerton Conference on Communication, Control and Computing, pp. 1125-1134, 2000. (Postscript) Abstract: Virtually any discrete-event simulation can be rigorously defined as a Markov chain evolving on a general state space, and under appropriate conditions, the chain has a unique stationary probability distribution. Many steady-state performance measures can be expressed in terms of the stationary probability distribution of the chain. We would like to apply ``coupling from the past'' algorithms to obtain samples from the stationary probability distribution of such chains. Unfortunately, the structural properties of the chains arising from discrete-event simulations preclude the immediate application of current coupling from the past algorithms. We describe why this is the case, and extend a class of coupling from the past algorithms so that they may be applied in this setting. Michael Korn and Igor Pak. Tilings of rectangles with T-tetrominoes, 2003. Theoretical Computer Science 319(1-3):3-27, 2004. [PDF] Abstract: We prove that any two tilings of a rectangular region by T-tetrominoes are connected by moves involving only 2 and 4 tiles. We also show that the number of such tilings is an evaluation of the Tutte polynomial. The results are extended to a more general class of regions.
From the introduction: Along the way we show that our new height functions enable us to define a lattice structure on all T-tetromino tilings of a rectangle. While this may seem a theoretical curiosity, in fact there are important applications of this result. There is a natural definition of a Markov chain on tilings (perform ``random local moves'') which translates into a nice Markov chain on height functions. Using the ``coupling from the past'' technique of Propp and Wilson and the fact that the height functions form a lattice (see [15, 14]) one can sample random T-tetromino tilings from an exactly uniform distribution (as opposed to a ``nearly uniform'' distribution usually obtained by the Markov chain approach). We refer to [16] for a full discussion of this approach and other examples of lattice structures on tilings. James P. Hobert and Christian P. Robert. A mixture representation of π with applications in Markov chain Monte Carlo and perfect sampling. The Annals of Applied Probability 14(3):1295-1305, 2004. (PDF) Abstract: Let X = {X_n : n = 0, 1, 2, ...} be an irreducible, positive recurrent Markov chain with invariant probability measure π. We show that if X satisfies a one-step minorization condition, then π can be represented as an infinite mixture. The distributions in the mixture are associated with the hitting times on an accessible atom introduced via the splitting construction of Athreya and Ney [Trans. Amer. Math. Soc. 245 (1978) 493-501] and Nummelin [Z. Wahrsch. Verw. Gebiete 43 (1978) 309-318]. When the small set in the minorization condition is the entire state space, our mixture representation of π reduces to a simple formula, first derived by Breyer and Roberts [Methodol. Comput. Appl. Probab. 3 (2001) 161-177], from which samples can be easily drawn.
Despite the fact that the derivation of this formula involves no coupling or backward simulation arguments, the formula can be used to reconstruct perfect sampling algorithms based on coupling from the past (CFTP) such as Murdoch and Green's [Scand. J. Statist. 25 (1998) 483-502] Multigamma Coupler and Wilson's [Random Structures Algorithms 16 (2000) 85-113] Read-Once CFTP algorithm. In the general case where the state space is not necessarily 1-small, under the assumption that X satisfies a geometric drift condition, our mixture representation can be used to construct an arbitrarily accurate approximation to π from which it is straightforward to sample. One potential application of this approximation is as a starting distribution for a Markov chain Monte Carlo algorithm based on X. Stephen P. Brooks, Yanan Fan, and Jeffrey S. Rosenthal. Perfect Forward Simulation via Simulated Tempering. Preprint. Abstract: Several authors discuss how the simulated tempering scheme provides a very simple mechanism for introducing regenerations within a Markov chain. In this paper we explain how regenerative simulated tempering schemes provide a very natural mechanism for perfect simulation. We use this to provide a perfect simulation algorithm, which uses a single-sweep forward simulation without the need for recursively searching through negative times. We demonstrate this algorithm in the context of several examples. F. Amaya and J. M. Benedí. Using Perfect Sampling in Parameter Estimation of a Whole Sentence Maximum Entropy Language Model. Proc. Fourth Computational Natural Language Learning Workshop, CoNLL-2000, pages 79-82, 2000. Abstract: The Maximum Entropy principle (ME) is an appropriate framework for combining information of a diverse nature from several sources into the same language model. In order to incorporate long-distance information into the ME framework in a language model, a Whole Sentence Maximum Entropy Language Model (WSME) could be used.
Until now, Monte Carlo Markov chain (MCMC) sampling techniques have been used to estimate the parameters of the WSME model. In this paper, we propose the application of another sampling technique: Perfect Sampling (PS). The experiments have shown a reduction of 30% in the perplexity of the WSME model over the trigram model and a reduction of 2% over the WSME model trained with MCMC. Shuji Kijima and Tomomi Matsui. Polynomial time perfect sampling algorithm for two-rowed contingency tables, 2003. Tech report METR 2003-15. Abstract: This paper proposes a polynomial time perfect (exact) sampling algorithm for 2×n contingency tables. Our algorithm is a Las Vegas type randomized algorithm and the expected running time is bounded by O(n^{3} ln N) where n is the number of columns and N is the total sum of all entries in a table. The algorithm is based on the monotone coupling from the past (monotone CFTP) algorithm and a new Markov chain for sampling two-rowed contingency tables uniformly. We employed the path coupling method and showed that the mixing rate of our chain is bounded by n(n-1)²(1+ln(nN)). Our result indicates that uniform generation of two-rowed contingency tables is easier than the corresponding counting problem, since the counting problem is known to be #P-complete. Tomomi Matsui and Shuji Kijima. Polynomial Time Perfect Sampler for Discretized Dirichlet Distribution, 2003. Tech report METR 2003-17. Abstract: In this paper, we propose a perfect (exact) sampling algorithm according to a discretized Dirichlet distribution. Our algorithm is a monotone coupling from the past algorithm, which is a Las Vegas type randomized algorithm. We propose a new Markov chain whose limit distribution is a discretized Dirichlet distribution. Our algorithm simulates transitions of the chain O(n^{3} ln Δ) times where n is the dimension (the number of parameters) and 1/Δ is the grid size for discretization. Thus the obtained bound does not depend on the magnitudes of parameters.
In each transition, we need to sample a random variable according to a discretized beta distribution (2-dimensional Dirichlet distribution). To show the polynomiality, we employ the path coupling method carefully and show that our chain is rapidly mixing. Motoya Machida. Fill's algorithm for absolutely continuous stochastically monotone kernels. Electronic Communications in Probability, Vol. 7, paper 15, 2002. Abstract: Fill, Machida, Murdoch, and Rosenthal (2000) presented their algorithm and its variants to extend the perfect sampling algorithm of Fill (1998) to chains on continuous state spaces. We consider their algorithm for absolutely continuous stochastically monotone kernels, and show the correctness of the algorithm under a set of certain regularity conditions. These conditions succeed in relaxing the previously known hypotheses sufficient for their algorithm to apply. Nancy L. Garcia and Mariana R. Motta. Perfect simulation for a stationary silo with absorbing walls. Methodology and Computing in Applied Probability 5(1):109-121, 2003. (Postscript) Abstract: The objective of this work is to generate random samples from the unique stationary distribution associated with the stochastic model for grain storage in a finite bidimensional silo. The support of this measure is an unbounded and continuous state space, and therefore a truncation was necessary to apply the CFTP perfect simulation scheme. The performance of the algorithm was measured by comparing the sample moments to the theoretical ones. Wolfgang Hörmann and Josef Leydold. Improved perfect slice sampling, 2003. Preprint. Abstract: Perfect slice sampling is a method to turn Markov chain Monte Carlo (MCMC) samplers into exact generators for independent random variates. The originally proposed method is rather slow and thus several improvements have been suggested. However, two of them are erroneous.
In this article we give a short introduction to perfect slice sampling, point out incorrect methods, and give a new improved version of the original algorithm. Damian Clancy and Philip D. O'Neill. Perfect simulation for stochastic models of epidemics among a community of households, 2002. Preprint. Abstract: Recent progress in the analysis of infectious disease data using stochastic epidemic models has been largely due to Markov chain Monte Carlo methodology. In the present paper it is illustrated that certain methods of perfect simulation can also be applied in the context of epidemic models. Specifically, a perfect simulation algorithm due to Corcoran and Tweedie (2002) is considered for epidemic models that feature a population divided into households. The models allow individuals to be potentially infected both from outside the household and from other household members. The data are assumed to consist of the final numbers ultimately infected within each household in the community. The methods are applied to data taken from outbreaks of influenza, and the results are compared with those obtained by other approaches. R. Fernández, P. A. Ferrari, and A. Galves. Coupling, renewal and perfect simulation of chains of infinite order. Notes for a course in the V Brazilian School of Probability, Ubatuba, 2001. This is a 95-page booklet developing an explicit construction of chains with infinite memory under the Harris regime. S. Foss and T. Konstantopoulos. Extended renovation theory and limit theorems for stochastic ordered graphs. Markov Processes Relat. Fields 9:413-468, 2003. (PDF) Abstract: We extend Borovkov's renovation theory to obtain criteria for coupling convergence of stochastic processes that do not necessarily obey stochastic recursions.
The results are applied to an ``infinite bin model'', a particular system that is an abstraction of a stochastic ordered graph, i.e., a graph on the integers that has (i, j), i < j, as an edge with probability p, independently from edge to edge. A question of interest is an estimate of the length L_{n} of a longest path between two vertices at distance n. We give sharp bounds on C = lim_{n→∞} (L_{n}/n). This is done by first constructing the unique stationary version of the infinite bin model, using extended renovation theory. We also prove a functional law of large numbers and a functional central limit theorem for the infinite bin model. Finally, we discuss perfect simulation, in connection to extended renovation theory, and as a means for simulating the particular stochastic models considered in this paper. Francesco Bartolucci and Julian Besag. A recursive algorithm for Markov random fields. Biometrika 89(3):724-730, 2002. (PDF) Abstract: We propose a recursive algorithm as a more useful alternative to the Brook expansion for the joint distribution of a vector of random variables when the original formulation is in terms of the corresponding full conditional distributions, as occurs for Markov random fields. Usually, in practical applications, the computational load will still be excessive, but then the algorithm can be used to obtain the componentwise full conditionals of a system after marginalizing over some variables, or the joint distribution of subsets of the variables, conditioned on values of the remainder, which is required for block Gibbs sampling. As an illustrative example, we apply the algorithm in the simplest nontrivial setting of hidden Markov chains. More important, we demonstrate how it applies to Markov random fields on regular lattices and to perfect block Gibbs sampling for binary systems. Olle Häggström, Johan Jonasson, and Russell Lyons. Coupling and Bernoullicity in random-cluster and Potts models. Bernoulli 8(3):275-294, 2002.
(Postscript) Abstract: An explicit coupling construction of random-cluster measures is presented. As one of the applications of the construction, the Potts model on amenable Cayley graphs is shown to exhibit at every temperature the mixing property known as Bernoullicity. M. N. M. van Lieshout and A. J. Baddeley. Extrapolating and interpolating spatial patterns. Chapter 4 of Spatial Cluster Modelling (A. Lawson and D. Denison, eds.), Chapman and Hall/CRC, 2002. Pages 61-86. Abstract: We discuss issues arising when a spatial pattern is observed within some bounded region of space, and one wishes to predict the process outside of this region (extrapolation) as well as to perform inference on features of the pattern that cannot be observed (interpolation). We focus on spatial cluster analysis. Here the interpolation arises from the fact that the centres of clustering are not observed. We take a Bayesian approach with a repulsive Markov prior, derive the posterior distribution of the complete data, i.e. cluster centres with associated offspring marks, and propose an adaptive coupling from the past algorithm to sample from this posterior. The approach is illustrated by means of the redwood data set (Ripley, 1977). Jorge Mateu, Jordi Artés, and José A. López. Computational issues for perfect simulation in spatial point patterns. Communications in Nonlinear Science and Numerical Simulation 9(2):229-240, 2004. Petar M. Djurić, Yufei Huang, and Tadesse Ghirmai. Perfect sampling: a review and applications to signal processing. IEEE Transactions on Signal Processing 50(2):345-356, 2002. (PDF) Abstract: In recent years, Markov chain Monte Carlo (MCMC) sampling methods have gained much popularity among researchers in signal processing. The Gibbs and the Metropolis-Hastings algorithms, which are the two most popular MCMC methods, have already been employed in resolving a wide variety of signal processing problems.
A drawback of these algorithms is that, in general, they cannot guarantee that the samples are drawn exactly from a target distribution. More recently, new Markov chain-based methods have been proposed, and they produce samples that are guaranteed to come from the desired distribution. They are referred to as perfect samplers. In this paper, we review some of them, with the emphasis being given to the algorithm coupling from the past (CFTP). We also provide two signal processing examples where we apply perfect sampling. In the first, we use perfect sampling for restoration of binary images and, in the second, for multiuser detection of CDMA signals. Florent Tournois. Perfect simulation of a stochastic model for CDMA coverage. INRIA research report #4348, 2002. Abstract: The aim of this article is to present ways of getting realizations of the stochastic model for CDMA coverage introduced in [1]. This is a basic stochastic geometry model that represents the location of antennas as realizations of Poisson point processes in the plane and allows one to capture regions (CDMA cells) of the plane with signal-to-noise ratio SNR to/from a given antenna at a sufficient level. We describe an algorithm for perfect simulation of an arbitrarily close approximation of the CDMA cell mosaic given some coverage constraints. This is an adaptation of the so-called backward simulation of the conditional coverage process developed in [4]. The approximation comes from the estimation of the SNR, which is modeled by a Poisson shot-noise process, based on an influence window whose size is theoretically adjusted to meet the true value of the SNR with an arbitrary precision. As a motivating example we estimate the distribution functions of the distance from a fixed location to regions under various given handoff conditions (so-called contact distribution functions). Philip D. O'Neill. Perfect simulation for Reed-Frost epidemic models. Statistics and Computing 13(1):37-44, 2003.
Abstract: The Reed-Frost epidemic model is a simple stochastic process with parameter q that describes the spread of an infectious disease among a closed population. Given data on the final outcome of an epidemic, it is possible to perform Bayesian inference for q using a simple Gibbs sampler algorithm. In this paper it is illustrated that by choosing latent variables appropriately, certain monotonicity properties hold which facilitate the use of a perfect simulation algorithm. The methods are applied to real data. J. N. Corcoran and U. Schneider. Shift and scale coupling methods for perfect simulation. Probability in the Engineering and Informational Sciences 17(3):277-303, 2003. (PDF) Abstract: We describe and develop a variation on a layered multishift coupler due to Wilson that uses a slice sampling procedure to allow one to obtain potentially common draws from two different distributions. Our main application is coupling sample paths of Markov chains for use in perfect sampling algorithms. The coupler is based on slicing density functions and we describe a ``folding'' mechanism as an attractive alternative to the accept/reject step commonly used in slice sampling algorithms. Applications of the coupler are given to storage models and to autogamma distribution sampling. U. Schneider and J. N. Corcoran. Perfect sampling for Bayesian variable selection in a linear regression model. Journal of Statistical Planning and Inference 126(1):153-171, 2004. Abstract: We describe the use of perfect sampling algorithms for Bayesian variable selection in a linear regression model.
Starting with a basic case solved by Huang and Djuric (2002), where the model coefficients and noise variance are assumed to be known, we generalize the model step by step to allow for other sources of randomness, specifying perfect simulation algorithms that solve these cases by incorporating various techniques including Gibbs sampling, the perfect independent Metropolis-Hastings (IMH) algorithm, and recently developed ``slice coupling'' algorithms. Applications to simulated data sets suggest that our algorithms perform well in identifying relevant predictor variables. J. N. Corcoran, U. Schneider, and H.-B. Schüttler. Perfect stochastic summation of high order Feynman graph expansions, 2003. Preprint. Abstract: We describe a new application of an existing perfect sampling technique of Corcoran and Tweedie to estimate the self-energy of an interacting fermion model via Monte Carlo summation. Simulations suggest that the algorithm in this context converges extremely rapidly and results compare favorably to true values obtained by brute force computations for low dimensional toy problems. A variant of the perfect sampling scheme which improves the accuracy of the Monte Carlo sum for small samples is also given. Iain Murray, Zoubin Ghahramani, and David J. C. MacKay. MCMC for doubly-intractable distributions. Proceedings of the 22nd Annual Conference on Uncertainty in Artificial Intelligence (UAI), 2006. (PDF) Abstract: Markov chain Monte Carlo (MCMC) algorithms are routinely used to draw samples from distributions with intractable normalization constants. However, standard MCMC algorithms do not apply to doubly-intractable distributions in which there are additional parameter-dependent normalization terms; for example, the posterior over parameters of an undirected graphical model.
An ingenious auxiliary-variable scheme (Møller et al., 2004) offers a solution: exact sampling (Propp and Wilson, 1996) is used to sample from a Metropolis-Hastings proposal for which the acceptance probability is tractable. Unfortunately the acceptance probability of these expensive updates can be low. This paper provides a generalization of Møller et al. (2004) and a new MCMC algorithm, which obtains better acceptance probabilities for the same amount of exact sampling, and removes the need to estimate model parameters before sampling begins.

Theses Relevant to Perfect Sampling

A number of people have written graduate theses in which a significant component was on perfect sampling with Markov chains. In chronological order these are:
David Bruce Wilson.
Exact Sampling with Markov Chains.
Ph.D. thesis,
MIT Mathematics Department,
May 28, 1996.
Tine Møller Sørensen.
A study of Propp & Wilson's method for exact Markov chain Monte Carlo sampling, 1997.
Master's thesis in computational statistics, University of Bath.
Morten Fismen.
Exact simulation using Markov chains.
Technical Report 6/98, Institutt for Matematiske Fag, 1998. Diploma thesis.
Karin Nelander.
Contributions to Percolation Theory and Exact Sampling.
Ph.D. thesis,
Department of Mathematics, Göteborg,
March 29, 1998.
Jem Noelle Corcoran.
Perfect Sampling in MCMC Algorithms.
Ph.D. thesis, Department of Statistics at Colorado State University,
May 1998.
Bas Straatman. Exact sampling and applications to a mite dispersal model. Master's thesis, Utrecht, August 10, 1998.
Elke Thönnes.
Perfect and Imperfect Simulations in Stochastic Geometry.
Ph.D. thesis,
University of Warwick,
October 1998.
Motoya Machida.
Stochastic Monotonicity and Realizable Monotonicity.
PhD thesis,
The Johns Hopkins University,
Department of Mathematical Sciences,
1999.
Mark Lawrence Huber.
Perfect Sampling Using Bounding Chains.
Ph.D. thesis,
Cornell University,
May 1999.
Michael Harvey.
Monte Carlo Inference for Belief Networks Using Coupling From the Past.
Master's thesis, Department of Computer Science, University of Toronto, 1999.
Jens Lund.
Statistical inference and perfect simulation for point processes observed with noise.
PhD thesis,
The Royal Veterinary and Agricultural University, Copenhagen,
Department of Statistics,
December 1999.
Alison Gibbs.
Convergence of Markov Chain Monte Carlo Algorithms with Applications to Image Restoration.
PhD thesis, University of Toronto, Department of Statistics, October 1999.
(Postscript, PDF.)
Jesper Møller. Aspects of spatial statistics, stochastic geometry and Markov chain Monte Carlo methods. Thesis for doctoral degree in Natural Sciences, Aalborg University.
Tina Dam,
Jan Brønnum Sørensen, and
Michael Hedegaard Thomsen.
Master's thesis,
Aalborg University,
Department of Mathematical Sciences,
1999.
Mariana Rodrigues Motta.
Perfect Simulation of the Bidimensional Silo Model.
Master's thesis, 2001.
Giovanni Montana.
Small sets and domination in perfect simulation.
Ph.D. thesis, University of Warwick,
April 2002.
Graeme K. Ambler.
Dominated Coupling From The Past and Some Extensions of the AreaInteraction Process.
PhD thesis,
University of Bristol,
Department of Mathematics,
2002.
Can Mert.
Bayesian inference in Schizophrenia research.
Master's thesis,
Royal Institute of Technology,
Computer Science,
2002.
Ulrike Schneider.
Advances and Applications in Perfect Sampling.
PhD thesis,
University of Colorado,
Department of Applied Mathematics,
2003.
Murali Haran.
Efficient Perfect and MCMC Sampling Methods for Bayesian Spatial and Components of Variance Models.
PhD thesis,
University of Minnesota,
School of Statistics,
2003.
Faheem Mitha.
Perfect Sampling on Continuous State Spaces.
PhD thesis,
University of North Carolina at Chapel Hill,
Department of Statistics,
2003.
Jordi Artés.
Simulación exacta en procesos puntuales espaciales: método CFTP / Exact simulation for spatial point processes: CFTP method.
Master's thesis,
Universitat Jaume I,
Departamento de Matemáticas,
2004.
Tuomas Rajala.
Traditional MCMC and Exact simulation of the Ising model.
Master's thesis,
University of Jyväskylä,
Department of Mathematics and Statistics,
2005.
Research Talks

There are a number of research talks that do not appear to have associated written articles that are publicly available. These are listed below. Chris Jennison. Approximate exact sampling: towards the general application of Propp and Wilson's algorithm. Seminar at University of Bath, 1998.
Abstract:
Propp and Wilson's sampling-from-the-past rule allows the possibility of exact sampling by Markov chain Monte Carlo (MCMC) methods. We describe the Propp and Wilson method and consider details of its practical usage.
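The Propp and Wilson method discussed in many of the entries above can be sketched in a few lines for a toy monotone chain. The following is an illustrative sketch only, not any particular author's implementation: the chain (a lazy clamped random walk on {0, ..., n}) and all function names are our own choices. Coupled copies are started from the top and bottom states at times -1, -2, -4, ..., reusing the same randomness for each time slot, until they coalesce by time 0.

```python
import random

def update(x, u, n):
    """Monotone update rule for a lazy random walk on {0, ..., n}:
    step down if u < 1/3, up if u > 2/3, otherwise hold (clamped at the ends)."""
    if u < 1/3:
        return max(x - 1, 0)
    if u > 2/3:
        return min(x + 1, n)
    return x

def monotone_cftp(n, rng=None):
    """Monotone coupling from the past: because update() preserves order and
    the state space has a top (n) and bottom (0), coalescence of the extreme
    trajectories implies coalescence of all trajectories."""
    rng = rng or random.Random()
    us = []  # us[k] is the uniform used for the step taken at time -(k+1)
    T = 1
    while True:
        while len(us) < T:          # extend the randomness further into the past
            us.append(rng.random())  # fixed once drawn, as CFTP requires
        hi, lo = n, 0
        for k in range(T - 1, -1, -1):  # evolve from time -T up to time 0
            hi = update(hi, us[k], n)
            lo = update(lo, us[k], n)
        if hi == lo:
            return hi  # an exact draw from the stationary distribution
        T *= 2  # no coalescence: restart from twice as far in the past

print(monotone_cftp(10))
```

The essential point, as the expositions cited on this page explain, is that the random numbers for each time slot are drawn once and reused on every restart; redrawing them would bias the output.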
Xiao-Li Meng. Improving Perfect Simulation. Seminar at the Department of Statistics, The University of Chicago, January 25, 1999.
Motoya Machida. The Use of Perfect Sampling for Mixture Problems. Talk given at the Fourth North American New Researchers Conference, held at The Johns Hopkins University, August 4-7, 1999, also given in Singapore.
Chris Holmes. Bayesian Model Averaging with Orthogonal Predictors. Seminar, April 29, 1999. Abstract: An MCMC sampling procedure known as coupling from the past (CFTP) is described for variable selection in the Bayesian linear model with orthogonal predictors, which includes wavelets, Fourier series and Chebyshev polynomials. CFTP allows one to draw perfect i.i.d. samples from the posterior model space in realistic CPU time. Examples are used to demonstrate this approach on a number of curve fitting problems. Olivier Roques. Coupler depuis le passé (Coupling from the past). Seminar, May 19, 1999. Abstract (translated from the French): When one wishes to generate objects according to a distribution P, one can construct an ergodic Markov chain whose stationary distribution is P. The whole problem is knowing how long this process must run to get within epsilon of the stationary distribution, i.e., bounding the mixing time of the chain. The ``coupling from the past'' method [Kendall, Propp, Wilson, Jerrum, ...] makes it possible to avoid these often complicated calculations and to obtain a sample from the set of states distributed exactly according to the stationary distribution. Duncan Murdoch. Perfect Sampling: Not Just for Markov Chains? Talk 26310 at the 1999 Joint Statistical Meeting, also given at McMaster. Abstract: Propp and Wilson's (1996) coupling from the past (CFTP) algorithm generates a sample from the limiting distribution of a Markov chain. In this talk I argue that the underlying idea of CFTP is more widely applicable, and will demonstrate successful and unsuccessful attempts to apply it to problems ranging from approximation of integrals to simulation of stochastic differential equations. Antonietta Mira. Ordering, Slicing and Splitting Monte Carlo Markov Chains. Talk 26328 at the 1999 Joint Statistical Meeting.
Abstract: The class of Markov chains having a specified stationary distribution is very large, so it is important to have criteria telling when one chain performs better than another. The Peskun ordering is a partial ordering on this class. We study the implications of the Peskun ordering both in finite and general state spaces. Unfortunately there are many Metropolis-Hastings samplers that are not comparable in the Peskun sense. We thus define a more useful ordering, the covariance ordering, that allows us to compare a wider class of sampling schemes. It is always possible to beat the Metropolis-Hastings algorithm in terms of the Peskun ordering. Two new algorithms will be introduced and studied for this purpose. The slice sampler outperforms the independence Metropolis-Hastings algorithm. The splitting rejection sampler outperforms the more general Metropolis-Hastings algorithm. Finally we propose a ``perfect'' version of the slice sampler. Perfect sampling allows exact simulation of random variables from the stationary measure of a Markov chain. By exploiting monotonicity properties of the slice sampler we show that a perfect version of the algorithm can be easily implemented. Motoya Machida. Perfect Sampling via Cross-Monotonicity for Simple Mixture Problems. October 13, 1999. Abstract: Since Propp and Wilson's seminal work published in 1996, research on perfect sampling techniques has extended in many directions, particularly in applications with Markov chain Monte Carlo (MCMC). Given an ergodic Markov chain, the idea of perfect sampling is, in essence, to generate a sample from the exact stationary distribution in a finite number of steps. Two methods for developing such a perfect sampling algorithm are known, the one based on coupling from the past (CFTP) by Propp and Wilson (1996) and the other via rejection sampling by Fill (1998).
Fill's algorithm has been known for its advantage that the returned sample is independent of the running time, but also for its intricacy in both theory and practice. The recent paper by Fill, Machida, Murdoch, and Rosenthal (1999) spelled out a general framework for his algorithm, and proposed various possible extensions. Hobert, Robert, and Titterington (1999) have studied the CFTP algorithm for an MCMC evaluation of simple mixture problems. Their study was limited to two- and three-component mixtures, and suggested a difficulty in implementing the CFTP algorithm when the mixture has more than three components. We demonstrate how a framework of Fill's algorithm can be applied to this particular mixture problem, and propose a perfect sampling algorithm using a cross-monotone chain. Our algorithm can be implemented with any number of components, but may not perform well as the number of components increases. The objective of this study is to further stimulate practical applications based on Fill's algorithm.

Laird Breyer. A new coupling construction for Markov chains. July 10, 1999. Abstract: Many branches of probability theory have benefited from coupling constructions, not least the theory of Markov chains. The coupling inequality, for example, allows us to bound the total variation distance of a Markov chain to its stationary distribution in an entirely probabilistic way. This has led recently to computable bounds. As another example, perfect simulation exploits couplings to produce exact samples from distributions that are hard to simulate from. A central problem is the actual construction of the coupling, which tends to be problem specific and something of an art form. The "splitting technique" is one method which is very general and has been very successful for years. Among its drawbacks is the need to perform analytic estimates before the method can be applied.
In this talk, we shall propose a "more automatic" way of coupling Markov chains, which does not require such estimates, and is applicable to a variety of chains.

Kajsa Fröjd. Perfect simulation of some multitype spatial point processes.

Jem Corcoran. Perfect Simulation with Applications to Statistical Mechanics. February 24, 2000. Abstract: "Perfect simulation": Markov chain Monte Carlo with no statistical error. It sounds great, it even looks kind of easy. So why aren't you doing it? Probably because, as they say, "the devil is in the details." Perfect simulation methods are based on the idea of coupling sample paths of a Markov chain. In this talk we will explore the details of three specific examples, each having specific implementation difficulties. We will consider an unbounded storage model where paths must be coupled on a continuous state space, the standard Ising model of magnetism (both the ferromagnet and antiferromagnet) where paths are running around in a more complicated state space of "spin configurations", and the hard-core lattice gas model where sample paths will naturally repel one another despite the fact that our goal is to get them to meet.

E. Olusegun George, Giri Narasimhan, and Anthony Quas. Fast and exact Gibbs sampler: an application of Propp and Wilson's CFTP algorithm. Abstract: Gibbs sampling is a popular technique for generating samples from posterior distributions with multidimensional parameters. Often the posterior distributions for such problems are analytically or numerically complex. As in the general MCMC procedure, the samples for posterior analysis are usually collected after a burn-in period, by which time it is assumed that the sequence of generated values is close to that which would be generated by the true posterior distribution.
The problem with this method is that, in order to be sure that the sample for posterior analysis is a reasonable representation of the true posterior, a very large number of iterations is typically used for burn-in. Furthermore, although the bias introduced by the choice of initial value may become vanishingly small with an increasing number of iterations, it is never completely eliminated. In practice, the choice of the number of iterations required for burn-in is arbitrary, and many researchers have proposed less than satisfactory techniques to approach the question. We introduce an alternative method to perform Gibbs sampling that addresses the above-mentioned shortcomings. The practical advantages of our methods are as follows: the samples generated are 'perfect', i.e., the bias is completely eliminated; the method has a well-defined stopping rule; and the expected number of iterations needed for burn-in can be estimated in many cases. We illustrate our method by computing a posterior distribution in a teratology application. Our method is an extension of the techniques for perfect sampling suggested by Propp and Wilson (1996).

Surveys and Representative Articles from Related Areas

Information on heuristic algorithms for assessing Markov chain convergence to the stationary distribution (among other things) can be found in these resources:

Alan D. Sokal. Monte Carlo methods in statistical mechanics: Foundations and new algorithms, 1989. Lecture notes from Cours de Troisième Cycle de la Physique en Suisse Romande. Updated in 1996 for the Cargèse Summer School on "Functional Integration: Basics and Applications."

Surveys of rigorous bounds on the convergence rate of Markov chains:

Persi Diaconis and Laurent Saloff-Coste. What do we know about the Metropolis algorithm? In Proceedings of the Twenty-Seventh Annual ACM Symposium on the Theory of Computing, pages 112-129, 1995.

Mark Jerrum and Alistair Sinclair.
The Markov chain Monte Carlo method: an approach to approximate counting and integration, in "Approximation Algorithms for NP-hard Problems," D. S. Hochbaum, ed., PWS Publishing, Boston, 1996.

A good introduction to stopping times with many examples and references:

David Aldous and Persi Diaconis. Shuffling Cards and Stopping Times, American Mathematical Monthly, 93(5):333-348, 1986.

A more recent and more developed reference on stopping times:

Persi Diaconis and James A. Fill. Strong stationary times via a new form of duality. The Annals of Probability, 18:1483-1522, 1990.

Representative articles on stochastically recursive sequences:

Gérard Letac. A contraction principle for certain Markov chains and its applications. In Random Matrices and Their Applications, volume 50 of Contemporary Mathematics, pages 263-273, 1986.

Herman Thorisson. Backward limits, The Annals of Probability, 16(2):914-924, 1988.

A. A. Borovkov and S. G. Foss. Stochastically recursive sequences and their generalizations. Siberian Advances in Mathematics, 2(1):16-81, 1992.
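The backward-limit idea running through Letac's and Thorisson's papers is short enough to sketch in code. Below is a toy Python illustration (my own, not taken from any of the cited papers) for the stochastically recursive sequence X_{n+1} = a X_n + U_n with |a| < 1: composing the random maps backward from the past converges almost surely to a single value, independent of the starting point, and that value is a draw from the stationary distribution.

```python
import random

def backward_limit(a=0.5, x0=0.0, tol=1e-12, seed=0):
    """Backward iteration (Letac's contraction principle) for the
    stochastically recursive sequence X_{n+1} = a*X_n + U_n, |a| < 1.

    The backward composition f_{U_1}(f_{U_2}(... f_{U_n}(x0) ...))
    converges almost surely as n grows, and its limit is a stationary
    draw, whatever the starting point x0.  (Forward iterates converge
    only in distribution and never settle down pathwise.)
    """
    rng = random.Random(seed)
    us = []                    # U_1, U_2, ...; fixed once drawn, then reused
    prev = None
    n = 1
    while True:
        while len(us) < n:
            us.append(rng.uniform(0.0, 1.0))
        x = x0
        for u in reversed(us[:n]):   # apply f_{U_n} first, f_{U_1} last
            x = a * x + u
        if prev is not None and abs(x - prev) < tol:
            return x               # backward limit, a stationary draw
        prev = x
        n *= 2
```

Note that the innovations U_k keep their indices as n grows; redrawing them at each doubling would destroy the almost-sure convergence. Starting from x0 = 0 or x0 = 100 yields the same limit, since the starting point's influence decays like a^n.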
Reversible Markov chains:

David J. Aldous and James A. Fill. Reversible Markov Chains and Random Walks on Graphs. Book in preparation, http://www.stat.berkeley.edu/~aldous/book.html.

Frequently Asked Questions

When explaining CFTP to an audience, there are invariably many questions. Included below are some of these questions together with their answers. [Adapted from section 1.4 of Wilson (2000), first published by the AMS.]
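One question that always comes up is what the CFTP loop actually looks like in practice. Here is a minimal monotone-CFTP sketch in Python; the birth-death chain is a toy example of my own choosing, not drawn from any particular article above.

```python
import random

def monotone_cftp(k=10, p=0.5, seed=1):
    """Monotone CFTP (Propp-Wilson) for a birth-death walk on {0, ..., k}.

    Update rule: given a uniform u, try to move up if u < p, else down,
    clamping at the boundaries.  The rule is monotone (it preserves the
    order of two states), so it suffices to track the chains started
    from the top state k and the bottom state 0.
    """
    rng = random.Random(seed)
    updates = []                      # randomness for times -1, -2, ...

    def step(x, u):
        if u < p:
            return min(x + 1, k)
        return max(x - 1, 0)

    T = 1
    while True:
        while len(updates) < T:
            updates.append(rng.random())
        lo, hi = 0, k
        # run from time -T up to time 0, reusing the SAME randomness
        # for both chains (updates[t] is the move made at time -(t+1))
        for t in range(T - 1, -1, -1):
            u = updates[t]
            lo, hi = step(lo, u), step(hi, u)
        if lo == hi:
            return lo                 # exact draw from the stationary law
        T *= 2
```

With p = 1/2 the stationary law of this clamped walk is uniform on {0, ..., k}, and because the update rule preserves order, coalescence of the top and bottom chains certifies coalescence of all k+1 starting states. As in the backward-limit construction, the update variables keep their time indices as T doubles; that reuse of randomness is what makes the returned sample exact rather than approximate.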
Gallery of Pictures and Java Applets Relevant to the Articles

Postscript illustration of monotone coupling from the past (CFTP)
Postscript illustration of the Murdoch-Green rejection coupler
Postscript illustration of the Murdoch-Green partitioned multigamma coupler
Postscript illustration of the Green-Murdoch random walk Metropolis bisection coupling
Postscript illustration of Green-Murdoch random walk Metropolis bounded CFTP
Jason Woolever's random tiling applets (using monotone CFTP)
Wilfrid Kendall's animated comparison of forwards and backwards coupling
Georgii's perfect continuum Ising model pictures
Jeffrey Rosenthal's monotone-CFTP applet for the chain
MacKay's perfect simulation pictures
Random 3-colorings of the square and cubic lattices
The 4-state Potts model is particularly interesting; click here to learn why and to see much larger perfectly random 4-state Potts configurations. Childs, Patterson, and MacKay give a number of perfect Potts configurations with antiferromagnetic and ferromagnetic interactions, as well as a number of associated animations and plots of thermodynamic quantities.
Perfect simulations of the area-interaction point process (from Kendall's web page of abstracts):
Definitions of Selected Terms

Included here are the definitions of some terms used but not defined above.
Perfect Sampling Code on the Web

There seems to be enough perfect sampling code on the web to warrant making a section of links to it. Here is a partial listing; I will add more links as I track them down or they are called to my attention.
First published in volume 41 of DIMACS Series in Discrete Mathematics and Theoretical Computer Science, published by the American Mathematical Society, 1998. Online version last updated September 7, 2004.
Thanks go to Andrei Broder, James A. Fill, Wilfrid S. Kendall, Jesper Møller, and James G. Propp for their helpful comments, and David J. C. MacKay for help updating the HTML.
