Infer.NET Fun turns the simple succinct syntax of F# into an executable modeling language for Bayesian machine learning.
We propose a marriage of probabilistic functional programming with Bayesian reasoning. Infer.NET Fun turns F# into a probabilistic modeling language: you can code up the conditional probability distributions of Bayes' rule using F# array comprehensions with constraints. Write your model in F#. Run it directly to synthesize test datasets and to debug models. Or compile it with Infer.NET for efficient statistical inference. Hence, efficient algorithms for a range of regression, classification, and specialist learning tasks can be derived through probabilistic functional programming.
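To illustrate the idea of running a model directly to sample data, here is a hedged sketch in plain F# (not the Fun library itself): a tiny two-coins model where we observe that the coins did not both come up heads, approximated by rejection sampling. The helper names (`flip`, `sample`) are ours, chosen for illustration; a real Fun program would instead use the library's probabilistic primitives and could be compiled with Infer.NET for exact inference.

```fsharp
// Sketch only: forward sampling of a small Bayesian model in plain F#,
// mimicking the style of an Infer.NET Fun model. `flip` and `sample`
// are hypothetical helpers, not part of the Fun API.
let rng = System.Random(42)
let flip p = rng.NextDouble() < p

// Model: two fair coins; observation: not both are heads.
// Rejection sampling approximates the posterior over (c1, c2).
let rec sample () =
    let c1 = flip 0.5
    let c2 = flip 0.5
    if c1 && c2 then sample ()        // reject runs that violate the observation
    else (c1, c2)

let samples = List.init 10000 (fun _ -> sample ())
let pHeads1 =
    let n = samples |> List.filter fst |> List.length
    float n / float (List.length samples)

// Analytically, P(c1 = heads | not both heads) = 0.25 / 0.75 = 1/3.
printfn "P(first coin heads | not both heads) ~ %.2f" pHeads1
```

Running the model forward like this is exactly the debugging workflow described above: sample synthetic data, sanity-check it against the analytic answer, then hand the same model to the inference engine.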
- August 2013: our paper on measure transformer semantics for probabilistic programs has been accepted by the journal LMCS.
- March 2013: our TACAS paper wins the EAPLS Best Paper Award for ETAPS 2013. It lets you drive MCMC samplers such as Filzbach from Fun programs.
- Read Andy Gordon's position statement An Agenda for Probabilistic Programming: Usable, Portable, and Ubiquitous for the ISAT/DARPA workshop on "Probabilistic Programming: Democratizing Machine Learning", Menlo Park, February 2013.
- See here for Andy Gordon's talk at POPL 2013, which explains the 5 distributions of a Bayesian model as 5 probabilistic programs in F#.
- And see here for Andy Gordon's Probabilistic Programming talk at OBT 2013.
- See here for the slides and video of Andy Gordon's Infer.NET Fun talk at Lang.NEXT 2012.
Some current participants in the Infer.NET Fun project:
- Mihhail Aizatulin (Open University)
- Johannes Borgström (Uppsala University)
- Andy Gordon (Microsoft Research Cambridge)
- Thore Graepel (Microsoft Research Cambridge)
- Aditya Nori (Microsoft Research Bangalore)
- Sriram Rajamani (Microsoft Research Bangalore)
- Claudio Russo (Microsoft Research Cambridge)
Since September 2012, Infer.NET Fun has been a component of Infer.NET.
"I think it's extraordinarily important that we in computer science keep fun in computing."
— Alan J. Perlis, ACM Turing Award winner, 1966
- Johannes Borgström, Andrew D. Gordon, Michael Greenberg, James Margetson, and Jurgen Van Gael, Measure Transformer Semantics for Bayesian Machine Learning, no. MSR-TR-2011-18, July 2011
- Guillaume Claret, Sriram K. Rajamani, Aditya V. Nori, Andrew D. Gordon, and Johannes Borgström, Bayesian Inference Using Data Flow Analysis, no. MSR-TR-2013-27, March 2013
- Andrew D. Gordon, Mihhail Aizatulin, Johannes Borgström, Guillaume Claret, Thore Graepel, Aditya V. Nori, Sriram K. Rajamani, and Claudio Russo, A Model-Learner Pattern for Bayesian Reasoning, no. MSR-TR-2013-1, January 2013
- Sooraj Bhat, Johannes Borgström, Andrew D. Gordon, and Claudio Russo, Deriving Probability Density Functions from Probabilistic Functional Programs, in 19th International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS), Springer, March 2013