Supercompilation

Call-by-need supercompilation

Call-by-need supercompilation, PhD Thesis, Max Bolingbroke, University of Cambridge, May 2013.

Abstract

This thesis shows how supercompilation, a powerful technique for transformation and analysis of functional programs, can be effectively applied to a call-by-need language. Our setting will be core calculi suitable for use as intermediate languages when compiling higher-order, lazy functional programming languages such as Haskell.

We describe a new formulation of supercompilation which is more closely connected to operational semantics than the standard presentation. As a result of this connection, we are able to exploit a standard Sestoft-style operational semantics to build a supercompiler which, for the first time, is able to supercompile a call-by-need language with unrestricted recursive let bindings.
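
For intuition, here is a minimal sketch of the kind of Sestoft-style machine involved (the data types and names below are our own illustration, not those of the thesis): a state pairs a heap of suspended bindings with the term in focus and a stack of continuation frames, and recursive let bindings are handled simply by allocating all of them into the heap.

  -- A minimal sketch of a Sestoft-style call-by-need abstract machine,
  -- using hypothetical data types rather than the thesis's actual ones.
  import qualified Data.Map as M

  type Var = String

  data Term
    = Var Var                 -- variable reference
    | Lam Var Term            -- lambda abstraction
    | App Term Var            -- application to a variable (A-normal form)
    | Let [(Var, Term)] Term  -- (possibly recursive) let bindings
    deriving Show

  data Frame
    = Update Var              -- overwrite the heap binding once a value is reached
    | Apply Var               -- apply the resulting function to this argument
    deriving Show

  type Heap  = M.Map Var Term
  type State = (Heap, Term, [Frame])

  -- One small-step transition of the machine.
  step :: State -> Maybe State
  step (h, Var x, k) = do
    e <- M.lookup x h                           -- force a heap binding,
    Just (M.delete x h, e, Update x : k)        -- remembering to update it later
  step (h, Lam x e, Update y : k) =
    Just (M.insert y (Lam x e) h, Lam x e, k)   -- memoise the computed value
  step (h, Lam x e, Apply y : k) =
    Just (h, subst x y e, k)                    -- beta reduction
  step (h, App e y, k) =
    Just (h, e, Apply y : k)                    -- evaluate the function first
  step (h, Let bs e, k) =
    Just (foldr (uncurry M.insert) h bs, e, k)  -- allocate the (recursive) bindings
  step _ = Nothing                              -- a value with an empty stack: finished

  -- Substitute variable y for variable x (naive: assumes binder names are distinct).
  subst :: Var -> Var -> Term -> Term
  subst x y = go
    where
      go (Var v)   = Var (rep v)
      go (Lam v e) | v == x    = Lam v e
                   | otherwise = Lam v (go e)
      go (App e v) = App (go e) (rep v)
      go (Let bs e)
        | x `elem` map fst bs = Let bs e
        | otherwise           = Let [(v, go b) | (v, b) <- bs] (go e)
      rep v = if v == x then y else v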

We give complete descriptions of all of the (surprisingly tricky) components of the resulting supercompiler, showing in detail how standard formulations of supercompilation have to be adapted for the call-by-need setting.

We show how the standard technique of generalisation can be extended to the call-by-need setting. We also describe a novel generalisation scheme which is simpler to implement than standard generalisation techniques, and describe a completely new form of generalisation which can be used when supercompiling a typed language to ameliorate the phenomenon of supercompilers overspecialising functions on their type arguments.
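
As an illustration of the overspecialisation problem (our own example, not one from the thesis): a polymorphic function used at two different types may be residualised as two copies of the same code that differ only in their types, which type generalisation collapses back into a single polymorphic definition.

  -- Illustrative only: two type-specialised copies of the same function, the
  -- kind of residual code that type generalisation is designed to avoid.
  reverse_Int :: [Int] -> [Int]
  reverse_Int []       = []
  reverse_Int (x : xs) = reverse_Int xs ++ [x]

  reverse_Char :: [Char] -> [Char]
  reverse_Char []       = []
  reverse_Char (x : xs) = reverse_Char xs ++ [x]

  example :: ([Int], [Char])
  example = (reverse_Int [1, 2, 3], reverse_Char "abc")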

We also demonstrate a number of non-generalisation-based techniques that can be used to improve the quality of the code generated by the supercompiler. Firstly, we show how let-speculation can be used to ameliorate the effects of the work-duplication checks that are inherent to call-by-need supercompilation. Secondly, we demonstrate how the standard idea of ‘rollback’ in supercompilation can be adapted to our presentation of the supercompilation algorithm.
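
To see why the work-duplication check matters, and what let-speculation buys (an illustrative example of our own, not the thesis's): a shared binding whose right-hand side is still a thunk cannot be inlined at every use site without repeating its evaluation, but once speculation has reduced it to a value, inlining it is free.

  -- Illustrative example of the work-duplication check (not from the thesis).
  -- 'n' is used twice, so inlining its unevaluated right-hand side at both
  -- use sites would evaluate 'product [1 .. 10]' twice under call-by-need.
  before :: Int
  before = let n = product [1 .. 10]
           in n + n

  -- Let-speculation evaluates such bindings at supercompile time; once the
  -- right-hand side is a value it can be freely duplicated, so the
  -- supercompiler may proceed as if the programmer had written:
  after :: Int
  after = let n = 3628800
          in n + n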

We have implemented our supercompiler as an optimisation pass in the Glasgow Haskell Compiler. We perform a comprehensive evaluation of our implementation on a suite of standard call-by-need benchmarks. We improve the runtime of the benchmarks in our suite by a geometric mean of 42%, and reduce the amount of memory which the benchmarks allocate by a geometric mean of 34%.


Termination combinators forever

Termination combinators forever, Max Bolingbroke, Simon Peyton Jones, and Dimitrios Vytiniotis, Haskell Symposium, Tokyo, Sept 2011.

Abstract

We describe a library-based approach to constructing termination tests suitable for controlling termination of symbolic methods such as partial evaluation, supercompilation and theorem proving. With our combinators, all termination tests are correct by construction. We show how the library can be designed to embody various optimisations of the termination tests, which the user of the library takes advantage of entirely transparently.
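
To give the flavour of the approach, the sketch below (our own simplification, not the library's actual API) represents a termination test as a well-quasi-order on states together with a history of states seen so far; the combinators build larger well-quasi-orders from smaller ones, so every test assembled from them is correct by construction.

  -- A simplified sketch in the spirit of the paper; the names and
  -- representations are ours, not the library's actual API.
  newtype WQO a = WQO { related :: a -> a -> Bool }

  -- Basic tests.
  wqoNat :: WQO Int
  wqoNat = WQO (<=)              -- the naturals under (<=) form a WQO

  wqoFinite :: Eq a => WQO a
  wqoFinite = WQO (==)           -- any type with finitely many values, under (==)

  -- Combinators: the result is a WQO whenever the arguments are, so tests
  -- built this way are correct by construction.
  wqoPair :: WQO a -> WQO b -> WQO (a, b)
  wqoPair pa pb = WQO (\(a, b) (a', b') -> related pa a a' && related pb b b')

  wqoCofmap :: (b -> a) -> WQO a -> WQO b
  wqoCofmap f p = WQO (\x y -> related p (f x) (f y))

  -- Driving a test: keep a history of the states seen so far and stop
  -- ("blow the whistle") when the new state is related to an earlier one.
  data TermRes a = Stop | Continue [a]

  terminate :: WQO a -> [a] -> a -> TermRes a
  terminate p history new
    | any (\old -> related p old new) history = Stop
    | otherwise                               = Continue (new : history)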

Improving supercompilation: tag-bags, rollback, speculation, normalisation, and generalisation

Improving supercompilation: tag-bags, rollback, speculation, normalisation, and generalisation, Max Bolingbroke and Simon Peyton Jones, submitted to ICFP 2011.

Abstract

Supercompilation, a technique due to Turchin, is a powerful method for program optimisation and theorem proving. In this paper we describe and evaluate three improvements to the Cambridge Haskell Supercompiler (CHSC). We reduce supercompiled program size by the use of a weak normaliser and aggressive rollback, and we improve the performance of supercompiled programs by heap speculation and generalisation. Our generalisation method is simpler than those in the literature, and is better at generalising computations involving primitive operations such as those on machine integers. We also provide the first comprehensive account of the tag-bag termination mechanism.
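
Roughly speaking, the tag-bag test summarises each state by the bag (multiset) of tags of the heap bindings, focus term and stack frames it contains, and stops driving when a new state's tag-bag has the same set of tags as an earlier state's but is at least as large. A hedged sketch of the idea (our own rendering, not the CHSC code):

  -- A sketch of the tag-bag termination test; this is our own rendering of
  -- the idea, not the CHSC implementation.
  import qualified Data.Map as M

  type Tag    = Int              -- every subterm of the input program gets a tag
  type TagBag = M.Map Tag Int    -- multiset of tags occurring in a state

  bagSize :: TagBag -> Int
  bagSize = sum . M.elems

  -- An earlier state embeds into a new one if the two tag-bags contain the
  -- same *set* of tags and the new bag is no smaller: the new state is then
  -- "growing" rather than making progress.
  embeds :: TagBag -> TagBag -> Bool
  embeds old new = M.keysSet old == M.keysSet new && bagSize old <= bagSize new

  -- The whistle blows when some previously seen state embeds into the new one.
  whistleBlows :: [TagBag] -> TagBag -> Bool
  whistleBlows history new = any (`embeds` new) history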

Supercompilation by evaluation, Max Bolingbroke and Simon Peyton Jones, Haskell Symposium, Baltimore, Sept 2010.

Abstract

Supercompilation is a technique due to Turchin which allows for the construction of program optimisers that are both simple and extremely powerful. Supercompilation is capable of achieving transformations such as deforestation, function specialisation and constructor specialisation. Inspired by Mitchell's promising results (ICFP'10), we show how the call-by-need supercompilation algorithm can be recast to be based explicitly on an evaluator, and in the process extend it to deal with recursive let-expressions.
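
A classic example of the kind of transformation meant here (illustrative, not taken from the paper): supercompiling a composition of two list traversals removes the intermediate list entirely.

  -- Illustrative example of deforestation, not taken from the paper.
  -- The source program traverses the list twice and allocates an
  -- intermediate list:
  before :: (b -> c) -> (a -> b) -> [a] -> [c]
  before f g xs = map f (map g xs)

  -- A supercompiler can residualise a single fused recursive function
  -- that allocates no intermediate list:
  after :: (b -> c) -> (a -> b) -> [a] -> [c]
  after f g []       = []
  after f g (x : xs) = f (g x) : after f g xs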

Simon Peyton Jones, simonpj@microsoft.com