Denys Duchier

a message written in response to the discussion of continuations on the comp.lang.python newsgroup.

The fixation on call/cc and on the gritty details of its operational interpretation has so far obscured the simplicity and elegance of continuations. In this message, I would like to remedy the situation in two ways:

- first by presenting continuations in a simple and intuitive way
- second by offering _working_ python code that illustrates how one can program with continuations to implement search engines, without call/cc.

I exhibit:

- a validity checker for propositional formulae
- a rudimentary prolog engine with backtracking and cut

I hope this will help clarify matters for those who felt a little bewildered by the discussion so far. I would also like to point out that the two programs mentioned above were written this evening in a couple of hours, which should give you some measure of the power of the technique described in the following.

Traditionally a function returns a value. e.g.:

def foo(x): return x+1

This leaves implicit where this value is to be returned to. The idea
of continuations is to make this explicit by adding a continuation
argument. Instead of `returning' the value, the function `continues'
*with* the value by giving it as an argument to the continuation. In
a continuation-based world, the above function foo becomes:

def foo(x,c): c(x+1)

In this view, a function never `returns'. Instead it `continues'. For this reason, continuations have sometimes been described as `gotos with arguments'.
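As a minimal sketch, here is what it looks like to drive such a function from the top level: the caller must supply the continuation explicitly, here one that simply records the value (the list `results` is just an illustrative stand-in for "wherever the value goes next"):

```python
# CPS version of foo: it never returns; it hands x+1 to the
# continuation c.
def foo(x, c):
    c(x + 1)

results = []
foo(41, results.append)   # the continuation receives 42
```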

The idea described above is the basis of a compilation technique. More precisely, it is a preliminary code transformation known as CPS (Continuation Passing Style). The basic idea is to add to each function an extra `continuation' argument, and to further transform the body of the function so that instead of returning its value, it instead passes it on to its extra continuation argument.

This idea was already outlined in the example of the foo function above. To be more exact, however, it should be noted that the CPS transformation also unfolds all nested expressions which are not lambdas (in other words it explicitly threads the computations of all subexpressions). Let's look at an example:

def baz(x,y): return 2*x+y

In the continuation passing view, even primitive operators such as `*` or `+` take an extra continuation argument. We will simulate this with the following definitions:

def add(x,y,c): c(x+y)
def mul(x,y,c): c(x*y)

Now, CPS would transform the baz function above into:

def baz(x,y,c): mul(2,x,lambda v,y=y,c=c: add(v,y,c))

In other words, the computation of `2*x` now takes a continuation to receive the result `v`, uses it to compute `v+y`, and finally passes this result to the overall continuation `c`.
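Putting these pieces together gives a runnable sketch; the top-level continuation (here `out.append`, chosen purely for illustration) simply records the final value:

```python
# CPS versions of + and *: each passes its result to a continuation.
def add(x, y, c): c(x + y)
def mul(x, y, c): c(x * y)

# CPS transform of: def baz(x,y): return 2*x+y
# The lambda's default arguments capture y and c, as in the post.
def baz(x, y, c):
    mul(2, x, lambda v, y=y, c=c: add(v, y, c))

out = []
baz(3, 4, out.append)   # 2*3 + 4 = 10, delivered to out.append
```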

When understood in this context, `call/cc` is not mysterious at all. It is merely a means to get our hands on the invisible extra continuation argument introduced by the CPS transformation and to use it like any other function value in our program. Consider `call/cc(f)` where `f` is a function intended to receive the current continuation as an argument. Essentially, `call/cc(f)` is transformed by CPS into `call_cc(f,c)`, where `c` is the continuation argument that CPS is introducing and `call_cc` is defined as follows:

def call_cc(f,c): f(c,c)

i.e. the normal argument of `f` and its extra continuation argument introduced by CPS are both the current continuation `c`.

There are details, but the above is the essence.
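To make this concrete, here is a small sketch of `call_cc` used in a hand-CPS'ed program. One caveat: in Python, invoking a continuation is an ordinary call that does return, so the CPS discipline of calling exactly one continuation, in tail position, is obeyed only by convention:

```python
def call_cc(f, c):
    # f receives the current continuation c both as its normal
    # argument and as its CPS continuation argument.
    f(c, c)

collected = []

def body(k, c):
    # k is the captured continuation. Invoking it "returns" 1 to
    # whoever called call_cc, instead of continuing normally via c.
    k(1)

call_cc(body, collected.append)   # collected is now [1]
```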

The CPS transformation is the basis of many compilers for functional languages. Its drawback is that it introduces many lambdas (i.e. closures), and it is essential that the compiler optimize as many of them away as possible. This was extensively studied by e.g. Steele in the Rabbit compiler for Scheme, Kelsey et al. in the Orbit compiler for T, and Appel in the SML/NJ compiler. One advantage is that, if lambdas are your only control structure and you have optimized them to the max, then you have optimized all control structures.

However, it should be noted that there is some disagreement about the value of the CPS transformation as a basis for compilation since, as many have noted, the job of the compiler is often to remove much of what CPS introduced.

You may have noticed that some people are overly enthusiastic about the arcane applications of continuations. There are many non-arcane applications of continuations and they don't require the existence of call/cc. You can write continuation passing programs in Python, or in any language that supports some form of closures and automated garbage collection.

The application that I know best concerns `search'. This is very much related to the on-going thread on iterators. I learned the technique, which I describe below, from my erstwhile advisor Drew McDermott, many years ago. This is an old AI technique which Drew called "generators". However, I should like to point out that, contrary to Tim's characterization, generators (in Drew's sense) do not necessarily behave in a `stack-like manner'; although it is extremely rare to come up with one that doesn't :-)

The idea is to drive search by passing 2 continuations:

- a success continuation, to proceed further with the search
- a failure continuation, to backtrack to an earlier choice point

Expressed in Python, this often takes the following form:

class Foo:
    def search(self,info,yes,no):
        if self.check(info):
            return yes(info,no)
        else:
            return no()

where `info' is some information that is passed around during search. `yes' is the success continuation and `no' is the failure continuation. `yes' takes as arguments the current `info' state and the current failure continuation. `no' takes no argument.

A Foo object satisfies the search criterion if `self.check(info)` is true.

Consider now a class Baz that has 2 Foo attributes `one' and `two'. A Baz object is defined to satisfy the search criterion if either its `one' attribute satisfies it or its `two' attribute satisfies it (in other words a Baz object is a kind of disjunction). We express this by calling the search method on the `one' attribute, but also passing it a failure continuation that will try the `two' attribute instead.

class Baz:
    def __init__(self,foo1,foo2):
        self.one = foo1
        self.two = foo2
    def search(self,info,yes,no):
        return self.one.search(
            info, yes,
            lambda self=self,info=info,yes=yes,no=no:
                self.two.search(info,yes,no))

What becomes evident in the above is that Python's lack of real closures makes it a bit painful to write what in a functional language is truly succinct and elegant.
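Here is a runnable sketch of the Foo/Baz pair in action. The `check` predicate is a made-up stand-in (simple equality against `info`); the post leaves it abstract. The failure continuation passed to `one.search` backtracks into `two.search`:

```python
class Foo:
    def __init__(self, value):
        self.value = value
    def check(self, info):
        # hypothetical criterion, purely for illustration
        return self.value == info
    def search(self, info, yes, no):
        if self.check(info):
            return yes(info, no)
        else:
            return no()

class Baz:
    def __init__(self, foo1, foo2):
        self.one = foo1
        self.two = foo2
    def search(self, info, yes, no):
        # on failure of `one`, try `two` with the same continuations
        return self.one.search(
            info, yes,
            lambda self=self, info=info, yes=yes, no=no:
                self.two.search(info, yes, no))

# 'a' fails to match, so the failure continuation tries 'b'
found = Baz(Foo('a'), Foo('b')).search(
    'b',
    lambda info, no: ('found', info),   # success continuation
    lambda: 'no match')                 # failure continuation
```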

Formulae of propositional logic look like:

((p|q) & (p->r) & (q->r)) -> r

which means:

if (p or q) and (p implies r) and (q implies r), then r

p, q, and r are propositional variables that can be assigned truth values. You can verify that regardless of what truth values you assign to p, q, and r, the formula above is always true. This is easier to see on a simpler formula such as (p | !p), i.e. `p or not p'. Such a formula is said to be `valid': it is always true, no matter how you interpret its variables.
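The validity claim can be checked mechanically by brute force. This sketch (not the continuation-based checker, just an illustration of the definition) encodes a formula as a Python function of its variables and enumerates every truth assignment:

```python
from itertools import product

def is_valid(formula, nvars):
    # valid = true under every assignment of truth values
    return all(formula(*vals)
               for vals in product([False, True], repeat=nvars))

imp = lambda a, b: (not a) or b   # material implication

# ((p|q) & (p->r) & (q->r)) -> r
f = lambda p, q, r: imp((p or q) and imp(p, r) and imp(q, r), r)

is_valid(f, 3)               # True: the formula is valid
is_valid(lambda p: p, 1)     # False: p alone can be false
```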

The program below implements in Python a validity checker for propositional formulae, using a continuation passing style as described earlier. This program is intended purely as an illustration. There are more efficient methods for this task. However, I believe that it conveys quite well the general ideas about implementing search by continuation passing.

Both programs (the validity checker and the prolog engine mentioned earlier) are also available at the following urls:

- http://www.ps.uni-sb.de/~duchier/python/validity.py
- http://www.ps.uni-sb.de/~duchier/python/prolog.py

Here is the validity checker: