On lowering to lambda-compose
Lambda-compose (\(\lambda_{\circ}\)) is the language presented in Kleffner's thesis A Foundation for Typed Concatenative Languages.
Previous versions of Growl were essentially a direct extension of it, and I've been thinking of going in that direction again by using lambda-compose as a target calculus for Growl, effectively treating it as an IR. The idea is to lower the untyped Growl AST into an untyped lambda-compose AST and do type inference and program transformations there.
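To make that concrete, here's a rough Haskell sketch of what the two ASTs and the boring part of the lowering could look like. Every name and shape here is made up for illustration; it's not Growl's actual definitions or the thesis's syntax.

```haskell
-- Hypothetical shapes for the two ASTs; every name here is invented.

-- Untyped Growl surface terms: a definition body is a list of these.
data GExpr
  = GInt Integer          -- integer literal
  | GWord String          -- reference to a primitive or another definition
  | GQuote [GExpr]        -- quotation
  deriving Show

-- Untyped lambda-compose-style core terms: a program is a list of atoms.
data Atom
  = IntLit Integer        -- push an integer
  | Var String            -- push the value of a lambda-bound variable
  | Prim String           -- built-in word (dup, drop, swap, call, ifte, ...)
  | Quote Program         -- push a quotation (function value)
  | Lam [String] Program  -- pop one value per name, bind them, run the body
  | Fix                   -- wrap the quotation on top of the stack in a fix-point block
  deriving Show

type Program = [Atom]

-- Lowering a non-recursive body is essentially the identity translation;
-- recursive definitions are the interesting part (see below).
lowerExpr :: GExpr -> Atom
lowerExpr (GInt n)    = IntLit n
lowerExpr (GWord w)   = Prim w
lowerExpr (GQuote es) = Quote (map lowerExpr es)

lowerBody :: [GExpr] -> Program
lowerBody = map lowerExpr
```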
What does this mean for Growl? Well, syntactically it goes back to looking like a weird concatenative ML mish-mash. Most important is the \(\text{fix}\) primitive, which "wraps its function value in a fix-point block and pushes it to the stack" (a direct quote from Kleffner's thesis, page 6).
Let's discuss factorial as an example:
let rec factorial = dup 1 <= [drop 1] [dup 1 - factorial *] ifte
\(\lambda_{\circ}\) does not have recursive definitions, but we can lower this by
first wrapping it in a lambda abstraction that receives itself, replacing all
references to factorial with factorial call, passing it to the \(\text{fix}\)
primitive, and finally calling the returned function value:
let factorial = [ (\factorial. dup 1 <= [drop 1] [dup 1 - factorial call *] ifte) ] fix
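Mechanically, that recipe is just a small rewrite over the lowered body. Here's a sketch, reusing the hypothetical Atom/Program types from the earlier snippet (so, again, not Growl's actual lowering):

```haskell
-- Sketch: lowering `let rec name = body` into a fix'd quotation.
lowerLetRec :: String -> Program -> Program
lowerLetRec name body =
  [ Quote [Lam [name] (rewrite body)]   -- wrap the body in \name. ...
  , Fix ]                               -- ...and hand the quotation to fix
  where
    -- Replace every self-reference with "push the bound closure, then call it".
    rewrite = concatMap go
    go (Prim w) | w == name = [Var name, Prim "call"]
    go (Quote q)            = [Quote (rewrite q)]
    go (Lam vs q)
      | name `elem` vs      = [Lam vs q]           -- shadowed: leave untouched
      | otherwise           = [Lam vs (rewrite q)]
    go a                    = [a]
```

Applied to the factorial body, this yields exactly the quotation-plus-\(\text{fix}\) shape above.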
We're not done yet: the lowered factorial fails the occurs check, because typing the recursive call to factorial would require an infinite type (the recursive call happens with the saved \(n\) still on the stack, so unifying the two uses of factorial equates a stack type with itself plus one extra element). We have two options here: introduce bindings for the arguments, or reorder the stack on each recursive call. Which approach to take is still something I'm figuring out:
- If we decide to reorder the stack, each `factorial call` gets replaced with `swap factorial dip` (or other sequences, based on the inferred input of the word), which introduces a dependency on the `dip` family of combinators.
- If we decide to introduce lambda abstractions for the arguments, the definition now becomes:
let factorial = [ (\factorial n. n 1 <= [1] [n 1 - factorial call n *] ifte) ] fix
One problem is: how do we know which words should be replaced with (or followed by) the \(n\) argument? The first idea that comes to mind is using stack-arity inference to generate constraints that tell us where references to \(n\) need to be inserted. This sounds complicated in theory, so I need to think more about it. I love thinking! :D
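As a toy version of that idea, here's what the crude arity-counting part could look like in Haskell, assuming a fixed effect table for the primitives involved (real inference would generate and solve constraints, and would have to handle quotations, ifte, and stack polymorphism). Roughly, the places where the depth counter would dip below the starting stack are the places where a reference to \(n\) has to be inserted:

```haskell
import qualified Data.Map as M

-- (consumed, produced) for each known word; integer literals push one value.
effects :: M.Map String (Int, Int)
effects = M.fromList
  [ ("dup", (1, 2)), ("drop", (1, 0)), ("swap", (2, 2))
  , ("-", (2, 1)), ("*", (2, 1)), ("<=", (2, 1)) ]

effectOf :: String -> (Int, Int)
effectOf w
  | all (`elem` "0123456789") w = (0, 1)             -- literal: pure push
  | otherwise = M.findWithDefault (0, 0) w effects   -- unknown word: assume no effect

-- Walk a flat word sequence, tracking the stack height relative to where we
-- started. `needed` counts how far below the starting stack the program
-- reaches, i.e. its input arity; the final height gives the output count.
arity :: [String] -> (Int, Int)
arity = go 0 0
  where
    go needed h []     = (needed, h)
    go needed h (w:ws) =
      let (c, p)  = effectOf w
          deficit = max 0 (c - h)                    -- inputs pulled from below
      in go (needed + deficit) (h + deficit - c + p) ws

-- ghci> arity ["dup", "1", "-", "*"]
-- (1,1)    -- one input n, one output n * (n - 1)
```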
With mutually recursive functions the fix-based lowering works similarly, using a generalized \(\text{fix}_n\) primitive that receives \(n\) quotations as input.
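I haven't worked this part out, but under the same made-up Atom/Program representation the group case might look roughly like below. In particular I'm assuming, by analogy with the single-function case, that each quotation gets abstracted over all \(n\) fixed values and that every reference to a member of the group becomes a push-then-call; whether that (and the binder order) matches how \(\text{fix}_n\) actually behaves in the thesis is something I still need to check.

```haskell
-- Very speculative sketch: lowering a group of mutually recursive definitions.
lowerLetRecGroup :: [(String, Program)] -> Program
lowerLetRecGroup defs =
  [ Quote [Lam names (rewrite body)] | (_, body) <- defs ]  -- each body abstracts over the whole group (assumed)
    ++ [Prim ("fix_" ++ show (length defs))]                -- a real AST would have a dedicated FixN node
  where
    names = map fst defs

    -- Replace references to any definition in the group with push-then-call.
    rewrite = concatMap go
    go (Prim w) | w `elem` names = [Var w, Prim "call"]
    go (Quote q)                 = [Quote (rewrite q)]
    go (Lam vs q)
      | any (`elem` names) vs    = [Lam vs q]         -- (crudely) treat shadowing as opaque
      | otherwise                = [Lam vs (rewrite q)]
    go a                         = [a]
```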
Now comes the question: do we need to support cases like `dup call`? To be
honest, probably not. Even Factor avoids it by throwing a "recursive
quotation" error when you try to do it (source). The only use I'd see for it
is implementing the Y combinator inside the language, and that's basically
what the \(\text{fix}\) primitive already is.