
International Journal of Computer and Information Sciences, Vol. 13, No. 3, 1984

A Glimpse into the Paradise of Combinatory Algebra

W. Richard Stark 1

Received October 1983; revised March 1984

This expository paper examines combinatory algebra and the lambda calculus as foundations for functional languages (LISP, FP, KRC, etc.). The basic notions are defined and numerous examples are given. Historical comments and references are provided.

KEY WORDS: Algebra; terms; syntax; semantics; fixed-point; finite state automata; recursion; functional language.

1. INTRODUCTION

As pure mathematics, combinatory algebra is beautiful, surprising, and mysterious. As a model of computation, it is leading designers of modern computer languages to ultrahigh level languages far beyond those of FORTRAN and its relatives. As a foundation for mathematics, it has reduced the computable functions of mathematics to two primitive ancestors--denoted S and K. (If only particle physics could be as successful!) It is the intention of this paper to introduce these subjects.

These algebras were developed by Haskell Curry (1900-1982) in his doctoral dissertation. Curry worked first (1927-28) at Princeton under Oswald Veblen, later (1928-29) at Göttingen under Paul Bernays and David Hilbert. Inspired by the work of Frege, Whitehead, and Russell, Curry became involved in the foundational questions of the day, primarily the problem of the analysis of the substitution operation in the propositional calculus. Curry was to devise a method of constructing functions that did not involve variables and hence did not require substitution.(1-5) (While still at Princeton, Curry discovered the work of the Russian mathematician Moses Schoenfinkel at Göttingen on this very subject.) Curry created combinatory algebra in 1929.

1 Department of Mathematics, University of South Florida, Tampa, Florida 33620.

In 1930, Alonzo Church introduced the function-creating operation that led to the lambda calculus. Three decades later, John McCarthy developed the programming language LISP along lines suggested by the lambda calculus.(6-8) In 1978, John Backus introduced the computer language FP.(9,10) FP was developed along lines suggested by combinatory algebra.

For more information on Curry and the development of combinatory algebra see Ref. 11. For the ideas of Moses Schoenfinkel, and the text of his related December 1920 talk at Göttingen, see vanHeijenoort, 1967. For a history of the lambda calculus, consult Rosser.(12)

The notion of the "paradise of combinatory algebra" is due to Scott.(13)

2. COMBINATORY ALGEBRAS

An applicative system is an algebra (X, *) such that * is a total binary operator on X, and X is a nonvoid set closed under *. The operator * is not assumed to have any other properties. The terms T(X) of (X, *) are defined as follows:

1. for each a in X, C_a is a (constant) term;

2. for each integer i, v_i is a (variable) term;

3. if U and V are terms, then (U * V) is a term.

The expression ((U * V) * W) will be abbreviated as UVW. Since * is not assumed to be associative, UVW is different from (U * (V * W)). T(X) is the set of all such terms. T(X) is assumed to be disjoint from X.

The terms may be interpreted in X by a partial function int: T(X) → X defined as follows:

1. int[C_a] = a, where C_a is the (constant) term for a;

2. int[UV] = (int(U) * int(V)).

For some terms (those with variables) int may be undefined; for example, int(v_1). If int is extended to the variables (i.e., a value is assigned to each variable), then Condition 2 extends int to every term of T(X) (proof by induction).
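As an aside, these two definitions transcribe almost literally into Haskell. The following is only a minimal sketch: the names Term, int, and intEnv are ad hoc, and Maybe models the partiality of int.

    -- Terms T(X) over an applicative system (X, *); Con a plays the role of
    -- C_a and Var i the role of v_i.
    data Term x = Con x | Var Int | App (Term x) (Term x)

    -- int is partial: it is undefined (Nothing) on terms containing variables.
    int :: (x -> x -> x) -> Term x -> Maybe x
    int _  (Con a)   = Just a
    int _  (Var _)   = Nothing
    int op (App u v) = op <$> int op u <*> int op v

    -- Assigning a value to every variable (an environment) extends int,
    -- by Condition 2, to every term.
    intEnv :: (x -> x -> x) -> (Int -> x) -> Term x -> x
    intEnv _  _   (Con a)   = a
    intEnv _  env (Var i)   = env i
    intEnv op env (App u v) = op (intEnv op env u) (intEnv op env v)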

Given terms U(x, ..., z) and V(x, ..., z), whose variables are all among x, ..., z, the equation

U(x, ..., z) = V(x, ..., z)


is, by definition, equivalent to the equation

int[U(x, ..., z)] = int[V(x, ..., z)]

holding for every interpretation of x, ..., z (i.e., every extension of int to x, ..., z). We will be using v, w, x, y, and z to represent variables when the real (i.e., subscripted) variables are inconvenient. The notation U = U(x, ..., z) does not indicate that every variable of x, ..., z is used in U, just that the variables of U are among x, ..., z.

The expression U(x/a) represents the term resulting from replacing every occurrence of the variable x in U by a.

A combinatory algebra is an applicative system (X, *) such that:

1. X contains more than one element,

2. for each term U(x, ..., z), where x, ..., z is a list of variables, there exists a constant term F such that

Fa ... c = U(x/a, ..., z/c)

for every choice of a, ..., c. The property described in 2 is combinatory completeness. (Our definition of combinatory algebra is equivalent to that of Barendregt.(14)) If each such F is uniquely determined, i.e.

(Fx = F'x, for all x) implies F = F'

where the equality is semantic, then (X, *) is said to be extensional. (Extensionality is due to Curry.(1))

In the context of these algebras, we are thinking of

(U * V)

as representing the application of the function U to the argument V. Since U and V range freely over X, an element a of X may be either function (a * U), argument (V * a), or both (a * a). Implicit in the notation is the idea that every function can be viewed as a unary function (due, originally, to Frege 1892--see vanHeijenoort 1967). For example, +(2, 7) is to be thought of as (+2)7. These details have significant computational impact.

Now consider examples of Property 2. If U(x, ..., z) is just x, then F satisfies

Fx = x


making F an identity function when it occurs to the left of *. If U(x, ..., z) is a constant C, then Fx = C. If U(x, ..., z) is the term (z * x), then F satisfies

Fxz = zx

In other words, F permutes its two arguments. For U(xy) = x and V(xyz) = xz(yz) , combinatory completeness

guarantees the existence of constants K(=C/~) and S(=Cs) satisfying

Kxy = x and Sxyz = xz(yz)

for all x, y, and z. K and S are the primitives in our development of combinatory algebra. By the Basic Theorem (next section), all of the algebraically defined functions can be built up from K and S. For example, the identity function I may be defined to be SKK (run through the computation Iv = v). The power of K and S was known to Schoenfinkel.
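For readers who want to experiment, K and S can be written directly as ordinary Haskell functions. This is only a meta-level sketch, with ad hoc lowercase names.

    -- K x y = x  and  S x y z = x z (y z)
    k :: a -> b -> a
    k x _ = x

    s :: (a -> b -> c) -> (a -> b) -> a -> c
    s x y z = x z (y z)

    -- I = SKK:  i v = s k k v = k v (k v) = v
    i :: a -> a
    i = s k k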

In a combinatory algebra there is an association of terms with elements of X that goes beyond int. The equation

Fx ... z = U(x, ..., z)

tells us that the term U(x, ..., z) may be identified with F when F is used as a function (i.e., to the left of the *). Let 'lx.' name an operation from T(X) into T(X) which provides the appropriate F for a given U = U(x). In other words, given a term U, 'lx.' produces a term V [abbreviated (lx. U)] of T(X) satisfying

Va = (lx. U)a = U(x/a)

for every a. " Ix . " is defined inductively on the term U as follows:

1. (lx. U) = I, if U is the variable x;

2. (lx. U) = KU, if the variable x does not appear in U;

3. (lx. U) = S(lx. V)(lx. W), if U = VW and U contains x.

The notation (lx ... z. U), where x, ..., z is a list of variables, abbreviates (lx. ... (lz. U) ...). The theorem of the following section is an easy consequence of this definition.
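The operator 'lx.' is itself a small algorithm, and the three clauses above can be coded directly. Below is a hedged Haskell sketch over S-K terms with named variables; the names Comb, step, nf, occurs, and abstr are ad hoc. nf performs leftmost-outermost reduction, so the computations of the following sections can be replayed mechanically.

    -- S-K terms with named variables.
    data Comb = S | K | V String | A Comb Comb deriving (Eq, Show)

    -- One leftmost-outermost reduction step, if a redex exists.
    step :: Comb -> Maybe Comb
    step (A (A K x) _)       = Just x                      -- K x y   -> x
    step (A (A (A S x) y) z) = Just (A (A x z) (A y z))    -- S x y z -> x z (y z)
    step (A f a) = case step f of
                     Just f' -> Just (A f' a)
                     Nothing -> A f <$> step a
    step _       = Nothing

    -- Reduce to normal form (loops on terms that have none).
    nf :: Comb -> Comb
    nf t = maybe t nf (step t)

    occurs :: String -> Comb -> Bool
    occurs x (V y)   = x == y
    occurs x (A u v) = occurs x u || occurs x v
    occurs _ _       = False

    -- abstr x U is the term (lx. U): abstr x u applied to a reduces to U(x/a).
    abstr :: String -> Comb -> Comb
    abstr x (V y) | x == y       = A (A S K) K                      -- clause 1: I = SKK
    abstr x u | not (occurs x u) = A K u                            -- clause 2: KU
    abstr x (A u v)              = A (A S (abstr x u)) (abstr x v)  -- clause 3
    abstr _ u                    = u  -- unreachable for well-formed input

    -- Example from the next section: (lx. xx) = SII, and (lx. xx) v reduces to vv.
    sii :: Comb
    sii = abstr "x" (A (V "x") (V "x"))
    -- nf (A sii (V "v"))  ==  A (V "v") (V "v")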

3. THE ELIMINATION OF VARIABLES AND SUBSTITUTION

Theorem. In a combinatory algebra, we have, for every U and 'lx ... z.':


1. (lx ... z. U) is a uniquely constructed term of T(X),

2. (lx ... z. U) does not contain the variables x, ..., z.

The proof of this theorem is left to the reader. Notice that the (syntactic) uniqueness of (lx ... z. U) does not imply extensionality (which would be semantic uniqueness).

Examples. Let U be (x * x); then (lx. U) is computed as follows

(lx. U) = (lx. xx) = S(lx. x)(lx. x) = SII

Now the term assigned to U = xx is (lx. U) = SII. Clearly, x does not occur in SII so the second condition is met. Consider now the reversing function

(lxy. yx) = [lx. (S(ly. y)(ly. x))]

= (lx. SI(Kx))

= (S(lx. SI)(lx. Kx))

= [S(K(SI))(S(KK)I)]

Neither x nor y occurs in the term abbreviated (lxy. yx). For further information, see Ref. 2.

4. BUILDING COMPUTATIONAL FOUNDATIONS

Mathematical foundations can now be built upon K and S. Truth values T for 'true' and F for 'false' are defined

T = K and F = KI

where of course I = SKK. The IF.THEN.ELSE construction is simply

(IF U THEN V ELSE W) = (UVW)

If the value of U is T then

(IF T THEN V ELSE W) = TVW = KVW = V

while for F we have

(IF F THEN V ELSE W) = FVW = KIVW = IW = W



If the conditional clause has a value other than T or F, then the IF.THEN.ELSE function may misbehave. The NOT and OR functions are defined by

(NOT U) = (IF U THEN F ELSE T) = (UFT)

(OR UV) = (IF (NOT U) THEN V ELSE T) = ((UFT)VT)

Other Boolean functions follow easily. Ordered pairs <UV> are defined

<UV> = (lx. xUV)

where x is a variable not occurring in either U or V. The projections (W)1 and (W)2 are defined by

(W)1 = WT and (W)2 = WF

For W = <AB>, they work as follows

(W)1 = WT = <AB>T = (lx. xAB)T = TAB = A

(W)2 = WF = <AB>F = (lx. xAB)F = FAB = B

Exercise: define the function PAIR such that

PAIR UV = <UV>

for all U and V.
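Read at the meta level, the Boolean and pairing encodings can also be tried out as ordinary Haskell functions. The sketch below uses ad hoc lowercase names, and pair is one possible answer to the exercise.

    -- T = K, F = KI, and (IF U THEN V ELSE W) = UVW.
    t :: a -> b -> a
    t x _ = x                      -- T = K

    f :: a -> b -> b
    f _ y = y                      -- F = KI, extensionally

    ifte :: (a -> b -> c) -> a -> b -> c
    ifte u v w = u v w             -- ifte t v w = v,  ifte f v w = w

    -- <UV> = (lx. xUV), with projections (W)1 = WT and (W)2 = WF.
    pair :: a -> b -> (a -> b -> c) -> c
    pair u v x = x u v             -- PAIR UV = <UV>

    prj1 :: ((a -> b -> a) -> c) -> c
    prj1 w = w t                   -- prj1 (pair 'a' 'b') == 'a'

    prj2 :: ((a -> b -> b) -> c) -> c
    prj2 w = w f                   -- prj2 (pair 'a' 'b') == 'b'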

The natural numbers are

0 = I and, given n, n+1 = <nK>

The function PLUS1 is obviously defined by

PLUS1 = (lw. <wK>) = (lwx. xwK)

= S(S(KS)(S(K(SI))(S(KK)I)))(K(KK))

The predecessor function SUB1 on the natural numbers is defined

SUB1 = (lx. (x)1) = SI(KT) = SI(KK)

SUB1 works like this

SUB1 0 = (lx. (x)1) 0 = (0)1 = 0T = IT = T = K


and for n = m + 1

SUB1 n = (lx. (x)1) n = (n)1 = (m+1)1 = (<mK>)1 = m

The zero predicate is defined by

ZERO = (lx. (x)2 FT)

Thus we have

ZERO 0 = (lx. (x)2 FT) 0 = (0)2 FT = (IF)FT = T

and for n = m + 1

ZERO n = (lx. (x)2 FT)(m+1) = (<mK>)2 FT = KFT = F

For arguments e other than natural numbers, (ZERO e) may return a value other than T or F.
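The numerals can be exercised the same way. In the sketch below (again ad hoc names, with the needed pieces of the previous sketch repeated so that it stands alone), Haskell's type system gives each numeral its own type, so the code only checks individual reductions rather than defining arithmetic once and for all.

    -- 0 = I and n+1 = <nK>, with SUB1 and ZERO.
    t :: a -> b -> a
    t x _ = x                      -- T = K
    f :: a -> b -> b
    f _ y = y                      -- F = KI
    pair :: a -> b -> (a -> b -> c) -> c
    pair u v x = x u v             -- <UV>

    n0 :: a -> a
    n0 = id                        -- 0 = I
    n1 = pair n0 t                 -- 1 = <0K>  (recall T = K)
    n2 = pair n1 t                 -- 2 = <1K>

    sub1 w = w t                   -- SUB1 = (lx. (x)1);  sub1 n2 is n1
    isZero w = w f f t             -- ZERO = (lx. (x)2 FT)
    -- isZero n0 behaves as T, and isZero n1, isZero n2 behave as F.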

T = F? Suppose T = F. It follows, for every C_a and C_b, that

C_a = IF T THEN C_a ELSE C_b = IF F THEN C_a ELSE C_b = C_b

This contradicts Condition 1 of the definition of combinatory algebra. Thus, T and F are distinct.

5. FIXED-POINTS AND RECURSIVE DEFINITIONS

A definition that defines an object in terms of itself, such as

FACTORIAL n =

IF (ZERO n) THEN 1 ELSE (TIMES n (FACTORIAL (SUB1 n)))

is a recursive definition. In a recursive definition, the new object (left side) is not presented in terms of previously determined objects (right side). Consequently, such a definition does not--as it stands--define anything in the combinatory algebra. A recursive definition can be transformed into a nonrecursive definition by using the fixed-point of the term on the recursive definition's right hand side. It all begins with the following theorem.

Theorem. For each function E there exists a fixed-point P such that

E P = P

Proof. Let θ denote the function (lx. E(xx)) and P denote the term θθ. We can now compute

P = θθ = (lx. E(xx))θ = E(θθ) = EP
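The same construction can be transcribed into Haskell, where a small wrapper type makes the self-application xx typeable. This is only a sketch; Rec, theta, and fixSelf are ad hoc names.

    -- P = θθ with θ = (lx. E(xx)); the Rec wrapper lets a value be applied to itself.
    newtype Rec a = Rec (Rec a -> a)

    theta :: (a -> a) -> Rec a -> a
    theta e w@(Rec x) = e (x w)          -- θ applied to w is E(ww)

    fixSelf :: (a -> a) -> a
    fixSelf e = theta e (Rec (theta e))  -- P = θθ, so fixSelf e = e (fixSelf e)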


Define ADD recursively by

ADD 0 n = n

ADD m n = ADD (SUB1 m) (PLUS1 n) if 0 < m

(Think of ADD as a variable over T(X), and of these equations as an implicit definition of ADD's intended value.) Can ADD be constructed formally within a combinatory algebra? A slightly more formal definition is given by the equation

ADD = (lmn. IF (ZERO m) THEN n ELSE ADD (SUB1 m) (PLUS1 n))

This definition is still not a construction because ADD is defined in terms of itself. In a construction, ADD would be presented in terms of previously determined objects.

We have the problem of the chicken: to make a chicken we have to have a chicken. However, we can approximate the chicken. Assuming that we have a function NEXTCHICKEN which given one approximation returns a better approximation

chic' = NEXTCHICKEN (chic)

we can get a sequence

(chic0, chic1, chic2, ...)

with the chicken as its limit. The limit should be the fixed-point of the function NEXTCHICKEN--because the chicken is its own best approximation. Thus the chicken problem and the analogous problem of definition by recursion are both solved by the construction of a fixed-point.

We can now construct ADD. We begin again with the recursive definition

(lxy. IF (ZERO x) THEN y ELSE (ADD (SUB1 x) (PLUS1 y)))

then using the approximating function NEXTADD =

(lfxy. IF (ZERO x) THEN y ELSE (f (SUB1 x) (PLUS1 y)))

the definition is expressed recursively by

ADD = NEXTADD ADD


Constructively, ADD is the fixed-point

(lw. NEXTADD (ww))(lw. NEXTADD (ww))

of NEXTADD.
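A hedged Haskell sketch of this construction: machine integers stand in for the combinator numerals, and Haskell's lazy fixed point stands in for the self-application term above (fix, nextAdd, and add are ad hoc names).

    fix :: (a -> a) -> a
    fix e = e (fix e)              -- Haskell's lazy fixed point

    nextAdd :: (Int -> Int -> Int) -> Int -> Int -> Int
    nextAdd g x y = if x == 0 then y else g (x - 1) (y + 1)

    add :: Int -> Int -> Int
    add = fix nextAdd              -- the fixed point of nextAdd; add 2 3 == 5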

ADD computes as follows

ADD 2 3
→ (lw. NEXTADD (ww)) (lw. NEXTADD (ww)) 2 3
→ NEXTADD ((lw. NEXTADD (ww)) (lw. NEXTADD (ww))) 2 3
→ NEXTADD ADD 2 3
→ IF (ZERO 2) THEN 3 ELSE (ADD (SUB1 2) (PLUS1 3))
→ ADD 1 4
→ (lw. NEXTADD (ww)) (lw. NEXTADD (ww)) 1 4
→ NEXTADD ((lw. NEXTADD (ww)) (lw. NEXTADD (ww))) 1 4
→ NEXTADD ADD 1 4
→ IF (ZERO 1) THEN 4 ELSE (ADD (SUB1 1) (PLUS1 4))
→ ADD 0 5
→ (lw. NEXTADD (ww)) (lw. NEXTADD (ww)) 0 5
→ NEXTADD ADD 0 5
→ IF (ZERO 0) THEN 5 ELSE (ADD (SUB1 0) (PLUS1 5))
→ 5

Similarly, we may define the other basic integer-valued functions such as TIMES, EXP, and the nonnegative difference

SUBTRACT m n = 0 if m < n,

= m - n otherwise

The predicates LESS and EQUAL for integers may be defined using SUBTRACT. Then using these predicates, MAX is defined for pairs of integers.

6. FINITE STATE AUTOMATA

Finite state automata define a very basic notion of computation--computations requiring only a fixed finite memory. As a basic notion of computation it is interesting to see exactly how the automaton models translate into the terms of the combinatory algebra.

The equations defining the behavior of a finite state automaton look a bit like generalizations of the fixed-point equation. Consider, for example, the automaton determined by the diagram in Fig. 1.


Fig. 1. [State-transition diagram for the three-state automaton with states A, B, C and inputs T and F; the transitions are those given by the equations below.]

Using the states A, B, and C as variables over T(X), the equations are

AT = A    AF = B

BT = C    BF = A

CT = A    CF = C

where each variable (i.e. state) represents a function acting on the input. The solution (automaton) is constructed as follows. First,

a = (lfghx. x(ffgh)(gfgh))

b = (lfghx. x(hfgh)(ffgh))

c = (lfghx. x(ffgh)(hfgh))

then set

A = aabc,  B = babc,  C = cabc

Let's try it out

AFT = aabcFT → F(aabc)(babc)T

= babcT → T(cabc)(aabc)

= cabc = C

It works! Notice that each state codes the complete machine. But of course, that is to be expected.
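For comparison, here is a conventional Haskell rendering of the same machine; State, next, and run are ad hoc names, and Bool stands in for the truth values T and F. It makes the check above easy to replay.

    -- The three-state automaton of Fig. 1: each state maps an input to the next state.
    data State = A | B | C deriving (Eq, Show)

    next :: State -> Bool -> State
    next A True  = A
    next A False = B
    next B True  = C
    next B False = A
    next C True  = A
    next C False = C

    -- Run a word of inputs from a starting state.
    run :: State -> [Bool] -> State
    run = foldl next
    -- run A [False, True] == C, matching the computation AFT = C above.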

7. HIGHER FUNCTIONS

Now we'll explore the functions APPLY, COMPOSE, RECURR, FIXPOINT, and SEARCH. These functions are special because they are


natural operations on other functions. (Of course, in this algebra, all functions operate on functions.)

APPLY and COMPOSE are functions which satisfy APPLY f x = (f x) and COMPOSE f g = (lx. (f (g x))). Terms expressing these functions are

APPLY = (lfx. (f x))

= (lf. S(Kf)I)

= (S(S(KS)(S(KK)I))(KI))

and

COMPOSE = (lfg. (lx. (f (g x))))

= (S(S(KS)(S(S(KS)(S(KK)(KS)))
         (S(S(KS)(S(KK)(KK)))
          (S(KK)I))))
   (S(S(KS)(S(S(KS)(S(KK)(KS)))
         (S(S(KS)(S(KK)(KK)))
          (KI))))
    (S(KK)(KI))))

(These expressions were computed in a LISP-based interpreter system for combinatory algebra, available from the author.)

RECURR is the function which, given a base function f and a step function g returns as its value the recursive function h satisfying

h = (lxy. IF (ZERO x) THEN (f x y) ELSE (g x (h (SUB1 x) y)))

RECURR is the fixed-point of

(lr. (lfg. (lxy. IF (ZERO x) THEN (f x y) ELSE (g x (r f g (SUB1 x) y)))))

Given a function h, SEARCH performs a search starting at 0 for the least integer n such that

(hn) = 0

or, in other words, (ZERO (hn)) is satisfied. To construct SEARCH first define W as the fixed-point of

(lw. (lhn. IF (ZERO (hn)) THEN n ELSE (w h (PLUS1 n))))


then SEARCH is

(lh. (Wh0))

When SEARCH does not find its object, the computation never ends, and the function is partial.

Given a function f, FIXPOINT returns the fixed-point of f. FIXPOINT is defined by

FIXPOINT = (lf. (lx. f(xx))(lx. f(xx)))

= (lf. S(Kf)(SII)(S(Kf)(SII)))

These functions can be used to define new functions. For example,

ADD = RECURR (SK) (K PLUS1)

TIMES = RECURR (K0) (K PLUS)

EXPONENT = RECURR (K1) (K TIMES)

COMPOSE3 = COMPOSE COMPOSE COMPOSE

COMPOSE3 is the function (lfgh.(lx.f(g(hx)))). Can you define FACTORIAL and EQUAL in terms of these high level functions? Using SEARCH and EQUAL, define an integer-valued version of QUOTIENT.
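At the meta level these higher functions have short Haskell counterparts. The sketch below works over machine Ints rather than combinator numerals, and the names are ad hoc; search is partial, exactly as noted above for SEARCH.

    apply :: (a -> b) -> a -> b
    apply g x = g x

    compose :: (b -> c) -> (a -> b) -> a -> c
    compose g h = \x -> g (h x)

    -- recurr f g is the h with h x y = IF (ZERO x) THEN (f x y) ELSE (g x (h (SUB1 x) y)).
    recurr :: (Int -> Int -> Int) -> (Int -> Int -> Int) -> Int -> Int -> Int
    recurr f g x y
      | x == 0    = f x y
      | otherwise = g x (recurr f g (x - 1) y)

    -- search h finds the least n >= 0 with h n == 0; it diverges if there is none.
    search :: (Int -> Int) -> Int
    search h = go 0
      where go n = if h n == 0 then n else go (n + 1)

    -- For instance, addition by recursion on the first argument:
    --   addR = recurr (\_ y -> y) (\_ r -> r + 1)    -- addR 2 3 == 5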

8. COMPUTATIONAL COMPLETENESS

Kleene's(15) partial recursive functions make up the smallest family of functions which contains

1. Z(x) = 0,

2. N(x) = x + 1,

3. P_i(x_1, ..., x_j) = x_i for all i and j with i < j + 1,

and is closed under the operations of

1. composition

2. recursion

3. search

By the results of the previous sections, every partial recursive function is definable in terms of S and K.


It is generally accepted that all computable (in the informal sense) functions on the integers are represented in the partial recursive functions. Thus, all computable functions on the integers are represented in terms of S and K. In other words, THE PRIMITIVE COMBINATORS S AND K CAN BE THOUGHT OF AS A FOUNDATION FOR THE NOTION OF INTEGER COMPUTATION! The "generally accepted" notion mentioned above is known as Church's Thesis. Actually, Church's Thesis was originally expressed in terms of the lambda calculus. Kleene carried it to the partial recursive functions by showing that every function of the integers in the lambda calculus is partial recursive (see Ref. 15).

9. THE LAMBDA CALCULUS

In this section we present basic definitions and a basic theorem for the lambda calculus--nothing more. For details consult Rosser(12) and Barendregt.(14)

In 1930, Church introduced the function-creating operation λ to denote--as (λx. U)--the function F defined by Fa = U(x/a). (This expansion of the combinatory algebra is analogous to the addition of the inverse function to the language of group theory.) The lambda calculus consists of a set T'(X) of terms using the λ-functions, a set L(X) of formulas on T'(X), a set of axioms in L(X), and a deduction rule. There are no constants. The formulas of L(X) are constructed from terms using the binary relations = and →. The 'logic' on L(X) defines computations (analogous to proofs) of equations between terms.

The terms T'(X) of the lambda calculus consist of

1. (variable) terms v_0, v_1, v_2, v_3, ...;

2. (composite) terms (UV), from terms U and V;

3. (function) terms (λx. U), from a variable x and a term U. UVW abbreviates ((UV)W).

Now, L(X) is the least set containing the formulas

(U = V) and (U → V)

for terms U and V of T'(X). (Church originally required that the U in clause 3 above contain x as a free variable. In doing so, he eliminated K, since K = (λxy. x).)

The relation '→' is the reduction relation. One term reduces to another when it can be replaced by the other term in a computation of the first term's value. We have seen such reductions (for combinatory algebra) in the


detailed computation of ADD 2 3 (Section 5). This computation can be denoted

ADD 2 3 → t1 → ... → t13 → 5

where t1, ..., t13 are the thirteen intermediate terms. An example for the lambda calculus will follow the presentation of the axioms and the deduction rule.

An occurrence of a variable x in a term U is free if it is not in some V for which (λx. V) is a subterm of U. Otherwise, the occurrence is bound. U(x/V) represents the result of replacing every free occurrence of x in U by V. In using the expression U(x/V), it is assumed that no variable occurring free in V becomes bound in U(x/V).

The lambda calculus is axiomatized as follows:

1. (λx. U)V → U(x/V),

2a. (U → U),
2b. (U → V) & (V → W) => (U → W),

3a. (U → V) => (UW → VW),
3b. (U → V) => (WU → WV),
3c. (U → V) => ((λx. U) → (λx. V)),

4. (U → V) => (U = V),

5a. (U = V) => (V = U),
5b. (U = V) & (V = W) => (U = W),

6a. (U = V) => (UW = VW),
6b. (U = V) => (WU = WV),
6c. (U = V) => ((λx. U) = (λx. V)),

together with the deduction rule

A,  A => B
----------
B

In this system, the (extra) axiom of extensionality is

(Ux = Vx) => (U = V)

for all U and V not containing x free. The notation k-B indicates that the formula B is formally derivable

from the axioms using the deduction rule.
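Axiom 1 and the substitution operation U(x/V) can be sketched in Haskell as well, under the text's own assumption that substitution never captures a free variable; Lam, subst, and beta are ad hoc names.

    data Lam = Var String | App Lam Lam | Abs String Lam deriving (Eq, Show)

    -- U(x/V): replace every free occurrence of x in U by V (no capture check).
    subst :: String -> Lam -> Lam -> Lam
    subst x v (Var y)   = if x == y then v else Var y
    subst x v (App a b) = App (subst x v a) (subst x v b)
    subst x v (Abs y b) = if x == y then Abs y b else Abs y (subst x v b)

    -- One leftmost-outermost reduction step, if a redex exists.
    beta :: Lam -> Maybe Lam
    beta (App (Abs x u) v) = Just (subst x v u)       -- axiom 1
    beta (App a b) = case beta a of
                       Just a' -> Just (App a' b)
                       Nothing -> App a <$> beta b
    beta (Abs x u) = Abs x <$> beta u                 -- reduce under a binder (cf. 3c)
    beta _         = Nothing
    -- e.g. beta (App (Abs "x" (Var "x")) (Var "y")) == Just (Var "y")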


The lambda calculus is important to computer science as a model of computation. The key theorem from this point of view says that reduction, as described by the axioms, can be used as a means of computing (and defining) the value of a term.

The Church-Rosser Theorem. If ⊢ (U = V), then for some W,

⊢ (U → W) and ⊢ (V → W)

For an example of the use of the axioms, we return to the finite state automaton.

1. a = (λfghx. x(ffgh)(gfgh)),
2. T = (λxy. x),
3. A = aabc,
4. AT = aabcT [line 3 and axiom 6a],
5. aabcT = (λfghx. x(ffgh)(gfgh))abcT [line 1, axiom 6a (×4), and axiom 5b (×3)],
6. (λfghx. x(ffgh)(gfgh))abcT → T(aabc)(babc) [axiom 1 (×4) and axiom 2b (×3)],
7. (λfghx. x(ffgh)(gfgh))abcT = T(aabc)(babc) [line 6 and axiom 4],
8. T(aabc)(babc) = (λxy. x)(aabc)(babc) [line 2 and axiom 6a (×2)],
9. (λxy. x)(aabc)(babc) → (aabc) [axiom 1 (×2) and axiom 2b],
10. (λxy. x)(aabc)(babc) = (aabc) [line 9 and axiom 4],
11. (aabc) = A [line 3 and axiom 5a],
12. AT = A [lines 4, 5, 7, 8, 10, 11, axiom 5b (×4)].

Now, we know with absolute certainty that an input of T takes our automaton from state A to state A.

10. OFFSPRING LIVING IN COMPUTER SCIENCE

In Refs. 6 and 8, the language LISP (for LISt Processor) is described. LISP is an approximation to the lambda calculus built in a world of binary trees. It is, at the same time, both an algebra of functions of these trees, and an algebra in these trees. Thus, LISP has the Gödelesque ability to talk about and process itself. This implementation of the language in its own domain allows the LISP interpreter (i.e., a program which evaluates LISP expressions) to be written in and to compute in the language itself. The feature that most clearly brands LISP as a child of the lambda calculus is the presence of the conversion

U(x, ..., z)  →  (λx ... z. U(x, ..., z))

of terms into functions. On the other hand, significant differences exist. Not every function of LISP can be treated as a unary function, there is a difference between terms and functions, and variables are used in functions (contrasting with the combinatory algebra). These differences make developments such as those of Section 7, which are easy in combinatory algebra and the lambda calculus, awkward in LISP.

The similarity between LISP and the rest of the world of computer languages--FORTRAN, COBOL, BASIC, PASCAL, ALGOL, APL, etc.--is limited to the fact that they are all computer languages. In its pure form, LISP does not have assignment statements. Programming in LISP amounts to defining successive extensions of the language until the ultimate operation, no matter how complex, is reduced to a single function. Because of its differences, the mainstream of the computing community has, until recently, thought of LISP as being the tool of artificial intelligence, the toy of academia. This dichotomy is expressed in the division of languages into two groups: the vonNeumann languages (FORTRAN et al.) and the functional languages (LISP, PROLOG, FP).

In accepting the 1977 ACM Turing Award, Backus presented a strong case for the functional languages.(9) Backus had headed the IBM group that created FORTRAN during the early 50's and had participated in the creation of ALGOL, so he spoke from the opposition's camp. In 1978, Backus presented the functional language FP. In many ways FP is closer to Curry's combinatory algebra than is LISP. In FP, programs are functions without variables. A small set of functional forms algebraically generates the set of FP functions. These forms are the operations of an algebra of programs. "The programs of FP are so structured that their behavior can often be understood and proven by the mechanical use of algebraic techniques."(9) The body of Backus' paper is concerned with developing FP along the lines hinted at in our Section 7 on higher functions.

Backus(10) makes the point that the vonNeumann languages and even LISP are object-oriented, and that programs and functions are expressed using object-level variables and constants--as opposed to function-level variables and constants. LISP and the lambda calculus are object-oriented because of the use of object-level variables in the LAMBDA construction of functions. Backus provides a clear and convincing presentation of function-level computation in FP. Examples are similar to those of Section 7. "Function-level definitions represent a more powerful style of programming not available at the object-level."(10)

Turner(16) sets out "to demonstrate the superiority of applicative language style over the vonNeumann or assignment style, by showing that it makes programming easier." The problem attacked by Turner is to build a program that enumerates paraffins (paraffins are hydrocarbons, so this is a problem in organic chemistry). At first the data structures required to represent hydrocarbons seem difficult to manipulate, but Turner's approach makes it easy. The actual program is written in KRC. (KRC--for Kent Recursive Calculator--is Turner's language.) This enjoyable paper accomplishes what it set out to do.

Other functional languages include Landin's ISWIM(17,18) and PROLOG; see Warren(19) and Clark.(20)

Continuing work in this area may be found in the proceedings of the ACM Symposium on LISP and Functional Programming and the annual European Symposium on λ-Calculus and Computer Science Theory. The proceedings of the ACM symposium are published by the Association for Computing Machinery, New York. The proceedings of the European symposium are published in the "Lecture Notes in Computer Science" series by Springer, Berlin.

REFERENCES

1. H. B. Curry, An analysis of logical substitution, Amer. J. Math., 51:363-384 (1929).
2. H. B. Curry, Apparent variables from the standpoint of combinatory logic, Ann. Math. 2 (34):381-404 (1933).
3. H. B. Curry, Some philosophical aspects of combinatory logic, The Kleene Symposium (eds. J. Barwise, H. J. Keisler, and K. Kunen), North-Holland, Amsterdam, pp. 61-84 (1978).
4. H. B. Curry and Robert Feys, Combinatory Logic, Vol. I, North-Holland, Amsterdam (1958).
5. H. B. Curry, J. R. Hindley, and J. P. Seldin, Combinatory Logic, Vol. II, North-Holland, Amsterdam (1972).
6. John McCarthy, Recursive functions of symbolic expressions and their computation by machine, Part I, CACM, 3 (4):184-195 (1960).
7. John McCarthy, A basis for a mathematical theory of computation, Computer Programming and Formal Systems, North-Holland, Amsterdam (1963).
8. John McCarthy, History of LISP, in History of Programming Languages, edited by R. L. Wexelblat, Academic Press (1981).
9. John Backus, Can programming be liberated from the vonNeumann style? A functional style and its algebra of programs, CACM, 21 (8):613-641 (1978).
10. John Backus, Functional level programs as mathematical objects, Proc. ACM Symp. on LISP and Functional Computing, New York (1981).
11. J. P. Seldin and J. R. Hindley, To H. B. Curry: Essays on Combinatory Logic, Lambda Calculus and Formalism, Academic Press, New York (1980).
12. J. B. Rosser, Highlights in the history of the lambda calculus, ACM Symposium on LISP and Functional Computing (1982).
13. Dana Scott, Some philosophical issues concerning the theories of combinators, λ-Calculus and Computer Science Theory, Lecture Notes in Computer Science 37, Springer, Berlin, pp. 346-366 (1975); Lambda calculus: some models, some philosophy, The Kleene Symposium, North-Holland, Amsterdam, pp. 223-266 (1977); Logic and programming languages, CACM, 20 (9):634-649 (1980).
14. Henk P. Barendregt, The Lambda Calculus, North-Holland, Amsterdam (1980).
15. S. Kleene, Introduction to Metamathematics, D. Van Nostrand, Princeton (1950).
16. D. A. Turner, The semantic elegance of applicative languages, Proceedings of the 1981 ACM Symposium on LISP and Functional Computing (1981).
17. P. J. Landin, A correspondence between ALGOL 60 and Church's lambda calculus, CACM, 8 (3):158-165.
18. P. J. Landin, The next 700 programming languages, CACM, 9 (3):157-164 (1966).
19. D. Warren et al., PROLOG--the language and its implementation compared with LISP, Proceedings of the Symposium on Artificial Intelligence and Programming Languages, SIGPLAN Notices, 12 (18):000-000 (1977).
20. K. L. Clark and F. G. McCabe, PROLOG: a language for implementing expert systems, memo of the Department of Computing, Imperial College, London (1980).
21. P. Henderson, Functional Programming: Applications and Implementation, Prentice-Hall, Englewood Cliffs, New Jersey (1980).
22. C. Smorynski, Fixed point algebras, Bulletin (new) AMS, 6 (3):317-429; 8 (2):407 (1982).
23. Schönfinkel, On the building blocks of mathematical logic (1924), in From Frege to Gödel: A Source Book in Mathematical Logic, edited by Jean vanHeijenoort, Harvard University Press, Cambridge, Mass. (1967).


