Thierry Coquand Preliminary Draft For The TYPES Summer School, August 1999
Introduction
Martin-Löf's type theory can be described as an intuitionistic theory of iterated inductive definitions developed in a framework of dependent types. It was originally intended to be a full-scale system for the formalization of constructive mathematics, but has also proved to be a powerful framework for programming. The theory integrates an expressive specification language (its type system) and a functional programming language (where all programs terminate). There now exist several proof assistants based on type theory, and non-trivial examples from programming, computer science, logic, and mathematics have been implemented using these.

This note contains four parts. The first one is a (still preliminary) presentation of some historical roots of Type Theory, written with Peter Dybjer. The second part presents some basic examples of inductive definitions in type theory. This shows how to use the elegant notations designed in functional programming for representing and checking inductive proofs. The third part presents the notion of inductive definition in the setting of rule sets [1] and shows how this notion can be represented in type theory. The fourth part explains why inductive definitions play an important rôle in proof theory [4].
1.1 Prehistory
Reasoning by induction on the natural numbers, though it had been used unconsciously for a long time, seems first to have been made explicit in the XVIIth century, in two different forms, by Pascal and Fermat. As formulated by Pascal and used for proving properties of numbers in his "triangle", to prove that a property P holds for all natural numbers, it is enough to prove that it holds for 1, and that it holds for n + 1 whenever it holds for n. Fermat formulated it as a principle of "infinite descent" for establishing that a property is universally false: if a property P is such that P(x) implies that there exists y < x such that P(y), then P(x) never holds. These two formulations can also be found for more complicated cases. The fact that a relation > is well-founded can be expressed in two a priori different forms: the fact that there is no infinite decreasing chain x₀ > x₁ > ..., or the fact that the following induction schema (noetherian induction principle) is valid:

    ((∀z)((∀y < z) φ(y)) → φ(z)) → φ(x).
Of course, in the definition of the measure, one should check that the measure does not depend on the presentation of the set.
1.2.4 Realizability
Recursive realizability was introduced by Kleene [16] in order to relate precisely two quite different approaches to the notion of effective process: the constructive functions of intuitionism, and the notion of recursive function. Modified realizability was introduced by Kreisel in [17], as a tool to show the non-derivability of Markov's principle in Heyting's arithmetic. The main difference is that the extracted computations are programs of a programming language that has the normalisation property. (Modified) realizability can be seen as a natural projection operation on formulae (which become types) and proofs. This is explained in [29]. What is important here is that realizability strongly suggests an elegant approach to rigorous programming: to prove (constructively) a proposition can be seen as a way to achieve the "hand-in-hand" development of programs together with proofs that they satisfy their specification, advocated by Dijkstra or Gries [7]. The book [2] contains a detailed discussion of realizability and of its potential interest for computer science.
Inductively defined notions permeate theoretical computer science too. One fundamental example is the notion of a language in the sense of formal language theory. But we shall focus on two aspects of programming languages here: inductive programming language constructs and inductive constructs for reasoning about programs.
seen as inductively defined relations between inductively defined objects, and the proofs are done by structural induction [6]. In the 60s, people started to realize the deep analogies between functional systems and proof-theoretical systems. For instance, the techniques developed by Tait for proving the totality of functions defined in Gödel's system T could be used essentially to prove the normalisation property of systems of natural deduction. A typical example of this situation was the normalisation property for system F. Though it was found with a termination theorem for a functional system in mind, it was later realized that it could be applied, as by magic, to give a proof of Takeuti's conjecture, which was a conjecture about cut-elimination in a sequent calculus of second-order arithmetic. These analogies were used by Prawitz in his study of "validity" of proofs. At about the same time, de Bruijn, Scott, and Martin-Löf showed that these analogies could be given a common expression in a general type theory with dependent types, unifying for instance the notion of proof by induction and the notion of function defined by induction. Ideas closely related to the Curry-Howard correspondence had also appeared earlier in Lawvere's work on categorical logic from the 60s. His hyperdoctrines [18], which are categorical models of first-order predicate logic, model propositions as objects and proofs as morphisms, and have a structure quite similar to that of type theory. The notions of constructor and concrete data type turned out to be similar to the notions of introduction rule and logical connective respectively. Thus the close connection between functional systems studied in metamathematics and proof theory could be extended to a connection to functional programming languages. The notion of admissible rule corresponds closely to the notion of recursively defined function. It should be noted for instance that the introduction of data types such as product or disjoint union, which seems so natural in a functional programming language, did not appear in the literature of functional systems before the 70s.

In the previous sections we have pointed to several traditions which have been brought together by the Curry-Howard isomorphism. It is interesting to reread this history in the light of the structure of type theory. We note for example that there are several different levels of complexity of inductive definitions in type theory:

- First-order datatypes and type constructors:
  - simple type constructors such as disjoint union and cartesian product, and their logical analogues conjunction and disjunction;
  - natural numbers;
  - other simple datatypes such as lists and binary trees.
- Higher type constructors and transfinite types:
  - the function space construction and implication;
  - Brouwer ordinals.
- Dependent types:
  - disjoint union and cartesian product of a family of types and their logical analogues the quantifiers;
  - the equality relation;
  - other inductively defined relations, such as the transitive closure of a relation.
- Inductively defined families of types such as vectors (lists of a certain length).
- Universes.
- Metamathematical reflection.

There is also another kind of stratification coming from the distinction between ordinary primitive recursive definitions and primitive recursive functionals. We may then structure the history by asking for some particular concept: Where does it come from? How has it been used informally? How was it explained/justified mathematically? When do its rules appear as part of a formal system? In each case we have to look at both sides of the Curry-Howard isomorphism: logical systems and induction principles on one side; programming languages, type systems and definition by recursion on the other.

For example, analysing Π and Σ in this way, we note that the cartesian product and disjoint union of a family of sets are well-known mathematical concepts with a clear set-theoretic definition. The formal type-theoretic rules for Π and Σ were used by Scott [33], who gives the following background [33, p. 240]: "Next de Bruijn made good use of cartesian products of species (formation of function spaces) in connection with the universal quantifier - an idea also familiar to Lawvere and to a certain extent to Läuchli - and the author took this at once. Now dual to products (as Lawvere knows) are disjoint sums which must be used for the interpretation of the existential quantifier (cf. Kreisel - Goodman). These sums were not employed by de Bruijn, but it would be easy to add them to his system." Later Martin-Löf introduced the eliminators split [24] and funsplit [26]. On the logical side we first have Heyting's and Kolmogorov's explanations of the logical constants in terms of proofs and problems respectively. For example, according to Heyting, a proof of ∀x.P[x] is a function that, given an arbitrary element a, returns a proof of P[a], and a proof of ∃x.P[x] is a pair consisting of an element a and a proof of P[a]. These explanations, and also Gentzen's idea that logical constants are defined by their introduction rules, can be read as saying that the logical constants are inductively defined sets of proofs. Gentzen also formulated the system NJ of natural deduction in intuitionistic logic. Rules for reduction of proofs in this system were given by Prawitz.
2 Some Examples
We give some basic examples of inductive definitions represented in type theory. For the complete schema, see [30, 8].

2.1 Natural numbers

We introduce three basic objects

    N : Set,   0 : N,   S : N → N.

These form the introduction rules. The elimination rule is

    natrec : (C : N → Set) C 0 → ((x : N) C x → C (S x)) → (x : N) C x

and the computation rules are

    natrec C a g 0     = a
    natrec C a g (S n) = g n (natrec C a g n).

It is remarkable that the non-dependent version of the elimination rule

    natrec : (C : Set) C → (N → C → C) → N → C

gives primitive recursion and not only iteration. The usual justification of the elimination and computation rules uses the intuition of objects of N as "finite" sequences of S ended "eventually" by a 0.

2.2 Ordinals

We suppose that N has already been introduced, and we introduce

    Ord : Set,   0 : Ord,   S : Ord → Ord,   Lim : (N → Ord) → Ord.

The elimination rule is

    ordrec : (C : Ord → Set) C 0 → ((x : Ord) C x → C (S x))
             → ((u : N → Ord)((n : N) C (u n)) → C (Lim u)) → (x : Ord) C x

and the computation rules are

    ordrec C a g h 0       = a
    ordrec C a g h (S x)   = g x (ordrec C a g h x)
    ordrec C a g h (Lim u) = h u (λn. ordrec C a g h (u n)).

Intuitively an ordinal is a tree, possibly infinite, but always well-founded. It is harder to make intuitive sense of these elimination and computation rules: their justification is of a similar nature to Brouwer's bar induction, as noticed in Troelstra-van Dalen [35].
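For readers who want to experiment, the two inductive sets above transcribe almost literally into a proof assistant based on Martin-Löf type theory. The following is a minimal sketch in Agda; the names (Nat, Ord, natrec, ordrec, ...) and the use of pattern matching to realise the computation rules are choices made here, not notation from the text.

    -- A sketch of the natural numbers and the Brouwer ordinals with their
    -- dependent eliminators; all names are chosen for this illustration.

    data Nat : Set where
      zero : Nat
      succ : Nat → Nat

    -- The dependent eliminator; its two clauses are exactly the computation
    -- rules for natrec stated in the text.
    natrec : (C : Nat → Set) → C zero → ((n : Nat) → C n → C (succ n)) → (n : Nat) → C n
    natrec C a g zero     = a
    natrec C a g (succ n) = g n (natrec C a g n)

    data Ord : Set where
      ozero : Ord
      osucc : Ord → Ord
      olim  : (Nat → Ord) → Ord

    -- The eliminator for ordinals; the olim clause recurses through the
    -- function argument, as in the computation rule for ordrec.
    ordrec : (C : Ord → Set) → C ozero
           → ((x : Ord) → C x → C (osucc x))
           → ((u : Nat → Ord) → ((n : Nat) → C (u n)) → C (olim u))
           → (x : Ord) → C x
    ordrec C a g h ozero     = a
    ordrec C a g h (osucc x) = g x (ordrec C a g h x)
    ordrec C a g h (olim u)  = h u (λ n → ordrec C a g h (u n))

The non-dependent elimination rule of the text is recovered by taking C to be a constant family.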
Exercise: Define the addition and multiplication of ordinals. Define then the operation α ↦ ω^α and use this to define an element ε₀ : Ord which represents the limit of the sequence u₀ = 0, u_{n+1} = ω^{u_n}.
Try then to define such an element without using the elimination rule over ordinals. This is possible, but much more difficult. This exercise gives a good example of how the use of complex notions (elimination rules) can simplify a construction.
2.3 Accessibility

We have at least two ways of representing a transition system (a set with a binary relation) in type theory. The first way is to give a set A with a binary relation R : A → A → Set. The second way is to give a dependent type B : A → Set with a function d : (x : A) B x → A. Intuitively, B a represents the possible transitions from a : A and d a b : A the result of applying the transition b : B a to a.

We show how to express in type theory that, intuitively, there is no infinite sequence of transitions starting from a. The parameters are A : Set, B : A → Set, d : (x : A) B x → A. We introduce Acc : A → Set with a constant

    sup : (a : A)((b : B a) Acc (d a b)) → Acc a

and an elimination rule

    accrec : (C : A → Set)((x : A)((z : B x) C (d x z)) → C x) → (x : A) Acc x → C x

and the computation rule is

    accrec C g x (sup x y) = g x (λz. accrec C g (d x z) (y z)).

Exercise: Given two transition systems A₁, B₁, d₁ and A₂, B₂, d₂, their product can be defined in the following way: take A = A₁ × A₂ and B (a₁, a₂) = B₁ a₁ + B₂ a₂ with d (a₁, a₂) (i b₁) = (d₁ a₁ b₁, a₂) and d (a₁, a₂) (j b₂) = (a₁, d₂ a₂ b₂). Prove then that Acc a₁ → Acc a₂ → Acc (a₁, a₂).

Exercise: Given a transition system A, B, d and U : A → Set, define in type theory the relation U | a (read "U is a bar for a") which holds iff U a or, inductively, U | (d a b) for all b : B a. Show that if U is monotone, that is, if U a implies U (d a b) for all b : B a, then so is U |.
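The accessibility predicate, too, can be transcribed directly. Here is a possible Agda sketch, with the parameters A, B, d taken as parameters of an anonymous module; the names are assumptions of this illustration, not notation from the text.

    -- A sketch of the accessibility predicate over a transition system (A, B, d).
    module _ (A : Set) (B : A → Set) (d : (x : A) → B x → A) where

      data Acc : A → Set where
        sup : (a : A) → ((b : B a) → Acc (d a b)) → Acc a

      -- The eliminator; its single clause is the computation rule of the text.
      accrec : (C : A → Set)
             → ((x : A) → ((z : B x) → C (d x z)) → C x)
             → (x : A) → Acc x → C x
      accrec C g x (sup .x y) = g x (λ z → accrec C g (d x z) (y z))

Note that the recursive call is made on y z, a component of the constructor sup, which is why the definition is accepted as well-founded.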
Type Theory can be seen as a functional programming language with dependent types. Important notions of natural deduction get a concrete representation in type theory: introduction rules are represented as constructors, canonical proofs are represented by terms starting with a constructor, the method of producing a canonical proof from a proof is represented by head-reduction, inductively defined types and relations are represented as data types, and doing a proof by induction over an object is seen as a recursive definition of a proof object.
What can be noticed is that we get in this way a logic with a natural computational semantics. As noticed by Gentzen (for natural deduction), this logic is intuitionistic and does not justify the law of excluded middle. The computational semantics of type theory not only ensures, but gives direct and elegant proofs of, the following strong properties:

- consistency: the absurd proposition ⊥ is not provable;
- existence property: if a property P of natural numbers can be represented in type theory, and if ∃x : N. P(x) has a proof in type theory, then, from this proof, we can effectively find a natural number n that satisfies the property P.
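As a small illustration of how these properties can be read off the computational semantics, here is a hedged Agda sketch; the empty type ⊥ and the Σ-type encoding of the existential quantifier are standard, but the names are chosen here and the comments summarise the argument rather than prove it.

    -- The absurd proposition: an inductive type with no constructor.
    -- Consistency amounts to the fact that it has no closed inhabitant.
    data ⊥ : Set where

    -- A Σ-type: a dependent pair of a witness and a proof.
    record Σ (A : Set) (B : A → Set) : Set where
      constructor _,_
      field
        fst : A        -- the witness
        snd : B fst    -- the proof that the witness satisfies B

    -- Existence property: a closed proof of an existential statement,
    -- represented as a closed term of a type Σ Nat P, normalises to a pair
    -- (n , p); projecting out fst effectively computes the promised witness.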
All these points get a concrete illustration in the examples presented above. For instance,
The types are inductively defined as being N or of the form σ → τ. A context Γ, Δ, ... is a set of typed variables x₁ : σ₁, ..., xₙ : σₙ with x₁, ..., xₙ distinct. The typing rules are the following:

- Γ ⊢ x : σ if x : σ is in Γ,
- Γ ⊢ 0 : N,
- Γ ⊢ S t : N if Γ ⊢ t : N,
- Γ ⊢ t t′ : τ if Γ ⊢ t : σ → τ and Γ ⊢ t′ : σ,
- Γ ⊢ λx t : σ → τ if x does not appear in Γ and Γ, x : σ ⊢ t : τ,
- Γ ⊢ f_{a,b} : N → σ if Γ ⊢ a : σ and Γ ⊢ b : N → σ → σ.

We recall that N → σ → σ should be read N → (σ → σ). If we have Γ ⊢ t : σ then t represents a hypothetical method of type σ with free variables Γ. If Γ = x₁ : σ₁, ..., xₙ : σₙ and uᵢ is a closed expression of type σᵢ then we say that the substitution ρ = (x₁ = u₁, ..., xₙ = uₙ) is a (closed) instantiation of Γ. We can then form the instantiation tρ of the method t and we get a closed expression of type σ. Recursively, the closed expressions are hence described by the following clauses:

- 0 is a closed expression of type N,
- S u is a closed expression of type N, if u is a closed expression of type N,
- u u′ is a closed expression of type τ, if u and u′ are closed expressions of type σ → τ and σ respectively,
- tρ is a closed expression of type σ if Γ ⊢ t : σ and ρ is an instantiation of Γ.

We describe next the head-reduction on closed expressions:
- 0ρ → 0,
- (S t)ρ → S (tρ),
- (λx t)ρ u → t(ρ, x = u),
- u u′ → u₁ u′ if u → u₁,
- f_{a,b}ρ 0 → aρ,
- f_{a,b}ρ (S u) → bρ u (f_{a,b}ρ u),
- f_{a,b}ρ u → f_{a,b}ρ u₁ if u → u₁.

As usual, let us write →* for the reflexive transitive closure of →. We define inductively the predicate R_N u, which expresses that the closed expression u is reducible, or computable, of type N:

    u →* 0              u →* S u₁    R_N u₁
    ------              -------------------
    R_N u                      R_N u

and we define recursively on the type what it means to be reducible at any type: R_{σ→τ} u means that R_σ u′ implies R_τ (u u′).
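The two-level structure of this definition (an inductive predicate at the base type, a recursion on types above it) can be made completely formal. The following Agda sketch shows only that shape: the syntax of closed expressions and the relation →* are merely postulated, and all names are assumptions of this illustration.

    -- A sketch of the reducibility predicate: reducibility at N is an
    -- inductive definition, reducibility at higher types is defined by
    -- recursion on the type.
    postulate
      Expr  : Set                 -- closed expressions of the object language
      zeroE : Expr
      succE : Expr → Expr
      appE  : Expr → Expr → Expr
      _⟶*_ : Expr → Expr → Set   -- reflexive transitive closure of head reduction

    data Ty : Set where
      nat : Ty
      _⇒_ : Ty → Ty → Ty

    -- R_N as an inductively defined predicate (the two rules above).
    data RedN : Expr → Set where
      red0 : ∀ {u}    → u ⟶* zeroE             → RedN u
      redS : ∀ {u u₁} → u ⟶* succE u₁ → RedN u₁ → RedN u

    -- Reducibility at an arbitrary type, by recursion on the type.
    Red : Ty → Expr → Set
    Red nat     u = RedN u
    Red (σ ⇒ τ) u = ∀ {u′} → Red σ u′ → Red τ (appE u u′)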
The proof is divided into two lemmas.

Lemma 1: If R_σ u′ and u → u′ then R_σ u.
Corollary: If R_σ u′ and u →* u′ then R_σ u.
Lemma 2: If x₁ : σ₁, ..., xₙ : σₙ ⊢ t : σ and R_{σᵢ} uᵢ then we have R_σ t(x₁ = u₁, ..., xₙ = uₙ).
Theorem: All closed expressions are reducible.

Let us prove the theorem, assuming first lemma 2. We do an induction on the form of a closed expression v:

- v = 0 is reducible by definition,
- v = S u: by induction hypothesis, u is reducible, hence so is v,
- v = u u′: by induction hypothesis, both u and u′ are reducible, hence so is v,
- v = t(x₁ = u₁, ..., xₙ = uₙ): by induction hypothesis all u₁, ..., uₙ are reducible, hence, by lemma 2, so is v.
Hence the theorem is easily proved if we already have lemma 2. The proofs of lemma 1 and its corollary are left as exercises: to prove lemma 1, do first an induction on the type σ and then, in the base case where σ = N, an induction on the definition of R_N. A predicate on expressions that satisfies the statement of lemma 1 is called saturated.

We prove now the key lemma 2. This is proved by induction on t:

- t = 0: then we have 0ρ → 0, and hence R_N (0ρ) by definition of R_N,
- t = S t′: then we have (S t′)ρ → S (t′ρ); but t′ρ is reducible by induction hypothesis, and hence so is tρ,
- t = λx t′, of type σ → τ: then we have (λx t′)ρ u → t′(ρ, x = u) for any reducible u of type σ. By induction hypothesis t′(ρ, x = u) is reducible, and hence, by lemma 1, tρ u is reducible,
- t = t₁ t₂: then tρ → (t₁ρ)(t₂ρ); by induction hypothesis, both t₁ρ and t₂ρ are reducible, hence (t₁ρ)(t₂ρ) is reducible and so is tρ, using lemma 1,
- t = f_{a,b}: then we prove that tρ u is reducible by induction on the proof of R_N u. If u →* 0 then tρ u →* aρ. Since aρ is reducible by induction hypothesis, so is tρ u by the corollary of lemma 1. If u →* S u₁ with u₁ reducible then tρ u →* bρ u₁ (tρ u₁). By induction hypothesis, tρ u₁ is reducible; also by induction hypothesis (on t), bρ is reducible. Since u₁ is reducible, we can conclude by the corollary of lemma 1.
This proof is a good example of a proof by induction. There are two inductions going on: one is an induction on the types, and the other follows the inductive definition of R_N. The advantage of such a proof is that it is rather direct to extend it to the case of other data types. For instance, if we have a type of ordinals

    O = 0 | S O | L (N → O)

the definition of being reducible of type O will be

    u →* 0          u →* S u₁   R_O u₁          u →* L f    (∀v) R_N v → R_O (f v)
    ------          ------------------          ----------------------------------
    R_O u                 R_O u                               R_O u
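In the same spirit, the reducibility predicate for the type of ordinals can be written as an inductive definition with one rule per constructor. A possible Agda rendering, again over a postulated syntax and with names chosen here, is:

    -- Reducibility for the ordinal type O of the text; the constructor L is
    -- represented by limE applied to an expression of type N → O.
    postulate
      Expr  : Set
      zeroE : Expr
      succE : Expr → Expr
      limE  : Expr → Expr
      appE  : Expr → Expr → Expr
      _⟶*_ : Expr → Expr → Set

    data RedN : Expr → Set where
      red0 : ∀ {u}    → u ⟶* zeroE             → RedN u
      redS : ∀ {u u₁} → u ⟶* succE u₁ → RedN u₁ → RedN u

    data RedO : Expr → Set where
      ord0 : ∀ {u}    → u ⟶* zeroE             → RedO u
      ordS : ∀ {u u₁} → u ⟶* succE u₁ → RedO u₁ → RedO u
      ordL : ∀ {u f}  → u ⟶* limE f
           → ((v : Expr) → RedN v → RedO (appE f v))
           → RedO u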
It is interesting to compare the use of ⊢ and the use of → in this argument. One can consider → as an internalisation of ⊢, while the application of lambda-calculus is an internalisation of the operation of instantiation. The similarity between these two operations appears also in the framework of dependent types: if we write [x : A]B for the dependent product and [x : A] for the operation of extending a context with a type A, then the rule of product formation can be formulated as: Γ ⊢ [x : A]B iff Γ[x : A] ⊢ B.
5 An Interpretation of Π¹₁ Quantification
Since the work of Russell and the incisive remarks of Poincaré, one of the main goals of mathematical logic has been to analyse impredicativity. The problem is the definition of a set of natural numbers (or of a real number) obtained by quantifying over all sets of natural numbers (resp. over all real numbers). One can argue that the collection of all sets of natural numbers (resp. all real numbers) is not given in advance. It can "change", for instance by adding a new definition of a set. There is then a "circularity" problem in a definition which uses universal quantification over subsets: the definition itself may introduce a new set, changing the range of the quantification used in the definition. According to Poincaré's analysis, this is the root of the contradictions that were found in the beginning of set theory. A classical example of such a situation is the definition of a lower bound of a set of real numbers, where a real number is defined as a Dedekind cut, that is, a downward closed subset of rational numbers. Given a set C of such subsets, the lower bound of the set C is the intersection of all elements of C, and this intersection is defined by universal quantification over all sets of rational numbers.

Here we show how to explain this circularity in terms of inductive definitions/well-founded trees. This explains why inductive definitions play a crucial rôle in proof theory [4]. The basic idea [19] is that it is possible to explain inductively when a formula (∀X)A(X) is valid, if A(X) does not contain any universal quantification over predicates: introduce a new atomic predicate X, about which we know nothing, and then explain inductively when A(X) is correct. For instance (∀X)(X 5 → X 5) is correct since X 5 → X 5 is intuitionistically correct, with X a variable. The problem with this explanation is that it explains when a formula (∀X)A(X) is valid, but does not explain what it means to assume a formula (∀X)A(X), that is, what can be the meaning of ((∀X)A(X)) → φ. In order to solve this, we have to use a Kripke model. We consider the following (infinitary, hence described by an inductive definition) language H(X):

    H(X) = 0 | 1 | ⋀ₙ φₙ | φ → ψ | X n
where 0 stands for the false, 1 for the true, ⋀ₙ φₙ is an infinitary conjunction, and n is a natural number n = 0, 1, .... In this language we can represent arbitrary arithmetic formulae, and even formulae depending on an extra predicate variable X. Given an interpretation of X it is possible to give a meaning to a formula A(X) of H(X), but it is not possible in general to explain predicatively when a formula A(X) is valid for every possible interpretation of X. In some cases this is possible, because we can reinterpret (∀X)A(X): for instance if A(X) is

    ⋀ₗ ((⋀ₖ (k ≺ l → X k)) → X l) → X n

then (∀X)A(X) is equivalent to saying that n is accessible with respect to the relation ≺. But, in general, for complicated formulae A(X) we cannot reinterpret (∀X)A(X). We can interpret the language H(X) more generally in any Kripke model. Given an arbitrary poset P with a least element 0, let Down(P) be the collection of downward closed subsets of P containing 0. Given an interpretation D of X as a function N → Down(P), we give a meaning ⟦φ⟧_{X=D} in Down(P) to every formula φ of H(X): defining, for X and Y in Down(P),
    X ⇒ Y = {x ∈ P | (∀x′ ≤ x) x′ ∈ X ⇒ x′ ∈ Y}
we have

    ⟦0⟧ = {0},   ⟦1⟧ = P,   ⟦⋀ₙ φₙ⟧ = ⋂ₙ ⟦φₙ⟧,   ⟦φ → ψ⟧ = ⟦φ⟧ ⇒ ⟦ψ⟧,   ⟦X n⟧ = D(n).

The problem will be the same however: in general ⋂_D ⟦φ⟧_{X=D} cannot be described predicatively. The main point is now that there exists a particular poset P₀ such that, for any formula φ(X) of H(X), ⋂_D ⟦φ⟧_{X=D} can be described predicatively. To build this poset P₀, consider the languages H(X₀, ..., Xₖ), like H(X) but with finitely many atomic predicates X₀, ..., Xₖ, and define P₀ to be the union ⋃ₖ H(X₀, ..., Xₖ). An element of P₀ is a formula φ(X₀, ..., Xₖ). As for H(X), for any poset P, we can give a meaning ⟦φ⟧_{Xᵢ=Eᵢ} of any formula φ ∈ P₀ as an element of Down(P) if we are given a meaning Eᵢ : N → Down(P) of each Xᵢ.
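The clause-by-clause interpretation above is a definition by recursion on the (inductively defined) formulae of H(X). The following Agda sketch makes this shape explicit; downward closure of the semantic values (and the condition that they contain 0) is not enforced, the poset is postulated, and all names are chosen for this illustration.

    -- A sketch of the language H(X) and of its Kripke interpretation.
    postulate
      Nat : Set              -- indexes the predicate X and the big conjunctions
      P   : Set              -- the underlying poset
      O   : P                -- its least element 0
      _≼_ : P → P → Set      -- the order on P

    data Unit : Set where
      tt : Unit

    data _≡_ {A : Set} (a : A) : A → Set where
      refl : a ≡ a

    -- Formulae: false, true, infinitary conjunction, implication, X n.
    data Form : Set where
      bot top : Form
      big∧    : (Nat → Form) → Form
      _⊃_     : Form → Form → Form
      X       : Nat → Form

    -- Semantic values are predicates on P (downward closure left implicit).
    Down : Set₁
    Down = P → Set

    -- The interpretation, by recursion on the formula, given D : Nat → Down.
    ⟦_⟧ : Form → (Nat → Down) → Down
    ⟦ bot ⟧      D x = x ≡ O
    ⟦ top ⟧      D x = Unit
    ⟦ (big∧ φ) ⟧ D x = (n : Nat) → ⟦ φ n ⟧ D x
    ⟦ (φ ⊃ ψ) ⟧  D x = ∀ {x′} → x′ ≼ x → ⟦ φ ⟧ D x′ → ⟦ ψ ⟧ D x′
    ⟦ (X n) ⟧    D x = D n x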
Furthermore, we can explain using only inductive definitions when a formula φ ∈ P₀ is valid. This can be done for instance by describing inductively when a finite set Δ of formulae entails another formula φ:
- Δ ⊢ φ if φ ∈ Δ,
- Δ ⊢ φ → ψ if Δ, φ ⊢ ψ,
- Δ ⊢ 1,
- Δ ⊢ φ if Δ ⊢ 0,
- Δ ⊢ ⋀ₙ φₙ if Δ ⊢ φₙ for all n.

We write now φ ≤ ψ iff φ → ψ is valid. This defines a poset P₀ with a least element 0. The fact that the relation ≤ is transitive follows from Gentzen's cut-elimination theorem: if Δ ⊢ φ and Δ′, φ ⊢ ψ then Δ, Δ′ ⊢ ψ, which is a good example of a theorem proved by induction.

It can then be proved that if φ ≤ ψ holds then for any poset P and for any interpretation Eᵢ of the Xᵢ we have ⟦φ⟧_{Xᵢ=Eᵢ} ⊆ ⟦ψ⟧_{Xᵢ=Eᵢ}. This expresses the soundness of Kripke models with respect to this logical calculus. The completeness will be proved below. The paper [23] presents a calculus of proofs for a calculus similar to the one given here.
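The entailment relation just described is itself a typical inductive definition, and it transcribes directly into an inductive family. The sketch below repeats a formula datatype like the one in the previous sketch so that it stands alone; contexts are finite lists of formulae, and all names are chosen for this illustration.

    -- A sketch of the entailment relation Δ ⊢ φ as an inductive family.
    postulate
      Nat : Set

    data Form : Set where
      bot top : Form
      big∧    : (Nat → Form) → Form
      _⊃_     : Form → Form → Form
      X       : Nat → Form

    data List (A : Set) : Set where
      []  : List A
      _∷_ : A → List A → List A

    -- Membership of a formula in a context.
    data _∈_ (φ : Form) : List Form → Set where
      here  : ∀ {Δ}   → φ ∈ (φ ∷ Δ)
      there : ∀ {ψ Δ} → φ ∈ Δ → φ ∈ (ψ ∷ Δ)

    -- One constructor per rule of the text.
    data _⊢_ : List Form → Form → Set where
      ax      : ∀ {Δ φ}   → φ ∈ Δ → Δ ⊢ φ
      imp-i   : ∀ {Δ φ ψ} → (φ ∷ Δ) ⊢ ψ → Δ ⊢ (φ ⊃ ψ)
      true-i  : ∀ {Δ}     → Δ ⊢ top
      false-e : ∀ {Δ φ}   → Δ ⊢ bot → Δ ⊢ φ
      conj-i  : ∀ {Δ φ}   → ((n : Nat) → Δ ⊢ φ n) → Δ ⊢ big∧ φ

A cut rule is not a constructor here; as the text notes, its admissibility is itself a theorem proved by induction on derivations.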
Given a formula A(X) of H(X), we now interpret (∀X)A(X) as a downward closed subset S of P₀. This is the set S of all formulae φ ∈ P₀ such that φ ⊢ A(Xᵢ) whenever Xᵢ does not appear in φ. Notice that if φ ⊢ A(Xᵢ) for one Xᵢ not in φ, then we have φ ⊢ A(Xₗ) for all Xₗ not in φ. Of course, it is essential that this set is defined without use of universal quantification over predicates.

Lemma 1: S is a downward closed subset of P₀.

Proof: If ψ ∈ S and φ ≤ ψ, let Xₗ be an atomic predicate occurring neither in φ nor in ψ. We have then ψ ⊢ A(Xₗ) by hypothesis, and hence φ ⊢ A(Xₗ). This implies φ ∈ S by the remark above.

If φ ∈ P₀, let ↓(φ) be the set of formulae ψ ≤ φ. We let Dₖ(n) be the set ↓(Xₖ n).

Lemma 2: We have ↓(⋀ₙ φₙ) = ⋂ₙ ↓(φₙ) and ↓(φ → ψ) = ↓(φ) ⇒ ↓(ψ).

Lemma 3: ⟦φ⟧_{Xₖ=Dₖ} = ↓(φ).

Proof: This is true by definition for φ = Xₖ n and for φ = 0 or φ = 1. It then holds by induction on the formula φ, using lemma 2.

It follows from this lemma that we have φ ≤ ψ iff, for all Kripke models and all interpretations of the Xᵢ, we have ⟦φ⟧ ⊆ ⟦ψ⟧. This lemma is crucial in the argument, and holds only because of the use of ω-logic.⁴

Lemma 4: S is the intersection of all interpretations ⟦A(X)⟧_{X=D}, where D varies over all possible functions N → Down(P₀).

Proof: Our claim is that S is the intersection of all ⟦A(X)⟧_{X=D}. First, if φ ∈ S then φ ⊢ A(Xᵢ) for Xᵢ not in φ. Hence φ → A(Xᵢ) is a valid formula, which holds in any model, in particular in the Kripke model over P₀ with Xₖ interpreted as Dₖ and Xᵢ as D. It follows that we have

    ⟦A(Xᵢ)⟧_{Xₖ=Dₖ, Xᵢ=D} = ⟦A(X)⟧_{Xₖ=Dₖ, X=D}.

By lemma 3, we have ⟦φ⟧_{Xₖ=Dₖ} = ↓(φ), and hence φ ∈ ⟦A(Xᵢ)⟧_{Xᵢ=D} as expected. Conversely, if φ ∈ ⟦A(X)⟧_{X=D} for all possible interpretations D, then in particular, for Xᵢ not in φ, we have φ ∈ ⟦A(X)⟧_{X=Dᵢ} = ⟦A(Xᵢ)⟧_{Xₖ=Dₖ}. Since ⟦A(Xᵢ)⟧_{Xₖ=Dₖ} = ↓(A(Xᵢ)) by lemma 3, this implies φ ⊢ A(Xᵢ).
Second-order arithmetic HA₂ is defined as a two-sorted logic: with variables n, m, x, y, ... over natural numbers and variables X, Y, Z, ... over sets of natural numbers. We can form, besides the usual propositions of Heyting arithmetic, the atomic proposition x ∈ X, and the comprehension axiom can be stated as

    ∃X ∀x [x ∈ X ↔ φ(x)]

where φ(x) is any formula. If we restrict φ(x) to a formula of the form ∀X₁ ... ∀Xₙ ψ(X₁, ..., Xₙ, x) without set quantification, we have Π¹₁-comprehension. We have built a model of Π¹₁-comprehension without parameters: the formula φ(x) has no free set variables.
⁴ It follows from Gödel's incompleteness theorem that the impredicative quantification we deal with cannot be explained using only finitary inductive definitions.
The first to obtain an explanation of Π¹₁ quantification in terms of inductive definitions was Takeuti, by a somewhat indirect method: he obtained a partial cut-elimination result generalising the one of Gentzen, using ordinal diagrams, which can be proved well-founded using inductive definitions. The method we have used is inspired by the one in the book of Buchholz-Schütte [3], and by some remarks of Lorenzen. We have explained only a special case of Π¹₁ quantification: in (∀X)A(X) the formula A(X) cannot have any universal quantification over predicates. In order to cover a more general case, one needs to use iterated inductive definitions. The program of reducing the circularity of quantifications over predicates is still ongoing. The more recent progress in this direction can be elegantly described in terms of type theory: one considers type theory with W-types (iterated inductive definitions) with stronger and stronger reflection principles. These reflection principles are quite similar to large cardinal axioms in set theory. A simple explanation of Π¹₂-comprehension, generalising the one we have given, is still a challenge!
References
[1] P. Aczel. An introduction to inductive definitions. In J. Barwise, editor, Handbook of Mathematical Logic, pages 739-782. North-Holland, 1977.
[2] M. Beeson. Foundations of Constructive Mathematics. Springer-Verlag, 1985.
[3] W. Buchholz and K. Schütte. Proof Theory of Impredicative Subsystems of Analysis. Studies in Proof Theory, Monographs 2. Bibliopolis, Naples, 1988.
[4] W. Buchholz, S. Feferman, W. Pohlers, and W. Sieg. Iterated Inductive Definitions and Subsystems of Analysis. Springer, Berlin, 1981.
[5] R. M. Burstall. Proving properties of programs by structural induction. Computer Journal, (39):135-154, 1985.
[6] J. Despeyroux. Proof of translation in natural semantics. In Proceedings of the First ACM Conference on Logic in Computer Science, pages 193-205, 1986.
[7] E. W. Dijkstra. A Discipline of Programming. Prentice-Hall, 1976.
[8] P. Dybjer. Inductive families. Formal Aspects of Computing, 6:440-465, 1994.
[9] K. Gödel. Collected Works, Volumes I and II. Oxford University Press, 1986.
[10] J. A. Goguen, J. W. Thatcher, E. G. Wagner, and J. B. Wright. An Initial Algebra Approach to the Specification, Correctness, and Implementation of Abstract Data Types. Prentice Hall, 1978.
[11] D. Gries. The Science of Programming. Springer Texts and Monographs in Computer Science, 1981.
[12] J. Herbrand. On the consistency of arithmetic. In J. van Heijenoort, editor, From Frege to Gödel, pages 618-628. Harvard University Press.
[13] C. Hoare. Recursive data structures. International Journal of Computer and Information Sciences, 4:105-124, 1975.
[14] W. Howard. Functional interpretation of bar induction by bar recursion. Compositio Mathematica, 20:107-124.
[15] G. Kahn. Natural semantics. Technical Report 601, 1987.
[16] S. Kleene. On the interpretation of intuitionistic number theory. Journal of Symbolic Logic, 10:109-124, 1945.
[17] G. Kreisel. Interpretation of analysis by means of constructive functionals of finite type. In Heyting, editor, Constructivity in Mathematics, pages 101-128. North-Holland, 1959.
[18] F. W. Lawvere. Equality in hyperdoctrines and comprehension schema as an adjoint functor. In A. Heller, editor, Applications of Categorical Algebra, Proceedings of Symposia in Pure Mathematics. AMS, 1970.
[19] P. Lorenzen. Logical reflection and formalism. Journal of Symbolic Logic, 23:241-249, 1958.
[20] P. Lorenzen. Métamathématique. Gauthier-Villars, 1962.
[21] P. Martin-Löf. Notes on Constructive Mathematics. Almqvist & Wiksell, Stockholm, 1968.
[22] P. Martin-Löf. Hauptsatz for the intuitionistic theory of iterated inductive definitions. In J. E. Fenstad, editor, Proceedings of the Second Scandinavian Logic Symposium, pages 179-216. North-Holland, 1971.
[23] P. Martin-Löf. Infinite terms and a system of natural deduction. Compositio Mathematica, 34:93-103, 1972.
[24] P. Martin-Löf. An intuitionistic theory of types: predicative part. In Logic Colloquium '73, pages 73-118. North-Holland, 1975.
[25] P. Martin-Löf. The domain interpretation of type theory, lecture notes. In K. Karlsson and K. Petersson, editors, Workshop on Semantics of Programming Languages, Abstracts and Notes, Chalmers University of Technology and University of Göteborg, August 1983. Programming Methodology Group.
[26] P. Martin-Löf. Intuitionistic Type Theory. Bibliopolis, 1984.
[27] J. McCarthy and J. A. Painter. Correctness of a compiler for arithmetic expressions. In Mathematical Aspects of Computer Science, pages 33-41. AMS, 1967.
[28] B. Nordström, K. Petersson, and J. Smith. Programming in Martin-Löf's Type Theory: An Introduction. Oxford University Press, 1990.
[29] C. Paulin-Mohring. Extracting Fω programs from proofs in the Calculus of Constructions. In Sixteenth Annual ACM Symposium on Principles of Programming Languages. ACM, 1989.
[30] C. Paulin-Mohring. Inductive definitions in the system Coq - rules and properties. In Proceedings Typed λ-Calculi and Applications, pages 328-345. Springer-Verlag LNCS, March 1993.
[31] G. D. Plotkin. A structural approach to operational semantics. Technical Report DAIMI FN-19, Computer Science Department, Aarhus University, Aarhus, Denmark, 1981.
[32] E. Post. Absolutely unsolvable problems and relatively undecidable propositions: account of an anticipation. In M. Davis, editor, The Undecidable. Raven Press, Hewlett, NY, 1965.
[33] D. S. Scott. Constructive validity. In Symposium on Automatic Demonstration, pages 237-275. Springer Lecture Notes in Mathematics 125, 1970.
[34] W. W. Tait. Normal Derivability in Classical Logic. Springer, 1968.
[35] A. S. Troelstra and D. van Dalen. Constructivism in Mathematics: An Introduction. North-Holland, 1988.