Mathematical Structures in Computer Science, Apr 29, 2014
Seely's paper Locally cartesian closed categories and type theory contains a well-known result in categorical type theory: that the category of locally cartesian closed categories is equivalent to the category of Martin-Löf type theories with Π, Σ, and extensional identity types. However, Seely's proof relies on the problematic assumption that substitution in types can be interpreted by pullbacks. Here we prove a corrected version of Seely's theorem: that the Bénabou-Hofmann interpretation of Martin-Löf type theory in locally cartesian closed categories yields a biequivalence of 2-categories. To facilitate the technical development we employ categories with families as a substitute for syntactic Martin-Löf type theories. As a second result we prove that if we remove Π-types the resulting categories with families with only Σ and extensional identity types are biequivalent to left exact categories.
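As background for readers unfamiliar with the notion (this is the standard definition, not material reproduced from the paper): a category with families is a category C with a terminal object, together with, for each context Γ, a set Ty(Γ) of types and, for each A ∈ Ty(Γ), a set Tm(Γ, A) of terms, equipped with substitution operations A[γ] and a[γ] that act functorially in γ : Δ → Γ, and with context comprehension Γ.A, given by projections p : Γ.A → Γ and q ∈ Tm(Γ.A, A[p]) and the pairing rule
\[
\frac{\gamma : \Delta \to \Gamma \qquad a \in \mathrm{Tm}(\Delta, A[\gamma])}{\langle \gamma, a \rangle : \Delta \to \Gamma.A}
\qquad
\mathrm{p} \circ \langle \gamma, a \rangle = \gamma,
\qquad
\mathrm{q}[\langle \gamma, a \rangle] = a,
\]
where ⟨γ, a⟩ is the unique morphism satisfying these two equations.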
We give a new syntax independent definition of the notion of a finitely presented generalized algebraic theory as an initial object in a category of categories with families (cwfs) with extra structure. To this end we define inductively how to build a valid signature Σ for a generalized algebraic theory and the associated category CwFΣ of cwfs with a Σ-structure and cwf-morphisms that preserve Σ-structure on the nose. Our definition refers to the purely semantic notions of uniform family of contexts, types, and terms. Furthermore, we show how to syntactically construct initial cwfs with Σ-structures. This result can be viewed as a generalization of Birkhoff's completeness theorem for equational logic. It is obtained by extending Castellan, Clairambault, and Dybjer's construction of an initial cwf. We provide examples of generalized algebraic theories for monoids, categories, categories with families, and categories with families with extra structure for some type formers of dependent type theory. The models of these are internal monoids, internal categories, and internal categories with families (with extra structure) in a category with families. Finally, we show how to extend our definition to some generalized algebraic theories that are not finitely presented, such as the theory of contextual categories with families.
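To fix ideas, here is a sketch (a standard example, not copied from the paper) of the signature of one of the simplest generalized algebraic theories mentioned above, that of monoids, with a single non-dependent sort:
\[
\begin{array}{ll}
\text{sort:} & M \;\; \mathrm{type} \\
\text{operations:} & e : M, \qquad x : M,\ y : M \;\vdash\; x \cdot y : M \\
\text{equations:} & (x \cdot y) \cdot z = x \cdot (y \cdot z), \qquad e \cdot x = x, \qquad x \cdot e = x
\end{array}
\]
A model of this signature in a cwf is an internal monoid. What separates generalized algebraic theories from ordinary multi-sorted ones is that sorts may depend on terms of earlier sorts, as with the hom-sorts of the theory of categories, which are indexed by pairs of objects.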
HAL (Le Centre pour la Communication Scientifique Directe), Nov 30, 2017
We show that a version of Martin-Löf type theory with an extensional identity type former I, a unit type N1, Σ-types, Π-types, and a base type is a free category with families (supporting these type formers) both in a 1- and a 2-categorical sense. It follows that the underlying category of contexts is a free locally cartesian closed category in a 2-categorical sense because of a previously proved biequivalence. We show that equality in this category is undecidable by reducing it to the undecidability of convertibility in combinatory logic. Essentially the same construction also shows a slightly strengthened form of the result that equality in extensional Martin-Löf type theory with one universe is undecidable.
We show that Hyland and Ong's game semantics for PCF can be presented using normalization by evaluation (nbe). We use the bijective correspondence between innocent well-bracketed strategies and PCF Böhm trees, and show how operations on PCF Böhm trees, such as composition, can be computed lazily and simply by nbe. The usual equations characteristic of games follow from the nbe construction without reference to low-level game-theoretic machinery. As an illustration, we give a Haskell program computing the application of innocent strategies.
The aim of this paper is to refine and extend proposals by Sozeau and Tabareau and by Voevodsky for universe polymorphism in type theory. In those systems judgments can depend on explicit constraints between universe levels. We here present a system where we also have products indexed by universe levels and by constraints. Our theory has judgments for internal universe levels, built up from level variables by a successor operation and a binary supremum operation, and also judgments for equality of universe levels. (2012 ACM Subject Classification: Theory of computation → Type theory. Keywords and phrases: type theory, universes in type theory, universe polymorphism, level-indexed products, constraint-indexed products.)
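A minimal Haskell sketch of the level algebra described above, under the assumption (ours, for illustration; it is not claimed to coincide with the paper's judgmental equality of levels) that two level expressions are equal exactly when they agree as maxima of variables plus offsets:

```haskell
import           Data.Map (Map)
import qualified Data.Map as Map

-- Level expressions: variables, successor, binary supremum.
data Level
  = LVar String        -- level variable
  | LSuc Level         -- successor
  | LSup Level Level   -- binary supremum
  deriving Show

-- Normal form: for each variable, the largest offset added to it,
-- so a level reads as "maximum over (variable + offset)".
type NF = Map String Int

norm :: Level -> NF
norm (LVar x)   = Map.singleton x 0
norm (LSuc l)   = Map.map (+ 1) (norm l)              -- successor distributes over sup
norm (LSup l m) = Map.unionWith max (norm l) (norm m)

-- e.g. levelEq (LSup (LSuc (LVar "i")) (LVar "i")) (LSuc (LVar "i")) == True
levelEq :: Level -> Level -> Bool
levelEq l m = norm l == norm m
```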
Lecture Notes in Computer Science. Peter Dybjer, Bengt Nordström, Jan Smith (Eds.): Types for Proofs and Programs, International Workshop TYPES'94, Båstad, Sweden, June 1994, Selected Papers. Springer.
Type-checking algorithms for dependent type theories often rely on the interpretation of terms in some semantic domain of values when checking equalities. Here we analyze a version of Coquand's algorithm for checking the βη-equality of such semantic values in a theory with a predicative universe hierarchy and large elimination rules. Although this algorithm does not rely on normalization by evaluation explicitly, we show that similar ideas can be employed for its verification. In particular, our proof uses the new notions of contextual reification and strong semantic equality. The algorithm is part of a bi-directional type checking algorithm which checks whether a normal term has a certain semantic type, a technique used in the proof assistants Agda and Epigram. We work with an abstract notion of semantic domain in order to accommodate a variety of possible implementation techniques, such as normal forms, weak head normal forms, closures, and compiled code. Our aim is to get closer than previous work to verifying the type-checking algorithms which are actually used in practice.
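To convey only the bidirectional discipline (checking a term against a type versus inferring a type), here is a deliberately toy Haskell sketch for the simply typed lambda calculus; the paper's algorithm instead compares semantic values for a dependently typed theory, which this code does not attempt to model:

```haskell
-- A toy bidirectional checker for the simply typed lambda calculus.
data Ty = Base | Arr Ty Ty deriving (Eq, Show)

data Tm
  = Var String     -- variables: type is looked up (inferred)
  | App Tm Tm      -- applications: type is inferred
  | Lam String Tm  -- abstractions: type is checked against an arrow type
  deriving Show

type Ctx = [(String, Ty)]

-- Infer (synthesize) a type for a term whose type is determined by its head.
infer :: Ctx -> Tm -> Maybe Ty
infer ctx (Var x)   = lookup x ctx
infer ctx (App f a) = case infer ctx f of
  Just (Arr dom cod) | check ctx a dom -> Just cod
  _                                    -> Nothing
infer _   (Lam _ _) = Nothing            -- lambdas are only ever checked

-- Check a (normal) term against a given type.
check :: Ctx -> Tm -> Ty -> Bool
check ctx (Lam x body) (Arr dom cod) = check ((x, dom) : ctx) body cod
check _   (Lam _ _)    _             = False
check ctx tm           ty            = infer ctx tm == Just ty
```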
Coherence and valid isomorphism in closed categories applications of proof theory to category theory in a computer scientist perspective.- An algebraic view of interleaving and distributed operational semantics for CCS.- Temporal structures.- Compositional relational semantics for indeterminate dataflow networks.- Operations on records.- Projections for polymorphic strictness analysis.- A category-theoretic account of program modules.- A note on categorical datatypes.- A set constructor for inductive sets in Martin-Löf's type theory.- Independence results for calculi of dependent types.- Quantitative domains, groupoids and linear logic.- Graded multicategories of polynomial-time realizers.- On the semantics of second order lambda calculus: From Bruce-Meyer-Mitchell models to hyperdoctrine models and vice-versa.- Dictoses.- Declarative continuations: An investigation of duality in programming language semantics.- Logic representation in LF.- Unification properties of commutative theories: A categorical treatment.- An abstract formulation for rewrite systems.- From Petri nets to linear logic.- A Dialectica-like model of linear logic.- A final coalgebra theorem.
This volume is the post-proceedings of the 24th International Conference on Types for Proofs and Programs, TYPES 2018, which was held at Universidade do Minho in Braga, Portugal, from 18 to 21 June 2018. The TYPES meetings are a forum to present new and ongoing work in all aspects of type theory and its applications, especially in formalized and computer-assisted reasoning and computer programming. The meetings from 1990 to 2008 were annual workshops of a sequence of five EU-funded networking projects. Since 2009, TYPES has been run as an independent conference series.
We prove the correctness of an algorithm for normalizing untyped combinator terms by evaluation. The algorithm is written in the functional programming language Haskell, and we prove that it lazily computes the combinatory Böhm tree of the term. The notion of combinatory Böhm tree is analogous to the usual notion of Böhm tree for the untyped lambda calculus. It is defined operationally by repeated head reduction of terms to head normal forms. We use formal neighbourhoods to characterize finite, partial information about data, and define a Böhm tree as a filter of such formal neighbourhoods. We also define formal topology style denotational semantics of a fragment of Haskell following Martin-Löf, and let each closed program denote a filter of formal neighbourhoods. We prove that the denotation of the output of our algorithm is the Böhm tree of the input term. The key construction in the proof is a "glueing" relation between terms and semantic neighbourhoods which is defined by induction on the latter. This relation is related to the glueing relation which was earlier used for proving the correctness of a normalization by evaluation algorithm for typed combinatory logic.
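A small Haskell sketch of the operational definition mentioned above, namely combinatory Böhm trees obtained by repeated head reduction; it illustrates the specification only and is not the normalization-by-evaluation algorithm whose correctness the paper proves:

```haskell
-- Untyped combinator terms over S and K.
data Tm = S | K | Tm :@ Tm deriving Show
infixl 9 :@

-- One step of head (leftmost, outermost) reduction, if any.
headStep :: Tm -> Maybe Tm
headStep (K :@ x :@ _)      = Just x
headStep (S :@ x :@ y :@ z) = Just (x :@ z :@ (y :@ z))
headStep (t :@ u)           = fmap (:@ u) (headStep t)
headStep _                  = Nothing

-- Head normal form, if one exists; diverges otherwise.
hnf :: Tm -> Tm
hnf t = maybe t hnf (headStep t)

-- A combinatory Böhm tree: the head combinator of the head normal form
-- together with the (lazily computed) Böhm trees of its arguments.
data BT = BT Tm [BT] deriving Show

-- e.g. boehm (S :@ K :@ K :@ K) = BT K []
boehm :: Tm -> BT
boehm t = BT hd (map boehm args)
  where
    (hd, args) = spine (hnf t)
    spine (f :@ a) = let (h, as) = spine f in (h, as ++ [a])
    spine c        = (c, [])
```

Nodes where no head normal form exists correspond to undefined (bottom) parts of the tree; the recursion above simply fails to produce them.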
Seely's paper Locally cartesian closed categories and type theory contains a well-known result in categorical type theory: that the category of locally cartesian closed categories is equivalent to the category of Martin-Löf type theories with Π, Σ, and extensional identity types. However, Seely's proof relies on the problematic assumption that substitution in types can be interpreted by pullbacks. Here we prove a corrected version of Seely's theorem: that the Bénabou-Hofmann interpretation of Martin-Löf type theory in locally cartesian closed categories yields a biequivalence of 2-categories. To facilitate the technical development we employ categories with families as a substitute for syntactic Martin-Löf type theories. As a second result we prove that if we remove Π-types the resulting categories with families are biequivalent to left exact categories.
We show how to write generic programs and proofs in Martin-Löf type theory. To this end we consider several extensions of Martin-Löf's logical framework for dependent types. Each extension has a universe of codes (signatures) for inductively defined sets with generic formation, introduction, elimination, and equality rules. These extensions are modeled on Dybjer and Setzer's finitely axiomatized theories of inductive-recursive definitions, which also have a universe of codes for sets, and generic formation, introduction, elimination, and equality rules.
Martin-Löf's type theory can be described as an intuitionistic theory of iterated inductive definitions developed in a framework of dependent types. It was originally intended to be a full-scale system for the formalization of constructive mathematics, but has also proved to be a powerful framework for programming. The theory integrates an expressive specification language (its type system) and a functional programming language (where all programs terminate). There now exist several proof-assistants based on type theory, and many non-trivial examples from programming, computer science, logic, and mathematics have been implemented using these. In this series of lectures we shall describe type theory as a theory of inductive definitions. We emphasize its open nature: much like in a standard functional language such as ML or Haskell the user can add new types whenever there is a need for them. We discuss the syntax and semantics of the theory. Moreover, we present some examples ...
A general formulation of inductive and recursive definitions in Martin-Löf's type theory is presented. It extends Backhouse's 'Do-It-Yourself Type Theory' to include inductive definitions of families of sets and definitions of functions by recursion on the way elements of such sets are generated. The formulation is in natural deduction and is intended to be a natural generalization to type theory of Martin-Löf's theory of iterated inductive definitions in predicate logic. Formal criteria are given for correct formation and introduction rules of a new set former capturing definition by strictly positive, iterated, generalized induction. Moreover, there is an inversion principle for deriving elimination and equality rules from the formation and introduction rules. Finally, there is an alternative schematic presentation of definition by recursion. The resulting theory is a flexible and powerful language for programming and constructive mathematics. We hint at the we...
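A familiar instance covered by such a schema (a standard example, not taken from the paper) is the set former W of well-founded trees, with formation and introduction rules
\[
\frac{A : \mathrm{Set} \qquad B : A \to \mathrm{Set}}{\mathrm{W}(A,B) : \mathrm{Set}}
\qquad\qquad
\frac{a : A \qquad f : B(a) \to \mathrm{W}(A,B)}{\mathrm{sup}(a,f) : \mathrm{W}(A,B)}
\]
from which an inversion principle of the kind described produces the usual elimination (transfinite recursion) and equality rules.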
The first example of a simultaneous inductive-recursive definition in intuitionistic type theory is Martin-Löf's universe à la Tarski. A set U0 of codes for small sets is generated inductively at the same time as a function T0, which maps a code to the corresponding small set, is defined by recursion on the way the elements of U0 are generated. In this paper we argue that there is an underlying general notion of simultaneous inductive-recursive definition which is implicit in Martin-Löf's intuitionistic type theory. We extend previously given schematic formulations of inductive definitions in type theory to encompass a general notion of simultaneous induction-recursion. This enables us to give a unified treatment of several interesting constructions including various universe constructions by Palmgren, Griffor, Rathjen, and Setzer and a constructive version of Aczel's Frege structures. Consistency of a restricted version of the extension is shown by constructing a rea...
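In standard notation (not reproduced from the paper), the simultaneity is already visible in the rules for a single code, say the code σ for Σ-types:
\[
\frac{}{U_0 : \mathrm{Set}}
\qquad
\frac{a : U_0}{T_0(a) : \mathrm{Set}}
\qquad
\frac{a : U_0 \qquad b : T_0(a) \to U_0}{\sigma(a,b) : U_0}
\qquad
T_0(\sigma(a,b)) = \Sigma(T_0(a), \lambda x.\, T_0(b\,x))
\]
The introduction rule for U_0 refers to T_0, and the defining equation of T_0 on the new code refers back to the constructor of U_0, which is exactly what a schema for simultaneous induction-recursion has to accommodate.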
We describe the breadth-first traversal algorithm by Martin Hofmann that uses a non-strictly positive data type and carry out a simple verification in an extensional setting. Termination is shown by implementing the algorithm in the strongly normalising extension of system F by Mendler-style recursion. We then analyze the same algorithm by alternative verifications first in an intensional setting using a non-strictly positive inductive definition (not just a non-strictly positive data type), and subsequently by two different algebraic reductions. The verification approaches are compared in terms of notions of simulation and should elucidate the somewhat mysterious algorithm and thus make a case for other uses of non-strictly positive data types. Except for the termination proof, which cannot be formalised in Coq, all proofs were formalised in Coq and some of the algorithms were implemented in Agda and Haskell. (2012 ACM Subject Classification: Theory of computation → Logic and verific...)
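For orientation, the non-strictly positive data type at the heart of Hofmann's algorithm is usually presented along the following lines (a hedged Haskell rendering from memory; the element type [Int] and the name extract are incidental choices, and the breadth-first traversal built on top of it is omitted):

```haskell
-- Rou occurs to the left of two function arrows in its own constructor,
-- so the declaration is not strictly positive; Haskell accepts it, while
-- Coq's and Agda's positivity checkers reject it by default.
data Rou = Over | Next ((Rou -> [Int]) -> [Int])

-- One natural way to consume such a value: tie the knot recursively.
extract :: Rou -> [Int]
extract Over     = []
extract (Next f) = f extract
```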