
    Proviola: A Tool for Proof Re-animation

    To improve on existing models of interaction with a proof assistant (PA), in particular for storage and replay of proofs, we introduce three related concepts: a proof movie, consisting of frames which record both user input and the corresponding PA response; a camera, which films a user's interactive session with a PA as a movie; and a proviola, which replays a movie frame-by-frame to a third party. In this paper we describe the movie data structure and we discuss a prototype implementation of the camera and proviola based on the ProofWeb system. ProofWeb uncouples the interaction with a PA via a web interface (the client) from the actual PA that resides on the server. Our camera films a movie by "listening" to the ProofWeb communication. The first reason for developing movies is to uncouple the reviewing of a formal proof from the PA used to develop it: the movie concept enables users to discuss small code fragments without the need to install the PA or to load a whole library into it. Other advantages include the possibility to develop a separate commentary track to discuss or explain the PA interaction. We assert that a combined camera+proviola provides a generic layer between a client (user) and a server (PA). Finally we claim that movies are the right type of data to be stored in an encyclopedia of formalized mathematics, based on our experience in filming the Coq standard library. Comment: Accepted for the 9th International Conference on Mathematical Knowledge Management (MKM 2010), 15 pages.
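
    The following Haskell sketch shows one minimal way the movie data structure described above could look: frames pairing user input with the PA response, and a proviola that replays them. All names (Frame, Movie, replay) are illustrative assumptions, not the Proviola implementation, which is built on ProofWeb.

```haskell
-- A minimal sketch of the movie data structure; field and function
-- names are assumptions, not the Proviola implementation.

-- One frame records a user's input to the proof assistant (PA)
-- together with the PA's response to it.
data Frame = Frame
  { command  :: String  -- what the user sent, e.g. a Coq tactic
  , response :: String  -- the PA's output for that input
  } deriving Show

-- A movie is the ordered sequence of frames filmed by the camera.
type Movie = [Frame]

-- The proviola replays a movie frame by frame to a third party.
replay :: Movie -> IO ()
replay = mapM_ $ \(Frame c r) -> do
  putStrLn ("> " ++ c)
  putStrLn r
```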

    A type system for Continuation Calculus

    Continuation Calculus (CC), introduced by Geron and Geuvers, is a simple foundational model for functional computation. It is closely related to lambda calculus and term rewriting, but it has no variable binding and no pattern matching. It is Turing complete and evaluation is deterministic. Notions like "call-by-value" and "call-by-name" computation are available by choosing appropriate function definitions: e.g. there is a call-by-value and a call-by-name addition function. In the present paper we extend CC with types, to be able to define data types in a canonical way, and functions over these data types, defined by iteration. Data type definitions follow the so-called "Scott encoding" of data, as opposed to the more familiar "Church encoding". The iteration scheme comes in two flavors: a call-by-value and a call-by-name iteration scheme. The call-by-value variant is a double negation variant of call-by-name iteration. The double negation translation allows one to move between call-by-name and call-by-value. Comment: In Proceedings CL&C 2014, arXiv:1409.259
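
    For readers unfamiliar with the distinction, the sketch below renders the Scott encoding of natural numbers in Haskell (the paper works in CC itself, so this is only an analogy): a Scott-encoded datum dispatches on its own constructor and, unlike a Church numeral, carries no built-in recursion, so functions such as addition must supply their own.

```haskell
{-# LANGUAGE RankNTypes #-}
-- Scott-encoded natural numbers in Haskell (an analogy; the paper
-- works in Continuation Calculus). A Scott numeral only dispatches on
-- its constructor; recursion is not built in, unlike Church numerals.
newtype Nat = Nat { match :: forall r. r -> (Nat -> r) -> r }

zero :: Nat
zero = Nat (\z _ -> z)

suc :: Nat -> Nat
suc n = Nat (\_ s -> s n)

-- Addition must supply its own recursion, since Scott data carry none.
add :: Nat -> Nat -> Nat
add m n = match m n (\m' -> suc (add m' n))

toInt :: Nat -> Int
toInt n = match n 0 (\n' -> 1 + toInt n')

-- toInt (add (suc zero) (suc (suc zero)))  ==>  3
```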

    From coinductive proofs to exact real arithmetic: theory and applications

    Based on a new coinductive characterization of continuous functions we extract certified programs for exact real number computation from constructive proofs. The extracted programs construct and combine exact real number algorithms with respect to the binary signed digit representation of real numbers. The data type corresponding to the coinductive definition of continuous functions consists of finitely branching non-wellfounded trees describing when the algorithm writes and reads digits. We discuss several examples including the extraction of programs for polynomials up to degree two and the definite integral of continuous maps.
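
    A minimal Haskell rendering of that tree type may help: a continuous function becomes a possibly infinite tree whose nodes either write an output digit or read one input digit and branch on it. Constructor names are mine, not the paper's.

```haskell
-- A sketch of the coinductive tree type; constructor names are mine.

data SDigit = M | Z | P  -- signed binary digits -1, 0, 1
  deriving Show

-- An (infinite) stream of signed digits representing a real in [-1,1].
data Real = Cons SDigit Real

-- A continuous function as a finitely branching, non-wellfounded tree:
-- either write an output digit now, or read one input digit and branch.
data Fun
  = Write SDigit Fun
  | Read (SDigit -> Fun)

-- Running the tree against an input stream yields the output stream.
apply :: Fun -> Real -> Real
apply (Write d f) x          = Cons d (apply f x)
apply (Read g)    (Cons d x) = apply (g d) x

-- Example: negation, digit by digit (swap -1 and 1, keep 0).
negFun :: Fun
negFun = Read (\d -> Write (neg d) negFun)
  where neg M = P
        neg Z = Z
        neg P = M
```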

    Congruence types


    The logic and mathematics of occasion sentences

    The prime purpose of this paper is, first, to restore to discourse-bound occasion sentences their rightful central place in semantics and, secondly, taking these as the basic propositional elements in the logical analysis of language, to contribute to the development of an adequate logic of occasion sentences and a mathematical (Boolean) foundation for such a logic, thus preparing the ground for more adequate semantic, logical and mathematical foundations of the study of natural language. Some of the insights elaborated in this paper have appeared in the literature over the past thirty years, and a number of new developments have resulted from them. The present paper aims at providing an integrated conceptual basis for this new development in semantics. In Section 1 it is argued that the reduction by translation of occasion sentences to eternal sentences, as proposed by Russell and Quine, is semantically and thus logically inadequate. Natural language is a system of occasion sentences, eternal sentences being merely boundary cases. The logic has fewer tasks than is standardly assumed, as it excludes semantic calculi, which depend crucially on information supplied by cognition and context and thus belong to cognitive psychology rather than to logic. For sentences to express a proposition and thus be interpretable and informative, they must first be properly anchored in context. A proposition has a truth value when it is, moreover, properly keyed in the world, i.e. is about a situation in the world. Section 2 deals with the logical properties of natural language. It argues that presuppositional phenomena require trivalence and presents the trivalent logic PPC3, with two kinds of falsity and two negations. It introduces the notion of Σ-space for a sentence A (or /A/, the set of situations in which A is true) as the basis of logical model theory, and the notion of /PA/ (the Σ-space of the presuppositions of A), functioning as a 'private' subuniverse for /A/. The trivalent Kleene calculus is reinterpreted as a logical account of vagueness, rather than of presupposition. PPC3 and the Kleene calculus are refinements of standard bivalent logic and can be combined into one logical system. In Section 3 the adequacy of PPC3 as a truth-functional model of presupposition is considered more closely and given a Boolean foundation. In a noncompositional extended Boolean algebra, three operators are defined: 1a for the conjoined presuppositions of a, ã for the complement of a within 1a, and â for the complement of 1a within Boolean 1. The logical properties of this extended Boolean algebra are axiomatically defined and proved for all possible models. Proofs are provided of the consistency and the completeness of the system. Section 4 is a provisional exploration of the possibility of using the results obtained for a new discourse-dependent account of the logic of modalities in natural language. The overall result is a modified and refined logical and model-theoretic machinery, which takes into account both the discourse-dependency of natural language sentences and the necessity of selecting a key in the world before a truth value can be assigned.
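
    As a toy illustration of the trivalent setup, the sketch below encodes one truth value plus the two kinds of falsity, with one negation for each. The truth tables follow a common presuppositional reading (minimal negation keeps presupposition failure; radical negation turns it into truth) and are an assumption on my part, not copied from PPC3.

```haskell
-- Toy trivalent values: true, minimally false (assertion fails,
-- presuppositions hold) and radically false (a presupposition fails).
-- The tables below are an assumed reading, not PPC3's official ones.
data TV = T | Fmin | Frad
  deriving (Eq, Show)

-- Minimal negation: denies the assertion, preserves presuppositions.
negMin :: TV -> TV
negMin T    = Fmin
negMin Fmin = T
negMin Frad = Frad

-- Radical negation: true exactly when a presupposition fails.
negRad :: TV -> TV
negRad Frad = T
negRad _    = Fmin
```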

    A Focused Sequent Calculus Framework for Proof Search in Pure Type Systems

    Basic proof-search tactics in logic and type theory can be seen as the root-first applications of rules in an appropriate sequent calculus, preferably without the redundancies generated by permutation of rules. This paper addresses the issues of defining such sequent calculi for Pure Type Systems (PTS, which were originally presented in natural deduction style) and then organizing their rules for effective proof-search. We introduce the idea of Pure Type Sequent Calculus with meta-variables (PTSCα), by enriching the syntax of a permutation-free sequent calculus for propositional logic due to Herbelin, which is strongly related to natural deduction and already well adapted to proof-search. The operational semantics is adapted from Herbelin's and is defined by a system of local rewrite rules as in cut-elimination, using explicit substitutions. We prove confluence for this system. Restricting our attention to PTSC, a type system for the ground terms of this system, we obtain the Subject Reduction property and show that each PTSC is logically equivalent to its corresponding PTS, and the former is strongly normalising iff the latter is. We show how to make the logical rules of PTSC into a syntax-directed system PS for proof-search, by incorporating the conversion rules as in syntax-directed presentations of the PTS rules for type-checking. Finally, we consider how to use the explicitly scoped meta-variables of PTSCα to represent partial proof-terms, and use them to analyse interactive proof construction. This sets up a framework PE in which we are able to study proof-search strategies, type inhabitant enumeration and (higher-order) unification.
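
    The Herbelin-style, permutation-free syntax that the paper enriches can be sketched in Haskell as terms kept separate from argument lists (spines), plus explicitly scoped meta-variables standing for missing proof-terms. Constructor names are mine, not the paper's.

```haskell
-- A sketch of Herbelin-style permutation-free syntax; names are mine.
type Var  = String
type Meta = String  -- explicitly scoped meta-variables for open goals

data Term
  = Lam Var Term    -- lambda abstraction (right-introduction)
  | Head Var Spine  -- a head variable applied to a spine of arguments
  | MetaVar Meta    -- placeholder for a partial proof-term
  deriving Show

-- A spine lists the arguments consumed by a head variable; it plays
-- the role of the left-introduction rules of the sequent calculus.
data Spine
  = Nil             -- no further arguments
  | Cons Term Spine -- supply one argument, then continue
  deriving Show

-- Example: the proof-term for K = \x. \y. x, with "x" applied to nothing.
kTerm :: Term
kTerm = Lam "x" (Lam "y" (Head "x" Nil))
```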

    A dependent nominal type theory

    Nominal abstract syntax is an approach to representing names and binding pioneered by Gabbay and Pitts. So far nominal techniques have mostly been studied using classical logic or model theory, not type theory. Nominal extensions to simple, dependent and ML-like polymorphic languages have been studied, but decidability and normalization results have only been established for simple nominal type theories. We present an LF-style dependent type theory extended with name-abstraction types, prove soundness and decidability of beta-eta-equivalence checking, discuss adequacy and canonical forms via an example, and discuss extensions such as dependently-typed recursion and induction principles.
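
    The two core nominal ingredients named above, name swapping and name-abstraction, can be sketched in plain Haskell as follows; the paper itself works in an LF-style dependent theory, so this only conveys the flavour.

```haskell
-- Names, swapping and name-abstraction in plain Haskell; a sketch only.
type Name = Int

-- Swapping two names throughout a structure is the basic nominal
-- operation, well-behaved even under binders.
class Swap a where
  swp :: Name -> Name -> a -> a

instance Swap Name where
  swp a b c
    | c == a    = b
    | c == b    = a
    | otherwise = c

-- Name-abstraction <a>t: a name paired with a body, identified up to
-- swapping of the abstracted name (alpha-equivalence).
data Abs a = Abs Name a

instance (Swap a, Eq a) => Eq (Abs a) where
  Abs a t == Abs b u =
    (a == b && t == u) ||
    -- Simplification: the full nominal definition also requires the
    -- freshness condition "a is fresh for u"; we elide it here.
    (t == swp a b u)
```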

    Rigorous Polynomial Approximation using Taylor Models in Coq

    One of the most common and practical ways of representing a real function on machines is by using a polynomial approximation. It is then important to properly handle the error introduced by such an approximation. The purpose of this work is to offer guaranteed error bounds for a specific kind of rigorous polynomial approximation called Taylor model. We carry out this work in the Coq proof assistant, with a special focus on genericity and efficiency for our implementation. We give an abstract interface for rigorous polynomial approximations, parameterized by the type of coefficients and the implementation of polynomials, and we instantiate this interface to the case of Taylor models with interval coefficients, while providing all the machinery for computing them. We compare the performances of our implementation in Coq with those of the Sollya tool, which contains an implementation of Taylor models written in C. This is a milestone in our long-term goal of providing fully formally proved and efficient Taylor models.
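
    The abstract interface described above can be transliterated into a Haskell sketch: a rigorous polynomial approximation is a polynomial over some coefficient type together with an interval bounding the approximation error. The record and function names are mine, not those of the Coq development (which, as stated, uses interval coefficients inside the polynomial as well).

```haskell
-- A sketch of a rigorous polynomial approximation: polynomial + error
-- bound; names are illustrative, not the Coq interface's.
data Interval a = Interval { lo :: a, hi :: a }
  deriving Show

-- A Taylor model over a domain: for every x in the domain, the value
-- f(x) is guaranteed to lie in polynomial(x) + remainder.
data TaylorModel a = TaylorModel
  { domain    :: Interval a
  , coeffs    :: [a]         -- polynomial in the monomial basis
  , remainder :: Interval a  -- rigorous bound on the approximation error
  } deriving Show

-- Evaluate the polynomial part by Horner's rule.
evalPoly :: Num a => [a] -> a -> a
evalPoly cs x = foldr (\c acc -> c + x * acc) 0 cs

-- An enclosure of f(x): the polynomial value shifted by the remainder.
enclose :: Num a => TaylorModel a -> a -> Interval a
enclose tm x = Interval (p + lo (remainder tm)) (p + hi (remainder tm))
  where p = evalPoly (coeffs tm) x
```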

    Comprehending Isabelle/HOL's consistency

    The proof assistant Isabelle/HOL is based on an extension of Higher-Order Logic (HOL) with ad hoc overloading of constants. It turns out that the interaction between the standard HOL type definitions and the Isabelle-specific ad hoc overloading is problematic for logical consistency. In previous work, we have argued that standard HOL semantics is no longer appropriate for capturing this interaction, and have proved consistency using a nonstandard semantics. The use of an exotic semantics makes that proof hard for the community to digest. In this paper, we prove consistency by proof-theoretic means, following the healthy intuition of definitions as abbreviations, realized in HOLC, a logic that augments HOL with comprehension types. We hope that our new proof settles the Isabelle/HOL consistency problem once and for all. In addition, HOLC offers a framework for justifying the consistency of new deduction schemas that address practical user needs.

    Verifiable certificates for predicate subtyping

    Adding predicate subtyping to higher-order logic yields a very expressive language in which type-checking is undecidable, making the definition of a system of verifiable certificates challenging. This work presents a solution to this issue with a minimal formalization of predicate subtyping, named PVS-Core, together with a system of verifiable certificates for PVS-Core, named PVS-Cert. PVS-Cert is based on the introduction of proof terms and explicit coercions. Its design is similar to that of PTSs with dependent pairs, with the exception of the definition of conversion, which is based on a specific notion of reduction →β*, corresponding to β-reduction combined with the erasure of coercions. The use of this reduction instead of the more standard reduction →βσ allows one to establish a simple correspondence between PVS-Core and PVS-Cert. On the other hand, a type-checking algorithm is designed for PVS-Cert, built on proofs of type preservation of →βσ and strong normalization of both →βσ and →β*. Using these results, PVS-Cert judgements are used as verifiable certificates for predicate subtyping. In addition, the reduction →βσ is used to define a cut elimination procedure adapted to predicate subtyping. Its use to study the properties of predicate subtyping is illustrated with a proof of consistency.
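
    The coercion-erasure idea behind →β* can be sketched concretely: terms carry explicit coercions (in PVS-Cert, projections from dependent pairs), and erasing them recovers the underlying PVS-Core view. The grammar below is an illustrative simplification, not the actual PVS-Cert syntax.

```haskell
-- Illustrative term grammar with explicit coercions; not PVS-Cert's.
data Term
  = Var String
  | Lam String Term
  | App Term Term
  | Pair Term Term   -- a value paired with a proof that it satisfies
                     -- the predicate of a predicate subtype
  | Coerce Term      -- explicit coercion out of a predicate subtype
  deriving Show

-- Erasure: dropping coercions and proof components maps a PVS-Cert-like
-- certificate back to the underlying PVS-Core-like term, which is the
-- idea behind the reduction combining beta-steps with coercion erasure.
erase :: Term -> Term
erase (Var x)    = Var x
erase (Lam x t)  = Lam x (erase t)
erase (App t u)  = App (erase t) (erase u)
erase (Pair t _) = erase t   -- keep the value, drop the proof
erase (Coerce t) = erase t   -- coercions vanish under erasure
```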