421 research outputs found
The continuity of monadic stream functions
Brouwer’s continuity principle states that all functions from infinite sequences of naturals to naturals are continuous, that is, for every sequence the result depends only on a finite initial segment. It is an intuitionistic axiom that is incompatible with classical mathematics. Recently Martín Escardó proved that it is also inconsistent in type theory. We propose a reformulation of the continuity principle that may be more faithful to Brouwer's original intention. It applies to monadic streams: potentially unending sequences of values produced by steps triggered by a monadic action, possibly involving side effects. We consider functions on them that are uniform, in the sense that they operate in the same way independently of the particular monad that provides the specific side effects. Formally this is done by requiring a form of naturality in the monad. Functions on monadic streams have not only a foundational importance, but also practical applications in signal processing and reactive programming. We give algorithms to determine the modulus of continuity of monadic stream functions and to generate dialogue trees for them (trees whose nodes and branches describe the interaction of the process with the environment).
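To make the notion concrete, here is a minimal Haskell sketch of monadic streams, assuming illustrative names (MonStr, takeM, count) that are not taken from the paper: a monadic stream over a monad m is an m-action producing a head together with the rest of the stream, and a function on monadic streams is uniform when it is defined polymorphically, hence naturally, in m.

```haskell
-- A monadic stream: each step is a monadic action producing a head and a tail.
newtype MonStr m a = MonStr { unMonStr :: m (a, MonStr m a) }

-- Read the first n elements, running the side effects of the first n steps.
takeM :: Monad m => Int -> MonStr m a -> m [a]
takeM 0 _          = return []
takeM n (MonStr s) = do
  (x, rest) <- s
  xs <- takeM (n - 1) rest
  return (x : xs)

-- Example: a stream that counts upwards while logging each step in IO.
count :: Int -> MonStr IO Int
count n = MonStr $ do
  putStrLn ("producing " ++ show n)
  return (n, count (n + 1))
```

For instance, takeM 3 (count 0) prints three log lines and returns [0,1,2]; takeM itself, being polymorphic in the monad, is an example of a uniform monadic stream function.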
General Recursion via Coinductive Types
A fertile field of research in theoretical computer science investigates the
representation of general recursive functions in intensional type theories.
Among the most successful approaches are: the use of wellfounded relations,
implementation of operational semantics, formalization of domain theory, and
inductive definition of domain predicates. Here, a different solution is
proposed: exploiting coinductive types to model infinite computations. To every
type A we associate a type of partial elements Partial(A), coinductively
generated by two constructors: the first, return(a) just returns an element
a:A; the second, step(x), adds a computation step to a recursive element
x:Partial(A). We show how this simple device is sufficient to formalize all
recursive functions between two given types. It allows the definition of fixed
points of finitary, that is, continuous, operators. We compare this
approach to others from the literature. Finally, we mention that the
formalization, with appropriate structural maps, defines a strong monad. Comment: 28 pages
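As a rough illustration of the device, here is a Haskell rendering (a sketch only: the paper works in an intensional type theory where Partial(A) is a coinductive type, and the combinator names fixP and runFor are illustrative, not the paper's):

```haskell
-- The type of partial elements: either an immediate result,
-- or one more computation step in front of a partial element.
data Partial a = Return a | Step (Partial a)

-- A fixed-point combinator for possibly non-terminating definitions:
-- every recursive call is guarded by a Step constructor.
fixP :: ((a -> Partial b) -> a -> Partial b) -> a -> Partial b
fixP f = f (\x -> Step (fixP f x))

-- Run a partial element for at most n steps.
runFor :: Int -> Partial a -> Maybe a
runFor _ (Return a) = Just a
runFor 0 (Step _)   = Nothing
runFor n (Step p)   = runFor (n - 1) p

-- Example: unbounded search for the least n satisfying a predicate,
-- a typical general recursive (possibly diverging) definition.
search :: (Int -> Bool) -> Partial Int
search p = fixP (\rec n -> if p n then Return n else rec (n + 1)) 0
```

The Step constructors make every unfolding of the recursion productive, which is what allows general recursive functions to be accepted as coinductive definitions.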
An Improved Implementation and Abstract Interface for Hybrid
Hybrid is a formal theory implemented in Isabelle/HOL that provides an
interface for representing and reasoning about object languages using
higher-order abstract syntax (HOAS). This interface is built around an HOAS
variable-binding operator that is constructed definitionally from a de Bruijn
index representation. In this paper we make a variety of improvements to
Hybrid, culminating in an abstract interface that on one hand makes Hybrid a
more mathematically satisfactory theory, and on the other hand has important
practical benefits. We start with a modification of Hybrid's type of terms that
better hides its implementation in terms of de Bruijn indices, by excluding at
the type level terms with dangling indices. We present an improved set of
definitions, and a series of new lemmas that provide a complete
characterization of Hybrid's primitives in terms of properties stated at the
HOAS level. Benefits of this new package include a new proof of adequacy and
improvements to reasoning about object logics. Such proofs are carried out at
the higher level with no involvement of the lower-level de Bruijn syntax. Comment: In Proceedings LFMTP 2011, arXiv:1110.668
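The central idea, an HOAS-style binding operator built definitionally on top of de Bruijn indices, can be sketched in Haskell roughly as follows (this only illustrates the general technique; Hybrid itself is an Isabelle/HOL theory and its actual definitions differ):

```haskell
-- Plain de Bruijn terms: the low-level representation.
data Term = Var Int | App Term Term | Abs Term
  deriving Show

-- A higher-level term is computed relative to the current binding depth.
newtype HTerm = HTerm { atDepth :: Int -> Term }

-- HOAS-style binder: the bound variable is an ordinary Haskell function
-- argument, but the result is a de Bruijn abstraction.
lam :: (HTerm -> HTerm) -> HTerm
lam f = HTerm $ \d ->
  let v = HTerm (\d' -> Var (d' - d - 1))   -- the variable bound at this depth
  in Abs (atDepth (f v) (d + 1))

app :: HTerm -> HTerm -> HTerm
app s t = HTerm $ \d -> App (atDepth s d) (atDepth t d)

-- Convert a closed higher-level term to its de Bruijn form.
toDeBruijn :: HTerm -> Term
toDeBruijn t = atDepth t 0

-- Example: the K combinator written with binders; it elaborates to Abs (Abs (Var 1)).
k :: Term
k = toDeBruijn (lam (\x -> lam (\_ -> x)))
```

Excluding terms with dangling indices at the type level corresponds, in this toy version, to only ever exposing closed HTerm values built from lam and app.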
From coinductive proofs to exact real arithmetic: theory and applications
Based on a new coinductive characterization of continuous functions we
extract certified programs for exact real number computation from constructive
proofs. The extracted programs construct and combine exact real number
algorithms with respect to the binary signed digit representation of real
numbers. The data type corresponding to the coinductive definition of
continuous functions consists of finitely branching non-wellfounded trees
describing when the algorithm writes and reads digits. We discuss several
examples including the extraction of programs for polynomials up to degree two
and the definite integral of continuous maps.
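The shape of that data type can be conveyed by a small Haskell sketch (the names SDTrans, run and negate' are illustrative assumptions, not the extracted code): a function on signed-digit streams is a non-wellfounded tree whose nodes either write an output digit or read an input digit and branch finitely on it.

```haskell
-- Binary signed digits: -1, 0, +1.
data Digit = M | Z | P
  deriving (Eq, Show)

-- A non-wellfounded tree describing when the algorithm writes and reads digits:
-- Write emits one output digit, Read branches on the next input digit.
data SDTrans = Write Digit SDTrans | Read (Digit -> SDTrans)

-- Run such a tree against an input digit stream, producing the output stream.
run :: SDTrans -> [Digit] -> [Digit]
run (Write d t) ds       = d : run t ds
run (Read f)    (d : ds) = run (f d) ds
run (Read _)    []       = []   -- input streams are meant to be infinite

-- Example: negation on [-1,1] flips every digit of the signed-digit representation.
negate' :: SDTrans
negate' = Read (\d -> Write (flipD d) negate')
  where
    flipD M = P
    flipD Z = Z
    flipD P = M
```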
Partiality, revisited: the partiality monad as a quotient inductive-inductive type
Capretta's delay monad can be used to model partial computations, but it has the "wrong" notion of built-in equality, strong bisimilarity. An alternative is to quotient the delay monad by the "right" notion of equality, weak bisimilarity. However, recent work by Chapman et al. suggests that it is impossible to define a monad structure on the resulting construction in common forms of type theory without assuming (instances of) the axiom of countable choice. Using an idea from homotopy type theory - a higher inductive-inductive type - we construct a partiality monad without relying on countable choice. We prove that, in the presence of countable choice, our partiality monad is equivalent to the delay monad quotiented by weak bisimilarity. Furthermore, we outline several applications.
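Haskell cannot express the quotient itself (the higher inductive-inductive construction needs a proof assistant such as Agda or Coq with the relevant extensions), but the two notions of equality being contrasted can be sketched with fuel-bounded approximations; the following code is illustrative, not from the paper.

```haskell
-- Capretta's delay monad (here without its monad instance).
data Delay a = Now a | Later (Delay a)

-- Strong bisimilarity, approximated with fuel: values must agree after
-- exactly matching numbers of Later steps.
strongEq :: Eq a => Int -> Delay a -> Delay a -> Bool
strongEq _ (Now x)   (Now y)   = x == y
strongEq n (Later p) (Later q) = n > 0 && strongEq (n - 1) p q
strongEq _ _         _         = False

-- Weak bisimilarity, approximated with fuel: Later steps on either side may
-- be skipped, so computations differing only in running time are identified.
weakEq :: Eq a => Int -> Delay a -> Delay a -> Bool
weakEq _ (Now x)   (Now y)   = x == y
weakEq n (Later p) q         = n > 0 && weakEq (n - 1) p q
weakEq n p         (Later q) = n > 0 && weakEq (n - 1) p q
```

For example, weakEq 10 (Later (Now 3)) (Now 3) is True while strongEq 10 (Later (Now 3)) (Now 3) is False, which is why quotienting by weak bisimilarity is the natural way to obtain a partiality monad.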
Highway Robbery: Due Process, Equal Protection, and Punishing Poverty with Driver’s License Suspensions
Termination Casts: A Flexible Approach to Termination with General Recursion
This paper proposes a type-and-effect system called Teqt, which distinguishes
terminating terms and total functions from possibly diverging terms and partial
functions, for a lambda calculus with general recursion and equality types. The
central idea is to include a primitive type-form "Terminates t", expressing
that term t is terminating; and then allow terms t to be coerced from possibly
diverging to total, using a proof of Terminates t. We call such coercions
termination casts, and show how to implement terminating recursion using them.
For the meta-theory of the system, we describe a translation from Teqt to a
logical theory of termination for general recursive, simply typed functions.
Every typing judgment of Teqt is translated to a theorem expressing the
appropriate termination property of the computational part of the Teqt term. Comment: In Proceedings PAR 2010, arXiv:1012.455
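The flavour of a termination cast can be loosely approximated in Haskell, where, lacking dependent types, the "proof" degenerates into a runtime step bound; this is only an analogy to Teqt's type-level Terminates t, and all names below are illustrative.

```haskell
-- Possibly diverging computations, one Later constructor per evaluation step.
data Comp a = Done a | Later (Comp a)

-- In Teqt, "Terminates t" is a type; here it is merely runtime evidence in the
-- form of an upper bound on the number of steps.
newtype Terminates = Steps Int

-- The "termination cast": given the evidence, extract a total result.
-- Nothing signals that the supplied evidence was insufficient.
cast :: Terminates -> Comp a -> Maybe a
cast _         (Done a)  = Just a
cast (Steps 0) (Later _) = Nothing
cast (Steps n) (Later c) = cast (Steps (n - 1)) c

-- A general recursive definition producing a Comp; this one always terminates,
-- so cast (Steps n) (countdown n) always succeeds.
countdown :: Int -> Comp Int
countdown 0 = Done 0
countdown n = Later (countdown (n - 1))
```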
A digital twin-based optimization model to improve production control decisions
In recent years, the digitalization of manufacturing processes has represented a crucial transformation for industry, perceived as an opportunity to achieve higher levels of productivity. In this context, Cyber-Physical Systems (CPSs) represent a key transformative concept and a pillar of the Industry 4.0 paradigm. Indeed, the virtual connection and networking of CPSs can open the way to real-time monitoring and synchronization of shop-floor activities with the virtual space. In their cyber part, CPSs can run high-level simulations that replicate the behavior of the entire production system. This aspect is well aligned with the definition of the Digital Twin (DT) in manufacturing. The role of the DT within Industry 4.0-based manufacturing systems is to forecast and optimize the behavior of the production system at each life-cycle phase. Despite the examples in the existing scientific literature of the potential offered by the DT, an open research challenge concerns how to support the decision-making process of production control.
Therefore, the present thesis proposes a DT-based optimization framework that develops a predictive strategy for production control by exploiting the real-time capabilities of the DT. The control protocol considered combines centralized control of order release with decentralized control of operations on the shop floor. The optimization strategy aims to reduce the WIP limits of each workstation of the production system while improving the overall system throughput. The framework has been tested on a general flow shop to demonstrate the benefit of dynamically adjusting the WIP limits during operations compared with their static optimal setting. The experimental comparison shows that dynamic adjustment of the WIP limits, enabled by the DT-based optimization, achieves higher throughput, and also reaches lower WIP while granting similar throughput.
A coalgebraic view of bar recursion and bar induction
We reformulate the bar recursion and induction principles in terms of recursive and wellfounded coalgebras. Bar induction was originally proposed by Brouwer as an axiom to recover certain classically valid theorems in a constructive setting. It is a form of induction on non-wellfounded trees satisfying certain properties. Bar recursion, introduced later by Spector, is the corresponding function definition principle.
We give a generalization of these principles, by introducing the notion of barred coalgebra: a process with a branching behaviour given by a functor, such that all possible computations terminate.
Coalgebraic bar recursion is the statement that every barred coalgebra is recursive; a recursive coalgebra is one that allows definition of functions by a coalgebra-to-algebra morphism. It is a framework to characterize valid forms of recursion for terminating functional programs. One application of the principle is the tabulation of continuous functions: Ghani, Hancock and Pattinson defined a type of wellfounded trees that represent continuous functions on streams. Bar recursion allows us to prove that every stably continuous function can be tabulated to such a tree, where by stability we mean that the modulus of continuity is itself continuous.
Coalgebraic bar induction states that every barred coalgebra is wellfounded; a wellfounded coalgebra is one that admits proof by induction.
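The tabulation result mentioned above can be illustrated by a small Haskell sketch of Ghani-Hancock-Pattinson style trees in the simplest setting (illustrative names; the paper works with the coalgebraic formulation):

```haskell
-- A wellfounded tree tabulating a continuous function from streams of a to b:
-- either the answer is already determined, or one more stream element is read.
data Tab a b = Ret b | Ask (a -> Tab a b)

-- Applying a tabulated function to a stream inspects only finitely many elements.
apply :: Tab a b -> [a] -> b
apply (Ret b) _        = b
apply (Ask f) (x : xs) = apply (f x) xs
apply (Ask _) []       = error "streams are assumed to be infinite"

-- Example: the continuous function returning the sum of the first two stream elements.
sumTwo :: Tab Int Int
sumTwo = Ask (\x -> Ask (\y -> Ret (x + y)))
```

Coalgebraic bar recursion is what justifies building such a tree from a stably continuous function in the first place.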
Step-Indexed Normalization for a Language with General Recursion
The Trellys project has produced several designs for practical dependently
typed languages. These languages are broken into two fragments: a logical fragment where every term normalizes and which is consistent when interpreted as a logic, and a programmatic fragment with
general recursion and other convenient but unsound features. In this paper, we
present a small example language in this style. Our design allows the
programmer to explicitly mention and pass information between the two
fragments. We show that this feature substantially complicates the metatheory
and present a new technique, combining the traditional Girard-Tait method with
step-indexed logical relations, which we use to show normalization for the
logical fragment. Comment: In Proceedings MSFP 2012, arXiv:1202.240
- …
