770 research outputs found
A robust semantics hides fewer errors
In this paper we explore how formal models are interpreted: to what degree meaning is captured in the formal semantics, and to what degree it remains in the informal interpretation of that semantics. By applying a robust approach to the definition of refinement and semantics, favoured by the event-based community, to state-based theory, we are able to move some aspects from the informal interpretation into the formal semantics.
Report on the first round of the Mock LISA Data Challenges
The Mock LISA Data Challenges (MLDCs) have the dual purpose of fostering the
development of LISA data analysis tools and capabilities, and demonstrating the
technical readiness already achieved by the gravitational-wave community in
distilling a rich science payoff from the LISA data output. The first round of
MLDCs has just been completed: nine data sets containing simulated
gravitational wave signals produced either by galactic binaries or massive
black hole binaries embedded in simulated LISA instrumental noise were released
in June 2006 with deadline for submission of results at the beginning of
December 2006. Ten groups have participated in this first round of challenges.
Here we describe the challenges, summarise the results, and provide a first
critical assessment of the entries. (Proceedings report from GWDAW 11.)
Pointfree factorization of operation refinement
The standard operation refinement ordering is a kind of “meet of opposites”: non-determinism reduction suggests “smaller” behaviour, while increase of definition suggests “larger” behaviour. Groves’ factorization of this ordering into two simpler relations, one per refinement concern, makes it more mathematically tractable, but is far from fully exploited in the literature. We present a pointfree theory for this factorization which is more agile and calculational than the standard set-theoretic approach. In particular, we show that factorization leads to a simple proof of structural refinement for arbitrary parametric types, and exploit factor instantiation across different subclasses of (relational) operation. The prospect of generalizing the factorization to coalgebraic refinement is discussed. (Fundação para a Ciência e a Tecnologia (FCT) - PURE Project (Program Understanding and Re-engineering: Calculi and Applications), contract POSI/ICHS/44304/2002.)
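The factorization described above can be sketched relationally, modelling an operation as a set of (input, output) pairs. The function names and the construction of the midpoint are our illustration, not the paper's pointfree notation:

```python
# Operation refinement and its Groves-style factorization, with
# operations modelled as finite relations (sets of (input, output) pairs).

def dom(r):
    """Domain of a relation: the inputs on which it is defined."""
    return {x for (x, _) in r}

def refines(q, p):
    """q refines p: q is defined wherever p is, and on p's domain every
    output of q is an output of p (non-determinism reduction)."""
    return dom(p) <= dom(q) and all(
        (x, y) in p for (x, y) in q if x in dom(p)
    )

def reduces_nondet(q, p):
    """First factor: same domain, fewer choices (q is a subrelation of p)."""
    return dom(q) == dom(p) and q <= p

def extends_def(q, p):
    """Second factor: q agrees with p on p's domain and may be defined
    on more inputs (increase of definition)."""
    return dom(p) <= dom(q) and {(x, y) for (x, y) in q if x in dom(p)} == p

# The refinement ordering factors through a midpoint m: extend p's
# definition with q's new inputs, then reduce non-determinism to q.
p = {("a", 1), ("a", 2)}            # nondeterministic on "a", undefined on "b"
q = {("a", 1), ("b", 3)}            # resolves the choice and adds "b"
m = p | {(x, y) for (x, y) in q if x not in dom(p)}
assert refines(q, p)
assert extends_def(m, p) and reduces_nondet(q, m)
```

The midpoint construction makes the “meet of opposites” concrete: one factor only shrinks behaviour, the other only grows definedness.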
A specification patterns system for discrete event systems analysis
As formal verification tools gain popularity, the problem arises of making them more accessible to engineers. A correct understanding of the logics used to express properties of a system's behavior is needed in order to guarantee that properties correctly encode the intent of the verification process. Writing appropriate properties, in a logic suitable for verification, is a skillful process. Errors in this step of the process can create serious problems since a false sense of safety is gained with the analysis. However, when compared to the effort put into developing and applying modeling languages, little attention has been devoted to the process of writing properties that accurately capture verification requirements. In this paper we illustrate how a collection of property patterns can help in simplifying the process of generating logical formulae from informally expressed requirements
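A property-pattern catalogue of the kind discussed above can be sketched as a small library of LTL templates. The pattern names follow the common Dwyer-style catalogue; the template strings and the `instantiate` helper are our illustrative assumptions, not the paper's system:

```python
# A minimal specification-pattern catalogue: each entry maps an informal
# requirement shape to an LTL template with placeholder propositions.
LTL_PATTERNS = {
    "absence": "G(!{p})",           # p never holds
    "universality": "G({p})",       # p always holds
    "response": "G({p} -> F({q}))", # every p is eventually followed by q
    "precedence": "!{q} W {p}",     # q cannot occur before p (weak until)
}

def instantiate(pattern: str, **props: str) -> str:
    """Fill a pattern template with concrete atomic propositions."""
    return LTL_PATTERNS[pattern].format(**props)

print(instantiate("response", p="request", q="grant"))
# G(request -> F(grant))
```

An engineer states “every request is eventually granted” and selects the *response* pattern; the formula is generated mechanically, avoiding the hand-written-logic errors the abstract warns about.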
A theory of normed simulations
In existing simulation proof techniques, a single step in a lower-level
specification may be simulated by an extended execution fragment in a
higher-level one. As a result, it is cumbersome to mechanize these techniques
using general purpose theorem provers. Moreover, it is undecidable whether a
given relation is a simulation, even if tautology checking is decidable for the
underlying specification logic. This paper introduces various types of normed
simulations. In a normed simulation, each step in a lower-level specification
can be simulated by at most one step in the higher-level one, for any related
pair of states. In earlier work we demonstrated that normed simulations are
quite useful as a vehicle for the formalization of refinement proofs via
theorem provers. Here we show that normed simulations also have pleasant
theoretical properties: (1) under some reasonable assumptions, it is decidable
whether a given relation is a normed forward simulation, provided tautology
checking is decidable for the underlying logic; (2) at the semantic level,
normed forward and backward simulations together form a complete proof method
for establishing behavior inclusion, provided that the higher-level
specification has finite invisible nondeterminism.
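For finite systems, the normed forward simulation condition can be checked directly, which is the decidability point made above. The encoding below (transition relations as sets of triples, `TAU` for internal steps) is our reading of the definition, not the paper's formalisation:

```python
# Normed forward simulation check for finite transition systems: every
# low-level step from a related pair is matched by at most one high-level
# step, or stutters on an internal step with a strictly decreasing norm
# (which rules out matching a step by infinite stuttering).
TAU = "tau"

def is_normed_forward_sim(R, low_steps, high_steps, norm):
    for (s, u) in R:
        for (s0, a, s1) in low_steps:
            if s0 != s:
                continue
            # Option 1: match with exactly one high-level step.
            one_step = any(
                v0 == u and lab == a and (s1, u1) in R
                for (v0, lab, u1) in high_steps
            )
            # Option 2: stutter on an internal step; norm must decrease.
            stutter = a == TAU and (s1, u) in R and norm(s1, u) < norm(s, u)
            if not (one_step or stutter):
                return False
    return True

# low:  x --tau--> y --go--> z      high:  p --go--> q
low = {("x", TAU, "y"), ("y", "go", "z")}
high = {("p", "go", "q")}
R = {("x", "p"), ("y", "p"), ("z", "q")}
print(is_normed_forward_sim(R, low, high, lambda s, u: 1 if s == "x" else 0))
# True
```

With the constant norm `lambda s, u: 0` the same relation fails the check, since the internal step from `x` can no longer be justified by a decreasing norm.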
Oxygen dependency of mitochondrial metabolism indicates outcome of newborn brain injury
There is a need for a method of real-time assessment of brain metabolism during neonatal hypoxic-ischaemic encephalopathy (HIE). We have used broadband near-infrared spectroscopy (NIRS) to monitor cerebral oxygenation and metabolic changes in 50 neonates with HIE undergoing therapeutic hypothermia treatment. In 24 neonates, 54 episodes of spontaneous decreases in peripheral oxygen saturation (desaturations) were recorded between 6 and 81 h after birth. We observed differences in the cerebral metabolic responses to these episodes that were related to the predicted outcome of the injury, as determined by subsequent magnetic resonance spectroscopy derived lactate/N-acetyl-aspartate. We demonstrated that a strong relationship between cerebral metabolism (broadband NIRS-measured cytochrome-c-oxidase (CCO)) and cerebral oxygenation was associated with unfavourable outcome; this is likely to be due to a lower cerebral metabolic rate and mitochondrial dysfunction in severe encephalopathy. Specifically, a decrease in the brain tissue oxidation state of CCO greater than 0.06 µM per 1 µM brain haemoglobin oxygenation drop was able to predict the outcome with 64% sensitivity and 79% specificity (receiver operating characteristic area under the curve = 0.73). With further work on the implementation of this methodology, broadband NIRS has the potential to provide an early, cotside, non-invasive, clinically relevant metabolic marker of perinatal hypoxic-ischaemic injury
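The reported threshold is a slope: a drop in oxidised CCO of more than 0.06 µM per 1 µM drop in brain haemoglobin oxygenation flags an unfavourable outcome. A minimal sketch of such a classifier, assuming paired per-episode change series and a least-squares fit (both our assumptions, not the study's analysis pipeline):

```python
# Slope of oxCCO change against haemoglobin-oxygenation change over a
# desaturation episode, compared with the 0.06 uM/uM threshold reported
# in the abstract. Variable names are illustrative.

def cco_hb_slope(d_hbd, d_oxcco):
    """Least-squares slope of oxCCO change vs haemoglobin change (uM/uM)."""
    n = len(d_hbd)
    mx, my = sum(d_hbd) / n, sum(d_oxcco) / n
    num = sum((x - mx) * (y - my) for x, y in zip(d_hbd, d_oxcco))
    den = sum((x - mx) ** 2 for x in d_hbd)
    return num / den

def unfavourable(d_hbd, d_oxcco, threshold=0.06):
    """Strong CCO/oxygenation coupling (slope > threshold) flags severity."""
    return cco_hb_slope(d_hbd, d_oxcco) > threshold

# Episode where oxCCO falls 0.1 uM per 1 uM haemoglobin drop: flagged.
print(unfavourable([-1.0, -2.0, -3.0], [-0.1, -0.2, -0.3]))
# True
```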
Formalising the Continuous/Discrete Modeling Step
Formally capturing the transition from a continuous model to a discrete model
is investigated using model based refinement techniques. A very simple model
for stopping (e.g. of a train) is developed in both the continuous and discrete
domains. The difference between the two is quantified using generic results
from ODE theory, and these estimates can be compared with the exact solutions.
Such results do not fit well into a conventional model based refinement
framework; however they can be accommodated into a model based retrenchment.
The retrenchment is described, and the way it can interface to refinement
development on both the continuous and discrete sides is outlined. The approach
is compared to what can be achieved using hybrid systems techniques. (In Proceedings Refine 2011, arXiv:1106.348.)
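The continuous/discrete gap that such a development must account for can be seen in a toy braking model: an exact ODE solution against a forward-Euler discretisation. The exponential-braking law and all parameters are our illustration, not the paper's train model:

```python
# Continuous model v' = -k*v (exponential braking) vs its forward-Euler
# discretisation, with the model-to-model gap measured explicitly; it is
# this kind of quantified deviation that refinement cannot absorb but
# retrenchment can record.
import math

def exact_v(v0, k, t):
    """Continuous model: v(t) = v0 * exp(-k*t)."""
    return v0 * math.exp(-k * t)

def euler_v(v0, k, dt, steps):
    """Discrete model: v[n+1] = v[n] - k*v[n]*dt (forward Euler)."""
    v = v0
    for _ in range(steps):
        v -= k * v * dt
    return v

v0, k, dt, steps = 30.0, 0.5, 0.1, 50      # 5 s of simulated braking
gap = abs(exact_v(v0, k, dt * steps) - euler_v(v0, k, dt, steps))
# gap is the first-order discretisation error; it shrinks roughly
# linearly as dt is refined.
```

Standard ODE estimates bound this gap in terms of the step size, and it is exactly such a bound that the retrenchment's concession clause would carry.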
Taxing the Informal Economy: The Current State of Knowledge and Agendas for Future Research
This paper reviews the literature on taxation of the informal economy, taking stock of key debates
and drawing attention to recent innovations. Conventionally, the debate on whether to tax has frequently focused
on the limited revenue potential, high cost of collection, and potentially adverse impact on small firms. Recent
arguments have increasingly emphasised the more indirect benefits of informal taxation in relation to economic
growth, broader tax compliance, and governance. More research is needed, we argue, into the relevant costs and
benefits for all, including quasi-voluntary compliance, political and administrative incentives for reform, and
citizen-state bargaining over taxation
