Correct and Efficient Antichain Algorithms for Refinement Checking
The notion of refinement plays an important role in software engineering. It
is the basis of a stepwise development methodology in which the correctness of
a system can be established by proving, or computing, that a system refines its
specification. Wang et al. describe algorithms based on antichains for
efficiently deciding trace refinement, stable failures refinement and
failures-divergences refinement. We identify several issues pertaining to the
soundness and performance in these algorithms and propose new, correct,
antichain-based algorithms. Using a number of experiments we show that our
algorithms outperform the original ones in terms of running time and memory
usage. Furthermore, we show that additional run time improvements can be
obtained by applying divergence-preserving branching bisimulation minimisation.
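The antichain idea behind such refinement checkers can be illustrated with a small sketch for trace refinement. The function name and the dictionary encoding of labelled transition systems below are illustrative choices, not taken from the paper: a pair (impl state, set of spec states) is only explored if no pair with the same impl state and a subset of spec states is already known, since the larger spec set can enable strictly more behaviour.

```python
from collections import deque

def refines(impl, spec, impl_init, spec_init):
    """Check trace refinement: traces(impl) must be included in traces(spec).

    impl, spec: dict mapping state -> dict mapping action -> set of successors.
    Returns True iff every trace of impl is also a trace of spec.
    Antichain pruning: a pair (s, S) is subsumed by (s, S') with S' <= S,
    so subsumed pairs need not be explored.
    """
    def spec_succ(S, a):
        # all spec states reachable from some state in S via action a
        return frozenset(t for q in S for t in spec.get(q, {}).get(a, ()))

    antichain = {}          # impl state -> list of minimal spec-state sets
    work = deque()

    def add(s, S):
        mins = antichain.setdefault(s, [])
        if any(T <= S for T in mins):                # subsumed: skip
            return
        mins[:] = [T for T in mins if not S <= T]    # drop now-subsumed pairs
        mins.append(S)
        work.append((s, S))

    add(impl_init, frozenset([spec_init]))
    while work:
        s, S = work.popleft()
        for a, succs in impl.get(s, {}).items():
            T = spec_succ(S, a)
            if not T:           # impl can do a, no spec state can: no refinement
                return False
            for s2 in succs:
                add(s2, T)
    return True
```

This is the plain trace-refinement variant; the stable-failures and failures-divergences checks discussed above additionally compare refusal and divergence information at each explored pair.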
Adaptive Non-Linear Pattern Matching Automata
Efficient pattern matching is fundamental for practical term rewrite engines. By preprocessing the given patterns into a finite deterministic automaton, the matching patterns can be decided in a single traversal of the relevant parts of the input term. Most automaton-based techniques are restricted to linear patterns, where each variable occurs at most once, and require an additional post-processing step to check so-called variable consistency. However, we can show that interleaving the variable consistency and pattern matching phases can reduce the number of required steps to find all matches. Therefore, we take the existing adaptive pattern matching automata as introduced by Sekar et al. and extend these with consistency checks. We prove that the resulting deterministic pattern matching automaton is correct, and show that its evaluation depth can be shorter than that of two-phase approaches.
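The interleaving of consistency checking with matching can be sketched with a naive recursive matcher (not the deterministic automaton construction of the paper; the term encoding below is an illustrative assumption): a repeated variable is compared against its earlier binding the moment it is reached, so an inconsistency aborts the traversal early instead of being detected in a separate post-processing pass.

```python
def match(pattern, term, binding=None):
    """Match a term against a possibly non-linear pattern.

    Terms and patterns are tuples ('f', arg1, ...) for function symbols
    and strings for variables.  Returns the substitution (a dict) on
    success, None on failure.  The variable-consistency check is
    interleaved with matching rather than done afterwards.
    """
    if binding is None:
        binding = {}
    if isinstance(pattern, str):            # variable position
        if pattern in binding:              # interleaved consistency check
            return binding if binding[pattern] == term else None
        binding[pattern] = term
        return binding
    if not isinstance(term, tuple) or term[0] != pattern[0] \
            or len(term) != len(pattern):
        return None                         # head symbols must agree
    for p, t in zip(pattern[1:], term[1:]):
        if match(p, t, binding) is None:
            return None
    return binding
```

For the non-linear pattern f(x, x), matching f(a, b) fails as soon as the second argument is inspected, without first building a full match candidate.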
Decomposing monolithic processes in a process algebra with multi-actions
A monolithic process is a single recursive equation with data parameters, which only uses non-determinism, action prefixing, and recursion. We present a technique that decomposes such a monolithic process into multiple processes where each process defines behaviour for a subset of the parameters of the monolithic process. For this decomposition we can show that a composition of these processes is strongly bisimilar to the monolithic process under a suitable synchronisation context. Minimising the resulting processes before determining their composition can be used to derive a state space that is smaller than the one obtained by a monolithic exploration. We apply the decomposition technique to several specifications to show that this works in practice. Finally, we prove that state invariants can be used to further improve the effectiveness of this decomposition technique.
Decompositional Branching Bisimulation Minimisation of Monolithic Processes
One of the biggest challenges in model checking complex systems is the state space explosion problem. A well known technique to reduce the impact of this problem is compositional minimisation. In this technique, first the state spaces of all components are computed and minimised modulo some behavioural equivalence (e.g., some form of bisimilarity). These minimised transition systems are subsequently combined to obtain the final state space. In earlier work a compositional minimisation technique was presented tailored to mCRL2: it provides support for the multi-action semantics of mCRL2 and allows splitting up a monolithic linear process specification into components. Only strong bisimulation minimisation of components can be used, limiting the effectiveness of the approach. In this paper we propose an extension to support branching bisimulation reduction and prove its correctness. We present a number of benchmarks using mCRL2 models derived from industrial SysML models, showing that a significant reduction can be achieved, also compared to compositional minimisation with strong bisimulation reduction.
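The strong bisimulation minimisation that serves as the baseline above can be sketched by naive partition refinement (the function name and LTS encoding are illustrative, and this is the simple signature-based loop, not the branching variant or an optimised implementation): states are repeatedly split until states in the same block have the same actions into the same blocks.

```python
def strong_bisim_min(states, transitions):
    """Minimise an LTS modulo strong bisimilarity by partition refinement.

    transitions: iterable of (source, action, target) triples.
    Returns a map from each state to the index of its equivalence class.
    """
    succ = {}
    for s, a, t in transitions:
        succ.setdefault(s, []).append((a, t))

    block = {s: 0 for s in states}       # start with a single block
    changed = True
    while changed:
        # signature: the set of (action, target-block) pairs of a state
        sig = {s: frozenset((a, block[t]) for a, t in succ.get(s, []))
               for s in states}
        # states keep sharing a block only if their signatures agree
        index, new_block = {}, {}
        for s in states:
            key = (block[s], sig[s])
            if key not in index:
                index[key] = len(index)
            new_block[s] = index[key]
        changed = len(set(new_block.values())) != len(set(block.values()))
        block = new_block
    return block
```

Branching bisimulation additionally has to treat internal (tau) steps as skippable along equivalent states, which is what makes the extension above non-trivial to prove correct.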
Challenges in conducting community-driven research created by differing ways of talking and thinking about science: a researcher’s perspective
Increasingly, health scientists are becoming aware that research collaborations that include community partnerships can be an effective way to broaden the scope and enhance the impact of research aimed at improving public health. Such collaborations extend the reach of academic scientists by integrating a variety of perspectives and thus strengthening the applicability of the research. Communication challenges can arise, however, when attempting to address specific research questions in these collaborations. In particular, inconsistencies can exist between scientists and community members in the use and interpretation of words and other language features, particularly when conducting research with a biomedical component. Additional challenges arise from differing perceptions of the investigative process. There may be divergent perceptions about how research questions should and can be answered, and in expectations about requirements of research institutions and research timelines. From these differences, misunderstandings can occur about how the results will ultimately impact the community. These communication issues are particularly challenging when scientists and community members are from different ethnic and linguistic backgrounds that may widen the gap between ways of talking and thinking about science, further complicating the interactions and exchanges that are essential for effective joint research efforts. Community-driven research that aims to describe the burden of disease associated with Helicobacter pylori infection is currently underway in northern Aboriginal communities located in the Yukon and Northwest Territories, Canada, with the goal of identifying effective public health strategies for reducing health risks from this infection. This research links community representatives, faculty from various disciplines at the University of Alberta, as well as territorial health care practitioners and officials. 
This highly collaborative work will be used to illustrate, from a researcher’s perspective, some of the challenges of conducting public health research in teams comprising members with varying backgrounds. The consequences of these challenges will be outlined, and potential solutions will be offered
A Thread-Safe Term Library (with a New Fast Mutual Exclusion Protocol)
Terms are one of the fundamental mathematical concepts in computing. For example, every expression characterisable by a context-free grammar is a term. We developed a thread-safe Term Library. The biggest challenge is to implement hyper-efficient multi-reader/single-writer mutual exclusion, for which we designed the new busy-forbidden protocol. Model checking is used to show the correctness of both the protocol and the Term Library. Benchmarks show this Term Library has little overhead compared to sequential versions and outperforms them already on two processors. Using the new library in an existing state space generation tool, very substantial speed-ups can be obtained.
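The core of such a term library is maximal sharing (hash-consing): identical terms are represented by a single object, so equality reduces to pointer comparison. The sketch below illustrates only this idea; the class name is invented, and the paper's busy-forbidden multi-reader/single-writer protocol is replaced by an ordinary lock as a stand-in, since its details are not given here.

```python
import threading

class TermPool:
    """A maximally-sharing (hash-consed) term store.

    Identical terms are represented by one object, so structural equality
    becomes identity.  A plain lock guards the shared table; the library
    described above instead uses a specialised multi-reader/single-writer
    protocol to make concurrent lookups cheap.
    """
    def __init__(self):
        self._table = {}
        self._lock = threading.Lock()

    def make(self, head, *args):
        key = (head,) + args          # args are already pooled terms
        with self._lock:              # one writer mutates the table at a time
            term = self._table.get(key)
            if term is None:
                term = key            # represent the term by its unique tuple
                self._table[key] = term
            return term
```

Because `make` always returns the first object stored for a given structure, two structurally equal terms built independently are the same object.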
Adaptive Non-linear Pattern Matching Automata
Efficient pattern matching is fundamental for practical term rewrite engines.
By preprocessing the given patterns into a finite deterministic automaton the
matching patterns can be decided in a single traversal of the relevant parts of
the input term. Most automaton-based techniques are restricted to linear
patterns, where each variable occurs at most once, and require an additional
post-processing step to check so-called variable consistency. However, we can
show that interleaving the variable consistency and pattern matching phases can
reduce the number of required steps to find all matches. Therefore, we take the
existing adaptive pattern matching automata as introduced by Sekar et al. and
extend these with consistency checks. We prove that the resulting deterministic
pattern matching automaton is correct, and show several examples where some
reduction can be achieved.
- …
