Talking quiescence: a rigorous theory that supports parallel composition, action hiding and determinisation
The notion of quiescence - the absence of outputs - is vital in both
behavioural modelling and testing theory. Although the need for quiescence was
already recognised in the 90s, it has only been treated as a second-class
citizen thus far. This paper moves quiescence into the foreground and
introduces the notion of quiescent transition systems (QTSs): an extension of
regular input-output transition systems (IOTSs) in which quiescence is
represented explicitly, via quiescent transitions. Four carefully crafted rules
on the use of quiescent transitions ensure that our QTSs naturally capture
quiescent behaviour.
We present the building blocks for a comprehensive theory on QTSs supporting
parallel composition, action hiding and determinisation. In particular, we
prove that these operations preserve all the aforementioned rules.
Additionally, we provide a way to transform existing IOTSs into QTSs, allowing
even IOTSs as input that already contain some quiescent transitions. As an
important application, we show how our QTS framework simplifies the fundamental
model-based testing theory formalised around ioco.
Comment: In Proceedings MBT 2012, arXiv:1202.582
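The core idea of the QTS construction, representing quiescence by explicit transitions, can be sketched concretely. The following is a minimal illustration under assumed conventions (a dict-based transition relation, "!" marking outputs, "?" marking inputs, and a "delta" label), not the paper's four formal rules:

```python
# A minimal sketch: an IOTS as a dict mapping each state to its outgoing
# (action, target) pairs. Outputs are suffixed with "!", inputs with "?".
# A state is quiescent when it enables no output; we make that observable
# by adding an explicit "delta" self-loop, mirroring how QTSs represent
# quiescence as a transition rather than as mere absence of outputs.

DELTA = "delta"

def add_quiescence(iots):
    """Return a copy of `iots` with a delta self-loop at each quiescent state."""
    qts = {state: list(trans) for state, trans in iots.items()}
    for state, trans in qts.items():
        has_output = any(act.endswith("!") for act, _ in trans)
        if not has_output:
            trans.append((DELTA, state))
    return qts

# Example: s0 only awaits input (quiescent), s1 produces an output (not quiescent).
iots = {"s0": [("coin?", "s1")], "s1": [("coffee!", "s0")]}
qts = add_quiescence(iots)
assert (DELTA, "s0") in qts["s0"]
assert all(act != DELTA for act, _ in qts["s1"])
```

Making quiescence a first-class transition in this way is what lets determinisation and parallel composition treat it uniformly with ordinary actions.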
Testing Reactive Probabilistic Processes
We define a testing equivalence in the spirit of De Nicola and Hennessy for
reactive probabilistic processes, i.e. for processes where the internal
nondeterminism is due to random behaviour. We characterize the testing
equivalence in terms of ready-traces. From the characterization it follows that
the equivalence is insensitive to the exact moment in time at which an
internal probabilistic choice occurs, a property inherited from the original
testing equivalence of De Nicola and Hennessy. We also show decidability of
the testing equivalence for finite systems for which the complete model may
not be known.
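The ready-trace characterisation can be illustrated on an ordinary finite deterministic LTS. This is only a sketch of the ready-trace idea itself, not of the paper's probabilistic setting: a ready trace alternates the set of enabled actions (the "ready set") with the action actually taken.

```python
# Hypothetical illustration: enumerate bounded-length ready traces of a
# finite deterministic LTS given as {state: [(action, target), ...]}.
# Each trace alternates ready sets (frozensets of enabled actions) with
# the actions performed.

def ready_traces(lts, state, depth):
    """Enumerate ready traces of at most `depth` actions from `state`."""
    ready = frozenset(act for act, _ in lts.get(state, []))
    traces = [(ready,)]
    if depth == 0:
        return traces
    for act, nxt in lts.get(state, []):
        for tail in ready_traces(lts, nxt, depth - 1):
            traces.append((ready, act) + tail)
    return traces

lts = {"p": [("a", "q")], "q": [("b", "p"), ("c", "p")]}
ts = ready_traces(lts, "p", 2)
assert (frozenset({"a"}),) in ts
assert (frozenset({"a"}), "a", frozenset({"b", "c"})) in ts
```

Two processes are ready-trace equivalent when they admit the same such traces; the paper's contribution is that, for reactive probabilistic processes, testing equivalence coincides with a probabilistic analogue of this relation.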
Using schedulers to test probabilistic distributed systems
This is the author's accepted manuscript. The final publication is available at Springer via http://dx.doi.org/10.1007/s00165-012-0244-5. Copyright © 2012, British Computer Society.

Formal methods are one of the most important approaches to increasing the confidence in the correctness of software systems. A formal specification can be used as an oracle in testing, since one can determine whether an observed behaviour is allowed by the specification. This is an important feature of formal testing: behaviours of the system observed in testing are compared with the specification, and ideally this comparison is automated. In this paper we study a formal testing framework to deal with systems that interact with their environment at physically distributed interfaces, called ports, and where choices between different possibilities are probabilistically quantified. Building on previous work, we introduce two families of schedulers to resolve nondeterministic choices among different actions of the system. The first family, which we call global schedulers, resolves nondeterministic choices by representing the environment as a single global scheduler. The second, which we call localised schedulers, models the environment as a set of schedulers, with one scheduler for each port. We formally define the application of schedulers to systems, and define and study different implementation relations in this setting.
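The distinction between the two scheduler families can be sketched in code. The types and names below are hypothetical illustrations, not the paper's formal definitions: a global scheduler sees the full cross-port trace, while a localised scheduler at a port sees only that port's projection of the trace.

```python
# Sketch of the two scheduler families for a system with distributed ports.
# A trace is a list of (port, action) pairs.

def global_scheduler(trace):
    """One scheduler for the whole system: sees the full trace and picks
    the next port to act (here, a trivial alternating policy)."""
    return "port1" if len(trace) % 2 == 0 else "port2"

def make_localised_schedulers(ports):
    """One scheduler per port; each sees only its local projection."""
    def local_sched(local_trace):
        return "input" if len(local_trace) % 2 == 0 else "wait"
    return {p: local_sched for p in ports}

def project(trace, port):
    """Local projection: the sub-trace of actions observed at `port`."""
    return [(p, a) for (p, a) in trace if p == port]

trace = [("port1", "a"), ("port2", "b"), ("port1", "c")]
assert project(trace, "port1") == [("port1", "a"), ("port1", "c")]
scheds = make_localised_schedulers(["port1", "port2"])
assert scheds["port2"](project(trace, "port2")) == "wait"
```

The key design point is information flow: a localised scheduler cannot base its choice on events at other ports, which is exactly what makes the two families induce different implementation relations.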
An Algorithm for Probabilistic Alternating Simulation
In probabilistic game structures, probabilistic alternating simulation
(PA-simulation) relations preserve formulas defined in probabilistic
alternating-time temporal logic with respect to the behaviour of a subset of
players. We propose a partition based algorithm for computing the largest
PA-simulation, which is to our knowledge the first such algorithm that works in
polynomial time, by extending the generalised coarsest partition problem (GCPP)
in a game-based setting with mixed strategies. The algorithm has higher
complexities than those in the literature for non-probabilistic simulation and
probabilistic simulation without mixed actions, but slightly improves the
existing result for computing probabilistic simulation with respect to mixed
actions.
Comment: We have fixed a problem in the SOFSEM'12 conference version.
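The GCPP that the algorithm generalises builds on classic partition refinement. As a simplified non-probabilistic baseline (not the paper's mixed-strategy, game-based extension), here is naive partition refinement computing bisimulation classes of a finite LTS: blocks are split until all states in a block reach the same set of blocks under every action.

```python
# Naive coarsest partition refinement for strong bisimulation on a finite
# LTS given as {state: {action: [targets]}}. Illustrative only: the paper's
# algorithm refines partitions over probabilistic game structures with
# mixed strategies, which is considerably more involved.

def refine(lts, states, actions):
    partition = [frozenset(states)]
    changed = True
    while changed:
        changed = False
        block_of = {s: b for b in partition for s in b}
        new_partition = []
        for block in partition:
            def sig(s):
                # Which blocks can `s` reach, per action?
                return frozenset(
                    (a, block_of[t])
                    for a in actions
                    for t in lts.get(s, {}).get(a, ())
                )
            groups = {}
            for s in block:
                groups.setdefault(sig(s), set()).add(s)
            if len(groups) > 1:
                changed = True
            new_partition.extend(frozenset(g) for g in groups.values())
        partition = new_partition
    return partition

lts = {"s": {"a": ["t"]}, "u": {"a": ["v"]}, "t": {}, "v": {}}
part = refine(lts, ["s", "t", "u", "v"], ["a"])
assert frozenset({"s", "u"}) in part and frozenset({"t", "v"}) in part
```

The probabilistic and game-theoretic generalisations replace the block-signature test with checks over mixed-strategy distributions, which is where the higher complexity mentioned above comes from.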
Probabilistic Mobility Models for Mobile and Wireless Networks
In this paper we present a probabilistic broadcast calculus for mobile and wireless networks whose connections are unreliable. In our calculus, broadcast messages can be lost with a certain probability, and due to mobility the connection probabilities may change. If a network broadcasts a message from a location, it evolves to a network distribution depending on whether nodes at other locations receive the message or not. Mobility of nodes is not arbitrary but guarded by a probabilistic mobility function (PMF), and we also define the notion of a weak bisimulation given a PMF. It is possible to have weakly bisimilar networks which have different probabilistic connectivity information. We furthermore examine the relation between our weak bisimulation and a minor variant of PCTL* [1]. Finally, we apply our calculus to a small example, the Zeroconf protocol [2].
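The step from a single network to a network distribution can be made concrete. This is a hypothetical encoding, not the paper's calculus: if each node independently receives a broadcast with its current connection probability, the possible receiver sets form a distribution over successor networks.

```python
from itertools import product

# Given per-node reception probabilities, compute the distribution over
# receiver sets induced by one broadcast. Each other node receives the
# message independently with its connection probability.

def broadcast_distribution(conn_probs):
    """conn_probs: node -> probability of receiving the broadcast.
    Returns {frozenset of receivers: probability}."""
    nodes = sorted(conn_probs)
    dist = {}
    for outcome in product([False, True], repeat=len(nodes)):
        p = 1.0
        for node, received in zip(nodes, outcome):
            p *= conn_probs[node] if received else 1 - conn_probs[node]
        receivers = frozenset(n for n, r in zip(nodes, outcome) if r)
        dist[receivers] = p
    return dist

dist = broadcast_distribution({"n1": 0.9, "n2": 0.5})
assert abs(dist[frozenset({"n1", "n2"})] - 0.45) < 1e-9
assert abs(sum(dist.values()) - 1.0) < 1e-9
```

In the calculus, mobility then perturbs the connection probabilities between broadcasts, as constrained by the PMF.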
On coalgebras with internal moves
In the first part of the paper we recall the coalgebraic approach to handling
the so-called invisible transitions that appear in different state-based
systems semantics. We claim that these transitions are always part of the unit
of a certain monad. Hence, coalgebras with internal moves are exactly
coalgebras over a monadic type. The rest of the paper is devoted to supporting
our claim by studying two important behavioural equivalences for state-based
systems with internal moves, namely: weak bisimulation and trace semantics.
We continue our research on weak bisimulations for coalgebras over order
enriched monads. The key notions used in this paper and proposed by us in our
previous work are the notions of an order saturation monad and a saturator. A
saturator operator can be intuitively understood as a reflexive, transitive
closure operator. There are two approaches towards defining saturators for
coalgebras with internal moves. Here, we give necessary conditions for them to
yield the same notion of weak bisimulation.
Finally, we propose a definition of trace semantics for coalgebras with
silent moves via a uniform fixed point operator. We compare strong and weak
bisimulation together with trace semantics for coalgebras with internal steps.
Comment: Article: 23 pages, Appendix: 3 pages
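The saturator intuition, a reflexive and transitive closure operator, has a familiar concrete instance on ordinary labelled transition systems with a silent action "tau": the weak transition relation saturates each visible step with tau-steps before and after. The sketch below shows only this classical special case, not the order-enriched coalgebraic generality of the paper.

```python
# Weak (saturated) transitions on an LTS given as
# {state: [(action, target), ...]} with silent action "tau".

def tau_closure(lts, state):
    """States reachable from `state` by zero or more tau steps
    (the reflexive, transitive closure of the tau relation)."""
    seen, stack = {state}, [state]
    while stack:
        s = stack.pop()
        for act, t in lts.get(s, []):
            if act == "tau" and t not in seen:
                seen.add(t)
                stack.append(t)
    return seen

def weak_step(lts, state, action):
    """Targets of `state` under tau* . action . tau*: the saturated step."""
    targets = set()
    for s in tau_closure(lts, state):
        for act, t in lts.get(s, []):
            if act == action:
                targets |= tau_closure(lts, t)
    return targets

lts = {"p": [("tau", "q")], "q": [("a", "r")], "r": [("tau", "s")]}
assert weak_step(lts, "p", "a") == {"r", "s"}
```

Weak bisimulation then compares states using these saturated steps instead of the strong ones, which is the behaviour the coalgebraic saturator abstracts.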
Effectiveness of dolutegravir-based regimens as either first-line or switch antiretroviral therapy: data from the Icona cohort
Introduction: Concerns about dolutegravir (DTG) tolerability in the real-life setting have recently arisen. We aimed to estimate the risk of treatment discontinuation and virological failure of DTG-based regimens from a large cohort of HIV-infected individuals. Methods: We performed a multicentre, observational study including all antiretroviral therapy (ART)-naïve and virologically suppressed treatment-experienced (TE) patients from the Icona (Italian Cohort Naïve Antiretrovirals) cohort who started, for the first time, a DTG-based regimen from January 2015 to December 2017. We estimated the cumulative risk of DTG discontinuation regardless of the reason and for toxicity, and of virological failure, using Kaplan–Meier curves. We used a Cox regression model to investigate predictors of DTG discontinuation. Results: A total of 1679 individuals (932 ART-naïve, 747 TE) were included. The one- and two-year probabilities (95% CI) of DTG discontinuation were 6.7% (4.9 to 8.4) and 11.5% (8.7 to 14.3) for ART-naïve and 6.6% (4.6 to 8.6) and 7.6% (5.4 to 9.8) for TE subjects. In both ART-naïve and TE patients, discontinuations of DTG were mainly driven by toxicity, with an estimated risk (95% CI) of 4.0% (2.6 to 5.4) and 2.5% (1.3 to 3.6) by one year and 5.6% (3.8 to 7.5) and 4.0% (2.4 to 5.6) by two years respectively. Neuropsychiatric events were the main reason for stopping DTG in both ART-naïve (2.1%) and TE (1.7%) patients. In ART-naïve patients, a concomitant AIDS diagnosis predicted the risk of discontinuing DTG for any reason (adjusted relative hazard (aRH) = 3.38, p = 0.001), whereas starting DTG in combination with abacavir (ABC) was associated with a higher risk of discontinuing because of toxicity (aRH = 3.30, p = 0.009).
TE patients starting a DTG-based dual therapy, compared to a triple therapy, had a lower risk of discontinuation for any reason (adjusted hazard ratio (aHR) = 2.50, p = 0.037 for ABC-based triple-therapies; aHR = 3.56, p = 0.012 for tenofovir-based) and for toxicity (aHR = 5.26, p = 0.030 for ABC-based; aHR = 6.60, p = 0.024 for tenofovir-based). The one- and two-year probabilities (95% CI) of virological failure were 1.2% (0.3 to 2.0) and 4.6% (2.7 to 6.5) in the ART-naïve group and 2.2% (1.0 to 3.3) and 2.9% (1.5 to 4.3) in the TE group. Conclusions: In this large cohort, DTG showed excellent efficacy and optimal tolerability both as first-line and as switching ART. The low risk of treatment-limiting toxicities in ART-naïve as well as in treated individuals is reassuring regarding the use of DTG in everyday clinical practice.
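The cumulative risks above come from Kaplan–Meier curves. As a reminder of how that estimator works (the event times below are invented for illustration, not study data), the survival probability is the product over event times t_i of (1 - d_i / n_i), with d_i events among n_i patients still at risk:

```python
# Minimal Kaplan-Meier estimator. Cumulative risk at time t is 1 - S(t).
# Input: one follow-up time per patient and an event indicator
# (1 = event such as discontinuation, 0 = censored).

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct time where an event occurs."""
    at_risk = len(times)
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if d > 0:
            surv *= 1 - d / at_risk
            curve.append((t, surv))
        at_risk -= sum(1 for ti in times if ti == t)
    return curve

# 5 hypothetical patients: events at t=3, 5, 8; censored at t=5 and 10.
curve = kaplan_meier([3, 5, 5, 8, 10], [1, 1, 0, 1, 0])
# After t=3: S = 1 - 1/5 = 0.8; at t=5, one event among 4 at risk: 0.8 * 3/4 = 0.6
assert abs(dict(curve)[5] - 0.6) < 1e-9
```

Censored patients leave the risk set without contributing an event, which is what lets the estimator handle the staggered follow-up typical of cohort data.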
A tutorial on interactive Markov chains
Interactive Markov chains (IMCs) constitute a powerful stochastic model that extends both continuous-time Markov chains and labelled transition systems. IMCs enable a wide range of modelling and analysis techniques and serve as a semantic model for many industrial and scientific formalisms, such as AADL, GSPNs and many more. Applications cover various engineering contexts ranging from industrial system-on-chip manufacturing to satellite designs. We present a survey of the state of the art in modelling and analysis of IMCs.
We cover a set of techniques that can be utilised for compositional modelling, state space generation and reduction, and model checking. The significance of the presented material and corresponding tools is highlighted through multiple case studies.
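The two transition kinds an IMC combines can be illustrated with a toy simulator (a hypothetical encoding, not any tool's API): Markovian transitions carry an exponential rate, interactive transitions carry an action label, and under the usual maximal-progress assumption an enabled internal action pre-empts all Markovian delays.

```python
import random

# Toy IMC: {state: [("act", label, target) | ("rate", rate, target), ...]}.

def step(imc, state, rng):
    """Simulate one step; returns (elapsed_delay, next_state)."""
    interactive = [t for t in imc[state] if t[0] == "act"]
    markovian = [t for t in imc[state] if t[0] == "rate"]
    # Maximal progress: an enabled internal action fires without delay.
    internal = [(k, lab, tgt) for (k, lab, tgt) in interactive if lab == "tau"]
    if internal:
        _, _, tgt = internal[0]
        return 0.0, tgt
    # Otherwise race the exponential delays; the fastest transition wins.
    samples = [(rng.expovariate(rate), tgt) for (_, rate, tgt) in markovian]
    return min(samples)

imc = {"s0": [("act", "tau", "s1")],
       "s1": [("rate", 2.0, "s2"), ("rate", 1.0, "s3")]}
rng = random.Random(42)
delay, nxt = step(imc, "s0", rng)
assert delay == 0.0 and nxt == "s1"
delay, nxt = step(imc, "s1", rng)
assert delay > 0.0 and nxt in {"s2", "s3"}
```

This separation of delays from actions is what lets IMCs be composed like labelled transition systems while retaining CTMC-style timed analysis.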
Search for the standard model Higgs boson in the H to ZZ to 2l 2nu channel in pp collisions at sqrt(s) = 7 TeV
A search for the standard model Higgs boson in the H to ZZ to 2l 2nu decay
channel, where l = e or mu, in pp collisions at a center-of-mass energy of 7
TeV is presented. The data were collected at the LHC, with the CMS detector,
and correspond to an integrated luminosity of 4.6 inverse femtobarns. No
significant excess is observed above the background expectation, and upper
limits are set on the Higgs boson production cross section. The presence of the
standard model Higgs boson with a mass in the 270-440 GeV range is excluded at
95% confidence level.Comment: Submitted to JHE
Combined search for the quarks of a sequential fourth generation
Results are presented from a search for a fourth generation of quarks
produced singly or in pairs in a data set corresponding to an integrated
luminosity of 5 inverse femtobarns recorded by the CMS experiment at the LHC in
2011. A novel strategy has been developed for a combined search for quarks of
the up and down type in decay channels with at least one isolated muon or
electron. Limits on the mass of the fourth-generation quarks and the relevant
Cabibbo-Kobayashi-Maskawa matrix elements are derived in the context of a
simple extension of the standard model with a sequential fourth generation of
fermions. The existence of mass-degenerate fourth-generation quarks with masses
below 685 GeV is excluded at 95% confidence level for minimal off-diagonal
mixing between the third- and the fourth-generation quarks. With a mass
difference of 25 GeV between the quark masses, the obtained limit on the masses
of the fourth-generation quarks shifts by about +/- 20 GeV. These results
significantly reduce the allowed parameter space for a fourth generation of
fermions.
Comment: Replaced with published version. Added journal reference and DOI.
