On the Super-Additivity and Estimation Biases of Quantile Contributions
Sample measures of top centile contributions to the total (concentration) are downward-biased, unstable estimators, extremely sensitive to sample size and concave in accounting for large deviations. This makes them particularly unfit in domains with power-law tails, especially for low values of the exponent. These estimators can vary over time and increase with the population size, as shown in this article, thus providing the illusion of structural changes in concentration. They are also inconsistent under aggregation and mixing distributions, as the weighted average of concentration measures for A and B will tend to be lower than that of A ∪ B. In addition, it can be shown that under such fat tails, increases in the total sum need to be accompanied by increased sample size of the concentration measurement. We examine the estimation superadditivity and bias under homogeneous and mixed distributions.
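A minimal simulation sketch (not from the paper) illustrating the downward bias: it assumes a Pareto tail with exponent 1.3 and compares the top-1% share estimated from small samples with the value obtained from one very large sample; the sample sizes and the helper name top_share are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def top_share(x, q=0.01):
    """Share of the total held by the top q fraction of observations."""
    x = np.sort(x)[::-1]
    k = max(1, int(np.ceil(q * len(x))))
    return x[:k].sum() / x.sum()

alpha = 1.3  # assumed power-law tail exponent (fat tail, finite mean)

# Reference value from one very large sample.
kappa_ref = top_share(rng.pareto(alpha, 5_000_000) + 1.0)

# Average of the estimator over many small samples is systematically lower.
small_n, reps = 1_000, 500
kappa_small = np.mean([top_share(rng.pareto(alpha, small_n) + 1.0)
                       for _ in range(reps)])

print(f"large-sample top-1% share: {kappa_ref:.3f}")
print(f"mean of small-sample estimates: {kappa_small:.3f} (downward biased)")
```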
Policy-based autonomic control service
Recently, there has been considerable interest in policy-based, goal-oriented service management and autonomic computing. Much work is still required to investigate designs, policy models and associated meta-reasoning systems for policy-based autonomic systems. In this paper we outline a proposed autonomic middleware control service used to orchestrate self-healing of distributed applications. Policies are used to adjust the system's autonomy and define self-healing strategies to stabilize/correct a given system in the event of failures.
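As a purely illustrative sketch of the idea (the policy format, event names and repair actions below are hypothetical, not the service's actual API), a policy-driven controller might map failure events to self-healing strategies, so that changing the policies adjusts the system's autonomy without modifying the application:

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Policy:
    failure_type: str              # failure event this policy applies to
    action: Callable[[str], None]  # repair strategy to invoke

def restart_component(component: str) -> None:
    print(f"restarting {component}")

def failover_component(component: str) -> None:
    print(f"failing {component} over to a replica")

# Policies map failure events to self-healing strategies; editing this table
# changes the system's degree of autonomy without touching the application.
policies: Dict[str, Policy] = {
    "crash": Policy("crash", restart_component),
    "overload": Policy("overload", failover_component),
}

def on_failure(component: str, failure_type: str) -> None:
    policy = policies.get(failure_type)
    if policy is not None:
        policy.action(component)

on_failure("billing-service", "crash")  # -> restarting billing-service
```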
A deliberative model for self-adaptation middleware using architectural dependency
A crucial prerequisite to externalized adaptation is an understanding of how components are interconnected, or more particularly how and why they depend on one another. Such dependencies can be used to provide an architectural model, which provides a reference point for externalized adaptation. This paper describes how dependencies are used as a basis for a system's self-understanding and subsequent architectural reconfiguration. The approach is based on the combination of instrumentation services, a dependency meta-model and a system controller. In particular, the latter uses self-healing repair rules (or conflict resolution strategies), based on an extensible beliefs, desires and intentions (EBDI) model, to reflect reconfiguration changes back to the target application under examination.
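For illustration only (the component names and graph layout are invented, not the paper's meta-model), a dependency model of this kind lets a controller work out which components a reconfiguration must touch by following "depends on" edges from the failed component:

```python
from collections import deque
from typing import Dict, List, Set

# component -> components that depend on it
dependents: Dict[str, List[str]] = {
    "database": ["order-service"],
    "order-service": ["web-frontend", "reporting"],
    "web-frontend": [],
    "reporting": [],
}

def affected_by(failed: str) -> Set[str]:
    """All components reachable via dependency edges from the failed one."""
    seen: Set[str] = set()
    queue = deque([failed])
    while queue:
        current = queue.popleft()
        for dep in dependents.get(current, []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen

print(affected_by("database"))  # {'order-service', 'web-frontend', 'reporting'}
```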
When Patrolmen Become Corrupted: Monitoring a Graph Using Faulty Mobile Robots
A team of k mobile robots is deployed on a weighted graph whose edge weights represent distances. The robots move perpetually along the domain, represented by all points belonging to the graph edges, without exceeding their maximum speed. The robots need to patrol the graph by regularly visiting all points of the domain. In this paper, we consider a team of robots (patrolmen), at most f of which may be unreliable, i.e., they fail to comply with their patrolling duties. What algorithm should be followed so as to minimize the maximum time between successive visits of every edge point by a reliable patrolman? The corresponding measure of efficiency of patrolling called idleness has been widely accepted in the robotics literature. We extend it to the case of untrusted patrolmen; we denote by I^f_k(G) the maximum time that a point of the domain may remain unvisited by reliable patrolmen. The objective is to find patrolling strategies minimizing I^f_k(G). We investigate this problem for various classes of graphs. We design optimal algorithms for line segments, which turn out to be surprisingly different from strategies for related patrolling problems proposed in the literature. We then use these results to study general graphs. For Eulerian graphs G, we give an optimal patrolling strategy with idleness I^f_k(G) = (f+1)|E|/k, where |E| is the sum of the lengths of the edges of G. Further, we show the hardness of the problem of computing the idle time for three robots, at most one of which is faulty, by reduction from 3-edge-coloring of cubic graphs, a known NP-hard problem. A byproduct of our proof is the investigation of classes of graphs minimizing idle time (with respect to the total length of edges); an example of such a class is known in the literature under the name of Kotzig graphs.
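The Eulerian-graph result lends itself to a one-line computation; the sketch below (illustrative, with a made-up function name and example graph) simply evaluates the stated bound I^f_k(G) = (f+1)|E|/k.

```python
def eulerian_idleness_bound(edge_lengths, k, f):
    """Optimal idleness (f+1)*|E|/k for an Eulerian graph patrolled by k robots,
    at most f of which may be faulty; |E| is the total length of the edges."""
    if not 0 <= f < k:
        raise ValueError("need 0 <= f < k so that at least one patrolman is reliable")
    return (f + 1) * sum(edge_lengths) / k

# Example: a cycle of four unit-length edges, three patrolmen, at most one faulty.
print(eulerian_idleness_bound([1, 1, 1, 1], k=3, f=1))  # (1+1)*4/3 ≈ 2.67
```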
Capability in the digital: institutional media management and its dis/contents
This paper explores how social media spaces are occupied, utilized and negotiated by the British Military in relation to the Ministry of Defence’s concerns and conceptualizations of risk. It draws on data from the DUN Project to investigate the content and form of social media about defence through the lens of ‘capability’, a term that captures and describes the meaning behind multiple representations of the military institution. But ‘capability’ is also a term that we hijack and extend here, not only in relation to the dominant presence of ‘capability’ as a representational trope and the extent to which it is revealing of a particular management of social media spaces, but also in relation to what our research reveals for the wider digital media landscape and ‘capable’ digital methods. What emerges from our analysis is the existence of powerful, successful and critically long-standing media and reputation management strategies occurring within the techno-economic online structures where the exercising of ‘control’ over the individual – as opposed to the technology – is highly effective. These findings raise critical questions regarding the extent to which ‘control’ and management of social media – both within and beyond the defence sector – may be determined as much by cultural, social, institutional and political influence and infrastructure as by the technological economies. At a key moment in social media analysis, then, when attention is turning to the affordances, criticisms and possibilities of data, our research is a pertinent reminder that we should not forget the active management of content that is being similarly, if not equally, effective.
Don't know, can't know: Embracing deeper uncertainties when analysing risks
Numerous types of uncertainty arise when using formal models in the analysis of risks. Uncertainty is best seen as a relation, allowing a clear separation of the object, source and ‘owner’ of the uncertainty, and we argue that all expressions of uncertainty are constructed from judgements based on possibly inadequate assumptions, and are therefore contingent. We consider a five-level structure for assessing and communicating uncertainties, distinguishing three within-model levels (event, parameter and model uncertainty) and two extra-model levels concerning acknowledged and unknown inadequacies in the modelling process, including possible disagreements about the framing of the problem. We consider the forms of expression of uncertainty within the five levels, providing numerous examples of the way in which inadequacies in understanding are handled, and examining criticisms of the attempts taken by the Intergovernmental Panel on Climate Change to separate the likelihood of events from the confidence in the science. Expressing our confidence in the adequacy of the modelling process requires an assessment of the quality of the underlying evidence, and we draw on a scale that is widely used within evidence-based medicine. We conclude that the contingent nature of risk-modelling needs to be explicitly acknowledged in advice given to policy-makers, and that unconditional expressions of uncertainty remain an aspiration.
