Building a Holistic ATM Model for Future KPI Trade-Offs
We present the model developed within the Vista project, which studies the future evolution of trade-offs between Key Performance Indicators. The model has a very broad scope and aims to simulate the effects that business and regulatory forces have at the strategic, pre-tactical and tactical levels. We present the relevant factors that will affect the air transportation system, as well as the scenarios to be simulated. The overall architecture of the model is described, and a more detailed presentation of the economic component is given. Preliminary results from this component illustrate its main mechanisms and capabilities.
Discreteness and entropic fluctuations in GREM-like systems
Within generalized random energy models, we study the effects of energy
discreteness and of entropy extensivity in the low temperature phase. At zero
temperature, discreteness of the energy induces replica symmetry breaking, in
contrast to the continuous case where the ground state is unique. However, when
the ground state energy has an extensive entropy, the distribution of overlaps
P(q) instead tends towards a single delta function in the large volume limit.
Considering now the whole frozen phase, we find that P(q) varies continuously
with temperature, and that state-to-state fluctuations of entropy wash out the
differences between the discrete and continuous energy models. Comment: 7 pages, 3 figures; v2: 2 figures added, extended from 4 to 7 pages.
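In standard spin-glass notation (my notation; the symbols w_alpha, q_alphabeta and q_0 are not taken from the abstract), the two limiting behaviours contrasted above can be written as:

```latex
% Zero temperature, discrete energies: replica symmetry breaking,
% with several ground states \alpha carrying weights w_\alpha
P(q) = \sum_{\alpha,\beta} w_\alpha\, w_\beta\, \delta\!\left(q - q_{\alpha\beta}\right),
\qquad \sum_\alpha w_\alpha = 1.

% Extensive ground-state entropy: the overlap distribution collapses
P(q) \xrightarrow[\,N \to \infty\,]{} \delta(q - q_0).
```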
Deep learning cardiac motion analysis for human survival prediction
Motion analysis is used in computer vision to understand the behaviour of
moving objects in sequences of images. Optimising the interpretation of dynamic
biological systems requires accurate and precise motion tracking as well as
efficient representations of high-dimensional motion trajectories so that these
can be used for prediction tasks. Here we use image sequences of the heart,
acquired using cardiac magnetic resonance imaging, to create time-resolved
three-dimensional segmentations using a fully convolutional network trained on
anatomical shape priors. This dense motion model formed the input to a
supervised denoising autoencoder (4Dsurvival), which is a hybrid network
consisting of an autoencoder that learns a task-specific latent code
representation trained on observed outcome data, yielding a latent
representation optimised for survival prediction. To handle right-censored
survival outcomes, our network used a Cox partial likelihood loss function. In
a study of 302 patients the predictive accuracy (quantified by Harrell's
C-index) was significantly higher (p < .0001) for our model (C = 0.73, 95% CI:
0.68-0.78) than for the human benchmark (C = 0.59, 95% CI: 0.53-0.65). This
work demonstrates how a complex computer vision task using high-dimensional
medical image data can efficiently predict human survival.
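As a hedged illustration of the loss mentioned above: the Cox partial likelihood scores each observed event against its risk set, so right-censored subjects contribute only through the denominators. This is a minimal numpy sketch, not the paper's implementation; the function name and the Breslow-style (no tie correction) treatment are my choices.

```python
import numpy as np

def cox_ph_loss(risk_scores, times, events):
    """Negative Cox partial log-likelihood (Breslow, no tie correction).

    risk_scores: model outputs (log hazard ratios), shape (n,)
    times:       observed times (event or censoring), shape (n,)
    events:      1 if the event was observed, 0 if right-censored
    """
    order = np.argsort(-times)            # sort by descending time
    scores = risk_scores[order]
    ev = events[order]
    # running logsumexp of scores = log of the risk-set denominator:
    # at position i it covers exactly the subjects with time >= times[i]
    log_risk = np.logaddexp.accumulate(scores)
    # only subjects with an observed event contribute a numerator term
    return -np.sum((scores - log_risk) * ev) / max(ev.sum(), 1)
```

Minimizing this loss pushes subjects who fail earlier toward higher risk scores, which is what Harrell's C-index then measures.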
Analysis of Kapitza-Dirac diffraction patterns beyond the Raman-Nath regime
We study Kapitza-Dirac diffraction of a Bose-Einstein condensate from a
standing light wave for a square pulse with variable pulse length but constant
pulse area. We find that for sufficiently weak pulses, the usual analytical
short-pulse prediction for the Raman-Nath regime continues to hold for longer
times, albeit with a reduction of the apparent modulation depth of the standing
wave. We quantitatively relate this effect to the Fourier width of the pulse,
and draw analogies to the Rabi dynamics of a coupled two-state system. Our
findings, combined with numerical modeling for stronger pulses, are of
practical interest for the calibration of optical lattices in ultracold atomic
systems.
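The analytical short-pulse (Raman-Nath) prediction referred to above gives the population of the n-th diffraction order as a squared Bessel function of the pulse area. A minimal numerical sketch follows; the symbol theta and the exact pulse-area convention are my assumptions (conventions in the literature differ by factors of 2).

```python
import numpy as np
from scipy.special import jv

def raman_nath_populations(theta, n_max=10):
    """Diffraction-order populations in the Raman-Nath (short-pulse)
    limit: P_n = J_n(theta)^2 for orders n = -n_max .. n_max,
    where theta is the pulse area (convention-dependent)."""
    orders = np.arange(-n_max, n_max + 1)
    return orders, jv(orders, theta) ** 2
```

By the Bessel identity J_0(x)^2 + 2*sum_n J_n(x)^2 = 1, the populations sum to unity, which makes a convenient sanity check when calibrating lattice depth.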
Population stability: regulating size in the presence of an adversary
We introduce a new coordination problem in distributed computing that we call
the population stability problem. A system of agents each with limited memory
and communication, as well as the ability to replicate and self-destruct, is
subjected to attacks by a worst-case adversary that can at a bounded rate (1)
delete agents chosen arbitrarily and (2) insert additional agents with
arbitrary initial state into the system. The goal is perpetually to maintain a
population whose size is within a constant factor of the target size. The
problem is inspired by the ability of complex biological systems composed of a
multitude of memory-limited individual cells to maintain a stable population
size in an adverse environment. Such biological mechanisms allow organisms to
heal after trauma or to recover from excessive cell proliferation caused by
inflammation, disease, or normal development.
We present a population stability protocol in a communication model that is a
synchronous variant of the population model of Angluin et al. In each round,
pairs of agents selected at random meet and exchange messages, where at least a
constant fraction of agents is matched in each round. Our protocol uses
three-bit messages and states per agent. We emphasize that
our protocol can handle an adversary that can both insert and delete agents, a
setting in which existing approximate counting techniques do not seem to apply.
The protocol relies on a novel coloring strategy in which the population size
is encoded in the variance of the distribution of colors. Individual agents can
locally obtain a weak estimate of the population size by sampling from the
distribution, and make individual decisions that robustly maintain a stable
global population size.
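As a toy illustration of the idea that a population size can be encoded in the variance of a color distribution (this is not the paper's protocol, whose coloring strategy is more involved): if each agent independently colors itself +1 or -1, the empirical mean color fluctuates across recolorings with variance 1/n, so n can be read off from the size of those fluctuations.

```python
import random

def estimate_size(n, trials=2000, rng=None):
    """Toy size estimate: each agent picks a color in {-1, +1}
    uniformly at random; the mean color over the population has
    variance 1/n, so n is estimated as 1 / Var(mean) across
    independent recolorings. Illustrative only."""
    rng = rng or random.Random(0)
    means = []
    for _ in range(trials):
        colors = [rng.choice((-1, 1)) for _ in range(n)]
        means.append(sum(colors) / n)
    var = sum(m * m for m in means) / trials   # E[mean] = 0 by symmetry
    return 1.0 / var
```

The estimate is weak for any single agent sampling a few colors, which matches the abstract's point that agents obtain only a weak local estimate yet can still make robust collective decisions.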
Rapid dissemination of human T-lymphotropic virus type 1 during primary infection in transplant recipients
Must naive realists be relationalists?
Relationalism maintains that perceptual experience involves, as part of its nature, a distinctive kind of conscious perceptual relation between a subject of experience and an object of experience. Together with the claim that perceptual experience is presentational, relationalism is widely believed to be a core aspect of the naive realist outlook on perception. This is a mistake. I argue that naive realism about perception can be upheld without a commitment to relationalism.
Innovator resilience potential: A process perspective of individual resilience as influenced by innovation project termination
Innovation projects fail at an astonishing rate. Yet, the negative effects of innovation project failures on the team members of these projects have been largely neglected in research streams that deal with innovation project failures. After such setbacks, it is vital to maintain or even strengthen project members’ innovative capabilities for subsequent innovation projects. For this, the concept of resilience, i.e. project members’ potential to positively adjust (or even grow) after a setback such as an innovation project failure, is fundamental. We develop the second-order construct of innovator resilience potential, which consists of six components – self-efficacy, outcome expectancy, optimism, hope, self-esteem, and risk propensity – that are important for project members’ potential of innovative functioning in innovation projects subsequent to a failure. We illustrate our theoretical findings by means of a qualitative study of a terminated large-scale innovation project, and derive implications for research and management
Neurology
Contains reports on four research projects. U.S. Public Health Service (B-3055-4); U.S. Public Health Service (B-3090-4); U.S. Public Health Service (MH-06175-02); U.S. Navy, Office of Naval Research (Nonr-1841(70)); U.S. Air Force (AF49(638)-1313).
Simplest random K-satisfiability problem
We study a simple and exactly solvable model for the generation of random
satisfiability problems. These consist of random Boolean constraints
which are to be satisfied simultaneously by logical variables. In
statistical-mechanics language, the considered model can be seen as a diluted
p-spin model at zero temperature. While such problems become extraordinarily
hard to solve by local search methods in a large region of the parameter space,
still at least one solution may be superimposed by construction. The
statistical properties of the model can be studied exactly by the replica
method and each single instance can be analyzed in polynomial time by a simple
global solution method. The geometrical/topological structures responsible for
dynamic and static phase transitions as well as for the onset of computational
complexity in local search methods are thoroughly analyzed. Numerical analysis
on very large samples allows for a precise characterization of the critical
scaling behaviour. Comment: 14 pages, 5 figures, to appear in Phys. Rev. E (Feb 2001). v2: minor errors and references corrected.
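The device of superimposing at least one solution by construction can be sketched by rejection sampling clauses against a hidden planted assignment. This is an illustrative generator only; the paper's exactly solvable ensemble fixes a specific constraint family rather than plain rejection sampling.

```python
import random

def planted_ksat(n_vars, n_clauses, k=3, seed=0):
    """Random k-SAT instance with a planted solution: clauses are
    drawn uniformly but kept only if satisfied by a hidden
    assignment, so at least one solution exists by construction.
    A clause is a list of literals (var_index, negated?)."""
    rng = random.Random(seed)
    planted = [rng.choice((False, True)) for _ in range(n_vars)]
    clauses = []
    while len(clauses) < n_clauses:
        vars_ = rng.sample(range(n_vars), k)
        clause = [(v, rng.choice((False, True))) for v in vars_]
        # literal (v, neg) is true under assignment a iff a[v] != neg
        if any(planted[v] != neg for v, neg in clause):
            clauses.append(clause)
    return planted, clauses

def satisfies(assignment, clauses):
    """Check that every clause has at least one true literal."""
    return all(any(assignment[v] != neg for v, neg in c) for c in clauses)
```

At high clause density such planted instances stay satisfiable by construction even where local search struggles, which is exactly the regime the abstract highlights.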