Towards Patterns for Heaps and Imperative Lambdas
In functional programming, point-free relation calculi have been fruitful for
general theories of program construction, but for specific applications
pointwise expressions can be more convenient and comprehensible. In imperative
programming, refinement calculi have been tied to pointwise expression in terms
of state variables, with the curious exception of the ubiquitous but invisible
heap. To integrate pointwise with point-free, de Moor and Gibbons extended
lambda calculus with non-injective pattern matching interpreted using
relations. This article gives a semantics of that language using "ideal relations"
between partial orders, and a second semantics using predicate transformers. The
second semantics is motivated by its potential use with separation algebra, for
pattern matching in programs acting on the heap. Laws including lax beta and eta
are proved in these models, and a number of open problems are posed.
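As an informal illustration (not the calculus or semantics of the article itself), the reason non-injective patterns call for a relational reading is that a value can match such a pattern in more than one way, so a match denotes a set of bindings rather than a single one. The C++ sketch below models matching a natural number against the hypothetical pattern "x + y" by computing every decomposition; the names are invented for illustration only.

    // Illustrative only: "x + y" is not an injective pattern, so matching a
    // value against it yields a set of bindings; here the match is modelled
    // as a relation, computed as the set of all decompositions.
    #include <cstdio>
    #include <utility>
    #include <vector>

    // All bindings (x, y) with x >= 0, y >= 0 and x + y == n.
    std::vector<std::pair<int, int>> matchPlus(int n) {
        std::vector<std::pair<int, int>> bindings;
        for (int x = 0; x <= n; ++x)
            bindings.push_back({x, n - x});   // each decomposition is one possible match
        return bindings;
    }

    int main() {
        for (auto [x, y] : matchPlus(3))
            std::printf("x = %d, y = %d\n", x, y);  // four bindings: (0,3) ... (3,0)
    }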
Left of bang interventions in trauma: ethical implications for military medical prophylaxis
Advances in medical capability should be accompanied by discussion of their ethical implications. In the military medical context there is a growing interest in developing prophylactic interventions that will mitigate the effects of trauma and improve survival. The ethics of this novel capability are currently unexplored. This paper describes the concept of trauma prophylaxis (Left Of Bang Interventions in Trauma) and outlines some of the ethical issues that need to be considered, including within concept development, research, and implementation. Trauma prophylaxis can be divided into interventions that do not (type 1) and those that do (type 2) have medical enhancement as an unintended side effect of their prophylactic action. We conclude that type 1 interventions have much in common with established military medical prophylaxis, and that the potentially enhancing qualities of type 2 interventions raise different issues. We welcome further debate on both types of intervention.
03411 Abstracts Collection -- Language Based Security
From October 5th to 10th, 2003, the Dagstuhl Seminar 03411
"Language Based Security" was held
in the International Conference and Research Center (IBFI), Schloss Dagstuhl.
During the seminar, several participants presented their current
research, and ongoing work and open problems were discussed. Abstracts of
the presentations given during the seminar are put together in this paper.
Relational Logic with Framing and Hypotheses
Relational properties arise in many settings: relating two versions of a program that use different data representations, noninterference properties for security, etc. The main ingredient of relational verification, relating aligned pairs of intermediate steps, has been used in numerous guises, but existing relational program logics are narrow in scope. This paper introduces a logic based on novel syntax that weaves together product programs to express alignment of control flow points at which relational formulas are asserted. Correctness judgments feature hypotheses with relational specifications, discharged by a rule for the linking of procedure implementations. The logic supports reasoning about program-pairs containing both similar and dissimilar control and data structures. Reasoning about dynamically allocated objects is supported by a frame rule based on frame conditions amenable to SMT provers. We prove soundness and sketch how the logic can be used for data abstraction, loop optimizations, and secure information flow.
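For a concrete flavour of the kind of property such a logic targets (an illustrative example, not taken from the paper), consider two versions of a summation procedure that traverse their input in opposite directions. A relational specification would state that equal inputs yield equal results, and a proof would align the two loops iteration by iteration, relating their partial sums.

    // Hypothetical program pair of the kind a relational judgment relates:
    // the relational claim is that sumV1 and sumV2 agree on equal inputs.
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    int sumV1(const std::vector<int>& a) {            // version 1: left-to-right
        int s = 0;
        for (std::size_t i = 0; i < a.size(); ++i) s += a[i];
        return s;
    }

    int sumV2(const std::vector<int>& a) {            // version 2: right-to-left
        int s = 0;
        for (std::size_t i = a.size(); i > 0; --i) s += a[i - 1];
        return s;
    }

    int main() {
        std::vector<int> xs{1, 2, 3, 4};
        std::printf("%d %d\n", sumV1(xs), sumV2(xs)); // relational claim: always equal
    }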
Monitoring Neural Activity with Bioluminescence during Natural Behavior
Existing techniques for monitoring neural activity in awake, freely behaving vertebrates are invasive and difficult to target to genetically identified neurons. We used bioluminescence to non-invasively monitor the activity of genetically specified neurons in freely behaving zebrafish. Transgenic fish expressing the Ca²⁺-sensitive photoprotein green fluorescent protein (GFP)-Aequorin in most neurons generated large and fast bioluminescent signals related to neural activity (neuroluminescence), which could be recorded continuously for many days. To test the limits of this technique, we specifically targeted GFP-Aequorin to the hypocretin-positive neurons of the hypothalamus. We found that neuroluminescence generated by this group of ~20 neurons was associated with periods of increased locomotor activity and identified two classes of neural activity corresponding to distinct swim latencies. Our neuroluminescence assay can report, with high temporal resolution and sensitivity, the activity of small subsets of neurons during unrestrained behavior.
A multivariate timeseries modeling approach to severity of illness assessment and forecasting in ICU with sparse, heterogeneous clinical data
The ability to determine patient acuity (or severity of illness) has immediate practical use for clinicians. We evaluate the use of multivariate timeseries modeling with multi-task Gaussian process (GP) models using noisy, incomplete, sparse, heterogeneous and unevenly-sampled clinical data, including both physiological signals and clinical notes. The learned multi-task GP (MTGP) hyperparameters are then used to assess and forecast patient acuity. Experiments were conducted with two real clinical data sets acquired from ICU patients: firstly, estimating cerebrovascular pressure reactivity, an important indicator of secondary damage for traumatic brain injury patients, by learning the interactions between intracranial pressure and mean arterial blood pressure signals, and secondly, mortality prediction using clinical progress notes. In both cases, MTGPs provided improved results: an MTGP model provided better results than single-task GP models for signal interpolation and forecasting (0.91 vs 0.69 RMSE), and the use of MTGP hyperparameters obtained improved results when used as additional classification features (0.812 vs 0.788 AUC). This work was supported by the Intel Science and Technology Center for Big Data; the National Institutes of Health (U.S.) National Library of Medicine (Biomedical Informatics Research Training Grant NIH/NLM 2T15 LM007092-22); the National Institute of Biomedical Imaging and Bioengineering (U.S.) (R01 Grant EB001659); and a Singapore Agency for Science, Technology and Research Graduate Scholarship.
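For reference, one common construction of a multi-task GP covariance (the general form typically used for this kind of model, not necessarily the exact formulation of this work) couples a learned covariance over tasks, here the clinical signals, with a covariance over time:

    \[
      \operatorname{cov}\bigl(f_l(t),\, f_m(t')\bigr) \;=\; K^{\mathrm{task}}_{lm}\, k_t(t, t'),
    \]

where \(K^{\mathrm{task}}\) is a positive semidefinite matrix of inter-signal similarities and \(k_t\) is a temporal covariance function; the fitted entries of \(K^{\mathrm{task}}\) and the hyperparameters of \(k_t\) are the kind of learned quantities that can then be reused as features for a downstream classifier.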
ROOT - A C++ Framework for Petabyte Data Storage, Statistical Analysis and Visualization
ROOT is an object-oriented C++ framework conceived in the high-energy physics
(HEP) community, designed for storing and analyzing petabytes of data in an
efficient way. Any instance of a C++ class can be stored into a ROOT file in a
machine-independent compressed binary format. In ROOT the TTree object
container is optimized for statistical data analysis over very large data sets
by using vertical data storage techniques. These containers can span a large
number of files on local disks, the web, or a number of different shared file
systems. To analyze this data, the user can choose from a wide set of
mathematical and statistical functions, including linear algebra classes,
numerical algorithms such as integration and minimization, and various methods
for performing regression analysis (fitting). In particular, ROOT offers
packages for complex data modeling and fitting, as well as multivariate
classification based on machine learning techniques. Central to these analysis
tools are the histogram classes, which provide binning of one- and
multi-dimensional data. Results can be saved in high-quality graphical formats
like PostScript and PDF or in bitmap formats like JPG or GIF. The result can
also be stored as ROOT macros that allow the graphics to be fully recreated and
reworked. Users typically create their analysis macros step by step, making use
of the interactive C++ interpreter CINT, while running over small data samples.
Once the development is finished, they can run these macros at full compiled
speed over large data sets, using on-the-fly compilation, or by creating a
stand-alone batch program. Finally, if processing farms are available, the user
can reduce the execution time of intrinsically parallel tasks - e.g. data
mining in HEP - by using PROOF, which will take care of optimally distributing
the work over the available resources in a transparent way.
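As a minimal sketch of the workflow described above (assuming a standard ROOT installation; the file name, histogram name, and binning are arbitrary choices for illustration), a macro might fill a histogram, fit it, and persist the result to a ROOT file:

    // Save as example.C and run with: root -l example.C
    #include "TFile.h"
    #include "TH1F.h"
    #include "TRandom3.h"

    void example() {
        TFile f("output.root", "RECREATE");              // compressed, machine-independent ROOT file
        TH1F h("h", "Gaussian sample;x;entries", 100, -5.0, 5.0);
        TRandom3 rng(0);
        for (int i = 0; i < 10000; ++i)
            h.Fill(rng.Gaus(0.0, 1.0));                  // fill with pseudo-random data
        h.Fit("gaus");                                   // fit with the built-in Gaussian model
        h.Write();                                       // persist the histogram (with fit) to the file
        f.Close();
    }

The same macro can be run interpreted for quick iteration on a small sample, or compiled on the fly (for example via ACLiC) once the analysis has stabilized, matching the development cycle described in the abstract.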
