Prospects for intermediate mass black hole binary searches with advanced gravitational-wave detectors
We estimated the sensitivity of the upcoming advanced, ground-based
gravitational-wave observatories (the upgraded LIGO and Virgo and the KAGRA
interferometers) to coalescing intermediate mass black hole binaries (IMBHB).
We added waveforms modeling the gravitational radiation emitted by IMBHBs to
detectors' simulated data and searched for the injected signals with the
coherent WaveBurst algorithm. The tested parameter space covers non-spinning
IMBHBs with source-frame total masses between 50 and 1050 solar masses
and mass ratios up to 1. We found that
advanced detectors could be sensitive to these systems up to a range of a few
Gpc. A theoretical model was adopted to estimate the expected observation
rates, yielding up to a few tens of events per year. Thus, our results indicate
that advanced detectors will have a reasonable chance to collect the first
direct evidence for intermediate mass black holes and open a new, intriguing
channel for probing the Universe over cosmological scales.
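The abstract's observation-rate claim follows from a simple volume argument. The sketch below is an illustrative back-of-the-envelope calculation with made-up numbers (the range and merger-rate density are assumptions for the example, not values from the paper):

```python
import math

# Expected detection rate = merger-rate density * surveyed comoving volume.
# Units: range in Gpc, density in mergers per Gpc^3 per year.
def expected_rate(range_gpc, density_per_gpc3_yr):
    volume = 4.0 / 3.0 * math.pi * range_gpc ** 3   # Gpc^3 surveyed
    return density_per_gpc3_yr * volume             # events per year

# e.g. a 2 Gpc range and 1 merger / Gpc^3 / yr gives a few tens of events/yr
rate = expected_rate(2.0, 1.0)
print(round(rate, 1))  # 33.5
```

This illustrates how a sensitive range of "a few Gpc" can translate into "up to a few tens of events per year" for plausible rate densities.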
Enhancing the significance of gravitational wave bursts through signal classification
The quest to observe gravitational waves challenges our ability to
discriminate signals from detector noise. This issue is especially relevant for
transient gravitational-wave searches that take a robust, eyes-wide-open
approach: the so-called all-sky burst searches. Here we show how signal
classification methods inspired by broad astrophysical characteristics can be
implemented in all-sky burst searches while preserving their generality. In our
case study, we apply a multivariate analysis based on artificial neural
networks to classify waves
emitted in compact binary coalescences. We enhance by orders of magnitude the
significance of signals belonging to this broad astrophysical class against the
noise background. Alternatively, at a given level of mis-classification of
noise events, we can detect about 1/4 more of the total signal population. We
also show that a more general strategy of signal classification can actually be
performed, by testing the ability of artificial neural networks in
discriminating different signal classes. The possible impact on future
observations by the LIGO-Virgo network of detectors is discussed by analysing
recoloured noise from previous LIGO-Virgo data with coherent WaveBurst, one of
the flagship pipelines dedicated to all-sky searches for transient
gravitational waves.
On the expressiveness of forwarding in higher-order communication
Abstract. In higher-order process calculi the values exchanged in communications may contain processes. There are only two capabilities for received processes: execution and forwarding. Here we propose a limited form of forwarding: output actions can only communicate the parallel composition of statically known closed processes and processes received through previously executed input actions. We study the expressiveness of a higher-order process calculus featuring this style of communication. Our main result shows that in this calculus termination is decidable while convergence is undecidable.
An overview of the ciao multiparadigm language and program development environment and its design philosophy
We describe some of the novel aspects and motivations behind
the design and implementation of the Ciao multiparadigm programming system. An important aspect of Ciao is that it provides the programmer with a large number of useful features from different programming paradigms and styles, and that the use of each of these features can be turned on and off at will for each program module. Thus, a given module may be using e.g. higher order functions and constraints, while another module may be using objects, predicates, and concurrency. Furthermore, the language is designed to be extensible in a simple and modular way. Another important aspect of Ciao is its programming environment, which provides a powerful preprocessor (with an associated assertion language) capable of statically finding non-trivial bugs, verifying that programs comply with specifications, and performing many types of program optimizations. Such optimizations produce code that is highly competitive with other dynamic languages or, when the highest levels of optimization are used, even with that of static languages, all while retaining the interactive development environment of a dynamic language. The environment also includes a powerful auto-documenter. The paper provides an informal overview of the language and program development environment. It aims at illustrating the design philosophy rather than at being exhaustive, which would be impossible in the format of a paper, pointing instead to the existing literature on the system.
Continuation-Passing C: compiling threads to events through continuations
In this paper, we introduce Continuation Passing C (CPC), a programming
language for concurrent systems in which native and cooperative threads are
unified and presented to the programmer as a single abstraction. The CPC
compiler uses a compilation technique, based on the CPS transform, that yields
efficient code and an extremely lightweight representation for contexts. We
provide a proof of the correctness of our compilation scheme. We show in
particular that lambda-lifting, a common compilation technique for functional
languages, is also correct in an imperative language like C, under some
conditions enforced by the CPC compiler. The current CPC compiler is mature
enough to write substantial programs such as Hekate, a highly concurrent
BitTorrent seeder. Our benchmark results show that CPC is as efficient, while
using significantly less space, as the most efficient thread libraries
available. (Published in Higher-Order and Symbolic Computation, 2012.)
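The CPS transform at the heart of CPC splits a computation at its blocking points into continuation functions, so that "the rest of the program" becomes an ordinary value. The sketch below illustrates the idea in Python rather than C, and is much simplified compared with what the CPC compiler actually emits:

```python
# Direct style: the result is returned to an implicit caller.
def add_direct(a, b):
    return a + b

# CPS: instead of returning, pass the result to an explicit continuation k.
def add_cps(a, b, k):
    k(a + b)

def square_cps(x, k):
    k(x * x)

# a*a + b*b with every intermediate step in continuation-passing style;
# each lambda reifies "what remains to be done" as a value.
def sum_of_squares_cps(a, b, k):
    square_cps(a, lambda a2:
        square_cps(b, lambda b2:
            add_cps(a2, b2, k)))

result = []
sum_of_squares_cps(3, 4, result.append)
print(result[0])  # 25
```

Because the continuation is an ordinary value, a scheduler can park it and resume it later, which is how a CPS-based compiler can represent thread contexts so cheaply.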
A formally verified compiler back-end
This article describes the development and formal verification (proof of
semantic preservation) of a compiler back-end from Cminor (a simple imperative
intermediate language) to PowerPC assembly code, using the Coq proof assistant
both for programming the compiler and for proving its correctness. Such a
verified compiler is useful in the context of formal methods applied to the
certification of critical software: the verification of the compiler guarantees
that the safety properties proved on the source code hold for the executable
compiled code as well.
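Semantic preservation, the property the Coq development proves, can be illustrated with a toy analogue (this is an illustration, not CompCert): compile arithmetic expressions to a small stack machine and check that running the compiled code matches direct evaluation. The proof assistant establishes this for all programs; a test can only check instances:

```python
# expressions: ('lit', n) | ('add', e1, e2) | ('mul', e1, e2)
def evaluate(e):
    tag = e[0]
    if tag == 'lit': return e[1]
    if tag == 'add': return evaluate(e[1]) + evaluate(e[2])
    if tag == 'mul': return evaluate(e[1]) * evaluate(e[2])

def compile_expr(e):
    # post-order emission: operands first, then the operator
    tag = e[0]
    if tag == 'lit': return [('push', e[1])]
    return compile_expr(e[1]) + compile_expr(e[2]) + [(tag, None)]

def run(code):
    stack = []
    for op, arg in code:
        if op == 'push':
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if op == 'add' else a * b)
    return stack[0]

# semantic preservation on one instance: 2 + 3*4
e = ('add', ('lit', 2), ('mul', ('lit', 3), ('lit', 4)))
print(evaluate(e), run(compile_expr(e)))  # 14 14
```

The gap between "tested on instances" and "proved for all inputs" is exactly what motivates doing the compiler and its correctness proof inside Coq.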
Partial derivative automata formalized in Coq
In this paper we present a computer assisted proof of the correctness of a partial derivative automata construction from a regular expression within the Coq proof assistant. This proof is part of a formalization of Kleene algebra and regular languages in Coq towards their usage in program certification.
Funding: Fundação para a Ciência e a Tecnologia (FCT), Program POSI; RESCUE (PTDC/EIA/65862/2006); SFRH/BD/33233/2007.
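The partial-derivative (Antimirov) construction that the paper formalizes in Coq can be sketched in plain Python. The encoding of regular expressions below is an assumption made for the example; the Coq development works with a formal datatype and proves the construction correct rather than testing it:

```python
# Regexes: '0' (empty set), '1' (empty word), ('sym', a),
# ('alt', r, s), ('cat', r, s), ('star', r).

def nullable(r):
    if r == '1': return True
    if r == '0' or r[0] == 'sym': return False
    if r[0] == 'alt': return nullable(r[1]) or nullable(r[2])
    if r[0] == 'cat': return nullable(r[1]) and nullable(r[2])
    return True  # star

def cat(r, s):
    # smart constructor: 1 . s == s
    return s if r == '1' else ('cat', r, s)

def pd(r, a):
    """Set of partial derivatives of r with respect to symbol a."""
    if r in ('0', '1'): return set()
    if r[0] == 'sym': return {'1'} if r[1] == a else set()
    if r[0] == 'alt': return pd(r[1], a) | pd(r[2], a)
    if r[0] == 'cat':
        out = {cat(p, r[2]) for p in pd(r[1], a)}
        if nullable(r[1]):
            out |= pd(r[2], a)
        return out
    return {cat(p, r) for p in pd(r[1], a)}  # star

def matches(r, word):
    # the reachable partial derivatives are the states of the automaton
    states = {r}
    for a in word:
        states = {q for s in states for q in pd(s, a)}
    return any(nullable(s) for s in states)

# (a|b)* a matches words over {a, b} ending in 'a'
r = ('cat', ('star', ('alt', ('sym', 'a'), ('sym', 'b'))), ('sym', 'a'))
print(matches(r, 'ba'), matches(r, 'ab'))  # True False
```

Since every reachable state is itself a regular expression, the finitely many partial derivatives form the states of a nondeterministic automaton recognizing the same language.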
Phenothiazine-mediated rescue of cognition in tau transgenic mice requires neuroprotection and reduced soluble tau burden
Abstract. Background: It has traditionally been thought that the pathological accumulation of tau in Alzheimer's disease and other tauopathies facilitates neurodegeneration, which in turn leads to cognitive impairment. However, recent evidence suggests that tau tangles are not the entity responsible for memory loss; rather, it is an intermediate tau species that disrupts neuronal function. Thus, efforts to discover therapeutics for tauopathies emphasize soluble tau reductions as well as neuroprotection. Results: Here, we found that neuroprotection alone, caused by methylene blue (MB), the parent compound of the anti-tau phenothiazine drug Rember™, was insufficient to rescue cognition in a mouse model of the human tauopathies progressive supranuclear palsy (PSP) and fronto-temporal dementia with parkinsonism linked to chromosome 17 (FTDP-17). Only when levels of soluble tau protein were concomitantly reduced by a very high concentration of MB was cognitive improvement observed. Thus, neurodegeneration can be decoupled from tau accumulation, but phenotypic improvement is only possible when soluble tau levels are also reduced. Conclusions: Neuroprotection alone is not sufficient to rescue tau-induced memory loss in a transgenic mouse model. Development of neuroprotective agents is an area of intense investigation in the tauopathy drug discovery field. This may ultimately be an unsuccessful approach if soluble toxic tau intermediates are not also reduced. Thus, MB and related compounds, despite their pleiotropic nature, may be the proverbial "magic bullet" because they not only are neuroprotective, but are also able to facilitate soluble tau clearance. Moreover, this shows that neuroprotection is possible without reducing tau levels.
This indicates that there is a definitive molecular link between tau and cell death cascades that can be disrupted.
Low and decreasing vaccine effectiveness against influenza A(H3) in 2011/12 among vaccination target groups in Europe: results from the I-MOVE multicentre case-control study
Within the Influenza Monitoring Vaccine Effectiveness in Europe (I-MOVE) project we conducted a multicentre case–control study in eight European Union (EU) Member States to estimate the 2011/12 influenza vaccine effectiveness against medically attended influenza-like illness (ILI) laboratory-confirmed as influenza A(H3) among the vaccination target groups. Practitioners systematically selected ILI / acute respiratory infection patients to swab within seven days of symptom onset. We restricted the study population to those meeting the EU ILI case definition and compared influenza A(H3) positive to influenza laboratory-negative patients. We used logistic regression with study site as fixed effect and calculated adjusted influenza vaccine effectiveness (IVE), controlling for potential confounders (age group, sex, month of symptom onset, chronic diseases and related hospitalisations, number of practitioner visits in the previous year). Adjusted IVE was 25% (95% confidence interval (CI): -6 to 47) among all ages (n=1,014), 63% (95% CI: 26 to 82) in adults aged between 15 and 59 years and 15% (95% CI: -33 to 46) among those aged 60 years and above. Adjusted IVE was 38% (95% CI: -8 to 65) in the early influenza season (up to week 6 of 2012) and -1% (95% CI: -60 to 37) in the late phase. The results suggested a low adjusted IVE in 2011/12. The lower IVE in the late season could be due to virus changes through the season or waning immunity. Virological surveillance should be enhanced to quantify change over time and understand its relation with duration of immunological protection. Seasonal influenza vaccines should be improved to achieve acceptable levels of protection.
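The estimator behind this test-negative design can be shown in a few lines: vaccine effectiveness is one minus the odds ratio of vaccination among cases (influenza-positive) versus controls (test-negative). The counts below are made up for the example; the study's estimates come from an adjusted logistic regression, not this crude 2x2 calculation:

```python
# Crude vaccine effectiveness from a 2x2 table (hypothetical counts).
def vaccine_effectiveness(vacc_cases, unvacc_cases,
                          vacc_controls, unvacc_controls):
    odds_ratio = ((vacc_cases / unvacc_cases)
                  / (vacc_controls / unvacc_controls))
    return (1.0 - odds_ratio) * 100.0  # percent

# e.g. 30/90 vaccinated/unvaccinated cases, 60/120 among controls
ve = vaccine_effectiveness(30, 90, 60, 120)
print(round(ve, 1))  # 33.3
```

Adjusting for confounders (age group, sex, onset month, and so on) replaces this direct odds ratio with the exponentiated vaccination coefficient from a logistic model, which is why the study fits a regression rather than tabulating counts.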
