Simulating all non-signalling correlations via classical or quantum theory with negative probabilities
Many-party correlations between measurement outcomes in general probabilistic
theories are given by conditional probability distributions obeying the
non-signalling condition. We show that any such distribution can be obtained
from classical or quantum theory, by relaxing positivity constraints on either
the mixed state shared by the parties, or the local functions which generate
measurement outcomes. Our results apply to generic non-signalling correlations,
but in particular they yield two distinct quasi-classical models for quantum
correlations. Comment: 6 pages
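To make the classical quasi-model concrete (an illustrative sketch, not taken from the paper), the Python snippet below writes the PR box, a non-signalling correlation that is neither classical nor quantum, as an affine combination of the 16 local deterministic boxes of the two-party, binary-input/binary-output scenario. The weights sum to one but some are negative, which is precisely the relaxation of positivity on the shared state described above.

```python
import numpy as np
from itertools import product

# The four deterministic single-party strategies f(z) = (c + s*z) mod 2:
# constant 0, constant 1, copy the input bit, negate the input bit.
strategies = [lambda z, c=c, s=s: (c + s * z) % 2 for c, s in product([0, 1], repeat=2)]

def box_vector(p):
    """Flatten a conditional distribution p(a, b | x, y) into a length-16 vector."""
    return np.array([p(a, b, x, y) for x, y, a, b in product([0, 1], repeat=4)], float)

# The 16 local deterministic boxes D(a, b | x, y) = [a = f(x)][b = g(y)].
D = np.array([box_vector(lambda a, b, x, y, f=f, g=g: float(a == f(x) and b == g(y)))
              for f, g in product(strategies, repeat=2)]).T

# The PR box: P(a, b | x, y) = 1/2 if a XOR b = x*y, and 0 otherwise.
pr = box_vector(lambda a, b, x, y: 0.5 * ((a ^ b) == x * y))

# Solve for weights that sum to 1: an affine, i.e. quasi-probabilistic, mixture.
A = np.vstack([D, np.ones(16)])
w, *_ = np.linalg.lstsq(A, np.append(pr, 1.0), rcond=None)

print("reconstruction error:", np.linalg.norm(D @ w - pr))  # numerically zero
print("weights sum to:", round(w.sum(), 6))                 # 1.0
print("most negative weight:", round(w.min(), 3))           # strictly negative
```

Because the PR box lies outside the local polytope, at least one weight in any such decomposition must be negative; the paper shows that this kind of construction extends to generic many-party non-signalling correlations, on both the classical and the quantum side.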
Microscopic colitis
This issue of eMedRef provides information to clinicians on the pathophysiology, diagnosis, and therapeutics of microscopic colitis
Information causality from an entropic and a probabilistic perspective
The information causality principle is a generalisation of the no-signalling
principle which implies some of the known restrictions on quantum correlations.
But despite its clear physical motivation, information causality is formulated
in terms of a rather specialised game and figure of merit. We explore different
perspectives on information causality, discussing the probability of success as
the figure of merit, a relation between information causality and the non-local
`inner-product game', and the derivation of a quadratic bound for these games.
We then examine an entropic formulation of information causality with which one
can obtain the same results, arguably in a simpler fashion. Comment: 7 pages, v2: some references added and minor improvements
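For orientation (the standard formulation, not necessarily the notation of this paper): Alice holds N independent uniformly random bits a_1, ..., a_N and may send Bob a single m-bit classical message; asked for the k-th bit, Bob outputs a guess beta_k. Information causality demands

\[
  \sum_{k=1}^{N} I(a_k : \beta_k) \;\le\; m .
\]

The probability-of-success figure of merit discussed above replaces the mutual informations by the guessing probabilities P(beta_k = a_k); in the original information-causality argument, a quadratic bound of the form E_1^2 + E_2^2 <= 1 on the biases E_k = 2 P(beta_k = a_k) - 1 is what recovers the Tsirelson bound.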
Reversible Dynamics in Strongly Non-Local Boxworld Systems
In order to better understand the structure of quantum theory, or speculate
about theories that may supersede it, it can be helpful to consider alternative
physical theories. ``Boxworld'' describes one such theory, in which all
non-signaling correlations are achievable. In a limited class of multipartite
Boxworld systems - wherein all subsystems are identical and all measurements
have the same number of outcomes - it has been demonstrated that the set of
reversible dynamics is `trivial', generated solely by local relabellings and
permutations of subsystems. We develop the convex formalism of Boxworld to give
an alternative proof of this result, then extend this proof to all multipartite
Boxworld systems, and discuss the potential relevance to other theories. These
results lend further support to the idea that the rich reversible dynamics in
quantum theory may be the key to understanding its structure and its
informational capabilities. Comment: 5 pages + appendices
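As a small, self-contained illustration of the "local relabellings" part of this result (a single-system sketch assuming numpy; it is not the paper's convex-formalism proof and says nothing about the multipartite case): a Boxworld gbit with two binary measurements is described by the point (p(0|x=0), p(0|x=1)) in the unit square, and closing the relabellings of measurements and outcomes under composition yields only the eight symmetries of that square.

```python
import numpy as np
from itertools import product

# A single Boxworld "gbit": state v = (p(0|x=0), p(0|x=1)), a point in the unit
# square. Relabelling operations act on v as affine maps v -> M @ v + c.
gens = [
    (np.diag([-1, 1]), np.array([1, 0])),       # flip the outcomes of measurement 0
    (np.diag([1, -1]), np.array([0, 1])),       # flip the outcomes of measurement 1
    (np.array([[0, 1], [1, 0]]), np.zeros(2)),  # relabel (swap) the two measurements
]

def compose(t1, t2):
    """Affine composition t1 after t2."""
    (M1, c1), (M2, c2) = t1, t2
    return (M1 @ M2, M1 @ c2 + c1)

def key(t):
    M, c = t
    return (tuple(np.rint(M).astype(int).ravel()), tuple(np.rint(c).astype(int)))

# Close the identity and the generators under composition.
identity = (np.eye(2), np.zeros(2))
group = {key(t): t for t in gens + [identity]}
changed = True
while changed:
    changed = False
    for t1, t2 in product(list(group.values()), repeat=2):
        t = compose(t1, t2)
        if key(t) not in group:
            group[key(t)] = t
            changed = True

print(len(group))  # 8: the dihedral symmetries of the square, i.e. relabellings only
```

Any reversible dynamics must map the state space onto itself, so for a lone gbit only these eight maps are available; the paper's result is that, even for arbitrary multipartite Boxworld systems, nothing beyond such relabellings and permutations of subsystems ever appears.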
Development of a Priest interferometer for measurement of the thermal expansion of a graphite epoxy in the temperature range 116-366 K
The thermal expansion behavior of graphite-epoxy laminates between 116 K and 366 K was investigated using an implementation of the Priest interferometer concept. The design, construction, and use of the interferometer are described, along with the experimental results it was used to generate. The experimental program consisted of 25 tests on 25.4 mm and 6.35 mm wide, 8-ply pi/4 quasi-isotropic T300-5208 graphite/epoxy specimens and 3 tests on a 25.4 mm wide unidirectional specimen. Experimental results are presented for all tests along with a discussion of the interferometer's limitations and some possible improvements in its design
Lee Short
Lee Short is a member of the Class of 1944 and served in a variety of campus offices including Registrar, Director of Admissions and head of the Development Office. He retired in 1978
Thermal expansion of graphite-epoxy between 116 K and 366 K
A Priest laser interferometer was developed to measure the thermal strain of composite laminates. The salient features of this interferometer are that: (1) it operates between 116 K and 366 K; (2) it is easy to operate; (3) minimum specimen preparation is required; (4) coefficients of thermal expansion in the range of 0-5 micro epsilon/K can be measured; and (5) the resolution of thermal strain is on the order of 1 micro epsilon. The thermal response of quasi-isotropic, T300/5208, graphite-epoxy composite material was studied with this interferometer. The study showed that: (1) for the material tested, thermal cycling effects are negligible; (2) variability of thermal response from specimen to specimen may become significant at cryogenic temperatures; and (3) the thermal response of 0.6 cm and 2.5 cm wide specimens is the same above room temperature
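For a sense of scale (a back-of-the-envelope figure using only the numbers quoted above, not a measured result), the upper end of the quoted CTE range accumulated over the full temperature span corresponds to

\[
  \varepsilon \;=\; \alpha\,\Delta T
  \;=\; 5\ \mu\varepsilon/\mathrm{K} \times (366 - 116)\,\mathrm{K}
  \;=\; 1250\ \mu\varepsilon \;\approx\; 0.13\,\%,
\]

so a strain resolution of order 1 micro epsilon resolves better than one part in a thousand of even this worst-case total.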
Electrode level Monte Carlo model of radiation damage effects on astronomical CCDs
Current optical space telescopes rely upon silicon Charge Coupled Devices
(CCDs) to detect and image the incoming photons. The performance of a CCD
detector depends on its ability to transfer electrons through the silicon
efficiently, so that the signal from every pixel may be read out through a
single amplifier. This process of electron transfer is highly susceptible to
the effects of solar proton damage (or non-ionizing radiation damage). This is
because charged particles passing through the CCD displace silicon atoms,
introducing energy levels into the semiconductor bandgap which act as
localized electron traps. The reduction in Charge Transfer Efficiency (CTE)
leads to signal loss and image smearing. The European Space Agency's
astrometric Gaia mission will make extensive use of CCDs to create the most
complete and accurate stereoscopic map to date of the Milky Way. In the context
of the Gaia mission, CTE is usually discussed in terms of the complementary quantity Charge
Transfer Inefficiency (CTI = 1 - CTE). CTI is an extremely important issue that
threatens Gaia's performance. We present here a detailed Monte Carlo model
which has been developed to simulate the operation of a damaged CCD at the
pixel electrode level. This model implements a new approach to both the charge
density distribution within a pixel and the charge capture and release
probabilities, which allows the reproduction of CTI effects on a variety of
measurements over a large signal-level range, in particular for signals of the
order of a few electrons. A running version of the model, as well as brief
documentation and a few examples are readily available at
http://www.strw.leidenuniv.nl/~prodhomme/cemga.php as part of the CEMGA java
package (CTI Effects Models for Gaia). Comment: Accepted by MNRAS on 13 February 2011. 15 pages, 7 figures and 5
tables
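The trapping physics underlying such a model can be caricatured in a few lines. The toy below (illustrative only; it is not the CEMGA code, ignores the electrode-level charge-density distribution that is the paper's main contribution, and uses fixed, made-up capture and release probabilities per transfer) clocks a single charge packet through a damaged column and shows the two signatures of CTI: charge lost from the packet and a release trail behind it.

```python
import numpy as np

rng = np.random.default_rng(1)

# Deliberately simplified toy parameters (not values from the paper).
n_pix = 200                    # transfers between the charge packet and the readout node
n_trail = 100                  # extra transfers recorded to show the release trail
traps_per_pixel = 2            # active traps per pixel seen by the charge cloud
t_shift = 1e-4                 # dwell time per transfer [s]
tau_c, tau_r = 5e-5, 3e-3      # capture and release time constants [s]
p_cap = 1 - np.exp(-t_shift / tau_c)   # probability an empty trap captures an electron
p_rel = 1 - np.exp(-t_shift / tau_r)   # probability a filled trap releases its electron

column = np.zeros(n_pix, int)
column[-1] = 1000              # one charge packet (electrons), farthest from readout
filled = np.zeros(n_pix, int)  # filled traps in each (fixed) pixel of the column
readout = []

for _ in range(n_pix + n_trail):
    released = rng.binomial(filled, p_rel)   # traps re-emit into the charge now present
    column += released
    filled -= released
    captured = np.minimum(rng.binomial(traps_per_pixel - filled, p_cap), column)
    column -= captured                       # traps remove charge from the passing packet
    filled += captured
    readout.append(column[0])                # charge entering the readout node
    column = np.roll(column, -1)             # clock every packet one pixel along
    column[-1] = 0

peak = n_pix - 1
print("electrons in the signal packet at readout:", readout[peak])
print("electrons in the ten pixels trailing it  :", sum(readout[peak + 1:peak + 11]))
```

The packet arrives attenuated and the captured charge re-emerges behind it, which is the signal loss and image smearing described above; the paper's model, by contrast, derives the capture and release probabilities from the charge density distribution within the pixel, which is what lets it reproduce signals of only a few electrons.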
Hard limits on the postselectability of optical graph states
Coherent control of large entangled graph states enables a wide variety of
quantum information processing tasks, including error-corrected quantum
computation. The linear optical approach offers excellent control and
coherence, but today most photon sources and entangling gates---required for
the construction of large graph states---are probabilistic and rely on
postselection. In this work, we provide proofs and heuristics to aid
experimental design using postselection. We derive a fundamental limitation on
the generation of photonic qubit states using postselected entangling gates:
experiments which contain a cycle of postselected gates cannot be postselected.
Further, we analyse experiments that use photons from postselected photon pair
sources, and lower bound the number of classes of graph state entanglement that
are accessible in the non-degenerate case---graph state entanglement classes
that contain a tree are always accessible. Numerical investigation up to
9-qubits shows that the proportion of graph states that are accessible using
postselection diminishes rapidly. We provide tables showing which classes are
accessible for a variety of up to nine qubit resource states and sources. We
also use our methods to evaluate near-term multi-photon experiments, and
provide our algorithms for doing so. Comment: Our manuscript comprises 4843 words, 6 figures, 1 table, 47
references, and a supplementary material of 1741 words, 2 figures, 1 table,
and a Mathematica code listing
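The cycle rule quoted above lends itself to a mechanical check. The sketch below applies only that stated necessary condition (it is not the paper's full entanglement-class analysis; the function name is ours and the networkx library is assumed): the postselected entangling gates of a proposed experiment are treated as edges of a graph on the photonic qubits, and any experiment whose gates contain a cycle is flagged as not postselectable.

```python
import networkx as nx

def passes_cycle_criterion(num_qubits, postselected_gates):
    """Necessary condition stated in the abstract above: an experiment whose
    postselected entangling gates contain a cycle cannot be postselected.
    postselected_gates is a list of (qubit_i, qubit_j) pairs."""
    g = nx.Graph()
    g.add_nodes_from(range(num_qubits))
    g.add_edges_from(postselected_gates)
    return nx.is_forest(g)   # an acyclic gate graph is not ruled out by this test

# A 4-qubit linear cluster built entirely from postselected gates: no cycle.
print(passes_cycle_criterion(4, [(0, 1), (1, 2), (2, 3)]))           # True

# A 4-qubit ring ("box") graph state attempted with postselected gates only:
# the gates themselves form a cycle, so the experiment cannot be postselected.
print(passes_cycle_criterion(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))   # False
```

Passing this test does not guarantee accessibility; it only means the experiment is not excluded by the cycle result, which is why the paper supplements it with the analysis of postselected pair sources and the numerical classification up to nine qubits.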
Stochastic Event Reconstruction of Atmospheric Contaminant Dispersion Using Bayesian Inference
Environmental sensors have been deployed in various cities for early detection of contaminant releases into the atmosphere. Event reconstruction and improved dispersion modeling capabilities are needed to estimate the extent of contamination, which is required to implement effective strategies in emergency management. To this end, a stochastic event reconstruction capability that can process information from an environmental sensor network is developed. A probability model is proposed to take into account both zero and non-zero concentration measurements that can be available from a sensor network because of a sensor’s specified limit of detection. The inference is based on the Bayesian paradigm with Markov chain Monte Carlo (MCMC) sampling. Fast-running Gaussian plume dispersion models are adopted as the forward model in the Bayesian inference approach to achieve rapid-response event reconstructions. The Gaussian plume model is substantially enhanced by introducing stochastic parameters in its turbulent diffusion parameterizations and estimating them within the Bayesian inference framework. Additionally, parameters of the likelihood function are estimated in a principled way using data and prior probabilities to avoid tuning in the overall method. The event reconstruction method is successfully validated for both real and synthetic dispersion problems, and posterior distributions of the model parameters are used to generate probabilistic plume envelopes with specified confidence levels to aid emergency decisions
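To make the moving parts concrete, here is a minimal sketch of the same pattern (not the authors' implementation: it uses a bare ground-level Gaussian plume with power-law diffusion curves, a plain Gaussian likelihood with no treatment of below-detection-limit readings, flat priors on a box, and random-walk Metropolis sampling; every name and parameter value is illustrative). One stochastic scale factor on the diffusion parameterization, zeta, is inferred alongside the source location and release rate.

```python
import numpy as np

rng = np.random.default_rng(0)

def plume(theta, sensors, u=3.0):
    """Ground-level Gaussian plume concentration at the sensor locations.
    theta = (xs, ys, logQ, zeta): source position, log release rate, and a
    stochastic scale factor on the turbulent-diffusion parameterization."""
    xs, ys, logQ, zeta = theta
    dx = sensors[:, 0] - xs                  # downwind distance (wind along +x)
    dy = sensors[:, 1] - ys                  # crosswind offset
    dx = np.where(dx > 1.0, dx, np.nan)      # sensors upwind of the source see nothing
    sig_y = zeta * 0.22 * dx**0.9            # illustrative power-law diffusion curves
    sig_z = zeta * 0.12 * dx**0.9
    c = np.exp(logQ) / (np.pi * u * sig_y * sig_z) * np.exp(-dy**2 / (2 * sig_y**2))
    return np.nan_to_num(c)

def log_posterior(theta, sensors, obs, noise=0.05):
    xs, ys, logQ, zeta = theta
    if not (0 < xs < 500 and -200 < ys < 200 and 0 < logQ < 12 and 0.3 < zeta < 3):
        return -np.inf                       # flat priors on a bounded box
    resid = obs - plume(theta, sensors)
    return -0.5 * np.sum((resid / noise) ** 2)

# Synthetic test: a sensor grid and data generated from a known source plus noise.
sensors = np.array([(x, y) for x in (100, 200, 300, 400) for y in (-60, -20, 20, 60)], float)
true_theta = (50.0, 10.0, 7.0, 1.2)
obs = plume(true_theta, sensors) + rng.normal(0.0, 0.05, len(sensors))

# Random-walk Metropolis sampling of the posterior over source and plume parameters.
theta = np.array([250.0, 0.0, 5.0, 1.0])
lp = log_posterior(theta, sensors, obs)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, [10.0, 5.0, 0.2, 0.05])
    lp_prop = log_posterior(prop, sensors, obs)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta.copy())

posterior = np.array(chain[5000:])           # discard burn-in
print("posterior mean (xs, ys, logQ, zeta):", posterior.mean(axis=0).round(2))
print("true values                        :", true_theta)
```

The retained samples stand in for the reconstructed event: their marginals estimate the source parameters, and pushing them back through the forward model is how probabilistic plume envelopes at a chosen confidence level can be drawn.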
- …
