On the communication cost of entanglement transformations
We study the amount of communication needed for two parties to transform some
given joint pure state into another one, either exactly or with some fidelity.
Specifically, we present a method to lower bound this communication cost even
when the amount of entanglement does not increase. Moreover, the bound applies
even if the initial state is supplemented with unlimited entanglement in the
form of EPR pairs, and the communication is allowed to be quantum mechanical.
We then apply the method to the determination of the communication cost of
asymptotic entanglement concentration and dilution. While concentration is
known to require no communication whatsoever, the best known protocol for
dilution, discovered by Lo and Popescu [Phys. Rev. Lett. 83(7):1459-1462,
1999], requires a number of bits to be exchanged which is of the order of the
square root of the number of EPR pairs. Here we prove a matching lower bound of
the same asymptotic order, demonstrating the optimality of the Lo-Popescu
protocol up to a constant factor and establishing the existence of a
fundamental asymmetry between the concentration and dilution tasks.
We also discuss states for which the minimal communication cost is
proportional to their entanglement, such as the states recently introduced in
the context of "embezzling entanglement" [W. van Dam and P. Hayden,
quant-ph/0201041].
Comment: 9 pages, 1 figure. Added a reference and some further explanations.
In v3 some arguments are given in more detail.
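A compact restatement of the asymmetry established above, written as a hedged LaTeX summary (the Theta-notation framing is ours; the bounds are those quoted in the abstract):

    % Classical communication for the two asymptotic tasks on n EPR pairs:
    % concentration needs no communication; dilution needs order sqrt(n) bits.
    \[
      C_{\mathrm{conc}}(n) = 0,
      \qquad
      C_{\mathrm{dil}}(n) = \Theta\bigl(\sqrt{n}\bigr).
    \]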
Chiroptical studies on brevianamide B: vibrational and electronic circular dichroism confronted
Chiroptical spectroscopies, such as electronic circular dichroism (ECD) and vibrational circular dichroism (VCD), are highly sensitive techniques for probing molecular conformation, configuration, solvation and aggregation. Here we report the application of these techniques to the fungal metabolite brevianamide B. Comparison of the experimental ECD and VCD spectra with their density functional theory (DFT) simulated counterparts establishes that VCD is the more reliable technique for assigning absolute configuration, owing to the larger functional and dispersion dependence of computed ECD spectra. Despite the small amount of available material, and through a relatively unusual use of VCD carbonyl multiplets, the absolute configuration could be reliably assigned, strengthening the case for applying VCD in the study of complex natural products. Spectral and crystallographic evidence for and against the formation of a dimeric aggregate is discussed; in solution, the VCD spectra strongly suggest that only monomeric species are present.
Trading quantum for classical resources in quantum data compression
We study the visible compression of a source E of pure quantum signal states,
or, more formally, the minimal resources per signal required to represent
arbitrarily long strings of signals with arbitrarily high fidelity, when the
compressor is given the identity of the input state sequence as classical
information. According to the quantum source coding theorem, the optimal
quantum rate is the von Neumann entropy S(E) qubits per signal.
We develop a refinement of this theorem in order to analyze the situation in
which the states are coded into classical and quantum bits that are quantified
separately. This leads to a trade-off curve Q(R), where Q(R) qubits per signal
is the optimal quantum rate for a given classical rate of R bits per signal.
Our main result is an explicit characterization of this trade-off function
by a simple formula in terms of only single-signal, perfect-fidelity encodings
of the source. We give a thorough discussion of many further mathematical
properties of our formula, including an analysis of its behavior for group
covariant sources and a generalization to sources with continuously
parameterized states. We also show that our result leads to a number of
corollaries characterizing the trade-off between information gain and state
disturbance for quantum sources. In addition, we indicate how our techniques
also provide a solution to the so-called remote state preparation problem.
Finally, we develop a probability-free version of our main result, which may be
interpreted as an answer to the question: "How many classical bits does a
qubit cost?" This theorem provides a type of dual to Holevo's theorem, insofar
as the latter characterizes the cost of coding classical bits into qubits.
Comment: 51 pages, 7 figures.
The GstLAL Search Analysis Methods for Compact Binary Mergers in Advanced LIGO's Second and Advanced Virgo's First Observing Runs
After their successful first observing run (September 12, 2015 - January 12,
2016), the Advanced LIGO detectors were upgraded to increase their sensitivity
for the second observing run (November 30, 2016 - August 26, 2017). The
Advanced Virgo detector joined the second observing run on August 1, 2017. We
discuss the updates that happened during this period in the GstLAL-based
inspiral pipeline, which is used to detect gravitational waves from the
coalescence of compact binaries in both low-latency and offline
configurations. These updates include deployment of a zero-latency whitening
filter to reduce the overall latency of the pipeline by up to 32 seconds,
incorporation of the Virgo data stream in the analysis, introduction of a
single-detector search to analyze data from the periods when only one of the
detectors is running, addition of new parameters to the likelihood ratio
ranking statistic, increase in the parameter space of the search, and
introduction of a template mass-dependent glitch-excision thresholding method.
Comment: 12 pages, 7 figures, to be submitted to Phys. Rev. D, comments welcome.
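Whitening here means normalizing the detector data by its noise power spectral density so that the noise becomes flat. As a rough generic illustration only (a frequency-domain whitener with invented inputs, not the causal zero-latency filter used in GstLAL):

    import numpy as np

    def whiten(strain, psd, dt):
        """Divide the spectrum by sqrt(PSD) so the noise has unit variance.
        `psd` is the one-sided PSD sampled at the rfft frequencies of `strain`."""
        spectrum = np.fft.rfft(strain)
        white = spectrum / np.sqrt(psd / (2.0 * dt))
        return np.fft.irfft(white, n=len(strain))

    # Hypothetical usage: white Gaussian noise has a flat one-sided PSD of 2*dt,
    # so the output should closely match the input.
    rng = np.random.default_rng(0)
    dt = 1.0 / 4096.0
    strain = rng.standard_normal(4096)
    psd = np.full(len(strain) // 2 + 1, 2.0 * dt)
    whitened = whiten(strain, psd, dt)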
Rule-based Cross-matching of Very Large Catalogs
The NASA Extragalactic Database (NED) has deployed a new rule-based cross-matching algorithm called Match Expert (MatchEx), capable of cross-matching very large catalogs (VLCs) with >10 million objects. MatchEx goes beyond traditional position-based cross-matching algorithms by using other available data together with expert logic to determine which candidate match is the best. Furthermore, the local background density of sources is used to determine and minimize the false-positive match rate and to estimate match completeness. The logical outcome and statistical probability of each match decision are stored in the database and may be used to tune the algorithm and adjust match parameter thresholds. For our first production run, we cross-matched the GALEX All Sky Survey Catalog (GASC), containing nearly 40 million NUV-detected sources, against a directory of 180 million objects in NED. Candidate matches were identified for each GASC source within a 7.5 arcsecond radius. These candidates were filtered on position-based matching probability and on other criteria including object type and object name. We estimate a match completeness of 97.6% and a match accuracy of 99.75%. Over the next year, we will be cross-matching over 2 billion catalog sources to NED, including the Spitzer Source List, the 2MASS point-source catalog, AllWISE, and SDSS DR10. We expect to add new capabilities to filter candidate matches based on photometry, redshifts, and refined object classifications. We will also extend MatchEx to handle more heterogeneous datasets federated from smaller catalogs through NED's literature pipeline.
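The position-based first stage of such a pipeline can be sketched as follows (a minimal illustration using a k-d tree on unit-sphere coordinates, not NED's MatchEx code; the rule-based layer would then rank each candidate using object type, name, and local source density):

    import numpy as np
    from scipy.spatial import cKDTree

    def crossmatch(ra1, dec1, ra2, dec2, radius_arcsec=7.5):
        """Return (i, j) pairs where catalog-2 source j lies within
        radius_arcsec of catalog-1 source i (all coordinates in degrees)."""
        def to_xyz(ra, dec):
            # Unit-sphere Cartesian coordinates avoid RA wraparound issues.
            ra, dec = np.radians(ra), np.radians(dec)
            return np.column_stack((np.cos(dec) * np.cos(ra),
                                    np.cos(dec) * np.sin(ra),
                                    np.sin(dec)))
        tree = cKDTree(to_xyz(ra2, dec2))
        # Chord length on the unit sphere corresponding to the angular radius.
        r_chord = 2.0 * np.sin(np.radians(radius_arcsec / 3600.0) / 2.0)
        candidates = tree.query_ball_point(to_xyz(ra1, dec1), r_chord)
        return [(i, j) for i, js in enumerate(candidates) for j in js]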
Security of quantum bit string commitment depends on the information measure
Unconditionally secure non-relativistic bit commitment is known to be
impossible in both the classical and the quantum world. However, when
committing to a string of n bits at once, how far can we stretch the quantum
limits? In this letter, we introduce a framework of quantum schemes where Alice
commits a string of n bits to Bob, in such a way that she can only cheat on a
bits and Bob can learn at most b bits of information before the reveal phase.
Our results are two-fold: we show by an explicit construction that in the
traditional approach, where the reveal and guess probabilities form the
security criteria, no good schemes can exist: a+b is at least n. If, however,
we use a more liberal criterion of security, the accessible information, we
construct schemes where a=4 log n+O(1) and b=4, which is impossible
classically. Our findings significantly extend known no-go results for quantum
bit commitment.
Comment: To appear in PRL. Short version of quant-ph/0504078, long version to
appear separately. Improved security definition and result, one new lemma
that may be of independent interest. v2: added funding reference, no other
changes.
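The two regimes reported above can be set side by side (the display is ours; the statements are those quoted in the abstract):

    % Commitments to an n-bit string: Alice can cheat on at most a bits,
    % Bob can learn at most b bits before the reveal phase.
    \[
      \text{reveal/guess-probability criterion:}\quad a + b \ge n,
    \]
    \[
      \text{accessible-information criterion:}\quad a = 4\log n + O(1),\ b = 4.
    \]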
Translating Glucose Variability Metrics into the Clinic via Continuous Glucose Monitoring: A Graphical User Interface for Diabetes Evaluation (CGM-GUIDE)
Background: Several metrics of glucose variability have been proposed to date, but an integrated approach that provides a complete and consistent assessment of glycemic variation is missing. As a consequence, and because of the tedious coding necessary during quantification, most investigators and clinicians have not yet adopted the use of multiple glucose variability metrics to evaluate glycemic variation. Methods: We compiled the most extensively used statistical techniques and glucose variability metrics, with adjustable hyper- and hypoglycemic limits and metric parameters, to create a user-friendly Continuous Glucose Monitoring Graphical User Interface for Diabetes Evaluation (CGM-GUIDE). In addition, we introduce and demonstrate a novel transition density profile that emphasizes the dynamics of transitions between defined glucose states. Results: Our combined dashboard of numerical statistics and graphical plots supports the task of providing an integrated approach to describing glycemic variability. We integrated existing metrics, such as SD, area under the curve, and mean amplitude of glycemic excursion, with novel metrics such as the slopes across critical transitions and the transition density profile to assess the severity and frequency of glucose transitions per day as glucose levels move between critical glycemic zones. Conclusions: By presenting the above-mentioned metrics and graphics in a concise aggregate format, CGM-GUIDE provides an easy-to-use tool to compare quantitative measures of glucose variability. This tool can be used by researchers and clinicians to develop new algorithms of insulin delivery for patients with diabetes and to better explore the link between glucose variability and chronic diabetes complications.
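To make the transition density profile concrete, here is a minimal sketch (the zone thresholds and CGM trace are invented; this is not the CGM-GUIDE implementation) that bins a glucose trace into hypoglycemic, normal, and hyperglycemic zones and counts the transitions between them:

    import numpy as np

    def transition_counts(glucose_mgdl, hypo=70, hyper=180):
        """Count transitions between glycemic zones along a CGM trace.
        Returns a 3x3 matrix: rows = from-zone, columns = to-zone
        (0 = hypo, 1 = normal, 2 = hyper)."""
        zones = np.digitize(glucose_mgdl, [hypo, hyper])
        counts = np.zeros((3, 3), dtype=int)
        for a, b in zip(zones[:-1], zones[1:]):
            counts[a, b] += 1
        return counts

    # Hypothetical 5-minute CGM samples (mg/dL):
    trace = np.array([95, 110, 150, 190, 210, 175, 120, 65, 80, 100])
    print(transition_counts(trace))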
An advanced modulation scheme emphasising neutral point ripple suppression using predictive control for three-level NPC converters in aircraft electric starter generator applications
The electrical starter/generator (ESG) system is one of the key innovations of the more-electric aircraft initiative. The ESG cranks the engine and accelerates it up to self-sustained speed using electric energy (starter mode) and then runs as a generator to supply onboard loads. Three-level neutral point clamped (NPC) converters have been identified as a preferable choice for ESG applications due to their high power quality and efficiency. However, the application of the three-level NPC converter in ESG systems has certain challenges, one of which is the low-frequency neutral point voltage ripple, especially in generation mode when running at high speeds where flux-weakening control is required. This paper proposes an advanced modulation scheme which can balance the neutral point voltage over the full range of speeds and loading conditions. Using the proposed technique, zero neutral point voltage deviation within each switching period is achieved by introducing a sharing factor computed in a deadbeat predictive approach. The proposed technique is validated with simulation results.
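The deadbeat idea can be sketched generically (a hedged illustration under assumed sign conventions, not the paper's algorithm): given the neutral point currents drawn by the two redundant states of a small vector, choose the dwell-time sharing factor so the average neutral point current over the switching period is zero.

    def sharing_factor(i_np_pos, i_np_neg):
        """Deadbeat split of a small vector's dwell time between its two
        redundant states so that  x*i_np_pos + (1 - x)*i_np_neg = 0.
        Returns x clamped to [0, 1]; exact cancellation requires the two
        currents to have opposite signs."""
        denom = i_np_pos - i_np_neg
        if denom == 0:
            return 0.5  # identical currents: split evenly
        x = -i_np_neg / denom
        return min(max(x, 0.0), 1.0)

    # Hypothetical: the positive state injects +3 A, the negative state -1 A.
    print(sharing_factor(3.0, -1.0))  # 0.25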
Outbreak of Fatal Childhood Lead Poisoning Related to Artisanal Gold Mining in Northwestern Nigeria, 2010.
Background: In May 2010, a team of national and international organizations was assembled to investigate children's deaths due to lead poisoning in villages in northwestern Nigeria. Objectives: To determine the cause of the childhood lead poisoning outbreak, investigate risk factors for child mortality, and identify children aged <5 years in need of emergency chelation therapy for lead poisoning. Methods: We administered a cross-sectional, door-to-door questionnaire in two affected villages, collected blood samples from children aged 2-59 months, and collected soil samples from family compounds. Descriptive and bivariate analyses were performed with survey, blood-lead, and environmental data. Multivariate logistic regression techniques were used to determine risk factors for childhood mortality. Results: We surveyed 119 family compounds. One hundred eighteen of 463 (25%) children aged <5 years had died in the last year. We tested 59% (204/345) of children aged <5 years, and all were lead poisoned (≥10 µg/dL); 97% (198/204) of children had blood-lead levels ≥45 µg/dL, the threshold for initiating chelation therapy. Gold ore was processed inside two-thirds of the family compounds surveyed. In multivariate modeling, significant risk factors for death in the previous year from suspected lead poisoning included the child's age, the mother performing ore-processing activities, use of a community well as the primary water source, and the soil-lead concentration in the compound. Conclusion: The high levels of environmental contamination, the percentage of children aged <5 years with elevated blood-lead levels (97% ≥45 µg/dL), and the incidence of convulsions among children prior to death (82%) suggest that most of the recent childhood deaths in the two surveyed villages were caused by acute lead poisoning from gold ore-processing activities. Control measures included environmental remediation, chelation therapy, public health education, and control of mining activities.
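For readers unfamiliar with the modeling step, a minimal sketch of a multivariate logistic regression of the kind described (the variable names and data are entirely hypothetical; this is not the study's analysis):

    import numpy as np
    import statsmodels.api as sm

    # Hypothetical child-level covariates: age (months), mother processed ore,
    # community well as primary water source, compound soil lead (ppm).
    rng = np.random.default_rng(1)
    n = 200
    X = np.column_stack((
        rng.integers(2, 60, n).astype(float),
        rng.integers(0, 2, n).astype(float),
        rng.integers(0, 2, n).astype(float),
        rng.normal(20000, 5000, n),
    ))
    # Synthetic outcome loosely tied to the covariates, for illustration only.
    logit = -3 + 0.9 * X[:, 1] + 0.7 * X[:, 2] + (X[:, 3] - 20000) / 10000
    y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

    model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
    print(model.params)  # intercept and log-odds per covariate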
