The mother of all protocols: Restructuring quantum information's family tree
We give a simple, direct proof of the "mother" protocol of quantum
information theory. In this new formulation, it is easy to see that the mother,
or rather her generalization to the fully quantum Slepian-Wolf protocol,
simultaneously accomplishes two goals: quantum communication-assisted
entanglement distillation, and state transfer from the sender to the receiver.
As a result, in addition to her other "children," the mother protocol generates
the state merging primitive of Horodecki, Oppenheim and Winter, a fully quantum
reverse Shannon theorem, and a new class of distributed compression protocols
for correlated quantum sources which are optimal for sources described by
separable density operators. Moreover, the mother protocol described here is
easily transformed into the so-called "father" protocol whose children provide
the quantum capacity and the entanglement-assisted capacity of a quantum
channel, demonstrating that the division of single-sender/single-receiver
protocols into two families was unnecessary: all protocols in the family are
children of the mother. Comment: 25 pages, 6 figures.
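For orientation, the mother and father protocols are usually summarised as resource inequalities. The forms below follow the standard resource-calculus notation of the quantum Shannon theory literature rather than anything spelled out in this abstract, so they should be read as a sketch: for a pure state \varphi^{ABR} shared between sender A, receiver B and a purifying reference R,
\[
\langle \varphi^{AB} \rangle + \tfrac{1}{2} I(A;R)_\varphi \,[q\to q] \;\geq\; \tfrac{1}{2} I(A;B)_\varphi \,[qq] \qquad \text{(mother)},
\]
\[
\langle \mathcal{N} \rangle + \tfrac{1}{2} I(A;E)\,[qq] \;\geq\; \tfrac{1}{2} I(A;B)\,[q\to q] \qquad \text{(father)},
\]
where [q\to q] denotes one qubit of quantum communication, [qq] one ebit, I(A;B) = S(A) + S(B) - S(AB) is the quantum mutual information, and E is the environment of the channel \mathcal{N}. The fully quantum Slepian-Wolf strengthening discussed in the abstract additionally delivers the sender's share of the state to the receiver on top of the distilled entanglement.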
On the reversible extraction of classical information from a quantum source
Consider a source E of pure quantum states with von Neumann entropy S. By the
quantum source coding theorem, arbitrarily long strings of signals may be
encoded asymptotically into S qubits/signal (the Schumacher limit) in such a
way that entire strings may be recovered with arbitrarily high fidelity.
Suppose that classical storage is free while quantum storage is expensive and
suppose that the states of E do not fall into two or more orthogonal subspaces.
We show that if E can be compressed with arbitrarily high fidelity into A
qubits/signal plus any amount of auxiliary classical storage then A must still
be at least as large as the Schumacher limit S of E. Thus no part of the
quantum information content of E can be faithfully replaced by classical
information. If the states do fall into orthogonal subspaces then A may be less
than S, but only by an amount not exceeding the amount of classical information
specifying the subspace for a signal from the source. Comment: 22 pages, LaTeX2e, journal version.
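Written out, the claim of the abstract is the following (the notation, including the symbol C_sub for the subspace information, is introduced here only for this summary): if the source E emits states |\psi_i\rangle with probabilities p_i, with density operator \rho = \sum_i p_i |\psi_i\rangle\langle\psi_i| and von Neumann entropy S = S(\rho) = -\mathrm{Tr}\,\rho \log_2 \rho, then any compression scheme of arbitrarily high fidelity using A qubits per signal plus unlimited classical storage must satisfy
\[
A \;\geq\; S(\rho)
\]
whenever the signal states do not split into two or more mutually orthogonal subspaces. If they do split, the bound relaxes at most to
\[
A \;\geq\; S(\rho) - C_{\mathrm{sub}},
\]
where C_{\mathrm{sub}} is the classical information needed to specify which subspace a given signal lies in.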
Optimal superdense coding of entangled states
We present a one-shot method for preparing pure entangled states between a
sender and a receiver at a minimal cost of entanglement and quantum
communication. In the case of preparing unentangled states, an earlier paper
showed that a 2n-qubit quantum state could be communicated to a receiver by
physically transmitting only n+o(n) qubits in addition to consuming n ebits of
entanglement and some shared randomness. When the states to be prepared are
entangled, we find that there is a reduction in the number of qubits that need
to be transmitted, interpolating between no communication at all for maximally
entangled states and the earlier two-for-one result of the unentangled case,
all without the use of any shared randomness. We also present two applications
of our result: a direct proof of the achievability of the optimal superdense
coding protocol for entangled states produced by a memoryless source, and a
demonstration that the quantum identification capacity of an ebit is two
qubits. Comment: Final Version. Several technical issues clarified.
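As a rough bookkeeping of the resource counts quoted in the abstract (the bracket notation is borrowed from the resource-inequality literature and is not used in the abstract itself): delivering an arbitrary 2n-qubit unentangled state costs
\[
(n + o(n))\,[q\to q] \;+\; n\,[qq] \;+\; \text{shared randomness},
\]
i.e. roughly two qubits of state per qubit physically transmitted. For entangled target states the [q\to q] term shrinks, reaching zero for maximally entangled states, and the shared randomness is no longer required.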
Upgraded experiments with super neutrino beams: Reach versus Exposure
We introduce exposure as a means to making balanced comparisons of the
sensitivities of long-baseline neutrino experiments to a nonzero \theta_{13},
to CP violation and to the neutrino mass hierarchy. We illustrate its use by
comparing the sensitivities of possible upgrades of superbeam experiments,
namely NOvA*, T2KK and experiments with wide band beams. For the proposed
exposures, we find the best nominal CP violation performance for T2KK. For
equal exposures, a wide band beam experiment has the best mass hierarchy
performance. The physics concept on which NOvA* is based has the best potential
for discovering CP violation only for exposures above a threshold value. Comment: 4 pages, 2 figures, 1 table. Version to appear as a Rapid Communication in PR
A decoupling approach to the quantum capacity
We give a short proof that the coherent information is an achievable rate for
the transmission of quantum information through a noisy quantum channel. Our
method is to produce random codes by performing a unitarily covariant
projective measurement on a typical subspace of a tensor power state. We show
that, provided the rank of each measurement operator is sufficiently small, the
transmitted data will with high probability be decoupled from the channel's
environment. We also show that our construction leads to random codes whose
average input is close to a product state and outline a modification yielding
unitarily invariant ensembles of maximally entangled codes. Comment: 13 pages, published version.
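For context, the quantity shown to be achievable can be written as follows (these are the standard definitions, not reproduced from the abstract itself): letting U_\mathcal{N} be an isometric extension of the channel that takes the input to the joint output-environment system BE, the coherent information of an input state \rho is
\[
I_c(\rho,\mathcal{N}) \;=\; S(B)_\sigma - S(E)_\sigma,
\qquad \sigma_{BE} = U_\mathcal{N}\,\rho\,U_\mathcal{N}^\dagger,
\]
and the result asserts Q(\mathcal{N}) \geq \max_\rho I_c(\rho,\mathcal{N}). The decoupling idea behind the proof is that if the channel environment E ends up nearly product with the reference R purifying the encoded data,
\[
\big\| \sigma_{RE} - \sigma_R \otimes \sigma_E \big\|_1 \;\leq\; \epsilon,
\]
then a decoder exists on the receiver's side that recovers the transmitted data with fidelity approaching one as \epsilon \to 0.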
Structure of states which satisfy strong subadditivity of quantum entropy with equality
We give an explicit characterisation of the quantum states which saturate the
strong subadditivity inequality for the von Neumann entropy. By combining a
result of Petz characterising the equality case for the monotonicity of
relative entropy with a recent theorem by Koashi and Imoto, we show that such
states will have the form of a so-called short quantum Markov chain, which in
turn implies that two of the systems are independent conditioned on the third,
in a physically meaningful sense. This characterisation simultaneously
generalises known necessary and sufficient entropic conditions for quantum
error correction as well as the conditions for the achievability of the Holevo
bound on accessible information. Comment: 9 pages, revtex4. v2 corrects/adds some references and has a bit more discussion.
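Concretely, the structure described in the abstract can be summarised as follows (the decomposition below is the standard "short quantum Markov chain" form from the literature, not quoted verbatim from the abstract): strong subadditivity,
\[
S(\rho_{AB}) + S(\rho_{BC}) \;\geq\; S(\rho_{ABC}) + S(\rho_B),
\]
holds with equality, i.e. the conditional mutual information I(A;C|B) vanishes, exactly when the B system decomposes as \mathcal{H}_B = \bigoplus_j \mathcal{H}_{b_j^L} \otimes \mathcal{H}_{b_j^R} with
\[
\rho_{ABC} \;=\; \bigoplus_j q_j\, \rho_{A b_j^L} \otimes \rho_{b_j^R C}
\]
for some probability distribution q_j, so that A and C are independent once the block index and B's share within the block are given.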
Physics and optimization of beta-beams: From low to very high gamma
The physics potential of beta beams is investigated from low to very high
gamma values and it is compared to superbeams and neutrino factories. The gamma
factor and the baseline are treated as continuous variables in the optimization
of the beta beam, while a fixed mass water Cherenkov detector or a totally
active scintillator detector is assumed. We also include in our discussion the
gamma dependence of the number of ion decays per year. For low gamma, we find
that a beta beam could be a very interesting alternative to a superbeam
upgrade, especially if it is operated at the second oscillation maximum to
reduce correlations and degeneracies. For high gamma, we find that a beta beam
could have a potential similar to a neutrino factory. In all cases, the
sensitivity of the beta beams to CP violation is very impressive if similar
neutrino and anti-neutrino event rates can be achieved. Comment: 34 pages, 16 figures, Fig. 2 modified, discussion improved, refs. added, version to appear in PR
New radiocarbon dates from the Bapot-1 site in Saipan and Neolithic dispersal by stratified diffusion
The colonisation of the Mariana Islands in Western Micronesia is likely to represent an early ocean dispersal of more than 2000 km. Establishing the date of human arrival in the archipelago is important for modelling Neolithic expansion in Island Southeast Asia and the Pacific, particularly the role of long-distance dispersals. This paper presents new ¹⁴C results and a ΔR estimate from the Bapot-1 site on Saipan Island, which indicate human arrival at ca. 3400-3200 cal. BP. Archaeological chronologies of long-distance dispersal to Western Micronesia and the Lapita expansion (Bismarcks to Samoa) show that the Neolithic dispersal rate was increasing during the period ca. 3400-2900 cal. BP. The range-versus-time relationship is similar to stratified diffusion, whereby a period of relatively slow expansion is succeeded by long-distance movement. An increase in new colonies created by long-distance migrants results in accelerating range expansion.
