Statistical Constraints on State Preparation for a Quantum Computer
Quantum computing algorithms require that the quantum register be initially
present in a superposition state. To achieve this, we consider the practical
problem of creating a coherent superposition state of several qubits. Owing to
considerations of quantum statistics, this requires that the entropy of the
system go down. This, in turn, has two practical implications: (i) the initial
state cannot be controlled; (ii) the temperature of the system must be reduced.
These factors, in addition to decoherence and sensitivity to errors, must be
considered in the implementation of quantum computers.
Quantum Information and Entropy
Thermodynamic entropy is not an entirely satisfactory measure of information
of a quantum state. This entropy for an unknown pure state is zero, although
repeated measurements on copies of such a pure state do communicate
information. In view of this, we propose a new measure for the informational
entropy of a quantum state that includes information in the pure states and the
thermodynamic entropy. The origin of information is explained in terms of an
interplay between unitary and non-unitary evolution. Such complementarity is
also at the basis of the so-called interaction-free measurement.
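To make the contrast concrete: the von Neumann (thermodynamic) entropy of any pure state is exactly zero, while a maximally mixed qubit carries one bit. A minimal NumPy sketch of that contrast (an illustration only, not the informational measure proposed in the paper):

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log2 rho), in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop zero eigenvalues (0 log 0 := 0)
    return float(-np.sum(evals * np.log2(evals)))

# Pure qubit state |+> = (|0> + |1>)/sqrt(2): its von Neumann entropy is zero.
plus = np.array([1.0, 1.0]) / np.sqrt(2.0)
rho_pure = np.outer(plus, plus)

# Maximally mixed qubit: one bit of entropy.
rho_mixed = np.eye(2) / 2.0

print(von_neumann_entropy(rho_pure))    # ~0.0
print(von_neumann_entropy(rho_mixed))   # 1.0
```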
Momentum space tomographic imaging of photoelectrons
We apply tomography, a general method for reconstructing 3-D distributions
from multiple projections, to reconstruct the momentum distribution of
electrons produced via strong field photoionization. The projections are
obtained by rotating the electron distribution via the polarization of the
ionizing laser beam and recording a momentum spectrum at each angle with a 2-D
velocity map imaging spectrometer. For linearly polarized light the tomographic
reconstruction agrees with the distribution obtained using an Abel inversion.
Electron tomography, which can be applied to any polarization, will simplify
the technology of electron imaging. The method can be directly generalized to
other charged particles.
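As an illustration of the tomographic step itself (not the authors' experimental pipeline), the sketch below uses scikit-image's radon/iradon pair to recover a toy 2-D distribution from 1-D projections taken at many angles; in the experiment those projections are the momentum spectra recorded at each polarization angle.

```python
import numpy as np
from skimage.transform import radon, iradon

# Toy 2-D "momentum distribution": two Gaussian lobes on a 128x128 grid.
x = np.linspace(-1.0, 1.0, 128)
X, Y = np.meshgrid(x, x)
dist = np.exp(-((X - 0.3)**2 + Y**2) / 0.02) + np.exp(-((X + 0.3)**2 + Y**2) / 0.02)
dist[X**2 + Y**2 > 1.0] = 0.0     # radon's circular geometry assumes this region is zero

# 1-D projections at many angles stand in for the measured momentum spectra.
angles = np.linspace(0.0, 180.0, 90, endpoint=False)
sinogram = radon(dist, theta=angles)

# Tomographic reconstruction by filtered back-projection.
recon = iradon(sinogram, theta=angles)

print(np.abs(recon - dist).max())  # reconstruction error; small for well-sampled data
```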
Theoretical analysis of dynamic chemical imaging with lasers using high-order harmonic generation
We report theoretical investigations of the tomographic procedure suggested
by Itatani et al. [Nature 432, 867 (2004)] for reconstructing the highest
occupied molecular orbital (HOMO) using high-order harmonic generation (HHG).
Using the limited range of harmonics from the plateau region, we found that
even under the most favorable assumptions it is still very difficult to obtain
an accurate HOMO wavefunction, but the symmetry of the HOMO and the
internuclear separation between the atoms can be accurately extracted,
especially when lasers of longer wavelengths are used to generate the HHG. We
also considered the possible removal or relaxation of the approximations used
in the tomographic method in actual applications. We suggest that for chemical
imaging in the future, it is better to use an iterative method that locates
the positions of atoms in the molecule such that the resulting HHG best fits
the macroscopic HHG data, rather than the tomographic method.
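The suggested iterative strategy can be sketched generically: pick a structural parameter, run a forward model, and adjust the parameter until the simulated signal best fits the data. In the sketch below the two-centre interference factor is only a hypothetical stand-in for a macroscopic HHG calculation, and the momentum grid and "measured" spectrum are synthetic.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def model_spectrum(R, k):
    """Toy forward model: two-centre interference factor cos^2(k R / 2).
    A hypothetical stand-in for a full macroscopic HHG simulation."""
    return np.cos(k * R / 2.0) ** 2

def misfit(R, k, data):
    """Least-squares mismatch between the model and the 'measured' spectrum."""
    return float(np.sum((model_spectrum(R, k) - data) ** 2))

# Synthetic momentum grid spanning the plateau harmonics, and synthetic "data"
# generated with a true separation R = 2.0 (arbitrary units) plus noise.
k = np.linspace(0.8, 2.5, 40)
rng = np.random.default_rng(0)
data = model_spectrum(2.0, k) + 0.02 * rng.standard_normal(k.size)

# Coarse scan to bracket the best-fitting separation, then a local refinement.
R_grid = np.linspace(1.0, 4.0, 301)
R0 = R_grid[int(np.argmin([misfit(R, k, data) for R in R_grid]))]
fit = minimize_scalar(misfit, args=(k, data), bounds=(R0 - 0.1, R0 + 0.1), method='bounded')
print(fit.x)   # close to the true separation of 2.0
```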
Thermoacoustic tomography with detectors on an open curve: an efficient reconstruction algorithm
Practical applications of thermoacoustic tomography require numerical
inversion of the spherical mean Radon transform with the centers of integration
spheres occupying an open surface. Solution of this problem is needed (both in
2-D and 3-D) because frequently the region of interest cannot be completely
surrounded by the detectors, as it happens, for example, in breast imaging. We
present an efficient numerical algorithm for solving this problem in 2-D
(similar methods are applicable in the 3-D case). Our method is based on the
numerical approximation of plane waves by certain single layer potentials
related to the acquisition geometry. After the densities of these potentials
have been precomputed, each subsequent image reconstruction has the complexity
of the regular filtered backprojection algorithm for the classical Radon
transform. The performance of the method is demonstrated in several numerical
examples: one can see that the algorithm produces very accurate reconstructions
if the data are accurate and sufficiently well sampled; on the other hand, it
is sufficiently stable with respect to noise in the data.
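For concreteness, the forward data model of this problem, circular (spherical in 3-D) means collected by detectors on an open curve, can be sampled numerically as follows; this is a toy illustration of the acquisition geometry, not the authors' inversion algorithm.

```python
import numpy as np

def circular_mean(image, xs, ys, cx, cy, r, n_samples=360):
    """Mean of `image` over the circle of radius r centred at (cx, cy),
    approximated by sampling n_samples points and snapping to the grid."""
    phi = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
    ix = np.clip(np.searchsorted(xs, cx + r * np.cos(phi)), 0, len(xs) - 1)
    iy = np.clip(np.searchsorted(ys, cy + r * np.sin(phi)), 0, len(ys) - 1)
    return float(image[iy, ix].mean())

# Simple 2-D phantom: a disc of radius 0.2 centred at the origin.
xs = np.linspace(-1.0, 1.0, 256)
ys = np.linspace(-1.0, 1.0, 256)
X, Y = np.meshgrid(xs, ys)
phantom = (X**2 + Y**2 < 0.2**2).astype(float)

# Detector centres on an open curve: a half-circle of radius 1 around the object.
thetas = np.linspace(0.0, np.pi, 64)
radii = np.linspace(0.05, 2.0, 128)
data = np.array([[circular_mean(phantom, xs, ys, np.cos(t), np.sin(t), r)
                  for r in radii] for t in thetas])
print(data.shape)   # (64, 128): one circular-mean profile per detector position
```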
Rationing tests for drug-resistant tuberculosis - who are we prepared to miss?
BACKGROUND: Early identification of patients with drug-resistant tuberculosis (DR-TB) increases the likelihood of treatment success and interrupts transmission. Resource-constrained settings use risk profiling to ration the use of drug susceptibility testing (DST). Nevertheless, no studies have yet quantified how many patients with DR-TB this strategy will miss.

METHODS: A total of 1,545 subjects, who presented to Lima health centres with possible TB symptoms, completed a clinic-epidemiological questionnaire and provided sputum samples for TB culture and DST. The proportion of drug resistance in this population was calculated, and the data were analysed to demonstrate the effect of rationing tests to patients with multidrug-resistant TB (MDR-TB) risk factors on the number of tests needed and the corresponding proportion of missed patients with DR-TB.

RESULTS: Overall, 147/1,545 (9.5%) subjects had culture-positive TB, of which 32 (21.8%) had DR-TB (MDR, 13.6%; isoniazid mono-resistant, 7.5%; rifampicin mono-resistant, 0.7%). A total of 553 subjects (35.8%) reported one or more MDR-TB risk factors; of these, 506 (91.5%; 95% CI, 88.9-93.7%) did not have TB, 32/553 (5.8%; 95% CI, 3.4-8.1%) had drug-susceptible TB, and only 15/553 (2.7%; 95% CI, 1.5-4.4%) had DR-TB. Rationing DST to those with an MDR-TB risk factor would have missed more than half of the DR-TB population (17/32, 53.2%; 95% CI, 34.7-70.9%).

CONCLUSIONS: Rationing DST based on known MDR-TB risk factors misses an unacceptable proportion of patients with drug resistance in settings with ongoing DR-TB transmission. Investment in diagnostic services to allow universal DST for people with presumptive TB should be a high priority.
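The headline result follows directly from the counts quoted above; the sketch below recomputes the proportions and attaches exact (Clopper-Pearson) binomial intervals. The interval method used in the study is not stated in the abstract, so Clopper-Pearson is assumed here and the bounds may differ slightly from those quoted.

```python
from scipy.stats import beta

def exact_ci(k, n, alpha=0.05):
    """Clopper-Pearson (exact) confidence interval for a binomial proportion k/n."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

# Counts reported in the abstract.
n_screened, n_tb, n_dr = 1545, 147, 32   # screened, culture-positive, drug-resistant
n_dr_with_risk = 15                      # DR-TB patients reporting an MDR-TB risk factor
n_dr_missed = n_dr - n_dr_with_risk      # DR-TB patients a rationed strategy would miss

print(f"culture-positive TB: {n_tb / n_screened:.1%}")   # ~9.5%
print(f"DR-TB among TB cases: {n_dr / n_tb:.1%}")        # ~21.8%
lo, hi = exact_ci(n_dr_missed, n_dr)
print(f"DR-TB missed: {n_dr_missed}/{n_dr} = {n_dr_missed / n_dr:.1%} "
      f"(95% CI {lo:.1%}-{hi:.1%})")                     # roughly half, CI ~35-71%
```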
Implementation of an Optimal First-Order Method for Strongly Convex Total Variation Regularization
We present a practical implementation of an optimal first-order method, due
to Nesterov, for large-scale total variation regularization in tomographic
reconstruction, image deblurring, etc. The algorithm applies to $\mu$-strongly
convex objective functions with $L$-Lipschitz continuous gradient. In the
framework of Nesterov, both $\mu$ and $L$ are assumed known -- an assumption
that is seldom satisfied in practice. We propose to incorporate mechanisms to
estimate locally sufficient $\mu$ and $L$ during the iterations. The mechanisms
also allow for the application to non-strongly convex functions. We discuss the
iteration complexity of several first-order methods, including the proposed
algorithm, and we use a 3D tomography problem to compare the performance of
these methods. The results show that for ill-conditioned problems solved to
high accuracy, the proposed method significantly outperforms state-of-the-art
first-order methods, as also suggested by theoretical results.
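A minimal sketch of the underlying scheme, Nesterov's constant-momentum accelerated gradient method for a $\mu$-strongly convex function with $L$-Lipschitz gradient, applied to an ill-conditioned quadratic where $\mu$ and $L$ happen to be known; the paper's contribution, estimating these constants during the iterations and handling total variation terms, is not reproduced here.

```python
import numpy as np

def nesterov_strongly_convex(grad, x0, mu, L, n_iter=1000):
    """Nesterov's accelerated gradient method (constant-momentum variant) for a
    mu-strongly convex objective with L-Lipschitz continuous gradient."""
    beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))
    x = y = x0.copy()
    for _ in range(n_iter):
        x_new = y - grad(y) / L            # gradient step from the extrapolated point
        y = x_new + beta * (x_new - x)     # momentum / extrapolation step
        x = x_new
    return x

# Toy strongly convex quadratic: f(x) = 0.5 x^T A x - b^T x, grad f(x) = A x - b.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((50, 50)))
eigs = np.linspace(1.0, 1e3, 50)                # ill-conditioned: kappa = 1000
A = Q @ np.diag(eigs) @ Q.T
b = rng.standard_normal(50)
mu, L = eigs[0], eigs[-1]                       # here mu and L are known exactly

x_star = np.linalg.solve(A, b)
x_hat = nesterov_strongly_convex(lambda x: A @ x - b, np.zeros(50), mu, L)
print(np.linalg.norm(x_hat - x_star))           # small residual after 1000 iterations
```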
Studying model suspensions using high resolution synchrotron X-ray microtomography
The addition of minor quantities of secondary liquids to suspensions may lead to a transition from a fluid-like structure to a paste-like structure for the system. Previous studies have shown how rheological properties such as viscosity and yield stress are affected; however, qualitative visual observation on the micro-scale during both short- and long-term storage has yet to be achieved or reported.
This research focuses on the movement of a secondary immiscible liquid (water or saturated sucrose solution) when added to a model food system. The model food system used in this study is a suspension of sucrose particles in a continuous oil phase, chosen to better understand the interactions between the particles and the liquid phases present. This was accomplished using dynamic X-ray computed tomography to study the behaviour of the sample. This non-destructive approach allowed the movement of the secondary liquid, as well as of the solid particles from the bulk suspension, to be monitored through a time lapse of scans. This was achieved by observing the changes in the grey-scale range of the droplet with time, which was then correlated with the uptake and movement of sucrose into the secondary liquid using an innovative method. This movement was due to the hydrophilicity and solubility of sucrose, with gravity/sedimentation playing a minimal role.
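The time-lapse analysis amounts to tracking the mean grey value inside the droplet region across successive reconstructed volumes. A minimal sketch of that bookkeeping, with a synthetic image stack and a hand-drawn spherical mask standing in for the CT data and the segmented droplet:

```python
import numpy as np

def mean_grey_in_roi(volume, mask):
    """Mean grey value of the voxels selected by a boolean mask."""
    return float(volume[mask].mean())

# Synthetic stand-in for a time series of reconstructed CT volumes (t, z, y, x).
rng = np.random.default_rng(0)
n_t, shape = 6, (32, 64, 64)
volumes = rng.normal(loc=0.40, scale=0.02, size=(n_t,) + shape)

# Spherical mask standing in for the segmented secondary-liquid droplet.
z, y, x = np.indices(shape)
mask = (z - 16)**2 + (y - 32)**2 + (x - 32)**2 < 10**2

# Imitate sucrose uptake: the droplet's grey value drifts upward over time.
for t in range(n_t):
    volumes[t][mask] += 0.01 * t

# Track the droplet's mean grey value scan by scan.
history = [mean_grey_in_roi(volumes[t], mask) for t in range(n_t)]
print(np.round(history, 3))   # increasing mean grey value indicates sucrose uptake
```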
