LSST: Comprehensive NEO Detection, Characterization, and Orbits
(Abridged) The Large Synoptic Survey Telescope (LSST) is currently by far the
most ambitious proposed ground-based optical survey. Solar System mapping is
one of the four key scientific design drivers, with emphasis on efficient
Near-Earth Object (NEO) and Potentially Hazardous Asteroid (PHA) detection,
orbit determination, and characterization. In a continuous observing campaign
of pairs of 15-second exposures with its 3,200-megapixel camera, LSST will cover
the entire available sky every three nights in two photometric bands to a depth
of V = 25 per visit (two exposures), with exquisitely accurate astrometry and
photometry. Over the proposed survey lifetime of 10 years, each sky location
would be visited about 1000 times. The baseline design satisfies the strong
constraints on observing cadence mandated by PHA science, such as closely
spaced pairs of observations to link detections of the same object and short
exposures to avoid trailing losses. Equally important, thanks to its frequent repeat visits, LSST
will effectively provide its own follow-up to derive orbits for detected moving
objects. Detailed modeling of LSST operations, incorporating real historical
weather and seeing data from the LSST site at Cerro Pachon, shows that LSST using
its baseline design cadence could find 90% of the PHAs with diameters larger
than 250 m, and 75% of those greater than 140 m within ten years. However, by
optimizing sky coverage, the ongoing simulations suggest that the LSST system,
with its first light in 2013, can reach the Congressional mandate of cataloging
90% of PHAs larger than 140 m by 2020. Comment: 10 pages, color figures, presented at IAU Symposium 236
The disjointness of stabilizer codes and limitations on fault-tolerant logical gates
Stabilizer codes are a simple and successful class of quantum
error-correcting codes. Yet this success comes in spite of some harsh
limitations on the ability of these codes to fault-tolerantly compute. Here we
introduce a new metric for these codes, the disjointness, which, roughly
speaking, is the number of mostly non-overlapping representatives of any given
non-trivial logical Pauli operator. We use the disjointness to prove that
transversal gates on error-detecting stabilizer codes are necessarily in a
finite level of the Clifford hierarchy. We also apply our techniques to
topological code families to find similar bounds on the level of the hierarchy
attainable by constant depth circuits, regardless of their geometric locality.
For instance, we can show that symmetric 2D surface codes cannot have non-local
constant depth circuits for non-Clifford gates. Comment: 8+3 pages, 2 figures. Comments welcome.
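As a concrete reminder of what "level of the Clifford hierarchy" means, the sketch below (an illustrative aside, not the paper's construction) checks levels by conjugation: a unitary sits in level k when it maps every Pauli into level k−1. Numerically, the phase gate S sends Paulis to Paulis (level 2, i.e. Clifford), while T sends X to (X+Y)/√2, which is not a Pauli but is Clifford, placing T in the third level.

```python
import numpy as np

# Single-qubit gates; global phases are ignored when comparing.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
S = np.diag([1, 1j])                      # phase gate, level 2 (Clifford)
T = np.diag([1, np.exp(1j * np.pi / 4)])  # T gate, level 3

def conj(U, P):
    """Return U P U^dagger."""
    return U @ P @ U.conj().T

def is_pauli(M, tol=1e-9):
    """True if M equals I, X, Y or Z up to a global phase."""
    for P in (np.eye(2), X, Y, Z):
        # M = e^{i phi} P  iff  M P^dagger is a unit-modulus multiple of I.
        R = M @ P.conj().T
        if np.allclose(R, R[0, 0] * np.eye(2), atol=tol) and np.isclose(abs(R[0, 0]), 1, atol=tol):
            return True
    return False

# S is Clifford: it conjugates every Pauli to a Pauli.
assert all(is_pauli(conj(S, P)) for P in (X, Y, Z))
# T X T^dagger = (X + Y)/sqrt(2): not a Pauli, but itself Clifford,
# so T sits one level higher in the hierarchy.
assert not is_pauli(conj(T, X))
assert np.allclose(conj(T, X), (X + Y) / np.sqrt(2))
```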
Decomposition unit Patent
Unit for generating thrust from catalytic decomposition of hydrogen peroxide, for high altitude aircraft or spacecraft reaction control.
Efficient intra- and inter-night linking of asteroid detections using kd-trees
The Panoramic Survey Telescope And Rapid Response System (Pan-STARRS) under
development at the University of Hawaii's Institute for Astronomy is creating
the first fully automated end-to-end Moving Object Processing System (MOPS) in
the world. It will be capable of identifying detections of moving objects in
our solar system and linking those detections within and between nights,
attributing those detections to known objects, calculating initial and
differentially-corrected orbits for linked detections, precovering detections
when they exist, and performing orbit identification. Here we describe new kd-tree and
variable-tree algorithms that allow fast, efficient, scalable linking of intra-
and inter-night detections. Using a pseudo-realistic simulation of the
Pan-STARRS survey strategy incorporating weather, astrometric accuracy, and
false detections, we have achieved nearly 100% efficiency and accuracy for
intra-night linking and nearly 100% efficiency for inter-night linking within a
lunation. At realistic sky-plane densities for both real and false detections,
the intra-night linking of detections into `tracks' currently has an accuracy
of 0.3%. Successful tests of the MOPS on real source detections from the
Spacewatch asteroid survey indicate that the MOPS is capable of identifying
asteroids in real data. Comment: Accepted to Icarus.
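A minimal sketch of the intra-night idea: detections from a later exposure go into a kd-tree, and each earlier detection is linked to neighbours lying within the distance a plausibly fast mover could cover in the time gap. The coordinates, rate cut, and tiny hand-rolled 2-d tree below are illustrative assumptions, not the MOPS implementation or its variable-tree algorithms.

```python
import math

def build(points, depth=0):
    """Build a 2-d tree; a node is (point, axis, left, right)."""
    if not points:
        return None
    axis = depth % 2
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return (points[mid], axis,
            build(points[:mid], depth + 1), build(points[mid + 1:], depth + 1))

def range_query(node, center, r, out):
    """Collect every point within Euclidean distance r of center."""
    if node is None:
        return
    point, axis, left, right = node
    if math.dist(point, center) <= r:
        out.append(point)
    if center[axis] - r <= point[axis]:  # the ball may cross the splitting plane
        range_query(left, center, r, out)
    if center[axis] + r >= point[axis]:
        range_query(right, center, r, out)

# Hypothetical sky-plane detections (deg) from two exposures 30 minutes apart.
epoch1 = [(10.00, 5.00), (10.20, 5.10), (11.00, 4.50)]
epoch2 = [(10.01, 5.01), (10.20, 5.10), (12.00, 3.00)]
radius = 1.0 * 0.5 / 24.0  # max rate of 1 deg/day over a 0.5 h gap

tree = build(epoch2)
tracklets = []
for i, p in enumerate(epoch1):
    hits = []
    range_query(tree, p, radius, hits)
    tracklets.extend((i, epoch2.index(q)) for q in hits)

print(tracklets)  # → [(0, 0), (1, 1)]
```

The third detection in each epoch finds no counterpart within the rate cut, so no tracklet is formed for it; real MOPS linking adds many refinements (trail orientation, magnitude consistency) on top of the spatial query.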
Metabolism of ticagrelor in patients with acute coronary syndromes.
© The Author(s) 2018. Ticagrelor is a state-of-the-art antiplatelet agent used for the treatment of patients with acute coronary syndromes (ACS). Unlike the remaining oral P2Y12 receptor inhibitors, ticagrelor does not require metabolic activation to exert its antiplatelet action. Still, ticagrelor is extensively metabolized by hepatic CYP3A enzymes, and AR-C124910XX is its only active metabolite. A post hoc analysis of patient-level (n = 117) pharmacokinetic data pooled from two prospective studies was performed to identify clinical characteristics affecting the degree of AR-C124910XX formation during the first six hours after a 180 mg ticagrelor loading dose in the setting of ACS. Both linear and multiple regression analyses indicated that ACS patients presenting with ST-elevation myocardial infarction or suffering from diabetes mellitus are more likely to have a decreased rate of ticagrelor metabolism during the acute phase of ACS. Administration of morphine during ACS was found to negatively influence transformation of ticagrelor into AR-C124910XX when assessed with linear regression analysis, but not with multiple regression analysis. On the other hand, smoking appears to increase the degree of ticagrelor transformation in ACS patients. Mechanisms underlying our findings and their clinical significance warrant further research. Peer reviewed. Final published version.
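The univariable-versus-multivariable discrepancy reported for morphine has a textbook explanation: confounding by indication. The toy numbers below are entirely invented to illustrate the mechanism, not taken from the study. Morphine is given mostly to STEMI patients, and the metabolite level depends on STEMI status alone, yet a simple regression attributes an effect to morphine that vanishes once STEMI is adjusted for.

```python
import numpy as np

# Entirely synthetic data for 8 "patients" (illustration only).
stemi      = np.array([1, 1, 1, 1, 0, 0, 0, 0], dtype=float)
morphine   = np.array([1, 1, 1, 0, 0, 0, 0, 1], dtype=float)  # mostly given in STEMI
metabolite = 10.0 - 3.0 * stemi  # depends on STEMI only, not on morphine

def ols(X, y):
    """Ordinary least squares with an intercept column prepended."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

b_simple = ols(morphine.reshape(-1, 1), metabolite)            # morphine slope ≈ -1.5
b_multi  = ols(np.column_stack([morphine, stemi]), metabolite) # morphine slope ≈ 0

assert b_simple[1] < -1.0     # unadjusted: morphine looks influential
assert abs(b_multi[1]) < 1e-8  # adjusted for STEMI: the effect disappears
```

This does not establish which reading of the clinical data is correct; it only shows why the two regression approaches can legitimately disagree.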
Order preserving pattern matching on trees and DAGs
The order preserving pattern matching (OPPM) problem is, given a pattern
string P and a text string T, to find all substrings of T which have the
same relative order as P. In this paper, we consider two variants of the
OPPM problem where a set of text strings is given as a tree or a DAG. We show
that the OPPM problem for a single pattern of length m and a text tree
of size N can be solved in O(m + N) time if the characters of P are
drawn from an integer alphabet of polynomial size. The time complexity becomes
O(m log m + N) if the pattern is over a general ordered alphabet. We
then show that the OPPM problem for a single pattern and a text DAG is
NP-complete.
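The order-preserving matching relation itself can be stated in a few lines; the quadratic pairwise check below is only a definitional sketch (practical algorithms use rank encodings and run much faster):

```python
def order_isomorphic(a, b):
    """True if a and b have the same relative order pattern."""
    if len(a) != len(b):
        return False
    # Compare the order relation (<, =, >) of every pair of positions.
    return all((a[i] < a[j]) == (b[i] < b[j]) and (a[i] == a[j]) == (b[i] == b[j])
               for i in range(len(a)) for j in range(len(a)))

# (10, 20, 15) and (1, 5, 3) both go "up, then down to the middle value".
assert order_isomorphic([10, 20, 15], [1, 5, 3])
assert not order_isomorphic([10, 20, 15], [5, 1, 3])
assert order_isomorphic([2, 2, 1], [7, 7, 3])  # ties must match too
```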
Assessing availability and greenhouse gas emissions of lignocellulosic biomass feedstock supply – case study for a catchment in England
© 2019 Society of Chemical Industry and John Wiley & Sons, Ltd. Feedstocks from lignocellulosic biomass (LCB) include crop residues and dedicated perennial biomass crops. The latter are often considered superior in terms of climate change mitigation potential. Uncertainty remains over their availability as feedstocks for biomass provision and the net greenhouse gas (GHG) emissions during crop production. Our objective was to assess the optimal land allocation to wheat and Miscanthus in a specific case study located in England, to increase biomass availability, improve the carbon balance (and reduce the consequent GHG emissions), and minimally constrain grain production losses from wheat. Using soil and climate variables for a catchment in east England, biomass yields and direct nitrogen emissions were simulated with validated process-based models. A ‘Field to up-stream factory gate’ life-cycle assessment was conducted to estimate indirect management-related GHG emissions. Results show that feedstock supply from wheat straw can be supplemented beneficially with LCB from Miscanthus grown on selected low-quality soils. In our study, 8% of the less productive arable land area was dedicated to Miscanthus, increasing total LCB provision by about 150%, with a 52% reduction in GHG emissions per ton of LCB delivered and only a minor effect on wheat grain production (−3%). In conclusion, even without considering the likely carbon sequestration in impoverished soils, agriculture should embrace the opportunities to provide the bioeconomy with LCB from dedicated, perennial crops. Peer reviewed.
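Under a simple reading of the reported percentages (an illustrative back-of-envelope, not a result from the study), a 150% increase in LCB supply combined with a 52% cut in GHG per ton implies total supply-chain GHG rises by about 20% while supply grows 2.5-fold:

```python
# Normalize the baseline: 1 unit of LCB at 1 unit of GHG per ton.
baseline_lcb, baseline_ghg_per_t = 1.0, 1.0

lcb = baseline_lcb * (1 + 1.50)              # +150% total LCB provision
ghg_per_t = baseline_ghg_per_t * (1 - 0.52)  # -52% GHG per ton delivered
total_ghg = lcb * ghg_per_t

print(total_ghg)  # 2.5 * 0.48 = 1.2: total GHG up ~20%, supply up 2.5x
```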
Duel and sweep algorithm for order-preserving pattern matching
Given a text T and a pattern P over an alphabet, the classic exact
matching problem searches for all occurrences of the pattern P in the text T.
Unlike the exact matching problem, order-preserving pattern matching (OPPM)
considers the relative order of elements, rather than their real values. In
this paper, we propose an efficient algorithm for the OPPM problem using the
"duel-and-sweep" paradigm. Our algorithm runs in O(n + m log m) time in
general, and in O(n + m) time under an assumption that the characters in a string
can be sorted in linear time with respect to the string size. We also perform
experiments and show that our algorithm is faster than the KMP-based algorithm.
Last, we introduce two-dimensional order-preserving pattern matching and
give a duel-and-sweep algorithm for it, analyzing the running time of its
dueling stage, its sweeping stage, and its preprocessing. Comment: 13 pages, 5 figures.
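For contrast with the duel-and-sweep approach, the obvious quadratic baseline compares the rank encoding of the pattern against every text window; this sketch is only that naive baseline, not the paper's algorithm:

```python
def pattern_ranks(s):
    """Map each element to its rank among the distinct values of s."""
    order = {v: r for r, v in enumerate(sorted(set(s)))}
    return [order[v] for v in s]

def oppm_naive(text, pattern):
    """Start positions where a text window order-matches the pattern."""
    m, target = len(pattern), pattern_ranks(pattern)
    return [i for i in range(len(text) - m + 1)
            if pattern_ranks(text[i:i + m]) == target]

# Pattern (1, 3, 2) means "low, high, middle".
print(oppm_naive([5, 1, 4, 2, 9, 6, 7], [1, 3, 2]))  # → [1, 3]
```

Each window costs O(m log m) here, which is exactly the per-window work the duel-and-sweep paradigm avoids repeating across overlapping windows.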
Orbit Determination with the two-body Integrals
We investigate a method to compute a finite set of preliminary orbits for
solar system bodies using the first integrals of the Kepler problem. This
method is intended for application to modern sets of astrometric
observations, where often the information contained in the observations allows
one to compute, by interpolation, only two angular positions of the observed body
and their time derivatives at a given epoch; we call this set of data
an attributable. Given two attributables of the same body at two different epochs,
we can use the energy and angular momentum integrals of the two-body problem to
write a system of polynomial equations for the topocentric distance and the
radial velocity at the two epochs. We define two different algorithms for the
computation of the solutions, based on different ways to perform elimination of
variables and obtain a univariate polynomial. Moreover, we use the redundancy of
the data to test the hypothesis that two attributables belong to the same body
(linkage problem). It is also possible to compute a covariance matrix,
describing the uncertainty of the preliminary orbits which results from the
observation error statistics. The performance of this method has been
investigated by using a large set of simulated observations of the Pan-STARRS
project. Comment: 23 pages, 1 figure.
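The core observation can be sketched numerically: along one Kepler orbit the specific energy E = v²/2 − μ/r and the angular-momentum vector c = r × v are constant, so two attributables can belong to the same body only if they yield equal integrals. The toy propagation below (normalized units and an illustrative RK4 integration, not the paper's polynomial elimination) checks that the integrals agree at two epochs on the same orbit:

```python
import numpy as np

mu = 1.0  # gravitational parameter in normalized units

def accel(r):
    """Two-body gravitational acceleration."""
    return -mu * r / np.linalg.norm(r) ** 3

def rk4_step(r, v, h):
    """One fourth-order Runge-Kutta step of the two-body problem."""
    k1r, k1v = v, accel(r)
    k2r, k2v = v + 0.5 * h * k1v, accel(r + 0.5 * h * k1r)
    k3r, k3v = v + 0.5 * h * k2v, accel(r + 0.5 * h * k2r)
    k4r, k4v = v + h * k3v, accel(r + h * k3r)
    return (r + h / 6 * (k1r + 2 * k2r + 2 * k3r + k4r),
            v + h / 6 * (k1v + 2 * k2v + 2 * k3v + k4v))

def integrals(r, v):
    """First integrals of the Kepler problem: energy, angular momentum."""
    return v @ v / 2 - mu / np.linalg.norm(r), np.cross(r, v)

# Mildly eccentric bound orbit; propagate to a second "epoch".
r, v = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.1, 0.1])
E0, c0 = integrals(r, v)
for _ in range(2000):
    r, v = rk4_step(r, v, 0.005)
E1, c1 = integrals(r, v)

# Same body -> same integrals: this equality is what the linkage test exploits.
assert abs(E1 - E0) < 1e-6 and np.allclose(c0, c1, atol=1e-6)
```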
The Pan-STARRS Moving Object Processing System
We describe the Pan-STARRS Moving Object Processing System (MOPS), a modern
software package that produces automatic asteroid discoveries and
identifications from catalogs of transient detections from next-generation
astronomical survey telescopes. MOPS achieves > 99.5% efficiency in producing
orbits from a synthetic but realistic population of asteroids whose
measurements were simulated for a Pan-STARRS4-class telescope. Additionally,
using a non-physical grid population, we demonstrate that MOPS can detect
populations of currently unknown objects such as interstellar asteroids.
MOPS has been adapted successfully to the prototype Pan-STARRS1 telescope
despite differences in expected false detection rates, fill-factor loss and
relatively sparse observing cadence compared to a hypothetical Pan-STARRS4
telescope and survey. MOPS remains >99.5% efficient at detecting objects on a
single night but drops to 80% efficiency at producing orbits for objects
detected on multiple nights. This loss is primarily due to configurable MOPS
processing limits that are not yet tuned for the Pan-STARRS1 mission.
The core MOPS software package is the product of more than 15 person-years of
software development and incorporates countless additional years of effort in
third-party software to perform lower-level functions such as spatial searching
or orbit determination. We describe the high-level design of MOPS and essential
subcomponents, the suitability of MOPS for other survey programs, and suggest a
road map for future MOPS development. Comment: 57 pages, 26 figures, 13 tables.
