Guiding Transformation: How Medical Practices Can Become Patient-Centered Medical Homes
Describes in detail eight change concepts as a guide to transforming a practice into a patient-centered medical home, including engaged leadership, quality improvement strategy, continuous and team-based healing relationships, and enhanced access.
On the Conservation of the Vertical Action in Galactic Disks
We employ high-resolution N-body simulations of isolated spiral galaxy models, from low-amplitude, multi-armed galaxies to Milky Way-like disks, to estimate the vertical action of ensembles of stars in an axisymmetric potential. In the multi-armed galaxy the low-amplitude arms represent tiny perturbations of the potential, hence the vertical action for a set of stars is conserved, although after several orbital periods the conservation degrades significantly. For a Milky Way-like galaxy with vigorous spiral activity and the formation of a bar, our results show that the potential is far from steady, implying that the action is not a constant of motion. Furthermore, because of the presence of high-amplitude arms and the bar, considerable in-plane and vertical heating occurs that forces stars to deviate from near-circular orbits, reducing the degree to which the actions are conserved not only for individual stars, in agreement with previous results, but also for ensembles of stars. If confirmed, this result has several implications, including for the assertion that the thick disk of our Galaxy formed by radial migration of stars, which rests on the assumption that the action describing the vertical motion of stars is conserved.
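For reference, the vertical action discussed in this abstract is conventionally defined as the phase-space area enclosed by one vertical oscillation (a standard textbook definition, not quoted from the paper itself):

$$ J_z = \frac{1}{2\pi} \oint p_z \, dz $$

where $p_z$ is the vertical momentum and the integral runs over one full oscillation in $z$. $J_z$ is an adiabatic invariant only while the potential evolves slowly compared with the vertical period, which is why strong spiral arms and a bar degrade its conservation.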
A focus on L dwarfs with trigonometric parallaxes
We report new parallax measurements for ten L and early T type dwarfs, five of which have no previously published values, using observations over 3 years at the robotic Liverpool Telescope. The resulting parallaxes and proper motions have median errors of 2 mas and 1.5 mas/yr, respectively. Their space motions indicate they are all Galactic disk members. We combined this sample with other objects with astrometry from the Liverpool Telescope and with published literature astrometry to construct a sample of 260 L and early T type dwarfs with measured parallaxes, designated the Astrometry Sample. We study the kinematics of the Astrometry Sample and derive a solar motion with respect to the local standard of rest in agreement with recent literature. We derive a kinematic age of 1.5-1.7 Gyr for the Astrometry Sample, assuming the age increases monotonically with the total velocity for a given disk sample. This kinematic age is less than half the literature values for other low-mass dwarf samples. We believe this difference arises for two reasons: (1) the sample is mainly composed of mid- to late-L dwarfs, which are expected to be relatively young, and (2) the requirement that objects have a measured parallax biases the sample toward the brighter examples, which tend to be younger.
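The kinematic quantities here rest on the standard conversion from parallax and proper motion to tangential velocity. Below is a minimal sketch of that relation; it is an illustration of the textbook formula, not the authors' pipeline, and the example numbers are invented:

```python
# Convert parallax and proper motion to distance and tangential velocity.
# A sketch of the standard astrometric relation v_t = 4.74 * mu * d, where
# 4.74 km/s is the speed of an object at 1 pc with mu = 1 arcsec/yr.

def tangential_velocity(parallax_mas: float, pm_mas_yr: float) -> float:
    """Tangential velocity in km/s from parallax (mas) and proper motion (mas/yr)."""
    distance_pc = 1000.0 / parallax_mas        # parallax in mas -> distance in pc
    return 4.74 * (pm_mas_yr / 1000.0) * distance_pc

# Example: a dwarf with a 50 mas parallax (20 pc) moving at 500 mas/yr.
print(tangential_velocity(50.0, 500.0))        # ~47.4 km/s
```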
Quantum entanglement for systems of identical bosons: I. General features
These two accompanying papers are concerned with two-mode entanglement for systems of identical massive bosons and its relationship to spin squeezing and other quantum correlation effects. Entanglement is a key quantum feature of composite systems in which the probabilities for joint measurements on the composite sub-systems are no longer determined from measurement probabilities on the separate sub-systems. There are many aspects of entanglement that can be studied. This two-part review focuses on the meaning of entanglement, the quantum paradoxes associated with entangled states, and the important tests that allow an experimentalist to determine whether a quantum state, in particular one for massive bosons, is entangled. An overall outcome of the review is to distinguish criteria (and hence experiments) for entanglement that fully utilize the symmetrization principle and the super-selection rules that can be applied to bosonic massive particles. In the first paper (I), the background is given for the meaning of entanglement in the context of systems of identical particles. For such systems, the requirement is that the relevant quantum density operators must satisfy the symmetrization principle and that global and local super-selection rules prohibit states in which there are coherences between differing particle numbers. The justification for these requirements is fully discussed. In the second-quantization approach that is used, both the system and the sub-systems are modes (or sets of modes) rather than particles, particles being associated with different occupancies of the modes. The definition of entangled states is based on first defining the non-entangled states, after specifying which modes constitute the sub-systems. This work mainly focuses on two-mode entanglement for massive bosons, but is put in the context of tests of local hidden-variable theories, where one may not be able to make the above restrictions. The review provides the detailed arguments necessary for the conclusions of a recent paper, where the question of how to rigorously demonstrate the entanglement of a two-mode Bose-Einstein condensate (BEC) has been examined. In the accompanying review paper (II), we consider spin squeezing and other tests for entanglement that have been proposed for two-mode bosonic systems. We apply the approach of review (I) to determine which tests, and which modifications of the tests, are useful for detecting entanglement in massive bosonic (BEC), as opposed to photonic, systems. Several new inequalities are derived, a theory for the required two-mode interferometry is presented, and key experiments to date are analyzed.
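The separability definition the review builds on can be stated compactly for mode sub-systems A and B (a standard textbook statement, with the super-selection constraint the paper emphasizes noted alongside, not a quotation from the paper):

$$ \hat{\rho}_{AB} = \sum_i p_i \, \hat{\rho}_i^A \otimes \hat{\rho}_i^B, \qquad p_i \ge 0, \quad \sum_i p_i = 1 $$

A state is entangled when no such decomposition exists. For massive bosons the review additionally requires each sub-system density operator $\hat{\rho}_i^A$, $\hat{\rho}_i^B$ to respect the local super-selection rule, i.e. to contain no coherences between states of differing particle number.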
The Atacama Cosmology Telescope: A Measurement of the Cosmic Microwave Background Power Spectrum at 148 and 218 GHz from the 2008 Southern Survey
We present measurements of the cosmic microwave background (CMB) power spectrum made by the Atacama Cosmology Telescope at 148 GHz and 218 GHz, as well as the cross-frequency spectrum between the two channels. Our results clearly show the second through the seventh acoustic peaks in the CMB power spectrum. The measurements of these higher-order peaks provide an additional test of the ΛCDM cosmological model. At ℓ > 3000, we detect power in excess of the primary anisotropy spectrum of the CMB. At lower multipoles, 500 < ℓ < 3000, we find evidence for gravitational lensing of the CMB in the power spectrum at the 2.8σ level. We also detect a low level of Galactic dust in our maps, which demonstrates that we can recover known faint, diffuse signals.
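For reference, the angular power spectrum measured here is conventionally defined from the spherical-harmonic coefficients of the temperature map (a standard definition, not taken from the paper):

$$ \Delta T(\hat{n}) = \sum_{\ell m} a_{\ell m} Y_{\ell m}(\hat{n}), \qquad C_\ell = \frac{1}{2\ell + 1} \sum_{m=-\ell}^{\ell} |a_{\ell m}|^2 $$

so higher multipoles ℓ probe smaller angular scales, which is why the acoustic peaks, lensing, and the point-source excess appear in distinct multipole ranges.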
Results from the Centers for Disease Control and Prevention's Predict the 2013-2014 Influenza Season Challenge
Background: Early insights into the timing of the start, peak, and intensity of the influenza season could be useful in planning influenza prevention and control activities. To encourage development and innovation in influenza forecasting, the Centers for Disease Control and Prevention (CDC) organized a challenge to predict the 2013-2014 United States influenza season. Methods: Challenge contestants were asked to forecast the start, peak, and intensity of the 2013-2014 influenza season at the national level and at any or all Health and Human Services (HHS) region level(s). The challenge ran from December 1, 2013 to March 27, 2014; contestants were required to submit 9 biweekly forecasts at the national level to be eligible. The selection of the winner was based on expert evaluation of the methodology used to make the prediction and the accuracy of the prediction as judged against the U.S. Outpatient Influenza-like Illness Surveillance Network (ILINet). Results: Nine teams submitted 13 forecasts for all required milestones. The first forecast was due on December 2, 2013; 3/13 forecasts received correctly predicted the start of the influenza season within one week, 1/13 predicted the peak within 1 week, 3/13 predicted the peak ILINet percentage within 1%, and 4/13 predicted the season duration within 1 week. For the prediction due on December 19, 2013, the number of forecasts that correctly predicted the peak week increased to 2/13, the peak percentage to 6/13, and the duration of the season to 6/13. As the season progressed, the forecasts became more stable and closer to the season milestones. Conclusion: Forecasting has become technically feasible, but further efforts are needed to improve forecast accuracy so that policy makers can reliably use these predictions. CDC and challenge contestants plan to build upon the methods developed during this contest to improve the accuracy of influenza forecasts.
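The "within one week" and "within 1%" milestone checks lend themselves to a short illustration. The sketch below is a hypothetical scoring helper in the spirit of those criteria; the team names, values, and season-week indexing are invented, and this is not the CDC's actual evaluation code:

```python
# Hypothetical scoring in the spirit of the challenge milestones:
# a peak-week hit is within 1 week, a peak-intensity hit is within 1
# percentage point of the observed ILINet value.

def within(predicted: float, observed: float, tolerance: float) -> bool:
    """True if a prediction falls inside the allowed tolerance."""
    return abs(predicted - observed) <= tolerance

# Weeks indexed from the start of the season to avoid MMWR year-end wrap.
observed_peak_week, observed_peak_pct = 8, 4.6   # illustrative values

forecasts = [            # (team, predicted peak week, predicted peak ILI %)
    ("team_a", 7, 4.2),
    ("team_b", 10, 4.9),
]

for team, week, pct in forecasts:
    hit_week = within(week, observed_peak_week, 1)    # within 1 week
    hit_pct = within(pct, observed_peak_pct, 1.0)     # within 1 percentage point
    print(f"{team}: peak week hit={hit_week}, peak intensity hit={hit_pct}")
```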
The Atacama Cosmology Telescope: Data Characterization and Map Making
We present a description of the data reduction and mapmaking pipeline used for the 2008 observing season of the Atacama Cosmology Telescope (ACT). The data presented here at 148 GHz represent 12% of the 90 TB collected by ACT from 2007 to 2010. In 2008 we observed for 136 days, producing a total of 1423 hours of data (11 TB for the 148 GHz band only), with a daily average of 10.5 hours of observation. From these, 1085 hours were devoted to an 850 deg² stripe (11.2 hours by 9.1 deg) centered on a declination of -52.7 deg, while 175 hours were devoted to a 280 deg² stripe (4.5 hours by 4.8 deg) centered at the celestial equator. We discuss sources of statistical and systematic noise, calibration, telescope pointing, and data selection. Out of 1260 survey hours and 1024 detectors per array, 816 hours and 593 effective detectors remain after data selection for this frequency band, yielding a 38% survey efficiency. The total sensitivity in 2008, determined from the noise level between 5 Hz and 20 Hz in the time-ordered data stream (TOD), is 32 μK√s in CMB units. Atmospheric brightness fluctuations constitute the main contaminant in the data and dominate the detector noise covariance at low frequencies in the TOD. The maps were made by solving the least-squares problem using the Preconditioned Conjugate Gradient method, incorporating the details of the detector and noise correlations. Cross-correlation with WMAP sky maps, as well as analysis from simulations, reveals that our maps are unbiased at multipoles ℓ > 300. This paper accompanies the public release of the 148 GHz southern stripe maps from 2008. The techniques described here will be applied to future maps and data releases.
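The map-making step lends itself to a compact illustration. Below is a minimal sketch of the maximum-likelihood formulation the abstract names, solving (PᵀN⁻¹P)m = PᵀN⁻¹d with a preconditioned conjugate gradient solver; the pointing matrix, white-noise model, and Jacobi preconditioner are toy stand-ins, not ACT's pipeline:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import cg, LinearOperator

n_samples, n_pix = 10_000, 500
rng = np.random.default_rng(0)

# Toy pointing matrix P: each time sample reads out one sky pixel.
hits = rng.integers(0, n_pix, n_samples)
P = csr_matrix((np.ones(n_samples), (np.arange(n_samples), hits)),
               shape=(n_samples, n_pix))

true_map = rng.normal(size=n_pix)
d = P @ true_map + 0.1 * rng.normal(size=n_samples)   # TOD = signal + noise

inv_noise_var = np.full(n_samples, 1.0 / 0.1**2)      # toy white-noise N^-1

def normal_matrix(m):
    """Apply the map-space normal matrix (P^T N^-1 P) to a map vector."""
    return P.T @ (inv_noise_var * (P @ m))

A = LinearOperator((n_pix, n_pix), matvec=normal_matrix)
b = P.T @ (inv_noise_var * d)

# Jacobi preconditioner: divide by the inverse-variance-weighted hit count
# per pixel (nonzero here because every pixel is hit many times).
weighted_hits = P.T @ inv_noise_var
M = LinearOperator((n_pix, n_pix), matvec=lambda m: m / weighted_hits)

m_hat, info = cg(A, b, M=M)
print("CG converged:", info == 0)
```

A real pipeline replaces the diagonal N⁻¹ with a correlated (frequency-dependent) noise model and a far larger pointing matrix, but the solver structure is the same.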
Using airborne LiDAR survey to explore historic-era archaeological landscapes of Montserrat in the eastern Caribbean
This article describes what appears to be the first archaeological application of airborne LiDAR survey to historic-era landscapes in the Caribbean archipelago, on the island of Montserrat. LiDAR is proving invaluable in extending the reach of traditional pedestrian survey into less favorable areas, such as those covered by dense neotropical forest or by ashfall from the past two decades of active eruptions of the Soufrière Hills volcano, and to sites in localities that are inaccessible because of volcanic hazards. Emphasis is placed on two aspects of the research: first, the importance of ongoing, real-time interaction between the LiDAR analyst and the archaeological team in the field; and second, the advantages of exploiting the full potential of the three-dimensional LiDAR point-cloud data for visualizing archaeological sites and features.
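As a pointer to how such point-cloud inspection is typically done, here is a minimal sketch using the laspy and matplotlib libraries; the file name is hypothetical and the ground-class filter is a common convention (ASPRS class 2), not this project's documented workflow:

```python
import laspy
import numpy as np
import matplotlib.pyplot as plt

# Load a LiDAR tile; "survey_tile.las" is a hypothetical file name.
las = laspy.read("survey_tile.las")
xyz = np.vstack([las.x, las.y, las.z]).T
cls = np.asarray(las.classification)

# Keep ground-classified returns (ASPRS class 2) to see terrain under canopy.
ground = xyz[cls == 2]

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.scatter(ground[:, 0], ground[:, 1], ground[:, 2],
           s=0.2, c=ground[:, 2], cmap="terrain")
ax.set_xlabel("Easting (m)")
ax.set_ylabel("Northing (m)")
ax.set_zlabel("Elevation (m)")
plt.show()
```

Filtering to ground returns is what lets LiDAR "see through" vegetation: the canopy points are discarded and only the bare-earth surface, where subtle archaeological features show up, is rendered.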
Mathematical modelling of polyamine metabolism in bloodstream-form Trypanosoma brucei: An application to drug target identification
We present the first computational kinetic model of polyamine metabolism in bloodstream-form Trypanosoma brucei, the causative agent of human African trypanosomiasis. We systematically extracted the polyamine pathway from the complete metabolic network while still maintaining the predictive capability of the pathway. The kinetic model is constructed on the basis of information gleaned from the experimental biology literature and defined as a set of ordinary differential equations. We applied Michaelis-Menten kinetics featuring regulatory factors to describe enzymatic activities that are well defined. Uncharacterised enzyme kinetics were approximated and justified with available physiological properties of the system. Optimisation-based dynamic simulations were performed to train the model with experimental data, and inconsistent predictions prompted an iterative procedure of model refinement. Good agreement between simulation results and measured data reported under various experimental conditions shows that the model has good applicability in spite of gaps in the required data. With this kinetic model, the relative importance of the individual pathway enzymes was assessed. We observed that, at low-to-moderate levels of inhibition, the enzymes catalysing de novo AdoMet production (MAT) and ornithine production (OrnPt) have a more efficient inhibitory effect on total trypanothione content than other enzymes in the pathway. In our model, prozyme and TSHSyn (the production catalyst of total trypanothione) were also found to exert potent control on total trypanothione content, but only when strongly inhibited. Different chemotherapeutic strategies against T. brucei were investigated using this model, and interruption of polyamine synthesis via joint inhibition of MAT or OrnPt together with other polyamine enzymes was identified as an optimal therapeutic strategy.
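To make the modelling approach concrete, here is a toy sketch of Michaelis-Menten rate laws assembled into an ODE system and integrated with SciPy; the two-step pathway and all parameter values are illustrative, not the authors' T. brucei model:

```python
# A toy two-step pathway (substrate -> intermediate -> product) where each
# enzymatic step follows a Michaelis-Menten rate law and the first step can
# be partially inhibited, mimicking the in silico inhibition scans above.
from scipy.integrate import solve_ivp

VMAX_1, KM_1 = 1.0, 0.5    # step 1 kinetics (arbitrary units)
VMAX_2, KM_2 = 0.8, 0.3    # step 2 kinetics

def pathway(t, y, inhibition):
    s, i, p = y
    v1 = (1 - inhibition) * VMAX_1 * s / (KM_1 + s)   # inhibitable first step
    v2 = VMAX_2 * i / (KM_2 + i)
    return [-v1, v1 - v2, v2]

# Integrate from an initial substrate pool at 50% inhibition of step 1.
sol = solve_ivp(pathway, (0, 50), [5.0, 0.0, 0.0], args=(0.5,))
print("final product level at 50% inhibition:", sol.y[2, -1])
```

Sweeping the `inhibition` parameter across enzymes is the kind of sensitivity analysis the abstract describes for ranking drug targets.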
Global predictability of temperature extremes
Extreme temperatures are one of the leading causes of death and disease in both developed and developing countries, and heat extremes are projected to rise in many regions. To reduce risk, heatwave plans and cold weather plans have been implemented effectively around the world. However, much of the world's population is not yet protected by such systems, including many regions that are data-scarce but highly vulnerable. In this study, we assess at a global level where such systems have the potential to be effective at reducing risk from temperature extremes, characterizing (1) the long-term average occurrence of heatwaves and coldwaves, (2) the seasonality of these extremes, and (3) the short-term predictability of these extreme events three to ten days in advance. Using both the NOAA and ECMWF weather forecast models, we develop global maps indicating a first approximation of the locations that are likely to benefit from the development of seasonal preparedness plans and/or short-term early warning systems for extreme temperature. The extratropics generally show both short-term skill and strong seasonality; in the tropics, most locations also demonstrate one or both. In fact, almost 5 billion people live in regions with seasonality and predictability of heatwaves and/or coldwaves. Climate adaptation investments in these regions can take advantage of this seasonality and predictability to reduce risks to vulnerable populations.
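As an illustration of the kind of occurrence statistic such an assessment starts from, here is a minimal sketch that counts heatwave events in a daily maximum-temperature series; the 95th-percentile threshold and three-day run length are common choices in the literature, not necessarily this study's definitions, and the data are synthetic:

```python
# Count heatwave events: runs of at least 3 consecutive days whose maximum
# temperature exceeds the local 95th-percentile threshold (toy definition).
import numpy as np

rng = np.random.default_rng(1)
tmax = 25 + 8 * rng.random(3650)        # 10 years of daily Tmax (synthetic)

threshold = np.percentile(tmax, 95)     # local 95th-percentile threshold
hot = tmax > threshold

events, run = 0, 0
for day in hot:
    run = run + 1 if day else 0
    if run == 3:                        # count each qualifying run once
        events += 1

print(f"heatwave events per year: {events / 10:.2f}")
```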