
    Cost comparison of orthopaedic fracture pathways using discrete event simulation in a Glasgow hospital

    Objective: Healthcare faces the continual challenge of improving outcomes while aiming to reduce costs. The aim of this study was to determine the micro-cost differences between the Glasgow non-operative trauma virtual pathway and a traditional pathway. Design: Discrete event simulation was used to model and analyse cost and resource utilisation with an activity-based costing approach. Because data for a full comparison before the process change were unavailable, we used a modelling approach, comparing a Virtual Fracture Clinic (VFC) with a simulated Traditional Fracture Clinic (TFC). Setting: The orthopaedic unit VFC pathway pioneered at Glasgow Royal Infirmary has attracted significant attention and interest and is the focus of this cost study. Outcome measures: Our study focused exclusively on non-operative trauma patients attending the Emergency Department or the minor injuries unit and the subsequent step in the patient pathway, drawing on retrospective studies of patient outcomes following the protocol introductions for specific injuries together with activity costs from the models. Results: Patients are satisfied with the new pathway, the information provided, and the outcome of their injuries (Evidence Level IV). There was a 65% reduction in the number of first outpatient face-to-face attendances in orthopaedics. In the VFC pathway, the resources required per day were significantly lower for all staff groups (p<0.001). The overall cost per patient was £22.84 (95% CI: 21.74, 23.92) for the VFC pathway compared with £36.81 (95% CI: 35.65, 37.97) for the TFC pathway. Conclusions: Our results give a clearer picture of the cost advantage of the virtual pathway over a wholly traditional face-to-face clinic system. The use of simulation-based stochastic costing in healthcare economic analysis has been limited to date, but this study provides evidence for the adoption of this method as a basis for its application in other healthcare settings.
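The simulation-based stochastic costing described above can be illustrated with a minimal Monte Carlo sketch. The means below follow the quoted per-patient figures, but the cost spreads and sample sizes are invented placeholders, not the study's activity-based cost data:

```python
import random

random.seed(1)

# Minimal sketch of simulation-based stochastic costing: draw per-patient
# pathway costs from assumed distributions, then bootstrap a 95% CI on the
# mean cost. Means follow the quoted figures; the spreads are placeholders.
def simulate_costs(mean, sd, n=2000):
    return [max(0.0, random.gauss(mean, sd)) for _ in range(n)]

def mean_with_ci(costs, reps=500):
    n = len(costs)
    boot = sorted(
        sum(random.choice(costs) for _ in range(n)) / n for _ in range(reps)
    )
    return sum(costs) / n, boot[int(0.025 * reps)], boot[int(0.975 * reps)]

vfc = simulate_costs(22.84, 15.0)  # Virtual Fracture Clinic pathway
tfc = simulate_costs(36.81, 20.0)  # Traditional Fracture Clinic pathway

for name, costs in (("VFC", vfc), ("TFC", tfc)):
    m, lo, hi = mean_with_ci(costs)
    print(f"{name}: mean cost per patient £{m:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A full discrete event simulation would additionally model queues and staff resources per pathway step; this sketch only reproduces the stochastic-costing output format.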

    Single-Shot Electron Diffraction using a Cold Atom Electron Source

    Cold atom electron sources are a promising alternative to traditional photocathode sources for use in ultrafast electron diffraction due to greatly reduced electron temperature at creation, and the potential for a corresponding increase in brightness. Here we demonstrate single-shot, nanosecond electron diffraction from monocrystalline gold using cold electron bunches generated in a cold atom electron source. The diffraction patterns have sufficient signal to allow registration of multiple single-shot images, generating an averaged image with significantly higher signal-to-noise ratio than obtained with unregistered averaging. Reflection high-energy electron diffraction (RHEED) was also demonstrated, showing that cold atom electron sources may be useful in resolving nanosecond dynamics of nanometre scale near-surface structures. Comment: This is an author-created, un-copyedited version of an article published in Journal of Physics B: Atomic, Molecular and Optical Physics. IOP Publishing Ltd is not responsible for any errors or omissions in this version of the manuscript or any version derived from it. The Version of Record is available online at http://dx.doi.org/10.1088/0953-4075/48/21/21400
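The registration-then-averaging step described above can be sketched assuming integer-pixel shift registration via FFT cross-correlation. The synthetic "diffraction spot" frames, sizes, and noise level are illustrative assumptions, not the paper's data:

```python
import numpy as np

# Sketch of registering single-shot frames before averaging: estimate each
# frame's integer-pixel offset from a reference via the peak of the circular
# cross-correlation, undo it with np.roll, then average. Registered averaging
# keeps the peak sharp; naive averaging smears it over the random shifts.
rng = np.random.default_rng(0)

def make_frame(shift, size=64, noise=0.1):
    y, x = np.mgrid[:size, :size]
    spot = np.exp(-(((y - 32 - shift[0]) ** 2 + (x - 32 - shift[1]) ** 2) / 18))
    return spot + rng.normal(0, noise, (size, size))

def register(frame, ref):
    # Peak of the circular cross-correlation gives the integer shift.
    corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(frame))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    n = frame.shape[0]
    dy = (dy + n // 2) % n - n // 2  # wrap to signed offsets
    dx = (dx + n // 2) % n - n // 2
    return np.roll(frame, (dy, dx), axis=(0, 1))

ref = make_frame((0, 0))
frames = [make_frame((rng.integers(-5, 6), rng.integers(-5, 6)))
          for _ in range(20)]

registered = np.mean([register(f, ref) for f in frames], axis=0)
unregistered = np.mean(frames, axis=0)
print(f"peak: registered {registered[32, 32]:.2f} "
      f"vs unregistered {unregistered[32, 32]:.2f}")
```

Sub-pixel registration (e.g. by upsampled cross-correlation) would improve on this integer-shift sketch, at the cost of interpolation.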

    Atmospheric Processing Module for Mars Propellant Production

    The multi-NASA-center Mars Atmosphere and Regolith COllector/PrOcessor for Lander Operations (MARCO POLO) project was established to build and demonstrate a methane/oxygen propellant production system in a Mars analog environment. Work at the Kennedy Space Center (KSC) Applied Chemistry Laboratory is focused on the Atmospheric Processing Module (APM). The purpose of the APM is to freeze carbon dioxide from a simulated Martian atmosphere containing the minor components nitrogen, argon, carbon monoxide, and water vapor at Martian pressures (approximately 8 torr) by using dual cryocoolers with alternating cycles of freezing and sublimation. The resulting pressurized CO2 is fed to a methanation subsystem where it is catalytically combined with hydrogen in a Sabatier reactor supplied by the Johnson Space Center (JSC) to make methane and water vapor. We first used a simplified once-through setup and later employed a HiCO2 recycling system to improve process efficiency. This presentation and paper will cover (1) the design and selection of major hardware items, such as the cryocoolers, pumps, tanks, chillers, and membrane separators, (2) the determination of the optimal cold head design and flow rates needed to meet the collection requirement of 88 g CO2/hr for 14 hr, (3) the testing of the CO2 freezer subsystem, and (4) the integration and testing of the two subsystems to verify the desired production rate of 31.7 g CH4/hr and 71.3 g H2O/hr along with verification of their purity. The resulting 2.22 kg of CH4/O2 propellant per 14 hr day (including O2 from electrolysis of water recovered from regolith, which also supplies the H2 for methanation) is of the scale needed for a Mars Sample Return mission. In addition, the significance of the project to NASA's new Mars exploration plans will be discussed.
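The quoted rates can be cross-checked against Sabatier stoichiometry (CO2 + 4 H2 → CH4 + 2 H2O, with CH4 + 2 O2 setting the propellant oxygen mass). The ~99% conversion efficiency that emerges below is our inference from the stated figures, not a number from the paper:

```python
# Cross-check of the quoted MARCO POLO rates against Sabatier stoichiometry:
# CO2 + 4 H2 -> CH4 + 2 H2O, with CH4 + 2 O2 fixing the propellant O2 mass.
M_CO2, M_CH4, M_H2O, M_O2 = 44.01, 16.04, 18.02, 32.00  # molar masses, g/mol

co2_rate = 88.0                        # g CO2/hr collection requirement
mol_co2 = co2_rate / M_CO2             # ~2.0 mol CO2/hr

ch4_ideal = mol_co2 * M_CH4            # ~32.1 g CH4/hr at 100% conversion
h2o_ideal = 2 * mol_co2 * M_H2O        # ~72.1 g H2O/hr at 100% conversion

eff = 31.7 / ch4_ideal                 # quoted rate implies ~99% conversion

ch4_day = 31.7 * 14                    # g CH4 per 14 hr production day
o2_day = ch4_day * (2 * M_O2 / M_CH4)  # stoichiometric O2 for combustion
total_kg = (ch4_day + o2_day) / 1000   # ~2.2 kg CH4/O2 propellant per day

print(f"conversion ~{eff:.1%}; propellant {total_kg:.2f} kg/day")
```

The computed daily propellant mass matches the quoted 2.22 kg to within rounding of the stated hourly rates.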

    The Rise Times of High and Low Redshift Type Ia Supernovae are Consistent

    We present a self-consistent comparison of the rise times for low- and high-redshift Type Ia supernovae. Following previous studies, the early light curve is modeled using a t-squared law, which is then mated with a modified Leibundgut template light curve. The best-fit t-squared law is determined for ensemble samples of low- and high-redshift supernovae by fitting simultaneously for all light curve parameters for all supernovae in each sample. Our method fully accounts for the non-negligible covariance amongst the light curve fitting parameters, which previous analyses have neglected. Contrary to Riess et al. (1999), we find fair to good agreement between the rise times of the low- and high-redshift Type Ia supernovae. The uncertainty in the rise time of the high-redshift Type Ia supernovae is presently quite large (roughly +/- 1.2 days statistical), making any search for evidence of evolution based on a comparison of rise times premature. Furthermore, systematic effects on rise time determinations from the high-redshift observations, due to the form of the late-time light curve and the manner in which the light curves of these supernovae were sampled, can bias the high-redshift rise time determinations by up to +3.6/-1.9 days under extreme situations. The peak brightnesses - used for cosmology - do not suffer any significant bias, nor any significant increase in uncertainty. Comment: 18 pages, 4 figures. Accepted for publication in the Astronomical Journal. Also available at http://www.lbl.gov/~nugent/papers.html. Typos were corrected and a few sentences were added for improved clarity.
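The t-squared early-light-curve law described above can be sketched with a fit to synthetic data. The rise-time value, noise level, and brute-force grid search below are illustrative assumptions, not the paper's method or data:

```python
import numpy as np

# Sketch of the t-squared early-rise model: f(t) = a * (t - t_r)^2 for
# t > t_r, zero before. We generate noisy synthetic early-time fluxes with
# an assumed ~19.5 d rise time, then recover (a, t_r) by least squares.
rng = np.random.default_rng(42)

def t_squared(t, a, t_r):
    return a * np.clip(t - t_r, 0, None) ** 2

t_obs = np.linspace(-22, -8, 30)   # days relative to B-band maximum
true_a, true_tr = 0.02, -19.5      # assumed values for the synthetic data
flux = t_squared(t_obs, true_a, true_tr) + rng.normal(0, 0.02, t_obs.size)

# A brute-force grid search keeps the sketch dependency-free; a real fit
# would also propagate the covariance between a and t_r.
a_grid = np.linspace(0.005, 0.05, 200)
tr_grid = np.linspace(-21, -17, 200)
chi2 = [
    (np.sum((flux - t_squared(t_obs, a, tr)) ** 2), a, tr)
    for a in a_grid for tr in tr_grid
]
_, best_a, best_tr = min(chi2)
print(f"fitted rise time: {-best_tr:.1f} days")
```

The paper's key point, fitting all light-curve parameters simultaneously across the ensemble to capture parameter covariances, goes beyond this single-curve sketch.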

    Correcting the z~8 Galaxy Luminosity Function for Gravitational Lensing Magnification Bias

    We present a Bayesian framework to account for the magnification bias from both strong and weak gravitational lensing in estimates of high-redshift galaxy luminosity functions. We illustrate our method by estimating the $z\sim8$ UV luminosity function using a sample of 97 Y-band dropouts (Lyman break galaxies) found in the Brightest of Reionizing Galaxies (BoRG) survey and from the literature. We find the luminosity function is well described by a Schechter function with characteristic magnitude $M^\star = -19.85^{+0.30}_{-0.35}$, faint-end slope $\alpha = -1.72^{+0.30}_{-0.29}$, and number density $\log_{10} \Psi^\star\,[\mathrm{Mpc}^{-3}] = -3.00^{+0.23}_{-0.31}$. These parameters are consistent within the uncertainties with those inferred from the same sample without accounting for the magnification bias, demonstrating that the effect is small for current surveys at $z\sim8$, and cannot account for the apparent overdensity of bright galaxies compared to a Schechter function found recently by Bowler et al. (2014a,b) and Finkelstein et al. (2014). We estimate that the probability of finding a strongly lensed $z\sim8$ source in our sample is in the range $\sim3$-$15\%$ depending on limiting magnitude. We identify one strongly lensed candidate and three cases of intermediate lensing (estimated magnification $\mu>1.4$) in BoRG, in addition to the previously known candidate group-scale strong lens. Using a range of theoretical luminosity functions, we conclude that magnification bias will dominate wide-field surveys -- such as those planned for the Euclid and WFIRST missions -- especially at $z>10$. Magnification bias will need to be accounted for in order to derive accurate estimates of high-redshift luminosity functions in these surveys and to distinguish between galaxy formation models. Comment: Accepted for publication in ApJ. 20 pages, 13 figures.
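The quoted best-fit parameters define a Schechter function in absolute magnitude, phi(M) = 0.4 ln(10) Psi* 10^(-0.4(M - M*)(alpha + 1)) exp(-10^(-0.4(M - M*))). A short evaluation with those values (the magnitude grid is ours, not from the paper):

```python
import numpy as np

# Evaluate the quoted z~8 Schechter fit. Parameter central values are taken
# from the abstract; uncertainties are ignored in this sketch.
M_star, alpha, log_psi_star = -19.85, -1.72, -3.00
psi_star = 10.0 ** log_psi_star        # Mpc^-3 normalisation

def schechter_mag(M):
    x = 10.0 ** (-0.4 * (M - M_star))  # L / L_star
    return 0.4 * np.log(10) * psi_star * x ** (alpha + 1) * np.exp(-x)

for M in np.arange(-22.0, -17.0, 1.0):
    print(f"M = {M:6.1f}  phi = {schechter_mag(M):.2e} Mpc^-3 mag^-1")
```

With alpha < -1, the number density rises steeply faintward of M*, while the exponential cutoff suppresses the bright end, which is why a bright-end excess over this form (as reported by Bowler et al.) is notable.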

    Ambiguity, multiple streams, and EU policy

    The multiple streams framework draws insight from interactions between agency and institutions to explore the impact of context, time, and meaning on policy change and to assess the institutional and issue complexities permeating the European Union (EU) policy process. The authors specify the assumptions and structure of the framework and review studies that have adapted it to reflect more fully EU decision-making processes. The nature of policy entrepreneurship and policy windows are assessed to identify areas of improvement. Finally, the authors sketch out a research agenda that refines the logic of political manipulation which permeates the lens and the institutional complexity which frames the EU policy process

    The U.S. Catholic Bishops and Gay Civil Rights: Four Case Studies


    High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend line, it is important for the HEP community to develop an effective response to a series of expected challenges. To help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document and are presented along with introductory material. Comment: 72 pages.

    SN 2006bt: A Perplexing, Troublesome, and Possibly Misleading Type Ia Supernova

    SN 2006bt displays characteristics unlike those of any other known Type Ia supernova (SN Ia). We present optical light curves and spectra of SN 2006bt that demonstrate the peculiar nature of this object. SN 2006bt has broad, slowly declining light curves indicative of a hot, high-luminosity SN, but, like low-luminosity SNe Ia, lacks a prominent second maximum in the i band. Its spectra are similar to those of low-luminosity SNe Ia, containing features that are only present in cool SN photospheres. Light-curve fitting methods suggest that SN 2006bt is reddened by a significant amount of dust; however, it occurred in the outskirts of its early-type host galaxy and has no strong Na D absorption in any of its spectra, suggesting a negligible amount of host-galaxy dust absorption. C II is possibly detected in our pre-maximum spectra, but at a much lower velocity than other elements. The progenitor was likely very old, being a member of the halo population of a galaxy that shows no signs of recent star formation. SNe Ia have been very successfully modeled as a one-parameter family, and this is fundamental to their use as cosmological distance indicators. SN 2006bt challenges that picture, yet its relatively normal light curves allowed it to be included in cosmological analyses. We generate mock SN Ia datasets which indicate that contamination by similar objects will both increase the scatter of a SN Ia Hubble diagram and systematically bias measurements of cosmological parameters. However, spectra and rest-frame i-band light curves should provide a definitive way to identify and eliminate such objects. Comment: ApJ, accepted. 13 pages, 13 figures.
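The contamination experiment described above can be sketched with a toy Monte Carlo: a fraction of mock "SNe Ia" receive a systematic magnitude offset, inflating the Hubble-diagram scatter and biasing the mean residual. All numbers below are illustrative, not the paper's mock-dataset values:

```python
import random

random.seed(0)

# Toy version of the mock-dataset argument: Hubble residuals (mag) for a
# clean SN Ia sample vs one with a small fraction of 2006bt-like objects
# whose standardised magnitudes carry a systematic offset.
def mock_residuals(n, contam_frac, offset=0.3, sigma=0.15):
    res = []
    for _ in range(n):
        mu = random.gauss(0.0, sigma)  # normal SN Ia residual
        if random.random() < contam_frac:
            mu += offset               # 2006bt-like systematic offset
        res.append(mu)
    return res

def mean_and_scatter(res):
    n = len(res)
    mean = sum(res) / n
    var = sum((r - mean) ** 2 for r in res) / (n - 1)
    return mean, var ** 0.5

clean_mean, clean_scatter = mean_and_scatter(mock_residuals(20000, 0.0))
cont_mean, cont_scatter = mean_and_scatter(mock_residuals(20000, 0.05))
print(f"clean:        mean {clean_mean:+.3f} mag, scatter {clean_scatter:.3f} mag")
print(f"contaminated: mean {cont_mean:+.3f} mag, scatter {cont_scatter:.3f} mag")
```

Even a few percent contamination shifts the mean residual and broadens the scatter, which is the mechanism by which such objects would bias cosmological parameter fits.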