
    In-n-out: The Gas Cycle From Dwarfs To Spiral Galaxies

    Get PDF
    We examine the scalings of galactic outflows with halo mass across a suite of 20 high-resolution cosmological zoom galaxy simulations covering halo masses in the range 10^9.5-10^12 M_⊙. These simulations self-consistently generate outflows from the available supernova energy in a manner that successfully reproduces key galaxy observables, including the stellar mass–halo mass, Tully–Fisher, and mass–metallicity relations. We quantify the importance of ejective feedback to setting the stellar mass relative to the efficiency of gas accretion and star formation. Ejective feedback is increasingly important as galaxy mass decreases; we find an effective mass loading factor that scales as v_circ^-2.2, with an amplitude and shape that are invariant with redshift. These scalings are consistent with analytic models for energy-driven winds, based solely on the halo potential. Recycling is common: about half of the outflow mass across all galaxy masses is later reaccreted. The recycling timescale is typically ~1 Gyr, virtually independent of halo mass. Recycled material is reaccreted farther out in the disk and with typically ~2–3 times more angular momentum. These results elucidate and quantify how the baryon cycle plausibly regulates star formation and alters the angular momentum distribution of disk material across the halo mass range where most cosmic star formation occurs.
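    As a rough consistency check (a standard energy-driven wind argument, not a calculation taken from this abstract), equating the kinetic power carried by the wind to a fixed fraction epsilon of the supernova energy released per unit of star formation, and taking the wind speed to scale with the halo circular velocity, gives
    \[
    \dot{M}_{\rm wind}\, v_{\rm wind}^{2} \;\sim\; \epsilon\, \dot{M}_{\star}\, e_{\rm SN},
    \qquad v_{\rm wind} \propto v_{\rm circ}
    \;\;\Rightarrow\;\;
    \eta \equiv \frac{\dot{M}_{\rm wind}}{\dot{M}_{\star}} \;\propto\; v_{\rm circ}^{-2},
    \]
    where e_SN is the supernova energy per unit stellar mass formed; this is close to the measured slope of -2.2 and illustrates why the scaling is described as consistent with energy-driven winds.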

    Momentum asymmetries as CP violating observables

    Full text link
    Three-body decays can exhibit CP violation that arises from interfering diagrams with different orderings of the final state particles. We construct several momentum asymmetry observables that are accessible in a hadron collider environment where some of the final state particles are not reconstructed and not all the kinematic information can be extracted. We discuss the complications that arise from the different possible production mechanisms of the decaying particle. Examples involving heavy neutralino decays in supersymmetric theories and heavy Majorana neutrino decays in Type-I seesaw models are examined. Comment: 20 pages, 9 figures. Clarifying comments and one reference added; matches published version.
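    As a generic illustration (the paper's own observables are more detailed; the definition below is only a schematic example), a momentum asymmetry for a three-body decay X -> a b c in which only a and b are reconstructed can be built from a simple counting asymmetry in their transverse momenta,
    \[
    A \;=\; \frac{N\!\left(p_T^{a} > p_T^{b}\right) - N\!\left(p_T^{a} < p_T^{b}\right)}
                  {N\!\left(p_T^{a} > p_T^{b}\right) + N\!\left(p_T^{a} < p_T^{b}\right)},
    \]
    with CP violation probed by comparing A measured in the decay sample with the corresponding asymmetry in the CP-conjugate sample.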

    Why is it difficult to implement e-health initiatives? A qualitative study

    Get PDF
    Background: The use of information and communication technologies in healthcare is seen as essential for high-quality and cost-effective healthcare. However, implementation of e-health initiatives has often been problematic, with many failing to demonstrate predicted benefits. This study aimed to explore and understand the experiences of implementers - the senior managers and other staff charged with implementing e-health initiatives - and their assessment of factors which promote or inhibit the successful implementation, embedding, and integration of e-health initiatives.
    Methods: We used a case study methodology, using semi-structured interviews with implementers for data collection. Case studies were selected to provide a range of healthcare contexts (primary, secondary, community care), e-health initiatives, and degrees of normalization. The initiatives studied were a Picture Archiving and Communication System (PACS) in secondary care, a Community Nurse Information System (CNIS) in community care, and Choose and Book (C&B) across the primary-secondary care interface. Implementers were selected to provide a range of seniority, including chief executive officers, middle managers, and staff with 'on the ground' experience. Interview data were analyzed using a framework derived from Normalization Process Theory (NPT).
    Results: Twenty-three interviews were completed across the three case studies. There were wide differences in experiences of implementation and embedding across these case studies; these differences were well explained by the collective action components of NPT. New technology was most likely to 'normalize' where implementers perceived that it had a positive impact on interactions between professionals and patients and between different professional groups, and fit well with the organisational goals and skill sets of existing staff. However, where implementers perceived problems in one or more of these areas, they also perceived a lower level of normalization.
    Conclusions: Implementers had rich understandings of barriers and facilitators to successful implementation of e-health initiatives, and their views should continue to be sought in future research. NPT can be used to explain observed variations in implementation processes, and may be useful in drawing planners' attention to potential problems with a view to addressing them during implementation planning.

    Logarithmic correction to BH entropy as Noether charge

    Get PDF
    We consider the role of the type-A trace anomaly in static black hole solutions to the semiclassical Einstein equation in four dimensions. Via Wald's Noether charge formalism, we compute the contribution to the entropy coming from the anomaly-induced effective action and unveil a logarithmic correction to the Bekenstein-Hawking area law. The corrected entropy is given by a seemingly universal formula involving the coefficient of the type-A trace anomaly, the Euler characteristic of the horizon, and the value at the horizon of the solution to the uniformization problem for Q-curvature. Two instances are examined in detail: Schwarzschild and a four-dimensional massless topological black hole. We also find agreement with the logarithmic correction due to the one-loop contribution of conformal fields in the Schwarzschild background. Comment: 14 pages, JHEP style.
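    Schematically (the exact coefficient and the argument of the logarithm require the full paper), the corrected entropy described above has the structure of the area law plus a logarithmic term controlled by the type-A anomaly coefficient a and the horizon Euler characteristic chi(H),
    \[
    S \;=\; \frac{A_H}{4G} \;+\; c\, a\, \chi(H)\, \ln\frac{A_H}{A_0},
    \]
    where A_H is the horizon area, A_0 a reference area scale, and c a numerical factor that, per the abstract, also involves the value at the horizon of the solution to the uniformization problem for Q-curvature.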

    Presynaptic partner selection during retinal circuit reassembly varies with timing of neuronal regeneration in vivo

    Get PDF
    Whether neurons can restore their original connectivity patterns during circuit repair is unclear. Taking advantage of the regenerative capacity of the zebrafish retina, we show here the remarkable specificity with which surviving neurons reassemble their connectivity upon regeneration of their major input. H3 horizontal cells (HCs) normally avoid red and green cones, and prefer ultraviolet over blue cones. Upon ablation of the major (ultraviolet) input, H3 HCs do not immediately increase connectivity with other cone types. Instead, H3 dendrites retract and re-extend to contact new ultraviolet cones. However, if regeneration is delayed or absent, blue-cone synaptogenesis increases and ectopic synapses are made with red and green cones. Thus, cues directing synapse specificity can be maintained following input loss, but only within a limited time period. Further, we postulate that signals from the major input that shape the H3 HC's wiring pattern during development persist to restrict miswiring after damage.

    Development of fluorogenic probe-based PCR assays for the detection and quantification of bovine piroplasmids.

    Get PDF
    This paper reports two new quantitative PCR (qPCR) assays, developed in an attempt to improve the detection of bovine piroplasmids. The first of these techniques is a duplex TaqMan assay for the simultaneous diagnosis of Babesia bovis and B. bigemina. This technique is ideal for use in South America, where bovids harbour no theilerids. The second technique, which is suitable for the diagnosis of both babesiosis and theileriosis worldwide, involves fluorescence resonance energy transfer (FRET) probes. In FRET assays, Babesia bovis, B. divergens, Babesia sp. (B. major or B. bigemina), Theileria annae and Theileria sp. were all identifiable based on the melting temperatures of their amplified fragments. Both techniques provided linear calibration curves over the 0.1 fg/µl to 0.01 ng/µl DNA range. The assays showed good sensitivity and specificity. To assess their performance, both procedures were compared in two separate studies: the first was intended to monitor the experimental infection of calves with B. bovis, and the second was a survey in which 200 bovid/equine DNA samples from different countries were screened for piroplasmids. Comparative studies showed that the duplex TaqMan qPCR was more sensitive than the FRET qPCR in the detection of babesids.
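    For context (standard qPCR relations rather than anything specific to this paper), a linear calibration curve relates the quantification cycle Cq to the logarithm of the starting template amount N_0, and its slope fixes the amplification efficiency E:
    \[
    C_q \;=\; -\frac{1}{\log_{10}(1+E)}\,\log_{10} N_0 \;+\; \text{const},
    \qquad
    E \;=\; 10^{-1/\text{slope}} - 1,
    \]
    so an ideal efficiency of E = 1 corresponds to a slope of about -3.32 cycles per decade of template, and the quoted range of 0.1 fg/µl to 0.01 ng/µl spans five orders of magnitude.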

    Graphene plasmonics

    Full text link
    Two rich and vibrant fields of investigation, graphene physics and plasmonics, strongly overlap. Not only does graphene possess intrinsic plasmons that are tunable and adjustable, but a combination of graphene with noble-metal nanostructures promises a variety of exciting applications for conventional plasmonics. The versatility of graphene means that graphene-based plasmonics may enable the manufacture of novel optical devices working in different frequency ranges, from terahertz to the visible, with extremely high speed, low driving voltage, low power consumption and compact sizes. Here we review the field emerging at the intersection of graphene physics and plasmonics. Comment: Review article; 12 pages, 6 figures, 99 references (final version available only at the publisher's web site).
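    As background (a textbook result rather than a finding of this review), the tunability mentioned above follows from the long-wavelength dispersion of intrinsic graphene plasmons, whose frequency depends on the Fermi energy E_F and hence on doping or gate voltage:
    \[
    \omega_{\rm pl}(q) \;\simeq\; \sqrt{\frac{e^{2} E_F\, q}{2\pi \varepsilon_0 \bar{\varepsilon}\, \hbar^{2}}},
    \]
    where q is the in-plane wavevector and \bar{\varepsilon} the average permittivity of the surrounding media; shifting E_F electrostatically moves the plasmon resonance, which is what makes graphene plasmons electrically tunable.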

    Emergent global patterns of ecosystem structure and function from a mechanistic general ecosystem model

    Get PDF
    Anthropogenic activities are causing widespread degradation of ecosystems worldwide, threatening the ecosystem services upon which all human life depends. Improved understanding of this degradation is urgently needed to improve avoidance and mitigation measures. One tool to assist these efforts is predictive models of ecosystem structure and function that are mechanistic: based on fundamental ecological principles. Here we present the first mechanistic General Ecosystem Model (GEM) of ecosystem structure and function that is both global and applicable to all terrestrial and marine environments. Functional forms and parameter values were derived from the theoretical and empirical literature where possible. Simulations of the fate of all organisms with body masses between 10 µg and 150,000 kg (a range of 14 orders of magnitude) across the globe led to emergent properties at individual (e.g., growth rate), community (e.g., biomass turnover rates), ecosystem (e.g., trophic pyramids), and macroecological scales (e.g., global patterns of trophic structure) that are in general agreement with current data and theory. These properties emerged from our encoding of the biology of, and interactions among, individual organisms without any direct constraints on the properties themselves. Our results indicate that ecologists have gathered sufficient information to begin to build realistic, global, and mechanistic models of ecosystems, capable of predicting a diverse range of ecosystem properties and their response to human pressures.
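    To make the notion of a mechanistic, individual-based formulation concrete, the sketch below is a toy illustration only: it uses generic allometric rate rules (hypothetical exponents and constants, not the Madingley model's calibrated functional forms), and shows how a community-level property - here, the biomass size spectrum - emerges from rules applied to individuals.

    import math

    # Hypothetical allometric exponents (generic metabolic-theory-style values,
    # not parameters from the paper).
    GROWTH_EXP = -0.25      # mass-specific growth rate ~ M^-1/4
    MORTALITY_EXP = -0.25   # per-capita mortality rate ~ M^-1/4

    def step(cohorts, dt=0.1):
        """Advance each cohort (body_mass_g, abundance) by one time step."""
        advanced = []
        for mass, n in cohorts:
            growth = 0.5 * mass ** GROWTH_EXP        # individuals grow...
            mortality = 0.2 * mass ** MORTALITY_EXP  # ...and some die
            mass *= math.exp(growth * dt)
            n *= math.exp(-mortality * dt)
            if n > 1e-6:
                advanced.append((mass, n))
        # crude reproduction rule: large, abundant cohorts seed new small-bodied cohorts
        for mass, n in list(advanced):
            if mass > 100.0 and n > 10.0:
                advanced.append((1.0, 0.05 * n))
        return advanced

    # start from small-bodied cohorts only and iterate
    cohorts = [(1.0, 1000.0) for _ in range(10)]
    for _ in range(500):
        cohorts = step(cohorts)

    # emergent community-level property: biomass per logarithmic body-mass bin
    biomass = {}
    for mass, n in cohorts:
        decade = int(math.log10(mass))
        biomass[decade] = biomass.get(decade, 0.0) + mass * n
    for decade in sorted(biomass):
        print(f"10^{decade}-10^{decade + 1} g: biomass = {biomass[decade]:.1f}")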

    The 2009 Samoa–Tonga great earthquake triggered doublet

    Get PDF
    Great earthquakes (having seismic magnitudes of at least 8) usually involve abrupt sliding of rock masses at a boundary between tectonic plates. Such interplate ruptures produce dynamic and static stress changes that can activate nearby intraplate aftershocks, as is commonly observed in the trench-slope region seaward of a great subduction zone thrust event [1]. The earthquake sequence addressed here involves a rare instance in which a great trench-slope intraplate earthquake triggered extensive interplate faulting, reversing the typical pattern and broadly expanding the seismic and tsunami hazard. On 29 September 2009, within two minutes of the initiation of a normal faulting event with moment magnitude 8.1 in the outer trench-slope at the northern end of the Tonga subduction zone, two major interplate underthrusting subevents (both with moment magnitude 7.8), with total moment equal to a second great earthquake of moment magnitude 8.0, ruptured the nearby subduction zone megathrust. The collective faulting produced tsunami waves with localized regions of about 12 metres run-up that claimed 192 lives in Samoa, American Samoa and Tonga. Overlap of the seismic signals obscured the fact that distinct faults separated by more than 50 km had ruptured with different geometries, with the triggered thrust faulting only being revealed by detailed seismic wave analyses. Extensive interplate and intraplate aftershock activity was triggered over a large region of the northern Tonga subduction zone.
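    The statement that two magnitude-7.8 subevents carry the combined moment of a single magnitude-8.0 event can be checked with the standard moment-magnitude relation (a routine conversion, not an additional result of the paper):
    \[
    M_W = \tfrac{2}{3}\left(\log_{10} M_0 - 9.1\right),\qquad
    M_0(7.8) \approx 6.3\times10^{20}\ \mathrm{N\,m},\qquad
    2\,M_0(7.8) \approx 1.3\times10^{21}\ \mathrm{N\,m} \;\Rightarrow\; M_W \approx 8.0.
    \]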

    The quest for the solar g modes

    Full text link
    Solar gravity modes (or g modes) -- oscillations of the solar interior for which buoyancy acts as the restoring force -- have the potential to provide unprecedented inference on the structure and dynamics of the solar core, inference that is not possible with the well observed acoustic modes (or p modes). The high amplitude of the g-mode eigenfunctions in the core and the evanescence of the modes in the convection zone make the modes particularly sensitive to the physical and dynamical conditions in the core. Owing to the existence of the convection zone, the g modes have very low amplitudes at photospheric levels, which makes the modes extremely hard to detect. In this paper, we review the current state of play regarding attempts to detect g modes. We review the theory of g modes, including theoretical estimation of the g-mode frequencies, amplitudes and damping rates. Then we go on to discuss the techniques that have been used to try to detect g modes. We review results in the literature, and finish by looking to the future, and the potential advances that can be made -- from both data and data-analysis perspectives -- to give unambiguous detections of individual g modes. The review ends by concluding that, at the time of writing, there is indeed a consensus amongst the authors that there is currently no undisputed detection of solar g modes. Comment: 71 pages, 18 figures, accepted by Astronomy and Astrophysics Review.
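    For orientation (the standard asymptotic result, not a finding of the review itself), high-order g modes of degree l are approximately uniformly spaced in period, with the spacing set by the buoyancy (Brunt-Väisälä) frequency N in the radiative interior,
    \[
    P_{n,\ell} \;\simeq\; \frac{\Pi_0}{\sqrt{\ell(\ell+1)}}\,(n + \epsilon_g),
    \qquad
    \Pi_0 = 2\pi^{2}\left(\int_{\rm cavity} \frac{N}{r}\,dr\right)^{-1},
    \]
    which is why many searches look for comb-like patterns of nearly uniform period spacing rather than for isolated individual peaks.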