5,408 research outputs found

    From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument

    Background: Although empirical and theoretical understanding of processes of implementation in health care is advancing, the translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field.

    Methods: A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals.

    Results: The developed instrument was pre-tested in two professional samples (N = 46; N = 231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts.

    Conclusions: To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and the extent of translation work required; (2) the need for appropriate but flexible approaches to outcome measurement; (3) representation of multiple perspectives and the collaborative nature of the work; and (4) emphasis on generic measurement approaches that can be flexibly tailored to particular contexts of study.
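    The abstract does not state which statistic linked item ratings to perceived routinisation; as a purely illustrative sketch, one simple way to test such an association is a point-biserial correlation between a Likert-type TARS item score and a binary 'has become routine' indicator (the data below are simulated, not from the study):

```python
import numpy as np
from scipy import stats

# Simulated example only: ratings of a single TARS item on a 1-5 Likert scale
# and a binary indicator of whether staff perceive e-health as 'routine'.
rng = np.random.default_rng(1)
routine = rng.integers(0, 2, size=231)               # 0 = not routine, 1 = routine
item = np.clip(np.round(3 + routine + rng.normal(0, 1, 231)), 1, 5)

# Point-biserial correlation between the continuous item score and the
# dichotomous outcome (equivalent to a Pearson correlation in this setting).
r, p = stats.pointbiserialr(routine, item)
print(f"r = {r:.2f}, p = {p:.3g}")
```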

    Chandra Observations of the Radio Galaxy 3C 445 and the Hotspot X-ray Emission Mechanism

    We present new {\it Chandra} observations of the radio galaxy 3C 445, centered on its southern radio hotspot. Our observations detect X-ray emission displaced upstream and to the west of the radio-optical hotspot. Attempting to reproduce both the observed spectral energy distribution (SED) and the displacement excludes all one-zone models. Modeling of the radio-optical hotspot spectrum suggests that the electron distribution has a low-energy cutoff or break approximately at the proton rest-mass energy. The X-rays could be due to external Compton scattering of the cosmic microwave background (EC/CMB) coming from the fast (Lorentz factor $\Gamma \approx 4$) part of a decelerating flow, but this requires a small angle between the jet velocity and the observer's line of sight ($\theta \approx 14^{\circ}$). Alternatively, the X-ray emission can be synchrotron from a separate population of electrons. This last interpretation does not require the X-ray emission to be beamed. Comment: 9 pages, 5 figures, ApJ, in press.
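    The abstract does not show the beaming argument, but the quoted numbers can be checked against the standard relativistic Doppler factor (an illustrative calculation, not taken from the paper):

    \[
    \delta = \frac{1}{\Gamma\,(1-\beta\cos\theta)}, \qquad \beta = \sqrt{1 - 1/\Gamma^{2}},
    \]

    so for $\Gamma \approx 4$ ($\beta \approx 0.968$) and $\theta \approx 14^{\circ}$ ($\cos\theta \approx 0.970$) one finds $\delta \approx 1/(4 \times 0.061) \approx 4$, i.e. the line of sight lies near $\theta \approx 1/\Gamma$, where EC/CMB emission is still strongly boosted; at substantially larger viewing angles the predicted X-ray flux falls off rapidly, which is why the EC/CMB interpretation requires a small angle.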

    How Gaussian competition leads to lumpy or uniform species distributions

    A central model in theoretical ecology considers the competition of a range of species for a broad spectrum of resources. Recent studies have shown that essentially two different outcomes are possible. Either the species surviving competition are more or less uniformly distributed over the resource spectrum, or their distribution is 'lumped' (or 'clumped'), consisting of clusters of species with similar resource use that are separated by gaps in resource space. Which of these outcomes will occur crucially depends on the competition kernel, which reflects the shape of the resource utilization pattern of the competing species. Most models considered in the literature assume a Gaussian competition kernel. This is unfortunate, since predictions based on such a Gaussian assumption are not robust. In fact, Gaussian kernels are a border-case scenario, and slight deviations from this function can lead to either uniform or lumped species distributions. Here we illustrate the non-robustness of the Gaussian assumption by simulating different implementations of the standard competition model with constant carrying capacity. In this scenario, lumped species distributions can come about by secondary ecological or evolutionary mechanisms or by details of the numerical implementation of the model. We analyze the origin of this sensitivity and discuss it in the context of recent applications of the model. Comment: 11 pages, 3 figures, revised version.
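    As a minimal sketch of the standard competition model the abstract refers to (Lotka-Volterra dynamics with a Gaussian kernel and constant carrying capacity), the snippet below is purely illustrative; the number of species, kernel width, and integration time are arbitrary choices, and, as the abstract stresses, the surviving pattern is sensitive to the exact kernel shape and to numerical details:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters: 101 species evenly spaced along a one-dimensional
# resource axis, competing according to dN_i/dt = r N_i (1 - (A N)_i / K).
n_species = 101
x = np.linspace(0.0, 1.0, n_species)   # resource-use positions of the species
sigma = 0.15                           # kernel width (arbitrary choice)
r, K = 1.0, 1.0                        # growth rate and constant carrying capacity

# Gaussian competition kernel A_ij = exp(-(x_i - x_j)^2 / (2 sigma^2)).
A = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2.0 * sigma**2))

def competition(t, N):
    """Lotka-Volterra competition dynamics with the Gaussian kernel."""
    return r * N * (1.0 - A @ N / K)

rng = np.random.default_rng(0)
N0 = 0.01 + 1e-4 * rng.random(n_species)          # small, slightly noisy start
sol = solve_ivp(competition, (0.0, 5000.0), N0, method="LSODA", rtol=1e-8)

survivors = sol.y[:, -1] > 1e-6
print(f"{survivors.sum()} species persist; their positions: {x[survivors]}")
```

    Swapping the Gaussian kernel for a slightly flatter or more peaked function is exactly the kind of perturbation that, per the abstract, can switch the outcome between lumped and uniform distributions.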

    Iterated maps for clarinet-like systems

    The dynamical equations of clarinet-like systems are known to be reducible to a non-linear iterated map within reasonable approximations. This leads to time oscillations that are represented by square signals, analogous to the Raman regime for string instruments. In this article, we study in more detail the properties of the corresponding non-linear iterations, with emphasis on the geometrical constructions that can be used to classify the various solutions (for instance with or without reed beating) as well as on the periodicity windows that occur within the chaotic region. In particular, we find a regime where period tripling occurs and examine the conditions for intermittency. We also show that, while the direct observation of the iteration function does not reveal much about the oscillation regime of the instrument, the graph of the high-order iterates directly gives visible information on the oscillation regime (characterization of the number of period doublings, chaotic behaviour, etc.).
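    The clarinet iteration function itself depends on reed and bore parameters not reproduced in this abstract, so the sketch below uses the logistic map as a generic stand-in simply to illustrate the point about high-order iterates: intersections of the graph of the n-th iterate with the identity line are the period-n points, which are hard to read off the graph of the map itself:

```python
import numpy as np
import matplotlib.pyplot as plt

# Stand-in map (logistic) to illustrate reading oscillation regimes from
# high-order iterates: fixed points of the n-th iterate f^n, i.e. intersections
# of its graph with the identity line, correspond to period-n cycles.
def f(x, mu):
    return mu * x * (1.0 - x)

def iterate(x, mu, n):
    for _ in range(n):
        x = f(x, mu)
    return x

x = np.linspace(0.0, 1.0, 2000)
for mu, n in [(3.2, 2), (3.84, 3)]:          # a period-2 and a period-3 window
    plt.plot(x, iterate(x, mu, n), label=f"$f^{{{n}}}$ at mu = {mu}")
plt.plot(x, x, "k--", label="identity")      # crossings mark periodic points
plt.xlabel("x"); plt.ylabel("high-order iterate"); plt.legend()
plt.show()
```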

    Self-organising Thermoregulatory Huddling in a Model of Soft Deformable Littermates

    Thermoregulatory huddling behaviours dominate the early experiences of developing rodents, and constrain the patterns of sensory and motor input that drive neural plasticity. Huddling is a complex emergent group behaviour, thought to provide an early template for the development of adult social systems, and to constrain natural selection on metabolic physiology. However, huddling behaviours are governed by simple rules of interaction between individuals, which can be described in terms of the thermodynamics of heat exchange, and can be easily controlled by manipulation of the environment temperature. Thermoregulatory huddling thus provides an opportunity to investigate the effects of early experience on brain development in a social, developmental, and evolutionary context, through controlled experimentation. This paper demonstrates that thermoregulatory huddling behaviours can self-organise in a simulation of rodent littermates modelled as soft-deformable bodies that exchange heat during contact. The paper presents a novel methodology, based on techniques in computer animation, for simulating the early sensory and motor experiences of the developing rodent.
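    The paper's soft-body simulation is not reproduced here; the snippet below is only a toy sketch of the kind of pairwise thermal rule the abstract describes, with metabolic heat production, surface heat loss reduced by each contact, and conductive exchange between touching littermates (all coefficients and the contact pattern are made up):

```python
import numpy as np

# Toy thermal rule for a litter of pups: metabolic heat production, heat loss
# through the exposed surface (each contact shields some surface), and
# conductive exchange with touching littermates. All numbers are made up.
rng = np.random.default_rng(2)
n_pups, T_env, q_met = 8, 10.0, 1.5
T = 37.0 + rng.normal(0.0, 0.5, n_pups)        # initial body temperatures (C)
k_loss, k_contact, dt = 0.08, 0.2, 0.1

contact = rng.random((n_pups, n_pups)) < 0.3   # random symmetric contact pattern
contact = np.triu(contact, 1)
contact = contact | contact.T
exposed = 1.0 - 0.1 * contact.sum(axis=1)      # fraction of surface still exposed

for _ in range(2000):
    exchange = k_contact * (contact * (T[None, :] - T[:, None])).sum(axis=1)
    T = T + dt * (q_met - k_loss * exposed * (T - T_env) + exchange)

print(np.round(T, 2))   # pups with more contacts settle at higher temperatures
```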

    Dissociation constants and thermodynamic properties of amino acids used in CO2 absorption from (293 to 353) K

    The second dissociation constants of the amino acids β-alanine, taurine, sarcosine, 6-aminohexanoic acid, DL-methionine, glycine, L-phenylalanine, and L-proline and the third dissociation constants of L-glutamic acid and L-aspartic acid have been determined from electromotive force measurements at temperatures from (293 to 353) K. Experimental results are reported and compared to literature values. Values of the standard state thermodynamic properties are derived from the experimental results and compared to the values of commercially available amines used as absorbents for CO2 capture.
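    The working equations are not given in the abstract; as a sketch, the standard relations usually used to pass from measured dissociation constants to standard-state thermodynamic properties are

    \[
    \Delta G^{\circ} = -RT\ln K_a = RT\ln(10)\,\mathrm{p}K_a, \qquad
    \frac{d\ln K_a}{d(1/T)} = -\frac{\Delta H^{\circ}}{R}, \qquad
    \Delta S^{\circ} = \frac{\Delta H^{\circ} - \Delta G^{\circ}}{T},
    \]

    so a van't Hoff fit of $\ln K_a$ against $1/T$ over (293 to 353) K gives $\Delta H^{\circ}$ from the slope (assuming it is roughly constant over the range), with $\Delta G^{\circ}$ and $\Delta S^{\circ}$ following at each temperature.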

    Deep-coverage whole genome sequences and blood lipids among 16,324 individuals.

    Large-scale deep-coverage whole-genome sequencing (WGS) is now feasible and offers potential advantages for locus discovery. We perform WGS in 16,324 participants from four ancestries at mean depth >29X and analyze genotypes with four quantitative traits: plasma total cholesterol, low-density lipoprotein cholesterol (LDL-C), high-density lipoprotein cholesterol, and triglycerides. Common variant association yields known loci, except for a few variants previously poorly imputed. Rare coding variant association yields known Mendelian dyslipidemia genes, but rare non-coding variant association detects no signals. A high 2M-SNP LDL-C polygenic score (top 5th percentile) confers an effect size similar to a monogenic mutation (~30 mg/dl higher for each); however, among those with severe hypercholesterolemia, 23% have a high polygenic score and only 2% carry a monogenic mutation. At these sample sizes and for these phenotypes, the incremental value of WGS for discovery is limited, but WGS permits simultaneous assessment of monogenic and polygenic contributions to severe hypercholesterolemia.
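    A polygenic score of the kind described (the 2M-SNP LDL-C score) is in essence a weighted sum of allele dosages; the sketch below uses made-up effect sizes and genotypes rather than the study's weights or data:

```python
import numpy as np

# Minimal sketch of a polygenic score: a weighted sum of allele dosages.
# Effect sizes would come from LDL-C GWAS summary statistics; the arrays
# below are random placeholders, not weights or genotypes from the study.
rng = np.random.default_rng(0)
n_individuals, n_snps = 1_000, 2_000          # the study's score uses ~2M SNPs
dosages = rng.integers(0, 3, size=(n_individuals, n_snps)).astype(float)
betas = rng.normal(0.0, 0.01, size=n_snps)    # per-allele effects on LDL-C (mg/dl)

scores = dosages @ betas                      # polygenic score per individual
high = scores >= np.quantile(scores, 0.95)    # "high" score = top 5th percentile
print(f"{high.sum()} of {n_individuals} individuals fall in the top 5% of the score")
```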

    A high-resolution radio survey of the Vela supernova remnant

    This paper presents a high-resolution radio continuum (843 MHz) survey of the Vela supernova remnant. The contrast between the structures in the central pulsar-powered nebula of the remnant and the synchrotron radiation shell allows the remnant to be identified morphologically as a member of the composite class. The data are the first of a composite remnant at spatial scales comparable with those available for the Cygnus Loop and the Crab Nebula, and make possible a comparison of radio, optical and soft X-ray emission from the resolved shell filaments. The survey, made with the Molonglo Observatory Synthesis Telescope, covers an area of 50 square degrees at a resolution of 43'' x 60'', while imaging structures on scales up to 30'. Comment: 18 pages, 7 jpg figures (version with ps figures at http://astro.berkeley.edu/~dbock/papers/); AJ, in press.

    Gravitational waves in dynamical spacetimes with matter content in the Fully Constrained Formulation

    The Fully Constrained Formulation (FCF) of General Relativity is a novel framework introduced as an alternative to the hyperbolic formulations traditionally used in numerical relativity. The FCF equations form a hybrid elliptic-hyperbolic system that includes the constraints explicitly. We present an implicit-explicit numerical algorithm to solve the hyperbolic part, whereas the elliptic sector shares the form and properties of the well-known Conformally Flat Condition (CFC) approximation. We show the stability and convergence properties of the numerical scheme with numerical simulations of vacuum solutions. We have performed the first numerical evolutions of the coupled system of hydrodynamics and Einstein equations within FCF. As a proof of principle of the viability of the formalism, we present 2D axisymmetric simulations of an oscillating neutron star. In order to simplify the analysis we have neglected the back-reaction of the gravitational waves on the dynamics, which is small (< 2%) for the system considered in this work. We use spherical coordinate grids, which are well adapted for simulations of stars and allow for extended grids that marginally reach the wave zone. We have extracted the gravitational wave signature and compared it to the Newtonian quadrupole and hexadecapole formulae. Both extraction methods show agreement within the numerical errors and the approximations used (~30%). Comment: 17 pages, 9 figures, 2 tables, accepted for publication in PR
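    For reference, the Newtonian quadrupole formula used as one of the comparison extraction methods reads, in its standard form (the hexadecapole correction is omitted here):

    \[
    h^{TT}_{ij}(t) \simeq \frac{2G}{c^{4}D}\,\ddot{\mathcal{I}}_{ij}\!\left(t - \frac{D}{c}\right),
    \qquad
    \mathcal{I}_{ij} = \int \rho \left(x_i x_j - \tfrac{1}{3}\delta_{ij} r^{2}\right) d^{3}x,
    \]

    where $D$ is the distance to the source and $\mathcal{I}_{ij}$ is the trace-free mass quadrupole moment.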