
    Anomalous material-dependent transport of focused, laser-driven proton beams.

    Intense lasers can accelerate protons in sufficient numbers and to sufficient energies that the resulting beam can heat materials to exotic warm (tens of eV temperature) states. Here we show with experimental data that a laser-driven proton beam focused onto a target heated it in a localized spot whose size depended strongly on the material and was as small as 35 μm in radius. Simulations indicate that cold stopping-power values cannot model intense proton beam transport in solid targets well enough to match the large differences observed. In the experiment, a 74 J, 670 fs laser drove a focusing proton beam that was transported through different thicknesses of solid Mylar, Al, Cu or Au, eventually heating a rear, thin Au witness layer. The XUV emission seen from the rear of the Au indicated a clear dependence of proton beam transport upon the atomic number, Z, of the transport layer: a larger and brighter emission spot was measured after proton transport through the lower-Z foils, even with equal mass density chosen for nominally equivalent proton stopping range. Beam transport dynamics pertaining to the observed heated spot were investigated numerically with a particle-in-cell (PIC) code. In the simulations, protons moving through an Al transport layer produce a higher Au temperature, and hence a higher Au radiant emittance, than in the Cu transport case. The inferred finding that proton stopping varies with temperature differently in different materials, considerably changing the beam heating profile, can guide applications seeking to controllably heat targets with intense proton beams.
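A Z-dependence of stopping at equal mass thickness is already present in the cold-matter Bethe formula, which depends on Z/A and the mean excitation energy I rather than on density alone. The sketch below is a minimal illustration of that point only, not the PIC or stopping model used in the paper: it omits shell, density, and Barkas corrections, and the material constants (I ≈ 166 eV for Al, ≈ 322 eV for Cu) are standard tabulated approximations.

```python
import math

ME_C2 = 0.511    # electron rest energy [MeV]
MP_C2 = 938.272  # proton rest energy [MeV]
K = 0.307075     # Bethe coefficient [MeV mol^-1 cm^2]

def bethe_mass_stopping(T_mev, Z, A, I_ev):
    """Cold-matter Bethe mass stopping power -dE/d(rho*x) in MeV cm^2/g
    for a proton of kinetic energy T_mev (no shell/density corrections)."""
    gamma = 1.0 + T_mev / MP_C2
    beta2 = 1.0 - 1.0 / gamma**2
    I = I_ev * 1e-6  # mean excitation energy in MeV
    log_term = math.log(2.0 * ME_C2 * beta2 * gamma**2 / I)
    return K * (Z / A) / beta2 * (log_term - beta2)

# Same areal density, different Z: Al stops a 10 MeV proton harder than Cu.
for name, Z, A, I in [("Al", 13, 26.98, 166.0), ("Cu", 29, 63.55, 322.0)]:
    print(name, round(bethe_mass_stopping(10.0, Z, A, I), 1), "MeV cm^2/g")
```

The Al value comes out noticeably larger than the Cu value per g/cm², which is why "equal mass density" foils are only nominally range-matched; the paper's point is that heating makes this divergence even larger than the cold formula predicts.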

    Determinants of the voltage dependence of G protein modulation within calcium channel β subunits

    CaVβ subunits of voltage-gated calcium channels contain two conserved domains, a src-homology-3 (SH3) domain and a guanylate kinase-like (GK) domain, with an intervening HOOK domain. We have shown in a previous study that, although Gβγ-mediated inhibitory modulation of CaV2.2 channels did not require the interaction of a CaVβ subunit with the CaVα1 subunit, when such interaction was prevented by a mutation in the α1 subunit, G protein modulation could not be removed by a large depolarization and showed voltage-independent properties (Leroy et al., J Neurosci 25:6984–6996, 2005). In this study, we have investigated the ability of mutant and truncated CaVβ subunits to support voltage-dependent G protein modulation, in order to determine the minimal domain of the CaVβ subunit that is required for this process. We have coexpressed the CaVβ subunit constructs with CaV2.2 and α2δ-2, studied modulation by the activation of the dopamine D2 receptor, and also examined basal tonic modulation. Our main finding is that the CaVβ subunit GK domains, from either β1b or β2, are sufficient to restore voltage dependence to G protein modulation. We also found that the removal of the variable HOOK region from β2a promotes tonic voltage-dependent G protein modulation. We propose that the absence of the HOOK region enhances Gβγ binding affinity, leading to greater tonic modulation by basal levels of Gβγ. This tonic modulation requires the presence of an SH3 domain, as it is not supported by any of the CaVβ subunit GK domains alone.

    Viral population estimation using pyrosequencing

    The diversity of virus populations within single infected hosts presents a major difficulty for the natural immune response as well as for vaccine design and antiviral drug therapy. Recently developed pyrophosphate-based sequencing technologies (pyrosequencing) can be used for quantifying this diversity by ultra-deep sequencing of virus samples. We present computational methods for the analysis of such sequence data and apply these techniques to pyrosequencing data obtained from HIV populations within patients harboring drug-resistant virus strains. Our main result is the estimation of the population structure of the sample from the pyrosequencing reads. This inference is based on a statistical approach to error correction, followed by a combinatorial algorithm for constructing a minimal set of haplotypes that explain the data. Using this set of explaining haplotypes, we apply a statistical model to infer the frequencies of the haplotypes in the population via an EM algorithm. We demonstrate that pyrosequencing reads allow for effective population reconstruction by extensive simulations and by comparison to 165 sequences obtained directly from clonal sequencing of four independent, diverse HIV populations. Thus, pyrosequencing can be used for cost-effective estimation of the structure of virus populations, promising new insights into viral evolutionary dynamics and disease control strategies. (23 pages, 13 figures.)
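The final inference step described above, estimating haplotype frequencies from reads with an EM algorithm, can be sketched as follows. This is an illustrative toy, not the authors' implementation: each read is reduced to the set of haplotypes it is consistent with, sidestepping the error-correction and haplotype-construction stages the abstract describes.

```python
def em_haplotype_freqs(compat, n_hap, iters=200):
    """Estimate haplotype frequencies from reads via EM.
    compat[i] is the set of haplotype indices consistent with read i
    (a simplification: exact compatibility instead of an error model)."""
    f = [1.0 / n_hap] * n_hap
    for _ in range(iters):
        counts = [0.0] * n_hap
        for hs in compat:
            z = sum(f[h] for h in hs)       # E-step: responsibilities
            for h in hs:
                counts[h] += f[h] / z
        f = [c / len(compat) for c in counts]  # M-step: renormalise
    return f

# Toy data: 12 reads over 3 haplotypes; two reads are ambiguous between 0 and 1.
reads = [{0}] * 6 + [{1}] * 3 + [{0, 1}] * 2 + [{2}]
print([round(x, 3) for x in em_haplotype_freqs(reads, 3)])
```

The ambiguous reads are apportioned in proportion to the current frequency estimates, so haplotype 0 (with more unambiguous support) absorbs more of them at convergence.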

    SCAMP: standardised, concentrated, additional macronutrients, parenteral nutrition in very preterm infants: a phase IV randomised, controlled exploratory study of macronutrient intake, growth and other aspects of neonatal care

    Background: Infants born <29 weeks gestation are at high risk of neurocognitive disability. Early postnatal growth failure, particularly of head growth, is an important and potentially reversible risk factor for impaired neurodevelopmental outcome. Inadequate nutrition is a major factor in this postnatal growth failure; optimal protein and calorie (macronutrient) intakes are rarely achieved, especially in the first week. Infants <29 weeks are dependent on parenteral nutrition for the bulk of their nutrient needs for the first 2-3 weeks of life, to allow gut adaptation to milk digestion. The prescription, formulation and administration of neonatal parenteral nutrition are critical to achieving optimal protein and calorie intake but have received little scientific evaluation. Current neonatal parenteral nutrition regimens often rely on individualised prescription to manage the labile, unpredictable biochemical and metabolic control characteristic of the early neonatal period. Individualised prescription frequently fails to translate into optimal macronutrient delivery. We have previously shown that a standardised, concentrated neonatal parenteral nutrition regimen can optimise macronutrient intake.

    Methods: We propose a single-centre, randomised controlled exploratory trial of two standardised, concentrated neonatal parenteral nutrition regimens comparing a standard macronutrient content (maximum protein 2.8 g/kg/day; lipid 2.8 g/kg/day; dextrose 10%) with a higher macronutrient content (maximum protein 3.8 g/kg/day; lipid 3.8 g/kg/day; dextrose 12%) over the first 28 days of life. 150 infants of 24-28 completed weeks gestation and birthweight <1200 g will be recruited. The primary outcome will be head growth velocity in the first 28 days of life. Secondary outcomes will include a) auxological data between birth and 36 weeks corrected gestational age, b) actual macronutrient intake in the first 28 days, c) biomarkers of biochemical and metabolic tolerance, d) infection biomarkers and other intravascular line complications, e) incidence of major complications of prematurity, including mortality, and f) neurodevelopmental outcome at 2 years corrected gestational age.

    Trial registration: Current Controlled Trials ISRCTN76597892 (http://www.controlled-trials.com/ISRCTN76597892); EudraCT Number 2008-008899-14.
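The arithmetic behind such regimen comparisons is straightforward. The sketch below uses the standard-arm maxima stated in the abstract together with two labeled assumptions: a fluid allowance of 150 ml/kg/day (not specified in the abstract) and the conventional approximate energy factors (4 kcal/g protein, 3.4 kcal/g hydrated dextrose, 9 kcal/g lipid). The function name is ours; this is illustrative arithmetic, not the trial's prescribing tool.

```python
def daily_macronutrients(weight_kg, fluid_ml_kg, dextrose_pct,
                         protein_g_kg, lipid_g_kg):
    """Daily macronutrient masses and approximate energy intake for a
    parenteral nutrition prescription (illustrative arithmetic only)."""
    # a d% dextrose solution carries d grams per 100 ml
    dextrose_g = weight_kg * fluid_ml_kg * dextrose_pct / 100.0
    protein_g = weight_kg * protein_g_kg
    lipid_g = weight_kg * lipid_g_kg
    kcal = protein_g * 4.0 + dextrose_g * 3.4 + lipid_g * 9.0
    return {"dextrose_g": dextrose_g, "protein_g": protein_g,
            "lipid_g": lipid_g, "kcal": kcal,
            "kcal_per_kg": kcal / weight_kg}

# standard-arm maxima for a 1.0 kg infant at an assumed 150 ml/kg/day
print(daily_macronutrients(1.0, 150, 10, 2.8, 2.8))
```

Under these assumptions the standard arm delivers roughly 87 kcal/kg/day, which makes concrete why the trial's higher-content arm (3.8 g/kg/day protein and lipid, 12% dextrose) meaningfully raises total intake within the same fluid volume.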

    Making Associativity Operational

    The purpose of this paper is to propose an operational idea for developing algebraic thinking in the absence of alphanumeric symbols. The paper reports on a design experiment encouraging preschool children to use the associative property algebraically. We describe the theoretical basis of the design, the tasks used, and examples of algebraic thinking in 5–6-year-old children. Theoretically, the paper makes a critical distinction between operational and structural meanings of the notion of equality. We argue that mathematical thinking involving equality among young learners can comprise both an operational and a structural conception, and that the operational conception has a side that is productively linked to the structural conception. The crux of the paper is the realization, through carefully designed hands-on tasks, of algebraic thinking (in verbal mathematics) as operationally experienced in the ability to transform one number structure, containing a quantity that is subject to change, into another through equality-preserving transformations.
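The equality-preserving transformation at the heart of the design is the associative law: regrouping a sum does not change its value. A minimal mechanical check (Python used purely for illustration; the children of course work with concrete materials, not symbols):

```python
from itertools import product

def left(a, b, c):   # (a + b) + c
    return (a + b) + c

def right(a, b, c):  # a + (b + c)
    return a + (b + c)

# a child regrouping 5 + 2 as 4 + 3 is using (4 + 1) + 2 = 4 + (1 + 2)
assert left(4, 1, 2) == right(4, 1, 2) == 7

# the regrouping holds for every small triple, not just one example
assert all(left(a, b, c) == right(a, b, c)
           for a, b, c in product(range(10), repeat=3))
print("associativity holds on all checked triples")
```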

    A Measurement of Rb using a Double Tagging Method

    The fraction of Z to bbbar events in hadronic Z decays has been measured by the OPAL experiment using the data collected at LEP between 1992 and 1995. The Z to bbbar decays were tagged using displaced secondary vertices, and high-momentum electrons and muons. Systematic uncertainties were reduced by measuring the b-tagging efficiency using a double tagging technique. Efficiency correlations between opposite hemispheres of an event are small, and are well understood through comparisons between real and simulated data samples. A value of Rb = 0.2178 ± 0.0011 ± 0.0013 was obtained, where the first error is statistical and the second systematic. The uncertainty on Rc, the fraction of Z to ccbar events in hadronic Z decays, is not included in the errors. The dependence on Rc is Delta(Rb)/Rb = -0.056 * Delta(Rc)/Rc, where Delta(Rc) is the deviation of Rc from the value 0.172 predicted by the Standard Model. The result for Rb agrees with the value of 0.2155 ± 0.0003 predicted by the Standard Model. (42 pages, LaTeX, 14 eps figures included; submitted to the European Physical Journal.)
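The double-tagging idea can be seen in a stripped-down form. Writing eps_b for the hemisphere b-tag efficiency, and ignoring non-b backgrounds and hemisphere correlations (both of which the real analysis treats carefully), the single- and double-tag rates determine Rb and eps_b simultaneously, so the efficiency is measured from data rather than taken from simulation. A toy sketch:

```python
def rb_double_tag(f_single, f_double):
    """Solve the idealised double-tag equations (no backgrounds,
    no hemisphere correlations):
        f_single = eps_b * Rb        (fraction of tagged hemispheres)
        f_double = eps_b**2 * Rb     (fraction of doubly tagged events)
    giving eps_b = f_double / f_single and Rb = f_single**2 / f_double."""
    eps_b = f_double / f_single
    return f_single ** 2 / f_double, eps_b

# toy closure test: generate rates from Rb = 0.2178, eps_b = 0.25 ...
f_s = 0.25 * 0.2178
f_d = 0.25 ** 2 * 0.2178
rb, eps = rb_double_tag(f_s, f_d)
print(rb, eps)  # ... and recover both, up to float rounding
```

The power of the method is visible in the algebra: eps_b cancels out of Rb = f_single**2 / f_double, which is why the tagging efficiency contributes so little systematic uncertainty.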

    Measurement of the B+ and B0 lifetimes and search for CP(T) violation using reconstructed secondary vertices

    The lifetimes of the B+ and B0 mesons, and their ratio, have been measured in the OPAL experiment using 2.4 million hadronic Z0 decays recorded at LEP. Z0 → bb̄ decays were tagged using displaced secondary vertices and high-momentum electrons and muons. The lifetimes were then measured using well-reconstructed charged and neutral secondary vertices selected in this tagged data sample. The results are

        τ(B+) = 1.643 ± 0.037 ± 0.025 ps
        τ(B0) = 1.523 ± 0.057 ± 0.053 ps
        τ(B+)/τ(B0) = 1.079 ± 0.064 ± 0.041,

    where in each case the first error is statistical and the second systematic. A larger data sample of 3.1 million hadronic Z0 decays has been used to search for CP- and CPT-violating effects by comparison of inclusive b and b̄ hadron decays. No evidence for such effects is seen. The CP violation parameter Re(εB) is measured to be

        Re(εB) = 0.001 ± 0.014 ± 0.003,

    and the fractional difference between b and b̄ hadron lifetimes is measured to be

        (Δτ/τ)b = [τ(b hadron) − τ(b̄ hadron)] / τ(average) = −0.001 ± 0.012 ± 0.008.
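As a quick consistency check, the lifetime ratio and a naive uncertainty can be recomputed from the individual lifetimes. This sketch combines each lifetime's statistical and systematic errors in quadrature and propagates them as if independent; the real analysis accounts for correlations between the two measurements, so the quoted errors are not expected to be reproduced exactly.

```python
import math

def ratio_with_error(a, da, b, db):
    """Ratio a/b with uncertainty propagated in quadrature,
    assuming the two uncertainties are independent."""
    r = a / b
    dr = r * math.sqrt((da / a) ** 2 + (db / b) ** 2)
    return r, dr

# per-lifetime stat and syst errors combined in quadrature, for illustration
tau_plus, d_plus = 1.643, math.hypot(0.037, 0.025)
tau_zero, d_zero = 1.523, math.hypot(0.057, 0.053)
r, dr = ratio_with_error(tau_plus, d_plus, tau_zero, d_zero)
print(round(r, 3), round(dr, 3))  # → 1.079 0.062
```

The central value matches the quoted ratio of 1.079, while the naive error (0.062) differs somewhat from the quoted 0.064 ± 0.041 split, as expected when correlations are ignored.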

    Novel interactions of transglutaminase-2 with heparan sulphate proteoglycans: reflection on physiological implications

    This mini-review brings together information from publications and recent conference proceedings that have shed light on the biological interaction between transglutaminase-2 and heparan sulphate proteoglycans. We subsequently hypothesise possible implications in the wound healing process. There is a substantial overlap in the action of transglutaminase-2 and the heparan sulphate proteoglycan syndecan-4 in normal and abnormal wound repair. Our latest findings have identified syndecan-4 as a possible binding and signalling partner of fibronectin-bound TG2 and support the idea that transglutaminase-2 and syndecan-4 act in synergy.

    On the power and the systematic biases of the detection of chromosomal inversions by paired-end genome sequencing

    One of the most widely used techniques to study structural variation at the genome level is paired-end mapping (PEM). PEM has the advantage of being able to detect balanced events, such as inversions and translocations. However, inversions are still quite difficult to predict reliably, especially from high-throughput sequencing data. We simulated realistic PEM experiments with different combinations of read and library fragment lengths, including sequencing errors and meaningful base qualities, to quantify and track down the origin of false positives and negatives along sequencing, mapping, and downstream analysis. We show that PEM is very appropriate to detect a wide range of inversions, even with low-coverage data. However, % of inversions located between segmental duplications are expected to go undetected by the most common sequencing strategies. In general, longer DNA libraries improve the detectability of inversions far better than increments of the coverage depth or the read length. Finally, we review the performance of three algorithms to detect inversions (SVDetect, GRIAL, and VariationHunter), identify common pitfalls, and reveal important differences in their breakpoint precision. These results stress the importance of the sequencing strategy for the detection of structural variants, especially inversions, and offer guidelines for the design of future genome sequencing projects.
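The core PEM signal for inversions can be sketched in a few lines. In a standard forward-reverse (FR) library, concordant mates map to opposite strands; a pair straddling an inversion breakpoint maps to the same strand. The sketch below is a deliberately naive caller, not any of the three reviewed tools: it only counts same-strand pairs, whereas real callers also cluster by mapping position, fragment size, and quality.

```python
def classify_pair(strand1, strand2):
    """Classify a mapped read pair by mate orientation.
    In an FR library, opposite strands are concordant; same-strand
    mates are the signature of a pair spanning an inversion breakpoint."""
    if strand1 != strand2:
        return "concordant"
    return "inversion-candidate"

def call_inversions(pairs, min_support=3):
    """Call an inversion when enough same-strand pairs support it.
    Naive: ignores positions, fragment sizes, and mapping quality."""
    support = sum(1 for s1, s2 in pairs
                  if classify_pair(s1, s2) == "inversion-candidate")
    return support >= min_support

pairs = [("+", "-")] * 20 + [("+", "+")] * 4  # 4 same-strand pairs
print(call_inversions(pairs))  # → True
```

This also makes the paper's library-length finding intuitive: a pair only yields the same-strand signature when its fragment spans a breakpoint, so longer fragments give more informative pairs per unit of coverage, and breakpoints buried in segmental duplications are lost to ambiguous mapping before this step is ever reached.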