Ambiguities in the partial-wave analysis of pseudoscalar-meson photoproduction
Ambiguities in pseudoscalar-meson photoproduction, arising from incomplete
experimental data, have analogs in pion-nucleon scattering. Amplitude
ambiguities have important implications for the problems of amplitude
extraction and resonance identification in partial-wave analysis. The effect of
these ambiguities on observables is described. We compare our results with
those found in earlier studies.
A mass-dependent density profile for dark matter haloes including the influence of galaxy formation
We introduce a mass-dependent density profile to describe the distribution of dark matter within galaxies, which takes into account the stellar-to-halo mass dependence of the response of dark matter to baryonic processes. The study is based on the analysis of hydrodynamically simulated galaxies from dwarf to Milky Way mass, drawn from the Making Galaxies In a Cosmological Context project, which have been shown to match a wide range of disc scaling relationships. We find that the best-fitting parameters of a generic double power-law density profile vary in a systematic manner that depends on the stellar-to-halo mass ratio of each galaxy. Thus, the quantity M⋆/Mhalo constrains the inner (γ) and outer (β) slopes of dark matter density, and the sharpness of transition between the slopes (α), reducing the number of free parameters of the model to two. Due to the tight relation between stellar mass and halo mass, either of these quantities is sufficient to describe the dark matter halo profile including the effects of baryons. The concentration of the haloes in the hydrodynamical simulations is consistent with N-body expectations up to Milky Way-mass galaxies, at which mass the haloes become twice as concentrated as compared with pure dark matter runs. This mass-dependent density profile can be directly applied to rotation curve data of observed galaxies and to semi-analytic galaxy formation models as a significant improvement over the commonly used NFW profile.
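The generic double power-law the abstract refers to can be written down explicitly. A minimal sketch follows; the function name and the normalisation/scale parameters (rho_s, r_s) are illustrative, and the paper's fitted dependence of α, β, γ on M⋆/Mhalo is not reproduced here:

```python
def abg_density(r, rho_s, r_s, alpha, beta, gamma):
    """Generic (alpha, beta, gamma) double power-law density profile:

        rho(r) = rho_s / [ (r/r_s)^gamma * (1 + (r/r_s)^alpha)^((beta - gamma)/alpha) ]

    The inner logarithmic slope tends to -gamma, the outer slope to -beta,
    and alpha controls the sharpness of the transition between them.
    NFW is the special case (alpha, beta, gamma) = (1, 3, 1).
    """
    x = r / r_s
    return rho_s / (x ** gamma * (1.0 + x ** alpha) ** ((beta - gamma) / alpha))
```

With (1, 3, 1) this reduces to the NFW form rho_s / [x (1 + x)^2], which is the baseline the abstract says the mass-dependent profile improves upon.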
Stops making sense: translational trade-offs and stop codon reassignment
Background
Efficient gene expression involves a trade-off between (i) premature termination of protein synthesis; and (ii) readthrough, where the ribosome fails to dissociate at the terminal stop. Sense codons that are similar in sequence to stop codons are more susceptible to nonsense mutation, and are also likely to be more susceptible to transcriptional or translational errors causing premature termination. We therefore expect this trade-off to be influenced by the number of stop codons in the genetic code. Although genetic codes are highly constrained, stop codon number appears to be their most volatile feature.
Results
In the human genome, codons readily mutable to stops are underrepresented in coding sequences. We construct a simple mathematical model based on the relative likelihoods of premature termination and readthrough. When readthrough occurs, the resultant protein has a tail of amino acid residues incorrectly added to the C-terminus. Our results depend strongly on the number of stop codons in the genetic code. When the code has more stop codons, premature termination is relatively more likely, particularly for longer genes. When the code has fewer stop codons, the length of the tail added by readthrough will, on average, be longer, and thus more deleterious. Comparative analysis of taxa with a range of stop codon numbers suggests that genomes whose code includes more stop codons have shorter coding sequences.
Conclusions
We suggest that the differing trade-offs presented by alternative genetic codes may result in differences in genome structure. More speculatively, multiple stop codons may mitigate readthrough, counteracting the disadvantage of a higher rate of nonsense mutation. This could help explain the puzzling overrepresentation of stop codons in the canonical genetic code and most variants.
Calibration of myocardial T2 and T1 against iron concentration.
BACKGROUND: The assessment of myocardial iron using T2* cardiovascular magnetic resonance (CMR) has been validated and calibrated, and is in clinical use. However, there is very limited data assessing the relaxation parameters T1 and T2 for measurement of human myocardial iron.
METHODS: Twelve hearts were examined from transfusion-dependent patients: 11 with end-stage heart failure, either following death (n=7) or cardiac transplantation (n=4), and 1 heart from a patient who died from a stroke with no cardiac iron loading. Ex-vivo R1 and R2 measurements (R1=1/T1 and R2=1/T2) at 1.5 Tesla were compared with myocardial iron concentration measured using inductively coupled plasma atomic emission spectroscopy.
RESULTS: From a single myocardial slice in formalin which was repeatedly examined, a modest decrease in T2 was observed with time, from mean (± SD) 23.7 ± 0.93 ms at baseline (13 days after death and formalin fixation) to 18.5 ± 1.41 ms at day 566 (p<0.001). Raw T2 values were therefore adjusted to correct for this fall over time. Myocardial R2 was correlated with iron concentration [Fe] (R² = 0.566, p<0.001), but the correlation was stronger between LnR2 and Ln[Fe] (R² = 0.790, p<0.001). The relation between T2 (ms) and myocardial iron (mg/g dry weight) was [Fe] = 5081 × T2^(-2.22). Analysis of T1 proved challenging, with a dichotomous distribution of T1: very short T1 (mean 72.3 ± 25.8 ms) that was independent of iron concentration in all hearts stored in formalin for greater than 12 months. In the remaining hearts, stored for <10 weeks prior to scanning, LnR1 and iron concentration were correlated but with marked scatter (R² = 0.517, p<0.001). A linear relationship was present between T1 and T2 in the hearts stored for a short period (R² = 0.657, p<0.001).
CONCLUSION: Myocardial T2 correlates well with myocardial iron concentration, which raises the possibility that T2 may provide additive information to T2* for patients with myocardial siderosis. However, ex-vivo T1 measurements are less reliable due to the severe chemical effects of formalin on T1 shortening, and therefore T1 calibration may only be practical from in-vivo human studies.
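The reported T2-iron calibration can be applied directly to a measured T2 value. A minimal sketch (the function name is illustrative; the relation was derived ex vivo at 1.5 T under the conditions described above, so in-vivo use would need separate validation):

```python
def iron_from_t2(t2_ms):
    """Estimate myocardial iron concentration (mg/g dry weight) from
    myocardial T2 (ms), using the ex-vivo calibration reported above:

        [Fe] = 5081 * T2^(-2.22)
    """
    return 5081.0 * t2_ms ** -2.22
```

For example, a measured T2 of 20 ms corresponds to roughly 6.6 mg/g dry weight; shorter T2 implies higher iron loading.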
Cosmological parameters from SDSS and WMAP
We measure cosmological parameters using the three-dimensional power spectrum
P(k) from over 200,000 galaxies in the Sloan Digital Sky Survey (SDSS) in
combination with WMAP and other data. Our results are consistent with a
"vanilla" flat adiabatic Lambda-CDM model without tilt (n=1), running tilt,
tensor modes or massive neutrinos. Adding SDSS information more than halves the
WMAP-only error bars on some parameters, tightening 1 sigma constraints on the
Hubble parameter from h~0.74+0.18-0.07 to h~0.70+0.04-0.03, on the matter
density from Omega_m~0.25+/-0.10 to Omega_m~0.30+/-0.04 (1 sigma) and on
neutrino masses from <11 eV to <0.6 eV (95%). SDSS helps even more when
dropping prior assumptions about curvature, neutrinos, tensor modes and the
equation of state. Our results are in substantial agreement with the joint
analysis of WMAP and the 2dF Galaxy Redshift Survey, which is an impressive
consistency check with independent redshift survey data and analysis
techniques. In this paper, we place particular emphasis on clarifying the
physical origin of the constraints, i.e., what we do and do not know when using
different data sets and prior assumptions. For instance, dropping the
assumption that space is perfectly flat, the WMAP-only constraint on the
measured age of the Universe tightens from t0~16.3+2.3-1.8 Gyr to
t0~14.1+1.0-0.9 Gyr by adding SDSS and SN Ia data. Including tensors, running
tilt, neutrino mass and equation of state in the list of free parameters, many
constraints are still quite weak, but future cosmological measurements from
SDSS and other sources should allow these to be substantially tightened.
(Comment: minor revisions to match the accepted PRD version; SDSS data and
figures are available at http://www.hep.upenn.edu/~max/sdsspars.htm)
The National Lung Matrix Trial: translating the biology of stratification in advanced non-small-cell lung cancer
Background: The management of NSCLC has been transformed by stratified medicine. The National Lung Matrix Trial (NLMT) is a UK-wide study exploring the activity of rationally selected biomarker/targeted therapy combinations. Patients and methods: The Cancer Research UK (CRUK) Stratified Medicine Programme 2 is undertaking the large-volume national molecular pre-screening which integrates with the NLMT. At study initiation, there are eight drugs being used to target 18 molecular cohorts. The aim is to determine whether there is sufficient signal of activity in any drug-biomarker combination to warrant further investigation. A Bayesian adaptive design was chosen, giving a more realistic approach to decision making and the flexibility to draw conclusions without fixing the sample size. The screening platform is an adaptable 28-gene Nextera next-generation sequencing platform designed by Illumina, covering the range of molecular abnormalities being targeted. The adaptive design allows new biomarker-drug combination cohorts to be incorporated by substantial amendment. The pre-clinical justification for each biomarker-drug combination has been rigorously assessed, creating molecular exclusion rules and a trumping strategy for patients harbouring concomitant actionable genetic abnormalities. Discrete routes of pathway activation or inactivation determined by cancer genome aberrations are treated as separate cohorts. Key translational analyses include the deep genomic analysis of pre- and post-treatment biopsies, the establishment of patient-derived xenograft models and longitudinal ctDNA collection, in order to define predictive biomarkers, mechanisms of resistance and early markers of response and relapse. Conclusion: The SMP2 platform will provide large-scale genetic screening to inform entry into the NLMT, a trial explicitly aimed at discovering novel actionable cohorts in NSCLC.
Placing the library at the heart of plagiarism prevention: The University of Bradford experience.
Plagiarism is a vexed issue for Higher Education, affecting student transition, retention and attainment. This paper reports on two initiatives from the University of Bradford library aimed at reducing student plagiarism. The first initiative is an intensive course for students who have contravened plagiarism regulations. The second course introduces new students to the concepts surrounding plagiarism, with the aim of preventing plagiarism breaches. Since the Plagiarism Avoidance for New Students course was introduced there has been a significant drop in students referred to the disciplinary programme. This paper discusses the background to both courses and the challenges of implementation.
Implementation salvage experiences from the Melbourne diabetes prevention study
Background: Many public health interventions based on apparently sound evidence from randomised controlled trials encounter difficulties when being scaled up within health systems. Even under the best of circumstances, implementation is exceedingly difficult. In this paper we describe the implementation salvage experiences from the Melbourne Diabetes Prevention Study, a randomised controlled trial of effectiveness and cost-effectiveness nested in the state-wide Life! Taking Action on Diabetes program in Victoria, Australia.
Discussion: The Melbourne Diabetes Prevention Study sits within an evolving larger-scale implementation project, the Life! program. Changes that occurred during the roll-out of that program had a direct impact on the process of conducting this trial. The issues the study team encountered, and the methods of recovery, were conceptualised using an implementation salvage strategies framework. The specific issues included the continuity of state funding for the Life! program and structural changes to the Life! program, comprising adjustments to eligibility criteria, referral processes, structure and content, as well as alternative program delivery for different population groups. Staff turnover, recruitment problems, setting and venue concerns, availability of potential participants and participant characteristics were also identified as evaluation roadblocks. Each issue and its corresponding salvage strategy is presented.
Summary: The experiences of conducting such a novel trial as the preliminary Melbourne Diabetes Prevention Study have been invaluable. The lessons learnt and knowledge gained will inform the future execution of this trial in the coming years. We anticipate that these results will also be beneficial to other researchers conducting similar trials in the public health field.
We recommend that researchers openly share their experiences, barriers and challenges when conducting randomised controlled trials and implementation research. We encourage them to describe the factors that may have inhibited or enhanced the desired outcomes, so that the academic community can learn and expand the research foundation of implementation salvage.
Multiple novel prostate cancer susceptibility signals identified by fine-mapping of known risk loci among Europeans
Genome-wide association studies (GWAS) have identified numerous common prostate cancer (PrCa) susceptibility loci. We have
fine-mapped 64 GWAS regions known at the conclusion of the iCOGS study using large-scale genotyping and imputation in
25 723 PrCa cases and 26 274 controls of European ancestry. We detected evidence for multiple independent signals at 16
regions, 12 of which contained additional newly identified significant associations. A single signal comprising a spectrum of
correlated variation was observed at 39 regions, 35 of which are now described
by a novel, more significantly associated lead SNP, while the originally
reported variant remained the lead SNP in only 4 regions. We also confirmed two
association signals in
Europeans that had been previously reported only in East-Asian GWAS. Based on statistical evidence and linkage disequilibrium
(LD) structure, we have curated and narrowed down the list of the most likely candidate causal variants for each region.
Functional annotation using data from ENCODE filtered for PrCa cell lines and eQTL analysis demonstrated significant
enrichment for overlap with bio-features within this set. By incorporating the novel risk variants identified here alongside the
refined data for existing association signals, we estimate that these loci now explain ∼38.9% of the familial relative risk of PrCa,
an 8.9% improvement over the previously reported GWAS tag SNPs. This suggests that a significant fraction of the heritability of
PrCa may have been hidden during the discovery phase of GWAS, in particular due to the presence of multiple independent
signals within the same region.
