
    ASCORE: an up-to-date cardiovascular risk score for hypertensive patients reflecting contemporary clinical practice developed using the (ASCOT-BPLA) trial data.

    A number of risk scores already exist to predict cardiovascular (CV) events. However, scores developed from data collected some time ago may not accurately predict the CV risk of contemporary hypertensive patients who benefit from more modern treatments and management. Using data from the Anglo-Scandinavian Cardiac Outcomes Trial-BPLA (ASCOT-BPLA), a randomised clinical trial in which 15 955 hypertensive patients without previous CV disease received contemporary preventive CV management, we developed a new risk score predicting the 5-year risk of a first CV event (CV death, myocardial infarction or stroke). Cox proportional hazards models were used to develop a risk equation from baseline predictors. The final risk model (ASCORE) included the baseline variables age, sex, smoking, diabetes, previous blood pressure (BP) treatment, systolic BP, total cholesterol, high-density lipoprotein cholesterol, fasting glucose and creatinine. A simplified model (ASCORE-S) excluding laboratory variables was also derived. Both models showed very good internal validity. User-friendly integer score tables are reported for both models. Applying the latest Framingham risk score to our data significantly overpredicted the observed 5-year risk of the composite CV outcome. We conclude that risk scores derived from older databases (such as Framingham) may overestimate the CV risk of patients receiving current BP treatments; 'updated' risk scores are therefore needed for current patients.
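    The modelling step above lends itself to a short illustration. The sketch below fits a Cox proportional hazards model with the Python lifelines library and converts it into a 5-year risk; the data frame, column names and values are hypothetical stand-ins, not the ASCOT-BPLA variables themselves.

```python
# Minimal sketch (hypothetical data, not ASCOT-BPLA): derive a 5-year CV
# risk equation from baseline predictors with a Cox proportional hazards
# model, mirroring the modelling step described above.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "age": rng.normal(63, 8, n),
    "male": rng.integers(0, 2, n),
    "smoker": rng.integers(0, 2, n),
    "sbp": rng.normal(164, 18, n),      # systolic BP, mmHg
    "time": rng.exponential(5.0, n),    # years to first CV event or censoring
    "event": rng.integers(0, 2, n),     # 1 = CV event observed
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")

# Predicted 5-year risk = 1 - S(5 | covariates), here for the first five rows.
covs = df.drop(columns=["time", "event"]).iloc[:5]
risk_5y = 1.0 - cph.predict_survival_function(covs, times=[5.0]).T[5.0]
print(risk_5y)
```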

    Semantic closure demonstrated by the evolution of a universal constructor architecture in an artificial chemistry

    We present a novel stringmol-based artificial chemistry system modelled on the universal constructor architecture (UCA) first explored by von Neumann. In a UCA, machines interact with an abstract description of themselves to replicate: the abstract description is copied, and the machines that it encodes are constructed. DNA-based replication follows this architecture, with DNA being the abstract description, the polymerase being the copier, and the ribosome being the principal machine in expressing what is encoded on the DNA. This architecture is semantically closed because the machine that defines what the abstract description means is itself encoded on that abstract description. We present a series of experiments with the stringmol UCA that show the evolution of the meaning of genomic material, allowing the concept of semantic closure and transitions between semantically closed states to be elucidated in the light of concrete examples. We present results showing, for the first time in an in silico system, the simultaneous evolution of the genomic material, copier and constructor of a UCA, giving rise to viable offspring.
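    The replication loop of a UCA is compact enough to sketch in code. The toy below illustrates the architecture only, not stringmol itself: a "genome" string names the machines, a copier duplicates it, and a constructor builds the machines it names, so the interpreter of the description is itself encoded on the description.

```python
# Toy universal constructor architecture (UCA), an illustration only and
# not the stringmol system: the constructor, which fixes what the
# description means, is itself named on the description it interprets
# (semantic closure).
def copier(description: str) -> str:
    """Duplicate the abstract description verbatim (the polymerase role)."""
    return str(description)

def constructor(description: str) -> dict:
    """Build the machines the description encodes (the ribosome role)."""
    return {name: MACHINES[name] for name in description.split(";")}

MACHINES = {"copier": copier, "constructor": constructor}

def replicate(description: str):
    """One replication step: construct offspring machines, copy the tape."""
    return constructor(description), copier(description)

machines, tape = replicate("copier;constructor")
print(sorted(machines), "|", tape)
```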

    Using assignment data to analyse a blended information literacy intervention: a quantitative approach

    This research sought to determine whether a blended information literacy learning and teaching intervention could enhance undergraduates’ information discernment to a statistically significant degree compared with standard face-to-face delivery. A mixture of face-to-face and online activities, including online social media learning, was used. Three interventions were designed to develop the information literacies of first-year undergraduates studying Sport and Exercise at Staffordshire University, focusing on one aspect of information literacy: the ability to evaluate source material effectively. An analysis was devised in which written evaluations of found information for an assessment were converted into numerical scores and then analysed statistically. This helped to evaluate the efficacy of the interventions and provided data for further analysis. An insight is provided into how the information literacy pedagogical intervention worked and into the cognitive processes involved in enabling participants to interact critically with information. The intervention which incorporated social media learning proved to be the most successful learning and teaching approach. The data indicated that undergraduate students’ information literacy can be developed. However, additional long-term data are required to establish whether this intervention would have a lasting impact.
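    As a sketch of the quantitative step described above: rubric scores derived from the written source evaluations can be compared across the three intervention groups. The scores below are invented, and the choice of a Kruskal-Wallis test is an assumption for illustration; the study's own rubric and test are not reproduced here.

```python
# Hypothetical rubric scores for written source evaluations, compared
# across three intervention groups with a Kruskal-Wallis test (the test
# choice is an illustrative assumption, not the paper's stated method).
from scipy import stats

face_to_face = [3, 4, 2, 5, 3, 4, 3]    # standard delivery
blended      = [5, 4, 6, 5, 4, 6, 5]    # blended intervention
social_media = [6, 7, 5, 6, 7, 6, 5]    # blended + social media learning

h, p = stats.kruskal(face_to_face, blended, social_media)
print(f"H = {h:.2f}, p = {p:.4f}")      # small p => groups differ
```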

    Variant supercurrent multiplets

    In N = 1 rigid supersymmetric theories, there exist three standard realizations of the supercurrent multiplet corresponding to the (i) old minimal, (ii) new minimal and (iii) non-minimal off-shell formulations for N = 1 supergravity. Recently, Komargodski and Seiberg in arXiv:1002.2228 put forward a new supercurrent and proved its consistency, although in the past it was believed not to exist. In this paper, three new variant supercurrent multiplets are proposed. Implications for supergravity-matter systems are discussed. Comment: 11 pages; V2: minor changes in sect. 3; V3: published version; V4: typos in eq. (2.3) corrected; V5: comments and references added.
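    For context, the conservation laws of two of the standard supercurrent multiplets mentioned above can be written schematically as follows. This is recalled from the general literature, not taken from this paper, and conventions vary between references.

```latex
% Old minimal (Ferrara--Zumino) supercurrent J_{\alpha\dot\alpha},
% with a chiral trace multiplet X:
\bar D^{\dot\alpha} J_{\alpha\dot\alpha} = D_\alpha X ,
\qquad \bar D_{\dot\beta} X = 0 .

% New minimal (R-multiplet): the trace multiplet is replaced by
% a superfield \chi_\alpha obeying linearity-type constraints:
\bar D^{\dot\alpha} \mathcal{R}_{\alpha\dot\alpha} = \chi_\alpha ,
\qquad \bar D_{\dot\beta} \chi_\alpha = 0 , \quad
D^\alpha \chi_\alpha = \bar D_{\dot\alpha} \bar\chi^{\dot\alpha} .
```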

    A realistic pattern of fermion masses from a five-dimensional SO(10) model

    We provide a unified description of fermion masses and mixing angles in the framework of a supersymmetric grand unified SO(10) model with anarchic Yukawa couplings of order unity. The space-time is five dimensional and the extra flat spatial dimension is compactified on the orbifold S^1/(Z_2 × Z_2'), leading to Pati-Salam gauge symmetry on the boundary where Yukawa interactions are localised. The gauge symmetry breaking is completed by means of a rather economical scalar sector, avoiding the doublet-triplet splitting problem. The matter fields live in the bulk and their massless modes acquire exponential profiles, which naturally explain the mass hierarchy of the different fermion generations. Quark and lepton properties are naturally reproduced by a mechanism, first proposed by Kitano and Li, that lifts the SO(10) degeneracy of bulk masses in terms of a single parameter. The model provides a realistic pattern of fermion masses and mixing angles for large values of tan β. It favours a normally ordered neutrino mass spectrum with the lightest neutrino mass below 0.01 eV and no preference for leptonic CP-violating phases. The right-handed neutrino mass spectrum is very hierarchical and does not allow for thermal leptogenesis. We analyse several variants of the basic framework and find that the results concerning the fermion spectrum are remarkably stable. Comment: 30 pages, 7 figures, 4 tables.
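    The profile mechanism invoked above can be summarised in a schematic worked equation. The conventions are illustrative placeholders (generic bulk masses M_i on an interval of size πR), not the paper's exact setup: a bulk zero mode localised away from the Yukawa boundary has an exponentially suppressed boundary coupling.

```latex
% Zero mode of a bulk fermion with mass M_i on a flat interval [0, \pi R]:
f_i^{(0)}(y) \;\propto\; e^{-M_i y} .

% Boundary-localised Yukawa couplings of order one are dressed by the
% wavefunction overlaps, turning anarchic inputs into hierarchies:
y^{\rm eff}_{ij} \;\sim\; \mathcal{O}(1)\times
f_i^{(0)}(\pi R)\, f_j^{(0)}(\pi R)
\;\sim\; \mathcal{O}(1)\times e^{-(M_i + M_j)\pi R} .
```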

    Live to cheat another day: bacterial dormancy facilitates the social exploitation of beta-lactamases

    The breakdown of antibiotics by β-lactamases may be cooperative, since resistant cells can detoxify their environment and facilitate the growth of susceptible neighbours. However, previous studies of this phenomenon have used artificial bacterial vectors or engineered bacteria to increase the secretion of β-lactamases from cells. Here, we investigated whether a broad-spectrum β-lactamase gene carried by a naturally occurring plasmid (pCT) is cooperative under a range of conditions. In ordinary batch culture on solid media, there was little or no evidence that resistant bacteria could protect susceptible cells from ampicillin, although resistant colonies could locally detoxify this growth medium. However, when susceptible cells were inoculated at high densities, late-appearing phenotypically susceptible bacteria grew in the vicinity of resistant colonies. We infer that persisters, cells that have survived antibiotics by undergoing a period of dormancy, founded these satellite colonies. The number of persister colonies was positively correlated with the density of resistant colonies and increased as antibiotic concentrations decreased. We argue that detoxification can be cooperative under a limited range of conditions: if the toxins are bacteriostatic rather than bactericidal; or if susceptible cells invade communities after resistant bacteria; or if dormancy allows susceptible cells to avoid bactericides. Resistance and tolerance were previously thought to be independent solutions for surviving antibiotics. Here, we show that these are interacting strategies: the presence of bacteria adopting one solution can have substantial effects on the fitness of their neighbours.

    Extended supersymmetric sigma models in AdS_4 from projective superspace

    There exist two superspace approaches to describe N=2 supersymmetric nonlinear sigma models in four-dimensional anti-de Sitter (AdS_4) space: (i) in terms of N=1 AdS chiral superfields, as developed in arXiv:1105.3111 and arXiv:1108.5290; and (ii) in terms of N=2 polar supermultiplets using the AdS projective-superspace techniques developed in arXiv:0807.3368. The virtue of approach (i) is that it makes manifest the geometric properties of the N=2 supersymmetric sigma models in AdS_4: the target space must be a non-compact hyperkähler manifold endowed with a Killing vector field which generates an SO(2) group of rotations on the two-sphere of complex structures. The power of approach (ii) is that it allows us, in principle, to generate hyperkähler metrics as well as to address the problem of deformations of such metrics. Here we show how to relate formulation (ii) to (i) by integrating out an infinite number of N=1 AdS auxiliary superfields and performing a superfield duality transformation. We also develop a novel description of the most general N=2 supersymmetric nonlinear sigma model in AdS_4 in terms of chiral superfields on three-dimensional N=2 flat superspace without central charge. This superspace naturally originates from a conformally flat realization of the four-dimensional N=2 AdS superspace that makes use of Poincare coordinates for AdS_4. This novel formulation allows us to uncover several interesting geometric results. Comment: 88 pages; v3: typos corrected, version published in JHEP.
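    The conformally flat realisation referred to above rests on a standard fact: in Poincare coordinates the AdS_4 metric is a Weyl rescaling of flat space. Schematically, with conventions chosen for illustration:

```latex
% AdS_4 in Poincare coordinates, with x^a (a = 0,1,2) flat coordinates,
% z > 0 and \mu the inverse AdS radius:
ds^2 \;=\; \frac{1}{(\mu z)^2}\,
\left( \eta_{ab}\, dx^a dx^b + dz^2 \right) .
% The overall conformal factor 1/(\mu z)^2 is what a superspace
% construction promotes to a conformally flat AdS supergeometry.
```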

    Multi-centre reproducibility of diffusion MRI parameters for clinical sequences in the brain.

    The purpose of this work was to assess the reproducibility of diffusion imaging, and in particular the apparent diffusion coefficient (ADC), intra-voxel incoherent motion (IVIM) parameters and diffusion tensor imaging (DTI) parameters, across multiple centres using clinically available protocols with limited harmonization between sequences. An ice-water phantom and nine healthy volunteers were scanned across five centres on eight scanners (four Siemens 1.5T, four Philips 3T). The mean ADC, IVIM parameters (diffusion coefficient D and perfusion fraction f) and DTI parameters (mean diffusivity MD and fractional anisotropy FA) were measured in grey matter, white matter and specific brain sub-regions. A mixed-effects model was used to measure the intra- and inter-scanner coefficient of variation (CV) for each of the five parameters. ADC, D, MD and FA showed good intra- and inter-scanner reproducibility in both grey and white matter, with a CV ranging between 1% and 7.4% (mean 2.6%). Other brain regions also showed high levels of reproducibility, except for small structures such as the choroid plexus. The IVIM parameter f had a higher intra-scanner CV of 8.4% and inter-scanner CV of 24.8%. No major difference in the inter-scanner CV for ADC, D, MD and FA was observed when analysing the 1.5T and 3T scanners separately. ADC, D, MD and FA all showed good intra-scanner reproducibility, with the inter-scanner reproducibility being comparable or faring slightly worse, suggesting that using data from multiple scanners does not have an adverse effect compared with using data from the same scanner. The IVIM parameter f had a poorer inter-scanner CV when scanners of different field strengths were combined, and the parameter was also affected by the scan acquisition resolution. This study shows that the majority of diffusion MRI derived parameters are robust across 1.5T and 3T scanners and suitable for use in multi-centre clinical studies and trials.
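    A sketch of the variance decomposition described above: a mixed-effects model with a random intercept per scanner splits the variability of a diffusion parameter into between- and within-scanner components, from which inter- and intra-scanner CVs follow. The data frame, column names and values below are hypothetical, not the study's data.

```python
# Hypothetical ADC measurements (9 volunteers x 8 scanners): a random
# intercept per scanner separates between-scanner variance from residual
# (within-scanner) variance; CV = SD / mean.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
scanners = np.repeat([f"scanner{i}" for i in range(8)], 9)
adc = 0.8 + rng.normal(0, 0.01, 8).repeat(9) + rng.normal(0, 0.02, 72)
df = pd.DataFrame({"scanner": scanners, "adc": adc})  # ADC in 10^-3 mm^2/s

m = smf.mixedlm("adc ~ 1", df, groups=df["scanner"]).fit()
mean_adc = m.params["Intercept"]
between_sd = np.sqrt(m.cov_re.iloc[0, 0])   # scanner random-effect SD
within_sd = np.sqrt(m.scale)                # residual SD

print(f"inter-scanner CV = {100 * between_sd / mean_adc:.1f}%")
print(f"intra-scanner CV = {100 * within_sd / mean_adc:.1f}%")
```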

    Overcoming barriers to engaging socio-economically disadvantaged populations in CHD primary prevention: a qualitative study

    Background: Preventative medicine has become increasingly important in efforts to reduce the burden of chronic disease in industrialised countries. However, interventions that fail to recruit socio-economically representative samples may widen existing health inequalities. This paper explores the barriers and facilitators to engaging a socio-economically disadvantaged (SED) population in primary prevention for coronary heart disease (CHD).

    Methods: The primary prevention element of Have a Heart Paisley (HaHP) offered risk screening to all eligible individuals. The programme employed two approaches to engaging with the community: (a) a social marketing campaign and (b) a community development project adopting primarily face-to-face canvassing. Individuals living in areas of SED were under-recruited via the social marketing approach, but successfully recruited via face-to-face canvassing. This paper reports on focus group discussions with participants, exploring their perceptions about and experiences of both approaches.

    Results: Various reasons were identified for low uptake of risk screening amongst individuals living in areas of high SED in response to the social marketing campaign, along with a number of ways in which the face-to-face canvassing approach overcame these barriers. These have been categorised into four main themes: (1) processes of engagement; (2) issues of understanding; (3) design of the screening service; and (4) the priority accorded to screening. The most immediate barriers to recruitment were the invitation letter, which often failed to reach its target, and the general distrust of postal correspondence. In contrast, participants were positive about the face-to-face canvassing approach. Participants expressed a lack of knowledge and understanding about CHD and their risk of developing it, and felt there was a lack of clarity in the information provided in the mailing in terms of the process and value of screening. In contrast, direct face-to-face contact meant that outreach workers could explain what to expect. Participants felt that the procedure for uptake of screening was demanding and inflexible, but that the drop-in sessions employed by the community development project had a major impact on recruitment and retention.

    Conclusion: Socio-economically disadvantaged individuals can be hard to reach; engagement requires strategies tailored to the needs of the target population rather than a population-wide approach.

    Argumentation in school science: Breaking the tradition of authoritative exposition through a pedagogy that promotes discussion and reasoning

    The value of argumentation in science education has become internationally recognised and has been the subject of many research studies in recent years. Successful introduction of argumentation activities in learning contexts involves extending teaching goals beyond the understanding of facts and concepts to include an emphasis on cognitive and metacognitive processes, epistemic criteria and reasoning. The authors focus on the difficulties inherent in shifting a tradition of teaching from one dominated by authoritative exposition to one that is more dialogic, involving small-group discussion based on tasks that stimulate argumentation. The paper builds on previous research on enhancing the quality of argument in school science to focus on how argumentation activities have been designed, with appropriate strategies, resources and modelling, for pedagogical purposes. The paper analyses design frameworks, their contexts and lesson plans to evaluate their potential for enhancing reasoning through foregrounding the processes of argumentation. Examples of classroom dialogue in which teachers adopt the frameworks and plans are analysed to show how argumentation processes are scaffolded. The analysis shows that several layers of interpretation are needed and that these layers must be aligned for successful implementation. It thereby highlights both the potential and the limitations of the design frameworks.