
    Quantification of mesoscale variability and geometrical reconstruction of a textile

    Automated image analysis of textile surfaces allowed the intrinsic yarn path variability of a 2/2 twill weave to be determined and quantified during the lay-up process. The yarn paths were described in terms of waves; the wave frequencies were found to be similar in the warp and weft directions and hardly affected by deliberately introduced yarn path deformations. The most significant source of fabric variability was introduced during handling before cutting. These resulting systematic deformations will need to be considered when designing or analysing a composite component. An automated method for three-dimensional reconstruction of the analysed lay-up was implemented in TexGen, which will allow virtual testing of components in the future.
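
    The wave description can be illustrated with a minimal Python sketch, assuming a yarn centreline has already been extracted from the surface images; the sample spacing, synthetic centreline and noise level below are hypothetical stand-ins for real image-analysis output.

        import numpy as np

        # Hypothetical yarn centreline: lateral deviation (mm) sampled every dx mm
        # along the nominal yarn direction, as image analysis might produce it.
        dx = 0.5
        x = np.arange(0.0, 200.0, dx)
        yarn_y = 0.3 * np.sin(2 * np.pi * x / 8.0) + 0.05 * np.random.randn(x.size)

        # Remove the linear trend so only the wave-like variability remains.
        deviation = yarn_y - np.polyval(np.polyfit(x, yarn_y, 1), x)

        # Dominant spatial frequency and amplitude from the FFT spectrum.
        spectrum = np.abs(np.fft.rfft(deviation))
        freqs = np.fft.rfftfreq(deviation.size, d=dx)   # cycles per mm
        peak = np.argmax(spectrum[1:]) + 1              # skip the zero-frequency bin
        print(f"dominant wavelength: {1.0 / freqs[peak]:.1f} mm")
        print(f"amplitude estimate:  {2.0 * spectrum[peak] / deviation.size:.3f} mm")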

    Bottom mixed layer oxygen dynamics in the Celtic Sea

    The seasonally stratified continental shelf seas are highly productive, economically important environments which are under considerable pressure from human activity. Global dissolved oxygen concentrations have shown rapid reductions in response to anthropogenic forcing since at least the middle of the twentieth century. Oxygen consumption is at the same time linked to the cycling of atmospheric carbon, with oxygen being a proxy for carbon remineralisation and the release of CO2. In the seasonally stratified seas the bottom mixed layer (BML) is partially isolated from the atmosphere and is thus controlled by the interplay between oxygen consumption processes and vertical and horizontal advection. Oxygen consumption rates can be both spatially and temporally dynamic, but these dynamics are often missed with incubation-based techniques. Here we adopt a Bayesian approach to determining total BML oxygen consumption rates from a high-resolution oxygen time series. This incorporates both our knowledge and our uncertainty of the various processes which control the oxygen inventory. Total BML rates integrate processes both in the water column and at the sediment interface. The observations span the stratified period in the Celtic Sea and cover both sandy and muddy sediment types. We show how horizontal advection, tidal forcing and vertical mixing together control the bottom mixed layer oxygen concentrations at various times over the stratified period. Our muddy-sand site shows cyclic, spring-neap mediated changes in oxygen consumption driven by the frequent resuspension or ventilation of the seabed. We see evidence for prolonged periods of increased vertical mixing, which provide the ventilation necessary to support the high rates of consumption observed.
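
    As an illustrative sketch of the Bayesian idea, not the authors' model: a constant BML consumption rate can be inferred from a noisy oxygen time series with a simple grid posterior. The synthetic data, flat prior range and noise level below are assumptions made purely for the example.

        import numpy as np

        # Synthetic BML oxygen record (mmol m^-3): hourly samples over 30 days,
        # drawn with a "true" consumption rate of 2.0 mmol m^-3 d^-1 plus noise.
        t = np.arange(0.0, 30.0, 1.0 / 24.0)            # days
        o2_obs = 250.0 - 2.0 * t + np.random.normal(0.0, 1.5, t.size)

        # Candidate consumption rates with a flat prior over the grid.
        rates = np.linspace(0.0, 5.0, 501)
        drate = rates[1] - rates[0]

        # Gaussian likelihood of the record for each candidate rate, holding the
        # initial concentration fixed at its observed value.
        sigma = 1.5
        o2_model = o2_obs[0] - rates[:, None] * t[None, :]
        log_post = -0.5 * np.sum(((o2_obs - o2_model) / sigma) ** 2, axis=1)

        # Normalise and summarise the posterior (mean and 95% credible interval).
        post = np.exp(log_post - log_post.max())
        post /= post.sum() * drate
        mean = np.sum(rates * post) * drate
        cdf = np.cumsum(post) * drate
        lo, hi = rates[np.searchsorted(cdf, 0.025)], rates[np.searchsorted(cdf, 0.975)]
        print(f"posterior mean rate: {mean:.2f} mmol m^-3 d^-1 (95% CI {lo:.2f}-{hi:.2f})")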

    Epigenetics as a mechanism driving polygenic clinical drug resistance

    Aberrant methylation of CpG islands located at or near gene promoters is associated with inactivation of gene expression during tumour development. It is increasingly recognised that such epimutations may occur at a much higher frequency than gene mutations and therefore have a greater impact on the selection of subpopulations of cells during tumour progression or the acquisition of resistance to anticancer drugs. Although laboratory-based models of acquired resistance to anticancer agents tend to focus on specific genes or biochemical pathways, such 'one gene: one outcome' models may be an oversimplification of acquired resistance in the treatment of cancer patients. Instead, clinical drug resistance may be due to changes in the expression of a large number of genes that have a cumulative impact on chemosensitivity. Aberrant CpG island methylation of multiple genes, occurring in a nonrandom manner during tumour development and during the acquisition of drug resistance, provides a mechanism whereby the expression of multiple genes could be affected simultaneously, resulting in polygenic clinical drug resistance. If simultaneous epigenetic regulation of multiple genes is indeed a major driving force behind the acquired resistance of patients' tumours to anticancer agents, this has important implications for biomarker studies of clinical outcome following chemotherapy and for clinical approaches designed to circumvent or modulate drug resistance.

    Increasing dominance of large lianas in Amazonian forests

    Ecological orthodoxy suggests that old-growth forests should be close to dynamic equilibrium, but this view has been challenged by recent findings that neotropical forests are accumulating carbon and biomass, possibly in response to increasing atmospheric concentrations of carbon dioxide. However, it is unclear whether the recent increase in tree biomass has been accompanied by a shift in community composition. Such changes could reduce or enhance the carbon storage potential of old-growth forests in the long term. Here we show that non-fragmented Amazon forests are experiencing a concerted increase in the density, basal area and mean size of woody climbing plants (lianas). Over the last two decades of the twentieth century, the dominance of large lianas relative to trees increased by 1.7–4.6% a year. Lianas enhance tree mortality and suppress tree growth, so their rapid increase implies that the tropical terrestrial carbon sink may shut down sooner than current models suggest. Predictions of future tropical carbon fluxes will need to account for the changing composition and dynamics of supposedly undisturbed forests.
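
    For scale, a back-of-envelope compounding of the reported annual rates over the roughly two-decade observation window (an illustrative calculation only, not a figure from the paper):

        # Compound growth implied by a 1.7-4.6% annual increase in liana
        # dominance sustained over 20 years.
        for annual_rate in (0.017, 0.046):
            factor = (1.0 + annual_rate) ** 20
            print(f"{annual_rate:.1%} per year for 20 years -> x{factor:.2f} overall")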

    Measuring the impact and costs of a universal group based parenting programme: protocol and implementation of a trial

    Background: Sub-optimal parenting is a common risk factor for a wide range of negative health, social and educational outcomes. Most parenting programmes have been developed in the USA in the context of delinquency prevention for targeted or indicated groups, and the main theoretical underpinning for these programmes is behaviour management. The Family Links Nurturing Programme (FLNP) focuses on family relationships as well as behaviour management and is offered on a universal basis. As a result it may be better placed to improve health and educational outcomes. Developed in the UK voluntary sector, FLNP is popular with practitioners, has impressed policy makers throughout the UK and has been found to be effective in before/after and qualitative studies, but it lacks a randomised controlled trial (RCT) evidence base. Methods/Design: A multi-centre, investigator-blind, randomised controlled trial of the FLNP with a target sample of 288 south Wales families who have a child aged 2-4 yrs living in or near to Flying Start/Sure Start areas. Changes in parenting, parent-child relations and parent and child wellbeing are assessed with validated measures immediately and at 6 months post intervention. Economic components include cost-consequences and cost-utility analyses based on parental ranking of states of quality of life. Attendance and completion rates and fidelity to the FLNP course delivery are assessed. A nested qualitative study will assess reasons for participation and non-participation and the perceived value of the programme to families. By the end of May 2010, 287 families had been recruited into the trial across four areas of south Wales. Recruitment has not met the planned timescales, with barriers including professional anxiety about families entering the control arm of the trial, family concern about video and audio recording, programme facilitator concern about the recording of FLNP sessions for fidelity purposes and delays due to the new UK research governance procedures. Discussion: Whilst there are strong theoretical arguments to support universal provision of parenting programmes, few universal programmes have been subjected to randomised controlled trials. In this paper we describe an RCT protocol with quantitative and qualitative outcome measures and an economic evaluation designed to provide clear evidence with regard to effectiveness and costs. We describe challenges in implementing the protocol and how we are addressing these.

    Solving Quantum Ground-State Problems with Nuclear Magnetic Resonance

    Quantum ground-state problems are computationally hard; for general many-body Hamiltonians, no classical or quantum algorithm is known that can solve them efficiently. Nevertheless, if a trial wavefunction approximating the ground state is available, as often happens for many problems in physics and chemistry, a quantum computer could employ this trial wavefunction to project out the ground state by means of the phase estimation algorithm (PEA). We performed an experimental realization of this idea by implementing a variational-wavefunction approach to solve the ground-state problem of the Heisenberg spin model with an NMR quantum simulator. Our iterative phase estimation procedure yields high accuracy for the eigenenergies (to the 10^-5 level). The ground-state fidelity was distilled to be more than 80%, and the singlet-to-triplet switching near the critical field is reliably captured. This result shows that quantum simulators can better leverage classical trial wavefunctions than classical computers.
    Comment: 11 pages, 13 figures
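
    As a classical toy sketch of the scheme (not the NMR experiment itself): iterative phase estimation applied to a two-spin Heisenberg Hamiltonian, with the exact ground state standing in for the variational trial wavefunction so that every ancilla readout is deterministic. The coupling, time step and number of phase bits below are arbitrary illustrative choices.

        import numpy as np

        # Two-spin Heisenberg Hamiltonian H = J (XX + YY + ZZ).
        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
        Z = np.array([[1, 0], [0, -1]], dtype=complex)
        J = 1.0
        H = J * (np.kron(X, X) + np.kron(Y, Y) + np.kron(Z, Z))

        # Exact ground state as a stand-in for the trial wavefunction.
        evals, evecs = np.linalg.eigh(H)
        psi = evecs[:, 0]

        # U = exp(-iHt), with t small enough that the phase -E0*t/(2*pi) lies in [0, 1).
        t = 0.1
        U = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T

        # Iterative phase estimation: read out m bits of the phase, least
        # significant bit first, feeding the known bits back as a phase correction.
        m = 16
        bits = []                                   # collects [b_m, ..., b_{k+1}]
        for k in range(m, 0, -1):
            tail = sum(b / 2.0 ** (j + 2) for j, b in enumerate(reversed(bits)))
            amp = psi.conj() @ np.linalg.matrix_power(U, 2 ** (k - 1)) @ psi
            frac = (np.angle(amp) / (2.0 * np.pi)) % 1.0
            bits.append(int(round(2.0 * ((frac - tail) % 1.0))) % 2)

        phi = sum(b / 2.0 ** (i + 1) for i, b in enumerate(reversed(bits)))
        print("estimated E0:", -2.0 * np.pi * phi / t, " exact E0:", evals[0])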

    Construct-level predictive validity of educational attainment and intellectual aptitude tests in medical student selection: meta-regression of six UK longitudinal studies

    Background: Measures used for medical student selection should predict future performance during training. A problem for any selection study is that predictor-outcome correlations are known only in those who have been selected, whereas selectors need to know how measures would predict in the entire pool of applicants. That problem of interpretation can be solved by calculating construct-level predictive validity, an estimate of the true predictor-outcome correlation across the range of applicant abilities. Methods: Construct-level predictive validities were calculated in six cohort studies of medical student selection and training (student entry, 1972 to 2009) for a range of predictors, including A-levels, General Certificates of Secondary Education (GCSEs)/O-levels, and aptitude tests (AH5 and UK Clinical Aptitude Test (UKCAT)). Outcomes included undergraduate basic medical science and finals assessments, as well as postgraduate measures of Membership of the Royal Colleges of Physicians of the United Kingdom (MRCP(UK)) performance and entry onto the Specialist Register. Construct-level predictive validity was calculated with the method of Hunter, Schmidt and Le (2006), adapted to correct for right-censorship of examination results due to grade inflation. Results: Meta-regression analyzed 57 separate predictor-outcome correlations (POCs) and construct-level predictive validities (CLPVs). Mean CLPVs are substantially higher (.450) than mean POCs (.171). Mean CLPVs for first-year examinations were high for A-levels (.809; CI: .501 to .935), and lower for GCSEs/O-levels (.332; CI: .024 to .583) and UKCAT (.245; CI: .207 to .276). A-levels had higher CLPVs for all undergraduate and postgraduate assessments than did GCSEs/O-levels and intellectual aptitude tests. CLPVs of educational attainment measures decline somewhat during training, but continue to predict postgraduate performance. Intellectual aptitude tests have lower CLPVs than A-levels or GCSEs/O-levels. Conclusions: Educational attainment has strong CLPVs for undergraduate and postgraduate performance, accounting for perhaps 65% of true variance in first-year performance. Such CLPVs justify the use of educational attainment measures in selection, but also raise a key theoretical question concerning the remaining 35% of variance (measurement error, range restriction and right-censorship having been taken into account). Just as in astrophysics 'dark matter' and 'dark energy' are posited to balance various theoretical equations, so medical student selection must also have its 'dark variance', whose nature is not yet properly characterized but which explains a third of the variation in performance during training. Some of this variance probably relates to factors which are unpredictable at selection, such as illness or other life events, but some is probably also associated with factors such as personality, motivation or study skills.
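
    The crucial step is correcting correlations observed in the selected group for range restriction. A simplified illustration with hypothetical numbers, using Thorndike's classic correction for direct range restriction rather than the full Hunter, Schmidt and Le procedure (and without the right-censorship adjustment) used in the paper:

        import math

        def correct_direct_range_restriction(r_obs: float, sd_ratio: float) -> float:
            """Thorndike Case II correction for direct range restriction.

            r_obs    -- predictor-outcome correlation observed in the selected group
            sd_ratio -- predictor SD in all applicants / predictor SD in the selected group
            """
            u = sd_ratio
            return (r_obs * u) / math.sqrt(1.0 - r_obs ** 2 + (r_obs ** 2) * (u ** 2))

        # Hypothetical example: a correlation of .17 observed among selected entrants,
        # with the applicant-pool predictor SD 2.5 times that of the selected group.
        print(round(correct_direct_range_restriction(0.17, 2.5), 3))   # ~0.40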

    The holographic principle

    There is strong evidence that the area of any surface limits the information content of adjacent spacetime regions, at 10^(69) bits per square meter. We review the developments that have led to the recognition of this entropy bound, placing special emphasis on the quantum properties of black holes. The construction of light-sheets, which associate relevant spacetime regions to any given surface, is discussed in detail. We explain how the bound is tested and demonstrate its validity in a wide range of examples. A universal relation between geometry and information is thus uncovered. It has yet to be explained. The holographic principle asserts that its origin must lie in the number of fundamental degrees of freedom involved in a unified description of spacetime and matter. It must be manifest in an underlying quantum theory of gravity. We survey some successes and challenges in implementing the holographic principle.
    Comment: 52 pages, 10 figures; invited review for Rev. Mod. Phys.; v2: reference added
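
    The quoted figure follows from the Bekenstein-Hawking entropy density, one quarter of the area in Planck units, converted from nats to bits; a quick check of the arithmetic using standard constants (nothing specific to this review):

        \[
        \frac{S}{A} \;=\; \frac{1}{4\,\ell_{\mathrm{P}}^{2}\,\ln 2}
        \;=\; \frac{c^{3}}{4\,G\hbar\,\ln 2}
        \;\approx\; \frac{1}{4 \times (1.616 \times 10^{-35}\,\mathrm{m})^{2} \times 0.693}
        \;\approx\; 1.4 \times 10^{69}\ \mathrm{bits\ per\ m^{2}} .
        \]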

    Quantum Simulation of Tunneling in Small Systems

    A number of quantum algorithms have been performed on small quantum computers; these include Shor's prime factorization algorithm, error correction, Grover's search algorithm and a number of analog and digital quantum simulations. Because of the number of gates and qubits necessary, however, digital quantum particle simulations remain untested. A contributing factor to the system size required is the number of ancillary qubits needed to implement matrix exponentials of the potential operator. Here, we show that a set of tunneling problems may be investigated with no ancillary qubits and at a cost of one single-qubit operator per time step for the potential evolution. We show that physically interesting simulations of tunneling using 2 qubits (i.e. on a 4-point lattice grid) may be performed with 40 single- and two-qubit gates. Approximately 70 to 140 gates are needed to see interesting tunneling dynamics in three-qubit (8 lattice point) simulations.
    Comment: 4 pages, 2 figures
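
    A classical sketch of the kind of dynamics being simulated: split-operator (Trotterized) evolution of a wavepacket on a 4-point periodic grid, the 2-qubit case mentioned above, with the potential applied as a diagonal phase and the kinetic term applied in the Fourier basis. The barrier height, time step and initial state are arbitrary illustrative choices, not parameters from the paper.

        import numpy as np

        # 4-point periodic grid: the lattice a 2-qubit register would encode.
        n, dx = 4, 1.0

        # Square barrier on the last two grid points (illustrative values).
        V = np.array([0.0, 0.0, 2.0, 2.0])

        # Kinetic energy is diagonal in the plane-wave (Fourier) basis.
        k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)
        T = 0.5 * k ** 2

        # Start localised on the first grid point, to the left of the barrier.
        psi = np.zeros(n, dtype=complex)
        psi[0] = 1.0

        # Trotterized time steps: potential phase, then kinetic phase via FFT.
        dt, steps = 0.05, 200
        prob_right = []
        for _ in range(steps):
            psi *= np.exp(-1j * V * dt)
            psi = np.fft.ifft(np.exp(-1j * T * dt) * np.fft.fft(psi))
            prob_right.append(np.sum(np.abs(psi[2:]) ** 2))

        # Probability of finding the particle beyond the barrier over time.
        print(f"max probability past the barrier: {max(prob_right):.3f}")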