
    Of mice and men: Sparse statistical modeling in cardiovascular genomics

    In high-throughput genomics, large-scale designed experiments are becoming common, and analysis approaches based on highly multivariate regression and ANOVA concepts are key tools. Shrinkage models of one form or another can provide comprehensive approaches to the problems of simultaneous inference that involve implicit multiple comparisons over the many, many parameters representing effects of design factors and covariates. We use such approaches here in a study of cardiovascular genomics. The primary experimental context concerns a carefully designed, and rich, gene expression study focused on gene-environment interactions, with the goals of identifying genes implicated in connection with disease states and known risk factors, and of generating expression signatures as proxies for such risk factors. A coupled exploratory analysis investigates cross-species extrapolation of gene expression signatures: how these mouse-model signatures translate to humans. The latter involves exploration of sparse latent factor analysis of human observational data and of how it relates to projected risk signatures derived in the animal models. The study also highlights a range of applied statistical and genomic data analysis issues, including model specification, computational questions and model-based correction of experimental artifacts in DNA microarray data. Comment: Published at http://dx.doi.org/10.1214/07-AOAS110 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
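
    As a minimal illustration of the kind of shrinkage regression this abstract refers to (an L1 penalty that drives most coefficients to zero so that only a few design effects are retained), the sketch below fits a lasso to simulated expression data with scikit-learn. It is not the authors' sparse hierarchical model; all variable names, sizes and values are assumptions.

```python
# Minimal sketch of shrinkage regression for a designed expression study.
# Illustrative only: a plain lasso on simulated data, not the sparse
# hierarchical model used in the paper.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_samples, n_factors = 60, 200                 # arrays x design factors/covariates
X = rng.normal(size=(n_samples, n_factors))
true_beta = np.zeros(n_factors)
true_beta[:5] = [2.0, -1.5, 1.0, 0.8, -0.6]    # only a handful of real effects
y = X @ true_beta + rng.normal(scale=0.5, size=n_samples)   # one gene's expression

model = Lasso(alpha=0.1).fit(X, y)             # L1 shrinkage zeroes most coefficients
print("effects retained:", np.flatnonzero(model.coef_))
```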

    Irrigation Technology Adoption in the Texas High Plains: A Real Options Approach

    Water scarcity has been a significant issue for several decades in the Texas High Plains, with agriculture identified as the main activity contributing to this scarcity. To address the issue, much effort has been devoted to developing and encouraging adoption of sophisticated irrigation systems with high water application efficiency, such as the low energy precision application (LEPA) system, subsurface drip irrigation (SDI), and variable rate irrigation (VRI). In this study, the economic feasibility of these irrigation systems for cotton farming in the Texas High Plains is evaluated using a real options approach. Results indicate that only the LEPA system is profitable under current conditions; the VRI system becomes profitable at high cotton prices (above $0.72/lb), while SDI is not profitable under any of the conditions explored.
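
    As a minimal illustration of the real options logic used here (the value of being able to defer an irreversible irrigation investment), the sketch below prices an option to invest on a binomial lattice. The paper's stochastic model, crop budgets and system costs are not reproduced; every number below is hypothetical.

```python
# Sketch of a real-options (option-to-defer) valuation on a binomial lattice.
# Illustrative only: parameters are hypothetical, not taken from the study.
import numpy as np

V0, I = 100.0, 95.0                       # project value and irreversible investment cost
r, sigma, T, n = 0.04, 0.30, 5.0, 60      # discount rate, volatility, horizon (yrs), steps
dt = T / n
u, d = np.exp(sigma * np.sqrt(dt)), np.exp(-sigma * np.sqrt(dt))
p = (np.exp(r * dt) - d) / (u - d)        # risk-neutral up-move probability

# Terminal project values, then roll back allowing investment at every node.
values = V0 * u ** np.arange(n, -1, -1) * d ** np.arange(0, n + 1)
option = np.maximum(values - I, 0.0)
for step in range(n - 1, -1, -1):
    values = values[:step + 1] / u        # project values one step earlier
    wait = np.exp(-r * dt) * (p * option[:step + 1] + (1 - p) * option[1:step + 2])
    option = np.maximum(values - I, wait) # invest now vs. keep the option alive

print(f"value of the investment opportunity: {option[0]:.2f}")
print(f"static NPV: {V0 - I:.2f} (the difference is the value of waiting)")
```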

    EFFECT OF REVENUE INSURANCE ON ENTRY AND EXIT DECISIONS IN TABLE GRAPE PRODUCTION: A REAL OPTION APPROACH

    This study determines the entry and exit thresholds of table grape farming with irreversible investment under uncertainty. A real options approach is adopted to account for investment and management flexibility, and revenue insurance is introduced to examine the effect of risk management programs on the entry and exit thresholds. Results show that when the revenue guarantee is below the exit threshold, revenue insurance raises the entry and exit thresholds by 1% and 4%, respectively, thus discouraging both new investment and continued farming. When the revenue guarantee is above the exit threshold, revenue insurance raises the entry threshold by 3% and lowers the exit threshold by 13%; in this case it discourages new investment but further encourages current farmers to stay in farming. A decrease in the subsidy rate, however, raises both the entry and exit thresholds. Premium subsidy levels should therefore be considered carefully if the policy objective is to encourage growers to shift to higher-value crops.
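
    For context on where such entry and exit triggers come from, a textbook single-trigger version (revenue $R$ following a geometric Brownian motion with drift $\mu$ and volatility $\sigma$, sunk entry cost $I$, flow operating cost $c$, discount rate $\rho > \mu$; the paper's full entry/exit model with insurance is solved numerically) gives an entry threshold inflated above the Marshallian break-even level by an option-value multiple:

    \[
    \tfrac{1}{2}\sigma^2 \beta(\beta - 1) + \mu\beta - \rho = 0, \qquad \beta_1 > 1 \ \text{(positive root)},
    \]
    \[
    R_H = \frac{\beta_1}{\beta_1 - 1}\,(\rho - \mu)\left(I + \frac{c}{\rho}\right).
    \]

    Higher volatility lowers $\beta_1$ and so raises $R_H$; adding an exit option and a revenue guarantee shifts both triggers, which is the effect the study quantifies.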

    Performance assessment of urban precinct design: a scoping study

    Executive Summary: Significant advances have been made over the past decade in the development of scientifically and industry accepted tools for the performance assessment of buildings in terms of energy, carbon, water, indoor environment quality, etc. Realising resilient, sustainable, low carbon urban development in the 21st century, however, will require several radical transitions in design performance beyond the scale of individual buildings. One of these involves the creation and application of leading-edge tools (not widely available to built environment professions and practitioners) capable of assessing performance across all stages of development at a precinct scale (neighbourhood, community and district) in greenfield, brownfield or greyfield settings. A core aspect here is the development of a new way of modelling precincts, referred to as Precinct Information Modelling (PIM), that provides for transparent sharing and linking of precinct object information across the development life cycle, together with consistent, accurate and reliable access to reference data, including that associated with the urban context of the precinct. Neighbourhoods are the ‘building blocks’ of our cities and represent the scale at which urban design needs to make its contribution to city performance: as productive, liveable, environmentally sustainable and socially inclusive places (COAG 2009). Neighbourhood design constitutes a major area for innovation as part of an urban design protocol established by the federal government (Department of Infrastructure and Transport 2011, see Figure 1). The ability to efficiently and effectively assess urban design performance at a neighbourhood level is in its infancy. This study was undertaken by Swinburne University of Technology, University of New South Wales, CSIRO and buildingSMART Australasia on behalf of the CRC for Low Carbon Living.

    Quantum decoherence dynamics of divacancy spins in silicon carbide

    Long coherence times are key to the performance of quantum bits (qubits). Here, we experimentally and theoretically show that the Hahn-echo coherence time (T2) of electron spins associated with divacancy defects in 4H-SiC reaches 1.3 ms, one of the longest T2 times of an electron spin in a naturally isotopic crystal. Using a first-principles microscopic quantum-bath model, we find that two factors determine this unusually robust coherence. First, in the presence of moderate magnetic fields (300 G and above), the 29Si and 13C paramagnetic nuclear spin baths are decoupled. In addition, because SiC is a binary crystal, homo-nuclear spin pairs are both diluted and prevented from forming strongly coupled, nearest-neighbor pairs. Longer neighbor distances result in fewer nuclear spin flip-flops, a less fluctuating intra-crystalline magnetic environment, and thus a longer T2 time. Our results point to polyatomic crystals as promising hosts for coherent qubits in the solid state. Comment: 22 pages, 5 figures; Supplementary information is added.
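
    For reference, Hahn-echo coherence times such as the quoted $T_2$ are conventionally extracted by fitting the echo amplitude to a (possibly stretched) exponential decay; a generic form, not necessarily the exact fit function used here, is

    \[
    L(2\tau) = \exp\!\left[-\left(\frac{2\tau}{T_2}\right)^{n}\right],
    \]

    where $2\tau$ is the total free-evolution time in the $\pi/2$–$\tau$–$\pi$–$\tau$–echo sequence and the stretching exponent $n$ reflects the character of the nuclear spin-bath noise.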

    The SDSS Coadd: Cross-Correlation Weak Lensing and Tomography of Galaxy Clusters

    The shapes of distant galaxies are sheared by intervening galaxy clusters. We examine this effect in Stripe 82, a 275 square degree region observed multiple times in the Sloan Digital Sky Survey and coadded to achieve greater depth. We obtain a mass-richness calibration that is similar to other SDSS analyses, demonstrating that the coaddition process did not adversely affect the lensing signal. We also propose a new parameterization of the effect of tomography on the cluster lensing signal which does not require binning in redshift, and we show that with this parameterization we can detect tomography for stacked clusters at varying redshifts. Finally, because the tomographic detection is sensitive to accurately marginalizing over the cluster mass, we show that tomography at low redshift (where dependence on the exact cosmological model is weak) can be used to constrain cluster mass profiles. Comment: 8 pages, 13 figures, submitted to ApJ. Analysis updated using the revised photo-z catalog of Reis et al. (arXiv:1111.6620v2); changes in results are within the errors and the conclusions are unaffected.
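
    As background for the tomography discussion, the stacked cluster lensing signal is usually written in terms of the critical surface density, whose source-redshift dependence is what a tomographic measurement traces (standard weak-lensing relations, not the paper's specific binning-free parameterization):

    \[
    \gamma_t(R; z_l, z_s) = \frac{\Delta\Sigma(R)}{\Sigma_{\rm cr}(z_l, z_s)}, \qquad
    \Sigma_{\rm cr}(z_l, z_s) = \frac{c^2}{4\pi G}\,\frac{D_s}{D_l\,D_{ls}},
    \]

    so at fixed lens mass the tangential shear scales with the angular-diameter distance ratio $D_{ls}/D_s$, which grows with source redshift behind a low-redshift lens.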

    Particle mesh simulations of the Lyman-alpha forest and the signature of Baryon Acoustic Oscillations in the intergalactic medium

    We present a set of ultra-large particle-mesh simulations of the LyA forest targeted at understanding the imprint of baryon acoustic oscillations (BAO) in the intergalactic medium. We use nine dark-matter-only simulations which can, for the first time, simultaneously resolve the Jeans scale of the intergalactic gas while covering the large volumes required to adequately sample the acoustic feature. Mock absorption spectra, generated using the fluctuating Gunn-Peterson approximation, have approximately correct flux probability density functions (PDFs) and small-scale power spectra. On larger scales there is clear evidence in the redshift-space correlation function for an acoustic feature, which matches a linear-theory template with constant bias. These spectra, which we make publicly available, can be used to test pipelines, plan future experiments and model various physical effects. As an illustration we discuss the basic properties of the acoustic signal in the forest, the scaling of errors with noise and source number density, modified statistics to treat mean-flux evolution and misestimation, and non-gravitational sources such as fluctuations in the photo-ionizing background and temperature fluctuations due to HeII reionization. Comment: 11 pages, 10 figures, minor changes to address referee report.
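
    For reference, the fluctuating Gunn-Peterson approximation mentioned above maps the simulated density field to transmitted flux roughly as (a standard form; the amplitude $A$ is fixed by matching the observed mean flux, and the exact implementation in these simulations may differ in detail)

    \[
    F = e^{-\tau}, \qquad \tau(x) \simeq A\,\bigl[1 + \delta(x)\bigr]^{\,2 - 0.7(\gamma - 1)},
    \]

    where $\delta$ is the overdensity and $\gamma$ is the slope of the temperature-density relation $T = T_0 (1+\delta)^{\gamma-1}$.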