
    Using somatic mutation data to test tumors for clonal relatedness

    A major challenge for cancer pathologists is to determine whether a new tumor in a patient with cancer is a metastasis or an independent occurrence of the disease. In recent years numerous studies have evaluated pairs of tumor specimens to examine the similarity of the somatic characteristics of the tumors and to test for clonal relatedness. As the landscape of mutation testing has evolved, a number of statistical methods for determining clonality have been developed, notably for comparing losses of heterozygosity at candidate markers and for comparing copy number profiles. Increasingly, tumors are being evaluated for point mutations in panels of candidate genes using gene sequencing technologies. Comparison of the mutational profiles of pairs of tumors presents unusual methodological challenges: mutations at some loci are much more common than others; knowledge of the marginal mutation probabilities is scanty for most loci at which mutations might occur; and the sample space of potential mutational profiles is vast. We examine this problem and propose a test for clonal relatedness of a pair of tumors from a single patient. Using simulations, its properties are shown to be promising. The method is illustrated using several examples from the literature. Comment: Published at http://dx.doi.org/10.1214/15-AOAS836 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
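    As a rough illustration of the null model underlying such a test (not the test developed in the paper), suppose the marginal mutation probabilities of the panel loci were known; under independent origin, a shared mutation at locus j occurs with probability roughly p_j^2, and the chance of seeing the observed number of shared mutations can be simulated. The probabilities, profiles, and panel size below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical marginal mutation probabilities for a small gene panel
# (illustrative values only; real panels involve many more loci).
p = np.array([0.40, 0.15, 0.05, 0.02, 0.01])

# Observed profiles for the two tumors: 1 = mutated, 0 = wild type.
tumor_a = np.array([1, 1, 0, 1, 0])
tumor_b = np.array([1, 1, 0, 0, 0])
observed_matches = int(np.sum((tumor_a == 1) & (tumor_b == 1)))

# Null hypothesis: the tumors arose independently, so shared mutations occur
# only by chance. Simulate independent pairs of profiles and count how often
# they share at least as many mutations as the observed pair.
n_sim = 100_000
sims_a = rng.random((n_sim, p.size)) < p
sims_b = rng.random((n_sim, p.size)) < p
sim_matches = np.sum(sims_a & sims_b, axis=1)
p_value = np.mean(sim_matches >= observed_matches)

print(f"shared mutations: {observed_matches}, Monte Carlo p-value: {p_value:.4f}")
```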

    Comparing ROC Curves Derived From Regression Models

    In constructing predictive models, investigators frequently assess the incremental value of a predictive marker by comparing the ROC curve generated from the predictive model including the new marker with the ROC curve from the model excluding the new marker. Many commentators have noticed empirically that a test of the two ROC areas often produces a non-significant result when a corresponding Wald test from the underlying regression model is significant. A recent article showed using simulations that the widely used ROC area test [1] produces exceptionally conservative test size and extremely low power [2]. In this article we show why the ROC area test is invalid in this context. We demonstrate how a valid test of the ROC areas can be constructed that has comparable statistical properties to the Wald test. We conclude that using the Wald test to assess the incremental contribution of a marker remains the best strategy. We also examine the use of derived markers from non-nested models and the use of validation samples. We show that comparing ROC areas is invalid in these contexts as well.
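    To make the comparison concrete, here is a minimal sketch (not the authors' simulation design) of the two quantities being contrasted: the Wald statistic for the new marker in a logistic model and the ROC areas of the nested models. The simulated data, effect sizes, and in-sample AUC evaluation are assumptions for illustration.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 500

# Simulated data: x1 is an established predictor, x2 a candidate new marker.
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
risk = 1 / (1 + np.exp(-(-0.5 + 1.0 * x1 + 0.4 * x2)))
y = (rng.random(n) < risk).astype(int)

# Nested logistic models: without and with the new marker.
X_base = sm.add_constant(np.column_stack([x1]))
X_full = sm.add_constant(np.column_stack([x1, x2]))
fit_base = sm.Logit(y, X_base).fit(disp=0)
fit_full = sm.Logit(y, X_full).fit(disp=0)

# Wald statistic for the new marker's coefficient in the full model.
wald_z = fit_full.params[2] / fit_full.bse[2]
print(f"Wald z for new marker: {wald_z:.2f}")

# ROC areas of the two fitted risk scores (in-sample, for illustration only).
auc_base = roc_auc_score(y, fit_base.predict(X_base))
auc_full = roc_auc_score(y, fit_full.predict(X_full))
print(f"AUC without marker: {auc_base:.3f}, with marker: {auc_full:.3f}")
```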

    Autonomy Infused Teleoperation with Application to BCI Manipulation

    Robot teleoperation systems face a common set of challenges, including latency, low-dimensional user commands, and asymmetric control inputs. User control with Brain-Computer Interfaces (BCIs) exacerbates these problems through especially noisy and erratic low-dimensional motion commands due to the difficulty in decoding neural activity. We introduce a general framework to address these challenges through a combination of computer vision, user intent inference, and arbitration between the human input and autonomous control schemes. Adjustable levels of assistance allow the system to balance the operator's capabilities and feelings of comfort and control while compensating for a task's difficulty. We present experimental results demonstrating significant performance improvement using the shared-control assistance framework on adapted rehabilitation benchmarks with two subjects implanted with intracortical brain-computer interfaces controlling a seven degree-of-freedom robotic manipulator as a prosthetic. Our results further indicate that shared assistance mitigates perceived user difficulty and even enables successful performance on previously infeasible tasks. We showcase the extensibility of our architecture with applications to quality-of-life tasks such as opening a door, pouring liquids from containers, and manipulating novel objects in densely cluttered environments.
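    The arbitration step can be pictured as blending the decoded user command with an autonomous command toward the inferred goal; the linear blend below is a generic sketch of this idea under an adjustable assistance level, not the specific arbitration policy used in the paper, and the command values are invented.

```python
import numpy as np

def arbitrate(user_cmd: np.ndarray, auto_cmd: np.ndarray, assistance: float) -> np.ndarray:
    """Blend a noisy user velocity command with an autonomous command toward
    the inferred goal. `assistance` in [0, 1] sets how much the autonomy
    contributes (0 = pure teleoperation, 1 = full autonomy)."""
    assistance = float(np.clip(assistance, 0.0, 1.0))
    return (1.0 - assistance) * user_cmd + assistance * auto_cmd

# Hypothetical end-effector velocity commands (translation, rotation, gripper).
user = np.array([0.30, -0.05, 0.00, 0.0, 0.0, 0.0, 0.1])   # noisy BCI decode
auto = np.array([0.20,  0.10, -0.05, 0.0, 0.0, 0.0, 0.0])  # planner toward goal
print(arbitrate(user, auto, assistance=0.6))
```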

    Estimating the Empirical Lorenz Curve and Gini Coefficient in the Presence of Error

    The Lorenz curve is a graphical tool that is widely used to characterize the concentration of a measure in a population, such as wealth. It is frequently the case that the measure of interest used to rank experimental units when estimating the empirical Lorenz curve, and the corresponding Gini coefficient, is subject to random error. This error can result in an incorrect ranking of experimental units, which inevitably leads to a curve that exaggerates the degree of concentration (variation) in the population. We explore this bias and discuss several widely available statistical methods that have the potential to reduce or remove the bias in the empirical Lorenz curve. The properties of these methods are examined and compared in a simulation study. This work is motivated by a health outcomes application which seeks to assess the concentration of black patient visits among primary care physicians. The methods are illustrated on data from this study.
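    For reference, here is a minimal sketch of the empirical Lorenz curve and Gini coefficient, and of how ranking by an error-contaminated measure can inflate the apparent concentration; the distributions and error scale below are arbitrary choices for illustration, and the bias-reduction methods discussed in the paper are not shown.

```python
import numpy as np

def lorenz_and_gini(x):
    """Empirical Lorenz curve ordinates and Gini coefficient for nonnegative x."""
    x = np.sort(np.asarray(x, dtype=float))
    cum = np.cumsum(x)
    lorenz = np.insert(cum / cum[-1], 0, 0.0)  # cumulative share of the measure
    n = x.size
    gini = 2.0 * np.sum(np.arange(1, n + 1) * x) / (n * cum[-1]) - (n + 1) / n
    return lorenz, gini

rng = np.random.default_rng(2)
true_measure = rng.gamma(shape=2.0, scale=10.0, size=1000)
noisy_measure = np.clip(true_measure + rng.normal(scale=10.0, size=1000), 0, None)

print("Gini from the true measure: ", round(lorenz_and_gini(true_measure)[1], 3))
print("Gini from the noisy measure:", round(lorenz_and_gini(noisy_measure)[1], 3))
```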

    Statistical Evaluation of Evidence for Clonal Allelic Alterations in array-CGH Experiments

    In recent years numerous investigators have conducted genetic studies of pairs of tumor specimens from the same patient to determine whether the tumors share a clonal origin. These studies have the potential to be of considerable clinical significance, especially in clinical settings where the distinction between a new primary cancer and metastatic spread of a previous cancer would lead to radically different indications for treatment. Studies of clonality have typically involved comparison of the patterns of somatic mutations in the tumors at candidate genetic loci to see if the patterns are sufficiently similar to indicate a clonal origin. More recently, some investigators have explored the use of array CGH for this purpose. Standard clustering approaches have been used to analyze the data, but these existing statistical methods are not suited to this problem due to the paired nature of the data, and the fact that there exists no “gold standard” diagnosis to provide a definitive determination of which pairs are clonal and which pairs are of independent origin. In this article we propose a new statistical method that focuses on the individual allelic gains or losses that have been identified in both tumors, and a statistical test is developed that assesses the degree of matching of the locations of the markers that indicate the endpoints of the allelic change. The validity and statistical power of the test are evaluated, and it is shown to be a promising approach for establishing clonality in tumor samples.
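    The flavor of the endpoint-matching idea can be conveyed by a toy count of how many segment endpoints in one tumor fall near an endpoint in the other; the tolerance and segment coordinates below are invented, and the published procedure is a formal statistical test rather than this raw count.

```python
def count_matching_endpoints(segments_a, segments_b, tol=3):
    """Count endpoints (start/end marker indices) of allelic changes in tumor A
    that lie within `tol` markers of some endpoint in tumor B."""
    endpoints_a = {m for seg in segments_a for m in seg}
    endpoints_b = {m for seg in segments_b for m in seg}
    return sum(any(abs(a - b) <= tol for b in endpoints_b) for a in endpoints_a)

# Hypothetical (start, end) marker indices of allelic changes seen in both tumors.
tumor_a_segments = [(120, 180), (402, 455)]
tumor_b_segments = [(118, 183), (400, 470)]
print(count_matching_endpoints(tumor_a_segments, tumor_b_segments))
```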

    Statistical characteristics of the total ion density in the topside ionosphere during the period 1996-2004 using empirical orthogonal function (EOF) analysis

    We have applied empirical orthogonal function (EOF) analysis to examine the climatology of the total ion density Ni at 840 km during the period 1996-2004, obtained from the Defense Meteorological Satellite Program (DMSP) spacecraft. The data set for each of the local times (09:30 LT and 21:30 LT) is decomposed into a time mean plus the sum of EOF bases Ei of space, multiplied by time-varying EOF coefficients Ai. Physical interpretations are given for the first three EOFs, which together capture more than 95% of the total variance of the original data set. Results show that the dominant mode controlling the Ni variability is the solar EUV flux, which is consistent with the results of Rich et al. (2003). The second EOF, associated with the solar declination, presents an annual (summer to winter) asymmetry that is caused by the transequatorial winds. The semiannual variation that appears in the third EOF for the evening sector is interpreted as the combined effect of the equatorial electric fields and the wind patterns. Both the annual and semiannual variations are modulated by the solar flux, which has a close relationship with the O+ composition. The quick convergence of the EOF expansion makes it very convenient to construct an empirical model for the original data set. The modeled results show that the accuracy of the prediction depends mainly on the first principal component, which has a close relationship with the solar EUV flux.
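    The decomposition described (a time mean plus spatial EOF bases Ei weighted by time-varying coefficients Ai) can be computed with a singular value decomposition of the anomaly matrix; the sketch below uses a synthetic stand-in for the DMSP Ni data, so the shapes and values are assumptions.

```python
import numpy as np

# Synthetic stand-in for the data matrix: rows = time samples, columns = spatial
# grid points (the real analysis uses DMSP Ni maps; values here are illustrative).
rng = np.random.default_rng(3)
n_time, n_space = 365, 72
signal = 5.0 * np.sin(np.linspace(0, 2 * np.pi, n_time))[:, None] * \
         np.linspace(0.0, 1.0, n_space)[None, :]
data = signal + rng.normal(size=(n_time, n_space))

# EOF analysis: remove the time mean, then decompose the anomalies.
mean_field = data.mean(axis=0)
anomalies = data - mean_field
U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)

eofs = Vt            # spatial patterns E_i
coeffs = U * s       # time-varying coefficients A_i
explained = s**2 / np.sum(s**2)
print("variance captured by the first three EOFs:", round(explained[:3].sum(), 3))

# Reconstruction from the leading modes (the paper retains three):
k = 3
reconstructed = mean_field + coeffs[:, :k] @ eofs[:k, :]
```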

    A Metastasis or a Second Independent Cancer? Evaluating the Clonal Origin of Tumors Using Array-CGH Data

    When a cancer patient develops a new tumor, it is necessary to determine whether this is a recurrence (metastasis) of the original cancer or an entirely new occurrence of the disease. This is accomplished by assessing the histopathology of the lesions, and it is frequently relatively straightforward. However, there are many clinical scenarios in which this pathological diagnosis is difficult. Since each tumor is characterized by a genetic fingerprint of somatic mutations, a more definitive diagnosis is possible in principle in these difficult clinical scenarios by comparing the fingerprints. In this article we develop and evaluate a statistical strategy for this comparison when the data are derived from array comparative genomic hybridization, a technique designed to identify all of the somatic allelic gains and losses across the genome. Our method involves several stages. First, a segmentation algorithm is used to estimate the regions of allelic gain and loss. Then the broad correlation in these patterns between the two tumors is assessed, leading to an initial likelihood ratio for the two diagnoses. This is then further refined by comparing in detail each plausibly clonal mutation within individual chromosome arms, and the results are aggregated to determine a final likelihood ratio. The method is employed to diagnose patients from several clinical scenarios, and the results show that in many cases a strong clonal signal emerges, occasionally contradicting the clinical diagnosis. The “quality” of the arrays can be summarized by a parameter that characterizes the clarity with which allelic changes are detected. Sensitivity analyses show that most of the diagnoses are robust when the data are of high quality.
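    The first stage (a broad comparison of the segmented gain/loss patterns of the two tumors) can be illustrated with a simple concordance summary on a common grid of bins; the calls below are invented, and the paper's likelihood-ratio machinery is not reproduced here.

```python
import numpy as np

# Hypothetical genome-wide calls on a common grid of bins:
# +1 = allelic gain, -1 = allelic loss, 0 = no change.
tumor_a = np.array([0, 1, 1, 0, -1, -1, 0, 0, 1, 0])
tumor_b = np.array([0, 1, 1, 0, -1,  0, 0, 0, 1, 0])

# Broad similarity between the two copy-number patterns.
corr = np.corrcoef(tumor_a, tumor_b)[0, 1]
concordant = int(np.sum((tumor_a != 0) & (tumor_a == tumor_b)))
print(f"pattern correlation: {corr:.2f}, concordant altered bins: {concordant}")
```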

    The UTMOST: A hybrid digital signal processor transforms the MOST

    The Molonglo Observatory Synthesis Telescope (MOST) is an 18,000 square meter radio telescope situated some 40 km from the city of Canberra, Australia. Its operating band (820-850 MHz) is now partly allocated to mobile phone communications, making radio astronomy challenging. We describe how the deployment of new digital receivers (RX boxes), Field Programmable Gate Array (FPGA) based filterbanks and server-class computers equipped with 43 GPUs (Graphics Processing Units) has transformed MOST into a versatile new instrument (the UTMOST) for studying the dynamic radio sky on millisecond timescales, ideal for work on pulsars and Fast Radio Bursts (FRBs). The filterbanks, servers and their high-speed, low-latency network form part of a hybrid solution to the observatory's signal processing requirements. The emphasis on software and commodity off-the-shelf hardware has enabled rapid deployment through the re-use of proven 'software backends' for its signal processing. The new receivers have ten times the bandwidth of the original MOST and double the sampling of the line feed, which doubles the field of view. The UTMOST can simultaneously excise interference, make maps, coherently dedisperse pulsars, and perform real-time searches of coherent fan beams for dispersed single pulses. Although system performance is still sub-optimal, a pulsar timing and FRB search programme has commenced and the first UTMOST maps have been made. The telescope operates as a robotic facility, deciding how to efficiently target pulsars and how long to stay on source, via feedback from real-time pulsar folding. The regular timing of over 300 pulsars has resulted in the discovery of 7 pulsar glitches and 3 FRBs. The UTMOST demonstrates that if sufficient signal processing can be applied to the voltage streams it is possible to perform innovative radio science in hostile radio frequency environments. Comment: 12 pages, 6 figures
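    For context on the dedispersion task, the frequency-dependent arrival delay across the 820-850 MHz band follows the standard cold-plasma dispersion relation; the sketch below uses the usual approximate dispersion constant and an arbitrary example dispersion measure, not a measured UTMOST source.

```python
K_DM_MS = 4.149  # approximate dispersion constant, ms GHz^2 pc^-1 cm^3

def dispersion_delay_ms(dm: float, f_lo_ghz: float, f_hi_ghz: float) -> float:
    """Arrival delay (ms) of the low-frequency band edge relative to the high edge
    for a signal with dispersion measure `dm` (pc cm^-3)."""
    return K_DM_MS * dm * (f_lo_ghz**-2 - f_hi_ghz**-2)

# Example: DM = 500 pc cm^-3 across the UTMOST band (820-850 MHz).
print(f"{dispersion_delay_ms(dm=500.0, f_lo_ghz=0.820, f_hi_ghz=0.850):.1f} ms")
```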