1,711 research outputs found

    Causal inference for long-term survival in randomised trials with treatment switching: Should re-censoring be applied when estimating counterfactual survival times?

    Treatment switching often has a crucial impact on estimates of the effectiveness and cost-effectiveness of new oncology treatments. Rank preserving structural failure time models (RPSFTM) and two-stage estimation (TSE) methods estimate ‘counterfactual’ (i.e. had there been no switching) survival times and incorporate re-censoring to guard against informative censoring in the counterfactual dataset. However, re-censoring causes a loss of longer-term survival information, which is problematic when estimates of long-term survival effects are required, as is often the case for health technology assessment decision making. We present a simulation study designed to investigate applications of the RPSFTM and TSE with and without re-censoring, to determine whether re-censoring should always be recommended within adjustment analyses. We investigate a context where switching is from the control group onto the experimental treatment, in scenarios with varying switch proportions, treatment effect sizes and time-dependencies, disease severity and switcher prognosis. Methods were assessed according to their estimation of control group restricted mean survival (that would be observed in the absence of switching) at the end of the simulated trial follow-up. We found that RPSFTM and TSE analyses which incorporated re-censoring usually produced negative bias (i.e. under-estimating control group restricted mean survival and therefore over-estimating the treatment effect). RPSFTM and TSE analyses that did not incorporate re-censoring consistently produced positive bias (i.e. under-estimating the treatment effect) which was often smaller in magnitude than the bias associated with the re-censored analyses. We believe that analyses should be conducted with and without re-censoring, as this may provide decision makers with useful information on where the true treatment effect is likely to lie. Analyses that incorporate re-censoring should not always represent the default approach when the objective is to estimate long-term survival times and treatment effects on long-term survival.
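    The counterfactual construction and re-censoring rule described above can be sketched in a few lines. This is a minimal illustration for a single control-arm switcher, assuming a common acceleration parameter psi and an administrative censoring time C, with the standard RPSFTM re-censoring time D*(psi) = min(C, C·exp(psi)); it is not code from the study, and the function and argument names are illustrative.

```python
import math

def counterfactual_time(t_off, t_on, psi, c, recensor=True):
    """RPSFTM counterfactual survival time for one control-arm switcher.

    t_off: time spent off the experimental treatment
    t_on:  time spent on the experimental treatment after switching
    psi:   acceleration parameter (exp(psi) < 1 shrinks treated time)
    c:     administrative censoring time
    Returns (counterfactual time, event_observed).
    """
    u = t_off + math.exp(psi) * t_on
    if not recensor:
        return u, True
    # Re-censor at the minimum possible censoring time on the
    # counterfactual scale, D*(psi) = min(C, C * exp(psi)), so that
    # censoring does not depend on the switching pattern.
    d_star = min(c, c * math.exp(psi))
    if u >= d_star:
        return d_star, False
    return u, True
```

    The re-censored branch is what discards long-term survival information: whenever exp(psi) < 1, every switcher's follow-up is truncated at C·exp(psi), which is the loss the abstract describes.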

    Tactile Interactions with a Humanoid Robot: Novel Play Scenario Implementations with Children with Autism

    Acknowledgments: This work has been partially supported by the European Commission under contract number FP7-231500-ROBOSKIN. Open Access: This article is distributed under the terms of the Creative Commons Attribution License, which permits any use, distribution, and reproduction in any medium, provided the original author(s) and the source are credited. The work presented in this paper was part of our investigation in the ROBOSKIN project. The project has developed new robot capabilities based on the tactile feedback provided by novel robotic skin, with the aim of providing cognitive mechanisms to improve human-robot interaction capabilities. This article presents two novel tactile play scenarios developed for robot-assisted play for children with autism. The play scenarios were developed against specific educational and therapeutic objectives that were discussed with teachers and therapists. These objectives were classified with reference to the ICF-CY, the International Classification of Functioning – version for Children and Youth. The article presents a detailed description of the play scenarios, and case study examples of their implementation in HRI studies with children with autism and the humanoid robot KASPAR.

    Assessing methods for dealing with treatment switching in clinical trials: A follow-up simulation study

    When patients randomised to the control group of a randomised controlled trial are allowed to switch onto the experimental treatment, intention-to-treat analyses of the treatment effect are confounded because the separation of randomised groups is lost. Previous research has investigated statistical methods that aim to estimate the treatment effect that would have been observed had this treatment switching not occurred, and has demonstrated their performance in a limited set of scenarios. Here, we investigate these methods in a new range of realistic scenarios, allowing conclusions to be made based upon a broader evidence base. We simulated randomised controlled trials incorporating prognosis-related treatment switching and investigated the impact of sample size, reduced switching proportions, disease severity, and alternative data-generating models on the performance of adjustment methods, assessed through a comparison of bias, mean squared error, and coverage, related to the estimation of true restricted mean survival in the absence of switching in the control group. Rank-preserving structural failure time models, inverse probability of censoring weights, and two-stage methods consistently produced less bias than the intention-to-treat analysis. The switching proportion was confirmed to be a key determinant of bias; sample size and censoring proportion were relatively less important. It is critical to determine the size of the treatment effect in terms of an acceleration factor (rather than a hazard ratio) to provide information on the likely bias associated with rank-preserving structural failure time model adjustments. In general, inverse probability of censoring weight methods are more volatile than other adjustment methods.
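    Restricted mean survival, the estimand against which the methods above were assessed, is the area under the survival curve up to a horizon tau. A minimal pure-Python sketch via the Kaplan-Meier estimator follows (illustrative only, not the authors' code; a real analysis would use a dedicated survival package, and ties between events and censorings are handled in input order here):

```python
def km_rmst(times, events, tau):
    """Restricted mean survival time: area under the Kaplan-Meier
    curve up to tau.

    times:  observed follow-up times
    events: event indicators (1 = event observed, 0 = censored)
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0       # current survival probability
    rmst = 0.0    # accumulated area under the step function
    prev_t = 0.0
    for t, d in data:
        t_clip = min(t, tau)
        rmst += s * (t_clip - prev_t)   # area before this step
        prev_t = t_clip
        if t > tau:
            break
        if d:
            s *= (at_risk - 1) / at_risk  # Kaplan-Meier step
        at_risk -= 1
    rmst += s * max(0.0, tau - prev_t)    # flat tail out to tau
    return rmst
```

    With all three subjects failing at times 1, 2 and 3 and tau = 3, the curve steps down by thirds and the area is exactly 2.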

    Integrated multiple mediation analysis: A robustness–specificity trade-off in causal structure

    Recent methodological developments in causal mediation analysis have addressed several issues regarding multiple mediators. However, these developed methods differ in their definitions of causal parameters, assumptions for identification, and interpretations of causal effects, making it unclear which method ought to be selected when investigating a given causal effect. Thus, in this study, we construct an integrated framework, which unifies all existing methodologies, as a standard for mediation analysis with multiple mediators. To clarify the relationship between existing methods, we propose four strategies for effect decomposition: two-way, partially forward, partially backward, and complete decompositions. This study reveals how the direct and indirect effects of each strategy are explicitly and correctly interpreted as path-specific effects under different causal mediation structures. In the integrated framework, we further verify the utility of the interventional analogues of direct and indirect effects, especially when natural direct and indirect effects cannot be identified or when cross-world exchangeability is invalid. Consequently, this study yields a robustness–specificity trade-off in the choice of strategies. Inverse probability weighting is considered for estimation. The four strategies are further applied to a simulation study for performance evaluation and for analyzing the Risk Evaluation of Viral Load Elevation and Associated Liver Disease/Cancer data set from Taiwan to investigate the causal effect of hepatitis C virus infection on mortality.
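    As a toy illustration of the inverse probability weighting mentioned above (the building block, not the multi-mediator estimators themselves), the sketch below computes a Hajek-style IPW estimate of E[Y(a)], with the propensity estimated empirically within strata of a discrete confounder. All names are illustrative and the setup assumes a single binary exposure and discrete confounder.

```python
from collections import defaultdict

def ipw_mean(records, a):
    """IPW (Hajek) estimate of E[Y(a)] from (c, a_i, y) records.

    Each subject observed with A = a is weighted by 1 / P(A = a | C = c),
    with the propensity estimated empirically within confounder strata.
    """
    # Empirical propensity P(A = a | C = c): stratum -> [n_total, n_with_a]
    counts = defaultdict(lambda: [0, 0])
    for c, a_i, _ in records:
        counts[c][0] += 1
        counts[c][1] += (a_i == a)
    num = den = 0.0
    for c, a_i, y in records:
        if a_i != a:
            continue
        w = counts[c][0] / counts[c][1]  # inverse propensity weight
        num += w * y
        den += w
    return num / den
```

    Mediation estimators extend this idea by additionally weighting on mediator models, which is where the four decomposition strategies diverge.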

    Semiparametric Multivariate Accelerated Failure Time Model with Generalized Estimating Equations

    The semiparametric accelerated failure time model is not as widely used as the Cox relative risk model, mainly due to computational difficulties. Recent developments in least squares estimation and induced smoothing estimating equations provide promising tools to make accelerated failure time models more attractive in practice. For semiparametric multivariate accelerated failure time models, we propose a generalized estimating equation approach to account for the multivariate dependence through working correlation structures. The marginal error distributions can be either identical, as in sequential event settings, or different, as in parallel event settings. Some regression coefficients can be shared across margins as needed. The initial estimator is a rank-based estimator with Gehan's weight, but obtained from an induced smoothing approach for computational ease. The resulting estimator is consistent and asymptotically normal, with a variance estimated through a multiplier resampling method. In a simulation study, our estimator was up to three times as efficient as the initial estimator, especially with stronger multivariate dependence and a heavier censoring percentage. Two real examples demonstrate the utility of the proposed method.
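    The rank-based starting point can be made concrete. Below is a sketch of the (non-smooth) Gehan-weighted estimating function for a univariate AFT model with one covariate; the induced-smoothing estimator in the paper replaces the indicator with a smooth surrogate, which this sketch deliberately does not do. Names are illustrative, not from the paper.

```python
import math

def gehan_score(beta, times, events, x):
    """Gehan-weighted rank estimating function for a one-covariate
    AFT model:

        U(beta) = sum_{i,j} delta_i * (x_i - x_j) * 1{e_i <= e_j},

    with residuals e_i = log(t_i) - beta * x_i and event indicator
    delta_i.  A root of U (in a step-function sense) is the Gehan
    rank estimator.
    """
    e = [math.log(t) - beta * xi for t, xi in zip(times, x)]
    u = 0.0
    n = len(times)
    for i in range(n):
        if not events[i]:
            continue  # only uncensored subjects contribute as index i
        for j in range(n):
            if e[i] <= e[j]:
                u += x[i] - x[j]
    return u
```

    Because U(beta) is a step function of beta, it is not differentiable, which is exactly the computational difficulty the induced-smoothing approach addresses.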

    Livestock trade networks for guiding animal health surveillance

    BACKGROUND: Trade in live animals can contribute to the introduction of exotic diseases and the maintenance and spread of endemic diseases. Annually, millions of animals are moved across Europe for the purposes of breeding, fattening and slaughter. Data on the number of animals moved in 2011 were obtained from the Directorate General Sanco (DG Sanco). These were converted to livestock units to enable direct comparison across species, and their movements were mapped and used to calculate the indegrees and outdegrees of 27 European countries and the density and transitivity of movements within Europe. This provided the opportunity to discuss surveillance of European livestock movement, taking into account stopping points en route. RESULTS: High density and transitivity of movement for registered equines, breeding and fattening cattle, breeding poultry and pigs for breeding, fattening and slaughter indicate that hazards have the potential to spread quickly within these populations. This is of concern to highly connected countries, particularly those where imported animals constitute a large proportion of their national livestock populations and which have a high indegree. The transport of poultry (older than 72 hours) and unweaned animals would require more rest breaks than the movement of weaned animals, which may provide more opportunities for disease transmission. Transitivity is greatest for animals transported for breeding purposes, with cattle, pigs and poultry having values of over 50%. CONCLUSIONS: This paper demonstrated that some species (pigs and poultry) are traded much more frequently and at a larger scale than species such as goats. Some countries are more vulnerable than others due to importing animals from many countries, having imported animals requiring rest breaks and importing large proportions of their national herd or flock. Such knowledge about the vulnerability of different livestock systems related to trade movements can be used to inform the design of animal health surveillance systems to facilitate the trade in animals between European member states. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1186/s12917-015-0354-4) contains supplementary material, which is available to authorized users.
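    The degree, density and transitivity measures used above are straightforward to compute from an edge list of country-to-country movements. A minimal unweighted sketch follows (the study weighted movements by livestock units, which this ignores; function and variable names are illustrative):

```python
from collections import defaultdict
from itertools import permutations

def trade_network_metrics(edges):
    """In/out-degree, density and transitivity of a directed network.

    edges: iterable of (exporter, importer) pairs; duplicates ignored.
    """
    nodes, edge_set = set(), set()
    indeg, outdeg = defaultdict(int), defaultdict(int)
    for u, v in edges:
        nodes.update((u, v))
        if (u, v) not in edge_set:
            edge_set.add((u, v))
            outdeg[u] += 1   # exports to one more country
            indeg[v] += 1    # imports from one more country
    n = len(nodes)
    density = len(edge_set) / (n * (n - 1)) if n > 1 else 0.0
    # Transitivity: fraction of two-step paths u->v->w (u, v, w distinct)
    # that are closed by a direct edge u->w.
    paths = closed = 0
    for u, v, w in permutations(nodes, 3):
        if (u, v) in edge_set and (v, w) in edge_set:
            paths += 1
            closed += (u, w) in edge_set
    transitivity = closed / paths if paths else 0.0
    return {"indegree": dict(indeg), "outdegree": dict(outdeg),
            "density": density, "transitivity": transitivity}
```

    A country's vulnerability in the sense above corresponds to a high indegree; high transitivity means trading partners also trade with each other, so a hazard can circulate within a tightly knit cluster.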

    Spontaneous symmetry breaking in a quenched ferromagnetic spinor Bose condensate

    A central goal in condensed matter and modern atomic physics is the exploration of many-body quantum phases and the universal characteristics of quantum phase transitions, in so far as they differ from those established for thermal phase transitions. Compared with condensed-matter systems, atomic gases are more precisely constructed and also provide the unique opportunity to explore quantum dynamics far from equilibrium. Here we identify a second-order quantum phase transition in a gaseous spinor Bose-Einstein condensate, a quantum fluid in which superfluidity and magnetism, both associated with symmetry breaking, are simultaneously realized. ^{87}Rb spinor condensates were rapidly quenched across this transition to a ferromagnetic state and probed using in-situ magnetization imaging to observe spontaneous symmetry breaking through the formation of spin textures, ferromagnetic domains and domain walls. The observation of topological defects produced by this symmetry breaking, identified as polar-core spin vortices containing non-zero spin current but no net mass current, represents the first phase-sensitive in-situ detection of vortices in a gaseous superfluid. Comment: 6 pages, 4 figures

    The dependence of dijet production on photon virtuality in ep collisions at HERA

    The dependence of dijet production on the virtuality of the exchanged photon, Q^2, has been studied by measuring dijet cross sections in the range 0 < Q^2 < 2000 GeV^2 with the ZEUS detector at HERA, using an integrated luminosity of 38.6 pb^-1. Dijet cross sections were measured for jets with transverse energy E_T^jet > 7.5 and 6.5 GeV and pseudorapidities in the photon-proton centre-of-mass frame in the range -3 < eta^jet < 0. The variable xg^obs, a measure of the photon momentum entering the hard process, was used to enhance the sensitivity of the measurement to the photon structure. The Q^2 dependence of the ratio of low- to high-xg^obs events was measured. Next-to-leading-order QCD predictions were found to generally underestimate the low-xg^obs contribution relative to that at high xg^obs. Monte Carlo models based on leading-logarithmic parton showers, using a partonic structure for the photon which falls smoothly with increasing Q^2, provide a qualitative description of the data. Comment: 35 pages, 6 eps figures, submitted to Eur. Phys. J.
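    The variable xg^obs has a standard operational definition in dijet photoproduction: the fraction of the photon's momentum entering the hard scatter, estimated from the two highest-E_T jets as x_gamma^obs = (sum over the two jets of E_T^jet e^{-eta^jet}) / (2 y E_e). A small sketch of that formula follows, assuming the HERA lepton beam energy of 27.5 GeV; this is the textbook definition, not ZEUS analysis code.

```python
import math

def x_gamma_obs(jets, y, e_lepton=27.5):
    """Observable photon momentum fraction for dijet events:

        x_gamma^obs = sum_jets E_T^jet * exp(-eta^jet) / (2 * y * E_e)

    jets:     list of (E_T in GeV, eta) for the two highest-E_T jets
    y:        inelasticity (photon energy fraction of the lepton)
    e_lepton: lepton beam energy in GeV (27.5 at HERA)
    """
    numerator = sum(et * math.exp(-eta) for et, eta in jets)
    return numerator / (2.0 * y * e_lepton)
```

    Direct-photon events cluster near x_gamma^obs = 1, while resolved-photon events, the low-xg^obs contribution discussed above, populate smaller values.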

    A frequentist framework of inductive reasoning

    Reacting against the limitation of statistics to decision procedures, R. A. Fisher proposed for inductive reasoning the use of the fiducial distribution, a parameter-space distribution of epistemological probability transferred directly from limiting relative frequencies rather than computed according to the Bayes update rule. The proposal is developed as follows using the confidence measure of a scalar parameter of interest. (With the restriction to a one-dimensional parameter space, a confidence measure is essentially a fiducial probability distribution free of complications involving ancillary statistics.) A betting game establishes a sense in which confidence measures are the only reliable inferential probability distributions. The equality between the probabilities encoded in a confidence measure and the coverage rates of the corresponding confidence intervals ensures that the measure's rule for assigning confidence levels to hypotheses is uniquely minimax in the game. Although a confidence measure can be computed without any prior distribution, previous knowledge can be incorporated into confidence-based reasoning. To adjust a p-value or confidence interval for prior information, the confidence measure from the observed data can be combined with one or more independent confidence measures representing previous agent opinion. (The former confidence measure may correspond to a posterior distribution with frequentist matching of coverage probabilities.) The representation of subjective knowledge in terms of confidence measures rather than prior probability distributions preserves approximate frequentist validity. Comment: major revision
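    For the textbook case of a normal mean with known standard deviation, the confidence measure coincides with the classical fiducial distribution Phi(sqrt(n) * (theta - xbar) / sigma), whose quantiles reproduce the usual one-sided confidence limits, illustrating the coverage-matching property described above. A minimal sketch of that special case follows (illustrative, not code from the paper):

```python
import math

def confidence_measure_normal_mean(xbar, sigma, n):
    """Confidence measure (fiducial distribution) for a normal mean
    with known sigma:

        C(theta) = Phi(sqrt(n) * (theta - xbar) / sigma)

    Returns the CDF as a function of theta.  C(theta0) is the
    confidence assigned to the hypothesis {mean <= theta0}, and the
    quantiles of C are one-sided confidence limits.
    """
    se = sigma / math.sqrt(n)
    def cdf(theta):
        # standard normal CDF via the error function
        return 0.5 * (1.0 + math.erf((theta - xbar) / (se * math.sqrt(2.0))))
    return cdf
```

    For example, the measure assigns confidence 0.5 to the hypothesis that the mean lies below the sample mean, and its 0.975 quantile is the upper limit of the usual 95% interval.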