
    Stochastic Variational Inference

    We develop stochastic variational inference, a scalable algorithm for approximating posterior distributions. We develop this technique for a large class of probabilistic models and we demonstrate it with two probabilistic topic models, latent Dirichlet allocation and the hierarchical Dirichlet process topic model. Using stochastic variational inference, we analyze several large collections of documents: 300K articles from Nature, 1.8M articles from The New York Times, and 3.8M articles from Wikipedia. Stochastic inference can easily handle data sets of this size and outperforms traditional variational inference, which can only handle a smaller subset. (We also show that the Bayesian nonparametric topic model outperforms its parametric counterpart.) Stochastic variational inference lets us apply complex Bayesian models to massive data sets.
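    The core update can be sketched compactly. The toy below is an illustrative assumption, not the paper's topic-model derivation: it runs stochastic variational inference for the mean of a Gaussian with known observation precision, moving the variational posterior along noisy natural gradients computed from one sampled data point per step.

```python
import random

def svi_gaussian_mean(data, prior_mean=0.0, prior_prec=1.0,
                      noise_prec=1.0, n_steps=2000, seed=0):
    """Stochastic variational inference for the mean of a Gaussian with
    known observation precision. The variational posterior q(mu) is
    Gaussian; its natural parameters follow noisy natural gradients
    estimated from one sampled data point per step, with Robbins-Monro
    step sizes rho_t = (t + 10) ** -0.6."""
    rng = random.Random(seed)
    N = len(data)
    # Natural parameters of q(mu): eta1 = precision * mean, eta2 = precision.
    eta1, eta2 = prior_prec * prior_mean, prior_prec
    for t in range(1, n_steps + 1):
        x = data[rng.randrange(N)]                # sample one observation
        # Intermediate parameters: pretend x was observed N times.
        hat1 = prior_prec * prior_mean + N * noise_prec * x
        hat2 = prior_prec + N * noise_prec
        rho = (t + 10.0) ** -0.6                  # decaying step size
        eta1 = (1.0 - rho) * eta1 + rho * hat1    # noisy natural-gradient step
        eta2 = (1.0 - rho) * eta2 + rho * hat2
    return eta1 / eta2, eta2                      # posterior mean, precision

# Synthetic data centred near 2.0.
data = [2.0 + 0.1 * ((i % 7) - 3) for i in range(100)]
post_mean, post_prec = svi_gaussian_mean(data)
```

    Because each step touches only a single observation, the cost per update is independent of the data set size, which is what makes the method scale.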

    Ultrasonic locating devices for central venous cannulation: meta-analysis

    OBJECTIVES: To assess the evidence for the clinical effectiveness of ultrasound guided central venous cannulation. DATA SOURCES: 15 electronic bibliographic databases, covering biomedical, science, social science, health economics, and grey literature. DESIGN: Systematic review and meta-analysis of randomised controlled trials. POPULATIONS: Patients scheduled for central venous access. INTERVENTION REVIEWED: Guidance using real time two dimensional ultrasonography or Doppler needles and probes compared with the anatomical landmark method of cannulation. DATA EXTRACTION: Risk of failed catheter placement (primary outcome), risk of complications from placement, risk of failure on first attempt at placement, number of attempts to successful catheterisation, and time (seconds) to successful catheterisation. DATA SYNTHESIS: 18 trials (1646 participants) were identified. Compared with the landmark method, real time two dimensional ultrasound guidance for cannulating the internal jugular vein in adults was associated with a significantly lower failure rate both overall (relative risk 0.14, 95% confidence interval 0.06 to 0.33) and on the first attempt (0.59, 0.39 to 0.88). Limited evidence favoured two dimensional ultrasound guidance for subclavian vein and femoral vein procedures in adults (0.14, 0.04 to 0.57 and 0.29, 0.07 to 1.21, respectively). Three studies in infants confirmed a higher success rate with two dimensional ultrasonography for internal jugular procedures (0.15, 0.03 to 0.64). Doppler guided cannulation of the internal jugular vein in adults was more successful than the landmark method (0.39, 0.17 to 0.92), but the landmark method was more successful for subclavian vein procedures (1.48, 1.03 to 2.14). No significant difference was found between these techniques for cannulation of the internal jugular vein in infants. 
An indirect comparison of relative risks suggested that two dimensional ultrasonography would be more successful than Doppler guidance for subclavian vein procedures in adults (0.09, 0.02 to 0.38). CONCLUSIONS: Evidence supports the use of two dimensional ultrasonography for central venous cannulation.
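    The relative risks and confidence intervals quoted above come from standard 2×2-table methods; a minimal sketch with made-up counts (not the review's data) is:

```python
import math

def relative_risk_ci(events_a, total_a, events_b, total_b, z=1.96):
    """Relative risk of group A versus group B with a 95% confidence
    interval computed on the log scale:
    SE(log RR) = sqrt(1/a - 1/n_a + 1/b - 1/n_b)."""
    rr = (events_a / total_a) / (events_b / total_b)
    se = math.sqrt(1 / events_a - 1 / total_a + 1 / events_b - 1 / total_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 4/200 failed placements with ultrasound guidance
# versus 28/200 with the landmark method.
rr, lo, hi = relative_risk_ci(4, 200, 28, 200)
```

    A relative risk below 1 favours the first group, and an interval that excludes 1 indicates a statistically significant difference.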

    Systematic review of the evidence on housing interventions for ‘housing-vulnerable’ adults and its relationship to wellbeing

    Access to safe, good quality, affordable housing is essential to wellbeing, and housing-related factors can have an important influence on neighbourliness and sense of community belonging. A recent scoping review on housing and wellbeing identified a lack of review-level evidence around the impact of housing interventions on the wellbeing of people who are vulnerable to discrimination or exclusion in relation to housing (Preston et al., 2016). This systematic review was commissioned to address that gap. We synthesise and consider the quality of evidence on how housing interventions can contribute to improving the lives of adults who are vulnerable in relation to the security of their housing tenure (‘housing-vulnerable’ adults).

    A data analytic approach to automatic fault diagnosis and prognosis for distribution automation

    Distribution Automation (DA) is deployed to reduce outages and to rapidly reconnect customers following network faults. Recent developments in DA equipment have enabled the logging of load and fault event data, referred to as ‘pick-up activity’. This pick-up activity provides a picture of the underlying circuit activity occurring between successive DA operations over a period of time and has the potential to be accessed remotely for off-line or on-line analysis. The application of data analytics and automated analysis of this data supports reactive fault management and post-fault investigation into anomalous network behaviour. It also supports predictive capabilities that identify when potential network faults are evolving and offers the opportunity to take action in advance in order to mitigate any outages. This paper details the design of a novel decision support system to achieve fault diagnosis and prognosis for DA schemes. It combines detailed data from a specific DA device with rule-based, data mining and clustering techniques to deliver the diagnostic and prognostic functions. These are applied to 11 kV distribution network data captured from Pole Mounted Auto-Reclosers (PMARs) as provided by a leading UK network operator. This novel automated analysis system diagnoses the nature of a circuit’s previous fault activity, identifies underlying anomalous circuit activity, and highlights indications of problematic events gradually evolving into a full-scale circuit fault. The novel contributions include the tackling of ‘semi-permanent faults’ and a re-usable methodology for applying data analytics to any DA device data set in order to provide diagnostic decisions and mitigate potential fault scenarios.
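    As a flavour of the clustering stage, the sketch below runs a generic k-means on made-up pick-up features (peak magnitude, duration). It is a stand-in for illustration, not the authors' actual pipeline.

```python
def kmeans(points, k=2, n_iter=20):
    """Minimal Lloyd's k-means over feature tuples such as
    (peak current in per-unit, event duration in seconds).
    Deterministic initialisation: centres spread across the input."""
    centers = [points[i * (len(points) - 1) // (k - 1)] for i in range(k)]
    for _ in range(n_iter):
        groups = [[] for _ in range(k)]
        for p in points:
            # assign each event to its nearest centre (squared distance)
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            groups[j].append(p)
        # move each centre to the mean of its assigned events
        centers = [tuple(sum(v) / len(g) for v in zip(*g)) if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers

# Synthetic pick-up activity: routine load events vs fault-like transients.
routine = [(1.0 + 0.05 * i, 0.20 + 0.01 * i) for i in range(8)]
transient = [(8.0 + 0.05 * i, 0.02 + 0.01 * i) for i in range(8)]
centers = sorted(kmeans(routine + transient, k=2), key=lambda c: c[0])
```

    Events falling far from every learned centre can then be flagged as anomalous circuit activity for further diagnosis.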

    Electrical Pressurization Concept for the Orion MPCV European Service Module Propulsion System

    The paper presents the design of the pressurization system of the European Service Module (ESM) of the Orion Multi-Purpose Crew Vehicle (MPCV). As part of the propulsion subsystem, an electrical pressurization concept is implemented to condition propellants according to the engine needs via a bang-bang regulation system. Separate pressurization for the oxidizer and the fuel tank permits mixture ratio adjustments and prevents vapor mixing of the two hypergolic propellants during nominal operation. In case of loss of pressurization capability on a single side, the system can be converted into a common pressurization system. The regulation concept evaluates a set of tank pressure sensors and activates the regulation valves accordingly, based on a single-failure-tolerant weighting of three pressure signals. While regulation is performed at ESM level, commanding of regulation parameters as well as failure detection, isolation and recovery is performed from within the Crew Module, developed by Lockheed Martin Space Systems Company. The overall design and development maturity presented is post Preliminary Design Review (PDR) and reflects the current status of the MPCV ESM pressurization system.
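    A minimal sketch of the two mechanisms together, median voting over three pressure signals and hysteretic (bang-bang) valve commanding, with illustrative thresholds rather than flight values:

```python
def vote(p1, p2, p3):
    """Single-failure-tolerant reading: the median of three pressure
    signals discards one stuck-high or stuck-low sensor."""
    return sorted([p1, p2, p3])[1]

def bang_bang(pressure, valve_open, low=18.0, high=19.0):
    """Hysteretic regulation: open the pressurant valve below the lower
    threshold, close it above the upper one, otherwise hold state.
    Thresholds (in bar) are illustrative assumptions."""
    if pressure < low:
        return True
    if pressure > high:
        return False
    return valve_open

# Simulate tank pressure falling as propellant is drawn and rising while
# the valve admits pressurant; one sensor has failed low (stuck at 0).
p, valve, trace = 18.5, False, []
for _ in range(200):
    reading = vote(p, p + 0.02, 0.0)      # failed sensor is outvoted
    valve = bang_bang(reading, valve)
    p += 0.05 if valve else -0.03
    trace.append(p)
```

    The simulated pressure stays within one step of the deadband even though one of the three sensors has failed low.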

    Location Dependent Dirichlet Processes

    Dirichlet processes (DP) are widely applied in Bayesian nonparametric modeling. However, in their basic form they do not directly integrate dependency information among data arising from space and time. In this paper, we propose location dependent Dirichlet processes (LDDP), which incorporate nonparametric Gaussian processes in the DP modeling framework to model such dependencies. We develop the LDDP in the context of mixture modeling and derive a mean-field variational inference algorithm for the resulting mixture model. The effectiveness of the proposed modeling framework is shown on an image segmentation task.

    Anomalous relaxations and chemical trends at III-V nitride non-polar surfaces

    Relaxations at nonpolar surfaces of III-V compounds result from a competition between dehybridization and charge transfer. First-principles calculations for the (110) and (101̄0) faces of zincblende and wurtzite AlN, GaN and InN reveal an anomalous behavior as compared with ordinary III-V semiconductors. Additional calculations for GaAs and ZnO suggest close analogies with the latter. We interpret our results in terms of the larger ionicity (charge asymmetry) and bonding strength (cohesive energy) in the nitrides with respect to other III-V compounds, both essentially due to the strong valence potential and absence of p core states in the lighter anion. The same interpretation applies to Zn II-VI compounds.

    Clinical effectiveness and cost-effectiveness of pegvisomant for the treatment of acromegaly: a systematic review and economic evaluation

    Background: Acromegaly, an orphan disease usually caused by a benign pituitary tumour, is characterised by hyper-secretion of growth hormone (GH) and insulin-like growth factor I (IGF-1). It is associated with reduced life expectancy, cardiovascular problems, a variety of insidiously progressing detrimental symptoms and metabolic malfunction. Treatments include surgery, radiotherapy and pharmacotherapy. Pegvisomant (PEG) is a genetically engineered GH analogue licensed as a third- or fourth-line option when other treatments have failed to normalise IGF-1 levels. Methods: Evidence about effectiveness and cost-effectiveness of PEG was systematically reviewed. Data were extracted from published studies and used for a narrative synthesis of evidence. A decision analytical economic model was identified and modified to assess the cost-effectiveness of PEG. Results: One RCT and 17 non-randomised studies were reviewed for effectiveness. PEG substantially reduced and rapidly normalised IGF-1 levels in the majority of patients, approximately doubled GH levels, and improved some of the signs and symptoms of the disease. Tumour size was unaffected at least in the short term. PEG had a generally safe adverse event profile but a few patients were withdrawn from treatment because of raised liver enzymes. An economic model was identified and adapted to estimate the lower limit for the cost-effectiveness of PEG treatment versus standard care. Over a 20-year time horizon the incremental cost-effectiveness ratio was £81,000/QALY and £212,000/LYG. To reduce this to £30,000/QALY would require a reduction in drug cost by about one third. Conclusion: PEG is highly effective for improving patients' IGF-1 level. Signs and symptoms of disease improve but evidence is lacking about long term effects on improved signs and symptoms of disease, quality of life, patient compliance and safety. 
Economic evaluation indicated that if current standards (UK) for determining cost-effectiveness of therapies were to be applied to PEG, it would be considered not to represent good value for money.
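    The headline figures follow from the standard ICER definition; the toy calculation below uses hypothetical totals chosen only to land near the quoted £81,000/QALY, not the model's actual inputs.

```python
def icer(cost_new, qaly_new, cost_std, qaly_std):
    """Incremental cost-effectiveness ratio: extra cost per extra
    quality-adjusted life year (QALY) of the new therapy over
    standard care."""
    return (cost_new - cost_std) / (qaly_new - qaly_std)

# Hypothetical 20-year discounted totals (illustrative, not the model's):
ratio = icer(cost_new=250_000, qaly_new=3.0, cost_std=88_000, qaly_std=1.0)

# For a £30,000/QALY threshold, the incremental cost would have to fall
# to 30,000 * (3.0 - 1.0) = £60,000 over the same QALY gain.
target_incremental_cost = 30_000 * (3.0 - 1.0)
```

    Dividing incremental cost by incremental effect, rather than comparing average cost-effectiveness, is what makes the ratio sensitive to small QALY gains.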

    Unsupervised Bayesian linear unmixing of gene expression microarrays

    Background: This paper introduces a new constrained model and the corresponding algorithm, called unsupervised Bayesian linear unmixing (uBLU), to identify biological signatures from high-dimensional assays like gene expression microarrays. The basis for uBLU is a Bayesian model for the data samples, which are represented as an additive mixture of random positive gene signatures, called factors, with random positive mixing coefficients, called factor scores, that specify the relative contribution of each signature to a specific sample. A distinguishing feature of the proposed method is that uBLU constrains the factor loadings to be non-negative and the factor scores to be probability distributions over the factors. Furthermore, it also provides estimates of the number of factors. A Gibbs sampling strategy is adopted here to generate random samples according to the posterior distribution of the factors, factor scores, and number of factors. These samples are then used to estimate all the unknown parameters. Results: Firstly, the proposed uBLU method is applied to several simulated datasets with known ground truth and compared with previous factor decomposition methods, such as principal component analysis (PCA), non-negative matrix factorization (NMF), Bayesian factor regression modeling (BFRM), and the gradient-based algorithm for general matrix factorization (GB-GMF). Secondly, we illustrate the application of uBLU on a real time-evolving gene expression dataset from a recent viral challenge study in which individuals have been inoculated with influenza A/H3N2/Wisconsin. We show that the uBLU method significantly outperforms the other methods on the simulated and real data sets considered here. Conclusions: The results obtained on synthetic and real data illustrate the accuracy of the proposed uBLU method when compared to other factor decomposition methods from the literature (PCA, NMF, BFRM, and GB-GMF). 
The uBLU method identifies an inflammatory component closely associated with clinical symptom scores collected during the study. Using a constrained model allows recovery of all the inflammatory genes in a single factor.
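    To illustrate the simplex constraint on the factor scores, the sketch below estimates mixing scores for one sample given known signatures, via multiplicative updates plus renormalization. It is a deterministic toy of the constraints only; uBLU itself is Bayesian, uses Gibbs sampling, and also estimates the signatures and their number.

```python
def simplex_scores(v, W, n_iter=2000):
    """Estimate non-negative mixing scores on the probability simplex
    for one sample v, given known signatures (columns of W), using
    multiplicative least-squares updates followed by renormalization
    so the scores always sum to one."""
    k = len(W[0])
    n = len(v)
    h = [1.0 / k] * k                      # start at the simplex centre
    for _ in range(n_iter):
        Wh = [sum(W[i][j] * h[j] for j in range(k)) for i in range(n)]
        for j in range(k):
            num = sum(W[i][j] * v[i] for i in range(n))
            den = sum(W[i][j] * Wh[i] for i in range(n)) or 1e-12
            h[j] *= num / den              # stays non-negative
        s = sum(h)
        h = [x / s for x in h]             # project back onto the simplex
    return h

# Two hypothetical gene signatures (columns) and a sample mixing them 70/30.
W = [[1.0, 0.0], [0.5, 0.5], [0.0, 1.0], [0.2, 0.8]]
v = [0.7 * a + 0.3 * b for a, b in W]
h = simplex_scores(v, W)
```

    Because the scores live on the simplex, each sample is explained as a convex combination of signatures, which is what lets a single factor capture a coherent gene set.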