
    Next-generation phenotyping using computer vision algorithms in rare genomic neurodevelopmental disorders

    Purpose The interpretation of genetic variants after genome-wide analysis is complex in heterogeneous disorders such as intellectual disability (ID). We investigate whether algorithms can be used to detect whether a facial gestalt is present for three novel ID syndromes and whether these techniques can help interpret variants of uncertain significance. Methods Facial features were extracted from photos of ID patients harboring a pathogenic variant in three novel ID genes (PACS1, PPM1D, and PHIP), using algorithms that model human facial dysmorphism and facial recognition. The resulting features were combined into a hybrid model to compare the three cohorts against a background ID population. Results We validated our model using images from 71 individuals with Koolen–de Vries syndrome, and then show that facial gestalts are present for individuals with a pathogenic variant in PACS1 (p = 8 × 10⁻⁴), PPM1D (p = 4.65 × 10⁻²), and PHIP (p = 6.3 × 10⁻³). Moreover, two individuals with a de novo missense variant of uncertain significance in PHIP have significant similarity to the expected facial phenotype of PHIP patients (p < 1.52 × 10⁻²). Conclusion Our results show that analysis of facial photos can be used to detect previously unknown facial gestalts for novel ID syndromes, which will facilitate both clinical and molecular diagnosis of rare and novel syndromes.
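The cohort-versus-background comparison described in this abstract can be pictured as a permutation test on facial-embedding similarity. The sketch below is an illustrative reconstruction, not the paper's pipeline: the embedding vectors, the mean-pairwise-cosine statistic, and the cohort sizes are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_pairwise_similarity(embeddings):
    # Mean cosine similarity over all distinct pairs of face embeddings.
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = normed @ normed.T
    iu = np.triu_indices(len(embeddings), k=1)
    return sims[iu].mean()

def gestalt_p_value(cohort, background, n_perm=10_000):
    # Permutation test: is the syndrome cohort more internally similar
    # than random same-size subsets drawn from cohort + background?
    observed = mean_pairwise_similarity(cohort)
    pool = np.vstack([cohort, background])
    k = len(cohort)
    exceed = 0
    for _ in range(n_perm):
        idx = rng.choice(len(pool), size=k, replace=False)
        if mean_pairwise_similarity(pool[idx]) >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)
```

A small cohort whose embeddings cluster tightly relative to the background population yields a small p-value, which is the sense in which a "facial gestalt is present".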

    Integrating Sequencing Technologies in Personal Genomics: Optimal Low Cost Reconstruction of Structural Variants

    The goal of human genome re-sequencing is obtaining an accurate assembly of an individual's genome. Recently, there has been great excitement in the development of many technologies for this (e.g. medium- and short-read sequencing from companies such as 454 and SOLiD, and high-density oligo-arrays from Affymetrix and NimbleGen), with even more expected to appear. The costs and sensitivities of these technologies differ considerably from each other. As an important goal of personal genomics is to reduce the cost of re-sequencing to an affordable point, it is worthwhile to consider optimally integrating technologies. Here, we build a simulation toolbox that will help us optimally combine different technologies for genome re-sequencing, especially in reconstructing large structural variants (SVs). SV reconstruction is considered the most challenging step in human genome re-sequencing. (It is sometimes even harder than de novo assembly of small genomes because of the duplications and repetitive sequences in the human genome.) To this end, we formulate canonical problems that are representative of issues in reconstruction and are of small enough scale to be computationally tractable and simulatable. Using semi-realistic simulations, we show how we can combine different technologies to optimally solve the assembly at low cost. With mappability maps, our simulations efficiently handle the inhomogeneous repeat-containing structure of the human genome and the computational complexity of practical assembly algorithms. They quantitatively show how combining different read lengths is more cost-effective than using one length, how an optimal mixed sequencing strategy for reconstructing large novel SVs usually also gives accurate detection of SNPs/indels, how paired-end reads can improve reconstruction efficiency, and how adding in arrays is more efficient than just sequencing for disentangling some complex SVs. Our strategy should facilitate the sequencing of human genomes at maximum accuracy and low cost.
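The core optimization idea, choosing the cheapest mix of technologies that still meets detection targets, can be sketched with a toy model. The cost and per-unit power numbers below are invented for illustration (the paper derives the real trade-offs from simulation), and the saturation model assumes independent misses across coverage units.

```python
# Hypothetical per-technology parameters: cost per unit of coverage and
# per-unit detection power for SVs and SNPs. These numbers are invented;
# the paper's actual values come from its simulation toolbox.
TECH = {
    "short":  {"cost": 1.0, "sv_power": 0.4, "snp_power": 0.9},
    "medium": {"cost": 3.0, "sv_power": 0.8, "snp_power": 0.7},
}

def cheapest_mix(target_sv=0.99, target_snp=0.99, max_units=40):
    # Brute-force the lowest-cost combination of coverage units that
    # meets both detection targets, assuming independent misses so that
    # combined power = 1 - prod over technologies of (1 - p) ** units.
    best = None
    for s in range(max_units + 1):
        for m in range(max_units + 1):
            sv = 1 - (1 - TECH["short"]["sv_power"]) ** s * (1 - TECH["medium"]["sv_power"]) ** m
            snp = 1 - (1 - TECH["short"]["snp_power"]) ** s * (1 - TECH["medium"]["snp_power"]) ** m
            if sv >= target_sv and snp >= target_snp:
                cost = s * TECH["short"]["cost"] + m * TECH["medium"]["cost"]
                if best is None or cost < best[0]:
                    best = (cost, s, m)
    return best  # (cost, short_units, medium_units)
```

Under these toy numbers the optimum is a genuine mixture rather than either pure strategy, mirroring the abstract's claim that combining read lengths is more cost-effective than using one length.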

    'Minimal Symptom Expression' in Patients With Acetylcholine Receptor Antibody-Positive Refractory Generalized Myasthenia Gravis Treated With Eculizumab

    The efficacy and tolerability of eculizumab were assessed in REGAIN, a 26-week, phase 3, randomized, double-blind, placebo-controlled study in anti-acetylcholine receptor antibody-positive (AChR+) refractory generalized myasthenia gravis (gMG), and its open-label extension.

    Post-intervention Status in Patients With Refractory Myasthenia Gravis Treated With Eculizumab During REGAIN and Its Open-Label Extension

    OBJECTIVE: To evaluate whether eculizumab helps patients with anti-acetylcholine receptor-positive (AChR+) refractory generalized myasthenia gravis (gMG) achieve the Myasthenia Gravis Foundation of America (MGFA) post-intervention status of minimal manifestations (MM), we assessed patients' status throughout REGAIN (Safety and Efficacy of Eculizumab in AChR+ Refractory Generalized Myasthenia Gravis) and its open-label extension. METHODS: Patients who completed the REGAIN randomized controlled trial and continued into the open-label extension were included in this tertiary endpoint analysis. Patients were assessed for the MGFA post-intervention status of improved, unchanged, worse, MM, and pharmacologic remission at defined time points during REGAIN and through week 130 of the open-label study. RESULTS: A total of 117 patients completed REGAIN and continued into the open-label study (eculizumab/eculizumab: 56; placebo/eculizumab: 61). At week 26 of REGAIN, more eculizumab-treated patients than placebo-treated patients achieved a status of improved (60.7% vs 41.7%) or MM (25.0% vs 13.3%; common OR: 2.3; 95% CI: 1.1-4.5). After 130 weeks of eculizumab treatment, 88.0% of patients achieved improved status and 57.3% of patients achieved MM status. The safety profile of eculizumab was consistent with its known profile and no new safety signals were detected. CONCLUSION: Eculizumab led to rapid and sustained achievement of MM in patients with AChR+ refractory gMG. These findings support the use of eculizumab in this previously difficult-to-treat patient population. CLINICALTRIALS.GOV IDENTIFIER: REGAIN, NCT01997229; REGAIN open-label extension, NCT02301624. CLASSIFICATION OF EVIDENCE: This study provides Class II evidence that, after 26 weeks of eculizumab treatment, 25.0% of adults with AChR+ refractory gMG achieved MM, compared with 13.3% who received placebo.
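The week-26 MM comparison (25.0% vs 13.3%) is summarized by an odds ratio with a 95% confidence interval. A minimal sketch of how a crude (unstratified) OR and Wald interval are computed from 2×2 counts; the counts in the test are illustrative, not the trial's, and the paper's "common OR" of 2.3 is a stratified estimate that differs slightly from this crude calculation.

```python
import math

def odds_ratio_ci(a, n1, c, n2, z=1.96):
    # Crude odds ratio and Wald 95% CI for responders a/n1 (treatment)
    # vs c/n2 (placebo). A stratified "common OR" (as reported in the
    # trial) would adjust for strata and differ slightly.
    b, d = n1 - a, n2 - c              # non-responders in each arm
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

With hypothetical counts of 15/60 vs 8/60 (matching the reported percentages), the crude OR comes out near 2.2, in the same ballpark as the reported common OR of 2.3.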

    Consistent improvement with eculizumab across muscle groups in myasthenia gravis


    Improving topological cluster reconstruction using calorimeter cell timing in ATLAS

    Clusters of topologically connected calorimeter cells around cells with large absolute signal-to-noise ratio (topo-clusters) are the basis for calorimeter signal reconstruction in the ATLAS experiment. Topological cell clustering has proven performant in LHC Runs 1 and 2. It is, however, susceptible to out-of-time pile-up of signals from soft collisions outside the 25 ns proton-bunch-crossing window associated with the event’s hard collision. To reduce this effect, a calorimeter-cell timing criterion was added to the signal-to-noise ratio requirement in the clustering algorithm. Multiple versions of this criterion were tested by reconstructing hadronic signals in simulated events and Run 2 ATLAS data. The preferred version is found to reduce the out-of-time pile-up jet multiplicity by ∼50% for jet pT ∼ 20 GeV and by ∼80% for jet pT ∼ 50 GeV, while not disrupting the reconstruction of hadronic signals of interest, and improving the jet energy resolution by up to 5% for 20 < pT < 30 GeV. Pile-up is also suppressed for other physics objects based on topo-clusters (electrons, photons, τ-leptons), reducing the overall event size on disk by about 6% in early Run 3 pile-up conditions. Offline reconstruction for Run 3 includes the timing requirement.
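The change described above can be illustrated schematically: a timing window is applied on top of the signal-to-noise seed requirement. This is a toy sketch, not the ATLAS implementation; the 4σ seed threshold matches standard topo-cluster seeding, but the timing-window value and the cell representation are assumptions for illustration.

```python
def select_seed_cells(cells, snr_threshold=4.0, time_window_ns=12.5):
    # Keep calorimeter cells as cluster seeds only if their absolute
    # signal-to-noise ratio passes the threshold AND their reconstructed
    # time is compatible with the event's hard collision, suppressing
    # out-of-time pile-up deposits from neighbouring bunch crossings.
    seeds = []
    for cell in cells:
        snr = abs(cell["energy"]) / cell["noise"]
        in_time = abs(cell["time_ns"]) <= time_window_ns
        if snr >= snr_threshold and in_time:
            seeds.append(cell)
    return seeds
```

A high-energy deposit arriving ~25 ns early or late (i.e. from a neighbouring bunch crossing) fails the timing window and never seeds a cluster, which is how the out-of-time pile-up jets are removed.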

    Software Performance of the ATLAS Track Reconstruction for LHC Run 3

    Charged particle reconstruction in the presence of many simultaneous proton–proton (pp) collisions in the LHC is a challenging task for the ATLAS experiment’s reconstruction software due to the combinatorial complexity. This paper describes the major changes made to adapt the software to reconstruct high-activity collisions with an average of 50 or more simultaneous pp interactions per bunch crossing (pile-up) promptly using the available computing resources. The performance of the key components of the track reconstruction chain and its dependence on pile-up are evaluated, and the improvement achieved compared to the previous software version is quantified. For events with an average of 60 pp collisions per bunch crossing, the updated track reconstruction is twice as fast as the previous version, without significant reduction in reconstruction efficiency and while reducing the rate of combinatorial fake tracks by more than a factor of two.

    Long-term safety and efficacy of eculizumab in generalized myasthenia gravis


    Measurement and interpretation of same-sign W boson pair production in association with two jets in pp collisions at √s = 13 TeV with the ATLAS detector

    This paper presents the measurement of fiducial and differential cross sections for both the inclusive and electroweak production of a same-sign W-boson pair in association with two jets (W±W±jj), using 139 fb⁻¹ of proton-proton collision data recorded at a centre-of-mass energy of √s = 13 TeV by the ATLAS detector at the Large Hadron Collider. The analysis is performed by selecting two same-charge leptons, electrons or muons, and at least two jets with large invariant mass and a large rapidity difference. The measured fiducial cross sections for electroweak and inclusive W±W±jj production are 2.92 ± 0.22 (stat.) ± 0.19 (syst.) fb and 3.38 ± 0.22 (stat.) ± 0.19 (syst.) fb, respectively, in agreement with Standard Model predictions. The measurements are used to constrain anomalous quartic gauge couplings by extracting 95% confidence level intervals on dimension-8 operators. A search for doubly charged Higgs bosons H±± that are produced in vector-boson fusion processes and decay into a same-sign W-boson pair is performed. The largest deviation from the Standard Model occurs for an H±± mass near 450 GeV, with a global significance of 2.5 standard deviations.

    Performance and calibration of quark/gluon-jet taggers using 140 fb⁻¹ of pp collisions at √s=13 TeV with the ATLAS detector

    The identification of jets originating from quarks and gluons, often referred to as quark/gluon tagging, plays an important role in various analyses performed at the Large Hadron Collider, as Standard Model measurements and searches for new particles decaying to quarks often rely on suppressing a large gluon-induced background. This paper describes the measurement of the efficiencies of quark/gluon taggers developed within the ATLAS Collaboration, using √s = 13 TeV proton–proton collision data with an integrated luminosity of 140 fb⁻¹ collected by the ATLAS experiment. Two taggers that perform well at rejecting gluon-initiated jets while retaining quark-initiated jets are studied: one tagger is based on requirements on the number of inner-detector tracks associated with the jet, and the other combines several jet substructure observables using a boosted decision tree. A method is established to determine the quark/gluon fraction in data, by using quark/gluon-enriched subsamples defined by the jet pseudorapidity. Differences in tagging efficiency between data and simulation are provided for jets with transverse momentum between 500 GeV and 2 TeV and for multiple tagger working points.
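The track-multiplicity tagger works because gluon jets, carrying a larger colour charge, radiate more and so contain more charged-particle tracks on average than quark jets of the same pT. A toy version of such a cut, with an entirely hypothetical parameterisation (the real ATLAS working points are calibrated in data):

```python
import math

def is_quark_like(n_tracks, jet_pt_gev, a=2.0, b=4.0):
    # Tag a jet as quark-like if its inner-detector track count is below
    # a threshold that grows logarithmically with jet pT (track
    # multiplicity rises slowly with jet energy). The (a, b) values are
    # hypothetical, not an ATLAS working point.
    threshold = a * math.log(jet_pt_gev) + b
    return n_tracks < threshold
```

The pseudorapidity-based enrichment mentioned in the abstract exploits the fact that forward jets in the selected samples are more often quark-initiated, giving data-driven quark- and gluon-enriched subsamples against which such a cut's efficiency can be measured.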