Biologically-informed interpretable deep learning techniques for BMI prediction and gene interaction detection
The analysis of genetic point mutations at the population level can offer insights into the genetic basis of human traits, which in turn could potentially lead to new diagnostic and treatment options for heritable diseases. However, existing genetic data analysis methods tend to rely on simplifying assumptions that ignore nonlinear interactions between variants. The ability to model and describe nonlinear genetic interactions could lead to both improved trait prediction and enhanced understanding of the underlying biology. Deep learning (DL) models offer the possibility of automatically learning complex nonlinear genetic architectures, but it is currently unclear how best to optimise them for genetic data. It is also essential that any such models be able to “explain” what they have learned before they can be used for genetic discovery or clinical applications, which is difficult given the black-box nature of DL predictors.
This thesis addresses a number of methodological gaps in applying explainable DL models end-to-end to variant-level genetic data. We propose novel methods for encoding genetic data for deep learning applications and show that feature encodings designed specifically for genetic variants offer the possibility of improved model efficiency and performance. We then benchmark a variety of models for the prediction of Body Mass Index (BMI) using data from the UK Biobank, yielding insights into DL performance in this domain. Finally, we propose a series of novel DL model interpretation methods with features optimised for biological insight. We first show how these can be used to validate that the network has automatically replicated existing knowledge, and then illustrate their ability to detect complex nonlinear genetic interactions that influence BMI in our cohort. Overall, we show that DL model training and interpretation procedures optimised for genetic data can be used to yield new insights into disease aetiology.
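The abstract does not specify the thesis's encoding schemes, but a common baseline such work builds on is one-hot encoding of biallelic genotypes. As a loose illustration only (the function name, shapes, and the 0/1/2 minor-allele-count coding are assumptions, not the thesis's method), a minimal sketch:

```python
import numpy as np

def one_hot_genotypes(genotypes: np.ndarray) -> np.ndarray:
    """One-hot encode an (n_samples, n_variants) matrix of biallelic
    genotypes coded as minor-allele counts (0, 1, or 2).
    Illustrative baseline only, not the thesis's proposed encoding."""
    n_samples, n_variants = genotypes.shape
    encoded = np.zeros((n_samples, n_variants, 3), dtype=np.float32)
    rows = np.arange(n_samples)[:, None]
    cols = np.arange(n_variants)[None, :]
    encoded[rows, cols, genotypes] = 1.0
    # Flatten to (n_samples, n_variants * 3) for a dense network input.
    return encoded.reshape(n_samples, n_variants * 3)

g = np.array([[0, 1, 2, 0],
              [1, 1, 0, 2],
              [2, 0, 1, 1]])       # three individuals, four variants
print(one_hot_genotypes(g).shape)  # (3, 12)
```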
The contingent valuation method as a neoliberal tool of environmental planning
This master's thesis offers a political critique of the contingent valuation method, which is used to assign monetary value to non-market environmental goods for environmental planning purposes. The method rests on a survey in which a sample of individuals is asked how much they would be willing to pay to protect an environmental good of interest. The total value is then incorporated into a cost-benefit analysis, as a cost in the case of an extraction project or as a benefit in the case of a conservation project. The reliability of the method, as well as its theoretical foundations, has been widely criticised. In several studies, respondents show little sensitivity to the proposed prices and quantities, leading some researchers to conclude that the survey results cannot be used in cost-benefit analyses but should instead be interpreted as those of a pseudo-referendum. Yet, faced with this finding, rather than proposing to abandon the economic framework in favour of openly political consultations, it has been suggested that the method be used more widely so that respondents become accustomed to it. Such persistence in applying the method led us to regard it as a tool that, far from being neutral, belongs to a much larger political project: that of neoliberalism. Since neoliberal policies are based entirely on utilitarian rationality, they must produce subjects adapted to them, namely individuals who make choices solely on the basis of a calculation of self-interest, like firms. This is why neoliberalism can be understood, following Michel Foucault, as a mode of subjectivation. Drawing on Foucault's reflections in Naissance de la biopolitique, as well as on the writings of several of his intellectual heirs, we develop the argument that the proliferation of monetary valuation surveys contributes to the shaping of neoliberal subjects. Forced to think about environmental questions within an economic framework, these subjects can no longer conceive of any justice other than what they have declared themselves willing to pay and the environmental goods and services they receive in return. The use of the method thus has the effect of depoliticising the relationship between state and citizen. To escape this a-democratic neoliberal rationality, other environmental planning tools can be used, such as multi-criteria evaluation, which does away with the monetary dimension of the assessments.
______________________________________________________________________________
AUTHOR KEYWORDS: contingent valuation method, non-market environmental goods, environmental planning, neoliberalism, mode of subjectivation, Michel Foucault
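For readers unfamiliar with the mechanics being critiqued, the aggregation step is simple arithmetic; the sketch below uses entirely hypothetical figures to show how stated willingness to pay enters a cost-benefit analysis on the cost side of an extraction project:

```python
# Hypothetical figures only: the mean stated willingness to pay (WTP),
# affected population, and project benefits are invented for illustration.
mean_wtp = 42.0            # mean stated WTP per respondent
population = 1_500_000     # population the survey sample represents
total_value = mean_wtp * population   # aggregate non-market value

project_benefits = 80_000_000.0       # market benefits of the project
# For an extraction project, the environmental good's aggregate value
# enters the cost-benefit analysis as a cost:
net_benefit = project_benefits - total_value
print(f"aggregate WTP: {total_value:,.0f}; net benefit: {net_benefit:,.0f}")
```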
The methodology of surveillance for antimicrobial resistance and healthcare-associated infections in Europe (SUSPIRE): a systematic review of publicly available information.
OBJECTIVES: Surveillance is a key component of any control strategy for healthcare-associated infections (HAIs) and antimicrobial resistance (AMR), and public availability of methodologic aspects is crucial for the interpretation of the data. We sought to systematically review publicly available information on HAI and/or AMR surveillance systems organized by public institutions or scientific societies in European countries. METHODS: A systematic review of scientific and grey literature following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines was performed. Information on HAI and/or AMR surveillance systems published until 31 October 2016 was included. RESULTS: A total of 112 surveillance systems were detected; 56 from 20 countries were finally included. Most exclusions were due to lack of publicly available information. Regarding AMR, the most frequent indicator was the proportion of resistant isolates (27 of 34 systems providing information, 79.4%); only 18 (52.9%) included incidence rates; the data were laboratory based only in 33 (78.5%) of the 42 systems providing this information. Regarding HAIs in intensive care units, all 22 of the systems providing data included central line-associated bloodstream infections, and 19 (86.3%) included ventilator-associated pneumonia and catheter-associated urinary tract infections; incidence density was the most frequent indicator. Regarding surgical site infections, the most frequent procedures included were hip prosthesis, colon surgery and caesarean section (21/22, 95.5%). CONCLUSIONS: Publicly available information about the methods and indicators of surveillance systems is frequently lacking. Despite the efforts of the European Centre for Disease Prevention and Control (ECDC) and other organizations, wide heterogeneity in procedures and indicators still exists.
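The two indicators named above have standard definitions; a minimal sketch of how each is computed, with hypothetical counts:

```python
def proportion_resistant(resistant: int, tested: int) -> float:
    """Proportion of resistant isolates, the most frequent AMR indicator."""
    return resistant / tested

def incidence_density(infections: int, device_days: int) -> float:
    """Infections per 1,000 device-days, the usual HAI indicator in ICUs."""
    return 1000 * infections / device_days

# Hypothetical counts for illustration.
print(f"{proportion_resistant(130, 480):.1%}")                     # 27.1%
print(f"{incidence_density(12, 8500):.2f} per 1,000 device-days")  # 1.41
```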
Development of the correction procedure for High Volume Instrument elongation measurement.
Cotton spinning mills need high-quality fibers to maintain their manufacturing efficiency. Machinery throughput is increasing, which can translate into more processing steps with higher breaking stress; consequently, more fibers are susceptible to breakage or damage. To face this problem, breeders must develop new varieties whose fibers can better withstand this mechanical stress. The main tool used in cotton breeding programs is the High Volume Instrument (HVI), which quickly reports measurements such as micronaire, length, color, and strength. The instrument can also determine fiber elongation, but no correction method currently exists for that measurement. Elongation and strength both factor into the work-to-break of fibers, which plays a direct role in fiber breakage and spinning performance. The objectives of this work were to develop cotton elongation standards, devise a correction procedure for HVI lines, evaluate measurement stability, and validate the results with a set of independent samples. Two commercial bales, one with low and one with high HVI elongation, were identified as potential elongation standards; they were produced and evaluated, and after validation were used to correct HVI lines against Stelometer (STrength-ELOngation-METER) measurements. An independent set of samples was tested on corrected HVIs to confirm the effectiveness of the corrected elongation measurements. The corrected HVI data were at least as good as the Stelometer data, with increased data acquisition speed and precision. This research can help cotton breeders improve fiber elongation and strength at the same time, resulting in better fibers for yarn spinning.
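The abstract does not give the mathematical form of the correction. Instrument calibrations against a low and a high reference standard are often two-point linear adjustments, so a hedged sketch along those lines (all values hypothetical, not the paper's procedure) might look like:

```python
def two_point_correction(ref_low: float, ref_high: float,
                         raw_low: float, raw_high: float):
    """Slope and offset mapping raw HVI elongation readings onto the
    reference values of low- and high-elongation calibration standards.
    A plausible form of such a procedure, not the paper's method."""
    slope = (ref_high - ref_low) / (raw_high - raw_low)
    offset = ref_low - slope * raw_low
    return slope, offset

# Hypothetical reference (Stelometer) and raw HVI values, % elongation.
slope, offset = two_point_correction(ref_low=4.8, ref_high=8.1,
                                     raw_low=5.5, raw_high=9.0)
print(f"corrected: {slope * 7.2 + offset:.2f}%")  # 6.40%
```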
Association of mixed hematopoietic chimerism with elevated circulating autoantibodies and chronic graft-versus-host disease occurrence.
BACKGROUND: Use of a reduced-intensity conditioning regimen before allogeneic hematopoietic cell transplantation is frequently associated with an early state of mixed hematopoietic chimerism. Such coexistence of host and donor hematopoietic cells may influence posttransplant alloreactivity and may affect the occurrence and severity of acute and chronic graft-versus-host disease (GVHD) as well as the intensity of the graft-versus-leukemia effect. Here we evaluated the relation between chimerism state after reduced-intensity conditioning transplantation (RICT), autoantibody production, and chronic GVHD (cGVHD)-related pathology. METHODS: Chimerism state, circulating anticardiolipin and anti-double-stranded DNA autoantibody titers, and the occurrence of cGVHD-like lesions were investigated in a murine RICT model. RESULTS: We observed a novel association between a mixed chimerism state, high levels of pathogenic IgG autoantibodies, and subsequent development of cGVHD-like lesions. Furthermore, we found that the persistence of host B cells, but not dendritic cell origin or subset, was associated with the appearance of cGVHD-like lesions. The implication of host B cells was confirmed by the host origin of the autoantibodies. CONCLUSION: Recipient B cell persistence may contribute to the frequency and/or severity of cGVHD after RICT.
Body surface area and baseline blood pressure predict subclinical anthracycline cardiotoxicity in women treated for early breast cancer.
BACKGROUND AND AIMS: Anthracyclines are highly effective chemotherapeutic agents which may cause long-term cardiac damage (chronic anthracycline cardiotoxicity) and heart failure. The pathogenesis of anthracycline cardiotoxicity remains incompletely understood, and individual susceptibility is difficult to predict. We sought clinical features which might contribute to improved risk assessment. METHODS: Subjects were women with early breast cancer, free of pre-existing cardiac disease. Left ventricular ejection fraction was measured using cardiovascular magnetic resonance before and >12 months after anthracycline-based chemotherapy (>3 months post-trastuzumab). Variables associated with subclinical cardiotoxicity (defined as a fall in left ventricular ejection fraction of ≥5%) were identified by logistic regression. RESULTS: One hundred and sixty-five women (mean age 48.3 years at enrollment) completed the study 21.7 months [IQR 18.0-26.8] after starting chemotherapy. All received anthracyclines (98.8% epirubicin, cumulative dose 400 [300-450] mg/m2); 18% received trastuzumab. Baseline blood pressure was elevated (≥140/90 mmHg, mean 147.3/86.1 mmHg) in 18 subjects. Thirty-four subjects (20.7%) were identified with subclinical cardiotoxicity. Its independent predictors were the number of anthracycline cycles (odds ratio, OR 1.64 [1.17-2.30] per cycle), blood pressure ≥140/90 mmHg (OR 5.36 [1.73-17.61]), body surface area (OR 2.08 [1.36-3.20] per standard deviation (0.16 m2) increase), and trastuzumab therapy (OR 3.35 [1.18-9.51]). The resulting predictive model had an area under the receiver operating characteristic curve of 0.78 [0.70-0.86]. CONCLUSIONS: We found subclinical cardiotoxicity to be common even within this low-risk cohort. Risk of cardiotoxicity was associated with modestly elevated baseline blood pressure, indicating that close attention should be paid to blood pressure in patients considered for anthracycline-based chemotherapy. The association with higher body surface area suggests that indexing of anthracycline doses to surface area may not be appropriate for all patients, and points to the need for additional research in this area.
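The study's model cannot be reproduced from the abstract, but the analysis pattern (logistic regression on clinical covariates, odds ratios from the exponentiated coefficients, AUC from the predicted probabilities) is standard. A self-contained sketch on simulated data, with hypothetical predictor distributions loosely mirroring those reported:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 165  # cohort size from the study; all data below are simulated

# Hypothetical covariates: anthracycline cycles, baseline BP >=140/90,
# body surface area (m^2), and trastuzumab therapy.
X = np.column_stack([
    rng.integers(3, 7, n),       # number of anthracycline cycles
    rng.random(n) < 0.11,        # elevated baseline blood pressure
    rng.normal(1.8, 0.16, n),    # body surface area
    rng.random(n) < 0.18,        # trastuzumab
])
y = rng.random(n) < 0.21         # subclinical cardiotoxicity (~21%)

model = LogisticRegression().fit(X, y)
odds_ratios = np.exp(model.coef_[0])  # per-unit odds ratios
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(odds_ratios, f"AUC = {auc:.2f}")
```

On simulated noise the odds ratios and AUC will of course be uninformative; the point is only the shape of the analysis.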
Fluorinated MRI contrast agents and their versatile applications in the biomedical field
Peer reviewed
Do endemic tree floras make endemic forests? Insights from New Caledonian forests
New Caledonia is home to a rich and highly original flora, with a species endemism rate above 75% and a fascinating representation of relict taxa (gymnosperms and basal angiosperms). Previous studies of the island's flora have therefore mostly focused on its taxonomy and biogeographical origins, while few have attempted to understand the spatial distribution of species and the structure and diversity of species assemblages. Here, we present new insights into the diversity, structure, and ecology of tree communities derived from the New Caledonian Plant Inventory and Permanent Plot Network (NC-PIPPN). NC-PIPPN consists of standardized forest inventories scattered throughout the New Caledonian main island. The network groups together ca. 450 plots comprising more than 70,000 occurrences of woody plants (trees, shrubs, lianas, tree ferns, and palms) belonging to more than 950 mostly endemic species. Most species are distributed along wide environmental ranges (ca. 900 m of elevation and 2,200 mm of mean annual rainfall) and across contrasting substrates (volcano-sedimentary, ultramafic, and calcareous). Wide environmental ranges, however, do not correlate significantly with large spatial distributions or high local abundance. As elsewhere in the tropics, forest diversity is supported by a highly uneven species abundance distribution: fewer than 20% of tree species account for more than 50% of all known occurrences, while half of the tree species contribute less than 16% of occurrences. Local abundance is also independent of spatial distribution: some species that are rare at the island scale are locally abundant, while some that are frequent at the island scale are locally rare. The spatial distribution of species results in highly heterogeneous forests (high beta diversity) that contrast with a relatively homogeneous community structure. Despite a highly original flora and a pattern of aggregative species distribution, New Caledonian forests are not markedly distinguishable from other forests in the South Pacific region. Our results suggest that New Caledonian rainforests are mostly constrained by geographical features (area and isolation of the archipelago) and climatic features (e.g. cyclone frequency), while the uniqueness of the flora contributes little to forest structure.
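The dominance statistic quoted above (the share of species needed to account for half of all occurrences) is straightforward to compute from occurrence counts; a minimal sketch with hypothetical counts:

```python
import numpy as np

def dominance_fraction(occurrences: np.ndarray, target: float = 0.5) -> float:
    """Fraction of species needed to account for `target` of all
    occurrences, a simple summary of abundance-distribution evenness."""
    counts = np.sort(occurrences)[::-1]          # most abundant first
    cumulative = np.cumsum(counts) / counts.sum()
    n_needed = np.searchsorted(cumulative, target) + 1
    return n_needed / len(counts)

# Hypothetical occurrence counts for 10 species.
occ = np.array([900, 400, 250, 120, 80, 60, 40, 30, 15, 5])
print(f"{dominance_fraction(occ):.0%} of species cover 50% of occurrences")
```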
Mining rare Earth elements: Identifying the plant species most threatened by ore extraction in an insular hotspot
Conservation efforts in global biodiversity hotspots often face a common predicament: an urgent need for conservation action hampered by a significant lack of knowledge about that biodiversity. In recent decades, the computerisation of primary biodiversity data worldwide has provided the scientific community with raw material to increase our understanding of this shared natural heritage. These datasets, however, suffer from many geographical and taxonomic inaccuracies. Automated tools developed to enhance their reliability have shown that detailed expert examination remains the best way to achieve robust and exhaustive datasets. In New Caledonia, one of the most important biodiversity hotspots worldwide, the plant diversity inventory is still under way, and most taxa awaiting formal description are narrow endemics, hence by definition hard to discern in the datasets. In the meantime, anthropogenic pressures such as nickel-ore mining are threatening the unique ultramafic ecosystems at an increasing rate. The conservation challenge is therefore a race against time, as the rarest species must be identified and protected before they vanish. In this study, based on all available datasets and resources, we applied a workflow capable of highlighting the lesser-known taxa. The main challenges were to aggregate all data available worldwide and to tackle the geographical and taxonomic biases while avoiding the data loss that results from automated filtering. Every doubtful specimen went through careful taxonomic analysis by a panel of local and international taxonomists. Geolocation of the whole dataset was achieved through dataset cross-checking, local botanists' field knowledge, and examination of historical material. Field studies were also conducted to clarify the most unresolved taxa. With this method, and by analysing over 85,000 records, we were able to double the number of known narrow endemic taxa, elucidate 68 putative new species, and update our knowledge of the rarest species' distributions so as to promote conservation measures.
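The expert-driven workflow itself cannot be reduced to code, but the aggregation-and-flagging step it begins with can be sketched. Column names and the review criterion below are assumptions for illustration, not the study's schema:

```python
import pandas as pd

def aggregate_and_flag(sources: list[pd.DataFrame]) -> pd.DataFrame:
    """Merge specimen records from several sources, normalise taxon names,
    and flag (rather than drop) records needing expert review.
    A sketch under assumed column names, not the study's pipeline."""
    records = pd.concat(sources, ignore_index=True)
    records["taxon"] = records["taxon"].str.strip().str.capitalize()
    # Duplicate collection events (same taxon, collector, date) are merged.
    records = records.drop_duplicates(subset=["taxon", "collector", "date"])
    # Records lacking usable coordinates are flagged so a taxonomist panel
    # can resolve them from field knowledge or historical material,
    # avoiding the data loss caused by automated filtering.
    records["needs_review"] = (
        records["latitude"].isna() | records["longitude"].isna()
    )
    return records
```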
