
    Coefficient of thermal expansion of nanostructured tungsten based coatings assessed by thermally induced substrate curvature method

    The in-plane coefficient of thermal expansion (CTE) and the residual stress of nanostructured W-based coatings are extensively investigated. The CTE and the residual stresses are derived by means of an optimized, ad hoc developed experimental setup based on the detection of the substrate curvature by a laser system. The nanostructured coatings are deposited by Pulsed Laser Deposition. Thanks to the versatility of this technique, nanocrystalline metallic W coatings, ultra-nano-crystalline pure W and W-tantalum coatings, and amorphous-like W coatings are obtained. The correlation between the nanostructure, the residual stress, and the CTE of the coatings is thus elucidated. We find that all the samples show a compressive state of stress that decreases as the structure goes from columnar nanocrystalline to amorphous-like. The CTE of all the coatings is higher than that of the corresponding bulk W form. In particular, as the grain size shrinks, the CTE increases from 5.1 × 10^{-6} K^{-1} for nanocrystalline W to 6.6 × 10^{-6} K^{-1} in the ultra-nano-crystalline regime. For amorphous-like W, the further increase of the CTE is attributed to a higher degree of porosity of the samples. The CTE trend is also investigated as a function of material stiffness: the softer the W coatings become, the more easily they thermally expand. Comment: The research leading to these results has also received funding from the European Research Council Consolidator Grant ENSURE (ERC-2014-CoG No. 647554).
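The substrate-curvature route from measured curvature to film stress (Stoney's equation), and from the slope of thermal stress versus temperature to the film CTE, can be sketched as below. This is a minimal illustration of the standard method, not the authors' actual setup; all numeric values in the usage lines are illustrative placeholders, not measured data from the study.

```python
# Sketch of the substrate-curvature method: Stoney's equation for film
# stress, and the thermal-stress slope for the film CTE.

def stoney_stress(curvature, E_s, nu_s, h_s, h_f):
    """Film stress (Pa) from substrate curvature change (1/m),
    in the thin-film limit of Stoney's equation."""
    return E_s * h_s**2 * curvature / (6.0 * (1.0 - nu_s) * h_f)

def film_cte(dstress_dT, E_f, nu_f, alpha_sub):
    """Film CTE from the thermal-stress slope, using
    d(sigma)/dT = -E_f / (1 - nu_f) * (alpha_f - alpha_sub)."""
    return alpha_sub - dstress_dT * (1.0 - nu_f) / E_f

# Illustrative placeholders: ~500 um Si substrate, 1 um W film,
# curvature change of 1e-3 1/m, thermal-stress slope of -1.5e5 Pa/K.
sigma = stoney_stress(curvature=1e-3, E_s=130e9, nu_s=0.28,
                      h_s=500e-6, h_f=1e-6)
alpha_f = film_cte(dstress_dT=-1.5e5, E_f=330e9, nu_f=0.28,
                   alpha_sub=2.6e-6)
```

A compressive-to-tensile sign convention and the biaxial-modulus correction factor follow the usual Stoney formulation; only the curvature change between two temperatures enters, so the unstressed reference curvature cancels.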

    Thermomechanical properties of amorphous metallic tungsten-oxygen and tungsten-oxide coatings

    In this work, we investigate the correlation between the morphology, composition, and mechanical properties of metallic amorphous tungsten-oxygen and amorphous tungsten-oxide films deposited by Pulsed Laser Deposition. This correlation is investigated by the combined use of Brillouin Spectroscopy and the substrate curvature method. The stiffness of the films is strongly affected by both the oxygen content and the mass density. The elastic moduli show a decreasing trend as the mass density decreases and the oxygen-tungsten ratio increases. A plateau region is detected at the transition between metallic and oxide films. The compressive residual stresses, moderate stiffness, and high local ductility that characterize compact amorphous tungsten-oxide films make them promising for applications involving thermal or mechanical loads. The coefficient of thermal expansion is quite high (i.e. 8.9 × 10^{-6} K^{-1}), being strictly correlated to the amorphous structure and stoichiometry of the films. Under thermal treatments the films show a quite low relaxation temperature (i.e. 450 K). They crystallize into the γ-monoclinic phase of WO3 starting from 670 K, which increases the material stiffness by about 70%. Comment: The research leading to these results has also received funding from the European Research Council Consolidator Grant ENSURE (ERC-2014-CoG No. 647554). The views and opinions expressed herein do not necessarily reflect those of the European Commission.

    Drip and Mate Operations Acting in Test Tube Systems and Tissue-like P systems

    The operations drip and mate considered in (mem)brane computing resemble the operations cut and recombination well known from DNA computing. Here we consider sets of vesicles with multisets of objects on their outside membrane, interacting by drip and mate in two different setups: in test tube systems, the vesicles may pass from one tube to another provided they fulfill specific constraints; in tissue-like P systems, the vesicles are immediately passed to specified cells after having undergone a drip or mate operation. In both variants, computational completeness can be obtained, yet with different constraints on the drip and mate operations.
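The core of the two operations, fusing two vesicles (mate) and splitting one vesicle in two (drip), can be sketched with Python's `Counter` as the multiset type. This is a deliberately simplified abstraction for illustration only; the formal definitions in the paper carry rule labels, contexts, and constraints omitted here, and the `glue` argument is a hypothetical convenience, not part of the paper's notation.

```python
from collections import Counter

def mate(v1, v2, glue=None):
    """Mate: fuse two vesicles, merging their surface multisets.
    Optionally a 'glue' multiset (objects consumed by the rule)
    is removed from the result."""
    merged = v1 + v2
    if glue:
        merged = merged - Counter(glue)
    return merged

def drip(v, left_objects):
    """Drip: split a vesicle in two; the objects listed in
    left_objects go to the first daughter, the rest to the second."""
    left = Counter(left_objects)
    assert all(v[o] >= n for o, n in left.items()), "objects missing"
    return left, v - left

# Usage: split a vesicle carrying the multiset {a, a, b}, then refuse it.
vesicle = Counter("aab")
left, right = drip(vesicle, "ab")
fused = mate(left, right)
```

Note that mate followed by drip (or vice versa) is multiset-preserving unless a `glue` multiset is consumed, mirroring how the operations rearrange rather than create objects.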

    Analysis of single-cell RNA sequencing data based on autoencoders

    Background: Single-cell RNA sequencing (scRNA-Seq) experiments are gaining ground to study the molecular processes that drive normal development as well as the onset of different pathologies. Finding an effective and efficient low-dimensional representation of the data is one of the most important steps in the downstream analysis of scRNA-Seq data, as it could provide a better identification of known or putatively novel cell-types. Another step that still poses a challenge is the integration of different scRNA-Seq datasets. Though standard computational pipelines to gain knowledge from scRNA-Seq data exist, a further improvement could be achieved by means of machine learning approaches. Results: Autoencoders (AEs) have been effectively used to capture the non-linearities among gene interactions in scRNA-Seq data, so that the deployment of AE-based tools might represent the way forward in this context. We introduce here scAEspy, a unifying tool that embodies: (1) four of the most advanced AEs, (2) two novel AEs that we developed for this purpose, and (3) different loss functions. We show that scAEspy can be coupled with various batch-effect removal tools to integrate data from different scRNA-Seq platforms, in order to better identify the cell-types. We benchmarked scAEspy against the most widely used batch-effect removal tools, showing that our AE-based strategies outperform the existing solutions. Conclusions: scAEspy is a user-friendly tool that enables using the most recent and promising AEs to analyse scRNA-Seq data by setting up only two user-defined parameters. Thanks to its modularity, scAEspy can be easily extended to accommodate new AEs to further improve the downstream analysis of scRNA-Seq data. Considering the relevant results we achieved, scAEspy can be considered as a starting point to build a more comprehensive toolkit designed to integrate multiple single-cell omics data.
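The dimensionality-reduction idea behind AE-based tools can be illustrated with a minimal single-hidden-layer autoencoder in plain NumPy. This toy sketch is NOT scAEspy's actual architecture, loss, or training procedure (scAEspy wraps far more advanced AEs); the random matrix stands in for a cells-by-genes expression matrix purely for illustration.

```python
import numpy as np

# Toy autoencoder: encode 50-dimensional "expression profiles" into an
# 8-dimensional latent space and train by plain gradient descent on the
# mean squared reconstruction error.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))            # 200 "cells" x 50 "genes"
d, k = X.shape[1], 8                      # latent size k << d
W1 = rng.normal(scale=0.1, size=(d, k))   # encoder weights
W2 = rng.normal(scale=0.1, size=(k, d))   # decoder weights

def recon_loss(X, W1, W2):
    Z = np.tanh(X @ W1)                   # encode
    return float(((Z @ W2 - X) ** 2).mean())

loss_before = recon_loss(X, W1, W2)
lr = 1e-2
for _ in range(300):                      # gradient-descent training
    Z = np.tanh(X @ W1)
    err = (Z @ W2 - X) / len(X)           # scaled reconstruction error
    W1 -= lr * (X.T @ ((err @ W2.T) * (1 - Z ** 2)))
    W2 -= lr * (Z.T @ err)
loss_after = recon_loss(X, W1, W2)
latent = np.tanh(X @ W1)                  # low-dimensional representation
```

In practice the `latent` matrix is what downstream clustering operates on; real tools replace this hand-rolled loop with variational or adversarial AEs and proper optimizers.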

    A Multiscale Modeling Framework Based on P Systems

    Cellular systems present a highly complex organization at different scales, including the molecular, cellular, and colony levels. The complexity at each of these levels is tightly interrelated. Integrative systems biology aims to obtain a deeper understanding of cellular systems by focusing on the systemic and systematic integration of the different levels of organization in cellular systems. The different approaches to cellular modeling within systems biology have been classified into mathematical and computational frameworks. Specifically, the methodology to develop computational models has recently been called executable biology, since it produces executable algorithms whose computations resemble the evolution of cellular systems. In this work we present P systems as a multiscale modeling framework within executable biology. P system models explicitly specify the molecular, cellular, and colony levels in cellular systems in a relevant and understandable manner. Molecular species and their structure are represented by objects or strings, compartmentalization is described using membrane structures, and finally cellular colonies and tissues are modeled as a collection of interacting individual P systems. The interactions between the components of cellular systems are described using rewriting rules. These rules can in turn be grouped together into modules to characterize specific cellular processes. One of our current research lines focuses on the design of cell systems biology models exhibiting a prefixed behavior through the automatic assembly of these cellular modules. Our approach is equally applicable to synthetic as well as systems biology. Funding: United Kingdom's Engineering and Physical Sciences Research Council EP/E017215/1; Biotechnology and Biological Sciences Research Council/United Kingdom BB/F01855X/1; Biotechnology and Biological Sciences Research Council/United Kingdom BB/D019613/

    Biochemical parameter estimation vs. benchmark functions: A comparative study of optimization performance and representation design

    © 2019 Elsevier B.V. Computational Intelligence methods, which include Evolutionary Computation and Swarm Intelligence, can efficiently and effectively identify optimal solutions to complex optimization problems by exploiting the cooperative and competitive interplay among their individuals. The exploration and exploitation capabilities of these meta-heuristics are typically assessed by considering well-known suites of benchmark functions, specifically designed for numerical global optimization purposes. However, their performance can drastically change in the case of real-world optimization problems. In this paper, we investigate this issue by considering the Parameter Estimation (PE) of biochemical systems, a common computational problem in the field of Systems Biology. In order to evaluate the effectiveness of various meta-heuristics in solving the PE problem, we compare their performance by considering a set of benchmark functions and a set of synthetic biochemical models characterized by a search space with an increasing number of dimensions. Our results show that some state-of-the-art optimization methods, able to largely outperform the other meta-heuristics on benchmark functions, are characterized by considerably poor performance when applied to the PE problem. We also show that a limiting factor of these optimization methods concerns the representation of the solutions: indeed, by means of a simple semantic transformation, it is possible to turn these algorithms into competitive alternatives. We corroborate this finding by performing the PE of a model of metabolic pathways in red blood cells. Overall, we argue that classic benchmark functions cannot be fully representative of all the features that make real-world optimization problems hard to solve. This is particularly the case for the PE of biochemical systems. We also show that optimization problems must be carefully analyzed to select an appropriate representation, in order to actually obtain the performance promised by benchmark results.
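Why the representation of the solutions matters can be shown with a toy parameter-estimation problem. Kinetic constants often span several orders of magnitude, so sampling candidate values in log-space (one simple "semantic transformation" of the encoding) concentrates the search where it helps. The decay model, data, and random search below are deliberately minimal and hypothetical, not the paper's actual models or meta-heuristics.

```python
import math
import random

# Toy PE task: recover the rate constant of an exponential decay from
# four synthetic observations, comparing a linear and a log-space
# candidate representation under the same random-search budget.
random.seed(42)
K_TRUE = 3e-4                          # "unknown" rate constant

def cost(k):
    """Squared error against the synthetic decay data."""
    ts = [0.0, 1000.0, 2000.0, 4000.0]
    return sum((math.exp(-k * t) - math.exp(-K_TRUE * t)) ** 2
               for t in ts)

def random_search(sample, n=5000):
    """Best of n random candidates drawn by sample()."""
    return min((sample() for _ in range(n)), key=cost)

linear = random_search(lambda: random.uniform(0.0, 1.0))
logspace = random_search(lambda: 10 ** random.uniform(-6.0, 0.0))
```

With the linear encoding, almost the entire budget lands far above the true scale of `K_TRUE`; the log-space encoding spends the same budget evenly across orders of magnitude and pinpoints the constant.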

    MedGA: A novel evolutionary method for image enhancement in medical imaging systems

    Medical imaging systems often require the application of image enhancement techniques to help physicians in anomaly/abnormality detection and diagnosis, as well as to improve the quality of images that undergo automated image processing. In this work we introduce MedGA, a novel image enhancement method based on Genetic Algorithms that is able to improve the appearance and the visual quality of images characterized by a bimodal gray-level intensity histogram, by strengthening their two underlying sub-distributions. MedGA can be exploited as a pre-processing step for the enhancement of images with a nearly bimodal histogram distribution, to improve the results achieved by downstream image processing techniques. As a case study, we use MedGA as a clinical expert system for contrast-enhanced Magnetic Resonance image analysis, considering Magnetic Resonance guided Focused Ultrasound Surgery for uterine fibroids. The performance of MedGA is quantitatively evaluated by means of various image enhancement metrics, and compared against conventional state-of-the-art image enhancement techniques, namely histogram equalization, bi-histogram equalization, encoding and decoding Gamma transformations, and sigmoid transformations. We show that MedGA considerably outperforms the other approaches in terms of signal and perceived image quality, while preserving the input mean brightness. MedGA may have a significant impact in real healthcare environments, representing an intelligent solution for Clinical Decision Support Systems in radiology practice for image enhancement, to visually assist physicians during their interactive decision-making tasks, as well as for the improvement of downstream automated processing pipelines in clinically useful measurements.
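The GA-on-a-bimodal-histogram idea can be sketched with a tiny evolutionary loop that evolves a gray-level threshold separating the two modes. The fitness used here (within-class variance, as in Otsu's criterion) and the elitism-plus-mutation scheme are illustrative stand-ins, not MedGA's actual encoding, operators, or fitness function.

```python
import random

# Synthetic bimodal 8-bit histogram: two Gaussian gray-level modes.
random.seed(1)
hist = [0] * 256
for _ in range(5000):
    hist[max(0, min(255, int(random.gauss(70, 10))))] += 1
    hist[max(0, min(255, int(random.gauss(180, 12))))] += 1

def intra_class_variance(t):
    """Within-class variance of the split [0, t) vs [t, 256)."""
    def stats(lo, hi):
        w = sum(hist[lo:hi])
        if w == 0:
            return 0, 0.0
        mu = sum(i * hist[i] for i in range(lo, hi)) / w
        return w, sum(hist[i] * (i - mu) ** 2 for i in range(lo, hi)) / w
    w0, v0 = stats(0, t)
    w1, v1 = stats(t, 256)
    return (w0 * v0 + w1 * v1) / (w0 + w1)

# Minimal GA: keep the 10 fittest thresholds, refill with +-8 mutants.
pop = [random.randrange(1, 256) for _ in range(20)]
for _ in range(40):
    pop.sort(key=intra_class_variance)
    pop = pop[:10] + [max(1, min(255, p + random.randint(-8, 8)))
                      for p in pop[:10]]
best = min(pop, key=intra_class_variance)
```

The evolved `best` threshold lands in the valley between the two modes; MedGA uses such a split point to strengthen each sub-distribution separately during enhancement.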

    Modeling cell proliferation in human acute myeloid leukemia xenografts

    Motivation: Acute myeloid leukemia (AML) is one of the most common hematological malignancies, characterized by high relapse and mortality rates. The inherent intra-tumor heterogeneity in AML is thought to play an important role in disease recurrence and resistance to chemotherapy. Although experimental protocols for cell proliferation studies are well established and widespread, they are not easily applicable to in vivo contexts, and the analysis of related time-series data is often complex to achieve. To overcome these limitations, model-driven approaches can be exploited to investigate different aspects of cell population dynamics. Results: In this work, we present ProCell, a novel modeling and simulation framework to investigate cell proliferation dynamics that, differently from other approaches, takes into account the inherent stochasticity of cell division events. We apply ProCell to compare different models of cell proliferation in AML, notably leveraging experimental data derived from human xenografts in mice. ProCell is coupled with Fuzzy Self-Tuning Particle Swarm Optimization, a swarm-intelligence settings-free algorithm used to automatically infer the models' parameterizations. Our results provide new insights into the intricate organization of AML cells with highly heterogeneous proliferative potential, highlighting the important role played by quiescent cells and by proliferating cells characterized by different rates of division in the progression and evolution of the disease, thus hinting at the necessity to further characterize tumor cell subpopulations. Availability and implementation: The source code of ProCell and the experimental data used in this work are available under the GPL 2.0 license on GitHub at the following URL: https://github.com/aresio/ProCell
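The kind of stochastic, subpopulation-aware proliferation model described above can be sketched as a per-cell Bernoulli division process. The subpopulation names, division probabilities, and time horizon below are illustrative placeholders, not ProCell's calibrated parameters or its actual simulation algorithm.

```python
import random

# Toy heterogeneous proliferation model: each cell belongs to a
# subpopulation with its own per-step division probability; quiescent
# cells never divide. Each division yields two daughters one
# generation deeper.
random.seed(7)
P_DIVIDE = {"quiescent": 0.0, "slow": 0.05, "fast": 0.25}

def simulate(initial, steps):
    """initial: dict subpopulation -> cell count.
    Returns the final list of (subpopulation, generation) cells."""
    cells = [(pop, 0) for pop, n in initial.items() for _ in range(n)]
    for _ in range(steps):
        newborn = []
        for pop, gen in cells:
            if random.random() < P_DIVIDE[pop]:
                newborn.append((pop, gen + 1))   # two daughters
                newborn.append((pop, gen + 1))
            else:
                newborn.append((pop, gen))
        cells = newborn
    return cells

final = simulate({"quiescent": 100, "slow": 100, "fast": 100}, steps=10)
```

Tracking the generation number per cell is what lets such simulations be compared against dye-dilution proliferation data, where fluorescence halves at each division.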

    Computational strategies for a system-level understanding of metabolism

    Cell metabolism is the biochemical machinery that provides energy and building blocks to sustain life. Understanding its fine regulation is of pivotal relevance in several fields, from metabolic engineering applications to the treatment of metabolic disorders and cancer. Sophisticated computational approaches are needed to unravel the complexity of metabolism. To this end, a plethora of methods have been developed, yet it is generally hard to identify which computational strategy is best suited to the investigation of a specific aspect of metabolism. This review provides an up-to-date description of the computational methods available for the analysis of metabolic pathways, discussing their main advantages and drawbacks. In particular, attention is devoted to the identification of the appropriate scale and level of accuracy in the reconstruction of metabolic networks, and to the inference of model structure and parameters, especially when dealing with a shortage of experimental measurements. The choice of the proper computational methods to derive in silico data is then addressed, including topological analyses, constraint-based modeling, and simulation of the system dynamics. A description of some computational approaches to gain new biological knowledge or to formulate hypotheses is finally provided.
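Of the strategies mentioned, constraint-based modeling is the easiest to illustrate compactly: flux balance analysis (FBA) maximizes a target flux subject to steady-state mass balance, which is a linear program. The three-reaction network below is a made-up illustration, not a curated metabolic model; the linear program is solved with SciPy's `linprog`.

```python
from scipy.optimize import linprog

# Toy FBA: maximize the "biomass" flux v3 of the network
#   R1: -> A (uptake, capped at 10)   R2: A -> B   R3: B -> (biomass)
# subject to steady state S @ v = 0 and flux bounds.
S = [[1, -1,  0],    # metabolite A balance
     [0,  1, -1]]    # metabolite B balance
bounds = [(0, 10), (0, None), (0, None)]

# linprog minimizes c @ v, so maximize v3 by minimizing -v3.
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=[0, 0], bounds=bounds,
              method="highs")
biomass_flux = -res.fun
```

Here the uptake bound is the only active constraint, so all flux is routed straight through to biomass; realistic models have thousands of reactions, but the structure of the linear program is exactly this.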