
    Purification of immature neuronal cells from neural stem cell progeny

    Large-scale proliferation and multi-lineage differentiation capabilities make neural stem cells (NSCs) a promising renewable source of cells for therapeutic applications. However, their practical application for neuronal cell replacement is limited by the heterogeneity of NSC progeny, the relatively low yield of neurons, the predominance of astrocytes, poor survival of donor cells following transplantation, and the potential for uncontrolled proliferation of precursor cells. To address these impediments, we have developed a method for the generation of highly enriched immature neurons from murine NSC progeny. Adapting the standard differentiation procedure in concert with flow cytometry selection, using scattered light and positive fluorescence selection based on cell surface antibody binding, provided a near-pure (97%) immature neuron population. Using the purified neurons, we screened a panel of growth factors and found that bone morphogenetic protein-4 (BMP-4) exerted a strong survival effect on the cells in vitro and enhanced their functional maturity. This effect was maintained following transplantation into the adult mouse striatum, where we observed a 2-fold increase in the survival of the implanted cells and a 3-fold increase in NeuN expression. Additionally, based on the neural colony-forming cell assay (N-CFCA), we noted a 64-fold reduction in bona fide NSC frequency in the neuronal cell population, and the implanted donor cells showed no signs of excessive or uncontrolled proliferation. The ability to derive defined neural cell populations from renewable sources such as NSCs may find application in cell replacement therapies in the central nervous system.

    Including cognitive aspects in multiple criteria decision analysis

    First Online: 21 December 2016. Many Multiple Criteria Decision Analysis (MCDA) methods have been proposed over the last decades, and some of the best-known methods share similarities in the way they are used and configured. However, we live in a time of change, and nowadays the decision-making process (especially when done in a group) is ever more demanding and dynamic. In this work, we propose an MCDA method that includes cognitive aspects: the Cognitive Analytic Process (CAP). By taking advantage of aspects such as the expertise level, credibility, and behaviour style of the decision-makers, the method relates these aspects to the problem configurations (alternative and criteria preferences) made by each decision-maker. We evaluated CAP in terms of configuration costs and its capability to enhance the quality of the decision, using the satisfaction level as a metric to compare our method with other well-known MCDA methods in the literature (utility function, AHP, and TOPSIS). Our method proved capable of achieving higher satisfaction levels than the other MCDA methods, especially when the decision suggested by CAP differs from the one proposed by those methods. This work was supported by the COMPETE Programme (operational programme for competitiveness) within project POCI-01-0145-FEDER-007043, by National Funds through the FCT – Fundação para a Ciência e a Tecnologia (Portuguese Foundation for Science and Technology) within projects UID/CEC/00319/2013 and UID/EEA/00760/2013, and by the João Carneiro PhD grant with the reference SFRH/BD/89697/2012.
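
    The abstract compares CAP against classical MCDA baselines such as TOPSIS. As a point of reference, a minimal TOPSIS implementation is sketched below; the decision matrix, weights, and benefit/cost labels are invented for illustration and are not taken from the paper.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives by closeness to the ideal solution (TOPSIS)."""
    m = np.asarray(matrix, dtype=float)
    norm = m / np.linalg.norm(m, axis=0)          # vector-normalise each criterion
    v = norm * np.asarray(weights, dtype=float)   # weighted normalised matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)     # distance to ideal point
    d_neg = np.linalg.norm(v - anti, axis=1)      # distance to anti-ideal point
    return d_neg / (d_pos + d_neg)                # closeness: higher is better

# Invented example: three alternatives scored on price (cost), capacity, speed.
scores = topsis([[250, 16, 12], [200, 16, 8], [300, 32, 16]],
                weights=[0.4, 0.3, 0.3],
                benefit=[False, True, True])      # price: lower is better
best = int(np.argmax(scores))
```

    Here the third alternative ranks first despite its higher price, because its capacity and speed advantages dominate under these weights.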

    Multi-period Project Portfolio Selection under Risk Considerations and Stochastic Income

    This paper deals with the multi-period project portfolio selection problem, in which the available budget is invested in the best portfolio of projects in each period so that the net profit is maximized. We also consider more realistic assumptions to cover a wider range of applications than those reported in previous studies. A novel mathematical model is presented that accounts for risks, stochastic incomes, and the possibility of investing an extra budget in each time period. Due to the complexity of the problem, an effective meta-heuristic hybridized with a local search procedure is presented. The algorithm is based on a genetic algorithm (GA), a prominent method for this type of problem, enhanced with a new solution representation and well-chosen operators, and hybridized with a local search mechanism to obtain better solutions in less time. The performance of the proposed algorithm is then compared with well-known algorithms, namely the basic genetic algorithm (GA), particle swarm optimization (PSO), and the electromagnetism-like algorithm (EM-like), by means of several prominent indicators. The computational results show the superiority of the proposed algorithm in terms of accuracy, robustness, and computation time. Finally, the proposed algorithm is combined with PSO to reduce the computing time considerably.
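
    As an illustration of the hybrid (memetic) scheme the abstract describes, the sketch below combines a genetic algorithm with a single-bit-flip local search on a toy single-period, deterministic portfolio problem; the profits, costs, and budget are invented, and the real model additionally handles multiple periods, risk, and stochastic incomes.

```python
import random

random.seed(7)
profit = [12, 10, 8, 11, 14, 7]   # hypothetical expected net profit per project
cost   = [4, 3, 2, 4, 5, 3]       # hypothetical cost per project
BUDGET = 10                        # budget available in the period

def fitness(bits):
    total_cost = sum(b * c for b, c in zip(bits, cost))
    if total_cost > BUDGET:
        return 0                   # infeasible portfolios score zero
    return sum(b * p for b, p in zip(bits, profit))

def local_search(bits):
    # Hill-climb: flip any single project in/out while it improves fitness.
    improved = True
    while improved:
        improved = False
        for i in range(len(bits)):
            trial = bits[:]
            trial[i] ^= 1
            if fitness(trial) > fitness(bits):
                bits, improved = trial, True
    return bits

def ga(pop_size=20, generations=40):
    n = len(profit)
    # Seed with one locally optimised individual plus random portfolios.
    pop = [local_search([0] * n)]
    pop += [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size - 1)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        pop = pop[:pop_size // 2]                  # truncation selection
        while len(pop) < pop_size:
            a, b = random.sample(pop[:5], 2)
            cut = random.randrange(1, n)           # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:              # bit-flip mutation
                child[random.randrange(n)] ^= 1
            pop.append(local_search(child))        # memetic (hybrid) step
    return max(pop, key=fitness)

best = ga()
```

    Seeding the population with a locally optimised individual guarantees the search never returns anything worse than that local optimum.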

    The impact of a change on the size of the smoke compartment in the evacuation of health care facilities

    Evacuation in health-care facilities is complex due to the physical impairment of the patients, and usually requires the assistance of staff members. A proposed change to NFPA 101, the Life Safety Code, would increase the maximum allowable size of a smoke compartment (a space within the building enclosed by smoke barriers on all sides that restrict the movement of smoke) in health-care occupancies from 2090 m2 to 3700 m2, almost doubling the size. This study analyses the impact of this change on the time required to evacuate patients during a fire, in order to understand the consequences of the potential change. The paper focuses on the area where the patients' rooms are located. The evacuation scenario is a floor plan comprising four smoke compartments. To analyse the proposed change, the smoke barriers between two adjacent compartments were removed from the floor plan, and three ratios of patients per staff member were considered (4:1, 3:1 and 2:1). A computational methodology was used to calibrate the STEPS model for simulating assisted evacuation processes. In addition, the Fire Dynamics Simulator (FDS) was used to model the fire and smoke spread so that fire and evacuation results could be compared. The evacuation results show that the change in smoke compartment size increases the mean evacuation time by 23%; however, the fire results show that the available safe egress time is 16 min for both the smaller and the larger smoke compartments. The ratio of patients per staff member is also a strong factor, increasing the evacuation time by up to 82% when comparing the ratios of 2 and 4 patients per staff member.
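
    The safety argument in the abstract reduces to comparing the required safe egress time (RSET) against the available safe egress time (ASET, reported as 16 min). A minimal sketch of that comparison, using an assumed baseline evacuation time and the reported 23% increase:

```python
# Reported figures: ASET of 16 min in both layouts, and a 23% increase in the
# mean evacuation time when the smoke compartment is enlarged.
ASET_MIN = 16.0
baseline_rset = 9.0                   # assumed baseline mean evacuation time (min)
enlarged_rset = baseline_rset * 1.23  # +23% after the compartment is enlarged
margin = ASET_MIN - enlarged_rset     # remaining safety margin (min)
safe = enlarged_rset < ASET_MIN       # evacuation must finish before ASET
```

    The conclusion, of course, depends entirely on the assumed baseline: with a sufficiently long baseline RSET, the 23% increase would consume the whole margin.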

    A framework for group decision-making: Including cognitive and affective aspects in a MCDA method for alternatives rejection

    © Springer International Publishing AG, part of Springer Nature 2019. With the evolution of organizations and technology, Group Decision Support Systems have changed to support decision-makers who cannot be together at the same place and time to make a decision. However, these systems must now be able to support the interaction between decision-makers and provide all the relevant information at the most appropriate times; failing to do so may compromise the success and acceptance of the system. In this work, a framework for group decision-making is proposed, using a Multiple Criteria Decision Analysis method capable of identifying inconsistent assessments made by a decision-maker and identifying alternatives that should be rejected by the group of decision-makers. The proposed framework presents more relevant information throughout the decision-making process and in this way guides decision-makers toward more consensual and satisfactory decisions.
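
    One simple way to flag inconsistent assessments, in the spirit of the framework described, is to search a decision-maker's pairwise preferences for intransitive cycles. The sketch below is a generic illustration, not the paper's method, and the preference data are invented.

```python
from itertools import permutations

def inconsistent_triples(pref):
    """Find intransitive cycles in pairwise preferences.
    pref[(a, b)] == True means alternative a is preferred to b."""
    bad = []
    alts = {x for pair in pref for x in pair}
    for a, b, c in permutations(sorted(alts), 3):
        # a > b and b > c should imply a > c; c > a closes a cycle.
        if pref.get((a, b)) and pref.get((b, c)) and pref.get((c, a)):
            bad.append((a, b, c))
    return bad

# Invented assessments: A > B, B > C, but also C > A -- an inconsistency.
prefs = {("A", "B"): True, ("B", "C"): True, ("C", "A"): True}
cycles = inconsistent_triples(prefs)
```

    Each cycle is reported once per rotation (A-B-C, B-C-A, C-A-B); a canonical-rotation filter could deduplicate them if needed.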

    Accurate detection of spontaneous seizures using a generalized linear model with external validation

    Objective: Seizure detection is a major facet of electroencephalography (EEG) analysis in neurocritical care, epilepsy diagnosis and management, and the instantiation of novel therapies such as closed-loop stimulation or optogenetic control of seizures. It is also of increasing importance in high-throughput, robust, and reproducible pre-clinical research. However, seizure detectors are not widely relied upon in either clinical or research settings due to limited validation. In this study, we create a high-performance seizure-detection approach, validated in multiple data sets, with the intention that such a system could be made available to users for multiple purposes.
    Methods: We introduce a generalized linear model trained on 141 EEG signal features for the classification of seizures in continuous EEG, evaluated on two data sets. In the first (Focal Epilepsy) data set, consisting of 16 rats with focal epilepsy, we collected 1012 spontaneous seizures over 3 months of 24/7 recording. We trained a generalized linear model on the 141 features, representing 20 feature classes spanning univariate and multivariate, linear and nonlinear, and time- and frequency-domain measures, and tested performance on multiple held-out test data sets. We then applied the trained model to a second (Multifocal Epilepsy) data set consisting of 96 rats with 2883 spontaneous multifocal seizures.
    Results: From the Focal Epilepsy data set, we built a pooled classifier with an Area Under the Receiver Operating Characteristic curve (AUROC) of 0.995 and leave-one-out classifiers with an AUROC of 0.962. We validated our method on the independently constructed Multifocal Epilepsy data set, obtaining a pooled AUROC of 0.963. We separately validated a model trained exclusively on the Focal Epilepsy data set and tested on the held-out Multifocal Epilepsy data set, with an AUROC of 0.890. Latency to detection was under 5 seconds for over 80% of seizures and under 12 seconds for over 99% of seizures.
    Significance: This method achieves the highest published performance for seizure detection on multiple independent data sets. It can be applied to automated EEG analysis pipelines as well as closed-loop interventional approaches, and can be especially useful in research using animals, where there is an increased need for standardization and high-throughput analysis of large numbers of seizures.
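
    A generalized linear model for seizure classification is, at its core, a logistic (binomial) regression over epoch-level features, scored by AUROC. The sketch below illustrates that pipeline on synthetic two-class data, with a handful of Gaussian features standing in for the paper's 141 EEG features; nothing here reproduces the actual data or model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for epoch-level EEG features: seizure epochs are drawn
# with a mean shift relative to background (purely illustrative).
n, d = 400, 5
X = np.vstack([rng.normal(0.0, 1.0, (n, d)),    # background epochs
               rng.normal(1.5, 1.0, (n, d))])   # seizure epochs
y = np.concatenate([np.zeros(n), np.ones(n)])

# Fit a logistic (binomial GLM) model by batch gradient descent.
Xb = np.hstack([X, np.ones((2 * n, 1))])        # append intercept column
w = np.zeros(d + 1)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-np.clip(Xb @ w, -30, 30)))
    w -= 0.1 * Xb.T @ (p - y) / len(y)

scores = Xb @ w                                  # higher = more seizure-like

def auroc(y_true, s):
    """AUROC = P(random positive scores above random negative); ties count 0.5."""
    pos, neg = s[y_true == 1], s[y_true == 0]
    diff = pos[:, None] - neg[None, :]
    return float((diff > 0).mean() + 0.5 * (diff == 0).mean())

auc = auroc(y, scores)
```

    Thresholding `scores` then yields a per-epoch detector, with the threshold chosen to trade sensitivity against false-detection rate along the ROC curve.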

    Mechanical and Assembly Units of Viral Capsids Identified via Quasi-Rigid Domain Decomposition

    Key steps in a viral life cycle, such as the self-assembly of a protective protein container or, in some cases, subsequent maturation events, are governed by the interplay of physico-chemical mechanisms involving various spatial and temporal scales. These salient aspects of a viral life cycle are hence well described and rationalised from a mesoscopic perspective. Accordingly, various experimental and computational efforts have been directed towards identifying the fundamental building blocks that are instrumental for the mechanical response, or that constitute the assembly units, of a few specific viral shells. Motivated by these earlier studies, we introduce and apply a general and efficient computational scheme for identifying the stable domains of a given viral capsid. The method is based on elastic network models and quasi-rigid domain decomposition. It is first applied to a heterogeneous set of well-characterized viruses (CCMV, MS2, STNV, STMV) for which the known mechanical or assembly domains are correctly identified. The validated method is then applied to other viral particles, such as the L-A, Pariacoto, and polyoma viruses, whose fundamental functional domains are still unknown or debated, and for which we formulate verifiable predictions. The numerical code implementing the domain decomposition strategy is made freely available.
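
    Quasi-rigid domain decomposition ultimately partitions residues into groups that move coherently. As a toy illustration of the idea, the sketch below bipartitions a small contact graph, consisting of two tightly coupled blocks joined by a weak hinge, using the Fiedler vector of the graph Laplacian; the paper's actual method rests on elastic network models and a different decomposition criterion.

```python
import numpy as np

# Toy contact graph for 8 "residues": two fully connected blocks (0-3 and 4-7)
# joined by one weak hinge contact -- a stand-in for two quasi-rigid units.
A = np.zeros((8, 8))
for block in [(0, 1, 2, 3), (4, 5, 6, 7)]:
    for i in block:
        for j in block:
            if i != j:
                A[i, j] = 1.0
A[3, 4] = A[4, 3] = 0.1   # weak hinge between the blocks

# Spectral bipartition: the sign pattern of the Fiedler vector (eigenvector of
# the 2nd-smallest Laplacian eigenvalue) splits the graph at its weakest cut.
L = np.diag(A.sum(axis=1)) - A
eigval, eigvec = np.linalg.eigh(L)   # eigh returns eigenvalues in ascending order
fiedler = eigvec[:, 1]
labels = (fiedler > 0).astype(int)   # 0/1 domain assignment per residue
```

    Recursive application of such a bipartition, with a stopping criterion on internal rigidity, is one common route from pairwise fluctuation data to a multi-domain decomposition.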

    Disposable sensors in diagnostics, food and environmental monitoring

    Disposable sensors are low-cost and easy-to-use sensing devices intended for short-term or rapid single-point measurements. The growing demand for fast, accessible, and reliable information in a vastly connected world makes disposable sensors increasingly important. The areas of application for such devices are numerous, ranging from pharmaceutical, agricultural, environmental, forensic, and food sciences to wearables and clinical diagnostics, especially in resource-limited settings. The capabilities of disposable sensors can extend beyond measuring traditional physical quantities (for example, temperature or pressure); they can provide critical chemical and biological information (chemo- and biosensors) that can be digitized and made remotely available to users and to centralized or decentralized data-storage facilities. These features could pave the way for new classes of low-cost systems for health, food, and environmental monitoring that democratize sensing across the globe. Here, a brief insight into the materials and basics of sensors (methods of transduction, molecular recognition, and amplification) is provided, followed by a comprehensive and critical overview of the disposable sensors currently used for medical diagnostics and for food and environmental analysis. Finally, views on how the field of disposable sensing devices will continue to evolve are discussed, including future trends, challenges, and opportunities.
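
    A recurring task for the chemo- and biosensors discussed here is converting a raw transducer signal into an analyte concentration via a calibration curve. A minimal linear-calibration sketch, with invented current/concentration data standing in for a real disposable amperometric sensor:

```python
import numpy as np

# Hypothetical calibration of a disposable amperometric sensor: current (uA)
# measured at known analyte concentrations (mM); a linear response is assumed.
conc    = np.array([0.0, 0.5, 1.0, 2.0, 4.0])       # standards, mM
current = np.array([0.10, 0.62, 1.08, 2.11, 4.05])  # measured response, uA

slope, intercept = np.polyfit(conc, current, 1)     # sensitivity and baseline

def concentration(i_ua):
    """Invert the calibration line to estimate concentration from current."""
    return (i_ua - intercept) / slope

estimate = concentration(1.5)   # unknown sample reading of 1.5 uA
```

    In practice a single-use device is typically factory-calibrated per production lot, since the user cannot recalibrate a disposable strip.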