347 research outputs found

    A Bi-Layer Multi-Objective Techno-Economical Optimization Model for Optimal Integration of Distributed Energy Resources into Smart/Micro Grids

    The energy management system is executed in microgrids for optimal integration of distributed energy resources (DERs) into the power distribution grid. Most existing strategies focus on cost reduction, whereas economic and technical indices should in fact be considered simultaneously. Therefore, in this paper, a two-layer optimization model is proposed to minimize the operation costs, voltage fluctuations, and power losses of smart microgrids. In the outer layer, the sizes and capacities of DERs, including renewable energy sources (RES), electric vehicle (EV) charging stations, and energy storage systems (ESS), are obtained simultaneously. The inner layer corresponds to the scheduled operation of EVs and ESSs using an integrated coordination model (ICM). The ICM is a fuzzy inference system adopted to address the multi-objectivity of the cost function, which is developed based on hourly demand response, the states of charge of EVs and ESSs, and the electricity price. Demand response is implemented in the ICM to investigate the effect of time-of-use electricity prices on optimal energy management. To solve the optimization problem and the load-flow equations, a hybrid genetic algorithm (GA)-particle swarm optimization (PSO) method and the backward-forward sweep algorithm are deployed, respectively. One-day simulation results confirm that the proposed model can reduce the power losses, voltage fluctuations, and electricity supply cost by 51%, 40.77%, and 55.21%, respectively, which considerably improves power system stability and energy efficiency.
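    The backward-forward sweep load flow mentioned in the abstract can be illustrated with a minimal sketch for a short radial feeder. The per-unit line impedances and bus loads below are invented for illustration and are not the paper's test system:

    ```python
    # Minimal backward/forward sweep load flow for a radial feeder (a sketch,
    # not the paper's implementation). z[k] is the impedance of the line
    # feeding bus k+1; s_load[k] is the complex power drawn at that bus (p.u.).
    def backward_forward_sweep(z, s_load, v_slack=1.0 + 0j, tol=1e-8, max_iter=100):
        n = len(s_load)
        v = [v_slack] * n  # flat-start voltage profile at the load buses
        for _ in range(max_iter):
            # Backward sweep: accumulate branch currents from the far end
            # toward the source (I = conj(S / V) at each bus).
            i_inj = [(s_load[k] / v[k]).conjugate() for k in range(n)]
            i_branch = [0j] * n
            acc = 0j
            for k in range(n - 1, -1, -1):
                acc += i_inj[k]
                i_branch[k] = acc
            # Forward sweep: update voltages from the slack bus outward.
            v_new = []
            v_up = v_slack
            for k in range(n):
                v_up = v_up - z[k] * i_branch[k]
                v_new.append(v_up)
            if max(abs(v_new[k] - v[k]) for k in range(n)) < tol:
                return v_new
            v = v_new
        return v

    # Invented 3-bus example: identical line sections and loads.
    v = backward_forward_sweep([0.01 + 0.02j] * 3, [0.1 + 0.05j] * 3)
    ```

    The voltage magnitude drops slightly along the feeder, as expected for a loaded radial network; this is the load-flow subproblem that the paper's GA-PSO layer would call repeatedly.
    
    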

    An Introduction to Advanced Machine Learning : Meta Learning Algorithms, Applications and Promises

    In [1, 2], we have explored the theoretical aspects of feature extraction optimization processes for solving large-scale problems and overcoming machine learning limitations. The majority of the optimization algorithms introduced in [1, 2] guarantee optimal performance of supervised learning on offline, discrete data and address the curse of dimensionality (CoD) problem. These algorithms, however, are not tailored to emerging learning problems. One important issue raised by online data is the lack of sufficient samples per class. Further, traditional machine learning algorithms cannot be trained accurately on limited, distributed data, as data have proliferated and become significantly dispersed. Machine learning employs a strict model or embedded engine for training and prediction, which still fails to learn unseen classes and to make sufficient use of online data. In this chapter, we introduce these challenges in detail. We further investigate Meta-Learning (MTL) algorithms, their applications, and their promise for solving these emerging problems by answering the question: how can autonomous agents learn to learn?
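    As a toy illustration of "learning to learn", the sketch below implements a Reptile-style meta-update on a family of scalar regression tasks y = a*x. The task family, model, and step sizes are all invented here for illustration and are not taken from the chapter:

    ```python
    # Tiny Reptile-style meta-learning sketch (illustrative only).
    # Each task is "fit slope a"; the meta-learner finds an initialization
    # from which a few SGD steps adapt quickly to any sampled task.
    import random

    def inner_adapt(w, a, lr=0.1, steps=10):
        """A few SGD steps on one task: loss = (w - a)^2."""
        for _ in range(steps):
            grad = 2 * (w - a)          # d/dw of (w - a)^2
            w -= lr * grad
        return w

    def reptile(meta_lr=0.5, meta_iters=200, seed=0):
        rng = random.Random(seed)
        w = 0.0                          # meta-learned initialization
        for _ in range(meta_iters):
            a = rng.uniform(2.0, 4.0)            # sample a task
            w_adapted = inner_adapt(w, a)
            w += meta_lr * (w_adapted - w)       # Reptile meta-update
        return w

    w = reptile()
    ```

    After meta-training, the initialization settles near the centre of the task distribution, so a handful of inner steps suffice for any new task: a minimal instance of "learning to learn".
    
    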

    Crystalline phases involved in the hydration of calcium silicate-based cements: Semi-quantitative Rietveld X-ray diffraction analysis

    Chemical comparison of the powder and hydrated forms of calcium silicate cements (CSCs), and calculation of the alterations in tricalcium silicate (Ca3SiO5) and calcium hydroxide (Ca(OH)2), are essential for understanding their hydration processes. This study aimed to evaluate and compare these changes in ProRoot MTA, Biodentine and CEM cement. Powder and hydrated forms of tooth-coloured ProRoot MTA, Biodentine and CEM cement were subjected to X-ray diffraction (XRD) analysis with Rietveld refinement to semi-quantitatively identify and quantify the main phases involved in their hydration process. Data were reported descriptively. Reduction in Ca3SiO5 and formation of Ca(OH)2 were seen after the hydration of ProRoot MTA and Biodentine; however, in the case of CEM cement, no reduction of Ca3SiO5 and no formation of Ca(OH)2 were detected. The highest percentages of amorphous phases were seen in Biodentine samples. Ettringite was detected in the hydrated forms of ProRoot MTA and CEM cement but not in Biodentine.
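    The semi-quantitative phase percentages produced by a Rietveld refinement follow the standard weight-fraction relation W_i = S_i(ZMV)_i / Σ_j S_j(ZMV)_j, where S is the refined scale factor and Z, M and V are the formula units per cell, the formula-unit mass, and the cell volume. A minimal sketch with invented scale factors and (ZMV) products, not the refined values from this study:

    ```python
    # Hedged sketch of the standard Rietveld weight-fraction relation.
    # All numbers below are invented placeholders, not this study's data.
    def weight_fractions(phases):
        """phases: list of (scale factor S, Z*M*V product) per crystalline phase."""
        terms = [s * zmv for s, zmv in phases]
        total = sum(terms)
        return [t / total for t in terms]   # normalized weight fractions

    # Three hypothetical phases with made-up refined values.
    fracs = weight_fractions([(2.0, 1500.0), (1.0, 1200.0), (0.5, 900.0)])
    ```

    The fractions sum to one by construction; in practice an internal standard or amorphous-content correction is added when, as here, a large amorphous fraction is present.
    
    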

    Applications of Nature-Inspired Algorithms for Dimension Reduction: Enabling Efficient Data Analytics

    In [1], we have explored the theoretical aspects of feature selection and evolutionary algorithms. In this chapter, we focus on optimization algorithms for enhancing the data analytics process, i.e., we explore applications of nature-inspired algorithms in data science. Feature selection optimization is a hybrid approach that leverages feature selection techniques and evolutionary algorithms to optimize the selected features. Prior works solve this problem iteratively to converge to an optimal feature subset; feature selection optimization is a domain-independent approach. Data scientists mainly attempt to find advanced ways to analyze data with high computational efficiency and low time complexity, leading to efficient data analytics. As the amount of generated/measured/sensed data from various sources increases, the analysis, manipulation and illustration of data grow exponentially. Due to large-scale data sets, the curse of dimensionality (CoD) is one of the NP-hard problems in data science. Hence, several efforts have focused on leveraging evolutionary algorithms (EAs) to address complex issues in large-scale data analytics problems. Dimension reduction, together with EAs, lends itself to solving the CoD and tackling complex problems efficiently in terms of time complexity. In this chapter, we first provide a brief overview of previous studies that focused on solving the CoD using feature extraction optimization. We then discuss practical examples of research studies that have successfully tackled application domains such as image processing, sentiment analysis, network traffic/anomaly analysis, credit score analysis, and other benchmark functions/data sets.
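    The feature selection optimization loop described above can be sketched with a minimal genetic algorithm over bitmask chromosomes. The fitness function here is synthetic (it rewards two assumed "informative" features and penalizes subset size) and stands in for a real model-accuracy evaluation:

    ```python
    # Minimal GA feature-selection sketch (illustrative, not the chapter's method).
    import random

    RELEVANT = {0, 2}        # assumed ground-truth informative features
    N_FEATURES = 8

    def fitness(mask):
        hits = sum(1 for i in RELEVANT if mask[i])
        return hits - 0.1 * sum(mask)    # reward relevance, penalize subset size

    def ga_select(pop_size=20, gens=40, seed=1):
        rng = random.Random(seed)
        pop = [[rng.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=fitness, reverse=True)
            elite = pop[: pop_size // 2]          # keep the best half
            children = []
            while len(elite) + len(children) < pop_size:
                p1, p2 = rng.sample(elite, 2)
                cut = rng.randrange(1, N_FEATURES)
                child = p1[:cut] + p2[cut:]       # one-point crossover
                if rng.random() < 0.2:            # bit-flip mutation
                    j = rng.randrange(N_FEATURES)
                    child[j] ^= 1
                children.append(child)
            pop = elite + children
        return max(pop, key=fitness)

    best = ga_select()
    ```

    The iterative sort-select-recombine loop is the convergence process the text refers to; swapping `fitness` for a cross-validated classifier score turns the sketch into a practical wrapper method.
    
    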

    Evolutionary Computation, Optimization and Learning Algorithms for Data Science

    A large number of engineering, science and computational problems have yet to be solved in a computationally efficient way. One of the emerging challenges is how evolving technologies grow towards autonomy and intelligent decision making. This leads to the collection of large amounts of data from various sensing and measurement technologies, e.g., cameras, smart phones, health sensors, smart electricity meters, and environment sensors. Hence, it is imperative to develop efficient algorithms for the generation, analysis, classification, and illustration of data. Meanwhile, data is structured purposefully through different representations, such as large-scale networks and graphs. We focus on data science as a crucial area, specifically on the curse of dimensionality (CoD), which arises from the large amount of generated/sensed/collected data. This motivates researchers to think about optimization and to apply nature-inspired algorithms, such as evolutionary algorithms (EAs), to solve optimization problems. Although these algorithms look non-deterministic, they are robust enough to reach an optimal solution. Researchers typically adopt evolutionary algorithms only when a problem is prone to getting trapped in a local optimum rather than reaching the global optimum. In this chapter, we first develop a clear and formal definition of the CoD problem, next we focus on feature extraction techniques and categories, and then we provide a general overview of meta-heuristic algorithms, their terminology, and the desirable properties of evolutionary algorithms.
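    One concrete symptom of the CoD defined in the chapter is distance concentration: as dimensionality grows, the nearest and farthest neighbours of a random query point become nearly equidistant, which degrades distance-based learning. A quick numerical illustration (synthetic uniform data, invented sizes):

    ```python
    # Numerical illustration of distance concentration in high dimensions.
    # Relative contrast (max - min) / min of distances shrinks as dim grows.
    import math
    import random

    def distance_spread(dim, n_points=200, seed=42):
        rng = random.Random(seed)
        pts = [[rng.random() for _ in range(dim)] for _ in range(n_points)]
        q = [rng.random() for _ in range(dim)]       # random query point
        d = [math.dist(q, p) for p in pts]           # Euclidean distances
        return (max(d) - min(d)) / min(d)            # relative contrast

    low = distance_spread(2)      # low-dimensional: large contrast
    high = distance_spread(500)   # high-dimensional: contrast collapses
    ```

    The collapse of `high` relative to `low` is exactly why dimension reduction is applied before distance-based analysis on large-scale data.
    
    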

    Low potency toxins reveal dense interaction networks in metabolism

    Background The chemicals of metabolism are constructed from a small set of atoms and bonds. This may be because chemical structures outside the chemical space in which life operates are incompatible with biochemistry, or because mechanisms to make or utilize such excluded structures have not evolved. In this paper I address the extent to which biochemistry is restricted to a small fraction of the chemical space of possible chemicals, a restricted subset that I call Biochemical Space. I explore evidence that this restriction is at least in part due to selection against specific structures, and suggest a mechanism by which this occurs. Results Chemicals that contain structures that are outside Biochemical Space (UnBiological groups) are more likely to be toxic to a wide range of organisms, even though they have no specifically toxic groups and no obvious mechanism of toxicity. This correlation of UnBiological groups with toxicity is stronger for low-potency (millimolar) toxins. I relate this to the observation that most chemicals interact with many biological structures at low millimolar concentrations. I hypothesise that life has to select its components not only to have a specific set of functions but also to avoid interactions with all the other components of life that might degrade their function. Conclusions The chemistry of life has to form a dense, self-consistent network of chemical structures, and cannot easily be arbitrarily extended. The toxicity of arbitrary chemicals reflects the disruption to that network occasioned by trying to insert a chemical into it without also selecting all the other components to tolerate that chemical. This suggests new ways to test for the toxicity of chemicals, and implies that engineering organisms to make high concentrations of materials such as chemical precursors or fuels may require more substantial engineering than just of the synthetic pathways involved.

    Global, regional, and national incidence, prevalence, and mortality of HIV, 1980–2017, and forecasts to 2030, for 195 countries and territories: a systematic analysis for the Global Burden of Diseases, Injuries, and Risk Factors Study 2017

    Background Understanding the patterns of HIV/AIDS epidemics is crucial to tracking and monitoring the progress of prevention and control efforts in countries. We provide a comprehensive assessment of the levels and trends of HIV/AIDS incidence, prevalence, mortality, and coverage of antiretroviral therapy (ART) for 1980–2017 and forecast these estimates to 2030 for 195 countries and territories. Methods We determined a modelling strategy for each country on the basis of the availability and quality of data. For countries and territories with data from population-based seroprevalence surveys or antenatal care clinics, we estimated prevalence and incidence using an open-source version of the Estimation and Projection Package—a natural history model originally developed by the UNAIDS Reference Group on Estimates, Modelling, and Projections. For countries with cause-specific vital registration data, we corrected data for garbage coding (ie, deaths coded to an intermediate, immediate, or poorly defined cause) and HIV misclassification. We developed a process of cohort incidence bias adjustment to use information on survival and deaths recorded in vital registration to back-calculate HIV incidence. For countries without any representative data on HIV, we produced incidence estimates by pulling information from observed bias in the geographical region. We used a re-coded version of the Spectrum model (a cohort component model that uses rates of disease progression and HIV mortality on and off ART) to produce age-sex-specific incidence, prevalence, and mortality, and treatment coverage results for all countries, and forecast these measures to 2030 using Spectrum with inputs that were extended on the basis of past trends in treatment scale-up and new infections. Findings Global HIV mortality peaked in 2006 with 1·95 million deaths (95% uncertainty interval 1·87–2·04) and has since decreased to 0·95 million deaths (0·91–1·01) in 2017. 
    New cases of HIV globally peaked in 1999 (3·16 million, 2·79–3·67) and since then have gradually decreased to 1·94 million (1·63–2·29) in 2017. These trends, along with ART scale-up, have globally resulted in increased prevalence, with 36·8 million (34·8–39·2) people living with HIV in 2017. Prevalence of HIV was highest in southern sub-Saharan Africa in 2017, and countries in the region had ART coverage ranging from 65·7% in Lesotho to 85·7% in eSwatini. Our forecasts showed that 54 countries will meet the UNAIDS target of 81% ART coverage by 2020 and 12 countries are on track to meet 90% ART coverage by 2030. Forecasted results estimate that few countries will meet the UNAIDS 2020 and 2030 mortality and incidence targets. Interpretation Despite progress in reducing HIV-related mortality over the past decade, slow decreases in incidence, combined with the current context of stagnated funding for related interventions, mean that many countries are not on track to reach the 2020 and 2030 global targets for reduction in incidence and mortality. With a growing population of people living with HIV, the disease will continue to be a major threat to public health for years to come. The pace of progress needs to be hastened by continuing to expand access to ART and increasing investments in proven HIV prevention initiatives that can be scaled up to have population-level impact.
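    The interaction of incidence, mortality and ART scale-up that drives rising prevalence can be sketched with a toy annual cohort balance. Only the starting prevalence (36.8 million) and yearly incidence (about 1.9 million) come from the abstract; the baseline mortality rate, ART coverage and ART mortality reduction below are invented for illustration and are far simpler than the Spectrum model:

    ```python
    # Toy cohort-balance sketch (not the GBD/Spectrum model): each year,
    # new infections are added and deaths removed, with ART reducing mortality.
    def project(prev, years, incidence, base_mort, art_cov, art_effect=0.7):
        """prev, incidence in millions; rates are assumed illustrative values."""
        series = [prev]
        for _ in range(years):
            deaths = prev * base_mort * (1 - art_effect * art_cov)
            prev = prev + incidence - deaths
            series.append(prev)
        return series

    # 2017 -> 2030 under an assumed 80% ART coverage vs. no ART.
    with_art = project(36.8, 13, incidence=1.9, base_mort=0.05, art_cov=0.8)
    no_art = project(36.8, 13, incidence=1.9, base_mort=0.05, art_cov=0.0)
    ```

    Even this crude balance reproduces the qualitative point of the abstract: effective treatment raises the number of people living with HIV, so prevalence grows precisely because mortality falls.
    
    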

    Approaches in biotechnological applications of natural polymers

    Natural polymers, such as gums and mucilages, are biocompatible, cheap, easily available and non-toxic materials of native origin. These polymers are increasingly preferred over synthetic materials for industrial applications due to their intrinsic properties, and they are also considered alternative sources of raw materials since they offer sustainability, biodegradability and biosafety. By definition, gums and mucilages are polysaccharides or complex carbohydrates consisting of one or more monosaccharides or their derivatives linked in a bewildering variety of linkages and structures. Natural gums occur naturally in a variety of plant seeds and exudates, tree or shrub exudates, seaweed extracts, fungi, bacteria, and animal sources. Water-soluble gums, also known as hydrocolloids, are exudates and pathological products; therefore, they do not form part of the cell wall. Mucilages, on the other hand, are physiological products and form part of the cell. It is important to highlight that gums represent the largest amount of polymer material derived from plants. Gums have broad applications in both the food and non-food industries, being commonly used as thickening, binding, emulsifying, suspending and stabilizing agents, and as matrices for drug release in the pharmaceutical and cosmetic industries. In the food industry, their gelling properties and their ability to form edible films and coatings are extensively studied. The use of gums depends on the intrinsic properties they provide, often at costs below those of synthetic polymers. To upgrade the value of gums, they are being processed into various forms, including the most recent nanomaterials, for various biotechnological applications.
    Thus, the main natural polymers, including galactomannans, cellulose, chitin, agar, carrageenan, alginate, cashew gum, pectin and starch, as well as current research on them, are reviewed in this article. Acknowledgements: To the Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq) for fellowships (LCBBC and MGCC) and to the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) (PBSA). This study was supported by the Portuguese Foundation for Science and Technology (FCT) under the scope of the strategic funding of the UID/BIO/04469/2013 unit, the project RECI/BBB-EBI/0179/2012 (FCOMP-01-0124-FEDER-027462) and COMPETE 2020 (POCI-01-0145-FEDER-006684) (JAT).

    Treatment of atypical central neurocytoma in a child with high dose chemotherapy and autologous stem cell rescue

    The authors describe a 9-month-old female with recurrent atypical central neurocytoma and leptomeningeal spread treated with high-dose chemotherapy, autologous stem cell rescue, and adjuvant therapy. She had a complete response to therapy and was disease-free at 4 years of age, until a recurrence 6 months later. Intensive chemotherapy followed by autologous stem cell rescue may be considered as an adjunct to surgical therapy in young patients with atypical neurocytoma not amenable to radiation therapy.

    Petrographical and geochemical evidences for paragenetic sequence interpretation of diagenesis in mixed siliciclastic–carbonate sediments: Mozduran Formation (Upper Jurassic), south of Agh-Darband, NE Iran

    The Upper Jurassic Mozduran Formation, with a thickness of 420 m at the type locality, is the most important gas-bearing reservoir in NE Iran. It is mainly composed of limestone and dolostone with shale and gypsum interbeds that grade into coarser siliciclastics in the easternmost part of the basin. Eight stratigraphic sections were studied in detail south of the Agh-Darband area. These analyses suggest that four carbonate facies associations and three siliciclastic lithofacies were deposited in shallow marine to shoreline environments. Cementation, compaction, dissolution, micritization, neomorphism, hematitization, dolomitization and fracturing are the diagenetic processes that affected these sediments. Stable isotope variations of δ18O and δ13C in the carbonate rocks show two different trends. High depletion of δ18O with low variation of δ13C probably reflects increasing temperatures during burial diagenesis, while higher depletion in carbon isotope values with low variation in oxygen isotopes is related to fresh-water flushing during meteoric diagenesis. Negative carbon isotope values may also have resulted from organic matter alteration during penetration of meteoric water. Fe and Mn enrichment together with depletion of δ18O also supports the contention that the alteration associated with higher depletion in carbon isotope values took place during meteoric diagenesis. The presence of bright luminescence indicates reducing conditions during precipitation of the calcite cement.