2,848 research outputs found

    Introducing Australia’s first hybrid testing facility for performance-based assessment of structures

    Hybrid simulation is a cost-effective cyber-physical testing technique in which computational models and physical components are integrated at run-time. The method can be viewed as a conventional finite element analysis in which physical specimens of some portions of the structure are embedded in the numerical model. In this way, the errors arising from simplified theoretical modelling of complex nonlinear structures or subassemblies can be effectively mitigated, since those portions are tested physically in the lab. This paper introduces Australia’s first hybrid testing facility, referred to as the Multi-Axis Substructure Testing (MAST) system, which is capable of simulating the complex three-dimensional time-varying boundary effects on large-scale structural components. The MAST system is unique in Australasia and is capable of serving the research community and practice, nationally and internationally. An application of the MAST system to investigate the performance of a CFRP-repaired limited-ductile RC column under sequential ground motions, from the linear-elastic response range through collapse, is also presented.
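The coupling the abstract describes — a numerical time-stepping model that queries a physical specimen for its restoring force at each step — can be sketched as a simple loop. This is a minimal illustrative sketch, not the MAST system's actual control software; the softening-spring stand-in for the specimen, the mass, damping and ground-motion values are all assumptions.

```python
import numpy as np

def measure_restoring_force(displacement):
    """Stand-in for the physical specimen in the lab: a softening spring.
    (In a real hybrid test this value comes from load cells, not a formula.)"""
    k0 = 2.0e6  # assumed initial stiffness, N/m
    return k0 * displacement / (1.0 + abs(displacement) / 0.05)

def hybrid_simulation(ground_accel, dt, mass=1.0e4, damping=2.0e4):
    """Integrate the equation of motion, substituting the measured
    restoring force for the specimen's analytical model each step."""
    n = len(ground_accel)
    u = np.zeros(n)  # displacement command history
    v = 0.0          # velocity
    for i in range(n - 1):
        f_r = measure_restoring_force(u[i])   # "measured" in the lab
        a = (-mass * ground_accel[i] - damping * v - f_r) / mass
        v += a * dt                           # semi-implicit Euler step
        u[i + 1] = u[i] + v * dt              # next actuator target
    return u

# Assumed 1 Hz sinusoidal ground acceleration, 5 s at 100 Hz sampling
accel = 0.5 * np.sin(2 * np.pi * 1.0 * np.arange(0, 5, 0.01))
disp = hybrid_simulation(accel, dt=0.01)
print(f"peak displacement command: {float(max(abs(disp))):.4f} m")
```

In a real facility the `measure_restoring_force` call is replaced by commanding the actuators to the target displacement and reading the specimen's reaction force back from the load cells.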

    Pathways to diversification

    A fundamental research question in regional economic development is why some regions are able to diversify into new products and industries while others continue to face challenges in diversification. This doctoral research explores the different pathways to diversification. It follows the three-stage modular structure of the DBA at Cranfield School of Management. The thesis consists of a systematic literature review, a single qualitative case study on the UAE, and a research synthesis of published cases on Singapore, Norway and the UAE. The linking document summarizes the three projects and consolidates findings and contributions into a path creation model that provides new understanding of the pathways to regional diversification. The research integrates existing theoretical foundations of evolutionary economic geography, institutional economic geography, path dependence, industry relatedness, economic complexity, and path creation into a unified conceptual path creation model. It generates propositions, builds a framework and develops a matrix for path creation that integrate the context, actors, factors, mechanisms and outcomes shaping regional diversification. It finds that, in the context of path dependence and the existing conditions of a region, economic actors undertake strategic measures to influence institutional capabilities to accumulate knowledge and trigger indigenous creation, anchoring, branching, and clustering diversification mechanisms, creating complex varieties of related and unrelated diversification outcomes. Institutional collaboration capabilities are found to be instrumental in accumulating knowledge and in determining the relatedness and complexity of diversification outcomes. The research further provides a set of integrated platform strategies to guide policy-makers in setting up pathways to regional diversification.

    Effects of Two Types of Statins on the Lipid Profile of Ovariectomized Female Rats

    Abstract Statins are among the most widely used drugs for reducing cholesterol and cholesterol-related lipids and for preventing vascular heart disease, acting through inhibition of HMG-CoA reductase. Statins are classified into two types: hydrophilic and lipophilic. The current study was carried out to clarify whether simvastatin or rosuvastatin better improves the serum lipid profile of ovariectomized female rats, which serve as a model for postmenopausal women with hyperlipidemia, since menopause is a physiological stage characterized by estrogen deficiency and loss of ovarian function and is associated with metabolic diseases such as type-2 diabetes mellitus, disturbed lipid metabolism, metabolic syndrome and an increased risk of heart disease. Twenty-four adult female rats, aged 2.5-3 months and weighing 220-250 g, were divided into four groups (6/group): a control (sham) group without ovariectomy, an ovariectomized (ovx) group, ovariectomized (ovx) rats administered rosuvastatin at 20 mg/kg/day orally, and ovariectomized (ovx) rats administered simvastatin at 20 mg/kg/day orally. After the 60-day experimental period, blood samples were drawn to estimate the lipid profile. The results showed that ovariectomy caused a significant elevation in total cholesterol, triglycerides, low-density lipoprotein cholesterol (LDL-C), very-low-density lipoprotein cholesterol (VLDL-C) and non-HDL-C, and a decrease in high-density lipoprotein cholesterol (HDL-C). Treatment with rosuvastatin caused a greater reduction in serum total cholesterol and LDL-C than simvastatin; however, simvastatin produced a greater reduction in serum triglycerides, VLDL-C and non-HDL-C. We conclude that simvastatin and rosuvastatin differ in their effects on the lipid profile and that rosuvastatin is more effective at reducing total cholesterol and LDL-C.

    How high does Paper Mario have to jump to match the strength of his regular counterpart?

    The video game superstar Mario is well known for his jumping ability. In the spin-off game, Paper Mario is similarly well known but is physically made of paper. This paper explores the difference in impact force between regular Mario and Paper Mario and calculates the jump height Paper Mario would need to attain in order to carry the same impact force as regular Mario. To do this, Paper Mario is assumed to be a rectangular sheet of paper of the same height as regular Mario, but much less dense. Having calculated the impact force of regular Mario to be 17.3 kN, it was found that, in order to match this force, Paper Mario would need to attain a height of 47.6 m. As a result, while it is possible for Paper Mario to match Mario in damage, it is unrealistic that he would be able to do so. He can, however, jump multiple times on enemies, which would increase his damage output.
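The impulse argument behind the abstract can be reproduced as a back-of-the-envelope sketch. It assumes an impact-force model F = m·√(2gh)/Δt for a fall from height h; the masses, jump height and contact time below are illustrative guesses, not the paper's actual inputs, so the printed numbers will not match the 17.3 kN and 47.6 m quoted above.

```python
import math

G = 9.81             # m/s^2
CONTACT_TIME = 0.1   # s, assumed collision duration

def impact_force(mass, height, dt=CONTACT_TIME):
    """Impulse model: F = m * v / dt with impact speed v = sqrt(2*g*h)."""
    return mass * math.sqrt(2 * G * height) / dt

def height_to_match(force, mass, dt=CONTACT_TIME):
    """Invert the model: h = (F*dt/m)^2 / (2*g)."""
    return (force * dt / mass) ** 2 / (2 * G)

mario_mass = 90.0   # kg, assumed
paper_mass = 1.0    # kg, assumed (same height, far less dense)
jump_height = 2.0   # m, assumed regular jump height

f_mario = impact_force(mario_mass, jump_height)
h_paper = height_to_match(f_mario, paper_mass)
print(f"Mario's impact force: {f_mario / 1000:.1f} kN")
print(f"Paper Mario must fall from: {h_paper:.1f} m")
```

Because the required height scales with the square of the mass ratio, a much lighter Paper Mario needs a dramatically higher jump, which is the paper's central point.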

    Modulation of inflammatory process and tissue regeneration in calvaria mouse models

    MicroRNAs (miRNAs) are short, non-coding RNAs involved in the regulation of several processes associated with inflammatory diseases and infection. Bacterial infection modulates miRNA expression to subvert the innate immune response. In this study, we analyzed bacterial modulation of miRNAs in bone-marrow-derived macrophages (BMMs), in which activity was induced by infection with Porphyromonas gingivalis (Pg), through a microarray analysis. Several miRNA expression levels were modulated 3 hours post infection (at a multiplicity of infection (MOI) of 25). A bioinformatics analysis was performed to further identify innate immune host-response pathways under the influence of the selected miRNAs. To assess the effects of the identified miRNAs on cytokine secretion (pro-inflammatory TNF-α and anti-inflammatory IL-10), BMMs were transfected with mimics or inhibitors of the selected miRNAs. Transfection with mmu-miR-155 and mmu-miR-2137 did not modify TNF-α secretion, while their inhibitors increased it. Inhibitors of mmu-miR-2137 and mmu-miR-7674 increased the secretion of the anti-inflammatory IL-10. In Pg-infected BMMs, mmu-miR-155-5p significantly decreased TNF-α secretion, while the inhibitor of mmu-miR-2137 increased IL-10 secretion. In vivo, in a Pg-induced calvarial bone resorption mouse model, injection of mmu-miR-155-5p or anti-mmu-miR-2137 significantly reduced the size of the lesion. Furthermore, anti-mmu-miR-2137 significantly reduced inflammatory-cell infiltration, osteoclast activity and bone loss. Bioinformatics analysis demonstrated that cytokine- and chemokine-related pathways, but also osteoclast differentiation, may be involved in the observed effects. The study highlights the potential therapeutic merits of targeting mmu-miR-155-5p and mmu-miR-2137 to control inflammation induced by Pg infection.
To assess the regenerative process in the same animal model, we aimed to compare the effects of Bone Morphogenetic Protein 2 (BMP2), Platelet-Rich Plasma (PRP), Leukocyte-Platelet-Rich Fibrin (L-PRF) and polyglucosamine (pGlcNAc) on bone formation in critical-size bone defects in mice. One hundred and thirty-eight mice were divided into 23 groups (n=6): a negative control and different combinations of pGlcNAc with or without BMP2, collagen sponge (SurgiFoam), PRP and L-PRF. The 5 mm defect was then allowed to heal. After six weeks, samples were analyzed for bone formation using radiographs, H&E staining and alkaline phosphatase staining. Our results show that BMP2 was able to produce 90-95% healing of critical-size defects after six weeks, histologically and radiographically. However, SurgiFoam, PRP and L-PRF with or without pGlcNAc were able to close only 60% of the original defect. This study supports that BMP2 is more effective for bone regeneration than SurgiFoam, PRP, L-PRF and pGlcNAc.

    Applying data mining techniques over big data

    Thesis (M.S.C.S.) PLEASE NOTE: Boston University Libraries did not receive an Authorization To Manage form for this thesis or dissertation. It is therefore not openly accessible, though it may be available by request. If you are the author or principal advisor of this work and would like to request open access for it, please contact us at [email protected]. Thank you.
The rapid development of information technology in recent decades means that data appear in a wide variety of formats — sensor data, tweets, photographs, raw data, and unstructured data. Statistics show that there were 800,000 petabytes stored in the world in 2000. Today’s internet holds about 0.1 zettabytes of data (a ZB is about 10^21 bytes), and this number will reach 35 ZB by 2020. With such an overwhelming flood of information, present data management systems are not able to scale to this huge amount of raw, unstructured data — in today’s parlance, Big Data. In the present study, we show the basic concepts and design of Big Data tools, algorithms, and techniques. We compare classical data mining algorithms to Big Data algorithms, using Hadoop/MapReduce as a core implementation of Big Data for scalable algorithms. We implemented the K-means algorithm and the A-priori algorithm with Hadoop/MapReduce on a 5-node Hadoop cluster. We explore NoSQL databases for semi-structured, massively large-scale data, using MongoDB as an example. Finally, we compare the performance of HDFS (Hadoop Distributed File System) and MongoDB data storage for these two algorithms.
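The MapReduce formulation of K-means mentioned in the abstract can be sketched in plain Python: the map phase assigns each point to its nearest centroid, the shuffle groups points by centroid id, and the reduce phase averages each group into a new centroid. This is an illustrative single-machine stand-in for a Hadoop job, not the thesis's actual code.

```python
from collections import defaultdict
import math

def mapper(point, centroids):
    """Map phase: emit (nearest-centroid-id, point)."""
    best = min(range(len(centroids)),
               key=lambda i: math.dist(point, centroids[i]))
    return best, point

def reducer(cluster_points):
    """Reduce phase: average the points assigned to one centroid."""
    n = len(cluster_points)
    return tuple(sum(coord) / n for coord in zip(*cluster_points))

def kmeans_iteration(points, centroids):
    groups = defaultdict(list)
    for p in points:                  # shuffle: group values by key
        key, value = mapper(p, centroids)
        groups[key].append(value)
    return [reducer(pts) for _, pts in sorted(groups.items())]

points = [(0, 0), (0, 1), (10, 10), (10, 11)]
centroids = [(0, 0), (10, 10)]
print(kmeans_iteration(points, centroids))  # -> [(0.0, 0.5), (10.0, 10.5)]
```

On a real cluster the same two functions become the Mapper and Reducer classes of a Hadoop job, and the driver re-runs the job until the centroids stop moving.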

    The Effect of Pulse Duration, Laser Energy, and Gas Pressure on High Harmonic Generation Efficiency and Spectral Characteristics

    This Bachelor’s thesis investigates the roles of pulse duration, laser energy, and gas pressure in the efficiency and colour of the High Harmonic Generation (HHG) process. HHG is a nonlinear optical phenomenon that generates extreme ultraviolet (XUV) radiation when an intense laser beam interacts with a material. XUV radiation is characterized by high photon energies and short wavelengths, making it useful for a wide range of applications in science and technology, such as imaging, spectroscopy, and microscopy. The study is conducted using a laser system that generates femtosecond laser pulses, which are focused onto a gas target to generate XUV radiation. The energy and spectrum of the generated XUV radiation are analyzed for different pulse durations, laser energies, and gas pressures, as well as for gas cells of different lengths and gap diameters. The results show that changing the pulse duration and laser energy has a significant effect on the energy of the XUV photons and the photon flux, while the gas pressure has a minor effect. Moreover, the use of gas cells of different lengths and gap diameters also affected the energy of the XUV photons and the photon flux. The findings of this study have important implications for the design and optimization of HHG systems for various applications, including the development of more efficient and versatile XUV sources that can contribute to the advancement of various fields of science and technology.
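The dependence on laser parameters reported here is consistent with the standard three-step model of HHG, in which the highest photon energy follows the cutoff law E_cut = Ip + 3.17·Up, where the ponderomotive energy Up scales with intensity times wavelength squared. The sketch below uses assumed beam parameters for illustration, not the thesis's experimental values.

```python
# Cutoff law of the three-step model: E_cut = Ip + 3.17 * Up.
# Pulse energy and duration enter through the focused intensity I.

def ponderomotive_energy_eV(intensity_W_cm2, wavelength_um):
    """Up [eV] ~= 9.33e-14 * I [W/cm^2] * lambda^2 [um^2]."""
    return 9.33e-14 * intensity_W_cm2 * wavelength_um ** 2

def cutoff_energy_eV(intensity_W_cm2, wavelength_um, ip_eV):
    """Highest XUV photon energy predicted by the three-step model."""
    up = ponderomotive_energy_eV(intensity_W_cm2, wavelength_um)
    return ip_eV + 3.17 * up

# Assumed example: 800 nm drive at 2e14 W/cm^2 in argon (Ip = 15.76 eV)
e_cut = cutoff_energy_eV(2e14, 0.8, 15.76)
print(f"predicted cutoff photon energy: {e_cut:.1f} eV")
```

Shortening the pulse at fixed energy raises the peak intensity and hence the cutoff, matching the trend the thesis reports for pulse duration and laser energy; gas pressure instead affects phase matching, which this single-atom formula does not capture.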

    Applying Data Mining Techniques Over Big Data

    With the rapid development of information technology, data flow in a wide variety of formats - sensor data, tweets, photos, raw data, and unstructured data. Statistics show that there were 800,000 petabytes stored in the world in 2000. Today’s Internet holds about 1.8 zettabytes (a zettabyte is 10^21 bytes), and this number will reach 35 zettabytes by 2020. Consequently, data management systems are not able to scale to this huge amount of raw, unstructured data, which is what is today called big data. In the present study, we show the basic concepts and design of big data tools, algorithms and techniques. We compare classical data mining algorithms with big data algorithms, using Hadoop/MapReduce as the core big data implementation for scalable algorithms. We implemented the K-means and A-priori algorithms using Hadoop/MapReduce on a 5-node Hadoop cluster and show their performance for gigabytes of data. Finally, we explore NoSQL (Not Only SQL) databases for semi-structured, massively large-scale data, using MongoDB as an example. Then, we show the performance of HDFS (Hadoop Distributed File System) and MongoDB data stores for these two algorithms. This research work is part of a full scholarship fund for a Master’s degree through the Ministry of Higher Education and Scientific Research (MOHESR), Republic of Iraq (Fund 17004).
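The A-priori algorithm mentioned in the abstract also decomposes naturally into MapReduce phases: the map phase emits candidate itemsets from each transaction, and the reduce phase sums the counts and keeps those meeting minimum support. The sketch below is a plain-Python stand-in for one such pass over a toy dataset, not the thesis's actual Hadoop job.

```python
from collections import Counter
from itertools import combinations

def mapper(transaction, k):
    """Map phase: emit (itemset, 1) for every k-item candidate."""
    return [(frozenset(c), 1) for c in combinations(sorted(transaction), k)]

def reducer(emitted, min_support):
    """Reduce phase: sum counts per itemset, keep frequent ones."""
    counts = Counter()
    for itemset, one in emitted:
        counts[itemset] += one
    return {s: n for s, n in counts.items() if n >= min_support}

transactions = [{"bread", "milk"}, {"bread", "butter"},
                {"bread", "milk", "butter"}, {"milk", "butter"}]
emitted = [pair for t in transactions for pair in mapper(t, 2)]
frequent_pairs = reducer(emitted, min_support=2)
print(frequent_pairs)  # each of the three possible pairs occurs twice
```

A full A-priori run repeats this map/reduce pass for k = 1, 2, 3, …, pruning candidates whose subsets were not frequent in the previous pass; on Hadoop each pass is one job over the transaction file in HDFS.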

    Assessment of the Effectiveness of Available Pollution Control Technologies in Reducing Phenol Level at ENOC Refinery Wastewater Treatment Plant

    Phenols are present in the discharge effluents of many heavy industries, such as refineries, and are regulated and monitored by environmental authorities. Emirates National Oil Company (ENOC) processes condensate oil and produces wastewater that is treated at the ENOC Processing Company's (EPCL-refinery) wastewater treatment plant (ETP) using physical, chemical and biological treatment processes. This study provides an in-depth description of the unit processes employed to treat the different waste streams generated at the EPCL refinery and identifies the different types of phenols formed in these processes. Characterization of the phenol level at the EPCL-ETP and assessment of the effectiveness of the employed pollution control technologies in reducing it have been conducted in this study. In addition, the study explores the potential correlation between the phenol level and other water quality parameters in the treated wastewater. A thorough review of the adverse effects of phenols on the receiving environment is also provided. It is well documented that phenol and its derivatives pose a danger to humans and marine life through their varying toxicities; such toxicity depends on the solubility and persistence of the phenolic compounds in the system. The literature reveals that the sources and formation of phenol and its derivatives involve many complex reactions. It was found that the main sources of phenol in the waste streams received at the ENOC-ETP are the crude oil tank wastewater drain (average 65.34 mg/l), the desalter effluent (average 0.95 mg/l) and the neutralized spent caustic (average 180 mg/l) waste streams. However, there are large fluctuations from the average phenol level within each waste stream. 
Also, the levels of phenol and its derivatives vary significantly between these streams, with phenol, m-/p-cresols, o-cresol, tri- and tetra-chlorophenols and, to a lesser extent, 4-chloro-3-cresol common among them. Based on the average concentrations of the samples collected during this study and on the average production of these streams, typical phenol loadings from the tank wastewater drain, neutralized spent caustic and desalter wastewaters are 69.3 kg, 1497.8 kg and 131.5 kg per year, respectively. Other waste streams have also been tested for phenol, but their contribution to the phenol loading was found to be insignificant owing either to low flow rates or to low phenol levels. The processes employed in the treatment plant vary from semi-continuous flow processes (i.e. CPI, IGF) to a completely mixed batch reactor followed by continuous-flow packed-bed reactors. The study shows variations in water quality parameters at the early stages of treatment, but the waste stream is homogenized in the SBR and a more or less uniform treated waste stream enters the sand filter and the carbon bed. The basic odour, colour and clarity of the effluent improve significantly through the stages of treatment. The study further showed that the most effective process employed in the reduction of phenols within the plant is the Sequencing Batch Reactor (SBR). This is also true for the organic loading, which is reduced mainly by the SBR and to a lesser extent by the carbon reactor. The reduction of sulphides, like that of phenols, is almost entirely dependent on the performance of the SBR units. Based on the analysis of five years (2000-2004) of daily data, it was found that the correlations between the phenol level in the discharged treated effluent and the levels of COD, BOD5 and sulphides are weak, although statistically significant in most cases. 
The relationship between BOD5 and COD in the final effluent has been established with a ratio (COD:BOD5) of 1.6 to 1 and an average non-degradable COD of about 55 mg/l. However, the relationship is again weak owing to the scatter of the data, as reflected by the low value of the coefficient of determination. Recommendations are given, mainly focusing on further studies such as the characterization of waste streams at different condensate oil refineries, ETP performance, and process optimization. In addition, further investigation is needed to optimize SBR operations by characterizing and enriching microorganisms that specifically degrade phenol. It is also recommended to study the ETP's performance in the removal of other aromatic hydrocarbons.
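The per-year loading figures quoted above follow from the standard mass-balance arithmetic load [kg/yr] = concentration [mg/l] × flow [m³/yr] ÷ 1000, since 1 mg/l equals 1 g/m³. The sketch below uses the average concentrations quoted in the abstract but entirely hypothetical flow rates, because the actual stream flows are not given here.

```python
def annual_load_kg(conc_mg_per_l, flow_m3_per_yr):
    """1 mg/l == 1 g/m^3, so mg/l * m^3 gives grams; /1000 converts to kg."""
    return conc_mg_per_l * flow_m3_per_yr / 1000.0

# Average concentrations from the study; flows are assumed placeholders.
streams = {
    "tank wastewater drain":    (65.34, 1000.0),
    "neutralized spent caustic": (180.0, 8000.0),
    "desalter effluent":         (0.95, 140000.0),
}
for name, (conc, flow) in streams.items():
    print(f"{name}: {annual_load_kg(conc, flow):.1f} kg/yr")
```

The arithmetic shows why a dilute but high-volume stream (the desalter effluent) can contribute more annual phenol mass than a concentrated low-volume one.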