
    Direct sampling of measured or model data to improve uncertainty analysis in Life Cycle Assessment

    Standard practice in Life Cycle Assessment (LCA) is to assume that each uncertain parameter behaves independently under Monte Carlo sampling. This leads to clearly incorrect cases, such as engine CO2 emissions being sampled independently of fuel efficiency, or different providers into a market (such as an electricity market) increasing or decreasing without any regard to the behaviour of other providers. We have developed an open-source toolkit (https://github.com/PascalLesage/brightway2-presamples) that can solve these and other problems through the direct use of measured or pre-computed data and Monte Carlo samples. We demonstrate how this toolkit provides a number of novel features for uncertainty and sensitivity assessment in LCA:
    - Monte Carlo samples can be saved and transferred between computers, allowing for perfect reproducibility.
    - Pre-generated static or stochastic values can be produced by complex, non-linear models, capturing system dynamics more accurately.
    - Pre-sampled Monte Carlo values can capture correlations between parameters, such as between characterization factors, or between inputs and outputs (e.g. fuel use and CO2 emissions).
    - Direct use of population data avoids losses or inaccuracies introduced by fitting data to distributions.
    We also introduce and demonstrate the idea of "campaigns", an organizational tool for sets of pre-sampled data that allows for quick system variation, guided data acquisition, and prospective LCA.
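The coupling problem described above can be illustrated with a minimal NumPy sketch (not the presamples toolkit's own API): CO2 emissions derived from the same fuel-use draws behave like a pre-computed sample array, while independent draws lose the correlation. All distribution parameters here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)  # fixed seed for reproducibility

# Illustrative parameters: fuel use (MJ/km) and an emission factor
# (kg CO2/MJ) are physically coupled in reality.
n = 10_000
fuel_use = rng.normal(loc=3.0, scale=0.3, size=n)             # MJ/km
emission_factor = rng.normal(loc=0.073, scale=0.002, size=n)  # kg CO2/MJ

# "Presampled" CO2: derived directly from the sampled fuel use,
# so the pair (fuel, CO2) carries the physical correlation.
co2_correlated = fuel_use * emission_factor

# Naive independent sampling of CO2 with the same marginal
# distribution discards the coupling entirely.
co2_independent = rng.normal(loc=co2_correlated.mean(),
                             scale=co2_correlated.std(), size=n)

corr = np.corrcoef(fuel_use, co2_correlated)[0, 1]
print(f"correlation with presampled pairs:   {corr:.2f}")
print(f"correlation with independent draws: "
      f"{np.corrcoef(fuel_use, co2_independent)[0, 1]:.2f}")
```

Feeding such pre-generated arrays into the LCA matrices, rather than re-drawing each parameter marginally, is the core idea the toolkit generalizes.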

    Implementation of Multipath and Multiple Description Coding in OLSR

    In this paper we discuss the application and implementation of multipath routing and a multiple description coding (MDC) extension of OLSR, called MP-OLSR. It is based on the link-state algorithm and employs periodic exchange of messages to maintain topology information of the network. At the same time, it updates the routing table in an on-demand scheme and forwards packets along multiple paths that have been determined at the source. If a link failure is detected, the algorithm recovers the route automatically. To address the instability of wireless networks, multiple description coding is used to improve the reliability of network transmission, and several methods are proposed to allocate the redundancy across the different paths. Simulation in NS2 shows that the new protocol can effectively improve network performance. An implementation of MP-OLSR is also proposed at the end.
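Source-side computation of multiple paths is often done with repeated Dijkstra runs in which the edges of each found path are penalized so that later paths tend to avoid them. The sketch below assumes this common scheme (the penalty factor and the example topology are hypothetical, not MP-OLSR's exact cost functions) and assumes bidirectional links.

```python
import heapq

def dijkstra(graph, src, dst):
    """Shortest path by edge cost; graph: {node: {neighbour: cost}}."""
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    if dst not in prev and dst != src:
        return None
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]

def multipath(graph, src, dst, k=2, penalty=3.0):
    """Find k paths, inflating the cost of already-used edges so
    later paths tend to be link-disjoint (illustrative penalty)."""
    g = {u: dict(nbrs) for u, nbrs in graph.items()}
    paths = []
    for _ in range(k):
        p = dijkstra(g, src, dst)
        if p is None:
            break
        paths.append(p)
        for a, b in zip(p, p[1:]):   # penalise both link directions
            g[a][b] *= penalty
            g[b][a] *= penalty
    return paths

# Small symmetric topology: two disjoint routes from S to D.
net = {"S": {"A": 1, "B": 1}, "A": {"S": 1, "B": 1, "D": 1},
       "B": {"S": 1, "A": 1, "D": 1}, "D": {"A": 1, "B": 1}}
paths = multipath(net, "S", "D")
print(paths)
```

The penalty-based variant degrades gracefully when fully disjoint paths do not exist, which matters in sparse wireless topologies.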

    Method to enable LCA analysis through each level of development of a BIM model

    Whole-building life cycle assessment (LCA) calculations are increasingly done using building information modeling (BIM) data exports, but some challenges need to be overcome. BIM models lack the data required for a whole-building LCA. To counter this lack of detailed information, manual inputs are often required when using a static BIM model, and these cannot easily accommodate recalculations over the duration of the project. This paper presents a method to automatically perform LCA calculations early, at the first level of a BIM model's development (i.e. the LOD100 level), and to allow for easier updates of the calculation throughout the evolution of the BIM model. To achieve this goal, a novel data layer and format is proposed. This data layer fills the information gap between extracted BIM data and existing LCA data provided by common LCA databases such as ecoinvent.
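A data layer of this kind can be pictured as a mapping from coarse LOD100 element types and quantities to LCA dataset identifiers and impact factors. The sketch below is a hypothetical illustration: the element names, dataset labels, and impact factors are invented for the example and do not reflect the paper's actual format or ecoinvent's schema.

```python
from dataclasses import dataclass

@dataclass
class BimElement:
    name: str          # e.g. a wall extracted from a LOD100 model
    quantity: float    # surface or volume taken from the BIM export
    unit: str

@dataclass
class LcaMapping:
    dataset: str       # identifier in the LCA database (illustrative)
    factor: float      # kg CO2-eq per unit (illustrative value)

MAPPING = {
    "exterior_wall": LcaMapping("concrete wall, generic", 95.0),  # per m2
    "roof": LcaMapping("flat roof, generic", 60.0),               # per m2
}

def assess(elements):
    """Sum impacts for mapped elements; unmapped ones expose the
    information gap the data layer is meant to close."""
    total, unmapped = 0.0, []
    for e in elements:
        m = MAPPING.get(e.name)
        if m is None:
            unmapped.append(e.name)
        else:
            total += e.quantity * m.factor
    return total, unmapped

model = [BimElement("exterior_wall", 120.0, "m2"),
         BimElement("roof", 80.0, "m2"),
         BimElement("hvac", 1.0, "unit")]
impact, missing = assess(model)
print(impact, missing)  # 120*95 + 80*60 = 16200.0, hvac unmapped
```

As the model evolves to higher LODs, only the mapping table and quantities change, which is what makes automatic recalculation tractable.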

    Detection of velocity changes associated with the 2014 eruption of the Ubinas volcano, using auto- and cross-correlation of seismic noise and multiplet events

    In this work we use the correlation of seismic noise and multiplet events to compute velocity changes at the Ubinas volcano during 2014. Both methods identified a decrease of the seismic velocity of the medium of up to -0.8%, three weeks before the main explosions that occurred between 13 and 19 April 2014. These velocity changes had a precursory character. The horizontal location of the velocity perturbation and of the structural change during the phase of greatest eruptive activity shows that the velocity decrease originated throughout the volcanic edifice, while the structural perturbation is concentrated on the southern flank of the volcano, an area corresponding to an old collapse.
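Relative velocity changes (dv/v) from repeated correlation functions are commonly estimated with the stretching technique: the current waveform is compared against time-stretched versions of a reference, and the best-fitting stretch gives dv/v. The sketch below uses a synthetic signal and an illustrative trial grid, not the study's actual data or processing chain; sign conventions vary between implementations.

```python
import numpy as np

fs = 100.0                                  # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
reference = np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.2 * t)

true_dvv = -0.008                           # synthetic -0.8 % velocity drop
# A velocity decrease delays later arrivals; here the "current"
# waveform is the reference resampled on a stretched time axis.
current = np.interp(t * (1 + true_dvv), t, reference)

trial = np.linspace(-0.02, 0.02, 401)       # trial dv/v grid
cc = [np.corrcoef(np.interp(t * (1 + e), t, reference), current)[0, 1]
      for e in trial]
dvv = trial[int(np.argmax(cc))]
print(f"estimated dv/v = {dvv * 100:.2f} %")
```

On real noise-correlation data the same grid search is applied within a lapse-time window of the coda, where scattered waves accumulate the travel-time perturbation.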

    Assessing Temporary Carbon Sequestration and Storage Projects through Land Use, Land-Use Change and Forestry: Comparison of Dynamic Life Cycle Assessment with Ton-Year Approaches

    In order to properly assess the climate impact of temporary carbon sequestration and storage projects through land use, land-use change and forestry (LULUCF), it is important to consider their temporal aspect. Dynamic life cycle assessment (dynamic LCA) was developed to account for time while assessing the potential impact of life cycle greenhouse gas (GHG) emissions. In this paper, the dynamic LCA approach is applied to a temporary carbon sequestration project through afforestation, and the results are compared with those of the two principal ton-year approaches: the Moura-Costa and the Lashof methods. The dynamic LCA covers different scenarios, which are distinguished by the assumptions regarding what happens at the end of the sequestration period. In order to ascertain the degree of compensation of an emission through a LULUCF project, the ratio of the cumulative impact of the project to the cumulative impact of a baseline GHG emission is calculated over time. This ratio tends to 100% when assuming that, after the end of the sequestration project period, the forest is maintained indefinitely. Conversely, the ratio tends to much lower values in scenarios where part of the carbon is released back to the atmosphere. The comparison of the dynamic LCA approach with the Moura-Costa and the Lashof methods shows that dynamic LCA is a more flexible approach, as it allows the consideration of every life cycle stage of the project and it gives decision makers the opportunity to test the sensitivity of the results to the choice of different time horizons.
    JRC.H.8 - Sustainability Assessment
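The cumulative-impact ratio can be sketched numerically. The example below uses a single-exponential atmospheric decay with a 100-year lifetime as an illustrative stand-in for a full impulse-response model (such as the Bern carbon-cycle model used in dynamic LCA); the 100-year horizon and 50-year storage period are likewise illustrative. A permanent 1 t removal exactly offsets a 1 t pulse, while releasing the carbon after 50 years compensates only part of it.

```python
import numpy as np

years = np.arange(0, 101)                 # 100-year horizon (illustrative)
decay = np.exp(-years / 100.0)            # simplified airborne fraction

def cum_impact(schedule):
    """Cumulative (ton-year style) impact of a pulse schedule:
    the time-integrated airborne amount, via convolution."""
    airborne = np.convolve(schedule, decay)[: len(years)]
    return np.cumsum(airborne)

baseline = np.zeros_like(years, dtype=float)
baseline[0] = 1.0                          # 1 t CO2 emitted at t = 0

# Project: sequester 1 t at t = 0. Scenario A keeps it stored for good;
# scenario B releases it back to the atmosphere after 50 years.
permanent = -baseline
temporary = permanent.copy()
temporary[50] = 1.0

ratio_a = -cum_impact(permanent)[-1] / cum_impact(baseline)[-1]
ratio_b = -cum_impact(temporary)[-1] / cum_impact(baseline)[-1]
print(f"permanent storage: {ratio_a:.0%} of the emission compensated")
print(f"release at yr 50:  {ratio_b:.0%}")
```

Lengthening the horizon shrinks the temporary-storage ratio further, which is exactly the time-horizon sensitivity the dynamic LCA approach lets decision makers explore.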

    Reconstruction of primary vertices at the ATLAS experiment in Run 1 proton–proton collisions at the LHC

    This paper presents the method and performance of primary vertex reconstruction in proton–proton collision data recorded by the ATLAS experiment during Run 1 of the LHC. The studies presented focus on data taken during 2012 at a centre-of-mass energy of √s = 8 TeV. The performance has been measured as a function of the number of interactions per bunch crossing over a wide range, from one to seventy. The measurement of the position and size of the luminous region and its use as a constraint to improve the primary vertex resolution are discussed. A longitudinal vertex position resolution of about 30 μm is achieved for events with a high multiplicity of reconstructed tracks. The transverse position resolution is better than 20 μm and is dominated by the precision on the size of the luminous region. An analytical model is proposed to describe the primary vertex reconstruction efficiency as a function of the number of interactions per bunch crossing and of the longitudinal size of the luminous region. Agreement between the data and the predictions of this model is better than 3% up to seventy interactions per bunch crossing.

    Search for dark matter produced in association with bottom or top quarks in √s = 13 TeV pp collisions with the ATLAS detector

    A search for weakly interacting massive particle dark matter produced in association with bottom or top quarks is presented. Final states containing third-generation quarks and missing transverse momentum are considered. The analysis uses 36.1 fb−1 of proton–proton collision data recorded by the ATLAS experiment at √s = 13 TeV in 2015 and 2016. No significant excess of events above the estimated backgrounds is observed. The results are interpreted in the framework of simplified models of spin-0 dark-matter mediators. For colour-neutral spin-0 mediators produced in association with top quarks and decaying into a pair of dark-matter particles, mediator masses below 50 GeV are excluded assuming a dark-matter candidate mass of 1 GeV and unitary couplings. For scalar and pseudoscalar mediators produced in association with bottom quarks, the search sets limits on the production cross-section of 300 times the predicted rate for mediators with masses between 10 and 50 GeV and assuming a dark-matter mass of 1 GeV and unitary coupling. Constraints on colour-charged scalar simplified models are also presented. Assuming a dark-matter particle mass of 35 GeV, mediator particles with mass below 1.1 TeV are excluded for couplings yielding a dark-matter relic density consistent with measurements.