
    Developing a global risk engine

    Risk analysis is a critical link in the reduction of casualties and damage due to earthquakes. Recognition of this relation has led to a rapid rise in demand for accurate, reliable and flexible risk assessment software. However, there is a significant disparity between the high-quality scientific data developed by researchers and the availability of versatile, open and user-friendly risk analysis tools that meet the demands of end-users. In the past few years, several open-source software packages that play an important role in seismic research have been developed, such as OpenSHA and OpenSees. There is, however, still a gap when it comes to open-source risk assessment tools and software. In order to fill this gap, the Global Earthquake Model (GEM) has been created. GEM is an internationally sanctioned program initiated by the OECD that aims to build independent, open standards to calculate and communicate earthquake risk around the world. This initiative started with a one-year pilot project named GEM1, during which a number of existing risk software packages were evaluated. After a critical review of the results, it was concluded that none of the packages met GEM's requirements and that a new object-oriented tool therefore had to be developed. This paper presents a summary of some of the best-known applications used in risk analysis, highlighting the main aspects that were considered in the development of this risk platform. The research carried out to gather the information needed to build this tool was distributed across four areas: the information technology approach, seismic hazard resources, vulnerability assessment methodologies and sources of exposure data. The main aspects and findings for each of these areas are presented, as well as how these features were incorporated in the current risk engine. The risk engine is presently capable of predicting human or economic losses worldwide for both deterministic and probabilistic events, using vulnerability curves. A first version of GEM will become available at the end of 2013. Until then, the risk engine will continue to be developed by a growing community of developers, using a dedicated open-source platform.
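
    To make the abstract's final claim concrete, here is a minimal sketch, in Python, of a loss calculation driven by a vulnerability curve of the kind the engine relies on. The curve values, the asset value and the piecewise-linear interpolation are illustrative assumptions, not the engine's actual interface.

        import numpy as np

        # Hypothetical vulnerability curve: mean loss ratio as a function of
        # ground-motion intensity (e.g. PGA in g); all values illustrative.
        intensity_levels = np.array([0.1, 0.2, 0.4, 0.6, 0.8])
        mean_loss_ratios = np.array([0.01, 0.08, 0.30, 0.60, 0.85])

        def expected_loss(asset_value, im):
            """Interpolate the vulnerability curve at intensity `im` and
            scale the resulting mean loss ratio by the exposed value."""
            ratio = np.interp(im, intensity_levels, mean_loss_ratios)
            return asset_value * ratio

        # One asset worth 1,000,000 shaken at PGA = 0.35 g
        print(expected_loss(1_000_000, 0.35))  # ~245,000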

    Evaluation of analytical methodologies to derive vulnerability functions

    The recognition of fragility functions as a fundamental tool in seismic risk assessment has led to the development of increasingly complex and elaborate procedures for their computation. Although vulnerability functions have traditionally been produced using observed damage and loss data, more recent studies propose analytical methodologies as a way to overcome the frequent lack of post-earthquake data. The effect of the structural modelling approach on the estimation of building capacity has been the target of many studies in the past; however, its influence on the resulting vulnerability model, its impact on loss estimates and the propagation of the associated uncertainty to seismic risk calculations have so far received limited scrutiny. Hence, in this paper, an extensive study of static and dynamic procedures for estimating the nonlinear response of buildings has been carried out in order to evaluate the impact of the chosen methodology on the resulting vulnerability and risk outputs. Moreover, the computational effort and numerical stability of each approach were evaluated, and conclusions were drawn regarding which one offers the optimal balance between accuracy and complexity.
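
    A common step linking such analytical results to losses is collapsing a set of fragility functions into a vulnerability function, by weighting the probability of being in each damage state with a damage-to-loss ratio. A minimal sketch, with all fragility parameters and loss ratios assumed for illustration:

        import numpy as np
        from scipy.stats import lognorm

        # Hypothetical lognormal fragility parameters (median IM, dispersion)
        # for three damage states; damage-to-loss ratios are assumed values.
        fragility = {"slight": (0.2, 0.5), "moderate": (0.4, 0.5), "extensive": (0.7, 0.5)}
        loss_ratio = {"slight": 0.05, "moderate": 0.30, "extensive": 0.80}

        def mean_loss_ratio(im):
            # Probability of exceeding each damage state at this intensity
            p_exc = {ds: lognorm.cdf(im, s=beta, scale=med)
                     for ds, (med, beta) in fragility.items()}
            # Probability of being *in* a state = difference of exceedance curves
            states = ["slight", "moderate", "extensive"]
            total = 0.0
            for i, ds in enumerate(states):
                p_next = p_exc[states[i + 1]] if i + 1 < len(states) else 0.0
                total += (p_exc[ds] - p_next) * loss_ratio[ds]
            return total

        print(mean_loss_ratio(0.4))  # mean loss ratio at IM = 0.4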

    Parameter identification in continuum models

    Approximation techniques for use in numerical schemes for estimating spatially varying coefficients in continuum models, such as those for Euler-Bernoulli beams, are discussed. The techniques are based on quintic spline state approximations and cubic spline parameter approximations. Both theoretical and numerical results are presented.
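
    For reference, the continuum model targeted by such schemes can be written, for an Euler-Bernoulli beam with spatially varying stiffness EI(x) and linear mass density \rho(x), as

        \rho(x)\,\frac{\partial^2 u}{\partial t^2} + \frac{\partial^2}{\partial x^2}\left( EI(x)\,\frac{\partial^2 u}{\partial x^2} \right) = f(x,t),

    and the identification problem is to choose the coefficient (approximated by cubic splines, with the state u approximated by quintic splines) so that a fit-to-data criterion such as the output least-squares functional

        J(EI) = \sum_i \left\| u(t_i;\,EI) - \hat{u}_i \right\|^2

    is minimised over the spline coefficients. The criterion shown is a standard choice, assumed here rather than quoted from the paper.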

    Extending displacement-based earthquake loss assessment (DBELA) for the computation of fragility curves

    This paper presents a new procedure for deriving fragility functions for populations of buildings, relying on the displacement-based earthquake loss assessment (DBELA) methodology. In the proposed method, thousands of synthetic buildings are generated from probabilistic distributions describing the variability in their geometrical and material properties. Their nonlinear capacity is then estimated using the DBELA method, and their response to a large set of ground-motion records is computed. Global limit states are used to estimate the distribution of buildings across damage states at different levels of ground motion, and a regression algorithm is applied to derive a fragility function for each limit state. The proposed methodology is demonstrated for ductile and non-ductile Turkish reinforced concrete frames with masonry infills.
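
    A minimal sketch of the final regression step: fitting a lognormal fragility function to binary limit-state exceedance outcomes by maximum likelihood. The data, the lognormal form and the MLE fit are illustrative assumptions, not necessarily the regression algorithm used in the paper.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        # Hypothetical analysis results: intensity of each record and whether
        # the limit state was exceeded (1) or not (0).
        im = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 1.0])
        exceeded = np.array([0, 0, 1, 0, 1, 1, 1, 1])

        def neg_log_like(params):
            log_median, log_beta = params  # log-space keeps both positive
            p = norm.cdf((np.log(im) - log_median) / np.exp(log_beta))
            p = np.clip(p, 1e-9, 1 - 1e-9)
            return -np.sum(exceeded * np.log(p) + (1 - exceeded) * np.log(1 - p))

        res = minimize(neg_log_like, x0=[np.log(0.4), np.log(0.5)],
                       method="Nelder-Mead")
        median, beta = np.exp(res.x)
        print(f"median IM = {median:.3f}, dispersion = {beta:.3f}")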

    Development of an open-source platform for calculating losses from earthquakes

    Risk analysis has a critical role in the reduction of casualties and damage due to earthquakes. Recognition of this relation has led to a rapid rise in demand for accurate, reliable and flexible risk assessment numerical tools and software. In response to this need, the Global Earthquake Model (GEM) started the development of an open-source platform called OpenQuake for calculating seismic hazard and risk at different scales. Alongside this framework, several other tools to support users in creating their own models and visualizing their results are currently being developed, and will be made available as a Modelers Tool Kit (MTK). In this paper, a description of the architecture of OpenQuake is provided, highlighting the current data model and the workflow of the calculators, as well as the main challenges that arise when running this type of calculation at a global scale. In addition, a case study of the Marmara Region (Turkey) is presented, in which the losses for a single event are estimated, along with the probabilistic risk over a 50-year time span.
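
    As a small worked example of the probabilistic-risk output mentioned above: assuming Poissonian event occurrence, an annual loss-exceedance rate lambda converts to a probability of exceedance over a time span t via P = 1 - exp(-lambda * t). A sketch with invented rates:

        import numpy as np

        # Hypothetical annual rates at which a set of loss thresholds is
        # exceeded (e.g. from an event-based calculation); values illustrative.
        losses = np.array([1e5, 1e6, 5e6, 1e7])
        annual_rate = np.array([0.2, 0.05, 0.01, 0.002])

        # Probability of exceeding each loss at least once in 50 years,
        # assuming Poissonian occurrence of the underlying events.
        t = 50.0
        poe = 1.0 - np.exp(-annual_rate * t)
        for L, p in zip(losses, poe):
            print(f"loss > {L:.0e}: {p:.1%} in {t:.0f} years")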

    Run-time parallelization and scheduling of loops

    Run-time methods are studied to automatically parallelize and schedule iterations of a do loop in cases where compile-time information is inadequate. The methods presented involve execution-time preprocessing of the loop. At compile time, these methods set up the framework for performing a loop dependency analysis. At run time, wavefronts of concurrently executable loop iterations are identified, and this wavefront information is used to reorder loop iterations for increased parallelism. Symbolic transformation rules are used to produce inspectors, procedures that perform the execution-time preprocessing, and executors, transformed versions of the source loop structures that carry out the computations planned by the inspectors. Performance results from experiments conducted on the Encore Multimax are presented. These results illustrate that run-time reordering of loop indices can have a significant impact on performance, and that the overheads associated with this type of reordering are amortized when the loop is executed several times with the same dependency structure.
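
    A minimal sketch of the inspector/executor idea on a toy loop with an indirection array. The loop body, the names and the serial wavefront execution are illustrative; a real executor would run each wavefront's iterations in parallel, and the inspector's output would be reused across repeated executions of the loop.

        # Toy loop to parallelize at run time:
        #   for i in range(n): x[idx[i]] += 1
        # Iterations touching the same location must stay ordered; all
        # others may run concurrently.

        def inspector(idx):
            """Assign each iteration to a wavefront: one more than the
            last wavefront that wrote the same location."""
            last_wave = {}            # location -> last wavefront writing it
            wave = []
            for loc in idx:
                w = last_wave.get(loc, -1) + 1
                wave.append(w)
                last_wave[loc] = w
            return wave

        def executor(x, idx, wave):
            """Execute iterations wavefront by wavefront (serially here;
            each wavefront's iterations are mutually independent)."""
            for w in range(max(wave) + 1):
                for i, loc in enumerate(idx):
                    if wave[i] == w:
                        x[loc] += 1

        x = [0] * 4
        idx = [0, 1, 0, 2, 1, 0]
        wave = inspector(idx)          # [0, 0, 1, 0, 1, 2]
        executor(x, idx, wave)
        print(wave, x)                 # x == [3, 2, 1, 0]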

    An Analytical Model of Radiation-Induced Charge Transfer Inefficiency for CCD Detectors

    The European Space Agency's Gaia mission is scheduled for launch in 2013. It will operate at L2 for five years, rotating slowly to scan the sky so that its two optical telescopes will repeatedly observe more than one billion stars. The resulting data set will be iteratively reduced to solve for the position, parallax and proper motion of every observed star. The focal plane contains 106 large-area silicon CCDs continuously operating in a mode where the line transfer rate and the satellite rotation are in synchronisation. One of the greatest challenges facing the mission is radiation damage to the CCDs, which will cause charge deferral and image shape distortion. This is particularly important because of the extreme accuracy requirements of the mission. Despite steps taken at hardware level to minimise the effects of radiation, the residual distortion will need to be calibrated during the pipeline data processing. Due to the volume and inhomogeneity of the data involved, this requires a model of the effects of the radiation damage that is physically realistic, yet fast enough to implement in the pipeline. The resulting charge distortion model was developed specifically for the Gaia CCD operating mode; however, a generalised version is presented in this paper, and it has already been applied in a broader context, for example to investigate the impact of radiation damage on data from the Euclid dark-energy mission.
    Comment: 8 pages, 5 figures, paper accepted for publication in MNRAS
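
    Not the paper's model, but a toy illustration of the charge-deferral effect it describes: a single trap species captures a fixed fraction of each passing charge packet and re-emits it into trailing pixels with a fixed per-transfer probability. All parameters are invented.

        import numpy as np

        def transfer_with_traps(signal, capture_frac=0.02, release_prob=0.3):
            """Toy charge-deferral model: during each transfer, a fraction
            of the packet is captured into traps; trapped charge is released
            into the following packets with a fixed per-transfer probability,
            producing the characteristic trailing behind bright pixels."""
            trapped = 0.0
            out = np.zeros_like(signal, dtype=float)
            for i, s in enumerate(signal):
                released = trapped * release_prob
                trapped -= released
                captured = s * capture_frac
                trapped += captured
                out[i] = s - captured + released
            return out

        line = np.zeros(10)
        line[2] = 1000.0                     # one bright pixel
        print(transfer_with_traps(line))     # charge trails into later pixels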

    Methods for the identification of material parameters in distributed models for flexible structures

    Theoretical and numerical results are presented for inverse problems involving the estimation of spatially varying parameters, such as stiffness and damping, in distributed models for elastic structures such as Euler-Bernoulli beams. An outline of the algorithms used and a summary of computational experience are presented.
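
    A minimal sketch of the output least-squares structure behind such identification algorithms, shrunk to a single stiffness parameter of a damped oscillator. The real problem estimates spatially varying coefficients, but the simulate/compare/minimise loop is the same; the model and data here are invented.

        import numpy as np
        from scipy.integrate import odeint
        from scipy.optimize import minimize_scalar

        def simulate(k, t, damping=0.1):
            """Forward model: damped oscillator u'' + c u' + k u = 0, u(0)=1."""
            def rhs(y, _t):
                u, v = y
                return [v, -damping * v - k * u]
            return odeint(rhs, [1.0, 0.0], t)[:, 0]

        t = np.linspace(0.0, 10.0, 100)
        rng = np.random.default_rng(0)
        observed = simulate(4.0, t) + 0.01 * rng.standard_normal(t.size)

        # Output least squares: pick the stiffness whose simulated response
        # best matches the (noisy) observations.
        res = minimize_scalar(
            lambda k: np.sum((simulate(k, t) - observed) ** 2),
            bounds=(0.1, 10.0), method="bounded")
        print(res.x)  # ~4.0, the stiffness used to generate the data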

    Policy Shocks and Stock Market Returns: Evidence from Chinese Solar Panels

    We examine the stock market performance of publicly listed Chinese firms in the solar panel industry over 2012 and 2013 in response to announcements of new import restrictions by the European Union and domestic policy changes by the Chinese government. Using daily stock prices from the Shanghai-Shenzhen, New York and Hong Kong markets, we calculate abnormal returns around several policy changes affecting solar panels produced in China. Consistent with the Melitz (2003) model, we find that larger, more export-oriented firms experienced larger stock market losses in the wake of European trade-restriction announcements. We further show that European trade policy had a larger negative effect on Chinese private-sector firms than on state-owned enterprises. Finally, we use a two-stage least squares estimation technique to show that firms listed on US markets are more responsive to news events than those listed in China and Hong Kong.
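
    A minimal sketch of the standard market-model event-study computation behind abnormal returns. The window lengths, the data and the single-factor model are illustrative assumptions, not the paper's exact specification.

        import numpy as np

        def abnormal_returns(stock, market, est_end, event_window):
            """Market-model event study: estimate alpha and beta over the
            estimation window, then AR_t = R_t - (alpha + beta * Rm_t) over
            the event window; CAR is their sum."""
            beta, alpha = np.polyfit(market[:est_end], stock[:est_end], 1)
            lo, hi = event_window
            ar = stock[lo:hi] - (alpha + beta * market[lo:hi])
            return ar, ar.sum()

        rng = np.random.default_rng(1)
        market = rng.normal(0.0, 0.01, 120)
        stock = 0.0002 + 1.2 * market + rng.normal(0.0, 0.005, 120)
        stock[110] -= 0.04          # a hypothetical policy-announcement shock
        ar, car = abnormal_returns(stock, market,
                                   est_end=100, event_window=(108, 113))
        print(car)                  # dominated by the -4% shock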