Meta-models for structural reliability and uncertainty quantification
A meta-model (or a surrogate model) is the modern name for what was
traditionally called a response surface. It is intended to mimic the behaviour
of a computational model M (e.g. a finite element model in mechanics) while
being inexpensive to evaluate, in contrast to the original model which may take
hours or even days of computer processing time. In this paper various types of
meta-models that have been used in the last decade in the context of structural
reliability are reviewed. More specifically, classical polynomial response
surfaces, polynomial chaos expansions and kriging are addressed. It is shown
how the need for error estimates and adaptivity in their construction has
brought this type of approach to a high level of efficiency. A new technique
that solves the problem of the potential bias in the estimation of a
probability of failure through the use of meta-models is finally presented.
Comment: Keynote lecture, Fifth Asian-Pacific Symposium on Structural
Reliability and its Applications (5th APSSRA), May 2012, Singapore
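
To make the surrogate idea above concrete, here is a minimal illustrative
sketch (not from the paper): a quadratic response surface is fitted to a few
runs of a toy limit-state function and then substituted into a Monte Carlo
estimate of the failure probability. The toy function g() and all names are
invented; the plain substitution shown here is exactly the step whose
potential bias the paper addresses.

```python
# Minimal illustrative sketch (not from the paper): a quadratic response
# surface fitted to a few runs of an "expensive" model, then substituted into
# a Monte Carlo estimate of the failure probability. The toy limit state g()
# and all names are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # Stand-in for the expensive computational model; failure when g(x) <= 0.
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

def quad_features(X):
    # Full quadratic basis in 2 dimensions: 1, x1, x2, x1^2, x2^2, x1*x2.
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

# Small experimental design: the only place the expensive model is called.
X = rng.normal(size=(30, 2))
coef, *_ = np.linalg.lstsq(quad_features(X), g(X), rcond=None)

# Monte Carlo on the surrogate alone: cheap, but possibly biased if the
# surface is inaccurate near the limit state -- the very issue the new
# technique mentioned in the abstract is designed to remove.
Xmc = rng.normal(size=(10**6, 2))
pf = np.mean(quad_features(Xmc) @ coef <= 0.0)
print(f"P_f (surrogate-based) ~ {pf:.2e}")
```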
Global sensitivity analysis for stochastic simulators based on generalized lambda surrogate models
Global sensitivity analysis aims at quantifying the impact of input
variability on the variation of the response of a computational model. It has
been widely applied to deterministic simulators, for which a set of input
parameters has a unique corresponding output value. Stochastic simulators,
however, have intrinsic randomness due to their use of (pseudo)random numbers,
so they give different results when run twice with the same input parameters
but non-common random numbers. Due to this random nature, conventional Sobol'
indices, used in global sensitivity analysis, can be extended to stochastic
simulators in different ways. In this paper, we discuss three possible
extensions and focus on those that depend only on the statistical dependence
between input and output. This choice ignores the detailed data generating
process involving the internal randomness, and can thus be applied to a wider
class of problems. We propose to use the generalized lambda model to emulate
the response distribution of stochastic simulators. Such a surrogate can be
constructed without the need for replications. The proposed method is applied
to three examples including two case studies in finance and epidemiology. The
results confirm the convergence of the approach for estimating the sensitivity
indices, even in the presence of strong heteroskedasticity and a small
signal-to-noise ratio.
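
For concreteness, the following sketch shows the generalized lambda
distribution through its quantile function in the FKML parameterization (an
assumption; the paper's exact parameterization may differ), with a hand-made
stand-in for the fitted map from inputs to distribution parameters:

```python
# Illustrative sketch of the generalized lambda distribution (GLD) used as an
# emulator of the response distribution. The FKML parameterization below and
# the toy map lambdas_of_x() are assumptions made for illustration.
import numpy as np

def gld_quantile(u, lam):
    """FKML quantile function Q(u; lambda_1, ..., lambda_4)."""
    l1, l2, l3, l4 = lam
    return l1 + ((u**l3 - 1.0) / l3 - ((1.0 - u)**l4 - 1.0) / l4) / l2

def lambdas_of_x(x):
    # Hypothetical surrogate output: distribution parameters as smooth
    # functions of the input (in the paper such maps are built from PCE,
    # without requiring replicated runs).
    return (np.sin(x), 1.0 + 0.5 * x**2, 0.2, 0.2)

rng = np.random.default_rng(1)
lam = lambdas_of_x(0.7)

# Inverse-transform sampling of the emulated response distribution at x=0.7.
samples = gld_quantile(rng.uniform(size=10**5), lam)

# Quantities of interest such as the mean function (which can then be fed
# into a Sobol' analysis) follow as integrals of Q over [0, 1].
u = np.linspace(1e-6, 1.0 - 1e-6, 10**5)
mean_at_x = gld_quantile(u, lam).mean()
print(mean_at_x, samples.mean())
```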
Hierarchical adaptive polynomial chaos expansions
Polynomial chaos expansions (PCE) are widely used in the framework of
uncertainty quantification. However, when dealing with high-dimensional
complex problems, challenging issues need to be faced. For instance, high-order
polynomials may be required, which leads to a large polynomial basis whereas
usually only a few of the basis functions are in fact significant. Taking into
account the sparse structure of the model, advanced techniques, such as sparse
PCE (SPCE), have recently been proposed to alleviate the computational issue.
In this paper, we propose a novel approach to SPCE, which allows one to exploit
the model's hierarchical structure. The proposed approach is based on the
adaptive enrichment of the polynomial basis using the so-called principle of
heredity. As a result, one can reduce the computational burden related to a
large pre-defined candidate set while obtaining higher accuracy with the same
computational budget.
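
A minimal sketch of what a heredity-based enrichment step could look like
(illustrative only; the paper's exact rule, e.g. strict versus weak heredity,
may differ):

```python
# Illustrative sketch of basis enrichment under a strict heredity rule: a
# candidate multi-index is admitted only if all of its "parents" (indices
# obtained by decreasing one component by 1) are already active. The exact
# rule of the paper (e.g. strict vs. weak heredity) may differ.

def parents(alpha):
    """All multi-indices one total degree below alpha."""
    return [alpha[:i] + (a - 1,) + alpha[i + 1:]
            for i, a in enumerate(alpha) if a > 0]

def enrich(active, dim):
    """Candidate children of the active set that satisfy strict heredity."""
    candidates = set()
    for alpha in active:
        for i in range(dim):
            child = alpha[:i] + (alpha[i] + 1,) + alpha[i + 1:]
            if child not in active and all(p in active for p in parents(child)):
                candidates.add(child)
    return candidates

# Start from the constant and linear terms in 3 dimensions.
active = {(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)}
print(sorted(enrich(active, 3)))
# (1, 1, 0) is proposed because both parents (1,0,0) and (0,1,0) are active;
# (2, 0, 0) is proposed because its only parent (1,0,0) is active.
```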
Computing derivative-based global sensitivity measures using polynomial chaos expansions
In the field of computer experiments, sensitivity analysis aims at quantifying
the relative importance of each input parameter (or combinations thereof) of a
computational model with respect to the model output uncertainty. Variance
decomposition methods leading to the well-known Sobol' indices are recognized
as accurate techniques, though at a rather high computational cost. The use of
polynomial chaos expansions (PCE) to compute Sobol' indices has allowed this
computational burden to be alleviated. However, when dealing with large-
dimensional input vectors, it is good practice to first use screening methods
in order to discard unimportant variables. The {\em derivative-based global
sensitivity measures} (DGSM) have been developed recently in this respect. In
this paper we show how polynomial chaos expansions may be used to compute
DGSMs analytically as a mere post-processing of their coefficients. This
requires the analytical derivation of the derivatives of the orthonormal
polynomials which enter PC expansions. The efficiency of the approach is
illustrated on two well-known benchmark problems in sensitivity analysis.
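
As a worked special case (in my own notation; the paper treats general
orthonormal families), for standard Gaussian inputs expanded on the
orthonormal Hermite basis the DGSM reduces to a weighted sum of squared PCE
coefficients:

```latex
% Worked special case: standard Gaussian inputs, orthonormal Hermite basis.
\[
  \mathcal{M}(\boldsymbol{x}) \approx \sum_{\boldsymbol{\alpha}}
    a_{\boldsymbol{\alpha}}\, \Psi_{\boldsymbol{\alpha}}(\boldsymbol{x}),
  \qquad
  \Psi_{\boldsymbol{\alpha}}(\boldsymbol{x})
    = \prod_{j} \psi_{\alpha_j}(x_j),
  \qquad
  \psi_n'(x) = \sqrt{n}\,\psi_{n-1}(x),
\]
\[
  \nu_i
  = \mathbb{E}\!\left[\left(\frac{\partial \mathcal{M}}
      {\partial x_i}\right)^{\!2}\right]
  = \sum_{\boldsymbol{\alpha}} \alpha_i\, a_{\boldsymbol{\alpha}}^2 ,
\]
since $\partial\Psi_{\boldsymbol{\alpha}}/\partial x_i
= \sqrt{\alpha_i}\,\Psi_{\boldsymbol{\alpha}-\boldsymbol{e}_i}$ and the
shifted basis functions remain orthonormal.
```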
Metamodel-based importance sampling for the simulation of rare events
In the field of structural reliability, the Monte Carlo estimator is
considered the reference probability estimator. However, it remains
intractable for real engineering cases since it requires a large number of
runs of the model. In order to reduce the number of computer experiments, many
other approaches, known as reliability methods, have been proposed. One
approach consists of replacing the original model with a surrogate which is
much faster to evaluate. Nevertheless, it is often difficult (or even
impossible) to
quantify the error made by this substitution. In this paper an alternative
approach is developed. It takes advantage of the kriging meta-modeling and
importance sampling techniques. The proposed alternative estimator is finally
applied to a finite-element-based structural reliability analysis.
Comment: 8 pages, 3 figures, 1 table. Preprint submitted to ICASP11,
Mini-symposium entitled "Meta-models/surrogate models for uncertainty
propagation, sensitivity and reliability analysis".
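
A hedged sketch of this two-stage idea on a toy limit state (the example,
names and tooling below are mine, not the paper's implementation): a kriging
surrogate yields a probabilistic classification of failure, which defines a
quasi-optimal importance sampling density; the true model is then only called
on the few samples drawn from that density.

```python
# Sketch of metamodel-based importance sampling on a toy limit state.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)

def g(x):
    # Toy limit state; failure when g(x) <= 0.
    return 4.0 - x[:, 0] - x[:, 1]

# 1) Kriging surrogate built from a small experimental design.
X = rng.normal(size=(40, 2)) * 2.0
gp = GaussianProcessRegressor(kernel=RBF(1.0), alpha=1e-8).fit(X, g(X))

def pi(x):
    # Probability that the surrogate predicts failure at x.
    mu, sd = gp.predict(x, return_std=True)
    return norm.cdf(-mu / np.maximum(sd, 1e-12))

# 2) Augmented probability P_f_eps = E_f[pi(X)], using the surrogate only.
Xf = rng.normal(size=(10**5, 2))
pw = pi(Xf)
pf_eps = pw.mean()

# 3) Sample the quasi-optimal density h ~ pi * f by rejection (pi <= 1); the
#    correction term alpha = E_h[1{g<=0} / pi] calls the TRUE model only on
#    the accepted points, and removes the bias of the surrogate.
acc = Xf[rng.uniform(size=len(Xf)) < pw]
alpha = np.mean((g(acc) <= 0.0) / pi(acc))

print(f"P_f ~ {pf_eps * alpha:.3e}")
```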
Metamodel-based importance sampling for structural reliability analysis
Structural reliability methods aim at computing the probability of failure of
systems with respect to some prescribed performance functions. In modern
engineering such functions usually involve running an expensive-to-evaluate
computational model (e.g. a finite element model). In this respect, simulation
methods, which may require a very large number of model runs, cannot be used
directly. Surrogate
models such as quadratic response surfaces, polynomial chaos expansions or
kriging (which are built from a limited number of runs of the original model)
are then introduced as a substitute for the original model to cope with the
computational cost. In practice, though, it is almost impossible to quantify
the error made by this substitution. In this paper we propose to use a kriging
surrogate of the performance function as a means to build a quasi-optimal
importance sampling density. The probability of failure is eventually obtained
as the product of an augmented probability computed by substituting the
meta-model for the original performance function and a correction term which
ensures that there is no bias in the estimation even if the meta-model is not
fully accurate. The approach is applied to analytical and finite element
reliability problems and proves efficient up to 100 random variables.
Comment: 20 pages, 7 figures, 2 tables. Preprint submitted to Probabilistic
Engineering Mechanics
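
Written out in one possible notation (mine, not necessarily the paper's), the
product structure described above reads:

```latex
% \tilde{\pi}(x) is the kriging-based probabilistic classification function.
\[
  \tilde{\pi}(\boldsymbol{x})
    = \Phi\!\left(-\frac{\mu_{\hat{G}}(\boldsymbol{x})}
                        {\sigma_{\hat{G}}(\boldsymbol{x})}\right),
  \qquad
  h(\boldsymbol{x})
    = \frac{\tilde{\pi}(\boldsymbol{x})\, f(\boldsymbol{x})}
           {P_{f\varepsilon}},
  \qquad
  P_{f\varepsilon}
    = \mathbb{E}_f\!\left[\tilde{\pi}(\boldsymbol{X})\right],
\]
\[
  P_f
  = \int \mathbf{1}_{\{g(\boldsymbol{x})\le 0\}}\, f(\boldsymbol{x})\,
      \mathrm{d}\boldsymbol{x}
  = P_{f\varepsilon}\,
    \underbrace{\mathbb{E}_h\!\left[
      \frac{\mathbf{1}_{\{g(\boldsymbol{X})\le 0\}}}
           {\tilde{\pi}(\boldsymbol{X})}\right]}_{\alpha_{\mathrm{corr}}},
\]
so the correction term $\alpha_{\mathrm{corr}}$ removes the bias of the
surrogate-based augmented probability $P_{f\varepsilon}$.
```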
Polynomial-Chaos-based Kriging
Computer simulation has become the standard tool in many engineering fields
for designing and optimizing systems, as well as for assessing their
reliability. To cope with demanding analyses such as optimization and
reliability, surrogate models (a.k.a. meta-models) have been increasingly
investigated in the last decade. Polynomial Chaos Expansions (PCE) and Kriging
are two popular non-intrusive meta-modelling techniques. PCE surrogates the
computational model with a series of orthonormal polynomials in the input
variables, where the polynomials are chosen consistently with the probability
distributions of those input variables. On the other hand, Kriging assumes that
the computer model behaves as a realization of a Gaussian random process whose
parameters are estimated from the available computer runs, i.e. input vectors
and response values. These two techniques have been developed more or less in
parallel so far with little interaction between the researchers in the two
fields. In this paper, PC-Kriging is derived as a new non-intrusive
meta-modeling approach combining PCE and Kriging. A sparse set of orthonormal
polynomials (PCE) approximates the global behavior of the computational model
whereas Kriging manages the local variability of the model output. An adaptive
algorithm similar to the least angle regression algorithm determines the
optimal sparse set of polynomials. PC-Kriging is validated on various benchmark
analytical functions which are easy to sample for reference results. From the
numerical investigations it is concluded that PC-Kriging performs better than,
or at least as well as, the two distinct meta-modeling techniques. A larger
gain in accuracy is obtained when the experimental design has a limited size,
which is an asset when dealing with demanding computational models.
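
The following rough sketch conveys the PC-Kriging structure in a simplified
sequential form: a (here non-sparse, least-squares) Hermite PCE trend is
fitted first and a Gaussian process is fitted to the residuals, whereas the
paper selects a sparse basis by least angle regression and calibrates the
trend and the process jointly.

```python
# Simplified sequential sketch of the PC-Kriging structure (an assumption:
# not the paper's joint calibration with a LAR-selected sparse basis).
import math
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(3)

def model(x):
    # Toy 1D computational model standing in for an expensive simulator.
    return x * np.sin(x)

def hermite_design(x, degree=4):
    # Orthonormal (probabilists') Hermite basis for a standard Gaussian input.
    cols = []
    for n in range(degree + 1):
        c = np.zeros(n + 1)
        c[n] = 1.0
        cols.append(hermeval(x, c) / math.sqrt(math.factorial(n)))
    return np.column_stack(cols)

X = rng.normal(size=25)
y = model(X)

# 1) PCE trend: captures the global behaviour of the model.
A = hermite_design(X)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# 2) Kriging of the residuals: captures the local variability.
gp = GaussianProcessRegressor(kernel=RBF(1.0), alpha=1e-10)
gp.fit(X.reshape(-1, 1), y - A @ coef)

xt = np.linspace(-3.0, 3.0, 7)
pred = hermite_design(xt) @ coef + gp.predict(xt.reshape(-1, 1))
print(np.c_[model(xt), pred])
```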
