General theory of the modified Gutenberg-Richter law for large seismic moments
The Gutenberg-Richter power law distribution of earthquake sizes is one of
the most famous examples illustrating self-similarity. It is well-known that the
Gutenberg-Richter distribution has to be modified for large seismic moments,
due to energy conservation and geometrical reasons. Several models have been
proposed, either in terms of a second power law with a larger b-value beyond a
cross-over magnitude, or based on a ``hard'' magnitude cut-off or a ``soft''
magnitude cut-off using an exponential taper. Since the large scale tectonic
deformation is dominated by the very largest earthquakes and since their impact
on loss of life and properties is huge, it is of great importance to constrain
as much as possible the shape of their distribution. We present a simple and
powerful probabilistic theoretical approach that shows that the Gamma
distribution is the best model, under the two hypotheses that the
Gutenberg-Richter power law distribution holds in the absence of any condition
(condition of criticality) and that one or several constraints are imposed,
either based on conservation laws or on the nature of the observations
themselves. The selection of the Gamma distribution does not depend on the
specific nature of the constraint. We illustrate the approach with two
constraints, the existence of a finite moment release rate and the observation
of the size of a maximum earthquake in a finite catalog. Our predicted ``soft''
maximum magnitudes compare favorably with those obtained by Kagan [1997] for
the Flinn-Engdahl regionalization of subduction zones, collision zones and
mid-ocean ridges.
Comment: 24 pages, including 3 tables, in press in Bull. Seism. Soc. Am.
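The "soft" taper discussed above can be illustrated numerically. This is a minimal sketch, not a fitted model: the exponent `beta`, lower cutoff `M_min`, and corner moment `M_c` are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (all values are assumptions, not fitted): the modified
# Gutenberg-Richter law multiplies the pure power law survival function of
# seismic moment M by a "soft" exponential taper beyond a corner moment M_c,
# yielding a Gamma-type distribution.
beta = 0.66          # assumed moment exponent (b ~ 1 in magnitude units)
M_min = 1e18         # hypothetical lower moment cutoff (N m)
M_c = 1e21           # hypothetical corner moment controlling the taper

M = np.logspace(18, 23, 6)
pure_gr = (M / M_min) ** (-beta)                 # unmodified power law
tapered = pure_gr * np.exp(-(M - M_min) / M_c)   # Gamma-type soft taper

# The taper barely affects small moments but strongly suppresses the
# probability of the very largest events.
```

The point of the sketch is only that the modification acts selectively on the tail, which is why it is compatible with the power law observed at small and intermediate magnitudes.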
Acoustic fluidization for earthquakes?
Melosh [1996] has suggested that acoustic fluidization could provide an
alternative to theories that are invoked as explanations for why some crustal
faults appear to be weak. We show that there is a subtle but profound
inconsistency in the theory that unfortunately invalidates the results. We
propose possible remedies but must acknowledge that the relevance of acoustic
fluidization remains an open question.
Comment: 13 pages
Renormalization Group Analysis of the 2000-2002 anti-bubble in the US S&P 500 index: Explanation of the hierarchy of 5 crashes and Prediction
We propose a straightforward extension of our previously proposed
log-periodic power law model of the ``anti-bubble'' regime of the USA market
since the summer of 2000, in terms of the renormalization group framework to
model critical points. Using previous work by Gluzman and Sornette (2002) on
the classification of the class of Weierstrass-like functions, we show that the
five crashes that occurred since August 2000 can be accurately modelled by this
approach, in a fully consistent way with no additional parameters. Our theory
suggests an overall consistent organization of the investors, forming a
collective network that interacts to produce the pessimistic bearish
``anti-bubble'' regime with intermittent acceleration of the positive feedbacks
of pessimistic sentiment leading to these crashes. We develop retrospective
predictions that confirm the existence of significant arbitrage opportunities
for a trader using our model. Finally, we offer a prediction for the unknown
future of the US S&P500 index extending over 2003 and 2004, that refines the
previous prediction of Sornette and Zhou (2002).
Comment: Latex document, 11 eps figures and 1 table
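The anti-bubble fits described above rest on a log-periodic power law form. A minimal sketch follows; the parameter values are illustrative only, not the published S&P 500 fit.

```python
import numpy as np

def lppl_antibubble(t, tc, A, B, m, C, omega, phi):
    """Log-periodic power law for an anti-bubble (valid for t > tc):
    ln p(t) = A + B*(t - tc)**m * (1 + C*cos(omega*ln(t - tc) + phi)).
    The oscillations are periodic in ln(t - tc), which is what produces a
    hierarchy of characteristic times at which drops cluster."""
    tau = t - tc
    return A + B * tau**m * (1.0 + C * np.cos(omega * np.log(tau) + phi))

# Illustrative parameters only: a decaying, log-periodically decorated
# log-index level after the critical time tc.
t = np.linspace(101.0, 800.0, 700)
log_p = lppl_antibubble(t, tc=100.0, A=7.0, B=-0.05, m=0.5, C=0.2,
                        omega=8.0, phi=0.0)
```

With B < 0 the trend decays after tc, and the cosine term superimposes the oscillations whose local minima correspond to the hierarchy of crashes.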
The 2006-2008 Oil Bubble and Beyond
We present an analysis of oil prices in US$ and in other major currencies
that diagnoses unsustainable faster-than-exponential behavior. This supports
the hypothesis that the recent oil price run-up has been amplified by
speculative behavior of the type found during a bubble-like expansion. We also
attempt to unravel the information hidden in the oil supply-demand data
reported by two leading agencies, the US Energy Information Administration
(EIA) and the International Energy Agency (IEA). We suggest that the growing
discrepancy we find between the EIA and IEA figures provides a measure of the
estimation errors. Rather than a clear transition to a supply-restricted
regime, we interpret the discrepancy between the IEA and EIA as a signature of
uncertainty, and there is no better fuel than uncertainty to promote
speculation!
Comment: 4 pages, 4 figures; discussion of the oil supply-demand viewpoint
and uncertainties
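The "faster-than-exponential" diagnostic can be illustrated on synthetic data (not real oil prices): for a super-exponential path the log-returns themselves grow through time, whereas a plain exponential has constant log-returns.

```python
import numpy as np

# Synthetic illustration, not EIA/IEA data: p(t) = exp(a*t^2) grows faster
# than any exponential, and its log-returns increase linearly in t -- the
# signature of an unsustainable, bubble-like regime.
t = np.arange(100)
a = 1e-4                               # illustrative growth parameter
price = np.exp(a * t**2)
log_returns = np.diff(np.log(price))   # equals a*(2t + 1): rises with t
```

By contrast, `np.exp(0.01 * t)` would give a constant log-return of 0.01, so a rising sequence of log-returns is the operational test for the regime described above.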
Predictability of catastrophic events: material rupture, earthquakes, turbulence, financial crashes and human birth
We propose that catastrophic events are "outliers" with statistically
different properties than the rest of the population and result from mechanisms
involving amplifying critical cascades. Applications and the potential for
prediction are discussed in relation to the rupture of composite materials,
great earthquakes, turbulence and abrupt changes of weather regimes, financial
crashes and human parturition (birth).
Comment: Latex document of 22 pages including 6 ps figures, in press in PNAS
Icequakes coupled with surface displacements for predicting glacier break-off
A hanging glacier at the east face of Weisshorn (Switzerland) broke off in
2005. We were able to monitor and measure surface motion and icequake activity
for 25 days up to three days prior to the break-off. The analysis of seismic
waves generated by the glacier during the rupture maturation process revealed
four types of precursory signals of the imminent catastrophic rupture: (i) an
increase in seismic activity within the glacier, (ii) a decrease in the waiting
time between two successive icequakes, (iii) a change in the size-frequency
distribution of icequake energy, and (iv) a modification in the structure of
the waiting time distributions between two successive icequakes. Moreover, it
was possible to demonstrate the existence of a correlation between the seismic
activity and the log-periodic oscillations of the surface velocities
superimposed on the global acceleration of the glacier during the rupture
maturation. Analysis of the seismic activity led us to the identification of
two regimes: a stable phase with diffuse damage, and an unstable and dangerous
phase characterized by a hierarchical cascade of rupture instabilities where
large icequakes are triggered.
Comment: 16 pages, 7 figures
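Precursor (ii), the decreasing waiting time between successive icequakes, can be sketched as follows. The event times below are hypothetical, not the Weisshorn catalog; a real catalog would come from picked seismic events.

```python
import numpy as np

# Hypothetical icequake occurrence times in seconds. Shrinking inter-event
# waiting times signal an accelerating rupture-maturation process.
event_times = np.array([0.0, 50.0, 95.0, 135.0, 170.0, 200.0, 225.0, 245.0])
waits = np.diff(event_times)       # inter-event waiting times
early_mean = waits[:3].mean()      # mean waiting time, early window
late_mean = waits[-3:].mean()      # mean waiting time, late window
```

Comparing early and late windows of the waiting-time sequence is the simplest version of the diagnostic; the abstract's precursors (iii) and (iv) refine it by examining the full size-frequency and waiting-time distributions.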
"Slimming" of power law tails by increasing market returns
We introduce a simple generalization of rational bubble models which removes
the fundamental problem discovered by [Lux and Sornette, 1999] that the
distribution of returns is a power law with exponent less than 1, in
contradiction with empirical data. The idea is that the price fluctuations
associated with bubbles must on average grow with the mean market return r.
When r is larger than the discount rate r_delta, the distribution of returns of
the observable price, sum of the bubble component and of the fundamental price,
exhibits an intermediate tail with an exponent which can be larger than 1. This
regime r>r_delta corresponds to a generalization of the rational bubble model
in which the fundamental price is no more given by the discounted value of
future dividends. We explain how this is possible. Our model predicts that the
higher the market remuneration r is above the discount rate, the larger the
power law exponent and thus the thinner the tail of the distribution of price
returns.
Comment: 13 pages + 4 figures
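The power-law tails in rational bubble models of this kind arise from a multiplicative (Kesten-type) recursion for the bubble component, whose tail exponent mu solves E[a^mu] = 1. The following numerical sketch is illustrative only: the lognormal multiplier parameters and the Hill-estimator settings are assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Kesten recursion X_{t+1} = a_t * X_t + b_t: contracting on average
# (E[ln a] < 0) but with occasional large multipliers, it generates a
# power-law tail P(X > x) ~ x^{-mu}, where mu solves E[a^mu] = 1.
# For a ~ lognormal(m, s), mu = -2*m/s**2; here m = -0.2, s = 0.5 -> mu = 1.6.
n = 200_000
a = rng.lognormal(mean=-0.2, sigma=0.5, size=n)
x = np.empty(n)
x[0] = 1.0
for i in range(1, n):
    x[i] = a[i] * x[i - 1] + 1.0

# Hill estimator of the tail exponent from the 2000 largest values
tail = np.sort(x)[-2000:]
mu_hat = 1.0 / np.mean(np.log(tail / tail[0]))
```

Raising the average multiplier (the analogue of raising the market return r above the discount rate) shifts the solution of E[a^mu] = 1 and hence the tail exponent, which is the "slimming" mechanism the abstract describes.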
Log-periodic Oscillations for Biased Diffusion in 3D Random Lattices
Random walks with a fixed bias direction on randomly diluted cubic lattices
far above the percolation threshold exhibit log-periodic oscillations in the
effective exponent versus time. A scaling argument accounts for the numerical
results in the limit of large biases and small dilution and shows the
importance of the interplay of these two ingredients in the generation of the
log-periodicity. These results show that log-periodicity is the dominant effect
compared to previous predictions of, and reports on, anomalous diffusion.
Comment: 5 pages, 3 figures; revised with improved theory and better comparison
to simulations and finite-size effects, in press in Physica
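The "effective exponent versus time" diagnostic can be sketched on a synthetic log-periodic law; the exponent, amplitude, and frequency below are illustrative and not taken from the lattice simulations.

```python
import numpy as np

# If <x(t)> = t^alpha * (1 + eps*cos(omega*ln t)), the local slope
# d ln<x> / d ln t oscillates around alpha with a fixed period in ln t:
# the log-periodic signature reported for biased diffusion.
alpha, eps = 0.8, 0.05                 # illustrative exponent and amplitude
omega = 2.0 * np.pi / np.log(10.0)     # one oscillation per decade in t

t = np.logspace(0.0, 6.0, 4000)
x = t**alpha * (1.0 + eps * np.cos(omega * np.log(t)))
eff_exponent = np.gradient(np.log(x), np.log(t))   # local log-log slope
```

Plotting `eff_exponent` against `np.log(t)` would show oscillations of fixed period around `alpha`, which is how log-periodicity is distinguished from a smooth crossover in anomalous-diffusion data.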
Statistical Physics of Rupture in Heterogeneous Media
The damage and fracture of materials are technologically of enormous interest
due to their economic and human cost. They cover a wide range of phenomena like
e.g. cracking of glass, aging of concrete, the failure of fiber networks in the
formation of paper and the breaking of a metal bar subject to an external load.
Failure of composite systems is of utmost importance in naval, aeronautics and
space industry. By the term composite, we refer to materials with heterogeneous
microscopic structures and also to assemblages of macroscopic elements forming
a super-structure. Chemical and nuclear plants suffer from cracking due to
corrosion either of chemical or radioactive origin, aided by thermal and/or
mechanical stress. Despite the large amount of experimental data and the
considerable effort that has been undertaken by material scientists, many
questions about fracture have not been answered yet. There is no comprehensive
understanding of rupture phenomena but only a partial classification in
restricted and relatively simple situations. This lack of fundamental
understanding is indeed reflected in the absence of reliable prediction methods
for rupture, based on a suitable monitoring of the stressed system. Not only is
there a lack of non-empirical understanding of the reliability of a system, but
also the empirical laws themselves have often limited value. The difficulties
stem from the complex interplay between heterogeneities and modes of damage and
the possible existence of a hierarchy of characteristic scales (static and
dynamic).
The paper presents a review of recent efforts from the statistical physics
community to address these points.
Comment: Enlarged review and updated references, 21 pages with 2 figures
