Properties of Foreshocks and Aftershocks of the Non-Conservative SOC Olami-Feder-Christensen Model: Triggered or Critical Earthquakes?
Following Hergarten and Neugebauer [2002] who discovered aftershock and
foreshock sequences in the Olami-Feder-Christensen (OFC) discrete block-spring
earthquake model, we investigate to what degree the simple toppling mechanism
of this model is sufficient to account for the properties of earthquake
clustering in time and space. Our main finding is that synthetic catalogs
generated by the OFC model share practically all properties of real seismicity
at a qualitative level, albeit with significant quantitative differences. We
find that OFC catalogs can be in large part described by the concept of
triggered seismicity but the properties of foreshocks depend on the mainshock
magnitude, in qualitative agreement with the critical earthquake model and in
disagreement with simple models of triggered seismicity such as the Epidemic
Type Aftershock Sequence (ETAS) model [Ogata, 1988]. Many other features of OFC
catalogs can be reproduced with the ETAS model with a weaker clustering than
real seismicity, i.e., with a very small average number of first-generation
triggered earthquakes per mother earthquake.
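The toppling mechanism of the OFC model discussed above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the lattice size, threshold, and dissipation parameter are illustrative choices, and `ALPHA < 0.25` on a square lattice is what makes the redistribution non-conservative.

```python
import random

# Hedged sketch of the Olami-Feder-Christensen (OFC) block-spring model
# on a small square lattice with open boundaries. All parameter values
# are illustrative assumptions, not values from the paper.
random.seed(0)
L = 16            # lattice side length (assumption)
F_C = 1.0         # toppling threshold
ALPHA = 0.2       # fraction of stress passed to each of the 4 neighbours
                  # (ALPHA < 0.25 makes the dynamics non-conservative)

grid = [[random.uniform(0.0, F_C) for _ in range(L)] for _ in range(L)]

def drive_and_relax(grid):
    """Advance the system by one 'earthquake': load the most stressed
    site up to threshold, then topple until every site is below F_C.
    Returns the event size (number of topplings)."""
    # Uniform drive: raise all sites by the gap of the most stressed one.
    gap = F_C - max(max(row) for row in grid)
    for i in range(L):
        for j in range(L):
            grid[i][j] += gap
    size = 0
    unstable = [(i, j) for i in range(L) for j in range(L)
                if grid[i][j] >= F_C]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < F_C:      # may have been relaxed already
            continue
        stress = grid[i][j]
        grid[i][j] = 0.0          # toppling site resets
        size += 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < L and 0 <= nj < L:   # open boundaries dissipate
                grid[ni][nj] += ALPHA * stress
                if grid[ni][nj] >= F_C:
                    unstable.append((ni, nj))
    return size

sizes = [drive_and_relax(grid) for _ in range(200)]
```

Because each toppling removes at least a fraction `1 - 4*ALPHA` of the toppled stress, every avalanche terminates; the resulting event-size statistics are what the synthetic catalogs are built from.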
Statistical analysis of rockfall volume distributions: implications for rockfall dynamics.
We analyze the volume distribution of natural rockfalls in different geological settings (calcareous cliffs in the French Alps, Grenoble area, and granite Yosemite cliffs, California Sierra) and different volume ranges (regional and worldwide catalogs). Contrary to previous studies that included several types of landslides, we restrict our analysis to rockfall sources that originated on subvertical cliffs. For the three data sets, we find that the rockfall volumes follow a power law distribution with a similar exponent value, within error bars. This power law distribution was also proposed for rockfall volumes that occurred along road cuts. All these results argue for a recurrent power law distribution of rockfall volumes on subvertical cliffs over a large range of rockfall sizes (10^2–10^10 m^3), regardless of the geological settings and of the preexisting geometry of fracture patterns, which are drastically different in the three studied areas. The power law distribution of rockfall volumes could emerge from two types of processes. First, the observed power law distribution of rockfall volumes is similar to the one reported for both fragmentation experiments and fragmentation models. This argues for the geometry of rock mass fragment sizes possibly controlling the rockfall volumes; in that case, neither cascade nor avalanche processes would influence the rockfall volume distribution. Second, without any requirement of scale-invariant quenched heterogeneity patterns, the rock mass dynamics can arise from avalanche processes driven by fluctuations of the rock mass properties, e.g., cohesion or friction angle. This model may also explain the power law distribution reported for landslides involving unconsolidated materials. We find that the exponent value of rockfall volumes on subvertical cliffs, 0.5 ± 0.2, is significantly smaller than the 1.2 ± 0.3 value reported for mixed landslide types.
This change of exponents can be driven by the material strength, which controls the in situ topographic slope values, as simulated in numerical models of landslides [Densmore et al., 1998; Champel et al., 2002].
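The exponent of a cumulative power law distribution like the one above, P(V > v) ∝ v^(-b), is commonly estimated by maximum likelihood (the Hill estimator). A minimal sketch, using a synthetic catalog in place of real rockfall volumes:

```python
import math
import random

# Hedged sketch: maximum-likelihood (Hill) estimate of the exponent b in
# P(V > v) ~ (v / v_min)**(-b) for volumes above a cutoff v_min. The
# synthetic catalog below is an illustration, not real rockfall data.

def hill_exponent(volumes, v_min):
    """MLE of the complementary-cumulative power law exponent:
    b_hat = n / sum(ln(v_i / v_min)) over the tail v_i >= v_min."""
    tail = [v for v in volumes if v >= v_min]
    return len(tail) / sum(math.log(v / v_min) for v in tail)

# Inverse-transform sampling: if U is uniform on (0, 1), then
# V = v_min * U**(-1/b) satisfies P(V > v) = (v / v_min)**(-b).
random.seed(1)
v_min, b_true = 100.0, 0.5   # 0.5 mimics the subvertical-cliff exponent
catalog = [v_min * random.random() ** (-1.0 / b_true) for _ in range(5000)]

b_hat = hill_exponent(catalog, v_min)
```

With 5000 samples the estimate lands close to the true exponent 0.5; on real catalogs the choice of `v_min` (the completeness cutoff) dominates the uncertainty.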
Scale free networks of earthquakes and aftershocks
We propose a new metric to quantify the correlation between any two
earthquakes. The metric consists of a product involving the time interval and
spatial distance between two events, as well as the magnitude of the first one.
According to this metric, events typically are strongly correlated to only one
or a few preceding ones. Thus a classification of events as foreshocks, main
shocks or aftershocks emerges automatically without imposing predefined
space-time windows. To construct a network, each earthquake receives an
incoming link from its most correlated predecessor. The number of aftershocks
for any event, identified by its outgoing links, is found to be scale free with
exponent . The original Omori law with emerges as a
robust feature of seismicity, holding up to years even for aftershock sequences
initiated by intermediate magnitude events. The measured fat-tailed
distribution of distances between earthquakes and their aftershocks suggests
that aftershock collection with fixed space windows is not appropriate.
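The linking scheme described above can be sketched as follows. The exact functional form of the correlation metric and the parameter values here are assumptions for illustration: we take n_ij = dt_ij · r_ij^df · 10^(-b·m_i), combining the time interval, the spatial distance, and the magnitude of the earlier event, with smaller values meaning stronger correlation.

```python
import math

# Hedged sketch of parent-linking in an earthquake network. The metric
# form and the values of B (Gutenberg-Richter b-value) and DF (fractal
# dimension of epicenters) are illustrative assumptions.
B, DF = 1.0, 1.6

# Toy catalog: (time, x, y, magnitude)
catalog = [
    (0.0, 0.0, 0.0, 6.0),   # large event
    (1.0, 0.5, 0.2, 3.1),   # nearby, soon after
    (2.0, 9.0, 9.0, 4.0),   # far away
    (2.5, 9.2, 9.1, 3.0),   # close in space and time to the previous one
]

def metric(ei, ej):
    """Correlation metric between an earlier event ei and a later ej;
    small values mean ej is strongly correlated to ei."""
    ti, xi, yi, mi = ei
    tj, xj, yj, _ = ej
    dt = tj - ti
    r = math.hypot(xj - xi, yj - yi)
    return dt * r ** DF * 10.0 ** (-B * mi)

def link_parents(catalog):
    """Give each event an incoming link from its most correlated
    (minimum-metric) predecessor; the first event has no parent."""
    parents = [None]
    for j in range(1, len(catalog)):
        parents.append(min(range(j),
                           key=lambda i: metric(catalog[i], catalog[j])))
    return parents

parents = link_parents(catalog)
```

On this toy catalog the two early events attach to the magnitude-6 mainshock, while the last event attaches to its immediate small neighbour; counting each event's outgoing links then gives its number of aftershocks, with no fixed space-time window imposed.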
Response Functions to Critical Shocks in Social Sciences: An Empirical and Numerical Study
We show that, provided one focuses on properly selected episodes, one can
apply to the social sciences the same observational strategy that has proved
successful in natural sciences such as astrophysics or geodynamics. For
instance, in order to probe the cohesion of a policy, one can, in different
countries, study the reactions to some huge and sudden exogenous shocks, which
we call Dirac shocks. This approach naturally leads to the notion of structural
(as opposed or complementary to temporal) forecast. Although structural
predictions are by far the most common way to test theories in the natural
sciences, they have been much less used in the social sciences. The Dirac shock
approach opens the way to testing structural predictions in the social
sciences. The examples reported here suggest that critical events are able to
reveal pre-existing "cracks" because they probe the social cohesion, which is
an indicator and predictor of future evolution of the system, and in some cases
foreshadows a bifurcation. We complement our empirical work with numerical
simulations of the response function ("damage spreading") to Dirac shocks in
the Sznajd model of consensus build-up. We quantify the slow relaxation of the
difference between perturbed and unperturbed systems, the conditions under
which the consensus is modified by the shock and the large variability from one
realization to another.
Universal features of correlated bursty behaviour
Inhomogeneous temporal processes, like those appearing in human
communications, neuron spike trains, and seismic signals, consist of
high-activity bursty intervals alternating with long low-activity periods. In
recent studies such bursty behavior has been characterized by a fat-tailed
inter-event time distribution, while temporal correlations were measured by the
autocorrelation function. However, these characteristic functions cannot
fully characterize temporally correlated heterogeneous behavior. Here
we show that the distribution of the number of events in a bursty period serves
as a good indicator of the dependencies, leading to the universal observation
of power-law distribution in a broad class of phenomena. We find that the
correlations in these quite different systems can be commonly interpreted by
memory effects and described by a simple phenomenological model, which displays
temporal behavior qualitatively similar to that in real systems.
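The burst-size statistic proposed above is simple to compute: fixing a timescale, consecutive events separated by less than that threshold belong to the same bursty period, and one tallies the number of events per burst. A minimal sketch, assuming a single threshold delimits bursts:

```python
# Hedged sketch of the burst-size distribution indicator: split a sorted
# event-time series into bursts using an inter-event time threshold and
# count the events in each burst. P(E) over these counts is the
# dependency indicator discussed above.

def burst_sizes(times, dt_threshold):
    """Return the number of events in each bursty period, where events
    closer than dt_threshold belong to the same burst."""
    sizes = [1]
    for prev, cur in zip(times, times[1:]):
        if cur - prev <= dt_threshold:
            sizes[-1] += 1      # same burst continues
        else:
            sizes.append(1)     # long gap starts a new burst
    return sizes

# Toy series: two dense bursts separated by long gaps, then a lone event.
times = [0.0, 0.4, 0.9, 10.0, 10.2, 10.3, 10.9, 30.0]
sizes = burst_sizes(times, dt_threshold=1.0)
# sizes is [3, 4, 1]: a 3-event burst, a 4-event burst, a single event
```

For genuinely correlated signals the histogram of `sizes` is fat-tailed, whereas an independent (renewal) process with the same inter-event time distribution produces exponentially decaying burst sizes.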
Dragon-kings: mechanisms, statistical methods and empirical evidence
This introductory article presents the special Discussion and Debate volume
"From black swans to dragon-kings, is there life beyond power laws?" published
in Eur. Phys. J. Special Topics in May 2012. We summarize and put in
perspective the contributions into three main themes: (i) mechanisms for
dragon-kings, (ii) detection of dragon-kings and statistical tests and (iii)
empirical evidence in a large variety of natural and social systems. Overall,
we are pleased to witness significant advances both in the introduction and
clarification of underlying mechanisms and in the development of novel
efficient tests that demonstrate clear evidence for the presence of
dragon-kings in many systems. However, this positive view should be balanced by
the fact that this remains a very delicate and difficult field, if only due to
the scarcity of data as well as the extraordinarily important implications with
respect to hazard assessment, risk control and predictability.
On the Occurrence of Finite-Time-Singularities in Epidemic Models of Rupture, Earthquakes and Starquakes
We present a new kind of critical stochastic finite-time-singularity, relying
on the interplay between long-memory and extreme fluctuations. We illustrate it
on the well-established epidemic-type aftershock (ETAS) model for aftershocks,
based solely on the most solidly documented stylized facts of seismicity
(clustering in space and in time and power law Gutenberg-Richter distribution
of earthquake energies). This theory accounts for the main observations (power
law acceleration and discrete scale invariant structure) of critical rupture of
heterogeneous materials, of the largest sequence of starquakes ever attributed
to a neutron star as well as of earthquake sequences.Comment: Revtex document of 4 pages including 1 eps figur
An observational test of the origin of accelerating moment release before large earthquakes
A progressive increase of seismic activity distributed over a wide region around a future earthquake epicenter is termed accelerating moment release (AMR). This phenomenon has been observed in several studies over the last 15 years, although there is no consensus about the physical origin of the effect. In a recent hypothesis known as the stress accumulation (SA) model, the AMR is thought to result from the last stage of loading in the earthquake cycle. In this view, the increasing seismicity is due to minor stress release as the whole region becomes sufficiently stressed for the major event to occur. The stress accumulation model makes specific predictions about the distribution of events in an AMR sequence. Because the AMR is predicted to be a result of loading on the main fault, the precursory activity should be concentrated in the positive lobes of the far-field stresses calculated by a backslip dislocation model of the main shock. To test this model, AMR is first found in optimal circular regions around the epicenters of each of the M_w ≥ 6.5 earthquakes in central and southern California since 1950. A backslip dislocation model is then used to determine which of the precursory events occur in the regions predicted by stress accumulation. AMR is shown to occur preferentially in the lobes of the backslip stress field predicted by the stress accumulation model.
Characterization of rockfalls from seismic signal: insights from laboratory experiments
The seismic signals generated by rockfalls can provide information on their dynamics and location. However, the lack of field observations makes it difficult to establish clear relationships between the characteristics of the signal and the source. In this study, scaling laws are derived from analytical impact models to relate the mass and the speed of an individual impactor to the radiated elastic energy and the frequency content of the emitted seismic signal. It appears that the radiated elastic energy and frequencies decrease when the impact is viscoelastic or elasto-plastic compared to the case of an elastic impact. The scaling laws are validated with laboratory experiments of impacts of beads and gravels on smooth thin plates and rough thick blocks. Regardless of the involved materials, the masses and speeds of the impactors are retrieved from seismic measurements within a factor of 3. A quantitative energy budget of the impacts is established. On smooth thin plates, the lost energy is either radiated in elastic waves or dissipated in viscoelasticity when the impactor is large or small with respect to the plate thickness, respectively. In contrast, on rough thick blocks, the elastic energy radiation represents less than 5% of the lost energy. Most of the energy is lost in plastic deformation or rotation modes of the bead owing to surface roughness. Finally, we estimate the elastic energy radiated during field-scale rockfall experiments. This energy is shown to be proportional to the boulder mass, in agreement with the theoretical scaling laws.
Quantum walks: a comprehensive review
Quantum walks, the quantum-mechanical counterparts of classical random walks,
are an advanced tool for building quantum algorithms and have recently been
shown to constitute a universal model of quantum computation. Quantum walks are
now a solid field of research in quantum computation, full of exciting open
problems for physicists, computer scientists, mathematicians, and engineers.
In this paper we review theoretical advances on the foundations of both
discrete- and continuous-time quantum walks, together with the role that
randomness plays in quantum walks, the connections between the mathematical
models of coined discrete quantum walks and continuous quantum walks, the
quantumness of quantum walks, a summary of papers published on discrete quantum
walks and entanglement as well as a succinct review of experimental proposals
and realizations of discrete-time quantum walks. Furthermore, we review
several algorithms based on both discrete- and continuous-time quantum walks,
as well as a most important result: the computational universality of both
continuous- and discrete-time quantum walks.
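A discrete-time coined quantum walk of the kind reviewed above is easy to simulate directly. A minimal sketch on the line with a Hadamard coin, using plain Python complex amplitudes (no quantum library assumed):

```python
import math

# Hedged sketch of a discrete-time coined quantum walk on the integer
# line. The state maps position -> (amplitude with coin 'up', amplitude
# with coin 'down'); each step applies a Hadamard coin, then shifts the
# 'up' amplitude right and the 'down' amplitude left.
H = 1.0 / math.sqrt(2.0)

def step(state):
    """One unitary walk step: Hadamard coin, then conditional shift."""
    new = {}
    for x, (up, down) in state.items():
        cu, cd = H * (up + down), H * (up - down)   # Hadamard coin
        ru, rd = new.get(x + 1, (0j, 0j))
        new[x + 1] = (ru + cu, rd)                  # 'up' moves right
        lu, ld = new.get(x - 1, (0j, 0j))
        new[x - 1] = (lu, ld + cd)                  # 'down' moves left
    return new

# Symmetric initial coin state localized at the origin
state = {0: (1.0 / math.sqrt(2.0) + 0j, 1j / math.sqrt(2.0))}
for _ in range(50):
    state = step(state)

prob = {x: abs(up) ** 2 + abs(down) ** 2 for x, (up, down) in state.items()}
```

Unlike a classical random walk, whose position spreads as the square root of the number of steps, the probability distribution `prob` spreads ballistically, with characteristic peaks near the edges of the reachable interval.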
