Effects of climate extremes on the terrestrial carbon cycle: concepts, processes and potential future impacts
Acknowledgements: This work emerged from the CARBO-Extreme project, funded by the European Community's 7th Framework Programme under grant agreement FP7-ENV-2008-1-226701. We are grateful to the reviewers and the subject editor for helpful guidance. We thank Silvana Schott for graphic support. Mirco Miglivacca provided helpful comments on the manuscript. Michael Bahn acknowledges support from the Austrian Science Fund (FWF; P22214-B17). Sara Vicca is a postdoctoral research associate of the Fund for Scientific Research – Flanders. Wolfgang Cramer contributes to the Labex OT-Med (no. ANR-11-LABX-0061) funded by the French government through the A*MIDEX project (no. ANR-11-IDEX-0001-02). Flurin Babst acknowledges support from the Swiss National Science Foundation (P300P2_154543).
Crack Front Waves and the dynamics of a rapidly moving crack
Crack front waves are localized waves that propagate along the leading edge
of a crack. They are generated by the interaction of a crack with a localized
material inhomogeneity. We show that front waves are nonlinear entities that
transport energy, generate surface structure and lead to localized velocity
fluctuations. Their existence locally imparts inertia to initially "massless" cracks, an effect not incorporated in current theories of fracture. This, coupled with crack instabilities, yields both inhomogeneity and scaling behavior within fracture surface structure.
Structure-Preserving Smooth Projective Hashing
Smooth projective hashing has proven to be an extremely useful primitive, in particular when used in conjunction with commitments to provide implicit decommitment. This has led to applications proven secure in the UC framework, even in the presence of an adversary capable of adaptive corruptions, such as Password-Authenticated Key Exchange (PAKE) and 1-out-of-m Oblivious Transfer (OT). However, such solutions still lack efficiency, since their cost scales heavily with the underlying message length. Structure-preserving cryptography aims at providing elegant and efficient schemes based on classical assumptions and standard group operations on group elements. A recent trend focuses on constructions of structure-preserving signatures, which require messages, signatures and verification keys to lie in the base group, while the verification equations consist only of pairing-product equations. Classical constructions of Smooth Projective Hash Functions suffer from the same limitation as classical signatures: at least one part of the computation (messages for signatures, witnesses for SPHFs) is a scalar. In this work, we introduce and instantiate the concept of Structure-Preserving Smooth Projective Hash Functions, and give as applications more efficient instantiations of one-round PAKE and three-round OT, as well as information retrieval via Anonymous Credentials, all UC-secure against adaptive adversaries.
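To make the primitive concrete, here is a minimal toy smooth projective hash function for the classical DDH language, in the Cramer-Shoup style. The tiny group parameters and function names are our own illustration, not the paper's construction; note that the witness r is exactly the scalar whose elimination motivates the structure-preserving variant described above.

```python
import secrets

# Toy DDH-based smooth projective hash for the language
# L = {(u1, u2) : u1 = g1^r, u2 = g2^r}  (DDH tuples).
# Parameters are illustrative only; real schemes use large prime-order groups.
p, q = 2039, 1019          # p = 2q + 1, both prime
g1, g2 = 4, 9              # generators of the order-q subgroup of squares

def keygen():
    a1, a2 = secrets.randbelow(q), secrets.randbelow(q)
    hk = (a1, a2)                                 # hashing key (secret)
    hp = (pow(g1, a1, p) * pow(g2, a2, p)) % p    # projection key (public)
    return hk, hp

def hash_with_hk(hk, word):
    # Anyone holding hk can hash any word, in or out of the language.
    u1, u2 = word
    a1, a2 = hk
    return (pow(u1, a1, p) * pow(u2, a2, p)) % p

def hash_with_hp(hp, r):
    # Holding only hp, the same value is computable iff one knows a witness r.
    return pow(hp, r, p)

r = secrets.randbelow(q)
word = (pow(g1, r, p), pow(g2, r, p))   # a word in L, with witness r
hk, hp = keygen()
assert hash_with_hk(hk, word) == hash_with_hp(hp, r)
```

The two hash paths agree because u1^a1 * u2^a2 = (g1^a1 * g2^a2)^r = hp^r for words in the language; smoothness (hash values of words outside the language look random given hp) is what the real constructions prove.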
Can One Trust Quantum Simulators?
Various fundamental phenomena of strongly-correlated quantum systems such as
high-temperature superconductivity, the fractional quantum-Hall effect, and quark
confinement are still awaiting a universally accepted explanation. The main
obstacle is the computational complexity of solving even the most simplified
theoretical models that are designed to capture the relevant quantum
correlations of the many-body system of interest. In his seminal 1982 paper
[Int. J. Theor. Phys. 21, 467], Richard Feynman suggested that such models
might be solved by "simulation" with a new type of computer whose constituent
parts are effectively governed by a desired quantum many-body dynamics.
Measurements on this engineered machine, now known as a "quantum simulator,"
would reveal some unknown or difficult to compute properties of a model of
interest. We argue that a useful quantum simulator must satisfy four
conditions: relevance, controllability, reliability, and efficiency. We review
the current state of the art of digital and analog quantum simulators. Whereas
so far the majority of the focus, both theoretically and experimentally, has
been on controllability of relevant models, we emphasize here the need for a
careful analysis of reliability and efficiency in the presence of
imperfections. We discuss how disorder and noise can impact these conditions,
and illustrate our concerns with novel numerical simulations of a paradigmatic
example: a disordered quantum spin chain governed by the Ising model in a
transverse magnetic field. We find that disorder can decrease the reliability
of an analog quantum simulator of this model, although large errors in local
observables are introduced only for strong levels of disorder. We conclude that
the answer to the question "Can we trust quantum simulators?" is... to some
extent.
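The paradigmatic example above can be reproduced in miniature by exact diagonalization. The sketch below is our own illustration, not the paper's numerics: it builds a small open transverse-field Ising chain with disordered bonds and shows how the disorder shifts a local observable relative to the clean chain.

```python
import numpy as np

# Disordered transverse-field Ising chain,
# H = -sum_i J_i s^z_i s^z_{i+1} - h sum_i s^x_i,
# diagonalized exactly for a few spins.
sx = np.array([[0, 1], [1, 0]], dtype=float)
sz = np.array([[1, 0], [0, -1]], dtype=float)

def op(single, site, n):
    """Embed a single-site operator at `site` in an n-spin chain."""
    mats = [np.eye(2)] * n
    mats[site] = single
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def ising_hamiltonian(couplings, h):
    n = len(couplings) + 1                  # open chain
    H = np.zeros((2**n, 2**n))
    for i, J in enumerate(couplings):
        H -= J * op(sz, i, n) @ op(sz, i + 1, n)
    for i in range(n):
        H -= h * op(sx, i, n)
    return H

def ground_state_sx(couplings, h, site=0):
    """Ground-state expectation value of s^x at one site."""
    H = ising_hamiltonian(couplings, h)
    _, vecs = np.linalg.eigh(H)
    psi = vecs[:, 0]
    return psi @ op(sx, site, len(couplings) + 1) @ psi

rng = np.random.default_rng(0)
clean = ground_state_sx([1.0] * 5, h=1.0)
noisy = ground_state_sx(1.0 + 0.5 * rng.standard_normal(5), h=1.0)
print(abs(noisy - clean))   # disorder-induced error in a local observable
```

Scanning the disorder strength (here fixed at 0.5) is a crude version of the reliability analysis the abstract calls for: small disorder perturbs local observables only weakly, while strong disorder does not.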
Shot noise in mesoscopic systems
This is a review of shot noise, the time-dependent fluctuations in the
electrical current due to the discreteness of the electron charge, in small
conductors. The shot-noise power can be smaller than that of a Poisson process
as a result of correlations in the electron transmission imposed by the Pauli
principle. This suppression takes on simple universal values in a symmetric
double-barrier junction (suppression factor 1/2), a disordered metal (factor
1/3), and a chaotic cavity (factor 1/4). Loss of phase coherence has no effect
on this shot-noise suppression, while thermalization of the electrons due to
electron-electron scattering increases the shot noise slightly. Sub-Poissonian
shot noise has been observed experimentally. So far unobserved phenomena
involve the interplay of shot noise with the Aharonov-Bohm effect, Andreev
reflection, and the fractional quantum Hall effect. (To be published in "Mesoscopic Electron Transport," edited by L. P. Kouwenhoven, G. Schoen, and L. L. Sohn, NATO ASI Series E, Kluwer Academic Publishing, Dordrecht.)
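The universal suppression factors quoted above follow from the Fano factor F = sum T_n(1-T_n) / sum T_n over the transmission eigenvalues T_n. The distributions sampled below are the standard random-matrix results for a chaotic cavity and a long disordered wire; the Monte-Carlo check itself is our own sketch, not the review's calculation.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

def fano(T):
    """Fano factor from a sample of transmission eigenvalues."""
    return np.sum(T * (1 - T)) / np.sum(T)

# Chaotic cavity: P(T) = 1 / (pi * sqrt(T(1-T))); sample via T = sin^2(pi*u/2).
T_cavity = np.sin(0.5 * np.pi * rng.random(N)) ** 2

# Disordered wire: bimodal P(T) ~ 1 / (T * sqrt(1-T)); sample via T = sech^2(x)
# with x uniform on [0, X], X >> 1 (wire much longer than the mean free path).
x = rng.random(N) * 30.0
T_wire = 1.0 / np.cosh(x) ** 2

print(f"chaotic cavity Fano: {fano(T_cavity):.3f} (analytic: 1/4)")
print(f"disordered wire Fano: {fano(T_wire):.3f} (analytic: 1/3)")
```

The symmetric double-barrier factor 1/2 follows the same way from its eigenvalue distribution; Poissonian noise corresponds to F = 1, so all three cases are sub-Poissonian, as the review states.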
Adaptive Oblivious Transfer and Generalization
Oblivious Transfer (OT) protocols were introduced in the seminal paper of Rabin, and allow a user to retrieve a given number of lines (usually one) from a database without revealing which ones to the server. The server is assured that only this given number of lines can be accessed per interaction, so the other lines are protected, while the user is assured that the server does not learn which lines were requested. This primitive is of great practical interest, for example in secure multi-party computation, and directly echoes Symmetrically Private Information Retrieval (SPIR). Recent Oblivious Transfer instantiations secure in the UC framework suffer from a drastic drawback: after the first query, there is no improvement in the global scheme complexity, so each subsequent query has a global complexity of O(|DB|), meaning there is no gain over running completely independent queries. In this paper, we propose a new protocol solving this issue, allowing subsequent queries with a complexity of O(log(|DB|)), and prove the protocol secure in the UC framework with adaptive corruptions and reliable erasures. As a second contribution, we show that the techniques we use for Oblivious Transfer can be generalized into a new framework we call Oblivious Language-Based Envelope (OLBE). This is of practical interest, since it seems increasingly unrealistic to consider a database with uncontrolled access in access-control scenarios. Our approach generalizes Oblivious Signature-Based Envelope (OSBE) to handle more expressive credentials and requests from the user. Naturally, OLBE encompasses both OT and OSBE, but it also achieves Oblivious Transfer with fine-grained access control over each line. For example, a user can access a line if and only if they possess a certificate granting access to that line.
We show how to instantiate such primitives generically and efficiently, and prove them secure in the Universal Composability framework with adaptive corruptions, assuming reliable erasures. We provide new UC ideal functionalities where needed, or show that existing ones fit our new framework. The security of these designs preserves both the secrecy of the database values and the user credentials; this symmetry lets our approach be viewed as a generalization of the notion of Symmetrically Private Information Retrieval.
Multidimensional Conservation Laws: Overview, Problems, and Perspective
Some recent important developments are overviewed, several longstanding
open problems are discussed, and a perspective is presented for the
mathematical theory of multidimensional conservation laws. Some basic features
and phenomena of multidimensional hyperbolic conservation laws are revealed,
and some samples of multidimensional systems/models and related important
problems are presented and analyzed with emphasis on the prototypes that have
been solved or may be expected to be solved rigorously at least for some cases.
In particular, multidimensional steady supersonic problems and transonic
problems, shock reflection-diffraction problems, and related effective
nonlinear approaches are analyzed. A theory of divergence-measure vector fields
and related analytical frameworks for the analysis of entropy solutions are
discussed.
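For reference, the basic objects discussed in this abstract take the following standard form (the notation here is ours, not the paper's):

```latex
% A system of conservation laws in d space dimensions:
\[
  \partial_t u + \nabla_x \cdot f(u) = 0,
  \qquad u = u(t,x) \in \mathbb{R}^m, \quad x \in \mathbb{R}^d,
\]
% with entropy solutions required to satisfy, for entropy pairs (\eta, q),
\[
  \partial_t \eta(u) + \nabla_x \cdot q(u) \le 0
\]
% in the sense of distributions. Divergence-measure fields are vector fields
% F \in L^p whose distributional divergence is a signed Radon measure, which
% is what allows traces and Gauss-Green formulas for entropy solutions.
```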
Whisker Movements Reveal Spatial Attention: A Unified Computational Model of Active Sensing Control in the Rat
Spatial attention is most often investigated in the visual modality through measurement of eye movements, with primates, including humans, a widely studied model. Its study in laboratory rodents, such as mice and rats, requires different techniques, owing to the lack of a visual fovea and the particular ethological relevance of orienting movements of the snout and the whiskers in these animals. In recent years, several reliable relationships have been observed between environmental and behavioural variables and movements of the whiskers, but the function of these responses, as well as how they integrate, remains unclear. Here, we propose a unifying abstract model of whisker movement control whose key variable is the region of space that is the animal's current focus of attention, and demonstrate, using computer-simulated behavioural experiments, that the model is consistent with a broad range of experimental observations. A core hypothesis is that the rat explicitly decodes the location in space of whisker contacts and that this representation is used to regulate whisker drive signals. This proposition stands in contrast to earlier proposals that the modulation of whisker movement during exploration is mediated primarily by reflex loops. We go on to argue that the superior colliculus is a candidate neural substrate for the siting of a head-centred map guiding whisker movement, in analogy to current models of visual attention. The proposed model has the potential to offer a more complete understanding of whisker control, as well as to highlight the potential of the rodent and its whiskers as a tool for the study of mammalian attention.
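The core hypothesis can be caricatured in a few lines of code. Everything below (class, names, dynamics, parameter values) is our own hypothetical sketch, not the authors' published model: an attended location sets the whisker drive, and explicitly decoded contact locations update that attended location, in contrast to a pure reflex-loop account where contacts would modulate drive directly.

```python
from dataclasses import dataclass

@dataclass
class AttentionModel:
    """Toy head-centred attention variable driving whisker movement."""
    focus: float = 1.0      # attended distance from the snout (arbitrary units)
    gain: float = 0.5       # how strongly decoded contacts attract attention

    def whisk_amplitude(self) -> float:
        # Whiskers are driven further out when attention is directed further out.
        return self.focus

    def on_contact(self, contact_location: float) -> None:
        # The decoded contact location pulls the focus of attention toward it;
        # whisker drive then changes only via the updated focus.
        self.focus += self.gain * (contact_location - self.focus)

m = AttentionModel()
for _ in range(5):
    m.on_contact(0.4)           # repeated contacts with a nearby object
print(round(m.whisk_amplitude(), 3))   # prints 0.419: whisking converges on the object
```

The qualitative behaviour (whisking drawn toward a contacted object over successive whisks) is the kind of observation the unified model is meant to reproduce.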
Neuronal networks provide rapid neuroprotection against spreading toxicity
Acute secondary neuronal cell death, as seen in neurodegenerative disease, cerebral ischemia (stroke) and traumatic brain injury (TBI), drives spreading neurotoxicity into surrounding, undamaged brain areas. This spreading toxicity occurs via two mechanisms: synaptic toxicity through hyperactivity, and excitotoxicity following the accumulation of extracellular glutamate. To date, there are no fast-acting therapeutic tools capable of terminating secondary spreading toxicity within a time frame relevant to the emergency treatment of stroke or TBI patients. Here, using hippocampal neurons (DIV 15-20) cultured in microfluidic devices in order to deliver a localized excitotoxic insult, we replicate secondary spreading toxicity and demonstrate that this process is driven by GluN2B receptors. In addition to the modeling of spreading toxicity, this approach has uncovered a previously unknown, fast-acting, GluN2A-dependent neuroprotective signaling mechanism. This mechanism utilizes the innate capacity of surrounding neuronal networks to provide protection against both forms of spreading neuronal toxicity: synaptic hyperactivity and direct glutamate excitotoxicity. Importantly, network neuroprotection against spreading toxicity can be effectively stimulated after an excitotoxic insult has been delivered, and may identify a new therapeutic window to limit brain damage.
Changing atmospheric CO2 concentration was the primary driver of early Cenozoic climate
The Early Eocene Climate Optimum (EECO, about 51 to 53 million years ago) [1] was the warmest interval of the past 65 million years, with mean annual surface air temperature over ten degrees Celsius warmer than during the pre-industrial period [2-4]. Subsequent global cooling in the middle and late Eocene epoch, especially at high latitudes, eventually led to continental ice-sheet development in Antarctica in the early Oligocene epoch (about 33.6 million years ago). However, existing estimates place atmospheric carbon dioxide (CO2) levels during the Eocene at 500-3,000 parts per million [5-7], and in the absence of tighter constraints carbon-climate interactions over this interval remain uncertain. Here we use recent analytical and methodological developments [8-11] to generate a new high-fidelity record of CO2 concentrations using the boron isotope (δ11B) composition of well-preserved planktonic foraminifera from the Tanzania Drilling Project, revising previous estimates [6]. Although species-level uncertainties make absolute values difficult to constrain, CO2 concentrations during the EECO were around 1,400 parts per million. The relative decline in CO2 concentration through the Eocene is more robustly constrained at about fifty per cent, with a further decline into the Oligocene [12]. Provided the latitudinal dependency of sea surface temperature change for a given climate forcing in the Eocene was similar to that of the late Quaternary period [13], this CO2 decline was sufficient to drive the well-documented high- and low-latitude cooling that occurred through the Eocene [14].
Once the change in global temperature between the pre-industrial period and the Eocene caused by the action of all known slow feedbacks (apart from those associated with the carbon cycle) is removed [2-4], both the EECO and the late Eocene exhibit an equilibrium climate sensitivity relative to the pre-industrial period of 2.1 to 4.6 degrees Celsius per CO2 doubling (66 per cent confidence), which is similar to the canonical range (1.5 to 4.5 degrees Celsius [15]). This indicates that a large fraction of the warmth of the early Eocene greenhouse was driven by increased CO2 concentrations, and that climate sensitivity was relatively constant throughout this period.
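The quoted numbers can be sanity-checked with back-of-the-envelope arithmetic (the CO2 values and sensitivity range are the abstract's; the pre-industrial 280 ppm baseline and the logarithmic-forcing arithmetic are our own addition, and the result ignores the slow feedbacks the abstract explicitly sets aside):

```python
import math

# Warming relative to pre-industrial for a given equilibrium climate
# sensitivity S (degrees C per CO2 doubling): dT = S * log2(CO2 / CO2_PI).
CO2_EECO = 1400.0    # ppm, EECO estimate quoted in the abstract
CO2_PI = 280.0       # ppm, pre-industrial baseline (assumed here)
doublings = math.log2(CO2_EECO / CO2_PI)   # = log2(5), about 2.32 doublings

for S in (2.1, 4.6):  # the abstract's 66%-confidence sensitivity range
    print(f"S = {S}: ~{S * doublings:.1f} degrees C above pre-industrial")
```

The resulting span of roughly 5 to 11 degrees Celsius of CO2-driven warming is consistent with the more-than-ten-degree EECO anomaly once the other slow feedbacks are added back, which is the abstract's central claim.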
