Controllability and observability of an artificial advection-diffusion problem
In this paper we study the controllability of an artificial
advection-diffusion system through the boundary. Suitable Carleman estimates
give us the observability on the adjoint system in the one dimensional case. We
also study some basic properties of our problem such as backward uniqueness and
we get an intuitive result on the control cost for vanishing viscosity.
Comment: 20 pages, accepted for publication in MCSS. DOI: 10.1007/s00498-012-0076-
Revisiting Shared Data Protection Against Key Exposure
This paper puts a new light on secure data storage inside distributed
systems. Specifically, it revisits computational secret sharing in a situation
where the encryption key is exposed to an attacker. It comes with several
contributions: First, it defines a security model for encryption schemes, where
we ask for additional resilience against exposure of the encryption key.
Precisely, we ask for (1) indistinguishability of plaintexts under full
ciphertext knowledge, and (2) indistinguishability for an adversary who learns
the encryption key plus all but one share of the ciphertext. Property (2)
relaxes the "all-or-nothing" property to a more realistic setting, where the
ciphertext is transformed into a number of shares such that the adversary
cannot access one of them. Property (1) asks that, unless the user's key is
disclosed, no one other than the user can retrieve information about the
plaintext. Second, it introduces a new
computationally secure encryption-then-sharing scheme, that protects the data
in the previously defined attacker model. It consists in data encryption
followed by a linear transformation of the ciphertext, then its fragmentation
into shares, along with secret sharing of the randomness used for encryption.
The computational overhead in addition to data encryption is reduced by half
with respect to state of the art. Third, it provides for the first time
cryptographic proofs in this context of key exposure. It emphasizes that the
security of our scheme relies only on a simple cryptanalysis resilience
assumption for block ciphers in public-key mode: indistinguishability from
random of the sequence of differentials of a random value. Fourth, it provides
an alternative scheme relying on the more theoretical random permutation model.
It consists in encrypting with sponge functions in duplex mode then, as before,
secret-sharing the randomness.
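The encryption-then-sharing pipeline described above can be sketched as follows. This is a toy illustration only, not the paper's construction: a hash-based keystream stands in for the block cipher, the linear transformation step is omitted, and every function name here is our own invention.

```python
# Toy encrypt-then-share sketch (illustrative, NOT the paper's scheme):
# encrypt with per-message randomness r, fragment the ciphertext into k
# pieces, and XOR-secret-share r so that exposure of the key alone,
# without all shares, reveals nothing about r.
import hashlib
import os

def keystream(key: bytes, r: bytes, n: int) -> bytes:
    """Hash-based counter-mode keystream (stand-in for a block cipher)."""
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + r + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def share(data: bytes, key: bytes, k: int):
    r = os.urandom(16)                                  # encryption randomness
    ct = xor(data, keystream(key, r, len(data)))        # encrypt
    frags = [ct[i::k] for i in range(k)]                # fragment ciphertext
    # XOR secret sharing of r: all k shares are needed to recover it
    r_shares = [os.urandom(16) for _ in range(k - 1)]
    last = r
    for s in r_shares:
        last = xor(last, s)
    r_shares.append(last)
    return list(zip(frags, r_shares))

def recover(shares, key: bytes) -> bytes:
    frags, r_shares = zip(*shares)
    r = b"\x00" * 16
    for s in r_shares:                                  # reconstruct r
        r = xor(r, s)
    k = len(frags)
    n = sum(len(f) for f in frags)
    ct = bytearray(n)
    for i, f in enumerate(frags):                       # reassemble ciphertext
        ct[i::k] = f
    return xor(bytes(ct), keystream(key, r, n))
```

Note how the attacker model is reflected: with the key and k-1 shares, the missing share of r leaves the keystream unpredictable.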
Observations of the Sunyaev-Zel'dovich effect at high angular resolution towards the galaxy clusters A665, A2163 and CL0016+16
We report on the first observation of the Sunyaev-Zel'dovich effect with the
Diabolo experiment at the IRAM 30 metre telescope. A significant brightness
decrement is detected in the direction of three clusters (Abell 665, Abell 2163
and CL0016+16). With a 30 arcsecond beam and 3 arcminute beamthrow, this is the
highest angular resolution observation of the SZ effect to date.
Comment: 23 pages, 8 figures, 6 tables, accepted to New Astronomy
CdWO4 scintillating bolometer for Double Beta Decay: Light and Heat anticorrelation, light yield and quenching factors
We report the performance of a 0.51 kg CdWO4 scintillating bolometer to be
used in future Double Beta Decay experiments. The simultaneous read-out of the
heat and the scintillation light allows discrimination between different
interacting particles, aiming at the disentanglement and reduction of
background contributions, a key issue for next-generation experiments. We
describe the observed anticorrelation between the heat and the light signals
and show how this feature can be used to increase the energy resolution of the
bolometer over the entire energy spectrum, with an improvement of up to a
factor of 2.6 at the 2615 keV line of 208Tl. The detector was tested in a
433 h background measurement that permitted the estimation of extremely low
internal trace contaminations of 232Th and 238U. The light yields of
gamma/beta particles, alphas and neutrons are presented. Furthermore, we
developed a method to correctly evaluate the absolute thermal quenching factor
of alpha particles in scintillating bolometers.
Comment: 8 pages, 7 figures
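The variance-minimizing heat-light decorrelation described above can be illustrated numerically. This is a toy on simulated data, not the collaboration's analysis; the line energy is the 208Tl value from the abstract, while the noise amplitudes and the anticorrelation model are invented for the example.

```python
# Illustrative sketch: at a fixed-energy line, heat and light fluctuate
# in an anticorrelated way; subtracting the light-correlated component
# E_corr = heat - a*(light - <light>), with a the least-squares slope,
# narrows the line and thus improves the energy resolution.
import random
import statistics

random.seed(1)
E_line = 2615.0                                   # keV (208Tl line)
shared = [random.gauss(0, 8) for _ in range(2000)]            # common mode
heat = [E_line - s + random.gauss(0, 2) for s in shared]      # anticorrelated
light = [40.0 + s + random.gauss(0, 1) for s in shared]

mh, ml = statistics.fmean(heat), statistics.fmean(light)
cov = sum((h - mh) * (l - ml) for h, l in zip(heat, light)) / (len(heat) - 1)
a = cov / statistics.variance(light)              # least-squares slope (< 0)
corrected = [h - a * (l - ml) for h, l in zip(heat, light)]

print(statistics.stdev(heat), statistics.stdev(corrected))
```

The corrected spectrum has a markedly smaller line width than the raw heat channel, which is the mechanism behind the quoted resolution improvement.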
Efficient computation of hashes
The sequential computation of hashes lies at the core of many distributed storage systems and is found, for example, in grid services. It can hinder efficiency and quality of service, and it can even pose security challenges that can only be addressed by the use of parallel hash tree modes. The main contributions of this paper are, first, the identification of several efficiency and security challenges posed by sequential hash computation based on the Merkle-Damgard construction. In addition, alternatives for the parallel computation of hash trees are discussed, and a prototype of a new parallel implementation of the Keccak function, the SHA-3 winner, is introduced.
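A hash tree mode of the kind discussed can be sketched as follows. This is a minimal illustration using Python's built-in SHA3-256, not the paper's prototype; the block size and the leaf/node domain-separation bytes are our own choices.

```python
# Minimal binary hash tree (Merkle tree) over fixed-size blocks.
# Unlike sequential Merkle-Damgard chaining, sibling subtrees are
# independent, so each level could be hashed in parallel.
import hashlib

def sha3(data: bytes) -> bytes:
    return hashlib.sha3_256(data).digest()

def merkle_root(data: bytes, block_size: int = 64) -> bytes:
    blocks = [data[i:i + block_size]
              for i in range(0, len(data), block_size)] or [b""]
    # Leaves and interior nodes are domain-separated with a prefix byte.
    level = [sha3(b"\x00" + b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2:                 # odd count: duplicate last node
            level.append(level[-1])
        level = [sha3(b"\x01" + level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]
```

Each pairwise combination at a level depends only on its two children, which is what makes tree modes amenable to multi-core or SIMD parallelism.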
First results of the ROSEBUD Dark Matter experiment
ROSEBUD (Rare Objects SEarch with Bolometers UndergrounD) is an experiment
that attempts to detect low-mass Weakly Interacting Massive Particles (WIMPs)
through their elastic scattering off Al and O nuclei. It consists of three small
sapphire bolometers (of a total mass of 100 g) with NTD-Ge sensors in a
dilution refrigerator operating at 20 mK in the Canfranc Underground
Laboratory. We report in this paper the results of several runs (of about 10
days each) with successively improved energy thresholds, and the progressive
background reduction obtained by improvement of the radiopurity of the
components and subsequent modifications in the experimental assembly, including
the addition of old lead shields. Mid-term plans and perspectives of the
experiment are also presented.
Comment: 14 pages, 8 figures, submitted to Astroparticle Physics
Random Oracles in a Quantum World
The interest in post-quantum cryptography - classical systems that remain
secure in the presence of a quantum adversary - has generated elegant proposals
for new cryptosystems. Some of these systems are set in the random oracle model
and are proven secure relative to adversaries that have classical access to the
random oracle. We argue that to prove post-quantum security one needs to prove
security in the quantum-accessible random oracle model where the adversary can
query the random oracle with quantum states.
We begin by separating the classical and quantum-accessible random oracle
models by presenting a scheme that is secure when the adversary is given
classical access to the random oracle, but is insecure when the adversary can
make quantum oracle queries. We then set out to develop generic conditions
under which a classical random oracle proof implies security in the
quantum-accessible random oracle model. We introduce the concept of a
history-free reduction, a class of classical random oracle reductions that
determine oracle answers independently of the history of previous queries, and
we prove that such reductions imply security in the
quantum model. We then show that certain post-quantum proposals, including ones
based on lattices, can be proven secure using history-free reductions and are
therefore post-quantum secure. We conclude with a rich set of open problems in
this area.
Comment: 38 pages. v2: many substantial changes and extensions; merged with a
related paper by Boneh and Zhandry
Impact of particles on the Planck HFI detectors: Ground-based measurements and physical interpretation
The Planck High Frequency Instrument (HFI) surveyed the sky continuously from
August 2009 to January 2012. Its noise and sensitivity performance were
excellent, but the rate of cosmic ray impacts on the HFI detectors was
unexpectedly high. Furthermore, collisions of cosmic rays with the focal plane
produced transient signals in the data (glitches) with a wide range of
characteristics. A study of cosmic ray impacts on the HFI detector modules has
been undertaken to categorize and characterize the glitches, to correct the HFI
time-ordered data, and to understand the residual effects on Planck maps and data
products. This paper presents an evaluation of the physical origins of glitches
observed by the HFI detectors. In order to better understand the glitches
observed by HFI in flight, several ground-based experiments were conducted with
flight-spare HFI bolometer modules. The experiments were conducted between 2010
and 2013 with HFI test bolometers in different configurations using varying
particles and impact energies. The bolometer modules were exposed to 23 MeV
protons from the Orsay IPN TANDEM accelerator, and to Am and Cm α-particle
sources and an Fe radioactive X-ray source. The calibration data
from the HFI ground-based preflight tests were used to further characterize the
glitches and compare glitch rates with statistical expectations under
laboratory conditions. Test results provide strong evidence that the dominant
family of glitches observed in flight are due to cosmic ray absorption by the
silicon die substrate on which the HFI detectors reside. Glitch energy is
propagated to the thermistor by ballistic phonons, while there is also a
thermal diffusion contribution. The implications of these results for future
satellite missions, especially those in the far-infrared to sub-millimetre and
millimetre regions of the electromagnetic spectrum, are discussed.
Comment: 11 pages, 13 figures
Calibration and First light of the Diabolo photometer at the Millimetre and Infrared Testa Grigia Observatory
We have designed and built a large-throughput dual channel photometer,
Diabolo. This photometer is dedicated to the observation of millimetre
continuum diffuse sources, and in particular, of the Sunyaev-Zel'dovich effect
and of anisotropies of the 3K background. We describe the optical layout and
filtering system of the instrument, which uses two bolometric detectors for
simultaneous observations in two frequency channels at 1.2 and 2.1 mm. The
bolometers are cooled to a working temperature of 0.1 K provided by a compact
dilution cryostat. The photometric and angular responses of the instrument are
measured in the laboratory. First astronomical light was detected in March 1995
at the focus of the new Millimetre and Infrared Testa Grigia Observatory (MITO)
Telescope. The established sensitivity of the system is 7 mK_RJ s^1/2. For
a typical map of at least 10 beams, with one hour of integration per beam, one
can achieve rms values of y_SZ ~ 7 x 10^-5 and a 3K background anisotropy of
Delta T/T ~ 7 x 10^-5, in winter conditions. We also report on a novel bolometer
AC readout circuit which allows for the first time total power measurements on
the sky. This technique alleviates (but does not preclude) the use of chopping
with a secondary mirror. This technique and the dilution-fridge concept will be
used in future scan-modulated space instruments such as the ESA Planck mission.
Comment: 10 pages, LaTeX, 12 figures, accepted for publication in Astronomy
and Astrophysics Supplement Series
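The quoted sensitivity translates directly into a per-beam noise level. The following is a back-of-the-envelope check of our own, not a figure from the text:

```python
# For white noise, rms after integration time t scales as sensitivity/sqrt(t):
# 7 mK_RJ s^1/2 with one hour per beam gives 7/sqrt(3600) mK_RJ per beam.
import math

sensitivity = 7.0            # mK_RJ s^1/2 (quoted system sensitivity)
t = 3600.0                   # s, one hour of integration per beam
rms = sensitivity / math.sqrt(t)
print(f"{rms:.3f} mK_RJ per beam")   # about 0.117 mK_RJ
```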
Global Carleman inequalities for parabolic systems and applications to controllability
This paper has been conceived as an overview of the controllability properties of some relevant (linear and nonlinear) parabolic systems. Specifically, we deal with null controllability and exact controllability to the trajectories. We try to explain the role played by observability
inequalities in this context and the need for global Carleman estimates. We also recall the main ideas used to overcome the difficulties caused by nonlinearities. First, we consider the classical heat equation with Dirichlet conditions and distributed controls. Then we analyze recent extensions to
other linear and semilinear parabolic systems and/or boundary controls. Finally, we review the controllability properties for the Stokes and Navier–Stokes equations that are known to date. In this context, we have paid special attention to obtaining the necessary Carleman estimates. Some open questions are mentioned throughout the paper. We hope that this unified presentation will be useful for researchers interested in the field.
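For the model case of the heat equation, the observability inequality in question takes the following schematic form (stated informally, in notation of our own choosing):

```latex
% Adjoint system: -\varphi_t - \Delta\varphi = 0 in (0,T)\times\Omega,
% \varphi = 0 on (0,T)\times\partial\Omega, observation set \omega\subset\Omega.
% Null controllability of the heat equation is equivalent to
\|\varphi(0,\cdot)\|_{L^2(\Omega)}^2
  \le C \int_0^T \!\!\int_{\omega} |\varphi(t,x)|^2 \, dx \, dt ,
% for every solution \varphi of the adjoint system; the constant C is
% obtained from a global Carleman estimate with weight e^{-2s\alpha}
% that concentrates the estimate on the observation set \omega.
```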
