Who is going to save us now? Bureaucrats, Politicians and Risky Tasks
The paper compares the policy choices regarding risk transfer against low-probability, high-loss events between elected and appointed public officials. Empirical evidence using U.S. municipality-level data shows that appointed city managers are more likely to adopt federal risk-transfer regimes. It is argued that the variation in the level of insurance activity emerges from the different incentive schemes each form of government faces. Controlling for spatial dependencies further shows that a community's decision to participate in the insurance program significantly depends on the decisions of neighboring communities.
Keywords: Politicians, bureaucrats, decision making under uncertainty, flood insurance, spatial econometrics
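As a rough illustration of the spatial-dependence check mentioned above, a Moran's I statistic measures whether neighboring observations take similar values. The following Python sketch uses a hypothetical adjacency matrix and adoption vector, invented for illustration; it is not the paper's data or estimation strategy.

import numpy as np

def morans_i(x, W):
    """Moran's I for values x under a row-normalized spatial weight matrix W.

    Values near 0 suggest spatial randomness; positive values indicate that
    neighboring observations (e.g., adjacent municipalities) are similar.
    """
    z = x - x.mean()              # deviations from the mean
    num = z @ W @ z               # spatially weighted cross-products
    den = z @ z                   # total squared deviation
    return (len(x) / W.sum()) * num / den

# Toy example: 4 municipalities on a path, hypothetical adoption indicator.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
W /= W.sum(axis=1, keepdims=True)          # row-normalize
adoption = np.array([1.0, 1.0, 0.0, 0.0])  # hypothetical insurance take-up
print(morans_i(adoption, W))               # positive: neighbors cluster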
Accelerating Wilson Fermion Matrix Inversions by Means of the Stabilized Biconjugate Gradient Algorithm
The stabilized biconjugate gradient algorithm BiCGStab, recently presented by van der Vorst, is applied to the inversion of the lattice fermion operator in the Wilson formulation of lattice Quantum Chromodynamics. Its computational efficiency is tested in a comparative study against the conjugate gradient and minimal residual methods. Both for quenched gauge configurations at beta = 6.0 and gauge configurations with dynamical fermions at beta = 5.4, we find BiCGStab to be superior to the other methods. BiCGStab turns out to be particularly useful in the chiral regime of small quark masses.
Comment: 25 pages, WUB 94-1
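For readers unfamiliar with the method, here is a minimal, self-contained Python sketch of the BiCGStab iteration for a generic linear system A x = b, run on a dense random test matrix rather than the paper's lattice Wilson-fermion operator.

import numpy as np

def bicgstab(A, b, x0=None, tol=1e-10, max_iter=1000):
    """Solve A x = b with van der Vorst's stabilized biconjugate gradient."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x
    r_hat = r.copy()                 # fixed "shadow" residual
    rho = alpha = omega = 1.0
    v = np.zeros_like(b)
    p = np.zeros_like(b)
    for _ in range(max_iter):
        rho_new = r_hat @ r
        beta = (rho_new / rho) * (alpha / omega)
        p = r + beta * (p - omega * v)
        v = A @ p
        alpha = rho_new / (r_hat @ v)
        s = r - alpha * v            # intermediate residual
        t = A @ s
        omega = (t @ s) / (t @ t)    # stabilizing minimal-residual step
        x = x + alpha * p + omega * s
        r = s - omega * t
        rho = rho_new
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
    return x

# Quick check on a random, well-conditioned, non-symmetric system.
rng = np.random.default_rng(0)
A = np.eye(100) + 0.05 * rng.standard_normal((100, 100))
b = rng.standard_normal(100)
x = bicgstab(A, b)
print(np.linalg.norm(A @ x - b))     # residual norm, ~1e-10 or smaller

SciPy ships a production implementation as scipy.sparse.linalg.bicgstab; for the complex-valued fermion matrix in the paper, the dot products above would become conjugated inner products.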
Making TCP More Robust to Long Connectivity Disruptions (TCP-LCD)
Disruptions in end-to-end path connectivity, which last longer than one retransmission timeout, cause suboptimal TCP performance. The reason for this performance degradation is that TCP interprets segment loss induced by long connectivity disruptions as a sign of congestion, resulting in repeated retransmission timer backoffs. This, in turn, leads to a delayed detection of the re-establishment of the connection, since TCP waits for the next retransmission timeout before it attempts a retransmission. This document proposes an algorithm to make TCP more robust to long connectivity disruptions (TCP-LCD). It describes how standard ICMP messages can be exploited during timeout-based loss recovery to disambiguate true congestion loss from non-congestion loss caused by connectivity disruptions. Moreover, a reversion strategy of the retransmission timer is specified that enables a more prompt detection of whether or not the connectivity to a previously disconnected peer node has been restored. TCP-LCD is a TCP sender-only modification that effectively improves TCP performance in the case of connectivity disruptions.
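The central mechanism, undoing a retransmission-timer backoff when an ICMP unreachable message shows that a retransmission was dropped by a connectivity disruption rather than by congestion, can be sketched as follows. This is a simplified, hypothetical Python model of the sender-side state, not the document's normative algorithm; all class and method names are invented for illustration.

class LcdSender:
    """Simplified sender-side sketch of the TCP-LCD reversion idea."""

    def __init__(self, initial_rto=1.0, max_rto=60.0):
        self.rto = initial_rto
        self.max_rto = max_rto
        self.backoffs = 0                # timer backoffs not yet reverted
        self.in_timeout_recovery = False

    def on_retransmission_timeout(self):
        # Classic exponential backoff on each expiry of the timer.
        self.in_timeout_recovery = True
        self.rto = min(self.rto * 2, self.max_rto)
        self.backoffs += 1

    def on_icmp_unreachable(self, matches_retransmitted_segment):
        # An ICMP unreachable message for a segment we retransmitted proves
        # the segment reached the network but the path was disrupted, i.e.
        # the loss was not congestion: undo one backoff so restoration of
        # connectivity is probed more promptly.
        if (self.in_timeout_recovery and matches_retransmitted_segment
                and self.backoffs > 0):
            self.rto /= 2
            self.backoffs -= 1

    def on_ack(self):
        # Timeout-based recovery ends once new data is acknowledged.
        self.in_timeout_recovery = False
        self.backoffs = 0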
Charity hazard: A real hazard to natural disaster insurance?
After the flooding in 2002, European governments provided billions of euros of financial assistance to their citizens. Although there is no doubt that solidarity and some sort of assistance are reasonable, the question arises why these damages were not sufficiently insured. One explanation for why individuals decline to obtain insurance cover against natural hazards is that they anticipate governmental and private aid. This problem has become known as charity hazard. The present paper gives an economic analysis of the institutional arrangements in the market for natural disaster insurance, focusing on imperfections caused by governmental financial relief. It provides a theoretical explanation of why charity hazard is a problem in this market, in that it acts as an obstacle to the proper diffusion, and therefore the establishment, of natural hazard insurance. The paper reviews the scientific discussion on charity hazard, provides a theoretical analysis, and points out the existing empirical problems regarding this issue.
Colored Non-Crossing Euclidean Steiner Forest
Given a set of $k$-colored points in the plane, we consider the problem of finding $k$ trees such that each tree connects all points of one color class, no two trees cross, and the total edge length of the trees is minimized. For $k = 1$, this is the well-known Euclidean Steiner tree problem. For general $k$, a $k\rho$-approximation algorithm is known, where $\rho \le 1.21$ is the Steiner ratio.
We present a PTAS for $k = 2$, a $(5/3 + \varepsilon)$-approximation algorithm for $k = 3$, and two approximation algorithms for general $k$, with ratios $O(\sqrt{n} \log k)$ and $k + \varepsilon$.
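For intuition about the baseline these ratios improve on, connecting each color class independently gives a simple upper bound on the forest length via per-color Euclidean minimum spanning trees, though it ignores both Steiner points and the non-crossing constraint that the paper's algorithms must enforce. A small illustrative Python sketch with hypothetical inputs:

import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

def per_color_mst_length(points, colors):
    """Total edge length of independent Euclidean MSTs, one per color class.

    Ignores Steiner points and crossings between trees, so it only bounds
    the optimal non-crossing Steiner forest length from above.
    """
    total = 0.0
    for c in np.unique(colors):
        pts = points[colors == c]
        if len(pts) < 2:
            continue
        dist = squareform(pdist(pts))               # pairwise distances
        total += minimum_spanning_tree(dist).sum()  # MST edge lengths
    return total

# Two interleaved color classes on a small point set.
pts = np.array([[0, 0], [2, 0], [1, 1], [0, 2], [2, 2], [1, 3]], float)
cols = np.array([0, 0, 1, 0, 1, 1])
print(per_color_mst_length(pts, cols))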
High-fidelity state detection and tomography of a single ion Zeeman qubit
We demonstrate high-fidelity Zeeman qubit state detection in a single trapped 88Sr+ ion. Qubit readout is performed by shelving one of the qubit states to a metastable level using a narrow-linewidth diode laser at 674 nm, followed by state-selective fluorescence detection. The average fidelity reached for the readout of the qubit state is 0.9989(1). We then measure the fidelity of state tomography, averaged over all possible single-qubit states, which is 0.9979(2). We also fully characterize the detection process using quantum process tomography. This readout fidelity is compatible with recent estimates of the detection error threshold required for fault-tolerant computation, whereas high-fidelity state tomography opens the way for high-precision quantum process tomography.
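The quoted fidelities are averages over input states; for a single reconstructed state, the standard Uhlmann fidelity against a target state can be computed as below. This is a generic Python illustration with a hypothetical error rate, not the analysis code behind the reported numbers.

import numpy as np
from scipy.linalg import sqrtm

def state_fidelity(rho, sigma):
    """Uhlmann fidelity F = (Tr sqrt(sqrt(rho) sigma sqrt(rho)))**2."""
    sqrt_rho = sqrtm(rho)
    inner = sqrtm(sqrt_rho @ sigma @ sqrt_rho)
    return np.real(np.trace(inner)) ** 2

# Example: ideal |0> against a slightly depolarized reconstruction.
target = np.array([[1, 0], [0, 0]], dtype=complex)
p = 0.002                                 # hypothetical error rate
measured = (1 - p) * target + p * np.eye(2) / 2
print(state_fidelity(measured, target))   # ~0.999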
Ferroelectric properties in thin film barium titanate grown using pulsed laser deposition
