Minimally entangled typical thermal states versus matrix product purifications for the simulation of equilibrium states and time evolution
For the simulation of equilibrium states and finite-temperature response
functions of strongly-correlated quantum many-body systems, we compare the
efficiencies of two different approaches in the framework of the density matrix
renormalization group (DMRG). The first is based on matrix product
purifications. The second, more recent one, is based on so-called minimally
entangled typical thermal states (METTS). For the latter, we highlight the
interplay of statistical and DMRG truncation errors, discuss the use of
self-averaging effects, and describe schemes for the computation of response
functions. For critical as well as gapped phases of the spin-1/2 XXZ chain and
the one-dimensional Bose-Hubbard model, we assess the computation costs and
accuracies of the two methods at different temperatures. For almost all
considered cases, we find that, for the same computation cost, purifications
yield more accurate results than METTS -- often by orders of magnitude. The
METTS algorithm becomes more efficient only for temperatures well below the
system's energy gap. The exponential growth of the computation cost in the
evaluation of response functions limits the attainable timescales in both
methods, and we find that, in this regard, METTS do not outperform purifications.
Comment: 12 pages + 4 pages appendix, 12 figures; minor improvements of data and text; published version
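As a toy illustration of the METTS sampling loop the abstract refers to (collapse to a product state, evolve in imaginary time, measure, re-collapse), here is an exact-diagonalization sketch for a 4-site XXZ chain. It is purely illustrative: the paper's algorithms represent states as matrix product states and evolve them with DMRG machinery, whereas this sketch uses dense linear algebra, which is feasible only for a handful of sites.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
L, beta, delta = 4, 2.0, 1.0          # sites, inverse temperature, anisotropy

sx = np.array([[0, 1], [1, 0]]) / 2
sy = np.array([[0, -1j], [1j, 0]]) / 2
sz = np.array([[1, 0], [0, -1]]) / 2

def site_op(op, i):
    """Embed a single-site operator at site i of the L-site chain."""
    out = np.eye(1)
    for j in range(L):
        out = np.kron(out, op if j == i else np.eye(2))
    return out

# Open-boundary XXZ chain: H = sum_i Sx.Sx + Sy.Sy + delta * Sz.Sz
H = sum(site_op(sx, i) @ site_op(sx, i + 1)
        + site_op(sy, i) @ site_op(sy, i + 1)
        + delta * site_op(sz, i) @ site_op(sz, i + 1)
        for i in range(L - 1)).real

U = expm(-beta / 2 * H)               # imaginary-time propagator e^{-beta H/2}
P_up = np.diag([1.0, 0.0])            # projector onto Sz = +1/2
P_dn = np.diag([0.0, 1.0])

def collapse(psi):
    """Measure each site in the Sz basis, collapsing psi to a product state."""
    for i in range(L):
        p = (psi.conj() @ site_op(P_up, i) @ psi).real
        proj = site_op(P_up if rng.random() < p else P_dn, i)
        psi = proj @ psi
        psi /= np.linalg.norm(psi)
    return psi

phi = np.zeros(2 ** L)
phi[0b0101] = 1.0                     # start from the Neel product state
energies = []
for _ in range(200):
    psi = U @ phi
    psi /= np.linalg.norm(psi)        # METTS |psi(s)> = e^{-beta H/2}|s>/norm
    energies.append((psi.conj() @ H @ psi).real)
    phi = collapse(psi)               # next product state s'

evals = np.linalg.eigvalsh(H)
w = np.exp(-beta * evals)
exact_E = (evals * w).sum() / w.sum() # exact canonical <H> for comparison
print(np.mean(energies), exact_E)
```

The sample mean of the measured energies converges to the canonical average because the chain visits each product state s with probability proportional to <s|exp(-beta H)|s>.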
Racing through the swampland: de Sitter uplift vs weak gravity
We observe that racetrack models for moduli stabilization are in tension with
strong forms of the Weak Gravity Conjecture (WGC). Moreover, recently, it was
noted that controlled KKLT-type de Sitter vacua seem to require a racetrack
fine-tuning of the type introduced by Kallosh and Linde. We combine these
observations and conclude that the quests for realizing parametrically large
axion decay constants and controlled de Sitter vacua are intimately related.
Finally, we discuss possible approaches to curing the conflict between the
racetrack scheme and the WGC.
Comment: 5 pages
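To make the tension concrete, here is the standard racetrack setup in generic notation (a sketch with illustrative parameters $N$, $M$, not the specific construction discussed in the paper):

```latex
% Racetrack superpotential for a modulus T (generic parameters):
W(T) = W_0 + A\,e^{-aT} + B\,e^{-bT},
\qquad a = \frac{2\pi}{N}, \quad b = \frac{2\pi}{M}.
% The axionic direction Im(T) feels only the beat of the two exponentials,
% so its effective decay constant is set by
f_{\mathrm{eff}} \;\propto\; \frac{1}{|a-b|} \;=\; \frac{NM}{2\pi\,|N-M|},
% which becomes parametrically large for N close to M, whereas the axionic
% Weak Gravity Conjecture bounds instanton actions S by
f\,S \;\lesssim\; M_{\mathrm{Pl}}.
```

This is the sense in which parametrically large axion decay constants and the racetrack scheme pull in opposite directions from the WGC bound.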
The Lucas Paradox and the quality of institutions: then and now
In the first era of financial globalization (1880-1914), global capital market integration led to substantial net capital movements from rich to poor economies. This historical experience stands in contrast to contemporary globalization, in which gross capital mobility is equally high but has not incited a substantial transfer of savings from rich to poor economies. Using data for the historical and modern periods, we extend Lucas's (1990) original model and show that differences in institutional quality between rich and poor countries can account for the sharply divergent patterns of international capital movements.
Keywords: capital market integration, financial globalization, economic history
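The baseline calculation behind the paradox, which the paper's institutional extension modifies, fits in a few lines (using Lucas's illustrative figures for the US and India):

```python
# Lucas's (1990) benchmark calculation. Under Cobb-Douglas output per
# worker y = A * k**alpha with a common technology level A, the implied
# marginal product of capital is
#   MPK = alpha * A**(1/alpha) * y**((alpha - 1)/alpha),
# so the MPK ratio across countries depends only on the output ratio.

alpha = 0.4          # capital share, Lucas's illustrative value
y_ratio = 15.0       # US / India output per worker, Lucas's figure

mpk_ratio = y_ratio ** ((1 - alpha) / alpha)   # MPK_India / MPK_US
print(round(mpk_ratio))   # -> 58: capital "should" flow massively to India
```

The paradox is that no such flow is observed; the paper attributes the gap between this prediction and the data to institutional quality.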
Does Financial Integration Spur Economic Growth? New Evidence from the First Era of Financial Globalization
Does international financial integration boost economic growth? The question has long been discussed controversially, and a large number of studies have been devoted to its empirical investigation. As yet, robust evidence for a positive impact of capital market integration on economic growth is lacking, as documented by Edison et al. (2002). However, there is substantial narrative evidence from economic history highlighting the contribution European capital made to the economic growth of peripheral economies during the so-called first era of financial globalization before 1914. For this paper, we have compiled the first comprehensive data set to test econometrically whether capital market integration had a positive impact on economic growth before WW1. Using the same models and techniques as contemporary studies, we show that there was indeed a significant and robust growth effect of international financial integration in the first era of financial globalization. Our tentative explanation for this marked difference between now and then stresses property rights protection as a prerequisite for the standard neoclassical model to work properly.
Keywords: international financial integration; economic growth; first era of globalization
Dynamic Time-Dependent Route Planning in Road Networks with User Preferences
There has been tremendous progress in algorithmic methods for computing
driving directions on road networks. Most of that work focuses on
time-independent route planning, where it is assumed that the cost on each arc
is constant per query. In practice, the current traffic situation significantly
influences the travel time on large parts of the road network, and it changes
over the day. One can distinguish between traffic congestion that can be
predicted using historical traffic data, and congestion due to unpredictable
events, e.g., accidents. In this work, we study the \emph{dynamic and
time-dependent} route planning problem, which takes both prediction (based on
historical data) and live traffic into account. To this end, we propose a
practical algorithm that, while robust to user preferences, is able to
integrate global changes of the time-dependent metric~(e.g., due to traffic
updates or user restrictions) faster than previous approaches, while allowing
subsequent queries that enable interactive applications.
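As background for the problem setting, the core primitive can be sketched as a Dijkstra variant in which each arc cost is a function of the departure time. This is only the textbook routine, not the paper's algorithm (which layers far more elaborate speedup techniques on top), and the toy network and cost functions below are invented for illustration. Under the FIFO property (departing later never makes you arrive earlier), label-setting on earliest-arrival labels remains correct:

```python
import heapq

def td_dijkstra(graph, source, target, t0):
    """graph: {u: [(v, travel_time_fn), ...]} -> earliest arrival at target."""
    dist = {source: t0}
    pq = [(t0, source)]
    while pq:
        t, u = heapq.heappop(pq)
        if t > dist.get(u, float("inf")):
            continue                       # stale queue entry
        if u == target:
            return t
        for v, tt in graph[u]:
            arr = t + tt(t)                # evaluate arc cost at departure time
            if arr < dist.get(v, float("inf")):
                dist[v] = arr
                heapq.heappush(pq, (arr, v))
    return float("inf")

def flat(c):
    return lambda t: float(c)

def rush_hour(t):
    """2 off-peak, peaking at 6 around t = 9; slope >= -1 preserves FIFO."""
    return 2.0 + max(0.0, 4.0 - abs(t - 9))

# Toy network: the detour via B is faster off-peak, slower at rush hour.
graph = {
    "A": [("B", flat(1)), ("C", flat(5))],
    "B": [("C", rush_hour)],
    "C": [],
}
print(td_dijkstra(graph, "A", "C", t0=0))   # -> 3.0, off-peak: A->B->C
print(td_dijkstra(graph, "A", "C", t0=8))   # -> 13.0, rush hour: direct arc wins
```

Live traffic updates correspond to replacing the travel-time functions between queries, which is exactly the kind of global metric change the paper's algorithm is designed to absorb quickly.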
Subsampling Mathematical Relaxations and Average-case Complexity
We initiate a study of when the value of mathematical relaxations such as linear and semidefinite programs for constraint satisfaction problems (CSPs) is approximately preserved when restricting the instance to a sub-instance induced by a small random subsample of the variables. Let $\mathcal{P}$ be a family of CSPs such as 3SAT, Max-Cut, etc., and let $R$ be a relaxation for $\mathcal{P}$, in the sense that for every instance $I$, $R(I)$ is an upper bound on the maximum fraction of satisfiable constraints of $I$. Loosely speaking, we say that subsampling holds for $\mathcal{P}$ and $R$ if for every sufficiently dense instance $I$ and every $\epsilon > 0$, if we let $I'$ be the instance obtained by restricting $I$ to a sufficiently large constant number of variables, then $R(I') \in (1 \pm \epsilon) R(I)$. We say that weak subsampling holds if the above guarantee is replaced with $R(I') = 1 - \Theta(\epsilon)$ whenever $R(I) = 1 - \epsilon$. We show: 1. Subsampling holds for the BasicLP and BasicSDP programs. BasicSDP is a variant of the relaxation considered by Raghavendra (2008), who showed it gives an optimal approximation factor for every CSP under the unique games conjecture. BasicLP is the linear programming analog of BasicSDP. 2. For tighter versions of BasicSDP obtained by adding additional constraints from the Lasserre hierarchy, weak subsampling holds for CSPs of unique games type. 3. There are non-unique CSPs for which even weak subsampling fails for the above tighter semidefinite programs. Also, there are unique CSPs for which subsampling fails for the Sherali-Adams linear programming hierarchy. As a corollary of our weak subsampling for strong semidefinite programs, we obtain a polynomial-time algorithm to certify that random geometric graphs (of the type considered by Feige and Schechtman, 2002) of max-cut value $1 - \epsilon$ have a cut value at most $1 - \epsilon/10$.
Comment: Includes several more general results that subsume the previous version of the paper
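The subsampling phenomenon can be illustrated empirically on Max-Cut, using the exact optimum in place of an LP/SDP relaxation purely to keep the sketch brute-force small: for a dense instance, the max-cut fraction of a random induced sub-instance stays close to that of the full instance. The paper's results concern relaxation values and come with quantitative guarantees; this is only a toy check with made-up parameters.

```python
import itertools
import random

def maxcut_fraction(n, edges):
    """Maximum fraction of edges cut, by brute force over all bipartitions."""
    best = 0
    for mask in range(1 << n):
        cut = sum(1 for u, v in edges if ((mask >> u) ^ (mask >> v)) & 1)
        best = max(best, cut)
    return best / len(edges)

random.seed(1)
n = 14
edges = [(u, v) for u, v in itertools.combinations(range(n), 2)
         if random.random() < 0.5]          # dense G(n, 1/2) instance

k = 9                                       # subsample k of the n variables
sub = random.sample(range(n), k)
idx = {v: i for i, v in enumerate(sub)}
sub_edges = [(idx[u], idx[v]) for u, v in edges if u in idx and v in idx]

full = maxcut_fraction(n, edges)
part = maxcut_fraction(k, sub_edges)
print(full, part)                           # the two fractions are comparable
```

Brute force limits this to tiny instances; the point of the paper is that the same stability holds, provably, for the polynomial-time relaxation values.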
The gains from early intervention in Europe: Fiscal surveillance and fiscal planning using cash data
This paper does two things. First, it examines the use of real-time inter-annual cash data and the role of early interventions in improving the monitoring of national fiscal policies and the correction of fiscal indiscipline. Early warnings are important because they allow the necessary adjustments to be spread over time. Examples from Germany and Italy show that large corrections are often necessary early on to make later adjustments acceptable and to keep debt ratios from escalating. There is a credibility issue here: we find that the difference between front-loaded and back-loaded adjustment schemes is likely to be vital for the time consistency of fiscal policymaking. Second, without early interventions, the later deficit reductions typically double in size, meaning governments become subject to the excessive deficit procedure and significant-improvement tests more often. Thus the budget savings from early intervention and the use of cash data are significant; in our examples they are similar in size to the operating budget of the department of housing and urban development in Germany. Similar results apply in other Eurozone countries.
JEL Classification: E62, H50, H68
Keywords: additive vs slope adjustments, cash data, early warning, fiscal credibility, fiscal surveillance
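The cost of delay can be illustrated with the standard debt-accumulation identity; all numbers below are hypothetical, not the paper's estimates. A primary surplus postponed by k years must be scaled up by (1 + r - g)**k to reach the same final debt ratio, which is the arithmetic behind the front-loaded versus back-loaded comparison:

```python
# Debt-accumulation identity: b[t+1] = (1 + r - g) * b[t] - p[t], with
# b the debt/GDP ratio, p the primary surplus/GDP ratio, and r - g the
# interest-growth differential. Hypothetical, purely illustrative numbers.

def final_debt(b0, surpluses, rg=0.02):
    b = b0
    for p in surpluses:
        b = (1 + rg) * b - p
    return b

b0 = 0.9                                   # initial debt: 90% of GDP
front = [0.03] * 5 + [0.0] * 5             # adjust early, in years 1-5
back = [0.0] * 5 + [0.03 * 1.02 ** 5] * 5  # delayed 5 years, scaled up ~10%

print(final_debt(b0, front))
print(final_debt(b0, back))                # identical final debt ratios
```

With a 2% interest-growth differential, five years of delay already inflate the required surpluses by about 10%; larger differentials or longer delays compound accordingly.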
