
    The Impact of Public Guarantees on Bank Risk Taking: Evidence from a Natural Experiment

    In 2001, government guarantees for savings banks in Germany were removed following a lawsuit. We use this natural experiment to examine the effect of government guarantees on bank risk taking, using a large data set of matched bank/borrower information. The results suggest that banks whose government guarantee was removed reduced credit risk by cutting off the riskiest borrowers from credit. At the same time, the banks also increased interest rates on their remaining borrowers. The effects are economically large: the Z-Score of average borrowers increased by 7% and the average loan size declined by 13%. Remaining borrowers paid 57 basis points higher interest rates, despite their higher quality. Using a difference-in-differences approach, we show that the effect is larger for banks that ex ante benefited more from the guarantee. We show that both the credit quality of new customers improved (screening) and that the loans of existing riskier borrowers were less likely to be renewed (monitoring) after the removal of public guarantees. Public guarantees seem to be associated with substantial moral hazard effects. Keywords: banking; public guarantees; credit risk; moral hazard
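The difference-in-differences logic used in the abstract can be sketched in a few lines: compare the before/after change in some outcome for treated banks (guarantee removed) against the same change for control banks. All numbers below are made up for illustration and are not from the paper.

```python
# Toy difference-in-differences estimate from group means.
# Illustrative numbers only, not the paper's data.

def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """Return the difference-in-differences estimate from four samples."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(treated_post) - mean(treated_pre)) - (
        mean(control_post) - mean(control_pre)
    )

# Hypothetical average borrower Z-Scores before/after the guarantee removal.
treated_pre  = [2.0, 2.2, 1.9]
treated_post = [2.3, 2.4, 2.2]
control_pre  = [2.1, 2.0, 2.2]
control_post = [2.1, 2.1, 2.2]

print(did_estimate(treated_pre, treated_post, control_pre, control_post))
```

The estimate nets out any common time trend (the control group's change) from the treated group's change, which is the identifying idea behind the paper's design.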

    Quantum Monte Carlo Study of Strongly Correlated Electrons: Cellular Dynamical Mean-Field Theory

    We study the Hubbard model using the Cellular Dynamical Mean-Field Theory (CDMFT) with quantum Monte Carlo (QMC) simulations. We present the algorithmic details of CDMFT with the Hirsch-Fye QMC method for the solution of the self-consistently embedded quantum cluster problem. We use the one- and two-dimensional half-filled Hubbard model to gauge the performance of CDMFT+QMC, particularly for small clusters, by comparing with exact results and with other quantum cluster methods. We calculate single-particle Green's functions and self-energies on small clusters to study their size dependence in one and two dimensions. Comment: 14 pages, 18 figures
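The Hirsch-Fye solver mentioned above is organized around Metropolis accept/reject updates of auxiliary Ising spins. As a rough illustration of that accept/reject structure only, here is a toy Metropolis sweep for a 1D Ising chain; this is not the Hirsch-Fye algorithm itself, which additionally updates a fermionic Green's function after each accepted flip.

```python
import math
import random

# Toy Metropolis sweep on a 1D Ising chain with periodic boundaries.
# Illustrates only the accept/reject structure shared with auxiliary-field
# QMC updates; the physical content of Hirsch-Fye is not reproduced here.

def metropolis_sweep(spins, beta, J=1.0, rng=random.random):
    n = len(spins)
    for i in range(n):
        # Energy change from flipping spin i against its two neighbours.
        dE = 2.0 * J * spins[i] * (spins[(i - 1) % n] + spins[(i + 1) % n])
        # Accept downhill moves always, uphill moves with Boltzmann weight.
        if dE <= 0 or rng() < math.exp(-beta * dE):
            spins[i] = -spins[i]
    return spins
```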

    Therapy for Progression and Recurrence of Ovarian Cancer

    Secondary surgery after failure of primary treatment is a promising and reasonable option only for patients with a relapse-free interval of at least 6-12 months who should have ideally achieved a tumor-free status after primary therapy. As after primary surgery, size of residual tumor is the most significant predictor of survival after secondary surgery. Even in the case of multiple tumor sites, complete removal of the tumor can be achieved in nearly 30% of the patients. Treatment results are much better in specialized oncology centers with optimal interdisciplinary cooperation compared with smaller institutions. Chemotherapy can be used both for consolidation after successful secondary surgery and for palliation in patients with inoperable recurrent disease. Since paclitaxel has been integrated into first-line chemotherapy, there is no defined standard for second-line chemotherapy. Several cytotoxic agents have shown moderate activity in this setting, including treosulfan, epirubicin, and newer agents such as topotecan, gemcitabine, vinorelbine, and PEG(polyethylene glycol)-liposomal doxorubicin. Thus, the German Arbeitsgemeinschaft Gynäkologische Onkologie (AGO) has initiated several randomized studies in patients with recurrent ovarian cancer in order to define new standards for second-line chemotherapy.

    Learning from the Success of MPI

    The Message Passing Interface (MPI) has been extremely successful as a portable way to program high-performance parallel computers. This success has occurred in spite of the view of many that message passing is difficult and that other approaches, including automatic parallelization and directive-based parallelism, are easier to use. This paper argues that MPI has succeeded because it addresses all of the important issues in providing a parallel programming model. Comment: 12 pages, 1 figure

    Beyond XSPEC: Towards Highly Configurable Analysis

    We present a quantitative comparison between software features of the de facto standard X-ray spectral analysis tool, XSPEC, and ISIS, the Interactive Spectral Interpretation System. Our emphasis is on customized analysis, with ISIS offered as a strong example of configurable software. While noting that XSPEC has been of immense value to astronomers, and that its scientific core is moderately extensible (most commonly via the inclusion of user-contributed "local models"), we identify a series of limitations with its use beyond conventional spectral modeling. We argue that from the viewpoint of the astronomical user, the XSPEC internal structure presents a Black Box Problem, with many of its important features hidden from the top-level interface, thus discouraging user customization. Drawing from examples in custom modeling, numerical analysis, parallel computation, visualization, data management, and automated code generation, we show how a numerically scriptable, modular, and extensible analysis platform such as ISIS facilitates many forms of advanced astrophysical inquiry. Comment: Accepted by PASP for July 2008 (15 pages)
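The "local model" idea the abstract contrasts with a black box can be sketched as follows: in a scriptable analysis system, a user-contributed component is an ordinary function that composes freely with built-in ones. All function and parameter names below are hypothetical illustrations, not the actual ISIS or XSPEC interfaces.

```python
import math

# Hypothetical sketch of composable spectral models. A built-in continuum
# (power law) and a user-defined "local model" (Gaussian line) combine as
# plain functions; nothing here is hidden behind an opaque interface.

def powerlaw(E, norm, index):
    """Built-in continuum: norm * E^(-index)."""
    return norm * E ** (-index)

def my_line(E, center, width, amp):
    """User-contributed local model: a Gaussian emission line."""
    return amp * math.exp(-((E - center) / width) ** 2)

def composite(E, pars):
    """A user-assembled model: continuum plus line."""
    return powerlaw(E, pars["norm"], pars["index"]) + my_line(
        E, pars["center"], pars["width"], pars["amp"])
```

Because the composite is just a function, it can be evaluated, differentiated numerically, fit, or plotted with any tool in the scripting environment, which is the configurability argument the paper makes for ISIS.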

    Breakup of the aligned H2 molecule by XUV laser pulses: A time-dependent treatment in prolate spheroidal coordinates

    We have carried out calculations of the triple-differential cross section for one-photon double ionization of molecular hydrogen for a central photon energy of 75 eV, using a fully ab initio, nonperturbative approach to solve the time-dependent Schrödinger equation in prolate spheroidal coordinates. The spatial coordinates ξ and η are discretized in a finite-element discrete-variable representation. The wave packet of the laser-driven two-electron system is propagated in time through an effective short iterative Lanczos method to simulate the double ionization of the hydrogen molecule. For both symmetric and asymmetric energy sharing, the present results agree to a satisfactory level with most earlier predictions for the absolute magnitude and the shape of the angular distributions. A notable exception, however, concerns the predictions of the recent time-independent calculations based on the exterior complex scaling method in prolate spheroidal coordinates [Phys. Rev. A 82, 023423 (2010)]. Extensive tests of the numerical implementation were performed, including the effect of truncating the Neumann expansion for the dielectronic interaction on the description of the initial bound state and the predicted cross sections. We observe that the dominant escape mode of the two photoelectrons dramatically depends upon the energy sharing. In the parallel geometry, when the ejected electrons are collected along the direction of the laser polarization axis, back-to-back escape is the dominant channel for strongly asymmetric energy sharing, while it is completely forbidden if the two electrons share the excess energy equally. Comment: 17 pages, 9 figures
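The short iterative Lanczos propagation mentioned above approximates exp(-i*H*dt) applied to the wave packet inside a small Krylov subspace. The following is a minimal textbook-style sketch for a toy dense Hermitian matrix, not the paper's implementation (which works on the large sparse discretized Hamiltonian).

```python
import numpy as np

# Hedged sketch of one short iterative Lanczos step: build an m-dimensional
# Krylov basis from psi, exponentiate the small tridiagonal matrix T, and
# lift the result back to the full space. Toy version: dense H, no
# reorthogonalization, fixed subspace size.

def lanczos_step(H, psi, dt, m=8):
    n = len(psi)
    m = min(m, n)
    V = np.zeros((m, n), dtype=complex)      # Krylov basis vectors (rows)
    alpha = np.zeros(m)                       # diagonal of T
    beta = np.zeros(max(m - 1, 1))            # off-diagonal of T
    V[0] = psi / np.linalg.norm(psi)
    w = H @ V[0]
    alpha[0] = np.real(np.vdot(V[0], w))
    w = w - alpha[0] * V[0]
    for j in range(1, m):
        beta[j - 1] = np.linalg.norm(w)
        if beta[j - 1] < 1e-12:               # invariant subspace found
            m = j
            break
        V[j] = w / beta[j - 1]
        w = H @ V[j] - beta[j - 1] * V[j - 1]
        alpha[j] = np.real(np.vdot(V[j], w))
        w = w - alpha[j] * V[j]
    T = (np.diag(alpha[:m]) + np.diag(beta[:m - 1], 1)
         + np.diag(beta[:m - 1], -1))
    evals, evecs = np.linalg.eigh(T)
    # exp(-i*T*dt) applied to the first basis vector e_1 of the subspace.
    coef = evecs @ (np.exp(-1j * evals * dt) * evecs[0].conj())
    return np.linalg.norm(psi) * (V[:m].T @ coef)
```

When the Krylov dimension reaches the support of the state (as in the small test below), the step is exact; in practice a small m per short time step suffices because the propagator is well approximated locally.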

    An agent-based approach to immune modelling

    This study focuses on trying to understand why the range of experience with respect to HIV infection is so diverse, especially with regard to the latency period. The challenge is to determine what assumptions can be made about the nature of the experience of antigenic invasion and diversity that can be modelled, tested and argued plausibly. To investigate this, an agent-based approach is used to extract high-level behaviour, which cannot be described analytically, from the set of interaction rules at the cellular level. A prototype model encompasses local variation in baseline properties contributing to the individual disease experience and is included in a network which mimics the chain of lymphatic nodes. Dealing with massively multi-agent systems requires major computational effort. However, parallelisation methods are a natural consequence and advantage of the multi-agent approach. These are implemented using the MPI library.
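The agent-based idea, in which population-level behaviour emerges from per-agent interaction rules rather than a closed-form equation, can be sketched with a toy susceptible/infected update. Both the rules and the parameters below are purely illustrative and are not the paper's prototype model.

```python
import random

# Toy agent-based step: each agent applies a local rule. A susceptible ("S")
# agent contacts one random agent and may become infected; an infected ("I")
# agent clears with some probability. Emergent dynamics come from iterating
# this rule, not from solving an aggregate equation.

def step(states, p_infect=0.3, p_clear=0.1, rng=random):
    new = []
    for s in states:
        if s == "S":
            partner = rng.choice(states)
            infected = partner == "I" and rng.random() < p_infect
            new.append("I" if infected else "S")
        else:  # "I"
            new.append("S" if rng.random() < p_clear else "I")
    return new
```

Because each agent's update touches only local state, sweeps like this partition naturally across processes, which is the parallelisation advantage the abstract attributes to the multi-agent approach.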

    Parallel TREE code for two-component ultracold plasma analysis

    The TREE method has been widely used for long-range interaction N-body problems. We have developed a parallel TREE code for two-component classical plasmas with open boundary conditions and highly non-uniform charge distributions. The program efficiently handles millions of particles evolved over long relaxation times requiring millions of time steps. Appropriate domain decomposition and dynamic data management were employed, and large-scale parallel processing was achieved using an intermediate level of granularity of domain decomposition and ghost TREE communication. Even though the computational load is not fully distributed in fine grains, high parallel efficiency was achieved for ultracold plasma systems of charged particles. As an application, we performed simulations of an ultracold neutral plasma with a half million particles and a half million time steps. For the long temporal trajectories of relaxation between heavy ions and light electrons, large configurations of ultracold plasmas can now be investigated, which was not possible in past studies.
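The TREE (Barnes-Hut) approximation replaces a distant group of charges by its aggregate whenever the group's cell looks small from the evaluation point. A minimal serial 2D sketch of that opening criterion follows; the paper's code is parallel, three-dimensional, and handles two components with mixed-sign charges, whereas this toy assumes like-sign charges and distinct particle positions.

```python
import math

# Minimal Barnes-Hut-style quadtree sketch. A cell is "opened" only when
# (cell size / distance) >= theta; otherwise its total charge at the centre
# of charge stands in for all of its particles. Illustrative only.

class Node:
    def __init__(self, cx, cy, half):
        self.cx, self.cy, self.half = cx, cy, half  # square cell geometry
        self.q = 0.0                                # total charge in cell
        self.x = self.y = 0.0                       # centre of charge
        self.body = None                            # single particle, if leaf
        self.kids = None                            # four subcells, if split

    def _child(self, x, y):
        i = (2 if x >= self.cx else 0) + (1 if y >= self.cy else 0)
        return self.kids[i]

    def insert(self, x, y, q):
        if self.q == 0.0:                           # empty cell: store body
            self.body = (x, y, q)
        else:
            if self.kids is None:                   # split and push body down
                self.kids = [Node(self.cx + dx * self.half / 2,
                                  self.cy + dy * self.half / 2,
                                  self.half / 2)
                             for dx in (-1, 1) for dy in (-1, 1)]
                bx, by, bq = self.body
                self.body = None
                self._child(bx, by).insert(bx, by, bq)
            self._child(x, y).insert(x, y, q)
        tot = self.q + q                            # update cell aggregates
        self.x = (self.x * self.q + x * q) / tot
        self.y = (self.y * self.q + y * q) / tot
        self.q = tot

def potential(node, x, y, theta=0.5):
    """Coulomb-like potential q/d at (x, y) from the tree rooted at node."""
    if node is None or node.q == 0.0:
        return 0.0
    if node.body is not None:                       # leaf: exact contribution
        bx, by, bq = node.body
        d = math.hypot(x - bx, y - by)
        return 0.0 if d == 0 else bq / d
    d = math.hypot(x - node.x, y - node.y)
    if (2 * node.half) / d < theta:                 # far enough: monopole
        return node.q / d
    return sum(potential(k, x, y, theta) for k in node.kids)
```

With theta = 0 every cell is opened and the result matches the direct O(N^2) sum exactly; raising theta trades accuracy for the O(N log N) cost that makes million-particle runs like those in the abstract feasible.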