
    Voter experience of corrupt officials is an overlooked reason for the electoral success of radical right parties

    The radical right has had a good few years, with the economic crisis providing fertile ground for a political discourse centred on immigration and economic injustice. Conrad Ziller and Thomas Schübel share research which shows that one of the key, though overlooked, drivers of the rise of parties like the Finns Party in Finland and the Freedom Party of Austria is voter experience of corruption by officials.

    Explaining Inflation Persistence by a Time-Varying Taylor Rule

    In a simple New Keynesian model, we derive a closed-form solution for the inflation persistence parameter as a function of the policy weights in the central bank’s Taylor rule. By estimating the time-varying weights that the Fed attaches to inflation and the output gap, we show that the empirically observed changes in U.S. inflation persistence during the period 1975 to 2010 can be well explained by changes in the conduct of monetary policy. Our findings are in line with Benati’s (2008) view that inflation persistence should not be considered a structural parameter in the sense of Lucas.
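    The mechanism can be sketched in generic New Keynesian notation (the symbols below are illustrative, not necessarily the paper's): a Taylor rule with time-varying weights on inflation and the output gap implies reduced-form inflation dynamics whose persistence coefficient is a function of those weights.

    ```latex
    % Illustrative sketch, generic notation: i_t is the policy rate,
    % \pi_t inflation, x_t the output gap, \phi_{\pi,t} and \phi_{x,t}
    % the time-varying policy weights.
    i_t \;=\; \phi_{\pi,t}\,\pi_t \;+\; \phi_{x,t}\,x_t
    \qquad\Longrightarrow\qquad
    \pi_t \;=\; \rho\bigl(\phi_{\pi,t},\,\phi_{x,t}\bigr)\,\pi_{t-1} \;+\; \varepsilon_t
    ```

    In models of this class, a stronger policy response to inflation typically lowers the implied persistence $\rho$, which is why shifts in the estimated weights can account for the observed changes in U.S. inflation persistence.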

    The Clarens web services architecture

    Clarens is a uniquely flexible web services infrastructure providing a unified access protocol to a diverse set of functions useful to the HEP community. It uses the standard HTTP protocol combined with application-layer, certificate-based authentication to provide single sign-on to individuals, organizations and hosts, with fine-grained access control to services, files and virtual organization (VO) management. This contribution describes the server functionality, while client applications are described in a subsequent talk.
    Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics (CHEP03), La Jolla, Ca, USA, March 2003, 6 pages, LaTeX, 4 figures, PSN MONT00
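    The client side of HTTPS with certificate-based authentication can be sketched as follows; this is a generic illustration using Python's standard `ssl` module, not Clarens' actual API, and the certificate paths are hypothetical.

    ```python
    # Minimal sketch of a TLS context for certificate-based client
    # authentication over HTTPS; names and paths are hypothetical.
    import ssl

    def make_client_context(cert_file=None, key_file=None):
        """Context that verifies the server; if a client certificate is
        supplied, it is presented for single sign-on-style authentication."""
        ctx = ssl.create_default_context(purpose=ssl.Purpose.SERVER_AUTH)
        if cert_file:
            # In a real deployment the user's grid certificate would be loaded:
            ctx.load_cert_chain(certfile=cert_file, keyfile=key_file)
        return ctx

    ctx = make_client_context()  # no client certificate: server verification only
    ```

    An HTTPS connection built on such a context rejects unverifiable servers by default, which is the property single sign-on schemes of this kind rely on.
    
    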

    Implementing a Reconciliation and Balancing Model in the U.S. Industry Accounts

    As part of the U.S. Bureau of Economic Analysis’ integration initiative (Yuskavage, 2000; Moyer et al., 2004a, 2004b; Lawson et al., 2006), the Industry Accounts Directorate is drawing upon the Stone method (Stone et al., 1942) and Chen (2006) to reconcile the gross operating surplus component of value-added from the 2002 expenditure-based benchmark input-output accounts and the 2002 income-based gross domestic product-by-industry accounts. The objective of the reconciliation is to use information regarding the relative reliabilities of underlying data in both the benchmark input-output use table and the gross domestic product-by-industry accounts in a balanced input-output framework in order to improve intermediate input estimates and gross operating surplus estimates in both accounts. Given a balanced input-output framework, the Stone method also provides a tool for balancing the benchmark use table. This paper presents work by the Industry Accounts Directorate to develop and implement the reconciliation and balancing model. The paper provides overviews of the benchmark use table and gross domestic product-by-industry accounts, including features of external source data and adjustment methodologies that are relevant for the reconciliation. In addition, the paper presents the empirical model that the Industry Accounts Directorate is building and briefly describes the technology used to solve the model. Preliminary work during development of the model shows that reconciling and balancing a large system with disaggregated data is computationally feasible and efficient in pursuit of an economically accurate and reliable benchmark use table and gross domestic product-by-industry accounts.
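    The core idea of reliability-weighted reconciliation can be illustrated for a single cell; the sketch below is a toy scalar version (the full Stone method solves a constrained generalized-least-squares problem over the whole use table, and the variable names here are illustrative, not BEA's).

    ```python
    # Toy illustration of reliability-weighted reconciliation, the idea
    # behind the Stone method for one cell: each source estimate is
    # weighted by the inverse of its (assumed) error variance.
    def reconcile(est_a, est_b, var_a, var_b):
        """Combine two estimates of the same quantity, weighting each by
        the inverse of its variance (its relative reliability)."""
        w_a, w_b = 1.0 / var_a, 1.0 / var_b
        return (w_a * est_a + w_b * est_b) / (w_a + w_b)

    # Two source estimates of gross operating surplus for one industry,
    # the first judged three times as reliable as the second:
    print(reconcile(10.0, 12.0, 1.0, 3.0))  # → 10.5
    ```

    The more reliable source pulls the reconciled value toward itself; in the full model the same weighting is applied simultaneously to every cell, subject to the accounting identities that balance the table.
    
    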

    Sensitivity of the Cherenkov Telescope Array to the detection of a dark matter signal in comparison to direct detection and collider experiments

    Imaging atmospheric Cherenkov telescopes (IACTs) that are sensitive to potential γ-ray signals from dark matter (DM) annihilation above ∼50 GeV will soon be superseded by the Cherenkov Telescope Array (CTA). CTA will have a point-source sensitivity an order of magnitude better than currently operating IACTs and will cover a broad energy range between 20 GeV and 300 TeV. Using effective field theory and simplified models to calculate γ-ray spectra resulting from DM annihilation, we compare the prospects to constrain such models with CTA observations of the Galactic center with current and near-future measurements at the Large Hadron Collider (LHC) and direct detection experiments. For DM annihilations via vector or pseudoscalar couplings, CTA observations will be able to probe DM models out of reach of the LHC, and, if DM is coupled to standard fermions by a pseudoscalar particle, beyond the limits of current direct detection experiments.
    Comment: Accepted for publication in PRD. 20 pages, 11 figures

    Transitions: Individual Handling and Forms of Coping with Institutional Change

    Transitions are understood here, in general terms, as non-steady, discontinuous processes of change. In self-observation they appear, for example, as ruptures, surprising events, unexpected opportunities, or shocks never thought possible. In retrospect, at any rate, whether judged positively or negatively, they appear as decisive turning points which, if they do not outright determine subsequent decisions, at least shape them lastingly and profoundly. They are regarded as non-steady because actors today are increasingly affected by them in ways that are "unforeseen" and cannot be planned for.

    UK/US naval interoperability collaborative research

    This paper outlines a collaborative program being carried out under an agreement between the US and the UK which started in January 2000 and is due to continue for four years. The research examines the operational problems of coalition force interoperability, initially from a naval perspective at the command and combat system level, then moving to a wider domain to cover both land and air participation. Details are given of why the research is necessary, the objectives, and the approach being adopted. The paper then provides some information on the experience gained from the initial trials carried out during the first six months of this year.

    The Clarens Web Service Framework for Distributed Scientific Analysis in Grid Projects

    Large scientific collaborations are moving towards service-oriented architectures for the implementation and deployment of globally distributed systems. Clarens is a high-performance, easy-to-deploy Web Service framework that supports the construction of such globally distributed systems. This paper discusses some of the core functionality of Clarens that the authors believe is important for building distributed systems based on Web Services that support scientific analysis.

    Job Monitoring in an Interactive Grid Analysis Environment

    The grid is emerging as a great computational resource, but its dynamic behavior makes the Grid environment unpredictable. Systems and networks can fail, and the introduction of more users can result in resource starvation. Once a job has been submitted for execution on the grid, monitoring becomes essential for a user to see that the job is completed in an efficient way, and to detect any problems that occur while the job is running. In current environments, once a user submits a job he loses direct control over it and the system behaves like a batch system: the user submits the job and later gets a result back. The only information a user can obtain about a job is whether it is scheduled, running, cancelled or finished. Today users are becoming increasingly interested in analysis grid environments in which they can check the progress of a job, obtain intermediate results, terminate the job based on its progress or intermediate results, steer the job to other nodes to achieve better performance, and check the resources consumed by the job. To fulfill these requirements for interactivity, a mechanism is needed that provides the user with real-time access to information about the different attributes of a job. In this paper we present the design of a Job Monitoring Service, a web service that provides interactive remote job monitoring by allowing users to access different attributes of a job once it has been submitted to the interactive Grid Analysis Environment.
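    The attribute-access pattern such a service exposes can be sketched with an in-memory registry; the class and method names below are hypothetical, not the paper's actual interface, and a real deployment would expose these methods over a web-service protocol rather than in-process calls.

    ```python
    # Minimal sketch of per-job attribute monitoring; names are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class Job:
        job_id: str
        status: str = "scheduled"      # scheduled | running | cancelled | finished
        progress: float = 0.0          # fraction of work completed
        resources: dict = field(default_factory=dict)

    class JobMonitoringService:
        def __init__(self):
            self._jobs = {}

        def register(self, job_id):
            self._jobs[job_id] = Job(job_id)

        def update(self, job_id, **attrs):
            # Called as the running job reports status, progress, resources.
            job = self._jobs[job_id]
            for name, value in attrs.items():
                setattr(job, name, value)

        def get_attribute(self, job_id, name):
            # Real-time read of a single job attribute, as a web-service
            # method would expose it to the user.
            return getattr(self._jobs[job_id], name)

    svc = JobMonitoringService()
    svc.register("job-42")
    svc.update("job-42", status="running", progress=0.4)
    print(svc.get_attribute("job-42", "progress"))  # → 0.4
    ```

    Exposing fine-grained attributes, rather than a single batch-style state, is what lets a user act on intermediate results, for example cancelling or steering a job mid-run.
    
    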