4,375 research outputs found

    Migration and Regional Disparities: the Role of Skill Biased Flows

    The persistence of disparities is one of the most striking features of regional development. We argue that labour force movements, rather than always acting as an equilibrating mechanism, can also perpetuate or even reinforce such inequalities. The most advanced regions are in fact generally more attractive, in terms of opportunities, especially to more qualified workers, who are in turn an essential ingredient of regional development and competitiveness because of the human capital they carry. We set up a two-region framework with a continuum of individuals of different skill types. Each agent's utility depends on the wage she earns through her skills; the process of human capital formation is left outside this paper. Within this framework, we identify and model two complementary mechanisms through which skill-biased migration flows arise. The first resides in the way wages are set: if the most skilled workers are not paid their productivity because of wage compression, they have an incentive to move towards regions with a more dispersed wage scheme. The second lies in the existence of region-specific immobile assets, which make workers differently productive across regions; this affects the highest-skilled to a larger extent, so they are more likely to overcome the mobility costs. A Kaldor-type cumulative process generating persistent regional disparities is thus set up.
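
    A minimal numerical sketch of the first mechanism may help fix ideas: with a compressed wage schedule in the home region and a more dispersed one in the destination, only workers above a skill threshold find it worthwhile to pay the mobility cost. The wage schedules, mobility cost and skill grid below are invented for illustration and are not the paper's calibration.

    # Illustrative sketch (not the paper's model): skill-biased migration under
    # wage compression. All functional forms and parameter values are assumptions.
    import numpy as np

    skills = np.linspace(0.0, 1.0, 11)        # continuum of skill types, discretised

    # Region-specific wage schedules: region A compresses wages (flat slope),
    # region B pays a more dispersed, productivity-related wage.
    wage_A = 1.0 + 0.3 * skills               # compressed wage scheme
    wage_B = 0.8 + 1.0 * skills               # dispersed wage scheme

    mobility_cost = 0.25                      # fixed cost of moving from A to B

    # An agent located in A migrates to B when the wage gain exceeds the mobility cost.
    gain = wage_B - wage_A
    migrates = gain > mobility_cost

    for s, g, m in zip(skills, gain, migrates):
        print(f"skill={s:.1f}  wage gain={g:+.2f}  migrates={m}")

    # Only agents above a skill threshold migrate, so the outflow from the
    # wage-compressing region is biased towards the most skilled workers.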

    Performance pay and shifts in macroeconomic correlations

    Galí and Gambetti (2009) detected a coincidence in time between the volatility break associated with the "Great Moderation" and large changes in the pattern of conditional and unconditional correlations between output, hours and labor productivity. We provide a novel explanation for these findings, based on the major changes that occurred in the U.S. design of labor compensation around the mid-1980s. These include a substantial increase in the incidence of performance pay coupled with a higher responsiveness of real wages to the business cycle. We capture this shift in the structure of labor compensation in a Dynamic New Keynesian (DNK) model and show that, by itself, it generates the disappearance of the procyclical response of labor productivity to non-technology shocks and a reduction of the contractionary effects of technology shocks on hours worked. Moreover, it accounts for a large share of the observed drop in output volatility after 1984 and for most of the observed changes in unconditional correlations.
    Keywords: procyclical productivity, wage rigidities, performance pay.

    MORE: Merged Opinions Reputation Model

    Reputation is generally defined as the opinion of a group on an aspect of a thing. This paper presents a reputation model that follows a probabilistic modelling of opinions based on three main concepts: (1) the value of an opinion decays with time, (2) the reputation of the opinion source impacts the reliability of the opinion, and (3) the certainty of the opinion impacts its weight with respect to other opinions. Furthermore, the model is flexible with respect to its opinion sources: it may use explicit opinions or implicit opinions extracted from agent behavior in domains where explicit opinions are sparse. We illustrate the latter with an approach that extracts opinions from behavioral information in the sports domain, focusing on football in particular. One use of a reputation model is predicting behavior. We take up the challenge of predicting the behavior of football teams in football matches, which we argue is an interesting yet difficult setting for evaluating the model.
    Comment: 12th European Conference on Multi-Agent Systems (EUMAS 2014)
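
    As a rough illustration of how the three ingredients above can be combined, the sketch below aggregates opinions with a time-decay factor, a source-reputation weight and a certainty weight. The formulas, the half-life and the example values are assumptions chosen for illustration, not the actual MORE model.

    # Illustrative sketch of the three ingredients described above (time decay,
    # source reputation, opinion certainty); the exact formulas and parameters
    # here are assumptions, not the ones used in the MORE model.
    import time

    def aggregate_reputation(opinions, now=None, half_life=30 * 24 * 3600):
        """Combine opinions into a single reputation score in [0, 1].

        Each opinion is a dict with:
          value      -- the expressed opinion in [0, 1]
          timestamp  -- when it was issued (seconds since the epoch)
          source_rep -- reputation of the source in [0, 1]
          certainty  -- how certain the source is, in [0, 1]
        """
        now = time.time() if now is None else now
        weighted_sum, total_weight = 0.0, 0.0
        for op in opinions:
            decay = 0.5 ** ((now - op["timestamp"]) / half_life)  # older opinions count less
            weight = decay * op["source_rep"] * op["certainty"]   # reliability of this opinion
            weighted_sum += weight * op["value"]
            total_weight += weight
        return weighted_sum / total_weight if total_weight > 0 else 0.5  # neutral prior

    # Example: two fresh, certain opinions from reputable sources dominate an old one.
    t0 = time.time()
    opinions = [
        {"value": 0.9, "timestamp": t0, "source_rep": 0.8, "certainty": 0.9},
        {"value": 0.8, "timestamp": t0, "source_rep": 0.7, "certainty": 0.8},
        {"value": 0.1, "timestamp": t0 - 365 * 24 * 3600, "source_rep": 0.9, "certainty": 0.9},
    ]
    print(round(aggregate_reputation(opinions, now=t0), 3))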

    An Innovative Workspace for The Cherenkov Telescope Array

    The Cherenkov Telescope Array (CTA) is an initiative to build the next-generation ground-based gamma-ray observatory. We present a prototype workspace developed at INAF that aims at providing innovative solutions for the CTA community. The workspace leverages open-source technologies to provide web access to a set of tools widely used by the CTA community. Two different user interaction models, connected to an authentication and authorization infrastructure, have been implemented in this workspace. The first is a workflow management system accessed via a science gateway (based on the Liferay platform); the second is an interactive virtual desktop environment. The integrated workflow system allows users to run applications commonly used in astronomy and physics research on distributed computing infrastructures (ranging from clusters to grids and clouds). The interactive desktop environment allows users to work with many software packages through their native graphical user interfaces, without installing anything on their local desktops. The science gateway and the interactive desktop environment are connected to an authentication and authorization infrastructure composed of a Shibboleth identity provider and a Grouper authorization solution. The attributes released by Grouper are consumed by the science gateway to authorize access to specific web resources, and the role management mechanism in Liferay provides the attribute-role mapping.
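
    The attribute-based authorization step can be pictured with a small, purely hypothetical mapping from released group attributes to gateway roles; the group names, role names and fallback behaviour below are invented and do not reflect the actual Liferay/Grouper configuration of the CTA workspace.

    # Purely hypothetical sketch of an attribute-to-role mapping like the one the
    # abstract describes; group names, role names and the mapping itself are
    # invented for illustration and do not come from the CTA workspace.
    ATTRIBUTE_ROLE_MAP = {
        "cta:workspace:users":      ["gateway-user"],
        "cta:workspace:developers": ["gateway-user", "workflow-developer"],
        "cta:workspace:admins":     ["gateway-user", "workspace-admin"],
    }

    def roles_for(released_groups):
        """Resolve gateway roles from the group attributes released at login."""
        roles = set()
        for group in released_groups:
            roles.update(ATTRIBUTE_ROLE_MAP.get(group, []))
        return sorted(roles) or ["guest"]   # fall back to a minimal role

    # Example: a user whose identity provider releases two group memberships.
    print(roles_for(["cta:workspace:users", "cta:workspace:developers"]))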

    Reinterpreting the development of extensive air showers initiated by nuclei and photons

    Ultra-high energy cosmic rays (UHECRs) interacting with the atmosphere generate extensive air showers (EAS) of secondary particles. The depth corresponding to the maximum development of the shower, $X_{\max}$, is a well-known observable for determining the nature of the primary cosmic ray which initiated the cascade process. In this paper, we present an empirical model to describe the distribution of $X_{\max}$ for EAS initiated by nuclei, in the energy range from $10^{17}$ eV up to $10^{21}$ eV, and by photons, in the energy range from $10^{17}$ eV up to $10^{19.6}$ eV. Our model adopts the generalized Gumbel distribution, motivated by the relationship between generalized Gumbel statistics and the distribution of the sum of non-identically distributed variables in dissipative stochastic systems. We provide an analytical expression describing the $X_{\max}$ distribution for photons and for nuclei, and for their first two statistical moments, namely $\langle X_{\max}\rangle$ and $\sigma^{2}(X_{\max})$. The impact of the hadronic interaction model is investigated in detail, including the most up-to-date models accounting for LHC observations. We also briefly discuss the differences with a more classical approach and an application to the experimental data based on information theory.
    Comment: 21 pages, 4 tables, 8 figures
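
    For concreteness, the sketch below evaluates a generalized Gumbel density in its standard parameterisation and cross-checks the corresponding closed-form mean and variance against numerical integration. The parameter values are invented, and the parameterisation convention is an assumption rather than necessarily the one adopted in the paper.

    # Sketch of a generalized Gumbel description of the X_max distribution.
    # Standard generalized-Gumbel parameterisation; the parameter values below
    # are invented for illustration, not the paper's fits.
    import numpy as np
    from scipy.special import gammaln, digamma, polygamma

    def generalized_gumbel_pdf(x, mu, sigma, lam):
        """p(x) = lam^lam / (sigma*Gamma(lam)) * exp(-lam*z - lam*exp(-z)), z = (x-mu)/sigma."""
        z = (x - mu) / sigma
        log_norm = lam * np.log(lam) - gammaln(lam) - np.log(sigma)
        return np.exp(log_norm - lam * z - lam * np.exp(-z))

    def gumbel_moments(mu, sigma, lam):
        """First two moments of the generalized Gumbel in this parameterisation."""
        mean = mu + sigma * (np.log(lam) - digamma(lam))
        variance = sigma**2 * polygamma(1, lam)       # trigamma(lam)
        return mean, variance

    # Hypothetical parameters roughly in the X_max ballpark (g/cm^2).
    mu, sigma, lam = 750.0, 35.0, 1.5
    x = np.linspace(550.0, 1050.0, 2001)
    pdf = generalized_gumbel_pdf(x, mu, sigma, lam)

    # Cross-check the analytical moments against a numerical integration of the PDF.
    dx = x[1] - x[0]
    mean_num = np.sum(x * pdf) * dx
    var_num = np.sum((x - mean_num)**2 * pdf) * dx
    print(gumbel_moments(mu, sigma, lam), (mean_num, var_num))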

    Automated detection of extended sources in radio maps: progress from the SCORPIO survey

    Automated source extraction and parameterization represents a crucial challenge for the next-generation radio interferometer surveys, such as those performed with the Square Kilometre Array (SKA) and its precursors. In this paper we present a new algorithm, dubbed CAESAR (Compact And Extended Source Automated Recognition), to detect and parameterize extended sources in radio interferometric maps. It is based on a pre-filtering stage, which performs image denoising, compact-source suppression and enhancement of diffuse emission, followed by an adaptive superpixel clustering stage for the final source segmentation. A parameterization stage provides source flux information and a wide range of morphology estimators for post-processing analysis. We developed CAESAR as a modular software library that also includes different methods for local background estimation and image filtering, along with alternative algorithms for both compact and diffuse source extraction. The method was applied to real radio continuum data collected at the Australia Telescope Compact Array (ATCA) within the SCORPIO project, a pathfinder of the ASKAP-EMU survey. The source reconstruction capabilities were studied over different test fields in the presence of compact sources, imaging artefacts and diffuse emission from the Galactic plane, and compared with existing algorithms. When compared to a human-driven analysis, the designed algorithm was found capable of detecting known target sources and regions of diffuse emission, outperforming alternative approaches over the considered fields.
    Comment: 15 pages, 9 figures
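
    A schematic version of such a pipeline (pre-filtering followed by superpixel clustering and a simple per-superpixel threshold) is sketched below using scipy and scikit-image (version 0.19 or later assumed); it is not the CAESAR implementation, and all filter sizes, thresholds and SLIC settings are illustrative assumptions.

    # Illustrative sketch of a pre-filtering + superpixel segmentation pipeline in
    # the spirit described above; this is not the CAESAR implementation, and all
    # parameter choices (kernel sizes, thresholds, SLIC settings) are assumptions.
    import numpy as np
    from scipy import ndimage
    from skimage.segmentation import slic

    def extract_extended_regions(image, n_segments=300, nsigma=3.0):
        """Return a boolean mask of candidate extended-emission regions."""
        # 1) Pre-filtering: median filter to suppress compact sources and noise,
        #    then a broad Gaussian to enhance the remaining diffuse emission.
        residual = ndimage.median_filter(image, size=5)
        diffuse = ndimage.gaussian_filter(residual, sigma=3.0)

        # 2) Superpixel clustering (SLIC) on the filtered map.
        labels = slic(diffuse, n_segments=n_segments, compactness=0.1,
                      channel_axis=None, start_label=1)

        # 3) Keep superpixels whose mean brightness exceeds a simple
        #    background + n*sigma threshold (background taken from the full map here).
        bkg, rms = np.median(diffuse), np.std(diffuse)
        mask = np.zeros_like(image, dtype=bool)
        for lab in np.unique(labels):
            region = labels == lab
            if diffuse[region].mean() > bkg + nsigma * rms:
                mask |= region
        return mask

    # Example on a synthetic map: noise plus one faint extended blob.
    rng = np.random.default_rng(0)
    img = rng.normal(0.0, 1.0, (256, 256))
    yy, xx = np.mgrid[:256, :256]
    img += 3.0 * np.exp(-(((yy - 128) ** 2 + (xx - 128) ** 2) / (2 * 25.0 ** 2)))
    print(extract_extended_regions(img).sum(), "pixels flagged as extended emission")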