
    Implementing means-tested welfare systems in the United States

    While targeting can effectively channel resources to the poor, implementation details matter tremendously to distributive outcomes. Several key factors affect performance, including: data collection processes; information management; household assessment mechanisms; institutional arrangements; and monitoring and oversight mechanisms. This report conducts an in-depth assessment of key design and implementation factors and their potential impact on outcomes for the household targeting system used in the United States to target social programs to the poor and vulnerable.

    Risk and vulnerability in Guatemala: a quantitative and qualitative assessment

    This study combines quantitative data from the Living Standards Measurement Study and qualitative information from an in-depth qualitative study of poverty and exclusion conducted in 10 villages in Guatemala. Both data sources were designed to capture issues related to vulnerability, risks, and risk management. The quantitative survey included a risks and shocks module, in which households were asked to report if they had experienced a shock during the previous 12 months, using precoded questions for 28 economic, natural, social/political, and life-cycle shocks. These shocks were classified ex ante into covariant and idiosyncratic shocks. Households also reported: (1) whether these shocks triggered a reduction or loss of their income or wealth; (2) the main strategy that they used to cope with their welfare loss; (3) whether they had succeeded in reversing the reduction or loss in their welfare by the time of the survey; and (4) the estimated time that had elapsed until successful resolution of the situation. Information on covariant shocks was also collected from the community questionnaire at the survey cluster level.
The vulnerability assessment includes several types of analysis of shocks and their impact, including (1) factor analysis to understand the correlation structure or "bunching" of shocks; (2) a multivariate logistic model to examine the association between a household's characteristics and location and the probability that it reports a shock or incurs wealth and income losses due to the shock and the probability that it has recovered from the negative impact of the shock by the time of the interview; (3) nonparametric density estimation to estimate the counterfactual density of consumption or income; (4) multiple regression analysis to estimate the cost of shocks; (5) propensity score matching to estimate the cost of shocks; and (6) multiple regression analysis to estimate vulnerability to consumption poverty.

Poverty Assessment; Environmental Economics & Policies; Health Economics & Finance; Social Risk Management; Services & Transfers to Poor

    Measuring Ancient Inequality

    Is inequality largely the result of the Industrial Revolution? Or were pre-industrial incomes and life expectancies as unequal as they are today? For want of sufficient data, these questions have not yet been answered. This paper infers inequality for 14 ancient, pre-industrial societies using what are known as social tables, stretching from the Roman Empire in 14 AD, to Byzantium in 1000, to England in 1688, to Nueva España around 1790, to China in 1880, and to British India in 1947. It applies two new concepts in making those assessments – what we call the inequality possibility frontier and the inequality extraction ratio. Rather than simply offering measures of actual inequality, we compare the latter with the maximum feasible inequality (or surplus) that could have been extracted by the elite. The results, especially when compared with modern poor countries, give new insights into the connection between inequality and economic development in the very long run.

Inequality possibility frontier; pre-industrial inequality; history
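The two concepts can be sketched numerically under a simplifying assumption: in a large population where everyone but the elite lives at a subsistence minimum s and mean income is m, the maximum feasible Gini (the inequality possibility frontier) approaches (m - s) / m, and the extraction ratio compares the measured Gini with that maximum. The numbers below are illustrative, not taken from the paper's social tables.

```python
def max_feasible_gini(mean_income, subsistence):
    """Large-population limit of the inequality possibility frontier:
    the Gini that results if the elite captured the entire surplus
    above subsistence."""
    return (mean_income - subsistence) / mean_income

def extraction_ratio(actual_gini, mean_income, subsistence):
    """Share of the maximum feasible inequality actually extracted."""
    return actual_gini / max_feasible_gini(mean_income, subsistence)

# Hypothetical pre-industrial society: mean income twice subsistence,
# so at most half of total income is surplus available to the elite.
print(max_feasible_gini(600.0, 300.0))      # 0.5
print(extraction_ratio(0.4, 600.0, 300.0))  # 0.8
```

The point of the ratio is that a poor society with a modest measured Gini may still sit close to its frontier, i.e. its elite extracts most of what is feasible.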

    Does Globalization Make the World More Unequal?

    The world economy has become more unequal over the last two centuries. Since within-country inequality exhibits no ubiquitous trend, it follows that virtually all of the observed rise in world income inequality has been driven by widening gaps between nations, while almost none of it has been driven by widening gaps within nations. Meanwhile, the world economy has become much more globally integrated over the past two centuries. If correlation meant causation, these facts would imply that globalization has raised inequality between nations, but that it has had no clear effect on inequality within nations. This paper argues that the likely impact of globalization on world inequality has been very different from what these simple correlations suggest. Globalization probably mitigated rising inequality between participating nations. The nations that gained the most from globalization are those poor ones that changed their policies to exploit it, while the ones that gained the least did not, or were too isolated to do so. The effect of globalization on inequality within nations has gone both ways, but here too those who have lost the most from globalization typically have been the excluded non-participants. In any case, these within-nation effects are far too small to explain the observed long-run rise in world inequality.

    THREE CENTURIES OF INEQUALITY IN BRITAIN AND AMERICA

    Income and wealth inequality rose over the first 150 years of U.S. history. They may have risen at times in Britain before 1875. The first half of this century equalized pre-fisc incomes more in Britain than in America. From the 1970s to the 1990s inequality rose in both countries, reversing some of the previous equalization. Government redistribution explains part but not all of the reversals in inequality trends. Factor-market forces and economic growth would have produced a similar chronology of rises and falls in income inequality even without shifts in the progressivity of redistribution through government. For economies starting from highly unequal property ownership, the development process lowers inequality. History suggests, however, that this may happen only once. Redistribution toward the poor tends to happen least in those times and polities where it would seem most justified by the usual goals of welfare policy.

    Direct stau production at the LHC

    We investigate the direct production of supersymmetric scalar taus at the LHC. We present the general calculation of the dominant cross section contributions for hadronic stau pair production within the MSSM, taking into account left-right mixing of the stau eigenstates. We find that b-quark annihilation and gluon fusion can enhance the cross sections by more than one order of magnitude with respect to the Drell-Yan predictions. For long-lived staus, we consider CMSSM parameter regions with such enhanced cross sections and possible consequences from recent searches. We find that regions of exceptionally small stau yields, favoured by cosmology, are in tension with a recent CMS limit on m_stau.

Comment: 9 pages, 6 figures. Talk given at the workshop "School and Workshops on Elementary Particle Physics and Gravity", September 4-18, 2011, Corfu, Greece

    Pseudo-observables in electroweak Higgs production

    We discuss how the leading electroweak Higgs production processes at the LHC, namely vector-boson fusion and Higgs+W/Z associated production, can be characterized in generic extensions of the Standard Model by a proper set of pseudo-observables (PO). We analyze the symmetry properties of these PO and their relation with the PO set appearing in Higgs decays. We discuss in detail the kinematical studies necessary to extract the production PO from data, and present a first estimate of the LHC sensitivity on these observables in the high-luminosity phase. The impact of QCD corrections and the kinematical studies necessary to test the validity of the momentum expansion at the basis of the PO decomposition are also discussed.

Comment: 34 pages, 12 figures, 1 table

    LHC Tests of Light Neutralino Dark Matter without Light Sfermions

    We address the question of how light the lightest MSSM neutralino can be as a dark matter candidate in a scenario where all supersymmetric scalar particles are heavy. The hypothesis that the neutralino accounts for the observed dark matter density sets strong requirements on the supersymmetric spectrum, thus providing a handle for collider tests. In particular, for a lightest neutralino below 100 GeV, the relic density constraint translates into an upper bound on the Higgsino mass parameter μ in case all supersymmetric scalar particles are heavy. One can define a simplified model that highlights only the necessary features of the spectrum and their observable consequences at the LHC. Reinterpreting recent searches at the LHC, we derive limits on the mass of the lightest neutralino that, in many cases, prove to be more constraining than dark matter experiments themselves.

Comment: 22 pages, 8 figures