
    A Taste of Cosmology

    This is the summary of two lectures that aim to give an overview of cosmology. I will not try to be too rigorous in derivations, nor to give a full historical overview. The idea is to provide a "taste" of cosmology and some of the interesting topics it covers. The standard cosmological model is presented and I highlight the successes of cosmology over the past decade or so. Keys to the development of the standard cosmological model are observations of the cosmic microwave background and of large-scale structure, which are introduced. Inflation and dark energy and the outlook for the future are also discussed. Slides from the lectures are available from the school website: physicschool.web.cern.ch/PhysicSchool/CLASHEP/CLASHEP2011/. Comment: 16 pages, contribution to the 2011 CERN-Latin-American School of High-Energy Physics, Natal, Brazil, 23 March-5 April 2011, edited by C. Grojean, M. Mulders and M. Spiropulu

    Large-scale bias in the Universe: bispectrum method

    Evidence that the Universe may be close to the critical density, required for its expansion eventually to be halted, comes principally from dynamical studies of large-scale structure. These studies either use the observed peculiar velocity field of galaxies directly, or indirectly by quantifying its anisotropic effect on galaxy clustering in redshift surveys. A potential difficulty with both such approaches is that the density parameter Omega_0 is obtained only in the combination beta = Omega_0^0.6 / b, if linear perturbation theory is used. The determination of the density parameter Omega_0 is therefore compromised by the lack of a good measurement of the bias parameter b, which relates the clustering of sample galaxies to the clustering of mass. In this paper, we develop an idea of Fry (1994), using second-order perturbation theory to investigate how to measure the bias parameter on large scales. The use of higher-order statistics allows the degeneracy between b and Omega_0 to be lifted, and an unambiguous determination of Omega_0 then becomes possible. We apply a likelihood approach to the bispectrum, the three-point function in Fourier space. This paper is the first step in turning the idea into a practical proposition for redshift surveys, and is principally concerned with noise properties of the bispectrum, which are non-trivial. The calculation of the required bispectrum covariances involves the six-point function, including many noise terms, for which we have developed a generating functional approach which will be of value in calculating high-order statistics in general. Comment: 12 pages, latex, 7 postscript figures included. Accepted by MNRAS. (Minor numerical typesetting errors corrected; results unchanged.)
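
    The beta-degeneracy described in this abstract can be made concrete with a toy calculation (a minimal sketch with illustrative numbers, not taken from the paper): very different pairs of (Omega_0, b) yield nearly the same linear distortion parameter, which is why a higher-order statistic such as the bispectrum is needed to separate them.

```python
# Minimal sketch of the beta-degeneracy (illustrative numbers, not from the paper).
# Linear redshift-space distortions constrain only beta = Omega_0^0.6 / b,
# so a single measured beta is consistent with many (Omega_0, b) pairs.
def beta(omega_0, b):
    """Linear-theory distortion parameter beta = Omega_0^0.6 / b."""
    return omega_0 ** 0.6 / b

# Two very different cosmologies with almost identical beta:
print(beta(1.0, 2.0))   # critical density, strongly biased tracers
print(beta(0.3, 0.97))  # low-density universe, nearly unbiased tracers
```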

    Assessment and validation of wildfire susceptibility and hazard in Portugal

    A comprehensive methodology to assess forest fire susceptibility, using variables with strong spatial correlation, is presented and applied to mainland Portugal. Our study is based on a thirty-year chronological series of burnt areas. The first twenty years (1975–1994) are used for statistical modelling, and the last ten (1995–2004) are used for the independent validation of results. The wildfire-affected areas are crossed with a set of independent layers that are assumed to be relevant wildfire conditioning factors: elevation, slope, land cover, rainfall and temperature. Moreover, the wildfire recurrence pattern is also considered, as a proxy variable expressing the influence of human action on wildfire occurrence. A sensitivity analysis is performed to evaluate the weight of each individual theme within the susceptibility model. Validation of the wildfire susceptibility models is made through the computation of success rate and prediction rate curves. The results show that it is possible to reach a good compromise between the number of variables within the model and the model's predictive power. Additionally, it is shown that the integration of climatic variables does not produce any relevant increase in the prediction capacity of wildfire susceptibility models. Finally, the prediction rate curves produced by the independent cross-validation are used to assess probabilistic wildfire hazard on a scenario basis, for the complete mainland Portuguese territory.
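
    Success and prediction rate curves of the kind used for validation here can be sketched as follows (a minimal illustration with hypothetical cell scores and burnt flags, not the study's data): cells are ranked by modelled susceptibility, and the cumulative fraction of burnt area captured is tracked against the fraction of territory considered.

```python
# Hedged sketch (hypothetical data): a success/prediction rate curve ranks map
# cells by modelled susceptibility and accumulates the fraction of burnt area
# captured as an increasing fraction of the territory is included.
def rate_curve(scores, burnt):
    """Return (fraction_of_area, fraction_of_burnt) points, highest scores first."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    total_burnt = sum(burnt)
    pts, captured = [], 0
    for rank, i in enumerate(order, start=1):
        captured += burnt[i]
        pts.append((rank / len(scores), captured / total_burnt))
    return pts

# Toy example: 5 cells with susceptibility scores and burnt flags (1 = burnt).
scores = [0.9, 0.2, 0.7, 0.4, 0.1]
burnt = [1, 0, 1, 0, 0]
print(rate_curve(scores, burnt))
```

A model with predictive power pushes the curve above the diagonal: here the top 40% most susceptible cells already contain all of the burnt area.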

    Flea-bite allergic dermatitis: a study of epidemiological factors in the urban area of Zaragoza

    We studied 101 cases selected from those presenting at the dermatology clinic over the course of one year, and analysed the results of an epidemiological questionnaire applied to each case. After statistical treatment of the data, four significant factors (p < 0.05) were found to influence the presentation of the disease. Three of these are considered risk factors: age at presentation of the first clinical signs, the season in which pruritus appears, and flea infestation. Flea control emerged as a protective factor.
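
    The kind of significance test behind factors flagged at p < 0.05 can be illustrated with a 2x2 chi-square test (a hedged sketch with hypothetical counts, not the study's data):

```python
import math

# Hedged sketch (hypothetical counts): a 2x2 Pearson chi-square test of the
# kind used to flag epidemiological factors at p < 0.05, e.g. flea
# infestation vs. disease status.
def chi2_2x2(a, b, c, d):
    """Chi-square statistic and p-value (1 dof) for the table [[a, b], [c, d]]."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(stat / 2))  # survival function of chi2 with 1 dof
    return stat, p

stat, p = chi2_2x2(30, 10, 15, 25)  # hypothetical exposed/unexposed counts
print(round(stat, 2), p < 0.05)
```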

    Tests for primordial non-Gaussianity

    We investigate the relative sensitivities of several tests for deviations from Gaussianity in the primordial distribution of density perturbations. We consider models for non-Gaussianity that mimic that which comes from inflation as well as that which comes from topological defects. The tests we consider involve the cosmic microwave background (CMB), large-scale structure (LSS), high-redshift galaxies, and the abundances and properties of clusters. We find that the CMB is superior at finding non-Gaussianity in the primordial gravitational potential (as inflation would produce), while observations of high-redshift galaxies are much better suited to find non-Gaussianity that resembles that expected from topological defects. We derive a simple expression that relates the abundance of high-redshift objects in non-Gaussian models to the primordial skewness. Comment: 6 pages, 2 figures, MNRAS in press (minor changes to match the accepted version)
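
    The link between primordial skewness and the abundance of rare objects can be illustrated with a first-order Edgeworth expansion of the tail probability (a sketch of the general idea, not the paper's derived expression; the threshold and skewness values below are hypothetical):

```python
import math

# Hedged sketch: an Edgeworth-type estimate of how primordial skewness boosts
# the fraction of rare, high peaks (and hence the abundance of high-z objects).
# s3 is the dimensionless skewness <delta^3>/sigma^3; nu = delta_c / sigma.
def tail_fraction(nu, s3=0.0):
    """P(delta/sigma > nu) with the leading skewness correction."""
    gauss = 0.5 * math.erfc(nu / math.sqrt(2))
    phi = math.exp(-nu * nu / 2) / math.sqrt(2 * math.pi)
    return gauss + (s3 / 6) * (nu * nu - 1) * phi

nu = 3.0  # a 3-sigma collapse threshold; rare-object regime
boost = tail_fraction(nu, s3=0.5) / tail_fraction(nu)
print(round(boost, 2))  # positive skewness makes rare objects noticeably more common
```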

    Reducing sample variance: halo biasing, non-linearity and stochasticity

    Comparing the clustering of differently biased tracers of the dark matter distribution offers the opportunity to reduce the cosmic variance error in the measurement of certain cosmological parameters. We develop a formalism that includes bias non-linearities and stochasticity. Our formalism is general enough that it can be used to optimise survey design and tracer selection, and to optimally split (or combine) tracers to minimise the error on the cosmologically interesting quantities. Our approach generalises the one presented by McDonald & Seljak (2009) of circumventing sample variance in the measurement of f = dlnD/dlna. We analyse how the bias, the noise, the non-linearity and stochasticity affect the measurements of Df and explore in which signal-to-noise regime it is significantly advantageous to split a galaxy sample into two differently-biased tracers. We use N-body simulations to find realistic values for the parameters describing the bias properties of dark matter haloes of different masses and their number density. We find that, even if dark matter haloes could be used as tracers and selected in an idealised way, for realistic haloes the sample variance limit can be reduced only by up to a factor sigma_2tr/sigma_1tr ~ 0.6. This would still correspond to the gain from a three times larger survey volume if the two tracers were not to be split. Before any practical application, one should bear in mind that these findings apply to dark matter haloes as tracers, while realistic surveys would select galaxies: the galaxy-host halo relation is likely to introduce extra stochasticity, which may reduce the gain further. Comment: 21 pages, 13 figures. Published version in MNRAS
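
    The sample-variance cancellation exploited here can be seen in a toy Monte Carlo (a hedged sketch with made-up bias and noise values, not the paper's formalism): two tracers share the same random density modes, so their mode-by-mode ratio is insensitive to the particular realisation of the field.

```python
import random

# Hedged toy Monte Carlo: with two tracers of the SAME density field, the
# mode-by-mode ratio cancels the random field delta ("sample variance"), so
# the bias ratio b2/b1 is recovered with an error set by shot noise alone.
random.seed(1)
b1, b2, n_modes, noise = 1.0, 2.0, 400, 0.05  # illustrative values

ratios = []
for _ in range(n_modes):
    delta = random.gauss(0.0, 1.0)               # shared long-wavelength mode
    t1 = b1 * delta + random.gauss(0.0, noise)   # tracer 1 with shot noise
    t2 = b2 * delta + random.gauss(0.0, noise)   # tracer 2 with shot noise
    ratios.append(t2 / t1)

est = sorted(ratios)[n_modes // 2]  # median is robust to small-delta modes
print(round(est, 1))
```

The recovered ratio clusters tightly around b2/b1 = 2 even though each individual mode amplitude delta is random, which is the essence of circumventing sample variance.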

    The Bispectrum of IRAS Galaxies

    We compute the bispectrum for the galaxy distribution in the IRAS QDOT, 2Jy, and 1.2Jy redshift catalogs for wavenumbers 0.05 < k < 0.2 h/Mpc and compare the results with predictions from gravitational instability in perturbation theory. Taking into account redshift space distortions, nonlinear evolution, the survey selection function, and discreteness and finite volume effects, all three catalogs show evidence for the dependence of the bispectrum on configuration shape predicted by gravitational instability. Assuming Gaussian initial conditions and local biasing parametrized by linear and non-linear bias parameters b_1 and b_2, a likelihood analysis yields 1/b_1 = 1.32^{+0.36}_{-0.58}, 1.15^{+0.39}_{-0.39} and b_2/b_1^2 = -0.57^{+0.45}_{-0.30}, -0.50^{+0.31}_{-0.51}, for the 2Jy and 1.2Jy samples, respectively. This implies that IRAS galaxies trace dark matter increasingly weakly as the density contrast increases, consistent with their being under-represented in clusters. In a model with chi^2 non-Gaussian initial conditions, the bispectrum displays an amplitude and scale dependence different from that found in the Gaussian case; if IRAS galaxies do not have bias b_1 > 1 at large scales, chi^2 non-Gaussian initial conditions are ruled out at the 95% confidence level. The IRAS data do not distinguish between Lagrangian or Eulerian local bias. Comment: 30 pages, 11 figures
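
    At tree level with local bias, the reduced galaxy bispectrum obeys Q_g = Q_m/b_1 + b_2/b_1^2, so two triangle configurations with different matter values Q_m suffice to separate b_1 and b_2. A minimal sketch (the numbers below are purely illustrative, chosen only to echo the quoted best-fit values):

```python
# Hedged sketch: with local bias, Q_g = Q_m / b1 + b2 / b1**2 is linear in Q_m,
# so two measured configurations determine the slope (1/b1) and intercept
# (b2/b1^2). Toy inputs below, not the paper's measurements.
def solve_bias(qm1, qg1, qm2, qg2):
    """Solve Q_g = Q_m/b1 + b2/b1^2 for (b1, b2/b1^2) from two configurations."""
    inv_b1 = (qg1 - qg2) / (qm1 - qm2)  # slope of Q_g vs Q_m is 1/b1
    c = qg1 - qm1 * inv_b1              # intercept is b2/b1^2
    return 1.0 / inv_b1, c

b1, b2_over_b1sq = solve_bias(0.6, 0.3, 1.2, 1.1)
print(round(b1, 3), round(b2_over_b1sq, 3))
```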

    Nontrivial Geometries: Bounds on the Curvature of the Universe

    Probing the geometry of the universe is one of the most important endeavours in cosmology. Current observational data from the cosmic microwave background (CMB) anisotropy, galaxy surveys and type Ia supernovae (SNe Ia) strongly constrain the curvature of the universe to be close to zero for a universe dominated by a cosmological constant or dark energy with a constant equation of state. Here we investigate the role of cosmic priors in deriving these tight bounds on geometry, by considering a landscape-motivated scenario with an oscillating curvature term. We perform a likelihood analysis of current data under such a model of non-trivial geometry and find that the uncertainties on curvature, and correspondingly on parameters of the matter and dark energy sectors, are larger. Future dark energy experiments, together with CMB data from experiments like Planck, could dramatically improve our ability to constrain cosmic curvature under such models, enabling us to probe possible imprints of quantum gravity. Comment: 7 pages, 8 figures. Submitted
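
    How a curvature term enters the distance measures that such data constrain can be sketched as follows (a standard constant-w setup, not the paper's oscillating-curvature model; parameter values are illustrative):

```python
import math

# Hedged sketch of curvature in distances: for a constant-w dark energy model,
# H(z)^2/H0^2 = Om(1+z)^3 + Ok(1+z)^2 + Ode(1+z)^(3(1+w)), and the transverse
# comoving distance is bent by sinh/sin depending on the sign of Ok.
def comoving_distance(z, om=0.3, ok=0.0, w=-1.0, steps=2000):
    """Dimensionless line-of-sight comoving distance H0*D_C/c (midpoint rule)."""
    ode = 1.0 - om - ok
    dz = z / steps
    total = 0.0
    for i in range(steps):
        zi = (i + 0.5) * dz
        e = math.sqrt(om * (1 + zi) ** 3 + ok * (1 + zi) ** 2
                      + ode * (1 + zi) ** (3 * (1 + w)))
        total += dz / e
    return total

def transverse_distance(z, om=0.3, ok=0.0, w=-1.0):
    """Dimensionless transverse comoving distance H0*D_M/c."""
    dc = comoving_distance(z, om, ok, w)
    if abs(ok) < 1e-8:
        return dc  # flat universe
    s = math.sqrt(abs(ok))
    return math.sinh(s * dc) / s if ok > 0 else math.sin(s * dc) / s

# A small curvature change shifts distances only mildly at z ~ 1, which is
# why curvature bounds are so sensitive to the assumed dark energy model:
print(round(transverse_distance(1.0, ok=0.0), 3),
      round(transverse_distance(1.0, ok=0.02), 3))
```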

    Evolution of the decay mechanisms in central collisions of Xe + Sn from E/A = 8 to 29 MeV

    Collisions of Xe + Sn at beam energies of E/A = 8 to 29 MeV and leading to fusion-like heavy residues are studied using the 4π INDRA multidetector. The fusion cross section was measured and shows a maximum at E/A = 18-20 MeV. A decomposition into four exit channels, defined by the number of heavy fragments produced in central collisions, has been made. Their relative yields are measured as a function of the incident beam energy. The energy spectra of light charged particles (LCP) in coincidence with the fragments of each exit channel have been analyzed. They reveal that a highly excited composite system is formed, which first decays by emitting light particles and then either breaks up into two or more fragments or survives as an evaporation residue. A quantitative estimation of this primary emission is given and compared to the secondary decay of the fragments. These analyses indicate that most of the evaporative LCP precede not only fission but also breakup into several fragments. Comment: Invited Talk given at the 11th International Conference on Nucleus-Nucleus Collisions (NN2012), San Antonio, Texas, USA, May 27-June 1, 2012. To appear in the NN2012 Proceedings in Journal of Physics: Conference Series (JPCS)