
    Why Bayesian “evidence for H1” in one condition and Bayesian “evidence for H0” in another condition does not mean good-enough Bayesian evidence for a difference between the conditions

    Psychologists are often interested in whether an independent variable has a different effect in condition A than in condition B. To test such a question, one needs to directly compare the effect of that variable in the two conditions (i.e., test the interaction). Yet many researchers tend to stop when they find a significant test in one condition and a nonsignificant test in the other condition, deeming this sufficient evidence for a difference between the two conditions. In this Tutorial, we aim to raise awareness of this inferential mistake when Bayes factors are used with conventional cutoffs to draw conclusions. For instance, some researchers might falsely conclude that there must be good-enough evidence for the interaction if they find good-enough Bayesian evidence for the alternative hypothesis, H1, in condition A and good-enough Bayesian evidence for the null hypothesis, H0, in condition B. The case study we introduce highlights that ignoring the test of the interaction can lead to unjustified conclusions and demonstrates that the principle that any assertion about the existence of an interaction necessitates the direct comparison of the conditions is as true for Bayesian as it is for frequentist statistics. We provide an R script of the analyses of the case study and a Shiny app that can be used with a 2 × 2 design to develop intuitions on this issue, and we introduce a rule of thumb with which one can estimate the sample size one might need to have a well-powered design.
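
    As a rough illustration of the point (a minimal sketch, not the authors' R script or Shiny app), the snippet below uses the BIC approximation BF10 ~ exp((BIC_null - BIC_alt)/2) on synthetic 2 × 2 data: Bayes factors computed separately per condition can look decisive while the Bayes factor for the interaction itself stays weak. All parameter values are assumptions invented for the example.

        # Minimal sketch (not the authors' R script): BIC-approximated Bayes factors
        # in a 2x2 design on synthetic data. BF10 ~ exp((BIC_null - BIC_alt) / 2).
        import numpy as np

        rng = np.random.default_rng(1)
        n = 40                               # observations per cell (assumed)
        effect_a, effect_b = 0.6, 0.2        # true effect sizes in conditions A and B (assumed)

        def bic(y, X):
            """BIC of an ordinary least-squares fit of y on X (Gaussian likelihood)."""
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            n_obs = len(y)
            return n_obs * np.log(resid @ resid / n_obs) + X.shape[1] * np.log(n_obs)

        def bf10(y, X_alt, X_null):
            """Approximate Bayes factor favouring the alternative over the null."""
            return np.exp((bic(y, X_null) - bic(y, X_alt)) / 2)

        # Scores for a two-level factor x within each condition.
        x = np.tile([0.0, 1.0], n)
        y_a = effect_a * x + rng.normal(size=2 * n)
        y_b = effect_b * x + rng.normal(size=2 * n)
        ones = np.ones(2 * n)
        print("BF10, condition A:", bf10(y_a, np.column_stack([ones, x]), ones[:, None]))
        print("BF10, condition B:", bf10(y_b, np.column_stack([ones, x]), ones[:, None]))

        # The question that actually matters: does the effect of x differ between conditions?
        cond = np.concatenate([np.zeros(2 * n), np.ones(2 * n)])
        x_all, y_all = np.concatenate([x, x]), np.concatenate([y_a, y_b])
        ones_all = np.ones(4 * n)
        X_null = np.column_stack([ones_all, x_all, cond])               # main effects only
        X_alt = np.column_stack([ones_all, x_all, cond, x_all * cond])  # plus interaction
        print("BF10, interaction:", bf10(y_all, X_alt, X_null))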

    Cosmic microwave background constraints on cosmological models with large-scale isotropy breaking

    Several anomalies appear to be present in the large-angle cosmic microwave background (CMB) anisotropy maps of WMAP, including the alignment of large-scale multipoles. Models in which isotropy is spontaneously broken (e.g., by a scalar field) have been proposed as explanations for these anomalies, as have models in which a preferred direction is imposed during inflation. We examine models inspired by these, in which isotropy is broken by a multiplicative factor with dipole and/or quadrupole terms. We evaluate the evidence provided by the multipole alignment using a Bayesian framework, finding that the evidence in favor of the model is generally weak. We also compute approximate changes in estimated cosmological parameters in the broken-isotropy models. Only the overall normalization of the power spectrum is modified significantly. Comment: Accepted for publication in Phys. Rev.
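
    A crude numerical illustration of the kind of modulation considered (not the paper's parameterization; the preferred direction and amplitudes below are invented for the example): multiply an isotropic temperature field by a factor containing dipole and quadrupole terms along a fixed axis.

        # Illustrative isotropy-breaking modulation (invented amplitudes, not the paper's):
        # T'(n) = T(n) * [1 + A1 (n.d) + A2 ((n.d)^2 - 1/3)] for a preferred direction d.
        import numpy as np

        rng = np.random.default_rng(0)
        npix = 10000
        v = rng.normal(size=(npix, 3))
        nhat = v / np.linalg.norm(v, axis=1, keepdims=True)   # random sky directions

        T = rng.normal(size=npix)            # stand-in for an isotropic CMB map
        dhat = np.array([0.0, 0.0, 1.0])     # assumed preferred direction
        A1, A2 = 0.10, 0.05                  # assumed dipole and quadrupole amplitudes

        mu = nhat @ dhat
        T_broken = T * (1.0 + A1 * mu + A2 * (mu**2 - 1.0 / 3.0))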

    Application of Bayesian model averaging to measurements of the primordial power spectrum

    Cosmological parameter uncertainties are often stated assuming a particular model, neglecting the model uncertainty, even when Bayesian model selection is unable to identify a conclusive best model. Bayesian model averaging is a method for assessing parameter uncertainties in situations where there is also uncertainty in the underlying model. We apply model averaging to the estimation of the parameters associated with the primordial power spectra of curvature and tensor perturbations. We use CosmoNest and MultiNest to compute the model Evidences and posteriors, using cosmic microwave background data from WMAP, ACBAR, BOOMERanG and CBI, plus large-scale structure data from the SDSS DR7. We find that the model-averaged 95% credible interval for the spectral index using all of the data is 0.940 < n_s < 1.000, where n_s is specified at a pivot scale of 0.015 Mpc^{-1}. For the tensors, model averaging can tighten the credible upper limit, depending on prior assumptions. Comment: 7 pages with 7 figures included
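
    Schematically, model averaging weights each model's posterior by its evidence (times the model prior) and pools the result; the sketch below uses placeholder evidences and Gaussian posteriors for n_s, not the values obtained in the paper.

        # Schematic Bayesian model averaging over two models for n_s
        # (placeholder evidences and posteriors, not the paper's results).
        import numpy as np

        rng = np.random.default_rng(0)
        models = {
            "HZ (n_s = 1 fixed)": {"ln_Z": 0.0,  "samples": np.full(200_000, 1.0)},
            "tilted (n_s free)":  {"ln_Z": -0.5, "samples": rng.normal(0.963, 0.013, 200_000)},
        }

        ln_Z = np.array([m["ln_Z"] for m in models.values()])
        w = np.exp(ln_Z - ln_Z.max())
        w /= w.sum()                          # posterior model probabilities (equal model priors)

        # Pool posterior samples in proportion to the model weights.
        pooled = np.concatenate([
            rng.choice(m["samples"], size=int(wi * 100_000))
            for m, wi in zip(models.values(), w)
        ])
        lo, hi = np.percentile(pooled, [2.5, 97.5])
        print(f"model-averaged 95% interval: {lo:.3f} < n_s < {hi:.3f}")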

    Might EPR particles communicate through a wormhole?

    We consider the two-particle wave function of an Einstein-Podolsky-Rosen system, given by a two-dimensional relativistic scalar field model. The Bohm-de Broglie interpretation is applied and the quantum potential is viewed as modifying the Minkowski geometry. In this way an effective metric, which is analogous to a black hole metric in some limited region, is obtained in one case, and a particular metric with singularities appears in the other case, opening the possibility, following Holland, of interpreting the EPR correlations as originating from an effective wormhole geometry, through which the physical signals can propagate. Comment: Corrected version, to appear in EP
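
    For orientation, and with conventions that may well differ from the paper's (this is the generic Bohm-de Broglie decomposition of a Klein-Gordon field, not a result quoted from it): writing the field as \phi = R e^{iS/\hbar}, the real part of the Klein-Gordon equation becomes a Hamilton-Jacobi equation in which a quantum potential corrects the mass term,

        \partial_\mu S \, \partial^\mu S \;=\; m^2 c^2 + \hbar^2 \, \frac{\Box R}{R}
        \;\equiv\; m^2 c^2 - Q ,
        \qquad
        Q \;=\; -\hbar^2 \, \frac{\Box R}{R} ,

    and it is this Q-dependent correction that can be reinterpreted as a modification of the background geometry.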

    Determining the Neutrino Mass Hierarchy with Cosmology

    The combination of current large scale structure and cosmic microwave background (CMB) anisotropies data can place strong constraints on the sum of the neutrino masses. Here we show that future cosmic shear experiments, in combination with CMB constraints, can provide the statistical accuracy required to answer questions about differences in the mass of individual neutrino species. Allowing for the possibility that masses are non-degenerate, we combine Fisher matrix forecasts for a weak lensing survey like Euclid with those for the forthcoming Planck experiment. Under the assumption that neutrino mass splitting is described by a normal hierarchy, we find that the combination of Planck and Euclid will possibly reach enough sensitivity to put a constraint on the mass of a single species. Using a Bayesian evidence calculation, we find that such future experiments could provide strong evidence for either a normal or an inverted neutrino hierarchy. Finally we show that if a particular neutrino hierarchy is assumed, then this could bias cosmological parameter constraints, for example the dark energy equation of state parameter, by > 1\sigma, and the sum of masses by 2.3\sigma. Comment: 9 pages, 6 figures, 3 tables
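
    The forecast combination itself is mechanically simple (sketch below with made-up 3x3 matrices, not the paper's forecasts): for independent experiments the Fisher matrices add, and marginalized 1-sigma errors are the square roots of the diagonal of the inverse.

        # Schematic Fisher-matrix combination (toy matrices, not the paper's forecasts).
        import numpy as np

        # Illustrative 3-parameter ordering, e.g. (sum of neutrino masses, w, n_s).
        F_planck = np.array([[ 40.0,  5.0,  2.0],
                             [  5.0, 30.0,  1.0],
                             [  2.0,  1.0, 90.0]])
        F_euclid = np.array([[120.0, 20.0,  4.0],
                             [ 20.0, 80.0,  3.0],
                             [  4.0,  3.0, 60.0]])

        F_combined = F_planck + F_euclid          # independent data sets add in Fisher space
        sigmas = np.sqrt(np.diag(np.linalg.inv(F_combined)))
        for name, s in zip(["sum m_nu", "w", "n_s"], sigmas):
            print(f"sigma({name}) = {s:.3f}")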

    Focused laser Doppler velocimeter

    A system for remotely measuring velocities present in discrete volumes of air is described. A CO2 laser beam is focused by a telescope at such a volume (a focal volume) lying within the focusable range (near field) of the telescope. The backscatter, or reflected light, principally from the focal volume, passes back through the telescope and is compared in frequency with the original frequency of the laser; the difference frequency or frequencies represent particle velocities in that focal volume.
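
    The underlying relation is the Doppler shift of the backscattered light: the measured difference (beat) frequency is delta_f = 2 v / lambda, the factor of two coming from the round trip, so the line-of-sight velocity in the focal volume follows directly. A trivial conversion for a CO2 wavelength and an assumed beat frequency (illustrative numbers only):

        # Beat frequency to line-of-sight velocity for backscatter: delta_f = 2 v / lambda.
        wavelength = 10.6e-6        # CO2 laser wavelength in metres
        beat_frequency = 1.0e6      # measured difference frequency in Hz (assumed)

        velocity = beat_frequency * wavelength / 2.0
        print(f"line-of-sight velocity: {velocity:.2f} m/s")   # 5.30 m/s here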

    Extended Heat-Fluctuation Theorems for a System with Deterministic and Stochastic Forces

    Heat fluctuations over a time \tau in a non-equilibrium stationary state and in a transient state are studied for a simple system with deterministic and stochastic components: a Brownian particle dragged through a fluid by a harmonic potential which is moved with constant velocity. Using a Langevin equation, we find the exact Fourier transform of the distribution of these fluctuations for all \tau. By a saddle-point method we obtain analytical results for the inverse Fourier transform, which, for not too small \tau, agree very well with numerical results from a sampling method as well as from the fast Fourier transform algorithm. Due to the interaction of the deterministic part of the motion of the particle in the mechanical potential with the stochastic part of the motion caused by the fluid, the conventional heat fluctuation theorem is, for infinite and for finite \tau, replaced by an extended fluctuation theorem that differs noticeably and measurably from it. In particular, for large fluctuations, the ratio of the probability of absorption of heat (by the particle from the fluid) to the probability of supplying heat (by the particle to the fluid) is much larger here than in the conventional fluctuation theorem. Comment: 23 pages, 6 figures. Figures are now in color, Eq. (67) was corrected and a footnote was added on the d-dimensional case
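
    A minimal numerical counterpart of the setup (an overdamped Euler-Maruyama sketch with invented parameters; the paper's treatment is analytical and works with the full Langevin dynamics): drag a Brownian particle with a uniformly moving harmonic trap and bookkeep work and heat via the first law.

        # Overdamped sketch of a Brownian particle dragged by a moving harmonic trap
        # (invented parameters; the paper's results are exact and analytical).
        import numpy as np

        rng = np.random.default_rng(0)
        k, gamma, kT, v = 1.0, 1.0, 1.0, 0.5   # stiffness, friction, temperature, trap speed
        dt, n_steps = 1e-3, 20_000             # so tau = n_steps * dt = 20

        def U(x, t):
            return 0.5 * k * (x - v * t) ** 2  # potential of the moving trap

        x, work = 0.0, 0.0
        for i in range(n_steps):
            t = i * dt
            work += k * (x - v * t) * (-v) * dt       # dW = (dU/dt at fixed x) dt
            force = -k * (x - v * t)
            x += force * dt / gamma + np.sqrt(2 * kT * dt / gamma) * rng.normal()

        tau = n_steps * dt
        heat_absorbed = U(x, tau) - U(0.0, 0.0) - work   # first law: Q = dU - W
        print(f"heat absorbed from the fluid over tau = {tau}: {heat_absorbed:.3f}")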

    Thermodynamic Properties of Generalized Exclusion Statistics

    We analytically calculate some thermodynamic quantities of an ideal g-on gas obeying generalized exclusion statistics. We show that the specific heat of a g-on gas (g \neq 0) vanishes linearly in any dimension as T \to 0 when the particle number is conserved, and exhibits an interesting dual symmetry that relates the particle-statistics at g to the hole-statistics at 1/g at low temperatures. We derive the complete solution for the cluster coefficients b_l(g) as a function of Haldane's statistical interaction g in D dimensions. We also find that the cluster coefficients b_l(g) and the virial coefficients a_l(g) are exactly mirror symmetric (l odd) or antisymmetric (l even) about g = 1/2. In two dimensions, we completely determine the closed forms of the cluster and the virial coefficients of the generalized exclusion statistics, which exactly agree with the virial coefficients of an anyon gas of linear energies. We show that the g-on gas with zero chemical potential shows thermodynamic properties similar to photon statistics. We discuss some physical implications of our results. Comment: 24 pages, Revtex, Corrected typos
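
    For concreteness, a generic numerical illustration of Haldane exclusion statistics in Wu's occupation-number form (a standard result, not the paper's cluster or virial expansion): solve w^g (1+w)^{1-g} = exp((eps - mu)/kT) and set n = 1/(w + g); g = 1 recovers Fermi-Dirac and g -> 0 the Bose limit. All numerical values below are assumptions for the example.

        # Mean occupation number in Wu's form of Haldane exclusion statistics
        # (generic illustration; not the paper's cluster/virial expansion).
        import numpy as np
        from scipy.optimize import brentq

        def occupation(eps, mu, kT, g):
            """Solve w**g * (1 + w)**(1 - g) = exp((eps - mu)/kT), return n = 1/(w + g)."""
            log_rhs = (eps - mu) / kT
            if g == 0.0:                               # Bose-Einstein limit
                return 1.0 / (np.exp(log_rhs) - 1.0)
            f = lambda w: g * np.log(w) + (1.0 - g) * np.log1p(w) - log_rhs
            w = brentq(f, 1e-12, 1e12)                 # root of the transcendental equation
            return 1.0 / (w + g)

        for g in (1.0, 0.5, 0.25):                     # g = 1 gives Fermi-Dirac
            print(g, occupation(eps=1.0, mu=0.0, kT=1.0, g=g))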

    The thermodynamics of prediction

    A system responding to a stochastic driving signal can be interpreted as computing, by means of its dynamics, an implicit model of the environmental variables. The system's state retains information about past environmental fluctuations, and a fraction of this information is predictive of future ones. The remaining nonpredictive information reflects model complexity that does not improve predictive power, and thus represents the ineffectiveness of the model. We expose the fundamental equivalence between this model inefficiency and thermodynamic inefficiency, measured by dissipation. Our results hold arbitrarily far from thermodynamic equilibrium and are applicable to a wide range of systems, including biomolecular machines. They highlight a profound connection between the effective use of information and efficient thermodynamic operation: any system constructed to keep memory about its environment and to operate with maximal energetic efficiency has to be predictive. Comment: 5 pages, 1 figure
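
    Schematically, and in notation common in this literature (my paraphrase of the kind of relation the abstract describes, not a quotation of the paper): writing x_t for the driving signal, s_t for the system's state, and I[.;.] for mutual information, the dissipation in a driving step is tied to the nonpredictive part of the memory,

        \beta \,\langle W_{\mathrm{diss}}(t \to t+\Delta t)\rangle
        \;=\; I[s_t ; x_t] \;-\; I[s_t ; x_{t+\Delta t}]
        \;=\; I_{\mathrm{mem}}(t) - I_{\mathrm{pred}}(t) ,

    so memory about the environment that does not help prediction is paid for in dissipated work.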

    Direct reconstruction of the quintessence potential

    We describe an algorithm which directly determines the quintessence potential from observational data, without using an equation of state parametrisation. The strategy is to numerically determine observational quantities as a function of the expansion coefficients of the quintessence potential, which are then constrained using a likelihood approach. We further impose a model selection criterion, the Bayesian Information Criterion, to determine the appropriate level of the potential expansion. In addition to the potential parameters, the present-day quintessence field velocity is kept as a free parameter. Our investigation contains unusual model types, including a scalar field moving on a flat potential, or in an uphill direction, and is general enough to permit oscillating quintessence field models. We apply our method to the `gold' Type Ia supernovae sample of Riess et al. (2004), confirming the pure cosmological constant model as the best description of current supernovae luminosity-redshift data. Our method is optimal for extracting quintessence parameters from future data. Comment: 9 pages RevTeX4 with lots of incorporated figures
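
    The model-selection step rests on the Bayesian Information Criterion, BIC = -2 ln L_max + k ln N, with k free parameters and N data points. A toy version of choosing an expansion order by minimizing the BIC (synthetic data; none of the paper's quintessence likelihood machinery is reproduced here):

        # Toy BIC-based choice of expansion order on synthetic data
        # (for Gaussian errors, -2 ln L_max = chi^2_min up to a constant).
        import numpy as np

        rng = np.random.default_rng(2)
        N, sigma = 50, 0.1
        z = np.linspace(0.0, 1.5, N)
        data = (1.0 + 0.3 * z) + rng.normal(scale=sigma, size=N)   # assumed underlying trend

        for order in range(4):                        # candidate expansion orders
            coeffs = np.polyfit(z, data, order)
            chi2 = np.sum(((data - np.polyval(coeffs, z)) / sigma) ** 2)
            bic = chi2 + (order + 1) * np.log(N)      # chi^2_min + k ln N
            print(f"order {order}: chi2 = {chi2:6.1f}, BIC = {bic:6.1f}")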