
    Generalized Taylor and Generalized Calvo Price and Wage-Setting: Micro Evidence with Macro Implications

    The Generalized Calvo and Generalized Taylor models of price and wage-setting are, unlike their standard Calvo and Taylor counterparts, exactly consistent with the distribution of durations observed in the data. Using price and wage micro-data from a major euro-area economy (France), we develop calibrated versions of these models. We assess the consequences for monetary policy transmission by embedding these calibrated models in a standard DSGE model. The Generalized Taylor model is found to help rationalize the hump-shaped response of inflation without resorting to the counterfactual assumption of systematic wage and price indexation.
    Keywords: contract length, steady state, hazard rate, Calvo, Taylor, wage-setting, price-setting
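
    To make the mapping from micro durations to these models concrete, here is a minimal Python sketch (not taken from the paper) that converts a made-up completed-spell duration distribution into duration-specific Calvo hazard rates and steady-state Generalized Taylor contract shares; the numbers and variable names are illustrative only.

    ```python
    # Minimal sketch (not the paper's code): back out Generalized Calvo hazard
    # rates and Generalized Taylor contract shares from an observed distribution
    # of completed price-spell durations. The duration shares are invented.

    duration_share = {1: 0.40, 2: 0.25, 3: 0.15, 4: 0.12, 5: 0.08}  # P(spell lasts j months)

    # Generalized Calvo: duration-specific hazard h_j = P(spell ends at j | survived to j).
    hazards = {}
    for j in sorted(duration_share):
        at_risk = sum(f for k, f in duration_share.items() if k >= j)
        hazards[j] = duration_share[j] / at_risk

    # Generalized Taylor: a mix of fixed-length contracts. In steady state the
    # cross-section share of agents on a j-period contract is length-biased,
    # i.e. proportional to j times the completed-spell share of length j.
    cross_section = {j: j * f for j, f in duration_share.items()}
    total = sum(cross_section.values())
    cross_section = {j: w / total for j, w in cross_section.items()}

    print("hazard rates:", {j: round(h, 3) for j, h in hazards.items()})
    print("cross-section shares:", {j: round(w, 3) for j, w in cross_section.items()})
    ```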

    Heterogeneity in Consumer Price Stickiness: A Microeconometric Investigation.

    This paper examines heterogeneity in price stickiness using a large, original set of individual price data collected at the retail level for the computation of the French CPI. To that end, we estimate, at a very high level of disaggregation, competing-risks duration models that distinguish between price increases, price decreases and product replacements. The main findings are the following: (i) cross-product and cross-outlet-type heterogeneity in both the shape of the hazard function and the impact of covariates is pervasive; (ii) at the product-outlet-type level, the baseline hazard function of a price spell is non-decreasing; (iii) there is strong evidence of state dependence, especially for price increases.
    Keywords: sticky prices; heterogeneity; hazard function; duration models.
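
    As a concrete illustration of the competing-risks setup (a sketch, not the authors' code), duration-specific hazards for the three exit types can be estimated nonparametrically as the number of spells ending at duration j for a given reason divided by the number of spells still at risk at j; the tiny dataset below is invented.

    ```python
    # Sketch: nonparametric discrete-time competing-risks hazards for price spells.
    # Each record is (duration_in_months, exit_type); exit_type is one of
    # "increase", "decrease", "replacement", or "censored". Data are made up.
    from collections import Counter

    spells = [(1, "increase"), (1, "decrease"), (2, "increase"), (2, "censored"),
              (3, "replacement"), (3, "increase"), (4, "decrease"), (5, "censored")]

    exit_types = ("increase", "decrease", "replacement")
    max_d = max(d for d, _ in spells)

    for j in range(1, max_d + 1):
        at_risk = sum(1 for d, _ in spells if d >= j)     # spells surviving to j
        ends = Counter(e for d, e in spells if d == j)    # exits at duration j
        hazards = {e: ends.get(e, 0) / at_risk for e in exit_types}
        print(j, {e: round(h, 2) for e, h in hazards.items()})
    ```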

    Optimized diffusion gradient orientation schemes for corrupted clinical DTI data sets

    Object: A method is proposed for generating schemes of diffusion gradient orientations that allow the diffusion tensor to be reconstructed from partial data sets in clinical DT-MRI, should the acquisition be corrupted or terminated before completion because of patient motion. Materials and methods: A general energy-minimization electrostatic model was developed in which the interactions between orientations are weighted according to their temporal order during acquisition. In this report, two corruption scenarios were specifically considered for generating relatively uniform schemes of 18 and 60 orientations, with useful subsets of 6 and 15 orientations. The sets and subsets were compared to conventional sets through their energy, condition number and rotational invariance. Schemes of 18 orientations were tested on a volunteer. Results: The optimized sets were similar to uniform sets in terms of energy, condition number and rotational invariance, whether the complete set or only a subset was considered. Diffusion maps obtained in vivo were close to those for uniform sets regardless of the acquisition time. This was not the case with conventional schemes, whose subset uniformity was insufficient. Conclusion: With the proposed approach, sets of orientations responding to several corruption scenarios can be generated, which is potentially useful for imaging uncooperative patients or infants.
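
    As a rough sketch of the quantities compared above (not the authors' implementation, and with a placeholder weighting function rather than the temporal weighting of the paper), the electrostatic energy of an antipodally symmetric orientation set and the condition number of the corresponding diffusion-tensor design matrix can be evaluated as follows.

    ```python
    # Sketch: score a diffusion gradient scheme by (i) a weighted electrostatic
    # repulsion energy and (ii) the condition number of the tensor-estimation
    # design matrix. The weight function is a placeholder, not the paper's.
    import numpy as np

    def electrostatic_energy(dirs, weight=lambda i, j: 1.0):
        """dirs: (N, 3) unit vectors; antipodal pairs are treated as equivalent."""
        e = 0.0
        for i in range(len(dirs)):
            for j in range(i + 1, len(dirs)):
                d_plus = np.linalg.norm(dirs[i] - dirs[j])
                d_minus = np.linalg.norm(dirs[i] + dirs[j])
                e += weight(i, j) * (1.0 / d_plus + 1.0 / d_minus)
        return e

    def design_condition_number(dirs):
        """Condition number of the matrix mapping the 6 tensor elements to signals."""
        g = np.asarray(dirs)
        B = np.column_stack([g[:, 0]**2, g[:, 1]**2, g[:, 2]**2,
                             2 * g[:, 0] * g[:, 1], 2 * g[:, 0] * g[:, 2],
                             2 * g[:, 1] * g[:, 2]])
        return np.linalg.cond(B)

    # Example: 6 random orientations (illustrative only).
    rng = np.random.default_rng(0)
    dirs = rng.normal(size=(6, 3))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    print(electrostatic_energy(dirs), design_condition_number(dirs))
    ```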

    Development of a "PM" spiking system: first tests

    Participation in inter-laboratory comparison exercises is a key step for any ambient air quality monitoring network. For such campaigns to be worthwhile, participants need a wide range of concentrations, including the regulatory limit values for which uncertainty requirements apply. For PM10, the 1996 European directive sets a 24-hour limit value of 50 µg/m3 with a maximum relative uncertainty of 25%, and such a concentration level cannot in practice be guaranteed a priori. Within the framework of the French national air quality laboratory (LCSQA), INERIS, in collaboration with LNI Inc., is therefore developing a dedicated PM enrichment ("spiking") generator. The objective is to supply all participants with ambient air enriched with PM10 or PM2.5 particles. Preliminary results show that the prototype can supply four TEOM and TEOM-FDMS microbalances with air at concentrations ranging from background levels to more than 100 µg/m3. The experimental set-up and the first results are reported, with particular attention to the constraints and results regarding the representativeness of the matrix and the equivalence of the samples delivered to each participating analyser.
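
    As a back-of-the-envelope illustration of the 25% requirement near the 50 µg/m3 daily limit (a toy check only, not the regulatory equivalence procedure), one can compare co-located analyser readings against a reference value and flag whether their relative expanded spread stays within the bound; all numbers below are invented.

    ```python
    # Toy check (not the EU equivalence procedure): relative spread of co-located
    # PM10 readings around a reference value, compared with the 25% requirement
    # that applies at the 24-hour limit value of 50 µg/m3. Values are invented.
    import statistics

    max_relative_uncertainty = 0.25

    reference = 52.0                                  # µg/m3, reference measurement
    analysers = {"TEOM_1": 49.5, "TEOM_2": 54.0,      # co-located microbalances
                 "FDMS_1": 51.2, "FDMS_2": 47.8}

    spread = statistics.stdev(analysers.values())     # between-analyser spread
    relative_expanded = 2.0 * spread / reference      # coverage factor k = 2

    print(f"relative expanded spread: {relative_expanded:.1%}")
    print("within 25% requirement:", relative_expanded <= max_relative_uncertainty)
    ```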

    Analytic Metaphysics versus Naturalized Metaphysics: The Relevance of Applied Ontology

    The relevance of analytic metaphysics has come under criticism: Ladyman & Ross, for instance, have suggested discontinuing the field. French & McKenzie have argued in defense of analytic metaphysics that it develops tools that could turn out to be useful for philosophy of physics. In this article, we show first that this heuristic defense of metaphysics can be extended to the scientific field of applied ontology, which uses constructs from analytic metaphysics. Second, we elaborate on a parallel drawn by French & McKenzie between mathematics and metaphysics to show that the whole field of analytic metaphysics, being useful not only for philosophy but also for science, should continue to exist as a largely autonomous field.

    Rational invariants of even ternary forms under the orthogonal group

    In this article we determine a generating set of rational invariants of minimal cardinality for the action of the orthogonal group $\mathrm{O}_3$ on the space $\mathbb{R}[x,y,z]_{2d}$ of ternary forms of even degree $2d$. The construction relies on two key ingredients. On one hand, the Slice Lemma allows us to reduce the problem to determining the invariants for the action, on a subspace, of the finite subgroup $\mathrm{B}_3$ of signed permutations. On the other hand, our construction relies in a fundamental way on specific bases of harmonic polynomials. These bases provide maps with prescribed $\mathrm{B}_3$-equivariance properties. Our explicit construction of these bases should be relevant well beyond the scope of this paper. The expression of the $\mathrm{B}_3$-invariants can then be given in a compact form as the composition of two equivariant maps. Instead of providing (cumbersome) explicit expressions for the $\mathrm{O}_3$-invariants, we provide efficient algorithms for their evaluation and rewriting. We also use the constructed $\mathrm{B}_3$-invariants to determine the $\mathrm{O}_3$-orbit locus and provide an algorithm for the inverse problem of finding an element in $\mathbb{R}[x,y,z]_{2d}$ with prescribed values for its invariants. These are the computational issues relevant in brain imaging.
    Comment: v3 changes: reworked presentation of the neuroimaging application, refinement of Definition 3.1. To appear in "Foundations of Computational Mathematics".
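
    As a toy illustration of rational invariants under $\mathrm{O}_3$ (not the paper's construction, and only for the simplest case $2d = 2$): a ternary quadratic form corresponds to a symmetric 3x3 matrix, and the coefficients of its characteristic polynomial are rational invariants of the orthogonal action. The sketch below checks their invariance numerically under a random rotation.

    ```python
    # Toy check for degree 2d = 2 (not the paper's algorithm): a ternary quadratic
    # form q(x, y, z) = [x y z] A [x y z]^T with A symmetric transforms under
    # g in O(3) by A -> g^T A g, so the trace, the sum of principal 2x2 minors
    # and the determinant (characteristic-polynomial coefficients) are invariants.
    import numpy as np

    def char_poly_invariants(A):
        tr = np.trace(A)
        minors = sum(np.linalg.det(np.delete(np.delete(A, k, 0), k, 1)) for k in range(3))
        return tr, minors, np.linalg.det(A)

    rng = np.random.default_rng(1)
    A = rng.normal(size=(3, 3))
    A = (A + A.T) / 2                       # a random ternary quadratic form

    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # random orthogonal matrix

    print(np.round(char_poly_invariants(A), 6))
    print(np.round(char_poly_invariants(Q.T @ A @ Q), 6))   # same values
    ```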

    Price Setting in the Euro Area: Some Stylized Facts from Individual Consumer Price Data.

    This paper documents patterns of price setting at the retail level in the euro area. A set of stylized facts on the frequency and size of price changes is presented along with an econometric investigation of their main determinants. Price adjustment in the euro area can be summarized in six stylized facts. First, prices of most products change rarely. The average monthly frequency of price adjustment is 15 p.c., compared to about 25 p.c. in the US. Second, the frequency of price changes is characterized by substantial cross-product heterogeneity and pronounced sectoral patterns: prices of (oil-related) energy and unprocessed food products change very often, while price adjustments are less frequent for processed food products, non-energy industrial goods and services. Third, cross-country heterogeneity exists but is less pronounced. Fourth, price decreases are not uncommon. Fifth, price increases and decreases are sizeable compared to aggregate and sectoral inflation rates. Sixth, price changes are not highly synchronized across price-setters. Moreover, the frequency of price changes in the euro area is related to a number of factors, in particular seasonality, outlet type, indirect taxation, use of attractive prices as well as aggregate or product-specific inflation.
    Keywords: price-setting; consumer price; frequency of price change.
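
    To make the headline frequency concrete (a sketch using the abstract's rounded numbers, not the paper's dataset), a monthly adjustment frequency can be translated into an implied average price duration under a constant-hazard, Calvo-style assumption.

    ```python
    # Sketch: implied average price-spell duration from a monthly frequency of
    # price adjustment, under a constant-hazard (Calvo-style) assumption.
    # 15% per month (euro area) vs. about 25% (US), as quoted in the abstract.
    import math

    def implied_duration_months(freq):
        naive = 1.0 / freq                        # discrete-time rule of thumb
        continuous = -1.0 / math.log(1.0 - freq)  # continuous-time correction
        return naive, continuous

    for region, freq in [("euro area", 0.15), ("US", 0.25)]:
        naive, cont = implied_duration_months(freq)
        print(f"{region}: ~{naive:.1f} months (1/f), ~{cont:.1f} months (-1/ln(1-f))")
    ```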

    Estimation of Fiber Orientations Using Neighborhood Information

    Data from diffusion magnetic resonance imaging (dMRI) can be used to reconstruct fiber tracts, for example, in muscle and white matter. Estimation of fiber orientations (FOs) is a crucial step in the reconstruction process, and these estimates can be corrupted by noise. In this paper, a new method called Fiber Orientation Reconstruction using Neighborhood Information (FORNI) is described and shown to reduce the effects of noise and improve FO estimation performance by incorporating spatial consistency. FORNI uses a fixed tensor basis to model the diffusion-weighted signals, which has the advantage of providing an explicit relationship between the basis vectors and the FOs. FO spatial coherence is encouraged using weighted ℓ1-norm regularization terms, which capture the interaction of directional information between neighboring voxels. Data fidelity is encouraged using a squared error between the observed and reconstructed diffusion-weighted signals. After appropriate weighting of these competing objectives, the resulting objective function is minimized using a block coordinate descent algorithm, and a straightforward parallelization strategy is used to speed up processing. Experiments were performed on a digital crossing phantom, ex vivo tongue dMRI data, and in vivo brain dMRI data for both qualitative and quantitative evaluation. The results demonstrate that FORNI improves the quality of FO estimation over other state-of-the-art algorithms.
    Comment: Journal paper accepted in Medical Image Analysis. 35 pages and 16 figures.
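
    To illustrate the overall structure of this kind of objective (a simplified sketch, not FORNI itself: the paper's weighted ℓ1 spatial term is swapped here for a quadratic neighbor-agreement penalty so that each block update reduces to a non-negative least-squares problem), a block coordinate descent over voxels could look as follows.

    ```python
    # Simplified sketch of a FORNI-style update (NOT the paper's algorithm):
    # each voxel's mixture fractions f over a fixed tensor basis are updated by
    # minimizing ||y - S f||^2 + lam * sum_k ||f - f_k||^2 over f >= 0, where
    # f_k are the current neighbor estimates. The paper uses a weighted l1
    # spatial term instead; the quadratic swap keeps each block update an NNLS.
    import numpy as np
    from scipy.optimize import nnls

    def update_voxel(S, y, neighbor_fs, lam):
        """One block update: augmented non-negative least squares."""
        n_basis = S.shape[1]
        rows, rhs = [S], [y]
        for f_k in neighbor_fs:
            rows.append(np.sqrt(lam) * np.eye(n_basis))
            rhs.append(np.sqrt(lam) * f_k)
        f, _ = nnls(np.vstack(rows), np.concatenate(rhs))
        return f

    # Tiny demo: 2 voxels on a line, random basis signals and observations.
    rng = np.random.default_rng(0)
    S = np.abs(rng.normal(size=(12, 5)))        # 12 gradients x 5 basis tensors
    Y = np.abs(rng.normal(size=(2, 12)))        # observed signals per voxel
    F = np.zeros((2, 5))                        # current mixture fractions
    for _ in range(10):                         # block coordinate descent sweeps
        for v in range(2):
            F[v] = update_voxel(S, Y[v], [F[1 - v]], lam=0.1)
    print(F.round(3))
    ```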

    Non-Parametric Approximations for Anisotropy Estimation in Two-dimensional Differentiable Gaussian Random Fields

    Spatially referenced data often have autocovariance functions with elliptical isolevel contours, a property known as geometric anisotropy. The anisotropy parameters include the tilt of the ellipse (orientation angle) with respect to a reference axis and the aspect ratio of the principal correlation lengths. Since these parameters are unknown a priori, sample estimates are needed to define suitable spatial models for the interpolation of incomplete data. The distribution of the anisotropy statistics is determined by a non-Gaussian joint sampling probability density. By means of analytical calculations, we derive an explicit expression for the joint probability density function of the anisotropy statistics for Gaussian, stationary and differentiable random fields. Based on this expression, we obtain an approximate joint density which we use to formulate a statistical test for isotropy. The approximate joint density is independent of the autocovariance function and provides conservative probability and confidence regions for the anisotropy parameters. We validate the theoretical analysis by means of simulations using synthetic data, and we illustrate the detection of anisotropy changes with a case study involving background radiation exposure data. The approximate joint density provides (i) a stand-alone approximate estimate of the anisotropy statistics distribution, (ii) informed initial values for maximum likelihood estimation, and (iii) a useful prior for Bayesian anisotropy inference.
    Comment: 39 pages; 8 figures.
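
    As a rough numerical companion (a moment-based sketch under simplifying assumptions, not the paper's derivation), the orientation angle and aspect ratio of a geometrically anisotropic field can be estimated from the second moments of its sampled gradients, since for such fields the gradient covariance eigenvectors align with the principal axes and its eigenvalues scale inversely with the squared correlation lengths.

    ```python
    # Sketch (not the paper's method): estimate geometric-anisotropy parameters
    # from the gradient second-moment matrix of a gridded 2D field. For a
    # geometrically anisotropic field, Q = E[grad X grad X^T] has eigenvectors
    # along the principal axes and eigenvalues ~ 1 / (correlation length)^2.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def anisotropy_from_field(z, dx=1.0, dy=1.0):
        gy, gx = np.gradient(z, dy, dx)                  # numerical gradients
        Q = np.array([[np.mean(gx * gx), np.mean(gx * gy)],
                      [np.mean(gx * gy), np.mean(gy * gy)]])
        eigvals, eigvecs = np.linalg.eigh(Q)             # ascending eigenvalues
        v_long = eigvecs[:, 0]                           # smallest eigenvalue ->
        theta = np.arctan2(v_long[1], v_long[0])         # longest correlation axis
        aspect_ratio = np.sqrt(eigvals[1] / eigvals[0])  # long / short length ratio
        return np.degrees(theta) % 180.0, aspect_ratio

    # Demo on an anisotropic pseudo-random field: noise smoothed more along x than y.
    rng = np.random.default_rng(2)
    z = gaussian_filter(rng.normal(size=(256, 256)), sigma=(2, 8))   # (y, x) scales
    print(anisotropy_from_field(z))   # angle near 0 degrees, aspect ratio roughly 4
    ```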