4,590 research outputs found

    The underlying social dynamics of paradigm shifts

    We develop a multi-agent model of the creation of knowledge (scientific progress or technological evolution) within a community of researchers devoted to such endeavors. In the proposed model, agents learn in a physical-technological landscape, and weight is attached to both individual search and social influence. We find that the combination of these two forces, together with random experimentation, can account for both (i) marginal change, that is, periods of normal science or refinements in the performance of a given technology, in which the community stays in the neighborhood of the current paradigm; and (ii) radical change, which takes the form of scientific paradigm shifts (or discontinuities in the performance of a technology), observed as a swift migration of the knowledge community towards the new and superior paradigm. The efficiency of the search process depends heavily on the weight that agents place on social influence. A paradigm shift becomes more likely when each member of the community attaches a small but positive weight to the experience of his or her peers. In this parameter region, a conservative force is nevertheless exerted by the representatives of the current paradigm. However, social influence is not strong enough to seriously hamper individual discovery, and can act to empower successful individual pioneers who have conquered the new and superior paradigm.
    Fil: Rodriguez Sickert, Carlos. Universidad del Desarrollo; Chile. Fil: Cosmelli, Diego. Pontificia Universidad Católica de Chile; Chile. Fil: Claro, Francisco. Pontificia Universidad Católica de Chile; Chile. Fil: Fuentes, Miguel Angel. Consejo Nacional de Investigaciones Científicas y Técnicas; Argentina. Universidad San Sebastián; Chile.
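    The dynamics described above can be illustrated with a minimal sketch: agents blend their own position with that of the best-performing peer and experiment at random. The one-dimensional landscape, the parameter values, and the seeded "pioneer" below are illustrative assumptions, not taken from the paper.

```python
import random

def step(positions, fitness, social_weight, noise=0.5, rng=random):
    """One update of a one-dimensional knowledge-search community.

    Each agent moves toward a blend of its own position and the position
    of the best-performing peer, then experiments at random.
    """
    best_peer = max(positions, key=fitness)  # social attractor
    return [(1 - social_weight) * x + social_weight * best_peer
            + rng.gauss(0.0, noise) for x in positions]

# Two-peaked landscape (hypothetical): the current paradigm near x = 0
# and a superior paradigm near x = 10.
def landscape(x):
    return max(1.0 - abs(x), 2.0 - abs(x - 10.0))

random.seed(0)
pop = [random.gauss(0.0, 1.0) for _ in range(49)] + [10.0]  # one pioneer
for _ in range(200):
    pop = step(pop, landscape, social_weight=0.2)
mean = sum(pop) / len(pop)
# With a small but positive social weight, the pioneer's success propagates
# and the community migrates to the superior paradigm (mean near 10).
```

    With `social_weight=0`, by contrast, the community performs independent random walks and no collective migration occurs, which is the intuition behind the paper's finding that a small positive weight makes paradigm shifts more likely.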

    Monetary policy and inflation persistence in the Eurozone

    The primary goal of the European Central Bank's (ECB) monetary policy is to achieve price stability. Whereas during the 1980s and 1990s there was a rapid and strong convergence of price differentials among the euro countries, particularly in those with higher inflation rates in the past, the single monetary policy has proved quite inefficient in continuing this trend and has not achieved further reductions in inflation rate differentials within the euro zone. Since the ECB sets the official interest rate according to the average inflation of the euro area, the persistence of such price differentials within the area would mean that the "one size interest rate policy" would not fit all. This paper studies empirically the inflation rate differentials and their persistence in some currency unions with the aim of drawing conclusions for the working of ECB monetary policy.
    KEYWORDS: monetary policy; inflation persistence; currency unions
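    Inflation persistence of the kind studied here is commonly summarized by the first-order autocorrelation of the inflation differential. The sketch below computes that AR(1) coefficient by ordinary least squares; the estimator is standard, but the series and the paper's exact specification are assumptions, not taken from the source.

```python
def ar1_persistence(series):
    """OLS slope of x_t on x_{t-1}: a common scalar measure of persistence."""
    x, y = series[:-1], series[1:]
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

# Hypothetical inflation differential (country minus euro-area average,
# percentage points, annual observations).
diff = [1.2, 1.1, 1.05, 0.98, 0.95, 0.9, 0.88, 0.85, 0.84, 0.82]
rho = ar1_persistence(diff)
# A rho close to 1 signals a highly persistent differential, i.e. the
# single "one size" policy rate keeps mis-fitting that country.
```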

    Model Space Priors for Objective Sparse Bayesian Regression

    This paper investigates the construction of model space priors from an alternative point of view to the usual indicators for inclusion of covariates in a given model. Assumptions about indicator variables often lead to Beta-Binomial priors on the model space, which do not appropriately penalize model complexity when the parameters are fixed. This can be alleviated by making the parameters of the prior depend on the number of covariates, though justification for this is lacking from a first-principles point of view. We propose viewing the model space as a partially ordered set. When the number of covariates increases, an isometry argument leads to the Poisson distribution as the unique, natural limiting prior over model dimension. This limiting prior is derived using two constructions that view an individual model as a "local" null hypothesis and compare its prior probability to the probability of the alternatives that nest it. We show that this prior induces a posterior that concentrates on a finite true model asymptotically. Additionally, we provide a deterministic algorithm that takes advantage of the nature of the prior and explores good models in polynomial time.
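    The complexity-penalty point can be made concrete by comparing the prior mass each approach puts on model dimension. The sketch below contrasts the Beta-Binomial dimension prior with a truncated Poisson prior; the formulas are the standard ones, while the parameter choices (a = b = 1, lambda = 1, p = 50) are illustrative assumptions.

```python
from math import comb, exp, factorial, gamma

def beta_binomial_dim_prior(k, p, a=1.0, b=1.0):
    """Prior mass on model dimension k (out of p covariates) under a
    Beta(a, b)-Binomial prior on the inclusion indicators,
    marginalised over all models of size k."""
    return comb(p, k) * gamma(a + k) * gamma(b + p - k) * gamma(a + b) / (
        gamma(a) * gamma(b) * gamma(a + b + p))

def poisson_dim_prior(k, lam, p):
    """Poisson(lam) prior on model dimension, truncated to 0..p."""
    w = [exp(-lam) * lam ** j / factorial(j) for j in range(p + 1)]
    return w[k] / sum(w)

p = 50
# With fixed a = b = 1, the Beta-Binomial gives every dimension the same
# mass 1/(p+1): no penalty for complexity.  The Poisson prior instead
# concentrates near lam, penalising large models.
flat = beta_binomial_dim_prior(25, p)          # = 1/51, same as k = 0
peaked = poisson_dim_prior(1, 1.0, p)          # >> poisson_dim_prior(10, ...)
```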

    Study of the lateral pass width for conventional and ultrasonic vibrations-assisted ball burnishing on Ti-6Al-4V specimens

    Ball burnishing is a finishing process based on plastic deformation of the target surface by means of a hard ball gliding over it. Along with its ease of application, possible on the same machine where machining was performed [1], burnishing is a comprehensive process able to improve surface roughness and to introduce compressive residual stresses down to deep layers of the material [2]. Burnishing results have proved to depend on a proper selection of parameters, which must be correctly controlled during the process; that is the case for the burnishing force and the number of passes [3]. Among these parameters, the lateral pass width has proved to be influential on the surface roughness results, due to the behavior most materials show when plastically deformed: the applied force makes the material flow to the borders of the burnishing imprint, giving way to a pile-up effect. This paper deals with indentation experiments on Ti-6Al-4V to deepen the understanding of the burnishing process for this material. Single burnishing imprints are geometrically characterized, combining different levels of force and numbers of passes, and comparing the conventional process with its vibration-assisted counterpart. An optimal lateral pass width is thus determined, and technological recommendations are made for future applications of the process.

    Comparison of thermal performance of 3D printer liquefiers through finite element models

    Open source 3D printers have experienced an intense expansion during recent years, mainly because of their accessibility and the vast availability of information thanks to user communities. This presents researchers with a perfect context for hardware innovation, by improving the overall printing process, also in terms of durability of the printing machine. A 3D printer liquefier must transmit heat to the thermoplastic material in order to extrude it, reaching temperatures above 200 °C at the tip of the nozzle for some materials like ABS. The design of the heating process must keep the balance between proper heating of the material and controlling the temperature along the extruding body, so that the printer itself is not damaged by overtemperature. On the other hand, the design must guarantee that the melting front is located at an intermediate point between the nozzle tip and the entrance of the raw material, to minimize pressure drops in the system and so decrease the energy demanded from the feeding motors. An alternative design of the heating system, Twist3D, is proposed in this paper.
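    The melting-front requirement can be pictured with a much simpler stand-in for the paper's finite element models: a one-dimensional steady-state conduction solve along the melt channel, locating the first node that reaches the melt temperature. The geometry, temperatures, and melt threshold below are illustrative assumptions, not values from the paper.

```python
def steady_temperature(n, t_hot, t_cold, iters=20000):
    """Gauss-Seidel solve of the 1-D steady heat equation d2T/dx2 = 0
    on a uniform grid with fixed-temperature ends."""
    t = [t_cold] * n
    t[-1] = t_hot
    for _ in range(iters):
        for i in range(1, n - 1):
            t[i] = 0.5 * (t[i - 1] + t[i + 1])
    return t

def melt_front_index(temps, t_melt):
    """First node (counting from the cold end) at or above t_melt."""
    for i, temp in enumerate(temps):
        if temp >= t_melt:
            return i
    return None

# Illustrative: 30 mm channel discretised into 31 nodes, filament inlet
# at 50 C, nozzle tip at 210 C, melt threshold 170 C.
profile = steady_temperature(n=31, t_hot=210.0, t_cold=50.0)
front = melt_front_index(profile, t_melt=170.0)
# A well-designed liquefier keeps `front` strictly between the inlet and
# the nozzle tip, as the abstract requires.
```

    A real model would of course add convection to the surroundings, temperature-dependent conductivity, and the latent heat of melting, which is what the finite element comparison in the paper addresses.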

    Financial deregulation, banking competition and regional development: the Spanish experience

    The purpose of this paper is to consider the implications of European monetary integration for peripheral economies within Europe, with a particular focus on the experience of Spain. The view represented in the European Commission's own research is summarised as follows: As regards the regional distribution of the impact [of EMU], which is relevant to th

    Experimental analysis of manufacturing parameters’ effect on the flexural properties of wood-PLA composite parts built through FFF

    This paper aims to determine the flexural stiffness and strength of a composite made of polylactic acid reinforced with wood particles, commercially named Timberfill, manufactured through fused filament fabrication (FFF). The influence of four factors (layer height, nozzle diameter, fill density, and printing velocity) is studied through an L27 Taguchi orthogonal array. The response variables used as output results for an analysis of variance are obtained from a set of four-point bending tests. Results show that the layer height is the most influential parameter on flexural strength, followed by nozzle diameter and infill density, whereas the printing velocity has no significant influence. Ultimately, an optimal parameter set that maximizes the material's flexural strength is found by combining a 0.2-mm layer height, 0.7-mm nozzle diameter, 75% fill density, and 35-mm/s velocity. The highest flexural resistance achieved experimentally is 47.26 MPa. The statistical results are supported with microscopic photographs of fracture sections, and validated by comparison with previous studies performed on non-reinforced PLA material, proving that the introduction of wood fibers in the PLA matrix reduces the resistance of raw PLA by hindering the cohesion between filaments and generating voids inside it. Lastly, five solid Timberfill specimens manufactured by injection molding were also tested to compare their strength with the additively manufactured samples. Results prove that treating the wood-PLA through additive manufacturing results in an improvement of its resistance and elastic properties, with a Young's modulus almost 25% lower than that of the injected material.
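    The core of a Taguchi main-effects analysis like the one above is simply the mean response at each factor level. The sketch below shows that computation for a single factor; the strength values and run layout are made-up stand-ins, not the paper's L27 data.

```python
def level_means(levels, responses):
    """Mean response at each factor level: the basic Taguchi
    main-effect computation behind an orthogonal-array analysis."""
    sums, counts = {}, {}
    for lv, y in zip(levels, responses):
        sums[lv] = sums.get(lv, 0.0) + y
        counts[lv] = counts.get(lv, 0) + 1
    return {lv: sums[lv] / counts[lv] for lv in sums}

# Hypothetical flexural strengths (MPa) for nine runs, with layer height
# at three levels (0.2, 0.3, 0.4 mm), three runs per level.
layer_height = [0.2, 0.2, 0.2, 0.3, 0.3, 0.3, 0.4, 0.4, 0.4]
strength = [47.1, 46.8, 46.5, 43.0, 42.6, 42.9, 39.5, 40.1, 39.8]
effects = level_means(layer_height, strength)
best_level = max(effects, key=effects.get)  # level maximising strength
```

    In a full L27 analysis the same level means are computed for all four factors at once (the orthogonality of the array keeps them separable), and an ANOVA on these effects ranks the factors by significance.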

    Digital compensation of the side-band-rejection ratio in a fully analog 2SB sub-millimeter receiver

    In observational radio astronomy, sideband-separating receivers are preferred, particularly under high atmospheric noise, which is usually the case in the sub-millimeter range. However, obtaining a good rejection ratio between the two sidebands is difficult since, unavoidably, imbalances appear in the different analog components. We describe a method to correct these imbalances without making any change to the analog part of the sideband-separating receiver, specifically keeping the intermediate-frequency hybrid in place. This opens the possibility of implementing the method in any existing receiver. We have built hardware to demonstrate the validity of the method and tested it on a fully analog receiver operating between 600 and 720 GHz. We have tested the stability of calibration and performance versus time and after full resets of the receiver. We have performed an error analysis to compare the digital compensation in two configurations of analog receivers, with and without the intermediate frequency (IF) hybrid. An average compensated sideband rejection ratio of 46 dB is obtained. Degradation of the compensated sideband rejection ratio over time and after several resets of the receiver is minimal. A receiver with an IF hybrid is more robust to systematic errors. Moreover, we have shown that the intrinsic random errors in calibration have the same impact for the configuration without an IF hybrid and for a configuration with an IF hybrid whose analog rejection ratio is better than 10 dB. Compensated rejection ratios above 40 dB are obtained even in the presence of high analog rejection. The method is robust, allowing its use under normal operational conditions at any telescope. We also demonstrate that a fully analog receiver is more robust against systematic errors. Finally, the error bars associated with the compensated rejection ratio are almost independent of whether the IF hybrid is present or not.
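    The scale of the problem is set by the textbook relation between branch imbalance and sideband rejection in a quadrature receiver; the sketch below evaluates it, with the example imbalance values chosen for illustration (this is the generic relation, not the paper's calibration procedure).

```python
from math import cos, log10, radians

def rejection_ratio_db(gain_ratio, phase_err_deg):
    """Sideband (image) rejection ratio of a quadrature receiver given the
    amplitude ratio and phase error between its two branches."""
    g, phi = gain_ratio, radians(phase_err_deg)
    r = (1 + 2 * g * cos(phi) + g * g) / (1 - 2 * g * cos(phi) + g * g)
    return 10 * log10(r)

# Perfect balance gives unbounded rejection, but realistic analog
# imbalances (e.g. 5% amplitude, 5 degrees phase, both hypothetical)
# leave only a few tens of dB, which is why a digital complex
# correction is applied on top of the analog separation.
analog_only = rejection_ratio_db(1.05, 5.0)   # roughly 26 dB
tighter = rejection_ratio_db(1.02, 2.0)       # smaller errors, higher SRR
```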