
    A Dynamic Structural Model for Stock Return Volatility and Trading Volume

    This paper seeks to develop a structural model that lets data on asset returns and trading volume speak to whether volatility autocorrelation comes from the fundamental that the trading process is pricing, or is caused by the trading process itself. Returns and volume data argue, in the context of our model, that persistent volatility is caused by traders experimenting with different beliefs based upon past profit experience and their estimates of future profit experience. A major theme of our paper is to introduce adaptive agents in the spirit of Sargent (1993) but have them adapt their strategies on a time scale that is slower than the time scale on which the trading process takes place. This leads to positive autocorrelation in volatility and volume on the time scale of the trading process which generates the returns and volume data. Positive autocorrelation of volatility and volume is caused by persistence of strategy patterns that are associated with high volatility and high volume. The model reproduces the following features seen in the data: (i) the autocorrelation function of a measure of volatility, such as squared returns or absolute returns, is positive with a slowly decaying tail; (ii) the autocorrelation function of a measure of trading activity, such as volume or turnover, is positive with a slowly decaying tail; (iii) the cross-correlation function of squared returns with past and future volumes is about zero, and is positive for squared returns with current volumes; (iv) abrupt changes in prices and returns occur which are hard to attach to 'news.' The last feature is obtained by a version of the model where the Law of Large Numbers fails in the large-economy limit.
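
    The core mechanism is easy to illustrate. Below is a minimal toy sketch, not the paper's model: agents hold one of two strategies and re-evaluate only rarely (the slow time scale), while returns and volume are generated every tick (the fast time scale). All parameter values and names are illustrative assumptions; persistence of the strategy mix produces the slowly decaying autocorrelations in squared returns and volume described in features (i) and (ii).

```python
import numpy as np

rng = np.random.default_rng(0)

n_agents = 100
n_ticks = 20_000
switch_prob = 0.001                # slow adaptation: strategies change rarely
sigma = np.array([0.005, 0.03])    # per-tick return std of each strategy (assumed)

state = rng.integers(0, 2, n_agents)   # 0 = quiet strategy, 1 = active strategy
returns = np.empty(n_ticks)
volume = np.empty(n_ticks)

for t in range(n_ticks):
    # Fast time scale: aggregate return volatility reflects the current strategy mix.
    active = state.sum()
    returns[t] = rng.normal(0.0, sigma[0] + (sigma[1] - sigma[0]) * active / n_agents)
    volume[t] = active
    # Slow time scale: a few agents re-evaluate and switch strategies.
    switchers = rng.random(n_agents) < switch_prob
    state[switchers] ^= 1

def autocorr(x, lag):
    """Sample autocorrelation at a given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

# Slowly decaying positive autocorrelation of squared returns and of volume:
print([round(autocorr(returns**2, k), 3) for k in (1, 10, 100)])
print([round(autocorr(volume, k), 3) for k in (1, 10, 100)])
```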

    Quick X-ray Reflectivity using Monochromatic Synchrotron Radiation for Time-Resolved Applications

    We describe and demonstrate a new technique for parallel collection of x-ray reflectivity (XRR) data, compatible with monochromatic synchrotron radiation and flat substrates, and apply it to the in-situ observation of thin-film growth. The method employs a polycapillary x-ray optic to produce a converging fan of radiation incident onto a sample surface, and an area detector to simultaneously collect the XRR signal over an angular range matching that of the incident fan. Factors determining the range and instrumental resolution of the technique in reciprocal space, in addition to the signal-to-background ratio, are described in detail. Our particular implementation records ~5° in 2θ and resolves Kiessig fringes from samples with layer thicknesses ranging from 3 to 76 nm. Finally, we illustrate the value of this approach by showing in-situ XRR data obtained with 100 ms time resolution during the growth of epitaxial La0.7Sr0.3MnO3 on SrTiO3 by Pulsed Laser Deposition (PLD) at the Cornell High Energy Synchrotron Source (CHESS). Compared to prior methods for parallel XRR data collection, ours is the first that is both sample-independent and compatible with the highly collimated, monochromatic radiation typical of third-generation synchrotron sources. Further, our technique can be readily adapted for use with laboratory-based sources.
    Comment: Accepted in Journal of Synchrotron Radiation
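
    To make the quoted 3-76 nm range concrete, here is the textbook Kiessig-fringe estimate of film thickness from XRR data: adjacent fringes are spaced by Δq ≈ 2π/d in momentum transfer q = (4π/λ)sin θ. This is a generic sketch, not the paper's analysis; the wavelength and fringe angles below are invented for illustration.

```python
import numpy as np

wavelength = 1.24e-10              # assumed x-ray wavelength in meters (~10 keV)

def q_from_theta(theta_deg):
    """Specular momentum transfer q = (4*pi/lambda) * sin(theta)."""
    return 4 * np.pi / wavelength * np.sin(np.radians(theta_deg))

# Suppose adjacent Kiessig fringe minima are observed at these incidence angles (deg):
theta_minima = np.array([0.60, 0.71, 0.82, 0.93])

dq = np.diff(q_from_theta(theta_minima)).mean()   # average fringe period in q
thickness = 2 * np.pi / dq                        # d ~ 2*pi / delta_q
print(f"estimated film thickness: {thickness * 1e9:.1f} nm")   # ~32 nm here
```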

    Measurements of Surface Diffusivity and Coarsening During Pulsed Laser Deposition

    Pulsed Laser Deposition (PLD) of homoepitaxial SrTiO3 was studied with in-situ x-ray specular reflectivity and surface diffuse x-ray scattering. Unlike prior reflectivity-based studies, these measurements access both the time- and the length-scales of the evolution of the surface morphology during growth. In particular, we show that this technique allows direct measurements of the diffusivity for both inter- and intra-layer transport. Our results explicitly limit the possible role of island break-up, demonstrate the key roles played by nucleation and coarsening in PLD, and place an upper bound on the Ehrlich-Schwoebel (ES) barrier for downhill diffusion.
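
    As a back-of-envelope illustration of how a diffusivity follows from such measurements, one can use the generic diffusive-smoothing relation rate = D·q² for a surface mode of wavevector q = 2π/L. This is a sketch of the kind of inference involved, under assumed numbers, not the paper's actual analysis.

```python
import numpy as np

L = 50e-9      # assumed characteristic feature spacing (m)
tau = 0.5      # assumed measured relaxation time of that mode (s)

q = 2 * np.pi / L          # wavevector of the surface mode
rate = 1.0 / tau           # measured relaxation rate (1/s)
D = rate / q**2            # diffusive smoothing: rate = D * q**2
print(f"effective surface diffusivity: {D:.2e} m^2/s")
```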

    Model Uncertainty and Policy Evaluation: Some Theory and Empirics

    This paper explores ways to integrate model uncertainty into policy evaluation. We first describe a general framework for the incorporation of model uncertainty into standard econometric calculations. This framework employs Bayesian model averaging methods that have begun to appear in a range of economic studies. Second, we illustrate these general ideas in the context of assessment of simple monetary policy rules for some standard New Keynesian specifications. The specifications vary in their treatment of expectations as well as in the dynamics of output and inflation. We conclude that the Taylor rule has good robustness properties, but that its overall stabilization performance may reasonably be challenged by alternative simple rules that also condition on lagged interest rates, even though these rules employ parameters set without accounting for model uncertainty.
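
    A minimal sketch of the Bayesian model averaging logic, assuming equal prior model probabilities: posterior model weights are combined with each model's estimate of a policy rule's loss, so rules are ranked by model-averaged performance. The marginal likelihoods and losses below are invented for illustration and are not values from the paper.

```python
import numpy as np

# Hypothetical log marginal likelihoods of three New Keynesian variants:
log_ml = np.array([-102.3, -101.1, -104.8])
weights = np.exp(log_ml - log_ml.max())   # posterior odds under equal priors
weights /= weights.sum()

# Hypothetical stabilization losses of two rules under each model
# (rows: models; columns: Taylor rule, rule also conditioning on lagged rates):
losses = np.array([
    [1.00, 0.95],
    [1.10, 1.20],
    [0.90, 0.85],
])

avg_loss = weights @ losses               # model-averaged loss of each rule
print(dict(zip(["Taylor", "lagged-rate"], avg_loss.round(3))))
```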

    Policy Evaluation in Uncertain Economic Environments

    This paper develops a decision-theoretic approach to policy analysis. We argue that policy evaluation should be conducted on the basis of two factors: the policymaker's preferences, and the conditional distribution of the outcomes of interest given a policy and available information. From this perspective, the common practice of conditioning on a particular model is often inappropriate, since model uncertainty is an important element of policy evaluation. We advocate the use of model averaging to account for model uncertainty and show how it may be applied to policy evaluation exercises. We illustrate our approach with applications to monetary policy and to growth policy.
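
    The decision-theoretic recipe can be illustrated with a toy computation: each candidate policy is scored by the expected loss of outcomes drawn from the model-averaged (mixture) predictive distribution, and the policymaker picks the minimizer. The models, weights, policy effects, and quadratic loss below are all assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

model_weights = np.array([0.5, 0.3, 0.2])   # hypothetical posterior model weights

def draw_outcomes(policy, n=100_000):
    """Sample outcomes from the mixture over models (all values hypothetical)."""
    means = np.array([0.02, 0.03, 0.05]) - 0.01 * policy   # per-model policy effect
    which = rng.choice(len(model_weights), size=n, p=model_weights)
    return rng.normal(means[which], 0.01)

def expected_loss(policy, target=0.02):
    """Quadratic loss around a target, averaged over the predictive distribution."""
    outcomes = draw_outcomes(policy)
    return np.mean((outcomes - target) ** 2)

for policy in (0.0, 1.0, 2.0):
    print(policy, round(expected_loss(policy), 6))
```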

    Don't bleach chaotic data

    A common first step in time series signal analysis involves digitally filtering the data to remove linear correlations. The residual data is spectrally white (it is "bleached"), but in principle retains the nonlinear structure of the original time series. It is well known that simple linear autocorrelation can give rise to spurious results in algorithms for estimating nonlinear invariants, such as fractal dimension and Lyapunov exponents. In theory, bleached data avoids these pitfalls. But in practice, bleaching obscures the underlying deterministic structure of a low-dimensional chaotic process. This appears to be a property of the chaos itself, since nonchaotic data are not similarly affected. The adverse effects of bleaching are demonstrated in a series of numerical experiments on known chaotic data. Some theoretical aspects are also discussed.
    Comment: 12 dense pages (82K) of ordinary LaTeX; uses macro psfig.tex for inclusion of figures in text; figures are uufile'd into a single file of size 306K; the final dvips'd postscript file is about 1.3 MB. Replaced 9/30/93 to incorporate final changes in the proofs and to make the LaTeX more portable; the paper will appear in CHAOS 4 (Dec. 1993).
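
    A minimal sketch of bleaching itself: fit a linear autoregressive model to a chaotic series and keep the residuals, which are approximately spectrally white. The logistic map stands in for "known chaotic data"; the AR order and series length are arbitrary choices for illustration, not the paper's setup.

```python
import numpy as np

n, p = 5000, 5
x = np.empty(n)
x[0] = 0.4
for t in range(n - 1):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])   # fully chaotic logistic map
x -= x.mean()

# Least-squares AR(p) fit: x[t] ~ sum_k a[k] * x[t - k - 1]
X = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
y = x[p:]
a, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ a      # the "bleached" series

# The residuals retain the map's nonlinear structure, but, as the paper argues,
# estimating invariants such as dimension from them can be badly misleading.
print("residual variance / original variance:",
      round(residuals.var() / x.var(), 4))
```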

    Radiocarbon dates from the Oxford AMS system: archaeometry datelist 35

    This is the 35th list of AMS radiocarbon determinations measured at the Oxford Radiocarbon Accelerator Unit (ORAU). Among the sites included here are the latest series of determinations from the key sites of Abydos, El Mirón, Ban Chiang, Grotte des Pigeons (Taforalt), Alepotrypa and Oberkassel, as well as others dating to the Palaeolithic, Mesolithic and later periods. Comments on the significance of the results are provided by the submitters of the material.

    Policy Evaluation in Uncertain Economic Environments

    This paper develops a general framework for economic policy evaluation. Using ideas from statistical decision theory, it argues that conventional approaches fail to appropriately integrate econometric analysis into evaluation problems. Further, it is argued that evaluation of alternative policies should explicitly account for uncertainty about the appropriate model of the economy. The paper shows how to develop an explicitly decision-theoretic approach to policy evaluation and how to incorporate model uncertainty into such an analysis. The theoretical implications of model uncertainty are explored in a set of examples, with a specific focus on how to design policies that are robust against such uncertainty. Finally, the framework is applied to the evaluation of monetary policy rules and to the analysis of tariff reductions as a way to increase aggregate economic growth.
    Keywords: macroeconomics, policy evaluation, uncertain economic environments

    Multiple Time Scales in Diffraction Measurements of Diffusive Surface Relaxation

    We grew SrTiO3 on SrTiO3 (001) by pulsed laser deposition, using x-ray scattering to monitor the growth in real time. The time-resolved small-angle scattering exhibits a well-defined length scale associated with the spacing between unit-cell-high surface features. This length scale imposes a discrete spectrum of Fourier components and rate constants upon the diffusion-equation solution, evident in the multiple-exponential relaxation of the "anti-Bragg" diffracted intensity. An Arrhenius analysis of the measured rate constants confirms that they originate from a single activation energy.
    Comment: 4 pages, 3 figures
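
    The Arrhenius analysis mentioned above follows a standard form: if the measured rate constants share a single activation energy, ln(k) is linear in 1/T. A minimal sketch, with temperatures and rates invented purely for illustration:

```python
import numpy as np

kB = 8.617e-5                                   # Boltzmann constant, eV/K

T = np.array([850.0, 900.0, 950.0, 1000.0])     # hypothetical growth temperatures (K)
k = np.array([0.12, 0.45, 1.40, 3.90])          # hypothetical rate constants (1/s)

# Linear fit of ln(k) vs 1/T: slope = -Ea / kB, intercept = ln(prefactor)
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * kB
print(f"activation energy: {Ea:.2f} eV, prefactor: {np.exp(intercept):.2e} 1/s")
```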