
    Optical interferometers for tests of relativistic gravity in space

    A space-based astrometric interferometer with a large optical bandwidth is considered. POINTS (Precision Optical INTerferometry in Space) would measure the angular separation of two stars about 90 deg apart on the sky with a nominal measurement error of 5 microarcseconds (uas). For a pair of mag 10 stars, the observation would require about 10 minutes. It is estimated that the instrument would measure daily the separations of about 60 pairs of stars; a random sequence of such measurements, if suitably redundant, contains the closure information necessary to detect and correct time-dependent measurement biases to well below the nominal measurement accuracy. The 90 deg target separation permits absolute parallax measurements in all directions. A redundant observing schedule for 300 stars and 5 quasars would include extra observations of the quasars to compensate for their higher magnitude. If a nominal 30-day observation sequence were repeated 4 times per year for 10 years, the resulting stellar parameter uncertainties would be: 0.6 uas in position; 0.4 uas/yr in proper motion; and 0.4 uas in parallax. This set of well-observed stars and quasars would form a rigid frame, and the stars would serve as reference objects for measurements of all additional targets, as well as being targets of direct scientific interest. The instrument, the global data analysis, and the scientific objectives are considered, including a relativity test and the supporting technology.

    Geophysical interpretation of Venus gravity data

    An investigation of the subsurface mass distribution of Venus through the analysis of data from the Pioneer Venus Orbiter (PVO) is presented. The Doppler tracking data were used to map the gravitational potential, which was compared to the topographic data from the PVO radar (ORAD). In order to obtain an unbiased comparison, the topography obtained from PVO-ORAD was filtered to introduce the same distortions as those of our gravity models. The last major software package required to determine the spectral admittance Z(lambda) solves the forward problem: given the topography and its density, and assuming no compensation, find the resulting spacecraft acceleration along a given nominal trajectory. The filtered topography is then obtained by processing these accelerations in the same way (i.e., with the same geophysical inverter) as the Doppler-rate data that we use to estimate the gravity maps.
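The forward problem described above (topography plus an assumed density, with no compensation, mapped to spacecraft acceleration along a trajectory) can be sketched with a crude point-mass approximation. This is an illustrative assumption-laden sketch, not the PVO analysis software: the function name, the unit-cell discretization, and all parameter values are invented here.

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def forward_acceleration(topo_pos, topo_h, density, traj, cell_area=1.0):
    """Acceleration at each trajectory point due to uncompensated
    topography, treated as point masses at the cell positions.

    topo_pos : (N, 3) topographic cell positions [m]
    topo_h   : (N,)   topographic heights [m]
    density  : assumed crustal density [kg/m^3]
    traj     : (M, 3) spacecraft positions [m]
    """
    masses = density * topo_h * cell_area      # kg per topographic column
    acc = np.zeros((len(traj), 3))
    for i, p in enumerate(traj):
        r = topo_pos - p                        # vectors from craft to masses
        d = np.linalg.norm(r, axis=1)
        acc[i] = (G * masses / d**3) @ r        # sum of G m r / |r|^3
    return acc
```

In the scheme the abstract describes, model accelerations of this kind would then be passed through the same geophysical inverter as the Doppler-rate data, so that the filtered topography carries the same distortions as the gravity maps.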

    Prospects for Large Relativity Violations in Matter-Gravity Couplings

    Deviations from relativity are tightly constrained by numerous experiments. A class of unmeasured and potentially large violations is presented that can be tested in the laboratory only via weak gravity couplings. Specialized highly sensitive experiments could achieve measurements of the corresponding effects. A single constraint of 1 x 10^{-11} GeV is extracted on one combination of the 12 possible effects in ordinary matter. Estimates are provided for attainable sensitivities in existing and future experiments. Comment: 10 pages

    Neutrino Oscillations from Strings and Other Funny Things

    I will discuss three related unconventional ways to generate neutrino oscillations: (1) equivalence principle violation by the string dilaton field; (2) violation of Lorentz invariance; and (3) equivalence principle violation through a non-universal tensor neutrino-gravity coupling. These unorthodox neutrino oscillation mechanisms are shown to be viable at the level of our present experimental knowledge and demonstrate that neutrino oscillations can probe very profound questions.
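For context (standard two-flavor results, not taken from this abstract): mass-driven and gravitationally induced oscillations share the same probability formula but have oppositely energy-dependent phases, which is what makes mechanisms (1) and (3) experimentally distinguishable from ordinary mass mixing. Here theta, phi (the local gravitational potential), and Delta-gamma (the difference in effective gravitational couplings) are generic symbols, not values from the paper:

```latex
% Generic two-flavor oscillation probability:
P(\nu_a \to \nu_b) = \sin^2 2\theta \;\sin^2\!\left(\tfrac{1}{2}\,\Delta\, L\right)

% Mass-induced oscillations: phase decreases with energy,
\Delta_{\mathrm{mass}} = \frac{\Delta m^2}{2E}

% VEP-induced oscillations: phase increases with energy,
\Delta_{\mathrm{VEP}} = 2E\,|\phi|\,\Delta\gamma
```

The opposite E-dependence means that broadband neutrino data can in principle separate the two scenarios.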

    Weak Equivalence Principle Test on a Sounding Rocket

    SR-POEM, our principle-of-equivalence measurement on a sounding rocket, will compare the free-fall rates of two substances, yielding an uncertainty of 10^-16 in the estimate of \eta. During the past two years, the design concept has matured and we have been working on the required technology, including a laser gauge that is self-aligning and able to reach 0.1 pm per root hertz for periods up to 40 s. We describe the status and plans for this project. Comment: Presented at the Fifth Meeting on CPT and Lorentz Symmetry, Bloomington, Indiana, June 28-July 2, 2010

    Artifactual log-periodicity in finite size data: Relevance for earthquake aftershocks

    The recently proposed discrete scale invariance and its associated log-periodicity are an elaboration of the concept of scale invariance in which the system is scale invariant only under powers of specific values of the magnification factor. We report on the discovery of a novel mechanism for such log-periodicity relying solely on the manipulation of data. This "synthetic" scenario for log-periodicity relies on two steps: (1) the fact that approximately logarithmic sampling in time corresponds to uniform sampling in the logarithm of time; and (2) a low-pass-filtering step, as occurs in constructing cumulative functions, in maximum likelihood estimations, and in de-trending, reddens the noise and, in a finite sample, creates a maximum in the spectrum leading to a most probable frequency in the logarithm of time. We explore in detail this mechanism and present extensive numerical simulations. We use this insight to analyze the 27 best aftershock sequences studied by Kisslinger and Jones [1991] to search for traces of genuine log-periodic corrections to Omori's law, which states that the earthquake rate decays approximately as the inverse of the time since the last main shock. The observed log-periodicity is shown to almost entirely result from the "synthetic scenario" owing to the data analysis. From a statistical point of view, resolving the issue of the possible existence of log-periodicity in aftershocks will be very difficult, as Omori's law describes a point process with a uniform sampling in the logarithm of the time. By construction, strong log-periodic fluctuations are thus created by this logarithmic sampling. Comment: LaTeX, JGR preprint with AGU++ v16.b and AGUTeX 5.0, use packages graphicx, psfrag and latexsym, 41 eps figures, 26 pages. In press, J. Geophys. Res.
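The two-step synthetic mechanism (uniform sampling in the logarithm of time, followed by a reddening low-pass step) is straightforward to reproduce numerically. The sketch below is illustrative only, not the authors' code; the grid size, seed, and log-time range are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: uniform sampling in log-time (equivalent to approximately
# logarithmic sampling in time), with white noise at each sample.
n = 512
log_t = np.linspace(0.0, 4.0, n)          # grid in log10(t)
noise = rng.standard_normal(n)

# Step 2: a low-pass/reddening step, here a cumulative sum, as occurs
# when constructing cumulative functions or de-trending.
reddened = np.cumsum(noise)

# Spectrum taken in the log-time variable: the reddened finite sample
# piles its power at low frequencies, producing an apparent "most
# probable" log-frequency, i.e. artifactual log-periodicity.
spec = np.abs(np.fft.rfft(reddened - reddened.mean())) ** 2
freqs = np.fft.rfftfreq(n, d=log_t[1] - log_t[0])
peak_freq = freqs[np.argmax(spec[1:]) + 1]   # skip the DC bin
print(f"most probable log-frequency: {peak_freq:.3f} cycles per log10(t)")
```

Because any realization of reddened noise in a finite window shows such a low-frequency maximum, a spectral peak in log-time is by itself weak evidence for genuine discrete scale invariance.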

    Importance of direct and indirect triggered seismicity

    Using the simple ETAS branching model of seismicity, which assumes that each earthquake can trigger other earthquakes, we quantify the role played by the cascade of triggered seismicity in controlling the rate of aftershock decay as well as the overall level of seismicity in the presence of a constant external seismicity source. We show that, in this model, the fraction of earthquakes in the population that are aftershocks is equal to the fraction of aftershocks that are indirectly triggered and is given by the average number of triggered events per earthquake. Previous observations that a significant fraction of earthquakes are triggered earthquakes therefore imply that most aftershocks are indirectly triggered by the mainshock. Comment: LaTeX document of 17 pages + 2 postscript figures
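The bookkeeping behind that result can be checked with a toy branching simulation: with branching ratio n (the average number of directly triggered events per event), both the fraction of events that are aftershocks and the fraction of aftershocks that are indirectly triggered come out to n. This is a minimal sketch under that assumption, not the full ETAS model (no magnitudes, no Omori time kernel):

```python
import numpy as np

rng = np.random.default_rng(1)

n_branch = 0.8        # branching ratio: direct aftershocks per event (subcritical)
n_background = 20_000 # events from the constant external source

# First generation: direct aftershocks of the background events.
current = rng.poisson(n_branch, size=n_background).sum()
first_generation = current
total_triggered = current

# Subsequent generations: the cascade of indirectly triggered events.
while current > 0:
    current = rng.poisson(n_branch, size=current).sum()
    total_triggered += current

total_events = n_background + total_triggered
frac_aftershocks = total_triggered / total_events       # expected ~ n_branch
frac_indirect = 1 - first_generation / total_triggered  # expected ~ n_branch
print(f"fraction of events that are aftershocks:      {frac_aftershocks:.3f}")
print(f"fraction of aftershocks indirectly triggered: {frac_indirect:.3f}")
```

Both printed fractions cluster around the branching ratio (0.8 here), matching the identity stated in the abstract: a large observed aftershock fraction implies that most aftershocks are triggered indirectly, through the cascade.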

    Importance of small earthquakes for stress transfers and earthquake triggering

    We estimate the relative importance of small and large earthquakes for static stress changes and for earthquake triggering, assuming that earthquakes are triggered by static stress changes and that earthquakes are located on a fractal network of dimension D. This model predicts that both the number of events triggered by an earthquake of magnitude m and the stress change induced by this earthquake at the location of other earthquakes increase with m as ~10^(Dm/2). The stronger the spatial clustering, the larger the influence of small earthquakes on stress changes at the location of future events, and hence on earthquake triggering. If earthquake magnitudes follow the Gutenberg-Richter law with b > D/2, small earthquakes collectively dominate stress transfer and earthquake triggering, because their greater frequency overcomes their smaller individual triggering potential. Using a Southern-California catalog, we observe that the rate of seismicity triggered by an earthquake of magnitude m increases with m as 10^(alpha m), where alpha = 1.00 +/- 0.05. We also find that the magnitude distribution of triggered earthquakes is independent of the triggering earthquake magnitude m. When alpha = b, small earthquakes are roughly as important to earthquake triggering as larger ones. We evaluate the fractal correlation dimension of hypocenters to be D = 2, using two relocated catalogs for Southern California and removing the effect of short-term clustering. Thus D = 2 alpha, as predicted by assuming that earthquake triggering is due to static stress. The value D = 2 implies that small earthquakes are as important as larger ones for stress transfers between earthquakes. Comment: 14 pages, 7 eps figures, latex. In press in J. Geophys. Res.
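The competition described above reduces to a simple weighting argument: a magnitude-m bin contributes to triggering in proportion to 10^(-bm) (Gutenberg-Richter event count) times 10^(alpha m) (events triggered per earthquake), i.e. 10^((alpha - b)m), with alpha = D/2 under the static-stress/fractal model. A sketch under these assumptions, with illustrative bin edges and parameter values:

```python
import numpy as np

b = 1.0                         # Gutenberg-Richter b-value (assumed)
mags = np.arange(2.0, 7.5, 0.5) # illustrative magnitude bins

def bin_contributions(alpha, b=b, mags=mags):
    """Relative collective triggering rate per magnitude bin:
    count ~ 10^(-b m) times per-event productivity ~ 10^(alpha m)."""
    w = 10.0 ** ((alpha - b) * mags)
    return w / w.sum()          # normalize to fractions

# alpha < b (i.e. D < 2b): small earthquakes collectively dominate.
weak = bin_contributions(alpha=0.8)
# alpha = b (the observed alpha ~ 1.00 with D ~ 2): all bins comparable.
equal = bin_contributions(alpha=1.0)

print("alpha=0.8: share of smallest bin =", round(weak[0], 3))
print("alpha=1.0: share of each bin     =", round(equal[0], 3))
```

With alpha < b the smallest bin carries the largest share, while at alpha = b every magnitude bin contributes equally, which is the paper's observed Southern-California situation.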