
    The Montage Image Mosaic Service: Custom Image Mosaics On-Demand

    The Montage software suite has proven extremely useful as a general engine for reprojecting, background matching, and mosaicking astronomical image data from a wide variety of sources. The processing algorithms support all common World Coordinate System (WCS) projections and have been shown to be both astrometrically accurate and flux conserving. The background ‘matching’ algorithm does not remove background flux but rather finds the best compromise background based on all the input images and matches the individual images to that. The Infrared Science Archive (IRSA), part of the Infrared Processing and Analysis Center (IPAC) at Caltech, has now wrapped the Montage software as a CGI service and provided a compute and request management infrastructure capable of producing approximately 2 TBytes/day of image mosaic output (e.g. from 2MASS and SDSS data). Besides the basic Montage engine, this service makes use of a 16-node Linux cluster (dual processor, dual core) and the ROME request management software developed by the National Virtual Observatory (NVO). ROME uses EJB/database technology to manage user requests, queue processing, load balancing between users, job monitoring, and user notification. The Montage service will be extended to process user-defined data collections, including private data uploads.
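    The "compromise background" idea can be sketched numerically: given measured background differences between overlapping image pairs, solve a least-squares problem for per-image offsets. This is a simplified illustration only (constant offsets per image, hypothetical function names); Montage's own background modeling fits planes per image and is more elaborate.

```python
import numpy as np

def compromise_offsets(n_images, overlaps):
    """Sketch of compromise background matching (constant offsets).

    overlaps: list of (i, j, d_ij) where d_ij is the measured
    background difference (image i minus image j) in their overlap.
    Returns per-image offsets c minimizing sum (c_i - c_j - d_ij)^2.
    """
    rows, rhs = [], []
    for i, j, d in overlaps:
        row = np.zeros(n_images)
        row[i], row[j] = 1.0, -1.0
        rows.append(row)
        rhs.append(d)
    # Pin the mean offset to zero: the background is redistributed
    # to a common compromise level, not removed from the data.
    rows.append(np.ones(n_images))
    rhs.append(0.0)
    c, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return c
```

Subtracting `c[i]` from image `i` then brings all images to the same compromise background while preserving total flux.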

    A Cost-Benefit Study of Doing Astrophysics On The Cloud: Production of Image Mosaics

    Utility grids such as the Amazon EC2 and Amazon S3 clouds offer computational and storage resources that can be used on demand, for a fee, by compute- and data-intensive applications. The cost of running an application on such a cloud depends on the compute, storage, and communication resources it will provision and consume. Different execution plans of the same application may result in significantly different costs. We studied via simulation the cost-performance trade-offs of different execution and resource provisioning plans by creating, under the Amazon cloud fee structure, mosaics with the Montage image mosaic engine, a widely used data- and compute-intensive application. Specifically, we studied the cost of building mosaics of 2MASS data that have sizes of 1, 2, and 4 square degrees, and a 2MASS all-sky mosaic. These are examples of mosaics commonly generated by astronomers. We also studied these trade-offs in the context of the storage and communication fees of Amazon S3 when used for long-term application data archiving. Our results show that by provisioning the right amount of storage and compute resources, cost can be significantly reduced with no significant impact on application performance.
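    The structure of such a cost model can be sketched as the sum of compute, storage, and transfer charges for a given execution plan. The rates and function below are illustrative assumptions, not actual Amazon prices or the paper's simulator.

```python
import math

# Placeholder rates for illustration only -- NOT actual Amazon pricing.
RATE_CPU_HOUR = 0.10   # $/instance-hour (assumed)
RATE_STORAGE = 0.15    # $/GB-month (assumed)
RATE_TRANSFER = 0.10   # $/GB transferred out (assumed)

def plan_cost(cpu_hours, storage_gb_months, transfer_gb):
    """Total cost of one execution plan under the assumed fee structure."""
    # Instance-hours are typically billed in whole hours, so round up.
    return (math.ceil(cpu_hours) * RATE_CPU_HOUR
            + storage_gb_months * RATE_STORAGE
            + transfer_gb * RATE_TRANSFER)
```

Comparing `plan_cost` across candidate provisioning plans (fewer long-lived instances vs. many short-lived ones, local vs. S3 storage) is the kind of trade-off the simulation explores.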

    Adaptive Regret Minimization in Bounded-Memory Games

    Online learning algorithms that minimize regret provide strong guarantees in situations that involve repeatedly making decisions in an uncertain environment, e.g. a driver deciding what route to drive to work every day. While regret minimization has been extensively studied in repeated games, we study regret minimization for a richer class of games called bounded-memory games. In each round of a two-player bounded-memory-m game, both players simultaneously play an action, observe an outcome, and receive a reward. The reward may depend on the last m outcomes as well as the actions of the players in the current round. The standard notion of regret for repeated games is no longer suitable because actions and rewards can depend on the history of play. To account for this generality, we introduce the notion of k-adaptive regret, which compares the reward obtained by playing actions prescribed by the algorithm against a hypothetical k-adaptive adversary with the reward obtained by the best expert in hindsight against the same adversary. Roughly, a hypothetical k-adaptive adversary adapts her strategy to the defender's actions exactly as the real adversary would within each window of k rounds. Our definition is parametrized by a set of experts, which can include both fixed and adaptive defender strategies. We investigate the inherent complexity of, and design algorithms for, adaptive regret minimization in bounded-memory games of perfect and imperfect information. We prove a hardness result showing that, with imperfect information, any k-adaptive regret minimizing algorithm (with fixed strategies as experts) must be inefficient unless NP=RP, even when playing against an oblivious adversary. In contrast, for bounded-memory games of perfect and imperfect information we present approximate 0-adaptive regret minimization algorithms against an oblivious adversary running in time n^{O(1)}. Comment: Full version. GameSec 2013 (invited paper).
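    The baseline notion this paper generalizes, regret minimization in repeated games, is commonly illustrated with the multiplicative-weights (Hedge) algorithm against an oblivious adversary. This is a generic textbook sketch, not the paper's k-adaptive algorithm.

```python
import numpy as np

def hedge(reward_rows, eta=0.1):
    """Multiplicative-weights regret minimization over K experts.

    reward_rows: T x K array; reward_rows[t, k] is the reward expert k
    would earn in round t (fixed in advance: an oblivious adversary).
    Returns (expected algorithm reward, regret vs. best fixed expert).
    """
    K = reward_rows.shape[1]
    w = np.ones(K)
    total = 0.0
    for r in reward_rows:
        p = w / w.sum()           # play experts with these probabilities
        total += p @ r            # expected reward this round
        w *= np.exp(eta * r)      # upweight experts that did well
    regret = reward_rows.sum(axis=0).max() - total
    return total, regret
```

With a suitable learning rate, regret grows only sublinearly in the number of rounds, the guarantee that k-adaptive regret extends to history-dependent rewards.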

    How good are your fits? Unbinned multivariate goodness-of-fit tests in high energy physics

    Multivariate analyses play an important role in high energy physics. Such analyses often involve performing an unbinned maximum likelihood fit of a probability density function (p.d.f.) to the data. This paper explores a variety of unbinned methods for determining the goodness of fit of the p.d.f. to the data. The application and performance of each method is discussed in the context of a real-life high energy physics analysis (a Dalitz-plot analysis). Several of the methods presented in this paper can also be used for the non-parametric determination of whether two samples originate from the same parent p.d.f. This can be used, e.g., to determine the quality of a detector Monte Carlo simulation without the need for a parametric expression of the efficiency. Comment: 32 pages, 12 figures.
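    One simple unbinned two-sample statistic of the kind discussed is the energy statistic, which is near zero when two multivariate samples share a parent p.d.f. This is a generic illustration; the paper surveys several methods and does not necessarily use this exact one.

```python
import numpy as np

def energy_statistic(x, y):
    """Unbinned energy two-sample statistic for samples x (n, d), y (m, d).

    E = 2*E|X - Y| - E|X - X'| - E|Y - Y'|; larger values indicate the
    samples are less likely to come from the same parent distribution.
    """
    def mean_dist(a, b):
        diff = a[:, None, :] - b[None, :, :]
        return np.sqrt((diff ** 2).sum(-1)).mean()
    return 2 * mean_dist(x, y) - mean_dist(x, x) - mean_dist(y, y)
```

In practice the statistic's null distribution is estimated by permuting the combined sample, which avoids any binning of the (possibly high-dimensional) data.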

    Design of the Spitzer Space Telescope Heritage Archive

    It is predicted that the Spitzer Space Telescope’s cryogen will run out in April 2009, and the final reprocessing for the cryogenic mission is scheduled to end in April 2011, at which time the Spitzer archive will be transferred to the NASA/IPAC Infrared Science Archive (IRSA) for long-term curation. The Spitzer Science Center (SSC) and IRSA are collaborating to design and deploy the Spitzer Heritage Archive (SHA), which will supersede the current Spitzer archive. It will initially contain the raw and final reprocessed cryogenic science products, and will eventually incorporate the final products from the Warm mission. The SHA will be accompanied by the tools deemed necessary to extract the full science content of the archive and by comprehensive documentation.

    Exposure to Household Air Pollution from Biomass-Burning Cookstoves and HbA1c and Diabetic Status Among Honduran Women

    Household air pollution from biomass cookstoves is estimated to be responsible for more than two and a half million premature deaths annually, primarily in low- and middle-income countries where cardiometabolic disorders, such as Type II diabetes, are increasing. Growing evidence supports a link between ambient air pollution and diabetes, but evidence for household air pollution is limited. This cross-sectional study of 142 women (72 with traditional stoves and 70 with cleaner-burning Justa stoves) in rural Honduras evaluated the association of exposure to household air pollution (stove type; 24-hour average kitchen and personal fine particulate matter [PM2.5] mass and black carbon) with glycated hemoglobin (HbA1c) levels and diabetic status based on HbA1c levels. The prevalence ratio (PR) per interquartile-range increase in pollution concentration indicated a higher prevalence of prediabetes/diabetes (vs. normal HbA1c) for all pollutant measures (e.g., PR per 84 μg/m3 increase in personal PM2.5, 1.49; 95% confidence interval [CI], 1.11-2.01). Results for HbA1c as a continuous variable were generally in the hypothesized direction. These results provide some evidence linking household air pollution with the prevalence of prediabetes/diabetes and, if confirmed, suggest that the global public health impact of household air pollution may be broader than currently estimated.
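    The "PR per interquartile-range increase" scaling follows from a log-link regression model, where the log prevalence ratio is proportional to exposure, so PR over an increment of size IQR is exp(beta * IQR). The function below is a generic illustration of that arithmetic, not the study's actual model or data.

```python
import math

def pr_per_increment(beta, increment):
    """Prevalence ratio over an exposure increment, given beta = log-PR
    per unit exposure from a log-link (e.g. log-binomial) model."""
    return math.exp(beta * increment)
```

For example, a PR of 1.49 per 84 μg/m3 (the abstract's personal PM2.5 result) corresponds to beta = ln(1.49)/84 per μg/m3 on the log scale.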

    Degree of explanation

    Partial explanations are everywhere. That is, explanations citing causes that explain some but not all of an effect are ubiquitous across science, and these in turn rely on the notion of degree of explanation. I argue that current accounts are seriously deficient. In particular, they do not adequately incorporate the way in which a cause’s explanatory importance varies with the choice of explanandum. Using influential recent contrastive theories, I develop quantitative definitions that remedy this lacuna, and relate them to existing measures of degree of causation. Among other things, this reveals the precise role here of chance, as well as bearing on the relation between causal explanation and causation itself.

    The NASA Exoplanet Archive: Data and Tools for Exoplanet Research

    We describe the contents and functionality of the NASA Exoplanet Archive, a database and tool set funded by NASA to support astronomers in the exoplanet community. The current content of the database includes interactive tables containing the properties of all published exoplanets; Kepler planet candidates, threshold-crossing events, data validation reports, and target stellar parameters; light curves from the Kepler and CoRoT missions and from several ground-based surveys; and spectra and radial velocity measurements from the literature. Tools provided to work with these data include a transit ephemeris predictor (both for single planets and for observing locations), light curve viewing and normalization utilities, and a periodogram and phased light curve service. The archive can be accessed at http://exoplanetarchive.ipac.caltech.edu. Comment: Accepted for publication in the Publications of the Astronomical Society of the Pacific, 4 figures.
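    The arithmetic behind a transit ephemeris predictor is a linear ephemeris: the n-th mid-transit time is t0 + n*P for reference epoch t0 and orbital period P. The sketch below shows only this core calculation; the archive's actual service additionally handles observing-site visibility, timing uncertainties, and so on.

```python
import math

def next_transit(t0, period, after):
    """First mid-transit time at or after `after`, assuming a linear
    ephemeris t_n = t0 + n * period (all times in days, e.g. BJD)."""
    n = math.ceil((after - t0) / period)
    return t0 + n * period
```

Iterating from the returned epoch in steps of `period` lists all upcoming transit windows for an observing run.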