Data Compression in the Petascale Astronomy Era: a GERLUMPH case study
As the volume of data grows, astronomers are increasingly faced with choices
on what data to keep -- and what to throw away. Recent work evaluating the
JPEG2000 (ISO/IEC 15444) standards as a future data format standard in
astronomy has shown promising results on observational data. However, there is
still a need to evaluate its potential on other types of astronomical data, such
as from numerical simulations. GERLUMPH (the GPU-Enabled High Resolution
cosmological MicroLensing parameter survey) represents an example of a data
intensive project in theoretical astrophysics. In the next phase of processing,
the ~27 terabyte GERLUMPH dataset is set to grow by a factor of 100 -- well
beyond the current storage capabilities of the supercomputing facility on which
it resides. In order to minimise bandwidth usage, file transfer time, and
storage space, this work evaluates several data compression techniques.
Specifically, we investigate off-the-shelf and custom lossless compression
algorithms as well as the lossy JPEG2000 compression format. Results of
lossless compression algorithms on GERLUMPH data products show small
compression ratios (1.35:1 to 4.69:1 relative to the input file size), varying with the
nature of the input data. Our results suggest that JPEG2000 could be suitable
for other numerical datasets stored as gridded data or volumetric data. When
approaching lossy data compression, one should keep in mind the intended
purposes of the data to be compressed, and evaluate the effect of the loss on
future analysis. In our case study, lossy compression and a high compression
ratio do not significantly compromise the intended use of the data for
constraining quasar source profiles from cosmological microlensing.
Comment: 15 pages, 9 figures, 5 tables. Published in the Special Issue of Astronomy & Computing on The future of astronomical data formats
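
As a rough illustration of the kind of evaluation described above, the sketch below compresses a synthetic gridded dataset with standard off-the-shelf lossless codecs and reports the resulting compression ratios. The array shape, data type, and codec choices are illustrative assumptions only and do not reproduce the GERLUMPH pipeline.

# Sketch: compare off-the-shelf lossless codecs on a synthetic gridded
# dataset and report compression ratios (uncompressed size : compressed size).
# The array shape, dtype, and codec list are illustrative assumptions only.
import bz2
import gzip
import lzma

import numpy as np

rng = np.random.default_rng(42)
# Stand-in for a gridded data product: a smooth field with noise
# (illustrative only; not an actual GERLUMPH data product).
grid = rng.standard_normal((1024, 1024)).cumsum(axis=0).cumsum(axis=1)
raw = grid.astype(np.float32).tobytes()

codecs = {
    "gzip": lambda b: gzip.compress(b, compresslevel=9),
    "bzip2": lambda b: bz2.compress(b, compresslevel=9),
    "lzma": lambda b: lzma.compress(b),
}

for name, compress in codecs.items():
    ratio = len(raw) / len(compress(raw))
    print(f"{name}: {ratio:.2f}:1")

How much each codec gains depends strongly on the smoothness and dynamic range of the input grid, which is why the abstract reports ratios that vary with the nature of the data.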
A new parameter space study of cosmological microlensing
Cosmological gravitational microlensing is a useful technique for
understanding the structure of the inner parts of a quasar, especially the
accretion disk and the central supermassive black hole. So far, most of the
cosmological microlensing studies have focused on single objects from the ~90
currently known lensed quasars. However, present and planned all-sky surveys
are expected to discover thousands of new lensed systems. Using a graphics
processing unit (GPU) accelerated ray-shooting code, we have generated 2550
magnification maps uniformly across the convergence (κ) and shear (γ) parameter space of interest to microlensing. We examine the effect
of random realizations of the microlens positions on map properties such as the
magnification probability distribution (MPD). It is shown that for most of the
parameter space a single map is representative of the average behaviour. All of
the simulations have been carried out on the GPU-Supercomputer for Theoretical
Astrophysics Research (gSTAR).
Comment: 16 pages, 10 figures, accepted for publication in MNRAS
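
The sketch below shows one way a magnification probability distribution (MPD) might be estimated from a magnification map, by histogramming pixel magnifications in log space. The synthetic map and binning choices are assumptions for illustration; the paper's maps are produced by a GPU ray-shooting code, not this toy model.

# Sketch: estimate a magnification probability distribution (MPD) from a
# magnification map by histogramming pixel magnifications in log space.
# The synthetic map and binning choices are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a magnification map: log-normal pixel magnifications.
mag_map = rng.lognormal(mean=0.0, sigma=0.8, size=(2048, 2048))

# Histogram log10(magnification) and normalise to a probability density.
log_mu = np.log10(mag_map.ravel())
bins = np.linspace(log_mu.min(), log_mu.max(), 200)
mpd, edges = np.histogram(log_mu, bins=bins, density=True)
centres = 0.5 * (edges[:-1] + edges[1:])

# MPDs built this way can be compared between maps generated with different
# random realisations of the microlens positions.
print(centres[:5], mpd[:5])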
Three-dimensional shapelets and an automated classification scheme for dark matter haloes
We extend the two-dimensional Cartesian shapelet formalism to d-dimensions.
Concentrating on the three-dimensional case, we derive shapelet-based equations
for the mass, centroid, root-mean-square radius, and components of the
quadrupole moment and moment of inertia tensors. Using cosmological N-body
simulations as an application domain, we show that three-dimensional shapelets
can be used to replicate the complex sub-structure of dark matter haloes and
demonstrate the basis of an automated classification scheme for halo shapes. We
investigate the shapelet decomposition process from an algorithmic viewpoint,
and consider opportunities for accelerating the computation of shapelet-based
representations using graphics processing units (GPUs).
Comment: 19 pages, 11 figures, accepted for publication in MNRAS
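
For reference, the following sketch evaluates a three-dimensional Cartesian shapelet basis function as a separable product of one-dimensional Gauss-Hermite shapelets, assuming the standard Cartesian shapelet normalisation. The scale beta and the orders (n1, n2, n3) are arbitrary illustrative choices; this is not the authors' decomposition code.

# Sketch: evaluate a 3-D Cartesian shapelet basis function as the product of
# three 1-D shapelets (Gauss-Hermite functions).  The scale beta and the
# orders chosen in the example are arbitrary illustrative assumptions.
from math import factorial, pi, sqrt

import numpy as np
from scipy.special import eval_hermite


def shapelet_1d(n, x, beta):
    """Dimensional 1-D Cartesian shapelet B_n(x; beta)."""
    u = x / beta
    norm = 1.0 / sqrt(2.0**n * sqrt(pi) * factorial(n) * beta)
    return norm * eval_hermite(n, u) * np.exp(-0.5 * u**2)


def shapelet_3d(n1, n2, n3, x, y, z, beta):
    """3-D Cartesian shapelet as a separable product of 1-D shapelets."""
    return (shapelet_1d(n1, x, beta)
            * shapelet_1d(n2, y, beta)
            * shapelet_1d(n3, z, beta))


# Example: evaluate one basis function on a small grid.
coords = np.linspace(-5.0, 5.0, 64)
x, y, z = np.meshgrid(coords, coords, coords, indexing="ij")
basis = shapelet_3d(2, 1, 0, x, y, z, beta=1.5)
print(basis.shape)  # (64, 64, 64)

A decomposition would project a gridded density field onto a truncated set of such basis functions; the resulting coefficients are what feed quantities like the mass, centroid, and moment tensors discussed above.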
