The Use of HepRep in GLAST
HepRep is a generic, hierarchical format for description of graphics
representables that can be augmented by physics information and relational
properties. It was developed for high energy physics event display applications
and is especially suited to client/server or component frameworks. The GLAST
experiment, an international effort led by NASA for a gamma-ray telescope to
launch in 2006, chose HepRep to provide a flexible, extensible and maintainable
framework for their event display without tying their users to any one graphics
application. To support HepRep in their GAUDI infrastructure, GLAST developed a
HepRep filler and builder architecture. The architecture hides the details of
XML and CORBA in a set of base and helper classes allowing physics experts to
focus on what data they want to represent. GLAST has two GAUDI services:
HepRepSvc, which registers HepRep fillers in a global registry and allows the
HepRep to be exported to XML, and CorbaSvc, which allows the HepRep to be
published through a CORBA interface and which allows the client application to
feed commands back to GAUDI (such as start next event, or run some GAUDI
algorithm). GLAST's HepRep solution gives users a choice of client
applications, WIRED (written in Java) or FRED (written in C++ and Ruby), and
leaves them free to move to any future HepRep-compliant event display.
Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics (CHEP03), La Jolla, CA, USA, March 2003, 9 pages PDF, 15 figures. PSN THLT00
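As a concrete illustration of the filler/builder split described above, the following is a minimal C++ sketch of the pattern: a builder interface hides the output format (XML, CORBA, ...), so a physics expert writes only a filler that decides which data to represent. All class and method names here (HepRepBuilder, XmlBuilder, TrackFiller) are invented for the example; the actual GLAST base and helper classes differ.

    // Hypothetical sketch of the filler/builder pattern; not GLAST code.
    #include <iostream>
    #include <string>
    #include <vector>

    // The builder hides the concrete output format behind one interface.
    class HepRepBuilder {
    public:
        virtual ~HepRepBuilder() = default;
        virtual void addInstance(const std::string& type) = 0;
        virtual void addPoint(double x, double y, double z) = 0;
        virtual void endInstance() = 0;
    };

    // A builder that writes XML; a CORBA-publishing builder would
    // implement the same interface, so fillers never see the transport.
    class XmlBuilder : public HepRepBuilder {
    public:
        void addInstance(const std::string& type) override {
            std::cout << "<instance type=\"" << type << "\">\n";
        }
        void addPoint(double x, double y, double z) override {
            std::cout << "  <point x=\"" << x << "\" y=\"" << y
                      << "\" z=\"" << z << "\"/>\n";
        }
        void endInstance() override { std::cout << "</instance>\n"; }
    };

    // A physics expert writes only the filler: what data to represent.
    struct TrackPoint { double x, y, z; };

    class TrackFiller {
    public:
        void fill(HepRepBuilder& b, const std::vector<TrackPoint>& track) {
            b.addInstance("Track");
            for (const auto& p : track) b.addPoint(p.x, p.y, p.z);
            b.endInstance();
        }
    };

    int main() {
        XmlBuilder xml;            // could equally be a CORBA builder
        TrackFiller filler;
        filler.fill(xml, {{0, 0, 0}, {1, 2, 3}, {2, 4, 6}});
    }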
Data Management and Mining in Astrophysical Databases
We analyse the issues involved in the management and mining of astrophysical
data. The traditional approach to data management in the astrophysical field is
not able to keep up with the increasing size of the data gathered by modern
detectors. An essential role in astrophysical research will be played by automatic tools for information extraction from large datasets, i.e. data mining techniques such as clustering and classification algorithms. This calls
for an approach to data management based on data warehousing, emphasizing the
efficiency and simplicity of data access; efficiency is obtained using
multidimensional access methods and simplicity is achieved by properly handling
metadata. Clustering and classification techniques, on large datasets, pose
additional requirements: computational and memory scalability with respect to
the data size, interpretability and objectivity of clustering or classification
results. In this study we address some possible solutions.Comment: 10 pages, Late
Grid services for the MAGIC experiment
Exploring signals from outer space has become a rapidly expanding observational science. On the basis of its advanced technology, the MAGIC
telescope is the natural building block for the first large scale ground based
high energy gamma-ray observatory. The low energy threshold for gamma-rays
together with different background sources leads to a considerable amount of
data. The analysis will be done in different institutes spread over Europe.
Therefore MAGIC offers the opportunity to use Grid technology to set up a distributed, computationally and data-intensive analysis system with currently available technology. Benefits of Grid computing for the MAGIC telescope are
presented.
Comment: 5 pages, 1 figure, to be published in the Proceedings of the 6th International Symposium "Frontiers of Fundamental and Computational Physics" (FFP6), Udine (Italy), Sep. 26-29, 2004
Self-Organising Networks for Classification: developing Applications to Science Analysis for Astroparticle Physics
Physics analysis in astroparticle experiments requires the capability of
recognizing new phenomena; in order to establish what is new, it is important
to develop tools for automatic classification, able to compare the final result
with data from different detectors. A typical example is the problem of Gamma
Ray Burst detection, classification, and possible association with known sources: for this task, physicists will in the coming years need tools to associate data from optical databases, from satellite experiments (EGRET, GLAST), and from Cherenkov telescopes (MAGIC, HESS, CANGAROO, VERITAS).
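Schematically, the self-organising (Kohonen) network underlying such classification tools updates its nodes as in the following C++ sketch: a generic textbook illustration with invented parameters, not the analysis software developed for this work.

    // One training step of a 1-D self-organising map with 2-D inputs.
    #include <array>
    #include <cmath>
    #include <cstddef>
    #include <iostream>
    #include <vector>

    using Vec = std::array<double, 2>;

    double dist2(const Vec& a, const Vec& b) {
        double dx = a[0] - b[0], dy = a[1] - b[1];
        return dx * dx + dy * dy;
    }

    // Find the best-matching unit (BMU), then pull every node towards
    // the input, weighted by a Gaussian neighbourhood around the BMU.
    void trainStep(std::vector<Vec>& nodes, const Vec& x,
                   double alpha, double sigma) {
        std::size_t bmu = 0;
        for (std::size_t i = 1; i < nodes.size(); ++i)
            if (dist2(nodes[i], x) < dist2(nodes[bmu], x)) bmu = i;
        for (std::size_t i = 0; i < nodes.size(); ++i) {
            double d = double(i) - double(bmu);
            double h = std::exp(-d * d / (2.0 * sigma * sigma));
            nodes[i][0] += alpha * h * (x[0] - nodes[i][0]);
            nodes[i][1] += alpha * h * (x[1] - nodes[i][1]);
        }
    }

    int main() {
        std::vector<Vec> nodes = {{0.2, 0.8}, {0.5, 0.5}, {0.8, 0.2}};
        Vec sample = {1.0, 0.0};
        for (int epoch = 0; epoch < 100; ++epoch)
            trainStep(nodes, sample, 0.1, 1.0);   // invented parameters
        for (const auto& n : nodes)
            std::cout << n[0] << " " << n[1] << "\n";
    }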
A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline
The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the
handling of the scientific and housekeeping telemetry. It is a critical
component of the Planck ground segment, which must adhere strictly to the project schedule in order to be ready for launch and flight operations. In order to
guarantee the quality necessary to achieve the objectives of the Planck
mission, the design and development of the Level 1 software has followed the
ESA Software Engineering Standards. A fundamental step in the software life
cycle is the Verification and Validation of the software. The purpose of this
work is to show an example of procedures, test development and analysis
successfully applied to a key software project of an ESA mission. We present
the end-to-end validation tests performed on the Level 1 of the LFI-DPC, by
detailing the methods used and the results obtained. Different approaches have
been used to test the scientific and housekeeping data processing. Scientific
data processing has been tested by injecting signals with known properties
directly into the acquisition electronics, in order to generate a test dataset
of real telemetry data and reproduce as much as possible nominal conditions.
For the HK telemetry processing, validation software has been developed to
inject known parameter values into a set of real housekeeping packets and
perform a comparison with the corresponding timelines generated by the Level 1.
With the proposed validation and verification procedure, where the on-board and
ground processing are viewed as a single pipeline, we demonstrated that the
scientific and housekeeping processing of the Planck-LFI raw data is correct
and meets the project requirements.
Comment: 20 pages, 7 figures; this paper is part of the Prelaunch status LFI papers published in JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/jins
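The housekeeping validation idea (inject known values, process, compare timelines) can be pictured with a toy harness like the C++ sketch below; the packet layout and the conversion function are invented stand-ins for the real Level 1 processing.

    // Toy harness for the inject-and-compare validation strategy.
    #include <cmath>
    #include <cstdint>
    #include <iostream>
    #include <vector>

    struct HkPacket { std::uint16_t raw; };   // raw on-board value (invented)

    // Stand-in for the Level 1 calibration step: raw counts -> engineering
    // units. The real conversion curves are instrument-specific.
    double level1Convert(const HkPacket& p) { return 0.01 * p.raw - 5.0; }

    // Compare the produced timeline sample by sample against the values
    // that were injected into the packets.
    bool validate(const std::vector<HkPacket>& injected,
                  const std::vector<double>& expected, double tol) {
        for (std::size_t i = 0; i < injected.size(); ++i) {
            double got = level1Convert(injected[i]);
            if (std::fabs(got - expected[i]) > tol) {
                std::cerr << "sample " << i << ": got " << got
                          << ", expected " << expected[i] << "\n";
                return false;
            }
        }
        return true;
    }

    int main() {
        std::vector<HkPacket> packets = {{500}, {1000}, {1500}};
        std::vector<double> expected = {0.0, 5.0, 10.0};  // known injected values
        std::cout << (validate(packets, expected, 1e-9) ? "PASS" : "FAIL")
                  << "\n";
    }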
CIWS-FW: a Customizable Instrument Workstation Software Framework for instrument-independent data handling
The CIWS-FW is aimed at providing a common and standard solution for the
storage, processing and quick look at the data acquired from scientific
instruments for astrophysics. The target system is the instrument workstation (IW),
either in the context of the Electrical Ground Support Equipment for
space-borne experiments, or in the context of the data acquisition system for
instrumentation. The CIWS-FW core includes software developed by team members
for previous experiments and provides new components and tools that improve the
software reusability, configurability and extensibility attributes. The CIWS-FW
mainly consists of two packages: the data processing system and the data access
system. The former provides the software components and libraries to support
the acquisition, transformation, display and storage, in near real time, of a data packet stream and/or a sequence of data files generated by the
instrument. The latter is a meta-data and data management system, providing a
reusable solution for the archiving and retrieval of the acquired data. A
built-in operator GUI allows the operator to control and configure the IW. In addition, the framework provides mechanisms for error handling and logging. A web
portal provides the access to the CIWS-FW documentation, software repository
and bug tracking tools for CIWS-FW developers. We will describe the CIWS-FW
architecture and summarize the project status.
Comment: Accepted for publication in the ADASS Conference Series
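To picture the packet-stream side of the data processing system, here is a hypothetical C++ sketch of stages chained behind a common handler interface, so that acquisition, transformation and storage components can be composed independently; the names (PacketHandler, Transform, Store) are invented and do not come from the CIWS-FW code.

    // Hypothetical staged packet pipeline; not CIWS-FW code.
    #include <cstdint>
    #include <iostream>
    #include <vector>

    // A telemetry packet as a raw byte payload (layout invented).
    struct Packet { std::vector<std::uint8_t> payload; };

    // Each pipeline stage implements one interface, so stages can be
    // chained in any order without knowing what comes next.
    class PacketHandler {
    public:
        virtual ~PacketHandler() = default;
        virtual void onPacket(const Packet& p) = 0;
    };

    // A transformation stage that forwards to the next stage.
    class Transform : public PacketHandler {
    public:
        explicit Transform(PacketHandler& next) : next_(next) {}
        void onPacket(const Packet& p) override {
            Packet out = p;             // e.g. decode, reorder, calibrate...
            next_.onPacket(out);
        }
    private:
        PacketHandler& next_;
    };

    // A storage stage at the end of the chain.
    class Store : public PacketHandler {
    public:
        void onPacket(const Packet& p) override {
            std::cout << "stored packet of " << p.payload.size()
                      << " bytes\n";
        }
    };

    int main() {
        Store store;
        Transform transform(store);
        transform.onPacket(Packet{{0x01, 0x02, 0x03}});  // simulated acquisition
    }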
A novel background reduction strategy for high level triggers and processing in gamma-ray Cherenkov detectors
Gamma-ray astronomy is now at the leading edge of studies related both to
fundamental physics and astrophysics. The sensitivity of gamma detectors is
limited by the huge amount of background, consisting of hadronic cosmic rays
(typically two to three orders of magnitude more than the signal) and by the
accidental background in the detectors. By using the information on the
temporal evolution of the Cherenkov light, the background can be reduced. We
will present here the results obtained within the MAGIC experiment using a new
technique for the reduction of the background. Particle showers produced by
gamma rays show a different temporal distribution with respect to showers
produced by hadrons; the background due to accidental counts shows no
dependence on time. Such a novel strategy can increase the sensitivity of present instruments.
Comment: 4 pages, 3 figures, Proc. of the 9th Int. Symposium "Frontiers of Fundamental and Computational Physics" (FFP9) (AIP, Melville, New York, 2008, in press)
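Schematically, a timing-based selection of this kind reduces to computing the spread of photon arrival times in an event and cutting on it, as in the C++ sketch below; the data and the threshold are invented for illustration and do not correspond to the actual MAGIC analysis.

    // Toy timing cut: gamma showers are compact in time, hadronic showers
    // and accidental coincidences are broader or uniform.
    #include <cmath>
    #include <iostream>
    #include <vector>

    // RMS spread of the Cherenkov photon arrival times in one event.
    double timeSpread(const std::vector<double>& t) {
        double mean = 0.0;
        for (double v : t) mean += v;
        mean /= t.size();
        double var = 0.0;
        for (double v : t) var += (v - mean) * (v - mean);
        return std::sqrt(var / t.size());
    }

    int main() {
        // Arrival times in ns; values invented for illustration.
        std::vector<double> gammaLike  = {10.0, 10.3, 10.1, 10.2, 10.4};
        std::vector<double> hadronLike = {8.0, 12.5, 9.7, 14.2, 10.9};

        const double cutNs = 1.0;   // illustrative threshold, not a MAGIC value
        for (const auto* ev : {&gammaLike, &hadronLike})
            std::cout << "spread = " << timeSpread(*ev) << " ns -> "
                      << (timeSpread(*ev) < cutNs ? "keep" : "reject") << "\n";
    }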
Off-line radiometric analysis of Planck/LFI data
The Planck Low Frequency Instrument (LFI) is an array of 22
pseudo-correlation radiometers on-board the Planck satellite to measure
temperature and polarization anisotropies in the Cosmic Microwave Background
(CMB) in three frequency bands (30, 44 and 70 GHz). To calibrate and verify the performance of the LFI, a software suite named LIFE has been developed. Its
aims are to provide a common platform for analyzing the results of the tests performed on the individual components of the instrument (RCAs, Radiometric
Chain Assemblies) and on the integrated Radiometric Array Assembly (RAA).
Moreover, its analysis tools are designed to be used during flight as well, producing periodic reports on the status of the instrument. The LIFE suite has
been developed using a multi-layered, cross-platform approach. It implements a
number of analysis modules written in RSI IDL, each accessing the data through
a portable and heavily optimized library of functions written in C and C++. One
of the most important features of LIFE is its ability to run the same data
analysis codes both using ground test data and real flight data as input. The
LIFE software suite has been successfully used during the RCA/RAA tests and the
Planck Integrated System Tests. Moreover, the software has also passed the
verification for its in-flight use during the System Operations Verification
Tests, held in October 2008.
Comment: Planck LFI technical papers published by JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/1748-022
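The layering described above (IDL analysis modules over an optimized C/C++ library) typically crosses the language boundary through IDL's CALL_EXTERNAL convention, in which all arguments arrive as an array of untyped pointers. The sketch below is a hypothetical example of such a function; the name and argument layout are invented, not taken from LIFE.

    // Hypothetical C++ function shaped for IDL's CALL_EXTERNAL portable
    // convention: (int argc, void* argv[]). extern "C" keeps the symbol
    // name unmangled so IDL can find it in the shared library.
    extern "C" double timeline_mean(int argc, void* argv[]) {
        (void)argc;  // unused in this sketch
        // argv[0]: pointer to the sample array (invented layout)
        // argv[1]: pointer to the number of samples (IDL LONG, 32-bit)
        const double* data = static_cast<const double*>(argv[0]);
        const int n = *static_cast<const int*>(argv[1]);
        double sum = 0.0;
        for (int i = 0; i < n; ++i) sum += data[i];
        return n > 0 ? sum / n : 0.0;
    }

On the IDL side such a function would be invoked through CALL_EXTERNAL with the shared library path and function name; the exact invocation depends on the IDL version and is omitted here.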
Planck-LFI: Design and Performance of the 4 Kelvin Reference Load Unit
The LFI radiometers use a pseudo-correlation design where the signal from the
sky is continuously compared with a stable reference signal, provided by a
cryogenic reference load system. The reference unit is composed of small pyramidal horns, one for each radiometer, 22 in total, facing small absorbing targets made of a commercial resin, ECCOSORB CR (TM), cooled to approximately
4.5 K. Horns and targets are separated by a small gap to allow thermal
decoupling. Target and horn design is optimized for each of the LFI bands,
centered at 70, 44 and 30 GHz. Pyramidal horns are either machined inside the
radiometer 20K module or connected via external electro-formed bent
waveguides. The requirement of high stability of the reference signal imposed a
careful design for the radiometric and thermal properties of the loads.
Materials used for the manufacturing have been characterized for thermal, RF
and mechanical properties. In this paper we describe the design and performance of the reference system.
Comment: This is an author-created, un-copyedited version of an article accepted for publication in JINST. IOP Publishing Ltd is not responsible for any errors or omissions in this version of the manuscript or any version derived from it. The definitive publisher-authenticated version is available online at [10.1088/1748-0221/4/12/T12006]. 14 pages, 34 figures
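The role of the reference load in the pseudo-correlation scheme can be illustrated numerically: the sky and reference signals are differenced with a gain-modulation factor r chosen to null the offset, so that slow gain fluctuations cancel to first order. The C++ sketch below uses round illustrative numbers, not LFI flight parameters.

    // Simplified numerical illustration of pseudo-correlation differencing.
    #include <iostream>

    int main() {
        const double Tsky   = 2.7;   // K, sky (CMB) brightness temperature
        const double Tref   = 4.5;   // K, reference load temperature
        const double Tnoise = 30.0;  // K, receiver noise (illustrative)

        // Gain-modulation factor that nulls the mean differenced output.
        const double r = (Tsky + Tnoise) / (Tref + Tnoise);

        // Total-power outputs of the two radiometer legs for gain G,
        // combined into the differenced datum.
        auto output = [&](double G) {
            double psky = G * (Tsky + Tnoise);
            double pref = G * (Tref + Tnoise);
            return psky - r * pref;
        };

        // With r applied, the differenced output stays at (numerically)
        // zero even when the gain drifts by 1%, which is why the
        // stability of the reference signal matters so much.
        std::cout << "G=1.00: " << output(1.00) << "\n";
        std::cout << "G=1.01: " << output(1.01) << "\n";
    }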
Planck pre-launch status: calibration of the Low Frequency Instrument flight model radiometers
The Low Frequency Instrument (LFI) on-board the ESA Planck satellite carries
eleven radiometer subsystems, called Radiometer Chain Assemblies (RCAs), each
composed of a pair of pseudo-correlation receivers. We describe the on-ground
calibration campaign performed to qualify the flight model RCAs and to measure
their pre-launch performance. Each RCA was calibrated in a dedicated
flight-like cryogenic environment with the radiometer front-end cooled to 20K
and the back-end at 300K, and with an external input load cooled to 4K. A
matched load simulating a blackbody at different temperatures was placed in
front of the sky horn to derive basic radiometer properties such as noise
temperature, gain, and noise performance, e.g. 1/f noise. The spectral response
of each detector was measured, as was its susceptibility to thermal variation.
All eleven LFI RCAs were calibrated. Instrumental parameters measured in these
tests, such as noise temperature, bandwidth, radiometer isolation, and
linearity, provide essential inputs to the Planck-LFI data analysis.
Comment: 15 pages, 18 figures. Accepted for publication in Astronomy and Astrophysics
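The matched load measured at two known temperatures supports a standard Y-factor estimate of noise temperature and gain. The C++ sketch below shows the arithmetic with invented numbers, as a generic illustration of this kind of measurement rather than an LFI result.

    // Standard Y-factor estimate from two load temperatures.
    #include <iostream>

    int main() {
        // Radiometer (total-power) outputs measured with the matched load
        // at two known temperatures, assuming P = G * (Tload + Tnoise).
        const double Thot = 30.0, Tcold = 4.0;   // K, load temperatures
        const double Phot = 1.05, Pcold = 0.66;  // arbitrary power units

        const double Y = Phot / Pcold;           // Y-factor
        const double Tnoise = (Thot - Y * Tcold) / (Y - 1.0);
        const double gain = (Phot - Pcold) / (Thot - Tcold);

        std::cout << "Tnoise = " << Tnoise << " K\n";     // 40 K here
        std::cout << "gain   = " << gain << " units/K\n"; // 0.015 here
    }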
