21 research outputs found
Development of a quality assurance process for the SoLid experiment
The SoLid experiment has been designed to search for an oscillation pattern induced by a light sterile neutrino state, utilising the BR2 reactor of SCK·CEN, in Belgium.
The detector leverages a new hybrid technology, combining two distinct scintillators in a cubic array to create a highly segmented detector volume. A combination of 5 cm cubic polyvinyltoluene cells, with ⁶LiF:ZnS(Ag) sheets on two faces of each cube, facilitates reconstruction of the neutrino signals. While the high granularity provides a powerful toolset to discriminate backgrounds, the segmentation itself also represents a challenge in terms of homogeneity and calibration, which are essential for a consistent detector response. The search for this light sterile neutrino implies sensitivity to distortions of around O(10)% in the energy spectrum of reactor antineutrinos (ν̄). Hence, a very good neutron detection efficiency, light yield and homogeneous detector response are critical for data validation. The minimal requirements for the SoLid physics programme are a light yield larger than 40 PA/MeV/cube and a neutron detection efficiency larger than 50%. To guarantee these minimal requirements, the collaboration developed a rigorous quality assurance process for all 12800 cubic cells of the detector. To carry out this process, an automated calibration system called CALIPSO was designed and constructed. CALIPSO provides precise, automatic placement of radioactive sources in front of each cube of a given detector plane (16 × 16 cubes). A combination of Na-22, Cf-252 and AmBe gamma and neutron sources was used by CALIPSO during the quality assurance process. First, the scanning identified defective components, allowing for repairs during the initial construction of the SoLid detector. Second, a full analysis of the calibration data yielded initial estimates of a light yield of over 60 PA/MeV and a neutron reconstruction efficiency of 68%, validating the SoLid physics requirements.
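The acceptance criteria quoted above reduce to a simple per-cube pass/fail check; the sketch below illustrates that logic under the thresholds stated in the abstract (the function and field names are hypothetical and not part of the CALIPSO software):

```python
# Minimal sketch of the SoLid QA acceptance criteria described above.
# Thresholds come from the abstract: >40 PA/MeV/cube light yield and
# >50% neutron detection efficiency. Names here are illustrative only.

MIN_LIGHT_YIELD = 40.0   # pixel avalanches per MeV per cube
MIN_NEUTRON_EFF = 0.50   # neutron detection efficiency

def cube_passes_qa(light_yield: float, neutron_eff: float) -> bool:
    """Return True if a single cube meets the stated physics requirements."""
    return light_yield > MIN_LIGHT_YIELD and neutron_eff > MIN_NEUTRON_EFF

# The values reported after calibration (60 PA/MeV, 68% efficiency) pass:
print(cube_passes_qa(60.0, 0.68))  # True
```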
Search for high-energy neutrinos from gravitational wave event GW151226 and candidate LVT151012 with ANTARES and IceCube
The Advanced LIGO observatories detected gravitational waves from two binary black hole mergers during their first observation run (O1). We present a high-energy neutrino follow-up search for the second gravitational wave event, GW151226, as well as for the gravitational wave candidate LVT151012. We find two and four neutrino candidates detected by IceCube, and one and zero detected by ANTARES, within ±500 s around the respective gravitational wave signals, consistent with the expected background rate. None of these neutrino candidates is found to be directionally coincident with GW151226 or LVT151012. We use the nondetection to constrain the isotropic-equivalent high-energy neutrino emission from GW151226, adopting the GW event's 3D localization, to less than 2×10⁵¹ to 2×10⁵⁴ erg. © 2017 American Physical Society.
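The ±500 s temporal coincidence selection described above can be sketched as a simple window cut; the event times below are made up for illustration, and the real analyses additionally require directional coincidence:

```python
# Illustrative sketch of a +/-500 s temporal coincidence selection around a
# gravitational wave event. Times are hypothetical; actual follow-up analyses
# use detector-specific data and also test for directional coincidence.

WINDOW = 500.0  # seconds on either side of the GW time

def coincident_candidates(gw_time: float, neutrino_times: list) -> list:
    """Return neutrino candidate times within +/-500 s of the GW event."""
    return [t for t in neutrino_times if abs(t - gw_time) <= WINDOW]

# Example with made-up times (seconds in some common epoch):
gw = 100000.0
events = [99400.0, 99700.0, 100450.0, 101200.0]
print(coincident_candidates(gw, events))  # [99700.0, 100450.0]
```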
Search for cosmic sources of high-energy neutrinos with the AMANDA-II detector
AMANDA-II is a neutrino telescope comprising a three-dimensional array of optical sensors deployed in the South Pole glacier. Its detection principle rests on the Cherenkov radiation emitted by charged secondary particles produced by the interaction of a high-energy neutrino (> 100 GeV) with the matter surrounding the detector.

This work is based on data recorded by the AMANDA-II detector between 2000 and 2006 in order to search for cosmic sources of neutrinos. A potential signal must be extracted from the overwhelming background of muons and neutrinos originating from the interaction of primary cosmic rays within the atmosphere. The observation is limited to the northern hemisphere in order to be free of the atmospheric muon background, which is stopped by the Earth. However, atmospheric neutrinos constitute an irreducible background, composing the main part of the 6100 events selected for this analysis. It is nevertheless possible to identify a point source of cosmic neutrinos by looking for a local excess breaking away from the isotropic background of atmospheric neutrinos. This search is coupled with a selection based on energy, whose spectrum differs between cosmic and atmospheric neutrinos.

An original statistical approach has been developed to optimize the detection of point sources while controlling the false discovery rate, and hence the confidence level, of an observation. This method is based solely on knowledge of the background hypothesis, without any assumption on the neutrino production model of the sought sources. Moreover, the method naturally accounts for the trial factor inherent in multiple testing. The procedure was applied to the final sample of events collected by AMANDA-II. (Doctorate in Sciences.)
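False-discovery-rate control over many tested sky locations is commonly implemented with the Benjamini-Hochberg procedure; the sketch below shows that standard textbook procedure, not the specific method developed in the thesis:

```python
# Illustrative Benjamini-Hochberg procedure for controlling the false
# discovery rate (FDR) across many simultaneous hypothesis tests, e.g.
# many candidate source positions on the sky. Standard procedure only;
# the thesis develops its own variant.

def benjamini_hochberg(p_values, alpha=0.05):
    """Return sorted indices of hypotheses rejected at FDR level alpha."""
    m = len(p_values)
    # Sort p-values ascending, remembering original indices.
    order = sorted(range(m), key=lambda i: p_values[i])
    # Find the largest rank k (1-based) with p_(k) <= (k / m) * alpha.
    k_max = 0
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= rank / m * alpha:
            k_max = rank
    # Reject the k_max smallest p-values.
    return sorted(order[:k_max])

# Example: only the two smallest p-values survive at alpha = 0.05.
print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.6]))  # [0, 1]
```

Note that the per-test threshold grows with rank, which is what lets the procedure control the expected fraction of false discoveries rather than the much stricter family-wise error rate.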
Erratum: Search for high-energy muon neutrinos from the "naked-eye" GRB 080319B with the IceCube neutrino telescope (The Astrophysical Journal (2009) 701 (1721))
Using Business Data in Customs Risk Management: Data Quality and Data Value Perspective
With the rise of data analytics use in government, government organizations are starting to explore the possibilities of using business data to create further public value. This process, however, is far from straightforward: key questions that governments need to address relate to the quality of this external data and the value it brings. In the domain of global trade, customs administrations are responsible, on the one hand, for controlling trade for safety, security and duty collection; on the other hand, they need to facilitate trade and not hinder economic activities. With increased trade volumes, also due to growth in eCommerce, customs administrations have turned their attention to the use of data analytics to support their risk management processes. Beyond internal customs data sources, customs is starting to explore the value of business data provided by business infrastructures and platforms. While these external data sources seem to hold valuable information for customs, their data quality, as well as the value they bring to customs, need to be well understood. Building on a case study conducted in the context of the PROFILE research project, this contribution reports findings on data quality and on linking ENS customs data with external data (BigDataMari) and other customs data (import declarations), and discusses specific lessons learned and recommendations for practice. In addition, we develop a data quality and data value evaluation framework, applied to customs, as a high-level framework to help data users evaluate the potential value of external data sources.
From a theoretical perspective, this paper further extends earlier research on the value of data analytics for government supervision by zooming in on data quality.
Search for relativistic magnetic monopoles with the AMANDA-II neutrino telescope: The IceCube Collaboration
We present the search for Cherenkov signatures from relativistic magnetic monopoles in data taken with the AMANDA-II detector, a neutrino telescope deployed in the Antarctic ice cap at the Geographic South Pole. The non-observation of a monopole signal in data collected during the year 2000 improves present experimental limits on the flux of relativistic magnetic monopoles: our flux limit varies between 3.8 × 10⁻¹⁷ cm⁻² s⁻¹ sr⁻¹ (for monopoles moving at the vacuum speed of light) and 8.8 × 10⁻¹⁶ cm⁻² s⁻¹ sr⁻¹ (for monopoles moving at a speed β = v/c = 0.76, just above the Cherenkov threshold in ice). These limits apply to monopoles that are energetic enough to penetrate the Earth and enter the detector from below the horizon. The limit obtained for monopoles reaching the detector from above the horizon is less stringent by roughly an order of magnitude, due to the much larger background from down-going atmospheric muons. This looser limit is however valid for a larger class of magnetic monopoles, since the monopoles are not required to pass through the Earth. © 2010 The Author(s).
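The β = 0.76 threshold quoted above follows from the Cherenkov condition β > 1/n; a quick check, assuming a representative refractive index of about 1.31 for ice (the exact value depends on wavelength):

```python
# Cherenkov threshold check: a charged particle radiates Cherenkov light
# only when its speed beta = v/c exceeds 1/n, where n is the refractive
# index of the medium. For ice, n ~ 1.31 is an assumed representative value.

n_ice = 1.31
beta_threshold = 1.0 / n_ice
print(f"Cherenkov threshold in ice: beta > {beta_threshold:.2f}")  # beta > 0.76
```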
Identifying the value of data analytics in the context of government supervision: Insights from the customs domain
