Analysis of the Web Graph Aggregated by Host and Pay-Level Domain
In this paper the web is analyzed as a graph aggregated by host and by pay-level
domain (PLD). The publicly available web graph datasets were released by the
Common Crawl Foundation and are based on a web crawl performed during May, June,
and July 2017. The host graph has 1.3 billion nodes and 5.3 billion arcs. The
PLD graph has 91 million nodes and 1.1 billion arcs. We study the distributions
of degree and of the sizes of strongly and weakly connected components
(SCC/WCC), focusing on power-law detection using statistical methods. The
statistical plausibility of the power-law model is compared with that of several
alternative distributions. While there is no evidence of power-law tails at the
host level, they emerge under PLD aggregation for the indegree, SCC-size, and
WCC-size distributions. Finally, we analyze distance-related features by
studying the cumulative distributions of the shortest path lengths, and give an
estimate of the diameters of the graphs.
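The statistical power-law detection mentioned in the abstract is usually based on maximum-likelihood fitting of the tail exponent. A minimal sketch of the continuous MLE (the Hill-type estimator), run on synthetic Pareto data rather than the Common Crawl graphs, might look like this; the function name and the parameter values are illustrative:

```python
import math
import random

def fit_power_law_alpha(samples, xmin):
    """Continuous maximum-likelihood estimate of the tail exponent:
    alpha_hat = 1 + n / sum(ln(x_i / xmin)), over samples >= xmin."""
    tail = [x for x in samples if x >= xmin]
    log_sum = sum(math.log(x / xmin) for x in tail)
    return 1.0 + len(tail) / log_sum

# Synthetic Pareto(alpha = 2.5) data via inverse-transform sampling.
random.seed(42)
alpha_true, xmin = 2.5, 1.0
data = [xmin * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
        for _ in range(10_000)]

alpha_hat = fit_power_law_alpha(data, xmin)
print(f"estimated alpha = {alpha_hat:.2f}")  # close to 2.5
```

In practice the choice of `xmin` and the comparison against alternative distributions (log-normal, exponential, etc.) are the hard parts; the abstract's "statistical plausibility" comparison refers to exactly that kind of model-selection step.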
Time-resolved investigation of the compensation process of pulsed ion beams
A LEBT system consisting of an ion source, two solenoids, and a diagnostic section has been set up to investigate the space charge compensation process due to residual gas ionization [1] and to study the rise of compensation experimentally. To gain the radial beam potential distribution, time-resolved measurements of the residual gas ion energy distribution were carried out using a Hughes-Rojansky analyzer [2,3]. To measure the radial density profile of the ion beam, a CCD camera performed time-resolved measurements, which allow an estimation of the rise time of compensation. Furthermore, the dynamic effect of the space charge compensation on the beam transport was shown. A numerical simulation under the assumption of self-consistent states [4] of the beam plasma has been used to determine plasma parameters such as the radial density profile and the temperature of the electrons. The acquired data show that the theoretically estimated rise time of space charge compensation, neglecting electron losses, is shorter than the build-up time determined experimentally. An interpretation of the achieved results is given.
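The theoretically estimated rise time referred to above is commonly taken as the time for residual gas ionization to supply one compensating electron per beam particle, tau ≈ 1 / (n_g · sigma_i · v_b), neglecting electron losses. A rough numerical sketch follows; the pressure, cross section, and beam energy below are assumed illustrative values, not the parameters of this experiment:

```python
import math

# Assumed illustrative parameters (NOT the experiment's values):
k_B = 1.380649e-23          # J/K, Boltzmann constant
p = 1e-3                    # Pa, residual gas pressure (~1e-5 mbar)
T = 300.0                   # K, gas temperature
sigma_i = 1e-20             # m^2, assumed ionization cross section
E_beam = 10e3 * 1.602e-19   # J, 10 keV proton beam
m_p = 1.673e-27             # kg, proton mass

n_g = p / (k_B * T)                  # residual gas density, m^-3
v_b = math.sqrt(2 * E_beam / m_p)    # proton velocity, m/s

# Compensation build-up time, neglecting electron losses:
tau = 1.0 / (n_g * sigma_i * v_b)
print(f"n_g = {n_g:.2e} m^-3, v_b = {v_b:.2e} m/s, tau = {tau * 1e6:.0f} us")
```

With these assumed numbers the estimate lands in the range of hundreds of microseconds; since electron losses are neglected, the experimentally determined build-up time is expected to be longer, consistent with the abstract's conclusion.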
Investigation of the focus shift due to compensation process for low energy ion beam transport
In magnetic Low Energy Beam Transport (LEBT) sections, space charge compensation helps to enhance the transportable beam current and to reduce emittance growth due to space charge forces. For pulsed beams, the time necessary to establish space charge compensation is of great interest for beam transport. Particularly with regard to beam injection into the first accelerator section (e.g. an RFQ), investigation of the shift of the beam focus due to space charge compensation is very important. The achieved results help to avoid a mismatch into the first RFQ. To investigate the space charge compensation due to residual gas ionization, time-resolved measurements using pulsed ion beams were performed at the LEBT system at the IAP and at the CEA-Saclay injection line. A residual gas ion energy analyser (RGIA) equipped with a channeltron was used to measure the potential distribution as a function of time in order to estimate the rise time of compensation. For time-resolved measurements (delta t_min = 50 ns) of the radial density profile of the ion beam, a CCD camera was applied. The measured data were used in a numerical simulation of self-consistent equilibrium states of the beam plasma [1] to determine plasma parameters such as the density, the temperature, and the kinetic and potential energy of the compensation electrons as a function of time. Measurements were done using focused proton beams (10 keV, 2 mA at the IAP and 92 keV, 62 mA at CEA-Saclay) to get a better understanding of the influence of the compensation process. An interpretation of the acquired data and the achieved results will be presented.
The Need for a Versioned Data Analysis Software Environment
Scientific results in high-energy physics and in many other fields often rely
on complex software stacks. In order to support reproducibility and scrutiny of
the results, it is good practice to use open source software and to cite
software packages and versions. With ever-growing complexity of scientific
software on one side and with IT life-cycles of only a few years on the other
side, however, it turns out that despite source code availability the setup and
the validation of a minimal usable analysis environment can easily become
prohibitively expensive. We argue that there is a substantial gap between
merely having access to versioned source code and the ability to create a data
analysis runtime environment. In order to preserve all the different variants
of the data analysis runtime environment, we developed a snapshotting file
system optimized for software distribution. We report on our experience in
preserving the analysis environment for high-energy physics such as the
software landscape used to discover the Higgs boson at the Large Hadron
Collider.
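The core idea behind a snapshotting file system optimized for software distribution can be illustrated as a content-addressed store: each snapshot is only a manifest mapping paths to content hashes, so files unchanged between software versions are stored exactly once. This is a toy sketch of that principle, not the authors' implementation; all names below are hypothetical:

```python
import hashlib

class SnapshotStore:
    """Toy content-addressed store: blobs are kept once, keyed by hash;
    each snapshot is just a mapping from path to content hash."""

    def __init__(self):
        self.blobs = {}      # hash -> file content
        self.snapshots = {}  # snapshot name -> {path: hash}

    def snapshot(self, name, tree):
        manifest = {}
        for path, content in tree.items():
            digest = hashlib.sha256(content).hexdigest()
            self.blobs.setdefault(digest, content)  # dedup identical content
            manifest[path] = digest
        self.snapshots[name] = manifest

    def read(self, name, path):
        return self.blobs[self.snapshots[name][path]]

store = SnapshotStore()
store.snapshot("v1", {"bin/app": b"binary-1", "lib/core.so": b"lib-1"})
store.snapshot("v2", {"bin/app": b"binary-2", "lib/core.so": b"lib-1"})

# The unchanged library is deduplicated across both snapshots:
print(len(store.blobs))  # 3 blobs stored for 4 logical files
```

Because every snapshot stays addressable by its manifest, any historical variant of an analysis environment can be reconstructed exactly, which is the preservation property the paper argues for.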
