54 research outputs found
LHCb upgrade software and computing: technical design report
This document reports on the research and development activities carried out in the software and computing domains in view of the upgrade of the LHCb experiment. The implementation of a full software trigger implies major changes in the core software framework, in the event data model, and in the reconstruction algorithms. The increase in data volumes for both real and simulated datasets requires a corresponding scaling of the distributed computing infrastructure. An implementation plan for both domains is presented, together with a risk assessment
Physics case for an LHCb Upgrade II - Opportunities in flavour physics, and beyond, in the HL-LHC era
The LHCb Upgrade II will fully exploit the flavour-physics opportunities of the HL-LHC, and study additional physics topics that take advantage of the forward acceptance of the LHCb spectrometer. The LHCb Upgrade I will begin operation in 2020. Consolidation will occur, and modest enhancements of the Upgrade I detector will be installed, in Long Shutdown 3 of the LHC (2025); these are discussed here. The main Upgrade II detector will be installed in Long Shutdown 4 of the LHC (2030) and will build on the strengths of the current LHCb experiment and the Upgrade I. It will operate at a luminosity up to 2×10³⁴ cm⁻²s⁻¹, ten times that of the Upgrade I detector. New detector components will improve the intrinsic performance of the experiment in certain key areas. An Expression of Interest proposing Upgrade II was submitted in February 2017. The physics case for the Upgrade II is presented here in more depth. CP-violating phases will be measured with precisions unattainable at any other envisaged facility. The experiment will probe b → sℓ⁺ℓ⁻ and b → dℓ⁺ℓ⁻ transitions in both muon and electron decays in modes not accessible at Upgrade I. Minimal flavour violation will be tested with a precision measurement of the ratio B(B⁰ → μ⁺μ⁻)/B(Bs → μ⁺μ⁻). Probing charm CP violation at the 10⁻⁵ level may result in its long-sought discovery. Major advances in hadron spectroscopy will be possible, which will be powerful probes of low-energy QCD. Upgrade II potentially will have the highest sensitivity of all the LHC experiments to the Higgs coupling to charm quarks. Generically, the new-physics mass scale probed, for fixed couplings, will almost double compared with the pre-HL-LHC era; this extended reach for flavour physics is similar to that which would be achieved by the HE-LHC proposal for the energy frontier
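An illustrative aside on the mass-reach statement above (a back-of-the-envelope scaling, not taken from the paper): if new physics at fixed coupling enters a flavour observable through the interference of a dimension-six operator, the shift scales as 1/Λ², so the reach grows with the square root of the measurement precision:

  \delta O \sim \frac{C\, v^2}{\Lambda^2}
  \quad\Longrightarrow\quad
  \Lambda_{\max} \propto \sigma^{-1/2},
  \qquad
  \frac{\Lambda'_{\max}}{\Lambda_{\max}} = \left(\frac{\sigma}{\sigma'}\right)^{1/2} \approx 2
  \quad \text{for} \quad \sigma' \approx \sigma/4

Under this scaling, doubling the probed scale corresponds to a roughly fourfold improvement in precision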
TrackML: A High Energy Physics Particle Tracking Challenge
To attain its ultimate discovery goals, the luminosity of the Large Hadron Collider at CERN will be increased until the number of additional collisions reaches about 200 interactions per bunch crossing, a factor of 7 with respect to the current (2017) luminosity. This will be a challenge for the ATLAS and CMS experiments, in particular for track reconstruction algorithms. In terms of software, the increased combinatorial complexity will have to be harnessed without any increase in budget. To engage the computer science community to contribute new ideas, we organized a Tracking Machine Learning challenge (TrackML), running on the Kaggle platform from March to June 2018 and building on the experience of the successful Higgs Machine Learning challenge in 2014. The data were generated using ACTS, an open-source, accurate tracking simulator featuring a typical all-silicon LHC tracking detector with 10 layers of cylinders and disks. Simulated physics events (Pythia ttbar) overlaid with 200 additional collisions yield typically 10,000 tracks (100,000 hits) per event. The first lessons from the Accuracy phase of the challenge are discussed
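As a concrete illustration of how the challenge dataset is typically consumed, here is a minimal sketch assuming the public trackml helper library distributed alongside the Kaggle dataset; the file path is hypothetical:

import pandas as pd
from trackml.dataset import load_event
from trackml.score import score_event

# Each event is a set of CSV files; load_event returns pandas DataFrames.
hits, cells, particles, truth = load_event('train_100_events/event000001000')

# hits: one row per hit, with x/y/z position and detector addressing.
print(hits[['hit_id', 'x', 'y', 'z', 'volume_id', 'layer_id']].head())

# A trivial (deliberately bad) "solution": assign every hit to one track.
submission = pd.DataFrame({'hit_id': hits.hit_id, 'track_id': 1})

# The official metric weights hits by their importance to the track fit.
print('TrackML score:', score_event(truth, submission))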
Toward the End-to-End Optimization of Particle Physics Instruments with Differentiable Programming: a White Paper
The full optimization of the design and operation of instruments whose functioning relies on the interaction of radiation with matter is a super-human task, given the large dimensionality of the space of possible choices for geometry, detection technology, materials, data acquisition, and information-extraction techniques, and the interdependence of the related parameters. On the other hand, massive potential gains in performance over standard, "experience-driven" layouts are in principle within our reach if an objective function fully aligned with the final goals of the instrument is maximized by means of a systematic search of the configuration space. The stochastic nature of the involved quantum processes makes the modeling of these systems an intractable problem from a classical statistics point of view, yet the construction of a fully differentiable pipeline and the use of deep learning techniques may allow the simultaneous optimization of all design parameters. In this document we lay out our plans for the design of a modular and versatile modeling tool for the end-to-end optimization of complex instruments for particle physics experiments, as well as industrial and medical applications that share the detection of radiation as their basic ingredient. We consider a selected set of use cases to highlight the specific needs of different applications
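To make the "fully differentiable pipeline" idea concrete, here is a minimal sketch in Python with JAX; the smooth surrogate cost model is invented for illustration and is not taken from the white paper:

import jax
import jax.numpy as jnp

def resolution(thickness):
    # Hypothetical surrogate: thicker sensors improve the measurement term
    # but increase multiple scattering; both effects are smooth in the
    # design parameter, so the objective is differentiable end to end.
    measurement = 1.0 / jnp.sqrt(thickness)
    scattering = 0.5 * jnp.sqrt(thickness)
    return measurement + scattering

grad_fn = jax.grad(resolution)  # automatic differentiation of the pipeline

t = 0.1  # initial layer thickness, arbitrary units
for _ in range(500):
    t = t - 0.1 * grad_fn(t)  # plain gradient descent on the design

print(f'optimum near t = {float(t):.2f} (analytic optimum: t = 2)')

In a realistic pipeline the surrogate would be replaced by differentiable models of the data-generation, reconstruction, and inference steps, but the optimization loop keeps exactly this shape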
Exploiting Differentiable Programming for the End-to-end Optimization of Detectors
The coming of age of differentiable programming makes it possible today to create complete computer models of experimental apparatus that include the stochastic data-generation processes, the full modeling of the reconstruction and inference procedures, and a suitably defined objective function, along with the cost of any given detector configuration, geometry, and materials. This enables the end-to-end optimization of the instruments, using techniques developed within computer science that are already widely exploited in fields such as fluid dynamics. The MODE Collaboration has started to consider the problem in its generality, to provide software architectures that may be useful for the optimization of experimental design. These models may be useful in a "human in the middle" system, as they provide information on the relative merit of different configurations as a continuous function of the design choices (see the sketch below). In this short contribution we summarize the plan of studies that has been laid out, and its potential in the long term for the future of experimental studies in fundamental physics
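A minimal sketch of the "human in the middle" usage mentioned above: rather than letting an optimizer run to convergence, one can map a differentiable merit function (again an invented toy surrogate) and its gradient across candidate configurations and hand the table to a designer. All names and the cost model are illustrative:

import jax
import jax.numpy as jnp

def merit(thickness, radius):
    # Toy differentiable merit: measurement vs. scattering trade-off,
    # plus a cost penalty that grows with the detector radius.
    return 1.0 / jnp.sqrt(thickness) + 0.5 * jnp.sqrt(thickness) + 0.1 * radius

grad_t = jax.grad(merit, argnums=0)  # sensitivity to the thickness choice
thicknesses = jnp.linspace(0.5, 4.0, 8)

# Tabulate merit and its local gradient over a grid of candidate designs.
for t in thicknesses:
    m, g = merit(t, 1.0), grad_t(t, 1.0)
    print(f't={float(t):.2f}  merit={float(m):.3f}  d(merit)/dt={float(g):+.3f}')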
HEP Community White Paper on Software Trigger and Event Reconstruction: Executive Summary
Realizing the physics programs of the planned and upgraded high-energy physics (HEP) experiments over the next 10 years will require the HEP community to address a number of challenges in the area of software and computing. For this reason, the HEP software community has engaged in a planning process over the past two years, with the objective of identifying and prioritizing the research and development required to enable the next generation of HEP detectors to reach their full physics potential. The aim is to produce a Community White Paper describing the community strategy and a roadmap for software and computing research and development in HEP for the 2020s. The topics of event reconstruction and software triggers were considered by a joint working group and are summarized together in this document
Observation of Collider Muon Neutrinos with the SND@LHC Experiment
We report the direct observation of muon neutrino interactions with the SND@LHC detector at the Large Hadron Collider. A dataset of proton-proton collisions at √s = 13.6 TeV collected by SND@LHC in 2022 is used, corresponding to an integrated luminosity of 36.8 fb⁻¹. The search is based on information from the active electronic components of the SND@LHC detector, which covers the pseudorapidity region of 7.2 < η < 8.4, inaccessible to the other experiments at the collider. Muon neutrino candidates are identified through their charged-current interaction topology, with a track propagating through the entire length of the muon detector. After selection cuts, 8 νμ interaction candidate events remain, with an estimated background of 0.086 events, yielding a significance of about 7 standard deviations for the observed νμ signal
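A hedged cross-check of the quoted significance, treating the search as a simple one-bin counting experiment with a known background of 0.086 events (a simplification; the collaboration's statistical treatment may differ):

from scipy.stats import norm, poisson

n_obs, b = 8, 0.086

# One-sided p-value: probability of >= 8 events from background alone.
p = poisson.sf(n_obs - 1, b)

# Convert to a Gaussian-equivalent one-sided significance.
z = norm.isf(p)
print(f'p = {p:.2e}, Z = {z:.1f} sigma')  # roughly 7 sigma, as quoted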
Observation of the B⁺ → J/ψ η′ K⁺ decay (vol 2023, 174, 2023)
- …
