212 research outputs found

    Ranking-based neural network for ambiguity resolution in ACTS

    Get PDF
    The reconstruction of particle trajectories is a key challenge of particle physics experiments, as it directly impacts particle identification and physics performance while also representing one of the main CPU consumers of many high-energy physics experiments. As the luminosity of particle colliders increases, this reconstruction will become more challenging and resource-intensive. New algorithms are thus needed to address these challenges efficiently. One step of track reconstruction that lends itself to such improvements is ambiguity resolution. In this step, performed at the end of the tracking chain, we select which track candidates should be kept and which must be discarded. The speed of this algorithm is directly driven by the number of track candidates, which can be reduced at the cost of some physics performance. Since this problem is fundamentally one of comparison and classification, we propose a machine-learning-based approach to ambiguity resolution. Using a shared-hits-based clustering algorithm, we can efficiently determine which candidates belong to the same truth particle. Afterwards, we can apply a Neural Network (NN) to compare those tracks and decide which ones are duplicates and which ones should be kept. This approach is implemented within the A Common Tracking Software (ACTS) framework and tested on the Open Data Detector (ODD), a realistic virtual detector similar to a future ATLAS one. This new approach was shown to be 15 times faster than the default ACTS algorithm while removing 32 times more duplicates, down to less than one duplicated track per event.
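
    To make the two-stage idea above concrete, the following minimal Python sketch clusters candidates by shared hits and keeps the best-ranked track per cluster. The track format and the score_track() heuristic standing in for the trained neural network are illustrative assumptions, not the actual ACTS interface.

        # Sketch of shared-hits clustering + ranking, mirroring the two-stage
        # approach described above. The track format and score_track() are
        # illustrative assumptions, not the actual ACTS interface.

        def cluster_by_shared_hits(tracks):
            """Greedily group candidates that share hits; each cluster is
            assumed to correspond to a single truth particle."""
            clusters = []
            for track in sorted(tracks, key=lambda t: len(t["hits"]), reverse=True):
                hits = set(track["hits"])
                for cluster in clusters:
                    if hits & cluster["hits"]:       # any shared hit -> same cluster
                        cluster["tracks"].append(track)
                        cluster["hits"] |= hits
                        break
                else:
                    clusters.append({"hits": hits, "tracks": [track]})
            return clusters

        def score_track(track):
            # Placeholder for the ranking network: the paper uses a NN to
            # compare candidates; a simple heuristic stands in for it here.
            return len(track["hits"]) - 2.0 * track["holes"]

        def resolve(tracks):
            """Keep only the best-ranked candidate of each cluster."""
            return [max(c["tracks"], key=score_track) for c in cluster_by_shared_hits(tracks)]

        tracks = [
            {"hits": [1, 2, 3, 4], "holes": 0},
            {"hits": [2, 3, 4, 5], "holes": 1},   # shares hits with the first -> duplicate
            {"hits": [8, 9, 10],   "holes": 0},
        ]
        print(resolve(tracks))   # two tracks survive, one duplicate removed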

    Potentiality of automatic parameter tuning suite available in ACTS track reconstruction software framework

    Get PDF
    Particle tracking is among the most sophisticated and complex parts of the full event reconstruction chain. A number of reconstruction algorithms work in sequence to build these trajectories from detector hits. Each of these algorithms uses many configuration parameters that need to be fine-tuned to properly account for the detector/experimental setup, the available CPU budget and the desired physics performance. Examples of such parameters include the cut values limiting the search space of the algorithm, the approximations accounting for complex phenomena, and the parameters controlling algorithm performance. The most popular method to tune these parameters is hand-tuning using brute-force techniques. These techniques can be inefficient and raise issues for the long-term maintainability of such algorithms. The open-source track reconstruction software framework known as "A Common Tracking Software (ACTS)" offers an alternative to these parameter tuning techniques through the use of automatic parameter optimization algorithms. ACTS comes equipped with an auto-tuning suite that provides the necessary setup for optimizing the input parameters of track reconstruction algorithms. The user can choose the tunable parameters in a flexible way and define a cost/benefit function for optimizing the full reconstruction chain. The fast execution speed of ACTS allows the user to run several iterations of the optimization within a reasonable time bracket. The performance of these optimizers has been demonstrated on different track reconstruction algorithms, such as trajectory seed reconstruction and selection, particle vertex reconstruction and the generation of simplified material maps, and on different detector geometries such as the Generic Detector and the Open Data Detector (ODD). We aim to bring this approach to all aspects of trajectory reconstruction through a more flexible integration of tunable parameters within ACTS.
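
    The workflow of such a suite can be pictured as a short optimization loop. The Python sketch below uses the Optuna optimizer as an example of this style of derivative-free tuning; the parameter names (cotThetaMax, impactMax), the dummy run_reconstruction() response and the cost function are illustrative assumptions rather than the exact ACTS tuning interface.

        # Sketch of an auto-tuning loop in the style described above, using
        # the Optuna optimizer. Parameter names and run_reconstruction() are
        # illustrative assumptions, not the exact ACTS tuning interface.
        import optuna

        def run_reconstruction(params):
            # Stand-in for running the reconstruction chain and collecting
            # metrics; a dummy closed-form response keeps the sketch runnable.
            eff = 1.0 - abs(params["cotThetaMax"] - 7.0) / 10.0
            fake = abs(params["impactMax"] - 3.0) / 10.0
            return eff, fake

        def objective(trial):
            params = {
                "cotThetaMax": trial.suggest_float("cotThetaMax", 5.0, 10.0),
                "impactMax":   trial.suggest_float("impactMax", 0.1, 25.0),
            }
            eff, fake = run_reconstruction(params)
            # User-defined cost/benefit function: reward efficiency,
            # penalize fake tracks.
            return eff - fake

        study = optuna.create_study(direction="maximize")
        study.optimize(objective, n_trials=100)
        print(study.best_params)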

    Optimization of the rejection of the ttbar background for the search of SuperSymmetry

    No full text
    This paper presents a method to reject the ttbar background, applied to the supersymmetric process in which two gluinos each decay into two b quarks and the LSP (lightest supersymmetric particle). The rejection is performed using a veto on the tau lepton. The report shows that, even though this optimisation could in theory greatly improve the significance, the low efficiency of tau identification limits the gain to a few percent of significance. The report also contains an analysis of the ATLAS trigger.
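
    A back-of-the-envelope estimate illustrates why the gain stays at the few-percent level; all yields and efficiencies below are assumed for illustration and are not the report's actual numbers.

        # Back-of-the-envelope effect of a tau veto on Z = s / sqrt(s + b).
        # All numbers are illustrative, not the report's actual yields.
        from math import sqrt

        s, b = 100.0, 1000.0        # assumed signal and ttbar background yields
        Z0 = s / sqrt(s + b)

        # A perfect veto would cut a large background fraction, but only the
        # identified taus (low tau-ID efficiency) are actually removed.
        b_with_taus = 0.4 * b       # background events containing a tau (assumed)
        tau_id_eff = 0.2            # fraction of those actually identified (assumed)

        b_vetoed = b - tau_id_eff * b_with_taus
        Z1 = s / sqrt(s + b_vetoed)

        print(f"significance: {Z0:.3f} -> {Z1:.3f} (+{100 * (Z1 / Z0 - 1):.1f} %)")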

    ATLAS : Search for Supersymmetry and optimization of the High Granularity timing detector

    No full text
    The Standard Model of particle physics has been extremely successful in describing the elementary particles and their interactions. Nevertheless, there are open questions that remain unanswered. Whether supersymmetry can provide answers to some of these is being studied in 13 TeV proton-proton collisions in the ATLAS experiment at the LHC. This thesis presents a search for pair-produced colored particles decaying into pairs of jets in ATLAS, using data from 2016, 2017 and 2018. Such particles would escape standard Supersymmetry searches due to the absence of missing transverse energy in the final state. Two signatures were considered: stops decaying via an R-parity-violating coupling, and sgluons, the scalar partners of the gluino. In the absence of a signal, an improvement of 200 GeV on the limit on the stop mass is expected. The HL-LHC will increase the delivered integrated luminosity, allowing even higher mass ranges to be probed and improving the precision of Standard Model measurements. The instantaneous luminosity will be increased by a factor 5 and an integrated luminosity of 4000 fb⁻¹ should be reached by the end of the LHC in 2037. A study of the Higgs coupling measurement prospects at the HL-LHC using SFitter is performed. Within both the Delta and EFT frameworks, the increase in luminosity is shown to result in a significant improvement in the precision of the coupling measurements. The High-Granularity Timing Detector will be installed in ATLAS for the HL-LHC. A simulation of the detector that takes into account the timing resolution was developed and used to optimize its layout. The detector performance was studied. More than 80 % of the tracks have their time correctly reconstructed with a resolution of 20 ps before irradiation and 50 ps after. Using the timing information, the electron isolation efficiency is improved by 10 %.
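
    The statistical component of that improvement follows from simple luminosity scaling, since statistical uncertainties shrink roughly as 1/sqrt(L); the baseline luminosity and uncertainty in the sketch below are assumptions for illustration, not SFitter results.

        # Statistical precision scales roughly as 1 / sqrt(integrated
        # luminosity). Baseline values are illustrative assumptions.
        from math import sqrt

        L_run2, L_hllhc = 139.0, 4000.0     # fb^-1 (baseline assumed Run-2-like)
        sigma_run2 = 0.10                   # assumed 10 % coupling uncertainty

        sigma_hllhc = sigma_run2 * sqrt(L_run2 / L_hllhc)
        print(f"{sigma_run2:.1%} -> {sigma_hllhc:.1%}")   # ~10 % -> ~1.9 %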

    A High-Granularity Timing Detector in ATLAS: Performance at the HL-LHC

    No full text
    The large increase of pileup is one of the main experimental challenges for the HL-LHC physics program. A powerful new way to address this challenge is to exploit the time spread of the interactions to distinguish between collisions occurring very close in space but well separated in time. A High-Granularity Timing Detector, based on low-gain avalanche detector technology, is proposed for the ATLAS Phase-II upgrade. Covering the pseudorapidity region between 2.4 and 4.0, with a timing resolution of 30 ps for minimum-ionizing particles, this device will significantly improve the performance in the forward region. High-precision timing greatly improves the track-to-vertex association, leading to a performance similar to that in the central region for the reconstruction of both jets and leptons, as well as for the tagging of heavy-flavor jets. These improvements in object reconstruction performance translate into important sensitivity gains and enhance the reach of the HL-LHC physics program. In addition, the High-Granularity Timing Detector offers unique capabilities for the online and offline luminosity determination.
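
    The benefit of high-precision timing for track-to-vertex association can be sketched as a simple time-compatibility cut, using the 30 ps resolution quoted above; the vertex and track times below are invented for illustration.

        # Sketch of timing-based track-to-vertex association: keep a track
        # on a vertex only if the times agree within the resolution.
        SIGMA_T = 30e-3   # ns: the 30 ps per-track resolution quoted above

        def compatible(t_track, t_vertex, n_sigma=2.0):
            # Accept a track on a vertex only if times agree within n_sigma.
            return abs(t_track - t_vertex) < n_sigma * SIGMA_T

        # Two interactions nearly coincident in z but 150 ps apart in time
        # (all times invented for illustration):
        t_hard_scatter = 0.000                         # ns
        track_times = [0.010, -0.020, 0.160, 0.145]    # ns

        from_hs = [t for t in track_times if compatible(t, t_hard_scatter)]
        print(from_hs)   # the two ~150 ps tracks are rejected as pileup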

    Interdisciplinary Digital Twin Engine InterTwin for calorimeter simulation

    No full text
    Calorimeter shower simulations are computationally expensive, and generative models offer an efficient alternative. However, achieving a balance between accuracy and speed remains a challenge, with the modeling of distribution tails being a key limitation. The invertible generative network CaloINN provides a trade-off between simulation quality and efficiency. The ongoing study introduces a set of post-processing modifications of analysis-level observables aimed at improving the accuracy of distribution tails. As part of the interTwin project, an initiative developing an open-source Digital Twin Engine, we implemented CaloINN within the interTwin AI framework.
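
    One common analysis-level technique for correcting distribution tails is quantile mapping onto a reference sample; since the abstract does not specify the modifications used, the NumPy sketch below is an assumed illustration rather than the study's actual procedure.

        # Illustrative tail correction by quantile mapping: transform a
        # generated observable so its empirical CDF matches a reference
        # sample. An assumed example, not the study's actual procedure.
        import numpy as np

        rng = np.random.default_rng(0)
        reference = rng.standard_normal(100_000)          # target distribution
        generated = 0.8 * rng.standard_normal(100_000)    # tails too narrow

        # Map each generated value through its quantile to the reference.
        q = np.linspace(0.0, 1.0, 1001)
        gen_q, ref_q = np.quantile(generated, q), np.quantile(reference, q)
        corrected = np.interp(generated, gen_q, ref_q)

        print(np.std(generated), np.std(corrected))       # ~0.8 -> ~1.0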
