
    Measurement of the inclusive and dijet cross-sections of b-jets in pp collisions at sqrt(s) = 7 TeV with the ATLAS detector

    The inclusive and dijet production cross-sections have been measured for jets containing b-hadrons (b-jets) in proton-proton collisions at a centre-of-mass energy of sqrt(s) = 7 TeV, using the ATLAS detector at the LHC. The measurements use data corresponding to an integrated luminosity of 34 pb^-1. The b-jets are identified using either a lifetime-based method, where secondary decay vertices of b-hadrons in jets are reconstructed using information from the tracking detectors, or a muon-based method, where the presence of a muon is used to identify semileptonic decays of b-hadrons inside jets. The inclusive b-jet cross-section is measured as a function of transverse momentum in the range 20 < pT < 400 GeV and rapidity in the range |y| < 2.1. The bbbar-dijet cross-section is measured as a function of the dijet invariant mass in the range 110 < m_jj < 760 GeV, the azimuthal angle difference between the two jets, and the angular variable chi in two dijet mass regions. The results are compared with next-to-leading-order QCD predictions. Good agreement is observed between the measured cross-sections and the predictions obtained using POWHEG + Pythia. MC@NLO + Herwig shows good agreement with the measured bbbar-dijet cross-section; however, it does not reproduce the measured inclusive cross-section well, particularly for central b-jets with large transverse momenta. Comment: 10 pages plus author list (21 pages total), 8 figures, 1 table; final version published in the European Physical Journal.
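    For readers unfamiliar with the dijet observables mentioned above, the conventional definitions are given below; these are the standard dijet variables built from the two leading b-jets and are added here for convenience, not quoted from the abstract itself.

```latex
% Conventional dijet observables for two jets with four-momenta
% (E_1, \vec{p}_1), (E_2, \vec{p}_2) and rapidities y_1, y_2:
m_{jj} = \sqrt{(E_1 + E_2)^2 - |\vec{p}_1 + \vec{p}_2|^2},
\qquad
\chi = e^{|y_1 - y_2|}
```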

    Anaerobic animals from an ancient, anoxic ecological niche

    Tiny marine animals that complete their life cycle in the total absence of light and oxygen are reported by Roberto Danovaro and colleagues in this issue of BMC Biology. These fascinating animals are new members of the phylum Loricifera and possess mitochondria that in electron micrographs look very much like hydrogenosomes, the H2-producing mitochondria found among several unicellular eukaryotic lineages. The discovery of metazoan life in a permanently anoxic and sulphidic environment provides a glimpse of what a good part of Earth's past ecology might have been like in 'Canfield oceans', before the rise of deep marine oxygen levels and the appearance of the first large animals in the fossil record roughly 550-600 million years ago. The findings underscore the evolutionary significance of anaerobic deep-sea environments and the anaerobic lifestyle among mitochondrion-bearing cells. They also testify that a fuller understanding of eukaryotic and metazoan evolution will come from the study of modern anoxic and hypoxic habitats.

    Search for CP violation in D0 → KS0KS0 decays in proton-proton collisions at √s=13TeV

    A search is reported for charge-parity (CP) violation in D0 → K0S K0S decays, using data collected in proton-proton collisions at √s = 13 TeV recorded by the CMS experiment in 2018. The analysis uses a dedicated data set that corresponds to an integrated luminosity of 41.6 fb−1, which consists of about 10 billion events containing a pair of b hadrons, nearly all of which decay to charm hadrons. The flavor of the neutral D meson is determined by the pion charge in the reconstructed decays D*+ → D0 π+ and D*− → D̄0 π−. The CP asymmetry in D0 → K0S K0S is measured to be ACP(K0S K0S) = (6.2 ± 3.0 ± 0.2 ± 0.8)%, where the three uncertainties represent the statistical uncertainty, the systematic uncertainty, and the uncertainty in the measurement of the CP asymmetry in the D0 → K0S π+π− decay. This is the first CP asymmetry measurement by CMS in the charm sector, as well as the first to utilize a fully hadronic final state.
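    For reference, the asymmetry quoted above follows the standard definition of a (time-integrated) CP asymmetry for a D0 final state f, here f = K0S K0S; the definition is not spelled out in the abstract and is added here for convenience.

```latex
% Standard time-integrated CP asymmetry for a D0 decay to a final state f:
A_{CP}(f) = \frac{\Gamma(D^{0}\to f) - \Gamma(\bar{D}^{0}\to f)}
                 {\Gamma(D^{0}\to f) + \Gamma(\bar{D}^{0}\to f)}
```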

    Accuracy versus precision in boosted top tagging with the ATLAS detector

    The identification of top quark decays where the top quark has a large momentum transverse to the beam axis, known as top tagging, is a crucial component in many measurements of Standard Model processes and searches for beyond the Standard Model physics at the Large Hadron Collider. Machine learning techniques have improved the performance of top tagging algorithms, but the size of the systematic uncertainties for all proposed algorithms has not been systematically studied. This paper presents the performance of several machine-learning-based top tagging algorithms on a dataset constructed from simulated proton-proton collision events measured with the ATLAS detector at √s = 13 TeV. The systematic uncertainties associated with these algorithms are estimated through an approximate procedure that is not meant to be used in a physics analysis, but is appropriate for the level of precision required for this study. The most performant algorithms are found to have the largest uncertainties, motivating the development of methods to reduce these uncertainties without compromising performance. To enable such efforts in the wider scientific community, the datasets used in this paper are made publicly available.
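    As context for "performance" in the discussion above: top taggers are usually compared by their background rejection at a fixed signal-efficiency working point. The sketch below computes that figure of merit on toy classifier scores; the score distributions and the 50% working point are illustrative assumptions, not values from the paper.

```python
import numpy as np

def background_rejection(sig_scores, bkg_scores, signal_eff=0.5):
    """Background rejection (1 / background efficiency) at a fixed
    signal-efficiency working point, a common figure of merit for taggers."""
    # Score threshold that keeps the requested fraction of signal jets.
    threshold = np.quantile(sig_scores, 1.0 - signal_eff)
    bkg_eff = np.mean(bkg_scores > threshold)
    return np.inf if bkg_eff == 0 else 1.0 / bkg_eff

# Toy scores (invented for illustration): signal peaks near 1, background near 0.
rng = np.random.default_rng(0)
sig = rng.beta(5, 2, size=100_000)
bkg = rng.beta(2, 5, size=100_000)
print(f"Rejection at 50% signal efficiency: {background_rejection(sig, bkg):.1f}")
```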

    Search for light long-lived particles in pp collisions at √s = 13 TeV using displaced vertices in the ATLAS inner detector

    A search for long-lived particles (LLPs) using 140 fb−1 of pp collision data at √s = 13 TeV recorded by the ATLAS experiment at the LHC is presented. The search targets LLPs with masses between 5 and 55 GeV that decay hadronically in the ATLAS inner detector. Benchmark models with LLP pair production from exotic decays of the Higgs boson and models featuring long-lived axionlike particles (ALPs) are considered. No significant excess above the expected background is observed. Upper limits are placed on the branching ratio of the Higgs boson to pairs of LLPs, the cross section for ALPs produced in association with a vector boson, and, for the first time, on the branching ratio of the top quark to an ALP and a u/c quark.

    Measurement of the transverse momentum distribution of Z/γ* bosons in proton-proton collisions at √s = 7 TeV with the ATLAS detector

    A measurement of the Z/γ* transverse momentum (pT^Z) distribution in proton-proton collisions at √s = 7 TeV is presented using Z/γ* → e+e− and Z/γ* → μ+μ− decays collected with the ATLAS detector in data sets with integrated luminosities of 35 pb−1 and 40 pb−1, respectively. The normalized differential cross sections are measured separately for the electron and muon decay channels, as well as for their combination, up to pT^Z of 350 GeV for invariant dilepton masses 66 GeV < mℓℓ < 116 GeV. The measurement is compared to predictions of perturbative QCD and various event generators. The prediction of resummed QCD combined with fixed-order perturbative QCD is found to be in good agreement with the data. Funding: United States Dept. of Energy; National Science Foundation (U.S.); Brookhaven National Laboratory; European Organization for Nuclear Research.
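    The "normalized differential cross sections" referred to above follow the usual convention of dividing the spectrum by the fiducial cross section; the expression below is added here for clarity and is not quoted from the abstract.

```latex
% Normalized differential cross section in the Z/gamma* transverse momentum,
% within the dilepton mass window used in the measurement:
\frac{1}{\sigma}\,\frac{d\sigma}{dp_{\mathrm{T}}^{Z}},
\qquad 66~\mathrm{GeV} < m_{\ell\ell} < 116~\mathrm{GeV}
```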

    Performance of the ATLAS Trigger System in 2010

    Proton-proton collisions at √s = 7 TeV and heavy-ion collisions at √s_NN = 2.76 TeV were produced by the LHC and recorded using the ATLAS experiment's trigger system in 2010. The LHC is designed with a maximum bunch-crossing rate of 40 MHz, and the ATLAS trigger system is designed to record approximately 200 of these collisions per second. The trigger system selects events by rapidly identifying signatures of muon, electron, photon, tau lepton, jet, and B meson candidates, as well as using global event signatures, such as missing transverse energy. An overview of the ATLAS trigger system, the evolution of the system during 2010, and the performance of the trigger system components and selections based on the 2010 collision data are shown. A brief outline of plans for the trigger system in 2011 is presented.
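    As a back-of-the-envelope check of the rate reduction implied by the two numbers quoted above (a 40 MHz maximum bunch-crossing rate in, roughly 200 recorded events per second out), a minimal sketch:

```python
# Overall online rejection factor implied by the rates in the abstract above.
bunch_crossing_rate_hz = 40e6   # LHC design bunch-crossing rate
recorded_rate_hz = 200          # approximate ATLAS recording rate in 2010
rejection_factor = bunch_crossing_rate_hz / recorded_rate_hz
print(f"Overall rejection factor: {rejection_factor:.0e}")  # ~2e5
```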

    The CMS Statistical Analysis and Combination Tool: Combine

    This paper describes the Combine software package used for statistical analyses by the CMS Collaboration. The package, originally designed to perform searches for a Higgs boson and the combined analysis of those searches, has evolved to become the statistical analysis tool presently used in the majority of measurements and searches performed by the CMS Collaboration. It is not specific to the CMS experiment, and this paper is intended to serve as a reference for users outside of the CMS Collaboration, providing an outline of the most salient features and capabilities. Readers can run Combine and reproduce the examples provided in this paper using a publicly available container image. Since the package is constantly evolving to meet the demands of ever-increasing data sets and analysis sophistication, this paper cannot cover all details of Combine; however, the online documentation referenced within this paper provides an up-to-date and complete user guide. Funding: CERN (European Organization for Nuclear Research); STFC (United Kingdom); the Marie-Curie programme, the European Research Council, and Horizon 2020 grants, contract Nos. 675440, 724704, 752730, 758316, 765710, 824093, 101115353, 101002207, and COST Action CA16108 (European Union); the Leventis Foundation; the Alfred P. Sloan Foundation.
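    Combine itself is driven from the command line using datacards, which the abstract above does not describe. Purely as a conceptual illustration of the kind of statistical model such a tool builds, here is a minimal single-bin counting-experiment likelihood in Python; the event counts and scan range are invented for illustration and this is not Combine's interface or API.

```python
import numpy as np
from scipy.stats import poisson

# Single-bin counting experiment: the simplest statistical model of the kind
# a tool like Combine constructs from a datacard.  Numbers are illustrative.
n_obs = 8          # observed events (hypothetical)
b = 5.0            # expected background (hypothetical)
s_nominal = 3.0    # expected signal at nominal cross section (hypothetical)

def nll(mu):
    """Negative log-likelihood as a function of the signal-strength modifier mu."""
    return -poisson.logpmf(n_obs, mu * s_nominal + b)

# Likelihood scan of the signal strength.
mus = np.linspace(0.0, 5.0, 501)
nlls = np.array([nll(mu) for mu in mus])
mu_hat = mus[np.argmin(nlls)]
print(f"Best-fit signal strength: mu_hat = {mu_hat:.2f}")
```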

    Portable Acceleration of CMS Computing Workflows with Coprocessors as a Service

    A preprint version of the article is available at arXiv:2402.15366v2 [physics.ins-det], https://arxiv.org/abs/2402.15366. All figures and tables can be found at https://cms-results.web.cern.ch/cms-results/public-results/publications/MLG-23-001 (CMS Public Pages). Report numbers: CMS-MLG-23-001, CERN-EP-2023-303. Data availability: no datasets were generated or analyzed during the current study. Computing demands for large scientific experiments, such as the CMS experiment at the CERN LHC, will increase dramatically in the next decades. To complement the future performance increases of software running on central processing units (CPUs), explorations of coprocessor usage in data processing hold great potential and interest. Coprocessors are a class of computer processors that supplement CPUs, often improving the execution of certain functions due to architectural design choices. We explore the approach of Services for Optimized Network Inference on Coprocessors (SONIC) and study the deployment of this as-a-service approach in large-scale data processing. In the studies, we take a data processing workflow of the CMS experiment and run the main workflow on CPUs, while offloading several machine learning (ML) inference tasks onto either remote or local coprocessors, specifically graphics processing units (GPUs). With experiments performed at Google Cloud, the Purdue Tier-2 computing center, and combinations of the two, we demonstrate the acceleration of these ML algorithms individually on coprocessors and the corresponding throughput improvement for the entire workflow. This approach can be easily generalized to different types of coprocessors and deployed on local CPUs without decreasing the throughput performance. We emphasize that the SONIC approach enables high coprocessor usage and makes workflows portable across different types of coprocessors. Funding: SCOAP3; open access funding provided by CERN (European Organization for Nuclear Research).
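    A schematic sketch of the as-a-service pattern described above: the CPU-resident workflow keeps processing events while ML inference calls are dispatched asynchronously to a (remote or local) coprocessor server. The functions remote_infer and cpu_work are hypothetical stand-ins, not the actual SONIC, CMSSW, or inference-server APIs.

```python
import concurrent.futures as cf
import numpy as np

def remote_infer(batch: np.ndarray) -> np.ndarray:
    # Placeholder for a network call to a GPU inference server (hypothetical).
    return batch.sum(axis=1)  # pretend "model output"

def cpu_work(event: np.ndarray) -> float:
    # Placeholder for the non-ML part of the reconstruction workflow (hypothetical).
    return float(np.linalg.norm(event))

events = [np.random.rand(16) for _ in range(1000)]

with cf.ThreadPoolExecutor(max_workers=8) as pool:
    # Offload inference for all events asynchronously...
    futures = [pool.submit(remote_infer, ev[None, :]) for ev in events]
    # ...while the CPU continues with the rest of the workflow in the meantime.
    cpu_results = [cpu_work(ev) for ev in events]
    ml_results = [f.result() for f in futures]

print(len(cpu_results), len(ml_results))
```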