
    (N)NLO+NLL’ accurate predictions for plain and groomed 1-jettiness in neutral current DIS

    The possibility to reanalyse data taken by the HERA experiments offers the chance to study modern QCD jet and event-shape observables in deep-inelastic scattering. To address this, we compute resummed and matched predictions for the 1-jettiness distribution in neutral current DIS, with and without grooming the hadronic final state using the soft-drop technique. Our theoretical predictions also account for non-perturbative corrections from hadronisation through parton-to-hadron level transfer matrices extracted from dedicated Monte Carlo simulations with Sherpa. To estimate parameter uncertainties, in particular for the beam-fragmentation modelling, we derive a family of replica tunes to data from the HERA experiments. While NNLO QCD normalisation corrections to the NLO+NLL' prediction are numerically small, hadronisation corrections turn out to be quite sizeable. However, soft-drop grooming significantly reduces the impact of non-perturbative contributions. We supplement our study with hadron-level predictions from Sherpa based on the matching of NLO QCD matrix elements with the parton shower. Good agreement between the predictions from the two calculational methods is observed.
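    The parton-to-hadron transfer-matrix approach mentioned above amounts to a matrix-vector operation on a binned distribution. A minimal sketch, with purely illustrative bin values and migration probabilities (not the ones extracted from Sherpa):

```python
import numpy as np

# Illustrative parton-level 1-jettiness distribution in 4 bins
# (normalised; the numbers are placeholders, not physics results).
parton_level = np.array([0.10, 0.35, 0.40, 0.15])

# Transfer matrix T[i, j]: probability that an event in parton-level
# bin j ends up in hadron-level bin i. Each column sums to 1, so the
# normalisation of the distribution is preserved.
T = np.array([
    [0.80, 0.10, 0.00, 0.00],
    [0.15, 0.75, 0.10, 0.00],
    [0.05, 0.10, 0.80, 0.20],
    [0.00, 0.05, 0.10, 0.80],
])

# Hadron-level prediction: apply the migration matrix bin by bin.
hadron_level = T @ parton_level
print(hadron_level, hadron_level.sum())
```

    Uncertainties from the replica tunes would then be propagated by repeating this step with one transfer matrix per replica.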

    The Alaric parton shower for hadron colliders

    We introduce the Alaric parton shower for simulating QCD radiation at hadron colliders and present numerical results from an implementation in the event generator Sherpa. Alaric provides a consistent framework to quantify certain systematic uncertainties which cannot be eliminated by comparing the parton shower with analytic resummation. In particular, it allows one to study recoil effects away from the soft and collinear limits without the need to change the evolution variable or the splitting functions. We assess the performance of Alaric in Drell-Yan lepton pair and QCD jet production, and present the first multi-jet merging for the new algorithm. Comment: 18 pages, 12 figures

    Electroweak Corrections and EFT Operators in W^+W^- production at the LHC

    We investigate the impact of electroweak corrections and Effective Field Theory operators on W^+W^- production at the Large Hadron Collider (LHC). Utilising the Standard Model Effective Field Theory (SMEFT) framework, we extend the Standard Model by incorporating higher-dimensional operators to encapsulate potential new physics effects. These operators allow for a model-independent approach to data interpretation, essential for probing beyond the Standard Model physics. We generate pseudo data at next-to-leading order in Quantum Chromodynamics and include approximate electroweak corrections. Our analysis focuses on the interplay between these corrections and SMEFT operators at leading order. The inclusion of electroweak corrections is crucial, as they can counteract the effects predicted by SMEFT operators, necessitating precise theoretical and experimental handling. By examining pp \to W^+W^- production, a process sensitive to the electroweak symmetry-breaking mechanism, we demonstrate the importance of these corrections in isolating and interpreting new physics signatures. Our results highlight the significant role of electroweak corrections in enhancing the interpretative power of LHC data and in obtaining reliable constraints on new physics interactions. Comment: 14 pages, 4 figures, 6 tables
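    The interplay described above, where neglected electroweak corrections can mimic or mask a SMEFT contribution, can be illustrated with a toy one-parameter fit. All bin contents, K-factors and the linearised SMEFT term below are invented placeholders; the point is only the mechanism:

```python
import numpy as np

# Toy 3-bin distribution (all numbers are illustrative placeholders).
sm = np.array([100.0, 40.0, 10.0])    # SM prediction at NLO QCD
k_ew = np.array([0.98, 0.92, 0.80])   # approximate EW correction factors
lin = np.array([5.0, 8.0, 12.0])      # linear SMEFT term per unit coefficient

data = sm * k_ew                      # pseudo data include EW corrections
sigma = np.sqrt(data)                 # toy statistical uncertainties

def fit_c(theory_base):
    # Closed-form chi-square minimum for the linear model
    # theory_base + c * lin fitted to the pseudo data.
    w = 1.0 / sigma**2
    return np.sum(w * lin * (data - theory_base)) / np.sum(w * lin**2)

c_no_ew = fit_c(sm)           # EW corrections ignored in the theory
c_with_ew = fit_c(sm * k_ew)  # EW corrections included

print(c_no_ew, c_with_ew)
```

    With EW corrections included the fitted coefficient is compatible with zero; omitting them, the fit absorbs the EW suppression of the tail into a spurious nonzero Wilson coefficient.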

    Design and implementation of an indoor modeling method through crowdsensing

    While automatic modeling and mapping of outdoor environments is well established, the indoor equivalent, the automated generation of building floor plans, poses a challenge. Outdoor localization is commonly available and inexpensive through existing satellite positioning systems such as GPS and Galileo. However, these technologies are not applicable in indoor environments, since a direct line of sight to the satellites orbiting the globe is required. As a substitute, the technical literature contains several proposals for simultaneous indoor localization and mapping (SLAM). These approaches mostly exploit indoor resources such as WiFi access points and the mobile smart devices carried by individuals in the indoor environment. Collecting data from several mobile devices is referred to as crowdsensing. To enable the generation of two-dimensional (2D) as well as three-dimensional (3D) maps, we propose crowdsensing of point clouds, which are 3D data structures of points in space. For localization, we integrate two features of a recently developed mobile device, called Project Tango. Specifically, the Tango platform provides two main technologies for reliable localization, namely motion tracking and area learning. Moreover, Tango-powered devices allow us to collect point clouds through a third technology, called depth perception. In the past few years, spatial data obtained from range imaging has been used to generate indoor maps. Nevertheless, range images are expensive and not always available: the required equipment, e.g. laser range scanners, is both expensive to procure and requires trained personnel for proper setup and operation. In this thesis, we aim to obtain spatial point clouds via crowdsensing. The main idea is to use sensor data scanned by volunteering individuals with easy-to-handle mobile devices.
    Specifically, we rely on the depth-perception capabilities provided by Google Tango-powered tablet computers. A crowdsensing infrastructure assigns scanning tasks to individuals carrying a Tango device. Executing such a task consists of taking scans of, e.g., offices in a public building. The scanning results contain both spatial information about the room layout and its position. Energy consumption on the mobile device is reduced by applying Octree compression to the scanned point clouds, significantly reducing the amount of data that has to be transferred to a back-end server. The back-end is then responsible for assembling the received scans and extracting an indoor model. The modeling process developed in this thesis comprises two phases. First, we extract a basic model from the obtained point clouds, which may contain outliers, inaccuracies and gaps. In the second phase, we refine the model by exploiting formal grammars; to our knowledge, we are the first to exploit formal grammars as a model-fitting tool. We feed the information obtained in the first phase to an indoor grammar developed in the ComNSense project at the University of Stuttgart. The resulting model deviates far less from the ground truth and is more robust against localization aberrations during the scanning process. Thus, instead of scanning multiple point clouds per room, we need only one scan to construct an indoor map. In an evaluation of this process using scans of offices of our department, we were able to reproduce a model that is very close to the ground truth.
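    The octree step trades point-cloud fidelity for transfer volume: occupied leaves at a fixed depth each keep one representative point. A minimal sketch of that idea (a fixed-depth octree is equivalent to a voxel grid with 2^depth cells per axis); this is an illustration, not the thesis's actual compression codec, and the synthetic cloud and bounds are invented:

```python
import numpy as np

def octree_downsample(points, depth, bounds_min, bounds_max):
    """Keep one centroid per occupied octree leaf at the given depth."""
    n_cells = 2 ** depth
    size = (bounds_max - bounds_min) / n_cells
    # Integer leaf index per point along each axis.
    idx = np.floor((points - bounds_min) / size).astype(int)
    idx = np.clip(idx, 0, n_cells - 1)
    # Flatten the 3D leaf index into a single key per point.
    keys = idx[:, 0] * n_cells**2 + idx[:, 1] * n_cells + idx[:, 2]
    out = [points[keys == key].mean(axis=0)  # centroid of each leaf
           for key in np.unique(keys)]
    return np.array(out)

rng = np.random.default_rng(0)
cloud = rng.uniform(0.0, 4.0, size=(10_000, 3))   # synthetic office scan
compressed = octree_downsample(cloud, depth=3,
                               bounds_min=np.zeros(3),
                               bounds_max=np.full(3, 4.0))
print(len(cloud), "->", len(compressed))   # far fewer points to transmit
```

    At depth 3 there are at most 8^3 = 512 occupied leaves, so the transmitted cloud is bounded regardless of how densely the device sampled the room.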

    Decoding Higgs Boson Branching Ratios from event shapes

    This contribution will discuss a novel strategy for the simultaneous measurement of Higgs boson branching ratios into gluons and light quarks at a future lepton collider operating in the Higgs-factory mode. The method is based on template fits to global event-shape observables, in particular fractional energy correlations, thereby exploiting differences in the QCD radiation patterns of quarks and gluons. This approach is orthogonal to traditional tagging methods, which rely mainly on displaced vertices, and allows limits on the Higgs-boson branching ratios to gluons and to light quarks to be extracted separately. Additionally, state-of-the-art calculations for the relevant observables are commented on.

    The ALARIC parton shower

    Parton showers are important tools in the event generation chain for present and future colliders. Recently, their formally achieved accuracy has come under extended scrutiny. This contribution will present a novel take on dipole parton showers, resulting in the design of a new parton shower called ALARIC that is implemented in the SHERPA framework. Its resummation properties, including analytic and numerical proofs of its NLL accuracy, will be discussed alongside the latest developments.

    Precision calculations for groomed event shapes at HERA

    The possibility to reanalyse data taken by the HERA experiments offers the chance to study modern QCD jet and event-shape observables in deep-inelastic scattering. In this contribution we present resummed and matched predictions for the groomed invariant-mass event shape in neutral-current DIS, including the effect of grooming the hadronic final state using the soft-drop technique. Non-perturbative corrections from hadronisation are taken into account through parton-to-hadron level transfer matrices extracted from dedicated Monte Carlo simulations with SHERPA, including uncertainties extracted from replica tunes to data from the HERA experiments. Comment: 10 pages, 5 figures, contribution to the 31st International Workshop on Deep Inelastic Scattering and Related Subjects (DIS2024)

    Measuring hadronic Higgs boson branching ratios at future lepton colliders

    We present a novel strategy for the simultaneous measurement of Higgs-boson branching ratios into gluons and light quarks at a future lepton collider operating in the Higgs-factory mode. Our method is based on template fits to global event-shape observables, and in particular fractional energy correlations, thereby exploiting differences in the QCD radiation patterns of quarks and gluons. In a constrained fit of the deviations of the light-flavour hadronic Higgs-boson branching ratios from their Standard Model expectations, based on an integrated luminosity of 5 ab^-1, we obtain 68% confidence level limits of μ_gg = 1 ± 0.05 and μ_qq̄ < 21.
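    The template-fit strategy above can be sketched in miniature: build an event-shape histogram from per-process templates scaled by signal strengths, then scan the signal strengths for the chi-square minimum. Templates, yields and the flat "rest" component below are invented placeholders, not the analysis's actual inputs:

```python
import numpy as np

# Toy event-shape templates for H -> gg, H -> qq̄ and all other
# contributions (shapes and yields are illustrative placeholders).
t_gluon = np.array([10.0, 30.0, 40.0, 20.0])
t_quark = np.array([40.0, 35.0, 15.0, 10.0])
t_rest  = np.array([25.0, 25.0, 25.0, 25.0])

# Asimov-style pseudo data at the Standard Model point (mu_gg = 1,
# with no anomalous light-quark contribution on top of t_rest).
data = 1.0 * t_gluon + 0.0 * t_quark + t_rest

def chi2(mu_gg, mu_qq):
    model = mu_gg * t_gluon + mu_qq * t_quark + t_rest
    return np.sum((data - model) ** 2 / model)

# Brute-force scan over the two signal strengths.
grid = np.linspace(0.0, 2.0, 201)
best = min(((chi2(g, q), g, q) for g in grid for q in grid))
print(best)
```

    A real analysis would replace the brute-force scan with a proper likelihood minimisation and derive the quoted confidence intervals from the chi-square profile around the minimum; the distinct gluon and quark template shapes are what make the two strengths separable.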