40 research outputs found
Bayesian Methodologies with pyhf
bayesian_pyhf is a Python package that allows for the parallel Bayesian and frequentist evaluation of multi-channel binned statistical models. The Python library pyhf is used to build such models according to the HistFactory framework and already includes many frequentist inference methodologies. The pyhf-built models are then used as the data-generating model for Bayesian inference and evaluated with the Python library PyMC. Based on Markov Chain Monte Carlo (MCMC) methods, PyMC allows for Bayesian modelling and, together with the arviz library, offers a wide range of Bayesian analysis tools.
Comment: 8 pages, 3 figures, 1 listing. Contribution to the Proceedings of the 26th International Conference on Computing in High Energy and Nuclear Physics (CHEP 2023).
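The workflow can be illustrated with a minimal sketch (this is not the bayesian_pyhf package itself): a HistFactory model is built with pyhf and its log-likelihood is wrapped as a black-box density that PyMC can sample. The bin contents, the uniform and normal priors, and the sampler choice below are illustrative assumptions; bayesian_pyhf derives its priors from the model's parameter bounds and auxiliary measurements.

import numpy as np
import pyhf
import pymc as pm
import pytensor.tensor as pt
from pytensor.compile.ops import as_op

# Single-channel counting model: a signal-strength parameter "mu" plus one
# uncorrelated background uncertainty per bin, built with pyhf's helper.
model = pyhf.simplemodels.uncorrelated_background(
    signal=[5.0, 10.0], bkg=[50.0, 52.0], bkg_uncertainty=[5.0, 7.0]
)
observations = [53.0, 65.0]
data = observations + model.config.auxdata  # main + auxiliary measurements

@as_op(itypes=[pt.dvector], otypes=[pt.dscalar])
def hf_logpdf(pars):
    # pyhf evaluates the full HistFactory log-likelihood for a parameter vector
    # ordered as in model.config.par_order (here: mu, then the two gammas).
    return np.asarray(model.logpdf(pars, data)[0], dtype=np.float64)

with pm.Model():
    # Illustrative priors (an assumption of this sketch).
    mu = pm.Uniform("mu", lower=0.0, upper=5.0)
    gammas = pm.Normal("uncorr_bkguncrt", mu=1.0, sigma=0.2, shape=2)
    pars = pt.concatenate([pt.stack([mu]), gammas])
    pm.Potential("hf_likelihood", hf_logpdf(pars))
    # The wrapped likelihood exposes no gradients, so a gradient-free step is used.
    idata = pm.sample(draws=1000, tune=1000, step=pm.Slice(), chains=2)

The resulting idata object can then be inspected with arviz, e.g. arviz.summary(idata) or arviz.plot_posterior(idata), which is the role the abstract assigns to the arviz library.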
Study of the worm-gear-T function g1T with semi-inclusive DIS data
The worm-gear-T function parametrizes the probability of finding a longitudinally polarized quark inside a transversely polarized hadron. We extract it from data on polarized semi-inclusive deep-inelastic scattering measured at COMPASS and HERMES. As a theoretical model, we use a Wandzura-Wilczek-type approximation at next-to-leading order. We find that, at present, the data quality is insufficient to determine the worm-gear-T function faithfully. We also provide predictions for the transverse single-spin asymmetry associated with the worm-gear-T function, which could be measured in weak-boson production at STAR.
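For orientation, the Wandzura-Wilczek-type approximation referred to above is commonly written as a relation between the first transverse-momentum moment of the worm-gear-T function and the helicity distribution g1. This is a leading-order sketch of the standard form; the next-to-leading-order implementation used in the paper carries corrections not shown here.

\[
  g_{1T}^{(1)\,q}(x) \;\overset{\text{WW-type}}{\approx}\; x \int_x^1 \frac{\mathrm{d}y}{y}\, g_1^q(y)
\]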
Antibiotic-Containing Bone Substitute in Major Hip Surgery: A Long-Term Gentamicin Elution Study
Objectives: The objective is to present the antibiotic elution from a locally implanted gentamicin-containing hydroxyapatite and calcium sulphate bone substitute, with an extended follow-up of 30 days. We also compare the pharmacokinetics of the ceramic bone substitute with a published study on gentamicin-containing poly(methyl methacrylate) (PMMA) bone cement used in primary total hip arthroplasty. Methods: Gentamicin release was measured in the urine for one month and in the serum for 4 days in 10 patients operated on for trochanteric hip fractures and 10 patients undergoing uncemented hip revisions. Seventeen patients were followed up at one year and 3 patients at 6 months. Results and Discussion: The gentamicin concentrations measured in serum were low, approximately 100 times lower than in urine during the first days, indicating high local concentrations at the implant site. The elution from the biphasic bone substitute showed a stronger burst and higher gentamicin concentrations during the first week than reported for PMMA used in hip arthroplasty. Moreover, complete gentamicin elution from the bone substitute was obtained after 30 days, whereas for the PMMA cement gentamicin levels below the minimum inhibitory concentration (MIC) were still present in urine 60 days after surgery. No infections were detected. Conclusions: A new biphasic bone substitute containing antibiotics could potentially be used to prevent infection in patients treated for trochanteric hip fractures or uncemented hip revisions. The gentamicin elution from the bone substitute is efficient, with high initial local gentamicin concentrations and complete release at 30 days.
Constructing model-agnostic likelihoods, a method for the reinterpretation of particle physics results
Experimental High Energy Physics has entered an era of precision measurements. However, measurements of many of the accessible processes assume that the final states' underlying kinematic distribution is the same as the Standard Model prediction. This assumption introduces an implicit model dependency into the measurement, making the reinterpretation of the experimental analysis complicated without reanalysing the underlying data. We present a novel reweighting method for the reinterpretation of particle physics measurements: the Standard Model templates are reweighted according to the kinematic signal distributions of alternative theoretical models before the statistical analysis is performed. The generality of this method allows us to perform statistical inference in the space of theoretical parameters while assuming different kinematic distributions according to a beyond-Standard-Model prediction. We implement our method as an extension to the pyhf software and interface it with the EOS software, which allows us to perform flavor physics phenomenology studies. Furthermore, we argue that, beyond the pyhf or HistFactory likelihood specification, only minimal information is necessary to make a likelihood model-agnostic and hence easily reinterpretable. We showcase that publishing such likelihoods is crucial for a full exploitation of experimental results.
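The reweighting step can be sketched as follows (a toy numpy illustration under stated assumptions, not the actual pyhf extension): each bin of a Standard Model signal template is rescaled by the ratio of an alternative model's kinematic distribution to the SM one, and the reweighted template is then used as the signal in the statistical model. The bin contents, the toy shape functions, and the parameter c_new_physics are illustrative assumptions.

import numpy as np
import pyhf

# Kinematic variable binning for the signal template (e.g. a dilepton q^2 spectrum).
bin_edges = np.linspace(0.0, 100.0, 11)
bin_centers = 0.5 * (bin_edges[:-1] + bin_edges[1:])

# Signal yields per bin as simulated under the Standard Model (illustrative numbers).
sm_template = np.array([120.0, 100.0, 82.0, 67.0, 55.0, 45.0, 37.0, 30.0, 25.0, 20.0])

def sm_shape(x):
    # Normalized SM differential distribution, assumed known from theory/simulation.
    return np.exp(-x / 40.0)

def bsm_shape(x, c_new_physics):
    # Alternative-model distribution depending on one BSM parameter (toy form).
    return np.exp(-x / 40.0) * (1.0 + c_new_physics * x / 100.0)

def reweight(template, c_new_physics):
    # Rescale each SM bin by the BSM/SM density ratio evaluated at the bin center;
    # in practice the ratio would come from the generator-level distributions.
    weights = bsm_shape(bin_centers, c_new_physics) / sm_shape(bin_centers)
    return template * weights

# Build a statistical model whose signal template corresponds to the BSM hypothesis.
bsm_template = reweight(sm_template, c_new_physics=0.5)
model = pyhf.simplemodels.uncorrelated_background(
    signal=bsm_template.tolist(),
    bkg=[200.0] * 10,
    bkg_uncertainty=[14.0] * 10,
)

Repeating this for a grid of c_new_physics values and profiling the resulting likelihoods yields inference directly in the space of theoretical parameters, which is the point the abstract makes.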
Search for heavy neutral leptons in decays of W bosons using leptonic and semi-leptonic displaced vertices in √s = 13 TeV pp collisions with the ATLAS detector
A search is performed for long-lived heavy neutral leptons (HNLs), produced through the decay of a W boson along with a muon or electron. Two channels are explored: a leptonic channel, in which the HNL decays into two leptons and a neutrino, and a semi-leptonic channel, in which the HNL decays into a lepton and a charged pion. The search is performed with 140 fb⁻¹ of √s = 13 TeV proton-proton collision data collected by ATLAS during Run 2 of the Large Hadron Collider. No excess of events is observed; Dirac-like and Majorana-like HNLs with masses below 14.5 GeV and mixing coefficients as small as 10 are excluded at the 95% confidence level. The results are interpreted under different assumptions on the flavour of the leptons from the HNL decays.
Configuration, Performance, and Commissioning of the ATLAS b-jet Triggers for the 2022 and 2023 LHC data-taking periods
In 2022 and 2023, the Large Hadron Collider produced approximately two billion hadronic interactions each second from bunches of protons that collide at a rate of 40 MHz. The ATLAS trigger system is used to reduce this rate to a few kHz for recording. Selections based on hadronic jets, their energy, and event topology reduce the rate to O(10) kHz while maintaining high efficiencies for important signatures resulting in b-quarks, but to reach the desired recording rate of hundreds of Hz, additional real-time selections based on the identification of jets containing b-hadrons (b-jets) are employed to achieve low thresholds on the jet transverse momentum at the High-Level Trigger. The configuration, commissioning, and performance of the real-time ATLAS b-jet identification algorithms for the early LHC Run 3 collision data are presented. These recent developments provide substantial gains in signal efficiency for critical signatures; for the Standard Model production of Higgs boson pairs, a 50% improvement in selection efficiency is observed in final states with four b-quarks or two b-quarks and two hadronically decaying τ-leptons.
