Including Systematic Uncertainties in Confidence Interval Construction for Poisson Statistics
One way to incorporate systematic uncertainties into the calculation of
confidence intervals is by integrating over probability density functions
parametrizing the uncertainties. In this note we present a development of this
method which takes into account uncertainties in the prediction of background
processes, uncertainties in the signal detection efficiency and background
efficiency and allows for a correlation between the signal and background
detection efficiencies. We implement this method with the Feldman & Cousins
unified approach with and without conditioning. We present studies of coverage
for the Feldman & Cousins and Neyman ordering schemes. In particular, we
present two different types of coverage tests for the case where systematic
uncertainties are included. To illustrate the method we show the relative
effect of including systematic uncertainties in the case of a dark matter search as
performed by modern neutrino telescopes.
Comment: 23 pages, 10 figures; replaced to match published version.
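A minimal sketch of the general technique described above: marginalize ("smear") the Poisson likelihood over nuisance-parameter densities, then build a Feldman-Cousins-style interval from the smeared probabilities. All numerical values (background mean, Gaussian widths, grids) are illustrative assumptions, not the paper's:

```python
# Hedged sketch, not the authors' code: smear a Poisson likelihood over
# Gaussian nuisance densities for background and signal efficiency, then
# build a Feldman-Cousins-style interval from the smeared pmf.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
B0, SIG_B, EFF0, SIG_EFF = 3.0, 0.5, 1.0, 0.1   # assumed nuisance parameters

def smeared_pmf(n, s, ndraws=5000):
    """P(n|s) averaged over background and efficiency uncertainties."""
    b = np.clip(rng.normal(B0, SIG_B, ndraws), 0.0, None)
    eff = np.clip(rng.normal(EFF0, SIG_EFF, ndraws), 0.0, None)
    return poisson.pmf(n, eff * s + b).mean()

def fc_interval(n_obs, s_grid, n_max=40, cl=0.90):
    """Collect all s whose likelihood-ratio-ordered acceptance region holds n_obs."""
    accepted_s = []
    for s in s_grid:
        n = np.arange(n_max)
        p = np.array([smeared_pmf(k, s) for k in n])
        # likelihood under the best-fit (non-negative) signal, for the ordering
        p_best = np.array([smeared_pmf(k, max(k - B0, 0.0)) for k in n])
        order = np.argsort(p / np.maximum(p_best, 1e-300))[::-1]
        region = n[order][np.cumsum(p[order]) <= cl]
        if n_obs in region:
            accepted_s.append(s)
    return min(accepted_s), max(accepted_s)

print(fc_interval(n_obs=5, s_grid=np.linspace(0.0, 15.0, 61)))
```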
Dark Matter and Fundamental Physics with the Cherenkov Telescope Array
The Cherenkov Telescope Array (CTA) is a project for a next-generation
observatory for very high energy (GeV-TeV) ground-based gamma-ray astronomy,
currently in its design phase, and foreseen to be operative a few years from
now. Several tens of telescopes of 2-3 different sizes, distributed over a
large area, will allow for a sensitivity about a factor 10 better than current
instruments such as H.E.S.S., MAGIC and VERITAS, an energy coverage from a few
tens of GeV to several tens of TeV, and a field of view of up to 10 deg. In the
following study, we investigate the prospects for CTA to study several science
questions that influence our current knowledge of fundamental physics. Based on
conservative assumptions for the performance of the different CTA telescope
configurations, we employ a Monte Carlo based approach to evaluate the
prospects for detection. First, we discuss CTA prospects for cold dark matter
searches, following different observational strategies: in dwarf satellite
galaxies of the Milky Way, in the region close to the Galactic Centre, and in
clusters of galaxies. The possible search for spatial signatures, facilitated
by the larger field of view of CTA, is also discussed. Next we consider
searches for axion-like particles which, besides being possible candidates for
dark matter may also explain the unexpectedly low absorption by extragalactic
background light of gamma rays from very distant blazars. Simulated
light-curves of flaring sources are also used to determine the sensitivity to
violations of Lorentz Invariance by detection of the possible delay between the
arrival times of photons at different energies. Finally, we mention searches
for other exotic physics with CTA.
Comment: 31 pages. Accepted for publication in Astroparticle Physics.
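As background to the Lorentz-invariance test mentioned above, here is a hedged sketch of the standard leading-order energy-dependent photon delay (Jacob-Piran convention); the cosmological parameters and the Planck-scale value of E_QG are assumptions for illustration, not results from the study:

```python
# Hedged sketch (assumption, not from the paper): leading-order linear
# Lorentz-invariance-violation delay between photons of different energies,
# dt ~ (dE / E_QG) * K(z), where K(z) = integral_0^z (1+z')/H(z') dz'.
import numpy as np
from scipy.integrate import quad

H0 = 2.2e-18          # Hubble constant in 1/s (~67.7 km/s/Mpc)
OM, OL = 0.31, 0.69   # flat LambdaCDM densities (assumed values)

def K(z):
    """Comoving time-delay kernel for linear LIV."""
    integrand = lambda zp: (1 + zp) / (H0 * np.sqrt(OM * (1 + zp)**3 + OL))
    return quad(integrand, 0, z)[0]          # seconds

def liv_delay(dE_GeV, z, E_QG_GeV=1.2e19):   # E_QG ~ Planck scale (assumed)
    return (dE_GeV / E_QG_GeV) * K(z)

# e.g. a 1 TeV vs 100 GeV photon pair from a blazar flare at z = 0.5
print(f"{liv_delay(900.0, 0.5):.1f} s")
```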
Accelerated expansion from ghost-free bigravity: a statistical analysis with improved generality
We study the background cosmology of the ghost-free, bimetric theory of
gravity. We perform an extensive statistical analysis of the model using both
frequentist and Bayesian frameworks and employ the constraints on the expansion
history of the Universe from the observations of supernovae, the cosmic
microwave background and the large scale structure to estimate the model's
parameters and test the goodness of the fits. We explore the parameter space of
the model with nested sampling to find the best-fit chi-square, obtain the
Bayesian evidence, and compute the marginalized posteriors and mean
likelihoods. We mainly focus on a class of sub-models with no explicit
cosmological constant (or vacuum energy) term to assess the ability of the
theory to dynamically cause a late-time accelerated expansion. The model
behaves as standard gravity without a cosmological constant at early times,
with an emergent extra contribution to the energy density that converges to a
cosmological constant in the far future. The model can in most cases yield very
good fits and is in perfect agreement with the data. This is because many
points in the parameter space of the model exist that give rise to
time-evolution equations that are effectively very similar to those of the
ΛCDM. This similarity makes the model compatible with observations, as
in the ΛCDM case, at least at the background level. Even though our
results indicate a slightly better fit for the ΛCDM concordance model
in terms of the p-value and evidence, neither model is statistically
preferred over the other. However, the parameters of the bigravity model are in
general degenerate. A similar but perturbative analysis of the model, as well as
more data, will be required to break the degeneracies and constrain the
parameters, should the model remain viable compared to
ΛCDM.
Comment: 42 pages, 9 figures; typos corrected in equations (2.12), (2.13),
(3.7), (3.8) and (3.9); more discussions added (footnotes 5, 8, 10 and 13)
and abstract, sections 4.2, 4.3 and 5 (conclusions) modified in response to
referee's comments; references added; acknowledgements modified; all results
completely unchanged; matches version accepted for publication in JHEP.
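A toy illustration of the statistical machinery the abstract invokes (best-fit chi-square, goodness-of-fit p-value, and Bayesian evidence). The paper explores the full bigravity parameter space with nested sampling; this sketch instead integrates a one-parameter mock model on a grid, and the data, errors and prior are invented:

```python
# Toy sketch (not the paper's pipeline): chi-square fit plus a grid-based
# Bayesian evidence for a one-parameter mock expansion-history model.
import numpy as np
from scipy.stats import chi2

z = np.array([0.1, 0.3, 0.5, 0.8, 1.2])
d = np.array([1.05, 1.18, 1.32, 1.55, 1.90])     # mock distance-like data
sig = 0.05 * np.ones_like(d)

def model(z, w):                                  # 1-parameter toy "dark energy"
    return 1.0 + (1.0 + w) * z

def chi2_of(w):
    return np.sum(((d - model(z, w)) / sig) ** 2)

w_grid = np.linspace(-0.5, 1.5, 2001)
chis = np.array([chi2_of(w) for w in w_grid])

# frequentist: best-fit chi-square and goodness-of-fit p-value
best = chis.min()
p_value = chi2.sf(best, df=len(d) - 1)            # 5 points, 1 fitted parameter

# Bayesian: evidence Z = integral L(w) pi(w) dw with a flat prior on the grid
like = np.exp(-0.5 * (chis - best))
logZ = np.log(np.trapz(like, w_grid) / (w_grid[-1] - w_grid[0])) - 0.5 * best
print(best, p_value, logZ)
```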
VERITAS Search for VHE Gamma-ray Emission from Dwarf Spheroidal Galaxies
Indirect dark matter searches with ground-based gamma-ray observatories
provide an alternative for identifying the particle nature of dark matter that
is complementary to that of direct search or accelerator production
experiments. We present the results of observations of the dwarf spheroidal
galaxies Draco, Ursa Minor, Bootes 1, and Willman 1 conducted by VERITAS. These
galaxies are nearby dark matter dominated objects located at a typical distance
of several tens of kiloparsecs for which there are good measurements of the
dark matter density profile from stellar velocity measurements. Since the
conventional astrophysical background of very high energy gamma rays from these
objects appears to be negligible, they are good targets to search for the
secondary gamma-ray photons produced by interacting or decaying dark matter
particles. No significant gamma-ray flux above 200 GeV was detected from these
four dwarf galaxies for a typical exposure of ~20 hours. The 95% confidence
upper limits on the integral gamma-ray flux are in the range 0.4-2.2 x 10^-12
photons cm^-2 s^-1. We interpret this limiting flux in the context of pair
annihilation of weakly interacting massive particles and derive constraints on
the thermally averaged product of the total self-annihilation cross section and
the relative velocity of the WIMPs. The limits are obtained under conservative
assumptions regarding the dark matter distribution in dwarf galaxies and are
approximately three orders of magnitude above the generic theoretical
prediction for WIMPs in the minimal supersymmetric standard model framework.
However, significant uncertainty exists in the dark matter distribution as well
as in the neutralino cross sections, which under favorable assumptions could
further lower the limits.
Comment: 21 pages, 2 figures; updated to reflect version published in ApJ.
NOTE: M.D. Wood added as author.
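The conversion behind such limits can be sketched with the standard annihilation-flux relation; the mass, photon yield and J-factor below are illustrative values, not the paper's:

```python
# Hedged sketch of the standard conversion from a gamma-ray flux upper limit
# to a <sigma v> limit for annihilating WIMPs:
#   Phi_UL = <sigma v> / (8 pi m^2) * N_gamma * J   =>   solve for <sigma v>.
import math

def sigmav_upper_limit(phi_ul, m_gev, n_gamma, J):
    """
    phi_ul : integral flux upper limit above threshold [photons cm^-2 s^-1]
    m_gev  : WIMP mass [GeV]
    n_gamma: photons per annihilation above threshold (from an assumed spectrum)
    J      : J-factor, integral of rho^2 over line of sight and solid angle
             [GeV^2 cm^-5]
    """
    return 8.0 * math.pi * m_gev**2 * phi_ul / (n_gamma * J)

# illustrative: 1 TeV WIMP, flux UL ~1e-12 cm^-2 s^-1, J ~1e18 GeV^2 cm^-5
print(f"{sigmav_upper_limit(1e-12, 1000.0, 1.0, 1e18):.2e} cm^3 s^-1")
```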
First bounds on the high-energy emission from isolated Wolf-Rayet binary systems
High-energy gamma-ray emission is theoretically expected to arise in tight
binary star systems (with high mass loss and high-velocity winds), although
observational evidence for such emission has so far proven elusive. Here we present
the first bounds on this putative emission from isolated Wolf-Rayet (WR) star
binaries, WR 147 and WR 146, obtained from observations with the MAGIC
telescope.
Comment: (Authors are the MAGIC Collaboration.) Manuscript in press at The
Astrophysical Journal Letters.
The Power to See: A New Graphical Test of Normality
Many statistical procedures assume that the underlying data-generating process involves Gaussian errors. Among the well-known procedures are ANOVA, multiple regression, linear discriminant analysis and many more. A few popular procedures are commonly used to test for normality, such as the Kolmogorov-Smirnov test and the Shapiro-Wilk test. Excluding the Kolmogorov-Smirnov testing procedure, these methods do not have a graphical representation. As such, these testing methods offer very little insight as to how the observed process deviates from the normality assumption. In this paper we discuss a simple new graphical procedure which provides confidence bands for a normal quantile-quantile plot. These bands define a test of normality and are much narrower in the tails than those related to the Kolmogorov-Smirnov test. Correspondingly, the new procedure has much greater power to detect deviations from normality in the tails.
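A hedged sketch of the general idea (not the authors' construction, whose bands are narrower in the tails): a pointwise Monte Carlo envelope for the order statistics of a standardized sample on a normal Q-Q plot; a truly simultaneous band would require an extra calibration step:

```python
# Sketch only: pointwise Monte Carlo envelope for a normal Q-Q plot.
# The paper's bands are constructed differently and are narrower in the tails.
import numpy as np
from scipy import stats

def qq_envelope(n, n_sim=5000, alpha=0.05, seed=0):
    """Pointwise (alpha/2, 1-alpha/2) envelope for standardized order statistics."""
    rng = np.random.default_rng(seed)
    sims = rng.standard_normal((n_sim, n))
    # standardize each simulated sample the same way the data will be
    sims = (sims - sims.mean(axis=1, keepdims=True)) \
           / sims.std(axis=1, ddof=1, keepdims=True)
    sims.sort(axis=1)
    return (np.quantile(sims, alpha / 2, axis=0),
            np.quantile(sims, 1 - alpha / 2, axis=0))

def outside_envelope(x, alpha=0.05):
    """True if any standardized order statistic escapes the envelope."""
    z = np.sort((x - x.mean()) / x.std(ddof=1))
    lo, hi = qq_envelope(len(x), alpha=alpha)
    return bool(np.any((z < lo) | (z > hi)))

x = stats.t.rvs(df=3, size=100, random_state=1)   # heavy-tailed example data
print(outside_envelope(x))                        # likely True: tails deviate
```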
Cancer treatment-related neuropathic pain: proof of concept study with menthol—a TRPM8 agonist
PURPOSE: Effective treatment of neuropathic pain without unacceptable side effects is challenging. Cancer sufferers increasingly live with long-term treatment-related neuropathic pain, resulting from chemotherapy-induced peripheral neuropathy (CIPN) or surgical scars. This proof-of-concept study aimed to determine whether preclinical evidence for TRPM8 ion channels in sensory neurons as a novel analgesic target could be translated to clinical benefit in patients with neuropathic pain, using the TRPM8 activator menthol. PATIENTS AND METHODS: Patients with problematic treatment-related neuropathic pain underwent a baseline assessment using validated questionnaires, psychophysical testing, and objective functional measures. The painful area was treated with topical 1 % menthol cream twice daily. Assessments were repeated at 4–6 weeks. The primary outcome was the change in Brief Pain Inventory total scores at 4–6 weeks. Secondary outcomes included changes in function, mood and skin sensation. RESULTS: Fifty-one patients (female/male, 32/19) were recruited with a median age of 61 (ranging from 20 to 89). The commonest aetiology was CIPN (35/51), followed by scar pain (10/51). Thirty-eight were evaluable on the primary outcome. Eighty-two per cent (31/38) had an improvement in total Brief Pain Inventory scores (median, 47 (interquartile range, 30 to 64) to 34 (6 to 59), P < 0.001). Improvements in mood (P = 0.0004), catastrophising (P = 0.001), walking ability (P = 0.008) and sensation (P < 0.01) were also observed. CONCLUSION: This proof-of-concept study indicates that topical menthol has potential as a novel analgesic therapy for cancer treatment-related neuropathic pain. Improvements in patient-rated measures are supported by changes in objective measures of physical function and sensation. Further systematic evaluation of efficacy is required
The lancet weight determines wheal diameter in response to skin prick testing with histamine
BACKGROUND: Skin prick test (SPT) is a common test for diagnosing immunoglobulin E-mediated allergies. In clinical routine, technicalities, human error or patient-related biases occasionally result in suboptimal diagnosis of sensitization. OBJECTIVE: Although not previously assessed quantitatively, lancet weight is hypothesized to be important when performing SPT to minimize the frequency of false positives, false negatives, and unwanted discomfort. METHODS: Accurate weight-controlled SPT was performed on the volar forearms and backs of 20 healthy subjects. Four predetermined lancet weights were applied (25 g, 85 g, 135 g and 265 g) using two positive-control histamine solutions (1 mg/mL and 10 mg/mL) and one negative control (saline). A total of 400 SPTs were conducted. The outcome parameters were: wheal size, neurogenic inflammation (measured by superficial blood perfusion), frequency of bleeding, and the pain provoked by the lancet. RESULTS: The mean wheal diameter increased significantly as higher weights were applied to the SPT lancet, e.g. from 3.2 ± 0.28 mm at 25 g to 5.4 ± 1.7 mm at 265 g (p < 0.01). Similarly, the frequency of bleeding, the provoked pain, and the neurogenic inflammatory response increased significantly. At 265 g, saline evoked two wheal responses (out of 160 pricks), both below 3 mm. CONCLUSION AND CLINICAL RELEVANCE: The applied weight of the lancet during the SPT procedure is an important factor. Higher lancet weights precipitate significantly larger wheal reactions, with potential diagnostic implications. This warrants additional research into the optimal lancet weight in relation to SPT guidelines to improve the specificity and sensitivity of the procedure.
An update of the Worldwide Integrated Assessment (WIA) on systemic insecticides. Part 2: impacts on organisms and ecosystems
New information on the lethal and sublethal effects of neonicotinoids and fipronil on organisms is presented in this review, complementing the previous WIA published in 2015. The high toxicity of these systemic insecticides to invertebrates has been confirmed and extended to more species and compounds. Most of the recent research has focused on bees and on the sublethal and ecological impacts these insecticides have on pollinators. Toxic effects have also been reported for other invertebrate taxa, including predatory and parasitoid natural enemies and aquatic arthropods. Little new information has been gathered on soil organisms, and the impact on marine and coastal ecosystems remains largely uncharted. The chronic lethality of neonicotinoids to insects and crustaceans, and the strengthened evidence that these chemicals also impair the immune system and reproduction, highlight the dangers of this class of insecticides. Continued large-scale, mostly prophylactic, use of these persistent pesticides has the potential to greatly decrease populations of arthropods in both terrestrial and aquatic environments. Sublethal effects on fish, reptiles, frogs, birds and mammals are also reported, reflecting a better understanding of the mechanisms of toxicity of these insecticides in vertebrates and of their deleterious impacts on the growth, reproduction and neurobehaviour of most of the species tested. The review concludes with a summary of impacts on ecosystem services and functioning, particularly on pollination, soil biota and aquatic invertebrate communities, reinforcing the conclusions of the previous WIA (van der Sluijs et al. 2015).
Evaluation and Controlling of Corporate Communication
Communication managers today are aware of the impact of their work and of its contribution to achieving corporate goals. This includes contributions to favourable media coverage, to the company's reputation, and at times even to direct sales promotion, to employee motivation, and to the recruitment of young talent. All too often, however, this knowledge rests on intuition, on singular experiences such as crises, or on isolated measurements of success. What is frequently missing is an institutionalized controlling of corporate communication through which communication processes can be systematically steered and evaluated.
