Supersymmetry and the LHC Inverse Problem
Given experimental evidence at the LHC for physics beyond the standard model,
how can we determine the nature of the underlying theory? We initiate an
approach to studying the "inverse map" from the space of LHC signatures to the
parameter space of theoretical models within the context of low-energy
supersymmetry, using 1808 LHC observables, including essentially all those
suggested in the literature, and a 15-dimensional parametrization of the
supersymmetric standard model. We show that the inverse map of a point in
signature space consists of a number of isolated islands in parameter space,
indicating the existence of "degeneracies"--qualitatively different models with
the same LHC signatures. The degeneracies have simple physical
characterizations, largely reflecting discrete ambiguities in the
electroweak-ino spectrum, accompanied by small adjustments of the remaining soft parameters.
The number of degeneracies falls in the range 1<d<100, depending on whether or
not sleptons are copiously produced in cascade decays. This number is large
enough to represent a clear challenge but small enough to encourage looking for
new observables that can further break the degeneracies and determine most of
the SUSY physics we care about at the LHC. Degeneracies occur because
signatures are not independent, and our approach allows testing of any new
signature for its independence. Our methods can also be applied to any other
theory of physics beyond the standard model, allowing one to study how model
footprints differ in signature space and to test ways of distinguishing
qualitatively different possibilities for new physics at the LHC.
Comment: 55 pages, 30 figures
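The degeneracy test described above can be sketched as a comparison of signature vectors within their uncertainties. The sketch below is an illustration under assumed conventions, not the paper's exact statistic; only the count of 1808 observables is taken from the abstract, and all other numbers and names are invented.

```python
# Illustrative sketch (assumptions, not the paper's exact definition): two
# model points are "degenerate" if their signature vectors agree within the
# experimental uncertainties, measured by a normalized distance.
import numpy as np

def signature_distance(sig_a, sig_b, sigma):
    """Uncertainty-normalized RMS distance between two signature vectors."""
    return np.sqrt(np.mean(((sig_a - sig_b) / sigma) ** 2))

rng = np.random.default_rng(1)
n_sig = 1808                     # number of LHC observables in the study
sigma = np.full(n_sig, 0.05)     # assumed fractional uncertainties
model_a = rng.normal(1.0, 0.2, n_sig)
model_b = model_a + rng.normal(0.0, 0.01, n_sig)  # a near-degenerate point

d = signature_distance(model_a, model_b, sigma)
print(f"degenerate: {d < 1.0}")  # indistinguishable within errors
```

Testing a proposed new observable for independence then amounts to checking whether adding it to the signature vector pushes previously degenerate pairs above the threshold.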
Theoretical predictions for the direct detection of neutralino dark matter in the NMSSM
We analyse the direct detection of neutralino dark matter in the framework of
the Next-to-Minimal Supersymmetric Standard Model. After performing a detailed
analysis of the parameter space, taking into account all the available
constraints from LEPII, we compute the neutralino-nucleon cross section, and
compare the results with the sensitivity of detectors. We find that sizable
values for the detection cross section, within the reach of dark matter
detectors, are attainable in this framework. For example, neutralino-proton
cross sections compatible with the sensitivity of present experiments can be
obtained due to the exchange of very light Higgses with m_{h_1^0}\lsim 70
GeV. Such Higgses have a significant singlet composition, thus escaping
detection and being in agreement with accelerator data. The lightest neutralino
in these cases exhibits a large singlino-Higgsino composition, and a mass in
the range 50 \lsim m_{\tilde\chi_1^0} \lsim 100 GeV.
Comment: Final version to appear in JHEP. References added. LaTeX, 53 pages, 23 figures
Shifts in the Properties of the Higgs Boson from Radion Mixing
We examine how mixing between the Standard Model Higgs boson, h, and the
radion present in the Randall-Sundrum model of localized gravity modifies the
expected properties of the Higgs boson. In particular, we demonstrate that the
total and partial decay widths of the Higgs, as well as the gg branching
fraction, can be substantially altered from their Standard Model expectations.
The remaining branching fractions are modified by \lsim 5% over most of the
parameter space.
Comment: 17 pages, 7 figures, LaTeX; revised version
Aspects of CP violation in the HZZ coupling at the LHC
We examine the CP-conserving (CPC) and CP-violating (CPV) effects of a
general HZZ coupling through a study of the process H -> ZZ* -> 4 leptons at
the LHC. We construct asymmetries that directly probe these couplings. Further,
we present complete analytical formulae for the angular distributions of the
decay leptons and for some of the asymmetries. Using these we have been able to
identify new observables which can provide enhanced sensitivity to the CPV coupling. We also explore probing CP violation through shapes of
distributions in different kinematic variables, which can be used for Higgs
bosons with mH < 2 mZ.
Comment: 36 pages, 17 figures, LaTeX, version accepted for publication
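A schematic illustration (not a formula taken from the paper): asymmetries of the kind described are typically built from some CP-odd observable $\mathcal{O}$ of the decay-lepton momenta, for instance

```latex
% Generic counting asymmetry built from a CP-odd observable O
% (schematic; the paper's specific observables may differ):
A_{\mathcal{O}} \;=\;
  \frac{\Gamma(\mathcal{O}>0)-\Gamma(\mathcal{O}<0)}
       {\Gamma(\mathcal{O}>0)+\Gamma(\mathcal{O}<0)}
```

A nonzero $A_{\mathcal{O}}$ for a T-odd, CP-odd $\mathcal{O}$ signals a CP-violating contribution to the HZZ coupling.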
Biodiversity Loss and the Taxonomic Bottleneck: Emerging Biodiversity Science
Human domination of the Earth has resulted in dramatic changes to global and local patterns of biodiversity. Biodiversity is critical to human sustainability because it drives the ecosystem services that provide the core of our life-support system. As we, the human species, are the primary factor leading to the decline in biodiversity, we need detailed information about the biodiversity and species composition of specific locations in order to understand how different species contribute to ecosystem services and how humans can sustainably conserve and manage biodiversity. Taxonomy and ecology, two fundamental sciences that generate the knowledge about biodiversity, are associated with a number of limitations that prevent them from providing the information needed to fully understand the relevance of biodiversity in its entirety for human sustainability: (1) biodiversity conservation strategies that tend to be overly focused on research and policy on a global scale with little impact on local biodiversity; (2) the small knowledge base of extant global biodiversity; (3) a lack of much-needed site-specific data on the species composition of communities in human-dominated landscapes, which hinders ecosystem management and biodiversity conservation; (4) biodiversity studies with a lack of taxonomic precision; (5) a lack of taxonomic expertise and trained taxonomists; (6) a taxonomic bottleneck in biodiversity inventory and assessment; and (7) neglect of taxonomic resources and a lack of taxonomic service infrastructure for biodiversity science. These limitations are directly related to contemporary trends in research, conservation strategies, environmental stewardship, environmental education, sustainable development, and local site-specific conservation. Today’s biological knowledge is built on the known global biodiversity, which represents barely 20% of what is currently extant (commonly accepted estimate of 10 million species) on planet Earth. 
Much remains unexplored and unknown, particularly in hotspot regions of Africa, Southeast Asia, and South and Central America, including many developing or underdeveloped countries, where localized biodiversity is scarcely studied or described. "Backyard biodiversity", defined as local biodiversity near human habitation, refers to the natural resources and capital for ecosystem services at the grassroots level, which urgently needs to be explored, documented, and conserved, as it is the backbone of sustainable economic development in these countries. Beginning with early identification and documentation of local flora and fauna, taxonomy has documented global biodiversity and natural history based on the collection of "backyard biodiversity" specimens worldwide. However, this branch of science suffered a continuous decline in the latter half of the twentieth century, and has now reached a point of potential demise. At present there are very few professional taxonomists and trained local parataxonomists worldwide, while the need for, and demands on, taxonomic services by conservation and resource management communities are rapidly increasing. Systematic collections, the material basis of biodiversity information, have been neglected and abandoned, particularly at institutions of higher learning. Considering the rapid increase in the human population and urbanization, human sustainability requires new conceptual and practical approaches to refocusing and energizing the study of the biodiversity that is the core of natural resources for sustainable development and biotic capital for sustaining our life-support system.
In this paper we aim to document and explicate the essence of biodiversity, to discuss the state and nature of the taxonomic demise and the trends of recent biodiversity studies, and to suggest reasonable approaches to a biodiversity science that will facilitate the expansion of global biodiversity knowledge and create useful data on backyard biodiversity worldwide, towards human sustainability.
Lower limit on the neutralino mass in the general MSSM
We discuss constraints on SUSY models with non-unified gaugino masses and R_P
conservation. We derive a lower bound on the neutralino mass by combining the
direct limits from LEP, the indirect limits from (g-2)_mu, b -> s gamma and
B_s -> mu+ mu-, and the relic density constraint from WMAP. The lightest
neutralino (m_{\tilde\chi_1^0} = 6 GeV) is found in models with a light
pseudoscalar with M_A < 200 GeV and a large value of tan(beta). Models with
heavy pseudoscalars lead to m_{\tilde\chi_1^0} > 18 (29) GeV, depending on
tan(beta). We show that even a very conservative bound from the
muon anomalous magnetic moment can increase the lower bound on the neutralino
mass in models with mu < 0 and/or large values of tan(beta). We then examine
the potential of the Tevatron and of direct detection experiments to probe
the SUSY models with the lightest neutralinos allowed in the context of light
pseudoscalars with high tan(beta). We also examine the potential of an e+e-
collider of 500GeV to produce SUSY particles in all models with neutralinos
lighter than the W. In contrast to the mSUGRA models, observation of at least
one sparticle is not always guaranteed.
Comment: 37 pages, LaTeX, 16 figures; the paper with higher resolution figures is available at
http://wwwlapp.in2p3.fr/~boudjema/papers/bound-lsp/bound-lsp.htm
Direct, Indirect and Collider Detection of Neutralino Dark Matter In SUSY Models with Non-universal Higgs Masses
In supersymmetric models with gravity-mediated SUSY breaking, universality of
soft SUSY breaking sfermion masses m_0 is motivated by the need to suppress
unwanted flavor changing processes. The same motivation, however, does not
apply to soft breaking Higgs masses, which may in general have independent
masses from matter scalars at the GUT scale. We explore phenomenological
implications of both the one-parameter and two-parameter non-universal Higgs
mass models (NUHM1 and NUHM2), and examine the parameter ranges compatible with
Omega_CDM h^2, BF(b -> s gamma) and (g-2)_mu constraints. In contrast to the
mSUGRA model, in both NUHM1 and NUHM2 models, the dark matter A-annihilation
funnel can be reached at low values of tan(beta), while the higgsino dark
matter annihilation regions can be reached for low values of m_0. We show that
there may be observable rates for indirect and direct detection of neutralino
cold dark matter in phenomenologically acceptable ranges of parameter space. We
also examine implications of the NUHM models for the Fermilab Tevatron, the
CERN LHC and a Sqrt(s)=0.5-1 TeV e+e- linear collider. Novel possibilities
include: very light \tilde{t}_R and \tilde{c}_R squark and \tilde{l}_L slepton
masses, as well as light charginos and neutralinos and H, A and H^{+/-} Higgs bosons.
Comment: LaTeX, 48 pages, 26 figures. The version with high resolution figures
is available at http://hep.pa.msu.edu/belyaev/public/projects/nuhm/nuhm.p
Ectopic lipid storage in non-alcoholic fatty liver disease is not mediated by impaired mitochondrial oxidative capacity in skeletal muscle
Background and Aims. Simple clinical algorithms including the Fatty Liver Index (FLI) and Lipid Accumulation Product (LAP) have been developed as a surrogate marker for Non-Alcoholic Fatty Liver Disease (NAFLD). These algorithms have been constructed using ultrasonography, a semi-quantitative method. This study aimed to validate FLI and LAP as measures of hepatic steatosis, as measured quantitatively by proton magnetic resonance spectroscopy (1H-MRS).
Methods. Data were collected from 168 patients with NAFLD and 168 controls who had undergone clinical, biochemical and anthropometric assessment in the course of research studies. Values of FLI and LAP were determined, and assessed both as predictors of the presence of hepatic steatosis (liver fat >5.5%) and of actual liver fat content, as measured by 1H-MRS. The discriminative ability of FLI and LAP was estimated using the area under the Receiver Operating Characteristic curve (AUROC). Since FLI can also be interpreted as a predictive probability of hepatic steatosis, we assessed how well calibrated it was in our cohort. Linear regression with prediction intervals was used to assess the ability of FLI and LAP to predict liver fat content.
Results. FLI and LAP discriminated between patients with and without hepatic steatosis with AUROCs of 0.79 (IQR = 0.74, 0.84) and 0.78 (IQR = 0.72, 0.83) respectively, although quantitative prediction of liver fat content was unsuccessful. Additionally, the algorithms accurately matched the observed percentages of patients with hepatic steatosis in our cohort.
Conclusions. FLI and LAP may be used clinically, and for metabolic and epidemiological research, to identify patients with hepatic steatosis, but not as surrogates for liver fat content.
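The discrimination step reported above can be sketched in a few lines. This is an illustration only, using entirely synthetic data and the rank-sum identity for the AUROC; the variable names, noise model, and index formula are assumptions, not the study's actual FLI definition or pipeline. Only the cohort size (168 + 168) and the 5.5% steatosis threshold come from the abstract.

```python
# Illustrative sketch (synthetic data, not the study's pipeline): assessing a
# continuous index such as FLI as a classifier of hepatic steatosis via AUROC.
import numpy as np

def auroc(scores, labels):
    """AUROC via the rank-sum (Mann-Whitney) identity: the probability that
    a randomly chosen positive case outranks a random negative, ties = 1/2."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

rng = np.random.default_rng(0)
n = 336                                                # 168 patients + 168 controls
liver_fat = rng.lognormal(mean=1.0, sigma=1.0, size=n)  # % liver fat (synthetic)
steatosis = liver_fat > 5.5                             # the study's threshold
fli = 0.6 * np.log(liver_fat) + rng.normal(0, 0.5, n)   # noisy surrogate index

print(f"AUROC = {auroc(fli, steatosis):.2f}")
```

The calibration check mentioned in the abstract would compare the index, interpreted as a predicted probability, against observed steatosis frequencies in probability bins; the regression step would regress measured liver fat on the index and inspect the width of the prediction intervals.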
Evaluation of the impact of universal testing for gestational diabetes mellitus on maternal and neonatal health outcomes: a retrospective analysis
Background: Gestational diabetes mellitus (GDM) affects a substantial proportion of women in pregnancy and is associated with an increased risk of adverse perinatal and long-term outcomes. Treatment seems to improve perinatal outcomes; the relative effectiveness of different strategies for identifying women with GDM, however, is less clear. This paper describes an evaluation of the impact on maternal and neonatal outcomes of a change in policy from selective, risk-factor-based offering to universal offering of an oral glucose tolerance test (OGTT) to identify women with GDM.
Methods: Retrospective six-year analysis of 35,674 births at the Women's and Newborn unit, Bradford Royal Infirmary, United Kingdom.
Results: The proportion of the whole obstetric population diagnosed with GDM increased almost fourfold following universal offering of an OGTT compared with selective offering of an OGTT; Rate Ratio (RR) 3.75 (95% CI 3.28 to 4.29). The proportion identified with severe hyperglycaemia doubled following the policy change; 1.96 (1.50 to 2.58). The case detection rate, however, for GDM in the whole population and for severe hyperglycaemia in those with GDM fell by 50-60%; 0.40 (0.35 to 0.46) and 0.51 (0.39 to 0.67) respectively. Universally offering an OGTT was associated with an increased induction of labour rate in the whole obstetric population and in women with GDM; 1.43 (1.35 to 1.50) and 1.21 (1.00 to 1.49) respectively. Caesarean section, macrosomia and perinatal mortality rates in the whole population were similar. For women with GDM, the rates of caesarean section; 0.70 (0.57 to 0.87), macrosomia; 0.22 (0.15 to 0.34) and perinatal mortality; 0.12 (0.03 to 0.46) decreased following the policy change.
Conclusions: Universally offering an OGTT was associated with increased identification of women with GDM and severe hyperglycaemia, and with neonatal benefits for those with GDM. There was no evidence of benefit or adverse effects in neonatal outcomes in the whole obstetric population.
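The rate ratios quoted throughout the abstract follow the standard log-normal approximation for a ratio of two Poisson rates. The sketch below illustrates that calculation; the event counts are invented for illustration (the abstract does not report the raw numbers behind RR 3.75), so the confidence interval it prints will not match the published one.

```python
# Minimal sketch of a rate ratio with a 95% CI on the log scale, using the
# usual Poisson / log-normal approximation. Counts below are hypothetical.
import math

def rate_ratio_ci(a, n1, b, n2, z=1.96):
    """Rate ratio (a/n1)/(b/n2) with a z-based CI computed on log(RR)."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a + 1 / b)  # approximate SE of log(RR) for event counts a, b
    lo, hi = rr * math.exp(-z * se), rr * math.exp(z * se)
    return rr, lo, hi

# Hypothetical: 600 GDM diagnoses in 18,000 universal-offer births versus
# 160 in 18,000 selective-offer births (invented, for illustration only).
rr, lo, hi = rate_ratio_ci(600, 18000, 160, 18000)
print(f"RR {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

Intervals such as "1.21 (1.00 to 1.49)" that touch 1.00 are, on this construction, borderline at the conventional 5% level.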
