
    Trial protocol OPPTIMUM: does progesterone prophylaxis for the prevention of preterm labour improve outcome?

    Background: Preterm birth is a global problem, with a prevalence of 8 to 12% depending on location. Several large trials and systematic reviews have shown progestogens to be effective in preventing or delaying preterm birth in selected high-risk women with a singleton pregnancy (including those with a short cervix or a previous preterm birth). Although an improvement in short-term neonatal outcomes has been shown in some trials, these findings have not been consistently confirmed in meta-analyses. Additionally, data on longer-term outcomes are limited to a single trial, in which no difference in outcomes was demonstrated at four years of age, despite a lower incidence of preterm birth in the progesterone group.

    Methods/Design: The OPPTIMUM study is a double-blind, randomized, placebo-controlled trial to determine whether progesterone prophylaxis to prevent preterm birth has long-term neonatal or infant benefit. Specifically, it will study whether, in women with a singleton pregnancy at high risk of preterm labour, prophylactic vaginal natural progesterone (200 mg daily from 22 to 34 weeks of gestation), compared with placebo, (1) improves obstetric outcome by lengthening pregnancy, thus reducing the incidence of preterm delivery (before 34 weeks); (2) improves neonatal outcome by reducing a composite of death and major morbidity; and (3) leads to improved childhood cognitive and neurosensory outcomes at two years of age. Recruitment began in 2009 and is scheduled to close in spring 2013. As of May 2012, over 800 women had been randomized in 60 sites.

    Discussion: OPPTIMUM will provide further evidence on the effectiveness of vaginal progesterone for the prevention of preterm birth and the improvement of neonatal outcomes in selected groups of women with a singleton pregnancy at high risk of preterm birth. Additionally, it will determine whether any reduction in the incidence of preterm birth is accompanied by improved childhood outcomes.

    Dark Matter from Minimal Flavor Violation

    We consider theories of flavored dark matter, in which the dark matter particle is part of a multiplet transforming nontrivially under the flavor group of the Standard Model in a manner consistent with the principle of Minimal Flavor Violation (MFV). MFV automatically leads to the stability of the lightest state for a large number of flavor multiplets. If neutral, this particle is an excellent dark matter candidate. Furthermore, MFV implies specific patterns of mass splittings among the flavors of dark matter and governs the structure of the couplings between dark matter and ordinary particles, leading to a rich and predictive cosmology and phenomenology. We present an illustrative phenomenological study of an effective theory of a flavor SU(3)_Q triplet, gauge singlet scalar.

    Networked buffering: a basic mechanism for distributed robustness in complex adaptive systems

    A generic mechanism - networked buffering - is proposed for the generation of robust traits in complex systems. It requires two basic conditions to be satisfied: 1) agents are versatile enough to perform more than one functional role within a system, and 2) agents are degenerate, i.e. there exists partial overlap in the functional capabilities of agents. Given these prerequisites, degenerate systems can readily produce a distributed systemic response to local perturbations. Reciprocally, excess resources related to a single function can indirectly support multiple unrelated functions within a degenerate system. In models of genome:proteome mappings in which localized decision-making and modularity of genetic functions are assumed, we verify that such distributed compensatory effects enhance the robustness of system traits. The conditions needed for networked buffering to occur are neither demanding nor rare, supporting the conjecture that degeneracy may fundamentally underpin distributed robustness in several biotic and abiotic systems. For instance, networked buffering offers new insights into systems engineering and planning activities that occur under high uncertainty. It may also help explain recent developments in understanding the origins of resilience within complex ecosystems.
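    The two conditions above can be illustrated with a minimal toy model (a hypothetical sketch, not the paper's genome:proteome simulation): agents that each perform two overlapping roles keep the system functional after a local knockout, whereas purely specialist agents do not.

```python
# Toy sketch of networked buffering. Each "agent" is a set of functional
# roles it can perform. Versatility: more than one role per agent.
# Degeneracy: partial overlap of roles across agents.

def covers(agents, functions):
    """True if every required function is served by at least one agent."""
    return all(any(f in roles for roles in agents) for f in functions)

functions = ["A", "B", "C", "D"]

# Degenerate design: overlapping pairs of roles.
degenerate = [{"A", "B"}, {"B", "C"}, {"C", "D"}, {"D", "A"}]
# Purely redundant design: one specialist per function.
specialists = [{"A"}, {"B"}, {"C"}, {"D"}]

# Both designs cover all functions when intact.
print(covers(degenerate, functions), covers(specialists, functions))  # True True

# Knock out the first agent in each design (a local perturbation).
print(covers(degenerate[1:], functions))   # True: the overlap buffers the loss
print(covers(specialists[1:], functions))  # False: function "A" is lost
```

    The point of the contrast is that the degenerate system absorbs the perturbation through a distributed reassignment of roles, without any agent having been dedicated as a spare.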

    Selective serotonin reuptake inhibitors in the treatment of generalized anxiety disorder

    Selective serotonin reuptake inhibitors have proven efficacy in the treatment of panic disorder, obsessive–compulsive disorder, post-traumatic stress disorder and social anxiety disorder. Accumulating data show that selective serotonin reuptake inhibitor treatment can also be efficacious in patients with generalized anxiety disorder. This review summarizes the findings of randomized controlled trials of selective serotonin reuptake inhibitor treatment for generalized anxiety disorder, examines the strengths and weaknesses of other therapeutic approaches, and considers potential new treatments for patients with this chronic and disabling anxiety disorder.

    Future orientation and planning in forestry: a comparison of forest managers' planning horizons in Germany and the Netherlands

    Long-range (or strategic) planning is an important tool for forest management to deal with a complex and unpredictable future. However, the ability to make meaningful predictions about a rapidly changing future is increasingly questioned. What appears to be particularly neglected is the question of the length of planning horizons and the limits (if any) to these horizons, despite this being considered one of the most critical factors in strategic planning. As the future creation of values lies within individual responsibility, this research empirically explored the limits (if any) of individual foresters' time horizons. To draw comparisons between countries with different traditions in forest management planning, data were collected through telephone surveys of forest managers in the state/national forest services of the Netherlands and Germany. In order to minimize other cultural differences, the research in Germany concentrated on the federal state of Nordrhein-Westfalen, which has considerable similarities with the Netherlands, e.g. in topography, forest types and forest functions. The results show that, in practice, 15 years appears to be the most distant horizon that foresters can identify with. This is in sharp contrast to the time horizons spanning decades and even generations that are commonly said to exist in forestry. The 'doctrine of the long run' (the faith in the capacity of foresters to overcome the barriers of the uncertain future and to look ahead and plan for long-range goals), which in many countries still underlies traditional forest management, can therefore be rejected.

    The Dark Side of the Electroweak Phase Transition

    Recent data from cosmic ray experiments may be explained by a new GeV scale of physics. In addition, the fine-tuning of supersymmetric models may be alleviated by new O(GeV) states into which the Higgs boson could decay. The presence of these new, light states can affect early universe cosmology. We explore the consequences of a light (~ GeV) scalar on the electroweak phase transition. We find that trilinear interactions between the light state and the Higgs can allow a first order electroweak phase transition and a Higgs mass consistent with experimental bounds, which may allow electroweak baryogenesis to explain the cosmological baryon asymmetry. We show, within the context of a specific supersymmetric model, how the physics responsible for the first order phase transition may also be responsible for the recent cosmic ray excesses observed by PAMELA, FERMI and other experiments. We consider the production of gravity waves from this transition and their possible detectability at LISA and BBO.

    International Veterinary Epilepsy Task Force consensus proposal: Medical treatment of canine epilepsy in Europe

    In Europe, the number of antiepileptic drugs (AEDs) licensed for dogs has grown considerably over recent years. Nevertheless, the same questions remain: 1) when to start treatment, 2) which drug is best used initially, 3) which adjunctive AED can be advised if treatment with the initial drug is unsatisfactory, and 4) when treatment changes should be considered. In this consensus proposal, an overview is given of the aim of AED treatment, when to start long-term treatment in canine epilepsy, and which veterinary AEDs are currently in use for dogs. The consensus proposal for drug treatment protocols 1) is based on current published evidence-based literature, 2) considers the current legal framework of the cascade regulation for the prescription of veterinary drugs in Europe, and 3) reflects the authors' experience. This paper aims to provide a consensus for the management of canine idiopathic epilepsy. Furthermore, for the management of structural epilepsy, AEDs are essential in addition to treating the underlying cause, where possible.

    Beyond the standard seesaw: neutrino masses from Kahler operators and broken supersymmetry

    We investigate supersymmetric scenarios in which neutrino masses are generated by effective d=6 operators in the Kahler potential, rather than by the standard d=5 superpotential operator. First, we discuss some general features of such effective operators, also including SUSY-breaking insertions, and compute the relevant renormalization group equations. Contributions to neutrino masses arise at low energy both at the tree level and through finite threshold corrections. In the second part we present simple explicit realizations in which those Kahler operators arise by integrating out heavy SU(2)_W triplets, as in the type II seesaw. Distinct scenarios emerge, depending on the mechanism and the scale of SUSY-breaking mediation. In particular, we propose an appealing and economical picture in which the heavy seesaw mediators are also messengers of SUSY breaking. In this case, strong correlations exist among neutrino parameters, sparticle and Higgs masses, as well as lepton flavour violating processes. Hence, this scenario can be tested at high-energy colliders, such as the LHC, and at lower energy experiments that measure neutrino parameters or search for rare lepton decays.
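    For orientation, the standard d=5 superpotential operator that the abstract contrasts with is the supersymmetric Weinberg operator (a textbook reference point, not a formula taken from the paper; factor conventions for the coupling kappa vary):

```latex
% d=5 Weinberg operator in the superpotential (textbook form):
W \;\supset\; \frac{\kappa_{ij}}{\Lambda}\,(L_i H_u)(L_j H_u)
% After electroweak symmetry breaking, with v_u = <H_u>, this yields
% Majorana neutrino masses
(m_\nu)_{ij} \;=\; \kappa_{ij}\,\frac{v_u^2}{\Lambda}
```

    The d=6 Kahler-potential operators studied in the paper carry an additional suppression relative to this d=5 term, which is what makes the low-energy phenomenology of the two mechanisms distinguishable.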

    Predictions for Higgs production at the Tevatron and the associated uncertainties

    We update the theoretical predictions for the production cross sections of the Standard Model Higgs boson at the Fermilab Tevatron collider, focusing on the two main search channels, the gluon-gluon fusion mechanism gg → H and the Higgs-strahlung processes q q̄ → VH with V = W/Z, including all relevant higher order QCD and electroweak corrections in perturbation theory. We then estimate the various uncertainties affecting these predictions: the scale uncertainties, which are viewed as a measure of the unknown higher order effects; the uncertainties from the parton distribution functions and the related errors on the strong coupling constant; as well as the uncertainties due to the use of an effective theory approach in the determination of the radiative corrections in the gg → H process at next-to-next-to-leading order. We find that while the cross sections are well under control in the Higgs-strahlung processes, the theoretical uncertainties are rather large in the case of the gluon-gluon fusion channel, possibly shifting the central values of the next-to-next-to-leading order cross sections by more than ≈ 40%. These uncertainties are thus significantly larger than the ≈ 10% error assumed by the CDF and D0 experiments in their recent analysis that excluded the Higgs mass range M_H = 162–166 GeV at the 95% confidence level. These exclusion limits should therefore be reconsidered in the light of these large theoretical uncertainties.
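    The gap between the ≈ 40% and ≈ 10% figures comes down to how the individual error sources are combined. A back-of-the-envelope sketch (the individual percentages below are illustrative assumptions, not values quoted in the paper):

```python
# Illustrative combination of independent sources of theory uncertainty on
# the gg -> H cross section. The individual sizes are assumptions for the
# sketch, not the paper's numbers.
scale = 0.20        # unknown higher orders (scale variation)
pdf_alphas = 0.15   # parton distributions + strong coupling constant
eft = 0.05          # effective-theory approximation at NNLO

# Conservative choice: add the errors linearly.
linear = scale + pdf_alphas + eft
# Optimistic choice: treat them as independent and add in quadrature.
quadrature = (scale**2 + pdf_alphas**2 + eft**2) ** 0.5

print(f"linear: {linear:.0%}, quadrature: {quadrature:.0%}")
```

    With these assumed inputs, linear addition lands at the 40%-level total, while a quadrature combination gives a substantially smaller number, closer in spirit to the ≈ 10% figure assumed by the experiments. The choice of combination rule is thus itself part of the disagreement.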

    Relational Contracts and Organizational Capabilities

    A large literature identifies unique organizational capabilities as a potent source of competitive advantage, yet our knowledge of why capabilities fail to diffuse more rapidly, particularly in situations in which competitors apparently have strong incentives to adopt them and a well-developed understanding of how they work, remains incomplete. In this paper we suggest that competitively significant capabilities often rest on managerial practices that in turn rely on relational contracts (i.e., informal agreements sustained by the shadow of the future). We argue that one reason these practices may be difficult to copy is that effective relational contracts must solve the twin problems of credibility and clarity. Although credibility might, in principle, be instantly acquired, clarity may take time to develop and may interact with credibility in complex ways, so relational contracts may often be difficult to build.