
    Neutrino-driven Turbulent Convection and Standing Accretion Shock Instability in Three-Dimensional Core-Collapse Supernovae

    We conduct a series of numerical experiments into the nature of three-dimensional (3D) hydrodynamics in the postbounce stalled-shock phase of core-collapse supernovae, using 3D general-relativistic hydrodynamic simulations of a $27\,M_\odot$ progenitor star with a neutrino leakage/heating scheme. We vary the strength of neutrino heating and find three cases of 3D dynamics: (1) neutrino-driven convection, (2) initially neutrino-driven convection with subsequent development of the standing accretion shock instability (SASI), and (3) SASI-dominated evolution. This confirms previous 3D results of Hanke et al. 2013, ApJ 770, 66 and Couch & O'Connor 2014, ApJ 785, 123. We carry out simulations with resolutions differing by up to a factor of $\sim 4$ and demonstrate that low resolution is artificially favorable for explosion in the 3D convection-dominated case, since it decreases the efficiency of energy transport to small scales. Low resolution results in higher radial convective fluxes of energy and enthalpy, more fully buoyant mass, and stronger neutrino heating. In the SASI-dominated case, lower resolution damps SASI oscillations. In the convection-dominated case, a quasi-stationary angular kinetic energy spectrum $E(\ell)$ develops in the heating layer. Like other 3D studies, we find $E(\ell) \propto \ell^{-1}$ in the "inertial range," while theory and local simulations argue for $E(\ell) \propto \ell^{-5/3}$. We argue that current 3D simulations do not resolve the inertial range of turbulence and are affected by numerical viscosity up to the energy-containing scale, creating a "bottleneck" that prevents an efficient turbulent cascade.
    Comment: 24 pages, 15 figures. Accepted for publication in The Astrophysical Journal. Added one figure and made minor modifications to the text according to suggestions from the referee.
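The two spectral scalings quoted in this abstract can be illustrated with a minimal numerical sketch (the multipole range and normalizations below are invented for illustration): compensating each model spectrum by its assumed power law should flatten it to a constant, which is the standard check applied to simulation spectra.

```python
import numpy as np

# Angular wavenumbers spanning an assumed "inertial range".
ell = np.arange(10, 200)

# Two model spectra with arbitrary normalization: the l^-1 scaling
# reported by 3D core-collapse simulations, and the l^-5/3 Kolmogorov
# scaling expected from theory and local turbulence simulations.
E_sim = ell ** (-1.0)
E_kol = ell ** (-5.0 / 3.0)

# Compensated spectra: multiplying by the assumed power law yields a
# flat (constant) curve exactly when the scaling holds.
flat_sim = E_sim * ell ** (1.0)
flat_kol = E_kol * ell ** (5.0 / 3.0)

print(np.allclose(flat_sim, flat_sim[0]))  # True
print(np.allclose(flat_kol, flat_kol[0]))  # True
```

In practice the same compensation applied to a simulated spectrum reveals where numerical viscosity steepens the cascade below the formal inertial range.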

    Assessment of Natural Resources Use for Sustainable Development - DPSIR Framework for Case Studies in Portsmouth and Thames Gateway, U.K.

    This chapter reports on the use of the DPSIR framework to assess the sustainability of the intertidal environments within two UK case study areas, Portsmouth and the Thames Gateway. It focuses on statutory conservation areas (Sites of Special Scientific Interest, SSSIs) dominated by intertidal habitats. Two are located in Portsmouth (Portsmouth and Langstone Harbours) and four in the Thames Gateway (Benfleet Marshes, South Thames Estuary, Medway Estuary, and the Swale). Based on the reduction of a number of pressures and impacts observed in recent decades and the improvement of overall environmental quality, all six SSSIs are considered to be sustainable in the short and medium term. In the future, it is possible that the impacts of climate change, especially sea-level rise, might result in further reduction in the area and/or quality of intertidal habitats. Further integration between conservation and planning objectives (both for urban development and management of flood risk) at the local level is needed to support the long-term sustainability of intertidal habitats.

    Flexible Causal Inference for Political Science

    Measuring the causal impact of state behavior on outcomes is one of the biggest methodological challenges in the field of political science, for two reasons: behavior is generally endogenous, and the threat of unobserved variables that confound the relationship between behavior and outcomes is pervasive. Matching methods, widely considered to be the state of the art in causal inference in political science, are generally ill-suited to inference in the presence of unobserved confounders. Heckman-style multiple-equation models offer a solution to this problem; however, they rely on functional-form assumptions that can produce substantial bias in estimates of average treatment effects. We describe a category of models, flexible joint likelihood models, that account for both features of the data while avoiding reliance on rigid functional-form assumptions. We then assess these models' performance in a series of neutral simulations, in which they produce a substantial (55% to 90%) reduction in bias relative to competing models. Finally, we demonstrate their utility in a reanalysis of Simmons' (2000) classic study of the impact of Article VIII commitment on compliance with the IMF's currency-restriction regime.

    Consequences of asteroid fragmentation during impact hazard mitigation

    The consequences of the fragmentation of an Earth-threatening asteroid due to an attempted deflection are examined in this paper. The minimum energy required for a successful impulsive deflection of a threatening object is computed and compared to the energy required to break up a small asteroid. The results show that the fragmentation of an asteroid that underwent an impulsive deflection, such as a kinetic impact or a nuclear explosion, is a very plausible event. A statistical model is used to approximate the number and size of the fragments, as well as the distribution of velocities at the instant after the deflection attempt takes place. This distribution of velocities is a function of the energy provided by the deflection attempt, whereas the number and size of the asteroidal fragments are a function of the size of the largest fragment. The model also takes into account the gravitational forces that could lead to a reaggregation of the asteroid after fragmentation. The probability distribution of the pieces after the deflection is then propagated forward in time until the encounter with Earth. A probability damage factor (i.e., the expected damage caused by a given size of fragment multiplied by its impact probability) is then computed and analyzed for different plausible scenarios, characterized by different levels of deflection energy and lead times.
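A statistical fragmentation model of the kind described above can be sketched very crudely: draw fragment masses from a cumulative power law N(>m) ∝ m^-b capped by the largest fragment, then assign speeds by partitioning the deflection energy. All parameter values below (masses, exponent, energy) are invented for illustration and are not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (not taken from the paper): parent asteroid
# mass, largest-fragment mass, and the exponent b of the cumulative
# power-law fragment size distribution N(>m) ∝ m^-b.
M_total = 1.0e10    # kg
m_largest = 2.0e9   # kg
b = 0.8

# Draw fragment masses below the largest fragment (inverse-CDF sampling
# of the power law) until the parent mass is fully accounted for.
masses = [m_largest]
while sum(masses) < M_total:
    m = m_largest * rng.random() ** (1.0 / b)
    masses.append(min(m, M_total - sum(masses)))

# Split an assumed deflection energy equally among fragments, a crude
# stand-in for the paper's post-deflection velocity distribution.
E_deflect = 1.0e12  # J
v = np.sqrt(2.0 * (E_deflect / len(masses)) / np.array(masses))

print(len(masses), v.min(), v.max())
```

Smaller fragments receive higher speeds under this equal-energy partition, which is why lead time matters: the fragment cloud disperses along the orbit before the Earth encounter.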

    A Test of the Standard Hypothesis for the Origin of the HI Holes in Holmberg II

    The nearby irregular galaxy Holmberg II has been extensively mapped in HI using the Very Large Array (VLA), revealing intricate structure in its interstellar gas component (Puche et al. 1992). An analysis of these structures shows the neutral gas to contain a number of expanding HI holes. The formation of the HI holes has been attributed to multiple supernova events occurring within wind-blown shells around young, massive star clusters, with as many as 10-200 supernovae required to produce many of the holes. From the sizes and expansion velocities of the holes, Puche et al. assigned ages of ~10^7 to 10^8 years. If the supernova scenario for the formation of the HI holes is correct, it implies the existence of star clusters with a substantial population of late-B, A, and F main-sequence stars at the centers of the holes. Many of these clusters should be detectable in deep ground-based CCD images of the galaxy. In order to test the supernova hypothesis for the formation of the HI holes, we have obtained and analyzed deep broad-band BVR and narrow-band H-alpha images of Ho II. We compare the optical and HI data and search for evidence of the expected star clusters in and around the HI holes. We also use the HI data to constrain models of the expected remnant stellar population. We show that in several of the holes the observed upper limits on the remnant cluster brightness are strongly inconsistent with the SNe hypothesis described in Puche et al. Moreover, many of the HI holes are located in regions of very low optical surface brightness which show no indication of recent star formation. Here we present our findings and explore possible alternative explanations for the existence of the HI holes in Ho II, including the suggestion that some of the holes were produced by gamma-ray burst events.
    Comment: 30 pages, including 6 tables and 3 images. To appear in Astron. Journal (June 1999).
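The age assignment mentioned above rests on a simple kinematic estimate, t ≈ R / v_exp. A minimal sketch (the radius and velocity values are illustrative, not taken from Puche et al. 1992):

```python
# Kinematic age estimate for an expanding HI hole: t ≈ R / v_exp.

PC_TO_KM = 3.086e13    # kilometres per parsec
SEC_PER_YR = 3.156e7   # seconds per year

def hole_age_yr(radius_pc: float, v_exp_km_s: float) -> float:
    """Crude expansion age in years from hole radius and expansion velocity."""
    return radius_pc * PC_TO_KM / v_exp_km_s / SEC_PER_YR

# A 500 pc hole expanding at 10 km/s gives an age of order 5e7 yr,
# inside the ~10^7 to 10^8 yr range quoted in the abstract.
print(f"{hole_age_yr(500, 10):.2e}")
```

Note this assumes constant expansion velocity; a decelerating shell makes the true age somewhat smaller than R / v_exp.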

    Toward a conceptual framework of emotional relationship marketing: an examination of two UK political parties

    The purpose of this paper is to review the notion of branding and evaluate its applicability to political parties. As ideological politics is in decline, branding may provide a consistent narrative in which voters feel a sense of warmth and belonging. The paper aims to build an understanding of the complexity of building a political brand, where a combination of image, logo, leadership, and values can all contribute to a compelling brand narrative. It investigates how competing positive and negative messages attempt to build and distort the brand identity. A critical review of branding, relationship marketing, and political science literature articulates the conceptual development of branding and its applicability to political parties. The success or failure of negative campaigning depends on the authenticity of a political party's brand values, which create a coherent brand story. If there is no distance between the brand values articulated by the political party and the values its community perceives, this creates an "authentic" brand. However, if there is a gap, this paper illustrates how negative campaigning can be used to build a "doppelganger brand," which undermines the credibility of the authentic political brand. The paper argues that political parties need to understand not only how brand stories are developed but also how they can be used to protect against negative advertising. This has implications for political marketing strategists and political parties. This paper draws together branding theory and relationship marketing and incorporates them into a framework that makes a contribution to the political marketing literature.

    Optimal low-thrust trajectories to asteroids through an algorithm based on differential dynamic programming

    In this paper an optimisation algorithm based on Differential Dynamic Programming is applied to the design of rendezvous and fly-by trajectories to near-Earth objects. Differential dynamic programming is a successive approximation technique that computes a feedback control law at a fixed number of decision times. In this way the high-dimensional problem characteristic of low-thrust optimisation is reduced to a series of low-dimensional problems. The proposed method exploits the stage-wise approach to incorporate an adaptive refinement of the discretisation mesh within the optimisation process. A particular interpolation technique is used to preserve the feedback nature of the control law, thus improving robustness against the approximation errors introduced during the adaptation process. The algorithm implements global variations of the control law, which ensure a further increase in robustness. The results presented show how the proposed approach is capable of fully exploiting the multi-body dynamics of the problem; in fact, in one of the study cases, a fly-by of the Earth is scheduled which was not included in the first-guess solution.
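The stage-wise backward/forward structure of differential dynamic programming can be sketched on a toy linear-quadratic problem (the dynamics, cost weights, and horizon below are invented for illustration and are unrelated to the paper's low-thrust model; for a linear-quadratic problem the DDP backward sweep reduces exactly to the Riccati recursion and converges in one pass):

```python
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # discrete double-integrator dynamics
B = np.array([[0.0], [0.1]])
Q = np.eye(2)                            # state cost weight
R = np.eye(1) * 0.1                      # control cost weight
N = 50                                   # number of decision times

def backward_pass():
    """Backward sweep: compute a time-varying feedback law u = K[k] @ x."""
    V = Q.copy()                         # terminal value-function Hessian
    K = [None] * N
    for k in reversed(range(N)):
        Qxx = Q + A.T @ V @ A            # second-order stage expansions
        Quu = R + B.T @ V @ B
        Qux = B.T @ V @ A
        K[k] = -np.linalg.solve(Quu, Qux)
        V = Qxx + Qux.T @ K[k]           # Riccati value update
    return K

def forward_pass(x0, K):
    """Forward sweep: roll the feedback law out from the initial state."""
    x = x0.copy()
    for k in range(N):
        x = A @ x + B @ (K[k] @ x)
    return x

K = backward_pass()
x_final = forward_pass(np.array([1.0, 0.0]), K)
print(np.linalg.norm(x_final))   # state driven close to the origin
```

On a nonlinear low-thrust problem the same two sweeps are iterated, with the dynamics re-linearized around each new trajectory; the feedback form of the law is what the paper's interpolation scheme preserves across mesh refinements.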

    Correlation between IgA tissue transglutaminase antibody ratio and histological finding in celiac disease.

    OBJECTIVES: Positivity for both immunoglobulin A anti-tissue transglutaminase (TTG) and anti-endomysium antibodies (EMA) has a positive predictive value of nearly 100% for celiac disease (CD). The objective of the present study was to evaluate whether patients of any age with a high pretest probability of CD, a high titre of anti-TTG, and EMA positivity have a high probability of intestinal damage and may not require a biopsy for final diagnosis. METHODS: A retrospective analysis of 412 consecutively referred patients, age range 10 months to 72 years, who underwent small-bowel biopsy for suspicion of CD and positivity for both anti-TTG and EMA, was performed at 4 Italian centers. Biopsies were evaluated independently by 2 pathologists using the modified Marsh classification; in cases of dissimilar results, a third pathologist examined the biopsy. The final histological diagnosis was expressed as the prevalent or highest score assigned by the pathologist board. RESULTS: Three hundred ninety-six patients (96.1%) had histological findings consistent with CD (grade 2, 3a, 3b, or 3c of the modified Marsh classification). An anti-TTG ratio ≥ 7 was able to identify, with the 3 assays used (Celikey, anti-TTG immunoglobulin A, EuTTG), all of the patients with significant mucosal damage (Marsh ≥ 2), independent of age and sex; specificity and positive predictive value were 100%. An anti-TTG ratio > 20 was more specific (99.8%) for identification of patients with villous atrophy (Marsh 3a, 3b, or 3c). CONCLUSIONS: Patients with anti-TTG positivity ≥ 7-fold the cutoff, confirmed by EMA positivity, have a high probability of duodenal damage. In selected conditions, a duodenal biopsy may be avoided, and a confirmed strongly positive anti-TTG result could be the basis for prescribing a gluten-free diet.
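The specificity and positive predictive value quoted for the ratio ≥ 7 threshold come from a standard 2x2 test-versus-histology table. A minimal sketch (the counts below are sketched from the abstract's totals of 412 patients and 396 with Marsh ≥ 2, with zero false positives and zero false negatives assumed to reproduce the reported 100% figures; the paper's actual cell counts are not given here):

```python
def specificity_ppv(tp: int, fp: int, tn: int, fn: int):
    """Return (specificity, positive predictive value) for a test threshold.

    tp/fp/tn/fn follow the usual 2x2 convention, with histological
    damage (Marsh >= 2) as the reference standard.
    """
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    return spec, ppv

# Assumed counts: all 396 damaged patients above the ratio-7 cutoff,
# all 16 undamaged patients below it.
spec, ppv = specificity_ppv(tp=396, fp=0, tn=16, fn=0)
print(spec, ppv)   # 1.0 1.0
```

With zero false positives the PPV is 100% regardless of prevalence, which is what makes the biopsy-sparing argument possible in this high-pretest-probability setting.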

    Flower development and pollen vitality of Moringa oleifera Lam. grown in a humid temperate climatic condition

    Moringa oleifera is a tropical tree cultivated in many countries. This species has acquired great importance in human nutrition and was recently designated a "novel food" by the European Commission. Recently, moringa plants have been introduced into humid temperate climatic areas, among them Moreno (Buenos Aires Province, Argentina). In this area, cultivation is possible for the production of leaves, but plants need protection during winter in order to avoid damage from low temperatures and hence to produce capsules and seeds. The main objective of this research was to study the flower morphology and anatomy of M. oleifera, as well as microsporogenesis and the viability of pollen grains, in plants cultivated in Moreno in comparison with those grown in a humid subtropical climatic area of Argentina (San Miguel de Tucumán). Flowers grown in the temperate environment were similar in morphological parameters to those observed in the subtropical environment. Nevertheless, pollen grain fertility depended directly on air temperature and was negatively affected by the lower temperatures registered at the temperate site. According to the observed results, pollen viability increases with mean monthly temperatures above 16°C.
    Fil: Radice, Silvia. Universidad de Moron, Facultad de Agronomia y Ciencias Agroalimentarias, Laboratorio de Investigaciones en Fisiología Vegetal, Argentina; Consejo Nacional de Investigaciones Científicas y Técnicas, Argentina. Fil: Giordani, Edgardo. Università degli Studi di Firenze, Italia.

    Crucial Physical Dependencies of the Core-Collapse Supernova Mechanism

    We explore with self-consistent 2D Fornax simulations the dependence of the outcome of collapse on many-body corrections to neutrino-nucleon cross sections, the nucleon-nucleon bremsstrahlung rate, electron capture on heavy nuclei, pre-collapse seed perturbations, and inelastic neutrino-electron and neutrino-nucleon scattering. Importantly, proximity to criticality amplifies the role of even small changes in the neutrino-matter couplings, and such changes can together add to produce outsized effects. When close to the critical condition, the cumulative result of a few small effects (including seeds) that individually have only modest consequence can convert an anemic explosion into a robust one, or even a dud into a blast. Such sensitivity is not seen in one dimension and may explain the apparent heterogeneity in the outcomes of detailed simulations performed internationally. A natural conclusion is that the different groups collectively are closer to a realistic understanding of the mechanism of core-collapse supernovae than might have seemed apparent.
    Comment: 25 pages; 10 figures.