
    Differential gene network analysis for the identification of asthma-associated therapeutic targets in allergen-specific T-helper memory responses

    Fifty most significant differentially expressed genes in HDM-stimulated versus resting CD4 T cells from HDM-sensitized atopics with asthma. Gene expression patterns were compared between HDM-stimulated and unstimulated CD4 T cells from HDM-sensitized atopics with asthma. Here we present the 50 most significant differentially expressed genes. (XLS 34 kb)
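    The gene-ranking step described above can be sketched as a Welch t-test between stimulated and resting expression values; the gene names, sample sizes, and toy data below are illustrative assumptions, not the study's actual pipeline:

```python
import numpy as np

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = a.var(ddof=1), b.var(ddof=1)
    return (a.mean() - b.mean()) / np.sqrt(va / len(a) + vb / len(b))

def top_genes(stim, rest, genes, k=50):
    """Rank genes by |t| between stimulated and resting expression
    matrices (rows = genes, columns = replicates); return top k names."""
    t = np.array([welch_t(stim[i], rest[i]) for i in range(len(genes))])
    order = np.argsort(-np.abs(t))
    return [genes[i] for i in order[:k]]

# Toy data: 3 genes, 4 replicates each; the second gene is induced.
rng = np.random.default_rng(0)
rest = rng.normal(5.0, 0.3, size=(3, 4))
stim = rest + rng.normal(0.0, 0.3, size=(3, 4))
stim[1] += 4.0  # simulate strong HDM-driven induction of "IL4"
print(top_genes(stim, rest, ["ACTB", "IL4", "GAPDH"], k=1))  # -> ['IL4']
```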

    The long-run performance of U.S. bidding firms in the post M&A period : the impact of bid type, payment method and industry specialisation

    This study investigates how mergers and acquisitions (M&A) affect the wealth of shareholders of public firms in the United States (U.S.). More specifically, it investigates whether the nature of the bid, the payment method used, and the type of M&A have implications for shareholders of U.S. bidding firms. The study analyses 352 mergers and acquisitions in the U.S. during the period 1999-2008, and its results indicate that bidding firms suffer significant negative buy-and-hold abnormal returns in the three-year period after an M&A announcement. The results also suggest that, in the long run, hostile bids and cash-financed bidders outperform friendly bids and stock-funded bidders, respectively. Furthermore, the study also finds that in the long run bidder firms that focus on industry specialisation within their M&A targets significantly outperform firms that adopt a more diversified strategy. The analysis also investigates the effects of M&A specialisation/diversification in six different sectors, and finds that specialised bidders outperform diversified bidders in four sectors: consumer & basic materials, energy & utilities, communications, and technology. Furthermore, bidder firms in the financial services sector perform significantly better when diversifying into other sectors, while the performance of bidder firms in the industrial sector appears unaffected by the degree of M&A specialisation or diversification.
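    The buy-and-hold abnormal return (BHAR) measure used in this study has a standard form: compound the firm's returns over the holding period and subtract the compounded benchmark return. A minimal sketch with illustrative numbers, not the study's data:

```python
def bhar(firm_returns, benchmark_returns):
    """Buy-and-hold abnormal return: compound the firm's periodic
    returns over the holding period, minus the compounded benchmark."""
    firm = 1.0
    bench = 1.0
    for r_f, r_b in zip(firm_returns, benchmark_returns):
        firm *= 1.0 + r_f
        bench *= 1.0 + r_b
    return firm - bench

# A bidder returning 1% per month against a benchmark returning 2%
# per month underperforms over a 36-month (three-year) window:
months = 36
print(round(bhar([0.01] * months, [0.02] * months), 4))  # -> -0.6091
```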

    Enabling Parametric Optimal Ascent Trajectory Modeling During Early Phases of Design

    During the early phases of engineering design, the costs committed are high, costs incurred are low, and the design freedom is high. It is well documented that decisions made in these early design phases drive the entire design's life cycle. In a traditional paradigm, key design decisions are made when little is known about the design. As the design matures, design changes become more difficult -- in both cost and schedule -- to enact. Indeed, the current capability-based paradigm that has emerged because of the constrained economic environment calls for the infusion of knowledge acquired during later design phases into earlier design phases, i.e. bringing knowledge acquired during preliminary and detailed design into pre-conceptual and conceptual design. An area of critical importance to launch vehicle design is the optimization of its ascent trajectory, as the optimal trajectory will take full advantage of the launch vehicle's capability to deliver a maximum amount of payload into orbit. Hence, the optimal ascent trajectory plays an important role in the vehicle's affordability posture, as more economically viable access-to-space solutions are needed in today's constrained economic environment. The problem of ascent trajectory optimization is not a new one. Several programs widely used in industry allow trajectory analysts to determine the optimal ascent trajectory based on detailed vehicle and insertion orbit parameters. Yet little is known about the launch vehicle early in the design phase - information that many different disciplines require in order to successfully optimize the ascent trajectory. Thus, the current paradigm of optimizing ascent trajectories involves generating point solutions for every change in a vehicle's design parameters. This is often a very tedious, manual, and time-consuming task for the analysts.
Moreover, the trajectory design space is highly non-linear and multi-modal due to the interaction of various constraints. Additionally, when these obstacles are coupled with the Program to Optimize Simulated Trajectories [1] (POST), an industry standard program for optimizing ascent trajectories that is difficult to use, expert trajectory analysts are required to effectively optimize a vehicle's ascent trajectory. As has been pointed out, the paradigm of trajectory optimization is still a very manual one because applying modern computational resources to POST remains a challenging problem. The nuances and difficulties involved in correctly utilizing, and therefore automating, the program present a large problem. To address these issues, the authors discuss a methodology that has been developed. The methodology is two-fold: first, a set of heuristics, captured while working with expert analysts, is introduced to replicate the current state of the art; second, the power of modern computing is leveraged to evaluate multiple trajectories simultaneously, enabling exploration of the trajectory design space early, during the pre-conceptual and conceptual phases of design. When this methodology is coupled with design of experiments to train surrogate models, the authors are able to visualize the trajectory design space, enabling parametric optimal ascent trajectory information to be integrated with other pre-conceptual and conceptual design tools. The potential impact of this methodology's success would be a fully automated POST evaluation suite for conceptual and preliminary design trade studies. This will enable engineers to characterize the ascent trajectory's sensitivity to design changes in an arbitrary number of dimensions and to find settings for trajectory-specific variables that result in optimal performance for a "dialed-in" launch vehicle design.
The effort described in this paper was developed for the Advanced Concepts Office [2] at NASA Marshall Space Flight Center.

    An Expert System-Driven Method for Parametric Trajectory Optimization During Conceptual Design

    During the early phases of engineering design, the costs committed are high, costs incurred are low, and the design freedom is high. It is well documented that decisions made in these early design phases drive the entire design's life cycle cost. In a traditional paradigm, key design decisions are made when little is known about the design. As the design matures, design changes become more difficult in both cost and schedule to enact. The current capability-based paradigm, which has emerged because of the constrained economic environment, calls for the infusion of knowledge usually acquired during later design phases into earlier design phases, i.e. bringing knowledge acquired during preliminary and detailed design into pre-conceptual and conceptual design. An area of critical importance to launch vehicle design is the optimization of its ascent trajectory, as the optimal trajectory will be able to take full advantage of the launch vehicle's capability to deliver a maximum amount of payload into orbit. Hence, the optimal ascent trajectory plays an important role in the vehicle's affordability posture yet little of the information required to successfully optimize a trajectory is known early in the design phase. Thus, the current paradigm of optimizing ascent trajectories involves generating point solutions for every change in a vehicle's design parameters. This is often a very tedious, manual, and time-consuming task for the analysts. Moreover, the trajectory design space is highly non-linear and multi-modal due to the interaction of various constraints. When these obstacles are coupled with the Program to Optimize Simulated Trajectories (POST), an industry standard program to optimize ascent trajectories that is difficult to use, expert trajectory analysts are required to effectively optimize a vehicle's ascent trajectory. 
Over the course of this paper, the authors discuss a methodology developed at NASA Marshall's Advanced Concepts Office to address these issues. The methodology is two-fold: first, capture the heuristics developed by human analysts over their many years of experience; and second, leverage the power of modern computing to evaluate multiple trajectories simultaneously and therefore enable the exploration of the trajectory's design space early during the pre-conceptual and conceptual phases of design. This methodology is coupled with design of experiments in order to train surrogate models, which enables trajectory design space visualization and makes parametric optimal ascent trajectory information available when early design decisions are being made.
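The surrogate-model step described above can be illustrated with a least-squares response surface fitted to sampled design points. The toy payload function, design variables, and ranges below are assumptions standing in for actual POST trajectory runs:

```python
import numpy as np

# Stand-in for a POST trajectory evaluation: payload delivered as a toy
# quadratic function of two hypothetical design variables (thrust-to-weight
# ratio and pitch-over angle). The real response would come from POST.
def payload(tw, pitch):
    return 25.0 - 3.0 * (tw - 1.3) ** 2 - 2.0 * (pitch - 8.0) ** 2

rng = np.random.default_rng(1)
n = 50
tw = rng.uniform(1.0, 1.6, n)      # thrust-to-weight samples (DOE)
pitch = rng.uniform(5.0, 11.0, n)  # pitch-over angle samples, degrees
y = payload(tw, pitch)

# Fit a quadratic response-surface surrogate by least squares.
X = np.column_stack([np.ones(n), tw, pitch, tw**2, pitch**2, tw * pitch])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# The surrogate predicts trajectory performance anywhere in the box,
# so the design space can be explored without rerunning the heavy tool.
def surrogate(t, p):
    return coef @ np.array([1.0, t, p, t**2, p**2, t * p])

print(round(surrogate(1.3, 8.0), 2))  # recovers the optimum value, 25.0
```

Because the toy response is itself quadratic, the fit is exact; a real POST response surface would carry approximation error that the design of experiments must control.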

    The Douglas-Fir Genome Sequence Reveals Specialization of the Photosynthetic Apparatus in Pinaceae.

    A reference genome sequence for Pseudotsuga menziesii var. menziesii (Mirb.) Franco (Coastal Douglas-fir) is reported, providing a reference sequence for a third genus of the family Pinaceae. The contiguity and quality of the genome assembly far exceed those of other conifer reference genome sequences (contig N50 = 44,136 bp and scaffold N50 = 340,704 bp). Incremental improvements in sequencing and assembly technologies are in part responsible for the higher quality reference genome, but it may also be due to a slightly lower exact repeat content in Douglas-fir vs. pine and spruce. Comparative genome annotation with angiosperm species reveals gene-family expansion and contraction in Douglas-fir and other conifers, which may account for some of the major morphological and physiological differences between the two major plant groups. Notable differences in the size of the NDH-complex gene family and in genes underlying the functional basis of shade tolerance/intolerance were observed. This reference genome sequence not only provides an important resource for Douglas-fir breeders and geneticists but also sheds additional light on the evolutionary processes that have led to the divergence of modern angiosperms from the more ancient gymnosperms.
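    The N50 statistics quoted above follow a standard definition: the length at which the contigs (or scaffolds), taken from longest to shortest, first cover at least half of the total assembly. A minimal sketch with toy lengths:

```python
def n50(lengths):
    """N50: the length L such that sequences of length >= L cover at
    least half of the total assembly size."""
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if running * 2 >= total:
            return length

# Toy assembly of 300 bp total: 100 + 80 = 180 covers half, so N50 = 80.
print(n50([100, 80, 50, 40, 30]))  # -> 80
```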

    Does Judicial Immunity Serve the Common Good?


    The free volume of poly(vinyl methylether) as computed in a wide temperature range and at length scales up to the nanoregion

    14 pages, 12 figures. In the present work, we focus on free volume evaluations from different points of view, including the aspects of probe size, temperature, and cavity threshold. The free volume structure is analyzed on structures of poly(vinyl methylether) prepared by fully atomistic molecular dynamics. First, the temperature behavior of the overall free volume and of the free volume separated into individual cavities is shown. The origin of large free volume cavities is explained. A comprehensive view of the cavity number is provided, and a complicated behavior observed previously is now explained. The number of large cavities remains almost constant with temperature. In contrast, the number of small cavities, related to atomic packing, changes with temperature in distinct ways for the glassy and supercooled regions. The cavity number maxima determine a percolation threshold according to percolation theory. The change in polymer properties with temperature can be related to a percolation of the free volume according to free volume theory, when proper probe radii of ∼0.8 Å are used for its observation. A construction of a probabilistic distribution of free volume sizes is suggested. The free volume distributions reported here are bimodal. The bimodal character is explained by two different packings (atomic and segmental) forming a prepeak and a main peak in the distribution. Further attention is dedicated to comparisons of the computed free volume sizes and the ortho-positronium (o-Ps) lifetimes. The prepeak of the free volume distribution is probably not seen by o-Ps because of a cavity threshold limit. The effect of the shape factor on the computed o-Ps lifetimes is tested. The quasicavities obtained by redistributing the free volume maintain the ratio of their main dimensions with temperature.
Finally, novel data on the cavity environment are provided, and it is suggested how these can be useful in light of recent developments in positron annihilation methods. The coordination number of large cavities with the polymer segments is around 1, as predicted by free volume theory. Similarly to the percolation and the cavity number, the coordination number exhibits a change when explored with a suitable probe radius of ∼0.8 Å. Visualizations illustrate the properties investigated in the present work. This work was supported by Project No. MAT2007-63681 (Spanish Ministry of Education) and Grant No. IT-436-07 (Basque Government). Support from Spanish Ministry of Education Grant No. CSD2006-53 is also acknowledged. Peer reviewed
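The probe-size dependence of free volume described above can be illustrated by Monte Carlo probe insertion: random probe centres are tested against the atom positions, and the accepted fraction estimates the accessible volume. The atom coordinates, radii, and box size below are toy assumptions, not the study's poly(vinyl methylether) structures:

```python
import numpy as np

def free_volume_fraction(atoms, r_atom, r_probe, box, n_trials=20000, seed=0):
    """Estimate the fraction of a cubic box accessible to a spherical
    probe: insert random probe centres and count those that overlap no
    atom (no periodic images, to keep the sketch short)."""
    rng = np.random.default_rng(seed)
    points = rng.uniform(0.0, box, size=(n_trials, 3))
    d = np.linalg.norm(points[:, None, :] - atoms[None, :, :], axis=2)
    free = np.all(d >= r_atom + r_probe, axis=1)
    return free.mean()

# Toy system: three "atoms" in a 10 Angstrom box, probed at the ~0.8 A
# radius the study found suitable, and at a much larger radius.
atoms = np.array([[2.0, 2.0, 2.0], [5.0, 5.0, 5.0], [8.0, 8.0, 2.0]])
f_small = free_volume_fraction(atoms, r_atom=1.7, r_probe=0.8, box=10.0)
f_large = free_volume_fraction(atoms, r_atom=1.7, r_probe=2.5, box=10.0)
print(f_small > f_large)  # a larger probe sees less free volume -> True
```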

    Determination of in silico rules for predicting small molecule binding behavior to nucleic acids in vitro.

    The vast knowledge of nucleic acids is evolving, and it is now known that DNA can adopt highly complex, heterogeneous structures. Among the most intriguing are the G-quadruplex structures, which are thought to play a pivotal role in cancer pathogenesis. Efforts to find new small molecules for these and other physiologically relevant nucleic acid structures have generally been limited to isolation from natural sources or rational synthesis of promising lead compounds. However, with the rapid growth in available computational power, virtual screening and computational approaches are quickly becoming a reality in academia and industry as an efficient and economical way to discover new lead compounds. These computational efforts have historically focused almost entirely on proteins as targets and have neglected DNA. We present research here showing not only that software can be utilized for targeting DNA, but that selectivity metrics can be developed to predict the binding mechanism of a small molecule to a DNA target. The software packages Surflex and AutoDock were chosen for evaluation and were demonstrated to accurately reproduce the known crystal structures of several small molecules that bind by the most common nucleic acid interaction mechanisms of groove binding and intercalation. These programs were further used to rationalize known affinity and selectivity data for a 67-compound library against a library of nucleic acid structures including duplexes, triplexes and quadruplexes. Based upon the known binding behavior of these compounds, in silico metrics were developed to classify compounds as either groove binders or intercalators. These rules were subsequently used to identify new triplex- and quadruplex-binding small molecules by structure- and ligand-based virtual screening approaches using a virtual library consisting of millions of commercially available small molecules.
The binding behavior of the newly discovered triplex- and quadruplex-binding compounds was empirically validated using a number of spectroscopic, fluorescence and thermodynamic equilibrium techniques. In total, this research predicted the binding behavior of these test compounds in silico and subsequently validated these findings in vitro. This research presents a novel approach to discover lead compounds that target multiple nucleic acid morphologies.
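The classification idea (scoring a ligand against both a groove-binding site model and an intercalation site model, then comparing) can be illustrated with a toy rule. The score values, margin, and function below are hypothetical and are not the metrics derived in this work:

```python
def classify_binding(groove_score, intercalation_score, margin=1.2):
    """Toy selectivity metric (hypothetical threshold, not the study's
    actual rules): compare docking scores of a ligand posed against a
    groove-binding site model versus an intercalation site model.
    Higher score means better predicted fit."""
    if groove_score >= margin * intercalation_score:
        return "groove binder"
    if intercalation_score >= margin * groove_score:
        return "intercalator"
    return "ambiguous"

print(classify_binding(8.4, 5.1))  # -> groove binder
print(classify_binding(4.0, 7.9))  # -> intercalator
print(classify_binding(6.0, 6.3))  # -> ambiguous
```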

    Ecological equivalence: a realistic assumption for niche theory as a testable alternative to neutral theory

    Hubbell's 2001 neutral theory unifies biodiversity and biogeography by modelling steady-state distributions of species richness and abundances across spatio-temporal scales. Accurate predictions have issued from its core premise that all species have identical vital rates. Yet no ecologist believes that species are identical in reality. Here I explain this paradox in terms of the ecological equivalence that species must achieve at their coexistence equilibrium, defined by zero net fitness for all regardless of intrinsic differences between them. I show that the distinction of realised from intrinsic vital rates is crucial to evaluating community resilience. An analysis of competitive interactions reveals how zero-sum patterns of abundance emerge for species with contrasting life-history traits as for identical species. I develop a stochastic model to simulate community assembly from a random drift of invasions sustaining the dynamics of recruitment following deaths and extinctions. Species are allocated identical intrinsic vital rates for neutral dynamics, or random intrinsic vital rates and competitive abilities for niche dynamics either on a continuous scale or between dominant-fugitive extremes. Resulting communities have steady-state distributions of the same type for more or less extremely differentiated species as for identical species. All produce negatively skewed log-normal distributions of species abundance, zero-sum relationships of total abundance to area, and Arrhenius relationships of species to area. Intrinsically identical species nevertheless support fewer total individuals, because their densities impact as strongly on each other as on themselves. Truly neutral communities have measurably lower abundance/area and higher species/abundance ratios. Neutral scenarios can be parameterized as null hypotheses for testing competitive release, which is a sure signal of niche dynamics. 
Ignoring the true strength of interactions between and within species risks a substantial misrepresentation of community resilience to habitat loss.
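The zero-sum neutral drift dynamics described above (every death balanced by a recruitment, with occasional introduction of a new species) can be sketched as follows; the community size, speciation rate, and step count are illustrative assumptions:

```python
import random

def neutral_drift(J=200, steps=20000, nu=0.02, seed=42):
    """Zero-sum neutral community drift in the spirit of Hubbell's
    model: each death is replaced either by a copy of a randomly chosen
    individual or, with probability nu, by a brand-new species."""
    random.seed(seed)
    community = [0] * J          # start as a monodominant community
    next_species = 1
    for _ in range(steps):
        i = random.randrange(J)  # a random individual dies...
        if random.random() < nu:
            community[i] = next_species  # ...replaced by a new species
            next_species += 1
        else:
            community[i] = community[random.randrange(J)]
    return community

community = neutral_drift()
counts = {s: community.count(s) for s in set(community)}
abundances = sorted(counts.values(), reverse=True)
# Species richness at steady state, and the zero-sum total (always J).
print(len(abundances), sum(abundances))
```

Running this for a sweep of `nu` values and plotting the ranked abundances reproduces the qualitative shape (few common, many rare species) that the abstract's full model analyses quantitatively.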