
    Target prediction and a statistical sampling algorithm for RNA-RNA interaction

    It has been shown that the accessibility of target sites has a critical influence on miRNA and siRNA efficacy. In this paper we present a program, rip2.0, which computes not only the energetically most favorable target sites based on the hybridization probability, but also statistically sampled structures that characterize and represent the Boltzmann ensemble of RNA-RNA interaction structures. The outputs are retrieved by backtracing an improved dynamic programming solution for the partition function, based on the approach of Huang et al. (Bioinformatics). The O(N^6)-time and O(N^4)-space algorithm is implemented in C (available from http://www.combinatorics.cn/cbpc/rip2.html). Comment: 7 pages, 10 figures
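    The statistical sampling step can be illustrated with a minimal sketch: structures are drawn from the Boltzmann ensemble with probability exp(-E/RT)/Z. In rip2.0 this is done by stochastic backtracing of the partition-function matrices rather than by enumerating candidates, so the function below (its name, the toy structures, and the energies are illustrative assumptions) only shows the target distribution, not the actual algorithm.

```python
import math
import random

# Gas constant in kcal/(mol*K) and a physiological temperature in K
R, T = 0.0019872, 310.15

def boltzmann_sample(structures, energies, n_samples, rng=random):
    """Draw structures with probability exp(-E/RT)/Z.

    `structures` and `energies` are parallel lists. rip2.0 never enumerates
    the ensemble explicitly -- samples come from stochastic backtracing of
    the dynamic-programming matrices -- but the sampled distribution is the
    same Boltzmann distribution realized here.
    """
    weights = [math.exp(-e / (R * T)) for e in energies]
    Z = sum(weights)                      # partition function over the candidates
    probs = [w / Z for w in weights]
    return rng.choices(structures, weights=probs, k=n_samples)

# Toy example: three hypothetical joint structures with hybridization energies (kcal/mol)
samples = boltzmann_sample(["S1", "S2", "S3"], [-12.3, -10.1, -7.8], 1000)
print({s: samples.count(s) for s in set(samples)})
```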

    Yangians, finite W-algebras and the Non Linear Schrodinger hierarchy

    We exhibit an algebra morphism between Yangians and certain finite W-algebras. This correspondence is nicely illustrated in the framework of the Non Linear Schrodinger hierarchy. For this purpose, we give an explicit realization of the Yangian generators in terms of deformed oscillators. Comment: LaTeX2e, 10 pages. Talk presented by E. Ragoucy at the ACTP-Nankai Symposium on Yang-Baxter systems, non linear models and their applications, Seoul (Korea), October 20-23, 199
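    For orientation, the Yangian Y(gl_N) referred to here can be presented by generators t_{ij}^{(r)} subject to the standard defining relations; the block below records that textbook presentation as a reminder, not the specific oscillator realization constructed in the paper.

```latex
% Defining relations of the Yangian Y(gl_N) in terms of generators t_{ij}^{(r)}, r >= 1,
% with the convention t_{ij}^{(0)} = \delta_{ij}; the paper realizes such generators
% in terms of deformed oscillators.
\[
  \bigl[\, t_{ij}^{(r+1)},\, t_{kl}^{(s)} \,\bigr]
  - \bigl[\, t_{ij}^{(r)},\, t_{kl}^{(s+1)} \,\bigr]
  \;=\;
  t_{kj}^{(r)}\, t_{il}^{(s)} - t_{kj}^{(s)}\, t_{il}^{(r)},
  \qquad r, s \ge 0 .
\]
```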

    Writing a Scientific Paper Prior to the Research

    The traditional approach to preparing a research report for publication is to begin writing after the study has been completed. We propose another approach: to write a zeroth draft before the study is begun. This approach helps to focus the investigator's attention during the planning stage on critical aspects of the study. The discipline of writing down the rationale, the methods, and the variety of possible outcomes and their significance helps to clarify the logic on which the study is based. If these are acceptable to all authors and colleagues in the zeroth draft, it is likely that the research questions posed will be answered in a definitive way and that the final draft will be scientifically sound. The notion of writing a paper before doing the research may raise concerns of prejudice, preconception, or even academic dishonesty. How could one possibly know what to write until after the study is completed? However, if one considers the actual content of a scientific paper or research report, it becomes clear that most of the report can be drafted before the first data are collected. The process is in many ways similar to that of preparing a formal proposal to a funding agency. Indeed, a grant application may borrow heavily from the zeroth draft of the paper, and vice versa. The content of the zeroth draft is only the first of a series of approximations to the final form. Yet, it can be a very useful beginning. Authors often procrastinate when faced with writing up the results of completed research projects and may find it much easier to write at the beginning of a project when enthusiasm is at its peak. Most importantly, there may be no better way to prepare the mind, anticipate pitfalls, and avoid wasted time, effort, and money than to write a zeroth draft.

    Quality Improvement Intervention for Reduction of Redundant Testing

    Laboratory data are critical to analyzing and improving clinical quality. In the setting of residual use of creatine kinase M and B isoenzyme testing for myocardial infarction, we assessed disease outcomes of discordant creatine kinase M and B isoenzyme (+)/troponin I (−) test pairs in order to address anticipated clinician concerns about potential loss of case-finding sensitivity following proposed discontinuation of routine creatine kinase and creatine kinase M and B isoenzyme testing. Time-sequenced interventions were introduced. The main outcome was the percentage of cardiac marker studies performed within guidelines. Nonguideline orders dominated at baseline. Creatine kinase M and B isoenzyme testing in 7496 order sets failed to detect additional myocardial infarctions but was associated with 42 potentially preventable admissions per quarter. Interruptive computerized soft stops improved guideline compliance from 32.3% to 58% (P < .001) in services not receiving peer leader intervention and to >80% (P < .001) with peer leadership that featured dashboard feedback about test order performance. This successful experience was recapitulated in interrupted time series within 2 additional services within facility 1 and then in 2 external hospitals (including a critical access facility). Improvements have been sustained postintervention. Laboratory cost savings at the academic facility were estimated to be ≥US$635 000 per year. National collaborative data indicated that facility 1 improved its order patterns from fourth to first quartile compared to peer norms and imply that nonguideline orders persist elsewhere. This example illustrates how pathologists can provide leadership in assisting clinicians in changing laboratory ordering practices. We found that clinicians respond to local laboratory data about their own test performance and that evidence suggesting harm is more compelling to clinicians than evidence of cost savings. Our experience indicates that interventions done at an academic facility can be readily instituted by private practitioners at external facilities. The intervention data also supplement existing literature that electronic order interruptions are more successful when combined with modalities that rely on peer education combined with dashboard feedback about laboratory order performance. The findings may have implications for the role of the pathology laboratory in the ongoing pivot from quantity-based to value-based health care.

    Estimation of current density distribution under electrodes for external defibrillation

    BACKGROUND: Transthoracic defibrillation is the most common life-saving technique for the restoration of the heart rhythm of cardiac arrest victims. The procedure requires adequate application of large electrodes on the patient's chest to ensure low-resistance electrical contact. The current density distribution under the electrodes is non-uniform, leading to muscle contraction and pain, or to a risk of burns. The recent introduction of automatic external defibrillators, and even wearable defibrillators, places new, demanding requirements on electrode structure. METHOD AND RESULTS: Using the pseudo-elliptic differential equation of Laplace type with appropriate boundary conditions and applying finite element method modeling, electrodes of various shapes and structures were studied. The non-uniformity of the current density distribution was shown to be moderately improved by adding a low-resistivity layer between the metal and the tissue and by a ring around the electrode perimeter. The inclusion of openings in long-term wearable electrodes additionally disturbs the current density profile. However, a number of small-size perforations may result in acceptable current density distribution. CONCLUSION: The current density distribution non-uniformity of circular electrodes is about 30% less than that of square-shaped electrodes. The use of an interface layer of intermediate resistivity, comparable to that of the underlying tissues, and a high-resistivity perimeter ring can further improve the distribution. The inclusion of skin aeration openings disturbs the current paths, but an appropriate selection of their number and size provides a reasonable compromise.
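    The modeling step can be illustrated with a much-simplified numerical sketch: solve Laplace's equation for the potential in a 2D tissue cross-section with a fixed-potential electrode on part of the top boundary, then take the normal gradient under the electrode to see how current density concentrates at the electrode edge. This is an illustrative finite-difference relaxation, not the authors' finite element model; the grid size, normalized conductivity, and boundary layout are assumptions chosen only for demonstration.

```python
import numpy as np

# Toy 2D cross-section: potential phi obeys Laplace's equation in homogeneous tissue.
# An electrode at fixed potential covers part of the top edge; the bottom edge is grounded.
nx, ny = 80, 40
phi = np.zeros((ny, nx))
electrode = slice(20, 60)          # columns covered by the electrode (assumed placement)
phi[0, electrode] = 1.0            # electrode held at 1 V (normalized)

for _ in range(5000):              # Jacobi relaxation of the interior nodes
    phi[1:-1, 1:-1] = 0.25 * (phi[2:, 1:-1] + phi[:-2, 1:-1]
                              + phi[1:-1, 2:] + phi[1:-1, :-2])
    phi[0, electrode] = 1.0        # re-impose Dirichlet condition under the electrode
    phi[-1, :] = 0.0               # grounded bottom boundary
    phi[0, :20] = phi[1, :20]      # insulating (Neumann) top boundary outside the electrode
    phi[0, 60:] = phi[1, 60:]
    phi[:, 0] = phi[:, 1]          # insulating side boundaries
    phi[:, -1] = phi[:, -2]

# Current density just under the electrode, J ~ -sigma * dphi/dy (sigma normalized to 1).
j_under_electrode = phi[0, electrode] - phi[1, electrode]
print("edge-to-centre current density ratio:",
      j_under_electrode[0] / j_under_electrode[len(j_under_electrode) // 2])
```

    Running the sketch gives a ratio noticeably above 1, reproducing the familiar edge effect that motivates perimeter rings and interface layers in real electrode designs.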

    RNAalifold: improved consensus structure prediction for RNA alignments

    Background: The prediction of a consensus structure for a set of related RNAs is an important first step for subsequent analyses. RNAalifold, which computes the minimum energy structure that is simultaneously formed by a set of aligned sequences, is one of the oldest and most widely used tools for this task. In recent years, several alternative approaches have been advocated, pointing to several shortcomings of the original RNAalifold approach. Results: We show that the accuracy of RNAalifold predictions can be improved substantially by introducing a different, more rational handling of alignment gaps, and by replacing the rather simplistic model of covariance scoring with more sophisticated RIBOSUM-like scoring matrices. These improvements are achieved without compromising the computational efficiency of the algorithm. We show that, on different datasets, the new version of RNAalifold outperforms not only the old one but also several other recently developed tools. Conclusion: The new version of RNAalifold can replace the old one for almost any application and is also competitive with other approaches, including those based on SCFGs, maximum expected accuracy, or hierarchical nearest neighbor classifiers.
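    The covariance component of consensus scoring can be sketched as follows: for a candidate base pair of alignment columns (i, j), pairs of sequences are scored by how their nucleotides at those columns relate, classically by counting compensatory changes or, as in the improved version, by looking the pair of pairs up in a RIBOSUM-like substitution matrix. The sketch below uses a toy scoring function (the hypothetical `pair_score`), not the actual RIBOSUM matrices shipped with RNAalifold, and treats gapped rows in the simplest possible way.

```python
from itertools import combinations

CANONICAL = {"AU", "UA", "GC", "CG", "GU", "UG"}

def pair_score(pair1, pair2):
    """Toy stand-in for a RIBOSUM-like matrix entry: reward conserved or
    compensatory canonical pairs, penalize non-canonical ones."""
    ok1, ok2 = pair1 in CANONICAL, pair2 in CANONICAL
    if ok1 and ok2:
        # number of positions at which the two pairs differ (0, 1 or 2 mutations);
        # compensatory double changes score highest
        return float(sum(a != b for a, b in zip(pair1, pair2)))
    return -1.0 if (ok1 or ok2) else -2.0

def covariance_score(alignment, i, j):
    """Average pairwise score for columns i and j of a gapped alignment."""
    pairs = [seq[i] + seq[j] for seq in alignment]
    scored = [pair_score(p, q) for p, q in combinations(pairs, 2)
              if "-" not in p and "-" not in q]     # crude gap handling for illustration
    return sum(scored) / len(scored) if scored else 0.0

aln = ["GCCAAAGGC", "GUCAAAGAC", "GGCAAAGCC"]
print(covariance_score(aln, 1, 7))   # columns 1 and 7 co-vary while staying canonically paired
```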

    Pyrometallurgical Treatment of Apatite Concentrate with the Objective of Rare Earth Element Recovery: Part II

    Apatite, Ca5(PO4)3F, is a useful raw material for the production of both elemental phosphorus and phosphoric acid, and the mine tailings present at Luossavaara-Kiirunavaara AB (LKAB) in Kiruna, Sweden, represent a significant potential European source of apatite if upgraded to a concentrate. In the present study, pilot apatite concentrate made from the LKAB tailings has been pyrometallurgically treated using carbon to extract phosphorus without fluxing at temperatures exceeding 1800 °C, with the ultimate objective of recovering rare earth elements (REEs) from the resulting slag/residue phases. Experimental behavior has been modeled using equilibrium thermodynamic predictions performed with HSC®. A process is proposed, and a mass–energy balance presented, for the simultaneous production of P4 and CaC2 (ultimately for acetylene, C2H2, and PVC production) from apatite, producing a lime residue significantly enriched in REEs. Possible implications for kiln-based processing of apatite are also discussed.
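    For orientation, plausible overall reactions for the two product streams can be written as below. These are illustrative, balanced stoichiometries only, assuming fluorine reports to the residue as CaF2 and CO is the gaseous carbon product; the paper's HSC equilibrium calculations and mass–energy balance, not these idealized equations, govern the actual process figures.

```latex
% Carbothermic extraction of phosphorus from fluorapatite (illustrative overall reaction,
% assuming fluorine is retained in the residue as CaF2):
\[
  4\,\mathrm{Ca_5(PO_4)_3F} + 30\,\mathrm{C}
  \;\longrightarrow\;
  3\,\mathrm{P_4(g)} + 30\,\mathrm{CO(g)} + 18\,\mathrm{CaO} + 2\,\mathrm{CaF_2}
\]
% Subsequent conversion of part of the lime residue to calcium carbide (standard carbide reaction):
\[
  \mathrm{CaO} + 3\,\mathrm{C} \;\longrightarrow\; \mathrm{CaC_2} + \mathrm{CO(g)}
\]
```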

    ViennaRNA Package 2.0

    Background: Secondary structure forms an important intermediate level of description of nucleic acids that encapsulates the dominating part of the folding energy, is often well conserved in evolution, and is routinely used as a basis to explain experimental findings. Based on carefully measured thermodynamic parameters, exact dynamic programming algorithms can be used to compute ground states, base pairing probabilities, as well as thermodynamic properties. Results: The ViennaRNA Package has been a widely used compilation of RNA secondary structure related computer programs for nearly two decades. Major changes in the structure of the standard energy model, the Turner 2004 parameters, the pervasive use of multi-core CPUs, and an increasing number of algorithmic variants prompted a major technical overhaul of both the underlying RNAlib and the interactive user programs. New features include an expanded repertoire of tools to assess RNA-RNA interactions and restricted ensembles of structures, additional output information such as centroid structures and maximum expected accuracy structures derived from base pairing probabilities, or z-scores for locally stable secondary structures, and support for input in fasta format. Updates were implemented without compromising the computational efficiency of the core algorithms while ensuring compatibility with earlier versions. Conclusions: The ViennaRNA Package 2.0, supporting concurrent computations via OpenMP, can be downloaded from http://www.tbi.univie.ac.at/RNA.
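    The centroid and maximum expected accuracy (MEA) structures mentioned above are both derived from the base pairing probability matrix. As a minimal, package-independent sketch (the probabilities below are hypothetical inputs, not produced by RNAlib calls), the centroid of the Boltzmann ensemble can be read off by keeping exactly those base pairs whose probability exceeds one half:

```python
def centroid_structure(n, pair_prob):
    """Centroid of the ensemble: include base pair (i, j) iff its probability > 0.5.

    `pair_prob` is a dict {(i, j): p} of base pairing probabilities (0-indexed, i < j),
    e.g. as obtained from a partition-function folding of a length-n sequence.
    Pairs with p > 0.5 can never conflict (two pairs sharing a base have probabilities
    summing to at most 1), so the result is always a valid secondary structure.
    """
    structure = ["."] * n
    for (i, j), p in pair_prob.items():
        if p > 0.5:
            structure[i], structure[j] = "(", ")"
    return "".join(structure)

# Hypothetical probabilities for a 9-nt sequence (illustrative values only)
probs = {(0, 8): 0.91, (1, 7): 0.84, (2, 6): 0.47, (3, 5): 0.12}
print(centroid_structure(9, probs))   # -> "((.....))"
```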