SNPmplexViewer--toward a cost-effective traceability system
<p>Abstract</p> <p>Background</p> <p>Beef traceability has become mandatory in many regions of the world and is typically achieved through the use of unique numerical codes on ear tags and animal passports. DNA-based traceability uses the animal's own DNA code to identify it and the products derived from it. Using <it>SNaPshot</it>, a primer-extension-based method, a multiplex of 25 SNPs in a single reaction has been used to reduce the cost of genotyping a panel of SNPs useful for identity control.</p> <p>Findings</p> <p>To further decrease the cost of <it>SNaPshot</it>, we introduce the Perl script <it>SNPmplexViewer</it>, which facilitates the analysis of trace files for reactions performed without fluorescent size standards. <it>SNPmplexViewer</it> automatically aligns reference and target trace electropherograms, run with and without fluorescent size standards, respectively. <it>SNPmplexViewer</it> produces a modified target trace file containing a normalised trace in which the reference size standards are embedded. <it>SNPmplexViewer</it> also outputs aligned images of the two electropherograms together with a difference profile.</p> <p>Conclusions</p> <p>Modified trace files generated by <it>SNPmplexViewer</it> enable genotyping of <it>SNaPshot</it> reactions performed without fluorescent size standards, using common fragment-sizing software packages. <it>SNPmplexViewer</it>'s normalised output may also improve the genotyping software's performance. Thus, <it>SNPmplexViewer</it> is a free, general tool that reduces the cost of <it>SNaPshot</it> and enables fast viewing and comparison of trace electropherograms for fragment analysis. <it>SNPmplexViewer</it> is available at <url>http://cowry.agri.huji.ac.il/cgi-bin/SNPmplexViewer.cgi</url>.</p>
Recent progress in understanding and projecting regional and global mean sea-level change
Considerable progress has been made in understanding the present and future regional and global sea level in the 2 years since the publication of the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change. Here, we evaluate how the new results affect the AR5’s assessment of (i) historical sea level rise, including attribution of that rise and implications for the sea level budget, (ii) projections of the components and of total global mean sea level (GMSL), and (iii) projections of regional variability and emergence of the anthropogenic signal. In each of these cases, new work largely provides additional evidence in support of the AR5 assessment, providing greater confidence in those findings. Recent analyses confirm the twentieth century sea level rise, with some analyses showing a slightly smaller rate before 1990 and some a slightly larger value than reported in the AR5. There is now more evidence of an acceleration in the rate of rise. Ongoing ocean heat uptake and associated thermal expansion have continued since 2000, and are consistent with ocean thermal expansion reported in the AR5. A significant amount of heat is being stored deeper in the water column, with a larger rate of heat uptake since 2000 compared to the previous decades and with the largest storage in the Southern Ocean. The first formal detection studies for ocean thermal expansion and glacier mass loss since the AR5 have confirmed the AR5 finding of a significant anthropogenic contribution to sea level rise over the last 50 years. New projections of glacier loss from two regions suggest smaller contributions to GMSL rise from these regions than in studies assessed by the AR5; additional regional studies are required to further assess whether there are broader implications of these results. 
Mass loss from the Greenland Ice Sheet, primarily as a result of increased surface melting, and from the Antarctic Ice Sheet, primarily as a result of increased ice discharge, has accelerated. The largest estimates of acceleration in mass loss from the two ice sheets for 2003–2013 equal or exceed the acceleration of GMSL rise calculated from the satellite altimeter sea level record over the longer period of 1993–2014. However, when increased mass gain in land water storage and parts of East Antarctica, and decreased mass loss from glaciers in Alaska and some other regions are taken into account, the net acceleration in the ocean mass gain is consistent with the satellite altimeter record. New studies suggest that a marine ice sheet instability (MISI) may have been initiated in parts of the West Antarctic Ice Sheet (WAIS), but that it will affect only a limited number of ice streams in the twenty-first century. New projections of mass loss from the Greenland and Antarctic Ice Sheets by 2100, including a contribution from parts of WAIS undergoing unstable retreat, suggest a contribution that falls largely within the likely range (i.e., two thirds probability) of the AR5. These new results increase confidence in the AR5 likely range, indicating that there is a greater probability that sea level rise by 2100 will lie in this range with a corresponding decrease in the likelihood of an additional contribution of several tens of centimeters above the likely range. In view of the comparatively limited state of knowledge and understanding of rapid ice sheet dynamics, we continue to think that it is not yet possible to make reliable quantitative estimates of future GMSL rise outside the likely range. Projections of twenty-first century GMSL rise published since the AR5 depend on results from expert elicitation, but we have low confidence in conclusions based on these approaches. 
New work on regional projections and emergence of the anthropogenic signal suggests that the two commonly predicted features of future regional sea level change (the increasing tilt across the Antarctic Circumpolar Current and the dipole in the North Atlantic) are related to regional changes in wind stress and surface heat flux. Moreover, it is expected that sea level change in response to anthropogenic forcing, particularly in regions of relatively low unforced variability such as the low-latitude Atlantic, will be detectable over most of the ocean by 2040. The east-west contrast of sea level trends in the Pacific observed since the early 1990s cannot be satisfactorily accounted for by climate models, nor yet definitively attributed to either unforced variability or forced climate change.
Recent Advances in Our Understanding of the Role of Meltwater in the Greenland Ice Sheet System
Nienow, Sole and Cowton's Greenland research has been supported by a number of UK NERC research grants (NER/O/S/2003/00620; NE/F021399/1; NE/H024964/1; NE/K015249/1; NE/K014609/1), and Slater has been supported by a NERC PhD studentship. Purpose of the review: This review discusses the role that meltwater plays within the Greenland ice sheet system. The ice sheet's hydrology is important because it affects mass balance through its impact on meltwater runoff processes and ice dynamics. The review considers recent advances in our understanding of the storage and routing of water through the supraglacial, englacial, and subglacial components of the system, and their implications for the ice sheet. Recent findings: There have been dramatic increases in surface meltwater generation and runoff since the early 1990s, due both to increased air temperatures and to decreasing surface albedo. Processes in the subglacial drainage system have similarities to those of valley glaciers, and in a warming climate the efficiency of meltwater routing to the ice sheet margin is likely to increase. The behaviour of the subglacial drainage system appears to limit the impact of increased surface melt on annual rates of ice motion in sections of the ice sheet that terminate on land, while the large volumes of meltwater routed subglacially deliver significant volumes of sediment and nutrients to downstream ecosystems. Summary: Considerable advances have been made recently in our understanding of Greenland ice sheet hydrology and its wider influences. Nevertheless, critical gaps persist both in our understanding of hydrology-dynamics coupling, notably at tidewater glaciers, and in our understanding of runoff processes, which ensures that projecting Greenland's future mass balance remains challenging.
Evidence of an active volcanic heat source beneath the Pine Island Glacier
Tectonic landforms reveal that the West Antarctic Ice Sheet (WAIS) lies atop a major volcanic rift system. However, identifying subglacial volcanism is challenging. Here we show geochemical evidence of a volcanic heat source upstream of the fast-melting Pine Island Ice Shelf, documented by seawater helium isotope ratios at the front of the Ice Shelf cavity. The localization of mantle helium in glacial meltwater reveals that volcanic heat induces melt beneath the grounded glacier and feeds the subglacial hydrological network crossing the grounding line. The observed transport of mantle helium out of the Ice Shelf cavity indicates that volcanic heat is supplied to the grounded glacier at a rate of ~2500 ± 1700 MW, roughly half that of the active Grímsvötn volcano in Iceland. Our finding of a substantial volcanic heat source beneath a major WAIS glacier highlights the need to understand subglacial volcanism, its hydrologic interaction with the marine margins, and its potential role in the future stability of the WAIS.
Implementation and performance of adaptive mesh refinement in the Ice Sheet System Model (ISSM v4.14)
Accurate projections of the evolution of ice sheets in a changing climate
require a fine mesh/grid resolution in ice sheet models to correctly capture
fundamental physical processes, such as the evolution of the grounding line,
the region where grounded ice starts to float. The evolution of the grounding
line indeed plays a major role in ice sheet dynamics, as it is a fundamental
control on marine ice sheet stability. Numerical modeling of a grounding line
requires significant computational resources since the accuracy of its
position depends on grid or mesh resolution. A technique that improves
accuracy with reduced computational cost is the adaptive mesh refinement
(AMR) approach. We present here the implementation of the AMR technique in
the finite element Ice Sheet System Model (ISSM) to simulate grounding line
dynamics under two different benchmarks: MISMIP3d and MISMIP+. We test
different refinement criteria: (a) distance around the grounding line, (b) an a
posteriori error estimator, the Zienkiewicz–Zhu (ZZ) estimator, and
(c) different combinations of (a) and (b). In both benchmarks, the ZZ error
estimator presents high values around the grounding line. In the MISMIP+ setup,
this estimator also presents high values in the grounded
part of the ice sheet, following the complex shape of the bedrock geometry.
The ZZ estimator helps guide the refinement procedure such that AMR
performance is improved. Our results show that computational time with AMR
depends on the required accuracy, but in all cases, it is significantly
shorter than for uniformly refined meshes. We conclude that AMR without an
associated error estimator should be avoided, especially for real glaciers
that have a complex bed geometry.
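Criterion (a) above, refinement by distance to the grounding line, can be illustrated with a minimal sketch. This is an assumption-laden toy, not ISSM's implementation: it simply flags any element whose centroid lies within a chosen radius of a grounding-line point.

```python
from math import hypot

def flag_for_refinement(centroids, grounding_line, radius):
    """Flag elements whose centroid lies within `radius` of any
    grounding-line point (a toy version of criterion (a) above)."""
    flags = []
    for cx, cy in centroids:
        near = any(hypot(cx - gx, cy - gy) <= radius
                   for gx, gy in grounding_line)
        flags.append(near)
    return flags

# Toy mesh: four element centroids; grounding line at x = 0.5.
centroids = [(0.1, 0.0), (0.45, 0.0), (0.55, 0.0), (0.9, 0.0)]
grounding_line = [(0.5, 0.0)]
print(flag_for_refinement(centroids, grounding_line, 0.1))
# -> [False, True, True, False]
```

In a real AMR loop this flagging step would be re-run as the grounding line migrates, which is why combining it with an error estimator such as ZZ improves performance: the estimator also catches refinement needs away from the grounding line, e.g. over complex bedrock.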
Coupling computer-interpretable guidelines with a drug-database through a web-based system – The PRESGUID project
BACKGROUND: Clinical Practice Guidelines (CPGs) available today are not extensively used, owing to a lack of proper integration into clinical settings and knowledge-related information resources, and a lack of decision support at the point of care in a particular clinical context. OBJECTIVE: The PRESGUID project (PREScription and GUIDelines) aims to improve the assistance provided by guidelines. The project proposes an online service enabling physicians to consult computerized CPGs linked to drug databases, for easier integration into the healthcare process. METHODS: Computable CPGs are structured as decision trees and coded in XML format. Recommendations related to drug classes are tagged with ATC codes. We use a mapping module to couple the computerized guidelines with a drug database, which contains detailed information about each usable specific medication. In this way, therapeutic recommendations are backed by current and up-to-date information from the database. RESULTS: Two authoritative CPGs, originally distributed as static textual documents, have been implemented to validate the computerization process and to illustrate the usefulness of the resulting automated CPGs and their coupling with a drug database. We discuss the advantages of this approach for practitioners and the implications for both guideline developers and drug database providers. Other CPGs will be implemented and evaluated in real conditions by clinicians working in different health institutions.
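The mapping from class-level ATC codes to individual drugs can exploit the fact that ATC codes are hierarchical: a class-level code is a prefix of every full code beneath it. The sketch below is a hedged illustration of that idea only; the record schema and drug database are hypothetical, not PRESGUID's actual mapping module.

```python
def drugs_for_recommendation(class_atc, drug_db):
    """Return drug-database entries whose full ATC code falls under the
    class-level ATC code attached to a guideline recommendation.
    ATC codes are hierarchical, so class membership is a prefix match.
    The record schema here is hypothetical."""
    return [d for d in drug_db if d["atc"].startswith(class_atc)]

# Hypothetical mini drug database.
drug_db = [
    {"name": "ramipril",  "atc": "C09AA05"},
    {"name": "losartan",  "atc": "C09CA01"},
    {"name": "metformin", "atc": "A10BA02"},
]

# A recommendation tagged with the ACE-inhibitor class code C09A:
print([d["name"] for d in drugs_for_recommendation("C09A", drug_db)])
# -> ['ramipril']
```

Because the match is by prefix, the same lookup serves recommendations tagged at any ATC level, from anatomical group down to chemical substance.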
Decoding of Superimposed Traces Produced by Direct Sequencing of Heterozygous Indels
Direct Sanger sequencing of a diploid template containing a heterozygous insertion or deletion results in a difficult-to-interpret mixed trace formed by two allelic traces superimposed onto each other. Existing computational methods for deconvolution of such traces require knowledge of a reference sequence or the availability of both direct and reverse mixed sequences of the same template. We describe a simple yet accurate method, which uses dynamic programming optimization to predict superimposed allelic sequences solely from a string of letters representing peaks within an individual mixed trace. We used the method to decode 104 human traces (mean length 294 bp) containing heterozygous indels of 5 to 30 bp, with a mean of 99.1% of bases per allelic sequence reconstructed correctly and unambiguously. Simulations with artificial sequences have demonstrated that the method yields accurate reconstructions when (1) the allelic sequences forming the mixed trace are sufficiently similar, (2) the analyzed fragment is significantly longer than the indel, and (3) multiple indels, if present, are well-spaced. Because these conditions occur in most encountered DNA sequences, the method is widely applicable. It is available as a free Web application, Indelligent, at http://ctap.inhs.uiuc.edu/dmitriev/indel.asp
Algorithms for optimizing drug therapy
BACKGROUND: Drug therapy has become increasingly efficient, with more drugs available for treatment of an ever-growing number of conditions. Yet drug use is reported to be suboptimal in several aspects, such as dosage, patient adherence and outcome of therapy. The aim of the current study was to investigate the possibility of optimizing drug therapy using computer programs available on the Internet. METHODS: One hundred and ten officially endorsed text documents, published between 1996 and 2004, containing guidelines for drug therapy in 246 disorders, were analyzed with regard to information about patient-, disease- and drug-related factors and relationships between these factors. This information was used to construct algorithms for identifying optimum treatment in each of the studied disorders. These algorithms were categorized in order to define as few models as possible that could still accommodate the identified factors and the relationships between them. The resulting program prototypes were implemented in HTML (user interface) and JavaScript (program logic). RESULTS: Three types of algorithms were sufficient for the intended purpose. The simplest type is a list of factors, each of which implies that the particular patient should or should not receive treatment. This is adequate in situations where only one treatment exists. The second type, a more elaborate model, is required when treatment can be provided using drugs from different pharmacological classes and the selection of drug class depends on patient characteristics. An easily implemented set of if-then statements was able to manage the identified information in such instances. The third type was needed in the few situations where the selection and dosage of drugs depended on the degree to which one or more patient-specific factors were present. In these cases the implementation of an established decision model based on fuzzy sets was required. 
Computer programs based on one of these three models could be constructed for all but one of the studied disorders. The single exception was depression, where reliable relationships between patient characteristics, drug classes and outcome of therapy remain to be defined. CONCLUSION: Algorithms for optimizing drug therapy can, with presumably rare exceptions, be developed for any disorder using standard Internet programming methods.
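The second model type, a set of if-then statements selecting a drug class from patient characteristics, can be sketched as follows. The study implemented its prototypes in JavaScript; this is a Python sketch of the pattern only, and the factors and drug classes below are hypothetical illustrations, not taken from the guidelines analysed in the study.

```python
def select_treatment(patient):
    """If-then drug-class selection (the study's second model type).
    All factor names and drug classes here are hypothetical."""
    if patient.get("contraindication"):
        return "no drug treatment"
    if patient.get("renal_impairment"):
        return "drug class B"       # hypothetical renally safer class
    if patient.get("age", 0) >= 65:
        return "drug class A, reduced dose"
    return "drug class A, standard dose"

print(select_treatment({"age": 72}))
# -> drug class A, reduced dose
print(select_treatment({"renal_impairment": True}))
# -> drug class B
```

Rule order encodes clinical priority: contraindications are checked before class selection, and dose adjustment comes last, which is what makes a plain if-then cascade sufficient for this model type.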
Diverse M-Best Solutions by Dynamic Programming
Many computer vision pipelines involve dynamic programming primitives such as finding a shortest path or the minimum energy solution in a tree-shaped probabilistic graphical model. In such cases, extracting not merely the best but the set of M best solutions is useful to generate a rich collection of candidate proposals for downstream processing. In this work, we show how the M best solutions of tree-shaped graphical models can be obtained by dynamic programming on a special graph with M layers. The proposed multi-layer concept is optimal for searching the M best solutions, and flexible enough to also approximate diverse M-best solutions. We illustrate its usefulness with applications to object detection, panorama stitching and centerline extraction.
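The layered idea can be sketched in its simplest shortest-path form: keep the M best partial path costs at every node, which plays the role of the M layers. This is a hedged toy on a DAG, not the paper's tree-graphical-model construction, and it returns costs only, with no diversity constraint.

```python
def m_best_path_costs(edges, source, sink, m):
    """M smallest source->sink path costs in a DAG.
    `edges` maps node -> list of (successor, weight) pairs and is
    assumed to be listed in a topological order of the nodes.
    Each node keeps at most m partial costs: the "m layers"."""
    best = {source: [0]}            # m best costs of paths ending at each node
    for u in edges:                 # topological insertion order assumed
        for v, w in edges[u]:
            merged = best.get(v, []) + [c + w for c in best.get(u, [])]
            best[v] = sorted(merged)[:m]
    return best.get(sink, [])

# Paths: s->a->t (cost 2), s->a->b->t (cost 3), s->b->t (cost 5).
edges = {"s": [("a", 1), ("b", 4)], "a": [("t", 1), ("b", 1)], "b": [("t", 1)]}
print(m_best_path_costs(edges, "s", "t", 2))
# -> [2, 3]
```

Truncating each node's list to m is what makes the search optimal for M-best: any path outside a node's top m partial costs cannot appear among the m cheapest completions at the sink.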
A High-End Estimate of Sea Level Rise for Practitioners
Sea level rise (SLR) is a long-lasting consequence of climate change because global anthropogenic warming takes centuries to millennia to equilibrate for the deep ocean and ice sheets. SLR projections based on climate models support policy analysis, risk assessment and adaptation planning today, despite their large uncertainties. The central range of the SLR distribution is estimated by process-based models. However, risk-averse practitioners often require information about plausible future conditions that lie in the tails of the SLR distribution, which are poorly defined by existing models. Here, a community effort combining scientists and practitioners builds on a framework for discussing physical evidence to quantify high-end global SLR for practitioners. The approach is complementary to the IPCC AR6 report and provides further physically plausible high-end scenarios. High-end estimates for the different SLR components are developed for two climate scenarios at two timescales. For global warming of +2°C in 2100 (RCP2.6/SSP1-2.6) relative to pre-industrial values, our high-end global SLR estimates are up to 0.9 m in 2100 and 2.5 m in 2300. Similarly, for a high-emission scenario (RCP8.5/SSP5-8.5), we estimate up to 1.6 m in 2100 and up to 10.4 m in 2300. The large and growing differences between the scenarios beyond 2100 emphasize the long-term benefits of mitigation. However, even a modest 2°C warming may cause multi-meter SLR on centennial time scales, with profound consequences for coastal areas. Earlier high-end assessments focused on instability mechanisms in Antarctica, while here we emphasize the importance of the timing of ice shelf collapse around Antarctica. This timing is highly uncertain owing to low understanding of the driving processes. Hence both process understanding and emission scenario control high-end SLR.
