
    Lignin biomarkers as tracers of mercury sources in lakes water column

    This study examines the role of specific terrigenous organic compounds as important vectors of mercury (Hg) transported from watersheds to lakes of the Canadian boreal forest. To differentiate autochthonous from allochthonous organic matter (OM), lignin-derived biomarker signatures [Lambda, S/V, C/V, P/(V + S), 3,5-Bd/V and (Ad/Al)v] were used. Since lignin is produced exclusively by terrigenous plants, this approach gives an unequivocal picture of watershed inputs to the lakes; it also characterizes both the source of the OM and its state of degradation. The water column of six lakes on the Canadian Shield was sampled monthly between June and September 2005. Lake total dissolved Hg concentrations and Lambda were positively correlated, indicating that Hg and lignin inputs are linked (dissolved OM r2 = 0.62, p < 0.0001; particulate OM r2 = 0.76, p < 0.0001). Ratios of P/(V + S) and 3,5-Bd/V from both dissolved and particulate OM of the water column suggest an inverse relationship between the progressive state of pedogenesis and maturation of the OM in soil before entering the lake, and the Hg concentrations in the water column. No relation was found between Hg levels in the lakes and the watershed flora composition (angiosperm versus gymnosperm, or woody versus non-woody compounds). This study has significant implications for watershed management, since limiting fresh terrestrial OM inputs should reduce Hg inputs to aquatic systems. This is particularly the case for large-scale land-use impacts, such as deforestation, agriculture and urbanization, which are associated with large quantities of soil OM being transferred to aquatic systems.
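    The study's central statistic is the coefficient of determination (r2) between a lignin biomarker yield and dissolved Hg. As a purely illustrative sketch (the data below are invented, not the study's measurements), the correlation can be computed like this:

```python
# Illustrative only: correlating a lignin biomarker yield (Lambda,
# mg / 100 mg OC) with total dissolved Hg (ng/L) across water-column
# samples, as the study does for six Canadian Shield lakes.

def pearson_r2(xs, ys):
    """Squared Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return (sxy * sxy) / (sxx * syy)

# Invented example values for illustration only.
lambda_yield = [0.4, 0.7, 1.1, 1.5, 2.0, 2.6]
hg_dissolved = [0.9, 1.3, 2.0, 2.4, 3.3, 4.1]

r2 = pearson_r2(lambda_yield, hg_dissolved)
```

    A high r2 here would mirror the study's finding that lignin inputs and Hg concentrations rise together.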

    Automated Labeling of German Chest X-Ray Radiology Reports using Deep Learning

    Radiologists are in short supply globally, and deep learning models offer a promising way to address this shortage as part of clinical decision-support systems. However, training such models often requires expensive and time-consuming manual labeling of large datasets. Automatic label extraction from radiology reports can reduce the time required to obtain labeled datasets, but this task is challenging due to semantically similar words and a lack of annotated data. In this work, we explore the potential of weakly supervising a deep learning-based label prediction model using a rule-based labeler. We propose a deep learning-based CheXpert label prediction model, pre-trained on reports labeled by a rule-based German CheXpert model and fine-tuned on a small dataset of manually labeled reports. Our results demonstrate the effectiveness of the approach, which significantly outperformed the rule-based model on all three tasks. These findings highlight the benefits of deep learning-based models even in sparse-data scenarios, and the value of a rule-based labeler as a tool for weak supervision.
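    The recipe described here — rule-based labeler produces noisy labels, model is pre-trained on them, then fine-tuned on a few manual labels — can be sketched with a toy classifier. Everything below (the rule, vocabulary, reports, and the perceptron standing in for the deep model) is invented for illustration; the paper's actual labeler and model are not shown:

```python
# Toy sketch of the weak-supervision recipe (all names and data invented):
# 1) a rule-based labeler produces noisy labels for many reports,
# 2) a model is pre-trained on those weak labels,
# 3) the model is fine-tuned on a few manually labeled reports.

def rule_labeler(report):
    # Crude rule: flag a finding only if one trigger word appears.
    return 1 if "opacity" in report else 0

def featurize(report, vocab):
    # Bag-of-words presence features.
    return [1.0 if w in report else 0.0 for w in vocab]

def train(model, data, labels, vocab, epochs=20, lr=0.1):
    # Perceptron-style updates; stands in for gradient-based training.
    for _ in range(epochs):
        for report, y in zip(data, labels):
            x = featurize(report, vocab)
            pred = 1 if sum(w * xi for w, xi in zip(model, x)) > 0 else 0
            if pred != y:
                model = [w + lr * (y - pred) * xi for w, xi in zip(model, x)]
    return model

vocab = ["opacity", "effusion", "clear", "normal"]
unlabeled = ["diffuse opacity left lung", "lungs clear",
             "opacity and effusion", "normal study"]
weak_labels = [rule_labeler(r) for r in unlabeled]                 # stage 1
model = train([0.0] * len(vocab), unlabeled, weak_labels, vocab)   # stage 2
manual = [("small effusion noted", 1), ("normal heart and lungs", 0)]
model = train(model, [r for r, _ in manual],
              [y for _, y in manual], vocab)                       # stage 3
```

    After fine-tuning, the model flags "effusion" reports that the rule alone misses — the qualitative benefit the abstract reports.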

    The impact of natural and anthropogenic Dissolved Organic Carbon (DOC), and pH on the toxicity of triclosan to the crustacean Gammarus pulex (L.).

    Regulatory ecotoxicology testing rarely accounts for the influence of natural water chemistry on the bioavailability and toxicity of a chemical. This study therefore examines whether key omissions relating to Dissolved Organic Carbon (DOC) and pH affect measured effect concentrations (EC). Laboratory ecotoxicology tests were undertaken for the widely used antimicrobial compound triclosan on adult Gammarus pulex (L.), a wild-type amphipod, using synthetic fresh water, humic acid solutions and wastewater treatment works effluent. The toxicity of triclosan was tested at two pH values, 7.3 and 8.4, with and without added DOC, and 24- and 48-hour EC values with 95% confidence intervals were calculated. Toxicity tests undertaken at a pH above triclosan's pKa and in the presence of humic acid and effluent, containing mean DOC concentrations of 11 and 16 mg L-1 respectively, resulted in significantly decreased triclosan toxicity. This was most likely a result of varying triclosan speciation and complexation, with triclosan's pKa and high hydrophobicity controlling its bioavailability. The mean 48-hour EC50 values varied between 0.75 ± 0.45 and 1.93 ± 0.12 mg L-1 depending on conditions. These results suggest that standard ecotoxicology tests can produce inaccurate estimates of triclosan's bioavailability and subsequent toxicity in natural aquatic environments. They highlight the need for further consideration of the role water chemistry plays in the toxicity of organic contaminants, and of how ambient environmental conditions are incorporated into future standard-setting and consenting processes.
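    An EC50 is the concentration producing a 50% effect. The study fitted proper dose-response models with confidence intervals; as a much simpler sketch (with invented dose-response data, not the study's), the EC50 can be located by interpolating on a log-concentration scale:

```python
import math

def ec50_interpolate(concs, effects):
    """Linearly interpolate, on a log-concentration scale, the
    concentration producing a 50 % effect. `concs` ascending,
    `effects` in percent."""
    pairs = list(zip(concs, effects))
    for (c0, e0), (c1, e1) in zip(pairs, pairs[1:]):
        if e0 <= 50.0 <= e1:
            f = (50.0 - e0) / (e1 - e0)
            log_ec50 = math.log10(c0) + f * (math.log10(c1) - math.log10(c0))
            return 10 ** log_ec50
    raise ValueError("50 % effect not bracketed by the data")

# Invented dose-response data: mg/L triclosan vs % immobilised G. pulex.
concs   = [0.25, 0.5, 1.0, 2.0, 4.0]
effects = [5.0, 20.0, 50.0, 85.0, 98.0]

ec50 = ec50_interpolate(concs, effects)
```

    Shifting the whole effect curve right (as added DOC did in the study) raises the interpolated EC50, i.e. lowers apparent toxicity.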

    Critical Limits for Hg(II) in soils, derived from chronic toxicity data

    Published chronic toxicity data for Hg(II) added to soils were assembled and evaluated to produce a data set of 52 chronic endpoints: five each for plants and invertebrates, and 42 for microbes. With endpoints expressed in terms of added soil Hg(II) content, Critical Limits were derived from the 5th percentiles of species sensitivity distributions, giving values of 0.13 μg (g soil)-1 and 3.3 μg (g soil organic matter)-1. The latter value exceeds the currently recommended Critical Limit of 0.5 μg (g soil organic matter)-1 used to determine Hg(II) Critical Loads in Europe. We also applied the WHAM/Model VI chemical speciation model to estimate concentrations of Hg2+ in soil solution, and derived an approximate Critical Limit Function (CLF) that includes pH: log [Hg2+]crit = -2.15 pH - 17.10. Because they take soil properties into account, the soil organic matter-based limit and the CLF provide the best assessment of toxic threat for different soils. Across representative soils, each predicts up to a 100-fold range in the dry weight-based mercury content corresponding to the Critical Limit.
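    The two quantitative pieces here are the CLF stated in the abstract and the 5th percentile of a species sensitivity distribution. A minimal sketch, taking the CLF directly from the text and assuming log-normally distributed endpoints (the example endpoint values are invented):

```python
import math

def clf_log_hg_crit(pH):
    # Critical Limit Function quoted in the abstract:
    # log [Hg2+]crit = -2.15 pH - 17.10
    return -2.15 * pH - 17.10

def hc5_lognormal(endpoints):
    """5th percentile (HC5) of a species sensitivity distribution,
    assuming log-normal endpoints (z for the 5th percentile ~ -1.645)."""
    logs = [math.log10(e) for e in endpoints]
    n = len(logs)
    mean = sum(logs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in logs) / (n - 1))
    return 10 ** (mean - 1.645 * sd)

# Invented endpoint values, μg (g soil)-1, for illustration only.
endpoints = [0.5, 1.0, 2.0, 4.0, 8.0]
hc5 = hc5_lognormal(endpoints)
```

    By construction the HC5 falls below the most sensitive tested endpoint, which is why the 5th percentile is used as a protective Critical Limit.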

    Complex event processing for monitoring business processes

    With the ever-increasing relevance of business processes across the most diverse domains, the applications implemented with them are growing steadily more complex, and with them the need to control and monitor such complex applications. Existing applications are usually compatible with only one application server, or at best a few. In this thesis we develop a monitoring system for BPEL-based business processes that can easily be integrated with existing application servers and that presents both technical information for debugging and abstracted business information in the form of so-called "views". In addition, the system can check compliance with constraints that must hold at runtime and send a notification whenever a constraint is violated. To make adaptation to existing systems as simple as possible, and to react quickly and cheaply to changes in the system, we chose a model-driven development approach [5]. While we use Apache Tomcat as the runtime environment, our solution can also be deployed on other Java-based application servers such as JBoss, so that it can be integrated easily into an existing system architecture.
    With business processes becoming more and more popular and their applications growing more complex, the need to monitor and control enterprise business process systems grows. Existing solutions do not support model-based compliance control [16] and often have limited event-abstraction capabilities. In this thesis we develop and implement a monitoring system for BPEL processes, based on ideas from Holmes et al. [15, 14], that can be used for monitoring business processes, compliance-constraint monitoring, and debugging. It provides different user interfaces for different users, such as high-level views for non-technical personnel and debug views for software engineers. By using model-driven engineering [5] and complex event processing, we are able to build a monitoring system that can easily be adapted to a variety of business process engines. From our event meta-model we generate a system split into three independent components, which can be distributed across different servers and are connected via message queues. The first component (event-listener) collects events from the event sources; the second component (event-processor) extracts business-relevant information, such as constraint violations, from the event streams generated by the event-listener; the third component (monitoring application) provides a user interface for reviewing the data extracted by the event-processor.
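    The three-component, queue-decoupled split described in this abstract can be sketched in miniature. The component names follow the abstract; the event fields, the duration-based constraint, and the sequential execution are invented simplifications of the thesis's actual system:

```python
import queue

# Sketch of the three-component split: an event-listener collects raw
# engine events, an event-processor derives business-relevant findings
# (here, a duration-based compliance-constraint violation), and a
# monitoring application presents them. Queues decouple the stages.

raw_events = queue.Queue()   # event-listener  -> event-processor
findings = queue.Queue()     # event-processor -> monitoring application

def event_listener(engine_log):
    """Collect events from the event source and forward them."""
    for event in engine_log:
        raw_events.put(event)
    raw_events.put(None)  # end-of-stream marker

def event_processor(max_duration_ms):
    """Flag a violation when an activity exceeds its allowed duration."""
    while True:
        event = raw_events.get()
        if event is None:
            findings.put(None)
            break
        if event["duration_ms"] > max_duration_ms:
            findings.put(f"violation: {event['activity']} "
                         f"took {event['duration_ms']} ms")

def monitoring_app():
    """Drain the findings queue into a report for the user interface."""
    report = []
    while True:
        finding = findings.get()
        if finding is None:
            return report
        report.append(finding)

# Invented BPEL activity events for illustration.
log = [
    {"activity": "checkCredit", "duration_ms": 120},
    {"activity": "shipOrder", "duration_ms": 900},
]
event_listener(log)
event_processor(max_duration_ms=500)
violations = monitoring_app()
```

    In the real system the stages run on separate servers with message queues between them; running them sequentially here keeps the sketch self-contained.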