
    Improving selection of markers in nutrition research: evaluation of the criteria proposed by the ILSI Europe Marker Validation Initiative

    The conduct of high-quality nutrition research requires the selection of appropriate markers as outcomes, for example as indicators of food or nutrient intake, nutritional status, health status or disease risk. Such selection requires detailed knowledge of the markers, and consideration of the factors that may influence their measurement, other than the effects of nutritional change. A framework to guide selection of markers within nutrition research studies would be a valuable tool for researchers. A multidisciplinary Expert Group set out to test criteria designed to aid the evaluation of candidate markers for their usefulness in nutrition research and subsequently to develop a scoring system for markers. The proposed criteria were tested using thirteen markers selected from a broad range of nutrition research fields. The result of this testing was a modified list of criteria and a template for evaluating a potential marker against the criteria. Subsequently, a semi-quantitative system for scoring a marker and an associated template were developed. This system will enable the evaluation and comparison of different candidate markers within the same field of nutrition research in order to identify their relative usefulness. The ranking criteria of proven, strong, medium or low are likely to vary according to research setting, research field and the type of tool used to assess the marker, and therefore the considerations for scoring need to be determined in a setting-, field- and tool-specific manner. A database of such markers, their interpretation and range of possible values would be valuable to nutrition researchers.
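    The abstract does not give the scoring details, but a purely hypothetical sketch of what a semi-quantitative marker-scoring template could look like is shown below; the criteria, ratings and rank cut-offs are invented for illustration and are not the Expert Group's published system.

```python
# Hypothetical illustration of a semi-quantitative marker-scoring template.
# Criteria, ratings (0-3) and rank cut-offs are invented, not the published system.

CRITERIA = [
    "biological plausibility",
    "analytical performance of the assay",
    "responsiveness to nutritional change",
    "robustness to non-nutritional influences",
]

# Rank cut-offs on the total score (max 12 with four criteria rated 0-3).
RANKS = [(10, "proven"), (7, "strong"), (4, "medium"), (0, "low")]

def rank_marker(scores):
    """scores: dict mapping criterion -> rating (0-3); returns (total, rank)."""
    total = sum(scores[c] for c in CRITERIA)
    for cutoff, label in RANKS:
        if total >= cutoff:
            return total, label

total, rank = rank_marker({
    "biological plausibility": 3,
    "analytical performance of the assay": 2,
    "responsiveness to nutritional change": 2,
    "robustness to non-nutritional influences": 1,
})
print(f"Total score {total}: ranked '{rank}' for this research setting")
```

    As the abstract notes, such cut-offs would need to be set in a setting-, field- and tool-specific manner rather than fixed globally.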

    Paracetamol metabolism, hepatotoxicity, biomarkers and therapeutic interventions: a perspective

    After over 60 years of therapeutic use in the UK, paracetamol (acetaminophen, N-acetyl-p-aminophenol, APAP) remains the subject of considerable research into both its mode of action and toxicity. The pharmacological properties of APAP are the focus of some activity, with the role of the metabolite N-arachidonoylaminophenol (AM404) still a topic of debate. However, it is now beyond dispute that the hepatotoxicity of APAP results from the production of the reactive metabolite N-acetyl-p-benzoquinoneimine (NAPQI/NABQI), which can deplete glutathione, react with cellular macromolecules, and initiate cell death. The disruption of cellular pathways that results from the production of NAPQI provides a source of potential biomarkers of the severity of the damage. Research in this area has provided new diagnostic markers such as the microRNA miR-122 as well as mechanistic biomarkers associated with apoptosis, mitochondrial dysfunction, inflammation and tissue regeneration. Additionally, biomarkers of, and systems biology models for, glutathione depletion have been developed. Furthermore, there have been significant advances in determining the role of both the innate immune system and genetic factors that might predispose individuals to APAP-mediated toxicity. This perspective highlights some of the progress in current APAP-related research.

    Discovering drug–drug interactions: a text-mining and reasoning approach based on properties of drug metabolism

    Motivation: Identifying drug–drug interactions (DDIs) is a critical process in drug administration and drug development. Clinical support tools often provide comprehensive lists of DDIs, but they usually lack the supporting scientific evidence, and different tools can return inconsistent results. In this article, we propose a novel approach that integrates text mining and automated reasoning to derive DDIs. By extracting various facts of drug metabolism, the approach captures not only the DDIs that are explicitly mentioned in text but also potential interactions that can be inferred by reasoning.
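    As a rough illustration of the kind of reasoning described (not the paper's actual system), the sketch below flags a potential DDI when one drug is reported to inhibit an enzyme that metabolizes another; the drug and enzyme facts are hypothetical stand-ins for what text mining might extract.

```python
# Illustrative sketch: infer potential DDIs from simple drug-metabolism facts
# using the rule "if drug A inhibits an enzyme that metabolizes drug B, flag A-B".
# Facts below are hypothetical placeholders for text-mined assertions.

metabolized_by = {
    "simvastatin": {"CYP3A4"},
    "warfarin": {"CYP2C9"},
}
inhibits = {
    "ketoconazole": {"CYP3A4"},
    "fluconazole": {"CYP2C9"},
}

def infer_ddis(metabolized_by, inhibits):
    """Return (inhibitor, substrate, enzyme) triples implied by the facts."""
    ddis = []
    for inhibitor, inhibited_enzymes in inhibits.items():
        for substrate, metabolic_routes in metabolized_by.items():
            for enzyme in inhibited_enzymes & metabolic_routes:
                ddis.append((inhibitor, substrate, enzyme))
    return ddis

for inhibitor, substrate, enzyme in infer_ddis(metabolized_by, inhibits):
    print(f"Potential DDI: {inhibitor} may raise {substrate} exposure via {enzyme} inhibition")
```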

    Thresholds of Toxicological Concern for Cosmetics-Related Substances: New Database, Thresholds, and Enrichment of Chemical Space

    A new dataset of cosmetics-related chemicals for the Threshold of Toxicological Concern (TTC) approach has been compiled, comprising 552 chemicals with 219, 40, and 293 chemicals in Cramer Classes I, II, and III, respectively. Data were integrated and curated to create a database of No-/Lowest-Observed-Adverse-Effect Level (NOAEL/LOAEL) values, from which the final COSMOS TTC dataset was developed. Criteria for study inclusion and NOAEL decisions were defined, and rigorous quality control was performed for study details and assignment of Cramer classes. From the final COSMOS TTC dataset, human exposure thresholds of 42 and 7.9 μg/kg-bw/day were derived for Cramer Classes I and III, respectively. The size of Cramer Class II was insufficient for derivation of a TTC value. The COSMOS TTC dataset was then federated with the dataset of Munro and colleagues, previously published in 1996, after updating the latter using the quality control processes for this project. This federated dataset expands the chemical space and provides more robust thresholds. The 966 substances in the federated database comprise 245, 49 and 672 chemicals in Cramer Classes I, II and III, respectively. The corresponding TTC values of 46, 6.2 and 2.3 μg/kg-bw/day are broadly similar to those of the original Munro dataset.
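    A minimal sketch of how a Cramer-class TTC value is typically derived, assuming the standard Munro-style approach of taking the 5th percentile of the class's NOAEL distribution and applying a 100-fold uncertainty factor (the NOAEL values below are invented for illustration, not from the COSMOS dataset):

```python
# Illustrative Munro-style TTC derivation (a sketch, not the COSMOS workflow):
# 5th percentile of a Cramer-class NOAEL distribution divided by a 100-fold
# uncertainty factor, reported in ug/kg-bw/day.
import numpy as np

def ttc_from_noaels(noaels_mg_per_kg_bw_day, uncertainty_factor=100.0):
    """Return a TTC value in ug/kg-bw/day from a set of NOAELs (mg/kg-bw/day)."""
    p5 = np.percentile(noaels_mg_per_kg_bw_day, 5)   # 5th percentile NOAEL
    return p5 / uncertainty_factor * 1000.0          # convert mg to ug

# Hypothetical NOAELs for a Cramer Class I-like set of chemicals.
example_noaels = [5.0, 12.0, 40.0, 75.0, 150.0, 300.0, 500.0, 1000.0]
print(f"Illustrative TTC: {ttc_from_noaels(example_noaels):.1f} ug/kg-bw/day")
```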

    Application of the TTC concept to unknown substances found in analysis of foods

    Unknown substances, not previously observed, are frequently detected in foods by quality control laboratories. In many cases, the assessment of these 'new' substances requires additional chemical analysis for their identification prior to assessing risk. This identification procedure can be time-consuming, expensive and in some instances difficult. Furthermore, in many cases, no toxicological information will be available for the substance. Therefore, there is a need to develop pragmatic tools for the assessment of the potential toxicity of substances with unknown identity to avoid delays in their risk assessment. Hence, the 'ILSI Europe expert group on the application of the threshold of toxicological concern (TTC) to unexpected peaks found in food' was established to explore whether the TTC concept may enable a more pragmatic risk assessment of unknown substances that were not previously detected in food. A step-wise approach is introduced that uses expert judgement on the source of the food, information on the analytical techniques, the dietary consumption of food sources containing the unknown substance and quantitative information on the unknown substance to assess the safety to the consumer using the TTC. By following this step-wise approach, it may be possible to apply a TTC threshold of 90 µg/day for an unknown substance in food.
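    A minimal sketch of the final exposure-comparison step in such a screen, assuming the intake is estimated from the measured concentration and the consumption of the food source containing the unknown substance; all values below are hypothetical.

```python
# Illustrative final step of a TTC-based screen of an unknown substance in food
# (a sketch of the comparison only; all inputs are hypothetical).

TTC_UNKNOWN_UG_PER_DAY = 90.0   # threshold cited for the step-wise approach

def estimated_intake_ug_per_day(concentration_mg_per_kg, food_consumption_kg_per_day):
    """Estimated daily intake of the unknown substance in ug/day."""
    return concentration_mg_per_kg * food_consumption_kg_per_day * 1000.0

intake = estimated_intake_ug_per_day(concentration_mg_per_kg=0.05,
                                     food_consumption_kg_per_day=0.5)
if intake <= TTC_UNKNOWN_UG_PER_DAY:
    print(f"Intake {intake:.1f} ug/day is below the 90 ug/day TTC; low priority for follow-up")
else:
    print(f"Intake {intake:.1f} ug/day exceeds the 90 ug/day TTC; identification and risk assessment needed")
```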

    Chemical carcinogenicity revisited 3: Risk assessment of carcinogenic potential based on the current state of knowledge of carcinogenesis in humans

    Over the past 50 years, we have learned a great deal about the biology that underpins cancer, but our approach to testing chemicals for carcinogenic potential has not kept up. Only a small number of chemicals have been tested in animal-intensive, time-consuming, and expensive long-term bioassays in rodents. We now recommend a transition from the bioassay to a decision-tree matrix that can be applied to a broader range of chemicals, with better predictivity, based on the premise that cancer is the consequence of DNA coding errors that arise either directly from mutagenic events or indirectly from sustained cell proliferation. The first step is in silico and in vitro assessment for mutagenic (DNA reactive) activity. If mutagenic, the chemical is assumed to be carcinogenic unless evidence indicates otherwise. If the chemical does not show mutagenic potential, the next step is assessment of potential human exposure compared to the threshold of toxicological concern (TTC). If potential human exposure exceeds the TTC, then testing is done to look for effects associated with the key characteristics that are precursors to the carcinogenic process, such as increased cell proliferation, immunosuppression, or significant estrogenic activity. Protection of human health is achieved by limiting exposures to below NOAELs for these precursor effects. The decision-tree matrix is animal-sparing, cost-effective, and in step with our growing knowledge of the process of cancer formation.
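    As a rough sketch, the decision logic described in the abstract could be encoded as follows; this is an illustration rather than the authors' published matrix, and the input flags and values are hypothetical results from upstream testing.

```python
# Illustrative encoding of the decision-tree logic summarized in the abstract
# (a sketch, not the published matrix). Inputs are hypothetical test outcomes.

def assess_carcinogenic_concern(is_mutagenic, exposure_ug_per_kg_day,
                                ttc_ug_per_kg_day, precursor_noael_ug_per_kg_day=None):
    # Step 1: in silico / in vitro assessment for mutagenic (DNA-reactive) activity.
    if is_mutagenic:
        return "Assume carcinogenic unless further evidence indicates otherwise"
    # Step 2: compare potential human exposure with the TTC.
    if exposure_ug_per_kg_day <= ttc_ug_per_kg_day:
        return "Exposure below TTC: no further carcinogenicity testing triggered"
    # Step 3: look for precursor key-characteristic effects (e.g. sustained cell
    # proliferation, immunosuppression, significant estrogenic activity).
    if precursor_noael_ug_per_kg_day is None:
        return "Exposure above TTC: test for precursor key-characteristic effects"
    if exposure_ug_per_kg_day < precursor_noael_ug_per_kg_day:
        return "Exposure below precursor-effect NOAEL: human health protected"
    return "Limit exposure to below the NOAEL for precursor effects"

print(assess_carcinogenic_concern(False, 5.0, 2.5, precursor_noael_ug_per_kg_day=50.0))
```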

    Chemical carcinogenicity revisited 1: A unified theory of carcinogenicity based on contemporary knowledge

    Developments in the understanding of the etiology of cancer have profound implications for the way the carcinogenicity of chemicals is addressed. This paper proposes a unified theory of carcinogenesis that will illuminate better ways to evaluate and regulate chemicals. In the last four decades, we have come to understand that for a cell and a group of cells to begin the process of unrestrained growth that is defined as cancer, there must be changes in DNA that reprogram the cell from normal to abnormal. Cancer is the consequence of DNA coding errors that arise either directly from mutagenic events or indirectly from cell proliferation, especially if sustained. Chemicals that act via direct interaction with DNA can induce cancer because they cause mutations that can be carried forward in dividing cells. Chemicals that act via non-genotoxic mechanisms must be dosed to maintain a proliferative environment so that the steps toward neoplasia have time to occur. Chemicals that induce increased cellular proliferation can be divided into two categories: those that act via a cellular receptor to induce cellular proliferation, and those that act via non-specific mechanisms such as cytotoxicity. This knowledge has implications for testing chemicals for carcinogenic potential and for risk management.

    E2F1-mediated FOS induction in arsenic trioxide-induced cellular transformation: effects of global H3K9 hypoacetylation and promoter-specific hyperacetylation in vitro.

    BACKGROUND: Aberrant histone acetylation has been observed in carcinogenesis and cellular transformation associated with arsenic exposure; however, the molecular mechanisms and cellular outcomes of such changes are poorly understood. OBJECTIVE: We investigated the impact of tolerated and toxic arsenic trioxide (As2O3) exposure in human embryonic kidney (HEK293T) and urothelial (UROtsa) cells to characterize the alterations in histone acetylation and gene expression as well as the implications for cellular transformation. METHODS: Tolerated and toxic exposures of As2O3 were identified by measurement of cell death, mitochondrial function, cellular proliferation, and anchorage-independent growth. Histone extraction, the MNase sensitivity assay, and immunoblotting were used to assess global histone acetylation levels, and gene promoter-specific interactions were measured by chromatin immunoprecipitation followed by reverse-transcriptase polymerase chain reaction. RESULTS: Tolerated and toxic dosages, respectively, were defined as 0.5 μM and 2.5 μM As2O3 in HEK293T cells and 1 μM and 5 μM As2O3 in UROtsa cells. Global hypoacetylation of H3K9 at 72 hr was observed in UROtsa cells following tolerated and toxic exposure. In both cell lines, tolerated exposure alone led to H3K9 hyperacetylation and E2F1 binding at the FOS promoter, which remained elevated after 72 hr, contrary to global H3K9 hypoacetylation. Thus, promoter-specific H3K9 acetylation is a better predictor of cellular transformation than are global histone acetylation patterns. Tolerated exposure resulted in an increased expression of the proto-oncogenes FOS and JUN in both cell lines at 72 hr. CONCLUSION: Global H3K9 hypoacetylation and promoter-specific hyperacetylation facilitate E2F1-mediated FOS induction in As2O3-induced cellular transformation.

    Application of Key Events Analysis to Chemical Carcinogens and Noncarcinogens

    The existence of thresholds for toxicants is a matter of debate in chemical risk assessment and regulation. Current risk assessment methods are based on the assumption that, in the absence of sufficient data, carcinogenesis does not have a threshold, while noncarcinogenic endpoints are assumed to be thresholded. Advances in our fundamental understanding of the events that underlie toxicity are providing opportunities to address these assumptions about thresholds. A key events dose-response analytic framework was used to evaluate three aspects of toxicity. The first section illustrates how a fundamental understanding of the mode of action for the hepatic toxicity and the hepatocarcinogenicity of chloroform in rodents can replace the assumption of low-dose linearity. The second section describes how advances in our understanding of the molecular aspects of carcinogenesis allow us to consider the critical steps in genotoxic carcinogenesis in a key events framework. The third section deals with the case of endocrine disrupters, where the most significant question regarding thresholds is the possible additivity to an endogenous background of hormonal activity. Each of the examples suggests that current assumptions about thresholds can be refined. Understanding inter-individual variability in the events involved in toxicological effects may enable true population thresholds to be identified.