90 research outputs found

    Scientific Opinion on Exploring options for providing advice about possible human health risks based on the concept of Threshold of Toxicological Concern (TTC)

    Get PDF
    Synthetic and naturally occurring substances present in food and feed, together with their possible breakdown or reaction products, represent a large number of substances, many of which require risk assessment. EFSA’s Scientific Committee was requested to evaluate the threshold of toxicological concern (TTC) approach as a tool for providing scientific advice about possible human health risks from low-level exposures, to assess its applicability to EFSA’s work, and to advise on any additional data that might be needed to strengthen the underlying basis of the TTC approach. The Scientific Committee examined the published literature on the TTC approach, undertook its own analyses and commissioned an in silico investigation of the databases underpinning the TTC approach. The Scientific Committee concluded that the TTC approach can be recommended as a useful screening tool, either for priority setting or for deciding whether exposure to a substance is so low that the probability of adverse health effects is low and no further data are necessary. The following human exposure threshold values are sufficiently conservative to be used in EFSA’s work: 0.15 μg/person per day for substances with a structural alert for genotoxicity, 18 μg/person per day for organophosphate and carbamate substances with anti-cholinesterase activity, 90 μg/person per day for Cramer Class III and Cramer Class II substances, and 1800 μg/person per day for Cramer Class I substances. For application to all groups in the population, these values should be expressed in terms of body weight, i.e. 0.0025, 0.3, 1.5 and 30 μg/kg body weight per day, respectively. Use of the TTC approach for infants under the age of 6 months, with immature metabolic and excretory systems, should be considered on a case-by-case basis. The Committee defined a number of exclusion categories of substances for which the TTC approach would not be used.
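    The threshold comparison at the core of the TTC screen is simple enough to show directly. The following minimal Python sketch encodes the body-weight-based values quoted above; the class labels and the function name are illustrative choices, not part of the EFSA opinion, and the Committee's exclusion categories would need to be checked separately before applying the screen.

```python
# Illustrative sketch of a TTC screening check, using the threshold values
# quoted in the abstract (μg/kg body weight per day). Class labels and the
# function are hypothetical, not taken from the EFSA opinion.
TTC_THRESHOLDS_UG_PER_KG_BW_DAY = {
    "genotoxicity_alert": 0.0025,
    "organophosphate_carbamate": 0.3,
    "cramer_class_III": 1.5,
    "cramer_class_II": 1.5,
    "cramer_class_I": 30.0,
}

def below_ttc(exposure_ug_per_kg_bw_day: float, substance_class: str) -> bool:
    """Return True if estimated exposure falls below the TTC for the class.

    Substances in the Committee's exclusion categories must be screened out
    before this comparison is applied.
    """
    return exposure_ug_per_kg_bw_day < TTC_THRESHOLDS_UG_PER_KG_BW_DAY[substance_class]

# Example: 0.5 μg/kg bw per day for a Cramer Class I substance
print(below_ttc(0.5, "cramer_class_I"))  # True -> low probability of adverse effects
```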

    Small Area Estimation of Latent Economic Well-being

    Get PDF
    Small area estimation (SAE) plays a crucial role in the social sciences due to the growing need for reliable and accurate estimates for small domains. In the study of well-being, for example, policy makers need detailed information about the geographical distribution of a range of social indicators. We investigate data dimensionality reduction using factor analysis models and implement SAE on the factor scores under the empirical best linear unbiased prediction approach. We contrast this approach with the standard approach of providing a dashboard of indicators or a weighted average of indicators at the local level. We demonstrate the approach in a simulation study and a real data application based on the European Union Statistics on Income and Living Conditions for the municipalities of Tuscany.
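    A hedged sketch of the two-step approach described above, on synthetic data: (1) reduce a set of unit-level well-being indicators to factor scores, (2) shrink area-level direct estimates of the scores with a Fay-Herriot-type EBLUP. The intercept-only synthetic part and the moment estimator of the area-effect variance are simplifying assumptions for illustration, not the authors' exact specification.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_units, n_indicators, n_areas = 2000, 6, 40
X = rng.normal(size=(n_units, n_indicators))    # unit-level indicators
areas = rng.integers(0, n_areas, size=n_units)  # area membership

# Step 1: one latent well-being factor per unit
scores = FactorAnalysis(n_components=1).fit_transform(X).ravel()

# Direct area-level estimates and their sampling variances
direct = np.array([scores[areas == a].mean() for a in range(n_areas)])
psi = np.array([scores[areas == a].var(ddof=1) / (areas == a).sum()
                for a in range(n_areas)])

# Step 2: Fay-Herriot-type EBLUP with an intercept-only synthetic part
beta = direct.mean()                                        # fixed effect
sigma_u2 = max(np.var(direct, ddof=1) - psi.mean(), 0.0)    # moment estimate
gamma = sigma_u2 / (sigma_u2 + psi)                         # shrinkage weights
eblup = gamma * direct + (1 - gamma) * beta                 # small-area estimates
```

    The shrinkage weight gamma pulls unreliable direct estimates (large sampling variance) toward the synthetic estimate, which is the usual motivation for EBLUP in small domains.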

    Mapping the Future: Policy Applications of Climate Vulnerability Mapping in West Africa

    Get PDF
    We describe the development of climate vulnerability maps for three Sahelian countries – Mali, Burkina Faso, and Niger – and for coastal West Africa, with a focus on the way the maps were designed to meet decision-making needs and their ultimate influence and use in policy contexts. The paper provides a review of the literature on indicators and maps in the science-policy interface. We then assess the credibility, salience, and legitimacy of the maps as tools for decision-making. Results suggest that vulnerability maps are a useful boundary object for generating discussions among stakeholders with different objectives and technical backgrounds, and that they can provide useful input for targeting development assistance. We conclude with a discussion of the power of maps to capture policy-maker attention, and how this increases the onus on map developers to clearly communicate uncertainties and limitations. The assessment of policy uptake in this paper is admittedly subjective; the article includes a discussion of ways to conduct more objective and rigorous assessments of policy impact so as to better evaluate the value and use of vulnerability mapping in decision-making processes.

    In silico toxicology protocols

    Get PDF
    The present publication surveys several applications of in silico (i.e., computational) toxicology approaches across different industries and institutions. It highlights the need to develop standardized protocols when conducting toxicity-related predictions. This contribution articulates the information needed for protocols to support in silico predictions for major toxicological endpoints of concern (e.g., genetic toxicity, carcinogenicity, acute toxicity, reproductive toxicity, developmental toxicity) across several industries and regulatory bodies. Such novel in silico toxicology (IST) protocols, when fully developed and implemented, will ensure in silico toxicological assessments are performed and evaluated in a consistent, reproducible, and well-documented manner across industries and regulatory bodies to support wider uptake and acceptance of the approaches. The development of IST protocols is an initiative of an international consortium, undertaken to reflect the state of the art in in silico toxicology for hazard identification and characterization. A general outline for the development of such protocols is included; it is based on in silico predictions and/or available experimental data for a defined series of relevant toxicological effects or mechanisms. The publication presents a novel approach for determining the reliability of in silico predictions alongside experimental data. In addition, we discuss how to determine the level of confidence in the assessment based on the relevance and reliability of the information.
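    To make the relevance-and-reliability idea concrete, here is an illustrative sketch only: a minimal weighted scheme for combining in silico predictions and experimental results for a single endpoint, where each piece of evidence carries reliability and relevance scores in [0, 1]. The weighting rule, field names, and scores are assumptions for illustration; the IST protocols define these elements in far more detail.

```python
from dataclasses import dataclass

@dataclass
class Evidence:
    source: str          # e.g. "QSAR model", "Ames test"
    call: int            # +1 positive, -1 negative
    reliability: float   # 0 (unreliable) .. 1 (fully reliable)
    relevance: float     # 0 (irrelevant) .. 1 (directly relevant)

def combined_call(evidence: list[Evidence]) -> tuple[float, float]:
    """Return a weighted consensus call and an overall confidence score."""
    weights = [e.reliability * e.relevance for e in evidence]
    total = sum(weights)
    consensus = sum(w * e.call for w, e in zip(weights, evidence)) / total
    confidence = total / len(evidence)   # average evidential weight
    return consensus, confidence

result = combined_call([
    Evidence("QSAR model A", +1, 0.7, 0.9),
    Evidence("QSAR model B", -1, 0.6, 0.9),
    Evidence("Ames test", -1, 0.95, 1.0),
])
print(result)  # negative-leaning consensus with moderate confidence
```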

    Frameworks and tools for risk assessment of manufactured nanomaterials

    Get PDF
    Commercialization of nanotechnologies entails a regulatory requirement for understanding their environmental, health and safety (EHS) risks. Today we face challenges to assess these risks, which emerge from uncertainties around the interactions of manufactured nanomaterials (MNs) with humans and the environment. In order to reduce these uncertainties, it is necessary to generate sound scientific data on hazard and exposure by means of relevant frameworks and tools. The development of such approaches to facilitate the risk assessment (RA) of MNs has become a dynamic area of research. The aim of this paper was to review and critically analyse these approaches against a set of relevant criteria. The analysis concluded that none of the reviewed frameworks were able to fulfill all evaluation criteria. Many of the existing modelling tools are designed to provide screening-level assessments rather than to support regulatory RA and risk management. Nevertheless, there is a tendency towards developing more quantitative, higher-tier models, capable of incorporating uncertainty into their analyses. There is also a trend towards developing validated experimental protocols for material identification and hazard testing, reproducible across laboratories. These tools could enable a shift from a costly case-by-case RA of MNs towards a targeted, flexible and efficient process, based on grouping and read-across strategies and compliant with the 3R (Replacement, Reduction, Refinement) principles. In order to facilitate this process, it is important to transform the current efforts on developing databases and computational models into creating an integrated data and tools infrastructure to support the risk assessment and management of MNs.

    A research agenda for improving national Ecological Footprint accounts

    Full text link