
    Water Policies: Regions with Open-Pit Lignite Mining (Introduction to the IIASA Study)

    There is an apparent need for the analysis of long-term regional water policies to reconcile conflicting interests in regions with open-pit lignite mining. The most important interest groups in such regions are mining, municipal and industrial water supply, agriculture, as well as the "environment". A scientifically sound and practically simple policy-oriented system of methods and computerized procedures has to be developed. Developing such a system is part of the research work in the Regional Water Policies project carried out at the International Institute for Applied Systems Analysis (IIASA) in collaboration with research institutes in the German Democratic Republic, Poland, and other countries. A test area has been chosen that includes typical water-related elements of mining regions as well as significant conflicts and interest groups. The first stage in the analysis is oriented towards developing a scenario-generating system as a tool to choose "good" policies from the regional point of view. Therefore a policy-oriented interactive decision support model system is under development, accounting for the dynamic, nonlinear and uncertain behaviour of the system. It combines a model for multi-criteria analysis in planning periods with a simulation model for monthly system behaviour. The paper outlines the methodological approach, describes the test region in the GDR, and presents the submodels for the test region.
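    The two-level structure mentioned above (a multi-criteria model for planning periods driven by a monthly simulation of system behaviour) can be illustrated with a minimal Python sketch; all names, water-balance terms and numbers below are illustrative assumptions, not the IIASA model itself.

```python
# Minimal sketch of a two-level scenario-generating system: a planning-period
# multi-criteria screening step driven by a monthly simulation of system
# behaviour. All names and numbers are illustrative assumptions only.

def simulate_month(state, policy):
    """Advance a (hypothetical) regional water balance by one month."""
    pumping = policy["mine_drainage"]
    supply = min(state["storage"] + pumping, policy["municipal_demand"])
    new_storage = max(state["storage"] + pumping - supply - policy["agri_allocation"], 0.0)
    return {"storage": new_storage, "supply_deficit": policy["municipal_demand"] - supply}

def evaluate_policy(policy, months=12):
    """Aggregate monthly behaviour into planning-period criteria."""
    state = {"storage": 100.0}
    deficit = 0.0
    for _ in range(months):
        result = simulate_month(state, policy)
        state["storage"] = result["storage"]
        deficit += result["supply_deficit"]
    # Criteria to trade off: supply reliability vs. drainage effort.
    return {"total_deficit": deficit, "drainage_cost": policy["mine_drainage"] * months}

candidate_policies = [
    {"mine_drainage": 5.0, "municipal_demand": 8.0, "agri_allocation": 3.0},
    {"mine_drainage": 7.0, "municipal_demand": 8.0, "agri_allocation": 3.0},
]
scores = {i: evaluate_policy(p) for i, p in enumerate(candidate_policies)}
print(scores)
```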

    Development of Simplified Models of Regional Groundwater and Surface Water Flow Processes based on Computational Experiments with Comprehensive Models

    The development of complex decision support model systems for the analysis of regional water policies is of increasing importance for regions with intense socio-economic development that affects, and is affected by, the water resources system. One of the most illustrative examples is regions with open-pit lignite mining. Such model systems have to be based on appropriate submodels, e.g. for water quantity processes. The paper describes submodels for groundwater and surface water flow with special regard to open-pit lignite mining regions. Starting from a problem definition, Section 2 gives the methodological background: the state of the art of comprehensive models of regional water flow processes based on groundwater flow models, and of stochastic long-term management modeling, is described in detail. Section 3 gives the methodological approach for model reduction. The application of this approach is illustrated in Section 4 for the modeling of mine drainage and groundwater tables, the management of remaining pits, and groundwater-surface water interactions. The appendix contains computer programs for some of the submodels that are suitable for more general application.
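    The model-reduction idea described above can be illustrated with a small sketch: generate input/output pairs by computational experiments with a comprehensive model (replaced here by synthetic numbers) and fit a simple response function that can stand in for it inside a management model. The linear steady-state form and the data are assumptions for illustration, not the paper's submodels.

```python
# Sketch of model reduction: fit a simple response model to input/output
# pairs that would come from runs of a comprehensive groundwater flow model.
# All numbers are synthetic, for illustration only.
import numpy as np

# Hypothetical experiment: mine drainage pumping rate (input) vs. drawdown
# of the groundwater table at an observation point (output).
pumping = np.array([2.0, 4.0, 6.0, 8.0, 10.0])   # m^3/s
drawdown = np.array([1.1, 2.0, 3.2, 4.1, 5.0])   # m (synthetic)

# Reduced model: drawdown ~ a * pumping + b (steady-state response).
a, b = np.polyfit(pumping, drawdown, 1)
print(f"reduced model: drawdown ~ {a:.2f} * pumping + {b:.2f}")

# The reduced model can replace the comprehensive model inside a
# management/policy model where many thousands of evaluations are needed.
predicted = a * 7.0 + b
print(f"predicted drawdown at 7 m^3/s pumping: {predicted:.2f} m")
```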

    A Review of Sensor-Based Sorting in Mineral Processing: The Potential Benefits of Sensor Fusion

    Published: 27 October 2022
    Sensor-based sorting techniques offer the potential to improve ore grades and reduce the amount of waste material processed. Previous studies show that sensor-based sorting can reduce energy, water and reagent consumption and fine waste production by discarding waste prior to further processing. In this literature review, recent investigations of sensor-based sorting and the fundamental mechanisms of the main sorting techniques are evaluated to inform optimal sensor selection. Additionally, the fusing of data from multiple sensing techniques to improve characterization of the sensed material, and hence sorting capability, is investigated. It was found that the key to effective implementation of sensor-based sorting is the selection of a sensing technique which can sense a characteristic capable of separating ore from waste, with a sampling distribution sufficient for the considered sorting method. Classes of potential sensor fusion sorting applications in mineral processing are proposed and illustrated with example cases. It was also determined that the main holdup for implementing sensor fusion is a lack of correlatable data on the response of multiple sensing techniques for the same ore sample. A combined approach of experimental testing supplemented by simulations is proposed to provide data to enable the evaluation and development of sensor fusion techniques.
    Dylan Peukert, Chaoshui Xu and Peter Dow
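    As a rough illustration of the fusion idea discussed in the review, the sketch below combines two hypothetical normalized sensor responses into a single accept/reject decision; the sensor types, weights and threshold are assumptions for illustration only, not values from the review.

```python
# Illustrative sensor-fusion rule for ore sorting: accept a particle only
# when the weighted combination of two sensor responses indicates ore.
# Weights, threshold and feature names are assumptions for illustration.

def fuse_and_classify(xrf_grade_estimate, nir_mineral_score,
                      w_xrf=0.7, w_nir=0.3, accept_threshold=0.5):
    """Weighted-score fusion of two normalized (0..1) sensor responses.

    Returns True to divert the particle to the ore stream.
    """
    fused = w_xrf * xrf_grade_estimate + w_nir * nir_mineral_score
    return fused >= accept_threshold

# A particle with an ambiguous XRF-type reading can still be accepted or
# rejected once the second sensor's information is fused in.
print(fuse_and_classify(0.45, 0.80))   # True  -> sort to ore
print(fuse_and_classify(0.45, 0.10))   # False -> sort to waste
```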

    Exploring the sensitivity of Northern Hemisphere atmospheric circulation to different surface temperature forcing using a statistical–dynamical atmospheric model

    Climate and weather conditions in the mid-latitudes are strongly driven by the large-scale atmospheric circulation. Observational data indicate that important components of the large-scale circulation have changed in recent decades, including the strength and the width of the Hadley cell, jets, storm tracks and planetary waves. Here, we use a new statistical–dynamical atmosphere model (SDAM) to test the individual sensitivities of the large-scale atmospheric circulation to changes in the zonal temperature gradient, meridional temperature gradient and global-mean temperature. We analyze the Northern Hemisphere Hadley circulation, jet streams, storm tracks and planetary waves by systematically altering the zonal temperature asymmetry, the meridional temperature gradient and the global-mean temperature. Our results show that the strength of the Hadley cell, storm tracks and jet streams depend, in terms of relative changes, almost linearly on both the global-mean temperature and the meridional temperature gradient, whereas the zonal temperature asymmetry has little or no influence. The magnitude of planetary waves is affected by all three temperature components, as expected from theoretical dynamical considerations. The width of the Hadley cell behaves nonlinearly with respect to all three temperature components in the SDAM. Moreover, some of these observed large-scale atmospheric changes are expected from dynamical equations and are therefore an important part of model validation.
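    A minimal sketch of the kind of temperature decomposition this experimental design implies is given below: a surface temperature field is split into a global mean, a zonally averaged meridional structure and a zonal asymmetry, and each piece can then be rescaled independently to build idealized forcing fields. The synthetic field and weighting are assumptions for illustration, not the SDAM configuration.

```python
# Decompose T(lat, lon) into global mean + meridional structure + zonal
# asymmetry, then rescale one component at a time to probe its effect.
# The synthetic field below is illustrative only.
import numpy as np

lat = np.linspace(-90, 90, 37)
lon = np.linspace(0, 357.5, 144)
LON, LAT = np.meshgrid(lon, lat)

# Synthetic field: equator-to-pole gradient plus a zonal wavenumber-2 anomaly.
T = 288.0 - 30.0 * np.sin(np.radians(LAT))**2 + 3.0 * np.cos(np.radians(2 * LON))

w = np.cos(np.radians(lat))                      # area weights
T_zonal_mean = T.mean(axis=1)                    # zonal mean as a function of latitude
T_global = np.average(T_zonal_mean, weights=w)   # global-mean temperature
T_meridional = T_zonal_mean - T_global           # meridional structure
T_asymmetry = T - T_zonal_mean[:, None]          # zonal asymmetry

# Example perturbation: 20 % stronger meridional gradient, other parts unchanged.
T_forcing = T_global + 1.2 * T_meridional[:, None] + 1.0 * T_asymmetry
print(T_forcing.shape)
```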

    Schumpeter: Theorist of the Avant-Garde

    This paper argues that Schumpeter's 1911 edition of 'Theory of Economic Development' can be fruitfully read as a theory of the avant-garde, in line with such theories developed by the artistic avant-garde around the same time, in particular by the Italian Futurists. In particular it will show that both Schumpeter and other avant-garde theorists sought to break with the past (1), identify an avant-garde who could force that break (2), find new ways to represent the dynamic world (3), embrace the new and dynamic (4), and promote a perpetual dynamic process instead of a specific end-state or utopia (5). This new reading helps us to understand the cultural meaning of this seminal text in economics. Secondly, it greatly facilitates our understanding of the differences with the later interwar German edition and English edition, which were more cautious in their embrace of the new, less focused on the individual qualities of the entrepreneur, and placed more emphasis on historical continuity. Thirdly, this reading suggests a different reason for the bifurcation between Schumpeter and the rest of the Austrian School of economics. Traditionally this split is explained by Schumpeter's affinities with the Lausanne School; this paper instead suggests that the crucial break between Schumpeter on the one hand and Böhm-Bawerk, Wieser and later members of the Austrian School on the other hand lies in their theory of, and attitude toward, social change.

    A numerical sensitivity study -The effectiveness of RFID-based ore tracking through a simulated coarse ore stockpile and the impacts of key process variables

    Available online 24 August 2023
    The ability to understand ore characteristics in real-time during mining processes is vital for ensuring product quality control. However, it is challenging to continuously track ore flow from the mine to the mill due to the blending of ore batches, especially within stockpiles. This paper presents a numerical study of copper ore tracking through a coarse ore stockpile. A discrete element model of a 3D stockpile was created using the EDEM software to evaluate the effectiveness of using RFID tags for ore tracking. To identify the primary variables and their effect on ore transport and tracking through the stockpile, a sensitivity study was conducted over a range of process variables, such as ore size distribution, ore size range, RFID tag size, wall friction, the trajectory of charging particles and stockpile charging methods. The results show that the stockpile model is not sensitive to variables such as the ore size distribution, ore size range and RFID tag size, while wall friction, stockpile feed belt speed, segregation in the ore flow region and the contact model have a significant effect on ore blending within the stockpile. It was found that the overall performance of RFID-based ore tracking through the stockpile is poor: for cases with only one or a few tags per ore batch, the order in which the tags are read did not provide a good representation of the ore distribution for most scenarios. This sensitivity study provides insights into new tracking strategies given the poor performance of RFID tracking shown by the simulation study.
    Juan Chen, Tien-Fu Lu, Dylan Peukert, Peter Dow
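    One way tracking performance of this kind could be quantified is sketched below: compare the order in which tagged particles are charged into the stockpile with the order in which they are read out on discharge. This is a generic post-processing sketch with synthetic data, not the EDEM workflow or the metric used in the paper.

```python
# Quantify how well discharge read order preserves charging order for a set
# of tagged particles. Synthetic data; illustrative metric only.
from itertools import combinations

def rank_agreement(charge_order, discharge_order):
    """Fraction of tag pairs whose relative order is preserved (1.0 = perfect tracking)."""
    position = {tag: i for i, tag in enumerate(discharge_order)}
    pairs = list(combinations(charge_order, 2))
    preserved = sum(1 for a, b in pairs if position[a] < position[b])
    return preserved / len(pairs)

# Hypothetical result for one simulated scenario: tags charged as 0..5 but
# read back in a heavily blended order after passing through the stockpile.
charged = [0, 1, 2, 3, 4, 5]
read_back = [2, 0, 4, 1, 5, 3]
print(f"order preservation: {rank_agreement(charged, read_back):.2f}")
```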

    Antiinflammatory Therapy with Canakinumab for Atherosclerotic Disease

    Background: Experimental and clinical data suggest that reducing inflammation without affecting lipid levels may reduce the risk of cardiovascular disease. Yet, the inflammatory hypothesis of atherothrombosis has remained unproved. Methods: We conducted a randomized, double-blind trial of canakinumab, a therapeutic monoclonal antibody targeting interleukin-1β, involving 10,061 patients with previous myocardial infarction and a high-sensitivity C-reactive protein level of 2 mg or more per liter. The trial compared three doses of canakinumab (50 mg, 150 mg, and 300 mg, administered subcutaneously every 3 months) with placebo. The primary efficacy end point was nonfatal myocardial infarction, nonfatal stroke, or cardiovascular death. Results: At 48 months, the median reduction from baseline in the high-sensitivity C-reactive protein level was 26 percentage points greater in the group that received the 50-mg dose of canakinumab, 37 percentage points greater in the 150-mg group, and 41 percentage points greater in the 300-mg group than in the placebo group. Canakinumab did not reduce lipid levels from baseline. At a median follow-up of 3.7 years, the incidence rate for the primary end point was 4.50 events per 100 person-years in the placebo group, 4.11 events per 100 person-years in the 50-mg group, 3.86 events per 100 person-years in the 150-mg group, and 3.90 events per 100 person-years in the 300-mg group. The hazard ratios as compared with placebo were as follows: in the 50-mg group, 0.93 (95% confidence interval [CI], 0.80 to 1.07; P = 0.30); in the 150-mg group, 0.85 (95% CI, 0.74 to 0.98; P = 0.021); and in the 300-mg group, 0.86 (95% CI, 0.75 to 0.99; P = 0.031). The 150-mg dose, but not the other doses, met the prespecified multiplicity-adjusted threshold for statistical significance for the primary end point and the secondary end point that additionally included hospitalization for unstable angina that led to urgent revascularization (hazard ratio vs. placebo, 0.83; 95% CI, 0.73 to 0.95; P = 0.005). Canakinumab was associated with a higher incidence of fatal infection than was placebo. There was no significant difference in all-cause mortality (hazard ratio for all canakinumab doses vs. placebo, 0.94; 95% CI, 0.83 to 1.06; P = 0.31). Conclusions: Antiinflammatory therapy targeting the interleukin-1β innate immunity pathway with canakinumab at a dose of 150 mg every 3 months led to a significantly lower rate of recurrent cardiovascular events than placebo, independent of lipid-level lowering. (Funded by Novartis; CANTOS ClinicalTrials.gov number, NCT01327846.)
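    A short worked example of the "events per 100 person-years" unit used above follows. The event count and follow-up time are hypothetical, chosen only to reproduce a rate in the range reported for the placebo arm; the crude rate ratio shown is related to, but not the same as, the model-based hazard ratios quoted in the abstract.

```python
# Worked example of an incidence rate per 100 person-years.
# Event count and follow-up below are hypothetical, not CANTOS data.

events = 150                   # hypothetical number of primary-endpoint events
person_years = 150 / 0.045     # hypothetical total follow-up, in person-years

rate_per_100py = 100 * events / person_years
print(f"incidence rate: {rate_per_100py:.2f} events per 100 person-years")

# A crude rate ratio between arms, using the rates quoted in the abstract;
# this approximates, but does not equal, the reported hazard ratio of 0.85
# for the 150-mg group, which comes from a time-to-event model.
placebo_rate, treated_rate = 4.50, 3.86   # per 100 person-years
print(f"crude rate ratio: {treated_rate / placebo_rate:.2f}")
```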