
    Characterization and washability studies of raw coal from the Little Tonzona Field, Alaska

    Coal occurs in an isolated exposure of Tertiary, non-marine sedimentary rocks along the southwest bank of the Little Tonzona River, near Farewell, Alaska. The Little Tonzona River coal field is located approximately 150 air miles northwest of Anchorage, Alaska, and 210 air miles southwest of Fairbanks, Alaska, near the boundaries of Denali National Park. The Alaska Railroad and the Parks Highway are approximately 100 air miles from the coal field at their nearest point. The village of McGrath, on the Kuskokwim River, is located approximately 90 miles to the west (1). An impressive outcrop of coal-bearing Tertiary sediments is exposed for a distance of more than 275 feet on the west bank of the Little Tonzona River (Figure 1). More than seven coal beds, ranging in thickness from 3 feet to 30 feet, with a cumulative thickness of over 134 feet, are interbedded with clay beds up to 40 feet thick. The clays are fine-textured, extremely plastic, light grey to nearly white bentonites and/or tonsteins. Doyon Ltd., an ANCSA Native corporation, holds land selections covering the inferred limits of the coal field. During 1980 and 1981, Doyon entered into exploration agreements with McIntyre Mines Inc. of Nevada. The two-season exploration program took place from June 1, 1980 through August 22, 1980 and from May 27, 1981 through August 22, 1981. During the 1980 field season, geologic mapping, prospecting, stratigraphy, trenching and bulk sampling of all coal outcrops were performed, yielding a total of 34 samples for analysis. In 1981, six diamond drill holes with a cumulative length of 2,935 feet were completed. Core recovery was close to 90%, and a total of 147 coal samples, representing 802.8 cumulative feet of coal, were taken for analysis. The exploration program confirmed a strike length of over 3 miles to the southwest from the main river bank exposure. Northward extension is unknown at this time.
Although outcrop exposure is poor away from the river banks, burnout zones resulting from past coal bed fires form a resistant, recognizable on-strike feature in the relatively unindurated Tertiary sequence. The appearance of these burnout zones along strike is often the only surface indication of the buried coal-bearing strata. Well-preserved plant fossil impressions in the baked clays date the deposit as probable Miocene (2). Coal characterization and washability studies were performed on all coal samples by the Mineral Industry Research Laboratory of the University of Alaska Fairbanks. This work was conducted under the direction of Dr. P.D. Rao, Professor of Coal Technology. This study was conducted under the sponsorship of McIntyre Mines Ltd.

    Zero gap alkaline electrolysis cell design for renewable energy storage as hydrogen gas

    Zero gap alkaline electrolysers hold the key to cheap and efficient renewable energy storage via the production and distribution of hydrogen gas. A zero gap design, in which porous electrodes are spatially separated only by the gas separator, allows the unique benefits of alkaline electrolysis to be combined with the high efficiencies currently associated only with the more expensive PEM set-up. This review covers the basics of alkaline electrolysis and provides a detailed description of the advantages of employing a zero gap cell design over the traditional arrangement. A comparison of the different types of zero gap cell designs currently seen in research is made, and a description of recent developments is presented. Finally, the current state of research into zero gap alkaline electrolysis is discussed, and pathways for future research are identified. Zero gap alkaline electrolysis will allow excess renewable energy to be stored, transported and used on demand in a green and environmentally friendly manner: when the hydrogen is burnt or passed through a fuel cell, it produces only water and energy.

    What is news? News values revisited (again)

    The deceptively simple question “What is news?” remains pertinent even as we ponder the future of journalism in the digital age. This article examines news values within mainstream journalism and considers the extent to which news values may be changing since earlier landmark studies were undertaken. Its starting point is Harcup and O’Neill’s widely-cited 2001 updating of Galtung and Ruge’s influential 1965 taxonomy of news values. Just as that study put Galtung and Ruge’s criteria to the test with an empirical content analysis of published news, this new study explores the extent to which Harcup and O’Neill’s revised list of news values remains relevant given the challenges (and opportunities) faced by journalism today, including the emergence of social media. A review of recent literature contextualises the findings of a fresh content analysis of news values within a range of UK media 15 years on from the last study. The article concludes by suggesting a revised and updated set of contemporary news values, whilst acknowledging that no taxonomy can ever explain everything.

    Financial Partnering and Other Strategies to Help Centers of Teaching and Learning Thrive in Hard Times

    With Centers for Teaching and Learning (CTLs) entering a period of economic downturn, the authors demonstrate how their Center has survived hard times through financial partnering with on- and off-campus groups. They also explain how to develop successful strategies for partnering (both financial and otherwise), analyze the dynamics of such collaborations, and offer some useful guidelines.

    The DEEP2 Galaxy Redshift Survey: Design, Observations, Data Reduction, and Redshifts

    We describe the design and data sample from the DEEP2 Galaxy Redshift Survey, the densest and largest precision-redshift survey of galaxies at z ~ 1 completed to date. The survey has conducted a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude M_B = -20 at z ~ 1 via ~90 nights of observation on the DEIMOS spectrograph at Keck Observatory. DEEP2 covers an area of 2.8 deg^2 divided into four separate fields, observed to a limiting apparent magnitude of R_AB = 24.1. Objects with z < 0.7 are rejected based on BRI photometry in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted ~2.5 times more efficiently than in a purely magnitude-limited sample. Approximately sixty percent of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets which fail to yield secure redshifts are blue objects that lie beyond z ~ 1.45. The DEIMOS 1200-line/mm grating used for the survey delivers high spectral resolution (R ~ 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. DEEP2 surpasses other deep precision-redshift surveys at z ~ 1 in terms of galaxy numbers, redshift accuracy, sample number density, and amount of spectral information. We also provide an overview of the scientific highlights of the DEEP2 survey thus far. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the publicly available DEEP2 DEIMOS data reduction pipelines. [Abridged] Comment: submitted to ApJS; data products available for download at http://deep.berkeley.edu/DR4

    The DEEP2 Galaxy Redshift Survey: Design, Observations, Data Reduction, and Redshifts

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z ~ 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude M_B = -20 at z ~ 1 via ~90 nights of observation on the Keck telescope. The survey covers an area of 2.8 deg^2 divided into four separate fields observed to a limiting apparent magnitude of R_AB = 24.1. Objects with z < 0.7 are rejected on the basis of BRI photometry, allowing galaxies with z > 0.7 to be targeted ~2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z ~ 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200-line/mm grating used for the survey delivers high spectral resolution (R ~ 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other artifacts that in some cases remain after data reduction.
Redshift errors and catastrophic failure rates are assessed through more than 2,000 objects with duplicate observations. Sky subtraction is essentially photon-limited even under bright OH sky lines; we describe the strategies that permitted this, based on high image stability, accurate wavelength solutions, and powerful B-spline modeling methods. We also investigate the impact of targets that appear to be single objects in ground-based targeting imaging but prove to be composite in Hubble Space Telescope data; they constitute several percent of targets at z ~ 1, approaching ~5%-10% at z > 1.5. Summary data are given that demonstrate the superiority of DEEP2 over other deep high-precision redshift surveys at z ~ 1 in terms of redshift accuracy, sample number density, and amount of spectral information. We also provide an overview of the scientific highlights of the DEEP2 survey thus far.
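The B-spline sky-modeling approach mentioned in the abstract can be illustrated with a minimal sketch. This is not the DEEP2 spec2d pipeline; it is a hypothetical toy example, assuming SciPy's standard `splrep`/`BSpline` smoothing-spline interface and a synthetic sky spectrum with a single bright OH-like emission line.

```python
# Illustrative sketch only: fitting a smoothing B-spline to a noisy sky
# spectrum, in the spirit of the sky-subtraction strategy described above.
# The synthetic data, wavelengths, and smoothing choices are all hypothetical.
import numpy as np
from scipy.interpolate import splrep, BSpline

rng = np.random.default_rng(0)

# Synthetic sky spectrum: smooth continuum plus one bright "OH-like" line.
wavelength = np.linspace(7000.0, 7100.0, 500)            # Angstroms
true_sky = 10.0 + 50.0 * np.exp(-0.5 * ((wavelength - 7050.0) / 1.5) ** 2)
noise_sigma = 0.5
observed = true_sky + rng.normal(0.0, noise_sigma, wavelength.size)

# Fit a cubic B-spline; the smoothing target s is set to the expected sum of
# squared noise residuals (N * sigma^2), so the fit tracks real sky structure
# while suppressing pixel-to-pixel noise.
tck = splrep(wavelength, observed, k=3, s=wavelength.size * noise_sigma**2)
sky_model = BSpline(*tck)(wavelength)

# After subtracting the model, the residuals should be close to pure noise.
residual = observed - sky_model
print(f"rms residual: {residual.std():.3f}")
```

The key design point this sketch mirrors is that the spline is constrained just enough to absorb genuine sky features (including the bright line) without fitting the noise, so the subtraction remains photon-limited.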

    Design Thinking for Innovation. Stress Testing Human Factors in Ideation Sessions

    This paper reports on a series of studies that attempt to unpick the factors that contribute to successful team ideation. Ideation is a popular, structured approach to creative thinking, where the goal is to produce many viable and innovative ideas and concepts. This is often accomplished through structured collaborative workshops that include ‘Design Thinking’ techniques and methods. The reported studies involved manipulating variables in controlled experiments with subjects (AKA ideators). The ideators were tasked with generating ideas to solve a challenge, and the outcome of their work was measured by the quantity and quality of output. The latter criterion was assessed by an expert panel using a standardised evaluation framework. Four variables were employed to understand idea generation success factors. These were identified as common, and thus easily applied, factors in typical ideation scenarios: varying levels of participant stimulation (before sessions), presence or absence of a facilitator, application (or not) of a ‘Design Thinking’ technique and, lastly, participant profile based on professional background. In this case, participant characteristics were split between designers and non-designers. The experiments were run with participants generating ideas in a timeboxed activity, and their outputs were assessed against the various experimental conditions. The findings suggest that, counter to orthodox thinking, applying the methods (e.g. Round Robin) is less effective than the influence of ideators’ differing professional backgrounds and their level of stimulation. These conclusions in turn suggest the possibility of extending the effectiveness of workshop facilitation to increase the efficiency and quality of output. The paper concludes with pointers on improving ideation. 
In particular, increasing levels of engagement and immersion among participants and using aspects of game theory are seen as possible areas for further investigation.

    WHO decides what is fair? International HIV treatment guidelines, social value judgements and equitable provision of lifesaving antiretroviral therapy

    The new 2013 WHO Consolidated Guidelines on the Use of Antiretroviral Therapy (ART) make aspirational recommendations for ART delivery in low- and middle-income countries. Comprehensive assessments of available evidence were undertaken, and the recommendations made are likely to improve individual health outcomes. However, feasibility was downplayed; the Guidelines represent high-cost policy options, not all of which are compatible with the core public health principles of decentralization, task-shifting, and a commitment to universality. Critically, their impact on equity and the population-level distribution of health outcomes was not fully considered. We analyze the likely distribution of health outcomes resulting from alternative ways of realising the 2013 Guidelines and assess practicality, feasibility and health attainment amongst different sections of the population in the context of financial and human resource constraints. A claim can be made that direct interpretation of the Guidelines follows a "human rights"-based approach in seeking to provide individual patients with the best alternatives amongst those available on the basis of current evidence. However, there lies a basic conflict between this and "consequentialist" public health-based approaches that provide more equal population-level outcomes. When determining how to respond to the 2013 Guidelines and fairly allocate scarce lifesaving resources, national policymakers must carefully consider the distribution of outcomes and the underpinning social value judgements required to inform policy choice. It is important to consider whose values should determine what is a just distribution of health outcomes. The WHO Guidelines committees are well placed to compile evidence on the costs and effects of health care alternatives. However, their mandate for making distributional social value judgements remains unclear.