
    Panel Discussion - Management of Eurasian watermilfoil in the United States using native insects: State regulatory and management issues

    While researchers have evaluated the potential of native insect herbivores to manage nonindigenous aquatic plant species such as Eurasian watermilfoil (Myriophyllum spicatum L.), the practical matters of regulatory compliance and implementation have been neglected. A panel of aquatic nuisance species program managers from three state natural resource management agencies (Minnesota, Vermont and Washington) discussed their regulatory and policy concerns. In addition, an ecological consultant attempting to market one of the native insects added his perspective on the special challenges of distributing a native biological control agent for management of Eurasian watermilfoil.

    Making Classical Ground State Spin Computing Fault-Tolerant

    We examine a model of classical deterministic computing in which the ground state of the classical system is a spatial history of the computation. This model is relevant to quantum-dot cellular automata as well as to recent universal adiabatic quantum computing constructions. In its most primitive form, systems constructed in this model cannot compute in an error-free manner when working at non-zero temperature. However, by exploiting a mapping between the partition function for this model and probabilistic classical circuits, we are able to show that it is possible to make this model effectively error-free. We achieve this by using techniques from fault-tolerant classical computing, and the result is that the system can compute effectively error-free if the temperature is below a critical temperature. We further link this model to computational complexity and show that a certain problem concerning finite-temperature classical spin systems is complete for the complexity class Merlin-Arthur. This provides an interesting connection between the physical behavior of certain many-body spin systems and computational complexity. Comment: 24 pages, 1 figure.

    Scaling laws governing stochastic growth and division of single bacterial cells

    Uncovering the quantitative laws that govern the growth and division of single cells remains a major challenge. Using a unique combination of technologies that yields unprecedented statistical precision, we find that the sizes of individual Caulobacter crescentus cells increase exponentially in time. We also establish that they divide upon reaching a critical multiple (≈1.8) of their initial sizes, rather than an absolute size. We show that when the temperature is varied, the growth and division timescales scale proportionally with each other over the physiological temperature range. Strikingly, the cell-size and division-time distributions can both be rescaled by their mean values such that the condition-specific distributions collapse to universal curves. We account for these observations with a minimal stochastic model based on an autocatalytic cycle. It predicts the scalings, as well as specific functional forms for the universal curves. Our experimental and theoretical analysis reveals a simple physical principle governing these complex biological processes: a single temperature-dependent scale of cellular time governs the stochastic dynamics of growth and division in balanced growth conditions. Comment: Text + Supplementary material.
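
    The growth law described in this abstract can be sketched numerically. The snippet below is a minimal illustration, not the study's model: it assumes exponential single-cell growth with a noisy critical division ratio of ≈1.8 and symmetric division; the growth rate, noise level and seed are invented parameters.

```python
import math
import random

def simulate_divisions(n_gen, growth_rate=1.0, ratio=1.8, noise=0.05, seed=0):
    """Simulate successive growth-division cycles of a single cell lineage.

    The cell grows exponentially, s(t) = s0 * exp(growth_rate * t), and
    divides on reaching a noisy critical multiple (~1.8) of its initial
    size; one daughter of half the division size is then followed.
    Returns the list of interdivision times.
    """
    rng = random.Random(seed)
    s0 = 1.0
    times = []
    for _ in range(n_gen):
        r = ratio * (1.0 + noise * rng.gauss(0.0, 1.0))  # noisy ratio
        # Under exponential growth the interdivision time depends only on
        # the ratio, not on absolute size: t = ln(r) / growth_rate.
        times.append(math.log(r) / growth_rate)
        s0 = r * s0 / 2.0  # daughter cell size (symmetric division)
    return times

times = simulate_divisions(10000)
mean_t = sum(times) / len(times)  # ~ ln(1.8) / growth_rate ≈ 0.59
```

    That the division time is independent of absolute size is the point the abstract makes: a single timescale, set by the growth rate, governs the dynamics.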

    Evaluating socio-economic and environmental sustainability of the sheep farming activity in Greece: a whole-farm mathematical programming approach

    Ruminant livestock farming is an important agricultural activity, mainly located in less favoured areas. Furthermore, ruminants have been identified as a significant source of GHG emissions. In this study, a whole-farm optimization model is used to assess the socio-economic and environmental performance of the dairy sheep farming activity in Greece. The analysis is undertaken in two sheep farms that represent the extensive and the semi-intensive farming systems. Gross margin and labour are regarded as socio-economic indicators and GHG emissions as environmental indicators. The issue of the marginal abatement cost is also addressed. The results indicate that the semi-intensive system yields a higher gross margin/ewe (179 €) than the extensive system (117 €) and requires less labour. The extensive system causes higher emissions/kg of milk than the semi-intensive system (5.45 and 2.99 kg of CO2 equivalents, respectively). In both production systems, abatement is achieved primarily via reduction of the flock size and a switch to cash crops. However, the marginal abatement cost is much higher in the case of the semi-intensive farms, due to their high productivity.
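
    The whole-farm mathematical programming approach can be illustrated with a toy two-activity linear programme, solved here by vertex enumeration. Every coefficient below (margins, labour and emission factors, resource limits) is hypothetical and chosen only to show the structure of such a model, not taken from the study.

```python
# Tiny whole-farm LP: choose ewes (x) and hectares of cash crops (y)
# to maximize gross margin subject to labour and an emissions cap.
# All numbers are illustrative, not values from the study.

MARGIN = (179.0, 300.0)        # gross margin: EUR per ewe, EUR per ha
LABOUR = (8.0, 20.0, 4000.0)   # hours per ewe, per ha, total available
GHG    = (3.0, 1.5, 900.0)     # t CO2-eq per ewe, per ha, emissions cap

def feasible(x, y, eps=1e-9):
    return (x >= -eps and y >= -eps
            and LABOUR[0] * x + LABOUR[1] * y <= LABOUR[2] + eps
            and GHG[0] * x + GHG[1] * y <= GHG[2] + eps)

def vertices():
    """Candidate optima: corner points of the feasible polygon."""
    pts = [(0.0, 0.0),
           (LABOUR[2] / LABOUR[0], 0.0), (0.0, LABOUR[2] / LABOUR[1]),
           (GHG[2] / GHG[0], 0.0), (0.0, GHG[2] / GHG[1])]
    # Intersection of the two binding constraint lines.
    det = LABOUR[0] * GHG[1] - LABOUR[1] * GHG[0]
    if det:
        x = (LABOUR[2] * GHG[1] - LABOUR[1] * GHG[2]) / det
        y = (LABOUR[0] * GHG[2] - LABOUR[2] * GHG[0]) / det
        pts.append((x, y))
    return [p for p in pts if feasible(*p)]

best = max(vertices(), key=lambda p: MARGIN[0] * p[0] + MARGIN[1] * p[1])
gross_margin = MARGIN[0] * best[0] + MARGIN[1] * best[1]
```

    Tightening the emissions cap and re-solving traces out the marginal abatement cost the abstract refers to: the drop in optimal gross margin per unit of emissions forgone.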

    Methods for Characterizing Fine Particulate Matter Using Satellite Remote-Sensing Data and Ground Observations: Potential Use for Environmental Public Health Surveillance

    This study describes and demonstrates techniques for surfacing daily environmental hazard data on particulate matter with aerodynamic diameter of 2.5 micrometers or less (PM2.5), for the purpose of integrating respiratory health and environmental data in the Centers for Disease Control and Prevention's (CDC's) pilot study of Health and Environment Linked for Information Exchange (HELIX)-Atlanta. It describes a methodology for estimating ground-level continuous PM2.5 concentrations using B-Spline and inverse distance weighting (IDW) surfacing techniques and leveraging National Aeronautics and Space Administration (NASA) Moderate Resolution Imaging Spectroradiometer (MODIS) data to complement Environmental Protection Agency (EPA) ground observation data. The study used measurements of ambient PM2.5 from the EPA database for the year 2003 as well as PM2.5 estimates derived from NASA's satellite data. Hazard data were processed to derive surrogate PM2.5 exposure estimates. The paper shows that merging MODIS remote sensing data with surface observations of PM2.5 not only provides a more complete daily representation of PM2.5 than either data set alone would allow, but also reduces the errors in the estimated PM2.5 surfaces. The results show that the daily IDW PM2.5 surfaces had smaller errors, with respect to observations, than the B-Spline surfaces in the year studied. However, the IDW mean annual composite surface had more numerical artifacts, which could be due to the interpolating nature of IDW, which assumes that maxima and minima can occur only at the observation points. Finally, the methods discussed in this paper improve temporal and spatial resolution and establish a foundation for environmental public health linkage and association studies, for which determining the concentrations of an environmental hazard such as PM2.5 with good accuracy is critical.
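
    The IDW surfacing step can be sketched as follows. The station coordinates and PM2.5 readings are invented for illustration; note that IDW reproduces each observation exactly at its monitor location, the interpolating property the abstract points to as a source of numerical artifacts in the composite surface.

```python
import math

def idw(points, values, x, y, power=2.0, eps=1e-12):
    """Inverse distance weighted estimate of PM2.5 at (x, y).

    points: list of (xi, yi) monitor locations; values: PM2.5 readings.
    Weights grow without bound near a monitor, so the surface passes
    exactly through every observation (extrema only at stations).
    """
    num = den = 0.0
    for (xi, yi), v in zip(points, values):
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 < eps:                      # on top of a monitor: return it
            return v
        w = 1.0 / d2 ** (power / 2.0)     # weight = 1 / distance**power
        num += w * v
        den += w
    return num / den

# Hypothetical monitors (km grid) and readings (ug/m^3).
stations = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
pm25 = [12.0, 20.0, 8.0]
estimate = idw(stations, pm25, 5.0, 5.0)  # equidistant -> simple mean
```

    Evaluating `idw` on a regular grid of (x, y) points yields the daily surface; the satellite-derived estimates enter the same way, as additional pseudo-stations.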

    Electronic Health Record Functionality Needed to Better Support Primary Care

    Electronic health records (EHRs) must support primary care clinicians and patients, yet many clinicians remain dissatisfied with their systems. This manuscript presents a consensus statement about gaps in current EHR functionality and the enhancements needed to support primary care. The Institute of Medicine primary care attributes were used to define needs, and Meaningful Use (MU) objectives to define EHR functionality. Current objectives remain disease- rather than whole-person focused, ignoring factors like personal risks, behaviors, family structure, and occupational and environmental influences. Primary care needs EHRs that move beyond documentation to interpreting and tracking information over time, as well as patient-partnering activities, support for team-based care, population management tools that deliver care, and reduced documentation burden. While Stage 3 MU’s focus on outcomes is laudable, enhanced functionality is still needed, including EHR modifications, expanded use of patient portals, seamless integration with external applications, and advancement of national infrastructure and policies.

    The traditional, the ideal and the unexplored: sport coaches’ social identity constructs in film

    The sport coaching construct within mainstream fiction films has been described as stereotypical, reinforcing the traditional notion of the sport coach as either a technician who conquers all or a hapless individual open to ridicule from athletes and fans. Although this depiction is also prevalent in some independent fiction films and documentaries, film subgenres such as social realism and “fly on the wall” style documentaries move away from the “Hollywood sports film structure” towards stories that focus on everyday coaching moments. Through a critical discourse analysis of two U.K. films (Bend it Like Beckham and Twenty Four Seven), both featuring sport coaches in central roles, we reflect critically on these multidimensional mass-media representations in terms of the sport coaching professionalisation agenda in the U.K. and the social identification process of sport coaches within their sporting environments.
    Keywords: sport coaching, film, social identification, professionalisation, coaching roles

    Global optimization of data quality checks on 2‐D and 3‐D networks of GPR cross‐well tomographic data for automatic correction of unknown well deviations

    Significant errors related to poor time-zero estimation, well deviation or mislocation of the transmitter (TX) and receiver (RX) stations can render even the most sophisticated modeling and inversion routine useless. Previous examples of methods for the analysis and correction of data errors in geophysical tomography include the works of Maurer and Green (1997), Squires et al. (1992) and Peterson (2001). Here we follow the analysis and techniques of Peterson (2001) for data quality control and error correction. Through our data acquisition and quality control procedures we have very accurate control on the surface locations of wells, the travel distance of both the transmitter and receiver within the boreholes, and the change in apparent zero time. However, we often have poor control on well deviations, either because of economic constraints or because the nature of the borehole itself prevented the acquisition of well deviation logs; deviation logs themselves can also contain significant errors. Problems with borehole deviations can be diagnosed prior to inversion of travel-time tomography data sets by plotting the apparent velocity of a straight ray connecting a transmitter (TX) to a receiver (RX) against the take-off angle of the ray. Issues with the time-zero pick or distances between wells appear as symmetric smiles or frowns in these QC plots, whereas well deviation or dipping strong anisotropy results in an asymmetric correlation between apparent velocity and take-off angle (Figure 1-B). In addition, when a network of interconnected GPR tomography data is available, one has the additional quality constraint of ensuring continuity in velocity between immediately adjacent tomograms: a sudden shift in the mean velocity indicates that either position deviations are present or there is a shift in the pick times.
Small errors in well geometry may be effectively treated during inversion by including weighting, or relaxation, parameters in the inversion (e.g. Bautu et al., 2006). In the algebraic reconstruction tomography (ART) technique used herein for the travel-time inversion (Peterson et al., 1985), a small relaxation parameter will smooth imaging artifacts caused by data errors at the expense of resolution and contrast (Figure 2). However, large data errors such as unaccounted-for well deviations cannot be adequately suppressed through inversion weighting schemes, and mislocation of the transmitter and receiver stations can lead to serious imaging artifacts if not corrected prior to inversion. Previously, problems with tomograms were treated manually; in large data sets and/or networks of data sets, such trial-and-error changes to well geometries become increasingly difficult and ineffective. Our approach is to use cross-well data quality checks and a simplified model of borehole deviation with particle swarm optimization (PSO) to automatically correct source and receiver locations prior to tomographic inversion. We present a simple model of well deviation designed to minimize potential corruption of actual data trends. We also provide quantitative quality control measures based on minimizing correlations between take-off angle and apparent velocity, along with a quality check on the continuity of velocity between adjacent wells. This methodology is shown to be accurate and robust for simple 2-D synthetic test cases, and we demonstrate the method on actual field data, where it is compared to deviation logs.
This study demonstrates the promise of automatic correction of well deviations in GPR tomographic data. Analysis of synthetic data shows that very precise estimates of well deviation can be made for small deviations, even in the presence of static data errors. However, the analysis of the synthetic data and the application of the method to a large network of field data show that the technique is sensitive to data errors that vary between neighboring tomograms.
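
    The straight-ray QC measure described above (the correlation between take-off angle and apparent velocity) can be sketched as follows. The (horizontal offset, depth) coordinate convention, the two-well synthetic geometry, the 100 m/s velocity and the 0.3 m/m drift rate are all illustrative assumptions, not the paper's field parameters.

```python
import math

def qc_asymmetry(tx, rx, travel_times):
    """Pearson correlation between straight-ray take-off angle and
    apparent velocity over all TX-RX pairs: the QC measure above.
    Clean data give ~0; an unmodeled well deviation produces an
    asymmetric, nonzero correlation. tx, rx: (offset, depth) pairs.
    """
    angles, vels = [], []
    for (x1, z1), (x2, z2), t in zip(tx, rx, travel_times):
        dist = math.hypot(x2 - x1, z2 - z1)
        angles.append(math.atan2(z2 - z1, x2 - x1))  # take-off angle
        vels.append(dist / t)                        # apparent velocity
    n = len(angles)
    ma, mv = sum(angles) / n, sum(vels) / n
    cov = sum((a - ma) * (v - mv) for a, v in zip(angles, vels))
    sa = math.sqrt(sum((a - ma) ** 2 for a in angles))
    sv = math.sqrt(sum((v - mv) ** 2 for v in vels))
    # Treat numerically flat angle/velocity spreads as zero correlation.
    return cov / (sa * sv) if sa > 1e-9 and sv > 1e-9 else 0.0

# Synthetic check: two vertical wells 10 m apart, homogeneous 100 m/s medium.
pairs = [((0.0, float(z1)), (10.0, float(z2)))
         for z1 in range(6) for z2 in range(6)]
tx_pts = [p[0] for p in pairs]
rx_pts = [p[1] for p in pairs]
t_clean = [math.hypot(10.0, z2 - z1) / 100.0
           for (_, z1), (_, z2) in pairs]
r_clean = qc_asymmetry(tx_pts, rx_pts, t_clean)   # no deviation

# Now the receiver well actually drifts 0.3 m per metre of depth while the
# assumed geometry stays vertical: the asymmetric correlation appears.
t_dev = [math.hypot(10.0 + 0.3 * z2, z2 - z1) / 100.0
         for (_, z1), (_, z2) in pairs]
r_dev = qc_asymmetry(tx_pts, rx_pts, t_dev)
```

    Minimizing the magnitude of this correlation over candidate deviation models is, in spirit, the objective that the PSO search optimizes.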