Compressed Transmission of Route Descriptions
We present two methods to compress the description of a route in a road
network, i.e., of a path in a directed graph. The first method represents a
path by a sequence of via edges. The subpaths between the via edges have to be
unique shortest paths. Via nodes can be used instead of via edges, though
this requires some simple preprocessing. The second method uses contraction
hierarchies to replace subpaths of the original path by shortcuts. The two
methods can be combined with each other. We also propose an application to
mobile, server-based routing: we compute the route on a server that has access
to up-to-date information, for example about congestion, and transmit the
computed route to the car over a mobile radio link. There, we apply the
compression to save costs and transmission time. If the compression works
well, we can transmit routes even when bandwidth is low. Although we have not
yet evaluated our ideas on realistic data, they are promising.

Comment: 7 pages, technical report
Ranking the Risks: The 10 Pathogen-Food Combinations With the Greatest Burden on Public Health
Examines food-borne pathogens with the highest disease burdens and the ten foods most commonly contaminated by them, such as Salmonella in poultry, Toxoplasma in pork, and Listeria in deli meats. Makes policy recommendations for improving prevention.
Let’s Take This One Step at a Time: The Effect of Presenting the Brainstorming Rules in Stages on Brainstorming Effectiveness
The purpose of this research is to further our understanding of the way groups work together to generate ideas using a procedure called brainstorming. Brainstorming requires groups to follow four procedural rules while generating their ideas (Osborn 1957). However, two of these rules seem to call for contradictory processes. One rule states that “free-wheeling is welcomed; the wilder the idea the better,” while another says to “combine and build on the ideas already generated.” The contradiction becomes apparent once one notices that the first rule asks a group to generate ideas that differ from previously generated ideas, while the second asks it to generate ideas that are similar to previously generated ideas.

The implication of this contradiction was examined by presenting the rules to groups in two different ways. In the first condition, groups received all four rules at once, the standard way they are usually presented. In the second condition, groups also received all four rules, but they received the “free-wheeling” instruction first and, after generating ideas for a set amount of time, then received the final “build-on” instruction. When groups were given the two contradictory rules separately and sequentially, they generated significantly more feasible ideas, though not more original ones. These results suggest that this way of presenting the rules could help real-world groups generate more viable and useful ideas.
Shade-Grown Coffee: Simulation and Policy Analysis for Coastal Oaxaca, Mexico
Shade-grown coffee provides a livelihood to many farmers, protects biodiversity, and creates environmental services. Many shade-coffee farmers have abandoned production in recent years, however, in response to declines in international coffee prices. This paper builds a farmer decision model under price uncertainty and uses simulation analysis of that model to examine the likely impact of various policies on abandonment of shade-coffee plantations. Using information from coastal Oaxaca, Mexico, this paper examines the role of various constraints in abandonment decisions, reveals the importance of the timing of policies, and characterizes the current situation in the study region.

Keywords: coffee farming, decision analysis, numerical modeling, Monte Carlo, price variability
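For readers unfamiliar with this kind of analysis, here is a toy Monte Carlo sketch, with entirely hypothetical numbers, of the price-uncertainty simulation the abstract describes: simulate many random coffee price paths and record how often a stylized farmer abandons once net returns turn negative.

```python
# A toy Monte Carlo sketch (all parameters hypothetical) of a
# price-uncertainty abandonment simulation: a farmer abandons the
# plantation in the first year the price falls below production cost.
import random

def simulate_price_path(p0=1.0, years=20, drift=-0.01, vol=0.15):
    """One random-walk realization of the coffee price."""
    prices, p = [], p0
    for _ in range(years):
        p *= 1 + drift + vol * random.gauss(0, 1)
        prices.append(max(p, 0.01))  # keep the price positive
    return prices

def abandonment_year(prices, cost=0.6):
    """First year net returns go negative; None if never."""
    for t, p in enumerate(prices):
        if p < cost:
            return t
    return None

random.seed(1)
runs = [abandonment_year(simulate_price_path()) for _ in range(10_000)]
share_abandoning = sum(r is not None for r in runs) / len(runs)
print(f"Share of simulated farmers abandoning: {share_abandoning:.1%}")
```

Policy experiments in this style amount to re-running the simulation with modified parameters (for example, a price floor or a per-hectare subsidy) and comparing abandonment rates.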
The Inner Galaxy resolved at IJK using DENIS data
We present the analysis of three colour optical/near-infrared images, in IJK,
taken for the DENIS project. The region considered covers 17.4 square deg and
lies within |l|<5 deg, |b|<1.5 deg. The adopted methods for deriving photometry
and astrometry in these crowded images, together with an analysis of the
remaining deficiencies, are presented. The numbers of objects
extracted in I, J and K are 748,000, 851,000 and 659,000 respectively, to
magnitude limits of 17, 15 and 13. The 80% completeness levels typically fall
at magnitudes 16, 13 and 10 respectively, brighter by about 2 magnitudes than the usual DENIS
limits due to the crowded nature of these fields. A simple model to describe
the disk contribution to the number counts is constructed, and parameters for
the dust layer derived. We find that a formal fit of parameters for the dust
plane, from these data in limited directions, gives a scalelength and
scaleheight of 3.4±1.0 kpc and 40±5 pc respectively, and a solar position
14.0±2.5 pc below the plane. This latter value is likely to be affected by
localised dust asymmetries. We convolve a detailed model of the systematic and
random errors in the photometry with a simple model of the Galactic disk and
dust distribution, to simulate expected colour-magnitude diagrams. These are in
good agreement with the observed diagrams, allowing us to isolate those stars
from the inner disk and bulge. After correcting for local dust-induced
asymmetries, we find evidence for longitude-dependent asymmetries in the
distant J and K sources, consistent with the general predictions of some
Galactic bar models. We consider complementary L-band observations in a second
paper.

Comment: 14 pages, 33 figures, LaTeX, MNRAS accepted
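To make the fitted dust geometry concrete, here is a minimal numerical sketch of a double-exponential dust layer with the quoted scalelength, scaleheight, and solar offset; the density normalization and the Sun's Galactocentric radius are assumed here, not taken from the paper.

```python
# A minimal sketch of the fitted double-exponential dust layer:
# scalelength 3.4 kpc, scaleheight 40 pc, Sun 14 pc below the plane.
# RHO_0 and R_SUN are assumptions, not fitted values from the paper.
import math

R_SUN = 8.0            # kpc, assumed Galactocentric radius of the Sun
Z_SUN = -0.014         # kpc, solar offset below the plane (from the fit)
H_R, H_Z = 3.4, 0.040  # kpc, fitted dust scalelength and scaleheight
RHO_0 = 1.0            # arbitrary central density (relative units)

def dust_density(R, z):
    """Double-exponential dust disk."""
    return RHO_0 * math.exp(-R / H_R - abs(z) / H_Z)

def dust_column(l_deg, b_deg, d_max=10.0, steps=2000):
    """Relative dust column from the Sun to d_max kpc toward (l, b)."""
    l, b = math.radians(l_deg), math.radians(b_deg)
    total, ds = 0.0, d_max / steps
    for i in range(steps):
        d = (i + 0.5) * ds                       # midpoint rule
        x = R_SUN - d * math.cos(b) * math.cos(l)
        y = d * math.cos(b) * math.sin(l)
        z = Z_SUN + d * math.sin(b)
        total += dust_density(math.hypot(x, y), z) * ds
    return total

# relative dust columns toward the field centre and a field corner
print(dust_column(0.0, 0.0), dust_column(5.0, 1.5))
```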
Prioritizing Opportunities to Reduce the Risk of Foodborne Illness: A Conceptual Framework
Determining the best use of food safety resources is a difficult task faced by public policymakers, regulatory agencies, state and local food safety and health agencies, as well as private firms. The Food Safety Research Consortium (FSRC) has developed a conceptual framework for priority setting and resource allocation for food safety that takes full account of the food system’s complexity and available data but is simple enough to be workable and of practical value to decisionmakers. The conceptual framework addresses the question of how societal resources, both public and private, can be used most effectively to reduce the public health burden of foodborne illness by quantitatively ranking risks and considering the availability, effectiveness, and cost of interventions to address these risks. We identify two types of priority-setting decisions: Purpose 1 priority setting, which guides risk-based allocation of food safety resources, primarily by government food safety agencies, across a wide range of opportunities to reduce the public health impact of foodborne illness; and Purpose 2 priority setting, which guides the choice of risk management actions and strategies with respect to particular hazards and commodities. It is essential that such a framework be grounded in a systems approach; be multidisciplinary in its methods and in its integration of data; and be practical, flexible, and dynamic, with ongoing evaluation and continuous updating of risk rankings and other elements. The conceptual framework is a synthesis of ideas and information generated in connection with and during the three FSRC workshops convened under a project funded by the Cooperative State Research, Education, and Extension Service of USDA. Workshop materials are available on the project website: http://www.card.iastate.edu/food_safety/.
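To make the Purpose 1 ranking idea concrete, here is a stylized sketch with entirely hypothetical figures: opportunities are ordered by expected health burden averted per dollar of intervention cost, which is one simple way to operationalize the framework's risk-based allocation.

```python
# A stylized sketch of Purpose 1 ranking (all figures hypothetical):
# order intervention opportunities by burden averted per dollar.
interventions = [
    # (name, annual QALYs gained, annual cost in USD) -- illustrative
    ("Poultry Salmonella controls", 1200, 40_000_000),
    ("Deli-meat Listeria controls",  900, 15_000_000),
    ("Pork Toxoplasma controls",     500, 25_000_000),
]

ranked = sorted(interventions, key=lambda x: x[1] / x[2], reverse=True)
for name, qalys, cost in ranked:
    print(f"{name}: {qalys / cost * 1e6:.1f} QALYs averted per $1M")
```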
Maquiladoras, Air Pollution, and Human Health in Ciudad Juarez and El Paso
Ciudad Juárez, Chihuahua, is home to the U.S.–Mexico border’s largest maquiladora labor force, and also its worst air pollution. We marshal two types of evidence to examine the link between maquiladoras and air pollution in Ciudad Juárez, and in its sister city, El Paso, Texas. First, we use a publicly available sector-level emissions inventory for Ciudad Juárez to determine the importance of all industrial facilities (including maquiladoras) as a source of air pollution. Second, we use original plant-level data from two sample maquiladoras to better understand the impacts of maquiladora air pollution on human health. We use a series of computational models to estimate health damages attributable to air pollution from these plants, we compare these damages to estimates of damages from non-maquiladora industrial polluters, and we use regression analysis to determine whether the poor suffer disproportionately from maquiladora air pollution. We find that air pollution from maquiladoras has serious consequences for human health, including respiratory disease and premature mortality. However, maquiladoras are clearly not the leading cause of air pollution in Ciudad Juárez and El Paso. Moreover, most maquiladoras are probably less important sources of dangerous air pollution than at least one notoriously polluting Mexican-owned industry. Finally, we find no evidence to suggest that maquiladora air pollution affects the poor disproportionately.

Keywords: maquiladora, air pollution, human health, environmental justice, U.S.-Mexico border, Ciudad Juárez, El Paso
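As a rough indication of how such health-damage estimates are typically built (the paper's own models are more detailed; every number below is hypothetical), the calculation chains a concentration change, an exposed population, a concentration-response coefficient, and a monetary valuation.

```python
# A stylized impact-pathway calculation (every number is hypothetical,
# chosen only to show the chain of steps, not the paper's estimates):
# concentration change -> exposed population -> concentration-response
# function -> monetary valuation.
POP = 1_300_000      # exposed population
DELTA_PM = 2.0       # ug/m3 of PM attributable to the plants
BETA = 0.0006        # proportional excess mortality per ug/m3
BASE_RATE = 0.005    # baseline annual mortality rate
VSL = 1_000_000      # value of a statistical life, USD

# linearized concentration-response: excess deaths per year
excess_deaths = BASE_RATE * POP * BETA * DELTA_PM
damages_usd = excess_deaths * VSL
print(f"{excess_deaths:.1f} premature deaths/yr  ->  "
      f"~${damages_usd / 1e6:.1f}M/yr in damages")
```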
Cost-Effective NOx Control in the Eastern United States
Reducing nitrogen oxide (NOx) emissions in the eastern United States has become the focus of efforts to meet ozone air quality goals and will be useful for reducing particulate matter (PM) concentrations in the future. This paper addresses many aspects of the debate over the appropriate approach for obtaining reductions in NOx emissions from point sources beyond those called for in the Clean Air Act Amendments of 1990. Data on NOx control technologies and their associated costs, spatial models linking NOx emissions and air quality, and benefit estimates of the health effects of changes in ozone and PM concentrations are combined to allow an analysis of alternative policies in thirteen states in the eastern United States. The first part of the study examines the cost and other consequences of a command-and-control approach embodied in the Environmental Protection Agency’s (EPA) NOx SIP call, which envisions large reductions in NOx from electric utilities and other point sources. These results are compared to the alternative policy of ton-for-ton NOx emissions trading, similar to that proposed by the EPA for utilities. We find that emission reduction targets can be met at roughly 50% cost savings under a trading program when there are no transaction costs. The paper examines a number of alternative economic incentive policies that have the potential to improve upon the utility NOx trading plan proposed by EPA, including incorporation of other point sources in the trading program, incorporation of ancillary PM benefits to ozone reductions in the trading program, and trading on the basis of ozone exposures that incorporates the spatial impact of emissions on ozone levels. For the latter analysis, we examine spatially differentiated permit systems for reducing ozone exposures under different and uncertain meteorological conditions, including an empirical analysis of the trade-off between the reliability (or degree of certainty) of meeting ozone exposure reduction targets and the cost of NOx control. Finally, several policies that combine costs and health benefits from both ozone and PM reductions are compared to command-and-control and single-pollutant trading policies. The first of these is a full multipollutant trading system that achieves a health benefit goal, with the interpollutant trading ratios governed by the ratio of unit health benefits of ozone and PM. Then, a model that maximizes aggregate benefits from both ozone and PM exposure reductions net of the costs of NOx controls is estimated. EPA’s program appears to be reasonably cost-effective compared to all of the other more complex trading programs we examined. It may even be considered an optimal policy that maximizes net aggregate benefits if the high estimate of benefits, which links mortality risk to ozone exposure, is used. Without this controversial assumption, however, we find that EPA’s NOx reduction target is far too large.
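To illustrate why trading lowers costs relative to command-and-control (a stylized sketch; the cost curves are hypothetical, not from the paper's data), consider two sources with quadratic abatement costs: trading equalizes marginal costs across sources, while command-and-control forces equal reductions.

```python
# A stylized sketch (hypothetical cost curves) of the cost-savings logic:
# with quadratic abatement costs C_i(q) = a_i * q^2, cap-and-trade
# equalizes marginal costs, while command-and-control forces each
# source to abate the same amount.
a = [1.0, 4.0]    # cost steepness of two NOx sources (hypothetical)
target = 10.0     # total tons of NOx to abate

# command-and-control: each source abates target / 2
cac_cost = sum(ai * (target / len(a)) ** 2 for ai in a)

# trading: marginal cost 2*a_i*q_i equals the permit price p everywhere,
# so q_i = p / (2*a_i) and sum q_i = target gives p
p = target / sum(1 / (2 * ai) for ai in a)
trade_cost = sum(ai * (p / (2 * ai)) ** 2 for ai in a)

print(f"command-and-control cost: {cac_cost:.1f}")
print(f"trading cost:             {trade_cost:.1f}  "
      f"({1 - trade_cost / cac_cost:.0%} savings)")
```

With these illustrative parameters the savings come to 36%; the paper's roughly 50% figure depends on the actual distribution of control costs across sources.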
How long, O Bayesian network, will I sample thee? A program analysis perspective on expected sampling times
Bayesian networks (BNs) are probabilistic graphical models for describing
complex joint probability distributions. The main problem for BNs is inference:
Determine the probability of an event given observed evidence. Since exact
inference is often infeasible for large BNs, popular approximate inference
methods rely on sampling.
We study the problem of determining the expected time to obtain a single
valid sample from a BN. To this end, we translate the BN together with
observations into a probabilistic program. We provide proof rules that yield
the exact expected runtime of this program in a fully automated fashion. We
implemented our approach and successfully analyzed various real-world BNs taken
from the Bayesian network repository.
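To make the analyzed quantity concrete, here is a minimal sketch (the two-node network and its probabilities are made up) showing that, for rejection sampling, the expected time to obtain one valid sample is 1/P(evidence).

```python
# A minimal sketch of expected sampling time under rejection sampling:
# for a tiny two-node BN (made-up probabilities), the expected number
# of loop iterations to get one sample consistent with the evidence
# is 1 / P(evidence).
import random

def sample_bn():
    """Rain -> WetGrass, with illustrative conditional probabilities."""
    rain = random.random() < 0.2
    wet = random.random() < (0.9 if rain else 0.1)
    return rain, wet

def rejection_sample(evidence_wet=True):
    """Sample until the evidence matches; return sample and trial count."""
    trials = 0
    while True:
        trials += 1
        rain, wet = sample_bn()
        if wet == evidence_wet:
            return rain, trials

random.seed(0)
runs = [rejection_sample()[1] for _ in range(100_000)]
print("empirical expected sampling time:", sum(runs) / len(runs))
# analytic: P(wet) = 0.2*0.9 + 0.8*0.1 = 0.26, so about 1/0.26 = 3.85
```

The paper's contribution is to derive such expected runtimes exactly and automatically, via proof rules on the probabilistic program, rather than by empirical averaging as in this sketch.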
