4,570 research outputs found
Modelling distributed lag effects in epidemiological time series studies
The paper argues that much of the existing literature on air pollution and mortality deals only with the transient effects of air pollution. Policy, on the other hand, needs to know when, whether and to what extent pollution-induced increases in mortality are reversed. This involves modelling the entire distributed lag effect of air pollution. Borrowing from econometrics, this paper presents a method by which distributed lag effects can be estimated parsimoniously yet plausibly. The paper presents a time series study of the relationship between ambient levels of air pollution and daily mortality counts for Manchester employing this technique. Black Smoke (BS) is shown to have a highly significant effect on mortality counts in the short term. Nevertheless, we find that 80 percent of the deaths attributable to BS would have occurred anyway within one week, whereas the remaining 20 percent of individuals would otherwise have enjoyed a normal life expectancy.
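A common econometric device for estimating distributed lag effects parsimoniously is to constrain the lag coefficients to a low-order polynomial (an Almon-style lag). The sketch below illustrates that idea in a Poisson regression for daily death counts; it is not the paper's exact specification, and the variable names, lag length and polynomial degree are assumptions.

```python
# Illustrative sketch (not the paper's specification): a polynomial (Almon-style)
# distributed lag in a Poisson regression for daily mortality counts.
import numpy as np
import statsmodels.api as sm

def almon_basis(max_lag, degree):
    """Basis that constrains the lag coefficients to a low-order polynomial."""
    lags = np.arange(max_lag + 1)
    return np.vander(lags, degree + 1, increasing=True)   # (max_lag+1, degree+1)

def lag_matrix(x, max_lag):
    """Columns are x_t, x_{t-1}, ..., x_{t-max_lag}; rows with missing lags dropped."""
    n = len(x)
    return np.column_stack([x[max_lag - l : n - l] for l in range(max_lag + 1)])

def fit_distributed_lag(deaths, pollution, max_lag=14, degree=3):
    X_lags = lag_matrix(pollution, max_lag)     # raw lags of the pollutant
    B = almon_basis(max_lag, degree)
    Z = sm.add_constant(X_lags @ B)             # reduced design: few free parameters
    y = deaths[max_lag:]
    model = sm.GLM(y, Z, family=sm.families.Poisson()).fit()
    # Recover the implied coefficient at each lag from the polynomial parameters.
    lag_coefs = B @ model.params[1:]
    return model, lag_coefs
```

Summing the recovered lag coefficients gives the cumulative effect over the full lag window, which is the quantity the abstract contrasts with the purely transient effect.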
The amenity value of the Italian climate
The hedonic price literature suggests that locations with more favourable characteristics should display compensating wage and house price differentials.
Estimates of the marginal willingness to pay for small changes in climate variables are derived using the hedonic price technique applied to Italian data. A hedonic price model was specified in terms of January and July averages. There exists considerable empirical support for the hypothesis that amenity values for climate are embedded in the labour and housing markets. Italians would prefer a drier climate during the winter months, but higher summertime temperatures are shown to reduce welfare. These results may have relevance to the task of determining the economic impact of future climate change.
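For readers unfamiliar with the hedonic approach, the sketch below shows the generic form of such a regression: location-level prices are regressed on climate attributes plus controls, and the marginal willingness to pay is read off the fitted coefficients. It is a minimal illustration only; the variable names, controls and log-linear form are assumptions and not the paper's specification.

```python
# Minimal sketch of a hedonic price regression (illustrative; variable names and
# the log-linear functional form are assumptions, not the paper's specification).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def hedonic_climate_model(df: pd.DataFrame):
    """df: one row per location with house_price, jan_temp, july_temp, jan_rain
    and control variables (e.g. rooms, population)."""
    model = smf.ols(
        "np.log(house_price) ~ jan_temp + july_temp + jan_rain + rooms + np.log(population)",
        data=df,
    ).fit()
    # With a log-linear form, the marginal implicit price of one extra degree in
    # July is approximately coefficient * price, evaluated at the mean house price.
    mwtp_july = model.params["july_temp"] * df["house_price"].mean()
    return model, mwtp_july
```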
Modelling distributed lag effects in mortality and air pollution studies: the case of Santiago
Most of the epidemiological literature on air pollution and mortality deals only with single- or dual-pollutant models whose results are hard to interpret and of questionable value from the policy perspective. In addition, much of the existing literature deals only with the very short-term effects of air pollution, whereas policy makers need to know when, whether and to what extent pollution-induced increases in mortality counts are reversed. This involves modelling the infinite distributed lag effects of air pollution. Borrowing from econometrics, this paper presents a method by which the infinite distributed lag effects can be estimated parsimoniously yet plausibly. The paper presents a time series study of the relationship between ambient levels of air pollution and daily mortality counts for Santiago employing this technique, which confirms that the infinite lag effects are highly significant. It is also shown that day-to-day variations in NO2 concentrations and in the concentrations of both fine and coarse particulates are associated with short-term variations in death rates. These findings are made in the context of a model that simultaneously includes six different pollutants. Evidence is found pointing to the operation of a very short-term harvesting effect.
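One standard econometric route to a parsimonious infinite distributed lag is the geometric (Koyck-type) lag, in which the effect of pollution decays at a constant rate. The sketch below profiles the decay parameter over a grid within a Poisson regression; it is illustrative only, not necessarily the specification used in the paper, and all names are assumptions.

```python
# Illustrative sketch of a geometric (Koyck-style) infinite distributed lag, a
# standard econometric device for parsimonious infinite-lag estimation; not
# necessarily the paper's specification. Names are assumptions.
import numpy as np
import statsmodels.api as sm

def geometric_exposure(x, lam):
    """s_t = x_t + lam*x_{t-1} + lam^2*x_{t-2} + ...  (recursive form)."""
    s = np.zeros(len(x), dtype=float)
    for t in range(len(x)):
        s[t] = x[t] + (lam * s[t - 1] if t > 0 else 0.0)
    return s

def fit_koyck_poisson(deaths, pollution, grid=np.linspace(0.0, 0.95, 20)):
    """Profile the decay parameter lam over a grid, keeping the best-fitting fit."""
    best = None
    for lam in grid:
        X = sm.add_constant(geometric_exposure(pollution, lam))
        res = sm.GLM(deaths, X, family=sm.families.Poisson()).fit()
        if best is None or res.llf > best[2]:
            best = (lam, res, res.llf)
    lam, res, _ = best
    # Cumulative (infinite-lag) effect of a sustained unit increase in pollution.
    long_run_effect = res.params[1] / (1.0 - lam)
    return lam, res, long_run_effect
```

The ratio of the long-run effect to the immediate effect is what separates a genuine increase in mortality from short-term harvesting of the kind the abstract describes.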
Valuing congestion costs in the British Museum
Museums are potentially congestible resources because the exhibits they contain are, in any relevant sense of the word, irreproducible. Insofar as visitor congestion diminishes the value of individuals' visits, it constitutes an additional reason for charging for admission to museums, albeit one not previously considered. A policy of free access to a museum containing unique treasures may dissipate the economic benefits of the museum.
Within the context of an empirical study undertaken for the British Museum using stated preference techniques, it is shown that the congestion cost imposed by the marginal visitor is quite high. Notwithstanding the argument that visits to the museum may possess external benefits, this points to the desirability of introducing charges for admission. Furthermore, it is shown that the marginal congestion cost decreases, at least over a range, as visitor numbers increase; in other words, beyond certain levels additional visitors add progressively less to congestion. This suggests that, contrary to what is often assumed, charging more during periods of high demand may be undesirable.
Insofar as congestion is a widespread phenomenon in important museums, galleries and sites of historical heritage, the issues raised in this paper, as well as the methodology devised to determine congestion costs, could have widespread application.
Efficient FPT algorithms for (strict) compatibility of unrooted phylogenetic trees
In phylogenetics, a central problem is to infer the evolutionary relationships between a set of species; these relationships are often depicted via a phylogenetic tree -- a tree without degree-2 nodes whose leaves are univocally labeled by the species -- called the "species tree". One common approach for reconstructing a species tree consists in first constructing several phylogenetic trees from primary data (e.g. DNA sequences originating from some of the species), and then constructing a single phylogenetic tree maximizing the "concordance" with the input trees. The so-obtained tree is our estimate of the species tree and, when the input trees are defined on overlapping -- but not identical -- sets of labels, is called a "supertree". In this paper, we focus on two problems that are central when combining phylogenetic trees into a supertree: the compatibility and the strict compatibility problems for unrooted phylogenetic trees. These problems are strongly related, respectively, to the notions of "containing as a minor" and "containing as a topological minor" from graph theory. Both problems are known to be fixed-parameter tractable in the number of input trees, by using their expressibility in Monadic Second Order Logic and a reduction to graphs of bounded treewidth. Motivated by the fact that the dependency of these algorithms on the number of input trees is prohibitively large, we give the first explicit dynamic programming algorithms for solving these problems, both with running times given explicitly in terms of the total size of the input. Comment: 18 pages, 1 figure
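The basic subroutine underlying compatibility is the notion of a supertree "displaying" an input tree: restrict the supertree to the input tree's leaf set, suppress degree-2 nodes, and test for a label-preserving isomorphism. The sketch below implements only this display check with networkx, as a way of making the definitions concrete; it is not the paper's fixed-parameter dynamic programming algorithm, and the graph representation is an assumption.

```python
# Display check for unrooted, leaf-labelled trees (illustrative; not the paper's
# FPT algorithm). Trees are nx.Graph objects whose leaf nodes carry a 'label'.
import networkx as nx

def restrict_and_suppress(tree: nx.Graph, leaves: set) -> nx.Graph:
    """Minimal subtree of `tree` spanning `leaves`, with degree-2 nodes suppressed."""
    g = tree.copy()
    # Repeatedly prune degree-1 nodes whose label is not in the wanted leaf set.
    changed = True
    while changed:
        changed = False
        for v in [v for v in g if g.degree(v) == 1 and g.nodes[v].get("label") not in leaves]:
            g.remove_node(v)
            changed = True
    # Suppress unlabelled degree-2 nodes: connect their two neighbours directly.
    for v in [v for v in list(g) if g.degree(v) == 2 and g.nodes[v].get("label") is None]:
        u, w = list(g.neighbors(v))
        g.remove_node(v)
        g.add_edge(u, w)
    return g

def displays(supertree: nx.Graph, input_tree: nx.Graph) -> bool:
    """True if `supertree` displays `input_tree` (leaf labels must match)."""
    labels = {d["label"] for _, d in input_tree.nodes(data=True) if d.get("label")}
    reduced = restrict_and_suppress(supertree, labels)
    return nx.is_isomorphic(
        reduced, input_tree,
        node_match=lambda a, b: a.get("label") == b.get("label"),
    )
```

Compatibility then asks whether a single supertree exists that displays every input tree simultaneously, which is where the dynamic programming of the paper comes in.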
Upstream cyclone influence on the predictability of block onsets over the Euro-Atlantic region
Atmospheric blocking has been shown to be a phenomenon that models struggle to predict accurately, particularly the onset of a blocked state following a more zonal flow. This struggle is, in part, due to the lack of a complete dynamical theory for block onset and maintenance. Here, we evaluate the impact that cyclone representation has on the forecast of block onset in two case studies from the North Atlantic Waveguide and Downstream Impact Experiment field campaign and in the 20 most unpredictable block onsets over the Euro-Atlantic region in medium-range forecasts from the ECMWF. The six-day forecast of block onset in the case studies is sensitive to changes in the forecast location and intensity of upstream cyclones (one cyclone in one case and two in the other) in the days preceding onset. Ensemble sensitivity analysis reveals that this is often the case in unpredictable block onset cases: a one-standard-deviation change in 1000-hPa geopotential height near an upstream cyclone, or in 320-K potential vorticity near the tropopause, two or three days prior to block onset is associated with a change of more than 10% in block area on the analysed onset day in 17 of the 20 onset cases. These results imply that improvements in the forecasts of upstream cyclone location and intensity may help improve block onset forecasts.
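Ensemble sensitivity analysis, as commonly defined, regresses a scalar forecast metric on an earlier model field across ensemble members and reports the change in the metric per one-standard-deviation change in the field. A minimal numpy sketch of that calculation is given below; array names and shapes are assumptions, and it is not the study's actual processing chain.

```python
# Minimal sketch of ensemble sensitivity analysis: the change in a scalar metric
# (e.g. block area at onset) per one-standard-deviation change in an earlier
# field (e.g. 1000-hPa geopotential height) at each grid point, across members.
import numpy as np

def ensemble_sensitivity(metric, field):
    """
    metric: (n_members,)           scalar response per ensemble member
    field:  (n_members, ny, nx)    earlier field per member
    returns (ny, nx): d(metric) per +1 std-dev change in the field.
    """
    m = metric - metric.mean()
    f = field - field.mean(axis=0)
    n = len(metric)
    cov = np.einsum("i,ijk->jk", m, f) / (n - 1)   # cov(metric, field) per point
    var = f.var(axis=0, ddof=1)
    std = np.sqrt(var)
    with np.errstate(invalid="ignore", divide="ignore"):
        sensitivity = np.where(var > 0, cov / var * std, 0.0)   # = cov / std
    return sensitivity
```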
An attempt to observe economy globalization: the cross correlation distance evolution of the top 19 GDP's
Economy correlations between the 19 richest countries are investigated through their Gross Domestic Product increments. A distance is defined between increment correlation matrix elements, and its evolution is studied as a function of time and time-window size. Unidirectional and Bidirectional Minimal Length Paths are generated and analysed for different time windows. A sort of critical correlation time window is found, indicating a transition for best observations. The mean path length decreases with time, indicating stronger correlations. A new method for estimating a realistic minimal time window in which to observe correlations and deduce macroeconomic conclusions from such features is thus suggested. Comment: to be published in the Dyses05 proceedings, in Int. J. Mod. Phys. C; 15 pages, 5 figures, 1 table
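The construction referred to here is, in this strand of econophysics, typically the correlation distance d_ij = sqrt(2(1 - c_ij)) computed from the correlations c_ij of GDP increments, with a minimal spanning tree summarising the distance structure. The sketch below is a generic reconstruction of that calculation, not the authors' exact procedure; variable names are assumptions.

```python
# Generic reconstruction (not the authors' exact procedure): correlation distance
# between GDP increment series and the mean edge length of the resulting minimal
# spanning tree, which shrinks as the economies become more strongly correlated.
import numpy as np
import networkx as nx

def correlation_distance_mst(increments, countries):
    """increments: (n_periods, n_countries) array of GDP increments."""
    corr = np.corrcoef(increments, rowvar=False)     # c_ij in [-1, 1]
    dist = np.sqrt(2.0 * (1.0 - corr))               # metric distance in [0, 2]
    g = nx.Graph()
    n = len(countries)
    for i in range(n):
        for j in range(i + 1, n):
            g.add_edge(countries[i], countries[j], weight=dist[i, j])
    mst = nx.minimum_spanning_tree(g)
    mean_length = np.mean([d["weight"] for _, _, d in mst.edges(data=True)])
    return mst, mean_length
```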
