The scope for biomanipulation in improving water quality
Biomanipulation is a form of biological engineering in which organisms are selectively removed or encouraged in order to alleviate the symptoms of eutrophication. Most examples involve fish and grazing zooplankton, though mussels have also been used. In many deeper lakes the technique requires continuous management and is not a substitute for nutrient control. In some lakes, alterations to the lake environment have given longer-term positive effects, and in some shallow lakes biomanipulation may be essential, alongside nutrient control, in re-establishing former aquatic-plant-dominated ecosystems that have been lost through severe eutrophication. The emergence of biomanipulation techniques emphasises that lake systems are not simply chemical reactors that respond predictably to engineered chemical changes, but very complex and still imperfectly understood ecosystems that require a deeper understanding before they can be restored with certainty.
VERTICAL INTEGRATION AND TRADE POLICY: THE CASE OF SUGAR
The degree of vertical integration in the U.S. sugar industry between raw sugar processing and sugar refining cannot be explained using theories of vertical integration based on transaction costs (e.g. Williamson). We graphically decompose the economic rents accruing to each level in the marketing channel. Different strategies of several major sugar producing, processing and refining entities with regard to sugar quota policy are explored.
Agribusiness; Industrial Organization
The Effect of Consumption Based Taxes on Agriculture in the United States
Recently several proposals have arisen to replace the current income tax system in the United States with a consumption-based or "Fair Tax". This study investigates the effect of such a consumption-based tax on agricultural investment decisions, using stochastic optimal control to model the investment decision at the farm level. The results indicate that a consumption tax rate of 25.9 percent would be equivalent to the income tax rate paid by very large producers in the United States.
Public Economics
Rethinking the Role of History in Law & Economics: The Case of the Federal Radio Commission in 1927
In the study of law and economics, there is a danger that historical inferences from theory may infect historical tests of theory. It is imperative, therefore, that historical tests always involve a vigorous search not only for confirming evidence, but for disconfirming evidence as well. We undertake such a search in the context of a single well-known case: the Federal Radio Commission's (FRC's) 1927 decision not to expand the broadcast radio band. The standard account of this decision holds that incumbent broadcasters opposed expansion (to avoid increased competition) and succeeded in capturing the FRC. Although successful broadcaster opposition may be taken as confirming evidence for this interpretation, our review of the record reveals even stronger disconfirming evidence. In particular, we find that every major interest group, not just radio broadcasters, publicly opposed expansion of the band in 1927, and that broadcasters themselves were divided at the FRC's hearings.
New vaccinia virus recombination plasmids incorporating a synthetic late promoter for high level expression of foreign proteins
The Effect of US Energy Policy and Farm Program Payments on the Bio-Fuel Sector: A Regime-Switching Approach
Resource/Energy Economics and Policy
Rigidity and stability of cold dark solid universe model
Observational evidence suggests that the large scale dynamics of the universe is presently dominated by dark energy, meaning a non-luminous cosmological constituent with a negative value of the pressure to density ratio w = p/rho, which would be unstable if purely fluid, but could be stable if effectively solid with sufficient rigidity. It was suggested by Bucher and Spergel that such a solid constituent might be constituted by an effectively cold (meaning approximately static) distribution of cosmic strings with w = -1/3, or membranes with the observationally more favoured value w = -2/3, but it was not established whether the rigidity in such models actually would be sufficient for stabilisation. The present article provides an explicit evaluation of the rigidity to density ratio, which is shown to take the same value in both the string and membrane cases, and it is confirmed that this is indeed sufficient for stabilisation.
Comment: 6 pages LaTeX, revised version extended to include 4 figures
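A brief context note, not part of the abstract above: the quoted equation-of-state values follow from the standard orientation-averaging argument for static defect networks of the kind Bucher and Spergel proposed.

```latex
% Context note (not from the paper): orientation-averaged equations of state
% for static defect networks, as in the Bucher-Spergel proposal.
% A static string exerts pressure -rho along its length and zero transversely,
% so averaging over random orientations gives p = -rho/3.
% A static membrane exerts pressure -rho in its two tangential directions and
% zero in the normal direction, giving p = -(2/3) rho on average.
\[
  w_{\text{string}} = \frac{p}{\rho} = -\frac{1}{3},
  \qquad
  w_{\text{membrane}} = \frac{p}{\rho} = -\frac{2}{3}.
\]
```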
Gaugino condensation in an improved heterotic M-theory
Gaugino condensation is discussed in the context of a consistent new version of low energy heterotic M-theory. The four dimensional reduction of the theory is described, based on simple boson and fermion backgrounds. This is generalised to include gaugino condensates and various background fluxes, some with non-trivial topology. It is found that condensate and quantised flux contributions to the four-dimensional superpotential contain no corrections due to the warping of the higher dimensional metric.
Comment: 11 pages, 4 figures, LaTeX
Beltway: Getting Around Garbage Collection Gridlock
We present the design and implementation of a new garbage collection framework that significantly generalizes existing copying collectors. The Beltway framework exploits and separates object age and incrementality. It groups objects in one or more increments on queues called belts, collects belts independently, and collects increments on a belt in first-in-first-out order. We show that Beltway configurations, selected by command-line options, act and perform the same as semi-space, generational, and older-first collectors, and encompass all previous copying collectors of which we are aware. The increasing reliance on garbage collected languages such as Java requires that the collector perform well. We show that the generality of Beltway enables us to design and implement new collectors that are robust to variations in heap size and improve total execution time over the best generational copying collectors of which we are aware by up to 40%, and on average by 5 to 10%, for small to moderate heap sizes. New garbage collection algorithms are rare, and yet we define not just one, but a new family of collectors that subsumes previous work. This generality enables us to explore a larger design space and build better collectors.
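To illustrate the belt/increment organisation described in the abstract, here is a minimal sketch only, not the authors' implementation; every class and method name below is invented for illustration.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Hypothetical sketch of the Beltway organisation described above:
// objects live in "increments", increments are queued FIFO on "belts",
// and each belt is collected independently, oldest increment first.
final class Increment {
    final List<Object> objects = new ArrayList<>();
}

final class Belt {
    private final Deque<Increment> increments = new ArrayDeque<>();

    void addIncrement(Increment inc) {
        increments.addLast(inc);       // newest increment joins the tail
    }

    Increment nextToCollect() {
        return increments.pollFirst(); // collected in first-in-first-out order
    }
}

final class BeltwaySketch {
    private final List<Belt> belts = new ArrayList<>();

    BeltwaySketch(int beltCount) {
        for (int i = 0; i < beltCount; i++) belts.add(new Belt());
    }

    // Each belt can be collected independently of the others.
    void collectBelt(int beltIndex) {
        Increment victim = belts.get(beltIndex).nextToCollect();
        if (victim != null) {
            // A real collector would copy survivors into another increment or
            // belt; here we simply drop the increment to mark the step.
            victim.objects.clear();
        }
    }
}
```

Different choices of belt count and increment size are what let one framework mimic semi-space, generational, and older-first behaviour, which is the generality the abstract emphasises.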
