Modelling the US Federal Spending Process: Overview and Implications
The object of study is the US Federal budget process - an institutional process of increasing prominence in US and world affairs - which is unique in generating quantitative data for scholarly research. The authors first outline their rigorous but simple econometric models of how budget decisions are made, coordinated, and implemented, and then trace the implications of their high-inertia view of the process for the US economic cycle. They propound a presidential and Congressional ambition model of current and postwar cyclical economic difficulties, including stagflation, in terms of a macroeconomic model of the US economy in which federal government expenditure is endogenous. The chapter concludes with speculation on the disastrous consequences for society of the growth of a sluggishly adaptable bureaucratic process operating in a rapidly changing economic and social environment.
Modeling performance of Hadoop applications: A journey from queueing networks to stochastic well formed nets
Nowadays, many enterprises commit to the extraction of actionable knowledge from huge datasets as part of their core business activities. Applications belong to very different domains, such as fraud detection or one-to-one marketing, and encompass business analytics and support to decision making in both private and public sectors. In these scenarios, a central place is held by the MapReduce framework, and in particular by its open-source implementation, Apache Hadoop. In such environments, new challenges arise in the area of job performance prediction, with the need to provide Service Level Agreement guarantees to the end-user and to avoid wasting computational resources. In this paper we provide performance analysis models to estimate MapReduce job execution times in Hadoop clusters governed by the YARN Capacity Scheduler. We propose models of increasing complexity and accuracy, ranging from queueing networks to stochastic well-formed nets, able to estimate job performance under a number of scenarios of interest, including unreliable resources. The accuracy of our models is evaluated on the TPC-DS industry benchmark, running experiments on Amazon EC2 and at the CINECA Italian supercomputing center. The results show that the average accuracy we can achieve is in the range 9–14%.
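To build intuition for why container scheduling dominates MapReduce job latency, the following is a minimal back-of-the-envelope sketch (not one of the paper's queueing-network or stochastic well-formed-net models): tasks run in waves bounded by the number of available YARN containers. The function name and parameters are illustrative assumptions.

```python
import math

def mapreduce_job_time(n_map, n_reduce, t_map, t_reduce, containers):
    """Coarse estimate of job execution time under a fixed container pool.

    A phase with n tasks runs in ceil(n / containers) sequential "waves",
    each wave taking roughly one average task duration. This ignores the
    contention and variability that the paper's models capture.
    """
    map_waves = math.ceil(n_map / containers)
    reduce_waves = math.ceil(n_reduce / containers)
    return map_waves * t_map + reduce_waves * t_reduce

# e.g. 400 map tasks of 20 s, 60 reduce tasks of 45 s, 100 containers
estimate = mapreduce_job_time(400, 60, 20.0, 45.0, 100)  # 4*20 + 1*45 = 125 s
```

Such a deterministic bound is only a starting point; queueing models refine it by treating task service times and resource availability as stochastic.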
Multilocation Corn Stover Harvest Effects on Crop Yields and Nutrient Removal
Corn (Zea mays L.) stover was identified as an important feedstock for cellulosic bioenergy production because of the extensive area upon which the crop is already grown. This report summarizes 239 site-years of field research examining effects of zero, moderate, and high stover removal rates at 36 sites in seven different states. Grain and stover yields from all sites, as well as N, P, and K removal from 28 sites, are summarized for nine longitude and six latitude bands, two tillage practices (conventional vs no tillage), two stover-harvest methods (machine vs calculated), and two crop rotations {continuous corn (maize) vs corn/soybean [Glycine max (L.) Merr.]}. Mean grain yields ranged from 5.0 to 12.0 Mg ha−1 (80 to 192 bu ac−1). Harvesting an average of 3.9 or 7.2 Mg ha−1 (1.7 or 3.2 tons ac−1) of the corn stover resulted in a slight increase in grain yield at 57 and 51% of the sites, respectively. Average no-till grain yields were significantly lower than with conventional tillage when stover was not harvested, but not when it was collected. Plant samples collected between physiological maturity and combine harvest showed that, compared to not harvesting stover, N, P, and K removal was increased by 24, 2.7, and 31 kg ha−1, respectively, with moderate (3.9 Mg ha−1) harvest and by 47, 5.5, and 62 kg ha−1, respectively, with high (7.2 Mg ha−1) removal. These data will be useful for verifying simulation models and available corn stover feedstock projections, but are too variable for planning site-specific stover harvest.
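The per-hectare removal figures above can be normalized to a per-megagram rate, which is how nutrient replacement costs are typically budgeted. A small sketch of that arithmetic (the function name is illustrative; the numbers are the abstract's reported values):

```python
def removal_per_mg(nutrient_kg_ha, stover_mg_ha):
    """kg of nutrient exported per Mg (metric ton) of stover harvested."""
    return nutrient_kg_ha / stover_mg_ha

# Moderate harvest (3.9 Mg ha^-1) removed 24, 2.7, and 31 kg ha^-1 of N, P, K:
n_rate = removal_per_mg(24, 3.9)   # ~6.2 kg N per Mg of stover
k_rate = removal_per_mg(31, 3.9)   # ~7.9 kg K per Mg of stover
```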
Evaluation of Rehabilitation of Memory in Neurological Disabilities (ReMiND): a randomized controlled trial
OBJECTIVE: The evidence for the effectiveness of memory rehabilitation is inconclusive. The aim was to compare the effectiveness of two group memory rehabilitation programmes with a self-help group control.
DESIGN: Single-blind randomized controlled trial.
PARTICIPANTS: Participants with memory problems following traumatic brain injury, stroke or multiple sclerosis were recruited from community settings.
INTERVENTIONS: Participants were randomly allocated, in cohorts of four, to compensation or restitution group treatment programmes or a self-help group control. All programmes were manual-based and comprised two individual and ten weekly group sessions.
MAIN MEASURES: Memory functions, mood, and activities of daily living were assessed at baseline and at five and seven months after randomization.
RESULTS: There were 72 participants (mean age 47.7 years, SD 10.2; 32 men). There was no significant effect of treatment on the Everyday Memory Questionnaire (P = 0.97). At seven months the mean scores were comparable (restitution 36.6, compensation 41.0, self-help 44.1). However, there was a significant difference between groups on the Internal Memory Aids Questionnaire (P = 0.002). The compensation and restitution groups each used significantly more internal memory aids than the self-help group (P < 0.05).
CONCLUSIONS: These results show few statistically significant effects of either compensation or restitution memory group treatment compared with a self-help group control. Further randomized trials of memory rehabilitation are needed.
Rapidity and Centrality Dependence of Proton and Anti-proton Production from Au+Au Collisions at sqrt(sNN) = 130GeV
We report on the rapidity and centrality dependence of proton and anti-proton transverse mass distributions from Au+Au collisions at sqrt(sNN) = 130 GeV as measured by the STAR experiment at RHIC. Our results are from the rapidity and transverse momentum range of |y| < 0.5 and 0.35 < p_t < 1.00 GeV/c. For both protons and anti-protons, transverse mass distributions become more convex from peripheral to central collisions, demonstrating characteristics of collective expansion. The measured rapidity distributions and the mean transverse momenta versus rapidity are flat within |y| < 0.5. Comparisons of our data with results from model calculations indicate that, in order to obtain a consistent picture of the proton (anti-proton) yields and transverse mass distributions, the possibility of pre-hadronic collective expansion may have to be taken into account.
Comment: 4 pages, 3 figures, 1 table, submitted to PR
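The transverse-mass variable in which these spectra are reported is a standard kinematic quantity, m_T = sqrt(p_T^2 + m^2). A minimal sketch of that relation (identifier names are illustrative; the proton mass is the PDG value):

```python
import math

PROTON_MASS_GEV = 0.938272  # proton rest mass, GeV/c^2

def transverse_mass(pt_gev, m_gev=PROTON_MASS_GEV):
    """m_T = sqrt(p_T^2 + m^2); spectra are usually plotted vs m_T - m,
    where collective radial flow flattens the low-m_T region."""
    return math.sqrt(pt_gev**2 + m_gev**2)

# Over the measured range 0.35 < p_t < 1.00 GeV/c:
mt_low = transverse_mass(0.35)
mt_high = transverse_mass(1.00)
```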
Global circulation patterns of seasonal influenza viruses vary with antigenic drift.
Understanding the spatiotemporal patterns of emergence and circulation of new human seasonal influenza virus variants is a key scientific and public health challenge. The global circulation patterns of influenza A/H3N2 viruses are well characterized, but the patterns of A/H1N1 and B viruses have remained largely unexplored. Here we show that the global circulation patterns of A/H1N1 (up to 2009), B/Victoria, and B/Yamagata viruses differ substantially from those of A/H3N2 viruses, on the basis of analyses of 9,604 haemagglutinin sequences of human seasonal influenza viruses from 2000 to 2012. Whereas genetic variants of A/H3N2 viruses did not persist locally between epidemics and were reseeded from East and Southeast Asia, genetic variants of A/H1N1 and B viruses persisted across several seasons and exhibited complex global dynamics with East and Southeast Asia playing a limited role in disseminating new variants. The less frequent global movement of influenza A/H1N1 and B viruses coincided with slower rates of antigenic evolution, lower ages of infection, and smaller, less frequent epidemics compared to A/H3N2 viruses. Detailed epidemic models support differences in age of infection, combined with the less frequent travel of children, as probable drivers of the differences in the patterns of global circulation, suggesting a complex interaction between virus evolution, epidemiology, and human behaviour.
T.B. was supported by a Newton International Fellowship from the Royal Society and through NIH U54 GM111274. S.R. was supported by MRC (UK, Project MR/J008761/1), Wellcome Trust (UK, Project 093488/Z/10/Z), Fogarty International Centre (USA, R01 TW008246-01), DHS (USA, RAPIDD program), NIGMS (USA, MIDAS U01 GM110721-01) and NIHR (UK, Health Protection Research Unit funding). The Melbourne WHO Collaborating Centre for Reference and Research on Influenza was supported by the Australian Government Department of Health and thanks N. Komadina and Y.-M. Deng. The Atlanta WHO Collaborating Center for Surveillance, Epidemiology and Control of Influenza was supported by the U.S. Department of Health and Human Services. NIV thanks A.C. Mishra, M. Chawla-Sarkar, A.M. Abraham, D. Biswas, S. Shrikhande, AnuKumar B, and A. Jain. Influenza surveillance in India was expanded, in part, through US Cooperative Agreements (5U50C1024407 and U51IP000333) and by the Indian Council of Medical Research. M.A.S. was supported through NSF DMS 1264153 and NIH R01 AI 107034. Work of the WHO Collaborating Centre for Reference and Research on Influenza at the MRC National Institute for Medical Research was supported by U117512723. P.L., A.R. & M.A.S. were supported by EU Seventh Framework Programme [FP7/2007-2013] under Grant Agreement no. 278433-PREDEMICS and ERC Grant agreement no. 260864. C.A.R. was supported by a University Research Fellowship from the Royal Society.
Reciprocity as a foundation of financial economics
This paper argues that the fundamental theorem of contemporary financial mathematics rests on the ethical concept of 'reciprocity'. The argument is based on identifying an equivalence between the contemporary, and ostensibly 'value neutral', Fundamental Theorem of Asset Pricing and theories of mathematical probability that emerged in the seventeenth century in the context of the ethical assessment of commercial contracts within a framework of Aristotelian ethics. This observation, the main claim of the paper, is justified on the basis of results from the Ultimatum Game and is analysed within a framework of Pragmatic philosophy. The analysis leads to the explanatory hypothesis that markets are centres of communicative action with reciprocity as a rule of discourse. The purpose of the paper is to reorientate financial economics to emphasise the objectives of cooperation and social cohesion, and to this end we offer specific policy advice.
Hierarchical Bayesian estimation and hypothesis testing for delay discounting tasks
A state-of-the-art data analysis procedure is presented to conduct hierarchical Bayesian inference and hypothesis testing on delay discounting data. The delay discounting task is a key experimental paradigm used across a wide range of disciplines, from economics and cognitive science to neuroscience, all of which seek to understand how humans or animals trade off the immediacy versus the magnitude of a reward. Bayesian estimation allows rich inferences to be drawn, along with measures of confidence, based upon limited and noisy behavioural data. Hierarchical modelling allows more precise inferences to be made, thus using sometimes expensive or difficult-to-obtain data in the most efficient way. The proposed probabilistic generative model describes how participants compare the present subjective value of reward choices on a trial-to-trial basis, and estimates participant- and group-level parameters. We infer discount rate as a function of reward size, allowing the magnitude effect to be measured. Demonstrations are provided to show how this analysis approach can aid hypothesis testing. The analysis is demonstrated on data from the popular 27-item monetary choice questionnaire (Kirby, Psychonomic Bulletin & Review, 16(3), 457–462, 2009), but will accept data from a range of protocols, including adaptive procedures. The software is made freely available to researchers.
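The core of such a generative model is the hyperbolic discount function commonly fitted to Kirby questionnaire data, V = A / (1 + kD), combined with a probabilistic choice rule. A minimal sketch of the likelihood for one trial (the logistic choice rule and its `alpha` sharpness parameter are an assumed, common parameterisation, not necessarily the paper's exact model):

```python
import math

def subjective_value(amount, delay_days, k):
    """Hyperbolic discounting: V = A / (1 + k * D)."""
    return amount / (1.0 + k * delay_days)

def p_choose_delayed(immediate, delayed, delay_days, k, alpha=1.0):
    """Probability of choosing the delayed reward: a logistic function
    of the difference in subjective values; alpha controls decision noise."""
    dv = subjective_value(delayed, delay_days, k) - immediate
    return 1.0 / (1.0 + math.exp(-alpha * dv))

# A participant with k = 0.01/day valuing $100 in 30 days:
v = subjective_value(100, 30, 0.01)  # 100 / 1.3 ~ $76.9 now
```

In the hierarchical setting, each participant's k (and noise parameter) is drawn from a group-level distribution, which is what lets sparse individual data borrow statistical strength from the group.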
Phytostabilization of metals in mine soils using Brassica juncea in combination with organic amendments
Background and aims The high metal bioavailability and the poor conditions of mine soils yield a low plant biomass, limiting the application of phytoremediation techniques. A greenhouse experiment was performed to evaluate the effects of organic amendments on metal stabilization and the potential of Brassica juncea L. for phytostabilization in mine soils. Methods Plants were grown in pots filled with soils collected from two mine sites located in Central Spain, mixed with 0, 30 and 60 t ha−1 of pine bark compost and horse- and sheep-manure compost. Plant biomass and metal concentrations in roots and shoots were measured. Metal bioavailability was assessed using a rhizosphere-based method (rhizo), which consists of a mixture of low-molecular-weight organic acids to simulate root exudates. Results Manure reduced metal concentrations in shoots (10–50% reduction of Cu and 40–80% of Zn in comparison with non-amended soils), bioconcentration factor (10–50% of Cu and 40–80% of Zn) and metal bioavailability in soil (40–50% of Cu and 10–30% of Zn) due to the high pH and the contribution of organic matter. Manure improved soil fertility and was also able to increase plant biomass (5–20 times in shoots and 3–30 times in roots), which resulted in a greater amount of metals removed from soil and accumulated in roots (increase of 2–7 times of Cu and Zn). Plants grown in pine bark treatments and in non-amended soils showed a limited biomass and high metal concentrations in shoots. Conclusions The addition of manure could be effective for the stabilization of metals and for enhancing the phytostabilization ability of B. juncea in mine soils. In this study, this species proved to be a potential candidate for phytostabilization in combination with manure, differing from previous results, in which B. juncea had been recognized as a phytoextraction plant.
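The bioconcentration factor reduced by the manure treatments is a simple shoot-to-soil concentration ratio. A small sketch of that calculation (function and variable names are illustrative; the example values are hypothetical, not the study's data):

```python
def bioconcentration_factor(shoot_conc_mg_kg, soil_conc_mg_kg):
    """BCF = metal concentration in shoots / concentration in soil.
    Values well below 1 indicate the metal-excluder behaviour desired
    for phytostabilization; values above 1 suggest phytoextraction."""
    return shoot_conc_mg_kg / soil_conc_mg_kg

# Hypothetical: 20 mg/kg Cu in shoots vs 100 mg/kg Cu in soil
bcf_cu = bioconcentration_factor(20.0, 100.0)  # 0.2 -> excluder-like
```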
