LUCIAE 3.0: A new version of a computer program for Firecracker Model and rescattering in relativistic heavy-ion collisions
LUCIAE is a Monte Carlo program that, connected to FRITIOF, implements both
the Firecracker Model (FCM), a possible mechanism for collective multi-gluon
emission from the colour fields of interacting strings, and the reinteraction
of the final state hadrons in relativistic heavy ion collisions. This paper
includes a brief presentation of the dynamics of LUCIAE with an emphasis on the
new features in this version, as well as a description of the program.
Comment: LaTeX, no figures
Quantum anti-Zeno effect without wave function reduction
We study the measurement-induced enhancement of the spontaneous decay (called
quantum anti-Zeno effect) for a two-level subsystem, where measurements are
treated as couplings between the excited state and an auxiliary state rather
than von Neumann wave-function reduction. The photon radiated in a fast
decay of the atom, from the auxiliary state to the excited state, triggers a
quasi-measurement, as opposed to a projection measurement. Our use of the term
"quasi-measurement" refers to a "coupling-based measurement". Such frequent
quasi-measurements result in an exponential decay of the survival probability
of the atomic initial state, with a photon emission following each
quasi-measurement. Our calculations show that the effective decay rate is of
the same form as the one based on projection measurements. More importantly,
the survival probability of the atomic initial state which is
obtained by tracing over all the photon states is equivalent to the survival
probability of the atomic initial state with a photon emission following each
quasi-measurement to the order under consideration. That is because the
contributions from those states with photon number less than the number of
quasi-measurements originate from higher-order processes.
Comment: 7 pages, 3 figures
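As a sketch of what "the same form" typically means in this literature, the standard Kofman–Kurizki overlap expression (assumed here; the abstract does not spell it out, and the symbols below are illustrative) gives the effective decay rate under measurements repeated at interval τ:

```latex
% Effective decay rate as the overlap of the reservoir coupling
% spectrum G(\omega) with the measurement-induced broadening F(\omega)
% of the level \omega_0, of width \sim 1/\tau (assumed form, not
% taken from the abstract):
R_{\mathrm{eff}} = 2\pi \int_{-\infty}^{\infty} G(\omega)\, F(\omega)\, \mathrm{d}\omega ,
\qquad
F(\omega) = \frac{\tau}{2\pi}\,
            \operatorname{sinc}^{2}\!\Bigl(\tfrac{(\omega-\omega_{0})\,\tau}{2}\Bigr)
```

Frequent measurements (small τ) broaden F(ω); when this broadening increases the overlap with G(ω), the decay is enhanced, which is the anti-Zeno regime the abstract describes.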
A universal model for mobility and migration patterns
Introduced in its contemporary form by George Kingsley Zipf in 1946, but with
roots that go back to the work of Gaspard Monge in the 18th century, the
gravity law is the prevailing framework to predict population movement, cargo
shipping volume, inter-city phone calls, as well as bilateral trade flows
between nations. Despite its widespread use, it relies on adjustable parameters
that vary from region to region and suffers from known analytic
inconsistencies. Here we introduce a stochastic process capturing local
mobility decisions that helps us analytically derive commuting and mobility
fluxes that require as input only information on the population distribution.
The resulting radiation model predicts mobility patterns in good agreement with
mobility and transport patterns observed in a wide range of phenomena, from
long-term migration patterns to communication volume between different regions.
Given its parameter-free nature, the model can be applied in areas where we
lack previous mobility measurements, significantly improving the predictive
accuracy of most phenomena affected by mobility and transport processes.
Comment: Main text and supplementary information
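The radiation model's flux has a closed, parameter-free form. A minimal sketch, using the standard radiation-model expression with invented toy populations for illustration:

```python
# Radiation model: expected commuter flux from location i to location j.
#   T_ij = T_i * m_i * n_j / ((m_i + s_ij) * (m_i + n_j + s_ij))
# m_i : population of the origin i
# n_j : population of the destination j
# s_ij: total population inside the circle of radius r_ij centred on i,
#       excluding the origin and destination populations
# T_i : total number of commuters leaving i

def radiation_flux(T_i, m_i, n_j, s_ij):
    """Expected flux from i to j; needs only population data, no fitted parameters."""
    return T_i * m_i * n_j / ((m_i + s_ij) * (m_i + n_j + s_ij))

# Toy example (invented numbers): a town of 10,000 sending 1,000 commuters
# toward a town of 5,000, with 20,000 people living in between.
flux = radiation_flux(T_i=1000, m_i=10000, n_j=5000, s_ij=20000)
print(round(flux, 1))  # -> 47.6
```

Note that the only inputs are populations and a total outflow, which is exactly the parameter-free property the abstract emphasizes.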
Design agency: prototyping multi-agent systems in architecture
This paper presents research on the prototyping of multi-agent systems for architectural design. It proposes a design exploration methodology at the intersection of architecture, engineering, and computer science. The motivation of the work includes exploring bottom-up generative methods coupled with optimizing performance criteria, including geometric complexity and objective functions for environmental, structural and fabrication parameters. The paper presents the development of a research framework and initial experiments to provide design solutions that simultaneously satisfy complexly coupled and often contradictory objectives. The prototypical experiments and initial algorithms are described through a set of design cases and agents within this framework: the generation of façade panels for light control; emergent design of shell structures; actual construction of reciprocal frames; and robotic fabrication. Initial results include multi-agent-derived efficiencies for environmental and fabrication criteria and a discussion of future steps for the inclusion of human and structural factors.
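As a hedged illustration of the bottom-up idea (the update rule, the daylight target, and the panel setup are invented for this sketch and are not from the paper): each façade-panel agent locally nudges its aperture toward a blend of its neighbours' state and a global performance target, and a useful global configuration emerges without central control.

```python
# Minimal multi-agent sketch: panel agents on a ring, each adjusting its
# aperture (0 = closed, 1 = open) using only local information.
# All rules and parameters here are illustrative, not from the paper.

def step(apertures, target=0.5, rate=0.25):
    """One synchronous update of all panel agents."""
    n = len(apertures)
    new = []
    for i, a in enumerate(apertures):
        left, right = apertures[(i - 1) % n], apertures[(i + 1) % n]
        neighbour_mean = (left + right) / 2.0
        # Each agent moves toward a blend of local consensus and the
        # global daylight target.
        goal = 0.5 * neighbour_mean + 0.5 * target
        new.append(min(1.0, max(0.0, a + rate * (goal - a))))
    return new

panels = [0.0, 1.0, 0.2, 0.9]  # initially disordered apertures
for _ in range(50):
    panels = step(panels)
# After many steps the apertures settle near the daylight target.
```

The point of the sketch is only the mechanism: purely local rules converging to a globally coherent configuration, which is the generative behaviour the paper's agent systems exploit at much higher complexity.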
Peer influence in network markets: a theoretical and empirical analysis
Network externalities spur the growth of networks and the adoption of network goods in two ways. First, they make it more attractive to join a network the larger its installed base. Second, they create incentives for network members to actively recruit new members. Despite indications that the latter "peer effect" can be more important for network growth than the installed-base effect, it has so far been largely ignored in the literature. We address this gap using game-theoretical models. When all early adopters can band together to exert peer influence (an assumption that fits, e.g., the case of firms supporting a technical standard), we find that the peer effect induces additional growth of the network by a multiplicative factor. When, in contrast, individuals exert peer influence in small groups of size n, the increase in network size is by an additive constant, which, for small networks, can amount to a large relative increase. The difference between small, local, personal networks and large, global, anonymous networks arises endogenously from our analysis. Fundamentally, the first type of network is "tie-reinforcing", the other "tie-creating". We use survey data from users of the Internet services Skype and eBay to illustrate the main logic of our theoretical results. As predicted by the model, we find that the peer effect matters strongly for the network of Skype users, which effectively consists of numerous small sub-networks, but not for that of eBay users. Since many network goods give rise to small, local networks
The Origin of Minus-end Directionality and Mechanochemistry of Ncd Motors
Adaptation of molecular structure to the ligand chemistry and interaction with the cytoskeletal filament are key to understanding the mechanochemistry of molecular motors. Despite a striking structural similarity with kinesin-1, which moves towards the plus-end, Ncd motors exhibit minus-end directionality on microtubules (MTs). Here, by employing a structure-based model of protein folding, we show that a simple repositioning of the neck-helix makes the dynamics of Ncd non-processive and minus-end directed, as opposed to kinesin-1. Our computational model shows that Ncd in solution can have both symmetric and asymmetric conformations with disparate ADP binding affinity, also revealing a strong correlation between distortion of the motor head and decreased ADP binding affinity in the asymmetric state. The nucleotide (NT) free-ADP (φ-ADP) state bound to MTs favors the symmetric conformation, whose coiled-coil stalk points to the plus-end. Upon ATP binding, an enhanced flexibility near the head-neck junction region, which we have identified as the important structural element for directional motility, reorients the coiled-coil stalk towards the minus-end by stabilizing the asymmetric conformation. The minus-end directionality of the Ncd motor is a remarkable example of how motor proteins in the kinesin superfamily diversify their functions by simply rearranging structural elements peripheral to the catalytic motor head domain.
What explains Cambodia’s success in reducing child stunting, 2000–2014?
In many developing countries, high levels of child undernutrition persist alongside rapid economic growth. There is considerable interest in the study of countries that have made rapid progress in child nutrition, to uncover the driving forces behind these improvements. Cambodia is often cited as a success case, having reduced the incidence of child stunting from 51% to 34% over the period 2000 to 2014. To what extent is this success driven by improvements in the underlying determinants of nutrition, such as wealth and education (“covariate effects”), and to what extent by changes in the strength of association between these determinants and nutrition outcomes (“coefficient effects”)? Using determinants derived from the widely applied UNICEF framework for the analysis of child nutrition and data from four Demographic and Health Surveys, we apply quantile-regression-based decomposition methods to quantify the covariate and coefficient contributions to this improvement in child nutrition. The method allows the covariate and coefficient effects to vary across the entire distribution of child nutrition outcomes. There are important differences in the drivers of improvements in child nutrition between severely stunted and moderately stunted children and between rural and urban areas. The translation of improvements in household endowments, characteristics and practices into improvements in child nutrition (the coefficient effects) may be influenced by macroeconomic shocks or other events, such as natural calamities or civil disturbance, and may vary substantially over different time periods. Our analysis also highlights the need to explicitly examine the contribution of targeted child health and nutrition interventions to improvements in child nutrition in developing countries.
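The covariate/coefficient split can be illustrated at the mean with a simple Oaxaca–Blinder-style decomposition. The paper works across quantiles; this NumPy sketch, with invented survey data, shows only the mean-level idea:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_ols(X, y):
    """OLS coefficients, intercept column prepended."""
    Xc = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    return beta

# Invented data for two survey rounds: outcome = height-for-age score,
# covariates = (wealth, maternal education).
X0 = rng.normal(0.0, 1.0, size=(500, 2))   # round "2000"
X1 = rng.normal(0.4, 1.0, size=(500, 2))   # round "2014": better endowments
y0 = -1.8 + X0 @ np.array([0.3, 0.2]) + rng.normal(0, 0.5, 500)
y1 = -1.5 + X1 @ np.array([0.5, 0.3]) + rng.normal(0, 0.5, 500)  # stronger returns

b0, b1 = fit_ols(X0, y0), fit_ols(X1, y1)
m0 = np.concatenate([[1.0], X0.mean(axis=0)])
m1 = np.concatenate([[1.0], X1.mean(axis=0)])

gap = y1.mean() - y0.mean()
covariate_effect = (m1 - m0) @ b0    # change in endowments, old returns
coefficient_effect = m1 @ (b1 - b0)  # change in returns, new endowments

# With OLS, the two components sum exactly to the total change in means.
print(gap, covariate_effect + coefficient_effect)
```

The quantile-based method in the paper generalizes exactly this split so that both components can differ for severely versus moderately stunted children.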
Combining vitamin C and carotenoid biomarkers better predicts fruit and vegetable intake than individual biomarkers in dietary intervention studies.
The aim of this study was to determine whether combining potential biomarkers of fruit and vegetables (FV) predicts FV intake within FV intervention studies better than single biomarkers do.
The feasibility and utility of grocery receipt analyses for dietary assessment
OBJECTIVE: To establish the feasibility and utility of a simple data collection methodology for dietary assessment. DESIGN: Using a cross-sectional design, trained data collectors approached adults (~20–40 years of age) at local grocery stores and asked whether they would volunteer their grocery receipts and answer a few questions for a small stipend ($1). METHODS: The grocery data were divided into 3 categories: "fats, oils, and sweets," "processed foods," and "low-fat/low-calorie substitutions," expressed as a percentage of the total food purchase price. The questions assessed the shopper's general eating habits (e.g., fast-food consumption) and a few demographic characteristics and health aspects (e.g., perception of body size). STATISTICAL ANALYSES: Descriptive and analytic analyses using non-parametric tests were conducted in SAS. RESULTS: Forty-eight receipts and questionnaires were collected. Nearly every respondent reported eating fast food at least once per month; 27% ate out once or twice a day. Frequency of fast-food consumption was positively related to perceived body size of the respondent (p = 0.02). Overall, 30% of the food purchase price was for fats, oils, and sweets; 10% was for processed foods; and almost 6% was for low-fat/low-calorie substitutions. Households where no one was perceived to be overweight spent a smaller proportion of their food budget on fats, oils, and sweets than did households where at least one person was perceived to be overweight (p = 0.10); households where the spouse was not perceived to be overweight spent less on fats, oils, and sweets (p = 0.02) and more on low-fat/low-calorie substitutions (p = 0.09) than did households where the spouse was perceived to be overweight; and respondents who perceived themselves to be overweight spent more on processed foods than did respondents who did not perceive themselves to be overweight (p = 0.06).
CONCLUSION: This simple dietary assessment method, although global in nature, may be a useful indicator of dietary practices, as evidenced by its association with perceived weight status.
The 100 most cited articles investigating the radiological staging of oesophageal and junctional cancer: a bibliometric analysis
Objectives
Accurate staging of oesophageal cancer (OC) is vital. Bibliometric analysis highlights key topics and publications that have shaped understanding of a subject. The 100 most cited articles investigating radiological staging of OC are identified.
Methods
The Thomson Reuters Web of Science database was searched with terms including “CT, PET, EUS, oesophageal and gastro-oesophageal junction cancer” to identify all English-language, full-script articles. The 100 most cited articles were further analysed by topic, journal, author, year and institution.
Results
A total of 5,500 eligible papers were returned. The most cited paper was Flamen et al. (n = 306), investigating the utility of positron emission tomography (PET) for the staging of patients with potentially operable OC. The most common research topic was accuracy of staging investigations (n = 63). The article with the highest citation rate (38.00), defined as the number of citations divided by the number of complete years published, was Tixier et al. investigating PET texture analysis to predict treatment response to neo-adjuvant chemo-radiotherapy, cited 114 times since publication in 2011.
Conclusion
This bibliometric analysis has identified key publications regarded as important in radiological OC staging. Articles with the highest citation rates all investigated PET imaging, suggesting this modality could be the focus of future research.
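The citation-rate metric defined in the Results is simple to reproduce. A minimal sketch (the function name is invented; the analysis year of 2014 is an assumption, consistent with 114 citations over 3 complete years giving 38.00):

```python
def citation_rate(citations, year_published, analysis_year):
    """Citations divided by the number of complete years since publication."""
    complete_years = analysis_year - year_published
    return citations / complete_years

# Tixier et al.: cited 114 times since publication in 2011.
print(citation_rate(114, 2011, 2014))  # -> 38.0
```

Normalizing by complete years, as here, is what lets a 2011 paper outrank older, more-cited papers such as Flamen et al. (306 citations).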
