2,802 research outputs found
An alternative procedure for selecting a good value for the parameter c in RBF-interpolation
The impact of the scaling parameter c on the accuracy of interpolation schemes using radial basis functions (RBFs) has been pointed out by several authors. Rippa (Adv Comput Math 11:193-210, 1999) proposes an algorithm based on the idea of cross validation for selecting a good value of this parameter. In this paper we present an alternative procedure that can be interpreted as a refinement of Rippa's algorithm for a cost function based on the Euclidean norm. We point out how this method is related to the procedure of maximum likelihood estimation, which is used for identifying covariance parameters of stochastic processes in spatial statistics. Using the same test functions as Rippa, we show that our algorithm compares favorably with cross validation in many cases, and we discuss its limitations. Finally, we present some computational aspects of our algorithm.
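The cross-validation cost that such procedures refine rests on Rippa's closed-form identity for the leave-one-out residuals of an RBF interpolant. A minimal sketch follows; the Gaussian kernel and all variable names are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def loocv_residuals(X, f, c):
    """Closed-form leave-one-out residuals for an RBF interpolant
    (Rippa's identity), sketched for a Gaussian kernel with shape
    parameter c. X: (n, d) centres, f: (n,) data values."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    A = np.exp(-(c ** 2) * d2)     # interpolation matrix A_ij = phi(|x_i - x_j|)
    Ainv = np.linalg.inv(A)
    lam = Ainv @ f                 # interpolation coefficients
    # k-th residual of the interpolant refitted without point k:
    return lam / np.diag(Ainv)

def cost(X, f, c):
    """Euclidean-norm cross-validation cost to be minimised over c."""
    return np.linalg.norm(loocv_residuals(X, f, c))
```

Scanning `cost` over a grid of candidate c values and taking the minimiser avoids solving n separate reduced interpolation problems per candidate, which is the practical appeal of the closed form.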
The extraordinary evolutionary history of the reticuloendotheliosis viruses
The reticuloendotheliosis viruses (REVs) comprise several closely related amphotropic retroviruses isolated from birds. These viruses exhibit several highly unusual characteristics that have not so far been adequately explained, including their extremely close relationship to mammalian retroviruses, and their presence as endogenous sequences within the genomes of certain large DNA viruses. We present evidence for an iatrogenic origin of REVs that accounts for these phenomena. Firstly, we identify endogenous retroviral fossils in mammalian genomes that share a unique recombinant structure with REVs—unequivocally demonstrating that REVs derive directly from mammalian retroviruses. Secondly, through sequencing of archived REV isolates, we confirm that contaminated Plasmodium lophurae stocks have been the source of multiple REV outbreaks in experimentally infected birds. Finally, we show that both phylogenetic and historical evidence support a scenario wherein REVs originated as mammalian retroviruses that were accidentally introduced into avian hosts in the late 1930s, during experimental studies of P. lophurae, and subsequently integrated into the fowlpox virus (FWPV) and gallid herpesvirus type 2 (GHV-2) genomes, generating recombinant DNA viruses that now circulate in wild birds and poultry. Our findings provide a novel perspective on the origin and evolution of REV, and indicate that horizontal gene transfer between virus families can expand the impact of iatrogenic transmission events.
Networked buffering: a basic mechanism for distributed robustness in complex adaptive systems
A generic mechanism - networked buffering - is proposed for the generation of robust traits in complex systems. It requires two basic conditions to be satisfied: 1) agents are versatile enough to perform more than one functional role within a system and 2) agents are degenerate, i.e. there exists partial overlap in the functional capabilities of agents. Given these prerequisites, degenerate systems can readily produce a distributed systemic response to local perturbations. Reciprocally, excess resources related to a single function can indirectly support multiple unrelated functions within a degenerate system. In models of genome:proteome mappings for which localized decision-making and modularity of genetic functions are assumed, we verify that such distributed compensatory effects cause enhanced robustness of system traits. The conditions needed for networked buffering to occur are neither demanding nor rare, supporting the conjecture that degeneracy may fundamentally underpin distributed robustness within several biotic and abiotic systems. For instance, networked buffering offers new insights into systems engineering and planning activities that occur under high uncertainty. It may also help explain recent developments in understanding the origins of resilience within complex ecosystems.
Modelling the nucleon wave function from soft and hard processes
Current light-cone wave functions for the nucleon are unsatisfactory since
they are in conflict with the data of the nucleon's Dirac form factor at large
momentum transfer. Therefore, we attempt a determination of a new wave function
respecting theoretical ideas on its parameterization and satisfying the
following constraints: It should provide a soft Feynman contribution to the
proton's form factor in agreement with data; it should be consistent with
current parameterizations of the valence quark distribution functions and
lastly it should provide an acceptable value for the J/\psi \to N \bar N decay
width. The latter process is calculated within the modified perturbative
approach to hard exclusive reactions. A simultaneous fit to the three sets of
data leads to a wave function whose x-dependent part, the distribution
amplitude, shows the same type of asymmetry as those distribution amplitudes
constrained by QCD sum rules. The asymmetry is, however, much more moderate than in
those amplitudes. Our distribution amplitude resembles the asymptotic one in
shape but the position of the maximum is somewhat shifted.
Comment: 32 pages RevTex + PS-file with 5 figures in uu-encoded, compressed file
Interleukin-6 gene (IL-6): a possible role in brain morphology in the healthy adult brain
Background: Cytokines such as interleukin 6 (IL-6) have been implicated in dual functions in neuropsychiatric disorders. Little is known about the genetic predisposition to neurodegenerative and neuroproliferative properties of cytokine genes. In this study the potential dual role of several IL-6 polymorphisms in brain morphology is investigated. Methodology: In a large sample of healthy individuals (N = 303), associations between genetic variants of IL-6 (rs1800795, rs1800796, rs2069833, rs2069840) and brain volume (gray matter volume) were analyzed using voxel-based morphometry (VBM). Selection of single nucleotide polymorphisms (SNPs) followed a tagging SNP approach (e.g., Stampa algorithm), capturing 97.08% of the variation in the IL-6 gene with four tagging SNPs. Principal findings/results: In a whole-brain analysis, the polymorphism rs1800795 (−174 C/G) showed a strong main effect of genotype (43 CC vs. 150 CG vs. 100 GG; x = 24, y = −10, z = −15; F(2,286) = 8.54, puncorrected = 0.0002; pAlphaSim-corrected = 0.002; cluster size k = 577) within the right hippocampus head. Homozygous carriers of the G-allele had significantly larger hippocampal gray matter volumes compared to heterozygous subjects. None of the other investigated SNPs showed a significant association with gray matter volume in whole-brain analyses. Conclusions/significance: These findings suggest a possible neuroprotective role of the G-allele of the SNP rs1800795 on hippocampal volumes. Studies on the role of this SNP in psychiatric populations, and especially in those with an affected hippocampus (e.g., from maltreatment or stress), are warranted.
Bernhard T Baune, Carsten Konrad, Dominik Grotegerd, Thomas Suslow, Eva Birosova, Patricia Ohrmann, Jochen Bauer, Volker Arolt, Walter Heindel, Katharina Domschke, Sonja Schöning, Astrid V Rauch, Christina Uhlmann, Harald Kugel and Udo Dannlowski
Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?
© 2013 Martin et al.; licensee BioMed Central Ltd. This article has been made available through the Brunel Open Access Publishing Fund. Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions.
Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data, and we recommend adopting a pragmatic, but scientifically better founded, approach to mixture risk assessment. Oak Foundation
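The dependence of probabilistic sub-factor multiplication on the chosen distributions can be illustrated with a small Monte Carlo sketch. Both distribution shapes below are arbitrary assumptions for illustration (each loosely centred on the classic sub-factor of 10), not taken from any guideline:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Two sub-factors modelled with two different shapes, both centred near 10
# (so their nominal product matches the default factor of 100):
logn = rng.lognormal(mean=np.log(10), sigma=0.4, size=(2, N)).prod(axis=0)
unif = rng.uniform(1, 19, size=(2, N)).prod(axis=0)

# Fraction of simulated sensitivity ratios covered by the default factor 100:
cover_logn = np.mean(logn <= 100)
cover_unif = np.mean(unif <= 100)
```

Even with matched central values, the two shape choices yield different coverage probabilities for the same default factor, which is the point at issue: the claimed conservatism cannot be evaluated without fixing both a protection level and a distributional assumption.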
Measurement of the B0 anti-B0 oscillation frequency using l- D*+ pairs and lepton flavor tags
The oscillation frequency Delta-md of B0 anti-B0 mixing is measured using the
partially reconstructed semileptonic decay anti-B0 -> l- nubar D*+ X. The data
sample was collected with the CDF detector at the Fermilab Tevatron collider
during 1992 - 1995 by triggering on the existence of two lepton candidates in
an event, and corresponds to about 110 pb-1 of pbar p collisions at sqrt(s) =
1.8 TeV. We estimate the proper decay time of the anti-B0 meson from the
measured decay length and reconstructed momentum of the l- D*+ system. The
charge of the lepton in the final state identifies the flavor of the anti-B0
meson at its decay. The second lepton in the event is used to infer the flavor
of the anti-B0 meson at production. We measure the oscillation frequency to be
Delta-md = 0.516 +/- 0.099 +0.029 -0.035 ps-1, where the first uncertainty is
statistical and the second is systematic.
Comment: 30 pages, 7 figures. Submitted to Physical Review
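The two ingredients of the measurement — the proper-time estimate t = L m / (p c) from decay length and momentum, and the oscillation probability at the quoted Delta-md — can be sketched as follows. This is illustrative only: the analysis uses the partially reconstructed l- D*+ momentum plus a correction factor for the unreconstructed particles, which is omitted here.

```python
import math

M_B0 = 5.2796               # B0 mass in GeV/c^2 (PDG value)
C_CM_PER_PS = 0.0299792458  # speed of light in cm per picosecond

def proper_time_ps(L_cm, p_GeV):
    """Proper decay time t = L * m / (p * c) in ps, from a decay length
    in cm and a reconstructed momentum in GeV/c (no partial-reconstruction
    correction applied)."""
    return L_cm * M_B0 / (p_GeV * C_CM_PER_PS)

def p_mixed(t_ps, dmd=0.516):
    """Probability that an anti-B0 has oscillated into a B0 by proper
    time t_ps, for an oscillation frequency dmd in ps^-1 (default: the
    measured central value)."""
    return 0.5 * (1.0 - math.cos(dmd * t_ps))
```

Comparing the observed mixed fraction as a function of t with `p_mixed` (after dilution by mistagging) is, in outline, how Delta-md is extracted.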
Life Satisfaction and Sense of Coherence of Breast Cancer Survivors Compared to Women with Mental Depression, Arterial Hypertension and Healthy Controls
The purpose of the study was to compare the life satisfaction (LS) and sense of coherence (SOC) of women recovering from breast cancer (BC) to the LS and SOC of women with depression or hypertension and of healthy controls. Finnish Health and Social Support (HeSSup) follow-up survey data from 2003 were linked with national health registries. BC patients were followed up for mortality until the end of 2012. The statistical computations were carried out with SAS. There were no significant differences in LS and SOC between the groups with BC, arterial hypertension or healthy controls. Women recovering from BC are as satisfied with their life as healthy controls, and their perceived LS is better and SOC is stronger compared to women with depression. SOC correlated positively (r^2 = 0.36, p < 0.001) with LS. However, more studies on determinants of LS are needed for designing and organizing health care services for BC survivors. Peer reviewed
Search for new phenomena in final states with an energetic jet and large missing transverse momentum in pp collisions at √s = 8 TeV with the ATLAS detector
Results of a search for new phenomena in final states with an energetic jet and large missing transverse momentum are reported. The search uses 20.3 fb^-1 of √s = 8 TeV data collected in 2012 with the ATLAS detector at the LHC. Events are required to have at least one jet with pT > 120 GeV and no leptons. Nine signal regions are considered with increasing missing transverse momentum requirements between E_T^miss > 150 GeV and E_T^miss > 700 GeV. Good agreement is observed between the number of events in data and Standard Model expectations. The results are translated into exclusion limits on models with either large extra spatial dimensions, pair production of weakly interacting dark matter candidates, or production of very light gravitinos in a gauge-mediated supersymmetric model. In addition, limits on the production of an invisibly decaying Higgs-like boson leading to similar topologies in the final state are presented.
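The inclusive cut flow described in the abstract can be sketched on hypothetical flat event records. The nine missing-transverse-momentum thresholds below are an illustrative assumption spanning the quoted 150-700 GeV range, not the exact signal-region definitions from the paper:

```python
# Illustrative thresholds (GeV) for nine inclusive signal regions:
MET_THRESHOLDS_GEV = [150, 200, 250, 300, 350, 400, 500, 600, 700]

def signal_region_counts(events):
    """Count events passing a monojet-style preselection (leading jet
    pT > 120 GeV, lepton veto) in each inclusive E_T^miss region.
    Each event is a dict with keys lead_jet_pt, n_leptons, met."""
    counts = {t: 0 for t in MET_THRESHOLDS_GEV}
    for ev in events:
        if ev["lead_jet_pt"] > 120 and ev["n_leptons"] == 0:
            for t in MET_THRESHOLDS_GEV:
                if ev["met"] > t:
                    counts[t] += 1
    return counts
```

Because the regions are inclusive, every event entering the 700 GeV region also enters all lower ones, so the counts are monotonically non-increasing with threshold.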
Ab initio alpha-alpha scattering
Processes involving alpha particles and alpha-like nuclei comprise a major
part of stellar nucleosynthesis and hypothesized mechanisms for thermonuclear
supernovae. In an effort towards understanding alpha processes from first
principles, we describe in this letter the first ab initio calculation of
alpha-alpha scattering. We use lattice effective field theory to describe the
low-energy interactions of nucleons and apply a technique called the adiabatic
projection method to reduce the eight-body system to an effective two-cluster
system. We find good agreement between lattice results and experimental phase
shifts for S-wave and D-wave scattering. The computational scaling with
particle number suggests that alpha processes involving heavier nuclei are also
within reach in the near future.
Comment: 6 pages, 6 figures
